WorldWideScience

Sample records for randomly selected general

  1. 47 CFR 1.1602 - Designation for random selection.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Designation for random selection. 1.1602 Section 1.1602 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1602 Designation for random selection...

  2. 47 CFR 1.1603 - Conduct of random selection.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Conduct of random selection. 1.1603 Section 1.1603 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1603 Conduct of random selection. The...

  3. Non-compact random generalized games and random quasi-variational inequalities

    OpenAIRE

    Yuan, Xian-Zhi

    1994-01-01

    In this paper, existence theorems of random maximal elements, random equilibria for the random one-person game and random generalized game with a countable number of players are given as applications of random fixed point theorems. By employing existence theorems of random generalized games, we deduce the existence of solutions for non-compact random quasi-variational inequalities. These in turn are used to establish several existence theorems of noncompact generalized random ...

  4. Blocked Randomization with Randomly Selected Block Sizes

    Directory of Open Access Journals (Sweden)

    Jimmy Efird

    2010-12-01

    Full Text Available When planning a randomized clinical trial, careful consideration must be given to how participants are selected for various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has equal likelihood of being assigned to treatment versus referent groups. However, by chance an unequal number of individuals may be assigned to each arm of the study and thus decrease the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Yet still, the allocation process may be predictable, for example, when the investigator is not blind and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes.
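
    As a concrete illustration of the scheme described in this record, the following sketch assigns participants in blocks whose sizes are drawn at random; the block sizes, arm labels, and seed are illustrative assumptions rather than the authors' implementation.

```python
import random

def blocked_randomization(n_participants, block_sizes=(2, 4, 6),
                          arms=("treatment", "control"), seed=None):
    """Assign participants to arms in blocks whose sizes are chosen at random.

    Each block contains an equal number of assignments to every arm, so the
    allocation stays balanced, while the varying block length keeps the next
    assignment unpredictable to an unblinded investigator.
    """
    rng = random.Random(seed)
    assignments = []
    while len(assignments) < n_participants:
        # only block sizes divisible by the number of arms keep blocks balanced
        block_size = rng.choice([b for b in block_sizes if b % len(arms) == 0])
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)
        assignments.extend(block)
    return assignments[:n_participants]

print(blocked_randomization(10, seed=1))
```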

  5. Application of random effects to the study of resource selection by animals.

    Science.gov (United States)

    Gillies, Cameron S; Hebblewhite, Mark; Nielsen, Scott E; Krawchuk, Meg A; Aldridge, Cameron L; Frair, Jacqueline L; Saher, D Joanne; Stevens, Cameron E; Jerde, Christopher L

    2006-07-01

    1. Resource selection estimated by logistic regression is used increasingly in studies to identify critical resources for animal populations and to predict species occurrence. 2. Most frequently, individual animals are monitored and pooled to estimate population-level effects without regard to group or individual-level variation. Pooling assumes that both observations and their errors are independent, and resource selection is constant given individual variation in resource availability. 3. Although researchers have identified ways to minimize autocorrelation, variation between individuals caused by differences in selection or available resources, including functional responses in resource selection, has not been well addressed. 4. Here we review random-effects models and their application to resource selection modelling to overcome these common limitations. We present a simple case study of an analysis of resource selection by grizzly bears in the foothills of the Canadian Rocky Mountains with and without random effects. 5. Both categorical and continuous variables in the grizzly bear model differed in interpretation, both in statistical significance and coefficient sign, depending on how a random effect was included. We used a simulation approach to clarify the application of random effects under three common situations for telemetry studies: (a) discrepancies in sample sizes among individuals; (b) differences among individuals in selection where availability is constant; and (c) differences in availability with and without a functional response in resource selection. 6. We found that random intercepts accounted for unbalanced sample designs, and models with random intercepts and coefficients improved model fit given the variation in selection among individuals and functional responses in selection. Our empirical example and simulations demonstrate how including random effects in resource selection models can aid interpretation and address difficult assumptions

  6. Random selection of items. Selection of n1 samples among N items composing a stratum

    International Nuclear Information System (INIS)

    Jaech, J.L.; Lemaire, R.J.

    1987-02-01

    STR-224 provides generalized procedures to determine required sample sizes, for instance in the course of a Physical Inventory Verification at Bulk Handling Facilities. The present report describes procedures to generate random numbers and select groups of items to be verified in a given stratum through each of the measurement methods involved in the verification. (author). 3 refs
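
    A minimal sketch of the kind of selection described here, drawing n1 of N item indices at random within a stratum without replacement; the concrete numbers are hypothetical, and the report's own random-number generation procedure is not reproduced.

```python
import random

def select_items(N, n1, seed=None):
    """Draw a simple random sample of n1 item indices (1..N) from a stratum
    without replacement, e.g. to pick the items to be verified with one
    measurement method during a physical inventory verification."""
    rng = random.Random(seed)
    return sorted(rng.sample(range(1, N + 1), n1))

print(select_items(N=250, n1=12, seed=42))
```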

  7. Randomized Oversampling for Generalized Multiscale Finite Element Methods

    KAUST Repository

    Calo, Victor M.

    2016-03-23

    In this paper, we develop efficient multiscale methods for flows in heterogeneous media. We use the generalized multiscale finite element (GMsFEM) framework. GMsFEM approximates the solution space locally using a few multiscale basis functions. This approximation selects an appropriate snapshot space and a local spectral decomposition, e.g., the use of oversampled regions, in order to achieve an efficient model reduction. However, the successful construction of snapshot spaces may be costly if too many local problems need to be solved in order to obtain these spaces. We use a moderate quantity of local solutions (or snapshot vectors) with random boundary conditions on oversampled regions with zero forcing to deliver an efficient methodology. Motivated by the randomized algorithm presented in [P. G. Martinsson, V. Rokhlin, and M. Tygert, A Randomized Algorithm for the approximation of Matrices, YALEU/DCS/TR-1361, Yale University, 2006], we consider a snapshot space which consists of harmonic extensions of random boundary conditions defined in a domain larger than the target region. Furthermore, we perform an eigenvalue decomposition in this small space. We study the application of randomized sampling for GMsFEM in conjunction with adaptivity, where local multiscale spaces are adaptively enriched. Convergence analysis is provided. We present representative numerical results to validate the method proposed.

  8. Selection for altruism through random drift in variable size populations

    Directory of Open Access Journals (Sweden)

    Houchmandzadeh Bahram

    2012-05-01

    Full Text Available Abstract Background Altruistic behavior is defined as helping others at a cost to oneself and a lowered fitness. The lower fitness implies that altruists should be selected against, which is in contradiction with their widespread presence in nature. Present models of selection for altruism (kin or multilevel) show that altruistic behaviors can have ‘hidden’ advantages if the ‘common good’ produced by altruists is restricted to some related or unrelated groups. These models are mostly deterministic, or assume a frequency dependent fitness. Results Evolutionary dynamics is a competition between deterministic selection pressure and stochastic events due to random sampling from one generation to the next. We show here that an altruistic allele extending the carrying capacity of the habitat can win by increasing the random drift of “selfish” alleles. In other terms, the fixation probability of altruistic genes can be higher than that of selfish ones, even though altruists have a smaller fitness. Moreover, when populations are geographically structured, the altruists’ advantage can be highly amplified and the fixation probability of selfish genes can tend toward zero. The above results are obtained both by numerical and analytical calculations. Analytical results are obtained in the limit of large populations. Conclusions The theory we present does not involve kin or multilevel selection, but is based on the existence of random drift in variable size populations. The model is a generalization of the original Fisher-Wright and Moran models where the carrying capacity depends on the number of altruists.

  9. Selectivity and sparseness in randomly connected balanced networks.

    Directory of Open Access Journals (Sweden)

    Cengiz Pehlevan

    Full Text Available Neurons in sensory cortex show stimulus selectivity and sparse population response, even in cases where no strong functionally specific structure in connectivity can be detected. This raises the question whether selectivity and sparseness can be generated and maintained in randomly connected networks. We consider a recurrent network of excitatory and inhibitory spiking neurons with random connectivity, driven by random projections from an input layer of stimulus selective neurons. In this architecture, the stimulus-to-stimulus and neuron-to-neuron modulation of total synaptic input is weak compared to the mean input. Surprisingly, we show that in the balanced state the network can still support high stimulus selectivity and sparse population response. In the balanced state, strong synapses amplify the variation in synaptic input and recurrent inhibition cancels the mean. Functional specificity in connectivity emerges due to the inhomogeneity caused by the generative statistical rule used to build the network. We further elucidate the mechanism behind this selectivity and evaluate the effects of model parameters on population sparseness and stimulus selectivity. Network response to mixtures of stimuli is investigated. It is shown that a balanced state with unselective inhibition can be achieved with densely connected input to the inhibitory population. Balanced networks exhibit the "paradoxical" effect: an increase in excitatory drive to inhibition leads to a decreased inhibitory population firing rate. We compare and contrast selectivity and sparseness generated by the balanced network to randomly connected unbalanced networks. Finally, we discuss our results in light of experiments.

  10. Testing, Selection, and Implementation of Random Number Generators

    National Research Council Canada - National Science Library

    Collins, Joseph C

    2008-01-01

    An exhaustive evaluation of state-of-the-art random number generators with several well-known suites of tests provides the basis for selection of suitable random number generators for use in stochastic simulations...

  11. A generalized model via random walks for information filtering

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Zhuo-Ming, E-mail: zhuomingren@gmail.com [Department of Physics, University of Fribourg, Chemin du Musée 3, CH-1700, Fribourg (Switzerland); Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences, ChongQing, 400714 (China); Kong, Yixiu [Department of Physics, University of Fribourg, Chemin du Musée 3, CH-1700, Fribourg (Switzerland); Shang, Ming-Sheng, E-mail: msshang@cigit.ac.cn [Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences, ChongQing, 400714 (China); Zhang, Yi-Cheng [Department of Physics, University of Fribourg, Chemin du Musée 3, CH-1700, Fribourg (Switzerland)

    2016-08-06

    There could exist a simple general mechanism lurking beneath collaborative filtering and interdisciplinary physics approaches which have been successfully applied to online E-commerce platforms. Motivated by this idea, we propose a generalized model employing the dynamics of the random walk in bipartite networks. Taking into account the degree information, the proposed generalized model could deduce the collaborative filtering and interdisciplinary physics approaches, and even numerous extensions of them. Furthermore, we analyze the generalized model with single and hybrid degree information in the random-walk process on bipartite networks, and propose a possible strategy of using hybrid degree information for objects of different popularity to achieve promising recommendation precision. - Highlights: • We propose a generalized recommendation model employing random walk dynamics. • The proposed model with single and hybrid degree information is analyzed. • A strategy with hybrid degree information improves the precision of recommendation.
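
    The degree-based random-walk idea underlying such models can be sketched as a two-step diffusion on the user-object bipartite network. The following is a minimal mass-diffusion-style example under an assumed 0/1 adjacency matrix, not the authors' generalized formulation.

```python
import numpy as np

def random_walk_scores(A, user):
    """Two-step random-walk (mass-diffusion style) recommendation scores on a
    user-object bipartite network. A is the users-by-objects 0/1 adjacency
    matrix; degrees control how resource is split at each step."""
    k_obj = A.sum(axis=0)                 # object degrees
    k_user = A.sum(axis=1)                # user degrees
    f = A[user].astype(float)             # resource on the target user's objects
    # step 1: objects -> users (each object splits its resource by its degree)
    u = A @ (f / np.where(k_obj > 0, k_obj, 1))
    # step 2: users -> objects (each user splits its resource by its degree)
    scores = A.T @ (u / np.where(k_user > 0, k_user, 1))
    scores[A[user] > 0] = 0               # do not re-recommend collected objects
    return scores

A = np.array([[1, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 1, 1]])
print(random_walk_scores(A, user=0))
```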

  12. A generalized model via random walks for information filtering

    International Nuclear Information System (INIS)

    Ren, Zhuo-Ming; Kong, Yixiu; Shang, Ming-Sheng; Zhang, Yi-Cheng

    2016-01-01

    There could exist a simple general mechanism lurking beneath collaborative filtering and interdisciplinary physics approaches which have been successfully applied to online E-commerce platforms. Motivated by this idea, we propose a generalized model employing the dynamics of the random walk in bipartite networks. Taking into account the degree information, the proposed generalized model could deduce the collaborative filtering and interdisciplinary physics approaches, and even numerous extensions of them. Furthermore, we analyze the generalized model with single and hybrid degree information in the random-walk process on bipartite networks, and propose a possible strategy of using hybrid degree information for objects of different popularity to achieve promising recommendation precision. - Highlights: • We propose a generalized recommendation model employing random walk dynamics. • The proposed model with single and hybrid degree information is analyzed. • A strategy with hybrid degree information improves the precision of recommendation.

  13. Random walks on generalized Koch networks

    International Nuclear Information System (INIS)

    Sun, Weigang

    2013-01-01

    For deterministically growing networks, it is a theoretical challenge to determine the topological properties and dynamical processes. In this paper, we study random walks on generalized Koch networks with features that include an initial state that is a globally connected network to r nodes. In each step, every existing node produces m complete graphs. We then obtain the analytical expressions for first passage time (FPT), average return time (ART), i.e. the average of FPTs for random walks from node i to return to the starting point i for the first time, and average sending time (AST), defined as the average of FPTs from a hub node to all other nodes, excluding the hub itself with regard to network parameters m and r. For this family of Koch networks, the ART of the new emerging nodes is identical and increases with the parameters m or r. In addition, the AST of our networks grows with network size N as N ln N and also increases with parameter m. The results obtained in this paper are the generalizations of random walks for the original Koch network. (paper)

  14. A generalized model via random walks for information filtering

    Science.gov (United States)

    Ren, Zhuo-Ming; Kong, Yixiu; Shang, Ming-Sheng; Zhang, Yi-Cheng

    2016-08-01

    There could exist a simple general mechanism lurking beneath collaborative filtering and interdisciplinary physics approaches which have been successfully applied to online E-commerce platforms. Motivated by this idea, we propose a generalized model employing the dynamics of the random walk in bipartite networks. Taking into account the degree information, the proposed generalized model could deduce the collaborative filtering and interdisciplinary physics approaches, and even numerous extensions of them. Furthermore, we analyze the generalized model with single and hybrid degree information in the random-walk process on bipartite networks, and propose a possible strategy of using hybrid degree information for objects of different popularity to achieve promising recommendation precision.

  15. A Bayesian random effects discrete-choice model for resource selection: Population-level selection inference

    Science.gov (United States)

    Thomas, D.L.; Johnson, D.; Griffith, B.

    2006-01-01

    We present a Bayesian random-effects model to assess resource selection, modeling the probability of use of land units characterized by discrete and continuous measures. This model provides simultaneous estimation of both individual- and population-level selection. Deviance information criterion (DIC), a Bayesian alternative to AIC that is sample-size specific, is used for model selection. Aerial radiolocation data from 76 adult female caribou (Rangifer tarandus) and calf pairs during 1 year on an Arctic coastal plain calving ground were used to illustrate models and assess population-level selection of landscape attributes, as well as individual heterogeneity of selection. Landscape attributes included elevation, NDVI (a measure of forage greenness), and land cover-type classification. Results from the first of a 2-stage model-selection procedure indicated that there is substantial heterogeneity among cow-calf pairs with respect to selection of the landscape attributes. In the second stage, selection of models with heterogeneity included indicated that at the population level, NDVI and land cover class were significant attributes for selection of different landscapes by pairs on the calving ground. Population-level selection coefficients indicate that the pairs generally select landscapes with higher levels of NDVI, but the relationship is quadratic. The highest rate of selection occurs at values of NDVI less than the maximum observed. Results for land cover-class selection coefficients indicate that wet sedge, moist sedge, herbaceous tussock tundra, and shrub tussock tundra are selected at approximately the same rate, while alpine and sparsely vegetated landscapes are selected at a lower rate. Furthermore, the variability in selection by individual caribou for moist sedge and sparsely vegetated landscapes is large relative to the variability in selection of other land cover types. The example analysis illustrates that, while sometimes computationally intense, a

  16. A Generalized Random Regret Minimization Model

    NARCIS (Netherlands)

    Chorus, C.G.

    2013-01-01

    This paper presents, discusses and tests a generalized Random Regret Minimization (G-RRM) model. The G-RRM model is created by replacing a fixed constant in the attribute-specific regret functions of the RRM model with a regret-weight variable. Depending on the value of the regret-weights, the G-RRM

  17. Local randomization in neighbor selection improves PRM roadmap quality

    KAUST Repository

    McMahon, Troy; Jacobs, Sam; Boyd, Bryan; Tapia, Lydia; Amato, Nancy M.

    2012-01-01

    Probabilistic Roadmap Methods (PRMs) are one of the most used classes of motion planning methods. These sampling-based methods generate robot configurations (nodes) and then connect them to form a graph (roadmap) containing representative feasible pathways. A key step in PRM roadmap construction involves identifying a set of candidate neighbors for each node. Traditionally, these candidates are chosen to be the k-closest nodes based on a given distance metric. In this paper, we propose a new neighbor selection policy called LocalRand(k,K'), that first computes the K' closest nodes to a specified node and then selects k of those nodes at random. Intuitively, LocalRand attempts to benefit from random sampling while maintaining the higher levels of local planner success inherent to selecting more local neighbors. We provide a methodology for selecting the parameters k and K'. We perform an experimental comparison which shows that for both rigid and articulated robots, LocalRand results in roadmaps that are better connected than the traditional k-closest policy or a purely random neighbor selection policy. The cost required to achieve these results is shown to be comparable to k-closest. © 2012 IEEE.
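
    A small sketch of the LocalRand(k, K') policy described above, assuming configurations represented as points in Euclidean space and Euclidean distance as the metric; it only illustrates the neighbor-candidate step, not full roadmap construction.

```python
import random
import numpy as np

def local_rand_neighbors(nodes, q, k, K_prime, rng=random):
    """LocalRand(k, K') candidate-neighbor selection for a PRM node q:
    take the K' nearest nodes (Euclidean distance assumed here) and
    return k of them chosen uniformly at random."""
    others = [p for p in nodes if p is not q]
    nearest = sorted(
        others, key=lambda p: np.linalg.norm(np.asarray(p) - np.asarray(q))
    )[:K_prime]
    return rng.sample(nearest, min(k, len(nearest)))

nodes = [(random.random(), random.random()) for _ in range(50)]
q = nodes[0]
print(local_rand_neighbors(nodes, q, k=3, K_prime=10))
```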

  18. Local randomization in neighbor selection improves PRM roadmap quality

    KAUST Repository

    McMahon, Troy

    2012-10-01

    Probabilistic Roadmap Methods (PRMs) are one of the most used classes of motion planning methods. These sampling-based methods generate robot configurations (nodes) and then connect them to form a graph (roadmap) containing representative feasible pathways. A key step in PRM roadmap construction involves identifying a set of candidate neighbors for each node. Traditionally, these candidates are chosen to be the k-closest nodes based on a given distance metric. In this paper, we propose a new neighbor selection policy called LocalRand(k,K'), that first computes the K' closest nodes to a specified node and then selects k of those nodes at random. Intuitively, LocalRand attempts to benefit from random sampling while maintaining the higher levels of local planner success inherent to selecting more local neighbors. We provide a methodology for selecting the parameters k and K'. We perform an experimental comparison which shows that for both rigid and articulated robots, LocalRand results in roadmaps that are better connected than the traditional k-closest policy or a purely random neighbor selection policy. The cost required to achieve these results is shown to be comparable to k-closest. © 2012 IEEE.

  19. Image encryption using random sequence generated from generalized information domain

    International Nuclear Information System (INIS)

    Zhang Xia-Yan; Wu Jie-Hua; Zhang Guo-Ji; Li Xuan; Ren Ya-Zhou

    2016-01-01

    A novel image encryption method based on the random sequence generated from the generalized information domain and permutation–diffusion architecture is proposed. The random sequence is generated by reconstruction from the generalized information file and discrete trajectory extraction from the data stream. The trajectory address sequence is used to generate a P-box to shuffle the plain image while random sequences are treated as keystreams. A new factor called drift factor is employed to accelerate and enhance the performance of the random sequence generator. An initial value is introduced to make the encryption method an approximately one-time pad. Experimental results show that the random sequences pass the NIST statistical test with a high ratio and extensive analysis demonstrates that the new encryption scheme has superior security. (paper)

  20. 47 CFR 1.1604 - Post-selection hearings.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Post-selection hearings. 1.1604 Section 1.1604 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1604 Post-selection hearings. (a) Following the random...

  1. Generalized Whittle-Matern random field as a model of correlated fluctuations

    International Nuclear Information System (INIS)

    Lim, S C; Teo, L P

    2009-01-01

    This paper considers a generalization of the Gaussian random field with covariance function of the Whittle-Matern family. Such a random field can be obtained as the solution to the fractional stochastic differential equation with two fractional orders. Asymptotic properties of the covariance functions belonging to this generalized Whittle-Matern family are studied, which are used to deduce the sample path properties of the random field. The Whittle-Matern field has been widely used in modeling geostatistical data such as sea beam data, wind speed, field temperature and soil data. In this paper we show that the generalized Whittle-Matern field provides a more flexible model for wind speed data

  2. No differential attrition was found in randomized controlled trials published in general medical journals: a meta-analysis.

    Science.gov (United States)

    Crutzen, Rik; Viechtbauer, Wolfgang; Kotz, Daniel; Spigt, Mark

    2013-09-01

    Differential attrition is regarded as a major threat to the internal validity of a randomized controlled trial (RCT). This study identifies the degree of differential attrition in RCTs covering a broad spectrum of clinical areas and factors that are related to this. A PubMed search was conducted to obtain a random sample of 100 RCTs published between 2008 and 2010 in journals from the ISI Web of Knowledge(SM) category of medicine, general and internal. Eligibility criteria for selecting studies were primary publications of two-arm parallel randomized clinical trials, containing human participants and one or multiple follow-up measurements whose availability depended on the patients' willingness to participate. A significant amount of differential attrition was observed in 8% of the trials. The average differential attrition rate was 0.99 (95% confidence interval: 0.97-1.01), indicating no general difference in attrition rates between intervention and control groups. Moreover, no indication of heterogeneity was found, suggesting that the occurrence of differential attrition in the published literature is mostly a chance finding, unrelated to any particular design factors. Differential attrition did not generally occur in RCTs covering a broad spectrum of clinical areas within general and internal medicine. Copyright © 2013 Elsevier Inc. All rights reserved.

  3. A Bidirectional Generalized Synchronization Theorem-Based Chaotic Pseudo-random Number Generator

    Directory of Open Access Journals (Sweden)

    Han Shuangshuang

    2013-07-01

    Full Text Available Based on a bidirectional generalized synchronization theorem for discrete chaos systems, this paper introduces a new 5-dimensional bidirectional generalized chaos synchronization system (BGCSDS), whose prototype is a novel chaotic system introduced in [12]. Numerical simulation showed that two pairs of variables of the BGCSDS achieve generalized chaos synchronization via a transform H. A chaos-based pseudo-random number generator (CPNG) was designed by the new BGCSDS. Using the FIPS-140-2 tests issued by the National Institute of Standards and Technology (NIST), we verified the randomness of the 1000 binary number sequences generated via the CPNG and the RC4 algorithm, respectively. The results showed that all the tested sequences passed the FIPS-140-2 tests. The confidence interval analysis showed that the statistical properties of the randomness of the sequences generated via the CPNG and the RC4 algorithm do not have significant differences.

  4. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sample selection by random number... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square... area created in accordance with paragraph (a) of this section, select two random numbers: one each for...
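
    The general idea, picking one sampling location from two independently drawn random numbers (one per axis of the square grid), can be sketched as follows; the grid spacing and units are assumptions for illustration, and the regulation itself governs the exact procedure and rounding rules.

```python
import random

def select_grid_point(side_length_cm, grid_spacing_cm=1.0, seed=None):
    """Pick one sampling location on a two-dimensional square grid by drawing
    two random numbers, one for each axis (illustrative sketch only)."""
    rng = random.Random(seed)
    n_cells = int(side_length_cm // grid_spacing_cm)
    x = rng.randrange(n_cells) * grid_spacing_cm   # random column coordinate
    y = rng.randrange(n_cells) * grid_spacing_cm   # random row coordinate
    return x, y

print(select_grid_point(side_length_cm=100, seed=7))
```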

  5. Applications of random forest feature selection for fine-scale genetic population assignment.

    Science.gov (United States)

    Sylvester, Emma V A; Bentzen, Paul; Bradbury, Ian R; Clément, Marie; Pearce, Jon; Horne, John; Beiko, Robert G

    2018-02-01

    Genetic population assignment used to inform wildlife management and conservation efforts requires panels of highly informative genetic markers and sensitive assignment tests. We explored the utility of machine-learning algorithms (random forest, regularized random forest and guided regularized random forest) compared with FST ranking for selection of single nucleotide polymorphisms (SNPs) for fine-scale population assignment. We applied these methods to an unpublished SNP data set for Atlantic salmon (Salmo salar) and a published SNP data set for Alaskan Chinook salmon (Oncorhynchus tshawytscha). In each species, we identified the minimum panel size required to obtain a self-assignment accuracy of at least 90%, using each method to create panels of 50-700 markers. Panels of SNPs identified using random forest-based methods performed up to 7.8 and 11.2 percentage points better than FST-selected panels of similar size for the Atlantic salmon and Chinook salmon data, respectively. Self-assignment accuracy ≥90% was obtained with panels of 670 and 384 SNPs for each data set, respectively, a level of accuracy never reached for these species using FST-selected panels. Our results demonstrate a role for machine-learning approaches in marker selection across large genomic data sets to improve assignment for management and conservation of exploited populations.
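
    A minimal sketch of random-forest-based marker ranking of the kind evaluated here, using a synthetic genotype matrix and an arbitrary panel size; the study's salmon data sets, the regularized forest variants, and the FST comparison are not reproduced.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for a SNP genotype matrix (individuals x loci, coded 0/1/2)
# and population labels (hypothetical data, for illustration only).
rng = np.random.default_rng(0)
genotypes = rng.integers(0, 3, size=(200, 500))
populations = rng.integers(0, 4, size=200)

# Rank loci by random forest importance and keep a candidate panel
# (the panel size of 100 is arbitrary).
rf = RandomForestClassifier(n_estimators=500, random_state=0)
rf.fit(genotypes, populations)
panel = np.argsort(rf.feature_importances_)[::-1][:100]
print("top-ranked loci:", panel[:10])
```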

  6. Interference-aware random beam selection for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed M.

    2012-09-01

    Spectrum sharing systems have been introduced to alleviate the problem of spectrum scarcity by allowing secondary unlicensed networks to share the spectrum with primary licensed networks under acceptable interference levels to the primary users. In this paper, we develop interference-aware random beam selection schemes that provide enhanced throughput for the secondary link under the condition that the interference observed at the primary link is within a predetermined acceptable value. For a secondary transmitter equipped with multiple antennas, our schemes select a random beam, among a set of power-optimized orthogonal random beams, that maximizes the capacity of the secondary link while satisfying the interference constraint at the primary receiver for different levels of feedback information describing the interference level at the primary receiver. For the proposed schemes, we develop a statistical analysis for the signal-to-noise and interference ratio (SINR) statistics as well as the capacity of the secondary link. Finally, we present numerical results that study the effect of system parameters including number of beams and the maximum transmission power on the capacity of the secondary link attained using the proposed schemes. © 2012 IEEE.
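
    A simplified single-realization sketch of the selection idea: orthonormal random beams are generated and the one maximizing the secondary link gain is kept, subject to an interference limit at the primary receiver. The channel model, unit transmit power, and limit value are assumptions; the power optimization and the different feedback levels analyzed in the paper are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def select_beam(n_antennas=4, n_beams=4, interference_limit=0.5):
    """Pick, among orthonormal random beams, the one that maximizes the
    secondary link gain while keeping the interference caused at the primary
    receiver below a limit (assumed Rayleigh channels, unit transmit power)."""
    # orthonormal random beams via QR decomposition of a complex Gaussian matrix
    G = rng.normal(size=(n_antennas, n_beams)) + 1j * rng.normal(size=(n_antennas, n_beams))
    Q, _ = np.linalg.qr(G)
    h_sec = (rng.normal(size=n_antennas) + 1j * rng.normal(size=n_antennas)) / np.sqrt(2)
    h_pri = (rng.normal(size=n_antennas) + 1j * rng.normal(size=n_antennas)) / np.sqrt(2)

    gain_sec = np.abs(h_sec @ Q) ** 2          # secondary link gain per beam
    interference = np.abs(h_pri @ Q) ** 2      # interference caused at the primary receiver
    feasible = np.where(interference <= interference_limit)[0]
    if feasible.size == 0:
        return None                            # no beam satisfies the constraint
    return feasible[np.argmax(gain_sec[feasible])]

print("selected beam index:", select_beam())
```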

  7. Interference-aware random beam selection for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed M.; Sayed, Mostafa M.; Alouini, Mohamed-Slim; Qaraqe, Khalid A.

    2012-01-01

    . In this paper, we develop interference-aware random beam selection schemes that provide enhanced throughput for the secondary link under the condition that the interference observed at the primary link is within a predetermined acceptable value. For a secondary

  8. The signature of positive selection at randomly chosen loci.

    OpenAIRE

    Przeworski, Molly

    2002-01-01

    In Drosophila and humans, there are accumulating examples of loci with a significant excess of high-frequency-derived alleles or high levels of linkage disequilibrium, relative to a neutral model of a random-mating population of constant size. These are features expected after a recent selective sweep. Their prevalence suggests that positive directional selection may be widespread in both species. However, as I show here, these features do not persist long after the sweep ends: The high-frequ...

  9. Simulated Performance Evaluation of a Selective Tracker Through Random Scenario Generation

    DEFF Research Database (Denmark)

    Hussain, Dil Muhammad Akbar

    2006-01-01

    The paper presents a simulation study on the performance of a target tracker using a selective track splitting filter algorithm through a random scenario implemented on a digital signal processor. In a typical track splitting filter all the observations which fall inside a likelihood ellipse are used for the update; however, in our proposed selective track splitting filter a smaller number of observations is used for the track update. Much of the previous performance work [1] has been done on specific (deterministic) scenarios. One of the reasons for considering the specific scenarios, which were ... performance assessment. Therefore, a random target motion scenario is adopted. Its implementation, in particular for testing the proposed selective track splitting algorithm using Kalman filters, is investigated through a number of performance parameters which give the activity profile of the tracking scenario.

  10. Minimization over randomly selected lines

    Directory of Open Access Journals (Sweden)

    Ismet Sahin

    2013-07-01

    Full Text Available This paper presents a population-based evolutionary optimization method for minimizing a given cost function. The mutation operator of this method selects randomly oriented lines in the cost function domain, constructs quadratic functions interpolating the cost function at three different points over each line, and uses extrema of the quadratics as mutated points. The crossover operator modifies each mutated point based on components of two points in the population, instead of one point as is usually done in other evolutionary algorithms. The stopping criterion of this method depends on the number of almost degenerate quadratics. We demonstrate that the proposed method with these mutation and crossover operations achieves faster and more robust convergence than the well-known Differential Evolution and Particle Swarm algorithms.
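
    The mutation operator described above can be sketched as follows: a random direction is drawn, the cost is interpolated by a quadratic in the line parameter, and the quadratic's minimizer is used as the mutated point. The step size and the crossover and stopping rules are omitted, so this is only an illustration of the line-mutation idea, not the authors' algorithm.

```python
import numpy as np

def mutate_along_random_line(x, cost, step=1.0, rng=np.random.default_rng()):
    """One line mutation: pick a random direction, fit a quadratic in the line
    parameter t through three cost evaluations, and return the point at the
    quadratic's minimizer (the point is left unchanged if the fit is not convex)."""
    d = rng.standard_normal(x.size)
    d /= np.linalg.norm(d)                    # unit direction
    t = np.array([-step, 0.0, step])
    y = np.array([cost(x + ti * d) for ti in t])
    a, b, _c = np.polyfit(t, y, 2)            # y ≈ a t^2 + b t + c
    if a <= 0:                                # degenerate or concave quadratic
        return x
    t_star = -b / (2 * a)                     # extremum of the interpolant
    return x + t_star * d

cost = lambda z: np.sum((z - 3.0) ** 2)
x = np.zeros(5)
for _ in range(20):
    x = mutate_along_random_line(x, cost)
print(cost(x))                                # approaches 0 for this convex cost
```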

  11. Selection bias and subject refusal in a cluster-randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Rochelle Yang

    2017-07-01

    Full Text Available Abstract Background Selection bias and non-participation bias are major methodological concerns which impact external validity. Cluster-randomized controlled trials are especially prone to selection bias as it is impractical to blind clusters to their allocation into intervention or control. This study assessed the impact of selection bias in a large cluster-randomized controlled trial. Methods The Improved Cardiovascular Risk Reduction to Enhance Rural Primary Care (ICARE) study examined the impact of a remote pharmacist-led intervention in twelve medical offices. To assess eligibility, a standardized form containing patient demographics and medical information was completed for each screened patient. Eligible patients were approached by the study coordinator for recruitment. Both the study coordinator and the patient were aware of the site’s allocation prior to consent. Patients who consented or declined to participate were compared across control and intervention arms for differing characteristics. Statistical significance was determined using a two-tailed, equal variance t-test and a chi-square test with adjusted Bonferroni p-values. Results were adjusted for random cluster variation. Results There were 2749 completed screening forms returned to research staff with 461 subjects who had either consented or declined participation. Patients with poorly controlled diabetes were found to be significantly more likely to decline participation in intervention sites compared to those in control sites. A higher mean diastolic blood pressure was seen in patients with uncontrolled hypertension who declined in the control sites compared to those who declined in the intervention sites. However, these findings were no longer significant after adjustment for random variation among the sites. After this adjustment, females were now found to be significantly more likely to consent than males (odds ratio = 1.41; 95% confidence interval = 1.03, 1

  12. Strategyproof Peer Selection using Randomization, Partitioning, and Apportionment

    OpenAIRE

    Aziz, Haris; Lev, Omer; Mattei, Nicholas; Rosenschein, Jeffrey S.; Walsh, Toby

    2016-01-01

    Peer review, evaluation, and selection is a fundamental aspect of modern science. Funding bodies the world over employ experts to review and select the best proposals of those submitted for funding. The problem of peer selection, however, is much more general: a professional society may want to give a subset of its members awards based on the opinions of all members; an instructor for a MOOC or online course may want to crowdsource grading; or a marketing company may select ideas from group b...

  13. Generalized linear models with random effects unified analysis via H-likelihood

    CERN Document Server

    Lee, Youngjo; Pawitan, Yudi

    2006-01-01

    Since their introduction in 1972, generalized linear models (GLMs) have proven useful in the generalization of classical normal models. Presenting methods for fitting GLMs with random effects to data, Generalized Linear Models with Random Effects: Unified Analysis via H-likelihood explores a wide range of applications, including combining information over trials (meta-analysis), analysis of frailty models for survival data, genetic epidemiology, and analysis of spatial and temporal models with correlated errors.Written by pioneering authorities in the field, this reference provides an introduction to various theories and examines likelihood inference and GLMs. The authors show how to extend the class of GLMs while retaining as much simplicity as possible. By maximizing and deriving other quantities from h-likelihood, they also demonstrate how to use a single algorithm for all members of the class, resulting in a faster algorithm as compared to existing alternatives. Complementing theory with examples, many of...

  14. Interference-aware random beam selection schemes for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed; Qaraqe, Khalid; Alouini, Mohamed-Slim

    2012-01-01

    users. In this work, we develop interference-aware random beam selection schemes that provide enhanced performance for the secondary network under the condition that the interference observed by the receivers of the primary network is below a

  15. Modal Parameter Identification from Responses of General Unknown Random Inputs

    DEFF Research Database (Denmark)

    Ibrahim, S. R.; Asmussen, J. C.; Brincker, Rune

    1996-01-01

    Modal parameter identification from ambient responses due to general unknown random inputs is investigated. Existing identification techniques, which are based on assumptions of white noise and/or stationary random inputs, are utilized even though the input conditions are not satisfied. ... This is accomplished by adding, in cascade, a force conversion system to the structural system under consideration. The input to the force conversion system is white noise, and its output is the actual force(s) applied to the structure. The white noise input(s) and the structure's responses are then used...

  16. Using Random Forests to Select Optimal Input Variables for Short-Term Wind Speed Forecasting Models

    Directory of Open Access Journals (Sweden)

    Hui Wang

    2017-10-01

    Full Text Available Achieving relatively high-accuracy short-term wind speed forecasting estimates is a precondition for the construction and grid-connected operation of wind power forecasting systems for wind farms. Currently, most research is focused on the structure of forecasting models and does not consider the selection of input variables, which can have significant impacts on forecasting performance. This paper presents an input variable selection method for wind speed forecasting models. The candidate input variables for various leading periods are selected and random forests (RF) is employed to evaluate the importance of all variables as features. The feature subset with the best evaluation performance is selected as the optimal feature set. Then, a kernel-based extreme learning machine is constructed to evaluate the performance of the RF-based input variable selection. The results of the case study show that by removing the uncorrelated and redundant features, RF effectively extracts the most strongly correlated set of features from the candidate input variables. By finding the optimal feature combination to represent the original information, RF simplifies the structure of the wind speed forecasting model, shortens the training time required, and substantially improves the model’s accuracy and generalization ability, demonstrating that the input variables selected by RF are effective.
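
    A small sketch of RF-based input selection for a forecasting task, here ranking lagged values of a synthetic periodic series by importance; the paper's candidate variable set, wind data, and kernel extreme learning machine are not used.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Synthetic wind-speed-like series (illustrative stand-in for real data).
rng = np.random.default_rng(0)
t = np.arange(3000)
wind = 5 + 2 * np.sin(2 * np.pi * t / 144) + rng.normal(0, 0.5, t.size)

# Build lagged candidate inputs and rank them by random forest importance.
max_lag = 48
X = np.column_stack([wind[max_lag - l:-l] for l in range(1, max_lag + 1)])
y = wind[max_lag:]

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
selected = np.argsort(rf.feature_importances_)[::-1][:6] + 1   # most informative lags
print("selected lags:", sorted(selected.tolist()))
```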

  17. Recurrence and Polya Number of General One-Dimensional Random Walks

    International Nuclear Information System (INIS)

    Zhang Xiaokun; Wan Jing; Lu Jingju; Xu Xinping

    2011-01-01

    The recurrence properties of random walks can be characterized by the Polya number, i.e., the probability that the walker has returned to the origin at least once. In this paper, we consider recurrence properties for a general 1D random walk on a line, in which at each time step the walker can move to the left or right with probabilities l and r, or remain at the same position with probability o (l + r + o = 1). We calculate the Polya number P of this model and find a simple expression for it, P = 1 - Δ, where Δ is the absolute difference of l and r (Δ = |l - r|). We prove this rigorous expression by the method of creative telescoping, and our result suggests that the walk is recurrent if and only if the left-moving probability l equals the right-moving probability r. (general)
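
    The closed-form result P = 1 - |l - r| can be checked numerically with a short finite-horizon simulation; the probabilities, number of walks, and horizon below are arbitrary choices, and the finite horizon slightly underestimates the true return probability.

```python
import random

def estimate_polya_number(l, r, n_walks=20_000, n_steps=2_000, seed=0):
    """Monte-Carlo estimate (finite-horizon approximation) of the return
    probability of the lazy 1D walk: left with prob. l, right with prob. r,
    stay with prob. 1 - l - r."""
    rng = random.Random(seed)
    returned = 0
    for _ in range(n_walks):
        pos = 0
        for _ in range(n_steps):
            u = rng.random()
            pos += -1 if u < l else (1 if u < l + r else 0)
            if pos == 0:          # back at (or still at) the origin
                returned += 1
                break
    return returned / n_walks

l, r = 0.3, 0.5
print("simulated:", estimate_polya_number(l, r), " exact:", 1 - abs(l - r))
```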

  18. Day-ahead load forecast using random forest and expert input selection

    International Nuclear Information System (INIS)

    Lahouar, A.; Ben Hadj Slama, J.

    2015-01-01

    Highlights: • A model based on random forests for short term load forecast is proposed. • An expert feature selection is added to refine inputs. • Special attention is paid to customers’ behavior, load profile and special holidays. • The model is flexible and able to handle complex load signals. • A technical comparison is performed to assess the forecast accuracy. - Abstract: The electrical load forecast has become increasingly important in recent years due to electricity market deregulation and the integration of renewable resources. To overcome the incoming challenges and ensure accurate power prediction for different time horizons, sophisticated intelligent methods are elaborated. Utilization of intelligent forecast algorithms is among the main characteristics of smart grids, and is an efficient tool to face uncertainty. Several crucial tasks of power operators, such as load dispatch, rely on the short-term forecast, so it should be as accurate as possible. To this end, this paper proposes a short-term load predictor, able to forecast the next 24 h of load. Using random forest, characterized by immunity to parameter variations and internal cross validation, the model is constructed following an online learning process. The inputs are refined by expert feature selection using a set of if–then rules, in order to include the user's own specifications about the country's weather or market, and to generalize the forecasting ability. The proposed approach is tested through a real historical set from the Tunisian Power Company, and the simulation shows accurate and satisfactory results for one day in advance, with an average error rarely exceeding 2.3%. The model is validated for regular working days and weekends, and special attention is paid to moving holidays, which follow a non-Gregorian calendar.

  19. Continuous-Time Mean-Variance Portfolio Selection with Random Horizon

    International Nuclear Information System (INIS)

    Yu, Zhiyong

    2013-01-01

    This paper examines the continuous-time mean-variance optimal portfolio selection problem with random market parameters and random time horizon. Treating this problem as a linearly constrained stochastic linear-quadratic optimal control problem, I explicitly derive the efficient portfolios and efficient frontier in closed forms based on the solutions of two backward stochastic differential equations. Some related issues such as a minimum variance portfolio and a mutual fund theorem are also addressed. All the results are markedly different from those in the problem with deterministic exit time. A key part of my analysis involves proving the global solvability of a stochastic Riccati equation, which is interesting in its own right

  20. Continuous-Time Mean-Variance Portfolio Selection with Random Horizon

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Zhiyong, E-mail: yuzhiyong@sdu.edu.cn [Shandong University, School of Mathematics (China)

    2013-12-15

    This paper examines the continuous-time mean-variance optimal portfolio selection problem with random market parameters and random time horizon. Treating this problem as a linearly constrained stochastic linear-quadratic optimal control problem, I explicitly derive the efficient portfolios and efficient frontier in closed forms based on the solutions of two backward stochastic differential equations. Some related issues such as a minimum variance portfolio and a mutual fund theorem are also addressed. All the results are markedly different from those in the problem with deterministic exit time. A key part of my analysis involves proving the global solvability of a stochastic Riccati equation, which is interesting in its own right.

  1. TEHRAN AIR POLLUTANTS PREDICTION BASED ON RANDOM FOREST FEATURE SELECTION METHOD

    Directory of Open Access Journals (Sweden)

    A. Shamsoddini

    2017-09-01

    Full Text Available Air pollution, as one of the most serious forms of environmental pollution, poses a huge threat to human life. Air pollution leads to environmental instability, and has harmful and undesirable effects on the environment. Modern prediction methods of the pollutant concentration are able to improve decision making and provide appropriate solutions. This study examines the performance of the Random Forest feature selection in combination with multiple-linear regression and Multilayer Perceptron Artificial Neural Networks methods, in order to achieve an efficient model to estimate carbon monoxide, nitrogen dioxide, sulfur dioxide and PM2.5 contents in the air. The results indicated that Artificial Neural Networks fed by the attributes selected by the Random Forest feature selection method performed more accurately than the other models for the modeling of all pollutants. The estimation accuracy for sulfur dioxide emissions was lower than for the other air contaminants, whereas nitrogen dioxide was predicted more accurately than the other pollutants.

  2. Tehran Air Pollutants Prediction Based on Random Forest Feature Selection Method

    Science.gov (United States)

    Shamsoddini, A.; Aboodi, M. R.; Karami, J.

    2017-09-01

    Air pollution, as one of the most serious forms of environmental pollution, poses a huge threat to human life. Air pollution leads to environmental instability, and has harmful and undesirable effects on the environment. Modern prediction methods of the pollutant concentration are able to improve decision making and provide appropriate solutions. This study examines the performance of the Random Forest feature selection in combination with multiple-linear regression and Multilayer Perceptron Artificial Neural Networks methods, in order to achieve an efficient model to estimate carbon monoxide, nitrogen dioxide, sulfur dioxide and PM2.5 contents in the air. The results indicated that Artificial Neural Networks fed by the attributes selected by the Random Forest feature selection method performed more accurately than the other models for the modeling of all pollutants. The estimation accuracy for sulfur dioxide emissions was lower than for the other air contaminants, whereas nitrogen dioxide was predicted more accurately than the other pollutants.

  3. Generalized random walk algorithm for the numerical modeling of complex diffusion processes

    CERN Document Server

    Vamos, C; Vereecken, H

    2003-01-01

    A generalized form of the random walk algorithm to simulate diffusion processes is introduced. Unlike the usual approach, at a given time all the particles from a grid node are simultaneously scattered using the Bernoulli repartition. This procedure saves memory and computing time and no restrictions are imposed for the maximum number of particles to be used in simulations. We prove that for simple diffusion the method generalizes the finite difference scheme and gives the same precision for large enough number of particles. As an example, simulations of diffusion in random velocity field are performed and the main features of the stochastic mathematical model are numerically tested.
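
    A one-dimensional sketch of the global scattering step described above: all particles at a node are redistributed by a single multinomial (Bernoulli-type) draw rather than moved individually. The jump probabilities, boundary handling, and grid size are assumptions for illustration only.

```python
import numpy as np

def grw_step(counts, p_left=0.25, p_right=0.25, rng=np.random.default_rng(0)):
    """One step of a 1D 'global' random walk: the counts[i] particles sitting
    at node i are scattered together into left/stay/right by one multinomial
    draw. Particles that would leave the grid are kept at the boundary node."""
    new_counts = np.zeros_like(counts)
    probs = [p_left, 1.0 - p_left - p_right, p_right]
    last = counts.size - 1
    for i, n in enumerate(counts):
        if n == 0:
            continue
        left, stay, right = rng.multinomial(n, probs)
        new_counts[i] += stay
        new_counts[max(i - 1, 0)] += left       # clamped at the left edge
        new_counts[min(i + 1, last)] += right   # clamped at the right edge
    return new_counts

counts = np.zeros(101, dtype=int)
counts[50] = 1_000_000                 # all particles start at the centre node
for _ in range(200):
    counts = grw_step(counts)
print(counts[45:56])                   # roughly a discretized Gaussian profile
```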

  4. Generalized random walk algorithm for the numerical modeling of complex diffusion processes

    International Nuclear Information System (INIS)

    Vamos, Calin; Suciu, Nicolae; Vereecken, Harry

    2003-01-01

    A generalized form of the random walk algorithm to simulate diffusion processes is introduced. Unlike the usual approach, at a given time all the particles from a grid node are simultaneously scattered using the Bernoulli repartition. This procedure saves memory and computing time and no restrictions are imposed for the maximum number of particles to be used in simulations. We prove that for simple diffusion the method generalizes the finite difference scheme and gives the same precision for large enough number of particles. As an example, simulations of diffusion in random velocity field are performed and the main features of the stochastic mathematical model are numerically tested

  5. Hebbian Learning in a Random Network Captures Selectivity Properties of the Prefrontal Cortex

    Science.gov (United States)

    Lindsay, Grace W.

    2017-01-01

    Complex cognitive behaviors, such as context-switching and rule-following, are thought to be supported by the prefrontal cortex (PFC). Neural activity in the PFC must thus be specialized to specific tasks while retaining flexibility. Nonlinear “mixed” selectivity is an important neurophysiological trait for enabling complex and context-dependent behaviors. Here we investigate (1) the extent to which the PFC exhibits computationally relevant properties, such as mixed selectivity, and (2) how such properties could arise via circuit mechanisms. We show that PFC cells recorded from male and female rhesus macaques during a complex task show a moderate level of specialization and structure that is not replicated by a model wherein cells receive random feedforward inputs. While random connectivity can be effective at generating mixed selectivity, the data show significantly more mixed selectivity than predicted by a model with otherwise matched parameters. A simple Hebbian learning rule applied to the random connectivity, however, increases mixed selectivity and enables the model to match the data more accurately. To explain how learning achieves this, we provide analysis along with a clear geometric interpretation of the impact of learning on selectivity. After learning, the model also matches the data on measures of noise, response density, clustering, and the distribution of selectivities. Of two styles of Hebbian learning tested, the simpler and more biologically plausible option better matches the data. These modeling results provide clues about how neural properties important for cognition can arise in a circuit and make clear experimental predictions regarding how various measures of selectivity would evolve during animal training. SIGNIFICANCE STATEMENT The prefrontal cortex is a brain region believed to support the ability of animals to engage in complex behavior. How neurons in this area respond to stimuli—and in particular, to combinations of stimuli (

  6. Performance Evaluation of User Selection Protocols in Random Networks with Energy Harvesting and Hardware Impairments

    Directory of Open Access Journals (Sweden)

    Tan Nhat Nguyen

    2016-01-01

    Full Text Available In this paper, we evaluate the performance of various user selection protocols under the impact of hardware impairments. In the considered protocols, a Base Station (BS) selects one of the available Users (USs) to serve, while the remaining USs harvest energy from the Radio Frequency (RF) signal transmitted by the BS. We assume that all of the USs appear at random locations around the BS. In the Random Selection Protocol (RAN), the BS randomly selects a US to transmit the data. In the second proposed protocol, named Minimum Distance Protocol (MIND), the US that is nearest to the BS will be chosen. In the Optimal Selection Protocol (OPT), the US providing the highest channel gain between itself and the BS will be served. For performance evaluation, we derive exact and asymptotic closed-form expressions of the average Outage Probability (OP) over Rayleigh fading channels. We also consider the average harvested energy per US. Finally, Monte-Carlo simulations are performed to verify the theoretical results.
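
    The three selection rules can be compared with a rough Monte-Carlo sketch of their outage rates; the disc geometry, Rayleigh fading, path-loss exponent, and outage threshold are assumptions, and hardware impairments and energy harvesting are not modeled here.

```python
import numpy as np

rng = np.random.default_rng(1)

def outage_rate(protocol, n_users=5, n_trials=20_000, path_loss=3.0, threshold=1.0):
    """Monte-Carlo outage-rate sketch for RAN / MIND / OPT user selection,
    with users dropped at random around the BS (a minimum distance is
    imposed to avoid a singular path-loss term)."""
    outages = 0
    for _ in range(n_trials):
        d = np.sqrt(rng.uniform(0.1, 1.0, n_users))   # random user distances
        h = rng.exponential(1.0, n_users)             # Rayleigh fading power gains
        gains = h * d ** (-path_loss)                 # received SNR up to a constant
        if protocol == "RAN":                         # pick any user at random
            g = gains[rng.integers(n_users)]
        elif protocol == "MIND":                      # pick the nearest user
            g = gains[np.argmin(d)]
        else:                                         # "OPT": pick the best channel
            g = gains.max()
        outages += g < threshold
    return outages / n_trials

for proto in ("RAN", "MIND", "OPT"):
    print(proto, outage_rate(proto))
```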

  7. The reliability of randomly selected final year pharmacy students in ...

    African Journals Online (AJOL)

    Employing ANOVA, factorial experimental analysis, and the theory of error, reliability studies were conducted on the assessment of the drug product chloroquine phosphate tablets. The G–Study employed equal numbers of the factors for uniform control, and involved three analysts (randomly selected final year Pharmacy ...

  8. Random 2D Composites and the Generalized Method of Schwarz

    Directory of Open Access Journals (Sweden)

    Vladimir Mityushev

    2015-01-01

    Full Text Available Two-phase composites with nonoverlapping inclusions randomly embedded in matrix are investigated. A straightforward approach is applied to estimate the effective properties of random 2D composites. First, deterministic boundary value problems are solved for all locations of inclusions, that is, for all events of the considered probabilistic space C by the generalized method of Schwarz. Second, the effective properties are calculated in analytical form and averaged over C. This method is related to the traditional method based on the average probabilistic values involving the n-point correlation functions. However, we avoid computation of the correlation functions and compute their weighted moments of high orders by an indirect method which does not address the correlation functions. The effective properties are exactly expressed through these moments. It is proved that the generalized method of Schwarz converges for an arbitrary multiply connected doubly periodic domain and for an arbitrary contrast parameter. The proposed method yields an algorithm which can be applied with symbolic computations. The Torquato-Milton parameter ζ1 is exactly written for circular inclusions.

  9. On the number of vertices of each rank in phylogenetic trees and their generalizations

    OpenAIRE

    Bóna, Miklós

    2015-01-01

    We find surprisingly simple formulas for the limiting probability that the rank of a randomly selected vertex in a randomly selected phylogenetic tree or generalized phylogenetic tree is a given integer.

  10. The mathematics of random mutation and natural selection for multiple simultaneous selection pressures and the evolution of antimicrobial drug resistance.

    Science.gov (United States)

    Kleinman, Alan

    2016-12-20

    The random mutation and natural selection phenomenon acts in a mathematically predictable way, which, when understood, leads to approaches that reduce and prevent the failure of these selection pressures when treating infections and cancers. The underlying principle for impairing the random mutation and natural selection phenomenon is to use combination therapy, which forces the population to evolve against multiple selection pressures simultaneously and thereby invokes the multiplication rule of probabilities. Recently, it has been seen that combination therapy for the treatment of malaria has failed to prevent the emergence of drug-resistant variants. Using this empirical example and the principles of probability theory, the derivation of the equations describing this treatment failure is carried out. These equations give guidance as to how to use combination therapy for the treatment of cancers and infectious diseases and prevent the emergence of drug resistance. Copyright © 2016 John Wiley & Sons, Ltd.
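
    The multiplication rule at the heart of this argument can be illustrated with back-of-the-envelope numbers; the mutation rate and replication count below are hypothetical values chosen only to show the arithmetic, not figures from the paper.

```python
import math

mutation_rate = 1e-9      # assumed P(one replication yields a variant resistant to one drug)
replications = 1e12       # assumed number of pathogen replications during treatment

def p_at_least_one(p_event, trials):
    """P(event occurs at least once in `trials` independent attempts),
    computed with log1p/expm1 to stay accurate for tiny probabilities."""
    return -math.expm1(trials * math.log1p(-p_event))

p_single = p_at_least_one(mutation_rate, replications)        # ~1: single-drug resistance is expected
p_double = p_at_least_one(mutation_rate ** 2, replications)   # multiplication rule for two drugs at once

print(f"single-drug resistance: {p_single:.4f}")
print(f"two-drug resistance:    {p_double:.2e}")
```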

  11. Mirtazapine in generalized social anxiety disorder: a randomized, double-blind, placebo-controlled study

    NARCIS (Netherlands)

    Schutters, Sara I. J.; van Megen, Harold J. G. M.; van Veen, Jantien Frederieke; Denys, Damiaan A. J. P.; Westenberg, Herman G. M.

    2010-01-01

    This study is aimed at investigating the efficacy and tolerability of mirtazapine in generalized social anxiety disorder. Sixty patients with generalized social anxiety disorder were randomly allocated to receive mirtazapine (30-45 mg/day) (n = 30) or placebo (n = 30) for 12 weeks in a

  12. New Results On the Sum of Two Generalized Gaussian Random Variables

    KAUST Repository

    Soury, Hamza

    2015-01-01

    We propose in this paper a new method to compute the characteristic function (CF) of a generalized Gaussian (GG) random variable in terms of the Fox H function. The CF of the sum of two independent GG random variables is then deduced. Based on these results, the probability density function (PDF) and the cumulative distribution function (CDF) of the sum distribution are obtained. These functions are expressed in terms of the bivariate Fox H function. Next, the statistics of the distribution of the sum, such as the moments, the cumulant, and the kurtosis, are analyzed and computed. Due to the complexity of the bivariate Fox H function, a solution to reduce such complexity is to approximate the sum of two independent GG random variables by one GG random variable with a suitable shape factor. The approximation method depends on the utility of the system, so three methods of estimating the shape factor are studied and presented.
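
    One of the shape-factor estimation strategies mentioned above can be sketched numerically: match the kurtosis of the empirical sum to that of a single generalized Gaussian and solve for its shape factor. This is a hedged illustration using SciPy's gennorm distribution, not the Fox-H-based derivation of the paper; the shape factors and scales below are arbitrary.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.special import gamma
from scipy.stats import gennorm, kurtosis

rng = np.random.default_rng(1)

def gg_excess_kurtosis(beta):
    # Excess kurtosis of a generalized Gaussian with shape factor beta
    return gamma(5 / beta) * gamma(1 / beta) / gamma(3 / beta) ** 2 - 3.0

# Two independent GG samples with assumed shape factors (1.2 and 2.5) and scales
x = gennorm.rvs(1.2, scale=1.0, size=200_000, random_state=rng)
y = gennorm.rvs(2.5, scale=0.7, size=200_000, random_state=rng)
s = x + y

target = kurtosis(s)   # excess kurtosis of the empirical sum
beta_hat = brentq(lambda b: gg_excess_kurtosis(b) - target, 0.3, 10.0)
print(f"shape factor of the approximating single GG: {beta_hat:.3f}")
```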

  13. New Results on the Sum of Two Generalized Gaussian Random Variables

    KAUST Repository

    Soury, Hamza

    2016-01-06

    We propose in this paper a new method to compute the characteristic function (CF) of a generalized Gaussian (GG) random variable in terms of the Fox H function. The CF of the sum of two independent GG random variables is then deduced. Based on these results, the probability density function (PDF) and the cumulative distribution function (CDF) of the sum distribution are obtained. These functions are expressed in terms of the bivariate Fox H function. Next, the statistics of the distribution of the sum, such as the moments, the cumulant, and the kurtosis, are analyzed and computed. Due to the complexity of the bivariate Fox H function, a solution to reduce such complexity is to approximate the sum of two independent GG random variables by one GG random variable with a suitable shape factor. The approximation method depends on the utility of the system, so three methods of estimating the shape factor are studied and presented [1].

  14. New Results on the Sum of Two Generalized Gaussian Random Variables

    KAUST Repository

    Soury, Hamza; Alouini, Mohamed-Slim

    2016-01-01

    We propose in this paper a new method to compute the characteristic function (CF) of a generalized Gaussian (GG) random variable in terms of the Fox H function. The CF of the sum of two independent GG random variables is then deduced. Based on these results, the probability density function (PDF) and the cumulative distribution function (CDF) of the sum distribution are obtained. These functions are expressed in terms of the bivariate Fox H function. Next, the statistics of the distribution of the sum, such as the moments, the cumulant, and the kurtosis, are analyzed and computed. Due to the complexity of the bivariate Fox H function, a solution to reduce such complexity is to approximate the sum of two independent GG random variables by one GG random variable with a suitable shape factor. The approximation method depends on the utility of the system, so three methods of estimating the shape factor are studied and presented [1].

  15. Implementation of selective prevention for cardiometabolic diseases; are Dutch general practices adequately prepared?

    Science.gov (United States)

    Stol, Daphne M; Hollander, Monika; Nielen, Markus M J; Badenbroek, Ilse F; Schellevis, François G; de Wit, Niek J

    2018-03-01

    Current guidelines acknowledge the need for cardiometabolic disease (CMD) prevention and recommend five-yearly screening of a targeted population. In recent years programs for selective CMD-prevention have been developed, but implementation is challenging. The question arises whether general practices are adequately prepared. Therefore, the aim of this study is to assess the organizational preparedness of Dutch general practices and the facilitators and barriers for performing CMD-prevention in practices currently implementing selective CMD-prevention. Design: observational study. Setting: Dutch primary care. Subjects: general practices. Main outcome measures: organizational characteristics. General practices implementing selective CMD-prevention are more often organized as a group practice (49% vs. 19%, p = .000) and are better organized regarding chronic disease management compared to reference practices. They are motivated to perform CMD-prevention and can be considered 'frontrunners' among Dutch general practices with respect to their practice organization. The most important reported barriers are a limited availability of staff (59%) and inadequate funding (41%). The organizational infrastructure of Dutch general practices is considered adequate for performing most steps of selective CMD-prevention. Implementation of prevention programs including easily accessible lifestyle interventions needs attention. All stakeholders involved share the responsibility to realize structural funding for programmed CMD-prevention. The aforementioned conditions should be taken into account with respect to future implementation of selective CMD-prevention. Key points: • There is need for adequate CMD prevention, but little is known about the organization of selective CMD prevention in general practices. • The organizational infrastructure of Dutch general practices is adequate for performing most steps of selective CMD prevention. • Implementation of selective CMD prevention programs including easily accessible

  16. The signature of positive selection at randomly chosen loci.

    Science.gov (United States)

    Przeworski, Molly

    2002-03-01

    In Drosophila and humans, there are accumulating examples of loci with a significant excess of high-frequency-derived alleles or high levels of linkage disequilibrium, relative to a neutral model of a random-mating population of constant size. These are features expected after a recent selective sweep. Their prevalence suggests that positive directional selection may be widespread in both species. However, as I show here, these features do not persist long after the sweep ends: The high-frequency alleles drift to fixation and no longer contribute to polymorphism, while linkage disequilibrium is broken down by recombination. As a result, loci chosen without independent evidence of recent selection are not expected to exhibit either of these features, even if they have been affected by numerous sweeps in their genealogical history. How then can we explain the patterns in the data? One possibility is population structure, with unequal sampling from different subpopulations. Alternatively, positive selection may not operate as is commonly modeled. In particular, the rate of fixation of advantageous mutations may have increased in the recent past.

  17. Site selection and general layout of heap leaching uranium mill

    International Nuclear Information System (INIS)

    Zhang Chunmao; Rongfeng

    2011-01-01

    The site selection and general layout of a uranium mill are important tasks in the design and consultation stage of uranium mining and metallurgy engineering construction. Based on design practice, the principles and methods for the site selection and general layout of a heap leaching uranium mill are analyzed and studied. Some problems that deserve particular attention in the design are discussed, in the hope of providing a useful reference for the design and consultation of similar projects. (authors)

  18. Random matrix analysis of the QCD sign problem for general topology

    International Nuclear Information System (INIS)

    Bloch, Jacques; Wettig, Tilo

    2009-01-01

    Motivated by the important role played by the phase of the fermion determinant in the investigation of the sign problem in lattice QCD at nonzero baryon density, we derive an analytical formula for the average phase factor of the fermion determinant for general topology in the microscopic limit of chiral random matrix theory at nonzero chemical potential, for both the quenched and the unquenched case. The formula is a nontrivial extension of the expression for zero topology derived earlier by Splittorff and Verbaarschot. Our analytical predictions are verified by detailed numerical random matrix simulations of the quenched theory.

  19. Differential privacy-based evaporative cooling feature selection and classification with relief-F and random forests.

    Science.gov (United States)

    Le, Trang T; Simmons, W Kyle; Misaki, Masaya; Bodurka, Jerzy; White, Bill C; Savitz, Jonathan; McKinney, Brett A

    2017-09-15

    Classification of individuals into disease or clinical categories from high-dimensional biological data with low prediction error is an important challenge of statistical learning in bioinformatics. Feature selection can improve classification accuracy but must be incorporated carefully into cross-validation to avoid overfitting. Recently, feature selection methods based on differential privacy, such as differentially private random forests and reusable holdout sets, have been proposed. However, for domains such as bioinformatics, where the number of features is much larger than the number of observations (p ≫ n), these differential privacy methods are susceptible to overfitting. We introduce private Evaporative Cooling, a stochastic privacy-preserving machine learning algorithm that uses Relief-F for feature selection and random forest for privacy-preserving classification and that also prevents overfitting. We relate the privacy-preserving threshold mechanism to a thermodynamic Maxwell-Boltzmann distribution, where the temperature represents the privacy threshold. We use the thermal statistical physics concept of Evaporative Cooling of atomic gases to perform backward stepwise privacy-preserving feature selection. On simulated data with main effects and statistical interactions, we compare accuracies on holdout and validation sets for three privacy-preserving methods: the reusable holdout, reusable holdout with random forest, and private Evaporative Cooling, which uses Relief-F feature selection and random forest classification. In simulations where interactions exist between attributes, private Evaporative Cooling provides higher classification accuracy without overfitting based on an independent validation set. In simulations without interactions, thresholdout with random forest and private Evaporative Cooling give comparable accuracies. We also apply these privacy methods to human brain resting-state fMRI data from a study of major depressive disorder. Code

  20. A simulation-based goodness-of-fit test for random effects in generalized linear mixed models

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus

    2006-01-01

    The goodness-of-fit of the distribution of random effects in a generalized linear mixed model is assessed using a conditional simulation of the random effects conditional on the observations. Provided that the specified joint model for random effects and observations is correct, the marginal...... distribution of the simulated random effects coincides with the assumed random effects distribution. In practice, the specified model depends on some unknown parameter which is replaced by an estimate. We obtain a correction for this by deriving the asymptotic distribution of the empirical distribution...

  1. A simulation-based goodness-of-fit test for random effects in generalized linear mixed models

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus Plenge

    The goodness-of-fit of the distribution of random effects in a generalized linear mixed model is assessed using a conditional simulation of the random effects conditional on the observations. Provided that the specified joint model for random effects and observations is correct, the marginal...... distribution of the simulated random effects coincides with the assumed random effects distribution. In practice the specified model depends on some unknown parameter which is replaced by an estimate. We obtain a correction for this by deriving the asymptotic distribution of the empirical distribution function...

  2. How random is a random vector?

    International Nuclear Information System (INIS)

    Eliazar, Iddo

    2015-01-01

    Over 80 years ago Samuel Wilks proposed that the “generalized variance” of a random vector is the determinant of its covariance matrix. To date, the notion and use of the generalized variance is confined only to very specific niches in statistics. In this paper we establish that the “Wilks standard deviation”, the square root of the generalized variance, is indeed the standard deviation of a random vector. We further establish that the “uncorrelation index”, a derivative of the Wilks standard deviation, is a measure of the overall correlation between the components of a random vector. Both the Wilks standard deviation and the uncorrelation index are, respectively, special cases of two general notions that we introduce: “randomness measures” and “independence indices” of random vectors. In turn, these general notions give rise to “randomness diagrams”, tangible planar visualizations that answer the question: How random is a random vector? The notion of “independence indices” yields a novel measure of correlation for Lévy laws. In general, the concepts and results presented in this paper are applicable to any field of science and engineering with random-vectors empirical data.
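
    A minimal numerical sketch of the quantities discussed above on synthetic data (a plain sample covariance estimate is assumed, and the "uncorrelation-type index" shown is one plausible normalization rather than necessarily the paper's exact definition): the generalized variance is the determinant of the covariance matrix, and the Wilks standard deviation is its square root.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic 3-dimensional random vector with correlated components
cov_true = np.array([[1.0, 0.6, 0.2],
                     [0.6, 2.0, 0.5],
                     [0.2, 0.5, 1.5]])
x = rng.multivariate_normal(mean=np.zeros(3), cov=cov_true, size=10_000)

cov_hat = np.cov(x, rowvar=False)
generalized_variance = np.linalg.det(cov_hat)   # Wilks' generalized variance
wilks_std = np.sqrt(generalized_variance)       # Wilks standard deviation

# Ratio of det(cov) to the product of the marginal variances: equals 1 for
# uncorrelated components and decreases as correlations grow.
uncorrelation_index = generalized_variance / np.prod(np.diag(cov_hat))

print(f"Wilks standard deviation ~ {wilks_std:.3f}")
print(f"uncorrelation-type index ~ {uncorrelation_index:.3f}")
```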

  3. How random is a random vector?

    Science.gov (United States)

    Eliazar, Iddo

    2015-12-01

    Over 80 years ago Samuel Wilks proposed that the "generalized variance" of a random vector is the determinant of its covariance matrix. To date, the notion and use of the generalized variance is confined only to very specific niches in statistics. In this paper we establish that the "Wilks standard deviation", the square root of the generalized variance, is indeed the standard deviation of a random vector. We further establish that the "uncorrelation index", a derivative of the Wilks standard deviation, is a measure of the overall correlation between the components of a random vector. Both the Wilks standard deviation and the uncorrelation index are, respectively, special cases of two general notions that we introduce: "randomness measures" and "independence indices" of random vectors. In turn, these general notions give rise to "randomness diagrams", tangible planar visualizations that answer the question: How random is a random vector? The notion of "independence indices" yields a novel measure of correlation for Lévy laws. In general, the concepts and results presented in this paper are applicable to any field of science and engineering with random-vectors empirical data.

  4. Survivor bias in Mendelian randomization analysis

    DEFF Research Database (Denmark)

    Vansteelandt, Stijn; Dukes, Oliver; Martinussen, Torben

    2017-01-01

    Mendelian randomization studies employ genotypes as experimental handles to infer the effect of genetically modified exposures (e.g. vitamin D exposure) on disease outcomes (e.g. mortality). The statistical analysis of these studies makes use of the standard instrumental variables framework. Many...... of these studies focus on elderly populations, thereby ignoring the problem of left truncation, which arises due to the selection of study participants being conditional upon surviving up to the time of study onset. Such selection, in general, invalidates the assumptions on which the instrumental variables...... analysis rests. We show that Mendelian randomization studies of adult or elderly populations will therefore, in general, return biased estimates of the exposure effect when the considered genotype affects mortality; in contrast, standard tests of the causal null hypothesis that the exposure does not affect...

  5. rFerns: An Implementation of the Random Ferns Method for General-Purpose Machine Learning

    Directory of Open Access Journals (Sweden)

    Miron B. Kursa

    2014-11-01

    Full Text Available Random ferns is a very simple yet powerful classification method originally introduced for specific computer vision tasks. In this paper, I show that this algorithm may be considered as a constrained decision tree ensemble and use this interpretation to introduce a series of modifications which enable the use of random ferns in general machine learning problems. Moreover, I extend the method with an internal error approximation and an attribute importance measure based on corresponding features of the random forest algorithm. I also present the R package rFerns containing an efficient implementation of this modified version of random ferns.

  6. Interference-aware random beam selection schemes for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed

    2012-10-19

    Spectrum sharing systems have been recently introduced to alleviate the problem of spectrum scarcity by allowing secondary unlicensed networks to share the spectrum with primary licensed networks under acceptable interference levels to the primary users. In this work, we develop interference-aware random beam selection schemes that provide enhanced performance for the secondary network under the condition that the interference observed by the receivers of the primary network is below a predetermined/acceptable value. We consider a secondary link composed of a transmitter equipped with multiple antennas and a single-antenna receiver sharing the same spectrum with a primary link composed of a single-antenna transmitter and a single-antenna receiver. The proposed schemes select a beam, among a set of power-optimized random beams, that maximizes the signal-to-interference-plus-noise ratio (SINR) of the secondary link while satisfying the primary interference constraint for different levels of feedback information describing the interference level at the primary receiver. For the proposed schemes, we develop a statistical analysis for the SINR statistics as well as the capacity and bit error rate (BER) of the secondary link.
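
    A hedged numerical sketch of the beam-selection idea described above (Rayleigh channels; the numbers of antennas and candidate beams, the noise power, and the interference threshold are illustrative assumptions, not the paper's setup): among a set of random beams, keep those whose interference at the primary receiver stays below the threshold and pick the one maximizing the secondary link quality.

```python
import numpy as np

rng = np.random.default_rng(3)
N_TX, N_BEAMS = 4, 8        # secondary transmit antennas and candidate random beams (assumed)
NOISE, I_MAX = 1e-2, 0.5    # noise power and primary interference threshold (assumed)

# Rayleigh channels: secondary Tx -> secondary Rx and secondary Tx -> primary Rx
h_s = (rng.normal(size=N_TX) + 1j * rng.normal(size=N_TX)) / np.sqrt(2)
h_p = (rng.normal(size=N_TX) + 1j * rng.normal(size=N_TX)) / np.sqrt(2)

# Random unit-norm beamforming vectors
beams = rng.normal(size=(N_BEAMS, N_TX)) + 1j * rng.normal(size=(N_BEAMS, N_TX))
beams /= np.linalg.norm(beams, axis=1, keepdims=True)

signal = np.abs(beams @ h_s) ** 2         # received power at the secondary receiver
interference = np.abs(beams @ h_p) ** 2   # interference caused at the primary receiver

feasible = interference <= I_MAX
if feasible.any():
    snr = signal / NOISE                  # primary-to-secondary interference not modeled here
    best = np.flatnonzero(feasible)[np.argmax(snr[feasible])]
    print(f"selected beam {best}: SNR ~ {10 * np.log10(snr[best]):.1f} dB, "
          f"primary interference ~ {interference[best]:.3f}")
else:
    print("no beam satisfies the primary interference constraint in this channel draw")
```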

  7. Topology-selective jamming of fully-connected, code-division random-access networks

    Science.gov (United States)

    Polydoros, Andreas; Cheng, Unjeng

    1990-01-01

    The purpose is to introduce certain models of topology selective stochastic jamming and examine its impact on a class of fully-connected, spread-spectrum, slotted ALOHA-type random access networks. The theory covers dedicated as well as half-duplex units. The dominant role of the spatial duty factor is established, and connections with the dual concept of time selective jamming are discussed. The optimal choices of coding rate and link access parameters (from the users' side) and the jamming spatial fraction are numerically established for DS and FH spreading.

  8. Generalization of Random Intercept Multilevel Models

    Directory of Open Access Journals (Sweden)

    Rehan Ahmad Khan

    2013-10-01

    Full Text Available The concept of random intercept models in a multilevel model developed by Goldstein (1986) has been extended to k levels. The random variation in intercepts at the individual level is marginally split into components by incorporating higher levels of hierarchy in the single-level model. So, one can control the random variation in intercepts by incorporating the higher levels in the model.

  9. Peculiarities of the statistics of spectrally selected fluorescence radiation in laser-pumped dye-doped random media

    Science.gov (United States)

    Yuvchenko, S. A.; Ushakova, E. V.; Pavlova, M. V.; Alonova, M. V.; Zimnyakov, D. A.

    2018-04-01

    We consider the practical realization of a new optical probe method for random media, defined as reference-free path-length interferometry with intensity-moments analysis. A peculiarity in the statistics of the spectrally selected fluorescence radiation in a laser-pumped dye-doped random medium is discussed. Previously established correlations between the second- and third-order moments of the intensity fluctuations in the random interference patterns, the coherence function of the probe radiation, and the path-difference probability density for the interfering partial waves in the medium are confirmed. The correlations were verified using statistical analysis of the spectrally selected fluorescence radiation emitted by a laser-pumped dye-doped random medium. An aqueous solution of Rhodamine 6G was used as the fluorescent dopant for ensembles of densely packed silica grains, which were pumped by the 532 nm radiation of a solid-state laser. The spectrum of the mean path length for a random medium was reconstructed.

  10. A generalization of random matrix theory and its application to statistical physics.

    Science.gov (United States)

    Wang, Duan; Zhang, Xin; Horvatic, Davor; Podobnik, Boris; Eugene Stanley, H

    2017-02-01

    To study the statistical structure of cross-correlations in empirical data, we generalize random matrix theory and propose a new method of cross-correlation analysis, known as autoregressive random matrix theory (ARRMT). ARRMT takes into account the influence of auto-correlations in the study of cross-correlations in multiple time series. We first analytically and numerically determine how auto-correlations affect the eigenvalue distribution of the correlation matrix. Then we introduce ARRMT with a detailed procedure of how to implement the method. Finally, we illustrate the method using two examples: inflation rate data and air pressure data for 95 US cities.

  11. Generalized Selectivity Description for Polymeric Ion-Selective Electrodes Based on the Phase Boundary Potential Model.

    Science.gov (United States)

    Bakker, Eric

    2010-02-15

    A generalized description of the response behavior of potentiometric polymer membrane ion-selective electrodes is presented on the basis of ion-exchange equilibrium considerations at the sample-membrane interface. This paper includes and extends on previously reported theoretical advances in a more compact yet more comprehensive form. Specifically, the phase boundary potential model is used to derive the origin of the Nernstian response behavior in a single expression, which is valid for a membrane containing any charge type and complex stoichiometry of ionophore and ion-exchanger. This forms the basis for a generalized expression of the selectivity coefficient, which may be used for the selectivity optimization of ion-selective membranes containing electrically charged and neutral ionophores of any desired stoichiometry. It is shown to reduce to expressions published previously for specialized cases, and may be effectively applied to problems relevant in modern potentiometry. The treatment is extended to mixed ion solutions, offering a comprehensive yet formally compact derivation of the response behavior of ion-selective electrodes to a mixture of ions of any desired charge. It is compared to predictions by the less accurate Nicolsky-Eisenman equation. The influence of ion fluxes or any form of electrochemical excitation is not considered here, but may be readily incorporated if an ion-exchange equilibrium at the interface may be assumed in these cases.

  12. Evaluation of general practitioners' routine assessment of patients ...

    African Journals Online (AJOL)

    The authors wished to establish the use of existing diabetes management guidelines by general practitioners (GPs) in the City of Tshwane (Pretoria) Metropolitan Municipality of South Africa. Method: A cross-sectional and descriptive study was conducted. A total of 50 randomly selected general practitioners participated in ...

  13. Effectiveness of oncogenetics training on general practitioners' consultation skills: a randomized controlled trial.

    Science.gov (United States)

    Houwink, Elisa J F; Muijtjens, Arno M M; van Teeffelen, Sarah R; Henneman, Lidewij; Rethans, Jan Joost; van der Jagt, Liesbeth E J; van Luijk, Scheltus J; Dinant, Geert Jan; van der Vleuten, Cees; Cornel, Martina C

    2014-01-01

    General practitioners are increasingly called upon to deliver genetic services and could play a key role in translating potentially life-saving advancements in oncogenetic technologies to patient care. If general practitioners are to make an effective contribution in this area, their genetics competencies need to be upgraded. The aim of this study was to investigate whether oncogenetics training for general practitioners improves their genetic consultation skills. In this pragmatic, blinded, randomized controlled trial, the intervention consisted of a 4-h training (December 2011 and April 2012), covering oncogenetic consultation skills (family history, familial risk assessment, and efficient referral), attitude (medical ethical issues), and clinical knowledge required in primary-care consultations. Outcomes were measured using observation checklists by unannounced standardized patients and self-reported questionnaires. Of 88 randomized general practitioners who initially agreed to participate, 56 completed all measurements. Key consultation skills significantly and substantially improved; regression coefficients after intervention were equivalent to 0.34 and 0.28 at 3-month follow-up, indicating a moderate effect size. Satisfaction and perceived applicability of newly learned skills were highly scored. The general practitioner-specific training proved to be a feasible, satisfactory, and clinically applicable method to improve oncogenetics consultation skills and could be used as an educational framework to inform future training activities with the ultimate aim of improving medical care.

  14. Attention Training in Individuals with Generalized Social Phobia: A Randomized Controlled Trial

    Science.gov (United States)

    Amir, Nader; Beard, Courtney; Taylor, Charles T.; Klumpp, Heide; Elias, Jason; Burns, Michelle; Chen, Xi

    2009-01-01

    The authors conducted a randomized, double-blind placebo-controlled trial to examine the efficacy of an attention training procedure in reducing symptoms of social anxiety in 44 individuals diagnosed with generalized social phobia (GSP). Attention training comprised a probe detection task in which pictures of faces with either a threatening or…

  15. Blind Measurement Selection: A Random Matrix Theory Approach

    KAUST Repository

    Elkhalil, Khalil

    2016-12-14

    This paper considers the problem of selecting a set of $k$ measurements from $n$ available sensor observations. The selected measurements should minimize a certain error function assessing the error in estimating a certain $m$ dimensional parameter vector. The exhaustive search inspecting each of the $\binom{n}{k}$ possible choices would require a very high computational complexity and as such is not practical for large $n$ and $k$. Alternative methods with low complexity have recently been investigated but their main drawbacks are that 1) they require perfect knowledge of the measurement matrix and 2) they need to be applied at the pace of change of the measurement matrix. To overcome these issues, we consider the asymptotic regime in which $k$, $n$ and $m$ grow large at the same pace. Tools from random matrix theory are then used to approximate in closed-form the most important error measures that are commonly used. The asymptotic approximations are then leveraged to select properly $k$ measurements exhibiting low values for the asymptotic error measures. Two heuristic algorithms are proposed: the first one merely consists in applying the convex optimization artifice to the asymptotic error measure. The second algorithm is a low-complexity greedy algorithm that attempts to look for a sufficiently good solution for the original minimization problem. The greedy algorithm can be applied to both the exact and the asymptotic error measures and can be thus implemented in blind and channel-aware fashions. We present two potential applications where the proposed algorithms can be used, namely antenna selection for uplink transmissions in large scale multi-user systems and sensor selection for wireless sensor networks. Numerical results are also presented and sustain the efficiency of the proposed blind methods in reaching the performances of channel-aware algorithms.
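
    A hedged sketch of the kind of selection problem described above, not the paper's random-matrix construction: greedily choose k rows of a measurement matrix to minimize the A-optimality criterion trace((H_S^T H_S + λI)^{-1}), a standard proxy for estimation error; the dimensions and the ridge term are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
n, m, k, lam = 60, 8, 12, 1e-3   # available measurements, parameter dimension, budget, ridge term (assumed)
H = rng.normal(size=(n, m))      # measurement matrix: one row per available sensor observation

def a_optimality(rows):
    """Trace of the (regularized) inverse Fisher information for the chosen rows."""
    Hs = H[list(rows)]
    return np.trace(np.linalg.inv(Hs.T @ Hs + lam * np.eye(m)))

selected, remaining = [], set(range(n))
for _ in range(k):
    # Greedy step: add the measurement that most reduces the error criterion
    best = min(remaining, key=lambda i: a_optimality(selected + [i]))
    selected.append(best)
    remaining.remove(best)

print("selected measurements:", sorted(selected))
print(f"A-optimality criterion ~ {a_optimality(selected):.3f}")
```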

  16. Working covariance model selection for generalized estimating equations.

    Science.gov (United States)

    Carey, Vincent J; Wang, You-Gan

    2011-11-20

    We investigate methods for data-based selection of working covariance models in the analysis of correlated data with generalized estimating equations. We study two selection criteria: Gaussian pseudolikelihood and a geodesic distance based on discrepancy between model-sensitive and model-robust regression parameter covariance estimators. The Gaussian pseudolikelihood is found in simulation to be reasonably sensitive for several response distributions and noncanonical mean-variance relations for longitudinal data. Application is also made to a clinical dataset. Assessment of adequacy of both correlation and variance models for longitudinal data should be routine in applications, and we describe open-source software supporting this practice. Copyright © 2011 John Wiley & Sons, Ltd.

  17. Subset selection from Type-I and Type-II generalized logistic populations

    NARCIS (Netherlands)

    Laan, van der M.J.; Laan, van der P.

    1996-01-01

    We give an introduction to the logistic and generalized logistic distributions. We obtain exact results for the probability of correct selection from Type-I and Type-II generalized logistic populations which only differ in their location parameter. Some open problems are formulated.

  18. Generalized Pareto for Pattern-Oriented Random Walk Modelling of Organisms' Movements.

    Directory of Open Access Journals (Sweden)

    Sophie Bertrand

    Full Text Available How organisms move and disperse is crucial to understand how population dynamics relates to the spatial heterogeneity of the environment. Random walk (RW) models are typical tools to describe movement patterns. Whether Lévy or alternative RW better describes forager movements is keenly debated. We get around this issue using the Generalized Pareto Distribution (GPD). GPD includes as specific cases Normal, exponential and power law distributions, which underlie Brownian, Poisson-like and Lévy walks respectively. Whereas previous studies typically confronted a limited set of candidate models, GPD lets the most likely RW model emerge from the data. We illustrate the wide applicability of the method using GPS-tracked seabird foraging movements and fishing vessel movements tracked by Vessel Monitoring System (VMS), both collected in the Peruvian pelagic ecosystem. The two parameters from the fitted GPD, a scale and a shape parameter, provide a synoptic characterization of the observed movement in terms of characteristic scale and diffusive property. They reveal and quantify the variability, among species and individuals, of the spatial strategies selected by predators foraging on a common prey field. The GPD parameters constitute relevant metrics for (1) providing a synthetic and pattern-oriented description of movement, (2) using top predators as ecosystem indicators and (3) studying the variability of spatial behaviour among species or among individuals with different personalities.
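
    A hedged sketch of the fitting step implied above, using SciPy's generalized Pareto distribution on synthetic step lengths (the simulated trajectories and the location parameter fixed at zero are assumptions for illustration): the fitted shape parameter separates exponential-like movement (shape near 0) from heavy-tailed, Lévy-like movement (shape clearly above 0).

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(5)

# Synthetic step lengths from two contrasting movement modes
steps_brownian_like = rng.exponential(scale=2.0, size=5_000)                      # light-tailed
steps_levy_like = genpareto.rvs(0.4, scale=2.0, size=5_000, random_state=rng)     # heavy-tailed

for name, steps in [("exponential-like", steps_brownian_like),
                    ("heavy-tailed", steps_levy_like)]:
    # Fix the location at zero so only the shape (c) and scale are estimated
    c_hat, _, scale_hat = genpareto.fit(steps, floc=0.0)
    print(f"{name}: shape ~ {c_hat:.3f}, scale ~ {scale_hat:.3f}")
```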

  19. Generalized Pareto for Pattern-Oriented Random Walk Modelling of Organisms' Movements.

    Science.gov (United States)

    Bertrand, Sophie; Joo, Rocío; Fablet, Ronan

    2015-01-01

    How organisms move and disperse is crucial to understand how population dynamics relates to the spatial heterogeneity of the environment. Random walk (RW) models are typical tools to describe movement patterns. Whether Lévy or alternative RW better describes forager movements is keenly debated. We get around this issue using the Generalized Pareto Distribution (GPD). GPD includes as specific cases Normal, exponential and power law distributions, which underlie Brownian, Poisson-like and Lévy walks respectively. Whereas previous studies typically confronted a limited set of candidate models, GPD lets the most likely RW model emerge from the data. We illustrate the wide applicability of the method using GPS-tracked seabird foraging movements and fishing vessel movements tracked by Vessel Monitoring System (VMS), both collected in the Peruvian pelagic ecosystem. The two parameters from the fitted GPD, a scale and a shape parameter, provide a synoptic characterization of the observed movement in terms of characteristic scale and diffusive property. They reveal and quantify the variability, among species and individuals, of the spatial strategies selected by predators foraging on a common prey field. The GPD parameters constitute relevant metrics for (1) providing a synthetic and pattern-oriented description of movement, (2) using top predators as ecosystem indicators and (3) studying the variability of spatial behaviour among species or among individuals with different personalities.

  20. A general method for handling missing binary outcome data in randomized controlled trials

    OpenAIRE

    Jackson, Dan; White, Ian R; Mason, Dan; Sutton, Stephen

    2014-01-01

    Aims: The analysis of randomized controlled trials with incomplete binary outcome data is challenging. We develop a general method for exploring the impact of missing data in such trials, with a focus on abstinence outcomes. Design: We propose a sensitivity analysis where standard analyses, which could include ‘missing = smoking’ and ‘last observation carried forward’, are embedded in a wider class of models. Setting: We apply our general method to data from two smoking cessation trials. Partici...

  1. Random geometric graphs with general connection functions

    Science.gov (United States)

    Dettmann, Carl P.; Georgiou, Orestis

    2016-03-01

    In the original (1961) Gilbert model of random geometric graphs, nodes are placed according to a Poisson point process, and links formed between those within a fixed range. Motivated by wireless ad hoc networks "soft" or "probabilistic" connection models have recently been introduced, involving a "connection function" H (r ) that gives the probability that two nodes at distance r are linked (directly connect). In many applications (not only wireless networks), it is desirable that the graph is connected; that is, every node is linked to every other node in a multihop fashion. Here the connection probability of a dense network in a convex domain in two or three dimensions is expressed in terms of contributions from boundary components for a very general class of connection functions. It turns out that only a few quantities such as moments of the connection function appear. Good agreement is found with special cases from previous studies and with numerical simulations.
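
    A hedged sketch of a soft random geometric graph of the kind described above, using networkx for the connectivity check (the Gaussian-like connection function H(r) = exp(-(r/r0)^2), the node density, and the unit-square domain are illustrative assumptions): nodes arrive by a Poisson point process, pairs link independently with probability H(r), and full connectivity is then tested.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(6)
DENSITY, R0 = 200, 0.12   # expected number of nodes and connection range scale (assumed)

def H(r):
    # "Soft" connection function: probability that two nodes at distance r are linked
    return np.exp(-(r / R0) ** 2)

n = rng.poisson(DENSITY)                    # Poisson point process in the unit square
pts = rng.uniform(0.0, 1.0, size=(n, 2))

G = nx.Graph()
G.add_nodes_from(range(n))
for i in range(n):
    for j in range(i + 1, n):
        if rng.random() < H(np.linalg.norm(pts[i] - pts[j])):
            G.add_edge(i, j)

print(f"{n} nodes, {G.number_of_edges()} links, connected: {nx.is_connected(G)}")
```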

  2. Using ArcMap, Google Earth, and Global Positioning Systems to select and locate random households in rural Haiti.

    Science.gov (United States)

    Wampler, Peter J; Rediske, Richard R; Molla, Azizur R

    2013-01-18

    A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground based surveys to collect household location data and can be performed in a rapid, cost-effective manner. The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. A total of 537 homes initially were mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed. 16.4% of the homes visited were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. The method used to generate and field locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent and only rarely was local knowledge required to identify and locate households. This

  3. Using ArcMap, Google Earth, and Global Positioning Systems to select and locate random households in rural Haiti

    Directory of Open Access Journals (Sweden)

    Wampler Peter J

    2013-01-01

    Full Text Available Background: A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground based surveys to collect household location data and can be performed in a rapid, cost-effective manner. Methods: The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. Results: A total of 537 homes initially were mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed. 16.4% of the homes visited were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. Conclusions: The method used to generate and field locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent and only

  4. UPDATE ON THE SELECTION OF THE FUTURE DIRECTOR-GENERAL AT CERN

    CERN Document Server

    2002-01-01

    In September 2002 the Committee of Council decided, in accordance with the selection procedure it adopted in 2000, to interview the candidate proposed by the Search Committee for the post of future Director-General. The Committee of Council conducted this interview on 14 November 2002. The selection procedure is moving forward, and it is expected that the conditions allowing the Council to make a decision on the nomination will be satisfied in December 2002. The term of the next Director-General begins on 1 January 2004. Further information on the procedure will be communicated as it becomes available.

  5. Yoga for generalized anxiety disorder: design of a randomized controlled clinical trial.

    Science.gov (United States)

    Hofmann, Stefan G; Curtiss, Joshua; Khalsa, Sat Bir S; Hoge, Elizabeth; Rosenfield, David; Bui, Eric; Keshaviah, Aparna; Simon, Naomi

    2015-09-01

    Generalized anxiety disorder (GAD) is a common disorder associated with significant distress and interference. Although cognitive behavioral therapy (CBT) has been shown to be the most effective form of psychotherapy, few patients receive or have access to this intervention. Yoga therapy offers another promising, yet under-researched, intervention that is gaining increasing popularity in the general public as an anxiety reduction intervention. The purpose of this innovative clinical trial protocol is to investigate the efficacy of a Kundalini Yoga intervention relative to CBT and a control condition. Kundalini yoga and CBT are compared with each other in a noninferiority test, and both treatments are compared to stress education training, an attention control intervention, in superiority tests. The sample will consist of 230 individuals with a primary DSM-5 diagnosis of GAD. This randomized controlled trial will compare yoga (N=95) to both CBT for GAD (N=95) and stress education (N=40), a commonly used control condition. All three treatments will be administered by two instructors in a group format over 12 weekly sessions with four to six patients per group. Groups will be randomized using permuted block randomization, which will be stratified by site. Treatment outcome will be evaluated bi-weekly and at 6-month follow-up. Furthermore, potential mediators of treatment outcome will be investigated. Given the individual and economic burden associated with GAD, identifying accessible alternative behavioral treatments will have substantive public health implications. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Optimizing Event Selection with the Random Grid Search

    Energy Technology Data Exchange (ETDEWEB)

    Bhat, Pushpalatha C. [Fermilab; Prosper, Harrison B. [Florida State U.; Sekmen, Sezen [Kyungpook Natl. U.; Stewart, Chip [Broad Inst., Cambridge

    2017-06-29

    The random grid search (RGS) is a simple, but efficient, stochastic algorithm to find optimal cuts that was developed in the context of the search for the top quark at Fermilab in the mid-1990s. The algorithm, and associated code, have been enhanced recently with the introduction of two new cut types, one of which has been successfully used in searches for supersymmetry at the Large Hadron Collider. The RGS optimization algorithm is described along with the recent developments, which are illustrated with two examples from particle physics. One explores the optimization of the selection of vector boson fusion events in the four-lepton decay mode of the Higgs boson and the other optimizes SUSY searches using boosted objects and the razor variables.
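
    A hedged illustration of the random grid search idea on synthetic two-variable signal and background samples (the figure of merit s/sqrt(s+b), the sample sizes, and the cut direction are assumptions, not the paper's analysis): candidate cut points are drawn from the signal events themselves, each pair of cuts is applied to both samples, and the best-scoring cut is kept.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic two-variable samples: the signal sits at higher values of both variables
signal = rng.normal(loc=[2.0, 1.5], scale=1.0, size=(5_000, 2))
background = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(50_000, 2))

# Random grid search: candidate cut points are the coordinates of randomly chosen signal events
candidates = signal[rng.choice(len(signal), size=500, replace=False)]

def figure_of_merit(cut):
    s = np.sum(np.all(signal > cut, axis=1))        # signal events passing both cuts
    b = np.sum(np.all(background > cut, axis=1))    # background events passing both cuts
    return s / np.sqrt(s + b) if s + b > 0 else 0.0

scores = np.array([figure_of_merit(c) for c in candidates])
best = candidates[np.argmax(scores)]
print(f"best cuts: x1 > {best[0]:.2f}, x2 > {best[1]:.2f}, figure of merit ~ {scores.max():.1f}")
```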

  7. Feature Import Vector Machine: A General Classifier with Flexible Feature Selection.

    Science.gov (United States)

    Ghosh, Samiran; Wang, Yazhen

    2015-02-01

    The support vector machine (SVM) and other reproducing kernel Hilbert space (RKHS) based classifier systems have drawn much attention recently due to their robustness and generalization capability. The general theme is to construct classifiers based on the training data in a high-dimensional space by using all available dimensions. The SVM achieves huge data compression by selecting only the few observations which lie close to the boundary of the classifier function. However, when the number of observations is not very large (small n) but the number of dimensions/features is large (large p), it is not necessarily the case that all available features are of equal importance in the classification context. Selecting a useful fraction of the available features may result in huge data compression. In this paper we propose an algorithmic approach by means of which such an optimal set of features can be selected. In short, we reverse the traditional sequential observation selection strategy of the SVM into one of sequential feature selection. To achieve this we have modified the solution proposed by Zhu and Hastie (2005) in the context of the import vector machine (IVM), to select an optimal sub-dimensional model to build the final classifier with sufficient accuracy.

  8. Non-random mating for selection with restricted rates of inbreeding and overlapping generations

    NARCIS (Netherlands)

    Sonesson, A.K.; Meuwissen, T.H.E.

    2002-01-01

    Minimum coancestry mating with a maximum of one offspring per mating pair (MC1) is compared with random mating schemes for populations with overlapping generations. Optimum contribution selection is used, whereby $\Delta F$ is restricted. For schemes with $\Delta F$ restricted to 0.25% per

  9. Generalizing Evidence From Randomized Clinical Trials to Target Populations

    Science.gov (United States)

    Cole, Stephen R.; Stuart, Elizabeth A.

    2010-01-01

    Properly planned and conducted randomized clinical trials remain susceptible to a lack of external validity. The authors illustrate a model-based method to standardize observed trial results to a specified target population using a seminal human immunodeficiency virus (HIV) treatment trial, and they provide Monte Carlo simulation evidence supporting the method. The example trial enrolled 1,156 HIV-infected adult men and women in the United States in 1996, randomly assigned 577 to a highly active antiretroviral therapy and 579 to a largely ineffective combination therapy, and followed participants for 52 weeks. The target population was US people infected with HIV in 2006, as estimated by the Centers for Disease Control and Prevention. Results from the trial apply, albeit muted by 12%, to the target population, under the assumption that the authors have measured and correctly modeled the determinants of selection that reflect heterogeneity in the treatment effect. In simulations with a heterogeneous treatment effect, a conventional intent-to-treat estimate was biased with poor confidence limit coverage, but the proposed estimate was largely unbiased with appropriate confidence limit coverage. The proposed method standardizes observed trial results to a specified target population and thereby provides information regarding the generalizability of trial results. PMID:20547574

  10. Derrida's Generalized Random Energy models; 4, Continuous state branching and coalescents

    CERN Document Server

    Bovier, A

    2003-01-01

    In this paper we conclude our analysis of Derrida's Generalized Random Energy Models (GREM) by identifying the thermodynamic limit with a one-parameter family of probability measures related to a continuous state branching process introduced by Neveu. Using a construction introduced by Bertoin and Le Gall in terms of a coherent family of subordinators related to Neveu's branching process, we show how the Gibbs geometry of the limiting Gibbs measure is given in terms of the genealogy of this process via a deterministic time-change. This construction is fully universal in that all different models (characterized by the covariance of the underlying Gaussian process) differ only through that time change, which in turn is expressed in terms of Parisi's overlap distribution. The proof uses strongly the Ghirlanda-Guerra identities that impose the structure of Neveu's process as the only possible asymptotic random mechanism.

  11. OPRA capacity bounds for selection diversity over generalized fading channels

    KAUST Repository

    Hanif, Muhammad Fainan

    2014-05-01

    Channel side information at the transmitter can increase the average capacity by enabling optimal power and rate adaptation. The resulting optimal power and rate adaptation (OPRA) capacity rarely has a closed-form analytic expression. In this paper, lower and upper bounds on OPRA capacity for selection diversity scheme are presented. These bounds hold for variety of fading channels including log-normal and generalized Gamma distributed models and have very simple analytic expressions for easy evaluation even for kth best path selection. Some selected numerical results show that the newly proposed bounds closely approximate the actual OPRA capacity. © 2014 IEEE.

  12. Comparative Evaluations of Randomly Selected Four Point-of-Care Glucometer Devices in Addis Ababa, Ethiopia.

    Science.gov (United States)

    Wolde, Mistire; Tarekegn, Getahun; Kebede, Tedla

    2018-05-01

    Point-of-care glucometer (PoCG) devices play a significant role in self-monitoring of the blood sugar level, particularly in the follow-up of high blood sugar therapeutic response. The aim of this study was to evaluate blood glucose test results performed with four randomly selected glucometers on diabetes and control subjects versus standard wet chemistry (hexokinase) methods in Addis Ababa, Ethiopia. A prospective cross-sectional study was conducted on 200 randomly selected study participants (100 participants with diabetes and 100 healthy controls). Four randomly selected PoCG devices (CareSens N, DIAVUE Prudential, On Call Extra, i-QARE DS-W) were evaluated against the hexokinase method and the ISO 15197:2003 and ISO 15197:2013 standards. The minimum and maximum blood sugar values were recorded by CareSens N (21 mg/dl) and the hexokinase method (498.8 mg/dl), respectively. The mean sugar values of all PoCG devices except On Call Extra showed significant differences compared with the reference hexokinase method. Meanwhile, all four PoCG devices had a strong positive relationship (>80%) with the reference method (hexokinase). On the other hand, none of the four PoCG devices fulfilled the minimum accuracy measurement set by the ISO 15197:2003 and ISO 15197:2013 standards. In addition, the linear regression analysis revealed that all four selected PoCG devices overestimated the glucose concentrations. Overall, the measurements from the four selected PoCG devices correlated poorly with the standard reference method. Therefore, before introducing PoCG devices to the market, there should be a standardized evaluation platform for validation. Further similar large-scale studies on other PoCG devices also need to be undertaken.

  13. Geography and genography: prediction of continental origin using randomly selected single nucleotide polymorphisms

    Directory of Open Access Journals (Sweden)

    Ramoni Marco F

    2007-03-01

    Full Text Available Background: Recent studies have shown that when individuals are grouped on the basis of genetic similarity, group membership corresponds closely to continental origin. There has been considerable debate about the implications of these findings in the context of larger debates about race and the extent of genetic variation between groups. Some have argued that clustering according to continental origin demonstrates the existence of significant genetic differences between groups and that these differences may have important implications for differences in health and disease. Others argue that clustering according to continental origin requires the use of large amounts of genetic data or specifically chosen markers and is indicative only of very subtle genetic differences that are unlikely to have biomedical significance. Results: We used small numbers of randomly selected single nucleotide polymorphisms (SNPs) from the International HapMap Project to train naïve Bayes classifiers for prediction of ancestral continent of origin. Predictive accuracy was tested on two independent data sets. Genetically similar groups should be difficult to distinguish, especially if only a small number of genetic markers are used. The genetic differences between continentally defined groups are sufficiently large that one can accurately predict ancestral continent of origin using only a minute, randomly selected fraction of the genetic variation present in the human genome. Genotype data from only 50 random SNPs was sufficient to predict ancestral continent of origin in our primary test data set with an average accuracy of 95%. Genetic variations informative about ancestry were common and widely distributed throughout the genome. Conclusion: Accurate characterization of ancestry is possible using small numbers of randomly selected SNPs. The results presented here show how investigators conducting genetic association studies can use small numbers of arbitrarily
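
    A hedged sketch of the classification approach described above, on synthetic genotypes rather than HapMap data (the number of SNPs, the allele-frequency model, and the population labels are assumptions): genotypes coded 0/1/2 are fed to a categorical naive Bayes classifier to predict continent of origin.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import CategoricalNB

rng = np.random.default_rng(8)
N_SNPS, N_PER_POP = 50, 300
populations = ["Africa", "Asia", "Europe"]

# Hypothetical per-population allele frequencies, then genotypes coded as minor-allele counts (0/1/2)
freqs = rng.uniform(0.05, 0.95, size=(len(populations), N_SNPS))
X = np.vstack([rng.binomial(2, freqs[k], size=(N_PER_POP, N_SNPS))
               for k in range(len(populations))])
y = np.repeat(populations, N_PER_POP)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
clf = CategoricalNB(min_categories=3).fit(X_tr, y_tr)   # three genotype categories per SNP
print(f"accuracy on held-out individuals: {clf.score(X_te, y_te):.3f}")
```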

  14. On polynomial selection for the general number field sieve

    Science.gov (United States)

    Kleinjung, Thorsten

    2006-12-01

    The general number field sieve (GNFS) is the asymptotically fastest algorithm for factoring large integers. Its runtime depends on a good choice of a polynomial pair. In this article we present an improvement of the polynomial selection method of Montgomery and Murphy which has been used in recent GNFS records.

  15. A theory of solving TAP equations for Ising models with general invariant random matrices

    DEFF Research Database (Denmark)

    Opper, Manfred; Çakmak, Burak; Winther, Ole

    2016-01-01

    We consider the problem of solving TAP mean field equations by iteration for Ising models with coupling matrices that are drawn at random from general invariant ensembles. We develop an analysis of iterative algorithms using a dynamical functional approach that in the thermodynamic limit yields...... the iteration dependent on a Gaussian distributed field only. The TAP magnetizations are stable fixed points if a de Almeida–Thouless stability criterion is fulfilled. We illustrate our method explicitly for coupling matrices drawn from the random orthogonal ensemble....

  16. Randomized controlled trial of the effect of medical audit on AIDS prevention in general practice

    DEFF Research Database (Denmark)

    Sandbæk, Annelli

    1999-01-01

    OBJECTIVE: We aimed to evaluate the effect of a medical audit on AIDS prevention in general practice. METHODS: We conducted a prospective randomized controlled study performed as 'lagged intervention'. At the time of comparison, the intervention group had completed 6 months of audit including a p... One hundred and thirty-three GPs completed the project. The main outcome measures were the number of consultations involving AIDS prevention and the number of talks about AIDS initiated by the GP, and some elements of the content were registered on a chart. RESULTS: No statistically significant ... of such consultations initiated by the GPs. CONCLUSIONS: Medical audit had no observed effect on AIDS prevention in general practice. Publication date (Udgivelsesdato): 1999-Oct.

  17. OPRA capacity bounds for selection diversity over generalized fading channels

    KAUST Repository

    Hanif, Muhammad Fainan; Yang, Hongchuan; Alouini, Mohamed-Slim

    2014-01-01

    Lower and upper bounds on OPRA capacity for the selection diversity scheme are presented. These bounds hold for a variety of fading channels, including log-normal and generalized Gamma distributed models, and have very simple analytic expressions for easy

  18. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology.

    Science.gov (United States)

    Fox, Eric W; Hill, Ryan A; Leibowitz, Scott G; Olsen, Anthony R; Thornbrugh, Darren J; Weber, Marc H

    2017-07-01

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological data sets, there is limited guidance on variable selection methods for RF modeling. Typically, either a preselected set of predictor variables are used or stepwise procedures are employed which iteratively remove variables according to their importance measures. This paper investigates the application of variable selection methods to RF models for predicting probable biological stream condition. Our motivating data set consists of the good/poor condition of n = 1365 stream survey sites from the 2008/2009 National Rivers and Stream Assessment, and a large set (p = 212) of landscape features from the StreamCat data set as potential predictors. We compare two types of RF models: a full variable set model with all 212 predictors and a reduced variable set model selected using a backward elimination approach. We assess model accuracy using RF's internal out-of-bag estimate, and a cross-validation procedure with validation folds external to the variable selection process. We also assess the stability of the spatial predictions generated by the RF models to changes in the number of predictors and argue that model selection needs to consider both accuracy and stability. The results suggest that RF modeling is robust to the inclusion of many variables of moderate to low importance. We found no substantial improvement in cross-validated accuracy as a result of variable reduction. Moreover, the backward elimination procedure tended to select too few variables and exhibited numerous issues such as upwardly biased out-of-bag accuracy estimates and instabilities in the spatial predictions. We use simulations to further support and generalize results from the analysis of real data. A main purpose of this work is to elucidate issues of model selection bias and instability to ecologists interested in
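
    A hedged sketch of the backward elimination procedure assessed above, using scikit-learn's random forest and its out-of-bag accuracy on synthetic data (the data set, the 20% drop fraction, and the stopping point are assumptions; the paper's analysis uses StreamCat predictors and cross-validation external to the selection):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=1000, n_features=100, n_informative=10, random_state=0)
features = np.arange(X.shape[1])
history = []

# Backward elimination: repeatedly drop the least important 20% of predictors
while len(features) > 5:
    rf = RandomForestClassifier(n_estimators=300, oob_score=True, random_state=0,
                                n_jobs=-1).fit(X[:, features], y)
    history.append((len(features), rf.oob_score_))
    order = np.argsort(rf.feature_importances_)              # ascending importance
    features = features[order[int(0.2 * len(features)):]]    # keep the top 80%

for n_feat, oob in history:
    print(f"{n_feat:3d} predictors -> OOB accuracy {oob:.3f}")
```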

  19. Fuzzy Random λ-Mean SAD Portfolio Selection Problem: An Ant Colony Optimization Approach

    Science.gov (United States)

    Thakur, Gour Sundar Mitra; Bhattacharyya, Rupak; Mitra, Swapan Kumar

    2010-10-01

    To reach an investment goal, one has to select a combination of securities from portfolios containing a large number of securities. Past records of each security alone do not guarantee the future return. As there are many uncertain factors which directly or indirectly influence the stock market, and there are also some newer stock markets which do not have enough historical data, experts' expectations and experience must be combined with the past records to generate an effective portfolio selection model. In this paper the return of each security is assumed to be a Fuzzy Random Variable Set (FRVS), where returns are sets of random numbers which are in turn fuzzy numbers. A new λ-Mean Semi Absolute Deviation (λ-MSAD) portfolio selection model is developed. The subjective opinions of the investors on the rate of return of each security are taken into consideration by introducing a pessimistic-optimistic parameter vector λ. The λ-MSAD model is preferred as it uses the absolute deviation of the rate of return of a portfolio, instead of the variance, as the measure of risk. As this model can be reduced to a Linear Programming Problem (LPP), it can be solved much faster than quadratic programming problems. Ant Colony Optimization (ACO) is used for solving the portfolio selection problem. ACO is a paradigm for designing meta-heuristic algorithms for combinatorial optimization problems. Data from BSE are used for illustration.
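
    A simplified, crisp sketch of a mean-semi-absolute-deviation portfolio selection LP in the spirit of the model above, without the fuzzy-random returns, the λ parameter vector, or the ACO solver described in the abstract; the scenario return matrix and target return are synthetic assumptions.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
T, n = 250, 6                          # return scenarios (e.g. trading days) and securities
R = rng.normal(0.0005, 0.01, (T, n))   # hypothetical scenario returns
mu = R.mean(axis=0)
rho = np.quantile(mu, 0.6)             # required expected portfolio return

# decision vector z = [x_1..x_n, d_1..d_T]; minimize the mean downside deviation
c = np.concatenate([np.zeros(n), np.ones(T) / T])
# d_t >= -(R_t - mu) @ x   <=>   -(R_t - mu) @ x - d_t <= 0
A_ub = np.hstack([-(R - mu), -np.eye(T)])
b_ub = np.zeros(T)
# expected return constraint: mu @ x >= rho
A_ub = np.vstack([A_ub, np.concatenate([-mu, np.zeros(T)])])
b_ub = np.append(b_ub, -rho)
# budget constraint: weights sum to one, no short selling
A_eq = np.concatenate([np.ones(n), np.zeros(T)])[None, :]
b_eq = [1.0]
bounds = [(0, 1)] * n + [(0, None)] * T

res = linprog(c, A_ub, b_ub, A_eq, b_eq, bounds=bounds, method="highs")
if res.success:
    print("weights:", np.round(res.x[:n], 3), " semi-absolute deviation:", res.fun)
```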

  20. Pediatric selective mutism therapy: a randomized controlled trial.

    Science.gov (United States)

    Esposito, Maria; Gimigliano, Francesca; Barillari, Maria R; Precenzano, Francesco; Ruberto, Maria; Sepe, Joseph; Barillari, Umberto; Gimigliano, Raffaele; Militerni, Roberto; Messina, Giovanni; Carotenuto, Marco

    2017-10-01

    Selective mutism (SM) is a rare disease in children coded by DSM-5 as an anxiety disorder. Despite the disabling nature of the disease, there is still no specific treatment. The aims of this study were to verify the efficacy of a six-month standard psychomotor treatment and the positive changes in lifestyle in a population of children affected by SM. Randomized controlled trial registered in the European Clinical Trials Registry (EudraCT 2015-001161-36). University third level Centre (Child and Adolescent Neuropsychiatry Clinic). The study population was composed of 67 children in group A (psychomotricity treatment) (35 M, mean age 7.84±1.15) and 71 children in group B (behavioral and educational counseling) (37 M, mean age 7.75±1.36). Psychomotor treatment was administered by trained child therapists in residential settings three times per week. Each child was treated for the whole period by the same therapist and all the therapists shared the same protocol. The standard psychomotor session length is 45 minutes. At T0 and after 6 months (T1) of treatment, patients underwent a behavioral and SM severity assessment. To verify the effects of the psychomotor management, the Child Behavior Checklist questionnaire (CBCL) and the Selective Mutism Questionnaire (SMQ) were administered to the parents. After 6 months of psychomotor treatment, SM children showed a significant reduction in CBCL scores such as social relations, anxious/depressed, social problems and total problems (P < 0.05), supporting psychomotor treatment for selective mutism, even if further studies are needed. The present study identifies psychomotricity as a safe and efficacious therapy for pediatric selective mutism.

  1. Primitive polynomials selection method for pseudo-random number generator

    Science.gov (United States)

    Anikin, I. V.; Alnajjar, Kh

    2018-01-01

    In this paper we suggest a method for the selection of primitive polynomials of a special type. This kind of polynomial can be efficiently used as the characteristic polynomial of the linear feedback shift registers in pseudo-random number generators. The proposed method consists of two basic steps: finding minimum-cost irreducible polynomials of the desired degree and applying primitivity tests to get the primitive ones. Finally, two primitive polynomials found by the proposed method were used in the pseudo-random number generator based on fuzzy logic (FRNG) which had been suggested before by the authors. The sequences generated by the new version of FRNG have low correlation magnitude, high linear complexity and better statistical properties, are more balanced, and the generator has lower power consumption.
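
    As a hedged illustration of why primitivity matters for LFSR-based generators (this is not the authors' FRNG), the sketch below clocks a Fibonacci LFSR whose feedback taps come from a commonly tabulated degree-16 primitive polynomial, x^16 + x^14 + x^13 + x^11 + 1, and verifies the maximal period of 2^16 - 1.

```python
# Fibonacci LFSR with taps (16, 14, 13, 11); positions are counted from the most significant bit.
def lfsr_step(state, taps=(16, 14, 13, 11), nbits=16):
    fb = 0
    for t in taps:                      # XOR the tapped bits to form the feedback bit
        fb ^= (state >> (nbits - t)) & 1
    return ((state >> 1) | (fb << (nbits - 1))) & ((1 << nbits) - 1)

seed = 0xACE1
state, period = seed, 0
bits = []
while True:
    state = lfsr_step(state)
    period += 1
    if period <= 32:
        bits.append(state & 1)          # first few output bits of the keystream
    if state == seed:
        break

print("first bits:", "".join(map(str, bits)))
print("period    :", period)            # 65535 = 2^16 - 1 for a primitive characteristic polynomial
```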

  2. Randomizing Roaches: Exploring the "Bugs" of Randomization in Experimental Design

    Science.gov (United States)

    Wagler, Amy; Wagler, Ron

    2014-01-01

    Understanding the roles of random selection and random assignment in experimental design is a central learning objective in most introductory statistics courses. This article describes an activity, appropriate for a high school or introductory statistics course, designed to teach the concepts, values and pitfalls of random selection and assignment…

  3. Generalized Selection Weighted Vector Filters

    Directory of Open Access Journals (Sweden)

    Rastislav Lukac

    2004-09-01

    Full Text Available This paper introduces a class of nonlinear multichannel filters capable of removing impulsive noise in color images. The proposed generalized selection weighted vector filter class constitutes a powerful filtering framework for multichannel signal processing. Previously defined multichannel filters such as the vector median filter, basic vector directional filter, directional-distance filter, weighted vector median filters, and weighted vector directional filters are treated from a global viewpoint using the proposed framework. Robust order-statistic concepts and an increased degree of freedom in filter design make the proposed method attractive for a variety of applications. The introduced multichannel sigmoidal adaptation of the filter parameters and its modifications allow the filter parameters to be accommodated to varying signal and noise statistics. Simulation studies reported in this paper indicate that the proposed filter class is computationally attractive, yields excellent performance, and is able to preserve fine details and color information while efficiently suppressing impulsive noise. This paper is an extended version of the paper by Lukac et al. presented at the 2003 IEEE-EURASIP Workshop on Nonlinear Signal and Image Processing (NSIP '03) in Grado, Italy.
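
    A minimal sketch of the plain vector median filter, the simplest member of the selection-weighted vector filter family described above: within each window, the output pixel is the color vector whose summed distance to all other vectors in the window is smallest. The window size, toy image and impulsive-noise model are illustrative assumptions.

```python
import numpy as np

def vector_median_filter(img, radius=1):
    """img: HxWx3 float array; returns the VMF-filtered image (borders copied unchanged)."""
    h, w, _ = img.shape
    out = img.copy()
    for i in range(radius, h - radius):
        for j in range(radius, w - radius):
            win = img[i - radius:i + radius + 1, j - radius:j + radius + 1].reshape(-1, 3)
            # aggregated L2 distance of every window vector to all the others
            dists = np.linalg.norm(win[:, None, :] - win[None, :, :], axis=2).sum(axis=1)
            out[i, j] = win[np.argmin(dists)]
    return out

# toy usage: a flat color image corrupted by a few impulsive (salt-and-pepper) pixels
rng = np.random.default_rng(0)
img = np.full((32, 32, 3), 0.5)
idx = rng.integers(0, 32, (20, 2))
img[idx[:, 0], idx[:, 1]] = rng.integers(0, 2, (20, 3)).astype(float)
clean = vector_median_filter(img)
print("corrupted pixels remaining:", int((np.abs(clean - 0.5) > 0.4).any(axis=2).sum()))
```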

  4. Applications of a general random-walk theory for confined diffusion.

    Science.gov (United States)

    Calvo-Muñoz, Elisa M; Selvan, Myvizhi Esai; Xiong, Ruichang; Ojha, Madhusudan; Keffer, David J; Nicholson, Donald M; Egami, Takeshi

    2011-01-01

    A general random walk theory for diffusion in the presence of nanoscale confinement is developed and applied. The random-walk theory contains two parameters describing confinement: a cage size and a cage-to-cage hopping probability. The theory captures the correct nonlinear dependence of the mean square displacement (MSD) on observation time for intermediate times. Because of its simplicity, the theory also has modest computational requirements and is thus able to simulate systems with very low diffusivities for sufficiently long times to reach the infinite-time-limit regime where the Einstein relation can be used to extract the self-diffusivity. The theory is applied to three practical cases in which the degree of order in confinement varies. The three systems include diffusion of (i) polyatomic molecules in metal organic frameworks, (ii) water in proton exchange membranes, and (iii) liquid and glassy iron. For all three cases, the comparison between theory and the results of molecular dynamics (MD) simulations indicates that the theory can describe the observed diffusion behavior with a small fraction of the computational expense. The confined-random-walk theory fit to the MSDs of very short MD simulations is capable of accurately reproducing the MSDs of much longer MD simulations. Furthermore, the values of the parameter for cage size correspond to the physical dimensions of the systems and the cage-to-cage hopping probability corresponds to the activation barrier for diffusion, indicating that the two parameters in the theory are not simply fitted values but correspond to real properties of the physical system.
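
    A toy version of the two-parameter confined random walk described above, with our own variable names (L for the cage size, p_hop for the cage-to-cage hopping probability) and the simplifying assumption of instant equilibration inside the cage; it illustrates how the long-time MSD becomes linear and yields a self-diffusivity.

```python
import numpy as np

rng = np.random.default_rng(0)
L, p_hop = 10.0, 0.01                   # cage size and cage-to-cage hopping probability
n_walkers, n_steps = 2000, 5000

cage = np.zeros(n_walkers)              # index of the current cage
x_in = rng.uniform(0, L, n_walkers)     # position inside the cage
x0 = cage * L + x_in
msd = np.zeros(n_steps)

for t in range(n_steps):
    hop = rng.random(n_walkers) < p_hop
    cage += np.where(hop, rng.choice([-1, 1], n_walkers), 0)   # jump to a neighbouring cage
    x_in = rng.uniform(0, L, n_walkers)                        # instantly re-equilibrate in the cage
    msd[t] = np.mean((cage * L + x_in - x0) ** 2)

# at long times MSD ~ 2*D*t, with D = p_hop * L**2 / 2 for this toy hopping model
D_est = msd[-1] / (2 * n_steps)
print(f"estimated D = {D_est:.3f}, toy-model prediction = {p_hop * L**2 / 2:.3f}")
```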

  5. Materials selection for oxide-based resistive random access memories

    International Nuclear Information System (INIS)

    Guo, Yuzheng; Robertson, John

    2014-01-01

    The energies of atomic processes in resistive random access memories (RRAMs) are calculated for four typical oxides, HfO2, TiO2, Ta2O5, and Al2O3, to define a materials selection process. O vacancies have the lowest defect formation energy in the O-poor limit and dominate the processes. A band diagram defines the operating Fermi energy and O chemical potential range. It is shown how the scavenger metal can be used to vary the O vacancy formation energy, via controlling the O chemical potential, and the mean Fermi energy. The high endurance of Ta2O5 RRAM is related to its more stable amorphous phase and the adaptive lattice rearrangements of its O vacancy.

  6. Effect of a tailored physical activity intervention delivered in general practice settings: results of a randomized controlled trial

    NARCIS (Netherlands)

    Sluijs, van E.M.F.; Poppel - Bruinvels, van M.N.M.; Twisk, J.W.R.; Paw, M.J.M. Chin A; Calfas, K.J.; Mechelen, van W.

    2005-01-01

    OBJECTIVES: We evaluated the effectiveness of a minimal intervention physical activity strategy (physician-based assessment and counseling for exercise [PACE]) applied in general practice settings in the Netherlands. METHODS: Randomization took place at the general practice level. Participants were

  7. Effect of a tailored physical activity intervention delivered in general practice settings: results of a randomized controlled trial

    NARCIS (Netherlands)

    van Sluijs, E.M.F.; van Poppel-Bruinvels, M.N.M.; Twisk, J.W.R.; Chin A Paw, M.J.M.; Calfas, K.J.; van Mechelen, W.

    2005-01-01

    Objectives. We evaluated the effectiveness of a minimal intervention physical activity strategy (physician-based assessment and counseling for exercise [PACE]) applied in general practice settings in the Netherlands. Methods. Randomization took place at the general practice level. Participants were

  8. Emergence of multilevel selection in the prisoner's dilemma game on coevolving random networks

    International Nuclear Information System (INIS)

    Szolnoki, Attila; Perc, Matjaz

    2009-01-01

    We study the evolution of cooperation in the prisoner's dilemma game, whereby a coevolutionary rule is introduced that molds the random topology of the interaction network in two ways. First, existing links are deleted whenever a player adopts a new strategy or its degree exceeds a threshold value; second, new links are added randomly after a given number of game iterations. These coevolutionary processes correspond to the generic formation of new links and deletion of existing links that, especially in human societies, appear frequently as a consequence of ongoing socialization, change of lifestyle or death. Due to the counteraction of deletions and additions of links the initial heterogeneity of the interaction network is qualitatively preserved, and thus cannot be held responsible for the observed promotion of cooperation. Indeed, the coevolutionary rule evokes the spontaneous emergence of a powerful multilevel selection mechanism, which despite the sustained random topology of the evolving network, maintains cooperation across the whole span of defection temptation values.

  9. Selective memory generalization by spatial patterning of protein synthesis.

    Science.gov (United States)

    O'Donnell, Cian; Sejnowski, Terrence J

    2014-04-16

    Protein synthesis is crucial for both persistent synaptic plasticity and long-term memory. De novo protein expression can be restricted to specific neurons within a population, and to specific dendrites within a single neuron. Despite its ubiquity, the functional benefits of spatial protein regulation for learning are unknown. We used computational modeling to study this problem. We found that spatially patterned protein synthesis can enable selective consolidation of some memories but forgetting of others, even for simultaneous events that are represented by the same neural population. Key factors regulating selectivity include the functional clustering of synapses on dendrites, and the sparsity and overlap of neural activity patterns at the circuit level. Based on these findings, we proposed a two-step model for selective memory generalization during REM and slow-wave sleep. The pattern-matching framework we propose may be broadly applicable to spatial protein signaling throughout cortex and hippocampus. Copyright © 2014 Elsevier Inc. All rights reserved.

  10. Assessment of Random Assignment in Training and Test Sets using Generalized Cluster Analysis Technique

    Directory of Open Access Journals (Sweden)

    Sorana D. BOLBOACĂ

    2011-06-01

    Full Text Available Aim: The properness of the random assignment of compounds into training and validation sets was assessed using the generalized cluster technique. Material and Method: A quantitative Structure-Activity Relationship model using the Molecular Descriptors Family on Vertices was evaluated in terms of the assignment of carboquinone derivatives into training and test sets during the leave-many-out analysis. Assignment of compounds was investigated using five variables: observed anticancer activity and four structure descriptors. Generalized cluster analysis with the K-means algorithm was applied in order to investigate whether the assignment of compounds was proper. The Euclidian distance and maximization of the initial distance using a cross-validation with a v-fold of 10 were applied. Results: All five variables included in the analysis proved to have a statistically significant contribution to the identification of clusters. Three clusters were identified, each of them containing carboquinone derivatives belonging both to the training and to the test sets. The observed activity of carboquinone derivatives proved to be normally distributed within every cluster. The presence of training and test compounds in all clusters identified using generalized cluster analysis with the K-means algorithm, and the distribution of observed activity within clusters, support a proper assignment of compounds into training and test sets. Conclusion: Generalized cluster analysis using the K-means algorithm proved to be a valid method for assessing the random assignment of carboquinone derivatives into training and test sets.
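
    A minimal sketch of the assignment check described above, using scikit-learn's K-means on synthetic descriptors rather than the carboquinone data: a training/test split is considered proper if every cluster contains members of both sets.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))                 # stand-in for activity + 4 structure descriptors
labels = np.zeros(len(X), dtype=int)
train_idx, test_idx = train_test_split(np.arange(len(X)), test_size=0.25, random_state=0)
labels[test_idx] = 1                         # 0 = training set, 1 = test set

clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
for k in range(3):
    in_k = labels[clusters == k]
    print(f"cluster {k}: {np.sum(in_k == 0)} training / {np.sum(in_k == 1)} test compounds")
# the split is considered proper if no cluster contains only training or only test compounds
```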

  11. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology

    Science.gov (United States)

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological datasets there is limited guidance on variable selection methods for RF modeling. Typically, e...

  12. Optimization of the Dutch Matrix Test by Random Selection of Sentences From a Preselected Subset

    Directory of Open Access Journals (Sweden)

    Rolph Houben

    2015-04-01

    Full Text Available Matrix tests are available for speech recognition testing in many languages. For an accurate measurement, a steep psychometric function of the speech materials is required. For existing tests, it would be beneficial if it were possible to further optimize the available materials by increasing the function’s steepness. The objective is to show if the steepness of the psychometric function of an existing matrix test can be increased by selecting a homogeneous subset of recordings with the steepest sentence-based psychometric functions. We took data from a previous multicenter evaluation of the Dutch matrix test (45 normal-hearing listeners). Based on half of the data set, first the sentences (140 out of 311) with a similar speech reception threshold and with the steepest psychometric function (≥9.7%/dB) were selected. Subsequently, the steepness of the psychometric function for this selection was calculated from the remaining (unused) second half of the data set. The calculation showed that the slope increased from 10.2%/dB to 13.7%/dB. The resulting subset did not allow the construction of enough balanced test lists. Therefore, the measurement procedure was changed to randomly select the sentences during testing. Random selection may interfere with a representative occurrence of phonemes. However, in our material, the median phonemic occurrence remained close to that of the original test. This finding indicates that phonemic occurrence is not a critical factor. The work highlights the possibility that existing speech tests might be improved by selecting sentences with a steep psychometric function.

  13. Generalized structural equations improve sexual-selection analyses.

    Directory of Open Access Journals (Sweden)

    Sonia Lombardi

    Full Text Available Sexual selection is an intense evolutionary force, which operates through competition for the access to breeding resources. There are many cases where male copulatory success is highly asymmetric, and few males are able to sire most females. Two main hypotheses were proposed to explain this asymmetry: "female choice" and "male dominance". The literature reports contrasting results. This variability may reflect actual differences among studied populations, but it may also be generated by methodological differences and statistical shortcomings in data analysis. A review of the statistical methods used so far in lek studies shows a prevalence of Linear Models (LM) and Generalized Linear Models (GLM), which may be affected by problems in inferring cause-effect relationships; multi-collinearity among explanatory variables and erroneous handling of non-normal and non-continuous distributions of the response variable. In lek breeding, selective pressure is maximal, because large numbers of males and females congregate in small arenas. We used a dataset on lekking fallow deer (Dama dama) to contrast the methods and procedures employed so far, and we propose a novel approach based on Generalized Structural Equations Models (GSEMs). GSEMs combine the power and flexibility of both SEM and GLM in a unified modeling framework. We showed that LMs fail to identify several important predictors of male copulatory success and yield very imprecise parameter estimates. Minor variations in data transformation yield wide changes in results and the method appears unreliable. GLMs improved the analysis, but GSEMs provided better results, because the use of latent variables decreases the impact of measurement errors. Using GSEMs, we were able to test contrasting hypotheses and calculate both direct and indirect effects, and we reached a high precision of the estimates, which implies a high predictive ability. In synthesis, we recommend the use of GSEMs in studies on sexual selection.

  14. Teaching Emotional Intelligence to Intensive Care Unit Nurses and their General Health: A Randomized Clinical Trial

    Directory of Open Access Journals (Sweden)

    F Sharif

    2013-07-01

    Full Text Available Background: Emotion and how people manage it is an important part of personality that immensely affects their health. Investigations have shown that emotional intelligence is significantly related to, and can predict, psychological health. Objective: To determine the effect of teaching emotional intelligence to intensive care unit nurses on their general health. Methods: This randomized clinical trial (registered as IRCT201208022812N9) was conducted on 52 of 200 intensive care unit nurses affiliated with Shiraz University of Medical Sciences. They were recruited through purposeful convenience sampling and then randomly allocated into two groups. The intervention group members were trained in emotional intelligence. The Bar-On emotional intelligence and Goldberg's general health questionnaires were administered to each participant before, immediately after, and one month after the intervention. Results: While the mean score of general health for the intervention group decreased from 25.4 before the intervention to 18.1 immediately after the intervention and to 14.6 one month later, for the control group it increased from 22.0 to 24.2 and to 26.5, respectively (p<0.001). Conclusion: Teaching emotional intelligence improved the general health of intensive care unit nurses.

  15. Random drift versus selection in academic vocabulary: an evolutionary analysis of published keywords.

    Science.gov (United States)

    Bentley, R Alexander

    2008-08-27

    The evolution of vocabulary in academic publishing is characterized via keyword frequencies recorded in the ISI Web of Science citations database. In four distinct case-studies, evolutionary analysis of keyword frequency change through time is compared to a model of random copying used as the null hypothesis, such that selection may be identified against it. The case studies from the physical sciences indicate greater selection in keyword choice than in the social sciences. Similar evolutionary analyses can be applied to a wide range of phenomena; wherever the popularity of multiple items through time has been recorded, as with web searches, or sales of popular music and books, for example.
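
    A minimal random-copying (neutral drift) simulation of the kind used as the null model above: in each "generation" every author copies a keyword from a randomly chosen earlier paper, except that with a small probability an entirely new keyword is coined. The population size, innovation rate and generation count are illustrative assumptions.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)
N, mu, generations = 1000, 0.01, 200

keywords = list(range(N))        # start with N distinct keywords
next_new = N
for _ in range(generations):
    idx = rng.integers(0, N, size=N)           # each author copies a random previous keyword
    copied = [keywords[i] for i in idx]
    for i in np.where(rng.random(N) < mu)[0]:  # innovation: a brand-new keyword enters the pool
        copied[i] = next_new
        next_new += 1
    keywords = copied

freqs = np.array(sorted(Counter(keywords).values(), reverse=True))
print("distinct keywords:", len(freqs))
print("top-5 frequencies:", freqs[:5])   # drift alone already produces a skewed popularity distribution
```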

  16. Materials selection for oxide-based resistive random access memories

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Yuzheng; Robertson, John [Engineering Department, Cambridge University, Cambridge CB2 1PZ (United Kingdom)

    2014-12-01

    The energies of atomic processes in resistive random access memories (RRAMs) are calculated for four typical oxides, HfO2, TiO2, Ta2O5, and Al2O3, to define a materials selection process. O vacancies have the lowest defect formation energy in the O-poor limit and dominate the processes. A band diagram defines the operating Fermi energy and O chemical potential range. It is shown how the scavenger metal can be used to vary the O vacancy formation energy, via controlling the O chemical potential, and the mean Fermi energy. The high endurance of Ta2O5 RRAM is related to its more stable amorphous phase and the adaptive lattice rearrangements of its O vacancy.

  17. Partial relay selection based on shadowing side information over generalized composite fading channels

    KAUST Repository

    Yilmaz, Ferkan

    2011-11-01

    In this paper, in contrast to the relay selection protocols available in the literature, we propose a partial relay selection protocol utilizing only the shadowing side information of the relays instead of their full channel side information in order to select a relay in a dual-hop relaying system through the available limited feedback channels and power budget. We then present an exact unified performance expression combining the average bit error probability, ergodic capacity, and moments-generating function of the proposed partial relay selection over generalized fading channels. Referring to the unified performance expression introduced in [1], we explicitly offer a generic unified performance expression that can be easily calculated and that is applicable to a wide variety of fading scenarios. Finally, as an illustration of the mathematical formalism, some numerical and simulation results are generated for an extended generalized-K fading environment, and these numerical and simulation results are shown to be in perfect agreement. © 2011 IEEE.

  18. Interscalene plexus block versus general anaesthesia for shoulder surgery: a randomized controlled study.

    Science.gov (United States)

    Lehmann, Lars J; Loosen, Gregor; Weiss, Christel; Schmittner, Marc D

    2015-02-01

    This randomized clinical trial evaluates interscalene brachial plexus block (ISB), general anaesthesia (GA) and the combination of both anaesthetic methods (GA + ISB) in patients undergoing shoulder arthroscopy. From July 2011 until May 2012, 120 patients (male/female), aged 20-80 years, were allocated randomly to receive ISB (10 ml mepivacaine 1% and 20 ml ropivacaine 0.375%), GA (propofol, sufentanil, desflurane) or ISB + GA. The primary outcome variable was opioid consumption on the day of surgery. Anaesthesia times were analysed as secondary endpoints. After surgery, 27 of 40 patients with a single ISB bypassed the recovery room, and fewer patients required opioids on the day of surgery [GA: n = 25 vs. GA + ISB: n = 10 vs. ISB: n = 10, p = 0.0037]. ISB is superior to GA and GA + ISB in patients undergoing shoulder arthroscopy in terms of faster recovery and analgesics consumption.

  19. Growth rate for the expected value of a generalized random Fibonacci sequence

    International Nuclear Information System (INIS)

    Janvresse, Elise; De la Rue, Thierry; Rittaud, BenoIt

    2009-01-01

    We study the behaviour of generalized random Fibonacci sequences defined by the relation g_n = |λ g_{n-1} ± g_{n-2}|, where the ± sign is given by tossing an unbalanced coin, giving probability p to the + sign. We prove that the expected value of g_n grows exponentially fast for any 0 < p ≤ 1 when λ ≥ 2, and for any p > (2 - λ)/4 when λ is of the form 2cos(π/k) for some fixed integer k ≥ 3. In both cases, we give an algebraic expression for the growth rate.
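
    A Monte Carlo sketch of the recursion studied above, estimating the growth rate of E[g_n] for one illustrative choice of λ = 2cos(π/4) (i.e. k = 4) and p above the threshold (2 - λ)/4; the parameter values and the crude estimator are our own assumptions, not the paper's analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 2 * np.cos(np.pi / 4)
p, n, trials = 0.7, 200, 20000

g_prev = np.ones(trials)
g_curr = np.ones(trials)
for _ in range(n):
    sign = np.where(rng.random(trials) < p, 1.0, -1.0)      # + with probability p
    g_prev, g_curr = g_curr, np.abs(lam * g_curr + sign * g_prev)

growth = np.mean(g_curr) ** (1.0 / n)    # rough per-step growth rate of the expected value
print(f"lambda={lam:.3f}, p={p}, threshold (2-lambda)/4={(2 - lam) / 4:.3f}, "
      f"estimated growth rate: {growth:.4f}")
```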

  20. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sampling 1 meter square surfaces by...(b)(3) § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as...

  1. Velocity and Dispersion for a Two-Dimensional Random Walk

    International Nuclear Information System (INIS)

    Li Jinghui

    2009-01-01

    In this paper, we consider the transport properties of a two-dimensional random walk. The velocity and the dispersion of this two-dimensional random walk are derived. It is mainly shown that: (i) by controlling the values of the transition rates, the direction of the random walk can be reversed; (ii) for some suitably selected transition rates, our two-dimensional random walk can be efficient in comparison with the one-dimensional random walk. Our work is motivated in part by the challenge to explain the unidirectional transport of motor proteins. When the motor proteins move at the turn points of their tracks (i.e., the cytoskeleton filaments and the DNA molecular tubes), some of our results in this paper can be used to deal with the problem. (general)
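
    A minimal sketch of a biased two-dimensional lattice walk with hypothetical transition probabilities, showing how the drift velocity and the dispersion along one axis follow from the step statistics; this is an illustration of the quantities named above, not the specific model analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
p_right, p_left, p_up, p_down = 0.4, 0.2, 0.2, 0.2   # hypothetical transition probabilities
walkers, steps = 2000, 2000

x = np.zeros(walkers)
for _ in range(steps):
    m = rng.choice(4, size=walkers, p=[p_right, p_left, p_up, p_down])
    x += (m == 0).astype(float) - (m == 1).astype(float)   # +1 for a right step, -1 for a left step

velocity = x.mean() / steps                # drift velocity along x
dispersion = x.var() / (2 * steps)         # effective diffusion coefficient along x
v_theory = p_right - p_left
d_theory = (p_right + p_left - v_theory ** 2) / 2
print(f"velocity   {velocity:.4f}  (theory {v_theory:.4f})")
print(f"dispersion {dispersion:.4f}  (theory {d_theory:.4f})")
```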

  2. From Protocols to Publications: A Study in Selective Reporting of Outcomes in Randomized Trials in Oncology

    Science.gov (United States)

    Raghav, Kanwal Pratap Singh; Mahajan, Sminil; Yao, James C.; Hobbs, Brian P.; Berry, Donald A.; Pentz, Rebecca D.; Tam, Alda; Hong, Waun K.; Ellis, Lee M.; Abbruzzese, James; Overman, Michael J.

    2015-01-01

    Purpose The decision by journals to append protocols to published reports of randomized trials was a landmark event in clinical trial reporting. However, limited information is available on how this initiative affected transparency and selective reporting of clinical trial data. Methods We analyzed 74 oncology-based randomized trials published in Journal of Clinical Oncology, the New England Journal of Medicine, and The Lancet in 2012. To ascertain integrity of reporting, we compared published reports with their respective appended protocols with regard to primary end points, nonprimary end points, unplanned end points, and unplanned analyses. Results A total of 86 primary end points were reported in 74 randomized trials; nine trials had greater than one primary end point. Nine trials (12.2%) had some discrepancy between their planned and published primary end points. A total of 579 nonprimary end points (median, seven per trial) were planned, of which 373 (64.4%; median, five per trial) were reported. A significant positive correlation was found between the number of planned and nonreported nonprimary end points (Spearman r = 0.66; P < .001). Twenty-eight studies (37.8%) reported a total of 65 unplanned end points, 52 (80.0%) of which were not identified as unplanned. Thirty-one (41.9%) and 19 (25.7%) of 74 trials reported a total of 52 unplanned analyses involving primary end points and 33 unplanned analyses involving nonprimary end points, respectively. Studies reported positive unplanned end points and unplanned analyses more frequently than negative outcomes in abstracts (unplanned end points odds ratio, 6.8; P = .002; unplanned analyses odds ratio, 8.4; P = .007). Conclusion Despite public and reviewer access to protocols, selective outcome reporting persists and is a major concern in the reporting of randomized clinical trials. To foster credible evidence-based medicine, additional initiatives are needed to minimize selective reporting. PMID:26304898

  3. Joint random beam and spectrum selection for spectrum sharing systems with partial channel state information

    KAUST Repository

    Abdallah, Mohamed M.

    2013-11-01

    In this work, we develop a joint interference-aware random beam and spectrum selection scheme that provides enhanced performance for the secondary network under the condition that the interference observed at the primary receiver is below a predetermined acceptable value. We consider a secondary link composed of a transmitter equipped with multiple antennas and a single-antenna receiver, sharing the same spectrum with a set of primary links each composed of a single-antenna transmitter and a single-antenna receiver. The proposed scheme jointly selects a beam, among a set of power-optimized random beams, as well as the primary spectrum that maximizes the signal-to-interference-plus-noise ratio (SINR) of the secondary link while satisfying the primary interference constraint. In particular, we consider the case where the interference level is described by a q-bit description of its magnitude, whereby we propose a technique to find the optimal quantizer thresholds in a mean square error (MSE) sense. © 2013 IEEE.
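
    A rough numerical sketch of the selection rule described above: among random unit-norm beams and candidate spectra, pick the pair that maximizes the secondary SINR subject to an interference cap at the primary receiver. The channel model and all parameter values are simplified assumptions, and the q-bit quantization of the interference level is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
n_tx, n_beams, n_spectra = 4, 8, 3
P, noise, I_max = 1.0, 0.1, 0.5          # transmit power, noise power, interference cap

# random unit-norm beamforming vectors (the power-optimization step is omitted)
W = rng.standard_normal((n_beams, n_tx)) + 1j * rng.standard_normal((n_beams, n_tx))
W /= np.linalg.norm(W, axis=1, keepdims=True)

best = (None, None, -np.inf)
for s in range(n_spectra):
    h = (rng.standard_normal(n_tx) + 1j * rng.standard_normal(n_tx)) / np.sqrt(2)  # to secondary Rx
    g = (rng.standard_normal(n_tx) + 1j * rng.standard_normal(n_tx)) / np.sqrt(2)  # to primary Rx
    I_from_primary = rng.gamma(1.0, 0.05)        # interference received from the primary Tx
    for b in range(n_beams):
        if P * abs(g @ W[b].conj()) ** 2 > I_max:     # protect the primary receiver
            continue
        sinr = P * abs(h @ W[b].conj()) ** 2 / (noise + I_from_primary)
        if sinr > best[2]:
            best = (b, s, sinr)

if best[0] is None:
    print("no beam meets the interference constraint on any spectrum")
else:
    print(f"selected beam {best[0]} on spectrum {best[1]} with SINR {best[2]:.2f}")
```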

  4. Joint random beam and spectrum selection for spectrum sharing systems with partial channel state information

    KAUST Repository

    Abdallah, Mohamed M.; Sayed, Mostafa M.; Alouini, Mohamed-Slim; Qaraqe, Khalid A.

    2013-01-01

    In this work, we develop a joint interference-aware random beam and spectrum selection scheme that provides enhanced performance for the secondary network under the condition that the interference observed at the primary receiver is below a predetermined acceptable value. We consider a secondary link composed of a transmitter equipped with multiple antennas and a single-antenna receiver, sharing the same spectrum with a set of primary links each composed of a single-antenna transmitter and a single-antenna receiver. The proposed scheme jointly selects a beam, among a set of power-optimized random beams, as well as the primary spectrum that maximizes the signal-to-interference-plus-noise ratio (SINR) of the secondary link while satisfying the primary interference constraint. In particular, we consider the case where the interference level is described by a q-bit description of its magnitude, whereby we propose a technique to find the optimal quantizer thresholds in a mean square error (MSE) sense. © 2013 IEEE.

  5. The RANDOM computer program: A linear congruential random number generator

    Science.gov (United States)

    Miles, R. F., Jr.

    1986-01-01

    The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) The RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) The RANCYCLE and the ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
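
    A minimal Python sketch (the original program is FORTRAN) of a linear congruential generator using the well-known Lehmer "minimal standard" parameters a = 16807, c = 0, m = 2^31 - 1, one concrete parameter choice of the kind such a program evaluates, followed by the sort of quick sanity check a test program performs.

```python
M, A, C = 2**31 - 1, 16807, 0

def lcg(seed, count):
    out, x = [], seed
    for _ in range(count):
        x = (A * x + C) % M
        out.append(x / M)              # scale the integer state to (0, 1)
    return out

u = lcg(seed=12345, count=100000)
mean = sum(u) / len(u)
# lag-1 serial correlation, a crude check of independence between successive outputs
num = sum((u[i] - mean) * (u[i + 1] - mean) for i in range(len(u) - 1))
den = sum((v - mean) ** 2 for v in u)
print(f"mean = {mean:.4f} (expect ~0.5), lag-1 correlation = {num / den:.4f} (expect ~0)")
```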

  6. Seismic random noise attenuation using shearlet and total generalized variation

    International Nuclear Information System (INIS)

    Kong, Dehui; Peng, Zhenming

    2015-01-01

    Seismic denoising from a corrupted observation is an important part of seismic data processing which improves the signal-to-noise ratio (SNR) and resolution. In this paper, we present an effective denoising method to attenuate seismic random noise. The method takes advantage of shearlet and total generalized variation (TGV) regularization. Different regularity levels of TGV improve the quality of the final result by suppressing Gibbs artifacts caused by the shearlet. The problem is formulated as mixed constraints in a convex optimization. A Bregman algorithm is proposed to solve the proposed model. Extensive experiments based on one synthetic datum and two post-stack field data are done to compare performance. The results demonstrate that the proposed method provides superior effectiveness and preserves the structure better. (paper)

  7. Seismic random noise attenuation using shearlet and total generalized variation

    Science.gov (United States)

    Kong, Dehui; Peng, Zhenming

    2015-12-01

    Seismic denoising from a corrupted observation is an important part of seismic data processing which improves the signal-to-noise ratio (SNR) and resolution. In this paper, we present an effective denoising method to attenuate seismic random noise. The method takes advantage of shearlet and total generalized variation (TGV) regularization. Different regularity levels of TGV improve the quality of the final result by suppressing Gibbs artifacts caused by the shearlet. The problem is formulated as mixed constraints in a convex optimization. A Bregman algorithm is proposed to solve the proposed model. Extensive experiments based on one synthetic datum and two post-stack field data are done to compare performance. The results demonstrate that the proposed method provides superior effectiveness and preserves the structure better.

  8. Analysis and applications of a frequency selective surface via a random distribution method

    International Nuclear Information System (INIS)

    Xie Shao-Yi; Huang Jing-Jian; Yuan Nai-Chang; Liu Li-Guo

    2014-01-01

    A novel frequency selective surface (FSS) for reducing radar cross section (RCS) is proposed in this paper. This FSS is based on the random distribution method, so it can be called a random surface. In this paper, stacked patches serving as periodic elements are employed for RCS reduction. Previous work has demonstrated the efficiency of utilizing microstrip patches, especially for the reflectarray. First, the relevant theory of the method is described. Then a sample of a three-layer variable-sized stacked patch random surface with a dimension of 260 mm×260 mm is simulated, fabricated, and measured in order to demonstrate the validity of the proposed design. For normal incidence, an 8-dB RCS reduction is achieved both in simulation and in measurement over 8 GHz–13 GHz. Oblique incidence at 30° is also investigated, for which a 7-dB RCS reduction is obtained in the frequency range of 8 GHz–14 GHz. (condensed matter: electronic structure, electrical, magnetic, and optical properties)

  9. Random drift versus selection in academic vocabulary: an evolutionary analysis of published keywords.

    Directory of Open Access Journals (Sweden)

    R Alexander Bentley

    Full Text Available The evolution of vocabulary in academic publishing is characterized via keyword frequencies recorded in the ISI Web of Science citations database. In four distinct case-studies, evolutionary analysis of keyword frequency change through time is compared to a model of random copying used as the null hypothesis, such that selection may be identified against it. The case studies from the physical sciences indicate greater selection in keyword choice than in the social sciences. Similar evolutionary analyses can be applied to a wide range of phenomena; wherever the popularity of multiple items through time has been recorded, as with web searches, or sales of popular music and books, for example.

  10. On theoretical models of gene expression evolution with random genetic drift and natural selection.

    Directory of Open Access Journals (Sweden)

    Osamu Ogasawara

    2009-11-01

    Full Text Available The relative contributions of natural selection and random genetic drift are a major source of debate in the study of gene expression evolution, which is hypothesized to serve as a bridge from molecular to phenotypic evolution. It has been suggested that the conflict between views is caused by the lack of a definite model of the neutral hypothesis, which can describe the long-run behavior of evolutionary change in mRNA abundance. Therefore previous studies have used inadequate analogies with the neutral prediction of other phenomena, such as amino acid or nucleotide sequence evolution, as the null hypothesis of their statistical inference. In this study, we introduced two novel theoretical models, one based on neutral drift and the other assuming natural selection, by focusing on a common property of the distribution of mRNA abundance among a variety of eukaryotic cells, which reflects the result of long-term evolution. Our results demonstrated that (1) our models can reproduce two independently found phenomena simultaneously: the time development of gene expression divergence and Zipf's law of the transcriptome; (2) cytological constraints can be explicitly formulated to describe long-term evolution; (3) the model assuming that natural selection optimized relative mRNA abundance was more consistent with previously published observations than the model of optimized absolute mRNA abundances. The models introduced in this study give a formulation of evolutionary change in the mRNA abundance of each gene as a stochastic process, on the basis of previously published observations. This model provides a foundation for interpreting observed data in studies of gene expression evolution, including identifying an adequate time scale for discriminating the effect of natural selection from that of random genetic drift of selectively neutral variations.

  11. A theory for the origin of a self-replicating chemical system. I - Natural selection of the autogen from short, random oligomers

    Science.gov (United States)

    White, D. H.

    1980-01-01

    A general theory is presented for the origin of a self-replicating chemical system, termed an autogen, which is capable of both crude replication and translation (protein synthesis). The theory requires the availability of free energy and monomers to the system, a significant background low-yield synthesis of kinetically stable oligopeptides and oligonucleotides, the localization of the oligomers, crude oligonucleotide selectivity of amino acids during oligopeptide synthesis, crude oligonucleotide replication, and two short peptide families which catalyze replication and translation, to produce a localized group of at least one copy each of two protogenes and two protoenzymes. The model posits a process of random oligomerization, followed by the random nucleation of functional components and the rapid autocatalytic growth of the functioning autogen to macroscopic amounts, to account for the origin of the first self-replicating system. Such a process contains steps of such high probability and short time periods that it is suggested that the emergence of an autogen in a laboratory experiment of reasonable time scale may be possible.

  12. Specific collaborative group intervention for patients with medically unexplained symptoms in general practice: a cluster randomized controlled trial.

    Science.gov (United States)

    Schaefert, R; Kaufmann, C; Wild, B; Schellberg, D; Boelter, R; Faber, R; Szecsenyi, J; Sauer, N; Guthrie, E; Herzog, W

    2013-01-01

    Patients with medically unexplained symptoms (MUS) are frequent in primary care and substantially impaired in their quality of life (QoL). Specific training of general practitioners (GPs) alone did not demonstrate sustained improvement at later follow-up in current reviews. We evaluated a collaborative group intervention. We conducted a cluster randomized controlled trial. Thirty-five GPs recruited 304 MUS patients (intervention group: 170; control group: 134). All GPs were trained in diagnosis and management of MUS (control condition). Eighteen randomly selected intervention GPs participated in training for a specific collaborative group intervention. They conducted 10 weekly group sessions and 2 booster meetings in their practices, together with a psychosomatic specialist. Six and 12 months after baseline, QoL was assessed with the Short-Form 36. The primary outcome was the physical composite score (PCS), and the secondary outcome was the mental composite score (MCS). At 12 months, intention-to-treat analyses showed a significant between-group effect for the MCS (p = 0.023) but not for the PCS (p = 0.674). This effect was preceded by a significant reduction of somatic symptom severity (15-item somatic symptom severity scale of the Patient Health Questionnaire, PHQ-15) at 6 months (p = 0.008) that lacked significance at 12 months (p = 0.078). As additional between-group effects at 12 months, per-protocol analyses showed less health anxiety (Whiteley-7; p = 0.038) and less psychosocial distress (PHQ; p = 0.024); GP visits were significantly (p = 0.042) reduced in the intervention group. Compared to pure GP training, collaborative group intervention achieved a progressive, clinically meaningful improvement in mental but not physical QoL. It could bridge gaps between general practice and mental health care. Copyright © 2012 S. Karger AG, Basel.

  13. From Protocols to Publications: A Study in Selective Reporting of Outcomes in Randomized Trials in Oncology.

    Science.gov (United States)

    Raghav, Kanwal Pratap Singh; Mahajan, Sminil; Yao, James C; Hobbs, Brian P; Berry, Donald A; Pentz, Rebecca D; Tam, Alda; Hong, Waun K; Ellis, Lee M; Abbruzzese, James; Overman, Michael J

    2015-11-01

    The decision by journals to append protocols to published reports of randomized trials was a landmark event in clinical trial reporting. However, limited information is available on how this initiative affected transparency and selective reporting of clinical trial data. We analyzed 74 oncology-based randomized trials published in Journal of Clinical Oncology, the New England Journal of Medicine, and The Lancet in 2012. To ascertain integrity of reporting, we compared published reports with their respective appended protocols with regard to primary end points, nonprimary end points, unplanned end points, and unplanned analyses. A total of 86 primary end points were reported in 74 randomized trials; nine trials had greater than one primary end point. Nine trials (12.2%) had some discrepancy between their planned and published primary end points. A total of 579 nonprimary end points (median, seven per trial) were planned, of which 373 (64.4%; median, five per trial) were reported. A significant positive correlation was found between the number of planned and nonreported nonprimary end points (Spearman r = 0.66; P < .001). Studies reported positive unplanned end points and unplanned analyses more frequently than negative outcomes in abstracts. Despite public and reviewer access to protocols, selective outcome reporting persists and is a major concern in the reporting of randomized clinical trials. To foster credible evidence-based medicine, additional initiatives are needed to minimize selective reporting. © 2015 by American Society of Clinical Oncology.

  14. Combined impact of negative lifestyle factors on cardiovascular risk in children: a randomized prospective study

    OpenAIRE

    Meyer, Ursina; Schindler, Christian; Bloesch, Tamara; Schmocker, Eliane; Zahner, Lukas; Puder, Jardena J; Kriemler, Susi

    2014-01-01

    PURPOSE: Negative lifestyle factors are known to be associated with increased cardiovascular risk (CVR) in children, but research on their combined impact on a general population of children is sparse. Therefore, we aimed to quantify the combined impact of easily assessable negative lifestyle factors on the CVR scores of randomly selected children after 4 years. METHODS: Of the 540 randomly selected 6- to 13-year-old children, 502 children participated in a baseline health assessment, and ...

  15. The randomly renewed general item and the randomly inspected item with exponential life distribution

    International Nuclear Information System (INIS)

    Schneeweiss, W.G.

    1979-01-01

    For a randomly renewed item the probability distributions of the time to failure and of the duration of down time and the expectations of these random variables are determined. Moreover, it is shown that the same theory applies to randomly checked items with exponential probability distribution of life such as electronic items. The case of periodic renewals is treated as an example. (orig.) [de

  16. Key Aspects of Nucleic Acid Library Design for in Vitro Selection

    Science.gov (United States)

    Vorobyeva, Maria A.; Davydova, Anna S.; Vorobjev, Pavel E.; Pyshnyi, Dmitrii V.; Venyaminova, Alya G.

    2018-01-01

    Nucleic acid aptamers capable of selectively recognizing their target molecules have nowadays been established as powerful and tunable tools for biospecific applications, be it therapeutics, drug delivery systems or biosensors. It is now generally acknowledged that in vitro selection enables one to generate aptamers to almost any target of interest. However, the success of selection and the affinity of the resulting aptamers depend to a large extent on the nature and design of an initial random nucleic acid library. In this review, we summarize and discuss the most important features of the design of nucleic acid libraries for in vitro selection such as the nature of the library (DNA, RNA or modified nucleotides), the length of a randomized region and the presence of fixed sequences. We also compare and contrast different randomization strategies and consider computer methods of library design and some other aspects. PMID:29401748

  17. High Entropy Random Selection Protocols

    NARCIS (Netherlands)

    H. Buhrman (Harry); M. Christandl (Matthias); M. Koucky (Michal); Z. Lotker (Zvi); B. Patt-Shamir; M. Charikar; K. Jansen; O. Reingold; J. Rolim

    2007-01-01

    In this paper, we construct protocols for two parties that do not trust each other, to generate random variables with high Shannon entropy. We improve known bounds for the trade-off between the number of rounds, the length of communication and the entropy of the outcome.

  18. Randomized controlled trial of the effect of medical audit on AIDS prevention in general practice

    DEFF Research Database (Denmark)

    Sandbæk, Annelli

    1999-01-01

    OBJECTIVE: We aimed to evaluate the effect of a medical audit on AIDS prevention in general practice. METHODS: We conducted a prospective randomized controlled study performed as 'lagged intervention'. At the time of comparison, the intervention group had completed 6 months of audit including a primary activity registration, feedback of own data and a meeting with colleagues and experts, and had received brief summaries of the meetings and reminders about the project (a full 'audit circle'). The participants were from general practices in Copenhagen and the Counties of Funen and Vejle, Denmark. RESULTS: No statistically significant differences were found between the groups in the number of consultations involving AIDS prevention or in the number of such consultations initiated by the GPs. CONCLUSIONS: Medical audit had no observed effect on AIDS prevention in general practice. Publication date: 1999-Oct.

  19. Integrated Behavior Therapy for Selective Mutism: a randomized controlled pilot study.

    Science.gov (United States)

    Bergman, R Lindsey; Gonzalez, Araceli; Piacentini, John; Keller, Melody L

    2013-10-01

    To evaluate the feasibility, acceptability, and preliminary efficacy of a novel behavioral intervention for reducing symptoms of selective mutism and increasing functional speech. A total of 21 children ages 4 to 8 with primary selective mutism were randomized to 24 weeks of Integrated Behavior Therapy for Selective Mutism (IBTSM) or a 12-week Waitlist control. Clinical outcomes were assessed using blind independent evaluators, parent-, and teacher-report, and an objective behavioral measure. Treatment recipients completed a three-month follow-up to assess durability of treatment gains. Data indicated increased functional speaking behavior post-treatment as rated by parents and teachers, with a high rate of treatment responders as rated by blind independent evaluators (75%). Conversely, children in the Waitlist comparison group did not experience significant improvements in speaking behaviors. Children who received IBTSM also demonstrated significant improvements in number of words spoken at school compared to baseline; however, significant group differences did not emerge. Treatment recipients also experienced significant reductions in social anxiety per parent, but not teacher, report. Clinical gains were maintained over the 3-month follow-up. IBTSM appears to be a promising new intervention that is efficacious in increasing functional speaking behaviors, feasible, and acceptable to parents and teachers. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Noncontextuality with Marginal Selectivity in Reconstructing Mental Architectures

    Directory of Open Access Journals (Sweden)

    Ru Zhang

    2015-06-01

    Full Text Available We present a general theory of series-parallel mental architectures with selectively influenced stochastically non-independent components. A mental architecture is a hypothetical network of processes aimed at performing a task, of which we only observe the overall time it takes under variable parameters of the task. It is usually assumed that the network contains several processes selectively influenced by different experimental factors, and then the question is asked as to how these processes are arranged within the network, e.g., whether they are concurrent or sequential. One way of doing this is to consider the distribution functions for the overall processing time and compute certain linear combinations thereof (interaction contrasts). The theory of selective influences in psychology can be viewed as a special application of the interdisciplinary theory of (non)contextuality having its origins and main applications in quantum theory. In particular, lack of contextuality is equivalent to the existence of a hidden random entity of which all the random variables in play are functions. Consequently, for any given value of this common random entity, the processing times and their compositions (minima, maxima, or sums) become deterministic quantities. These quantities, in turn, can be treated as random variables with (shifted) Heaviside distribution functions, for which one can easily compute various linear combinations across different treatments, including interaction contrasts. This mathematical fact leads to a simple method, more general than the previously used ones, to investigate and characterize the interaction contrast for different types of series-parallel architectures.

  1. Field-based random sampling without a sampling frame: control selection for a case-control study in rural Africa.

    Science.gov (United States)

    Crampin, A C; Mwinuka, V; Malema, S S; Glynn, J R; Fine, P E

    2001-01-01

    Selection bias, particularly of controls, is common in case-control studies and may materially affect the results. Methods of control selection should be tailored both for the risk factors and disease under investigation and for the population being studied. We present here a control selection method devised for a case-control study of tuberculosis in rural Africa (Karonga, northern Malawi) that selects an age/sex frequency-matched random sample of the population, with a geographical distribution in proportion to the population density. We also present an audit of the selection process, and discuss the potential of this method in other settings.

  2. Treatment selection in a randomized clinical trial via covariate-specific treatment effect curves.

    Science.gov (United States)

    Ma, Yunbei; Zhou, Xiao-Hua

    2017-02-01

    For time-to-event data in a randomized clinical trial, we proposed two new methods for selecting an optimal treatment for a patient based on the covariate-specific treatment effect curve, which is used to represent the clinical utility of a predictive biomarker. To select an optimal treatment for a patient with a specific biomarker value, we proposed pointwise confidence intervals for each covariate-specific treatment effect curve and the difference between covariate-specific treatment effect curves of two treatments. Furthermore, to select an optimal treatment for a future biomarker-defined subpopulation of patients, we proposed confidence bands for each covariate-specific treatment effect curve and the difference between each pair of covariate-specific treatment effect curve over a fixed interval of biomarker values. We constructed the confidence bands based on a resampling technique. We also conducted simulation studies to evaluate finite-sample properties of the proposed estimation methods. Finally, we illustrated the application of the proposed method in a real-world data set.

  3. A general symplectic method for the response analysis of infinitely periodic structures subjected to random excitations

    Directory of Open Access Journals (Sweden)

    You-Wei Zhang

    Full Text Available A general symplectic method for the random response analysis of infinitely periodic structures subjected to stationary/non-stationary random excitations is developed using symplectic mathematics in conjunction with variable separation and the pseudo-excitation method (PEM). Starting from the equation of motion for a single loaded substructure, symplectic analysis is first used to eliminate the dependent degrees of freedom through condensation. A Fourier expansion of the condensed equation of motion is then applied to separate the variables of time and wave number, thus enabling the necessary recurrence scheme to be developed. The random response is finally determined by implementing PEM. The proposed method is justified by comparison with results available in the literature and is then applied to a more complicated time-dependent coupled system.

  4. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

    Full Text Available Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, confidence level, expected proportion of the outcome variable (for categorical variables) / standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) from the study. The more the precision required, the greater is the required sample size. Sampling Techniques: The probability sampling techniques applied for health related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are more recommended than the nonprobability sampling techniques, because the results of the study can be generalized to the target population.
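
    A small worked sketch of the standard sample-size formulas the article summarizes, n = z^2 p(1 - p)/d^2 for a proportion and n = (z*sd/d)^2 for a mean, with an optional finite-population correction; the numbers in the usage lines are illustrative assumptions, not values from the article.

```python
from math import ceil
from statistics import NormalDist

def n_for_proportion(p, d, confidence=0.95, population=None):
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    n0 = z**2 * p * (1 - p) / d**2
    if population:                          # finite population correction
        n0 = n0 / (1 + (n0 - 1) / population)
    return ceil(n0)

def n_for_mean(sd, d, confidence=0.95):
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    return ceil((z * sd / d) ** 2)

# e.g. expected prevalence 30%, precision +/-5%, 95% confidence
print(n_for_proportion(p=0.30, d=0.05))                    # about 323
print(n_for_proportion(p=0.30, d=0.05, population=2000))   # smaller after the correction
print(n_for_mean(sd=12, d=3))                              # about 62
```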

  5. Random matrices and random difference equations

    International Nuclear Information System (INIS)

    Uppuluri, V.R.R.

    1975-01-01

    Mathematical models leading to products of random matrices and random difference equations are discussed. A one-compartment model with random behavior is introduced, and it is shown how the average concentration in the discrete-time model converges to the exponential function. This is of relevance to understanding how radioactivity gets trapped in bone structure in blood-bone systems. The ideas are then generalized to two-compartment models and mammillary systems, where products of random matrices appear in a natural way. The appearance of products of random matrices in applications in demography and control theory is considered. Then random sequences motivated by the following problems are studied: constant pulsing and random decay models, random pulsing and constant decay models, and random pulsing and random decay models.
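
    A minimal simulation, with assumed parameter values, of the kind of one-compartment discrete-time model with random behavior mentioned above: the retained fraction at each step is random, and the average concentration over many realizations follows the exponential-type decay given by the mean retention factor.

        import numpy as np

        rng = np.random.default_rng(0)
        steps, trials = 50, 10_000
        # Random retention fraction per step, mean 0.9 (illustrative values only)
        fractions = rng.uniform(0.85, 0.95, size=(trials, steps))
        concentration = np.cumprod(fractions, axis=1)   # product of random factors per realization
        average = concentration.mean(axis=0)            # average over realizations

        expected = 0.9 ** np.arange(1, steps + 1)       # exponential decay with the mean factor
        print(np.max(np.abs(average - expected)))       # small: the average tracks the exponential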

  6. Brief cognitive behavioral therapy compared to general practitioners care for depression in primary care: a randomized trial

    Science.gov (United States)

    2010-01-01

    Background Depressive disorders are highly prevalent in primary care (PC) and are associated with considerable functional impairment and increased health care use. Research has shown that many patients prefer psychological treatments to pharmacotherapy; however, it remains unclear which treatment is optimal for depressive patients in primary care. Methods/Design A randomized, multi-centre trial involving two intervention groups: one receiving brief cognitive behavioral therapy and the other receiving general practitioner care. General practitioners from 109 General Practices in Nijmegen and Amsterdam (The Netherlands) will be asked to include patients aged 18-70 years presenting with depressive symptomatology who do not receive active treatment for their depressive complaints. Patients will be assessed by telephone with the Structured Clinical Interview for DSM-IV Axis I Disorders (SCID-I) to ascertain study eligibility. Eligible patients will be randomized to one of two treatment conditions: either 8 sessions of cognitive behavioral therapy by a first-line psychologist or general practitioner's care according to The Dutch College of General Practitioners Practice Guideline (NHG-standaard). Baseline and follow-up assessments are scheduled at 0, 6, 12 and 52 weeks following the start of the intervention. Primary outcome will be measured with the Hamilton Depression Rating Scale-17 (HDRS-17) and the Patient Health Questionnaire-9 (PHQ-9). Outcomes will be analyzed on an intention-to-treat basis. Trial Registration ISRCTN65811640 PMID:20939917

  7. Privacy-Preserving Evaluation of Generalization Error and Its Application to Model and Attribute Selection

    Science.gov (United States)

    Sakuma, Jun; Wright, Rebecca N.

    Privacy-preserving classification is the task of learning or training a classifier on the union of privately distributed datasets without sharing the datasets. The emphasis of existing studies in privacy-preserving classification has primarily been put on the design of privacy-preserving versions of particular data mining algorithms. However, in classification problems, preprocessing and postprocessing steps, such as model selection or attribute selection, play a prominent role in achieving higher classification accuracy. In this paper, we show that the generalization error of classifiers in privacy-preserving classification can be securely evaluated without sharing prediction results. Our main technical contribution is a new generalized Hamming distance protocol that is universally applicable to preprocessing and postprocessing of various privacy-preserving classification problems, such as model selection in support vector machines and attribute selection in naive Bayes classification.

  8. Generalized Optical Theorem Detection in Random and Complex Media

    Science.gov (United States)

    Tu, Jing

    The problem of detecting changes of a medium or environment based on active, transmit-plus-receive wave sensor data is at the heart of many important applications including radar, surveillance, remote sensing, nondestructive testing, and cancer detection. This is a challenging problem because both the change or target and the surrounding background medium are in general unknown and can be quite complex. This Ph.D. dissertation presents a new wave physics-based approach for the detection of targets or changes in rather arbitrary backgrounds. The proposed methodology is rooted in a fundamental result of wave theory called the optical theorem, which gives real physical energy meaning to the statistics used for detection. This dissertation is composed of two main parts. The first part significantly expands the theory and understanding of the optical theorem for arbitrary probing fields and arbitrary media including nonreciprocal media, active media, as well as time-varying and nonlinear scatterers. The proposed formalism addresses both scalar and full vector electromagnetic fields. The second contribution of this dissertation is the application of the optical theorem to change detection with particular emphasis on random, complex, and active media, including single frequency probing fields and broadband probing fields. The first part of this work focuses on the generalization of the existing theoretical repertoire and interpretation of the scalar and electromagnetic optical theorem. Several fundamental generalizations of the optical theorem are developed. A new theory is developed for the optical theorem for scalar fields in nonhomogeneous media which can be bounded or unbounded. The bounded media context is essential for applications such as intrusion detection and surveillance in enclosed environments such as indoor facilities, caves, tunnels, as well as for nondestructive testing and communication systems based on wave-guiding structures. The developed scalar

  9. Programmable disorder in random DNA tilings

    Science.gov (United States)

    Tikhomirov, Grigory; Petersen, Philip; Qian, Lulu

    2017-03-01

    Scaling up the complexity and diversity of synthetic molecular structures will require strategies that exploit the inherent stochasticity of molecular systems in a controlled fashion. Here we demonstrate a framework for programming random DNA tilings and show how to control the properties of global patterns through simple, local rules. We constructed three general forms of planar network—random loops, mazes and trees—on the surface of self-assembled DNA origami arrays on the micrometre scale with nanometre resolution. Using simple molecular building blocks and robust experimental conditions, we demonstrate control of a wide range of properties of the random networks, including the branching rules, the growth directions, the proximity between adjacent networks and the size distribution. Much as combinatorial approaches for generating random one-dimensional chains of polymers have been used to revolutionize chemical synthesis and the selection of functional nucleic acids, our strategy extends these principles to random two-dimensional networks of molecules and creates new opportunities for fabricating more complex molecular devices that are organized by DNA nanostructures.

  10. Implementation of selective prevention for cardiometabolic diseases; is general practice adequately prepared?

    NARCIS (Netherlands)

    Stol, D.M.; Hollander, M.; Nielen, M.M.J.; Badenbroek, I.F.; Schellevis, F.G.; Wit, N.J. de

    2018-01-01

    Objective: Current guidelines acknowledge the need for cardiometabolic disease (CMD) prevention and recommend five-yearly screening of a targeted population. In recent years programs for selective CMD-prevention have been developed, but implementation is challenging. The question arises if general

  11. Randomization tests

    CERN Document Server

    Edgington, Eugene

    2007-01-01

    Statistical Tests That Do Not Require Random Sampling; Randomization Tests; Numerical Examples; Randomization Tests and Nonrandom Samples; The Prevalence of Nonrandom Samples in Experiments; The Irrelevance of Random Samples for the Typical Experiment; Generalizing from Nonrandom Samples; Intelligibility; Respect for the Validity of Randomization Tests; Versatility; Practicality; Precursors of Randomization Tests; Other Applications of Permutation Tests; Questions and Exercises; Notes; References; Randomized Experiments; Unique Benefits of Experiments; Experimentation without Mani

  12. The South African Stroke Risk in General Practice Study | Connor ...

    African Journals Online (AJOL)

    Two hundred general practices were randomly selected from lists provided by pharmaceutical representatives. Each GP approached 50 consecutive patients aged 30 years and older. Patients completed an information sheet and the GP documented the patient's risk factors. The resulting sample is relevant, if not necessarily ...

  13. Modified truncated randomized singular value decomposition (MTRSVD) algorithms for large scale discrete ill-posed problems with general-form regularization

    Science.gov (United States)

    Jia, Zhongxiao; Yang, Yanfei

    2018-05-01

    In this paper, we propose new randomization-based algorithms for large scale linear discrete ill-posed problems with general-form regularization: min ||Lx|| subject to x minimizing ||Ax - b||, where L is a regularization matrix. Our algorithms are inspired by the modified truncated singular value decomposition (MTSVD) method, which suits only small to medium scale problems, and randomized SVD (RSVD) algorithms that generate good low-rank approximations to A. We use rank-k truncated randomized SVD (TRSVD) approximations to A obtained by truncating the rank-(k+q) RSVD approximations to A, where q is an oversampling parameter. The resulting algorithms are called modified TRSVD (MTRSVD) methods. At every step, we use the LSQR algorithm to solve the resulting inner least squares problem, which is proved to become better conditioned as k increases so that LSQR converges faster. We present sharp bounds for the approximation accuracy of the RSVDs and TRSVDs for severely, moderately and mildly ill-posed problems, and substantially improve a known basic bound for TRSVD approximations. We prove how to choose the stopping tolerance for LSQR in order to guarantee that the computed and exact best regularized solutions have the same accuracy. Numerical experiments illustrate that the best regularized solutions by MTRSVD are as accurate as the ones by the truncated generalized singular value decomposition (TGSVD) algorithm, and at least as accurate as those by some existing truncated randomized generalized singular value decomposition (TRGSVD) algorithms. This work was supported in part by the National Science Foundation of China (Nos. 11771249 and 11371219).
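
    The randomized SVD building block that the abstract refers to can be sketched briefly; this is the standard range-finder with oversampling parameter q followed by truncation to rank k, not the authors' full MTRSVD algorithm, and the matrix sizes are illustrative.

        import numpy as np

        def truncated_randomized_svd(A, k, q=10, seed=0):
            """Rank-k truncated randomized SVD of A with oversampling parameter q."""
            rng = np.random.default_rng(seed)
            m, n = A.shape
            Omega = rng.standard_normal((n, k + q))       # random test matrix
            Q, _ = np.linalg.qr(A @ Omega)                # orthonormal basis for the sampled range
            B = Q.T @ A                                   # small (k+q) x n matrix
            Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
            U = Q @ Ub
            return U[:, :k], s[:k], Vt[:k, :]             # truncate the rank-(k+q) approximation to rank k

        A = np.random.default_rng(1).standard_normal((500, 300))
        U, s, Vt = truncated_randomized_svd(A, k=20)
        print(np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A))   # relative approximation error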

  14. Randomness at the root of things 1: Random walks

    Science.gov (United States)

    Ogborn, Jon; Collins, Simon; Brown, Mick

    2003-09-01

    This is the first of a pair of articles about randomness in physics. In this article, we use some variations on the idea of a 'random walk' to consider first the path of a particle in Brownian motion, and then the random variation to be expected in radioactive decay. The arguments are set in the context of the general importance of randomness both in physics and in everyday life. We think that the ideas could usefully form part of students' A-level work on random decay and quantum phenomena, as well as being good for their general education. In the second article we offer a novel and simple approach to Poisson sequences.
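
    A small simulation in the spirit of the article (step counts are arbitrary): for an unbiased one-dimensional random walk the mean displacement stays near zero, while the root-mean-square displacement grows like the square root of the number of steps.

        import numpy as np

        rng = np.random.default_rng(42)
        walkers, steps = 5000, 400
        moves = rng.choice([-1, 1], size=(walkers, steps))   # symmetric +/-1 steps
        positions = moves.cumsum(axis=1)

        for n in (100, 400):
            x = positions[:, n - 1]
            rms = np.sqrt((x ** 2).mean())
            print(n, round(x.mean(), 2), round(rms, 2), round(np.sqrt(n), 2))  # rms stays close to sqrt(n)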

  15. Selective decontamination in pediatric liver transplants. A randomized prospective study.

    Science.gov (United States)

    Smith, S D; Jackson, R J; Hannakan, C J; Wadowsky, R M; Tzakis, A G; Rowe, M I

    1993-06-01

    Although it has been suggested that selective decontamination of the digestive tract (SDD) decreases postoperative aerobic Gram-negative and fungal infections in orthotopic liver transplantation (OLT), no controlled trials exist in pediatric patients. This prospective, randomized controlled study of 36 pediatric OLT patients examines the effect of short-term SDD on postoperative infection and digestive tract flora. Patients were randomized into two groups. The control group received perioperative parenteral antibiotics only. The SDD group received in addition polymyxin E, tobramycin, and amphotericin B enterally and by oropharyngeal swab postoperatively until oral intake was tolerated (6 +/- 4 days). Indications for operation, preoperative status, age, and intensive care unit and hospital length of stay were no different in SDD (n = 18) and control (n = 18) groups. A total of 14 Gram-negative infections (intraabdominal abscess 7, septicemia 5, pneumonia 1, urinary tract 1) developed in the 36 patients studied. Mortality was not significantly different in the two groups. However, there were significantly fewer patients with Gram-negative infections in the SDD group: 3/18 patients (11%) vs. 11/18 patients (50%) in the control group, P < 0.001. There was also significant reduction in aerobic Gram-negative flora in the stool and pharynx in patients receiving SDD. Gram-positive and anaerobic organisms were unaffected. We conclude that short-term postoperative SDD significantly reduces Gram-negative infections in pediatric OLT patients.

  16. Prospective randomized assessment of single versus double-gloving for general surgical procedures.

    Science.gov (United States)

    Na'aya, H U; Madziga, A G; Eni, U E

    2009-01-01

    There is an increased tendency towards double-gloving by general surgeons in our practice, due probably to awareness of the risk of contamination with blood or other body fluids during surgery. The aim of the study was to compare the relative frequency of glove puncture in single-glove versus double-glove sets in general surgical procedures, and to determine whether the duration of surgery affects the perforation rate. Surgeons wore either single or double gloves at their discretion for general surgical procedures. All the gloves used by the surgeons were assessed immediately after surgery for perforation. A total of 1120 gloves were tested, of which 880 were double-glove sets and 240 single-glove sets. There was no significant difference in the overall perforation rate between single and double glove sets (18.3% versus 20%). However, only 2.3% had perforations in both the outer and inner gloves in the double glove group. Therefore, there was significantly greater risk for blood-skin exposure in the single glove sets (p < 0.01). The perforation rate was also significantly greater during procedures lasting an hour or more compared to those lasting less than an hour (p < 0.01). Double-gloving reduces the risk of blood-skin contamination in all general surgical procedures, and especially so in procedures lasting an hour or more.

  17. A randomized, controlled clinical trial: the effect of mindfulness-based cognitive therapy on generalized anxiety disorder among Chinese community patients: protocol for a randomized trial

    Directory of Open Access Journals (Sweden)

    Wong Samuel YS

    2011-11-01

    Full Text Available Abstract Background Research suggests that an eight-week Mindfulness-Based Cognitive Therapy (MBCT) program may be effective in the treatment of generalized anxiety disorders. Our objective is to compare the clinical effectiveness of the MBCT program with a psycho-education programme and usual care in reducing anxiety symptoms in people suffering from generalized anxiety disorder. Methods A three-armed randomized, controlled clinical trial including 9-month post-treatment follow-up is proposed. Participants screened positive using the Structured Clinical Interview for DSM-IV (SCID) for generalized anxiety disorder will be recruited from community-based clinics. 228 participants will be randomly allocated to the MBCT program plus usual care, the psycho-education program plus usual care or the usual care group. Validated Chinese versions of instruments measuring anxiety and worry symptoms, depression, quality of life and health service utilization will be used. Our primary end point is the change in anxiety and worry score (Beck Anxiety Inventory and Penn State Worry Scale) from baseline to the end of intervention. For primary analyses, treatment outcomes will be assessed by ANCOVA, with change in anxiety score as the outcome variable, while the baseline anxiety score and other baseline characteristics that significantly differ between groups will serve as covariates. Conclusions This is the first randomized controlled trial to compare the effectiveness of MBCT with an active control; the findings will advance current knowledge in the management of GAD and the way that group intervention can be delivered, and will inform future research. Unique Trial Number (assigned by the Centre for Clinical Trials, Clinical Trials registry, The Chinese University of Hong Kong): CUHK_CCT00267

  18. Reporting quality of randomized controlled trial abstracts: survey of leading general dental journals.

    Science.gov (United States)

    Hua, Fang; Deng, Lijia; Kau, Chung How; Jiang, Han; He, Hong; Walsh, Tanya

    2015-09-01

    The authors conducted a study to assess the reporting quality of randomized controlled trial (RCT) abstracts published in leading general dental journals, investigate any improvement after the release of the Consolidated Standards of Reporting Trials (CONSORT) for Abstracts guidelines, and identify factors associated with better reporting quality. The authors searched PubMed for RCTs published in 10 leading general dental journals during the periods from 2005 to 2007 (pre-CONSORT period) and 2010 to 2012 (post-CONSORT period). The authors evaluated and scored the reporting quality of included abstracts by using the original 16-item CONSORT for Abstracts checklist. The authors used risk ratios and the t test to compare the adequate reporting rate of each item and the overall quality in the 2 periods. The authors used univariate and multivariate regressions to identify predictors of better reporting quality. The authors included and evaluated 276 RCT abstracts. Investigators reported significantly more checklist items during the post-CONSORT period (mean [standard deviation {SD}], 4.53 [1.69]) than during the pre-CONSORT period (mean [SD], 3.87 [1.10]; mean difference, -0.66 [95% confidence interval, -0.99 to -0.33]; P < .05). Some checklist items were reported adequately in most abstracts (> 80%). In contrast, the authors saw sufficient reporting of randomization, recruitment, outcome in the results section, and funding in none of the pre-CONSORT abstracts and less than 2% of the post-CONSORT abstracts. On the basis of the multivariate analysis, a higher impact factor was significantly associated with better reporting quality. The reporting quality of RCT abstracts in leading general dental journals has improved significantly, but there is still room for improvement. Joint efforts by authors, reviewers, journal editors, and other stakeholders to improve the reporting of dental RCT abstracts are needed. Copyright © 2015 American Dental Association. Published by Elsevier Inc. All rights reserved.

  19. Ambulatory blood pressure monitoring for hypertension in general practice.

    OpenAIRE

    Taylor, R S; Stockman, J; Kernick, D; Reinhold, D; Shore, A C; Tooke, J E

    1998-01-01

    Ambulatory blood pressure monitoring (ABPM) is being increasingly used in general practice. There is at present little published evidence regarding the clinical utility of ABPM in the care of patients with established hypertension in this setting. We examined this issue by undertaking ABPM in a group of patients with established hypertension. 40 patients (aged 33-60 years) currently being treated for hypertension were randomly selected from a general practice list and underwent a single 24-ho...

  20. Oracle Efficient Variable Selection in Random and Fixed Effects Panel Data Models

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl

    This paper generalizes the results for the Bridge estimator of Huang et al. (2008) to linear random and fixed effects panel data models which are allowed to grow in both dimensions. In particular we show that the Bridge estimator is oracle efficient. It can correctly distinguish between relevant and irrelevant variables and the asymptotic distribution of the estimators of the coefficients of the relevant variables is the same as if only these had been included in the model, i.e. as if an oracle had revealed the true model prior to estimation. In the case of more explanatory variables than observations, we prove that the Marginal Bridge estimator can asymptotically correctly distinguish between relevant and irrelevant explanatory variables. We do this without restricting the dependence between covariates and without assuming sub-Gaussianity of the error terms, thereby generalizing the results...

  1. Distribution of orientation selectivity in recurrent networks of spiking neurons with different random topologies.

    Science.gov (United States)

    Sadeh, Sadra; Rotter, Stefan

    2014-01-01

    Neurons in the primary visual cortex are more or less selective for the orientation of a light bar used for stimulation. A broad distribution of individual grades of orientation selectivity has in fact been reported in all species. A possible reason for emergence of broad distributions is the recurrent network within which the stimulus is being processed. Here we compute the distribution of orientation selectivity in randomly connected model networks that are equipped with different spatial patterns of connectivity. We show that, for a wide variety of connectivity patterns, a linear theory based on firing rates accurately approximates the outcome of direct numerical simulations of networks of spiking neurons. Distance dependent connectivity in networks with a more biologically realistic structure does not compromise our linear analysis, as long as the linearized dynamics, and hence the uniform asynchronous irregular activity state, remain stable. We conclude that linear mechanisms of stimulus processing are indeed responsible for the emergence of orientation selectivity and its distribution in recurrent networks with functionally heterogeneous synaptic connectivity.

  2. Dynamic Average Consensus and Consensusability of General Linear Multiagent Systems with Random Packet Dropout

    Directory of Open Access Journals (Sweden)

    Wen-Min Zhou

    2013-01-01

    Full Text Available This paper is concerned with the consensus problem of general linear discrete-time multiagent systems (MASs) with random packet dropout that happens during information exchange between agents. The packet dropout phenomenon is characterized as a Bernoulli random process. A distributed consensus protocol with a weighted graph is proposed to address the packet dropout phenomenon. By introducing a new disagreement vector, a new framework is established to solve the consensus problem. Based on control theory, a perturbation argument, and matrix theory, the necessary and sufficient condition for MASs to reach mean-square consensus is derived in terms of the stability of an array of low-dimensional matrices. Moreover, mean-square consensusability conditions with regard to network topology and agent dynamic structure are also provided. Finally, the effectiveness of the theoretical results is demonstrated through an illustrative example.

  3. Collective excitations in the Penson-Kolb model: A generalized random-phase-approximation study

    International Nuclear Information System (INIS)

    Roy, G.K.; Bhattacharyya, B.

    1997-01-01

    The evolution of the superconducting ground state of the half-filled Penson-Kolb model is examined as a function of the coupling constant using a mean-field approach and the generalized random phase approximation (RPA) in two and three dimensions. On-site singlet pairs hop to compete against single-particle motion in this model, giving the coupling constant a strong momentum dependence. There is a pronounced bandwidth enhancement effect that converges smoothly to a finite value in the strong-coupling (Bose) regime. The low-lying collective excitations evaluated in generalized RPA show a linear dispersion and a gradual crossover from the weak-coupling (BCS) limit to the Bose regime; the mode velocity increases monotonically in sharp contrast to the attractive Hubbard model. Analytical results are derived in the asymptotic limits. copyright 1997 The American Physical Society

  4. Selecting, training and assessing new general practice community teachers in UK medical schools.

    Science.gov (United States)

    Hydes, Ciaran; Ajjawi, Rola

    2015-09-01

    Standards for undergraduate medical education in the UK, published in Tomorrow's Doctors, include the criterion 'everyone involved in educating medical students will be appropriately selected, trained, supported and appraised'. To establish how new general practice (GP) community teachers of medical students are selected, initially trained and assessed by UK medical schools and establish the extent to which Tomorrow's Doctors standards are being met. A mixed-methods study with questionnaire data collected from 24 lead GPs at UK medical schools, 23 new GP teachers from two medical schools plus a semi-structured telephone interview with two GP leads. Quantitative data were analysed descriptively and qualitative data were analysed informed by framework analysis. GP teachers' selection is non-standardised. One hundred per cent of GP leads provide initial training courses for new GP teachers; 50% are mandatory. The content and length of courses varies. All GP leads use student feedback to assess teaching, but other required methods (peer review and patient feedback) are not universally used. To meet General Medical Council standards, medical schools need to include equality and diversity in initial training and use more than one method to assess new GP teachers. Wider debate about the selection, training and assessment of new GP teachers is needed to agree minimum standards.

  5. Generalized Encoding CRDSA: Maximizing Throughput in Enhanced Random Access Schemes for Satellite

    Directory of Open Access Journals (Sweden)

    Manlio Bacco

    2014-12-01

    Full Text Available This work starts from an analysis of the literature about Random Access protocols with contention resolution, such as Contention Resolution Diversity Slotted Aloha (CRDSA), and introduces a possible enhancement, named Generalized Encoding Contention Resolution Diversity Slotted Aloha (GE-CRDSA). GE-CRDSA aims at improving the aggregated throughput when the system load is less than 50%, exploiting the possibility of transmitting an optimal combination of information and parity packets frame by frame. This paper shows the improvement in terms of throughput, obtained by performing traffic estimation and adaptive choice of information and parity rates, when a satellite network undergoes a variable traffic load profile.

  6. The Bethe Sum Rule and Basis Set Selection in the Calculation of Generalized Oscillator Strengths

    DEFF Research Database (Denmark)

    Cabrera-Trujillo, Remigio; Sabin, John R.; Oddershede, Jens

    1999-01-01

    Fulfillment of the Bethe sum rule may be construed as a measure of basis set quality for atomic and molecular properties involving the generalized oscillator strength distribution. It is first shown that, in the case of a complete basis, the Bethe sum rule is fulfilled exactly in the random phase approximation...

  7. Discrete least squares polynomial approximation with random evaluations - application to PDEs with Random parameters

    KAUST Repository

    Nobile, Fabio

    2015-01-07

    We consider a general problem F(u, y) = 0 where u is the unknown solution, possibly Hilbert space valued, and y a set of uncertain parameters. We specifically address the situation in which the parameter-to-solution map u(y) is smooth, but y could be very high (or even infinite) dimensional. In particular, we are interested in cases in which F is a differential operator, u a Hilbert space valued function and y a distributed, space and/or time varying, random field. We aim at reconstructing the parameter-to-solution map u(y) from random noise-free or noisy observations in random points by discrete least squares on polynomial spaces. The noise-free case is relevant whenever the technique is used to construct metamodels, based on polynomial expansions, for the output of computer experiments. In the case of PDEs with random parameters, the metamodel is then used to approximate statistics of the output quantity. We discuss the stability of discrete least squares on random points and show convergence estimates both in expectation and in probability. We also present possible strategies to select, either a priori or by adaptive algorithms, sequences of approximating polynomial spaces that allow one to reduce, and in some cases break, the curse of dimensionality.
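
    A toy version of the reconstruction idea, assuming a one-dimensional parameter y and a smooth scalar map u(y): a polynomial is fitted by discrete least squares to noise-free evaluations at random points and then checked on a fine grid. The map and the sizes are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(0)
        u = lambda y: np.exp(-y) * np.sin(3 * y)                  # smooth parameter-to-solution map (illustrative)

        y_pts = rng.uniform(-1, 1, size=200)                      # random evaluation points
        degree = 12
        V = np.polynomial.legendre.legvander(y_pts, degree)       # Legendre basis on [-1, 1]
        coeffs, *_ = np.linalg.lstsq(V, u(y_pts), rcond=None)     # discrete least squares fit

        y_test = np.linspace(-1, 1, 1000)
        approx = np.polynomial.legendre.legval(y_test, coeffs)
        print(np.max(np.abs(approx - u(y_test))))                 # small error for a smooth map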

  8. Topics in random walks in random environment

    International Nuclear Information System (INIS)

    Sznitman, A.-S.

    2004-01-01

    Over the last twenty-five years random motions in random media have been intensively investigated and some new general methods and paradigms have by now emerged. Random walks in random environment constitute one of the canonical models of the field. However in dimension bigger than one they are still poorly understood and many of the basic issues remain to this day unresolved. The present series of lectures attempt to give an account of the progresses which have been made over the last few years, especially in the study of multi-dimensional random walks in random environment with ballistic behavior. (author)

  9. General service and child immunization-specific readiness assessment of healthcare facilities in two selected divisions in Bangladesh.

    Science.gov (United States)

    Shawon, Md Shajedur Rahman; Adhikary, Gourab; Ali, Md Wazed; Shamsuzzaman, Md; Ahmed, Shahabuddin; Alam, Nurul; Shackelford, Katya A; Woldeab, Alexander; Lim, Stephen S; Levine, Aubrey; Gakidou, Emmanuela; Uddin, Md Jasim

    2018-01-25

    Service readiness of health facilities is an integral part of providing comprehensive quality healthcare to the community. Comprehensive assessment of general and service-specific (i.e. child immunization) readiness will help to identify the bottlenecks in healthcare service delivery and gaps in equitable service provision. Assessing healthcare facilities readiness also helps in optimal policymaking and resource allocation. A health facility survey was conducted between March 2015 and December 2015 in two purposively selected divisions in Bangladesh; i.e. Rajshahi division (high performing) and Sylhet division (low performing). A total of 123 health facilities were randomly selected from different levels of service, both public and private, with variation in sizes and patient loads from the list of facilities. Data on various aspects of healthcare facility were collected by interviewing key personnel. General service and child immunization specific service readiness were assessed using the Service Availability and Readiness Assessment (SARA) manual developed by World Health Organization (WHO). The analyses were stratified by division and level of healthcare facilities. The general service readiness index for pharmacies, community clinics, primary care facilities and higher care facilities were 40.6%, 60.5%, 59.8% and 69.5%, respectively in Rajshahi division and 44.3%, 57.8%, 57.5% and 73.4%, respectively in Sylhet division. Facilities at all levels had the highest scores for basic equipment (ranged between 51.7% and 93.7%) and the lowest scores for diagnostic capacity (ranged between 0.0% and 53.7%). Though facilities with vaccine storage capacity had very high levels of service readiness for child immunization, facilities without vaccine storage capacity lacked availability of many tracer items. Regarding readiness for newly introduced pneumococcal conjugate vaccine (PCV) and inactivated polio vaccine (IPV), most of the surveyed facilities reported lack of

  10. The Risk of Bias in Randomized Trials in General Dentistry Journals.

    Science.gov (United States)

    Hinton, Stephanie; Beyari, Mohammed M; Madden, Kim; Lamfon, Hanadi A

    2015-01-01

    The use of a randomized controlled trial (RCT) research design is considered the gold standard for conducting evidence-based clinical research. In the present study, we aimed to assess the quality of RCTs in dentistry and create a general foundation for evidence-based dentistry on which to perform subsequent RCTs. We conducted a systematic assessment of bias of RCTs in seven general dentistry journals published between January 2011 and March 2012. We extracted study characteristics in duplicate and assessed each trial's quality using the Cochrane Risk of Bias tool. We compared risk of bias across studies graphically. Among 1,755 studies across seven journals, we identified 67 RCTs. Many included studies were conducted in Europe (39%), with an average sample size of 358 participants. These studies included 52% female participants and the maximum follow-up period was 13 years. Overall, we found a high percentage of unclear risk of bias among included RCTs, indicating poor quality of reporting within the included studies. An overall high proportion of trials with an "unclear risk of bias" suggests the need for better quality of reporting in dentistry. As such, key concepts in dental research and future trials should focus on high-quality reporting.

  11. Intensive versus conventional blood pressure monitoring in a general practice population. The Blood Pressure Reduction in Danish General Practice trial: a randomized controlled parallel group trial

    DEFF Research Database (Denmark)

    Klarskov, Pia; Bang, Lia E; Schultz-Larsen, Peter

    2018-01-01

    To compare the effect of a conventional versus an intensive blood pressure monitoring regimen on blood pressure in hypertensive patients in the general practice setting. Randomized controlled parallel group trial with 12-month follow-up. One hundred and ten general practices in all regions of Denmark. One thousand forty-eight patients with essential hypertension. Conventional blood pressure monitoring ('usual group') continued usual ad hoc blood pressure monitoring by office blood pressure measurements, while intensive blood pressure monitoring ('intensive group') supplemented this with frequent ... a reduction of blood pressure. Clinical Trials NCT00244660.

  12. Properties of Risk Measures of Generalized Entropy in Portfolio Selection

    Directory of Open Access Journals (Sweden)

    Rongxi Zhou

    2017-12-01

    Full Text Available This paper systematically investigates the properties of six kinds of entropy-based risk measures: Information Entropy and Cumulative Residual Entropy in the probability space, Fuzzy Entropy, Credibility Entropy and Sine Entropy in the fuzzy space, and Hybrid Entropy in the hybridized uncertainty of both fuzziness and randomness. We discover that none of the risk measures satisfy all six of the following properties, which various scholars have associated with effective risk measures: Monotonicity, Translation Invariance, Sub-additivity, Positive Homogeneity, Consistency and Convexity. Measures based on Fuzzy Entropy, Credibility Entropy, and Sine Entropy all exhibit the same properties: Sub-additivity, Positive Homogeneity, Consistency, and Convexity. These measures based on Information Entropy and Hybrid Entropy, meanwhile, only exhibit Sub-additivity and Consistency. Cumulative Residual Entropy satisfies just Sub-additivity, Positive Homogeneity, and Convexity. After identifying these properties, we develop seven portfolio models based on different risk measures and made empirical comparisons using samples from both the Shenzhen Stock Exchange of China and the New York Stock Exchange of America. The comparisons show that the Mean Fuzzy Entropy Model performs the best among the seven models with respect to both daily returns and relative cumulative returns. Overall, these results could provide an important reference for both constructing effective risk measures and rationally selecting the appropriate risk measure under different portfolio selection conditions.

  13. Randomized random walk on a random walk

    International Nuclear Information System (INIS)

    Lee, P.A.

    1983-06-01

    This paper discusses generalizations of the model introduced by Kehr and Kunter of the random walk of a particle on a one-dimensional chain which in turn has been constructed by a random walk procedure. The superimposed random walk is randomised in time according to the occurrences of a stochastic point process. The probability of finding the particle in a particular position at a certain instant is obtained explicitly in the transform domain. It is found that the asymptotic behaviour for large time of the mean-square displacement of the particle depends critically on the assumed structure of the basic random walk, giving a diffusion-like term for an asymmetric walk or a square root law if the walk is symmetric. Many results are obtained in closed form for the Poisson process case, and these agree with those given previously by Kehr and Kunter. (author)

  14. Peer-Led Self-Management of General Medical Conditions for Patients With Serious Mental Illnesses: A Randomized Trial.

    Science.gov (United States)

    Druss, Benjamin G; Singh, Manasvini; von Esenwein, Silke A; Glick, Gretl E; Tapscott, Stephanie; Tucker, Sherry Jenkins; Lally, Cathy A; Sterling, Evelina W

    2018-02-01

    Individuals with serious mental illnesses have high rates of general medical comorbidity and challenges in managing these conditions. A growing workforce of certified peer specialists is available to help these individuals more effectively manage their health and health care. However, few studies have examined the effectiveness of peer-led programs for self-management of general medical conditions for this population. This randomized study enrolled 400 participants with a serious mental illness and one or more chronic general medical conditions across three community mental health clinics. Participants were randomly assigned to the Health and Recovery Peer (HARP) program, a self-management program for general medical conditions led by certified peer specialists (N=198), or to usual care (N=202). Assessments were conducted at baseline and three and six months. At six months, participants in the intervention group demonstrated a significant differential improvement in the primary study outcome, health-related quality of life. Specifically, compared with the usual care group, intervention participants had greater improvement in the Short-Form Health Survey physical component summary (an increase of 2.7 versus 1.4 points, p=.046) and mental component summary (4.6 versus 2.5 points, p=.039). Significantly greater six-month improvements in mental health recovery were seen for the intervention group (p=.02), but no other between-group differences in secondary outcome measures were significant. The HARP program was associated with improved physical health- and mental health-related quality of life among individuals with serious mental illness and comorbid general medical conditions, suggesting the potential benefits of more widespread dissemination of peer-led disease self-management in this population.

  15. Exploring pseudo- and chaotic random Monte Carlo simulations

    Science.gov (United States)

    Blais, J. A. Rod; Zhang, Zhan

    2011-07-01

    Computer simulations are an increasingly important area of geoscience research and development. At the core of stochastic or Monte Carlo simulations are the random number sequences that are assumed to be distributed with specific characteristics. Computer-generated random numbers, uniformly distributed on (0, 1), can be very different depending on the selection of pseudo-random number (PRN) or chaotic random number (CRN) generators. In the evaluation of some definite integrals, the resulting error variances can even be of different orders of magnitude. Furthermore, practical techniques for variance reduction such as importance sampling and stratified sampling can be applied in most Monte Carlo simulations and significantly improve the results. A comparative analysis of these strategies has been carried out for computational applications in planar and spatial contexts. Based on these experiments, and on some practical examples of geodetic direct and inverse problems, conclusions and recommendations concerning their performance and general applicability are included.
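
    A small comparison, with invented settings, of plain Monte Carlo against stratified sampling for a one-dimensional definite integral, using NumPy's pseudo-random generator; a chaotic generator could be swapped in as the source of the uniform samples.

        import numpy as np

        f = lambda x: np.exp(-x ** 2)       # integrand on (0, 1); exact value is about 0.746824
        n, reps = 1000, 200
        rng = np.random.default_rng(0)

        plain, strat = [], []
        for _ in range(reps):
            u = rng.random(n)
            plain.append(f(u).mean())                        # plain Monte Carlo estimate
            v = (np.arange(n) + rng.random(n)) / n           # one sample per stratum of width 1/n
            strat.append(f(v).mean())

        print("plain MC estimator variance:  ", np.var(plain))
        print("stratified estimator variance:", np.var(strat))   # far smaller for this smooth integrand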

  16. Surveillance of acute respiratory infections in general practices - The Netherlands, winter 1997/98

    NARCIS (Netherlands)

    Heijnen MLA; Bartelds AIM; Wilbrink B; Verweij C; Bijlsma K; Nat H van der; Boswijk H; Boer AB de; Sprenger MJW; Dorigo-Zetsma JW; NIVEL; CIE; NIVEL; LIS

    1999-01-01

    To provide insight into the virological aetiology of influenza-like illnesses and other acute respiratory infections, nose/throat swabs were taken by 30 general practitioners of the sentinel surveillance network of the Netherlands Institute of Primary Health Care from a random selection of patients

  17. Generalized index for spatial data sets as a measure of complete spatial randomness

    Science.gov (United States)

    Hackett-Jones, Emily J.; Davies, Kale J.; Binder, Benjamin J.; Landman, Kerry A.

    2012-06-01

    Spatial data sets, generated from a wide range of physical systems can be analyzed by counting the number of objects in a set of bins. Previous work has been limited to equal-sized bins, which are inappropriate for some domains (e.g., circular). We consider a nonequal size bin configuration whereby overlapping or nonoverlapping bins cover the domain. A generalized index, defined in terms of a variance between bin counts, is developed to indicate whether or not a spatial data set, generated from exclusion or nonexclusion processes, is at the complete spatial randomness (CSR) state. Limiting values of the index are determined. Using examples, we investigate trends in the generalized index as a function of density and compare the results with those using equal size bins. The smallest bin size must be much larger than the mean size of the objects. We can determine whether a spatial data set is at the CSR state or not by comparing the values of a generalized index for different bin configurations—the values will be approximately the same if the data is at the CSR state, while the values will differ if the data set is not at the CSR state. In general, the generalized index is lower than the limiting value of the index, since objects do not have access to the entire region due to blocking by other objects. These methods are applied to two applications: (i) spatial data sets generated from a cellular automata model of cell aggregation in the enteric nervous system and (ii) a known plant data distribution.
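
    A generic variance-based dispersion statistic gives the flavor of the index described above, although it is not the paper's exact generalized index: under complete spatial randomness the variance of the bin counts stays close to their mean, while clustering inflates it. The bin edges and point patterns below are illustrative.

        import numpy as np

        def dispersion_index(points, edges_x, edges_y):
            """Variance-to-mean ratio of 2-D bin counts (close to 1 under CSR).
            For unequal-size bins the counts would first be normalized by bin area."""
            counts, _, _ = np.histogram2d(points[:, 0], points[:, 1], bins=[edges_x, edges_y])
            counts = counts.ravel()
            return counts.var(ddof=1) / counts.mean()

        rng = np.random.default_rng(0)
        csr_points = rng.random((2000, 2))                          # uniform, independent points
        clustered = 0.5 + 0.05 * rng.standard_normal((2000, 2))     # tightly clustered points
        edges = np.linspace(0.0, 1.0, 11)
        print(dispersion_index(csr_points, edges, edges))           # close to 1
        print(dispersion_index(clustered, edges, edges))            # much larger than 1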

  18. Variance-based selection may explain general mating patterns in social insects.

    Science.gov (United States)

    Rueppell, Olav; Johnson, Nels; Rychtár, Jan

    2008-06-23

    Female mating frequency is one of the key parameters of social insect evolution. Several hypotheses have been suggested to explain multiple mating and considerable empirical research has led to conflicting results. Building on several earlier analyses, we present a simple general model that links the number of queen matings to variance in colony performance and this variance to average colony fitness. The model predicts selection for multiple mating if the average colony succeeds in a focal task, and selection for single mating if the average colony fails, irrespective of the proximate mechanism that links genetic diversity to colony fitness. Empirical support comes from interspecific comparisons, e.g. between the bee genera Apis and Bombus, and from data on several ant species, but more comprehensive empirical tests are needed.

  19. Total variation regularization of the 3-D gravity inverse problem using a randomized generalized singular value decomposition

    Science.gov (United States)

    Vatankhah, Saeed; Renaut, Rosemary A.; Ardestani, Vahid E.

    2018-04-01

    We present a fast algorithm for the total variation regularization of the 3-D gravity inverse problem. Through imposition of the total variation regularization, subsurface structures presenting with sharp discontinuities are preserved better than when using a conventional minimum-structure inversion. The associated problem formulation for the regularization is nonlinear but can be solved using an iteratively reweighted least-squares algorithm. For small-scale problems the regularized least-squares problem at each iteration can be solved using the generalized singular value decomposition. This is not feasible for large-scale, or even moderate-scale, problems. Instead we introduce the use of a randomized generalized singular value decomposition in order to reduce the dimensions of the problem and provide an effective and efficient solution technique. For further efficiency an alternating direction algorithm is used to implement the total variation weighting operator within the iteratively reweighted least-squares algorithm. Presented results for synthetic examples demonstrate that the novel randomized decomposition provides good accuracy for reduced computational and memory demands as compared to use of classical approaches.
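
    For orientation only, the iteratively reweighted least-squares idea mentioned above can be sketched in one dimension with dense linear algebra; the paper's contribution, the randomized generalized SVD for large 3-D problems, is not reproduced here, and the test problem is synthetic.

        import numpy as np

        def tv_irls(A, b, lam=1.0, iters=30, eps=1e-6):
            """Total-variation-regularized least squares via iteratively reweighted least squares (1-D sketch)."""
            n = A.shape[1]
            D = np.diff(np.eye(n), axis=0)                 # first-difference (gradient) operator
            x = np.linalg.lstsq(A, b, rcond=None)[0]
            for _ in range(iters):
                w = 1.0 / np.sqrt((D @ x) ** 2 + eps)      # reweighting of the TV term
                H = A.T @ A + lam * D.T @ (w[:, None] * D)
                x = np.linalg.solve(H, A.T @ b)
            return x

        rng = np.random.default_rng(0)
        x_true = np.concatenate([np.zeros(40), np.ones(30), 0.4 * np.ones(30)])   # blocky model
        A = rng.standard_normal((80, 100))
        b = A @ x_true + 0.01 * rng.standard_normal(80)
        x_rec = tv_irls(A, b, lam=1.0)
        print(np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))            # illustrative recovery error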

  20. Achieving involvement: process outcomes from a cluster randomized trial of shared decision making skill development and use of risk communication aids in general practice.

    Science.gov (United States)

    Elwyn, G; Edwards, A; Hood, K; Robling, M; Atwell, C; Russell, I; Wensing, M; Grol, R

    2004-08-01

    A consulting method known as 'shared decision making' (SDM) has been described and operationalized in terms of several 'competences'. One of these competences concerns the discussion of the risks and benefits of treatment or care options-'risk communication'. Few data exist on clinicians' ability to acquire skills and implement the competences of SDM or risk communication in consultations with patients. The aims of this study were to evaluate the effects of skill development workshops for SDM and the use of risk communication aids on the process of consultations. A cluster randomized trial with crossover was carried out with the participation of 20 recently qualified GPs in urban and rural general practices in Gwent, South Wales. A total of 747 patients with known atrial fibrillation, prostatism, menorrhagia or menopausal symptoms were invited to a consultation to review their condition or treatments. Half the consultations were randomly selected for audio-taping, of which 352 patients attended and were audio-taped successfully. After baseline, participating doctors were randomized to receive training in (i) SDM skills or (ii) the use of simple risk communication aids, using simulated patients. The alternative training was then provided for the final study phase. Patients were allocated randomly to a consultation during baseline or intervention 1 (SDM or risk communication aids) or intervention 2 phases. A randomly selected half of the consultations were audio-taped from each phase. Raters (independent, trained and blinded to study phase) assessed the audio-tapes using a validated scale to assess levels of patient involvement (OPTION: observing patient involvement), and to analyse the nature of risk information discussed. Clinicians completed questionnaires after each consultation, assessing perceived clinician-patient agreement and level of patient involvement in decisions. Multilevel modelling was carried out with the OPTION score as the dependent variable, and

  1. Teacher's and Students' Beliefs on English for General Academic Purposes: The Case of Iranian University Students

    Science.gov (United States)

    Kojour, Masoud Kermani; Heirati, Javad Kia

    2015-01-01

    This study was framed in the sociocultural theory to look into the evolution of L2 learners' beliefs about the general English course during a term. One hundred ninety-eight male and female university students and their general English course teacher were randomly selected as the participants of the study. Data were gathered through the…

  2. Is self-selection the main driver of positive interpretations of general health checks?

    DEFF Research Database (Denmark)

    Bender, Anne Mette; Jørgensen, Torben; Pisinger, Charlotta

    2015-01-01

    OBJECTIVE: To investigate if the lower mortality among participants of a health check followed by lifestyle intervention of high risk persons is explained by self-selection. METHODS: All persons residing in the study area (Copenhagen; Denmark) were randomized to intervention (n=11,629) or control...... group (n=47,987). Persons in the intervention group were invited for a health check and individual lifestyle counselling. At baseline, 52.5% participated. Differences between participants and control group in 10-year all-cause and disease specific mortality was assessed. In survival analyses we...... was seen both for lifestyle related and non-lifestyle related diseases....

  3. A guide to developing resource selection functions from telemetry data using generalized estimating equations and generalized linear mixed models

    Directory of Open Access Journals (Sweden)

    Nicola Koper

    2012-03-01

    Full Text Available Resource selection functions (RSF) are often developed using satellite (ARGOS) or Global Positioning System (GPS) telemetry datasets, which provide a large amount of highly correlated data. We discuss and compare the use of generalized linear mixed-effects models (GLMM) and generalized estimating equations (GEE) for using this type of data to develop RSFs. GLMMs directly model differences among caribou, while GEEs depend on an adjustment of the standard error to compensate for correlation of data points within individuals. Empirical standard errors, rather than model-based standard errors, must be used with either GLMMs or GEEs when developing RSFs. There are several important differences between these approaches; in particular, GLMMs are best for producing parameter estimates that predict how management might influence individuals, while GEEs are best for predicting how management might influence populations. As the interpretation, value, and statistical significance of both types of parameter estimates differ, it is important that users select the appropriate analytical method. We also outline the use of k-fold cross validation to assess the fit of these models. Both GLMMs and GEEs hold promise for developing RSFs as long as they are used appropriately.
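
    One concrete way to set up the GEE variant described above, assuming the statsmodels package and hypothetical column names for a used/available telemetry data set: a logistic model with animals as clusters and empirical (robust) standard errors. A GLMM would instead add a random intercept (and possibly random coefficients) per animal.

        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        # Hypothetical data: one row per used or available point, with columns
        # 'used' (0/1), 'elevation', 'cover' and 'animal_id'
        df = pd.read_csv("telemetry_rsf_points.csv")

        model = smf.gee("used ~ elevation + cover",
                        groups="animal_id", data=df,
                        family=sm.families.Binomial(),
                        cov_struct=sm.cov_struct.Exchangeable())
        result = model.fit()          # GEE reports empirical ("robust") standard errors by default
        print(result.summary())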

  4. An MGF-based unified framework to determine the joint statistics of partial sums of ordered i.n.d. random variables

    KAUST Repository

    Nam, Sungsik

    2014-08-01

    The joint statistics of partial sums of ordered random variables (RVs) are often needed for the accurate performance characterization of a wide variety of wireless communication systems. A unified analytical framework to determine the joint statistics of partial sums of ordered independent and identically distributed (i.i.d.) random variables was recently presented. However, the identical distribution assumption may not be valid in several real-world applications. With this motivation in mind, we consider in this paper the more general case in which the random variables are independent but not necessarily identically distributed (i.n.d.). More specifically, we extend the previous analysis and introduce a new more general unified analytical framework to determine the joint statistics of partial sums of ordered i.n.d. RVs. Our mathematical formalism is illustrated with an application on the exact performance analysis of the capture probability of generalized selection combining (GSC)-based RAKE receivers operating over frequency-selective fading channels with a non-uniform power delay profile. © 1991-2012 IEEE.
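
    A quick Monte Carlo cross-check of the kind of quantity such a framework targets, under assumed parameters: the combined SNR of a GSC receiver is the partial sum of the Lc largest of L independent but non-identically distributed branch SNRs (exponential here, i.e. Rayleigh fading with a non-uniform power delay profile), from which a capture or outage probability can be estimated.

        import numpy as np

        rng = np.random.default_rng(0)
        branch_means = np.array([1.0, 0.8, 0.6, 0.4, 0.2])      # assumed non-uniform power delay profile
        L, Lc, trials = branch_means.size, 3, 200_000

        snr = rng.exponential(branch_means, size=(trials, L))   # i.n.d. exponential branch SNRs
        ordered = np.sort(snr, axis=1)[:, ::-1]                 # order statistics, largest first
        gsc_snr = ordered[:, :Lc].sum(axis=1)                   # partial sum of the Lc strongest branches

        threshold = 1.5                                         # illustrative capture threshold
        print("P(combined SNR > threshold) ~", (gsc_snr > threshold).mean())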

  5. A general definition of the heritable variation that determines the potential of a population to respond to selection.

    Science.gov (United States)

    Bijma, Piter

    2011-12-01

    Genetic selection is a major force shaping life on earth. In classical genetic theory, response to selection is the product of the strength of selection and the additive genetic variance in a trait. The additive genetic variance reflects a population's intrinsic potential to respond to selection. The ordinary additive genetic variance, however, ignores the social organization of life. With social interactions among individuals, individual trait values may depend on genes in others, a phenomenon known as indirect genetic effects. Models accounting for indirect genetic effects, however, lack a general definition of heritable variation. Here I propose a general definition of the heritable variation that determines the potential of a population to respond to selection. This generalizes the concept of heritable variance to any inheritance model and level of organization. The result shows that heritable variance determining potential response to selection is the variance among individuals in the heritable quantity that determines the population mean trait value, rather than the usual additive genetic component of phenotypic variance. It follows, therefore, that heritable variance may exceed phenotypic variance among individuals, which is impossible in classical theory. This work also provides a measure of the utilization of heritable variation for response to selection and integrates two well-known models of maternal genetic effects. The result shows that relatedness between the focal individual and the individuals affecting its fitness is a key determinant of the utilization of heritable variance for response to selection.

  6. Selection, integration, and conflict monitoring; assessing the nature and generality of prefrontal cognitive control mechanisms.

    Science.gov (United States)

    Badre, David; Wagner, Anthony D

    2004-02-05

    Prefrontal cortex (PFC) supports flexible behavior by mediating cognitive control, though the elemental forms of control supported by PFC remain a central debate. Dorsolateral PFC (DLPFC) is thought to guide response selection under conditions of response conflict or, alternatively, may refresh recently active representations within working memory. Lateral frontopolar cortex (FPC) may also adjudicate response conflict, though others propose that FPC supports higher order control processes such as subgoaling and integration. Anterior cingulate cortex (ACC) is hypothesized to upregulate response selection by detecting response conflict; it remains unclear whether ACC functions generalize beyond monitoring response conflict. The present fMRI experiment directly tested these competing theories regarding the functional roles of DLPFC, FPC, and ACC. Results reveal dissociable control processes in PFC, with mid-DLPFC selectively mediating resolution of response conflict and FPC further mediating subgoaling/integration. ACC demonstrated a broad sensitivity to control demands, suggesting a generalized role in modulating cognitive control.

  7. The fuzzy TOPSIS and generalized Choquet fuzzy integral algorithm for nuclear power plant site selection - a case study from Turkey

    International Nuclear Information System (INIS)

    Kurt, Ünal

    2014-01-01

    The location selection for a nuclear power plant (NPP) is a strategic decision, which has a significant impact on the economic operation of the plant and the sustainable development of the region. This paper proposes fuzzy TOPSIS and a generalized Choquet fuzzy integral algorithm for the evaluation and selection of optimal locations for an NPP in Turkey. Many sub-criteria, such as geological, social and touristic factors, transportation, cooling water capacity and proximity to consumption markets, are taken into account. Among the evaluated locations, Inceburun–Sinop was selected as the study site by the generalized Choquet fuzzy integral method, owing to its highest performance and meeting most of the investigated criteria, whereas under fuzzy TOPSIS Iğneada–Kırklareli took first place. Mersin–Akkuyu was not selected by either method. (author)
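
    For reference, the crisp (non-fuzzy) TOPSIS ranking step can be written compactly; the fuzzy version and the Choquet-integral aggregation used in the paper add further machinery on top of this, and the criteria matrix, weights and benefit/cost flags below are made-up placeholders rather than the paper's data.

        import numpy as np

        def topsis(scores, weights, benefit):
            """Closeness of each alternative to the ideal solution (classical TOPSIS)."""
            norm = scores / np.linalg.norm(scores, axis=0)       # vector-normalize each criterion
            v = norm * weights                                   # weighted normalized matrix
            ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
            anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
            d_plus = np.linalg.norm(v - ideal, axis=1)
            d_minus = np.linalg.norm(v - anti, axis=1)
            return d_minus / (d_plus + d_minus)                  # higher means closer to the ideal

        # Placeholder data: 3 candidate sites by 4 criteria (last criterion treated as a cost)
        scores = np.array([[7.0, 3.0, 8.0, 2.0],
                           [6.0, 5.0, 6.0, 3.0],
                           [8.0, 2.0, 7.0, 4.0]])
        weights = np.array([0.4, 0.2, 0.3, 0.1])
        benefit = np.array([True, True, True, False])
        print(topsis(scores, weights, benefit))                  # closeness scores; the largest value ranks first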

  8. Implications of structural genomics target selection strategies: Pfam5000, whole genome, and random approaches

    Energy Technology Data Exchange (ETDEWEB)

    Chandonia, John-Marc; Brenner, Steven E.

    2004-07-14

    The structural genomics project is an international effort to determine the three-dimensional shapes of all important biological macromolecules, with a primary focus on proteins. Target proteins should be selected according to a strategy which is medically and biologically relevant, of good value, and tractable. As an option to consider, we present the Pfam5000 strategy, which involves selecting the 5000 most important families from the Pfam database as sources for targets. We compare the Pfam5000 strategy to several other proposed strategies that would require similar numbers of targets. These include complete solution of several small to moderately sized bacterial proteomes, partial coverage of the human proteome, and random selection of approximately 5000 targets from sequenced genomes. We measure the impact that successful implementation of these strategies would have upon structural interpretation of the proteins in Swiss-Prot, TrEMBL, and 131 complete proteomes (including 10 of eukaryotes) from the Proteome Analysis database at EBI. Solving the structures of proteins from the 5000 largest Pfam families would allow accurate fold assignment for approximately 68 percent of all prokaryotic proteins (covering 59 percent of residues) and 61 percent of eukaryotic proteins (40 percent of residues). More fine-grained coverage, which would allow accurate modeling of these proteins, would require an order of magnitude more targets. The Pfam5000 strategy may be modified in several ways, for example to focus on larger families, bacterial sequences, or eukaryotic sequences; as long as secondary consideration is given to large families within Pfam, coverage results vary only slightly. In contrast, focusing structural genomics on a single tractable genome would have only a limited impact on structural knowledge of other proteomes: a significant fraction (about 30-40 percent of the proteins, and 40-60 percent of the residues) of each proteome is classified in small families.

  9. Genome-wide association data classification and SNPs selection using two-stage quality-based Random Forests.

    Science.gov (United States)

    Nguyen, Thanh-Tung; Huang, Joshua; Wu, Qingyao; Nguyen, Thuy; Li, Mark

    2015-01-01

    Selection and identification of single-nucleotide polymorphisms (SNPs) are the most important tasks in genome-wide association data analysis. The problem is difficult because genome-wide association data are very high dimensional and a large portion of the SNPs in the data are irrelevant to the disease. Advanced machine learning methods have been used successfully in genome-wide association studies (GWAS) for the identification of genetic variants that have relatively big effects on some common, complex diseases. Among them, the most successful one is Random Forests (RF). Despite performing well in terms of prediction accuracy on some data sets of moderate size, RF still struggles in GWAS when selecting informative SNPs and building accurate prediction models. In this paper, we propose a new two-stage quality-based sampling method in random forests, named ts-RF, for SNP subspace selection in GWAS. The method first applies a p-value assessment to find a cut-off point that separates the SNPs into an informative and an irrelevant group. The informative group is further divided into two sub-groups: highly informative and weakly informative SNPs. When sampling the SNP subspace for building trees in the forest, only SNPs from these two sub-groups are taken into account, so the feature subspaces used to split a node always contain highly informative SNPs. This approach enables one to generate more accurate trees with a lower prediction error, while helping to avoid overfitting. It allows one to detect interactions of multiple SNPs with the diseases, and to reduce the dimensionality and the amount of genome-wide association data needed for learning the RF model. Extensive experiments on two genome-wide SNP data sets (Parkinson case-control data comprising 408,803 SNPs and Alzheimer case-control data comprising 380,157 SNPs) and 10 gene data sets have demonstrated that the proposed model significantly reduced prediction errors and outperformed

  10. The predictive validity of selection for entry into postgraduate training in general practice: evidence from three longitudinal studies.

    Science.gov (United States)

    Patterson, Fiona; Lievens, Filip; Kerrin, Máire; Munro, Neil; Irish, Bill

    2013-11-01

    The selection methodology for UK general practice is designed to accommodate several thousand applicants per year and targets six core attributes identified in a multi-method job-analysis study. The aim was to evaluate the predictive validity of the selection methods for entry into postgraduate training, comprising a clinical problem-solving test, a situational judgement test, and a selection centre. A three-part longitudinal predictive validity study of selection into training for UK general practice was conducted. In sample 1, participants were junior doctors applying for training in general practice (n = 6824). In sample 2, participants were GP registrars 1 year into training (n = 196). In sample 3, participants were GP registrars sitting the licensing examination after 3 years, at the end of training (n = 2292). The outcome measures included: assessor ratings of performance in a selection centre comprising job simulation exercises (sample 1); supervisor ratings of trainee job performance 1 year into training (sample 2); and licensing examination results, including an applied knowledge examination and a 12-station clinical skills objective structured clinical examination (OSCE; sample 3). Performance ratings at selection predicted subsequent supervisor ratings of job performance 1 year later. Selection results also significantly predicted performance on both the clinical skills OSCE and the applied knowledge examination for licensing at the end of training. In combination, these longitudinal findings provide good evidence of the predictive validity of the selection methods, and are the first reported for entry into postgraduate training. Results show that the best predictor of work performance and training outcomes is a combination of a clinical problem-solving test, a situational judgement test, and a selection centre. Implications for selection methods for all postgraduate specialties are considered.

  11. Comparative analysis of instance selection algorithms for instance-based classifiers in the context of medical decision support

    International Nuclear Information System (INIS)

    Mazurowski, Maciej A; Tourassi, Georgia D; Malof, Jordan M

    2011-01-01

    When constructing a pattern classifier, it is important to make the best use of the instances (a.k.a. cases, examples, patterns or prototypes) available for its development. In this paper we present an extensive comparative analysis of algorithms that, given a pool of previously acquired instances, attempt to select those that will be the most effective for constructing an instance-based classifier, in terms of classification performance, time efficiency and storage requirements. We evaluate seven previously proposed instance selection algorithms and compare their performance to simple random selection of instances. We perform the evaluation using a k-nearest neighbor classifier and three classification problems: one with simulated Gaussian data and two based on clinical databases for breast cancer detection and diagnosis, respectively. Finally, we evaluate the impact of the number of instances available for selection on the performance of the selection algorithms and conduct an initial analysis of the selected instances. The experiments show that for all investigated classification problems, it was possible to reduce the size of the original development dataset to less than 3% of its initial size while maintaining or improving the classification performance. Random mutation hill climbing emerges as the superior selection algorithm. Furthermore, we show that some previously proposed algorithms perform worse than random selection. Regarding the impact of the number of instances available for classifier development on the performance of the selection algorithms, we confirm that the selection algorithms are generally more effective as the pool of available instances increases. In conclusion, instance selection is generally beneficial for instance-based classifiers as it can improve their performance, reduce their storage requirements and improve their response time. However, choosing the right selection algorithm is crucial.
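
    Since random mutation hill climbing is singled out as the best-performing selector, a minimal sketch of that algorithm with a 1-NN classifier is given below; the synthetic data, starting subset size and iteration budget are assumptions, not the paper's settings.

```python
# Random mutation hill climbing (RMHC) for instance selection with 1-NN.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.datasets import make_classification

def rmhc_instance_selection(X_pool, y_pool, X_val, y_val, n_iter=2000, seed=None):
    rng = np.random.default_rng(seed)
    mask = rng.random(len(X_pool)) < 0.1          # start from a small random subset

    def score(m):
        # Reject masks that cannot form a valid two-class 1-NN model.
        if m.sum() < 1 or len(np.unique(y_pool[m])) < 2:
            return -np.inf
        knn = KNeighborsClassifier(n_neighbors=1).fit(X_pool[m], y_pool[m])
        return knn.score(X_val, y_val)

    best = score(mask)
    for _ in range(n_iter):
        i = rng.integers(len(X_pool))             # flip one randomly chosen instance
        mask[i] = ~mask[i]
        s = score(mask)
        if s >= best:
            best = s                              # keep the mutation
        else:
            mask[i] = ~mask[i]                    # undo it
    return mask, best

X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_pool, X_val, y_pool, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
selected, acc = rmhc_instance_selection(X_pool, y_pool, X_val, y_val, seed=1)
print(f"kept {selected.sum()} of {len(X_pool)} instances, validation accuracy {acc:.3f}")
```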

  12. Lifestyle factors and experience of respiratory alarm symptoms in the general population

    DEFF Research Database (Denmark)

    Sele, Lisa Maria Falk; Balasubramaniam, Kirubakaran; Elnegaard, Sandra

    2015-01-01

    BACKGROUND: The first step in the diagnosis of lung cancer is for individuals in the general population to recognise respiratory alarm symptoms (RAS). Knowledge is sparse about RAS and factors associated with experiencing RAS in the general population. This study aimed to estimate the prevalence of RAS in the general population, and to analyse possible associations between lifestyle factors and experiencing RAS. METHODS: A web-based survey comprising 100 000 individuals randomly selected from the Danish Civil Registration System. Items regarding experience of RAS (prolonged coughing, shortness...

  13. Web-based consultation between general practitioners and nephrologists: a cluster randomized controlled trial.

    Science.gov (United States)

    van Gelder, Vincent A; Scherpbier-de Haan, Nynke D; van Berkel, Saskia; Akkermans, Reinier P; de Grauw, Inge S; Adang, Eddy M; Assendelft, Pim J; de Grauw, Wim J C; Biermans, Marion C J; Wetzels, Jack F M

    2017-08-01

    Consultation of a nephrologist is important in aligning care for patients with chronic kidney disease (CKD) at the primary-secondary care interface. However, current consultation methods come with practical difficulties that can lead to postponed consultation or patient referral instead. This study aimed to investigate whether a web-based consultation platform, telenephrology, led to a lower referral rate of indicated patients. Furthermore, we assessed the consultation rate, quality of care, costs and general practitioners' (GPs') experiences with telenephrology. In a cluster randomized controlled trial, 47 general practices in the Netherlands were randomized to access to telenephrology or to enhanced usual care. A total of 3004 CKD patients aged 18 years or older who were under primary care were included (intervention group n = 1277, control group n = 1727) and 2693 completed the trial. All practices participated in a CKD management course and were given an overview of their CKD patients. The referral rates amounted to 2.3% (n = 29) in the intervention group and 3.0% (n = 52) in the control group, which was a non-significant difference (OR 0.61; 95% CI 0.31 to 1.23). The intervention group's consultation rate was 6.3% (n = 81) against 5.0% (n = 87) (OR 2.00; 95% CI 0.75-5.33). We found no difference in quality of care or costs. The majority of GPs had a positive opinion about telenephrology. The data in our study do not allow for conclusions on the effect of telenephrology on the rate of patient referrals and provider-to-provider consultations, compared to conventional methods. It was positively evaluated by GPs and was non-inferior in terms of quality of care and costs.

  14. Participant-selected music and physical activity in older adults following cardiac rehabilitation: a randomized controlled trial.

    Science.gov (United States)

    Clark, Imogen N; Baker, Felicity A; Peiris, Casey L; Shoebridge, Georgie; Taylor, Nicholas F

    2017-03-01

    To evaluate effects of participant-selected music on older adults' achievement of activity levels recommended in the physical activity guidelines following cardiac rehabilitation. A parallel group randomized controlled trial with measurements at Weeks 0, 6 and 26. A multisite outpatient rehabilitation programme of a publicly funded metropolitan health service. Adults aged 60 years and older who had completed a cardiac rehabilitation programme. Experimental participants selected music to support walking with guidance from a music therapist. Control participants received usual care only. The primary outcome was the proportion of participants achieving activity levels recommended in physical activity guidelines. Secondary outcomes compared amounts of physical activity, exercise capacity, cardiac risk factors, and exercise self-efficacy. A total of 56 participants, mean age 68.2 years (SD = 6.5), were randomized to the experimental ( n = 28) and control groups ( n = 28). There were no differences between groups in proportions of participants achieving activity recommended in physical activity guidelines at Week 6 or 26. Secondary outcomes demonstrated between-group differences in male waist circumference at both measurements (Week 6 difference -2.0 cm, 95% CI -4.0 to 0; Week 26 difference -2.8 cm, 95% CI -5.4 to -0.1), and observed effect sizes favoured the experimental group for amounts of physical activity (d = 0.30), exercise capacity (d = 0.48), and blood pressure (d = -0.32). Participant-selected music did not increase the proportion of participants achieving recommended amounts of physical activity, but may have contributed to exercise-related benefits.

  15. The Effect of Different Modes of English Captioning on EFL Learners' General Listening Comprehension: Full Text vs. Keyword Captions

    Science.gov (United States)

    Behroozizad, Sorayya; Majidi, Sudabeh

    2015-01-01

    This study investigated the effect of different modes of English captioning on EFL learners' general listening comprehension. To this end, forty-five intermediate-level learners were selected based on their scores on a standardized English proficiency test (PET) to carry out the study. Then, the selected participants were randomly assigned into…

  16. r2VIM: A new variable selection method for random forests in genome-wide association studies.

    Science.gov (United States)

    Szymczak, Silke; Holzinger, Emily; Dasgupta, Abhijit; Malley, James D; Molloy, Anne M; Mills, James L; Brody, Lawrence C; Stambolian, Dwight; Bailey-Wilson, Joan E

    2016-01-01

    Machine learning methods and in particular random forests (RFs) are a promising alternative to standard single SNP analyses in genome-wide association studies (GWAS). RFs provide variable importance measures (VIMs) to rank SNPs according to their predictive power. However, in contrast to the established genome-wide significance threshold, no clear criteria exist to determine how many SNPs should be selected for downstream analyses. We propose a new variable selection approach, recurrent relative variable importance measure (r2VIM). Importance values are calculated relative to an observed minimal importance score for several runs of RF and only SNPs with large relative VIMs in all of the runs are selected as important. Evaluations on simulated GWAS data show that the new method controls the number of false-positives under the null hypothesis. Under a simple alternative hypothesis with several independent main effects it is only slightly less powerful than logistic regression. In an experimental GWAS data set, the same strong signal is identified while the approach selects none of the SNPs in an underpowered GWAS. The novel variable selection method r2VIM is a promising extension to standard RF for objectively selecting relevant SNPs in GWAS while controlling the number of false-positive results.
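
    The core recipe of r2VIM can be sketched as below: several forests are grown, each variable's importance is expressed relative to the smallest (noise-level) importance observed in that run, and only variables above a threshold in every run are kept. Scikit-learn's Gini importances are used here as a stand-in for the permutation importances of the original method, and the data and threshold are illustrative.

```python
# Recurrent relative variable importance (r2VIM-style) selection sketch.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=500, n_features=200, n_informative=10,
                           random_state=1)
n_runs, threshold = 5, 3.0
relative = np.empty((n_runs, X.shape[1]))

for run in range(n_runs):
    rf = RandomForestClassifier(n_estimators=500, random_state=run, n_jobs=-1)
    rf.fit(X, y)
    imp = rf.feature_importances_
    noise_level = np.abs(imp[imp > 0].min())   # smallest observed (noise-level) importance
    relative[run] = imp / noise_level          # importance relative to that noise level

# Keep only variables whose relative importance is large in every run.
selected = np.where((relative >= threshold).all(axis=0))[0]
print("variables selected in all runs:", selected)
```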

  17. Relay model for recruiting alcohol dependent patients in general hospitals--a single-blind pragmatic randomized trial

    DEFF Research Database (Denmark)

    Schwarz, Anne-Sophie; Bilberg, Randi; Bjerregaard, Lene Berit Skov

    2016-01-01

    - The Relay Model. METHOD/DESIGN: The study is a single-blind pragmatic randomized controlled trial including patients admitted to the hospital. The study group (n = 500) will receive an intervention, and the control group (n = 500) will be referred to treatment by usual procedures. All patients complete......://register.clinicaltrials.gov/by identifier: RESCueH_Relay NCT02188043 Project Relay Model for Recruiting Alcohol Dependent Patients in General Hospitals (TRN Registration: 07/09/2014)....

  18. A Comparison of Satisfaction; Spinal versus General Anesthesia for Cesarean Section

    International Nuclear Information System (INIS)

    Meo, S. A.; Siddique, S.; Meo, R. A.

    2013-01-01

    Objective: To compare patient satisfaction with spinal versus general anesthesia after cesarean section at CMH Lahore. Study Design: Randomized controlled trial. Study Setting: The study was conducted at the Department of Obstetrics and Gynaecology, Combined Military Hospital, Lahore, over 6 months from July to December 2011. Patients and Methods: A total of 70 patients were included in the study and randomly divided into two groups of 35 each using a random numbers table. Eligible patients were women aged 20-40 years admitted for elective cesarean section, presenting for follow-up at day 5-7, who had never had any type of anesthesia in the past. Patients with complaints of migraine or low backache, a positive history, or any other medical disorder were excluded from the study. Results: Of the 70 patients included, 35 procedures were carried out under spinal anesthesia and 35 under general anesthesia. No significant difference was found in the satisfaction level of the two groups (p=0.220). There was a significant difference in future choice of anesthesia between the two groups (p<0.001). Conclusion: Spinal anesthesia provides patient satisfaction after cesarean section equal to that of general anesthesia. (author)

  19. Bayesian dose selection design for a binary outcome using restricted response adaptive randomization.

    Science.gov (United States)

    Meinzer, Caitlyn; Martin, Renee; Suarez, Jose I

    2017-09-08

    In phase II trials, the most efficacious dose is usually not known. Moreover, given limited resources, it is difficult to robustly identify a dose while also testing for a signal of efficacy that would support a phase III trial. Recent designs have sought to be more efficient by exploring multiple doses through the use of adaptive strategies. However, the added flexibility may potentially increase the risk of making incorrect assumptions and reduce the total amount of information available across the dose range as a function of imbalanced sample size. To balance these challenges, a novel placebo-controlled design is presented in which a restricted Bayesian response adaptive randomization (RAR) is used to allocate a majority of subjects to the optimal dose of active drug, defined as the dose with the lowest probability of poor outcome. However, the allocation between subjects who receive active drug or placebo is held constant to retain the maximum possible power for a hypothesis test of overall efficacy comparing the optimal dose to placebo. The design properties and optimization of the design are presented in the context of a phase II trial for subarachnoid hemorrhage. For a fixed total sample size, a trade-off exists between the ability to select the optimal dose and the probability of rejecting the null hypothesis. This relationship is modified by the allocation ratio between active and control subjects, the choice of RAR algorithm, and the number of subjects allocated to an initial fixed allocation period. While a responsive RAR algorithm improves the ability to select the correct dose, there is an increased risk of assigning more subjects to a worse arm as a function of ephemeral trends in the data. A subarachnoid treatment trial is used to illustrate how this design can be customized for specific objectives and available data. Bayesian adaptive designs are a flexible approach to addressing multiple questions surrounding the optimal dose for treatment efficacy
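
    A toy simulation of the restricted allocation idea is sketched below: the placebo share stays fixed while, after a burn-in, active doses are assigned in proportion to their posterior probability (under Beta-Binomial updating) of having the lowest poor-outcome rate. The outcome rates, arm labels, sample size and burn-in length are invented for illustration and do not reproduce the trial's design.

```python
# Restricted Bayesian response-adaptive randomization, toy version.
import numpy as np

rng = np.random.default_rng(7)
p_poor = {"placebo": 0.50, "low": 0.45, "mid": 0.35, "high": 0.40}   # assumed true rates
doses = ["low", "mid", "high"]
placebo_share, burn_in, n_total = 0.25, 40, 200

events = {arm: [1, 1] for arm in p_poor}          # Beta(1,1) priors: [poor, good]

def prob_best(n_draws=4000):
    """Posterior probability that each active dose has the lowest poor-outcome rate."""
    draws = np.column_stack([rng.beta(events[d][0], events[d][1], n_draws) for d in doses])
    best = draws.argmin(axis=1)
    return np.bincount(best, minlength=len(doses)) / n_draws

for i in range(n_total):
    if rng.random() < placebo_share:
        arm = "placebo"                           # control allocation held constant
    elif i < burn_in:
        arm = doses[i % len(doses)]               # fixed equal allocation at the start
    else:
        arm = rng.choice(doses, p=prob_best())    # restricted RAR among active doses only
    poor = rng.random() < p_poor[arm]
    events[arm][0 if poor else 1] += 1            # update the Beta posterior for that arm

print({arm: sum(events[arm]) - 2 for arm in events})   # subjects randomized per arm
```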

  20. Estimation and variable selection for generalized additive partial linear models

    KAUST Repository

    Wang, Li

    2011-08-01

    We study generalized additive partial linear models, proposing the use of polynomial spline smoothing for estimation of nonparametric functions, and deriving quasi-likelihood based estimators for the linear parameters. We establish asymptotic normality for the estimators of the parametric components. The procedure avoids solving large systems of equations as in kernel-based procedures and thus results in gains in computational simplicity. We further develop a class of variable selection procedures for the linear parameters by employing a nonconcave penalized quasi-likelihood, which is shown to have an asymptotic oracle property. Monte Carlo simulations and an empirical example are presented for illustration. © Institute of Mathematical Statistics, 2011.
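
    In symbols (notation assumed, not quoted from the paper), the model class and the penalized criterion behind the variable selection step look roughly as follows.

```latex
% Generalized additive partial linear model: link the conditional mean to
% additive nonparametric terms in X and a linear term in Z,
\[
  g\bigl(\mathbb{E}[\,Y \mid X, Z\,]\bigr)
    \;=\; \sum_{j=1}^{d} f_{j}(X_{j}) \;+\; Z^{\mathsf{T}}\beta ,
\]
% approximate each f_j by a polynomial spline, and select components of beta by
% maximizing a penalized quasi-likelihood,
\[
  \ell_{QL}\bigl(f_{1},\dots,f_{d},\beta\bigr)
    \;-\; n \sum_{k} p_{\lambda}\bigl(\lvert \beta_{k} \rvert\bigr),
\]
% where p_lambda is a nonconcave penalty (SCAD-type) chosen so that the resulting
% estimator enjoys the asymptotic oracle property.
```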

  1. Ordered random variables theory and applications

    CERN Document Server

    Shahbaz, Muhammad Qaiser; Hanif Shahbaz, Saman; Al-Zahrani, Bander M

    2016-01-01

    Ordered random variables have attracted the attention of several authors. Their basic building block is order statistics, which have several applications in extreme value theory and ordered estimation. The general model for ordered random variables, known as generalized order statistics, was introduced relatively recently by Kamps (1995).

  2. Dr Fabiola Gianotti has been selected by CERN Council to become next CERN Director General

    CERN Multimedia

    Brice, Maximilien

    2014-01-01

    With the next Director-General announced, watch the press conference starting in a few minutes via http://cern.ch/webcast/ and send your questions via Twitter to @CERNpressoffice. CERN Council selects Italian physicist, Dr Fabiola Gianotti, as CERN’s next Director-General. Dr Gianotti’s mandate will begin on 1 January 2016 and run for a period of five years; read more: http://cern.ch/go/tN09F

  3. Two-year Randomized Clinical Trial of Self-etching Adhesives and Selective Enamel Etching.

    Science.gov (United States)

    Pena, C E; Rodrigues, J A; Ely, C; Giannini, M; Reis, A F

    2016-01-01

    The aim of this randomized, controlled prospective clinical trial was to evaluate the clinical effectiveness of restoring noncarious cervical lesions with two self-etching adhesive systems applied with or without selective enamel etching. A one-step self-etching adhesive (Xeno V(+)) and a two-step self-etching system (Clearfil SE Bond) were used. The effectiveness of phosphoric acid selective etching of enamel margins was also evaluated. Fifty-six cavities were restored with each adhesive system and divided into two subgroups (n=28; etch and non-etch). All 112 cavities were restored with the nanohybrid composite Esthet.X HD. The clinical effectiveness of restorations was recorded in terms of retention, marginal integrity, marginal staining, caries recurrence, and postoperative sensitivity after 3, 6, 12, 18, and 24 months (modified United States Public Health Service). The Friedman test detected significant differences only after 18 months for marginal staining in the groups Clearfil SE non-etch (p=0.009) and Xeno V(+) etch (p=0.004). One restoration was lost during the trial (Xeno V(+) etch; p>0.05). Although an increase in marginal staining was recorded for groups Clearfil SE non-etch and Xeno V(+) etch, the clinical effectiveness of restorations was considered acceptable for the single-step and two-step self-etching systems with or without selective enamel etching in this 24-month clinical trial.

  4. Multiple mini interview (MMI) for general practice training selection in Australia: interviewers' motivation.

    Science.gov (United States)

    Burgess, Annette; Roberts, Chris; Sureshkumar, Premala; Mossman, Karyn

    2018-01-25

    Multiple Mini Interviews (MMIs) are being used by a growing number of postgraduate training programs and medical schools as their interview process for selection entry. The Australian General Practice Training (AGPT) program used a National Assessment Centre (NAC) approach to selection into General Practice (GP) Training, which includes MMIs. Interviewing is a resource-intensive process, and implementation of the MMI requires a large number of interviewers, with a number of candidates being interviewed simultaneously. In 2015, 308 interviewers participated in the MMI process - a decrease from 340 interviewers in 2014, and 310 in 2013. At the same time, the number of applicants has steadily increased, with 1930 applications received in 2013; 2254 in 2014; and 2360 in 2015. This has raised concerns regarding the increasing recruitment needs, and the need to retain interviewers for subsequent years of MMIs. In order to investigate interviewers' reasons for participating in MMIs, we utilised self-determination theory (SDT) to consider interviewers' motivation to take part in MMIs at national selection centres. In 2015, 308 interviewers were recruited from 17 Regional Training Providers (RTPs) to participate in the MMI process at one of 15 NACs. For this study, a convenience sample of NAC sites was used. Forty interviewers were interviewed (n = 40; 40/308 = 13%) from five NACs. Framework analysis was used to code and categorise data into themes. Interviewers' motivation to take part was largely related to their sense of duty, their desire to contribute their expertise to the process, and their desire to have input into the selection of GP registrars; a sense of duty to their profession; and an opportunity to meet with colleagues and future trainees. Interviewers also highlighted factors hindering motivation, which sometimes included the large number of candidates seen in one day. Interviewers' motivation for contributing to the MMIs was largely related

  5. A comparison of random forest and its Gini importance with standard chemometric methods for the feature selection and classification of spectral data

    Directory of Open Access Journals (Sweden)

    Himmelreich Uwe

    2009-07-01

    Background: Regularized regression methods such as principal component or partial least squares regression perform well in learning tasks on high-dimensional spectral data, but cannot explicitly eliminate irrelevant features. The random forest classifier with its associated Gini feature importance, on the other hand, allows for an explicit feature elimination, but may not be optimally adapted to spectral data due to the topology of its constituent classification trees, which are based on orthogonal splits in feature space. Results: We propose to combine the best of both approaches, and evaluated the joint use of a feature selection based on a recursive feature elimination using the Gini importance of random forests together with regularized classification methods on spectral data sets from medical diagnostics, chemotaxonomy, biomedical analytics, food science, and synthetically modified spectral data. Here, a feature selection using the Gini feature importance with a regularized classification by discriminant partial least squares regression performed as well as or better than a filtering according to different univariate statistical tests, or using regression coefficients in a backward feature elimination. It outperformed the direct application of the random forest classifier, and the direct application of the regularized classifiers on the full set of features. Conclusion: The Gini importance of the random forest provided superior means for measuring feature relevance on spectral data, but, on an optimal subset of features, the regularized classifiers might be preferable over the random forest classifier, in spite of their limitation to model linear dependencies only. A feature selection based on Gini importance, however, may precede a regularized linear classification to identify this optimal subset of features, and to earn a double benefit of both dimensionality reduction and the elimination of noise from the classification task.
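
    A compressed sketch of the reported combination is given below: features are eliminated recursively by random-forest Gini importance and the survivors feed a discriminant PLS model. Synthetic data replace the spectra, and the elimination fraction, stopping size and number of PLS components are assumptions.

```python
# RF Gini-importance-driven recursive feature elimination followed by PLS-DA.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=300, n_features=400, n_informative=15,
                           random_state=0)
features = np.arange(X.shape[1])

# Recursive elimination: drop the 20% least important features per round.
while len(features) > 20:
    rf = RandomForestClassifier(n_estimators=300, random_state=0, n_jobs=-1)
    rf.fit(X[:, features], y)
    order = np.argsort(rf.feature_importances_)          # ascending Gini importance
    features = features[order[len(order) // 5:]]          # keep the top 80%

# Discriminant PLS on the retained subset (regress the 0/1 label, threshold at 0.5).
X_tr, X_te, y_tr, y_te = train_test_split(X[:, features], y, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
y_pred = (pls.predict(X_te).ravel() > 0.5).astype(int)
print(f"{len(features)} features kept, PLS-DA test accuracy "
      f"{(y_pred == y_te).mean():.3f}")
```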

  6. Manual therapy, physical therapy, or continued care by a general practitioner for patients with neck pain: a randomized, controlled trial.

    NARCIS (Netherlands)

    Hoving, J.L.; Koes, B.W.; Vet, H.C.W. de; Windt, D.A.W.M. van der; Assendelft, W.J.J.; Mameren, H. van; Devillé, W.L.J.M.; Pool, J.J.M.; Scholten, R.J.P.M.; Bouter, L.M.

    2002-01-01

    BACKGROUND: Neck pain is a common problem, but the effectiveness of frequently applied conservative therapies has never been directly compared. OBJECTIVE: To determine the effectiveness of manual therapy, physical therapy, and continued care by a general practitioner. DESIGN: Randomized, controlled

  7. Manual therapy, physical therapy, or continued care by a general practitioner for patients with neck pain. A randomized, controlled trial

    NARCIS (Netherlands)

    Hoving, Jan Lucas; Koes, Bart W.; de Vet, Henrica C. W.; van der Windt, Danielle A. W. M.; Assendelft, Willem J. J.; van Mameren, Henk; Devillé, Walter L. J. M.; Pool, Jan J. M.; Scholten, Rob J. P. M.; Bouter, Lex M.

    2002-01-01

    BACKGROUND: Neck pain is a common problem, but the effectiveness of frequently applied conservative therapies has never been directly compared. OBJECTIVE: To determine the effectiveness of manual therapy, physical therapy, and continued care by a general practitioner. DESIGN: Randomized, controlled

  8. Bedside rationing by general practitioners: a postal survey in the Danish public healthcare system

    DEFF Research Database (Denmark)

    Lauridsen, Sigurd; Norup, Michael; Rossel, Peter

    2008-01-01

    A postal survey of 600 randomly selected Danish GPs, of whom 330 responded to the questionnaire. The Statistical Package for the Social Sciences (SPSS, version 14.0) was used to produce general descriptive statistics. Significance was calculated with the McNemar and the chi-square test. The main outcome measures...

  9. Random effect selection in generalised linear models

    DEFF Research Database (Denmark)

    Denwood, Matt; Houe, Hans; Forkman, Björn

    We analysed abattoir recordings of meat inspection codes with possible relevance to on-farm animal welfare in cattle. Random effects logistic regression models were used to describe individual-level data obtained from 461,406 cattle slaughtered in Denmark. Our results demonstrate that the largest

  10. Fast randomized point location without preprocessing in two- and three-dimensional Delaunay triangulations

    Energy Technology Data Exchange (ETDEWEB)

    Muecke, E.P.; Saias, I.; Zhu, B.

    1996-05-01

    This paper studies the point location problem in Delaunay triangulations without preprocessing and additional storage. The proposed procedure finds the query point simply by walking through the triangulation, after selecting a good starting point by random sampling. The analysis generalizes and extends a recent result for d = 2 dimensions by proving that this procedure takes expected time close to O(n^(1/(d+1))) for point location in Delaunay triangulations of n random points in d = 3 dimensions. Empirical results in both two and three dimensions show that this procedure is efficient in practice.
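
    The procedure itself is easy to sketch on top of an existing triangulation: sample a handful of vertices, jump to the one nearest the query, and walk through neighbouring simplices guided by barycentric coordinates. The scipy-based code below is only an illustration of that "jump and walk" idea, not the paper's implementation; scipy's own find_simplex is used to sanity-check the answer.

```python
# Jump-and-walk point location in a 3D Delaunay triangulation (illustrative).
import numpy as np
from scipy.spatial import Delaunay

def jump_and_walk(tri, q, seed=None):
    rng = np.random.default_rng(seed)
    pts, d = tri.points, tri.points.shape[1]
    # "Jump": sample ~n^(d/(d+1)) vertices and start at the one closest to q.
    m = max(1, int(len(pts) ** (d / (d + 1))))
    sample = rng.choice(len(pts), size=m, replace=False)
    start_vertex = sample[np.argmin(np.linalg.norm(pts[sample] - q, axis=1))]
    simplex = int(tri.vertex_to_simplex[start_vertex])
    # "Walk": move to the neighbour opposite the most violated barycentric coordinate.
    for _ in range(len(tri.simplices)):
        T, r = tri.transform[simplex, :d], tri.transform[simplex, d]
        b = T @ (q - r)
        bary = np.append(b, 1.0 - b.sum())
        if (bary >= -1e-12).all():
            return simplex                       # q lies in this simplex
        simplex = int(tri.neighbors[simplex, np.argmin(bary)])
        if simplex == -1:
            return -1                            # q is outside the triangulation
    return -1

rng = np.random.default_rng(0)
points = rng.random((2000, 3))                   # the d = 3 case from the abstract
tri = Delaunay(points)
q = np.array([0.5, 0.5, 0.5])
# Both calls should report a simplex containing q.
print(jump_and_walk(tri, q, seed=1), int(tri.find_simplex(q[None])[0]))
```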

  11. Generalized linear longitudinal mixed models with linear covariance structure and multiplicative random effects

    DEFF Research Database (Denmark)

    Holst, René; Jørgensen, Bent

    2015-01-01

    The paper proposes a versatile class of multiplicative generalized linear longitudinal mixed models (GLLMM) with additive dispersion components, based on explicit modelling of the covariance structure. The class incorporates a longitudinal structure into the random effects models and retains a marginal as well as a conditional interpretation. The estimation procedure is based on a computationally efficient quasi-score method for the regression parameters combined with a REML-like bias-corrected Pearson estimating function for the dispersion and correlation parameters. This avoids the multidimensional integral of the conventional GLMM likelihood and allows an extension of the robust empirical sandwich estimator for use with both association and regression parameters. The method is applied to a set of otolith data, used for age determination of fish.

  12. Randomized Oversampling for Generalized Multiscale Finite Element Methods

    KAUST Repository

    Calo, Victor M.; Efendiev, Yalchin R.; Galvis, Juan; Li, Guanglian

    2016-01-01

    boundary conditions defined in a domain larger than the target region. Furthermore, we perform an eigenvalue decomposition in this small space. We study the application of randomized sampling for GMsFEM in conjunction with adaptivity, where local multiscale

  13. Risk-Controlled Multiobjective Portfolio Selection Problem Using a Principle of Compromise

    Directory of Open Access Journals (Sweden)

    Takashi Hasuike

    2014-01-01

    This paper proposes a multiobjective portfolio selection problem with the most probable random distribution derived from current market data, together with other random distributions for boom and recession, under risk-control parameters determined by an investor. The current market data and information include not only historical data but also interpretations of economists' oral and linguistic information, and hence boom and recession scenarios often derive from these nonnumeric data. Therefore, investors need to consider several situations, from the most probable condition to boom and recession, and to avoid the risk of falling below the target return in each situation. Furthermore, it is generally difficult to set the random distributions of these cases exactly. Therefore, a robust-based approach for portfolio selection problems using only the mean values and variances of securities is proposed as a multiobjective programming problem. In addition, an exact algorithm is developed to obtain an explicit optimal portfolio using a principle of compromise.
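
    The flavour of the approach can be sketched numerically: define mean returns per scenario and a risk cap, compute each scenario's ideal return, and choose weights that minimise a compromise (distance-to-ideal) criterion. The scenario numbers, the shared covariance and the squared-regret compromise metric below are all assumptions made for illustration, not the paper's exact formulation or exact algorithm.

```python
# Toy compromise-programming portfolio: three assets, three assumed scenarios,
# a variance cap per portfolio, weights minimising squared relative regrets.
import numpy as np
from scipy.optimize import minimize

mu = {"most probable": np.array([0.06, 0.04, 0.08]),
      "boom":          np.array([0.12, 0.05, 0.15]),
      "recession":     np.array([0.01, 0.03, -0.02])}
cov = np.array([[0.040, 0.006, 0.010],
                [0.006, 0.010, 0.004],
                [0.010, 0.004, 0.060]])            # one covariance kept for simplicity
risk_cap, n = 0.025, 3                             # investor-chosen variance limit
bounds = [(0.0, 1.0)] * n
cons = [{"type": "eq", "fun": lambda w: w.sum() - 1.0},
        {"type": "ineq", "fun": lambda w: risk_cap - w @ cov @ w}]
w0 = np.full(n, 1.0 / n)

# Ideal point: the best attainable expected return in each scenario separately.
ideal = {s: -minimize(lambda w, m=m: -(m @ w), w0, bounds=bounds,
                      constraints=cons).fun for s, m in mu.items()}

# Principle of compromise (L2 form): minimise the sum of squared relative regrets.
def compromise(w):
    return sum(((ideal[s] - mu[s] @ w) / abs(ideal[s])) ** 2 for s in mu)

res = minimize(compromise, w0, bounds=bounds, constraints=cons)
print("weights:", np.round(res.x, 3),
      "returns per scenario:", {s: round(float(mu[s] @ res.x), 4) for s in mu})
```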

  14. Efficacy of training optimism on general health

    Directory of Open Access Journals (Sweden)

    Mojgan Behrad

    2012-09-01

    Background: The purpose of this study was to investigate the relation of optimism to mental health and the effectiveness of optimism training on mental health and its components among Yazd University students. Materials and Methods: Fifty new students of the 2008-2009 academic year were randomly selected. The General Health Questionnaire (GHQ-28) and an optimism scale were completed by them. Thirty of these students, who had the highest levels of psychological problems based on the general health questionnaire, were divided into a case group and a control group through random assignment. The case group was trained for one month, in two 90-minute sessions per week. Pre-tests and follow-up tests were performed in both groups. Results: Pearson correlation coefficients showed that optimism had a negative and significant relationship with mental health, anxiety, social function, and depression scores (p < 0.005). Multivariate analysis of covariance showed that optimism training had a significant impact on mental health and its components in the case group compared with the control group (p < 0.0001). Conclusion: In general, the findings of this research support a relationship between optimism and mental health and the effectiveness of optimism training on mental health. This method can be used to treat and prevent mental health problems.

  15. Selective mutism.

    Science.gov (United States)

    Hua, Alexandra; Major, Nili

    2016-02-01

    Selective mutism is a disorder in which an individual fails to speak in certain social situations though speaks normally in other settings. Most commonly, this disorder initially manifests when children fail to speak in school. Selective mutism results in significant social and academic impairment in those affected by it. This review will summarize the current understanding of selective mutism with regard to diagnosis, epidemiology, cause, prognosis, and treatment. Studies over the past 20 years have consistently demonstrated a strong relationship between selective mutism and anxiety, most notably social phobia. These findings have led to the recent reclassification of selective mutism as an anxiety disorder in the Diagnostic and Statistical Manual of Mental Disorders, 5th Edition. In addition to anxiety, several other factors have been implicated in the development of selective mutism, including communication delays and immigration/bilingualism, adding to the complexity of the disorder. In the past few years, several randomized studies have supported the efficacy of psychosocial interventions based on a graduated exposure to situations requiring verbal communication. Less data are available regarding the use of pharmacologic treatment, though there are some studies that suggest a potential benefit. Selective mutism is a disorder that typically emerges in early childhood and is currently conceptualized as an anxiety disorder. The development of selective mutism appears to result from the interplay of a variety of genetic, temperamental, environmental, and developmental factors. Although little has been published about selective mutism in the general pediatric literature, pediatric clinicians are in a position to play an important role in the early diagnosis and treatment of this debilitating condition.

  16. Random forest variable selection in spatial malaria transmission modelling in Mpumalanga Province, South Africa

    Directory of Open Access Journals (Sweden)

    Thandi Kapwata

    2016-11-01

    Malaria is an environmentally driven disease. In order to quantify the spatial variability of malaria transmission, it is imperative to understand the interactions between environmental variables and malaria epidemiology at a micro-geographic level using a novel statistical approach. The random forest (RF) statistical learning method, a relatively new variable-importance ranking method, measures the importance of potentially influential parameters through the percent increase of the mean squared error. As this value increases, so does the relative importance of the associated variable. The principal aim of this study was to create predictive malaria maps generated using the selected variables based on the RF algorithm in the Ehlanzeni District of Mpumalanga Province, South Africa. Of the seven environmental variables used [temperature, lag temperature, rainfall, lag rainfall, humidity, altitude, and the normalized difference vegetation index (NDVI)], altitude was identified as the most influential predictor variable due to its high selection frequency. It was selected as the top predictor for 4 out of 12 months of the year, followed by NDVI, temperature and lag rainfall, which were each selected twice. The combination of climatic variables that produced the highest prediction accuracy was altitude, NDVI, and temperature. This suggests that these three variables have high predictive capabilities in relation to malaria transmission. Furthermore, it is anticipated that the predictive maps generated from predictions made by the RF algorithm could be used to monitor the progression of malaria and assist in intervention and prevention efforts with respect to malaria.

  17. Effect of General Anesthesia and Conscious Sedation During Endovascular Therapy on Infarct Growth and Clinical Outcomes in Acute Ischemic Stroke: A Randomized Clinical Trial

    DEFF Research Database (Denmark)

    Simonsen, Claus Ziegler; Yoo, Albert J; Sørensen, Leif Hougaard

    2018-01-01

    Importance: Endovascular therapy (EVT) is the standard of care for select patients who had a stroke caused by a large vessel occlusion in the anterior circulation, but there is uncertainty regarding the optimal anesthetic approach during EVT. Observational studies suggest that general anesthesia... The trial was a single-center prospective, randomized, open-label, blinded end-point evaluation that enrolled patients from March 12, 2015, to February 2, 2017. Although the trial screened 1501 patients, it included 128 consecutive patients with acute ischemic stroke caused by large vessel occlusions in the anterior... The National Institutes of Health Stroke Scale score was 18 (interquartile range [IQR], 14-21). Four patients (6.3%) in the CS group were converted to the GA group. Successful reperfusion was significantly higher in the GA arm than in the CS arm (76.9% vs 60.3%; P = .04). The difference in the volume of infarct growth...

  18. Selecting Optimal Parameters of Random Linear Network Coding for Wireless Sensor Networks

    DEFF Research Database (Denmark)

    Heide, J; Zhang, Qi; Fitzek, F H P

    2013-01-01

    This work studies how to select optimal code parameters of Random Linear Network Coding (RLNC) in Wireless Sensor Networks (WSNs). With Rateless Deluge [1] the authors proposed to apply Network Coding (NC) for Over-the-Air Programming (OAP) in WSNs, and demonstrated that with NC a significant...... reduction in the number of transmitted packets can be achieved. However, NC introduces additional computations and potentially a non-negligible transmission overhead, both of which depend on the chosen coding parameters. Therefore it is necessary to consider the trade-off that these coding parameters...... present in order to obtain the lowest energy consumption per transmitted bit. This problem is analyzed and suitable coding parameters are determined for the popular Tmote Sky platform. Compared to the use of traditional RLNC, these parameters enable a reduction in the energy spent per bit which grows...
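
    One ingredient of that trade-off, the decoding overhead caused by randomly chosen coefficients, follows from a standard counting argument and can be tabulated directly; the generation sizes and field sizes below are just examples, and the per-packet energy and computation model of the paper is not reproduced here.

```python
# Expected number of random coded packets needed to decode one RLNC generation
# of size g over GF(q), assuming uniformly random coding coefficients.
def expected_packets(g, q):
    # After i linearly independent packets have been received, a fresh random
    # packet is innovative with probability 1 - q**(i - g); sum the expected
    # geometric waiting times of the g stages.
    return sum(1.0 / (1.0 - q ** (i - g)) for i in range(g))

for q in (2, 16, 256):
    for g in (16, 32, 64):
        overhead = expected_packets(g, q) - g
        print(f"q={q:3d}  g={g:3d}  expected overhead {overhead:.3f} packets")
```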

  19. Surveillance of acute respiratory infections in general practices - The Netherlands, winters 1998/1999 and 1999/2000

    NARCIS (Netherlands)

    Brandhof WE van den; Bartelds AIM; Wilbrink B; Verweij C; Bijlsma K; Nat H van der; Boswijk H; Pronk JDD; Dorigo-Zetsma JW; Heijnen MLA; NIVEL; CIE; LIS

    2001-01-01

    To provide insight into the virological aetiology of influenza-like illnesses and other acute respiratory infections, nose/throat swabs were taken by 30-35 general practitioners of the sentinel surveillance network of The Netherlands Institute of Health Services Research from a random selection of

  20. Application of random coherence order selection in gradient-enhanced multidimensional NMR

    International Nuclear Information System (INIS)

    Bostock, Mark J.; Nietlispach, Daniel

    2016-01-01

    Development of multidimensional NMR is essential to many applications, for example in high resolution structural studies of biomolecules. Multidimensional techniques enable separation of NMR signals over several dimensions, improving signal resolution, whilst also allowing identification of new connectivities. However, these advantages come at a significant cost. The Fourier transform theorem requires acquisition of a grid of regularly spaced points to satisfy the Nyquist criterion, while frequency discrimination and acquisition of a pure phase spectrum require acquisition of both quadrature components for each time point in every indirect (non-acquisition) dimension, adding a factor of 2^(N-1) to the number of free-induction decays which must be acquired, where N is the number of dimensions. Compressed sensing (CS) ℓ1-norm minimisation in combination with non-uniform sampling (NUS) has been shown to be extremely successful in overcoming the Nyquist criterion. Previously, maximum entropy reconstruction has also been used to overcome the limitation of frequency discrimination, processing data acquired with only one quadrature component at a given time interval, known as random phase detection (RPD), allowing a factor of two reduction in the number of points for each indirect dimension (Maciejewski et al. 2011 PNAS 108 16640). However, whilst this approach can be easily applied in situations where the quadrature components are acquired as amplitude modulated data, the same principle is not easily extended to phase modulated (P-/N-type) experiments where data is acquired in the form exp(iωt) or exp(-iωt), and which make up many of the multidimensional experiments used in modern NMR. Here we demonstrate a modification of the CS ℓ1-norm approach to allow random coherence order selection (RCS) for phase modulated experiments; we generalise the nomenclature for RCS and RPD as random quadrature detection (RQD). With this method, the power of RQD can be extended

  1. Canonical Naimark extension for generalized measurements involving sets of Pauli quantum observables chosen at random

    Science.gov (United States)

    Sparaciari, Carlo; Paris, Matteo G. A.

    2013-01-01

    We address measurement schemes where certain observables X_k are chosen at random within a set of nondegenerate isospectral observables and then measured on repeated preparations of a physical system. Each observable has a probability z_k to be measured, with ∑_k z_k = 1, and the statistics of this generalized measurement is described by a positive operator-valued measure. This kind of scheme is referred to as a quantum roulette, since each observable X_k is chosen at random, e.g., according to the fluctuating value of an external parameter. Here we focus on quantum roulettes for qubits involving the measurements of Pauli matrices, and we explicitly evaluate their canonical Naimark extensions, i.e., their implementation as indirect measurements involving an interaction scheme with a probe system. We thus provide a concrete model to realize the roulette without destroying the signal state, which can be measured again after the measurement or can be transmitted. Finally, we apply our results to the description of Stern-Gerlach-like experiments on a two-level system.
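
    In formulas (standard POVM notation assumed, not quoted from the article), the statistics of such a roulette and the role of the Naimark extension can be summarised as follows.

```latex
% If observable X_k has spectral projectors {P_{k,j}} over outcomes j and is
% drawn with probability z_k, the roulette as a whole is the POVM
\[
  \Pi_{j} \;=\; \sum_{k} z_{k}\, P_{k,j},
  \qquad \sum_{j} \Pi_{j} = \mathbb{1},
  \qquad p(j \mid \rho) \;=\; \operatorname{Tr}\!\left[\rho\, \Pi_{j}\right].
\]
% A canonical Naimark extension realises these POVM elements as a projective
% measurement on the system coupled to a probe, so the signal state is not
% destroyed and remains available after the measurement.
```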

  2. Economic selection index development for Beefmaster cattle II: General-purpose breeding objective.

    Science.gov (United States)

    Ochsner, K P; MacNeil, M D; Lewis, R M; Spangler, M L

    2017-05-01

    An economic selection index was developed for Beefmaster cattle in a general-purpose production system in which bulls are mated to a combination of heifers and mature cows, with the resulting progeny retained as replacements or sold at weaning. National average prices from 2010 to 2014 were used to establish income and expenses for the system. Genetic parameters were obtained from the literature. Economic values were estimated by simulating 100,000 animals and approximating the partial derivatives of the profit function by perturbing traits one at a time, by one unit, while holding the other traits constant at their respective means. Relative economic values for the objective traits calving difficulty direct (CDd), calving difficulty maternal (CDm), weaning weight direct (WWd), weaning weight maternal (WWm), mature cow weight (MW), and heifer pregnancy (HP) were -2.11, -1.53, 18.49, 11.28, -33.46, and 1.19, respectively. Consequently, under the scenario assumed herein, the greatest improvements in profitability could be made by decreasing the maintenance energy costs associated with MW, followed by improvements in weaning weight. The accuracy of the index lies between 0.218 (phenotype-based index selection) and 0.428 (breeding values known without error). Implementation of this index would facilitate genetic improvement and increase the profitability of Beefmaster cattle operations with a general-purpose breeding objective when replacement females are retained and with weaned calves as the sale end point.
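
    Written out, the breeding objective implied by the reported relative economic values has the form below; the aggregate-genotype/index notation (H, I, weights b) is standard selection-index theory and is assumed here rather than taken from the article.

```latex
% Aggregate breeding objective with the reported relative economic values:
\[
  H \;=\; -2.11\,g_{\mathrm{CDd}} \;-\; 1.53\,g_{\mathrm{CDm}}
          \;+\; 18.49\,g_{\mathrm{WWd}} \;+\; 11.28\,g_{\mathrm{WWm}}
          \;-\; 33.46\,g_{\mathrm{MW}} \;+\; 1.19\,g_{\mathrm{HP}},
\]
% where the g's are true breeding values of the objective traits. Candidates are
% ranked on an index I = \sum_i b_i \hat{g}_i whose weights b maximise the
% correlation r(I, H); that accuracy is the quantity reported to lie between
% 0.218 and 0.428.
```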

  3. Universal Prevention for Anxiety and Depressive Symptoms in Children: A Meta-analysis of Randomized and Cluster-Randomized Trials.

    Science.gov (United States)

    Ahlen, Johan; Lenhard, Fabian; Ghaderi, Ata

    2015-12-01

    Although under-diagnosed, anxiety and depression are among the most prevalent psychiatric disorders in children and adolescents, leading to severe impairment, increased risk of future psychiatric problems, and a high economic burden to society. Universal prevention may be a potent way to address these widespread problems. There are several benefits to universal relative to targeted interventions because there is limited knowledge as to how to screen for anxiety and depression in the general population. Earlier meta-analyses of the prevention of depression and anxiety symptoms among children suffer from methodological inadequacies such as combining universal, selective, and indicated interventions in the same analyses, and comparing cluster-randomized trials with randomized trials without any correction for clustering effects. The present meta-analysis attempted to determine the effectiveness of universal interventions to prevent anxiety and depressive symptoms after correcting for clustering effects. A systematic search of randomized studies in PsychINFO, Cochrane Library, and Google Scholar resulted in 30 eligible studies meeting inclusion criteria, namely peer-reviewed, randomized or cluster-randomized trials of universal interventions for anxiety and depressive symptoms in school-aged children. Sixty-three percent of the studies reported outcome data regarding anxiety and 87 % reported outcome data regarding depression. Seventy percent of the studies used randomization at the cluster level. There were small but significant effects regarding anxiety (.13) and depressive (.11) symptoms as measured at immediate posttest. At follow-up, which ranged from 3 to 48 months, effects were significantly larger than zero regarding depressive (.07) but not anxiety (.11) symptoms. There was no significant moderation effect of the following pre-selected variables: the primary aim of the intervention (anxiety or depression), deliverer of the intervention, gender distribution

  4. On Random Numbers and Design

    Science.gov (United States)

    Ben-Ari, Morechai

    2004-01-01

    The term "random" is frequently used in discussion of the theory of evolution, even though the mathematical concept of randomness is problematic and of little relevance in the theory. Therefore, since the core concept of the theory of evolution is the non-random process of natural selection, the term random should not be used in teaching the…

  5. Subset selection from generalized logistic populations

    NARCIS (Netherlands)

    Laan, van der M.J.; Laan, van der P.

    1997-01-01

    We give an introduction to the logistic and generalized logistic distributions. These generalized logistic distributions Type-I, Type-II and Type-III are indexed by a real valued parameter. They have been derived as mixtures with the standard logistic distribution and for discrete values of the

  6. Selecting for Fast Protein-Protein Association As Demonstrated on a Random TEM1 Yeast Library Binding BLIP.

    Science.gov (United States)

    Cohen-Khait, Ruth; Schreiber, Gideon

    2018-04-27

    Protein-protein interactions mediate the vast majority of cellular processes. Although protein interactions obey the same basic chemical principles within the cell, the in vivo physiological environment may not allow equilibrium to be reached. Thus, thermodynamic affinity measured in vitro may not provide a complete picture of protein interactions in the biological context. Binding kinetics, composed of the association and dissociation rate constants, are relevant and important in the cell. Therefore, changes in protein-protein interaction kinetics have a significant impact on the in vivo activity of the proteins. The common protocol for the selection of tighter binders from a mutant library selects for protein complexes with slower dissociation rate constants. Here we describe a method to specifically select for variants with faster association rate constants by using pre-equilibrium selection, starting from a large random library. Toward this end, we refine the selection conditions of a TEM1 β-lactamase library against its natural nanomolar-affinity binder β-lactamase inhibitor protein (BLIP). The optimal selection conditions depend on the ligand concentration and on the incubation time. In addition, we show that a second sort of the library helps to separate signal from noise, resulting in a higher percentage of faster binders in the selected library. Fast-associating protein variants are of particular interest for drug development and other biotechnological applications.
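
    The kinetic reason why short incubations favour fast associators can be stated in one line of standard pseudo-first-order binding kinetics; the symbols below are generic and are not taken from the record.

```latex
% Complex formation for a binder P and ligand L in excess ([L] ~ constant):
\[
  [PL](t) \;=\; [P]_0\,\frac{k_{\mathrm{on}}[L]}{k_{\mathrm{on}}[L]+k_{\mathrm{off}}}
           \Bigl(1 - e^{-(k_{\mathrm{on}}[L]+k_{\mathrm{off}})\,t}\Bigr)
  \;\approx\; [P]_0\,k_{\mathrm{on}}[L]\,t
  \quad\text{for } t \ll \bigl(k_{\mathrm{on}}[L]+k_{\mathrm{off}}\bigr)^{-1},
\]
% so at early (pre-equilibrium) times the amount of captured library member is
% governed almost entirely by its association rate constant, whereas at
% equilibrium it reflects the affinity K_D = k_off / k_on.
```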

  7. [Dementia friendly care services in general hospitals : Representative results of the general hospital study (GHoSt)].

    Science.gov (United States)

    Hendlmeier, Ingrid; Bickel, Horst; Hessler, Johannes Baltasar; Weber, Joshua; Junge, Magdalena Nora; Leonhardt, Sarah; Schäufele, Martina

    2017-11-06

    To date, special care services and procedures for general hospital patients with cognitive impairment have been reported mostly from model projects. The objective of this study was to determine the frequency of special care services and procedures in general hospitals on the basis of a representative cross-sectional study. From a list of all general hospitals in southern Germany we randomly selected a specified number of hospitals and somatic wards. The hospitals were visited and all older patients on the selected wards on that day were included in the study. Information about care services and their utilization was collected with standardized instruments. A total of 33 general hospitals and 172 wards participated in the study. The patient sample included 1469 persons over 65 (mean age 78.6 years) and 40% of the patients showed cognitive impairments. The staff reported that the most frequent measures for patients with cognitive impairments concerned patients with wandering behavior (63.1%), efforts to involve the patients' relatives in their daily care (60.1%), conducting nonintrusive interviews to identify cognitive impairments (59.9%), allocation to other rooms (58%) and visual aids for place orientation of patients (50.6%). In accordance with earlier studies, our results show that other dementia-friendly services implemented in pilot projects were rare. The existing special services for patients with cognitive impairment were rarely used by the patients or their relatives. The results demonstrate the urgent need to improve special care services and routines for the identification of elderly patients with cognitive impairment and risk of delirium in general hospitals.

  8. Nutritional counselling in primary health care: a randomized comparison of an intervention by general practitioner or dietician

    DEFF Research Database (Denmark)

    Willaing, Ingrid; Ladelund, Steen; Jørgensen, Torben

    2004-01-01

    AIMS: To compare health effects and risk reduction in two different strategies of nutritional counselling in primary health care for patients at high risk of ischaemic heart disease. METHODS: In a cluster-randomized trial, 60 general practitioners (GPs) in Copenhagen County were randomized to give nutritional counselling themselves or to refer patients to a dietician. Patients were included after opportunistic screening (n=503 patients) and received nutritional counselling by a GP or dietician over 12 months. Health effects were measured by changes in weight, waist circumference and blood lipids. Risk of cardiovascular disease was calculated by the Copenhagen Risk Score. Data on use of medicine and primary health care were obtained from central registers. RESULTS: Altogether 339 (67%) patients completed the intervention. Weight loss was larger in the dietician group (mean 4.5 kg vs. 2.4 kg

  9. Training specialists to write appropriate reply letters to general practitioners about patients with medically unexplained physical symptoms; A cluster-randomized trial.

    NARCIS (Netherlands)

    A. Weiland (Anne); A.H. Blankenstein (Annette); M.H.A. Willems; J.L.C.M. van Saase (Jan); P.L.A. van Daele (Paul); H.T. van der Molen (Henk); G.B. Langbroek (Ginger B.); A. Bootsma (Aart); E.M. Vriens (Els M.); A. Oberndorff-Klein Woolthuis (Ardi); R. Vernhout (Rene); L.R. Arends (Lidia)

    2015-01-01

    Objective: To evaluate effects of a communication training for specialists on the quality of their reply letters to general practitioners (GPs) about patients with medically unexplained physical symptoms (MUPS). Methods: Before randomization, specialists included ≤3 MUPS patients in a

  10. A Permutation Importance-Based Feature Selection Method for Short-Term Electricity Load Forecasting Using Random Forest

    Directory of Open Access Journals (Sweden)

    Nantian Huang

    2016-09-01

    Full Text Available The prediction accuracy of short-term load forecast (STLF depends on prediction model choice and feature selection result. In this paper, a novel random forest (RF-based feature selection method for STLF is proposed. First, 243 related features were extracted from historical load data and the time information of prediction points to form the original feature set. Subsequently, the original feature set was used to train an RF as the original model. After the training process, the prediction error of the original model on the test set was recorded and the permutation importance (PI value of each feature was obtained. Then, an improved sequential backward search method was used to select the optimal forecasting feature subset based on the PI value of each feature. Finally, the optimal forecasting feature subset was used to train a new RF model as the final prediction model. Experiments showed that the prediction accuracy of RF trained by the optimal forecasting feature subset was higher than that of the original model and comparative models based on support vector regression and artificial neural network.
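
    As a rough illustration of the workflow sketched in this abstract (train a random forest, rank features by permutation importance, then prune backwards), the following Python sketch uses scikit-learn on synthetic data; the 243 load/calendar features, the improved sequential backward search and all parameter values of the paper are not reproduced, and every name and value shown here is an illustrative assumption.

```python
# Sketch of permutation-importance-based backward feature selection with a
# random forest, loosely following the idea described above. Data, feature
# count, and the elimination rule are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 20))                            # stand-in for lag/calendar features
y = X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=2000)  # synthetic "load" target

features = list(range(X.shape[1]))
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.25, random_state=0)

best_err, best_feats = np.inf, features[:]
while len(features) > 1:
    rf = RandomForestRegressor(n_estimators=200, random_state=0)
    rf.fit(X_tr[:, features], y_tr)
    err = mean_squared_error(y_val, rf.predict(X_val[:, features]))
    if err < best_err:
        best_err, best_feats = err, features[:]
    # Permutation importance on the validation set; drop the least important feature.
    pi = permutation_importance(rf, X_val[:, features], y_val, n_repeats=5, random_state=0)
    features.pop(int(np.argmin(pi.importances_mean)))

print("selected feature indices:", best_feats, "validation MSE:", round(best_err, 4))
```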

  11. Direct random insertion mutagenesis of Helicobacter pylori

    NARCIS (Netherlands)

    de Jonge, Ramon; Bakker, Dennis; van Vliet, Arnoud H. M.; Kuipers, Ernst J.; Vandenbroucke-Grauls, Christina M. J. E.; Kusters, Johannes G.

    2003-01-01

    Random insertion mutagenesis is a widely used technique for the identification of bacterial virulence genes. Most strategies for random mutagenesis involve cloning in Escherichia coli for passage of plasmids or for phenotypic selection. This can result in biased selection due to restriction or

  12. Direct random insertion mutagenesis of Helicobacter pylori.

    NARCIS (Netherlands)

    Jonge, de R.; Bakker, D.; Vliet, van AH; Kuipers, E.J.; Vandenbroucke-Grauls, C.M.J.E.; Kusters, J.G.

    2003-01-01

    Random insertion mutagenesis is a widely used technique for the identification of bacterial virulence genes. Most strategies for random mutagenesis involve cloning in Escherichia coli for passage of plasmids or for phenotypic selection. This can result in biased selection due to restriction or

  13. Generalized Gaussian Error Calculus

    CERN Document Server

    Grabe, Michael

    2010-01-01

    For the first time in 200 years Generalized Gaussian Error Calculus addresses a rigorous, complete and self-consistent revision of the Gaussian error calculus. Since experimentalists realized that measurements in general are burdened by unknown systematic errors, the classical, widely used evaluation procedures scrutinizing the consequences of random errors alone turned out to be obsolete. As a matter of course, the error calculus to-be, treating random and unknown systematic errors side by side, should ensure the consistency and traceability of physical units, physical constants and physical quantities at large. The generalized Gaussian error calculus considers unknown systematic errors to spawn biased estimators. Beyond that, random errors are required to conform to the idea of what the author calls well-defined measuring conditions. The approach features the properties of a building kit: any overall uncertainty turns out to be the sum of a contribution due to random errors, to be taken from a confidence inter...

  14. Bit Error Rate Performance Analysis of a Threshold-Based Generalized Selection Combining Scheme in Nakagami Fading Channels

    Directory of Open Access Journals (Sweden)

    Kousa Maan

    2005-01-01

    Full Text Available The severity of fading on mobile communication channels calls for the combining of multiple diversity sources to achieve acceptable error rate performance. Traditional approaches perform the combining of the different diversity sources using either the conventional selective diversity combining (CSC), equal-gain combining (EGC), or maximal-ratio combining (MRC) schemes. CSC and MRC are the two extremes of compromise between performance quality and complexity. Some researchers have proposed a generalized selection combining scheme (GSC) that combines only the best few branches out of the available diversity resources. In this paper, we analyze a generalized selection combining scheme based on a threshold criterion rather than a fixed-size subset of the best channels. In this scheme, only those diversity branches whose energy levels are above a specified threshold are combined. Closed-form analytical solutions for the BER performances of this scheme over Nakagami fading channels are derived. We also discuss the merits of this scheme over GSC.
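
    The closed-form Nakagami-m results of the paper are not reproduced here; the following Monte Carlo sketch only illustrates the threshold-based combining idea for coherent BPSK with i.i.d. branches. The fallback to the strongest branch when no branch exceeds the threshold, and all parameter values, are assumptions.

```python
# Monte Carlo sketch of threshold-based generalized selection combining over
# i.i.d. Nakagami-m fading with BPSK (illustration only, not the paper's
# closed-form analysis).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
L, m, avg_snr_db, thresh_db, trials = 4, 2.0, 10.0, 5.0, 200_000
avg_snr = 10 ** (avg_snr_db / 10)
thresh = 10 ** (thresh_db / 10)

# Branch SNRs under Nakagami-m fading are Gamma(m, avg_snr/m) distributed.
snr = rng.gamma(shape=m, scale=avg_snr / m, size=(trials, L))

above = snr >= thresh
combined = np.where(above.any(axis=1),
                    (snr * above).sum(axis=1),   # combine qualifying branches
                    snr.max(axis=1))             # assumed fallback: best branch only

# Conditional BPSK bit error probability Q(sqrt(2*gamma)), averaged over trials.
ber = norm.sf(np.sqrt(2 * combined)).mean()
print(f"simulated BER ~ {ber:.3e}")
```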

  15. Role of Danish general practitioners in AIDS prevention

    DEFF Research Database (Denmark)

    Sandbæk, Annelli

    1996-01-01

    OBJECTIVE: To describe Danish general practitioners' perception of their own role and to register their actual behaviour in the prevention of HIV/AIDS. DESIGN: Data collection was carried out by a) questionnaire and b) prospective registration of consultations dealing with HIV/AIDS in a two-week period in September 1992. SETTING: General practice, Denmark. SUBJECTS: One thousand general practitioners (GPs), selected at random, were asked to participate. The study population comprised 352 GPs who returned the questionnaire and participated in the prospective registration. RESULTS: Most of the GPs (94%) were of the opinion that GPs should play a central part in the prevention of HIV; 96% found that their knowledge was sufficient to advise on the prevention of HIV, and 90% thought that the GP should take the initiative to talk about HIV. The median number of consultations dealing with HIV…

  16. Criteria for selecting children with special needs for dental treatment under general anaesthesia

    OpenAIRE

    Nova García, M. Joaquín de; Gallardo López, Nuria E.; Martín Sanjuán, Carmen; Mourelle Martínez, M. Rosa; Alonso García, Yolanda; Carracedo Cabaleiro, Esther

    2007-01-01

    Objective: To study criteria for helping to select children with special needs for dental treatment under general anaesthesia. Materials and methods: Group of 30 children (aged under 18) examined on the Course at the Universidad Complutense de Madrid (UCM) (Specialisation on holistic dental treatment of children with special needs) and subsequently referred to the Disabled Children’s Oral Health Unit (DCOHU) within Primary Health Care Area 2 of the Madrid Health Service (SERMAS) where dental ...

  17. Selective oropharyngeal decontamination versus selective digestive decontamination in critically ill patients: a meta-analysis of randomized controlled trials

    Directory of Open Access Journals (Sweden)

    Zhao D

    2015-07-01

    Full Text Available Di Zhao,1,* Jian Song,2,* Xuan Gao,3 Fei Gao,4 Yupeng Wu,2 Yingying Lu,5 Kai Hou1 1Department of Neurosurgery, The First Hospital of Hebei Medical University, 2Department of Neurosurgery, 3Department of Neurology, The Second Hospital of Hebei Medical University, 4Hebei Provincial Procurement Centers for Medical Drugs and Devices, 5Department of Neurosurgery, The Second Hospital of Hebei Medical University, Shijiazhuang, People’s Republic of China *These authors contributed equally to this work Background: Selective digestive decontamination (SDD) and selective oropharyngeal decontamination (SOD) are associated with reduced mortality and infection rates among patients in intensive care units (ICUs); however, whether SOD has a superior effect to SDD remains uncertain. Hence, we conducted a meta-analysis of randomized controlled trials (RCTs) to compare SOD with SDD in terms of clinical outcomes and antimicrobial resistance rates in patients who were critically ill. Methods: RCTs published in PubMed, Embase, and Web of Science were systematically reviewed to compare the effects of SOD and SDD in patients who were critically ill. Outcomes included day-28 mortality, length of ICU stay, length of hospital stay, duration of mechanical ventilation, ICU-acquired bacteremia, and prevalence of antibiotic-resistant Gram-negative bacteria. Results were expressed as risk ratios (RRs) with 95% confidence intervals (CIs), and weighted mean differences (WMDs) with 95% CIs. Pooled estimates were performed using a fixed-effects model or random-effects model, depending on the heterogeneity among studies. Results: A total of four RCTs involving 23,822 patients met the inclusion criteria and were included in this meta-analysis. Among patients whose admitting specialty was surgery, cardiothoracic surgery (57.3%) and neurosurgery (29.7%) were the two main types of surgery being performed. Pooled results showed that SOD had similar effects as SDD in day-28 mortality (RR =1

  18. A general population genetic framework for antagonistic selection that accounts for demography and recurrent mutation.

    Science.gov (United States)

    Connallon, Tim; Clark, Andrew G

    2012-04-01

    Antagonistic selection--where alleles at a locus have opposing effects on male and female fitness ("sexual antagonism") or between components of fitness ("antagonistic pleiotropy")--might play an important role in maintaining population genetic variation and in driving phylogenetic and genomic patterns of sexual dimorphism and life-history evolution. While prior theory has thoroughly characterized the conditions necessary for antagonistic balancing selection to operate, we currently know little about the evolutionary interactions between antagonistic selection, recurrent mutation, and genetic drift, which should collectively shape empirical patterns of genetic variation. To fill this void, we developed and analyzed a series of population genetic models that simultaneously incorporate these processes. Our models identify two general properties of antagonistically selected loci. First, antagonistic selection inflates heterozygosity and fitness variance across a broad parameter range--a result that applies to alleles maintained by balancing selection and by recurrent mutation. Second, effective population size and genetic drift profoundly affect the statistical frequency distributions of antagonistically selected alleles. The "efficacy" of antagonistic selection (i.e., its tendency to dominate over genetic drift) is extremely weak relative to classical models, such as directional selection and overdominance. Alleles meeting traditional criteria for strong selection (N(e)s > 1, where N(e) is the effective population size, and s is a selection coefficient for a given sex or fitness component) may nevertheless evolve as if neutral. The effects of mutation and demography may generate population differences in overall levels of antagonistic fitness variation, as well as molecular population genetic signatures of balancing selection.
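
    The following toy Wright-Fisher sketch illustrates the ingredients the abstract names (sex-specific selection, recurrent mutation and genetic drift at a single locus); the additive genic-selection scheme and every parameter value are assumptions chosen for illustration, not the authors' model.

```python
# Toy Wright-Fisher sketch of a sexually antagonistic autosomal locus with
# recurrent mutation and drift. Fitness scheme and parameters are assumptions.
import numpy as np

rng = np.random.default_rng(2)
N, gens = 1000, 20000         # diploid population size, generations
s_f, s_m = 0.05, -0.05        # allele A favoured in females, disfavoured in males
mu = 1e-4                     # symmetric per-allele mutation rate
p = 0.5                       # initial frequency of allele A

het = []
for _ in range(gens):
    # Genic (additive) selection applied separately in each sex.
    p_f = p * (1 + s_f) / (1 + p * s_f)
    p_m = p * (1 + s_m) / (1 + p * s_m)
    p_sel = 0.5 * (p_f + p_m)          # autosomal locus: average over sexes
    # Recurrent symmetric mutation A <-> a.
    p_mut = p_sel * (1 - mu) + (1 - p_sel) * mu
    # Genetic drift: binomial sampling of 2N gametes.
    p = rng.binomial(2 * N, p_mut) / (2 * N)
    het.append(2 * p * (1 - p))

print("mean expected heterozygosity over the run:", round(float(np.mean(het)), 4))
```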

  19. A Randomized Controlled Trial of Cognitive-Behavioral Therapy for Generalized Anxiety Disorder with Integrated Techniques from Emotion-Focused and Interpersonal Therapies

    Science.gov (United States)

    Newman, Michelle G.; Castonguay, Louis G.; Borkovec, Thomas D.; Fisher, Aaron J.; Boswell, James F.; Szkodny, Lauren E.; Nordberg, Samuel S.

    2011-01-01

    Objective: Recent models suggest that generalized anxiety disorder (GAD) symptoms may be maintained by emotional processing avoidance and interpersonal problems. Method: This is the first randomized controlled trial to test directly whether cognitive-behavioral therapy (CBT) could be augmented with the addition of a module targeting interpersonal…

  20. General Algorithm (High level)

    Indian Academy of Sciences (India)

    Iteratively: use the Tightness Property to remove points of P1,..,Pi; use random sampling to get a Random Sample (of enough points) from the next largest cluster, Pi+1; use the Random Sampling Procedure to approximate ci+1 using the ...

  1. Quetiapine monotherapy in acute treatment of generalized anxiety disorder: a systematic review and meta-analysis of randomized controlled trials

    Directory of Open Access Journals (Sweden)

    Maneeton N

    2016-01-01

    Full Text Available Narong Maneeton,1 Benchalak Maneeton,1 Pakapan Woottiluk,2 Surinporn Likhitsathian,1 Sirijit Suttajit,1 Vudhichai Boonyanaruthee,1 Manit Srisurapanont1 1Department of Psychiatry, Faculty of Medicine, Chiang Mai University, Chiang Mai, Thailand; 2Psychiatric Nursing Division, Faculty of Nursing, Chiang Mai University, Chiang Mai, Thailand Background: Some studies have indicated the efficacy of quetiapine in the treatment of generalized anxiety disorder (GAD). Objective: The purpose of this study was to systematically review the efficacy, acceptability, and tolerability of quetiapine in adult patients with GAD. Methods: The SCOPUS, MEDLINE, CINAHL, Cochrane Central Register of Controlled Trials, and ClinicalTrials.gov databases were searched in April 2015. All randomized controlled trials (RCTs) of GAD were considered for inclusion in this meta-analysis. All RCTs of quetiapine in GAD patients providing endpoint outcomes relevant to severity of anxiety, response rate, remission rate, overall discontinuation rate, or discontinuation rate due to adverse events were included. The full reports of suitable clinical studies were examined, and the important data were extracted. Efficacy outcomes consisted of the mean changed scores on anxiety rating scales and the response rate. Results: A total of 2,248 randomized participants in three RCTs were included. The pooled mean changed score of the quetiapine-treated group was greater than that of the placebo-treated group and comparable to selective serotonin reuptake inhibitors (SSRIs). Unfortunately, the response and remission rates of only the 50 and 150 mg/day doses of quetiapine-XR (extended-release) were better than those of the placebo. Their response and remission rates were comparable to SSRIs. The rates of pooled overall discontinuation and discontinuation due to adverse events of quetiapine-XR were greater than placebo. Only the overall discontinuation rate of quetiapine-XR at 50 and

  2. Random genetic drift, natural selection, and noise in human cranial evolution.

    Science.gov (United States)

    Roseman, Charles C

    2016-08-01

    This study assesses the extent to which relationships among groups complicate comparative studies of adaptation in recent human cranial variation and the extent to which departures from neutral additive models of evolution hinder the reconstruction of population relationships among groups using cranial morphology. Using a maximum likelihood evolutionary model fitting approach and a mixed population genomic and cranial data set, I evaluate the relative fits of several widely used models of human cranial evolution. Moreover, I compare the goodness of fit of models of cranial evolution constrained by genomic variation to test hypotheses about population specific departures from neutrality. Models from population genomics are much better fits to cranial variation than are traditional models from comparative human biology. There is not enough evolutionary information in the cranium to reconstruct much of recent human evolution but the influence of population history on cranial variation is strong enough to cause comparative studies of adaptation serious difficulties. Deviations from a model of random genetic drift along a tree-like population history show the importance of environmental effects, gene flow, and/or natural selection on human cranial variation. Moreover, there is a strong signal of the effect of natural selection or an environmental factor on a group of humans from Siberia. The evolution of the human cranium is complex and no one evolutionary process has prevailed at the expense of all others. A holistic unification of phenome, genome, and environmental context gives us a strong point of purchase on these problems, which is unavailable to any one traditional approach alone. Am J Phys Anthropol 160:582-592, 2016. © 2016 Wiley Periodicals, Inc.

  3. Analysis of random point images with the use of symbolic computation codes and generalized Catalan numbers

    Science.gov (United States)

    Reznik, A. L.; Tuzikov, A. V.; Solov'ev, A. A.; Torgov, A. V.

    2016-11-01

    Original codes and combinatorial-geometrical computational schemes are presented, which are developed and applied for finding exact analytical formulas that describe the probability of errorless readout of random point images recorded by a scanning aperture with a limited number of threshold levels. Combinatorial problems encountered in the course of the study and associated with the new generalization of Catalan numbers are formulated and solved. An attempt is made to find the explicit analytical form of these numbers, which is, on the one hand, a necessary stage of solving the basic research problem and, on the other hand, an independent self-consistent problem.
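
    The paper's generalization of the Catalan numbers is not specified in the abstract, so as background the sketch below only computes the classical Catalan numbers from their standard recurrence; it makes no claim about the authors' generalized sequence.

```python
# Classical Catalan numbers via the standard convolution recurrence
# C_0 = 1, C_{n+1} = sum_{i=0}^{n} C_i * C_{n-i}.
def catalan(n_max):
    c = [1]
    for n in range(n_max):
        c.append(sum(c[i] * c[n - i] for i in range(n + 1)))
    return c

print(catalan(10))  # [1, 1, 2, 5, 14, 42, 132, 429, 1430, 4862, 16796]
```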

  4. General Education vs. Vocational Training: Evidence from an Economy in Transition. NBER Working Paper No. 14155

    Science.gov (United States)

    Malamud, Ofer; Pop-Eleches, Cristian

    2008-01-01

    This paper examines the relative benefits of general education and vocational training in Romania, a country which experienced major technological and institutional change during its transition from Communism to a market economy. To avoid the bias caused by non-random selection, we exploit a 1973 educational reform that shifted a large proportion…

  5. Generalisability of a composite student selection programme

    DEFF Research Database (Denmark)

    O'Neill, Lotte Dyhrberg; Korsholm, Lars; Wallstedt, Birgitta

    2009-01-01

    format); general knowledge (multiple-choice test), and a semi-structured admission interview. The aim of this study was to estimate the generalisability of a composite selection. METHODS: Data from 307 applicants who participated in the admission to medicine in 2007 were available for analysis. Each admission parameter was double-scored using two random, blinded and independent raters. Variance components for applicant, rater and residual effects were estimated for a mixed model with the restricted maximum likelihood (REML) method. The reliability of obtained applicant ranks (G coefficients) was calculated for individual admission criteria and for composite admission procedures. RESULTS: A pre-selection procedure combining qualification and motivation scores showed insufficient generalisability (G = 0.45). The written motivation in particular displayed low generalisability (G = 0.10). Good...

  6. A New Formula for the BER of Binary Modulations with Dual-Branch Selection over Generalized-K

    KAUST Repository

    Ansari, Imran Shafique

    2012-09-08

    Error performance is one of the main performance measures and the derivation of its closed-form expression has proved to be quite involved for certain communication systems operating over composite fading channels. In this letter, a unified closed-form expression, applicable to different binary modulation schemes, for the bit error rate of dual-branch selection diversity based systems undergoing independent but not necessarily identically distributed generalized-K fading is derived in terms of the extended generalized bivariate Meijer G-function.
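
    The unified Meijer G-function expression itself is not reproduced here; the sketch below is a Monte Carlo illustration of the same setup (coherent BPSK, two independent but non-identically distributed generalized-K branches, selection of the stronger branch), with all fading parameters chosen as illustrative assumptions.

```python
# Monte Carlo sketch of dual-branch selection diversity over generalized-K
# (gamma-shadowed Nakagami-m) fading for coherent BPSK. Illustration only;
# the closed-form Meijer G-function result is not reproduced here.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
trials = 500_000
# Independent, non-identically distributed branches: (m multipath, k shadowing, mean SNR).
branches = [(2.0, 1.5, 10 ** (8 / 10)), (1.5, 2.0, 10 ** (10 / 10))]

snrs = []
for m, k, mean_snr in branches:
    multipath = rng.gamma(shape=m, scale=1.0 / m, size=trials)   # unit-mean Gamma
    shadowing = rng.gamma(shape=k, scale=1.0 / k, size=trials)   # unit-mean Gamma
    snrs.append(mean_snr * multipath * shadowing)                # generalized-K branch SNR

gamma_sc = np.maximum(snrs[0], snrs[1])         # selection diversity: pick the best branch
ber = norm.sf(np.sqrt(2 * gamma_sc)).mean()     # E[Q(sqrt(2*gamma))] for BPSK
print(f"simulated BER ~ {ber:.3e}")
```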

  7. Comparison of maternal and fetal outcomes among patients undergoing cesarean section under general and spinal anesthesia: a randomized clinical trial

    Directory of Open Access Journals (Sweden)

    Anıl İçel Saygı

    Full Text Available CONTEXT AND OBJECTIVE: As the rates of cesarean births have increased, the type of cesarean anesthesia has gained importance. Here, we aimed to compare the effects of general and spinal anesthesia on maternal and fetal outcomes in term singleton cases undergoing elective cesarean section. DESIGN AND SETTING: Prospective randomized controlled clinical trial in a tertiary-level public hospital. METHODS: Our study was conducted on 100 patients who underwent cesarean section due to elective indications. The patients were randomly divided into general anesthesia (n = 50) and spinal anesthesia (n = 50) groups. The maternal pre- and postoperative hematological results, intra- and postoperative hemodynamic parameters and perinatal results were compared between the groups. RESULTS: Mean bowel sounds (P = 0.036) and gas discharge time (P = 0.049) were significantly greater and 24th-hour hemoglobin difference values (P = 0.001) were higher in the general anesthesia group. The mean hematocrit and hemoglobin values at the 24th hour (P = 0.004 and P < 0.001, respectively), urine volume at the first postoperative hour (P < 0.001) and median Apgar score at the first minute (P < 0.0005) were significantly higher, and the time that elapsed until the first requirement for analgesia was significantly longer (P = 0.042), in the spinal anesthesia group. CONCLUSION: In elective cases, spinal anesthesia is superior to general anesthesia in terms of postoperative comfort. In pregnancies with a risk of fetal distress, it would be appropriate to prefer spinal anesthesia, taking the first-minute Apgar score into account.

  8. The effectiveness of manual therapy, physiotherapy, and treatment by the general practitioner for nonspecific back and neck complaints : A randomized clinical trial

    NARCIS (Netherlands)

    Koes, B. W.; Bouter, L. M.; Van Mameren, H.; Essers, A. H.; Verstegen, G. M.; Hofhuizen, D. M.; Houben, J. P.; Knipschild, P. G.

    1992-01-01

    In a randomized trial, the effectiveness of manual therapy, physiotherapy, continued treatment by the general practitioner, and placebo therapy (detuned ultrasound and detuned short-wave diathermy) were compared for patients (n = 256) with nonspecific back and neck complaints lasting for at least 6

  9. Postcolonial Sub-Saharan 1 State and Contemporary General Business Environment. Selected Issues

    Directory of Open Access Journals (Sweden)

    Tomasz W. Kolasinski

    2015-06-01

    Full Text Available Purpose: The paper presents the results of a qualitative analysis of selected aspects of the general business environment. The author strives to answer the following question formulated in the context of postcolonial deliberations: has the general business environment been affected by European colonialism? Methodology: Semantic and semiotic analysis of primary sources (statistical data and research findings formulated by international organisations such as the World Bank, OECD and UNIDO) and secondary sources (scientific and research studies of Polish and foreign authors based on primary sources); literature review. Findings: In the postcolonial perspective, qualitative analysis shows neither a positive nor a negative impact of colonialism on the contemporary general business environment. If certain signs of its deterioration are observed, they are mostly due to the erosion of state capacity, whose origins can be traced back to the Berlin Conference. Originality: Papers on Postcolonial Management and Critical Management Studies (CMS) bridge a gap in the literature pertaining to management issues, especially in Poland. Due to their interdisciplinary nature, Postcolonial Management and CMS cover a broad range of research areas (i.e. theory of state and nation, sociology, economic history). They pertain to both economics and management, and are therefore difficult to classify.

  10. Randomized controlled trial on promoting influenza vaccination in general practice waiting rooms.

    Directory of Open Access Journals (Sweden)

    Christophe Berkhout

    Full Text Available Most general practitioners (GPs) use advertising in their waiting rooms for patient education purposes. The proportion of patients vaccinated against seasonal influenza has been gradually declining. The objective of this trial was to assess the effect of an advertising campaign for influenza vaccination using posters and pamphlets in GPs' waiting rooms. Registry-based 2/1 cluster-randomized controlled trial, each cluster gathering the enlisted patients aged over 16 years of one of 75 GPs. The trial, run during the 2014-2015 influenza vaccination campaign, compared the awareness of patients in 50 GPs' standard waiting rooms (control group) with that of patients in the waiting rooms of 25 GPs who had received and displayed pamphlets and one poster on the influenza vaccine (intervention group), in addition to the standard mandatory information. The main outcome was the number of vaccination units delivered in pharmacies. Data were extracted from the SIAM-ERASME claims database of the Health Insurance Fund of Lille-Douai (France). The association between the intervention (yes/no) and the main outcome was assessed through a generalized estimating equation. Seventy-five GPs enrolled 10,597 patients over 65 years or suffering from long-lasting diseases (intervention/control: 3781/6816 patients) from October 15, 2014 to February 28, 2015. No difference was found in the number of influenza vaccination units delivered (relative risk (RR) = 1.01; 95% confidence interval: 0.97 to 1.05; p = 0.561). An effect of the monothematic campaign promoting vaccination against influenza using a poster and pamphlets displayed in GPs' waiting rooms could not be demonstrated.

  11. Simultaneous escaping of explicit and hidden free energy barriers: application of the orthogonal space random walk strategy in generalized ensemble based conformational sampling.

    Science.gov (United States)

    Zheng, Lianqing; Chen, Mengen; Yang, Wei

    2009-06-21

    To overcome the pseudoergodicity problem, conformational sampling can be accelerated via generalized ensemble methods, e.g., through the realization of random walks along prechosen collective variables, such as spatial order parameters, energy scaling parameters, or even system temperatures or pressures. As usually observed in generalized ensemble simulations, hidden barriers are likely to exist in the space perpendicular to the collective variable direction, and these residual free energy barriers can greatly reduce sampling efficiency. This sampling issue is particularly severe when the collective variable is defined in a low-dimension subset of the target system; then the "Hamiltonian lagging" problem, in which necessary structural relaxation falls behind the move of the collective variable, is likely to occur. To overcome this problem in equilibrium conformational sampling, we adopted the orthogonal space random walk (OSRW) strategy, which was originally developed in the context of free energy simulation [L. Zheng, M. Chen, and W. Yang, Proc. Natl. Acad. Sci. U.S.A. 105, 20227 (2008)]. Thereby, generalized ensemble simulations can simultaneously escape both the explicit barriers along the collective variable direction and the hidden barriers that are strongly coupled with the collective variable move. As demonstrated in our model studies, the present OSRW-based generalized ensemble treatments show improved sampling capability over the corresponding classical generalized ensemble treatments.

  12. CHAIN-WISE GENERALIZATION OF ROAD NETWORKS USING MODEL SELECTION

    Directory of Open Access Journals (Sweden)

    D. Bulatov

    2017-05-01

    Full Text Available Streets are essential entities of urban terrain and their automated extraction from airborne sensor data is cumbersome because of a complex interplay of geometric, topological and semantic aspects. Given a binary image representing the road class, centerlines of road segments are extracted by means of skeletonization. The focus of this paper lies in a well-reasoned representation of these segments by means of geometric primitives, such as straight line segments as well as circle and ellipse arcs. We propose the fusion of raw segments based on similarity criteria; the output of this process is a set of so-called chains, which better match the intuitive perception of what a street is. Further, we propose a two-step approach for chain-wise generalization. First, the chain is pre-segmented using circlePeucker; then, model selection is used to decide whether two neighboring segments should be fused into a new geometric entity. Thereby, we consider both variance-covariance analysis of residuals and model complexity. The results on a complex data-set with many traffic roundabouts indicate the benefits of the proposed procedure.

  13. Goal selection versus process control while learning to use a brain-computer interface

    Science.gov (United States)

    Royer, Audrey S.; Rose, Minn L.; He, Bin

    2011-06-01

    A brain-computer interface (BCI) can be used to accomplish a task without requiring motor output. Two major control strategies used by BCIs during task completion are process control and goal selection. In process control, the user exerts continuous control and independently executes the given task. In goal selection, the user communicates their goal to the BCI and then receives assistance executing the task. A previous study has shown that goal selection is more accurate and faster in use. An unanswered question is, which control strategy is easier to learn? This study directly compares goal selection and process control while learning to use a sensorimotor rhythm-based BCI. Twenty young healthy human subjects were randomly assigned either to a goal selection or a process control-based paradigm for eight sessions. At the end of the study, the best user from each paradigm completed two additional sessions using all paradigms randomly mixed. The results of this study were that goal selection required a shorter training period for increased speed, accuracy, and information transfer over process control. These results held for the best subjects as well as in the general subject population. The demonstrated characteristics of goal selection make it a promising option to increase the utility of BCIs intended for both disabled and able-bodied users.

  14. Impact of the European Randomized Study of Screening for Prostate Cancer (ERSPC) on prostate-specific antigen (PSA) testing by Dutch general practitioners

    NARCIS (Netherlands)

    Van der Meer, Saskia; Kollen, Boudewijn J.; Hirdes, Willem H.; Steffens, Martijn G.; Hoekstra-Weebers, Josette E. H. M.; Nijman, Rien M.; Blanker, Marco H.

    Objective To determine the impact of the European Randomized Study of Screening for Prostate Cancer (ERSPC) publication in 2009 on prostate-specific antigen (PSA) level testing by Dutch general practitioners (GPs) in men aged 40 years. Materials and Methods Retrospective study with a Dutch insurance

  15. Rationale and study design of PROVHILO - a worldwide multicenter randomized controlled trial on protective ventilation during general anesthesia for open abdominal surgery.

    Science.gov (United States)

    Hemmes, Sabrine N T; Severgnini, Paolo; Jaber, Samir; Canet, Jaume; Wrigge, Hermann; Hiesmayr, Michael; Tschernko, Edda M; Hollmann, Markus W; Binnekade, Jan M; Hedenstierna, Göran; Putensen, Christian; de Abreu, Marcelo Gama; Pelosi, Paolo; Schultz, Marcus J

    2011-05-06

    Post-operative pulmonary complications add to the morbidity and mortality of surgical patients, in particular after general anesthesia >2 hours for abdominal surgery. Whether a protective mechanical ventilation strategy with higher levels of positive end-expiratory pressure (PEEP) and repeated recruitment maneuvers (the "open lung strategy") protects against post-operative pulmonary complications is uncertain. The present study aims at comparing a protective mechanical ventilation strategy with a conventional mechanical ventilation strategy during general anesthesia for abdominal non-laparoscopic surgery. The PROtective Ventilation using HIgh versus LOw positive end-expiratory pressure ("PROVHILO") trial is a worldwide investigator-initiated multicenter randomized controlled two-arm study. Nine hundred patients scheduled for non-laparoscopic abdominal surgery at high or intermediate risk for post-operative pulmonary complications are randomized to mechanical ventilation with a PEEP level of 12 cmH(2)O with recruitment maneuvers (the lung-protective strategy) or mechanical ventilation with a PEEP level of maximum 2 cmH(2)O without recruitment maneuvers (the conventional strategy). The primary endpoint is any post-operative pulmonary complication. The PROVHILO trial is the first randomized controlled trial powered to investigate whether an open lung mechanical ventilation strategy in short-term mechanical ventilation protects against postoperative pulmonary complications. ISRCTN: ISRCTN70332574.

  16. A randomized trial of dialectical behavior therapy versus general psychiatric management for borderline personality disorder.

    Science.gov (United States)

    McMain, Shelley F; Links, Paul S; Gnam, William H; Guimond, Tim; Cardish, Robert J; Korman, Lorne; Streiner, David L

    2009-12-01

    The authors sought to evaluate the clinical efficacy of dialectical behavior therapy compared with general psychiatric management, including a combination of psychodynamically informed therapy and symptom-targeted medication management derived from specific recommendations in APA guidelines for borderline personality disorder. This was a single-blind trial in which 180 patients diagnosed with borderline personality disorder who had at least two suicidal or nonsuicidal self-injurious episodes in the past 5 years were randomly assigned to receive 1 year of dialectical behavior therapy or general psychiatric management. The primary outcome measures, assessed at baseline and every 4 months over the treatment period, were frequency and severity of suicidal and nonsuicidal self-harm episodes. Both groups showed improvement on the majority of clinical outcome measures after 1 year of treatment, including significant reductions in the frequency and severity of suicidal and nonsuicidal self-injurious episodes and significant improvements in most secondary clinical outcomes. Both groups had a reduction in general health care utilization, including emergency visits and psychiatric hospital days, as well as significant improvements in borderline personality disorder symptoms, symptom distress, depression, anger, and interpersonal functioning. No significant differences across any outcomes were found between groups. These results suggest that individuals with borderline personality disorder benefited equally from dialectical behavior therapy and a well-specified treatment delivered by psychiatrists with expertise in the treatment of borderline personality disorder.

  17. The patient general satisfaction of mandibular single-implant overdentures and conventional complete dentures: Study protocol for a randomized crossover trial.

    Science.gov (United States)

    Kanazawa, Manabu; Tanoue, Mariko; Miyayasu, Anna; Takeshita, Shin; Sato, Daisuke; Asami, Mari; Lam, Thuy Vo; Thu, Khaing Myat; Oda, Ken; Komagamine, Yuriko; Minakuchi, Shunsuke; Feine, Jocelyne

    2018-05-01

    Mandibular overdentures retained by a single implant placed in the midline of edentulous mandible have been reported to be more comfortable and function better than complete dentures. Although single-implant overdentures are still more costly than conventional complete dentures, there are a few studies which investigated whether mandibular single-implant overdentures are superior to complete dentures when patient general satisfaction is compared. The aim of this study is to assess patient general satisfaction with mandibular single-implant overdentures and complete dentures. This study is a randomized crossover trial to compare mandibular single-implant overdentures and complete dentures in edentulous individuals. Participant recruitment is ongoing at the time of this submission. Twenty-two participants will be recruited. New mandibular complete dentures will be fabricated. A single implant will be placed in the midline of the edentulous mandible. The mucosal surface of the complete denture around the implant will be relieved for 3 months. The participants will then be randomly allocated into 2 groups according to the order of the interventions; group 1 will receive single-implant overdentures first and will wear them for 2 months, followed by complete dentures for 2 months. Group 2 will receive the same treatments in a reverse order. After experiencing the 2 interventions, the participants will choose one of the mandibular prostheses, and yearly follow-up visits are planned for 5 years. The primary outcome of this trial is patient ratings of general satisfaction on 100 mm visual analog scales. Assessments of the prostheses and oral health-related quality of life will also be recorded as patient-reported outcomes. The secondary outcomes are cost and time for treatment. Masticatory efficiency and cognitive capacity will also be recorded. Furthermore, qualitative research will be performed to investigate the factors associated with success of these mandibular

  18. A Generalized Measure for the Optimal Portfolio Selection Problem and its Explicit Solution

    Directory of Open Access Journals (Sweden)

    Zinoviy Landsman

    2018-03-01

    Full Text Available In this paper, we offer a novel class of utility functions applied to optimal portfolio selection. This class incorporates as special cases important measures such as the mean-variance, Sharpe ratio, mean-standard deviation and others. We provide an explicit solution to the problem of optimal portfolio selection based on this class. Furthermore, we show that each measure in this class generally reduces to an efficient frontier that coincides with or belongs to the classical mean-variance efficient frontier. In addition, a condition is provided for the existence of a one-to-one correspondence between the parameter of this class of utility functions and the trade-off parameter λ in the mean-variance utility function. This correspondence essentially provides insight into the choice of this parameter. We illustrate our results by taking a portfolio of stocks from the National Association of Securities Dealers Automated Quotation (NASDAQ).
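
    For background, the sketch below computes the classical mean-variance efficient frontier to which the generalized measure is said to reduce, using the standard two-constraint Lagrangian solution; the synthetic covariance matrix and expected returns are assumptions, and the paper's utility class itself is not implemented.

```python
# Classical mean-variance frontier: minimum-variance fully-invested portfolio
# for a target mean return, from the standard Lagrangian closed form.
import numpy as np

rng = np.random.default_rng(4)
n = 5
A = rng.normal(size=(n, n))
Sigma = A @ A.T + n * np.eye(n)        # synthetic positive-definite covariance
mu = rng.uniform(0.02, 0.12, size=n)   # synthetic expected returns
ones = np.ones(n)

Si_mu, Si_1 = np.linalg.solve(Sigma, mu), np.linalg.solve(Sigma, ones)
a, b, c = mu @ Si_mu, mu @ Si_1, ones @ Si_1
d = a * c - b * b

def frontier_weights(target_return):
    """Weights minimizing w'Sigma w subject to w'mu = target and w'1 = 1."""
    lam = (c * target_return - b) / d
    gam = (a - b * target_return) / d
    return lam * Si_mu + gam * Si_1

for r in np.linspace(mu.min(), mu.max(), 5):
    w = frontier_weights(r)
    print(f"target {r:.3f}  variance {w @ Sigma @ w:.4f}  sum(w) {w.sum():.2f}")
```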

  19. Reconstructing random media

    International Nuclear Information System (INIS)

    Yeong, C.L.; Torquato, S.

    1998-01-01

    We formulate a procedure to reconstruct the structure of general random heterogeneous media from limited morphological information by extending the methodology of Rintoul and Torquato [J. Colloid Interface Sci. 186, 467 (1997)] developed for dispersions. The procedure has the advantages that it is simple to implement and generally applicable to multidimensional, multiphase, and anisotropic structures. Furthermore, an extremely useful feature is that it can incorporate any type and number of correlation functions in order to provide as much morphological information as is necessary for accurate reconstruction. We consider a variety of one- and two-dimensional reconstructions, including periodic and random arrays of rods, various distributions of disks, Debye random media, and a Fontainebleau sandstone sample. We also use our algorithm to construct heterogeneous media from specified hypothetical correlation functions, including an exponentially damped, oscillating function as well as physically unrealizable ones. copyright 1998 The American Physical Society
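
    A minimal sketch of the reconstruction idea described above is given below: pixel swaps are accepted by a Metropolis rule so that the two-point correlation function of a trial image approaches that of a target image. The image size, cooling schedule and iteration count are illustrative assumptions, and only a single correlation function is matched.

```python
# Rough sketch of stochastic (annealing-based) reconstruction of a two-phase
# medium by matching the two-point correlation function of a target image.
import numpy as np

rng = np.random.default_rng(5)

def s2(img):
    """Periodic two-point correlation of the phase-1 indicator via FFT autocorrelation."""
    f = np.fft.fft2(img)
    return np.real(np.fft.ifft2(f * np.conj(f))) / img.size

# Target: a simple two-disk medium; initial guess: random pixels at a similar volume fraction.
n = 32
yy, xx = np.mgrid[0:n, 0:n]
target = (((xx - 8) ** 2 + (yy - 8) ** 2 < 25) |
          ((xx - 22) ** 2 + (yy - 20) ** 2 < 36)).astype(float)
trial = (rng.random((n, n)) < target.mean()).astype(float)

s2_target = s2(target)
energy = np.sum((s2(trial) - s2_target) ** 2)
T = 1e-4
for step in range(20000):
    # Swap one phase-1 pixel with one phase-0 pixel (preserves the trial's volume fraction).
    ones_idx = np.argwhere(trial == 1)
    zeros_idx = np.argwhere(trial == 0)
    i = tuple(ones_idx[rng.integers(len(ones_idx))])
    j = tuple(zeros_idx[rng.integers(len(zeros_idx))])
    trial[i], trial[j] = 0.0, 1.0
    new_energy = np.sum((s2(trial) - s2_target) ** 2)
    if new_energy > energy and rng.random() >= np.exp((energy - new_energy) / T):
        trial[i], trial[j] = 1.0, 0.0          # reject: undo the swap
    else:
        energy = new_energy
    T *= 0.9999                                 # simple geometric cooling

print("final correlation-function mismatch:", float(energy))
```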

  20. Blind Measurement Selection: A Random Matrix Theory Approach

    KAUST Repository

    Elkhalil, Khalil; Kammoun, Abla; Al-Naffouri, Tareq Y.; Alouini, Mohamed-Slim

    2016-01-01

    -aware fashions. We present two potential applications where the proposed algorithms can be used, namely antenna selection for uplink transmissions in large scale multi-user systems and sensor selection for wireless sensor networks. Numerical results are also

  1. Combining rational and random strategies in β-glucosidase Zm-p60.1 protein library construction.

    Directory of Open Access Journals (Sweden)

    Dušan Turek

    Full Text Available Saturation mutagenesis is a cornerstone technique in protein engineering because of its utility (in conjunction with appropriate analytical techniques for assessing effects of varying residues at selected positions on proteins' structures and functions. Site-directed mutagenesis with degenerate primers is the simplest and most rapid saturation mutagenesis technique. Thus, it is highly appropriate for assessing whether or not variation at certain sites is permissible, but not necessarily the most time- and cost-effective technique for detailed assessment of variations' effects. Thus, in the presented study we applied the technique to randomize position W373 in β-glucosidase Zm-p60.1, which is highly conserved among β-glucosidases. Unexpectedly, β-glucosidase activity screening of the generated variants showed that most variants were active, although they generally had significantly lower activity than the wild type enzyme. Further characterization of the library led us to conclude that a carefully selected combination of randomized codon-based saturation mutagenesis and site-directed mutagenesis may be most efficient, particularly when constructing and investigating randomized libraries with high fractions of positive hits.

  2. Combining rational and random strategies in β-glucosidase Zm-p60.1 protein library construction.

    Science.gov (United States)

    Turek, Dušan; Klimeš, Pavel; Mazura, Pavel; Brzobohatý, Břetislav

    2014-01-01

    Saturation mutagenesis is a cornerstone technique in protein engineering because of its utility (in conjunction with appropriate analytical techniques) for assessing effects of varying residues at selected positions on proteins' structures and functions. Site-directed mutagenesis with degenerate primers is the simplest and most rapid saturation mutagenesis technique. Thus, it is highly appropriate for assessing whether or not variation at certain sites is permissible, but not necessarily the most time- and cost-effective technique for detailed assessment of variations' effects. Thus, in the presented study we applied the technique to randomize position W373 in β-glucosidase Zm-p60.1, which is highly conserved among β-glucosidases. Unexpectedly, β-glucosidase activity screening of the generated variants showed that most variants were active, although they generally had significantly lower activity than the wild type enzyme. Further characterization of the library led us to conclude that a carefully selected combination of randomized codon-based saturation mutagenesis and site-directed mutagenesis may be most efficient, particularly when constructing and investigating randomized libraries with high fractions of positive hits.

  3. Social phobia, anxiety, oppositional behavior, social skills, and self-concept in children with specific selective mutism, generalized selective mutism, and community controls.

    Science.gov (United States)

    Cunningham, Charles E; McHolm, Angela E; Boyle, Michael H

    2006-08-01

    We compared social phobia, anxiety, oppositional behavior, social skills, and self-concept in three groups: (1) 28 children with specific mutism (who did not speak to teachers but were more likely to speak to parents and peers at home and school); (2) 30 children with generalized mutism (whose speaking was restricted primarily to their homes); and (3) 52 community controls. Children with generalized mutism evidenced higher anxiety at school, and more separation anxiety, OCD, and depressive symptoms at home. Parents and teachers reported that the social phobia and anxiety scores of children in both the specific and generalized mutism subgroups were higher than controls. Children in both the specific and generalized mutism groups evidenced greater deficits in verbal and nonverbal social skills at home and school than controls. Teachers and parents did not report differences in nonverbal measures of social cooperation and conflict resolution and we found no evidence that selective mutism was linked to an increase in externalizing problems such as oppositional behavior or ADHD. Although children with specific mutism speak in a wider range of situations and appear less anxious to their teachers than children with generalized mutism, significant socially phobic behavior and social skills deficits are present in both groups.

  4. Effects of choice architecture and chef-enhanced meals on the selection and consumption of healthier school foods: a randomized clinical trial.

    Science.gov (United States)

    Cohen, Juliana F W; Richardson, Scott A; Cluggish, Sarah A; Parker, Ellen; Catalano, Paul J; Rimm, Eric B

    2015-05-01

    Little is known about the long-term effect of a chef-enhanced menu on healthier food selection and consumption in school lunchrooms. In addition, it remains unclear if extended exposure to other strategies to promote healthier foods (eg, choice architecture) also improves food selection or consumption. To evaluate the short- and long-term effects of chef-enhanced meals and extended exposure to choice architecture on healthier school food selection and consumption. A school-based randomized clinical trial was conducted during the 2011-2012 school year among 14 elementary and middle schools in 2 urban, low-income school districts (intent-to-treat analysis). Included in the study were 2638 students in grades 3 through 8 attending participating schools (38.4% of eligible participants). Schools were first randomized to receive a professional chef to improve school meal palatability (chef schools) or to a delayed intervention (control group). To assess the effect of choice architecture (smart café), all schools after 3 months were then randomized to the smart café intervention or to the control group. School food selection was recorded, and consumption was measured using plate waste methods. After 3 months, vegetable selection increased in chef vs control schools (odds ratio [OR], 1.75; 95% CI, 1.36-2.24), but there was no effect on the selection of other components or on meal consumption. After long-term or extended exposure to the chef or smart café intervention, fruit selection increased in the chef (OR, 3.08; 95% CI, 2.23-4.25), smart café (OR, 1.45; 95% CI, 1.13-1.87), and chef plus smart café (OR, 3.10; 95% CI, 2.26-4.25) schools compared with the control schools, and consumption increased in the chef schools (OR, 0.17; 95% CI, 0.03-0.30 cups/d). Vegetable selection increased in the chef (OR, 2.54; 95% CI, 1.83-3.54), smart café (OR, 1.91; 95% CI, 1.46-2.50), and chef plus smart café schools (OR, 7.38, 95% CI, 5.26-10.35) compared with the control schools

  5. Modified random hinge transport mechanics and multiple scattering step-size selection in EGS5

    International Nuclear Information System (INIS)

    Wilderman, S.J.; Bielajew, A.F.

    2005-01-01

    The new transport mechanics in EGS5 allows for significantly longer electron transport step sizes and hence shorter computation times than required for identical problems in EGS4. But as with all Monte Carlo electron transport algorithms, certain classes of problems exhibit step-size dependencies even when operating within recommended ranges, sometimes making selection of step-sizes a daunting task for novice users. Further contributing to this problem, because of the decoupling of multiple scattering and continuous energy loss in the dual random hinge transport mechanics of EGS5, there are two independent step sizes in EGS5, one for multiple scattering and one for continuous energy loss, each of which influences speed and accuracy in a different manner. Further, whereas EGS4 used a single value of fractional energy loss (ESTEPE) to determine step sizes at all energies, to increase performance by decreasing the amount of effort expended simulating lower energy particles, EGS5 permits the fractional energy loss values which are used to determine both the multiple scattering and continuous energy loss step sizes to vary with energy. This results in requiring the user to specify four fractional energy loss values when optimizing computations for speed. Thus, in order to simplify step-size selection and to mitigate step-size dependencies, a method has been devised to automatically optimize step-size selection based on a single material dependent input related to the size of problem tally region. In this paper we discuss the new transport mechanics in EGS5 and describe the automatic step-size optimization algorithm. (author)

  6. Estimation and model selection of semiparametric multivariate survival functions under general censorship.

    Science.gov (United States)

    Chen, Xiaohong; Fan, Yanqin; Pouzo, Demian; Ying, Zhiliang

    2010-07-01

    We study estimation and model selection of semiparametric models of multivariate survival functions for censored data, which are characterized by possibly misspecified parametric copulas and nonparametric marginal survivals. We obtain the consistency and root-n asymptotic normality of a two-step copula estimator to the pseudo-true copula parameter value according to KLIC, and provide a simple consistent estimator of its asymptotic variance, allowing for a first-step nonparametric estimation of the marginal survivals. We establish the asymptotic distribution of the penalized pseudo-likelihood ratio statistic for comparing multiple semiparametric multivariate survival functions subject to copula misspecification and general censorship. An empirical application is provided.
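
    The sketch below illustrates the two-step (pseudo-likelihood) copula idea in a drastically simplified setting: censoring is ignored, marginals are replaced by rescaled empirical ranks, and a Clayton copula is assumed. None of the paper's censoring adjustments, KLIC analysis or model-selection statistics are implemented.

```python
# Two-step copula estimation sketch for uncensored bivariate data:
# step 1 = nonparametric (rank-based) marginals, step 2 = maximize the
# Clayton copula log-density in its parameter theta.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import rankdata

rng = np.random.default_rng(6)
n, true_theta = 500, 2.0
# Simulate Clayton-dependent pairs by the conditional method, then transform
# to "survival times" with arbitrary marginals.
u = rng.random(n)
w = rng.random(n)
v = ((w ** (-true_theta / (1 + true_theta)) - 1) * u ** (-true_theta) + 1) ** (-1 / true_theta)
t1, t2 = -np.log(1 - u), 2.0 * (-np.log(1 - v))

# Step 1: pseudo-observations in (0, 1) from empirical ranks.
pu = rankdata(t1) / (n + 1)
pv = rankdata(t2) / (n + 1)

# Step 2: maximize the Clayton copula log-density.
def neg_loglik(theta):
    s = pu ** (-theta) + pv ** (-theta) - 1
    ll = (np.log1p(theta) - (theta + 1) * (np.log(pu) + np.log(pv))
          - (2 + 1 / theta) * np.log(s))
    return -ll.sum()

res = minimize_scalar(neg_loglik, bounds=(1e-3, 20), method="bounded")
print("estimated Clayton theta:", round(res.x, 3), "(true value:", true_theta, ")")
```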

  7. Order Selection for General Expression of Nonlinear Autoregressive Model Based on Multivariate Stepwise Regression

    Science.gov (United States)

    Shi, Jinfei; Zhu, Songqing; Chen, Ruwen

    2017-12-01

    An order selection method based on multiple stepwise regressions is proposed for the General Expression of Nonlinear Autoregressive (GNAR) model, which converts the model order problem into variable selection for a multiple linear regression equation. The partial autocorrelation function is adopted to define the linear terms in the GNAR model. The result is set as the initial model, and the nonlinear terms are then introduced gradually. Statistics are chosen to assess the improvement in model fit contributed by both the newly introduced and the originally existing variables, and these are used to decide which model variables to retain or eliminate. The optimal model is then obtained through measurement of data-fitting quality or significance testing. Simulation and classic time-series experiments show that the proposed method is simple, reliable and applicable to practical engineering.
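
    A minimal sketch of the stepwise idea is shown below: linear lags suggested by the partial autocorrelation function form an initial model, and candidate nonlinear lag terms are then added greedily when they lower an AIC computed from an ordinary least-squares fit. The AIC criterion, the candidate set and the toy series are assumptions rather than the paper's exact statistics.

```python
# Stepwise order/term selection sketch for a nonlinear autoregressive model.
import numpy as np
from statsmodels.tsa.stattools import pacf

rng = np.random.default_rng(7)
T = 600
y = np.zeros(T)
for t in range(2, T):                                  # toy nonlinear AR(2) series
    y[t] = 0.5 * y[t - 1] - 0.3 * y[t - 2] + 0.2 * y[t - 1] ** 2 + 0.1 * rng.normal()

p_max = 6
lags = {f"y(t-{k})": np.roll(y, k) for k in range(1, p_max + 1)}
# Keep linear lags whose partial autocorrelation exceeds a rough 95% band.
signif = np.abs(pacf(y, nlags=p_max)[1:]) > 1.96 / np.sqrt(T)
model_terms = [name for name, keep in zip(lags, signif) if keep]

# Candidate nonlinear terms: pairwise products (including squares) of the lags.
candidates = {f"{a}*{b}": lags[a] * lags[b]
              for i, a in enumerate(lags) for b in list(lags)[i:]}
all_terms = {**lags, **candidates}

def aic(terms):
    """AIC of an OLS fit of y on an intercept plus the named lag terms."""
    X = np.column_stack([np.ones(T - p_max)] + [all_terms[t][p_max:] for t in terms])
    resid = y[p_max:] - X @ np.linalg.lstsq(X, y[p_max:], rcond=None)[0]
    n = len(resid)
    return n * np.log(resid @ resid / n) + 2 * (X.shape[1] + 1)

current = aic(model_terms)
for name in candidates:                                # greedy forward pass
    trial = aic(model_terms + [name])
    if trial < current:
        model_terms, current = model_terms + [name], trial

print("selected terms:", model_terms)
```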

  8. Diathermy vs. scalpel skin incisions in general surgery: double-blind, randomized, clinical trial.

    Science.gov (United States)

    Shamim, Muhammad

    2009-08-01

    This prospective, double-blind, randomized, controlled trial was designed to compare the outcome of diathermy incisions versus scalpel incisions in general surgery. A total of 369 patients who underwent diathermy incision (group A: 185 patients) or scalpel incision (group B: 184 patients) were analyzed. Variables analyzed were: surgical wound classification, length and depth of incision, incision time, duration of operation, incisional blood loss, postoperative pain, duration of hospital stay, duration of healing, and postoperative complications. The inclusion criteria were all patients who underwent elective or emergency general surgery. The only exclusion criteria were cases with incomplete patient data and patients lost to follow-up. This study was conducted at Fatima Hospital-Baqai Medical University and Shamsi Hospital (Karachi), from January 2006 to December 2007. Incision time was significantly longer for patients in group B (p = 0.001). Incisional blood loss was also greater for patients in group B (p = 0.000). Pain perception was markedly reduced during the first 48 h in group A (p = 0.000). Total period of hospital stay (p = 0.129) and time for complete wound healing (p = 0.683) were almost the same for both groups. The postoperative complication rate by wound classification did not differ markedly between the two groups (p = 0.002 vs. p = 0.000). Diathermy incision has significant advantages compared with the scalpel because of reduced incision time, less blood loss, and reduced early postoperative pain.

  9. Improving general flexibility with a mind-body approach: a randomized, controlled trial using neuro emotional Technique®.

    Science.gov (United States)

    Jensen, Anne M; Ramasamy, Adaikalavan; Hall, Michael W

    2012-08-01

    General flexibility is a key component of health, well-being, and general physical conditioning. Reduced flexibility has both physical and mental/emotional etiologies and can lead to musculoskeletal injuries and athletic underperformance. Few studies have tested the effectiveness of a mind-body therapy on general flexibility. The aim of this study was to investigate if Neuro Emotional Technique® (NET), a mind-body technique shown to be effective in reducing stress, can also improve general flexibility. The sit-and-reach test (SR) score was used as a measure of general flexibility. Forty-five healthy participants were recruited from the general population and assessed for their initial SR score before being randomly allocated to receive (a) two 20-minute sessions of NET (experimental group); (b) two 20-minute sessions of stretching instruction (active control group); or (c) no intervention or instruction (passive control group). After intervention, the participants were reassessed in a similar manner by the same blind assessor. The participants also answered questions about demographics, usual water and caffeine consumption, and activity level, and they completed an anxiety/mood psychometric preintervention and postintervention. The mean (SD) change in the SR score was +3.1 cm (2.5) in the NET group, +1.2 cm (2.3) in the active control group and +1.0 cm (2.6) in the passive control group. Although all the 3 groups showed some improvement, the improvement in the NET group was statistically significant when compared with that of either the passive controls (p = 0.015) or the active controls (p = 0.021). This study suggests that NET could provide an effective treatment in improving general flexibility. A larger study is required to confirm these findings and also to assess longer term effectiveness of this therapy on general flexibility.

  10. USING THE GENERAL ELECTRIC / MCKINSEY MATRIX IN THE PROCESS OF SELECTING THE CENTRAL AND EAST EUROPEAN MARKETS

    Directory of Open Access Journals (Sweden)

    Nicolae Răzvan Decuseară

    2013-01-01

    Full Text Available Because its resources are limited, a company cannot serve every potential market in the world in a way that satisfies all clients and achieves its business goals, which is why it should select the most appropriate markets. It can focus on a single product market serving many geographic areas, but may also decide to serve different product markets in a group of selected geographic areas. Because of the large number and diversity of markets to choose from, analysing market attractiveness and selecting the most interesting markets is a complex process. The General Electric/McKinsey matrix has two dimensions, market attractiveness and the competitive strength of the firm, and aims to analyse the company's strengths and weaknesses in a variety of areas, allowing it to identify the most attractive markets and guiding managers in allocating resources to those markets, improving the company's weaker competitive position in emerging markets, or withdrawing the firm from unattractive markets. It is a very efficient tool used by international marketing specialists, on the one hand to select foreign markets for the company and, on the other, to determine the strategy the firm will use to internationalize in those markets. At the end of the paper we present part of a larger study showing how the General Electric/McKinsey matrix is used specifically to select foreign markets.

  11. Uncertain programming models for portfolio selection with uncertain returns

    Science.gov (United States)

    Zhang, Bo; Peng, Jin; Li, Shengguo

    2015-10-01

    In an indeterminate economic environment, experts' knowledge about the returns of securities involves much uncertainty rather than randomness. This paper discusses the portfolio selection problem in an uncertain environment in which security returns cannot be well reflected by historical data but can be evaluated by experts. In the paper, returns of securities are assumed to be given by uncertain variables. According to various decision criteria, the portfolio selection problem in an uncertain environment is formulated as an expected-variance-chance model and a chance-expected-variance model using uncertain programming. Within the framework of uncertainty theory, for the convenience of solving the models, some crisp equivalents are discussed under different conditions. In addition, a hybrid intelligent algorithm is designed to provide a general method for solving the new models in general cases. Finally, two numerical examples are provided to show the performance and applications of the models and algorithm.

  12. Evolving artificial metalloenzymes via random mutagenesis

    Science.gov (United States)

    Yang, Hao; Swartz, Alan M.; Park, Hyun June; Srivastava, Poonam; Ellis-Guardiola, Ken; Upp, David M.; Lee, Gihoon; Belsare, Ketaki; Gu, Yifan; Zhang, Chen; Moellering, Raymond E.; Lewis, Jared C.

    2018-03-01

    Random mutagenesis has the potential to optimize the efficiency and selectivity of protein catalysts without requiring detailed knowledge of protein structure; however, introducing synthetic metal cofactors complicates the expression and screening of enzyme libraries, and activity arising from free cofactor must be eliminated. Here we report an efficient platform to create and screen libraries of artificial metalloenzymes (ArMs) via random mutagenesis, which we use to evolve highly selective dirhodium cyclopropanases. Error-prone PCR and combinatorial codon mutagenesis enabled multiplexed analysis of random mutations, including at sites distal to the putative ArM active site that are difficult to identify using targeted mutagenesis approaches. Variants that exhibited significantly improved selectivity for each of the cyclopropane product enantiomers were identified, and higher activity than previously reported ArM cyclopropanases obtained via targeted mutagenesis was also observed. This improved selectivity carried over to other dirhodium-catalysed transformations, including N-H, S-H and Si-H insertion, demonstrating that ArMs evolved for one reaction can serve as starting points to evolve catalysts for others.

  13. A prospective randomized study comparing percutaneous nephrolithotomy under combined spinal-epidural anesthesia with percutaneous nephrolithotomy under general anesthesia.

    Science.gov (United States)

    Singh, Vishwajeet; Sinha, Rahul Janak; Sankhwar, S N; Malik, Anita

    2011-01-01

    A prospective randomized study was executed to compare the surgical parameters and stone clearance in patients who underwent percutaneous nephrolithotomy (PNL) under combined spinal-epidural anesthesia (CSEA) versus those who underwent PNL under general anesthesia (GA). Between January 2008 and December 2009, 64 patients with renal calculi were randomized into 2 groups and evaluated for the purpose of this study. Group 1 consisted of patients who underwent PNL under CSEA and Group 2 consisted of patients who underwent PNL under GA. The operative time, stone clearance rate, visual pain analog score, mean analgesic dose and mean hospital stay were compared amongst other parameters. The postoperative visual pain analog scores and analgesic requirements differed significantly between the two groups on statistical analysis. PNL under CSEA is as effective and safe as PNL under GA. Patients who undergo PNL under CSEA require a lower analgesic dose and have a shorter hospital stay. Copyright © 2011 S. Karger AG, Basel.

  14. The Relationship between General Health and Internet Addiction

    Directory of Open Access Journals (Sweden)

    Nastiezaie Nasser

    2009-03-01

    Full Text Available Background: The internet is a neutral tool on its own. However, excessive use of the internet poses the risk of addiction. This study was performed to investigate the association between general health and internet addiction. Materials and Methods: In this descriptive study, a total of 375 students (189 female and 186 male) were randomly selected from Sistan and Baluchestan University between 2007 and 2008. Using the Internet Addiction Test (IAT), the students were divided into two groups: ordinary users and addicted users of the internet. The General Health Questionnaire (GHQ) was used to compare these two groups. The SPSS software and t test were used to analyze the data, and P<0.01 was considered significant. Results: The general health of internet-addicted users was at higher risk than that of ordinary users (P<0.01), although the differences between the two groups in general health and social dysfunction subscale scores were not statistically significant. Conclusion: Depression and anxiety were common in internet-addicted users and were correlated with the amount of time they spent online.

  15. On the Wigner law in dilute random matrices

    Science.gov (United States)

    Khorunzhy, A.; Rodgers, G. J.

    1998-12-01

    We consider ensembles of N × N symmetric matrices whose entries are weakly dependent random variables. We show that random dilution can change the limiting eigenvalue distribution of such matrices. We prove that under general and natural conditions the normalised eigenvalue counting function coincides with the semicircle (Wigner) distribution in the limit N → ∞. This can be explained by the observation that dilution (or more generally, random modulation) eliminates the weak dependence (or correlations) between random matrix entries. It also supports our earlier conjecture that the Wigner distribution is stable to random dilution and modulation.
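
    The statement can be checked numerically in a few lines (a sketch; the matrix size N and dilution probability p below are arbitrary choices): a symmetric Gaussian matrix is randomly diluted, renormalised, and its empirical eigenvalue density is compared with the semicircle law.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N, p = 1500, 0.1                          # matrix size and dilution (keep) probability

    # Symmetric Gaussian matrix, then random symmetric dilution and rescaling.
    A = rng.standard_normal((N, N))
    A = (A + A.T) / np.sqrt(2)
    mask = rng.random((N, N)) < p
    mask = np.triu(mask) | np.triu(mask, 1).T         # keep the dilution mask symmetric
    H = np.where(mask, A, 0.0) / np.sqrt(N * p)       # normalisation giving a radius-2 semicircle

    eigs = np.linalg.eigvalsh(H)

    # Compare the empirical density with rho(x) = sqrt(4 - x^2) / (2*pi) on [-2, 2].
    hist, edges = np.histogram(eigs, bins=40, range=(-2.5, 2.5), density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    semicircle = np.sqrt(np.maximum(4 - centers**2, 0.0)) / (2 * np.pi)
    print("max |empirical - semicircle| over bins:",
          round(float(np.max(np.abs(hist - semicircle))), 3))
    ```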

  16. ARCADO - Adding random case analysis to direct observation in workplace-based formative assessment of general practice registrars.

    Science.gov (United States)

    Ingham, Gerard; Fry, Jennifer; Morgan, Simon; Ward, Bernadette

    2015-12-10

    Workplace-based formative assessments using consultation observation are currently conducted during the Australian general practice training program. Assessment reliability is improved by using multiple assessment methods. The aim of this study was to explore experiences of general practice medical educator assessors and registrars (trainees) when adding random case analysis to direct observation (ARCADO) during formative workplace-based assessments. A sample of general practice medical educators and matched registrars were recruited. Following the ARCADO workplace assessment, semi-structured qualitative interviews were conducted. The data was analysed thematically. Ten registrars and eight medical educators participated. Four major themes emerged - formative versus summative assessment; strengths (acceptability, flexibility, time efficiency, complementarity and authenticity); weaknesses (reduced observation and integrity risks); and contextual factors (variation in assessment content, assessment timing, registrar-medical educator relationship, medical educator's approach and registrar ability). ARCADO is a well-accepted workplace-based formative assessment perceived by registrars and assessors to be valid and flexible. The use of ARCADO enabled complementary insights that would not have been achieved with direct observation alone. Whilst there are some contextual factors to be considered in its implementation, ARCADO appears to have utility as formative assessment and, subject to further evaluation, high-stakes assessment.

  17. The mesoscopic conductance of disordered rings, its random matrix theory and the generalized variable range hopping picture

    International Nuclear Information System (INIS)

    Stotland, Alexander; Peer, Tal; Cohen, Doron; Budoyo, Rangga; Kottos, Tsampikos

    2008-01-01

    The calculation of the conductance of disordered rings requires a theory that goes beyond the Kubo-Drude formulation. Assuming 'mesoscopic' circumstances the analysis of the electro-driven transitions shows similarities with a percolation problem in energy space. We argue that the texture and the sparsity of the perturbation matrix dictate the value of the conductance, and study its dependence on the disorder strength, ranging from the ballistic to the Anderson localization regime. An improved sparse random matrix model is introduced to capture the essential ingredients of the problem, and leads to a generalized variable range hopping picture. (fast track communication)

  18. Self-reported oral and general health in relation to socioeconomic position

    OpenAIRE

    Hakeberg, Magnus; Wide Boman, Ulla

    2017-01-01

    Abstract Background During the past two decades, several scientific publications from different countries have shown how oral health in the population varies with social determinants. The aim of the present study was to explore the relationship between self-reported oral and general health in relation to different measures of socioeconomic position. Methods Data were collected from a randomly selected sample of the adult population in Sweden (n = 3500, mean age 53.4 years, 53.1% women). The r...

  19. Rationale and study design of PROVHILO - a worldwide multicenter randomized controlled trial on protective ventilation during general anesthesia for open abdominal surgery

    Directory of Open Access Journals (Sweden)

    Hedenstierna Göran

    2011-05-01

    Full Text Available Abstract Background Post-operative pulmonary complications add to the morbidity and mortality of surgical patients, in particular after general anesthesia >2 hours for abdominal surgery. Whether a protective mechanical ventilation strategy with higher levels of positive end-expiratory pressure (PEEP) and repeated recruitment maneuvers (the "open lung strategy") protects against post-operative pulmonary complications is uncertain. The present study aims at comparing a protective mechanical ventilation strategy with a conventional mechanical ventilation strategy during general anesthesia for abdominal non-laparoscopic surgery. Methods The PROtective Ventilation using HIgh versus LOw positive end-expiratory pressure ("PROVHILO") trial is a worldwide investigator-initiated multicenter randomized controlled two-arm study. Nine hundred patients scheduled for non-laparoscopic abdominal surgery at high or intermediate risk for post-operative pulmonary complications are randomized to mechanical ventilation with the level of PEEP at 12 cmH2O with recruitment maneuvers (the lung-protective strategy) or mechanical ventilation with the level of PEEP at maximum 2 cmH2O without recruitment maneuvers (the conventional strategy). The primary endpoint is any post-operative pulmonary complication. Discussion The PROVHILO trial is the first randomized controlled trial powered to investigate whether an open lung mechanical ventilation strategy in short-term mechanical ventilation protects against postoperative pulmonary complications. Trial registration ISRCTN: ISRCTN70332574

  20. The effectiveness of Cognitive Behavioral Therapy (CBT) with general exercises versus general exercises alone in the management of chronic low back pain.

    Science.gov (United States)

    Khan, Muhammad; Akhter, Saeed; Soomro, Rabail Rani; Ali, Syed Shahzad

    2014-07-01

    To evaluate the effectiveness of Cognitive Behavioural Therapy (CBT) along with general exercises versus general exercises alone in chronic low back pain. A total of 54 patients with chronic low back pain who fulfilled the inclusion criteria were recruited from the Physiotherapy Department of Alain Poly Clinic, Karachi, and the Institute of Physical Medicine & Rehabilitation, Dow University of Health Sciences, Karachi. Selected patients were equally divided and randomly assigned into two groups with a simple randomisation method. The CBT and general exercises group received an operant model of CBT plus general exercises, whereas the general exercises group received general exercises only. Both groups received a home exercise program as well. Patients in both groups received 3 treatment sessions per week for 12 consecutive weeks. Clinical assessment was performed using the Visual Analogue Scale (VAS) and the Roland Morris Disability Questionnaire at baseline and after 12 weeks. Both study groups showed statistically significant improvements in both outcome measures (p=0.000). However, mean improvements in post-intervention VAS score and Roland Morris score were better in the CBT and exercises group than in the general exercises group. In conclusion, both interventions are effective in treating chronic low back pain; however, CBT with general exercises is clinically more effective than general exercises alone.

  1. Perceived competence and attitudes towards patients with suicidal behaviour: a survey of general practitioners, psychiatrists and internists

    OpenAIRE

    Grimholt, Tine K; Haavet, Ole R; Jacobsen, Dag; Sandvik, Leiv; Ekeberg, Oivind

    2014-01-01

    Background Competence and attitudes to suicidal behaviour among physicians are important to provide high-quality care for a large patient group. The aim was to study different physicians’ attitudes towards suicidal behaviour and their perceived competence to care for suicidal patients. Methods A random selection (n = 750) of all registered General Practitioners, Psychiatrists and Internists in Norway ...

  2. Internet treatment for generalized anxiety disorder: a randomized controlled trial comparing clinician vs. technician assistance.

    Science.gov (United States)

    Robinson, Emma; Titov, Nickolai; Andrews, Gavin; McIntyre, Karen; Schwencke, Genevieve; Solley, Karen

    2010-06-03

    Internet-based cognitive behavioural therapy (iCBT) for generalized anxiety disorder (GAD) has been shown to be effective when guided by a clinician. The present study sought to replicate this finding, and to determine whether support from a technician is as effective as guidance from a clinician. Randomized controlled non-inferiority trial comparing three groups: clinician-assisted vs. technician-assisted vs. delayed treatment. Community-based volunteers applied to the VirtualClinic (www.virtualclinic.org.au) research program, and 150 participants with GAD were randomized. Participants in the clinician- and technician-assisted groups received access to an iCBT program for GAD comprising six online lessons, weekly homework assignments, and weekly supportive contact over a treatment period of 10 weeks. Participants in the clinician-assisted group also received access to a moderated online discussion forum. The main outcome measures were the Penn State Worry Questionnaire (PSWQ) and the Generalized Anxiety Disorder-7 Item (GAD-7). Completion rates were high, and both treatment groups reduced scores on the PSWQ and the GAD-7; effect sizes on the GAD-7 were 1.55 and 1.73 in the clinician- and technician-assisted groups, respectively. At 3-month follow-up, participants in both treatment groups had sustained the gains made at post-treatment. Participants in the clinician-assisted group had made further gains on the PSWQ. Approximately 81 minutes of clinician time and 75 minutes of technician time were required per participant during the 10-week treatment program. Both clinician- and technician-assisted treatment resulted in large effect sizes and clinically significant improvements comparable to those associated with face-to-face treatment, while a delayed treatment/control group did not improve. These results provide support for large-scale trials to determine the clinical effectiveness and acceptability of technician-assisted iCBT programs for GAD. This form of treatment has potential to increase the

  3. Post-model selection inference and model averaging

    Directory of Open Access Journals (Sweden)

    Georges Nguefack-Tsague

    2011-07-01

    Full Text Available Although model selection is routinely used in practice nowadays, little is known about its precise effects on any subsequent inference that is carried out. The same goes for the effects induced by the closely related technique of model averaging. This paper is concerned with the use of the same data first to select a model and then to carry out inference, in particular point estimation and point prediction. The properties of the resulting estimator, called a post-model-selection estimator (PMSE), are hard to derive. Using selection criteria such as hypothesis testing, AIC, BIC, HQ and Cp, we illustrate that, in terms of risk function, no single PMSE dominates the others. The same conclusion holds more generally for any penalised likelihood information criterion. We also compare various model averaging schemes and show that no single one dominates the others in terms of risk function. Since PMSEs can be regarded as a special case of model averaging, with 0-1 random weights, we propose a connection between the two theories, in the frequentist approach, by taking account of the selection procedure when performing model averaging. We illustrate the point by simulating a simple linear regression model.
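
    The effect being described can be seen in a small simulation (a sketch, not the paper's setup: AIC-based selection between two nested linear models, followed by estimation of the same coefficient, compared with Akaike-weight model averaging; the data-generating values are invented).

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n, beta1, beta2, reps = 50, 1.0, 0.25, 2000     # weak second effect: selection matters

    def fit(y, X):
        """Return (AIC, estimate of the coefficient on x1) for a linear model."""
        b, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = float(np.sum((y - X @ b) ** 2))
        return n * np.log(rss / n) + 2 * X.shape[1], b[1]

    pms_est, avg_est = [], []
    for _ in range(reps):
        x1, x2 = rng.standard_normal(n), rng.standard_normal(n)
        y = beta1 * x1 + beta2 * x2 + rng.standard_normal(n)
        X_small = np.column_stack([np.ones(n), x1])
        X_full = np.column_stack([np.ones(n), x1, x2])
        aic_s, b_s = fit(y, X_small)
        aic_f, b_f = fit(y, X_full)
        pms_est.append(b_s if aic_s < aic_f else b_f)          # post-model-selection estimator
        w_s = np.exp(-0.5 * (aic_s - min(aic_s, aic_f)))       # Akaike weights
        w_f = np.exp(-0.5 * (aic_f - min(aic_s, aic_f)))
        avg_est.append((w_s * b_s + w_f * b_f) / (w_s + w_f))  # model-averaged estimator

    for name, est in [("post-selection", pms_est), ("model averaging", avg_est)]:
        est = np.asarray(est)
        print(f"{name:16s} bias = {est.mean() - beta1:+.4f}"
              f"   RMSE = {np.sqrt(np.mean((est - beta1) ** 2)):.4f}")
    ```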

  4. Assessment of Interior General and Local Lighting in Carpet Weaving Workshops in Bijar City

    OpenAIRE

    Rostam Golmohamadi; Homeira Alizadeh; Motamedzade Majid; Soltanian Alireza

    2014-01-01

    Background & Objectives: Comfortable lighting in the workplace supports employees' visual health, which can improve safety and visual comfort and enhance performance and product quality. The present study was conducted to evaluate general and local lighting in carpet weaving workshops in Bijar city. Methods: In this descriptive analytical study, 101 carpet weaving workshops were randomly selected. Illuminance was measured based on the models and formulas presented in Illuminating Engin...

  5. Some common random fixed point theorems for contractive type conditions in cone random metric spaces

    Directory of Open Access Journals (Sweden)

    Saluja Gurucharan S.

    2016-08-01

    Full Text Available In this paper, we establish some common random fixed point theorems for contractive type conditions in the setting of cone random metric spaces. Our results unify, extend and generalize many known results from the current existing literature.

  6. Random distance distribution for spherical objects: general theory and applications to physics

    International Nuclear Information System (INIS)

    Tu Shuju; Fischbach, Ephraim

    2002-01-01

    A formalism is presented for analytically obtaining the probability density function, P_n(s), for the random distance s between two random points in an n-dimensional spherical object of radius R. Our formalism allows P_n(s) to be calculated for a spherical n-ball having an arbitrary volume density, and reproduces the well-known results for the case of uniform density. The results find applications in geometric probability, computational science, molecular biological systems, statistical physics, astrophysics, condensed matter physics, nuclear physics and elementary particle physics. As one application of these results, we propose a new statistical method derived from our formalism to study random number generators used in Monte Carlo simulations. (author)
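
    For the three-dimensional uniform-density case, the result is easy to verify by simulation (a sketch; the analytic expression used for comparison is the standard uniform-density P_3(s) that the formalism reproduces).

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    R, n_pairs = 1.0, 200000

    def uniform_in_ball(n):
        """Uniform points in a 3-ball: random direction times radius R * U**(1/3)."""
        v = rng.standard_normal((n, 3))
        v /= np.linalg.norm(v, axis=1, keepdims=True)
        return v * (R * rng.random((n, 1)) ** (1 / 3))

    s = np.linalg.norm(uniform_in_ball(n_pairs) - uniform_in_ball(n_pairs), axis=1)

    def p3(s):
        """Analytic P_3(s) for uniform density in a 3-ball of radius R, 0 <= s <= 2R."""
        return 3 * s**2 / R**3 - 9 * s**3 / (4 * R**4) + 3 * s**5 / (16 * R**6)

    hist, edges = np.histogram(s, bins=30, range=(0, 2 * R), density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    print("max |Monte Carlo - analytic|:", round(float(np.max(np.abs(hist - p3(centers)))), 3))
    ```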

  7. Lifestyle intervention in general practice for physical activity, smoking, alcohol consumption and diet in elderly: a randomized controlled trial.

    Science.gov (United States)

    Vrdoljak, Davorka; Marković, Biserka Bergman; Puljak, Livia; Lalić, Dragica Ivezić; Kranjčević, Ksenija; Vučak, Jasna

    2014-01-01

    The purpose of the study was to compare the effectiveness of a programmed and intensified intervention on lifestyle changes, including physical activity, cigarette smoking, alcohol consumption and diet, in patients aged ≥ 65 with the usual care of general practitioners (GP). In this multicenter randomized controlled trial, 738 patients aged ≥ 65 were randomly assigned to receive intensified intervention (N = 371) or usual GP care (N = 367) for lifestyle changes, with 18-month follow-up. The main outcome measures were physical activity, smoking, alcohol consumption and diet. The study was conducted in 59 general practices in Croatia between May 2008 and May 2010. The patients' mean age was 72.3 ± 5.2 years. Significant diet correction was achieved after the 18-month follow-up in the intervention group compared with controls: more patients strictly followed the Mediterranean diet and consumed healthy foods more frequently. There was no significant difference between the groups in physical activity, tobacco smoking or alcohol consumption after the intervention. In conclusion, an 18-month intensified GP intervention had a limited effect on lifestyle habits. The GP intervention managed to change dietary habits in the elderly population, which is encouraging since the elderly population is very resistant to lifestyle habit changes. Clinical trial registration number: ISRCTN31857696. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  8. Semi-structured interview is a reliable and feasible tool for selection of doctors for general practice specialist training.

    Science.gov (United States)

    Isaksen, Jesper Hesselbjerg; Hertel, Niels Thomas; Kjær, Niels Kristian

    2013-09-01

    In order to optimise the selection process for admission to specialist training in family medicine, we developed a new design for structured applications and selection interviews. The design contains semi-structured interviews, which combine individualised elements from the applications with standardised behaviour-based questions. This paper describes the design of the tool and offers reflections concerning its acceptability, reliability and feasibility. We used a combined quantitative and qualitative evaluation method. Ratings obtained by the applicants in two selection rounds were analysed for reliability and generalisability using the GENOVA programme. Applicants and assessors were randomly selected for individual semi-structured in-depth interviews. The qualitative data were analysed in accordance with the grounded theory method. Quantitative analysis yielded a high Cronbach's alpha of 0.97 for the first round and 0.90 for the second round, and a G coefficient of 0.74 for the first round and 0.40 for the second round. Qualitative analysis demonstrated high acceptability and fairness, and the tool improved the assessors' judgment. Applicants reported concerns about loss of personality and some anxiety. The applicants' ability to reflect on their competences was important. The developed selection tool demonstrated an acceptable level of reliability, but only moderate generalisability. The users found that the tool provided a high degree of acceptability; it is a feasible and useful tool for selection of doctors for specialist training if combined with work-based assessment. Studies on the benefits and drawbacks of this tool compared with other selection models are relevant.
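
    For readers who want to reproduce the internal-consistency figure from their own rating sheets, Cronbach's alpha is a short calculation over an applicant-by-criterion matrix (a sketch with made-up ratings; the study's actual data are not shown here).

    ```python
    import numpy as np

    def cronbach_alpha(ratings):
        """ratings: 2-D array, rows = applicants, columns = rated items/criteria."""
        ratings = np.asarray(ratings, dtype=float)
        k = ratings.shape[1]
        item_vars = ratings.var(axis=0, ddof=1)
        total_var = ratings.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars.sum() / total_var)

    # Illustrative matrix: 6 applicants rated on 4 interview criteria (made-up scores).
    scores = [[4, 5, 4, 5], [2, 3, 2, 2], [5, 5, 4, 5],
              [3, 3, 3, 4], [1, 2, 2, 1], [4, 4, 5, 4]]
    print("Cronbach's alpha:", round(cronbach_alpha(scores), 2))
    ```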

  9. Robust inference in sample selection models

    KAUST Repository

    Zhelonkin, Mikhail; Genton, Marc G.; Ronchetti, Elvezio

    2015-01-01

    The problem of non-random sample selectivity often occurs in practice in many fields. The classical estimators introduced by Heckman are the backbone of the standard statistical analysis of these models. However, these estimators are very sensitive to small deviations from the distributional assumptions which are often not satisfied in practice. We develop a general framework to study the robustness properties of estimators and tests in sample selection models. We derive the influence function and the change-of-variance function of Heckman's two-stage estimator, and we demonstrate the non-robustness of this estimator and its estimated variance to small deviations from the model assumed. We propose a procedure for robustifying the estimator, prove its asymptotic normality and give its asymptotic variance. Both cases with and without an exclusion restriction are covered. This allows us to construct a simple robust alternative to the sample selection bias test. We illustrate the use of our new methodology in an analysis of ambulatory expenditures and we compare the performance of the classical and robust methods in a Monte Carlo simulation study.

  10. Robust inference in sample selection models

    KAUST Repository

    Zhelonkin, Mikhail

    2015-11-20

    The problem of non-random sample selectivity often occurs in practice in many fields. The classical estimators introduced by Heckman are the backbone of the standard statistical analysis of these models. However, these estimators are very sensitive to small deviations from the distributional assumptions which are often not satisfied in practice. We develop a general framework to study the robustness properties of estimators and tests in sample selection models. We derive the influence function and the change-of-variance function of Heckman's two-stage estimator, and we demonstrate the non-robustness of this estimator and its estimated variance to small deviations from the model assumed. We propose a procedure for robustifying the estimator, prove its asymptotic normality and give its asymptotic variance. Both cases with and without an exclusion restriction are covered. This allows us to construct a simple robust alternative to the sample selection bias test. We illustrate the use of our new methodology in an analysis of ambulatory expenditures and we compare the performance of the classical and robust methods in a Monte Carlo simulation study.
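
    For orientation, a bare-bones classical Heckman two-step fit on simulated data looks as follows (a Python sketch of the non-robust estimator being discussed, using statsmodels; it is not the robustified procedure proposed in the paper, and the simulated design is invented).

    ```python
    import numpy as np
    from scipy.stats import norm
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    n = 5000

    # Simulated sample selection model: the outcome is observed only when selected.
    z = rng.standard_normal(n)                        # exclusion restriction (selection only)
    x = rng.standard_normal(n)
    u = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=n)   # correlated errors
    selected = 0.5 + 1.0 * z + u[:, 0] > 0
    y = 1.0 + 2.0 * x + u[:, 1]                       # true outcome equation (slope 2)
    y_obs, x_obs = y[selected], x[selected]

    # Step 1: probit of selection on (x, z), then the inverse Mills ratio for selected cases.
    W = sm.add_constant(np.column_stack([x, z]))
    probit = sm.Probit(selected.astype(int), W).fit(disp=0)
    xb = W[selected] @ probit.params
    mills = norm.pdf(xb) / norm.cdf(xb)

    # Step 2: OLS of the observed outcome on x plus the inverse Mills ratio.
    X2 = sm.add_constant(np.column_stack([x_obs, mills]))
    print(sm.OLS(y_obs, X2).fit().params)   # intercept, slope on x, coefficient on the Mills ratio
    ```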

  11. Inference for feature selection using the Lasso with high-dimensional data

    DEFF Research Database (Denmark)

    Brink-Jensen, Kasper; Ekstrøm, Claus Thorn

    2014-01-01

    Penalized regression models such as the Lasso have proved useful for variable selection in many fields - especially for situations with high-dimensional data where the number of predictors far exceeds the number of observations. These methods identify and rank variables of importance but do not generally provide any inference for the selected variables. Thus, the variables selected might be the "most important" but need not be significant. We propose a significance test for the selection found by the Lasso. We introduce a procedure that computes inference and p-values for features chosen by the Lasso. This method rephrases the null hypothesis and uses a randomization approach which ensures that the error rate is controlled even for small samples. We demonstrate the ability of the algorithm to compute p-values of the expected magnitude with simulated data using a multitude of scenarios...
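
    The flavour of the approach can be conveyed with a toy version (a sketch only: it calibrates the Lasso-selected coefficients against a naive permutation of the response, which is a stand-in for, not a reproduction of, the rephrased null hypothesis and randomization scheme of the paper).

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(5)
    n, p_dim, alpha, n_perm = 80, 200, 0.1, 500       # high-dimensional: p >> n

    X = rng.standard_normal((n, p_dim))
    y = 1.5 * X[:, 0] - 1.0 * X[:, 1] + rng.standard_normal(n)   # only two true signals

    coef = Lasso(alpha=alpha, max_iter=10000).fit(X, y).coef_
    selected = np.flatnonzero(coef)

    # Null distribution: refit on permuted responses, record the largest |coefficient|.
    null_max = np.empty(n_perm)
    for b in range(n_perm):
        coef_b = Lasso(alpha=alpha, max_iter=10000).fit(X, rng.permutation(y)).coef_
        null_max[b] = np.max(np.abs(coef_b))

    for j in selected:
        p_val = (1 + np.sum(null_max >= abs(coef[j]))) / (1 + n_perm)
        print(f"feature {j:3d}: coef = {coef[j]:+.3f}, permutation p = {p_val:.3f}")
    ```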

  12. Parental Divorce and Generalized Trust

    OpenAIRE

    Viitanen, Tarja

    2011-01-01

    This paper examines the effect of parental divorce during childhood on generalized trust later on in life using Australian HILDA panel data. The dependent variable is composed of answers to the statement: “Generally speaking, most people can be trusted”. The main explanatory variables include the occurrence of parental divorce for the whole sample and the age at which parents divorced for the sub-sample. The analysis is conducted using random effects ordered probit, correlated random effects ...

  13. Selective serotonin reuptake inhibitors (SSRIs) for post-partum depression (PPD): a systematic review of randomized clinical trials.

    Science.gov (United States)

    De Crescenzo, Franco; Perelli, Federica; Armando, Marco; Vicari, Stefano

    2014-01-01

    The treatment of postpartum depression with selective serotonin reuptake inhibitors (SSRIs) has been claimed to be both efficacious and well tolerated, but no recent systematic reviews have been conducted. A qualitative systematic review of randomized clinical trials on women with postpartum depression comparing SSRIs to placebo and/or other treatments was performed. A comprehensive literature search of online databases, the bibliographies of published articles and grey literature were conducted. Data on efficacy, acceptability and tolerability were extracted and the quality of the trials was assessed. Six randomised clinical trials, comprising 595 patients, met quality criteria for inclusion in the analysis. Cognitive-behavioural intervention, psychosocial community-based intervention, psychodynamic therapy, cognitive behavioural therapy, a second-generation tricyclic antidepressant and placebo were used as comparisons. All studies demonstrated higher response and remission rates among those treated with SSRIs and greater mean changes on depression scales, although findings were not always statistically significant. Dropout rates were high in three of the trials but similar among treatment and comparison groups. In general, SSRIs were well tolerated and trial quality was good. There are few trials, patients included in the trials were not representative of all patients with postpartum depression, dropout rates in three trials were high, and long-term efficacy and tolerability were assessed in only two trials. SSRIs appear to be efficacious and well tolerated in the treatment of postpartum depression, but the available evidence fails to demonstrate a clear superiority over other treatments. © 2013 Elsevier B.V. All rights reserved.

  14. Deception, efficiency, and random groups - Psychology and the gradual origination of the random group design

    NARCIS (Netherlands)

    Dehue, T

    1997-01-01

    In the life sciences, psychology, and large parts of the other social sciences, the ideal experiment is a comparative experiment with randomly composed experimental and control groups. Historians and practitioners of these sciences generally attribute the invention of this "random group design" to

  15. Optimization of refueling-shuffling scheme in PWR core by random search strategy

    International Nuclear Information System (INIS)

    Wu Yuan

    1991-11-01

    A random method for simulating optimization of refueling management in a pressurized water reactor (PWR) core is described. The main purpose of the optimization was to select the 'best' refueling arrangement scheme which would produce maximum economic benefits under certain imposed conditions. To fulfill this goal, an effective optimization strategy, the two-stage random search method, was developed. First, the search is made in a manner similar to the stratified sampling technique, and a local optimum can be reached by comparison of the successive results. Then further random trials are carried out across the different strata to try to find the global optimum. In general, the method can be used as a practical tool for conventional fuel management schemes. However, it can also be used in studies on optimization of low-leakage fuel management. Some calculations were done for a typical PWR core on a CYBER-180/830 computer. The results show that the proposed method can obtain a satisfactory approximation at reasonably low computational cost.
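
    The two-stage idea can be sketched on a generic black-box assignment problem (illustrative only: the toy objective, the strata and the search budgets below stand in for the core-physics evaluation used in the report).

    ```python
    import random

    random.seed(6)
    N_POS = 12                                   # positions in a toy loading pattern

    def score(pattern):
        """Stand-in for the core-physics figure of merit (higher is better)."""
        return -sum((pattern[i] - i) ** 2 for i in range(N_POS)) + 3 * pattern[0]

    def random_in_stratum(stratum):
        """Stage 1 sampling: only patterns that keep assembly `stratum` in position 0."""
        rest = [k for k in range(N_POS) if k != stratum]
        random.shuffle(rest)
        return [stratum] + rest

    # Stage 1: stratified random search, one local optimum per stratum.
    local_best = {s: max((random_in_stratum(s) for _ in range(200)), key=score)
                  for s in range(N_POS)}

    # Stage 2: random trials across strata, starting from the best local optimum.
    best = max(local_best.values(), key=score)
    for _ in range(5000):
        trial = best[:]
        i, j = random.sample(range(N_POS), 2)    # random pairwise exchange
        trial[i], trial[j] = trial[j], trial[i]
        if score(trial) > score(best):
            best = trial

    print("best pattern:", best, "score:", score(best))
    ```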

  16. Bose condensation in (random) traps

    Directory of Open Access Journals (Sweden)

    V.A. Zagrebnov

    2009-01-01

    Full Text Available We study a non-interacting (perfect) Bose gas in random external potentials (traps). It is shown that a generalized Bose-Einstein condensation in the random eigenstates manifests if and only if the same occurs in the one-particle kinetic-energy eigenstates, which corresponds to the generalized condensation of the free Bose gas. Moreover, we prove that the amounts of both condensate densities are equal. This statement is relevant for justification of the Bogoliubov approximation in the theory of disordered boson systems.

  17. Comparison of confirmed inactive and randomly selected compounds as negative training examples in support vector machine-based virtual screening.

    Science.gov (United States)

    Heikamp, Kathrin; Bajorath, Jürgen

    2013-07-22

    The choice of negative training data for machine learning is a little explored issue in chemoinformatics. In this study, the influence of alternative sets of negative training data and different background databases on support vector machine (SVM) modeling and virtual screening has been investigated. Target-directed SVM models have been derived on the basis of differently composed training sets containing confirmed inactive molecules or randomly selected database compounds as negative training instances. These models were then applied to search background databases consisting of biological screening data or randomly assembled compounds for available hits. Negative training data were found to systematically influence compound recall in virtual screening. In addition, different background databases had a strong influence on the search results. Our findings also indicated that typical benchmark settings lead to an overestimation of SVM-based virtual screening performance compared to search conditions that are more relevant for practical applications.
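
    The influence of the negative set can be reproduced qualitatively on synthetic descriptors (a sketch; the Gaussian clouds standing in for actives, confirmed inactives and random database compounds, and all settings, are invented).

    ```python
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(7)
    d = 20   # descriptor dimension

    actives    = rng.normal(2.0, 1.0, (200, d))   # known actives for the target
    inactives  = rng.normal(1.0, 1.0, (200, d))   # confirmed inactives: close to the actives
    decoys     = rng.normal(0.0, 1.5, (200, d))   # randomly selected database compounds
    screen_act = rng.normal(2.0, 1.0, (100, d))   # held-out actives hidden in the screen
    screen_bg  = rng.normal(0.0, 1.5, (2000, d))  # screening background

    def recall(negatives):
        X = np.vstack([actives, negatives])
        y = np.r_[np.ones(len(actives)), np.zeros(len(negatives))]
        clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, y)
        db = np.vstack([screen_act, screen_bg])
        top = np.argsort(-clf.decision_function(db))[:100]         # top 100 of the ranked screen
        return np.sum(top < len(screen_act)) / len(screen_act)     # fraction of hidden actives found

    print("recall with confirmed inactives as negatives:", recall(inactives))
    print("recall with random compounds as negatives:   ", recall(decoys))
    ```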

  18. Variable Selection in Time Series Forecasting Using Random Forests

    Directory of Open Access Journals (Sweden)

    Hristos Tyralis

    2017-10-01

    Full Text Available Time series forecasting using machine learning algorithms has gained popularity recently. Random forest is a machine learning algorithm implemented in time series forecasting; however, most of its forecasting properties have remained unexplored. Here we focus on assessing the performance of random forests in one-step forecasting using two large datasets of short time series with the aim of suggesting an optimal set of predictor variables. Furthermore, we compare its performance to benchmarking methods. The first dataset is composed of 16,000 simulated time series from a variety of Autoregressive Fractionally Integrated Moving Average (ARFIMA) models. The second dataset consists of 135 mean annual temperature time series. The highest predictive performance of RF is observed when using a low number of recent lagged predictor variables. This outcome could be useful in relevant future applications, with the prospect of achieving higher predictive accuracy.
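
    A minimal version of the one-step-ahead setup looks like this (a sketch: a random forest is fed a small number of recent lagged values of a single simulated series; the series, the lag count and the hyperparameters are illustrative, not those of the paper).

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(8)

    # Simulated AR(1)-like series standing in for one short time series.
    n, phi = 300, 0.7
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()

    def lagged_matrix(series, n_lags):
        """Rows: [x_{t-1}, ..., x_{t-n_lags}]; target: x_t."""
        X = np.column_stack([series[n_lags - k - 1: len(series) - k - 1] for k in range(n_lags)])
        return X, series[n_lags:]

    n_lags = 3                          # a low number of recent lags, as the paper suggests
    X, y = lagged_matrix(x, n_lags)
    X_train, y_train, X_test, y_test = X[:-50], y[:-50], X[-50:], y[-50:]

    rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_train, y_train)
    rmse = float(np.sqrt(np.mean((rf.predict(X_test) - y_test) ** 2)))
    print("one-step-ahead RMSE:", round(rmse, 3))
    ```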

  19. The genetic consequences of selection in natural populations.

    Science.gov (United States)

    Thurman, Timothy J; Barrett, Rowan D H

    2016-04-01

    The selection coefficient, s, quantifies the strength of selection acting on a genetic variant. Despite this parameter's central importance to population genetic models, until recently we have known relatively little about the value of s in natural populations. With the development of molecular genetic techniques in the late 20th century and the sequencing technologies that followed, biologists are now able to identify genetic variants and directly relate them to organismal fitness. We reviewed the literature for published estimates of natural selection acting at the genetic level and found over 3000 estimates of selection coefficients from 79 studies. Selection coefficients were roughly exponentially distributed, suggesting that the impact of selection at the genetic level is generally weak but can occasionally be quite strong. We used both nonparametric statistics and formal random-effects meta-analysis to determine how selection varies across biological and methodological categories. Selection was stronger when measured over shorter timescales, with the mean magnitude of s greatest for studies that measured selection within a single generation. Our analyses found conflicting trends when considering how selection varies with the genetic scale (e.g., SNPs or haplotypes) at which it is measured, suggesting a need for further research. Besides these quantitative conclusions, we highlight key issues in the calculation, interpretation, and reporting of selection coefficients and provide recommendations for future research. © 2016 John Wiley & Sons Ltd.
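
    The random-effects summary used in this kind of analysis can be written down compactly (a sketch of the standard DerSimonian-Laird estimator applied to made-up selection-coefficient estimates; the paper's data and full model are richer).

    ```python
    import numpy as np

    def dersimonian_laird(estimates, std_errors):
        """Random-effects pooled estimate of a set of effect sizes (e.g. selection coefficients)."""
        y, v = np.asarray(estimates, float), np.asarray(std_errors, float) ** 2
        w = 1.0 / v                                      # fixed-effect (inverse-variance) weights
        y_fixed = np.sum(w * y) / np.sum(w)
        Q = np.sum(w * (y - y_fixed) ** 2)               # Cochran's heterogeneity statistic
        tau2 = max(0.0, (Q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
        w_re = 1.0 / (v + tau2)                          # random-effects weights
        pooled = np.sum(w_re * y) / np.sum(w_re)
        return pooled, np.sqrt(1.0 / np.sum(w_re)), tau2

    # Invented |s| estimates and standard errors from hypothetical studies.
    s_hat = [0.02, 0.15, 0.08, 0.31, 0.05]
    se    = [0.01, 0.06, 0.04, 0.12, 0.03]
    pooled, se_pooled, tau2 = dersimonian_laird(s_hat, se)
    print(f"pooled s = {pooled:.3f} (SE {se_pooled:.3f}), between-study variance tau^2 = {tau2:.4f}")
    ```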

  20. Testing concordance of instrumental variable effects in generalized linear models with application to Mendelian randomization

    Science.gov (United States)

    Dai, James Y.; Chan, Kwun Chuen Gary; Hsu, Li

    2014-01-01

    Instrumental variable regression is one way to overcome unmeasured confounding and estimate causal effects in observational studies. Building on structural mean models, considerable work has recently been done on consistent estimation of the causal relative risk and causal odds ratio. Such models can sometimes suffer from identification issues for weak instruments. This has hampered the applicability of Mendelian randomization analysis in genetic epidemiology. When there are multiple genetic variants available as instrumental variables, and the causal effect is defined in a generalized linear model in the presence of unmeasured confounders, we propose to test concordance between instrumental variable effects on the intermediate exposure and instrumental variable effects on the disease outcome, as a means to test the causal effect. We show that a class of generalized least squares estimators provide valid and consistent tests of causality. For the causal effect of a continuous exposure on a dichotomous outcome in logistic models, the proposed estimators are shown to be asymptotically conservative. When the disease outcome is rare, such estimators are consistent due to the log-linear approximation of the logistic function. Optimality of such estimators relative to the well-known two-stage least squares estimator and the double-logistic structural mean model is further discussed. PMID:24863158
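
    A stripped-down numerical illustration of the idea (per-variant outcome associations regressed on per-variant exposure associations) is given below; it is a sketch using a simple inverse-variance-weighted regression on simulated data, a stand-in for, not an implementation of, the generalized least squares estimators developed in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    n, m, beta = 20000, 10, 0.3                      # samples, genetic variants, causal effect

    G = rng.binomial(2, 0.3, (n, m)).astype(float)   # independent SNP genotypes (0/1/2)
    alpha = rng.uniform(0.05, 0.2, m)                # per-variant effects on the exposure
    U = rng.standard_normal(n)                       # unmeasured confounder
    X = G @ alpha + U + rng.standard_normal(n)       # exposure
    Y = beta * X - U + rng.standard_normal(n)        # outcome, confounded by U

    def per_variant(trait):
        """Simple-regression effect and variance of each variant on a trait."""
        b, v = np.empty(m), np.empty(m)
        for j in range(m):
            g = G[:, j] - G[:, j].mean()
            b[j] = g @ (trait - trait.mean()) / (g @ g)
            resid = (trait - trait.mean()) - b[j] * g
            v[j] = (resid @ resid) / (n - 2) / (g @ g)
        return b, v

    bx, _ = per_variant(X)
    by, vy = per_variant(Y)

    # Concordance: outcome effects should be proportional to exposure effects, slope = beta.
    w = 1.0 / vy
    beta_iv = np.sum(w * bx * by) / np.sum(w * bx**2)    # weighted regression through the origin
    Xc, Yc = X - X.mean(), Y - Y.mean()
    print("naive OLS of Y on X      :", round(float(Xc @ Yc / (Xc @ Xc)), 3))   # confounded
    print("IV (concordance) estimate:", round(float(beta_iv), 3))               # close to 0.3
    ```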

  1. Influence of Head Teachers' General and Instructional Supervisory Practices on Teachers' Work Performance in Secondary Schools in Entebbe Municipality, Wakiso District, Uganda

    Science.gov (United States)

    Jared, Nzabonimpa Buregeya

    2011-01-01

    The study examined the Influence of Secondary School Head Teachers' General and Instructional Supervisory Practices on Teachers' Work Performance. Quantitative and qualitative methods with a descriptive-correlational research approach were used in the study. Purposive sampling technique alongside random sampling technique was used to select the…

  2. The Long-Term Effectiveness of a Selective, Personality-Targeted Prevention Program in Reducing Alcohol Use and Related Harms: A Cluster Randomized Controlled Trial

    Science.gov (United States)

    Newton, Nicola C.; Conrod, Patricia J.; Slade, Tim; Carragher, Natacha; Champion, Katrina E.; Barrett, Emma L.; Kelly, Erin V.; Nair, Natasha K.; Stapinski, Lexine; Teesson, Maree

    2016-01-01

    Background: This study investigated the long-term effectiveness of Preventure, a selective personality-targeted prevention program, in reducing the uptake of alcohol, harmful use of alcohol, and alcohol-related harms over a 3-year period. Methods: A cluster randomized controlled trial was conducted to assess the effectiveness of Preventure.…

  3. Survival in individuals with severe alpha 1-antitrypsin deficiency (PiZZ) in comparison to a general population with known smoking habits.

    Science.gov (United States)

    Tanash, Hanan A; Ekström, Magnus; Rönmark, Eva; Lindberg, Anne; Piitulainen, Eeva

    2017-09-01

    Knowledge about the natural history of severe alpha 1-antitrypsin (AAT) deficiency (PiZZ) is limited. Our aim was to compare the survival of PiZZ individuals with that of randomly selected controls from the Swedish general population. The PiZZ subjects (n=1585) were selected from the Swedish National AATD Register. The controls (n=5999) were randomly selected from the Swedish population register. Smoking habits were known for all subjects. Median follow-up times for the PiZZ subjects (731 never-smokers) and the controls (3179 never-smokers) were 12 and 17 years, respectively. After adjustment for smoking habits and the presence of respiratory symptoms, the risk of death was still significantly higher for the PiZZ individuals than for the controls, hazard ratio (HR) 3.2 (95% CI 2.8-3.6), with the exception of never-smoking PiZZ individuals identified by screening, whose risk was similar to that of never-smoking controls, HR 1.2 (95% CI 0.6-2.2). The never-smoking PiZZ individuals identified by screening had a similar life expectancy to the never-smokers in the Swedish general population. Early diagnosis of AAT deficiency is of utmost importance. Copyright ©ERS 2017.

  4. Simple and Multivariate Relationships Between Spiritual Intelligence with General Health and Happiness.

    Science.gov (United States)

    Amirian, Mohammad-Elyas; Fazilat-Pour, Masoud

    2016-08-01

    The present study examined simple and multivariate relationships of spiritual intelligence with general health and happiness. The method employed was descriptive and correlational. King's Spiritual Quotient scale, the GHQ-28 and the Oxford Happiness Inventory were filled out by a sample of 384 students, selected using stratified random sampling from the students of Shahid Bahonar University of Kerman. Data were subjected to descriptive and inferential statistics, including correlations and multivariate regressions. Bivariate correlations supported a positive and significant predictive value of spiritual intelligence for general health and happiness. Further analysis showed that among the spiritual intelligence subscales, existential critical thinking predicted general health and happiness inversely. In addition, happiness was positively predicted by generation of personal meaning and transcendental awareness. The findings are discussed in line with previous studies and the relevant theoretical background.

  5. Quantum random access memory

    OpenAIRE

    Giovannetti, Vittorio; Lloyd, Seth; Maccone, Lorenzo

    2007-01-01

    A random access memory (RAM) uses n bits to randomly address N=2^n distinct memory cells. A quantum random access memory (qRAM) uses n qubits to address any quantum superposition of N memory cells. We present an architecture that exponentially reduces the requirements for a memory call: O(log N) switches need be thrown instead of the N used in conventional (classical or quantum) RAM designs. This yields a more robust qRAM algorithm, as it in general requires entanglement among exponentially l...

  6. Global Convergence of Arbitrary-Block Gradient Methods for Generalized Polyak-Łojasiewicz Functions

    KAUST Repository

    Csiba, Dominik

    2017-09-09

    In this paper we introduce two novel generalizations of the theory for gradient descent type methods in the proximal setting. First, we introduce the proportion function, which we further use to analyze all known (and many new) block-selection rules for block coordinate descent methods under a single framework. This framework includes randomized methods with uniform, non-uniform or even adaptive sampling strategies, as well as deterministic methods with batch, greedy or cyclic selection rules. Second, the theory of strongly-convex optimization was recently generalized to a specific class of non-convex functions satisfying the so-called Polyak-Łojasiewicz condition. To mirror this generalization in the weakly convex case, we introduce the Weak Polyak-Łojasiewicz condition, using which we give global convergence guarantees for a class of non-convex functions previously not considered in theory. Additionally, we establish (necessarily somewhat weaker) convergence guarantees for an even larger class of non-convex functions satisfying a certain smoothness assumption only. By combining the two abovementioned generalizations we recover the state-of-the-art convergence guarantees for a large class of previously known methods and setups as special cases of our general framework. Moreover, our framework allows for the derivation of new guarantees for many new combinations of methods and setups, as well as a large class of novel non-convex objectives. The flexibility of our approach offers a lot of potential for future research, as a new block selection procedure will have a convergence guarantee for all objectives considered in our framework, while a new objective analyzed under our approach will have a whole fleet of block selection rules with convergence guarantees readily available.

  7. Global Convergence of Arbitrary-Block Gradient Methods for Generalized Polyak-Łojasiewicz Functions

    KAUST Repository

    Csiba, Dominik; Richtarik, Peter

    2017-01-01

    In this paper we introduce two novel generalizations of the theory for gradient descent type methods in the proximal setting. First, we introduce the proportion function, which we further use to analyze all known (and many new) block-selection rules for block coordinate descent methods under a single framework. This framework includes randomized methods with uniform, non-uniform or even adaptive sampling strategies, as well as deterministic methods with batch, greedy or cyclic selection rules. Second, the theory of strongly-convex optimization was recently generalized to a specific class of non-convex functions satisfying the so-called Polyak-Łojasiewicz condition. To mirror this generalization in the weakly convex case, we introduce the Weak Polyak-Łojasiewicz condition, using which we give global convergence guarantees for a class of non-convex functions previously not considered in theory. Additionally, we establish (necessarily somewhat weaker) convergence guarantees for an even larger class of non-convex functions satisfying a certain smoothness assumption only. By combining the two abovementioned generalizations we recover the state-of-the-art convergence guarantees for a large class of previously known methods and setups as special cases of our general framework. Moreover, our framework allows for the derivation of new guarantees for many new combinations of methods and setups, as well as a large class of novel non-convex objectives. The flexibility of our approach offers a lot of potential for future research, as a new block selection procedure will have a convergence guarantee for all objectives considered in our framework, while a new objective analyzed under our approach will have a whole fleet of block selection rules with convergence guarantees readily available.
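
    A minimal randomized block coordinate descent loop of the kind covered by this framework is sketched below (illustrative only: uniform block sampling on a strongly convex quadratic, which in particular satisfies the Polyak-Łojasiewicz condition; the block sizes and step sizes are ad hoc choices, not from the paper).

    ```python
    import numpy as np

    rng = np.random.default_rng(10)
    d, n_blocks, iters = 60, 6, 20000

    # Strongly convex quadratic f(x) = 0.5 x'Ax - b'x (hence PL with mu = lambda_min(A)).
    M = rng.standard_normal((d, d))
    A = M @ M.T + 0.5 * np.eye(d)
    b = rng.standard_normal(d)
    x_star = np.linalg.solve(A, b)
    f = lambda x: 0.5 * x @ A @ x - b @ x

    blocks = np.array_split(np.arange(d), n_blocks)
    # Block-wise Lipschitz constants: largest eigenvalue of each diagonal block of A.
    L = [np.linalg.eigvalsh(A[np.ix_(blk, blk)]).max() for blk in blocks]

    x = np.zeros(d)
    for _ in range(iters):
        i = rng.integers(n_blocks)               # uniform random block selection
        blk = blocks[i]
        grad_blk = A[blk] @ x - b[blk]           # partial gradient on the chosen block
        x[blk] -= grad_blk / L[i]                # 1/L_i step on that block only

    print("suboptimality f(x) - f(x*):", float(f(x) - f(x_star)))
    ```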

  8. Generating equilateral random polygons in confinement III

    International Nuclear Information System (INIS)

    Diao, Y; Ernst, C; Montemayor, A; Ziegler, U

    2012-01-01

    In this paper we continue our earlier studies (Diao et al 2011 J. Phys. A: Math. Theor. 44 405202, Diao et al J. Phys. A: Math. Theor. 45 275203) on the generation methods of random equilateral polygons confined in a sphere. The first half of this paper is concerned with the generation of confined equilateral random walks. We show that if the selection of a vertex is uniform subject to the position of its previous vertex and the confining condition, then the distributions of the vertices are not uniform, although there exists a distribution such that if the initial vertex is selected following this distribution, then all vertices of the random walk follow this same distribution. Thus in order to generate a confined equilateral random walk, the selection of a vertex cannot be uniform subject to the position of its previous vertex and the confining condition. We provide a simple algorithm capable of generating confined equilateral random walks whose vertex distribution is almost uniform in the confinement sphere. In the second half of this paper we show that any process generating confined equilateral random walks can be turned into a process generating confined equilateral random polygons with the property that the vertex distribution of the polygons approaches the vertex distribution of the walks as the polygons get longer and longer. In our earlier studies, the starting point of the confined polygon is fixed at the center of the sphere. The new approach here allows us to move the starting point of the confined polygon off the center of the sphere. (paper)
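
    The naive generation scheme discussed in the first half, in which each new vertex is chosen uniformly at distance one from the previous vertex and steps leaving the confinement sphere are rejected, takes only a few lines (a sketch; as the paper shows, the vertex distribution this produces is not uniform, which is exactly the point being made).

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    def confined_equilateral_walk(n_steps, radius, start=(0.0, 0.0, 0.0)):
        """Unit-step random walk confined to a sphere of the given radius (naive rejection)."""
        pts = [np.asarray(start, float)]
        while len(pts) < n_steps + 1:
            step = rng.standard_normal(3)
            step /= np.linalg.norm(step)         # uniform direction, unit step length
            nxt = pts[-1] + step
            if np.linalg.norm(nxt) <= radius:    # reject steps leaving the confinement sphere
                pts.append(nxt)
        return np.array(pts)

    walk = confined_equilateral_walk(n_steps=50, radius=2.0)
    steps_ok = np.allclose(np.linalg.norm(np.diff(walk, axis=0), axis=1), 1.0)
    print("all steps have unit length:", steps_ok)
    print("mean vertex distance from centre:", round(float(np.linalg.norm(walk, axis=1).mean()), 3))
    ```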

  9. Affinity selection of Nipah and Hendra virus-related vaccine candidates from a complex random peptide library displayed on bacteriophage virus-like particles

    Energy Technology Data Exchange (ETDEWEB)

    Peabody, David S.; Chackerian, Bryce; Ashley, Carlee; Carnes, Eric; Negrete, Oscar

    2017-01-24

    The invention relates to virus-like particles of bacteriophage MS2 (MS2 VLPs) displaying peptide epitopes or peptide mimics of epitopes of the Nipah Virus envelope glycoprotein that elicit an immune response against Nipah Virus upon vaccination of humans or animals. Affinity selection on Nipah Virus-neutralizing monoclonal antibodies using random sequence peptide libraries on MS2 VLPs selected peptides with sequence similarity to peptide sequences found within the envelope glycoprotein of Nipah itself, thus identifying the epitopes the antibodies recognize. The selected peptide sequences themselves are not necessarily identical in all respects to a sequence within the Nipah Virus glycoprotein, and therefore may be referred to as epitope mimics. VLPs displaying these epitope mimics can serve as a vaccine. On the other hand, display of the corresponding wild-type sequence derived from Nipah Virus and corresponding to the epitope mapped by affinity selection may also be used as a vaccine.

  10. Effects of Random Values for Particle Swarm Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Hou-Ping Dai

    2018-02-01

    Full Text Available Particle swarm optimization (PSO) algorithms are generally improved by adaptively adjusting the inertia weight or combining with other evolution algorithms. However, in most modified PSO algorithms, the random values are always generated by a uniform distribution in the range of [0, 1]. In this study, random values generated by a uniform distribution in the range [0, 1], a uniform distribution in the range [-1, 1], and a Gauss distribution with mean 0 and variance 1 (denoted U[0,1], U[-1,1] and G(0,1), respectively) are used in the standard PSO and linear decreasing inertia weight (LDIW) PSO algorithms. For comparison, the deterministic PSO algorithm, in which the random values are set as 0.5, is also investigated. Some benchmark functions and the pressure vessel design problem are selected to test these algorithms with different types of random values in three space dimensions (10, 30, and 100). The experimental results show that the standard PSO and LDIW-PSO algorithms with random values generated by U[-1,1] or G(0,1) are more likely to avoid falling into local optima and quickly obtain the global optima. This is because the large-scale random values can expand the range of particle velocity, making a particle more likely to escape from local optima and reach the global optimum. Although random values generated by U[-1,1] or G(0,1) are beneficial for improving the global searching ability, the local searching ability for a low-dimensional practical optimization problem may be decreased due to the finite number of particles.
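
    The comparison comes down to swapping the random factors in the velocity update. The sketch below runs standard PSO on the sphere function with the three generators mentioned; the swarm size, coefficients and velocity clamp are illustrative choices, not those of the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(12)

    def pso(rand, dim=30, n_particles=30, iters=500, w=0.7, c1=1.5, c2=1.5):
        """Standard PSO minimising the sphere function; rand(shape) supplies the random factors."""
        x = rng.uniform(-5, 5, (n_particles, dim))
        v = np.zeros((n_particles, dim))
        pbest, pbest_f = x.copy(), np.sum(x**2, axis=1)
        gbest = pbest[np.argmin(pbest_f)].copy()
        for _ in range(iters):
            r1, r2 = rand((n_particles, dim)), rand((n_particles, dim))
            v = np.clip(w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x), -10, 10)
            x = x + v
            fx = np.sum(x**2, axis=1)
            better = fx < pbest_f
            pbest[better], pbest_f[better] = x[better], fx[better]
            gbest = pbest[np.argmin(pbest_f)].copy()
        return pbest_f.min()

    for name, rand in [("U[0,1]",  lambda s: rng.random(s)),
                       ("U[-1,1]", lambda s: rng.uniform(-1, 1, s)),
                       ("G(0,1)",  lambda s: rng.standard_normal(s))]:
        print(f"{name:8s} best sphere value: {pso(rand):.3e}")
    ```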

  11. Territory and nest site selection patterns by Grasshopper Sparrows in southeastern Arizona

    Science.gov (United States)

    Ruth, Janet M.; Skagen, Susan K.

    2017-01-01

    Grassland bird populations are showing some of the greatest rates of decline of any North American birds, prompting measures to protect and improve important habitat. We assessed how vegetation structure and composition, habitat features often targeted for management, affected territory and nest site selection by Grasshopper Sparrows (Ammodramus savannarum ammolegus) in southeastern Arizona. To identify features important to males establishing territories, we compared vegetation characteristics of known territories and random samples on 2 sites over 5 years. We examined habitat selection patterns of females by comparing characteristics of nest sites with territories over 3 years. Males selected territories in areas of sparser vegetation structure and more tall shrubs (>2 m) than random plots on the site with low shrub densities. Males did not select territories based on the proportion of exotic grasses. Females generally located nest sites in areas with lower small shrub (1–2 m tall) densities than territories overall when possible and preferentially selected native grasses for nest construction. Whether habitat selection was apparent depended upon the range of vegetation structure that was available. We identified an upper threshold above which grass structure seemed to be too high and dense for Grasshopper Sparrows. Our results suggest that some management that reduces vegetative structure may benefit this species in desert grasslands at the nest and territory scale. However, we did not assess initial male habitat selection at a broader landscape scale where their selection patterns may be different and could be influenced by vegetation density and structure outside the range of values sampled in this study.

  12. Working situation of cancer survivors versus the general population.

    Science.gov (United States)

    Lee, Myung Kyung; Yun, Young Ho

    2015-06-01

    The purposes of this study were to compare the working situation of cancer survivors and the general (cancer-free) population and to investigate characteristics associated with an increased likelihood of unemployment in the two groups. We selected 1927 cancer survivors under 65 years of age from the 2008 Korean Community Health Survey data and used propensity score matching to randomly select 1924 individuals from the general population who closely resembled the cancer survivors. Compared to the general population, cancer survivors were less likely to be engaged in paid work, particularly as permanent workers, and were more likely to work regular hours. Additionally, they tended to do less work that involved lifting or moving heavy objects and uncomfortable postures, and were more willing to express their emotions. An increased probability of unemployment among cancer survivors was associated with being over 50 years old, being female, having a lower monthly income, having multiple comorbidities, belonging to a nuclear family, being a National Basic Livelihood Act beneficiary, and having a recent diagnosis. Cancer survivors may want to pursue flexible occupations and improve their working situation. Further, they perceive their workplace more positively compared to the general population. Respecting the cancer survivor's choice to find flexible working conditions that suit their health needs and status, health-care providers involved in managing work-related issues among cancer survivors should be aware of the interaction between work-related concerns and post-cancer disease management.
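
    The matching step mentioned above follows a standard recipe (a sketch on simulated covariates: fit a propensity model for survivor status, then take each survivor's nearest available neighbour on the estimated score; the covariates, sample sizes and model are invented, not those of the survey).

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(13)
    n_surv, n_pop = 500, 5000

    # Simulated covariates (age, income score, comorbidity count) for both groups.
    surv = np.column_stack([rng.normal(58, 8, n_surv), rng.normal(-0.3, 1, n_surv), rng.poisson(1.5, n_surv)])
    pop  = np.column_stack([rng.normal(48, 12, n_pop), rng.normal(0.0, 1, n_pop), rng.poisson(0.8, n_pop)])

    X = np.vstack([surv, pop])
    treated = np.r_[np.ones(n_surv), np.zeros(n_pop)]

    # Propensity score: P(survivor | covariates), then 1:1 nearest-neighbour matching on the score.
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    ps_surv, ps_pop = ps[:n_surv], ps[n_surv:]

    available = np.ones(n_pop, dtype=bool)
    matched = []
    for i in np.argsort(-ps_surv):               # match the hardest-to-match survivors first
        dist = np.abs(ps_pop - ps_surv[i])
        dist[~available] = np.inf
        j = int(np.argmin(dist))
        available[j] = False
        matched.append(j)

    print("mean age, survivors vs matched controls:",
          round(float(surv[:, 0].mean()), 1), round(float(pop[matched, 0].mean()), 1))
    ```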

  13. Apnea after awake-regional and general anesthesia in infants: The General Anesthesia compared to Spinal anesthesia (GAS) study: comparing apnea and neurodevelopmental outcomes, a randomized controlled trial

    Science.gov (United States)

    Davidson, Andrew J.; Morton, Neil S.; Arnup, Sarah J.; de Graaff, Jurgen C.; Disma, Nicola; Withington, Davinia E.; Frawley, Geoff; Hunt, Rodney W.; Hardy, Pollyanna; Khotcholava, Magda; von Ungern Sternberg, Britta S.; Wilton, Niall; Tuo, Pietro; Salvo, Ida; Ormond, Gillian; Stargatt, Robyn; Locatelli, Bruno Guido; McCann, Mary Ellen

    2015-01-01

    Background Post-operative apnea is a complication in young infants. Awake-regional anesthesia (RA) may reduce the risk; however, the evidence is weak. The General Anesthesia compared to Spinal anesthesia (GAS) study is a randomized, controlled trial designed to assess the influence of general anesthesia (GA) on neurodevelopment. A secondary aim is to compare rates of apnea after anesthesia. Methods Infants ≤ 60 weeks postmenstrual age scheduled for inguinal herniorrhaphy were randomized to RA or GA. Exclusion criteria included risk factors for adverse neurodevelopmental outcome and infants born < 26 weeks’ gestation. The primary outcome of this analysis was any observed apnea up to 12 hours post-operatively. Apnea assessment was unblinded. Results 363 patients were assigned to RA and 359 to GA. Overall the incidence of apnea (0 to 12 hours) was similar between arms (3% in RA and 4% in GA arms, Odds Ratio (OR) 0.63, 95% Confidence Intervals (CI): 0.31 to 1.30, P=0.2133), however the incidence of early apnea (0 to 30 minutes) was lower in the RA arm (1% versus 3%, OR 0.20, 95%CI: 0.05 to 0.91, P=0.0367). The incidence of late apnea (30 minutes to 12 hours) was 2% in both RA and GA arms (OR 1.17, 95%CI: 0.41 to 3.33, P=0.7688). The strongest predictor of apnea was prematurity (OR 21.87, 95% CI 4.38 to 109.24) and 96% of infants with apnea were premature. Conclusions RA in infants undergoing inguinal herniorrhaphy reduces apnea in the early post-operative period. Cardio-respiratory monitoring should be used for all ex-premature infants. PMID:26001033

  14. Quantifiers for randomness of chaotic pseudo-random number generators.

    Science.gov (United States)

    De Micco, L; Larrondo, H A; Plastino, A; Rosso, O A

    2009-08-28

    We deal with randomness quantifiers and concentrate on their ability to discern the hallmark of chaos in time series used in connection with pseudo-random number generators (PRNGs). Workers in the field are motivated to use chaotic maps for generating PRNGs because of the simplicity of their implementation. Although there exist very efficient general-purpose benchmarks for testing PRNGs, we feel that the analysis provided here sheds additional didactic light on the importance of the main statistical characteristics of a chaotic map, namely (i) its invariant measure and (ii) the mixing constant. This is of help in answering two questions that arise in applications: (i) which is the best PRNG among the available ones? and (ii) if a given PRNG turns out not to be good enough and a randomization procedure must still be applied to it, which is the best applicable randomization procedure? Our answer provides a comparative analysis of several quantifiers advanced in the extant literature.
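
    One quantifier commonly used in this literature is the Bandt-Pompe permutation entropy. The sketch below applies it to raw logistic-map iterates and to the same map thinned by keeping one value in every k (a crude randomization procedure); the map parameter, embedding dimension and thinning factor are illustrative choices.

    ```python
    import numpy as np
    from itertools import permutations
    from math import factorial

    def permutation_entropy(x, d=5):
        """Normalised Bandt-Pompe permutation entropy of order d (1 = maximally random)."""
        counts = {p: 0 for p in permutations(range(d))}
        for i in range(len(x) - d + 1):
            counts[tuple(map(int, np.argsort(x[i:i + d])))] += 1
        p = np.array([c for c in counts.values() if c > 0], float)
        p /= p.sum()
        return float(-np.sum(p * np.log(p)) / np.log(factorial(d)))

    def logistic_map(n, r=4.0, x0=0.1234, skip=1):
        """Logistic-map iterates, keeping one value every `skip` steps (a crude randomisation)."""
        out, x = [], x0
        for _ in range(n * skip):
            x = r * x * (1.0 - x)
            out.append(x)
        return np.array(out[::skip])

    print("PE, raw logistic map:     ", round(permutation_entropy(logistic_map(20000)), 3))
    print("PE, thinned (every 10th): ", round(permutation_entropy(logistic_map(20000, skip=10)), 3))
    ```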

  15. Fast selection of miRNA candidates based on large-scale pre-computed MFE sets of randomized sequences.

    Science.gov (United States)

    Warris, Sven; Boymans, Sander; Muiser, Iwe; Noback, Michiel; Krijnen, Wim; Nap, Jan-Peter

    2014-01-13

    Small RNAs are important regulators of genome function, yet their prediction in genomes is still a major computational challenge. Statistical analyses of pre-miRNA sequences indicated that their 2D structure tends to have a minimal free energy (MFE) significantly lower than MFE values of equivalently randomized sequences with the same nucleotide composition, in contrast to other classes of non-coding RNA. The computation of many MFEs is, however, too intensive to allow for genome-wide screenings. Using a local grid infrastructure, MFE distributions of random sequences were pre-calculated on a large scale. The resulting MFE values follow a normal distribution, which can be used to determine the MFE distribution for any given sequence composition by interpolation. This allows on-the-fly calculation of the normal distribution for any candidate sequence composition. The speedup achieved makes genome-wide screening with this characteristic of a pre-miRNA sequence practical. Although this particular property alone is not sufficiently discriminative to distinguish miRNAs from other sequences, the MFE-based P-value should be added to the parameters of choice when selecting potential miRNA candidates for experimental verification.
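
    A minimal sketch of the interpolation idea described above, not the authors' pipeline: the table of pre-computed MFE means and standard deviations for randomized sequences, keyed here by GC content, is hypothetical, and a real screen would interpolate over the large-scale pre-computed sets for the exact nucleotide composition.

```python
import math

# Hypothetical pre-computed table: GC fraction -> (mean MFE, std of MFE) for
# randomized sequences of a fixed length; real values would come from the
# large-scale pre-computed sets described in the abstract.
MFE_TABLE = {
    0.30: (-18.0, 4.5),
    0.40: (-22.0, 4.8),
    0.50: (-27.0, 5.1),
    0.60: (-33.0, 5.4),
    0.70: (-40.0, 5.7),
}

def interpolate_params(gc):
    """Linear interpolation of (mean, std) between tabulated GC fractions."""
    keys = sorted(MFE_TABLE)
    if gc <= keys[0]:
        return MFE_TABLE[keys[0]]
    if gc >= keys[-1]:
        return MFE_TABLE[keys[-1]]
    for lo, hi in zip(keys, keys[1:]):
        if lo <= gc <= hi:
            w = (gc - lo) / (hi - lo)
            m_lo, s_lo = MFE_TABLE[lo]
            m_hi, s_hi = MFE_TABLE[hi]
            return m_lo + w * (m_hi - m_lo), s_lo + w * (s_hi - s_lo)

def mfe_p_value(candidate_mfe, gc):
    """P(MFE of a randomized sequence <= candidate MFE) under a normal model."""
    mean, std = interpolate_params(gc)
    z = (candidate_mfe - mean) / std
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

if __name__ == "__main__":
    seq = "UGAGGUAGUAGGUUGUAUAGUU" * 3          # toy hairpin-like sequence
    gc = (seq.count("G") + seq.count("C")) / len(seq)
    print("GC fraction:", round(gc, 3))
    print("P-value for MFE = -35.0 kcal/mol:", mfe_p_value(-35.0, gc))
```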

  16. Decompounding random sums: A nonparametric approach

    DEFF Research Database (Denmark)

    Hansen, Martin Bøgsted; Pitts, Susan M.

    Observations from sums of random variables with a random number of summands, known as random, compound or stopped sums, arise within many areas of engineering and science. Quite often it is desirable to infer properties of the distribution of the terms in the random sum. In the present paper we ... review a number of applications and consider the nonlinear inverse problem of inferring the cumulative distribution function of the components in the random sum. We review the existing literature on non-parametric approaches to the problem. The models amenable to the analysis are generalized considerably...

  17. Mean-field analysis of orientation selectivity in inhibition-dominated networks of spiking neurons.

    Science.gov (United States)

    Sadeh, Sadra; Cardanobile, Stefano; Rotter, Stefan

    2014-01-01

    Mechanisms underlying the emergence of orientation selectivity in the primary visual cortex are highly debated. Here we study the contribution of inhibition-dominated random recurrent networks to orientation selectivity, and more generally to sensory processing. By simulating and analyzing large-scale networks of spiking neurons, we investigate tuning amplification and contrast invariance of orientation selectivity in these networks. In particular, we show how selective attenuation of the common mode and amplification of the modulation component take place in these networks. Selective attenuation of the baseline, which is governed by the exceptional eigenvalue of the connectivity matrix, removes the unspecific, redundant signal component and ensures the invariance of selectivity across different contrasts. Selective amplification of modulation, which is governed by the operating regime of the network and depends on the strength of coupling, amplifies the informative signal component and thus increases the signal-to-noise ratio. Here, we perform a mean-field analysis which accounts for this process.
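
    The full spiking-network simulations are beyond a short example, but the linear-algebra intuition described above, an exceptional strongly negative eigenvalue that attenuates the common mode in an inhibition-dominated random connectivity matrix, can be illustrated with a toy rate model. The network size, connection probability, and weight below are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy inhibition-dominated random connectivity: each entry is -w with
# probability p (an inhibitory synapse), else 0, so the mean coupling is negative.
N, p, w = 1000, 0.1, 0.05
J = -w * (rng.random((N, N)) < p)

eigvals = np.linalg.eigvals(J)

# The eigenvalue close to N*p*(-w) is the "exceptional" one associated with the
# uniform (common) mode; the remaining bulk lies in a disc of radius roughly
# w*sqrt(N*p*(1-p)).
exceptional = eigvals[np.argmin(eigvals.real)]
bulk_radius = np.sort(np.abs(eigvals))[-2]
print("predicted exceptional eigenvalue:", -w * N * p)
print("observed  exceptional eigenvalue:", exceptional.real.round(2))
print("approximate bulk radius         :", bulk_radius.round(2))

# In the linear rate approximation r = (I - J)^(-1) * input, the common mode is
# strongly attenuated relative to a zero-mean modulation of the same norm.
def gain(v):
    return np.linalg.norm(np.linalg.solve(np.eye(N) - J, v))

common = np.ones(N) / np.sqrt(N)
modulation = rng.standard_normal(N)
modulation /= np.linalg.norm(modulation)
print("gain on common mode    :", round(gain(common), 3))
print("gain on modulation mode:", round(gain(modulation), 3))
```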

  18. Effect of Peer-Led Team Learning (PLTL) on Student Achievement, Attitude, and Self-Concept in College General Chemistry in Randomized and Quasi Experimental Designs

    Science.gov (United States)

    Chan, Julia Y. K.; Bauer, Christopher F.

    2015-01-01

    This study investigated exam achievement and affective characteristics of students in general chemistry in a fully-randomized experimental design, contrasting Peer-Led Team Learning (PLTL) participation with a control group balanced for time-on-task and study activity. This study population included two independent first-semester courses with…

  19. Selection of examples in case-based computer-aided decision systems

    International Nuclear Information System (INIS)

    Mazurowski, Maciej A; Zurada, Jacek M; Tourassi, Georgia D

    2008-01-01

    Case-based computer-aided decision (CB-CAD) systems rely on a database of previously stored, known examples when classifying new, incoming queries. Such systems can be particularly useful since they do not need retraining every time a new example is deposited in the case base. The adaptive nature of case-based systems is well suited to the current trend of continuously expanding digital databases in the medical domain. To maintain efficiency, however, such systems need sophisticated strategies to effectively manage the available evidence database. In this paper, we discuss the general problem of building an evidence database by selecting the most useful examples to store while satisfying existing storage requirements. We evaluate three intelligent techniques for this purpose: genetic algorithm-based selection, greedy selection and random mutation hill climbing. These techniques are compared to a random selection strategy used as the baseline. The study is performed with a previously presented CB-CAD system applied for false positive reduction in screening mammograms. The experimental evaluation shows that when the development goal is to maximize the system's diagnostic performance, the intelligent techniques are able to reduce the size of the evidence database to 37% of the original database by eliminating superfluous and/or detrimental examples while at the same time significantly improving the CAD system's performance. Furthermore, if the case-base size is a main concern, the total number of examples stored in the system can be reduced to only 2-4% of the original database without a decrease in the diagnostic performance. Comparison of the techniques shows that random mutation hill climbing provides the best balance between the diagnostic performance and computational efficiency when building the evidence database of the CB-CAD system.
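
    As a hedged sketch of one of the strategies evaluated above, random mutation hill climbing over a binary case-inclusion mask, the example below uses a synthetic dataset and a 1-NN classifier as a stand-in scoring function; it is not the CB-CAD system or the mammography data of the study, and the iteration budget is arbitrary.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in for a case base; the real system used screening mammograms.
X, y = make_classification(n_samples=600, n_features=20, n_informative=8, random_state=0)
X_case, X_val, y_case, y_val = train_test_split(X, y, test_size=0.5, random_state=0)

def score(mask):
    """Validation accuracy of a 1-NN classifier using only the selected cases."""
    if mask.sum() < 2 or len(set(y_case[mask])) < 2:
        return 0.0
    knn = KNeighborsClassifier(n_neighbors=1).fit(X_case[mask], y_case[mask])
    return knn.score(X_val, y_val)

# Random mutation hill climbing over the binary inclusion mask.
n_cases = len(X_case)
mask = rng.random(n_cases) < 0.5            # random initial case base
best = score(mask)
for _ in range(500):                        # illustrative iteration budget
    i = rng.integers(n_cases)
    mask[i] = ~mask[i]                      # flip one randomly chosen case
    new = score(mask)
    if new >= best:                         # keep improving (or equal) moves
        best = new
    else:
        mask[i] = ~mask[i]                  # revert the flip
print(f"selected {mask.sum()} of {n_cases} cases, validation accuracy {best:.3f}")
```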

  20. An Exact Closed-Form Expression for the BER of Binary Modulations with Dual-Branch Selection over Generalized-K Fading

    KAUST Repository

    Ansari, Imran Shafique

    2012-07-31

    Error performance is one of the main performance measures and the derivation of its closed-form expression has proved to be quite involved for certain systems. In this paper, a unified closed-form expression, applicable to different binary modulation schemes, for the bit error rate of dual-branch selection diversity based systems undergoing independent but not necessarily identically distributed generalized-K fading is derived in terms of the extended generalized bivariate Meijer G-function.
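
    The closed-form Meijer G-function expression itself is not reproduced here. Instead, a Monte Carlo sketch (an assumption-level cross-check, not the paper's derivation) simulates BPSK with dual-branch selection diversity over independent, non-identically distributed generalized-K branches, modelling each branch SNR as the product of two unit-mean gamma variates; all fading parameters are illustrative.

```python
import numpy as np
from scipy.special import erfc

rng = np.random.default_rng(1)

def gen_k_power(n, m, k, mean_snr):
    """Generalized-K faded SNR samples: product of two unit-mean gamma variates
    (Nakagami-m multipath with parameter m, gamma shadowing with parameter k),
    scaled to the requested average SNR."""
    multipath = rng.gamma(shape=m, scale=1.0 / m, size=n)
    shadowing = rng.gamma(shape=k, scale=1.0 / k, size=n)
    return mean_snr * multipath * shadowing

def ber_bpsk_selection(n=2_000_000, m=(2.0, 1.5), k=(2.5, 2.0), mean_snr_db=(10.0, 8.0)):
    """Monte Carlo BER of BPSK with dual-branch selection combining over
    independent, non-identically distributed generalized-K branches."""
    snrs = [gen_k_power(n, mi, ki, 10 ** (s / 10.0))
            for mi, ki, s in zip(m, k, mean_snr_db)]
    selected = np.maximum(snrs[0], snrs[1])        # selection diversity: pick the best branch
    return np.mean(0.5 * erfc(np.sqrt(selected)))  # BPSK: Pb = Q(sqrt(2*snr)) = 0.5*erfc(sqrt(snr))

if __name__ == "__main__":
    print("simulated BER:", ber_bpsk_selection())
```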

  1. Annealed central limit theorems for the ising model on random graphs

    NARCIS (Netherlands)

    Giardinà, C.; Giberti, C.; van der Hofstad, R.W.; Prioriello, M.L.

    2016-01-01

    The aim of this paper is to prove central limit theorems with respect to the annealed measure for the magnetization rescaled by √N of Ising models on random graphs. More precisely, we consider the general rank-1 inhomogeneous random graph (or generalized random graph), the 2-regular configuration

  2. A simple method for finding explicit analytic transition densities of diffusion processes with general diploid selection.

    Science.gov (United States)

    Song, Yun S; Steinrücken, Matthias

    2012-03-01

    The transition density function of the Wright-Fisher diffusion describes the evolution of population-wide allele frequencies over time. This function has important practical applications in population genetics, but finding an explicit formula under a general diploid selection model has remained a difficult open problem. In this article, we develop a new computational method to tackle this classic problem. Specifically, our method explicitly finds the eigenvalues and eigenfunctions of the diffusion generator associated with the Wright-Fisher diffusion with recurrent mutation and arbitrary diploid selection, thus allowing one to obtain an accurate spectral representation of the transition density function. Simplicity is one of the appealing features of our approach. Although our derivation involves somewhat advanced mathematical concepts, the resulting algorithm is quite simple and efficient, only involving standard linear algebra. Furthermore, unlike previous approaches based on perturbation, which is applicable only when the population-scaled selection coefficient is small, our method is nonperturbative and is valid for a broad range of parameter values. As a by-product of our work, we obtain the rate of convergence to the stationary distribution under mutation-selection balance.
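
    The authors' spectral method expands in a polynomial basis; as a much cruder stand-in that still shows how eigenvalues of the diffusion generator can be obtained with standard linear algebra, the sketch below finite-differences the generator on a grid under one textbook-style parameterization of mutation and genic selection. The parameterization, boundary treatment, and parameter values are assumptions, since scalings of θ and σ vary across the literature.

```python
import numpy as np

# One common parameterization of the Wright-Fisher generator with recurrent
# mutation (theta1, theta2) and genic selection sigma (an assumption, not
# necessarily the scaling used by the authors):
#   L f(x) = b(x) f''(x) + a(x) f'(x),
#   b(x) = x(1 - x)/2,
#   a(x) = (theta1*(1 - x) - theta2*x)/2 + sigma*x*(1 - x)/2.
theta1, theta2, sigma = 1.0, 1.0, 5.0

def a(x):
    return 0.5 * (theta1 * (1 - x) - theta2 * x) + 0.5 * sigma * x * (1 - x)

def b(x):
    return 0.5 * x * (1 - x)

# Crude central finite-difference discretization on an interior grid; the
# authors instead use a polynomial (spectral) basis, which is far more accurate.
n = 400
h = 1.0 / (n + 1)
x = np.linspace(h, 1 - h, n)
L = np.zeros((n, n))
for i, xi in enumerate(x):
    L[i, i] = -2 * b(xi) / h**2
    if i > 0:
        L[i, i - 1] = b(xi) / h**2 - a(xi) / (2 * h)
    if i < n - 1:
        L[i, i + 1] = b(xi) / h**2 + a(xi) / (2 * h)

# Leading eigenvalues should sit near zero and become increasingly negative; the
# spectral gap bounds the rate of convergence to mutation-selection balance.
eigvals = np.sort(np.linalg.eigvals(L).real)[::-1]
print("leading eigenvalues of the discretized generator:", eigvals[:5].round(3))
```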

  3. Multivariate generalized linear mixed models using R

    CERN Document Server

    Berridge, Damon Mark

    2011-01-01

    Multivariate Generalized Linear Mixed Models Using R presents robust and methodologically sound models for analyzing large and complex data sets, enabling readers to answer increasingly complex research questions. The book applies the principles of modeling to longitudinal data from panel and related studies via the Sabre software package in R. A Unified Framework for a Broad Class of Models The authors first discuss members of the family of generalized linear models, gradually adding complexity to the modeling framework by incorporating random effects. After reviewing the generalized linear model notation, they illustrate a range of random effects models, including three-level, multivariate, endpoint, event history, and state dependence models. They estimate the multivariate generalized linear mixed models (MGLMMs) using either standard or adaptive Gaussian quadrature. The authors also compare two-level fixed and random effects linear models. The appendices contain additional information on quadrature, model...

  4. Varying levels of difficulty index of skills-test items randomly selected by examinees on the Korean emergency medical technician licensing examination.

    Science.gov (United States)

    Koh, Bongyeun; Hong, Sunggi; Kim, Soon-Sim; Hyun, Jin-Sook; Baek, Milye; Moon, Jundong; Kwon, Hayran; Kim, Gyoungyong; Min, Seonggi; Kang, Gu-Hyun

    2016-01-01

    The goal of this study was to characterize the difficulty index of the items in the skills test components of the class I and II Korean emergency medical technician licensing examination (KEMTLE), which requires examinees to select items randomly. The results of 1,309 class I KEMTLE examinations and 1,801 class II KEMTLE examinations in 2013 were subjected to analysis. Items from the basic and advanced skills test sections of the KEMTLE were compared to determine whether some were significantly more difficult than others. In the class I KEMTLE, all 4 of the items on the basic skills test showed significant variation in difficulty index (P<0.01), as well as 4 of the 5 items on the advanced skills test (P<0.05). In the class II KEMTLE, 4 of the 5 items on the basic skills test showed significantly different difficulty index (P<0.01), as well as all 3 of the advanced skills test items (P<0.01). In the skills test components of the class I and II KEMTLE, the procedure in which examinees randomly select questions should be revised to require examinees to respond to a set of fixed items in order to improve the reliability of the national licensing examination.

  5. Varying levels of difficulty index of skills-test items randomly selected by examinees on the Korean emergency medical technician licensing examination

    Directory of Open Access Journals (Sweden)

    Bongyeun Koh

    2016-01-01

    Full Text Available Purpose: The goal of this study was to characterize the difficulty index of the items in the skills test components of the class I and II Korean emergency medical technician licensing examination (KEMTLE), which requires examinees to select items randomly. Methods: The results of 1,309 class I KEMTLE examinations and 1,801 class II KEMTLE examinations in 2013 were subjected to analysis. Items from the basic and advanced skills test sections of the KEMTLE were compared to determine whether some were significantly more difficult than others. Results: In the class I KEMTLE, all 4 of the items on the basic skills test showed significant variation in difficulty index (P<0.01), as well as 4 of the 5 items on the advanced skills test (P<0.05). In the class II KEMTLE, 4 of the 5 items on the basic skills test showed significantly different difficulty index (P<0.01), as well as all 3 of the advanced skills test items (P<0.01). Conclusion: In the skills test components of the class I and II KEMTLE, the procedure in which examinees randomly select questions should be revised to require examinees to respond to a set of fixed items in order to improve the reliability of the national licensing examination.

  6. Private randomness expansion with untrusted devices

    International Nuclear Information System (INIS)

    Colbeck, Roger; Kent, Adrian

    2011-01-01

    Randomness is an important resource for many applications, from gambling to secure communication. However, guaranteeing that the output from a candidate random source could not have been predicted by an outside party is a challenging task, and many supposedly random sources used today provide no such guarantee. Quantum solutions to this problem exist, for example a device which internally sends a photon through a beamsplitter and observes on which side it emerges, but, presently, such solutions require the user to trust the internal workings of the device. Here, we seek to go beyond this limitation by asking whether randomness can be generated using untrusted devices-even ones created by an adversarial agent-while providing a guarantee that no outside party (including the agent) can predict it. Since this is easily seen to be impossible unless the user has an initially private random string, the task we investigate here is private randomness expansion. We introduce a protocol for private randomness expansion with untrusted devices which is designed to take as input an initially private random string and produce as output a longer private random string. We point out that private randomness expansion protocols are generally vulnerable to attacks that can render the initial string partially insecure, even though that string is used only inside a secure laboratory; our protocol is designed to remove this previously unconsidered vulnerability by privacy amplification. We also discuss extensions of our protocol designed to generate an arbitrarily long random string from a finite initially private random string. The security of these protocols against the most general attacks is left as an open question.

  7. Private randomness expansion with untrusted devices

    Science.gov (United States)

    Colbeck, Roger; Kent, Adrian

    2011-03-01

    Randomness is an important resource for many applications, from gambling to secure communication. However, guaranteeing that the output from a candidate random source could not have been predicted by an outside party is a challenging task, and many supposedly random sources used today provide no such guarantee. Quantum solutions to this problem exist, for example a device which internally sends a photon through a beamsplitter and observes on which side it emerges, but, presently, such solutions require the user to trust the internal workings of the device. Here, we seek to go beyond this limitation by asking whether randomness can be generated using untrusted devices—even ones created by an adversarial agent—while providing a guarantee that no outside party (including the agent) can predict it. Since this is easily seen to be impossible unless the user has an initially private random string, the task we investigate here is private randomness expansion. We introduce a protocol for private randomness expansion with untrusted devices which is designed to take as input an initially private random string and produce as output a longer private random string. We point out that private randomness expansion protocols are generally vulnerable to attacks that can render the initial string partially insecure, even though that string is used only inside a secure laboratory; our protocol is designed to remove this previously unconsidered vulnerability by privacy amplification. We also discuss extensions of our protocol designed to generate an arbitrarily long random string from a finite initially private random string. The security of these protocols against the most general attacks is left as an open question.

  8. Private randomness expansion with untrusted devices

    Energy Technology Data Exchange (ETDEWEB)

    Colbeck, Roger; Kent, Adrian, E-mail: rcolbeck@perimeterinstitute.ca, E-mail: a.p.a.kent@damtp.cam.ac.uk [Perimeter Institute for Theoretical Physics, 31 Caroline Street North, Waterloo, ON N2L 2Y5 (Canada)

    2011-03-04

    Randomness is an important resource for many applications, from gambling to secure communication. However, guaranteeing that the output from a candidate random source could not have been predicted by an outside party is a challenging task, and many supposedly random sources used today provide no such guarantee. Quantum solutions to this problem exist, for example a device which internally sends a photon through a beamsplitter and observes on which side it emerges, but, presently, such solutions require the user to trust the internal workings of the device. Here, we seek to go beyond this limitation by asking whether randomness can be generated using untrusted devices-even ones created by an adversarial agent-while providing a guarantee that no outside party (including the agent) can predict it. Since this is easily seen to be impossible unless the user has an initially private random string, the task we investigate here is private randomness expansion. We introduce a protocol for private randomness expansion with untrusted devices which is designed to take as input an initially private random string and produce as output a longer private random string. We point out that private randomness expansion protocols are generally vulnerable to attacks that can render the initial string partially insecure, even though that string is used only inside a secure laboratory; our protocol is designed to remove this previously unconsidered vulnerability by privacy amplification. We also discuss extensions of our protocol designed to generate an arbitrarily long random string from a finite initially private random string. The security of these protocols against the most general attacks is left as an open question.

  9. Identifying critical success factors for designing selection processes into postgraduate specialty training: the case of UK general practice.

    Science.gov (United States)

    Plint, Simon; Patterson, Fiona

    2010-06-01

    The UK national recruitment process into general practice training has been developed over several years, with incremental introduction of stages which have been piloted and validated. Previously independent processes, which encouraged multiple applications and produced inconsistent outcomes, have been replaced by a robust national process which has high reliability and predictive validity, and is perceived to be fair by candidates and allocates applicants equitably across the country. Best selection practice involves a job analysis which identifies required competencies, then designs reliable assessment methods to measure them, and over the long term ensures that the process has predictive validity against future performance. The general practitioner recruitment process introduced machine markable short listing assessments for the first time in the UK postgraduate recruitment context, and also adopted selection centre workplace simulations. The key success factors have been identified as corporate commitment to the goal of a national process, with gradual convergence maintaining locus of control rather than the imposition of change without perceived legitimate authority.

  10. Effect of electromagnetic radiations from mobile phone base stations on general health and salivary function

    OpenAIRE

    Singh, Kushpal; Nagaraj, Anup; Yousuf, Asif; Ganta, Shravani; Pareek, Sonia; Vishnani, Preeti

    2016-01-01

    Objective: Cell phones use electromagnetic, nonionizing radiations in the microwave range, which some believe may be harmful to human health. The present study aimed to determine the effect of electromagnetic radiations (EMRs) on unstimulated/stimulated salivary flow rate and other health-related problems between the general populations residing in proximity to and far away from mobile phone base stations. Materials and Methods: A total of four mobile base stations were randomly selected from...

  11. Permutation Entropy for Random Binary Sequences

    Directory of Open Access Journals (Sweden)

    Lingfeng Liu

    2015-12-01

    Full Text Available In this paper, we generalize the permutation entropy (PE) measure, which is based on Shannon’s entropy, to binary sequences, and theoretically analyze this measure for random binary sequences. We deduce the theoretical value of PE for random binary sequences, which can be used to measure the randomness of binary sequences. We also reveal the relationship between this PE measure and other randomness measures, such as Shannon’s entropy and Lempel–Ziv complexity. The results show that PE is consistent with these two measures. Furthermore, we use PE as one of the randomness measures to evaluate the randomness of chaotic binary sequences.
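
    A hedged sketch of the classical (real-valued) permutation entropy for comparison; the paper's generalization to binary sequences, where ordinal ties dominate, is not reproduced here, and the embedding order is an arbitrary choice.

```python
import math
from collections import Counter

def permutation_entropy(series, order=3, normalize=True):
    """Standard permutation entropy (Bandt-Pompe) of a real-valued series.

    Note: for binary sequences nearly every window contains ties, so this
    classical definition is not directly applicable; the paper's generalization
    addresses exactly that case.
    """
    patterns = Counter()
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        # ordinal pattern = argsort of the values in the window
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        patterns[pattern] += 1
    total = sum(patterns.values())
    pe = -sum((c / total) * math.log2(c / total) for c in patterns.values())
    if normalize:
        pe /= math.log2(math.factorial(order))
    return pe

if __name__ == "__main__":
    import random
    random.seed(0)
    noise = [random.random() for _ in range(10_000)]
    trend = [i + 0.1 * random.random() for i in range(10_000)]
    print("PE of white noise        :", round(permutation_entropy(noise), 3))  # close to 1
    print("PE of a monotone sequence:", round(permutation_entropy(trend), 3))  # close to 0
```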

  12. Methods for model selection in applied science and engineering.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2004-10-01

    Mathematical models are developed and used to study the properties of complex systems and/or modify these systems to satisfy some performance requirements in just about every area of applied science and engineering. A particular reason for developing a model, e.g., performance assessment or design, is referred to as the model use. Our objective is the development of a methodology for selecting a model that is sufficiently accurate for an intended use. Information on the system being modeled is, in general, incomplete, so that there may be two or more models consistent with the available information. The collection of these models is called the class of candidate models. Methods are developed for selecting the optimal member from a class of candidate models for the system. The optimal model depends on the available information, the selected class of candidate models, and the model use. Classical methods for model selection, including the method of maximum likelihood and Bayesian methods, as well as a method employing a decision-theoretic approach, are formulated to select the optimal model for numerous applications. There is no requirement that the candidate models be random. Classical methods for model selection ignore model use and require data to be available. Examples are used to show that these methods can be unreliable when data is limited. The decision-theoretic approach to model selection does not have these limitations, and model use is included through an appropriate utility function. This is especially important when modeling high risk systems, where the consequences of using an inappropriate model for the system can be disastrous. The decision-theoretic method for model selection is developed and applied for a series of complex and diverse applications. These include the selection of the: (1) optimal order of the polynomial chaos approximation for non-Gaussian random variables and stationary stochastic processes, (2) optimal pressure load model to be

  13. Reporting funding source or conflict of interest in abstracts of randomized controlled trials, no evidence of a large impact on general practitioners' confidence in conclusions, a three-arm randomized controlled trial.

    Science.gov (United States)

    Buffel du Vaure, Céline; Boutron, Isabelle; Perrodeau, Elodie; Ravaud, Philippe

    2014-04-28

    Systematic reporting of funding sources is recommended in the CONSORT Statement for abstracts. However, no specific recommendation is related to the reporting of conflicts of interest (CoI). The objective was to compare physicians' confidence in the conclusions of abstracts of randomized controlled trials of pharmaceutical treatment indexed in PubMed. We planned a three-arm parallel-group randomized trial. French general practitioners (GPs) were invited to participate and were blinded to the study's aim. We used a representative sample of 75 abstracts of pharmaceutical industry-funded randomized controlled trials published in 2010 and indexed in PubMed. Each abstract was standardized and reported in three formats: 1) no mention of the funding source or CoI; 2) reporting the funding source only; and 3) reporting the funding source and CoI. GPs were randomized according to a computerized randomization on a secure Internet system at a 1:1:1 ratio to assess one abstract among the three formats. The primary outcome was GPs' confidence in the abstract conclusions (0, not at all, to 10, completely confident). The study was planned to detect a large difference with an effect size of 0.5. Between October 2012 and June 2013, among 605 GPs contacted, 354 were randomized, 118 for each type of abstract. The mean difference (95% confidence interval) in GPs' confidence in abstract findings was 0.2 (-0.6; 1.0) (P = 0.84) for abstracts reporting the funding source only versus no funding source or CoI; -0.4 (-1.3; 0.4) (P = 0.39) for abstracts reporting the funding source and CoI versus no funding source and CoI; and -0.6 (-1.5; 0.2) (P = 0.15) for abstracts reporting the funding source and CoI versus the funding source only. We found no evidence of a large impact of trial report abstracts mentioning funding sources or CoI on GPs' confidence in the conclusions of the abstracts. ClinicalTrials.gov identifier: NCT01679873.

  14. Random Subspace Aggregation for Cancer Prediction with Gene Expression Profiles

    Directory of Open Access Journals (Sweden)

    Liying Yang

    2016-01-01

    Full Text Available Background. Precisely predicting cancer is crucial for cancer treatment. Gene expression profiles make it possible to analyze patterns between genes and cancers on the genome-wide scale. Gene expression data analysis, however, is confronted with enormous challenges because of its characteristics, such as high dimensionality, small sample size, and a low signal-to-noise ratio. Results. This paper proposes a method, termed RS_SVM, to predict cancer from gene expression profiles by aggregating SVMs trained on random subspaces. After choosing gene features through statistical analysis, RS_SVM randomly selects feature subsets to yield random subspaces, trains SVM classifiers on them, and then aggregates the classifiers to capture the advantage of ensemble learning. Experiments on eight real gene expression datasets are performed to validate the RS_SVM method. Experimental results show that RS_SVM achieved better classification accuracy and generalization performance in contrast with a single SVM, K-nearest neighbor, decision tree, Bagging, AdaBoost, and other state-of-the-art methods. Experiments also explored the effect of subspace size on prediction performance. Conclusions. The proposed RS_SVM method yielded superior performance in analyzing gene expression profiles, which demonstrates that RS_SVM is a good approach for such biological data.
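
    A minimal random-subspace SVM ensemble in the spirit of RS_SVM, sketched on synthetic high-dimensional, small-sample data rather than the eight gene expression datasets of the study; the subspace size, ensemble size, and kernel are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic "high-dimensional, small-sample" stand-in for gene expression data.
X, y = make_classification(n_samples=120, n_features=2000, n_informative=40,
                           n_redundant=0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

n_estimators, subspace_size = 25, 200          # illustrative ensemble settings
members = []
for _ in range(n_estimators):
    features = rng.choice(X.shape[1], size=subspace_size, replace=False)
    clf = SVC(kernel="linear", C=1.0).fit(X_tr[:, features], y_tr)
    members.append((features, clf))

# Aggregate by majority vote over the subspace-trained SVMs.
votes = np.array([clf.predict(X_te[:, feats]) for feats, clf in members])
ensemble_pred = (votes.mean(axis=0) > 0.5).astype(int)

single = SVC(kernel="linear", C=1.0).fit(X_tr, y_tr)
print("single SVM accuracy       :", round(single.score(X_te, y_te), 3))
print("subspace ensemble accuracy:", round((ensemble_pred == y_te).mean(), 3))
```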

  15. Biological Principles and Threshold Concepts for Understanding Natural Selection. Implications for Developing Visualizations as a Pedagogic Tool

    Science.gov (United States)

    Tibell, Lena A. E.; Harms, Ute

    2017-11-01

    Modern evolutionary theory is both a central theory and an integrative framework of the life sciences. This is reflected in the common references to evolution in modern science education curricula and contexts. In fact, evolution is a core idea that is supposed to support biology learning by facilitating the organization of relevant knowledge. In addition, evolution can function as a pivotal link between concepts and highlight similarities in the complexity of biological concepts. However, empirical studies in many countries have for decades identified deficiencies in students' scientific understanding of evolution mainly focusing on natural selection. Clearly, there are major obstacles to learning natural selection, and we argue that to overcome them, it is essential to address explicitly the general abstract concepts that underlie the biological processes, e.g., randomness or probability. Hence, we propose a two-dimensional framework for analyzing and structuring teaching of natural selection. The first—purely biological—dimension embraces the three main principles variation, heredity, and selection structured in nine key concepts that form the core idea of natural selection. The second dimension encompasses four so-called thresholds, i.e., general abstract and/or non-perceptual concepts: randomness, probability, spatial scales, and temporal scales. We claim that both of these dimensions must be continuously considered, in tandem, when teaching evolution in order to allow development of a meaningful understanding of the process. Further, we suggest that making the thresholds tangible with the aid of appropriate kinds of visualizations will facilitate grasping of the threshold concepts, and thus, help learners to overcome the difficulties in understanding the central theory of life.

  16. Low-temperature random matrix theory at the soft edge

    International Nuclear Information System (INIS)

    Edelman, Alan; Persson, Per-Olof; Sutton, Brian D.

    2014-01-01

    “Low temperature” random matrix theory is the study of random eigenvalues as energy is removed. In standard notation, β is identified with inverse temperature, and low temperatures are achieved through the limit β → ∞. In this paper, we derive statistics for low-temperature random matrices at the “soft edge,” which describes the extreme eigenvalues for many random matrix distributions. Specifically, new asymptotics are found for the expected value and standard deviation of the general-β Tracy-Widom distribution. The new techniques utilize beta ensembles, stochastic differential operators, and Riccati diffusions. The asymptotics fit known high-temperature statistics curiously well and contribute to the larger program of general-β random matrix theory.
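
    The paper works at general β via stochastic operators; as a purely numerical illustration of soft-edge statistics (not the paper's method), the sketch below samples dense GOE matrices (the β = 1 case) and rescales the largest eigenvalue into Tracy-Widom coordinates.

```python
import numpy as np

rng = np.random.default_rng(0)

def goe_largest_eigenvalue(n):
    """Largest eigenvalue of an n x n GOE matrix (off-diagonal variance 1)."""
    a = rng.standard_normal((n, n))
    h = (a + a.T) / np.sqrt(2.0)
    return np.linalg.eigvalsh(h)[-1]

# Soft-edge rescaling: for GOE, n**(1/6) * (lambda_max - 2*sqrt(n)) converges
# in distribution to the beta = 1 Tracy-Widom law as n grows.
n, n_samples = 400, 300
samples = np.array([goe_largest_eigenvalue(n) for _ in range(n_samples)])
rescaled = n ** (1.0 / 6.0) * (samples - 2.0 * np.sqrt(n))
print("mean of rescaled lambda_max:", rescaled.mean().round(3))   # TW1 mean is about -1.21
print("std  of rescaled lambda_max:", rescaled.std().round(3))    # TW1 std  is about  1.27
```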

  17. Generalized random sequential adsorption of polydisperse mixtures on a one-dimensional lattice

    International Nuclear Information System (INIS)

    Lončarević, I; Budinski-Petković, Lj; Vrhovac, S B; Belić, A

    2010-01-01

    Generalized random sequential adsorption (RSA) of polydisperse mixtures of k-mers on a one-dimensional lattice is studied numerically by means of Monte Carlo simulations. The kinetics of the deposition process of mixtures is studied for the irreversible case, for adsorption–desorption processes and for the case where adsorption, desorption and diffusion are present simultaneously. We concentrate here on the influence of the number of mixture components and the length of the k-mers making up the mixture on the temporal behavior of the coverage fraction θ(t). The approach of the coverage θ(t) to the jamming limit θ_jam in the case of irreversible RSA is found to be exponential, θ_jam − θ(t) ∝ exp(−t/σ), not only for a whole mixture, but also for the individual components. For the reversible deposition of polydisperse mixtures, we find that after the initial 'jamming', a stretched exponential growth of the coverage θ(t) towards the equilibrium state value θ_eq occurs, i.e., θ_eq − θ(t) ∝ exp[−(t/τ)^β]. The characteristic timescale τ is found to decrease with the desorption probability P_des. When adsorption, desorption and diffusion occur simultaneously, the coverage of a mixture always reaches an equilibrium value θ_eq, but there is a significant difference in temporal evolution between the coverage with diffusion and that without.
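
    A minimal Monte Carlo of the irreversible case for a two-component k-mer mixture on a periodic 1D lattice, as a hedged illustration only: the lattice size, mixture composition, time budget, and equal-probability component selection are assumptions, and desorption and diffusion are omitted.

```python
import random

random.seed(0)

L = 10_000                      # lattice sites, periodic boundaries
lattice = [0] * L
k_values = [2, 5]               # two-component mixture of k-mers
coverage_history = []

attempts_per_unit_time = L      # one Monte Carlo time unit = L deposition attempts
for t in range(200):            # illustrative time budget; true jamming takes longer
    for _ in range(attempts_per_unit_time):
        k = random.choice(k_values)               # equal-probability mixture
        start = random.randrange(L)
        sites = [(start + j) % L for j in range(k)]
        if all(lattice[s] == 0 for s in sites):   # irreversible deposition only onto empty sites
            for s in sites:
                lattice[s] = k
    coverage_history.append(sum(1 for s in lattice if s) / L)

theta = coverage_history[-1]
print("coverage theta(t) at t = 200:", round(theta, 4))
print("per-component coverage:",
      {k: sum(1 for s in lattice if s == k) / L for k in k_values})
```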

  18. Effect of whole-body vibration exercise on mobility, balance ability and general health status in frail elderly patients: a pilot randomized controlled trial.

    Science.gov (United States)

    Zhang, Li; Weng, Changshui; Liu, Miao; Wang, Qiuhua; Liu, Liming; He, Yao

    2014-01-01

    To study the effects of whole-body vibration exercises on the mobility function, balance and general health status, and its feasibility as an intervention in frail elderly patients. Pilot randomized controlled trial. Forty-four frail older persons (85.27 ± 3.63 years) meeting the Fried Frailty Criteria. All eligible subjects were randomly assigned to the experimental group, who received a whole-body vibration exercise alone (vibration amplitude: 1-3 mm; frequency: 6-26 Hz; 4-5 bouts × 60 seconds; 3-5 times weekly), or a control group, who received usual care and exercises for eight weeks. The Timed Up and Go Test, 30-second chair stand test, lower extremities muscle strength, balance function, balance confidence and General Health Status were assessed at the beginning of the study, after four weeks and eight weeks of the intervention. Whole-body vibration exercise reduced the time of the Timed Up and Go Test (40.47 ± 15.94 s to 21.34 ± 4.42 s), improved the bilateral knees extensor strength (6.96 ± 1.70 kg to 11.26 ± 2.08 kg), the posture stability (surface area ellipse: 404.58 ± 177.05 to 255.95 ± 107.28) and General Health Status (Short-form Health Survey score: 24.51 ± 10.69 and 49.63 ± 9.85 to 45.03 ± 11.15 and 65.23 ± 9.39, respectively). The repeated-measures ANOVA showed that there were significant differences in the Timed Up and Go Test, 30-second chair stand test, bilateral knees extensor strength, activities-specific balance confidence score and general health status between the two groups (P < 0.05). Whole-body vibration exercise appears to be a feasible intervention that can improve mobility, balance and the general health status in the frail elderly.

  19. Object grammars and random generation

    Directory of Open Access Journals (Sweden)

    I. Dutour

    1998-12-01

    Full Text Available This paper presents a new systematic approach for the uniform random generation of combinatorial objects. The method is based on the notion of object grammars which give recursive descriptions of objects and generalize context-free grammars. The application of particular valuations to these grammars leads to enumeration and random generation of objects according to non-algebraic parameters.

  20. Random power series in the unit ball of Cn

    International Nuclear Information System (INIS)

    Shi Jihuai.

    1989-07-01

    The random power series in the unit disc has been studied by many authors. In this paper, we studied the random power series in the unit ball of C^n and generalized some results in the unit disc to the unit ball; in particular, the result obtained recently by Duren has been generalized to the unit ball. The main tool used here is the generalized Salem-Zygmund theorem. (author). 12 refs

  1. Random survival forests for competing risks

    DEFF Research Database (Denmark)

    Ishwaran, Hemant; Gerds, Thomas A; Kogalur, Udaya B

    2014-01-01

    We introduce a new approach to competing risks using random forests. Our method is fully non-parametric and can be used for selecting event-specific variables and for estimating the cumulative incidence function. We show that the method is highly effective for both prediction and variable selection...

  2. Using Random Numbers in Science Research Activities.

    Science.gov (United States)

    Schlenker, Richard M.; And Others

    1996-01-01

    Discusses the importance of science process skills and describes ways to select sets of random numbers for selection of subjects for a research study in an unbiased manner. Presents an activity appropriate for grades 5-12. (JRH)

  3. Randomization of grab-sampling strategies for estimating the annual exposure of U miners to Rn daughters.

    Science.gov (United States)

    Borak, T B

    1986-04-01

    Periodic grab sampling in combination with time-of-occupancy surveys has been the accepted procedure for estimating the annual exposure of underground U miners to Rn daughters. Temporal variations in the concentration of potential alpha energy in the mine generate uncertainties in this process. A system to randomize the selection of locations for measurement is described which can reduce uncertainties and eliminate systematic biases in the data. In general, a sample frequency of 50 measurements per year is sufficient to satisfy the criteria that the annual exposure be determined in working level months to within +/- 50% of the true value with a 95% level of confidence. Suggestions for implementing this randomization scheme are presented.

  4. Random Dynamics

    Science.gov (United States)

    Bennett, D. L.; Brene, N.; Nielsen, H. B.

    1987-01-01

    The goal of random dynamics is the derivation of the laws of Nature as we know them (standard model) from inessential assumptions. The inessential assumptions made here are expressed as sets of general models at extremely high energies: gauge glass and spacetime foam. Both sets of models lead tentatively to the standard model.

  5. Application of Generalized Student’s T-Distribution In Modeling The Distribution of Empirical Return Rates on Selected Stock Exchange Indexes

    Directory of Open Access Journals (Sweden)

    Purczyńskiz Jan

    2014-07-01

    Full Text Available This paper examines the application of the so-called generalized Student’s t-distribution in modeling the distribution of empirical return rates on selected Warsaw Stock Exchange indexes. It deals with the estimation of the distribution parameters by means of the method of logarithmic moments, the maximum likelihood method and the method of moments. The generalized Student’s t-distribution ensures better fitting to empirical data than the classical Student’s t-distribution.

  6. Random magnetism

    International Nuclear Information System (INIS)

    Tsallis, C.

    1980-03-01

    The 'ingredients' which control a phase transition in well defined systems as well as in random ones (e.g. random magnetic systems) are listed and discussed within a somehow unifying perspective. Among these 'ingredients' we find the couplings and elements responsible for the cooperative phenomenon, the topological connectivity as well as possible topological incompatibilities, the influence of new degrees of freedom, the order parameter dimensionality, the ground state degeneracy and finally the 'quanticity' of the system. The general trends, though illustrated in magnetic systems, essentially hold for all phase transitions, and give a basis for connection of this area with Field theory, Theory of dynamical systems, etc. (Author) [pt

  7. Random magnetism

    International Nuclear Information System (INIS)

    Tsallis, C.

    1981-01-01

    The 'ingredients' which control a phase transition in well defined systems as well as in random ones (e.q. random magnetic systems) are listed and discussed within a somehow unifying perspective. Among these 'ingredients' the couplings and elements responsible for the cooperative phenomenon, the topological connectivity as well as possible topological incompatibilities, the influence of new degrees of freedom, the order parameter dimensionality, the ground state degeneracy and finally the 'quanticity' of the system are found. The general trends, though illustrated in magnetic systems, essentially hold for all phase transitions, and give a basis for connection of this area with Field theory, Theory of dynamical systems, etc. (Author) [pt

  8. A General Catalyst for Site-Selective C(sp(3))-H Bond Amination of Activated Secondary over Tertiary Alkyl C(sp(3))-H Bonds.

    Science.gov (United States)

    Scamp, Ryan J; Jirak, James G; Dolan, Nicholas S; Guzei, Ilia A; Schomaker, Jennifer M

    2016-06-17

    The discovery of transition metal complexes capable of promoting general, catalyst-controlled and selective carbon-hydrogen (C-H) bond amination of activated secondary C-H bonds over tertiary alkyl C(sp(3))-H bonds is challenging, as substrate control often dominates when reactive nitrene intermediates are involved. In this letter, we report the design of a new silver complex, [(Py5Me2)AgOTf]2, that displays general and good-to-excellent selectivity for nitrene insertion into propargylic, benzylic, and allylic C-H bonds over tertiary alkyl C(sp(3))-H bonds.

  9. Orthogonal polynomials and random matrices

    CERN Document Server

    Deift, Percy

    2000-01-01

    This volume expands on a set of lectures held at the Courant Institute on Riemann-Hilbert problems, orthogonal polynomials, and random matrix theory. The goal of the course was to prove universality for a variety of statistical quantities arising in the theory of random matrix models. The central question was the following: Why do very general ensembles of random n × n matrices exhibit universal behavior as n → ∞? The main ingredient in the proof is the steepest descent method for oscillatory Riemann-Hilbert problems.

  10. Random queues and risk averse users

    DEFF Research Database (Denmark)

    de Palma, André; Fosgerau, Mogens

    2013-01-01

    We analyze Nash equilibrium in time of use of a congested facility. Users are risk averse with general concave utility. Queues are subject to varying degrees of random sorting, ranging from strict queue priority to a completely random queue. We define the key “no residual queue” property, which...

  11. Conditional Monte Carlo randomization tests for regression models.

    Science.gov (United States)

    Parhat, Parwen; Rosenberger, William F; Diao, Guoqing

    2014-08-15

    We discuss the computation of randomization tests for clinical trials of two treatments when the primary outcome is based on a regression model. We begin by revisiting the seminal paper of Gail, Tan, and Piantadosi (1988), and then describe a method based on Monte Carlo generation of randomization sequences. The tests based on this Monte Carlo procedure are design based, in that they incorporate the particular randomization procedure used. We discuss permuted block designs, complete randomization, and biased coin designs. We also use a new technique by Plamadeala and Rosenberger (2012) for simple computation of conditional randomization tests. Like Gail, Tan, and Piantadosi, we focus on residuals from generalized linear models and martingale residuals from survival models. Such techniques do not apply to longitudinal data analysis, and we introduce a method for computation of randomization tests based on the predicted rate of change from a generalized linear mixed model when outcomes are longitudinal. We show, by simulation, that these randomization tests preserve the size and power well under model misspecification. Copyright © 2014 John Wiley & Sons, Ltd.
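
    A hedged sketch of a design-based Monte Carlo randomization test: simulated trial data, a permuted-block randomization procedure, and the OLS treatment coefficient as the test statistic. The residual-based and longitudinal statistics discussed in the paper are not reproduced, and all data and settings below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def permuted_block_assignment(n, block_size=4):
    """Two-arm assignment generated with permuted blocks of the given size."""
    blocks = []
    while len(blocks) < n:
        blocks.extend(rng.permutation([0] * (block_size // 2) + [1] * (block_size // 2)))
    return np.array(blocks[:n])

def treatment_statistic(y, x, treat):
    """Estimated treatment coefficient from OLS of y on [1, x, treat]."""
    design = np.column_stack([np.ones_like(x), x, treat])
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    return beta[-1]

# Simulated trial: baseline covariate x, modest true treatment effect 0.4.
n = 80
x = rng.standard_normal(n)
treat_obs = permuted_block_assignment(n)
y = 1.0 + 0.8 * x + 0.4 * treat_obs + rng.standard_normal(n)

observed = treatment_statistic(y, x, treat_obs)

# Design-based Monte Carlo: regenerate assignments with the same randomization
# procedure, keeping outcomes fixed, and compare the statistic's null distribution.
n_monte_carlo = 5000
null_stats = np.array([treatment_statistic(y, x, permuted_block_assignment(n))
                       for _ in range(n_monte_carlo)])
p_value = (np.sum(np.abs(null_stats) >= abs(observed)) + 1) / (n_monte_carlo + 1)
print(f"observed treatment coefficient: {observed:.3f}, randomization p-value: {p_value:.4f}")
```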

  12. Random dynamics

    International Nuclear Information System (INIS)

    Bennett, D.L.

    1987-01-01

    The goal of random dynamics is the derivation of the laws of Nature as we know them (standard model) from inessential assumptions. The inessential assumptions made here are expressed as sets of general models at extremely high energies: Gauge glass and spacetime foam. Both sets of models lead tentatively to the standard model. (orig.)

  13. Random dynamics

    International Nuclear Information System (INIS)

    Bennett, D.L.; Brene, N.; Nielsen, H.B.

    1986-06-01

    The goal of random dynamics is the derivation of the laws of Nature as we know them (standard model) from inessential assumptions. The inessential assumptions made here are expressed as sets of general models at extremely high energies: gauge glass and spacetime foam. Both sets of models lead tentatively to the standard model. (orig.)

  14. Bell inequalities for random fields

    Energy Technology Data Exchange (ETDEWEB)

    Morgan, Peter [Physics Department, Yale University, CT 06520 (United States)

    2006-06-09

    The assumptions required for the derivation of Bell inequalities are not satisfied for random field models in which there are any thermal or quantum fluctuations, in contrast to the general satisfaction of the assumptions for classical two point particle models. Classical random field models that explicitly include the effects of quantum fluctuations on measurement are possible for experiments that violate Bell inequalities.

  15. Bell inequalities for random fields

    OpenAIRE

    Morgan, Peter

    2004-01-01

    The assumptions required for the derivation of Bell inequalities are not usually satisfied for random fields in which there are any thermal or quantum fluctuations, in contrast to the general satisfaction of the assumptions for classical two point particle models. Classical random field models that explicitly include the effects of quantum fluctuations on measurement are possible for experiments that violate Bell inequalities.

  16. Correlates of smoking with socioeconomic status, leisure time physical activity and alcohol consumption among Polish adults from randomly selected regions.

    Science.gov (United States)

    Woitas-Slubowska, Donata; Hurnik, Elzbieta; Skarpańska-Stejnborn, Anna

    2010-12-01

    To determine the association between smoking status and leisure time physical activity (LTPA), alcohol consumption, and socioeconomic status (SES) among Polish adults. 466 randomly selected men and women (aged 18-66 years) responded to an anonymous questionnaire regarding smoking, alcohol consumption, LTPA, and SES. Multiple logistic regression was used to examine the association of smoking status with six socioeconomic measures, level of LTPA, and frequency and type of alcohol consumed. Smokers were defined as individuals smoking occasionally or daily. The odds of being a smoker were 9 times (men) and 27 times (women) higher among respondents who drink alcohol several times/week or every day in comparison to non-drinkers. The odds of smoking among men with low educational attainment were also higher compared to those with high educational attainment (p = 0.007). Among women we observed that students were the most frequent smokers. Female students were almost three times more likely to smoke than non-professional women, and two times more likely than physical workers (p = 0.018). The findings of this study indicated that among randomly selected Polish men and women aged 18-66, smoking and alcohol consumption tended to cluster. These results imply that intervention strategies need to target multiple risk factors simultaneously. The highest risk of smoking was observed among low educated men, female students, and both men and women drinking alcohol several times a week or every day. Information on subgroups with a high risk of smoking will help in planning future preventive strategies.

  17. Random processes in nuclear reactors

    CERN Document Server

    Williams, M M R

    1974-01-01

    Random Processes in Nuclear Reactors describes the problems that a nuclear engineer may meet which involve random fluctuations and sets out in detail how they may be interpreted in terms of various models of the reactor system. Chapters discuss the origins of random processes and sources; the application of the general technique to zero-power problems, bringing out the basic effects of fission, and of fluctuations in the lifetime of neutrons, on the measured response; the interpretation of power reactor noise; and associated problems connected with mechanical, hydraulic and thermal noise sources

  18. Query construction, entropy, and generalization in neural-network models

    Science.gov (United States)

    Sollich, Peter

    1994-05-01

    We study query construction algorithms, which aim at improving the generalization ability of systems that learn from examples by choosing optimal, nonredundant training sets. We set up a general probabilistic framework for deriving such algorithms from the requirement of optimizing a suitable objective function; specifically, we consider the objective functions entropy (or information gain) and generalization error. For two learning scenarios, the high-low game and the linear perceptron, we evaluate the generalization performance obtained by applying the corresponding query construction algorithms and compare it to training on random examples. We find qualitative differences between the two scenarios due to the different structure of the underlying rules (nonlinear and "noninvertible" versus linear); in particular, for the linear perceptron, random examples lead to the same generalization ability as a sequence of queries in the limit of an infinite number of examples. We also investigate learning algorithms which are ill matched to the learning environment and find that, in this case, minimum entropy queries can in fact yield a lower generalization ability than random examples. Finally, we study the efficiency of single queries and its dependence on the learning history, i.e., on whether the previous training examples were generated randomly or by querying, and the difference between globally and locally optimal query construction.
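
    A toy teacher-student version of query construction versus random examples, with uncertainty (small-margin) queries standing in for the information-gain criterion and a least-squares student; the dimension, pool size, and learner are illustrative assumptions, and which strategy wins can depend on the learner, echoing the paper's point about ill-matched algorithms.

```python
import numpy as np

rng = np.random.default_rng(0)

d, pool_size, n_queries = 20, 2000, 60
teacher = rng.standard_normal(d)
teacher /= np.linalg.norm(teacher)
pool = rng.standard_normal((pool_size, d))
labels = np.sign(pool @ teacher)

def generalization_error(student):
    """Angle-based generalization error between teacher and student directions."""
    cos = abs(student @ teacher) / (np.linalg.norm(student) + 1e-12)
    return np.arccos(np.clip(cos, -1.0, 1.0)) / np.pi

def run(strategy):
    chosen = [int(rng.integers(pool_size))]               # one random seed example
    errors = []
    for _ in range(n_queries - 1):
        X, y = pool[chosen], labels[chosen]
        student, *_ = np.linalg.lstsq(X, y, rcond=None)   # simple least-squares learner
        errors.append(generalization_error(student))
        remaining = np.setdiff1d(np.arange(pool_size), chosen)
        if strategy == "query":
            margins = np.abs(pool[remaining] @ student)   # uncertainty = small margin
            chosen.append(int(remaining[np.argmin(margins)]))
        else:
            chosen.append(int(rng.choice(remaining)))
    return errors[-1]

print("final error, random examples:", round(run("random"), 3))
print("final error, margin queries :", round(run("query"), 3))
```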

  19. Phenobarbital Versus Valproate for Generalized Convulsive Status Epilepticus in Adults: A Prospective Randomized Controlled Trial in China.

    Science.gov (United States)

    Su, Yingying; Liu, Gang; Tian, Fei; Ren, Guoping; Jiang, Mengdi; Chun, Brian; Zhang, Yunzhou; Zhang, Yan; Ye, Hong; Gao, Daiquan; Chen, Weibi

    2016-12-01

    Although generalized convulsive status epilepticus (GCSE) is a life-threatening emergency, evidence-based data to guide initial drug treatment choices are lacking in the Chinese population. We conducted this prospective, randomized, controlled trial to evaluate the relative efficacy and safety of intravenous phenobarbital and valproate in patients with GCSE. After the failure of first-line diazepam treatment, Chinese adult patients with GCSE were randomized to receive either intravenous phenobarbital (standard doses, low rate) or valproate (standard). Successful treatment was considered when clinical and electroencephalographic seizure activity ceased. Adverse events following treatment, as well as the neurological outcomes at discharge and 3 months later, were also evaluated. Overall, 73 cases were enrolled in the study. Intravenous phenobarbital was successful in 81.1% of patients, and intravenous valproate was successful in 44.4% of patients. The rate of seizure recurrence after initial control in patients receiving phenobarbital (6.7%) was significantly lower than that in patients receiving valproate (31.3%), and the total number of adverse events did not differ significantly between the two groups (p > 0.05). In the phenobarbital group, two patients (5.4%) required ventilation and two patients (5.4%) developed serious hypotension. The neurological outcomes of the phenobarbital group were generally better than those of the valproate group; however, no significant differences were observed between phenobarbital and valproate with respect to mortality (8.1 vs. 16.6%) at discharge, or mortality (16.2 vs. 30.5%) and post-symptomatic epilepsy (26.3 vs. 42.8%) at 3-month follow-up. Intravenous phenobarbital appears to be more effective than intravenous valproate for Chinese adult patients with GCSE. The occurrence of serious respiratory depression and hypotension caused by phenobarbital was reduced by decreasing the intravenous infusion rate; however, even at a lower infusion rate than typically used in other institutions, intravenous

  20. A generalization of Friedman's rank statistic

    NARCIS (Netherlands)

    Kroon, de J.; Laan, van der P.

    1983-01-01

    In this paper a very natural generalization of the two-way analysis of variance rank statistic of FRIEDMAN is given. The general distribution-free test procedure based on this statistic for the effect of J treatments in a random block design can be applied in general two-way layouts without
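
    For reference, the classical Friedman statistic for a random block design (not the paper's generalization) can be computed by hand and checked against SciPy; the data below are simulated with illustrative treatment effects.

```python
import numpy as np
from scipy.stats import friedmanchisquare, rankdata

rng = np.random.default_rng(0)

# Random block design: n blocks (rows), J treatments (columns).
n_blocks, J = 12, 4
data = rng.standard_normal((n_blocks, J)) + np.array([0.0, 0.2, 0.5, 0.8])

# Classical Friedman statistic: rank within each block, then compare column rank sums.
ranks = np.apply_along_axis(rankdata, 1, data)
rank_sums = ranks.sum(axis=0)
chi2 = 12.0 / (n_blocks * J * (J + 1)) * np.sum(rank_sums ** 2) - 3.0 * n_blocks * (J + 1)
print("hand-computed Friedman chi-square:", round(chi2, 4))

# Cross-check with SciPy (expects one sample per treatment).
stat, p = friedmanchisquare(*data.T)
print("scipy Friedman chi-square        :", round(stat, 4), " p-value:", round(p, 4))
```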

  1. Effects of one versus two bouts of moderate intensity physical activity on selective attention during a school morning in Dutch primary schoolchildren: A randomized controlled trial.

    Science.gov (United States)

    Altenburg, Teatske M; Chinapaw, Mai J M; Singh, Amika S

    2016-10-01

    Evidence suggests that physical activity is positively related to several aspects of cognitive functioning in children, among which is selective attention. To date, no information is available on the optimal frequency of physical activity on cognitive functioning in children. The current study examined the acute effects of one and two bouts of moderate-intensity physical activity on children's selective attention. Randomized controlled trial (ISRCTN97975679). Thirty boys and twenty-six girls, aged 10-13 years, were randomly assigned to three conditions: (A) sitting all morning working on simulated school tasks; (B) one 20-min physical activity bout after 90min; and (C) two 20-min physical activity bouts, i.e. at the start and after 90min. Selective attention was assessed at five time points during the morning (i.e. at baseline and after 20, 110, 130 and 220min), using the 'Sky Search' subtest of the 'Test of Selective Attention in Children'. We used GEE analysis to examine differences in Sky Search scores between the three experimental conditions, adjusting for school, baseline scores, self-reported screen time and time spent in sports. Children who performed two 20-min bouts of moderate-intensity physical activity had significantly better Sky Search scores compared to children who performed one physical activity bout or remained seated the whole morning (B=-0.26; 95% CI=[-0.52; -0.00]). Our findings support the importance of repeated physical activity during the school day for beneficial effects on selective attention in children. Copyright © 2015 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  2. The Amnesiac Lookback Option: Selectively Monitored Lookback Options and Cryptocurrencies

    Directory of Open Access Journals (Sweden)

    Ho-Chun Herbert Chang

    2018-05-01

    Full Text Available This study proposes a strategy to make the lookback option cheaper and more practical, and suggests the use of its properties to reduce risk exposure in cryptocurrency markets through blockchain-enforced smart contracts and to correct for informational inefficiencies surrounding prices and volatility. This paper generalizes partial, discretely-monitored lookback options that dilute premiums by selecting a subset of specified periods to determine payoff, which we call amnesiac lookback options. Prior literature on discretely-monitored lookback options considers the number of periods and assumes equidistant lookback periods in pricing partial lookback options. This study, by contrast, considers random sampling of lookback periods and compares the resulting payoffs of the call, put and spread options under floating and fixed strikes. Amnesiac lookbacks are priced with Monte Carlo simulations of Gaussian random walks under equidistant and random periods. Results are compared to analytic and binomial pricing models for the same derivatives. Simulations show diminishing marginal increases to the fair price as the number of selected periods is increased. The returns correspond to a Hill curve whose parameters are set by interest rate and volatility. We demonstrate over-pricing under equidistant monitoring assumptions, with the error increasing as the number of lookback periods decreases. A direct implication for event trading is that when a shock is forecast but its timing is uncertain, equidistant sampling produces a lower error on the true maximum than random choice. We conclude that the instrument provides an ideal space for investors to balance their risk and is a prime candidate for hedging extreme volatility. We discuss the application of the amnesiac lookback option and path-dependent options to cryptocurrencies and blockchain commodities in the context of smart contracts.
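
    A hedged Monte Carlo sketch of the idea: a fixed-strike, discretely monitored lookback call under geometric Brownian motion, with the payoff taken over either equidistant or randomly sampled monitoring dates. All market parameters and the GBM model are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

rng = np.random.default_rng(0)

def lookback_call_price(n_paths=20_000, n_steps=252, n_monitored=12,
                        s0=100.0, strike=100.0, r=0.02, sigma=0.3, T=1.0,
                        random_dates=True):
    """Monte Carlo price of a fixed-strike 'amnesiac' lookback call under GBM,
    where the payoff uses the maximum over a subset of monitoring dates."""
    dt = T / n_steps
    z = rng.standard_normal((n_paths, n_steps))
    log_paths = np.cumsum((r - 0.5 * sigma ** 2) * dt + sigma * np.sqrt(dt) * z, axis=1)
    paths = s0 * np.exp(log_paths)

    if random_dates:
        dates = np.sort(rng.choice(n_steps, size=n_monitored, replace=False))
    else:
        dates = np.linspace(0, n_steps - 1, n_monitored).astype(int)

    monitored_max = paths[:, dates].max(axis=1)
    payoff = np.maximum(monitored_max - strike, 0.0)
    return np.exp(-r * T) * payoff.mean()

# Fewer monitored dates dilute the premium relative to full monitoring.
print("fully monitored lookback call  :", round(lookback_call_price(n_monitored=252, random_dates=False), 3))
print("12 equidistant monitoring dates:", round(lookback_call_price(random_dates=False), 3))
print("12 random monitoring dates     :", round(lookback_call_price(random_dates=True), 3))
```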

  3. The Effect of EMDR and CBT on Low Self-esteem in a General Psychiatric Population: A Randomized Controlled Trial.

    Science.gov (United States)

    Griffioen, Brecht T; van der Vegt, Anna A; de Groot, Izaäk W; de Jongh, Ad

    2017-01-01

    Although low self-esteem has been found to be an important factor in the development and maintenance of psychopathology, surprisingly little is known about its treatment. This study investigated the effectiveness of Eye Movement Desensitization and Reprocessing (EMDR) therapy and Cognitive Behavioural Therapy (CBT) with regard to their capacity to enhance self-esteem in a general psychiatric secondary health care population. A randomized controlled trial with two parallel groups was used. Participants were randomly allocated to either 10 weekly sessions of EMDR (n = 15) or CBT (n = 15). They were assessed pre-treatment, after each session, post treatment and at 3 months follow-up on self-esteem (Rosenberg Self-esteem Scale and Credibility of Core Beliefs), psychological symptoms (Brief Symptom Inventory), social anxiety, and social interaction (Inventory of Interpersonal Situations) (IIS). The data were analyzed using repeated measures ANOVA for the complete cases (n = 19) and intention-to-treat (n = 30) to examine differences over time and between conditions. Both groups, EMDR as well as CBT, showed significant improvements on self-esteem, increasing two standard deviations on the main parameter (RSES). Furthermore, the results showed significant reductions in general psychiatric symptoms. The effects were maintained at 3 months follow-up. No between-group differences could be detected. Although the small sample requires caution in the interpretation of the findings, the results suggest that, when offering an adequate number of sessions, both EMDR and CBT have the potential to be effective treatments for patients with low self-esteem and a wide range of comorbid psychiatric conditions. This study was registered at www.trialregister.nl with identifier NTR4611.

  4. The Effect of EMDR and CBT on Low Self-esteem in a General Psychiatric Population: A Randomized Controlled Trial

    Directory of Open Access Journals (Sweden)

    Brecht T. Griffioen

    2017-11-01

    Full Text Available Although low self-esteem has been found to be an important factor in the development and maintenance of psychopathology, surprisingly little is known about its treatment. This study investigated the effectiveness of Eye Movement Desensitization and Reprocessing (EMDR) therapy and Cognitive Behavioural Therapy (CBT) regarding their capacity to enhance self-esteem in a general psychiatric secondary health care population. A randomized controlled trial with two parallel groups was used. Participants were randomly allocated to either 10 weekly sessions of EMDR (n = 15) or CBT (n = 15). They were assessed pre-treatment, after each session, post-treatment and at 3 months follow-up on self-esteem (Rosenberg Self-esteem Scale and Credibility of Core Beliefs), psychological symptoms (Brief Symptom Inventory), social anxiety, and social interaction (Inventory of Interpersonal Situations, IIS). The data were analyzed using repeated measures ANOVA for the complete cases (n = 19) and intention-to-treat (n = 30) to examine differences over time and between conditions. Both groups, EMDR as well as CBT, showed significant improvements on self-esteem, increasing two standard deviations on the main parameter (RSES). Furthermore, the results showed significant reductions in general psychiatric symptoms. The effects were maintained at 3 months follow-up. No between-group differences could be detected. Although the small sample requires caution in the interpretation of the findings, the results suggest that, when offering an adequate number of sessions, both EMDR and CBT have the potential to be effective treatments for patients with low self-esteem and a wide range of comorbid psychiatric conditions. This study was registered at www.trialregister.nl with identifier NTR4611.

  5. Random walks on reductive groups

    CERN Document Server

    Benoist, Yves

    2016-01-01

    The classical theory of Random Walks describes the asymptotic behavior of sums of independent identically distributed random real variables. This book explains the generalization of this theory to products of independent identically distributed random matrices with real coefficients. Under the assumption that the action of the matrices is semisimple – or, equivalently, that the Zariski closure of the group generated by these matrices is reductive - and under suitable moment assumptions, it is shown that the norm of the products of such random matrices satisfies a number of classical probabilistic laws. This book includes necessary background on the theory of reductive algebraic groups, probability theory and operator theory, thereby providing a modern introduction to the topic.

  6. Selective prevention of cardiometabolic diseases in general practice: attitudes and working methods of male and female general practitioners before and after the introduction of the Prevention Consultation guideline in the Netherlands

    NARCIS (Netherlands)

    Vos, H.M.M.; Delft, D.H. Van; Kleijn, M.J.J. de; Nielen, M.M.; Schellevis, F.G.; Lagro-Janssen, A.L.M.

    2014-01-01

    RATIONALE, AIMS AND OBJECTIVES: In 2011 the module cardiometabolic risk of the Prevention Consultation guideline was introduced in the Netherlands in order to prevent cardiometabolic diseases. We aimed to compare attitudes and working methods of Dutch general practitioners (GPs) towards selective

  7. Selective prevention of cardiometabolic diseases in general practice: attitudes and working methods of male and female general practitioners before and after the introduction of the Prevention Consultation guideline in the Netherlands.

    NARCIS (Netherlands)

    Vos, H.M.M.; Delft, D.H.W.J.M. van; Kleijn, M.J.J. de; Nielen, M.M.J.; Schellevis, F.G.; Lagro-Janssen, A.L.M.

    2014-01-01

    Rationale, aims and objectives: In 2011 the module cardiometabolic risk of the Prevention Consultation guideline was introduced in the Netherlands in order to prevent cardiometabolic diseases. We aimed to compare attitudes and working methods of Dutch general practitioners (GPs) towards selective

  8. Selective prevention of cardiometabolic diseases in general practice: attitudes and working methods of male and female general practitioners before and after the introduction of the Prevention Consultation guideline in the Netherlands

    NARCIS (Netherlands)

    Vos, H.M.M.; Van Delft, D.H.W.J.; de Kleijn, M.J.J.; Nielen, M.M.J.; Schellevis, F.G.; Lagro-Janssen, A.L.M.

    2014-01-01

    Rationale, aims and objectives: In 2011 the module cardiometabolic risk of the Prevention Consultation guideline was introduced in the Netherlands in order to prevent cardiometabolic diseases. We aimed to compare attitudes and working methods of Dutch general practitioners (GPs) towards selective

  9. Random mutagenesis of the hyperthermophilic archaeon Pyrococcus furiosus using in vitro mariner transposition and natural transformation.

    Science.gov (United States)

    Guschinskaya, Natalia; Brunel, Romain; Tourte, Maxime; Lipscomb, Gina L; Adams, Michael W W; Oger, Philippe; Charpentier, Xavier

    2016-11-08

    Transposition mutagenesis is a powerful tool to identify the function of genes, reveal essential genes and generally to unravel the genetic basis of living organisms. However, transposon-mediated mutagenesis has only been successfully applied to a limited number of archaeal species and has never been reported in Thermococcales. Here, we report random insertion mutagenesis in the hyperthermophilic archaeon Pyrococcus furiosus. The strategy takes advantage of the natural transformability of derivatives of the P. furiosus COM1 strain and of in vitro Mariner-based transposition. A transposon bearing a genetic marker is randomly transposed in vitro in genomic DNA that is then used for natural transformation of P. furiosus. A small-scale transposition reaction routinely generates several hundred and up to two thousand transformants. Southern analysis and sequencing showed that the obtained mutants contain a single, random genomic insertion. Polyploidy has been reported in Thermococcales and P. furiosus is suspected of being polyploid. Yet, about half of the mutants obtained on the first selection are homozygous for the transposon insertion. Two rounds of isolation on selective medium were sufficient to obtain gene conversion in initially heterozygous mutants. This transposition mutagenesis strategy will greatly facilitate functional exploration of the Thermococcales genomes.

  10. Mirnacle: machine learning with SMOTE and random forest for improving selectivity in pre-miRNA ab initio prediction.

    Science.gov (United States)

    Marques, Yuri Bento; de Paiva Oliveira, Alcione; Ribeiro Vasconcelos, Ana Tereza; Cerqueira, Fabio Ribeiro

    2016-12-15

    MicroRNAs (miRNAs) are key gene expression regulators in plants and animals. Therefore, miRNAs are involved in several biological processes, making the study of these molecules one of the most relevant topics of molecular biology nowadays. However, characterizing miRNAs in vivo is still a complex task. As a consequence, in silico methods have been developed to predict miRNA loci. A common ab initio strategy to find miRNAs in genomic data is to search for sequences that can fold into the typical hairpin structure of miRNA precursors (pre-miRNAs). The current ab initio approaches, however, have selectivity issues, i.e., a high number of false positives is reported, which can lead to laborious and costly attempts to provide biological validation. This study presents an extension of the ab initio method miRNAFold, with the aim of improving selectivity through machine learning techniques, namely, random forest combined with the SMOTE procedure, which copes with imbalanced datasets. By comparing our method, termed Mirnacle, with other important approaches in the literature, we demonstrate that Mirnacle substantially improves selectivity without compromising sensitivity. For the three datasets used in our experiments, our method achieved at least 97% sensitivity and could deliver a two-fold, 20-fold, and 6-fold increase in selectivity, respectively, compared with the best results of current computational tools. The extension of miRNAFold with machine learning techniques significantly increases selectivity in pre-miRNA ab initio prediction, which contributes to advanced studies on miRNAs, as the need for biological validation is diminished. Hopefully, new research, such as studies of severe diseases caused by miRNA malfunction, will benefit from the proposed computational tool.
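
    The general recipe named in the abstract, oversampling the minority class with SMOTE before fitting a random forest, can be sketched in Python as follows. This is not Mirnacle's pipeline or feature set; it assumes scikit-learn and the imbalanced-learn package are available and uses a synthetic imbalanced dataset as a stand-in for pre-miRNA candidates.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import classification_report
    from imblearn.over_sampling import SMOTE

    # Synthetic stand-in: roughly 3% positives, as in a typical pre-miRNA candidate set
    X, y = make_classification(n_samples=5000, n_features=30, weights=[0.97, 0.03],
                               random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

    # Oversample the minority class on the training split only, then fit the forest
    X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_tr, y_tr)
    clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_bal, y_bal)
    print(classification_report(y_te, clf.predict(X_te), digits=3))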

  11. Selective enhancement of orientation tuning before saccades.

    Science.gov (United States)

    Ohl, Sven; Kuper, Clara; Rolfs, Martin

    2017-11-01

    Saccadic eye movements cause a rapid sweep of the visual image across the retina and bring the saccade's target into high-acuity foveal vision. Even before saccade onset, visual processing is selectively prioritized at the saccade target. To determine how this presaccadic attention shift exerts its influence on visual selection, we compare the dynamics of perceptual tuning curves before movement onset at the saccade target and in the opposite hemifield. Participants monitored a 30-Hz sequence of randomly oriented gratings for a target orientation. Combining a reverse correlation technique previously used to study orientation tuning in neurons and generalized additive mixed modeling, we found that perceptual reports were tuned to the target orientation. The gain of orientation tuning increased markedly within the last 100 ms before saccade onset. In addition, we observed finer orientation tuning right before saccade onset. This increase in gain and tuning occurred at the saccade target location and was not observed at the incongruent location in the opposite hemifield. The present findings suggest, therefore, that presaccadic attention exerts its influence on vision in a spatially and feature-selective manner, enhancing performance and sharpening feature tuning at the future gaze location before the eyes start moving.
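
    The reverse-correlation logic can be illustrated with a toy Python simulation: random orientations drive a hypothetical observer whose "target seen" reports are tuned to a 0-degree target, and the tuning curve is recovered from the stimulus-response pairs. The tuning width, guessing rate and trial counts below are invented for illustration and are not the study's values.

    import numpy as np

    rng = np.random.default_rng(1)
    n_trials = 20_000
    orientations = rng.uniform(-90, 90, n_trials)           # randomly oriented gratings (deg)

    # Hypothetical observer: Gaussian orientation tuning plus a guessing baseline
    tuning = 0.1 + 0.8 * np.exp(-0.5 * (orientations / 15.0) ** 2)
    reports = rng.random(n_trials) < tuning                  # binary "target seen" reports

    # Reverse correlation: probability of a report as a function of stimulus orientation
    bins = np.linspace(-90, 90, 25)
    centers = 0.5 * (bins[:-1] + bins[1:])
    which = np.digitize(orientations, bins) - 1
    p_report = np.array([reports[which == b].mean() for b in range(len(centers))])

    for c, p in zip(centers, p_report):
        print(f"{c:6.1f} deg  P(report) = {p:.2f}")          # peaks near the 0-deg target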

  12. The Effect of Different Modes of English Captioning on EFL learners’ General Listening Comprehension: Full text Vs. Keyword Captions

    Directory of Open Access Journals (Sweden)

    Sorayya Behroozizad

    2015-08-01

    Full Text Available This study investigated the effect of different modes of English captioning on EFL learners’ general listening comprehension. To this end, forty-five intermediate-level learners were selected based on their scores on a standardized English proficiency test (PET) to carry out the study. The selected participants were then randomly assigned to two experimental groups (full-captions and keyword-captions) and one control group (no-captions). Research instrumentation included a pre-test and a post-test following an experimental design. Participants took a pre-test and a post-test containing 50 multiple-choice questions (25 questions for the pre-test and 25 for the post-test) selected from a standard listening test (PET), and also completed 15 treatment sessions. The findings showed significant differences among full-captions, keyword-captions, and no-captions in terms of their effect on learners’ general listening comprehension. This study provides some pedagogical implications for teaching listening through different modes of captions. Keywords: Caption, full caption, keyword caption, listening comprehension

  13. Randomized algorithms in automatic control and data mining

    CERN Document Server

    Granichin, Oleg; Toledano-Kitai, Dvora

    2015-01-01

    In the fields of data mining and control, the huge amount of unstructured data and the presence of uncertainty in system descriptions have always been critical issues. The book Randomized Algorithms in Automatic Control and Data Mining introduces the readers to the fundamentals of randomized algorithm applications in data mining (especially clustering) and in automatic control synthesis. The methods proposed in this book guarantee that the computational complexity of classical algorithms and the conservativeness of standard robust control techniques will be reduced. It is shown that when a problem requires "brute force" in selecting among options, algorithms based on random selection of alternatives offer good results with certain probability for a restricted time and significantly reduce the volume of operations.
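
    A toy numerical illustration of that last point (my own example in Python, not one taken from the book): when evaluating every alternative is too expensive, scoring a random sample of modest size already finds an option in the top 5% with probability 1 - 0.95^n.

    import numpy as np

    rng = np.random.default_rng(0)
    scores = rng.random(1_000_000)          # imagine one expensive evaluation per alternative
    top_5pct = np.quantile(scores, 0.95)

    n_sample = 90                           # 1 - 0.95**90 is roughly 0.99
    trials, hits = 2000, 0
    for _ in range(trials):
        sample = scores[rng.integers(0, scores.size, n_sample)]   # n_sample randomly chosen alternatives
        hits += sample.max() >= top_5pct
    print(f"empirical success rate: {hits / trials:.3f}  (theory: {1 - 0.95**n_sample:.3f})")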

  14. Subjective randomness as statistical inference.

    Science.gov (United States)

    Griffiths, Thomas L; Daniels, Dylan; Austerweil, Joseph L; Tenenbaum, Joshua B

    2018-06-01

    Some events seem more random than others. For example, when tossing a coin, a sequence of eight heads in a row does not seem very random. Where do these intuitions about randomness come from? We argue that subjective randomness can be understood as the result of a statistical inference assessing the evidence that an event provides for having been produced by a random generating process. We show how this account provides a link to previous work relating randomness to algorithmic complexity, in which random events are those that cannot be described by short computer programs. Algorithmic complexity is both incomputable and too general to capture the regularities that people can recognize, but viewing randomness as statistical inference provides two paths to addressing these problems: considering regularities generated by simpler computing machines, and restricting the set of probability distributions that characterize regularity. Building on previous work exploring these different routes to a more restricted notion of randomness, we define strong quantitative models of human randomness judgments that apply not just to binary sequences - which have been the focus of much of the previous work on subjective randomness - but also to binary matrices and spatial clustering. Copyright © 2018 Elsevier Inc. All rights reserved.

  15. The basic science and mathematics of random mutation and natural selection.

    Science.gov (United States)

    Kleinman, Alan

    2014-12-20

    The mutation and natural selection phenomenon can and often does cause the failure of antimicrobial, herbicidal, pesticide and cancer treatments selection pressures. This phenomenon operates in a mathematically predictable behavior, which when understood leads to approaches to reduce and prevent the failure of the use of these selection pressures. The mathematical behavior of mutation and selection is derived using the principles given by probability theory. The derivation of the equations describing the mutation and selection phenomenon is carried out in the context of an empirical example. Copyright © 2014 John Wiley & Sons, Ltd.
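
    As a hedged sketch of the kind of arithmetic involved (the notation and the numbers below are assumptions for illustration, not the paper's values), the probability that a particular beneficial or resistance mutation occurs at least once among n replications, given a per-replication mutation rate mu, is 1 - (1 - mu)^n:

    mu = 1e-9           # illustrative per-site, per-replication mutation rate
    for n in (1e7, 1e8, 1e9, 1e10):
        p_at_least_one = 1 - (1 - mu) ** n
        print(f"n = {n:.0e} replications -> P(at least one mutant) = {p_at_least_one:.3f}")

    # Requiring two specific mutations in sequence (the second arising on the background of
    # the first) multiplies the improbabilities, which is why combination selection
    # pressures are far harder for a population to defeat.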

  16. Random quantum operations

    International Nuclear Information System (INIS)

    Bruzda, Wojciech; Cappellini, Valerio; Sommers, Hans-Juergen; Zyczkowski, Karol

    2009-01-01

    We define a natural ensemble of trace preserving, completely positive quantum maps and present algorithms to generate them at random. Spectral properties of the superoperator Φ associated with a given quantum map are investigated and a quantum analogue of the Frobenius-Perron theorem is proved. We derive a general formula for the density of eigenvalues of Φ and show the connection with the Ginibre ensemble of real non-symmetric random matrices. Numerical investigations of the spectral gap imply that a generic state of the system iterated several times by a fixed generic map converges exponentially to an invariant state
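
    One standard way to generate such maps numerically, via a random isometry whose blocks serve as Kraus operators, is sketched below in Python; it is in the spirit of the abstract but not necessarily the authors' exact algorithm, and the dimensions are arbitrary choices.

    import numpy as np

    def random_channel(d=3, k=4, seed=0):
        """Return k Kraus operators of a random trace-preserving, completely positive map."""
        rng = np.random.default_rng(seed)
        g = rng.standard_normal((d * k, d)) + 1j * rng.standard_normal((d * k, d))  # Ginibre matrix
        q, _ = np.linalg.qr(g)                       # q has orthonormal columns: an isometry
        return [q[i * d:(i + 1) * d, :] for i in range(k)]

    def apply_channel(kraus, rho):
        return sum(K @ rho @ K.conj().T for K in kraus)

    kraus = random_channel()
    # Trace preservation check: sum_i K_i^dagger K_i = identity
    print(np.allclose(sum(K.conj().T @ K for K in kraus), np.eye(3)))   # True

    # Iterating the fixed map from some state keeps the trace and converges to an invariant state
    rho = np.eye(3) / 3
    for _ in range(100):
        rho = apply_channel(kraus, rho)
    print(round(np.trace(rho).real, 6))              # 1.0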

  17. Selecting Optimal Random Forest Predictive Models: A Case Study on Predicting the Spatial Distribution of Seabed Hardness

    Science.gov (United States)

    Li, Jin; Tran, Maggie; Siwabessy, Justy

    2016-01-01

    Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia’s marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy and can be inferred from underwater video footage at limited locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models to predict seabed hardness using random forest (RF) based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods, namely variable importance (VI), averaged variable importance (AVI), knowledge-informed AVI (KIAVI), Boruta and regularized RF (RRF), were tested based on predictive accuracy. Effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach used to pre-select predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to ‘small p and large n’ problems in environmental sciences. Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency and

  18. Lifestyle factors and contact to general practice with respiratory alarm symptoms

    DEFF Research Database (Denmark)

    Sele, Lisa Maria Falk; Elnegaard, Sandra; Balasubramaniam, Kirubakaran

    2016-01-01

    BACKGROUND: A prerequisite for early lung cancer diagnosis is that individuals with respiratory alarm symptoms (RAS) contact a general practitioner (GP). This study aims to determine the proportion of individuals in the general population who contact a GP with RAS and to analyse the association...... between lifestyle factors and contact to GPs with RAS. METHODS: A web-based survey of 100 000 individuals randomly selected from the Danish Civil Registration System. Items regarding experience of RAS (prolonged coughing, shortness of breath, coughing up blood, and prolonged hoarseness), GP contacts......, and lifestyle factors (smoking status, alcohol intake, and body mass index) were included. RESULTS: In total 49 706 (52.5 %) individuals answered the questionnaire. Overall 7870 reported at least one respiratory alarm symptom, and of those 39.6 % (3 080) had contacted a GP. Regarding specific symptoms...

  19. A general framework to select working fluid and configuration of ORCs for low-to-medium temperature heat sources

    International Nuclear Information System (INIS)

    Vivian, Jacopo; Manente, Giovanni; Lazzaretto, Andrea

    2015-01-01

    Highlights: • General guidelines are proposed to select ORC working fluid and cycle layout. • Distance between critical and heat source temperature for optimal fluid selection. • Separate contributions of cycle efficiency and heat recovery factor. - Abstract: The selection of the most suitable working fluid and cycle configuration for a given heat source is a fundamental step in the search for the optimum design of Organic Rankine Cycles. In this phase cycle efficiency and heat source recovery factor lead to opposite design choices in the achievement of maximum system efficiency and, in turn, maximum power output. In this work, both separate and combined effects of these two performance factors are considered to supply a thorough understanding of the compromise resulting in maximum performance. This goal is pursued by carrying out design optimizations of four different ORC configurations operating with twenty-seven working fluids and recovering heat from sensible heat sources in the temperature range 120–180 °C. Optimum working fluids and thermodynamic parameters are those which simultaneously allow high cycle efficiency and high heat recovery from the heat source to be obtained. General guidelines are suggested to reach this target for any system configuration. The distance between fluid critical temperature and inlet temperature of the heat source is found to play a key role in predicting the optimum performance of all system configurations regardless of the inlet temperature of the heat source

  20. Prediction error variance and expected response to selection, when selection is based on the best predictor - for Gaussian and threshold characters, traits following a Poisson mixed model and survival traits

    DEFF Research Database (Denmark)

    Andersen, Anders Holst; Korsgaard, Inge Riis; Jensen, Just

    2002-01-01

    In this paper, we consider selection based on the best predictor of animal additive genetic values in Gaussian linear mixed models, threshold models, Poisson mixed models, and log normal frailty models for survival data (including models with time-dependent covariates with associated fixed...... or random effects). In the different models, expressions are given (when these can be found - otherwise unbiased estimates are given) for prediction error variance, accuracy of selection and expected response to selection on the additive genetic scale and on the observed scale. The expressions given for non...... Gaussian traits are generalisations of the well-known formulas for Gaussian traits - and reflect, for Poisson mixed models and frailty models for survival data, the hierarchical structure of the models. In general the ratio of the additive genetic variance to the total variance in the Gaussian part...

  1. On Optimal Data Split for Generalization Estimation and Model Selection

    DEFF Research Database (Denmark)

    Larsen, Jan; Goutte, Cyril

    1999-01-01

    The paper is concerned with studying the very different behavior of the two data splits using hold-out cross-validation, K-fold cross-validation and randomized permutation cross-validation. First we describe the theoretical basics of various cross-validation techniques with the purpose of reliably...
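
    For concreteness, the three splitting schemes being compared can be written down with scikit-learn's splitters. This is an illustration under assumed data, not the paper's own experiments.

    import numpy as np
    from sklearn.model_selection import train_test_split, KFold, ShuffleSplit

    X = np.arange(100).reshape(-1, 1)
    y = np.random.default_rng(0).normal(size=100)

    # 1) Hold-out: a single random split
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

    # 2) K-fold: every observation is used for validation exactly once
    for fold, (tr, val) in enumerate(KFold(n_splits=5, shuffle=True, random_state=0).split(X)):
        print("fold", fold, "validation size", len(val))

    # 3) Randomized permutation (shuffle-split / Monte Carlo) cross-validation:
    #    repeated random splits, so validation sets may overlap across repetitions
    for rep, (tr, val) in enumerate(ShuffleSplit(n_splits=5, test_size=0.2, random_state=0).split(X)):
        print("repetition", rep, "validation size", len(val))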

  2. RANDOM WALK HYPOTHESIS IN FINANCIAL MARKETS

    Directory of Open Access Journals (Sweden)

    Nicolae-Marius JULA

    2017-05-01

    Full Text Available The random walk hypothesis states that stock market prices do not follow a predictable trajectory, but are simply random. If one is trying to predict a random set of data, one should first test for randomness, because, despite the power and complexity of the models used, the results cannot be trusted. There are several methods for testing these hypotheses, and the computational power provided by the R environment makes the work of the researcher easier and more cost-effective. The increasing power of computing and the continuous development of econometric tests should give potential investors new tools for selecting commodities and investing in efficient markets.
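
    The abstract works in the R environment; purely as an illustration of one such randomness test, here is a Wald-Wolfowitz runs test on the signs of simulated daily log-returns, written in Python for consistency with the other sketches in this listing. The price series and all parameters are made up.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(42)
    prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 1000)))   # stand-in price series
    signs = np.sign(np.diff(np.log(prices)))
    signs = signs[signs != 0]

    n_pos, n_neg = np.sum(signs > 0), np.sum(signs < 0)
    runs = 1 + np.sum(signs[1:] != signs[:-1])

    # Expected number of runs and its variance under the random-walk (independence) null
    n = n_pos + n_neg
    mu = 2 * n_pos * n_neg / n + 1
    var = 2 * n_pos * n_neg * (2 * n_pos * n_neg - n) / (n**2 * (n - 1))
    z = (runs - mu) / np.sqrt(var)
    p = 2 * norm.sf(abs(z))
    print(f"runs = {runs}, z = {z:.2f}, two-sided p = {p:.3f}")   # large p: no evidence against randomness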

  3. Locally Perturbed Random Walks with Unbounded Jumps

    OpenAIRE

    Paulin, Daniel; Szász, Domokos

    2010-01-01

    In \\cite{SzT}, D. Sz\\'asz and A. Telcs have shown that for the diffusively scaled, simple symmetric random walk, weak convergence to the Brownian motion holds even in the case of local impurities if $d \\ge 2$. The extension of their result to finite range random walks is straightforward. Here, however, we are interested in the situation when the random walk has unbounded range. Concretely we generalize the statement of \\cite{SzT} to unbounded random walks whose jump distribution belongs to th...

  4. Computer generation of random deviates

    International Nuclear Information System (INIS)

    Cormack, John

    1991-01-01

    The need for random deviates arises in many scientific applications. In medical physics, Monte Carlo simulations have been used in radiology, radiation therapy and nuclear medicine. Specific instances include the modelling of x-ray scattering processes and the addition of random noise to images or curves in order to assess the effects of various processing procedures. Reliable sources of random deviates with statistical properties indistinguishable from true random deviates are a fundamental necessity for such tasks. This paper provides a review of computer algorithms which can be used to generate uniform random deviates and other distributions of interest to medical physicists, along with a few caveats relating to various problems and pitfalls which can occur. Source code listings for the generators discussed (in FORTRAN, Turbo-PASCAL and Data General ASSEMBLER) are available on request from the authors. 27 refs., 3 tabs., 5 figs
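
    A classic member of the family such reviews cover is the Box-Muller transform, which converts pairs of uniform deviates into Gaussian deviates. The Python sketch below is illustrative and does not reproduce the paper's FORTRAN, Turbo-PASCAL or assembler listings.

    import numpy as np

    def box_muller(n, rng=None):
        rng = rng or np.random.default_rng(0)
        u1 = rng.random(n)
        u2 = rng.random(n)
        r = np.sqrt(-2.0 * np.log(u1))          # radius drawn from the exponential of u1
        return r * np.cos(2 * np.pi * u2)       # the companion deviate r*sin(...) is discarded here

    z = box_muller(100_000)
    print(round(z.mean(), 3), round(z.std(), 3))  # close to 0 and 1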

  5. The influence of resilience on mental health: The role of general well-being.

    Science.gov (United States)

    Gao, Tingting; Ding, Xinna; Chai, Jingxin; Zhang, Zhao; Zhang, Han; Kong, Yixi; Mei, Songli

    2017-06-01

    Nurses are suffering from increasing stress, and nursing is recognized as one of the most stressful jobs. Their mental health problems are serious and worthy of attention. The purpose of this study was to explore the relationship between resilience and mental health and general well-being among nurses. A cross-sectional survey was conducted in 2014, using a self-reported questionnaire. Participants were asked to complete the measure of resilience, mental health, and general well-being. The method of random cluster sampling was used to select nurses as participants. A survey of 365 nurses was conducted to test the hypothesized model. This study showed that resilience, mental health, and general well-being correlated with each other. General well-being was an effective predictor of resilience and mental health, and it can both moderate and mediate the relationship. Strategies to increase nurses' general well-being could enhance their resilience and reduce mental health problems. It is important to improve the mental health of nurses and maintain the professional values that ensure career sustainability. © 2017 John Wiley & Sons Australia, Ltd.

  6. The Goodness of Covariance Selection Problem from AUC Bounds

    OpenAIRE

    Khajavi, Navid Tafaghodi; Kuh, Anthony

    2016-01-01

    We conduct a study of graphical models and discuss the quality of model selection approximation by formulating the problem as a detection problem and examining the area under the curve (AUC). We are specifically looking at the model selection problem for jointly Gaussian random vectors. For Gaussian random vectors, this problem simplifies to the covariance selection problem which is widely discussed in literature by Dempster [1]. In this paper, we give the definition for the correlation appro...

  7. Exactly averaged equations for flow and transport in random media

    International Nuclear Information System (INIS)

    Shvidler, Mark; Karasaki, Kenzi

    2001-01-01

    It is well known that exact averaging of the equations of flow and transport in random porous media can be realized only for a small number of special, occasionally exotic, fields. On the other hand, the properties of approximate averaging methods are not yet fully understood; consider, for example, the convergence behavior and the accuracy of truncated perturbation series. Furthermore, the calculation of the high-order perturbations is very complicated. These problems have long stimulated attempts to answer the question: do exact, general and sufficiently universal forms of averaged equations exist? If the answer is positive, there arises the problem of constructing these equations and analyzing them. There exist many publications related to these problems, oriented towards different applications: hydrodynamics, flow and transport in porous media, theory of elasticity, acoustic and electromagnetic waves in random fields, etc. We present a method of finding the general form of exactly averaged equations for flow and transport in random fields by using (1) an assumption of the existence of Green's functions for appropriate stochastic problems, (2) some general properties of the Green's functions, and (3) some basic information about the random fields of the conductivity, porosity and flow velocity. We present a general form of the exactly averaged non-local equations for the following cases. 1. Steady-state flow with sources in porous media with random conductivity. 2. Transient flow with sources in compressible media with random conductivity and porosity. 3. Non-reactive solute transport in random porous media. We discuss the problem of uniqueness and the properties of the non-local averaged equations for cases with some types of symmetry (isotropic, transversally isotropic, orthotropic), and we analyze the hypothesis on the structure of the non-local equations in the general case of stochastically homogeneous fields. (author)

  8. Organic Ferroelectric-Based 1T1T Random Access Memory Cell Employing a Common Dielectric Layer Overcoming the Half-Selection Problem.

    Science.gov (United States)

    Zhao, Qiang; Wang, Hanlin; Ni, Zhenjie; Liu, Jie; Zhen, Yonggang; Zhang, Xiaotao; Jiang, Lang; Li, Rongjin; Dong, Huanli; Hu, Wenping

    2017-09-01

    Organic electronics based on poly(vinylidenefluoride/trifluoroethylene) (P(VDF-TrFE)) dielectric is facing great challenges in flexible circuits. As nonvolatile memory devices are an indispensable part of integrated circuits, there is an urgent demand for low-cost, easy-to-fabricate devices. A breakthrough is made on a novel ferroelectric random access memory cell (1T1T FeRAM cell) consisting of one selection transistor and one ferroelectric memory transistor in order to overcome the half-selection problem. Unlike complicated manufacturing using multiple dielectrics, this system simplifies 1T1T FeRAM cell fabrication using one common dielectric. To achieve this goal, a strategy for semiconductor/insulator (S/I) interface modulation is put forward and applied to nonhysteretic selection transistors with high performances for driving or addressing purposes. As a result, a high hole mobility of 3.81 cm² V⁻¹ s⁻¹ (average) for 2,6-diphenylanthracene (DPA) and an electron mobility of 0.124 cm² V⁻¹ s⁻¹ (average) for N,N'-1H,1H-perfluorobutyl dicyanoperylenecarboxydiimide (PDI-FCN2) are obtained in selection transistors. In this work, we demonstrate this technology's potential for organic ferroelectric-based pixelated memory module fabrication. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. 13 CFR 108.340 - Evaluation and selection-general.

    Science.gov (United States)

    2010-01-01

    ... VENTURE CAPITAL ("NMVC") PROGRAM Evaluation and Selection of NMVC Companies § 108.340 Evaluation and... Applicants in such a way as to promote Developmental Venture Capital investments nationwide and in both urban...

  10. Generalized treatment of point reactor kinetics driven by random reactivity fluctuations via the Wiener-Hermite functional method

    International Nuclear Information System (INIS)

    Behringer, K.

    1991-02-01

    In a recent paper by Behringer et al. (1990), the Wiener-Hermite Functional (WHF) method has been applied to point reactor kinetics excited by Gaussian random reactivity noise under stationary conditions, in order to calculate the neutron steady-state value and the neutron power spectral density (PSD) in a second-order (WHF-2) approximation. For simplicity, delayed neutrons and any feedback effects have been disregarded. The present study is a straightforward continuation of the previous one, treating the problem more generally by including any number of delayed neutron groups. For the case of white reactivity noise, the accuracy of the approach is determined by comparison with the exact solution available from the Fokker-Planck method. In the numerical comparisons, the first-order (WHF-1) approximation of the PSD is also considered. (author) 4 figs., 10 refs

  11. Generalizations of orthogonal polynomials

    Science.gov (United States)

    Bultheel, A.; Cuyt, A.; van Assche, W.; van Barel, M.; Verdonk, B.

    2005-07-01

    We give a survey of recent generalizations of orthogonal polynomials. That includes multidimensional (matrix and vector orthogonal polynomials) and multivariate versions, multipole (orthogonal rational functions) variants, and extensions of the orthogonality conditions (multiple orthogonality). Most of these generalizations are inspired by the applications in which they are applied. We also give a glimpse of these applications, which are usually generalizations of applications where classical orthogonal polynomials also play a fundamental role: moment problems, numerical quadrature, rational approximation, linear algebra, recurrence relations, and random matrices.

  12. IgE antibodies to alpha-gal in the general adult population

    DEFF Research Database (Denmark)

    Gonzalez-Quintela, A; Dam Laursen, A S; Vidal, C

    2014-01-01

    -gal-specific (s)IgE and its associated factors in the general adult population from two separated (Northern and Southern) European regions (Denmark and Spain, respectively). METHODS: Cross-sectional study of 2297 and 444 randomly selected adults from 11 municipalities in Denmark and one in Spain. Alpha-gal s.......1% in the Danish and Spanish series, respectively. The prevalence of sIgE ≥ 0.35 kUA /L was 1.8% and 2.2% in Denmark and Spain, respectively. Alpha-gal sIgE positivity was associated with pet ownership in both series and, particularly, cat ownership (data available in the Danish series). Alpha-gal sIgE positivity...

  13. Relationship between general health of older health service users and their self-esteem in Isfahan in 2014

    OpenAIRE

    Razieh Molavi; Mousa Alavi; Mahrokh Keshvari

    2015-01-01

    Background: Self-esteem is known to be one of the most important markers of successful aging. Older people's self-esteem is influenced by several factors that may be particularly health related. Therefore, this study aimed to explore some important general health-related predictors of older people's self-esteem. Materials and Methods: In this study, 200 people, aged 65 years and older, who referred to health care centers were selected through a stratified random sampling method. Data we...

  14. Random matrix theories and chaotic dynamics

    International Nuclear Information System (INIS)

    Bohigas, O.

    1991-01-01

    A review of some of the main ideas, assumptions and results of the Wigner-Dyson type random matrix theories (RMT) which are relevant in the general context of 'Chaos and Quantum Physics' is presented. RMT are providing interesting and unexpected clues to connect classical dynamics with quantum phenomena. It is this aspect which will be emphasised and, concerning the main body of RMT, the author will restrict himself to a minimum. However, emphasis will be put on some generalizations of the 'canonical' random matrix ensembles that increase their flexibility, rendering the incorporation of relevant physical constraints possible. (R.P.) 112 refs., 35 figs., 5 tabs
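
    A small numerical illustration of the canonical ensembles in question (my own Python sketch, not taken from the review): nearest-neighbour spacings of 2 x 2 GOE matrices, normalized to unit mean, follow the Wigner surmise p(s) = (pi/2) s exp(-pi s^2/4).

    import numpy as np

    rng = np.random.default_rng(0)
    spacings = []
    for _ in range(20_000):
        a = rng.standard_normal((2, 2))
        h = (a + a.T) / 2                        # 2x2 GOE matrix (off-diagonal variance 1/2)
        e = np.linalg.eigvalsh(h)
        spacings.append(e[1] - e[0])
    s = np.array(spacings)
    s /= s.mean()                                # normalize to unit mean spacing

    bins = np.linspace(0, 4, 41)
    centers = 0.5 * (bins[:-1] + bins[1:])
    empirical, _ = np.histogram(s, bins=bins, density=True)
    surmise = (np.pi / 2) * centers * np.exp(-np.pi * centers**2 / 4)
    print(np.round(empirical[:8], 2))            # note the level repulsion near s = 0
    print(np.round(surmise[:8], 2))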

  15. Self-reported oral and general health in relation to socioeconomic position.

    Science.gov (United States)

    Hakeberg, Magnus; Wide Boman, Ulla

    2017-07-26

    During the past two decades, several scientific publications from different countries have shown how oral health in the population varies with social determinants. The aim of the present study was to explore self-reported oral and general health in relation to different measures of socioeconomic position. Data were collected from a randomly selected sample of the adult population in Sweden (n = 3500, mean age 53.4 years, 53.1% women). The response rate was 49.7%. Subjects were interviewed by telephone, using a questionnaire including items on self-reported oral and general health, socioeconomic position and lifestyle. A significant gradient was found for both oral and general health: the lower the socioeconomic position, the poorer the health. Socioeconomic position and, above all, economic measures were strongly associated with general health (OR 3.95) and with oral health (OR 1.76) for those with an income below SEK 200,000 per year. Similar results were found in multivariate analyses controlling for age, gender and lifestyle variables. For adults, there are clear socioeconomic gradients in self-reported oral and general health, irrespective of different socioeconomic measures. Action is needed to ensure greater equity of oral and general health.

  16. Generalized Entanglement Entropies of Quantum Designs

    Science.gov (United States)

    Liu, Zi-Wen; Lloyd, Seth; Zhu, Elton Yechao; Zhu, Huangjun

    2018-03-01

    The entanglement properties of random quantum states or dynamics are important to the study of a broad spectrum of disciplines of physics, ranging from quantum information to high energy and many-body physics. This Letter investigates the interplay between the degrees of entanglement and randomness in pure states and unitary channels. We reveal strong connections between designs (distributions of states or unitaries that match certain moments of the uniform Haar measure) and generalized entropies (entropic functions that depend on certain powers of the density operator), by showing that Rényi entanglement entropies averaged over designs of the same order are almost maximal. This strengthens the celebrated Page's theorem. Moreover, we find that designs of an order that is logarithmic in the dimension maximize all Rényi entanglement entropies and so are completely random in terms of the entanglement spectrum. Our results relate the behaviors of Rényi entanglement entropies to the complexity of scrambling and quantum chaos in terms of the degree of randomness, and suggest a generalization of the fast scrambling conjecture.
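
    The flavour of the result can be checked numerically; the Python sketch below (my own, with arbitrary dimensions) draws Haar-random pure states on a bipartite system and shows that their Renyi-2 entanglement entropy sits close to the maximal value log d_A.

    import numpy as np

    def renyi2_entanglement(d_a=8, d_b=64, rng=None):
        rng = rng or np.random.default_rng(0)
        psi = rng.standard_normal((d_a, d_b)) + 1j * rng.standard_normal((d_a, d_b))
        psi /= np.linalg.norm(psi)               # Haar-random pure state on A x B
        rho_a = psi @ psi.conj().T               # reduced state of subsystem A
        purity = np.trace(rho_a @ rho_a).real
        return -np.log(purity)                   # Renyi-2 entropy: S_2 = -log Tr(rho_A^2)

    samples = [renyi2_entanglement(rng=np.random.default_rng(i)) for i in range(50)]
    print(round(float(np.mean(samples)), 3), "vs maximal", round(np.log(8), 3))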

  17. The World is Random: A Cognitive Perspective on Perceived Disorder

    Directory of Open Access Journals (Sweden)

    Hiroki P. Kotabe

    2014-06-01

    Full Text Available Research on the consequences of perceiving disorder is largely sociological and concerns broken windows theory, which states that signs of social disorder cause further social disorder. The predominant psychological explanations for this phenomenon are primarily social. In contrast, I propose a parsimonious cognitive model (the world-is-random model; WIR), which basically proposes that disorder primes randomness-related concepts, reducing and threatening the sense of personal control, with diverse affective, judgmental, and behavioral consequences. I review recent developments on the psychological consequences of perceiving disorder and argue that WIR can explain all of these findings. I also cover select correlational studies from the sociological literature and explain how WIR can at least partly account for their diverse findings. In a general discussion, I consider possible alternative psychological models and argue that they do not adequately explain the most recent psychological research on disorder. I then propose future directions, which include determining whether perceiving disorder causes a unique psychology and delimiting boundary conditions.

  18. Random-walk simulation of selected aspects of dissipative collisions

    International Nuclear Information System (INIS)

    Toeke, J.; Gobbi, A.; Matulewicz, T.

    1984-11-01

    Internuclear thermal equilibrium effects and shell structure effects in dissipative collisions are studied numerically within the framework of the model of stochastic exchanges by applying the random-walk technique. Effective blocking of the drift through the mass flux induced by the temperature difference, while leaving the variances of the mass distributions unaltered is found possible, provided an internuclear potential barrier is present. Presence of the shell structure is found to lead to characteristic correlations between the consecutive exchanges. Experimental evidence for the predicted effects is discussed. (orig.)

  19. SCRAED - Simple and Complex Random Assignment in Experimental Designs

    OpenAIRE

    Alferes, Valentim R.

    2009-01-01

    SCRAED is a package of 37 self-contained SPSS syntax files that performs simple and complex random assignment in experimental designs. For between-subjects designs, SCRAED includes simple random assignment (no restrictions, forced equal sizes, forced unequal sizes, and unequal probabilities), block random assignment (simple and generalized blocks), and stratified random assignment (no restrictions, forced equal sizes, forced unequal sizes, and unequal probabilities). For within-subject...
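
    SCRAED itself is a set of SPSS syntax files; purely as an illustration of the block random assignment idea it implements, a minimal Python equivalent for a two-arm between-subjects design might look like this (block size and seed are arbitrary choices).

    import random

    def block_randomize(n_participants, arms=("A", "B"), block_size=4, seed=2009):
        rng = random.Random(seed)
        assert block_size % len(arms) == 0
        assignments = []
        while len(assignments) < n_participants:
            block = list(arms) * (block_size // len(arms))   # balanced block, e.g. A A B B
            rng.shuffle(block)                               # random order within the block
            assignments.extend(block)
        return assignments[:n_participants]

    print(block_randomize(10))          # arms stay balanced within every completed block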

  20. Effect of electromagnetic radiations from mobile phone base stations on general health and salivary function.

    Science.gov (United States)

    Singh, Kushpal; Nagaraj, Anup; Yousuf, Asif; Ganta, Shravani; Pareek, Sonia; Vishnani, Preeti

    2016-01-01

    Cell phones use electromagnetic, nonionizing radiations in the microwave range, which some believe may be harmful to human health. The present study aimed to determine the effect of electromagnetic radiations (EMRs) on unstimulated/stimulated salivary flow rate and other health-related problems among general populations residing in proximity to and far away from mobile phone base stations. A total of four mobile base stations were randomly selected from four zones of Jaipur, Rajasthan, India. Twenty individuals who were residing in proximity to the selected mobile phone towers were taken as the case group and the other 20 individuals (control group) who were living nearly 1 km away in the periphery were selected for salivary analysis. Questions related to sleep disturbances were measured using the Pittsburgh Sleep Quality Index (PSQI) and other health problems were included in the questionnaire. The chi-square test was used for statistical analysis. It was found that a majority of the subjects who were residing near the mobile base station complained of sleep disturbances, headache, dizziness, irritability, concentration difficulties, and hypertension. A majority of the study subjects had significantly lower stimulated salivary secretion than controls. An adverse effect of EMR from mobile phone base stations on the health and well-being of the general population therefore cannot be ruled out. Further studies are warranted to evaluate the effect of electromagnetic fields (EMFs) on general health and more specifically on oral health.

  1. Presentation and antimicrobial treatment of acute orofacial infections in general dental practice.

    Science.gov (United States)

    Lewis, M A; Meechan, C; MacFarlane, T W; Lamey, P J; Kay, E

    1989-01-21

    Information on the presentation of orofacial infections and the use of antimicrobial agents in general dental practice in the United Kingdom was obtained using a postal questionnaire. Six hundred dentists were randomly selected and a total of 340 replies were received, giving a response rate of 57%. The dental practitioners estimated that acute infection was present in only a minority (approximately 5%) of patients. A total of seven different antibiotics were prescribed, in a variety of regimens, for the treatment of bacterial infection. However, the majority of dentists (46-62%) preferred a 5-day course of penicillin (250 mg, qid) for bacterial conditions other than acute ulcerative gingivitis, for which most practitioners (89%) prescribed 3 days of metronidazole (200 mg, tid). Nystatin was the most frequently selected anticandidal agent and topical acyclovir the most popular therapy for Herpes simplex infection.

  2. Random number generation and creativity.

    Science.gov (United States)

    Bains, William

    2008-01-01

    A previous paper suggested that humans can generate genuinely random numbers. I tested this hypothesis by repeating the experiment with a larger number of highly numerate subjects, asking them to call out a sequence of digits selected from 0 through 9. The resulting sequences were substantially non-random, with an excess of sequential pairs of numbers and a deficit of repeats of the same number, in line with previous literature. However, the previous literature suggests that humans generate random numbers with substantial conscious effort, and distractions which reduce that effort reduce the randomness of the numbers. I reduced my subjects' concentration by asking them to call out in another language, and by giving them alcohol; neither affected the randomness of their responses. This suggests that the ability to generate random numbers is a 'basic' function of the human mind, even if those numbers are not mathematically 'random'. I hypothesise that there is a 'creativity' mechanism which, while not truly random, provides novelty as part of the mind's defence against closed programming loops, and that testing for the effects seen here in people more or less familiar with numbers or with spontaneous creativity could identify more features of this process. It is possible that training to perform better at simple random generation tasks could help to increase creativity, through training people to reduce the conscious mind's suppression of the 'spontaneous', creative response to new questions.
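
    The two diagnostics mentioned, repeats of the same digit and sequential pairs, are easy to compute; the Python sketch below uses a machine-generated digit sequence as a stand-in for a subject's calls. For a truly random source both statistics should sit near the 10% chance level.

    import numpy as np

    digits = np.random.default_rng(0).integers(0, 10, 200)    # stand-in for a subject's calls
    pairs = np.stack([digits[:-1], digits[1:]])

    repeats = np.mean(pairs[0] == pairs[1])                   # same digit twice in a row
    ascending = np.mean((pairs[1] - pairs[0]) % 10 == 1)      # sequential pair, e.g. 3 then 4 (9 wraps to 0)

    print(f"repeats: {repeats:.3f} (chance 0.100)")
    print(f"sequential pairs: {ascending:.3f} (chance 0.100)")
    # Human-generated sequences typically show too few repeats and too many sequential pairs.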

  3. A double-blind, randomized, placebo-controlled, fixed-dose phase III study of vilazodone in patients with generalized anxiety disorder.

    Science.gov (United States)

    Gommoll, Carl; Durgam, Suresh; Mathews, Maju; Forero, Giovanna; Nunez, Rene; Tang, Xiongwen; Thase, Michael E

    2015-06-01

    Vilazodone, a selective serotonin reuptake inhibitor and 5-HT1A receptor partial agonist, is approved for treating major depressive disorder in adults. This study (NCT01629966 ClinicalTrials.gov) evaluated the efficacy and safety of vilazodone in adults with generalized anxiety disorder (GAD). A multicenter, double-blind, parallel-group, placebo-controlled, fixed-dose study in patients with GAD randomized (1:1:1) to placebo (n = 223), or vilazodone 20 mg/day (n = 230) or 40 mg/day (n = 227). Primary and secondary efficacy parameters were total score change from baseline to week 8 on the Hamilton Rating Scale for Anxiety (HAMA) and Sheehan Disability Scale (SDS), respectively, analyzed using a predefined mixed-effect model for repeated measures (MMRM). Safety outcomes were presented by descriptive statistics. The least squares mean difference (95% confidence interval) in HAMA total score change from baseline (MMRM) was statistically significant for vilazodone 40 mg/day versus placebo (-1.80 [-3.26, -0.34]; P = .0312 [adjusted for multiple comparisons]), but not for vilazodone 20 mg/day versus placebo. Mean change from baseline in SDS total score was not significantly different for either dose of vilazodone versus placebo when adjusted for multiplicity; significant improvement versus placebo was noted for vilazodone 40 mg/day without adjustment for multiplicity (P = .0349). The incidence of adverse events was similar for vilazodone 20 and 40 mg/day (∼71%) and slightly lower for placebo (62%). Nausea, diarrhea, dizziness, vomiting, and fatigue were reported in ≥5% of patients in either vilazodone group and at least twice the rate of placebo. Vilazodone was effective in treating anxiety symptoms of GAD. No new safety concerns were identified. © 2015 The Authors. Depression and Anxiety published by Wiley Periodicals, Inc.

  4. Prevalence of latent tuberculous infection among adults in the general population of Ca Mau, Viet Nam.

    Science.gov (United States)

    Marks, G B; Nhung, N V; Nguyen, T A; Hoa, N B; Khoa, T H; Son, N V; Phuong, N T B; Tin, D M; Ho, J; Fox, G J

    2018-03-01

    The study was conducted in a randomly selected sample of persons aged ≥15 years living in Ca Mau Province, southern Viet Nam. The objective was to estimate the prevalence of latent tuberculous infection (LTBI) in the general adult population of this province of Viet Nam; the secondary objective was to examine age and sex differences in prevalence. A cross-sectional survey was conducted in a cluster-random sample of the population. Clusters were subcommunes. The presence of LTBI was assessed using the QuantiFERON®-TB Gold In-Tube test system. QuantiFERON tests were performed among 1319 persons aged ≥15 years (77.7% of those selected). The overall prevalence of positive tests was 36.8% (95%CI 33.4-40.3). The prevalence of a positive test was lower in females than in males (31.0% vs. 44.7%, OR 0.57, 95%CI 0.45-0.72). A substantial proportion of adults in Viet Nam have evidence of LTBI. Although LTBI prevalence is higher in males, the sex difference is not as great as that for TB notification rates.

  5. Selection and characterization of DNA aptamers

    NARCIS (Netherlands)

    Ruigrok, V.J.B.

    2013-01-01

    This thesis focusses on the selection and characterisation of DNA aptamers and the various aspects related to their selection from large pools of randomized oligonucleotides. Aptamers are affinity tools that can specifically recognize and bind predefined target molecules; this ability, however,

  6. Generalized connectivity of graphs

    CERN Document Server

    Li, Xueliang

    2016-01-01

    Noteworthy results, proof techniques, open problems and conjectures in generalized (edge-) connectivity are discussed in this book. Both theoretical and practical analyses for generalized (edge-) connectivity of graphs are provided. Topics covered in this book include: generalized (edge-) connectivity of graph classes, algorithms, computational complexity, sharp bounds, Nordhaus-Gaddum-type results, maximum generalized local connectivity, extremal problems, random graphs, multigraphs, relations with the Steiner tree packing problem and generalizations of connectivity. This book enables graduate students to understand and master a segment of graph theory and combinatorial optimization. Researchers in graph theory, combinatorics, combinatorial optimization, probability, computer science, discrete algorithms, complexity analysis, network design, and the information transferring models will find this book useful in their studies.

  7. Pseudo-Random Number Generators

    Science.gov (United States)

    Howell, L. W.; Rheinfurth, M. H.

    1984-01-01

    The package features a comprehensive selection of probability distributions. Monte Carlo simulations are resorted to whenever the systems studied are not amenable to deterministic analyses or when direct experimentation is not feasible. Random numbers having certain specified distribution characteristics are an integral part of such simulations. The package consists of a collection of "pseudorandom" number generators for use in Monte Carlo simulations.

  8. Model Selection with the Linear Mixed Model for Longitudinal Data

    Science.gov (United States)

    Ryoo, Ji Hoon

    2011-01-01

    Model building or model selection with linear mixed models (LMMs) is complicated by the presence of both fixed effects and random effects. The fixed effects structure and random effects structure are codependent, so selection of one influences the other. Most presentations of LMM in psychology and education are based on a multilevel or…
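
    A small Python sketch of the codependence the abstract describes, comparing a random-intercept model with a random-intercept-plus-slope model by maximum likelihood; statsmodels is assumed to be available, and the simulated data are invented for illustration.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n_subj, n_obs = 40, 6
    subj = np.repeat(np.arange(n_subj), n_obs)
    time = np.tile(np.arange(n_obs), n_subj)
    u0 = rng.normal(0, 1.0, n_subj)[subj]            # subject-specific intercepts
    u1 = rng.normal(0, 0.3, n_subj)[subj]            # subject-specific slopes
    y = 2 + 0.5 * time + u0 + u1 * time + rng.normal(0, 1, n_subj * n_obs)
    df = pd.DataFrame({"y": y, "time": time, "subj": subj})

    # Random intercept only vs. random intercept + random slope
    m1 = smf.mixedlm("y ~ time", df, groups=df["subj"]).fit(reml=False)
    m2 = smf.mixedlm("y ~ time", df, groups=df["subj"], re_formula="~time").fit(reml=False)
    lr = 2 * (m2.llf - m1.llf)                       # richer random-effects structure fits better here
    print(f"log-likelihoods: {m1.llf:.1f} vs {m2.llf:.1f}, LR statistic = {lr:.2f}")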

  9. 12 CFR 1805.700 - Evaluation and selection-general.

    Science.gov (United States)

    2010-01-01

    ... Section 1805.700 Banks and Banking COMMUNITY DEVELOPMENT FINANCIAL INSTITUTIONS FUND, DEPARTMENT OF THE TREASURY COMMUNITY DEVELOPMENT FINANCIAL INSTITUTIONS PROGRAM Evaluation and Selection of Applications... Applicants that vary by institution type, total asset size, stage of organizational development, markets...

  10. Cluster randomized trial in the general practice research database: 2. Secondary prevention after first stroke (eCRT study: study protocol for a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Dregan Alex

    2012-10-01

    Full Text Available Abstract Background The purpose of this research is to develop and evaluate methods for conducting pragmatic cluster randomized trials in a primary care electronic database. The proposal describes one application, in a less frequent chronic condition of public health importance, secondary prevention of stroke. A related protocol in antibiotic prescribing was reported previously. Methods/Design The study aims to implement a cluster randomized trial (CRT using the electronic patient records of the General Practice Research Database (GPRD as a sampling frame and data source. The specific objective of the trial is to evaluate the effectiveness of a computer-delivered intervention at enhancing the delivery of stroke secondary prevention in primary care. GPRD family practices will be allocated to the intervention or usual care. The intervention promotes the use of electronic prompts to support adherence with the recommendations of the UK Intercollegiate Stroke Working Party and NICE guidelines for the secondary prevention of stroke in primary care. Primary outcome measure will be the difference in systolic blood pressure between intervention and control trial arms at 12-month follow-up. Secondary outcomes will be differences in serum cholesterol, prescribing of antihypertensive drugs, statins, and antiplatelet therapy. The intervention will continue for 12 months. Information on the utilization of the decision-support tools will also be analyzed. Discussion The CRT will investigate the effectiveness of using a computer-delivered intervention to reduce the risk of stroke recurrence following a first stroke event. The study will provide methodological guidance on the implementation of CRTs in electronic databases in primary care. Trial registration Current Controlled Trials ISRCTN35701810

  11. Generalized double-humped logistic map-based medical image encryption

    Directory of Open Access Journals (Sweden)

    Samar M. Ismail

    2018-03-01

    Full Text Available This paper presents the design of the generalized Double Humped (DH) logistic map, used for pseudo-random number key generation (PRNG). The generalized parameter added to the map provides more control over the map's chaotic range. A new special map with a zooming effect of the bifurcation diagram is obtained by manipulating the generalization parameter value. The dynamic behavior of the generalized map is analyzed, including the study of the fixed points and stability ranges, the Lyapunov exponent, and the complete bifurcation diagram. Any specific map can be designed by changing the generalization parameter, increasing the randomness and controllability of the map. An image encryption algorithm is introduced based on pseudo-random sequence generation using the proposed generalized DH map, offering secure communication transfer of medical MRI and X-ray images. Security analyses are carried out to confirm system efficiency, including key sensitivity and key-space analyses, histogram analysis, correlation coefficients, MAE, NPCR and UACI calculations. System robustness against noise attacks has been proved, along with the NIST test ensuring the system's efficiency.
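
    For intuition only, a chaotic-map keystream cipher of the kind described can be sketched in Python as below. Note that the classic logistic map is used as a stand-in because the paper's generalized double-humped map has its own functional form and parameters, and this toy code is in no way a secure implementation.

    import numpy as np

    def logistic_keystream(n_bytes, x0=0.61234, r=3.99, burn_in=1000):
        x = x0
        out = np.empty(n_bytes, dtype=np.uint8)
        for _ in range(burn_in):                 # discard the transient
            x = r * x * (1 - x)
        for i in range(n_bytes):
            x = r * x * (1 - x)
            out[i] = int(x * 256) % 256          # quantize the chaotic orbit to a byte
        return out

    image = np.random.default_rng(0).integers(0, 256, (64, 64), dtype=np.uint8)  # stand-in image
    ks = logistic_keystream(image.size).reshape(image.shape)
    cipher = image ^ ks                          # XOR encryption with the key stream
    restored = cipher ^ ks                       # XOR again with the same key stream decrypts
    print(bool(np.array_equal(restored, image)))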

  12. The adverse effect of selective cyclooxygenase-2 inhibitor on random skin flap survival in rats.

    Directory of Open Access Journals (Sweden)

    Haiyong Ren

    Full Text Available BACKGROUND: Cyclooxygenase-2 (COX-2) inhibitors provide desired analgesic effects after injury or surgery, but evidence suggests they also attenuate wound healing. The study investigated the effect of a COX-2 inhibitor on random skin flap survival. METHODS: The McFarlane flap model was established in 40 rats divided into two groups; for 7 days the study group received Parecoxib and the control group an equal volume of saline. The necrotic area of the flap was measured, and flap specimens were stained with haematoxylin-eosin (HE) for histologic analysis. Immunohistochemical staining was performed to analyse the levels of VEGF and COX-2. RESULTS: 7 days after operation, the flap necrotic area ratio in the study group (66.65 ± 2.81%) was significantly larger than that of the control group (48.81 ± 2.33%) (P <0.01). Histological analysis demonstrated angiogenesis, with mean vessel density per mm² being lower in the study group (15.4 ± 4.4) than in the control group (27.2 ± 4.1) (P <0.05). The expression of COX-2 and VEGF protein in intermediate area II was evaluated in the two groups by immunohistochemistry. The expression of COX-2 in the study group was 1022.45 ± 153.1 and in the control group 2638.05 ± 132.2 (P <0.01). The expression of VEGF in the study and control groups was 2779.45 ± 472.0 vs 4938.05 ± 123.6 (P <0.01). In the COX-2 inhibitor group, the expression of COX-2 and VEGF protein was markedly down-regulated compared with the control group. CONCLUSION: The selective COX-2 inhibitor had an adverse effect on random skin flap survival. Suppression of neovascularization induced by a low level of VEGF is the presumed biological mechanism.

  13. Source-Independent Quantum Random Number Generation

    Science.gov (United States)

    Cao, Zhu; Zhou, Hongyi; Yuan, Xiao; Ma, Xiongfeng

    2016-01-01

    Quantum random number generators can provide genuine randomness by appealing to the fundamental principles of quantum mechanics. In general, a physical generator contains two parts—a randomness source and its readout. The source is essential to the quality of the resulting random numbers; hence, it needs to be carefully calibrated and modeled to achieve information-theoretical provable randomness. However, in practice, the source is a complicated physical system, such as a light source or an atomic ensemble, and any deviations in the real-life implementation from the theoretical model may affect the randomness of the output. To close this gap, we propose a source-independent scheme for quantum random number generation in which output randomness can be certified, even when the source is uncharacterized and untrusted. In our randomness analysis, we make no assumptions about the dimension of the source. For instance, multiphoton emissions are allowed in optical implementations. Our analysis takes into account the finite-key effect with the composable security definition. In the limit of large data size, the length of the input random seed is exponentially small compared to that of the output random bit. In addition, by modifying a quantum key distribution system, we experimentally demonstrate our scheme and achieve a randomness generation rate of over 5×10³ bit/s.

  14. Bias in random forest variable importance measures: Illustrations, sources and a solution

    Directory of Open Access Journals (Sweden)

    Hothorn Torsten

    2007-01-01

    Full Text Available Abstract Background Variable importance measures for random forests have been receiving increased attention as a means of variable selection in many classification tasks in bioinformatics and related scientific fields, for instance to select a subset of genetic markers relevant for the prediction of a certain disease. We show that random forest variable importance measures are a sensible means for variable selection in many applications, but are not reliable in situations where potential predictor variables vary in their scale of measurement or their number of categories. This is particularly important in genomics and computational biology, where predictors often include variables of different types, for example when predictors include both sequence data and continuous variables such as folding energy, or when amino acid sequence data show different numbers of categories. Results Simulation studies are presented illustrating that, when random forest variable importance measures are used with data of varying types, the results are misleading because suboptimal predictor variables may be artificially preferred in variable selection. The two mechanisms underlying this deficiency are biased variable selection in the individual classification trees used to build the random forest on one hand, and effects induced by bootstrap sampling with replacement on the other hand. Conclusion We propose to employ an alternative implementation of random forests, that provides unbiased variable selection in the individual classification trees. When this method is applied using subsampling without replacement, the resulting variable importance measures can be used reliably for variable selection even in situations where the potential predictor variables vary in their scale of measurement or their number of categories. The usage of both random forest algorithms and their variable importance measures in the R system for statistical computing is illustrated and
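
    A small Python illustration of the deficiency described above, under synthetic assumptions: neither predictor is informative, yet impurity-based importance inflates the continuous (many-cutpoint) variable, while permutation importance on held-out data stays near zero for both. The paper's actual remedy, a conditional-inference forest built with subsampling without replacement (the R party/cforest implementation), is not reproduced here.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.inspection import permutation_importance
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 1000
    x_binary = rng.integers(0, 2, n).astype(float)   # 2-category noise predictor
    x_cont = rng.normal(size=n)                      # continuous noise predictor
    y = rng.integers(0, 2, n)                        # response independent of both
    X = np.column_stack([x_binary, x_cont])

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

    # impurity importance favours the variable with more candidate split points
    print("impurity importances   :", rf.feature_importances_)
    # permutation importance on held-out data hovers around zero for both
    perm = permutation_importance(rf, X_te, y_te, n_repeats=20, random_state=0)
    print("permutation importances:", perm.importances_mean)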

  15. Perturbation Solutions for Random Linear Structural Systems subject to Random Excitation using Stochastic Differential Equations

    DEFF Research Database (Denmark)

    Köyluoglu, H.U.; Nielsen, Søren R.K.; Cakmak, A.S.

    1994-01-01

    The paper deals with the first and second order statistical moments of the response of linear systems with random parameters subject to random excitation modelled as white-noise multiplied by an envelope function with random parameters. The method of analysis is basically a second order perturbation method using stochastic differential equations. The joint statistical moments entering the perturbation solution are determined by considering an augmented dynamic system with state variables made up of the displacement and velocity vector and their first and second derivatives with respect to the random parameters of the problem. Equations for partial derivatives are obtained from the partial differentiation of the equations of motion. The zero time-lag joint statistical moment equations for the augmented state vector are derived from the Itô differential formula. A general formulation is given ...

  16. Relationship between Abuse Experience and General Health among Older Adults in Yazd City- Iran

    Directory of Open Access Journals (Sweden)

    Hassan Rezaeipandari

    2016-06-01

    Full Text Available Introduction: Elder abuse may increase the vulnerability of ageing people to disease and decrease their general health status, so addressing the issue is essential for promoting elderly quality of life. The study aimed to examine the relation between abuse experience and general health among elderly people in Yazd city, Iran. Methods: This cross-sectional study was carried out on 250 community-dwelling seniors in the city of Yazd who were selected by cluster random sampling. Data collection tools included the Iranian Domestic Elder Abuse Questionnaire and the Persian version of the General Health Questionnaire-28. Data were analyzed using the Spearman correlation coefficient and linear regression tests. Results: Mean scores of abuse experience and general health among the elders were 11.84±12.70 (range 0-100) and 21.82±10.84 (range 0-84), respectively. General health status was more undesirable among elders who had experienced abuse than among those who had not. Elder abuse subscales accounted for 17.2% of the changes in general health, with only the care neglect and physical abuse subscales having a significant prediction effect. Conclusion: Abuse experience has negative effects on older adults' general health. Care neglect and physical abuse play a more important role.

  17. Perceptions of randomized security schedules.

    Science.gov (United States)

    Scurich, Nicholas; John, Richard S

    2014-04-01

    Security of infrastructure is a major concern. Traditional security schedules are unable to provide omnipresent coverage; consequently, adversaries can exploit predictable vulnerabilities to their advantage. Randomized security schedules, which randomly deploy security measures, overcome these limitations, but public perceptions of such schedules have not been examined. In this experiment, participants were asked to make a choice between attending a venue that employed a traditional (i.e., search everyone) or a random (i.e., a probability of being searched) security schedule. The absolute probability of detecting contraband was manipulated (i.e., 1/10, 1/4, 1/2) but equivalent between the two schedule types. In general, participants were indifferent to either security schedule, regardless of the probability of detection. The randomized schedule was deemed more convenient, but the traditional schedule was considered fairer and safer. There were no differences between traditional and random schedule in terms of perceived effectiveness or deterrence. Policy implications for the implementation and utilization of randomized schedules are discussed. © 2013 Society for Risk Analysis.

  18. Flow in Random Microstructures: a Multilevel Monte Carlo Approach

    KAUST Repository

    Icardi, Matteo

    2016-01-06

    In this work we are interested in the fast estimation of effective parameters of random heterogeneous materials using Multilevel Monte Carlo (MLMC). MLMC is an efficient and flexible solution for the propagation of uncertainties in complex models, where an explicit parametrisation of the input randomness is not available or too expensive. We propose a general-purpose algorithm and computational code for the solution of Partial Differential Equations (PDEs) on random heterogeneous materials. We build on the key idea of MLMC, based on different discretization levels, and extend it to a more general context by exploiting a hierarchy of physical resolution scales, solvers, models and other numerical/geometrical discretisation parameters. Modifications of the classical MLMC estimators are proposed to further reduce variance in cases where analytical convergence rates and asymptotic regimes are not available. Spheres, ellipsoids and general convex-shaped grains are placed randomly in the domain with different placing/packing algorithms, and the effective properties of the heterogeneous medium are computed. These are, for example, effective diffusivities, conductivities, and reaction rates. The implementation of the Monte Carlo estimators, the statistical samples and each single solver is parallelized efficiently. The method is tested and applied to pore-scale simulations of random sphere packings.
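
    A minimal sketch of the telescoping MLMC estimator the record builds on, with a toy level-dependent quantity standing in for the pore-scale solvers and packing algorithms; the function and sample counts below are assumptions for illustration only.

    import numpy as np

    rng = np.random.default_rng(1)

    def P_l(level, n_samples):
        """Toy level-l quantity: bias and noise both shrink as the level increases."""
        bias = 2.0 ** (-2 * level)
        noise = 2.0 ** (-level / 2)
        return 1.0 + bias + noise * rng.standard_normal(n_samples)

    def mlmc_estimate(n_per_level):
        """Telescoping sum E[P_L] ~ E[P_0] + sum_l E[P_l - P_{l-1}]."""
        total = np.mean(P_l(0, n_per_level[0]))
        for level, n in enumerate(n_per_level[1:], start=1):
            # a real coupled estimator evaluates P_l and P_{l-1} on the same random
            # input; this toy only mimics the telescoping structure
            total += np.mean(P_l(level, n) - P_l(level - 1, n))
        return total

    # approaches the exact value 1.0 up to the finest-level bias 2^(-2L)
    print(mlmc_estimate([8000, 2000, 500, 125]))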

  19. Inflation with generalized initial conditions

    International Nuclear Information System (INIS)

    Albrecht, A.; Brandenberger, R.; Matzner, R.

    1987-01-01

    In many current models of the early Universe a scalar field phi which is only very weakly coupled to other quantum fields is used to generate inflation. In such models there are no forces which could thermalize the scalar field, and previous assumptions about its preinflation "initial" conditions must be abandoned. In this paper the onset of inflation is studied classically for more general initial conditions of the scalar field configuration. In particular, initial conditions with a nonvanishing spatial average of phi, with phi chosen at random in each initial horizon volume, and with random initial momenta are considered. We identify and discuss several mechanisms that can drive these more general initial conditions toward an inflationary state. The analysis is done in one spatial dimension

  20. General quality of life of patients with acne vulgaris before and after performing selected cosmetological treatments

    Science.gov (United States)

    Chilicka, Karolina; Maj, Joanna; Panaszek, Bernard

    2017-01-01

    Background Achieving a satisfying quality of life for a patient by applying individually matched therapy is, simultaneously, a great challenge and a priority for contemporary medicine. Patients with visible dermatological ailments are particularly susceptible to reduction in the general quality of life. Among the dermatological diseases, acne causes considerable reduction in the quality of life and changes in self-perception that lead to the worsening of a patient's mental condition, including depression and suicidal thoughts. As a result, difficulties in contact with loved ones, as well as social and professional problems, are observed, which show that acne is not a somatic problem alone. To a large extent, it becomes a part of psychodermatology, becoming an important topic of public health in social medicine practice. Pharmacological treatment of acne is a challenge for a dermatologist and often requires cooperation with a cosmetologist. Cosmetological treatments are aimed at improving the condition of the skin and the reduction or subsiding of acne skin changes. Aim The aim of this study was to assess the influence of selected cosmetological treatments on the general quality of life of patients with acne. Materials and methods The study group consisted of 101 women aged 19–29 years (mean 22.5 years, SD 2.3 years). All subjects were diagnosed with acne vulgaris of the face. In the study group, the acne changes occurred over the course of 3–15 years (mean 8.1 years, SD 2.7 years). Selected cosmetological treatments (intensive pulsing light, alpha-hydroxy acids, cavitation peeling, needle-free mesotherapy, diamond microdermabrasion and sonophoresis) were performed in series in the number depending on the particular patient's chosen treatment, after excluding contraindications. General quality of life of the patients was estimated using the Skindex-29 and Dermatology Life Quality Index (DLQI) questionnaires, before and after the cosmetological

  1. Multi-Label Learning via Random Label Selection for Protein Subcellular Multi-Locations Prediction.

    Science.gov (United States)

    Wang, Xiao; Li, Guo-Zheng

    2013-03-12

    Prediction of protein subcellular localization is an important but challenging problem, particularly when proteins may simultaneously exist at, or move between, two or more different subcellular location sites. Most of the existing protein subcellular localization methods deal only with single-location proteins. In the past few years, only a few methods have been proposed to tackle proteins with multiple locations. However, they adopt only a simple strategy, that is, transforming the multi-location proteins to multiple proteins with a single location, which does not take correlations among different subcellular locations into account. In this paper, a novel method named RALS (multi-label learning via RAndom Label Selection) is proposed to learn from multi-location proteins in an effective and efficient way. Through a five-fold cross validation test on a benchmark dataset, we demonstrate that our proposed method, which takes label correlations into consideration, clearly outperforms the baseline BR method that ignores them, indicating that correlations among different subcellular locations really exist and contribute to the improvement of prediction performance. Experimental results on two benchmark datasets also show that our proposed methods achieve significantly higher performance than some other state-of-the-art methods in predicting subcellular multi-locations of proteins. The prediction web server is available at http://levis.tongji.edu.cn:8080/bioinfo/MLPred-Euk/ for public usage.

  2. Meaning identification and meaning selection for general language monolingual dictionaries

    DEFF Research Database (Denmark)

    Bergenholtz, Henning; Agerbo, Heidi

    2014-01-01

    The traditional way for lexicographers to deal with polysemy in dictionaries is by applying the terms lumping and splitting. We will not follow this tradition. Instead, we argue that the identification and selection of meaning items (= polysems) should be treated in the same way ... to references in the world (in this contribution called things), followed by a formulation of the identified meaning items which can be used for reception situations. Not always – as in the case of lemma selection – will all the identified meaning items be included in the dictionary. The selection of identified meaning items will depend on the genuine purpose of the dictionary.

  3. Random graph states, maximal flow and Fuss-Catalan distributions

    International Nuclear Information System (INIS)

    Collins, BenoIt; Nechita, Ion; Zyczkowski, Karol

    2010-01-01

    For any graph consisting of k vertices and m edges we construct an ensemble of random pure quantum states which describe a system composed of 2m subsystems. Each edge of the graph represents a bipartite, maximally entangled state. Each vertex represents a random unitary matrix generated according to the Haar measure, which describes the coupling between subsystems. Dividing all subsystems into two parts, one may study entanglement with respect to this partition. A general technique to derive an expression for the average entanglement entropy of random pure states associated with a given graph is presented. Our technique relies on Weingarten calculus and flow problems. We analyze the statistical properties of spectra of such random density matrices and show for which cases they are described by the free Poissonian (Marchenko-Pastur) distribution. We derive a discrete family of generalized, Fuss-Catalan distributions and explicitly construct graphs which lead to ensembles of random states characterized by these novel distributions of eigenvalues.
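
    For background (standard material, not quoted from the record): the Fuss-Catalan family referred to above is usually characterized by its moments, a sketch of which is

    % moments of the Fuss-Catalan distribution \pi_s (notation assumed here)
    \int x^{n}\, d\pi_s(x) \;=\; \mathrm{FC}_s(n) \;=\; \frac{1}{sn+1}\binom{(s+1)n}{n},
    % for s = 1 this reduces to the Catalan numbers \frac{1}{n+1}\binom{2n}{n},
    % i.e. the moments of the free Poissonian (Marchenko-Pastur) law mentioned above.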

  4. Flow in Random Microstructures: a Multilevel Monte Carlo Approach

    KAUST Repository

    Icardi, Matteo; Tempone, Raul

    2016-01-01

    , where an explicit parametrisation of the input randomness is not available or too expensive. We propose a general-purpose algorithm and computational code for the solution of Partial Differential Equations (PDEs) on random heterogeneous materials. We

  5. Changes in Attitudes Towards Bariatric Surgery After 5 Years in the German General Public.

    Science.gov (United States)

    Jung, Franziska Ulrike Christine Else; Dietrich, A; Stroh, C; Riedel-Heller, S G; Luck-Sikorski, C

    2017-10-01

    The aim of this study was to investigate changes in the attitudes of the general public towards bariatric surgery and other interventions that can be part of obesity management during the last 5 years. 1007 participants were randomly selected and interviewed. Apart from socio-demographic data, interviews also included causal reasons for obesity as well as questions regarding treatment methods and their believed effectiveness. Results were compared with data published 5 years ago. Surgery is seen as a rather ineffective method to reduce weight in obesity and is recommended less often by the general public compared to the assessment 5 years ago. The public health implication is that campaigns should inform about obesity and the benefits of surgery as an intervention to improve individual health conditions.

  6. High-Tg Polynorbornene-Based Block and Random Copolymers for Butanol Pervaporation Membranes

    Science.gov (United States)

    Register, Richard A.; Kim, Dong-Gyun; Takigawa, Tamami; Kashino, Tomomasa; Burtovyy, Oleksandr; Bell, Andrew

    Vinyl addition polymers of substituted norbornene (NB) monomers possess desirably high glass transition temperatures (Tg); however, until very recently, the lack of an applicable living polymerization chemistry has precluded the synthesis of such polymers with controlled architecture, or copolymers with controlled sequence distribution. We have recently synthesized block and random copolymers of NB monomers bearing hydroxyhexafluoroisopropyl and n-butyl substituents (HFANB and BuNB) via living vinyl addition polymerization with Pd-based catalysts. Both series of polymers were cast into the selective skin layers of thin film composite (TFC) membranes, and these organophilic membranes were investigated for the isolation of n-butanol from dilute aqueous solution (model fermentation broth) via pervaporation. The block copolymers show well-defined microphase-separated morphologies, both in bulk and as the selective skin layers on TFC membranes, while the random copolymers are homogeneous. Both block and random vinyl addition copolymers are effective as n-butanol pervaporation membranes, with the block copolymers showing a better flux-selectivity balance. While polyHFANB has much higher permeability and n-butanol selectivity than polyBuNB, incorporating BuNB units into the polymer (in either a block or random sequence) limits the swelling of the polyHFANB and thereby improves the n-butanol pervaporation selectivity.

  7. Diagnostic imaging to select the candidates to orthotopic transplantation: Experience in a general hospital

    International Nuclear Information System (INIS)

    Pozzato, Carlo; Baldini, Umberto; Gattoni, Filippo; Raiteri, Riccardo; Lazzerini, Francesco; Uslenghi, Carlo Matteo; Mevoli, Alessandra

    1997-01-01

    The authors report the experience of our general hospital in selecting the patients for orthotopic liver transplantation (OLT). The accuracy of duplex Doppler and color flow Doppler for portal and/or mesenteric vein thrombosis was evaluated by correlation with resected livers, computerized tomography and angiographic findings. Pathologic examinations diagnosed HCC in 5/20 transplant recipients: 2 lesions were found in 2 resected specimens (total hepatectomy) and 1 lesion was found in 3 cases. The sensitivity of US, plain and dynamic computerized tomography in identifying HCC patients was 20%; US and computerized tomography specificity rates were 100% and 87%, respectively. CTAP sensitivity was 75%, and the sensitivity of Lipiodol computerized tomography and angiography was 100%. Therefore, in our series, US showed poor sensitivity in the detection of liver cancers, which may depend on the small number of patients, the lesion size, and the radiologists deliberately disregarding clinical and laboratory data. Nevertheless, the patients with a single HCC not exceeding 5 cm in diameter or with no more than 3 tumors, none of them exceeding 3 cm in diameter, are generally considered eligible for transplantation: therefore, our patients chosen for OLT on the basis of US and computerized tomography findings were actually eligible for transplantation in spite of US and computerized tomography false negative results. In conclusion, considering also the long stand-by list for OLT, the first selection of transplant candidates could be performed with US and color flow Doppler, plain and dynamic computerized tomography. The patients who are not ruled out as candidates for OLT on the basis of the findings of these imaging techniques and of clinical and laboratory findings are submitted to no further examination and referred to the transplantation unit. Otherwise, if conventional and color flow Doppler US and conventional computerized tomography are not enough to exclude a patient from OLT, the

  8. Nonstationary interference and scattering from random media

    International Nuclear Information System (INIS)

    Nazikian, R.

    1991-12-01

    For the small angle scattering of coherent plane waves from inhomogeneous random media, the three dimensional mean square distribution of random fluctuations may be recovered from the interferometric detection of the nonstationary modulational structure of the scattered field. Modulational properties of coherent waves scattered from random media are related to nonlocal correlations in the double sideband structure of the Fourier transform of the scattering potential. Such correlations may be expressed in terms of a suitably generalized spectral coherence function for analytic fields

  9. David Hull's generalized natural selection as an explanation for scientific change

    Science.gov (United States)

    Little, Michelle Yvette

    2001-10-01

    Philosophers of science such as Karl Popper and Thomas Kuhn have employed evolutionary idiom in describing scientific change. In Science as a Process (1988) Hull makes evolutionary theory explanatorily applicable. He modifies key evolutionary terms in order that both biological evolution and scientific change are instances of a general selection process. According to Hull, because of naturally-existing competition for credit among researchers and the professional lineages they constitute, scientists are constrained to cooperate and collaborate. This process entails two important philosophical consequences. First, it allows for a natural justification of why the sciences can provide objective empirical knowledge. Second, appreciating its strength means that a philosophical analysis of scientific change is solidly difficult features to combine. I work on addressing two weaknesses in Hull's arguments. First, operating in his analysis is an unexplicated notion of "information" running parallel to the equally opaque notion of genetic information. My third chapter provides a clear account of "genetic information" whose usefulness extends beyond the assistance it can render Hull, since a clear concept is needed in biological contexts as well. The fourth and fifth chapters submit evidence of scientific change from radio astronomy. Hull insists on empirical backing for philosophical theses, but his own book stands to suffer from selection effects as it offers cases drawn from a single subspecialty in the biological sciences. I found that, in the main, scientists and the change they propel accord well with Hull's explanation. However, instances of major change reveal credit- and resource-sharing to a degree contrary to what Hull would expect. My conclusion is that the naturalness of competition, instantiated during the course of standardized and relatively "normal" scientific research, is not the norm during periods of new research and its uncertain standards of

  10. Suboptimal Muscle Synergy Activation Patterns Generalize their Motor Function across Postures.

    Science.gov (United States)

    Sohn, M Hongchul; Ting, Lena H

    2016-01-01

    We used a musculoskeletal model to investigate the possible biomechanical and neural bases of using consistent muscle synergy patterns to produce functional motor outputs across different biomechanical conditions, which we define as generalizability. Experimental studies in cats demonstrate that the same muscle synergies are used during reactive postural responses at widely varying configurations, producing similarly-oriented endpoint force vectors with respect to the limb axis. However, whether generalizability across postures arises due to similar biomechanical properties or to neural selection of a particular muscle activation pattern has not been explicitly tested. Here, we used a detailed cat hindlimb model to explore the set of feasible muscle activation patterns that produce experimental synergy force vectors at a target posture, and tested their generalizability by applying them to different test postures. We used three methods to select candidate muscle activation patterns: (1) randomly-selected feasible muscle activation patterns, (2) optimal muscle activation patterns minimizing muscle effort at a given posture, and (3) generalizable muscle activation patterns that explicitly minimized deviations from experimentally-identified synergy force vectors across all postures. Generalizability was measured by the deviation between the simulated force direction of the candidate muscle activation pattern and the experimental synergy force vectors at the test postures. Force angle deviations were the greatest for the randomly selected feasible muscle activation patterns (e.g., >100°), intermediate for effort-wise optimal muscle activation patterns (e.g., ~20°), and smallest for generalizable muscle activation patterns. The feasible range of activation producing the experimental synergy force vectors was reduced by ~45% when generalizability requirements were imposed. Muscles recruited in the generalizable muscle activation patterns had torque-producing characteristics that were less sensitive to changes in posture. We

  11. Random coil chemical shifts in acidic 8 M urea: Implementation of random coil shift data in NMRView

    International Nuclear Information System (INIS)

    Schwarzinger, Stephan; Kroon, Gerard J.A.; Foss, Ted R.; Wright, Peter E.; Dyson, H. Jane

    2000-01-01

    Studies of proteins unfolded in acid or chemical denaturant can help in unraveling events during the earliest phases of protein folding. In order for meaningful comparisons to be made of residual structure in unfolded states, it is necessary to use random coil chemical shifts that are valid for the experimental system under study. We present a set of random coil chemical shifts obtained for model peptides under experimental conditions used in studies of denatured proteins. This new set, together with previously published data sets, has been incorporated into a software interface for NMRView, allowing selection of the random coil data set that fits the experimental conditions best

  12. Discriminative Projection Selection Based Face Image Hashing

    Science.gov (United States)

    Karabat, Cagatay; Erdogan, Hakan

    Face image hashing is an emerging method used in biometric verification systems. In this paper, we propose a novel face image hashing method based on a new technique called discriminative projection selection. We apply the Fisher criterion for selecting the rows of a random projection matrix in a user-dependent fashion. Moreover, another contribution of this paper is to employ a bimodal Gaussian mixture model at the quantization step. Our simulation results on three different databases demonstrate that the proposed method has superior performance in comparison to previously proposed random projection based methods.
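
    A hedged sketch of the general idea only (random projection, Fisher-criterion row selection, then quantization); the simple median threshold below stands in for the paper's bimodal Gaussian mixture quantizer, and all names and sizes are illustrative.

    import numpy as np

    rng = np.random.default_rng(42)

    def select_projections(user_feats, other_feats, proj, k):
        """Keep the k projection rows with the largest Fisher criterion
        (between-class separation over within-class scatter) for this user."""
        pu, po = user_feats @ proj.T, other_feats @ proj.T      # project both classes
        num = (pu.mean(0) - po.mean(0)) ** 2
        den = pu.var(0) + po.var(0) + 1e-12
        fisher = num / den
        return proj[np.argsort(fisher)[::-1][:k]]

    def face_hash(feat, user_proj):
        z = user_proj @ feat
        return (z > np.median(z)).astype(np.uint8)              # simple binarization

    d, k = 256, 32
    proj = rng.standard_normal((128, d))                        # random projection matrix
    user = rng.standard_normal((20, d)) + 0.5                   # enrolled user's feature vectors
    others = rng.standard_normal((200, d))                      # impostor feature vectors
    user_proj = select_projections(user, others, proj, k)
    print(face_hash(user.mean(0), user_proj))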

  13. The effect of selection on genetic parameter estimates

    African Journals Online (AJOL)

    Unknown

    The South African Journal of Animal Science is available online at ... A simulation study was carried out to investigate the effect of selection on the estimation of genetic ... The model contained a fixed effect, random genetic and random.

  14. The Wasteland of Random Supergravities

    OpenAIRE

    Marsh, David; McAllister, Liam; Wrase, Timm

    2011-01-01

    We show that in a general 𝒩 = 1 supergravity with N ≫ 1 scalar fields, an exponentially small fraction of the de Sitter critical points are metastable vacua. Taking the superpotential and Kahler potential to be random functions, we construct a random matrix model for the Hessian matrix, which is well-approximated by the sum of a Wigner matrix and two Wishart matrices. We compute the eigenvalue spectrum analytically from the free convolution of the constituent spectra and find that in ...

  15. A General Model for Estimating Macroevolutionary Landscapes.

    Science.gov (United States)

    Boucher, Florian C; Démery, Vincent; Conti, Elena; Harmon, Luke J; Uyeda, Josef

    2018-03-01

    The evolution of quantitative characters over long timescales is often studied using stochastic diffusion models. The current toolbox available to students of macroevolution is however limited to two main models: Brownian motion and the Ornstein-Uhlenbeck process, plus some of their extensions. Here, we present a very general model for inferring the dynamics of quantitative characters evolving under both random diffusion and deterministic forces of any possible shape and strength, which can accommodate interesting evolutionary scenarios like directional trends, disruptive selection, or macroevolutionary landscapes with multiple peaks. This model is based on a general partial differential equation widely used in statistical mechanics: the Fokker-Planck equation, also known in population genetics as the Kolmogorov forward equation. We thus call the model FPK, for Fokker-Planck-Kolmogorov. We first explain how this model can be used to describe macroevolutionary landscapes over which quantitative traits evolve and, more importantly, we detail how it can be fitted to empirical data. Using simulations, we show that the model has good behavior both in terms of discrimination from alternative models and in terms of parameter inference. We provide R code to fit the model to empirical data using either maximum-likelihood or Bayesian estimation, and illustrate the use of this code with two empirical examples of body mass evolution in mammals. FPK should greatly expand the set of macroevolutionary scenarios that can be studied since it opens the way to estimating macroevolutionary landscapes of any conceivable shape. [Adaptation; bounds; diffusion; FPK model; macroevolution; maximum-likelihood estimation; MCMC methods; phylogenetic comparative data; selection.].
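
    For reference (standard form, not the authors' exact parametrization): the one-dimensional Fokker-Planck (Kolmogorov forward) equation for the density p(x,t) of a trait value x can be written as

    \frac{\partial p(x,t)}{\partial t}
      \;=\; -\frac{\partial}{\partial x}\bigl[\mu(x)\,p(x,t)\bigr]
      \;+\; \frac{1}{2}\,\frac{\partial^{2}}{\partial x^{2}}\bigl[\sigma^{2}(x)\,p(x,t)\bigr],
    % \mu(x) carries the deterministic force (the macroevolutionary landscape) and
    % \sigma^{2}(x) the random-diffusion component; how FPK parametrizes \mu and
    % \sigma, and how bounds are imposed, is not specified in the record above.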

  16. Estimation of breeding values using selected pedigree records.

    Science.gov (United States)

    Morton, Richard; Howarth, Jordan M

    2005-06-01

    Fish bred in tanks or ponds cannot be easily tagged individually. The parentage of any individual may be determined by DNA fingerprinting, but is sufficiently expensive that large numbers cannot be so finger-printed. The measurement of the objective trait can be made on a much larger sample relatively cheaply. This article deals with experimental designs for selecting individuals to be finger-printed and for the estimation of the individual and family breeding values. The general setup provides estimates for both genetic effects regarded as fixed or random and for fixed effects due to known regressors. The family effects can be well estimated when even very small numbers are finger-printed, provided that they are the individuals with the most extreme phenotypes.

  17. Rich: Region-based Intelligent Cluster-Head Selection and Node Deployment Strategy in Concentric-based WSNs

    Directory of Open Access Journals (Sweden)

    FAN, C.-S.

    2013-11-01

    Full Text Available In a random deployment, sensor nodes are scattered randomly in the sensing field. Hence, coverage cannot be guaranteed. In contrast, the coverage of a uniform deployment is in general larger than that of a random deployment. However, a uniform deployment strategy may cause an unbalanced traffic pattern in wireless sensor networks (WSNs). In this situation, a larger load may be imposed on CHs (cluster heads) around the sink. Therefore, CHs close to the sink use up their energy earlier than those farther away from the sink. To overcome this problem, we propose a novel node deployment strategy in the concentric model, namely, the Region-based Intelligent Cluster-Head selection and node deployment strategy (called Rich). The coverage, energy consumption and data routing issues are well investigated and taken into consideration in the proposed Rich scheme. The simulation results show that the proposed Rich scheme alleviates the unbalanced traffic pattern significantly, prolongs network lifetime, and achieves a satisfactory coverage ratio.

  18. The genealogy of samples in models with selection.

    Science.gov (United States)

    Neuhauser, C; Krone, S M

    1997-02-01

    We introduce the genealogy of a random sample of genes taken from a large haploid population that evolves according to random reproduction with selection and mutation. Without selection, the genealogy is described by Kingman's well-known coalescent process. In the selective case, the genealogy of the sample is embedded in a graph with a coalescing and branching structure. We describe this graph, called the ancestral selection graph, and point out differences and similarities with Kingman's coalescent. We present simulations for a two-allele model with symmetric mutation in which one of the alleles has a selective advantage over the other. We find that when the allele frequencies in the population are already in equilibrium, then the genealogy does not differ much from the neutral case. This is supported by rigorous results. Furthermore, we describe the ancestral selection graph for other selective models with finitely many selection classes, such as the K-allele models, infinitely-many-alleles models, DNA sequence models, and infinitely-many-sites models, and briefly discuss the diploid case.

  19. Kin groups and trait groups: population structure and epidemic disease selection.

    Science.gov (United States)

    Fix, A G

    1984-10-01

    A Monte Carlo simulation based on the population structure of a small-scale human population, the Semai Senoi of Malaysia, has been developed to study the combined effects of group, kin, and individual selection. The population structure resembles D.S. Wilson's structured deme model in that local breeding populations (Semai settlements) are subdivided into trait groups (hamlets) that may be kin-structured and are not themselves demes. Additionally, settlement breeding populations are connected by two-dimensional stepping-stone migration approaching 30% per generation. Group and kin-structured group selection occur among hamlets the survivors of which then disperse to breed within the settlement population. Genetic drift is modeled by the process of hamlet formation; individual selection as a deterministic process, and stepping-stone migration as either random or kin-structured migrant groups. The mechanism for group selection is epidemics of infectious disease that can wipe out small hamlets particularly if most adults become sick and social life collapses. Genetic resistance to a disease is an individual attribute; however, hamlet groups with several resistant adults are less likely to disintegrate and experience high social mortality. A specific human gene, hemoglobin E, which confers resistance to malaria, is studied as an example of the process. The results of the simulations show that high genetic variance among hamlet groups may be generated by moderate degrees of kin-structuring. This strong microdifferentiation provides the potential for group selection. The effect of group selection in this case is rapid increase in gene frequencies among the total set of populations. In fact, group selection in concert with individual selection produced a faster rate of gene frequency increase among a set of 25 populations than the rate within a single unstructured population subject to deterministic individual selection. Such rapid evolution with plausible rates of

  20. Source-Independent Quantum Random Number Generation

    Directory of Open Access Journals (Sweden)

    Zhu Cao

    2016-02-01

    Full Text Available Quantum random number generators can provide genuine randomness by appealing to the fundamental principles of quantum mechanics. In general, a physical generator contains two parts—a randomness source and its readout. The source is essential to the quality of the resulting random numbers; hence, it needs to be carefully calibrated and modeled to achieve information-theoretical provable randomness. However, in practice, the source is a complicated physical system, such as a light source or an atomic ensemble, and any deviations in the real-life implementation from the theoretical model may affect the randomness of the output. To close this gap, we propose a source-independent scheme for quantum random number generation in which output randomness can be certified, even when the source is uncharacterized and untrusted. In our randomness analysis, we make no assumptions about the dimension of the source. For instance, multiphoton emissions are allowed in optical implementations. Our analysis takes into account the finite-key effect with the composable security definition. In the limit of large data size, the length of the input random seed is exponentially small compared to that of the output random bit. In addition, by modifying a quantum key distribution system, we experimentally demonstrate our scheme and achieve a randomness generation rate of over 5×10³ bit/s.

  1. Selection gradients, the opportunity for selection, and the coefficient of determination.

    Science.gov (United States)

    Moorad, Jacob A; Wade, Michael J

    2013-03-01

    We derive the relationship between R² (the coefficient of determination), selection gradients, and the opportunity for selection for univariate and multivariate cases. Our main result is to show that the portion of the opportunity for selection that is caused by variation for any trait is equal to the product of its selection gradient and its selection differential. This relationship is a corollary of the first and second fundamental theorems of natural selection, and it permits one to investigate the portions of the total opportunity for selection that are involved in directional selection, stabilizing (and diversifying) selection, and correlational selection, which is important to morphological integration. It also allows one to determine the fraction of fitness variation not explained by variation in measured phenotypes and therefore attributable to random (or, at least, unknown) influences. We apply our methods to a human data set to show how sex-specific mating success as a component of fitness variance can be decoupled from that owing to prereproductive mortality. By quantifying linear sources of sexual selection and quadratic sources of sexual selection, we illustrate that the former is stronger in males, while the latter is stronger in females.
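
    A short derivation consistent with the main result stated above, in standard notation that is assumed here rather than taken from the paper (relative fitness w, trait z):

    % I = Var(w) is the opportunity for selection, S = Cov(w, z) the selection
    % differential, and \beta = Cov(w, z)/Var(z) the selection gradient; the
    % fitness variance explained by z in a simple regression of w on z is
    \mathrm{Var}(\hat{w}) \;=\; \frac{\mathrm{Cov}(w,z)^{2}}{\mathrm{Var}(z)}
      \;=\; \beta\, S \;=\; R^{2}\, I ,
    % i.e. the portion of I caused by variation in the trait equals the product of
    % its selection gradient and its selection differential.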

  2. [Diagnosis and treatment in general internal medicine. Curriculum selection].

    Science.gov (United States)

    Casal, E R; Vázquez, E N; Husni, C

    1994-01-01

    In our country general internists are the providers of adult medical care in urban areas. In the past twenty years, with the increasing subspecialization within internal medicine and the development of advances in technology, the role of the general internist seems to be endangered. Recently much attention has been focused on this area and Divisions and Programs of General Internal Medicine have been established in most medical schools in the USA. The University of Buenos Aires instituted a Program of General Internal Medicine in its major teaching hospital in 1987. One of its purposes was to offer an educational experience to residents in the field of internal medicine primary care. This paper summarizes how this program was carried out and the subjects proposed in the area of Diagnosis and Treatment. The Program of General Internal Medicine is performed in the Outpatient Division and it is staffed by 3 faculty members and 4 fellows. Residents in Internal Medicine have a three month, full-time block rotation in the Program. A young, city dwelling, lower middle class population participates in the Program, with almost 10000 visits a year. The Program offers an experience that includes supervised patient care, an average of 100 office visits a month, and seminars and/or workshops covering topics of "Diagnosis and Treatment", "Case Presentations", "Clinical Epidemiology", "Prevention", and "Doctor-Patient Interview". In the area of Diagnosis and Treatment, the criteria used were: 1-frequency of diagnosis as determined by previous investigations, 2-relevant clinical conditions absent from the frequency list as determined by a consensus process by faculty members.(ABSTRACT TRUNCATED AT 250 WORDS)

  3. Elements of random walk and diffusion processes

    CERN Document Server

    Ibe, Oliver C

    2013-01-01

    Presents an important and unique introduction to random walk theory. Random walk is a stochastic process that has proven to be a useful model in understanding discrete-state discrete-time processes across a wide spectrum of scientific disciplines. Elements of Random Walk and Diffusion Processes provides an interdisciplinary approach by including numerous practical examples and exercises with real-world applications in operations research, economics, engineering, and physics. Featuring an introduction to powerful and general techniques that are used in the application of physical and dynamic

  4. Nitrates and bone turnover (NABT) - trial to select the best nitrate preparation: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Bucur, Roxana C; Reid, Lauren S; Hamilton, Celeste J; Cummings, Steven R; Jamal, Sophie A

    2013-09-08

    'comparisons with the best' approach for data analyses, as this strategy allows practical considerations of ease of use and tolerability to guide selection of the preparation for future studies. Data from this protocol will be used to develop a randomized, controlled trial of nitrates to prevent osteoporotic fractures. ClinicalTrials.gov Identifier: NCT01387672. Controlled-Trials.com: ISRCTN08860742.

  5. Effects of ancestral x irradiation followed by random mating on body weight of rats

    International Nuclear Information System (INIS)

    Gianola, D.; Chapman, A.B.; Rutledge, J.J.

    1977-01-01

    The effects of nine generations of ancestral spermatogonial x irradiation of inbred rats (450 R per generation) on body weight were examined. After six generations of random mating (avoiding inbreeding) following the termination of irradiation, descendants of irradiated males (R) were significantly lighter than their controls (C) at 3 and 6 weeks, but not at 10 weeks of age. However, differences in growth between the R and C populations were small. Among-litter and within-litter variance estimates were generally larger in the R lines than in the C lines, suggesting that selection responses would be greater in R than in C lines. In conjunction with previous evidence--obtained during the irradiation phase of the experiment--this suggested that more rapid response to selection for 6-week body weight, in particular, might accrue in the R lines.

  6. Performance of Universal Adhesive in Primary Molars After Selective Removal of Carious Tissue: An 18-Month Randomized Clinical Trial.

    Science.gov (United States)

    Lenzi, Tathiane Larissa; Pires, Carine Weber; Soares, Fabio Zovico Maxnuck; Raggio, Daniela Prócida; Ardenghi, Thiago Machado; de Oliveira Rocha, Rachel

    2017-09-15

    To evaluate the 18-month clinical performance of a universal adhesive, applied under different adhesion strategies, after selective carious tissue removal in primary molars. Forty-four subjects (five to 10 years old) contributed 90 primary molars presenting moderately deep dentin carious lesions on occlusal or occluso-proximal surfaces, which were randomly assigned to either the self-etch or the etch-and-rinse protocol of Scotchbond Universal Adhesive (3M ESPE). Resin composite was incrementally inserted for all restorations. Restorations were evaluated at one, six, 12, and 18 months using the modified United States Public Health Service criteria. Survival estimates for the restorations' longevity were evaluated using the Kaplan-Meier method. Multivariate Cox regression analysis with shared frailty was used to assess the factors associated with failures. Adhesion strategy did not influence the restorations' longevity (P=0.06; 72.2 percent and 89.7 percent with etch-and-rinse and self-etch mode, respectively). Self-etch and etch-and-rinse strategies did not influence the clinical behavior of the universal adhesive used in primary molars after selective carious tissue removal, although there was a tendency for a better outcome with the self-etch strategy.

  7. Analysis of swaps in Radix selection

    DEFF Research Database (Denmark)

    Elmasry, Amr Ahmed Abd Elmoneim; Mahmoud, Hosam

    2011-01-01

    Radix Sort is a sorting algorithm based on analyzing digital data. We study the number of swaps made by Radix Select (a one-sided version of Radix Sort) to find an element with a randomly selected rank. This kind of grand average provides a smoothing over all individual distributions for specific...
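
    One plausible reading of the algorithm being analyzed, sketched in Python under assumptions of my own (binary digits, non-negative integer keys): partition on the current bit as in radix-exchange sort, recurse only into the side holding the sought rank, and count the swaps.

    import random

    def radix_select(a, k, bit):
        """Return the k-th smallest (0-based) element of the non-negative ints in a,
        together with the number of swaps performed."""
        swaps = 0
        lo, hi = 0, len(a) - 1
        while bit >= 0 and lo < hi:
            i, j = lo, hi
            while i <= j:
                while i <= j and not (a[i] >> bit) & 1:   # 0-bit keys stay on the left
                    i += 1
                while i <= j and (a[j] >> bit) & 1:       # 1-bit keys stay on the right
                    j -= 1
                if i < j:
                    a[i], a[j] = a[j], a[i]
                    swaps += 1
                    i += 1
                    j -= 1
            # ranks lo..j now hold the 0-bit keys, ranks i..hi the 1-bit keys
            if k <= j:
                hi = j
            else:
                lo = i
            bit -= 1
        return a[k], swaps

    data = [random.randrange(1 << 16) for _ in range(1000)]
    value, n_swaps = radix_select(data[:], 500, bit=15)
    assert value == sorted(data)[500]
    print(n_swaps)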

  8. Selected aspects of proposed new EU general data protection legal framework and the Croatian perspective

    Directory of Open Access Journals (Sweden)

    Nina GUMZEJ

    2013-12-01

    Full Text Available Proposed new EU general data protection legal framework profoundly affects a large number of day-to-day business operations of organizations processing personal data and calls for significant effort on their part toward the necessary legal-regulatory compliance. In this paper the author examines key legislative developments towards this new EU frame and impact for the Republic of Croatia as the youngest EU Member State. Following introductory overview, legal analysis of draft EU General Data Protection Regulation as proposed by the European Commission and recently adopted amendments by the European Parliament mainly focuses on selected solutions impacting national data protection supervisory authorities. This is complemented with examination of relevant sources of EU law, including the case law of the Court of Justice of the European Union. Assessment of results of this research is next made with respect to prospects of the data protection legal framework of the Republic of Croatia. The paper is concluded with the author’s critical overview of analyzed EU proposals impacting national data protection supervisory authorities in light of EU pivotal goals, and de lege ferenda proposals to timely address identified obstacles towards more adequate enforcement of data protection legislation in Croatia.

  9. Integral Histogram with Random Projection for Pedestrian Detection.

    Directory of Open Access Journals (Sweden)

    Chang-Hua Liu

    Full Text Available In this paper, we give a systematic study reporting several deep insights into the HOG, one of the most widely used features in modern computer vision and image processing applications. We first show that its gradient magnitudes can be randomly projected with a random matrix. To handle over-fitting, an integral histogram based on the differences of randomly selected blocks is proposed. The experiments show that both the random projection and the integral histogram clearly outperform the HOG feature. Finally, the two ideas are combined into a new descriptor termed IHRP, which outperforms the HOG feature with fewer dimensions and higher speed.
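
    A hedged sketch of the integral-histogram idea referenced above: per-bin 2-D prefix sums let any rectangular block's histogram be read off in O(1) lookups, after which block-difference features can be randomly projected. The exact IHRP construction is not reproduced; names and sizes are illustrative.

    import numpy as np

    rng = np.random.default_rng(0)

    def integral_histogram(bin_map, n_bins):
        """bin_map[y, x] holds the quantized bin index of each pixel."""
        one_hot = np.eye(n_bins, dtype=np.float64)[bin_map]     # (h, w, n_bins)
        return one_hot.cumsum(axis=0).cumsum(axis=1)            # 2-D prefix sums per bin

    def block_hist(ih, y0, x0, y1, x1):
        """Histogram of the half-open block [y0:y1, x0:x1] via a few lookups per bin."""
        total = ih[y1 - 1, x1 - 1].copy()
        if y0 > 0:
            total -= ih[y0 - 1, x1 - 1]
        if x0 > 0:
            total -= ih[y1 - 1, x0 - 1]
        if y0 > 0 and x0 > 0:
            total += ih[y0 - 1, x0 - 1]
        return total

    n_bins = 9
    bin_map = rng.integers(0, n_bins, size=(64, 64))            # stand-in for quantized gradients
    ih = integral_histogram(bin_map, n_bins)

    # difference of two blocks, then a random projection of the difference feature
    d1 = block_hist(ih, 8, 8, 24, 24) - block_hist(ih, 32, 32, 48, 48)
    proj = rng.standard_normal((16, n_bins))
    feature = proj @ d1
    print(feature.shape)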

  10. The MIXMAX random number generator

    Science.gov (United States)

    Savvidy, Konstantin G.

    2015-11-01

    In this paper, we study the randomness properties of unimodular matrix random number generators. Under well-known conditions, these discrete-time dynamical systems have the highly desirable K-mixing properties which guarantee high quality random numbers. It is found that some widely used random number generators have poor Kolmogorov entropy and consequently fail in empirical tests of randomness. These tests show that the lowest acceptable value of the Kolmogorov entropy is around 50. Next, we provide a solution to the problem of determining the maximal period of unimodular matrix generators of pseudo-random numbers. We formulate the necessary and sufficient condition to attain the maximum period and present a family of specific generators in the MIXMAX family with superior performance and excellent statistical properties. Finally, we construct three efficient algorithms for operations with the MIXMAX matrix, which is a multi-dimensional generalization of the famous cat-map. The first allows the multiplication by the MIXMAX matrix to be computed with O(N) operations; the second recursively computes its characteristic polynomial with O(N²) operations; and the third applies skips of a large number of steps S to the sequence in O(N² log(S)) operations.
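
    A toy sketch of the construction the record generalizes: since the MIXMAX matrix is described above as a multi-dimensional generalization of the cat map, the classic two-dimensional cat map iterated modulo a Mersenne prime is shown here; the actual MIXMAX matrix, its dimension and its O(N) update trick are not reproduced.

    P = 2**61 - 1                      # a Mersenne prime modulus (assumption for illustration)

    def cat_map_step(x, y):
        """One step of the unimodular cat map [[2, 1], [1, 1]] acting on (x, y) mod P."""
        return (2 * x + y) % P, (x + y) % P

    x, y = 12345, 67890                # arbitrary nonzero seed
    samples = []
    for _ in range(10):
        x, y = cat_map_step(x, y)
        samples.append(x / P)          # normalize the output to [0, 1)
    print(samples)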

  11. Unwilling or Unable to Cheat? Evidence from a Randomized Tax Audit Experiment in Denmark

    OpenAIRE

    Henrik J. Kleven; Martin B. Knudsen; Claus T. Kreiner; Søren Pedersen; Emmanuel Saez

    2010-01-01

    This paper analyzes a randomized tax enforcement experiment in Denmark. In the base year, a stratified and representative sample of over 40,000 individual income tax filers was selected for the experiment. Half of the tax filers were randomly selected to be thoroughly audited, while the rest were deliberately not audited. The following year, "threat-of-audit" letters were randomly assigned and sent to tax filers in both groups. Using comprehensive administrative tax data, we present four main...

  12. Green Supplier Selection Criteria

    DEFF Research Database (Denmark)

    Nielsen, Izabela Ewa; Banaeian, Narges; Golinska, Paulina

    2014-01-01

    Green supplier selection (GSS) criteria arise from an organization's inclination to respond to existing trends in environmental issues related to business management and processes, so GSS integrates environmental thinking into conventional supplier selection. This research is designed to determine prevalent general and environmental supplier selection criteria and to develop a framework which can help decision makers to determine and prioritize suitable green supplier selection criteria (general and environmental). In this research we considered several parameters (evaluation objectives) to establish suitable criteria for GSS, such as their production type, requirements, policy and objectives, instead of applying common criteria. First, a comprehensive and deep review of the prevalent and green supplier selection literature was performed. Then several evaluation objectives were defined to assess the green ...

  13. Distributional and efficiency results for subset selection

    NARCIS (Netherlands)

    Laan, van der P.

    1996-01-01

    Assume k (k ≥ 2) populations are given. The associated independent random variables have continuous distribution functions with an unknown location parameter. The statistical selection goal is to select a non-empty subset which contains the best population, that is, the population with

  14. Domain-General Brain Regions Do Not Track Linguistic Input as Closely as Language-Selective Regions.

    Science.gov (United States)

    Blank, Idan A; Fedorenko, Evelina

    2017-10-11

    Language comprehension engages a cortical network of left frontal and temporal regions. Activity in this network is language-selective, showing virtually no modulation by nonlinguistic tasks. In addition, language comprehension engages a second network consisting of bilateral frontal, parietal, cingulate, and insular regions. Activity in this "multiple demand" (MD) network scales with comprehension difficulty, but also with cognitive effort across a wide range of nonlinguistic tasks in a domain-general fashion. Given the functional dissociation between the language and MD networks, their respective contributions to comprehension are likely distinct, yet such differences remain elusive. Prior neuroimaging studies have suggested that activity in each network covaries with some linguistic features that, behaviorally, influence on-line processing and comprehension. This sensitivity of the language and MD networks to local input characteristics has often been interpreted, implicitly or explicitly, as evidence that both networks track linguistic input closely, and in a manner consistent across individuals. Here, we used fMRI to directly test this assumption by comparing the BOLD signal time courses in each network across different people ( n = 45, men and women) listening to the same story. Language network activity showed fewer individual differences, indicative of closer input tracking, whereas MD network activity was more idiosyncratic and, moreover, showed lower reliability within an individual across repetitions of a story. These findings constrain cognitive models of language comprehension by suggesting a novel distinction between the processes implemented in the language and MD networks. SIGNIFICANCE STATEMENT Language comprehension recruits both language-specific mechanisms and domain-general mechanisms that are engaged in many cognitive processes. In the human cortex, language-selective mechanisms are implemented in the left-lateralized "core language network

  15. Sequence-Based Prediction of RNA-Binding Proteins Using Random Forest with Minimum Redundancy Maximum Relevance Feature Selection

    Directory of Open Access Journals (Sweden)

    Xin Ma

    2015-01-01

    Full Text Available The prediction of RNA-binding proteins is one of the most challenging problems in computation biology. Although some studies have investigated this problem, the accuracy of prediction is still not sufficient. In this study, a highly accurate method was developed to predict RNA-binding proteins from amino acid sequences using random forests with the minimum redundancy maximum relevance (mRMR method, followed by incremental feature selection (IFS. We incorporated features of conjoint triad features and three novel features: binding propensity (BP, nonbinding propensity (NBP, and evolutionary information combined with physicochemical properties (EIPP. The results showed that these novel features have important roles in improving the performance of the predictor. Using the mRMR-IFS method, our predictor achieved the best performance (86.62% accuracy and 0.737 Matthews correlation coefficient. High prediction accuracy and successful prediction performance suggested that our method can be a useful approach to identify RNA-binding proteins from sequence information.
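
    A hedged sketch of the ranking-plus-incremental-feature-selection loop described above, with plain mutual-information ranking standing in for mRMR and synthetic data standing in for the protein features; all names and sizes are assumptions.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.feature_selection import mutual_info_classif
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=500, n_features=60, n_informative=10,
                               random_state=0)

    # rank the features once (true mRMR would additionally penalize redundancy)
    order = np.argsort(mutual_info_classif(X, y, random_state=0))[::-1]

    best_k, best_score = 0, 0.0
    for k in range(5, 61, 5):                      # incremental feature selection (IFS)
        cols = order[:k]
        score = cross_val_score(RandomForestClassifier(n_estimators=100, random_state=0),
                                X[:, cols], y, cv=5).mean()
        if score > best_score:
            best_k, best_score = k, score
    print(best_k, round(best_score, 3))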

  16. Experimental Analysis of a Piezoelectric Energy Harvesting System for Harmonic, Random, and Sine on Random Vibration

    Energy Technology Data Exchange (ETDEWEB)

    Cryns, Jackson W.; Hatchell, Brian K.; Santiago-Rojas, Emiliano; Silvers, Kurt L.

    2013-07-01

    Harvesting power with a piezoelectric vibration powered generator using a full-wave rectifier conditioning circuit is experimentally compared for varying sinusoidal, random and sine on random (SOR) input vibration scenarios. Additionally, the implications of source vibration characteristics on harvester design are discussed. Studies in vibration harvesting have yielded numerous alternatives for harvesting electrical energy from vibrations, but piezoceramics arose as the most compact, energy dense means of energy transduction. The rise in popularity of harvesting energy from ambient vibrations has made piezoelectric generators commercially available. Much of the available literature focuses on maximizing harvested power through nonlinear processing circuits that require accurate knowledge of generator internal mechanical and electrical characteristics and idealization of the input vibration source, which cannot be assumed in general application. In this manuscript, variations in source vibration and load resistance are explored for a commercially available piezoelectric generator. We characterize the source vibration by its acceleration response for repeatability and transcription to general application. The results agree with numerical and theoretical predictions in previous literature that optimal load resistance varies with transducer natural frequency and source type, and the findings demonstrate that significant gains are seen with lower tuned transducer natural frequencies for similar source amplitudes. Going beyond idealized steady state sinusoidal and simplified random vibration input, SOR testing allows for more accurate representation of real world ambient vibration. It is shown that characteristic interactions from more complex vibrational sources significantly alter power generation and power processing

  17. [General principles of effective communication between physician and patient with selected mental disorders].

    Science.gov (United States)

    Błaszczyk, Justyna; Bobińska, Kinga; Filip, Maria; Gałecki, Piotr

    2015-04-01

    Faced with the growing frequency of mental disorders and the necessity of improving patient care, it is particularly important that physicians of different specialties know the general principles of effective communication with patients who are mentally ill. Equally important is spreading knowledge of the symptomatology of various mental illnesses. Studies published by the Institute of Psychiatry and Neurology, involving persons between 18 and 64 years old, show that 8 million Poles suffer or have suffered from mental disorders. This represents almost 25% of Polish society. The above data confirm that basic knowledge of the criteria for diagnosing mental disorders and their treatment by primary care physicians determines the success of the entire health care system. It must be taken into consideration that patients seeing a general practitioner (GP) frequently suffer from more than one mental illness, or from a mental illness accompanied by somatic disease. Adequate communication determines effective treatment. A simple yet precise message, adapted to the patient and the problems he or she reports, is a valuable tool in daily medical practice. It reduces the risk of iatrogenic disorders and improves the efficiency of the entire therapeutic process. Good cooperation with the patient is also determined by patience, empathy, understanding, and competence. The aim of this study is to present the principles of effective communication between the doctor and a patient suffering from selected mental disorders. The article defines the concept of communication, presents the symptomatology of primary psychiatric disorders, and points out the most common difficulties in the relationship between doctor and patient. © 2015 MEDPRESS.

  18. Impact of hemoglobin on plasma pro-B-type natriuretic peptide concentrations in the general population

    DEFF Research Database (Denmark)

    Nybo, Mads; Benn, Marianne; Mogelvang, Rasmus

    2007-01-01

    , the impact of hemoglobin status on proBNP concentrations has not been established in the general population. METHODS: In the 4th examination in the Copenhagen City Heart Study, we performed a nested case-control study of 6238 individuals from a Danish general population. Of these, 3497 randomly selected...... participants also underwent an echocardiographic examination. The population was stratified into groups depending on health and hemoglobin status. Correlations between hemoglobin and proBNP concentrations were examined by simple and multiple regression analyses, adjusted for variables known to influence...... the proBNP plasma concentration. RESULTS: The mean proBNP concentration was increased 1.7-fold in the group with anemia vs the nonanemic group [mean (SD) 42 (45) pmol/L vs 25 (29) pmol/L, P hemoglobin on pro...

  19. An active learning representative subset selection method using net analyte signal

    Science.gov (United States)

    He, Zhonghai; Ma, Zhenhe; Luan, Jingmin; Cai, Xi

    2018-05-01

    To guarantee accurate predictions, representative samples are needed when building a calibration model for spectroscopic measurements. However, in general, it is not known whether a sample is representative prior to measuring its concentration, which is both time-consuming and expensive. In this paper, a method to determine whether a sample should be selected into a calibration set is presented. The selection is based on the difference in the Euclidean norm of the net analyte signal (NAS) vector between the candidate and existing samples. First, the concentrations and spectra of a group of samples are used to compute the projection matrix, NAS vectors, and scalar values. Next, the NAS vectors of candidate samples are computed by multiplying the projection matrix with the spectra of those samples, and the scalar value of each NAS vector is obtained by computing its norm. The distance between the candidate set and the selected set is then computed, and the samples with the largest distance are added to the selected set sequentially. Last, the concentration of the analyte in each selected sample is measured so that the sample can be used as a calibration sample. Using a validation test, it is shown that the presented method is more efficient than random selection. As a result, the amount of time and money spent on reference measurements is greatly reduced.
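
    The selection loop can be sketched as follows, assuming spectra stored as matrix rows, a matrix `B` spanning the directions to be projected out (its construction from the calibration data is not shown), and a greedy max-min distance on NAS norms as the selection criterion; this is a minimal illustration, not the authors' implementation.

```python
# Minimal sketch of NAS-based representative subset selection.
import numpy as np


def nas_scalars(spectra, B):
    """Project spectra onto the orthogonal complement of the row space of B and
    return the Euclidean norm of each projected (net analyte signal) vector."""
    P = np.eye(B.shape[1]) - np.linalg.pinv(B) @ B   # orthogonal projector
    nas = spectra @ P.T                              # NAS vector of each spectrum
    return np.linalg.norm(nas, axis=1)


def select_representative(candidate_scalars, selected_scalars, n_to_add):
    """Greedily add candidates whose NAS scalar is farthest from the selected set."""
    selected = list(selected_scalars)
    remaining = list(range(len(candidate_scalars)))
    chosen_idx = []
    for _ in range(n_to_add):
        # distance of each remaining candidate to its nearest selected scalar
        dists = [min(abs(candidate_scalars[i] - s) for s in selected) for i in remaining]
        best = remaining[int(np.argmax(dists))]
        chosen_idx.append(best)
        selected.append(candidate_scalars[best])
        remaining.remove(best)
    return chosen_idx


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    B = rng.normal(size=(3, 50))               # assumed interference directions
    calib = rng.normal(size=(10, 50))          # already-selected calibration spectra
    candidates = rng.normal(size=(100, 50))    # unmeasured candidate spectra
    idx = select_representative(nas_scalars(candidates, B),
                                nas_scalars(calib, B), n_to_add=5)
    print(idx)
```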

  20. Survey of random surface theory

    International Nuclear Information System (INIS)

    Froehlich, J.

    1985-01-01

    The author describes some recent results in random surface theory. Attention is focused on those developments which are relevant for a quantum theory of strings. Some general remarks on the status of mathematical quantum field theory are included at the beginning. (orig.)

  1. Assessment of pharmaceutical waste management at selected hospitals and homes in Ghana.

    Science.gov (United States)

    Sasu, Samuel; Kümmerer, Klaus; Kranert, Martin

    2012-06-01

    The way pharmaceuticals are used and their waste is disposed of compromises the safety of the environment and represents a serious health risk, as pharmaceuticals may accumulate and stay active for a long time in the aquatic environment. This article therefore presents the outcome of a study on pharmaceutical waste management practices at homes and hospitals in Ghana. The study was conducted at five randomly selected healthcare institutions in Ghana, namely two teaching hospitals (hospital A, hospital B), one regional hospital (hospital C), one district hospital (hospital D) and one quasi-governmental hospital (hospital E). Apart from hospital E, which currently has a pharmaceutical waste separation programme as well as a drug return programme called DUMP (Disposal of Unused Medicines Program), none of the hospitals visited had any separate collection and disposal programme for pharmaceutical waste. A survey was also carried out among the general public, involving the questioning of randomly selected participants, in order to investigate the household disposal of unused and expired pharmaceuticals. The results from the survey showed that more than half of the respondents confirmed having unused, left-over or expired medicines at home, and over 75% disposed of pharmaceutical waste through the normal waste bins, which end up in landfills or dump sites.

  2. Randomized Prediction Games for Adversarial Machine Learning.

    Science.gov (United States)

    Rota Bulo, Samuel; Biggio, Battista; Pillai, Ignazio; Pelillo, Marcello; Roli, Fabio

    In spam and malware detection, attackers exploit randomization to obfuscate malicious data and increase their chances of evading detection at test time, e.g., malware code is typically obfuscated using random strings or byte sequences to hide known exploits. Interestingly, randomization has also been proposed to improve security of learning algorithms against evasion attacks, as it results in hiding information about the classifier to the attacker. Recent work has proposed game-theoretical formulations to learn secure classifiers, by simulating different evasion attacks and modifying the classification function accordingly. However, both the classification function and the simulated data manipulations have been modeled in a deterministic manner, without accounting for any form of randomization. In this paper, we overcome this limitation by proposing a randomized prediction game, namely, a noncooperative game-theoretic formulation in which the classifier and the attacker make randomized strategy selections according to some probability distribution defined over the respective strategy set. We show that our approach allows one to improve the tradeoff between attack detection and false alarms with respect to the state-of-the-art secure classifiers, even against attacks that are different from those hypothesized during design, on application examples including handwritten digit recognition, spam, and malware detection.

  3. Pseudo-random number generation using a 3-state cellular automaton

    Science.gov (United States)

    Bhattacharjee, Kamalika; Paul, Dipanjyoti; Das, Sukanta

    This paper investigates the potential of a 3-neighborhood, 3-state cellular automaton (CA) under periodic boundary conditions as a source of pseudo-random numbers. Theoretical and empirical tests are performed on the numbers generated by the CA to assess its quality as a pseudo-random number generator (PRNG). We analyze the strengths and weaknesses of the proposed PRNG and conclude that the selected CA is a good random number generator.
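
    A generic generator of this kind can be sketched as below. The rule table is an arbitrary example rather than the specific rule studied in the paper, and pseudo-random ternary digits are read off one cell of the periodic lattice.

```python
# Minimal sketch of a 3-neighborhood, 3-state cellular automaton used as a PRNG
# under periodic boundary conditions.
import numpy as np


def make_rule(seed=0):
    """Arbitrary rule table mapping each of the 3**3 = 27 neighborhoods to a next state."""
    rng = np.random.default_rng(seed)
    return rng.integers(0, 3, size=27)


def step(cells, rule):
    """One synchronous CA update with periodic boundaries."""
    left = np.roll(cells, 1)
    right = np.roll(cells, -1)
    idx = 9 * left + 3 * cells + right   # encode each (left, self, right) neighborhood in base 3
    return rule[idx]


def ca_prng(n_numbers, width=64, tap=0, seed=1, rule_seed=0):
    """Generate pseudo-random ternary digits by sampling one cell per time step."""
    rng = np.random.default_rng(seed)
    cells = rng.integers(0, 3, size=width)
    rule = make_rule(rule_seed)
    out = []
    for _ in range(n_numbers):
        cells = step(cells, rule)
        out.append(int(cells[tap]))
    return out


if __name__ == "__main__":
    print(ca_prng(20))   # a stream of values in {0, 1, 2}
```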

  4. A randomized controlled study of a social skills training for preadolescent children with autism spectrum disorders: generalization of skills by training parents and teachers?

    Science.gov (United States)

    2014-01-01

    Background Social skills training (SST) is a common intervention for children with autism spectrum disorders (ASDs) to improve their social and communication skills. Although SSTs are often applied in clinical practice, the evidence for the effectiveness of these trainings for children with ASD is inconclusive. Moreover, long-term outcomes and the generalization of learned skills have rarely been evaluated. Additionally, there is no research on the influence of the involvement of parents and teachers on the effectiveness of SST and on the generalization of learned social skills to daily life. We expect parent and teacher involvement in SST to enhance treatment efficacy and to facilitate generalization of learned skills to daily life. Method/Design In a randomized controlled trial (RCT) with three conditions, 120 participants with ASD at the end of primary school (10–12 years of calendar age) have been randomized to SST, SST-PTI (SST with Parent & Teacher Involvement), or care-as-usual. The SST consists of 18 group sessions of 1.5 hours for the children. In the SST-PTI condition, parents additionally participate in 8 parent sessions, and parents and teachers are actively involved in homework assignments. Assessment takes place at three moments: before and immediately after the intervention period and at 6 months follow-up. The primary outcome is socialization, as an aspect of adaptive functioning. Secondary outcomes focus on specific social skills children learn during SST and on more general social skills pertaining to home and community settings from a multi-informant perspective. Additionally, possible predictors of treatment outcome will be assessed. Discussion The current study is an RCT evaluating SST in a large sample of Dutch children with ASD in a specific age range (10–12 years). Strengths of the study are the use of one manualized protocol, application of standardized and internationally used rating instruments, use of multiple raters, investigation of

  5. A General Model of Negative Frequency Dependent Selection Explains Global Patterns of Human ABO Polymorphism.

    Directory of Open Access Journals (Sweden)

    Fernando A Villanea

    Full Text Available The ABO locus in humans is characterized by elevated heterozygosity and very similar allele frequencies among populations scattered across the globe. Using knowledge of ABO protein function, we generated a simple model of asymmetric negative frequency dependent selection and genetic drift to explain the maintenance of ABO polymorphism and its loss in human populations. Regardless of the strength of selection, models with large effective population sizes result in ABO allele frequencies that closely match those observed in most continental populations. Populations must be moderately small to fall out of equilibrium and lose either the A or B allele (N(e) ≤ 50), and much smaller (N(e) ≤ 25) for the complete loss of diversity, which nearly always involved the fixation of the O allele. A pattern of low heterozygosity at the ABO locus where loss of polymorphism occurs in our model is consistent with small populations, such as Native American populations. This study provides a general evolutionary model to explain the observed global patterns of polymorphism at the ABO locus and the pattern of allele loss in small populations. Moreover, these results inform the range of population sizes associated with the recent human colonization of the Americas.
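
    A minimal simulation in the spirit of this model combines a frequency-dependent selection step with Wright-Fisher drift; the fitness form used below, 1 + s(1 − p_i), is an assumed placeholder rather than the authors' exact asymmetric model.

```python
# Minimal sketch of negative frequency dependent selection plus drift at a
# three-allele locus (A, B, O). Illustrative only.
import numpy as np


def simulate(n_generations=2000, ne=50, s=0.05, p0=(0.25, 0.15, 0.60), seed=0):
    """Wright-Fisher sampling in which rarer alleles receive a fitness advantage."""
    rng = np.random.default_rng(seed)
    p = np.array(p0, dtype=float)            # frequencies of A, B, O
    for _ in range(n_generations):
        w = 1.0 + s * (1.0 - p)              # rarer allele -> higher marginal fitness (assumed form)
        p = p * w / np.sum(p * w)            # deterministic selection step
        counts = rng.multinomial(2 * ne, p)  # drift across 2*Ne gene copies
        p = counts / (2 * ne)
        if np.count_nonzero(p) == 1:         # one allele has fixed (often O in small populations)
            break
    return p


if __name__ == "__main__":
    for ne in (25, 50, 500):
        print(ne, simulate(ne=ne).round(3))
```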

  6. Selective sorting of waste

    CERN Multimedia

    2007-01-01

    Not much effort needed, just willpower In order to keep the cost of disposing of waste materials as low as possible, CERN provides two types of container at the entrance to each building: a green plastic one for paper/cardboard and a metal one for general refuse. For some time now we have noticed, to our great regret, a growing negligence as far as selective sorting is concerned, with, for example, the green containers being filled with a mixture of cardboard boxes full of polystyrene or protective wrappers, plastic bottles, empty yoghurt pots, etc. …We have been able to ascertain, after careful checking, that this haphazard mixing of waste cannot be attributed to the cleaning staff but rather to members of the personnel who unscrupulously throw away their rubbish in a completely random manner. Non-sorted waste entails heavy costs for CERN. For information, once a non-compliant item is found in a green container, the entire contents are sent off for incineration rather than recycling… We are all concerned...

  7. Screening for self-plagiarism in a subspecialty-versus-general imaging journal using iThenticate.

    Science.gov (United States)

    Kalnins, A U; Halm, K; Castillo, M

    2015-06-01

    Self-plagiarism is a form of research misconduct that can dilute the credibility and reputation of a scientific journal, as well as the represented specialty. Journal editors are aware of this problem when reviewing submissions and use on-line plagiarism-analysis programs to facilitate detection. The American Journal of Neuroradiology (AJNR) uses iThenticate to screen several submitted original research manuscripts selected for review per issue and retrospectively assesses 3 issues per year. The prevalence of self-plagiarism in AJNR was compared with that in Radiology; the necessity and cost of more extensive screening in AJNR were evaluated. The self-duplication rate in AJNR original research articles was compared with that in Radiology, a general imaging journal that screens all submitted original research manuscripts selected for review by using iThenticate. The rate of self-duplication in original research articles from 2 randomly selected 2012 AJNR issues was compared with the rate in the prior year to gauge the need for more extensive screening. A cost analysis of screening all submitted original research manuscripts selected for review by using iThenticate was performed. Using an empiric 15% single-source duplication threshold, we found that the rate of significant self-plagiarism in original research articles was low for both journals. While AJNR had more articles exceeding this threshold, most instances were insignificant. Analyzing 2 randomly chosen issues of AJNR for single-source duplication of >15% in original research articles yielded no significant differences compared with an entire year. The approximate annual cost of screening all submitted original research manuscripts selected for review was US $6800.00. While the rate of self-plagiarism was low in AJNR and similar to that in Radiology, its potential cost in negative impact on AJNR and the subspecialty of neuroradiology justifies the costs of broader screening. © 2015 by American Journal of

  8. Random phenomena fundamentals of probability and statistics for engineers

    CERN Document Server

    Ogunnaike, Babatunde A

    2009-01-01

    Prelude; Approach Philosophy; Four Basic Principles; I Foundations; Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach; Random Phenomena, Variability, and Uncertainty; Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework; II Probability; Fundamentals of Probability Theory; Building Blocks; Operations; Probability; Conditional Probability; Independence; Random Variables and Distributions; Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions; Multidimensional Random Variables; Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables; Random Variable Transformations; Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations; Application Case Studies I: Probability; Mendel and Heredity; World War II Warship Tactical Response Under Attack; III Distributions; Ide...

  9. Theory of Randomized Search Heuristics in Combinatorial Optimization

    DEFF Research Database (Denmark)

    The rigorous mathematical analysis of randomized search heuristics (RSHs) with respect to their expected runtime is a growing research area where many results have been obtained in recent years. This class of heuristics includes well-known approaches such as Randomized Local Search (RLS), the Metr...... analysis of randomized algorithms to RSHs. Mostly, the expected runtime of RSHs on selected problems is analyzed. Thereby, we understand why and when RSHs are efficient optimizers and, conversely, when they cannot be efficient. The tutorial will give an overview on the analysis of RSHs for solving

  10. Duloxetine in the treatment of generalized anxiety disorder

    Directory of Open Access Journals (Sweden)

    Alan Wright

    2009-08-01

    Full Text Available Alan Wright, Chad VanDenBerg, Center for Clinical Research, Mercer University, Atlanta, GA, USA. Abstract: Duloxetine is a serotonin-norepinephrine reuptake inhibitor (SNRI) which is FDA approved for the treatment of generalized anxiety disorder (GAD) in doses of 30 mg to 120 mg daily. Duloxetine has been shown to significantly improve symptoms of GAD as measured through the Hamilton Anxiety Rating Scale (HAMA), the Clinical Global Impressions Scale (CGI-I), and other various outcome measures in several placebo-controlled, randomized, double-blind, multi-center studies. Symptom improvement began within the first few weeks and continued for the duration of the studies. In addition, duloxetine has also been shown to improve outcomes in elderly patients with GAD and in GAD patients with clinically significant pain symptoms. Duloxetine was noninferior compared with venlafaxine XR. Duloxetine was found to have a good tolerability profile which was predictable and similar to another SNRI, venlafaxine. Adverse events (AEs) such as nausea, constipation, dry mouth, and insomnia were mild and transient, and occurred at relatively low rates. It was found to have a low frequency of drug interactions. In conclusion, duloxetine, a selective inhibitor of the serotonin and norepinephrine transporters, is efficacious in the treatment of GAD and has a predictable tolerability profile, with AEs generally being mild to moderate. Keywords: duloxetine, generalized anxiety disorder, anxiety, GAD

  11. A generalized conditional heteroscedastic model for temperature downscaling

    Science.gov (United States)

    Modarres, R.; Ouarda, T. B. M. J.

    2014-11-01

    This study describes a method for deriving the time-varying second-order moment, or heteroscedasticity, of local daily temperature and its association with large-scale Canadian Coupled General Circulation Model predictors. This is carried out by applying a multivariate generalized autoregressive conditional heteroscedasticity (MGARCH) approach to construct the conditional variance-covariance structure between General Circulation Model (GCM) predictors and maximum and minimum temperature time series during 1980-2000. Two MGARCH specifications, namely diagonal VECH and dynamic conditional correlation (DCC), are applied, and 25 GCM predictors were selected for bivariate temperature heteroscedastic modeling. It is observed that the conditional covariance between predictors and temperature is not very strong and mostly depends on the interaction between the random processes governing the temporal variation of predictors and predictands. The DCC model reveals a time-varying conditional correlation between GCM predictors and temperature time series. No remarkable increasing or decreasing change is observed in the correlation coefficients between GCM predictors and observed temperature during 1980-2000, while a weak winter-summer seasonality is clear for both conditional covariance and correlation. Furthermore, the Kwiatkowski-Phillips-Schmidt-Shin (KPSS) stationarity test and the Brock-Dechert-Scheinkman (BDS) nonlinearity test showed that the GCM predictors, temperature, and their conditional correlation time series are nonlinear but stationary during 1980-2000. However, the degree of nonlinearity of the temperature time series is higher than that of most of the GCM predictors.
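
    For orientation, the textbook building blocks behind such a specification are the univariate GARCH(1,1) variance recursion and the DCC correlation recursion, shown below in their standard forms; the study's exact multivariate specification may differ in detail.

```latex
% GARCH(1,1) conditional variance and DCC correlation recursions (standard forms),
% with z_t the vector of standardized residuals and \bar{Q} its unconditional covariance.
\sigma_t^2 = \omega + \alpha\,\varepsilon_{t-1}^2 + \beta\,\sigma_{t-1}^2, \qquad
Q_t = (1-a-b)\,\bar{Q} + a\, z_{t-1} z_{t-1}^{\top} + b\, Q_{t-1}, \qquad
R_t = \operatorname{diag}(Q_t)^{-1/2}\, Q_t\, \operatorname{diag}(Q_t)^{-1/2}
```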

  12. Evolution in fluctuating environments: decomposing selection into additive components of the Robertson-Price equation.

    Science.gov (United States)

    Engen, Steinar; Saether, Bernt-Erik

    2014-03-01

    We analyze the stochastic components of the Robertson-Price equation for the evolution of quantitative characters, which enables decomposition of the selection differential into components due to demographic and environmental stochasticity. We show how these two types of stochasticity affect the evolution of multivariate quantitative characters by defining demographic and environmental variances as components of individual fitness. The exact covariance formula for selection is decomposed into three components: the deterministic mean value, and stochastic demographic and environmental components. We show that demographic and environmental stochasticity generate random genetic drift and fluctuating selection, respectively. This provides a common theoretical framework for linking ecological and evolutionary processes. Demographic stochasticity can cause random variation in selection differentials independent of fluctuating selection caused by environmental variation. We use this model of selection to illustrate that the effect on the expected selection differential of random variation in individual fitness depends on population size, and that the strength of fluctuating selection is affected by how environmental variation affects the covariance in Malthusian fitness between individuals with different phenotypes. Thus, our approach enables us to partition out the effects of fluctuating selection from the effects of selection due to random variation in individual fitness caused by demographic stochasticity. © 2013 The Author(s). Evolution © 2013 The Society for the Study of Evolution.
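
    For reference, the Robertson-Price equation referred to above is, in its standard form (the paper's further stochastic decomposition of the covariance term is not reproduced here):

```latex
% Change in the mean character \bar{z} given individual fitness w_i, mean fitness
% \bar{w}, and character values z_i; the transmission term E[w_i \Delta z_i] covers
% within-generation change.
\Delta \bar{z} \;=\; \frac{\operatorname{Cov}(w_i, z_i)}{\bar{w}} \;+\; \frac{\operatorname{E}\!\left[ w_i \, \Delta z_i \right]}{\bar{w}}
```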

  13. DNABP: Identification of DNA-Binding Proteins Based on Feature Selection Using a Random Forest and Predicting Binding Residues.

    Science.gov (United States)

    Ma, Xin; Guo, Jing; Sun, Xiao

    2016-01-01

    DNA-binding proteins are fundamentally important in cellular processes. Several computational-based methods have been developed to improve the prediction of DNA-binding proteins in previous years. However, insufficient work has been done on the prediction of DNA-binding proteins from protein sequence information. In this paper, a novel predictor, DNABP (DNA-binding proteins), was designed to predict DNA-binding proteins using the random forest (RF) classifier with a hybrid feature. The hybrid feature contains two types of novel sequence features, which reflect information about the conservation of physicochemical properties of the amino acids, and the binding propensity of DNA-binding residues and non-binding propensities of non-binding residues. The comparisons with each feature demonstrated that these two novel features contributed most to the improvement in predictive ability. Furthermore, to improve the prediction performance of the DNABP model, feature selection using the minimum redundancy maximum relevance (mRMR) method combined with incremental feature selection (IFS) was carried out during the model construction. The results showed that the DNABP model could achieve 86.90% accuracy, 83.76% sensitivity, 90.03% specificity and a Matthews correlation coefficient of 0.727. High prediction accuracy and performance comparisons with previous research suggested that DNABP could be a useful approach to identify DNA-binding proteins from sequence information. The DNABP web server system is freely available at http://www.cbi.seu.edu.cn/DNABP/.

  14. Hoeffding’s Inequality for Sums of Dependent Random Variables

    Czech Academy of Sciences Publication Activity Database

    Pelekis, Christos; Ramon, J.

    2017-01-01

    Vol. 14, No. 6 (2017), Article No. 243. ISSN 1660-5446. Institutional support: RVO:67985807. Keywords: dependent random variables * Hoeffding’s inequality * k-wise independent random variables * martingale differences. Subject RIV: BA - General Mathematics. OBOR OECD: Pure mathematics. Impact factor: 0.868, year: 2016
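
    For reference, the classical Hoeffding inequality for independent bounded random variables, which the paper extends to certain dependent settings (k-wise independence, martingale differences), reads:

```latex
% X_1, ..., X_n independent with X_i \in [a_i, b_i] and S_n = X_1 + ... + X_n.
\Pr\bigl( \lvert S_n - \operatorname{E}[S_n] \rvert \ge t \bigr) \;\le\; 2 \exp\!\left( - \frac{2 t^2}{\sum_{i=1}^{n} (b_i - a_i)^2} \right)
```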

  15. Association between general and abdominal obesity with high blood pressure: difference between genders.

    Science.gov (United States)

    Silva, Alison O; Silva, Micaelly V; Pereira, Lisley K N; Feitosa, Wallacy M N; Ritti-Dias, Raphael M; Diniz, Paula R B; Oliveira, Luciano M F T

    2016-01-01

    To assess the association between general and abdominal obesity and high blood pressure in adolescents of both genders from the public school system. This was an epidemiological, descriptive, exploratory study with a quantitative approach and local scope, whose sample consisted of 481 high school students (aged 14-19), selected by using a random cluster sampling strategy. Blood pressure was measured with an automated monitor and was considered high when the pressure values were at or above the 95th percentile. The analyses were performed using the chi-squared test and binary logistic regression. The prevalence of high blood pressure was 6.4%, and it was higher among boys (9.0% vs. 4.7%, p [...]). High blood pressure was associated with general (OR=6.4; p [...]) [...] high blood pressure only in boys, regardless of age. Copyright © 2015 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.

  16. A Heckman Selection- t Model

    KAUST Repository

    Marchenko, Yulia V.

    2012-03-01

    Sample selection often arises in practice as a result of the partial observability of the outcome of interest in a study. In the presence of sample selection, the observed data do not represent a random sample from the population, even after controlling for explanatory variables. That is, data are missing not at random. Thus, standard analysis using only complete cases will lead to biased results. Heckman introduced a sample selection model to analyze such data and proposed a full maximum likelihood estimation method under the assumption of normality. The method was criticized in the literature because of its sensitivity to the normality assumption. In practice, data, such as income or expenditure data, often violate the normality assumption because of heavier tails. We first establish a new link between sample selection models and recently studied families of extended skew-elliptical distributions. This allows us to introduce a selection-t (SLt) model, which models the error distribution using a Student's t distribution. We study its properties and investigate the finite-sample performance of the maximum likelihood estimators for this model. We compare the performance of the SLt model to the conventional Heckman selection-normal (SLN) model and apply it to analyze ambulatory expenditures. Unlike the SLN model, our analysis using the SLt model provides statistical evidence for the existence of sample selection bias in these data. We also investigate the performance of the test for sample selection bias based on the SLt model and compare it with the performances of several tests used with the SLN model. Our findings indicate that the latter tests can be misleading in the presence of heavy-tailed data. © 2012 American Statistical Association.
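
    For orientation, the classical Heckman sample-selection setup can be written as below; the SLN model takes the error pair to be bivariate normal, while the selection-t (SLt) model replaces it with a bivariate Student's t to accommodate heavy tails.

```latex
% Outcome and selection equations: y_i is observed only when the latent selection
% variable s_i^* is positive; (\varepsilon_i, u_i) are correlated errors.
y_i^{*} = \mathbf{x}_i^{\top}\boldsymbol{\beta} + \varepsilon_i, \qquad
s_i^{*} = \mathbf{z}_i^{\top}\boldsymbol{\gamma} + u_i, \qquad
y_i = y_i^{*} \ \text{observed iff}\ s_i^{*} > 0
```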

  17. Feature-selective attention in healthy old age: a selective decline in selective attention?

    Science.gov (United States)

    Quigley, Cliodhna; Müller, Matthias M

    2014-02-12

    Deficient selection against irrelevant information has been proposed to underlie age-related cognitive decline. We recently reported evidence for maintained early sensory selection when older and younger adults used spatial selective attention to perform a challenging task. Here we explored age-related differences when spatial selection is not possible and feature-selective attention must be deployed. We additionally compared the integrity of feedforward processing by exploiting the well established phenomenon of suppression of visual cortical responses attributable to interstimulus competition. Electroencephalogram was measured while older and younger human adults responded to brief occurrences of coherent motion in an attended stimulus composed of randomly moving, orientation-defined, flickering bars. Attention was directed to horizontal or vertical bars by a pretrial cue, after which two orthogonally oriented, overlapping stimuli or a single stimulus were presented. Horizontal and vertical bars flickered at different frequencies and thereby elicited separable steady-state visual-evoked potentials, which were used to examine the effect of feature-based selection and the competitive influence of a second stimulus on ongoing visual processing. Age differences were found in feature-selective attentional modulation of visual responses: older adults did not show consistent modulation of magnitude or phase. In contrast, the suppressive effect of a second stimulus was robust and comparable in magnitude across age groups, suggesting that bottom-up processing of the current stimuli is essentially unchanged in healthy old age. Thus, it seems that visual processing per se is unchanged, but top-down attentional control is compromised in older adults when space cannot be used to guide selection.

  18. Many random walks are faster than one

    Czech Academy of Sciences Publication Activity Database

    Alon, N.; Avin, Ch.; Koucký, Michal; Kozma, G.; Lotker, Z.; Tuttle, M.R.

    2011-01-01

    Vol. 20, No. 4 (2011), pp. 481-502. ISSN 0963-5483. R&D Projects: GA ČR GP201/07/P276; GA ČR GA201/05/0124. Institutional research plan: CEZ:AV0Z10190503. Keywords: multiple random walks * parallel random walks. Subject RIV: BA - General Mathematics. Impact factor: 0.778, year: 2011. http://journals.cambridge.org/action/displayAbstract?fromPage=online&aid=8280727

  19. Identification of Random Dynamic Force Using an Improved Maximum Entropy Regularization Combined with a Novel Conjugate Gradient

    Directory of Open Access Journals (Sweden)

    ChunPing Ren

    2017-01-01

    Full Text Available We propose a novel mathematical algorithm to offer a solution for the inverse random dynamic force identification problem in practical engineering. To deal with the random dynamic force identification problem using the proposed algorithm, an improved maximum entropy (IME) regularization technique is transformed into an unconstrained optimization problem, and a novel conjugate gradient (NCG) method is applied to solve the objective function; the combination is abbreviated as the IME-NCG algorithm. The result of the IME-NCG algorithm is compared with that of the ME, ME-CG, ME-NCG, and IME-CG algorithms; it is found that the IME-NCG algorithm is well suited to identifying the random dynamic force due to its smaller root-mean-square error (RMSE), lower restoration time, and fewer iterative steps. An example of an engineering application shows that the L-curve method, which performs better than the Generalized Cross Validation (GCV) method, is applied to select the regularization parameter; thus the proposed algorithm can help to alleviate the ill-conditioned problem in the identification of dynamic forces and to acquire an optimal solution of the inverse problem in practical engineering.

  20. Evaluation of Randomly Selected Completed Medical Records Sheets in Teaching Hospitals of Jahrom University of Medical Sciences, 2009

    Directory of Open Access Journals (Sweden)

    Mohammad Parsa Mahjob

    2011-06-01

    Full Text Available Background and objective: Medical record documentation is often used to protect patients' legal rights and to provide information for medical researchers, general studies, the education of health care staff, and qualitative surveys. There is a need to control the amount of data entered in patients' medical record sheets, considering that these sheets are often completed after service delivery to the patient has finished. Therefore, in this study the completeness of medical history sheets, operation reports, and physician order sheets filled in by different documenters in Jahrom teaching hospitals during 2009 was analyzed. Methods and Materials: In this descriptive, retrospective study, 400 medical record sheets of patients from two teaching hospitals affiliated with Jahrom University of Medical Sciences were randomly selected. The data collection tool was a checklist based on the content of the medical history sheet, operation report, and physician order sheet. The data were analyzed with SPSS (version 10) software and Microsoft Office Excel 2003. Results: The average completeness of personal (demographic) data entered in the medical history, physician order, and operation report sheets by departmental secretaries was 32.9, 35.8, and 40.18 percent, respectively. The average completeness of clinical data entered by physicians in the medical history sheet was 38 percent. Surgical data entered by the surgeon in the operation report sheet were 94.77 percent complete. The average completeness of data entered by the operating room nurse in the operation report sheet was 36.78 percent, and of physician order data entered by physicians in the physician order sheet, 99.3 percent. Conclusion: According to this study, the completion rate of the record sheets reviewed in Jahrom teaching hospitals was not desirable and in some cases was very weak and incomplete. This deficiency was due to different reasons such as documenters' negligence, lack of adequate education for documenters, and high work

  1. Semi-Individualized Homeopathy Add-On Versus Usual Care Only for Premenstrual Disorders: A Randomized, Controlled Feasibility Study.

    Science.gov (United States)

    Klein-Laansma, Christien T; Jong, Mats; von Hagens, Cornelia; Jansen, Jean Pierre C H; van Wietmarschen, Herman; Jong, Miek C

    2018-03-22

    Premenstrual syndrome and premenstrual dysphoric disorder (PMS/PMDD) bother a substantial number of women. Homeopathy seems a promising treatment, but it needs investigation using reliable study designs. The feasibility of organizing an international randomized pragmatic trial on a homeopathic add-on treatment (usual care [UC] + HT) compared with UC alone was evaluated. A multicenter, randomized, controlled pragmatic trial with parallel groups. The study was organized in general and private homeopathic practices in the Netherlands and Sweden and in an outpatient university clinic in Germany. Women diagnosed as having PMS/PMDD, based on prospective daily rating by the daily record of severity of problems (DRSP) during a period of 2 months, were included and randomized. Women were to receive UC + HT or UC for 4 months. Homeopathic medicine selection was according to a previously tested prognostic questionnaire and electronic algorithm. Usual care was as provided by the women's general practitioner according to their preferences. Before and after treatment, the women completed diaries (DRSP), the measure yourself concerns and well-being, and other questionnaires. Intention-to-treat (ITT) and per protocol (PP) analyses were performed. In Germany, the study could not proceed because of legal limitations. In Sweden, recruitment proved extremely difficult. In the Netherlands and Sweden, 60 women were randomized (UC + HT: 28; UC: 32), data of 47/46 women were analyzed (ITT/PP). After 4 months, relative mean change of DRSP scores in the UC + HT group was significantly better than in the UC group (p = 0.03). With respect to recruitment and different legal status, it does not seem feasible to perform a larger, international, pragmatic randomized trial on (semi-)individualized homeopathy for PMS/PMDD. Since the added value of HT compared with UC was demonstrated by significant differences in symptom score changes, further studies are warranted.

  2. Flow, transport and diffusion in random geometries I: a MLMC algorithm

    KAUST Repository

    Canuto, Claudio

    2015-01-07

    Multilevel Monte Carlo (MLMC) is an efficient and flexible solution for the propagation of uncertainties in complex models, where an explicit parametrization of the input randomness is not available or too expensive. We propose a general-purpose algorithm and computational code for the solution of Partial Differential Equations (PDEs) on random geometries and with random parameters. We make use of the key idea of MLMC, based on different discretization levels, and extend it to a more general context, making use of a hierarchy of physical resolution scales, solvers, models and other numerical/geometrical discretization parameters. Modifications of the classical MLMC estimators are proposed to further reduce variance in cases where analytical convergence rates and asymptotic regimes are not available. Spheres, ellipsoids and general convex-shaped grains are placed randomly in the domain with different placing/packing algorithms, and the effective properties of the heterogeneous medium are computed. These are, for example, effective diffusivities, conductivities, and reaction rates. The implementation of the Monte Carlo estimators, the statistical samples and each single solver is done efficiently in parallel.
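
    The core MLMC idea, a telescoping estimator over discretization levels, can be sketched as follows; the random-geometry generation, PDE solvers and variance-based sample allocation of the paper are not shown, and the sampler here is a toy placeholder.

```python
# Minimal sketch of the standard MLMC telescoping estimator
#   E[Q_L] ~ E[Q_0] + sum_l E[Q_l - Q_{l-1}],
# assuming a user-supplied sampler(level, z) that evaluates the quantity of interest
# at the given level from a shared random input z (the coupling reduces variance).
import numpy as np


def mlmc_estimate(sampler, samples_per_level, seed=0):
    """Return the MLMC estimate given the number of samples on each level."""
    rng = np.random.default_rng(seed)
    estimate = 0.0
    for level, n in enumerate(samples_per_level):
        diffs = []
        for _ in range(n):
            z = rng.normal()                              # shared random input couples the two levels
            fine = sampler(level, z)
            coarse = sampler(level - 1, z) if level > 0 else 0.0
            diffs.append(fine - coarse)                   # correction term Q_l - Q_{l-1}
        estimate += np.mean(diffs)
    return estimate


def toy_sampler(level, z):
    """Toy quantity of interest: a noisy approximation that improves with level."""
    h = 2.0 ** (-level)                                   # discretization error shrinks with level
    return 1.0 + h * z                                    # 'exact' value 1.0 plus level-dependent error


if __name__ == "__main__":
    print(mlmc_estimate(toy_sampler, samples_per_level=[400, 100, 25]))
```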

  3. TU-AB-202-10: How Effective Are Current Atlas Selection Methods for Atlas-Based Auto-Contouring in Radiotherapy Planning?

    Energy Technology Data Exchange (ETDEWEB)

    Peressutti, D; Schipaanboord, B; Kadir, T; Gooding, M [Mirada Medical Limited, Science and Medical Technology, Oxford (United Kingdom); Soest, J van; Lustberg, T; Elmpt, W van; Dekker, A [Maastricht University Medical Centre, Department of Radiation Oncology MAASTRO - GROW School for Oncology Developmental Biology, Maastricht (Netherlands)

    2016-06-15

    Purpose: To investigate the effectiveness of atlas selection methods for improving atlas-based auto-contouring in radiotherapy planning. Methods: 275 H&N clinically delineated cases were employed as an atlas database from which atlases would be selected. A further 40 previously contoured cases were used as test patients against which atlas selection could be performed and evaluated. 26 variations of selection methods proposed in the literature and used in commercial systems were investigated. Atlas selection methods comprised either global or local image similarity measures, computed after rigid or deformable registration, combined with direct atlas search or with an intermediate template image. Workflow Box (Mirada-Medical, Oxford, UK) was used for all auto-contouring. Results on brain, brainstem, parotids and spinal cord were compared to random selection, a fixed set of 10 “good” atlases, and optimal selection by an “oracle” with knowledge of the ground truth. The Dice score and the average ranking with respect to the “oracle” were employed to assess the performance of the top 10 atlases selected by each method. Results: The fixed set of “good” atlases outperformed all of the atlas-patient image similarity-based selection methods (mean Dice 0.715 c.f. 0.603 to 0.677). In general, methods based on exhaustive comparison of local similarity measures showed better average Dice scores (0.658 to 0.677) compared to the use of either a template image (0.655 to 0.672) or global similarity measures (0.603 to 0.666). The performance of image-based selection methods was found to be only slightly better than random selection (0.645). The Dice scores given relate to the left parotid, but similar result patterns were observed for all organs. Conclusion: Intuitively, atlas selection based on the patient CT is expected to improve auto-contouring performance. However, it was found that published approaches performed marginally better than random and use of a fixed set of
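
    For reference, the Dice score used above to compare auto-contours with ground-truth delineations can be computed as in this minimal sketch, assuming binary masks stored as equal-shaped boolean arrays.

```python
# Minimal sketch of the Dice similarity coefficient between two binary masks.
import numpy as np


def dice_score(mask_a, mask_b):
    """Dice coefficient: 2|A intersect B| / (|A| + |B|), in [0, 1]."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0                      # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom


if __name__ == "__main__":
    auto = np.zeros((4, 4), dtype=bool)
    auto[1:3, 1:3] = True               # toy auto-contour
    truth = np.zeros((4, 4), dtype=bool)
    truth[1:3, 1:4] = True              # toy ground truth
    print(round(dice_score(auto, truth), 3))   # 0.8
```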

  4. Selecting the Best: Evolutionary Engineering of Chemical Production in Microbes.

    Science.gov (United States)

    Shepelin, Denis; Hansen, Anne Sofie Lærke; Lennen, Rebecca; Luo, Hao; Herrgård, Markus J

    2018-05-11

    Microbial cell factories have proven to be an economical means of production for many bulk, specialty, and fine chemical products. However, we still lack both a holistic understanding of organism physiology and the ability to predictively tune enzyme activities in vivo, thus slowing down rational engineering of industrially relevant strains. An alternative concept to rational engineering is to use evolution as the driving force to select for desired changes, an approach often described as evolutionary engineering. In evolutionary engineering, in vivo selections for a desired phenotype are combined with either generation of spontaneous mutations or some form of targeted or random mutagenesis. Evolutionary engineering has been used to successfully engineer easily selectable phenotypes, such as utilization of a suboptimal nutrient source or tolerance to inhibitory substrates or products. In this review, we focus primarily on a more challenging problem-the use of evolutionary engineering for improving the production of chemicals in microbes directly. We describe recent developments in evolutionary engineering strategies, in general, and discuss, in detail, case studies where production of a chemical has been successfully achieved through evolutionary engineering by coupling production to cellular growth.

  5. Preferential selection based on degree difference in the spatial prisoner's dilemma games

    Science.gov (United States)

    Huang, Changwei; Dai, Qionglin; Cheng, Hongyan; Li, Haihong

    2017-10-01

    Strategy evolution in spatial evolutionary games is generally implemented through imitation processes between individuals. In most previous studies, it is assumed that individuals pick up one of their neighbors randomly to learn from. However, by considering the heterogeneity of individuals' influence in the real society, preferential selection is more realistic. Here, we introduce a preferential selection mechanism based on degree difference into spatial prisoner's dilemma games on Erdös-Rényi networks and Barabási-Albert scale-free networks and investigate the effects of the preferential selection on cooperation. The results show that, when the individuals prefer to choose the neighbors who have small degree difference with themselves to imitate, cooperation is hurt by the preferential selection. In contrast, when the individuals prefer to choose those large degree difference neighbors to learn from, there exists optimal preference strength resulting in the maximal cooperation level no matter what the network structure is. In addition, we investigate the robustness of the results against variations of the noise, the average degree and the size of network in the model, and find that the qualitative features of the results are unchanged.
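
    A minimal sketch of such a mechanism is given below: a neighbor is chosen with probability proportional to exp(alpha·|k_i − k_j|), where the exponential weighting is an assumed illustrative form (the paper's exact weighting is not reproduced), and the standard Fermi rule, a commonly used imitation rule assumed here, decides whether the strategy is adopted.

```python
# Minimal sketch of degree-difference-based preferential neighbor selection
# in a spatial prisoner's dilemma. Illustrative assumptions only.
import numpy as np
import networkx as nx


def pick_neighbor(G, i, alpha, rng):
    """Choose a neighbor of node i with probability proportional to exp(alpha*|k_i - k_j|):
    alpha > 0 favors large degree differences, alpha < 0 small ones, alpha = 0 is uniform."""
    neighbors = list(G.neighbors(i))
    diffs = np.array([abs(G.degree(i) - G.degree(j)) for j in neighbors], dtype=float)
    weights = np.exp(alpha * diffs)
    return rng.choice(neighbors, p=weights / weights.sum())


def fermi_imitation(payoff_i, payoff_j, K=0.1):
    """Probability that i adopts j's strategy (standard Fermi update rule with noise K)."""
    return 1.0 / (1.0 + np.exp((payoff_i - payoff_j) / K))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    G = nx.barabasi_albert_graph(100, 3, seed=0)   # scale-free interaction network
    j = pick_neighbor(G, i=0, alpha=1.0, rng=rng)
    print(j, round(fermi_imitation(1.0, 1.5), 3))
```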

  6. Comparison of the effects of spinal epidural and general anesthesia on coagulation and fibrinolysis in laparoscopic cholecystectomy: a randomized controlled trial: VSJ Competition, 2nd place.

    Science.gov (United States)

    Demiryas, Suleyman; Donmez, Turgut; Erdem, Vuslat Muslu; Erdem, Duygu Ayfer; Hatipoglu, Engin; Ferahman, Sina; Sunamak, Oguzhan; Zengin, Lale Yoldas; Kocakusak, Ahmet

    2017-09-01

    Laparoscopic cholecystectomy (LC) is usually performed under general anesthesia. Recently, laparoscopic cholecystectomy under regional anesthesia has become popular, but this creates a serious risk of thromboembolism because of the pneumoperitoneum, the anesthesia technique, operative positioning, and patient-specific risk factors. This randomized controlled trial compares the effects of two different anesthesia techniques for laparoscopic cholecystectomy on coagulation and fibrinolysis. This randomized prospective study included 60 patients at low risk of deep vein thrombosis (DVT) who underwent elective LC without thromboembolism prophylaxis. The patients were randomly divided into two groups according to the anesthesia technique: the general anesthesia (group 1, n = 30) and spinal epidural anesthesia (group 2, n = 30) groups. The prothrombin time (PT), thrombin time (TT), international normalized ratio (INR), activated partial thromboplastin time (aPTT), and blood levels of D-dimer (DD) and fibrinogen (F) were recorded preoperatively (pre), at the first hour (post 1) and at 24 h (post 24) after surgery. These results were compared both between and within the groups. The mean age was 51.5 ±16.7 years (range: 19-79 years). Pneumoperitoneum time was similar between group 1 (33.8 ±7.8) and group 2 (34.8 ±10.4). The TT levels declined significantly postoperatively in both groups. The levels of PT, aPTT, INR, D-dimer and fibrinogen increased dramatically postoperatively in both groups. While no DVT occurred, there was a significant decline in TT. There was a dramatic rise in PT, INR, D-dimer, fibrin degradation products (FDP), and fibrinogen following LC. This may be attributed to the effects of the pneumoperitoneum and the anesthesia techniques on portal vein flow.

  7. Habitat selection of a declining white-tailed deer herd in the central Black Hills, South Dakota and Wyoming

    Science.gov (United States)

    Deperno, Christopher Shannon

    Habitat selection, survival rates, the Black Hills National Forest Habitat Capability Model (HABCAP), and the USDA Forest Service Geographic Information System (GIS) data base were evaluated for a declining white-tailed deer (Odocoileus virginianus dacotensis) herd in the central Black Hills of South Dakota and Wyoming. From July 1993 through July 1996, 73 adult and yearling female and 12 adult and yearling male white-tailed deer were radiocollared and visually monitored. Habitat information was collected at 4,662 white-tailed deer locations and 1,087 random locations. Natural mortality (71%) was the primary cause of female mortality, followed by harvest (22.5%) and accidental causes (6.5%). More females died in spring (53.2%) than in fall (22.6%), winter (14.5%), or summer (9.7%). Male mortality resulted from hunting in fall (66.7%) and natural causes in spring (33.3%). Survival rates for all deer by year were 62.1% in 1993, 51.1% in 1994, 56.4% in 1995, and 53.9% in 1996 and were similar (P = 0.691) across years. During winter, white-tailed deer selected ponderosa pine- (Pinus ponderosa ) deciduous and burned pine cover types. Overstory-understory habitats selected included pine/grass-forb, pine/bearberry (Arctostaphylos uva-ursi), pine/snowberry (Symphoricarpos albus), burned pine/grass-forb, and pine/shrub habitats. Structural stages selected included sapling-pole pine stands with >70% canopy cover, burned pine sapling-pole and saw-timber stands with 40% canopy cover and all sapling-pole pine structural stages; sapling-pole stands with >70% canopy cover received the greatest use. White-tailed deer primarily fed in pine saw-timber structural stage with less than 40% canopy cover. Overall, selected habitats contained lower amounts of grass/forb, shrubs, and litter than random locations. Male and female deer generally bedded in areas that were characterized by greater horizontal cover than feeding and random sites. When feeding and bedding sites were combined

  8. General quality of life of patients with acne vulgaris before and after performing selected cosmetological treatments

    Directory of Open Access Journals (Sweden)

    Chilicka K

    2017-08-01

    Full Text Available Karolina Chilicka,1 Joanna Maj,2 Bernard Panaszek3 1Department of Cosmetology, Opole Medical School, Opole, 2Department of Dermatology, Venereology and Allergology, 3Department of Internal Medicine and Allergy, Wroclaw Medical University, Wrocław, Poland. Background: Achieving a satisfying quality of life for a patient by applying individually matched therapy is both a great challenge and a priority for contemporary medicine. Patients with visible dermatological ailments are particularly susceptible to a reduction in general quality of life. Among dermatological diseases, acne causes a considerable reduction in quality of life and changes in self-perception that lead to the worsening of a patient's mental condition, including depression and suicidal thoughts. As a result, difficulties in contact with loved ones, as well as social and professional problems, are observed, which shows that acne is not a somatic problem alone. To a large extent, it becomes a part of psychodermatology and an important topic of public health in social medicine practice. Pharmacological treatment of acne is a challenge for the dermatologist and often requires cooperation with a cosmetologist. Cosmetological treatments are aimed at improving the condition of the skin and at the reduction or resolution of acne skin changes. Aim: The aim of this study was to assess the influence of selected cosmetological treatments on the general quality of life of patients with acne. Materials and methods: The study group consisted of 101 women aged 19–29 years (x̅ = 22.5 years, SD = 2.3 years). All subjects were diagnosed with acne vulgaris of the face. In the study group, the acne changes had been present for 3–15 years (x̅ = 8.1 years, SD = 2.7 years). Selected cosmetological treatments (intense pulsed light, alpha-hydroxy acids, cavitation peeling, needle-free mesotherapy, diamond microdermabrasion and sonophoresis) were performed in

  9. Comparison of non-directive counselling and cognitive behaviour therapy for patients presenting in general practice with an ICD-10 depressive episode: a randomized control trial.

    Science.gov (United States)

    King, M; Marston, L; Bower, P

    2014-07-01

    Most evidence in the UK on the effectiveness of brief therapy for depression concerns cognitive behaviour therapy (CBT). In a trial published in 2000, we showed that non-directive counselling and CBT were equally effective in general practice for patients with depression and mixed anxiety and depression. Our results were criticized for including patients not meeting diagnostic criteria for a depressive disorder. In this reanalysis we aimed to compare the effectiveness of the two therapies for patients with an ICD-10 depressive episode. Patients with an ICD-10 depressive episode or mixed anxiety and depression were randomized to counselling, CBT or usual general practitioner (GP) care. Counsellors provided nondirective, interpersonal counselling following a manual that we developed based on the work of Carl Rogers. Cognitive behaviour therapists provided CBT also guided by a manual. Modelling was carried out using generalized estimating equations with the multiply imputed datasets. Outcomes were mean scores on the Beck Depression Inventory, Brief Symptom Inventory, and Social Adjustment Scale at 4 and 12 months. A total of 134 participants were randomized to CBT, 126 to counselling and 67 to usual GP care. We undertook (1) an interaction analysis using all 316 patients who were assigned a diagnosis and (2) a head-to-head comparison using only those 130 (41%) participants who had an ICD-10 depressive episode at baseline. CBT and counselling were both superior to GP care at 4 months but not at 12 months. There was no difference in the effectiveness of the two psychological therapies. We recommend that national clinical guidelines take our findings into consideration in recommending effective alternatives to CBT.

  10. Opportunistic Relay Selection with Cooperative Macro Diversity

    Directory of Open Access Journals (Sweden)

    Yu Chia-Hao

    2010-01-01

    Full Text Available We apply a fully opportunistic relay selection scheme to study cooperative diversity in a semianalytical manner. In our framework, idle Mobile Stations (MSs) are capable of being used as Relay Stations (RSs), and no relaying is required if the direct path is strong. Our relay selection scheme is fully selection based: either the direct path or one of the relaying paths is selected. Macro diversity, which is often ignored in analytical works, is taken into account together with micro diversity by using a complete channel model that includes both shadow fading and fast fading effects. The stochastic geometry of the network is taken into account by having a random number of randomly located MSs. The outage probability analysis of the selection differs from the case where only fast fading is considered. Within our framework, the distribution of the received power is formulated using different Channel State Information (CSI) assumptions to simulate both optimistic and practical environments. The results show that the relay selection gain can be significant given a suitable number of candidate RSs. Also, while relay selection according to incomplete CSI is diversity-suboptimal compared to relay selection based on full CSI, the loss in average throughput is not too significant. This is a consequence of the dominance of geometry over fast fading.

  11. Statistical auditing and randomness test of lotto k/N-type games

    Science.gov (United States)

    Coronel-Brizio, H. F.; Hernández-Montoya, A. R.; Rapallo, F.; Scalas, E.

    2008-11-01

    One of the most popular lottery games worldwide is the so-called “lotto k/N”. It considers N numbers 1,2,…,N from which k are drawn randomly, without replacement. A player selects k or more numbers and the first prize is shared amongst those players whose selected numbers match all of the k randomly drawn. Exact rules may vary in different countries. In this paper, mean values and covariances for the random variables representing the numbers drawn from this kind of game are presented, with the aim of using them to audit statistically the consistency of a given sample of historical results with theoretical values coming from a hypergeometric statistical model. The method can be adapted to test pseudorandom number generators.
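
    A simple audit in this spirit can be sketched as follows: over n fair draws every number should appear with probability k/N per draw, so observed counts can be compared with their expectation via a chi-square goodness-of-fit test. This is an illustration, not the paper's exact mean/covariance procedure, and the chi-square test treats cell counts as approximately independent, which is only an approximation for without-replacement draws.

```python
# Minimal sketch of a frequency audit for a lotto k/N game.
import numpy as np
from scipy import stats


def audit_draws(draws, N):
    """draws: iterable of draws, each a sequence of k distinct numbers in 1..N."""
    draws = [np.asarray(d) for d in draws]
    n, k = len(draws), len(draws[0])
    counts = np.zeros(N)
    for d in draws:
        counts[d - 1] += 1                   # how often each number appeared overall
    expected = np.full(N, n * k / N)         # expected count: probability k/N per draw
    chi2, p_value = stats.chisquare(counts, expected)
    return chi2, p_value


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    N, k = 49, 6
    history = [rng.choice(np.arange(1, N + 1), size=k, replace=False) for _ in range(500)]
    print(audit_draws(history, N))           # a large p-value is expected for fair draws
```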

  12. MendelianRandomization: an R package for performing Mendelian randomization analyses using summarized data.

    Science.gov (United States)

    Yavorska, Olena O; Burgess, Stephen

    2017-12-01

    MendelianRandomization is a software package for the R open-source software environment that performs Mendelian randomization analyses using summarized data. The core functionality is to implement the inverse-variance weighted, MR-Egger and weighted median methods for multiple genetic variants. Several options are available to the user, such as the use of robust regression, fixed- or random-effects models and the penalization of weights for genetic variants with heterogeneous causal estimates. Extensions to these methods, such as allowing for variants to be correlated, can be chosen if appropriate. Graphical commands allow summarized data to be displayed in an interactive graph, or the plotting of causal estimates from multiple methods, for comparison. Although the main method of data entry is directly by the user, there is also an option for allowing summarized data to be incorporated from the PhenoScanner database of genotype-phenotype associations. We hope to develop this feature in future versions of the package. The R software environment is available for download from [https://www.r-project.org/]. The MendelianRandomization package can be downloaded from the Comprehensive R Archive Network (CRAN) within R, or directly from [https://cran.r-project.org/web/packages/MendelianRandomization/]. Both R and the MendelianRandomization package are released under GNU General Public Licenses (GPL-2|GPL-3). © The Author 2017. Published by Oxford University Press on behalf of the International Epidemiological Association.
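
    For illustration only (this is not part of the MendelianRandomization R package), the fixed-effect inverse-variance weighted estimate from summarized data can be computed as in the sketch below, assuming uncorrelated variants with exposure associations bx, outcome associations by, and outcome standard errors by_se.

```python
# Minimal sketch of the fixed-effect inverse-variance weighted (IVW) MR estimate.
import numpy as np


def ivw_estimate(bx, by, by_se):
    """Return the IVW causal estimate and its standard error."""
    bx, by, by_se = map(np.asarray, (bx, by, by_se))
    weights = bx**2 / by_se**2               # inverse-variance weights on the ratio estimates
    ratio = by / bx                          # per-variant Wald ratio estimates
    beta = np.sum(weights * ratio) / np.sum(weights)
    se = np.sqrt(1.0 / np.sum(weights))
    return beta, se


if __name__ == "__main__":
    bx = [0.10, 0.08, 0.12, 0.05]            # toy variant-exposure associations
    by = [0.020, 0.018, 0.026, 0.009]        # toy variant-outcome associations
    by_se = [0.005, 0.006, 0.005, 0.004]     # toy outcome standard errors
    print(ivw_estimate(bx, by, by_se))
```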

  13. Feature Selection via Chaotic Antlion Optimization.

    Directory of Open Access Journals (Sweden)

    Hossam M Zawbaa

    Full Text Available Selecting a subset of relevant properties from a large set of features that describe a dataset is a challenging machine learning task. In biology, for instance, the advances in the available technologies enable the generation of a very large number of biomarkers that describe the data. Choosing the more informative markers along with performing a high-accuracy classification over the data can be a daunting task, particularly if the data are high dimensional. An often adopted approach is to formulate the feature selection problem as a biobjective optimization problem, with the aim of maximizing the performance of the data analysis model (the quality of the fit to the training data) while minimizing the number of features used. We propose an optimization approach for the feature selection problem that considers a "chaotic" version of the antlion optimizer method, a nature-inspired algorithm that mimics the hunting mechanism of antlions in nature. The balance between exploration of the search space and exploitation of the best solutions is a challenge in multi-objective optimization. The exploration/exploitation rate is controlled by the parameter I that limits the random walk range of the ants/prey. This variable is increased iteratively in a quasi-linear manner to decrease the exploration rate as the optimization progresses. This quasi-linear schedule may lead to premature convergence in some cases and to trapping in local minima in others. The chaotic system proposed here attempts to improve the tradeoff between exploration and exploitation. The methodology is evaluated using different chaotic maps on a number of feature selection datasets. To ensure generality, we used ten biological datasets, but we also used other types of data from various sources. The results are compared with the particle swarm optimizer and with genetic algorithm variants for feature selection using a set of quality metrics.
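
    As a sketch of how a chaotic map can replace a quasi-linear schedule (an assumption about one possible implementation, not the authors' exact scheme), the snippet below perturbs the shrink ratio I with a logistic chaotic map so that the random-walk range of the ants does not collapse monotonically.

        import numpy as np

        def logistic_map(n, x0=0.7, r=4.0):
            """Chaotic logistic map sequence in (0, 1)."""
            xs, x = np.empty(n), x0
            for t in range(n):
                x = r * x * (1.0 - x)
                xs[t] = x
            return xs

        T = 100                                           # total iterations
        chaos = logistic_map(T)
        for t in range(1, T + 1):
            I_linear = 1.0 + 10.0 * t / T                 # quasi-linear shrink ratio
            I_chaotic = I_linear * (0.5 + chaos[t - 1])   # chaotic modulation keeps some exploration
            # The bounds of the ants' random walk around an antlion would be divided by I_chaotic
            # instead of I_linear, e.g. lb_t, ub_t = lb / I_chaotic, ub / I_chaotic.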

  14. Random and non-random mating populations: Evolutionary dynamics in meiotic drive.

    Science.gov (United States)

    Sarkar, Bijan

    2016-01-01

    Game theoretic tools are utilized to analyze a one-locus continuous selection model of sex-specific meiotic drive by considering nonequivalence of the viabilities of reciprocal heterozygotes that might be noticed at an imprinted locus. The model draws attention to the role of viability selections of different types to examine the stable nature of polymorphic equilibrium. A bridge between population genetics and evolutionary game theory has been built by applying the concept of the Fundamental Theorem of Natural Selection. In addition to pointing out the influences of male and female segregation ratios on selection, the configuration structure reveals some notable results, e.g., that Hardy-Weinberg frequencies hold in replicator dynamics, that evolution proceeds fastest when the variance in fitness is maximized, that mixed Evolutionarily Stable Strategies (ESS) exist in asymmetric games, and that evolution tends to follow not only a 1:1 sex ratio but also a 1:1 ratio of different alleles at a particular gene locus. Through the construction of replicator dynamics in the group selection framework, our selection model redefines the basis of game theory to incorporate non-random mating, where a mating parameter associated with population structure depends on the social structure. The model also shows that the number of polymorphic equilibria will depend on the algebraic expression of the population structure. Copyright © 2015 Elsevier Inc. All rights reserved.
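
    For readers unfamiliar with replicator dynamics, the short Python sketch below iterates a generic two-strategy replicator equation with an illustrative payoff matrix that has a mixed interior equilibrium; it does not reproduce the paper's sex-specific drive model or its non-random mating parameter.

        import numpy as np

        # Illustrative payoff (fitness) matrix with a mixed interior equilibrium at (0.5, 0.5).
        A = np.array([[1.0, 4.0],
                      [2.0, 3.0]])

        x = np.array([0.9, 0.1])            # initial frequencies of the two types
        for _ in range(500):
            fitness = A @ x                 # fitness of each type against the current mix
            mean_fitness = x @ fitness
            x = x * fitness / mean_fitness  # discrete-time replicator update
        print("equilibrium frequencies:", np.round(x, 3))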

  15. Random tensors

    CERN Document Server

    Gurau, Razvan

    2017-01-01

    Written by the creator of the modern theory of random tensors, this book is the first self-contained introductory text to this rapidly developing theory. Starting from notions familiar to the average researcher or PhD student in mathematical or theoretical physics, the book presents the theory and its applications to physics in detail. The recent detections of the Higgs boson at the LHC and of gravitational waves at LIGO mark new milestones in Physics, confirming long-standing predictions of Quantum Field Theory and General Relativity. These two experimental results only reinforce the need to find an underlying framework common to the two: the elusive theory of Quantum Gravity. Over the past thirty years, several alternatives have been proposed as theories of Quantum Gravity, chief among them String Theory. While these theories are yet to be tested experimentally, key lessons have already been learned. Whatever the theory of Quantum Gravity may be, it must incorporate random geometry in one form or another....

  16. A DYNAMIC FEATURE SELECTION METHOD FOR DOCUMENT RANKING WITH RELEVANCE FEEDBACK APPROACH

    Directory of Open Access Journals (Sweden)

    K. Latha

    2010-07-01

    Full Text Available Ranking search results is essential for information retrieval and Web search. Search engines need not only to return highly relevant results, but also to be fast to satisfy users. As a result, not all available features can be used for ranking; in fact, only a small percentage of these features can be used. Thus, it is crucial to have a feature selection mechanism that can find a subset of features that both meets latency requirements and achieves high relevance. In this paper we describe a 0/1 knapsack procedure for automatically selecting features to use within a Generalization model for Document Ranking. We propose an approach for Relevance Feedback using the Expectation Maximization method and evaluate the algorithm on the TREC collection for describing classes of feedback textual information retrieval features. Experimental results, evaluated on the standard TREC-9 part of the OHSUMED collection, show that our feature selection algorithm produces models that are either significantly more effective than, or equally effective as, models such as the Markov Random Field model, the Correlation Coefficient method, and the Count Difference method.
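
    A minimal sketch of the knapsack step (my own illustration; the gains, costs, and budget are made-up numbers, and the paper's relevance-feedback machinery is not reproduced) treats each candidate ranking feature as an item with a relevance gain and a latency cost, and selects the subset with maximum total gain under a latency budget.

        def knapsack_select(gains, costs, budget):
            n = len(gains)
            # dp[c] = (best total gain, chosen feature set) achievable with total cost <= c
            dp = [(0.0, frozenset()) for _ in range(budget + 1)]
            for i in range(n):
                for c in range(budget, costs[i] - 1, -1):     # classic 0/1 knapsack recursion
                    cand_gain = dp[c - costs[i]][0] + gains[i]
                    if cand_gain > dp[c][0]:
                        dp[c] = (cand_gain, dp[c - costs[i]][1] | {i})
            return dp[budget]

        gains = [0.31, 0.22, 0.18, 0.12, 0.09]    # per-feature contribution to ranking quality
        costs = [5, 3, 4, 2, 1]                   # per-feature scoring latency (ms)
        budget = 8                                # latency budget (ms)
        best_gain, chosen = knapsack_select(gains, costs, budget)
        print("selected features:", sorted(chosen), "gain:", round(best_gain, 2))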

  17. Not accounting for interindividual variability can mask habitat selection patterns: a case study on black bears.

    Science.gov (United States)

    Lesmerises, Rémi; St-Laurent, Martin-Hugues

    2017-11-01

    Habitat selection studies conducted at the population scale commonly aim to describe general patterns that could improve our understanding of the limiting factors in species-habitat relationships. Researchers often consider interindividual variation in selection patterns to control for its effects and avoid pseudoreplication by using mixed-effect models that include individuals as random factors. Here, we highlight common pitfalls and possible misinterpretations of this strategy by describing habitat selection of 21 black bears Ursus americanus. We used Bayesian mixed-effect models and compared results obtained when using random intercept (i.e., population level) versus calculating individual coefficients for each independent variable (i.e., individual level). We then related interindividual variability to individual characteristics (i.e., age, sex, reproductive status, body condition) in a multivariate analysis. The assumption of comparable behavior among individuals was verified only in 40% of the cases in our seasonal best models. Indeed, we found strong and opposite responses among sampled bears and individual coefficients were linked to individual characteristics. For some covariates, contrasted responses canceled each other out at the population level. In other cases, interindividual variability was concealed by the composition of our sample, with the majority of the bears (e.g., old individuals and bears in good physical condition) driving the population response (e.g., selection of young forest cuts). Our results stress the need to consider interindividual variability to avoid misinterpretation and uninformative results, especially for a flexible and opportunistic species. This study helps to identify some ecological drivers of interindividual variability in bear habitat selection patterns.
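
    The masking effect described above is easy to reproduce in simulation. In the Python sketch below (illustrative, not the bears data or the authors' Bayesian models), half of the individuals select a covariate positively and half avoid it; a pooled fit returns a coefficient near zero, while individual-level fits recover the two opposite responses.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(2)
        n_ind, n_obs = 10, 400
        betas = np.where(np.arange(n_ind) < n_ind // 2, 1.5, -1.5)   # opposite selection coefficients

        X, y, ind = [], [], []
        for i, b in enumerate(betas):
            x = rng.normal(0.0, 1.0, n_obs)              # covariate at used/available locations
            p = 1.0 / (1.0 + np.exp(-b * x))             # individual-specific use probability
            X.append(x); y.append(rng.binomial(1, p)); ind.append(np.full(n_obs, i))
        X, y, ind = np.concatenate(X), np.concatenate(y), np.concatenate(ind)

        pooled = LogisticRegression(C=1e6).fit(X.reshape(-1, 1), y)
        print("pooled coefficient (masked):", round(pooled.coef_[0, 0], 2))

        per_ind = [LogisticRegression(C=1e6)
                   .fit(X[ind == i].reshape(-1, 1), y[ind == i]).coef_[0, 0]
                   for i in range(n_ind)]
        print("individual coefficients:", np.round(per_ind, 2))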

  18. Quality of recovery from anesthesia of patients undergoing balanced or total intravenous general anesthesia. Prospective randomized clinical trial.

    Science.gov (United States)

    Moro, Eduardo Toshiyuki; Leme, Fábio Caetano Oliveira; Noronha, Bernardo Roveda; Saraiva, Gustavo Farinha Pinto; de Matos Leite, Nathália Vianna; Navarro, Laís Helena Camacho

    2016-12-01

    The aim of the present study was to assess the quality of recovery from anesthesia of patients subjected to otorhinolaryngological (ORL) surgery under balanced or total intravenous general anesthesia by means of the Quality of Recovery-40 (QoR-40) questionnaire. The study was a prospective randomized clinical trial conducted in an operating room, a postoperative recovery area, and a hospital ward. One hundred thirty American Society of Anesthesiologists physical status I or II patients were scheduled to undergo general anesthesia for ORL interventions under remifentanil, in combination with either sevoflurane (balanced technique) or propofol (total intravenous anesthesia). The occurrence of nausea, vomiting, body temperature less than 36°C, and length of stay in the postanesthesia care unit were recorded. The QoR-40 was administered 24 hours after surgery by an investigator blind to group allocation. The quality of recovery, as assessed by the score on the QoR-40, was compared between the groups. There was no difference in the QoR-40 score between the intravenous and inhalation anesthesia groups (190.5 vs 189.5, respectively; P=.33). Similarly, the scores on the 5 dimensions of the QoR-40 were comparable between the groups. The incidence of hypothermia (P=.58), nausea or vomiting (P=.39), and the length of surgery (P=.16) were similar between groups. Pain intensity (P=.80) and the dose of morphine used in the postanesthesia care unit (P=.4) were also comparable between groups. Based on the QoR-40 assessment, the quality of recovery from anesthesia as perceived by the patients did not differ between those subjected to inhalation and those subjected to intravenous general anesthesia for ORL surgery. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Stochastic perturbations in open chaotic systems: random versus noisy maps.

    Science.gov (United States)

    Bódai, Tamás; Altmann, Eduardo G; Endler, Antonio

    2013-04-01

    We investigate the effects of random perturbations on fully chaotic open systems. Perturbations can be applied to each trajectory independently (white noise) or simultaneously to all trajectories (random map). We compare these two scenarios by generalizing the theory of open chaotic systems and introducing a time-dependent conditionally-map-invariant measure. For the same perturbation strength we show that the escape rate of the random map is always larger than that of the noisy map. In random maps we show that the escape rate κ and dimensions D of the relevant fractal sets often depend nonmonotonically on the intensity of the random perturbation. We discuss the accuracy (bias) and precision (variance) of finite-size estimators of κ and D, and show that the improvement of the precision of the estimations with the number of trajectories N is extremely slow (∝ 1/ln N). We also argue that the finite-size D estimators are typically biased. General theoretical results are combined with analytical calculations and numerical simulations in area-preserving baker maps.
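
    The two perturbation scenarios are easy to mimic numerically. The Python sketch below (illustrative parameters and an open tent map rather than the paper's baker maps) applies additive perturbations either independently per trajectory ("noisy map") or as a single perturbation shared by all trajectories at each step ("random map"), and estimates the escape rate from the decay of the surviving ensemble.

        import numpy as np

        rng = np.random.default_rng(3)
        a, eps, n_traj, n_steps = 3.0, 0.05, 200_000, 12   # open tent map (slope 3), noise strength eps

        def escape_rate(shared_noise, n_real=20):
            rates = []
            for _ in range(n_real):
                x = rng.uniform(0.0, 1.0, n_traj)
                for _ in range(n_steps):
                    if shared_noise:
                        xi = rng.uniform(-eps, eps)            # one perturbation for all trajectories
                    else:
                        xi = rng.uniform(-eps, eps, x.size)    # independent perturbation per trajectory
                    x = a * np.minimum(x, 1.0 - x) + xi
                    x = x[(x >= 0.0) & (x <= 1.0)]             # trajectories leaving [0, 1] escape
                rates.append(-np.log(x.size / n_traj) / n_steps)
            return float(np.mean(rates))

        print("noisy map  escape rate:", round(escape_rate(shared_noise=False, n_real=5), 3))
        print("random map escape rate:", round(escape_rate(shared_noise=True), 3))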

  20. The end-state comfort effect in bimanual grip selection.

    Science.gov (United States)

    Fischman, Mark G; Stodden, David F; Lehman, Davana M

    2003-03-01

    During a unimanual grip selection task in which people pick up a lightweight dowel and place one end against targets at variable heights, the choice of hand grip (overhand vs. underhand) typically depends on the perception of how comfortable the arm will be at the end of the movement: an end-state comfort effect. The two experiments reported here extend this work to bimanual tasks. In each experiment, 26 right-handed participants used their left and right hands to simultaneously pick up two wooden dowels and place either the right or left end against a series of 14 targets ranging from 14 to 210 cm above the floor. These tasks were performed in systematic ascending and descending orders in Experiment 1 and in random order in Experiment 2. Results were generally consistent with predictions of end-state comfort in that, for the highest and lowest targets, participants tended to select opposite grips with each hand. Taken together, our findings are consistent with the concept of constraint hierarchies within a posture-based motion-planning model.