WorldWideScience

Sample records for randomly selected case

  1. Field-based random sampling without a sampling frame: control selection for a case-control study in rural Africa.

    Science.gov (United States)

    Crampin, A C; Mwinuka, V; Malema, S S; Glynn, J R; Fine, P E

    2001-01-01

    Selection bias, particularly of controls, is common in case-control studies and may materially affect the results. Methods of control selection should be tailored both for the risk factors and disease under investigation and for the population being studied. We present here a control selection method devised for a case-control study of tuberculosis in rural Africa (Karonga, northern Malawi) that selects an age/sex frequency-matched random sample of the population, with a geographical distribution in proportion to the population density. We also present an audit of the selection process, and discuss the potential of this method in other settings.

  2. Selectivity and sparseness in randomly connected balanced networks.

    Directory of Open Access Journals (Sweden)

    Cengiz Pehlevan

    Full Text Available Neurons in sensory cortex show stimulus selectivity and sparse population response, even in cases where no strong functionally specific structure in connectivity can be detected. This raises the question of whether selectivity and sparseness can be generated and maintained in randomly connected networks. We consider a recurrent network of excitatory and inhibitory spiking neurons with random connectivity, driven by random projections from an input layer of stimulus selective neurons. In this architecture, the stimulus-to-stimulus and neuron-to-neuron modulation of total synaptic input is weak compared to the mean input. Surprisingly, we show that in the balanced state the network can still support high stimulus selectivity and sparse population response. In the balanced state, strong synapses amplify the variation in synaptic input and recurrent inhibition cancels the mean. Functional specificity in connectivity emerges due to the inhomogeneity caused by the generative statistical rule used to build the network. We further elucidate the mechanism behind this selectivity and evaluate the effects of model parameters on population sparseness and stimulus selectivity. Network response to mixtures of stimuli is investigated. It is shown that a balanced state with unselective inhibition can be achieved with densely connected input to the inhibitory population. Balanced networks exhibit the "paradoxical" effect: an increase in excitatory drive to inhibition leads to a decreased inhibitory population firing rate. We compare and contrast selectivity and sparseness generated by the balanced network to those of randomly connected unbalanced networks. Finally, we discuss our results in light of experiments.

  3. Blocked Randomization with Randomly Selected Block Sizes

    Directory of Open Access Journals (Sweden)

    Jimmy Efird

    2010-12-01

    Full Text Available When planning a randomized clinical trial, careful consideration must be given to how participants are selected for various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has equal likelihood of being assigned to treatment versus referent groups. However, by chance an unequal number of individuals may be assigned to each arm of the study and thus decrease the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Yet still, the allocation process may be predictable, for example, when the investigator is not blind and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes.
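
    As a rough illustration of the approach described above, the following Python sketch generates an allocation sequence with randomly chosen block sizes; the function name and the example block sizes are our own choices, not taken from the paper.

    ```python
    import random

    def blocked_randomization(n_participants, arms=("treatment", "control"),
                              block_sizes=(2, 4, 6), seed=None):
        """Generate an allocation sequence using randomly chosen block sizes.

        Within each block the arms appear an equal number of times and their
        order is shuffled, so group sizes stay balanced while the block length
        (and hence the next assignment) stays unpredictable to the investigator.
        """
        rng = random.Random(seed)
        sequence = []
        while len(sequence) < n_participants:
            size = rng.choice(block_sizes)            # random block size
            block = list(arms) * (size // len(arms))  # block size must be a multiple of the number of arms
            rng.shuffle(block)
            sequence.extend(block)
        return sequence[:n_participants]

    print(blocked_randomization(10, seed=42))
    ```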

  4. Application of random effects to the study of resource selection by animals.

    Science.gov (United States)

    Gillies, Cameron S; Hebblewhite, Mark; Nielsen, Scott E; Krawchuk, Meg A; Aldridge, Cameron L; Frair, Jacqueline L; Saher, D Joanne; Stevens, Cameron E; Jerde, Christopher L

    2006-07-01

    1. Resource selection estimated by logistic regression is used increasingly in studies to identify critical resources for animal populations and to predict species occurrence. 2. Most frequently, individual animals are monitored and pooled to estimate population-level effects without regard to group or individual-level variation. Pooling assumes that both observations and their errors are independent, and that resource selection is constant given individual variation in resource availability. 3. Although researchers have identified ways to minimize autocorrelation, variation between individuals caused by differences in selection or available resources, including functional responses in resource selection, has not been well addressed. 4. Here we review random-effects models and their application to resource selection modelling to overcome these common limitations. We present a simple case study of an analysis of resource selection by grizzly bears in the foothills of the Canadian Rocky Mountains with and without random effects. 5. Both categorical and continuous variables in the grizzly bear model differed in interpretation, both in statistical significance and coefficient sign, depending on how a random effect was included. We used a simulation approach to clarify the application of random effects under three common situations for telemetry studies: (a) discrepancies in sample sizes among individuals; (b) differences among individuals in selection where availability is constant; and (c) differences in availability with and without a functional response in resource selection. 6. We found that random intercepts accounted for unbalanced sample designs, and models with random intercepts and coefficients improved model fit given the variation in selection among individuals and functional responses in selection. Our empirical example and simulations demonstrate how including random effects in resource selection models can aid interpretation and address difficult assumptions.

  5. 47 CFR 1.1602 - Designation for random selection.

    Science.gov (United States)

    2010-10-01

    Title 47, Telecommunication, Volume 1 (2010-10-01): Designation for random selection. Section 1.1602, FEDERAL COMMUNICATIONS COMMISSION, GENERAL PRACTICE AND PROCEDURE, Random Selection Procedures for Mass Media Services, General Procedures. § 1.1602 Designation for random selection...

  6. 47 CFR 1.1603 - Conduct of random selection.

    Science.gov (United States)

    2010-10-01

    Title 47, Telecommunication, Volume 1 (2010-10-01): Conduct of random selection. Section 1.1603, FEDERAL COMMUNICATIONS COMMISSION, GENERAL PRACTICE AND PROCEDURE, Random Selection Procedures for Mass Media Services, General Procedures. § 1.1603 Conduct of random selection. The...

  7. Selection of examples in case-based computer-aided decision systems

    International Nuclear Information System (INIS)

    Mazurowski, Maciej A; Zurada, Jacek M; Tourassi, Georgia D

    2008-01-01

    Case-based computer-aided decision (CB-CAD) systems rely on a database of previously stored, known examples when classifying new, incoming queries. Such systems can be particularly useful since they do not need retraining every time a new example is deposited in the case base. The adaptive nature of case-based systems is well suited to the current trend of continuously expanding digital databases in the medical domain. To maintain efficiency, however, such systems need sophisticated strategies to effectively manage the available evidence database. In this paper, we discuss the general problem of building an evidence database by selecting the most useful examples to store while satisfying existing storage requirements. We evaluate three intelligent techniques for this purpose: genetic algorithm-based selection, greedy selection and random mutation hill climbing. These techniques are compared to a random selection strategy used as the baseline. The study is performed with a previously presented CB-CAD system applied for false positive reduction in screening mammograms. The experimental evaluation shows that when the development goal is to maximize the system's diagnostic performance, the intelligent techniques are able to reduce the size of the evidence database to 37% of the original database by eliminating superfluous and/or detrimental examples while at the same time significantly improving the CAD system's performance. Furthermore, if the case-base size is a main concern, the total number of examples stored in the system can be reduced to only 2-4% of the original database without a decrease in the diagnostic performance. Comparison of the techniques shows that random mutation hill climbing provides the best balance between the diagnostic performance and computational efficiency when building the evidence database of the CB-CAD system.
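
    A minimal sketch of random mutation hill climbing over a fixed-size case subset, the strategy that this study found to give the best balance of performance and cost, might look as follows. The function and the `evaluate` callback are hypothetical stand-ins (for example, `evaluate` could return the CAD system's validation performance when restricted to the selected cases); this is not the authors' implementation.

    ```python
    import random

    def rmhc_select(cases, evaluate, k, iterations=1000, seed=0):
        """Random mutation hill climbing over fixed-size subsets of a case base.

        `evaluate(subset)` must return a performance score (higher is better).
        Each step swaps one stored case for one excluded case and keeps the
        swap only if the score does not decrease.
        """
        rng = random.Random(seed)
        selected = rng.sample(cases, k)
        rest = [c for c in cases if c not in selected]
        best_score = evaluate(selected)
        for _ in range(iterations):
            i, j = rng.randrange(k), rng.randrange(len(rest))
            selected[i], rest[j] = rest[j], selected[i]      # mutate: swap one case
            score = evaluate(selected)
            if score >= best_score:
                best_score = score                           # keep the mutation
            else:
                selected[i], rest[j] = rest[j], selected[i]  # revert
        return selected, best_score

    # Toy usage: keep the 10 numbers whose sum is largest.
    subset, score = rmhc_select(list(range(100)), evaluate=sum, k=10, iterations=500)
    ```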

  8. Random drift versus selection in academic vocabulary: an evolutionary analysis of published keywords.

    Science.gov (United States)

    Bentley, R Alexander

    2008-08-27

    The evolution of vocabulary in academic publishing is characterized via keyword frequencies recorded in the ISI Web of Science citations database. In four distinct case-studies, evolutionary analysis of keyword frequency change through time is compared to a model of random copying used as the null hypothesis, such that selection may be identified against it. The case studies from the physical sciences indicate greater selection in keyword choice than in the social sciences. Similar evolutionary analyses can be applied to a wide range of phenomena; wherever the popularity of multiple items through time has been recorded, as with web searches, or sales of popular music and books, for example.
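
    A toy version of the random-copying null model can be simulated in a few lines; the function below is a generic neutral (drift-only) model with an assumed innovation rate `mu`, written for illustration rather than taken from the paper.

    ```python
    import numpy as np

    def random_copying(initial_counts, generations, mu=0.01, seed=0):
        """Neutral random-copying (drift) model of keyword frequencies.

        Each generation n new keyword uses are drawn: with probability 1-mu a
        use copies an existing keyword in proportion to its current frequency,
        and with probability mu it is an innovation (a brand-new keyword).
        Frequency change under this model arises from sampling alone, which is
        the null hypothesis against which selection on keywords is tested.
        """
        rng = np.random.default_rng(seed)
        counts = list(initial_counts)
        n = sum(counts)
        history = [list(counts)]
        for _ in range(generations):
            probs = np.array(counts, dtype=float) / sum(counts) * (1.0 - mu)
            draw = rng.multinomial(n, np.append(probs, mu))
            counts = [c for c in draw[:-1].tolist() if c > 0]  # drop extinct keywords
            counts += [1] * int(draw[-1])                      # each innovation starts at 1
            history.append(list(counts))
        return history

    # Example: 20 keywords, 200 uses per generation, tracked for 50 generations.
    trajectory = random_copying([10] * 20, generations=50)
    ```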

  9. Random drift versus selection in academic vocabulary: an evolutionary analysis of published keywords.

    Directory of Open Access Journals (Sweden)

    R Alexander Bentley

    Full Text Available The evolution of vocabulary in academic publishing is characterized via keyword frequencies recorded in the ISI Web of Science citations database. In four distinct case-studies, evolutionary analysis of keyword frequency change through time is compared to a model of random copying used as the null hypothesis, such that selection may be identified against it. The case studies from the physical sciences indicate greater selection in keyword choice than in the social sciences. Similar evolutionary analyses can be applied to a wide range of phenomena; wherever the popularity of multiple items through time has been recorded, as with web searches, or sales of popular music and books, for example.

  10. Testing, Selection, and Implementation of Random Number Generators

    National Research Council Canada - National Science Library

    Collins, Joseph C

    2008-01-01

    An exhaustive evaluation of state-of-the-art random number generators with several well-known suites of tests provides the basis for selection of suitable random number generators for use in stochastic simulations...

  11. Local randomization in neighbor selection improves PRM roadmap quality

    KAUST Repository

    McMahon, Troy; Jacobs, Sam; Boyd, Bryan; Tapia, Lydia; Amato, Nancy M.

    2012-01-01

    Probabilistic Roadmap Methods (PRMs) are one of the most used classes of motion planning methods. These sampling-based methods generate robot configurations (nodes) and then connect them to form a graph (roadmap) containing representative feasible pathways. A key step in PRM roadmap construction involves identifying a set of candidate neighbors for each node. Traditionally, these candidates are chosen to be the k-closest nodes based on a given distance metric. In this paper, we propose a new neighbor selection policy called LocalRand(k,K'), that first computes the K' closest nodes to a specified node and then selects k of those nodes at random. Intuitively, LocalRand attempts to benefit from random sampling while maintaining the higher levels of local planner success inherent to selecting more local neighbors. We provide a methodology for selecting the parameters k and K'. We perform an experimental comparison which shows that for both rigid and articulated robots, LocalRand results in roadmaps that are better connected than the traditional k-closest policy or a purely random neighbor selection policy. The cost required to achieve these results is shown to be comparable to k-closest. © 2012 IEEE.
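
    A compact Python sketch of the LocalRand(k, K') policy, assuming Euclidean distance as the metric and using our own function names, could be:

    ```python
    import random
    import numpy as np

    def local_rand_neighbors(nodes, query_idx, k, k_prime, rng=None):
        """LocalRand(k, K') candidate-neighbor selection for PRM construction.

        Compute the K' nodes closest to the query configuration (Euclidean
        distance here, standing in for an arbitrary metric), then return a
        random subset of k of them as connection candidates.
        """
        rng = rng or random.Random()
        dists = np.linalg.norm(nodes - nodes[query_idx], axis=1)
        order = np.argsort(dists)
        closest = [i for i in order if i != query_idx][:k_prime]
        return rng.sample(closest, min(k, len(closest)))

    # Usage: pick 5 connection candidates at random from the 15 nearest nodes.
    nodes = np.random.rand(200, 3)          # 200 sampled configurations in 3-D
    candidates = local_rand_neighbors(nodes, query_idx=0, k=5, k_prime=15)
    ```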

  12. Local randomization in neighbor selection improves PRM roadmap quality

    KAUST Repository

    McMahon, Troy

    2012-10-01

    Probabilistic Roadmap Methods (PRMs) are one of the most used classes of motion planning methods. These sampling-based methods generate robot configurations (nodes) and then connect them to form a graph (roadmap) containing representative feasible pathways. A key step in PRM roadmap construction involves identifying a set of candidate neighbors for each node. Traditionally, these candidates are chosen to be the k-closest nodes based on a given distance metric. In this paper, we propose a new neighbor selection policy called LocalRand(k,K'), that first computes the K' closest nodes to a specified node and then selects k of those nodes at random. Intuitively, LocalRand attempts to benefit from random sampling while maintaining the higher levels of local planner success inherent to selecting more local neighbors. We provide a methodology for selecting the parameters k and K'. We perform an experimental comparison which shows that for both rigid and articulated robots, LocalRand results in roadmaps that are better connected than the traditional k-closest policy or a purely random neighbor selection policy. The cost required to achieve these results is shown to be comparable to k-closest. © 2012 IEEE.

  13. Joint random beam and spectrum selection for spectrum sharing systems with partial channel state information

    KAUST Repository

    Abdallah, Mohamed M.

    2013-11-01

    In this work, we develop joint interference-aware random beam and spectrum selection schemes that provide enhanced performance for the secondary network under the condition that the interference observed at the primary receiver is below a predetermined acceptable value. We consider a secondary link composed of a transmitter equipped with multiple antennas and a single-antenna receiver sharing the same spectrum with a set of primary links, each composed of a single-antenna transmitter and a single-antenna receiver. The proposed schemes jointly select a beam, among a set of power-optimized random beams, as well as the primary spectrum that maximizes the signal-to-interference-plus-noise ratio (SINR) of the secondary link while satisfying the primary interference constraint. In particular, we consider the case where the interference level is described by a q-bit description of its magnitude, whereby we propose a technique to find the optimal quantizer thresholds in a mean square error (MSE) sense. © 2013 IEEE.
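
    The core decision rule, selecting the beam/spectrum pair with the best secondary SINR among pairs that respect the interference threshold, can be sketched as below; the power optimization and the q-bit quantized feedback of the paper are not modeled, and all names are illustrative.

    ```python
    import numpy as np

    def select_beam_and_spectrum(sinr, interference, i_max):
        """Pick the (beam, spectrum) pair with the largest secondary SINR among
        pairs whose interference at the primary receiver is acceptable.

        `sinr` and `interference` are (n_beams, n_spectra) arrays holding the
        secondary-link SINR and the interference power caused at the primary
        receiver of each spectrum; `i_max` is the interference threshold.
        Returns (beam index, spectrum index), or None if no pair is feasible.
        """
        feasible = interference <= i_max
        if not feasible.any():
            return None
        masked = np.where(feasible, sinr, -np.inf)
        return np.unravel_index(np.argmax(masked), sinr.shape)

    # Toy example with Rayleigh-faded channels (exponential power gains).
    rng = np.random.default_rng(1)
    n_beams, n_spectra, noise = 4, 3, 1e-2
    sinr = rng.exponential(1.0, (n_beams, n_spectra)) / noise
    interference = rng.exponential(0.1, (n_beams, n_spectra))
    print(select_beam_and_spectrum(sinr, interference, i_max=0.2))
    ```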

  14. Joint random beam and spectrum selection for spectrum sharing systems with partial channel state information

    KAUST Repository

    Abdallah, Mohamed M.; Sayed, Mostafa M.; Alouini, Mohamed-Slim; Qaraqe, Khalid A.

    2013-01-01

    In this work, we develop joint interference-aware random beam and spectrum selection schemes that provide enhanced performance for the secondary network under the condition that the interference observed at the primary receiver is below a predetermined acceptable value. We consider a secondary link composed of a transmitter equipped with multiple antennas and a single-antenna receiver sharing the same spectrum with a set of primary links, each composed of a single-antenna transmitter and a single-antenna receiver. The proposed schemes jointly select a beam, among a set of power-optimized random beams, as well as the primary spectrum that maximizes the signal-to-interference-plus-noise ratio (SINR) of the secondary link while satisfying the primary interference constraint. In particular, we consider the case where the interference level is described by a q-bit description of its magnitude, whereby we propose a technique to find the optimal quantizer thresholds in a mean square error (MSE) sense. © 2013 IEEE.

  15. A Method to Select Software Test Cases in Consideration of Past Input Sequence

    International Nuclear Information System (INIS)

    Kim, Hee Eun; Kim, Bo Gyung; Kang, Hyun Gook

    2015-01-01

    In the Korea Nuclear I and C Systems (KNICS) project, the software for the fully digitalized reactor protection system (RPS) was developed under a strict procedure. Even though the behavior of the software is deterministic, the randomness of the input sequence produces probabilistic behavior of the software. A software failure occurs when certain inputs interact with the internal state of the digital system to trigger a fault that was introduced into the software during its lifecycle. In this paper, a method to select a test set for software failure probability estimation is suggested; the test set reflects the past input sequence of the software and covers all possible cases. To obtain the profile of paired state variables, the relationships between the variables need to be considered, and the effect of input from the human operator also has to be considered. As an example, the test set of the PZR-PR-Lo-Trip logic was examined. This method provides a framework for selecting test cases of safety-critical software

  16. Reducing selection bias in case-control studies from rare disease registries.

    Science.gov (United States)

    Cole, J Alexander; Taylor, John S; Hangartner, Thomas N; Weinreb, Neal J; Mistry, Pramod K; Khan, Aneal

    2011-09-12

    In clinical research of rare diseases, where small patient numbers and disease heterogeneity limit study design options, registries are a valuable resource for demographic and outcome information. However, in contrast to prospective, randomized clinical trials, the observational design of registries is prone to introduce selection bias and negatively impact the validity of data analyses. The objective of the study was to demonstrate the utility of case-control matching and the risk-set method in order to control bias in data from a rare disease registry. Data from the International Collaborative Gaucher Group (ICGG) Gaucher Registry were used as an example. A case-control matching analysis using the risk-set method was conducted to identify two groups of patients with type 1 Gaucher disease in the ICGG Gaucher Registry: patients with avascular osteonecrosis (AVN) and those without AVN. The frequency distributions of gender, decade of birth, treatment status, and splenectomy status were presented for cases and controls before and after matching. Odds ratios (and 95% confidence intervals) were calculated for each variable before and after matching. The application of case-control matching methodology results in cohorts of cases (i.e., patients with AVN) and controls (i.e., patients without AVN) who have comparable distributions for four common parameters used in subject selection: gender, year of birth (age), treatment status, and splenectomy status. Matching resulted in odds ratios of approximately 1.00, indicating no bias. We demonstrated bias in case-control selection in subjects from a prototype rare disease registry and used case-control matching to minimize this bias. Therefore, this approach appears useful to study cohorts of heterogeneous patients in rare disease registries.
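
    A simplified illustration of matching controls to cases on the four variables named above is sketched below; it performs exact matching within strata and omits the time-to-event bookkeeping that full risk-set matching requires, so it is a stand-in rather than the authors' procedure.

    ```python
    import random
    from collections import defaultdict

    def match_controls(cases, candidates, keys, n_controls=1, seed=0):
        """Exact matching of controls to cases on the given keys.

        `cases` and `candidates` are lists of dicts (registry records); `keys`
        name the matching variables, e.g. ("gender", "birth_decade",
        "treatment_status", "splenectomy").  Controls are drawn at random
        without replacement from candidates sharing the case's key values.
        """
        rng = random.Random(seed)
        pool = defaultdict(list)
        for c in candidates:
            pool[tuple(c[k] for k in keys)].append(c)
        matched = []
        for case in cases:
            stratum = pool[tuple(case[k] for k in keys)]
            chosen = rng.sample(stratum, min(n_controls, len(stratum)))
            for c in chosen:
                stratum.remove(c)                 # sample without replacement
            matched.append((case, chosen))
        return matched
    ```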

  17. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Science.gov (United States)

    2010-07-01

    Title 40, Protection of Environment, Volume 30 (2010-07-01): Sample selection by random number generation on any two-dimensional square grid, § 761.79(b)(3), § 761.308. ... area created in accordance with paragraph (a) of this section, select two random numbers: one each for...

  18. Applications of random forest feature selection for fine-scale genetic population assignment.

    Science.gov (United States)

    Sylvester, Emma V A; Bentzen, Paul; Bradbury, Ian R; Clément, Marie; Pearce, Jon; Horne, John; Beiko, Robert G

    2018-02-01

    Genetic population assignment used to inform wildlife management and conservation efforts requires panels of highly informative genetic markers and sensitive assignment tests. We explored the utility of machine-learning algorithms (random forest, regularized random forest and guided regularized random forest) compared with FST ranking for selection of single nucleotide polymorphisms (SNP) for fine-scale population assignment. We applied these methods to an unpublished SNP data set for Atlantic salmon (Salmo salar) and a published SNP data set for Alaskan Chinook salmon (Oncorhynchus tshawytscha). In each species, we identified the minimum panel size required to obtain a self-assignment accuracy of at least 90% using each method to create panels of 50-700 markers. Panels of SNPs identified using random forest-based methods performed up to 7.8 and 11.2 percentage points better than FST-selected panels of similar size for the Atlantic salmon and Chinook salmon data, respectively. Self-assignment accuracy ≥90% was obtained with panels of 670 and 384 SNPs for each data set, respectively, a level of accuracy never reached for these species using FST-selected panels. Our results demonstrate a role for machine-learning approaches in marker selection across large genomic data sets to improve assignment for management and conservation of exploited populations.
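
    A hedged sketch of the general workflow, ranking SNPs by random forest importance and growing panels until self-assignment accuracy reaches 90%, is shown below using scikit-learn; data loading, the FST baseline, and the guided/regularized random forest variants are omitted, and all names are our own.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    def rf_rank_snps(genotypes, populations, n_trees=500, seed=0):
        """Rank SNP columns by random forest importance for population assignment.

        `genotypes` is an (individuals x SNPs) array coded 0/1/2 and
        `populations` holds the population label of each individual.
        """
        rf = RandomForestClassifier(n_estimators=n_trees, random_state=seed)
        rf.fit(genotypes, populations)
        return np.argsort(rf.feature_importances_)[::-1]

    def panel_accuracy(genotypes, populations, panel, seed=0):
        """Cross-validated self-assignment accuracy of a SNP panel."""
        rf = RandomForestClassifier(n_estimators=500, random_state=seed)
        return cross_val_score(rf, genotypes[:, panel], populations, cv=5).mean()

    # Find the smallest ranked panel reaching 90% self-assignment accuracy:
    # ranking = rf_rank_snps(X, y)
    # for size in range(50, 701, 50):
    #     if panel_accuracy(X, y, ranking[:size]) >= 0.90:
    #         print("panel size:", size); break
    ```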

  19. Reducing selection bias in case-control studies from rare disease registries

    Directory of Open Access Journals (Sweden)

    Mistry Pramod K

    2011-09-01

    Full Text Available Abstract Background In clinical research of rare diseases, where small patient numbers and disease heterogeneity limit study design options, registries are a valuable resource for demographic and outcome information. However, in contrast to prospective, randomized clinical trials, the observational design of registries is prone to introduce selection bias and negatively impact the validity of data analyses. The objective of the study was to demonstrate the utility of case-control matching and the risk-set method in order to control bias in data from a rare disease registry. Data from the International Collaborative Gaucher Group (ICGG) Gaucher Registry were used as an example. Methods A case-control matching analysis using the risk-set method was conducted to identify two groups of patients with type 1 Gaucher disease in the ICGG Gaucher Registry: patients with avascular osteonecrosis (AVN) and those without AVN. The frequency distributions of gender, decade of birth, treatment status, and splenectomy status were presented for cases and controls before and after matching. Odds ratios (and 95% confidence intervals) were calculated for each variable before and after matching. Results The application of case-control matching methodology results in cohorts of cases (i.e., patients with AVN) and controls (i.e., patients without AVN) who have comparable distributions for four common parameters used in subject selection: gender, year of birth (age), treatment status, and splenectomy status. Matching resulted in odds ratios of approximately 1.00, indicating no bias. Conclusions We demonstrated bias in case-control selection in subjects from a prototype rare disease registry and used case-control matching to minimize this bias. Therefore, this approach appears useful to study cohorts of heterogeneous patients in rare disease registries.

  20. Genome-wide association data classification and SNPs selection using two-stage quality-based Random Forests.

    Science.gov (United States)

    Nguyen, Thanh-Tung; Huang, Joshua; Wu, Qingyao; Nguyen, Thuy; Li, Mark

    2015-01-01

    Single-nucleotide polymorphisms (SNPs) selection and identification are the most important tasks in Genome-wide association data analysis. The problem is difficult because genome-wide association data is very high dimensional and a large portion of SNPs in the data is irrelevant to the disease. Advanced machine learning methods have been successfully used in Genome-wide association studies (GWAS) for identification of genetic variants that have relatively big effects in some common, complex diseases. Among them, the most successful one is Random Forests (RF). Despite performing well in terms of prediction accuracy on some data sets of moderate size, RF still struggles in GWAS with selecting informative SNPs and building accurate prediction models. In this paper, we propose to use a new two-stage quality-based sampling method in random forests, named ts-RF, for SNP subspace selection for GWAS. The method first applies p-value assessment to find a cut-off point that separates informative and irrelevant SNPs in two groups. The informative SNPs group is further divided into two sub-groups: highly informative and weakly informative SNPs. When sampling the SNP subspace for building trees for the forest, only those SNPs from the two sub-groups are taken into account. The feature subspaces always contain highly informative SNPs when used to split a node in a tree. This approach enables one to generate more accurate trees with a lower prediction error, while possibly avoiding overfitting. It allows one to detect interactions of multiple SNPs with the diseases, and to reduce the dimensionality and the amount of Genome-wide association data needed for learning the RF model. Extensive experiments on two genome-wide SNP data sets (Parkinson case-control data comprised of 408,803 SNPs and Alzheimer case-control data comprised of 380,157 SNPs) and 10 gene data sets have demonstrated that the proposed model significantly reduced prediction errors and outperformed

  1. Interference-aware random beam selection for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed M.

    2012-09-01

    Spectrum sharing systems have been introduced to alleviate the problem of spectrum scarcity by allowing secondary unlicensed networks to share the spectrum with primary licensed networks under acceptable interference levels to the primary users. In this paper, we develop interference-aware random beam selection schemes that provide enhanced throughput for the secondary link under the condition that the interference observed at the primary link is within a predetermined acceptable value. For a secondary transmitter equipped with multiple antennas, our schemes select a random beam, among a set of power-optimized orthogonal random beams, that maximizes the capacity of the secondary link while satisfying the interference constraint at the primary receiver for different levels of feedback information describing the interference level at the primary receiver. For the proposed schemes, we develop a statistical analysis for the signal-to-noise and interference ratio (SINR) statistics as well as the capacity of the secondary link. Finally, we present numerical results that study the effect of system parameters, including the number of beams and the maximum transmission power, on the capacity of the secondary link attained using the proposed schemes. © 2012 IEEE.

  2. Interference-aware random beam selection for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed M.; Sayed, Mostafa M.; Alouini, Mohamed-Slim; Qaraqe, Khalid A.

    2012-01-01

    ... In this paper, we develop interference-aware random beam selection schemes that provide enhanced throughput for the secondary link under the condition that the interference observed at the primary link is within a predetermined acceptable value. For a secondary ...

  3. Selecting appropriate cases when tracing causal mechanisms

    DEFF Research Database (Denmark)

    Beach, Derek; Pedersen, Rasmus Brun

    2016-01-01

    The last decade has witnessed a resurgence of interest in studying the causal mechanisms linking causes and outcomes in the social sciences. This article explores the overlooked implications for case selection when tracing mechanisms using in-depth case studies. Our argument is that existing case selection guidelines are appropriate for research aimed at making cross-case claims about causal relationships, where case selection is primarily used to control for other causes. However, existing guidelines are not in alignment with case-based research that aims to trace mechanisms, where the goal is to unpack the causal mechanism between X and Y, enabling causal inferences to be made because empirical evidence is provided for how the mechanism actually operated in a particular case. The in-depth, within-case tracing of how mechanisms operate in particular cases produces what can be termed mechanistic...

  4. The signature of positive selection at randomly chosen loci.

    OpenAIRE

    Przeworski, Molly

    2002-01-01

    In Drosophila and humans, there are accumulating examples of loci with a significant excess of high-frequency-derived alleles or high levels of linkage disequilibrium, relative to a neutral model of a random-mating population of constant size. These are features expected after a recent selective sweep. Their prevalence suggests that positive directional selection may be widespread in both species. However, as I show here, these features do not persist long after the sweep ends: The high-frequ...

  5. Simulated Performance Evaluation of a Selective Tracker Through Random Scenario Generation

    DEFF Research Database (Denmark)

    Hussain, Dil Muhammad Akbar

    2006-01-01

    The paper presents a simulation study on the performance of a target tracker using a selective track splitting filter algorithm through a random scenario implemented on a digital signal processor. In a typical track splitting filter, all the observations which fall inside a likelihood ellipse are used for update; however, in our proposed selective track splitting filter a smaller number of observations is used for track update. Much of the previous performance work [1] has been done on specific (deterministic) scenarios. One of the reasons for considering the specific scenarios, which were ... performance assessment. Therefore, a random target motion scenario is adopted. Its implementation, in particular for testing the proposed selective track splitting algorithm using Kalman filters, is investigated through a number of performance parameters which give the activity profile of the tracking scenario.

  6. Minimization over randomly selected lines

    Directory of Open Access Journals (Sweden)

    Ismet Sahin

    2013-07-01

    Full Text Available This paper presents a population-based evolutionary optimization method for minimizing a given cost function. The mutation operator of this method selects randomly oriented lines in the cost function domain, constructs quadratic functions interpolating the cost function at three different points over each line, and uses the extrema of the quadratics as mutated points. The crossover operator modifies each mutated point based on components of two points in the population, instead of one point as is usually done in other evolutionary algorithms. The stopping criterion of this method depends on the number of almost degenerate quadratics. We demonstrate that the proposed method with these mutation and crossover operations achieves faster and more robust convergence than the well-known Differential Evolution and Particle Swarm algorithms.
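
    A single mutation step of the kind described, fitting a quadratic along a randomly oriented line and jumping to its extremum, might be sketched as follows; the variable names and the fallback rule for degenerate or concave quadratics are our own assumptions.

    ```python
    import numpy as np

    def line_quadratic_mutation(x, cost, rng, step=1.0):
        """Mutate a point by minimizing a quadratic fit along a random line.

        Evaluate the cost at three points on a randomly oriented line through
        `x`, fit a 1-D quadratic through those values, and return the point at
        the quadratic's stationary point (falling back to the best sampled
        point when the quadratic is degenerate or concave).
        """
        d = rng.normal(size=x.size)
        d /= np.linalg.norm(d)                    # random unit direction
        ts = np.array([-step, 0.0, step])
        ys = np.array([cost(x + t * d) for t in ts])
        a, b, _ = np.polyfit(ts, ys, 2)           # y ~ a t^2 + b t + c
        if a > 1e-12:                             # convex: use the minimum
            t_star = -b / (2 * a)
        else:                                     # degenerate or concave quadratic
            t_star = ts[np.argmin(ys)]
        return x + t_star * d

    # Example: one mutation step on the sphere function.
    rng = np.random.default_rng(0)
    x_new = line_quadratic_mutation(np.array([2.0, -1.5]), lambda p: np.sum(p**2), rng)
    ```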

  7. Selection for altruism through random drift in variable size populations

    Directory of Open Access Journals (Sweden)

    Houchmandzadeh Bahram

    2012-05-01

    Full Text Available Abstract Background Altruistic behavior is defined as helping others at a cost to oneself and a lowered fitness. The lower fitness implies that altruists should be selected against, which is in contradiction with their widespread presence in nature. Present models of selection for altruism (kin or multilevel selection) show that altruistic behaviors can have ‘hidden’ advantages if the ‘common good’ produced by altruists is restricted to some related or unrelated groups. These models are mostly deterministic, or assume a frequency-dependent fitness. Results Evolutionary dynamics is a competition between deterministic selection pressure and stochastic events due to random sampling from one generation to the next. We show here that an altruistic allele extending the carrying capacity of the habitat can win by increasing the random drift of “selfish” alleles. In other terms, the fixation probability of altruistic genes can be higher than that of selfish ones, even though altruists have a smaller fitness. Moreover, when populations are geographically structured, the altruists' advantage can be highly amplified and the fixation probability of selfish genes can tend toward zero. The above results are obtained both by numerical and analytical calculations. Analytical results are obtained in the limit of large populations. Conclusions The theory we present does not involve kin or multilevel selection, but is based on the existence of random drift in variable size populations. The model is a generalization of the original Fisher-Wright and Moran models where the carrying capacity depends on the number of altruists.

  8. Selection bias and subject refusal in a cluster-randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Rochelle Yang

    2017-07-01

    Full Text Available Abstract Background Selection bias and non-participation bias are major methodological concerns which impact external validity. Cluster-randomized controlled trials are especially prone to selection bias as it is impractical to blind clusters to their allocation into intervention or control. This study assessed the impact of selection bias in a large cluster-randomized controlled trial. Methods The Improved Cardiovascular Risk Reduction to Enhance Rural Primary Care (ICARE) study examined the impact of a remote pharmacist-led intervention in twelve medical offices. To assess eligibility, a standardized form containing patient demographics and medical information was completed for each screened patient. Eligible patients were approached by the study coordinator for recruitment. Both the study coordinator and the patient were aware of the site’s allocation prior to consent. Patients who consented or declined to participate were compared across control and intervention arms for differing characteristics. Statistical significance was determined using a two-tailed, equal variance t-test and a chi-square test with adjusted Bonferroni p-values. Results were adjusted for random cluster variation. Results There were 2749 completed screening forms returned to research staff with 461 subjects who had either consented or declined participation. Patients with poorly controlled diabetes were found to be significantly more likely to decline participation in intervention sites compared to those in control sites. A higher mean diastolic blood pressure was seen in patients with uncontrolled hypertension who declined in the control sites compared to those who declined in the intervention sites. However, these findings were no longer significant after adjustment for random variation among the sites. After this adjustment, females were now found to be significantly more likely to consent than males (odds ratio = 1.41; 95% confidence interval = 1.03, 1

  9. Interference-aware random beam selection schemes for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed; Qaraqe, Khalid; Alouini, Mohamed-Slim

    2012-01-01

    users. In this work, we develop interference-aware random beam selection schemes that provide enhanced performance for the secondary network under the condition that the interference observed by the receivers of the primary network is below a

  10. Not accounting for interindividual variability can mask habitat selection patterns: a case study on black bears.

    Science.gov (United States)

    Lesmerises, Rémi; St-Laurent, Martin-Hugues

    2017-11-01

    Habitat selection studies conducted at the population scale commonly aim to describe general patterns that could improve our understanding of the limiting factors in species-habitat relationships. Researchers often consider interindividual variation in selection patterns to control for its effects and avoid pseudoreplication by using mixed-effect models that include individuals as random factors. Here, we highlight common pitfalls and possible misinterpretations of this strategy by describing habitat selection of 21 black bears Ursus americanus. We used Bayesian mixed-effect models and compared results obtained when using random intercept (i.e., population level) versus calculating individual coefficients for each independent variable (i.e., individual level). We then related interindividual variability to individual characteristics (i.e., age, sex, reproductive status, body condition) in a multivariate analysis. The assumption of comparable behavior among individuals was verified only in 40% of the cases in our seasonal best models. Indeed, we found strong and opposite responses among sampled bears and individual coefficients were linked to individual characteristics. For some covariates, contrasted responses canceled each other out at the population level. In other cases, interindividual variability was concealed by the composition of our sample, with the majority of the bears (e.g., old individuals and bears in good physical condition) driving the population response (e.g., selection of young forest cuts). Our results stress the need to consider interindividual variability to avoid misinterpretation and uninformative results, especially for a flexible and opportunistic species. This study helps to identify some ecological drivers of interindividual variability in bear habitat selection patterns.
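
    The masking effect described above can be reproduced with a small simulation: when individuals select in opposite directions, a pooled fit returns a coefficient near zero while per-individual fits recover the contrast. The sketch below uses plain logistic regressions rather than the Bayesian mixed-effect models of the study, and the numbers are invented for illustration only.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Six simulated animals whose responses to one habitat covariate point in
    # opposite directions; these are not bear data.
    rng = np.random.default_rng(0)
    betas = [2.0, 2.0, 2.0, -2.0, -2.0, -2.0]        # individual selection strength
    ids, xs, ys = [], [], []
    for i, b in enumerate(betas):
        x = rng.normal(size=300)                     # covariate at sampled locations
        y = rng.binomial(1, 1.0 / (1.0 + np.exp(-b * x)))   # used (1) vs available (0)
        ids.append(np.full(300, i)); xs.append(x); ys.append(y)
    ids = np.concatenate(ids)
    X = np.concatenate(xs).reshape(-1, 1)
    y = np.concatenate(ys)

    print("pooled:", LogisticRegression().fit(X, y).coef_[0][0])     # close to zero
    for i in range(len(betas)):
        m = LogisticRegression().fit(X[ids == i], y[ids == i])
        print("animal", i, ":", m.coef_[0][0])                       # roughly +2 or -2
    ```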

  11. Continuous-Time Mean-Variance Portfolio Selection with Random Horizon

    International Nuclear Information System (INIS)

    Yu, Zhiyong

    2013-01-01

    This paper examines the continuous-time mean-variance optimal portfolio selection problem with random market parameters and random time horizon. Treating this problem as a linearly constrained stochastic linear-quadratic optimal control problem, I explicitly derive the efficient portfolios and efficient frontier in closed forms based on the solutions of two backward stochastic differential equations. Some related issues such as a minimum variance portfolio and a mutual fund theorem are also addressed. All the results are markedly different from those in the problem with deterministic exit time. A key part of my analysis involves proving the global solvability of a stochastic Riccati equation, which is interesting in its own right
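
    For orientation, a generic statement of this problem class (notation ours, not necessarily the paper's) is

    \[
    \min_{u(\cdot)} \ \operatorname{Var}\bigl[X(\tau)\bigr]
    \quad \text{subject to} \quad \mathbb{E}\bigl[X(\tau)\bigr] = z,
    \]
    \[
    dX(t) = \bigl[r(t)\,X(t) + B(t)^{\top}u(t)\bigr]\,dt + u(t)^{\top}\sigma(t)\,dW(t),
    \qquad X(0) = x_0,
    \]

    where τ is the random exit time, X the wealth, u the amounts held in the risky assets, r the interest rate, B the excess appreciation rates and σ the volatility matrix; introducing a Lagrange multiplier for the expectation constraint yields the linearly constrained stochastic linear-quadratic control problem mentioned in the abstract, whose solution rests on a stochastic Riccati equation.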

  12. Continuous-Time Mean-Variance Portfolio Selection with Random Horizon

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Zhiyong, E-mail: yuzhiyong@sdu.edu.cn [Shandong University, School of Mathematics (China)

    2013-12-15

    This paper examines the continuous-time mean-variance optimal portfolio selection problem with random market parameters and random time horizon. Treating this problem as a linearly constrained stochastic linear-quadratic optimal control problem, I explicitly derive the efficient portfolios and efficient frontier in closed forms based on the solutions of two backward stochastic differential equations. Some related issues such as a minimum variance portfolio and a mutual fund theorem are also addressed. All the results are markedly different from those in the problem with deterministic exit time. A key part of my analysis involves proving the global solvability of a stochastic Riccati equation, which is interesting in its own right.

  13. TEHRAN AIR POLLUTANTS PREDICTION BASED ON RANDOM FOREST FEATURE SELECTION METHOD

    Directory of Open Access Journals (Sweden)

    A. Shamsoddini

    2017-09-01

    Full Text Available Air pollution, as one of the most serious forms of environmental pollution, poses a huge threat to human life. Air pollution leads to environmental instability, and has harmful and undesirable effects on the environment. Modern methods for predicting pollutant concentrations are able to improve decision making and provide appropriate solutions. This study examines the performance of Random Forest feature selection in combination with multiple linear regression and Multilayer Perceptron Artificial Neural Network methods, in order to achieve an efficient model to estimate carbon monoxide, nitrogen dioxide, sulfur dioxide and PM2.5 contents in the air. The results indicated that Artificial Neural Networks fed by the attributes selected by the Random Forest feature selection method performed more accurately than the other models for the modeling of all pollutants. The estimation accuracy for sulfur dioxide emissions was lower than for the other air contaminants, whereas nitrogen dioxide was predicted more accurately than the other pollutants.

  14. Tehran Air Pollutants Prediction Based on Random Forest Feature Selection Method

    Science.gov (United States)

    Shamsoddini, A.; Aboodi, M. R.; Karami, J.

    2017-09-01

    Air pollution, as one of the most serious forms of environmental pollution, poses a huge threat to human life. Air pollution leads to environmental instability, and has harmful and undesirable effects on the environment. Modern methods for predicting pollutant concentrations are able to improve decision making and provide appropriate solutions. This study examines the performance of Random Forest feature selection in combination with multiple linear regression and Multilayer Perceptron Artificial Neural Network methods, in order to achieve an efficient model to estimate carbon monoxide, nitrogen dioxide, sulfur dioxide and PM2.5 contents in the air. The results indicated that Artificial Neural Networks fed by the attributes selected by the Random Forest feature selection method performed more accurately than the other models for the modeling of all pollutants. The estimation accuracy for sulfur dioxide emissions was lower than for the other air contaminants, whereas nitrogen dioxide was predicted more accurately than the other pollutants.

  15. Using Random Forests to Select Optimal Input Variables for Short-Term Wind Speed Forecasting Models

    Directory of Open Access Journals (Sweden)

    Hui Wang

    2017-10-01

    Full Text Available Achieving relatively high-accuracy short-term wind speed forecasts is a precondition for the construction and grid-connected operation of wind power forecasting systems for wind farms. Currently, most research is focused on the structure of forecasting models and does not consider the selection of input variables, which can have significant impacts on forecasting performance. This paper presents an input variable selection method for wind speed forecasting models. The candidate input variables for various leading periods are selected and random forests (RF) is employed to evaluate the importance of all variables as features. The feature subset with the best evaluation performance is selected as the optimal feature set. Then, a kernel-based extreme learning machine is constructed to evaluate the performance of input variable selection based on RF. The results of the case study show that by removing the uncorrelated and redundant features, RF effectively extracts the most strongly correlated set of features from the candidate input variables. By finding the optimal feature combination to represent the original information, RF simplifies the structure of the wind speed forecasting model, shortens the training time required, and substantially improves the model’s accuracy and generalization ability, demonstrating that the input variables selected by RF are effective.

  16. Hebbian Learning in a Random Network Captures Selectivity Properties of the Prefrontal Cortex

    Science.gov (United States)

    Lindsay, Grace W.

    2017-01-01

    Complex cognitive behaviors, such as context-switching and rule-following, are thought to be supported by the prefrontal cortex (PFC). Neural activity in the PFC must thus be specialized to specific tasks while retaining flexibility. Nonlinear “mixed” selectivity is an important neurophysiological trait for enabling complex and context-dependent behaviors. Here we investigate (1) the extent to which the PFC exhibits computationally relevant properties, such as mixed selectivity, and (2) how such properties could arise via circuit mechanisms. We show that PFC cells recorded from male and female rhesus macaques during a complex task show a moderate level of specialization and structure that is not replicated by a model wherein cells receive random feedforward inputs. While random connectivity can be effective at generating mixed selectivity, the data show significantly more mixed selectivity than predicted by a model with otherwise matched parameters. A simple Hebbian learning rule applied to the random connectivity, however, increases mixed selectivity and enables the model to match the data more accurately. To explain how learning achieves this, we provide analysis along with a clear geometric interpretation of the impact of learning on selectivity. After learning, the model also matches the data on measures of noise, response density, clustering, and the distribution of selectivities. Of two styles of Hebbian learning tested, the simpler and more biologically plausible option better matches the data. These modeling results provide clues about how neural properties important for cognition can arise in a circuit and make clear experimental predictions regarding how various measures of selectivity would evolve during animal training. SIGNIFICANCE STATEMENT The prefrontal cortex is a brain region believed to support the ability of animals to engage in complex behavior. How neurons in this area respond to stimuli—and in particular, to combinations of stimuli (

  17. Performance Evaluation of User Selection Protocols in Random Networks with Energy Harvesting and Hardware Impairments

    Directory of Open Access Journals (Sweden)

    Tan Nhat Nguyen

    2016-01-01

    Full Text Available In this paper, we evaluate the performance of various user selection protocols under the impact of hardware impairments. In the considered protocols, a Base Station (BS) selects one of the available Users (USs) to serve, while the remaining USs harvest energy from the Radio Frequency (RF) signal transmitted by the BS. We assume that all of the USs randomly appear around the BS. In the Random Selection Protocol (RAN), the BS randomly selects a US to transmit the data. In the second proposed protocol, named the Minimum Distance Protocol (MIND), the US that is nearest to the BS will be chosen. In the Optimal Selection Protocol (OPT), the US providing the highest channel gain between itself and the BS will be served. For performance evaluation, we derive exact and asymptotic closed-form expressions of the average Outage Probability (OP) over Rayleigh fading channels. We also consider the average harvested energy per US. Finally, Monte-Carlo simulations are performed to verify the theoretical results.
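
    The three selection rules can be compared with a short Monte Carlo simulation; the geometry, path-loss model, and outage criterion below are illustrative assumptions, and hardware impairments and energy harvesting are not modeled.

    ```python
    import numpy as np

    def outage_probability(protocol, n_users=5, trials=200_000,
                           gamma_th=1.0, path_loss_exp=3.0, seed=0):
        """Monte Carlo outage probability for three user-selection rules.

        Users are dropped uniformly in a unit disc around the BS; channels are
        Rayleigh faded, so the channel gain of user i is exponential with mean
        d_i**(-path_loss_exp).  RAN picks a user at random, MIND the nearest
        user, OPT the user with the largest instantaneous gain.  Outage is
        declared when the selected gain falls below `gamma_th`.
        """
        rng = np.random.default_rng(seed)
        d = np.sqrt(rng.uniform(0, 1, (trials, n_users)))   # uniform in a disc
        gains = rng.exponential(d ** (-path_loss_exp))
        if protocol == "RAN":
            sel = gains[np.arange(trials), rng.integers(0, n_users, trials)]
        elif protocol == "MIND":
            sel = gains[np.arange(trials), np.argmin(d, axis=1)]
        elif protocol == "OPT":
            sel = gains.max(axis=1)
        return np.mean(sel < gamma_th)

    for p in ("RAN", "MIND", "OPT"):
        print(p, outage_probability(p))
    ```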

  18. Correcting Classifiers for Sample Selection Bias in Two-Phase Case-Control Studies

    Science.gov (United States)

    Theis, Fabian J.

    2017-01-01

    Epidemiological studies often utilize stratified data in which rare outcomes or exposures are artificially enriched. This design can increase precision in association tests but distorts predictions when applying classifiers on nonstratified data. Several methods correct for this so-called sample selection bias, but their performance remains unclear especially for machine learning classifiers. With an emphasis on two-phase case-control studies, we aim to assess which corrections to perform in which setting and to obtain methods suitable for machine learning techniques, especially the random forest. We propose two new resampling-based methods to resemble the original data and covariance structure: stochastic inverse-probability oversampling and parametric inverse-probability bagging. We compare all techniques for the random forest and other classifiers, both theoretically and on simulated and real data. Empirical results show that the random forest profits from only the parametric inverse-probability bagging proposed by us. For other classifiers, correction is mostly advantageous, and methods perform uniformly. We discuss consequences of inappropriate distribution assumptions and reason for different behaviors between the random forest and other classifiers. In conclusion, we provide guidance for choosing correction methods when training classifiers on biased samples. For random forests, our method outperforms state-of-the-art procedures if distribution assumptions are roughly fulfilled. We provide our implementation in the R package sambia. PMID:29312464

  19. Random Versus Nonrandom Peer Review: A Case for More Meaningful Peer Review.

    Science.gov (United States)

    Itri, Jason N; Donithan, Adam; Patel, Sohil H

    2018-05-10

    Random peer review programs are not optimized to discover cases with diagnostic error and thus have inherent limitations with respect to educational and quality improvement value. Nonrandom peer review offers an alternative approach in which diagnostic error cases are targeted for collection during routine clinical practice. The objective of this study was to compare error cases identified through random and nonrandom peer review approaches at an academic center. During the 1-year study period, the number of discrepancy cases and score of discrepancy were determined from each approach. The nonrandom peer review process collected 190 cases, of which 60 were scored as 2 (minor discrepancy), 94 as 3 (significant discrepancy), and 36 as 4 (major discrepancy). In the random peer review process, 1,690 cases were reviewed, of which 1,646 were scored as 1 (no discrepancy), 44 were scored as 2 (minor discrepancy), and none were scored as 3 or 4. Several teaching lessons and quality improvement measures were developed as a result of analysis of error cases collected through the nonrandom peer review process. Our experience supports the implementation of nonrandom peer review as a replacement to random peer review, with nonrandom peer review serving as a more effective method for collecting diagnostic error cases with educational and quality improvement value. Copyright © 2018 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  20. The reliability of randomly selected final year pharmacy students in ...

    African Journals Online (AJOL)

    Employing ANOVA, factorial experimental analysis, and the theory of error, reliability studies were conducted on the assessment of the drug product chloroquine phosphate tablets. The G–Study employed equal numbers of the factors for uniform control, and involved three analysts (randomly selected final year Pharmacy ...

  1. [Sexual offences--selected cases].

    Science.gov (United States)

    Łabecka, Marzena; Jarzabek-Bielecka, Grazyna; Lorkiewicz-Muszyńska, Dorota

    2013-04-01

    Expert testimony on violence victims also includes victims of sexual assault. The role of an expert is to classify the injuries by their severity as defined in art. 156, 157 or 217 of the Criminal Code pertaining to crimes against health and life. Also, the role of an expert opinion is to determine whether the injuries identified during the exam occurred at the time and under the circumstances stated in the medical history. The examination of sexual assault victims is conducted by two experts: a gynecologist and a forensic physician. Most examinations are performed at different times and various medical centers. The conclusions are presented in an official report. Regardless of victim age, all sexual crimes are investigated ex officio by the Police Department and the Prosecutor's Office. Further legal classification of criminal offenses is the task of an appropriate legal body and the offenses are codified in accordance with the provisions of chapter XXV of the Criminal Code, articles 197 - 205. In controversial cases, i.e. when two different expert opinions appear on the same case, or if, according to the law enforcement, a medical opinion is insufficient for some reason, an appropriate expert or team of experts is appointed to resolve the problem. To present selected cases of sexual violence victims treated at the Department of Gynecology and assessed at the Department of Forensic Medicine with reference to the challenges regarding qualification of the sustained injuries and clinical diagnoses. Research material included selected forensic opinions developed for law enforcement offices that involved victims of sexual violence. The expert opinions were prepared either on the basis of submitted evidence, or both submitted evidence and examination of the victim at the Department of Forensic Medicine. Moreover, the article presents a case of a patient examined and treated at the Department of Gynecology in Poznan. Based on the selected cases, the authors conclude that a

  2. Random selection of items. Selection of n1 samples among N items composing a stratum

    International Nuclear Information System (INIS)

    Jaech, J.L.; Lemaire, R.J.

    1987-02-01

    STR-224 provides generalized procedures to determine required sample sizes, for instance in the course of a Physical Inventory Verification at Bulk Handling Facilities. The present report describes procedures to generate random numbers and select groups of items to be verified in a given stratum through each of the measurement methods involved in the verification. (author). 3 refs
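
    A minimal sketch of the selection step, drawing n1 distinct item indices from a stratum of N items as a simple random sample without replacement (function name ours):

    ```python
    import random

    def select_sample(stratum_size, n1, seed=None):
        """Draw n1 distinct item indices (1..N) from a stratum of N items.

        Equivalent to generating random numbers and discarding duplicates until
        n1 distinct item labels have been obtained, as done when choosing which
        items in a stratum to verify.
        """
        rng = random.Random(seed)
        return sorted(rng.sample(range(1, stratum_size + 1), n1))

    print(select_sample(stratum_size=480, n1=25, seed=7))
    ```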

  3. Selecting Optimal Random Forest Predictive Models: A Case Study on Predicting the Spatial Distribution of Seabed Hardness

    Science.gov (United States)

    Li, Jin; Tran, Maggie; Siwabessy, Justy

    2016-01-01

    Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia’s marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy and can be inferred from underwater video footage at limited locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models to predict seabed hardness using random forest (RF) based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods, namely variable importance (VI), averaged variable importance (AVI), knowledge-informed AVI (KIAVI), Boruta and regularized RF (RRF), were tested based on predictive accuracy. Effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach used to pre-select predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to ‘small p and large n’ problems in environmental sciences. Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency and

  4. The mathematics of random mutation and natural selection for multiple simultaneous selection pressures and the evolution of antimicrobial drug resistance.

    Science.gov (United States)

    Kleinman, Alan

    2016-12-20

    The random mutation and natural selection phenomenon acts in a mathematically predictable way, which, when understood, leads to approaches that reduce and prevent the failure of these selection pressures when treating infections and cancers. The underlying principle for impairing the random mutation and natural selection phenomenon is to use combination therapy, which forces the population to evolve under multiple selection pressures simultaneously and thereby invokes the multiplication rule of probabilities. Recently, combination therapy for the treatment of malaria has been seen to fail to prevent the emergence of drug-resistant variants. Using this empirical example and the principles of probability theory, the derivation of the equations describing this treatment failure is carried out. These equations give guidance on how to use combination therapy for the treatment of cancers and infectious diseases while preventing the emergence of drug resistance. Copyright © 2016 John Wiley & Sons, Ltd.
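
    The multiplication rule mentioned above can be illustrated with a toy calculation (the mutation rates and population size below are assumed for illustration, not taken from the paper): requiring independent resistance mutations for each drug in a combination multiplies their per-lineage probabilities, which sharply reduces the chance that any resistant variant arises.

```python
# Illustrative back-of-the-envelope use of the multiplication rule (assumed numbers,
# not the paper's parameters): probability that at least one pathogen in a population
# acquires the resistance mutations needed against every drug in a combination.
def p_resistant_variant(population_size, per_replication_rates):
    # Joint probability that a single lineage carries all required mutations,
    # assuming the resistance mutations arise independently.
    p_joint = 1.0
    for rate in per_replication_rates:
        p_joint *= rate
    # Probability that at least one of N independent replications produces it.
    return 1.0 - (1.0 - p_joint) ** population_size

# Single drug: resistance is essentially certain in a large population (hypothetical rate).
print(p_resistant_variant(1e9, [1e-8]))
# Two drugs with independent targets: the joint rate of 1e-16 makes emergence unlikely (~1e-7).
print(p_resistant_variant(1e9, [1e-8, 1e-8]))
```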

  5. A Bayesian random effects discrete-choice model for resource selection: Population-level selection inference

    Science.gov (United States)

    Thomas, D.L.; Johnson, D.; Griffith, B.

    2006-01-01

    Modeling the probability of use of land units characterized by discrete and continuous measures, we present a Bayesian random-effects model to assess resource selection. This model provides simultaneous estimation of both individual- and population-level selection. Deviance information criterion (DIC), a Bayesian alternative to AIC that is sample-size specific, is used for model selection. Aerial radiolocation data from 76 adult female caribou (Rangifer tarandus) and calf pairs during 1 year on an Arctic coastal plain calving ground were used to illustrate models and assess population-level selection of landscape attributes, as well as individual heterogeneity of selection. Landscape attributes included elevation, NDVI (a measure of forage greenness), and land cover-type classification. Results from the first of a 2-stage model-selection procedure indicated that there is substantial heterogeneity among cow-calf pairs with respect to selection of the landscape attributes. In the second stage, selection of models with heterogeneity included indicated that at the population-level, NDVI and land cover class were significant attributes for selection of different landscapes by pairs on the calving ground. Population-level selection coefficients indicate that the pairs generally select landscapes with higher levels of NDVI, but the relationship is quadratic. The highest rate of selection occurs at values of NDVI less than the maximum observed. Results for land cover-class selection coefficients indicate that wet sedge, moist sedge, herbaceous tussock tundra, and shrub tussock tundra are selected at approximately the same rate, while alpine and sparsely vegetated landscapes are selected at a lower rate. Furthermore, the variability in selection by individual caribou for moist sedge and sparsely vegetated landscapes is large relative to the variability in selection of other land cover types. The example analysis illustrates that, while sometimes computationally intense, a

  6. The signature of positive selection at randomly chosen loci.

    Science.gov (United States)

    Przeworski, Molly

    2002-03-01

    In Drosophila and humans, there are accumulating examples of loci with a significant excess of high-frequency-derived alleles or high levels of linkage disequilibrium, relative to a neutral model of a random-mating population of constant size. These are features expected after a recent selective sweep. Their prevalence suggests that positive directional selection may be widespread in both species. However, as I show here, these features do not persist long after the sweep ends: The high-frequency alleles drift to fixation and no longer contribute to polymorphism, while linkage disequilibrium is broken down by recombination. As a result, loci chosen without independent evidence of recent selection are not expected to exhibit either of these features, even if they have been affected by numerous sweeps in their genealogical history. How then can we explain the patterns in the data? One possibility is population structure, with unequal sampling from different subpopulations. Alternatively, positive selection may not operate as is commonly modeled. In particular, the rate of fixation of advantageous mutations may have increased in the recent past.

  7. No Randomization? No Problem: Experimental Control and Random Assignment in Single Case Research

    Science.gov (United States)

    Ledford, Jennifer R.

    2018-01-01

    Randomization of large numbers of participants to different treatment groups is often not a feasible or preferable way to answer questions of immediate interest to professional practice. Single case designs (SCDs) are a class of research designs that are experimental in nature but require only a few participants, all of whom receive the…

  8. The genealogy of samples in models with selection.

    Science.gov (United States)

    Neuhauser, C; Krone, S M

    1997-02-01

    We introduce the genealogy of a random sample of genes taken from a large haploid population that evolves according to random reproduction with selection and mutation. Without selection, the genealogy is described by Kingman's well-known coalescent process. In the selective case, the genealogy of the sample is embedded in a graph with a coalescing and branching structure. We describe this graph, called the ancestral selection graph, and point out differences and similarities with Kingman's coalescent. We present simulations for a two-allele model with symmetric mutation in which one of the alleles has a selective advantage over the other. We find that when the allele frequencies in the population are already in equilibrium, then the genealogy does not differ much from the neutral case. This is supported by rigorous results. Furthermore, we describe the ancestral selection graph for other selective models with finitely many selection classes, such as the K-allele models, infinitely-many-alleles models, DNA sequence models, and infinitely-many-sites models, and briefly discuss the diploid case.

  9. Discrete least squares polynomial approximation with random evaluations - application to PDEs with Random parameters

    KAUST Repository

    Nobile, Fabio

    2015-01-07

    We consider a general problem F(u, y) = 0 where u is the unknown solution, possibly Hilbert space valued, and y a set of uncertain parameters. We specifically address the situation in which the parameter-to-solution map u(y) is smooth, whereas y could be very high (or even infinite) dimensional. In particular, we are interested in cases in which F is a differential operator, u a Hilbert space valued function and y a distributed, space and/or time varying, random field. We aim at reconstructing the parameter-to-solution map u(y) from random noise-free or noisy observations in random points by discrete least squares on polynomial spaces. The noise-free case is relevant whenever the technique is used to construct metamodels, based on polynomial expansions, for the output of computer experiments. In the case of PDEs with random parameters, the metamodel is then used to approximate statistics of the output quantity. We discuss the stability of discrete least squares on random points and show convergence estimates both in expectation and probability. We also present possible strategies to select, either a-priori or by adaptive algorithms, sequences of approximating polynomial spaces that allow one to reduce, and in some cases break, the curse of dimensionality.
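
    A minimal one-dimensional sketch of the approach, under the assumption of a scalar parameter y and a smooth stand-in quantity of interest (the function u below is made up for illustration): evaluate the map at random points and fit a polynomial metamodel by discrete least squares, then use the metamodel to estimate output statistics cheaply.

```python
# Minimal 1-D sketch of discrete least squares on random evaluation points,
# assuming a smooth parameter-to-solution map u(y); illustrative only.
import numpy as np

rng = np.random.default_rng(0)
u = lambda y: np.exp(-y) * np.sin(3 * y)         # stand-in for a smooth quantity of interest

n_points, degree = 200, 8                        # oversample: n_points >> degree + 1
y_samples = rng.uniform(-1.0, 1.0, n_points)     # random points in the parameter domain
observations = u(y_samples)                      # noise-free evaluations (metamodel case)

# Least-squares fit in a Legendre basis, well conditioned on [-1, 1].
coeffs = np.polynomial.legendre.legfit(y_samples, observations, degree)
metamodel = lambda y: np.polynomial.legendre.legval(y, coeffs)

# Cheap surrogate statistics of the output quantity via Monte Carlo on the metamodel.
y_mc = rng.uniform(-1.0, 1.0, 100_000)
print("surrogate mean:", metamodel(y_mc).mean())
```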

  10. Differential privacy-based evaporative cooling feature selection and classification with relief-F and random forests.

    Science.gov (United States)

    Le, Trang T; Simmons, W Kyle; Misaki, Masaya; Bodurka, Jerzy; White, Bill C; Savitz, Jonathan; McKinney, Brett A

    2017-09-15

    Classification of individuals into disease or clinical categories from high-dimensional biological data with low prediction error is an important challenge of statistical learning in bioinformatics. Feature selection can improve classification accuracy but must be incorporated carefully into cross-validation to avoid overfitting. Recently, feature selection methods based on differential privacy, such as differentially private random forests and reusable holdout sets, have been proposed. However, for domains such as bioinformatics, where the number of features is much larger than the number of observations p≫n , these differential privacy methods are susceptible to overfitting. We introduce private Evaporative Cooling, a stochastic privacy-preserving machine learning algorithm that uses Relief-F for feature selection and random forest for privacy preserving classification that also prevents overfitting. We relate the privacy-preserving threshold mechanism to a thermodynamic Maxwell-Boltzmann distribution, where the temperature represents the privacy threshold. We use the thermal statistical physics concept of Evaporative Cooling of atomic gases to perform backward stepwise privacy-preserving feature selection. On simulated data with main effects and statistical interactions, we compare accuracies on holdout and validation sets for three privacy-preserving methods: the reusable holdout, reusable holdout with random forest, and private Evaporative Cooling, which uses Relief-F feature selection and random forest classification. In simulations where interactions exist between attributes, private Evaporative Cooling provides higher classification accuracy without overfitting based on an independent validation set. In simulations without interactions, thresholdout with random forest and private Evaporative Cooling give comparable accuracies. We also apply these privacy methods to human brain resting-state fMRI data from a study of major depressive disorder. Code

  11. Interference-aware random beam selection schemes for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed

    2012-10-19

    Spectrum sharing systems have been recently introduced to alleviate the problem of spectrum scarcity by allowing secondary unlicensed networks to share the spectrum with primary licensed networks under acceptable interference levels to the primary users. In this work, we develop interference-aware random beam selection schemes that provide enhanced performance for the secondary network under the condition that the interference observed by the receivers of the primary network is below a predetermined/acceptable value. We consider a secondary link composed of a transmitter equipped with multiple antennas and a single-antenna receiver sharing the same spectrum with a primary link composed of a single-antenna transmitter and a single-antenna receiver. The proposed schemes select a beam, among a set of power-optimized random beams, that maximizes the signal-to-interference-plus-noise ratio (SINR) of the secondary link while satisfying the primary interference constraint for different levels of feedback information describing the interference level at the primary receiver. For the proposed schemes, we develop a statistical analysis for the SINR statistics as well as the capacity and bit error rate (BER) of the secondary link.
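
    The selection rule can be sketched schematically as follows (a toy channel model with assumed Rayleigh fading and an arbitrary interference limit, not the paper's exact system or analysis): among unit-power random beams, keep only those meeting the primary interference constraint and pick the one giving the largest secondary SINR.

```python
# Schematic sketch (assumed channel model, not the paper's scheme): pick, from a set of
# random beams, the one maximizing the secondary SINR subject to the primary
# interference constraint.
import numpy as np

rng = np.random.default_rng(1)
n_antennas, n_beams = 4, 8
noise_power, interference_limit = 1.0, 0.5

h_secondary = (rng.normal(size=n_antennas) + 1j * rng.normal(size=n_antennas)) / np.sqrt(2)
h_primary = (rng.normal(size=n_antennas) + 1j * rng.normal(size=n_antennas)) / np.sqrt(2)
beams = rng.normal(size=(n_beams, n_antennas)) + 1j * rng.normal(size=(n_beams, n_antennas))
beams /= np.linalg.norm(beams, axis=1, keepdims=True)    # unit-power random beams

best_beam, best_sinr = None, -np.inf
for w in beams:
    interference = abs(h_primary @ w) ** 2               # leakage toward the primary receiver
    if interference > interference_limit:                # discard beams violating the constraint
        continue
    sinr = abs(h_secondary @ w) ** 2 / noise_power       # no co-channel term in this toy model
    if sinr > best_sinr:
        best_beam, best_sinr = w, sinr
print("selected beam SINR:", best_sinr)
```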

  12. Topology-selective jamming of fully-connected, code-division random-access networks

    Science.gov (United States)

    Polydoros, Andreas; Cheng, Unjeng

    1990-01-01

    The purpose is to introduce certain models of topology selective stochastic jamming and examine its impact on a class of fully-connected, spread-spectrum, slotted ALOHA-type random access networks. The theory covers dedicated as well as half-duplex units. The dominant role of the spatial duty factor is established, and connections with the dual concept of time selective jamming are discussed. The optimal choices of coding rate and link access parameters (from the users' side) and the jamming spatial fraction are numerically established for DS and FH spreading.

  13. Peculiarities of the statistics of spectrally selected fluorescence radiation in laser-pumped dye-doped random media

    Science.gov (United States)

    Yuvchenko, S. A.; Ushakova, E. V.; Pavlova, M. V.; Alonova, M. V.; Zimnyakov, D. A.

    2018-04-01

    We consider the practical realization of a new optical probe method for random media, defined as reference-free path-length interferometry with intensity-moment analysis. A peculiarity in the statistics of the spectrally selected fluorescence radiation in a laser-pumped dye-doped random medium is discussed. Previously established correlations between the second- and third-order moments of the intensity fluctuations in the random interference patterns, the coherence function of the probe radiation, and the path-difference probability density for the interfering partial waves in the medium are confirmed. The correlations were verified using statistical analysis of the spectrally selected fluorescence radiation emitted by a laser-pumped dye-doped random medium. An aqueous solution of Rhodamine 6G was applied as the doping fluorescent agent for ensembles of densely packed silica grains, which were pumped by the 532 nm radiation of a solid-state laser. The spectrum of the mean path length for a random medium was reconstructed.

  14. An uncommon case of random fire-setting behavior associated with Todd paralysis: A case report

    Directory of Open Access Journals (Sweden)

    Kanehisa Masayuki

    2012-08-01

    Full Text Available Abstract Background The association between fire-setting behavior and psychiatric or medical disorders remains poorly understood. Although a link between fire-setting behavior and various organic brain disorders has been established, associations between fire setting and focal brain lesions have not yet been reported. Here, we describe the case of a 24-year-old first time arsonist who suffered Todd’s paralysis prior to the onset of a bizarre and random fire-setting behavior. Case presentation A case of a 24-year-old man with a sudden onset of a bizarre and random fire-setting behavior is reported. The man, who had been arrested on felony arson charges, complained of difficulties concentrating and of recent memory disturbances with leg weakness. A video-EEG recording demonstrated a close relationship between the focal motor impairment and a clear-cut epileptic ictal discharge involving the bilateral motor cortical areas. The SPECT result was statistically analyzed by comparison with standard SPECT images obtained from our institute (easy Z-score imaging system; eZIS). eZIS revealed hypoperfusion in the cingulate cortex and basal ganglia, and hyperperfusion in the frontal cortex. A neuropsychological test battery revealed lower than normal scores for executive function, attention, and memory, consistent with frontal lobe dysfunction. Conclusion The fire-setting behavior and Todd’s paralysis, together with an unremarkable performance on tests measuring executive function fifteen months prior, suggested a causal relationship between this organic brain lesion and the fire-setting behavior. The case describes a rare and as yet unreported association between random, impulse-driven fire-setting behavior and damage to the brain and suggests a disconnection of frontal lobe structures as a possible pathogenic mechanism.

  15. Blind Measurement Selection: A Random Matrix Theory Approach

    KAUST Repository

    Elkhalil, Khalil

    2016-12-14

    This paper considers the problem of selecting a set of $k$ measurements from $n$ available sensor observations. The selected measurements should minimize a certain error function assessing the error in estimating a certain $m$ dimensional parameter vector. The exhaustive search inspecting each of the ${n \choose k}$ possible choices would require a very high computational complexity and as such is not practical for large $n$ and $k$. Alternative methods with low complexity have recently been investigated but their main drawbacks are that 1) they require perfect knowledge of the measurement matrix and 2) they need to be applied at the pace of change of the measurement matrix. To overcome these issues, we consider the asymptotic regime in which $k$, $n$ and $m$ grow large at the same pace. Tools from random matrix theory are then used to approximate in closed-form the most important error measures that are commonly used. The asymptotic approximations are then leveraged to select properly $k$ measurements exhibiting low values for the asymptotic error measures. Two heuristic algorithms are proposed: the first one merely consists in applying the convex optimization artifice to the asymptotic error measure. The second algorithm is a low-complexity greedy algorithm that attempts to look for a sufficiently good solution for the original minimization problem. The greedy algorithm can be applied to both the exact and the asymptotic error measures and can be thus implemented in blind and channel-aware fashions. We present two potential applications where the proposed algorithms can be used, namely antenna selection for uplink transmissions in large scale multi-user systems and sensor selection for wireless sensor networks. Numerical results are also presented and sustain the efficiency of the proposed blind methods in reaching the performances of channel-aware algorithms.
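
    The greedy variant can be illustrated with a toy linear-Gaussian sketch (the A-optimality cost below is an assumed stand-in for the paper's asymptotic error measures): at each step, add the candidate measurement row that most reduces the estimation-error criterion.

```python
# Toy greedy sketch of selecting k of n sensor rows to estimate an m-dimensional
# parameter (assumed linear Gaussian model; the paper's random-matrix-theory criteria
# are not reproduced here).
import numpy as np

def greedy_select(H, k, reg=1e-3):
    """Pick k rows of the n-by-m measurement matrix H, minimizing the A-optimality
    criterion trace((H_S^T H_S + reg*I)^-1) at each step."""
    n, m = H.shape
    selected = []
    for _ in range(k):
        best_row, best_cost = None, np.inf
        for i in range(n):
            if i in selected:
                continue
            Hs = H[selected + [i]]
            cost = np.trace(np.linalg.inv(Hs.T @ Hs + reg * np.eye(m)))
            if cost < best_cost:
                best_row, best_cost = i, cost
        selected.append(best_row)
    return selected

H = np.random.default_rng(2).normal(size=(50, 5))   # 50 candidate measurements, 5 parameters
print(greedy_select(H, k=8))
```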

  16. Evaluation and selection of CASE tool for SMART OTS development

    International Nuclear Information System (INIS)

    Park, K. O; Seo, S. M.; Seo, Y. S.; Koo, I. S.; Jang, M. H.

    1999-01-01

    A CASE (Computer-Aided Software Engineering) tool is software that aids in software engineering activities such as requirements analysis, design, testing, configuration management, and project management. The evaluation and selection of commercial CASE tools for a specific software development project is not an easy task, because it requires technical ability on the part of the evaluator and maturity of the software development organization. In this paper, we discuss selection strategies, a characteristics survey, evaluation criteria, and the result of CASE tool selection for the development of the SMART (System-integrated Modular Advanced ReacTor) OTS (Operator Training Simulator).

  17. Managing the Public Sector Research and Development Portfolio Selection Process: A Case Study of Quantitative Selection and Optimization

    Science.gov (United States)

    2016-09-01

    A case study by Jason A. Schwartz describing how public sector organizations can implement a research and development (R&D) portfolio optimization strategy to maximize the cost

  18. A Monte Carlo study of adsorption of random copolymers on random surfaces

    CERN Document Server

    Moghaddam, M S

    2003-01-01

    We study the adsorption problem of a random copolymer on a random surface in which a self-avoiding walk in three dimensions interacts with a plane defining a half-space to which the walk is confined. Each vertex of the walk is randomly labelled A with probability $p_p$ or B with probability $1 - p_p$, and only vertices labelled A are attracted to the surface plane. Each lattice site on the plane is also labelled either A with probability $p_s$ or B with probability $1 - p_s$, and only lattice sites labelled A interact with the walk. We study two variations of this model: in the first case the A-vertices of the walk interact only with the A-sites on the surface. In the second case the constraint of selective binding is removed; that is, any contact between the walk and the surface that involves an A-labelling, either from the surface or from the walk, is counted as a visit to the surface. The system is quenched in both cases, i.e. the labellings of the walk and of the surface are fixed as thermodynam...

  19. Selected Regional Judicial Officer Cases, 2005 - Present

    Data.gov (United States)

    U.S. Environmental Protection Agency — This dataset contains selected cases involving EPA's Regional Judicial Officers (RJOs) from 2005 to present. EPA's Regional Judicial Officers (RJOs) perform...

  20. Using ArcMap, Google Earth, and Global Positioning Systems to select and locate random households in rural Haiti.

    Science.gov (United States)

    Wampler, Peter J; Rediske, Richard R; Molla, Azizur R

    2013-01-18

    A remote sensing technique was developed which combines a Geographic Information System (GIS); Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground based surveys to collect household location data and can be performed in a rapid, cost-effective manner. The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. A total of 537 homes initially were mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed. 16.4% of the homes visited were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. The method used to generate and field locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent and only rarely was local knowledge required to identify and locate households. This
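
    The random-subset step of this workflow is simple enough to sketch in a few lines (the file names, column names, and sample size of 96 are illustrative assumptions, not part of the published method): read the homes digitized in Google Earth, draw a reproducible random sample, and write waypoints for upload to a handheld GPS.

```python
# Hypothetical sketch of the random-subset step described above: given homes exported
# from Google Earth/ArcMap as a CSV of coordinates, draw a random sample and write
# waypoints for a handheld GPS. File names and sample size are illustrative.
import csv
import random

def select_households(input_csv="mapped_homes.csv", output_csv="survey_waypoints.csv",
                      sample_size=96, seed=2012):
    with open(input_csv, newline="") as f:
        homes = list(csv.DictReader(f))          # expects columns: id, latitude, longitude
    random.seed(seed)                            # fixed seed keeps the draw reproducible
    sample = random.sample(homes, min(sample_size, len(homes)))
    with open(output_csv, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["id", "latitude", "longitude"])
        writer.writeheader()
        for home in sample:
            writer.writerow({k: home[k] for k in ("id", "latitude", "longitude")})

# select_households()  # writes survey_waypoints.csv from mapped_homes.csv
```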

  1. Using ArcMap, Google Earth, and Global Positioning Systems to select and locate random households in rural Haiti

    Directory of Open Access Journals (Sweden)

    Wampler Peter J

    2013-01-01

    Full Text Available Abstract Background A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground based surveys to collect household location data and can be performed in a rapid, cost-effective manner. Methods The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. Results A total of 537 homes initially were mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed. 16.4% of the homes visited were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. Conclusions The method used to generate and field locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent and only

  2. Construction Tender Subcontract Selection using Case-based Reasoning

    Directory of Open Access Journals (Sweden)

    Due Luu

    2012-11-01

    Full Text Available Obtaining competitive quotations from suitably qualified subcontractors at tender time can significantly increase the chance of winning a construction project. Amidst an increasingly growing trend to subcontracting in Australia, selecting appropriate subcontractors for a construction project can be a daunting task requiring the analysis of complex and dynamic criteria such as past performance, suitable experience, track record of competitive pricing, financial stability and so on. Subcontractor selection is plagued with uncertainty and vagueness and these conditions are difficult to represent in generalised sets of rules. Decisions pertaining to the selection of subcontractors at tender time are usually based on the intuition and past experience of construction estimators. Case-based reasoning (CBR) may be an appropriate method of addressing the challenges of selecting subcontractors because CBR is able to harness the experiential knowledge of practitioners. This paper reviews the practicality and suitability of a CBR approach for subcontractor tender selection through the development of a prototype CBR procurement advisory system. In this system, subcontractor selection cases are represented by a set of attributes elicited from experienced construction estimators. The results indicate that CBR can enhance the appropriateness of the selection of subcontractors for construction projects.

  3. An Examination of Fluoxetine for the Treatment of Selective Mutism Using a Nonconcurrent Multiple-Baseline Single-Case Design Across 5 Cases.

    Science.gov (United States)

    Barterian, Justin A; Sanchez, Joel M; Magen, Jed; Siroky, Allison K; Mash, Brittany L; Carlson, John S

    2018-01-01

    This study examined the utility of fluoxetine in the treatment of 5 children, aged 5 to 14 years, diagnosed with selective mutism who also demonstrated symptoms of social anxiety. A nonconcurrent, randomized, multiple-baseline, single-case design with a single-blind placebo-controlled procedure was used. Parents and the study psychiatrist completed multiple methods of assessment including Direct Behavior Ratings and questionnaires. Treatment outcomes were evaluated by calculating effect sizes for each participant as an individual and for the participants as a group. Information regarding adverse effects with an emphasis on behavioral disinhibition and ratings of parental acceptance of the intervention was gathered. All 5 children experienced improvement in social anxiety, responsive speech, and spontaneous speech with medium to large effect sizes; however, children still met criteria for selective mutism at the end of the study. Adverse events were minimal, with only 2 children experiencing brief occurrences of minor behavioral disinhibition. Parents found the treatment highly acceptable.

  4. Optimizing Event Selection with the Random Grid Search

    Energy Technology Data Exchange (ETDEWEB)

    Bhat, Pushpalatha C. [Fermilab; Prosper, Harrison B. [Florida State U.; Sekmen, Sezen [Kyungpook Natl. U.; Stewart, Chip [Broad Inst., Cambridge

    2017-06-29

    The random grid search (RGS) is a simple, but efficient, stochastic algorithm to find optimal cuts that was developed in the context of the search for the top quark at Fermilab in the mid-1990s. The algorithm, and associated code, have been enhanced recently with the introduction of two new cut types, one of which has been successfully used in searches for supersymmetry at the Large Hadron Collider. The RGS optimization algorithm is described along with the recent developments, which are illustrated with two examples from particle physics. One explores the optimization of the selection of vector boson fusion events in the four-lepton decay mode of the Higgs boson and the other optimizes SUSY searches using boosted objects and the razor variables.
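
    A stripped-down sketch of the RGS idea (simulated two-variable signal and background samples and a simple s/sqrt(s+b) figure of merit, not the published code or cut types): candidate cut points are drawn from the signal events themselves and the best-performing rectangular cut is kept.

```python
# Minimal illustration of a random grid search (RGS) over rectangular cuts, in the
# spirit described above. Data arrays and the figure of merit are assumed for illustration.
import numpy as np

rng = np.random.default_rng(3)
signal = rng.normal(loc=[2.0, 3.0], scale=1.0, size=(5000, 2))        # two toy discriminating variables
background = rng.exponential(scale=[1.5, 2.0], size=(50000, 2))

best_cut, best_fom = None, -np.inf
for cut in signal[rng.choice(len(signal), size=1000, replace=False)]:
    s = np.sum(np.all(signal >= cut, axis=1))             # signal passing the "greater-than" cuts
    b = np.sum(np.all(background >= cut, axis=1))
    fom = s / np.sqrt(s + b) if s + b > 0 else 0.0         # simple s/sqrt(s+b) figure of merit
    if fom > best_fom:
        best_cut, best_fom = cut, fom
print("best cut:", best_cut, "significance proxy:", best_fom)
```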

  5. Continuous state branching processes in random environment: The Brownian case

    OpenAIRE

    Palau, Sandra; Pardo, Juan Carlos

    2015-01-01

    We consider continuous state branching processes that are perturbed by a Brownian motion. These processes are constructed as the unique strong solution of a stochastic differential equation. The long-term extinction and explosion behaviours are studied. In the stable case, the extinction and explosion probabilities are given explicitly. We find three regimes for the asymptotic behaviour of the explosion probability and, as in the case of branching processes in random environment, we find five...

  6. Non-random mating for selection with restricted rates of inbreeding and overlapping generations

    NARCIS (Netherlands)

    Sonesson, A.K.; Meuwissen, T.H.E.

    2002-01-01

    Minimum coancestry mating with a maximum of one offspring per mating pair (MC1) is compared with random mating schemes for populations with overlapping generations. Optimum contribution selection is used, whereby $\Delta F$ is restricted. For schemes with $\Delta F$ restricted to 0.25% per

  7. An uncommon case of random fire-setting behavior associated with Todd paralysis: A case report

    OpenAIRE

    Kanehisa, Masayuki; Morinaga, Katsuhiko; Kohno, Hisae; Maruyama, Yoshihiro; Ninomiya, Taiga; Ishitobi, Yoshinobu; Tanaka, Yoshihiro; Tsuru, Jusen; Hanada, Hiroaki; Yoshikawa, Tomoya; Akiyoshi, Jotaro

    2012-01-01

    Abstract Background The association between fire-setting behavior and psychiatric or medical disorders remains poorly understood. Although a link between fire-setting behavior and various organic brain disorders has been established, associations between fire setting and focal brain lesions have not yet been reported. Here, we describe the case of a 24-year-old first time arsonist who suffered Todd’s paralysis prior to the onset of a bizarre and random fire-setting behavior. Case presentation...

  8. Comparative Evaluations of Randomly Selected Four Point-of-Care Glucometer Devices in Addis Ababa, Ethiopia.

    Science.gov (United States)

    Wolde, Mistire; Tarekegn, Getahun; Kebede, Tedla

    2018-05-01

    Point-of-care glucometer (PoCG) devices play a significant role in self-monitoring of the blood sugar level, particularly in the follow-up of high blood sugar therapeutic response. The aim of this study was to evaluate blood glucose test results performed with four randomly selected glucometers on diabetic and control subjects versus the standard wet chemistry (hexokinase) method in Addis Ababa, Ethiopia. A prospective cross-sectional study was conducted on 200 randomly selected study participants (100 participants with diabetes and 100 healthy controls). Four randomly selected PoCG devices (CareSens N, DIAVUE Prudential, On Call Extra, i-QARE DS-W) were evaluated against the hexokinase method and ISO 15197:2003 and ISO 15197:2013 standards. The minimum and maximum blood sugar values were recorded by CareSens N (21 mg/dl) and the hexokinase method (498.8 mg/dl), respectively. The mean sugar values of all PoCG devices except On Call Extra showed significant differences compared with the reference hexokinase method. Meanwhile, all four PoCG devices had a strong positive relationship (>80%) with the reference method (hexokinase). On the other hand, none of the four PoCG devices fulfilled the minimum accuracy measurement set by ISO 15197:2003 and ISO 15197:2013 standards. In addition, the linear regression analysis revealed that all four selected PoCG devices overestimated the glucose concentrations. Overall, the measurements from the four selected PoCG devices correlated poorly with the standard reference method. Therefore, before introducing PoCG devices to the market, there should be a standardized evaluation platform for validation. Further similar large-scale studies on other PoCG devices also need to be undertaken.

  9. Geography and genography: prediction of continental origin using randomly selected single nucleotide polymorphisms

    Directory of Open Access Journals (Sweden)

    Ramoni Marco F

    2007-03-01

    Full Text Available Abstract Background Recent studies have shown that when individuals are grouped on the basis of genetic similarity, group membership corresponds closely to continental origin. There has been considerable debate about the implications of these findings in the context of larger debates about race and the extent of genetic variation between groups. Some have argued that clustering according to continental origin demonstrates the existence of significant genetic differences between groups and that these differences may have important implications for differences in health and disease. Others argue that clustering according to continental origin requires the use of large amounts of genetic data or specifically chosen markers and is indicative only of very subtle genetic differences that are unlikely to have biomedical significance. Results We used small numbers of randomly selected single nucleotide polymorphisms (SNPs from the International HapMap Project to train naïve Bayes classifiers for prediction of ancestral continent of origin. Predictive accuracy was tested on two independent data sets. Genetically similar groups should be difficult to distinguish, especially if only a small number of genetic markers are used. The genetic differences between continentally defined groups are sufficiently large that one can accurately predict ancestral continent of origin using only a minute, randomly selected fraction of the genetic variation present in the human genome. Genotype data from only 50 random SNPs was sufficient to predict ancestral continent of origin in our primary test data set with an average accuracy of 95%. Genetic variations informative about ancestry were common and widely distributed throughout the genome. Conclusion Accurate characterization of ancestry is possible using small numbers of randomly selected SNPs. The results presented here show how investigators conducting genetic association studies can use small numbers of arbitrarily
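
    The classification step can be sketched as follows (simulated genotypes with made-up group allele frequencies stand in for HapMap data, and scikit-learn's CategoricalNB is used as the naive Bayes classifier): train on a small random set of SNPs coded 0/1/2 and score held-out individuals.

```python
# Hedged sketch of the idea: a naive Bayes classifier on a small random subset of SNP
# genotypes (coded 0/1/2) predicting continental origin. Data are simulated with
# made-up allele frequencies; HapMap genotypes are not bundled here.
import numpy as np
from sklearn.naive_bayes import CategoricalNB
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n_per_group, n_snps = 200, 50
groups = ["AFR", "EUR", "ASN"]

# Each group gets its own random allele frequency per SNP; genotypes ~ Binomial(2, f).
freqs = {g: rng.uniform(0.05, 0.95, n_snps) for g in groups}
X = np.vstack([rng.binomial(2, freqs[g], size=(n_per_group, n_snps)) for g in groups])
y = np.repeat(groups, n_per_group)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = CategoricalNB(min_categories=3).fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```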

  10. Ericksonian hypnotherapy for selective mutism: A single-case study.

    Science.gov (United States)

    Cavarra, Mauro; Brizio, Adelina; Gava, Nicoletta

    2017-01-16

    Children affected by selective mutism don't speak in contexts that are unfamiliar to them or in which speaking is expected or required (e.g. school, kindergarten…). The disorder interferes with the child's normal activities, may have invalidating consequences in the long run if left untreated, is associated with anxiety conditions and is considered hard to treat. Contemporary research is still in need of methodologically rigorous outcome studies, and the results described in the small number of published randomized controlled trials and retrospective studies indicate cognitive-behavioral interventions lasting 20-24 sessions as the best therapeutic option. This case study, involving a 7-year-old girl, aims at providing preliminary evidence on the effectiveness of Ericksonian hypnosis in the treatment of this condition. A brief review of current evidence is provided. The case was treated by a licensed hypnotherapist, specialized in family therapy, in 5 sessions during the course of 3 months. After 3 months the symptoms of the client were resolved and the diagnosis was no longer applicable. Further improvements were observed in her mood, social skills and school performance. Conclusions: Ericksonian hypnotherapy led to the remission of the disorder and to the improvement of the general well-being of the client in 5 sessions, a much briefer time span than what is reported in the current literature. This paper represents the first step in the elaboration of replicable and reliable intervention principles.

  11. Suicide in Nepal: a modified psychological autopsy investigation from randomly selected police cases between 2013 and 2015.

    Science.gov (United States)

    Hagaman, Ashley K; Khadka, S; Lohani, S; Kohrt, B

    2017-12-01

    Yearly, 600,000 people complete suicide in low- and middle-income countries, accounting for 75% of the world's burden of suicide mortality. The highest regional rates are in South and East Asia. Nepal has one of the highest suicide rates in the world; however, few investigations exploring patterns surrounding both male and female suicides exist. This study used psychological autopsies to identify common factors, precipitating events, and warning signs in a diverse sample. Randomly sampled from 302 police case reports over 24 months, psychological autopsies were conducted for 39 completed suicide cases in one urban and one rural region of Nepal. In the total police sample (n = 302), 57.0% of deaths were male. Over 40% of deaths were 25 years or younger, including 65% of rural and 50.8% of female suicide deaths. We estimate the crude urban and rural suicide rates to be 16.1 and 22.8 per 100,000, respectively. Within our psychological autopsy sample, 38.5% met criteria for depression and only 23.1% informants believed that the deceased had thoughts of self-harm or suicide before death. Important warning signs include recent geographic migration, alcohol abuse, and family history of suicide. Suicide prevention strategies in Nepal should account for the lack of awareness about suicide risk among family members and early age of suicide completion, especially in rural and female populations. Given the low rates of ideation disclosure to friends and family, educating the general public about other signs of suicide may help prevention efforts in Nepal.

  12. Fuzzy Random λ-Mean SAD Portfolio Selection Problem: An Ant Colony Optimization Approach

    Science.gov (United States)

    Thakur, Gour Sundar Mitra; Bhattacharyya, Rupak; Mitra, Swapan Kumar

    2010-10-01

    To reach an investment goal, one has to select a combination of securities among different portfolios containing a large number of securities. The past records of each security alone do not guarantee the future return. As there are many uncertain factors which directly or indirectly influence the stock market, and there are also some newer stock markets which do not have enough historical data, experts' expectation and experience must be combined with the past records to generate an effective portfolio selection model. In this paper the return of a security is assumed to be a Fuzzy Random Variable Set (FRVS), where returns are sets of random numbers which are in turn fuzzy numbers. A new λ-Mean Semi Absolute Deviation (λ-MSAD) portfolio selection model is developed. The subjective opinions of the investors on the rate of return of each security are taken into consideration by introducing a pessimistic-optimistic parameter vector λ. The λ-MSAD model is preferred as it uses the absolute deviation of the rate of return of a portfolio, instead of the variance, as the measure of risk. As this model can be reduced to a Linear Programming Problem (LPP), it can be solved much faster than quadratic programming problems. Ant Colony Optimization (ACO) is used for solving the portfolio selection problem. ACO is a paradigm for designing meta-heuristic algorithms for combinatorial optimization problems. Data from the BSE is used for illustration.

  13. Pediatric selective mutism therapy: a randomized controlled trial.

    Science.gov (United States)

    Esposito, Maria; Gimigliano, Francesca; Barillari, Maria R; Precenzano, Francesco; Ruberto, Maria; Sepe, Joseph; Barillari, Umberto; Gimigliano, Raffaele; Militerni, Roberto; Messina, Giovanni; Carotenuto, Marco

    2017-10-01

    Selective mutism (SM) is a rare disease in children coded by DSM-5 as an anxiety disorder. Despite the disabling nature of the disease, there is still no specific treatment. The aims of this study were to verify the efficacy of six-month standard psychomotor treatment and the positive changes in lifestyle, in a population of children affected by SM. Randomized controlled trial registered in the European Clinical Trials Registry (EuDract 2015-001161-36). University third level Centre (Child and Adolescent Neuropsychiatry Clinic). The study population was composed of 67 children in group A (psychomotricity treatment) (35 M, mean age 7.84±1.15) and 71 children in group B (behavioral and educational counseling) (37 M, mean age 7.75±1.36). Psychomotor treatment was administered by trained child therapists in residential settings three times per week. Each child was treated for the whole period by the same therapist and all the therapists shared the same protocol. The standard psychomotor session length is 45 minutes. At T0 and after 6 months (T1) of treatment, patients underwent a behavioral and SM severity assessment. To verify the effects of the psychomotor management, the Child Behavior Checklist questionnaire (CBCL) and Selective Mutism Questionnaire (SMQ) were administered to the parents. After 6 months of psychomotor treatment, SM children showed a significant reduction in CBCL scores such as social relations, anxious/depressed, social problems and total problems (all P values significant), supporting psychomotor treatment for selective mutism, even if further studies are needed. The present study identifies psychomotricity as a safe and effective therapy for pediatric selective mutism.

  14. Primitive polynomials selection method for pseudo-random number generator

    Science.gov (United States)

    Anikin, I. V.; Alnajjar, Kh

    2018-01-01

    In this paper we suggest a method for selecting primitive polynomials of a special type. This kind of polynomial can be efficiently used as the characteristic polynomial of a linear feedback shift register in pseudo-random number generators. The proposed method consists of two basic steps: finding minimum-cost irreducible polynomials of the desired degree and applying primitivity tests to get the primitive ones. Finally, two primitive polynomials found by the proposed method were used in a pseudo-random number generator based on fuzzy logic (FRNG), which had been suggested before by the authors. The sequences generated by the new version of the FRNG have low correlation magnitude, high linear complexity and lower power consumption, and are more balanced with better statistical properties.
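
    Why primitivity matters for LFSR-based generators can be shown with a small sketch (a textbook degree-4 example, not the polynomials or the FRNG construction from the paper): a primitive characteristic polynomial yields the maximal period of 2^n - 1 nonzero states, while a non-primitive one cycles early.

```python
# Illustrative sketch of why primitive characteristic polynomials matter for LFSRs:
# with a primitive polynomial the register cycles through all 2^n - 1 nonzero states.
# The FRNG construction from the paper is not reproduced here.

def lfsr_period(taps, n_bits, seed=1):
    """Count the period of a left-shifting LFSR with the given feedback tap positions."""
    state, start, period = seed, seed, 0
    while True:
        feedback = 0
        for t in taps:                      # XOR of the tapped bits (1-indexed from LSB)
            feedback ^= (state >> (t - 1)) & 1
        state = ((state << 1) | feedback) & ((1 << n_bits) - 1)
        period += 1
        if state == start:
            return period

# Taps [4, 1] correspond to a known primitive degree-4 polynomial (x^4 + x + 1 up to reciprocal).
print(lfsr_period(taps=[4, 1], n_bits=4))   # 15 = 2^4 - 1, maximal length
# Taps [4, 2] correspond to x^4 + x^2 + 1, which is not primitive: a much shorter cycle.
print(lfsr_period(taps=[4, 2], n_bits=4))   # 6
```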

  15. Natural Selection as an Emergent Process: Instructional Implications

    Science.gov (United States)

    Cooper, Robert A.

    2017-01-01

    Student reasoning about cases of natural selection is often plagued by errors that stem from miscategorising selection as a direct, causal process, misunderstanding the role of randomness, and from the intuitive ideas of intentionality, teleology and essentialism. The common thread throughout many of these reasoning errors is a failure to apply…

  16. Case studies combined with or without concept maps improve critical thinking in hospital-based nurses: a randomized-controlled trial.

    Science.gov (United States)

    Huang, Yu-Chuan; Chen, Hsing-Hsia; Yeh, Mei-Ling; Chung, Yu-Chu

    2012-06-01

    Critical thinking (CT) is essential to the exercise of professional judgment. As nurses face increasingly complex health-care situations, critical thinking can promote appropriate clinical decision-making and improve the quality of nursing care. This study aimed to evaluate the effects of a program of case studies, alone (CS) or combined with concept maps (CSCM), on improving CT in clinical nurses. The study was a randomized controlled trial. The experimental group participated in a 16-week CSCM program, whereas the control group participated in a CS program of equal duration. A multistage randomization process was used to select and to assign participants, ultimately resulting in 67 nurses in each group. Data were collected before and after the program using the California Critical Thinking Skill Test (CCTST) and the California Critical Thinking Disposition Inventory (CCTDI). After the programs, there were significant differences between the two groups in the critical thinking skills of analysis, evaluation, inference, deduction, and induction. There was also an overall significant difference, and a significant difference in the specific disposition of open-mindedness. This study supports the application of case studies combined with concept maps as a hospital-based teaching strategy to promote the development of critical thinking skills and dispositions in nurses. The CSCM program resulted in greater improvements in all critical thinking skills, as well as in the overall and open-minded affective dispositions toward critical thinking, compared with the case studies alone. The most obvious improvement among the CSCM participants was in analytic skill and disposition. Further longitudinal studies and data collection from multisite evaluations in a range of geographic locales are warranted. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. Randomizing Roaches: Exploring the "Bugs" of Randomization in Experimental Design

    Science.gov (United States)

    Wagler, Amy; Wagler, Ron

    2014-01-01

    Understanding the roles of random selection and random assignment in experimental design is a central learning objective in most introductory statistics courses. This article describes an activity, appropriate for a high school or introductory statistics course, designed to teach the concepts, values and pitfalls of random selection and assignment…

  18. Materials selection for oxide-based resistive random access memories

    International Nuclear Information System (INIS)

    Guo, Yuzheng; Robertson, John

    2014-01-01

    The energies of atomic processes in resistive random access memories (RRAMs) are calculated for four typical oxides, HfO2, TiO2, Ta2O5, and Al2O3, to define a materials selection process. O vacancies have the lowest defect formation energy in the O-poor limit and dominate the processes. A band diagram defines the operating Fermi energy and O chemical potential range. It is shown how the scavenger metal can be used to vary the O vacancy formation energy, via controlling the O chemical potential, and the mean Fermi energy. The high endurance of Ta2O5 RRAM is related to its more stable amorphous phase and the adaptive lattice rearrangements of its O vacancy.

  19. Portfolio Manager Selection – A Case Study

    DEFF Research Database (Denmark)

    Christensen, Michael

    2017-01-01

    Within a delegated portfolio management setting, this paper presents a case study of how the manager selection process can be operationalized in practice. Investors have to pursue a thorough screening of potential portfolio managers in order to discover their quality, and this paper discusses how...

  20. Emergence of multilevel selection in the prisoner's dilemma game on coevolving random networks

    International Nuclear Information System (INIS)

    Szolnoki, Attila; Perc, Matjaz

    2009-01-01

    We study the evolution of cooperation in the prisoner's dilemma game, whereby a coevolutionary rule is introduced that molds the random topology of the interaction network in two ways. First, existing links are deleted whenever a player adopts a new strategy or its degree exceeds a threshold value; second, new links are added randomly after a given number of game iterations. These coevolutionary processes correspond to the generic formation of new links and deletion of existing links that, especially in human societies, appear frequently as a consequence of ongoing socialization, change of lifestyle or death. Due to the counteraction of deletions and additions of links the initial heterogeneity of the interaction network is qualitatively preserved, and thus cannot be held responsible for the observed promotion of cooperation. Indeed, the coevolutionary rule evokes the spontaneous emergence of a powerful multilevel selection mechanism, which despite the sustained random topology of the evolving network, maintains cooperation across the whole span of defection temptation values.

  1. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology

    Science.gov (United States)

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological datasets there is limited guidance on variable selection methods for RF modeling. Typically, e...

  2. Optimization of the Dutch Matrix Test by Random Selection of Sentences From a Preselected Subset

    Directory of Open Access Journals (Sweden)

    Rolph Houben

    2015-04-01

    Full Text Available Matrix tests are available for speech recognition testing in many languages. For an accurate measurement, a steep psychometric function of the speech materials is required. For existing tests, it would be beneficial if it were possible to further optimize the available materials by increasing the function’s steepness. The objective is to show if the steepness of the psychometric function of an existing matrix test can be increased by selecting a homogeneous subset of recordings with the steepest sentence-based psychometric functions. We took data from a previous multicenter evaluation of the Dutch matrix test (45 normal-hearing listeners). Based on half of the data set, first the sentences (140 out of 311) with a similar speech reception threshold and with the steepest psychometric function (≥9.7%/dB) were selected. Subsequently, the steepness of the psychometric function for this selection was calculated from the remaining (unused) second half of the data set. The calculation showed that the slope increased from 10.2%/dB to 13.7%/dB. The resulting subset did not allow the construction of enough balanced test lists. Therefore, the measurement procedure was changed to randomly select the sentences during testing. Random selection may interfere with a representative occurrence of phonemes. However, in our material, the median phonemic occurrence remained close to that of the original test. This finding indicates that phonemic occurrence is not a critical factor. The work highlights the possibility that existing speech tests might be improved by selecting sentences with a steep psychometric function.

  3. The role of context in case study selection: An international business perspective

    OpenAIRE

    Poulis, K; Poulis, E; Plakoyiannaki, ME

    2013-01-01

    The extant methodological literature has challenged case selection in qualitative case study research for being arbitrary or relying too much on convenience logic. This paper aims to address parts of such criticism on the rigour of case selection through the presentation of a sampling framework that promotes contextualisation and thoroughness of sampling decisions in the study of international phenomena. This framework emerged from an inductive process following an actual case study project i...

  4. Adherence of French GPs to chronic neuropathic pain clinical guidelines: results of a cross-sectional, randomized, "e" case-vignette survey.

    Directory of Open Access Journals (Sweden)

    Valéria Martinez

    Full Text Available BACKGROUND AND AIMS: The French Pain Society published guidelines for neuropathic pain management in 2010. Our aim was to evaluate the compliance of GPs with these guidelines three years later. METHODS: We used "e" case-vignette methodology for this non-interventional study. A national panel of randomly selected GPs was included. We used eight "e" case-vignettes relating to chronic pain, differing in terms of the type of pain (neuropathic/non-neuropathic), etiology (cancer, postoperative pain, low back pain with or without radicular pain, diabetes) and symptoms. GPs received two randomly selected consecutive "e" case-vignettes (with/without neuropathic pain). We analyzed their ability to recognize neuropathic pain and to prescribe appropriate first-line treatment. RESULTS: From the 1265 GPs in the database, we recruited 443 (35.0%), 334 of whom logged onto the web site (26.4%) and 319 (25.2%) of whom completed the survey. Among these GPs, 170 (53.3%) were aware of the guidelines, 136 (42.6%) were able to follow them, and 110 (34.5%) used the DN4 diagnostic tool. Sensitivity for neuropathic pain recognition was 87.8% (CI: 84.2%; 91.4%). However, postoperative neuropathic pain was less well diagnosed (77.9%; CI: 69.6%; 86.2%) than diabetic pain (95.2%; CI: 90.0%; 100.0%), cancer pain (90.6%; CI: 83.5%; 97.8%) and typical radicular pain (90.7%; CI: 84.9%; 96.5%). When neuropathic pain was correctly recognized, the likelihood of appropriate first-line treatment prescription was 90.6% (CI: 87.4%; 93.8%). The treatments proposed were pregabalin (71.8%), gabapentin (43.9%), amitriptyline (23.2%) and duloxetine (18.2%). However, ibuprofen (11%), acetaminophen-codeine (29.5%) and clonazepam (10%) were still prescribed. CONCLUSIONS: The compliance of GPs with clinical practice guidelines appeared to be satisfactory, but differed between etiologies.

  5. Emotional selection in memes: the case of urban legends.

    Science.gov (United States)

    Bell, C; Sternberg, E

    2001-12-01

    This article explores how much memes like urban legends succeed on the basis of informational selection (i.e., truth or a moral lesson) and emotional selection (i.e., the ability to evoke emotions like anger, fear, or disgust). The article focuses on disgust because its elicitors have been precisely described. In Study 1, with controls for informational factors like truth, people were more willing to pass along stories that elicited stronger disgust. Study 2 randomly sampled legends and created versions that varied in disgust; people preferred to pass along versions that produced the highest level of disgust. Study 3 coded legends for specific story motifs that produce disgust (e.g., ingestion of a contaminated substance) and found that legends that contained more disgust motifs were distributed more widely on urban legend Web sites. The conclusion discusses implications of emotional selection for the social marketplace of ideas.

  6. Materials selection for oxide-based resistive random access memories

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Yuzheng; Robertson, John [Engineering Department, Cambridge University, Cambridge CB2 1PZ (United Kingdom)

    2014-12-01

    The energies of atomic processes in resistive random access memories (RRAMs) are calculated for four typical oxides, HfO2, TiO2, Ta2O5, and Al2O3, to define a materials selection process. O vacancies have the lowest defect formation energy in the O-poor limit and dominate the processes. A band diagram defines the operating Fermi energy and O chemical potential range. It is shown how the scavenger metal can be used to vary the O vacancy formation energy, via controlling the O chemical potential, and the mean Fermi energy. The high endurance of Ta2O5 RRAM is related to its more stable amorphous phase and the adaptive lattice rearrangements of its O vacancy.

  7. Lingual biomechanics, case selection and success

    Directory of Open Access Journals (Sweden)

    Sanjay Labh

    2016-01-01

    Full Text Available A deeper understanding of lingual biomechanics is a prerequisite for success with the lingual appliance. The difference between the labial and lingual force systems must be understood and kept in mind during treatment planning, especially anchorage planning and extraction decision-making. As the point of force application changes, the force system changes completely in all planes. This article describes lingual biomechanics, anchorage planning, diagnostic considerations, treatment planning, and case selection criteria in lingual orthodontics.

  8. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Science.gov (United States)

    2010-07-01

    40 CFR 761.306 (Protection of Environment): Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as...

  9. From Protocols to Publications: A Study in Selective Reporting of Outcomes in Randomized Trials in Oncology

    Science.gov (United States)

    Raghav, Kanwal Pratap Singh; Mahajan, Sminil; Yao, James C.; Hobbs, Brian P.; Berry, Donald A.; Pentz, Rebecca D.; Tam, Alda; Hong, Waun K.; Ellis, Lee M.; Abbruzzese, James; Overman, Michael J.

    2015-01-01

    Purpose The decision by journals to append protocols to published reports of randomized trials was a landmark event in clinical trial reporting. However, limited information is available on how this initiative affected transparency and selective reporting of clinical trial data. Methods We analyzed 74 oncology-based randomized trials published in Journal of Clinical Oncology, the New England Journal of Medicine, and The Lancet in 2012. To ascertain integrity of reporting, we compared published reports with their respective appended protocols with regard to primary end points, nonprimary end points, unplanned end points, and unplanned analyses. Results A total of 86 primary end points were reported in 74 randomized trials; nine trials had greater than one primary end point. Nine trials (12.2%) had some discrepancy between their planned and published primary end points. A total of 579 nonprimary end points (median, seven per trial) were planned, of which 373 (64.4%; median, five per trial) were reported. A significant positive correlation was found between the number of planned and nonreported nonprimary end points (Spearman r = 0.66; P < .001). Twenty-eight studies (37.8%) reported a total of 65 unplanned end points, 52 (80.0%) of which were not identified as unplanned. Thirty-one (41.9%) and 19 (25.7%) of 74 trials reported a total of 52 unplanned analyses involving primary end points and 33 unplanned analyses involving nonprimary end points, respectively. Studies reported positive unplanned end points and unplanned analyses more frequently than negative outcomes in abstracts (unplanned end points odds ratio, 6.8; P = .002; unplanned analyses odds ratio, 8.4; P = .007). Conclusion Despite public and reviewer access to protocols, selective outcome reporting persists and is a major concern in the reporting of randomized clinical trials. To foster credible evidence-based medicine, additional initiatives are needed to minimize selective reporting. PMID:26304898

  10. The RANDOM computer program: A linear congruential random number generator

    Science.gov (United States)

    Miles, R. F., Jr.

    1986-01-01

    The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer described. This document describes the following: (1) The RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) The RANCYCLE and the ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
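
    The record describes the linear congruential form but not RANDOM's own parameter choices, so the sketch below is only a generic Python rendering of an LCG using the well-known Park–Miller "minimal standard" constants for illustration; it is not a port of the FORTRAN program.

```python
# Minimal sketch of a linear congruential generator (LCG):
#     x_{n+1} = (a * x_n + c) mod m,   output u_n = x_n / m in [0, 1).
# The constants (a = 16807, c = 0, m = 2**31 - 1) are the Park-Miller
# "minimal standard" choice, used here purely as an illustrative parameter set.

def lcg(seed, a=16807, c=0, m=2**31 - 1):
    """Yield an endless stream of pseudo-random floats in [0, 1)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m

gen = lcg(seed=12345)           # with c = 0 the seed must be nonzero
print([round(next(gen), 6) for _ in range(5)])
```

    Tools like the RANCYCLE and ARITH programs mentioned above assess such parameter choices (period length, arithmetic overflow); the sketch leaves those checks out.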

  11. Analysis and applications of a frequency selective surface via a random distribution method

    International Nuclear Information System (INIS)

    Xie Shao-Yi; Huang Jing-Jian; Yuan Nai-Chang; Liu Li-Guo

    2014-01-01

    A novel frequency selective surface (FSS) for reducing radar cross section (RCS) is proposed in this paper. This FSS is based on the random distribution method, so it can be called random surface. In this paper, the stacked patches serving as periodic elements are employed for RCS reduction. Previous work has demonstrated the efficiency by utilizing the microstrip patches, especially for the reflectarray. First, the relevant theory of the method is described. Then a sample of a three-layer variable-sized stacked patch random surface with a dimension of 260 mm×260 mm is simulated, fabricated, and measured in order to demonstrate the validity of the proposed design. For the normal incidence, the 8-dB RCS reduction can be achieved both by the simulation and the measurement in 8 GHz–13 GHz. The oblique incidence of 30° is also investigated, in which the 7-dB RCS reduction can be obtained in a frequency range of 8 GHz–14 GHz. (condensed matter: electronic structure, electrical, magnetic, and optical properties)

  12. On theoretical models of gene expression evolution with random genetic drift and natural selection.

    Directory of Open Access Journals (Sweden)

    Osamu Ogasawara

    2009-11-01

    Full Text Available The relative contributions of natural selection and random genetic drift are a major source of debate in the study of gene expression evolution, which is hypothesized to serve as a bridge from molecular to phenotypic evolution. It has been suggested that the conflict between views is caused by the lack of a definite model of the neutral hypothesis, which can describe the long-run behavior of evolutionary change in mRNA abundance. Therefore previous studies have used inadequate analogies with the neutral prediction of other phenomena, such as amino acid or nucleotide sequence evolution, as the null hypothesis of their statistical inference. In this study, we introduced two novel theoretical models, one based on neutral drift and the other assuming natural selection, by focusing on a common property of the distribution of mRNA abundance among a variety of eukaryotic cells, which reflects the result of long-term evolution. Our results demonstrated that (1) our models can reproduce two independently found phenomena simultaneously: the time development of gene expression divergence and Zipf's law of the transcriptome; (2) cytological constraints can be explicitly formulated to describe long-term evolution; (3) the model assuming that natural selection optimized relative mRNA abundance was more consistent with previously published observations than the model of optimized absolute mRNA abundances. The models introduced in this study give a formulation of evolutionary change in the mRNA abundance of each gene as a stochastic process, on the basis of previously published observations. This model provides a foundation for interpreting observed data in studies of gene expression evolution, including identifying an adequate time scale for discriminating the effect of natural selection from that of random genetic drift of selectively neutral variations.

  13. From Protocols to Publications: A Study in Selective Reporting of Outcomes in Randomized Trials in Oncology.

    Science.gov (United States)

    Raghav, Kanwal Pratap Singh; Mahajan, Sminil; Yao, James C; Hobbs, Brian P; Berry, Donald A; Pentz, Rebecca D; Tam, Alda; Hong, Waun K; Ellis, Lee M; Abbruzzese, James; Overman, Michael J

    2015-11-01

    The decision by journals to append protocols to published reports of randomized trials was a landmark event in clinical trial reporting. However, limited information is available on how this initiative affected transparency and selective reporting of clinical trial data. We analyzed 74 oncology-based randomized trials published in Journal of Clinical Oncology, the New England Journal of Medicine, and The Lancet in 2012. To ascertain integrity of reporting, we compared published reports with their respective appended protocols with regard to primary end points, nonprimary end points, unplanned end points, and unplanned analyses. A total of 86 primary end points were reported in 74 randomized trials; nine trials had greater than one primary end point. Nine trials (12.2%) had some discrepancy between their planned and published primary end points. A total of 579 nonprimary end points (median, seven per trial) were planned, of which 373 (64.4%; median, five per trial) were reported. A significant positive correlation was found between the number of planned and nonreported nonprimary end points (Spearman r = 0.66; P < .001). Twenty-eight studies (37.8%) reported a total of 65 unplanned end points, 52 (80.0%) of which were not identified as unplanned. Thirty-one (41.9%) and 19 (25.7%) of 74 trials reported a total of 52 unplanned analyses involving primary end points and 33 unplanned analyses involving nonprimary end points, respectively. Studies reported positive unplanned end points and unplanned analyses more frequently than negative outcomes in abstracts (unplanned end points odds ratio, 6.8; P = .002; unplanned analyses odds ratio, 8.4; P = .007). Despite public and reviewer access to protocols, selective outcome reporting persists and is a major concern in the reporting of randomized clinical trials. To foster credible evidence-based medicine, additional initiatives are needed to minimize selective reporting. © 2015 by American Society of Clinical Oncology.

  14. Female monozygotic twins with selective mutism--a case report.

    Science.gov (United States)

    Sharkey, L; Mc Nicholas, F

    2006-04-01

    Selective mutism is a rare social anxiety disorder characterized by a total lack of speech in certain specific situations despite the ability to speak in others. Both genetic and psychosocial factors are thought to be involved in its presentation, persistence, and response to treatment. This case report describes a case of young female monozygotic twins who presented with selective mutism and their treatment spanning a 2-year period. It highlights the strong genetic association along with environmental factors such as social isolation and consequences of maternal social phobia, all contributing to treatment resistance, despite an intensive multimodal biopsychosocial approach. General issues related to the difficulties in treating monozygotic twins are also addressed.

  15. The lack of selection bias in a snowball sampled case-control study on drug abuse.

    Science.gov (United States)

    Lopes, C S; Rodrigues, L C; Sichieri, R

    1996-12-01

    Friend controls in matched case-control studies can be a potential source of bias based on the assumption that friends are more likely to share exposure factors. This study evaluates the role of selection bias in a case-control study that used the snowball sampling method based on friendship for the selection of cases and controls. The cases selected for the study were drug abusers located in the community. Exposure was defined by the presence of at least one psychiatric diagnosis. Psychiatric and drug abuse/dependence diagnoses were made according to the Diagnostic and Statistical Manual of Mental Disorders (DSM-III-R) criteria. Cases and controls were matched on sex, age and friendship. The measurement of selection bias was made through the comparison of the proportion of exposed controls selected by exposed cases (p1) with the proportion of exposed controls selected by unexposed cases (p2). If p1 = p2, then selection bias should not occur. The observed distribution of the 185 matched pairs having at least one psychiatric disorder showed a p1 value of 0.52 and a p2 value of 0.51, indicating no selection bias in this study. Our findings support the idea that the use of friend controls can produce a valid basis for a case-control study.

  16. High Entropy Random Selection Protocols

    NARCIS (Netherlands)

    H. Buhrman (Harry); M. Christandl (Matthias); M. Koucky (Michal); Z. Lotker (Zvi); B. Patt-Shamir; M. Charikar; K. Jansen; O. Reingold; J. Rolim

    2007-01-01

    textabstractIn this paper, we construct protocols for two parties that do not trust each other, to generate random variables with high Shannon entropy. We improve known bounds for the trade off between the number of rounds, length of communication and the entropy of the outcome.

  17. Fixation probability in a two-locus intersexual selection model.

    Science.gov (United States)

    Durand, Guillermo; Lessard, Sabin

    2016-06-01

    We study a two-locus model of intersexual selection in a finite haploid population reproducing according to a discrete-time Moran model with a trait locus expressed in males and a preference locus expressed in females. We show that the probability of ultimate fixation of a single mutant allele for a male ornament introduced at random at the trait locus given any initial frequency state at the preference locus is increased by weak intersexual selection and recombination, weak or strong. Moreover, this probability exceeds the initial frequency of the mutant allele even in the case of a costly male ornament if intersexual selection is not too weak. On the other hand, the probability of ultimate fixation of a single mutant allele for a female preference towards a male ornament introduced at random at the preference locus is increased by weak intersexual selection and weak recombination if the female preference is not costly, and is strong enough in the case of a costly male ornament. The analysis relies on an extension of the ancestral recombination-selection graph for samples of haplotypes to take into account events of intersexual selection, while the symbolic calculation of the fixation probabilities is made possible in a reasonable time by an optimizing algorithm. Copyright © 2016 Elsevier Inc. All rights reserved.
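
    The paper's two-locus intersexual selection analysis is well beyond a short sketch, but the basic object it works with, the fixation probability of a single mutant in a finite Moran population, is easy to estimate by simulation. The snippet below is a simplified single-locus haploid Moran model (no preference locus, no recombination); parameter values are illustrative, and it is meant only to show how weak selection lifts the fixation probability above the neutral value 1/N.

```python
import random

def moran_fixation(N=50, s=0.05, replicates=2000, seed=1):
    """Estimate the fixation probability of a single mutant with selective
    advantage s in a haploid Moran model of constant population size N."""
    rng = random.Random(seed)
    fixed = 0
    for _ in range(replicates):
        i = 1                                   # start with one mutant copy
        while 0 < i < N:
            # birth: reproducing individual chosen in proportion to fitness
            p_birth_mut = i * (1 + s) / (i * (1 + s) + (N - i))
            # death: individual chosen uniformly at random
            i += int(rng.random() < p_birth_mut) - int(rng.random() < i / N)
        fixed += (i == N)
    return fixed / replicates

print("with selection :", moran_fixation())         # above the neutral value
print("neutral (s = 0):", moran_fixation(s=0.0))     # close to 1/N = 0.02
```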

  18. Integrated Behavior Therapy for Selective Mutism: a randomized controlled pilot study.

    Science.gov (United States)

    Bergman, R Lindsey; Gonzalez, Araceli; Piacentini, John; Keller, Melody L

    2013-10-01

    To evaluate the feasibility, acceptability, and preliminary efficacy of a novel behavioral intervention for reducing symptoms of selective mutism and increasing functional speech. A total of 21 children ages 4 to 8 with primary selective mutism were randomized to 24 weeks of Integrated Behavior Therapy for Selective Mutism (IBTSM) or a 12-week Waitlist control. Clinical outcomes were assessed using blind independent evaluators, parent-, and teacher-report, and an objective behavioral measure. Treatment recipients completed a three-month follow-up to assess durability of treatment gains. Data indicated increased functional speaking behavior post-treatment as rated by parents and teachers, with a high rate of treatment responders as rated by blind independent evaluators (75%). Conversely, children in the Waitlist comparison group did not experience significant improvements in speaking behaviors. Children who received IBTSM also demonstrated significant improvements in number of words spoken at school compared to baseline; however, significant group differences did not emerge. Treatment recipients also experienced significant reductions in social anxiety per parent, but not teacher, report. Clinical gains were maintained over the 3-month follow-up. IBTSM appears to be a promising new intervention that is efficacious in increasing functional speaking behaviors, feasible, and acceptable to parents and teachers. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Treatment selection in a randomized clinical trial via covariate-specific treatment effect curves.

    Science.gov (United States)

    Ma, Yunbei; Zhou, Xiao-Hua

    2017-02-01

    For time-to-event data in a randomized clinical trial, we proposed two new methods for selecting an optimal treatment for a patient based on the covariate-specific treatment effect curve, which is used to represent the clinical utility of a predictive biomarker. To select an optimal treatment for a patient with a specific biomarker value, we proposed pointwise confidence intervals for each covariate-specific treatment effect curve and for the difference between the covariate-specific treatment effect curves of two treatments. Furthermore, to select an optimal treatment for a future biomarker-defined subpopulation of patients, we proposed confidence bands for each covariate-specific treatment effect curve and for the difference between each pair of covariate-specific treatment effect curves over a fixed interval of biomarker values. We constructed the confidence bands based on a resampling technique. We also conducted simulation studies to evaluate finite-sample properties of the proposed estimation methods. Finally, we illustrated the application of the proposed method in a real-world data set.
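
    The sketch below illustrates the general idea of a covariate-specific treatment effect curve with pointwise resampling-based intervals. It is not the authors' estimator: the paper handles time-to-event outcomes, whereas this stand-in uses a simulated continuous outcome, a simple kernel-weighted difference in arm means, and a percentile bootstrap; all parameter values are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated randomized trial: biomarker x, arm 0/1, continuous outcome y
# whose treatment benefit grows with the biomarker.
n = 400
x = rng.uniform(0, 1, n)
arm = rng.integers(0, 2, n)
y = 1.0 + arm * (2.0 * x - 0.5) + rng.normal(0, 1, n)

def effect_curve(x, arm, y, grid, h=0.1):
    """Kernel-weighted difference in mean outcome (arm 1 minus arm 0) at
    each grid point: a crude covariate-specific treatment effect curve."""
    out = []
    for g in grid:
        w = np.exp(-0.5 * ((x - g) / h) ** 2)
        out.append(np.average(y[arm == 1], weights=w[arm == 1])
                   - np.average(y[arm == 0], weights=w[arm == 0]))
    return np.array(out)

grid = np.linspace(0.1, 0.9, 9)
est = effect_curve(x, arm, y, grid)

# Pointwise percentile-bootstrap confidence intervals (200 resamples).
boot = np.array([effect_curve(x[b], arm[b], y[b], grid)
                 for b in (rng.integers(0, n, n) for _ in range(200))])
lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
for g, e, l, u in zip(grid, est, lo, hi):
    print(f"biomarker {g:.1f}: effect {e:+.2f}, 95% CI [{l:+.2f}, {u:+.2f}]")
```

    Simultaneous confidence bands over the whole biomarker interval, as proposed in the paper, would additionally require calibrating a common critical value across grid points rather than using pointwise percentiles.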

  20. Recruitment and quality academic staff selection: the case study of ...

    African Journals Online (AJOL)

    The sources from which organizations decide to select personnel are central to their ability to survive, adapt, and grow (Noe et al, 2004:171). The paper examines a case study of recruitment and selection of quality academic staff into Covenant University. The paper addresses the factors that could affect recruitment ...

  1. Selection gradients, the opportunity for selection, and the coefficient of determination.

    Science.gov (United States)

    Moorad, Jacob A; Wade, Michael J

    2013-03-01

    We derive the relationship between R(2) (the coefficient of determination), selection gradients, and the opportunity for selection for univariate and multivariate cases. Our main result is to show that the portion of the opportunity for selection that is caused by variation for any trait is equal to the product of its selection gradient and its selection differential. This relationship is a corollary of the first and second fundamental theorems of natural selection, and it permits one to investigate the portions of the total opportunity for selection that are involved in directional selection, stabilizing (and diversifying) selection, and correlational selection, which is important to morphological integration. It also allows one to determine the fraction of fitness variation not explained by variation in measured phenotypes and therefore attributable to random (or, at least, unknown) influences. We apply our methods to a human data set to show how sex-specific mating success as a component of fitness variance can be decoupled from that owing to prereproductive mortality. By quantifying linear sources of sexual selection and quadratic sources of sexual selection, we illustrate that the former is stronger in males, while the latter is stronger in females.
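
    The central relation can be written out in standard quantitative-genetics notation (relative fitness w, selection differential S_i, selection gradient β_i); the symbols below are conventional choices, not necessarily the authors' own.

```latex
% Opportunity for selection and its trait-wise decomposition.
\[
  I \;=\; \frac{\operatorname{Var}(W)}{\bar{W}^{2}} \;=\; \operatorname{Var}(w),
  \qquad w = W/\bar{W},
\]
\[
  \operatorname{Var}(w) \;=\;
  \underbrace{\textstyle\sum_{i} \beta_{i} S_{i}}_{\text{explained by measured traits}}
  \;+\;
  \underbrace{(1 - R^{2})\,\operatorname{Var}(w)}_{\text{unexplained (random) variation}},
\]
% where S_i = Cov(w, z_i) is the selection differential of trait z_i and
% beta_i is its partial regression coefficient (selection gradient) on
% relative fitness; the explained share equals R^2 Var(w).
```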

  2. A Randomized Trial of Probation Case Management for Drug-Involved Women Offenders

    Science.gov (United States)

    Guydish, Joseph; Chan, Monica; Bostrom, Alan; Jessup, Martha A.; Davis, Thomas B.; Marsh, Cheryl

    2011-01-01

    This article reports findings from a clinical trial of a probation case management (PCM) intervention for drug-involved women offenders. Participants were randomly assigned to PCM (n = 92) or standard probation (n = 91) and followed for 12 months using measures of substance abuse, psychiatric symptoms, social support, and service utilization.…

  3. Intermittent random walks for an optimal search strategy: one-dimensional case

    International Nuclear Information System (INIS)

    Oshanin, G; Wio, H S; Lindenberg, K; Burlatsky, S F

    2007-01-01

    We study the search kinetics of an immobile target by a concentration of randomly moving searchers. The object of the study is to optimize the probability of detection within the constraints of our model. The target is hidden on a one-dimensional lattice in the sense that searchers have no a priori information about where it is, and may detect it only upon encounter. The searchers perform random walks in discrete time n = 0,1,2,...,N, where N is the maximal time the search process is allowed to run. With probability α the searchers step to a nearest-neighbour site, and with probability (1-α) they leave the lattice and stay off until they land back on the lattice at a fixed distance L away from the departure point. The random walk is thus intermittent. We calculate the probability P_N that the target remains undetected up to the maximal search time N, and seek to minimize this probability. We find that P_N is a non-monotonic function of α, and show that there is an optimal choice α_opt(N) of α well within the intermittent regime, 0 < α_opt(N) < 1, for which P_N can be orders of magnitude smaller compared to the 'pure' random walk cases α = 0 and α = 1
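
    A rough feel for the setup can be had from a Monte Carlo sketch. The snippet below simulates a single intermittent searcher on a ring (a simplification: the off-lattice excursion is collapsed into an instantaneous jump of length L, and only one searcher is used), and estimates the non-detection probability for several values of α; all sizes and counts are illustrative, and this is not the paper's analytical calculation.

```python
import random

def non_detection_prob(alpha, L=10, N=200, size=201, trials=3000, seed=7):
    """Crude Monte Carlo estimate of P_N, the probability that a single
    intermittent searcher misses a target hidden uniformly at random on a
    ring of `size` sites within N steps.  With probability alpha it hops to
    a nearest neighbour; otherwise it relocates a fixed distance L away."""
    rng = random.Random(seed)
    missed = 0
    for _ in range(trials):
        target, pos = rng.randrange(size), 0
        found = (pos == target)
        for _ in range(N):
            if found:
                break
            step = rng.choice((-1, 1)) if rng.random() < alpha else rng.choice((-L, L))
            pos = (pos + step) % size
            found = (pos == target)
        missed += not found
    return missed / trials

for a in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"alpha = {a:.2f}   estimated P_N = {non_detection_prob(a):.3f}")
```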

  4. Selective decontamination in pediatric liver transplants. A randomized prospective study.

    Science.gov (United States)

    Smith, S D; Jackson, R J; Hannakan, C J; Wadowsky, R M; Tzakis, A G; Rowe, M I

    1993-06-01

    Although it has been suggested that selective decontamination of the digestive tract (SDD) decreases postoperative aerobic Gram-negative and fungal infections in orthotopic liver transplantation (OLT), no controlled trials exist in pediatric patients. This prospective, randomized controlled study of 36 pediatric OLT patients examines the effect of short-term SDD on postoperative infection and digestive tract flora. Patients were randomized into two groups. The control group received perioperative parenteral antibiotics only. The SDD group received in addition polymyxin E, tobramycin, and amphotericin B enterally and by oropharyngeal swab postoperatively until oral intake was tolerated (6 +/- 4 days). Indications for operation, preoperative status, age, and intensive care unit and hospital length of stay were no different in SDD (n = 18) and control (n = 18) groups. A total of 14 Gram-negative infections (intraabdominal abscess 7, septicemia 5, pneumonia 1, urinary tract 1) developed in the 36 patients studied. Mortality was not significantly different in the two groups. However, there were significantly fewer patients with Gram-negative infections in the SDD group: 3/18 patients (11%) vs. 11/18 patients (50%) in the control group, P < 0.001. There was also significant reduction in aerobic Gram-negative flora in the stool and pharynx in patients receiving SDD. Gram-positive and anaerobic organisms were unaffected. We conclude that short-term postoperative SDD significantly reduces Gram-negative infections in pediatric OLT patients.

  5. Simultaneous feature selection and parameter optimisation using an artificial ant colony: case study of melting point prediction

    Directory of Open Access Journals (Sweden)

    Nigsch Florian

    2008-10-01

    Full Text Available Background We present a novel feature selection algorithm, Winnowing Artificial Ant Colony (WAAC), that performs simultaneous feature selection and model parameter optimisation for the development of predictive quantitative structure-property relationship (QSPR) models. The WAAC algorithm is an extension of the modified ant colony algorithm of Shen et al. (J Chem Inf Model 2005, 45: 1024–1029). We test the ability of the algorithm to develop a predictive partial least squares model for the Karthikeyan dataset (J Chem Inf Model 2005, 45: 581–590) of melting point values. We also test its ability to perform feature selection on a support vector machine model for the same dataset. Results Starting from an initial set of 203 descriptors, the WAAC algorithm selected a PLS model with 68 descriptors which has an RMSE on an external test set of 46.6°C and R2 of 0.51. The number of components chosen for the model was 49, which was close to optimal for this feature selection. The selected SVM model has 28 descriptors (cost of 5, ε of 0.21) and an RMSE of 45.1°C and R2 of 0.54. This model outperforms a kNN model (RMSE of 48.3°C, R2 of 0.47) for the same data and has similar performance to a Random Forest model (RMSE of 44.5°C, R2 of 0.55). However it is much less prone to bias at the extremes of the range of melting points as shown by the slope of the line through the residuals: -0.43 for WAAC/SVM, -0.53 for Random Forest. Conclusion With a careful choice of objective function, the WAAC algorithm can be used to optimise machine learning and regression models that suffer from overfitting. Where model parameters also need to be tuned, as is the case with support vector machine and partial least squares models, it can optimise these simultaneously. The moving probabilities used by the algorithm are easily interpreted in terms of the best and current models of the ants, and the winnowing procedure promotes the removal of irrelevant descriptors.

  6. Simultaneous feature selection and parameter optimisation using an artificial ant colony: case study of melting point prediction.

    Science.gov (United States)

    O'Boyle, Noel M; Palmer, David S; Nigsch, Florian; Mitchell, John Bo

    2008-10-29

    We present a novel feature selection algorithm, Winnowing Artificial Ant Colony (WAAC), that performs simultaneous feature selection and model parameter optimisation for the development of predictive quantitative structure-property relationship (QSPR) models. The WAAC algorithm is an extension of the modified ant colony algorithm of Shen et al. (J Chem Inf Model 2005, 45: 1024-1029). We test the ability of the algorithm to develop a predictive partial least squares model for the Karthikeyan dataset (J Chem Inf Model 2005, 45: 581-590) of melting point values. We also test its ability to perform feature selection on a support vector machine model for the same dataset. Starting from an initial set of 203 descriptors, the WAAC algorithm selected a PLS model with 68 descriptors which has an RMSE on an external test set of 46.6 degrees C and R2 of 0.51. The number of components chosen for the model was 49, which was close to optimal for this feature selection. The selected SVM model has 28 descriptors (cost of 5, epsilon of 0.21) and an RMSE of 45.1 degrees C and R2 of 0.54. This model outperforms a kNN model (RMSE of 48.3 degrees C, R2 of 0.47) for the same data and has similar performance to a Random Forest model (RMSE of 44.5 degrees C, R2 of 0.55). However it is much less prone to bias at the extremes of the range of melting points as shown by the slope of the line through the residuals: -0.43 for WAAC/SVM, -0.53 for Random Forest. With a careful choice of objective function, the WAAC algorithm can be used to optimise machine learning and regression models that suffer from overfitting. Where model parameters also need to be tuned, as is the case with support vector machine and partial least squares models, it can optimise these simultaneously. The moving probabilities used by the algorithm are easily interpreted in terms of the best and current models of the ants, and the winnowing procedure promotes the removal of irrelevant descriptors.
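
    The sketch below is not the WAAC algorithm itself; it is a stripped-down "pheromone"-style winnowing loop on synthetic data, with cross-validated ordinary least squares standing in for the PLS/SVM models and no simultaneous parameter tuning. It is meant only to illustrate the core idea the abstract describes: per-descriptor selection probabilities that are reinforced toward the best model found so far.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 samples, 30 descriptors, only the first 5 carry signal.
n, p, k_true = 200, 30, 5
X = rng.normal(size=(n, p))
w_true = np.zeros(p)
w_true[:k_true] = rng.normal(size=k_true)
y = X @ w_true + 0.5 * rng.normal(size=n)

def cv_rmse(cols, folds=5):
    """5-fold cross-validated RMSE of ordinary least squares on `cols`."""
    fold = np.arange(n) % folds
    errs = []
    for f in range(folds):
        tr, te = fold != f, fold == f
        A = np.c_[np.ones(tr.sum()), X[tr][:, cols]]
        coef, *_ = np.linalg.lstsq(A, y[tr], rcond=None)
        pred = np.c_[np.ones(te.sum()), X[te][:, cols]] @ coef
        errs.append(np.sqrt(np.mean((y[te] - pred) ** 2)))
    return float(np.mean(errs))

# Per-descriptor selection probabilities ("pheromone"), reinforced toward
# the best subset found so far and slowly evaporated otherwise.
prob = np.full(p, 0.5)
best_cols = np.arange(p)
best_score = cv_rmse(best_cols)
for _ in range(40):                      # iterations
    for _ in range(10):                  # "ants" per iteration
        cols = np.where(rng.random(p) < prob)[0]
        if cols.size and (score := cv_rmse(cols)) < best_score:
            best_score, best_cols = score, cols
    target = np.zeros(p)
    target[best_cols] = 1.0
    prob = 0.9 * prob + 0.1 * target

print("kept descriptors:", best_cols, " CV RMSE: %.3f" % best_score)
```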

  7. Day-ahead load forecast using random forest and expert input selection

    International Nuclear Information System (INIS)

    Lahouar, A.; Ben Hadj Slama, J.

    2015-01-01

    Highlights: • A model based on random forests for short term load forecast is proposed. • An expert feature selection is added to refine inputs. • Special attention is paid to customers' behavior, load profile and special holidays. • The model is flexible and able to handle complex load signals. • A technical comparison is performed to assess the forecast accuracy. - Abstract: The electrical load forecast is getting more and more important in recent years due to the electricity market deregulation and integration of renewable resources. To overcome the incoming challenges and ensure accurate power prediction for different time horizons, sophisticated intelligent methods are elaborated. Utilization of intelligent forecast algorithms is among main characteristics of smart grids, and is an efficient tool to face uncertainty. Several crucial tasks of power operators such as load dispatch rely on the short term forecast, thus it should be as accurate as possible. To this end, this paper proposes a short term load predictor, able to forecast the next 24 h of load. Using random forest, characterized by immunity to parameter variations and internal cross validation, the model is constructed following an online learning process. The inputs are refined by expert feature selection using a set of if–then rules, in order to include the user's own specifications about the country's weather or market, and to generalize the forecast ability. The proposed approach is tested through a real historical set from the Tunisian Power Company, and the simulation shows accurate and satisfactory results for one day in advance, with an average error rarely exceeding 2.3%. The model is validated for regular working days and weekends, and special attention is paid to moving holidays, following a non-Gregorian calendar.
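
    As a rough illustration of the pipeline described above (lagged load inputs refined by if-then rules, fed to a random forest for a day-ahead prediction), the sketch below trains scikit-learn's RandomForestRegressor on synthetic hourly load. The data, the two "expert" rules, and all parameters are invented stand-ins, not the Tunisian data set or the paper's rule base.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Two years of synthetic hourly load with daily and weekly cycles plus noise.
hours = np.arange(2 * 365 * 24)
load = (100 + 20 * np.sin(2 * np.pi * hours / 24)
            + 10 * np.sin(2 * np.pi * hours / (24 * 7))
            + rng.normal(0, 3, hours.size))

def features(t):
    """Candidate inputs: calendar terms, lagged loads, and two flags set by
    illustrative if-then rules (weekend and evening-peak behaviour)."""
    hour, dow = t % 24, (t // 24) % 7
    return [hour, dow,
            load[t - 24], load[t - 48], load[t - 168],
            1.0 if dow >= 5 else 0.0,               # rule: weekend
            1.0 if 18 <= hour <= 21 else 0.0]       # rule: evening peak

t_all = np.arange(168, hours.size - 24)
X = np.array([features(t) for t in t_all])
y = load[t_all + 24]                                # day-ahead (t + 24 h) target

split = int(0.8 * len(t_all))
rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(X[:split], y[:split])
pred = rf.predict(X[split:])
print("day-ahead MAPE: %.2f%%"
      % (100 * np.mean(np.abs((y[split:] - pred) / y[split:]))))
```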

  8. The importance of indigenous games: The selected cases of ...

    African Journals Online (AJOL)

    The importance of indigenous games: The selected cases of Indigenous games in South Africa. ... do not have enough time to transfer their skills and knowledge of indigenous games to the younger generation. ...

  9. Convergence analysis for Latin-hypercube lattice-sample selection strategies for 3D correlated random hydraulic-conductivity fields

    OpenAIRE

    Simuta-Champo, R.; Herrera-Zamarrón, G. S.

    2010-01-01

    The Monte Carlo technique provides a natural method for evaluating uncertainties. The uncertainty is represented by a probability distribution or by related quantities such as statistical moments. When the groundwater flow and transport governing equations are solved and the hydraulic conductivity field is treated as a random spatial function, the hydraulic head, velocities and concentrations also become random spatial functions. When that is the case, for the stochastic simulation of groundw...
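
    Only the basic Latin hypercube draw is sketched below, with an independent log-normal transform as a stand-in for hydraulic conductivity at a few points; the 3D spatially correlated fields studied in the paper would require an additional step to impose the correlation structure, which is not shown. Sample sizes are illustrative.

```python
import numpy as np
from scipy.stats import norm

def latin_hypercube(n_samples, n_dims, rng):
    """Basic Latin hypercube sample on the unit hypercube: every row falls
    in a distinct one-of-n_samples stratum along every dimension."""
    u = (rng.random((n_samples, n_dims))
         + np.arange(n_samples)[:, None]) / n_samples
    for d in range(n_dims):
        u[:, d] = rng.permutation(u[:, d])    # shuffle strata per dimension
    return u

rng = np.random.default_rng(0)
u = latin_hypercube(20, 3, rng)

# Stand-in for hydraulic conductivity at three locations: ln K ~ N(0, 1),
# independent marginals only (no spatial correlation imposed here).
K = np.exp(norm.ppf(u))
print(K.round(3))
```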

  10. Treatment of Implant Exposure due to Skin Necroses after Skin Sparing Mastectomy: Initial Experiences Using a Not Selective Random Epigastric Flap.

    Science.gov (United States)

    Echazarreta-Gallego, Estíbaliz; Pola-Bandrés, Guillermo; Arribas-Del Amo, María Dolores; Gil-Romea, Ismael; Sousa-Domínguez, Ramón; Güemes-Sánchez, Antonio

    2017-10-01

    Breast prostheses exposure is probably the most devastating complication after a skin sparing mastectomy (SSM) and implant-based, one-stage, breast reconstruction. This complication may occur in the immediate post-operative period or in the weeks and even months after the procedure. In most cases, the cause is poor skin coverage of the implant due to skin necrosis. Eight consecutive cases of implant exposure (or risk of exposure) due to skin necrosis in SSM patients were treated over a period of 5 years; all patients were treated using a random epigastric rotation flap, executed by the same medical team. A random epigastric flap (island or conventional rotation flap) was used to cover the skin defect. All the patients completed the procedure and all prostheses were saved; there were no cases of flap necrosis or infection. Cases of skin necrosis after SSM and immediate implant reconstruction, in which the implant is at risk of exposure, can be successfully treated with a random epigastric rotation flap.

  11. Distribution of orientation selectivity in recurrent networks of spiking neurons with different random topologies.

    Science.gov (United States)

    Sadeh, Sadra; Rotter, Stefan

    2014-01-01

    Neurons in the primary visual cortex are more or less selective for the orientation of a light bar used for stimulation. A broad distribution of individual grades of orientation selectivity has in fact been reported in all species. A possible reason for emergence of broad distributions is the recurrent network within which the stimulus is being processed. Here we compute the distribution of orientation selectivity in randomly connected model networks that are equipped with different spatial patterns of connectivity. We show that, for a wide variety of connectivity patterns, a linear theory based on firing rates accurately approximates the outcome of direct numerical simulations of networks of spiking neurons. Distance dependent connectivity in networks with a more biologically realistic structure does not compromise our linear analysis, as long as the linearized dynamics, and hence the uniform asynchronous irregular activity state, remain stable. We conclude that linear mechanisms of stimulus processing are indeed responsible for the emergence of orientation selectivity and its distribution in recurrent networks with functionally heterogeneous synaptic connectivity.
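
    The abstract's point, that a linear rate description reproduces the distribution of orientation selectivity in a randomly connected network, can be illustrated with a small numerical sketch: random connectivity, weakly tuned feedforward drive, linear steady-state rates r = (I − J)⁻¹h, and a per-neuron selectivity index. All parameters below are illustrative, rates are not rectified, and the connectivity lacks the distance-dependent structure the paper also considers.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 400                                    # recurrent neurons
thetas = np.linspace(0, np.pi, 12, endpoint=False)

# Sparse random connectivity (10% density, weakly inhibitory on average),
# rescaled so the linear dynamics are stable (spectral radius < 1).
J = (rng.random((N, N)) < 0.1) * rng.normal(-0.1, 1.0, (N, N)) / np.sqrt(0.1 * N)
J *= 0.8 / np.abs(np.linalg.eigvals(J)).max()

# Weakly tuned feedforward drive with random preferred orientations.
pref = rng.uniform(0, np.pi, N)
gain = rng.uniform(0, 0.2, N)
h = 1.0 + gain[:, None] * np.cos(2 * (pref[:, None] - thetas[None, :]))

# Linear steady-state rates, r = (I - J)^{-1} h (no rectification applied).
r = np.linalg.solve(np.eye(N) - J, h)

# Orientation selectivity index per neuron (circular-variance style, F1/F0).
z = (r * np.exp(2j * thetas[None, :])).sum(axis=1)
osi = np.abs(z) / r.sum(axis=1)
print("OSI: mean %.2f, 5th-95th percentile %.2f-%.2f"
      % (osi.mean(), *np.percentile(osi, [5, 95])))
```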

  12. Serious adverse events after HPV vaccination: a critical review of randomized trials and post-marketing case series.

    Science.gov (United States)

    Martínez-Lavín, Manuel; Amezcua-Guerra, Luis

    2017-10-01

    This article critically reviews HPV vaccine serious adverse events described in pre-licensure randomized trials and in post-marketing case series. HPV vaccine randomized trials were identified in PubMed. Safety data were extracted. Post-marketing case series describing HPV immunization adverse events were reviewed. Most HPV vaccine randomized trials did not use inert placebo in the control group. Two of the largest randomized trials found significantly more severe adverse events in the tested HPV vaccine arm of the study. Compared to 2871 women receiving aluminum placebo, the group of 2881 women injected with the bivalent HPV vaccine had more deaths on follow-up (14 vs. 3, p = 0.012). Compared to 7078 girls injected with the 4-valent HPV vaccine, 7071 girls receiving the 9-valent dose had more serious systemic adverse events (3.3 vs. 2.6%, p = 0.01). For the 9-valent dose, our calculated number needed to seriously harm is 140 (95% CI, 79–653) [DOSAGE ERROR CORRECTED]. The number needed to vaccinate is 1757 (95% CI, 131 to infinity). Practically, none of the serious adverse events occurring in any arm of both studies were judged to be vaccine-related. Pre-clinical trials, post-marketing case series, and the global drug adverse reaction database (VigiBase) describe similar post-HPV immunization symptom clusters. Two of the largest randomized HPV vaccine trials unveiled more severe adverse events in the tested HPV vaccine arm of the study. Nine-valent HPV vaccine has a worrisome number needed to vaccinate/number needed to harm quotient. Pre-clinical trials and post-marketing case series describe similar post-HPV immunization symptoms.
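
    The quoted "number needed to seriously harm" follows directly from the two arm-level event rates as the reciprocal of the absolute risk difference; a quick check, which only approximately reproduces the reported 140 because the abstract's rates are rounded:

```python
# Number needed to harm (NNH) = 1 / absolute risk difference between arms.
risk_9v, risk_4v = 0.033, 0.026          # serious systemic adverse event rates
nnh = 1 / (risk_9v - risk_4v)
print(round(nnh))                        # ~143, close to the reported 140
```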

  13. Selective scene perception deficits in a case of topographical disorientation.

    Science.gov (United States)

    Robin, Jessica; Lowe, Matthew X; Pishdadian, Sara; Rivest, Josée; Cant, Jonathan S; Moscovitch, Morris

    2017-07-01

    Topographical disorientation (TD) is a neuropsychological condition characterized by an inability to find one's way, even in familiar environments. One common contributing cause of TD is landmark agnosia, a visual recognition impairment specific to scenes and landmarks. Although many cases of TD with landmark agnosia have been documented, little is known about the perceptual mechanisms which lead to selective deficits in recognizing scenes. In the present study, we test LH, a man who exhibits TD and landmark agnosia, on measures of scene perception that require selectively attending to either the configural or surface properties of a scene. Compared to healthy controls, LH demonstrates perceptual impairments when attending to the configuration of a scene, but not when attending to its surface properties, such as the pattern of the walls or whether the ground is sand or grass. In contrast, when focusing on objects instead of scenes, LH demonstrates intact perception of both geometric and surface properties. This study demonstrates that in a case of TD and landmark agnosia, the perceptual impairments are selective to the layout of scenes, providing insight into the mechanism of landmark agnosia and scene-selective perceptual processes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Development of the NUMO pre-selection, site-specific safety case

    International Nuclear Information System (INIS)

    Fujiyama, Tetsuo; Suzuki, Satoru; Deguchi, Akira; Umeki, Hiroyuki

    2016-01-01

    Key conclusions: ◆ “The NUMO pre-selection, site-specific safety case” provides the basic structure for subsequent safety cases that will be applied to any selected site, emphasising practical approaches and methodology which will be applicable for the conditions/constraints during an actual siting process. ◆ The preliminary results of the design and safety assessment would underpin the feasibility and safety of geological disposal in Japan.

  15. Advances in randomized parallel computing

    CERN Document Server

    Rajasekaran, Sanguthevar

    1999-01-01

    The technique of randomization has been employed to solve numerous problems of computing both sequentially and in parallel. Examples of randomized algorithms that are asymptotically better than their deterministic counterparts in solving various fundamental problems abound. Randomized algorithms have the advantages of simplicity and better performance both in theory and often in practice. This book is a collection of articles written by renowned experts in the area of randomized parallel computing. A brief introduction to randomized algorithms: In the analysis of algorithms, at least three different measures of performance can be used: the best case, the worst case, and the average case. Often, the average case run time of an algorithm is much smaller than the worst case. For instance, the worst case run time of Hoare's quicksort is O(n^2), whereas its average case run time is only O(n log n). The average case analysis is conducted with an assumption on the input space. The assumption made to arrive at t...

  16. Acceptance sampling using judgmental and randomly selected samples

    Energy Technology Data Exchange (ETDEWEB)

    Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl

    2010-09-01

    We present a Bayesian model for acceptance sampling where the population consists of two groups, each with different levels of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail, and discuss the conditions necessary to ensure that required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems, and, in particular, to environmental sampling where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.
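
    A single-group simplification of the "simpler Bayesian formulation" mentioned above shows the core calculation: with a Beta prior on the unacceptable fraction and n sampled items that are all acceptable, one can compute the posterior assurance that the fraction is below a threshold, and invert it to find the smallest n achieving a target assurance. The two-group judgmental/random structure of the actual model is not reproduced; the prior and thresholds below are illustrative.

```python
from scipy.stats import beta

def assurance(n, theta_star=0.01, a=1.0, b=1.0):
    """Posterior probability that the unacceptable fraction theta is below
    theta_star, after observing n sampled items that are all acceptable,
    starting from a Beta(a, b) prior (uniform by default)."""
    return beta.cdf(theta_star, a, b + n)

def required_n(confidence=0.95, theta_star=0.01, a=1.0, b=1.0):
    """Smallest number of all-acceptable samples giving the desired assurance."""
    n = 0
    while assurance(n, theta_star, a, b) < confidence:
        n += 1
    return n

print(required_n())   # ~298 with a uniform prior, 1% threshold, 95% assurance
```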

  17. The additive congruential random number generator--A special case of a multiple recursive generator

    Science.gov (United States)

    Wikramaratna, Roy S.

    2008-07-01

    This paper considers an approach to generating uniformly distributed pseudo-random numbers which works well in serial applications but which also appears particularly well-suited for application on parallel processing systems. Additive Congruential Random Number (ACORN) generators are straightforward to implement for arbitrarily large order and modulus; if implemented using integer arithmetic, it becomes possible to generate identical sequences on any machine. Previously published theoretical analysis has demonstrated that a kth order ACORN sequence approximates to being uniformly distributed in up to k dimensions, for any given k. ACORN generators can be constructed to give period lengths exceeding any given number (for example, with period length in excess of 2^(30p), for any given p). Results of empirical tests have demonstrated that, if p is greater than or equal to 2, then the ACORN generator can be used successfully for generating double precision uniform random variates. This paper demonstrates that an ACORN generator is a particular case of a multiple recursive generator (and, therefore, also a special case of a matrix generator). Both these latter approaches have been widely studied, and it is to be hoped that the results given in the present paper will lead to greater confidence in using the ACORN generators.
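
    The sketch below follows the ACORN recurrence as it is commonly stated (the zeroth-order term is held fixed and higher orders are accumulated modulo M); the default order and modulus are the usually quoted recommendations and should be treated as illustrative rather than as a definitive implementation of the paper's generator.

```python
class ACORN:
    """k-th order Additive Congruential Random Number generator.  At each
    step, Y[m] <- (Y[m-1] + Y[m]) mod M for m = 1..k, with Y[0] held fixed;
    the k-th entry divided by M is the output in [0, 1)."""

    def __init__(self, seed, order=10, modulus=2**60):
        if seed % 2 == 0:
            seed += 1        # the seed should be odd for a power-of-two modulus
        self.M = modulus
        self.Y = [seed % modulus] + [0] * order

    def next(self):
        for m in range(1, len(self.Y)):
            self.Y[m] = (self.Y[m - 1] + self.Y[m]) % self.M
        return self.Y[-1] / self.M

gen = ACORN(seed=123456789)
print([round(gen.next(), 6) for _ in range(5)])
```

    Because only integer additions modulo M are involved, the same integer state sequence is reproduced exactly on any machine, which is the portability property the abstract highlights.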

  18. Opportunistic Relay Selection with Cooperative Macro Diversity

    Directory of Open Access Journals (Sweden)

    Yu Chia-Hao

    2010-01-01

    Full Text Available We apply a fully opportunistic relay selection scheme to study cooperative diversity in a semianalytical manner. In our framework, idle Mobile Stations (MSs) are capable of being used as Relay Stations (RSs) and no relaying is required if the direct path is strong. Our relay selection scheme is fully selection based: either the direct path or one of the relaying paths is selected. Macro diversity, which is often ignored in analytical works, is taken into account together with micro diversity by using a complete channel model that includes both shadow fading and fast fading effects. The stochastic geometry of the network is taken into account by having a random number of randomly located MSs. The outage probability analysis of the selection differs from the case where only fast fading is considered. Under our framework, distribution of the received power is formulated using different Channel State Information (CSI) assumptions to simulate both optimistic and practical environments. The results show that the relay selection gain can be significant given a suitable amount of candidate RSs. Also, while relay selection according to incomplete CSI is diversity suboptimal compared to relay selection based on full CSI, the loss in average throughput is not too significant. This is a consequence of the dominance of geometry over fast fading.

  19. Multivariate time series modeling of selected childhood diseases in ...

    African Journals Online (AJOL)

    This paper is focused on modeling the five most prevalent childhood diseases in Akwa Ibom State using a multivariate approach to time series. An aggregate of 78,839 reported cases of malaria, upper respiratory tract infection (URTI), Pneumonia, anaemia and tetanus were extracted from five randomly selected hospitals in ...

  20. Selection bias in population-based cancer case-control studies due to incomplete sampling frame coverage.

    Science.gov (United States)

    Walsh, Matthew C; Trentham-Dietz, Amy; Gangnon, Ronald E; Nieto, F Javier; Newcomb, Polly A; Palta, Mari

    2012-06-01

    Increasing numbers of individuals are choosing to opt out of population-based sampling frames due to privacy concerns. This is especially a problem in the selection of controls for case-control studies, as the cases often arise from relatively complete population-based registries, whereas control selection requires a sampling frame. If opt out is also related to risk factors, bias can arise. We linked breast cancer cases who reported having a valid driver's license from the 2004-2008 Wisconsin women's health study (N = 2,988) with a master list of licensed drivers from the Wisconsin Department of Transportation (WDOT). This master list excludes Wisconsin drivers that requested their information not be sold by the state. Multivariate-adjusted selection probability ratios (SPR) were calculated to estimate potential bias when using this driver's license sampling frame to select controls. A total of 962 cases (32%) had opted out of the WDOT sampling frame. Cases age <40 (SPR = 0.90), income either unreported (SPR = 0.89) or greater than $50,000 (SPR = 0.94), lower parity (SPR = 0.96 per one-child decrease), and hormone use (SPR = 0.93) were significantly less likely to be covered by the WDOT sampling frame (α = 0.05 level). Our results indicate the potential for selection bias due to differential opt out between various demographic and behavioral subgroups of controls. As selection bias may differ by exposure and study base, the assessment of potential bias needs to be ongoing. SPRs can be used to predict the direction of bias when cases and controls stem from different sampling frames in population-based case-control studies.

  1. Role of selective interaction in wealth distribution

    International Nuclear Information System (INIS)

    Gupta, A.K.

    2005-08-01

    In our simplified description 'money' is wealth. A kinetic theory model of money is investigated where two agents interact (trade) selectively and exchange a random amount of money between them while keeping the total money of all the agents constant. The probability distribution of individual money (P(m) vs. m) is seen to be influenced by certain modes of selective interaction. The distributions shift away from the Boltzmann-Gibbs-like exponential distribution and in some cases distributions emerge with power-law tails, known as Pareto's law (P(m) ∝ m^-(1+α)). (author)
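
    The baseline, non-selective version of this kinetic exchange model is easy to simulate and relaxes to the Boltzmann-Gibbs-like exponential distribution the abstract refers to; the paper's selective-interaction variants modify how the trading partner is chosen, which the sketch below does not reproduce. Parameter values are illustrative.

```python
import random

def kinetic_exchange(agents=1000, steps=200000, seed=3):
    """Random pairwise money exchange with conserved total money.  Without
    selective partner choice, the stationary distribution of individual
    money is approximately exponential (Boltzmann-Gibbs-like)."""
    rng = random.Random(seed)
    m = [1.0] * agents                      # everyone starts with one unit
    for _ in range(steps):
        i, j = rng.randrange(agents), rng.randrange(agents)
        if i == j:
            continue
        pool = m[i] + m[j]
        eps = rng.random()                  # random split of the pair's money
        m[i], m[j] = eps * pool, (1 - eps) * pool
    return m

money = kinetic_exchange()
# For an exponential distribution with mean 1, about 63% fall below the mean.
print("mean %.2f (conserved), fraction with m < mean: %.2f"
      % (sum(money) / len(money), sum(x < 1.0 for x in money) / len(money)))
```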

  2. Biased random key genetic algorithm with insertion and gender selection for capacitated vehicle routing problem with time windows

    Science.gov (United States)

    Rochman, Auliya Noor; Prasetyo, Hari; Nugroho, Munajat Tri

    2017-06-01

    Vehicle Routing Problem (VRP) often occurs when the manufacturers need to distribute their product to some customers/outlets. The distribution process is typically restricted by the capacity of the vehicle and the working hours at the distributor. This type of VRP is also known as Capacitated Vehicle Routing Problem with Time Windows (CVRPTW). A Biased Random Key Genetic Algorithm (BRKGA) was designed and coded in MATLAB to solve the CVRPTW case of soft drink distribution. The standard BRKGA was then modified by applying chromosome insertion into the initial population and defining chromosome gender for parents undergoing the crossover operation. The performance of the established algorithms was then compared to a heuristic procedure for solving a soft drink distribution. Some findings are revealed: (1) the total distribution cost of BRKGA with insertion (BRKGA-I) results in a cost saving of 39% compared to the total cost of the heuristic method; (2) BRKGA with gender selection (BRKGA-GS) could further improve the performance of the heuristic method. However, the BRKGA-GS tends to yield worse results compared to that obtained from the standard BRKGA.

  3. Randomized random walk on a random walk

    International Nuclear Information System (INIS)

    Lee, P.A.

    1983-06-01

    This paper discusses generalizations of the model introduced by Kehr and Kunter of the random walk of a particle on a one-dimensional chain which in turn has been constructed by a random walk procedure. The superimposed random walk is randomised in time according to the occurrences of a stochastic point process. The probability of finding the particle in a particular position at a certain instant is obtained explicitly in the transform domain. It is found that the asymptotic behaviour for large time of the mean-square displacement of the particle depends critically on the assumed structure of the basic random walk, giving a diffusion-like term for an asymmetric walk or a square root law if the walk is symmetric. Many results are obtained in closed form for the Poisson process case, and these agree with those given previously by Kehr and Kunter. (author)

  4. Assessing the Effectiveness of Case-Based Collaborative Learning via Randomized Controlled Trial.

    Science.gov (United States)

    Krupat, Edward; Richards, Jeremy B; Sullivan, Amy M; Fleenor, Thomas J; Schwartzstein, Richard M

    2016-05-01

    Case-based collaborative learning (CBCL) is a novel small-group approach that borrows from team-based learning principles and incorporates elements of problem-based learning (PBL) and case-based learning. CBCL includes a preclass readiness assurance process and case-based in-class activities in which students respond to focused, open-ended questions individually, discuss their answers in groups of 4, and then reach consensus in larger groups of 16. This study introduces CBCL and assesses its effectiveness in one course at Harvard Medical School. In a 2013 randomized controlled trial, 64 medical and dental student volunteers were assigned randomly to one of four 8-person PBL tutorial groups (control; n = 32) or one of two 16-person CBCL tutorial groups (experimental condition; n = 32) as part of a required first-year physiology course. Outcomes for the PBL and CBCL groups were compared using final exam scores, student responses to a postcourse survey, and behavioral coding of portions of video-recorded class sessions. Overall, the course final exam scores for CBCL and PBL students were not significantly different. However, CBCL students whose mean exam performance in prior courses was below the participant median scored significantly higher than their PBL counterparts on the physiology course final exam. The most common adjectives students used to describe CBCL were "engaging," "fun," and "thought-provoking." Coding of observed behaviors indicated that individual affect was significantly higher in the CBCL groups than in the PBL groups. CBCL is a viable, engaging, active learning method. It may particularly benefit students with lower academic performance.

  5. A Heckman Selection- t Model

    KAUST Repository

    Marchenko, Yulia V.

    2012-03-01

    Sample selection arises often in practice as a result of the partial observability of the outcome of interest in a study. In the presence of sample selection, the observed data do not represent a random sample from the population, even after controlling for explanatory variables. That is, data are missing not at random. Thus, standard analysis using only complete cases will lead to biased results. Heckman introduced a sample selection model to analyze such data and proposed a full maximum likelihood estimation method under the assumption of normality. The method was criticized in the literature because of its sensitivity to the normality assumption. In practice, data, such as income or expenditure data, often violate the normality assumption because of heavier tails. We first establish a new link between sample selection models and recently studied families of extended skew-elliptical distributions. Then, this allows us to introduce a selection-t (SLt) model, which models the error distribution using a Student's t distribution. We study its properties and investigate the finite-sample performance of the maximum likelihood estimators for this model. We compare the performance of the SLt model to the conventional Heckman selection-normal (SLN) model and apply it to analyze ambulatory expenditures. Unlike the SLN model, our analysis using the SLt model provides statistical evidence for the existence of sample selection bias in these data. We also investigate the performance of the test for sample selection bias based on the SLt model and compare it with the performances of several tests used with the SLN model. Our findings indicate that the latter tests can be misleading in the presence of heavy-tailed data. © 2012 American Statistical Association.
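
    For reference, the classical selection-normal baseline against which the selection-t model is motivated can be sketched as the familiar two-step Heckman correction: a probit model for the selection indicator, followed by an outcome regression augmented with the inverse Mills ratio. The snippet below runs it on simulated data with correlated errors; it is the two-step SLN-style baseline, not the SLt estimator, and all data and coefficients are invented.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(0)

# Simulated data with sample selection: the outcome y is observed only when
# s = 1, and the selection and outcome errors are correlated (rho = 0.6).
n = 5000
x = rng.normal(size=n)                       # outcome covariate
z = rng.normal(size=n)                       # exclusion restriction
e = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=n)
s = (0.5 + 1.0 * z + e[:, 0] > 0).astype(int)
y = 1.0 + 2.0 * x + e[:, 1]

# Step 1: probit for the selection indicator, then the inverse Mills ratio.
W = sm.add_constant(np.column_stack([x, z]))
probit = sm.Probit(s, W).fit(disp=0)
xb = W @ probit.params                       # linear index of the probit fit
imr = norm.pdf(xb) / norm.cdf(xb)

# Step 2: OLS on the selected sample, augmented with the Mills ratio term.
sel = s == 1
X2 = sm.add_constant(np.column_stack([x[sel], imr[sel]]))
ols = sm.OLS(y[sel], X2).fit()
print(ols.params)   # approx. intercept 1.0, slope 2.0, Mills coefficient ~0.6
```

    A significantly nonzero Mills-ratio coefficient is the usual evidence of selection bias under normality; the paper's point is that this inference can be misleading when the errors are heavy-tailed.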

  6. Treating Selective Mutism Using Modular CBT for Child Anxiety: A Case Study

    Science.gov (United States)

    Reuther, Erin T.; Davis, Thompson E., III; Moree, Brittany N.; Matson, Johnny L.

    2011-01-01

    Selective mutism is a rare, debilitating condition usually seen in children. Unfortunately, there is little research examining effective treatments for this disorder, and designing an evidence-based treatment plan can be difficult. This case study presents the evidence-based treatment of an 8-year-old Caucasian boy with selective mutism using an…

  7. Case management: a randomized controlled study comparing a neighborhood team and a centralized individual model.

    OpenAIRE

    Eggert, G M; Zimmer, J G; Hall, W J; Friedman, B

    1991-01-01

    This randomized controlled study compared two types of case management for skilled nursing level patients living at home: the centralized individual model and the neighborhood team model. The team model differed from the individual model in that team case managers performed client assessments, care planning, some direct services, and reassessments; they also had much smaller caseloads and were assigned a specific catchment area. While patients in both groups incurred very high estimated healt...

  8. Implications of structural genomics target selection strategies: Pfam5000, whole genome, and random approaches

    Energy Technology Data Exchange (ETDEWEB)

    Chandonia, John-Marc; Brenner, Steven E.

    2004-07-14

    The structural genomics project is an international effort to determine the three-dimensional shapes of all important biological macromolecules, with a primary focus on proteins. Target proteins should be selected according to a strategy which is medically and biologically relevant, of good value, and tractable. As an option to consider, we present the Pfam5000 strategy, which involves selecting the 5000 most important families from the Pfam database as sources for targets. We compare the Pfam5000 strategy to several other proposed strategies that would require similar numbers of targets. These include complete solution of several small to moderately sized bacterial proteomes, partial coverage of the human proteome, and random selection of approximately 5000 targets from sequenced genomes. We measure the impact that successful implementation of these strategies would have upon structural interpretation of the proteins in Swiss-Prot, TrEMBL, and 131 complete proteomes (including 10 of eukaryotes) from the Proteome Analysis database at EBI. Solving the structures of proteins from the 5000 largest Pfam families would allow accurate fold assignment for approximately 68 percent of all prokaryotic proteins (covering 59 percent of residues) and 61 percent of eukaryotic proteins (40 percent of residues). More fine-grained coverage which would allow accurate modeling of these proteins would require an order of magnitude more targets. The Pfam5000 strategy may be modified in several ways, for example to focus on larger families, bacterial sequences, or eukaryotic sequences; as long as secondary consideration is given to large families within Pfam, coverage results vary only slightly. In contrast, focusing structural genomics on a single tractable genome would have only a limited impact in structural knowledge of other proteomes: a significant fraction (about 30-40 percent of the proteins, and 40-60 percent of the residues) of each proteome is classified in small

  9. Ensuring optimal gender representation in recruitment and selection: the case of CERN

    CERN Document Server

    Sgouraki, Margarita

    This study examines gender diversity in recruitment and selection, exploring ways to improve female presence in science. First, the concepts of equal opportunities and managing diversity are presented. Next, the business case for diversity is discussed, emphasising the business and ethical benefits for organisations. Then, gender diversity issues regarding the underrepresentation of women in science are examined, focusing on gender stereotyping and the "leaky pipeline". Previous studies emphasise the importance of HRM activities such as recruitment and selection to promote gender diversity. However, there are still barriers when recruiting and selecting women in science. To examine and explore the research topic, a case-study approach is adopted. Methods included document analysis, interviews with key informants and cohort analysis. The limitations of the methodology are discussed and recommendations for future work are proposed. By examining the example of CERN, an intergovernmental research organisation, co...

  10. Participant-selected music and physical activity in older adults following cardiac rehabilitation: a randomized controlled trial.

    Science.gov (United States)

    Clark, Imogen N; Baker, Felicity A; Peiris, Casey L; Shoebridge, Georgie; Taylor, Nicholas F

    2017-03-01

    To evaluate effects of participant-selected music on older adults' achievement of activity levels recommended in the physical activity guidelines following cardiac rehabilitation. A parallel group randomized controlled trial with measurements at Weeks 0, 6 and 26. A multisite outpatient rehabilitation programme of a publicly funded metropolitan health service. Adults aged 60 years and older who had completed a cardiac rehabilitation programme. Experimental participants selected music to support walking with guidance from a music therapist. Control participants received usual care only. The primary outcome was the proportion of participants achieving activity levels recommended in physical activity guidelines. Secondary outcomes compared amounts of physical activity, exercise capacity, cardiac risk factors, and exercise self-efficacy. A total of 56 participants, mean age 68.2 years (SD = 6.5), were randomized to the experimental ( n = 28) and control groups ( n = 28). There were no differences between groups in proportions of participants achieving activity recommended in physical activity guidelines at Week 6 or 26. Secondary outcomes demonstrated between-group differences in male waist circumference at both measurements (Week 6 difference -2.0 cm, 95% CI -4.0 to 0; Week 26 difference -2.8 cm, 95% CI -5.4 to -0.1), and observed effect sizes favoured the experimental group for amounts of physical activity (d = 0.30), exercise capacity (d = 0.48), and blood pressure (d = -0.32). Participant-selected music did not increase the proportion of participants achieving recommended amounts of physical activity, but may have contributed to exercise-related benefits.

  11. r2VIM: A new variable selection method for random forests in genome-wide association studies.

    Science.gov (United States)

    Szymczak, Silke; Holzinger, Emily; Dasgupta, Abhijit; Malley, James D; Molloy, Anne M; Mills, James L; Brody, Lawrence C; Stambolian, Dwight; Bailey-Wilson, Joan E

    2016-01-01

    Machine learning methods and in particular random forests (RFs) are a promising alternative to standard single SNP analyses in genome-wide association studies (GWAS). RFs provide variable importance measures (VIMs) to rank SNPs according to their predictive power. However, in contrast to the established genome-wide significance threshold, no clear criteria exist to determine how many SNPs should be selected for downstream analyses. We propose a new variable selection approach, recurrent relative variable importance measure (r2VIM). Importance values are calculated relative to an observed minimal importance score for several runs of RF and only SNPs with large relative VIMs in all of the runs are selected as important. Evaluations on simulated GWAS data show that the new method controls the number of false-positives under the null hypothesis. Under a simple alternative hypothesis with several independent main effects it is only slightly less powerful than logistic regression. In an experimental GWAS data set, the same strong signal is identified while the approach selects none of the SNPs in an underpowered GWAS. The novel variable selection method r2VIM is a promising extension to standard RF for objectively selecting relevant SNPs in GWAS while controlling the number of false-positive results.
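
    The core of the r2VIM procedure lends itself to a short sketch: run the forest several times, express each importance score relative to the magnitude of the smallest observed score, and keep only variables whose relative importance exceeds a threshold in every run. The sketch below uses scikit-learn and synthetic genotypes; the published method is an R implementation based on permutation importance (which can be negative), so the Gini importance, threshold and number of runs here are stand-ins, not the authors' defaults.

```python
# Minimal sketch of the r2VIM idea (relative importance over repeated runs).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n, p = 500, 50
X = rng.integers(0, 3, size=(n, p)).astype(float)   # SNPs coded 0/1/2
y = (X[:, 0] + X[:, 1] + rng.normal(scale=1.0, size=n) > 3).astype(int)

n_runs, threshold = 5, 3.0
relative_importances = []
for seed in range(n_runs):
    rf = RandomForestClassifier(n_estimators=500, random_state=seed).fit(X, y)
    imp = rf.feature_importances_
    # r2VIM scales importances by the size of a minimal (noise-level) score;
    # with non-negative Gini importance we use the smallest positive value.
    noise_level = imp[imp > 0].min()
    relative_importances.append(imp / noise_level)

rel = np.array(relative_importances)
selected = np.where((rel >= threshold).all(axis=0))[0]
print("SNPs selected in every run:", selected)
```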

  12. Random magnetism

    International Nuclear Information System (INIS)

    Tahir-Kheli, R.A.

    1975-01-01

    A few simple problems relating to random magnetic systems are presented. Translational symmetry, only on the macroscopic scale, is assumed for these systems. A random set of parameters, on the microscopic scale, for the various regions of these systems is also assumed. A probability distribution for randomness is obeyed. Knowledge of the form of these probability distributions is assumed in all cases.

  13. When to Intervene in Selective Mutism: The Multimodal Treatment of a Case of Persistent Selective Mutism.

    Science.gov (United States)

    Powell, Shawn; Dalley, Mahlono

    1995-01-01

    An identification and treatment model differentiating transient mutism from persistent selective mutism is proposed. The case study of a six-year-old girl is presented, who was treated with a multimodal approach combining behavioral techniques with play therapy and family involvement. At posttreatment and follow-up, she was talking in a manner…

  14. Multi-criteria selection of offshore wind farms: Case study for the Baltic States

    International Nuclear Information System (INIS)

    Chaouachi, Aymen; Covrig, Catalin Felix; Ardelean, Mircea

    2017-01-01

    This paper presents a multi-criteria selection approach for offshore wind site assessment. The proposed site selection framework takes into consideration the electricity network’s operating security aspects, economic investment, operation costs and capacity performances relative to each potential site. The selection decision is made through the Analytic Hierarchy Process (AHP), with an inherent flexibility that allows end users to adjust the expected benefits according to their respective and global priorities. The proposed site selection framework is implemented as an interactive case study for three Baltic States in the 2020 time horizon, based on real data and exhaustive power network models, taking into consideration the foreseen upgrades and network reinforcements. For each country the optimal offshore wind sites are assessed under multiple weight contribution scenarios, reflecting the characteristics of market design, regulatory aspects or renewable integration targets. - Highlights: • We use a multi-criteria selection approach for offshore wind site assessment. • Security aspects, economic investment, operation costs and capacity performances are included. • The selection decision is made through an Analytic Hierarchy Process (AHP). • We implement the methodology as a case study for three Baltic States in the 2020 time horizon.
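
    The AHP weighting step can be sketched as taking the principal eigenvector of a pairwise comparison matrix and checking its consistency ratio. The criteria and judgments below are hypothetical and are not those used in the study.

```python
# Sketch of AHP priority weights from a pairwise comparison matrix.
# Criteria names and judgments (Saaty 1-9 scale) are hypothetical.
import numpy as np

criteria = ["network security", "investment cost", "operation cost", "capacity factor"]
A = np.array([           # A[i, j]: importance of criterion i relative to j
    [1,   3,   5,   1],
    [1/3, 1,   3,   1/2],
    [1/5, 1/3, 1,   1/4],
    [1,   2,   4,   1],
], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio (CR < 0.1 is conventionally acceptable; RI = 0.90 for n = 4)
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
cr = ci / 0.90

for name, w in zip(criteria, weights):
    print(f"{name:18s} {w:.3f}")
print(f"consistency ratio: {cr:.3f}")
```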

  15. Nurse case-manager vs multifaceted intervention to improve quality of osteoporosis care after wrist fracture: randomized controlled pilot study.

    Science.gov (United States)

    Majumdar, S R; Johnson, J A; Bellerose, D; McAlister, F A; Russell, A S; Hanley, D A; Garg, S; Lier, D A; Maksymowych, W P; Morrish, D W; Rowe, B H

    2011-01-01

    Few outpatients with fractures are treated for osteoporosis in the years following fracture. In a randomized pilot study, we found a nurse case-manager could double rates of osteoporosis testing and treatment compared with a proven efficacious quality improvement strategy directed at patients and physicians (57% vs 28% rates of appropriate care). Few patients with fractures are treated for osteoporosis. An intervention directed at wrist fracture patients (education) and physicians (guidelines, reminders) tripled osteoporosis treatment rates compared to controls (22% vs 7% within 6 months of fracture). More effective strategies are needed. We undertook a pilot study that compared a nurse case-manager to the multifaceted intervention using a randomized trial design. The case-manager counseled patients, arranged bone mineral density (BMD) tests, and prescribed treatments. We included controls from our first trial who remained untreated for osteoporosis 1-year post-fracture. Primary outcome was bisphosphonate treatment and secondary outcomes were BMD testing, appropriate care (BMD test-treatment if bone mass low), and costs. Forty six patients untreated 1-year after wrist fracture were randomized to case-manager (n = 21) or multifaceted intervention (n = 25). Median age was 60 years and 68% were female. Six months post-randomization, 9 (43%) case-managed patients were treated with bisphosphonates compared with 3 (12%) multifaceted intervention patients (relative risk [RR] 3.6, 95% confidence intervals [CI] 1.1-11.5, p = 0.019). Case-managed patients were more likely than multifaceted intervention patients to undergo BMD tests (81% vs 52%, RR 1.6, 95%CI 1.1-2.4, p = 0.042) and receive appropriate care (57% vs 28%, RR 2.0, 95%CI 1.0-4.2, p = 0.048). Case-management cost was $44 (CDN) per patient vs $12 for the multifaceted intervention. A nurse case-manager substantially increased rates of appropriate testing and treatment for osteoporosis in
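
    The relative risks quoted above follow from the standard log-scale confidence interval for a risk ratio; a quick check against the reported bisphosphonate outcome (9/21 vs 3/25) reproduces RR 3.6 (95% CI 1.1 to 11.5), assuming the usual normal approximation.

```python
# Relative risk with a 95% CI on the log scale (normal approximation),
# checked against the bisphosphonate outcome reported above: 9/21 vs 3/25.
import math

def relative_risk(a, n1, b, n2, z=1.96):
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
    lo, hi = (math.exp(math.log(rr) + s * z * se) for s in (-1, 1))
    return rr, lo, hi

rr, lo, hi = relative_risk(9, 21, 3, 25)
print(f"RR = {rr:.1f} (95% CI {lo:.1f} to {hi:.1f})")   # RR = 3.6 (1.1 to 11.5)
```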

  16. Association between selective serotonin reuptake inhibitors and upper gastrointestinal bleeding: population based case-control study

    Science.gov (United States)

    de Abajo, Francisco José; Rodríguez, Luis Alberto García; Montero, Dolores

    1999-01-01

    Objective To examine the association between selective serotonin reuptake inhibitors and risk of upper gastrointestinal bleeding. Design Population based case-control study. Setting General practices included in the UK general practice research database. Subjects 1651 incident cases of upper gastrointestinal bleeding and 248 cases of ulcer perforation among patients aged 40 to 79 years between April 1993 and September 1997, and 10 000 controls matched for age, sex, and year that the case was identified. Interventions Review of computer profiles for all potential cases, and an internal validation study to confirm the accuracy of the diagnosis on the basis of the computerised information. Main outcome measures Current use of selective serotonin reuptake inhibitors or other antidepressants within 30 days before the index date. Results Current exposure to selective serotonin reuptake inhibitors was identified in 3.1% (52 of 1651) of patients with upper gastrointestinal bleeding but only 1.0% (95 of 10 000) of controls, giving an adjusted rate ratio of 3.0 (95% confidence interval 2.1 to 4.4). This effect measure was not modified by sex, age, dose, or treatment duration. A crude incidence of 1 case per 8000 prescriptions was estimated. A small association was found with non-selective serotonin reuptake inhibitors (relative risk 1.4, 1.1 to 1.9) but not with antidepressants lacking this inhibitory effect. None of the groups of antidepressants was associated with ulcer perforation. The concurrent use of selective serotonin reuptake inhibitors with non-steroidal anti-inflammatory drugs increased the risk of upper gastrointestinal bleeding beyond the sum of their independent effects (15.6, 6.6 to 36.6). A smaller interaction was also found between selective serotonin reuptake inhibitors and low dose aspirin (7.2, 3.1 to 17.1). Conclusions Selective serotonin reuptake inhibitors increase the risk of upper gastrointestinal bleeding. The absolute effect is, however

  17. Two-year Randomized Clinical Trial of Self-etching Adhesives and Selective Enamel Etching.

    Science.gov (United States)

    Pena, C E; Rodrigues, J A; Ely, C; Giannini, M; Reis, A F

    2016-01-01

    The aim of this randomized, controlled prospective clinical trial was to evaluate the clinical effectiveness of restoring noncarious cervical lesions with two self-etching adhesive systems applied with or without selective enamel etching. A one-step self-etching adhesive (Xeno V(+)) and a two-step self-etching system (Clearfil SE Bond) were used. The effectiveness of phosphoric acid selective etching of enamel margins was also evaluated. Fifty-six cavities were restored with each adhesive system and divided into two subgroups (n=28; etch and non-etch). All 112 cavities were restored with the nanohybrid composite Esthet.X HD. The clinical effectiveness of restorations was recorded in terms of retention, marginal integrity, marginal staining, caries recurrence, and postoperative sensitivity after 3, 6, 12, 18, and 24 months (modified United States Public Health Service). The Friedman test detected significant differences only after 18 months for marginal staining in the groups Clearfil SE non-etch (p=0.009) and Xeno V(+) etch (p=0.004). One restoration was lost during the trial (Xeno V(+) etch; p>0.05). Although an increase in marginal staining was recorded for groups Clearfil SE non-etch and Xeno V(+) etch, the clinical effectiveness of restorations was considered acceptable for the single-step and two-step self-etching systems with or without selective enamel etching in this 24-month clinical trial.

  18. A comparison of random forest and its Gini importance with standard chemometric methods for the feature selection and classification of spectral data

    Directory of Open Access Journals (Sweden)

    Himmelreich Uwe

    2009-07-01

    Background Regularized regression methods such as principal component or partial least squares regression perform well in learning tasks on high dimensional spectral data, but cannot explicitly eliminate irrelevant features. The random forest classifier with its associated Gini feature importance, on the other hand, allows for an explicit feature elimination, but may not be optimally adapted to spectral data due to the topology of its constituent classification trees which are based on orthogonal splits in feature space. Results We propose to combine the best of both approaches, and evaluated the joint use of a feature selection based on a recursive feature elimination using the Gini importance of random forests together with regularized classification methods on spectral data sets from medical diagnostics, chemotaxonomy, biomedical analytics, food science, and synthetically modified spectral data. Here, a feature selection using the Gini feature importance with a regularized classification by discriminant partial least squares regression performed as well as or better than a filtering according to different univariate statistical tests, or using regression coefficients in a backward feature elimination. It outperformed the direct application of the random forest classifier, or the direct application of the regularized classifiers on the full set of features. Conclusion The Gini importance of the random forest provided superior means for measuring feature relevance on spectral data, but – on an optimal subset of features – the regularized classifiers might be preferable over the random forest classifier, in spite of their limitation to model linear dependencies only. A feature selection based on Gini importance, however, may precede a regularized linear classification to identify this optimal subset of features, and to earn a double benefit of both dimensionality reduction and the elimination of noise from the classification task.
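
    A minimal sketch of the combination described, on synthetic "spectra": rank channels by random-forest Gini importance, keep a small subset, then fit a latent-variable classifier. scikit-learn's PLSRegression on a 0/1 label is used here as a stand-in for discriminant PLS; the data, subset size and number of components are illustrative only.

```python
# Sketch: Gini-importance-based channel selection followed by a PLS classifier
# (PLS regression on a binary label as a stand-in for PLS-DA). Synthetic data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n, p = 200, 300                                  # samples x spectral channels
X = rng.normal(size=(n, p))
y = (X[:, 10] - X[:, 50] + rng.normal(scale=0.5, size=n) > 0).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# 1) Rank channels by Gini importance and keep the top 20
rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
top = np.argsort(rf.feature_importances_)[::-1][:20]

# 2) Regularized classification on the reduced channel set
pls = PLSRegression(n_components=2).fit(X_tr[:, top], y_tr)
pred = (pls.predict(X_te[:, top]).ravel() > 0.5).astype(int)
print("test accuracy: %.2f" % (pred == y_te).mean())
```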

  19. Why the null matters: statistical tests, random walks and evolution.

    Science.gov (United States)

    Sheets, H D; Mitchell, C E

    2001-01-01

    A number of statistical tests have been developed to determine what type of dynamics underlie observed changes in morphology in evolutionary time series, based on the pattern of change within the time series. The theory of the 'scaled maximum', the 'log-rate-interval' (LRI) method, and the Hurst exponent all operate on the same principle of comparing the maximum change, or rate of change, in the observed dataset to the maximum change expected of a random walk. Less change in a dataset than expected of a random walk has been interpreted as indicating stabilizing selection, while more change implies directional selection. The 'runs test' in contrast, operates on the sequencing of steps, rather than on excursion. Applications of these tests to computer generated, simulated time series of known dynamical form and various levels of additive noise indicate that there is a fundamental asymmetry in the rate of type II errors of the tests based on excursion: they are all highly sensitive to noise in models of directional selection that result in a linear trend within a time series, but are largely noise immune in the case of a simple model of stabilizing selection. Additionally, the LRI method has a lower sensitivity than originally claimed, due to the large range of LRI rates produced by random walks. Examination of the published results of these tests show that they have seldom produced a conclusion that an observed evolutionary time series was due to directional selection, a result which needs closer examination in light of the asymmetric response of these tests.

  20. How random is a random vector?

    International Nuclear Information System (INIS)

    Eliazar, Iddo

    2015-01-01

    Over 80 years ago Samuel Wilks proposed that the “generalized variance” of a random vector is the determinant of its covariance matrix. To date, the notion and use of the generalized variance is confined only to very specific niches in statistics. In this paper we establish that the “Wilks standard deviation” –the square root of the generalized variance–is indeed the standard deviation of a random vector. We further establish that the “uncorrelation index” –a derivative of the Wilks standard deviation–is a measure of the overall correlation between the components of a random vector. Both the Wilks standard deviation and the uncorrelation index are, respectively, special cases of two general notions that we introduce: “randomness measures” and “independence indices” of random vectors. In turn, these general notions give rise to “randomness diagrams”—tangible planar visualizations that answer the question: How random is a random vector? The notion of “independence indices” yields a novel measure of correlation for Lévy laws. In general, the concepts and results presented in this paper are applicable to any field of science and engineering with random-vectors empirical data.
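
    The quantity itself is easy to compute: the generalized variance is the determinant of the covariance matrix and the Wilks standard deviation is its square root. A small numerical illustration follows (the paper's "uncorrelation index" has its own definition and is not reproduced here).

```python
# Generalized variance and Wilks standard deviation of a random vector.
import numpy as np

rng = np.random.default_rng(0)
cov = np.array([[2.0, 0.8],
                [0.8, 1.0]])
X = rng.multivariate_normal(mean=[0, 0], cov=cov, size=10_000)

sample_cov = np.cov(X, rowvar=False)
generalized_variance = np.linalg.det(sample_cov)   # true value: 2*1 - 0.8^2 = 1.36
wilks_std = np.sqrt(generalized_variance)

print("generalized variance:", round(generalized_variance, 3))
print("Wilks standard deviation:", round(wilks_std, 3))
```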

  1. How random is a random vector?

    Science.gov (United States)

    Eliazar, Iddo

    2015-12-01

    Over 80 years ago Samuel Wilks proposed that the "generalized variance" of a random vector is the determinant of its covariance matrix. To date, the notion and use of the generalized variance is confined only to very specific niches in statistics. In this paper we establish that the "Wilks standard deviation" -the square root of the generalized variance-is indeed the standard deviation of a random vector. We further establish that the "uncorrelation index" -a derivative of the Wilks standard deviation-is a measure of the overall correlation between the components of a random vector. Both the Wilks standard deviation and the uncorrelation index are, respectively, special cases of two general notions that we introduce: "randomness measures" and "independence indices" of random vectors. In turn, these general notions give rise to "randomness diagrams"-tangible planar visualizations that answer the question: How random is a random vector? The notion of "independence indices" yields a novel measure of correlation for Lévy laws. In general, the concepts and results presented in this paper are applicable to any field of science and engineering with random-vectors empirical data.

  2. Random effect selection in generalised linear models

    DEFF Research Database (Denmark)

    Denwood, Matt; Houe, Hans; Forkman, Björn

    We analysed abattoir recordings of meat inspection codes with possible relevance to onfarm animal welfare in cattle. Random effects logistic regression models were used to describe individual-level data obtained from 461,406 cattle slaughtered in Denmark. Our results demonstrate that the largest...

  3. Selection of a plant location - A case study: Urea production using ...

    African Journals Online (AJOL)

    Selection of a plant location - A case study: Urea production using Calub Gas. ... The criteria are, amongst others, supply of raw materials and fuel, market proximity, water supply, waste ... To set up critical factors for the Ethiopian scenario

  4. Random Forest Application for NEXRAD Radar Data Quality Control

    Science.gov (United States)

    Keem, M.; Seo, B. C.; Krajewski, W. F.

    2017-12-01

    Identification and elimination of non-meteorological radar echoes (e.g., returns from ground, wind turbines, and biological targets) are the basic data quality control steps before radar data use in quantitative applications (e.g., precipitation estimation). Although WSR-88Ds' recent upgrade to dual-polarization has enhanced this quality control and echo classification, there are still challenges in detecting some non-meteorological echoes that show precipitation-like characteristics (e.g., wind turbine or anomalous propagation clutter embedded in rain). With this in mind, a new quality control method using Random Forest is proposed in this study. This classification algorithm is known to produce reliable results with less uncertainty. The method introduces randomness into sampling and feature selections and integrates consequent multiple decision trees. The multidimensional structure of the trees can characterize the statistical interactions of involved multiple features in complex situations. The authors explore the performance of the Random Forest method for NEXRAD radar data quality control. Training datasets are selected using several clear cases of precipitation and non-precipitation (but with some non-meteorological echoes). The model is structured using available candidate features (from the NEXRAD data) such as horizontal reflectivity, differential reflectivity, differential phase shift, copolar correlation coefficient, and their horizontal textures (e.g., local standard deviation). The influence of each feature on classification results is quantified by variable importance measures that are automatically estimated by the Random Forest algorithm. Therefore, the number and types of features in the final forest can be examined based on the classification accuracy. The authors demonstrate the capability of the proposed approach using several cases ranging from distinct to complex rain/no-rain events and compare the performance with the existing algorithms (e

  5. Selected Translations of the Eichmann Case from German Magazine

    Science.gov (United States)

    1960-07-06

    to establish contact with the Grand Mufti of Jerusalem, the number-one enemy of the Jews in the Near East. After the occupation of Austria, Eichmann ... SELECTED TRANSLATIONS ON THE EICHMANN CASE FROM GERMAN MAGAZINE (Following is a translation of two articles from Der Spiegel (The Mirror), Hamburg) ... Jerusalem parliament called for a routine debate on the budget. The atmosphere was listless. Only a few men with stiff, military bearing kept

  6. Random forest variable selection in spatial malaria transmission modelling in Mpumalanga Province, South Africa

    Directory of Open Access Journals (Sweden)

    Thandi Kapwata

    2016-11-01

    Malaria is an environmentally driven disease. In order to quantify the spatial variability of malaria transmission, it is imperative to understand the interactions between environmental variables and malaria epidemiology at a micro-geographic level using a novel statistical approach. The random forest (RF) statistical learning method, a relatively new variable-importance ranking method, measures the variable importance of potentially influential parameters through the percent increase of the mean squared error. As this value increases, so does the relative importance of the associated variable. The principal aim of this study was to create predictive malaria maps generated using the selected variables based on the RF algorithm in the Ehlanzeni District of Mpumalanga Province, South Africa. From the seven environmental variables used [temperature, lag temperature, rainfall, lag rainfall, humidity, altitude, and the normalized difference vegetation index (NDVI)], altitude was identified as the most influential predictor variable due to its high selection frequency. It was selected as the top predictor for 4 out of 12 months of the year, followed by NDVI, temperature and lag rainfall, which were each selected twice. The combination of climatic variables that produced the highest prediction accuracy was altitude, NDVI, and temperature. This suggests that these three variables have high predictive capabilities in relation to malaria transmission. Furthermore, it is anticipated that the predictive maps generated from predictions made by the RF algorithm could be used to monitor the progression of malaria and assist in intervention and prevention efforts with respect to malaria.

  7. Selecting Optimal Parameters of Random Linear Network Coding for Wireless Sensor Networks

    DEFF Research Database (Denmark)

    Heide, J; Zhang, Qi; Fitzek, F H P

    2013-01-01

    This work studies how to select optimal code parameters of Random Linear Network Coding (RLNC) in Wireless Sensor Networks (WSNs). With Rateless Deluge [1] the authors proposed to apply Network Coding (NC) for Over-the-Air Programming (OAP) in WSNs, and demonstrated that with NC a significant reduction in the number of transmitted packets can be achieved. However, NC introduces additional computations and potentially a non-negligible transmission overhead, both of which depend on the chosen coding parameters. Therefore it is necessary to consider the trade-off that these coding parameters present in order to obtain the lowest energy consumption per transmitted bit. This problem is analyzed and suitable coding parameters are determined for the popular Tmote Sky platform. Compared to the use of traditional RLNC, these parameters enable a reduction in the energy spent per bit which grows

  8. Fragmentation of random trees

    International Nuclear Information System (INIS)

    Kalay, Z; Ben-Naim, E

    2015-01-01

    We study fragmentation of a random recursive tree into a forest by repeated removal of nodes. The initial tree consists of N nodes and it is generated by sequential addition of nodes with each new node attaching to a randomly-selected existing node. As nodes are removed from the tree, one at a time, the tree dissolves into an ensemble of separate trees, namely, a forest. We study statistical properties of trees and nodes in this heterogeneous forest, and find that the fraction of remaining nodes m characterizes the system in the limit N→∞. We obtain analytically the size density ϕ_s of trees of size s. The size density has a power-law tail ϕ_s ∼ s^(−α) with exponent α = 1 + 1/m. Therefore, the tail becomes steeper as further nodes are removed, and the fragmentation process is unusual in that the exponent α increases continuously with time. We also extend our analysis to the case where nodes are added as well as removed, and obtain the asymptotic size density for growing trees. (paper)
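
    The process described is straightforward to simulate; the sketch below (using networkx, with an arbitrary tree size and kept fraction m) grows a random recursive tree, deletes a random fraction 1 − m of its nodes, and tabulates the sizes of the surviving trees, whose distribution should develop the predicted s^(−α) tail with α = 1 + 1/m.

```python
# Simulation sketch: fragmentation of a random recursive tree by node removal.
# Predicted size-density tail: phi_s ~ s^(-alpha) with alpha = 1 + 1/m.
import random
from collections import Counter
import networkx as nx

random.seed(0)
N, m = 100_000, 0.6                 # initial nodes, fraction of nodes kept

# Random recursive tree: node i attaches to a uniformly chosen earlier node
G = nx.Graph()
G.add_node(0)
for i in range(1, N):
    G.add_edge(i, random.randrange(i))

# Remove (1 - m) * N randomly chosen nodes; the tree dissolves into a forest
G.remove_nodes_from(random.sample(range(N), int((1 - m) * N)))

sizes = Counter(len(c) for c in nx.connected_components(G))
print(f"predicted tail exponent alpha = {1 + 1/m:.2f}")
for s in sorted(sizes)[:10]:
    print(f"trees of size {s:3d}: {sizes[s]}")
```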

  9. Familial versus mass selection in small populations

    Directory of Open Access Journals (Sweden)

    Couvet Denis

    2003-07-01

    We used diffusion approximations and a Markov-chain approach to investigate the consequences of familial selection on the viability of small populations both in the short and in the long term. The outcome of familial selection was compared to the case of a random mating population under mass selection. In small populations, the higher effective size, associated with familial selection, resulted in higher fitness for slightly deleterious and/or highly recessive alleles. Conversely, because familial selection leads to a lower rate of directional selection, a lower fitness was observed for more detrimental genes that are not highly recessive, and with high population sizes. However, in the long term, genetic load was almost identical for both mass and familial selection for populations of up to 200 individuals. In terms of mean time to extinction, familial selection did not have any negative effect at least for small populations (N ≤ 50). Overall, familial selection could be proposed for use in management programs of small populations since it increases genetic variability and short-term viability without impairing the overall persistence times.

  10. Application of random coherence order selection in gradient-enhanced multidimensional NMR

    International Nuclear Information System (INIS)

    Bostock, Mark J.; Nietlispach, Daniel

    2016-01-01

    Development of multidimensional NMR is essential to many applications, for example in high resolution structural studies of biomolecules. Multidimensional techniques enable separation of NMR signals over several dimensions, improving signal resolution, whilst also allowing identification of new connectivities. However, these advantages come at a significant cost. The Fourier transform theorem requires acquisition of a grid of regularly spaced points to satisfy the Nyquist criterion, while frequency discrimination and acquisition of a pure phase spectrum require acquisition of both quadrature components for each time point in every indirect (non-acquisition) dimension, adding a factor of 2^(N−1) to the number of free-induction decays which must be acquired, where N is the number of dimensions. Compressed sensing (CS) ℓ1-norm minimisation in combination with non-uniform sampling (NUS) has been shown to be extremely successful in overcoming the Nyquist criterion. Previously, maximum entropy reconstruction has also been used to overcome the limitation of frequency discrimination, processing data acquired with only one quadrature component at a given time interval, known as random phase detection (RPD), allowing a factor of two reduction in the number of points for each indirect dimension (Maciejewski et al. 2011, PNAS 108:16640). However, whilst this approach can be easily applied in situations where the quadrature components are acquired as amplitude modulated data, the same principle is not easily extended to phase modulated (P-/N-type) experiments where data is acquired in the form exp(iωt) or exp(−iωt), and which make up many of the multidimensional experiments used in modern NMR. Here we demonstrate a modification of the CS ℓ1-norm approach to allow random coherence order selection (RCS) for phase modulated experiments; we generalise the nomenclature for RCS and RPD as random quadrature detection (RQD). With this method, the power of RQD can be extended

  11. Risk-Controlled Multiobjective Portfolio Selection Problem Using a Principle of Compromise

    Directory of Open Access Journals (Sweden)

    Takashi Hasuike

    2014-01-01

    This paper proposes a multiobjective portfolio selection problem with the most probable random distribution derived from current market data and other random distributions of boom and recession under the risk-controlled parameters determined by an investor. The current market data and information include not only historical data but also interpretations of economists’ oral and linguistic information, and hence, the boom and recession are often caused by these nonnumeric data. Therefore, investors need to consider several situations from the most probable condition to boom and recession and to avoid the risk of falling below the target return in each situation. Furthermore, it is generally difficult to set random distributions of these cases exactly. Therefore, a robust-based approach for portfolio selection problems using only the mean values and variances of securities is proposed as a multiobjective programming problem. In addition, an exact algorithm is developed to obtain an explicit optimal portfolio using a principle of compromise.
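
    One way to read the compromise idea is as a worst-case mean-variance problem over a handful of scenarios. The sketch below, with hypothetical scenario means, covariances and risk aversion, maximizes the worst-case mean-variance score over "most probable", "boom" and "recession" scenarios; it is a simplification of the paper's formulation, and the non-smooth min objective is handed to a general-purpose solver purely for illustration.

```python
# Sketch of a scenario-robust mean-variance portfolio (hypothetical inputs).
import numpy as np
from scipy.optimize import minimize

scenarios = {   # expected returns and covariance per scenario (3 assets)
    "most_probable": (np.array([0.06, 0.04, 0.02]), np.diag([0.04, 0.02, 0.01])),
    "boom":          (np.array([0.12, 0.07, 0.02]), np.diag([0.09, 0.03, 0.01])),
    "recession":     (np.array([-0.05, 0.00, 0.02]), np.diag([0.16, 0.04, 0.01])),
}
risk_aversion = 3.0

def worst_case_score(w):
    return min(mu @ w - risk_aversion * w @ cov @ w for mu, cov in scenarios.values())

res = minimize(
    lambda w: -worst_case_score(w),          # maximize the worst-case score
    x0=np.ones(3) / 3,
    bounds=[(0, 1)] * 3,                     # long-only weights
    constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1}],
)
print("weights:", np.round(res.x, 3))
print("worst-case score: %.4f" % worst_case_score(res.x))
```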

  12. On Random Numbers and Design

    Science.gov (United States)

    Ben-Ari, Morechai

    2004-01-01

    The term "random" is frequently used in discussion of the theory of evolution, even though the mathematical concept of randomness is problematic and of little relevance in the theory. Therefore, since the core concept of the theory of evolution is the non-random process of natural selection, the term random should not be used in teaching the…

  13. Challenges faced by young persons living with HIV: The case of ...

    African Journals Online (AJOL)

    Challenges faced by young persons living with HIV: The case of children on the ... Journal of Social Development in Africa ... to 34 randomly selected children who were beneficiaries of an initiative called the Community Outreach Programme.

  14. Selecting for Fast Protein-Protein Association As Demonstrated on a Random TEM1 Yeast Library Binding BLIP.

    Science.gov (United States)

    Cohen-Khait, Ruth; Schreiber, Gideon

    2018-04-27

    Protein-protein interactions mediate the vast majority of cellular processes. Though protein interactions obey the same basic chemical principles within the cell, the in vivo physiological environment may not allow for equilibrium to be reached. Thus, in vitro measured thermodynamic affinity may not provide a complete picture of protein interactions in the biological context. Binding kinetics, composed of the association and dissociation rate constants, are relevant and important in the cell. Therefore, changes in protein-protein interaction kinetics have a significant impact on the in vivo activity of the proteins. The common protocol for the selection of tighter binders from a mutant library selects for protein complexes with slower dissociation rate constants. Here we describe a method to specifically select for variants with faster association rate constants by using pre-equilibrium selection, starting from a large random library. Toward this end, we refine the selection conditions of a TEM1-β-lactamase library against its natural nanomolar affinity binder β-lactamase inhibitor protein (BLIP). The optimal selection conditions depend on the ligand concentration and on the incubation time. In addition, we show that a second sort of the library helps to separate signal from noise, resulting in a higher percentage of faster binders in the selected library. Fast-associating protein variants are of particular interest for drug development and other biotechnological applications.
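
    The rationale for pre-equilibrium selection can be illustrated with pseudo-first-order binding kinetics: at short incubation times the bound fraction is governed largely by the association rate constant, while near equilibrium it reflects affinity alone. The rate constants below are arbitrary illustrative values, chosen so that both variants share the same KD.

```python
# Pseudo-first-order binding: fraction bound after incubation time t at ligand
# concentration L. Short t discriminates by k_on; long t only by affinity.
import math

def fraction_bound(k_on, k_off, L, t):
    k_obs = k_on * L + k_off
    return (k_on * L / k_obs) * (1.0 - math.exp(-k_obs * t))

L = 1e-9                                   # 1 nM ligand; illustrative value
variants = {                               # k_on (1/M/s), k_off (1/s); same KD = 1 nM
    "fast on / fast off": (1e7, 1e-2),
    "slow on / slow off": (1e5, 1e-4),
}

for t in (10.0, 1e5):                      # short vs near-equilibrium incubation
    print(f"incubation t = {t:g} s")
    for name, (kon, koff) in variants.items():
        print(f"  {name}: bound fraction = {fraction_bound(kon, koff, L, t):.3f}")
```

    With these illustrative numbers, at t = 10 s the fast-associating variant is bound roughly 90-fold more than the slow one despite the identical KD, which is the window that pre-equilibrium selection exploits.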

  15. Optimal Subset Selection of Time-Series MODIS Images and Sample Data Transfer with Random Forests for Supervised Classification Modelling.

    Science.gov (United States)

    Zhou, Fuqun; Zhang, Aining

    2016-10-25

    Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS) datasets including 8-day composites from NASA, and 10-day composites from the Canada Centre for Remote Sensing (CCRS). It is challenging to efficiently use these time-series MODIS datasets for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will be greater when Sentinel 2-3 data become available. Another challenge that researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle the two important issues with a case study of land cover mapping using CCRS 10-day MODIS composites with the help of two Random Forests features: variable importance and outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is utilized for transferring sample data available from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that using only about half of the variables we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transferring could make supervised modelling possible for applications lacking sample data.

  16. Comparative Analysis of Site-Selection Process for Power Plants in Korea: Cases of Thermal, Nuclear, and Renewable Energies

    International Nuclear Information System (INIS)

    Kang, M.; Lee, M.; Yoon, J. W.; Choi, H. C.; Chu, C.; Lee, H.; Park, J.

    2017-01-01

    There are various conflicts related to power generation facilities; however, the conflicts that arise during the process of attracting facilities or selecting sites, as in the previous cases, can greatly influence the implementation of national energy policy or strategy. This study analyzed the conflicts that occurred in the site selection policy for power generation facilities through case studies. We selected the most recent conflict case for each energy source, identified the qualitative contextual characteristics of each case, and suggest policy levers. We conclude that the current decision-making system for power plant site selection is still insufficient to address the causes of conflict, given variable circumstances such as environmental events and the range of stakeholders. However, the conclusions obtained from the case studies are difficult to generalize without specific prescriptions, so further studies in these areas are required.

  17. A Permutation Importance-Based Feature Selection Method for Short-Term Electricity Load Forecasting Using Random Forest

    Directory of Open Access Journals (Sweden)

    Nantian Huang

    2016-09-01

    The prediction accuracy of short-term load forecasting (STLF) depends on the choice of prediction model and the result of feature selection. In this paper, a novel random forest (RF)-based feature selection method for STLF is proposed. First, 243 related features were extracted from historical load data and the time information of prediction points to form the original feature set. Subsequently, the original feature set was used to train an RF as the original model. After the training process, the prediction error of the original model on the test set was recorded and the permutation importance (PI) value of each feature was obtained. Then, an improved sequential backward search method was used to select the optimal forecasting feature subset based on the PI value of each feature. Finally, the optimal forecasting feature subset was used to train a new RF model as the final prediction model. Experiments showed that the prediction accuracy of the RF trained by the optimal forecasting feature subset was higher than that of the original model and comparative models based on support vector regression and artificial neural network.
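
    A compact sketch of the pipeline described (random-forest permutation importance followed by backward elimination of the least important feature) is given below. It uses scikit-learn's permutation_importance on synthetic data; the paper's 243 engineered load features, its improved search procedure and a proper validation split are not reproduced.

```python
# Sketch: RF permutation importance + simple backward feature elimination.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, p = 1000, 20
X = rng.normal(size=(n, p))
y = 2 * X[:, 0] + X[:, 3] - 0.5 * X[:, 7] + rng.normal(scale=0.3, size=n)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

features = list(range(p))
best_feats, best_err = features[:], np.inf
while len(features) > 1:
    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr[:, features], y_tr)
    err = np.mean((rf.predict(X_te[:, features]) - y_te) ** 2)
    if err < best_err:
        best_feats, best_err = features[:], err
    pi = permutation_importance(rf, X_te[:, features], y_te, n_repeats=5, random_state=0)
    features.pop(int(np.argmin(pi.importances_mean)))   # drop the least important

print("selected features:", best_feats)
print("test MSE: %.3f" % best_err)
```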

  18. Direct random insertion mutagenesis of Helicobacter pylori

    NARCIS (Netherlands)

    de Jonge, Ramon; Bakker, Dennis; van Vliet, Arnoud H. M.; Kuipers, Ernst J.; Vandenbroucke-Grauls, Christina M. J. E.; Kusters, Johannes G.

    2003-01-01

    Random insertion mutagenesis is a widely used technique for the identification of bacterial virulence genes. Most strategies for random mutagenesis involve cloning in Escherichia coli for passage of plasmids or for phenotypic selection. This can result in biased selection due to restriction or

  19. Direct random insertion mutagenesis of Helicobacter pylori.

    NARCIS (Netherlands)

    Jonge, de R.; Bakker, D.; Vliet, van AH; Kuipers, E.J.; Vandenbroucke-Grauls, C.M.J.E.; Kusters, J.G.

    2003-01-01

    Random insertion mutagenesis is a widely used technique for the identification of bacterial virulence genes. Most strategies for random mutagenesis involve cloning in Escherichia coli for passage of plasmids or for phenotypic selection. This can result in biased selection due to restriction or

  20. Feature Selection as a Time and Cost-Saving Approach for Land Suitability Classification (Case Study of Shavur Plain, Iran

    Directory of Open Access Journals (Sweden)

    Saeid Hamzeh

    2016-10-01

    Land suitability classification is important in planning and managing sustainable land use. Most approaches to land suitability analysis combine a large number of land and soil parameters, and are time-consuming and costly. In this study, a potentially useful technique (a combined feature selection and fuzzy-AHP method) to increase the efficiency of land suitability analysis was presented. To this end, three different feature selection algorithms—random search, best search and genetic methods—were used to determine the most effective parameters for land suitability classification for the cultivation of barley in the Shavur Plain, southwest Iran. Next, land suitability classes were calculated for all methods by using the fuzzy-AHP approach. Salinity (electrical conductivity, EC), alkalinity (exchangeable sodium percentage, ESP), wetness and soil texture were selected using the random search method. Gypsum, EC, ESP, and soil texture were selected using both the best search and genetic methods. The result shows a strong agreement between the standard fuzzy-AHP methods and the methods presented in this study. The values of Kappa coefficients were 0.82, 0.79 and 0.79 for the random search, best search and genetic methods, respectively, compared with the standard fuzzy-AHP method. Our results indicate that EC, ESP, soil texture and wetness are the most effective features for evaluating land suitability classification for the cultivation of barley in the study area, and use of these parameters, together with their appropriate weights as obtained from fuzzy-AHP, can produce good results for land suitability classification. So, the combined feature selection presented and the fuzzy-AHP approach have the potential to save time and money for land suitability classification.

  1. The selection problem for discounted Hamilton–Jacobi equations: some non-convex cases

    KAUST Repository

    Gomes, Diogo A.; Mitake, Hiroyoshi; Tran, Hung V.

    2018-01-01

    Here, we study the selection problem for the vanishing discount approximation of non-convex, first-order Hamilton–Jacobi equations. While the selection problem is well understood for convex Hamiltonians, the selection problem for non-convex Hamiltonians has thus far not been studied. We begin our study by examining a generalized discounted Hamilton–Jacobi equation. Next, using an exponential transformation, we apply our methods to strictly quasi-convex and to some non-convex Hamilton–Jacobi equations. Finally, we examine a non-convex Hamiltonian with flat parts to which our results do not directly apply. In this case, we establish the convergence by a direct approach.

  2. The selection problem for discounted Hamilton–Jacobi equations: some non-convex cases

    KAUST Repository

    Gomes, Diogo A.

    2018-01-26

    Here, we study the selection problem for the vanishing discount approximation of non-convex, first-order Hamilton–Jacobi equations. While the selection problem is well understood for convex Hamiltonians, the selection problem for non-convex Hamiltonians has thus far not been studied. We begin our study by examining a generalized discounted Hamilton–Jacobi equation. Next, using an exponential transformation, we apply our methods to strictly quasi-convex and to some non-convex Hamilton–Jacobi equations. Finally, we examine a non-convex Hamiltonian with flat parts to which our results do not directly apply. In this case, we establish the convergence by a direct approach.

  3. Selecting registration schemes in case of interstitial lung disease follow-up in CT

    International Nuclear Information System (INIS)

    Vlachopoulos, Georgios; Korfiatis, Panayiotis; Skiadopoulos, Spyros; Kazantzi, Alexandra; Kalogeropoulou, Christina; Pratikakis, Ioannis; Costaridou, Lena

    2015-01-01

    Purpose: Primary goal of this study is to select optimal registration schemes in the framework of interstitial lung disease (ILD) follow-up analysis in CT. Methods: A set of 128 multiresolution schemes composed of multiresolution nonrigid and combinations of rigid and nonrigid registration schemes are evaluated, utilizing ten artificially warped ILD follow-up volumes, originating from ten clinical volumetric CT scans of ILD affected patients, to select candidate optimal schemes. Specifically, all combinations of four transformation models (three rigid: rigid, similarity, affine and one nonrigid: third order B-spline), four cost functions (sum-of-square distances, normalized correlation coefficient, mutual information, and normalized mutual information), four gradient descent optimizers (standard, regular step, adaptive stochastic, and finite difference), and two types of pyramids (recursive and Gaussian-smoothing) were considered. The selection process involves two stages. The first stage involves identification of schemes with deformation field singularities, according to the determinant of the Jacobian matrix. In the second stage, evaluation methodology is based on distance between corresponding landmark points in both normal lung parenchyma (NLP) and ILD affected regions. Statistical analysis was performed in order to select near optimal registration schemes per evaluation metric. Performance of the candidate registration schemes was verified on a case sample of ten clinical follow-up CT scans to obtain the selected registration schemes. Results: By considering near optimal schemes common to all ranking lists, 16 out of 128 registration schemes were initially selected. These schemes obtained submillimeter registration accuracies in terms of average distance errors 0.18 ± 0.01 mm for NLP and 0.20 ± 0.01 mm for ILD, in case of artificially generated follow-up data. Registration accuracy in terms of average distance error in clinical follow-up data was in the

  4. Selecting registration schemes in case of interstitial lung disease follow-up in CT

    Energy Technology Data Exchange (ETDEWEB)

    Vlachopoulos, Georgios; Korfiatis, Panayiotis; Skiadopoulos, Spyros; Kazantzi, Alexandra [Department of Medical Physics, School of Medicine,University of Patras, Patras 26504 (Greece); Kalogeropoulou, Christina [Department of Radiology, School of Medicine, University of Patras, Patras 26504 (Greece); Pratikakis, Ioannis [Department of Electrical and Computer Engineering, Democritus University of Thrace, Xanthi 67100 (Greece); Costaridou, Lena, E-mail: costarid@upatras.gr [Department of Medical Physics, School of Medicine, University of Patras, Patras 26504 (Greece)

    2015-08-15

    Purpose: Primary goal of this study is to select optimal registration schemes in the framework of interstitial lung disease (ILD) follow-up analysis in CT. Methods: A set of 128 multiresolution schemes composed of multiresolution nonrigid and combinations of rigid and nonrigid registration schemes are evaluated, utilizing ten artificially warped ILD follow-up volumes, originating from ten clinical volumetric CT scans of ILD affected patients, to select candidate optimal schemes. Specifically, all combinations of four transformation models (three rigid: rigid, similarity, affine and one nonrigid: third order B-spline), four cost functions (sum-of-square distances, normalized correlation coefficient, mutual information, and normalized mutual information), four gradient descent optimizers (standard, regular step, adaptive stochastic, and finite difference), and two types of pyramids (recursive and Gaussian-smoothing) were considered. The selection process involves two stages. The first stage involves identification of schemes with deformation field singularities, according to the determinant of the Jacobian matrix. In the second stage, evaluation methodology is based on distance between corresponding landmark points in both normal lung parenchyma (NLP) and ILD affected regions. Statistical analysis was performed in order to select near optimal registration schemes per evaluation metric. Performance of the candidate registration schemes was verified on a case sample of ten clinical follow-up CT scans to obtain the selected registration schemes. Results: By considering near optimal schemes common to all ranking lists, 16 out of 128 registration schemes were initially selected. These schemes obtained submillimeter registration accuracies in terms of average distance errors 0.18 ± 0.01 mm for NLP and 0.20 ± 0.01 mm for ILD, in case of artificially generated follow-up data. Registration accuracy in terms of average distance error in clinical follow-up data was in the
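
    The first screening stage described above (rejecting schemes whose deformation fields fold) can be sketched by evaluating the determinant of the Jacobian of the transformation and flagging non-positive values. The example below is 2-D with a made-up displacement field; the study worked on 3-D CT volumes.

```python
# Sketch: flag deformation-field singularities where det(Jacobian) <= 0.
import numpy as np

def jacobian_determinant_2d(ux, uy, spacing=(1.0, 1.0)):
    """ux, uy: displacement components on a regular (row, col) grid."""
    dux_dr, dux_dc = np.gradient(ux, *spacing)
    duy_dr, duy_dc = np.gradient(uy, *spacing)
    # Transformation T(x) = x + u(x), so J = I + grad(u); x ~ columns, y ~ rows
    return (1.0 + dux_dc) * (1.0 + duy_dr) - dux_dr * duy_dc

rows, cols = np.mgrid[0:64, 0:64].astype(float)
ux = 3.0 * np.sin(cols / 8.0)        # smooth, made-up displacement in x
uy = 0.5 * np.cos(rows / 8.0)        # smooth, made-up displacement in y

jac = jacobian_determinant_2d(ux, uy)
print("min det(J) = %.3f, folding present: %s" % (jac.min(), bool((jac <= 0).any())))
```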

  5. Coupled continuous time-random walks in quenched random environment

    Science.gov (United States)

    Magdziarz, M.; Szczotka, W.

    2018-02-01

    We introduce a coupled continuous-time random walk with coupling that is characteristic of Lévy walks. Additionally, we assume that the walker moves in a quenched random environment, i.e. the site disorder at each lattice point is fixed in time. We analyze the scaling limit of such a random walk. We show that for large times the behaviour of the analyzed process is exactly the same as in the case of the uncoupled quenched trap model for Lévy flights.

  6. [Use of Cone Beam Computed Tomography in endodontics: rational case selection criteria].

    Science.gov (United States)

    Rosen, E; Tsesis, I

    2016-01-01

    To present rational case selection criteria for the use of CBCT (Cone Beam Computed Tomography) in endodontics. This article reviews the literature concerning the benefits of CBCT in endodontics, alongside its radiation risks, and presents case selection criteria for referral of endodontic patients to CBCT. To date, the expected ultimate benefit of CBCT to the endodontic patient remains uncertain, and the current literature is mainly restricted to its technical efficacy. In addition, the potential radiation risks of a CBCT scan are stochastic in nature and uncertain, and are of particular concern in pediatric patients. Both the efficacy of CBCT in supporting the endodontic practitioner's decision making and in affecting treatment outcomes, and its long-term potential radiation risks, remain uncertain. Therefore, cautious, rational decision making is essential when a CBCT scan is considered in endodontics. Risk-benefit considerations are presented.

  7. Materials selection as an interdisciplinary technical activity: basic methodology and case studies

    Directory of Open Access Journals (Sweden)

    M. Ferrante

    2000-04-01

    The technical activity known as Materials Selection is reviewed in its concepts and methodologies. Objectives and strategies are briefly presented and two important features are introduced and discussed: (i) Merit Indices, a combination of materials properties which maximises the objectives chosen by the designer, and (ii) Materials Properties Maps, a bi-dimensional space whose coordinates are pairs of properties, in which materials can be plotted and compared directly in terms of their merit indices. A general strategy for the deduction of these indices is explained and a formal methodology to establish a ranking of candidate materials when multiple constraints intervene is presented. Finally, two case studies are discussed in depth, one related to materials substitution in the context of mechanical design and a less conventional case linking material selection to physical comfort in the home furniture industry.
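
    A toy version of the merit-index ranking: for a light, stiff beam the classical index is M = E^(1/2)/ρ, and candidates are simply sorted by it. The property values below are rough, order-of-magnitude textbook figures used only for illustration.

```python
# Toy merit-index ranking for a light, stiff beam: M = sqrt(E) / rho.
# Property values are rough, order-of-magnitude figures (illustration only).
materials = {
    # name: (Young's modulus E in GPa, density rho in kg/m^3)
    "steel":           (210, 7850),
    "aluminium alloy":  (70, 2700),
    "CFRP":            (110, 1550),
    "wood (spruce)":    (10,  450),
}

ranked = sorted(
    ((name, (E * 1e9) ** 0.5 / rho) for name, (E, rho) in materials.items()),
    key=lambda item: item[1], reverse=True,
)
for name, M in ranked:
    print(f"{name:16s} M = {M:6.1f}")
```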

  8. Selective oropharyngeal decontamination versus selective digestive decontamination in critically ill patients: a meta-analysis of randomized controlled trials

    Directory of Open Access Journals (Sweden)

    Zhao D

    2015-07-01

    Di Zhao,1,* Jian Song,2,* Xuan Gao,3 Fei Gao,4 Yupeng Wu,2 Yingying Lu,5 Kai Hou1 1Department of Neurosurgery, The First Hospital of Hebei Medical University, 2Department of Neurosurgery, 3Department of Neurology, The Second Hospital of Hebei Medical University, 4Hebei Provincial Procurement Centers for Medical Drugs and Devices, 5Department of Neurosurgery, The Second Hospital of Hebei Medical University, Shijiazhuang, People’s Republic of China *These authors contributed equally to this work Background: Selective digestive decontamination (SDD) and selective oropharyngeal decontamination (SOD) are associated with reduced mortality and infection rates among patients in intensive care units (ICUs); however, whether SOD has a superior effect to SDD remains uncertain. Hence, we conducted a meta-analysis of randomized controlled trials (RCTs) to compare SOD with SDD in terms of clinical outcomes and antimicrobial resistance rates in patients who were critically ill. Methods: RCTs published in PubMed, Embase, and Web of Science were systematically reviewed to compare the effects of SOD and SDD in patients who were critically ill. Outcomes included day-28 mortality, length of ICU stay, length of hospital stay, duration of mechanical ventilation, ICU-acquired bacteremia, and prevalence of antibiotic-resistant Gram-negative bacteria. Results were expressed as risk ratios (RRs) with 95% confidence intervals (CIs), and weighted mean differences (WMDs) with 95% CIs. Pooled estimates were performed using a fixed-effects model or random-effects model, depending on the heterogeneity among studies. Results: A total of four RCTs involving 23,822 patients met the inclusion criteria and were included in this meta-analysis. Among patients whose admitting specialty was surgery, cardiothoracic surgery (57.3%) and neurosurgery (29.7%) were the two main types of surgery being performed. Pooled results showed that SOD had similar effects to SDD in day-28 mortality (RR =1
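
    The fixed-effects pooling step can be sketched as inverse-variance weighting of log risk ratios. The 2x2 counts below are invented for illustration and are not the trial data from this meta-analysis; a random-effects version would additionally estimate between-trial heterogeneity.

```python
# Sketch: fixed-effect inverse-variance pooling of risk ratios (log scale).
# The 2x2 counts are invented for illustration, not data from the meta-analysis.
import math

def log_rr_and_var(a, n1, b, n2):
    rr = (a / n1) / (b / n2)
    return math.log(rr), 1/a - 1/n1 + 1/b - 1/n2

trials = [          # (events_SOD, n_SOD, events_SDD, n_SDD), hypothetical
    (120, 1000, 110, 1000),
    (300, 2500, 290, 2500),
    (45,   400,  50,  400),
]

logs, weights = zip(*[(lr, 1 / v) for lr, v in (log_rr_and_var(*t) for t in trials)])
pooled = sum(w * l for w, l in zip(weights, logs)) / sum(weights)
se = math.sqrt(1 / sum(weights))
print("pooled RR = %.2f (95%% CI %.2f to %.2f)"
      % (math.exp(pooled), math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se)))
```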

  9. An assessment of the quality of care for children in eighteen randomly selected district and sub-district hospitals in Bangladesh

    Directory of Open Access Journals (Sweden)

    Hoque Dewan ME

    2012-12-01

    Background Quality hospital care is important in ensuring that the needs of severely ill children are met to avert child mortality. However, the quality of hospital care for children in developing countries has often been found to be poor. As the first step of a country road map for improving hospital care for children, we assessed the baseline situation with respect to the quality of care provided to children under five years of age in district and sub-district level hospitals in Bangladesh. Methods Using adapted World Health Organization (WHO) hospital assessment tools and standards, an assessment of 18 randomly selected district (n=6) and sub-district (n=12) hospitals was undertaken. Teams of trained assessors used direct case observation, record review, interviews, and Management Information System (MIS) data to assess the quality of clinical case management and monitoring; infrastructure, processes and hospital administration; essential hospital and laboratory supports, drugs and equipment. Results Findings demonstrate that the overall quality of care provided in these hospitals was poor. No hospital had a functioning triage system to prioritise those children most in need of immediate care. Laboratory supports and essential equipment were deficient. Only one hospital had all of the essential drugs for paediatric care. Less than a third of hospitals had a back-up power supply, and just under half had functioning arrangements for safe drinking water. Clinical case management was found to be sub-optimal for prevalent illnesses, as was the quality of neonatal care. Conclusion Action is needed to improve the quality of paediatric care in hospital settings in Bangladesh, with a particular need to invest in improving newborn care.

  10. Strategyproof Peer Selection using Randomization, Partitioning, and Apportionment

    OpenAIRE

    Aziz, Haris; Lev, Omer; Mattei, Nicholas; Rosenschein, Jeffrey S.; Walsh, Toby

    2016-01-01

    Peer review, evaluation, and selection is a fundamental aspect of modern science. Funding bodies the world over employ experts to review and select the best proposals of those submitted for funding. The problem of peer selection, however, is much more general: a professional society may want to give a subset of its members awards based on the opinions of all members; an instructor for a MOOC or online course may want to crowdsource grading; or a marketing company may select ideas from group b...

  11. Just-in-time consent: The ethical case for an alternative to traditional informed consent in randomized trials comparing an experimental intervention with usual care.

    Science.gov (United States)

    Vickers, Andrew J; Young-Afat, Danny A; Ehdaie, Behfar; Kim, Scott Yh

    2018-02-01

    Informed consent for randomized trials often causes significant and persistent anxiety, distress and confusion to patients. Where an experimental treatment is compared to a standard care control, much of this burden is potentially avoidable in the control group. We propose a "just-in-time" consent in which consent discussions take place in two stages: an initial consent to research from all participants and a later specific consent to randomized treatment only from those assigned to the experimental intervention. All patients are first approached and informed about research procedures, such as questionnaires or tests. They are also informed that they might be randomly selected to receive an experimental treatment and that, if selected, they can learn more about the treatment and decide whether or not to accept it at that time. After randomization, control patients undergo standard clinical consent whereas patients randomized to the experimental procedure undergo a second consent discussion. Analysis would be by intent-to-treat, which protects the trial from selection bias, although not from poor acceptance of experimental treatment. The advantages of just-in-time consent stem from the fact that only patients randomized to the experimental treatment are subject to a discussion of that intervention. We hypothesize that this will reduce much of the patient's burden associated with the consent process, such as decisional anxiety, confusion and information overload. We recommend well-controlled studies to compare just-in-time and traditional consent, with endpoints to include characteristics of participants, distress and anxiety and participants' understanding of research procedures.

  12. Oracle Efficient Variable Selection in Random and Fixed Effects Panel Data Models

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl

    This paper generalizes the results for the Bridge estimator of Huang et al. (2008) to linear random and fixed effects panel data models which are allowed to grow in both dimensions. In particular we show that the Bridge estimator is oracle efficient. It can correctly distinguish between relevant and irrelevant variables and the asymptotic distribution of the estimators of the coefficients of the relevant variables is the same as if only these had been included in the model, i.e. as if an oracle had revealed the true model prior to estimation. In the case of more explanatory variables than observations, we prove that the Marginal Bridge estimator can asymptotically correctly distinguish between relevant and irrelevant explanatory variables. We do this without restricting the dependence between covariates and without assuming sub-Gaussianity of the error terms, thereby generalizing the results

  13. Associations between selected allergens, phthalates, nicotine, polycyclic aromatic hydrocarbons, and bedroom ventilation and clinically confirmed asthma, rhinoconjunctivitis, and atopic dermatitis in preschool children

    DEFF Research Database (Denmark)

    Callesen, M.; Bekö, Gabriel; Weschler, Charles J.

    2014-01-01

    The study is a cross-sectional case-control study of 500 children aged 3-5 years from Odense, Denmark, examining associations between the selected exposures and clinically confirmed asthma, rhinoconjunctivitis, and atopic dermatitis. The 200 cases had at least two parentally reported allergic diseases, while the 300 controls were randomly selected from 2835 participating families. A single physician conducted clinical examinations of all 500 children. Children from the initially random control group with clinically confirmed allergic disease were subsequently excluded from the control group and admitted to the case group, leaving 242 in the healthy control group...

  14. 47 CFR 1.1604 - Post-selection hearings.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Post-selection hearings. 1.1604 Section 1.1604 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1604 Post-selection hearings. (a) Following the random...

  15. Random genetic drift, natural selection, and noise in human cranial evolution.

    Science.gov (United States)

    Roseman, Charles C

    2016-08-01

    This study assesses the extent to which relationships among groups complicate comparative studies of adaptation in recent human cranial variation and the extent to which departures from neutral additive models of evolution hinder the reconstruction of population relationships among groups using cranial morphology. Using a maximum likelihood evolutionary model fitting approach and a mixed population genomic and cranial data set, I evaluate the relative fits of several widely used models of human cranial evolution. Moreover, I compare the goodness of fit of models of cranial evolution constrained by genomic variation to test hypotheses about population specific departures from neutrality. Models from population genomics are much better fits to cranial variation than are traditional models from comparative human biology. There is not enough evolutionary information in the cranium to reconstruct much of recent human evolution but the influence of population history on cranial variation is strong enough to cause comparative studies of adaptation serious difficulties. Deviations from a model of random genetic drift along a tree-like population history show the importance of environmental effects, gene flow, and/or natural selection on human cranial variation. Moreover, there is a strong signal of the effect of natural selection or an environmental factor on a group of humans from Siberia. The evolution of the human cranium is complex and no one evolutionary process has prevailed at the expense of all others. A holistic unification of phenome, genome, and environmental context, gives us a strong point of purchase on these problems, which is unavailable to any one traditional approach alone. Am J Phys Anthropol 160:582-592, 2016. © 2016 Wiley Periodicals, Inc. © 2016 Wiley Periodicals, Inc.

  16. Neonatal Vitamin D Levels in Relation to Risk of Overweight at 7 Years in the Danish D-Tect Case-Cohort Study

    DEFF Research Database (Denmark)

    Jensen, Camilla B.; Lundqvist, Marika; Sørensen, Thorkild I. A.

    2017-01-01

    Background: Vitamin D level in pregnancy may be associated with risk of overweight in the offspring later in life. Methods: In a case-cohort study based on Danish biobanks and registers we examined the association between 25-hydroxy-vitamin D (25(OH)D) level at birth and overweight at 7 years. Cases of overweight (n = 871) were randomly selected among 7-year-old children from the Copenhagen School Health Records Register (CSHRR) with a BMI above the 90th percentile. The cohort (n = 1,311) was a random sample selected among all Danish children born during the same period. Neonatal 25(OH)D...

  17. Validation of a New Method to Automatically Select Cases With Intraoperative Red Blood Cell Transfusion for Audit.

    Science.gov (United States)

    Dexter, Franklin; Epstein, Richard H; Ledolter, Johannes; Dasovich, Susan M; Herman, Jay H; Maga, Joni M; Schwenk, Eric S

    2018-05-01

    Hospitals review allogeneic red blood cell (RBC) transfusions for appropriateness. Audit criteria have been published that apply to 5 common procedures. We expanded on this work to study the management decision of selecting which cases involving transfusion of at least 1 RBC unit to audit (review) among all surgical procedures, including those previously studied. This retrospective, observational study included 400,000 cases among 1891 different procedures over an 11-year period. There were 12,616 cases with RBC transfusion. We studied the proportions of cases that would be audited based on criteria of a nadir hemoglobin (Hb) greater than the hospital's selected transfusion threshold, or an absent Hb or missing estimated blood loss (EBL), among procedures grouped by their median EBL. Most cases (>50%) that would be audited and most cases (>50%) with transfusion were among procedures with low median EBL. An automated process to select cases for audit of intraoperative transfusion of RBC needs to consider the median EBL of the procedure, whether the nadir Hb is below the hospital's Hb transfusion threshold for surgical cases, and the absence of either a Hb value or an entry of the EBL for the case. This conclusion applies to all surgical cases and procedures.

  18. Do vouchers lead to sorting under random private-school selection? Evidence from the Milwaukee voucher program

    OpenAIRE

    Chakrabarti, Rajashri

    2009-01-01

    This paper analyzes the effect of school vouchers on student sorting - defined as a flight to private schools by high-income and committed public-school students - and whether vouchers can be designed to reduce or eliminate it. Much of the existing literature investigates sorting in cases where private schools can screen students. However, publicly funded U.S. voucher programs require a private school to accept all students unless it is oversubscribed and to pick students randomly if it is oversubscribed.

  19. Random ensemble learning for EEG classification.

    Science.gov (United States)

    Hosseini, Mohammad-Parsa; Pompili, Dario; Elisevich, Kost; Soltanian-Zadeh, Hamid

    2018-01-01

    Real-time detection of seizure activity in epilepsy patients is critical in averting seizure activity and improving patients' quality of life. Accurate evaluation, presurgical assessment, seizure prevention, and emergency alerts all depend on the rapid detection of seizure onset. A new method of feature selection and classification for rapid and precise seizure detection is discussed wherein informative components of electroencephalogram (EEG)-derived data are extracted and an automatic method is presented using infinite independent component analysis (I-ICA) to select independent features. The feature space is divided into subspaces via random selection and multichannel support vector machines (SVMs) are used to classify these subspaces. The result of each classifier is then combined by majority voting to establish the final output. In addition, a random subspace ensemble using a combination of SVM, multilayer perceptron (MLP) neural network and an extended k-nearest neighbors (k-NN), called extended nearest neighbor (ENN), is developed for the EEG and electrocorticography (ECoG) big data problem. To evaluate the solution, a benchmark ECoG of eight patients with temporal and extratemporal epilepsy was implemented in a distributed computing framework as a multitier cloud-computing architecture. Using leave-one-out cross-validation, the accuracy, sensitivity, specificity, and both false positive and false negative ratios of the proposed method were found to be 0.97, 0.98, 0.96, 0.04, and 0.02, respectively. Application of the solution to cases under investigation with ECoG has also been effected to demonstrate its utility. Copyright © 2017 Elsevier B.V. All rights reserved.
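    The random-subspace idea described above is straightforward to prototype. The sketch below is a minimal illustration only: synthetic feature vectors stand in for the EEG/ECoG features, and the number of ensemble members, subspace size and SVM settings are arbitrary choices, not the authors' configuration.

    ```python
    # Minimal sketch of a random-subspace SVM ensemble with majority voting.
    # Synthetic data stands in for EEG/ECoG feature vectors; all sizes are illustrative.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)
    X, y = make_classification(n_samples=600, n_features=64, n_informative=12,
                               random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    n_members, subspace_dim = 15, 16
    members = []
    for _ in range(n_members):
        # Each ensemble member sees a random subset of the feature space.
        cols = rng.choice(X.shape[1], size=subspace_dim, replace=False)
        clf = SVC(kernel="rbf", gamma="scale").fit(X_tr[:, cols], y_tr)
        members.append((cols, clf))

    # Majority vote across the subspace classifiers.
    votes = np.array([clf.predict(X_te[:, cols]) for cols, clf in members])
    y_hat = (votes.mean(axis=0) >= 0.5).astype(int)
    print("ensemble accuracy:", accuracy_score(y_te, y_hat))
    ```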

  20. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology.

    Science.gov (United States)

    Fox, Eric W; Hill, Ryan A; Leibowitz, Scott G; Olsen, Anthony R; Thornbrugh, Darren J; Weber, Marc H

    2017-07-01

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological data sets, there is limited guidance on variable selection methods for RF modeling. Typically, either a preselected set of predictor variables are used or stepwise procedures are employed which iteratively remove variables according to their importance measures. This paper investigates the application of variable selection methods to RF models for predicting probable biological stream condition. Our motivating data set consists of the good/poor condition of n = 1365 stream survey sites from the 2008/2009 National Rivers and Stream Assessment, and a large set (p = 212) of landscape features from the StreamCat data set as potential predictors. We compare two types of RF models: a full variable set model with all 212 predictors and a reduced variable set model selected using a backward elimination approach. We assess model accuracy using RF's internal out-of-bag estimate, and a cross-validation procedure with validation folds external to the variable selection process. We also assess the stability of the spatial predictions generated by the RF models to changes in the number of predictors and argue that model selection needs to consider both accuracy and stability. The results suggest that RF modeling is robust to the inclusion of many variables of moderate to low importance. We found no substantial improvement in cross-validated accuracy as a result of variable reduction. Moreover, the backward elimination procedure tended to select too few variables and exhibited numerous issues such as upwardly biased out-of-bag accuracy estimates and instabilities in the spatial predictions. We use simulations to further support and generalize results from the analysis of real data. A main purpose of this work is to elucidate issues of model selection bias and instability to ecologists interested in
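    As a rough illustration of the backward-elimination strategy discussed above, the sketch below repeatedly refits a random forest, records the out-of-bag (OOB) accuracy and drops the least important predictor. The data are simulated and all sizes are illustrative, so it is not a reproduction of the StreamCat analysis.

    ```python
    # Sketch of backward elimination for a random forest classifier, tracking OOB
    # accuracy as the least-important predictor is dropped each round.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    X, y = make_classification(n_samples=800, n_features=50, n_informative=8,
                               random_state=1)
    kept = list(range(X.shape[1]))
    history = []
    while len(kept) > 5:
        rf = RandomForestClassifier(n_estimators=200, oob_score=True,
                                    random_state=1, n_jobs=-1).fit(X[:, kept], y)
        history.append((len(kept), rf.oob_score_))
        # Drop the predictor with the smallest importance measure.
        kept.pop(int(np.argmin(rf.feature_importances_)))

    for n_vars, oob in history:
        print(f"{n_vars:3d} predictors -> OOB accuracy {oob:.3f}")
    ```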

  1. Quantifying the impact of selection bias caused by nonparticipation in a case-control study of mobile phone use

    DEFF Research Database (Denmark)

    Vrijheid, Martine; Richardson, Lesley; Armstrong, Bruce K

    2009-01-01

    To quantitatively assess the impact of selection bias caused by nonparticipation in a multinational case-control study of mobile phone use and brain tumor.

  2. Tests of selection in pooled case-control data: an empirical study.

    Science.gov (United States)

    Udpa, Nitin; Zhou, Dan; Haddad, Gabriel G; Bafna, Vineet

    2011-01-01

    For smaller organisms with faster breeding cycles, artificial selection can be used to create sub-populations with different phenotypic traits. Genetic tests can be employed to identify the causal markers for the phenotypes, as a precursor to engineering strains with a combination of traits. Traditional approaches involve analyzing crosses of inbred strains to test for co-segregation with genetic markers. Here we take advantage of cheaper next generation sequencing techniques to identify genetic signatures of adaptation to the selection constraints. Obtaining individual sequencing data is often unrealistic due to cost and sample issues, so we focus on pooled genomic data. We explore a series of statistical tests for selection using pooled case (under selection) and control populations. The tests generally capture skews in the scaled frequency spectrum of alleles in a region, which are indicative of a selective sweep. Extensive simulations are used to show that these approaches work well for a wide range of population divergence times and strong selective pressures. Control vs control simulations are used to determine an empirical False Positive Rate, and regions under selection are determined using a 1% FPR level. We show that pooling does not have a significant impact on statistical power. The tests are also robust to reasonable variations in several different parameters, including window size, base-calling error rate, and sequencing coverage. We then demonstrate the viability (and the challenges) of one of these methods in two independent Drosophila populations (Drosophila melanogaster) bred under selection for hypoxia and accelerated development, respectively. Testing for extreme hypoxia tolerance showed clear signals of selection, pointing to loci that are important for hypoxia adaptation. Overall, we outline a strategy for finding regions under selection using pooled sequences, then devise optimal tests for that strategy. The approaches show promise for

  3. Wide brick tunnel randomization - an unequal allocation procedure that limits the imbalance in treatment totals.

    Science.gov (United States)

    Kuznetsova, Olga M; Tymofyeyev, Yevgen

    2014-04-30

    In open-label studies, partial predictability of permuted block randomization provides potential for selection bias. To lessen the selection bias in two-arm studies with equal allocation, a number of allocation procedures that limit the imbalance in treatment totals at a pre-specified level but do not require the exact balance at the ends of the blocks were developed. In studies with unequal allocation, however, the task of designing a randomization procedure that sets a pre-specified limit on imbalance in group totals is not resolved. Existing allocation procedures either do not preserve the allocation ratio at every allocation or do not include all allocation sequences that comply with the pre-specified imbalance threshold. Kuznetsova and Tymofyeyev described the brick tunnel randomization for studies with unequal allocation that preserves the allocation ratio at every step and, in the two-arm case, includes all sequences that satisfy the smallest possible imbalance threshold. This article introduces wide brick tunnel randomization for studies with unequal allocation that allows all allocation sequences with imbalance not exceeding any pre-specified threshold while preserving the allocation ratio at every step. In open-label studies, allowing a larger imbalance in treatment totals lowers selection bias because of the predictability of treatment assignments. The applications of the technique in two-arm and multi-arm open-label studies with unequal allocation are described. Copyright © 2013 John Wiley & Sons, Ltd.
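    The exact transition probabilities of the (wide) brick tunnel procedure are given in the cited work; the sketch below only illustrates the underlying constraint in a simplified form: each assignment is drawn with probabilities proportional to the target ratio, restricted to arms whose running count stays within a pre-specified distance of its expected total. The 2:1 ratio, threshold and sample size are arbitrary.

    ```python
    # Illustrative sketch (not the published algorithm): sequential 2:1 allocation in
    # which each assignment is drawn with probabilities proportional to the target
    # ratio, restricted to arms that keep |observed - expected| counts within a
    # pre-specified imbalance threshold.
    import random

    def constrained_unequal_allocation(n, ratio=(2, 1), threshold=2, seed=0):
        rng = random.Random(seed)
        total_ratio = sum(ratio)
        counts = [0] * len(ratio)
        sequence = []
        for i in range(1, n + 1):
            allowed = []
            for arm, r in enumerate(ratio):
                expected = i * r / total_ratio        # expected count after this step
                if abs(counts[arm] + 1 - expected) <= threshold:
                    allowed.append(arm)
            if not allowed:                           # threshold too tight: fall back
                allowed = list(range(len(ratio)))     # to an unconstrained draw
            weights = [ratio[a] for a in allowed]
            arm = rng.choices(allowed, weights=weights, k=1)[0]
            counts[arm] += 1
            sequence.append(arm)
        return sequence, counts

    seq, totals = constrained_unequal_allocation(30)
    print(seq)
    print("group totals:", totals)   # stays close to the 2:1 target
    ```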

  4. Random Pattern Vertically Oriented, Partial Thickness Buccinator Myomucosal Flap for Intraoral Reconstruction: A Report of Two Cases

    Directory of Open Access Journals (Sweden)

    Amin Rahpeyma

    2016-05-01

    Full Text Available Introduction: Reconstruction of the oral cavity with a flap design containing the buccal mucosa and buccinator muscle but excluding the facial artery and vein is the topic of these case reports. Case Reports: This article uses a random pattern, vertically oriented, partial thickness buccinator myomucosal flap for intraoral reconstruction in two cases. The first was for lining the mandibular anterior vestibule in a trauma patient. The second was for oral-side coverage of a bone graft in a special cleft patient. In both patients, this flap survived and good bone coverage with non-keratinized mucosa was obtained. Conclusion: A thin, long buccal myomucosal flap not including the facial artery and vein can survive.

  5. Lessons learned? Selected public acceptance case studies since Three Mile Island

    Energy Technology Data Exchange (ETDEWEB)

    Blee, D. [NAC International, Atlanta Corporate Headquarters, Atlanta, GA (United States)

    2001-02-01

    This paper will present an overview of the present situation, some recent polling survey information, and then look at lessons learned in terms of selected case studies and some global issues over the 22 years since the Three Mile Island (TMI) accident. That is quite an ambitious topic but there are some important lessons we can learn from the post-TMI era. (author)

  6. Leukaemia and occupation: a New Zealand Cancer Registry-based case-control Study.

    NARCIS (Netherlands)

    McLean, D.; 't Mannetje, A.; Dryson, E.; Walls, C.; McKenzie, F.; Maule, M.; Cheng, S.; Cunningham, C.; Kromhout, H.; Boffetta, P.; Blair, A.; Pearce, N.

    2009-01-01

    BACKGROUND: To examine the association between occupation and leukaemia. METHODS: We interviewed 225 cases (aged 20-75 years) notified to the New Zealand Cancer Registry during 2003-04, and 471 controls randomly selected from the Electoral Roll collecting demographic details, information on

  7. Sex selection: treating different cases differently.

    Science.gov (United States)

    Dickens, B M; Serour, G I; Cook, R J; Qiu, R-Z

    2005-08-01

    This paper contrasts ethical approaches to sex selection in countries where discrimination against women is pervasive, resulting in selection against girl children, and in countries where there is less general discrimination and couples do not prefer children of either sex. National sex ratio imbalances where discrimination against women is common have resulted in laws and policies, such as in India and China, to deter and prevent sex selection. Birth ratios of children can be affected by techniques of prenatal sex determination and abortion, preconception sex selection and discarding disfavored embryos, and prefertilization sperm sorting, when disfavored sperm remain unused. Incentives for son preference are reviewed, and laws and policies to prevent sex selection are explained. The elimination of social, economic and other discrimination against women is urged to redress sex selection against girl children. Where there is no general selection against girl children, sex selection can be allowed to assist families that want children of both sexes.

  8. Selective epidemic vaccination under the performant routing algorithms

    Science.gov (United States)

    Bamaarouf, O.; Alweimine, A. Ould Baba; Rachadi, A.; EZ-Zahraouy, H.

    2018-04-01

    Despite the extensive research on traffic dynamics and epidemic spreading, the effect of routing algorithm strategies on traffic-driven epidemic spreading has not received adequate attention. It is well known that more performant routing algorithm strategies are used to overcome the congestion problem. However, our main result shows, unexpectedly, that these algorithms favor virus spreading more than the case where the shortest-path-based algorithm is used. In this work, we studied virus spreading in a complex network using the efficient-path and global dynamic routing algorithms as compared to the shortest-path strategy. Some previous studies have tried to modify the routing rules to limit virus spreading, but at the expense of reducing traffic transport efficiency. This work proposes a solution to overcome this drawback by using a selective vaccination procedure instead of the random vaccination often used in the literature. We found that selective vaccination succeeded in eradicating the virus better than a purely random intervention for the performant routing algorithm strategies.

  9. The Value of Outsourcing Selected Cases in a Medical Examiner Population: A 10-Year Experience.

    Science.gov (United States)

    McCleskey, Brandi C; Reilly, Stephanie D; Atherton, Dan

    2017-01-01

    Due to increasing caseloads and inadequate staffing, it has been difficult for Coroner/Medical Examiner Offices to comply with recommended autopsy limits for forensic pathologists (FPs). Since 2006, pathologists at the University of Alabama at Birmingham have performed select autopsies for the Alabama Department of Forensic Sciences. Each case was reviewed by a state FP and scene investigator to determine appropriateness for referral. All referred cases received full postmortem examination including microscopic examination and collection of toxicological samples, and toxicology was ordered by the referring FP as appropriate. The final cause and manner of death were determined by the referring state FP after review of all findings. A majority of the 421 cases were ruled accidental deaths (233), most due to drug toxicity. Of the 178 natural deaths, 118 were attributed to cardiovascular disease. Outsourcing select forensic cases can be educational and an effective tool to manage workflow without compromising quality. © 2016 American Academy of Forensic Sciences.

  10. Blind Measurement Selection: A Random Matrix Theory Approach

    KAUST Repository

    Elkhalil, Khalil; Kammoun, Abla; Al-Naffouri, Tareq Y.; Alouini, Mohamed-Slim

    2016-01-01

    -aware fashions. We present two potential applications where the proposed algorithms can be used, namely antenna selection for uplink transmissions in large scale multi-user systems and sensor selection for wireless sensor networks. Numerical results are also

  11. Hamiltonian Cycles on Random Eulerian Triangulations

    DEFF Research Database (Denmark)

    Guitter, E.; Kristjansen, C.; Nielsen, Jakob Langgaard

    1998-01-01

    Considering the case n -> 0, this implies that the system of random Eulerian triangulations equipped with Hamiltonian cycles describes a c = -1 matter field coupled to 2D quantum gravity, as opposed to the system of usual random triangulations equipped with Hamiltonian cycles, which has c = -2. Hence, in this case...

  12. Effects of choice architecture and chef-enhanced meals on the selection and consumption of healthier school foods: a randomized clinical trial.

    Science.gov (United States)

    Cohen, Juliana F W; Richardson, Scott A; Cluggish, Sarah A; Parker, Ellen; Catalano, Paul J; Rimm, Eric B

    2015-05-01

    Little is known about the long-term effect of a chef-enhanced menu on healthier food selection and consumption in school lunchrooms. In addition, it remains unclear if extended exposure to other strategies to promote healthier foods (eg, choice architecture) also improves food selection or consumption. To evaluate the short- and long-term effects of chef-enhanced meals and extended exposure to choice architecture on healthier school food selection and consumption. A school-based randomized clinical trial was conducted during the 2011-2012 school year among 14 elementary and middle schools in 2 urban, low-income school districts (intent-to-treat analysis). Included in the study were 2638 students in grades 3 through 8 attending participating schools (38.4% of eligible participants). Schools were first randomized to receive a professional chef to improve school meal palatability (chef schools) or to a delayed intervention (control group). To assess the effect of choice architecture (smart café), all schools after 3 months were then randomized to the smart café intervention or to the control group. School food selection was recorded, and consumption was measured using plate waste methods. After 3 months, vegetable selection increased in chef vs control schools (odds ratio [OR], 1.75; 95% CI, 1.36-2.24), but there was no effect on the selection of other components or on meal consumption. After long-term or extended exposure to the chef or smart café intervention, fruit selection increased in the chef (OR, 3.08; 95% CI, 2.23-4.25), smart café (OR, 1.45; 95% CI, 1.13-1.87), and chef plus smart café (OR, 3.10; 95% CI, 2.26-4.25) schools compared with the control schools, and consumption increased in the chef schools (OR, 0.17; 95% CI, 0.03-0.30 cups/d). Vegetable selection increased in the chef (OR, 2.54; 95% CI, 1.83-3.54), smart café (OR, 1.91; 95% CI, 1.46-2.50), and chef plus smart café schools (OR, 7.38, 95% CI, 5.26-10.35) compared with the control schools

  13. Modified random hinge transport mechanics and multiple scattering step-size selection in EGS5

    International Nuclear Information System (INIS)

    Wilderman, S.J.; Bielajew, A.F.

    2005-01-01

    The new transport mechanics in EGS5 allows for significantly longer electron transport step sizes and hence shorter computation times than required for identical problems in EGS4. But as with all Monte Carlo electron transport algorithms, certain classes of problems exhibit step-size dependencies even when operating within recommended ranges, sometimes making selection of step-sizes a daunting task for novice users. Further contributing to this problem, because of the decoupling of multiple scattering and continuous energy loss in the dual random hinge transport mechanics of EGS5, there are two independent step sizes in EGS5, one for multiple scattering and one for continuous energy loss, each of which influences speed and accuracy in a different manner. Further, whereas EGS4 used a single value of fractional energy loss (ESTEPE) to determine step sizes at all energies, to increase performance by decreasing the amount of effort expended simulating lower energy particles, EGS5 permits the fractional energy loss values which are used to determine both the multiple scattering and continuous energy loss step sizes to vary with energy. This results in requiring the user to specify four fractional energy loss values when optimizing computations for speed. Thus, in order to simplify step-size selection and to mitigate step-size dependencies, a method has been devised to automatically optimize step-size selection based on a single material dependent input related to the size of problem tally region. In this paper we discuss the new transport mechanics in EGS5 and describe the automatic step-size optimization algorithm. (author)

  14. Risk Factors Profile of Shoulder Dystocia in Oman: A Case Control Study

    OpenAIRE

    Maha M. Al-Khaduri; Rania Mohammed Abudraz; Sayed G. Rizvi; Yahya M. Al-Farsi

    2014-01-01

    Objective: This study aimed to assess the risk factor profile of shoulder dystocia and associated neonatal complications in Oman, a developing Arab country. Methods: A retrospective case-control study was conducted among 111 cases with dystocia and 111 controls, identified during the 1994-2006 period in a tertiary care hospital in Oman. Controls were randomly selected among women who did not have dystocia, and were matched to cases on the day of delivery. Data related to potential risk factor...

  15. Reported cases of selected diseases.

    Science.gov (United States)

    1994-06-01

    The number of reported cases of measles, poliomyelitis, tetanus, diphtheria, and whooping cough for the period of January 1, 1994 to the date of the last report is presented in tabular form by country with a comparison for the same epidemiological period in 1993. The countries included are Bolivia, Colombia, Ecuador, Peru, Venezuela, Argentina, Chile, Paraguay, Uruguay, Brazil, Belize, Costa Rica, El Salvador, Guatemala, Honduras, Nicaragua, Panama, Mexico, Cuba, Haiti, the Dominican Republic, Antigua and Barbuda, the Bahamas, Barbados, Dominica, Grenada, Guyana, Jamaica, Saint Kitts and Nevis, Saint Vincent, Saint Lucia, Suriname, Trinidad and Tobago, Canada, and the US. The figures for measles are given as reported and as confirmed. In some countries, the reported number of cases of measles decreased from 1993 figures (Venezuela 5275 vs. 6060, Paraguay 26 vs. 958, Brazil 272 vs. 958, Canada 30 vs. 38), but, in others, the figure increased from 1993 (Mexico 47 vs. 21, the US 155 vs. 86). There were no reported cases of poliomyelitis for either year in any country. The figures for tetanus are divided into nonneonatal and neonatal. In Brazil the number of nonneonatal cases decreased to 360 from 371 in 1993, and the number of neonatal cases decreased to 28 from 65. In Mexico, nonneonatal cases decreased to 28 from 45, but neonatal cases increased to 23 from 20 in 1993. The number of diphtheria cases in Brazil decreased to 28 from 65 in the same period of 1993. The number of cases of whooping cough decreased to 431 from 1651 in Brazil and to 51 from 70 in Mexico. However, the number of cases in Canada increased to 1047 from 784.

  16. Case management for persons with substance use disorders

    DEFF Research Database (Denmark)

    Hesse, Morten; Vanderplasschen, Wouter; Rapp, Richard

    2007-01-01

    Reference searching; personal communication; conference abstracts; book chapters on case management. Selection criteria: randomized controlled studies that compared a specific model of case management with either treatment as usual or another treatment model, and included only patients with at least one alcohol- or drug-related problem. Data collection and analysis: two groups of reviewers extracted the data independently. The standardized mean difference was estimated. Main results: in total, we could extract results from 15 studies. Outcome on illicit drug use was reported from 7 studies with 2391 patients. The effect...

  17. Tests of Selection in Pooled Case-Control Data: An Empirical Study

    Directory of Open Access Journals (Sweden)

    Nitin eUdpa

    2011-11-01

    Full Text Available For smaller organisms with faster breeding cycles, artificial selection can be used to create sub-populations with different phenotypic traits. Genetic tests can be employed to identify the causal markers for the phenotypes, as a precursor to engineering strains with a combination of traits. Traditional approaches involve analyzing crosses of inbred strains to test for co-segregation with genetic markers. Here we take advantage of cheaper next generation sequencing techniques to identify genetic signatures of adaptation to the selection constraints. Obtaining individual sequencing data is often unrealistic due to cost and sample issues, so we focus on pooled genomic data. In this paper, we explore a series of statistical tests for selection using pooled case (under selection) and control populations. Extensive simulations are used to show that these approaches work well for a wide range of population divergence times and strong selective pressures. We show that pooling does not have a significant impact on statistical power. The tests are also robust to reasonable variations in several different parameters, including window size, base-calling error rate, and sequencing coverage. We then demonstrate the viability (and the challenges) of one of these methods in two independent Drosophila populations (Drosophila melanogaster) bred under selection for hypoxia and accelerated development, respectively. Testing for extreme hypoxia tolerance showed clear signals of selection, pointing to loci that are important for hypoxia adaptation. Overall, we outline a strategy for finding regions under selection using pooled sequences, then devise optimal tests for that strategy. The approaches show promise for detecting selection, even several generations after fixation of the beneficial allele has occurred.
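    As a toy illustration of the general scanning strategy described above (windowed comparison of pooled allele frequencies, with an empirical threshold calibrated from a control-vs-control comparison), the sketch below uses simulated pooled frequencies; it is not any of the specific tests evaluated in the paper.

    ```python
    # Toy illustration of the pooled case/control scanning strategy: compute a
    # per-window divergence between pooled allele frequencies, calibrate an
    # empirical threshold from a control-vs-control comparison, and flag windows
    # exceeding it. Simulated data only; not the tests used in the paper.
    import numpy as np

    rng = np.random.default_rng(42)
    n_sites, window = 5000, 100
    base = rng.uniform(0.05, 0.95, n_sites)          # shared ancestral frequencies

    def pooled_freqs(base_freq, depth=50, shift=0.0):
        f = np.clip(base_freq + shift, 0.01, 0.99)
        return rng.binomial(depth, f) / depth        # allele counts from pooled reads

    case = pooled_freqs(base)
    case[2000:2100] = pooled_freqs(base[2000:2100], shift=0.35)   # simulated sweep
    control_a = pooled_freqs(base)
    control_b = pooled_freqs(base)                   # second control pool for the null

    def window_stat(f1, f2):
        d = np.abs(f1 - f2)
        return d.reshape(-1, window).mean(axis=1)    # mean |delta frequency| per window

    obs = window_stat(case, control_a)
    null = window_stat(control_a, control_b)
    cutoff = np.quantile(null, 0.99)                 # ~1% empirical false-positive rate
    print("windows flagged:", np.where(obs > cutoff)[0])
    ```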

  18. The Power of Natural Selection: A Guided Investigation of Three Case Studies

    Science.gov (United States)

    Beachly, William

    2010-01-01

    I describe a quantitative approach to three case studies in evolution that can be used to challenge college freshmen to explore the power of natural selection and ask questions that foster a deeper understanding of its operation and relevance. Hemochromatosis, the peppered moth, and hominid cranial capacity are investigated with a common algebraic…

  19. Evolving artificial metalloenzymes via random mutagenesis

    Science.gov (United States)

    Yang, Hao; Swartz, Alan M.; Park, Hyun June; Srivastava, Poonam; Ellis-Guardiola, Ken; Upp, David M.; Lee, Gihoon; Belsare, Ketaki; Gu, Yifan; Zhang, Chen; Moellering, Raymond E.; Lewis, Jared C.

    2018-03-01

    Random mutagenesis has the potential to optimize the efficiency and selectivity of protein catalysts without requiring detailed knowledge of protein structure; however, introducing synthetic metal cofactors complicates the expression and screening of enzyme libraries, and activity arising from free cofactor must be eliminated. Here we report an efficient platform to create and screen libraries of artificial metalloenzymes (ArMs) via random mutagenesis, which we use to evolve highly selective dirhodium cyclopropanases. Error-prone PCR and combinatorial codon mutagenesis enabled multiplexed analysis of random mutations, including at sites distal to the putative ArM active site that are difficult to identify using targeted mutagenesis approaches. Variants that exhibited significantly improved selectivity for each of the cyclopropane product enantiomers were identified, and higher activity than previously reported ArM cyclopropanases obtained via targeted mutagenesis was also observed. This improved selectivity carried over to other dirhodium-catalysed transformations, including N-H, S-H and Si-H insertion, demonstrating that ArMs evolved for one reaction can serve as starting points to evolve catalysts for others.

  20. Case Report: Evaluation strategies and cognitive intervention: the case of a monovular twin child affected by selective mutism.

    Science.gov (United States)

    Capobianco, Micaela; Cerniglia, Luca

    2018-01-01

    The present work describes the assessment process, evaluation strategies, and cognitive intervention for a 9-year-old child with selective mutism (SM), a monovular twin of a child also affected by mutism. Currently, the cognitive behavioral multimodal treatment seems the most effective therapeutic approach for children diagnosed with selective mutism (Capobianco & Cerniglia, 2018). The illustrated case confirms the role of biological factors involved in the disorder but also highlights the importance of environmental influences in its maintenance with respect to relational and contextual dynamics (e.g. complicity between sisters, family relationships). The article furthermore discusses the importance of an early diagnosis as a predictor of positive treatment outcomes.

  1. Cluster randomization and political philosophy.

    Science.gov (United States)

    Chwang, Eric

    2012-11-01

    In this paper, I will argue that, while the ethical issues raised by cluster randomization can be challenging, they are not new. My thesis divides neatly into two parts. In the first, easier part I argue that many of the ethical challenges posed by cluster randomized human subjects research are clearly present in other types of human subjects research, and so are not novel. In the second, more difficult part I discuss the thorniest ethical challenge for cluster randomized research--cases where consent is genuinely impractical to obtain. I argue that once again these cases require no new analytic insight; instead, we should look to political philosophy for guidance. In other words, the most serious ethical problem that arises in cluster randomized research also arises in political philosophy. © 2011 Blackwell Publishing Ltd.

  2. Quality assurance in the EORTC 22033–26033/CE5 phase III randomized trial for low grade glioma: The digital individual case review

    International Nuclear Information System (INIS)

    Fairchild, Alysa; Weber, Damien C.; Bar-Deroma, Raquel; Gulyban, Akos; Fenton, Paul A.; Stupp, Roger; Baumert, Brigitta G.

    2012-01-01

    Introduction: The phase III EORTC 22033–26033/NCIC CE5 intergroup trial compares 50.4 Gy radiotherapy with up-front temozolomide in previously untreated low-grade glioma. We describe the digital EORTC individual case review (ICR) performed to evaluate protocol radiotherapy (RT) compliance. Methods: Fifty-eight institutions were asked to submit 1–2 randomly selected cases. Digital ICR datasets were uploaded to the EORTC server and accessed by three central reviewers. Twenty-seven parameters were analysed including volume delineation, treatment planning, organ at risk (OAR) dosimetry and verification. Consensus reviews were collated and summary statistics calculated. Results: Fifty-seven of seventy-two requested datasets from forty-eight institutions were technically usable. 31/57 received a major deviation for at least one section. Relocation accuracy was according to protocol in 45. Just over 30% had acceptable target volumes. OAR contours were missing in an average of 25% of cases. Up to one-third of those present were incorrectly drawn while dosimetry was largely protocol compliant. Beam energy was acceptable in 97% and 48 patients had per protocol beam arrangements. Conclusions: Digital RT plan submission and review within the EORTC 22033–26033 ICR provide a solid foundation for future quality assurance procedures. Strict evaluation resulted in overall grades of minor and major deviation for 37% and 32%, respectively.

  3. Case management: a randomized controlled study comparing a neighborhood team and a centralized individual model.

    Science.gov (United States)

    Eggert, G M; Zimmer, J G; Hall, W J; Friedman, B

    1991-10-01

    This randomized controlled study compared two types of case management for skilled nursing level patients living at home: the centralized individual model and the neighborhood team model. The team model differed from the individual model in that team case managers performed client assessments, care planning, some direct services, and reassessments; they also had much smaller caseloads and were assigned a specific catchment area. While patients in both groups incurred very high estimated health services costs, the average annual cost during 1983-85 for team cases was 13.6 percent less than that of individual model cases. While the team cases were 18.3 percent less expensive among "old" patients (patients who entered the study from the existing ACCESS caseload), they were only 2.7 percent less costly among "new" cases. The lower costs were due to reductions in hospital days and home care. Team cases averaged 26 percent fewer hospital days per year and 17 percent fewer home health aide hours. Nursing home use was 48 percent higher for the team group than for the individual model group. Mortality was almost exactly the same for both groups during the first year (about 30 percent), but was lower for team patients during the second year (11 percent as compared to 16 percent). Probable mechanisms for the observed results are discussed.

  4. Discrete least squares polynomial approximation with random evaluations - application to PDEs with Random parameters

    KAUST Repository

    Nobile, Fabio

    2015-01-01

    the parameter-to-solution map u(y) from random noise-free or noisy observations in random points by discrete least squares on polynomial spaces. The noise-free case is relevant whenever the technique is used to construct metamodels, based on polynomial

  5. Winning strategies: A case study of Oyo State Lottery, Nigeria ...

    African Journals Online (AJOL)

    In this study, we investigated three common lottery strategies: random, low and high frequency strategies, usually employed by lottery players. The Oyo State Lottery, a type of lottery in Oyo State, Nigeria was used as a case study. For the three strategies, we considered whether the selection of numbers in Oyo State lottery ...

  6. Curvature of random walks and random polygons in confinement

    International Nuclear Information System (INIS)

    Diao, Y; Ernst, C; Montemayor, A; Ziegler, U

    2013-01-01

    The purpose of this paper is to study the curvature of equilateral random walks and polygons that are confined in a sphere. Curvature is one of several basic geometric properties that can be used to describe random walks and polygons. We show that confinement affects curvature quite strongly, and in the limit case where the confinement diameter equals the edge length the unconfined expected curvature value doubles from π/2 to π. To study curvature a simple model of an equilateral random walk in spherical confinement in dimensions 2 and 3 is introduced. For this simple model we derive explicit integral expressions for the expected value of the total curvature in both dimensions. These expressions are functions that depend only on the radius R of the confinement sphere. We then show that the values obtained by numeric integration of these expressions agrees with numerical average curvature estimates obtained from simulations of random walks. Finally, we compare the confinement effect on curvature of random walks with random polygons. (paper)
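    A quick numerical check of the kind of quantity discussed is easy to set up: the sketch below generates equilateral random walks confined to a sphere by simple step-wise rejection and averages the turning angle per vertex. The rejection-based confinement model and all sizes are assumptions for illustration, not necessarily the exact model analysed in the paper; for a large confinement radius the estimate should approach the unconfined value of π/2.

    ```python
    # Monte Carlo estimate of the mean turning angle of an equilateral random walk
    # confined to a sphere of radius R (steps resampled until they stay inside).
    # A simple rejection-based confinement model for illustration.
    import numpy as np

    rng = np.random.default_rng(1)

    def random_unit_vector():
        v = rng.normal(size=3)
        return v / np.linalg.norm(v)

    def confined_walk(n_steps, R):
        pts = [np.zeros(3)]
        for _ in range(n_steps):
            while True:
                candidate = pts[-1] + random_unit_vector()
                if np.linalg.norm(candidate) <= R:   # reject steps leaving the sphere
                    pts.append(candidate)
                    break
        return np.array(pts)

    def mean_turning_angle(pts):
        steps = np.diff(pts, axis=0)
        cosines = np.einsum("ij,ij->i", steps[:-1], steps[1:])  # unit steps: dot = cos
        return np.arccos(np.clip(cosines, -1.0, 1.0)).mean()

    for R in (1.0, 2.0, 10.0):
        angles = [mean_turning_angle(confined_walk(60, R)) for _ in range(300)]
        print(f"R = {R:5.1f}: mean curvature per vertex = {np.mean(angles):.3f} rad")
    ```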

  7. Random walk of passive tracers among randomly moving obstacles.

    Science.gov (United States)

    Gori, Matteo; Donato, Irene; Floriani, Elena; Nardecchia, Ilaria; Pettini, Marco

    2016-04-14

    This study is mainly motivated by the need of understanding how the diffusion behavior of a biomolecule (or even of a larger object) is affected by other moving macromolecules, organelles, and so on, inside a living cell, whence the possibility of understanding whether or not a randomly walking biomolecule is also subject to a long-range force field driving it to its target. By means of the Continuous Time Random Walk (CTRW) technique the topic of random walk in random environment is here considered in the case of a passively diffusing particle among randomly moving and interacting obstacles. The relevant physical quantity which is worked out is the diffusion coefficient of the passive tracer which is computed as a function of the average inter-obstacles distance. The results reported here suggest that if a biomolecule, let us call it a test molecule, moves towards its target in the presence of other independently interacting molecules, its motion can be considerably slowed down.

  8. Neuropathic pain: transcranial electric motor cortex stimulation using high frequency random noise. Case report of a novel treatment

    Directory of Open Access Journals (Sweden)

    Alm PA

    2013-06-01

    Full Text Available Per A Alm, Karolina Dreimanis, Department of Neuroscience, Uppsala University, Uppsala, Sweden. Objectives: Electric motor cortex stimulation has been reported to be effective for many cases of neuropathic pain, in the form of epidural stimulation or transcranial direct current stimulation (tDCS). A novel technique is transcranial random noise stimulation (tRNS), which increases the cortical excitability irrespective of the orientation of the current. The aim of this study was to investigate the effect of tRNS on neuropathic pain in a small number of subjects, and in a case study explore the effects of different stimulation parameters and the long-term stability of treatment effects. Methods: The study was divided into three phases: (1) a double-blind crossover study, with four subjects; (2) a double-blind extended case study with one responder; and (3) open continued treatment. The motor cortex stimulation consisted of alternating current random noise (100–600 Hz), varying from 0.5 to 10 minutes and from 50 to 1500 µA, at intervals ranging from daily to fortnightly. Results: One out of four participants showed a strong positive effect (also compared with direct-current sham, P = 0.006). Unexpectedly, this effect was shown to occur also for very weak (100 µA, P = 0.048) and brief (0.5 minutes, P = 0.028) stimulation. The effect was largest during the first month, but remained at a highly motivating level for the patient after 6 months. Discussion: The study suggests that tRNS may be an effective treatment for some cases of neuropathic pain. An important result was the indication that even low levels of stimulation may have substantial effects. Keywords: neuropathic pain, central pain, transcranial direct current stimulation, motor cortex stimulation, random noise stimulation

  9. The randomly renewed general item and the randomly inspected item with exponential life distribution

    International Nuclear Information System (INIS)

    Schneeweiss, W.G.

    1979-01-01

    For a randomly renewed item, the probability distributions of the time to failure and of the duration of down time, and the expectations of these random variables, are determined. Moreover, it is shown that the same theory applies to randomly checked items with an exponential probability distribution of life, such as electronic items. The case of periodic renewals is treated as an example. (orig.) [de]

  10. Effectiveness of technology-assisted case management in low income adults with type 2 diabetes (TACM-DM: study protocol for a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Strom Joni L

    2011-10-01

    Full Text Available Abstract Background: An estimated 1 in 3 American adults will have diabetes by the year 2050. Nationally, South Carolina ranks 10th in cases of diagnosed diabetes compared to other states. In adults, type 2 diabetes (T2DM) accounts for approximately 90-95% of all diagnosed cases of diabetes. Clinically, provider and health system factors account for ... Methods: We describe a four-year prospective, randomized clinical trial, which will test the effectiveness of technology-assisted case management in low income rural adults with T2DM. Two hundred (200) male and female participants, 18 years of age or older and with an HbA1c ≥ 8%, will be randomized into one of two groups: (1) an intervention arm employing the innovative FORA system coupled with nurse case management or (2) a usual care group. Participants will be followed for 6 months to ascertain the effect of the interventions on glycemic control. Our primary hypothesis is that among indigent, rural adult patients with T2DM treated in FQHCs, participants randomized to the technology-assisted case management intervention will have significantly greater reduction in HbA1c at 6 months of follow-up compared to usual care. Discussion: Results from this study will provide important insight into the effectiveness of the technology-assisted case management (TACM) intervention for optimizing diabetes care in indigent, rural adult patients with T2DM treated in FQHCs. Trial Registration: National Institutes of Health Clinical Trials Registry (http://ClinicalTrials.gov), identifier # NCT01373489

  11. Fast integration using quasi-random numbers

    International Nuclear Information System (INIS)

    Bossert, J.; Feindt, M.; Kerzel, U.

    2006-01-01

    Quasi-random numbers are specially constructed series of numbers optimised to evenly sample a given s-dimensional volume. Using quasi-random numbers in numerical integration converges faster with a higher accuracy compared to the case of pseudo-random numbers. The basic properties of quasi-random numbers are introduced, various generators are discussed and the achieved gain is illustrated by examples

  12. Fast integration using quasi-random numbers

    Science.gov (United States)

    Bossert, J.; Feindt, M.; Kerzel, U.

    2006-04-01

    Quasi-random numbers are specially constructed series of numbers optimised to evenly sample a given s-dimensional volume. Using quasi-random numbers in numerical integration converges faster with a higher accuracy compared to the case of pseudo-random numbers. The basic properties of quasi-random numbers are introduced, various generators are discussed and the achieved gain is illustrated by examples.
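    A minimal comparison along these lines can be made with an off-the-shelf Sobol generator; the integrand, dimension and sample size below are arbitrary choices for illustration.

    ```python
    # Compare pseudo-random and quasi-random (Sobol) sampling for estimating the
    # integral of f(x) = prod_i x_i over the unit cube in s = 5 dimensions
    # (exact value 1/2**5). Illustrative integrand and sample sizes.
    import numpy as np
    from scipy.stats import qmc

    s, n = 5, 2**12
    exact = 0.5**s

    rng = np.random.default_rng(0)
    pseudo = rng.random((n, s))
    sobol = qmc.Sobol(d=s, scramble=True, seed=0).random(n)

    f = lambda x: np.prod(x, axis=1)
    print("pseudo-random error:", abs(f(pseudo).mean() - exact))
    print("quasi-random error :", abs(f(sobol).mean() - exact))
    ```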

  13. Application Of Decision Tree Approach To Student Selection Model- A Case Study

    Science.gov (United States)

    Harwati; Sudiya, Amby

    2016-01-01

    The main purpose of the institution is to provide quality education to its students and to improve the quality of managerial decisions. One way to improve the quality of students is to make the selection of new students more selective. This research takes as its case the selection of new students at the Islamic University of Indonesia, Yogyakarta, Indonesia. One of the university's selection routes is administrative filtering based on prospective students' high school records, without a written test. Currently, that kind of selection does not yet have a standard model or criteria. Selection is done only by comparing candidates' application files, so subjective assessment is likely because of the lack of standard criteria that can differentiate the quality of one student from another. By applying data mining classification techniques, a selection model for new students can be built that includes criteria with defined standards, such as area of origin, school status, average grade, and so on. These criteria are determined using rules that emerge from classifying the academic achievement (GPA) of students in previous years who entered the university through the same route. The decision tree method with the C4.5 algorithm is used here. The results show that students given priority for admission are those who meet the following criteria: they come from the island of Java, attended a public school, majored in science, have an average grade above 75, and have at least one achievement from their time in high school.
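    A sketch of the classification step is shown below on made-up applicant records; the feature coding and the use of scikit-learn's entropy-based tree (a C4.5-like criterion rather than C4.5 itself) are assumptions for illustration, not the study's actual data or implementation.

    ```python
    # Sketch: train an entropy-based decision tree on hypothetical applicant records
    # labelled by later academic achievement (GPA class). All fields are made-up
    # stand-ins for the criteria named in the abstract.
    import pandas as pd
    from sklearn.tree import DecisionTreeClassifier, export_text

    past = pd.DataFrame({
        "from_java":      [1, 1, 0, 1, 0, 0, 1, 0, 1, 1],
        "public_school":  [1, 0, 1, 1, 0, 1, 1, 0, 0, 1],
        "science_major":  [1, 1, 0, 1, 1, 0, 1, 0, 1, 0],
        "avg_grade":      [82, 77, 70, 88, 74, 65, 90, 60, 79, 84],
        "n_achievements": [2, 1, 0, 3, 1, 0, 2, 0, 1, 2],
        "good_gpa":       [1, 1, 0, 1, 0, 0, 1, 0, 1, 1],   # label from past cohorts
    })

    X, y = past.drop(columns="good_gpa"), past["good_gpa"]
    tree = DecisionTreeClassifier(criterion="entropy", max_depth=3,
                                  random_state=0).fit(X, y)
    print(export_text(tree, feature_names=list(X.columns)))

    applicant = pd.DataFrame([{"from_java": 1, "public_school": 1, "science_major": 1,
                               "avg_grade": 78, "n_achievements": 1}])
    print("admit priority:", tree.predict(applicant)[0])
    ```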

  14. Do Instructional Videos on Sputum Submission Result in Increased Tuberculosis Case Detection? A Randomized Controlled Trial.

    Science.gov (United States)

    Mhalu, Grace; Hella, Jerry; Doulla, Basra; Mhimbira, Francis; Mtutu, Hawa; Hiza, Helen; Sasamalo, Mohamed; Rutaihwa, Liliana; Rieder, Hans L; Seimon, Tamsyn; Mutayoba, Beatrice; Weiss, Mitchell G; Fenner, Lukas

    2015-01-01

    We examined the effect of an instructional video about the production of diagnostic sputum on case detection of tuberculosis (TB), and evaluated the acceptance of the video. Randomized controlled trial. We prepared a culturally adapted instructional video for sputum submission. We analyzed 200 presumptive TB cases coughing for more than two weeks who attended the outpatient department of the governmental Municipal Hospital in Mwananyamala (Dar es Salaam, Tanzania). They were randomly assigned to either receive instructions on sputum submission using the video before submission (intervention group, n = 100) or standard of care (control group, n = 100). Sputum samples were examined for volume, quality and presence of acid-fast bacilli by experienced laboratory technicians blinded to study groups. Median age was 39.1 years (interquartile range 37.0-50.0); 94 (47%) were females, 106 (53%) were males, and 49 (24.5%) were HIV-infected. We found that the instructional video intervention was associated with detection of a higher proportion of microscopically confirmed cases (56%, 95% confidence interval [95% CI] 45.7-65.9%, sputum smear positive patients in the intervention group versus 23%, 95% CI 15.2-32.5%, in the control group). Sex modified the effectiveness of the intervention, improving it. When asked how well the video instructions were understood, the majority of patients in the intervention group reported to have understood the video instructions well (97%). Most of the patients thought the video would be useful in the cultural setting of Tanzania (92%). Sputum submission instructional videos increased the yield of tuberculosis cases through better quality of sputum samples. If confirmed in larger studies, instructional videos may have a substantial effect on the case yield using sputum microscopy and also molecular tests. This low-cost strategy should be considered as part of the efforts to control TB in resource-limited settings. Pan African...

  15. Post-model selection inference and model averaging

    Directory of Open Access Journals (Sweden)

    Georges Nguefack-Tsague

    2011-07-01

    Full Text Available Although model selection is routinely used in practice nowadays, little is known about its precise effects on any subsequent inference that is carried out. The same goes for the effects induced by the closely related technique of model averaging. This paper is concerned with the use of the same data first to select a model and then to carry out inference, in particular point estimation and point prediction. The properties of the resulting estimator, called a post-model-selection estimator (PMSE), are hard to derive. Using selection criteria such as hypothesis testing, AIC, BIC, HQ and Cp, we illustrate that, in terms of risk function, no single PMSE dominates the others. The same conclusion holds more generally for any penalised likelihood information criterion. We also compare various model averaging schemes and show that no single one dominates the others in terms of risk function. Since PMSEs can be regarded as a special case of model averaging, with 0-1 random weights, we propose a connection between the two theories, in the frequentist approach, by taking account of the selection procedure when performing model averaging. We illustrate the point by simulating a simple linear regression model.
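    The contrast between a post-model-selection estimator and model averaging can be illustrated with a small simulation; the sketch below AIC-selects between two nested linear models versus averaging them with AIC weights for a single slope. It is a toy example with arbitrary parameter values, not the analysis in the paper.

    ```python
    # Toy simulation: estimate the slope b1 in y = b1*x1 + b2*x2 + e either by
    # AIC-selecting between the models {x1} and {x1, x2} (a post-model-selection
    # estimator) or by AIC-weight averaging of the two fits. Illustration only.
    import numpy as np

    rng = np.random.default_rng(3)
    n, b1, b2, reps = 50, 1.0, 0.3, 2000

    def ols_fit(X, y):
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        rss = np.sum((y - X @ beta) ** 2)
        aic = n * np.log(rss / n) + 2 * X.shape[1]
        return beta, aic

    pmse_est, avg_est = [], []
    for _ in range(reps):
        x1, x2 = rng.normal(size=n), rng.normal(size=n)
        y = b1 * x1 + b2 * x2 + rng.normal(size=n)
        X_small = np.column_stack([np.ones(n), x1])
        X_full = np.column_stack([np.ones(n), x1, x2])
        (bs, aic_s), (bf, aic_f) = ols_fit(X_small, y), ols_fit(X_full, y)
        pmse_est.append(bs[1] if aic_s < aic_f else bf[1])      # select-then-estimate
        aics = np.array([aic_s, aic_f])
        w = np.exp(-0.5 * (aics - aics.min()))
        w /= w.sum()
        avg_est.append(w[0] * bs[1] + w[1] * bf[1])             # AIC-weight averaging

    for name, est in [("post-selection", pmse_est), ("model averaging", avg_est)]:
        e = np.array(est)
        print(f"{name:16s} bias {e.mean() - b1:+.3f}  "
              f"RMSE {np.sqrt(((e - b1) ** 2).mean()):.3f}")
    ```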

  16. Work Incapacity and Treatment Costs After Severe Accidents: Standard Versus Intensive Case Management in a 6-Year Randomized Controlled Trial.

    Science.gov (United States)

    Scholz, Stefan M; Andermatt, Peter; Tobler, Benno L; Spinnler, Dieter

    2016-09-01

    Purpose: Case management is widely accepted as an effective method to support medical rehabilitation and vocational reintegration of accident victims with musculoskeletal injuries. This study investigates whether more intensive case management improves outcomes such as work incapacity and treatment costs for severely injured patients. Methods: 8,050 patients were randomly allocated either to standard case management (SCM, administered by claims specialists) or intensive case management (ICM, administered by case managers). These study groups differ mainly by caseload, which was approximately 100 cases in SCM and 35 in ICM. The setting is equivalent to a prospective randomized controlled trial. A 6-year follow-up period was chosen in order to encompass both short-term insurance benefits and permanent disability costs. All data were extracted from administrative insurance databases. Results: Average work incapacity over the 6-year follow-up, including contributions from daily allowances and permanent losses from disability, was slightly but insignificantly higher under ICM than under SCM (21.6 vs. 21.3 % of pre-accident work capacity). Remaining work incapacity after 6 years of follow-up showed no difference between ICM and SCM (8.9 vs. 8.8 % of pre-accident work capacity). Treatment costs were 43,500 Swiss Francs (CHF) in ICM compared to 39,800 in SCM (+9.4 %, p = 0.01). The number of care providers involved in ICM was 10.5 compared to 10.0 in SCM (+5.0 %) ... accident victims.

  17. Comparison of confirmed inactive and randomly selected compounds as negative training examples in support vector machine-based virtual screening.

    Science.gov (United States)

    Heikamp, Kathrin; Bajorath, Jürgen

    2013-07-22

    The choice of negative training data for machine learning is a little explored issue in chemoinformatics. In this study, the influence of alternative sets of negative training data and different background databases on support vector machine (SVM) modeling and virtual screening has been investigated. Target-directed SVM models have been derived on the basis of differently composed training sets containing confirmed inactive molecules or randomly selected database compounds as negative training instances. These models were then applied to search background databases consisting of biological screening data or randomly assembled compounds for available hits. Negative training data were found to systematically influence compound recall in virtual screening. In addition, different background databases had a strong influence on the search results. Our findings also indicated that typical benchmark settings lead to an overestimation of SVM-based virtual screening performance compared to search conditions that are more relevant for practical applications.
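    Schematically, the comparison amounts to training the same SVM with two different negative sets and screening a common background; the sketch below does this on synthetic binary fingerprints, so the descriptor model, set sizes and activity structure are all invented for illustration rather than taken from the study.

    ```python
    # Schematic comparison of two SVM models for one target that differ only in the
    # negative training set: confirmed inactives vs. randomly drawn "database"
    # compounds. Synthetic binary fingerprints stand in for real descriptors.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(7)
    n_bits = 256

    def fingerprints(n, on_rate):
        return (rng.random((n, n_bits)) < on_rate).astype(float)

    actives = fingerprints(200, 0.25)                 # actives share enriched bits
    confirmed_inactive = fingerprints(400, 0.18)      # screened, confirmed inactive
    random_db = fingerprints(400, 0.10)               # random database compounds

    screen_actives, screen_decoys = fingerprints(100, 0.25), fingerprints(1000, 0.15)
    X_screen = np.vstack([screen_actives, screen_decoys])
    y_screen = np.r_[np.ones(100), np.zeros(1000)]

    for label, negatives in [("confirmed inactives", confirmed_inactive),
                             ("random database", random_db)]:
        X = np.vstack([actives, negatives])
        y = np.r_[np.ones(len(actives)), np.zeros(len(negatives))]
        svm = SVC(kernel="rbf", gamma="scale").fit(X, y)
        scores = svm.decision_function(X_screen)
        print(f"{label:20s} screening ROC AUC = {roc_auc_score(y_screen, scores):.3f}")
    ```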

  18. Variable Selection in Time Series Forecasting Using Random Forests

    Directory of Open Access Journals (Sweden)

    Hristos Tyralis

    2017-10-01

    Full Text Available Time series forecasting using machine learning algorithms has gained popularity recently. Random forest is a machine learning algorithm implemented in time series forecasting; however, most of its forecasting properties have remained unexplored. Here we focus on assessing the performance of random forests in one-step forecasting using two large datasets of short time series with the aim to suggest an optimal set of predictor variables. Furthermore, we compare its performance to benchmarking methods. The first dataset is composed of 16,000 simulated time series from a variety of Autoregressive Fractionally Integrated Moving Average (ARFIMA) models. The second dataset consists of 135 mean annual temperature time series. The highest predictive performance of RF is observed when using a low number of recent lagged predictor variables. This outcome could be useful in relevant future applications, with the prospect to achieve higher predictive accuracy.
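    A minimal version of the one-step forecasting setup with a few recent lags as predictors might look like the sketch below; the AR(1) series, lag count and forest settings are illustrative assumptions, not the datasets or configuration used in the study.

    ```python
    # Minimal sketch of one-step-ahead forecasting with a random forest using a few
    # recent lagged values as predictors (simulated AR(1) series; lag count illustrative).
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(5)
    n, phi = 400, 0.7
    series = np.zeros(n)
    for t in range(1, n):
        series[t] = phi * series[t - 1] + rng.normal()

    n_lags = 3
    # Row i holds series[i], series[i+1], series[i+2]; the target is series[i+3].
    X = np.column_stack([series[lag:n - n_lags + lag] for lag in range(n_lags)])
    y = series[n_lags:]

    split = len(y) - 50                               # hold out the last 50 points
    rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X[:split], y[:split])
    pred = rf.predict(X[split:])
    rmse = np.sqrt(np.mean((pred - y[split:]) ** 2))
    print(f"one-step RMSE on hold-out: {rmse:.3f}")
    ```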

  19. Comparative analysis of instance selection algorithms for instance-based classifiers in the context of medical decision support

    International Nuclear Information System (INIS)

    Mazurowski, Maciej A; Tourassi, Georgia D; Malof, Jordan M

    2011-01-01

    When constructing a pattern classifier, it is important to make best use of the instances (a.k.a. cases, examples, patterns or prototypes) available for its development. In this paper we present an extensive comparative analysis of algorithms that, given a pool of previously acquired instances, attempt to select those that will be the most effective to construct an instance-based classifier in terms of classification performance, time efficiency and storage requirements. We evaluate seven previously proposed instance selection algorithms and compare their performance to simple random selection of instances. We perform the evaluation using k-nearest neighbor classifier and three classification problems: one with simulated Gaussian data and two based on clinical databases for breast cancer detection and diagnosis, respectively. Finally, we evaluate the impact of the number of instances available for selection on the performance of the selection algorithms and conduct initial analysis of the selected instances. The experiments show that for all investigated classification problems, it was possible to reduce the size of the original development dataset to less than 3% of its initial size while maintaining or improving the classification performance. Random mutation hill climbing emerges as the superior selection algorithm. Furthermore, we show that some previously proposed algorithms perform worse than random selection. Regarding the impact of the number of instances available for the classifier development on the performance of the selection algorithms, we confirm that the selection algorithms are generally more effective as the pool of available instances increases. In conclusion, instance selection is generally beneficial for instance-based classifiers as it can improve their performance, reduce their storage requirements and improve their response time. However, choosing the right selection algorithm is crucial.
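
    Random mutation hill climbing, the best-performing algorithm here, is simple enough to sketch. The toy below is not the paper's protocol: it uses synthetic Gaussian data, a 1-NN classifier from scikit-learn, and a crude hold-out accuracy as the fitness function, flipping one membership bit per iteration and keeping the change only if the fitness does not drop.

```python
# Hedged sketch of random-mutation hill climbing (RMHC) instance selection
# for a 1-NN classifier on synthetic two-class Gaussian data.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (200, 2)), rng.normal(1.5, 1.0, (200, 2))])
y = np.array([0] * 200 + [1] * 200)
perm = rng.permutation(len(X))                      # shuffle before splitting
X, y = X[perm], y[perm]
X_pool, y_pool = X[:300], y[:300]                   # instances available for selection
X_val, y_val = X[300:], y[300:]                     # hold-out used as the fitness score

def fitness(mask):
    """Validation accuracy of a 1-NN classifier built on the selected instances."""
    if mask.sum() < 2 or len(set(y_pool[mask])) < 2:
        return 0.0
    knn = KNeighborsClassifier(n_neighbors=1).fit(X_pool[mask], y_pool[mask])
    return knn.score(X_val, y_val)

# Start from a small random subset (about 3% of the pool) and flip one membership
# bit per iteration, keeping the mutation only if the fitness does not drop.
mask = rng.random(len(X_pool)) < 0.03
best = fitness(mask)
for _ in range(2000):
    i = rng.integers(len(mask))
    mask[i] = ~mask[i]
    score = fitness(mask)
    if score >= best:
        best = score
    else:
        mask[i] = ~mask[i]                          # revert the mutation
print(f"kept {mask.sum()} of {len(mask)} instances, validation accuracy = {best:.3f}")
```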

  20. Robust inference in sample selection models

    KAUST Repository

    Zhelonkin, Mikhail; Genton, Marc G.; Ronchetti, Elvezio

    2015-01-01

    The problem of non-random sample selectivity often occurs in practice in many fields. The classical estimators introduced by Heckman are the backbone of the standard statistical analysis of these models. However, these estimators are very sensitive to small deviations from the distributional assumptions which are often not satisfied in practice. We develop a general framework to study the robustness properties of estimators and tests in sample selection models. We derive the influence function and the change-of-variance function of Heckman's two-stage estimator, and we demonstrate the non-robustness of this estimator and its estimated variance to small deviations from the model assumed. We propose a procedure for robustifying the estimator, prove its asymptotic normality and give its asymptotic variance. Both cases with and without an exclusion restriction are covered. This allows us to construct a simple robust alternative to the sample selection bias test. We illustrate the use of our new methodology in an analysis of ambulatory expenditures and we compare the performance of the classical and robust methods in a Monte Carlo simulation study.

  1. Robust inference in sample selection models

    KAUST Repository

    Zhelonkin, Mikhail

    2015-11-20

    The problem of non-random sample selectivity often occurs in practice in many fields. The classical estimators introduced by Heckman are the backbone of the standard statistical analysis of these models. However, these estimators are very sensitive to small deviations from the distributional assumptions which are often not satisfied in practice. We develop a general framework to study the robustness properties of estimators and tests in sample selection models. We derive the influence function and the change-of-variance function of Heckman's two-stage estimator, and we demonstrate the non-robustness of this estimator and its estimated variance to small deviations from the model assumed. We propose a procedure for robustifying the estimator, prove its asymptotic normality and give its asymptotic variance. Both cases with and without an exclusion restriction are covered. This allows us to construct a simple robust alternative to the sample selection bias test. We illustrate the use of our new methodology in an analysis of ambulatory expenditures and we compare the performance of the classical and robust methods in a Monte Carlo simulation study.
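
    For orientation, the classical (non-robust) two-step procedure discussed in both records can be sketched on simulated data as follows, assuming statsmodels is available; the robustified estimator proposed by the authors is not reproduced here.

```python
# Hedged sketch of Heckman's classical two-step estimator on simulated data
# (the non-robust baseline discussed in the abstract, not the robustified version).
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=n)                        # outcome-equation regressor
z = rng.normal(size=n)                        # exclusion restriction (selection only)
u, e = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=n).T

selected = (0.5 + 1.0 * z + u) > 0            # selection equation
y = np.where(selected, 1.0 + 2.0 * x + e, np.nan)   # outcome observed only if selected

# Step 1: probit for the selection indicator, then the inverse Mills ratio.
W = sm.add_constant(z)
probit = sm.Probit(selected.astype(float), W).fit(disp=0)
xb = W @ probit.params
imr = norm.pdf(xb) / norm.cdf(xb)

# Step 2: OLS on the selected sample with the inverse Mills ratio as extra regressor.
X2 = sm.add_constant(np.column_stack([x[selected], imr[selected]]))
ols = sm.OLS(y[selected], X2).fit()
print(ols.params)     # roughly [1.0, 2.0, 0.6] = [intercept, slope, rho * sigma_e]
```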

  2. The Long-Term Effectiveness of a Selective, Personality-Targeted Prevention Program in Reducing Alcohol Use and Related Harms: A Cluster Randomized Controlled Trial

    Science.gov (United States)

    Newton, Nicola C.; Conrod, Patricia J.; Slade, Tim; Carragher, Natacha; Champion, Katrina E.; Barrett, Emma L.; Kelly, Erin V.; Nair, Natasha K.; Stapinski, Lexine; Teesson, Maree

    2016-01-01

    Background: This study investigated the long-term effectiveness of Preventure, a selective personality-targeted prevention program, in reducing the uptake of alcohol, harmful use of alcohol, and alcohol-related harms over a 3-year period. Methods: A cluster randomized controlled trial was conducted to assess the effectiveness of Preventure.…

  3. Selective serotonin reuptake inhibitors and gastrointestinal bleeding: a case-control study.

    Directory of Open Access Journals (Sweden)

    Alfonso Carvajal

    Full Text Available BACKGROUND: Selective serotonin reuptake inhibitors (SSRIs) have been associated with upper gastrointestinal (GI) bleeding. Given their worldwide use, even small risks account for a large number of cases. This study has been conducted with carefully collected information to further investigate the relationship between SSRIs and upper GI bleeding. METHODS: We conducted a case-control study in hospitals in Spain and in Italy. Cases were patients aged ≥18 years with a primary diagnosis of acute upper GI bleeding diagnosed by endoscopy; three controls were matched by sex, age, date of admission (within 3 months) and hospital among patients who were admitted for elective surgery for non-painful disorders. Exposures to SSRIs, other antidepressants and other drugs were defined as any use of these drugs in the 7 days before the day on which upper gastrointestinal bleeding started (index day). RESULTS: 581 cases of upper GI bleeding and 1358 controls were considered eligible for the study; no differences in age or sex distribution were observed between cases and controls after matching. Overall, 4.0% of the cases and 3.3% of controls used an SSRI antidepressant in the week before the index day. No significant risk of upper GI bleeding was encountered for SSRI antidepressants (adjusted odds ratio, 1.06, 95% CI, 0.57-1.96) or for whichever other grouping of antidepressants. CONCLUSIONS: The results of this case-control study showed no significant increase in upper GI bleeding with SSRIs and provide good evidence that the magnitude of any increase in risk is not greater than 2.

  4. Random covering of the circle: the configuration-space of the free deposition process

    Energy Technology Data Exchange (ETDEWEB)

    Huillet, Thierry [Laboratoire de Physique Theorique et Modelisation, CNRS-UMR 8089 et Universite de Cergy-Pontoise, 5 mail Gay-Lussac, 95031, Neuville sur Oise (France)

    2003-12-12

    Consider a circle of circumference 1. Throw at random n points, sequentially, on this circle and append clockwise an arc (or rod) of length s to each such point. The resulting random set (the free gas of rods) is a collection of a random number of clusters with random sizes. It models a free deposition process on a 1D substrate. For such processes, we shall consider the occurrence times (number of rods) and probabilities, as n grows, of the following configurations: those avoiding rod overlap (the hard-rod gas), those for which the largest gap is smaller than rod length s (the packing gas), those (parking configurations) for which hard rod and packing constraints are both fulfilled and covering configurations. Special attention is paid to the statistical properties of each such (rare) configuration in the asymptotic density domain when ns = ρ, for some finite density ρ of points. Using results from spacings in the random division of the circle, explicit large deviation rate functions can be computed in each case from state equations. Lastly, a process consisting in selecting at random one of these specific equilibrium configurations (called the observable) can be modelled. When particularized to the parking model, this system produces parking configurations differently from Renyi's random sequential adsorption model.

  5. Behavioral performance follows the time course of neural facilitation and suppression during cued shifts of feature-selective attention

    OpenAIRE

    Andersen, S. K.; Müller, M. M.

    2010-01-01

    A central question in the field of attention is whether visual processing is a strictly limited resource, which must be allocated by selective attention. If this were the case, attentional enhancement of one stimulus should invariably lead to suppression of unattended distracter stimuli. Here we examine voluntary cued shifts of feature-selective attention to either one of two superimposed red or blue random dot kinematograms (RDKs) to test whether such a reciprocal relationship between enhanc...

  6. Toward mHealth Brief Contact Interventions in Suicide Prevention: Case Series From the Suicide Intervention Assisted by Messages (SIAM) Randomized Controlled Trial.

    Science.gov (United States)

    Berrouiguet, Sofian; Larsen, Mark Erik; Mesmeur, Catherine; Gravey, Michel; Billot, Romain; Walter, Michel; Lemey, Christophe; Lenca, Philippe

    2018-01-10

    Research indicates that maintaining contact either via letter or postcard with at-risk adults following discharge from care services after a suicide attempt (SA) can reduce reattempt risk. Pilot studies have demonstrated that interventions using mobile health (mHealth) technologies are feasible in a suicide prevention setting. The aim of this study was to report three cases of patients recruited in the Suicide Intervention Assisted by Messages (SIAM) study to describe how a mobile intervention may influence follow-up. SIAM is a 2-year, multicenter randomized controlled trial conducted by the Brest University Hospital, France. Participants in the intervention group receive SIAM text messages 48 hours after discharge, then at day 8 and day 15, and months 1, 2, 3, 4, 5, and 6. The study includes participants aged 18 years or older, who have attended a participating hospital for an SA, and have been discharged from the emergency department (ED) or a psychiatric unit (PU) for a stay of less than 7 days. Eligible participants are randomized between the SIAM intervention messages and a control group. In this study, we present three cases from the ongoing SIAM study that demonstrate the capability of a mobile-based brief contact intervention for triggering patient-initiated contact with a crisis support team at various time points throughout the mobile-based follow-up period. Out of the 244 patients recruited in the SIAM randomized controlled trial, three cases were selected to illustrate the impact of mHealth on suicide risk management. Participants initiated contact with the emergency crisis support service after receiving text messages up to 6 months following discharge from the hospital. Contact was initiated immediately following receipt of a text message or up to 6 days following a message. This text message-based brief contact intervention has demonstrated the potential to reconnect suicidal individuals with crisis support services while they are experiencing

  7. Recruiting Young Volunteers in an Area of Selective Education: A Qualitative Case Study

    Science.gov (United States)

    Dean, Jon

    2016-01-01

    This article presents findings from a small qualitative case study of a youth volunteering brokerage organisation in England, operating in an area of selective state education. Data show how brokerage workers felt grammar schools managed their students in a concerted way to improve students' chances of attending university. Conversely, workers…

  8. Fields on a random lattice

    International Nuclear Information System (INIS)

    Itzykson, C.

    1983-10-01

    We review the formulation of field theory and statistical mechanics on a Poissonian random lattice. Topics discussed include random geometry, the construction of field equations for arbitrary spin, the free field spectrum and the question of localization illustrated in the one dimensional case

  9. Multi-Index Stochastic Collocation for random PDEs

    KAUST Repository

    Haji Ali, Abdul Lateef

    2016-03-28

    In this work we introduce the Multi-Index Stochastic Collocation method (MISC) for computing statistics of the solution of a PDE with random data. MISC is a combination technique based on mixed differences of spatial approximations and quadratures over the space of random data. We propose an optimization procedure to select the most effective mixed differences to include in the MISC estimator: such optimization is a crucial step and allows us to build a method that, provided with sufficient solution regularity, is potentially more effective than other multi-level collocation methods already available in literature. We then provide a complexity analysis that assumes decay rates of product type for such mixed differences, showing that in the optimal case the convergence rate of MISC is only dictated by the convergence of the deterministic solver applied to a one dimensional problem. We show the effectiveness of MISC with some computational tests, comparing it with other related methods available in the literature, such as the Multi-Index and Multilevel Monte Carlo, Multilevel Stochastic Collocation, Quasi Optimal Stochastic Collocation and Sparse Composite Collocation methods.

  10. Multi-Index Stochastic Collocation for random PDEs

    KAUST Repository

    Haji Ali, Abdul Lateef; Nobile, Fabio; Tamellini, Lorenzo; Tempone, Raul

    2016-01-01

    In this work we introduce the Multi-Index Stochastic Collocation method (MISC) for computing statistics of the solution of a PDE with random data. MISC is a combination technique based on mixed differences of spatial approximations and quadratures over the space of random data. We propose an optimization procedure to select the most effective mixed differences to include in the MISC estimator: such optimization is a crucial step and allows us to build a method that, provided with sufficient solution regularity, is potentially more effective than other multi-level collocation methods already available in literature. We then provide a complexity analysis that assumes decay rates of product type for such mixed differences, showing that in the optimal case the convergence rate of MISC is only dictated by the convergence of the deterministic solver applied to a one dimensional problem. We show the effectiveness of MISC with some computational tests, comparing it with other related methods available in the literature, such as the Multi-Index and Multilevel Monte Carlo, Multilevel Stochastic Collocation, Quasi Optimal Stochastic Collocation and Sparse Composite Collocation methods.

  11. Generating equilateral random polygons in confinement III

    International Nuclear Information System (INIS)

    Diao, Y; Ernst, C; Montemayor, A; Ziegler, U

    2012-01-01

    In this paper we continue our earlier studies (Diao et al 2011 J. Phys. A: Math. Theor. 44 405202, Diao et al J. Phys. A: Math. Theor. 45 275203) on the generation methods of random equilateral polygons confined in a sphere. The first half of this paper is concerned with the generation of confined equilateral random walks. We show that if the selection of a vertex is uniform subject to the position of its previous vertex and the confining condition, then the distributions of the vertices are not uniform, although there exists a distribution such that if the initial vertex is selected following this distribution, then all vertices of the random walk follow this same distribution. Thus in order to generate a confined equilateral random walk, the selection of a vertex cannot be uniform subject to the position of its previous vertex and the confining condition. We provide a simple algorithm capable of generating confined equilateral random walks whose vertex distribution is almost uniform in the confinement sphere. In the second half of this paper we show that any process generating confined equilateral random walks can be turned into a process generating confined equilateral random polygons with the property that the vertex distribution of the polygons approaches the vertex distribution of the walks as the polygons get longer and longer. In our earlier studies, the starting point of the confined polygon is fixed at the center of the sphere. The new approach here allows us to move the starting point of the confined polygon off the center of the sphere. (paper)

  12. Indication, methods and results of selective arteriography of the A. iliaca interna in case of erectile dysfunction

    Energy Technology Data Exchange (ETDEWEB)

    Baehren, W.; Gall, H.; Scherb, W.; Thon, W.

    1988-01-01

    The underlying cause of erectile dysfunction can very frequently be identified by means of angiography. Selective angiography is the method of choice in cases where other causes of circulatory disturbance have already been excluded, and non-invasive tests are expected to yield information of relevance to therapy. The qualitatively best angiographic results are obtained by examination under peridural anesthesia and by intracavitary injection of vaso-active substances. Selective arteriography is indicated in cases of primary or post-traumatic erectile dysfunction. It is a prerequisite of surgery for revascularisation of the pudendal-penile vascular bed.

  13. Affinity selection of Nipah and Hendra virus-related vaccine candidates from a complex random peptide library displayed on bacteriophage virus-like particles

    Energy Technology Data Exchange (ETDEWEB)

    Peabody, David S.; Chackerian, Bryce; Ashley, Carlee; Carnes, Eric; Negrete, Oscar

    2017-01-24

    The invention relates to virus-like particles of bacteriophage MS2 (MS2 VLPs) displaying peptide epitopes or peptide mimics of epitopes of Nipah Virus envelope glycoprotein that elicit an immune response against Nipah Virus upon vaccination of humans or animals. Affinity selection on Nipah Virus-neutralizing monoclonal antibodies using random sequence peptide libraries on MS2 VLPs selected peptides with sequence similarity to peptide sequences found within the envelope glycoprotein of Nipah itself, thus identifying the epitopes the antibodies recognize. The selected peptide sequences themselves are not necessarily identical in all respects to a sequence within Nipah Virus glycoprotein, and therefore may be referred to as epitope mimics. VLPs displaying these epitope mimics can serve as a vaccine. On the other hand, display of the corresponding wild-type sequence derived from Nipah Virus, corresponding to the epitope mapped by affinity selection, may also be used as a vaccine.

  14. Automated selection of relevant information for notification of incident cancer cases within a multisource cancer registry.

    Science.gov (United States)

    Jouhet, V; Defossez, G; Ingrand, P

    2013-01-01

    The aim of this study was to develop and evaluate an algorithm for selecting relevant records for the notification of incident cases of cancer on the basis of the individual data available in a multi-source information system. This work was conducted on data for the year 2008 in the general cancer registry of the Poitou-Charentes region (France). The selection algorithm hierarchizes information according to its level of relevance for tumoral topography and tumoral morphology independently. The selected data are combined to form composite records. These records are then grouped in accordance with the notification rules of the International Agency for Research on Cancer for multiple primary cancers. The evaluation, based on recall, precision and F-measure, compared cases validated manually by the registry's physicians with tumours notified with and without record selection. The analysis involved 12,346 tumours validated among 11,971 individuals. The data used were hospital discharge data (104,474 records), pathology data (21,851 records), healthcare insurance data (7508 records) and cancer care centre's data (686 records). The selection algorithm improved performance for notification of tumour topography (F-measure 0.926 with vs. 0.857 without selection) and tumour morphology (F-measure 0.805 with vs. 0.750 without selection). These results show that selection of information according to its origin is efficient in reducing noise generated by imprecise coding. Further research is needed for solving the semantic problems relating to the integration of heterogeneous data and the use of non-structured information.

  15. Human action analysis with randomized trees

    CERN Document Server

    Yu, Gang; Liu, Zicheng

    2014-01-01

    This book will provide a comprehensive overview of human action analysis with randomized trees. It will cover both supervised and unsupervised random trees. When a sufficient amount of labeled data is available, supervised random trees provide a fast method for space-time interest point matching. When labeled data are minimal, as in the case of example-based action search, unsupervised random trees are used to leverage the unlabelled data. We describe how the randomized trees can be used for action classification, action detection, action search, and action prediction.

  16. Fast selection of miRNA candidates based on large-scale pre-computed MFE sets of randomized sequences.

    Science.gov (United States)

    Warris, Sven; Boymans, Sander; Muiser, Iwe; Noback, Michiel; Krijnen, Wim; Nap, Jan-Peter

    2014-01-13

    Small RNAs are important regulators of genome function, yet their prediction in genomes is still a major computational challenge. Statistical analyses of pre-miRNA sequences indicated that their 2D structure tends to have a minimal free energy (MFE) significantly lower than MFE values of equivalently randomized sequences with the same nucleotide composition, in contrast to other classes of non-coding RNA. The computation of many MFEs is, however, too intensive to allow for genome-wide screenings. Using a local grid infrastructure, MFE distributions of random sequences were pre-calculated on a large scale. These distributions follow a normal distribution and can be used to determine the MFE distribution for any given sequence composition by interpolation. This allows on-the-fly calculation of the normal distribution for any candidate sequence composition. The speedup achieved makes genome-wide screening with this characteristic of a pre-miRNA sequence practical. Although this property alone is not sufficiently discriminative to distinguish miRNAs from other sequences, the MFE-based P-value should be added to the set of parameters used to select potential miRNA candidates for experimental verification.
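
    The interpolation idea can be sketched in a few lines. The table values below are placeholders rather than the pre-computed grid of the paper: for a candidate's composition (reduced here to GC content), the mean and standard deviation of the randomized-sequence MFE distribution are interpolated and the candidate's MFE is converted to a one-sided P-value under the resulting normal.

```python
# Hedged sketch with hypothetical table values: interpolate the pre-computed
# normal MFE distribution for the candidate's composition, then score the
# candidate's MFE with a one-sided P-value.
import numpy as np
from scipy.stats import norm

# Pre-computed per-composition parameters (GC fraction -> mean/sd of the MFE of
# randomized sequences); the numbers below are placeholders, not real data.
gc_grid  = np.array([0.30, 0.40, 0.50, 0.60, 0.70])
mfe_mean = np.array([-18.0, -22.0, -27.0, -33.0, -40.0])   # kcal/mol
mfe_sd   = np.array([4.0, 4.5, 5.0, 5.5, 6.0])

def mfe_pvalue(candidate_mfe, gc_content):
    """P(randomized sequence has MFE <= candidate_mfe) under the interpolated normal."""
    mu = np.interp(gc_content, gc_grid, mfe_mean)
    sd = np.interp(gc_content, gc_grid, mfe_sd)
    return norm.cdf(candidate_mfe, loc=mu, scale=sd)

# A pre-miRNA-like candidate: its MFE lies well below the randomized background.
print(f"P-value = {mfe_pvalue(candidate_mfe=-45.0, gc_content=0.55):.4f}")
```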

  17. THE METHODOLOGY OF CASES SELECTION FOR TEACHING FOREIGN SPEECH TO THE STUDENTS OF NON-LINGUISTIC SPECIALITIES

    Directory of Open Access Journals (Sweden)

    Tatyana Lozovskaya

    2015-10-01

    Full Text Available This article deals with the advantages of the case-study method and its potential for forming the motivation to study the English language among students of non-linguistic specialities, psychology students in particular. Training future psychologists in foreign-language communication should involve cases published in foreign periodicals, together with numerous exercises and communicative tasks designed according to the requirements of the case technology used during the learning process. The studies make it possible to single out the main criteria of case selection for the successful formation of foreign speech in students of the psychology faculty.

  18. Melioidosis Cases and Selected Reports of Occupational Exposures to Burkholderia pseudomallei--United States, 2008-2013.

    Science.gov (United States)

    Benoit, Tina J; Blaney, David D; Gee, Jay E; Elrod, Mindy G; Hoffmaster, Alex R; Doker, Thomas J; Bower, William A; Walke, Henry T

    2015-07-03

    Melioidosis is an infection caused by the Gram-negative bacillus Burkholderia pseudomallei, which is naturally found in water and soil in areas endemic for melioidosis. Infection can be severe and sometimes fatal. The federal select agent program designates B. pseudomallei as a Tier 1 overlap select agent, which can affect both humans and animals. Identification of B. pseudomallei and all occupational exposures must be reported to the Federal Select Agent Program immediately (i.e., within 24 hours), whereas states are not required to notify CDC's Bacterial Special Pathogens Branch (BSPB) of human infections. Reporting period: 2008-2013. The passive surveillance system includes reports of suspected (human and animal) melioidosis cases and reports of incidents of possible occupational exposures. Reporting of suspected cases to BSPB is voluntary. BSPB receives reports of occupational exposure in the context of a request for technical consultation (so that the system does not include the full complement of the mandatory and confidential reporting to the Federal Select Agent Program). Reporting sources include state health departments, medical facilities, microbiologic laboratories, or research facilities. Melioidosis cases are classified using the standard case definition adopted by the Council of State and Territorial Epidemiologists in 2011. In follow-up to reports of occupational exposures, CDC often provides technical assistance to state health departments to identify all persons with possible exposures, define level of risk, and provide recommendations for postexposure prophylaxis and health monitoring of exposed persons. During 2008-2013, BSPB provided technical assistance to 20 U.S. states and Puerto Rico involving 37 confirmed cases of melioidosis (34 human cases and three animal cases). The majority of reported cases (64%) occurred among persons with a documented travel history to areas endemic for melioidosis. Two persons did not

  19. Varying levels of difficulty index of skills-test items randomly selected by examinees on the Korean emergency medical technician licensing examination.

    Science.gov (United States)

    Koh, Bongyeun; Hong, Sunggi; Kim, Soon-Sim; Hyun, Jin-Sook; Baek, Milye; Moon, Jundong; Kwon, Hayran; Kim, Gyoungyong; Min, Seonggi; Kang, Gu-Hyun

    2016-01-01

    The goal of this study was to characterize the difficulty index of the items in the skills test components of the class I and II Korean emergency medical technician licensing examination (KEMTLE), which requires examinees to select items randomly. The results of 1,309 class I KEMTLE examinations and 1,801 class II KEMTLE examinations in 2013 were subjected to analysis. Items from the basic and advanced skills test sections of the KEMTLE were compared to determine whether some were significantly more difficult than others. In the class I KEMTLE, all 4 of the items on the basic skills test showed significant variation in difficulty index (P<0.01), as well as 4 of the 5 items on the advanced skills test (P<0.05). In the class II KEMTLE, 4 of the 5 items on the basic skills test showed significantly different difficulty indices (P<0.01), as well as all 3 of the advanced skills test items (P<0.01). In the skills test components of the class I and II KEMTLE, the procedure in which examinees randomly select questions should be revised to require examinees to respond to a set of fixed items in order to improve the reliability of the national licensing examination.

  20. Varying levels of difficulty index of skills-test items randomly selected by examinees on the Korean emergency medical technician licensing examination

    Directory of Open Access Journals (Sweden)

    Bongyeun Koh

    2016-01-01

    Full Text Available Purpose: The goal of this study was to characterize the difficulty index of the items in the skills test components of the class I and II Korean emergency medical technician licensing examination (KEMTLE), which requires examinees to select items randomly. Methods: The results of 1,309 class I KEMTLE examinations and 1,801 class II KEMTLE examinations in 2013 were subjected to analysis. Items from the basic and advanced skills test sections of the KEMTLE were compared to determine whether some were significantly more difficult than others. Results: In the class I KEMTLE, all 4 of the items on the basic skills test showed significant variation in difficulty index (P<0.01), as well as 4 of the 5 items on the advanced skills test (P<0.05). In the class II KEMTLE, 4 of the 5 items on the basic skills test showed significantly different difficulty indices (P<0.01), as well as all 3 of the advanced skills test items (P<0.01). Conclusion: In the skills test components of the class I and II KEMTLE, the procedure in which examinees randomly select questions should be revised to require examinees to respond to a set of fixed items in order to improve the reliability of the national licensing examination.

  1. The Case for Quality Book Selection.

    Science.gov (United States)

    Bob, Murray C.

    1982-01-01

    This essay on library book selection critiques Nora Rawlinson's article on practices at the Baltimore County Public Library which appeared in Library Journal, November 15, 1981, p. 2188, and discusses library circulation statistics in relation to book selection. (EJS)

  2. TU-AB-202-10: How Effective Are Current Atlas Selection Methods for Atlas-Based Auto-Contouring in Radiotherapy Planning?

    Energy Technology Data Exchange (ETDEWEB)

    Peressutti, D; Schipaanboord, B; Kadir, T; Gooding, M [Mirada Medical Limited, Science and Medical Technology, Oxford (United Kingdom); Soest, J van; Lustberg, T; Elmpt, W van; Dekker, A [Maastricht University Medical Centre, Department of Radiation Oncology MAASTRO - GROW School for Oncology Developmental Biology, Maastricht (Netherlands)

    2016-06-15

    Purpose: To investigate the effectiveness of atlas selection methods for improving atlas-based auto-contouring in radiotherapy planning. Methods: 275 H&N clinically delineated cases were employed as an atlas database from which atlases would be selected. A further 40 previously contoured cases were used as test patients against which atlas selection could be performed and evaluated. 26 variations of selection methods proposed in the literature and used in commercial systems were investigated. Atlas selection methods comprised either global or local image similarity measures, computed after rigid or deformable registration, combined with direct atlas search or with an intermediate template image. Workflow Box (Mirada-Medical, Oxford, UK) was used for all auto-contouring. Results on brain, brainstem, parotids and spinal cord were compared to random selection, a fixed set of 10 “good” atlases, and optimal selection by an “oracle” with knowledge of the ground truth. The Dice score and the average ranking with respect to the “oracle” were employed to assess the performance of the top 10 atlases selected by each method. Results: The fixed set of “good” atlases outperformed all of the atlas-patient image similarity-based selection methods (mean Dice 0.715 c.f. 0.603 to 0.677). In general, methods based on exhaustive comparison of local similarity measures showed better average Dice scores (0.658 to 0.677) compared to the use of either template image (0.655 to 0.672) or global similarity measures (0.603 to 0.666). The performance of image-based selection methods was found to be only slightly better than random selection (0.645). The Dice scores given relate to the left parotid, but similar patterns of results were observed for all organs. Conclusion: Intuitively, atlas selection based on the patient CT is expected to improve auto-contouring performance. However, it was found that published approaches performed marginally better than random and use of a fixed set of
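
    For readers unfamiliar with the metric, the sketch below shows the Dice score used throughout the comparison and how a selection method's top-10 atlases can be scored against the "oracle" ordering; the masks and atlases are random toys, not the H&N data of the study.

```python
# Hedged toy example: Dice overlap of binary masks and a comparison of an
# "oracle" top-10 atlas selection with a random selection.
import numpy as np

def dice(mask_a, mask_b):
    """Dice overlap of two binary masks (1 = perfect agreement, 0 = no overlap)."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

rng = np.random.default_rng(0)
truth = rng.random((64, 64, 32)) > 0.7                  # toy ground-truth organ mask

# Toy "atlas-based" contours: the ground truth corrupted by different noise levels.
atlas_dice = []
for noise in rng.uniform(0.01, 0.2, size=20):           # 20 hypothetical atlases
    flips = rng.random(truth.shape) < noise
    atlas_dice.append(dice(truth, np.logical_xor(truth, flips)))
atlas_dice = np.array(atlas_dice)

oracle_top10 = np.argsort(atlas_dice)[::-1][:10]        # best possible selection
random_top10 = rng.choice(len(atlas_dice), 10, replace=False)
print("oracle mean Dice :", atlas_dice[oracle_top10].mean().round(3))
print("random mean Dice :", atlas_dice[random_top10].mean().round(3))
```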

  3. Feature Selection via Chaotic Antlion Optimization.

    Directory of Open Access Journals (Sweden)

    Hossam M Zawbaa

    Full Text Available Selecting a subset of relevant properties from a large set of features that describe a dataset is a challenging machine learning task. In biology, for instance, the advances in the available technologies enable the generation of a very large number of biomarkers that describe the data. Choosing the most informative markers along with performing a high-accuracy classification over the data can be a daunting task, particularly if the data are high dimensional. An often adopted approach is to formulate the feature selection problem as a biobjective optimization problem, with the aim of maximizing the performance of the data analysis model (the quality of the fit on the training data) while minimizing the number of features used. We propose an optimization approach for the feature selection problem that considers a "chaotic" version of the antlion optimizer method, a nature-inspired algorithm that mimics the hunting mechanism of antlions in nature. The balance between exploration of the search space and exploitation of the best solutions is a challenge in multi-objective optimization. The exploration/exploitation rate is controlled by the parameter I that limits the random walk range of the ants/prey. This variable is increased iteratively in a quasi-linear manner to decrease the exploration rate as the optimization progresses. The quasi-linear decrease in the variable I may lead to immature convergence in some cases and trapping in local minima in other cases. The chaotic system proposed here attempts to improve the tradeoff between exploration and exploitation. The methodology is evaluated using different chaotic maps on a number of feature selection datasets. To ensure generality, we used ten biological datasets, but we also used other types of data from various sources. The results are compared with the particle swarm optimizer and with genetic algorithm variants for feature selection using a set of quality metrics.
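
    The contrast between a quasi-linear schedule and a chaotic one can be illustrated with the logistic map, one of the standard chaotic maps. The snippet below shows only the schedule of the walk-range parameter I, not the full antlion optimizer, and its constants are arbitrary.

```python
# Hedged sketch: quasi-linear growth of the random-walk bound I versus a
# chaotic (logistic-map) modulation of the same schedule.
import numpy as np

T = 200                                   # number of iterations

# Quasi-linear schedule: I grows with iteration, shrinking the walk range 1/I.
I_linear = 1.0 + 10.0 * np.arange(T) / T

# Chaotic schedule: modulate I with a logistic map x_{k+1} = 4 x_k (1 - x_k).
x = 0.7
I_chaotic = np.empty(T)
for t in range(T):
    x = 4.0 * x * (1.0 - x)
    I_chaotic[t] = (1.0 + 10.0 * t / T) * (0.5 + x)

walk_range_linear = 1.0 / I_linear        # exploration range decays smoothly
walk_range_chaotic = 1.0 / I_chaotic      # decays on average but keeps fluctuating
print(walk_range_linear[-5:].round(3))
print(walk_range_chaotic[-5:].round(3))
```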

  4. Application of quasi-random numbers for simulation

    International Nuclear Information System (INIS)

    Kazachenko, O.N.; Takhtamyshev, G.G.

    1985-01-01

    Application of the Monte-Carlo method for multidimensional integration is discussed. The main goal is to check the statement that the application of quasi-random numbers instead of regular pseudo-random numbers provides more rapid convergence. The Sobol, Richtmayer and Halton algorithms of quasi-random sequences are described. Over 50 tests were performed to compare these quasi-random numbers with pseudo-random numbers. In all cases the quasi-random numbers clearly demonstrated more rapid convergence than the pseudo-random ones. These positive test results for quasi-random sequences in the Monte Carlo method seem very promising.
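
    The same comparison is easy to repeat today. Assuming SciPy's scipy.stats.qmc module is available (SciPy 1.7 or later), the sketch below integrates a smooth function over the 5-dimensional unit cube with pseudo-random points and with scrambled Sobol points and prints the absolute errors.

```python
# Hedged sketch: pseudo-random versus Sobol quasi-random points for a
# 5-dimensional Monte Carlo integral whose exact value is 1.
import numpy as np
from scipy.stats import qmc

dim = 5
f = lambda u: np.prod(3.0 * u**2, axis=1)     # integral of 3u^2 over [0,1] is 1 per dimension

rng = np.random.default_rng(0)
for n in (256, 1024, 4096):
    pseudo = f(rng.random((n, dim))).mean()
    sobol = qmc.Sobol(d=dim, scramble=True, seed=0)
    quasi = f(sobol.random(n)).mean()
    print(f"n = {n:5d}  pseudo-random error = {abs(pseudo - 1):.4f}  "
          f"Sobol error = {abs(quasi - 1):.4f}")
```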

  5. HYBRID DATA APPROACH FOR SELECTING EFFECTIVE TEST CASES DURING THE REGRESSION TESTING

    OpenAIRE

    Mohan, M.; Shrimali, Tarun

    2017-01-01

    In the software industry, software testing has become increasingly important across the entire software development life cycle. Software testing is one of the fundamental components of software quality assurance. The Software Testing Life Cycle (STLC) is a process involved in testing the complete software, which includes Regression Testing, Unit Testing, Smoke Testing, Integration Testing, Interface Testing, System Testing, etc. In the STLC of Regression testing, test case selection is one of the most importan...

  6. An Outbreak of Clostridium difficile Ribotype 027 Associated with Length of Stay in the Intensive Care Unit and Use of Selective Decontamination of the Digestive Tract: A Case Control Study.

    Directory of Open Access Journals (Sweden)

    Yvette H van Beurden

    Full Text Available An outbreak of Clostridium difficile ribotype 027 infection (CDI) occurred at a university hospital, involving 19 departments. To determine what hospital-associated factors drove the outbreak of this particular strain, we performed a case-control study. Cases (n = 79), diagnosed with CDI due to C. difficile ribotype 027, were matched for age and treating medical specialty to four control patients (n = 316). Patients diagnosed with CDI due to other ribotypes were included as a second control group. A random selection of C. difficile ribotype 027 strains (n = 10) was genotyped by Whole Genome Sequencing (WGS). WGS showed the outbreak was likely caused by a single strain of C. difficile (two or fewer single-nucleotide variants between isolates). Ninety-five percent of cases had used antibiotics, compared to 56% of controls. Previous admission to the intensive care unit (ICU) (OR: 2.4, 95% CI 1.0-5.6), longer length of stay (LOS), and recent hospital admission were associated with CDI ribotype 027. Cases were less likely to have been admitted to a ward with a known isolated CDI patient (OR: 0.2, 95% CI 0.1-0.6). Analysis of patients who stayed at the ICU (35 cases; 51 controls) indicated that the use of selective decontamination of the digestive tract (SDD) and a longer LOS in the ICU were associated with CDI risk. In this large outbreak, any antibiotic use, including SDD use, appeared as a prerequisite for acquisition of the outbreak strain. The role of use of SDD and prolonged stay on the ICU could not be disentangled, but both factors can play a biologically plausible role in C. difficile acquisition and infection.

  7. Random survival forests for competing risks

    DEFF Research Database (Denmark)

    Ishwaran, Hemant; Gerds, Thomas A; Kogalur, Udaya B

    2014-01-01

    We introduce a new approach to competing risks using random forests. Our method is fully non-parametric and can be used for selecting event-specific variables and for estimating the cumulative incidence function. We show that the method is highly effective for both prediction and variable selection...

  8. Using Random Numbers in Science Research Activities.

    Science.gov (United States)

    Schlenker, Richard M.; And Others

    1996-01-01

    Discusses the importance of science process skills and describes ways to select sets of random numbers for selection of subjects for a research study in an unbiased manner. Presents an activity appropriate for grades 5-12. (JRH)

  9. Influence of case-based e-learning on students' performance in point-of-care ultrasound courses: a randomized trial.

    Science.gov (United States)

    Hempel, Dorothea; Sinnathurai, Sivajini; Haunhorst, Stephanie; Seibel, Armin; Michels, Guido; Heringer, Frank; Recker, Florian; Breitkreutz, Raoul

    2016-08-01

    Theoretical knowledge, visual perception, and sensorimotor skills are key elements in ultrasound education. Classroom-based presentations are used routinely to teach theoretical knowledge, whereas visual perception and sensorimotor skills typically require hands-on training (HT). We aimed to compare the effect of classroom-based lectures versus a case-based e-learning (based on clinical cases only) on the hands-on performance of trainees during an emergency ultrasound course. This is a randomized, controlled, parallel-group study. Sixty-two medical students were randomized into two groups [group 1 (G1) and group 2 (G2)]. G1 (n=29) was subjected to a precourse e-learning, based on 14 short screencasts (each 5 min), an on-site discussion (60 min), and a standardized HT session on the day of the course. G2 (n=31) received classroom-based presentations on the day of the course before an identical HT session. Both groups completed a multiple-choice (MC) pretest (test A), a practical postcourse test (objective structured clinical exam), and MC tests directly after the HT (test B) and 1 day after the course (test C). The Mann-Whitney U-test was used for statistical analysis. G1 performed markedly better in test A (median 84.2, 25%; 75% percentile: 68.5; 92.2) compared with G2 (65.8; 53.8; 80.4), who had not participated in case-based e-learning (P=0.0009). No differences were found in the objective structured clinical exam, test B, and test C. e-learning exclusively based on clinical cases is an effective method of education in preparation for HT sessions and can reduce attendance time in ultrasound courses.

  10. Quality pseudo-random number generator

    International Nuclear Information System (INIS)

    Tarasiuk, J.

    1996-01-01

    The pseudo-random number generator (RNG) was written to match the needs of nuclear and high-energy physics computations, which in some cases require very long and independent random number sequences. In this random number generator the repetition period is about 10^36, which should be sufficient for all computers in the world. In this article, test results on RNG correlation, speed, and identity of computations for PC, Sun4 and VAX computers are presented.

  11. Approaches to sampling and case selection in qualitative research: examples in the geography of health.

    Science.gov (United States)

    Curtis, S; Gesler, W; Smith, G; Washburn, S

    2000-04-01

    This paper focuses on the question of sampling (or selection of cases) in qualitative research. Although the literature includes some very useful discussions of qualitative sampling strategies, the question of sampling often seems to receive less attention in methodological discussion than questions of how data is collected or is analysed. Decisions about sampling are likely to be important in many qualitative studies (although it may not be an issue in some research). There are varying accounts of the principles applicable to sampling or case selection. Those who espouse 'theoretical sampling', based on a 'grounded theory' approach, are in some ways opposed to those who promote forms of 'purposive sampling' suitable for research informed by an existing body of social theory. Diversity also results from the many different methods for drawing purposive samples which are applicable to qualitative research. We explore the value of a framework suggested by Miles and Huberman [Miles, M., Huberman, A., 1994. Qualitative Data Analysis, Sage, London.], to evaluate the sampling strategies employed in three examples of research by the authors. Our examples comprise three studies which respectively involve selection of: 'healing places'; rural places which incorporated national anti-malarial policies; young male interviewees, identified as either chronically ill or disabled. The examples are used to show how in these three studies the (sometimes conflicting) requirements of the different criteria were resolved, as well as the potential and constraints placed on the research by the selection decisions which were made. We also consider how far the criteria Miles and Huberman suggest seem helpful for planning 'sample' selection in qualitative research.

  12. Random walk, diffusion and mixing in simulations of scalar transport in fluid flows

    International Nuclear Information System (INIS)

    Klimenko, A Y

    2008-01-01

    Physical similarity and mathematical equivalence of continuous diffusion and particle random walk form one of the cornerstones of modern physics and the theory of stochastic processes. In many applied models used in simulation of turbulent transport and turbulent combustion, mixing between particles is used to reflect the influence of the continuous diffusion terms in the transport equations. We show that the continuous scalar transport and diffusion can be accurately specified by means of mixing between randomly walking Lagrangian particles with scalar properties and assess errors associated with this scheme. This gives an alternative formulation for the stochastic process which is selected to represent the continuous diffusion. This paper focuses on statistical errors and deals with relatively simple cases, where one-particle distributions are sufficient for a complete description of the problem.
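
    A toy version of the particle/mixing idea (not the paper's formulation) is sketched below: Lagrangian particles random-walk on a periodic 1D interval while sorted neighbouring pairs relax their scalar values toward the pair mean, so that the ensemble roughly mimics diffusion of a sinusoidal scalar profile.

```python
# Hedged toy sketch: random-walking particles with pairwise scalar mixing,
# compared against the analytic variance decay of pure diffusion.
import numpy as np

rng = np.random.default_rng(0)
n_particles, n_steps, dt = 5000, 200, 1e-3
D = 0.05                                        # diffusivity carried by the random walk

x = rng.random(n_particles)                     # positions on the periodic interval [0, 1)
phi = np.sin(2 * np.pi * x)                     # initial scalar carried by the particles

for _ in range(n_steps):
    # Random walk step: x -> x + sqrt(2 D dt) * N(0, 1), with periodic wrap-around.
    x = (x + np.sqrt(2 * D * dt) * rng.normal(size=n_particles)) % 1.0
    # Mixing step: pair neighbouring particles (after sorting by position) and move
    # their scalar values halfway toward the pair mean.
    order = np.argsort(x)
    a, b = order[0::2], order[1::2]
    pair_mean = 0.5 * (phi[a] + phi[b])
    phi[a] += 0.5 * (pair_mean - phi[a])
    phi[b] += 0.5 * (pair_mean - phi[b])

# For pure diffusion of sin(2*pi*x), the scalar variance decays roughly like
# 0.5 * exp(-2 * (2*pi)^2 * D * t).
t = n_steps * dt
print("particle scalar variance:", round(float(phi.var()), 4))
print("pure-diffusion estimate :", round(float(0.5 * np.exp(-2 * (2 * np.pi) ** 2 * D * t)), 4))
```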

  13. Correlates of smoking with socioeconomic status, leisure time physical activity and alcohol consumption among Polish adults from randomly selected regions.

    Science.gov (United States)

    Woitas-Slubowska, Donata; Hurnik, Elzbieta; Skarpańska-Stejnborn, Anna

    2010-12-01

    To determine the association between smoking status and leisure time physical activity (LTPA), alcohol consumption, and socioeconomic status (SES) among Polish adults. 466 randomly selected men and women (aged 18-66 years) responded to an anonymous questionnaire regarding smoking, alcohol consumption, LTPA, and SES. Multiple logistic regression was used to examine the association of smoking status with six socioeconomic measures, level of LTPA, and frequency and type of alcohol consumed. Smokers were defined as individuals smoking occasionally or daily. The odds of being a smoker were 9 times (men) and 27 times (women) higher among respondents who drank alcohol several times a week or every day in comparison to non-drinkers (p times higher compared to those with high educational attainment (p = 0.007). Among women we observed that students were the most frequent smokers. Female students were almost three times more likely to smoke than non-professional women, and two times more likely than physical workers (p = 0.018). The findings of this study indicated that among randomly selected Polish men and women aged 18-66, smoking and alcohol consumption tended to cluster. These results imply that intervention strategies need to target multiple risk factors simultaneously. The highest risk of smoking was observed among men with low education, female students, and both men and women drinking alcohol several times a week or every day. Information on subgroups with a high risk of smoking will help in planning future preventive strategies.

  14. Entropy of level-cut random Gaussian structures at different volume fractions.

    Science.gov (United States)

    Marčelja, Stjepan

    2017-10-01

    Cutting random Gaussian fields at a given level can create a variety of morphologically different two- or several-phase structures that have often been used to describe physical systems. The entropy of such structures depends on the covariance function of the generating Gaussian random field, which in turn depends on its spectral density. But the entropy of level-cut structures also depends on the volume fractions of different phases, which is determined by the selection of the cutting level. This dependence has been neglected in earlier work. We evaluate the entropy of several lattice models to show that, even in the cases of strongly coupled systems, the dependence of the entropy of level-cut structures on molar fractions of the constituents scales with the simple ideal noninteracting system formula. In the last section, we discuss the application of the results to binary or ternary fluids and microemulsions.
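
    Generating such a level-cut structure is straightforward; the sketch below (a smoothed white-noise field thresholded at the standard-normal quantile of the desired volume fraction) shows only the construction, not the entropy calculation of the paper.

```python
# Hedged sketch: build a correlated Gaussian random field by filtering white noise,
# then cut it at the level that yields a chosen volume fraction of one phase.
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.stats import norm

rng = np.random.default_rng(0)
field = gaussian_filter(rng.normal(size=(256, 256)), sigma=4.0)  # correlated field
field = (field - field.mean()) / field.std()                     # standardize to N(0, 1)

for volume_fraction in (0.2, 0.5):
    level = norm.ppf(1.0 - volume_fraction)      # cut level giving the target fraction
    phase = field > level                        # two-phase level-cut structure
    print(f"target fraction {volume_fraction:.2f}  realized {phase.mean():.3f}")
```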

  15. Entropy of level-cut random Gaussian structures at different volume fractions

    Science.gov (United States)

    Marčelja, Stjepan

    2017-10-01

    Cutting random Gaussian fields at a given level can create a variety of morphologically different two- or several-phase structures that have often been used to describe physical systems. The entropy of such structures depends on the covariance function of the generating Gaussian random field, which in turn depends on its spectral density. But the entropy of level-cut structures also depends on the volume fractions of different phases, which is determined by the selection of the cutting level. This dependence has been neglected in earlier work. We evaluate the entropy of several lattice models to show that, even in the cases of strongly coupled systems, the dependence of the entropy of level-cut structures on molar fractions of the constituents scales with the simple ideal noninteracting system formula. In the last section, we discuss the application of the results to binary or ternary fluids and microemulsions.

  16. A stochastic model for stationary dynamics of prices in real estate markets. A case of random intensity for Poisson moments of prices changes

    Science.gov (United States)

    Rusakov, Oleg; Laskin, Michael

    2017-06-01

    We consider a stochastic model of price changes in real estate markets. We suppose that in a book of prices the changes occur at the jump points of a Poisson process with a random intensity, i.e. the moments of change follow a random process of the Cox process type. We calculate cumulative mathematical expectations and variances for the random intensity of this point process. In the case where the random intensity process is a martingale, the cumulative variance grows linearly. We statistically process a number of observations of real estate prices and accept the hypothesis of linear growth for the estimates of both the cumulative average and the cumulative variance, for both the input and output prices recorded in the book of prices.
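
    A simplified simulation of such a Cox-type price-change process is sketched below, with a multiplicative martingale intensity and illustrative parameters (not calibrated to real-estate data); it tracks the cumulative mean and variance of the counts across replications.

```python
# Hedged sketch: a Cox (doubly stochastic Poisson) process whose intensity follows
# a positive martingale, so the cumulative mean of the counts grows like lambda_0 * t.
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_steps, dt, sigma = 2000, 250, 0.02, 0.05

counts = np.zeros((n_paths, n_steps))
intensity = np.full(n_paths, 5.0)                 # lambda_0
for t in range(n_steps):
    # Multiplicative martingale update: E[lambda_{t+1} | lambda_t] = lambda_t.
    intensity = intensity * np.exp(rng.normal(-0.5 * sigma**2, sigma, size=n_paths))
    # Given the intensity, the number of price changes in dt is Poisson(lambda * dt).
    counts[:, t] = rng.poisson(intensity * dt)

cum = counts.cumsum(axis=1)
idx = [n_steps // 4 - 1, n_steps // 2 - 1, n_steps - 1]
print("time            :", np.round(dt * (np.array(idx) + 1), 2))
print("cumulative mean :", np.round(cum.mean(axis=0)[idx], 2))   # ~ lambda_0 * t
print("cumulative var  :", np.round(cum.var(axis=0)[idx], 2))
```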

  17. First implantation of selective his bundle pacing in Colombia. Report of two cases

    International Nuclear Information System (INIS)

    Velasquez, Jorge E; Duque, Mauricio; Marin V, Jorge E; Aristizabal, Julian M; Medina, Luis E; Gonzalez, Edgardo; Duque, Laura; Uribe, William

    2010-01-01

    Pacing in the right ventricular apex is known to produce ventricular dyssynchrony, and patients often develop dilated cardiomyopathy. For this reason, more physiological modes of stimulation have been sought. With the recent technological development of electrodes for selective stimulation of the His bundle, we report the experience with this type of implantation in the first two cases performed in the electrophysiology laboratory in Medellin.

  18. Random walk through fractal environments

    International Nuclear Information System (INIS)

    Isliker, H.; Vlahos, L.

    2003-01-01

    We analyze random walks through fractal environments embedded in three-dimensional, permeable space. Particles travel freely and are scattered off into random directions when they hit the fractal. The statistical distribution of the flight increments (i.e., of the displacements between two consecutive hittings) is analytically derived from a common, practical definition of fractal dimension, and it turns out to approximate a power law quite well in the case where the dimension D_F of the fractal is less than 2; there is, though, always a finite rate of unaffected escape. Random walks through fractal sets with D_F ≤ 2 can thus be considered as defective Levy walks. The distribution of jump increments for D_F > 2 decays exponentially. The diffusive behavior of the random walk is analyzed in the framework of the continuous time random walk, which we generalize to include the case of defective distributions of walk increments. It is shown that the particles undergo anomalous, enhanced diffusion for D_F < 2, whereas the diffusion for D_F > 2 is normal for large times, though enhanced for small and intermediate times. In particular, it follows that fractals generated by a particular class of self-organized criticality models give rise to enhanced diffusion. The analytical results are illustrated by Monte Carlo simulations.

  19. SELECTION OF PROJECT MANAGERS IN CONSTRUCTION FIRMS USING ANALYTIC HIERARCHY PROCESS (AHP AND FUZZY TOPSIS: A CASE STUDY

    Directory of Open Access Journals (Sweden)

    Fatemeh Torfi

    2011-10-01

    Full Text Available Selecting a project manager is a major decision for every construction company. Traditionally, a project manager is selected by interviewing applicants and evaluating their capabilities by considering the special requirements of the project. The interviews are usually conducted by senior managers, and the selection of the best candidate depends on their opinions. Thus, the results may not be completely reliable. Moreover, conducting interviews for a large group of candidates is time-consuming. Thus, there is a need for computational models that can be used to select the most suitable applicant, given the project specifications and the applicants’ details. In this paper, a case study is performed in which a Fuzzy Multiple Criteria Decision Making (FMCDM model is used to select the best candidate for the post of project manager in a large construction firm. First, with the opinions of the senior managers, all the criteria and sub-criteria required for the selection are gathered, and the criteria priorities are qualitatively specified. Then, the applicants are ranked using the Analytic Hierarchy Process (AHP, approximate weights of the criteria, and fuzzy technique for order performance by similarity to ideal solution (TOPSIS. The results of the case study are shown to be satisfactory.
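
    The crisp TOPSIS step at the core of the method is compact enough to sketch. The decision matrix, weights, and criterion directions below are invented for illustration, and neither the AHP weighting procedure nor the fuzzy extension used in the case study is reproduced.

```python
# Hedged sketch of (crisp) TOPSIS ranking of candidates from a weighted decision matrix.
import numpy as np

# Rows = candidates, columns = criteria (e.g. experience, leadership, salary demanded).
scores = np.array([[7.0, 8.0, 5.0],
                   [9.0, 6.0, 7.0],
                   [6.0, 9.0, 4.0]])
weights = np.array([0.5, 0.3, 0.2])          # e.g. obtained from AHP pairwise comparisons
benefit = np.array([True, True, False])      # cost-type criteria are to be minimized

norm = scores / np.linalg.norm(scores, axis=0)            # 1) vector-normalize each criterion
v = norm * weights                                        # 2) apply the criterion weights
ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))   # 3) ideal solution
anti  = np.where(benefit, v.min(axis=0), v.max(axis=0))   #    and anti-ideal solution
d_plus  = np.linalg.norm(v - ideal, axis=1)
d_minus = np.linalg.norm(v - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)                  # 4) relative closeness to the ideal
print("closeness coefficients:", closeness.round(3))
print("best candidate index  :", int(np.argmax(closeness)))
```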

  20. Random walk in dynamically disordered chains: Poisson white noise disorder

    International Nuclear Information System (INIS)

    Hernandez-Garcia, E.; Pesquera, L.; Rodriguez, M.A.; San Miguel, M.

    1989-01-01

    Exact solutions are given for a variety of models of random walks in a chain with time-dependent disorder. Dynamic disorder is modeled by white Poisson noise. Models with site-independent (global) and site-dependent (local) disorder are considered. Results are described in terms of an effective random walk in a nondisordered medium. In the cases of global disorder the effective random walk contains multistep transitions, so that the continuous limit is not a diffusion process. In the cases of local disorder the effective process is equivalent to a usual random walk in the absence of disorder but with slower diffusion. Difficulties associated with the continuous-limit representation of random walk in a disordered chain are discussed. In particular, the authors consider explicit cases in which taking the continuous limit and averaging over disorder sources do not commute

  1. Association Between Zolpidem Use and Glaucoma Risk: A Taiwanese Population-Based Case-Control Study

    OpenAIRE

    Ho, Yi-Hao; Chang, Yue-Cune; Huang, Wei-Cheng; Chen, Hsin-Yi; Lin, Che-Chen; Sung, Fung-Chang

    2015-01-01

    Background To date, the relationship between zolpidem use and subsequent risk of glaucoma in a Taiwanese population has not been assessed. Methods We used data from the National Health Insurance system to investigate whether zolpidem use was related to glaucoma risk. A 1:4 matched case-control study was conducted. The cases were patients newly diagnosed with glaucoma from 2001 to 2010. The controls were randomly selected non-glaucoma subjects matched by sex and age (±5 years). Zolpidem exposu...

  2. Effects of one versus two bouts of moderate intensity physical activity on selective attention during a school morning in Dutch primary schoolchildren: A randomized controlled trial.

    Science.gov (United States)

    Altenburg, Teatske M; Chinapaw, Mai J M; Singh, Amika S

    2016-10-01

    Evidence suggests that physical activity is positively related to several aspects of cognitive functioning in children, among which is selective attention. To date, no information is available on the optimal frequency of physical activity on cognitive functioning in children. The current study examined the acute effects of one and two bouts of moderate-intensity physical activity on children's selective attention. Randomized controlled trial (ISRCTN97975679). Thirty boys and twenty-six girls, aged 10-13 years, were randomly assigned to three conditions: (A) sitting all morning working on simulated school tasks; (B) one 20-min physical activity bout after 90min; and (C) two 20-min physical activity bouts, i.e. at the start and after 90min. Selective attention was assessed at five time points during the morning (i.e. at baseline and after 20, 110, 130 and 220min), using the 'Sky Search' subtest of the 'Test of Selective Attention in Children'. We used GEE analysis to examine differences in Sky Search scores between the three experimental conditions, adjusting for school, baseline scores, self-reported screen time and time spent in sports. Children who performed two 20-min bouts of moderate-intensity physical activity had significantly better Sky Search scores compared to children who performed one physical activity bout or remained seated the whole morning (B=-0.26; 95% CI=[-0.52; -0.00]). Our findings support the importance of repeated physical activity during the school day for beneficial effects on selective attention in children. Copyright © 2015 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  3. An MGF-based unified framework to determine the joint statistics of partial sums of ordered i.n.d. random variables

    KAUST Repository

    Nam, Sungsik

    2014-08-01

    The joint statistics of partial sums of ordered random variables (RVs) are often needed for the accurate performance characterization of a wide variety of wireless communication systems. A unified analytical framework to determine the joint statistics of partial sums of ordered independent and identically distributed (i.i.d.) random variables was recently presented. However, the identical distribution assumption may not be valid in several real-world applications. With this motivation in mind, we consider in this paper the more general case in which the random variables are independent but not necessarily identically distributed (i.n.d.). More specifically, we extend the previous analysis and introduce a new more general unified analytical framework to determine the joint statistics of partial sums of ordered i.n.d. RVs. Our mathematical formalism is illustrated with an application on the exact performance analysis of the capture probability of generalized selection combining (GSC)-based RAKE receivers operating over frequency-selective fading channels with a non-uniform power delay profile. © 1991-2012 IEEE.

  4. Do Instructional Videos on Sputum Submission Result in Increased Tuberculosis Case Detection? A Randomized Controlled Trial.

    Directory of Open Access Journals (Sweden)

    Grace Mhalu

    Full Text Available We examined the effect of an instructional video about the production of diagnostic sputum on case detection of tuberculosis (TB), and evaluated the acceptance of the video. Randomized controlled trial. We prepared a culturally adapted instructional video for sputum submission. We analyzed 200 presumptive TB cases coughing for more than two weeks who attended the outpatient department of the governmental Municipal Hospital in Mwananyamala (Dar es Salaam, Tanzania). They were randomly assigned to either receive instructions on sputum submission using the video before submission (intervention group, n = 100) or standard of care (control group, n = 100). Sputum samples were examined for volume, quality and presence of acid-fast bacilli by experienced laboratory technicians blinded to study groups. Median age was 39.1 years (interquartile range 37.0-50.0); 94 (47%) were females, 106 (53%) were males, and 49 (24.5%) were HIV-infected. We found that the instructional video intervention was associated with detection of a higher proportion of microscopically confirmed cases (56%, 95% confidence interval [95% CI] 45.7-65.9%, sputum smear positive patients in the intervention group versus 23%, 95% CI 15.2-32.5%, in the control group, p <0.0001), an increase in volume of specimen, defined as a volume ≥3 ml (78%, 95% CI 68.6-85.7%, versus 45%, 95% CI 35.0-55.3%, p <0.0001), and specimens less likely to be salivary (14%, 95% CI 7.9-22.4%, versus 39%, 95% CI 29.4-49.3%, p = 0.0001). Older age, but not HIV status or sex, positively modified the effectiveness of the intervention. When asked how well the video instructions were understood, the majority of patients in the intervention group reported having understood the video instructions well (97%). Most of the patients thought the video would be useful in the cultural setting of Tanzania (92%). Sputum submission instructional videos increased the yield of tuberculosis cases through better quality of sputum

  5. Mirnacle: machine learning with SMOTE and random forest for improving selectivity in pre-miRNA ab initio prediction.

    Science.gov (United States)

    Marques, Yuri Bento; de Paiva Oliveira, Alcione; Ribeiro Vasconcelos, Ana Tereza; Cerqueira, Fabio Ribeiro

    2016-12-15

    MicroRNAs (miRNAs) are key gene expression regulators in plants and animals. Therefore, miRNAs are involved in several biological processes, making the study of these molecules one of the most relevant topics of molecular biology nowadays. However, characterizing miRNAs in vivo is still a complex task. As a consequence, in silico methods have been developed to predict miRNA loci. A common ab initio strategy to find miRNAs in genomic data is to search for sequences that can fold into the typical hairpin structure of miRNA precursors (pre-miRNAs). The current ab initio approaches, however, have selectivity issues, i.e., a high number of false positives is reported, which can lead to laborious and costly attempts to provide biological validation. This study presents an extension of the ab initio method miRNAFold, with the aim of improving selectivity through machine learning techniques, namely, random forest combined with the SMOTE procedure that copes with imbalanced datasets. By comparing our method, termed Mirnacle, with other important approaches in the literature, we demonstrate that Mirnacle substantially improves selectivity without compromising sensitivity. For the three datasets used in our experiments, our method achieved at least 97% of sensitivity and could deliver a two-fold, 20-fold, and 6-fold increase in selectivity, respectively, compared with the best results of current computational tools. The extension of miRNAFold by the introduction of machine learning techniques significantly increases selectivity in pre-miRNA ab initio prediction, which optimally contributes to advanced studies on miRNAs, as the need of biological validations is diminished. Hopefully, new research, such as studies of severe diseases caused by miRNA malfunction, will benefit from the proposed computational tool.
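
    The core learning step described above -- oversampling the minority class with SMOTE before training a random forest -- can be sketched as follows. This is not the Mirnacle code; the feature matrix and labels are simulated, and scikit-learn plus imbalanced-learn stand in for whatever implementation the authors used.

      import numpy as np
      from imblearn.over_sampling import SMOTE
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import precision_score, recall_score

      # Hypothetical feature matrix: rows = candidate hairpins, columns = structural features.
      rng = np.random.default_rng(42)
      X = rng.normal(size=(4000, 10))
      y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.3, size=4000) > 2.3).astype(int)  # rare positives

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

      # SMOTE synthesises extra minority-class examples so the forest is not swamped by negatives.
      X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_tr, y_tr)

      clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_bal, y_bal)
      pred = clf.predict(X_te)
      print("sensitivity (recall)         :", round(recall_score(y_te, pred), 3))
      print("selectivity proxy (precision):", round(precision_score(y_te, pred, zero_division=0), 3))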

  6. Design, and participant enrollment, of a randomized controlled trial evaluating effectiveness and cost-effectiveness of a community-based case management intervention, for patients suffering from COPD

    DEFF Research Database (Denmark)

    Sørensen, Sabrina Storgaard; Pedersen, Kjeld Møller; Weinreich, Ulla Møller

    2015-01-01

    Background: Case management interventions are recommended to improve quality of care and reduce costs in chronic care, but further evidence on effectiveness and cost-effectiveness is needed. The objective of this study is the reporting of the design and participant enrollment of a randomized...... controlled trial, conducted to evaluate the effectiveness and cost-effectiveness of a community-based case management model for patients suffering from chronic obstructive pulmonary disease (COPD). With a focus on support for self-care and care coordination, the intervention was hypothesized to result...... patients were randomized into two groups: the case-managed group and the usual-care group. Participant characteristics were obtained at baseline, and measures on effectiveness and costs were obtained through questionnaires and registries within a 12-month follow-up period. In the forthcoming analysis...

  7. Should Controls With Respiratory Symptoms Be Excluded From Case-Control Studies of Pneumonia Etiology? Reflections From the PERCH Study.

    Science.gov (United States)

    Higdon, Melissa M; Hammitt, Laura L; Deloria Knoll, Maria; Baggett, Henry C; Brooks, W Abdullah; Howie, Stephen R C; Kotloff, Karen L; Levine, Orin S; Madhi, Shabir A; Murdoch, David R; Scott, J Anthony G; Thea, Donald M; Driscoll, Amanda J; Karron, Ruth A; Park, Daniel E; Prosperi, Christine; Zeger, Scott L; O'Brien, Katherine L; Feikin, Daniel R

    2017-06-15

    Many pneumonia etiology case-control studies exclude controls with respiratory illness from enrollment or analyses. Herein we argue that selecting controls regardless of respiratory symptoms provides the least biased estimates of pneumonia etiology. We review 3 reasons investigators may choose to exclude controls with respiratory symptoms in light of epidemiologic principles of control selection and present data from the Pneumonia Etiology Research for Child Health (PERCH) study where relevant to assess their validity. We conclude that exclusion of controls with respiratory symptoms will result in biased estimates of etiology. Randomly selected community controls, with or without respiratory symptoms, as long as they do not meet the criteria for case-defining pneumonia, are most representative of the general population from which cases arose and the least subject to selection bias. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.

  8. Randomized algorithms in automatic control and data mining

    CERN Document Server

    Granichin, Oleg; Toledano-Kitai, Dvora

    2015-01-01

    In the fields of data mining and control, the huge amount of unstructured data and the presence of uncertainty in system descriptions have always been critical issues. The book Randomized Algorithms in Automatic Control and Data Mining introduces the readers to the fundamentals of randomized algorithm applications in data mining (especially clustering) and in automatic control synthesis. The methods proposed in this book guarantee that the computational complexity of classical algorithms and the conservativeness of standard robust control techniques will be reduced. It is shown that when a problem requires "brute force" in selecting among options, algorithms based on random selection of alternatives offer good results with certain probability for a restricted time and significantly reduce the volume of operations.

  9. The basic science and mathematics of random mutation and natural selection.

    Science.gov (United States)

    Kleinman, Alan

    2014-12-20

    The mutation and natural selection phenomenon can and often does cause the failure of antimicrobial, herbicidal, pesticide and cancer treatments, which act as selection pressures. This phenomenon operates with mathematically predictable behavior, which, when understood, leads to approaches that reduce and prevent the failure of these selection pressures. The mathematical behavior of mutation and selection is derived using the principles given by probability theory. The derivation of the equations describing the mutation and selection phenomenon is carried out in the context of an empirical example. Copyright © 2014 John Wiley & Sons, Ltd.

  10. Multi-Index Stochastic Collocation (MISC) for random elliptic PDEs

    KAUST Repository

    Haji Ali, Abdul Lateef; Nobile, Fabio; Tamellini, Lorenzo; Tempone, Raul

    2016-01-01

    In this work we introduce the Multi-Index Stochastic Collocation method (MISC) for computing statistics of the solution of a PDE with random data. MISC is a combination technique based on mixed differences of spatial approximations and quadratures over the space of random data. We propose an optimization procedure to select the most effective mixed differences to include in the MISC estimator: such optimization is a crucial step and allows us to build a method that, provided with sufficient solution regularity, is potentially more effective than other multi-level collocation methods already available in literature. We then provide a complexity analysis that assumes decay rates of product type for such mixed differences, showing that in the optimal case the convergence rate of MISC is only dictated by the convergence of the deterministic solver applied to a one dimensional problem. We show the effectiveness of MISC with some computational tests, comparing it with other related methods available in the literature, such as the Multi-Index and Multilevel Monte Carlo, Multilevel Stochastic Collocation, Quasi Optimal Stochastic Collocation and Sparse Composite Collocation methods.

  12. Bilirubin and Stroke Risk Using a Mendelian Randomization Design.

    Science.gov (United States)

    Lee, Sun Ju; Jee, Yon Ho; Jung, Keum Ji; Hong, Seri; Shin, Eun Soon; Jee, Sun Ha

    2017-05-01

    Circulating bilirubin, a natural antioxidant, is associated with decreased risk of stroke. However, the nature of the relationship between the two remains unknown. We used a Mendelian randomization analysis to assess the causal effect of serum bilirubin on stroke risk in Koreans. The 14 single-nucleotide polymorphisms (SNPs) associated with bilirubin level were genotyped in the KCPS-II (Korean Cancer Prevention Study-II) Biobank subcohort consisting of 4793 healthy Koreans and 806 stroke cases. A weighted genetic risk score was calculated using 14 SNPs selected from the top SNPs. Both rs6742078 (F statistic = 138) and the weighted genetic risk score with 14 SNPs (F statistic = 187) were strongly associated with bilirubin levels. Simultaneously, serum bilirubin level was associated with decreased risk of stroke in an ordinary least-squares analysis. However, in a 2-stage least-squares Mendelian randomization analysis, no causal relationship between serum bilirubin and stroke risk was found. There is no evidence that bilirubin level is causally associated with risk of stroke in Koreans. Therefore, bilirubin level is not a risk determinant of stroke. © 2017 American Heart Association, Inc.
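
    The 2-stage least-squares Mendelian randomization step mentioned above can be illustrated with simulated data (a generic sketch, not the authors' analysis): the genetic score serves as the instrument, the first stage predicts the exposure, and the second stage regresses the outcome on the instrumented exposure.

      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(1)
      n = 5000

      # Simulated data (assumptions for illustration only): no true causal effect of the exposure.
      grs = rng.normal(size=n)                                   # weighted genetic risk score (instrument)
      confounder = rng.normal(size=n)
      bilirubin = 0.4 * grs + 0.8 * confounder + rng.normal(size=n)          # exposure
      stroke_risk = 0.0 * bilirubin + 0.8 * confounder + rng.normal(size=n)  # outcome, confounded only

      # Stage 1: regress the exposure on the instrument and keep the fitted values.
      stage1 = LinearRegression().fit(grs.reshape(-1, 1), bilirubin)
      bilirubin_hat = stage1.predict(grs.reshape(-1, 1))

      # Stage 2: regress the outcome on the instrumented exposure.
      stage2 = LinearRegression().fit(bilirubin_hat.reshape(-1, 1), stroke_risk)
      naive = LinearRegression().fit(bilirubin.reshape(-1, 1), stroke_risk)
      print("naive OLS estimate (confounded)         :", round(naive.coef_[0], 3))
      print("2SLS Mendelian randomization estimate   :", round(stage2.coef_[0], 3))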

  13. Optimization methods for activities selection problems

    Science.gov (United States)

    Mahad, Nor Faradilah; Alias, Suriana; Yaakop, Siti Zulaika; Arshad, Norul Amanina Mohd; Mazni, Elis Sofia

    2017-08-01

    Co-curriculum activities must be joined by every student in Malaysia and these activities bring many benefits to the students. By joining these activities, the students can learn time management and develop many useful skills. This project focuses on the selection of co-curriculum activities in a secondary school using optimization methods, namely the Analytic Hierarchy Process (AHP) and Zero-One Goal Programming (ZOGP). A secondary school in Negeri Sembilan, Malaysia was chosen as a case study. A set of questionnaires was distributed randomly to calculate the weight of each activity based on the 3 chosen criteria, which are soft skills, interesting activities and performances. The weights were calculated using AHP, and the results showed that the most important criterion is soft skills. Then, the ZOGP model was analyzed using LINGO Software version 15.0. There are two priorities to be considered. The first priority, which is to minimize the budget for the activities, is achieved since the total budget can be reduced by RM233.00. Therefore, the total budget to implement the selected activities is RM11,195.00. The second priority, which is to select the co-curriculum activities, is also achieved. The results showed that 9 out of 15 activities were selected. Thus, it can be concluded that the AHP and ZOGP approach can be used as an optimization method for activity selection problems.
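
    A minimal sketch of the AHP weighting step described above; the pairwise comparison values below are invented for illustration, and the criterion weights are taken from the principal eigenvector of the comparison matrix.

      import numpy as np

      # Hypothetical pairwise comparison matrix for the three criteria
      # (soft skills, interesting activities, performances) on Saaty's 1-9 scale.
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      principal = np.argmax(eigvals.real)
      weights = np.abs(eigvecs[:, principal].real)
      weights /= weights.sum()                      # normalise so the weights sum to 1

      # Consistency check: CI = (lambda_max - n) / (n - 1), divided by Saaty's random index (0.58 for n = 3).
      lambda_max = eigvals.real[principal]
      ci = (lambda_max - 3) / (3 - 1)
      print("criterion weights :", weights.round(3))
      print("consistency ratio :", round(ci / 0.58, 3))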

  14. Modeling of Residential Water Demand Using Random Effect Model,Case Study: Arak City

    Directory of Open Access Journals (Sweden)

    Seyed Hossein Sajadifar

    2011-10-01

    Full Text Available The present study applies the “Partial Adjustment Model” and “Random Effect Model” techniques to the Stone-Geary linear expenditure system, in order to estimate the residential seasonal demand for water in Arak city. Per capita water consumption of family residences is regressed on marginal price, per capita income, price of other goods, average temperature and average rainfall. The panel data approach is based on a sample of 152 observations from Arak city covering 1993-2003. From the estimated price elasticity of residential water demand, we assess how a policy of responsive pricing can lead to more efficient household water consumption in Arak city. Results also indicated that the summer price elasticity was twice the winter value, and that price and income elasticities were less than 1 in all cases.

  15. RANDOM WALK HYPOTHESIS IN FINANCIAL MARKETS

    Directory of Open Access Journals (Sweden)

    Nicolae-Marius JULA

    2017-05-01

    Full Text Available The random walk hypothesis states that stock market prices do not follow a predictable trajectory, but are simply random. Before trying to predict a seemingly random set of data, one should test for randomness, because, despite the power and complexity of the models used, the results cannot otherwise be trusted. There are several methods for testing these hypotheses, and the computational power provided by the R environment makes the work of the researcher easier and cost-effective. The increasing power of computing and the continuous development of econometric tests should give potential investors new tools for selecting commodities and investing in efficient markets.
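
    One standard randomness check of the kind discussed above is the variance-ratio test: under a random walk, the variance of q-period log returns is q times the variance of 1-period log returns. The article works in R; the sketch below uses Python purely for illustration, on simulated prices.

      import numpy as np

      def variance_ratio(prices, q):
          # VR(q) = Var(q-period log returns) / (q * Var(1-period log returns)).
          # Values close to 1 are consistent with the random walk hypothesis.
          log_p = np.log(prices)
          r1 = np.diff(log_p)                 # 1-period returns
          rq = log_p[q:] - log_p[:-q]         # q-period returns
          return rq.var(ddof=1) / (q * r1.var(ddof=1))

      rng = np.random.default_rng(7)
      random_walk_prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 2500)))
      for q in (2, 5, 10):
          print(f"VR({q}) =", round(variance_ratio(random_walk_prices, q), 3))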

  16. A Phenomenological Case Study of the Elementary to Secondary Transition for One Female Student Diagnosed with Selective Mutism

    Science.gov (United States)

    Nashman-Smith, Mona

    2017-01-01

    Selective mutism (SM) is considered a communication and anxiety disorder that afflicts about 1% of students. The rarity of SM and the isolated cases of this condition has rendered the elementary to secondary school experience for a student with SM difficult to study. Utilizing a qualitative approach, this phenomenological case study examined the…

  17. The Goodness of Covariance Selection Problem from AUC Bounds

    OpenAIRE

    Khajavi, Navid Tafaghodi; Kuh, Anthony

    2016-01-01

    We conduct a study of graphical models and discuss the quality of model selection approximation by formulating the problem as a detection problem and examining the area under the curve (AUC). We are specifically looking at the model selection problem for jointly Gaussian random vectors. For Gaussian random vectors, this problem simplifies to the covariance selection problem which is widely discussed in literature by Dempster [1]. In this paper, we give the definition for the correlation appro...

  18. Motzkin numbers out of Random Domino Automaton

    Energy Technology Data Exchange (ETDEWEB)

    Białecki, Mariusz, E-mail: bialecki@igf.edu.pl [Institute of Geophysics, Polish Academy of Sciences, ul. Ks. Janusza 64, 01-452 Warszawa (Poland)

    2012-10-01

    Motzkin numbers are derived from a special case of the Random Domino Automaton – a recently proposed, slowly driven system that serves as a stochastic toy model of earthquakes. It is also a generalisation of the 1D Drossel–Schwabl forest-fire model. A solution of the set of equations describing the stationary state of the Random Domino Automaton in the inverse-power case is presented. A link with Motzkin numbers allows us to present the explicit form of the asymptotic behaviour of the automaton. -- Highlights: ► Motzkin numbers are derived from a stochastic cellular automaton with avalanches. ► An explicit solution of a toy model of earthquakes is presented. ► The case with an inverse-power distribution of avalanches is found.
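
    For reference, the Motzkin numbers mentioned above satisfy the recurrence M(0) = 1 and M(n+1) = M(n) + sum over k of M(k)·M(n-1-k); a short sketch computing them (generic, unrelated to the automaton's own equations):

      def motzkin(n):
          # Return the first n Motzkin numbers via
          # M[0] = 1,  M[i+1] = M[i] + sum(M[k] * M[i-1-k] for k in range(i)).
          m = [1]
          for i in range(n - 1):
              m.append(m[i] + sum(m[k] * m[i - 1 - k] for k in range(i)))
          return m

      print(motzkin(10))   # [1, 1, 2, 4, 9, 21, 51, 127, 323, 835]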

  19. Clinical Importance of Morphological Appearance of Seminiferous Tubules During MicroTESE in NOA Cases

    Directory of Open Access Journals (Sweden)

    A. H. Haliloglu

    2005-12-01

    Full Text Available Design: Clinical study. Setting: Research Center on Infertility, Ankara University; and Urology Department. Patients: 65 men with nonobstructive azoospermia (NOA). Interventions: The microscopic appearance of the seminiferous tubules was recorded during TESE surgery. Differing from other approaches, the largest tubules, opaque-white in color, were cut and removed. When none of the tubules had a discriminating appearance, randomized biopsies were obtained. Removed tissue pieces were subjected to mechanical mincing under the stereomicroscope and then to enzymatic digestion. Spermatozoa were searched for using an inversion microscope (×32 magnification). Main Outcome Measures: Morphological appearance of seminiferous tubules under optical magnification, spermatozoa recovery rates and histopathological findings were compared. RESULTS: In cases of Sertoli cell-only syndrome (SCOS), maturation arrest, hypospermatogenesis and focal spermatogenesis, TESE yielded at least one spermatozoon in 37%, 52%, 100% and 63% of the cases, respectively. When all the seminiferous tubules were homogeneously swollen, the histopathological diagnosis was hypospermatogenesis in 100% of the cases. Homogeneously thin and transparent tubules corresponded to SCOS or maturation arrest in 90% and 10% of the cases, respectively. Mature spermatozoa recovery rates were 100% and zero in homogeneously swollen and homogeneously thin tubules, respectively. CONCLUSIONS: The present data indicate that when all tubules are homogeneous in appearance and none can be discriminated from the others, using the microscope offers no advantage in selecting the tubules to be removed, and random selection would be sufficient. MicroTESE significantly increases success in NOA cases with heterogeneously dispersed seminiferous tubules.

  20. Organic Ferroelectric-Based 1T1T Random Access Memory Cell Employing a Common Dielectric Layer Overcoming the Half-Selection Problem.

    Science.gov (United States)

    Zhao, Qiang; Wang, Hanlin; Ni, Zhenjie; Liu, Jie; Zhen, Yonggang; Zhang, Xiaotao; Jiang, Lang; Li, Rongjin; Dong, Huanli; Hu, Wenping

    2017-09-01

    Organic electronics based on poly(vinylidenefluoride/trifluoroethylene) (P(VDF-TrFE)) dielectric is facing great challenges in flexible circuits. As one indispensable part of integrated circuits, there is an urgent demand for low-cost and easy-fabrication nonvolatile memory devices. A breakthrough is made on a novel ferroelectric random access memory cell (1T1T FeRAM cell) consisting of one selection transistor and one ferroelectric memory transistor in order to overcome the half-selection problem. Unlike complicated manufacturing using multiple dielectrics, this system simplifies 1T1T FeRAM cell fabrication using one common dielectric. To achieve this goal, a strategy for semiconductor/insulator (S/I) interface modulation is put forward and applied to nonhysteretic selection transistors with high performances for driving or addressing purposes. As a result, a high hole mobility of 3.81 cm² V⁻¹ s⁻¹ (average) for 2,6-diphenylanthracene (DPA) and an electron mobility of 0.124 cm² V⁻¹ s⁻¹ (average) for N,N'-1H,1H-perfluorobutyl dicyanoperylenecarboxydiimide (PDI-FCN₂) are obtained in selection transistors. In this work, we demonstrate this technology's potential for organic ferroelectric-based pixelated memory module fabrication. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Zero Distribution of System with Unknown Random Variables Case Study: Avoiding Collision Path

    Directory of Open Access Journals (Sweden)

    Parman Setyamartana

    2014-07-01

    Full Text Available This paper presents a stochastic analysis of finding feasible trajectories for robotic arm motion in an environment with obstacles. The unknown variables are the coefficients of the joint-angle polynomials for which collision-free motion is achieved. ãk is the matrix consisting of these unknown feasible polynomial coefficients. The pattern of feasible polynomials in the obstacle environment appears random. This paper proposes to model this random pattern using random polynomials with unknown variables as coefficients. The behavior of the system is obtained from the zero distribution, which characterizes such random polynomials. Results show that the pattern of collision-avoiding polynomials can be constructed from the zero distribution. The zero distribution acts as a building block of the system, with obstacles as the uncertainty factor. Through the scale factor k, which has a bounded range, the random coefficient pattern can be predicted.
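
    The idea of characterising a system through the zero distribution of polynomials with random coefficients can be sketched generically as follows (this is not the paper's trajectory-planning code; the degree, scale factor and coefficient distribution are assumptions for illustration):

      import numpy as np

      rng = np.random.default_rng(3)
      degree, n_samples, k = 6, 500, 2.0               # k plays the role of a scale factor

      zeros = []
      for _ in range(n_samples):
          coeffs = k * rng.uniform(-1.0, 1.0, degree + 1)   # one realisation of random coefficients
          zeros.extend(np.roots(coeffs))                    # zeros of that realisation

      radii = np.abs(np.array(zeros))
      print("mean |zero|                   :", round(radii.mean(), 3))
      print("share of zeros with |z| < 1.5 :", round((radii < 1.5).mean(), 3))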

  2. Beauveria keratitis and biopesticides: case histories and a random amplification of polymorphic DNA comparison.

    Science.gov (United States)

    Pariseau, Brett; Nehls, Sarah; Ogawa, Gregory S H; Sutton, Deanna A; Wickes, Brian L; Romanelli, Anna M

    2010-02-01

    The purposes of this study were to describe 2 contact lens-associated Beauveria keratitis cases and to compare the isolates of 3 contact lens-associated Beauveria keratitis cases with Beauveria-based biopesticides using random amplification of polymorphic DNA (RAPD). A 55-year-old diabetic woman from New Mexico and a 31-year-old healthy woman from southern Wisconsin developed soft contact lens-related corneal ulcers unresponsive to topical moxifloxacin and prednisolone acetate drops. Their corneal cultures grew B. bassiana. These isolates, an isolate from a third soft contact lens-related Beauveria keratitis case, and Beauveria-based biopesticides sold in the United States were analyzed using morphological features, DNA sequencing, and RAPD. A PubMed, Cochrane Library, OVID, UpToDate, and Google search using the term "Beauveria" found only 9 reported Beauveria keratitis infections. Patient 1 responded to topical natamycin, ketoconazole, and 200 mg oral ketoconazole twice daily before developing a secondary bacterial infection requiring penetrating keratoplasty. After subsequent cataract surgery, the best-corrected visual acuity was 20/20. Patient 2 was treated with topical natamycin, topical amphotericin, and 200 mg oral voriconazole twice daily for 1 month with residual scarring and a best-corrected visual acuity of 20/25. RAPD showed that all isolates were unrelated. Although earlier reported Beauveria keratitis cases occurred after corneal injury in patients who did not wear contact lenses, 3 recent patients wore soft contact lenses and denied trauma, mirroring a changing trend in microbial keratitis. RAPD analysis showed that the Beauveria isolates were unrelated to one another and to Beauveria-based biopesticides. In Patient 2, oral voriconazole worked well.

  3. Random Assignment: Practical Considerations from Field Experiments.

    Science.gov (United States)

    Dunford, Franklyn W.

    1990-01-01

    Seven qualitative issues associated with randomization that have the potential to weaken or destroy otherwise sound experimental designs are reviewed and illustrated via actual field experiments. Issue areas include ethics and legality, liability risks, manipulation of randomized outcomes, hidden bias, design intrusiveness, case flow, and…

  4. Identification and selection of cases and controls in the Pneumonia Etiology Research for Child Health project

    NARCIS (Netherlands)

    Deloria-Knoll, Maria; Feikin, Daniel R.; Scott, J. Anthony G.; O'Brien, Katherine L.; DeLuca, Andrea N.; Driscoll, Amanda J.; Levine, Orin S.; Black, Robert E.; Bhutta, Zulfiqar A.; Campbell, Harry; Cherian, Thomas; Crook, Derrick W.; de Jong, Menno D.; Dowell, Scott F.; Graham, Stephen M.; Klugman, Keith P.; Lanata, Claudio F.; Madhi, Shabir A.; Martin, Paul; Nataro, James P.; Piazza, Franco M.; Qazi, Shamim A.; Zar, Heather J.; Baggett, Henry C.; Brooks, W. Abdullah; Chipeta, James; Ebruke, Bernard; Endtz, Hubert P.; Groome, Michelle; Hammitt, Laura L.; Howie, Stephen R. C.; Kotloff, Karen; Maloney, Susan A.; Moore, David; Otieno, Juliet; Seidenberg, Phil; Tapia, Milagritos; Thamthitiwat, Somsak; Thea, Donald M.; Zaman, Khaleque

    2012-01-01

    Methods for the identification and selection of patients (cases) with severe or very severe pneumonia and controls for the Pneumonia Etiology Research for Child Health (PERCH) project were needed. Issues considered include eligibility criteria and sampling strategies, whether to enroll hospital or

  5. A Solution Method for Linear and Geometrically Nonlinear MDOF Systems with Random Properties subject to Random Excitation

    DEFF Research Database (Denmark)

    Micaletti, R. C.; Cakmak, A. S.; Nielsen, Søren R. K.

    A method for computing the lower-order moments of randomly-excited multi-degree-of-freedom (MDOF) systems with random structural properties is proposed. The method is grounded in the techniques of stochastic calculus, utilizing a Markov diffusion process to model the structural system with random structural properties. The resulting state-space formulation is a system of ordinary stochastic differential equations with random coefficients and deterministic initial conditions, which are subsequently transformed into ordinary stochastic differential equations with deterministic coefficients and random initial conditions. This transformation facilitates the derivation of differential equations which govern the evolution of the unconditional statistical moments of response. Primary consideration is given to linear systems and systems with odd polynomial nonlinearities, for in these cases...

  6. Random-walk simulation of selected aspects of dissipative collisions

    International Nuclear Information System (INIS)

    Toeke, J.; Gobbi, A.; Matulewicz, T.

    1984-11-01

    Internuclear thermal equilibrium effects and shell structure effects in dissipative collisions are studied numerically within the framework of the model of stochastic exchanges by applying the random-walk technique. Effective blocking of the drift through the mass flux induced by the temperature difference, while leaving the variances of the mass distributions unaltered is found possible, provided an internuclear potential barrier is present. Presence of the shell structure is found to lead to characteristic correlations between the consecutive exchanges. Experimental evidence for the predicted effects is discussed. (orig.)

  7. Locally Perturbed Random Walks with Unbounded Jumps

    OpenAIRE

    Paulin, Daniel; Szász, Domokos

    2010-01-01

    In \\cite{SzT}, D. Sz\\'asz and A. Telcs have shown that for the diffusively scaled, simple symmetric random walk, weak convergence to the Brownian motion holds even in the case of local impurities if $d \\ge 2$. The extension of their result to finite range random walks is straightforward. Here, however, we are interested in the situation when the random walk has unbounded range. Concretely we generalize the statement of \\cite{SzT} to unbounded random walks whose jump distribution belongs to th...

  8. Attentional Selection of Feature Conjunctions Is Accomplished by Parallel and Independent Selection of Single Features.

    Science.gov (United States)

    Andersen, Søren K; Müller, Matthias M; Hillyard, Steven A

    2015-07-08

    Experiments that study feature-based attention have often examined situations in which selection is based on a single feature (e.g., the color red). However, in more complex situations relevant stimuli may not be set apart from other stimuli by a single defining property but by a specific combination of features. Here, we examined sustained attentional selection of stimuli defined by conjunctions of color and orientation. Human observers attended to one out of four concurrently presented superimposed fields of randomly moving horizontal or vertical bars of red or blue color to detect brief intervals of coherent motion. Selective stimulus processing in early visual cortex was assessed by recordings of steady-state visual evoked potentials (SSVEPs) elicited by each of the flickering fields of stimuli. We directly contrasted attentional selection of single features and feature conjunctions and found that SSVEP amplitudes on conditions in which selection was based on a single feature only (color or orientation) exactly predicted the magnitude of attentional enhancement of SSVEPs when attending to a conjunction of both features. Furthermore, enhanced SSVEP amplitudes elicited by attended stimuli were accompanied by equivalent reductions of SSVEP amplitudes elicited by unattended stimuli in all cases. We conclude that attentional selection of a feature-conjunction stimulus is accomplished by the parallel and independent facilitation of its constituent feature dimensions in early visual cortex. The ability to perceive the world is limited by the brain's processing capacity. Attention affords adaptive behavior by selectively prioritizing processing of relevant stimuli based on their features (location, color, orientation, etc.). We found that attentional mechanisms for selection of different features belonging to the same object operate independently and in parallel: concurrent attentional selection of two stimulus features is simply the sum of attending to each of those

  9. Comparing conVEntional RadioTherapy with stereotactIC body radiotherapy in patients with spinAL metastases: study protocol for a randomized controlled trial following the cohort multiple randomized controlled trial design

    International Nuclear Information System (INIS)

    Velden, Joanne M. van der; Verkooijen, Helena M.; Seravalli, Enrica; Hes, Jochem; Gerlich, A. Sophie; Kasperts, Nicolien; Eppinga, Wietse S. C.; Verlaan, Jorrit-Jan; Vulpen, Marco van

    2016-01-01

    Standard radiotherapy is the treatment of first choice in patients with symptomatic spinal metastases, but is only moderately effective. Stereotactic body radiation therapy is increasingly used to treat spinal metastases, without randomized evidence of superiority over standard radiotherapy. The VERTICAL study aims to quantify the effect of stereotactic radiation therapy in patients with metastatic spinal disease. This study follows the ‘cohort multiple Randomized Controlled Trial’ design. The VERTICAL study is conducted within the PRESENT cohort. In PRESENT, all patients with bone metastases referred for radiation therapy are enrolled. For each patient, clinical and patient-reported outcomes are captured at baseline and at regular intervals during follow-up. In addition, patients give informed consent to be offered experimental interventions. Within PRESENT, 110 patients are identified as a sub cohort of eligible patients (i.e. patients with unirradiated painful, mechanically stable spinal metastases who are able to undergo stereotactic radiation therapy). After a protocol amendment, also patients with non-spinal bony metastases are eligible. From the sub cohort, a random selection of patients is offered stereotactic radiation therapy (n = 55), which patients may accept or refuse. Only patients accepting stereotactic radiation therapy sign informed consent for the VERTICAL trial. Non-selected patients (n = 55) receive standard radiotherapy, and are not aware of them serving as controls. Primary endpoint is pain response after three months. Data will be analyzed by intention to treat, complemented by instrumental variable analysis in case of substantial refusal of the stereotactic radiation therapy in the intervention arm. This study is designed to quantify the treatment response after (stereotactic) radiation therapy in patients with symptomatic spinal metastases. This is the first randomized study in palliative care following the cohort multiple Randomized

  10. Selecting the patients for morning report sessions: case-based vs. conventional method.

    Science.gov (United States)

    Rabiei, Mehdi; Saeidi, Masumeh; Kiani, Mohammad Ali; Amin, Sakineh Mohebi; Ahanchian, Hamid; Jafari, Seyed Ali; Kianifar, Hamidreza

    2015-08-01

    One of the most important issues in morning report sessions is the number of patients. The aim of this study was to investigate and compare the number of cases reported in the morning report sessions in terms of case-based and conventional methods from the perspective of pediatric residents of Mashhad University of Medical Sciences. The present study was conducted on 24 pediatric residents of Mashhad University of Medical Sciences in the academic year 2014-2015. In this survey, the residents replied to a 20-question researcher-made questionnaire that had been designed to measure the views of residents regarding the number of patients in the morning report sessions using case-based and conventional methods. The validity of the questionnaire was confirmed by experts' views and its reliability by calculating Cronbach's alpha coefficients. Data were analyzed by t-test analysis. The mean age of the residents was 30.852 ± 2.506, and 66.6% of them were female. The results showed that there was no significant relationship among the variables of academic year, gender, and residents' perspective on choosing the number of patients in the morning report sessions (P > 0.05). T-test analysis showed a significant difference in the average scores of residents in favor of the case-based method in comparison to the conventional method (P < 0.05); the case-based morning report was preferred compared to the conventional method. This method makes residents pay more attention to the details of patients' issues and therefore helps them to better plan how to address patient problems and improve their differential diagnosis skills.

  11. Random number generation and creativity.

    Science.gov (United States)

    Bains, William

    2008-01-01

    A previous paper suggested that humans can generate genuinely random numbers. I tested this hypothesis by repeating the experiment with a larger number of highly numerate subjects, asking them to call out a sequence of digits selected from 0 through 9. The resulting sequences were substantially non-random, with an excess of sequential pairs of numbers and a deficit of repeats of the same number, in line with previous literature. However, the previous literature suggests that humans generate random numbers with substantial conscious effort, and distractions which reduce that effort reduce the randomness of the numbers. I reduced my subjects' concentration by asking them to call out in another language, and with alcohol - neither affected the randomness of their responses. This suggests that the ability to generate random numbers is a 'basic' function of the human mind, even if those numbers are not mathematically 'random'. I hypothesise that there is a 'creativity' mechanism which, while not truly random, provides novelty as part of the mind's defence against closed programming loops, and that testing for the effects seen here in people more or less familiar with numbers or with spontaneous creativity could identify more features of this process. It is possible that training to perform better at simple random generation tasks could help to increase creativity, by training people to reduce the conscious mind's suppression of the 'spontaneous', creative response to new questions.
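
    The departures from randomness described above (an excess of sequential pairs and a deficit of immediate repeats) are easy to quantify; a hypothetical sketch on a made-up called-out digit sequence:

      import numpy as np
      from scipy.stats import chisquare

      digits = np.array([3, 7, 1, 2, 9, 4, 5, 8, 0, 6, 2, 3, 7, 8, 1, 4, 9, 5, 6, 0])  # invented example

      # Uniformity of single digits (expected count = len/10 for each digit 0-9).
      counts = np.bincount(digits, minlength=10)
      print("chi-square uniformity p-value:", round(chisquare(counts).pvalue, 3))

      # Excess of ascending sequential pairs (e.g. 3 followed by 4) and deficit of repeats.
      diffs = np.diff(digits)
      print("sequential (+1) pairs:", int((diffs == 1).sum()),
            "| repeats:", int((diffs == 0).sum()),
            "| expected of each by chance:", round(0.1 * (len(digits) - 1), 1))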

  12. Selection and characterization of DNA aptamers

    NARCIS (Netherlands)

    Ruigrok, V.J.B.

    2013-01-01

    This thesis focusses on the selection and characterisation of DNA aptamers and the various aspects related to their selection from large pools of randomized oligonucleotides. Aptamers are affinity tools that can specifically recognize and bind predefined target molecules; this ability, however,

  13. Whole Slide Imaging Versus Microscopy for Primary Diagnosis in Surgical Pathology: A Multicenter Blinded Randomized Noninferiority Study of 1992 Cases (Pivotal Study).

    Science.gov (United States)

    Mukhopadhyay, Sanjay; Feldman, Michael D; Abels, Esther; Ashfaq, Raheela; Beltaifa, Senda; Cacciabeve, Nicolas G; Cathro, Helen P; Cheng, Liang; Cooper, Kumarasen; Dickey, Glenn E; Gill, Ryan M; Heaton, Robert P; Kerstens, René; Lindberg, Guy M; Malhotra, Reenu K; Mandell, James W; Manlucu, Ellen D; Mills, Anne M; Mills, Stacey E; Moskaluk, Christopher A; Nelis, Mischa; Patil, Deepa T; Przybycin, Christopher G; Reynolds, Jordan P; Rubin, Brian P; Saboorian, Mohammad H; Salicru, Mauricio; Samols, Mark A; Sturgis, Charles D; Turner, Kevin O; Wick, Mark R; Yoon, Ji Y; Zhao, Po; Taylor, Clive R

    2018-01-01

    Most prior studies of primary diagnosis in surgical pathology using whole slide imaging (WSI) versus microscopy have focused on specific organ systems or included relatively few cases. The objective of this study was to demonstrate that WSI is noninferior to microscopy for primary diagnosis in surgical pathology. A blinded randomized noninferiority study was conducted across the entire range of surgical pathology cases (biopsies and resections, including hematoxylin and eosin, immunohistochemistry, and special stains) from 4 institutions using the original sign-out diagnosis (baseline diagnosis) as the reference standard. Cases were scanned, converted to WSI and randomized. Sixteen pathologists interpreted cases by microscopy or WSI, followed by a wash-out period of ≥4 weeks, after which cases were read by the same observers using the other modality. Major discordances were identified by an adjudication panel, and the differences between major discordance rates for both microscopy (against the reference standard) and WSI (against the reference standard) were calculated. A total of 1992 cases were included, resulting in 15,925 reads. The major discordance rate with the reference standard diagnosis was 4.9% for WSI and 4.6% for microscopy. The difference between major discordance rates for microscopy and WSI was 0.4% (95% confidence interval, -0.30% to 1.01%). The difference in major discordance rates for WSI and microscopy was highest in endocrine pathology (1.8%), neoplastic kidney pathology (1.5%), urinary bladder pathology (1.3%), and gynecologic pathology (1.2%). Detailed analysis of these cases revealed no instances where interpretation by WSI was consistently inaccurate compared with microscopy for multiple observers. We conclude that WSI is noninferior to microscopy for primary diagnosis in surgical pathology, including biopsies and resections stained with hematoxylin and eosin, immunohistochemistry and special stains. This conclusion is valid across a wide

  14. Range Selection and Median

    DEFF Research Database (Denmark)

    Jørgensen, Allan Grønlund; Larsen, Kasper Green

    2011-01-01

    Range selection is the problem of preprocessing an input array A of n unique integers, such that given a query (i, j, k), one can report the k'th smallest integer in the subarray A[i], A[i+1], ..., A[j]. In this paper we consider static data structures in the word-RAM for range selection and several natural special cases thereof. The first special case is known as range median, which arises when k is fixed to ⌊(j − i + 1)/2⌋. The second case, denoted prefix selection, arises when i is fixed to 0. Finally, we also consider the bounded rank prefix selection problem and the fixed rank range selection problem. In the former, data structures must support prefix selection queries under the assumption that k is bounded by some value (at most n) given at construction time, while in the latter, data structures must support range selection queries where k is fixed beforehand for all queries. We prove cell probe lower bounds...
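
    To fix the query semantics, a naive reference implementation of range selection and its special cases is sketched below; the data structures in the paper answer the same queries far more efficiently, so this only illustrates what is being asked.

      def range_select(a, i, j, k):
          # Return the k'th smallest element (1-indexed) of a[i..j], inclusive.
          return sorted(a[i:j + 1])[k - 1]

      def range_median(a, i, j):
          # Range median: k fixed to floor((j - i + 1) / 2), as in the paper.
          return range_select(a, i, j, max(1, (j - i + 1) // 2))

      def prefix_select(a, j, k):
          # Prefix selection: i fixed to 0.
          return range_select(a, 0, j, k)

      a = [8, 3, 10, 1, 7, 5, 12, 2]
      print(range_select(a, 2, 6, 3))   # 3rd smallest of [10, 1, 7, 5, 12] -> 7
      print(range_median(a, 0, 7))      # lower median of the whole array -> 5
      print(prefix_select(a, 4, 2))     # 2nd smallest of [8, 3, 10, 1, 7] -> 3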

  15. Rain dance: the role of randomization in clinical trials

    Directory of Open Access Journals (Sweden)

    Diniz JB

    2016-07-01

    Full Text Available Juliana Belo Diniz,1 Victor Fossaluza,2 Carlos Alberto de Bragança Pereira,1,2 Sergio Wechsler2 1Institute of Psychiatry, Clinics Hospital University of São Paulo Medical School, 2Department of Statistics, Institute of Mathematics and Statistics, University of São Paulo, São Paulo, Brazil Abstract: Randomized clinical trials are the gold standard for testing efficacy of treatment interventions. However, although randomization protects against deliberately biased samples, it does not guarantee random imbalances will not occur. Methods of intentional allocation that can overcome such deficiency of randomization have been developed, but are less frequently applied than randomization. Initially, we introduce a fictitious case example to revise and discuss the reasons of researchers' resistance to intentionally allocate instead of simply randomizing. We then introduce a real case example to evaluate the performance of an intentional protocol for allocation based on compositional data balance. A real case of allocation of 50 patients in two arms was compared with an optimal allocation of global instead of sequential arrivals. Performance was measured by a weighted average of Aitchison distances, between arms, of prognostic factors. To compare the intentional allocation with simple random allocation, 50,000 arrival orderings of 50 patients were simulated. To each one of the orders, both kinds of allocations into two arms were considered. Intentional allocation performed as well as optimal allocation in the case considered. In addition, out of the 50,000 simulated orders, 61% of them performed better with intentional allocation than random allocation. Hence, we conclude that intentional allocation should be encouraged in the design of future interventional clinical trials as a way to prevent unbalanced samples. Our sequential method is a viable alternative to overcome technical difficulties for study designs that require sequential inclusion of

  16. Pseudo-Random Number Generators

    Science.gov (United States)

    Howell, L. W.; Rheinfurth, M. H.

    1984-01-01

    Package features a comprehensive selection of probabilistic distributions. Monte Carlo simulations are resorted to whenever the systems studied are not amenable to deterministic analyses or when direct experimentation is not feasible. Random numbers having certain specified distribution characteristics are an integral part of such simulations. Package consists of a collection of "pseudorandom" number generators for use in Monte Carlo simulations.
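
    For context, the simplest pseudorandom generator of the kind collected in such packages is the linear congruential generator; a minimal sketch using generic textbook constants (not the package's actual generators), together with inverse-transform sampling to obtain a non-uniform distribution:

      import math

      def lcg(seed, a=1664525, c=1013904223, m=2**32):
          # Linear congruential generator: x_{n+1} = (a * x_n + c) mod m,
          # returned as floats in [0, 1). The constants are the Numerical Recipes values.
          x = seed
          while True:
              x = (a * x + c) % m
              yield x / m

      gen = lcg(seed=12345)
      uniforms = [next(gen) for _ in range(5)]
      print([round(u, 6) for u in uniforms])

      # Inverse-transform sampling turns uniform deviates into another distribution,
      # e.g. exponential with rate 1.
      exponentials = [-math.log(1 - u) for u in uniforms]
      print([round(e, 6) for e in exponentials])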

  17. Model Selection with the Linear Mixed Model for Longitudinal Data

    Science.gov (United States)

    Ryoo, Ji Hoon

    2011-01-01

    Model building or model selection with linear mixed models (LMMs) is complicated by the presence of both fixed effects and random effects. The fixed effects structure and random effects structure are codependent, so selection of one influences the other. Most presentations of LMM in psychology and education are based on a multilevel or…

  18. The role intuitive decision making plays in project selection in the residential property market: a case study of a medium sized

    OpenAIRE

    Ryan, Diarmuid

    2013-01-01

    non-peer-reviewed This thesis addresses the role of intuition in project selection in residential property development. The paper provides a comprehensive review of the existing literature in relation to project selection and decision making. In addition, the examination of the project files of M.A. Ryan & Sons Ltd., a medium sized property development company, has enabled a case study to be carried out on three projects undertaken by the organisation. Through the case study, t...

  19. Validation of a case definition to define hypertension using administrative data.

    Science.gov (United States)

    Quan, Hude; Khan, Nadia; Hemmelgarn, Brenda R; Tu, Karen; Chen, Guanmin; Campbell, Norm; Hill, Michael D; Ghali, William A; McAlister, Finlay A

    2009-12-01

    We validated the accuracy of case definitions for hypertension derived from administrative data across time periods (year 2001 versus 2004) and geographic regions using physician charts. Physician charts were randomly selected in rural and urban areas from Alberta and British Columbia, Canada, during years 2001 and 2004. Physician charts were linked with administrative data through unique personal health number. We reviewed charts of approximately 50 randomly selected patients >35 years of age from each clinic within 48 urban and 16 rural family physician clinics to identify physician diagnoses of hypertension during the years 2001 and 2004. The validity indices were estimated for diagnosed hypertension using 3 years of administrative data for the 8 case-definition combinations. Of the 3,362 patient charts reviewed, the prevalence of hypertension ranged from 18.8% to 33.3%, depending on the year and region studied. The administrative data hypertension definition of "2 claims within 2 years or 1 hospitalization" had the highest validity relative to the other definitions evaluated (sensitivity 75%, specificity 94%, positive predictive value 81%, negative predictive value 92%, and kappa 0.71). After adjustment for age, sex, and comorbid conditions, the sensitivities between regions, years, and provinces were not significantly different, but the positive predictive value varied slightly across geographic regions. These results provide evidence that administrative data can be used as a relatively valid source of data to define cases of hypertension for surveillance and research purposes.
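
    The validity indices reported above follow directly from a 2×2 table of the administrative-data definition against the chart (reference) diagnosis; a sketch with invented counts, not the study's data:

      # Hypothetical 2x2 counts: administrative definition (+/-) against chart diagnosis (+/-).
      tp, fp, fn, tn = 300, 70, 100, 2892   # invented numbers for illustration

      sens = tp / (tp + fn)
      spec = tn / (tn + fp)
      ppv = tp / (tp + fp)
      npv = tn / (tn + fn)

      # Cohen's kappa: observed agreement corrected for chance agreement.
      n = tp + fp + fn + tn
      po = (tp + tn) / n
      pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
      kappa = (po - pe) / (1 - pe)

      print(f"sensitivity={sens:.2f} specificity={spec:.2f} "
            f"PPV={ppv:.2f} NPV={npv:.2f} kappa={kappa:.2f}")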

  20. Bayesian dynamic modeling of time series of dengue disease case counts.

    Science.gov (United States)

    Martínez-Bello, Daniel Adyro; López-Quílez, Antonio; Torres-Prieto, Alexander

    2017-07-01

    The aim of this study is to model the association between weekly time series of dengue case counts and meteorological variables, in a high-incidence city of Colombia, applying Bayesian hierarchical dynamic generalized linear models over the period January 2008 to August 2015. Additionally, we evaluate the model's short-term performance for predicting dengue cases. The methodology employs dynamic Poisson log-link models including constant or time-varying coefficients for the meteorological variables. Calendar effects were modeled using constant or first- or second-order random walk time-varying coefficients. The meteorological variables were modeled using constant coefficients and first-order random walk time-varying coefficients. We applied Markov Chain Monte Carlo simulations for parameter estimation, and the deviance information criterion (DIC) statistic for model selection. We assessed the short-term predictive performance of the selected final model at several time points within the study period using the mean absolute percentage error. The results showed that the best model included first-order random walk time-varying coefficients for the calendar trend and for the meteorological variables. Besides the computational challenges, interpreting the results implies a complete analysis of the time series of dengue with respect to the parameter estimates of the meteorological effects. We found small values of the mean absolute percentage errors for one- or two-week out-of-sample predictions at most prediction points, associated with low volatility periods in the dengue counts. We discuss the advantages and limitations of the dynamic Poisson models for studying the association between time series of dengue disease and meteorological variables. The key conclusion of the study is that dynamic Poisson models account for the dynamic nature of the variables involved in the modeling of time series of dengue disease, producing useful

  1. Evolving Random Forest for Preference Learning

    DEFF Research Database (Denmark)

    Abou-Zleikha, Mohamed; Shaker, Noor

    2015-01-01

    This paper introduces a novel approach for pairwise preference learning through a combination of an evolutionary method and random forest. Grammatical evolution is used to describe the structure of the trees in the Random Forest (RF) and to handle the process of evolution. Evolved random forests ...... obtained for predicting pairwise self-reports of users for the three emotional states engagement, frustration and challenge show very promising results that are comparable and in some cases superior to those obtained from state-of-the-art methods....

  2. The adverse effect of selective cyclooxygenase-2 inhibitor on random skin flap survival in rats.

    Directory of Open Access Journals (Sweden)

    Haiyong Ren

    Full Text Available BACKGROUND: Cyclooxygenase-2 (COX-2) inhibitors provide desired analgesic effects after injury or surgery, but evidence suggests they also attenuate wound healing. This study investigates the effect of a COX-2 inhibitor on random skin flap survival. METHODS: The McFarlane flap model was established in 40 rats evaluated in two groups; for 7 days the study group received Parecoxib and the control group an equal volume of saline by injection. The necrotic area of the flap was measured, and specimens of the flap were stained with haematoxylin-eosin (HE) for histologic analysis. Immunohistochemical staining was performed to analyse the levels of VEGF and COX-2. RESULTS: 7 days after the operation, the flap necrotic area ratio in the study group (66.65 ± 2.81%) was significantly larger than that of the control group (48.81 ± 2.33%) (P < 0.01). Histological analysis demonstrated angiogenesis, with mean vessel density per mm² being lower in the study group (15.4 ± 4.4) than in the control group (27.2 ± 4.1) (P < 0.05). The expression of COX-2 and VEGF protein in intermediate area II was evaluated in the two groups by immunohistochemistry. The expression of COX-2 in the study group was 1022.45 ± 153.1, and in the control group 2638.05 ± 132.2 (P < 0.01). The expression of VEGF in the study and control groups was 2779.45 ± 472.0 vs 4938.05 ± 123.6 (P < 0.01). In the COX-2 inhibitor group, the expression of COX-2 and VEGF protein was remarkably down-regulated compared with the control group. CONCLUSION: The selective COX-2 inhibitor had an adverse effect on random skin flap survival. Suppression of neovascularization induced by a low level of VEGF is the proposed biological mechanism.

  3. Random walks and diffusion on networks

    Science.gov (United States)

    Masuda, Naoki; Porter, Mason A.; Lambiotte, Renaud

    2017-11-01

    Random walks are ubiquitous in the sciences, and they are interesting from both theoretical and practical perspectives. They are one of the most fundamental types of stochastic processes; can be used to model numerous phenomena, including diffusion, interactions, and opinions among humans and animals; and can be used to extract information about important entities or dense groups of entities in a network. Random walks have been studied for many decades on both regular lattices and (especially in the last couple of decades) on networks with a variety of structures. In the present article, we survey the theory and applications of random walks on networks, restricting ourselves to simple cases of single and non-adaptive random walkers. We distinguish three main types of random walks: discrete-time random walks, node-centric continuous-time random walks, and edge-centric continuous-time random walks. We first briefly survey random walks on a line, and then we consider random walks on various types of networks. We extensively discuss applications of random walks, including ranking of nodes (e.g., PageRank), community detection, respondent-driven sampling, and opinion models such as voter models.
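
    A minimal sketch of a discrete-time random walk on a small network and its stationary distribution, which underlies ranking applications such as PageRank mentioned above; the adjacency matrix is an invented example:

      import numpy as np

      # Adjacency matrix of a small undirected network (invented example).
      A = np.array([[0, 1, 1, 0],
                    [1, 0, 1, 1],
                    [1, 1, 0, 1],
                    [0, 1, 1, 0]], dtype=float)

      # Discrete-time random walk: from node i, move to a uniformly random neighbour.
      P = A / A.sum(axis=1, keepdims=True)          # row-stochastic transition matrix

      # Stationary distribution by power iteration; for undirected graphs it is
      # proportional to node degree.
      pi = np.full(len(A), 1 / len(A))
      for _ in range(200):
          pi = pi @ P
      print("stationary distribution:", pi.round(3))
      print("degree / total degree  :", (A.sum(axis=1) / A.sum()).round(3))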

  4. Bias in random forest variable importance measures: Illustrations, sources and a solution

    Directory of Open Access Journals (Sweden)

    Hothorn Torsten

    2007-01-01

    Full Text Available Abstract Background Variable importance measures for random forests have been receiving increased attention as a means of variable selection in many classification tasks in bioinformatics and related scientific fields, for instance to select a subset of genetic markers relevant for the prediction of a certain disease. We show that random forest variable importance measures are a sensible means for variable selection in many applications, but are not reliable in situations where potential predictor variables vary in their scale of measurement or their number of categories. This is particularly important in genomics and computational biology, where predictors often include variables of different types, for example when predictors include both sequence data and continuous variables such as folding energy, or when amino acid sequence data show different numbers of categories. Results Simulation studies are presented illustrating that, when random forest variable importance measures are used with data of varying types, the results are misleading because suboptimal predictor variables may be artificially preferred in variable selection. The two mechanisms underlying this deficiency are biased variable selection in the individual classification trees used to build the random forest on one hand, and effects induced by bootstrap sampling with replacement on the other hand. Conclusion We propose to employ an alternative implementation of random forests, that provides unbiased variable selection in the individual classification trees. When this method is applied using subsampling without replacement, the resulting variable importance measures can be used reliably for variable selection even in situations where the potential predictor variables vary in their scale of measurement or their number of categories. The usage of both random forest algorithms and their variable importance measures in the R system for statistical computing is illustrated and
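
    A related remedy available in scikit-learn -- permutation importance computed on held-out data -- illustrates the same point, although it is not the conditional-inference-forest implementation the authors propose; the data below are simulated so that only the binary predictor is informative:

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.inspection import permutation_importance
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      n = 1000
      # Predictors of deliberately different types: continuous, 2-level, and 20-level categorical.
      x_cont = rng.normal(size=n)
      x_bin = rng.integers(0, 2, size=n)
      x_many = rng.integers(0, 20, size=n)
      y = (x_bin + rng.normal(scale=0.5, size=n) > 0.5).astype(int)   # only x_bin carries signal

      X = np.column_stack([x_cont, x_bin, x_many])
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

      forest = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
      print("impurity-based importances:", forest.feature_importances_.round(3))

      # Permutation importance on held-out data is less prone to favouring
      # high-cardinality but uninformative predictors.
      perm = permutation_importance(forest, X_te, y_te, n_repeats=20, random_state=0)
      print("permutation importances   :", perm.importances_mean.round(3))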

  5. Peer-selected "best papers"-are they really that "good"?

    Science.gov (United States)

    Wainer, Jacques; Eckmann, Michael; Rocha, Anderson

    2015-01-01

    Peer evaluation is the cornerstone of science evaluation. In this paper, we analyze whether or not a form of peer evaluation, the pre-publication selection of the best papers in Computer Science (CS) conferences, is better than random, when considering future citations received by the papers. Considering 12 conferences (for several years), we collected the citation counts from Scopus for both the best papers and the non-best papers. For a different set of 17 conferences, we collected the data from Google Scholar. For each data set, we computed the proportion of cases whereby the best paper has more citations. We also compare this proportion for years before 2010 and after to evaluate if there is a propaganda effect. Finally, we count the proportion of best papers that are in the top 10% and 20% most cited for each conference instance. The probability that a best paper will receive more citations than a non best paper is 0.72 (95% CI = 0.66, 0.77) for the Scopus data, and 0.78 (95% CI = 0.74, 0.81) for the Scholar data. There are no significant changes in the probabilities for different years. Also, 51% of the best papers are among the top 10% most cited papers in each conference/year, and 64% of them are among the top 20% most cited. There is strong evidence that the selection of best papers in Computer Science conferences is better than a random selection, and that a significant number of the best papers are among the top cited papers in the conference.
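
    A minimal sketch of the kind of summary statistic reported above — the proportion of paired comparisons won by the best paper, with a percentile-bootstrap confidence interval — using synthetic win/loss indicators in place of the Scopus and Google Scholar citation data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical paired comparisons: 1 if the "best paper" received more citations
# than a matched non-best paper, 0 otherwise (synthetic stand-in for the data).
wins = rng.binomial(1, 0.75, size=400)

p_hat = wins.mean()

# Percentile bootstrap confidence interval for the proportion.
boot = np.array([rng.choice(wins, size=wins.size, replace=True).mean()
                 for _ in range(10_000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"proportion = {p_hat:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```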

  6. Ellipsometry measurements of glass transition breadth in bulk films of random, block, and gradient copolymers.

    Science.gov (United States)

    Mok, M M; Kim, J; Marrou, S R; Torkelson, J M

    2010-03-01

    Bulk films of random, block and gradient copolymer systems were studied using ellipsometry to demonstrate the applicability of the numerical differentiation technique pioneered by Kawana and Jones for studying the glass transition temperature (Tg) behavior and thermal expansivities of copolymers possessing different architectures and different levels of nanoheterogeneity. In a series of styrene/n-butyl methacrylate (S/nBMA) random copolymers, Tg breadths were observed to increase from approximately 17 °C in styrene-rich cases to almost 30 °C in nBMA-rich cases, reflecting previous observations of significant nanoheterogeneity in PnBMA homopolymers. The derivative technique also revealed for the first time a substantial increase in glassy-state expansivity with increasing nBMA content in S/nBMA random copolymers, from 1.4 × 10^-4 K^-1 in PS to 3.5 × 10^-4 K^-1 in PnBMA. The first characterization of block copolymer Tg's and Tg breadths by ellipsometry is given, examining the impact of nanophase-segregated copolymer structure on ellipsometric measurements of the glass transition. The results show that, while the technique is effective in detecting the two Tg's expected in certain block copolymer systems, the details of the glass transition can become suppressed in ellipsometry measurements of a rubbery minor phase under conditions where the matrix is glassy; meanwhile, both transitions are easily discernible by differential scanning calorimetry. Finally, broad glass transition regions were measured in gradient copolymers, yielding in some cases extraordinary Tg breadths of 69-71 °C, factors of 4-5 larger than the Tg breadths of related homopolymers and random copolymers. Surprisingly, one gradient copolymer demonstrated a slightly narrower Tg breadth than the S/nBMA random copolymers with the highest nBMA content. This highlights the fact that nanoheterogeneity relevant to the glass transition response in selected

  7. Randomized branch sampling to estimate fruit production in Pecan trees cv. ‘Barton’

    Directory of Open Access Journals (Sweden)

    Filemom Manoel Mokochinski

    Full Text Available ABSTRACT: Sampling techniques to quantify fruit production are still very scarce and create a gap in crop development research. This study was conducted on a rural property in the county of Cachoeira do Sul - RS to estimate the efficiency of randomized branch sampling (RBS) in quantifying the production of pecan fruit at three different ages (5, 7 and 10 years). Two selection techniques were tested: probability proportional to the diameter (PPD) and uniform probability (UP), which were performed on nine trees, three from each age and randomly chosen. The RBS underestimated fruit production for all ages, and its main drawback was the high sampling error (125.17% - PPD and 111.04% - UP). The UP was regarded as more efficient than the PPD, though both techniques estimated similar production and similar experimental errors. In conclusion, branch sampling was inaccurate for this case study, requiring new studies to produce estimates with smaller sampling error.
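
    For readers unfamiliar with randomized branch sampling, the sketch below shows a single-stage toy version: one branch is selected with probability proportional to diameter (PPD) or with uniform probability (UP), and its fruit count is expanded by the inverse selection probability, giving an unbiased estimate of the whole-tree total. The diameters and counts are invented for illustration; the study's multi-stage procedure and error figures are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy tree: terminal branches with (diameter, fruit count) -- illustrative numbers.
diam = np.array([8.0, 5.0, 3.0, 2.0])
fruit = np.array([120, 60, 25, 10])
total = fruit.sum()

def rbs_estimate(prob):
    """One randomized-branch-sampling estimate: pick a branch with the given
    selection probabilities and expand its count by the inverse probability."""
    i = rng.choice(len(fruit), p=prob)
    return fruit[i] / prob[i]

p_ppd = diam / diam.sum()                    # probability proportional to diameter
p_up = np.full(len(fruit), 1 / len(fruit))   # uniform probability

for name, p in [("PPD", p_ppd), ("UP", p_up)]:
    est = np.array([rbs_estimate(p) for _ in range(50_000)])
    print(f"{name}: true total={total}, mean estimate={est.mean():.1f}, "
          f"relative sampling error={est.std() / total:.2%}")
```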

  8. Multi-Label Learning via Random Label Selection for Protein Subcellular Multi-Locations Prediction.

    Science.gov (United States)

    Wang, Xiao; Li, Guo-Zheng

    2013-03-12

    Prediction of protein subcellular localization is an important but challenging problem, particularly when proteins may simultaneously exist at, or move between, two or more different subcellular location sites. Most of the existing protein subcellular localization methods are only used to deal with the single-location proteins. In the past few years, only a few methods have been proposed to tackle proteins with multiple locations. However, they only adopt a simple strategy, that is, transforming the multi-location proteins to multiple proteins with single location, which doesn't take correlations among different subcellular locations into account. In this paper, a novel method named RALS (multi-label learning via RAndom Label Selection), is proposed to learn from multi-location proteins in an effective and efficient way. Through five-fold cross validation test on a benchmark dataset, we demonstrate our proposed method with consideration of label correlations obviously outperforms the baseline BR method without consideration of label correlations, indicating correlations among different subcellular locations really exist and contribute to improvement of prediction performance. Experimental results on two benchmark datasets also show that our proposed methods achieve significantly higher performance than some other state-of-the-art methods in predicting subcellular multi-locations of proteins. The prediction web server is available at http://levis.tongji.edu.cn:8080/bioinfo/MLPred-Euk/ for the public usage.

  9. [Acupotomy and acupuncture in the treatment of avascular necrosis of femoral head at the early and middle stages: a clinical randomized controlled trial].

    Science.gov (United States)

    Wang, Zhanyou; Zhou, Xuelong; Xie, Lishuang; Liang, Dongyue; Wang, Ying; Zhang, Hong-An; Zheng, Jinghong

    2016-10-12

    To compare the efficacy of acupotomy and acupuncture in the treatment of avascular necrosis of the femoral head at the early and middle stages. A randomized controlled prospective design was adopted. Sixty cases of avascular necrosis of the femoral head at Ficat-Arlet stages Ⅰ to Ⅱ were randomized by a third party into an acupotomy group (32 cases) and an acupuncture group (28 cases). In the acupotomy group, acupotomy was applied to release the treatment sites of the hip joint, once every two weeks, for a total of 3 sessions. In the acupuncture group, ashi points around the hip joint were selected and stimulated with warm acupuncture therapy, once every day, for 6 weeks. The Harris hip score was observed before and after treatment, and the efficacy was evaluated in the two groups. The Harris hip score improved significantly after treatment in both groups of patients with avascular necrosis of the femoral head at the early and middle stages.

  10. Method selection for sustainability assessments: The case of recovery of resources from waste water.

    Science.gov (United States)

    Zijp, M C; Waaijers-van der Loop, S L; Heijungs, R; Broeren, M L M; Peeters, R; Van Nieuwenhuijzen, A; Shen, L; Heugens, E H W; Posthuma, L

    2017-07-15

    Sustainability assessments provide scientific support in decision procedures towards sustainable solutions. However, in order to contribute in identifying and choosing sustainable solutions, the sustainability assessment has to fit the decision context. Two complicating factors exist. First, different stakeholders tend to have different views on what a sustainability assessment should encompass. Second, a plethora of sustainability assessment methods exist, due to the multi-dimensional characteristic of the concept. Different methods provide other representations of sustainability. Based on a literature review, we present a protocol to facilitate method selection together with stakeholders. The protocol guides the exploration of i) the decision context, ii) the different views of stakeholders and iii) the selection of pertinent assessment methods. In addition, we present an online tool for method selection. This tool identifies assessment methods that meet the specifications obtained with the protocol, and currently contains characteristics of 30 sustainability assessment methods. The utility of the protocol and the tool are tested in a case study on the recovery of resources from domestic waste water. In several iterations, a combination of methods was selected, followed by execution of the selected sustainability assessment methods. The assessment results can be used in the first phase of the decision procedure that leads to a strategic choice for sustainable resource recovery from waste water in the Netherlands. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Non-Selective Evolution of Growing Populations.

    Directory of Open Access Journals (Sweden)

    Karl Wienand

    Full Text Available Non-selective effects, like genetic drift, are an important factor in modern conceptions of evolution, and have been extensively studied for constant population sizes (Kimura, 1955; Otto and Whitlock, 1997). Here, we consider non-selective evolution in the case of growing populations that are of small size and have varying trait compositions (e.g. after a population bottleneck). We find that, in these conditions, populations never fixate to a trait, but tend to a random limit composition, and that the distribution of compositions "freezes" to a steady state. This final state is crucially influenced by the initial conditions. We obtain these findings from a combined theoretical and experimental approach, using multiple mixed subpopulations of two Pseudomonas putida strains in non-selective growth conditions (Matthijs et al, 2009) as a model system. The experimental results for the population dynamics match the theoretical predictions based on the Pólya urn model (Eggenberger and Pólya, 1923) for all analyzed parameter regimes. In summary, we show that exponential growth stops genetic drift. This result contrasts with previous theoretical analyses of non-selective evolution (e.g. genetic drift), which investigated how traits spread and eventually take over populations (fixate) (Kimura, 1955; Otto and Whitlock, 1997). Moreover, our work highlights how deeply growth influences non-selective evolution, and how it plays a key role in maintaining genetic variability. Consequently, it is of particular importance in life-cycle models (Melbinger et al, 2010; Cremer et al, 2011; Cremer et al, 2012) of periodically shrinking and expanding populations.
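
    A minimal simulation of the Pólya urn dynamics referred to above, assuming an illustrative initial composition of two individuals per strain: replicate populations do not fixate but converge to random limit fractions whose spread reflects the initial condition.

```python
import numpy as np

rng = np.random.default_rng(4)

def polya_urn(n_a, n_b, n_draws):
    """Standard Pólya urn: draw a ball, return it plus one extra of the same colour.
    This mimics non-selective growth in which each division adds one individual."""
    for _ in range(n_draws):
        if rng.random() < n_a / (n_a + n_b):
            n_a += 1
        else:
            n_b += 1
    return n_a / (n_a + n_b)

# Many replicate "populations" started from the same small initial composition.
finals = np.array([polya_urn(2, 2, 1000) for _ in range(2000)])
print("mean final fraction of strain A:", finals.mean())   # stays near 0.5 ...
print("spread of final fractions      :", finals.std())    # ... but does not fixate:
# with 2 + 2 initial individuals the limit fraction follows a Beta(2, 2) distribution.
```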

  12. Survivor bias in Mendelian randomization analysis

    DEFF Research Database (Denmark)

    Vansteelandt, Stijn; Dukes, Oliver; Martinussen, Torben

    2017-01-01

    Mendelian randomization studies employ genotypes as experimental handles to infer the effect of genetically modified exposures (e.g. vitamin D exposure) on disease outcomes (e.g. mortality). The statistical analysis of these studies makes use of the standard instrumental variables framework. Many...... of these studies focus on elderly populations, thereby ignoring the problem of left truncation, which arises due to the selection of study participants being conditional upon surviving up to the time of study onset. Such selection, in general, invalidates the assumptions on which the instrumental variables...... analysis rests. We show that Mendelian randomization studies of adult or elderly populations will therefore, in general, return biased estimates of the exposure effect when the considered genotype affects mortality; in contrast, standard tests of the causal null hypothesis that the exposure does not affect...

  13. High-Tg Polynorbornene-Based Block and Random Copolymers for Butanol Pervaporation Membranes

    Science.gov (United States)

    Register, Richard A.; Kim, Dong-Gyun; Takigawa, Tamami; Kashino, Tomomasa; Burtovyy, Oleksandr; Bell, Andrew

    Vinyl addition polymers of substituted norbornene (NB) monomers possess desirably high glass transition temperatures (Tg); however, until very recently, the lack of an applicable living polymerization chemistry has precluded the synthesis of such polymers with controlled architecture, or copolymers with controlled sequence distribution. We have recently synthesized block and random copolymers of NB monomers bearing hydroxyhexafluoroisopropyl and n-butyl substituents (HFANB and BuNB) via living vinyl addition polymerization with Pd-based catalysts. Both series of polymers were cast into the selective skin layers of thin film composite (TFC) membranes, and these organophilic membranes investigated for the isolation of n-butanol from dilute aqueous solution (model fermentation broth) via pervaporation. The block copolymers show well-defined microphase-separated morphologies, both in bulk and as the selective skin layers on TFC membranes, while the random copolymers are homogeneous. Both block and random vinyl addition copolymers are effective as n-butanol pervaporation membranes, with the block copolymers showing a better flux-selectivity balance. While polyHFANB has much higher permeability and n-butanol selectivity than polyBuNB, incorporating BuNB units into the polymer (in either a block or random sequence) limits the swelling of the polyHFANB and thereby improves the n-butanol pervaporation selectivity.

  14. Birth order and sibship size: evaluation of the role of selection bias in a case-control study of non-Hodgkin's lymphoma.

    Science.gov (United States)

    Mensah, F K; Willett, E V; Simpson, J; Smith, A G; Roman, E

    2007-09-15

    Substantial heterogeneity has been observed among case-control studies investigating associations between non-Hodgkin's lymphoma and familial characteristics, such as birth order and sibship size. The potential role of selection bias in explaining such heterogeneity is considered within this study. Selection bias according to familial characteristics and socioeconomic status is investigated within a United Kingdom-based case-control study of non-Hodgkin's lymphoma diagnosed during 1998-2001. Reported distributions of birth order and maternal age are each compared with expected reference distributions derived using national birth statistics from the United Kingdom. A method is detailed in which yearly data are used to derive expected distributions, taking account of variability in birth statistics over time. Census data are used to reweight both the case and control study populations such that they are comparable with the general population with regard to socioeconomic status. The authors found little support for an association between non-Hodgkin's lymphoma and birth order or family size and little evidence for an influence of selection bias. However, the findings suggest that between-study heterogeneity could be explained by selection biases that influence the demographic characteristics of participants.

  15. Compositions, Random Sums and Continued Random Fractions of Poisson and Fractional Poisson Processes

    Science.gov (United States)

    Orsingher, Enzo; Polito, Federico

    2012-08-01

    In this paper we consider the relation between random sums and compositions of different processes. In particular, for independent Poisson processes N_α(t), N_β(t), t > 0, we have that N_α(N_β(t)) is equal in distribution to Σ_{j=1}^{N_β(t)} X_j, where the X_j are Poisson random variables. We present a series of similar cases, where the outer process is Poisson with different inner processes. We highlight generalisations of these results where the external process is infinitely divisible. A section of the paper concerns compositions of the form N_α(τ_k^ν), ν ∈ (0,1], where τ_k^ν is the inverse of the fractional Poisson process, and we show how these compositions can be represented as random sums. Furthermore we study compositions of the form Θ(N(t)), t > 0, which can be represented as random products. The last section is devoted to studying continued fractions of Cauchy random variables with a Poisson number of levels. We evaluate the exact distribution and derive the scale parameter in terms of ratios of Fibonacci numbers.
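
    The first identity above is easy to check numerically, since N_α evaluated at an integer n is Poisson with mean αn. The sketch below compares the composed process N_α(N_β(t)) with the corresponding random sum of i.i.d. Poisson(α) variables; the rates and time horizon are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(5)
alpha, beta, t, n_sim = 1.5, 2.0, 3.0, 50_000

# Left-hand side: the outer Poisson process evaluated at the inner one, N_alpha(N_beta(t)).
n_inner = rng.poisson(beta * t, size=n_sim)
lhs = rng.poisson(alpha * n_inner)            # N_alpha(n) ~ Poisson(alpha * n)

# Right-hand side: a random sum of N_beta(t) i.i.d. Poisson(alpha) variables.
rhs = np.array([rng.poisson(alpha, size=k).sum() for k in n_inner])

print("mean lhs/rhs:", lhs.mean(), rhs.mean())   # both ~ alpha * beta * t = 9
print("var  lhs/rhs:", lhs.var(), rhs.var())     # both ~ alpha * (1 + alpha) * beta * t
```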

  16. Selective Serotonin Reuptake Inhibitors for Treatment of Selective Mutism

    Directory of Open Access Journals (Sweden)

    Mazlum Çöpür

    2012-03-01

    Full Text Available Some authors suggest that selective mutism should be considered a variant of social phobia or a disorder in the obsessive-compulsive spectrum. Recent studies indicate that pharmacological treatments may be effective in the treatment of selective mutism. In this article, four cases treated with citalopram and escitalopram are presented. The results indicate that the drugs were well tolerated and that the level of social and verbal interaction improved significantly. These findings show that citalopram and escitalopram can be considered in the medication of selective mutism; nevertheless, research with larger case series than the previous ones is needed to confirm these results.

  17. Mutism as the presenting symptom: three case reports and selective review of literature.

    Science.gov (United States)

    Aggarwal, Ashish; Sharma, Dinesh Dutt; Kumar, Ramesh; Sharma, Ravi C

    2010-01-01

    Mutism, defined as an inability or unwillingness to speak, resulting in an absence or marked paucity of verbal output, is a common clinical symptom seen in psychiatric as well as neurology outpatient department. It rarely presents as an isolated disability and often occurs in association with other disturbances in behavior, thought processes, affect, or level of consciousness. It is often a focus of clinical attention, both for the physician and the relatives. Mutism occurs in a number of conditions, both functional and organic, and a proper diagnosis is important for the management. We hereby present three cases, who presented with mutism as the presenting symptom and the differential diagnosis and management issues related to these cases are discussed. The authors also selectively reviewed the literature on mutism, including psychiatric, neurologic, toxic-metabolic, and drug-induced causes.

  18. Selection of controls in case-control studies on maternal medication use and risk of birth defects

    NARCIS (Netherlands)

    Bakker, M.K.; de Walle, H.E.; Dequito, A.; van den Berg, P.B.; de Jong-van den Berg, L.T.

    BACKGROUND: In case-control studies on teratogenic risks of maternal drug use during pregnancy, the use of normal or malformed controls may lead to recall bias or selection bias. This can be avoided by using controls with a genetic disorder. However, researchers are hesitant to use these as

  19. Wishart and anti-Wishart random matrices

    International Nuclear Information System (INIS)

    Janik, Romuald A; Nowak, Maciej A

    2003-01-01

    We provide a compact exact representation for the distribution of the matrix elements of the Wishart-type random matrices A † A, for any finite number of rows and columns of A, without any large N approximations. In particular, we treat the case when the Wishart-type random matrix contains redundant, non-random information, which is a new result. This representation is of interest for a procedure for reconstructing the redundant information hidden in Wishart matrices, with potential applications to numerous models based on biological, social and artificial intelligence networks

  20. Random coil chemical shifts in acidic 8 M urea: Implementation of random coil shift data in NMRView

    International Nuclear Information System (INIS)

    Schwarzinger, Stephan; Kroon, Gerard J.A.; Foss, Ted R.; Wright, Peter E.; Dyson, H. Jane

    2000-01-01

    Studies of proteins unfolded in acid or chemical denaturant can help in unraveling events during the earliest phases of protein folding. In order for meaningful comparisons to be made of residual structure in unfolded states, it is necessary to use random coil chemical shifts that are valid for the experimental system under study. We present a set of random coil chemical shifts obtained for model peptides under experimental conditions used in studies of denatured proteins. This new set, together with previously published data sets, has been incorporated into a software interface for NMRView, allowing selection of the random coil data set that fits the experimental conditions best

  1. Discriminative Projection Selection Based Face Image Hashing

    Science.gov (United States)

    Karabat, Cagatay; Erdogan, Hakan

    Face image hashing is an emerging method used in biometric verification systems. In this paper, we propose a novel face image hashing method based on a new technique called discriminative projection selection. We apply the Fisher criterion for selecting the rows of a random projection matrix in a user-dependent fashion. Moreover, another contribution of this paper is to employ a bimodal Gaussian mixture model at the quantization step. Our simulation results on three different databases demonstrate that the proposed method has superior performance in comparison to previously proposed random projection based methods.
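
    A rough sketch of the row-selection idea described above, under simplifying assumptions: synthetic Gaussian vectors stand in for face feature vectors, the Fisher criterion scores each row of a candidate random projection matrix for a single enrolled user, and the quantization is a plain sign threshold rather than the paper's bimodal Gaussian mixture model.

```python
import numpy as np

rng = np.random.default_rng(6)

d, m, k = 64, 32, 8            # feature dim, candidate projections, projections kept
n_per_class = 40

# Synthetic "face feature" vectors for the enrolled user and for other users.
user = rng.normal(loc=0.5, scale=1.0, size=(n_per_class, d))
others = rng.normal(loc=0.0, scale=1.0, size=(n_per_class, d))

R = rng.normal(size=(m, d))    # candidate random projection matrix

def fisher_score(row):
    """Fisher criterion of one projected feature: between-class over
    within-class variance for the user-vs-others problem."""
    a, b = user @ row, others @ row
    between = (a.mean() - b.mean()) ** 2
    within = a.var() + b.var()
    return between / within

scores = np.array([fisher_score(r) for r in R])
selected = R[np.argsort(scores)[-k:]]          # user-dependent discriminative rows

# Binary hash of a probe vector: sign of the selected projections
# (the paper instead fits a bimodal Gaussian mixture at this quantization step).
probe = rng.normal(loc=0.5, scale=1.0, size=d)
hash_bits = (selected @ probe > 0).astype(int)
print(hash_bits)
```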

  2. The effect of selection on genetic parameter estimates

    African Journals Online (AJOL)

    Unknown

    The South African Journal of Animal Science is available online at ... A simulation study was carried out to investigate the effect of selection on the estimation of genetic ... The model contained a fixed effect, random genetic and random.

  3. A theory for the origin of a self-replicating chemical system. I - Natural selection of the autogen from short, random oligomers

    Science.gov (United States)

    White, D. H.

    1980-01-01

    A general theory is presented for the origin of a self-replicating chemical system, termed an autogen, which is capable of both crude replication and translation (protein synthesis). The theory requires the availability of free energy and monomers to the system, a significant background low-yield synthesis of kinetically stable oligopeptides and oligonucleotides, the localization of the oligomers, crude oligonucleotide selectivity of amino acids during oligopeptide synthesis, crude oligonucleotide replication, and two short peptide families which catalyze replication and translation, to produce a localized group of at least one copy each of two protogenes and two protoenzymes. The model posits a process of random oligomerization, followed by the random nucleation of functional components and the rapid autocatalytic growth of the functioning autogen to macroscopic amounts, to account for the origin of the first self-replicating system. Such a process contains steps of such high probability and short time periods that it is suggested that the emergence of an autogen in a laboratory experiment of reasonable time scale may be possible.

  4. Random walk through fractal environments

    OpenAIRE

    Isliker, H.; Vlahos, L.

    2002-01-01

    We analyze random walk through fractal environments, embedded in 3-dimensional, permeable space. Particles travel freely and are scattered off into random directions when they hit the fractal. The statistical distribution of the flight increments (i.e. of the displacements between two consecutive hittings) is analytically derived from a common, practical definition of fractal dimension, and it turns out to approximate quite well a power-law in the case where the dimension D of the fractal is ...

  5. Penicillin at the late stage of leptospirosis: a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Costa Everaldo

    2003-01-01

    Full Text Available There is evidence that an early start of penicillin reduces the case-fatality rate of leptospirosis and that chemoprophylaxis is efficacious in persons exposed to the sources of leptospira. The existing data, however, are inconsistent regarding the benefit of introducing penicillin at a late stage of leptospirosis. The present study was developed to assess whether the introduction of penicillin after more than four days of symptoms reduces the in-hospital case-fatality rate of leptospirosis. A total of 253 patients aged 15 to 76 years with advanced leptospirosis, i.e., more than four days of symptoms, admitted to an infectious disease hospital located in Salvador, Brazil, were selected for the study. The patients were randomized to one of two treatment groups: with intravenous penicillin, 6 million units per day (one million units every four hours) for seven days (n = 125), and without penicillin (n = 128). The main outcome was death during hospitalization. The case-fatality rate was approximately twice as high in the group treated with penicillin (12%; 15/125) as in the comparison group (6.3%; 8/128). This difference pointed in the opposite direction of the study hypothesis, but was not statistically significant (p = 0.112). Length of hospital stay was similar between the treatment groups. According to the results of the present randomized clinical trial, initiation of penicillin in patients with severe forms of leptospirosis after at least four days of symptomatic leptospirosis is not beneficial. Therefore, more attention should be directed to prevention and earlier initiation of the treatment of leptospirosis.

  6. Random walk and the heat equation

    CERN Document Server

    Lawler, Gregory F

    2010-01-01

    The heat equation can be derived by averaging over a very large number of particles. Traditionally, the resulting PDE is studied as a deterministic equation, an approach that has brought many significant results and a deep understanding of the equation and its solutions. By studying the heat equation by considering the individual random particles, however, one gains further intuition into the problem. While this is now standard for many researchers, this approach is generally not presented at the undergraduate level. In this book, Lawler introduces the heat equation and the closely related notion of harmonic functions from a probabilistic perspective. The theme of the first two chapters of the book is the relationship between random walks and the heat equation. The first chapter discusses the discrete case, random walk and the heat equation on the integer lattice; and the second chapter discusses the continuous case, Brownian motion and the usual heat equation. Relationships are shown between the two. For exa...
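
    A small numerical illustration of the connection described above, under the standard scaling assumptions: after many steps, the position histogram of simple random walkers on the integer lattice matches the Gaussian heat kernel, i.e. the fundamental solution of u_t = (1/2) u_xx.

```python
import numpy as np

rng = np.random.default_rng(7)

# Many independent simple random walks on the integer lattice Z.
n_walkers, n_steps = 100_000, 400
steps = rng.choice([-1, 1], size=(n_walkers, n_steps))
final = steps.sum(axis=1)

# After n steps the position is approximately Gaussian with variance n,
# i.e. the heat kernel at time t = n (the factor 2 accounts for the parity
# constraint: only positions with the parity of n_steps are reachable).
xs = np.arange(-80, 81, 2)
empirical = np.array([(final == x).mean() for x in xs])
heat_kernel = 2 * np.exp(-xs**2 / (2 * n_steps)) / np.sqrt(2 * np.pi * n_steps)
print("max discrepancy:", np.max(np.abs(empirical - heat_kernel)))
```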

  7. Continuing professional education and the selection of candidates: the case for a tripartite model.

    Science.gov (United States)

    Ellis, L B

    2000-02-01

    This paper argues the case for a tripartite model involving the manager, educator and practitioner in the selection of candidates for programmes of continuing professional education (CPE). Nurse educators are said to be a key link in the education-practice chain (Pendleton & Myles 1991), yet with the introduction of a market philosophy for education, the educator appears to have little, if any, influence over the selection of CPE candidates. Empirical studies on the value of an effective system for identifying the educational needs of the individual and the locality are unequivocal in specifying the benefits of a collaborative selection process (Larcombe & Maggs 1991). However, there are few studies that offer a model of collaboration and fewer still on how to operationalize such a model. This paper presents the policy and legislative context of CPE leading to the development of a market philosophy. The tension between educational reforms such as life-long learning and diminishing, finite resources is highlighted. These strategic issues provide the backdrop and rationale for considering the process for identifying CPE needs, and the characteristics of an effective system as suggested in the literature. Finally, this paper outlines recommendations for a partnership between the manager, practitioner and educationalist in the selection of CPE candidates.

  8. Nitrates and bone turnover (NABT) - trial to select the best nitrate preparation: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Bucur, Roxana C; Reid, Lauren S; Hamilton, Celeste J; Cummings, Steven R; Jamal, Sophie A

    2013-09-08

    comparisons with the best' approach for data analyses, as this strategy allows practical considerations of ease of use and tolerability to guide selection of the preparation for future studies. Data from this protocol will be used to develop a randomized, controlled trial of nitrates to prevent osteoporotic fractures. ClinicalTrials.gov Identifier: NCT01387672. Controlled-Trials.com: ISRCTN08860742.

  9. [Silvicultural treatments and their selection effects].

    Science.gov (United States)

    Vincent, G

    1973-01-01

    Selection can be defined in terms of its observable consequences as the non-random differential reproduction of genotypes (Lerner 1958). In forest stands, during improvement fellings and reproduction treatments we select the individuals surpassing others in growth or in the production of first-class timber. However, silvicultural treatments applied in forest stands guarantee a permanent increase of forest production only if they are carried out according to the principles of directional (dynamic) selection. These principles require that the trees retained for further growth and for forest regeneration are selected by their hereditary properties, i.e. by their genotypes. To make this selection feasible, our study deals with the genetic parameters and gives some examples of the application of the response, the selection differential, the heritability in the narrow and in the broad sense, as well as the genetic and genotypic gain. On the strength of these parameters it is possible to estimate the economic success of several silvicultural treatments in forest stands. The examples demonstrate that selection measures of higher intensity are manifested in a higher selection differential and a higher genetic and genotypic gain, and that such measures show more distinct effects in variable populations - in natural forest - than in populations characterized by smaller variability, e.g. in many uniform artificially established stands. The examples of the influence of different selection on the genotype composition of populations show that genetics teaches us to differentiate the genotypes of the same species and at the same time provides new criteria for evaluating selection treatments. From the economic point of view, these criteria should be considered in silviculture, if only because they allow the genetic composition of forest stands to be judged.

  10. Performance of Universal Adhesive in Primary Molars After Selective Removal of Carious Tissue: An 18-Month Randomized Clinical Trial.

    Science.gov (United States)

    Lenzi, Tathiane Larissa; Pires, Carine Weber; Soares, Fabio Zovico Maxnuck; Raggio, Daniela Prócida; Ardenghi, Thiago Machado; de Oliveira Rocha, Rachel

    2017-09-15

    To evaluate the 18-month clinical performance of a universal adhesive, applied under different adhesion strategies, after selective carious tissue removal in primary molars. Forty-four subjects (five to 10 years old) contributed 90 primary molars presenting moderately deep dentin carious lesions on occlusal or occluso-proximal surfaces, which were randomly assigned to either the self-etch or the etch-and-rinse protocol of Scotchbond Universal Adhesive (3M ESPE). Resin composite was incrementally inserted for all restorations. Restorations were evaluated at one, six, 12, and 18 months using the modified United States Public Health Service criteria. Survival estimates for the restorations' longevity were obtained using the Kaplan-Meier method, and multivariate Cox regression analysis with shared frailty was used to assess the factors associated with failures. Adhesion strategy did not influence the restorations' longevity (P=0.06; 72.2 percent and 89.7 percent survival with etch-and-rinse and self-etch modes, respectively). Self-etch and etch-and-rinse strategies did not influence the clinical behavior of the universal adhesive used in primary molars after selective carious tissue removal, although there was a tendency for a better outcome with the self-etch strategy.

  11. Uncertain programming models for portfolio selection with uncertain returns

    Science.gov (United States)

    Zhang, Bo; Peng, Jin; Li, Shengguo

    2015-10-01

    In an indeterminacy economic environment, experts' knowledge about the returns of securities consists of much uncertainty instead of randomness. This paper discusses portfolio selection problem in uncertain environment in which security returns cannot be well reflected by historical data, but can be evaluated by the experts. In the paper, returns of securities are assumed to be given by uncertain variables. According to various decision criteria, the portfolio selection problem in uncertain environment is formulated as expected-variance-chance model and chance-expected-variance model by using the uncertainty programming. Within the framework of uncertainty theory, for the convenience of solving the models, some crisp equivalents are discussed under different conditions. In addition, a hybrid intelligent algorithm is designed in the paper to provide a general method for solving the new models in general cases. At last, two numerical examples are provided to show the performance and applications of the models and algorithm.

  12. Age replacement policy based on imperfect repair with random probability

    International Nuclear Information System (INIS)

    Lim, J.H.; Qu, Jian; Zuo, Ming J.

    2016-01-01

    In most of the literature on age replacement policy, failures before the planned replacement age can be either minimally repaired or perfectly repaired based on the type of failure, the cost of repairs and so on. In this paper, we propose an age replacement policy based on imperfect repair with random probability. The proposed policy incorporates the case that such an intermittent failure can be either minimally repaired or perfectly repaired with random probabilities. The mathematical formulas of the expected cost rate per unit time are derived for both the infinite-horizon case and the one-replacement-cycle case. For each case, we show that the optimal replacement age exists and is finite. - Highlights: • We propose a new age replacement policy with random probability of perfect repair. • We develop the expected cost per unit time. • We discuss the optimal age for replacement minimizing the expected cost rate.
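
    As background for the cost criterion used above, the sketch below evaluates the classical age replacement expected cost rate per unit time, C(T) = [c_f F(T) + c_p (1 - F(T))] / ∫_0^T (1 - F(t)) dt, for a Weibull lifetime and finds the optimal replacement age numerically. This is the standard model without the paper's random-probability imperfect repair; the costs and lifetime parameters are illustrative assumptions.

```python
import numpy as np

# Classical age replacement: replace at failure (cost cf) or at age T (cost cp),
# whichever comes first.  Weibull lifetime with shape k > 1 (wear-out) and scale lam.
cf, cp = 50.0, 10.0
k, lam = 2.0, 100.0

def cost_rate(T, n_grid=2000):
    t = np.linspace(0.0, T, n_grid)
    R = np.exp(-(t / lam) ** k)                  # survival function 1 - F(t)
    expected_cycle_cost = cf * (1 - R[-1]) + cp * R[-1]
    expected_cycle_length = np.trapz(R, t)
    return expected_cycle_cost / expected_cycle_length

Ts = np.linspace(10, 300, 200)
rates = np.array([cost_rate(T) for T in Ts])
T_opt = Ts[rates.argmin()]
print(f"optimal replacement age ~ {T_opt:.0f}, minimum cost rate {rates.min():.3f}")
```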

  13. Analysis of swaps in Radix selection

    DEFF Research Database (Denmark)

    Elmasry, Amr Ahmed Abd Elmoneim; Mahmoud, Hosam

    2011-01-01

    Radix Sort is a sorting algorithm based on analyzing digital data. We study the number of swaps made by Radix Select (a one-sided version of Radix Sort) to find an element with a randomly selected rank. This kind of grand average provides a smoothing over all individual distributions for specific...
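
    A hedged sketch of a one-sided binary (MSD) radix selection that counts swaps, in the spirit of the algorithm analysed above; the exact algorithmic details of the paper may differ, and the input here is a synthetic array with a randomly selected rank.

```python
import random

def radix_select(a, k, bit=31):
    """Return the k-th smallest (0-based) of the non-negative ints in `a`, using
    one-sided MSD binary radix partitioning, plus the number of swaps performed.
    `bit` must be at least the highest set bit position of max(a)."""
    swaps = 0
    lo, hi = 0, len(a) - 1
    while bit >= 0 and lo < hi:
        i, j = lo, hi
        while i <= j:                              # partition window on current bit
            while i <= j and not (a[i] >> bit) & 1:
                i += 1
            while i <= j and (a[j] >> bit) & 1:
                j -= 1
            if i < j:
                a[i], a[j] = a[j], a[i]            # one swap
                swaps += 1
                i, j = i + 1, j - 1
        if k < i:                                  # target has a 0 in this bit ...
            hi = i - 1
        else:                                      # ... or a 1
            lo = i
        bit -= 1
    return a[lo], swaps

random.seed(0)
data = [random.getrandbits(16) for _ in range(10_000)]
rank = random.randrange(len(data))                 # element with a randomly selected rank
value, swaps = radix_select(data[:], rank, bit=15)
assert value == sorted(data)[rank]
print(f"rank {rank}: value {value}, swaps {swaps}")
```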

  14. Random walks in the quarter-plane: invariant measures and performance bounds

    NARCIS (Netherlands)

    Chen, Y.

    2015-01-01

    This monograph focuses on random walks in the quarter-plane. Such random walks are frequently used to model queueing systems and the invariant measure of a random walk is of major importance in studying the performance of these systems. In special cases the invariant measure of a random walk can be

  15. Integral Histogram with Random Projection for Pedestrian Detection.

    Directory of Open Access Journals (Sweden)

    Chang-Hua Liu

    Full Text Available In this paper, we present a systematic study reporting several insights into HOG, one of the most widely used features in modern computer vision and image processing applications. We first show that its gradient magnitudes can be randomly projected with a random matrix. To handle over-fitting, an integral histogram based on the differences of randomly selected blocks is proposed. The experiments show that both the random projection and the integral histogram clearly outperform the HOG feature. Finally, the two ideas are combined into a new descriptor termed IHRP, which outperforms the HOG feature with fewer dimensions and higher speed.

  16. Unwilling or Unable to Cheat? Evidence from a Randomized Tax Audit Experiment in Denmark

    OpenAIRE

    Henrik J. Kleven; Martin B. Knudsen; Claus T. Kreiner; Søren Pedersen; Emmanuel Saez

    2010-01-01

    This paper analyzes a randomized tax enforcement experiment in Denmark. In the base year, a stratified and representative sample of over 40,000 individual income tax filers was selected for the experiment. Half of the tax filers were randomly selected to be thoroughly audited, while the rest were deliberately not audited. The following year, "threat-of-audit" letters were randomly assigned and sent to tax filers in both groups. Using comprehensive administrative tax data, we present four main...

  17. Multi-criteria decision making on selection of solar–wind hybrid power station location: A case of China

    International Nuclear Information System (INIS)

    Yunna, Wu; Geng, Shuai

    2014-01-01

    Highlights: • We summarize the evaluation attributes from the perspective of project management. • The duties of the roles in the decision process are defined. • The decision framework can provide various rankings of alternatives. • A Chinese solar–wind hybrid power station location selection case is studied. - Abstract: Site selection plays an important role in the entire life cycle of a solar–wind hybrid power station (SWHPS) project and is worth further study. There are problems in the existing research: first, SWHPS site evaluation results are difficult for project managers to understand because few evaluations of SWHPS sites take the perspective of project management. Second, the independence of experts is difficult to protect since the duties of the roles in the evaluation process are undefined. Third, project managers cannot consider the alternatives thoroughly because the evaluation result is a single ranking. The contributions of this paper are therefore as follows: first, the evaluation attributes of SWHPS site selection are summarized from the perspective of project management; second, the duties of the roles in the decision process are defined; third, following the principle of practicality, a decision framework for SWHPS site selection is built on the analytic hierarchy process method; the merits of this framework are that it can provide various rankings of alternatives and is easy to use. Finally, a case study from China demonstrates the effectiveness of the decision framework.
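
    For context on the analytic hierarchy process step mentioned above, the sketch below computes an AHP priority vector as the normalized principal eigenvector of a pairwise comparison matrix, together with Saaty's consistency ratio. The comparison values are invented for illustration and are not the paper's data.

```python
import numpy as np

# Illustrative pairwise comparison matrix for three candidate sites
# (Saaty's 1-9 scale; the numbers are assumptions, not taken from the study).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Priority vector = normalized principal eigenvector of the comparison matrix.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Saaty consistency ratio (random index 0.58 for a 3x3 matrix).
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
cr = ci / 0.58
print("priorities:", np.round(w, 3), " consistency ratio:", round(cr, 3))
```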

  18. On quantifying uncertainty for project selection: the case of renewable energy sources' investment

    International Nuclear Information System (INIS)

    Kirytopoulos, Konstantinos; Rentizelas, Athanassios; Tziralis, Georgios

    2006-01-01

    The selection of a project among different alternatives, considering the limited resources of a company (organisation), is an added-value process that determines the prosperity of the undertaken project (investment). This also applies to the booming renewable energy sector, especially under the circumstances established by the recent activation of the Kyoto Protocol and by the plethora of available choices for renewable energy sources (RES) projects. The need for a reliable project selection method among the various alternatives is therefore highlighted and, in this context, the paper proposes the NPV function as one possible criterion for the selection of a RES project. Furthermore, it departs from the typical NPV calculation by adding the concept of a probabilistic NPV approach through Monte Carlo simulation. Reality is non-deterministic, so any attempt to model it with a deterministic approach is by definition erroneous. The paper ultimately proposes substituting the point estimate with a range estimate, capable of quantifying the various uncertainty factors and in this way elucidating the likelihood of accomplishing the eligible scenarios. The paper is enhanced by a case study showing how the proposed method can be practically applied to support the investment decision, thus enabling the decision makers to judge its effectiveness and usefulness. (Author)
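
    A minimal sketch of a probabilistic (Monte Carlo) NPV of the kind proposed above: the investment cost, yearly net cash flows and discount rate are drawn from assumed distributions, and the NPV distribution, rather than a single point estimate, is reported. All figures are illustrative assumptions, not taken from the paper's case study.

```python
import numpy as np

rng = np.random.default_rng(8)
n_sim = 100_000

# Assumed input distributions for a small RES project.
invest = rng.normal(1_000_000, 50_000, n_sim)          # initial investment
cash = rng.normal(150_000, 30_000, (n_sim, 15))        # net cash flow per year, 15 years
rate = rng.uniform(0.05, 0.09, n_sim)                  # discount rate

years = np.arange(1, 16)
discount = (1 + rate[:, None]) ** -years
npv = (cash * discount).sum(axis=1) - invest           # one NPV per simulated scenario

print(f"mean NPV: {npv.mean():,.0f}")
print(f"P(NPV > 0): {(npv > 0).mean():.2%}")
print(f"5th-95th percentile range: {np.percentile(npv, [5, 95]).round(0)}")
```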

  19. Case Report: Evaluation strategies and cognitive intervention: the case of a monovular twin child affected by selective mutism [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Micaela Capobianco

    2018-02-01

    Full Text Available The present work describes the assessment process, evaluation strategies, and cognitive intervention for a 9-year-old child with selective mutism (SM), a monovular twin of a child also affected by mutism. Currently, cognitive behavioral multimodal treatment seems the most effective therapeutic approach for children diagnosed with selective mutism (Capobianco & Cerniglia, 2018). The illustrated case confirms the role of biological factors involved in mutism but also highlights the importance of environmental influences in the maintenance of the disorder with respect to relational and contextual dynamics (e.g. complicity between the sisters, family relationships). The article further discusses the importance of an early diagnosis as a predictor of positive treatment outcomes.

  20. Collocation methods for uncertainty quantification in PDE models with random data

    KAUST Repository

    Nobile, Fabio

    2014-01-06

    In this talk we consider Partial Differential Equations (PDEs) whose input data are modeled as random fields to account for their intrinsic variability or our lack of knowledge. After parametrizing the input random fields by finitely many independent random variables, we exploit the high regularity of the solution of the PDE as a function of the input random variables and consider sparse polynomial approximations in probability (Polynomial Chaos expansion) by collocation methods. We first address interpolatory approximations where the PDE is solved on a sparse grid of Gauss points in the probability space and the solutions thus obtained interpolated by multivariate polynomials. We present recent results on optimized sparse grids in which the selection of points is based on a knapsack approach and relies on sharp estimates of the decay of the coefficients of the polynomial chaos expansion of the solution. Secondly, we consider regression approaches where the PDE is evaluated on randomly chosen points in the probability space and a polynomial approximation constructed by the least square method. We present recent theoretical results on the stability and optimality of the approximation under suitable conditions between the number of sampling points and the dimension of the polynomial space. In particular, we show that for uniform random variables, the number of sampling point has to scale quadratically with the dimension of the polynomial space to maintain the stability and optimality of the approximation. Numerical results show that such condition is sharp in the monovariate case but seems to be over-constraining in higher dimensions. The regression technique seems therefore to be attractive in higher dimensions.
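
    The least-squares (regression) variant described above can be illustrated in one random dimension: sample the uniform input, evaluate a stand-in for the PDE solution, and fit a Legendre polynomial chaos surrogate by least squares, oversampling roughly quadratically in the basis size as the stability condition suggests. The model function and degrees are assumptions for illustration, not a PDE solver.

```python
import numpy as np

rng = np.random.default_rng(9)

# Stand-in for a PDE solution evaluated at a sampled value of the random input
# Y ~ U(-1, 1); here a smooth analytic function replaces the PDE solve.
def solve_pde(y):
    return np.exp(0.8 * y) / (1.2 + y)

p = 8                                   # polynomial degree (basis size p + 1)
n_samples = 4 * (p + 1) ** 2            # oversample ~ quadratically in the basis size
y = rng.uniform(-1.0, 1.0, n_samples)   # random sampling points in probability space
u = solve_pde(y)

# Least-squares polynomial chaos regression in the Legendre basis.
V = np.polynomial.legendre.legvander(y, p)
coef, *_ = np.linalg.lstsq(V, u, rcond=None)

# Check the surrogate against fresh samples of the random input.
y_test = rng.uniform(-1.0, 1.0, 1000)
err = np.abs(np.polynomial.legendre.legval(y_test, coef) - solve_pde(y_test))
print("max surrogate error:", err.max())
```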

  1. Natural Aphrodisiacs-A Review of Selected Sexual Enhancers.

    Science.gov (United States)

    West, Elizabeth; Krychman, Michael

    2015-10-01

    The Food and Drug Administration defines an aphrodisiac drug product as "any product that bears labeling claims that it will arouse or increase sexual desire, or that it will improve sexual performance." Presently, there are no approved medications for the treatment of lowered desire for women, and many opt for "natural" products. The aim of this article was to review the most popular and currently used aphrodisiac products marketed in the United States. The safety and efficacy of animal- and plant-based aphrodisiacs, vitamins and minerals, and popular over-the-counter combination supplements have been reviewed. An English PubMed literature search was performed using the key words "sexuality," "sex," "aphrodisiac," and "sexual enhancer." Approximately 50 articles were reviewed by the authors. The authors used relevant case series, case-controlled, and randomized clinical trial data. Products were evaluated based on the quality of research, and their known efficacy and safety considerations. Products with low risk and potential benefit for sexual response based on prior research studies were highlighted. Research has demonstrated that the risks of yohimbine, Spanish fly, mad honey, and Bufo toad may outweigh any benefit, and these products should be avoided. Other products, such as Maca, Tribulus, Ginkgo, and ginseng, have limited but emerging data. Randomized clinical trial data are often lacking, but future research should be performed to further elucidate the efficacy and safety of these products. Future randomized clinical trials are warranted before health care practitioners can recommend most aphrodisiac products. There remain some medical concerns with drug interactions, purity, reliability, and safety. West E and Krychman M. Natural aphrodisiacs - A review of selected sexual enhancers. Copyright © 2015 International Society for Sexual Medicine. Published by Elsevier Inc. All rights reserved.

  2. Exponential Inequalities for Positively Associated Random Variables and Applications

    Directory of Open Access Journals (Sweden)

    Yang Shanchao

    2008-01-01

    Full Text Available Abstract We establish some exponential inequalities for positively associated random variables without the boundedness assumption. These inequalities improve the corresponding results obtained by Oliveira (2005). By one of the inequalities, we obtain the convergence rate for the case of geometrically decreasing covariances, which is close to the optimal achievable convergence rate for independent random variables under the Hartman-Wintner law of the iterated logarithm and improves the convergence rate derived by Oliveira (2005) for the above case.

  3. Outage probability of dual-hop partial relay selection with feedback delay in the presence of interference

    KAUST Repository

    Al-Qahtani, Fawaz S.

    2011-09-01

    In this paper, we investigate the outage performance of dual-hop relaying systems with partial relay selection and feedback delay. The analysis considers the case of Rayleigh fading channels when the relaying station as well as the destination undergo mutually independent interfering signals. In particular, we derive the cumulative distribution function (c.d.f.) of a new type of random variable involving a sum of multiple independent exponential random variables, based on which we present closed-form expressions for the exact outage probability of fixed amplify-and-forward (AF) and decode-and-forward (DF) relaying protocols. Numerical results are provided to illustrate the joint effect of the delayed feedback and co-channel interference on the outage probability. © 2011 IEEE.
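
    The building block mentioned above — the c.d.f. of a sum of independent exponential random variables — has a standard closed form when the rates are distinct (the hypoexponential distribution), which the sketch below checks against Monte Carlo sampling. The rates are arbitrary illustrative values; the paper's actual random variable and outage expressions are more involved.

```python
import numpy as np

rng = np.random.default_rng(10)
rates = np.array([0.5, 1.0, 2.5])       # three distinct rates (illustrative values)

def cdf_sum_exp(x, lam):
    """CDF of a sum of independent exponentials with distinct rates
    (hypoexponential distribution), via the standard partial-fraction form."""
    total = 0.0
    for i, li in enumerate(lam):
        coef = np.prod([lj / (lj - li) for j, lj in enumerate(lam) if j != i])
        total += coef * np.exp(-li * x)
    return 1.0 - total

# Monte Carlo check of the closed form.
samples = sum(rng.exponential(1.0 / l, size=200_000) for l in rates)
for x in (1.0, 2.0, 5.0):
    print(x, round(cdf_sum_exp(x, rates), 4), round((samples <= x).mean(), 4))
```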

  4. Bayesian dose selection design for a binary outcome using restricted response adaptive randomization.

    Science.gov (United States)

    Meinzer, Caitlyn; Martin, Renee; Suarez, Jose I

    2017-09-08

    In phase II trials, the most efficacious dose is usually not known. Moreover, given limited resources, it is difficult to robustly identify a dose while also testing for a signal of efficacy that would support a phase III trial. Recent designs have sought to be more efficient by exploring multiple doses through the use of adaptive strategies. However, the added flexibility may potentially increase the risk of making incorrect assumptions and reduce the total amount of information available across the dose range as a function of imbalanced sample size. To balance these challenges, a novel placebo-controlled design is presented in which a restricted Bayesian response adaptive randomization (RAR) is used to allocate a majority of subjects to the optimal dose of active drug, defined as the dose with the lowest probability of poor outcome. However, the allocation between subjects who receive active drug or placebo is held constant to retain the maximum possible power for a hypothesis test of overall efficacy comparing the optimal dose to placebo. The design properties and optimization of the design are presented in the context of a phase II trial for subarachnoid hemorrhage. For a fixed total sample size, a trade-off exists between the ability to select the optimal dose and the probability of rejecting the null hypothesis. This relationship is modified by the allocation ratio between active and control subjects, the choice of RAR algorithm, and the number of subjects allocated to an initial fixed allocation period. While a responsive RAR algorithm improves the ability to select the correct dose, there is an increased risk of assigning more subjects to a worse arm as a function of ephemeral trends in the data. A subarachnoid treatment trial is used to illustrate how this design can be customized for specific objectives and available data. Bayesian adaptive designs are a flexible approach to addressing multiple questions surrounding the optimal dose for treatment efficacy
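
    A toy version of a restricted response-adaptive allocation in this spirit, using independent Beta-binomial posteriors and Thompson-style sampling of the probability that each dose is best (lowest probability of poor outcome): the placebo share is held fixed while the active-drug share is reallocated adaptively after a fixed burn-in. The outcome rates, sample sizes and restriction details are assumptions, not those of the subarachnoid hemorrhage trial.

```python
import numpy as np

rng = np.random.default_rng(11)

true_poor = np.array([0.45, 0.35, 0.30])     # P(poor outcome) per active dose (assumed)
placebo_share = 0.25                          # fixed allocation to placebo (the "restriction")
burn_in, total_n, n_draws = 60, 400, 4000

events = np.zeros(3)                          # poor outcomes per active dose
counts = np.zeros(3)                          # subjects per active dose
placebo_n = 0
for subj in range(total_n):
    if rng.random() < placebo_share:
        placebo_n += 1                        # fixed placebo share preserves power vs. placebo
        continue
    if subj < burn_in:
        arm = subj % 3                        # initial fixed (equal) allocation
    else:
        # P(each dose is best) under independent Beta(1,1) + binomial posteriors
        post = rng.beta(1 + events[:, None], 1 + counts[:, None] - events[:, None],
                        size=(3, n_draws))
        p_best = np.bincount(post.argmin(axis=0), minlength=3) / n_draws
        arm = rng.choice(3, p=p_best)         # response-adaptive allocation of active drug
    counts[arm] += 1
    events[arm] += rng.random() < true_poor[arm]

print("placebo subjects:", placebo_n)
print("subjects per dose:", counts, " observed poor-outcome rates:", np.round(events / counts, 2))
```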

  5. Distributional and efficiency results for subset selection

    NARCIS (Netherlands)

    Laan, van der P.

    1996-01-01

    Assume k (k ≥ 2) populations are given. The associated independent random variables have continuous distribution functions with an unknown location parameter. The statistical selection goal is to select a non-empty subset which contains the best population, that is the population with

  6. An evidence-based approach to case management model selection for an acute care facility: is there really a preferred model?

    Science.gov (United States)

    Terra, Sandra M

    2007-01-01

    This research seeks to determine whether there is adequate evidence-based justification for selection of one acute care case management model over another. Acute Inpatient Hospital. This article presents a systematic review of published case management literature, resulting in classification specific to terms of level of evidence. This review examines the best available evidence in an effort to select an acute care case management model. Although no single case management model can be identified as preferred, it is clear that adequate evidence-based literature exists to acknowledge key factors driving the acute care model and to form a foundation for the efficacy of hospital case management practice. Although no single case management model can be identified as preferred, this systematic review demonstrates that adequate evidence-based literature exists to acknowledge key factors driving the acute care model and forming a foundation for the efficacy of hospital case management practice. Distinctive aspects of case management frameworks can be used to guide the development of an acute care case management model. The study illustrates: * The effectiveness of case management when there is direct patient contact by the case manager regardless of disease condition: not only does the quality of care increase but also length of stay (LOS) decreases, care is defragmented, and both patient and physician satisfaction can increase. * The preferred case management models result in measurable outcomes that can directly relate to, and demonstrate alignment with, organizational strategy. * Acute care management programs reduce cost and LOS, and improve outcomes. * An integrated case management program that includes social workers, as well as nursing, is the most effective acute care management model. * The successful case management model will recognize physicians, as well as patients, as valued customers with whom partnership can positively affect financial outcomes in terms of

  7. Sequence-Based Prediction of RNA-Binding Proteins Using Random Forest with Minimum Redundancy Maximum Relevance Feature Selection

    Directory of Open Access Journals (Sweden)

    Xin Ma

    2015-01-01

    Full Text Available The prediction of RNA-binding proteins is one of the most challenging problems in computational biology. Although some studies have investigated this problem, the accuracy of prediction is still not sufficient. In this study, a highly accurate method was developed to predict RNA-binding proteins from amino acid sequences using random forests with the minimum redundancy maximum relevance (mRMR) method, followed by incremental feature selection (IFS). We incorporated conjoint triad features and three novel features: binding propensity (BP), nonbinding propensity (NBP), and evolutionary information combined with physicochemical properties (EIPP). The results showed that these novel features have important roles in improving the performance of the predictor. Using the mRMR-IFS method, our predictor achieved the best performance (86.62% accuracy and 0.737 Matthews correlation coefficient). The high prediction accuracy and successful prediction performance suggest that our method can be a useful approach to identify RNA-binding proteins from sequence information.
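
    A rough sketch of the incremental feature selection (IFS) loop described above, with the mRMR ranking simplified to a pure relevance ranking by mutual information (the redundancy term is omitted) and a random forest as the classifier; the data are synthetic stand-ins for sequence-derived features.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for sequence-derived protein features.
X, y = make_classification(n_samples=600, n_features=40, n_informative=8,
                           random_state=0)

# Relevance ranking (a simplification of mRMR: redundancy between features is ignored).
relevance = mutual_info_classif(X, y, random_state=0)
order = np.argsort(relevance)[::-1]

# Incremental feature selection: grow the feature set along the ranking and
# keep the subset with the best cross-validated accuracy.
best_k, best_score = 0, 0.0
for k in range(1, len(order) + 1, 2):
    score = cross_val_score(RandomForestClassifier(n_estimators=100, random_state=0),
                            X[:, order[:k]], y, cv=5).mean()
    if score > best_score:
        best_k, best_score = k, score
print(f"best subset size: {best_k}, CV accuracy: {best_score:.3f}")
```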

  8. Program pseudo-random number generator for microcomputers

    International Nuclear Information System (INIS)

    Ososkov, G.A.

    1980-01-01

    Program pseudo-random number generators (PNG) intended for testing control equipment and communication channels are considered. In the case of 8-bit microcomputers it is necessary to assign 4 words of storage to hold one random number. The proposed economical algorithms for random number generation are based on the idea of "mixing" the quarters of the preceding random number to obtain the next one. Test results are presented for two such generators. A FORTRAN variant of the PNG is given along with a program realizing the PNG written in INTEL-8080 autocode

  9. Selective preservation of the beat in apperceptive music agnosia: a case study.

    Science.gov (United States)

    Baird, Amee D; Walker, David G; Biggs, Vivien; Robinson, Gail A

    2014-04-01

    Music perception involves processing of melodic, temporal and emotional dimensions that have been found to dissociate in healthy individuals and after brain injury. Two components of the temporal dimension have been distinguished, namely rhythm and metre. We describe an 18 year old male musician 'JM' who showed apperceptive music agnosia with selectively preserved metre perception, and impaired recognition of sad and peaceful music relative to age and music experience matched controls after resection of a right temporoparietal tumour. Two months post-surgery JM underwent a comprehensive neuropsychological evaluation including assessment of his music perception abilities using the Montreal Battery for Evaluation of Amusia (MBEA, Peretz, Champod, & Hyde, 2003). He also completed several experimental tasks to explore his ability to recognise famous songs and melodies, emotions portrayed by music and a broader range of environmental sounds. Five age-, gender-, education- and musical experienced-matched controls were administered the same experimental tasks. JM showed selective preservation of metre perception, with impaired performances compared to controls and scoring below the 5% cut-off on all MBEA subtests, except for the metric condition. He could identify his favourite songs and environmental sounds. He showed impaired recognition of sad and peaceful emotions portrayed in music relative to controls but intact ability to identify happy and scary music. This case study contributes to the scarce literature documenting a dissociation between rhythmic and metric processing, and the rare observation of selectively preserved metric interpretation in the context of apperceptive music agnosia. It supports the notion that the anterior portion of the superior temporal gyrus (STG) plays a role in metric processing and provides the novel observation that selectively preserved metre is sufficient to identify happy and scary, but not sad or peaceful emotions portrayed in music

  10. [A case with apraxia of tool use: selective inability to form a hand posture for a tool].

    Science.gov (United States)

    Hayakawa, Yuko; Fujii, Toshikatsu; Yamadori, Atsushi; Meguro, Kenichi; Suzuki, Kyoko

    2015-03-01

    Impaired tool use is recognized as a symptom of ideational apraxia. While many studies have focused on difficulties in producing gestures as a whole, using tools involves several steps; these include forming hand postures appropriate for the use of a certain tool, selecting objects or body parts to act on, and producing gestures. In previously reported cases, both producing and recognizing hand postures were impaired. Here we report the first case showing a selective impairment of forming hand postures appropriate for tools with preserved recognition of the required hand postures. A 24-year-old, right-handed man was admitted to hospital because of sensory impairment of the right side of the body, mild aphasia, and impaired tool use due to left parietal subcortical hemorrhage. His ability to make symbolic gestures, copy finger postures, and orient his hand to pass a slit was well preserved. Semantic knowledge for tools and hand postures was also intact. He could flawlessly select the correct hand postures in recognition tasks. He only demonstrated difficulties in forming a hand posture appropriate for a tool. Once he properly grasped a tool by trial and error, he could use it without hesitation. These observations suggest that each step of tool use should be thoroughly examined in patients with ideational apraxia.

  11. Pseudo-random data acquisition geometry in 3D seismic survey; Sanjigen jishin tansa ni okeru giji random data shutoku reiauto ni tsuite

    Energy Technology Data Exchange (ETDEWEB)

    Minegishi, M; Tsuburaya, Y [Japan National Oil Corp., Tokyo (Japan). Technology Research Center

    1996-10-01

    The influence of pseudo-random geometry on imaging in 3D seismic exploration data acquisition has been investigated using a simple model, by comparison with a regular geometry. When constituting the wave front by the interference of elemental waves, pseudo-random geometry data did not always provide good results. In the case of a point diffractor, the imaging operation, where the constituted wave front was returned to the point diffractor by the interference of elemental waves for the spatially aliased records, did not always give clear images. In the case of multiple point diffractors, good images were obtained with less noise generation in spite of aliased records. There are many diffractors in actual geological structures, which corresponds to the case of multiple point diffractors. Finally, better images could be obtained by inputting records acquired using the pseudo-random geometry rather than by inputting spatially aliased records acquired using the regular geometry. 7 refs., 6 figs.

  12. Randomized Prediction Games for Adversarial Machine Learning.

    Science.gov (United States)

    Rota Bulo, Samuel; Biggio, Battista; Pillai, Ignazio; Pelillo, Marcello; Roli, Fabio

    In spam and malware detection, attackers exploit randomization to obfuscate malicious data and increase their chances of evading detection at test time, e.g., malware code is typically obfuscated using random strings or byte sequences to hide known exploits. Interestingly, randomization has also been proposed to improve security of learning algorithms against evasion attacks, as it results in hiding information about the classifier to the attacker. Recent work has proposed game-theoretical formulations to learn secure classifiers, by simulating different evasion attacks and modifying the classification function accordingly. However, both the classification function and the simulated data manipulations have been modeled in a deterministic manner, without accounting for any form of randomization. In this paper, we overcome this limitation by proposing a randomized prediction game, namely, a noncooperative game-theoretic formulation in which the classifier and the attacker make randomized strategy selections according to some probability distribution defined over the respective strategy set. We show that our approach allows one to improve the tradeoff between attack detection and false alarms with respect to the state-of-the-art secure classifiers, even against attacks that are different from those hypothesized during design, on application examples including handwritten digit recognition, spam, and malware detection.

  13. Pseudo-random number generation using a 3-state cellular automaton

    Science.gov (United States)

    Bhattacharjee, Kamalika; Paul, Dipanjyoti; Das, Sukanta

    This paper investigates the potential of a 3-neighborhood, 3-state cellular automaton (CA) under a periodic boundary condition for pseudo-random number generation. Theoretical and empirical tests are performed on the numbers generated by the CA to assess its quality as a pseudo-random number generator (PRNG). We analyze the strengths and weaknesses of the proposed PRNG and conclude that the selected CA is a good random number generator.
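
    A small sketch of the general construction: a 3-state, 3-neighbourhood CA evolved under a periodic boundary and read out as a stream of trits. The local rule below is an arbitrary placeholder, since the abstract does not specify the selected CA rule.

```python
# Minimal sketch of a 3-state, 3-neighbourhood cellular automaton under a
# periodic boundary, read out as a trit stream.  The rule (a linear rule
# over GF(3)) is a placeholder, not the specific CA selected in the paper.
import numpy as np

def ca_step(cells):
    left = np.roll(cells, 1)       # periodic boundary
    right = np.roll(cells, -1)
    return (left + 2 * cells + right) % 3   # placeholder local rule

rng = np.random.default_rng(0)
cells = rng.integers(0, 3, size=64)          # random initial configuration
stream = []
for _ in range(100):
    cells = ca_step(cells)
    stream.append(int(cells[0]))             # read one cell per step as a trit

print(stream[:20])
```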

  14. Comparing Non-Medical Sex Selection and Saviour Sibling Selection in the Case of JS and LS v Patient Review Panel: Beyond the Welfare of the Child?

    Science.gov (United States)

    Smith, Malcolm K; Taylor-Sands, Michelle

    2018-03-01

    The national ethical guidelines relevant to assisted reproductive technology (ART) have recently been reviewed by the National Health and Medical Research Council (NHMRC). The review process paid particular attention to the issue of non-medical sex selection, although ultimately, the updated ethical guidelines maintain the pre-consultation position of a prohibition on non-medical sex selection. Whilst this recent review process provided a public forum for debate and discussion of this ethically contentious issue, the Victorian case of JS and LS v Patient Review Panel (Health and Privacy) [2011] VCAT 856 provides a rare instance where the prohibition on non-medical sex selection has been explored by a court or tribunal in Australia. This paper analyses the reasoning in that decision, focusing specifically on how the Victorian Civil and Administrative Tribunal applied the statutory framework relevant to ART and its comparison to other uses of embryo selection technologies. The Tribunal relied heavily upon the welfare-of-the-child principle under the Assisted Reproductive Treatment Act 2008 (Vic). The Tribunal also compared non-medical sex selection with saviour sibling selection (that is, where a child is purposely conceived as a matched tissue donor for an existing child of the family). Our analysis leads us to conclude that the Tribunal's reasoning fails to adequately justify the denial of the applicants' request to utilize ART services to select the sex of their prospective child.

  15. Effect Evaluation of a Randomized Trial to Reduce Infectious Illness and Illness-Related Absenteeism Among Schoolchildren

    DEFF Research Database (Denmark)

    Denbæk, Anne Maj; Andersen, Anette; Bonnesen, Camilla Thørring

    2018-01-01

    -based multi-component intervention to improve hand washing among schoolchildren, the Hi Five study, succeeded in reducing infectious illness and illness-related absenteeism in schools. METHODS: The Hi Five study was a three-armed cluster-randomized controlled trial involving 43 randomly selected Danish schools; two intervention arms involving 14 schools each, and 15 control schools. Infectious illness days, infectious illness episodes and illness-related absenteeism were estimated in multilevel regressions, based on available cases of text messages answered by parents and based on questionnaire data. ... 0.84-1.16)) or in reporting illness-related absenteeism (OR I-arm I: 1.09 (0.83-1.43) & OR I-arm II: 1.06 (0.81-1.40)). CONCLUSIONS: The multi-component Hi Five intervention achieved no difference in the number of illness days, illness episodes or illness-related absenteeism among children in intervention schools compared

  16. Selection of Celebrity Endorsers

    DEFF Research Database (Denmark)

    Hollensen, Svend; Schimmelpfennig, Christian

    2013-01-01

    Purpose – This research aims at shedding some light on the various avenues marketers can undertake until finally an endorsement contract is signed. The focus of the study lies on verifying the generally held assumption that endorser selection is usually taken care of by creative agencies, vetting several candidates by means of subtle evaluation procedures. Design/methodology/approach – A case study research has been carried out among companies experienced in celebrity endorsements to learn more about the endorser selection process in practice. Based on these cases, theory is inductively developed. Findings – Our research suggests that the generally held assumption that endorsers are selected and thoroughly vetted by a creative agency may not be universally valid. A normative model to illustrate the continuum of the selection process in practice is suggested, and the two polar case studies (Swiss brand...

  17. Mobile electronic versus paper case report forms in clinical trials: a randomized controlled trial.

    Science.gov (United States)

    Fleischmann, Robert; Decker, Anne-Marie; Kraft, Antje; Mai, Knut; Schmidt, Sein

    2017-12-01

    Regulations, study design complexity and the amounts of collected and shared data in clinical trials render efficient data handling procedures inevitable. Recent research suggests that electronic data capture can be key in this context, but the evidence is insufficient. This randomized controlled parallel group study tested the hypothesis that time efficiency is superior when electronic (eCRF) instead of paper case report forms (pCRF) are used for data collection. We additionally investigated predictors of time-saving effects and data integrity. This study was conducted on top of a clinical weight loss trial performed at a clinical research facility over six months. All study nurses and patients participating in the clinical trial were eligible to participate and were randomly allocated to enter cross-sectional data obtained during routine visits either through pCRF or eCRF. A balanced randomization list was generated before enrolment commenced. 90 and 30 records were gathered for the time that 27 patients and 2 study nurses required to report 2025 and 2037 field values, respectively. The primary hypothesis, that eCRF use is faster than pCRF use, was tested by a two-tailed t-test. Analysis of variance and covariance were used to evaluate predictors of entry performance. Data integrity was evaluated by descriptive statistics. All randomized patients were included in the study (eCRF group n = 13, pCRF group n = 14). Data collection with eCRFs, as compared with pCRFs, was associated with significant time savings across all conditions (8.29 ± 5.15 min vs. 10.54 ± 6.98 min, p = .047). This effect did not depend on participant type, i.e. patients or study nurses (F(1,112) = .15, p = .699), CRF length (F(2,112) = .49, p = .609) or patient age (Beta = .09, p = .534). An additional 5.16 ± 2.83 min per CRF was saved with eCRFs because data transcription became redundant when patients answered questionnaires directly in eCRFs. Data integrity was

  18. Topical Colchicine Gel versus Diclofenac Sodium Gel for the Treatment of Actinic Keratoses: A Randomized, Double-Blind Study.

    Science.gov (United States)

    Faghihi, Gita; Elahipoor, Azam; Iraji, Fariba; Behfar, Shadi; Abtahi-Naeini, Bahareh

    2016-01-01

    Introduction. Actinic keratoses (AKs), a premalignant skin lesion, are common in fair skin. Although destructive treatment remains the gold standard for AKs, medical therapies may be preferable due to their comfort and reliability. This study aims to compare the effects of topical 1% colchicine gel and 3% diclofenac sodium gel in AKs. Materials and Methods. In this randomized double-blind study, 70 lesions were selected. Patients were randomized before receiving either 1% colchicine gel or 3% diclofenac sodium cream twice a day for 6 weeks. Patients were evaluated in terms of their lesion size, treatment complications, and recurrence at 7, 30, 60, and 120 days after treatment. Results. The mean change in lesion size was significant in both groups between before and after treatment (p < 0.05). No case of erythema was seen in the colchicine group, while erythema was seen in 22.9% (eight cases) of patients in the diclofenac sodium group (p = 0.005). Conclusions. 1% colchicine gel was a safe and effective medication with fewer side effects and no recurrence of the lesion.

  19. Topical Colchicine Gel versus Diclofenac Sodium Gel for the Treatment of Actinic Keratoses: A Randomized, Double-Blind Study

    Directory of Open Access Journals (Sweden)

    Gita Faghihi

    2016-01-01

    Full Text Available Introduction. Actinic keratoses (AKs), a premalignant skin lesion, are common in fair skin. Although destructive treatment remains the gold standard for AKs, medical therapies may be preferable due to their comfort and reliability. This study aims to compare the effects of topical 1% colchicine gel and 3% diclofenac sodium gel in AKs. Materials and Methods. In this randomized double-blind study, 70 lesions were selected. Patients were randomized before receiving either 1% colchicine gel or 3% diclofenac sodium cream twice a day for 6 weeks. Patients were evaluated in terms of their lesion size, treatment complications, and recurrence at 7, 30, 60, and 120 days after treatment. Results. The mean change in lesion size was significant in both groups between before and after treatment (p < 0.05). No case of erythema was seen in the colchicine group, while erythema was seen in 22.9% (eight cases) of patients in the diclofenac sodium group (p = 0.005). Conclusions. 1% colchicine gel was a safe and effective medication with fewer side effects and no recurrence of the lesion.

  20. Theory of Randomized Search Heuristics in Combinatorial Optimization

    DEFF Research Database (Denmark)

    The rigorous mathematical analysis of randomized search heuristics (RSHs) with respect to their expected runtime is a growing research area where many results have been obtained in recent years. This class of heuristics includes well-known approaches such as Randomized Local Search (RLS), the Metr... ... analysis of randomized algorithms to RSHs. Mostly, the expected runtime of RSHs on selected problems is analyzed. Thereby, we understand why and when RSHs are efficient optimizers and, conversely, when they cannot be efficient. The tutorial will give an overview on the analysis of RSHs for solving...

  1. The Effect of Price on Surgeons' Choice of Implants: A Randomized Controlled Survey.

    Science.gov (United States)

    Wasterlain, Amy S; Melamed, Eitan; Bello, Ricardo; Karia, Raj; Capo, John T

    2017-08-01

    Surgical costs are under scrutiny and surgeons are being held increasingly responsible for cost containment. In some instances, implants are the largest component of total procedure cost, yet previous studies reveal that surgeons' knowledge of implant prices is poor. Our study aims to (1) understand drivers behind implant selection and (2) assess whether educating surgeons about implant costs affects implant selection. We surveyed 226 orthopedic surgeons across 6 continents. The survey presented 8 clinical cases of upper extremity fractures with history, radiographs, and implant options. Surgeons were randomized to receive either a version with each implant's average selling price ("price-aware" group), or a version without prices ("price-naïve" group). Surgeons selected a surgical implant and ranked factors affecting implant choice. Descriptive statistics and univariate, multivariable, and subgroup analyses were performed. For cases offering implants within the same class (eg, volar locking plates), price-awareness reduced implant cost by 9% to 11%. When offered different models of distal radius volar locking plates, 25% of price-naïve surgeons selected the most expensive plate compared with only 7% of price-aware surgeons. For cases offering different classes of implants (eg, plate vs external fixator), there was no difference in implant choice between price-aware and price-naïve surgeons. Familiarity with the implant was the most common reason for choosing an implant in both groups (35% vs 46%). Price-aware surgeons were more likely to rank cost as a factor (29% vs 21%). Price awareness significantly influences surgeons' choice of a specific model within the same implant class. Merely including prices with a list of implants leads surgeons to select less expensive implants. This implies that an untapped opportunity exists to reduce surgical expenditures simply by enhancing surgeons' cost awareness. Economic/Decision Analyses I. Copyright © 2017 American

  2. Evolution in fluctuating environments: decomposing selection into additive components of the Robertson-Price equation.

    Science.gov (United States)

    Engen, Steinar; Saether, Bernt-Erik

    2014-03-01

    We analyze the stochastic components of the Robertson-Price equation for the evolution of quantitative characters that enables decomposition of the selection differential into components due to demographic and environmental stochasticity. We show how these two types of stochasticity affect the evolution of multivariate quantitative characters by defining demographic and environmental variances as components of individual fitness. The exact covariance formula for selection is decomposed into three components, the deterministic mean value, as well as stochastic demographic and environmental components. We show that demographic and environmental stochasticity generate random genetic drift and fluctuating selection, respectively. This provides a common theoretical framework for linking ecological and evolutionary processes. Demographic stochasticity can cause random variation in selection differentials independent of fluctuating selection caused by environmental variation. We use this model of selection to illustrate that the effect on the expected selection differential of random variation in individual fitness is dependent on population size, and that the strength of fluctuating selection is affected by how environmental variation affects the covariance in Malthusian fitness between individuals with different phenotypes. Thus, our approach enables us to partition out the effects of fluctuating selection from the effects of selection due to random variation in individual fitness caused by demographic stochasticity. © 2013 The Author(s). Evolution © 2013 The Society for the Study of Evolution.
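
    For reference, the textbook Robertson-Price identity that this decomposition starts from can be written as below; the three-way split of the selection differential is only indicated schematically, since the paper's exact expressions are not reproduced in the abstract.

```latex
% Textbook Robertson--Price identity (not copied from the paper); w is fitness,
% z the character, and bars denote population means.  The abstract's
% decomposition acts on the selection differential s.
\[
\Delta\bar{z} \;=\; \frac{\operatorname{cov}(w, z)}{\bar{w}}
              \;+\; \frac{\operatorname{E}\!\left(w\,\Delta z\right)}{\bar{w}},
\qquad
s \;\equiv\; \frac{\operatorname{cov}(w, z)}{\bar{w}}
  \;=\; s_{\mathrm{det}} + s_{\mathrm{dem}} + s_{\mathrm{env}},
\]
% with s_det the deterministic mean component and s_dem, s_env the
% contributions of demographic and environmental stochasticity.
```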

  3. Random isotropic one-dimensional XY-model

    Science.gov (United States)

    Gonçalves, L. L.; Vieira, A. P.

    1998-01-01

    The 1D isotropic s = ½ XY-model (N sites), with random exchange interaction in a transverse random field, is considered. The random variables satisfy bimodal quenched distributions. The solution is obtained by using the Jordan-Wigner fermionization and a canonical transformation, reducing the problem to diagonalizing an N × N matrix, corresponding to a system of N noninteracting fermions. The calculations are performed numerically for N = 1000, and the field-induced magnetization at T = 0 is obtained by averaging the results for the different samples. For the dilute case, in the uniform field limit, the magnetization exhibits various discontinuities, which are the consequence of the existence of disconnected finite clusters distributed along the chain. Also in this limit, for finite exchange constants J_A and J_B, as the probability of J_A varies from one to zero, the saturation field is seen to vary from Γ_A to Γ_B, where Γ_A (Γ_B) is the value of the saturation field for the pure case with exchange constant equal to J_A (J_B).
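
    A schematic of the numerical step mentioned above, diagonalizing an N × N single-particle matrix with bimodal disorder; the matrix structure, coupling values and probabilities below are generic stand-ins and may differ from the paper's exact Jordan-Wigner form.

```python
# Schematic single-particle diagonalisation for a disordered chain with
# bimodal couplings J_i and fields Gamma_i.  The precise matrix (prefactors,
# boundary terms) in the paper may differ; this only illustrates the step.
import numpy as np

N = 1000
rng = np.random.default_rng(0)
J = rng.choice([0.5, 1.0], size=N - 1, p=[0.3, 0.7])    # bimodal exchange
Gamma = rng.choice([0.2, 0.8], size=N, p=[0.5, 0.5])    # bimodal fields

H = np.diag(Gamma) + np.diag(J, 1) + np.diag(J, -1)     # N x N matrix
energies = np.linalg.eigvalsh(H)                        # single-particle levels
print(energies[:5])
```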

  4. Procreative liberty: the case for preconception sex selection.

    Science.gov (United States)

    Dahl, Edgar

    2003-01-01

    Preconception sex selection for non-medical reasons raises serious moral, legal and social issues. The main concerns include the threat of a sex ratio distortion due to a common preference for boys over girls, the charge of sexism, the danger of reinforcing gender stereotypical behaviour in sex selected children, and the fear of a slippery slope towards creating designer babies. This paper endeavours to show that none of the objections to preconception sex selection is conclusive and that there is no justification for denying parents the right to choose the sex of their prospective children.

  5. Programmable disorder in random DNA tilings

    Science.gov (United States)

    Tikhomirov, Grigory; Petersen, Philip; Qian, Lulu

    2017-03-01

    Scaling up the complexity and diversity of synthetic molecular structures will require strategies that exploit the inherent stochasticity of molecular systems in a controlled fashion. Here we demonstrate a framework for programming random DNA tilings and show how to control the properties of global patterns through simple, local rules. We constructed three general forms of planar network—random loops, mazes and trees—on the surface of self-assembled DNA origami arrays on the micrometre scale with nanometre resolution. Using simple molecular building blocks and robust experimental conditions, we demonstrate control of a wide range of properties of the random networks, including the branching rules, the growth directions, the proximity between adjacent networks and the size distribution. Much as combinatorial approaches for generating random one-dimensional chains of polymers have been used to revolutionize chemical synthesis and the selection of functional nucleic acids, our strategy extends these principles to random two-dimensional networks of molecules and creates new opportunities for fabricating more complex molecular devices that are organized by DNA nanostructures.

  6. Much of the variation in breast pathology quality assurance data in the UK can be explained by the random order in which cases arrive at individual centres, but some true outliers do exist.

    Science.gov (United States)

    Cross, Simon S; Stephenson, Timothy J; Harrison, Robert F

    2011-10-01

    To investigate the role of random temporal order of patient arrival at screening centres in the variability seen in rates of node positivity and breast cancer grade between centres in the NHS Breast Screening Programme. Computer simulations were performed of the variation in node positivity and breast cancer grade with the random temporal arrival of patients at screening centres based on national UK audit data. Cumulative mean graphs of these data were plotted. Confidence intervals for the parameters were generated, using the binomial distribution. UK audit data were plotted on these control limit graphs. The results showed that much of the variability in the audit data could be accounted for by the effects of random order of arrival of cases at the screening centres. Confidence intervals of 99.7% identified true outliers in the data. Much of the variation in breast pathology quality assurance data in the UK can be explained by the random order in which cases arrive at individual centres. Control charts with confidence intervals of 99.7% plotted against the number of reported cases are useful tools for identification of true outliers. 2011 Blackwell Publishing Limited.
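
    A hedged sketch of the control-chart idea: simulate cases arriving in random order, track the running node-positivity rate, and compare it with roughly 99.7% binomial limits around an assumed national rate; the rate and sample size are illustrative, not the NHSBSP audit values.

```python
# Simulate random arrival order of cases and compare the running
# node-positivity rate with binomial control limits around an assumed rate.
import numpy as np
from scipy.stats import binom

p_national = 0.25          # assumed underlying node-positivity rate
n_cases = 400
rng = np.random.default_rng(1)
outcomes = rng.random(n_cases) < p_national          # random order of arrival
running_rate = np.cumsum(outcomes) / np.arange(1, n_cases + 1)

n = np.arange(1, n_cases + 1)
lower = binom.ppf(0.0015, n, p_national) / n          # ~99.7% control limits
upper = binom.ppf(0.9985, n, p_national) / n

outliers = np.where((running_rate < lower) | (running_rate > upper))[0]
print("points outside the 99.7% limits:", outliers[:10])
```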

  7. Key Aspects of Nucleic Acid Library Design for in Vitro Selection

    Science.gov (United States)

    Vorobyeva, Maria A.; Davydova, Anna S.; Vorobjev, Pavel E.; Pyshnyi, Dmitrii V.; Venyaminova, Alya G.

    2018-01-01

    Nucleic acid aptamers capable of selectively recognizing their target molecules have nowadays been established as powerful and tunable tools for biospecific applications, be it therapeutics, drug delivery systems or biosensors. It is now generally acknowledged that in vitro selection enables one to generate aptamers to almost any target of interest. However, the success of selection and the affinity of the resulting aptamers depend to a large extent on the nature and design of an initial random nucleic acid library. In this review, we summarize and discuss the most important features of the design of nucleic acid libraries for in vitro selection such as the nature of the library (DNA, RNA or modified nucleotides), the length of a randomized region and the presence of fixed sequences. We also compare and contrast different randomization strategies and consider computer methods of library design and some other aspects. PMID:29401748

  8. Continuity of Integrated Density of States - Independent Randomness

    Indian Academy of Sciences (India)

    In this paper we discuss the continuity properties of the integrated density of states for random models based on that of the single site distribution. Our results are valid for models with independent randomness with arbitrary free parts. In particular in the case of the Anderson type models (with stationary, growing, decaying ...

  9. Employing online quantum random number generators for generating truly random quantum states in Mathematica

    Science.gov (United States)

    Miszczak, Jarosław Adam

    2013-01-01

    The presented package for the Mathematica computing system allows the harnessing of quantum random number generators (QRNG) for investigating the statistical properties of quantum states. The described package implements a number of functions for generating random states. The new version of the package adds the ability to use the on-line quantum random number generator service and implements new functions for retrieving lists of random numbers. Thanks to the introduced improvements, the new version provides faster access to high-quality sources of random numbers and can be used in simulations requiring large amounts of random data. New version program summary: Program title: TRQS. Catalogue identifier: AEKA_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKA_v2_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 18 134. No. of bytes in distributed program, including test data, etc.: 2 520 49. Distribution format: tar.gz. Programming language: Mathematica, C. Computer: Any supporting Mathematica in version 7 or higher. Operating system: Any platform supporting Mathematica; tested with GNU/Linux (32 and 64 bit). RAM: Case-dependent. Supplementary material: Fig. 1 mentioned below can be downloaded. Classification: 4.15. External routines: Quantis software library (http://www.idquantique.com/support/quantis-trng.html). Catalogue identifier of previous version: AEKA_v1_0. Journal reference of previous version: Comput. Phys. Comm. 183 (2012) 118. Does the new version supersede the previous version?: Yes. Nature of problem: Generation of random density matrices and utilization of high-quality random numbers for the purpose of computer simulation. Solution method: Use of a physical quantum random number generator and an on-line service providing access to the source of true random
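
    The package itself is Mathematica code; as a language-neutral illustration of the kind of object it produces, the snippet below generates a random density matrix by normalizing a Ginibre matrix. This is a generic recipe, not TRQS code and not tied to the on-line QRNG service.

```python
# Generate a random density matrix: rho = G G^dagger / Tr(G G^dagger),
# with G a complex Ginibre matrix.  Generic recipe for illustration only.
import numpy as np

def random_density_matrix(dim, rng):
    g = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    rho = g @ g.conj().T
    return rho / np.trace(rho)

rng = np.random.default_rng(0)
rho = random_density_matrix(4, rng)
print(np.trace(rho).real)                         # trace is 1
print(np.all(np.linalg.eigvalsh(rho) >= -1e-12))  # positive semidefinite
```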

  10. DNABP: Identification of DNA-Binding Proteins Based on Feature Selection Using a Random Forest and Predicting Binding Residues.

    Science.gov (United States)

    Ma, Xin; Guo, Jing; Sun, Xiao

    2016-01-01

    DNA-binding proteins are fundamentally important in cellular processes. Several computational-based methods have been developed to improve the prediction of DNA-binding proteins in previous years. However, insufficient work has been done on the prediction of DNA-binding proteins from protein sequence information. In this paper, a novel predictor, DNABP (DNA-binding proteins), was designed to predict DNA-binding proteins using the random forest (RF) classifier with a hybrid feature. The hybrid feature contains two types of novel sequence features, which reflect information about the conservation of physicochemical properties of the amino acids, and the binding propensity of DNA-binding residues and non-binding propensities of non-binding residues. The comparisons with each feature demonstrated that these two novel features contributed most to the improvement in predictive ability. Furthermore, to improve the prediction performance of the DNABP model, feature selection using the minimum redundancy maximum relevance (mRMR) method combined with incremental feature selection (IFS) was carried out during the model construction. The results showed that the DNABP model could achieve 86.90% accuracy, 83.76% sensitivity, 90.03% specificity and a Matthews correlation coefficient of 0.727. High prediction accuracy and performance comparisons with previous research suggested that DNABP could be a useful approach to identify DNA-binding proteins from sequence information. The DNABP web server system is freely available at http://www.cbi.seu.edu.cn/DNABP/.
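
    The quoted performance figures follow directly from a confusion matrix; the short check below computes accuracy, sensitivity, specificity and the Matthews correlation coefficient from made-up counts, not the DNABP results.

```python
# Confusion-matrix metrics used above; the counts are illustrative only.
import math

tp, fn, tn, fp = 420, 80, 450, 50

accuracy    = (tp + tn) / (tp + tn + fp + fn)
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
mcc = (tp * tn - fp * fn) / math.sqrt(
    (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))

print(f"acc={accuracy:.4f} sens={sensitivity:.4f} "
      f"spec={specificity:.4f} MCC={mcc:.4f}")
```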

  11. Significance of cultural beliefs in presentation of psychiatric illness: a case report of selective mutism in a man from Nepal.

    Science.gov (United States)

    Babikian, Sarkis; Emerson, Lyndal; Wynn, Gary H

    2007-11-01

    A 22-year-old active duty E1 Nepalese male who recently emigrated from Nepal suddenly exhibited strange behaviors and mutism during Advanced Individual Training. After receiving care from a hospital near his unit, he was transferred to Walter Reed Army Medical Center Inpatient Psychiatry for further evaluation and treatment. Although he was admitted with a diagnosis of psychosis not otherwise specified (NOS), after consideration of cultural factors and by ruling out concurrent thought disorder, a diagnosis of selective mutism was made. To our knowledge this is the first reported case of selective mutism in a soldier. This case serves as a reminder of the need for cultural awareness during psychological evaluation, diagnosis, and treatment of patients.

  12. On the role of heat and mass transfer into laser processability during selective laser melting AlSi12 alloy based on a randomly packed powder-bed

    Science.gov (United States)

    Wang, Lianfeng; Yan, Biao; Guo, Lijie; Gu, Dongdong

    2018-04-01

    A new transient mesoscopic model with a randomly packed powder bed has been proposed to investigate heat and mass transfer and laser process quality between neighboring tracks during selective laser melting (SLM) of AlSi12 alloy by the finite volume method (FVM), considering the solid/liquid phase transition, variable temperature-dependent properties and interfacial forces. The results revealed that both the operating temperature and the resultant cooling rate were clearly elevated by increasing the laser power. Accordingly, the viscosity of the liquid was significantly reduced at a large laser power and the melt was characterized by a large velocity, which tended to produce a more intensive convection within the pool. In this case, sufficient heat and mass transfer occurred at the interface between the previously fabricated tracks and the track currently being built, producing thorough spreading between neighboring tracks and a resultant high-quality surface without obvious porosity. By contrast, the surface quality of SLM-processed components built with a relatively low laser power was notably weaker, due to the limited and insufficient heat and mass transfer at the interface of neighboring tracks. Furthermore, the experimental morphologies of the top surface were correspondingly acquired and were in full accordance with the results calculated via simulation.

  13. Feature-selective attention in healthy old age: a selective decline in selective attention?

    Science.gov (United States)

    Quigley, Cliodhna; Müller, Matthias M

    2014-02-12

    Deficient selection against irrelevant information has been proposed to underlie age-related cognitive decline. We recently reported evidence for maintained early sensory selection when older and younger adults used spatial selective attention to perform a challenging task. Here we explored age-related differences when spatial selection is not possible and feature-selective attention must be deployed. We additionally compared the integrity of feedforward processing by exploiting the well established phenomenon of suppression of visual cortical responses attributable to interstimulus competition. Electroencephalogram was measured while older and younger human adults responded to brief occurrences of coherent motion in an attended stimulus composed of randomly moving, orientation-defined, flickering bars. Attention was directed to horizontal or vertical bars by a pretrial cue, after which two orthogonally oriented, overlapping stimuli or a single stimulus were presented. Horizontal and vertical bars flickered at different frequencies and thereby elicited separable steady-state visual-evoked potentials, which were used to examine the effect of feature-based selection and the competitive influence of a second stimulus on ongoing visual processing. Age differences were found in feature-selective attentional modulation of visual responses: older adults did not show consistent modulation of magnitude or phase. In contrast, the suppressive effect of a second stimulus was robust and comparable in magnitude across age groups, suggesting that bottom-up processing of the current stimuli is essentially unchanged in healthy old age. Thus, it seems that visual processing per se is unchanged, but top-down attentional control is compromised in older adults when space cannot be used to guide selection.

  14. Surgical management of thalamic gliomas: case selection, technical considerations, and review of literature.

    Science.gov (United States)

    Sai Kiran, Narayanam Anantha; Thakar, Sumit; Dadlani, Ravi; Mohan, Dilip; Furtado, Sunil Valentine; Ghosal, Nandita; Aryan, Saritha; Hegde, Alangar S

    2013-07-01

    This study aimed to identify (1) the thalamic gliomas suitable for surgical resection and (2) the appropriate surgical approach based on their location and the displacement of the posterior limb of the internal capsule (PLIC). A retrospective study over a 5-year period (from 2006 to 2010) was performed in 41 patients with thalamic gliomas. The mean age of these patients was 20.4 years (range, 2-65 years). Twenty (49 %) tumors were thalamic, 19 (46 %) were thalamopeduncular, and 2 (5 %) were bilateral. The PLIC, based on T2-weighted magnetic resonance axial sections, was displaced anterolaterally in 23 (56 %) cases and laterally in 6 (14 %) cases. It was involved by lesion in eight (20 %) cases and could not be identified in four (10 %) cases. Resection, favored in patients with well-defined, contrast-enhancing lesions, was performed in 34 (83 %) cases, while a biopsy was resorted to in 7 (17 %) cases. A gross total resection or near total resection (>90 %) could be achieved in 26 (63 %) cases. The middle temporal gyrus approach, used when the PLIC was displaced anterolaterally, was the commonly used approach (63.5 %). Common pathologies were pilocytic astrocytoma (58 %) in children and grade III/IV astrocytomas (86 %) in adults. Preoperative motor deficits improved in 64 % of the patients with pilocytic lesions as compared to 0 % in patients with grade III/IV lesions (P value, 0.001). Postoperatively, two patients (5 %) had marginal worsening of motor power, two patients developed visual field defects, and one patient developed a third nerve paresis. Radical resection of thalamic gliomas is a useful treatment modality in a select subset of patients and is the treatment of choice for pilocytic astrocytomas. Tailoring the surgical approach, depending on the relative position of the PLIC, has an important bearing on outcome.

  15. Propensity score matching for selection of local areas as controls for evaluation of effects of alcohol policies in case series and quasi case-control designs.

    Science.gov (United States)

    de Vocht, F; Campbell, R; Brennan, A; Mooney, J; Angus, C; Hickman, M

    2016-03-01

    Area-level public health interventions can be difficult to evaluate using natural experiments. We describe the use of propensity score matching (PSM) to select control local authority areas (LAU) to evaluate the public health impact of alcohol policies for (1) prospective evaluation of alcohol policies using area-level data, and (2) a novel two-stage quasi case-control design. Ecological. Alcohol-related indicator data (Local Alcohol Profiles for England, PHE Health Profiles and ONS data) were linked at LAU level. Six LAUs (Blackpool, Bradford, Bristol, Ipswich, Islington, and Newcastle-upon-Tyne) as sample intervention or case areas were matched to two control LAUs each using PSM. For the quasi case-control study a second stage was added aimed at obtaining maximum contrast in outcomes based on propensity scores. Matching was evaluated based on average standardized absolute mean differences (ASAM) and variable-specific P-values after matching. The six LAUs were matched to suitable control areas (with ASAM 0.05 indicating good matching) for a prospective evaluation study that sought areas that were similar at baseline in order to assess whether a change in intervention exposure led to a change in the outcome (alcohol related harm). PSM also generated appropriate matches for a quasi case-control study--whereby the contrast in health outcomes between cases and control areas needed to be optimized in order to assess retrospectively whether differences in intervention exposure were associated with the outcome. The use of PSM for area-level alcohol policy evaluation, but also for other public health interventions, will improve the value of these evaluations by objective and quantitative selection of the most appropriate control areas. Copyright © 2015 The Authors. Published by Elsevier Ltd.. All rights reserved.
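
    A schematic of the matching step: fit a propensity model on area-level covariates and pick the nearest control areas on the estimated score. The covariate names, number of areas and two-controls-per-case choice are placeholders, not the study's actual indicator set.

```python
# Propensity-score matching of intervention areas to control areas:
# logistic regression for the score, nearest-neighbour matching on it.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_areas = 300
df = pd.DataFrame({
    "deprivation": rng.normal(size=n_areas),
    "alcohol_admissions": rng.normal(size=n_areas),
    "population_density": rng.normal(size=n_areas),
})
df["intervention"] = 0
df.loc[rng.choice(n_areas, size=6, replace=False), "intervention"] = 1

X = df[["deprivation", "alcohol_admissions", "population_density"]]
df["pscore"] = LogisticRegression().fit(X, df["intervention"]).predict_proba(X)[:, 1]

controls = df[df.intervention == 0]
for idx, row in df[df.intervention == 1].iterrows():
    # two nearest controls on the propensity score (no caliper applied)
    matched = (controls.pscore - row.pscore).abs().nsmallest(2).index
    print(f"intervention area {idx}: matched controls {list(matched)}")
```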

  16. Location selection in the visual domain

    NARCIS (Netherlands)

    van der Lubbe, Robert Henricus Johannes; Woestenburg, Jaap C.

    2000-01-01

    According to A.H.C. Van der Heijden (1992), attentional selection of visual stimuli can be considered as location selection. Depending on the type of task, location selection can be considered to be automatic (e.g., in case of abrupt onsets), directly controlled (e.g., in case of symbolic precues),

  17. Criticality and entanglement in random quantum systems

    International Nuclear Information System (INIS)

    Refael, G; Moore, J E

    2009-01-01

    We review studies of entanglement entropy in systems with quenched randomness, concentrating on universal behavior at strongly random quantum critical points. The disorder-averaged entanglement entropy provides insight into the quantum criticality of these systems and an understanding of their relationship to non-random ('pure') quantum criticality. The entanglement near many such critical points in one dimension shows a logarithmic divergence in subsystem size, similar to that in the pure case but with a different universal coefficient. Such universal coefficients are examples of universal critical amplitudes in a random system. Possible measurements are reviewed along with the one-particle entanglement scaling at certain Anderson localization transitions. We also comment briefly on higher dimensions and challenges for the future.

  18. Random phenomena fundamentals of probability and statistics for engineers

    CERN Document Server

    Ogunnaike, Babatunde A

    2009-01-01

    Table of contents: Prelude; Approach Philosophy; Four Basic Principles; I Foundations; Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach; Random Phenomena, Variability, and Uncertainty; Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework; II Probability; Fundamentals of Probability Theory; Building Blocks; Operations; Probability; Conditional Probability; Independence; Random Variables and Distributions; Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions; Multidimensional Random Variables; Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables; Random Variable Transformations; Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations; Application Case Studies I: Probability; Mendel and Heredity; World War II Warship Tactical Response Under Attack; III Distributions; Ide...

  19. Statistical auditing and randomness test of lotto k/N-type games

    Science.gov (United States)

    Coronel-Brizio, H. F.; Hernández-Montoya, A. R.; Rapallo, F.; Scalas, E.

    2008-11-01

    One of the most popular lottery games worldwide is the so-called “lotto k/N”. It considers N numbers 1,2,…,N from which k are drawn randomly, without replacement. A player selects k or more numbers and the first prize is shared amongst those players whose selected numbers match all of the k randomly drawn. Exact rules may vary in different countries. In this paper, mean values and covariances for the random variables representing the numbers drawn from this kind of game are presented, with the aim of using them to audit statistically the consistency of a given sample of historical results with theoretical values coming from a hypergeometric statistical model. The method can be adapted to test pseudorandom number generators.
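
    A small Monte Carlo illustration of this kind of audit, using per-number inclusion indicators for an example 6/49 game (a related but simpler statistic than the order statistics treated in the paper): the empirical mean and pairwise covariance are compared with the exact values implied by sampling without replacement.

```python
# Simulate lotto draws and compare empirical statistics of the per-number
# inclusion indicators with the exact hypergeometric-model values.
# The 6/49 parameters are an example, not a claim about any particular lottery.
import numpy as np

N, k, n_draws = 49, 6, 50_000
rng = np.random.default_rng(0)

indicators = np.zeros((n_draws, N))
for d in range(n_draws):
    drawn = rng.choice(N, size=k, replace=False)
    indicators[d, drawn] = 1

emp_mean = indicators.mean(axis=0)                       # ~ k/N for each number
emp_cov = np.cov(indicators[:, 0], indicators[:, 1])[0, 1]

th_mean = k / N
th_cov = (k / N) * ((k - 1) / (N - 1)) - (k / N) ** 2    # Cov(I_i, I_j), i != j

print(f"mean: empirical {emp_mean.mean():.4f} vs exact {th_mean:.4f}")
print(f"cov between two fixed numbers: empirical {emp_cov:.6f} vs exact {th_cov:.6f}")
```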

  20. Random and non-random mating populations: Evolutionary dynamics in meiotic drive.

    Science.gov (United States)

    Sarkar, Bijan

    2016-01-01

    Game theoretic tools are utilized to analyze a one-locus continuous selection model of sex-specific meiotic drive by considering nonequivalence of the viabilities of reciprocal heterozygotes that might be noticed at an imprinted locus. The model draws attention to the role of viability selections of different types in examining the stable nature of the polymorphic equilibrium. A bridge between population genetics and evolutionary game theory has been built up by applying the concept of the Fundamental Theorem of Natural Selection. In addition to pointing out the influences of male and female segregation ratios on selection, the configuration structure reveals some notable results, e.g., Hardy-Weinberg frequencies hold in replicator dynamics, faster evolution occurs at the maximized variance in fitness, mixed Evolutionarily Stable Strategies (ESS) exist in asymmetric games, and evolution tends to follow not only a 1:1 sex ratio but also a 1:1 ratio of different alleles at a particular gene locus. Through construction of replicator dynamics in the group selection framework, our selection model introduces a redefined basis for game theory to incorporate non-random mating, where a mating parameter associated with population structure depends on the social structure. Also, the model exposes the fact that the number of polymorphic equilibria will depend on the algebraic expression of population structure. Copyright © 2015 Elsevier Inc. All rights reserved.
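
    For orientation, the standard replicator equation that such models build on is given below in its textbook form; the paper's sex-specific, non-random-mating version adds structure that is not shown here.

```latex
% Textbook replicator equation (general form only, not the paper's model):
% x_i is the frequency of type i and A is the payoff (fitness) matrix.
\[
\dot{x}_i \;=\; x_i\!\left[(A\mathbf{x})_i - \mathbf{x}^{\mathsf{T}} A \mathbf{x}\right],
\qquad i = 1,\dots,n,
\qquad \sum_{i=1}^{n} x_i = 1 .
\]
```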

  1. Velocity and Dispersion for a Two-Dimensional Random Walk

    International Nuclear Information System (INIS)

    Li Jinghui

    2009-01-01

    In the paper, we consider the transport of a two-dimensional random walk. The velocity and the dispersion of this two-dimensional random walk are derived. It is mainly shown that: (i) by controlling the values of the transition rates, the direction of the random walk can be reversed; (ii) for some suitably selected transition rates, our two-dimensional random walk can be efficient in comparison with the one-dimensional random walk. Our work is motivated in part by the challenge of explaining the unidirectional transport of motor proteins. When the motor proteins move at the turn points of their tracks (i.e., the cytoskeleton filaments and the DNA molecular tubes), some of our results in this paper can be used to deal with the problem. (general)
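
    A quick Monte Carlo sketch of estimating drift velocity and dispersion for a biased two-dimensional lattice walk; the transition probabilities are arbitrary example values, not the rates analyzed in the paper.

```python
# Estimate drift velocity and a diffusion-like dispersion coefficient for a
# biased 2D lattice random walk by simulating many independent walkers.
import numpy as np

probs = np.array([0.4, 0.1, 0.25, 0.25])          # +x, -x, +y, -y (bias along +x)
steps = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]])

rng = np.random.default_rng(0)
n_walkers, n_steps = 5_000, 500
choices = rng.choice(4, size=(n_walkers, n_steps), p=probs)
positions = steps[choices].sum(axis=1)            # final (x, y) of each walker

velocity = positions.mean(axis=0) / n_steps       # drift per step
dispersion = positions.var(axis=0) / (2 * n_steps)

print("velocity   (x, y):", velocity)
print("dispersion (x, y):", dispersion)
```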

  2. Dynamical correlations for circular ensembles of random matrices

    International Nuclear Information System (INIS)

    Nagao, Taro; Forrester, Peter

    2003-01-01

    Circular Brownian motion models of random matrices were introduced by Dyson and describe the parametric eigenparameter correlations of unitary random matrices. For symmetric unitary, self-dual quaternion unitary and an analogue of antisymmetric Hermitian matrix initial conditions, Brownian dynamics toward the unitary symmetry is analyzed. The dynamical correlation functions of an arbitrary number of Brownian particles at an arbitrary number of times are shown to be expressible as quaternion determinants, as in the case of Hermitian random matrix models.

  3. Multistable selection equations of pattern formation type in the case of inhomogeneous growth rates: With applications to two-dimensional assignment problems

    International Nuclear Information System (INIS)

    Frank, T.D.

    2011-01-01

    We study the stability of solutions of a particular type of multistable selection equations proposed by Starke, Schanz and Haken in the case of an inhomogeneous spectrum of growth parameters. We determine how the stability of feasible solutions depends on the inhomogeneity of the spectrum. We show that the strength of the competitive interaction between feasible solutions can act as a control parameter that induces bifurcations reducing the degree of multistability. - Research highlights: → Feasible solutions can be stable in the case of inhomogeneous growth parameters. → Changing coupling strength can induce bifurcations of feasible solutions. → Optimal solutions are obtained when selected winnings are relatively large.

  4. Random matrix ensembles for PT-symmetric systems

    International Nuclear Information System (INIS)

    Graefe, Eva-Maria; Mudute-Ndumbe, Steve; Taylor, Matthew

    2015-01-01

    Recently much effort has been made towards the introduction of non-Hermitian random matrix models respecting PT-symmetry. Here we show that there is a one-to-one correspondence between complex PT-symmetric matrices and split-complex and split-quaternionic versions of Hermitian matrices. We introduce two new random matrix ensembles of (a) Gaussian split-complex Hermitian; and (b) Gaussian split-quaternionic Hermitian matrices, of arbitrary sizes. We conjecture that these ensembles represent universality classes for PT-symmetric matrices. For the case of 2 × 2 matrices we derive analytic expressions for the joint probability distributions of the eigenvalues, the one-level densities and the level spacings in the case of real eigenvalues. (fast track communication)

  5. Village doctor-assisted case management of rural patients with schizophrenia: protocol for a cluster randomized control trial.

    Science.gov (United States)

    Gong, Wenjie; Xu, Dong; Zhou, Liang; Brown, Henry Shelton; Smith, Kirk L; Xiao, Shuiyuan

    2014-01-16

    Strict compliance with prescribed medication is the key to reducing relapses in schizophrenia. As villagers in China lack regular access to psychiatrists to supervise compliance, we propose to train village 'doctors' (i.e., villagers with basic medical training and currently operating in villages across China delivering basic clinical and preventive care) to manage rural patients with schizophrenia with respect to compliance and monitoring symptoms. We hypothesize that with the necessary training and proper oversight, village doctors can significantly improve drug compliance of villagers with schizophrenia. We will conduct a cluster randomized controlled trial in 40 villages in Liuyang, Hunan Province, China, home to approximately 400 patients with schizophrenia. Half of the villages will be randomized into the treatment group (village doctor, or VD model) wherein village doctors who have received training in a schizophrenia case management protocol will manage case records, supervise drug taking, educate patients and families on schizophrenia and its treatment, and monitor patients for signs of relapse in order to arrange prompt referral. The other 20 villages will be assigned to the control group (case as usual, or CAU model) wherein patients will be visited by psychiatrists every two months and receive free antipsychotic medications under an on-going government program, Project 686. These control patients will receive no other management or follow up from health workers. A baseline survey will be conducted before the intervention to gather data on patient's socio-economic status, drug compliance history, and clinical and health outcome measures. Data will be re-collected 6 and 12 months into the intervention. A difference-in-difference regression model will be used to detect the program effect on drug compliance and other outcome measures. A cost-effectiveness analysis will also be conducted to compare the value of the VD model to that of the CAU group. Lack of

  6. Risk Attitudes, Sample Selection and Attrition in a Longitudinal Field Experiment

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Lau, Morten Igel

    with respect to risk attitudes. Our design builds in explicit randomization on the incentives for participation. We show that there are significant sample selection effects on inferences about the extent of risk aversion, but that the effects of subsequent sample attrition are minimal. Ignoring sample selection leads to inferences that subjects in the population are more risk averse than they actually are. Correcting for sample selection and attrition affects utility curvature, but does not affect inferences about probability weighting. Properly accounting for sample selection and attrition effects leads to findings of temporal stability in overall risk aversion. However, that stability is around different levels of risk aversion than one might naively infer without the controls for sample selection and attrition we are able to implement. This evidence of “randomization bias” from sample selection

  7. Effects of Video Game Training on Measures of Selective Attention and Working Memory in Older Adults: Results from a Randomized Controlled Trial

    Science.gov (United States)

    Ballesteros, Soledad; Mayas, Julia; Prieto, Antonio; Ruiz-Marquez, Eloísa; Toril, Pilar; Reales, José M.

    2017-01-01

    Video game training with older adults potentially enhances aspects of cognition that decline with aging and could therefore offer a promising training approach. Although, previous published studies suggest that training can produce transfer, many of them have certain shortcomings. This randomized controlled trial (RCT; Clinicaltrials.gov ID: NCT02796508) tried to overcome some of these limitations by incorporating an active control group and the assessment of motivation and expectations. Seventy-five older volunteers were randomly assigned to the experimental group trained for 16 sessions with non-action video games from Lumosity, a commercial platform (http://www.lumosity.com/) or to an active control group trained for the same number of sessions with simulation strategy games. The final sample included 55 older adults (30 in the experimental group and 25 in the active control group). Participants were tested individually before and after training to assess working memory (WM) and selective attention and also reported their perceived improvement, motivation and engagement. The results showed improved performance across the training sessions. The main results were: (1) the experimental group did not show greater improvements in measures of selective attention and working memory than the active control group (the opposite occurred in the oddball task); (2) a marginal training effect was observed for the N-back task, but not for the Stroop task while both groups improved in the Corsi Blocks task. Based on these results, one can conclude that training with non-action games provide modest benefits for untrained tasks. The effect is not specific for that kind of training as a similar effect was observed for strategy video games. Groups did not differ in motivation, engagement or expectations. PMID:29163136

  8. Effects of Video Game Training on Measures of Selective Attention and Working Memory in Older Adults: Results from a Randomized Controlled Trial

    Directory of Open Access Journals (Sweden)

    Soledad Ballesteros

    2017-11-01

    Full Text Available Video game training with older adults potentially enhances aspects of cognition that decline with aging and could therefore offer a promising training approach. Although, previous published studies suggest that training can produce transfer, many of them have certain shortcomings. This randomized controlled trial (RCT; Clinicaltrials.gov ID: NCT02796508) tried to overcome some of these limitations by incorporating an active control group and the assessment of motivation and expectations. Seventy-five older volunteers were randomly assigned to the experimental group trained for 16 sessions with non-action video games from Lumosity, a commercial platform (http://www.lumosity.com/) or to an active control group trained for the same number of sessions with simulation strategy games. The final sample included 55 older adults (30 in the experimental group and 25 in the active control group). Participants were tested individually before and after training to assess working memory (WM) and selective attention and also reported their perceived improvement, motivation and engagement. The results showed improved performance across the training sessions. The main results were: (1) the experimental group did not show greater improvements in measures of selective attention and working memory than the active control group (the opposite occurred in the oddball task); (2) a marginal training effect was observed for the N-back task, but not for the Stroop task while both groups improved in the Corsi Blocks task. Based on these results, one can conclude that training with non-action games provide modest benefits for untrained tasks. The effect is not specific for that kind of training as a similar effect was observed for strategy video games. Groups did not differ in motivation, engagement or expectations.

  9. Effects of Video Game Training on Measures of Selective Attention and Working Memory in Older Adults: Results from a Randomized Controlled Trial.

    Science.gov (United States)

    Ballesteros, Soledad; Mayas, Julia; Prieto, Antonio; Ruiz-Marquez, Eloísa; Toril, Pilar; Reales, José M

    2017-01-01

    Video game training with older adults potentially enhances aspects of cognition that decline with aging and could therefore offer a promising training approach. Although, previous published studies suggest that training can produce transfer, many of them have certain shortcomings. This randomized controlled trial (RCT; Clinicaltrials.gov ID: NCT02796508) tried to overcome some of these limitations by incorporating an active control group and the assessment of motivation and expectations. Seventy-five older volunteers were randomly assigned to the experimental group trained for 16 sessions with non-action video games from Lumosity, a commercial platform (http://www.lumosity.com/) or to an active control group trained for the same number of sessions with simulation strategy games. The final sample included 55 older adults (30 in the experimental group and 25 in the active control group). Participants were tested individually before and after training to assess working memory (WM) and selective attention and also reported their perceived improvement, motivation and engagement. The results showed improved performance across the training sessions. The main results were: (1) the experimental group did not show greater improvements in measures of selective attention and working memory than the active control group (the opposite occurred in the oddball task); (2) a marginal training effect was observed for the N-back task, but not for the Stroop task while both groups improved in the Corsi Blocks task. Based on these results, one can conclude that training with non-action games provide modest benefits for untrained tasks. The effect is not specific for that kind of training as a similar effect was observed for strategy video games. Groups did not differ in motivation, engagement or expectations.

  10. [YANG's pricking-cupping therapy for knee osteoarthritis: a multi-center randomized controlled trial].

    Science.gov (United States)

    Wang, Bo; Liu, Xiru; Hu, Zhihai; Sun, Aijun; Ma, Yanwen; Chen, Yingying; Zhang, Xuzhi; Liu, Meiling; Wang, Yi; Wang, Shuoshuo; Zhang, Yunjia; Li, Yijing; Shen, Weidong

    2016-02-01

    To evaluate the clinical efficacy of YANG's pricking-cupping therapy for knee osteoarthritis (KOA). Methods: This was a multi-center randomized parallel controlled trial. One hundred and seventy-one patients with KOA were randomly allocated to a pricking-cupping group (89 cases) and a conventional acupuncture group (82 cases). Neixiyan (EX-LE 4), Dubi (ST 35) and ashi points were selected in the two groups. Patients in the pricking-cupping group were treated with YANG's pricking-cupping therapy; seven-star needles were used to perform pricking at the acupoints, then cupping was applied until slight bleeding was observed. Patients in the conventional acupuncture group were treated with semi-standardized filiform needle therapy. The treatment was given for 4 weeks (from a minimum of 5 sessions to a maximum of 10 sessions). The follow-up period was 4 weeks. The Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) and the visual analogue scale (VAS) were adopted for the efficacy assessments. The pain score, stiffness score, physical function score and total score of WOMAC were all reduced after the 4-week treatment and at the follow-up visit in the two groups (all P < 0.05); each score and the total score of WOMAC in the pricking-cupping group were lower than those in the conventional acupuncture group after the 4-week treatment and at the follow-up visit (all P < 0.05). Pricking-cupping and conventional acupuncture therapy can both significantly improve knee joint pain and function in patients with KOA, and both are relatively safe. The pricking-cupping therapy is superior to conventional acupuncture with an identical selection of acupoints.

  11. Evaluation of Randomly Selected Completed Medical Records Sheets in Teaching Hospitals of Jahrom University of Medical Sciences, 2009

    Directory of Open Access Journals (Sweden)

    Mohammad Parsa Mahjob

    2011-06-01

    Full Text Available Background and objective: Medical record documentation is often used to protect patients' legal rights and also provides information for medical research, general studies, education of health care staff and qualitative surveys. There is a need to control the amount of data entered in patients' medical record sheets, considering that the completion of these sheets is often carried out after service delivery to the patients. Therefore, in this study the completeness of medical history, operation report, and physician order sheets filled in by different documenters in Jahrom teaching hospitals during 2009 was analyzed. Methods and Materials: In this descriptive, retrospective study, 400 medical record sheets of patients from two teaching hospitals affiliated with Jahrom medical university were randomly selected. The data collection tool was a checklist based on the content of the medical history sheet, operation report and physician order sheets. The data were analyzed with SPSS (Version 10) software and Microsoft Office Excel 2003. Results: The averages of personal (demographic) data entered in the medical history, physician order and operation report sheets by departments' secretaries were 32.9, 35.8 and 40.18 percent. The average of clinical data entered by physicians in the medical history sheet was 38 percent. Surgical data entered by the surgeon in the operation report sheet was 94.77 percent. The average of data entered by operating room nurses in the operation report sheet was 36.78 percent, and the average of physician order data entered by physicians in the physician order sheet was 99.3 percent. Conclusion: According to this study, the rate of completed record sheets reviewed in Jahrom teaching hospitals was not desirable and in some cases was very weak and incomplete. This deficiency was due to different reasons such as medical record documenters' negligence, lack of adequate education for documenters, high work

  12. Two models of inventory control with supplier selection in case of multiple sourcing: a case of Isfahan Steel Company

    Science.gov (United States)

    Rabieh, Masood; Soukhakian, Mohammad Ali; Mosleh Shirazi, Ali Naghi

    2016-06-01

    Selecting the best suppliers is crucial for a company's success. Since competition is a determining factor nowadays, reducing cost and increasing quality of products are two key criteria for appropriate supplier selection. In this study, the inventories of the agglomeration plant of Isfahan Steel Company were first categorized through VED and ABC methods. Then models to supply two important kinds of raw materials (inventories) were developed, considering the following items: (1) the optimal consumption composite of the materials, (2) the total cost of logistics, (3) each supplier's terms and conditions, (4) the buyer's limitations and (5) the consumption behavior of the buyers. Among the diverse models developed and tested (using the company's actual data from the three previous years), two new innovative models of the mixed-integer non-linear programming type were found to be most suitable. The results of solving the two models with LINGO software (based on the company's data in this particular case) were equal. Comparing the results of the new models to the actual performance of the company revealed 10.9 and 7.1% reductions in the total procurement costs of the company in two consecutive years.

  13. A case-control study of Yersinia enterocolitica infections in Auckland.

    Science.gov (United States)

    Satterthwaite, P; Pritchard, K; Floyd, D; Law, B

    1999-10-01

    To identify major risk factors for Yersinia enterocolitica (YE) and identify measures to reduce YE infections. A prospective case control study, group age matched, using 186 cases of YE identified by community pathology laboratories and 379 randomly selected controls. Conducted between April 1995 and June 1996 in Auckland, New Zealand. Face-to-face interviews used a standardised questionnaire examining exposures to factors potentially associated with YE infections including untreated water, unreticulated sewerage, consumption of selected foods, selected food handling practices and socio-demographic factors. Multivariate logistic regression was used to calculate adjusted odds ratios for the potential risk factors. Population attributable risk (PAR) was calculated for significant exposures. Having more than two people living in the home was more common among cases than controls (OR = 2.2). Town supply water (OR = 0.2), reticulated sewerage (OR = 0.34) and looking after a young child (OR = 0.51) were significantly less common. Of the meats, only pork (OR = 1.34) had a higher consumption rate, while bacon (OR = 0.75) and smallgoods (OR = 0.73) were consumed less frequently by cases than controls. Eating food from a sandwich bar was more frequent among cases (OR = 1.18). Fruit and vegetable consumption was marginally less (OR = 0.98). The population attributable risk of these factors was 0.89, implying that 89% of YE would be eliminated if adverse exposures were removed. The risk of YE illness is increased by contact with untreated water, unreticulated sewerage and consumption of pork. Investigation of non-town water supply, informal sewerage systems and methods of preparation and consumption of pork are recommended to determine how YE enters the human food chain.

  14. Lines of Descent Under Selection

    Science.gov (United States)

    Baake, Ellen; Wakolbinger, Anton

    2017-11-01

    We review recent progress on ancestral processes related to mutation-selection models, both in the deterministic and the stochastic setting. We mainly rely on two concepts, namely, the killed ancestral selection graph and the pruned lookdown ancestral selection graph. The killed ancestral selection graph gives a representation of the type of a random individual from a stationary population, based upon the individual's potential ancestry back until the mutations that define the individual's type. The pruned lookdown ancestral selection graph allows one to trace the ancestry of individuals from a stationary distribution back into the distant past, thus leading to the stationary distribution of ancestral types. We illustrate the results by applying them to a prototype model for the error threshold phenomenon.

  15. Out of the box selection and application of UX evaluation methods and practical cases

    DEFF Research Database (Denmark)

    Obrist, Marianna; Knoche, Hendrik; Basapur, Santosh

    2013-01-01

    The scope of user experience supersedes the concept of usability and other performance oriented measures by including for example users' emotions, motivations and a strong focus on the context of use. The purpose of this tutorial is to motivate researchers and practitioners to think about ... the challenging questions around how to select and apply UX evaluation methods for different usage contexts, in particular for the "home" and "mobile" context, relevant for TV-based services. Next to a general understanding of UX evaluation and available methods, we will provide concrete UX evaluation case...

  16. N-state random switching based on quantum tunnelling

    Science.gov (United States)

    Bernardo Gavito, Ramón; Jiménez Urbanos, Fernando; Roberts, Jonathan; Sexton, James; Astbury, Benjamin; Shokeir, Hamzah; McGrath, Thomas; Noori, Yasir J.; Woodhead, Christopher S.; Missous, Mohamed; Roedig, Utz; Young, Robert J.

    2017-08-01

    In this work, we show how the hysteretic behaviour of resonant tunnelling diodes (RTDs) can be exploited for new functionalities. In particular, the RTDs exhibit a stochastic 2-state switching mechanism that could be useful for random number generation and cryptographic applications. This behaviour can be scaled to N-bit switching, by connecting various RTDs in series. The InGaAs/AlAs RTDs used in our experiments display very sharp negative differential resistance (NDR) peaks at room temperature which show hysteresis cycles that, rather than having a fixed switching threshold, show a probability distribution about a central value. We propose to use this intrinsic uncertainty emerging from the quantum nature of the RTDs as a source of randomness. We show that a combination of two RTDs in series results in devices with three-state outputs and discuss the possibility of scaling to N-state devices by subsequent series connections of RTDs, which we demonstrate for up to the 4-state case. In this work, we suggest that the intrinsic uncertainty in the conduction paths of resonant tunnelling diodes can behave as a source of randomness that can be integrated into current electronics to produce on-chip true random number generators. The N-shaped I-V characteristic of RTDs results in a two-level random voltage output when driven with current pulse trains. Electrical characterisation and randomness testing of the devices was conducted in order to determine the validity of the true randomness assumption. Based on the results obtained for the single RTD case, we suggest the possibility of using multi-well devices to generate N-state random switching devices for their use in random number generation or multi-valued logic devices.

  17. Botulinum toxin injection for hypercontractile or spastic esophageal motility disorders: may high-resolution manometry help to select cases?

    Science.gov (United States)

    Marjoux, S; Brochard, C; Roman, S; Gincul, R; Pagenault, M; Ponchon, T; Ropert, A; Mion, F

    2015-01-01

    Endoscopic injections of botulinum toxin in the cardia or distal esophagus have been advocated to treat achalasia and spastic esophageal motility disorders. We conducted a retrospective study to evaluate whether manometric diagnosis using the Chicago classification in high-resolution manometry (HRM) would be predictive of the clinical response. Charts of patients with spastic and hypertensive motility disorders diagnosed with HRM and treated with botulinum toxin were retrospectively reviewed at two centers. HRM recordings were systematically reanalyzed, and a patient's phone survey was conducted. Forty-five patients treated between 2008 and 2013 were included. Most patients had achalasia type 3 (22 cases). Other diagnoses were jackhammer esophagus (8 cases), distal esophageal spasm (7 cases), esophagogastric junction outflow obstruction (5 cases), nutcracker esophagus (1 case), and 2 unclassified cases. Botulinum toxin injections were performed into the cardia only in 9 cases, into the wall of the distal esophagus in 19 cases, and in both locations (cardia and distal esophagus) in 17 cases. No complication occurred in 31 cases. Chest pain was noticed for less than 7 days in 13 cases. One death related to mediastinitis occurred 3 weeks after botulinum toxin injection. Efficacy was assessed in 42 patients: 71% were significantly improved 2 months after botulinum toxin, and 57% remained satisfied for more than 6 months. No clear difference was observed in terms of response according to manometric diagnosis; however, type 3 achalasia previously dilated and with normal integrated relaxation pressure (4s-integrated relaxation pressure botulinum toxin. Endoscopic injections of botulinum toxin may be effective in some patients with spastic or hypercontractile esophageal motility disorders. The manometric Chicago classification diagnosis does not seem to predict the results. Prospective randomized trials are required to identify patients most likely to benefit from

  18. Potential self-selection bias in a nested case-control study on indoor environmental factors and their association with asthma and allergic symptoms among pre-school children

    DEFF Research Database (Denmark)

    Bornehag, Carl-Gustaf; Sundell, Jan; Sigsgaard, T.

    2006-01-01

    , including health, building characteristics of the home, and socioeconomic factors between participating and non-participating families in a nested case-control study on asthma and allergy among children. Information was collected in a baseline questionnaire to the parents of 14,077 children aged 1-6 years ... in a first step. In a second step 2,156 of the children were invited to participate in a case-control study. Of these, 198 cases and 202 controls were finally selected. For identifying potential selection bias, information concerning all invited families in the case-control study was obtained from

  19. Movement pathways and habitat selection by woodland caribou during spring migration

    Directory of Open Access Journals (Sweden)

    D. Joanne Saher

    2005-05-01

    Full Text Available Woodland caribou (Rangifer tarandus caribou) are a threatened species throughout Canada. Special management is therefore required to ensure habitat needs are met, particularly because much of their current distribution is heavily influenced by resource extraction activities. Although winter habitat is thought to be limiting and is the primary focus of conservation efforts, maintaining connectivity between summer and winter ranges has received little attention. We used global positioning system data from an interprovincial woodland caribou herd to define migratory movements on a relatively pristine range. Non-linear models indicated that caribou movement during migration was punctuated; caribou traveled for some distance (movement phase) followed by a pause (resting/foraging phase). We then developed resource selection functions (RSFs), using case-controlled logistic regression, to describe resting/foraging sites and movement sites at the landscape scale. The RSFs indicated that caribou traveled through areas that were less rugged and closer to water than random, and that resting/foraging sites were associated with older forests that have a greater component of pine and are further from water than were random available locations. This approach to analyzing animal location data allowed us to identify two patterns of habitat selection (travel and foraging/resting) for caribou during the migratory period. Resultant models are important tools for land use planning to ensure that connectivity between caribou summer and winter ranges is maintained.

  20. Selection of population controls for a Salmonella case-control study in the UK using a market research panel and web-survey provides time and resource savings.

    Science.gov (United States)

    Mook, P; Kanagarajah, S; Maguire, H; Adak, G K; Dabrera, G; Waldram, A; Freeman, R; Charlett, A; Oliver, I

    2016-04-01

    Timely recruitment of population controls in infectious disease outbreak investigations is challenging. We evaluated the timeliness and cost of using a market research panel as a sampling frame for recruiting controls in a case-control study during an outbreak of Salmonella Mikawasima in the UK in 2013. We deployed a web-survey by email to targeted members of a market research panel (panel controls) in parallel to the outbreak control team interviewing randomly selected public health staff by telephone and completing paper-based questionnaires (staff controls). Recruitment and completion of exposure history web-surveys for panel controls (n = 123) took 14 h compared to 15 days for staff controls (n = 82). The average staff-time cost per questionnaire for staff controls was £13·13 compared to an invoiced cost of £3·60 per panel control. Differences in the distribution of some exposures existed between these control groups but case-control studies using each group found that illness was associated with consumption of chicken outside of the home and chicken from local butchers. Recruiting market research panel controls offers time and resource savings. More rapid investigations would enable more prompt implementation of control measures. We recommend that this method of recruiting controls is considered in future investigations and assessed further to better understand strengths and limitations.

  1. Selecting the Best: Evolutionary Engineering of Chemical Production in Microbes.

    Science.gov (United States)

    Shepelin, Denis; Hansen, Anne Sofie Lærke; Lennen, Rebecca; Luo, Hao; Herrgård, Markus J

    2018-05-11

    Microbial cell factories have proven to be an economical means of production for many bulk, specialty, and fine chemical products. However, we still lack both a holistic understanding of organism physiology and the ability to predictively tune enzyme activities in vivo, thus slowing down rational engineering of industrially relevant strains. An alternative concept to rational engineering is to use evolution as the driving force to select for desired changes, an approach often described as evolutionary engineering. In evolutionary engineering, in vivo selections for a desired phenotype are combined with either generation of spontaneous mutations or some form of targeted or random mutagenesis. Evolutionary engineering has been used to successfully engineer easily selectable phenotypes, such as utilization of a suboptimal nutrient source or tolerance to inhibitory substrates or products. In this review, we focus primarily on a more challenging problem-the use of evolutionary engineering for improving the production of chemicals in microbes directly. We describe recent developments in evolutionary engineering strategies, in general, and discuss, in detail, case studies where production of a chemical has been successfully achieved through evolutionary engineering by coupling production to cellular growth.

  2. Feature Selection with the Boruta Package

    OpenAIRE

    Kursa, Miron B.; Rudnicki, Witold R.

    2010-01-01

    This article describes an R package, Boruta, implementing a novel feature selection algorithm for finding all relevant variables. The algorithm is designed as a wrapper around a Random Forest classification algorithm. It iteratively removes the features which are proved by a statistical test to be less relevant than random probes. The Boruta package provides a convenient interface to the algorithm. A short description of the algorithm and examples of its application are presented.
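
    As an illustration of the shadow-feature idea described above (the package itself is an R library; the sketch below re-creates only the core loop in Python, with an ad hoc stopping rule and without Boruta's statistical test, so treat it as an assumption-laden toy rather than the package's algorithm):

    # Toy version of the Boruta idea: permuted copies of the real features act
    # as "random probes"; any real feature whose random-forest importance does
    # not beat the best probe is dropped. Data and thresholds are illustrative.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    X, y = make_classification(n_samples=300, n_features=10,
                               n_informative=4, random_state=0)

    keep = np.arange(X.shape[1])            # candidate features still in play
    for _ in range(20):                     # a handful of elimination rounds
        Xc = X[:, keep]
        shadows = rng.permuted(Xc, axis=0)  # each column permuted independently
        forest = RandomForestClassifier(n_estimators=200, random_state=0)
        forest.fit(np.hstack([Xc, shadows]), y)
        importances = forest.feature_importances_
        real, shadow = importances[:len(keep)], importances[len(keep):]
        keep = keep[real > shadow.max()]    # drop features beaten by the probes
        if len(keep) == 0:
            break

    print("features judged relevant:", keep)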

  3. Empirical test of Capital Asset Pricing Model on Selected Banking Shares from Borsa Istanbul

    Directory of Open Access Journals (Sweden)

    Fuzuli Aliyev

    2018-03-01

    Full Text Available In this paper we tested the Capital Asset Pricing Model (CAPM hereafter) on selected banking stocks of Borsa Istanbul. Here we tried to explain how to price financial assets based on their risks in the case of the BIST-100 index. The CAPM is an important model in portfolio management theory used by economic agents for the selection of financial assets. We used monthly return data of 12 randomly chosen banking stocks for the 2001–2010 period. To test the validity of the CAPM, we first derived the regression equation for the relationship between the risk-free interest rate and the risk premium using January 2001–December 2009 data. Then we estimated January–December 2010 returns with the equation. Comparing the forecasted returns with the actual returns, we concluded that the CAPM is valid for the portfolio consisting of the 12 banks traded on the ISE, i.e., the model could predict the overall outcome of the portfolio of selected banking shares.
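
    A minimal sketch of the kind of test described above, with simulated monthly returns standing in for the Borsa Istanbul data (the rates, window lengths and the single simulated stock are assumptions, not the paper's inputs):

    # Estimate beta on an in-sample window, predict the hold-out year with
    # E[R_i] = Rf + beta * (Rm - Rf), and compare against realised returns.
    import numpy as np

    rng = np.random.default_rng(1)
    rf = 0.01                              # assumed monthly risk-free rate
    market = rng.normal(0.012, 0.05, 120)  # ten years of monthly market returns
    stock = rf + 1.2 * (market - rf) + rng.normal(0, 0.02, 120)  # true beta 1.2

    x, y = market[:108] - rf, stock[:108] - rf   # first nine years: estimation
    beta_hat = np.polyfit(x, y, 1)[0]            # OLS slope = estimated beta

    predicted = rf + beta_hat * (market[108:] - rf)   # last year: prediction
    realised = stock[108:]
    print(f"estimated beta: {beta_hat:.2f}")
    print(f"mean absolute prediction error: {np.abs(predicted - realised).mean():.4f}")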

  4. Improvement of Automated POST Case Success Rate Using Support Vector Machines

    Science.gov (United States)

    Zwack, Matthew R.; Dees, Patrick D.

    2017-01-01

    During early conceptual design of complex systems, concept down selection can have a large impact upon program life-cycle cost. Therefore, any concepts selected during early design will inherently commit program costs and affect the overall probability of program success. For this reason it is important to consider as large a design space as possible in order to better inform the down selection process. For conceptual design of launch vehicles, trajectory analysis and optimization often presents the largest obstacle to evaluating large trade spaces. This is due to the sensitivity of the trajectory discipline to changes in all other aspects of the vehicle design. Small deltas in the performance of other subsystems can result in relatively large fluctuations in the ascent trajectory because the solution space is non-linear and multi-modal [1]. In order to help capture large design spaces for new launch vehicles, the authors have performed previous work seeking to automate the execution of the industry standard tool, Program to Optimize Simulated Trajectories (POST). This work initially focused on implementation of analyst heuristics to enable closure of cases in an automated fashion, with the goal of applying the concepts of design of experiments (DOE) and surrogate modeling to enable near instantaneous throughput of vehicle cases [2]. Additional work was then completed to improve the DOE process by utilizing a graph theory based approach to connect similar design points [3]. The conclusion of the previous work illustrated the utility of the graph theory approach for completing a DOE through POST. However, this approach was still dependent upon the use of random repetitions to generate seed points for the graph. As noted in [3], only 8% of these random repetitions resulted in converged trajectories. This ultimately affects the ability of the random reps method to confidently approach the global optima for a given vehicle case in a reasonable amount of time. With only

  5. Efficient Use of Historical Data for Genomic Selection: A Case Study of Stem Rust Resistance in Wheat

    Directory of Open Access Journals (Sweden)

    J. Rutkoski

    2015-03-01

    Full Text Available Genomic selection (GS) is a methodology that can improve crop breeding efficiency. To implement GS, a training population (TP) with phenotypic and genotypic data is required to train a statistical model used to predict genotyped selection candidates (SCs). A key factor impacting prediction accuracy is the relationship between the TP and the SCs. This study used empirical data for quantitative adult plant resistance to stem rust of wheat (Triticum aestivum L.) to investigate the utility of a historical TP compared with a population-specific TP, the potential for TP optimization, and the utility of TP data when close relative data is available for training. We found that, depending on the population size, one TP was 1.5 to 4.4 times more accurate than the other, and TP optimization based on the mean of the generalized coefficient of determination or prediction error variance enabled the selection of subsets that led to significantly higher accuracy than randomly selected subsets. Retaining historical data when data on close relatives were available led to an 11.9% increase in accuracy, at best, and a 12% decrease in accuracy, at worst, depending on the heritability. We conclude that historical data could be used successfully to initiate a GS program, especially if the dataset is very large and of high heritability. Training population optimization would be useful for the identification of TP subsets to phenotype additional traits. However, after model updating, discarding historical data may be warranted. More studies are needed to determine if these observations represent general trends.

  6. Developing effective web-based regional anesthesia education: a randomized study evaluating case-based versus non-case-based module design.

    Science.gov (United States)

    Kopp, Sandra L; Smith, Hugh M

    2011-01-01

    Little is known about the use of Web-based education in regional anesthesia training. Benefits of Web-based education include the ability to standardize learning material quality and content, build appropriate learning progressions, use interactive multimedia technologies, and individualize delivery of course materials. The goals of this investigation were (1) to determine whether module design influences regional anesthesia knowledge acquisition, (2) to characterize learner preference patterns among anesthesia residents, and (3) to determine whether learner preferences play a role in knowledge acquisition. Direct comparison of knowledge assessments, learning styles, and learner preferences will be made between an interactive case-based and a traditional textbook-style module design. Forty-three Mayo Clinic anesthesiology residents completed 2 online modules, a knowledge pretest, posttest, an Index of Learning Styles assessment, and a participant satisfaction survey. Interscalene and lumbar plexus regional techniques were selected as the learning content for 4 Web modules constructed using the Blackboard Vista coursework application. One traditional textbook-style module and 1 interactive case-based module were designed for each of the interscalene and lumbar plexus techniques. Participants scored higher on the postmodule knowledge assessment for both the interscalene and lumbar plexus modules. Postmodule knowledge performance scores were independent of both module design (interactive case-based versus traditional textbook style) and learning style preferences. However, nearly all participants reported a preference for Web-based learning and believe that it should be used in anesthesia resident education. Participants did not feel that Web-based learning should replace the current lecture-based curriculum. All residents scored higher on the postmodule knowledge assessment, but this improvement was independent of the module design and individual learning styles

  7. Evaluating malaria case management at public health facilities in two provinces in Angola.

    Science.gov (United States)

    Plucinski, Mateusz M; Ferreira, Manzambi; Ferreira, Carolina Miguel; Burns, Jordan; Gaparayi, Patrick; João, Lubaki; da Costa, Olinda; Gill, Parambir; Samutondo, Claudete; Quivinja, Joltim; Mbounga, Eliane; de León, Gabriel Ponce; Halsey, Eric S; Dimbu, Pedro Rafael; Fortes, Filomeno

    2017-05-03

    Malaria accounts for the largest portion of healthcare demand in Angola. A pillar of malaria control in Angola is the appropriate management of malaria illness, including testing of suspect cases with rapid diagnostic tests (RDTs) and treatment of confirmed cases with artemisinin-based combination therapy (ACT). Periodic systematic evaluations of malaria case management are recommended to measure health facility readiness and adherence to national case management guidelines. Cross-sectional health facility surveys were performed in low-transmission Huambo and high-transmission Uíge Provinces in early 2016. In each province, 45 health facilities were randomly selected from among all public health facilities stratified by level of care. Survey teams performed inventories of malaria commodities and conducted exit interviews and re-examinations, including RDT testing, of a random selection of all patients completing outpatient consultations. Key health facility readiness and case management indicators were calculated adjusting for the cluster sampling design and utilization. Availability of RDTs or microscopy on the day of the survey was 71% (54-83) in Huambo and 85% (67-94) in Uíge. At least one unit dose pack of one formulation of an ACT (usually artemether-lumefantrine) was available in 83% (66-92) of health facilities in Huambo and 79% (61-90) of health facilities in Uíge. Testing rates of suspect malaria cases in Huambo were 30% (23-38) versus 69% (53-81) in Uíge. Overall, 28% (13-49) of patients with uncomplicated malaria, as determined during the re-examination, were appropriately treated with an ACT with the correct dose in Huambo, compared to 60% (42-75) in Uíge. Incorrect case management of suspect malaria cases was associated with lack of healthcare worker training in Huambo and ACT stock-outs in Uíge. The results reveal important differences between provinces. Despite similar availability of testing and ACT, testing and treatment rates were lower in

  8. Intake of Total and Subgroups of Fat Minimally Affect the Associations between Selected Single Nucleotide Polymorphisms in the PPARγ Pathway and Changes in Anthropometry among European Adults from Cohorts of the DiOGenes Study

    DEFF Research Database (Denmark)

    Larsen, Sofus C; Ängquist, Lars; Østergaard, Jane N

    2016-01-01

    nucleotide polymorphisms (SNPs) within 4 genes in the PPARγ pathway are associated with the OR of being a BW gainer or with annual changes in anthropometry and whether intake of total fat, monounsaturated fat, polyunsaturated fat, or saturated fat has a modifying effect on these associations. METHODS: A case-noncase study included 11,048 men and women from cohorts in the European Diet, Obesity and Genes study; 5552 were cases, defined as individuals with the greatest BW gain during follow-up, and 6548 were randomly selected, including 5496 noncases. We selected 4 genes [CCAAT/enhancer binding protein β (CEBPB

  9. Validity of Qualis database as a predictor of evidence hierarchy and risk of bias in randomized controlled trials: a case study in dentistry

    Directory of Open Access Journals (Sweden)

    Christiane Alves Ferreira

    2011-01-01

    Full Text Available OBJECTIVE: To evaluate the validity of the Qualis database in identifying the levels of scientific evidence and the quality of randomized controlled trials indexed in the Lilacs database. METHODS: We selected 40 open-access journals and performed a page-by-page hand search, to identify published articles according to the type of study during a period of six years. Classification of studies was performed by independent reviewers assessed for their reliability. Randomized controlled trials were identified for separate evaluation of risk of bias using four dimensions: generation of allocation sequence, allocation concealment, blinding, and incomplete outcome data. The Qualis classification was considered to be the outcome variable. The statistical tests used included Kappa, Spearman's correlation, Kendall-tau and ordinal regressions. RESULTS: Studies with low levels of scientific evidence received similar Qualis classifications when compared to studies with high levels of evidence. In addition, randomized controlled trials with a high risk of bias for the generation of allocation sequences and allocation concealment were more likely to be published in journals with higher Qualis levels. DISCUSSION: The hierarchy level of the scientific evidence as classified by type of research design, as well as by the validity of studies according to the bias control level, was not correlated or associated with Qualis stratification. CONCLUSION: Qualis classifications for journals are not an approximate or indirect predictor of the validity of randomized controlled trials published in these journals and are therefore not a legitimate or appropriate indicator of the validity of randomized controlled trials.

  10. The hard-core model on random graphs revisited

    International Nuclear Information System (INIS)

    Barbier, Jean; Krzakala, Florent; Zhang, Pan; Zdeborová, Lenka

    2013-01-01

    We revisit the classical hard-core model, also known as independent set and dual to the vertex cover problem, where one puts particles with a first-neighbor hard-core repulsion on the vertices of a random graph. Although the cases of random graphs with small and with very large average degrees are quite well understood, they yield qualitatively different results and our aim here is to reconcile these two cases. We revisit results that can be obtained using the (heuristic) cavity method and show that it provides a closed-form conjecture for the exact density of the densest packing on random regular graphs with degree K ≥ 20, and that for K > 16 the nature of the phase transition is the same as for large K. This also shows that the hard-core model is the simplest mean-field lattice model for structural glasses and jamming

  11. Chemical library subset selection algorithms: a unified derivation using spatial statistics.

    Science.gov (United States)

    Hamprecht, Fred A; Thiel, Walter; van Gunsteren, Wilfred F

    2002-01-01

    If similar compounds have similar activity, rational subset selection becomes superior to random selection in screening for pharmacological lead discovery programs. Traditional approaches to this experimental design problem fall into two classes: (i) a linear or quadratic response function is assumed, or (ii) some space-filling criterion is optimized. The assumptions underlying the first approach are clear but not always defendable; the second approach yields more intuitive designs but lacks a clear theoretical foundation. We model activity in a bioassay as the realization of a stochastic process and use the best linear unbiased estimator to construct spatial sampling designs that optimize the integrated mean square prediction error, the maximum mean square prediction error, or the entropy. We argue that our approach constitutes a unifying framework encompassing most proposed techniques as limiting cases and sheds light on their underlying assumptions. In particular, vector quantization is obtained, in dimensions up to eight, in the limiting case of very smooth response surfaces for the integrated mean square error criterion. Closest packing is obtained for very rough surfaces under the integrated mean square error and entropy criteria. We suggest using either the integrated mean square prediction error or the entropy as optimization criteria rather than approximations thereof and propose a scheme for direct iterative minimization of the integrated mean square prediction error. Finally, we discuss how the quality of chemical descriptors manifests itself and clarify the assumptions underlying the selection of diverse or representative subsets.
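
    The space-filling limiting case mentioned in the abstract can be illustrated with a simple greedy maximin rule (this sketch covers only that limiting case, on made-up descriptors; it does not implement the integrated mean square prediction error or entropy criteria themselves):

    # Greedy maximin ("farthest point") subset selection: repeatedly add the
    # compound whose minimum distance to the already-selected set is largest.
    import numpy as np

    def maximin_subset(descriptors, k):
        selected = [0]                                   # arbitrary first pick
        dist_to_set = np.linalg.norm(descriptors - descriptors[0], axis=1)
        while len(selected) < k:
            nxt = int(np.argmax(dist_to_set))            # farthest from the set
            selected.append(nxt)
            dist_to_set = np.minimum(
                dist_to_set,
                np.linalg.norm(descriptors - descriptors[nxt], axis=1))
        return selected

    rng = np.random.default_rng(2)
    library = rng.random((500, 6))    # 500 hypothetical compounds, 6 descriptors
    print(maximin_subset(library, 10))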

  12. Case Studies on Timber Defects of Selected Traditional Houses in Malacca

    Directory of Open Access Journals (Sweden)

    Nor Haniza Ishak

    2007-12-01

    Full Text Available The effect of adverse environmental conditions on building materials and the extent of damage caused depend on both the materials used and the environmental conditions. Although timber is a diminishing resource, it is still widely used in today's construction. In Malaysia, timber is one of the main components of many historic buildings. Appropriate maintenance of such buildings requires an understanding of timber defects and their related problems. Timber defects are classified into two major groups: non-biological and biological deterioration. Non-biological deterioration consists of physical decay, excessive moisture content, dimensional instability and chemical deterioration. These defects are mainly caused by the timber in service being subjected to environmental exposure. The most common and destructive biological deteriorations of timber are those due to dry rot, wet rot and insect attacks. A study based on seven selected houses was conducted to identify the most common building defects, specifically in timber components, amongst traditional Malay houses in Malacca, Malaysia. A building condition survey was carried out to determine the effect of the environment on timber buildings and their main components. Data collected were based on the investigation and visual observation of the selected case studies. Findings of this research will serve as an indicator towards maintaining the buildings' timber components in good condition so that the buildings' life span can be extended and, primarily, to conserve the valuable traditional timber houses in a historic city.

  13. Pseudo-random-number generators and the square site percolation threshold.

    Science.gov (United States)

    Lee, Michael J

    2008-09-01

    Selected pseudo-random-number generators are applied to a Monte Carlo study of the two-dimensional square-lattice site percolation model. A generator suitable for high precision calculations is identified from an application specific test of randomness. After extended computation and analysis, an ostensibly reliable value of p_{c}=0.59274598(4) is obtained for the percolation threshold.
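
    For orientation, a rough Monte Carlo sketch of the quantity being estimated (the spanning probability of a square-lattice site configuration) is given below; the lattice size and trial counts are tiny illustrative choices, nowhere near what the quoted precision requires, and the generator is simply NumPy's default:

    # Estimate the top-to-bottom spanning probability at a few occupation
    # probabilities around the quoted threshold p_c ≈ 0.5927.
    import numpy as np
    from scipy.ndimage import label

    def spans(p, L, rng):
        occupied = rng.random((L, L)) < p
        labels, _ = label(occupied)                  # 4-connected clusters
        top, bottom = set(labels[0]) - {0}, set(labels[-1]) - {0}
        return bool(top & bottom)                    # one cluster touches both edges

    rng = np.random.default_rng(3)
    for p in (0.55, 0.5927, 0.65):
        hits = sum(spans(p, 128, rng) for _ in range(200))
        print(f"p = {p:.4f}: spanning probability ≈ {hits / 200:.2f}")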

  14. Algebraic polynomials with random coefficients

    Directory of Open Access Journals (Sweden)

    K. Farahmand

    2002-01-01

    Full Text Available This paper provides an asymptotic value for the mathematical expected number of points of inflection of a random polynomial of the form $a_0(\omega) + a_1(\omega)\binom{n}{1}^{1/2} x + a_2(\omega)\binom{n}{2}^{1/2} x^2 + \cdots + a_n(\omega)\binom{n}{n}^{1/2} x^n$ when $n$ is large. The coefficients $\{a_j(\omega)\}_{j=0}^{n}$, $\omega \in \Omega$, are assumed to be a sequence of independent normally distributed random variables with mean zero and variance one, each defined on a fixed probability space $(A, \Omega, \Pr)$. A special case of dependent coefficients is also studied.

  15. Effect of mirtazapine versus selective serotonin reuptake inhibitors on benzodiazepine use in patients with major depressive disorder: a pragmatic, multicenter, open-label, randomized, active-controlled, 24-week trial.

    Science.gov (United States)

    Hashimoto, Tasuku; Shiina, Akihiro; Hasegawa, Tadashi; Kimura, Hiroshi; Oda, Yasunori; Niitsu, Tomihisa; Ishikawa, Masatomo; Tachibana, Masumi; Muneoka, Katsumasa; Matsuki, Satoshi; Nakazato, Michiko; Iyo, Masaomi

    2016-01-01

    This study aimed to evaluate whether selecting mirtazapine as the first choice for current depressive episode instead of selective serotonin reuptake inhibitors (SSRIs) reduces benzodiazepine use in patients with major depressive disorder (MDD). We concurrently examined the relationship between clinical responses and serum mature brain-derived neurotrophic factor (BDNF) and its precursor, proBDNF. We conducted an open-label randomized trial in routine psychiatric practice settings. Seventy-seven MDD outpatients were randomly assigned to the mirtazapine or predetermined SSRIs groups, and investigators arbitrarily selected sertraline or paroxetine. The primary outcome was the proportion of benzodiazepine users at weeks 6, 12, and 24 between the groups. We defined patients showing a ≥50 % reduction in Hamilton depression rating scale (HDRS) scores from baseline as responders. Blood samples were collected at baseline, weeks 6, 12, and 24. Sixty-five patients prescribed benzodiazepines from prescription day 1 were analyzed for the primary outcome. The percentage of benzodiazepine users was significantly lower in the mirtazapine than in the SSRIs group at weeks 6, 12, and 24 (21.4 vs. 81.8 %; 11.1 vs. 85.7 %, both P  depressive episodes may reduce benzodiazepine use in patients with MDD. Trial registration UMIN000004144. Registered 2nd September 2010. The date of enrolment of the first participant to the trial was 24th August 2010. This study was retrospectively registered 9 days after the first participant was enrolled.

  16. The impact of the ECHR on private international law: An analysis of Strasbourg and selected national case law

    NARCIS (Netherlands)

    Kiestra, L.R.

    2013-01-01

    In this research the interaction between the rights guaranteed in the European Convention of Human Rights (ECHR) and private international law has been analyzed by examining the case law of the European Court of Human Rights (the Court) in Strasbourg and selected national courts. In doing so the

  17. Pseudo-random number generator for the Sigma 5 computer

    Science.gov (United States)

    Carroll, S. N.

    1983-01-01

    A technique is presented for developing a pseudo-random number generator based on the linear congruential form. The two numbers used for the generator are a prime number and a corresponding primitive root, where the prime is the largest prime number that can be accurately represented on a particular computer. The primitive root is selected by applying Marsaglia's lattice test. The technique presented was applied to write a random number program for the Sigma 5 computer. The new program, named S:RANDOM1, is judged to be superior to the older program named S:RANDOM. For applications requiring several independent random number generators, a table is included showing several acceptable primitive roots. The technique and programs described can be applied to any computer having word length different from that of the Sigma 5.
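
    The construction described above (a multiplicative congruential generator whose modulus is a large prime and whose multiplier is a primitive root of that prime) can be sketched as follows; the Sigma 5 word length and the paper's own constants are not reproduced here, so the familiar 31-bit Park-Miller pair is used purely as an example of the idea:

    # Lehmer-style generator: x_{k+1} = a * x_k mod m, outputs scaled to (0, 1).
    M = 2**31 - 1      # a Mersenne prime
    A = 16807          # 7**5, a primitive root modulo M

    def lehmer(seed):
        x = seed % M or 1              # the state must be nonzero
        while True:
            x = (A * x) % M
            yield x / M

    gen = lehmer(seed=12345)
    print([round(next(gen), 6) for _ in range(5)])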

  18. Review of Random Phase Encoding in Volume Holographic Storage

    Directory of Open Access Journals (Sweden)

    Wei-Chia Su

    2012-09-01

    Full Text Available Random phase encoding is a unique technique for volume hologram which can be applied to various applications such as holographic multiplexing storage, image encryption, and optical sensing. In this review article, we first review and discuss diffraction selectivity of random phase encoding in volume holograms, which is the most important parameter related to multiplexing capacity of volume holographic storage. We then review an image encryption system based on random phase encoding. The alignment of phase key for decryption of the encoded image stored in holographic memory is analyzed and discussed. In the latter part of the review, an all-optical sensing system implemented by random phase encoding and holographic interconnection is presented.

  19. Managing salinity in Upper Colorado River Basin streams: Selecting catchments for sediment control efforts using watershed characteristics and random forests models

    Science.gov (United States)

    Tillman, Fred; Anning, David W.; Heilman, Julian A.; Buto, Susan G.; Miller, Matthew P.

    2018-01-01

    Elevated concentrations of dissolved-solids (salinity) including calcium, sodium, sulfate, and chloride, among others, in the Colorado River cause substantial problems for its water users. Previous efforts to reduce dissolved solids in upper Colorado River basin (UCRB) streams often focused on reducing suspended-sediment transport to streams, but few studies have investigated the relationship between suspended sediment and salinity, or evaluated which watershed characteristics might be associated with this relationship. Are there catchment properties that may help in identifying areas where control of suspended sediment will also reduce salinity transport to streams? A random forests classification analysis was performed on topographic, climate, land cover, geology, rock chemistry, soil, and hydrologic information in 163 UCRB catchments. Two random forests models were developed in this study: one for exploring stream and catchment characteristics associated with stream sites where dissolved solids increase with increasing suspended-sediment concentration, and the other for predicting where these sites are located in unmonitored reaches. Results of variable importance from the exploratory random forests models indicate that no simple source, geochemical process, or transport mechanism can easily explain the relationship between dissolved solids and suspended sediment concentrations at UCRB monitoring sites. Among the most important watershed characteristics in both models were measures of soil hydraulic conductivity, soil erodibility, minimum catchment elevation, catchment area, and the silt component of soil in the catchment. Predictions at key locations in the basin were combined with observations from selected monitoring sites, and presented in map-form to give a complete understanding of where catchment sediment control practices would also benefit control of dissolved solids in streams.
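
    A schematic version of the workflow described above is sketched below with simulated catchments; the feature names, labels and data are hypothetical stand-ins rather than the UCRB dataset:

    # Fit a random forests classifier and rank watershed characteristics by
    # importance, mirroring the exploratory model in the abstract.
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(4)
    n = 163
    X = pd.DataFrame({
        "soil_hydraulic_conductivity": rng.lognormal(0.0, 1.0, n),
        "soil_erodibility": rng.random(n),
        "min_elevation_m": rng.uniform(1000, 3500, n),
        "catchment_area_km2": rng.lognormal(4.0, 1.0, n),
        "soil_silt_fraction": rng.random(n),
    })
    # Hypothetical label: does dissolved solids rise with suspended sediment?
    y = (X["soil_erodibility"] + 0.3 * rng.standard_normal(n) > 0.5).astype(int)

    model = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
    model.fit(X, y)
    print("out-of-bag accuracy:", round(model.oob_score_, 2))
    print(pd.Series(model.feature_importances_, index=X.columns)
            .sort_values(ascending=False))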

  20. Randomizer for High Data Rates

    Science.gov (United States)

    Garon, Howard; Sank, Victor J.

    2018-01-01

    NASA as well as a number of other space agencies now recognize that the current recommended CCSDS randomizer used for telemetry (TM) is too short. When multiple applications of the PN8 Maximal Length Sequence (MLS) are required in order to fully cover a channel access data unit (CADU), spectral problems in the form of elevated spurious discretes (spurs) appear. Originally the randomizer was called a bit transition generator (BTG) precisely because it was thought that its primary value was to insure sufficient bit transitions to allow the bit/symbol synchronizer to lock and remain locked. We, NASA, have shown that the old BTG concept is a limited view of the real value of the randomizer sequence and that the randomizer also aids in signal acquisition as well as minimizing the potential for false decoder lock. Under the guidelines we considered here there are multiple maximal length sequences under GF(2) which appear attractive in this application. Although there may be mitigating reasons why another MLS sequence could be selected, one sequence in particular possesses a combination of desired properties which offsets it from the others.
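
    To make the mechanism concrete, the sketch below applies a short maximal-length-sequence randomizer to a frame of zeros: an 8-bit LFSR produces a 255-bit PN sequence that is XORed, repeating, over the bit stream. The tap set corresponds to the commonly cited CCSDS PN8 polynomial x^8 + x^7 + x^5 + x^3 + 1 with an all-ones seed, but the polynomial, bit ordering and phase should be treated as assumptions here, not a bit-exact rendering of the standard:

    # Generate one period of a PN sequence with a Fibonacci LFSR and XOR it
    # onto a frame; a long run of zeros gains transitions, which is the point
    # of the randomizer.
    def pn_sequence(taps=(8, 7, 5, 3), length=255):
        state = [1] * 8                      # all-ones initial state
        out = []
        for _ in range(length):
            out.append(state[0])             # output the oldest bit
            feedback = 0
            for t in taps:
                feedback ^= state[8 - t]
            state = state[1:] + [feedback]
        return out

    def randomize(frame_bits, pn):
        return [b ^ pn[i % len(pn)] for i, b in enumerate(frame_bits)]

    pn = pn_sequence()
    frame = [0] * 40                         # worst case: no transitions at all
    print(randomize(frame, pn)[:16])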

  1. MONETARY TRANSMISSION CHANNELS IN FLEXIBLE MONETARY AND EXCHANGE RATE REGIMES: THE CASE OF SELECTED TRANSITION ECONOMIES

    OpenAIRE

    JOSIFIDIS, Kosta; PUCAR, Emilija Beker; SUPIĆ, Novica

    2010-01-01

    The paper explores selected monetary transmission channels in the case of transition economies, namely the exchange rate channel, the interest rate channel, and direct and indirect influences on the exchange rate. Specific (former) transition economies are differentiated according to the combination of implemented monetary and exchange rate regimes: exchange rate as a nominal anchor and rigid exchange rate regimes, exchange rate as a nominal anchor and intermediate exchange rate regimes, a...

  2. Boundary conditions in random sequential adsorption

    Science.gov (United States)

    Cieśla, Michał; Ziff, Robert M.

    2018-04-01

    The influence of different boundary conditions on the density of random packings of disks is studied. Packings are generated using the random sequential adsorption algorithm with three different types of boundary conditions: periodic, open, and wall. It is found that the finite size effects are smallest for periodic boundary conditions, as expected. On the other hand, in the case of open and wall boundaries it is possible to introduce an effective packing size and a constant correction term to significantly improve the packing densities.
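
    A minimal sketch of the algorithm with periodic boundary conditions is given below (unit-diameter disks, a small box and a modest number of attempts chosen purely for illustration; the study's system sizes and statistics are far larger):

    # Random sequential adsorption of disks: drop trial centres uniformly and
    # keep a trial only if it overlaps no adsorbed disk, using minimum-image
    # distances to realise the periodic boundary.
    import numpy as np

    def rsa_periodic(box=20.0, diameter=1.0, attempts=200_000, seed=5):
        rng = np.random.default_rng(seed)
        centers = np.empty((0, 2))
        for _ in range(attempts):
            trial = rng.random(2) * box
            if centers.size:
                delta = np.abs(centers - trial)
                delta = np.minimum(delta, box - delta)   # minimum-image convention
                if (np.hypot(delta[:, 0], delta[:, 1]) < diameter).any():
                    continue                             # overlap: reject trial
            centers = np.vstack([centers, trial])
        return len(centers) * np.pi * (diameter / 2) ** 2 / box**2

    # Finite attempts stop somewhat short of the known jamming density (~0.547).
    print(f"packing density ≈ {rsa_periodic():.3f}")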

  3. Simulating WTP Values from Random-Coefficient Models

    OpenAIRE

    Maurus Rischatsch

    2009-01-01

    Discrete Choice Experiments (DCEs) designed to estimate willingness-to-pay (WTP) values are very popular in health economics. With increased computation power and advanced simulation techniques, random-coefficient models have gained an increasing importance in applied work as they allow for taste heterogeneity. This paper discusses the parametrical derivation of WTP values from estimated random-coefficient models and shows how these values can be simulated in cases where they do not have a kn...
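
    The simulation idea can be sketched as follows; the coefficient distributions and their parameters are hypothetical estimates, not values from the paper:

    # Draw individual-level coefficients from an estimated random-coefficient
    # (mixed logit) model and form WTP = -beta_attribute / beta_price, whose
    # distribution is then summarised from the simulated draws.
    import numpy as np

    rng = np.random.default_rng(6)
    draws = 100_000
    beta_attr = rng.normal(0.8, 0.3, draws)         # attribute taste ~ Normal
    beta_price = -rng.lognormal(-0.5, 0.4, draws)   # price coefficient kept negative

    wtp = -beta_attr / beta_price
    print(f"mean WTP: {wtp.mean():.2f}, median WTP: {np.median(wtp):.2f}")
    print(f"95% interval: [{np.quantile(wtp, 0.025):.2f}, {np.quantile(wtp, 0.975):.2f}]")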

  4. Feature Selection with the Boruta Package

    Directory of Open Access Journals (Sweden)

    Miron B. Kursa

    2010-10-01

    Full Text Available This article describes an R package, Boruta, implementing a novel feature selection algorithm for finding all relevant variables. The algorithm is designed as a wrapper around a Random Forest classification algorithm. It iteratively removes the features which are proved by a statistical test to be less relevant than random probes. The Boruta package provides a convenient interface to the algorithm. A short description of the algorithm and examples of its application are presented.

  5. [Scrotal temperature in 258 healthy men, randomly selected from a population of men aged 18 to 23 years old. Statistical analysis, epidemiologic observations, and measurement of the testicular diameters].

    Science.gov (United States)

    Valeri, A; Mianné, D; Merouze, F; Bujan, L; Altobelli, A; Masson, J

    1993-06-01

    Scrotal hyperthermia can induce certain alterations in spermatogenesis. The basal scrotal temperature used to define hyperthermia is usually 33 degrees C. However, no study conducted according to a strict methodology has validated this mean measurement. We therefore randomly selected 258 men between the ages of 18 and 23 years from a population of 2,000 young French men seen at the National Service Selection Centre in order to measure the scrotal temperature over each testis and at the median raphe, and to determine the mean and median values for these temperatures. For a mean room temperature of 23 +/- 0.5 degrees C with a range of 18 to 31 degrees C, the mean right and left scrotal temperature was 34.2 +/- 0.1 degrees C and the mean medioscrotal temperature was 34.4 +/- 0.1 degrees C. Scrotal temperature was very significantly correlated with room temperature and its variations. It was therefore impossible to define a normal value for scrotal temperature. Only measurement of scrotal temperature at a neutral room temperature, between 21 and 25 degrees C, is able to provide a reference value for scrotal temperature. In this study, the mean scrotal temperature under these conditions was 34.4 +/- 0.2 degrees C, i.e. 2.5 degrees C less than body temperature. In the 12.9% of cases with left varicocele, left scrotal temperature was significantly higher than in the absence of varicocele and was also higher than right scrotal temperature. The authors also determined the dimensions of the testes.(ABSTRACT TRUNCATED AT 250 WORDS)

  6. Strategic decisions in transport: a case study for a naval base selection in Brazil

    Directory of Open Access Journals (Sweden)

    Amaury Caruzzo

    2016-04-01

    Full Text Available A decision on a military strategic environment, such as the selection of a new naval base, is a complex process and involves various criteria. In this context, few studies are available on the problems of military-naval transport decisions. Therefore, the aim of this paper is to present a maritime transport case study using a multi-methodology framework in a process of strategic decision making in logistics. Through a review of the literature, normative documents from the Brazilian armed forces, and interviews with military officers, criteria and preferences were identified and a hierarchical structure was constructed for a case study in the Brazilian Navy–the location of the second Fleet Headquarters. The results indicated that São Marcos Bay, in Maranhão State, was the best location among the alternatives. The multi-criteria approach was shown to be a valuable tool in assisting the decision making process and to understand the trade-offs between strategic and operational criteria in a transport decision.

  7. Redefining the Practice of Peer Review Through Intelligent Automation Part 2: Data-Driven Peer Review Selection and Assignment.

    Science.gov (United States)

    Reiner, Bruce I

    2017-12-01

    In conventional radiology peer review practice, a small number of exams (routinely 5% of the total volume) is randomly selected, which may significantly underestimate the true error rate within a given radiology practice. An alternative and preferable approach would be to create a data-driven model which mathematically quantifies a peer review risk score for each individual exam and uses this data to identify high risk exams and readers, and selectively target these exams for peer review. An analogous model can also be created to assist in the assignment of these peer review cases in keeping with specific priorities of the service provider. An additional option to enhance the peer review process would be to assign the peer review cases in a truly blinded fashion. In addition to eliminating traditional peer review bias, this approach has the potential to better define exam-specific standard of care, particularly when multiple readers participate in the peer review process.
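
    The article does not spell out a particular scoring model, so the sketch below is a hypothetical illustration of the idea only: score each exam with a weighted combination of invented risk-related attributes and route the highest-scoring exams to peer review instead of a flat random 5%:

    # Rank exams by a toy risk score and select the top slice for peer review.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(7)
    exams = pd.DataFrame({
        "exam_complexity": rng.random(1000),            # 0 = simple, 1 = complex
        "reader_recent_error_rate": rng.beta(1, 20, 1000),
        "off_hours_read": rng.integers(0, 2, 1000),
    })
    weights = {"exam_complexity": 0.5,
               "reader_recent_error_rate": 0.4,
               "off_hours_read": 0.1}

    exams["risk_score"] = sum(w * exams[c] for c, w in weights.items())
    selected = exams.nlargest(50, "risk_score")         # targeted 5%, not random
    print(selected.head())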

  8. Forty cases of gastrointestinal neurosis treated by acupuncture.

    Science.gov (United States)

    Zhao, Yaping; Ding, Min; Wang, Yanjun

    2008-03-01

    To compare the therapeutic effect of acupuncture for gastrointestinal neurosis with that of an oral remedy. Eighty cases were randomly divided into the following 2 groups. In the treatment group, acupuncture was given for one month at the main points of Zhongwan (CV 12), Zusanli (ST 36), Taichong (LR 3) and Shenmen (HT 7), with the auxiliary points selected according to TCM differentiation. In the control group, Domperidone was orally administered for one month. The total effective rate was 92.5% in the treatment group and 75.0% in the control group, with a significant difference between the 2 groups (chi2 = 4.423, P < 0.05). Acupuncture has a better therapeutic effect than the oral remedy for gastrointestinal neurosis, with less toxic side effects.

  9. Scarab/Saffron Development Project Case study: Material Selection Criteria for the Monoethylene Glycol Recovery Package

    International Nuclear Information System (INIS)

    Moussa, A.M.; Habib, S.; Shinaishin, A.

    2004-01-01

    The MEG Recovery Unit for the Scarab/Saffron development project is the first application of its kind in gas production. The Mono Ethylene Glycol (MEG) Recovery Unit recovers MEG from a water/MEG stream and removes salts and other contaminants. The MEG Recovery Unit equipment design criteria cover two parallel trains, A and B; each train is capable of treating 500 bbl MEG, 1,500 bbl water and 9 tons of salt. The MEG unit is a combination of two unit operations: MEG recovery, which is normally applied in the oil and gas industries using a distillation technique, and the newer technology of salt treatment and handling. The MEG unit material selection is made to be suitable for the entire design life of 25 years; the materials for the MEG Recovery Unit have been selected from among the available corrosion-resistant alloys, as required by the service and ambient conditions. Therefore, all parts of the MEG unit that are in saline service are in 2205 duplex, AISI 316L or Inconel alloy 625, depending on the operating temperature. This case study focuses on Inconel alloy 625, which was selected for salt service, and on the operational problems that occurred during construction and operation.

  10. An Investigation of Science Teachers’ Teaching Methods and Techniques: Amasya Case

    Directory of Open Access Journals (Sweden)

    Orhan KARAMUSTAFAOĞLU

    2014-10-01

    Full Text Available The purpose of this study is to determine the methods and techniques science teachers mostly employ in their classrooms. To collect data, the researchers employed a survey with 60 science teachers and randomly selected 6 of them to observe in real classroom situations. Furthermore, the researchers invited 154 students taught by the selected 6 teachers for focus group interviews. After analyzing the collected data, the researchers found that teachers in this study (1) were more likely to use the narrative method, (2) supported their teaching with question-and-answer, demonstration, case study, and problem-solving methods and techniques, and (3) rarely employed student-centered discussion, laboratory practice, role playing and project-based learning methods in their classrooms. Consequently, there exist some differences between theory and practice regarding the teaching methods and techniques of the teachers in this study.

  11. Universal Prevention for Anxiety and Depressive Symptoms in Children: A Meta-analysis of Randomized and Cluster-Randomized Trials.

    Science.gov (United States)

    Ahlen, Johan; Lenhard, Fabian; Ghaderi, Ata

    2015-12-01

    Although under-diagnosed, anxiety and depression are among the most prevalent psychiatric disorders in children and adolescents, leading to severe impairment, increased risk of future psychiatric problems, and a high economic burden to society. Universal prevention may be a potent way to address these widespread problems. There are several benefits to universal relative to targeted interventions because there is limited knowledge as to how to screen for anxiety and depression in the general population. Earlier meta-analyses of the prevention of depression and anxiety symptoms among children suffer from methodological inadequacies such as combining universal, selective, and indicated interventions in the same analyses, and comparing cluster-randomized trials with randomized trials without any correction for clustering effects. The present meta-analysis attempted to determine the effectiveness of universal interventions to prevent anxiety and depressive symptoms after correcting for clustering effects. A systematic search of randomized studies in PsychINFO, Cochrane Library, and Google Scholar resulted in 30 eligible studies meeting inclusion criteria, namely peer-reviewed, randomized or cluster-randomized trials of universal interventions for anxiety and depressive symptoms in school-aged children. Sixty-three percent of the studies reported outcome data regarding anxiety and 87 % reported outcome data regarding depression. Seventy percent of the studies used randomization at the cluster level. There were small but significant effects regarding anxiety (.13) and depressive (.11) symptoms as measured at immediate posttest. At follow-up, which ranged from 3 to 48 months, effects were significantly larger than zero regarding depressive (.07) but not anxiety (.11) symptoms. There was no significant moderation effect of the following pre-selected variables: the primary aim of the intervention (anxiety or depression), deliverer of the intervention, gender distribution
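
    The clustering correction discussed above is commonly handled with a design effect; the sketch below assumes equal cluster sizes and an illustrative intra-class correlation, and is not taken from the meta-analysis itself.

```python
import math

def design_effect(cluster_size: float, icc: float) -> float:
    """Design effect for cluster-randomized trials with equal cluster sizes:
    DEFF = 1 + (m - 1) * ICC."""
    return 1.0 + (cluster_size - 1.0) * icc

def effective_sample_size(n_total: int, cluster_size: float, icc: float) -> float:
    """Number of independent observations the clustered sample is 'worth'."""
    return n_total / design_effect(cluster_size, icc)

# Illustrative numbers (not from the meta-analysis): 20 classrooms of 25 pupils,
# intra-class correlation 0.02 for a depression symptom score.
n, m, icc = 500, 25, 0.02
deff = design_effect(m, icc)
n_eff = effective_sample_size(n, m, icc)
# Standard errors from an analysis that ignored clustering should be inflated
# by sqrt(DEFF) before pooling effect sizes.
print(f"DEFF = {deff:.2f}, effective n = {n_eff:.0f}, SE inflation = {math.sqrt(deff):.2f}")
```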

  12. Prevalence of at-risk genotypes for genotoxic effects decreases with age in a randomly selected population in Flanders: a cross sectional study

    Directory of Open Access Journals (Sweden)

    van Delft Joost HM

    2011-10-01

    Full Text Available Abstract Background We hypothesized that in Flanders (Belgium), the prevalence of at-risk genotypes for genotoxic effects decreases with age due to morbidity and mortality resulting from chronic diseases. Rather than polymorphisms in single genes, the interaction of multiple genetic polymorphisms in low penetrance genes involved in genotoxic effects might be of relevance. Methods Genotyping was performed on 399 randomly selected adults (aged 50-65) and on 442 randomly selected adolescents. Based on their involvement in processes relevant to genotoxicity, 28 low penetrance polymorphisms affecting the phenotype in 19 genes were selected (xenobiotic metabolism, oxidative stress defense and DNA repair; respectively 13, 6 and 9 polymorphisms). Polymorphisms which, based on available literature, could not clearly be categorized a priori as leading to an 'increased risk' or a 'protective effect' were excluded. Results The mean number of risk alleles for all investigated polymorphisms was found to be lower in the 'elderly' (17.0 ± 2.9) than the 'adolescent' (17.6 ± 3.1) subpopulation (P = 0.002). These results were not affected by gender or smoking. The prevalence of a high (> 17 = median) number of risk alleles was less frequent in the 'elderly' (40.6%) than the 'adolescent' (51.4%) subpopulation (P = 0.002). In particular for phase II enzymes, the mean number of risk alleles was lower in the 'elderly' (4.3 ± 1.6) than the 'adolescent' age group (4.8 ± 1.9, P < 0.05). The prevalence of a high (> 4 = median) number of risk alleles was less frequent in the 'elderly' (41.3%) than the adolescent subpopulation (56.3%, P < 0.05). The prevalence of a high (> 8 = median) number of risk alleles for DNA repair enzyme-coding genes was lower in the 'elderly' (37.3%) than the 'adolescent' subpopulation (45.6%, P = 0.017). Conclusions These observations are consistent with the hypothesis that, in Flanders, the prevalence of at-risk alleles in genes involved in genotoxic effects decreases with age, suggesting that persons carrying a higher number of

  13. Project selection problem under uncertainty: An application of utility theory and chance constrained programming to a real case

    Directory of Open Access Journals (Sweden)

    Reza Hosnavi Atashgah

    2013-06-01

    Full Text Available Selecting from a pool of interdependent projects under certainty, when faced with resource constraints, has been studied well in the project selection literature. After briefly reviewing and discussing popular modeling approaches for dealing with uncertainty, this paper proposes an approach based on chance constrained programming and utility theory for a certain range of problems under some practical assumptions. Expected Utility Programming, the proposed modeling approach, will be compared with other well-known methods, and its meaningfulness and usefulness will be illustrated via two numerical examples and one real case.

  14. Random and Systematic Errors Share in Total Error of Probes for CNC Machine Tools

    Directory of Open Access Journals (Sweden)

    Adam Wozniak

    2018-03-01

    Full Text Available Probes for CNC machine tools, like every measurement device, have accuracy limited by random errors and by systematic errors. Random errors of these probes are described by a parameter called unidirectional repeatability. Manufacturers of probes for CNC machine tools usually specify only this parameter, while parameters describing systematic errors of the probes, such as pre-travel variation or triggering radius variation, are rarely used. Systematic errors of the probes, linked to the differences in pre-travel values for different measurement directions, can be corrected or compensated, but this is not a widely used procedure. In this paper, the share of systematic errors and random errors in the total error of exemplary probes is determined. In the case of simple, kinematic probes, systematic errors are much greater than random errors, so compensation would significantly reduce the probing error. Moreover, it is shown that for kinematic probes the commonly specified unidirectional repeatability is significantly better than the 2D performance. However, in the case of a more precise strain-gauge probe, systematic errors are of the same order as random errors, which means that error correction or compensation would not yield any significant benefits in this case.
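
    A small sketch of how the random and systematic shares of probe error can be separated from repeated trigger measurements in several approach directions. The data are simulated and the lobed pre-travel pattern is illustrative, not measured values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated trigger-point errors (micrometres) for a kinematic-style probe:
# 12 approach directions x 30 repeats. The direction-dependent offset models
# pre-travel variation (systematic); per-repeat scatter models repeatability (random).
directions = np.linspace(0.0, 2.0 * np.pi, 12, endpoint=False)
pretravel = 4.0 * np.cos(3.0 * directions)            # lobed systematic pattern (illustrative)
errors = pretravel[:, None] + rng.normal(0.0, 0.5, size=(12, 30))

repeatability = errors.std(axis=1, ddof=1).mean()     # random component per direction
mean_per_direction = errors.mean(axis=1)
pretravel_variation = mean_per_direction.max() - mean_per_direction.min()  # systematic component
total_spread = errors.max() - errors.min()

print(f"unidirectional repeatability ~ {repeatability:.2f} um")
print(f"pre-travel variation         ~ {pretravel_variation:.2f} um")
print(f"total 2D error band          ~ {total_spread:.2f} um")
```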

  15. Effects of prey abundance, distribution, visual contrast and morphology on selection by a pelagic piscivore

    Science.gov (United States)

    Hansen, Adam G.; Beauchamp, David A.

    2014-01-01

    Most predators eat only a subset of possible prey. However, studies evaluating diet selection rarely measure prey availability in a manner that accounts for temporal–spatial overlap with predators, the sensory mechanisms employed to detect prey, and constraints on prey capture. We evaluated the diet selection of cutthroat trout (Oncorhynchus clarkii) feeding on a diverse planktivore assemblage in Lake Washington to test the hypothesis that the diet selection of piscivores would reflect random (opportunistic) as opposed to non-random (targeted) feeding, after accounting for predator–prey overlap, visual detection and capture constraints. Diets of cutthroat trout were sampled in autumn 2005, when the abundance of transparent, age-0 longfin smelt (Spirinchus thaleichthys) was low, and 2006, when the abundance of smelt was nearly seven times higher. Diet selection was evaluated separately using depth-integrated and depth-specific (accounted for predator–prey overlap) prey abundance. The abundance of different prey was then adjusted for differences in detectability and vulnerability to predation to see whether these factors could explain diet selection. In 2005, cutthroat trout fed non-randomly by selecting against the smaller, transparent age-0 longfin smelt, but for the larger age-1 longfin smelt. After adjusting prey abundance for visual detection and capture, cutthroat trout fed randomly. In 2006, depth-integrated and depth-specific abundance explained the diets of cutthroat trout well, indicating random feeding. Feeding became non-random after adjusting for visual detection and capture. Cutthroat trout selected strongly for age-0 longfin smelt, but against similar sized threespine stickleback (Gasterosteus aculeatus) and larger age-1 longfin smelt in 2006. Overlap with juvenile sockeye salmon (O. nerka) was minimal in both years, and sockeye salmon were rare in the diets of cutthroat trout. The direction of the shift between random and non-random selection

  16. A method for selecting cis-acting regulatory sequences that respond to small molecule effectors

    Directory of Open Access Journals (Sweden)

    Allas Ülar

    2010-08-01

    Full Text Available Abstract Background Several cis-acting regulatory sequences functioning at the level of mRNA or nascent peptide and specifically influencing transcription or translation have been described. These regulatory elements often respond to specific chemicals. Results We have developed a method that allows us to select cis-acting regulatory sequences that respond to diverse chemicals. The method is based on the β-lactamase gene containing a random sequence inserted into the beginning of the ORF. Several rounds of selection are used to isolate sequences that suppress β-lactamase expression in response to the compound under study. We have isolated sequences that respond to erythromycin, troleandomycin, chloramphenicol, meta-toluate and homoserine lactone. By introducing synonymous and non-synonymous mutations we have shown that at least in the case of erythromycin the sequences act at the peptide level. We have also tested the cross-activities of the constructs and found that in most cases the sequences respond most strongly to the compound on which they were isolated. Conclusions Several selected peptides showed ligand-specific changes in amino acid frequencies, but no consensus motif could be identified. This is consistent with previous observations on natural cis-acting peptides, showing that it is often impossible to demonstrate a consensus. Applying the currently developed method on a larger scale, by selecting and comparing an extended set of sequences, might allow the sequence rules underlying the activity of cis-acting regulatory peptides to be identified.

  17. Exactly averaged equations for flow and transport in random media

    International Nuclear Information System (INIS)

    Shvidler, Mark; Karasaki, Kenzi

    2001-01-01

    It is well known that exact averaging of the equations of flow and transport in random porous media can be realized only for a small number of special, occasionally exotic, fields. On the other hand, the properties of approximate averaging methods are not yet fully understood, for example the convergence behavior and accuracy of truncated perturbation series; moreover, the calculation of high-order perturbations is very complicated. These problems have long stimulated attempts to answer the question: do exact, general and sufficiently universal forms of averaged equations exist? If the answer is positive, there arises the problem of constructing these equations and analyzing them. There exist many publications related to these problems, oriented to different applications: hydrodynamics, flow and transport in porous media, theory of elasticity, acoustic and electromagnetic waves in random fields, etc. We present a method of finding the general form of exactly averaged equations for flow and transport in random fields by using (1) an assumption of the existence of Green's functions for appropriate stochastic problems, (2) some general properties of the Green's functions, and (3) some basic information about the random fields of conductivity, porosity and flow velocity. We present a general form of the exactly averaged non-local equations for the following cases: 1. steady-state flow with sources in porous media with random conductivity; 2. transient flow with sources in compressible media with random conductivity and porosity; 3. non-reactive solute transport in random porous media. We discuss the problem of uniqueness and the properties of the non-local averaged equations for cases with some types of symmetry (isotropic, transversal isotropic, orthotropic), and we analyze the hypothesized structure of the non-local equations in the general case of stochastically homogeneous fields. (author)

  18. Quantitative characterisation of an engineering write-up using random walk analysis

    Directory of Open Access Journals (Sweden)

    Sunday A. Oke

    2008-02-01

    Full Text Available This contribution reports on the investigation of correlation properties in an English scientific text (an engineering write-up) by means of a random walk. Though the idea of using a random walk to characterise correlations is not new (it was used e.g. in genome analysis and in the analysis of texts), a random walk approach to the analysis of an English scientific text is still far from being exploited in its full strength, as demonstrated in this paper. A method of high-dimensional embedding is proposed. Case examples were drawn arbitrarily from four engineering write-ups (Ph.D. synopses) of three engineering departments in the Faculty of Technology, University of Ibadan, Nigeria. Thirteen additional analyses of non-engineering English texts were made and the results compared to the engineering English texts. Thus, a total of seventeen write-ups from eight Faculties and sixteen Departments of the University of Ibadan were considered. The characterising exponents, which relate the average distance of random walkers away from a known starting position to the elapsed time steps, were estimated for the seventeen cases according to the power law and in three different dimensional spaces. The average characteristic exponent obtained for the seventeen cases over the three different dimensional spaces studied was 1.42 to 2 decimal places, with a minimum and a maximum coefficient of determination (R2) of 0.9495 and 0.9994 respectively. This is 284% of the average characterising exponent value (0.5) supported by the literature for random walkers based on a pseudo-random number generator. The average characteristic exponent obtained for the four engineering-based cases over the three different dimensional spaces studied was 1.41 to 2 decimal places (99.3% of 1.42), with a minimum and a maximum coefficient of determination (R2) of 0.9507 and 0.9974 respectively. This is 282% of the average characterising exponent value (0.5), as
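
    A rough sketch of estimating a characterising exponent from a text-driven one-dimensional random walk. The mapping of letters to steps and the single-walk log-log fit are illustrative simplifications, not the authors' high-dimensional embedding.

```python
import numpy as np

def characterising_exponent(text: str) -> float:
    """Fit alpha in <|d(t)|> ~ t**alpha for a 1D walk driven by the text.
    Mapping rule (illustrative only): vowels step +1, other letters step -1."""
    steps = np.array([1 if c in "aeiou" else -1 for c in text.lower() if c.isalpha()])
    walk = np.cumsum(steps)
    t = np.arange(1, len(walk) + 1)
    distance = np.abs(walk).astype(float)
    mask = distance > 0                      # log of zero distance is undefined
    slope, _ = np.polyfit(np.log(t[mask]), np.log(distance[mask]), 1)
    return slope

sample = ("The characterising exponent relates the average distance of random "
          "walkers from a known starting position to the elapsed time steps. ") * 50
print(f"estimated exponent: {characterising_exponent(sample):.2f}")
```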

  19. Remote-online case-based learning: A comparison of remote-online and face-to-face, case-based learning - a randomized controlled trial.

    Science.gov (United States)

    Nicklen, Peter; Keating, Jenny L; Paynter, Sophie; Storr, Michael; Maloney, Stephen

    2016-01-01

    Case-based learning (CBL) is an educational approach where students work in small, collaborative groups to solve problems. Computer-assisted learning (CAL) is the implementation of computer technology in education. The purpose of this study was to compare the effects of remote-online CBL (RO-CBL) with traditional face-to-face CBL on the learning outcomes of undergraduate physiotherapy students. Participants were randomized to either the control (face-to-face CBL) or the CAL intervention (RO-CBL). The entire 3rd year physiotherapy cohort (n = 41) at Monash University, Victoria, Australia, was invited to participate in the randomized controlled trial. Outcomes included a postintervention multiple-choice test evaluating the knowledge gained from the CBL, a self-assessment of learning based on examinable learning objectives and student satisfaction with the CBL. In addition, a focus group was conducted investigating perceptions of and responses to the online format. Thirty-eight students (control n = 19, intervention n = 19) participated in two CBL sessions and completed the outcome assessments. CBL median scores for the postintervention multiple-choice test were comparable (Wilcoxon rank sum P = 0.61) (median/10 [range] intervention group: 9 [8-10]; control group: 10 [7-10]). Of the 15 examinable learning objectives, eight were significantly in favor of the control group, suggesting a greater perceived depth of learning. Eighty-four percent of students (16/19) disagreed with the statement "I enjoyed the method of CBL delivery." Key themes identified from the focus group included the risks associated with the implementation of, the challenges of communicating in, and the flexibility offered by web-based programs. RO-CBL appears to provide students with a comparable learning experience to traditional CBL. Procedural and infrastructure factors need to be addressed in future studies to counter student dissatisfaction and decreased perceived depth of learning.

  20. Conversion of the random amplified polymorphic DNA (RAPD ...

    African Journals Online (AJOL)

    Conversion of the random amplified polymorphic DNA (RAPD) marker UBC#116 linked to Fusarium crown and root rot resistance gene (Frl) into a co-dominant sequence characterized amplified region (SCAR) marker for marker-assisted selection of tomato.

  1. Case management services for work related upper extremity disorders. Integrating workplace accommodation and problem solving.

    Science.gov (United States)

    Shaw, W S; Feuerstein, M; Lincoln, A E; Miller, V I; Wood, P M

    2001-08-01

    A case manager's ability to obtain worksite accommodations and engage workers in active problem solving may improve health and return to work outcomes for clients with work related upper extremity disorders (WRUEDs). This study examines the feasibility of a 2 day training seminar to help nurse case managers identify ergonomic risk factors, provide accommodation, and conduct problem solving skills training with workers' compensation claimants recovering from WRUEDs. Eight procedural steps to this case management approach were identified, translated into a training workshop format, and conveyed to 65 randomly selected case managers. Results indicate moderate to high self ratings of confidence to perform ergonomic assessments (mean = 7.5 of 10) and to provide problem solving skills training (mean = 7.2 of 10) after the seminar. This training format was suitable to experienced case managers and generated a moderate to high level of confidence to use this case management approach.

  2. Multi-Index Monte Carlo and stochastic collocation methods for random PDEs

    KAUST Repository

    Nobile, Fabio; Haji Ali, Abdul Lateef; Tamellini, Lorenzo; Tempone, Raul

    2016-01-01

    In this talk we consider the problem of computing statistics of the solution of a partial differential equation with random data, where the random coefficient is parametrized by means of a finite or countable sequence of terms in a suitable expansion. We describe and analyze a Multi-Index Monte Carlo (MIMC) and a Multi-Index Stochastic Collocation (MISC) method. The former is both a stochastic version of the combination technique introduced by Zenger, Griebel and collaborators and an extension of the Multilevel Monte Carlo (MLMC) method first described by Heinrich and Giles. Instead of using first-order differences as in MLMC, MIMC uses mixed differences to reduce the variance of the hierarchical differences dramatically. This in turn yields new and improved complexity results, which are natural generalizations of Giles's MLMC analysis, and which increase the domain of problem parameters for which we achieve the optimal convergence, O(TOL^-2). In the same vein, MISC is a deterministic combination technique based on mixed differences of spatial approximations and quadratures over the space of random data. Provided enough mixed regularity, MISC can achieve better complexity than MIMC. Moreover, we show that in the optimal case the convergence rate of MISC is dictated only by the convergence of the deterministic solver applied to a one-dimensional spatial problem. We propose optimization procedures to select the most effective mixed differences to include in MIMC and MISC. Such optimization is a crucial step that allows us to make MIMC and MISC computationally effective. We finally show the effectiveness of MIMC and MISC with some computational tests, including tests with an infinite countable number of random parameters.
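
    MIMC generalizes multilevel Monte Carlo to mixed differences; the sketch below shows only a plain single-index MLMC telescoping estimator for E[S_T] of geometric Brownian motion under Euler–Maruyama, as a minimal illustration of coupled level differences. The sample-size schedule is ad hoc, and none of the multi-index or collocation machinery of the talk is reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
S0, mu, sigma, T = 1.0, 0.05, 0.2, 1.0

def euler_gbm(n_steps: int, n_paths: int, coarse_also: bool = False):
    """Simulate GBM terminal values with Euler-Maruyama on n_steps steps.
    If coarse_also, also return the coupled coarse path (n_steps // 2 steps)
    driven by the same Brownian increments, as required for level differences."""
    dt = T / n_steps
    dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
    S_fine = np.full(n_paths, S0)
    S_coarse = np.full(n_paths, S0)
    for k in range(n_steps):
        S_fine = S_fine * (1 + mu * dt + sigma * dW[:, k])
        if coarse_also and k % 2 == 1:
            dW_c = dW[:, k - 1] + dW[:, k]
            S_coarse = S_coarse * (1 + mu * 2 * dt + sigma * dW_c)
    return (S_fine, S_coarse) if coarse_also else S_fine

# MLMC telescoping estimator: E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}]
levels, N0 = 4, 20000
estimate = euler_gbm(1, N0).mean()
for level in range(1, levels + 1):
    n_steps = 2 ** level
    n_paths = max(N0 // 4 ** level, 100)          # fewer samples on finer levels (ad hoc)
    fine, coarse = euler_gbm(n_steps, n_paths, coarse_also=True)
    estimate += (fine - coarse).mean()

print(f"MLMC estimate of E[S_T]: {estimate:.4f}  (exact: {S0 * np.exp(mu * T):.4f})")
```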

  3. Multi-Index Monte Carlo and stochastic collocation methods for random PDEs

    KAUST Repository

    Nobile, Fabio

    2016-01-09

    In this talk we consider the problem of computing statistics of the solution of a partial differential equation with random data, where the random coefficient is parametrized by means of a finite or countable sequence of terms in a suitable expansion. We describe and analyze a Multi-Index Monte Carlo (MIMC) and a Multi-Index Stochastic Collocation (MISC) method. The former is both a stochastic version of the combination technique introduced by Zenger, Griebel and collaborators and an extension of the Multilevel Monte Carlo (MLMC) method first described by Heinrich and Giles. Instead of using first-order differences as in MLMC, MIMC uses mixed differences to reduce the variance of the hierarchical differences dramatically. This in turn yields new and improved complexity results, which are natural generalizations of Giles's MLMC analysis, and which increase the domain of problem parameters for which we achieve the optimal convergence, O(TOL^-2). In the same vein, MISC is a deterministic combination technique based on mixed differences of spatial approximations and quadratures over the space of random data. Provided enough mixed regularity, MISC can achieve better complexity than MIMC. Moreover, we show that in the optimal case the convergence rate of MISC is dictated only by the convergence of the deterministic solver applied to a one-dimensional spatial problem. We propose optimization procedures to select the most effective mixed differences to include in MIMC and MISC. Such optimization is a crucial step that allows us to make MIMC and MISC computationally effective. We finally show the effectiveness of MIMC and MISC with some computational tests, including tests with an infinite countable number of random parameters.

  4. Fuzzy Random Walkers with Second Order Bounds: An Asymmetric Analysis

    Directory of Open Access Journals (Sweden)

    Georgios Drakopoulos

    2017-03-01

    Full Text Available Edge-fuzzy graphs constitute an essential modeling paradigm across a broad spectrum of domains ranging from artificial intelligence to computational neuroscience and social network analysis. Under this model, fundamental graph properties such as edge length and graph diameter become stochastic and are consequently expressed in probabilistic terms. Thus, algorithms for fuzzy graph analysis must rely on non-deterministic design principles. One such principle is the Random Walker, a virtual entity that selects either edges or, as in this case, vertices of a fuzzy graph to visit. This allows the estimation of global graph properties through a long sequence of local decisions, making it a viable strategy candidate for graph processing software relying on native graph databases such as Neo4j. As a concrete example, Chebyshev Walktrap, a heuristic fuzzy community discovery algorithm relying on second order statistics and on teleportation of the Random Walker, is proposed and its performance, expressed in terms of community coherence and number of vertex visits, is compared to the previously proposed algorithms of Markov Walktrap, Fuzzy Walktrap, and Fuzzy Newman–Girvan. In order to facilitate this comparison, a metric based on the asymmetric metrics of Tversky index and Kullback–Leibler divergence is used.
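
    A toy vertex-visiting random walker with teleportation on an edge-weighted ("fuzzy") graph, to illustrate how long sequences of local, weight-biased decisions expose global structure. The graph, weights and teleportation rate are invented, and the Chebyshev Walktrap heuristic itself is not reproduced.

```python
import random
from collections import Counter

# Edge-weighted ("fuzzy") graph: weights in (0, 1] act as membership degrees.
graph = {
    "a": {"b": 0.9, "c": 0.8},
    "b": {"a": 0.9, "c": 0.7, "d": 0.1},
    "c": {"a": 0.8, "b": 0.7},
    "d": {"b": 0.1, "e": 0.9},
    "e": {"d": 0.9},
}

def walk(graph, start, steps=100_000, teleport=0.05, seed=7):
    """Vertex-visiting random walker: moves to a neighbour with probability
    proportional to edge weight, and teleports to a uniformly chosen vertex
    with probability `teleport` (keeps the walk from getting trapped)."""
    rng = random.Random(seed)
    vertices = list(graph)
    visits = Counter()
    v = start
    for _ in range(steps):
        visits[v] += 1
        if rng.random() < teleport or not graph[v]:
            v = rng.choice(vertices)
        else:
            nbrs, weights = zip(*graph[v].items())
            v = rng.choices(nbrs, weights=weights, k=1)[0]
    return visits

freq = walk(graph, "a")
# The tightly weighted trio {a, b, c} should collectively absorb the larger
# share of visits, hinting at the two weight-coherent groups {a, b, c} and {d, e}.
print({v: round(n / sum(freq.values()), 3) for v, n in freq.items()})
```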

  5. Data-Driven Derivation of an "Informer Compound Set" for Improved Selection of Active Compounds in High-Throughput Screening.

    Science.gov (United States)

    Paricharak, Shardul; IJzerman, Adriaan P; Jenkins, Jeremy L; Bender, Andreas; Nigsch, Florian

    2016-09-26

    Despite the usefulness of high-throughput screening (HTS) in drug discovery, for some systems, low assay throughput or high screening cost can prohibit the screening of large numbers of compounds. In such cases, iterative cycles of screening involving active learning (AL) are employed, creating the need for smaller "informer sets" that can be routinely screened to build predictive models for selecting compounds from the screening collection for follow-up screens. Here, we present a data-driven derivation of an informer compound set with improved predictivity of active compounds in HTS, and we validate its benefit over randomly selected training sets on 46 PubChem assays comprising at least 300,000 compounds and covering a wide range of assay biology. The informer compound set showed improvement in BEDROC(α = 100), PRAUC, and ROCAUC values averaged over all assays of 0.024, 0.014, and 0.016, respectively, compared to randomly selected training sets, all with significant paired t-test p-values, and it was derived in an assay-agnostic fashion. This approach led to a consistent improvement in hit rates in follow-up screens without compromising scaffold retrieval. The informer set is adjustable in size depending on the number of compounds one intends to screen, as performance gains are realized for sets with more than 3,000 compounds, and this set is therefore applicable to a variety of situations. Finally, our results indicate that random sampling may not adequately cover descriptor space, drawing attention to the importance of the composition of the training set for predicting actives.

  6. Random matrices and random difference equations

    International Nuclear Information System (INIS)

    Uppuluri, V.R.R.

    1975-01-01

    Mathematical models leading to products of random matrices and random difference equations are discussed. A one-compartment model with random behavior is introduced, and it is shown how the average concentration in the discrete time model converges to the exponential function. This is of relevance to understanding how radioactivity gets trapped in bone structure in blood-bone systems. The ideas are then generalized to two-compartment models and mammillary systems, where products of random matrices appear in a natural way. The appearance of products of random matrices in applications in demography and control theory is considered. Then random sequences motivated by the following problems are studied: constant pulsing and random decay models, random pulsing and constant decay models, and random pulsing and random decay models.
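
    A small sketch of the one-compartment model with random behaviour: the retained fraction at each step is a random variable, and the ensemble-average concentration follows an exponential in the step index. The uniform retention distribution is illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

# One-compartment model with random behaviour: at each time step a random
# fraction of the material is retained, x_{n+1} = a_n * x_n, with a_n i.i.d.
# uniform on [0.8, 1.0]. The ensemble average follows (E[a])**n, an exponential.
n_steps, n_realisations, x0 = 50, 5000, 1.0
a = rng.uniform(0.8, 1.0, size=(n_realisations, n_steps))
paths = x0 * np.cumprod(a, axis=1)

average = paths.mean(axis=0)
expected = x0 * 0.9 ** np.arange(1, n_steps + 1)   # E[a] = 0.9

print("max |simulated average - exponential| =", np.abs(average - expected).max())
```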

  7. Zinc as an adjunct treatment for reducing case fatality due to clinical severe infection in young infants: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Wadhwa, Nitya; Basnet, Sudha; Natchu, Uma Chandra Mouli; Shrestha, Laxman P; Bhatnagar, Shinjini; Sommerfelt, Halvor; Strand, Tor A; Ramji, Siddarth; Aggarwal, K C; Chellani, Harish; Govil, Anuradha; Jajoo, Mamta; Mathur, N B; Bhatt, Meenakshi; Mohta, Anup; Ansari, Imran; Basnet, Srijana; Chapagain, Ram H; Shah, Ganesh P; Shrestha, Binod M

    2017-07-10

    An estimated 2.7 of the 5.9 million deaths in children under 5 years of age occur in the neonatal period. Severe infections contribute to almost a quarter of these deaths. Mortality due to severe infections in developing country settings is substantial despite antibiotic therapy. Effective interventions that can be added to standard therapy for severe infections are required to reduce case fatality. This is a double-blind randomized placebo-controlled parallel group superiority trial to investigate the effect of zinc administered orally as an adjunct to standard therapy to infants aged 3 days up to 2 months (59 days) hospitalized with clinical severe infection, that will be undertaken in seven hospitals in Delhi, India and Kathmandu, Nepal. In a 1:1 ratio, we will randomly assign young infants to receive 10 mg of elemental zinc or placebo orally in addition to the standard therapy for a total of 14 days. The primary outcomes are hospital case fatality, which is death due to any cause at any time after enrolment while hospitalized for the illness episode, and extended case fatality, which encompasses the period until 12 weeks after enrolment. A previous study showed a beneficial effect of zinc in reducing the risk of treatment failure, as well as a non-significant effect on case fatality. That study was not powered to detect an effect on case fatality, which this current study is. If the results are consistent with this earlier trial, we will have provided strong evidence for recommending zinc as an adjunct to standard therapy for clinical severe infection in young infants. Universal Trial Number: U1111-1187-6479; Clinical Trials Registry - India: CTRI/2017/02/007966, registered on February 27, 2017.

  8. [Acupuncture therapy for the improvement of sleep quality of outpatients receiving methadone maintenance treatment: a randomized controlled trial].

    Science.gov (United States)

    Li, Yi; Liu, Xue-bing; Zhang, Yao

    2012-08-01

    To study the efficacy and safety of acupuncture therapy for improving the sleep quality of outpatients receiving methadone maintenance treatment (MMT). Using a randomized double-blinded controlled design, seventy-five MMT outpatients with low sleep quality [score of Pittsburgh sleep quality index (PSQI) > or = 8] were randomly assigned to the acupuncture group (38 cases) and the sham-acupuncture group (37 cases). All patients maintained their previous MMT. Acupuncture was applied to Baihui (GV20), Shenmen (bilateral, TF4), Shenting (GV24), Sanyinjiao (bilateral, SP6), and Sishencong (EX-HN1) in the acupuncture group. The same procedures were performed in the sham-acupuncture group, but not at the acupoints (5 mm lateral to the acupoints selected in the acupuncture group) and with a shallow needling technique. The treatment was performed 5 times each week for 8 successive weeks. The PSQI was assessed before treatment and at the end of the 2nd, 4th, 6th, and 8th week of treatment. The detection ratio of low sleep quality and the incidence of adverse acupuncture reactions were compared between the two groups at the end of the 8th week. The improvement in the overall PSQI score was obviously greater in the acupuncture group than in the sham-acupuncture group (P < 0.05). The detection ratio of low sleep quality was lower in the acupuncture group (60.53%, 23/38 cases) than in the sham-acupuncture group (83.78%, 31/37 cases), a statistically significant difference (P < 0.05). The incidence of adverse acupuncture reactions was 5.26% (2/38 cases) in the acupuncture group and 2.70% (1/37 cases) in the sham-acupuncture group, with no statistical difference (P > 0.05). Acupuncture therapy can effectively and safely improve the sleep quality of outpatients receiving MMT.

  9. Statistics for Ratios of Rayleigh, Rician, Nakagami-m, and Weibull Distributed Random Variables

    Directory of Open Access Journals (Sweden)

    Dragana Č. Pavlović

    2013-01-01

    Full Text Available The distributions of ratios of random variables are of interest in many areas of the sciences. In this brief paper, we present the joint probability density function (PDF) and the PDF of the maximum of the ratios μ1=R1/r1 and μ2=R2/r2 for the cases where R1, R2, r1, and r2 are Rayleigh, Rician, Nakagami-m, and Weibull distributed random variables. Random variables R1 and R2, as well as random variables r1 and r2, are correlated. Given the suitability of the Weibull distribution to describe fading in both indoor and outdoor environments, special attention is dedicated to the case of Weibull random variables. For this case, analytical expressions for the joint PDF, the PDF of the maximum, the PDF of the minimum, and the product moments of an arbitrary number of ratios μi=Ri/ri, i=1,…,L are obtained. The random variables in the numerator, Ri, as well as those in the denominator, ri, are exponentially correlated. To the best of the authors' knowledge, analytical expressions for the PDF of the minimum and the product moments of {μi}i=1L are novel in the open technical literature. The proposed mathematical analysis is complemented by various numerical results. An application of the presented theoretical results is illustrated with respect to the performance assessment of wireless systems.
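
    A Monte Carlo cross-check of the PDF of a ratio μ = R/r of Weibull variables. For brevity the sketch uses independent variables, whereas the paper derives closed forms for correlated ones; parameters are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(42)

# mu = R / r with R ~ Weibull(shape k1, scale lam1), r ~ Weibull(k2, lam2),
# taken independent here for brevity (the cited work handles correlation).
k1, lam1, k2, lam2 = 2.0, 1.0, 2.0, 0.5
n = 1_000_000
R = lam1 * rng.weibull(k1, n)
r = lam2 * rng.weibull(k2, n)
mu = R / r

# Empirical PDF on a grid, which could be compared against a closed-form result.
hist, edges = np.histogram(mu, bins=200, range=(0.0, 10.0), density=True)
centres = 0.5 * (edges[:-1] + edges[1:])
print("mode of the empirical PDF of mu is near", centres[hist.argmax()].round(2))
```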

  10. Random analysis of bearing capacity of square footing using the LAS procedure

    Science.gov (United States)

    Kawa, Marek; Puła, Wojciech; Suska, Michał

    2016-09-01

    In the present paper, a three-dimensional problem of the bearing capacity of a square footing on a random soil medium is analyzed. The random fields of the strength parameters c and φ are generated using the LAS procedure (Local Average Subdivision, Fenton and Vanmarcke 1990). The procedure has been re-implemented by the authors in the Mathematica environment in order to combine it with a commercial program. Since the procedure is still being tested, the random field has been assumed to be one-dimensional: the strength properties of the soil are random in the vertical direction only. Individual realizations of the bearing capacity boundary problem, with the strength parameters of the medium defined by the above procedure, are solved using the FLAC3D software. The analysis is performed for two qualitatively different cases, namely for purely cohesive and cohesive-frictional soils. For the latter case the friction angle and cohesion have been assumed to be independent random variables. For these two cases the random square footing bearing capacity results have been obtained for a range of fluctuation scales from 0.5 m to 10 m. Each time 1000 Monte Carlo realizations have been performed. The obtained results allow not only the mean and variance but also the probability density function to be estimated. An example of the application of this function to reliability calculation is presented in the final part of the paper.
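
    A much-simplified sketch of the Monte Carlo workflow for the purely cohesive case: a 1D vertically correlated lognormal cohesion field (generated here by Cholesky factorization of an exponential covariance rather than LAS) feeds a placeholder bearing-capacity formula, since the FLAC3D analyses cannot be reproduced in a few lines. All numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def correlated_field(n_cells, dz, scale_of_fluctuation, n_real):
    """1D standard-normal random field with exponential correlation, via Cholesky
    factorization (a simple stand-in for the LAS generator used in the paper)."""
    z = np.arange(n_cells) * dz
    cov = np.exp(-2.0 * np.abs(z[:, None] - z[None, :]) / scale_of_fluctuation)
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(n_cells))
    return (L @ rng.standard_normal((n_cells, n_real))).T

# Purely cohesive case: lognormal cohesion, mean 50 kPa, COV 30%, random in depth only.
mean_c, cov_c = 50.0, 0.3
sigma_ln = np.sqrt(np.log(1 + cov_c ** 2))
mu_ln = np.log(mean_c) - 0.5 * sigma_ln ** 2

fields = np.exp(mu_ln + sigma_ln * correlated_field(n_cells=20, dz=0.5,
                                                    scale_of_fluctuation=2.0, n_real=1000))

# Placeholder response model (NOT the FLAC3D analysis): bearing capacity taken
# proportional to the average cohesion over the influence depth.
Nc = 6.17   # rough bearing capacity factor for a square footing on cohesive soil
q_ult = Nc * fields.mean(axis=1)

print(f"mean = {q_ult.mean():.1f} kPa, std = {q_ult.std():.1f} kPa")
```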

  11. Effect of a Counseling Session Bolstered by Text Messaging on Self-Selected Health Behaviors in College Students: A Preliminary Randomized Controlled Trial.

    Science.gov (United States)

    Sandrick, Janice; Tracy, Doreen; Eliasson, Arn; Roth, Ashley; Bartel, Jeffrey; Simko, Melanie; Bowman, Tracy; Harouse-Bell, Karen; Kashani, Mariam; Vernalis, Marina

    2017-05-17

    The college experience is often the first time when young adults live independently and make their own lifestyle choices. These choices affect dietary behaviors, exercise habits, techniques to deal with stress, and decisions on sleep time, all of which direct the trajectory of future health. There is a need for effective strategies that will encourage healthy lifestyle choices in young adults attending college. This preliminary randomized controlled trial tested the effect of coaching and text messages (short message service, SMS) on self-selected health behaviors in the domains of diet, exercise, stress, and sleep. A second analysis measured the ripple effect of the intervention on health behaviors not specifically selected as a goal by participants. Full-time students aged 18-30 years were recruited by word of mouth and campuswide advertisements (flyers, posters, mailings, university website) at a small university in western Pennsylvania from January to May 2015. Exclusions included pregnancy, eating disorders, chronic medical diagnoses, and prescription medications other than birth control. Of 60 participants, 30 were randomized to receive a single face-to-face meeting with a health coach to review results of behavioral questionnaires and to set a health behavior goal for the 8-week study period. The face-to-face meeting was followed by SMS text messages designed to encourage achievement of the behavioral goal. A total of 30 control subjects underwent the same health and behavioral assessments at intake and program end but did not receive coaching or SMS text messages. The texting app showed that 87.31% (2187/2505) of messages were viewed by intervention participants. Furthermore, 28 of the 30 intervention participants and all 30 control participants provided outcome data. Among intervention participants, 22 of 30 (73%) showed improvement in health behavior goal attainment, with the whole group (n=30) showing a mean improvement of 88% (95% CI 39-136). Mean

  12. Sampling large random knots in a confined space

    International Nuclear Information System (INIS)

    Arsuaga, J; Blackstone, T; Diao, Y; Hinson, K; Karadayi, E; Saito, M

    2007-01-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way in sampling large (prime) knots, which can be useful in various applications
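
    A sketch of the uniform random polygon model: vertices drawn uniformly in the unit cube, with the crossings of the xy-projection counted by brute force. Counting projected crossings only illustrates the O(n^2) growth mentioned above; computing determinants or colorings is beyond a short example.

```python
import numpy as np

rng = np.random.default_rng(5)

def uniform_random_polygon(n):
    """Closed polygon whose n vertices are independent uniform points in the unit cube."""
    return rng.random((n, 3))

def segments_cross(p1, p2, q1, q2):
    """Proper intersection test for two 2D segments (shared endpoints excluded)."""
    def orient(a, b, c):
        return np.sign((b[0]-a[0])*(c[1]-a[1]) - (b[1]-a[1])*(c[0]-a[0]))
    return (orient(p1, p2, q1) * orient(p1, p2, q2) < 0 and
            orient(q1, q2, p1) * orient(q1, q2, p2) < 0)

def projected_crossings(poly):
    """Count crossings of the xy-projection of the closed polygon."""
    n = len(poly)
    edges = [(poly[i, :2], poly[(i + 1) % n, :2]) for i in range(n)]
    count = 0
    for i in range(n):
        for j in range(i + 2, n):
            if i == 0 and j == n - 1:      # adjacent through the closing edge
                continue
            if segments_cross(*edges[i], *edges[j]):
                count += 1
    return count

for n in (20, 40, 80):
    avg = np.mean([projected_crossings(uniform_random_polygon(n)) for _ in range(20)])
    print(f"n = {n:3d}: average projected crossings ~ {avg:.1f}")
```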

  13. Sampling large random knots in a confined space

    Science.gov (United States)

    Arsuaga, J.; Blackstone, T.; Diao, Y.; Hinson, K.; Karadayi, E.; Saito, M.

    2007-09-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way in sampling large (prime) knots, which can be useful in various applications.

  14. Sampling large random knots in a confined space

    Energy Technology Data Exchange (ETDEWEB)

    Arsuaga, J [Department of Mathematics, San Francisco State University, 1600 Holloway Ave, San Francisco, CA 94132 (United States); Blackstone, T [Department of Computer Science, San Francisco State University, 1600 Holloway Ave., San Francisco, CA 94132 (United States); Diao, Y [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Hinson, K [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Karadayi, E [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States); Saito, M [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States)

    2007-09-28

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way in sampling large (prime) knots, which can be useful in various applications.

  15. A Thematic Look at Selected Cases of Marital Nullity in the Philippines

    Directory of Open Access Journals (Sweden)

    Antero Rosauro V. Arias, Jr.

    2018-05-01

    Full Text Available Psychological incapacity on the part of either or both spouses as the basis of marital nullity under Article 36 of The Family Code of the Philippines has long been traced to cases of personality disorders. From a theoretical framework that included the legal basis under the said article and the categorical model of the Diagnostic and Statistical Manual of Mental Disorders-IV-TR (DSM-IV-TR, the author purposefully selected several clinical cases of spouses’ narratives in their social case history – these narratives were already part and parcel of court transcripts. Employing a qualitative research methodology using thematic analysis, they were then dissected into superordinate themes that represented spouses’ developmental years, premarital relationship years, and period of marital cohabitation as husband and wife. Thereafter, other themes and possible subthemes were extracted and listed under each of these superordinate themes. These themes and subthemes were then equated to the spouses’ overt manifestations of psychological incapacity. In turn, these manifestations were matched with any or all of the diagnostic features or traits of personality functioning in the DSM. The ultimate objective of deriving and labelling the identified themes with specific personality disorders, with due consideration to the subthemes that referred to spouses’ juridical antecedent behaviors, was successfully achieved to supplement the use of powerful psychometric tests, including the use of projective techniques which were utilized in the local courts. This innovative scheme of thematically analyzing spouses’ narratives on marital nullification figured very well in forensic mental health assessment, especially when the respondent spouse was not available to undergo the necessary psychological assessment for some reason.

  16. Reserves Represented by Random Walks

    International Nuclear Information System (INIS)

    Filipe, J A; Ferreira, M A M; Andrade, M

    2012-01-01

    The reserves problem is studied through models based on random walks. Random walks are a classical particular case in the analysis of stochastic processes. They do not appear only in the study of reserve evolution models; they are also used to build more complex systems and as analysis instruments, in a theoretical setting, for other kinds of systems. In studying reserves, the main objective of this work is to assess and help guarantee that pension funds remain sustainable. As the use of these models for this purpose is a classical approach in the study of pension funds, this work draws conclusions about the reserves problem. A concrete example is presented.
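
    A minimal random-walk reserve model estimating the probability that the fund is ever exhausted over a horizon. Drift, volatility and the initial reserve are illustrative placeholders, not calibrated pension-fund parameters.

```python
import numpy as np

rng = np.random.default_rng(11)

# Reserve process as a random walk: R_{t+1} = R_t + contributions - benefits + noise.
initial_reserve, drift, volatility = 100.0, 1.5, 10.0
horizon, n_scenarios = 50, 100_000

increments = drift + volatility * rng.standard_normal((n_scenarios, horizon))
paths = initial_reserve + np.cumsum(increments, axis=1)

ever_ruined = (paths.min(axis=1) < 0).mean()
print(f"estimated probability the fund is ever exhausted within {horizon} periods: "
      f"{ever_ruined:.3%}")
```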

  17. General practice performance in referral for suspected cancer: influence of number of cases and case-mix on publicly reported data.

    Science.gov (United States)

    Murchie, P; Chowdhury, A; Smith, S; Campbell, N C; Lee, A J; Linden, D; Burton, C D

    2015-05-26

    Publicly available data show variation in GPs' use of urgent suspected cancer (USC) referral pathways. We investigated whether this could be due to small numbers of cancer cases and random case-mix, rather than due to true variation in performance. We analysed individual GP practice USC referral detection rates (proportion of the practice's cancer cases that are detected via USC) and conversion rates (proportion of the practice's USC referrals that prove to be cancer) in routinely collected data from GP practices in all of England (over 4 years) and northeast Scotland (over 7 years). We explored the effect of pooling data. We then modelled the effects of adding random case-mix to practice variation. Correlations between practice detection rate and conversion rate became less positive when data were aggregated over several years. Adding random case-mix to between-practice variation indicated that the median proportion of poorly performing practices correctly identified after 25 cancer cases were examined was 20% (IQR 17 to 24) and after 100 cases was 44% (IQR 40 to 47). Much apparent variation in GPs' use of suspected cancer referral pathways can be attributed to random case-mix. The methods currently used to assess the quality of GP-suspected cancer referral performance, and to compare individual practices, are misleading. These should no longer be used, and more appropriate and robust methods should be developed.
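
    A small simulation of the paper's central point: practices with identical true referral quality still show widely spread observed rates when annual cancer counts are small. The rates and case counts below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Every simulated practice has the SAME true performance; only random case-mix differs.
true_detection_rate = 0.5        # share of the practice's cancers detected via USC referral
cancer_cases_per_practice = 25   # small annual counts are typical for a single practice
n_practices = 1000

detected = rng.binomial(cancer_cases_per_practice, true_detection_rate, n_practices)
observed_rates = detected / cancer_cases_per_practice

# With identical underlying quality, observed rates still span a wide range,
# so a low observed rate alone is weak evidence of poor performance.
print(f"2.5th-97.5th percentile of observed detection rates: "
      f"{np.percentile(observed_rates, 2.5):.2f}-{np.percentile(observed_rates, 97.5):.2f}")
```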

  18. The London Fibromyalgia Epidemiology Study: comparing the demographic and clinical characteristics in 100 random community cases of fibromyalgia versus controls.

    Science.gov (United States)

    White, K P; Speechley, M; Harth, M; Ostbye, T

    1999-07-01

    To identify demographic and clinical features that distinguish fibromyalgia (FM) from other chronic widespread pain. We identified 100 confirmed FM cases, 76 widespread pain controls, and 135 general controls in a random community survey of 3395 noninstitutionalized adults living in London, Ontario. FM cases were distinguished from pain controls using the 1990 American College of Rheumatology (ACR) classification criteria for FM. The mean age of FM cases was 47.8 years (range 19 to 86), the same as for pain controls; 86% of FM cases were female versus 67.1% of pain controls (p < 0.01). FM cases were less educated than general controls (p = 0.03). Male and female FM cases were similar, except females were older and reported more major symptoms (both p = 0.02). FM cases reported more severe pain and fatigue, more symptoms, more major symptoms, and worse overall health than pain controls or general controls. The most commonly reported major symptoms among FM cases were musculoskeletal pain (77.3%), fatigue (77.3%), severe fatigue lasting 24 h after minimal activity (77.0%), nonrestorative sleep (65.7%), and insomnia (56.0%). Subjects with 11-14 tender points were more similar to those with 15-18 tender points than to those with 7-10 points in 11 of 14 clinical variables. On multivariate analysis, 4 symptoms distinguished FM cases from pain controls: pain severity (p = 0.004), severe fatigue lasting 24 h after minimal activity (p = 0.006), weakness (p = 0.008), and self-reported swelling of neck glands (p = 0.01). In the general population, adults who meet the ACR definition of FM appear to have distinct features compared to those with chronic widespread pain who do not meet criteria.

  19. Bridging Emergent Attributes and Darwinian Principles in Teaching Natural Selection

    Science.gov (United States)

    Xu, Dongchen; Chi, Michelene T. H.

    2016-01-01

    Students often have misconceptions about natural selection as they misuse a direct causal schema to explain the process. Natural selection is in fact an emergent process where random interactions lead to changes in a population. The misconceptions stem from students' lack of emergent schema for natural selection. In order to help students…

  20. Selection of representative calibration sample sets for near-infrared reflectance spectroscopy to predict nitrogen concentration in grasses

    DEFF Research Database (Denmark)

    Shetty, Nisha; Rinnan, Åsmund; Gislum, René

    2012-01-01

    ) algorithm were used and compared. Both the Puchwein and CADEX methods provide a calibration set equally distributed in space, and both methods require a minimum of prior knowledge. The samples were also selected randomly using complete random, cultivar random (year fixed), year random (cultivar fixed) and interaction (cultivar × year fixed) random procedures to see the influence of different factors on sample selection. Puchwein's method performed best with the lowest RMSEP, followed by CADEX, interaction random, year random, cultivar random and complete random. Out of 118 samples of the complete calibration set... effectively enhance the cost-effectiveness of NIR spectral analysis by reducing the number of analyzed samples in the calibration set by more than 80%, which substantially reduces the effort of laboratory analyses with no significant loss in prediction accuracy...
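
    A minimal Kennard–Stone-style (CADEX-like) selection on a toy score matrix standing in for NIR spectra, to illustrate picking a calibration set spread evenly through descriptor space; the distance measure and data are assumptions, not the study's implementation.

```python
import numpy as np

def kennard_stone(X, n_select):
    """CADEX / Kennard-Stone style selection: start from the two most distant
    samples, then repeatedly add the sample whose minimum distance to the
    already-selected set is largest (uniform coverage of the descriptor space)."""
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    selected = list(np.unravel_index(dist.argmax(), dist.shape))
    while len(selected) < n_select:
        remaining = [i for i in range(len(X)) if i not in selected]
        min_to_sel = dist[np.ix_(remaining, selected)].min(axis=1)
        selected.append(remaining[int(min_to_sel.argmax())])
    return selected

# Toy stand-in for NIR spectra: 118 samples described by 5 PCA-like scores.
rng = np.random.default_rng(0)
scores = rng.standard_normal((118, 5))

calibration_idx = kennard_stone(scores, n_select=20)   # roughly 80% reduction, as in the study
print(sorted(calibration_idx))
```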

  1. Selective mutism.

    Science.gov (United States)

    Hua, Alexandra; Major, Nili

    2016-02-01

    Selective mutism is a disorder in which an individual fails to speak in certain social situations though speaks normally in other settings. Most commonly, this disorder initially manifests when children fail to speak in school. Selective mutism results in significant social and academic impairment in those affected by it. This review will summarize the current understanding of selective mutism with regard to diagnosis, epidemiology, cause, prognosis, and treatment. Studies over the past 20 years have consistently demonstrated a strong relationship between selective mutism and anxiety, most notably social phobia. These findings have led to the recent reclassification of selective mutism as an anxiety disorder in the Diagnostic and Statistical Manual of Mental Disorders, 5th Edition. In addition to anxiety, several other factors have been implicated in the development of selective mutism, including communication delays and immigration/bilingualism, adding to the complexity of the disorder. In the past few years, several randomized studies have supported the efficacy of psychosocial interventions based on a graduated exposure to situations requiring verbal communication. Less data are available regarding the use of pharmacologic treatment, though there are some studies that suggest a potential benefit. Selective mutism is a disorder that typically emerges in early childhood and is currently conceptualized as an anxiety disorder. The development of selective mutism appears to result from the interplay of a variety of genetic, temperamental, environmental, and developmental factors. Although little has been published about selective mutism in the general pediatric literature, pediatric clinicians are in a position to play an important role in the early diagnosis and treatment of this debilitating condition.

  2. Cost-effectiveness implications based on a comparison of nursing home and home health case mix.

    OpenAIRE

    Kramer, A M; Shaughnessy, P W; Pettigrew, M L

    1985-01-01

    Case-mix differences between 653 home health care patients and 650 nursing home patients, and between 455 Medicare home health patients and 447 Medicare nursing home patients were assessed using random samples selected from 20 home health agencies and 46 nursing homes in 12 states in 1982 and 1983. Home health patients were younger, had shorter lengths of stay, and were less functionally disabled than nursing home patients. Traditional long-term care problems requiring personal care were more...

  3. Statistical properties of random clique networks

    Science.gov (United States)

    Ding, Yi-Min; Meng, Jun; Fan, Jing-Fang; Ye, Fang-Fu; Chen, Xiao-Song

    2017-10-01

    In this paper, a random clique network model to mimic the large clustering coefficient and the modular structure that exist in many real complex networks, such as social networks, artificial networks, and protein interaction networks, is introduced by combining the random selection rule of the Erdös and Rényi (ER) model and the concept of cliques. We find that random clique networks having a small average degree differ from the ER network in that they have a large clustering coefficient and a power law clustering spectrum, while networks having a high average degree have similar properties as the ER model. In addition, we find that the relation between the clustering coefficient and the average degree shows a non-monotonic behavior and that the degree distributions can be fit by multiple Poisson curves; we explain the origin of such novel behaviors and degree distributions.
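
    A sketch of a random clique network built by planting cliques on randomly selected vertex sets, compared with an ER graph of the same size and edge count using networkx; the construction details are a plausible reading of the model, not the paper's exact rule.

```python
import itertools
import random
import networkx as nx

def random_clique_network(n_vertices, n_cliques, clique_size, seed=0):
    """Plant n_cliques cliques, each on a randomly selected set of vertices
    (ER-style random selection of building blocks, but cliques instead of edges)."""
    rng = random.Random(seed)
    G = nx.Graph()
    G.add_nodes_from(range(n_vertices))
    for _ in range(n_cliques):
        members = rng.sample(range(n_vertices), clique_size)
        G.add_edges_from(itertools.combinations(members, 2))
    return G

G_clique = random_clique_network(n_vertices=500, n_cliques=300, clique_size=4)
G_er = nx.gnm_random_graph(500, G_clique.number_of_edges(), seed=0)

# At a small average degree, the clique construction yields a much larger
# clustering coefficient than an ER graph with the same number of edges.
print(f"clique network: <k> = {2 * G_clique.number_of_edges() / 500:.1f}, "
      f"C = {nx.average_clustering(G_clique):.3f}")
print(f"ER graph:       <k> = {2 * G_er.number_of_edges() / 500:.1f}, "
      f"C = {nx.average_clustering(G_er):.3f}")
```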

  4. Benchmarking Variable Selection in QSAR.

    Science.gov (United States)

    Eklund, Martin; Norinder, Ulf; Boyer, Scott; Carlsson, Lars

    2012-02-01

    Variable selection is important in QSAR modeling since it can improve model performance and transparency, as well as reduce the computational cost of model fitting and predictions. Which variable selection methods perform well in QSAR settings is largely unknown. To address this question, in a total of 1728 benchmarking experiments, we rigorously investigated how eight variable selection methods affect the predictive performance and transparency of random forest models fitted to seven QSAR datasets covering different endpoints, descriptor sets, types of response variables, and numbers of chemical compounds. The results show that univariate variable selection methods are suboptimal and that the number of variables in the benchmarked datasets can be reduced by about 60% without significant loss in model performance when using multivariate adaptive regression splines (MARS) and forward selection. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Feature Selection for Chemical Sensor Arrays Using Mutual Information

    Science.gov (United States)

    Wang, X. Rosalind; Lizier, Joseph T.; Nowotny, Thomas; Berna, Amalia Z.; Prokopenko, Mikhail; Trowell, Stephen C.

    2014-01-01

    We address the problem of feature selection for classifying a diverse set of chemicals using an array of metal oxide sensors. Our aim is to evaluate a filter approach to feature selection with reference to previous work, which used a wrapper approach on the same data set, and established best features and upper bounds on classification performance. We selected feature sets that exhibit the maximal mutual information with the identity of the chemicals. The selected features closely match those found to perform well in the previous study using a wrapper approach to conduct an exhaustive search of all permitted feature combinations. By comparing the classification performance of support vector machines (using features selected by mutual information) with the performance observed in the previous study, we found that while our approach does not always give the maximum possible classification performance, it always selects features that achieve classification performance approaching the optimum obtained by exhaustive search. We performed further classification using the selected feature set with some common classifiers and found that, for the selected features, Bayesian Networks gave the best performance. Finally, we compared the observed classification performances with the performance of classifiers using randomly selected features. We found that the selected features consistently outperformed randomly selected features for all tested classifiers. The mutual information filter approach is therefore a computationally efficient method for selecting near optimal features for chemical sensor arrays. PMID:24595058
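
    A minimal filter-style selection using mutual information scores, compared against randomly selected features with an SVM, on synthetic data standing in for the sensor-array measurements; dataset sizes and the choice of k are arbitrary.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic stand-in for sensor-array features: 8 informative out of 40.
X, y = make_classification(n_samples=600, n_features=40, n_informative=8,
                           n_redundant=4, n_classes=3, random_state=0)

k = 8
mi = mutual_info_classif(X, y, random_state=0)
top_mi = np.argsort(mi)[::-1][:k]                     # filter selection by mutual information
rng = np.random.default_rng(0)
random_k = rng.choice(X.shape[1], size=k, replace=False)

for name, cols in [("mutual-information features", top_mi),
                   ("randomly selected features", random_k)]:
    score = cross_val_score(SVC(), X[:, cols], y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {score:.3f}")
```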

  6. A clear case. Selective investment in case management applications can yield significant returns.

    Science.gov (United States)

    Hagland, Mark

    2009-03-01

    Strategically conceived case management system implementation makes good patient care and business sense, CIOs agree. Significant financial savings can be achieved from case management IS implementations, if those implementations are executed in the context of partnership between clinical leaders and the CIO's team. CIOs agree that applying the concept of "investment" to the implementation of case management IT can make resource allocation decisions easier.

  7. Pervasive randomness in physics: an introduction to its modelling and spectral characterisation

    Science.gov (United States)

    Howard, Roy

    2017-10-01

    An introduction to the modelling and spectral characterisation of random phenomena is detailed at a level consistent with a first exposure to the subject at an undergraduate level. A signal framework for defining a random process is provided and this underpins an introduction to common random processes including the Poisson point process, the random walk, the random telegraph signal, shot noise, information signalling random processes, jittered pulse trains, birth-death random processes and Markov chains. An introduction to the spectral characterisation of signals and random processes, via either an energy spectral density or a power spectral density, is detailed. The important case of defining a white noise random process concludes the paper.
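
    As a hedged illustration of one process and one spectral tool named above, the sketch below simulates a random telegraph signal with Poisson-distributed switching times and estimates its power spectral density with Welch's method; the sampling rate and switching rate are illustrative values, not taken from the paper.

```python
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

rng = np.random.default_rng(0)

fs = 1000.0          # sampling rate (Hz), illustrative
duration = 50.0      # seconds of signal
rate = 20.0          # mean switching rate of the telegraph signal (events/s)

# Random telegraph signal: the value flips between +1 and -1 at Poisson event times.
t = np.arange(0, duration, 1 / fs)
n_events = rng.poisson(rate * duration)
switch_times = np.sort(rng.uniform(0, duration, n_events))
# state at time t is (-1)^(number of switches before t), starting at +1
states = (-1.0) ** np.searchsorted(switch_times, t)

# Spectral characterisation: Welch estimate of the power spectral density.
f, pxx = welch(states, fs=fs, nperseg=4096)
# Integral of the one-sided PSD approximates the total power (close to 1 for a +/-1 signal).
print("total power (integral of PSD) ~", round(float(trapezoid(pxx, f)), 3))
```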

  8. Genarris: Random generation of molecular crystal structures and fast screening with a Harris approximation

    Science.gov (United States)

    Li, Xiayue; Curtis, Farren S.; Rose, Timothy; Schober, Christoph; Vazquez-Mayagoitia, Alvaro; Reuter, Karsten; Oberhofer, Harald; Marom, Noa

    2018-06-01

    We present Genarris, a Python package that performs configuration space screening for molecular crystals of rigid molecules by random sampling with physical constraints. For fast energy evaluations, Genarris employs a Harris approximation, whereby the total density of a molecular crystal is constructed via superposition of single molecule densities. Dispersion-inclusive density functional theory is then used for the Harris density without performing a self-consistency cycle. Genarris uses machine learning for clustering, based on a relative coordinate descriptor developed specifically for molecular crystals, which is shown to be robust in identifying packing motif similarity. In addition to random structure generation, Genarris offers three workflows based on different sequences of successive clustering and selection steps: the "Rigorous" workflow is an exhaustive exploration of the potential energy landscape, the "Energy" workflow produces a set of low energy structures, and the "Diverse" workflow produces a maximally diverse set of structures. The latter is recommended for generating initial populations for genetic algorithms. Here, the implementation of Genarris is reported and its application is demonstrated for three test cases.

  9. Money creation process in a random redistribution model

    Science.gov (United States)

    Chen, Siyan; Wang, Yougui; Li, Keqiang; Wu, Jinshan

    2014-01-01

    In this paper, the dynamical process of money creation in a random exchange model with debt is investigated. The money creation kinetics are analyzed by both the money-transfer matrix method and the diffusion method. From both approaches, we attain the same conclusion: the source of money creation in the case of random exchange is the agents with neither money nor debt. These analytical results are demonstrated by computer simulations.
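
    The paper's transfer-matrix and diffusion analyses are not reproduced here; the following is a minimal agent-based sketch, under assumed transfer rules and parameters, of a random exchange economy with a debt limit, showing how gross money and debt grow while net money is conserved.

```python
import numpy as np

rng = np.random.default_rng(0)

n_agents = 1000
steps = 200_000
debt_limit = 5               # an agent's balance may not fall below -debt_limit
money = np.ones(n_agents)    # everyone starts with one unit of money

for _ in range(steps):
    i, j = rng.integers(0, n_agents, size=2)
    if i == j:
        continue
    # agent i pays one unit to agent j, borrowing is allowed down to the debt limit
    if money[i] - 1 >= -debt_limit:
        money[i] -= 1
        money[j] += 1

total_money = money[money > 0].sum()     # money held as positive balances
total_debt = -money[money < 0].sum()     # outstanding debt
print(f"gross money: {total_money:.0f}, total debt: {total_debt:.0f}, "
      f"net (conserved): {money.sum():.0f}")
```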

  10. High cycle fatigue of austenitic stainless steels under random loading

    International Nuclear Information System (INIS)

    Gauthier, J.P.; Petrequin, P.

    1987-08-01

    To investigate reactor components, load-controlled random fatigue tests were performed at 300 °C and 550 °C on specimens taken in the transverse orientation from austenitic stainless steel plates. Random load histories are produced on closed-loop servo-hydraulic machines by a minicomputer that generates random load sequences using a reduced Markovian matrix. The method has the advantage of taking into account the mean load of each cycle. The loadings generated are those of a stationary Gaussian process. Fatigue tests were mainly performed in the endurance region of the fatigue curve, with scatter determined using the staircase method. Experimental results were analysed with the aim of determining design curves for component calculations, depending on the irregularity factor and temperature. Analysis in terms of the root-mean-square fatigue limit shows that random loading causes more damage than constant-amplitude loading. Damage calculations following Miner's rule were made using the probability density function for the case where the irregularity factor is closest to 100%. Miner's rule is too conservative for our results. A method using design curves that include random loading effects, with the irregularity factor as an indexing parameter, is proposed.
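
    Miner's rule, referenced above, sums the ratio of applied cycles to cycles-to-failure over the stress levels of the load spectrum. The sketch below applies it to an illustrative cycle histogram and a schematic Basquin-type S-N curve; the constants and load data are made up, not the paper's.

```python
import numpy as np

def cycles_to_failure(stress_amplitude, C=1e12, m=3.0):
    """Schematic Basquin-type S-N curve: N = C * S^(-m). Constants are illustrative."""
    return C * stress_amplitude ** (-m)

def miner_damage(amplitudes, counts):
    """Miner's rule: damage D = sum(n_i / N_i); failure is predicted when D >= 1."""
    return sum(n / cycles_to_failure(s) for s, n in zip(amplitudes, counts))

# Illustrative rainflow-style histogram of a random loading sequence:
# (stress amplitude in MPa, number of cycles counted at that amplitude)
amplitudes = np.array([50.0, 80.0, 120.0, 160.0])
counts = np.array([5e5, 1e5, 2e4, 2e3])

D = miner_damage(amplitudes, counts)
print(f"cumulative damage D = {D:.2f} -> {'failure predicted' if D >= 1 else 'survives'}")
```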

  11. A random matrix approach to VARMA processes

    International Nuclear Information System (INIS)

    Burda, Zdzislaw; Jarosz, Andrzej; Nowak, Maciej A; Snarska, Malgorzata

    2010-01-01

    We apply random matrix theory to derive the spectral density of large sample covariance matrices generated by multivariate VMA(q), VAR(q) and VARMA(q₁, q₂) processes. In particular, we consider a limit where the number of random variables N and the number of consecutive time measurements T are large but the ratio N/T is fixed. In this regime, the underlying random matrices are asymptotically equivalent to free random variables (FRV). We apply the FRV calculus to calculate the eigenvalue density of the sample covariance for several VARMA-type processes. We explicitly solve the VARMA(1, 1) case and demonstrate perfect agreement between the analytical result and the spectra obtained by Monte Carlo simulations. The proposed method is purely algebraic and can be easily generalized to q₁ > 1 and q₂ > 1.
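
    The FRV calculation itself is not reproduced here; the code below is a hedged Monte Carlo sketch in the spirit of the simulations mentioned, using N independent ARMA(1,1) series (a diagonal special case of VARMA(1,1)) and the eigenvalue spectrum of their N x N sample covariance at fixed N/T.

```python
import numpy as np

rng = np.random.default_rng(0)

N, T = 200, 1000          # number of series and time points; the ratio N/T stays fixed
a1, b1 = 0.4, 0.3         # AR(1) and MA(1) coefficients (identical for every series here)

def simulate_arma11(N, T, a1, b1, burn=200):
    """Simulate N independent ARMA(1,1) series: x_t = a1*x_{t-1} + e_t + b1*e_{t-1}."""
    e = rng.standard_normal((N, T + burn))
    x = np.zeros((N, T + burn))
    for t in range(1, T + burn):
        x[:, t] = a1 * x[:, t - 1] + e[:, t] + b1 * e[:, t - 1]
    return x[:, burn:]

X = simulate_arma11(N, T, a1, b1)
C = X @ X.T / T                          # N x N sample covariance matrix
eigvals = np.linalg.eigvalsh(C)

# The empirical distribution of `eigvals` is what the analytical FRV density
# describes in the N, T -> infinity limit at fixed N/T.
print(f"N/T = {N / T:.2f}")
print(f"eigenvalue range: [{eigvals.min():.2f}, {eigvals.max():.2f}]")
print(f"mean eigenvalue (matches the series variance): {eigvals.mean():.2f}")
```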

  12. The Effect of Speed Alterations on Tempo Note Selection.

    Science.gov (United States)

    Madsen, Clifford K.; And Others

    1986-01-01

    Investigated the tempo note preferences of 100 randomly selected college-level musicians using familiar orchestral music as stimuli. Subjects heard selections at increased, decreased, and unaltered tempi. Results showed musicians were not accurate in estimating original tempo and showed consistent preference for faster than actual tempo.…

  13. Natural selection and algorithmic design of mRNA.

    Science.gov (United States)

    Cohen, Barry; Skiena, Steven

    2003-01-01

    Messenger RNA (mRNA) sequences serve as templates for proteins according to the triplet code, in which each of the 4³ = 64 different codons (sequences of three consecutive nucleotide bases) in RNA either terminates transcription or maps to one of the 20 different amino acids (or residues) which build up proteins. Because there are more codons than residues, there is inherent redundancy in the coding. Certain residues (e.g., tryptophan) have only a single corresponding codon, while other residues (e.g., arginine) have as many as six corresponding codons. This freedom implies that the number of possible RNA sequences coding for a given protein grows exponentially in the length of the protein. Thus nature has wide latitude to select among mRNA sequences which are informationally equivalent, but structurally and energetically divergent. In this paper, we explore how nature takes advantage of this freedom and how to algorithmically design structures more energetically favorable than have been built through natural selection. In particular: (1) Natural Selection--we perform the first large-scale computational experiment comparing the stability of mRNA sequences from a variety of organisms to random synonymous sequences which respect the codon preferences of the organism. This experiment was conducted on over 27,000 sequences from 34 microbial species with 36 genomic structures. We provide evidence that in all genomic structures highly stable sequences are disproportionately abundant, and in 19 of 36 cases highly unstable sequences are disproportionately abundant. This suggests that the stability of mRNA sequences is subject to natural selection. (2) Artificial Selection--motivated by these biological results, we examine the algorithmic problem of designing the most stable and unstable mRNA sequences which code for a target protein. We give a polynomial-time dynamic programming solution to the most stable sequence problem (MSSP), which is asymptotically no more complex

  14. Random walk on random walks

    NARCIS (Netherlands)

    Hilário, M.; Hollander, den W.Th.F.; Sidoravicius, V.; Soares dos Santos, R.; Teixeira, A.

    2014-01-01

    In this paper we study a random walk in a one-dimensional dynamic random environment consisting of a collection of independent particles performing simple symmetric random walks in a Poisson equilibrium with density ρ ∈ (0,∞). At each step the random walk performs a nearest-neighbour jump, moving to

  15. Phase transitions in random uniaxial systems with dipolar interactions

    International Nuclear Information System (INIS)

    Schuster, H.G.

    1977-01-01

    The critical behaviour of random uniaxial ferromagnetic (ferroelectric) systems with both short-range and long-range dipolar interactions is investigated, using the field-theoretic renormalization method of Brezin et al. for the free energy above and below the transition point T_c. The randomness is due to externally introduced fluctuations in the short-range interactions (quenched case) or (and) magneto-elastic coupling to the lattice (annealed case). Strong deviations in the critical behaviour with respect to the pure systems are found. In the quenched case, e.g., the specific heat C and the coefficient f_2 (of M³ in the equation of state, where M is the magnetization) change from C ∝ |ln|t||^(1/3) and f_2 ∝ |ln|t||^(-1) in the pure system to C = A_± + C_±·exp[-4√(3|ln|t||/106)] and f_2 ∝ |ln|t||^(-1/2) (where t = (T-T_c)/T_c is the reduced temperature and A_±, C_± are constants) in the random situation. (orig.) [de]

  16. Prone position as prevention of lung injury in comatose patients: a prospective, randomized, controlled study.

    Science.gov (United States)

    Beuret, Pascal; Carton, Marie-Jose; Nourdine, Karim; Kaaki, Mahmoud; Tramoni, Gerard; Ducreux, Jean-Claude

    2002-05-01

    Comatose patients frequently exhibit worsening pulmonary function, especially in cases of pulmonary infection, which appears to have a deleterious effect on neurologic outcome. We therefore conducted a randomized trial to determine whether daily prone positioning would prevent lung worsening in these patients. Prospective, randomized, controlled study. Sixteen-bed intensive care unit. Fifty-one patients who required invasive mechanical ventilation because of coma with Glasgow coma scores of 9 or less. In the prone position (PP) group: prone positioning for 4 h once daily until the patients could get up to sit in an armchair; in the supine position (SP) group: supine positioning. The primary end point was the incidence of lung worsening, defined as an increase in the Lung Injury Score of at least 1 point since the time of randomization. The secondary end point was the incidence of ventilator-associated pneumonia (VAP). A total of 25 patients were randomly assigned to the PP group and 26 patients to the SP group. The characteristics of the patients in the two groups were similar at randomization. The incidence of lung worsening was lower in the PP group (12%) than in the SP group (50%) (p=0.003). The incidence of VAP was 20% in the PP group and 38.4% in the SP group (p=0.14). There was no serious complication attributable to prone positioning; however, there was a significant increase in intracranial pressure in the PP group. In a selected population of comatose ventilated patients, daily prone positioning reduced the incidence of lung worsening.

  17. “On the Margins and Not the Mainstream:” Case Selection for the Implementation of Community based Primary Health Care in Canada and New Zealand

    Directory of Open Access Journals (Sweden)

    Kerry Kuluski

    2017-06-01

    Healthcare system reforms are pushing beyond primary care to more holistic, integrated models of community based primary health care (CBPHC) to better meet the needs of the population. Across the world CBPHC is at varying stages of development and few standard models exist. In order to scale up and spread successful models of care it is important to study what works and why. The first step is to select 'appropriate' cases to study. In this commentary we reflect on our journey in the selection of CBPHC models for older adults, revealing the limited utility of sourcing the empirical literature; the difficulty in identifying "successful" models to study when outcomes of importance differ across stakeholders; the value of drawing on clinical and organisational networks and experts; and the association between policy context and ease of case selection. Such insights have important implications for case study methodology in health services and policy research.

  18. Nonlinear Pricing with Random Participation

    OpenAIRE

    Jean-Charles Rochet; Lars A. Stole

    2002-01-01

    The canonical selection contracting programme takes the agent's participation decision as deterministic and finds the optimal contract, typically satisfying this constraint for the worst type. Upon weakening this assumption of known reservation values by introducing independent randomness into the agents' outside options, we find that some of the received wisdom from mechanism design and nonlinear pricing is not robust and the richer model which allows for stochastic participation affords a m...

  19. Genetic search feature selection for affective modeling

    DEFF Research Database (Denmark)

    Martínez, Héctor P.; Yannakakis, Georgios N.

    2010-01-01

    Automatic feature selection is a critical step towards the generation of successful computational models of affect. This paper presents a genetic search-based feature selection method which is developed as a global-search algorithm for improving the accuracy of the affective models built. The method is tested and compared against sequential forward feature selection and random search in a dataset derived from a game survey experiment which contains bimodal input features (physiological and gameplay) and expressed pairwise preferences of affect. Results suggest that the proposed method...
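
    A compact, hedged sketch of genetic-search feature selection on synthetic data: bit strings encode feature subsets, fitness is cross-validated accuracy, and tournament selection, one-point crossover, and bit-flip mutation evolve the population. The dataset, preference-learning target, and operator settings of the paper are not reproduced.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=25, n_informative=5, random_state=0)

def fitness(mask):
    """Cross-validated accuracy of a classifier restricted to the selected features."""
    if mask.sum() == 0:
        return 0.0
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

def evolve(pop_size=20, generations=15, p_mut=0.05):
    pop = rng.integers(0, 2, size=(pop_size, X.shape[1]))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        # tournament selection of parents (best of two random individuals)
        parents = pop[[max(rng.integers(0, pop_size, 2), key=lambda i: scores[i])
                       for _ in range(pop_size)]]
        # one-point crossover between consecutive parent pairs
        children = parents.copy()
        for k in range(0, pop_size - 1, 2):
            cut = rng.integers(1, X.shape[1])
            children[k, cut:], children[k + 1, cut:] = (parents[k + 1, cut:].copy(),
                                                        parents[k, cut:].copy())
        # bit-flip mutation
        flips = rng.random(children.shape) < p_mut
        children[flips] ^= 1
        pop = children
    scores = np.array([fitness(ind) for ind in pop])
    best = pop[scores.argmax()]
    return best, scores.max()

best_mask, best_score = evolve()
print("selected features:", np.flatnonzero(best_mask), "| CV accuracy ~", round(best_score, 3))
```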

  20. Impact of selective genotyping in the training population on accuracy and bias of genomic selection.

    Science.gov (United States)

    Zhao, Yusheng; Gowda, Manje; Longin, Friedrich H; Würschum, Tobias; Ranc, Nicolas; Reif, Jochen C

    2012-08-01

    Estimating marker effects based on routinely generated phenotypic data of breeding programs is a cost-effective strategy to implement genomic selection. Truncation selection in breeding populations, however, could have a strong impact on the accuracy to predict genomic breeding values. The main objective of our study was to investigate the influence of phenotypic selection on the accuracy and bias of genomic selection. We used experimental data of 788 testcross progenies from an elite maize breeding program. The testcross progenies were evaluated in unreplicated field trials in ten environments and fingerprinted with 857 SNP markers. Random regression best linear unbiased prediction method was used in combination with fivefold cross-validation based on genotypic sampling. We observed a substantial loss in the accuracy to predict genomic breeding values in unidirectional selected populations. In contrast, estimating marker effects based on bidirectional selected populations led to only a marginal decrease in the prediction accuracy of genomic breeding values. We concluded that bidirectional selection is a valuable approach to efficiently implement genomic selection in applied plant breeding programs.

  1. First-passage exponents of multiple random walks

    International Nuclear Information System (INIS)

    Ben-Naim, E; Krapivsky, P L

    2010-01-01

    We investigate first-passage statistics of an ensemble of N noninteracting random walks on a line. Starting from a configuration in which all particles are located in the positive half-line, we study S_n(t), the probability that the nth rightmost particle remains in the positive half-line up to time t. This quantity decays algebraically, S_n(t) ∼ t^(-β_n), in the long-time limit. Interestingly, there is a family of nontrivial first-passage exponents, β_1 < β_2 < ... < β_(N-1); the only exception is the two-particle case, where β_1 = 1/3. In the N → ∞ limit, however, the exponents attain a scaling form, β_n(N) → β(z) with z = (n - N/2)/√N. We also demonstrate that the smallest exponent decays exponentially with N. We deduce these results from the first-passage kinetics of a random walk in an N-dimensional cone and confirm them using numerical simulations. Additionally, we investigate the family of exponents that characterizes leadership statistics of multiple random walks and find that in this case, the cone provides an excellent approximation.
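
    A hedged simulation sketch of the quantity defined above: N lattice random walks start in the positive half-line, S_n(t) is estimated as the fraction of trials in which the nth rightmost walker (taken here as the running order statistic) has stayed positive, and a crude power-law exponent is fitted to the tail. The parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 3                 # number of walkers
n_rank = 1            # track the nth rightmost walker (1 = rightmost)
trials = 20_000
t_max = 2000

# all walkers start strictly inside the positive half-line
pos = np.tile(np.arange(1.0, N + 1.0), (trials, 1))
alive = np.ones(trials, dtype=bool)
survival = np.empty(t_max)

for t in range(t_max):
    pos += rng.choice((-1.0, 1.0), size=pos.shape)
    # nth rightmost = nth largest position in each trial (running order statistic)
    nth_rightmost = np.sort(pos, axis=1)[:, -n_rank]
    alive &= nth_rightmost > 0
    survival[t] = alive.mean()

# crude exponent estimate from the tail of S_n(t) ~ t^(-beta_n)
ts = np.arange(1, t_max + 1)
tail = slice(t_max // 2, t_max)
beta_hat = -np.polyfit(np.log(ts[tail]), np.log(survival[tail] + 1e-12), 1)[0]
print(f"estimated beta_{n_rank} for N={N} walkers: {beta_hat:.3f}")
```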

  2. Peer-Selected “Best Papers”—Are They Really That “Good”?

    Science.gov (United States)

    Wainer, Jacques; Eckmann, Michael; Rocha, Anderson

    2015-01-01

    Background: Peer evaluation is the cornerstone of science evaluation. In this paper, we analyze whether or not a form of peer evaluation, the pre-publication selection of the best papers in Computer Science (CS) conferences, is better than random, when considering future citations received by the papers. Methods: Considering 12 conferences (for several years), we collected the citation counts from Scopus for both the best papers and the non-best papers. For a different set of 17 conferences, we collected the data from Google Scholar. For each data set, we computed the proportion of cases whereby the best paper has more citations. We also compared this proportion for years before and after 2010 to evaluate whether there is a propaganda effect. Finally, we counted the proportion of best papers that are in the top 10% and 20% most cited for each conference instance. Results: The probability that a best paper will receive more citations than a non-best paper is 0.72 (95% CI = 0.66, 0.77) for the Scopus data, and 0.78 (95% CI = 0.74, 0.81) for the Scholar data. There are no significant changes in the probabilities for different years. Also, 51% of the best papers are among the top 10% most cited papers in each conference/year, and 64% of them are among the top 20% most cited. Discussion: There is strong evidence that the selection of best papers in Computer Science conferences is better than a random selection, and that a significant number of the best papers are among the top cited papers in the conference. PMID:25789480

  3. Effectiveness of 23-valent pneumococcal polysaccharide vaccine and seasonal influenza vaccine for pneumonia among the elderly - Selection of controls in a case-control study.

    Science.gov (United States)

    Kondo, Kyoko; Suzuki, Kanzo; Washio, Masakazu; Ohfuji, Satoko; Fukushima, Wakaba; Maeda, Akiko; Hirota, Yoshio

    2017-08-24

    We conducted a case-control study to elucidate associations between pneumonia in elderly individuals and the 23-valent pneumococcal polysaccharide vaccine (PPSV23) and seasonal influenza vaccine (influenza vaccine). Here, we examined the selection of controls in our study using an analytic epidemiology approach. The study period was from October 1, 2009 through September 30, 2014. Cases comprised ≥65-year-old patients newly diagnosed with pneumonia. For every case with pneumonia, two patients with other diseases (one from respiratory medicine, one from non-respiratory medicine) who were sex-, age-, visit date- and visit hospital-matched were selected as controls. Odds ratios (ORs) and 95% confidence intervals (CIs) of vaccination for pneumonia were calculated using a conditional logistic regression model. Similar analyses were also conducted based on the clinical department of the controls. Analysis was conducted in 234 cases and 438 controls. Effectiveness of pneumococcal vaccination or influenza vaccination against pneumonia was not detected. Proportions of either vaccination among controls were greater in respiratory medicine (pneumococcal vaccine, 38%; influenza vaccine, 55%) than in non-respiratory medicine (23%; 48%). Analysis using controls restricted to respiratory medicine showed marginally significant effectiveness of pneumococcal vaccination (OR, 0.59; 95%CI, 0.34-1.03; P=0.064) and influenza vaccination (0.64; 0.40-1.04; 0.072). However, this effectiveness might have been overestimated by selection bias of controls, as pneumonia cases are not necessarily respiratory medicine patients. In the analysis using controls restricted to non-respiratory medicine, the OR of pneumococcal vaccination for pneumonia was close to 1, presumably because the proportion of pneumococcal vaccination was higher in cases than in controls. Because pneumococcal vaccine was not routinely administered during the study period, differences in recommendations of vaccination by physician in different

  4. A case of selective mutism in an 8-year-old girl with thalassaemia major after bone marrow transplantation.

    Science.gov (United States)

    Plener, P L; Gatz, S A; Schuetz, C; Ludolph, A G; Kölch, M

    2012-01-01

    Selective mutism is rare, with a prevalence below 1% in the general population but a higher prevalence in populations at risk (children with speech retardation, migration). Little evidence on treatment strategies is available. This case report provides information on the treatment of selective mutism in an 8-year-old girl with preexisting thalassaemia major. As medication she received penicillin prophylaxis (500,000 IU/day) and the iron chelator deferasirox (Exjade; 20-25 mg/kg/day). The preexisting somatic disease and its treatment complicated therapy, as there are no data on pharmacological combination therapy. Psychotherapy in day treatment, supported by the SSRI fluoxetine (10 mg), led to a decrease in the selective mutism score from 33 to 12 points, and the GAF improved by 21 points. Mean levels of fluoxetine plus norfluoxetine were 287.8 ng/ml, without significant level fluctuations. © Georg Thieme Verlag KG Stuttgart · New York.

  5. Noncontextuality with Marginal Selectivity in Reconstructing Mental Architectures

    Directory of Open Access Journals (Sweden)

    Ru Zhang

    2015-06-01

    We present a general theory of series-parallel mental architectures with selectively influenced stochastically non-independent components. A mental architecture is a hypothetical network of processes aimed at performing a task, of which we only observe the overall time it takes under variable parameters of the task. It is usually assumed that the network contains several processes selectively influenced by different experimental factors, and then the question is asked as to how these processes are arranged within the network, e.g., whether they are concurrent or sequential. One way of doing this is to consider the distribution functions for the overall processing time and compute certain linear combinations thereof (interaction contrasts). The theory of selective influences in psychology can be viewed as a special application of the interdisciplinary theory of (non)contextuality having its origins and main applications in quantum theory. In particular, lack of contextuality is equivalent to the existence of a hidden random entity of which all the random variables in play are functions. Consequently, for any given value of this common random entity, the processing times and their compositions (minima, maxima, or sums) become deterministic quantities. These quantities, in turn, can be treated as random variables with (shifted) Heaviside distribution functions, for which one can easily compute various linear combinations across different treatments, including interaction contrasts. This mathematical fact leads to a simple method, more general than the previously used ones, to investigate and characterize the interaction contrast for different types of series-parallel architectures.

  6. A RANDOMIZED CONTROLLED PLACEBO STUDY OF DEXTROSE IONTOPHORESIS VERSUS DEXTROSE PROLOTHERAPY IN CASE OF KNEE OSTEOARTHRITIS

    Directory of Open Access Journals (Sweden)

    Mahmoud Mohamed Ahmed Ewidea

    2015-12-01

    Background: Osteoarthritis is the most common cause of musculoskeletal pain and disability in the knee joint. This study investigated the efficacy of dextrose iontophoresis versus dextrose prolotherapy in knee osteoarthritis in a randomized, placebo-controlled, double-blinded study. Methods: Sixty patients diagnosed with mild to moderate osteoarthritis were included in the study. Their ages were 45-65 years, with a mean age of 51 ± 3.5 years. Patients were divided randomly into three equal groups: group (A) received 50% dextrose iontophoresis, group (B) received three intra-articular injections of dextrose at 1-month intervals in weeks 0, 4, and 8, and group (C) received sham iontophoresis. The outcome measurements were the Western Ontario and McMaster Universities arthritis index (WOMAC), knee ROM, and pain severity at rest (seated) and in activity (after walking 6 m) using the visual analogue scale (VAS). The patients were evaluated on these parameters before being allocated to their groups and again 4, 8, and 24 weeks later. Results: Compared to the sham (placebo) group, there was significant improvement in VAS and ROM in the iontophoresis group (p<0.000). There was also significant improvement in the prolotherapy group relative to placebo (p<0.006 and 0.02, respectively). Furthermore, there was significant improvement in the iontophoresis group compared with the prolotherapy group (p<0.000 for VAS, ROM, and WOMAC). Conclusion: The results of this study suggest that both dextrose iontophoresis and dextrose prolotherapy may be useful modalities in the treatment of knee osteoarthritis, with better effects from dextrose iontophoresis than from prolotherapy.

  7. Microbial Resistance to Triclosan: A Case Study in Natural Selection

    Science.gov (United States)

    Serafini, Amanda; Matthews, Dorothy M.

    2009-01-01

    Natural selection is the mechanism of evolution caused by the environmental selection of organisms most fit to reproduce, sometimes explained as "survival of the fittest." An example of evolution by natural selection is the development of bacteria that are resistant to antimicrobial agents as a result of exposure to these agents. Triclosan, which…

  8. A case-control study of risk factors for bovine cysticercosis in Danish cattle herds

    DEFF Research Database (Denmark)

    Calvo Artavia, Francisco Fernando; Nielsen, Liza Rosenbaum; Dahl, J.

    2013-01-01

    than in countries with few lightly infected cases per year. The aim of the present case-control study was to quantify associations between potential herd-level risk factors and BC in Danish cattle herds. Risk factors can be used in the design of a risk-based meat inspection system targeted towards the animals with the highest risk of BC. Cases (n = 77) included herds that hosted at least one animal diagnosed with BC at meat inspection, from 2006 to 2010. Control herds (n = 231) consisted of randomly selected herds that had not hosted any animals diagnosed with BC between 2004 and 2010. The answers from a questionnaire and register data from the Danish Cattle Database were grouped into meaningful variables and used to investigate the risk factors for BC using a multivariable logistic regression model. Case herds were almost three times more likely than control herds to let all or most animals out grazing. Case...

  9. Selection and ranking of occupational safety indicators based on fuzzy AHP: A case study in road construction companies

    Directory of Open Access Journals (Sweden)

    Janackovic, Goran Lj.

    2013-11-01

    This paper presents the factors, performance, and indicators of occupational safety, as well as a method to select and rank occupational safety indicators based on the expert evaluation method and the fuzzy analytic hierarchy process (fuzzy AHP). A case study is done on road construction companies in Serbia. The key safety performance indicators for the road construction industry are identified and ranked according to the results of a survey that included experts who assessed occupational safety risks in these companies. The case study confirmed that organisational factors have a dominant effect on the quality of the occupational health and safety management system in Serbian road construction companies.
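
    Fuzzy AHP extends classical AHP by using fuzzy pairwise judgements; as a hedged illustration of the underlying step only, the sketch below computes crisp AHP priority weights and a consistency ratio from a made-up pairwise comparison matrix. The indicator names and judgements are invented for illustration, and the fuzzification step is omitted.

```python
import numpy as np

# Illustrative pairwise comparison matrix for four made-up indicator groups
# (Saaty 1-9 scale); entry [i, j] states how much more important indicator i is than j.
A = np.array([
    [1.0, 3.0, 5.0, 2.0],
    [1/3, 1.0, 3.0, 1/2],
    [1/5, 1/3, 1.0, 1/4],
    [1/2, 2.0, 4.0, 1.0],
])

# Priority weights = principal eigenvector of A, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio CR = CI / RI, with CI = (lambda_max - n) / (n - 1);
# RI = 0.90 is the commonly tabulated random index for n = 4.
n = A.shape[0]
lambda_max = eigvals.real[k]
CI = (lambda_max - n) / (n - 1)
CR = CI / 0.90

for name, w in zip(["organisational", "technical", "environmental", "human"], weights):
    print(f"{name:>15}: {w:.3f}")
print(f"consistency ratio CR = {CR:.3f} ({'acceptable' if CR < 0.1 else 'revise judgements'})")
```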

  10. Associated factors with attention deficit hyperactivity disorder (ADHD): a case-control study.

    Science.gov (United States)

    Malek, Ayyoub; Amiri, Shahrokh; Sadegfard, Majid; Abdi, Salman; Amini, Saeedeh

    2012-09-01

    The current study investigated factors associated with attention deficit hyperactivity disorder (ADHD) in children without co-morbidities. In this case-control study, 164 children with ADHD who attended the Child and Adolescent Psychiatric Clinics of Tabriz University of Medical Sciences, Iran were compared with 166 normal children selected by a random-cluster method from primary and secondary schools. Clinical interviews based on DSM-IV-TR using the K-SADS were used to diagnose ADHD cases and to select the control group. Participants were matched for age. We used chi-square tests and binary logistic regression for data analysis. Among the factors associated with ADHD were gender and maternal employment. Boys (OR 0.54; 95% confidence interval: 0.34-0.86) and children with working mothers (OR 0.16; 95% confidence interval: 0.06-0.86) suffered more from ADHD. Birth season, family size, birth order, and parental kinship were not among the risk factors for ADHD. The results of the study show that maternal employment and male gender are among the associated risk factors for ADHD.

  11. Filling of a Poisson trap by a population of random intermittent searchers

    KAUST Repository

    Bressloff, Paul C.; Newby, Jay M.

    2012-01-01

    We extend the continuum theory of random intermittent search processes to the case of N independent searchers looking to deliver cargo to a single hidden target located somewhere on a semi-infinite track. Each searcher randomly switches between a

  12. DNA-based random number generation in security circuitry.

    Science.gov (United States)

    Gearheart, Christy M; Arazi, Benjamin; Rouchka, Eric C

    2010-06-01

    DNA-based circuit design is an area of research in which traditional silicon-based technologies are replaced by naturally occurring phenomena taken from biochemistry and molecular biology. This research focuses on further developing DNA-based methodologies to mimic digital data manipulation. While exhibiting fundamental principles, this work was done in conjunction with the vision that DNA-based circuitry, when the technology matures, will form the basis for a tamper-proof security module, revolutionizing the meaning and concept of tamper-proofing and possibly preventing it altogether based on accurate scientific observations. A paramount part of such a solution would be self-generation of random numbers. A novel prototype schema employs solid phase synthesis of oligonucleotides for random construction of DNA sequences; temporary storage and retrieval is achieved through plasmid vectors. A discussion of how to evaluate sequence randomness is included, as well as how these techniques are applied to a simulation of the random number generation circuitry. Simulation results show generated sequences successfully pass three selected NIST random number generation tests specified for security applications.
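
    The NIST suite referenced above is a standardized battery; as a hedged illustration of the kind of check involved, the sketch below maps a DNA sequence to bits with an assumed purine/pyrimidine encoding and applies the frequency (monobit) test from NIST SP 800-22. The encoding and the pseudo-random stand-in sequence are assumptions, not the paper's protocol.

```python
import math
import random

def dna_to_bits(seq):
    """Illustrative encoding: purines (A, G) -> 1, pyrimidines (C, T) -> 0."""
    return [1 if base in "AG" else 0 for base in seq.upper()]

def monobit_test(bits):
    """NIST SP 800-22 frequency (monobit) test: p-value for the +1/-1 partial sum."""
    n = len(bits)
    s = sum(2 * b - 1 for b in bits)             # map 0/1 to -1/+1 and sum
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))       # p >= 0.01 is taken as 'random enough'

# Stand-in for a synthesized oligonucleotide read: a pseudo-random sequence.
rng = random.Random(42)
sequence = "".join(rng.choice("ACGT") for _ in range(2000))

p_value = monobit_test(dna_to_bits(sequence))
print(f"monobit p-value = {p_value:.3f} -> "
      f"{'passes' if p_value >= 0.01 else 'fails'} the frequency test")
```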

  13. Multistage Selection and the Financing of New Ventures

    OpenAIRE

    Jonathan T. Eckhardt; Scott Shane; Frédéric Delmar

    2006-01-01

    Using a random sample of 221 new Swedish ventures initiated in 1998, we examine why some new ventures are more likely than others to successfully be awarded capital from external sources. We examine venture financing as a staged selection process in which two sequential selection events systematically winnow the population of ventures and influence which ventures receive financing. For a venture to receive external financing its founders must first select it as a candidate for external fundin...

  14. Rural Women's Preference For Selected Programmes Of The ...

    African Journals Online (AJOL)

    The study focused on rural women's preference for selected programmes of the National Special Programme for Food Security (NSPFS) in Imo State, Nigeria. Data were collected with the aid of a structured interview from 150 randomly selected women in the study area. Results from the study showed that respondents ...

  15. Improving observational study estimates of treatment effects using joint modeling of selection effects and outcomes: the case of AAA repair.

    Science.gov (United States)

    O'Malley, A James; Cotterill, Philip; Schermerhorn, Marc L; Landon, Bruce E

    2011-12-01

    When 2 treatment approaches are available, there are likely to be unmeasured confounders that influence choice of procedure, which complicates estimation of the causal effect of treatment on outcomes using observational data. To estimate the effect of endovascular (endo) versus open surgical (open) repair, including possible modification by institutional volume, on survival after treatment for abdominal aortic aneurysm, accounting for observed and unobserved confounding variables. Observational study of data from the Medicare program using a joint model of treatment selection and survival given treatment to estimate the effects of type of surgery and institutional volume on survival. We studied 61,414 eligible repairs of intact abdominal aortic aneurysms during 2001 to 2004. The outcome, perioperative death, is defined as in-hospital death or death within 30 days of operation. The key predictors are use of endo, transformed endo and open volume, and endo-volume interactions. There is strong evidence of nonrandom selection of treatment with potential confounding variables including institutional volume and procedure date, variables not typically adjusted for in clinical trials. The best fitting model included heterogeneous transformations of endo volume for endo cases and open volume for open cases as predictors. Consistent with our hypothesis, accounting for unmeasured selection reduced the mortality benefit of endo. The effect of endo versus open surgery varies nonlinearly with endo and open volume. Accounting for institutional experience and unmeasured selection enables better decision-making by physicians making treatment referrals, investigators evaluating treatments, and policy makers.

  16. Critical behavior in inhomogeneous random graphs

    NARCIS (Netherlands)

    Hofstad, van der R.W.

    2013-01-01

    We study the critical behavior of inhomogeneous random graphs in the so-called rank-1 case, where edges are present independently but with unequal edge occupation probabilities. The edge occupation probabilities are moderated by vertex weights, and are such that the degree of vertex i is close in
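
    A hedged sketch of the rank-1 construction described above, in the Chung-Lu form where edge {i, j} appears independently with probability min(w_i·w_j / Σ_k w_k, 1): the weight distribution and parameters are illustrative, and the critical scaling analysis of the paper is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

def rank1_random_graph(weights):
    """Rank-1 inhomogeneous random graph (Chung-Lu form):
    edge {i, j} is present independently with probability min(w_i*w_j / W, 1)."""
    w = np.asarray(weights, dtype=float)
    n, W = len(w), w.sum()
    p = np.minimum(np.outer(w, w) / W, 1.0)
    upper = np.triu(rng.random((n, n)) < p, k=1)    # independent draws, upper triangle
    return upper | upper.T                          # symmetric adjacency matrix, no loops

# Heavy-tailed weights make the degrees inhomogeneous; i.i.d. Pareto is one choice.
n = 2000
weights = rng.pareto(2.5, size=n) + 1.0
A = rank1_random_graph(weights)
degrees = A.sum(axis=1)
print(f"mean degree ~ {degrees.mean():.2f}, max degree ~ {degrees.max()}")
print(f"expected mean degree ~ {weights.sum() / n:.2f}")   # roughly mean(w) for Chung-Lu
```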

  17. A New Random Walk for Replica Detection in WSNs

    Science.gov (United States)

    Aalsalem, Mohammed Y.; Saad, N. M.; Hossain, Md. Shohrab; Atiquzzaman, Mohammed; Khan, Muhammad Khurram

    2016-01-01

    Wireless Sensor Networks (WSNs) are vulnerable to Node Replication attacks or Clone attacks. Among all the existing clone detection protocols in WSNs, RAWL shows the most promising results by employing Simple Random Walk (SRW). More recently, RAND outperforms RAWL by incorporating Network Division with SRW. Both RAND and RAWL have used SRW for random selection of witness nodes which is problematic because of frequently revisiting the previously passed nodes that leads to longer delays, high expenditures of energy with lower probability that witness nodes intersect. To circumvent this problem, we propose to employ a new kind of constrained random walk, namely Single Stage Memory Random Walk and present a distributed technique called SSRWND (Single Stage Memory Random Walk with Network Division). In SSRWND, single stage memory random walk is combined with network division aiming to decrease the communication and memory costs while keeping the detection probability higher. Through intensive simulations it is verified that SSRWND guarantees higher witness node security with moderate communication and memory overheads. SSRWND is expedient for security oriented application fields of WSNs like military and medical. PMID:27409082

  18. A New Random Walk for Replica Detection in WSNs.

    Science.gov (United States)

    Aalsalem, Mohammed Y; Khan, Wazir Zada; Saad, N M; Hossain, Md Shohrab; Atiquzzaman, Mohammed; Khan, Muhammad Khurram

    2016-01-01

    Wireless Sensor Networks (WSNs) are vulnerable to Node Replication attacks or Clone attacks. Among all the existing clone detection protocols in WSNs, RAWL shows the most promising results by employing Simple Random Walk (SRW). More recently, RAND outperforms RAWL by incorporating Network Division with SRW. Both RAND and RAWL have used SRW for random selection of witness nodes which is problematic because of frequently revisiting the previously passed nodes that leads to longer delays, high expenditures of energy with lower probability that witness nodes intersect. To circumvent this problem, we propose to employ a new kind of constrained random walk, namely Single Stage Memory Random Walk and present a distributed technique called SSRWND (Single Stage Memory Random Walk with Network Division). In SSRWND, single stage memory random walk is combined with network division aiming to decrease the communication and memory costs while keeping the detection probability higher. Through intensive simulations it is verified that SSRWND guarantees higher witness node security with moderate communication and memory overheads. SSRWND is expedient for security oriented application fields of WSNs like military and medical.
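
    The witness-selection protocol, network division, and security analysis of SSRWND are not modelled here; the sketch below only contrasts a simple random walk with a single-stage-memory walk (which avoids stepping straight back to the previous node) on a toy graph, illustrating why the memory reduces revisits.

```python
import random

def simple_random_walk(adj, start, steps, rng):
    """Simple random walk: next hop chosen uniformly among all neighbours."""
    path, node = [start], start
    for _ in range(steps):
        node = rng.choice(adj[node])
        path.append(node)
    return path

def single_stage_memory_walk(adj, start, steps, rng):
    """Single-stage memory: never step straight back to the previous node
    (unless it is the only neighbour)."""
    path, node, prev = [start], start, None
    for _ in range(steps):
        choices = [v for v in adj[node] if v != prev] or adj[node]
        prev, node = node, rng.choice(choices)
        path.append(node)
    return path

# Toy sensor field: a ring of 30 nodes with a few random chords.
rng = random.Random(1)
n = 30
adj = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
for _ in range(10):
    a, b = rng.sample(range(n), 2)
    if b not in adj[a]:
        adj[a].append(b)
        adj[b].append(a)

for name, walk in (("SRW", simple_random_walk), ("single-stage memory", single_stage_memory_walk)):
    path = walk(adj, start=0, steps=200, rng=random.Random(7))
    print(f"{name:>20}: {len(set(path))} distinct nodes visited out of {n}")
```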

  19. The concentration of heavy metals: zinc, cadmium, lead, copper, mercury, iron and calcium in head hair of a randomly selected sample of Kenyan people

    International Nuclear Information System (INIS)

    Wandiga, S.O.; Jumba, I.O.

    1982-01-01

    An intercomparative analysis of the concentration of heavy metals: zinc, cadmium, lead, copper, mercury, iron and calcium in head hair of a randomly selected sample of Kenyan people using the techniques of atomic absorption spectrophotometry (AAS) and differential pulse anodic stripping voltammetry (DPAS) has been undertaken. The percent relative standard deviation for each sample analysed using either technique shows good sensitivity and correlation between the techniques. The DPAS was found to be slightly more sensitive than the AAS instrument used. The recalculated body burden ratios of Cd to Zn and Pb to Fe reveal no unusual health impairment symptoms and suggest a relatively clean environment in Kenya. (author)

  20. Meta-analytic comparison of randomized and nonrandomized studies of breast cancer surgery.

    Science.gov (United States)

    Edwards, Janet P; Kelly, Elizabeth J; Lin, Yongtao; Lenders, Taryn; Ghali, William A; Graham, Andrew J

    2012-06-01

    Randomized controlled trials (RCTs) are thought to provide the most accurate estimation of "true" treatment effect. The relative quality of effect estimates derived from nonrandomized studies (nRCTs) remains unclear, particularly in surgery, where the obstacles to performing high-quality RCTs are compounded. We performed a meta-analysis of effect estimates of RCTs comparing surgical procedures for breast cancer relative to those of corresponding nRCTs. English-language RCTs of breast cancer treatment in human patients published from 2003 to 2008 were identified in MEDLINE, EMBASE and Cochrane databases. We identified nRCTs using the National Library of Medicine's "related articles" function and reference lists. Two reviewers conducted all steps of study selection. We included studies comparing 2 surgical arms for the treatment of breast cancer. Information on treatment efficacy estimates, expressed as relative risk (RR) for outcomes of interest in both the RCTs and nRCTs was extracted. We identified 12 RCTs representing 10 topic/outcome combinations with comparable nRCTs. On visual inspection, 4 of 10 outcomes showed substantial differences in summary RR. The pooled RR estimates for RCTs versus nRCTs differed more than 2-fold in 2 of 10 outcomes and failed to demonstrate consistency of statistical differences in 3 of 10 cases. A statistically significant difference, as assessed by the z score, was not detected for any of the outcomes. Randomized controlled trials comparing surgical procedures for breast cancer may demonstrate clinically relevant differences in effect estimates in 20%-40% of cases relative to those generated by nRCTs, depending on which metric is used.