WorldWideScience

Sample records for twenty randomly selected

  1. Random Selection for Drug Screening

    Energy Technology Data Exchange (ETDEWEB)

    Center for Human Reliability Studies

    2007-05-01

    Simple random sampling is generally the starting point for a random sampling process. This sampling technique ensures that each individual within a group (population) has an equal chance of being selected. There are a variety of ways to implement random sampling in a practical situation.
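
    The equal-chance property described above is easy to demonstrate in code. The following minimal Python sketch (the pool of employee IDs and the sample size are invented for illustration) draws a simple random sample without replacement:

      import random

      def simple_random_sample(population, k, seed=None):
          """Draw k members without replacement; every member has the same chance of selection."""
          rng = random.Random(seed)
          return rng.sample(list(population), k)

      # Hypothetical example: select 5 of 100 employee IDs for screening.
      employee_ids = list(range(1, 101))
      print(simple_random_sample(employee_ids, 5, seed=42))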

  2. Random Selection for Drug Screening

    Energy Technology Data Exchange (ETDEWEB)

    Center for Human Reliability Studies

    2007-05-01

    Sampling is the process of choosing some members out of a group or population. Probability sampling, or random sampling, is the process of selecting members by chance with a known probability of each individual being chosen.

  3. Selective sweeps across twenty millions years of primate evolution

    DEFF Research Database (Denmark)

    Munch, Kasper; Nam, Kiwoong; Schierup, Mikkel Heide

    2016-01-01

    The contribution from selective sweeps to variation in genetic diversity has proven notoriously difficult to assess, in part because polymorphism data only allows detection of sweeps in the most recent few hundred thousand years. Here we show how linked selection in ancestral species can be quant...

  4. Randomized selection on the GPU

    Energy Technology Data Exchange (ETDEWEB)

    Monroe, Laura Marie [Los Alamos National Laboratory]; Wendelberger, Joanne R [Los Alamos National Laboratory]; Michalak, Sarah E [Los Alamos National Laboratory]

    2011-01-13

    We implement here a fast and memory-sparing probabilistic top N selection algorithm on the GPU. To our knowledge, this is the first direct selection in the literature for the GPU. The algorithm proceeds via a probabilistic guess-and-check process searching for the Nth element. It always gives a correct result and always terminates. The use of randomization reduces the amount of data that needs heavy processing, and so reduces the average time required for the algorithm. Probabilistic Las Vegas algorithms of this kind are a form of stochastic optimization and can be well suited to more general parallel processors with limited amounts of fast memory.
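
    The GPU code itself is not reproduced in the record. As a rough illustration of the Las Vegas guess-and-check idea described above, the following CPU-only Python sketch repeatedly guesses a random pivot, checks how many elements lie above it, and shrinks the candidate set until the element of rank N remains; the sequential implementation and all names are assumptions, not the authors' code.

      import random

      def randomized_nth_largest(data, n, rng=random.Random(0)):
          """Las Vegas selection: random pivots shrink the candidate set until the
          n-th largest element is isolated. The result is always exact."""
          candidates, rank = list(data), n
          while True:
              pivot = rng.choice(candidates)
              above = [x for x in candidates if x > pivot]
              equal = [x for x in candidates if x == pivot]
              if rank <= len(above):
                  candidates = above                      # answer lies strictly above the pivot
              elif rank <= len(above) + len(equal):
                  return pivot                            # the pivot itself has the requested rank
              else:
                  rank -= len(above) + len(equal)         # discard the pivot and everything above it
                  candidates = [x for x in candidates if x < pivot]

      values = [random.random() for _ in range(10_000)]
      print(randomized_nth_largest(values, 10) == sorted(values, reverse=True)[9])  # True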

  5. Minimization over randomly selected lines

    Directory of Open Access Journals (Sweden)

    Ismet Sahin

    2013-07-01

    This paper presents a population-based evolutionary optimization method for minimizing a given cost function. The mutation operator of this method selects randomly oriented lines in the cost function domain, constructs quadratic functions interpolating the cost function at three different points over each line, and uses extrema of the quadratics as mutated points. The crossover operator modifies each mutated point based on components of two points in the population, instead of one point as is usually performed in other evolutionary algorithms. The stopping criterion of this method depends on the number of almost degenerate quadratics. We demonstrate that the proposed method with these mutation and crossover operations achieves faster and more robust convergence than the well-known Differential Evolution and Particle Swarm algorithms.
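
    As a rough Python sketch of the mutation step described above (the test cost function, step size, and greedy acceptance rule are invented for illustration and are not taken from the paper), one can interpolate the cost along a random line with a quadratic and jump to the extremum of that quadratic:

      import numpy as np

      def quadratic_line_mutation(cost, x, step=1.0, rng=np.random.default_rng(0)):
          """Fit a quadratic to the cost at three points on a random line through x
          and move to the quadratic's extremum (nearly degenerate quadratics are skipped)."""
          d = rng.standard_normal(x.size)
          d /= np.linalg.norm(d)                      # random direction defining the line
          t = np.array([-step, 0.0, step])
          f = np.array([cost(x + ti * d) for ti in t])
          a, b, _ = np.polyfit(t, f, 2)               # f(t) ~ a*t^2 + b*t + c
          if abs(a) < 1e-12:                          # almost degenerate quadratic: keep x
              return x
          return x + (-b / (2 * a)) * d               # extremum of the interpolating quadratic

      sphere = lambda x: float(np.sum(x ** 2))
      x = np.ones(5)
      for _ in range(50):                             # greedy acceptance, for illustration only
          cand = quadratic_line_mutation(sphere, x)
          if sphere(cand) < sphere(x):
              x = cand
      print(round(sphere(x), 6))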

  6. Random Effect and Latent Variable Model Selection

    CERN Document Server

    Dunson, David B

    2008-01-01

    Presents various methods for accommodating model uncertainty in random effects and latent variable models. This book focuses on frequentist likelihood ratio and score tests for zero variance components. It also focuses on Bayesian methods for random effects selection in linear mixed effects and generalized linear mixed models

  7. Improving randomness characterization through Bayesian model selection

    CERN Document Server

    Díaz Hernández R., Rafael; Angulo Martínez, Alí M.; U'Ren, Alfred B.; Hirsch, Jorge G.; Marsili, Matteo; Pérez Castillo, Isaac

    2016-01-01

    Nowadays random number generation plays an essential role in technology with important applications in areas ranging from cryptography, which lies at the core of current communication protocols, to Monte Carlo methods, and other probabilistic algorithms. In this context, a crucial scientific endeavour is to develop effective methods that allow the characterization of random number generators. However, commonly employed methods either lack formality (e.g. the NIST test suite), or are inapplicable in principle (e.g. the characterization derived from the Algorithmic Theory of Information (ATI)). In this letter we present a novel method based on Bayesian model selection, which is both rigorous and effective, for characterizing randomness in a bit sequence. We derive analytic expressions for a model's likelihood which is then used to compute its posterior probability distribution. Our method proves to be more rigorous than NIST's suite and the Borel-Normality criterion and its implementation is straightforward. We...

  8. Randomness in post-selected events

    Science.gov (United States)

    Phuc Thinh, Le; de la Torre, Gonzalo; Bancal, Jean-Daniel; Pironio, Stefano; Scarani, Valerio

    2016-03-01

    Bell inequality violations can be used to certify private randomness for use in cryptographic applications. In photonic Bell experiments, a large amount of the data that is generated comes from no-detection events and presumably contains little randomness. This raises the question as to whether randomness can be extracted only from the smaller post-selected subset corresponding to proper detection events, instead of from the entire set of data. This could in principle be feasible without opening an analogue of the detection loophole as long as the min-entropy of the post-selected data is evaluated by taking all the information into account, including no-detection events. The possibility of extracting randomness from a short string has a practical advantage, because it reduces the computational time of the extraction. Here, we investigate the above idea in a simple scenario, where the devices and the adversary behave according to i.i.d. strategies. We show that indeed almost all the randomness is present in the pair of outcomes for which at least one detection happened. We further show that in some cases applying a pre-processing on the data can capture features that an analysis based on global frequencies only misses, thus resulting in the certification of more randomness. We then briefly consider non-i.i.d strategies and provide an explicit example of such a strategy that is more powerful than any i.i.d. one even in the asymptotic limit of infinitely many measurement rounds, something that was not reported before in the context of Bell inequalities.

  9. Selecting materialized views using random algorithm

    Science.gov (United States)

    Zhou, Lijuan; Hao, Zhongxiao; Liu, Chi

    2007-04-01

    The data warehouse is a repository of information collected from multiple possibly heterogeneous autonomous distributed databases. The information stored at the data warehouse is in form of views referred to as materialized views. The selection of the materialized views is one of the most important decisions in designing a data warehouse. Materialized views are stored in the data warehouse for the purpose of efficiently implementing on-line analytical processing queries. The first issue for the user to consider is query response time. So in this paper, we develop algorithms to select a set of views to materialize in data warehouse in order to minimize the total view maintenance cost under the constraint of a given query response time. We call it query_cost view_ selection problem. First, cost graph and cost model of query_cost view_ selection problem are presented. Second, the methods for selecting materialized views by using random algorithms are presented. The genetic algorithm is applied to the materialized views selection problem. But with the development of genetic process, the legal solution produced become more and more difficult, so a lot of solutions are eliminated and producing time of the solutions is lengthened in genetic algorithm. Therefore, improved algorithm has been presented in this paper, which is the combination of simulated annealing algorithm and genetic algorithm for the purpose of solving the query cost view selection problem. Finally, in order to test the function and efficiency of our algorithms experiment simulation is adopted. The experiments show that the given methods can provide near-optimal solutions in limited time and works better in practical cases. Randomized algorithms will become invaluable tools for data warehouse evolution.

  10. A Strong Limit Theorem on Generalized Random Selection for m-valued Random Sequences

    Institute of Scientific and Technical Information of China (English)

    WANG Zhong-zhi; XU Fu-xia

    2003-01-01

    In this paper, a strong limit theorem on gambling strategy for binary Bernoulli sequences, i.e. the irregularity theorem, is extended to random selection for dependent m-valued random variables, using a new method: differentiability on a net. Furthermore, by allowing the selection function to take values in the finite interval [-M, M], the concept of random selection is generalized.

  11. Random forest for gene selection and microarray data classification.

    Science.gov (United States)

    Moorthy, Kohbalan; Mohamad, Mohd Saberi

    2011-01-01

    A random forest method has been selected to perform both gene selection and classification of the microarray data. In this embedded method, the selection of the smallest possible set of genes with the lowest error rate is the key factor in achieving the highest classification accuracy. Hence, an improved gene selection method using random forest has been proposed to obtain the smallest subset of genes as well as the biggest subset of genes prior to classification. The option for biggest-subset selection is provided to assist researchers who intend to use the informative genes for further research. The enhanced random forest gene selection has performed better in terms of selecting the smallest as well as the biggest subset of informative genes with the lowest out-of-bag error rates through gene selection. Furthermore, the classification performed on the selected subset of genes using random forest has led to lower prediction error rates compared to the existing method and other similar available methods.
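
    As an illustration of the general idea (not the authors' code), the following Python sketch, assuming scikit-learn and a synthetic stand-in for microarray data, ranks genes by random forest importance and keeps the smallest candidate subset with the best out-of-bag score:

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier

      # Synthetic stand-in for microarray data: 200 samples x 500 "genes".
      X, y = make_classification(n_samples=200, n_features=500, n_informative=20, random_state=0)

      rf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0).fit(X, y)
      ranking = np.argsort(rf.feature_importances_)[::-1]        # genes ranked by importance

      best_k, best_oob = None, -np.inf
      for k in (5, 10, 20, 50, 100):                             # candidate subset sizes
          sub = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
          sub.fit(X[:, ranking[:k]], y)
          if sub.oob_score_ > best_oob:                          # smaller subsets win ties
              best_k, best_oob = k, sub.oob_score_
      print(best_k, round(1 - best_oob, 3))                      # chosen subset size and its OOB error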

  12. In-Place Randomized Slope Selection

    DEFF Research Database (Denmark)

    Blunck, Henrik; Vahrenhold, Jan

    2006-01-01

    Slope selection, i.e. selecting the slope with rank k among all n(n−1)/2 lines induced by a collection P of points, results in a widely used robust estimator for line fitting. In this paper, we demonstrate that it is possible to perform slope selection in expected O(n log² n) time using only...
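
    The expected-time, constant-extra-space algorithm itself is not reproduced in the record. The brute-force Python sketch below only illustrates what is being selected, namely the slope of rank k among the n(n−1)/2 pairwise slopes; the sample points are invented.

      from itertools import combinations

      def slope_of_rank_k(points, k):
          """Brute-force reference: the k-th smallest of the pairwise slopes.
          Choosing k near the middle gives the Theil-Sen (median-slope) line estimator."""
          slopes = sorted((y2 - y1) / (x2 - x1)
                          for (x1, y1), (x2, y2) in combinations(points, 2)
                          if x1 != x2)
          return slopes[k]

      pts = [(0, 0.1), (1, 1.0), (2, 2.2), (3, 2.9), (4, 4.3)]
      n_pairs = len(pts) * (len(pts) - 1) // 2
      print(round(slope_of_rank_k(pts, n_pairs // 2), 3))   # median slope, roughly 1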

  13. In-Place Randomized Slope Selection

    DEFF Research Database (Denmark)

    Blunck, Henrik; Vahrenhold, Jan

    2006-01-01

    Slope selection, i.e. selecting the slope with rank k among all n(n−1)/2 lines induced by a collection P of points, results in a widely used robust estimator for line fitting. In this paper, we demonstrate that it is possible to perform slope selection in expected O(n log² n) time using only...

  14. Investigation of twenty selected medicinal plants from Malaysia for anti-Chikungunya virus activity.

    Science.gov (United States)

    Chan, Yik Sin; Khoo, Kong Soo; Sit, Nam Weng

    2016-09-01

    Chikungunya virus is a reemerging arbovirus transmitted mainly by Aedes mosquitoes. As there are no specific treatments available, Chikungunya virus infection is a significant public health problem. This study investigated 120 extracts from selected medicinal plants for anti-Chikungunya virus activity. The plant materials were subjected to sequential solvent extraction to obtain six different extracts for each plant. The cytotoxicity and antiviral activity of each extract were examined using African monkey kidney epithelial (Vero) cells. The ethanol, methanol and chloroform extracts of Tradescantia spathacea (Commelinaceae) leaves showed the strongest cytopathic effect inhibition on Vero cells, resulting in cell viabilities of 92.6% ± 1.0% (512 μg/ml), 91.5% ± 1.7% (512 μg/ml) and 88.8% ± 2.4% (80 μg/ml) respectively. However, quantitative RT-PCR analysis revealed that the chloroform extract of Rhapis excelsa (Arecaceae) leaves resulted in the highest percentage of reduction of viral load (98.1%), followed by the ethyl acetate extract of Vernonia amygdalina (Compositae) leaves (95.5%). The corresponding 50% effective concentrations (EC50) and selectivity indices for these two extracts were 29.9 ± 0.9 and 32.4 ± 1.3 μg/ml, and 5.4 and 5.1 respectively. Rhapis excelsa and Vernonia amygdalina could be sources of anti-Chikungunya virus agents. [Int Microbiol 19(3):175-182 (2016)]. Copyright© by the Spanish Society for Microbiology and Institute for Catalan Studies.

  15. Random effect selection in generalised linear models

    DEFF Research Database (Denmark)

    Denwood, Matt; Houe, Hans; Forkman, Björn;

    We analysed abattoir recordings of meat inspection codes with possible relevance to on-farm animal welfare in cattle. Random effects logistic regression models were used to describe individual-level data obtained from 461,406 cattle slaughtered in Denmark. Our results demonstrate that the largest...

  16. In-Place Randomized Slope Selection

    DEFF Research Database (Denmark)

    Blunck, Henrik; Vahrenhold, Jan

    2006-01-01

    Slope selection is a well-known algorithmic tool used in the context of computing robust estimators for fitting a line to a collection P of n points in the plane. We demonstrate that it is possible to perform slope selection in expected O(nlogn) time using only constant extra space in addition...

  17. Zinc balance of twenty healthy elderly subjects consuming self-selected diets

    Energy Technology Data Exchange (ETDEWEB)

    Souza, M.C.; Prather, E.S.; Rhodes, D.G. (Univ. of Maryland, College Park (United States))

    1991-03-15

    Dietary zinc (Zn) intake and balance were determined in ten male and ten female free-living, healthy, elderly subjects on self-selected diets over a period of seven consecutive days. Zn content in the diet, fecal and urine composites for each subject was determined by atomic absorption spectrophotometry. Mean age for the 20 participants was 73.9 years. Mean Zn intakes were 8.9 and 23.3 mg/day for females and males, respectively. Female dietary intakes ranged from 8.2 to 26.8 mg Zn/day. However, three of the males took Zn supplements which extended the total intake range to 64.9 mg/day. Mean Zn balances were +0.1 and +5.1 mg/day for females and males, respectively; ranges for females were −0.1 to +4.3 mg/day and for males were −7.3 to +15.1 mg/day. The 1989 RDAs are 15 mg for males and 12 mg for females. Only three females consumed more than 12 mg Zn/day. Only two males consumed more than 15 mg/day from their diet; two other males consumed more than 15 mg due to Zn supplements. Total dietary phytate (TDP) and total dietary fiber (TDF) were calculated from the 7-day weighed food records. Mean TDP intake for females was 1,159 mg/day; mean TDP for the males was 1,661 mg/day. Mean TDF intake for females was 19 g/day; mean TDF for males was 30 g/day.

  18. Bayesian nonparametric centered random effects models with variable selection.

    Science.gov (United States)

    Yang, Mingan

    2013-03-01

    In a linear mixed effects model, it is common practice to assume that the random effects follow a parametric distribution such as a normal distribution with mean zero. However, in the case of variable selection, substantial violation of the normality assumption can potentially impact the subset selection and result in poor interpretation and even incorrect results. In nonparametric random effects models, the random effects generally have a nonzero mean, which causes an identifiability problem for the fixed effects that are paired with the random effects. In this article, we focus on a Bayesian method for variable selection. We characterize the subject-specific random effects nonparametrically with a Dirichlet process and resolve the bias simultaneously. In particular, we propose flexible modeling of the conditional distribution of the random effects with changes across the predictor space. The approach is implemented using a stochastic search Gibbs sampler to identify subsets of fixed effects and random effects to be included in the model. Simulations are provided to evaluate and compare the performance of our approach to the existing ones. We then apply the new approach to a real data example, cross-country and interlaboratory rodent uterotrophic bioassay.

  19. Selectivity and sparseness in randomly connected balanced networks.

    Directory of Open Access Journals (Sweden)

    Cengiz Pehlevan

    Neurons in sensory cortex show stimulus selectivity and sparse population response, even in cases where no strong functionally specific structure in connectivity can be detected. This raises the question whether selectivity and sparseness can be generated and maintained in randomly connected networks. We consider a recurrent network of excitatory and inhibitory spiking neurons with random connectivity, driven by random projections from an input layer of stimulus selective neurons. In this architecture, the stimulus-to-stimulus and neuron-to-neuron modulation of total synaptic input is weak compared to the mean input. Surprisingly, we show that in the balanced state the network can still support high stimulus selectivity and sparse population response. In the balanced state, strong synapses amplify the variation in synaptic input and recurrent inhibition cancels the mean. Functional specificity in connectivity emerges due to the inhomogeneity caused by the generative statistical rule used to build the network. We further elucidate the mechanism behind and evaluate the effects of model parameters on population sparseness and stimulus selectivity. Network response to mixtures of stimuli is investigated. It is shown that a balanced state with unselective inhibition can be achieved with densely connected input to inhibitory population. Balanced networks exhibit the "paradoxical" effect: an increase in excitatory drive to inhibition leads to decreased inhibitory population firing rate. We compare and contrast selectivity and sparseness generated by the balanced network to randomly connected unbalanced networks. Finally, we discuss our results in light of experiments.

  20. Accidental politicians: How randomly selected legislators can improve parliament efficiency

    Science.gov (United States)

    Pluchino, Alessandro; Garofalo, Cesare; Rapisarda, Andrea; Spagano, Salvatore; Caserta, Maurizio

    2011-10-01

    We study a prototypical model of a Parliament with two Parties or two Political Coalitions and we show how the introduction of a variable percentage of randomly selected independent legislators can increase the global efficiency of a Legislature, in terms of both the number of laws passed and the average social welfare obtained. We also analytically find an “efficiency golden rule” which allows one to fix the optimal number of legislators to be selected at random after regular elections have established the relative proportion of the two Parties or Coalitions. These results are in line with both the ancient Greek democratic system and the recent discovery that the adoption of random strategies can improve the efficiency of hierarchical organizations.

  1. Accidental Politicians: How Randomly Selected Legislators Can Improve Parliament Efficiency

    CERN Document Server

    Pluchino, A; Rapisarda, A; Spagano, S; Caserta, M

    2011-01-01

    We study a prototypical model of a Parliament with two Parties or two Political Coalitions and we show how the introduction of a variable percentage of randomly selected independent legislators can increase the global efficiency of a Legislature, in terms of both the number of laws passed and the average social welfare obtained. We also analytically find an "efficiency golden rule" which allows one to fix the optimal number of legislators to be selected at random after regular elections have established the relative proportion of the two Parties or Coalitions. These results are in line with both the ancient Greek democratic system and the recent discovery that the adoption of random strategies can improve the efficiency of hierarchical organizations.

  2. Local randomization in neighbor selection improves PRM roadmap quality

    KAUST Repository

    McMahon, Troy

    2012-10-01

    Probabilistic Roadmap Methods (PRMs) are one of the most used classes of motion planning methods. These sampling-based methods generate robot configurations (nodes) and then connect them to form a graph (roadmap) containing representative feasible pathways. A key step in PRM roadmap construction involves identifying a set of candidate neighbors for each node. Traditionally, these candidates are chosen to be the k-closest nodes based on a given distance metric. In this paper, we propose a new neighbor selection policy called LocalRand(k, K'), that first computes the K' closest nodes to a specified node and then selects k of those nodes at random. Intuitively, LocalRand attempts to benefit from random sampling while maintaining the higher levels of local planner success inherent to selecting more local neighbors. We provide a methodology for selecting the parameters k and K'. We perform an experimental comparison which shows that for both rigid and articulated robots, LocalRand results in roadmaps that are better connected than the traditional k-closest policy or a purely random neighbor selection policy. The cost required to achieve these results is shown to be comparable to k-closest. © 2012 IEEE.
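
    A minimal Python sketch of the LocalRand(k, K') policy as described above; the Euclidean metric, the tuple representation of configurations, and all numbers are illustrative assumptions rather than the paper's setup.

      import math
      import random

      def local_rand_neighbors(node, nodes, k, k_prime, rng=random.Random(0)):
          """LocalRand(k, K'): take the K' nodes closest to `node` (Euclidean metric here),
          then return k of them chosen uniformly at random."""
          others = [q for q in nodes if q is not node]
          closest = sorted(others, key=lambda q: math.dist(node, q))[:k_prime]
          return rng.sample(closest, min(k, len(closest)))

      configurations = [(random.random(), random.random()) for _ in range(100)]
      print(local_rand_neighbors(configurations[0], configurations, k=5, k_prime=20))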

  3. Selecting a phoneme-to-grapheme mapping: Random or weighted selection?

    Directory of Open Access Journals (Sweden)

    Binna Lee

    2015-05-01

    Our findings demonstrate that random selection underestimates MOA’s PG correspondences whereas weighted selection predicts higher PG correspondences than he produces. To explain his intermediate spelling performance on PPEs, we will test additional approaches to weighing the relative probability of PG mappings, including using log frequencies, separating consonant and vowel status, and considering the number of grapheme options in each phoneme.

  4. Clonal Selection Algorithm Based Iterative Learning Control with Random Disturbance

    Directory of Open Access Journals (Sweden)

    Yuanyuan Ju

    2013-01-01

    The clonal selection algorithm is improved and proposed as a method to solve optimization problems in iterative learning control, and a clonal selection algorithm based optimal iterative learning control algorithm with random disturbance is proposed. In the algorithm, the size of the search space is decreased and, at the same time, the convergence speed of the algorithm is increased. In addition, a model modifying device is used in the algorithm to cope with the uncertainty in the plant model. Simulations show that the convergence speed is satisfactory for nonlinear plants regardless of whether or not the plant model is precise. The simulations also verify that the controlled system with random disturbance can reach stability by using the improved iterative learning control law but not the traditional control law.

  5. Interference-aware random beam selection for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed M.

    2012-09-01

    Spectrum sharing systems have been introduced to alleviate the problem of spectrum scarcity by allowing secondary unlicensed networks to share the spectrum with primary licensed networks under acceptable interference levels to the primary users. In this paper, we develop interference-aware random beam selection schemes that provide enhanced throughput for the secondary link under the condition that the interference observed at the primary link is within a predetermined acceptable value. For a secondary transmitter equipped with multiple antennas, our schemes select a random beam, among a set of power-optimized orthogonal random beams, that maximizes the capacity of the secondary link while satisfying the interference constraint at the primary receiver for different levels of feedback information describing the interference level at the primary receiver. For the proposed schemes, we develop a statistical analysis for the signal-to-noise and interference ratio (SINR) statistics as well as the capacity of the secondary link. Finally, we present numerical results that study the effect of system parameters including number of beams and the maximum transmission power on the capacity of the secondary link attained using the proposed schemes. © 2012 IEEE.

  6. Selective randomized load balancing and mesh networks with changing demands

    Science.gov (United States)

    Shepherd, F. B.; Winzer, P. J.

    2006-05-01

    We consider the problem of building cost-effective networks that are robust to dynamic changes in demand patterns. We compare several architectures using demand-oblivious routing strategies. Traditional approaches include single-hop architectures based on a (static or dynamic) circuit-switched core infrastructure and multihop (packet-switched) architectures based on point-to-point circuits in the core. To address demand uncertainty, we seek minimum cost networks that can carry the class of hose demand matrices. Apart from shortest-path routing, Valiant's randomized load balancing (RLB), and virtual private network (VPN) tree routing, we propose a third, highly attractive approach: selective randomized load balancing (SRLB). This is a blend of dual-hop hub routing and randomized load balancing that combines the advantages of both architectures in terms of network cost, delay, and delay jitter. In particular, we give empirical analyses for the cost (in terms of transport and switching equipment) for the discussed architectures, based on three representative carrier networks. Of these three networks, SRLB maintains the resilience properties of RLB while achieving significant cost reduction over all other architectures, including RLB and multihop Internet protocol/multiprotocol label switching (IP/MPLS) networks using VPN-tree routing.

  7. Unbiased split variable selection for random survival forests using maximally selected rank statistics.

    Science.gov (United States)

    Wright, Marvin N; Dankowski, Theresa; Ziegler, Andreas

    2017-04-15

    The most popular approach for analyzing survival data is the Cox regression model. The Cox model may, however, be misspecified, and its proportionality assumption may not always be fulfilled. An alternative approach for survival prediction is random forests for survival outcomes. The standard split criterion for random survival forests is the log-rank test statistic, which favors splitting variables with many possible split points. Conditional inference forests avoid this split variable selection bias. However, linear rank statistics are utilized by default in conditional inference forests to select the optimal splitting variable, which cannot detect non-linear effects in the independent variables. An alternative is to use maximally selected rank statistics for the split point selection. As in conditional inference forests, splitting variables are compared on the p-value scale. However, instead of the conditional Monte-Carlo approach used in conditional inference forests, p-value approximations are employed. We describe several p-value approximations and the implementation of the proposed random forest approach. A simulation study demonstrates that unbiased split variable selection is possible. However, there is a trade-off between unbiased split variable selection and runtime. In benchmark studies of prediction performance on simulated and real datasets, the new method performs better than random survival forests if informative dichotomous variables are combined with uninformative variables with more categories and better than conditional inference forests if non-linear covariate effects are included. In a runtime comparison, the method proves to be computationally faster than both alternatives, if a simple p-value approximation is used. Copyright © 2017 John Wiley & Sons, Ltd.

  8. Twenty-five thousand years of fluctuating selection on leopard complex spotting and congenital night blindness in horses.

    Science.gov (United States)

    Ludwig, Arne; Reissmann, Monika; Benecke, Norbert; Bellone, Rebecca; Sandoval-Castellanos, Edson; Cieslak, Michael; Fortes, Gloria G; Morales-Muñiz, Arturo; Hofreiter, Michael; Pruvost, Melanie

    2015-01-19

    Leopard complex spotting is inherited by the incompletely dominant locus, LP, which also causes congenital stationary night blindness in homozygous horses. We investigated an associated single nucleotide polymorphism in the TRPM1 gene in 96 archaeological bones from 31 localities from Late Pleistocene (approx. 17 000 YBP) to medieval times. The first genetic evidence of LP spotting in Europe dates back to the Pleistocene. We tested for temporal changes in the LP associated allele frequency and estimated coefficients of selection by means of approximate Bayesian computation analyses. Our results show that at least some of the observed frequency changes are congruent with shifts in artificial selection pressure for the leopard complex spotting phenotype. In early domestic horses from Kirklareli-Kanligecit (Turkey) dating to 2700-2200 BC, a remarkably high number of leopard spotted horses (six of 10 individuals) was detected including one adult homozygote. However, LP seems to have largely disappeared during the late Bronze Age, suggesting selection against this phenotype in early domestic horses. During the Iron Age, LP reappeared, probably by reintroduction into the domestic gene pool from wild animals. This picture of alternating selective regimes might explain how genetic diversity was maintained in domestic animals despite selection for specific traits at different times.

  9. Ammons quick test validity among randomly selected referrals.

    Science.gov (United States)

    Zagar, Robert John; Kovach, Joseph W; Busch, Kenneth G; Zablocki, Michael D; Osnowitz, William; Neuhengen, Jonas; Liu, Yutong; Zagar, Agata Karolina

    2013-12-01

    After selection using a random number table, from volunteer referrals, 89 Youth (61 boys, 28 girls; 48 African Americans, 2 Asian Americans, 27 Euro-Americans, 12 Hispanic Americans), and 147 Adults (107 men, 40 women; 11 African Americans, 6 Asian Americans, 124 Euro-Americans, 6 Hispanic Americans) were administered the Ammons Quick Test (QT). Means, confidence intervals, standard deviations, and Pearson product-moment correlations among tests were computed. The Ammons QT was moderately to strongly and significantly correlated statistically with: the Peabody Picture Vocabulary Test-3b (PPVT-3b); the Vineland Adaptive Behavior Scales-2 Parent/Teacher Form; the Wechsler Intelligence Scale for Children (WISC-4) or the Wechsler Adult Intelligence Scale (WAIS-4); and the Wide Range Achievement Test-Fourth Edition (WRAT-4) Blue and Green Forms. After 51 years, the original norms for the Ammons QT remain valid measures of receptive vocabulary, verbal intelligence, and auditory information processing useful to clinicians.

  10. Materials selection for oxide-based resistive random access memories

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Yuzheng; Robertson, John [Engineering Department, Cambridge University, Cambridge CB2 1PZ (United Kingdom)

    2014-12-01

    The energies of atomic processes in resistive random access memories (RRAMs) are calculated for four typical oxides, HfO₂, TiO₂, Ta₂O₅, and Al₂O₃, to define a materials selection process. O vacancies have the lowest defect formation energy in the O-poor limit and dominate the processes. A band diagram defines the operating Fermi energy and O chemical potential range. It is shown how the scavenger metal can be used to vary the O vacancy formation energy, via controlling the O chemical potential, and the mean Fermi energy. The high endurance of Ta₂O₅ RRAM is related to its more stable amorphous phase and the adaptive lattice rearrangements of its O vacancy.

  11. Optimizing Event Selection with the Random Grid Search

    Energy Technology Data Exchange (ETDEWEB)

    Bhat, Pushpalatha C. [Fermilab]; Prosper, Harrison B. [Florida State U.]; Sekmen, Sezen [Kyungpook Natl. U.]; Stewart, Chip [Broad Inst., Cambridge]

    2017-06-29

    The random grid search (RGS) is a simple, but efficient, stochastic algorithm to find optimal cuts that was developed in the context of the search for the top quark at Fermilab in the mid-1990s. The algorithm, and associated code, have been enhanced recently with the introduction of two new cut types, one of which has been successfully used in searches for supersymmetry at the Large Hadron Collider. The RGS optimization algorithm is described along with the recent developments, which are illustrated with two examples from particle physics. One explores the optimization of the selection of vector boson fusion events in the four-lepton decay mode of the Higgs boson and the other optimizes SUSY searches using boosted objects and the razor variables.
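
    The RGS code referred to above is not reproduced here. The following Python sketch, with invented Gaussian "signal" and "background" samples and a simple s/√b figure of merit, only illustrates the core idea of using randomly chosen signal events themselves as the grid of candidate cut points:

      import numpy as np

      rng = np.random.default_rng(1)
      # Invented "signal" and "background" samples in two discriminating variables.
      signal = rng.normal(loc=[2.0, 2.0], scale=1.0, size=(5000, 2))
      background = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(50000, 2))

      best = None
      for cut in signal[rng.choice(len(signal), size=500, replace=False)]:
          s_eff = np.mean(np.all(signal > cut, axis=1))         # efficiency of x1 > c1 and x2 > c2
          b_eff = np.mean(np.all(background > cut, axis=1))
          score = s_eff / np.sqrt(b_eff) if b_eff > 0 else 0.0  # illustrative figure of merit
          if best is None or score > best[0]:
              best = (score, cut, s_eff)
      print("best cuts:", np.round(best[1], 2), "signal efficiency:", round(best[2], 3))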

  12. Twenty years of artificial directional selection have shaped the genome of the Italian Large White pig breed.

    Science.gov (United States)

    Schiavo, G; Galimberti, G; Calò, D G; Samorè, A B; Bertolini, F; Russo, V; Gallo, M; Buttazzoni, L; Fontanesi, L

    2016-04-01

    In this study, we investigated at the genome-wide level if 20 years of artificial directional selection based on boar genetic evaluation obtained with a classical BLUP animal model shaped the genome of the Italian Large White pig breed. The most influential boars of this breed (n = 192), born from 1992 (the beginning of the selection program of this breed) to 2012, with an estimated breeding value reliability of >0.85, were genotyped with the Illumina Porcine SNP60 BeadChip. After grouping the boars in eight classes according to their year of birth, filtered single nucleotide polymorphisms (SNPs) were used to evaluate the effects of time on genotype frequency changes using multinomial logistic regression models. Of these markers, 493 showed significant genotype frequency changes over time (Bonferroni-corrected P-values), identifying regions of the genome affected by selection. The largest proportion of the 493 SNPs was on porcine chromosome (SSC) 7, SSC2, SSC8 and SSC18, for a total of 204 haploblocks. Functional annotations of the genomic regions including the 493 shifted SNPs reported a few Gene Ontology terms that might underlie the biological processes that contributed to the increased performance of the pigs over the 20 years of the selection program. The obtained results indicated that the genome of the Italian Large White pigs was shaped by a directional selection program derived by the application of methodologies assuming the infinitesimal model that captured a continuous trend of allele frequency changes in the boar population.

  13. Isoflavones in the Rutaceae family: twenty selected representatives of the genera Citrus, Fortunella, Poncirus, Ruta and Severinia.

    Science.gov (United States)

    Koblovská, Radka; Macková, Zuzana; Vítková, Michaela; Kokoska, Ladislav; Klejdus, Borivoj; Lapcík, Oldrich

    2008-01-01

    Enzyme-linked immunosorbent assays in combination with semi-preparative high-performance liquid chromatography (HPLC) and analytical HPLC with mass spectrometry in the selective ion monitoring mode were used for the determination of selected isoflavones, daidzein, genistein, biochanin A and their homologues, in 20 representatives of the Rutaceae family. Species belonging to five genera were studied, namely Citrus, Fortunella, Poncirus, Ruta and Severinia. The enzyme immunoassays used were based on polyclonal antibodies raised against isoflavonoid conjugates with bovine serum albumin (BSA), namely biochanin A-7-BSA, daidzein-7-BSA, daidzein-4'-BSA, genistein-7-BSA and genistein-4'-BSA. Aglycones as well as glycosides were detected, and methoxyisoflavones appeared to be more abundant than hydroxyisoflavones. The content of individual isoflavonoids ranged from 0 to 2.6 mg/kg (dry weight); the sum of all measured substances reached up to 5.9 mg/kg.

  14. The dynamics of population ageing into the twenty-first century: ASEAN and selected countries of Pacific Asia.

    Science.gov (United States)

    Neville, W

    1992-07-01

    Demographic aging is examined in selected countries of east and Southeast Asia. "Among the 10 countries discussed in this article, there is a wide range of experience in the process of aging from the advanced stage reached by Japan to the incipient stage evident in the Philippines. Although the direction of age structural shift in these countries is consistent throughout, earlier patterns of fertility, mortality and migration dictate differing effects over the 50-year period, 1970-2020. This is apparent in the behaviour and changing relationships of cohorts passing through the various stages of the life course. The ultimate phase of the current ageing cycle results in a greatly expanded elderly component which, if the case of Japan provides a precedent, is likely to be further inflated by concurrent increases in life expectancy among the elderly themselves."

  15. Twenty-four-hour profiles of metabolic and stress hormones in sheep selected for a calm or nervous temperament.

    Science.gov (United States)

    Rietema, S E; Blackberry, M A; Maloney, S K; Martin, G B; Hawken, P A R; Blache, D

    2015-10-01

    Even in the absence of stressors, temperament is associated with changes in the concentration of stress-responsive hormones and, possibly because of such changes, temperament can affect metabolism. We tested whether, in sheep bred for temperament for 14 generations, "nervous" females have greater concentrations of stress-responsive hormones in the absence of stressors than "calm" females, and whether these differences are associated with changes in the concentrations of metabolic hormones. In resting "calm" (n = 8) and "nervous" (n = 8) sheep, concentrations of cortisol, prolactin, leptin, and insulin were measured in blood plasma sampled via jugular catheter every 20 min for 24 h. The animals were individually penned, habituated to their housing and human handling over 7 wk, and fed before sampling began. Diurnal variation was evident for all hormones, but a 24-h cortisol pattern was detected in only 7 individuals. There was no effect of temperament on any aspect of concentrations of cortisol or prolactin, but "calm" animals had greater concentrations of insulin in the early afternoon than "nervous" animals (14.5 ± 1.1 vs 10.0 ± 1.6 μU/mL; P = 0.038), and a similar tendency was seen for leptin (P = 0.092). We conclude that selection for temperament affects the concentration of metabolic hormones in the absence of stressors, but this effect is independent of stress-responsive hormones.

  16. Event selection with a Random Forest in IceCube

    Energy Technology Data Exchange (ETDEWEB)

    Ruhe, Tim [TU, Dortmund (Germany)]; Collaboration: IceCube-Collaboration

    2011-07-01

    The Random Forest method is a multivariate algorithm that can be used for classification and regression respectively. The Random Forest implemented in the RapidMiner learning environment has been used for training and validation on data and Monte Carlo simulations of the IceCube neutrino telescope. Latest results are presented.

  17. THE INFLUENCE OF SELECTING RECEIVING SLITS AND USING RANDOMLY ORIENTED STANDARD SAMPLES ON QUANTITATIVE TEXTURE ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    刘毓舒

    2001-01-01

    The influence of the selection of receiving slits and the use of standard samples with random orientation on the results of quantitative texture analysis was tested and discussed. The results show that using proper slits and a randomly oriented standard sample improves the precision of the analysis. A simple method was given for interpolating the correction curves of random intensities.

  18. Aiming for a representative sample: Simulating random versus purposive strategies for hospital selection

    NARCIS (Netherlands)

    Hoeven, van Loan R.; Janssen, Mart P.; Roes, Kit C.B.; Koffijberg, Hendrik

    2015-01-01

    Background A ubiquitous issue in research is that of selecting a representative sample from the study population. While random sampling strategies are the gold standard, in practice, random sampling of participants is not always feasible nor necessarily the optimal choice. In our case, a selection m

  19. Aiming for a representative sample: Simulating random versus purposive strategies for hospital selection

    NARCIS (Netherlands)

    van Hoeven, Loan R; Janssen, Mart P; Roes, Kit C B; Koffijberg, Hendrik

    2015-01-01

    BACKGROUND: A ubiquitous issue in research is that of selecting a representative sample from the study population. While random sampling strategies are the gold standard, in practice, random sampling of participants is not always feasible nor necessarily the optimal choice. In our case, a selection

  20. Study on MAX-MIN Ant System with Random Selection in Quadratic Assignment Problem

    Science.gov (United States)

    Iimura, Ichiro; Yoshida, Kenji; Ishibashi, Ken; Nakayama, Shigeru

    Ant Colony Optimization (ACO), which is a type of swarm intelligence inspired by ants' foraging behavior, has been studied extensively and its effectiveness has been shown by many researchers. The previous studies have reported that MAX-MIN Ant System (MMAS) is one of effective ACO algorithms. The MMAS maintains the balance of intensification and diversification concerning pheromone by limiting the quantity of pheromone to the range of minimum and maximum values. In this paper, we propose MAX-MIN Ant System with Random Selection (MMASRS) for improving the search performance even further. The MMASRS is a new ACO algorithm that is MMAS into which random selection was newly introduced. The random selection is one of the edgechoosing methods by agents (ants). In our experimental evaluation using ten quadratic assignment problems, we have proved that the proposed MMASRS with the random selection is superior to the conventional MMAS without the random selection in the viewpoint of the search performance.
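
    As a rough Python sketch of the two ingredients named above (not the authors' MMASRS implementation, and simplified from the quadratic assignment setting to a plain next-candidate choice), the edge-choosing rule picks uniformly at random with probability p_random and otherwise samples proportionally to pheromone and heuristic weights, while a clamping rule keeps every trail within [tau_min, tau_max]:

      import random

      def choose_next(current, unvisited, pheromone, heuristic,
                      alpha=1.0, beta=2.0, p_random=0.1, rng=random.Random(0)):
          """Edge choice with random selection: with probability p_random pick a
          candidate uniformly at random, otherwise sample proportionally to
          pheromone**alpha * heuristic**beta (the usual ACO rule)."""
          options = list(unvisited)
          if rng.random() < p_random:
              return rng.choice(options)
          weights = [pheromone[current][j] ** alpha * heuristic[current][j] ** beta
                     for j in options]
          return rng.choices(options, weights=weights, k=1)[0]

      def clamp_pheromone(pheromone, tau_min, tau_max):
          """MAX-MIN rule: keep every trail within [tau_min, tau_max]."""
          for row in pheromone:
              for j, tau in enumerate(row):
                  row[j] = min(tau_max, max(tau_min, tau))

      n = 5
      tau = [[1.0] * n for _ in range(n)]                 # pheromone matrix
      eta = [[1.0] * n for _ in range(n)]                 # heuristic desirability (illustrative)
      clamp_pheromone(tau, tau_min=0.1, tau_max=5.0)
      print(choose_next(0, {1, 2, 3, 4}, tau, eta))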

  1. The Application of Imperialist Competitive Algorithm for Fuzzy Random Portfolio Selection Problem

    Science.gov (United States)

    Hesam Sadati, Mir Ehsan; Bagherzadeh Mohasefi, Jamshid

    2013-10-01

    This paper presents an implementation of the Imperialist Competitive Algorithm (ICA) for solving the fuzzy random portfolio selection problem where the asset returns are represented by fuzzy random variables. Portfolio Optimization is an important research field in modern finance. By using the necessity-based model, fuzzy random variables reformulate to the linear programming and ICA will be designed to find the optimum solution. To show the efficiency of the proposed method, a numerical example illustrates the whole idea on implementation of ICA for fuzzy random portfolio selection problem.

  2. Radical prostatectomy versus expectant treatment for early carcinoma of the prostate. Twenty-three year follow-up of a prospective randomized study

    DEFF Research Database (Denmark)

    Iversen, P; Madsen, P O; Corle, D K

    1995-01-01

    In a study by the Veterans Administration Cooperative Urological Research Group (VACURG), 142 patients with localized prostate cancer, VACURG stage I and II, were randomized between radical prostatectomy plus placebo versus placebo alone as initial treatment. 111 patients were evaluable for treatment...

  3. Research on a randomized real-valued negative selection algorithm

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    A real-valued negative selection algorithm with a good mathematical foundation is presented to address some of the drawbacks of the previous approach. Specifically, it can produce a good estimate of the optimal number of detectors needed to cover the non-self space, and the maximization of the non-self coverage is done through an optimization algorithm with proven convergence properties. Experiments are performed to validate the assumptions made while designing the algorithm and to evaluate its performance.
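
    The optimized estimator described above is not reproduced in the record. The naive Python sketch below only illustrates the basic real-valued negative selection step of keeping random candidate detectors that do not match any self sample; the radius, dimensionality, and data are invented.

      import numpy as np

      def generate_detectors(self_samples, n_detectors, self_radius, dim,
                             rng=np.random.default_rng(0)):
          """Naive real-valued negative selection: keep random candidate detectors
          that lie farther than self_radius from every self sample."""
          detectors = []
          while len(detectors) < n_detectors:
              cand = rng.random(dim)
              if np.min(np.linalg.norm(self_samples - cand, axis=1)) > self_radius:
                  detectors.append(cand)
          return np.array(detectors)

      # "Self" data confined to one corner of the unit square, so non-self space is large.
      self_samples = np.random.default_rng(1).random((50, 2)) * 0.4
      detectors = generate_detectors(self_samples, n_detectors=20, self_radius=0.1, dim=2)
      print(detectors.shape)   # (20, 2)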

  4. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology

    Science.gov (United States)

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological datasets there is limited guidance on variable selection methods for RF modeling. Typically, e...

  5. Random Gaussian process effect upon selective system of spectra heterodyne analyzer

    Directory of Open Access Journals (Sweden)

    N. F. Vollerner

    1967-12-01

    A formula is obtained that describes the change in mean power at the selective system output as a function of the tuning speed of the heterodyne spectrum analyzer when analyzing stationary random processes.

  6. Variable Selection for Varying-Coefficient Models with Missing Response at Random

    Institute of Scientific and Technical Information of China (English)

    Pei Xin ZHAO; Liu Gen XUE

    2011-01-01

    In this paper, we present a variable selection procedure by combining basis function approximations with penalized estimating equations for varying-coefficient models with missing response at random. With appropriate selection of the tuning parameters, we establish the consistency of the variable selection procedure and the optimal convergence rate of the regularized estimators. A simulation study is undertaken to assess the finite sample performance of the proposed variable selection procedure.

  7. Acceptance sampling using judgmental and randomly selected samples

    Energy Technology Data Exchange (ETDEWEB)

    Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl

    2010-09-01

    We present a Bayesian model for acceptance sampling where the population consists of two groups, each with different levels of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high- and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail, and discuss the conditions necessary to ensure that required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems, and, in particular, to environmental sampling where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.
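
    A much-simplified single-group version of this calculation can be sketched in Python (assuming SciPy is available; the Beta(1, 1) prior and the 95% threshold are illustrative choices, not the paper's two-group judgmental model): with a Beta prior on the per-item acceptability rate and n sampled items all found acceptable, the posterior probability that the rate exceeds p is a Beta tail probability.

      from scipy.stats import beta

      def prob_rate_exceeds(n_sampled, p=0.95, a=1.0, b=1.0):
          """Posterior P(acceptability rate > p) after n_sampled inspected items were all
          acceptable, starting from a Beta(a, b) prior (Beta(1, 1) is uniform)."""
          return beta.sf(p, a + n_sampled, b)

      for n in (10, 30, 59, 100):
          print(n, round(prob_rate_exceeds(n), 3))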

  8. Selection of unique Escherichia coli clones by random amplified polymorphic DNA (RAPD)

    DEFF Research Database (Denmark)

    Nielsen, Karen L; Godfrey, Paul A; Stegger, Marc

    2014-01-01

    Identifying and characterizing clonal diversity are important when analysing fecal flora. We evaluated random amplified polymorphic DNA (RAPD) PCR, applied for selection of Escherichia coli isolates, by whole genome sequencing. RAPD was fast and reproducible as a screening method for selection of ...

  9. On the Selection of Random Numbers in the ElGamal Algorithm

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    The ElGamal algorithm, which can be used for both signature and encryption, is of importance in public-key cryptosystems. However, there has arisen an issue that different criteria of selecting a random number are used for the same algorithm. In the aspects of the sufficiency, necessity, security and computational overhead of parameter selection, this paper analyzes these criteria in a comparative manner and points out the insecurities in some textbook cryptographic schemes. Meanwhile, in order to enhance security a novel generalization of the ElGamal signature scheme is made by expanding the range of selecting random numbers at an acceptable cost of additional computation, and its feasibility is demonstrated.
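
    For context, a textbook-style Python sketch of ElGamal signing (toy parameters, far too small to be secure, and not the generalized scheme proposed in the paper) shows the random-number criterion in question: each signature needs a fresh k with gcd(k, p−1) = 1.

      import random
      from math import gcd

      def elgamal_sign(m_hash, p, g, x, rng=random.Random()):
          """Textbook ElGamal signing; the per-signature random k must be fresh and
          coprime to p - 1, which is the selection criterion at issue."""
          while True:
              k = rng.randrange(2, p - 1)
              if gcd(k, p - 1) == 1:
                  break
          r = pow(g, k, p)
          s = ((m_hash - x * r) * pow(k, -1, p - 1)) % (p - 1)
          return r, s

      def elgamal_verify(m_hash, r, s, p, g, y):
          return 0 < r < p and (pow(y, r, p) * pow(r, s, p)) % p == pow(g, m_hash, p)

      p, g, x = 467, 2, 127                  # toy parameters only, far too small for real use
      y = pow(g, x, p)                       # public key
      r, s = elgamal_sign(100, p, g, x)
      print(elgamal_verify(100, r, s, p, g, y))   # True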

  10. Recovery of a lowland dipterocarp forest twenty two years after selective logging at Sekundur, Gunung Leuser National Park, North Sumatra, Indonesia

    Directory of Open Access Journals (Sweden)

    Dolly Priatna

    2006-12-01

    PRIATNA, D.; KARTAWINATA, K.; ABDULHADI, R. 2004. Recovery of a lowland dipterocarp forest twenty two years after selective logging at Sekundur, Gunung Leuser National Park, North Sumatra, Indonesia. Reinwardtia 12(3): 237–255. — A permanent 2-ha plot of lowland forest selectively logged in 1978 at Sekundur, Gunung Leuser National Park, which is also a Biosphere Reserve and a World Heritage Site, North Sumatra, was established and investigated in 1982. It was re-examined in 2000, when remeasurement and reidentification of all trees with DBH ≥ 10 cm were made. The areas of gap, building and mature phases of the canopy were also measured and mapped. Within this plot, 133 species, 87 genera and 39 families were recorded, with a total number of trees of 1145, or a density of 572.5/ha. Euphorbiaceae was the richest family with 18 species (13.5% of the total) and a total of 248 trees (21.7% of the total), or a density of 124 trees/ha. The most important families were Dipterocarpaceae with IV (Importance Value) = 52.0, followed by Euphorbiaceae with IV = 51.8. The most prevalent species was Shorea kunstleri (Dipterocarpaceae) with IV = 24.4, followed by Macaranga diepenhorstii (Euphorbiaceae) with IV = 12.4. They were the species with the highest densities, 34 trees/ha and 23.5 trees/ha, respectively. During the period of 18 years there has been no shift in the richest families, most important families and most important species. Euphorbiaceae was the richest family and Dipterocarpaceae was the most important family, with Shorea kunstleri as the most important species with the highest importance value throughout the period. The number of species increased from 127 to 133, with an increase in density of 36.8%, from 418.5 trees/ha to 572.5 trees/ha. The mortality was 25.57%, or 1.4% per year. The diameter class distribution indicated that the forest recovery has not been complete. Trees were small, comprising 67.6% with diameters of 10-20 cm and only two trees

  11. Variable Selection for Semiparametric Varying-Coefficient Partially Linear Models with Missing Response at Random

    Institute of Scientific and Technical Information of China (English)

    Pei Xin ZHAO; Liu Gen XUE

    2011-01-01

    In this paper, we present a variable selection procedure by combining basis function approximations with penalized estimating equations for semiparametric varying-coefficient partially linear models with missing response at random. The proposed procedure simultaneously selects significant variables in parametric components and nonparametric components. With appropriate selection of the tuning parameters, we establish the consistency of the variable selection procedure and the convergence rate of the regularized estimators. A simulation study is undertaken to assess the finite sample performance of the proposed variable selection procedure.

  12. Twenty-year perspective of randomized controlled trials for surgery of chronic nonspecific low back pain: citation bias and tangential knowledge.

    Science.gov (United States)

    Andrade, Nicholas S; Flynn, John P; Bartanusz, Viktor

    2013-11-01

    After decades of clinical research, the role of surgery for chronic nonspecific low back pain (CNLBP) remains equivocal. Despite significant intellectual, human, and economic investments into randomized controlled trials (RCTs) in the past two decades, the role of surgery in the treatment for CNLBP has not been clarified. To delineate the historical research agenda of surgical RCTs for CNLBP performed between 1993 and 2012 investigating whether conclusions from earlier published trials influenced the choice of research questions of subsequent RCTs on elucidating the role of surgery in the management of CNLBP. Literature review. We searched the literature for all RCTs involving surgery for CNLBP. We reviewed relevant studies to identify the study question, comparator arms, and sample size. Randomized controlled trials were classified as "indication" trials if they evaluated the effectiveness of surgical therapy versus nonoperative care or as "technical" if they compared different surgical techniques, adjuncts, or procedures. We used citation analysis to determine the impact of trials on subsequent research in the field. Altogether 33 technical RCTs (3,790 patients) and 6 indication RCTs (981 patients) have been performed. Since 2007, despite the unclear benefits of surgery reported by the first four indication trials published in 2001 to 2006, technical trials have continued to predominate (16 vs. 2). Of the technical trials, types of instrumentation (13 trials, 1,332 patients), bone graft materials and substitutes (11 trials, 833 patients), and disc arthroplasty versus fusion (5 trials, 1,337 patients) were the most common comparisons made. Surgeon authors have predominantly cited one of the indication trials that reported more favorable results for surgery, despite a lack of superior methodology or sample size. Trials evaluating bone morphogenic protein, instrumentation, and disc arthroplasty were all cited more frequently than the largest trial of surgical versus

  13. Random Forest (RF) Wrappers for Waveband Selection and Classification of Hyperspectral Data.

    Science.gov (United States)

    Poona, Nitesh Keshavelal; van Niekerk, Adriaan; Nadel, Ryan Leslie; Ismail, Riyad

    2016-02-01

    Hyperspectral data collected using a field spectroradiometer was used to model asymptomatic stress in Pinus radiata and Pinus patula seedlings infected with the pathogen Fusarium circinatum. Spectral data were analyzed using the random forest algorithm. To improve the classification accuracy of the model, subsets of wavebands were selected using three feature selection algorithms: (1) Boruta; (2) recursive feature elimination (RFE); and (3) area under the receiver operating characteristic curve of the random forest (AUC-RF). Results highlighted the robustness of the above feature selection methods when used in conjunction with the random forest algorithm for analyzing hyperspectral data. Overall, the Boruta feature selection algorithm provided the best results. When discriminating F. circinatum stress in Pinus radiata seedlings, Boruta selected wavebands (n = 69) yielded the best overall classification accuracies (training error of 17.00%, independent test error of 17.00% and an AUC value of 0.91). Classification results were, however, significantly lower for P. patula seedlings, with a training error of 24.00%, independent test error of 38.00%, and an AUC value of 0.65. A hybrid selection method that utilizes combinations of wavebands selected from the three feature selection algorithms was also tested. The hybrid method showed an improvement in classification accuracies for P. patula, and no improvement for P. radiata. The results of this study provide impetus towards implementing a hyperspectral framework for detecting stress within nursery environments.
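
    As a rough illustration (not the authors' pipeline, and with synthetic data standing in for the field spectra), the RFE variant can be sketched with scikit-learn as follows; the Boruta step would require a separate package and is omitted:

      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.feature_selection import RFE
      from sklearn.model_selection import cross_val_score

      # Synthetic stand-in for spectra: 100 seedlings x 300 wavebands.
      X, y = make_classification(n_samples=100, n_features=300, n_informative=15, random_state=0)

      rf = RandomForestClassifier(n_estimators=300, random_state=0)
      selector = RFE(rf, n_features_to_select=30, step=10).fit(X, y)   # keep 30 wavebands
      X_sel = X[:, selector.support_]

      print("CV accuracy, all bands:     ", cross_val_score(rf, X, y, cv=5).mean().round(3))
      print("CV accuracy, selected bands:", cross_val_score(rf, X_sel, y, cv=5).mean().round(3))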

  14. An efficient method of wavelength interval selection based on random frog for multivariate spectral calibration

    Science.gov (United States)

    Yun, Yong-Huan; Li, Hong-Dong; Wood, Leslie R. E.; Fan, Wei; Wang, Jia-Jun; Cao, Dong-Sheng; Xu, Qing-Song; Liang, Yi-Zeng

    2013-07-01

    Wavelength selection is a critical step for producing better prediction performance when applied to spectral data. Considering the fact that the vibrational and rotational spectra have continuous features of spectral bands, we propose a novel method of wavelength interval selection based on random frog, called interval random frog (iRF). To obtain all the possible continuous intervals, spectra are first divided into intervals by moving window of a fix width over the whole spectra. These overlapping intervals are ranked applying random frog coupled with PLS and the optimal ones are chosen. This method has been applied to two near-infrared spectral datasets displaying higher efficiency in wavelength interval selection than others. The source code of iRF can be freely downloaded for academy research at the website: http://code.google.com/p/multivariate-calibration/downloads/list.
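    A minimal sketch of the interval idea behind iRF, under stated assumptions: a fixed-width window slides over synthetic spectra, each overlapping interval is scored with cross-validated PLS, and the top-ranked intervals are kept. The real method ranks the intervals with random frog coupled with PLS; plain cross-validation scoring stands in for it here, and the width/step values are arbitrary.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    X = rng.random((80, 400))            # 80 samples x 400 wavelengths (synthetic)
    y = X[:, 100:120].sum(axis=1) + 0.1 * rng.standard_normal(80)

    width, step = 20, 5                  # fixed window width and moving step (assumed)
    intervals = [(s, s + width) for s in range(0, X.shape[1] - width + 1, step)]

    scores = []
    for lo, hi in intervals:
        r2 = cross_val_score(PLSRegression(n_components=3), X[:, lo:hi], y, cv=5).mean()
        scores.append((r2, (lo, hi)))

    top = sorted(scores, reverse=True)[:5]   # the highest-ranked intervals
    print("top intervals:", [iv for _, iv in top])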

  15. Simulated Performance Evaluation of a Selective Tracker Through Random Scenario Generation

    DEFF Research Database (Denmark)

    Hussain, Dil Muhammad Akbar

    2006-01-01

      The paper presents a simulation study on the performance of a target tracker using a selective track splitting filter algorithm through a random scenario implemented on a digital signal processor.  In a typical track splitting filter all the observations which fall inside a likelihood ellipse...... performance assessment. Therefore, a random target motion scenario is adopted. Its implementation, in particular for testing the proposed selective track splitting algorithm using Kalman filters, is investigated through a number of performance parameters which give the activity profile of the tracking scenario...

  16. Unbiased Feature Selection in Learning Random Forests for High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Thanh-Tung Nguyen

    2015-01-01

    Full Text Available Random forests (RFs have been widely used as a powerful classification method. However, with the randomization in both bagging samples and feature selection, the trees in the forest tend to select uninformative features for node splitting. This makes RFs have poor accuracy when working with high-dimensional data. Besides that, RFs have bias in the feature selection process where multivalued features are favored. Aiming at debiasing feature selection in RFs, we propose a new RF algorithm, called xRF, to select good features in learning RFs for high-dimensional data. We first remove the uninformative features using p-value assessment, and the subset of unbiased features is then selected based on some statistical measures. This feature subset is then partitioned into two subsets. A feature weighting sampling technique is used to sample features from these two subsets for building trees. This approach enables one to generate more accurate trees, while allowing one to reduce dimensionality and the amount of data needed for learning RFs. An extensive set of experiments has been conducted on 47 high-dimensional real-world datasets including image datasets. The experimental results have shown that RFs with the proposed approach outperformed the existing random forests in increasing the accuracy and the AUC measures.
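    A minimal sketch of the two-stage idea, with the caveat that xRF's weighted sampling from two feature subsets is not reproduced: a univariate p-value screen first drops uninformative features, then a random forest is grown on the retained subset and judged by its out-of-bag (OOB) accuracy. All data and thresholds below are synthetic assumptions.

    import numpy as np
    from scipy.stats import mannwhitneyu
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(2)
    X = rng.standard_normal((100, 2000))     # high-dimensional synthetic data
    y = rng.integers(0, 2, size=100)
    X[:, :20] += y[:, None] * 1.5            # only the first 20 features carry signal

    # univariate screen: keep features whose class-wise distributions differ (p < 0.05)
    pvals = np.array([mannwhitneyu(X[y == 0, j], X[y == 1, j]).pvalue
                      for j in range(X.shape[1])])
    keep = np.flatnonzero(pvals < 0.05)

    rf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
    rf.fit(X[:, keep], y)
    print(f"kept {keep.size} features, OOB accuracy {rf.oob_score_:.2f}")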

  17. Unbiased feature selection in learning random forests for high-dimensional data.

    Science.gov (United States)

    Nguyen, Thanh-Tung; Huang, Joshua Zhexue; Nguyen, Thuy Thi

    2015-01-01

    Random forests (RFs) have been widely used as a powerful classification method. However, with the randomization in both bagging samples and feature selection, the trees in the forest tend to select uninformative features for node splitting. This makes RFs have poor accuracy when working with high-dimensional data. Besides that, RFs have bias in the feature selection process where multivalued features are favored. Aiming at debiasing feature selection in RFs, we propose a new RF algorithm, called xRF, to select good features in learning RFs for high-dimensional data. We first remove the uninformative features using p-value assessment, and the subset of unbiased features is then selected based on some statistical measures. This feature subset is then partitioned into two subsets. A feature weighting sampling technique is used to sample features from these two subsets for building trees. This approach enables one to generate more accurate trees, while allowing one to reduce dimensionality and the amount of data needed for learning RFs. An extensive set of experiments has been conducted on 47 high-dimensional real-world datasets including image datasets. The experimental results have shown that RFs with the proposed approach outperformed the existing random forests in increasing the accuracy and the AUC measures.

  18. Algorithms for White-box Obfuscation Using Randomized Subcircuit Selection and Replacement

    Science.gov (United States)

    2008-03-27

    CORGI is a Java application which employs a model-view-controller (MVC) architecture; Figure A.1 (page 71) of the report illustrates the CORGI architecture and its functionality. We accomplish two objectives with this research: 1. Develop a software architecture for developing and testing random selection schema for obfuscating a

  19. Selecting Optimal Parameters of Random Linear Network Coding for Wireless Sensor Networks

    DEFF Research Database (Denmark)

    Heide, Janus; Zhang, Qi; Fitzek, Frank

    2013-01-01

    This work studies how to select optimal code parameters of Random Linear Network Coding (RLNC) in Wireless Sensor Networks (WSNs). With Rateless Deluge [1] the authors proposed to apply Network Coding (NC) for Over-the-Air Programming (OAP) in WSNs, and demonstrated that with NC a significant...

  20. An Evaluation of Information Criteria Use for Correct Cross-Classified Random Effects Model Selection

    Science.gov (United States)

    Beretvas, S. Natasha; Murphy, Daniel L.

    2013-01-01

    The authors assessed correct model identification rates of Akaike's information criterion (AIC), corrected criterion (AICC), consistent AIC (CAIC), Hannon and Quinn's information criterion (HQIC), and Bayesian information criterion (BIC) for selecting among cross-classified random effects models. Performance of default values for the 5…

  1. Performance Evaluation of User Selection Protocols in Random Networks with Energy Harvesting and Hardware Impairments

    Directory of Open Access Journals (Sweden)

    Tan Nhat Nguyen

    2016-01-01

    Full Text Available In this paper, we evaluate the performance of various user selection protocols under the impact of hardware impairments. In the considered protocols, a Base Station (BS) selects one of the available Users (USs) to serve, while the remaining USs harvest energy from the Radio Frequency (RF) signal transmitted by the BS. We assume that all of the USs appear at random locations around the BS. In the Random Selection Protocol (RAN), the BS randomly selects a US to transmit the data. In the second proposed protocol, named the Minimum Distance Protocol (MIND), the US that is nearest to the BS is chosen. In the Optimal Selection Protocol (OPT), the US providing the highest channel gain between itself and the BS is served. For performance evaluation, we derive exact and asymptotic closed-form expressions of the average Outage Probability (OP) over Rayleigh fading channels. We also consider the average harvested energy per US. Finally, Monte-Carlo simulations are performed to verify the theoretical results.
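    A minimal Monte-Carlo sketch of the three selection rules (RAN, MIND, OPT) under idealized assumptions: users dropped uniformly in a disc around the BS, Rayleigh fading, and no hardware impairments or energy harvesting; the path-loss exponent, SNR threshold, and cell radius values are illustrative only.

    import numpy as np

    rng = np.random.default_rng(3)
    n_users, n_trials, alpha, gamma_th, radius = 5, 100_000, 3.0, 1.0, 10.0

    d = radius * np.sqrt(rng.random((n_trials, n_users)))      # uniform positions in a disc
    h = rng.exponential(scale=1.0, size=(n_trials, n_users))   # Rayleigh power gains
    snr = h * d ** (-alpha)                                    # unit transmit SNR

    ran = snr[np.arange(n_trials), rng.integers(0, n_users, n_trials)]   # random user
    mind = snr[np.arange(n_trials), d.argmin(axis=1)]                    # nearest user
    opt = snr.max(axis=1)                                                # best channel

    for name, s in [("RAN", ran), ("MIND", mind), ("OPT", opt)]:
        print(f"{name}: outage probability {(s < gamma_th).mean():.3f}")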

  2. Classification of Cancer Gene Selection Using Random Forest and Neural Network Based Ensemble Classifier

    Directory of Open Access Journals (Sweden)

    Jogendra Kushwah

    2013-06-01

    Full Text Available The free radical gene classification of cancer diseases is a challenging job in biomedical data engineering. Various classifiers have been used to improve the classification of gene selection for cancer diseases, but the individual classifiers are not validated. An ensemble classifier is therefore used for cancer gene classification, combining a neural network classifier with a random forest tree. The random forest tree is an ensembling technique in which the class decision is aggregated over the leaf nodes of many classifiers. In this paper we combine a neural network with a random forest ensemble classifier for classification of cancer gene selection for diagnostic analysis of cancer diseases. The proposed method differs from most ensemble classifier methods, which follow an input-output paradigm of neural networks, in that the members of the ensemble are selected from a set of neural network classifiers and the number of classifiers is determined during the growing procedure of the forest. Furthermore, the proposed method produces an ensemble that is not only accurate but also diverse, ensuring the two important properties that should characterize an ensemble classifier. For empirical evaluation of our proposed method we used UCI cancer disease data sets for classification. Our experimental results show better performance in comparison with random forest tree classification.
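    A minimal sketch of one way to combine the two learners, assuming a synthetic gene-expression-like dataset: a random forest and a multilayer-perceptron classifier are joined in a soft-voting ensemble. This is a generic stand-in, not the paper's exact procedure for selecting ensemble members during forest growth.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier, VotingClassifier
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=200, n_features=500, n_informative=20,
                               random_state=0)

    ensemble = VotingClassifier(
        estimators=[("rf", RandomForestClassifier(n_estimators=300, random_state=0)),
                    ("nn", MLPClassifier(hidden_layer_sizes=(64,), max_iter=2000,
                                         random_state=0))],
        voting="soft")   # average the predicted class probabilities

    print("CV accuracy:", cross_val_score(ensemble, X, y, cv=5).mean())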

  3. Development of Solution Algorithm and Sensitivity Analysis for Random Fuzzy Portfolio Selection Model

    Science.gov (United States)

    Hasuike, Takashi; Katagiri, Hideki

    2010-10-01

    This paper focuses on the proposition of a portfolio selection problem considering an investor's subjectivity and on the sensitivity analysis for changes of that subjectivity. Since the proposed problem is formulated as a random fuzzy programming problem, owing to both randomness and subjectivity represented by fuzzy numbers, it is not well-defined. Therefore, by introducing the Sharpe ratio, one of the important performance measures of portfolio models, the main problem is transformed into a standard fuzzy programming problem. Furthermore, using the sensitivity analysis for fuzziness, the analytical optimal portfolio with the sensitivity factor is obtained.
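    For orientation, a generic Sharpe-ratio portfolio formulation (not the paper's exact random fuzzy model), with w the portfolio weights, r the random return vector, r_f the risk-free rate, and \Sigma the covariance matrix of returns:

    \max_{w} \; \frac{\mathbb{E}[w^{\top} r] - r_f}{\sqrt{w^{\top} \Sigma\, w}}
    \qquad \text{s.t.} \quad \mathbf{1}^{\top} w = 1, \; w \ge 0.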

  4. Continuous-Time Mean-Variance Portfolio Selection with Random Horizon

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Zhiyong, E-mail: yuzhiyong@sdu.edu.cn [Shandong University, School of Mathematics (China)

    2013-12-15

    This paper examines the continuous-time mean-variance optimal portfolio selection problem with random market parameters and random time horizon. Treating this problem as a linearly constrained stochastic linear-quadratic optimal control problem, I explicitly derive the efficient portfolios and efficient frontier in closed forms based on the solutions of two backward stochastic differential equations. Some related issues such as a minimum variance portfolio and a mutual fund theorem are also addressed. All the results are markedly different from those in the problem with deterministic exit time. A key part of my analysis involves proving the global solvability of a stochastic Riccati equation, which is interesting in its own right.

  5. Selected papers from the twenty-third annual workshop on geothermal reservoir engineering, Stanford University; Dai 23 kai Stanford daigaku chinetsu choryuso kogaku workshop ni tsuite

    Energy Technology Data Exchange (ETDEWEB)

    Kikuchi, T. [Geological Survey of Japan, Tsukuba (Japan)

    1998-06-15

    The twenty-third annual workshop on geothermal reservoir engineering at Stanford University was held over three days starting January 26, 1998. The lectures covered topics as varied as field studies (12 papers), modeling (12), geochemistry (8), earth science (8), geophysical exploration (4), high-temperature rocks (6), deep geothermal research (2), wells (5), and three others. By country, the United States presented about half of the total number of papers, the Philippines about ten, and Japan five. This paper summarizes four papers said to have drawn the interest of the participants. Paper No. 1 describes field-scale use of mass-flow-measuring chemical tracers in geothermal areas in the Philippines. Paper No. 2 concerns hydraulic properties of the Dixie Valley (Nevada) geothermal area as seen from well test analysis. Paper No. 3 concerns reservoir-scale fracture permeability in the Dixie Valley (Nevada) geothermal area. Paper No. 4 deals with high-order differentials for a phase-front propagation problem in geothermal systems. 4 refs., 5 figs., 2 tabs.

  6. Antibiotic selection pressure and macrolide resistance in nasopharyngeal Streptococcus pneumoniae: a cluster-randomized clinical trial.

    Directory of Open Access Journals (Sweden)

    Alison H Skalet

    Full Text Available BACKGROUND: It is widely thought that widespread antibiotic use selects for community antibiotic resistance, though this has been difficult to prove in the setting of a community-randomized clinical trial. In this study, we used a randomized clinical trial design to assess whether macrolide resistance was higher in communities treated with mass azithromycin for trachoma, compared to untreated control communities. METHODS AND FINDINGS: In a cluster-randomized trial for trachoma control in Ethiopia, 12 communities were randomized to receive mass azithromycin treatment of children aged 1-10 years at months 0, 3, 6, and 9. Twelve control communities were randomized to receive no antibiotic treatments until the conclusion of the study. Nasopharyngeal swabs were collected from randomly selected children in the treated group at baseline and month 12, and in the control group at month 12. Antibiotic susceptibility testing was performed on Streptococcus pneumoniae isolated from the swabs using Etest strips. In the treated group, the mean prevalence of azithromycin resistance among all monitored children increased from 3.6% (95% confidence interval [CI] 0.8%-8.9%) at baseline, to 46.9% (37.5%-57.5%) at month 12 (p = 0.003). In control communities, azithromycin resistance was 9.2% (95% CI 6.7%-13.3%) at month 12, significantly lower than the treated group (p < 0.0001). Penicillin resistance was identified in 0.8% (95% CI 0%-4.2%) of isolates in the control group at 1 year, and in no isolates in the children-treated group at baseline or 1 year. CONCLUSIONS: This cluster-randomized clinical trial demonstrated that compared to untreated control communities, nasopharyngeal pneumococcal resistance to macrolides was significantly higher in communities randomized to intensive azithromycin treatment. Mass azithromycin distributions were given more frequently than currently recommended by the World Health Organization's trachoma program. Azithromycin use in this setting

  7. Sensor selection for parameterized random field estimation in wireless sensor networks

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    We consider the random field estimation problem with parametric trend in wireless sensor networks where the field can be described by unknown parameters to be estimated. Due to the limited resources, the network selects only a subset of the sensors to perform the estimation task with a desired performance under the D-optimal criterion. We propose a greedy sampling scheme to select the sensor nodes according to the information gain of the sensors. A distributed algorithm is also developed by consensus-based ...

  8. Topology-selective jamming of fully-connected, code-division random-access networks

    Science.gov (United States)

    Polydoros, Andreas; Cheng, Unjeng

    1990-01-01

    The purpose is to introduce certain models of topology selective stochastic jamming and examine its impact on a class of fully-connected, spread-spectrum, slotted ALOHA-type random access networks. The theory covers dedicated as well as half-duplex units. The dominant role of the spatial duty factor is established, and connections with the dual concept of time selective jamming are discussed. The optimal choices of coding rate and link access parameters (from the users' side) and the jamming spatial fraction are numerically established for DS and FH spreading.

  9. Surgical treatment of cavus foot in Charcot-Marie-tooth disease: a review of twenty-four cases: AAOS exhibit selection.

    Science.gov (United States)

    Faldini, Cesare; Traina, Francesco; Nanni, Matteo; Mazzotti, Antonio; Calamelli, Carlotta; Fabbri, Daniele; Pungetti, Camilla; Giannini, Sandro

    2015-03-18

    Charcot-Marie-Tooth disease is the single most common diagnosis associated with cavus foot. The imbalance involving intrinsic and extrinsic muscles has been suggested as the main pathogenetic cause of cavus foot in this disease. The goal of surgical treatment is to correct the deformity to obtain a plantigrade foot. In the presence of a flexible deformity and the absence of degenerative arthritis, preserving as much as possible of the overall range of motion of the foot and ankle is advisable. Twenty-four cavus feet in twelve patients with Charcot-Marie-Tooth disease were included in the study. Clinical evaluation was summarized with the Maryland Foot Score. Radiographic evaluation assessed calcaneal pitch, Meary angle, Hibb angle, and absence of degenerative joint changes. Only patients who had a flexible deformity, with varus of the heel reducible in the Coleman-Andreasi test, and did not have degenerative joint arthritis were included in this study. Surgical treatment consisted in plantar fasciotomy, midtarsal osteotomy, extensor hallucis longus tendon transfer to the first metatarsal (Jones procedure), and dorsiflexion osteotomy of the first metatarsal. Mean follow-up was six years (range, two to thirteen years). The mean Maryland Foot Score was 72 preoperatively and 86 postoperatively. The postoperative result was rated as excellent in twelve feet (50%), good in ten (42%), and fair in two (8%). Mean calcaneal pitch was 34° preoperatively and 24° at the time of the latest follow-up, the mean Hibb angle was 121° preoperatively and 136° postoperatively, and the mean Meary angle was 25° preoperatively and 2° postoperatively. Plantar fasciotomy, midtarsal osteotomy, the Jones procedure, and dorsiflexion osteotomy of the first metatarsal yielded adequate correction of flexible cavus feet in patients with Charcot-Marie-Tooth disease in the absence of fixed hindfoot deformity. The fact that the improvement in the outcome score was only modest may be attributable

  10. Malaria life cycle intensifies both natural selection and random genetic drift.

    Science.gov (United States)

    Chang, Hsiao-Han; Moss, Eli L; Park, Daniel J; Ndiaye, Daouda; Mboup, Souleymane; Volkman, Sarah K; Sabeti, Pardis C; Wirth, Dyann F; Neafsey, Daniel E; Hartl, Daniel L

    2013-12-10

    Analysis of genome sequences of 159 isolates of Plasmodium falciparum from Senegal yields an extraordinarily high proportion (26.85%) of protein-coding genes with the ratio of nonsynonymous to synonymous polymorphism greater than one. This proportion is much greater than observed in other organisms. Also unusual is that the site-frequency spectra of synonymous and nonsynonymous polymorphisms are virtually indistinguishable. We hypothesized that the complicated life cycle of malaria parasites might lead to qualitatively different population genetics from that predicted from the classical Wright-Fisher (WF) model, which assumes a single random-mating population with a finite and constant population size in an organism with nonoverlapping generations. This paper summarizes simulation studies of random genetic drift and selection in malaria parasites that take into account their unusual life history. Our results show that random genetic drift in the malaria life cycle is more pronounced than under the WF model. Paradoxically, the efficiency of purifying selection in the malaria life cycle is also greater than under WF, and the relative efficiency of positive selection varies according to conditions. Additionally, the site-frequency spectrum under neutrality is also more skewed toward low-frequency alleles than expected with WF. These results highlight the importance of considering the malaria life cycle when applying existing population genetic tools based on the WF model. The same caveat applies to other species with similarly complex life cycles.
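    A minimal sketch of the neutral Wright-Fisher baseline that the malaria life-cycle simulations are contrasted with; the population size, starting allele frequency, and run length below are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(4)
    N, p0, generations, reps = 1_000, 0.05, 500, 2_000

    p = np.full(reps, p0)                     # allele frequency in each replicate
    for _ in range(generations):
        # binomial sampling of 2N gametes per generation = random genetic drift
        p = rng.binomial(2 * N, p) / (2 * N)

    print("fixed:", (p == 1.0).mean(), "lost:", (p == 0.0).mean())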

  11. Predicting protein-RNA interaction amino acids using random forest based on submodularity subset selection.

    Science.gov (United States)

    Pan, Xiaoyong; Zhu, Lin; Fan, Yong-Xian; Yan, Junchi

    2014-11-13

    Protein-RNA interaction plays a very crucial role in many biological processes, such as protein synthesis, transcription and post-transcription of gene expression, and pathogenesis of disease. In particular, RNAs always function through binding to proteins. Identification of the binding interface region is especially useful for cellular pathway analysis and drug design. In this study, we proposed a novel approach for binding site identification in proteins, which not only integrates local and global features from the protein sequence directly, but also constructs a balanced training dataset using sub-sampling based on submodularity subset selection. First, we extracted local and global features from the protein sequence, such as evolutionary information and molecular weight. Second, the number of non-interaction sites is much larger than the number of interaction sites, which leads to a sample imbalance problem and hence a biased machine learning model with a preference for non-interaction sites. To better resolve this problem, instead of randomly sub-sampling the over-represented non-interaction sites as in previous work, a novel sampling approach based on submodularity subset selection was employed, which selects a more representative data subset. Finally, random forests were trained on the optimally selected training subsets to predict interaction sites. Our results showed that the proposed method is very promising for predicting protein-RNA interaction residues; it achieved an accuracy of 0.863, which is better than other state-of-the-art methods. Furthermore, random forest feature importance analysis indicated that the extracted global features have very strong discriminative ability for identifying interaction residues.

  12. Optimization of the Dutch Matrix Test by Random Selection of Sentences From a Preselected Subset

    Directory of Open Access Journals (Sweden)

    Rolph Houben

    2015-04-01

    Full Text Available Matrix tests are available for speech recognition testing in many languages. For an accurate measurement, a steep psychometric function of the speech materials is required. For existing tests, it would be beneficial if it were possible to further optimize the available materials by increasing the function’s steepness. The objective is to show if the steepness of the psychometric function of an existing matrix test can be increased by selecting a homogeneous subset of recordings with the steepest sentence-based psychometric functions. We took data from a previous multicenter evaluation of the Dutch matrix test (45 normal-hearing listeners). Based on half of the data set, first the sentences (140 out of 311) with a similar speech reception threshold and with the steepest psychometric function (≥9.7%/dB) were selected. Subsequently, the steepness of the psychometric function for this selection was calculated from the remaining (unused) second half of the data set. The calculation showed that the slope increased from 10.2%/dB to 13.7%/dB. The resulting subset did not allow the construction of enough balanced test lists. Therefore, the measurement procedure was changed to randomly select the sentences during testing. Random selection may interfere with a representative occurrence of phonemes. However, in our material, the median phonemic occurrence remained close to that of the original test. This finding indicates that phonemic occurrence is not a critical factor. The work highlights the possibility that existing speech tests might be improved by selecting sentences with a steep psychometric function.
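    A minimal sketch of the steepness measure discussed above, assuming synthetic per-SNR proportion-correct scores: a logistic psychometric function is fitted with scipy's curve_fit and its slope at the 50% point is reported in %/dB.

    import numpy as np
    from scipy.optimize import curve_fit

    def psychometric(snr, srt, slope):
        # logistic function parameterized so that 'slope' is the gradient at the SRT
        return 1.0 / (1.0 + np.exp(-4.0 * slope * (snr - srt)))

    snr = np.array([-12.0, -10.0, -8.0, -6.0, -4.0])    # dB SNR (synthetic)
    score = np.array([0.08, 0.25, 0.55, 0.85, 0.97])    # proportion of words correct

    (srt, slope), _ = curve_fit(psychometric, snr, score, p0=(-8.0, 0.1))
    print(f"SRT = {srt:.1f} dB, slope = {100 * slope:.1f} %/dB")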

  13. Optimization of the Dutch matrix test by random selection of sentences from a preselected subset.

    Science.gov (United States)

    Houben, Rolph; Dreschler, Wouter A

    2015-05-11

    Matrix tests are available for speech recognition testing in many languages. For an accurate measurement, a steep psychometric function of the speech materials is required. For existing tests, it would be beneficial if it were possible to further optimize the available materials by increasing the function's steepness. The objective is to show if the steepness of the psychometric function of an existing matrix test can be increased by selecting a homogeneous subset of recordings with the steepest sentence-based psychometric functions. We took data from a previous multicenter evaluation of the Dutch matrix test (45 normal-hearing listeners). Based on half of the data set, first the sentences (140 out of 311) with a similar speech reception threshold and with the steepest psychometric function (≥9.7%/dB) were selected. Subsequently, the steepness of the psychometric function for this selection was calculated from the remaining (unused) second half of the data set. The calculation showed that the slope increased from 10.2%/dB to 13.7%/dB. The resulting subset did not allow the construction of enough balanced test lists. Therefore, the measurement procedure was changed to randomly select the sentences during testing. Random selection may interfere with a representative occurrence of phonemes. However, in our material, the median phonemic occurrence remained close to that of the original test. This finding indicates that phonemic occurrence is not a critical factor. The work highlights the possibility that existing speech tests might be improved by selecting sentences with a steep psychometric function.

  14. Classification of Cancer Gene Selection Using Random Forest and Neural Network Based Ensemble Classifier

    Directory of Open Access Journals (Sweden)

    Jogendra Kushwah

    2013-06-01

    Full Text Available The free radical gene classification of cancer diseases is a challenging job in biomedical data engineering. Various classifiers have been used to improve the classification of gene selection for cancer diseases, but the individual classifiers are not validated. An ensemble classifier is therefore used for cancer gene classification, combining a neural network classifier with a random forest tree. The random forest tree is an ensembling technique in which the class decision is aggregated over the leaf nodes of many classifiers. In this paper we combine a neural network with a random forest ensemble classifier for classification of cancer gene selection for diagnostic analysis of cancer diseases. The proposed method differs from most ensemble classifier methods, which follow an input-output paradigm of neural networks, in that the members of the ensemble are selected from a set of neural network classifiers and the number of classifiers is determined during the growing procedure of the forest. Furthermore, the proposed method produces an ensemble that is not only accurate but also diverse, ensuring the two important properties that should characterize an ensemble classifier. For empirical evaluation of our proposed method we used UCI cancer disease data sets for classification. Our experimental results show better performance in comparison with random forest tree classification.

  15. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology.

    Science.gov (United States)

    Fox, Eric W; Hill, Ryan A; Leibowitz, Scott G; Olsen, Anthony R; Thornbrugh, Darren J; Weber, Marc H

    2017-07-01

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological data sets, there is limited guidance on variable selection methods for RF modeling. Typically, either a preselected set of predictor variables are used or stepwise procedures are employed which iteratively remove variables according to their importance measures. This paper investigates the application of variable selection methods to RF models for predicting probable biological stream condition. Our motivating data set consists of the good/poor condition of n = 1365 stream survey sites from the 2008/2009 National Rivers and Stream Assessment, and a large set (p = 212) of landscape features from the StreamCat data set as potential predictors. We compare two types of RF models: a full variable set model with all 212 predictors and a reduced variable set model selected using a backward elimination approach. We assess model accuracy using RF's internal out-of-bag estimate, and a cross-validation procedure with validation folds external to the variable selection process. We also assess the stability of the spatial predictions generated by the RF models to changes in the number of predictors and argue that model selection needs to consider both accuracy and stability. The results suggest that RF modeling is robust to the inclusion of many variables of moderate to low importance. We found no substantial improvement in cross-validated accuracy as a result of variable reduction. Moreover, the backward elimination procedure tended to select too few variables and exhibited numerous issues such as upwardly biased out-of-bag accuracy estimates and instabilities in the spatial predictions. We use simulations to further support and generalize results from the analysis of real data. A main purpose of this work is to elucidate issues of model selection bias and instability to ecologists interested in
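    A minimal sketch of the backward elimination loop described above, under synthetic data and arbitrary stopping rules: each round a random forest is fitted, its out-of-bag (OOB) accuracy recorded, and the least important tenth of the remaining variables dropped.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    X, y = make_classification(n_samples=600, n_features=212, n_informative=15,
                               random_state=0)
    features = np.arange(X.shape[1])

    while features.size > 20:
        rf = RandomForestClassifier(n_estimators=300, oob_score=True, random_state=0)
        rf.fit(X[:, features], y)
        print(f"{features.size:3d} features, OOB accuracy {rf.oob_score_:.3f}")
        # drop the least important 10% of the remaining variables each round
        order = np.argsort(rf.feature_importances_)
        n_drop = max(1, features.size // 10)
        features = features[np.sort(order[n_drop:])]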

  16. Gametocytes infectiousness to mosquitoes: variable selection using random forests, and zero inflated models

    CERN Document Server

    Genuer, Robin; Toussile, Wilson

    2011-01-01

    Malaria control strategies aiming at reducing disease transmission intensity may impact both oocyst intensity and infection prevalence in the mosquito vector. Thus far, mathematical models failed to identify a clear relationship between Plasmodium falciparum gametocytes and their infectiousness to mosquitoes. Natural isolates of gametocytes are genetically diverse and biologically complex. Infectiousness to mosquitoes relies on multiple parameters such as density, sex-ratio, maturity, parasite genotypes and host immune factors. In this article, we investigated how density and genetic diversity of gametocytes impact on the success of transmission in the mosquito vector. We analyzed data for which the number of covariates plus attendant interactions is at least of order of the sample size, precluding usage of classical models such as general linear models. We then considered the variable importance from random forests to address the problem of selecting the most influent variables. The selected covariates were ...

  17. Selective decontamination of the digestive tract to prevent postoperative infection : A randomized placebo-controlled trial in liver transplant patients

    NARCIS (Netherlands)

    Zwaveling, JH; Maring, JK; Klompmaker, IJ; Haagsma, EB; Bottema, JT; Winter, Heinrich L.J.; van Enckevort, PJ; TenVergert, EM; Metselaar, HJ; Bruining, HA; Slooff, MJH

    Objective., To determine the efficacy of selective decontamination of the digestive tract (SDD) in patients undergoing elective transplantation of the liver. Design: Randomized, double-blind, placebo-controlled study. Setting. Two academic teaching hospitals. Patients. Adult patients undergoing

  18. Selective decontamination of the digestive tract to prevent postoperative infection : A randomized placebo-controlled trial in liver transplant patients

    NARCIS (Netherlands)

    Zwaveling, JH; Maring, JK; Klompmaker, IJ; Haagsma, EB; Bottema, JT; Winter, Heinrich L.J.; van Enckevort, PJ; TenVergert, EM; Metselaar, HJ; Bruining, HA; Slooff, MJH

    2002-01-01

    Objective., To determine the efficacy of selective decontamination of the digestive tract (SDD) in patients undergoing elective transplantation of the liver. Design: Randomized, double-blind, placebo-controlled study. Setting. Two academic teaching hospitals. Patients. Adult patients undergoing elec

  19. Twenty lectures on thermodynamics

    CERN Document Server

    Buchdahl, H A

    2013-01-01

    Twenty Lectures on Thermodynamics is a course of lectures, parts of which the author has given various times over the last few years. The book gives the readers a bird's eye view of phenomenological and statistical thermodynamics. The book covers many areas in thermodynamics such as states and transition; adiabatic isolation; irreversibility; the first, second, third and Zeroth laws of thermodynamics; entropy and entropy law; the idea of the application of thermodynamics; pseudo-states; the quantum-statistical canonical and grand canonical ensembles; and semi-classical gaseous systems. The text

  20. Geography and genography: prediction of continental origin using randomly selected single nucleotide polymorphisms

    Directory of Open Access Journals (Sweden)

    Ramoni Marco F

    2007-03-01

    Full Text Available Abstract Background Recent studies have shown that when individuals are grouped on the basis of genetic similarity, group membership corresponds closely to continental origin. There has been considerable debate about the implications of these findings in the context of larger debates about race and the extent of genetic variation between groups. Some have argued that clustering according to continental origin demonstrates the existence of significant genetic differences between groups and that these differences may have important implications for differences in health and disease. Others argue that clustering according to continental origin requires the use of large amounts of genetic data or specifically chosen markers and is indicative only of very subtle genetic differences that are unlikely to have biomedical significance. Results We used small numbers of randomly selected single nucleotide polymorphisms (SNPs) from the International HapMap Project to train naïve Bayes classifiers for prediction of ancestral continent of origin. Predictive accuracy was tested on two independent data sets. Genetically similar groups should be difficult to distinguish, especially if only a small number of genetic markers are used. The genetic differences between continentally defined groups are sufficiently large that one can accurately predict ancestral continent of origin using only a minute, randomly selected fraction of the genetic variation present in the human genome. Genotype data from only 50 random SNPs was sufficient to predict ancestral continent of origin in our primary test data set with an average accuracy of 95%. Genetic variations informative about ancestry were common and widely distributed throughout the genome. Conclusion Accurate characterization of ancestry is possible using small numbers of randomly selected SNPs. The results presented here show how investigators conducting genetic association studies can use small numbers of arbitrarily
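    A minimal sketch of the classification idea, using simulated stand-ins for HapMap genotypes: three ancestry groups are generated with group-specific allele frequencies, genotypes are coded 0/1/2, and a categorical naive Bayes classifier is evaluated by cross-validation over a 50-SNP panel.

    import numpy as np
    from sklearn.naive_bayes import CategoricalNB
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(5)
    n_per_group, n_snps = 90, 50
    freqs = rng.random((3, n_snps))                 # group-specific allele frequencies
    X = np.vstack([rng.binomial(2, f, size=(n_per_group, n_snps)) for f in freqs])
    y = np.repeat([0, 1, 2], n_per_group)           # continent-of-origin labels

    clf = CategoricalNB(min_categories=3)           # genotypes take values 0, 1, 2
    print("5-fold CV accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(2))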

  1. Feature selection for outcome prediction in oesophageal cancer using genetic algorithm and random forest classifier.

    Science.gov (United States)

    Paul, Desbordes; Su, Ruan; Romain, Modzelewski; Sébastien, Vauclin; Pierre, Vera; Isabelle, Gardin

    2016-12-28

    The outcome prediction of patients can greatly help to personalize cancer treatment. A large number of quantitative features (clinical exams, imaging, …) are potentially useful for assessing patient outcome. The challenge is to choose the most predictive subset of features. In this paper, we propose a new feature selection strategy called GARF (genetic algorithm based on random forest) applied to features extracted from positron emission tomography (PET) images and clinical data. The most relevant features, predictive of the therapeutic response or prognostic of patient survival 3 years after the end of treatment, were selected using GARF on a cohort of 65 patients with locally advanced oesophageal cancer eligible for chemo-radiation therapy. The most relevant predictive results were obtained with a subset of 9 features, leading to a random forest misclassification rate of 18±4% and an area under the receiver operating characteristic (ROC) curve (AUC) of 0.823±0.032. The most relevant prognostic results were obtained with 8 features, leading to an error rate of 20±7% and an AUC of 0.750±0.108. Both predictive and prognostic results show better performance with GARF than with the 4 other studied methods.

  2. Trends of periodontal conditions in two different randomly selected Swiss (Bernese) cohorts 25 years apart.

    Science.gov (United States)

    Schürch, Ernst; Dulla, Joëlle A; Bürgin, Walter; Lussi, Adrian; Lang, Niklaus P

    2015-10-01

    To assess the periodontal conditions of two randomly selected Swiss cohorts 25 years apart. Standardized examinations were performed to assess the periodontal conditions of two randomly selected populations of the Canton of Bern; oral cleanliness was evaluated using the plaque index (PlI) and the retention index (RI). Gingival health was scored according to the gingival index (GI). Periodontal conditions were evaluated by pocket probing depth (PPD) and loss of attachment (LA). At the first examination in 1985, 206 out of 350 subjects were evaluated, while in the second examination in 2010, 134 out of 490 subjects attended the examinations. In 1985, subjects showed a mean PlI of 1.16, and 0.77 in 2010. RI was 0.81 and 0.36 in 1985 and 2010 respectively. Mean GI was 1.34 and 0.6. The mean proportion of PPD ≤3 mm was 72% in 1985 and 97.3% in 2010. PPD ≥ 6 mm affected 2.0% in 1985 and 0.3% in 2010. In 1985, subjects had an average of 20.7 teeth, while in 2010, the average was 24.6. In 1985, 7.3% of the subjects were edentulous, while in 2010, 4.5% had no teeth. Trends to improvements resulting in more teeth in function and better periodontal conditions were recognized. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  3. Joint random beam and spectrum selection for spectrum sharing systems with partial channel state information

    KAUST Repository

    Abdallah, Mohamed M.

    2013-11-01

    In this work, we develop joint interference-aware random beam and spectrum selection schemes that provide enhanced performance for the secondary network under the condition that the interference observed at the primary receiver is below a predetermined acceptable value. We consider a secondary link composed of a transmitter equipped with multiple antennas and a single-antenna receiver sharing the same spectrum with a set of primary links composed of a single-antenna transmitter and a single-antenna receiver. The proposed schemes jointly select a beam, among a set of power-optimized random beams, as well as the primary spectrum that maximizes the signal-to-interference-plus-noise ratio (SINR) of the secondary link while satisfying the primary interference constraint. In particular, we consider the case where the interference level is described by a q-bit description of its magnitude, whereby we propose a technique to find the optimal quantizer thresholds in a mean square error (MSE) sense. © 2013 IEEE.

  4. Interference-aware random beam selection schemes for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed

    2012-10-19

    Spectrum sharing systems have been recently introduced to alleviate the problem of spectrum scarcity by allowing secondary unlicensed networks to share the spectrum with primary licensed networks under acceptable interference levels to the primary users. In this work, we develop interference-aware random beam selection schemes that provide enhanced performance for the secondary network under the condition that the interference observed by the receivers of the primary network is below a predetermined/acceptable value. We consider a secondary link composed of a transmitter equipped with multiple antennas and a single-antenna receiver sharing the same spectrum with a primary link composed of a single-antenna transmitter and a single-antenna receiver. The proposed schemes select a beam, among a set of power-optimized random beams, that maximizes the signal-to-interference-plus-noise ratio (SINR) of the secondary link while satisfying the primary interference constraint for different levels of feedback information describing the interference level at the primary receiver. For the proposed schemes, we develop a statistical analysis for the SINR statistics as well as the capacity and bit error rate (BER) of the secondary link.
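    A minimal sketch of the selection rule, under simplifying assumptions (single secondary receiver, unit noise power, and perfect knowledge of the interference each beam causes rather than a quantized description): among unit-norm random beams, those meeting the primary interference limit are kept and the one with the largest secondary SNR is chosen. All channel draws and thresholds are synthetic.

    import numpy as np

    rng = np.random.default_rng(6)
    n_beams, n_tx, noise_power, interference_limit = 8, 4, 1.0, 0.5

    def rayleigh(*shape):
        return (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)

    h_s, h_p = rayleigh(n_tx), rayleigh(n_tx)        # channels to secondary / primary receiver
    beams = rayleigh(n_beams, n_tx)
    beams /= np.linalg.norm(beams, axis=1, keepdims=True)   # unit-norm random beams

    signal = np.abs(beams @ h_s.conj()) ** 2
    leakage = np.abs(beams @ h_p.conj()) ** 2               # interference at the primary
    feasible = np.flatnonzero(leakage <= interference_limit)

    if feasible.size:
        best = feasible[np.argmax(signal[feasible] / noise_power)]
        print(f"beam {best}: SNR {signal[best] / noise_power:.2f}, leakage {leakage[best]:.2f}")
    else:
        print("no beam satisfies the primary interference constraint")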

  5. Selective oropharyngeal decontamination versus selective digestive decontamination in critically ill patients: a meta-analysis of randomized controlled trials

    Directory of Open Access Journals (Sweden)

    Zhao D

    2015-07-01

    Full Text Available Di Zhao,1,* Jian Song,2,* Xuan Gao,3 Fei Gao,4 Yupeng Wu,2 Yingying Lu,5 Kai Hou1 1Department of Neurosurgery, The First Hospital of Hebei Medical University, 2Department of Neurosurgery, 3Department of Neurology, The Second Hospital of Hebei Medical University, 4Hebei Provincial Procurement Centers for Medical Drugs and Devices, 5Department of Neurosurgery, The Second Hospital of Hebei Medical University, Shijiazhuang, People’s Republic of China *These authors contributed equally to this work Background: Selective digestive decontamination (SDD) and selective oropharyngeal decontamination (SOD) are associated with reduced mortality and infection rates among patients in intensive care units (ICUs); however, whether SOD has a superior effect than SDD remains uncertain. Hence, we conducted a meta-analysis of randomized controlled trials (RCTs) to compare SOD with SDD in terms of clinical outcomes and antimicrobial resistance rates in patients who were critically ill. Methods: RCTs published in PubMed, Embase, and Web of Science were systematically reviewed to compare the effects of SOD and SDD in patients who were critically ill. Outcomes included day-28 mortality, length of ICU stay, length of hospital stay, duration of mechanical ventilation, ICU-acquired bacteremia, and prevalence of antibiotic-resistant Gram-negative bacteria. Results were expressed as risk ratio (RR) with 95% confidence intervals (CIs), and weighted mean differences (WMDs) with 95% CIs. Pooled estimates were performed using a fixed-effects model or random-effects model, depending on the heterogeneity among studies. Results: A total of four RCTs involving 23,822 patients met the inclusion criteria and were included in this meta-analysis. Among patients whose admitting specialty was surgery, cardiothoracic surgery (57.3%) and neurosurgery (29.7%) were the two main types of surgery being performed. Pooled results showed that SOD had similar effects as SDD in day-28 mortality (RR =1

  6. Twenty Years of KSHV

    Directory of Open Access Journals (Sweden)

    Yuan Chang

    2014-11-01

    Full Text Available Twenty years ago, Kaposi’s sarcoma (KS was the oncologic counterpart to Winston Churchill’s Russia: a riddle, wrapped in a mystery, inside an enigma. First described by Moritz Kaposi in 1872, who reported it to be an aggressive skin tumor, KS became known over the next century as a slow-growing tumor of elderly men—in fact, most KS patients were expected to die with the tumor rather than from it. Nevertheless, the course and manifestations of the disease varied widely in different clinical contexts. The puzzle of KS came to the forefront as a harbinger of the AIDS epidemic. The articles in this issue of Viruses recount progress made in understanding Kaposi’s sarcoma herpesvirus (KSHV since its initial description in 1994.

  7. A Classification Study of Respiratory Syncytial Virus (RSV Inhibitors by Variable Selection with Random Forest

    Directory of Open Access Journals (Sweden)

    Shuwei Zhang

    2011-02-01

    Full Text Available Experimental pEC50s for 216 selective respiratory syncytial virus (RSV) inhibitors are used to develop classification models as a potential screening tool for a large library of target compounds. A variable selection algorithm coupled with random forests (VS-RF) is used to extract the physicochemical features most relevant to the RSV inhibition. Based on the selected small set of descriptors, four other widely used approaches, i.e., support vector machine (SVM), Gaussian process (GP), linear discriminant analysis (LDA), and k nearest neighbors (kNN) routines are also employed and compared with the VS-RF method in terms of several rigorous evaluation criteria. The obtained results indicate that the VS-RF model is a powerful tool for classification of RSV inhibitors, producing the highest overall accuracy of 94.34% for the external prediction set, which significantly outperforms the other four methods with an average accuracy of 80.66%. The proposed model, with excellent prediction capacity on both internal and external data, should be useful for screening and optimization of potential RSV inhibitors prior to chemical synthesis in drug development.

  8. Rigorous selection of random forest models for identifying compounds that activate toxicity-related pathways

    Directory of Open Access Journals (Sweden)

    Yoshihiro Uesawa

    2016-02-01

    Full Text Available Random forest (RF) is a machine-learning ensemble method with high predictive performance. Majority voting in RF uses the discrimination results in numerous decision trees produced from bootstrapping data. For the same dataset, the bootstrapping process yields different predictive capacities in each generation. As participants in the Toxicology in the 21st Century (Tox21) DATA Challenge 2014, we produced numerous RF models for predicting the structures of compounds that can activate each toxicity-related pathway, and then selected the model with the highest predictive ability. Half of the compounds in the training dataset supplied by the competition organizer were allocated to the validation dataset. The remaining compounds were used in model construction. The charged and uncharged forms of each molecule were calculated using the molecular operating environment (MOE) software. Subsequently, the descriptors were computed using MOE, MarvinView, and Dragon. These combined methods yielded over 4,071 descriptors for model construction. Using these descriptors, pattern recognition analyses were performed by RF implemented in JMP Pro (a statistical software package). A hundred to two hundred RF models were generated for each pathway. The predictive performance of each model was tested against the validation dataset, and the best-performing model was selected. In the competition, the latter model selected a best-performing model from the 50% test set that best predicted the structures of compounds that activate the estrogen receptor ligand-binding domain (ER-LBD).
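    A minimal sketch of the model-selection step only, on synthetic data: many random forests are trained on resampled copies of the training half and the one with the best validation-set AUC is kept. Descriptor generation with MOE/MarvinView/Dragon and the JMP Pro implementation are not reproduced, and the loop count is reduced from the 100-200 models used in the study.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=1000, n_features=300, n_informative=25,
                               weights=[0.9, 0.1], random_state=0)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.5, random_state=0)

    rng = np.random.default_rng(7)
    best_auc, best_model = -np.inf, None
    for i in range(25):                                  # 100-200 models in the paper
        idx = rng.integers(0, len(y_tr), len(y_tr))      # bootstrap-style resample
        rf = RandomForestClassifier(n_estimators=200, random_state=i).fit(X_tr[idx], y_tr[idx])
        auc = roc_auc_score(y_val, rf.predict_proba(X_val)[:, 1])
        if auc > best_auc:
            best_auc, best_model = auc, rf

    print(f"best validation AUC: {best_auc:.3f}")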

  9. High-dose statin pretreatment decreases periprocedural myocardial infarction and cardiovascular events in patients undergoing elective percutaneous coronary intervention: a meta-analysis of twenty-four randomized controlled trials.

    Directory of Open Access Journals (Sweden)

    Le Wang

    Full Text Available BACKGROUND: Evidence suggests that high-dose statin pretreatment may reduce the risk of periprocedural myocardial infarction (PMI) and major adverse cardiac events (MACE) for certain patients; however, previous analyses have not considered patients with a history of statin maintenance treatment. In this meta-analysis of randomized controlled trials (RCTs), we reevaluated the efficacy of short-term high-dose statin pretreatment to prevent PMI and MACE in an expanded set of patients undergoing elective percutaneous coronary intervention. METHODS: We searched the PubMed/Medline database for RCTs that compared high-dose statin pretreatment with no statin or low-dose statin pretreatment as a prevention of PMI and MACE. We evaluated the incidence of PMI and MACE, including death, spontaneous myocardial infarction, and target vessel revascularization at the longest follow-up for each study for subgroups stratified by disease classification and prior low-dose statin treatment. RESULTS: Twenty-four RCTs with a total of 5,526 patients were identified. High-dose statin pretreatment was associated with 59% relative reduction in PMI (odds ratio [OR]: 0.41; 95% confidence interval [CI]: 0.34-0.49; P<0.00001) and 39% relative reduction in MACE (OR: 0.61; 95% CI: 0.45-0.83; P = 0.002). The benefit of high-dose statin pretreatment on MACE was significant for statin-naive patients (OR: 0.69; 95% CI: 0.50-0.95; P = 0.02) and prior low dose statin-treated patients (OR: 0.28; 95% CI: 0.12-0.65; P = 0.003); and for patients with acute coronary syndrome (OR: 0.52; 95% CI: 0.34-0.79; P = 0.003), but not for patients with stable angina (OR: 0.71; 95% CI 0.45-1.10; P = 0.12). Long-term effects on survival were less obvious. CONCLUSIONS: High-dose statin pretreatment can result in a significant reduction in PMI and MACE for patients undergoing elective PCI. The positive effect of high-dose statin pretreatment on PMI and MACE is significant for statin-naïve patients and patients

  10. High-Dose Statin Pretreatment Decreases Periprocedural Myocardial Infarction and Cardiovascular Events in Patients Undergoing Elective Percutaneous Coronary Intervention: A Meta-Analysis of Twenty-Four Randomized Controlled Trials

    Science.gov (United States)

    Wang, Le; Peng, Pingan; Zhang, Ou; Xu, Xiaohan; Yang, Shiwei; Zhao, Yingxin; Zhou, Yujie

    2014-01-01

    Background Evidence suggests that high-dose statin pretreatment may reduce the risk of periprocedural myocardial infarction (PMI) and major adverse cardiac events (MACE) for certain patients; however, previous analyses have not considered patients with a history of statin maintenance treatment. In this meta-analysis of randomized controlled trials (RCTs), we reevaluated the efficacy of short-term high-dose statin pretreatment to prevent PMI and MACE in an expanded set of patients undergoing elective percutaneous coronary intervention. Methods We searched the PubMed/Medline database for RCTs that compared high-dose statin pretreatment with no statin or low-dose statin pretreatment as a prevention of PMI and MACE. We evaluated the incidence of PMI and MACE, including death, spontaneous myocardial infarction, and target vessel revascularization at the longest follow-up for each study for subgroups stratified by disease classification and prior low-dose statin treatment. Results Twenty-four RCTs with a total of 5,526 patients were identified. High-dose statin pretreatment was associated with 59% relative reduction in PMI (odds ratio [OR]: 0.41; 95% confidence interval [CI]: 0.34–0.49; P<0.00001) and 39% relative reduction in MACE (OR: 0.61; 95% CI: 0.45–0.83; P = 0.002). The benefit of high-dose statin pretreatment on MACE was significant for statin-naive patients (OR: 0.69; 95% CI: 0.50–0.95; P = 0.02) and prior low dose statin-treated patients (OR: 0.28; 95% CI: 0.12–0.65; P = 0.003); and for patients with acute coronary syndrome (OR: 0.52; 95% CI: 0.34–0.79; P = 0.003), but not for patients with stable angina (OR: 0.71; 95% CI 0.45–1.10; P = 0.12). Long-term effects on survival were less obvious. Conclusions High-dose statin pretreatment can result in a significant reduction in PMI and MACE for patients undergoing elective PCI. The positive effect of high-dose statin pretreatment on PMI and MACE is significant for statin-naïve patients and patients with prior treatment. The positive effect of high-dose statin pretreatment on MACE is significant for

  11. Selection of Unique Escherichia coli Clones by Random Amplified Polymorphic DNA (RAPD): Evaluation by Whole Genome Sequencing

    Science.gov (United States)

    Nielsen, Karen L.; Godfrey, Paul A.; Stegger, Marc; Andersen, Paal S.; Feldgarden, Michael; Frimodt-Møller, Niels

    2014-01-01

    Identifying and characterizing clonal diversity is important when analysing fecal flora. We evaluated random amplified polymorphic DNA (RAPD) PCR, applied for selection of Escherichia coli isolates, by whole genome sequencing. RAPD was fast, and reproducible as screening method for selection of distinct E. coli clones in fecal swabs. PMID:24912108

  12. A randomized clinical trial of selective laser trabeculoplasty versus argon laser trabeculoplasty in patients with pseudoexfoliation.

    Science.gov (United States)

    Kent, Shefalee S; Hutnik, Cindy M L; Birt, Catherine M; Damji, Karim F; Harasymowycz, Paul; Si, Francie; Hodge, William; Pan, Irene; Crichton, Andrew

    2015-01-01

    To evaluate the efficacy of selective laser trabeculoplasty (SLT) versus argon laser trabeculoplasty (ALT) in lowering the intraocular pressure (IOP) in patients with open-angle glaucoma or ocular hypertension secondary to pseudoexfoliation. Multicentered randomized clinical trial. A total of 76 eyes from 60 patients with pseudoexfoliation and uncontrolled IOP were recruited from 5 Canadian academic institutions. Patients with prior laser trabeculoplasty, ocular surgery within 6 months, previous glaucoma surgery, an advanced visual field defect, current steroid use, and monocular patients were excluded. Eyes were randomized to receive either 180-degree SLT or 180-degree ALT by a nonblocked randomization schedule stratified by center. The primary outcome was the change in IOP at 6 months versus baseline and secondary outcomes included change in number of glaucoma medications after laser. Baseline variables included age, sex, angle grade, angle pigmentation, and number of glaucoma medications. Of the 76 eyes, 45 eyes received SLT and 31 eyes received ALT. The overall age was 72.9 years (65% females). The baseline IOPs in the SLT and ALT groups were 23.1 and 25.2 mm Hg, respectively (P=0.03). The IOP reduction 6 months after SLT was -6.8 mm Hg and post-ALT was -7.7 mm Hg (P>0.05). The SLT group had reduced glaucoma medications by 0.16 medications at 6 months and the ALT group had no decrease in medications over the same time period (P=0.59). There were no postlaser IOP spikes in either group. ALT and SLT are equivalent in lowering IOP at 6 months posttreatment in patients with PXF.

  13. A Randomized Heuristic for Kernel Parameter Selection with Large-scale Multi-class Data

    DEFF Research Database (Denmark)

    Hansen, Toke Jansen; Abrahamsen, Trine Julie; Hansen, Lars Kai

    2011-01-01

    Over the past few years kernel methods have gained a tremendous amount of attention as existing linear algorithms can easily be extended to account for highly non-linear data in a computationally efficient manner. Unfortunately most kernels require careful tuning of intrinsic parameters.... In this contribution we investigate a novel randomized approach for kernel parameter selection in large-scale multi-class data. We fit a minimum enclosing ball to the class means in Reproducing Kernel Hilbert Spaces (RKHS), and use the radius as a quality measure of the space, defined by the kernel parameter. We apply...
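    A minimal sketch of the radius heuristic under a simplification: class means are embedded via the kernel trick and the enclosing-ball radius is approximated by the largest distance from their centroid (the paper fits an exact minimum enclosing ball). The blob data and the scanned RBF widths are illustrative.

    import numpy as np
    from sklearn.datasets import make_blobs
    from sklearn.metrics.pairwise import rbf_kernel

    X, y = make_blobs(n_samples=300, centers=5, n_features=10, random_state=0)
    classes = np.unique(y)

    def class_mean_radius(gamma):
        K = rbf_kernel(X, X, gamma=gamma)
        # Gram matrix of class means: M[a, b] = <mu_a, mu_b> in the RKHS
        M = np.array([[K[np.ix_(y == a, y == b)].mean() for b in classes]
                      for a in classes])
        c = M.mean(axis=1)                       # inner products of each mean with the centroid
        d_sq = np.diag(M) - 2.0 * c + M.mean()   # squared distances to the centroid
        return np.sqrt(d_sq.max())

    for gamma in [0.001, 0.01, 0.1, 1.0]:
        print(f"gamma={gamma:g}: radius {class_mean_radius(gamma):.3f}")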

  14. Implications of structural genomics target selection strategies: Pfam5000, whole genome, and random approaches

    Energy Technology Data Exchange (ETDEWEB)

    Chandonia, John-Marc; Brenner, Steven E.

    2004-07-14

    The structural genomics project is an international effort to determine the three-dimensional shapes of all important biological macromolecules, with a primary focus on proteins. Target proteins should be selected according to a strategy which is medically and biologically relevant, of good value, and tractable. As an option to consider, we present the Pfam5000 strategy, which involves selecting the 5000 most important families from the Pfam database as sources for targets. We compare the Pfam5000 strategy to several other proposed strategies that would require similar numbers of targets. These include complete solution of several small to moderately sized bacterial proteomes, partial coverage of the human proteome, and random selection of approximately 5000 targets from sequenced genomes. We measure the impact that successful implementation of these strategies would have upon structural interpretation of the proteins in Swiss-Prot, TrEMBL, and 131 complete proteomes (including 10 of eukaryotes) from the Proteome Analysis database at EBI. Solving the structures of proteins from the 5000 largest Pfam families would allow accurate fold assignment for approximately 68 percent of all prokaryotic proteins (covering 59 percent of residues) and 61 percent of eukaryotic proteins (40 percent of residues). More fine-grained coverage which would allow accurate modeling of these proteins would require an order of magnitude more targets. The Pfam5000 strategy may be modified in several ways, for example to focus on larger families, bacterial sequences, or eukaryotic sequences; as long as secondary consideration is given to large families within Pfam, coverage results vary only slightly. In contrast, focusing structural genomics on a single tractable genome would have only a limited impact in structural knowledge of other proteomes: a significant fraction (about 30-40 percent of the proteins, and 40-60 percent of the residues) of each proteome is classified in small

  15. Selection of trkB-binding peptides from a phage-displayed random peptide library

    Institute of Scientific and Technical Information of China (English)

    马仲才; 吴晓兰; 曹明媚; 潘卫; 朱分禄; 陈景山; 戚中田

    2003-01-01

    Brain-derived neurotrophic factor (BDNF) shows potential in the treatment of neurodegenerative diseases, but the therapeutic application of BDNF has been greatly limited because it is too large in molecular size to permeate the blood-brain barrier. To develop low-molecular-weight BDNF-like peptides, we selected a phage-displayed random peptide library using trkB expressed on NIH 3T3 cells as the target in this study. With the strategy of peptide library incubation with NIH 3T3 cells and competitive elution with 1 μg/mL of BDNF in the last round of selection, the specific phages able to bind to the natural conformation of trkB and antagonize BDNF binding to trkB were enriched effectively. Five trkB-binding peptides were obtained, in which a core sequence of CRA/TXφXXφXXC (X represents the random amino acids, φ represents T, L or I) was identified. The BDNF-like activity of these five peptides displayed on phages was not observed, though all of them antagonized the activity of BDNF in a dose-dependent manner. Similar results were obtained with the synthetic peptide of the C1 clone, indicating that the 5 phage-derived peptides were trkB antagonists. These low-molecular-weight antagonists of trkB may be of potential application in the treatment of neuroblastoma and chronic pain. Meanwhile, the obtained core sequence also could be used as the base to construct the secondary phage-displayed peptide library for further development of small peptides mimicking BDNF activity.

  16. Top at Twenty

    CERN Document Server

    2015-01-01

    The "Top at Twenty" workshop is dedicated to the celebration of 20 years since the top quark discovery at Fermilab in 1995. Speakers from all experiments capable of studying the top quark (ATLAS, CDF, CMS, and DZero) will present the most recent results of top quark studies based on Run II of the Tevatron and Run I of the LHC. Reviews of fundamental measurements such as the top quark's mass, spin, charge, and production properties are planned, some of them orders of magnitude more precise than the original CDF and DZero papers announcing the top quark discovery. Measurements of top quark production and decay that illuminate the nature of the Higgs boson and seek new phenomena will be presented. Theoretical talks on how the top quark fits into the Standard Model and its potential extensions will also be presented. This workshop will complement the yearly Top Workshop, which is held in September, and will benefit from many new results expected to be presented at winter conferences in 2015...

  17. The adverse effect of selective cyclooxygenase-2 inhibitor on random skin flap survival in rats.

    Directory of Open Access Journals (Sweden)

    Haiyong Ren

    BACKGROUND: Cyclooxygenase-2 (COX-2) inhibitors provide the desired analgesic effects after injury or surgery, but evidence suggests they also attenuate wound healing. This study investigates the effect of a COX-2 inhibitor on random skin flap survival. METHODS: The McFarlane flap model was established in 40 rats divided into two groups, which received equal volumes of parecoxib or saline by injection for 7 days. The necrotic area of the flap was measured, and flap specimens were stained with haematoxylin-eosin (HE) for histologic analysis. Immunohistochemical staining was performed to analyse the levels of VEGF and COX-2. RESULTS: Seven days after the operation, the flap necrotic area ratio in the study group (66.65 ± 2.81%) was significantly larger than that of the control group (48.81 ± 2.33%) (P < 0.01). Histological analysis demonstrated that angiogenesis, measured as mean vessel density per mm², was lower in the study group (15.4 ± 4.4) than in the control group (27.2 ± 4.1) (P < 0.05). The expression of COX-2 and VEGF protein in intermediate area II was evaluated in both groups by immunohistochemistry. The expression of COX-2 in the study group was 1022.45 ± 153.1, versus 2638.05 ± 132.2 in the control group (P < 0.01); the expression of VEGF in the study and control groups was 2779.45 ± 472.0 versus 4938.05 ± 123.6 (P < 0.01). In the COX-2 inhibitor group, the expression of COX-2 and VEGF protein was remarkably down-regulated compared with the control group. CONCLUSION: The selective COX-2 inhibitor had an adverse effect on random skin flap survival. Suppression of neovascularization induced by the low level of VEGF is the proposed biological mechanism.

  18. Random genetic drift, natural selection, and noise in human cranial evolution.

    Science.gov (United States)

    Roseman, Charles C

    2016-08-01

    This study assesses the extent to which relationships among groups complicate comparative studies of adaptation in recent human cranial variation and the extent to which departures from neutral additive models of evolution hinder the reconstruction of population relationships among groups using cranial morphology. Using a maximum likelihood evolutionary model fitting approach and a mixed population genomic and cranial data set, I evaluate the relative fits of several widely used models of human cranial evolution. Moreover, I compare the goodness of fit of models of cranial evolution constrained by genomic variation to test hypotheses about population specific departures from neutrality. Models from population genomics are much better fits to cranial variation than are traditional models from comparative human biology. There is not enough evolutionary information in the cranium to reconstruct much of recent human evolution but the influence of population history on cranial variation is strong enough to cause comparative studies of adaptation serious difficulties. Deviations from a model of random genetic drift along a tree-like population history show the importance of environmental effects, gene flow, and/or natural selection on human cranial variation. Moreover, there is a strong signal of the effect of natural selection or an environmental factor on a group of humans from Siberia. The evolution of the human cranium is complex and no one evolutionary process has prevailed at the expense of all others. A holistic unification of phenome, genome, and environmental context, gives us a strong point of purchase on these problems, which is unavailable to any one traditional approach alone. Am J Phys Anthropol 160:582-592, 2016. © 2016 Wiley Periodicals, Inc.

  19. Equivalence between Step Selection Functions and Biased Correlated Random Walks for Statistical Inference on Animal Movement.

    Science.gov (United States)

    Duchesne, Thierry; Fortin, Daniel; Rivest, Louis-Paul

    2015-01-01

    Animal movement has a fundamental impact on population and community structure and dynamics. Biased correlated random walks (BCRW) and step selection functions (SSF) are commonly used to study movements. Because no studies have contrasted the parameters and the statistical properties of their estimators for models constructed under these two Lagrangian approaches, it remains unclear whether or not they allow for similar inference. First, we used the Weak Law of Large Numbers to demonstrate that the log-likelihood function for estimating the parameters of BCRW models can be approximated by the log-likelihood of SSFs. Second, we illustrated the link between the two approaches by fitting BCRW with maximum likelihood and with SSF to simulated movement data in virtual environments and to the trajectory of bison (Bison bison L.) trails in natural landscapes. Using simulated and empirical data, we found that the parameters of a BCRW estimated directly from maximum likelihood and by fitting an SSF were remarkably similar. Movement analysis is increasingly used as a tool for understanding the influence of landscape properties on animal distribution. In the rapidly developing field of movement ecology, management and conservation biologists must decide which method they should implement to accurately assess the determinants of animal movement. We showed that BCRW and SSF can provide similar insights into the environmental features influencing animal movements. Both techniques have advantages. BCRW has already been extended to allow for multi-state modeling. Unlike BCRW, however, SSF can be estimated using most statistical packages, it can simultaneously evaluate habitat selection and movement biases, and can easily integrate a large number of movement taxes at multiple scales. SSF thus offers a simple, yet effective, statistical technique to identify movement taxis.
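
    The SSF side of the equivalence amounts to a conditional-logit likelihood in which each observed step is compared with a set of available steps from the same stratum. The sketch below fits that likelihood to simulated data; the number of strata, the two step covariates, and the true coefficients are invented for illustration and are not the bison data analysed in the paper.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(1)

    # Simulated strata: one observed step plus M available (control) steps each,
    # described by two step covariates (e.g. habitat value and step length).
    n_strata, M, beta_true = 300, 10, np.array([1.2, -0.7])
    X = rng.normal(size=(n_strata, M + 1, 2))          # candidate steps per stratum
    # Choose the "observed" step in each stratum according to the SSF weights.
    w = np.exp(X @ beta_true)
    obs = np.array([rng.choice(M + 1, p=wi / wi.sum()) for wi in w])

    def neg_log_lik(beta):
        """Negative conditional-logit log-likelihood of the step selection function."""
        s = X @ beta                                   # linear scores of candidate steps
        log_denom = np.log(np.exp(s).sum(axis=1))
        return -(s[np.arange(n_strata), obs] - log_denom).sum()

    fit = minimize(neg_log_lik, x0=np.zeros(2), method="BFGS")
    print("estimated selection coefficients:", fit.x)  # should be close to beta_true
    ```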

  20. CURE-SMOTE algorithm and hybrid algorithm for feature selection and parameter optimization based on random forests.

    Science.gov (United States)

    Ma, Li; Fan, Suohai

    2017-03-14

    The random forests algorithm is a classifier with prominent universality, a wide application range, and robustness against overfitting, but it still has some drawbacks. Therefore, to improve the performance of random forests, this paper addresses imbalanced data processing, feature selection, and parameter optimization. We propose the CURE-SMOTE algorithm for the imbalanced data classification problem. Experiments on imbalanced UCI data reveal that combining Clustering Using Representatives (CURE) with the original synthetic minority oversampling technique (SMOTE) is effective compared with the classification results obtained on the original data using random sampling, Borderline-SMOTE1, safe-level SMOTE, C-SMOTE, and k-means-SMOTE. Additionally, a hybrid RF (random forests) algorithm is proposed for feature selection and parameter optimization, which uses the minimum out-of-bag (OOB) data error as its objective function. Simulation results on binary and higher-dimensional data indicate that the proposed hybrid RF algorithms (the hybrid genetic-random forests algorithm, the hybrid particle swarm-random forests algorithm, and the hybrid fish swarm-random forests algorithm) can achieve the minimum OOB error and show the best generalization ability. The training set produced by the proposed CURE-SMOTE algorithm is closer to the original data distribution because it contains minimal noise. Thus, better classification results are produced by this feasible and effective algorithm. Moreover, the hybrid algorithms' F-value, G-mean, AUC, and OOB scores surpass those of the original RF algorithm. Hence, the hybrid algorithm provides a new way to perform feature selection and parameter optimization.
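
    The core of the hybrid idea is to treat the OOB error as the objective function of a hyper-parameter search. The sketch below uses scikit-learn's oob_score_ and a plain random search standing in for the genetic, particle swarm, or fish swarm metaheuristics of the paper; the dataset and search ranges are illustrative.

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(42)
    X, y = make_classification(n_samples=600, n_features=30, n_informative=8,
                               weights=[0.9, 0.1], random_state=0)

    def oob_error(params):
        """Out-of-bag error of a random forest with the given hyper-parameters."""
        rf = RandomForestClassifier(n_estimators=params["n_estimators"],
                                    max_features=params["max_features"],
                                    oob_score=True, bootstrap=True,
                                    random_state=0, n_jobs=-1).fit(X, y)
        return 1.0 - rf.oob_score_

    # Plain random search over the knobs the hybrid algorithms tune; a GA or PSO
    # would replace this loop with its own proposal mechanism.
    best = None
    for _ in range(20):
        cand = {"n_estimators": int(rng.integers(50, 400)),
                "max_features": int(rng.integers(1, X.shape[1] + 1))}
        err = oob_error(cand)
        if best is None or err < best[0]:
            best = (err, cand)

    print("lowest OOB error %.3f with %s" % best)
    ```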

  1. Melanocytic Hyperplasia in the Epidermis Overlying Trichoblastomas in 100 Randomly Selected Cases.

    Science.gov (United States)

    Al Omoush, Tahseen M M; Michal, Michael; Konstantinova, Anastasia M; Michal, Michal; Kutzner, Heinz; Kazakov, Dmitry V

    2016-04-01

    One hundred cases of trichoblastomas (large nodular, small nodular, cribriform, lymphadenoma, and columnar) were randomly selected and studied for the presence of melanocytic hyperplasia in the epidermis overlying the tumors, which was defined as foci of increased melanocytes in the basal layer of the epidermis (more than 1 per 4 basal keratinocytes). Focal melanocytic hyperplasia was detected in a total of 22 cases of trichoblastoma (22%), and this phenomenon was most frequently seen in columnar trichoblastoma (7 cases), followed by large nodular trichoblastoma (5 cases). The mechanism of epidermal melanocytic hyperplasia overlying trichoblastoma is unclear. Ultraviolet may be a contributing factor, as focal melanocytic hyperplasia was also detected in one-third of cases in the epidermis overlying uninvolved skin, usually associated with solar elastosis. This is further corroborated by the occurrence of the lesions predominantly on the face. Melanocytic hyperplasia overlying trichoblastoma appears to have no impact on the clinical appearance of the lesion and is recognized only microscopically. In an adequate biopsy specimen containing at least part of trichoblastoma, it should not cause any diagnostic problems.

  2. Sexually transmitted diseases among randomly selected attenders at an antenatal clinic in The Gambia.

    Science.gov (United States)

    Mabey, D C; Lloyd-Evans, N E; Conteh, S; Forsey, T

    1984-10-01

    One hundred randomly selected women attending a free government antenatal clinic in the town of Bakau, The Gambia, were examined. Vaginal swabs were taken for microscopical examination for Trichomonas vaginalis and for culture on Sabouraud's medium. Cervical swabs were taken for culture of Neisseria gonorrhoeae and Chlamydia trachomatis and, in 50 cases, Herpesvirus hominis; in addition, urethral swabs were taken for culture of N gonorrhoeae. Serum samples were tested for antibodies to Treponema pallidum by the Venereal Diseases Research Laboratory (VDRL) test and T pallidum haemagglutination assay (TPHA), and to C trachomatis and H hominis by microimmunofluorescence. The prevalence of infection with Candida albicans was found to be 35%, T vaginalis 32%, C trachomatis 6.9%, N gonorrhoeae 6.7%, T pallidum 1%, and H hominis 0%. IgG antibodies at a titre of at least 1/16 to C trachomatis serotypes D-K were found in 29.4%, and to serotypes A-C in a further 10.6%. IgG antibodies at a titre of at least 1/16 to H hominis type I were found in 94%, and to type II in 53%, although a proportion of the latter probably represent cross reacting antibodies to type I.

  3. Modeling Slotted Aloha as a Stochastic Game with Random Discrete Power Selection Algorithms

    Directory of Open Access Journals (Sweden)

    Rachid El-Azouzi

    2009-01-01

    We consider the uplink case of a cellular system where bufferless mobiles transmit over a common channel to a base station, using the slotted aloha medium access protocol. We study the performance of this system under several power differentiation schemes. Specifically, we consider a random set of selectable transmission powers and further study the impact of priorities given either to newly arrived packets or to backlogged ones. We then address a general capture model in which a mobile transmits a packet successfully if its instantaneous SINR (signal to interference plus noise ratio) is larger than some fixed threshold. Under this capture model, we analyze both the cooperative team problem, in which a common goal is jointly optimized, and the noncooperative game problem, in which mobiles seek to optimize their own objectives. Furthermore, we derive the throughput and the expected delay and use them as the objectives to optimize, and we provide a stability analysis as an alternative study. Exhaustive performance evaluations were carried out; we show that schemes with power differentiation significantly improve individual as well as global performance and can in some cases eliminate the bi-stable nature of slotted aloha.
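
    A quick Monte Carlo sketch conveys why random power selection with SINR capture raises throughput: when transmissions collide, the strongest one may still be captured. This is a toy simulation, not the paper's game-theoretic analysis; the transmission probability, power levels, noise, and SINR threshold are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def simulate_throughput(n_mobiles, p_tx, power_levels, sinr_threshold,
                            noise=1e-3, n_slots=50_000):
        """Monte Carlo throughput of slotted aloha when each transmitter picks one
        of several discrete power levels, and a packet is captured if its SINR
        exceeds a fixed threshold."""
        successes = 0
        for _ in range(n_slots):
            active = rng.random(n_mobiles) < p_tx            # who transmits this slot
            if not active.any():
                continue
            powers = rng.choice(power_levels, size=active.sum())
            total = powers.sum()
            sinr = powers / (total - powers + noise)         # interference = others' power
            successes += int((sinr > sinr_threshold).sum())  # captured packets
        return successes / n_slots

    # With power differentiation, capture lets a packet survive many collisions.
    print("single level :", simulate_throughput(10, 0.15, [1.0], 1.0))
    print("three levels :", simulate_throughput(10, 0.15, [1.0, 4.0, 16.0], 1.0))
    ```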

  4. Coating Efficiency in Preventing Photolytic Degradation of Two Randomly Selected Brands of Metoprolol Tartrate

    Directory of Open Access Journals (Sweden)

    Md. Anisur Rahman

    2016-03-01

    Full Text Available This research work was carried out to determine whether the film coating is effective to prevent the photolytic degradation of Metoprolol tartrate which is known to have photosensitivity. For this purpose, two randomly selected brands of two different pharmaceutical companies were chosen i.e. Brand A and Brand B. These two brands were exposed to different lighting conditions (normal light, direct sunlight as well as two incandescent lights i.e. 25 watt bulb, 40 watt bulb. Potency tests were performed using UV spectroscopy which showed gradual decline in potency of the tablets under aforesaid lighting conditions and the potency degradations were found 11.48%, 12.92%, 22.62%, 16.87% for Brand A and 14.74%, 14.24%, 10.88%, 18.10% for Brand B under 25 watt bulb, 40 watt bulb, direct sunlight and normal room light respectively. So this study reveals that the both brands containing metoprolol tartrate showed significant light sensitivity even though they are coated and protective opaque packaging is highly recommended for their protection.

  5. Most Undirected Random Graphs Are Amplifiers of Selection for Birth-Death Dynamics, but Suppressors of Selection for Death-Birth Dynamics.

    Directory of Open Access Journals (Sweden)

    Laura Hindersin

    2015-11-01

    Full Text Available We analyze evolutionary dynamics on graphs, where the nodes represent individuals of a population. The links of a node describe which other individuals can be displaced by the offspring of the individual on that node. Amplifiers of selection are graphs for which the fixation probability is increased for advantageous mutants and decreased for disadvantageous mutants. A few examples of such amplifiers have been developed, but so far it is unclear how many such structures exist and how to construct them. Here, we show that almost any undirected random graph is an amplifier of selection for Birth-death updating, where an individual is selected to reproduce with probability proportional to its fitness and one of its neighbors is replaced by that offspring at random. If we instead focus on death-Birth updating, in which a random individual is removed and its neighbors compete for the empty spot, then the same ensemble of graphs consists of almost only suppressors of selection for which the fixation probability is decreased for advantageous mutants and increased for disadvantageous mutants. Thus, the impact of population structure on evolutionary dynamics is a subtle issue that will depend on seemingly minor details of the underlying evolutionary process.
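
    The amplifier property can be checked numerically by estimating the fixation probability of a single advantageous mutant under Birth-death updating and comparing it with the well-mixed Moran value. The sketch below does this on a small Erdos-Renyi graph; the graph size, edge probability, fitness, and trial count are illustrative, and the simulation is a stand-in for the paper's analytical treatment.

    ```python
    import random
    import networkx as nx

    def fixation_probability(G, fitness, trials=1000):
        """Estimate the fixation probability of one mutant with relative fitness
        `fitness` under Birth-death updating on graph G."""
        nodes = list(G.nodes)
        fixed = 0
        for _ in range(trials):
            mutant = {random.choice(nodes)}            # one random initial mutant
            while 0 < len(mutant) < len(nodes):
                # Birth: pick a reproducer with probability proportional to fitness.
                weights = [fitness if v in mutant else 1.0 for v in nodes]
                parent = random.choices(nodes, weights=weights, k=1)[0]
                # death: a random neighbour is replaced by the offspring.
                victim = random.choice(list(G.neighbors(parent)))
                if parent in mutant:
                    mutant.add(victim)
                else:
                    mutant.discard(victim)
            fixed += len(mutant) == len(nodes)
        return fixed / trials

    random.seed(1)
    G = nx.erdos_renyi_graph(n=20, p=0.3, seed=1)
    while not nx.is_connected(G):                      # keep the demo graph connected
        G = nx.erdos_renyi_graph(n=20, p=0.3)

    N, r = G.number_of_nodes(), 1.5
    moran = (1 - 1 / r) / (1 - 1 / r ** N)             # well-mixed Moran baseline
    print("graph estimate:", fixation_probability(G, r))
    print("Moran baseline:", moran)
    ```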

  6. Outlook: The Next Twenty Years

    Energy Technology Data Exchange (ETDEWEB)

    Murayama, Hitoshi

    2003-12-07

    I present an outlook for the next twenty years in particle physics. I start with the big questions in our field, broken down into four categories: horizontal, vertical, heaven, and hell. Then I discuss how we will attack the big questions in each category during the next twenty years. I argue for a synergy between the many different approaches taken in our field.

  7. Bayesian dose selection design for a binary outcome using restricted response adaptive randomization.

    Science.gov (United States)

    Meinzer, Caitlyn; Martin, Renee; Suarez, Jose I

    2017-09-08

    In phase II trials, the most efficacious dose is usually not known. Moreover, given limited resources, it is difficult to robustly identify a dose while also testing for a signal of efficacy that would support a phase III trial. Recent designs have sought to be more efficient by exploring multiple doses through the use of adaptive strategies. However, the added flexibility may potentially increase the risk of making incorrect assumptions and reduce the total amount of information available across the dose range as a function of imbalanced sample size. To balance these challenges, a novel placebo-controlled design is presented in which a restricted Bayesian response adaptive randomization (RAR) is used to allocate a majority of subjects to the optimal dose of active drug, defined as the dose with the lowest probability of poor outcome. However, the allocation between subjects who receive active drug or placebo is held constant to retain the maximum possible power for a hypothesis test of overall efficacy comparing the optimal dose to placebo. The design properties and optimization of the design are presented in the context of a phase II trial for subarachnoid hemorrhage. For a fixed total sample size, a trade-off exists between the ability to select the optimal dose and the probability of rejecting the null hypothesis. This relationship is modified by the allocation ratio between active and control subjects, the choice of RAR algorithm, and the number of subjects allocated to an initial fixed allocation period. While a responsive RAR algorithm improves the ability to select the correct dose, there is an increased risk of assigning more subjects to a worse arm as a function of ephemeral trends in the data. A subarachnoid treatment trial is used to illustrate how this design can be customized for specific objectives and available data. Bayesian adaptive designs are a flexible approach to addressing multiple questions surrounding the optimal dose for treatment efficacy
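
    The restricted allocation rule described above can be sketched with Beta-Binomial posteriors: the placebo share is held fixed, and the remaining share is split across active doses by the posterior probability that each dose has the lowest rate of poor outcome. The Beta(1, 1) priors, placebo share, and interim counts below are illustrative, not the trial's actual design parameters.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def allocation_probs(events, n, placebo_share=0.4, n_draws=20_000):
        """events[k], n[k]: poor outcomes and enrolment so far on active dose k.
        Returns allocation probabilities for (placebo, dose_1, ..., dose_K)."""
        K = len(n)
        events, n = np.asarray(events), np.asarray(n)
        # Posterior draws of each dose's poor-outcome rate under Beta(1, 1) priors.
        draws = rng.beta(1 + events, 1 + n - events, size=(n_draws, K))
        p_best = np.bincount(draws.argmin(axis=1), minlength=K) / n_draws
        active_share = (1.0 - placebo_share) * p_best    # RAR over active doses only
        return np.concatenate(([placebo_share], active_share))

    # Interim data on three active doses: dose 2 looks best (fewest poor outcomes).
    print(allocation_probs(events=[9, 4, 8], n=[20, 20, 20]))
    ```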

  8. Using ArcMap, Google Earth, and Global Positioning Systems to select and locate random households in rural Haiti

    Directory of Open Access Journals (Sweden)

    Wampler Peter J

    2013-01-01

    Background A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground-based surveys to collect household location data and can be performed in a rapid, cost-effective manner. Methods The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel, which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. Results A total of 537 homes were initially mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed; 16.4% of the homes visited were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. Conclusions The method used to generate and field locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent and only
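
    The sampling step itself is simple once the homes have been digitised: shuffle the mapped coordinates with a fixed seed and keep the first k as the survey list. The sketch below uses made-up coordinates near Deschapelles and writes a CSV that could be converted to GPS waypoints; it is an illustration of the random-selection step only, not the authors' ArcMap/Excel workflow.

    ```python
    import csv
    import random

    random.seed(2012)                 # fixed seed so the sample is reproducible

    # Hypothetical homes digitised from Google Earth imagery: id, latitude, longitude.
    homes = [(f"H{i:03d}", 18.8 + random.random() * 0.05, -72.55 + random.random() * 0.05)
             for i in range(537)]

    sample = random.sample(homes, k=96)   # simple random sample of survey candidates

    with open("survey_homes.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["home_id", "lat", "lon"])
        writer.writerows(sample)
    print(f"wrote {len(sample)} randomly selected homes for GPS upload")
    ```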

  9. Automatised selection of load paths to construct reduced-order models in computational damage micromechanics: from dissipation-driven random selection to Bayesian optimization

    Science.gov (United States)

    Goury, Olivier; Amsallem, David; Bordas, Stéphane Pierre Alain; Liu, Wing Kam; Kerfriden, Pierre

    2016-08-01

    In this paper, we present new reliable model order reduction strategies for computational micromechanics. The difficulties lie mainly in the high dimensionality of the parameter space represented by any load path applied onto the representative volume element. We take special care of the challenge of selecting an exhaustive snapshot set. This is treated first by random sampling of energy-dissipating load paths and then, in a more advanced way, by Bayesian optimization associated with an interlocked division of the parameter space. Results show that we can ensure the selection of an exhaustive snapshot set from which a reliable reduced-order model can be built.
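
    The first stage of the strategy, random sampling of load paths followed by construction of a reduced basis from the resulting snapshots, can be sketched as below. The "RVE response" is a toy stand-in for the micromechanical solver, and the path lengths, sample counts, and energy threshold are illustrative; the Bayesian-optimization refinement is not shown.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def toy_rve_response(load_path, n_dof=200):
        """Stand-in for the RVE solver: one 'displacement' snapshot per load
        increment, a smooth nonlinear function of the load history."""
        x = np.linspace(0, 1, n_dof)
        snaps, history = [], 0.0
        for load in load_path:
            history += abs(load)                       # crude dissipation-like memory
            snaps.append(np.sin(np.pi * x * (1 + load)) * np.tanh(history))
        return np.array(snaps)                         # (n_increments, n_dof)

    # 1) Random sampling of load paths (random walks in load space).
    paths = [np.cumsum(rng.normal(0, 0.1, 30)) for _ in range(40)]
    snapshots = np.vstack([toy_rve_response(p) for p in paths])

    # 2) Build the reduced basis by POD (truncated SVD of the snapshot matrix).
    U, svals, _ = np.linalg.svd(snapshots.T, full_matrices=False)
    energy = np.cumsum(svals**2) / np.sum(svals**2)
    n_modes = int(np.searchsorted(energy, 0.999) + 1)
    print(f"{n_modes} POD modes capture 99.9% of the snapshot energy")
    ```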

  10. Generation of Aptamers from A Primer-Free Randomized ssDNA Library Using Magnetic-Assisted Rapid Aptamer Selection

    Science.gov (United States)

    Tsao, Shih-Ming; Lai, Ji-Ching; Horng, Horng-Er; Liu, Tu-Chen; Hong, Chin-Yih

    2017-04-01

    Aptamers are oligonucleotides that can bind to specific target molecules. Most aptamers are generated using random libraries in the standard systematic evolution of ligands by exponential enrichment (SELEX). Each random library contains oligonucleotides with a randomized central region and two fixed primer regions at both ends. The fixed primer regions are necessary for amplifying target-bound sequences by PCR. However, these extra-sequences may cause non-specific bindings, which potentially interfere with good binding for random sequences. The Magnetic-Assisted Rapid Aptamer Selection (MARAS) is a newly developed protocol for generating single-strand DNA aptamers. No repeat selection cycle is required in the protocol. This study proposes and demonstrates a method to isolate aptamers for C-reactive proteins (CRP) from a randomized ssDNA library containing no fixed sequences at 5′ and 3′ termini using the MARAS platform. Furthermore, the isolated primer-free aptamer was sequenced and binding affinity for CRP was analyzed. The specificity of the obtained aptamer was validated using blind serum samples. The result was consistent with monoclonal antibody-based nephelometry analysis, which indicated that a primer-free aptamer has high specificity toward targets. MARAS is a feasible platform for efficiently generating primer-free aptamers for clinical diagnoses.

  11. The basic science and mathematics of random mutation and natural selection.

    Science.gov (United States)

    Kleinman, Alan

    2014-12-20

    The mutation and natural selection phenomenon can and often does cause the failure of antimicrobial, herbicidal, pesticide, and cancer treatments, which act as selection pressures. This phenomenon operates in a mathematically predictable manner which, when understood, leads to approaches for reducing and preventing the failure of these selection pressures. The mathematical behavior of mutation and selection is derived using the principles given by probability theory. The derivation of the equations describing the mutation and selection phenomenon is carried out in the context of an empirical example.
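
    For orientation only (this is the elementary building block such derivations rest on, not the paper's full argument), the basic calculation can be written as:

    ```latex
    \[
    P(\text{at least one copy of a particular mutation in } n \text{ replications})
      = 1 - (1 - \mu)^{n},
    \]
    where $\mu$ is the per-replication probability of that mutation. If a second,
    independent mutation with probability $\mu_2$ must subsequently arise on the
    background of the first before the combination defeats the treatment, the joint
    probability after $n_1$ and $n_2$ replications of the respective lineages is
    approximately
    \[
    \bigl[1 - (1 - \mu_1)^{n_1}\bigr]\,\bigl[1 - (1 - \mu_2)^{n_2}\bigr].
    \]
    ```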

  12. Introduction of mismatches in a random shRNA-encoding library improves potency for phenotypic selection.

    Directory of Open Access Journals (Sweden)

    Yongping Wang

    RNA interference (RNAi) is a mechanism for interfering with gene expression through the action of small, non-coding RNAs. We previously constructed a short-hairpin-loop RNA (shRNA) encoding library that is random at the nucleotide level [1]. In this library, the stems of the hairpin are completely complementary. To improve the potency of initial hits, and therefore signal-to-noise ratios in library screening, as well as to simplify hit-sequence retrieval by PCR, we constructed a second-generation library in which we introduced random mismatches between the two halves of the stem of each hairpin, on a random template background. In a screen for shRNAs that protect an interleukin-3 (IL3) dependent cell line from IL3 withdrawal, our second-generation library yielded hit sequences with significantly higher potencies than those from the first-generation library in the same screen. Our method of random mutagenesis was effective for a random template and is therefore likely suitable for any DNA template of interest. The improved potency of our second-generation library expands the range of possible unbiased screens for small-RNA therapeutics and biologic tools.

  13. Prevalence and Severity of College Student Bereavement Examined in a Randomly Selected Sample

    Science.gov (United States)

    Balk, David E.; Walker, Andrea C.; Baker, Ardith

    2010-01-01

    The authors used stratified random sampling to assess the prevalence and severity of bereavement in college undergraduates, providing an advance over findings that emerge from convenience sampling methods or from anecdotal observations. Prior research using convenience sampling indicated that 22% to 30% of college students are within 12 months of…

  14. Demonstrating an Interactive Genetic Drift Exercise: Examining the Processes of Random Mating and Selection.

    Science.gov (United States)

    Carter, Ashley J. R.

    2002-01-01

    Presents a hands-on activity on the phenomenon of genetic drift in populations that reinforces the random nature of drift and demonstrates the effect of the population size on the mean frequency of an allele over a few generations. Includes materials for the demonstration, procedures, and discussion topics. (KHR)
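
    The classroom point, that smaller populations drift further from the starting allele frequency in the same number of generations, can be reproduced with a short Wright-Fisher style simulation. The binomial sampling scheme, population sizes, and generation count below are illustrative and are not the article's specific exercise.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def drift_trajectories(pop_size, p0=0.5, generations=20, replicates=5):
        """Allele frequencies under pure drift: each generation the allele count
        is a binomial draw from the previous generation's frequency."""
        freqs = np.full(replicates, p0)
        history = [freqs.copy()]
        for _ in range(generations):
            counts = rng.binomial(2 * pop_size, freqs)   # 2N gene copies sampled
            freqs = counts / (2 * pop_size)
            history.append(freqs.copy())
        return np.array(history)

    for N in (10, 100, 1000):
        final = drift_trajectories(N)[-1]
        print(f"N={N:5d}  final allele frequencies: {np.round(final, 2)}")
    ```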

  15. A RANDOMIZED SELECTIVE ENCRYPTION USING HASHING TECHNIQUE FOR SECURING VIDEO STREAMS

    Directory of Open Access Journals (Sweden)

    Lizyflorance. C

    2012-11-01

    Digital video transmissions are widely used in networks nowadays; hence, securing their contents and keeping privacy is vital. Several encryption algorithms have been proposed earlier to achieve secure video transmission, but attaining efficiency, security, and flexibility all together is a major challenge. To transmit a digital video, encryption is necessary to protect its contents from attacks. As the size of videos is usually large, their contents have to be compressed before transmission, and encryption is applied to the video content after compression. One encryption technique, selective encryption, encrypts only a subset of the data; a selective encryption algorithm reduces the amount of data to be encrypted while achieving a required level of security. In this paper we study the existing selective encryption algorithms and their classifications. The challenges in selective encryption algorithms and some future directions are also presented.
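
    The general shape of selective encryption, encrypting only a keyed, pseudo-randomly selected subset of blocks, can be shown with a toy sketch. The SHA-256-derived XOR keystream and the selection rule below are stand-ins chosen for self-containment; they are not the algorithms surveyed in the paper and are not secure for real use.

    ```python
    import hashlib

    def keystream(key: bytes, index: int, length: int) -> bytes:
        """Deterministic per-block keystream derived from SHA-256 (toy, not secure)."""
        out, counter = b"", 0
        while len(out) < length:
            out += hashlib.sha256(key + index.to_bytes(8, "big") +
                                  counter.to_bytes(4, "big")).digest()
            counter += 1
        return out[:length]

    def selective_encrypt(data: bytes, key: bytes, block=64, fraction=0.25) -> bytes:
        """Encrypt only a keyed, pseudo-randomly selected subset of blocks."""
        blocks = [data[i:i + block] for i in range(0, len(data), block)]
        out = []
        for i, b in enumerate(blocks):
            # A keyed hash decides whether this block is in the encrypted subset.
            tag = hashlib.sha256(key + b"select" + i.to_bytes(8, "big")).digest()
            if tag[0] / 255.0 < fraction:
                ks = keystream(key, i, len(b))
                b = bytes(x ^ y for x, y in zip(b, ks))   # XOR "encryption" (toy)
            out.append(b)
        return b"".join(out)

    frame = b"example video payload " * 40
    key = b"shared secret key"
    cipher = selective_encrypt(frame, key)
    # The same call inverts the XOR on exactly the same blocks.
    assert selective_encrypt(cipher, key) == frame
    print("selectively encrypted", sum(c != p for c, p in zip(cipher, frame)), "bytes")
    ```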

  16. An Improved Image Steganography Method Based on LSB Technique with Random Pixel Selection

    Directory of Open Access Journals (Sweden)

    Marwa M. Emam

    2016-03-01

    With the rapid advance in digital networks, information technology, digital libraries, and particularly World Wide Web services, many kinds of information can be retrieved at any time. Thus, the security issue has become one of the most significant problems for distributing new information, and it is necessary to protect this information while it passes over insecure channels. Steganography introduces a strong approach to hide secret data in appropriate media carriers such as images, audio files, text files, and video files. In this paper, a new image steganography method based on the spatial domain is proposed. According to the proposed method, the secret message is embedded randomly in pixel locations of the cover image chosen by a Pseudo Random Number Generator (PRNG), instead of being embedded sequentially in the pixels of the cover image. This randomization is expected to increase the security of the system. The proposed method works with two layers (Blue and Green) as a (2-1-2) layer arrangement, and each byte of the message is embedded in three pixels only, in the form (3-2-3). From the experimental results, it has been found that the proposed method achieves a very high Maximum Hiding Capacity (MHC) and higher visual quality, as indicated by the Peak Signal-to-Noise Ratio (PSNR).
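
    The random-pixel-selection idea can be sketched with a seeded PRNG permutation of pixel indices and LSB embedding in the blue and green channels. For simplicity this sketch hides one bit per channel visit and does not reproduce the (2-1-2)/(3-2-3) packing described above; the image, seed, and message are invented for the demo.

    ```python
    import numpy as np

    def embed(cover: np.ndarray, message: bytes, seed: int) -> np.ndarray:
        """Hide `message` in the LSBs of the blue and green channels of an RGB
        image, visiting pixels in a pseudo-random order given by `seed`."""
        bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
        h, w, _ = cover.shape
        assert len(bits) <= 2 * h * w, "message too long for this cover image"
        order = np.random.default_rng(seed).permutation(h * w)   # random pixel order
        stego = cover.copy()
        flat = stego.reshape(-1, 3)
        for k, bit in enumerate(bits):
            pixel = order[k // 2]
            channel = 2 if k % 2 == 0 else 1       # alternate blue (2) and green (1)
            flat[pixel, channel] = (flat[pixel, channel] & 0xFE) | bit
        return stego

    def extract(stego: np.ndarray, n_bytes: int, seed: int) -> bytes:
        """Recover `n_bytes` embedded with the same seed."""
        order = np.random.default_rng(seed).permutation(stego.shape[0] * stego.shape[1])
        flat = stego.reshape(-1, 3)
        bits = [flat[order[k // 2], 2 if k % 2 == 0 else 1] & 1
                for k in range(8 * n_bytes)]
        return np.packbits(np.array(bits, dtype=np.uint8)).tobytes()

    cover = np.random.default_rng(0).integers(0, 256, (64, 64, 3), dtype=np.uint8)
    secret = b"hidden"
    assert extract(embed(cover, secret, seed=99), len(secret), seed=99) == secret
    ```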

  17. Random frog: an efficient reversible jump Markov Chain Monte Carlo-like approach for variable selection with applications to gene selection and disease classification.

    Science.gov (United States)

    Li, Hong-Dong; Xu, Qing-Song; Liang, Yi-Zeng

    2012-08-31

    The identification of disease-relevant genes represents a challenge in microarray-based disease diagnosis where the sample size is often limited. Among established methods, reversible jump Markov Chain Monte Carlo (RJMCMC) methods have proven to be quite promising for variable selection. However, the design and application of an RJMCMC algorithm requires, for example, special criteria for prior distributions. Also, the simulation from joint posterior distributions of models is computationally intensive, and may even be mathematically intractable. These disadvantages may limit the applications of RJMCMC algorithms. Therefore, the development of algorithms that possess the advantages of RJMCMC methods and are also efficient and easy to follow for selecting disease-associated genes is required. Here we report an RJMCMC-like method, called random frog, that possesses the advantages of RJMCMC methods and is much easier to implement. Using the colon and the estrogen gene expression datasets, we show that random frog is effective in identifying discriminating genes. The top 2 ranked genes for colon and estrogen are Z50753, U00968, and Y10871_at, Z22536_at, respectively. (The source codes with GNU General Public License Version 2.0 are freely available to non-commercial users at: http://code.google.com/p/randomfrog/.).

  18. High polishing selectivity ceria slurry for formation of top electrode in spin-transfer torque magnetic random access memory

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Hao [Advanced Semiconductor Materials and Devices Development Center, Hanyang University, Seoul 133-791 (Korea, Republic of); Department of Electronics and Communication Engineering, Hanyang University, Seoul 133-791 (Korea, Republic of); Lim, Jae-Hyung [Advanced Semiconductor Materials and Devices Development Center, Hanyang University, Seoul 133-791 (Korea, Republic of); Department of Nanoscale Semiconductor Engineering, Hanyang University, Seoul 133-791 (Korea, Republic of); Park, Jin-Hyung [Advanced Semiconductor Materials and Devices Development Center, Hanyang University, Seoul 133-791 (Korea, Republic of); Park, Jea-Gun, E-mail: parkjgL@hanyang.ac.kr [Advanced Semiconductor Materials and Devices Development Center, Hanyang University, Seoul 133-791 (Korea, Republic of); Department of Electronics and Communication Engineering, Hanyang University, Seoul 133-791 (Korea, Republic of)

    2012-11-01

    During the formation of the top electrode (T.E.) in spin-transfer torque magnetic random access memory, a slurry with a high polishing rate of SiO2 and a low polishing rate of metal (the T.E. material) is required in the chemical mechanical planarization application area. We used a ceria-based slurry with a polymeric additive to maintain the high polishing rate of SiO2 while suppressing the polishing rate of the T.E. materials, tantalum and ruthenium. We found that ruthenium showed a significantly higher selectivity than tantalum in the ceria-based slurry. X-ray photoelectron spectroscopy was used to investigate the adsorption characteristics of the polymeric additive on the T.E. material. In addition to the adsorbed polymeric additive, we found that the zeta potential of the T.E. material played a critical role in determining the polishing selectivity of SiO2 to T.E. material. - Highlights: A highly selective chemical mechanical planarization (CMP) slurry was investigated; the slurry has a high selectivity of SiO2 to metals such as tantalum and ruthenium; spin-transfer-torque magnetic memory requires such a highly selective slurry; the surface zeta potential was used to explain the CMP mechanism; tantalum and ruthenium have different rate-determining steps during CMP.

  19. A Class of Shannon-McMillan theorems for mth-order Markov information source on generalized random selection system

    Directory of Open Access Journals (Sweden)

    Wang Kang Kang

    2013-06-01

    In this paper, our aim is to establish a class of Shannon-McMillan theorems for $m$th-order nonhomogeneous Markov information sources on the generalized random selection system by constructing the consistent distribution functions. As corollaries, we obtain some Shannon-McMillan theorems for $m$th-order nonhomogeneous Markov information sources and the general nonhomogeneous Markov information source. Some previously obtained results are extended. In the proof, a new technique for studying Shannon-McMillan theorems in information theory is applied.
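
    For orientation, the classical (unconditioned) form of the theorem that such results generalize can be stated as follows; the paper's versions for $m$th-order nonhomogeneous Markov sources under a random selection system refine this statement rather than restate it.

    ```latex
    % Classical Shannon-McMillan(-Breiman) statement for a stationary ergodic
    % finite-alphabet source (background only, not the paper's generalized form):
    \[
    -\frac{1}{n}\log p(X_1, X_2, \ldots, X_n) \;\xrightarrow[n\to\infty]{\text{a.s.}}\; H,
    \]
    where $H$ is the entropy rate of the source; for a stationary $m$th-order Markov
    source,
    \[
    H = -\sum_{x_1,\ldots,x_{m+1}} p(x_1,\ldots,x_m)\,
        p(x_{m+1}\mid x_1,\ldots,x_m)\,\log p(x_{m+1}\mid x_1,\ldots,x_m).
    \]
    ```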

  20. A Randomized Controlled Trial of Cognitive Debiasing Improves Assessment and Treatment Selection for Pediatric Bipolar Disorder

    Science.gov (United States)

    Jenkins, Melissa M.; Youngstrom, Eric A.

    2015-01-01

    Objective: This study examined the efficacy of a new cognitive debiasing intervention in reducing decision-making errors in the assessment of pediatric bipolar disorder (PBD). Method: The study was a randomized controlled trial using case vignette methodology. Participants were 137 mental health professionals working in different regions of the US (M=8.6±7.5 years of experience). Participants were randomly assigned to (1) a brief overview of PBD (control condition), or (2) the same brief overview plus a cognitive debiasing intervention (treatment condition) that educated participants about common cognitive pitfalls (e.g., base-rate neglect; search satisficing) and taught corrective strategies (e.g., mnemonics, Bayesian tools). Both groups evaluated four identical case vignettes. Primary outcome measures were clinicians' diagnoses and treatment decisions. The vignette characters' race/ethnicity was experimentally manipulated. Results: Participants in the treatment group showed better overall judgment accuracy and made better clinical recommendations, particularly in cases where participants missed comorbid conditions, failed to detect the possibility of hypomania or mania in depressed youths, and misdiagnosed classic manic symptoms. In contrast, effects of patient race were negligible. Conclusions: The cognitive debiasing intervention outperformed the control condition. Examining specific heuristics in cases of PBD may identify especially problematic mismatches between typical habits of thought and characteristics of the disorder. The debiasing intervention was brief and delivered via the Web; it has the potential to generalize and extend to other diagnoses as well as to various practice and training settings. PMID:26727411

  1. Pattern selection and self-organization induced by random boundary initial values in a neuronal network

    Science.gov (United States)

    Ma, Jun; Xu, Ying; Wang, Chunni; Jin, Wuyin

    2016-11-01

    Regular spatial patterns can be observed in spatiotemporal systems far from equilibrium. Artificial networks with different topologies are often designed to reproduce the collective behaviors of nodes (or neurons), where the local kinetics of each node are described by some kind of oscillator model. It is believed that the self-organization of a network depends greatly on the bifurcation parameters and the topology connection type; indeed, the boundary effect is very important for pattern formation in the network. In this paper, a regular network of Hindmarsh-Rose neurons is designed on a two-dimensional square array with nearest-neighbor connections. The neurons on the boundary are excited with a random stimulus. It is found that spiral waves, even a pair of spiral waves, can develop in the network under appropriate coupling intensity; otherwise, the spatial distribution of the network shows irregular states. A statistical variable is defined to detect the collective behavior by using mean field theory. It is confirmed that a regular pattern can develop when the synchronization degree is low. The potential mechanism could be that random perturbation on the boundary induces coherence resonance-like behavior, so that a spiral wave can develop in the network.
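
    A minimal version of such a lattice can be integrated directly: standard Hindmarsh-Rose equations on a square array with nearest-neighbour diffusive coupling and noisy current injected on the boundary nodes. The parameter values, coupling strength, noise amplitude, and the variance-based summary below are common illustrative choices, not the values used in the paper.

    ```python
    import numpy as np

    # Standard Hindmarsh-Rose parameters (a common textbook choice).
    a, b, c, d, r, s, x0, I = 1.0, 3.0, 1.0, 5.0, 0.006, 4.0, -1.6, 1.5

    def laplacian(x):
        """Nearest-neighbour coupling term with no-flux edges."""
        p = np.pad(x, 1, mode="edge")
        return p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:] - 4 * x

    def step(x, y, z, drive, D, dt=0.01):
        """One Euler step of the coupled Hindmarsh-Rose lattice."""
        dx = y - a * x**3 + b * x**2 - z + drive + D * laplacian(x)
        dy = c - d * x**2 - y
        dz = r * (s * (x - x0) - z)
        return x + dt * dx, y + dt * dy, z + dt * dz

    n, D, n_steps = 50, 1.0, 20000
    rng = np.random.default_rng(5)
    x = rng.uniform(-1.5, 1.5, (n, n)); y = np.zeros((n, n)); z = np.zeros((n, n))
    boundary = np.zeros((n, n), dtype=bool)
    boundary[0, :] = boundary[-1, :] = boundary[:, 0] = boundary[:, -1] = True

    for _ in range(n_steps):
        drive = np.full((n, n), I)
        drive[boundary] += rng.normal(0.0, 1.0, boundary.sum())  # random boundary stimulus
        x, y, z = step(x, y, z, drive, D)

    # Crude mean-field summary: large spatial variance suggests patterned (not
    # fully synchronized) activity.
    print("spatial variance of membrane potential:", float(x.var()))
    ```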

  2. Instrument selection for randomized controlled trials: why this and not that?

    Science.gov (United States)

    Records, Kathie; Keller, Colleen; Ainsworth, Barbara; Permana, Paska

    2012-01-01

    A fundamental linchpin for obtaining rigorous findings in quantitative research involves the selection of survey instruments. Psychometric recommendations are available for the processes for scale development and testing and guidance for selection of established scales. These processes are necessary to address the validity link between the phenomena under investigation, the empirical measures and, ultimately, the theoretical ties between these and the world views of the participants. Detailed information is most often provided about study design and protocols, but far less frequently is a detailed theoretical explanation provided for why specific instruments are chosen. Guidance to inform choices is often difficult to find when scales are needed for specific cultural, ethnic, or racial groups. This paper details the rationale underlying instrument selection for measurement of the major processes (intervention, mediator and moderator variables, outcome variables) in an ongoing study of postpartum Latinas, Madres para la Salud [Mothers for Health]. The rationale underpinning our choices includes a discussion of alternatives, when appropriate. These exemplars may provide direction for other intervention researchers who are working with specific cultural, racial, or ethnic groups or for other investigators who are seeking to select the 'best' instrument. Thoughtful consideration of measurement and articulation of the rationale underlying our choices facilitates the maintenance of rigor within the study design and improves our ability to assess study outcomes.

  3. Interpreting patterns of resource utilization: randomness and selectivity in pollen feeding by adult hoverflies.

    Science.gov (United States)

    Haslett, J R

    1989-03-01

    Adult syrphid flies feed primarily on pollen and nectar from flowers and may be regarded as suitable models for the investigation of resource partitioning in a plant/pollinator system. The present study examines the extent to which a small group of six species are selective in their diets and investigates the role of flower colour as a means by which such selectivity may occur. Flower feeding preferences were determined by pollen analyses of gut contents and an extensive flower sampling programme was undertaken to provide information on the relative abundances of the food resources available to the insects. Flower colours were defined by their reflectance spectra, and the inherent colour preferences of the flies were determined by field experiments in which natural flowers were simulated using painted plastic discs. The results reveal that some hoverfly species are highly selective in their pollen diets, while others have a more generalist approach to their foraging. The division of flower resources by the more selective species is shown to be dependent, at least partially, on the colours of the flowers. The findings are discussed in relation to the theories of Competition and Optimal Foraging and the 'mechanistic approach' to ecology. The use of learning models is suggested as an alternative means of investigating patterns of resource use in future research.

  4. Moral hazard and selection among the poor: evidence from a randomized experiment.

    Science.gov (United States)

    Spenkuch, Jörg L

    2012-01-01

    Not only does economic theory predict high-risk individuals to be more likely to purchase insurance, but insurance coverage is also thought to crowd out precautionary activities. In spite of stark theoretical predictions, there is conflicting empirical evidence on adverse selection, and evidence on ex ante moral hazard is very scarce. Using data from the Seguro Popular Experiment in Mexico, this paper documents patterns of selection on observables into health insurance as well as the existence of non-negligible ex ante moral hazard. More specifically, the findings indicate that (i) agents in poor self-assessed health prior to the intervention have, all else equal, a higher propensity to take up insurance; and (ii) insurance coverage reduces the demand for self-protection in the form of preventive care. Curiously, however, individuals do not sort based on objective measures of their health.

  5. Phenotypic evolution by distance in fluctuating environments: The contribution of dispersal, selection and random genetic drift.

    Science.gov (United States)

    Engen, Steinar; Sæther, Bernt-Erik

    2016-06-01

    Here we analyze how dispersal, genetic drift, and adaptation to the local environment affect the geographical differentiation of a quantitative character through natural selection using a spatial dynamic model for the evolution of the distribution of mean breeding values in space and time. The variation in optimal phenotype is described by local Ornstein-Uhlenbeck processes with a given spatial autocorrelation. Selection and drift are assumed to be governed by phenotypic variation within areas with a given mean breeding value and constant additive genetic variance. Between such neighboring areas there will be white noise variation in mean breeding values, while the variation at larger distances has a spatial structure and a spatial scale that we investigate. The model is analyzed by solving balance equations for the stationary distribution of mean breeding values. We also present scaling results for the spatial autocovariance function for mean breeding values as well as that for the covariance between mean breeding value and the optimal phenotype expressing local adaption. Our results show in particular how these spatial scales depend on population density. For large densities the spatial scale of fluctuations in mean breeding values have similarities with corresponding results in population dynamics, where the effect of migration on spatial scales may be large if the local strength of density regulation is small. In our evolutionary model strength of density regulation corresponds to strength of local selection so that weak local selection may produce large spatial scales of autocovariances. Genetic drift and stochastic migration are shown to act through the population size within a characteristic area with much smaller variation in optimal phenotypes than in the whole population.
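
    The spatially fluctuating optimum described above is the Ornstein-Uhlenbeck setting in quantitative-genetic models; a generic form of that ingredient (with notation chosen here for illustration, not necessarily the authors') is:

    ```latex
    % Generic Ornstein-Uhlenbeck model for the local optimum \theta(u, t) at
    % location u, with spatially correlated driving noise (notation illustrative):
    \[
    d\theta(u,t) = -\alpha\,\bigl[\theta(u,t) - \bar{\theta}\bigr]\,dt + \sigma\, dB(u,t),
    \qquad
    \operatorname{Corr}\bigl[dB(u,t),\, dB(v,t)\bigr] = \rho(\lVert u - v\rVert),
    \]
    so that $\theta$ is stationary with mean $\bar{\theta}$, variance $\sigma^2/(2\alpha)$,
    and a spatial autocorrelation inherited from the kernel $\rho$.
    ```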

  6. Evaluating a selective prevention programme for binge drinking among young adolescents: study protocol of a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Wiers Reinout

    2011-02-01

    Background: In comparison with other European countries, Dutch adolescents are at the top in drinking frequency and binge drinking; 75% of Dutch 12 to 16 year olds who drink alcohol also engage in binge drinking. A prevention programme called Preventure was developed in Canada to prevent adolescents from binge drinking. This article describes a study that aims to assess the effects of this selective school-based prevention programme in the Netherlands. Methods: A randomized controlled trial is being conducted among 13 to 15-year-old adolescents in secondary schools. Schools were randomly assigned to the intervention and control conditions. The intervention condition consisted of two 90-minute group sessions, carried out at the participants' schools and provided by a qualified counsellor and a co-facilitator. The intervention targeted young adolescents who demonstrated personality risk for alcohol abuse, and the group sessions were adapted to four personality profiles. The control condition received no further intervention beyond the standard substance use education sessions provided in the Dutch national curriculum. The primary outcomes will be the percentage reduction in binge drinking, weekly drinking, and drinking-related problems after three specified time periods. A screening survey collected data by means of an Internet questionnaire, and students have completed, or will complete, a post-treatment survey after 2, 6, and 12 months, also by means of an online questionnaire. Discussion: This study protocol presents the design and current implementation of a randomized controlled trial to evaluate the effectiveness of a selective alcohol prevention programme. We expect that, as a result of this intervention, a significantly lower number of adolescents will binge drink, drink weekly, and have drinking-related problems in the intervention condition compared with the control condition. Trial registration: This trial is registered in the Dutch

  7. Simple random sampling-based probe station selection for fault detection in wireless sensor networks.

    Science.gov (United States)

    Huang, Rimao; Qiu, Xuesong; Rui, Lanlan

    2011-01-01

    Fault detection for wireless sensor networks (WSNs) has been studied intensively in recent years. Most existing works statically choose the manager nodes as probe stations and probe the network at a fixed frequency. This straightforward solution, however, leads to several deficiencies. Firstly, by assigning the fault detection task only to the manager node, the whole network is out of balance, which quickly overloads the already heavily burdened manager node and in turn ultimately shortens the lifetime of the whole network. Secondly, probing at a fixed frequency often generates too much useless network traffic, which results in a waste of the limited network energy. Thirdly, the traditional algorithm for choosing a probing node is too complicated to be used in energy-critical wireless sensor networks. In this paper, we study the distribution characteristics of faulty nodes in wireless sensor networks and validate the Pareto principle that a small number of clusters contain most of the faults. We then present a Simple Random Sampling-based algorithm to dynamically choose sensor nodes as probe stations. A dynamic adjustment rule for the probing frequency is also proposed to reduce the number of useless probing packets. The simulation experiments demonstrate that the algorithm and adjustment rule we present can effectively prolong the lifetime of a wireless sensor network without decreasing the fault detection rate.
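
    The two ingredients named above, probe stations drawn by simple random sampling each round and a probing interval that adapts to what is found, can be sketched as follows. The toy network, the backoff rule, and all numeric parameters are guesses made for illustration, not the paper's algorithm.

    ```python
    import random

    random.seed(0)
    nodes = [f"n{i}" for i in range(100)]
    faulty = set(random.sample(nodes, 5))          # hidden ground truth for the demo

    def probe(station, targets):
        """Toy probe: the station reports which of its targets answer."""
        return {t: (t not in faulty) for t in targets}

    interval, detected = 10.0, set()
    for round_no in range(8):
        stations = random.sample(nodes, 10)        # simple random sampling of probe stations
        for s in stations:
            targets = random.sample([n for n in nodes if n != s], 8)
            detected |= {t for t, ok in probe(s, targets).items() if not ok}
        # Illustrative dynamic adjustment: probe more often once faults appear,
        # back off while nothing is found.
        interval = interval * 0.5 if detected else min(interval * 1.5, 60.0)
        print(f"round {round_no}: detected={len(detected)}  next interval={interval:.1f}s")
    ```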

  8. Priority and Random Selection for Dynamic Window Secured Implicit Geographic Routing in Wireless Sensor Network

    Directory of Open Access Journals (Sweden)

    Zurina M. Hanapi

    2009-01-01

    Problem statement: Sensor nodes are easily exposed to many attacks since they are deployed in unattended adversarial environments with no global addressing and are used for critical applications such as battlefield surveillance and emergency response. Because a sensor also needs to act as a router to relay messages to the required recipients, the vulnerabilities at the network layer increase. However, existing security mechanisms cannot be fitted directly into any sensor network because of the constraints on energy and computational capability of the sensor nodes themselves, which requires modification of the protocols associated with the sensor node in order to provide security. Approach: In this study, a Dynamic Window Secured Implicit Geographic Forwarding (DWIGF) routing protocol is presented, which is based on a lazy-binding technique and a dynamic collection window time, and inherits geographical routing techniques. Results: The DWIGF is able to minimize a Clear To Send (CTS) rushing attack and is robust against black hole and selective forwarding attacks, with high packet delivery ratios, because the selection of a failed node or an attacker is minimized. Moreover, several routing attacks are eliminated since the routing technique used is classified as geographic routing. Conclusion: This novel routing protocol promises secure routing without inserting any existing security mechanism inside.

  9. Reduced plasma aldosterone concentrations in randomly selected patients with insulin-dependent diabetes mellitus.

    LENUS (Irish Health Repository)

    Cronin, C C

    2012-02-03

    Abnormalities of the renin-angiotensin system have been reported in patients with diabetes mellitus and with diabetic complications. In this study, plasma concentrations of prorenin, renin, and aldosterone were measured in a stratified random sample of 110 insulin-dependent (Type 1) diabetic patients attending our outpatient clinic. Fifty-four age- and sex-matched control subjects were also examined. Plasma prorenin concentration was higher in patients without complications than in control subjects when upright (geometric mean (95% confidence intervals, CI): 75.9 (55.0-105.6) vs 45.1 (31.6-64.3) mU l-1, p < 0.05). There was no difference in plasma prorenin concentration between patients without and with microalbuminuria and between patients without and with background retinopathy. Plasma renin concentration, both when supine and upright, was similar in control subjects, in patients without complications, and in patients with varying degrees of diabetic microangiopathy. Plasma aldosterone was suppressed in patients without complications in comparison to control subjects (74 (58-95) vs 167 (140-199) ng l-1, p < 0.001) and was also suppressed in patients with microvascular disease. Plasma potassium was significantly higher in patients than in control subjects (mean ± standard deviation: 4.10 ± 0.36 vs 3.89 ± 0.26 mmol l-1; p < 0.001) and plasma sodium was significantly lower (138 ± 4 vs 140 ± 2 mmol l-1; p < 0.001). We conclude that plasma prorenin is not a useful early marker for diabetic microvascular disease. Despite apparently normal plasma renin concentrations, plasma aldosterone is suppressed in insulin-dependent diabetic patients.

  10. Does pulmonary rehabilitation work in clinical practice? A review on selection and dropout in randomized controlled trials on pulmonary rehabilitation

    Directory of Open Access Journals (Sweden)

    Bodil Bjoernshave

    2010-04-01

    Aim: To analyze randomized controlled trials (RCTs) on pulmonary rehabilitation (PR) to determine whether the patients who complete PR form a representative subset of the chronic obstructive pulmonary disease (COPD) target population, and to discuss what impact this may have on the generalizability and implementation of PR in practice. Material and methods: A review of 26 RCTs included in a Cochrane Review 2007. We analyzed the selection at three different levels: (1) sampling; (2) inclusion and exclusion; and (3) dropout. Results: Of the 26 studies, only 3 (12%) described the sampling as the number of patients contacted. In these studies 28% completed PR. In all, we found that 75% of the patients suitable for PR programs were omitted due to sampling, exclusion, and dropout. Most of the study populations are not representative of the target population. Conclusion: The RCTs selected for the Cochrane review gave sparse information about the sampling procedure. The demand for high internal validity in studies on PR reduced their external validity. The patients completing PR programs in RCTs were not drawn from a representative subset of the target population. The ability to draw conclusions relevant to clinical practice from the results of the RCTs on PR is impaired. Keywords: COPD, rehabilitation, selection, dropout, external validity

  11. A Permutation Importance-Based Feature Selection Method for Short-Term Electricity Load Forecasting Using Random Forest

    Directory of Open Access Journals (Sweden)

    Nantian Huang

    2016-09-01

    The prediction accuracy of short-term load forecasting (STLF) depends on the choice of prediction model and the result of feature selection. In this paper, a novel random forest (RF)-based feature selection method for STLF is proposed. First, 243 related features were extracted from historical load data and the time information of prediction points to form the original feature set. Subsequently, the original feature set was used to train an RF as the original model. After the training process, the prediction error of the original model on the test set was recorded and the permutation importance (PI) value of each feature was obtained. Then, an improved sequential backward search method was used to select the optimal forecasting feature subset based on the PI value of each feature. Finally, the optimal forecasting feature subset was used to train a new RF model as the final prediction model. Experiments showed that the prediction accuracy of the RF trained on the optimal forecasting feature subset was higher than that of the original model and of comparative models based on support vector regression and artificial neural networks.
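
    The pipeline above, train an RF, rank features by permutation importance, and run a backward search over subsets, can be sketched with scikit-learn. The synthetic regression data, the held-out-split permutation importance (standing in for an OOB-based PI), and the stopping rule are illustrative choices, not the paper's exact procedure.

    ```python
    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.inspection import permutation_importance
    from sklearn.model_selection import train_test_split

    X, y = make_regression(n_samples=800, n_features=40, n_informative=10,
                           noise=5.0, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    features = list(range(X.shape[1]))
    best_score, best_subset = -np.inf, features[:]

    # Sequential backward search: repeatedly drop the feature with the lowest
    # permutation importance and keep the subset with the best held-out score.
    while len(features) > 5:
        rf = RandomForestRegressor(n_estimators=100, random_state=0, n_jobs=-1)
        rf.fit(X_tr[:, features], y_tr)
        score = rf.score(X_te[:, features], y_te)
        if score > best_score:
            best_score, best_subset = score, features[:]
        pi = permutation_importance(rf, X_te[:, features], y_te,
                                    n_repeats=5, random_state=0)
        features.pop(int(np.argmin(pi.importances_mean)))

    print(f"best subset has {len(best_subset)} features, R^2 = {best_score:.3f}")
    ```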

  12. Pregnancy is not a risk factor for gallstone disease: Results of a randomly selected population sample

    Institute of Scientific and Technical Information of China (English)

    Thomas Walcher; Bernhard Otto Boehm; Wolfgang Kratzer; Mark Martin Haenle; Martina Kron; Birgit Hay; Richard Andrew Mason; Alexa Friederike Alice von Schmiesing; Armin Imhof; Wolfgang Koenig; Peter Kern

    2005-01-01

    AIM: To investigate the prevalence, risk factors, and selection of the study population for cholecystolithiasis in an urban population in Germany, in relation to our own findings and to the results in the international literature. METHODS: A total of 2,147 persons (1,111 females, age 42.8±12.7 years; 1,036 males, age 42.3±13.1 years) participating in an investigation on the prevalence of Echinococcus multilocularis were studied for risk factors and prevalence of gallbladder stone disease. Risk factors were assessed by means of a standardized interview and calculation of body mass index (BMI). A diagnostic ultrasound examination of the gallbladder was performed. Data were analyzed by multiple logistic regression, using the SAS statistical software package. RESULTS: Gallbladder stones were detected in 171 study participants (8.0%, n = 2,147). Risk factors for the development of gallbladder stone disease included age, sex, BMI, and positive family history. In a separate analysis of female study participants, pregnancy (yes/no) and number of pregnancies did not exert any influence. CONCLUSION: Findings of the present study confirm that age, female sex, BMI, and positive family history are risk factors for the development of gallbladder stone disease. Pregnancy and the number of pregnancies, however, could not be shown to be risk factors. There seem to be no differences in the respective prevalence of gallbladder stone disease in urban and rural populations.

  13. Random mutagenesis and selection of organic solvent-stable haloperoxidase from Streptomyces aureofaciens.

    Science.gov (United States)

    Yamada, Ryosuke; Higo, Tatsutoshi; Yoshikawa, Chisa; China, Hideyasu; Yasuda, Masahiro; Ogino, Hiroyasu

    2015-01-01

    Haloperoxidases are useful oxygenases involved in halogenation of a range of water-insoluble organic compounds and can be used without additional high-cost cofactors. In particular, organic solvent-stable haloperoxidases are desirable for enzymatic halogenations in the presence of organic solvents. In this study, we adopted a directed evolution approach by error-prone polymerase chain reaction to improve the organic solvent-stability of the homodimeric BPO-A1 haloperoxidase from Streptomyces aureofaciens. Among 1,000 mutant BPO-A1 haloperoxidases, an organic solvent-stable mutant OST48 with P123L and P241A mutations and a highly active mutant OST959 with H53Y and G162R mutations were selected. The residual activity of mutant OST48 after incubation in 40% (v/v) 1-propanol for 1 h was 1.8-fold higher than that of wild-type BPO-A1. In addition, the OST48 mutant showed higher stability in methanol, ethanol, dimethyl sulfoxide, and N,N-dimethylformamide than wild-type BPO-A1 haloperoxidase. Moreover, after incubation at 80°C for 1 h, the residual activity of mutant OST959 was 4.6-fold higher than that of wild-type BPO-A1. Based on the evaluation of single amino acid-substituted mutant models, stabilization of the hydrophobic core derived from the P123L mutation and an increased number of hydrogen bonds derived from the G162R mutation led to higher organic solvent-stability and thermostability, respectively.

  14. A novel peptide, selected from phage display library of random peptides, can efficiently target into human breast cancer cell

    Institute of Scientific and Technical Information of China (English)

    DONG Jian; LIU WeiQing; JIANG AiMei; ZHANG KeJian; CHEN MingQing

    2008-01-01

    To develop a targeting vector for breast cancer biotherapy, MDA-MB-231 cells, a human breast cancer cell line, were co-cultured with a pC89 (9 aa) phage display library of random peptides. In multiple independent peptide-presenting phage screening trials, subtilisin was used as a protease to inactivate extracellular phages. The internalized phages were collected by cell lysis and amplified in E. coli XL1-Blue. Through five rounds of selection, the peptide-presenting phages which could be internalized in MDA-MB-231 cells were isolated. A comparison was made between the internalization capacities of peptide-presenting phages isolated from MDA-MB-231 cells and an RGD-integrin binding phage by coculturing them with other human tumor cell lines and normal cells. The nucleotide sequences of the isolated peptide-presenting phages were then determined by DNA sequencing. To uncover whether the phage coat protein or the amino acid order was required for the targeting character of the peptide to MDA-MB-231 cells, three peptides were synthesized: CASPSGALRSC, ASPSGALRS and CGVIFDHSVPC (the shifted sequence of CASPSGALRSC). After coculturing them with different cell lines, their targeting capacities to MDA-MB-231 cells were detected. These data suggested that the internalization process was highly selective and capable of capturing a specific peptide from parent peptide variants. Moreover, the targeting internalization of the peptides occurred in an amino acid sequence-dependent manner. The results demonstrated the feasibility of using a phage display library of random peptides to develop a new targeting system for intracellular delivery of macromolecules, and the peptide we obtained might be modified as a targeting vector for breast cancer gene therapy.

  15. Twenty-first century vaccines

    Science.gov (United States)

    Rappuoli, Rino

    2011-01-01

    In the twentieth century, vaccination has been possibly the greatest revolution in health. Together with hygiene and antibiotics, vaccination led to the elimination of many childhood infectious diseases and contributed to the increase in disability-free life expectancy that in Western societies rose from 50 to 78–85 years (Crimmins, E. M. & Finch, C. E. 2006 Proc. Natl Acad. Sci. USA 103, 498–503; Kirkwood, T. B. 2008 Nat. Med 10, 1177–1185). In the twenty-first century, vaccination will be expected to eliminate the remaining childhood infectious diseases, such as meningococcal meningitis, respiratory syncytial virus, group A streptococcus, and will address the health challenges of this century such as those associated with ageing, antibiotic resistance, emerging infectious diseases and poverty. However, for this to happen, we need to increase the public trust in vaccination so that vaccines can be perceived as the best insurance against most diseases across all ages. PMID:21893537

  16. Polarimetric SAR decomposition parameter subset selection and their optimal dynamic range evaluation for urban area classification using Random Forest

    Science.gov (United States)

    Hariharan, Siddharth; Tirodkar, Siddhesh; Bhattacharya, Avik

    2016-02-01

    Urban area classification is important for monitoring the ever increasing urbanization and studying its environmental impact. Two NASA JPL's UAVSAR datasets of L-band (wavelength: 23 cm) were used in this study for urban area classification. The two datasets used in this study are different in terms of urban area structures, building patterns, their geometric shapes and sizes. In these datasets, some urban areas appear oriented about the radar line of sight (LOS) while some areas appear non-oriented. In this study, roll invariant polarimetric SAR decomposition parameters were used to classify these urban areas. Random Forest (RF), which is an ensemble decision tree learning technique, was used in this study. RF performs parameter subset selection as a part of its classification procedure. In this study, parameter subsets were obtained and analyzed to infer scattering mechanisms useful for urban area classification. The Cloude-Pottier α, the Touzi dominant scattering amplitude αs1 and the anisotropy A were among the top six important parameters selected for both the datasets. However, it was observed that these parameters were ranked differently for the two datasets. The urban area classification using RF was compared with the Support Vector Machine (SVM) and the Maximum Likelihood Classifier (MLC) for both the datasets. RF outperforms SVM by 4% and MLC by 12% in Dataset 1. It also outperforms SVM and MLC by 3.5% and 11% respectively in Dataset 2.

  17. Enhanced stabilization of a stable single domain antibody for SEB toxin by random mutagenesis and stringent selection.

    Science.gov (United States)

    Turner, Kendrick B; Zabetakis, Dan; Goldman, Ellen R; Anderson, George P

    2014-03-01

    Single domain antibodies, recombinant variable heavy domains derived from the unique heavy-chain only antibodies found in camelids and sharks, are exceptionally rugged due to their ability to refold following heat or chemical denaturation. In addition, a number of single domain antibodies have been found to possess high melting points which provide an even greater degree of stability; one of these, llama-derived A3, is a binder of Staphylococcal enterotoxin B and has a Tm of 83.5 °C. In this work, we utilized random mutagenesis and stringent selection in an effort to obtain variants of A3 with even higher melting points. This effort resulted in the selection of a double mutant, A3-T28I-S72I, which has a melting point of 90.0 °C and near wild-type affinity for the target antigen. We further characterized the mutations individually to determine that while both contributed to the thermal stabilization, the T28I mutation accounted for ∼ 4.1 °C of the 6.5 °C increase. This work demonstrates that by the addition of relatively subtle changes it is possible to further improve the melting temperature of single domain antibodies that are already remarkably stable.

  18. K-Ras(G12D)-selective inhibitory peptides generated by random peptide T7 phage display technology.

    Science.gov (United States)

    Sakamoto, Kotaro; Kamada, Yusuke; Sameshima, Tomoya; Yaguchi, Masahiro; Niida, Ayumu; Sasaki, Shigekazu; Miwa, Masanori; Ohkubo, Shoichi; Sakamoto, Jun-Ichi; Kamaura, Masahiro; Cho, Nobuo; Tani, Akiyoshi

    2017-03-11

    Amino-acid mutations of Gly(12) (e.g. G12D, G12V, G12C) of V-Ki-ras2 Kirsten rat sarcoma viral oncogene homolog (K-Ras), the most promising drug target in cancer therapy, are major growth drivers in various cancers. Although over 30 years have passed since the discovery of these mutations in most cancer patients, effective mutated K-Ras inhibitors have not been marketed. Here, we report novel and selective inhibitory peptides to K-Ras(G12D). We screened random peptide libraries displayed on T7 phage against purified recombinant K-Ras(G12D), with thorough subtraction of phages bound to wild-type K-Ras, and obtained KRpep-2 (Ac-RRCPLYISYDPVCRR-NH2) as a consensus sequence. KRpep-2 showed more than 10-fold binding- and inhibition-selectivity to K-Ras(G12D), both in SPR analysis and GDP/GTP exchange enzyme assay. KD and IC50 values were 51 and 8.9 nM, respectively. After subsequent sequence optimization, we successfully generated KRpep-2d (Ac-RRRRCPLYISYDPVCRRRR-NH2) that inhibited enzyme activity of K-Ras(G12D) with IC50 = 1.6 nM and significantly suppressed ERK-phosphorylation, downstream of K-Ras(G12D), along with A427 cancer cell proliferation at 30 μM peptide concentration. To our knowledge, this is the first report of a K-Ras(G12D)-selective inhibitor, contributing to the development and study of K-Ras(G12D)-targeting drugs. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Defining the sequence specificity of DNA-binding proteins by selecting binding sites from random-sequence oligonucleotides: analysis of yeast GCN4 protein.

    OpenAIRE

    Oliphant, A R; Brandl, C J; Struhl, K

    1989-01-01

    We describe a new method for accurately defining the sequence recognition properties of DNA-binding proteins by selecting high-affinity binding sites from random-sequence DNA. The yeast transcriptional activator protein GCN4 was coupled to a Sepharose column, and binding sites were isolated by passing short, random-sequence oligonucleotides over the column and eluting them with increasing salt concentrations. Of 43 specifically bound oligonucleotides, 40 contained the symmetric sequence TGA(C...

  20. Using remote, spatial techniques to select a random household sample in a dispersed, semi-nomadic pastoral community: utility for a longitudinal health and demographic surveillance system

    OpenAIRE

    Pearson, Amber L; Rzotkiewicz, Amanda; Zwickle, Adam

    2015-01-01

    Background Obtaining a random household sample can be expensive and challenging. In a dispersed community of semi-nomadic households in rural Tanzania, this study aimed to test an alternative method utilizing freely available aerial imagery. Methods We pinned every single-standing structure or boma (compound) in Naitolia, Tanzania using a ‘placemark’ in Google Earth Pro (version 7.1.2.2041). Next, a local expert assisted in removing misclassified placemarks. A random sample was then selected ...

  1. Implementation of client versus care-provider strategies to improve external cephalic version rates: a cluster randomized controlled trial

    NARCIS (Netherlands)

    Vlemmix, F.; Rosman, A.N.; Rijnders, M.E.; Beuckens, A.; Opmeer, B.C.; Mol, B.W.J.; Kok, M.; Fleuren, M.A.H.

    2015-01-01

    Objective: To determine the effectiveness of a client or care-provider strategy to improve the implementation of external cephalic version. Design: Cluster randomized controlled trial. Setting: Twenty-five clusters; hospitals and their referring midwifery practices randomly selected in the Netherlands.

  2. Quantitative structure-property relationships of retention indices of some sulfur organic compounds using random forest technique as a variable selection and modeling method.

    Science.gov (United States)

    Goudarzi, Nasser; Shahsavani, Davood; Emadi-Gandaghi, Fereshteh; Chamjangali, Mansour Arab

    2016-10-01

    In this work, a novel quantitative structure-property relationship technique is proposed on the basis of the random forest for prediction of the retention indices of some sulfur organic compounds. In order to calculate the retention indices of these compounds, theoretical descriptors produced from their molecular structures are employed. The influence of the significant parameters affecting the prediction power of the developed random forest, such as the number of randomly selected variables applied to split each node (m) and the number of trees (nt), is studied to obtain the best model. After optimizing the nt and m parameters, the random forest model built with m = 70 and nt = 460 was found to yield the best results. The artificial neural network and multiple linear regression modeling techniques are also used to predict the retention index values for these compounds for comparison with the results of the random forest model. The descriptors selected by stepwise regression and the random forest model are used to build the artificial neural network models. The results achieved showed the superiority of the random forest model over the other models for prediction of the retention indices of the studied compounds.
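    As a rough illustration of the tuning step described above (the paper reports m = 70 and nt = 460 as optimal), the sketch below grid-searches the two corresponding scikit-learn parameters, n_estimators (nt) and max_features (m), on synthetic placeholder descriptors; the descriptor matrix and the search ranges are assumptions, not the study's data.

    ```python
    # Grid search over the two RF hyperparameters discussed above:
    # nt -> n_estimators, m -> max_features. Data are synthetic placeholders.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import GridSearchCV

    rng = np.random.default_rng(0)
    X = rng.random((200, 120))                                   # stand-in descriptors
    y = X[:, :5].sum(axis=1) + 0.05 * rng.standard_normal(200)   # stand-in retention indices

    param_grid = {"n_estimators": [100, 250, 460],      # candidate nt values
                  "max_features": [20, 70, "sqrt"]}     # candidate m values
    search = GridSearchCV(RandomForestRegressor(random_state=0), param_grid,
                          scoring="neg_root_mean_squared_error", cv=5, n_jobs=-1)
    search.fit(X, y)
    print(search.best_params_, -search.best_score_)
    ```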

  3. Clinical validation of embryo culture and selection by morphokinetic analysis: a randomized, controlled trial of the EmbryoScope.

    Science.gov (United States)

    Rubio, Irene; Galán, Arancha; Larreategui, Zaloa; Ayerdi, Fernando; Bellver, Jose; Herrero, Javier; Meseguer, Marcos

    2014-11-01

    To determine whether incubation in the integrated EmbryoScope time-lapse monitoring system (TMS) and selection supported by the use of a multivariable morphokinetic model improve reproductive outcomes in comparison with incubation in a standard incubator (SI) embryo culture and selection based exclusively on morphology. Prospective, randomized, double-blinded, controlled study. University-affiliated private in vitro fertilization (IVF) clinic. Eight hundred forty-three infertile couples undergoing intracytoplasmic sperm injection (ICSI). No patient intervention; embryos cultured in SI with development evaluated only by morphology (control group) and embryos cultured in TMS with embryo selection based on a multivariable model (study group). Rates of embryo implantation, pregnancy, ongoing pregnancy (OPR), and early pregnancy loss. Analyzing per treated cycle, the ongoing pregnancy rate was statistically significantly increased at 51.4% (95% CI, 46.7-56.0) for the TMS group compared with 41.7% (95% CI, 36.9-46.5) for the SI group. For pregnancy rate, differences were not statistically significant at 61.6% (95% CI, 56.9-66.0) versus 56.3% (95% CI, 51.4-61.0). The results per transfer were similar: statistically significant differences in ongoing pregnancy rate of 54.5% (95% CI, 49.6-59.2) versus 45.3% (95% CI, 40.3-50.4) and not statistically significant for pregnancy rate at 65.2% (95% CI, 60.6-69.8) versus 61.1% (95% CI, 56.2-66.1). Early pregnancy loss was statistically significantly decreased for the TMS group with 16.6% (95% CI, 12.6-21.4) versus 25.8% (95% CI, 20.6-31.9). The implantation rate was statistically significantly increased at 44.9% (95% CI, 41.4-48.4) versus 37.1% (95% CI, 33.6-40.7). The strategy of culturing and selecting embryos in the integrated EmbryoScope time-lapse monitoring system improves reproductive outcomes. NCT01549262. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  4. Selectivity of Chemoresistive Sensors Made of Chemically Functionalized Carbon Nanotube Random Networks for Volatile Organic Compounds (VOC)

    Directory of Open Access Journals (Sweden)

    Jean-François Feller

    2014-01-01

    Full Text Available Different grades of chemically functionalized carbon nanotubes (CNT) have been processed by spraying layer-by-layer (sLbL) to obtain an array of chemoresistive transducers for volatile organic compound (VOC) detection. The sLbL process led to random networks of CNT less conductive, but more sensitive to vapors than filtration under vacuum (bucky papers). Shorter CNT were also found to be more sensitive due to the less entangled and more easily disconnectable conducting networks they are making. Chemical functionalization of the CNT surface is changing their selectivity towards VOC, which makes it possible to easily discriminate methanol, chloroform and tetrahydrofuran (THF) from toluene vapors after the assembly of CNT transducers into an array to make an e-nose. Interestingly, the amplitude of the CNT transducers' responses can be enhanced by a factor of five (methanol) to 100 (chloroform) by dispersing them into a polymer matrix, such as poly(styrene) (PS), poly(carbonate) (PC) or poly(methyl methacrylate) (PMMA). COOH functionalization of CNT was found to penalize their dispersion in polymers and to decrease the sensors' sensitivity. The resulting conductive polymer nanocomposites (CPCs) not only allow for easier tuning of the sensors' selectivity by changing the chemical nature of the matrix, but they also allow them to adjust their sensitivity by changing the average gap between CNT (acting on quantum tunneling in the CNT network). Quantum resistive sensors (QRSs) appear promising for environmental monitoring and anticipated disease diagnostics that are both based on VOC analysis.

  5. Optimal Subset Selection of Time-Series MODIS Images and Sample Data Transfer with Random Forests for Supervised Classification Modelling.

    Science.gov (United States)

    Zhou, Fuqun; Zhang, Aining

    2016-10-25

    Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS) datasets including 8-day composites from NASA, and 10-day composites from the Canada Centre for Remote Sensing (CCRS). It is challenging to efficiently use these time-series MODIS datasets for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will be greater when Sentinel 2-3 data become available. Another challenge that researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle the two important issues with a case study of land cover mapping using CCRS 10-day MODIS composites with the help of two Random Forests features: variable importance and outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is utilized for transferring sample data available from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that using only about half of the variables we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transferring could make supervised modelling possible for applications lacking sample data.

  6. Biased random key genetic algorithm with insertion and gender selection for capacitated vehicle routing problem with time windows

    Science.gov (United States)

    Rochman, Auliya Noor; Prasetyo, Hari; Nugroho, Munajat Tri

    2017-06-01

    The Vehicle Routing Problem (VRP) often occurs when manufacturers need to distribute their product to customers/outlets. The distribution process is typically restricted by the capacity of the vehicle and the working hours at the distributor. This type of VRP is also known as the Capacitated Vehicle Routing Problem with Time Windows (CVRPTW). A Biased Random Key Genetic Algorithm (BRKGA) was designed and coded in MATLAB to solve a CVRPTW case of soft drink distribution. The standard BRKGA was then modified by applying chromosome insertion into the initial population and defining chromosome gender for parents undergoing the crossover operation. The performance of the established algorithms was then compared to a heuristic procedure for solving soft drink distribution. Some findings are revealed: (1) the total distribution cost of BRKGA with insertion (BRKGA-I) results in a cost saving of 39% compared to the total cost of the heuristic method; (2) BRKGA with gender selection (BRKGA-GS) could further improve the performance of the heuristic method. However, BRKGA-GS tends to yield worse results compared to those obtained from the standard BRKGA.
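    A minimal sketch of two BRKGA ingredients mentioned above, using only the Python standard library: a random-key decoder that sorts customers by key and splits the resulting giant tour into capacity-feasible routes, and the biased crossover that copies each gene from the elite parent with probability rho. Time windows, the insertion seeding and the gender rule are omitted, and all data are invented placeholders.

    ```python
    # Random-key decoder and biased crossover, the two core BRKGA operators.
    import random

    def decode(keys, demands, capacity):
        """Sort customers by their random key, then split the tour by vehicle capacity."""
        order = sorted(range(len(keys)), key=lambda i: keys[i])
        routes, current, load = [], [], 0.0
        for cust in order:
            if current and load + demands[cust] > capacity:
                routes.append(current)          # close the current route
                current, load = [], 0.0
            current.append(cust)
            load += demands[cust]
        if current:
            routes.append(current)
        return routes

    def biased_crossover(elite, non_elite, rho=0.7):
        """Each gene is inherited from the elite parent with probability rho."""
        return [e if random.random() < rho else n for e, n in zip(elite, non_elite)]

    if __name__ == "__main__":
        random.seed(1)
        demands = [4, 8, 3, 6, 5, 7]                    # toy customer demands
        elite = [random.random() for _ in demands]      # elite chromosome (random keys)
        other = [random.random() for _ in demands]      # non-elite chromosome
        child = biased_crossover(elite, other)
        print(decode(child, demands, capacity=15))
    ```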

  7. Sequence-Based Prediction of RNA-Binding Proteins Using Random Forest with Minimum Redundancy Maximum Relevance Feature Selection

    Directory of Open Access Journals (Sweden)

    Xin Ma

    2015-01-01

    Full Text Available The prediction of RNA-binding proteins is one of the most challenging problems in computational biology. Although some studies have investigated this problem, the accuracy of prediction is still not sufficient. In this study, a highly accurate method was developed to predict RNA-binding proteins from amino acid sequences using random forests with the minimum redundancy maximum relevance (mRMR) method, followed by incremental feature selection (IFS). We incorporated conjoint triad features and three novel features: binding propensity (BP), nonbinding propensity (NBP), and evolutionary information combined with physicochemical properties (EIPP). The results showed that these novel features play important roles in improving the performance of the predictor. Using the mRMR-IFS method, our predictor achieved the best performance (86.62% accuracy and a 0.737 Matthews correlation coefficient). The high prediction accuracy and successful prediction performance suggest that our method can be a useful approach to identify RNA-binding proteins from sequence information.
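    The ranking-plus-incremental-feature-selection workflow can be sketched as follows, assuming scikit-learn. For brevity the ranking uses mutual information only, a simplified stand-in for mRMR (which additionally penalizes redundancy among selected features), and the data are synthetic placeholders rather than the protein features of the study.

    ```python
    # Rank features once, then score nested subsets [f1], [f1,f2], ... with an RF (IFS).
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.feature_selection import mutual_info_classif
    from sklearn.metrics import make_scorer, matthews_corrcoef
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=300, n_features=40, n_informative=8,
                               random_state=0)              # placeholder data
    ranking = np.argsort(mutual_info_classif(X, y, random_state=0))[::-1]
    mcc = make_scorer(matthews_corrcoef)

    best_k, best_score = 1, -1.0
    for k in range(1, len(ranking) + 1):                    # incremental feature selection
        cols = ranking[:k]
        score = cross_val_score(RandomForestClassifier(n_estimators=100, random_state=0),
                                X[:, cols], y, cv=5, scoring=mcc).mean()
        if score > best_score:
            best_k, best_score = k, score
    print(f"best subset size: {best_k}, cross-validated MCC: {best_score:.3f}")
    ```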

  8. Estimating the efficacy of Alcoholics Anonymous without self-selection bias: an instrumental variables re-analysis of randomized clinical trials.

    Science.gov (United States)

    Humphreys, Keith; Blodgett, Janet C; Wagner, Todd H

    2014-11-01

    Observational studies of Alcoholics Anonymous' (AA) effectiveness are vulnerable to self-selection bias because individuals choose whether or not to attend AA. The present study, therefore, employed an innovative statistical technique to derive a selection bias-free estimate of AA's impact. Six data sets from 5 National Institutes of Health-funded randomized trials (1 with 2 independent parallel arms) of AA facilitation interventions were analyzed using instrumental variables models. Alcohol-dependent individuals in one of the data sets (n = 774) were analyzed separately from the rest of sample (n = 1,582 individuals pooled from 5 data sets) because of heterogeneity in sample parameters. Randomization itself was used as the instrumental variable. Randomization was a good instrument in both samples, effectively predicting increased AA attendance that could not be attributed to self-selection. In 5 of the 6 data sets, which were pooled for analysis, increased AA attendance that was attributable to randomization (i.e., free of self-selection bias) was effective at increasing days of abstinence at 3-month (B = 0.38, p = 0.001) and 15-month (B = 0.42, p = 0.04) follow-up. However, in the remaining data set, in which preexisting AA attendance was much higher, further increases in AA involvement caused by the randomly assigned facilitation intervention did not affect drinking outcome. For most individuals seeking help for alcohol problems, increasing AA attendance leads to short- and long-term decreases in alcohol consumption that cannot be attributed to self-selection. However, for populations with high preexisting AA involvement, further increases in AA attendance may have little impact. Copyright © 2014 by the Research Society on Alcoholism.
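    The instrumental-variables logic can be illustrated with a small two-stage least squares sketch on simulated data: the randomized arm Z shifts AA attendance D but is independent of the unobserved confounder, so the second-stage coefficient recovers the effect of D on the outcome Y where a naive regression does not. Everything below is a placeholder simulation, not the trial data.

    ```python
    # Two-stage least squares with randomization as the instrument, done with numpy.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 2000
    z = rng.integers(0, 2, n)                        # randomized facilitation arm
    confound = rng.standard_normal(n)                # unobserved motivation
    d = 5 + 3 * z + 2 * confound + rng.standard_normal(n)      # AA attendance
    y = 10 + 0.8 * d + 4 * confound + rng.standard_normal(n)   # days abstinent

    def ols(X, y):
        return np.linalg.lstsq(X, y, rcond=None)[0]

    X1 = np.column_stack([np.ones(n), z])
    d_hat = X1 @ ols(X1, d)                          # stage 1: predict D from Z only
    beta_iv = ols(np.column_stack([np.ones(n), d_hat]), y)[1]   # stage 2: Y on predicted D
    beta_ols = ols(np.column_stack([np.ones(n), d]), y)[1]      # naive, self-selected estimate
    print(f"naive OLS: {beta_ols:.2f}   IV (2SLS): {beta_iv:.2f}   true effect: 0.80")
    ```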

  9. Robust prediction of B-factor profile from sequence using two-stage SVR based on random forest feature selection.

    Science.gov (United States)

    Pan, Xiao-Yong; Shen, Hong-Bin

    2009-01-01

    B-factor is highly correlated with protein internal motion and is used to measure the uncertainty in the position of an atom within a crystal structure. Although the rapid progress of structural biology in recent years makes more accurate protein structures available than ever, with the avalanche of new protein sequences emerging in the post-genomic era, the gap between the known protein sequences and the known protein structures becomes wider and wider. It is urgent to develop automated methods to predict the B-factor profile from amino acid sequences directly, so as to be able to utilize them in a timely manner for basic research. In this article, we propose a novel approach, called PredBF, to predict the real value of the B-factor. We first extract both global and local features from the protein sequences as well as their evolution information; then random forests feature selection is applied to rank their importance, and the most important features are input to a two-stage support vector regression (SVR) for prediction, where the initial predicted outputs from the first-stage SVR are further input to the second-stage SVR for final refinement. Our results reveal that a systematic analysis of the importance of different features gives deep insight into the different contributions of the features and is very necessary for developing effective B-factor prediction tools. The two-layer SVR prediction model designed in this study further enhances the robustness of predicting the B-factor profile. As a web server, PredBF is freely available at: http://www.csbio.sjtu.edu.cn/bioinf/PredBF for academic use.
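    A compressed sketch of the two-stage design described above, assuming scikit-learn: random-forest importances rank the features, a first SVR predicts the target from the top-ranked features, and a second SVR refines the first-stage output. The synthetic regression data and all hyperparameters are placeholders, not the paper's sequence-derived features.

    ```python
    # RF-based feature ranking followed by a two-stage SVR (stage 2 refines stage 1).
    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_predict, train_test_split
    from sklearn.svm import SVR

    X, y = make_regression(n_samples=400, n_features=60, n_informative=10,
                           noise=5.0, random_state=0)        # placeholder data
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    top = np.argsort(rf.feature_importances_)[::-1][:20]     # keep the 20 best features

    svr1, svr2 = SVR(C=10.0), SVR(C=10.0)
    stage1_tr = cross_val_predict(svr1, X_tr[:, top], y_tr, cv=5)  # out-of-fold stage-1 preds
    svr1.fit(X_tr[:, top], y_tr)
    svr2.fit(np.column_stack([X_tr[:, top], stage1_tr]), y_tr)     # stage 2 sees stage-1 output

    stage1_te = svr1.predict(X_te[:, top])
    pred = svr2.predict(np.column_stack([X_te[:, top], stage1_te]))
    print("correlation with true values:", round(np.corrcoef(pred, y_te)[0, 1], 3))
    ```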

  10. Quality of life in female myocardial infarction survivors: a comparative study with a randomly selected general female population cohort

    Directory of Open Access Journals (Sweden)

    Fridlund Bengt

    2007-10-01

    Full Text Available Abstract. Background: A substantial burden associated with MI has been reported. Thus, how survivors experience their quality of life (QOL) is now being given increasing attention. However, few studies have involved women and a comparison with the general population. The aims of this study were to determine the QOL of female MI survivors, to investigate whether their QOL differed from that of the general population, and to evaluate the clinical significance of the findings. Methods: Two cross-sectional surveys were performed: one on female MI survivors and one on the general Norwegian population. The MI survey included women aged 62–80 years, three months to five years after their MI. One hundred and forty-five women responded, yielding a response rate of 60%. A subset of women in the same age range (n = 156) was drawn from a study of 1893 randomly selected Norwegian citizens. QOL was measured in both groups with the World Health Organization Quality of Life Instrument Abbreviated (WHOQOL-BREF). Results: The majority (54%) of the female MI survivors presented with ST-elevation in their ECG, 31% received thrombolysis, and 38% had reduced left ventricular ejection fraction. Female MI survivors reported significantly lower satisfaction with general health (p = 0.020) and overall QOL (p = 0.017) than women from the general population. This was also the case for the physical and environmental QOL domains. Conclusion: The burden of MI significantly affects the physical health of elderly women. Still, female MI survivors fare as well as the general female population on psychosocial QOL domains. Action should be taken not only to support women's physical needs but also to reinforce their strengths in order to maintain optimal QOL.

  11. Water chemistry in 179 randomly selected Swedish headwater streams related to forest production, clear-felling and climate.

    Science.gov (United States)

    Löfgren, Stefan; Fröberg, Mats; Yu, Jun; Nisell, Jakob; Ranneby, Bo

    2014-12-01

    From a policy perspective, it is important to understand forestry effects on surface waters from a landscape perspective. The EU Water Framework Directive demands remedial actions if good ecological status is not achieved. In Sweden, 44 % of the surface water bodies have moderate ecological status or worse. Many of these drain catchments with a mosaic of managed forests. It is important for the forestry sector and water authorities to be able to identify where, in the forested landscape, special precautions are necessary. The aim of this study was to quantify the relations between forestry parameters and headwater stream concentrations of nutrients, organic matter and acid-base chemistry. The results are put into the context of regional climate, sulphur and nitrogen deposition, as well as marine influences. Water chemistry was measured in 179 randomly selected headwater streams from two regions in southwest and central Sweden, corresponding to 10 % of the Swedish land area. Forest status was determined from satellite images and Swedish National Forest Inventory data using the probabilistic classifier method, which was used to model stream water chemistry with Bayesian model averaging. The results indicate that concentrations of e.g. nitrogen, phosphorus and organic matter are related to factors associated with forest production, but that it is not forestry per se that causes the excess losses. Instead, factors simultaneously affecting forest production and stream water chemistry, such as climate, extensive soil pools and nitrogen deposition, are the most likely candidates. The relationships with clear-felled and wetland areas are likely to be direct effects.

  12. Evaluation of Randomly Selected Completed Medical Records Sheets in Teaching Hospitals of Jahrom University of Medical Sciences, 2009

    Directory of Open Access Journals (Sweden)

    Mohammad Parsa Mahjob

    2011-06-01

    Full Text Available Background and objective: Medical record documentation is often used to protect patients' legal rights; it also provides information for medical researchers, general studies, education of health care staff, and quality surveys. There is a need to control the amount of data entered in patients' medical record sheets, considering that these sheets are often completed after service delivery to the patient has finished. Therefore, in this study the completeness of medical history, operation report, and physician order sheets produced by different documenters in Jahrom teaching hospitals during 2009 was analyzed. Methods and Materials: In this descriptive, retrospective study, 400 medical record sheets of patients from two teaching hospitals affiliated with Jahrom University of Medical Sciences were randomly selected. The data collection tool was a checklist based on the content of the medical history sheet, operation report, and physician order sheets. The data were analyzed with SPSS (version 10) software and Microsoft Office Excel 2003. Results: The average of personal (demographic) data entered in the medical history, physician order, and operation report sheets by department secretaries was 32.9, 35.8, and 40.18 percent, respectively. The average of clinical data entered by physicians in the medical history sheet was 38 percent. Surgical data entered by the surgeon in the operation report sheet was 94.77 percent. The average of data entered by operating room nurses in the operation report sheet was 36.78 percent, and the average of physician order data entered by physicians in the physician order sheet was 99.3 percent. Conclusion: According to this study, the completeness of the record sheets reviewed in Jahrom teaching hospitals was not desirable and in some cases was very weak and incomplete. This deficiency was due to different reasons, such as documenters' negligence, lack of adequate education for documenters, and high workload

  13. Twenty Questions Games Always End With Yes

    CERN Document Server

    Gill, John T

    2010-01-01

    Huffman coding is often presented as the optimal solution to Twenty Questions. However, a caveat is that Twenty Questions games always end with a reply of "Yes," whereas Huffman codewords need not obey this constraint. We bring resolution to this issue, and prove that the average number of questions still lies between H(X) and H(X)+1.
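    The bound quoted above can be checked numerically: the expected number of questions of a binary Huffman code always lies in [H(X), H(X)+1). The short sketch below, plain Python with illustrative probabilities, computes Huffman codeword lengths by repeatedly merging the two least likely groups of symbols.

    ```python
    # Numerical check that the Huffman expected length L satisfies H(X) <= L < H(X) + 1.
    import heapq
    import math

    def huffman_lengths(probs):
        """Codeword lengths of a binary Huffman code for the given probabilities."""
        heap = [(p, [i]) for i, p in enumerate(probs)]
        lengths = [0] * len(probs)
        heapq.heapify(heap)
        while len(heap) > 1:
            p1, group1 = heapq.heappop(heap)
            p2, group2 = heapq.heappop(heap)
            for i in group1 + group2:
                lengths[i] += 1          # each merge adds one bit to every merged symbol
            heapq.heappush(heap, (p1 + p2, group1 + group2))
        return lengths

    probs = [0.4, 0.25, 0.15, 0.1, 0.06, 0.04]          # illustrative distribution
    L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
    H = -sum(p * math.log2(p) for p in probs)
    print(f"H(X) = {H:.3f}, expected questions = {L:.3f}")   # H <= L < H + 1
    ```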

  14. Capital in the Twenty-First Century

    DEFF Research Database (Denmark)

    Hansen, Per H.

    2014-01-01

    Review essay on: Capital in the Twenty-First Century. By Thomas Piketty. Translated by Arthur Goldhammer. Cambridge, Mass.: The Belknap Press of Harvard University Press, 2014. viii + 685 pp.

  15. A comparison of random forest and its Gini importance with standard chemometric methods for the feature selection and classification of spectral data

    Directory of Open Access Journals (Sweden)

    Himmelreich Uwe

    2009-07-01

    Full Text Available Abstract. Background: Regularized regression methods such as principal component or partial least squares regression perform well in learning tasks on high dimensional spectral data, but cannot explicitly eliminate irrelevant features. The random forest classifier with its associated Gini feature importance, on the other hand, allows for an explicit feature elimination, but may not be optimally adapted to spectral data due to the topology of its constituent classification trees which are based on orthogonal splits in feature space. Results: We propose to combine the best of both approaches, and evaluated the joint use of a feature selection based on a recursive feature elimination using the Gini importance of random forests together with regularized classification methods on spectral data sets from medical diagnostics, chemotaxonomy, biomedical analytics, food science, and synthetically modified spectral data. Here, a feature selection using the Gini feature importance with a regularized classification by discriminant partial least squares regression performed as well as or better than a filtering according to different univariate statistical tests, or using regression coefficients in a backward feature elimination. It outperformed the direct application of the random forest classifier, or the direct application of the regularized classifiers on the full set of features. Conclusion: The Gini importance of the random forest provided superior means for measuring feature relevance on spectral data, but – on an optimal subset of features – the regularized classifiers might be preferable over the random forest classifier, in spite of their limitation to model linear dependencies only. A feature selection based on Gini importance, however, may precede a regularized linear classification to identify this optimal subset of features, and to earn a double benefit of both dimensionality reduction and the elimination of noise from the classification task.
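    A minimal sketch of the combination evaluated above, assuming scikit-learn: recursive feature elimination driven by random-forest (Gini) importance, followed by partial least squares regression used as a discriminant classifier (PLS-DA) on the retained features. The simulated binary-class data stand in for spectra, and the component count and subset size are arbitrary choices, not those of the study.

    ```python
    # Gini-importance-driven RFE feeding a PLS-DA classifier on the selected features.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.feature_selection import RFE
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=200, n_features=300, n_informative=15,
                               random_state=0)        # stand-in for spectral data
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # RFE repeatedly drops the features with the lowest RF (Gini) importance.
    rfe = RFE(RandomForestClassifier(n_estimators=200, random_state=0),
              n_features_to_select=30, step=0.1).fit(X_tr, y_tr)

    # PLS-DA: regress the 0/1 class label on the retained features, threshold at 0.5.
    pls = PLSRegression(n_components=2).fit(X_tr[:, rfe.support_], y_tr)
    pred = (pls.predict(X_te[:, rfe.support_]).ravel() > 0.5).astype(int)
    print("test accuracy:", (pred == y_te).mean())
    ```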

  16. Varying levels of difficulty index of skills-test items randomly selected by examinees on the Korean emergency medical technician licensing examination.

    Science.gov (United States)

    Koh, Bongyeun; Hong, Sunggi; Kim, Soon-Sim; Hyun, Jin-Sook; Baek, Milye; Moon, Jundong; Kwon, Hayran; Kim, Gyoungyong; Min, Seonggi; Kang, Gu-Hyun

    2016-01-01

    The goal of this study was to characterize the difficulty index of the items in the skills test components of the class I and II Korean emergency medical technician licensing examination (KEMTLE), which requires examinees to select items randomly. The results of 1,309 class I KEMTLE examinations and 1,801 class II KEMTLE examinations in 2013 were subjected to analysis. Items from the basic and advanced skills test sections of the KEMTLE were compared to determine whether some were significantly more difficult than others. In the class I KEMTLE, all 4 of the items on the basic skills test showed significant variation in difficulty index (P<0.01), as well as 4 of the 5 items on the advanced skills test (P<0.05). In the class II KEMTLE, 4 of the 5 items on the basic skills test showed significantly different difficulty index (P<0.01), as well as all 3 of the advanced skills test items (P<0.01). In the skills test components of the class I and II KEMTLE, the procedure in which examinees randomly select questions should be revised to require examinees to respond to a set of fixed items in order to improve the reliability of the national licensing examination.

  17. Affinity selection of Nipah and Hendra virus-related vaccine candidates from a complex random peptide library displayed on bacteriophage virus-like particles

    Energy Technology Data Exchange (ETDEWEB)

    Peabody, David S.; Chackerian, Bryce; Ashley, Carlee; Carnes, Eric; Negrete, Oscar

    2017-01-24

    The invention relates to virus-like particles of bacteriophage MS2 (MS2 VLPs) displaying peptide epitopes or peptide mimics of epitopes of Nipah Virus envelope glycoprotein that elicit an immune response against Nipah Virus upon vaccination of humans or animals. Affinity selection on Nipah Virus-neutralizing monoclonal antibodies using random sequence peptide libraries on MS2 VLPs selected peptides with sequence similarity to peptide sequences found within the envelope glycoprotein of Nipah itself, thus identifying the epitopes the antibodies recognize. The selected peptide sequences themselves are not necessarily identical in all respects to a sequence within the Nipah Virus glycoprotein, and therefore may be referred to as epitope mimics. VLPs displaying these epitope mimics can serve as a vaccine. On the other hand, display of the corresponding wild-type sequence, derived from Nipah Virus and corresponding to the epitope mapped by affinity selection, may also be used as a vaccine.

  18. Proceedings of the Twenty-Third Annual Software Engineering Workshop

    Science.gov (United States)

    1999-01-01

    The Twenty-third Annual Software Engineering Workshop (SEW) provided 20 presentations designed to further the goals of the Software Engineering Laboratory (SEL) of the NASA-GSFC. The presentations were selected for their creativity. The sessions, which were held on 2-3 December 1998, centered on the SEL, Experimentation, Inspections, Fault Prediction, Verification and Validation, and Embedded Systems and Safety-Critical Systems.

  19. Selection of random RNA fragments as method for searching for a site of regulation of translation of E. coli streptomycin mRNA by ribosomal protein S7.

    Science.gov (United States)

    Surdina, A V; Rassokhin, T I; Golovin, A V; Spiridonova, V A; Kraal, B; Kopylov, A M

    2008-06-01

    In E. coli cells, biogenesis of the small ribosomal subunit is regulated by RNA-protein interactions involving protein S7. S7 initiates subunit assembly by interacting with 16S rRNA. During shift-down of the rRNA synthesis level, free S7 inhibits its own translation by interacting with a specific 96-nucleotide region of streptomycin (str) mRNA between the S12 and S7 cistrons (the intercistron). Many bacteria do not have the extended intercistron, which challenges the development of specific approaches for searching for putative mRNA regulatory regions that are able to interact with proteins. The paper describes application of the SERF approach (Selection of Random RNA Fragments) to reveal regulatory regions of str mRNA. A set of random DNA fragments was generated from the str operon by random hydrolysis and then transcribed into RNA; the fragments able to bind protein S7 (serfamers) were selected by iterative rounds. S7 binds to a single 109-nucleotide serfamer (RNA109) derived from the intercistron. After multiple rounds of copying and selection, an intercistronic mutant (RNA109) was isolated; it has enhanced affinity for S7. RNA109 binds to the protein better than the authentic intercistronic str mRNA; apparent dissociation constants are 26 +/- 5 and 60 +/- 8 nM, respectively. The location of the S7 binding site on the mRNA, as well as a putative mode of regulation of the coupled translation of the S12 and S7 cistrons, are hypothesized.

  20. Defining the sequence specificity of DNA-binding proteins by selecting binding sites from random-sequence oligonucleotides: analysis of yeast GCN4 protein.

    Science.gov (United States)

    Oliphant, A R; Brandl, C J; Struhl, K

    1989-07-01

    We describe a new method for accurately defining the sequence recognition properties of DNA-binding proteins by selecting high-affinity binding sites from random-sequence DNA. The yeast transcriptional activator protein GCN4 was coupled to a Sepharose column, and binding sites were isolated by passing short, random-sequence oligonucleotides over the column and eluting them with increasing salt concentrations. Of 43 specifically bound oligonucleotides, 40 contained the symmetric sequence TGA(C/G)TCA, whereas the other 3 contained sequences matching six of these seven bases. The extreme preference for this 7-base-pair sequence suggests that each position directly contacts GCN4. The three nucleotide positions on each side of this core heptanucleotide also showed sequence preferences, indicating their effect on GCN4 binding. Interestingly, deviations in the core and a stronger sequence preference in the flanking region were found on one side of the central C . G base pair. Although GCN4 binds as a dimer, this asymmetry supports a model in which interactions on each side of the binding site are not equivalent. The random selection method should prove generally useful for defining the specificities of other DNA-binding proteins and for identifying putative target sequences from genomic DNA.

  1. Factors that influence the selection of sterile glove brand: a randomized controlled trial evaluating the performance and cost of gloves

    National Research Council Canada - National Science Library

    Johnson, Rebecca L; Smith, Hugh M; Duncan, Christopher M; Torsher, Laurence C; Schroeder, Darrell R; Hebl, James R

    2013-01-01

    To determine whether glove use modifies tactile and psychomotor performance of health care providers when compared with no glove use and to evaluate factors that influence the selection of sterile glove...

  2. Free variable selection QSPR study to predict (19)F chemical shifts of some fluorinated organic compounds using Random Forest and RBF-PLS methods.

    Science.gov (United States)

    Goudarzi, Nasser

    2016-04-05

    In this work, two new and powerful chemometrics methods are applied for the modeling and prediction of the (19)F chemical shift values of some fluorinated organic compounds. The radial basis function-partial least squares (RBF-PLS) method and random forest (RF) are employed to construct models to predict the (19)F chemical shifts. In this study we did not use any separate variable selection method, because the RF method can serve as both a variable selection and a modeling technique. The effects of the important parameters affecting the prediction power of the RF model, such as the number of trees (nt) and the number of randomly selected variables used to split each node (m), were investigated. The root-mean-square errors of prediction (RMSEP) for the training set and the prediction set for the RBF-PLS and RF models were 44.70, 23.86, 29.77, and 23.69, respectively. Also, the correlation coefficients of the prediction set for the RBF-PLS and RF models were 0.8684 and 0.9313, respectively. The results obtained reveal that the RF model can be used as a powerful chemometrics tool for quantitative structure-property relationship (QSPR) studies.

  3. The prevalence and classification of chronic kidney disease in cats randomly selected within four age groups and in cats recruited for degenerative joint disease studies

    Science.gov (United States)

    Marino, Christina L; Lascelles, B Duncan X; Vaden, Shelly L; Gruen, Margaret E; Marks, Steven L

    2015-01-01

    Chronic kidney disease (CKD) and degenerative joint disease are both considered common in older cats. Information on the co-prevalence of these two diseases is lacking. This retrospective study was designed to determine the prevalence of CKD in two cohorts of cats: cats randomly selected from four evenly distributed age groups (RS group) and cats recruited for degenerative joint disease studies (DJD group), and to evaluate the concurrence of CKD and DJD in these cohorts. The RS group was randomly selected from four age groups spanning 6 months to 20 years, and the DJD group comprised cats recruited to four previous DJD studies, with the DJD group excluding cats with a blood urea nitrogen and/or serum creatinine concentration >20% above the upper end of normal for two studies and cats with CKD stages 3 and 4 for the other two studies. The prevalence of CKD in the RS and DJD groups was higher than expected at 50% and 68.8%, respectively. CKD was common in cats between 1 and 15 years of age, with a similar prevalence of CKD stages 1 and 2 across age groups in both the RS and DJD cats. We found significant concurrence between CKD and DJD in cats of all ages, indicating the need for increased screening for CKD when selecting DJD treatments. Additionally, this study raises the possibility of a relationship and causal commonality between CKD and DJD owing to the striking concurrence across age groups and life stages. PMID:24217707

  4. Toward a Code for the Interactions of Zinc Fingers with DNA: Selection of Randomized Fingers Displayed on Phage

    Science.gov (United States)

    Choo, Yen; Klug, Aaron

    1994-11-01

    We have used two selection techniques to study sequence-specific DNA recognition by the zinc finger, a small, modular DNA-binding minidomain. We have chosen zinc fingers because they bind as independent modules and so can be linked together in a peptide designed to bind a predetermined DNA site. In this paper, we describe how a library of zinc fingers displayed on the surface of bacteriophage enables selection of fingers capable of binding to given DNA triplets. The amino acid sequences of selected fingers which bind the same triplet are compared to examine how sequence-specific DNA recognition occurs. Our results can be rationalized in terms of coded interactions between zinc fingers and DNA, involving base contacts from a few α-helical positions. In the paper following this one, we describe a complementary technique which confirms the identity of amino acids capable of DNA sequence discrimination from these positions.

  5. Randomized trial of switching from prescribed non-selective non-steroidal anti-inflammatory drugs to prescribed celecoxib

    DEFF Research Database (Denmark)

    Macdonald, Thomas M; Hawkey, Chris J; Ford, Ian;

    2016-01-01

    BACKGROUND: Selective cyclooxygenase-2 inhibitors and conventional non-selective non-steroidal anti-inflammatory drugs (nsNSAIDs) have been associated with adverse cardiovascular (CV) effects. We compared the CV safety of switching to celecoxib vs. continuing nsNSAID therapy in a European setting...... primary events per 1000 patient-years exposure. There were only 15 adjudicated secondary upper gastrointestinal complication endpoints (0.078/100 patient-years on celecoxib vs. 0.053 on nsNSAIDs OT, 0.078 vs. 0.053 ITT). More gastrointestinal serious adverse reactions and haematological adverse reactions...

  6. SNPs selected by information content outperform randomly selected microsatellite loci for delineating genetic identification and introgression in the endangered dark European honeybee (Apis mellifera mellifera).

    Science.gov (United States)

    Muñoz, Irene; Henriques, Dora; Jara, Laura; Johnston, J Spencer; Chávez-Galarza, Julio; De La Rúa, Pilar; Pinto, M Alice

    2016-11-14

    The honeybee (Apis mellifera) has been threatened by multiple factors including pests and pathogens, pesticides and loss of locally adapted gene complexes due to replacement and introgression. In western Europe, the genetic integrity of the native A. m. mellifera (M-lineage) is endangered due to trading and intensive queen breeding with commercial subspecies of eastern European ancestry (C-lineage). Effective conservation actions require reliable molecular tools to identify pure-bred A. m. mellifera colonies. Microsatellites have been preferred for identification of A. m. mellifera stocks across conservation centres. However, owing to high throughput, easy transferability between laboratories and low genotyping error, SNPs promise to become popular. Here, we compared the resolving power of a widely utilized microsatellite set to detect structure and introgression with that of different sets that combine a variable number of SNPs selected for their information content and genomic proximity to the microsatellite loci. Contrary to every SNP data set, microsatellites did not discriminate between the two lineages in the PCA space. Mean introgression proportions were identical across the two marker types, although at the individual level, microsatellites' performance was relatively poor at the upper range of Q-values, a result reflected by their lower precision. Our results suggest that SNPs are more accurate and powerful than microsatellites for identification of A. m. mellifera colonies, especially when they are selected by information content.

  7. Acute changes of hip joint range of motion using selected clinical stretching procedures: A randomized crossover study.

    Science.gov (United States)

    Hammer, Adam M; Hammer, Roger L; Lomond, Karen V; O'Connor, Paul

    2017-09-01

    Hip adductor flexibility and strength are important components of athletic performance and many activities of daily living. Little research has been done on the acute effects of a single session of stretching on hip abduction range of motion (ROM). The aim of this study was to compare 3 clinical stretching procedures against passive static stretching and control on ROM and peak isometric maximal voluntary contraction (MVC). Using a randomized crossover study design, a total of 40 participants (20 male and 20 female) who had reduced hip adductor muscle length attended a familiarization session and 5 testing sessions on non-consecutive days. Following the warm-up and pre-intervention measures of ROM and MVC, participants were randomly assigned 1 of 3 clinical stretching procedures (modified lunge, multidirectional, and joint mobilization) or a static stretch or control condition. Post-intervention measures of ROM and MVC were taken immediately following completion of the assigned condition. An ANOVA using a repeated measure design with the change score was conducted. All interventions resulted in small but statistically significant changes, and stretching was greater than control (p = 0.031). These data suggest that a single session of stretching has only a minimal effect on acute changes of hip abduction ROM. Although hip abduction is a frontal plane motion, to effectively increase the extensibility of the structures that limit abduction, integrating multi-planar stretches may be indicated. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Twenty-Channel Voice Response System.

    Science.gov (United States)

    1981-06-01

    programs and vocabulary. * Telephone Company (TELCO) Switched Lines - provides access to VRS using telephones. * Bell 407C Data Sets - converts the Touch... from the twenty 407C units. * DL11-E - asynchronous interface to the 11/34 Unibus for the VOTRAX unit. * 20 Channel ADPCM Decoder - a specially designed

  9. Educating the Ablest: Twenty Years Later

    Science.gov (United States)

    Culross, Rita R.

    2015-01-01

    This study examines the current lives of thirty-five individuals who participated in high school gifted programs twenty years ago. The research specifically looked at educational attainment and career goals in terms of expressed aspirations in high school, using social media and other Internet sources. Results indicated continued support for the…

  10. How selection fashions morphological variation in Cakile maritima: A comparative analysis of population structure using random amplified polymorphic DNA and quantitative traits

    Institute of Scientific and Technical Information of China (English)

    Gandour MHEMMED; Hessini KAMEL; Abdelly CHEDLY

    2012-01-01

    It is a long-standing debate in evolutionary biology whether natural selection can generate divergence in the face of gene flow. Comparative studies of quantitative genetic and neutral marker differentiation have provided means for detecting the action of selection and random genetic drift in natural populations. We estimated the degree of population divergence in several quantitative traits and compared these estimates with those based on presumably neutral molecular markers (random amplified polymorphic DNA [RAPD]). This approach allowed us to disentangle the effects of divergent selection from those of other evolutionary forces. Nine populations of Cakile maritima, which encompass the complete range of distribution of this species in Tunisia, were examined. We found a high proportion of total genetic variance to be among populations and among ecoregions for quantitative traits (range of QST: 0.44-0.88) and a moderate one for RAPD markers (GST: 0.081). In addition, almost all characters displayed a significantly higher QST than GST, indicating the occurrence of phenotypic plasticity and local adaptation. The latter is explicable as there is no reason to expect that natural selection would affect all traits in a similar fashion or affect all populations at a similar level. We also found a negative and significant correlation between genetic variation in molecular marker loci and quantitative traits at the multitrait scale. This result attests that the evolution of these markers in C. maritima was not paralleled, suggesting that the degree of genetic differentiation in neutral marker loci is closely predictive of the degree of differentiation in loci coding quantitative traits and that the majority of these neutral markers negatively controlled the studied quantitative traits.
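    The record does not give the formulas behind QST and GST; for orientation, a commonly used definition for an outcrossing plant is QST = Vbetween / (Vbetween + 2 * Vwithin), while Nei's GST = (HT - HS) / HT. The toy sketch below only evaluates these textbook formulas with invented numbers that echo the reported contrast (strong trait divergence, weak marker divergence).

    ```python
    # Textbook QST and GST formulas evaluated on invented, illustrative numbers.
    def q_st(v_between, v_within):
        """Quantitative-trait divergence from among- and within-population variance."""
        return v_between / (v_between + 2.0 * v_within)

    def g_st(h_total, h_subpop_mean):
        """Neutral-marker divergence from total and mean within-population diversity."""
        return (h_total - h_subpop_mean) / h_total

    print(round(q_st(4.0, 1.0), 2))     # 0.67 : strong divergence in a trait
    print(round(g_st(0.30, 0.276), 3))  # 0.08 : weak divergence at neutral markers
    ```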

  11. Human IgA-binding peptides selected from random peptide libraries: affinity maturation and application in IgA purification.

    Science.gov (United States)

    Hatanaka, Takaaki; Ohzono, Shinji; Park, Mirae; Sakamoto, Kotaro; Tsukamoto, Shogo; Sugita, Ryohei; Ishitobi, Hiroyuki; Mori, Toshiyuki; Ito, Osamu; Sorajo, Koichi; Sugimura, Kazuhisa; Ham, Sihyun; Ito, Yuji

    2012-12-14

    Phage display system is a powerful tool to design specific ligands for target molecules. Here, we used disulfide-constrained random peptide libraries constructed with the T7 phage display system to isolate peptides specific to human IgA. The binding clones (A1-A4) isolated by biopanning exhibited clear specificity to human IgA, but the synthetic peptide derived from the A2 clone exhibited a low specificity/affinity (K(d) = 1.3 μm). Therefore, we tried to improve the peptide using a partial randomized phage display library and mutational studies on the synthetic peptides. The designed Opt-1 peptide exhibited a 39-fold higher affinity (K(d) = 33 nm) than the A2 peptide. An Opt-1 peptide-conjugated column was used to purify IgA from human plasma. However, the recovered IgA fraction was contaminated with other proteins, indicating nonspecific binding. To design a peptide with increased binding specificity, we examined the structural features of Opt-1 and the Opt-1-IgA complex using all-atom molecular dynamics simulations with explicit water. The simulation results revealed that the Opt-1 peptide displayed partial helicity in the N-terminal region and possessed a hydrophobic cluster that played a significant role in tight binding with IgA-Fc. However, these hydrophobic residues of Opt-1 may contribute to nonspecific binding with other proteins. To increase binding specificity, we introduced several mutations in the hydrophobic residues of Opt-1. The resultant Opt-3 peptide exhibited high specificity and high binding affinity for IgA, leading to successful isolation of IgA without contamination.

  12. Efficient high payload and Randomly selected sub-blocks image Steganography%一种高效的随机分块图像隐写算法

    Institute of Scientific and Technical Information of China (English)

    唐明伟; 胡节; 范明钰; 郑秀林

    2012-01-01

    Information-hiding algorithms that combine high embedding efficiency with a large payload have become a focus of research in information and network security. Building on an analysis of such algorithms, ERS (an Efficient high payload and Randomly selected sub-blocks image Steganography) is proposed. Experimental results and analysis show that ERS not only reduces the changes made to the cover image and improves embedding efficiency, but is also simple to implement and computationally light; its performance in hiding efficiency and security is better than that of comparable algorithms.
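
    The record gives no implementation details, so the sketch below only illustrates the general idea of key-seeded random sub-block selection with LSB embedding; the 8x8 block size, the one-bit-per-block payload and the corner-pixel embedding are assumptions made for the demo and are not the ERS algorithm itself.

        import random
        import numpy as np

        def select_blocks(shape, block=8, key=42, n_blocks=4):
            """Key-seeded pseudo-random choice of sub-blocks in a grayscale image."""
            rows, cols = shape[0] // block, shape[1] // block
            rng = random.Random(key)                  # sender and receiver share the key
            coords = [(r, c) for r in range(rows) for c in range(cols)]
            return rng.sample(coords, n_blocks)

        def embed_bits(img, bits, block=8, key=42):
            """Write one payload bit into the LSB of each selected block's corner pixel."""
            out = img.copy()
            for (r, c), bit in zip(select_blocks(img.shape, block, key, len(bits)), bits):
                y, x = r * block, c * block
                out[y, x] = (out[y, x] & 0xFE) | bit
            return out

        if __name__ == "__main__":
            cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
            stego = embed_bits(cover, bits=[1, 0, 1, 1])
            print("pixels changed:", int(np.sum(cover != stego)))  # at most 4

    Since both ends derive the same block sequence from the shared key, extraction would simply revisit the selected blocks and read the LSBs back.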

  13. Varying levels of difficulty index of skills-test items randomly selected by examinees on the Korean emergency medical technician licensing examination

    Directory of Open Access Journals (Sweden)

    Bongyeun Koh

    2016-01-01

    Full Text Available Purpose: The goal of this study was to characterize the difficulty index of the items in the skills test components of the class I and II Korean emergency medical technician licensing examination (KEMTLE, which requires examinees to select items randomly. Methods: The results of 1,309 class I KEMTLE examinations and 1,801 class II KEMTLE examinations in 2013 were subjected to analysis. Items from the basic and advanced skills test sections of the KEMTLE were compared to determine whether some were significantly more difficult than others. Results: In the class I KEMTLE, all 4 of the items on the basic skills test showed significant variation in difficulty index (P<0.01, as well as 4 of the 5 items on the advanced skills test (P<0.05. In the class II KEMTLE, 4 of the 5 items on the basic skills test showed significantly different difficulty index (P<0.01, as well as all 3 of the advanced skills test items (P<0.01. Conclusion: In the skills test components of the class I and II KEMTLE, the procedure in which examinees randomly select questions should be revised to require examinees to respond to a set of fixed items in order to improve the reliability of the national licensing examination.

  14. Twenty Practices of an Entrepreneurial University

    DEFF Research Database (Denmark)

    Gjerding, Allan Næs; Wilderom, Celeste P.M.; Cameron, Shona P.B.;

    2006-01-01

    similarities; especially that entrepreneurship within universities has to be welcomed and facilitated top-down, but organically occurs and develops bottom-up. Implementing entrepreneurship at universities is thus about stimulating a culture of organic intrapreneurship and we provide practical recommendations...... studies twenty organisational practices against which a University's entrepreneurship can be measured. These twenty practices or factors in effect formed the basis for an entrepreneurship audit. During a series of interviews, the extent to which the universities are seen as entrepreneurial...... by the interviewees was surveyed. We showed that the practices have been implemented only to various degrees and rather unsystematically. There are important differences among the universities, to some extent depending on the level of ambition that each university has regarding each practice. There are also important...

  15. EcmPred: Prediction of extracellular matrix proteins based on random forest with maximum relevance minimum redundancy feature selection

    KAUST Repository

    Kandaswamy, Krishna Kumar Umar

    2013-01-01

    The extracellular matrix (ECM) is a major component of tissues of multicellular organisms. It consists of secreted macromolecules, mainly polysaccharides and glycoproteins. Malfunctions of ECM proteins lead to severe disorders such as marfan syndrome, osteogenesis imperfecta, numerous chondrodysplasias, and skin diseases. In this work, we report a random forest approach, EcmPred, for the prediction of ECM proteins from protein sequences. EcmPred was trained on a dataset containing 300 ECM and 300 non-ECM and tested on a dataset containing 145 ECM and 4187 non-ECM proteins. EcmPred achieved 83% accuracy on the training and 77% on the test dataset. EcmPred predicted 15 out of 20 experimentally verified ECM proteins. By scanning the entire human proteome, we predicted novel ECM proteins validated with gene ontology and InterPro. The dataset and standalone version of the EcmPred software is available at http://www.inb.uni-luebeck.de/tools-demos/Extracellular_matrix_proteins/EcmPred. © 2012 Elsevier Ltd.
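
    The EcmPred software itself is distributed at the URL above, but the combination this record names (maximum-relevance minimum-redundancy feature selection feeding a random forest) can be sketched generically. The greedy correlation-based mRMR below, the synthetic features and all parameter values are illustrative assumptions, not the published pipeline.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.feature_selection import mutual_info_classif
        from sklearn.model_selection import train_test_split

        def mrmr(X, y, k=10):
            """Greedy max-relevance min-redundancy selection (correlation for redundancy)."""
            relevance = mutual_info_classif(X, y, random_state=0)
            corr = np.abs(np.corrcoef(X, rowvar=False))
            selected = [int(np.argmax(relevance))]
            while len(selected) < k:
                remaining = [f for f in range(X.shape[1]) if f not in selected]
                scores = [relevance[f] - corr[f, selected].mean() for f in remaining]
                selected.append(remaining[int(np.argmax(scores))])
            return selected

        if __name__ == "__main__":
            # Toy stand-in for sequence-derived features (e.g. amino-acid composition).
            X, y = make_classification(n_samples=600, n_features=50, n_informative=8,
                                       random_state=0)
            X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
            feats = mrmr(X_tr, y_tr, k=10)
            clf = RandomForestClassifier(n_estimators=200, random_state=0)
            clf.fit(X_tr[:, feats], y_tr)
            print("held-out accuracy:", round(clf.score(X_te[:, feats], y_te), 3))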

  16. EcmPred: prediction of extracellular matrix proteins based on random forest with maximum relevance minimum redundancy feature selection.

    Science.gov (United States)

    Kandaswamy, Krishna Kumar; Pugalenthi, Ganesan; Kalies, Kai-Uwe; Hartmann, Enno; Martinetz, Thomas

    2013-01-21

    The extracellular matrix (ECM) is a major component of tissues of multicellular organisms. It consists of secreted macromolecules, mainly polysaccharides and glycoproteins. Malfunctions of ECM proteins lead to severe disorders such as marfan syndrome, osteogenesis imperfecta, numerous chondrodysplasias, and skin diseases. In this work, we report a random forest approach, EcmPred, for the prediction of ECM proteins from protein sequences. EcmPred was trained on a dataset containing 300 ECM and 300 non-ECM and tested on a dataset containing 145 ECM and 4187 non-ECM proteins. EcmPred achieved 83% accuracy on the training and 77% on the test dataset. EcmPred predicted 15 out of 20 experimentally verified ECM proteins. By scanning the entire human proteome, we predicted novel ECM proteins validated with gene ontology and InterPro. The dataset and standalone version of the EcmPred software is available at http://www.inb.uni-luebeck.de/tools-demos/Extracellular_matrix_proteins/EcmPred.

  17. Intelligence in the Twenty-First Century

    OpenAIRE

    2000-01-01

    The author concludes that the world will most probably remain rife with conflict even in the twenty first century and that the traditional role of intelligence will not only continue but will increase in importance. He characterizes the international situation as being "more of the same historically"; that is, the existence of several different centers of power and mutual conflicts based solely on national interests. In order to protect and promote one's national interests, sovereign states w...

  18. Servicing the twenty-first century

    Energy Technology Data Exchange (ETDEWEB)

    Fisk, D. [DTLR, London (United Kingdom)

    2002-04-01

    Twentieth century governments have committed themselves to the principle of sustainable development. Efforts to fulfil this goal offer an insight into changes in building services provision in the opening decades of the new century. Sustainable development indicators are used to identify possible trends. The analysis also forms the basis for some speculative conjectures as a basis for a research agenda for the twenty-first century. (Author)

  19. Twenty-first century learning in afterschool.

    Science.gov (United States)

    Schwarz, Eric; Stolow, David

    2006-01-01

    Twenty-first century skills increasingly represent the ticket to the middle class. Yet, the authors argue, in-school learning is simply not enough to help students develop these skills. The authors make the case that after-school (or out-of-school) learning programs are emerging as one of the nation's most promising strategies for preparing young people for the workforce and civic life. Most school systems have significant limitations for teaching twenty-first century skills. They have the limits of time: with only six hours per day there is barely enough time to teach even the basic skills, especially for those students starting already behind. They have the limits of structure: typical school buildings and classrooms are not physically set up for innovative learning. They have the limits of inertia and bureaucracy: school systems are notoriously resistant to change. And perhaps most important, they have the limits of priorities: especially with the onset of the No Child Left Behind Act, schools are laserlike in their focus on teaching the basics and therefore have less incentive to incorporate twenty-first century skills. Meanwhile, the authors argue that after-school programs are an untapped resource with three competitive advantages. First, they enable students to work collaboratively in small groups, a setup on which the modern economy will increasingly rely. Second, they are well suited to project-based learning and the development of mastery. Third, they allow students to learn in the real-world contexts that make sense. Yet the after-school sector is fraught with challenges. It lacks focus-Is it child care, public safety, homework tutoring? And it lacks rigorous results. The authors argue that the teaching of twenty-first century skills should become the new organizing principle for afterschool that will propel the field forward and more effectively bridge in-school and out-of-school learning.

  20. Recruitment strategies should not be randomly selected: empirically improving recruitment success and diversity in developmental psychology research

    Science.gov (United States)

    Sugden, Nicole A.; Moulson, Margaret C.

    2015-01-01

    Psychological and developmental research have been critiqued for the lack of diversity of research samples. Because differences in culture, race, and ethnicity can influence participant behavior, limited diversity limits the generalizability of the findings. These differences may also impact how participants behave in response to recruitment attempts, which suggests that recruitment itself may be leveraged to increase sample diversity. The goal of the current study was to determine what factors, within a recruitment interaction, could be leveraged to increase success and diversity when recruiting families with children for developmental research. Study 1 found three factors influenced success: (1) recruitment was more successful when other potential participants were also interested (i.e., recruiters were busy), (2) recruiters of particular races were more successful than recruiters of other races, and (3) differences in success were related to what the recruiter said to engage the potential participant (i.e., the script). The latter two factors interacted, suggesting some recruiters were using less optimal scripts. To improve success rates, study 2 randomly assigned scripts to recruiters and encouraged them to recruit more vigorously during busy periods. Study 2 found that two factors influenced success: (1) some scripts were more successful than others and (2) we were more successful at recruiting non-White potential participants than White participants. These two interacted, with some scripts being more successful with White and other scripts being more successful with non-White families. This intervention significantly increased recruitment success rate by 8.1% and the overall number of families recruited by 15.3%. These findings reveal that empirically evaluating and tailoring recruitment efforts based on the most successful strategies is effective in boosting diversity through increased participation of children from non-White families. PMID:25972829

  1. Recruitment strategies should not be randomly selected: empirically improving recruitment success and diversity in developmental psychology research.

    Science.gov (United States)

    Sugden, Nicole A; Moulson, Margaret C

    2015-01-01

    Psychological and developmental research have been critiqued for the lack of diversity of research samples. Because differences in culture, race, and ethnicity can influence participant behavior, limited diversity limits the generalizability of the findings. These differences may also impact how participants behave in response to recruitment attempts, which suggests that recruitment itself may be leveraged to increase sample diversity. The goal of the current study was to determine what factors, within a recruitment interaction, could be leveraged to increase success and diversity when recruiting families with children for developmental research. Study 1 found three factors influenced success: (1) recruitment was more successful when other potential participants were also interested (i.e., recruiters were busy), (2) recruiters of particular races were more successful than recruiters of other races, and (3) differences in success were related to what the recruiter said to engage the potential participant (i.e., the script). The latter two factors interacted, suggesting some recruiters were using less optimal scripts. To improve success rates, study 2 randomly assigned scripts to recruiters and encouraged them to recruit more vigorously during busy periods. Study 2 found that two factors influenced success: (1) some scripts were more successful than others and (2) we were more successful at recruiting non-White potential participants than White participants. These two interacted, with some scripts being more successful with White and other scripts being more successful with non-White families. This intervention significantly increased recruitment success rate by 8.1% and the overall number of families recruited by 15.3%. These findings reveal that empirically evaluating and tailoring recruitment efforts based on the most successful strategies is effective in boosting diversity through increased participation of children from non-White families.

  2. Amisulpride a selective dopamine antagonist and atypical antipsychotic: results of a meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Leucht, Stefan

    2004-03-01

    The pharmacological profiles of the atypical antipsychotics, clozapine, olanzapine, quetiapine and risperidone, all show a combined serotonin (5-HT2) and dopamine type-2 (D2) receptor antagonism. Amisulpride, a highly selective dopamine D2/D3 receptor antagonist that binds preferentially to receptors in the mesolimbic system, is also an 'atypical' antipsychotic despite having a different receptor-affinity profile. A meta-analysis of 18 clinical trials was undertaken to compare the efficacy and safety of amisulpride with conventional antipsychotics. The improvement in mental state was assessed using the Brief Psychiatric Rating Scale (BPRS) or the Scale for the Assessment of Negative Symptoms (SANS). In a pooled analysis of 10 studies of acutely ill patients, amisulpride was significantly more effective than conventional neuroleptics with regard to improvement of global symptoms. Amisulpride is, to date, the only atypical antipsychotic for which several studies on patients suffering predominantly from negative symptoms have been published. In four such studies, amisulpride was significantly superior to placebo. Three small studies with conventional neuroleptics as a comparator showed only a trend in favour of amisulpride in this regard. Amisulpride was associated with fewer extrapyramidal side-effects and fewer drop-outs due to adverse events than conventional neuroleptics. These results clearly show that amisulpride is an 'atypical' antipsychotic, and they cast some doubt on the notion that combined 5-HT2-D2 antagonism is the only reason for the high efficacy against negative symptoms and fewer extrapyramidal side-effects.

  3. Detection of blaSHV, blaTEM and blaCTX-M antibiotic resistance genes in randomly selected bacterial pathogens from the Steve Biko Academic Hospital.

    Science.gov (United States)

    Ehlers, Marthie M; Veldsman, Chrisna; Makgotlho, Eddy P; Dove, Michael G; Hoosen, Anwar A; Kock, Marleen M

    2009-08-01

    Extended-spectrum beta-lactamases (ESBLs) are considered one of the most important antibiotic resistance mechanisms. This study reported the ESBL-producing genes in 53 randomly selected clinical bacterial isolates from the Steve Biko Academic Hospital. The presence of the bla(SHV), bla(TEM) and bla(CTX-M) genes was determined; their overall prevalence in this study was 87% (46/53), which is high in comparison with figures reported in the literature, such as 33% for Escherichia coli in Europe and 0.8% for similar pathogens in Denmark. These findings indicate that it is crucial to routinely monitor the prevalence of these resistance genes.

  4. Evolutional selection of a combinatorial phage library displaying randomly-rearranged various single domains of immunoglobulin (Ig-binding proteins (IBPs with four kinds of Ig molecules

    Directory of Open Access Journals (Sweden)

    Jia Jian-An

    2008-08-01

    Full Text Available Abstract Background Protein A, protein G and protein L are three well-defined immunoglobulin (Ig)-binding proteins (IBPs), which show affinity for specific sites on the Ig of mammalian hosts. Although the precise functions of these molecules are not fully understood, it is thought that they play an important role in the pathogenicity of bacteria. The single domains of protein A, protein G and protein L have all been demonstrated to bind Ig. Whether combinations of Ig-binding domains from various IBPs could exhibit useful novel binding is an interesting question. Results We used a combinatorial phage library that displayed randomly-rearranged, variously peptide-linked molecules of the D and A domains of protein A, designated PA(D) and PA(A) respectively, the B2 domain of protein G (PG) and the B3 domain of protein L (PL), for affinity selection with human IgG (hIgG), human IgM (hIgM), human IgA (hIgA) and recombinant hIgG1-Fc as bait, respectively. Two kinds of novel combinatorial molecules with the characteristic structures PA(A)-PG and PA(A)-PL were obtained in the hIgG (hIgG1-Fc) and hIgM (hIgA) post-selection populations, respectively. In addition, the linking peptides among all PA(A)-PG and PA(A)-PL structures were strongly selected and showed an interestingly divergent and convergent distribution. Phage binding assays and competitive inhibition experiments demonstrated that the PA(A)-PG and PA(A)-PL combinations possess comparable binding advantages with hIgG/hIgG1-Fc and hIgM/hIgA, respectively. Conclusion In this work, a combinatorial phage library displaying Ig-binding domains of protein A, protein G, or protein L joined by various random linking peptides was used to conduct evolutional selection in vitro with four kinds of Ig molecules. Two kinds of novel combinations of Ig-binding domains, PA(A)-PG and PA(A)-PL, were obtained and demonstrated novel Ig-binding properties.

  5. Zeta Sperm Selection Improves Pregnancy Rate and Alters Sex Ratio in Male Factor Infertility Patients: A Double-Blind, Randomized Clinical Trial

    Directory of Open Access Journals (Sweden)

    Nasr Esfahani Mohammad Hossein

    2016-07-01

    Full Text Available Background Selection of sperm for intra-cytoplasmic sperm injection (ICSI) is usually considered the ultimate technique to alleviate male-factor infertility. In routine ICSI, selection is based on morphology and viability, which does not necessarily preclude the chance injection of DNA-damaged or apoptotic sperm into the oocyte. Sperm with a high negative surface electrical charge, named “Zeta potential”, are mature and more likely to have intact chromatin. In addition, X-bearing spermatozoa carry more negative charge. Therefore, we aimed to compare the clinical outcomes of the Zeta procedure with routine sperm selection in infertile men who were candidates for ICSI. Materials and Methods From a total of 203 ICSI cycles studied, 101 cycles were allocated to the density gradient centrifugation (DGC)/Zeta group and the remaining 102 were included in the DGC group in this prospective study. Clinical outcomes were compared between the two groups. The ratios of X- and Y-bearing sperm were assessed by fluorescence in situ hybridization (FISH) and quantitative polymerase chain reaction (qPCR) methods in 17 independent semen samples. Results In the present double-blind randomized clinical trial, a significant increase in top-quality embryos and pregnancy rate was observed in the DGC/Zeta group compared to the DGC group. Moreover, the sex ratio (XY/XX) at birth was significantly lower in the DGC/Zeta group compared to the DGC group despite a similar ratio of X/Y-bearing spermatozoa following Zeta selection. Conclusion The Zeta method not only improves the percentage of top-quality embryos and pregnancy outcome but also alters the sex ratio compared to the conventional DGC method, despite no significant change in the ratio of the X- and Y-bearing sperm population (Registration number: IRCT201108047223N1).

  6. Examination of the transcription factor NtcA-binding motif by in vitro selection of DNA sequences from a random library.

    Science.gov (United States)

    Jiang, F; Wisén, S; Widersten, M; Bergman, B; Mannervik, B

    2000-08-25

    A recursive in vitro selection among random DNA sequences was used for analysis of the cyanobacterial transcription factor NtcA-binding motifs. An eight-base palindromic sequence, TGTA-(N(8))-TACA, was found to be the optimal NtcA-binding sequence. The more divergent the binding sequences, compared to this consensus sequence, the lower the NtcA affinity. The second and third bases in each four-nucleotide half of the consensus sequence were crucial for NtcA binding, and they were in general highly conserved. The most frequently occurring sequence in the middle weakly conserved region was similar to that of the NtcA-binding motif of the Anabaena sp. strain PCC 7120 glnA gene, previously known to have high affinity for NtcA. This indicates that the middle sequences were selected for high NtcA affinity. Analysis of natural NtcA-binding motifs showed that these could be classified into two groups based on differences in recognition consensus sequences. It is suggested that NtcA naturally recognizes different DNA-binding motifs, or has differential affinities to these sequences under different physiological conditions.
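
    The selected consensus lends itself to a simple computational check: scan a sequence for windows that match TGTA-(N8)-TACA and rank them by mismatches in the conserved half-sites. The Python sketch below is only an illustration of that idea; the example promoter string and the top-three reporting are invented for the demo and are not taken from the study.

        CONSENSUS = "TGTA" + "N" * 8 + "TACA"   # optimal NtcA-binding motif reported above

        def mismatches(site, consensus=CONSENSUS):
            """Count mismatches to the consensus, ignoring the degenerate N spacer."""
            return sum(c != "N" and c != s for s, c in zip(site, consensus))

        def scan(sequence, width=len(CONSENSUS)):
            """Slide a window along the sequence and return the three best-matching sites."""
            seq = sequence.upper()
            hits = [(mismatches(seq[i:i + width]), i, seq[i:i + width])
                    for i in range(len(seq) - width + 1)]
            return sorted(hits)[:3]

        if __name__ == "__main__":
            promoter = "ATTGTACCGGTTAATACAGGCTTGTAAGCTAGCGTTACT"   # made-up sequence
            for mm, pos, site in scan(promoter):
                print(f"position {pos:2d}  mismatches {mm}  {site}")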

  7. A randomized controlled trial investigating the use of a predictive nomogram for the selection of the FSH starting dose in IVF/ICSI cycles.

    Science.gov (United States)

    Allegra, Adolfo; Marino, Angelo; Volpes, Aldo; Coffaro, Francesco; Scaglione, Piero; Gullo, Salvatore; La Marca, Antonio

    2017-01-23

    The number of oocytes retrieved is a relevant intermediate outcome in women undergoing IVF/intracytoplasmic sperm injection (ICSI). This trial compared the efficiency of the selection of the FSH starting dose according to a nomogram based on multiple biomarkers (age, day 3 FSH, anti-Müllerian hormone) versus an age-based strategy. The primary outcome measure was the proportion of women with an optimal number of retrieved oocytes defined as 8-14. At their first IVF/ICSI cycle, 191 patients underwent a long gonadotrophin-releasing hormone agonist protocol and were randomized to receive a starting dose of recombinant (human) FSH, based on their age (150 IU if ≤35 years, 225 IU if >35 years) or based on the nomogram. Optimal response was observed in 58/92 patients (63%) in the nomogram group and in 42/99 (42%) in the control group (+21%, 95% CI = 0.07 to 0.35, P = 0.0037). No significant differences were found in the clinical pregnancy rate or the number of embryos cryopreserved per patient. The study showed that the FSH starting dose selected according to ovarian reserve is associated with an increase in the proportion of patients with an optimal response: large trials are recommended to investigate any possible effect on the live-birth rate.

  8. Prevalence of at-risk genotypes for genotoxic effects decreases with age in a randomly selected population in Flanders: a cross sectional study

    Directory of Open Access Journals (Sweden)

    van Delft Joost HM

    2011-10-01

    Full Text Available Abstract Background We hypothesized that in Flanders (Belgium), the prevalence of at-risk genotypes for genotoxic effects decreases with age due to morbidity and mortality resulting from chronic diseases. Rather than polymorphisms in single genes, the interaction of multiple genetic polymorphisms in low penetrance genes involved in genotoxic effects might be of relevance. Methods Genotyping was performed on 399 randomly selected adults (aged 50-65) and on 442 randomly selected adolescents. Based on their involvement in processes relevant to genotoxicity, 28 low penetrance polymorphisms affecting the phenotype in 19 genes were selected (xenobiotic metabolism, oxidative stress defense and DNA repair; respectively 13, 6 and 9 polymorphisms). Polymorphisms which, based on available literature, could not clearly be categorized a priori as leading to an 'increased risk' or a 'protective effect' were excluded. Results The mean number of risk alleles for all investigated polymorphisms was found to be lower in the 'elderly' (17.0 ± 2.9) than the 'adolescent' (17.6 ± 3.1) subpopulation (P = 0.002). These results were not affected by gender or smoking. The prevalence of a high (> 17 = median) number of risk alleles was less frequent in the 'elderly' (40.6%) than the 'adolescent' (51.4%) subpopulation (P = 0.002). In particular for phase II enzymes, the mean number of risk alleles was lower in the 'elderly' (4.3 ± 1.6) than the 'adolescent' age group (4.8 ± 1.9). The prevalence of a high (> 4 = median) number of risk alleles was less frequent in the 'elderly' (41.3%) than the 'adolescent' subpopulation (56.3%), and the prevalence of a high (> 8 = median) number of risk alleles for DNA repair enzyme-coding genes was lower in the 'elderly' (37.3%) than the 'adolescent' subpopulation (45.6%, P = 0.017). Conclusions These observations are consistent with the hypothesis that, in Flanders, the prevalence of at-risk alleles in genes involved in genotoxic effects decreases with age, suggesting that persons carrying a higher number of

  9. The characterization of twenty sequenced human genomes.

    Directory of Open Access Journals (Sweden)

    Kimberly Pelak

    2010-09-01

    Full Text Available We present the analysis of twenty human genomes to evaluate the prospects for identifying rare functional variants that contribute to a phenotype of interest. We sequenced at high coverage ten "case" genomes from individuals with severe hemophilia A and ten "control" genomes. We summarize the number of genetic variants emerging from a study of this magnitude, and provide a proof of concept for the identification of rare and highly-penetrant functional variants by confirming that the cause of hemophilia A is easily recognizable in this data set. We also show that the number of novel single nucleotide variants (SNVs discovered per genome seems to stabilize at about 144,000 new variants per genome, after the first 15 individuals have been sequenced. Finally, we find that, on average, each genome carries 165 homozygous protein-truncating or stop loss variants in genes representing a diverse set of pathways.

  10. Enhanced emotional reactivity after selective REM sleep deprivation in humans: an fMRI study

    OpenAIRE

    Rosales-Lagarde, Alejandra; Jorge L Armony; del Río-Portilla, Yolanda; Trejo-Martínez, David; Conde, Ruben; Corsi-Cabrera, Maria

    2012-01-01

    Converging evidence from animal and human studies suggest that rapid eye movement (REM) sleep modulates emotional processing. The aim of the present study was to explore the effects of selective REM sleep deprivation (REM-D) on emotional responses to threatening visual stimuli and their brain correlates using functional magnetic resonance imaging (fMRI). Twenty healthy subjects were randomly assigned to two groups: selective REM-D, by awakening them at each REM sleep onset, or non-rapid eye m...

  11. Enhanced emotional reactivity after selective REM sleep deprivation in humans: an fMRI study

    OpenAIRE

    Alejandra Rosales-Lagarde; Jorge L Armony; Yolanda del Río-Portilla; David Trejo-Martínez; Ruben Conde; Maria Corsi-Cabrera

    2012-01-01

    Converging evidence from animal and human studies suggest that REM sleep modulates emotional processing. The aim of the present study was to explore the effects of selective REM sleep deprivation on emotional responses to threatening visual stimuli and their brain correlates using functional magnetic resonance imaging (fMRI). Twenty healthy subjects were randomly assigned to two groups: selective REM sleep deprivation (REM-D), by awakening them at each REM sleep onset, or NREM sleep interrupt...

  12. Selection of IgG variants with increased FcRn binding using random and directed mutagenesis: impact on effector functions

    Directory of Open Access Journals (Sweden)

    Céline eMonnet

    2015-02-01

    Full Text Available Despite the reasonably long half-life of IgGs, market pressure for higher patient convenience while conserving efficacy continues to drive IgG half-life improvement. IgG half-life is dependent on the neonatal Fc receptor FcRn, which amongst other functions, protects IgG from catabolism. FcRn binds the Fc domain of IgG at an acidic pH ensuring that endocytosed IgG will not be degraded in lysosomal compartments and will then be released into the bloodstream. Consistent with this mechanism of action, several Fc engineered IgG with increased FcRn affinity and conserved pH-dependency were designed and resulted in longer half-life in vivo in human FcRn transgenic mice (hFcRn, cynomolgus monkeys and recently in healthy humans. These IgG variants were usually obtained by in silico approaches or directed mutagenesis in the FcRn binding site. Using random mutagenesis, combined with a pH-dependent phage display selection process, we isolated IgG variants with improved FcRn-binding which exhibited longer in vivo half-life in hFcRn mice. Interestingly, many mutations enhancing Fc/FcRn interaction were located at a distance from the FcRn binding site validating our random molecular approach. Directed mutagenesis was then applied to generate new variants to further characterize our IgG variants and the effect of the mutations selected. Since these mutations are distributed over the whole Fc sequence, binding to other Fc effectors, such as complement C1q and FcgRs, was dramatically modified, even by mutations distant from these effectors’ binding sites. Hence, we obtained numerous IgG variants with increased FcRn binding and different binding patterns to other Fc effectors, including variants without any effector function, providing distinct fit-for-purpose Fc molecules. We therefore provide evidence that half-life and effector functions should be optimized simultaneously as mutations can have unexpected effects on all Fc receptors that are critical for Ig

  13. Rock magnetic evidence of non-random raw material selection criteria in Cerro Toledo Obsidian Artifacts from Valles Caldera, New Mexico

    Science.gov (United States)

    Gregovich, A.; Feinberg, J. M.; Steffen, A.; Sternberg, R. S.

    2014-12-01

    Stone tools are one of the most enduring forms of ancient human behavior available to anthropologists. The geologic materials that comprise stone tools are a reflection of the rocks that were available locally or through trade, as are the intended use of the tools and the knapping technology needed to produce them. Investigation of the rock magnetic and geochemical characteristics of the artifacts and the geological source materials provides a baseline to explore these past behaviors. This study uses rock magnetic properties to explore the raw material selection criteria involved in the production of obsidian tools in the region around Valles Caldera in northern New Mexico. Obsidian is locally abundant and was traded by tribes across the central United States. Here we compare the rock magnetic properties of a sample of obsidian projectile points (N =25) that have been geochemically sourced to the Cerro Toledo obsidian flow with geological samples collected from four sites within the same flow (N =135). This collection of archaeological artifacts, albeit small, contains representatives of at least 8 different point styles that were used over 6000 years from the Archaic into the Late Prehistoric. Bulk rock hysteresis parameters (Mr, Ms, Bc, and Bcr) and low-field susceptibility (Χ) measurements show that the projectile points generally contain a lower concentration of magnetic minerals than the geologic samples. For example, the artifacts' median Ms value is 2.9 x 10-3 Am2kg-1, while that of the geological samples is 6.5 x 10-3 Am2kg-1. The concentration of magnetic minerals in obsidian is a proxy for the concentration of microlites in general, and this relationship suggests that although obsidian was locally abundant, toolmakers employed non-random selection criteria resulting in generally lower concentrations of microlites in their obsidian tools.

  14. The twenty-first century in space

    CERN Document Server

    Evans, Ben

    2015-01-01

    This final entry in the History of Human Space Exploration mini-series by Ben Evans continues with an in-depth look at the latter part of the 20th century and the start of the new millennium. Picking up where Partnership in Space left off, the story commemorating the evolution of manned space exploration unfolds in further detail. More than fifty years after Yuri Gagarin’s pioneering journey into space, Evans extends his overview of how that momentous voyage continued through the decades which followed. The Twenty-first Century in Space, the sixth book in the series, explores how the fledgling partnership between the United States and Russia in the 1990s gradually bore fruit and laid the groundwork for today’s International Space Station. The narrative follows the convergence of the Shuttle and Mir programs, together with standalone missions, including servicing the Hubble Space Telescope, many of whose technical and human lessons enabled the first efforts to build the ISS in orbit. The book also looks to...

  15. Twenty Years After: Armenian Research Libraries Today

    Directory of Open Access Journals (Sweden)

    D. Aram Donabedian

    2012-05-01

    Full Text Available Since achieving statehood in 1991, Armenia has faced major economic and political obstacles which have significantly affected the nation’s research libraries. This research paper will quantitatively and qualitatively examine the challenges facing Armenian research libraries just over twenty years after the collapse of the Soviet Union. Specifically, the authors analyze their interviews with five library administrators at five major institutions, respectively. These include Yerevan State University Library, the National Library of Armenia, the Fundamental Scientific Library of the National Academy of Sciences of Armenia, the Republican Scientific-Medical Library of Armenia, and the Papazian Library of the American University of Armenia. The instrument for the interviews consists of 73 questions based on the 2004 Association of College and Research Libraries Standards for Libraries in Higher Education and evaluates the following factors:
    • The library’s mission, goals and objectives
    • Public or user services
    • Instruction activities at the library
    • Resources (print, media, or electronic) and collection development
    • Access to the library’s resources
    • Outcome assessment, or evaluation of the library
    • Staffing issues
    • Facility maintenance and plans for library development
    • Communication and cooperation both within the library and with the user community
    • Administration
    • Budget
    In addition, we will focus on the strengths and weaknesses of these libraries and investigate the growing open access movement in Armenia. Based on our findings, the authors wish to facilitate dialogue and consider possible approaches to help these libraries meet Armenia’s pressing information needs.

  16. Worm Propagation Model Based on Selective-Random Scan%基于选择性随机扫描的蠕虫传播模型

    Institute of Scientific and Technical Information of China (English)

    张祥德; 丁春燕; 朱和贵

    2006-01-01

    The process by which a worm spreads through a closed population of computers is analyzed, and a discrete worm propagation model is proposed. Comparing the model with real propagation data from the Code Red v2 worm shows that it captures the spreading behavior of random-scanning worms well. The model is then generalized to cover selective-random scan worms; the generalized model shows that, in a closed population of computers, the greater the variation in the number of susceptible hosts across the individual subnetworks, the faster the worm spreads.
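
    The record gives only the qualitative conclusion, so the sketch below is a generic discrete-time illustration rather than the paper's model: infected hosts split their probes between their own subnet and the whole address space, and two scenarios with the same total number of susceptible hosts but different variance across subnets are compared. All parameter values (subnet sizes, scan rate, local bias) are assumptions for the demo.

        import numpy as np

        def simulate(susceptible, address_space, scans_per_tick=4, local_bias=0.6, ticks=150):
            """Discrete-time spread of a scanning worm over several subnets.

            local_bias = 0 gives uniform random scanning; larger values make infected
            hosts probe their own subnet more often (a selective-random scan).
            """
            s = np.array(susceptible, dtype=float)   # susceptible hosts per subnet
            i = np.zeros_like(s)
            i[0] = 1.0                               # one initial infection, in the first subnet
            for _ in range(ticks):
                global_probes = i.sum() * scans_per_tick * (1 - local_bias)
                probes = (i * scans_per_tick * local_bias                          # local probes
                          + global_probes * address_space / address_space.sum())  # global share
                newly_hit = s * (1 - np.exp(-probes / address_space))              # expected hits
                s, i = s - newly_hit, i + newly_hit
            return i.sum()

        if __name__ == "__main__":
            space = np.array([2 ** 16] * 4, dtype=float)          # four equally sized subnets
            scenarios = {"evenly spread": [500, 500, 500, 500],
                         "highly uneven": [1700, 100, 100, 100]}  # same total, more variance
            for name, hosts in scenarios.items():
                print(f"{name}: ~{simulate(hosts, space):.0f} hosts infected after 150 ticks")

    With these toy numbers the uneven scenario infects far more hosts in the same number of ticks, which mirrors the stated conclusion about variance across subnetworks.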

  17. Selection of single blastocysts for fresh transfer via standard morphology assessment alone and with array CGH for good prognosis IVF patients: results from a randomized pilot study

    Directory of Open Access Journals (Sweden)

    Yang Zhihong

    2012-05-01

    Full Text Available Abstract Background Single embryo transfer (SET) remains underutilized as a strategy to reduce multiple gestation risk in IVF, and its overall lower pregnancy rate underscores the need for improved techniques to select one embryo for fresh transfer. This study explored use of comprehensive chromosomal screening by array CGH (aCGH) to provide this advantage and improve pregnancy rate from SET. Methods First-time IVF patients with a good prognosis (age Results For patients in Group A (n = 55), 425 blastocysts were biopsied and analyzed via aCGH (7.7 blastocysts/patient). Aneuploidy was detected in 191/425 (44.9%) of blastocysts in this group. For patients in Group B (n = 48), 389 blastocysts were microscopically examined (8.1 blastocysts/patient). Clinical pregnancy rate was significantly higher in the morphology + aCGH group compared to the morphology-only group (70.9 and 45.8%, respectively; p = 0.017); ongoing pregnancy rates for Groups A and B were 69.1 vs. 41.7%, respectively (p = 0.009). There were no twin pregnancies. Conclusion Although aCGH followed by frozen embryo transfer has been used to screen at-risk embryos (e.g., known parental chromosomal translocation or history of recurrent pregnancy loss), this is the first description of aCGH fully integrated with a clinical IVF program to select single blastocysts for fresh SET in good prognosis patients. The observed aneuploidy rate (44.9%) among biopsied blastocysts highlights the inherent imprecision of SET when conventional morphology is used alone. Embryos randomized to the aCGH group implanted with greater efficiency, resulted in clinical pregnancy more often, and yielded a lower miscarriage rate than those selected without aCGH. Additional studies are needed to verify our pilot data and confirm a role for on-site, rapid aCGH for IVF patients contemplating fresh SET.

  18. Efficacy of aerobic exercise and a prudent diet for improving selected lipids and lipoproteins in adults: a meta-analysis of randomized controlled trials

    Directory of Open Access Journals (Sweden)

    Roberts Susan

    2011-06-01

    Full Text Available Abstract Background Studies addressing the effects of aerobic exercise and a prudent diet on lipid and lipoprotein concentrations in adults have reached conflicting conclusions. The purpose of this study was to determine the effects of aerobic exercise combined with a prudent diet on lipid and lipoprotein concentrations in adults. Methods Studies were located by searching nine electronic databases, cross-referencing, and expert review. Two independent reviewers selected studies that met the following criteria: (1) randomized controlled trials, (2) aerobic exercise combined with diet recommendations (saturated/trans fat intake less than 10% of total calories and cholesterol less than 300 mg/day and/or fiber intake ≥25 g/day in women and ≥35 grams per day in men), (3) intervention ≥4 weeks, (4) humans ≥18 years of age, (5) published studies, including dissertations and Master's theses, (6) studies published in any language, (7) studies published between January 1, 1955 and May 1, 2009, (8) assessment of one or more of the following lipid and lipoprotein concentrations: total cholesterol (TC), high-density lipoprotein cholesterol (HDL-C), ratio of TC to HDL-C, non-HDL-C, low-density lipoprotein cholesterol (LDL-C) and triglycerides (TG). Two reviewers independently extracted all data. Random-effects models that account for heterogeneity and 95% confidence intervals were used to pool findings. Results Of the 1,401 citations reviewed, six studies representing 16 groups (8 intervention, 8 control) and up to 559 men and women (282 intervention, 277 control) met the criteria for analysis. Statistically significant intervention minus control reductions were found for TC (-15.5 mg/dl, 95% CI, -20.3 to -10.7), TC:HDL-C (-0.4 mg/dl, 95% CI, -0.7 to -0.2), LDL-C (-9.2 mg/dl, 95% CI, -12.7 to -5.8) and TG (-10.6 mg/dl, 95% CI, -17.2 to -4.0) but not HDL-C (-0.5 mg/dl, 95% CI, -4.0 to 3.1). Changes were equivalent to reductions of 7.5%, 6.6%, 7.2% and 18.2%, respectively
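
    For readers unfamiliar with the pooling step, the random-effects calculation referred to here can be sketched in a few lines. The DerSimonian-Laird estimator below is a standard choice but is only assumed for illustration (the record does not state which estimator was used), and the per-study mean differences and variances are invented.

        import numpy as np

        def random_effects_pool(effects, variances):
            """DerSimonian-Laird random-effects pooling of study-level mean differences."""
            e, v = np.asarray(effects, float), np.asarray(variances, float)
            w_fixed = 1.0 / v
            mu_fixed = np.sum(w_fixed * e) / np.sum(w_fixed)
            q = np.sum(w_fixed * (e - mu_fixed) ** 2)                 # heterogeneity Q
            c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
            tau2 = max(0.0, (q - (len(e) - 1)) / c)                   # between-study variance
            w = 1.0 / (v + tau2)
            mu = np.sum(w * e) / np.sum(w)
            se = np.sqrt(1.0 / np.sum(w))
            return mu, (mu - 1.96 * se, mu + 1.96 * se), tau2

        if __name__ == "__main__":
            # Hypothetical per-study mean differences in total cholesterol (mg/dl)
            # with their sampling variances -- illustrative numbers only.
            effects = [-12.0, -18.5, -9.3, -21.0, -14.2, -16.8]
            variances = [9.0, 16.0, 12.0, 25.0, 10.0, 14.0]
            mean, ci, tau2 = random_effects_pool(effects, variances)
            print(f"pooled difference {mean:.1f} mg/dl, "
                  f"95% CI ({ci[0]:.1f}, {ci[1]:.1f}), tau^2 {tau2:.2f}")

    Swapping in the review's actual study-level differences and variances would reproduce the kind of pooled estimates and confidence intervals quoted above.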

  19. Randomizing Roaches: Exploring the "Bugs" of Randomization in Experimental Design

    Science.gov (United States)

    Wagler, Amy; Wagler, Ron

    2014-01-01

    Understanding the roles of random selection and random assignment in experimental design is a central learning objective in most introductory statistics courses. This article describes an activity, appropriate for a high school or introductory statistics course, designed to teach the concepts, values and pitfalls of random selection and assignment…
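
    The distinction the activity teaches can be shown in a few lines of Python; the class roster, sample size and group split below are arbitrary choices made for the demo.

        import random

        random.seed(7)

        # A hypothetical sampling frame of 30 students (the "population").
        population = [f"student_{i:02d}" for i in range(1, 31)]

        # Random selection: draw a sample from the population so results generalize.
        sample = random.sample(population, 10)

        # Random assignment: split the sample into treatment and control so the
        # groups are comparable and differences can be attributed to the treatment.
        shuffled = sample[:]
        random.shuffle(shuffled)
        treatment, control = shuffled[:5], shuffled[5:]

        print("sample:   ", sample)
        print("treatment:", treatment)
        print("control:  ", control)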

  20. Cerebral blood flow reactivity in patients undergoing selective amygdalohippocampectomy for epilepsy of mesial temporal origin. A prospective randomized comparison of the trans-Sylvian and the transcortical approach.

    Science.gov (United States)

    Schatlo, Bawarjan; Jägersberg, Max; Paass, Gerhard; Faltermeier, Rupert; Streich, Jörg; Meyer, Bernhard; Schaller, Karl

    2015-01-01

    The aim of this study was to assess (1) whether vasoreactivity is altered in patients with epilepsy and (2) whether the two most commonly used approaches, the trans-Sylvian (TS) and the trans-cortical (TC) route, differ in their impact on cortical blood flow. Patients were randomized to undergo selective amygdalohippocampectomy (selAH) through a TC or TS route. Before and after selAH, we recorded microcirculation parameters on the superficial cortex surrounding the surgical corridor. Blood flow and velocity were measured using laser Doppler flowmetry and micro-Doppler, respectively. Cortical oxygen saturation (SO2) was measured using remission spectrophotometry under hypocapnic and normocapnic conditions. Ten patients were operated on using the TS approach, and eight were operated on via the TC approach. Vasomotor reactivity patterns measured with micro-Doppler were physiological prior to selAH in both groups. After completion of surgery, a significant increase in SO2 values occurred in the TS group (before: 56.7 ± 2.2, after: 65.5 ± 3.0%SO2), but not in the TC group (before: 52.9 ± 5.2, after: 53.0 ± 3.7%SO2). The rate of critical SO2 values below 25% was significantly higher after the TC approach (12.3%) compared to the TS approach (5.2%; p < 0.05). Our findings provide the first invasively measured evidence that patients with mesial temporal lobe epilepsy have preserved cerebral blood flow responses to alterations in CO2. In addition, local cortical SO2 was higher in the TS group than in the TC group after selAH. This may be a sign of reactive cortical vessel dilation after proximal vessel manipulation associated with the TS approach. In contrast, the lower values of SO2 after the TC approach indicate tissue ischaemia in the surgical corridor surrounding the corticotomy.

  1. An assessment of the quality of care for children in eighteen randomly selected district and sub-district hospitals in Bangladesh

    Directory of Open Access Journals (Sweden)

    Hoque Dewan ME

    2012-12-01

    Full Text Available Abstract Background Quality hospital care is important in ensuring that the needs of severely ill children are met to avert child mortality. However, the quality of hospital care for children in developing countries has often been found poor. As the first step of a country road map for improving hospital care for children, we assessed the baseline situation with respect to the quality of care provided to children under five years of age in district and sub-district level hospitals in Bangladesh. Methods Using adapted World Health Organization (WHO) hospital assessment tools and standards, an assessment of 18 randomly selected district (n=6) and sub-district (n=12) hospitals was undertaken. Teams of trained assessors used direct case observation, record review, interviews, and Management Information System (MIS) data to assess the quality of clinical case management and monitoring; infrastructure, processes and hospital administration; and essential hospital and laboratory supports, drugs and equipment. Results Findings demonstrate that the overall quality of care provided in these hospitals was poor. No hospital had a functioning triage system to prioritise those children most in need of immediate care. Laboratory supports and essential equipment were deficient. Only one hospital had all of the essential drugs for paediatric care. Less than a third of hospitals had a back-up power supply, and just under half had functioning arrangements for safe drinking water. Clinical case management was found to be sub-optimal for prevalent illnesses, as was the quality of neonatal care. Conclusion Action is needed to improve the quality of paediatric care in hospital settings in Bangladesh, with a particular need to invest in improving newborn care.

  2. Proceedings: Twenty years of energy policy: Looking toward the twenty-first century

    Energy Technology Data Exchange (ETDEWEB)

    1992-12-31

    In 1973, immediately following the Arab Oil Embargo, the Energy Resources Center, University of Illinois at Chicago initiated an innovative annual public service program called the Illinois Energy Conference. The objective was to provide a public forum each year to address an energy or environmental issue critical to the state, region and nation. Twenty years have passed since that inaugural program, and during that period we have covered a broad spectrum of issues including energy conservation, nuclear power, Illinois coal, energy policy options, natural gas, alternative fuels, new energy technologies, utility deregulation and the National Energy Strategy.

  3. On Random Numbers and Design

    Science.gov (United States)

    Ben-Ari, Morechai

    2004-01-01

    The term "random" is frequently used in discussion of the theory of evolution, even though the mathematical concept of randomness is problematic and of little relevance in the theory. Therefore, since the core concept of the theory of evolution is the non-random process of natural selection, the term random should not be used in teaching the…

  4. Paleolithic nutrition: twenty-five years later.

    Science.gov (United States)

    Konner, Melvin; Eaton, S Boyd

    2010-12-01

    A quarter century has passed since the first publication of the evolutionary discordance hypothesis, according to which departures from the nutrition and activity patterns of our hunter-gatherer ancestors have contributed greatly and in specifically definable ways to the endemic chronic diseases of modern civilization. Refinements of the model have changed it in some respects, but anthropological evidence continues to indicate that ancestral human diets prevalent during our evolution were characterized by much lower levels of refined carbohydrates and sodium, much higher levels of fiber and protein, and comparable levels of fat (primarily unsaturated fat) and cholesterol. Physical activity levels were also much higher than current levels, resulting in higher energy throughput. We said at the outset that such evidence could only suggest testable hypotheses and that recommendations must ultimately rest on more conventional epidemiological, clinical, and laboratory studies. Such studies have multiplied and have supported many aspects of our model, to the extent that in some respects, official recommendations today have targets closer to those prevalent among hunter-gatherers than did comparable recommendations 25 years ago. Furthermore, doubts have been raised about the necessity for very low levels of protein, fat, and cholesterol intake common in official recommendations. Most impressively, randomized controlled trials have begun to confirm the value of hunter-gatherer diets in some high-risk groups, even as compared with routinely recommended diets. Much more research needs to be done, but the past quarter century has proven the interest and heuristic value, if not yet the ultimate validity, of the model.

  5. The Patient Deficit Model Overturned: a qualitative study of patients' perceptions of invitation to participate in a randomized controlled trial comparing selective bladder preservation against surgery in muscle invasive bladder cancer (SPARE, CRUK/07/011).

    Science.gov (United States)

    Moynihan, Clare; Lewis, Rebecca; Hall, Emma; Jones, Emma; Birtle, Alison; Huddart, Robert

    2012-11-29

    Evidence suggests that poor recruitment into clinical trials rests on a patient 'deficit' model - an inability to comprehend trial processes. Poor communication has also been cited as a possible barrier to recruitment. A qualitative patient interview study was included within the feasibility stage of a phase III non-inferiority Randomized Controlled Trial (RCT) (SPARE, CRUK/07/011) in muscle invasive bladder cancer. The aim was to illuminate problems in the context of randomization. The qualitative study used a 'Framework Analysis' that included 'constant comparison' in which semi-structured interviews are transcribed, analyzed, compared and contrasted both between and within transcripts. Three researchers coded and interpreted data. Twenty-four patients agreed to enter the interview study; 10 decliners of randomization and 14 accepters, of whom 2 subsequently declined their allocated treatment. The main theme applying to the majority of the sample was confusion and ambiguity. There was little indication that confusion directly impacted on decisions to enter the SPARE trial. However, confusion did appear to impact on ethical considerations surrounding 'informed consent', as well as cause a sense of alienation between patients and health personnel. Sub-optimal communication in many guises accounted for the confusion, together with the logistical elements of a trial that involved treatment options delivered in a number of geographical locations. These data highlight the difficulty of providing balanced and clear trial information within the UK health system, despite best intentions. Involvement of multiple professionals can impact on communication processes with patients who are considering participation in RCTs. Our results led us to question the 'deficit' model of patient behavior. It is suggested that health professionals might consider facilitating a context in which patients feel fully included in the trial enterprise and potentially consider alternatives to

  6. The Patient Deficit Model Overturned: a qualitative study of patients' perceptions of invitation to participate in a randomized controlled trial comparing selective bladder preservation against surgery in muscle invasive bladder cancer (SPARE, CRUK/07/011

    Directory of Open Access Journals (Sweden)

    Moynihan Clare

    2012-11-01

    Full Text Available Abstract Background Evidence suggests that poor recruitment into clinical trials rests on a patient ‘deficit’ model – an inability to comprehend trial processes. Poor communication has also been cited as a possible barrier to recruitment. A qualitative patient interview study was included within the feasibility stage of a phase III non-inferiority Randomized Controlled Trial (RCT) (SPARE, CRUK/07/011) in muscle invasive bladder cancer. The aim was to illuminate problems in the context of randomization. Methods The qualitative study used a ‘Framework Analysis’ that included ‘constant comparison’ in which semi-structured interviews are transcribed, analyzed, compared and contrasted both between and within transcripts. Three researchers coded and interpreted data. Results Twenty-four patients agreed to enter the interview study; 10 decliners of randomization and 14 accepters, of whom 2 subsequently declined their allocated treatment. The main theme applying to the majority of the sample was confusion and ambiguity. There was little indication that confusion directly impacted on decisions to enter the SPARE trial. However, confusion did appear to impact on ethical considerations surrounding ‘informed consent’, as well as cause a sense of alienation between patients and health personnel. Sub-optimal communication in many guises accounted for the confusion, together with the logistical elements of a trial that involved treatment options delivered in a number of geographical locations. Conclusions These data highlight the difficulty of providing balanced and clear trial information within the UK health system, despite best intentions. Involvement of multiple professionals can impact on communication processes with patients who are considering participation in RCTs. Our results led us to question the ‘deficit’ model of patient behavior. It is suggested that health professionals might consider facilitating a context in which patients

  7. Randomization tests

    CERN Document Server

    Edgington, Eugene

    2007-01-01

    Statistical Tests That Do Not Require Random Sampling; Randomization Tests; Numerical Examples; Randomization Tests and Nonrandom Samples; The Prevalence of Nonrandom Samples in Experiments; The Irrelevance of Random Samples for the Typical Experiment; Generalizing from Nonrandom Samples; Intelligibility; Respect for the Validity of Randomization Tests; Versatility; Practicality; Precursors of Randomization Tests; Other Applications of Permutation Tests; Questions and Exercises; Notes; References; Randomized Experiments; Unique Benefits of Experiments; Experimentation without Mani
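
    The core procedure these chapters cover is easy to illustrate: permute the group labels many times and ask how often the re-randomized difference is at least as large as the observed one. The Python sketch below uses made-up scores and an arbitrary number of permutations chosen for the demo; it is a generic two-sample randomization test, not an example taken from the book.

        import random

        def randomization_test(group_a, group_b, n_permutations=10000, seed=0):
            """Two-sample randomization test for a difference in means."""
            rng = random.Random(seed)
            observed = sum(group_a) / len(group_a) - sum(group_b) / len(group_b)
            pooled = list(group_a) + list(group_b)
            count = 0
            for _ in range(n_permutations):
                rng.shuffle(pooled)                      # re-randomize the group labels
                perm_a, perm_b = pooled[:len(group_a)], pooled[len(group_a):]
                diff = sum(perm_a) / len(perm_a) - sum(perm_b) / len(perm_b)
                if abs(diff) >= abs(observed):
                    count += 1
            return observed, count / n_permutations      # two-sided p-value

        if __name__ == "__main__":
            # Made-up scores from a small two-arm experiment.
            a = [23, 30, 28, 35, 27, 31]
            b = [22, 25, 24, 29, 21, 26]
            obs, p = randomization_test(a, b)
            print(f"observed difference {obs:.2f}, p = {p:.3f}")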

  8. A Multi-Center, Randomized, Controlled, Pivotal Study to Assess the Safety and Efficacy of a Selective Cytopheretic Device in Patients with Acute Kidney Injury.

    Directory of Open Access Journals (Sweden)

    James A Tumlin

    Full Text Available Acute kidney injury (AKI) is a highly morbid condition in critically ill patients that is associated with high mortality. Previous clinical studies have demonstrated the safety and efficacy of the Selective Cytopheretic Device (SCD) in the treatment of AKI requiring continuous renal replacement therapy in the intensive care unit (ICU). In a randomized, controlled trial of 134 ICU patients with AKI, 69 received continuous renal replacement therapy (CRRT) alone and 65 received SCD therapy. No significant difference in 60-day mortality was observed between the treated (27/69; 39%) and control patients (21/59; 36%), with six patients lost to follow up in the intention to treat (ITT) analysis. Of the 19 SCD subjects (CRRT+SCD) and 31 control subjects (CRRT alone) who maintained a post-filter ionized calcium (iCa) level in the protocol's recommended range (≤ 0.4 mmol/L) for greater or equal to 90% of the therapy time, 60-day mortality was 16% (3/19) in the SCD group compared to 41% (11/27) in the CRRT alone group (p = 0.11). Dialysis dependency showed a borderline statistically significant difference between the SCD treated versus control CRRT alone patients maintained for ≥ 90% of the treatment in the protocol's recommended (riCa) target range of ≤ 0.4 mmol/L, with values of 0% (0/16) and 25% (4/16), respectively (P = 0.10). When the riCa treated and control subgroups were compared for a composite index of 60 day mortality and dialysis dependency, the percentage of SCD treated subjects was 16% versus 58% in the control subjects (p<0.01). The incidence of serious adverse events did not differ between the treated (45/69; 65%) and control groups (40/65; 63%; p = 0.86). SCD therapy may improve mortality and reduce dialysis dependency in a tightly controlled regional hypocalcaemic environment in the perfusion circuit. ClinicalTrials.gov NCT01400893 http://clinicaltrials.gov/ct2/show/NCT01400893.

  9. A randomized control trial to evaluate the effect of adjuvant selective laser trabeculoplasty versus medication alone in primary open-angle glaucoma: preliminary results

    Directory of Open Access Journals (Sweden)

    Lee JWY

    2014-09-01

    Full Text Available Jacky WY Lee,1,2 Catherine WS Chan,2 Mandy OM Wong,3 Jonathan CH Chan,3 Qing Li,2 Jimmy SM Lai2 1The Department of Ophthalmology, Caritas Medical Centre, 2The Department of Ophthalmology, The University of Hong Kong, 3The Department of Ophthalmology, Queen Mary Hospital, Hong Kong Background: The objective of this study was to investigate the effects of adjuvant selective laser trabeculoplasty (SLT) versus medication alone on intraocular pressure (IOP) control, medication use, and quality of life in patients with primary open-angle glaucoma. Methods: This prospective, randomized control study recruited 41 consecutive primary open-angle glaucoma subjects with medically-controlled IOP ≤21 mmHg. The SLT group (n=22) received a single 360-degree SLT treatment. The medication-only group (n=19) continued with their usual treatment regimen. In both groups, medication was titrated to maintain a target IOP defined as a 25% reduction from baseline IOP without medication, or <18 mmHg, whichever was lower. Outcomes, which were measured at baseline and at 6 months, included the Glaucoma Quality of Life-15 (GQL-15) and Comparison of Ophthalmic Medications for Tolerability (COMTOL) survey scores, IOP, and the number of antiglaucoma medicines. Results: The baseline IOP was 15.8±2.7 mmHg and 14.5±2.5 mmHg in the SLT and medication-only groups, respectively (P=0.04). Both groups had a comparable number of baseline medications (P=0.2) and comparable GQL-15 (P=0.3) and COMTOL scores (P=0.7). At 6 months, the SLT group had a lower IOP (P=0.03) and required fewer medications compared with both baseline (P<0.0001) and the medication-only group (P=0.02). There was no statistically significant difference in the 6-month GQL-15 or COMTOL score as compared to baseline (P≥0.4) or between the two treatment groups (P≥0.2). Conclusion: A single session of adjuvant SLT provided further reductions in IOP and medication without substantial changes in quality of life or medication tolerability at 6

  10. A prospective randomized multicenter trial of amnioreduction versus selective fetoscopic laser photocoagulation for the treatment of severe twin–twin transfusion syndrome

    Science.gov (United States)

    Crombleholme, Timothy M.; Shera, David; Lee, Hanmin; Johnson, Mark; D’Alton, Mary; Porter, Flint; Chyu, Jacquelyn; Silver, Richard; Abuhamad, Alfred; Saade, George; Shields, Laurence; Kauffman, David; Stone, Joanne; Albanese, Craig T.; Bahado-Singh, Ray; Ball, Robert H.; Bilaniuk, Larissa; Coleman, Beverly; Farmer, Diana; Feldstein, Vickie; Harrison, Michael R.; Hedrick, Holly; Livingston, Jeffrey; Lorenz, Robert P.; Miller, David A.; Norton, Mary E.; Polzin, William J.; Robinson, Julian N.; Rychik, Jack; Sandberg, Per L.; Seri, Istvan; Simon, Erin; Simpson, Lynn L.; Yedigarova, Larisa; Wilson, R. Douglas; Young, Bruce

    2009-01-01

    Objective To examine the effect of selective fetoscopic laser photocoagulation (SFLP) versus serial amnioreduction (AR) on perinatal mortality in severe twin-twin transfusion syndrome (TTTS). Study Design 5-year multicenter prospective randomized controlled trial. The primary outcome variable was 30-day postnatal survival of donors and recipients. Results There is no statistically significant difference in 30-day postnatal survival between SFLP or AR treatment for donors at 55% (11/20) vs 55% (11/20) (p=1, OR=1, 95%CI=0.242 to 4.14) or recipients at 30% (6/20) vs 45% (9/20) (p=0.51, OR=1.88, 95%CI=0.44 to 8.64). There is no difference in 30-day survival of one or both twins on a per pregnancy basis between AR at 75% (15/20) and SFLP at 65% (13/20) (p=0.73, OR=1.62, 95%CI=0.34 to 8.09). Overall survival (newborns divided by the number of fetuses treated) is not statistically significant for AR at 60% (24/40) vs SFLP 45% (18/40) (p=0.18, OR=2.01, 95%CI=0.76 to 5.44). There is a statistically significant increase in fetal recipient mortality in the SFLP arm at 70% (14/20) versus the AR arm at 35% (7/20) (p=0.25, OR=5.31, 95%CI=1.19 to 27.6). This is offset by increased recipient neonatal mortality of 30% (6/20) in the AR arm. Echocardiographic abnormality in recipient twin Cardiovascular Profile Score is the most significant predictor of recipient mortality (p=0.055, OR=3.025/point) by logistic regression analysis. Conclusions The outcome of the trial does not conclusively determine whether AR or SFLP is a superior treatment modality. TTTS cardiomyopathy appears to be an important factor in recipient survival in TTTS. PMID:17904975

  11. Selective decontamination of the oral and digestive tract in surgical versus non-surgical patients in intensive care in a cluster-randomized trial.

    NARCIS (Netherlands)

    Melsen, W.G.; Smet, A.M. de; Kluytmans, J.A.; Bonten, M.J.; Pickkers, P.

    2012-01-01

    BACKGROUND: Selective digestive decontamination (SDD) and selective oropharyngeal decontamination (SOD) are effective in improving survival in patients under intensive care. In this study possible differential effects in surgical and non-surgical patients were investigated. METHODS: This was a post

  12. Boosted Random Forest

    National Research Council Canada - National Science Library

    MISHINA, Yohei; MURATA, Ryuei; YAMAUCHI, Yuji; YAMASHITA, Takayoshi; FUJIYOSHI, Hironobu

    2015-01-01

    .... Within machine learning, a Random Forest is a multi-class classifier with high-performance classification, achieved using bagging and feature selection, and is capable of high-speed training and classification...
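
    The record above summarizes the standard Random Forest recipe (bagged trees with a random feature subset considered at each split) on which the boosted variant builds. As a minimal sketch of that baseline only, and not of the paper's boosted method, the following uses scikit-learn with an assumed toy dataset and illustrative hyperparameters.

```python
# Minimal random-forest illustration (standard bagging + random feature
# selection per split); the boosted variant described in the record is
# not implemented here. Dataset and hyperparameters are illustrative.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)                 # small multi-class toy dataset
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = RandomForestClassifier(
    n_estimators=200,       # number of bagged trees
    max_features="sqrt",    # random feature subset considered at each split
    random_state=0,
)
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```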

  13. Selective laser melting: a unit cell approach for the manufacture of porous, titanium, bone in-growth constructs, suitable for orthopedic applications. II. Randomized structures.

    Science.gov (United States)

    Mullen, Lewis; Stamp, Robin C; Fox, Peter; Jones, Eric; Ngo, Chau; Sutcliffe, Christopher J

    2010-01-01

    In this study, the unit cell approach, which has previously been demonstrated as a method of manufacturing porous components suitable for use as orthopedic implants, has been further developed to include randomized structures. These random structures may aid the bone in-growth process because of their similarity in appearance to trabecular bone and are shown to carry legacy properties that can be related back to the original unit cell on which they are ultimately based. In addition to this, it has been shown that randomization improves the mechanical properties of regular unit cell structures, resulting in anticipated improvements to both implant functionality and longevity. The study also evaluates the effect that a post process sinter cycle has on the components, outlines the improved mechanical properties that are attainable, and also the changes in both the macro and microstructure that occur.

  14. [The Gulf War Syndrome twenty years on].

    Science.gov (United States)

    Auxéméry, Y

    2013-10-01

    opposition or continuity links between the objective external exposure (smoke from oil wells, depleted uranium, biological agents, chemicals) and the share of inner emotion, albeit reactive and characterised by a subjective stress. There was no lack of stress factors for the troops deployed: repeated alerts of chemical attacks, hostility of the environment with its sandstorms and venomous animals, climatic conditions making long hours of guard duty and static observation difficult, collecting bodies, lack of knowledge of the precise geography of their movements and uncertainty about the duration of the conflict. The military anti-nuclear-bacteriological-chemical suit admittedly provided protective confinement, shutting out the hostile world from which the threat would come, but at the same time this isolation increased the fear of a hypothetical risk whilst heightening internal perceptions, opening the way to future somatisations. In a context like this, the somatic manifestations of anxiety (palpitations, sweating, paresthesia…) are readily associated with somatised functional disorders, to which over-interpretations of bodily feelings, following a hypochondriacal mechanism, can also be attributed. Selective attention to somatic perceptions in the absence of mentalisation, reiterated requests for reassurance and excessive use of the treatment system are diagnostic indices of these stress-induced symptoms. Rather than toxic exposure to such and such a substance, the non-specific syndrome called "Gulf War Syndrome" is the result of exposure to the eponymous operational theatre. But while the psychological and psychosomatic suffering occurring in veterans is constant throughout history, the expression of these difficulties takes specific forms according to the prevailing cultural, political and scientific context. In the example of GWS, the diffusion of the fear of a pathology resulting from chemical weapons has promoted this phenomenon. In the end

  15. Evaluation of Effect of Cannabis Smoking on the Hematological Properties of Selected Adult Male Students Smokers

    OpenAIRE

    Nwaichi E. O.; Omorodion, F. O

    2015-01-01

    The study investigated the effect of Cannabis sativa smoking on some hematological characteristics of male student consumers. Blood samples were collected in triplicate from twenty (10) randomly selected male voluntary marijuana smokers (test) and ten (10) voluntary male non-smokers (control) in Choba Community, Port Harcourt, Rivers State. The parameters considered were body temperature, pulse rate, red blood cell (RBC) count, white blood cell (WBC) count, packed cell volume (PCV), er...

  16. Yangzhou’s Famous Twenty-fourth Bridge

    Institute of Scientific and Technical Information of China (English)

    1992-01-01

    "Looming green mountains and running streams; grass does not wither and fall, though the autumn has come to an end in the south. The bright moon arises over Twenty-fourth Bridge. Where do you teach pure-jade Yangzhou women to play music on bamboo flutes?" This poem by Du Mu (803-c.852), a famous poet of the late Tang Dynasty, is well remembered today. It made Yangzhou's Twenty-fourth Bridge known to later generations. Of many ancient poems about Twenty-fourth Bridge

  17. The effect of barusiban, a selective oxytocin antagonist, in threatened preterm labor at late gestational age: a randomized, double-blind, placebo-controlled trial

    DEFF Research Database (Denmark)

    Thornton, Steven; Goodwin, Thomas M; Greisen, Gorm;

    2009-01-01

    OBJECTIVE: The objective of the study was to compare barusiban with placebo in threatened preterm labor. STUDY DESIGN: This was a randomized, double-blind, placebo-controlled, multicenter study. One hundred sixty-three women at 34-35 weeks plus 6 days, and with 6 or more contractions of 30 second...

  18. Prospective Randomized Trial Comparing the 1-Stage with the 2-Stage Implantation of a Pulse Generator in Patients with Pelvic Floor Dysfunction Selected for Sacral Nerve Stimulation.

    NARCIS (Netherlands)

    Everaert, Karel; Kerckhaert, Wim; Caluwaerts, Hilde; Audenaert, M; Vereecke, Hugo Eric Marc; De Cuypere, G; Boelaert, A; Van den Hombergh, U; Oosterlinck, Wim A

    2004-01-01

    Abstract Objective: The aim of this study was to evaluate in a prospective, randomized setting if the 2-stage implant, compared to a 1-stage implant, leads to a superior subjective or objective outcome of sacral nerve stimulation after implantation of the pulse generator in patients with lower urina

  19. [An open randomized comparative trial of efficacy and safety of selective alpha-adrenoblocker setegis (terazosin) in therapy of patients with chronic bacterial prostatitis].

    Science.gov (United States)

    Trapeznikova, M F; Morozov, A P; Dutov, V V; Urenkov, S B; Pozdniakov, K V; Bychkova, N V

    2007-01-01

    An open randomized comparative trial of setegis (terazosin) has shown good subjective and objective results in patients with chronic bacterial prostatitis. The drug is well tolerated and produces insignificant side effects. It is also demonstrated that combined therapy with alpha-adrenoblockers is more effective than monotherapy with antibacterial drugs in patients with bacterial prostatitis.

  1. Twenty-Seventh Symposium (International) on Combustion. Volume 1

    Science.gov (United States)

    1998-01-01

  2. Random duality

    Institute of Scientific and Technical Information of China (English)

    GUO TieXin; CHEN XinXiang

    2009-01-01

    The purpose of this paper is to provide a random duality theory for the further development of the theory of random conjugate spaces for random normed modules. First, the complicated stratification structure of a module over the algebra L(μ,K) frequently makes our investigations into random duality theory considerably different from the corresponding ones into classical duality theory, so in this paper we first have to overcome several substantial obstacles to the study of stratification structure on random locally convex modules. Then, we give the representation theorem of weakly continuous canonical module homomorphisms, the theorem of existence of random Mackey structure, and the random bipolar theorem with respect to a regular random duality pair together with some important random compatible invariants.

  3. Random duality

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    The purpose of this paper is to provide a random duality theory for the further development of the theory of random conjugate spaces for random normed modules. First, the complicated stratification structure of a module over the algebra L(μ, K) frequently makes our investigations into random duality theory considerably different from the corresponding ones into classical duality theory, so in this paper we first have to overcome several substantial obstacles to the study of stratification structure on random locally convex modules. Then, we give the representation theorem of weakly continuous canonical module homomorphisms, the theorem of existence of random Mackey structure, and the random bipolar theorem with respect to a regular random duality pair together with some important random compatible invariants.

  4. GeneSrF and varSelRF: a web-based tool and R package for gene selection and classification using random forest

    Directory of Open Access Journals (Sweden)

    Diaz-Uriarte Ramón

    2007-09-01

    Full Text Available Abstract Background Microarray data are often used for patient classification and gene selection. An appropriate tool for end users and biomedical researchers should combine user friendliness with statistical rigor, including carefully avoiding selection biases and allowing analysis of multiple solutions, together with access to additional functional information of selected genes. Methodologically, such a tool would be of greater use if it incorporates state-of-the-art computational approaches and makes source code available. Results We have developed GeneSrF, a web-based tool, and varSelRF, an R package, that implement, in the context of patient classification, a validated method for selecting very small sets of genes while preserving classification accuracy. Computation is parallelized, allowing users to take advantage of multicore CPUs and clusters of workstations. Output includes bootstrapped estimates of prediction error rate, and assessments of the stability of the solutions. Clickable tables link to additional information for each gene (GO terms, PubMed citations, KEGG pathways), and output can be sent to PaLS for examination of PubMed references, GO terms, and KEGG and Reactome pathways characteristic of sets of genes selected for class prediction. The full source code is available, allowing users to extend the software. The web-based application is available from http://genesrf2.bioinfo.cnio.es. All source code is available from Bioinformatics.org or The Launchpad. The R package is also available from CRAN. Conclusion varSelRF and GeneSrF implement a validated method for gene selection including bootstrap estimates of classification error rate. They are valuable tools for applied biomedical researchers, especially for exploratory work with microarray data. Because of the underlying technology used (combination of parallelization with web-based application) they are also of methodological interest to bioinformaticians and biostatisticians.
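
    varSelRF itself is an R package; the sketch below is only a rough Python analogue of the backward-elimination idea it describes (repeatedly discarding the least important genes while tracking out-of-bag error). The synthetic data, the 20% elimination fraction, and the stopping rule are assumptions, not the package's actual defaults.

```python
# Hedged Python analogue of random-forest-based gene selection by backward
# elimination (the actual varSelRF package is written in R; details differ).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=80, n_features=500, n_informative=10,
                           random_state=0)        # synthetic "microarray"
genes = np.arange(X.shape[1])

while len(genes) > 10:
    rf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
    rf.fit(X[:, genes], y)
    print(f"{len(genes):4d} genes  OOB accuracy = {rf.oob_score_:.3f}")
    # Keep all but the least important ~20% of the currently retained genes.
    keep = np.argsort(rf.feature_importances_)[int(0.2 * len(genes)):]
    genes = genes[np.sort(keep)]

print("selected genes:", genes)
```

    As the abstract stresses, an honest error estimate for the finally selected genes should come from repeating the whole elimination inside a bootstrap or cross-validation loop; otherwise the estimate suffers from selection bias.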

  5. Adjacent, distal, or combination of point-selective effects of acupuncture on temporomandibular joint disorders: A randomized, single-blind, assessor-blind controlled trial

    Directory of Open Access Journals (Sweden)

    Kyung-Won Kang

    2012-12-01

    Conclusion: Our results suggest that point-selective effects among adjacent, distal, or a combination of acupoints are hardly associated with pain intensity or palpation index in participants with TMDs. Larger sample size trials are required to overcome the shortcomings of the study.

  6. Population and business exposure to twenty scenario earthquakes in the State of Washington

    Science.gov (United States)

    Wood, Nathan; Ratliff, Jamie

    2011-01-01

    This report documents the results of an initial analysis of population and business exposure to scenario earthquakes in Washington. This analysis was conducted to support the U.S. Geological Survey (USGS) Pacific Northwest Multi-Hazards Demonstration Project (MHDP) and an ongoing collaboration between the State of Washington Emergency Management Division (WEMD) and the USGS on earthquake hazards and vulnerability topics. This report was developed to help WEMD meet internal planning needs. A subsequent report will provide analysis to the community level. The objective of this project was to use scenario ground-motion hazard maps to estimate population and business exposure to twenty Washington earthquakes. In consultation with the USGS Earthquake Hazards Program and the Washington Division of Geology and Natural Resources, the twenty scenario earthquakes were selected by WEMD (fig. 1). Hazard maps were then produced by the USGS and placed in the USGS ShakeMap archive.

  7. Effects of the Adenosine A(1) Receptor Antagonist Rolofylline on Renal Function in Patients With Acute Heart Failure and Renal Dysfunction Results From PROTECT (Placebo-Controlled Randomized Study of the Selective A(1) Adenosine Receptor Antagonist Rolofylline for Patients Hospitalized With Acute Decompensated Heart Failure and Volume Overload to Assess Treatment Effect on Congestion and Renal Function)

    NARCIS (Netherlands)

    Voors, Adriaan A.; Dittrich, Howard C.; Massie, Barry M.; DeLucca, Paul; Mansoor, George A.; Metra, Marco; Cotter, Gad; Weatherley, Beth D.; Ponikowski, Piotr; Teerlink, John R.; Cleland, John G. F.; O'Connor, Christopher M.; Givertz, Michael M.

    2011-01-01

    Objectives This study sought to assess the effects of rolofylline on renal function in patients with acute heart failure (AHF) and renal dysfunction randomized in PROTECT (Placebo-Controlled Randomized Study of the Selective A(1) Adenosine Receptor Antagonist Rolofylline for Patients Hospitalized Wi

  8. Twenty-Sixth Symposium (International) on Combustion, Volume 1.

    Science.gov (United States)

    1997-04-01

  9. Proceedings of the Twenty Second Nordic Seminar on Computational Mechanics

    DEFF Research Database (Denmark)

    This book contains the proceedings of the Twenty Second Nordic Seminar on Computational Mechanics (NSCM22), which took place 22-23 October 2009 at Aalborg University, Denmark. The papers presented at the Optimization Seminar in Honour of Niels Olhoff, held 21 October 2009 at Aalborg University, Denmark...

  10. The Work Place of the Early Twenty-First Century.

    Science.gov (United States)

    Brown, James M.

    1991-01-01

    Major issues affecting the workplace of the twenty-first century include productivity growth, globalization, resistance to change, worker alienation, and telecommunications. Opposing views of technology are that (1) it will improve the economy and create jobs or (2) the majority of new jobs will not require high skills. (SK)

  11. Educators Guide to Free Filmstrips. Twenty-Third Edition.

    Science.gov (United States)

    Horkheimer, Mary Foley, Comp.; Diffor, John C., Comp.

    A total of 453 titles of filmstrips, slide sets, and sets of transparencies available free of charge to educators are listed in this guide. More than 20,000 separate frames are offered from 95 sources. Twenty of the filmstrips may be retained permanently by the borrower. The films cover topics in the fields of accident prevention, aerospace…

  12. Membership, Belonging, and Identity in the Twenty-First Century

    Science.gov (United States)

    Motteram, Gary

    2016-01-01

    This article takes a case study approach to exploring membership, belonging, and identity amongst English language teachers in the twenty-first century. It explores findings from two membership surveys conducted for the International Association of Teachers of English as a Foreign Language (IATEFL), and considers the impact of recommendations…

  13. Tall Fescue for the Twenty-first Century

    Science.gov (United States)

    Tall Fescue for the Twenty-first Century is a comprehensive monograph by experts from around the world about the science of tall fescue [Lolium arundinaceum (Schreb.) Darbysh. = Schedonorus arundinaceus (Schreb.) Dumort., formerly Festuca arundinacea Schreb. var. arundinacea] and its applications. ...

  14. Powering into the twenty-first century [Singapore Power Limited

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1997-07-01

    To meet the challenges of the twenty-first century power industry, Singapore Power was incorporated as a commercial entity in October 1995. As the leading energy company in Singapore, SP continues to invest heavily in infrastructure development to improve its service efficiency and reliability, and to maintain its reputation as one of the world's best power suppliers. (UK)

  15. Afterword: Victorian Sculpture for the Twenty-First Century

    Directory of Open Access Journals (Sweden)

    David J. Getsy

    2016-06-01

    Full Text Available Commenting on the directions proposed by this issue of '19', the afterword discusses the broad trends in twenty-first century studies of Victorian sculpture and the opportunity for debate arising from the first attempt at a comprehensive exhibition.

  17. Blended Instruction: The Roaring Twenties Meets Coursesites.com

    Science.gov (United States)

    Waldron, Diane L.

    2014-01-01

    The action research study described in this report outlines the design and implementation of a unit of blended instruction in a traditional high school English classroom. Twenty technical high school students in an 11th grade Honors English class engaged in a variety of internet-based activities in conjunction with traditional learning activities…

  18. Twenty-One: a baseline for multilingual multimedia retrieval

    NARCIS (Netherlands)

    Jong, de Franciska; Hiemstra, Djoerd; Jong, de Franciska; Netter, Klaus

    1998-01-01

    In this paper we will give a short overview of the ideas underpinning the demonstrator developed within the EU-funded project Twenty-One; this system provides for the disclosure of information in a heterogeneous document environment that includes documents of different types and languages. As part o

  19. Twenty years of physics at MAMI -What did it mean?

    Energy Technology Data Exchange (ETDEWEB)

    Mecking, B.A. [Thomas Jefferson National Accelerator Facility, Newport News, VA (United States)

    2006-05-15

    The development over the last twenty years of the physics program and the experimental facilities at the Mainz Microtron MAMI will be reviewed. Ground-breaking contributions have been made to the development of experimental techniques and to our understanding of the structure of nucleons and nuclei. (orig.)

  20. The Presidential Platform on Twenty-First Century Education Goals

    Science.gov (United States)

    Tichnor-Wagner, Ariel; Socol, Allison Rose

    2016-01-01

    As social and economic problems change, so do the goals of education reformers. This content analysis of presidential debates transcripts, state of the union addresses, and education budgets from 2000 to 2015 reveals the ways in which presidents and presidential candidates have framed education goals thus far in the twenty-first century. Using…

  1. Twenty years of physics at MAMI --What did it mean?

    Energy Technology Data Exchange (ETDEWEB)

    Bernhard Mecking

    2006-06-01

    The development over the last twenty years of the physics program and the experimental facilities at the Mainz Microtron MAMI will be reviewed. Ground-breaking contributions have been made to the development of experimental techniques and to our understanding of the structure of nucleons and nuclei.

  2. The Work Place of the Early Twenty-First Century.

    Science.gov (United States)

    Brown, James M.

    1991-01-01

    Major issues affecting the workplace of the twenty-first century include productivity growth, globalization, resistance to change, worker alienation, and telecommunications. Opposing views of technology are that (1) it will improve the economy and create jobs or (2) the majority of new jobs will not require high skills. (SK)

  3. Twenty-One: a baseline for multilingual multimedia retrieval

    NARCIS (Netherlands)

    Hiemstra, Djoerd; de Jong, Franciska M.G.; Netter, Klaus

    1998-01-01

    In this paper we will give a short overview of the ideas underpinning the demonstrator developed within the EU-funded project Twenty-One; this system provides for the disclosure of information in a heterogeneous document environment that includes documents of different types and languages. As part o

  4. Digital earth applications in the twenty-first century

    NARCIS (Netherlands)

    de By, R.A.; Georgiadou, P.Y.

    2014-01-01

    In these early years of the twenty-first century, we must look at how the truly cross-cutting information technology supports other innovations, and how it will fundamentally change the information positions of government, private sector and the scientific domain as well as the citizen. In those

  5. Randomized Comparison of Selective Internal Radiotherapy (SIRT) Versus Drug-Eluting Bead Transarterial Chemoembolization (DEB-TACE) for the Treatment of Hepatocellular Carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Pitton, Michael B., E-mail: michael.pitton@unimedizin-mainz.de; Kloeckner, Roman [Johannes Gutenberg University Medical Center, Department of Diagnostic and Interventional Radiology (Germany); Ruckes, Christian [Johannes Gutenberg University Medical Center, IZKS (Germany); Wirth, Gesine M. [Johannes Gutenberg University Medical Center, Department of Diagnostic and Interventional Radiology (Germany); Eichhorn, Waltraud [Johannes Gutenberg University Medical Center, Department of Nuclear Medicine (Germany); Wörns, Marcus A.; Weinmann, Arndt [Johannes Gutenberg University Medical Center, Department of Internal Medicine (Germany); Schreckenberger, Mathias [Johannes Gutenberg University Medical Center, Department of Nuclear Medicine (Germany); Galle, Peter R. [Johannes Gutenberg University Medical Center, Department of Internal Medicine (Germany); Otto, Gerd [Johannes Gutenberg University Medical Center, Department of Transplantation Surgery (Germany); Dueber, Christoph [Johannes Gutenberg University Medical Center, Department of Diagnostic and Interventional Radiology (Germany)

    2015-04-15

    Purpose To prospectively compare SIRT and DEB-TACE for treating hepatocellular carcinoma (HCC). Methods From 04/2010–07/2012, 24 patients with histologically proven unresectable N0, M0 HCCs were randomized 1:1 to receive SIRT or DEB-TACE. SIRT could be repeated once in case of recurrence, whereas TACE was repeated every 6 weeks until no viable tumor tissue was detected by MRI or contraindications prohibited further treatment. Patients were followed up by MRI every 3 months; the final evaluation was 05/2013. Results Both groups were comparable in demographics (SIRT: 8 males/4 females, mean age 72 ± 7 years; TACE: 10 males/2 females, mean age 71 ± 9 years), initial tumor load (1 patient ≥25% in each group), and BCLC (Barcelona Clinic Liver Cancer) stage (SIRT: 12×B; TACE: 1×A, 11×B). Median progression-free survival (PFS) was 180 days for SIRT versus 216 days for TACE patients (p = 0.6193), with a median TTP of 371 days versus 336 days, respectively (p = 0.5764). Median OS was 592 days for SIRT versus 788 days for TACE patients (p = 0.9271). Seven patients died in each group. Causes of death were liver failure (n = 4, SIRT group), tumor progression (n = 4, TACE group), cardiovascular events, and inconclusive (n = 1 in each group). Conclusions No significant differences were found in median PFS, OS, and TTP. The lower rate of tumor progression in the SIRT group was nullified by a greater incidence of liver failure. This pilot study is the first prospective randomized trial comparing SIRT and TACE for treating HCC, and results can be used for sample size calculations of future studies.

  6. Twenty natural organic pigments for application in dye sensitized solar cells

    Science.gov (United States)

    Castillo, D.; Sánchez Juárez, A.; Espinosa Tapia, S.; Guaman, A.; Obregón Calderón, D.

    2016-09-01

    In this work we present the results of a study of twenty natural pigments obtained from plants and insects from southern Ecuador. Many of them can be considered potential natural sensitizers for the construction of DSSCs. The results indicate that these pigments perform well in their absorbance and wavelength spectra. The four best pigments were selected for the construction of DSSCs: Rumex tolimensis Wedd., Raphanus sativus, Hibiscus sabdariffa, and Prunus serotina; however, the conversion efficiency is lower than 1%.

  7. From Randomness to Order

    Directory of Open Access Journals (Sweden)

    Jorge Berger

    2004-03-01

    Full Text Available I review some selected situations in which order builds up from randomness, or a losing trend turns into winning. Except for Section 4 (which is mine), all cases are well documented and the price paid to achieve order is apparent.

  8. A machine learning methodology for the selection and classification of spontaneous spinal cord dorsum potentials allows disclosure of structured (non-random) changes in neuronal connectivity induced by nociceptive stimulation

    Directory of Open Access Journals (Sweden)

    Mario eMartin

    2015-08-01

    Full Text Available Fractal analysis of spontaneous cord dorsum potentials (CDPs) generated in the lumbosacral spinal segments has revealed that these potentials are generated by ongoing structured (non-random) neuronal activity. Studies aimed at disclosing the changes produced by nociceptive stimulation on the functional organization of the neuronal networks generating these potentials used predetermined templates to select specific classes of spontaneous CDPs. Since this procedure was time consuming and required continuous supervision, it was limited to the analysis of two types of CDPs (negative CDPs and negative-positive CDPs), thus excluding potentials that may reflect activation of other neuronal networks of presumed functional relevance. We now present a novel procedure based on machine learning that allows the efficient and unbiased selection of a variety of spontaneous CDPs with different shapes and amplitudes. The reliability and performance of the method is evaluated by analyzing the effects on the probabilities of generation of different types of spontaneous CDPs induced by the intradermic injection of small amounts of capsaicin in the anesthetized cat. The results obtained with the selection method presently described allowed detection of spontaneous CDPs with specific shapes and amplitudes that are assumed to represent the activation of functionally coupled sets of dorsal horn neurones that acquire different, structured configurations in response to nociceptive stimuli.
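
    The abstract does not specify the learning algorithm, so the sketch below is only one hedged interpretation of template-free CDP selection: simple shape and amplitude descriptors are extracted from each waveform and grouped by k-means clustering. The synthetic waveforms, the feature set, and the choice of k-means are all assumptions for illustration; the paper's actual pipeline may differ substantially.

```python
# Hedged sketch: grouping spontaneous cord dorsum potentials (CDPs) by shape
# and amplitude without predefined templates. Features, clustering method,
# and synthetic waveforms are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
t = np.linspace(0, 50, 200)                       # ms, one 50-ms epoch per CDP

def synth_cdp(amp, width, latency):
    """Toy CDP: a smooth negative deflection plus noise."""
    return -amp * np.exp(-((t - latency) / width) ** 2) + 0.05 * rng.standard_normal(t.size)

waveforms = np.array([synth_cdp(rng.uniform(0.2, 1.0),
                                rng.uniform(3, 10),
                                rng.uniform(10, 30)) for _ in range(300)])

# Simple shape/amplitude descriptors per waveform.
features = np.column_stack([
    waveforms.min(axis=1),              # peak (negative) amplitude
    waveforms.sum(axis=1),              # area under the curve
    t[waveforms.argmin(axis=1)],        # latency of the peak
])

labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(features))
print("CDPs per putative class:", np.bincount(labels))
```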

  9. Random Matrices

    CERN Document Server

    Stephanov, M A; Wettig, T

    2005-01-01

    We review elementary properties of random matrices and discuss widely used mathematical methods for both hermitian and nonhermitian random matrix ensembles. Applications to a wide range of physics problems are summarized. This paper originally appeared as an article in the Wiley Encyclopedia of Electrical and Electronics Engineering.

  10. Promoting mobility after hip fracture (ProMo): study protocol and selected baseline results of a year-long randomized controlled trial among community-dwelling older people

    Directory of Open Access Journals (Sweden)

    Sipilä Sarianna

    2011-12-01

    Full Text Available Abstract Background To cope at their homes, community-dwelling older people surviving a hip fracture need a sufficient amount of functional ability and mobility. There is a lack of evidence on the best practices supporting recovery after hip fracture. The purpose of this article is to describe the design, intervention and demographic baseline results of a study investigating the effects of a rehabilitation program aiming to restore mobility and functional capacity among community-dwelling participants after hip fracture. Methods/Design A population-based sample of over 60-year-old community-dwelling men and women operated on for hip fracture (n = 81, mean age 79 years, 78% women) participated in this study and were randomly allocated into control (Standard Care) and ProMo intervention groups on average 10 weeks post fracture and 6 weeks after discharge to home. Standard Care included a written home exercise program with 5-7 exercises for the lower limbs. Of all participants, 12 got a referral to physiotherapy. After discharge to home, only 50% adhered to Standard Care. None of the participants were followed up for Standard Care or mobility recovery. The ProMo intervention included Standard Care and a year-long program including evaluation/modification of environmental hazards, guidance for safe walking, pain management, a progressive home exercise program and physical activity counseling. Measurements included a comprehensive battery of laboratory tests and self-reports on mobility limitation, disability, physical functional capacity and health, as well as assessments of the key prerequisites for mobility, disability and functional capacity. All assessments were performed blinded at the research laboratory. No significant differences were observed between intervention and control groups in any of the demographic variables. Discussion Ten weeks post hip fracture only half of the participants were compliant to Standard Care. No follow-up for Standard Care or

  11. NATO’s Relevance in the Twenty-First Century

    Science.gov (United States)

    2012-03-22

    rules of engagement for force protection. NATO Foreign Ministers authorized the Supreme Allied Commander Europe (SACEUR) to begin the next stage of...the mission on 9 December 2004. The activation order for this next stage was given by SACEUR on 16 December 2004. It allowed the deployment of 300... Christopher Coker, Globalisation and Insecurity in the Twenty-first Century: NATO and the Management of Risk (The International Institute for Strategic

  12. Proceedings of the twenty-first LAMPF users group meeting

    Energy Technology Data Exchange (ETDEWEB)

    1988-04-01

    The Twenty-First Annual LAMPF Users Group Meeting was held November 9-10, 1987, at the Clinton P. Anderson Meson Physics Facility. The program included a number of invited talks on various aspects of nuclear and particle physics as well as status reports on LAMPF and discussions of upgrade options. The LAMPF working groups met and discussed plans for the secondary beam lines, experimental programs, and computing facilities.

  13. About capital in the twenty-first century

    OpenAIRE

    2015-01-01

    In this article, I present three key facts about income and wealth inequality in the long run emerging from my book, Capital in the Twenty-First Century, and seek to sharpen and refocus the discussion about those trends. In particular, I clarify the role played by r > g in my analysis of wealth inequality. I also discuss some of the implications for optimal taxation, and the relation between capital-income ratios and capital shares.

  14. Technological sciences society of the twenty-first century

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-04-15

    This book introduces the information-oriented society of the twenty-first century, connected by computer networks, covering, for example: F-RAM, the "memory of dreams"; the information-oriented society and new media; ISDN, the communications network for the next generation, and what ISDN is; the development of the information service industry; the path from office automation to the intelligent building of the future; home shopping and home banking; and the obstacles that hinder the information-oriented society.

  15. The November $J / \\Psi$ Revolution Twenty-Five Years Later

    CERN Document Server

    Khare, A

    1999-01-01

    Exactly twenty five years ago the world of high energy physics was set on fire by the discovery of a new particle with an unusually narrow width at 3095 MeV, known popularly as the $J/\\Psi$ revolution. This discovery was very decisive in our understanding as well as formulating the current picture regarding the basic constituents of nature. I look back at the discovery, pointing out how unexpected, dramatic and significant it was.

  16. Proceedings of the twenty-second LAMPF users group meeting

    Energy Technology Data Exchange (ETDEWEB)

    Marinuzzi, R.

    1989-04-01

    The Twenty-Second Annual LAMPF Users Group Meeting was held October 17--18, 1988, at the Clinton P. Anderson Meson Physics Facility. The program included a number of invited talks on various aspects of nuclear and particle physics as well as status reports on LAMPF and discussions of upgrade options. The LAMPF working groups met and discussed plans for the secondary beam lines, experimental programs, and computing facilities.

  17. Early twenty-first-century droughts during the warmest climate

    Directory of Open Access Journals (Sweden)

    Felix Kogan

    2016-01-01

    Full Text Available The first 13 years of the twenty-first century have begun with a series of widespread, long and intensive droughts around the world. Extreme and severe-to-extreme intensity droughts covered 2%–6% and 7%–16% of the world's land, respectively, affecting the environment, economies and humans. These droughts reduced agricultural production, leading to food shortages, human health deterioration, poverty, regional disturbances, population migration and death. This feature article is a travelogue of the twenty-first-century global and regional droughts during the warmest years of the past 100 years. These droughts were identified and monitored with the National Oceanic and Atmospheric Administration operational space technology, called vegetation health (VH), which has the longest period of observation and provides good data quality. The VH method was used for assessment of vegetation condition or health, including drought early detection and monitoring. The VH method is based on operational satellite data estimating both land surface greenness (NDVI) and thermal conditions. The twenty-first-century droughts in the USA, Russia, Australia and the Horn of Africa were intensive, long, covered large areas and caused huge losses in agricultural production, which affected food security and led to food riots in some countries. This research also investigates drought dynamics, presenting no definite conclusion about drought intensification or/and expansion during the time of the warmest globe.
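
    As a hedged illustration of the greenness component behind the VH approach mentioned above, the snippet below computes NDVI from red and near-infrared reflectance and rescales it against an assumed multi-year climatology into a 0-100 Vegetation Condition Index; all reflectance and climatology values are synthetic, and the operational VH product also incorporates a thermal component not shown here.

```python
# Hedged illustration of the greenness side of vegetation-health (VH) drought
# monitoring: NDVI from red/NIR reflectance and a Vegetation Condition Index
# (VCI), commonly defined as a 0-100 rescaling of NDVI against its multi-year
# min/max. All values below are synthetic.
import numpy as np

red = np.array([0.08, 0.12, 0.20])        # red-band reflectance for 3 pixels
nir = np.array([0.40, 0.30, 0.22])        # near-infrared reflectance

ndvi = (nir - red) / (nir + red)           # greenness: ~0 bare soil, ~0.8 dense vegetation

# Assumed multi-year NDVI climatology for the same pixels and week of year.
ndvi_min = np.array([0.05, 0.05, 0.05])
ndvi_max = np.array([0.75, 0.70, 0.60])

vci = np.clip(100.0 * (ndvi - ndvi_min) / (ndvi_max - ndvi_min), 0, 100)  # 0 worst, 100 best
for n, v in zip(ndvi, vci):
    print(f"NDVI = {n:.2f}  VCI = {v:5.1f}")
```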

  18. The design and protocol of heat-sensitive moxibustion for knee osteoarthritis: a multicenter randomized controlled trial on the rules of selecting moxibustion location

    Directory of Open Access Journals (Sweden)

    Chi Zhenhai

    2010-06-01

    Full Text Available Abstract Background Knee osteoarthritis is a major cause of pain and functional limitation. Complementary and alternative medical approaches have been employed to relieve symptoms and to avoid the side effects of conventional medication. Moxibustion has been widely used to treat patients with knee osteoarthritis. Our past research suggested heat-sensitive moxibustion might be superior to conventional moxibustion. Our objective is to investigate the effectiveness of heat-sensitive moxibustion compared with conventional moxibustion or conventional drug treatment. Methods This study consists of a multi-centre (four centers in China), randomised, controlled trial with three parallel arms (A: heat-sensitive moxibustion; B: conventional moxibustion; C: conventional drug group). The moxibustion locations differ between A and B. Group A selects the heat-sensitization acupoint from the region consisting of Yin Lingquan (SP9), Yang Lingquan (GB34), Liang Qiu (ST34), and Xue Hai (SP10). Meanwhile, fixed acupoints are used in group B, that is Xi Yan (EX-LE5) and He Ding (EX-LE2). The conventional drug group is treated with intra-articular sodium hyaluronate injection. The outcome measures will be assessed before the treatment, on day 30 (the last moxibustion session), and 6 months after the last moxibustion session. Discussion This trial will utilize high quality trial methodologies in accordance with CONSORT guidelines. It will provide evidence for the effectiveness of moxibustion as a treatment for moderate and severe knee osteoarthritis. Moreover, the result will clarify the rules of heat-sensitive moxibustion location to improve the therapeutic effect with suspended moxibustion, and propose a new concept and a new theory of moxibustion to guide clinical practices. Trial Registration The trial is registered at Controlled Clinical Trials: ChiCTR-TRC-00000600.

  19. Effect of soothing-liver and nourishing-heart acupuncture on early selective serotonin reuptake inhibitor treatment onset for depressive disorder and related indicators of neuroimmunology: a randomized controlled clinical trial.

    Science.gov (United States)

    Liu, Yi; Feng, Hui; Mo, Yali; Gao, Jingfang; Mao, Hongjing; Song, Mingfen; Wang, Shengdong; Yin, Yan; Liu, Wenjuan

    2015-10-01

    To observe the effect of soothing-liver and nourishing-heart acupuncture on selective serotonin reuptake inhibitor (SSRI) treatment effect onset in patients with depressive disorder and related indicators of neuroimmunology. Overall, 126 patients with depressive disorder were randomly divided into a medicine group and an acupuncture-medicine group using a random number table. Patients were treated for 6 consecutive weeks. The two groups were evaluated by the Montgomery-Asberg Depression Rating Scale (MADRS) and Side Effects Rating Scale (SERS) to assess the effect of the soothing-liver and nourishing-heart acupuncture method on early onset of SSRI treatment effect. Changes in serum 5-hydroxytryptamine (5-HT) and inflammatory cytokines before and after treatment were recorded and compared between the medicine group and the acupuncture-medicine group. The acupuncture-medicine group had significantly lower MADRS scores at weeks 1, 2, 4, and 6 after treatment compared with the medicine group. After treatment, serum 5-HT in the acupuncture-medicine group was significantly higher compared with the medicine group, and the anti-inflammatory cytokines IL-4 and IL-10 were significantly higher in the acupuncture-medicine group compared with the medicine group. Soothing-liver and nourishing-heart acupuncture thus appears to promote early onset of the SSRI treatment effect in depressive disorder and can significantly reduce the adverse reactions of SSRIs. Moreover, acupuncture can enhance serum 5-HT and regulate the balance of pro-inflammatory cytokines and anti-inflammatory cytokines.

  20. Acid Rain: A Selective Bibliography. Second Edition. Bibliography Series Twenty-One.

    Science.gov (United States)

    O'Neill, Gertrudis, Comp.

    Acid rain is a term for rain, snow, or other precipitation produced from water vapor in the air reacting with emissions from automobiles, factories, power plants, and other oil and coal burning sources. When these chemical compounds, composed of sulfur oxide and nitrogen oxide, react with water vapor, the result is sulfuric acid and nitric acid.…

  1. Random Selection of Logistics Distribution Route Based on Dynamic Programming

    Institute of Scientific and Technical Information of China (English)

    赵慧娟; 汤兵勇; 张云

    2013-01-01

    Logistics distribution is an important part of e-business, and the selection of distribution routes plays an important role in improving the efficiency of logistics enterprises. Dynamic programming, which is suited to multi-stage decision making, is analysed. Building on the basic dynamic programming algorithm and the route selection problem in logistics distribution, a traffic congestion factor for the roads along the distribution route is introduced to randomly modify the corresponding route weights and dynamically adjust the route selection. Using a concrete example, the effectiveness and feasibility of the method are analysed and demonstrated; it achieves dynamic route selection during the logistics distribution process.
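
    As a hedged reading of the abstract, the sketch below runs a stage-wise shortest-path dynamic program for a delivery route whose edge costs are inflated by a random congestion factor at planning time. The small three-stage network, the base costs, and the uniform congestion model are illustrative assumptions rather than the paper's data.

```python
# Hedged sketch of the idea in the abstract: stage-wise shortest-path dynamic
# programming for a delivery route, with edge costs randomly inflated by a
# congestion (traffic jam) factor. Network, costs and congestion model are
# illustrative assumptions.
import random

# stages[k] maps (node_in_stage_k, node_in_stage_k+1) -> base travel cost
stages = [
    {("S", "A"): 4, ("S", "B"): 2},
    {("A", "C"): 7, ("A", "D"): 3, ("B", "C"): 6, ("B", "D"): 4},
    {("C", "T"): 5, ("D", "T"): 8},
]

def plan_route(stages, rng, max_jam=0.5):
    """Backward DP over stages; each edge cost is scaled by a random jam factor."""
    cost = {"T": 0.0}          # cost-to-go from each node to the destination T
    choice = {}
    for stage in reversed(stages):
        new_cost = {}
        for (u, v), base in stage.items():
            jammed = base * (1.0 + rng.uniform(0.0, max_jam))   # congestion-perturbed weight
            if v in cost and (u not in new_cost or jammed + cost[v] < new_cost[u]):
                new_cost[u] = jammed + cost[v]
                choice[u] = v
        cost.update(new_cost)
    # Reconstruct the route from the source S.
    route, node = ["S"], "S"
    while node != "T":
        node = choice[node]
        route.append(node)
    return route, cost["S"]

rng = random.Random(42)
print(plan_route(stages, rng))   # the chosen route varies with the random congestion draw
```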

  2. Drop-out from cardiovascular magnetic resonance in a randomized controlled trial of ST-elevation myocardial infarction does not cause selection bias on endpoints.

    Science.gov (United States)

    Laursen, Peter Nørkjær; Holmvang, L; Kelbæk, H; Vejlstrup, N; Engstrøm, T; Lønborg, J

    2017-07-01

    The extent of selection bias due to drop-out in clinical trials of ST-elevation myocardial infarction (STEMI) using cardiovascular magnetic resonance (CMR) as surrogate endpoints is unknown. We sought to interrogate the characteristics and prognosis of patients who dropped out before acute CMR assessment compared to CMR participants in a previously published double-blinded, placebo-controlled all-comer trial with CMR outcome as the primary endpoint. Baseline characteristics and the composite endpoint of all-cause mortality, heart failure and re-infarction after 30 days and 5 years of follow-up were assessed and compared between CMR drop-outs and CMR participants using the trial screening log and the Eastern Danish Heart Registry. The drop-out rate from acute CMR was 28% (n = 92). These patients had a significantly worse clinical risk profile upon admission as evaluated by the TIMI risk score (3.7 (± 2.1) vs 4.0 (± 2.6), p = 0.043) and by left ventricular ejection fraction (43 (± 9) vs. 47 (± 10), p = 0.029). CMR drop-outs had a higher incidence of known hypertension (39% vs. 35%, p = 0.043), known diabetes (14% vs. 7%, p = 0.025), known cardiac disease (11% vs. 3%, p = 0.013) and known renal disease (5% vs. 0%, p = 0.007). However, the 30-day and 5-year composite endpoint rates were not significantly higher among the CMR drop-outs (HR 1.43 (95% CI 0.5; 3.97), p = 0.5, and HR 1.31 (95% CI 0.84; 2.05), p = 0.24). CMR drop-outs had a higher incidence of cardiovascular risk factors at baseline and a worse clinical risk profile upon admission. However, no significant difference was observed in the clinical endpoints between the groups.

  3. Random thoughts

    Science.gov (United States)

    ajansen; kwhitefoot; panteltje1; edprochak; sudhakar, the

    2014-07-01

    In reply to the physicsworld.com news story “How to make a quantum random-number generator from a mobile phone” (16 May, http://ow.ly/xFiYc, see also p5), which describes a way of delivering random numbers by counting the number of photons that impinge on each of the individual pixels in the camera of a Nokia N9 smartphone.

  4. Twenty-first workshop on geothermal reservoir engineering: Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-01-26

    PREFACE The Twenty-First Workshop on Geothermal Reservoir Engineering was held at the Holiday Inn, Palo Alto on January 22-24, 1996. There were one-hundred fifty-five registered participants. Participants came from twenty foreign countries: Argentina, Austria, Canada, Costa Rica, El Salvador, France, Iceland, Indonesia, Italy, Japan, Mexico, The Netherlands, New Zealand, Nicaragua, the Philippines, Romania, Russia, Switzerland, Turkey and the UK. The performance of many geothermal reservoirs outside the United States was described in several of the papers. Professor Roland N. Horne opened the meeting and welcomed visitors. The keynote speaker was Marshall Reed, who gave a brief overview of the Department of Energy's current plan. Sixty-six papers were presented in the technical sessions of the workshop. Technical papers were organized into twenty sessions concerning: reservoir assessment, modeling, geology/geochemistry, fracture modeling, hot dry rock, geoscience, low enthalpy, injection, well testing, drilling, adsorption and stimulation. Session chairmen were major contributors to the workshop, and we thank: Ben Barker, Bobbie Bishop-Gollan, Tom Box, Jim Combs, John Counsil, Sabodh Garg, Malcolm Grant, Marcelo Lippmann, Jim Lovekin, John Pritchett, Marshall Reed, Joel Renner, Subir Sanyal, Mike Shook, Alfred Truesdell and Ken Williamson. Jim Lovekin gave the post-dinner speech at the banquet and highlighted the exciting developments in the geothermal field which are taking place worldwide. The Workshop was organized by the Stanford Geothermal Program faculty, staff, and graduate students. We wish to thank our students who operated the audiovisual equipment. Shaun D. Fitzgerald, Program Manager.

  5. Twenty-first-century medical microbiology services in the UK.

    Science.gov (United States)

    Duerden, Brian

    2005-12-01

    With infection once again a high priority for the UK National Health Service (NHS), the medical microbiology and infection-control services require increased technology resources and more multidisciplinary staff. Clinical care and health protection need a coordinated network of microbiology services working to consistent standards, provided locally by NHS Trusts and supported by the regional expertise and national reference laboratories of the new Health Protection Agency. Here, I outline my thoughts on the need for these new resources and the ways in which clinical microbiology services in the UK can best meet the demands of the twenty-first century.

  6. Twenty years of energy policy: What should we have learned?

    Energy Technology Data Exchange (ETDEWEB)

    Greene, D.L. [Oak Ridge National Lab., TN (United States). Center for Transportation Analysis

    1994-07-01

    This report examines the past twenty years of energy market events and energy policies to determine what may be useful for the future. The author focuses on two important lessons that should have been learned but which the author feels have been seriously misunderstood. The first is that oil price shocks were a very big and very real problem for oil importing countries, a problem that has not gone away. The second is that automobile fuel economy regulation has worked, and worked effectively, to reduce oil consumption and the externalities associated with it, and can still work effectively in the future.

  7. Accelerators for the twenty-first century a review

    CERN Document Server

    Wilson, Edmund J N

    1990-01-01

    The development of the synchrotron, and later the storage ring, was based upon the electrical technology at the turn of this century, aided by the microwave radar techniques of World War II. This method of acceleration seems to have reached its limit. Even superconductivity is not likely to lead to devices that will satisfy physics needs into the twenty-first century. Unless a new principle for accelerating elementary particles is discovered soon, it is difficult to imagine that high-energy physics will continue to reach out to higher energies and luminosities.

  8. Earth observations in the twenty-first century

    Science.gov (United States)

    Geller, M. A.

    1986-01-01

    Some of the achievements of earth observations from past space missions are described. Also discussed are the achievements to be anticipated from currently approved and planned earth observation missions. In looking forward to what the objectives of earth observations from space are expected to be in the future, together with what technology is expected to enable, what the earth observing program will look like during the first part of the twenty-first century is discussed. It is concluded that a key part of this program will be long-term observations holistically viewing the earth system.

  9. Vinte anos de efeito SERS Twenty years of SERS

    Directory of Open Access Journals (Sweden)

    Dalva L. A. de Faria

    1999-07-01

    Full Text Available The Surface Enhanced Raman Scattering (SERS) effect was observed for the first time in 1974, but it was only recognized as a new effect three years later, hence nearly twenty years ago. Since its discovery, a significant number of investigations have been performed aiming to clarify the nature of the observed enhancement, to improve the surface stability and to establish applications, which nowadays range from the study of biomolecules to catalysis. Some of the more relevant aspects of this effect examined across the last two decades are summarized in this paper, which presents the introductory aspects of SERS alongside several of its applications.

  10. Strategies for Teaching Maritime Archaeology in the Twenty First Century

    Science.gov (United States)

    Staniforth, Mark

    2008-12-01

    Maritime archaeology is a multi-faceted discipline that requires both theoretical learning and practical skills training. In the past most universities have approached the teaching of maritime archaeology as a full-time on-campus activity designed for ‘traditional’ graduate students; primarily those in their early twenties who have recently come from full-time undergraduate study and who are able to study on-campus. The needs of mature-age and other students who work and live in different places (or countries) and therefore cannot attend lectures on a regular basis (or at all) have largely been ignored. This paper provides a case study in the teaching of maritime archaeology from Australia that, in addition to ‘traditional’ on-campus teaching, includes four main components: (1) learning field methods through field schools; (2) skills training through the AIMA/NAS avocational training program; (3) distance learning topics available through CD-ROM and using the Internet; and (4) practicums, internships and fellowships. The author argues that programs to teach maritime archaeology in the twenty first century need to be flexible and to address the diverse needs of students who do not fit the ‘traditional’ model. This involves collaborative partnerships with other universities as well as government underwater cultural heritage management agencies and museums, primarily through field schools, practicums and internships.

  11. Random matrices

    CERN Document Server

    Mehta, Madan Lal

    1990-01-01

    Since the publication of Random Matrices (Academic Press, 1967) so many new results have emerged both in theory and in applications, that this edition is almost completely revised to reflect the developments. For example, the theory of matrices with quaternion elements was developed to compute certain multiple integrals, and the inverse scattering theory was used to derive asymptotic results. The discovery of Selberg's 1944 paper on a multiple integral also gave rise to hundreds of recent publications. This book presents a coherent and detailed analytical treatment of random matrices, leading

  12. Repetitive transcranial magnetic stimulation (rTMS) augmentation of selective serotonin reuptake inhibitors (SSRIs) for SSRI-resistant obsessive-compulsive disorder (OCD): a meta-analysis of randomized controlled trials

    Science.gov (United States)

    Ma, Zhong-Rui; Shi, Li-Jun

    2014-01-01

    Background and objective: Randomized controlled trials (RCTs) on repetitive transcranial magnetic stimulation (rTMS) as augmentation of selective serotonin reuptake inhibitors (SSRIs) for SSRI-resistant obsessive-compulsive disorder (OCD) have yielded conflicting results. Therefore, this meta-analysis was conducted to assess the efficacy of this strategy for SSRI-resistant OCD. Methods: Scientific and medical databases, including international databases (PubMed, MEDLINE, EMBASE, CCTR, Web of Science, PsycINFO), two Chinese databases (CBM-disc, CNKI), and relevant websites dated up to July 2014, were searched for RCTs on this strategy for treating OCD. Mantel-Haenszel random-effects model was used. Yale-Brown Obsessive Compulsive Scale (Y-BOCS) score, response rates and drop-out rates were evaluated. Results: Data were obtained from nine RCTs consisting of 290 subjects. Active rTMS was an effective augmentation strategy in treating SSRI-resistant OCD with a pooled WMD of 3.89 (95% CI = [1.27, 6.50]) for reducing Y-BOCS score and a pooled odds ratio (OR) of 2.65 (95% CI = [1.36, 5.17] for response rates. No significant differences in drop-out rates were found. No publication bias was detected. Conclusion: The pooled examination demonstrated that this strategy seems to be efficacious and acceptable for treating SSRI-resistant OCD. As the number of RCTs included here was limited, further large-scale multi-center RCTs are required to validate our conclusions. PMID:25663986
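
    As a hedged, generic illustration of random-effects pooling of mean differences (the DerSimonian-Laird inverse-variance approach, which differs in detail from the Mantel-Haenszel random-effects procedure reported above), the snippet below pools made-up study-level effects and standard errors.

```python
# Hedged illustration of random-effects pooling of mean differences via the
# DerSimonian-Laird method (generic inverse-variance approach; the paper used
# a Mantel-Haenszel random-effects model, which differs in detail).
# The per-study effect sizes and standard errors below are made up.
import math

effects = [3.2, 5.1, 2.0, 4.4]        # per-study mean differences (Y-BOCS points)
ses     = [1.5, 2.0, 1.1, 1.8]        # corresponding standard errors

w_fixed = [1.0 / se**2 for se in ses]                     # fixed-effect weights
mu_fixed = sum(w * e for w, e in zip(w_fixed, effects)) / sum(w_fixed)

# Between-study heterogeneity (DerSimonian-Laird tau^2).
q = sum(w * (e - mu_fixed) ** 2 for w, e in zip(w_fixed, effects))
df = len(effects) - 1
c = sum(w_fixed) - sum(w**2 for w in w_fixed) / sum(w_fixed)
tau2 = max(0.0, (q - df) / c)

w_rand = [1.0 / (se**2 + tau2) for se in ses]             # random-effects weights
mu = sum(w * e for w, e in zip(w_rand, effects)) / sum(w_rand)
se_mu = math.sqrt(1.0 / sum(w_rand))
print(f"pooled WMD = {mu:.2f}  (95% CI {mu - 1.96*se_mu:.2f} to {mu + 1.96*se_mu:.2f}), tau^2 = {tau2:.2f}")
```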

  13. A phase IIA randomized, placebo-controlled clinical trial to study the efficacy and safety of the selective androgen receptor modulator (SARM), MK-0773 in female participants with sarcopenia.

    Science.gov (United States)

    Papanicolaou, D A; Ather, S N; Zhu, H; Zhou, Y; Lutkiewicz, J; Scott, B B; Chandler, J

    2013-01-01

    Sarcopenia, the age-related loss of muscle mass [defined as appendicular LBM/height² (aLBM/ht²) below peak value by >1 SD], strength and function, is a major contributing factor to frailty in the elderly. MK-0773 is a selective androgen receptor modulator designed to improve muscle function while minimizing effects on other tissues. The primary objective of this study was to demonstrate an improvement in muscle strength and lean body mass (LBM) in sarcopenic frail elderly women treated with MK-0773 relative to placebo. This was a randomized, double-blind, parallel-arm, placebo-controlled, multicenter, 6-month study. Participants were randomized in a 1:1 ratio to receive either MK-0773 50 mg b.i.d. or placebo; all participants received Vitamin D and protein supplementation. General community. 170 women aged ≥65 with sarcopenia and moderate physical dysfunction. Dual energy X-ray absorptiometry, muscle strength and power, physical performance measures. Participants receiving MK-0773 showed a statistically significant increase in LBM from baseline at Month 6 vs. placebo (p<0.001). Participants receiving both MK-0773 and placebo showed a statistically significant increase in strength from baseline to Month 6, but the mean difference between the two groups was not significant (p=0.269). Both groups showed significant improvement from baseline at Month 6 in physical performance measures, but there were no statistically significant differences between participants receiving MK-0773 and placebo. A greater number of participants experienced elevated transaminases in the MK-0773 group vs. placebo, which resolved after discontinuation of study therapy. MK-0773 was generally well-tolerated with no evidence of androgenization. The MK-0773-induced increase in LBM did not translate to improvement in strength or function vs. placebo. The improvement of strength and physical function in the placebo group could be at least partly attributed to protein and vitamin D supplementation.

  14. The moral importance of selecting people randomly.

    Science.gov (United States)

    Peterson, Martin

    2008-07-01

    This article discusses some ethical principles for distributing pandemic influenza vaccine and other indivisible goods. I argue that a number of principles for distributing pandemic influenza vaccine recently adopted by several national governments are morally unacceptable because they put too much emphasis on utilitarian considerations, such as the ability of the individual to contribute to society. Instead, it would be better to distribute vaccine by setting up a lottery. The argument for this view is based on a purely consequentialist account of morality; i.e. an action is right if and only if its outcome is optimal. However, unlike utilitarians I do not believe that alternatives should be ranked strictly according to the amount of happiness or preference satisfaction they bring about. Even a mere chance to get some vaccine matters morally, even if it is never realized.

  15. Genomic relations among 31 species of Mammillaria haworth (Cactaceae) using random amplified polymorphic DNA.

    Science.gov (United States)

    Mattagajasingh, Ilwola; Mukherjee, Arup Kumar; Das, Premananda

    2006-01-01

    Thirty-one species of Mammillaria were selected to study their molecular phylogeny using random amplified polymorphic DNA (RAPD) markers. The high amount of mucilage (gelling polysaccharides) present in Mammillaria was a major obstacle to isolating good-quality genomic DNA. The CTAB (cetyl trimethyl ammonium bromide) method was modified to obtain good-quality genomic DNA. Twenty-two random decamer primers resulted in 621 bands, all of which were polymorphic. The similarity matrix value varied from 0.109 to 0.622, indicating wide variability among the studied species. The dendrogram obtained from the unweighted pair group method using arithmetic averages (UPGMA) analysis revealed that some of the species did not follow the conventional classification. The present work shows the usefulness of RAPD markers for genetic characterization and for establishing phylogenetic relations among Mammillaria species.
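
    The dendrogram described above comes from band-sharing similarities and UPGMA clustering. A minimal sketch of that workflow (with a hypothetical random 31 × 621 band-presence matrix standing in for the scored RAPD profiles) might look like this, since UPGMA corresponds to average-linkage clustering on a distance matrix:

        import numpy as np
        from scipy.spatial.distance import pdist, squareform
        from scipy.cluster.hierarchy import linkage, dendrogram

        # Hypothetical binary band-presence matrix: rows = 31 species, columns = 621 RAPD bands
        rng = np.random.default_rng(0)
        bands = rng.integers(0, 2, size=(31, 621))

        dist = pdist(bands, metric="jaccard")       # pairwise band-profile distances
        similarity = 1.0 - squareform(dist)         # similarity matrix analogous to the one reported
        tree = linkage(dist, method="average")      # average linkage is UPGMA
        dendrogram(tree, no_plot=True)              # set no_plot=False (with matplotlib) to draw it

        print(similarity[0, 1])                     # similarity between the first two species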

  16. Uncertainties in sea level projections on twenty-year timescales

    Science.gov (United States)

    Vinogradova, Nadya; Davis, James; Landerer, Felix; Little, Chris

    2016-04-01

    Regional decadal changes in sea level are governed by various processes, including ocean dynamics, gravitational and solid earth responses, mass loss of continental ice, and other local coastal processes. In order to improve predictions and physical attribution of decadal sea level trends, the uncertainties of each process must be reflected in the sea level calculations. Here we explore uncertainties in predictions of the decadal and bi-decadal changes in regional sea level induced by changes in ocean dynamics and the associated redistribution of heat and freshwater (often referred to as dynamic sea level). Such predictions are typically based on the solutions from coupled atmospheric and oceanic general circulation models, including a suite of climate models participating in phase 5 of the Coupled Model Intercomparison Project (CMIP5). Designed to simulate long-term ocean variability in response to a warming climate due to increasing greenhouse gas concentrations (the "forced" response), CMIP5 models are deficient in simulating variability at shorter time scales. In contrast, global observations of sea level are available over a relatively short time span (e.g., twenty-year altimetry records), and are dominated by an "unforced" variability that occurs freely (internally) within the climate system. This makes it challenging to examine how well observations compare with model simulations. Therefore, here we focus on patterns and spatial characteristics of projected twenty-year trends in dynamic sea level. Based on the ensemble of CMIP5 models, each comprising a 240-year run, we compute an envelope of twenty-year rates, and analyze the spread and spatial relationship among predicted rates. An ensemble root-mean-square average exhibits large-scale spatial patterns, with the largest uncertainties found over mid and high latitudes, which could be attributed to changes in wind patterns and buoyancy forcing. To understand and parameterize spatial characteristics of the
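
    Computing an envelope of twenty-year rates from long model runs is conceptually simple: fit a linear trend in every overlapping 20-year window of each ensemble member and examine the spread. A rough sketch under stated assumptions (synthetic annual-mean series standing in for CMIP5 dynamic sea level output) is:

        import numpy as np

        def twenty_year_trends(annual_means, window=20):
            """Least-squares trend (per year) in every overlapping 20-year window."""
            years = np.arange(window)
            return np.array([np.polyfit(years, annual_means[i:i + window], 1)[0]
                             for i in range(len(annual_means) - window + 1)])

        # Hypothetical ensemble: 10 members x 240 years of annual-mean dynamic sea level (metres)
        rng = np.random.default_rng(1)
        ensemble = rng.normal(0.0, 0.02, size=(10, 240)).cumsum(axis=1)

        trends = np.array([twenty_year_trends(m) for m in ensemble])
        print(np.sqrt(np.mean(trends ** 2)))        # ensemble root-mean-square of 20-year rates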

  17. Microstructural parameters of bone evaluated using HR-pQCT correlate with the DXA-derived cortical index and the trabecular bone score in a cohort of randomly selected premenopausal women.

    Directory of Open Access Journals (Sweden)

    Albrecht W Popp

    Full Text Available BACKGROUND: Areal bone mineral density is predictive for fracture risk. Microstructural bone parameters evaluated at the appendicular skeleton by high-resolution peripheral quantitative computed tomography (HR-pQCT) display differences between healthy patients and fracture patients. With the simple geometry of the cortex at the distal tibial diaphysis, a cortical index of the tibia combining material and mechanical properties correlated highly with bone strength ex vivo. The trabecular bone score derived from the scan of the lumbar spine by dual-energy X-ray absorptiometry (DXA) correlated ex vivo with the microarchitectural parameters. It is unknown if these microstructural correlations could be made in healthy premenopausal women. METHODS: Randomly selected women between 20-40 years of age were examined by DXA and HR-pQCT at the standard regions of interest and at customized subregions to focus on cortical and trabecular parameters of strength separately. For cortical strength, at the distal tibia the volumetric cortical index was calculated directly from HR-pQCT and the areal cortical index was derived from the DXA scan using a Canny threshold-based tool. For trabecular strength, the trabecular bone score was calculated based on the DXA scan of the lumbar spine and was compared with the corresponding parameters derived from the HR-pQCT measurements at radius and tibia. RESULTS: Seventy-two healthy women were included (average age 33.8 years, average BMI 23.2 kg/m²). The areal cortical index correlated highly with the volumetric cortical index at the distal tibia (R = 0.798). The trabecular bone score correlated moderately with the microstructural parameters of the trabecular bone. CONCLUSION: This study in randomly selected premenopausal women demonstrated that microstructural parameters of the bone evaluated by HR-pQCT correlated with the DXA-derived parameters of skeletal regions containing predominantly cortical or cancellous bone.

  18. Microstructural Parameters of Bone Evaluated Using HR-pQCT Correlate with the DXA-Derived Cortical Index and the Trabecular Bone Score in a Cohort of Randomly Selected Premenopausal Women

    Science.gov (United States)

    Popp, Albrecht W.; Buffat, Helene; Eberli, Ursula; Lippuner, Kurt; Ernst, Manuela; Richards, R. Geoff; Stadelmann, Vincent A.; Windolf, Markus

    2014-01-01

    Background Areal bone mineral density is predictive for fracture risk. Microstructural bone parameters evaluated at the appendicular skeleton by high-resolution peripheral quantitative computed tomography (HR-pQCT) display differences between healthy patients and fracture patients. With the simple geometry of the cortex at the distal tibial diaphysis, a cortical index of the tibia combining material and mechanical properties correlated highly with bone strength ex vivo. The trabecular bone score derived from the scan of the lumbar spine by dual-energy X-ray absorptiometry (DXA) correlated ex vivo with the microarchitectural parameters. It is unknown if these microstructural correlations could be made in healthy premenopausal women. Methods Randomly selected women between 20–40 years of age were examined by DXA and HR-pQCT at the standard regions of interest and at customized subregions to focus on cortical and trabecular parameters of strength separately. For cortical strength, at the distal tibia the volumetric cortical index was calculated directly from HR-pQCT and the areal cortical index was derived from the DXA scan using a Canny threshold-based tool. For trabecular strength, the trabecular bone score was calculated based on the DXA scan of the lumbar spine and was compared with the corresponding parameters derived from the HR-pQCT measurements at radius and tibia. Results Seventy-two healthy women were included (average age 33.8 years, average BMI 23.2 kg/m2). The areal cortical index correlated highly with the volumetric cortical index at the distal tibia (R = 0.798). The trabecular bone score correlated moderately with the microstructural parameters of the trabecular bone. Conclusion This study in randomly selected premenopausal women demonstrated that microstructural parameters of the bone evaluated by HR-pQCT correlated with the DXA-derived parameters of skeletal regions containing predominantly cortical or cancellous bone. Whether these indexes

  19. Randomized metarounding

    Energy Technology Data Exchange (ETDEWEB)

    CARR,ROBERT D.; VEMPALA,SANTOSH

    2000-01-25

    The authors present a new technique for the design of approximation algorithms that can be viewed as a generalization of randomized rounding. They derive new or improved approximation guarantees for a class of generalized congestion problems such as multicast congestion, multiple TSP etc. Their main mathematical tool is a structural decomposition theorem related to the integrality gap of a relaxation.

  20. Policies pertaining to complementary and alternative medical therapies in a random sample of 39 academic health centers.

    Science.gov (United States)

    Cohen, Michael H; Sandler, Lynne; Hrbek, Andrea; Davis, Roger B; Eisenberg, David M

    2005-01-01

    This research documents policies in 39 randomly selected academic medical centers integrating complementary and alternative medical (CAM) services into conventional care. Twenty-three offered CAM services; the most common were acupuncture, massage, dietary supplements, mind-body therapies, and music therapy. None had written policies concerning credentialing practices or malpractice liability. Only 10 reported a written policy governing the use of dietary supplements, although three sold supplements in inpatient formularies, one in the psychiatry department, and five in outpatient pharmacies. Thus, few academic medical centers have sufficiently integrated CAM services into conventional care by developing consensus-written policies governing credentialing, malpractice liability, and dietary supplement use.

  1. Diurnal Variation and Twenty-Four Hour Sleep Deprivation Do Not Alter Supine Heart Rate Variability in Healthy Male Young Adults

    Science.gov (United States)

    Elvsåshagen, Torbjørn; Zak, Nathalia; Norbom, Linn B.; Pedersen, Per Ø.; Quraishi, Sophia H.; Bjørnerud, Atle; Malt, Ulrik F.; Groote, Inge R.; Kaufmann, Tobias; Andreassen, Ole A.; Westlye, Lars T.

    2017-01-01

    Heart rate variability (HRV) has become an increasingly popular index of cardiac autonomic control in the biobehavioral sciences due to its relationship with mental illness and cognitive traits. However, the intraindividual stability of HRV in response to sleep and diurnal disturbances, which are commonly reported in mental illness, and its relationship with executive function are not well understood. Here, in 40 healthy adult males we calculated high frequency HRV—an index of parasympathetic nervous system (PNS) activity—using pulse oximetry during brain imaging, and assessed attentional and executive function performance in a subsequent behavioral test session at three time points: morning, evening, and the following morning. Twenty participants were randomly selected for total sleep deprivation whereas the other 20 participants slept as normal. Sleep deprivation and morning-to-night variation did not influence high frequency HRV at either a group or individual level; however, sleep deprivation abolished the relationship between orienting attention performance and HRV. We conclude that a day of wake and a night of laboratory-induced sleep deprivation do not alter supine high frequency HRV in young healthy male adults. PMID:28151944
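
    High frequency HRV of the kind described above is typically estimated from the power of the RR-interval (tachogram) signal in the 0.15-0.4 Hz band. A minimal sketch, assuming a hypothetical RR-interval series rather than the pulse-oximetry data of this study, is:

        import numpy as np
        from scipy.signal import welch

        def hf_power(rr_intervals_s, fs=4.0):
            """Estimate high-frequency (0.15-0.4 Hz) HRV power from RR intervals in seconds."""
            beat_times = np.cumsum(rr_intervals_s)
            grid = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)
            rr_even = np.interp(grid, beat_times, rr_intervals_s)     # evenly resampled tachogram
            freqs, psd = welch(rr_even - rr_even.mean(), fs=fs, nperseg=min(256, len(rr_even)))
            band = (freqs >= 0.15) & (freqs <= 0.40)
            return np.trapz(psd[band], freqs[band])                   # HF power in s^2

        # Hypothetical RR intervals around 0.8 s with a 0.25 Hz respiratory modulation
        t = np.arange(300)
        rr = 0.8 + 0.03 * np.sin(2 * np.pi * 0.25 * t * 0.8) + np.random.default_rng(2).normal(0, 0.01, 300)
        print(hf_power(rr))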

  2. A competing-risk-based score for predicting twenty-year risk of incident diabetes: the Beijing Longitudinal Study of Ageing study

    Science.gov (United States)

    Liu, Xiangtong; Chen, Zhenghong; Fine, Jason Peter; Liu, Long; Wang, Anxin; Guo, Jin; Tao, Lixin; Mahara, Gehendra; Yang, Kun; Zhang, Jie; Tian, Sijia; Li, Haibin; Liu, Kuo; Luo, Yanxia; Zhang, Feng; Tang, Zhe; Guo, Xiuhua

    2016-01-01

    Few risk tools have been proposed to quantify the long-term risk of diabetes among middle-aged and elderly individuals in China. The present study aimed to develop a risk tool to estimate the 20-year risk of developing diabetes while incorporating competing risks. A three-stage stratification random-clustering sampling procedure was conducted to ensure the representativeness of the Beijing elderly. We prospectively followed 1857 community residents aged 55 years and above who were free of diabetes at the baseline examination. Sub-distribution hazards models were used to adjust for the competing risk of non-diabetes death. The cumulative incidence of twenty-year diabetes events was 11.60% after adjusting for the competing risk of non-diabetes death. Age, body mass index, fasting plasma glucose, health status, and physical activity were selected to form the score. The area under the ROC curve (AUC) was 0.76 (95% Confidence Interval: 0.72–0.80), and the optimism-corrected AUC was 0.78 (95% Confidence Interval: 0.69–0.87) after internal validation by bootstrapping. The calibration plot showed that the actual diabetes risk was similar to the predicted risk. The cut-off value of the risk score was 19 points, marking the difference between low-risk and high-risk patients, with a sensitivity of 0.74 and a specificity of 0.65. PMID:27849048
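
    The reported cut-off behaviour can be illustrated directly: flag anyone scoring at or above 19 points as high risk and tabulate the result against observed outcomes. The sketch below uses simulated scores and outcomes (the study's individual-level data are not reproduced here), so the printed sensitivity and specificity will not match 0.74 and 0.65:

        import numpy as np

        def sens_spec(scores, events, cutoff):
            """Sensitivity and specificity of a points-based risk score at a given cutoff."""
            scores = np.asarray(scores)
            events = np.asarray(events, dtype=bool)
            high_risk = scores >= cutoff
            sensitivity = np.mean(high_risk[events])        # flagged among those who developed diabetes
            specificity = np.mean(~high_risk[~events])      # not flagged among those who did not
            return sensitivity, specificity

        # Hypothetical risk-score points and 20-year diabetes outcomes for 1857 residents
        rng = np.random.default_rng(3)
        scores = rng.integers(0, 40, size=1857)
        events = rng.random(1857) < 0.2 / (1.0 + np.exp(-(scores - 19) / 5.0))
        print(sens_spec(scores, events, cutoff=19))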

  3. Managing asbestos in Italy: twenty years after the ban.

    Science.gov (United States)

    Silvestri, Stefano

    2012-01-01

    Establishing an asbestos ban is not sufficient to achieve effective primary prevention. Twenty years after the Italian asbestos ban, the residual presence of asbestos-containing materials, estimated at 80 percent of the quantity existing in 1992, may still harm the health of workers and the general population. At the current cleanup rate of roughly 1 percent per year, the so-called "asbestos way-out" is too slow, and a new policy re-examining the entire process is needed. Encouraging owners with tax relief when asbestos roofs are replaced with photovoltaic panels, as well as reducing the cost of removal by planning local landfills, may be the keys to accelerating the cleanup process.

  4. The Dialectics of Discrimination in the Twenty-First Century

    Directory of Open Access Journals (Sweden)

    John Stone

    2007-12-01

    Full Text Available This article explores some of the latest developments in the scholarship on race relations and nationalism that seek to address the impact of globalization and the changed geo-political relations of the first decade of the twenty-first century. New patterns of identification, some of which challenge existing group boundaries and others that reinforce them, can be seen to flow from the effects of global market changes and the political counter-movements against them. The impact of the “war on terrorism”, the limits of the utility of hard power, and the need for new mechanisms of inter-racial and inter-ethnic conflict resolution are evaluated to emphasize the complexity of these group relations in the new world disorder.

  5. Twenty-first century power needs. Challenges, and supply options

    Energy Technology Data Exchange (ETDEWEB)

    Criswell, D.R. [Houston Univ., TX (United States). Solar Energy Lab.

    1997-11-01

    The challenge of providing adequate power to sustain world prosperity through the twenty-first century was discussed. It was estimated that by 2050, a prosperous world of 10 billion people will require 60 TWt of thermal power. Conventional power systems will not be able to provide the needed energy because of limited fuels, contamination of the biosphere, and costs. A viable, cost-effective alternative will be solar energy that is captured in space and from facilities on the Moon, and imported to Earth by microwaves. Global electric power systems that use the Moon and deliver 1,000 TWe-Y of energy by 2070 were suggested as the most obvious alternative. Despite the huge initial cost of 20 to 100 trillion dollars, the long-term cost was said to be small compared to terrestrial and Earth-orbital options. 30 refs., 2 figs.
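
    As a quick arithmetic check of the figures quoted above, 60 TWt shared among 10 billion people corresponds to a per-capita thermal power of

        \[
          \frac{60 \times 10^{12}\ \mathrm{W_t}}{10^{10}\ \text{persons}} \;=\; 6\ \mathrm{kW_t}\ \text{per person}.
        \]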

  6. Increased Prevalence of Dental Fluorosis after Twenty Years

    DEFF Research Database (Denmark)

    Richards, Alan; Larsen, Mogens Joost; Maare, L.

    2006-01-01

    0977 Increased Prevalence of Dental Fluorosis after Twenty Years. A. Richards¹, M. Larsen¹, L. Maare², and H. Hedeboe² (¹Aarhus University, Faculty of Health Sciences, Denmark; ²Præstø School Dental Service, Denmark). Objectives: To describe the prevalence and severity of dental fluorosis among all ... children, of similar ages, examined in the same area in 1984. Results: The prevalence and severity of fluorosis varied between tooth types, so that the later in childhood the teeth are mineralized, the higher the prevalence of dental fluorosis. When the data were compared to those collected 20 years ... in the (later formed) premolars and second molars. Conclusions: A significant increase in fluorosis prevalence has occurred over the last 20 years due to increased fluoride exposure of pre-school children. These findings may be explained by increased use of fluoride toothpaste by this age group from ...

  7. The Danish eID Case: Twenty years of Delay

    DEFF Research Database (Denmark)

    Hoff, Jens Villiam; Hoff, Frederik Villiam

    2010-01-01

    The focus of this article is to explain why there is still no qualified digital signature in Denmark as defined by the EU eSignatures Directive, nor any other nationwide eID, even though Denmark had an early start in eGovernment and a high level of "e-readiness" compared to other nations. Laying out ... of intergovernmental coordination and lack of cooperation between the public and private sectors. However, with the recent tender on digital signatures won by PBS and the roll-out of NemID, it seems that Denmark will finally - after twenty years of delay - have an eID which can be widely used in the public as well...

  8. Proceedings of Twenty-Seventh Annual Institute on Mining Health, Safety and Research

    Energy Technology Data Exchange (ETDEWEB)

    Bockosh, G.R. [ed.] [Pittsburgh Research Center, US Dept. of Energy (United States); Langton, J. [ed.] [Mine Safety and Health Administration, US Dept. of Labor (United States); Karmis, M. [ed.] [Virginia Polytechnic Institute and State University. Dept. of Mining and Minerals Engineering, Blacksburg (United States)

    1996-12-31

    This Proceedings contains the presentations made during the program of the Twenty-Seventh Annual Institute on Mining Health, Safety and Research held at Virginia Polytechnic Institute and State University, Blacksburg, Virginia, on August 26-28, 1996. The Twenty-Seventh Annual Institute on Mining, Health, Safety and Research was the latest in a series of conferences held at Virginia Polytechnic Institute and State University, cosponsored by the Mine Safety and Health Administration, United States Department of Labor, and the Pittsburgh Research Center, United States Department of Energy (formerly part of the Bureau of Mines, U. S. Department of Interior). The Institute provides an information forum for mine operators, managers, superintendents, safety directors, engineers, inspectors, researchers, teachers, state agency officials, and others with a responsible interest in the important field of mining health, safety and research. In particular, the Institute is designed to help mine operating personnel gain a broader knowledge and understanding of the various aspects of mining health and safety, and to present them with methods of control and solutions developed through research. Selected papers have been processed separately for inclusion in the Energy Science and Technology database.

  9. Comparison of the compact dry TC and 3M petrifilm ACP dry sheet media methods with the spiral plate method for the examination of randomly selected foods for obtaining aerobic colony counts.

    Science.gov (United States)

    Ellis, P; Meldrum, R

    2002-02-01

    Two hundred thirty-six randomly selected food and milk samples were examined to obtain aerobic colony counts by two dry sheet media methods and a standard Public Health Laboratory Service spiral plate method. Results for 40 samples were outside the limits of detection for one or more of the tested methods and were not considered. (The limits of detection were 200 to 1 × 10⁸ CFU/ml for the spiral plate method and 100 to 3 × 10⁶ CFU/ml for the dry sheet media methods.) The remaining 196 sets of results were analyzed further. When the results from the three methods were compared, correlation coefficients were all >0.80, and slopes and intercepts were close to 1.0 and 0.0, respectively. Mean log values and standard deviations were very similar for all three methods. The results were evaluated according to published UK guidelines for ready-to-eat foods sampled at the point of sale, which include a quality acceptability assessment based on aerobic colony counts. Eighty-six percent of the comparable results gave the same verdict with regard to acceptability according to the aerobic colony count guidelines. Both dry sheet media methods were comparable to the spiral plate method and can be recommended for the examination of food.
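
    The method comparison reported above (correlation coefficients >0.80, slopes near 1.0, intercepts near 0.0) is an ordinary paired regression on log-transformed counts. A sketch with simulated paired log10 counts (not the study's 196 results) would be:

        import numpy as np
        from scipy.stats import linregress

        # Hypothetical paired log10 aerobic colony counts (CFU/ml): spiral plate vs. a dry sheet method
        rng = np.random.default_rng(4)
        spiral = rng.uniform(2.5, 7.5, size=196)
        dry_sheet = spiral + rng.normal(0.0, 0.25, size=196)

        fit = linregress(spiral, dry_sheet)
        print(f"r = {fit.rvalue:.2f}, slope = {fit.slope:.2f}, intercept = {fit.intercept:.2f}")
        print("mean log counts:", spiral.mean().round(2), dry_sheet.mean().round(2))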

  10. Fractional randomness

    Science.gov (United States)

    Tapiero, Charles S.; Vallois, Pierre

    2016-11-01

    The premise of this paper is that a fractional probability distribution is based on fractional operators and the fractional (Hurst) index used, which alters the classical setting of random variables. For example, a random variable defined by its density function might not have a fractional density function defined in the conventional sense. Practically, this implies that a distribution's granularity, defined by a fractional kernel, may have properties that differ due to the fractional index used and the fractional calculus applied to define it. The purpose of this paper is to consider an application of fractional calculus to define the fractional density function of a random variable. In addition, we provide and prove a number of results defining the functional forms of these distributions as well as their existence. In particular, we define fractional probability distributions for increasing and decreasing functions that are right continuous. Examples are used to motivate the usefulness of a statistical approach to fractional calculus and its application to economic and financial problems. In conclusion, this paper is a preliminary attempt to construct statistical fractional models. Given the breadth and extent of such problems, this paper may be considered an initial attempt to do so.

  11. Rethinking the humanities in twenty-first century Africa

    African Journals Online (AJOL)

    Ngwira's paper examines how the female body is portrayed in selected ... Mapatidwe – with the objective of illustrating how this daring representation ... film and on the phenomenon of witchcraft, from theological and philosophical perspectives ...

  12. 78 FR 20168 - Twenty Fourth Meeting: RTCA Special Committee 203, Unmanned Aircraft Systems

    Science.gov (United States)

    2013-04-03

    ... Federal Aviation Administration Twenty Fourth Meeting: RTCA Special Committee 203, Unmanned Aircraft Systems AGENCY: Federal Aviation Administration (FAA), U.S. Department of Transportation (DOT). ACTION.../Approval of Twenty Third Plenary Meeting Summary Leadership Update Workgroup Progress...

  13. Random 'choices' and the locality loophole

    OpenAIRE

    Pironio, Stefano

    2015-01-01

    It has been claimed that to close the locality loophole in a Bell experiment, random numbers of quantum origin should be used for selecting the measurement settings. This is how it has been implemented in all recent Bell experiments addressing this loophole. I point out in this note that quantum random number generators are unnecessary for such experiments and that a Bell experiment with a pseudo-random (but otherwise completely deterministic) mechanism for selecting the measurement settings, ...

  14. Implementing multifactorial psychotherapy research in online virtual environments (IMPROVE-2): study protocol for a phase III trial of the MOST randomized component selection method for internet cognitive-behavioural therapy for depression

    Directory of Open Access Journals (Sweden)

    Edward Watkins

    2016-10-01

    Full Text Available Abstract Background Depression is a global health challenge. Although there are effective psychological and pharmaceutical interventions, our best treatments achieve remission rates of less than 1/3 and limited sustained recovery. Underpinning this efficacy gap is limited understanding of how complex psychological interventions for depression work. Recent reviews have argued that the active ingredients of therapy need to be identified so that therapy can be made briefer and more potent, and to improve scalability. This in turn requires the use of rigorous study designs that test the presence or absence of individual therapeutic elements, rather than standard comparative randomised controlled trials. One such approach is the Multiphase Optimization Strategy, which uses efficient experimentation such as factorial designs to identify active factors in complex interventions. This approach has been successfully applied to behavioural health but not yet to mental health interventions. Methods/Design A Phase III randomised, single-blind, balanced fractional factorial trial, based in England and conducted on the internet, randomized at the level of the patient, will investigate the active ingredients of internet cognitive-behavioural therapy (CBT) for depression. Adults with depression (operationalized as PHQ-9 score ≥ 10), recruited directly from the internet and from a UK National Health Service Improving Access to Psychological Therapies service, will be randomized across seven experimental factors, each reflecting the presence versus absence of specific treatment components (activity scheduling, functional analysis, thought challenging, relaxation, concreteness training, absorption, self-compassion training), using a 32-condition balanced fractional factorial design (2^(7-2), resolution IV). The primary outcome is symptoms of depression (PHQ-9) at 12 weeks. Secondary outcomes include symptoms of anxiety and process measures related to hypothesized mechanisms
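
    A 2^(7-2) design selects 32 of the 128 possible on/off combinations of seven components by confounding the two extra factors with high-order interactions. The sketch below uses the textbook resolution-IV generators F = ABCD and G = ABDE purely as an illustrative assumption; the trial's actual defining relations may differ:

        from itertools import product

        # Base factors A-E take levels -1/+1; F and G are generated from interactions.
        # Generators F = ABCD and G = ABDE give a resolution-IV 2^(7-2) design (32 runs);
        # these generators are an assumption for illustration, not the trial's own choice.
        runs = []
        for a, b, c, d, e in product((-1, 1), repeat=5):
            f = a * b * c * d
            g = a * b * d * e
            runs.append((a, b, c, d, e, f, g))

        print(len(runs))          # 32 experimental conditions
        print(runs[0], runs[-1])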

  15. Twenty Meter Space Telescope Based on Diffractive Fresnel Lens

    Energy Technology Data Exchange (ETDEWEB)

    Early, J; Hyde, R; Baron, R

    2003-06-26

    Diffractive lenses offer two potential advantages for very large aperture space telescopes; very loose surface-figure tolerances and physical implementation as thin, flat optical elements. In order to actually realize these advantages one must be able to build large diffractive lenses with adequate optical precision and also to compactly stow the lens for launch and then fully deploy it in space. We will discuss the recent fabrication and assembly demonstration of a 5m glass diffractive Fresnel lens at LLNL. Optical performance data from smaller full telescopes with diffractive lens and corrective optics show diffraction limited performance with broad bandwidths. A systems design for a 20m space telescope will be presented. The primary optic can be rolled to fit inside of the standard fairings of the Delta IV vehicle. This configuration has a simple deployment and requires no orbital assembly. A twenty meter visible telescope could have a significant impact in conventional astronomy with eight times the resolution of Hubble and over sixty times the light gathering capacity. If the light scattering is made acceptable, this telescope could also be used in the search for terrestrial planets.
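
    Taking Hubble's 2.4 m primary mirror as the implied reference, the quoted gains follow from simple aperture scaling:

        \[
          \frac{D}{D_{\mathrm{HST}}} = \frac{20\ \mathrm{m}}{2.4\ \mathrm{m}} \approx 8.3
          \quad\text{(angular resolution)},
          \qquad
          \left(\frac{20}{2.4}\right)^{2} \approx 69
          \quad\text{(light-gathering area)},
        \]

    consistent with the factors of "eight times" and "over sixty times" stated above.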

  16. Nuclear energy into the twenty-first century

    Energy Technology Data Exchange (ETDEWEB)

    Hammond, G.P. [Bath Univ. (United Kingdom). School of Mechanical Engineering

    1996-12-31

    The historical development of the civil nuclear power generation industry is examined in the light of the need to meet conflicting energy-supply and environmental pressures over recent decades. It is suggested that fission (thermal and fast) reactors will dominate the market up to the period 2010-2030, with fusion being relegated to the latter part of the twenty-first century. A number of issues affecting the use of nuclear electricity generation in Western Europe are considered including its cost, industrial strategy needs, and the public acceptability of nuclear power. The contribution of nuclear power stations to achieving CO2 targets aimed at relieving global warming is discussed in the context of alternative strategies for sustainable development, including renewable energy sources and energy-efficiency measures. Trends in the generation of nuclear electricity from fission reactors are finally considered in terms of the main geopolitical groupings that make up the world in the mid-1990s. Several recent, but somewhat conflicting, forecasts of the role of nuclear power in the fuel mix to about 2020 are reviewed. It is argued that the only major expansion in generating capacity will take place on the Asia-Pacific Rim and not in the developing countries generally. Nevertheless, the global nuclear industry overall will continue to be dominated by a small number of large nuclear electricity generating countries; principally the USA, France and Japan. (UK).

  17. Optical Studies of Twenty Longer-Period Cataclysmic Binaries

    CERN Document Server

    Thorstensen, John R; Skinner, Julie N

    2010-01-01

    We obtained time-series radial velocity spectroscopy of twenty cataclysmic variable stars, with the aim of determining orbital periods P_orb. All of the stars reported here prove to have P_orb > 3.5 h. For sixteen of the stars, these are the first available period determinations, and for the remaining four (V709 Cas, AF Cam, V1062 Tau, and RX J2133+51) we use new observations to improve the accuracy of previously published periods. Most of the targets are dwarf novae, without notable idiosyncrasies. Of the remainder, three (V709 Cas, V1062 Tau, and RX J2133+51) are intermediate polars (DQ Her stars); one (IPHAS 0345) is a secondary-dominated system without known outbursts, similar to LY UMa; one (V1059 Sgr) is an old nova; and two others (V478 Her and V1082 Sgr) are long-period novalike variables. The stars with new periods are IPHAS 0345 (0.314 d); V344 Ori (0.234 d); VZ Sex (0.149 d); NSVS 1057+09 (0.376 d); V478 Her (0.629 d); V1059 Sgr (0.286 d); V1082 Sgr (0.868 d); FO Aql (0.217 d); V587 Lyr (0.275 d); ...

  18. New Bachelards?: Reveries, Elements and Twenty-First Century Materialisms

    Directory of Open Access Journals (Sweden)

    James L. Smith

    2012-10-01

    Full Text Available Recent years have seen an infusion of new ideas into material philosophy through the work of the so-called ‘new materialists’. Poignant examples appear within two recent books: the first, Vibrant Matter by Jane Bennett (2010), sets out to “enhance receptivity to the impersonal life that surrounds and infuses us” (2010: 4). The second, Elemental Philosophy by David Macauley (2010), advocates an anamnesis or recollection of the elements as imaginatively dynamic matter. Within his essays on the imagination of matter, Gaston Bachelard outlined an archetypal vision of the elements predicated upon the material imagination. He explored the manner in which the imagination inhabits the world, is triggered by the stimulus of material dynamism, and is formed from a co-constitution of subject and object. This article proposes that recent trends in materialist philosophy – as exemplified by the monographs of Bennett and Macauley – reinforce the ideas of Bachelard and take them in new directions. Bachelard provides us with a compelling argument for the rediscovery of material imagination, whereas New Materialism portrays a vision of matter filled with autonomous dynamism that lends itself to entering into a relationship with this imagination. Consequently, this article proposes that Gaston Bachelard has gained a new relevance as a result of contemporary trends in material philosophy, has taken on new possibilities through recent scholarship, and remains a force within the twenty-first century discursive landscape.

  19. The Antigerminative Activity of Twenty-Seven Monoterpenes

    Directory of Open Access Journals (Sweden)

    Laura De Martino

    2010-09-01

    Full Text Available Monoterpenes, the main constituents of essential oils, are known for their many biological activities. The present work studied the potential biological activity of twenty-seven monoterpenes, including monoterpene hydrocarbons and oxygenated ones, against seed germination and subsequent primary radicle growth of Raphanus sativus L. (radish) and Lepidium sativum L. (garden cress), under laboratory conditions. The compounds, belonging to different chemical classes, showed different potency in affecting both parameters evaluated. The assayed compounds demonstrated a good inhibitory activity in a dose-dependent way. In general, radish seed is more sensitive than garden cress and its germination appears more inhibited by alcohols; at the highest concentration tested, the more active substances were geraniol, borneol, (±)-β-citronellol and α-terpineol. Geraniol and carvone inhibited, in a significant way, the germination of garden cress, at the highest concentration tested. Radicle elongation of the two test species was inhibited mainly by alcohols and ketones. Carvone inhibited the radicle elongation of both seeds, at almost all concentrations assayed, while 1,8-cineole inhibited their radicle elongation at the lowest concentrations (10⁻⁵ M, 10⁻⁶ M).

  20. The antigerminative activity of twenty-seven monoterpenes.

    Science.gov (United States)

    De Martino, Laura; Mancini, Emilia; de Almeida, Luiz Fernando Rolim; De Feo, Vincenzo

    2010-09-21

    Monoterpenes, the main constituents of essential oils, are known for their many biological activities. The present work studied the potential biological activity of twenty-seven monoterpenes, including monoterpene hydrocarbons and oxygenated ones, against seed germination and subsequent primary radicle growth of Raphanus sativus L. (radish) and Lepidium sativum L. (garden cress), under laboratory conditions. The compounds, belonging to different chemical classes, showed different potency in affecting both parameters evaluated. The assayed compounds demonstrated a good inhibitory activity in a dose-dependent way. In general, radish seed is more sensitive than garden cress and its germination appears more inhibited by alcohols; at the highest concentration tested, the more active substances were geraniol, borneol, (±)-β-citronellol and α-terpineol. Geraniol and carvone inhibited, in a significant way, the germination of garden cress, at the highest concentration tested. Radicle elongation of the two test species was inhibited mainly by alcohols and ketones. Carvone inhibited the radicle elongation of both seeds, at almost all concentrations assayed, while 1,8-cineole inhibited their radicle elongation at the lowest concentrations (10⁻⁵ M, 10⁻⁶ M).

  1. Familial Sarcoidosis: An Analysis of Twenty-Eight Cases

    Directory of Open Access Journals (Sweden)

    Dildar Duman

    2016-12-01

    Full Text Available Objective: Sarcoidosis is a multisystemic disease; its exact cause is unknown, but genetic predisposition and ethnic factors are assumed to play a role in its etiology. Studies of familial sarcoidosis are limited, and only case reports of familial sarcoidosis are available from our country. We aimed to evaluate the prevalence of familial sarcoidosis and the clinical findings of familial cases. Methods: We retrospectively reviewed the file records of 678 patients diagnosed with sarcoidosis and followed up in the sarcoidosis outpatient clinic from January 1996 to February 2016. Twenty-eight familial sarcoidosis cases in 14 families were enrolled in the study. Their demographic findings, family relationships, symptoms, laboratory and pulmonary function test results, radiological appearances, diagnostic methods, and treatments were recorded. Results: Twenty-eight of the 678 sarcoidosis patients were reported as familial cases, giving a prevalence of familial sarcoidosis of 4%. There were 8 sibling, 4 mother-child, 1 father-child, and 1 cousin relationships. The female/male ratio was 1.8, the mean age of the study population was 43, the most frequent symptoms were cough and dyspnea, stage 2 was the most common finding on chest X-ray, the most common CT appearance was mediastinal lymphadenopathy, and mediastinoscopy was the most frequent diagnostic method. Conclusion: This study underlines the importance of asking about family history in patients with suspected sarcoidosis, and of future studies investigating familial aggregation in sarcoidosis.

  2. Superhumps in Cataclysmic Binaries. XXIV. Twenty More Dwarf Novae

    CERN Document Server

    Patterson, J; Kemp, J; Skillman, D R; Vanmunster, T; Harvey, D; Fried, R E; Jensen, L; Cook, L; Rea, R; Monard, B; McCormick, J; Velthuis, F; Walker, S; Martin, B; Bolt, G; Pavlenko, E P; O'Donoghue, D; Gunn, J; Novak, R; Masi, G; Garradd, G; Butterworth, N D; Krajci, T; Foote, J; Beshore, E

    2003-01-01

    We report precise measures of the orbital and superhump period in twenty more dwarf novae. For ten stars, we report new and confirmed spectroscopic periods - signifying the orbital period P_o - as well as the superhump period P_sh. These are GX Cas, HO Del, HS Vir, BC UMa, RZ Leo, KV Dra, KS UMa, TU Crt, QW Ser, and RZ Sge. For the remaining ten, we report a medley of P_o and P_sh measurements from photometry; most are new, with some confirmations of previous values. These are KV And, LL And, WX Cet, MM Hya, AO Oct, V2051 Oph, NY Ser, KK Tel, HV Vir, and RX J1155.4-5641. Periods, as usual, can be measured to high accuracy, and these are of special interest since they carry dynamical information about the binary. We still have not quite learned how to read the music, but a few things are clear. The fractional superhump excess epsilon [=(P_sh-P_o)/P_o] varies smoothly with P_o. The scatter of the points about that smooth curve is quite low, and can be used to limit the intrinsic scatter in M_1, the white dwarf ...
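
    For readers unfamiliar with the quantity, the fractional superhump excess defined in brackets above is a simple ratio; with illustrative (not measured) periods of P_o = 0.0600 d and P_sh = 0.0618 d, for example,

        \[
          \varepsilon \;=\; \frac{P_{\mathrm{sh}} - P_{\mathrm{o}}}{P_{\mathrm{o}}}
          \;=\; \frac{0.0618 - 0.0600}{0.0600} \;=\; 0.030 .
        \]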

  3. Twenty-second Fungal Genetics Conference - Asilomar, 2003

    Energy Technology Data Exchange (ETDEWEB)

    Jonathan D. Walton

    2003-06-30

    The purpose of the Twenty Second Fungal Genetics Conference is to bring together scientists and students who are interested in genetic approaches to studying the biology of filamentous fungi. It is intended to stimulate thinking and discussion in an atmosphere that supports interactions between scientists at different levels and in different disciplines. Topics range from the basic to the applied. Filamentous fungi impact human affairs in many ways. In the environment they are the most important agents of decay and nutrient turnover. They are used extensively in the food industry for the production of food enzymes such as pectinase and food additives such as citric acid. They are used in the production of fermented foods such as alcoholic drinks, bread, cheese, and soy sauce. More than a dozen species of mushrooms are used as foods directly. Many of our most important antibiotics, such as penicillin, cyclosporin, and lovastatin, come from fungi. Fungi also have many negative impacts on human health and economics. Fungi are serious pathogens in immuno-compromised patients. Fungi are the single largest group of plant pathogens and thus a serious limit on crop productivity throughout the world. Many fungi are allergenic, and mold contamination of residences and commercial buildings is now recognized as a serious public health threat. As decomposers, fungi cause extensive damage to just about all natural and synthetic materials.

  4. Twenty years on: Poverty and hardship in urban Fiji

    Directory of Open Access Journals (Sweden)

    Jenny Bryant-Tokalau

    2012-09-01

    Full Text Available Through ‘official statistics’, academic and donor interpretations as well as the eyes of Suva residents, this paper presents an overview and case study of twenty years of growing poverty and hardship in the contemporary Pacific. Focusing on the past two decades, the paper notes how much, and yet so little, has changed for those attempting to make a living in the rapidly developing towns and cities. Changing interpretations of poverty and hardship are presented, moving from the ‘no such thing’ view, to simplification, and finally to an understanding that Pacific island countries, especially Fiji, are no longer an ‘extension’ of Australia and New Zealand, but independent nations actively trying to find solutions to their issues of economic, social and political hardship whilst facing challenges to traditional institutions and networks. Fiji is in some respects a very particular case as almost half of the population has limited access to secure land, but the very nature of that vulnerability to hardship and poverty holds useful lessons for wider analysis.

  5. Protocol for Combined Analysis of FOXFIRE, SIRFLOX, and FOXFIRE-Global Randomized Phase III Trials of Chemotherapy +/- Selective Internal Radiation Therapy as First-Line Treatment for Patients With Metastatic Colorectal Cancer.

    Science.gov (United States)

    Virdee, Pradeep S; Moschandreas, Joanna; Gebski, Val; Love, Sharon B; Francis, E Anne; Wasan, Harpreet S; van Hazel, Guy; Gibbs, Peter; Sharma, Ricky A

    2017-03-28

    In colorectal cancer (CRC), unresectable liver metastases are associated with a poor prognosis. The FOXFIRE (an open-label randomized phase III trial of 5-fluorouracil, oxaliplatin, and folinic acid +/- interventional radioembolization as first-line treatment for patients with unresectable liver-only or liver-predominant metastatic colorectal cancer), SIRFLOX (randomized comparative study of FOLFOX6m plus SIR-Spheres microspheres versus FOLFOX6m alone as first-line treatment in patients with nonresectable liver metastases from primary colorectal carcinoma), and FOXFIRE-Global (assessment of overall survival of FOLFOX6m plus SIR-Spheres microspheres versus FOLFOX6m alone as first-line treatment in patients with nonresectable liver metastases from primary colorectal carcinoma in a randomized clinical study) clinical trials were designed to evaluate the efficacy and safety of combining first-line chemotherapy with selective internal radiation therapy (SIRT) using yttrium-90 resin microspheres, also called transarterial radioembolization. The aim of this analysis is to prospectively combine clinical data from 3 trials to allow adequate power to evaluate the impact of chemotherapy with SIRT on overall survival. Eligible patients are adults with histologically confirmed CRC and unequivocal evidence of liver metastases which are not treatable by surgical resection or local ablation with curative intent at the time of study entry. Patients may also have limited extrahepatic metastases. Final analysis will take place when all participants have been followed up for a minimum of 2 years. Efficacy and safety estimates derived using individual participant data (IPD) from SIRFLOX, FOXFIRE, and FOXFIRE-Global will be pooled using 2-stage prospective meta-analysis. Secondary outcome measures include progression-free survival (PFS), liver-specific PFS, health-related quality of life, response rate, resection rate, and adverse event profile. The large study population will

  6. Random tensors

    CERN Document Server

    Gurau, Razvan

    2017-01-01

    Written by the creator of the modern theory of random tensors, this book is the first self-contained introductory text to this rapidly developing theory. Starting from notions familiar to the average researcher or PhD student in mathematical or theoretical physics, the book presents in detail the theory and its applications to physics. The recent detections of the Higgs boson at the LHC and gravitational waves at LIGO mark new milestones in physics, confirming long-standing predictions of Quantum Field Theory and General Relativity. These two experimental results only reinforce the need to find an underlying common framework for the two: the elusive theory of Quantum Gravity. Over the past thirty years, several alternatives have been proposed as theories of Quantum Gravity, chief among them String Theory. While these theories are yet to be tested experimentally, key lessons have already been learned. Whatever the theory of Quantum Gravity may be, it must incorporate random geometry in one form or another....

  7. The deep, hot biosphere: Twenty-five years of retrospection.

    Science.gov (United States)

    Colman, Daniel R; Poudel, Saroj; Stamps, Blake W; Boyd, Eric S; Spear, John R

    2017-07-03

    Twenty-five years ago this month, Thomas Gold published a seminal manuscript suggesting the presence of a "deep, hot biosphere" in the Earth's crust. Since this publication, a considerable amount of attention has been given to the study of deep biospheres, their role in geochemical cycles, and their potential to inform on the origin of life and its potential outside of Earth. Overwhelming evidence now supports the presence of a deep biosphere ubiquitously distributed on Earth in both terrestrial and marine settings. Furthermore, it has become apparent that much of this life is dependent on lithogenically sourced high-energy compounds to sustain productivity. A vast diversity of uncultivated microorganisms has been detected in subsurface environments, and we show that H2, CH4, and CO feature prominently in many of their predicted metabolisms. Despite 25 years of intense study, key questions remain on life in the deep subsurface, including whether it is endemic and the extent of its involvement in the anaerobic formation and degradation of hydrocarbons. Emergent data from cultivation and next-generation sequencing approaches continue to provide promising new hints to answer these questions. As Gold suggested, and as has become increasingly evident, to better understand the subsurface is critical to further understanding the Earth, life, the evolution of life, and the potential for life elsewhere. To this end, we suggest the need to develop a robust network of interdisciplinary scientists and accessible field sites for long-term monitoring of the Earth's subsurface in the form of a deep subsurface microbiome initiative.

  8. Twenty years of minimally invasive surgery in the Czech Republic

    Directory of Open Access Journals (Sweden)

    Miloslav Duda

    2011-03-01

    increased over the last twenty years, and the range of types of surgical therapies has enlarged.

  9. Twenty Common Testing Mistakes for EFL Teachers to Avoid

    Science.gov (United States)

    Henning, Grant

    2012-01-01

    To some extent, good testing procedure, like good language use, can be achieved through avoidance of errors. Almost any language-instruction program requires the preparation and administration of tests, and it is only to the extent that certain common testing mistakes have been avoided that such tests can be said to be worthwhile selection,…

  10. Educators Guide to Free Filmstrips. Twenty-Fifth Edition.

    Science.gov (United States)

    Horkheimer, Mary Foley, Comp.; Diffor, John C., Comp.

    Slightly more than 500 titles, including 122 new this year, which are available free from 99 sources are listed in this guide to help teachers and librarians select current relevant materials for their students. One hundred and fifty silent filmstrips, 113 sound films, and 243 slide sets are cataloged, covering the following subjects: aerospace…

  11. Random functions and turbulence

    CERN Document Server

    Panchev, S

    1971-01-01

    International Series of Monographs in Natural Philosophy, Volume 32: Random Functions and Turbulence focuses on the use of random functions as mathematical methods. The manuscript first offers information on the elements of the theory of random functions. Topics include determination of statistical moments by characteristic functions; functional transformations of random variables; multidimensional random variables with spherical symmetry; and random variables and distribution functions. The book then discusses random processes and random fields, including stationarity and ergodicity of random

  12. Selective Androgen Receptor Modulator (SARM) treatment prevents bone loss and reduces body fat in ovariectomized rats.

    Science.gov (United States)

    Kearbey, Jeffrey D; Gao, Wenqing; Narayanan, Ramesh; Fisher, Scott J; Wu, Di; Miller, Duane D; Dalton, James T

    2007-02-01

    This study was conducted to examine the bone and body composition effects of S-4, an aryl-propionamide derived Selective Androgen Receptor Modulator (SARM) in an ovariectomy induced model of accelerated bone loss. One hundred twenty female Sprague-Dawley rats aged to twenty-three weeks were randomly assigned to twelve treatment groups. Drug treatment was initiated immediately following ovariectomy and continued for one hundred twenty days. Whole body bone mineral density (BMD), body composition, and lumbar vertebrae BMD were measured by dual energy x-ray absorptiometry. More stringent regional pQCT and biomechanical strength testing was performed on excised femurs. We found that S-4 treatment maintained whole body and trabecular BMD, cortical content, and increased bone strength while decreasing body fat in these animals. The data presented herein show the protective skeletal effects of S-4. Our previous reports have shown the tissue selectivity and muscle anabolic activity of S-4. Together these data suggest that S-4 could reduce the incidence of fracture via two different mechanisms (i.e., via direct effects in bone and reducing the incidence of falls through increased muscle strength). This approach to fracture reduction would be advantageous over current therapies in these patients which are primarily antiresorptive in nature.
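
    The randomization step described above (one hundred twenty animals into twelve groups) is the same simple random allocation that runs through many of the records in this collection. A minimal sketch, assuming equal group sizes and a fixed seed for reproducibility, is:

        import numpy as np

        def assign_groups(n_subjects=120, n_groups=12, seed=0):
            """Randomly assign subjects to equally sized treatment groups."""
            rng = np.random.default_rng(seed)
            order = rng.permutation(n_subjects)          # shuffle subject indices
            return np.array_split(order, n_groups)       # 12 groups of 10

        groups = assign_groups()
        print([len(g) for g in groups])                  # [10, 10, ..., 10]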

  13. Selective Androgen Receptor Modulator (SARM) Treatment Prevents Bone Loss and Reduces Body Fat in Ovariectomized Rats

    Science.gov (United States)

    Kearbey, Jeffrey D.; Gao, Wenqing; Narayanan, Ramesh; Fisher, Scott J.; Wu, Di; Miller, Duane D.; Dalton, James T.

    2007-01-01

    Purpose This study was conducted to examine the bone and body composition effects of S-4, an arylpropionamide derived Selective Androgen Receptor Modulator (SARM) in an ovariectomy induced model of accelerated bone loss. Methods One hundred twenty female Sprague-Dawley rats aged to twenty-three weeks were randomly assigned to twelve treatment groups. Drug treatment was initiated immediately following ovariectomy and continued for one hundred twenty days. Whole body bone mineral density (BMD), body composition, and lumbar vertebrae BMD were measured by dual energy x-ray absorptiometry. More stringent regional pQCT and biomechanical strength testing was performed on excised femurs. Results We found that S-4 treatment maintained whole body and trabecular BMD, cortical content, and increased bone strength while decreasing body fat in these animals. Conclusions The data presented herein show the protective skeletal effects of S-4. Our previous reports have shown the tissue selectivity and muscle anabolic activity of S-4. Together these data suggest that S-4 could reduce the incidence of fracture via two different mechanisms (i.e., via direct effects in bone and reducing the incidence of falls through increased muscle strength). This approach to fracture reduction would be advantageous over current therapies in these patients which are primarily antiresorptive in nature. PMID:17063395

  14. Decadal potential predictability of twenty-first century climate

    Energy Technology Data Exchange (ETDEWEB)

    Boer, George J. [Canadian Centre for Climate Modelling and Analysis, Environment Canada, PO Box 3065, Victoria, BC (Canada)

    2011-03-15

    Decadal prediction of the coupled climate system is potentially possible given enough information and knowledge. Predictability will reside in both externally forced and long-timescale internally generated variability. The "potential predictability" investigated here is characterized by the fraction of the total variability accounted for by these two components in the presence of short-timescale unpredictable "noise" variability. Potential predictability is not a classical measure of predictability nor a measure of forecast skill but it does identify regions where long timescale variability is an appreciable fraction of the total and hence where prediction on these scales may be possible. A multi-model estimate of the potential predictability variance fraction (ppvf) as it evolves through the first part of the twenty-first century is obtained using simulation data from the CMIP3 archive. Two estimates of potential predictability are used which depend on the treatment of the forced component. The multi-decadal estimate considers the magnitude of the forced component as the change from the beginning of the century and so becomes largely a measure of climate change as the century progresses. The next-decade estimate considers the change in the forced component from the past decade and so is more pertinent to an actual forecast for the next decade. Long timescale internally generated variability provides additional potential predictability beyond that of the forced component. The ppvf may be expressed in terms of a signal-to-noise ratio and takes on values between 0 and 1. The largest values of the ppvf for temperature are found over tropical and mid-latitude oceans, with the exception of the equatorial Pacific, and some but not all tropical land areas. Overall the potential predictability for temperature generally declines with latitude and is relatively low over mid- to high-latitude land. Potential predictability for
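
    The statement that the ppvf can be expressed through a signal-to-noise ratio and is bounded by 0 and 1 is consistent with writing it as a variance fraction; a plausible form (the paper's exact decomposition of forced and internal terms may differ) is

        \[
          \mathrm{ppvf} \;=\;
          \frac{\sigma^{2}_{\mathrm{forced}} + \sigma^{2}_{\mathrm{internal}}}
               {\sigma^{2}_{\mathrm{forced}} + \sigma^{2}_{\mathrm{internal}} + \sigma^{2}_{\mathrm{noise}}}
          \;=\; \frac{s}{1+s},
          \qquad
          s \;=\; \frac{\sigma^{2}_{\mathrm{forced}} + \sigma^{2}_{\mathrm{internal}}}{\sigma^{2}_{\mathrm{noise}}},
        \]

    which approaches 0 when the noise dominates and 1 when the long-timescale components dominate.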

  15. Twenty-Five Year Site Plan FY2013 - FY2037

    Energy Technology Data Exchange (ETDEWEB)

    Jones, William H. [Los Alamos National Laboratory

    2012-07-12

    Los Alamos National Laboratory (the Laboratory) is the nation's premier national security science laboratory. Its mission is to develop and apply science and technology to ensure the safety, security, and reliability of the United States (U.S.) nuclear stockpile; reduce the threat of weapons of mass destruction, proliferation, and terrorism; and solve national problems in defense, energy, and the environment. The fiscal year (FY) 2013-2037 Twenty-Five Year Site Plan (TYSP) is a vital component for planning to meet the National Nuclear Security Administration (NNSA) commitment to ensure the U.S. has a safe, secure, and reliable nuclear deterrent. The Laboratory also uses the TYSP as an integrated planning tool to guide development of an efficient and responsive infrastructure that effectively supports the Laboratory's missions and workforce. Emphasizing the Laboratory's core capabilities, this TYSP reflects the Laboratory's role as a prominent contributor to NNSA missions through its programs and campaigns. The Laboratory is aligned with Nuclear Security Enterprise (NSE) modernization activities outlined in the NNSA Strategic Plan (May 2011) which include: (1) ensuring laboratory plutonium space effectively supports pit manufacturing and enterprise-wide special nuclear materials consolidation; (2) constructing the Chemistry and Metallurgy Research Replacement Nuclear Facility (CMRR-NF); (3) establishing shared user facilities to more cost effectively manage high-value, experimental, computational and production capabilities; and (4) modernizing enduring facilities while reducing the excess facility footprint. This TYSP is viewed by the Laboratory as a vital planning tool to develop an efficient and responsive infrastructure. Long range facility and infrastructure development planning is critical to assure sustainment and modernization. Out-year re-investment is essential for sustaining existing facilities, and will be re-evaluated on an annual

  16. Understanding Contamination; Twenty Years of Simulating Radiological Contamination

    Energy Technology Data Exchange (ETDEWEB)

    Emily Snyder; John Drake; Ryan James

    2012-02-01

    A wide variety of simulated contamination methods have been developed by researchers to reproducibly test radiological decontamination methods. Some twenty years ago a method of non-radioactive contamination simulation was proposed at the Idaho National Laboratory (INL) that mimicked the character of radioactive cesium and zirconium contamination on stainless steel. It involved baking the contamination into the surface of the stainless steel in order to 'fix' it into a tenacious, tightly bound oxide layer. This type of contamination was particularly applicable to nuclear processing facilities (and nuclear reactors) where oxide growth and exchange of radioactive materials within the oxide layer became the predominant model for material/contaminant interaction. Additional simulation methods and their empirically derived basis (from a nuclear fuel reprocessing facility) are discussed. In the last ten years the INL, working with the Defense Advanced Research Projects Agency (DARPA) and the National Homeland Security Research Center (NHSRC), has continued to develop contamination simulation methodologies. The most notable of these newer methodologies was developed to compare the efficacy of different decontamination technologies against radiological dispersal device (RDD, 'dirty bomb') type of contamination. There are many different scenarios for how RDD contamination may be spread, but the most commonly used one at the INL involves the dispersal of an aqueous solution containing radioactive Cs-137. This method was chosen during the DARPA projects and has continued through the NHSRC series of decontamination trials and also gives a tenacious 'fixed' contamination. Much has been learned about the interaction of cesium contamination with building materials, particularly concrete, throughout these tests. The effects of porosity, cation-exchange capacity of the material and the amount of dirt and debris on the surface are very important factors

  17. Faster generation of random spanning trees

    CERN Document Server

    Kelner, Jonathan A

    2009-01-01

    In this paper, we set forth a new algorithm for generating approximately uniformly random spanning trees in undirected graphs. We show how to sample from a distribution that is within a multiplicative $(1+\delta)$ of uniform in expected time $\tilde{O}(m\sqrt{n}\log(1/\delta))$. This improves the sparse graph case of the best previously known worst-case bound of $O(\min\{mn, n^{2.376}\})$, which has stood for twenty years. To achieve this goal, we exploit the connection between random walks on graphs and electrical networks, and we use this to introduce a new approach to the problem that integrates discrete random walk-based techniques with continuous linear algebraic methods. We believe that our use of electrical networks and sparse linear system solvers in conjunction with random walks and combinatorial partitioning techniques is a useful paradigm that will find further applications in algorithmic graph theory.
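    For readers unfamiliar with the random-walk connection exploited here, the sketch below shows the classical Aldous-Broder method, which samples a uniformly random spanning tree by recording the first-entrance edges of a simple random walk. It only illustrates that connection; it is not the faster electrical-network algorithm proposed in the paper, and the example graph is made up.

```python
import random

def aldous_broder_spanning_tree(adj, start=0):
    """Sample a uniformly random spanning tree of a connected undirected graph.

    Classical Aldous-Broder random-walk method (expected cover-time cost),
    shown only to illustrate the random-walk connection; NOT the paper's
    faster electrical-network algorithm. `adj` maps node -> list of neighbours.
    """
    visited = {start}
    tree_edges = []
    current = start
    while len(visited) < len(adj):
        nxt = random.choice(adj[current])   # one step of the simple random walk
        if nxt not in visited:              # first entrance to nxt: keep this edge
            visited.add(nxt)
            tree_edges.append((current, nxt))
        current = nxt
    return tree_edges

# Example: a 4-cycle with a chord (hypothetical graph)
graph = {0: [1, 3], 1: [0, 2, 3], 2: [1, 3], 3: [0, 1, 2]}
print(aldous_broder_spanning_tree(graph))
```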

  18. Twenty-five years of ambulatory laparoscopic cholecystectomy.

    Science.gov (United States)

    Bueno Lledó, José; Granero Castro, Pablo; Gomez I Gavara, Inmaculada; Ibañez Cirión, Jose L; López Andújar, Rafael; García Granero, Eduardo

    2016-10-01

    It is accepted by the surgical community that laparoscopic cholecystectomy (LC) is the technique of choice in the treatment of symptomatic cholelithiasis. More controversial, however, is the standardization of its implementation in Ambulatory Surgery because of its different connotations. This article aims to update the factors that influence the performance of LC in day surgery, analyzing the 25 years since its implementation and focusing on quality and acceptance by the patient. Individualization is essential: patient selection criteria and implementation by teams experienced in LC are factors that ensure a high likelihood of success.

  19. Patented Biologically-inspired Technological Innovations: A Twenty Year View

    Institute of Scientific and Technical Information of China (English)

    Richard H. C. Bonser

    2006-01-01

    Publication rate of patents can be a useful measure of innovation and productivity in fields of science and technology. To assess the growth in industrially important research, I conducted an appraisal of patents published between 1985 and 2005 in online databases, using keywords chosen to select technologies arising from biological inspiration. Whilst the total number of patents increased over the period examined, those with biomimetic content increased faster as a proportion of total patent publications. Logistic regression analysis reveals that we may be a little over halfway through an initial innovation cycle inspired by biological systems.

  20. Randomized Consensus Processing over Random Graphs: Independence and Convergence

    CERN Document Server

    Shi, Guodong

    2011-01-01

    Various consensus algorithms over random networks have been investigated in the literature. In this paper, we focus on the role that randomized individual decision-making plays in consensus seeking under stochastic communications. At each time step, each node independently chooses either to follow the consensus algorithm or to stick to its current state, by a simple Bernoulli trial with time-dependent success probabilities. This node decision strategy characterizes random node failures in a communication network, or biased opinion selection in the belief evolution over social networks. Connectivity-independent and arc-independent graphs are defined, respectively, to capture the fundamental nature of random network processes with regard to the convergence of the consensus algorithms. A series of sufficient and/or necessary conditions are given on the success probability sequence for the network to reach a global consensus with probability one under different stochastic connectivity assumptions, by which a comp...
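    A minimal toy simulation of the node-decision strategy described above: in each synchronous round a node runs a consensus update with probability p and otherwise keeps its state. The neighbourhood-averaging update rule and all parameter values are illustrative assumptions, not taken from the paper.

```python
import random

def randomized_consensus_step(x, neighbors, p):
    """One synchronous round: each node follows an averaging consensus update
    with probability p (a Bernoulli trial) and otherwise keeps its current state.
    Illustrative toy model only; the averaging rule is an assumption."""
    new_x = list(x)
    for i in range(len(x)):
        if random.random() < p:                      # node i chooses to participate
            nbrs = neighbors[i] + [i]
            new_x[i] = sum(x[j] for j in nbrs) / len(nbrs)
    return new_x

# Example: 4 nodes on a ring, participation probability 0.8 (hypothetical values)
neighbors = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
x = [0.0, 1.0, 2.0, 3.0]
for _ in range(50):
    x = randomized_consensus_step(x, neighbors, 0.8)
print(x)   # states should cluster near a common value
```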

  1. Random fixed points and random differential inclusions

    Directory of Open Access Journals (Sweden)

    Nikolaos S. Papageorgiou

    1988-01-01

    In this paper we first study random best approximations to random sets, using fixed point techniques, obtaining in this way stochastic analogues of earlier deterministic results by Browder-Petryshyn, Ky Fan and Reich. Then we prove two fixed point theorems for random multifunctions with stochastic domain that satisfy certain tangential conditions. Finally, we consider a random differential inclusion with upper semicontinuous orientor field and establish the existence of random solutions.

  2. Twenty five years of organic chemistry with diiodosamarium

    Energy Technology Data Exchange (ETDEWEB)

    Kagan, Henri B. [Laboratoire de Synthese Asymetrique (UMR 8075), Institut de Chimie Moleculaire et des Materiaux d' Orsay, Universite Paris-Sud, 91405 Orsay (France)]. E-mail: kagan@icmo.u-psud.fr

    2006-02-09

    A historical account of the introduction of samarium diiodide into organic chemistry is presented in the first section, together with the main initial results. The basic organic transformations published by the author's laboratory and by other groups during the initial period (1977-1987) are detailed. Some of the progress subsequently achieved is then highlighted, such as various transformations in synthesis, including asymmetric synthesis and the total synthesis of natural products. The possible use of SmI{sub 2} in catalytic amounts together with a terminal reducing agent is discussed. The conclusion summarizes the wide scope of the chemistry induced by SmI{sub 2}, with some comments on the future of this chemistry.

  3. Drone Warfare: Twenty-First Century Empire and Communications

    Directory of Open Access Journals (Sweden)

    Kevin Howley

    2017-02-01

    This paper, part of a larger project that examines drones from a social-construction-of-technology perspective, considers drone warfare in light of Harold Innis’s seminal work on empire and communication. Leveraging leading-edge aeronautics with advanced optics, data processing, and networked communication, drones represent an archetypal “space-biased” technology. Indeed, by allowing remote operators and others to monitor, select, and strike targets from half a world away, and in real time, these weapon systems epitomize the “pernicious neglect of time” Innis sought to identify and remedy in his later writing. With Innis’s time-space dialectic as a starting point, the paper then considers drones in light of a longstanding paradox of American culture: the impulse to collapse the geographical distance between the United States and other parts of the globe, while simultaneously magnifying the cultural difference between Americans and other peoples and societies. In the midst of the worldwide proliferation of drones, this quintessentially sublime technology embodies this (dis)connect in important, profound, and ominous ways.

  4. Twenty years of social capital and health research: a glossary.

    Science.gov (United States)

    Moore, S; Kawachi, I

    2017-05-01

    Research on social capital in public health is approaching its 20th anniversary. Over this period, there have been rich and productive debates on the definition, measurement and importance of social capital for public health research and practice. As a result, the concepts and measures characterising social capital and health research have also evolved, often drawing from research in the social, political and behavioural sciences. The multidisciplinary adaptation of social capital-related concepts to study health has made it challenging for researchers to reach consensus on a common theoretical approach. This glossary thus aims to provide a general overview without recommending any particular approach. Based on our knowledge and research on social capital and health, we have selected key concepts and terms that have gained prominence over the last decade and complement an earlier glossary on social capital and health.

  5. Indication to Open Anatrophic Nephrolithotomy in the Twenty-First Century: A Case Report

    Directory of Open Access Journals (Sweden)

    Alfredo Maria Bove

    2012-01-01

    Introduction. Advances in endourology have greatly reduced the indications for open surgery in the treatment of staghorn kidney stones. Nevertheless, in our experience open surgery still represents the treatment of choice in rare cases. Case Report. A 71-year-old morbidly obese female patient, complaining of occasional left flank pain and recurrent cystitis for many years, presented with bilateral staghorn kidney stones. Comorbidities were obesity (BMI 36.2), hypertension, type II diabetes, chronic obstructive pulmonary disease (COPD), and hyperlipidemia. Owing to these comorbidities, endoscopic and laparoscopic approaches were not indicated. We offered the patient staged open anatrophic nephrolithotomy. Results. Operative time was 180 minutes. Blood loss was 500 cc, requiring one unit of packed red blood cells. Hospital stay was 7 days. Renal function was unaffected based on preoperative and postoperative serum creatinine levels. Stone-free status of the left kidney was confirmed after surgery with a CT scan. Conclusions. Open surgery can represent a valid alternative for the treatment of staghorn kidney stones in very selected cases. A discussion of the current indications in the twenty-first century is presented.

  6. Genetic diversity in twenty variants of the avian polyomavirus.

    Science.gov (United States)

    Phalen, D N; Wilson, V G; Gaskin, J M; Derr, J N; Graham, D L

    1999-01-01

    To determine if different pathotypes of the avian polyomavirus (APV) exist and to compare the genomes of APVs originating from different geographic areas, dates, and species of birds, the partial sequences of 18 APVs were determined. New viral sequences were compared with three published APV sequences. Two of the new viruses had identical sequences. Forty point mutations were found at 31 loci. A 27-bp deletion was found in the VP2 and VP3 open reading frames of one virus. A duplication of the putative origin of replication and adjacent enhancer region was previously reported in one APV. Smaller duplications involving the origin in one APV and a second enhancer region in another were discovered. All duplications were in tissue culture-adapted viruses, suggesting they occurred during the isolation process. Excluding duplications and the deletion, maximum variation between viruses was small (11 bp). A maximum parsimony tree was constructed that contained three major branches. The three earliest isolates were on separate branches. The European viruses were confined to branch I, but APVs from the United States were on all three branches. Lovebird, budgerigar, and macaw APVs were also on each of the three branches, suggesting that species-specific pathotypes have not developed. Most nonsynonymous mutations occurred in a small portion of the VP2 and VP3 open reading frames, demonstrating a selection for these mutations. That a glycine at VP2 221 will inhibit virus replication in chicken embryo fibroblasts (CEFs) has been previously reported. In contrast, six of seven of the new APVs isolated in CEFs had a glycine at VP2 221.

  7. Randomization in clinical trials in orthodontics: its significance in research design and methods to achieve it.

    Science.gov (United States)

    Pandis, Nikolaos; Polychronopoulou, Argy; Eliades, Theodore

    2011-12-01

    Randomization is a key step in reducing selection bias during the treatment allocation phase in randomized clinical trials. The process of randomization follows specific steps, which include generation of the randomization list, allocation concealment, and implementation of randomization. Characterizing treatment allocation as random is frequent in the dental and orthodontic literature; however, the randomization procedures followed are often not appropriate. Randomization methods assign treatment to the trial arms at random, without foreknowledge of allocation by either the participants or the investigators, thus reducing selection bias. Randomization entails generation of the random allocation sequence, allocation concealment, and the actual methodology for implementing treatment allocation randomly and unpredictably. The most popular randomization methods include some form of restricted and/or stratified randomization. This article introduces the reasons that make randomization an integral part of solid clinical trial methodology and presents the main randomization schemes applicable to clinical trials in orthodontics.
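    As an illustration of one common restricted scheme mentioned above, the sketch below generates a permuted-block randomization list. The block size, arm labels, and seed are hypothetical choices, and allocation concealment (e.g., central randomization or sealed envelopes) is a separate step not shown here.

```python
import random

def permuted_block_randomization(n_participants, arms=("A", "B"), block_size=4, seed=None):
    """Generate a restricted (permuted-block) randomization list.

    Minimal sketch of one common scheme; block size, arm labels and the
    seed are illustrative choices, not recommendations."""
    rng = random.Random(seed)
    per_arm = block_size // len(arms)
    allocation = []
    while len(allocation) < n_participants:
        block = list(arms) * per_arm          # balanced block, e.g. A A B B
        rng.shuffle(block)                    # random permutation within the block
        allocation.extend(block)
    return allocation[:n_participants]

print(permuted_block_randomization(10, seed=2011))
```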

  8. High Temperature Superconductivity in the Past Twenty Years Part 1-Discovery, Material, and Theory

    Institute of Scientific and Technical Information of China (English)

    Jian-Xun Jin

    2008-01-01

    Twenty years after the discovery of high-temperature superconductors (HTSs), HTS materials have now been well developed, while the mechanism of superconductivity remains one of the topical interests in physics. The achievements made on HTS materials and theories during the last twenty years are reviewed comprehensively in this paper.

  9. A Critical Feminist and Race Critique of Thomas Piketty's "Capital in the Twenty-First Century"

    Science.gov (United States)

    Moeller, Kathryn

    2016-01-01

    Thomas Piketty's "Capital in the Twenty-first Century" documents the foreboding nature of rising wealth inequality in the twenty-first century. In an effort to promote a more just and democratic global society and rein in the unfettered accumulation of wealth by the few, Piketty calls for a global progressive annual tax on corporate…

  11. Slowmation: A Twenty-First Century Educational Tool for Science and Mathematics Pre-Service Teachers

    Science.gov (United States)

    Paige, Kathryn; Bentley, Brendan; Dobson, Stephen

    2016-01-01

    Slowmation is a twenty-first century digital literacy educational tool. This teaching and learning tool has been incorporated as an assessment strategy in the curriculum area of science and mathematics with pre-service teachers (PSTs). This paper explores two themes: developing twenty-first century digital literacy skills and modelling best…

  12. Randomized Trial on the Effectiveness of Dexamethasone in TMJ Arthrocentesis

    NARCIS (Netherlands)

    Huddleston-Slater, J.J.R.; Vos, L.M.; Stroy, L.P.P.; Stegenga, B.

    2012-01-01

    The aim of this study was to compare the effectiveness of dexamethasone administration following arthrocentesis of the temporomandibular joint (TMJ) with a placebo (saline). Twenty-eight participants with TMJ arthralgia were randomly assigned to two groups of a parallel double-blind RCT. In both gro

  13. Light-cured calcium hydroxide vs formocresol in human primary molar pulpotomies: a randomized controlled trial.

    Science.gov (United States)

    Zurn, Derek; Seale, N Sue

    2008-01-01

    The purpose of this prospective study was to compare light-cured calcium hydroxide (Ca(OH)2) with diluted formocresol (FC) for success as a primary molar pulpotomy medicament. Selection criteria included at least 2 matching, asymptomatic, contralateral primary molars requiring vital pulpotomies. Matched teeth in each patient were randomized to receive either Ca(OH)2 or FC as the pulpotomy medicament. All teeth were restored with prefabricated metal crowns. Twenty patients (34 pairs of teeth) were followed clinically and radiographically for ≥1 year. Two blinded, standardized, and calibrated examiners evaluated and scored each radiograph for signs of pathology, based upon a modified scale previously proposed. Findings were grouped into (a) 0-6-, (b) 7-12-, and (c) 13-24-month intervals. Radiographic scoring favored the FC group at the 7- to 12- and 13- to 24-month intervals (Pformocresol as a pulpotomy agent.

  14. Consistency of Random Survival Forests.

    Science.gov (United States)

    Ishwaran, Hemant; Kogalur, Udaya B

    2010-07-01

    We prove uniform consistency of Random Survival Forests (RSF), a newly introduced forest ensemble learner for analysis of right-censored survival data. Consistency is proven under general splitting rules, bootstrapping, and random selection of variables, that is, under true implementation of the methodology. Under this setting we show that the forest ensemble survival function converges uniformly to the true population survival function. To prove this result we make one key assumption regarding the feature space: we assume that all variables are factors. Doing so ensures that the feature space has finite cardinality and enables us to exploit counting process theory and the uniform consistency of the Kaplan-Meier survival function.

  15. Twenty-Four-Hour Ambulatory Blood Pressure Monitoring in Hypertension

    Science.gov (United States)

    2012-01-01

    assessments, systematic reviews, meta-analyses, or randomized controlled trials. Exclusion Criteria non-English papers; animal or in vitro studies; case reports, case series, or case-case studies; studies comparing different antihypertensive therapies and evaluating their antihypertensive effects using 24-hour ABPM; studies on home or self-monitoring of BP, and studies on automated office BP measurement; studies in high-risk subgroups (e.g. diabetes, pregnancy, kidney disease). Outcomes of Interest Patient Outcomes mortality: all cardiovascular events (e.g., myocardial infarction [MI], stroke); non-fatal: all cardiovascular events (e.g., MI, stroke); combined fatal and non-fatal: all cardiovascular events (e.g., MI, stroke); all non-cardiovascular events; control of BP (e.g. systolic and/or diastolic target level). Drug-Related Outcomes percentage of patients who show a reduction in, or stop, drug treatment; percentage of patients who begin multi-drug treatment; drug therapy use (e.g. number, intensity of drug use); drug-related adverse events. Quality of Evidence The quality of the body of evidence was assessed as high, moderate, low, or very low according to the GRADE Working Group criteria. As stated by the GRADE Working Group, the following definitions of quality were used in grading the quality of the evidence: High Further research is very unlikely to change confidence in the estimate of effect. Moderate Further research is likely to have an important impact on confidence in the estimate of effect and may change the estimate. Low Further research is very likely to have an important impact on confidence in the estimate of effect and is likely to change the estimate. Very Low Any estimate of effect is very uncertain. Summary of Findings Short-Term Follow-Up Studies (Length of Follow-Up of ≤ 1 Year) Based on very low quality of evidence, there is no difference between technologies for non-fatal cardiovascular events. Based on moderate quality of evidence, ABPM resulted

  16. Stabilizing Randomly Switched Systems

    CERN Document Server

    Chatterjee, Debasish

    2008-01-01

    This article is concerned with stability analysis and stabilization of randomly switched systems under a class of switching signals. The switching signal is modeled as a jump stochastic (not necessarily Markovian) process independent of the system state; it selects, at each instant of time, the active subsystem from a family of systems. Sufficient conditions for stochastic stability (almost sure, in the mean, and in probability) of the switched system are established when the subsystems do not possess control inputs, and not every subsystem is required to be stable. These conditions are employed to design stabilizing feedback controllers when the subsystems are affine in control. The analysis is carried out with the aid of multiple Lyapunov-like functions, and the analysis results together with universal formulae for feedback stabilization of nonlinear systems constitute our primary tools for control design

  17. RANDOM WALK HYPOTHESIS IN FINANCIAL MARKETS

    Directory of Open Access Journals (Sweden)

    Nicolae-Marius JULA

    2017-05-01

    The random walk hypothesis states that stock market prices do not follow a predictable trajectory but are simply random. Before attempting to predict such data, one should therefore test for randomness, because, despite the power and complexity of the models used, the results cannot otherwise be trusted. There are several methods for testing this hypothesis, and the computational power provided by the R environment makes the researcher's work easier and more cost-effective. The increasing power of computing and the continuous development of econometric tests should give potential investors new tools for selecting commodities and investing in efficient markets.
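    A quick, generic check of this kind is the Wald-Wolfowitz runs test on the signs of returns: under the random walk hypothesis the number of sign runs is approximately normal with the mean and variance used below. This is a minimal illustrative sketch (in Python rather than R, with made-up return figures), not the specific battery of tests referenced above.

```python
import math

def runs_test(returns):
    """Wald-Wolfowitz runs test on the signs of non-zero returns.

    Returns the z statistic; |z| much larger than ~2 casts doubt on randomness.
    Illustrative sketch only (assumes both positive and negative returns occur)."""
    signs = [1 if r > 0 else 0 for r in returns if r != 0]
    n1 = sum(signs)               # number of positive returns
    n2 = len(signs) - n1          # number of negative returns
    runs = 1 + sum(1 for a, b in zip(signs, signs[1:]) if a != b)
    mean = 2 * n1 * n2 / (n1 + n2) + 1
    var = (2 * n1 * n2 * (2 * n1 * n2 - n1 - n2)) / ((n1 + n2) ** 2 * (n1 + n2 - 1))
    return (runs - mean) / math.sqrt(var)

# Hypothetical daily returns
sample_returns = [0.01, -0.02, 0.005, 0.007, -0.01, 0.02, -0.015, 0.003, -0.004, 0.006]
print(runs_test(sample_returns))
```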

  18. Acupuncture for Preventing Complications after Radical Hysterectomy: A Randomized Controlled Clinical Trial

    Directory of Open Access Journals (Sweden)

    Wei-min Yi

    2014-01-01

    We aimed to investigate the preventive effects of acupuncture on complications after radical hysterectomy. A single-center randomized controlled single-blinded trial was performed in a western-style hospital in China. One hundred and twenty patients who had undergone radical hysterectomy were randomly allocated to two groups and started acupuncture on the sixth postoperative day for five consecutive days. Sanyinjiao (SP6), Shuidao (ST28), and Epangxian III (MS4) were selected with electrical stimulation, and Zusanli (ST36) without electrical stimulation, for thirty minutes in the treatment group. Binao (LI14) was selected as a sham acupuncture point without any stimulation in the control group. The main outcome measures were bladder function and the prevalence of postoperative complications. Compared with the control group, the treatment group showed significantly improved bladder function in terms of maximal cystometric capacity, first voiding desire, maximal flow rate, residual urine, and bladder compliance, and decreased bladder sensory loss, incontinence, and urinary retention on the fifteenth and thirtieth postoperative days. The treatment group also showed a significant advantage in the reduction of urinary tract infection on the thirtieth postoperative day, but no significant difference between groups was observed for lymphocyst formation. By improving postoperative bladder function, early intervention with acupuncture may provide a valuable alternative method to prevent bladder dysfunction and urinary tract infection after radical hysterectomy.

  19. Genetic parameters for residual feed intake in a random population of Pekin duck.

    Science.gov (United States)

    Zhang, Yunsheng; Guo, Zhan Bao; Xie, Ming; Zhang, Zhiying; Hou, Shuisheng

    2017-02-01

    Feed intake (FI) and feed efficiency are economically important traits in ducks. To gain insight into these traits, we designed an experiment based on the residual feed intake (RFI) and feed conversion ratio (FCR) of a random population of Pekin ducks. Two thousand and twenty pedigreed ducks from this random population were produced from 90 males mated to 450 females in two hatches. Traits analyzed in the study were body weight at day 42 (BW42), average daily gain from day 15 to 42 (ADG), FI from day 15 to 42, FCR from day 15 to 42, and RFI from day 15 to 42, to assess their genetic inter-relationships. Genetic parameters for the feed efficiency traits were estimated by restricted maximum likelihood (REML) applied to a sire-dam model for all traits using the ASREML software. Heritability estimates for BW42, ADG, FI, FCR, and RFI were 0.39, 0.38, 0.33, 0.38, and 0.41, respectively. The genetic correlation was high between RFI and FI (0.77) and moderate between RFI and FCR (0.54). The genetic correlations between FCR and ADG (-0.80), between FCR and BW42 (-0.64), and between FCR and FI (0.49) were high, moderate, and moderate, respectively. Thus, selection on RFI is expected to improve feed efficiency and reduce FI without impairing growth rate.
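    For context, RFI is conventionally computed as the residual from a regression of feed intake on growth and (metabolic) body weight, so a lower RFI indicates an animal that eats less than predicted for its weight and gain. The sketch below shows this phenotypic calculation with made-up figures; the exact covariates and the sire-dam genetic model used in the study above may differ.

```python
import numpy as np

def residual_feed_intake(fi, adg, bw):
    """Compute RFI as the residual of a linear regression of feed intake on
    average daily gain and metabolic body weight (BW**0.75).

    Conventional phenotypic definition; covariates are an assumption here."""
    X = np.column_stack([np.ones_like(fi), adg, np.asarray(bw) ** 0.75])
    beta, *_ = np.linalg.lstsq(X, fi, rcond=None)
    return fi - X @ beta          # positive RFI = eats more than predicted (less efficient)

# Toy example with made-up records for five ducks
fi  = np.array([310.0, 295.0, 330.0, 305.0, 320.0])    # g/day feed intake
adg = np.array([55.0, 52.0, 58.0, 54.0, 56.0])          # g/day gain
bw  = np.array([3.1, 2.9, 3.3, 3.0, 3.2])               # kg at day 42
print(residual_feed_intake(fi, adg, bw))
```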

  20. Effects of a stepwise multidisciplinary intervention for challenging behavior in advanced dementia: a cluster randomized controlled trial.

    NARCIS (Netherlands)

    Pieper, M.J.C.; Francke, A.L.; Steen, J.T. van der; Scherder, E.J.A.; Twisk, J.W.R.; Kovach, C.R.; Achterberg, W.P.

    2016-01-01

    Objectives: To assess whether implementation of a stepwise multicomponent intervention (STA OP!) is effective in reducing challenging behavior and depression in nursing home residents with advanced dementia. Design: Cluster randomized controlled trial. Setting: Twenty-one clusters (single

  1. Effects of a Stepwise Multidisciplinary Intervention for Challenging Behavior in Advanced Dementia: A Cluster Randomized Controlled Trial

    NARCIS (Netherlands)

    Pieper, M.J.; Francke, A.L.; Steen, J.T. van der; Scherder, E.J.; Twisk, J.W.; Kovach, C.R.; Achterberg, W.P.

    2016-01-01

    OBJECTIVES: To assess whether implementation of a stepwise multicomponent intervention (STA OP!) is effective in reducing challenging behavior and depression in nursing home residents with advanced dementia. DESIGN: Cluster randomized controlled trial. SETTING: Twenty-one clusters (single

  2. Surgical gastrojejunostomy or endoscopic stent placement for the palliation of malignant gastric outlet obstruction (SUSTENT study): a multicenter randomized trial

    NARCIS (Netherlands)

    S.M. Jeurnink; E.W. Steyerberg; J.E. van Hooft; C.H.J. van Eijck; M.P. Schwartz; F.P. Vleggaar; E.J. Kuipers; P.D. Siersema

    2010-01-01

    BACKGROUND: Both gastrojejunostomy (GJJ) and stent placement are commonly used palliative treatments of obstructive symptoms caused by malignant gastric outlet obstruction (GOO). OBJECTIVE: Compare GJJ and stent placement. DESIGN: Multicenter, randomized trial. SETTING: Twenty-one centers in The Net

  3. Twenty-first century skills for students: hands-on learning after school builds school and life success.

    Science.gov (United States)

    Cabral, Leide

    2006-01-01

    At the core of the movement for twenty-first century skills are students. The growing efforts to increase programs leveraging out-of-school time are focused on giving American youth everything they need to compete in this increasingly complex world. The author is one of many students who have been well served by initiatives imparting twenty-first century skills during after-school hours. Now a senior at Boston Latin School, the author has been helped along the way by Citizen Schools, an after-school education program focused on hands-on learning apprenticeships and homework help. While enrolled in the program as a middle school student, the author took part in projects that exemplified hands-on, inquiry-based learning that helped her develop twenty-first century skills. For example, along with dozens of other students, she advanced her data analysis skills by analyzing statistics about Boston Public high schools, which also helped her select and enroll in one of the city's premier exam schools. Also, she and her peers worked with corporate attorneys who served as writing coaches and whose expertise the author drew from in producing a published essay and greatly improving her writing skills. The author now finds that the public speaking, leadership, organizational, social, and management abilities she built through her participation in Citizen Schools are a great asset to her in high school. The confidence with which she tackles her responsibilities can also be traced back to her experiences in the program. As she looks toward college, the author reflects and realizes that being actively involved in a quality after-school program put her on track for a successful future.

  4. Random broadcast on random geometric graphs

    Energy Technology Data Exchange (ETDEWEB)

    Bradonjic, Milan [Los Alamos National Laboratory; Elsasser, Robert [UNIV OF PADERBORN; Friedrich, Tobias [ICSI/BERKELEY; Sauerwald, Tomas [ICSI/BERKELEY

    2009-01-01

    In this work, we consider the random broadcast time on random geometric graphs (RGGs). The classic random broadcast model, also known as the push algorithm, is defined as follows: starting with one informed node, in each succeeding round every informed node chooses one of its neighbors uniformly at random and informs it. We consider the random broadcast time on RGGs when, with high probability, (i) the RGG is connected, or (ii) a giant component exists in the RGG. We show that the random broadcast time is bounded by O(√n + diam(component)), where diam(component) is the diameter of the entire graph or of the giant component, for regimes (i) and (ii), respectively. In other words, for both regimes, we derive the broadcast time to be Θ(diam(G)), which is asymptotically optimal.
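    The push protocol itself is straightforward to simulate. The sketch below runs it on an arbitrary adjacency list and counts the rounds until every node is informed; the 10-node cycle used as an example is illustrative, and the random geometric graph construction from the abstract is not reproduced here.

```python
import random

def push_broadcast_rounds(adj, source=0):
    """Simulate the push protocol: in each round every informed node picks one
    neighbour uniformly at random and informs it.  Returns the number of rounds
    until all nodes are informed (assumes the graph is connected)."""
    informed = {source}
    rounds = 0
    while len(informed) < len(adj):
        newly = set()
        for v in informed:
            newly.add(random.choice(adj[v]))   # v pushes to one random neighbour
        informed |= newly
        rounds += 1
    return rounds

# Example: broadcast on a 10-node cycle (hypothetical graph)
n = 10
cycle = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
print(push_broadcast_rounds(cycle))
```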

  5. Random survival forests for competing risks

    DEFF Research Database (Denmark)

    Ishwaran, Hemant; Gerds, Thomas A; Kogalur, Udaya B

    2014-01-01

    We introduce a new approach to competing risks using random forests. Our method is fully non-parametric and can be used for selecting event-specific variables and for estimating the cumulative incidence function. We show that the method is highly effective for both prediction and variable selection...

  6. Completely random signed measures

    DEFF Research Database (Denmark)

    Hellmund, Gunnar

    Completely random signed measures are defined, characterized and related to Lévy random measures and Lévy bases.

  7. Twenty-fourth Semiannual Report of the Commission to the Congress, July 1958

    Energy Technology Data Exchange (ETDEWEB)

    Strauss, Lewis L.; McCone, John A.

    1958-07-31

    The document represents the twenty-fourth semiannual Atomic Energy Commission (AEC) report to Congress. The report sums up the major activities and developments in the national atomic energy program covering the period January - June 1958.

  8. Twenty-second Semiannual Report of the Commission to the Congress, July 1957

    Energy Technology Data Exchange (ETDEWEB)

    Strauss, Lewis L.

    1957-07-31

    The document represents the twenty-second semiannual Atomic Energy Commission (AEC) report to Congress. The report sums up the major activities and developments in the national atomic energy program covering the period January - June 1957.

  9. Nutrients, technological properties and genetic relationships among twenty cowpea landraces cultivated in West Africa

    NARCIS (Netherlands)

    Madode, Y.E.E.; Linnemann, A.R.; Nout, M.J.R.; Vosman, B.J.; Hounhouigan, D.J.; Boekel, van T.

    2012-01-01

    The genetic relationships among twenty phenotypically different cowpea landraces were unravelled regarding their suitability for preparing West African dishes. Amplified fragment length polymorphism classified unpigmented landraces (UPs) as highly similar (65%, one cluster), contrary to pigmented la

  10. Proceedings of the twenty-first workshop on geothermal reservoir engineering

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-31

    This document contains the Proceedings of the Twenty-first Workshop on Geothermal Reservoir Engineering, held at Stanford University, Stanford, California, USA, January 22-24, 1996. Sixty-six papers were presented in the technical sessions of the workshop. Technical papers were organized into twenty sessions including: reservoir assessment, modeling, geology/geochemistry, fracture modeling/hot-dry-rock, low enthalpy, fluid injection, well testing, drilling, adsorption, and well stimulation.

  11. Chapter Twenty

    African Journals Online (AJOL)

    User

    The 1970s Nigeria witnessed oil boom which turns out to be a disaster to people ... pictures of destitute, starving children, raped and battered women and cases of ..... that demand high intelligence, proper planning and long term preparation.

  12. Application of the Random Forest Algorithm to Important Feature Selection from EMG Signals

    Institute of Scientific and Technical Information of China (English)

    张洪强; 刘光远; 赖祥伟

    2013-01-01

    Finding the effective features among high-dimensional features is a hard problem in EMG-signal emotion recognition. This paper uses the random forest algorithm and its feature-evaluation criteria to compute the contribution of 126 EMG-signal features to the recognition of different emotions. According to each feature's importance, the features with the largest contributions are preferentially combined and used for emotion classification. Experiments on the data verify the effectiveness of the method.
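    A minimal sketch of the general technique (ranking features by random forest importance), using scikit-learn and synthetic stand-in data rather than the 126 EMG features from the study; the data, model parameters, and the number of features kept are all illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for a feature matrix: 200 samples x 126 candidate features,
# with labels driven by features 0 and 3 so the ranking has something to find.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 126))
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)

# Fit a forest and rank features by impurity-based importance.
forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
ranking = np.argsort(forest.feature_importances_)[::-1]
print("top features:", ranking[:5])   # features 0 and 3 should rank highly
```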

  13. End-Permian Mass Extinction in the Oceans: An Ancient Analog for the Twenty-First Century?

    Science.gov (United States)

    Payne, Jonathan L.; Clapham, Matthew E.

    2012-05-01

    The greatest loss of biodiversity in the history of animal life occurred at the end of the Permian Period (˜252 million years ago). This biotic catastrophe coincided with an interval of widespread ocean anoxia and the eruption of one of Earth's largest continental flood basalt provinces, the Siberian Traps. Volatile release from basaltic magma and sedimentary strata during emplacement of the Siberian Traps can account for most end-Permian paleontological and geochemical observations. Climate change and, perhaps, destruction of the ozone layer can explain extinctions on land, whereas changes in ocean oxygen levels, CO2, pH, and temperature can account for extinction selectivity across marine animals. These emerging insights from geology, geochemistry, and paleobiology suggest that the end-Permian extinction may serve as an important ancient analog for twenty-first century oceans.

  14. Mechanical design of multiple zone plates precision alignment apparatus for hard X-ray focusing in twenty-nanometer scale

    Energy Technology Data Exchange (ETDEWEB)

    Shu, Deming; Liu, Jie; Gleber, Sophie C.; Vila-Comamala, Joan; Lai, Barry; Maser, Jorg M.; Roehrig, Christian; Wojcik, Michael J.; Vogt, Franz Stefan

    2017-04-04

    An enhanced mechanical design of multiple zone plates precision alignment apparatus for hard x-ray focusing in a twenty-nanometer scale is provided. The precision alignment apparatus includes a zone plate alignment base frame; a plurality of zone plates; and a plurality of zone plate holders, each said zone plate holder for mounting and aligning a respective zone plate for hard x-ray focusing. At least one respective positioning stage drives and positions each respective zone plate holder. Each respective positioning stage is mounted on the zone plate alignment base frame. A respective linkage component connects each respective positioning stage and the respective zone plate holder. The zone plate alignment base frame, each zone plate holder and each linkage component is formed of a selected material for providing thermal expansion stability and positioning stability for the precision alignment apparatus.

  15. Mechanical design of multiple zone plates precision alignment apparatus for hard X-ray focusing in twenty-nanometer scale

    Science.gov (United States)

    Shu, Deming; Liu, Jie; Gleber, Sophie C.; Vila-Comamala, Joan; Lai, Barry; Maser, Jorg M.; Roehrig, Christian; Wojcik, Michael J.; Vogt, Franz Stefan

    2017-04-04

    An enhanced mechanical design of multiple zone plates precision alignment apparatus for hard x-ray focusing in a twenty-nanometer scale is provided. The precision alignment apparatus includes a zone plate alignment base frame; a plurality of zone plates; and a plurality of zone plate holders, each said zone plate holder for mounting and aligning a respective zone plate for hard x-ray focusing. At least one respective positioning stage drives and positions each respective zone plate holder. Each respective positioning stage is mounted on the zone plate alignment base frame. A respective linkage component connects each respective positioning stage and the respective zone plate holder. The zone plate alignment base frame, each zone plate holder and each linkage component is formed of a selected material for providing thermal expansion stability and positioning stability for the precision alignment apparatus.

  16. Psychological treatment of late-life depression:a meta-analysis of randomized controlled trials

    OpenAIRE

    2006-01-01

    SUMMARY Background Older meta-analyses of the effects of psychological treatments for depression in older adults have found that these treatments have large effects. However, these earlier meta-analyses also included non-randomized studies, and did not include newer high-quality randomized controlled trials. Methods We conducted a meta-analysis of randomized studies on psychological treatments for depression in older adults. Results Twenty-five studies were included, of which 17 compared a ps...

  17. Associative Hierarchical Random Fields.

    Science.gov (United States)

    Ladický, L'ubor; Russell, Chris; Kohli, Pushmeet; Torr, Philip H S

    2014-06-01

    This paper makes two contributions: the first is the proposal of a new model, the associative hierarchical random field (AHRF), and a novel algorithm for its optimization; the second is the application of this model to the problem of semantic segmentation. Most methods for semantic segmentation are formulated as a labeling problem for variables that might correspond to either pixels or segments such as super-pixels. It is well known that the generation of super-pixel segmentations is not unique. This has motivated many researchers to use multiple super-pixel segmentations for problems such as semantic segmentation or single view reconstruction. These super-pixels have not yet been combined in a principled manner; this is a difficult problem, as they may overlap, or be nested in such a way that the segmentations form a segmentation tree. Our new hierarchical random field model allows information from all of the multiple segmentations to contribute to a global energy. MAP inference in this model can be performed efficiently using powerful graph cut based move making algorithms. Our framework generalizes much of the previous work based on pixels or segments, and the resulting labelings can be viewed both as a detailed segmentation at the pixel level, or, at the other extreme, as a segment selector that pieces together a solution like a jigsaw, selecting the best segments from different segmentations as pieces. We evaluate its performance on some of the most challenging data sets for object class segmentation, and show that this ability to perform inference using multiple overlapping segmentations leads to state-of-the-art results.

  18. Chapter Twenty Chapter Twenty Eight

    African Journals Online (AJOL)

    User

    It is also true that each culture affects their writers' language and style differently in works ... Thus it calls for a cautious and positive use of words .... Memory is also fragrance from withered flowers. Memory is also the music from broken guitars.

  19. Matricially free random variables

    CERN Document Server

    Lenczewski, Romuald

    2008-01-01

    We show that the operatorial framework developed by Voiculescu for free random variables can be extended to arrays of random variables whose multiplication imitates matricial multiplication. The associated notion of independence, called matricial freeness, can be viewed as a generalization of both freeness and monotone independence. At the same time, the sums of matricially free random variables, called random pseudomatrices, are closely related to Gaussian random matrices. The main results presented in this paper concern the standard and tracial central limit theorems for random pseudomatrices and the corresponding limit distributions, which can be viewed as matricial generalizations of semicircle laws.

  20. Fluctuating Selection in the Moran

    Science.gov (United States)

    Dean, Antony M.; Lehman, Clarence; Yi, Xiao

    2017-01-01

    Contrary to classical population genetics theory, experiments demonstrate that fluctuating selection can protect a haploid polymorphism in the absence of frequency dependent effects on fitness. Using forward simulations with the Moran model, we confirm our analytical results showing that a fluctuating selection regime, with a mean selection coefficient of zero, promotes polymorphism. We find that increases in heterozygosity over neutral expectations are especially pronounced when fluctuations are rapid, mutation is weak, the population size is large, and the variance in selection is big. Lowering the frequency of fluctuations makes selection more directional, and so heterozygosity declines. We also show that fluctuating selection raises dn/ds ratios for polymorphism, not only by sweeping selected alleles into the population, but also by purging the neutral variants of selected alleles as they undergo repeated bottlenecks. Our analysis shows that randomly fluctuating selection increases the rate of evolution by increasing the probability of fixation. The impact is especially noticeable when the selection is strong and mutation is weak. Simulations show the increase in the rate of evolution declines as the rate of new mutations entering the population increases, an effect attributable to clonal interference. Intriguingly, fluctuating selection increases the dn/ds ratios for divergence more than for polymorphism, a pattern commonly seen in comparative genomics. Our model, which extends the classical neutral model of molecular evolution by incorporating random fluctuations in selection, accommodates a wide variety of observations, both neutral and selected, with economy. PMID:28108586
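    A minimal forward simulation in the spirit of the one described above: a haploid two-allele Moran model in which the selection coefficient flips sign periodically, so its mean is zero. The square-wave form of the fluctuations and all parameter values are illustrative assumptions, not the settings used in the paper.

```python
import random

def moran_fluctuating_selection(N=1000, s_amplitude=0.1, flip_every=10, generations=200):
    """Haploid two-allele Moran model with a selection coefficient that flips
    sign every `flip_every` generations (mean s = 0).  Returns the final
    frequency of allele A.  Illustrative sketch; parameters are hypothetical."""
    i = N // 2                                   # copies of allele A
    s = s_amplitude
    for step in range(generations * N):          # N birth-death events ~ one generation
        if step % (flip_every * N) == 0:
            s = -s                               # environment (and selection) flips
        p = i / N
        w = p * (1 + s) / (p * (1 + s) + (1 - p))        # prob. the reproducing parent is A
        i += (random.random() < w) - (random.random() < p)  # one birth, one uniform death
        i = max(0, min(N, i))
        if i in (0, N):                          # fixation or loss
            break
    return i / N

print(moran_fluctuating_selection())
```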

  1. Application of Random Forest to Selecting an Evaluation Index System for Enterprise Credit Assessment

    Institute of Scientific and Technical Information of China (English)

    林成德; 彭国兰

    2007-01-01

    Determining the evaluation index system is a key step in enterprise credit assessment, and the quality of the selected indices directly affects the predictive accuracy of the model. This paper introduces random forest (RF), an ensemble learning method, to select the indices, making the resulting index system more objective and better suited to machine learning. Experiments show that the index system determined by this method reflects the credit status of enterprises more effectively, and that the random forest assessment model built on this index system achieves higher predictive accuracy.

  3. A machine learning methodology for the selection and classification of spontaneous spinal cord dorsum potentials allows disclosure of structured (non-random) changes in neuronal connectivity induced by nociceptive stimulation

    Science.gov (United States)

    Martin, Mario; Contreras-Hernández, Enrique; Béjar, Javier; Esposito, Gennaro; Chávez, Diógenes; Glusman, Silvio; Cortés, Ulises; Rudomin, Pablo

    2015-01-01

    Previous studies aimed to disclose the functional organization of the neuronal networks involved in the generation of the spontaneous cord dorsum potentials (CDPs) generated in the lumbosacral spinal segments used predetermined templates to select specific classes of spontaneous CDPs. Since this procedure was time consuming and required continuous supervision, it was limited to the analysis of two specific types of CDPs (negative CDPs and negative positive CDPs), thus excluding potentials that may reflect activation of other neuronal networks of presumed functional relevance. We now present a novel procedure based in machine learning that allows the efficient and unbiased selection of a variety of spontaneous CDPs with different shapes and amplitudes. The reliability and performance of the present method is evaluated by analyzing the effects on the probabilities of generation of different classes of spontaneous CDPs induced by the intradermic injection of small amounts of capsaicin in the anesthetized cat, a procedure known to induce a state of central sensitization leading to allodynia and hyperalgesia. The results obtained with the selection method presently described allowed detection of spontaneous CDPs with specific shapes and amplitudes that are assumed to represent the activation of functionally coupled sets of dorsal horn neurones that acquire different, structured configurations in response to nociceptive stimuli. These changes are considered as responses tending to adequate transmission of sensory information to specific functional requirements as part of homeostatic adjustments. PMID:26379540

  4. Projection of drought hazards in China during twenty-first century

    Science.gov (United States)

    Liang, Yulian; Wang, Yongli; Yan, Xiaodong; Liu, Wenbin; Jin, Shaofei; Han, Mingchen

    2017-06-01

    Drought is occurring with increased frequency under climate warming. To understand the behavior of drought and its variation in the future, current and future drought over China in the twenty-first century is discussed. Drought frequency and the trend in drought intensity are assessed using the Palmer Drought Severity Index (PDSI), calculated from historical meteorological observations and from outputs of the fifth phase of the Coupled Model Intercomparison Project (CMIP5) under three representative concentration pathway (RCP) scenarios. Simulated drought periods, defined by PDSI class, capture more than 90% of historical drought events. Projection results indicate that drought frequency will increase over China in the twenty-first century under the RCP4.5 and RCP8.5 scenarios. In the mid-twenty-first century (2021-2050), similar patterns of drought frequency are found under the three emission scenarios, and annual drought duration would last 3.5-4 months. At the end of the twenty-first century (2071-2100), annual drought duration could exceed 5 months in northwestern China as well as coastal areas of eastern and southern China under the RCP8.5 scenario. Drought is slightly reduced over the entire twenty-first century under the RCP2.6 scenario, whereas drought hazards will be more serious in most regions of China under the RCP8.5 scenario.

  5. RANDOMIZED CONTROLLED TRIALS IN ORTHOPEDICS AND TRAUMATOLOGY: SYSTEMATIC ANALYSIS ON THE NATIONAL EVIDENCE.

    Science.gov (United States)

    de Moraes, Vinícius Ynoe; Moreira, Cesar Domingues; Tamaoki, Marcel Jun Sugawara; Faloppa, Flávio; Belloti, Joao Carlos

    2010-01-01

    To assess whether there has been any improvement in the quality and quantity of randomized controlled trials (RCTs) in nationally published journals through the application of standardized and validated scores. We electronically selected all RCTs published in the two indexed Brazilian journals that focus on orthopedics, over the period 2000-2009: Acta Ortopédica Brasileira (AOB) and Revista Brasileira de Ortopedia (RBO). These RCTs were identified and scored by two independent researchers in accordance with the Jadad scale and the Cochrane Bone, Joint and Muscle Trauma Group score. The studies selected were grouped as follows: 1) publication period (2000-2004 or 2004-2009); 2) journal of publication (AOB or RBO). Twenty-two papers were selected: 10 from AOB and 12 from RBO. No statistically significant differences were found between the proportions (nRCT/nTotal of published papers) of RCTs published in the two journals (p = 0.458), or in the Jadad score (p = 0.722) and Cochrane score (p = 0.630). The relative quality and quantity of RCTs in the journals analyzed were similar. There was a trend towards improvement of quality, but there was no increase in the number of RCTs between the two periods analyzed.

  6. RANDOMIZED CONTROLLED TRIALS IN ORTHOPEDICS AND TRAUMATOLOGY: SYSTEMATIC ANALYSIS ON THE NATIONAL EVIDENCE

    Science.gov (United States)

    de Moraes, Vinícius Ynoe; Moreira, Cesar Domingues; Tamaoki, Marcel Jun Sugawara; Faloppa, Flávio; Belloti, Joao Carlos

    2015-01-01

    Objective: To assess whether there has been any improvement in the quality and quantity of randomized controlled trials (RCTs) in nationally published journals through the application of standardized and validated scores. Methods: We electronically selected all RCTs published in the two indexed Brazilian journals that focus on orthopedics, over the period 2000-2009: Acta Ortopédica Brasileira (AOB) and Revista Brasileira de Ortopedia (RBO). These RCTs were identified and scored by two independent researchers in accordance with the Jadad scale and the Cochrane Bone, Joint and Muscle Trauma Group score. The studies selected were grouped as follows: 1) publication period (2000-2004 or 2004-2009); 2) journal of publication (AOB or RBO). Results: Twenty-two papers were selected: 10 from AOB and 12 from RBO. No statistically significant differences were found between the proportions (nRCT/nTotal of published papers) of RCTs published in the two journals (p = 0.458), or in the Jadad score (p = 0.722) and Cochrane score (p = 0.630). Conclusion: The relative quality and quantity of RCTs in the journals analyzed were similar. There was a trend towards improvement of quality, but there was no increase in the number of RCTs between the two periods analyzed. PMID:27026971

  7. On Gaussian random supergravity

    Energy Technology Data Exchange (ETDEWEB)

    Bachlechner, Thomas C. [Department of Physics, Cornell University,Physical Sciences Building 428, Ithaca, NY 14853 (United States)

    2014-04-08

    We study the distribution of metastable vacua and the likelihood of slow roll inflation in high dimensional random landscapes. We consider two examples of landscapes: a Gaussian random potential and an effective supergravity potential defined via a Gaussian random superpotential and a trivial Kähler potential. To examine these landscapes we introduce a random matrix model that describes the correlations between various derivatives and we propose an efficient algorithm that allows for a numerical study of high dimensional random fields. Using these novel tools, we find that the vast majority of metastable critical points in N dimensional random supergravities are either approximately supersymmetric with |F|≪M{sub susy} or supersymmetric. Such approximately supersymmetric points are dynamical attractors in the landscape and the probability that a randomly chosen critical point is metastable scales as log (P)∝−N. We argue that random supergravities lead to potentially interesting inflationary dynamics.

  8. On Gaussian random supergravity

    Science.gov (United States)

    Bachlechner, Thomas C.

    2014-04-01

    We study the distribution of metastable vacua and the likelihood of slow roll inflation in high dimensional random landscapes. We consider two examples of landscapes: a Gaussian random potential and an effective supergravity potential defined via a Gaussian random superpotential and a trivial Kähler potential. To examine these landscapes we introduce a random matrix model that describes the correlations between various derivatives and we propose an efficient algorithm that allows for a numerical study of high dimensional random fields. Using these novel tools, we find that the vast majority of metastable critical points in N dimensional random supergravities are either approximately supersymmetric with | F| ≪ M susy or supersymmetric. Such approximately supersymmetric points are dynamical attractors in the landscape and the probability that a randomly chosen critical point is metastable scales as log( P ) ∝ - N. We argue that random supergravities lead to potentially interesting inflationary dynamics.

  9. On Gaussian Random Supergravity

    CERN Document Server

    Bachlechner, Thomas C

    2014-01-01

    We study the distribution of metastable vacua and the likelihood of slow roll inflation in high dimensional random landscapes. We consider two examples of landscapes: a Gaussian random potential and an effective supergravity potential defined via a Gaussian random superpotential and a trivial Kahler potential. To examine these landscapes we introduce a random matrix model that describes the correlations between various derivatives and we propose an efficient algorithm that allows for a numerical study of high dimensional random fields. Using these novel tools, we find that the vast majority of metastable critical points in N dimensional random supergravities are either approximately supersymmetric with |F|<< M_{susy} or supersymmetric. Such approximately supersymmetric points are dynamical attractors in the landscape and the probability that a randomly chosen critical point is metastable scales as log(P)\\propto -N. We argue that random supergravities lead to potentially interesting inflationary dynamics...

  10. On Gaussian random supergravity

    OpenAIRE

    Bachlechner, Thomas C.

    2014-01-01

    We study the distribution of metastable vacua and the likelihood of slow roll inflation in high dimensional random landscapes. We consider two examples of landscapes: a Gaussian random potential and an effective supergravity potential defined via a Gaussian random superpotential and a trivial K\\"ahler potential. To examine these landscapes we introduce a random matrix model that describes the correlations between various derivatives and we propose an efficient algorithm that allows for a nume...

  11. Quantum Random Number Generators

    OpenAIRE

    Herrero-Collantes, Miguel; Garcia-Escartin, Juan Carlos

    2016-01-01

    Random numbers are a fundamental resource in science and engineering with important applications in simulation and cryptography. The inherent randomness at the core of quantum mechanics makes quantum systems a perfect source of entropy. Quantum random number generation is one of the most mature quantum technologies with many alternative generation methods. We discuss the different technologies in quantum random number generation from the early devices based on radioactive decay to the multipl...

  12. Use of Silver Nitrate for the Assessment of Sperm Measurements in Selected Farm and Free-Living Animal Species

    Directory of Open Access Journals (Sweden)

    Andraszek Katarzyna

    2014-10-01

    Full Text Available The study was conducted on spermatozoa of selected farm and free-living animal species, isolated post mortem from the tail of the epididymis, and stained with silver nitrate - AgNO3. The material was collected from pigs, goats, wild boar, and European roe deer. Twenty morphologically normal spermatozoa, randomly selected from each animal and clearly visible under the microscope, were analysed. The following measurements were considered: head length, width, perimeter and area, acrosome area, mid-piece length, tail length, and overall sperm length. AgNO3 staining differentiated the acrosomal (light hue) and distal (dark hue) parts of the sperm head, and a light-hued mid-piece was visible within the sperm tail. Silver nitrate staining revealed species- and variety-related differences, particularly in reference to the sperm head. Clear-cut differentiation within the head and tail area made it possible to perform detailed morphometric measurements of the spermatozoa.

  13. "The First Twenty Years," by Bernard J. Siegel. Annual Review of Anthropology, 22 (1993, pp. 1-34, Annual Reviews, Inc, Palo Alto

    Directory of Open Access Journals (Sweden)

    James A. Delle

    1995-05-01

    Full Text Available After twenty years as editor of the Annual Review of Anthropology (ARA), Professor Siegel took on a daunting task with this article. In his words, he set out to "ponder the developments in the several subfields of anthropology over this period of time, as reflected in the topics selected for review in this enterprise" (p. 8). To this end Siegel, a cultural anthropologist, mined the collective knowledge contained within twenty years of the ARA. In his presentation, he considers the intellectual developments within each of the five subdisciplines separately (he includes applied anthropology), concluding with some brief remarks on the importance of maintaining a four- or five-field approach to anthropology. For our purposes here, I will limit my comments to his section on archaeology.

  14. RANDOM FORESTS: AN IMPORTANT FEATURE GENE SELECTION METHOD FOR TUMORS

    Institute of Scientific and Technical Information of China (English)

    李建更; 高志坤

    2009-01-01

    Feature selection techniques have been widely applied in bioinformatics, and random forests (RF) is one of the important feature selection methods among them. RF was used to select feature genes from five gene expression profile datasets, including gastric, colon and lung cancer; the selected genes were then combined with a support vector machine (SVM) to classify the original datasets, and the feature gene selection and classification results were analysed preliminarily. Significance analysis of microarrays (SAM) and the ReliefF method were also compared with RF. The results show that the feature genes selected by random forests carry more classification information and yield higher classification accuracy. Given the method's additional advantages for classification, random forests can be widely used as a reliable means of analysing gene expression profile data.
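
    A minimal sketch of the workflow this abstract describes (random-forest importance ranking followed by SVM classification), not the authors' actual pipeline: the synthetic dataset, the number of retained genes (top_k) and all model settings below are placeholder assumptions.

    ```python
    # Illustrative sketch only: rank genes by random-forest importance,
    # keep the top k, then classify with an SVM on the selected genes.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    # Placeholder "expression matrix": 100 samples x 2000 genes, 2 classes.
    X, y = make_classification(n_samples=100, n_features=2000,
                               n_informative=30, random_state=0)

    # 1) Random forest scores every gene by its importance for the class label.
    rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
    top_k = 50                                   # assumed number of genes kept
    selected = np.argsort(rf.feature_importances_)[::-1][:top_k]

    # 2) An SVM is trained and evaluated on the selected genes only.
    svm_score = cross_val_score(SVC(kernel="linear"), X[:, selected], y, cv=5)
    print("5-fold accuracy on RF-selected genes:", svm_score.mean())
    ```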

  15. A Theoretical Model for Selective Exposure Research.

    Science.gov (United States)

    Roloff, Michael E.; Noland, Mark

    This study tests the basic assumptions underlying Fishbein's Model of Attitudes by correlating an individual's selective exposure to types of television programs (situation comedies, family drama, and action/adventure) with the attitudinal similarity between individual attitudes and attitudes characterized on the programs. Twenty-three college…

  16. [Natural selection].

    Science.gov (United States)

    Mayr, E

    1985-05-01

    Much of the resistance against Darwin's theory of natural selection has been due to misunderstandings. It is shown that natural selection is not a tautology and that it is a two-step process. The first step, the production of variation, is under the control of chance; the second step, selection proper, is an anti-chance process, but subject to many constraints. The target of selection is the individual as a whole, and many neutral mutations can be retained as hitchhikers of successful genotypes. Sexual selection results from selection for pure reproductive success.

  17. Twenty Years of Systems Engineering on the Cassini-Huygens Mission

    Science.gov (United States)

    Manor-Chapman, Emily

    2013-01-01

    Over the past twenty years, the Cassini-Huygens Mission has successfully utilized systems engineering to develop and execute a challenging prime mission and two mission extensions. Systems engineering was not only essential in designing the mission, but as knowledge of the system was gained during cruise and science operations, it was critical in evolving operational strategies and processes. This paper discusses systems engineering successes, challenges, and lessons learned on the Cassini-Huygens Mission gathered from a thorough study of mission plans and developed scenarios, and interviews with key project leaders across its twenty-year history.

  18. Border Crossing in Contemporary Brazilian Culture: Global Perspectives from the Twenty-First Century Literary Scene

    Directory of Open Access Journals (Sweden)

    Cimara Valim de Melo

    2016-06-01

    Full Text Available Abstract: This paper investigates the process of internationalisation of Brazilian literature in the twenty-first century from the perspective of the publishing market. For this, we analyse how Brazil has responded to globalisation and what effects of cultural globalisation can be seen in the Brazilian literary scene, focusing on the novel. Observing the movement of the novelists throughout the globe, the reception of Brazilian literature in the United Kingdom and the relations between art and the literary market in Brazil, we intend to provoke some reflections on Brazilian cultural history in the light of the twenty-first century.

  19. Border Crossing in Contemporary Brazilian Culture: Global Perspectives from the Twenty-First Century Literary Scene

    Directory of Open Access Journals (Sweden)

    Cimara Valim de Melo

    2016-06-01

    Full Text Available Abstract: This paper investigates the process of internationalisation of Brazilian literature in the twenty-first century from the perspective of the publishing market. For this, we analyse how Brazil has responded to globalisation and what effects of cultural globalisation can be seen in the Brazilian literary scene, focusing on the novel. Observing the movement of the novelists throughout the globe, the reception of Brazilian literature in the United Kingdom and the relations between art and the literary market in Brazil, we intend to provoke some reflections on Brazilian cultural history in the light of the twenty-first century.

  20. Quantum random number generator

    CERN Document Server

    Stipcevic, M

    2006-01-01

    We report upon a novel principle for realization of a fast nondeterministic random number generator whose randomness relies on intrinsic randomness of the quantum physical processes of photonic emission in semiconductors and subsequent detection by the photoelectric effect. Timing information of detected photons is used to generate binary random digits-bits. The bit extraction method based on restartable clock theoretically eliminates both bias and autocorrelation while reaching efficiency of almost 0.5 bits per random event. A prototype has been built and statistically tested.
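
    The restartable-clock extraction itself is not detailed in this record; as a loosely related illustration of how photon-timing information can be turned into unbiased bits at roughly 0.5 bits per detection event, here is a generic comparison-of-intervals sketch on a simulated detector, not the paper's actual method.

    ```python
    # Compare consecutive inter-arrival intervals and emit one bit per pair,
    # discarding ties. For i.i.d. waiting times this removes bias.
    import random

    def interarrival_times(n, rate=1.0):
        """Simulated detector: i.i.d. exponential waiting times between photons."""
        return [random.expovariate(rate) for _ in range(n)]

    def bits_from_timing(times):
        bits = []
        for t1, t2 in zip(times[::2], times[1::2]):   # non-overlapping pairs
            if t1 < t2:
                bits.append(0)
            elif t1 > t2:
                bits.append(1)
            # equal intervals would be discarded (probability zero here)
        return bits

    bits = bits_from_timing(interarrival_times(100000))
    print(len(bits), "bits, mean =", sum(bits) / len(bits))  # mean close to 0.5
    ```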

  1. Misuse of randomization

    DEFF Research Database (Denmark)

    Liu, Jianping; Kjaergard, Lise Lotte; Gluud, Christian

    2002-01-01

    The quality of randomization of Chinese randomized trials on herbal medicines for hepatitis B was assessed. Search strategy and inclusion criteria were based on the published protocol. One hundred and seventy-six randomized clinical trials (RCTs) involving 20,452 patients with chronic hepatitis B....../150) of the studies were imbalanced at the 0.05 level of probability for the two treatments and 13.3% (20/150) imbalanced at the 0.01 level in the randomization. It is suggested that there may exist misunderstanding of the concept and the misuse of randomization based on the review....

  2. Quantum random number generation

    Science.gov (United States)

    Ma, Xiongfeng; Yuan, Xiao; Cao, Zhu; Qi, Bing; Zhang, Zhen

    2016-06-01

    Quantum physics can be exploited to generate true random numbers, which have important roles in many applications, especially in cryptography. Genuine randomness from the measurement of a quantum system reveals the inherent nature of quantumness—coherence, an important feature that differentiates quantum mechanics from classical physics. The generation of genuine randomness is generally considered impossible with only classical means. On the basis of the degree of trustworthiness on devices, quantum random number generators (QRNGs) can be grouped into three categories. The first category, practical QRNG, is built on fully trusted and calibrated devices and typically can generate randomness at a high speed by properly modelling the devices. The second category is self-testing QRNG, in which verifiable randomness can be generated without trusting the actual implementation. The third category, semi-self-testing QRNG, is an intermediate category that provides a tradeoff between the trustworthiness on the device and the random number generation speed.

  3. A cluster-randomized controlled trial evaluating the effects of delaying onset of adolescent substance abuse on cognitive development and addiction following a selective, personality-targeted intervention programme: the Co-Venture trial.

    Science.gov (United States)

    O'Leary-Barrett, Maeve; Mâsse, Benoit; Pihl, Robert O; Stewart, Sherry H; Séguin, Jean R; Conrod, Patricia J

    2017-10-01

    Substance use and binge drinking during early adolescence are associated with neurocognitive abnormalities, mental health problems and an increased risk for future addiction. The trial aims to evaluate the protective effects of an evidence-based substance use prevention programme on the onset of alcohol and drug use in adolescence, as well as on cognitive, mental health and addiction outcomes over 5 years. Thirty-eight high schools will be recruited, with a final sample of 31 schools assigned to intervention or control conditions (3826 youth). Brief personality-targeted interventions will be delivered to high-risk youth attending intervention schools during the first year of the trial. Control school participants will receive no intervention above what is offered to them in the regular curriculum by their respective schools. Public/private French and English high schools in Montreal (Canada). All grade 7 students (12-13 years old) will be invited to participate. High-risk youth will be identified as those scoring one standard deviation or more above the school mean on one of the four personality subscales of the Substance Use Risk Profile Scale (40-45% youth). Self-reported substance use and mental health symptoms and cognitive functioning measured annually throughout 5 years. Primary outcomes are the onset of substance use disorders at 4 years post-intervention (year 5). Secondary intermediate outcomes are the onset of alcohol and substance use 2 years post-intervention and neuropsychological functions; namely, the protective effects of substance use prevention on cognitive functions generally, and executive functions and reward sensitivity specifically. This longitudinal, cluster-randomized controlled trial will investigate the impact of a brief personality-targeted intervention program on reducing the onset of addiction 4 years-post intervention. Results will tease apart the developmental sequences of uptake and growth in substance use and cognitive

  4. On the Improvement of a Random Preference Portfolio Selection Model Based on Fuzzy Theory

    Institute of Scientific and Technical Information of China (English)

    罗丹

    2013-01-01

    There are many uncertain factors in portfolio investment, two important ones being the return and the risk of security investment. Uncertainty is a broad notion; for investment it is classified here into two elements: the external environment and the randomness of investors' decisions. This paper studies the latter and obtains optimal portfolios for investors with different risk preferences. Return and risk are first normalized by means of fuzzy theory and a portfolio model, a parameter reflecting the investor's risk attitude is introduced, and the optimal combination under each parameter value is found by choosing different preference parameters. Numerical experiments on real data show that portfolios built according to individual preferences, with return as the primary and risk as the secondary criterion, give better results, and that the normalized model clearly outperforms the untreated model at the optimum, indicating that the model has practical value.
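
    A minimal numerical sketch of the idea summarized above, under assumed data: return and risk are min-max normalized and traded off through a risk-preference parameter, and different parameter values select different "optimal" portfolios. The asset statistics, weight grid and parameter values are illustrative assumptions only, and the fuzzy treatment itself is not reproduced.

    ```python
    import numpy as np

    mu = np.array([0.08, 0.12, 0.15])                 # assumed expected returns
    cov = np.array([[0.04, 0.01, 0.00],               # assumed covariance matrix
                    [0.01, 0.09, 0.02],
                    [0.00, 0.02, 0.16]])

    # Candidate long-only weight vectors on a coarse grid summing to 1.
    grid = [np.array([i, j, 10 - i - j]) / 10.0
            for i in range(11) for j in range(11 - i)]
    ret = np.array([w @ mu for w in grid])
    risk = np.array([np.sqrt(w @ cov @ w) for w in grid])

    # Min-max normalization of return and risk (standardization step).
    nret = (ret - ret.min()) / (ret.max() - ret.min())
    nrisk = (risk - risk.min()) / (risk.max() - risk.min())

    for lam in (0.2, 0.5, 0.8):     # assumed risk-preference parameter values
        best = int(np.argmax(nret - lam * nrisk))
        print(f"lam={lam}: weights={grid[best]}, "
              f"return={ret[best]:.3f}, risk={risk[best]:.3f}")
    ```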

  5. Assessment of non-BDNF neurotrophins and GDNF levels after depression treatment with sertraline and transcranial direct current stimulation in a factorial, randomized, sham-controlled trial (SELECT-TDCS): an exploratory analysis.

    Science.gov (United States)

    Brunoni, André R; Machado-Vieira, Rodrigo; Zarate, Carlos A; Vieira, Erica L M; Valiengo, Leandro; Benseñor, Isabela M; Lotufo, Paulo A; Gattaz, Wagner F; Teixeira, Antonio L

    2015-01-02

    The neurotrophic hypothesis of depression states that the major depressive episode is associated with lower neurotrophic factors levels, which increase with amelioration of depressive symptoms. However, this hypothesis has not been extended to investigate neurotrophic factors other than the brain-derived neurotrophic factor (BDNF). We therefore explored whether plasma levels of neurotrophins 3 (NT-3) and 4 (NT-4), nerve growth factor (NGF) and glial cell line derived neurotrophic factor (GDNF) changed after antidepressant treatment and correlated with treatment response. Seventy-three patients with moderate-to-severe, antidepressant-free unipolar depression were assigned to a pharmacological (sertraline) and a non-pharmacological (transcranial direct current stimulation, tDCS) intervention in a randomized, 2 × 2, placebo-controlled design. The plasma levels of NT-3, NT-4, NGF and GDNF were determined by enzyme-linked immunosorbent assay before and after a 6-week treatment course and analyzed according to clinical response and allocation group. We found that tDCS and sertraline (separately and combined) produced significant improvement in depressive symptoms. Plasma levels of all neurotrophic factors were similar across groups at baseline and remained significantly unchanged regardless of the intervention and of clinical response. Also, baseline plasma levels were not associated with clinical response. To conclude, in this 6-week placebo-controlled trial, NT-3, NT-4, NGF and GDNF plasma levels did not significantly change with sertraline or tDCS. These data suggest that these neurotrophic factors are not surrogate biomarkers of treatment response or involved in the antidepressant mechanisms of tDCS.

  6. Change detection based on a randomly selective and co-competitive update strategy

    Institute of Scientific and Technical Information of China (English)

    朱益稼; 于凤芹; 陈莹

    2016-01-01

    针对复杂场景中背景更新这一难题,提出一种背景更新适应性较强的变化检测算法。利用建模样本的无记忆性进行随机更新得到初步更新后的背景模型,采用模型像素间竞争的方式选取邻域更新位置,通过当前帧图像像素间合作的方式得到加权更新值。仿真实验结果表明,所提更新策略能够在有效处理复杂场景的同时保证检测结果的准确率。五类复杂背景图像序列的仿真结果也验证了该算法具有综合性能的优势。%This paper presents a model update method for change detection which can adapt strongly in complex back-ground. Random update is used for preliminary model update, since samples are memoryless. It competitively looks for the substituted pixels in the neighborhood of adapted background model. The assignment of weighted values behaves as a cooperative way between current frame pixels. Efficiency figures show that the proposed algorithm is feasible and accu-rate for complex background. The results of five kinds of image sequences with complex background prove excellent methods in terms of comprehensive detection rate.

  7. Movies to the Rescue: Keeping the Cold War Relevant for Twenty-First-Century Students

    Science.gov (United States)

    Gokcek, Gigi; Howard, Alison

    2013-01-01

    What are the challenges of teaching Cold War politics to the twenty-first-century student? How might the millennial generation be educated about the political science theories and concepts associated with this period in history? A college student today, who grew up in the post-Cold War era with the Internet, Facebook, Twitter, smart phones,…

  8. Violating Pedagogy: Literary Theory in the Twenty-First Century College Classroom

    Science.gov (United States)

    Johnson, Heather G. S.

    2015-01-01

    "Violating Pedagogy: Literary Theory in the Twenty-first Century College Classroom" discusses the challenge of teaching literary theory to undergraduate and graduate students in a cultural atmosphere that may at times feel simultaneously anti-intellectual and overpopulated with competing scholarly concerns. Approaching theory as a…

  9. Culture, Power, and the University in the Twenty-First Century

    Science.gov (United States)

    Murphy, Peter

    2012-01-01

    Powerful nations have influential systems of higher education. The article explores the possible pattern of geopolitics in the twenty-first century, and the competing prospects of America and its rivals in higher education and research. Pressures on both the American and non-American worlds are evaluated, along with relative economic strengths,…

  10. Twenty-Two Hispanic Leaders Discuss Poverty: Results from the Hispanic Leaders Study.

    Science.gov (United States)

    Quiroz, Julia Teresa

    This study reports twenty-two Hispanic leaders' responses to interviews assessing their perspectives on the nature, prevalence, and causes of poverty among Hispanics. This report contains six parts. Part 1 is an introduction. Part 2 presents the methodology used in the study. Part 3 gives the leaders' demographic and educational backgrounds. Part…

  11. A Comment on Class Productions in Elite Secondary Schools in Twenty-First-Century Global Context

    Science.gov (United States)

    Weis, Lois

    2014-01-01

    In this closing essay, Lois Weis offers a broad overview of the contributions of this Special Issue on class production in elite secondary schools in the twenty-first-century global context. Drawing upon her own research within US privileged secondary schools, Weis explores the contemporary social, economic and political landscape as connected to…

  12. How Do Students Value the Importance of Twenty-First Century Skills?

    Science.gov (United States)

    Ahonen, Arto Kalevi; Kinnunen, Päivi

    2015-01-01

    Frameworks of twenty-first century skills have attained a central role in school development and curriculum changes all over the world. There is a common understanding of the need for meta-skills such as problem solving, reasoning, collaboration, and self-regulation. This article presents results from a Finnish study, in which 718 school pupils…

  13. Non-International Armed Conflict in the Twenty-first Century

    Science.gov (United States)

    2012-01-01

    political and ecological aspirations, rather than by a sound analysis of State practice. State practice during non-international armed conflicts does...Chipka were visited in the English Channel, the German Las Palmas twenty-two nautical miles south of Cape Vicent and the German Archsum fifty-four

  14. 22 CFR 181.5 - Twenty-day rule for concluded agreements.

    Science.gov (United States)

    2010-04-01

    22 CFR 181.5 - Twenty-day rule for concluded agreements. Section 181.5, Foreign Relations, DEPARTMENT OF STATE, INTERNATIONAL AGREEMENTS COORDINATION, REPORTING AND..., including the Department of State, that concludes an international agreement within the meaning of the Act...

  15. 78 FR 31627 - Twenty-Second Meeting: RTCA Special Committee 224, Airport Security Access Control Systems

    Science.gov (United States)

    2013-05-24

    Twenty-Second Meeting: RTCA Special Committee 224, Airport Security Access Control Systems. AGENCY: Federal Aviation Administration (FAA). SUMMARY: The FAA is issuing this notice to advise the public of the twenty-second meeting of RTCA Special Committee 224, Airport Security Access Control Systems...

  16. Twenty Years of Research on RNS for DSP: Lessons Learned and Future Perspectives

    DEFF Research Database (Denmark)

    Albicocco, Pietro; Cardarilli, Gian Carlo; Nannarelli, Alberto;

    2014-01-01

    In this paper, we discuss a number of issues that emerged from our twenty-year-long experience in applying the Residue Number System (RNS) to DSP systems. In the early days, RNS was mainly used to reach maximum performance in speed. Today, RNS is also used to obtain power-efficient (tradeoffs speed...

  17. Twenty-One Ways to Use Music in Teaching the Language Arts.

    Science.gov (United States)

    Cardarelli, Aldo F.

    Twenty-one activities that integrate music and the language arts in order to capitalize on children's interests are described in this paper. Topics of the activities are as follows: alphabetical order, pantomime, vocabulary building from words of a favorite song, words that are "the most (whatever)" from songs, mood words, a configuration clue…

  18. Twenty-Two Hispanic Leaders Discuss Poverty: Results from the Hispanic Leaders Study.

    Science.gov (United States)

    Quiroz, Julia Teresa

    This study reports twenty-two Hispanic leaders' responses to interviews assessing their perspectives on the nature, prevalence, and causes of poverty among Hispanics. This report contains six parts. Part 1 is an introduction. Part 2 presents the methodology used in the study. Part 3 gives the leaders' demographic and educational backgrounds. Part…

  19. Twenty-year trends in the prevalence of Down syndrome and other trisomies in Europe

    DEFF Research Database (Denmark)

    Loane, Maria; Morris, Joan K; Addor, Marie-Claude

    2013-01-01

    This study examines trends and geographical differences in total and live birth prevalence of trisomies 21, 18 and 13 with regard to increasing maternal age and prenatal diagnosis in Europe. Twenty-one population-based EUROCAT registries covering 6.1 million births between 1990 and 2009 participa...

  20. Yeast culture collections in the twenty-first century: new opportunities and challenges

    NARCIS (Netherlands)

    Boundy-Mills, Kyria L.; Glantschnig, Ewald; Roberts, Ian N.; Yurkov, Andrey; Casaregola, Serge; Daniel, Heide-Marie; Groenewald, Marizeth; Turchetti, Benedetta

    2016-01-01

    The twenty-first century has brought new opportunities and challenges to yeast culture collections, whether they are long-standing or recently established. Basic functions such as archiving, characterizing and distributing yeasts continue, but with expanded responsibilities and emerging opportunitie

  1. The Five Cs of Digital Curation: Supporting Twenty-First-Century Teaching and Learning

    Science.gov (United States)

    Deschaine, Mark E.; Sharma, Sue Ann

    2015-01-01

    Digital curation is a process that allows university professors to adapt and adopt resources from multidisciplinary fields to meet the educational needs of twenty-first-century learners. Looking through the lens of new media literacy studies (Vasquez, Harste, & Albers, 2010) and new literacies studies (Gee, 2010), we propose that university…

  2. Movies to the Rescue: Keeping the Cold War Relevant for Twenty-First-Century Students

    Science.gov (United States)

    Gokcek, Gigi; Howard, Alison

    2013-01-01

    What are the challenges of teaching Cold War politics to the twenty-first-century student? How might the millennial generation be educated about the political science theories and concepts associated with this period in history? A college student today, who grew up in the post-Cold War era with the Internet, Facebook, Twitter, smart phones,…

  3. An investigation of twenty-one cases of low-frequency noise complaints

    DEFF Research Database (Denmark)

    Pedersen, Christian Sejer; Møller, Henrik; Persson-Waye, Kerstin

    2007-01-01

    Twenty-one cases of low-frequency noise complaints were thoroughly investigated with the aim of answering the question whether it is real physical sound or low-frequency tinnitus that causes the annoyance. Noise recordings were made in the homes of the complainants taking the spatial variation...

  4. 38 CFR 8.31 - Total disability for twenty years or more.

    Science.gov (United States)

    2010-07-01

    ... AFFAIRS NATIONAL SERVICE LIFE INSURANCE Appeals § 8.31 Total disability for twenty years or more. Where the Disability Insurance Claims activity has made a finding of total disability for insurance purposes... will not be discontinued thereafter, except upon a showing that such a determination was based on fraud...

  5. Education for Future-Oriented Citizenship: Implications for the Education of Twenty-First Century Competencies

    Science.gov (United States)

    Lee, Wing On

    2012-01-01

    Globalization and the knowledge economy have opened up worldwide agendas for national development. Following this is the emphasis on the social dimension, otherwise known as social capital. Much of social capital includes "soft skills" and "twenty-first century skills", which broadly cover critical, creative and inventive…

  6. Land and Freedom [Twenty Lessons for High School American Studies Classroom Instruction].

    Science.gov (United States)

    Rubenstein, Stan

    Twenty self-contained lessons about land and freedom feature activities that can be used with high school social studies classes. The lessons are: Indian Land Ownership, The Dutch and the New World, Colonial Mercantilism, the Declaration and Natural Rights, Jefferson and Liberty, Louisiana Purchase, the Tariff Issue of 1824, Panic of 1837, John…

  7. Knowledge and Educational Research in the Context of "Twenty-First Century Learning"

    Science.gov (United States)

    Benade, Leon

    2014-01-01

    Educational researchers and academics cannot ignore the ever-present call for education, and schooling in particular, to reflect the needs of the twenty-first century knowledge economy. Since the 1990s, national curricula and education systems have reflected this call in their focus on technology and shifting pedagogy to increasingly…

  8. Theoretical Contexts and Conceptual Frames for the Study of Twenty-First Century Capitalism

    DEFF Research Database (Denmark)

    Hull Kristensen, Peer; Morgan, Glenn

    2012-01-01

    This chapter argues that the comparative institutionalist approach requires rethinking in the light of developments in the twenty-first century. The chapter emphasizes the following features of the new environment: first, the rise of the BRIC and the emerging economies; secondly, the changed...

  9. Way Forward in the Twenty-First Century in Content-Based Instruction: Moving towards Integration

    Science.gov (United States)

    Ruiz de Zarobe, Yolanda; Cenoz, Jasone

    2015-01-01

    The aim of this paper is to reflect on the theoretical and methodological underpinnings that provide the basis for an understanding of Content-Based Instruction/Content and Language Integrated Learning (CBI/CLIL) in the field and its relevance in education in the twenty-first century. It is argued that the agenda of CBI/CLIL needs to move towards…

  10. Humanities: The Unexpected Success Story of the Twenty-First Century

    Science.gov (United States)

    Davis, Virginia

    2012-01-01

    Humanities within universities faced challenges in the latter half of the twentieth century as their value in the modern world was questioned. This paper argues that there is strong potential for the humanities to thrive in the twenty-first century university sector. It outlines some of the managerial implications necessary to ensure that this…

  11. Visual Literacy: Does It Enhance Leadership Abilities Required for the Twenty-First Century?

    Science.gov (United States)

    Bintz, Carol

    2016-01-01

    The twenty-first century hosts a well-established global economy, where leaders are required to have increasingly complex skills that include creativity, innovation, vision, relatability, critical thinking and well-honed communications methods. The experience gained by learning to be visually literate includes the ability to see, observe, analyze,…

  12. Thomas Piketty – The Adam Smith of the Twenty-First Century?

    Directory of Open Access Journals (Sweden)

    Jacob Dahl Rendtorff

    2014-11-01

    Full Text Available Piketty's book, Capital in the Twenty-First Century (2014), has become a bestseller worldwide. Two months after its publication, it had sold more than 200,000 copies, and this success will surely continue for a long time. Piketty has established a new platform on which to discuss political economy.

  13. EXOGENOUS CHALLENGES FOR THE TOURISM INDUSTRY IN THE BEGINNING OF THE TWENTY FIRST CENTURY

    Directory of Open Access Journals (Sweden)

    Akosz Ozan

    2009-05-01

    Full Text Available Tourism is one of the fastest growing industries in the world. Besides its sustained growth the tourism industry has shown in the first years of the twenty first century that it can deal with political, military and natural disasters. The present paper ac

  14. Critical Remarks on Piketty's 'Capital in the Twenty-first Century'

    OpenAIRE

    Homburg, Stefan

    2014-01-01

    This paper discusses the central macroeconomic claims that are made in Thomas Piketty's book 'Capital in the Twenty-first Century'. The paper aims to show that Piketty's contentions are not only logically flawed but also contradicted by his own data.

  15. Twenty-One: cross-language disclosure and retrieval of multimedia documents on sustainable development

    NARCIS (Netherlands)

    Stal, ter W.G.; Beijert, J.-H.; Bruin, de G.; Gent, van J.; Jong, de F.M.G.; Kraaij, W.; Netter, K.; Smart, G.

    1998-01-01

    The Twenty-One project brings together environmental organisations, technology providers and research institutes from several European countries. The main objective of the project is to make documents on environmental issues—in particular, on the subject of sustainable development—available on CD-RO

  16. Teaching Middle School Language Arts: Incorporating Twenty-First Century Literacies

    Science.gov (United States)

    Small Roseboro, Anna J.

    2010-01-01

    "Teaching Middle School Language Arts" is the first book on teaching middle school language arts for multiple intelligences and related twenty-first-century literacies in technologically and ethnically diverse communities. More than 670,000 middle school teachers (grades six through eight) are responsible for educating nearly 13 million students…

  17. Testing Students under Cognitive Capitalism: Knowledge Production of Twenty-First Century Skills

    Science.gov (United States)

    Morgan, Clara

    2016-01-01

    Scholars studying the global governance of education have noted the increasingly important role corporations play in educational policy making. I contribute to this scholarship by examining the Assessment and Teaching of twenty-first century skills (ATC21S™) project, a knowledge production apparatus operating under cognitive capitalism. I analyze…

  18. Education for Future-Oriented Citizenship: Implications for the Education of Twenty-First Century Competencies

    Science.gov (United States)

    Lee, Wing On

    2012-01-01

    Globalization and the knowledge economy have opened up worldwide agendas for national development. Following this is the emphasis on the social dimension, otherwise known as social capital. Much of social capital includes "soft skills" and "twenty-first century skills", which broadly cover critical, creative and inventive thinking; information,…

  19. METHYLMERCURY BIOACCUMULATION DEPENDENCE ON NORTHERN PIKE AGE AND SIZE IN TWENTY MINNESOTA LAKES

    Science.gov (United States)

    Mercury accumulation in northern pike muscle tissue (fillets) was found to be directly related to fish age and size. Measurements were made on 173 individual northern pike specimens from twenty lakes across Minnesota. Best fit regressions of mercury fillet concentration (wet wt.)...

  20. Twenty-first Semiannual Report of the Commission to the Congress, January 1957

    Energy Technology Data Exchange (ETDEWEB)

    Strauss, Lewis L.

    1957-01-31

    The document represents the twenty-first semiannual Atomic Energy Commission (AEC) report to Congress. The report sums up the major activities and developments in the national atomic energy program covering the period July - December 1956. A special part two of this semiannual report addresses specifically Radiation Safety in Atomic Energy Activities.

  1. Way Forward in the Twenty-First Century in Content-Based Instruction: Moving towards Integration

    Science.gov (United States)

    Ruiz de Zarobe, Yolanda; Cenoz, Jasone

    2015-01-01

    The aim of this paper is to reflect on the theoretical and methodological underpinnings that provide the basis for an understanding of Content-Based Instruction/Content and Language Integrated Learning (CBI/CLIL) in the field and its relevance in education in the twenty-first century. It is argued that the agenda of CBI/CLIL needs to move towards…

  2. Quiet Riots: Race and Poverty in the United States. The Kerner Report Twenty Years Later.

    Science.gov (United States)

    Harris, Fred R., Ed.; Wilkins, Roger W., Ed.

    This book grew out of the national conference "The Kerner Commission: Twenty Years Later." The Kerner Commission found in its 1968 Report that America was moving toward two separate and unequal societies, divided along racial lines, and that major efforts to combat poverty, unemployment, and racism were mandated. The essays in this book…

  3. Testing Students under Cognitive Capitalism: Knowledge Production of Twenty-First Century Skills

    Science.gov (United States)

    Morgan, Clara

    2016-01-01

    Scholars studying the global governance of education have noted the increasingly important role corporations play in educational policy making. I contribute to this scholarship by examining the Assessment and Teaching of twenty-first century skills (ATC21S™) project, a knowledge production apparatus operating under cognitive capitalism. I analyze…

  4. Random walks, random fields, and disordered systems

    CERN Document Server

    Černý, Jiří; Kotecký, Roman

    2015-01-01

    Focusing on the mathematics that lies at the intersection of probability theory, statistical physics, combinatorics and computer science, this volume collects together lecture notes on recent developments in the area. The common ground of these subjects is perhaps best described by the three terms in the title: Random Walks, Random Fields and Disordered Systems. The specific topics covered include a study of Branching Brownian Motion from the perspective of disordered (spin-glass) systems, a detailed analysis of weakly self-avoiding random walks in four spatial dimensions via methods of field theory and the renormalization group, a study of phase transitions in disordered discrete structures using a rigorous version of the cavity method, a survey of recent work on interacting polymers in the ballisticity regime and, finally, a treatise on two-dimensional loop-soup models and their connection to conformally invariant systems and the Gaussian Free Field. The notes are aimed at early graduate students with a mod...

  5. Additional benefit of using a risk-based selection for prostate biopsy: an analysis of biopsy complications in the Rotterdam section of the European Randomized Study of Screening for Prostate Cancer.

    Science.gov (United States)

    Chiu, Peter K; Alberts, Arnout R; Venderbos, Lionne D F; Bangma, Chris H; Roobol, Monique J

    2017-09-01

    To investigate biopsy complications and hospital admissions that could be reduced by the use of European Randomized Study of Screening for Prostate Cancer (ERSPC) risk calculators. All biopsies performed in the Rotterdam section of the ERSPC between 1993 and 2015 were included. Biopsy complications and hospital admission data were prospectively recorded in questionnaires that were completed 2 weeks after biopsy. The ERSPC risk calculators 3 (RC3) and 4 (RC4) were applied to men attending the first and subsequent rounds of screening, respectively. Applying the predefined RC3/4 probability thresholds for prostate cancer (PCa) risk of ≥12.5% and high-grade PCa risk ≥3%, we assessed the number of complications, admissions and costs that could be reduced by avoiding biopsies in men below these thresholds. A total of 10 747 biopsies with complete questionnaires were included. For these biopsies a complication rate of 67.9% (7294/10 747), a post-biopsy fever rate of 3.9% (424/10 747) and a hospital admission rate of 0.9% (92/10 747) were recorded. The fever rate was found to be static over the years, but the hospital admission rate tripled from 0.6% (1993-1996) to 2.1% (2009-2015). Among 7704 biopsies which fit the criteria for RC3 or RC4, 35.8% of biopsies (2757/7704), 37.4% of complications (1972/5268), 39.4% of fever events (128/325) and 42.3% of admissions (30/71) could have been avoided by using one of the risk calculators. More complications could have been avoided if RC4 had been used and for more recent biopsies (2009-2015). Our findings show that 35.9% of the total cost of biopsies and complication treatment could have been avoided. A significant proportion of biopsy complications, hospital admissions and costs could be reduced if biopsy decisions were based on ERSPC risk calculators instead of PSA only. This effect was most prominent in more recent biopsies and in men with repeated biopsies or screening.
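
    As a toy illustration of the decision rule described in this record (biopsy only when predicted PCa risk ≥ 12.5% or high-grade PCa risk ≥ 3%), with made-up probabilities rather than actual ERSPC RC3/RC4 outputs:

    ```python
    # Hypothetical per-patient risk-calculator outputs (illustrative values).
    pca_risk        = [0.10, 0.30, 0.05, 0.20, 0.02]   # predicted overall PCa risk
    high_grade_risk = [0.01, 0.06, 0.04, 0.02, 0.01]   # predicted high-grade risk

    # Biopsy is "indicated" only above either threshold; the rest are avoided.
    biopsy = [p >= 0.125 or hg >= 0.03
              for p, hg in zip(pca_risk, high_grade_risk)]
    avoided = biopsy.count(False)
    print(f"biopsies avoided: {avoided} of {len(biopsy)}")
    ```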

  6. New 1/N expansions in random tensor models

    CERN Document Server

    Bonzom, Valentin

    2012-01-01

    Although random tensor models were introduced twenty years ago, it is only in 2011 that Gurau proved the existence of a 1/N expansion. Here we show that there actually is more than a single 1/N expansion, depending on the dimension. In the large N limit, these new expansions retain more than the melonic graphs. Still, in most cases, the large N limit is found to be Gaussian, and therefore extends the scope of the universality theorem for large random tensors. Nevertheless, a scaling which leads to non-Gaussian large N limits, in even dimensions, is identified for the first time.

  7. Why American business demands twenty-first century learning: A company perspective.

    Science.gov (United States)

    Knox, Allyson

    2006-01-01

    Microsoft is an innovative corporation demonstrating the kind and caliber of job skills needed in the twenty-first century. It demonstrates its commitment to twenty-first century skills by holding its employees accountable to a set of core competencies, enabling the company to run effectively. The author explores how Microsoft's core competencies parallel the Partnership for 21st Century Skills learning frameworks. Both require advanced problem-solving skills and a passion for technology, both expect individuals to be able to work in teams, both look for a love of learning, and both call for the self-confidence to honestly self-evaluate. Microsoft also works to cultivate twenty-first century skills among future workers, investing in education to help prepare young people for competitive futures. As the need for digital literacy has become imperative, technology companies have taken the lead in facilitating technology training by partnering with schools and communities. Microsoft is playing a direct role in preparing students for what lies ahead in their careers. To further twenty-first century skills, or core competencies, among the nation's youth, Microsoft has established Partners in Learning, a program that helps education organizations build partnerships that leverage technology to improve teaching and learning. One Partners in Learning grantee is Global Kids, a nonprofit organization that trains students to design online games focused on global social issues resonating with civic and global competencies. As Microsoft believes the challenges of competing in today's economy and teaching today's students are substantial but not insurmountable, such partnerships and investments demonstrate Microsoft's belief in and commitment to twenty-first century skills.

  8. Quantum random number generators

    Science.gov (United States)

    Herrero-Collantes, Miguel; Garcia-Escartin, Juan Carlos

    2017-01-01

    Random numbers are a fundamental resource in science and engineering with important applications in simulation and cryptography. The inherent randomness at the core of quantum mechanics makes quantum systems a perfect source of entropy. Quantum random number generation is one of the most mature quantum technologies with many alternative generation methods. This review discusses the different technologies in quantum random number generation from the early devices based on radioactive decay to the multiple ways to use the quantum states of light to gather entropy from a quantum origin. Randomness extraction and amplification and the notable possibility of generating trusted random numbers even with untrusted hardware using device-independent generation protocols are also discussed.

  9. Randomness and Differentiability

    CERN Document Server

    Brattka, Vasco; Nies, André

    2011-01-01

    We characterize some major algorithmic randomness notions via differentiability of effective functions. (1) We show that a real number z in [0,1] is computably random if and only if every nondecreasing computable function [0,1]->R is differentiable at z. (2) A real number z in [0,1] is weakly 2-random if and only if every almost everywhere differentiable computable function [0,1]->R is differentiable at z. (3) Recasting results of the constructivist Demuth (1975) in classical language, we show that a real z is ML random if and only if every computable function of bounded variation is differentiable at z, and similarly for absolutely continuous functions. We also use the analytic methods to show that computable randomness of a real is base invariant, and to derive preservation results for randomness notions.

  10. Invitation to Random Tensors

    Science.gov (United States)

    Gurau, Razvan

    2016-09-01

    This article is preface to the SIGMA special issue ''Tensor Models, Formalism and Applications'', http://www.emis.de/journals/SIGMA/Tensor_Models.html. The issue is a collection of eight excellent, up to date reviews on random tensor models. The reviews combine pedagogical introductions meant for a general audience with presentations of the most recent developments in the field. This preface aims to give a condensed panoramic overview of random tensors as the natural generalization of random matrices to higher dimensions.

  11. On Random Rough Sets

    Institute of Scientific and Technical Information of China (English)

    Weizhi Wu

    2006-01-01

    In this paper,the concept of a random rough set which includes the mechanisms of numeric and non-numeric aspects of uncertain knowledge is introduced. It is proved that for any belief structure and its inducing belief and plausibility measures there exists a random approximation space such that the associated lower and upper probabilities are respectively the given belief and plausibility measures, and vice versa. And for a random approximation space generated from a totally random set, its inducing lower and upper probabilities are respectively a pair of necessity and possibility measures.

  12. Rationale and design of the randomized, double-blind trial testing INtraveNous and Oral administration of elinogrel, a selective and reversible P2Y(12)-receptor inhibitor, versus clopidogrel to eVAluate Tolerability and Efficacy in nonurgent Percutaneous Coronary Interventions patients (INNOVATE-PCI).

    Science.gov (United States)

    Leonardi, Sergio; Rao, Sunil V; Harrington, Robert A; Bhatt, Deepak L; Gibson, C Michael; Roe, Matthew T; Kochman, Janusz; Huber, Kurt; Zeymer, Uwe; Madan, Mina; Gretler, Daniel D; McClure, Matthew W; Paynter, Gayle E; Thompson, Vivian; Welsh, Robert C

    2010-07-01

    Despite current dual-antiplatelet therapy with aspirin and clopidogrel, adverse clinical events continue to occur during and after percutaneous coronary intervention (PCI). The failure of clopidogrel to provide optimal protection may be related to delayed onset of action, interpatient variability in its effect, and an insufficient level of platelet inhibition. Furthermore, the irreversible binding of clopidogrel to the P2Y(12) receptor for the life span of the platelet is associated with increased bleeding risk especially during urgent or emergency surgery. Novel antiplatelet agents are required to improve management of patients undergoing PCI. Elinogrel is a potent, direct-acting (ie, non-prodrug), selective, competitive, and reversible P2Y(12) inhibitor available in both intravenous and oral formulations. The INNOVATE-PCI study is a phase 2 randomized, double-blind, clopidogrel-controlled trial to evaluate the safety, tolerability, and preliminary efficacy of this novel antiplatelet agent in patients undergoing nonurgent PCI.

  13. Selection of mimotopes of pemphigus vulgaris antigen from a phage-displayed random nonapeptide library

    Institute of Scientific and Technical Information of China (English)

    黄丽群; 姚刚; 薛峰; 潘萌; 孙兵; 郑捷

    2008-01-01

    Objective: To screen mimotopes of the pemphigus vulgaris (PV) antigen desmoglein 3 (Dsg3) with a phage-displayed random nonapeptide library, so as to deepen understanding of the pathogenesis of PV. Methods: A recombinant fusion protein of the extracellular domains 1 and 2 (EC1-2) of Dsg3 and glutathione transferase (GST) was expressed in E. coli BL21 and used to purify polyclonal autoantibodies specifically binding EC1-2 from the sera of patients with PV. The purified autoantibody was then applied as a ligand for biopanning of a phage-displayed linear random nonapeptide library and a circular random nonapeptide library, and positive phage-displayed peptides were verified by ELISA and competitive ELISA. Results: After two rounds of biopanning, phages binding to the autoantibody were markedly enriched. Sixty individual phage clones selected by immunoscreening were further tested with ELISA and competitive ELISA, yielding three positive phage-displayed peptides. ELISA showed that they reacted with patient sera but not with sera from healthy controls, and competitive ELISA showed that they inhibited the binding of PV patient sera to the recombinant protein EC1-2. Conclusion: Three mimotopes closely associated with pemphigus vulgaris were selected from a phage-displayed random nonapeptide library.

  14. Twenty four year time trends in fats and cholesterol intake by adolescents. Warsaw Adolescents Study

    Directory of Open Access Journals (Sweden)

    Charzewska Jadwiga

    2015-06-01

    Full Text Available The objective of this study was to determine time trends (1982-2006) in total fat intake and changes in the structure of fatty acid intake in adolescents from Warsaw, in view of the increasing prevalence of obesity. Data come from four successive surveys of randomly selected samples of adolescents (aged 11-15 years) from the Warsaw region. In total 9747 pupils were examined, with response rates varying from 55% to 87% depending on the year. Surveys were always done in the spring season. Food intake was assessed using the 24-hour recall method, covering all products (including enriched products), dishes and beverages, as well as dietary supplements, consumed by the pupils in the 24 hours preceding the examination. The content of energy and nutrients was calculated with in-house computer software (DIET 2 and 4), taking into account successive revisions of the tables of food composition and nutritional values, as well as the current Polish DRI. A significant decreasing trend was found in the intake of total fat, saturated fatty acids (SFA) and cholesterol. The percentage of energy from total fat also decreased, in boys (to 35.1%) and girls (to 33.7%), but failed to reach the recommended level of below 30% of energy from fat. The significant decrease in SFA consumption was likewise not sufficient to approach the recommended value of <10% of energy, as intake remained between 13% and 15%. The decreasing trend in fat intake was not in accordance with the trend in obesity prevalence among the adolescents, as average BMI is going up. To stabilize the health-oriented changes, especially in the diets of adolescents, further activity is needed from professionals working on the prevention of adolescent obesity.

  15. Using Chemistry Simulations: Attention Capture, Selective Amnesia and Inattentional Blindness

    Science.gov (United States)

    Rodrigues, Susan

    2011-01-01

    Twenty-one convenience sample student volunteers aged between 14-15 years worked in pairs (and one group of three) with two randomly allocated high quality conceptual (molecular level) and operational (mimicking wet labs) simulations. The volunteers were told they had five minutes to play, repeat, review, restart or stop the simulation, which in…

  16. Selection of Triptolide Ligands from a Random Phage Display Library and Preliminary Verification of Their Binding

    Institute of Scientific and Technical Information of China (English)

    杨旭光; 徐晓煜; 李家璜; 姚其正; 朱彤阳; 华子春; 郑伟娟

    2012-01-01

    Triptolide is an extract from the Chinese herb Tripterygium wilfordii Hook f. We screened a peptide from a C7 phage display library and presumed it to be a potential peptide ligand for triptolide. The specificity of the selected clone for triptolide was confirmed by ELISA and immunoprecipitation. After DNA sequencing, a BLAST search was carried out and 77 matching sequences were retrieved. The most promising candidate is human steroidogenic factor-1 (hSF-1), a member of the nuclear receptor family that controls the synthesis of steroid hormones by regulating steroidogenic enzyme genes. We purified the ligand binding domain (LBD) of hSF-1 and used it for further experiments. Fluorescence spectrum experiments showed that triptolide could cause fluorescence quenching of hSF-1-LBD, isothermal titration calorimetry (ITC) measurements showed an enthalpy-driven interaction between triptolide and hSF-1-LBD, and a dose-dependent interaction between them was observed by surface plasmon resonance (SPR). All these results confirmed the specific interaction between hSF-1 and triptolide.

  17. Farmer's Knowledge of Horticultural Traits and Participatory Selection of African Eggplant Varieties (Solanum aethiopicum in Tanzania

    Directory of Open Access Journals (Sweden)

    Adeniji, OT.

    2012-01-01

    Full Text Available Participatory selection was conducted in 2008 through 2009 to identify farmers' preferences for species and horticultural traits that may constitute future breeding objectives. Vegetable farmers were selected from the Moshi and Arusha regions, and the test population comprised twenty-six accessions from four Solanum species (eggplant and relatives). Purposive sampling was used to select farming communities with high African eggplant production activity; a multistage random sampling procedure was adopted to select farmers from three regions for the participatory meetings. The focus group discussion sessions identified fruit shape, taste, earliness, medicinal properties, marketability and resistance to diseases as farmers' preferred traits in S. aethiopicum; taste and marketability for S. melongena; and taste and medicinal properties for S. macrocarpon and S. anguivi. Fruits with a cream colour at commercial harvest are most preferred compared to green, and to a lesser extent purple. Interestingly, high numbers of fruits per plant, fruits per cluster and fruit clusters per plant best described S. anguivi. Fruit yield was superior in Db3 (S. aethiopicum Gilo group); the top five accessions for organoleptic properties were Db3, Ab2, MM 1619, S00052 and MM 1086. The characters indicated above may constitute breeding objectives, and the populations identified may serve as pollen parents for the development of new varieties of African eggplant. Intraspecific hybridization within the S. aethiopicum Gilo cultigroup, hybridization between the Gilo and Shum cultigroups, and interspecific hybridization between S. aethiopicum and S. anguivi may generate new populations aimed at improving fruit yield.

  18. Genetic parameters and selection gains for Euterpe oleracea in juvenile phase

    Directory of Open Access Journals (Sweden)

    João Tomé de Farias Neto

    2012-09-01

    Full Text Available Genetic parameters and selection gains, obtained 36 months after planting, are presented and discussed for open-pollinated progenies of açai palm for plant height (AP), plant diameter (DPC), number of live leaves (NFV) and tiller number (NP), based on linear mixed model methodology (REML/BLUP). The thirty progenies were evaluated in a randomized block design with three replications and plots of five plants, spaced at 6 m x 4 m. The values obtained for individual heritability (0.55, 0.44, 0.38 and 0.43) and for progeny means (0.64, 0.54, 0.58 and 0.64) for AP, DPC, NFV and NP, respectively, were expressive, which indicates the possibility of genetic progress with selection. The accuracy between predicted and true genetic values was 0.802 for height, 0.736 for diameter, 0.760 for number of live leaves and 0.797 for tiller number. With the exception of the NFV character, the coefficients of individual genetic variation were high (>10%), confirming the potential of the population for selection. Predicted genetic gains of 89.3% were obtained for the character AP and 2.1% for DPC with the selection of the twenty top individuals. A correlation was found between plant height and diameter. Among ages, for the same characters, positive correlations of medium magnitude were found.
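
    For context, the heritabilities and predicted gains reported above are conventionally related through the following textbook quantitative-genetics formulas (standard definitions, not equations quoted from the paper):

        h^2 = \frac{\sigma_a^2}{\sigma_a^2 + \sigma_e^2}, \qquad \Delta G = i \, h^2 \, \sigma_P

    where \sigma_a^2 is the additive genetic variance, \sigma_e^2 the residual variance, i the selection intensity and \sigma_P the phenotypic standard deviation.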

  19. Asymptotics of Random Contractions

    CERN Document Server

    Hashorva, Enkelejd; Tang, Qihe

    2010-01-01

    In this paper we discuss the asymptotic behaviour of random contractions $X=RS$, where $R$, with distribution function $F$, is a positive random variable independent of $S\\in (0,1)$. Random contractions appear naturally in insurance and finance. Our principal contribution is the derivation of the tail asymptotics of $X$ assuming that $F$ is in the max-domain of attraction of an extreme value distribution and the distribution function of $S$ satisfies a regular variation property. We apply our result to derive the asymptotics of the probability of ruin for a particular discrete-time risk model. Further we quantify in our asymptotic setting the effect of the random scaling on the Conditional Tail Expectations, risk aggregation, and derive the joint asymptotic distribution of linear combinations of random contractions.

  20. Random complex fewnomials, I

    CERN Document Server

    Shiffman, Bernard

    2010-01-01

    We introduce several notions of `random fewnomials', i.e. random polynomials with a fixed number f of monomials of degree N. The f exponents are chosen at random and then the coefficients are chosen to be Gaussian random, mainly from the SU(m + 1) ensemble. The results give limiting formulas as N goes to infinity for the expected distribution of complex zeros of a system of k random fewnomials in m variables. When k = m, for SU(m + 1) polynomials, the limit is the Monge-Ampere measure of a toric Kaehler potential on CP^m obtained by averaging a `discrete Legendre transform' of the Fubini-Study symplectic potential at f points of the unit simplex in R^m.

  1. Random bistochastic matrices

    Energy Technology Data Exchange (ETDEWEB)

    Cappellini, Valerio ['Mark Kac' Complex Systems Research Centre, Uniwersytet Jagiellonski, ul. Reymonta 4, 30-059 Krakow (Poland); Sommers, Hans-Juergen [Fachbereich Physik, Universitaet Duisburg-Essen, Campus Duisburg, 47048 Duisburg (Germany); Bruzda, Wojciech; Zyczkowski, Karol [Instytut Fizyki im. Smoluchowskiego, Uniwersytet Jagiellonski, ul. Reymonta 4, 30-059 Krakow (Poland)], E-mail: valerio@ictp.it, E-mail: h.j.sommers@uni-due.de, E-mail: w.bruzda@uj.edu.pl, E-mail: karol@cft.edu.pl

    2009-09-11

    Ensembles of random stochastic and bistochastic matrices are investigated. While all columns of a random stochastic matrix can be chosen independently, the rows and columns of a bistochastic matrix have to be correlated. We evaluate the probability measure induced into the Birkhoff polytope of bistochastic matrices by applying the Sinkhorn algorithm to a given ensemble of random stochastic matrices. For matrices of order N = 2 we derive explicit formulae for the probability distributions induced by random stochastic matrices with columns distributed according to the Dirichlet distribution. For arbitrary N we construct an initial ensemble of stochastic matrices which allows one to generate random bistochastic matrices according to a distribution locally flat at the center of the Birkhoff polytope. The value of the probability density at this point enables us to obtain an estimation of the volume of the Birkhoff polytope, consistent with recent asymptotic results.
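
    A minimal sketch of the construction described in this record, assuming NumPy: the columns of a stochastic matrix are drawn independently from a Dirichlet distribution, and the Sinkhorn algorithm (alternating row and column normalisation) is then iterated towards the Birkhoff polytope of bistochastic matrices. The Dirichlet parameter, tolerance and matrix size below are arbitrary choices, not values from the paper.

        import numpy as np

        def random_stochastic(n, alpha=1.0, rng=None):
            # Each column is an independent draw from a Dirichlet distribution
            rng = rng or np.random.default_rng()
            return rng.dirichlet(alpha * np.ones(n), size=n).T   # columns sum to 1

        def sinkhorn(m, tol=1e-10, max_iter=10_000):
            # Alternately normalise rows and columns until the matrix is (nearly) bistochastic
            b = m.copy()
            for _ in range(max_iter):
                b /= b.sum(axis=1, keepdims=True)   # row normalisation
                b /= b.sum(axis=0, keepdims=True)   # column normalisation
                if np.allclose(b.sum(axis=1), 1.0, atol=tol):
                    break
            return b

        B = sinkhorn(random_stochastic(4))
        print(B.sum(axis=0), B.sum(axis=1))   # both close to vectors of ones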

  2. Random Bistochastic Matrices

    CERN Document Server

    Cappellini, V; Bruzda, W; Zyczkowski, K

    2009-01-01

    Ensembles of random stochastic and bistochastic matrices are investigated. While all columns of a random stochastic matrix can be chosen independently, the rows and columns of a bistochastic matrix have to be correlated. We evaluate the probability measure induced into the Birkhoff polytope of bistochastic matrices by applying the Sinkhorn algorithm to a given ensemble of random stochastic matrices. For matrices of order N=2 we derive explicit formulae for the probability distributions induced by random stochastic matrices with columns distributed according to the Dirichlet distribution. For arbitrary $N$ we construct an initial ensemble of stochastic matrices which allows one to generate random bistochastic matrices according to a distribution locally flat at the center of the Birkhoff polytope. The value of the probability density at this point enables us to obtain an estimation of the volume of the Birkhoff polytope, consistent with recent asymptotic results.

  3. Quantum random number generator

    Science.gov (United States)

    Pooser, Raphael C.

    2016-05-10

    A quantum random number generator (QRNG) and a photon generator for a QRNG are provided. The photon generator may be operated in a spontaneous mode below a lasing threshold to emit photons. Photons emitted from the photon generator may have at least one random characteristic, which may be monitored by the QRNG to generate a random number. In one embodiment, the photon generator may include a photon emitter and an amplifier coupled to the photon emitter. The amplifier may enable the photon generator to be used in the QRNG without introducing significant bias in the random number and may enable multiplexing of multiple random numbers. The amplifier may also desensitize the photon generator to fluctuations in power supplied thereto while operating in the spontaneous mode. In one embodiment, the photon emitter and amplifier may be a tapered diode amplifier.

  4. Randomness: quantum versus classical

    CERN Document Server

    Khrennikov, Andrei

    2015-01-01

    Recent tremendous development of quantum information theory has led to a number of quantum technological projects, e.g., quantum random generators. This development stimulates a new wave of interest in quantum foundations. One of the most intriguing problems of quantum foundations is the elaboration of a consistent and commonly accepted interpretation of the quantum state. A closely related problem is the clarification of the notion of quantum randomness and its interrelation with classical randomness. In this short review we shall discuss the basics of the classical theory of randomness (which by itself is very complex and characterized by a diversity of approaches) and compare it with irreducible quantum randomness. The second part of this review is devoted to the information interpretation of quantum mechanics (QM) in the spirit of Zeilinger and Brukner (and the QBism of Fuchs et al.) and physics in general (e.g., Wheeler's "it from bit") as well as the digital philosophy of Chaitin (with historical coupling to ideas of Leibnitz). Finally, w...

  5. Randomness: Quantum versus classical

    Science.gov (United States)

    Khrennikov, Andrei

    2016-05-01

    Recent tremendous development of quantum information theory has led to a number of quantum technological projects, e.g. quantum random generators. This development has stimulated a new wave of interest in quantum foundations. One of the most intriguing problems of quantum foundations is the elaboration of a consistent and commonly accepted interpretation of a quantum state. A closely related problem is the clarification of the notion of quantum randomness and its interrelation with classical randomness. In this short review, we shall discuss the basics of the classical theory of randomness (which by itself is very complex and characterized by a diversity of approaches) and compare it with irreducible quantum randomness. We also briefly discuss “digital philosophy”, its role in physics (classical and quantum) and its coupling to the information interpretation of quantum mechanics (QM).

  6. New Poetics of the Film Body: Docility, Molecular Fundamentalism and Twenty First Century Destiny

    Directory of Open Access Journals (Sweden)

    Flynn Susan

    2015-06-01

    Full Text Available Twenty first century film evokes a new topology of the body. Science and technology are the new century’s ‘sovereign power’, which enforces biopolitics through bodies which, by virtue of being seen at their most fundamental level, have become docile surfaces. The film body is at once manipulated and coerced into an ethos of optimization; a thoroughly scientific and ‘molecular’ optimization which proffers ‘normalization’ and intimately regulated bodies. In the film bodies of this millennium, bodily intervention results in surveillance becoming internalized. Now the body is both a means and an end of social control. This essay applies the philosophies of Michel Foucault and Nikolas Rose to twenty first century Hollywood film, elucidating a new tropos, a new film body/body of film.

  7. Twenty-four-hour blood pressure among Greenlanders and Danes: relationship to diet and lifestyle

    DEFF Research Database (Denmark)

    Jørgensen, Marit Eika; Pedersen, M.B.; Siggaard, Cecilie

    2002-01-01

    the influence of Arctic food and lifestyle on blood pressure. Four groups of healthy subjects were recruited for the study. Group I: Danes in Denmark consuming European food; group II: Greenlanders in Denmark consuming European food; group III: Greenlanders in Greenland consuming mainly European food; and group..., outdoor temperature, and lifestyle factors. Greenlanders have a lower 24-h diastolic blood pressure than Danes, and it is suggested that genetic factors are mainly responsible for the lower blood pressure level among Greenlanders.

  8. Random attractors for asymptotically upper semicompact multivalue random semiflows

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The present paper studied the dynamics of multivalued random semiflows. The corresponding concept of a random attractor for this case was introduced to study asymptotic behavior. The existence of a random attractor for a multivalued random semiflow was proved under the assumption of pullback asymptotic upper semicompactness, and this random attractor is random, compact and invariant. Furthermore, if the system has ergodicity, then this random attractor is the limit set of a deterministic bounded set.

  9. Twenty Years of Evolutionary Change in the Department of Defense’s Civil Support Mission

    Science.gov (United States)

    2013-05-23

    should include the foregoing statement.) ii ABSTRACT TWENTY YEARS OF EVOLUTIONARY CHANGE IN THE DEPARTMENT OF DEFENSE’S... gasping for air flooded television and print outlets across the globe. The casualty figures from the attack included twelve dead, fifty four...Biological Incident Response Force, II Marine Expeditionary Force, Indian Head, MD,” Marines: The Official Website of the United States Marine Corps

  10. Results of miconazole therapy in twenty-eight patients with paracoccidioidomycosis (South American blastomycosis).

    Science.gov (United States)

    Negroni, R; Rubinstein, P; Herrmann, A; Gimenez, A

    1977-01-01

    Results are presented of treatment with miconazole, orally and intravenously, in patients with paracoccidioidomycosis. Twenty-eight male patients aged from 34 to 66 years and exhibiting various clinical forms of the disease were studied. Twenty-five came from endemic areas in northeast Argentina (Chaco, Formosa, Misiones, Corrientes and northern Santa Fe) and the remaining three from Paraguay. Twenty patients were engaged in agricultural work or at woodmills. Single or multiple lesions were observed in 24 cases. Thirteen were suffering from infection of the larynx and in two of them a tracheotomy was necessary. Twenty-three showed pulmonary lesions on X-rays. Twelve had ganglionic lesions, eight had cutaneous lesions and one patient had osteoarthritis of the knee. One patient had hepatomegaly which was unrelated to chronic alcoholism. Fourteen patients had received previous treatments such as sulphonamides and amphotericin B (7 cases); sulphonamides (3), sulphonamides and the combination sulfamethoxazole + trimethoprim (3), and one patient had received all three medications. All patients had relapsed before starting miconazole therapy. Diagnosis was established by the presence of P. brasiliensis in all cases, recovered either from cutaneous or mucosal biopsy samples or from the sputum. Complement fixation tests were positive in all patients at the onset of the treatment and the immunodiffusion reactions showed precipitation bands in 27/28 patients. Skin tests with P. brasiliensis antigens proved to be positive in 18 cases and negative in 10. The erythrocyte sedimentation rate was markedly accelerated in 22 patients (greater than 20 mm in the first hour). (ABSTRACT TRUNCATED AT 250 WORDS) PMID:122643

  11. A Brief Analysis of O. Henry’s Writing Characteristics in "After Twenty Years"

    Institute of Scientific and Technical Information of China (English)

    王忠林

    2014-01-01

    O. Henry was one of the world’s three masters of the short story. He was known as the father of the modern American short story. Even today, O. Henry’s short stories still enjoy great popularity. This article analyzes the writing characteristics of O. Henry’s short stories by probing into one of them, "After Twenty Years".

  12. Automation and robotics for Space Station in the twenty-first century

    Science.gov (United States)

    Willshire, K. F.; Pivirotto, D. L.

    1986-01-01

    Space Station telerobotics will evolve beyond the initial capability into a smarter and more capable system as we enter the twenty-first century. Current technology programs including several proposed ground and flight experiments to enable development of this system are described. Advancements in the areas of machine vision, smart sensors, advanced control architecture, manipulator joint design, end effector design, and artificial intelligence will provide increasingly more autonomous telerobotic systems.

  13. Twenty-First Century Europe: Emergence of Baltic States into European Alliances

    Science.gov (United States)

    2003-04-07

    sources of power supply. Estonia is the only country in the world where oil shale is the primary source of energy, supplying over 75 percent of its total... The contributions of Estonia, Latvia, and Lithuania ("the Baltic States") to the North Atlantic Treaty Organization (NATO), the European... TWENTY-FIRST CENTURY EUROPE: EMERGENCE OF THE BALTIC STATES INTO EUROPEAN ALLIANCES: BACKGROUND. Estonia, Latvia, and Lithuania are often

  14. Monetary Policy Implementation and Results in Twenty Inflation-Targeting Countries.

    OpenAIRE

    Klaus Schmidt-Hebbel.; Matías Tapia

    2002-01-01

    Inflation targeting is an increasingly popular monetary regime among industrialized and developing central banks. However, there is little cross-country comparative information about commonalties and differences in monetary policy implementation and results across inflation-targeting countries. This paper presents the results of a survey on monetary policy conducted among the world’s twenty central banks that currently target inflation. Survey responses highlight operational features of monet...

  15. Proceedings of the twenty-sixth annual Institute on Mining Health, Safety and Research

    Energy Technology Data Exchange (ETDEWEB)

    Tinney, G.R.; Bacho, A.; Karmis, M. [eds.]

    1995-12-31

    The proceedings of the Twenty-Sixth Annual Institute on Mining Health, Safety and Research are presented. The Conference was held in Blacksburg, Virginia on August 28-30, 1995 and covered such topics as themes of change, miner's safety, personal and corporate liability, behavioral changes and positive reinforcement, and meeting health and safety objectives in mining operations. A separate abstract was prepared for the thirteen papers for inclusion in the Energy Science and Technology Database.

  16. A Summary of the Twenty-Ninth AAAI Conference on Artificial Intelligence

    OpenAIRE

    2015-01-01

    The Twenty-Ninth AAAI Conference on Artificial Intelligence (AAAI-15) was held in January 2015 in Austin, Texas (USA). The conference program was cochaired by Sven Koenig and Blai Bonet. This report contains reflective summaries of the main conference, the robotics program, the AI and robotics workshop, the virtual agent exhibition, the what's hot track, the competition panel, the senior member track, student and outreach activities, the student abstract and poster program, the doctoral conso...

  17. A New Combinational Selection Operator in Genetic Algorithm

    Science.gov (United States)

    Rafsanjani, Marjan Kuchaki; Eskandari, Sadegh

    2011-09-01

    In this paper, a new Random Combinational Selection Operator (RCSO) is presented. Three existing selection operators and our proposed selection method are applied to traveling salesman problems using MATLAB. The tours obtained using our selection method are shorter than those obtained with existing selection operators for large numbers of cities.
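
    The paper's RCSO itself is not reproduced here; as a baseline illustration of what a selection operator does, the sketch below implements classical fitness-proportional (roulette-wheel) selection for a tour-minimisation setting. All names and numbers are illustrative, and the sketch is in Python rather than the MATLAB used in the paper.

        import random

        def roulette_select(population, tour_lengths, k, rng=random):
            # Fitness-proportional (roulette-wheel) selection for a minimisation problem:
            # shorter tours receive larger selection weights.
            fitness = [1.0 / length for length in tour_lengths]
            total = sum(fitness)
            weights = [f / total for f in fitness]
            # random.choices draws k individuals with replacement, proportionally to the weights
            return rng.choices(population, weights=weights, k=k)

        tours = [[0, 1, 2, 3], [0, 2, 1, 3], [0, 3, 1, 2]]
        lengths = [10.0, 14.0, 9.0]
        parents = roulette_select(tours, lengths, k=2)
        print(parents)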

  18. Twenty-first century Arctic climate change in the CCSM3 IPCC scenario simulations

    Energy Technology Data Exchange (ETDEWEB)

    Teng, Haiyan; Washington, Warren M.; Meehl, Gerald A.; Buja, Lawrence E.; Strand, Gary W. [National Center for Atmospheric Research, Boulder, CO (United States)

    2006-05-15

    Arctic climate change in the Twenty-first century is simulated by the Community Climate System Model version 3.0 (CCSM3). The simulations from three emission scenarios (A2, A1B and B1) are analyzed using eight (A1B and B1) or five (A2) ensemble members. The model simulates a reasonable present-day climate and historical climate trend. The model projects a decline of sea-ice extent in the range of 1.4-3.9% per decade and 4.8-22.2% per decade in winter and summer, respectively, corresponding to the range of forcings that span the scenarios. At the end of the Twenty-first century, the winter and summer Arctic mean surface air temperature increases in a range of 4-14 C (B1 and A2) and 0.7-5 C (B1 and A2) relative to the end of the Twentieth century. The Arctic becomes ice-free during summer at the end of the Twenty-first century in the A2 scenario. Similar to the observations, the Arctic Oscillation (AO) is the dominant factor in explaining the variability of the atmosphere and sea ice in the 1870-1999 historical runs. The AO shifts to the positive phase in response to greenhouse gas forcings in the Twenty-first century. But the simulated trends in both Arctic mean sea-level pressure and the AO index are smaller than what has been observed. The Twenty-first century Arctic warming mainly results from the radiative forcing of greenhouse gases. The 1st empirical orthogonal function (explains 72.2-51.7% of the total variance) of the wintertime surface air temperature during 1870-2099 is characterized by a strong warming trend and a "polar amplification"-type of spatial pattern. The AO, which plays a secondary role, contributes to less than 10% of the total variance in both surface temperature and sea-ice concentration. (orig.)

  19. Twenty first century climatic and hydrological changes over Upper Indus Basin of Himalayan region of Pakistan

    Science.gov (United States)

    Ali, Shaukat; Li, Dan; Congbin, Fu; Khan, Firdos

    2015-01-01

    This study is based on both the recent and the predicted twenty first century climatic and hydrological changes over the mountainous Upper Indus Basin (UIB), which are influenced by snow and glacier melting. Conformal-Cubic Atmospheric Model (CCAM) data for the periods 1976-2005, 2006-2035, 2041-2070, and 2071-2100 with RCP4.5 and RCP8.5; and Regional Climate Model (RegCM) data for the periods of 2041-2050 and 2071-2080 with RCP8.5 are used for climatic projection and, after bias correction, the same data are used as an input to the University of British Columbia (UBC) hydrological model for river flow projections. The projections of all of the future periods were compared with the results of 1976-2005 and with each other. Projections of future changes show a consistent increase in air temperature and precipitation. However, temperature and precipitation increase is relatively slow during 2071-2100 in contrast with 2041-2070. Northern parts are more likely to experience an increase in precipitation and temperature in comparison to the southern parts. A higher increase in temperature is projected during spring and winter over southern parts and during summer over northern parts. Moreover, the increase in minimum temperature is larger in both scenarios for all future periods. Future river flow is projected by both models to increase in the twenty first century (CCAM and RegCM) in both scenarios. However, the rate of increase is larger during the first half while it is relatively small in the second half of the twenty first century in RCP4.5. The possible reason for high river flow during the first half of the twenty first century is the large increase in temperature, which may cause faster melting of snow, while in the last half of the century there is a decreasing trend in river flow, precipitation, and temperature (2071-2100) in comparison to 2041-2070 for RCP4.5. Generally, for all future periods, the percentage of increased river flow is larger in winter than in

  20. Theory of Selection Operators on Hyperspaces and Multivalued Stochastic Processes

    Institute of Scientific and Technical Information of China (English)

    高勇; 张文修

    1994-01-01

    In this paper, a new concept of selection operators on hyperspaces (subsets spaces) is introduced, and the existence theorems for several kinds of selection operators are proved. Using the methods of selection operators, we give a selection characterization of identically distributed multivalued random variables and completely solve the vector-valued selection problem for sequences of multivalued random variables converging in distribution. The regular selections and Markov selections for multivalued stochastic processes are studied, and a discretization theorem for multivalued Markov processes is established. A theorem on the asymptotic martingale selections for compact and convex multivalued asymptotic martingale is proved.

  1. How to deal with morning bad breath: A randomized, crossover clinical trial

    OpenAIRE

    Jeronimo M Oliveira-Neto; Sandra Sato; Vinicius Pedrazzi

    2013-01-01

    Context: The absence of a protocol for the treatment of halitosis has led us to compare mouthrinses with mechanical oral hygiene procedures for treating morning breath by employing a hand-held sulfide monitor. Aims: To compare the efficacy of five modalities of treatment for controlling morning halitosis in subjects with no dental or periodontal disease. Settings and Design: This is a five-period, randomized, crossover clinical trial. Materials and Methods: Twenty volunteers were randomly ass...

  2. Electrocardiogram ST Analysis During Labor : A Systematic Review and Meta-analysis of Randomized Controlled Trials

    NARCIS (Netherlands)

    Saccone, Gabriele; Schuit, Ewoud; Amer-Wåhlin, Isis; Xodo, Serena; Berghella, Vincenzo

    2016-01-01

    OBJECTIVE: To compare the effectiveness of cardiotocography plus ST analysis with cardiotocography alone during labor. DATA SOURCES: Randomized controlled trials were identified by searching electronic databases. METHODS OF STUDY SELECTION: We included all randomized controlled trials comparing intr

  3. Electrocardiogram ST Analysis During Labor : A Systematic Review and Meta-analysis of Randomized Controlled Trials

    NARCIS (Netherlands)

    Saccone, Gabriele; Schuit, Ewoud; Amer-Wåhlin, Isis; Xodo, Serena; Berghella, Vincenzo

    OBJECTIVE: To compare the effectiveness of cardiotocography plus ST analysis with cardiotocography alone during labor. DATA SOURCES: Randomized controlled trials were identified by searching electronic databases. METHODS OF STUDY SELECTION: We included all randomized controlled trials comparing

  4. A random walk with a branching system in random environments

    Institute of Scientific and Technical Information of China (English)

    Ying-qiu LI; Xu LI; Quan-sheng LIU

    2007-01-01

    We consider a branching random walk in random environments, where the particles are reproduced as a branching process with a random environment (in time), and move independently as a random walk on Z with a random environment (in locations). We obtain the asymptotic properties on the position of the rightmost particle at time n, revealing a phase transition phenomenon of the system.

  5. Genetic parameters for residual feed intake in a random population of Pekin duck

    Directory of Open Access Journals (Sweden)

    Yunsheng Zhang

    2017-02-01

    Full Text Available Objective The feed intake (FI) and feed efficiency are economically important traits in ducks. To obtain insight into these traits, we designed an experiment based on the residual feed intake (RFI) and feed conversion ratio (FCR) of a random population of Pekin ducks. Methods Two thousand and twenty pedigreed random-population Pekin ducks were established from 90 males mated to 450 females in two hatches. Traits analyzed in the study were body weight at the 42nd day (BW42), 15 to 42 days average daily gain (ADG), 15 to 42 days FI, 15 to 42 days FCR, and 15 to 42 days RFI, to assess their genetic inter-relationships. The genetic parameters for feed efficiency traits were estimated using restricted maximum likelihood (REML) methodology applied to a sire-dam model for all traits using the ASREML software. Results Heritability estimates of BW42, ADG, FI, FCR, and RFI were 0.39, 0.38, 0.33, 0.38, and 0.41, respectively. The genetic correlation was high between RFI and FI (0.77) and moderate between RFI and FCR (0.54). The genetic correlation was high between FCR and ADG (−0.80), and moderate between FCR and BW42 (−0.64) and between FCR and FI (0.49). Conclusion Selection on RFI is thus expected to improve feed efficiency and reduce FI without impairing growth rate.
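
    Residual feed intake is conventionally computed as the residual from a linear regression of feed intake on production traits (here ADG and a body-weight term); this is the general convention, not necessarily the exact model fitted in the paper. A minimal NumPy sketch with made-up numbers:

        import numpy as np

        # Hypothetical per-bird records: feed intake (FI), average daily gain (ADG), body weight (BW42)
        FI   = np.array([120.0, 135.0, 128.0, 140.0, 131.0])
        ADG  = np.array([ 45.0,  50.0,  47.0,  52.0,  48.0])
        BW42 = np.array([  2.1,   2.3,   2.2,   2.4,  2.25])

        # Design matrix: intercept, ADG, BW42
        X = np.column_stack([np.ones_like(FI), ADG, BW42])
        coef, *_ = np.linalg.lstsq(X, FI, rcond=None)

        # RFI = observed FI minus the intake predicted from growth and body weight
        RFI = FI - X @ coef
        print(np.round(RFI, 3))   # residuals sum to ~0 by construction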

  6. Invitation to random tensors

    CERN Document Server

    Gurau, Razvan

    2016-01-01

    Preface to the SIGMA special issue "Tensor Models, Formalism and Applications." The SIGMA special issue "Tensor Models, Formalism and Applications" is a collection of eight excellent, up to date reviews \\cite{Ryan:2016sundry,Bonzom:2016dwy,Rivasseau:2016zco,Carrozza:2016vsq,Krajewski:2016svb,Rivasseau:2016rgt,Tanasa:2015uhr,Gielen:2016dss} on random tensor models. The reviews combine pedagogical introductions meant for a general audience with presentations of the most recent developments in the field. This preface aims to give a condensed panoramic overview of random tensors as the natural generalization of random matrices to higher dimensions.

  7. Aperiodic Quantum Random Walks

    CERN Document Server

    Ribeiro, P; Mosseri, R; Ribeiro, Pedro; Milman, Perola; Mosseri, Remy

    2004-01-01

    We generalize the quantum random walk protocol for a particle in a one-dimensional chain, by using several types of biased quantum coins, arranged in aperiodic sequences, in a manner that leads to a rich variety of possible wave function evolutions. Quasiperiodic sequences, following the Fibonacci prescription, are of particular interest, leading to a sub-ballistic wavefunction spreading. In contrast, random sequences lead to diffusive spreading, similar to the classical random walk behaviour. We also describe how to experimentally implement these aperiodic sequences.
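
    A minimal NumPy simulation sketch of the protocol described in this record: two biased coin unitaries are applied in the order given by the quasiperiodic Fibonacci word, followed by the usual conditional shift. The coin angles and number of steps are arbitrary illustrative values, not parameters from the paper.

        import numpy as np

        def coin(theta):
            # Biased 2x2 coin unitary; theta = pi/4 reproduces the balanced (Hadamard-like) coin
            return np.array([[np.cos(theta),  np.sin(theta)],
                             [np.sin(theta), -np.cos(theta)]])

        def fibonacci_word(n):
            # Substitution rule A -> AB, B -> A generates the quasiperiodic coin sequence
            w = [0]
            while len(w) < n:
                w = [s for x in w for s in ((0, 1) if x == 0 else (0,))]
            return w[:n]

        def walk(n_steps, thetas):
            seq = fibonacci_word(n_steps)
            size = 2 * n_steps + 1
            psi = np.zeros((size, 2), dtype=complex)
            psi[n_steps, 0] = 1.0                       # walker starts at the origin
            for t in range(n_steps):
                psi = psi @ coin(thetas[seq[t]]).T      # coin toss at every site
                shifted = np.zeros_like(psi)
                shifted[:-1, 0] = psi[1:, 0]            # coin component |0> steps left
                shifted[1:, 1] = psi[:-1, 1]            # coin component |1> steps right
                psi = shifted
            return (np.abs(psi) ** 2).sum(axis=1)       # position probability distribution

        p = walk(60, thetas=(np.pi / 4, np.pi / 3))
        print(p.sum())   # total probability stays 1 (unitary evolution)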

  8. Random Fiber Laser

    CERN Document Server

    de Matos, Christiano J S; Brito-Silva, Antônio M; Gámez, M A Martinez; Gomes, Anderson S L; de Araújo, Cid B

    2007-01-01

    We investigate the effects of two dimensional confinement on the lasing properties of a classical random laser system operating in the incoherent feedback (diffusive) regime. A suspension of 250nm rutile (TiO2) particles in a Rhodamine 6G solution was inserted into the hollow core of a photonic crystal fiber (PCF) generating the first random fiber laser and a novel quasi-one-dimensional RL geometry. Comparison with similar systems in bulk format shows that the random fiber laser presents an efficiency that is at least two orders of magnitude higher.

  9. Five-year follow-up of harms and benefits of behavioral infant sleep intervention: randomized trial.

    Science.gov (United States)

    Price, Anna M H; Wake, Melissa; Ukoumunne, Obioha C; Hiscock, Harriet

    2012-10-01

    Randomized trials have demonstrated the short- to medium-term effectiveness of behavioral infant sleep interventions. However, concerns persist that they may harm children's emotional development and subsequent mental health. This study aimed to determine long-term harms and/or benefits of an infant behavioral sleep program at age 6 years on (1) child, (2) child-parent, and (3) maternal outcomes. Three hundred twenty-six children (173 intervention) with parent-reported sleep problems at age 7 months were selected from a population sample of 692 infants recruited from well-child centers. The study was a 5-year follow-up of a population-based cluster-randomized trial. Allocation was concealed and researchers (but not parents) were blinded to group allocation. Behavioral techniques were delivered over 1 to 3 individual nurse consultations at infant age 8 to 10 months, versus usual care. The main outcomes measured were (1) child mental health, sleep, psychosocial functioning, stress regulation; (2) child-parent relationship; and (3) maternal mental health and parenting styles. Two hundred twenty-five families (69%) participated. There was no evidence of differences between intervention and control families for any outcome, including (1) children's emotional (P = .8) and conduct behavior scores (P = .6), sleep problems (9% vs 7%, P = .2), sleep habits score (P = .4), parent- (P = .7) and child-reported (P = .8) psychosocial functioning, chronic stress (29% vs 22%, P = .4); (2) child-parent closeness (P = .1) and conflict (P = .4), global relationship (P = .9), disinhibited attachment (P = .3); and (3) parent depression, anxiety, and stress scores (P = .9) or authoritative parenting (63% vs 59%, P = .5). Behavioral sleep techniques have no marked long-lasting effects (positive or negative). Parents and health professionals can confidently use these techniques to reduce the short- to medium-term burden of infant sleep problems and maternal depression.

  10. Universal statistics of selected values

    Science.gov (United States)

    Smerlak, Matteo; Youssef, Ahmed

    2017-03-01

    Selection, the tendency of some traits to become more frequent than others under the influence of some (natural or artificial) agency, is a key component of Darwinian evolution and countless other natural and social phenomena. Yet a general theory of selection, analogous to the Fisher-Tippett-Gnedenko theory of extreme events, is lacking. Here we introduce a probabilistic definition of selection and show that selected values are attracted to a universal family of limiting distributions which generalize the log-normal distribution. The universality classes and scaling exponents are determined by the tail thickness of the random variable under selection. Our results provide a possible explanation for skewed distributions observed in diverse contexts where selection plays a key role, from molecular biology to agriculture and sport.

  11. Selfish spermatogonial selection

    DEFF Research Database (Denmark)

    Lim, Jasmine; Maher, Geoffrey J; Turner, Gareth D H

    2012-01-01

    The dominant congenital disorders Apert syndrome, achondroplasia and multiple endocrine neoplasia, caused by specific missense mutations in the FGFR2, FGFR3 and RET proteins respectively, represent classical examples of paternal age-effect mutation, a class that arises at particularly high frequencies in the sperm of older men. Previous analyses of DNA from randomly selected cadaveric testes showed that the levels of the corresponding FGFR2, FGFR3 and RET mutations exhibit very uneven spatial distributions, with localised hotspots surrounded by large mutation-negative areas. These studies imply...

  12. Bits, Bytes and Dinosaurs: Using Levinas and Freire to Address the Concept of "Twenty-First Century Learning"

    Science.gov (United States)

    Benade, Leon

    2015-01-01

    The discourse of twenty-first century learning argues that education should prepare students for successful living in the twenty-first century workplace and society. It challenges all educators with the idea that contemporary education is unable to do so, as it is designed to replicate an industrial age model, essentially rear-focused, rather than…

  13. Random maintenance policies

    CERN Document Server

    Nakagawa, Toshio

    2014-01-01

    Exploring random maintenance models, this book provides an introduction to the implementation of random maintenance, and it is one of the first books to be written on this subject.  It aims to help readers learn new techniques for applying random policies to actual reliability models, and it provides new theoretical analyses of various models including classical replacement, preventive maintenance and inspection policies. These policies are applied to scheduling problems, backup policies of database systems, maintenance policies of cumulative damage models, and reliability of random redundant systems. Reliability theory is a major concern for engineers and managers, and in light of Japan’s recent earthquake, the reliability of large-scale systems has increased in importance. This also highlights the need for a new notion of maintenance and reliability theory, and how this can practically be applied to systems. Providing an essential guide for engineers and managers specializing in reliability maintenance a...

  14. Drawing a random number

    DEFF Research Database (Denmark)

    Wanscher, Jørgen Bundgaard; Sørensen, Majken Vildrik

    2006-01-01

    highly uniform multidimensional draws, which are highly relevant for today's traffic models. This paper shows, among other things, that combined shuffling and scrambling seems needless, that scrambling gives the lowest correlation, and that there are detectable differences between random numbers, dependent...

  15. Spiders in random environment

    CERN Document Server

    Gallesco, Christophe; Popov, Serguei; Vachkovskaia, Marina

    2010-01-01

    A spider consists of several, say $N$, particles. Particles can jump independently according to a random walk if the movement does not violate some given restriction rules. If the movement violates a rule it is not carried out. We consider random walk in random environment (RWRE) on $\\Z$ as underlying random walk. We suppose the environment $\\omega=(\\omega_x)_{x \\in \\Z}$ to be elliptic, with positive drift and nestling, so that there exists a unique positive constant $\\kappa$ such that $\\E[((1-\\omega_0)/\\omega_0)^{\\kappa}]=1$. The restriction rules are kept very general; we only assume transitivity and irreducibility of the spider. The main result is that the speed of a spider is positive if $\\kappa/N>1$ and null if $\\kappa/N<1$. In particular, if $\\kappa/N <1$ a spider has null speed but the speed of a (single) RWRE is positive.

  16. Evaluation of the College Possible Program: Results from a Randomized

    Science.gov (United States)

    Avery, Christopher

    2013-01-01

    This paper reports the results of a randomized trial of the College Possible program, which provides two years of college preparatory work for high school juniors and seniors in Minneapolis and St. Paul. The trial involved 238 students, including 134 who were randomly selected for admission to the program. The results indicate that the College…

  17. Taming random lasers through active spatial control of the pump.

    Science.gov (United States)

    Bachelard, N; Andreasen, J; Gigan, S; Sebbah, P

    2012-07-20

    Active control of the spatial pump profile is proposed to exercise control over random laser emission. We demonstrate numerically the selection of any desired lasing mode from the emission spectrum. An iterative optimization method is employed, first in the regime of strong scattering where modes are spatially localized and can be easily selected using local pumping. Remarkably, this method works efficiently even in the weakly scattering regime, where strong spatial overlap of the modes precludes spatial selectivity. A complex optimized pump profile is found, which selects the desired lasing mode at the expense of others, thus demonstrating the potential of pump shaping for robust and controllable single mode operation of a random laser.

  18. Taming random lasers through active spatial control of the pump

    CERN Document Server

    Bachelard, Nicolas; Gigan, Sylvain; Sebbah, Patrick

    2012-01-01

    Active control of the pump spatial profile is proposed to exercise control over random laser emission. We demonstrate numerically the selection of any desired lasing mode from the emission spectrum. An iterative optimization method is employed, first in the regime of strong scattering where modes are spatially localized and can be easily selected using local pumping. Remarkably, this method works efficiently even in the weakly scattering regime, where strong spatial overlap of the modes precludes spatial selectivity. A complex optimized pump profile is found, which selects the desired lasing mode at the expense of others, thus demonstrating the potential of pump shaping for robust and controllable singlemode operation of a random laser.

  19. Motivational Interviewing as a Supervision Strategy in Probation: A Randomized Effectiveness Trial

    Science.gov (United States)

    Walters, Scott T.; Vader, Amanda M.; Nguyen, Norma; Harris, T. Robert; Eells, Jennifer

    2010-01-01

    Motivational interviewing (MI) has been recommended as a supervision style in probation. This project examined the effectiveness of an MI training curriculum on probation officer MI skill and subsequent probationer outcome. Twenty probation officers were randomized to receive MI training, or to a waiting list control, while an additional group of…

  20. Does epicatechin contribute to the acute vascular function effects of dark chocolate? A randomized, crossover study

    NARCIS (Netherlands)

    Dower, James I.; Geleijnse, Marianne; Kroon, Paul A.; Philo, Mark; Mensink, Marco; Kromhout, Daan; Hollman, Peter C.H.

    2016-01-01

    Scope: Cocoa, rich in flavan-3-ols, improves vascular function, but the contribution of specific flavan-3-ols is unknown. We compared the effects of pure epicatechin, a major cocoa flavan-3-ol, and chocolate. Methods and results: In a randomized crossover study, twenty healthy men (40-80 years)

  1. Does epicatechin contribute to the acute vascular function effects of dark chocolate? A randomized, crossover study

    NARCIS (Netherlands)

    Dower, James I.; Geleijnse, Marianne; Kroon, Paul A.; Philo, Mark; Mensink, Marco; Kromhout, Daan; Hollman, Peter C.H.

    2016-01-01

    Scope: Cocoa, rich in flavan-3-ols, improves vascular function, but the contribution of specific flavan-3-ols is unknown. We compared the effects of pure epicatechin, a major cocoa flavan-3-ol, and chocolate. Methods and results: In a randomized crossover study, twenty healthy men (40-80 years) w

  2. EMDR versus CBT for children with self-esteem and behavioral problems: a randomized controlled trial

    NARCIS (Netherlands)

    F. Wanders; M. Serra; A. de Jongh

    2008-01-01

    This study compared eye movement desensitization and reprocessing (EMDR) with cognitive-behavioral therapy (CBT). Twenty-six children (average age 10.4 years) with behavioral problems were randomly assigned to receive either 4 sessions of EMDR or CBT prior to usual treatment provided in outpatient a

  3. Therapeutic effect of pirenzepine for clozapine-induced hypersalivation: a randomized, double-blind, placebo-controlled, cross-over study.

    Science.gov (United States)

    Bai, Y M; Lin, C C; Chen, J Y; Liu, W C

    2001-12-01

    The objective of this study was to investigate the efficacy of pirenzepine in the treatment of clozapine-induced hypersalivation. Pirenzepine is reported to counteract hypersalivation by its selective antagonistic activity on the M4-muscarinic receptor, which is stimulated by clozapine. Twenty patients with clozapine-induced hypersalivation underwent a random-order, double-blind, placebo-controlled, cross-over trial which lasted 8 weeks each for the pirenzepine and placebo investigations, with a 4-week washout period in between. The severity of hypersalivation was assessed using an objective measure: saliva production monitored through the diameter of wetted surface on tissue paper placed over the patient's pillow. Our study showed that pirenzepine had no significant therapeutic effect on hypersalivation compared with placebo, suggesting that hypersalivation induced by clozapine might have a neurobiological basis other than the M4-muscarinic receptor.

  4. Random subspaces in quantum information theory

    Science.gov (United States)

    Hayden, Patrick

    2005-03-01

    The selection of random unitary transformations plays a role in quantum information theory analogous to the role of random hash functions in classical information theory. Recent applications have included protocols achieving the quantum channel capacity and methods for extending superdense coding from bits to qubits. In addition, the corresponding random subspaces have proved useful for studying the structure of bipartite and multipartite entanglement. In quantum information theory, we're fond of saying that Hilbert space is a big place, the implication being that there's room for the unexpected to occur. The goal of this talk is to further bolster this homespun wisdom. I'm going to present a number of results in quantum information theory that stem from the initially counterintuitive geometry of high-dimensional vector spaces, where subspaces with highly extremal properties are the norm rather than the exception. Peter Shor has shown, for example, that randomly selected subspaces can be used to send quantum information through a noisy quantum channel at the highest possible rate, that is, the quantum channel capacity. More recently, Debbie Leung, Andreas Winter and I demonstrated that a randomly chosen subspace of a bipartite quantum system will likely contain nothing but nearly maximally entangled states, even if the subspace is nearly as large as the original system in qubit terms. This observation has implications for communication, especially superdense coding.

  5. LSPI with Random Projections

    OpenAIRE

    2010-01-01

    We consider the problem of reinforcement learning in high-dimensional spaces when the number of features is bigger than the number of samples. In particular, we study the least-squares temporal difference (LSTD) learning algorithm when a space of low dimension is generated with a random projection from a high-dimensional space. We provide a thorough theoretical analysis of the LSTD with random projections and derive performance bounds for the resulting algorithm. We also show how the error of...

  6. Random unistochastic matrices

    OpenAIRE

    Zyczkowski, K.; Slomczynski, W.; Kus, M.; Sommers, H. -J.

    2001-01-01

    An ensemble of random unistochastic (orthostochastic) matrices is defined by taking squared moduli of elements of random unitary (orthogonal) matrices distributed according to the Haar measure on U(N) (or O(N), respectively). An ensemble of symmetric unistochastic matrices is obtained with use of unitary symmetric matrices pertaining to the circular orthogonal ensemble. We study the distribution of complex eigenvalues of bistochastic, unistochastic and orthostochastic matrices in the complex p...
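
    A minimal sketch of the construction described in this record, assuming NumPy: a Haar-distributed random unitary is obtained by QR-decomposing a complex Ginibre matrix (with a phase correction), and the squared moduli of its entries give a random unistochastic matrix.

        import numpy as np

        def haar_unitary(n, rng=None):
            rng = rng or np.random.default_rng()
            # Complex Ginibre matrix -> QR decomposition -> phase fix yields the Haar measure
            z = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2.0)
            q, r = np.linalg.qr(z)
            phases = np.diag(r) / np.abs(np.diag(r))
            return q * phases              # multiply column j of q by the phase of r[j, j]

        def random_unistochastic(n, rng=None):
            u = haar_unitary(n, rng)
            return np.abs(u) ** 2          # squared moduli of a unitary form a bistochastic matrix

        B = random_unistochastic(4)
        print(B.sum(axis=0), B.sum(axis=1))   # rows and columns each sum to 1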

  7. Quantum randomness and unpredictability

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, Gregg [Quantum Communication and Measurement Laboratory, Department of Electrical and Computer Engineering and Division of Natural Science and Mathematics, Boston University, Boston, MA (United States)

    2017-06-15

    Quantum mechanics is a physical theory supplying probabilities corresponding to expectation values for measurement outcomes. Indeed, its formalism can be constructed with measurement as a fundamental process, as was done by Schwinger, provided that individual measurement outcomes occur in a random way. The randomness appearing in quantum mechanics, as with other forms of randomness, has often been considered equivalent to a form of indeterminism. Here, it is argued that quantum randomness should instead be understood as a form of unpredictability because, amongst other things, indeterminism is not a necessary condition for randomness. For concreteness, an explication of the randomness of quantum mechanics as the unpredictability of quantum measurement outcomes is provided. Finally, it is shown how this view can be combined with the recently introduced view that the very appearance of individual quantum measurement outcomes can be grounded in the Plenitude principle of Leibniz, a principle variants of which have been utilized in physics by Dirac and Gell-Mann in relation to the fundamental processes. This move provides further support to Schwinger's "symbolic" derivation of quantum mechanics from measurement. (copyright 2016 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  8. Response to family selection and genetic parameters in Japanese quail selected for four week breast weight

    DEFF Research Database (Denmark)

    Khaldari, Majid; Yeganeh, Hassan Mehrabani; Pakdel, Abbas;

    2011-01-01

    An experiment was conducted to investigate the effect of short-term selection for 4 week breast weight (4wk BRW) and to estimate genetic parameters of body weight and carcass traits. A selection (S) line and a control (C) line were randomly selected from a base population. Data were collected over...

  9. Introducing random matrix theory into underwater sound propagation

    CERN Document Server

    Hegewisch, Katherine C

    2011-01-01

    Ocean acoustic propagation can be formulated as a wave guide with a weakly random medium generating multiple scattering. Twenty years ago, this was recognized as a quantum chaos problem, and yet random matrix theory, one pillar of quantum or wave chaos studies, has never been introduced into the subject. The modes of the wave guide provide a representation for the propagation, which in the parabolic approximation is unitary. Scattering induced by the ocean's internal waves leads to a power-law random banded unitary matrix ensemble for long-range deep ocean acoustic propagation. The ensemble has similarities, but differs, from those introduced for studying the Anderson metal-insulator transition. The resulting long-range propagation ensemble statistics agree well with those of full wave propagation using the parabolic equation.

  10. Establishing the R&D agenda for twenty-first century learning.

    Science.gov (United States)

    Kay, Ken; Honey, Margaret

    2006-01-01

    An infusion of twenty-first century skills into American public education necessitates a plan for research and development to further such reform. While the nation agrees that students must obtain critical thinking, problem-solving, and communication skills to succeed in the current global marketplace, this chapter puts forth a long-term, proactive agenda to invest in targeted research to propel and sustain this shift in education. The authors examine the impact such an R&D agenda would have on pedagogy and assessment and the implications for institutions of higher education. As the United States struggles to maintain dominance in the international economy, it faces a great challenge in keeping up with European and Asian competitors' strategies for preparing youth for the global marketplace. The authors hope the global reality will help contextualize the debate around American education--the current trend toward basics and accountability needs to be broadened. Building on frameworks created by the Partnership for 21st Century Skills, this chapter proposes questions to guide research around teaching, professional development, and assessment significant to twenty-first century skills. Knowing that educational change depends on providing teachers with the tools, support, and training to make fundamental changes in their practice, the authors argue for extensive research around best practices. In addition, if assessments are created to measure the desired outcomes, such measuring tools can drive reform. Furthermore, large-scale changes in teacher preparation programs must take place to allow teachers to adequately employ twenty-first century teaching and assessment strategies.

  11. Why American business demands twenty-first century skills: an industry perspective.

    Science.gov (United States)

    Bruett, Karen

    2006-01-01

    Public education is the key to individual and business prosperity. With a vested stake in education, educators, employers, parents, policymakers, and the public should question how this nation's public education system is faring. Knowing that recent international assessments have shown little or no gains in American students' achievement, the author asserts the clear need for change. As both a large American corporate employer and a provider of technology for schools, Dell is concerned with ensuring that youth will thrive in their adult lives. Changing workplace expectations lead to a new list of skills students will need to acquire before completing their schooling. Through technology, Dell supports schools in meeting educational goals, striving to supply students with the necessary skills, referred to as twenty-first century skills. The Partnership for 21st Century Skills, of which Dell is a member, has led an initiative to define what twenty-first century learning should entail. Through extensive research, the partnership has built a framework outlining twenty-first century skills: analytical thinking, communication, collaboration, global awareness, and technological and economic literacy. Dell and the partnership are working state by state to promote the integration of these skills into curricula, professional development for teachers, and classroom environments. The authors describe two current initiatives, one in Virginia, the other in Texas, which both use technology to help student learning. All stakeholders can take part in preparing young people to compete in the global economy. Educators and administrators, legislators, parents, and employers must play their role in helping students be ready for what the workforce and the world has in store for them.

  12. The design and performance of a twenty barrel hydrogen pellet injector for Alcator C-Mod

    Energy Technology Data Exchange (ETDEWEB)

    Urbahn, John A. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    1994-05-01

    A twenty barrel hydrogen pellet injector has been designed, built and tested both in the laboratory and on the Alcator C-Mod Tokamak at MIT. The injector functions by firing pellets of frozen hydrogen or deuterium deep into the plasma discharge for the purpose of fueling the plasma, modifying the density profile and increasing the global energy confinement time. The design goals of the injector are: (1) Operational flexibility, (2) High reliability, (3) Remote operation with minimal maintenance. These requirements have led to a single stage, pipe gun design with twenty barrels. Pellets are formed by in situ condensation of the fuel gas, thus avoiding moving parts at cryogenic temperatures. The injector is the first to dispense with the need for cryogenic fluids and instead uses a closed cycle refrigerator to cool the thermal system components. The twenty barrels of the injector produce pellets of four different size groups and allow for a high degree of flexibility in fueling experiments. Operation of the injector is under PLC control allowing for remote operation, interlocked safety features and automated pellet manufacturing. The injector has been extensively tested and shown to produce pellets reliably with velocities up to 1400 m/sec. During the period from September to November of 1993, the injector was successfully used to fire pellets into over fifty plasma discharges. Experimental results include data on the pellet penetration into the plasma using an advanced pellet tracking diagnostic with improved time and spatial response. Data from the tracker indicates pellet penetrations were between 30 and 86 percent of the plasma minor radius.

  13. The frequency of drugs in randomly selected drivers in Denmark

    DEFF Research Database (Denmark)

    Simonsen, Kirsten Wiese; Steentoft, Anni; Hels, Tove

    the most frequent illicit drugs detected above the limit of quantitation (LOQ), while codeine, tramadol, zopiclone, and benzodiazepines were the most frequent legal drugs. Middle-aged men (median age 47.5 years) dominated the drunk driving group, while the drivers positive for illegal drugs consisted... mainly of young men (median age 26 years). Middle-aged women (median age 44.5 years) often tested positive for benzodiazepines at concentrations exceeding the legal limits. Interestingly, 0.6% of drivers tested positive for tramadol at concentrations above the DRUID cut-off, although tramadol...

  14. Testing, Selection, and Implementation of Random Number Generators

    Science.gov (United States)

    2008-07-01

    and the well-equidistributed long-period linear (9) RNG well1024a. RNGs due to Marsaglia are multiply-with-carry (MWC) variations cmwc4096, mwc, mwcx... Generators Based on Linear Recurrences Modulo 2. ACM Transactions on Mathematical Software 2006, 32 (1), 1–16. 10. Marsaglia, G. Xorshift RNGs... Gaithersburg, MD, 2001. 13. Marsaglia, G. DieHard Home Page. http://stat.fsu.edu (accessed January 2008), path: pub/diehard. 14. U.S. Army Research
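
    As a concrete reference point for the xorshift family cited in this record, the sketch below implements Marsaglia's 32-bit xorshift recurrence with the (13, 17, 5) shift triple; it is an illustrative sketch only, not one of the specific generators evaluated in the report.

        MASK32 = 0xFFFFFFFF

        def xorshift32(seed):
            # Marsaglia-style 32-bit xorshift generator using the (13, 17, 5) shift triple
            x = seed & MASK32
            if x == 0:
                raise ValueError("seed must be non-zero")
            while True:
                x ^= (x << 13) & MASK32
                x ^= x >> 17
                x ^= (x << 5) & MASK32
                yield x

        gen = xorshift32(2463534242)
        print([next(gen) for _ in range(5)])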

  15. Clinical Research Methodology 3: Randomized Controlled Trials.

    Science.gov (United States)

    Sessler, Daniel I; Imrey, Peter B

    2015-10-01

    Randomized assignment of treatment excludes reverse causation and selection bias and, in sufficiently large studies, effectively prevents confounding. Well-implemented blinding prevents measurement bias. Studies that include these protections are called randomized, blinded clinical trials and, when conducted with sufficient numbers of patients, provide the most valid results. Although conceptually straightforward, design of clinical trials requires thoughtful trade-offs among competing approaches, all of which influence the number of patients required, enrollment time, internal and external validity, ability to evaluate interactions among treatments, and cost.
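
    As a small illustration of randomized treatment assignment (a generic sketch, not a description of any particular trial's procedure), permuted-block randomization can be written as follows; the block size, arm labels and seed are arbitrary.

        import random

        def block_randomize(n_patients, arms=("treatment", "control"), block_size=4, seed=42):
            # Permuted-block randomization keeps the arms balanced within every block
            rng = random.Random(seed)
            per_block = block_size // len(arms)
            allocation = []
            while len(allocation) < n_patients:
                block = list(arms) * per_block
                rng.shuffle(block)
                allocation.extend(block)
            return allocation[:n_patients]

        print(block_randomize(10))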

  16. Brain Tumor Segmentation Based on Random Forest

    Directory of Open Access Journals (Sweden)

    László Lefkovits

    2016-09-01

    In this article we present a discriminative model for tumor detection from multimodal MR images. The main part of the model is built around a random forest (RF) classifier. We created an optimization algorithm able to select the important features and thereby reduce the dimensionality of the data; the method is also used to determine the training parameters used in the learning phase. The algorithm evaluates variable importance from random feature properties, the evolution of learning errors, and the proximities between instances. The detection performance obtained has been compared with that of the most recent systems and is similar.
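
    The importance-driven feature selection described in this abstract can be illustrated generically. The sketch below is not the authors' optimization algorithm, just a minimal Python example of training a random forest, ranking features by impurity-based importance, and retraining on the strongest ones; the synthetic data, feature counts, and the choice of keeping 15 features are assumptions made for the example.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier

        # Synthetic stand-in for per-voxel feature vectors from multimodal MR images.
        X, y = make_classification(n_samples=500, n_features=60, n_informative=10,
                                   random_state=0)

        rf = RandomForestClassifier(n_estimators=200, random_state=0)
        rf.fit(X, y)

        # Rank features by impurity-based importance and keep the 15 strongest,
        # reducing dimensionality before retraining on the smaller feature set.
        top = np.argsort(rf.feature_importances_)[::-1][:15]
        rf_reduced = RandomForestClassifier(n_estimators=200, random_state=0)
        rf_reduced.fit(X[:, top], y)
        print("selected feature indices:", sorted(top.tolist()))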

  17. Effects of energy constraints on transportation systems. [Twenty-six papers

    Energy Technology Data Exchange (ETDEWEB)

    Mittal, R. K. [ed.]

    1977-12-01

    Twenty-six papers are presented on a variety of topics including: energy and transportation facts and figures; long-range planning under energy constraints; technology assessment of alternative fuels; energy efficiency of intercity passenger and freight movement; energy efficiency of intracity passenger movement; the federal role; electrification of railroads; the energy impact of the electric car in an urban environment; research needs and projects in progress (federal viewpoint); research needs in transportation energy conservation (data needs); and the energy intensity of various transportation modes (an overview). A separate abstract was prepared for each of the papers for inclusion in Energy Research Abstracts (ERA) and in Energy Abstracts for Policy Analysis (EAPA).

  18. Managing the twenty-first century reference department challenges and prospects

    CERN Document Server

    Katz, Linda S

    2014-01-01

    Learn the skills needed to update and manage a reference department that efficiently meets the needs of clients today, and tomorrow! Managing the Twenty-First Century Reference Department: Challenges and Prospects provides librarians with the knowledge and skills they need to manage an effective reference service. Full of useful and practical ideas, this book presents successful methods for recruiting and retaining capable reference department staff and management, training new employees, and adapting current services to an evolving field. Expert practitioners address the changing role of the r

  19. [Discirculatory encephalopathy in liquidators of the Chernobyl nuclear power station: a twenty-year study].

    Science.gov (United States)

    Podsonnaia, I V; Shumakher, G I; Golovin, V A

    2009-01-01

    A comparative twenty-year study of 536 liquidators of the Chernobyl nuclear disaster and 436 patients without a radiation anamnesis has been carried out. Discirculatory encephalopathy (DE) developed more often in subjects exposed to radiation at age 30 years. Compared with DE in the general population, it is characterized by an earlier onset, a malignant course, a rapid increase in signs of cerebral involvement during the first two years after exposure, stability of clinical symptoms during the following 5-6 years, and subsequent progressive cerebral decompensation with early autonomic dysfunction, psychoorganic syndrome, and epilepsy. Moreover, severe stroke is a common complication of DE in liquidators.

  20. Digital images and art libraries in the twenty-first century

    CERN Document Server

    Wyngaard, Susan

    2013-01-01

    Increase your knowledge of the digital technology that is essential for art librarianship today! Digital Images and Art Libraries in the Twenty-First Century is your key to cutting-edge discourse on digital image databases and art libraries. Just as early photographers tried to capture the world to make it accessible, now information professionals in art libraries and art museums are creating and sharing digital collections to make them broadly accessible. This collection shares the experience and insight of art information managers who have taken advantage of digital technology to exp