WorldWideScience

Sample records for twenty randomly selected

  1. Blocked Randomization with Randomly Selected Block Sizes

    Directory of Open Access Journals (Sweden)

    Jimmy Efird

    2010-12-01

    Full Text Available When planning a randomized clinical trial, careful consideration must be given to how participants are selected for various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has equal likelihood of being assigned to treatment versus referent groups. However, by chance an unequal number of individuals may be assigned to each arm of the study and thus decrease the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Yet still, the allocation process may be predictable, for example, when the investigator is not blind and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes.
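The blocking scheme described above can be illustrated with a short sketch. This is not the paper's code; the function name, the arm labels, and the set of block sizes are illustrative assumptions. A hypothetical allocator that draws each block's size at random from a set of even sizes, so the sequence stays balanced while the next assignment remains hard to predict even for an unblinded investigator:

```python
import random

def blocked_randomization(n_participants, block_sizes=(2, 4, 6), seed=0):
    """Assign participants to arms 'A'/'B' in balanced blocks whose sizes
    are drawn at random from block_sizes (all even), so the running
    imbalance never exceeds half the largest block."""
    rng = random.Random(seed)
    assignments = []
    while len(assignments) < n_participants:
        size = rng.choice(block_sizes)        # random, even block size
        block = ['A'] * (size // 2) + ['B'] * (size // 2)
        rng.shuffle(block)                    # balanced within the block
        assignments.extend(block)
    return assignments[:n_participants]

arms = blocked_randomization(20)
```

With fixed block sizes an unblinded investigator can deduce the final assignments of each block; drawing the size at random removes that predictability while keeping near-balance.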

  2. Minimization over randomly selected lines

    Directory of Open Access Journals (Sweden)

    Ismet Sahin

    2013-07-01

Full Text Available This paper presents a population-based evolutionary optimization method for minimizing a given cost function. The mutation operator of this method selects randomly oriented lines in the cost function domain, constructs quadratic functions interpolating the cost function at three different points over each line, and uses the extrema of the quadratics as mutated points. The crossover operator modifies each mutated point based on components of two points in the population, rather than a single point as is usual in other evolutionary algorithms. The stopping criterion of this method depends on the number of almost degenerate quadratics. We demonstrate that the proposed method with these mutation and crossover operations achieves faster and more robust convergence than the well-known Differential Evolution and Particle Swarm algorithms.
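The mutation step described above (random line, three-point quadratic interpolation, jump to the extremum) admits a minimal sketch. This is not the authors' implementation; the function name, step size, degeneracy guard, and acceptance rule are assumptions:

```python
import numpy as np

def quadratic_line_step(f, x, rng, step=1.0):
    """One mutation step: minimize f along a randomly oriented line through x
    by fitting a quadratic through three samples and jumping to its vertex."""
    d = rng.standard_normal(x.shape)
    d /= np.linalg.norm(d)                      # random unit direction
    ts = np.array([-step, 0.0, step])
    ys = np.array([f(x + t * d) for t in ts])
    a, b, _ = np.polyfit(ts, ys, 2)             # fit y = a t^2 + b t + c
    if a <= 1e-12:                              # (almost) degenerate quadratic
        return x
    t_star = -b / (2 * a)                       # vertex of the parabola
    x_new = x + t_star * d
    return x_new if f(x_new) < f(x) else x

rng = np.random.default_rng(0)
x = np.array([3.0, -2.0])
for _ in range(200):
    x = quadratic_line_step(lambda v: np.sum(v ** 2), x, rng)
```

For a quadratic cost such as the sum of squares used here, the three-point fit is exact along every line, so each accepted step removes the component of x along the sampled direction and the iterates converge to the minimizer.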

  3. High Entropy Random Selection Protocols

    NARCIS (Netherlands)

    H. Buhrman (Harry); M. Christandl (Matthias); M. Koucky (Michal); Z. Lotker (Zvi); B. Patt-Shamir; M. Charikar; K. Jansen; O. Reingold; J. Rolim

    2007-01-01

In this paper, we construct protocols for two parties that do not trust each other to generate random variables with high Shannon entropy. We improve known bounds for the trade-off between the number of rounds, the length of communication, and the entropy of the outcome.

  4. Book review: Twenty-Five Years on the Cutting Edge of Obsidian Studies: Selected Readings from the IAOS Bulletin

    Directory of Open Access Journals (Sweden)

    Sean Dolan

    2017-03-01

Full Text Available Edited by Carolyn D. Dillian (Coastal Carolina University), Twenty-Five Years on the Cutting Edge of Obsidian Studies: Selected Readings from the IAOS Bulletin consists of 19 previously published articles from the International Association for Obsidian Studies (IAOS) Bulletin. Dillian selected these articles because they provide a range of methodological and theoretical approaches to archaeological obsidian studies from around the world, for example Eretria, California, and the Near East.

  5. 47 CFR 1.1603 - Conduct of random selection.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Conduct of random selection. 1.1603 Section 1.1603 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1603 Conduct of random selection. The...

  6. 47 CFR 1.1602 - Designation for random selection.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Designation for random selection. 1.1602 Section 1.1602 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1602 Designation for random selection...

  7. Testing, Selection, and Implementation of Random Number Generators

    National Research Council Canada - National Science Library

    Collins, Joseph C

    2008-01-01

    An exhaustive evaluation of state-of-the-art random number generators with several well-known suites of tests provides the basis for selection of suitable random number generators for use in stochastic simulations...

  8. Random effect selection in generalised linear models

    DEFF Research Database (Denmark)

    Denwood, Matt; Houe, Hans; Forkman, Björn

We analysed abattoir recordings of meat inspection codes with possible relevance to on-farm animal welfare in cattle. Random-effects logistic regression models were used to describe individual-level data obtained from 461,406 cattle slaughtered in Denmark. Our results demonstrate that the largest...

  9. Interference-aware random beam selection for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed M.; Sayed, Mostafa M.; Alouini, Mohamed-Slim; Qaraqe, Khalid A.

    2012-01-01

In this paper, we develop interference-aware random beam selection schemes that provide enhanced throughput for the secondary link under the condition that the interference observed at the primary link is within a predetermined acceptable value...

  10. Selectivity and sparseness in randomly connected balanced networks.

    Directory of Open Access Journals (Sweden)

    Cengiz Pehlevan

Full Text Available Neurons in sensory cortex show stimulus selectivity and sparse population response, even in cases where no strong functionally specific structure in connectivity can be detected. This raises the question whether selectivity and sparseness can be generated and maintained in randomly connected networks. We consider a recurrent network of excitatory and inhibitory spiking neurons with random connectivity, driven by random projections from an input layer of stimulus selective neurons. In this architecture, the stimulus-to-stimulus and neuron-to-neuron modulation of total synaptic input is weak compared to the mean input. Surprisingly, we show that in the balanced state the network can still support high stimulus selectivity and sparse population response. In the balanced state, strong synapses amplify the variation in synaptic input and recurrent inhibition cancels the mean. Functional specificity in connectivity emerges due to the inhomogeneity caused by the generative statistical rule used to build the network. We further elucidate the mechanism behind this behavior and evaluate the effects of model parameters on population sparseness and stimulus selectivity. Network response to mixtures of stimuli is investigated. It is shown that a balanced state with unselective inhibition can be achieved with densely connected input to the inhibitory population. Balanced networks exhibit the "paradoxical" effect: an increase in excitatory drive to inhibition leads to a decreased inhibitory population firing rate. We compare and contrast selectivity and sparseness generated by the balanced network to randomly connected unbalanced networks. Finally, we discuss our results in light of experiments.

  11. The signature of positive selection at randomly chosen loci.

    OpenAIRE

    Przeworski, Molly

    2002-01-01

In Drosophila and humans, there are accumulating examples of loci with a significant excess of high-frequency-derived alleles or high levels of linkage disequilibrium, relative to a neutral model of a random-mating population of constant size. These are features expected after a recent selective sweep. Their prevalence suggests that positive directional selection may be widespread in both species. However, as I show here, these features do not persist long after the sweep ends: The high-frequency alleles drift to fixation and no longer contribute to polymorphism, while linkage disequilibrium is broken down by recombination...

  12. The reliability of randomly selected final year pharmacy students in ...

    African Journals Online (AJOL)

    Employing ANOVA, factorial experimental analysis, and the theory of error, reliability studies were conducted on the assessment of the drug product chloroquine phosphate tablets. The G–Study employed equal numbers of the factors for uniform control, and involved three analysts (randomly selected final year Pharmacy ...

  13. Local randomization in neighbor selection improves PRM roadmap quality

    KAUST Repository

    McMahon, Troy

    2012-10-01

Probabilistic Roadmap Methods (PRMs) are one of the most used classes of motion planning methods. These sampling-based methods generate robot configurations (nodes) and then connect them to form a graph (roadmap) containing representative feasible pathways. A key step in PRM roadmap construction involves identifying a set of candidate neighbors for each node. Traditionally, these candidates are chosen to be the k-closest nodes based on a given distance metric. In this paper, we propose a new neighbor selection policy called LocalRand(k,K'), that first computes the K' closest nodes to a specified node and then selects k of those nodes at random. Intuitively, LocalRand attempts to benefit from random sampling while maintaining the higher levels of local planner success inherent to selecting more local neighbors. We provide a methodology for selecting the parameters k and K'. We perform an experimental comparison which shows that for both rigid and articulated robots, LocalRand results in roadmaps that are better connected than the traditional k-closest policy or a purely random neighbor selection policy. The cost required to achieve these results is shown to be comparable to k-closest. © 2012 IEEE.
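The LocalRand(k, K') policy lends itself to a compact sketch: take the K' closest candidates, then keep k of them at random. The following is an illustrative Python version, not the authors' code; the Euclidean metric and all names are assumptions:

```python
import math
import random

def local_rand_neighbors(nodes, q, k, K, rng):
    """LocalRand(k, K'): compute the K (= K') nodes closest to q under the
    Euclidean metric, then return k of them chosen uniformly at random."""
    by_dist = sorted((n for n in nodes if n != q),
                     key=lambda n: math.dist(n, q))
    return rng.sample(by_dist[:K], k)

rng = random.Random(1)
nodes = [(rng.random(), rng.random()) for _ in range(50)]  # toy 2-D configs
cands = local_rand_neighbors(nodes, nodes[0], k=3, K=10, rng=rng)
```

Setting K equal to k recovers the classical k-closest policy, while letting K approach the number of nodes recovers purely random neighbor selection; intermediate values trade between the two, as the abstract describes.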

  14. Local randomization in neighbor selection improves PRM roadmap quality

    KAUST Repository

    McMahon, Troy; Jacobs, Sam; Boyd, Bryan; Tapia, Lydia; Amato, Nancy M.

    2012-01-01

    Probabilistic Roadmap Methods (PRMs) are one of the most used classes of motion planning methods. These sampling-based methods generate robot configurations (nodes) and then connect them to form a graph (roadmap) containing representative feasible pathways. A key step in PRM roadmap construction involves identifying a set of candidate neighbors for each node. Traditionally, these candidates are chosen to be the k-closest nodes based on a given distance metric. In this paper, we propose a new neighbor selection policy called LocalRand(k,K'), that first computes the K' closest nodes to a specified node and then selects k of those nodes at random. Intuitively, LocalRand attempts to benefit from random sampling while maintaining the higher levels of local planner success inherent to selecting more local neighbors. We provide a methodology for selecting the parameters k and K'. We perform an experimental comparison which shows that for both rigid and articulated robots, LocalRand results in roadmaps that are better connected than the traditional k-closest policy or a purely random neighbor selection policy. The cost required to achieve these results is shown to be comparable to k-closest. © 2012 IEEE.

  15. Selection for altruism through random drift in variable size populations

    Directory of Open Access Journals (Sweden)

    Houchmandzadeh Bahram

    2012-05-01

Full Text Available Background Altruistic behavior is defined as helping others at a cost to oneself and a lowered fitness. The lower fitness implies that altruists should be selected against, which is in contradiction with their widespread presence in nature. Present models of selection for altruism (kin or multilevel) show that altruistic behaviors can have 'hidden' advantages if the 'common good' produced by altruists is restricted to some related or unrelated groups. These models are mostly deterministic, or assume a frequency-dependent fitness. Results Evolutionary dynamics is a competition between deterministic selection pressure and stochastic events due to random sampling from one generation to the next. We show here that an altruistic allele extending the carrying capacity of the habitat can win by increasing the random drift of "selfish" alleles. In other terms, the fixation probability of altruistic genes can be higher than that of selfish ones, even though altruists have a smaller fitness. Moreover, when populations are geographically structured, the altruists' advantage can be highly amplified and the fixation probability of selfish genes can tend toward zero. The above results are obtained both by numerical and analytical calculations. Analytical results are obtained in the limit of large populations. Conclusions The theory we present does not involve kin or multilevel selection, but is based on the existence of random drift in variable size populations. The model is a generalization of the original Fisher-Wright and Moran models where the carrying capacity depends on the number of altruists.

  16. Interference-aware random beam selection for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed M.

    2012-09-01

Spectrum sharing systems have been introduced to alleviate the problem of spectrum scarcity by allowing secondary unlicensed networks to share the spectrum with primary licensed networks under acceptable interference levels to the primary users. In this paper, we develop interference-aware random beam selection schemes that provide enhanced throughput for the secondary link under the condition that the interference observed at the primary link is within a predetermined acceptable value. For a secondary transmitter equipped with multiple antennas, our schemes select a random beam, among a set of power-optimized orthogonal random beams, that maximizes the capacity of the secondary link while satisfying the interference constraint at the primary receiver for different levels of feedback information describing the interference level at the primary receiver. For the proposed schemes, we develop a statistical analysis for the signal-to-noise and interference ratio (SINR) statistics as well as the capacity of the secondary link. Finally, we present numerical results that study the effect of system parameters including the number of beams and the maximum transmission power on the capacity of the secondary link attained using the proposed schemes. © 2012 IEEE.
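The selection rule described in this record, maximize secondary capacity subject to an interference cap at the primary receiver, can be sketched as below. This is a simplified single-user illustration, not the paper's scheme; the channel model, function names, and parameter values are assumptions:

```python
import numpy as np

def select_beam(H_sec, h_prim, beams, power, i_max):
    """Discard beams whose interference at the primary receiver exceeds
    i_max, then return the index and capacity of the remaining beam that
    maximizes secondary capacity (None, 0.0 if no beam qualifies)."""
    best, best_cap = None, 0.0
    for i, w in enumerate(beams):
        if power * abs(h_prim @ w) ** 2 > i_max:   # interference constraint
            continue
        snr = power * np.linalg.norm(H_sec @ w) ** 2
        cap = float(np.log2(1.0 + snr))
        if cap > best_cap:
            best, best_cap = i, cap
    return best, best_cap

rng = np.random.default_rng(2)
n_tx = 4                                           # secondary Tx antennas
beams = [w / np.linalg.norm(w) for w in
         rng.standard_normal((3, n_tx)) + 1j * rng.standard_normal((3, n_tx))]
h_prim = rng.standard_normal(n_tx) + 1j * rng.standard_normal(n_tx)
H_sec = rng.standard_normal((2, n_tx)) + 1j * rng.standard_normal((2, n_tx))
idx, cap = select_beam(H_sec, h_prim, beams, power=1.0, i_max=5.0)
```

Tightening i_max shrinks the feasible beam set (possibly to empty), which is exactly the throughput-versus-interference trade-off the abstract analyzes.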

  17. From Protocols to Publications: A Study in Selective Reporting of Outcomes in Randomized Trials in Oncology

    Science.gov (United States)

    Raghav, Kanwal Pratap Singh; Mahajan, Sminil; Yao, James C.; Hobbs, Brian P.; Berry, Donald A.; Pentz, Rebecca D.; Tam, Alda; Hong, Waun K.; Ellis, Lee M.; Abbruzzese, James; Overman, Michael J.

    2015-01-01

Purpose The decision by journals to append protocols to published reports of randomized trials was a landmark event in clinical trial reporting. However, limited information is available on how this initiative affected transparency and selective reporting of clinical trial data. Methods We analyzed 74 oncology-based randomized trials published in Journal of Clinical Oncology, the New England Journal of Medicine, and The Lancet in 2012. To ascertain integrity of reporting, we compared published reports with their respective appended protocols with regard to primary end points, nonprimary end points, unplanned end points, and unplanned analyses. Results A total of 86 primary end points were reported in 74 randomized trials; nine trials had more than one primary end point. Nine trials (12.2%) had some discrepancy between their planned and published primary end points. A total of 579 nonprimary end points (median, seven per trial) were planned, of which 373 (64.4%; median, five per trial) were reported. A significant positive correlation was found between the number of planned and nonreported nonprimary end points (Spearman r = 0.66; P < .001). Twenty-eight studies (37.8%) reported a total of 65 unplanned end points, 52 (80.0%) of which were not identified as unplanned. Thirty-one (41.9%) and 19 (25.7%) of 74 trials reported a total of 52 unplanned analyses involving primary end points and 33 unplanned analyses involving nonprimary end points, respectively. Studies reported positive unplanned end points and unplanned analyses more frequently than negative outcomes in abstracts (unplanned end points odds ratio, 6.8; P = .002; unplanned analyses odds ratio, 8.4; P = .007). Conclusion Despite public and reviewer access to protocols, selective outcome reporting persists and is a major concern in the reporting of randomized clinical trials. To foster credible evidence-based medicine, additional initiatives are needed to minimize selective reporting. PMID:26304898

  18. Watershed-scale response to climate change through the twenty-first century for selected basins across the United States

    Science.gov (United States)

    Hay, Lauren E.; Markstrom, Steven; Ward-Garrison, Christian D.

    2011-01-01

    The hydrologic response of different climate-change emission scenarios for the twenty-first century were evaluated in 14 basins from different hydroclimatic regions across the United States using the Precipitation-Runoff Modeling System (PRMS), a process-based, distributed-parameter watershed model. This study involves four major steps: 1) setup and calibration of the PRMS model in 14 basins across the United States by local U.S. Geological Survey personnel; 2) statistical downscaling of the World Climate Research Programme’s Coupled Model Intercomparison Project phase 3 climate-change emission scenarios to create PRMS input files that reflect these emission scenarios; 3) run PRMS for the climate-change emission scenarios for the 14 basins; and 4) evaluation of the PRMS output.This paper presents an overview of this project, details of the methodology, results from the 14 basin simulations, and interpretation of these results. A key finding is that the hydrological response of the different geographical regions of the United States to potential climate change may be very different, depending on the dominant physical processes of that particular region. Also considered is the tremendous amount of uncertainty present in the climate emission scenarios and how this uncertainty propagates through the hydrologic simulations. This paper concludes with a discussion of the lessons learned and potential for future work.

  19. Random selection of items. Selection of n1 samples among N items composing a stratum

    International Nuclear Information System (INIS)

    Jaech, J.L.; Lemaire, R.J.

    1987-02-01

    STR-224 provides generalized procedures to determine required sample sizes, for instance in the course of a Physical Inventory Verification at Bulk Handling Facilities. The present report describes procedures to generate random numbers and select groups of items to be verified in a given stratum through each of the measurement methods involved in the verification. (author). 3 refs
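The procedure described above, drawing n1 items among the N composing a stratum via generated random numbers, reduces to sampling without replacement with a reproducible seed. A minimal sketch (the function name and 1-based item numbering are assumptions, not taken from the report):

```python
import random

def select_items(n_total, n_sample, seed):
    """Draw n_sample distinct item numbers (1..n_total) for verification.
    A fixed seed makes the selection reproducible for later audit."""
    if not 0 < n_sample <= n_total:
        raise ValueError("need 0 < n_sample <= n_total")
    rng = random.Random(seed)
    return sorted(rng.sample(range(1, n_total + 1), n_sample))

# e.g. verify 25 items out of a 500-item stratum
chosen = select_items(n_total=500, n_sample=25, seed=42)
```

Running the same call with the same seed reproduces the same selection, which is the property an inspectorate needs to document which items were verified.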

  20. Twenty years of artificial directional selection have shaped the genome of the Italian Large White pig breed.

    Science.gov (United States)

    Schiavo, G; Galimberti, G; Calò, D G; Samorè, A B; Bertolini, F; Russo, V; Gallo, M; Buttazzoni, L; Fontanesi, L

    2016-04-01

In this study, we investigated at the genome-wide level whether 20 years of artificial directional selection based on boar genetic evaluation obtained with a classical BLUP animal model shaped the genome of the Italian Large White pig breed. The most influential boars of this breed (n = 192), born from 1992 (the beginning of the selection program of this breed) to 2012, with an estimated breeding value reliability of >0.85, were genotyped with the Illumina Porcine SNP60 BeadChip. After grouping the boars in eight classes according to their year of birth, filtered single nucleotide polymorphisms (SNPs) were used to evaluate the effects of time on genotype frequency changes using multinomial logistic regression models. Of these markers, 493 showed a Bonferroni-significant change in genotype frequency over the course of the selection program. The obtained results indicated that the genome of the Italian Large White pigs was shaped by a directional selection program derived by the application of methodologies assuming the infinitesimal model that captured a continuous trend of allele frequency changes in the boar population. © 2015 Stichting International Foundation for Animal Genetics.

  1. The signature of positive selection at randomly chosen loci.

    Science.gov (United States)

    Przeworski, Molly

    2002-03-01

    In Drosophila and humans, there are accumulating examples of loci with a significant excess of high-frequency-derived alleles or high levels of linkage disequilibrium, relative to a neutral model of a random-mating population of constant size. These are features expected after a recent selective sweep. Their prevalence suggests that positive directional selection may be widespread in both species. However, as I show here, these features do not persist long after the sweep ends: The high-frequency alleles drift to fixation and no longer contribute to polymorphism, while linkage disequilibrium is broken down by recombination. As a result, loci chosen without independent evidence of recent selection are not expected to exhibit either of these features, even if they have been affected by numerous sweeps in their genealogical history. How then can we explain the patterns in the data? One possibility is population structure, with unequal sampling from different subpopulations. Alternatively, positive selection may not operate as is commonly modeled. In particular, the rate of fixation of advantageous mutations may have increased in the recent past.

  2. A Bayesian random effects discrete-choice model for resource selection: Population-level selection inference

    Science.gov (United States)

    Thomas, D.L.; Johnson, D.; Griffith, B.

    2006-01-01

Modeling the probability of use of land units characterized by discrete and continuous measures, we present a Bayesian random-effects model to assess resource selection. This model provides simultaneous estimation of both individual- and population-level selection. Deviance information criterion (DIC), a Bayesian alternative to AIC that is sample-size specific, is used for model selection. Aerial radiolocation data from 76 adult female caribou (Rangifer tarandus) and calf pairs during 1 year on an Arctic coastal plain calving ground were used to illustrate models and assess population-level selection of landscape attributes, as well as individual heterogeneity of selection. Landscape attributes included elevation, NDVI (a measure of forage greenness), and land cover-type classification. Results from the first of a 2-stage model-selection procedure indicated that there is substantial heterogeneity among cow-calf pairs with respect to selection of the landscape attributes. In the second stage, selection of models with heterogeneity included indicated that at the population level, NDVI and land cover class were significant attributes for selection of different landscapes by pairs on the calving ground. Population-level selection coefficients indicate that the pairs generally select landscapes with higher levels of NDVI, but the relationship is quadratic. The highest rate of selection occurs at values of NDVI less than the maximum observed. Results for land cover-class selection coefficients indicate that wet sedge, moist sedge, herbaceous tussock tundra, and shrub tussock tundra are selected at approximately the same rate, while alpine and sparsely vegetated landscapes are selected at a lower rate. Furthermore, the variability in selection by individual caribou for moist sedge and sparsely vegetated landscapes is large relative to the variability in selection of other land cover types. The example analysis illustrates that, while sometimes computationally intense, a...

  3. Blind Measurement Selection: A Random Matrix Theory Approach

    KAUST Repository

    Elkhalil, Khalil

    2016-12-14

This paper considers the problem of selecting a set of $k$ measurements from $n$ available sensor observations. The selected measurements should minimize a certain error function assessing the error in estimating a certain $m$ dimensional parameter vector. The exhaustive search inspecting each of the ${n \choose k}$ possible choices would require a very high computational complexity and as such is not practical for large $n$ and $k$. Alternative methods with low complexity have recently been investigated, but their main drawbacks are that 1) they require perfect knowledge of the measurement matrix and 2) they need to be applied at the pace of change of the measurement matrix. To overcome these issues, we consider the asymptotic regime in which $k$, $n$ and $m$ grow large at the same pace. Tools from random matrix theory are then used to approximate in closed form the most important error measures that are commonly used. The asymptotic approximations are then leveraged to properly select $k$ measurements exhibiting low values for the asymptotic error measures. Two heuristic algorithms are proposed: the first one merely consists in applying the convex optimization artifice to the asymptotic error measure. The second algorithm is a low-complexity greedy algorithm that attempts to look for a sufficiently good solution for the original minimization problem. The greedy algorithm can be applied to both the exact and the asymptotic error measures and can thus be implemented in blind and channel-aware fashions. We present two potential applications where the proposed algorithms can be used, namely antenna selection for uplink transmissions in large-scale multi-user systems and sensor selection for wireless sensor networks. Numerical results are also presented and sustain the efficiency of the proposed blind methods in reaching the performances of channel-aware algorithms.
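The channel-aware greedy idea in this abstract can be sketched with a standard MSE proxy. This is an illustrative stand-in, not the paper's algorithm: it greedily adds the row that most reduces the trace of a ridge-regularized inverse Gram matrix, and the ridge term and all names are assumptions:

```python
import numpy as np

def greedy_select(A, k, ridge=1e-3):
    """Greedily pick k of the n rows of the n-by-m measurement matrix A,
    each time adding the row minimizing trace((A_S^T A_S + ridge*I)^-1),
    a common proxy for the parameter-estimation MSE."""
    n, m = A.shape
    chosen = []
    for _ in range(k):
        best_row, best_err = None, np.inf
        for i in range(n):
            if i in chosen:
                continue
            S = A[chosen + [i]]
            err = np.trace(np.linalg.inv(S.T @ S + ridge * np.eye(m)))
            if err < best_err:
                best_row, best_err = i, err
        chosen.append(best_row)
    return chosen, best_err

rng = np.random.default_rng(5)
A = rng.standard_normal((12, 4))     # n = 12 observations, m = 4 parameters
sel, err = greedy_select(A, k=6)
```

The blind variant in the paper replaces the exact error measure evaluated here with its random-matrix-theory asymptotic approximation, so the greedy loop no longer needs the instantaneous measurement matrix.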

  4. Materials selection for oxide-based resistive random access memories

    International Nuclear Information System (INIS)

    Guo, Yuzheng; Robertson, John

    2014-01-01

The energies of atomic processes in resistive random access memories (RRAMs) are calculated for four typical oxides, HfO2, TiO2, Ta2O5, and Al2O3, to define a materials selection process. O vacancies have the lowest defect formation energy in the O-poor limit and dominate the processes. A band diagram defines the operating Fermi energy and O chemical potential range. It is shown how the scavenger metal can be used to vary the O vacancy formation energy, via controlling the O chemical potential, and the mean Fermi energy. The high endurance of Ta2O5 RRAM is related to its more stable amorphous phase and the adaptive lattice rearrangements of its O vacancy.

  5. Primitive polynomials selection method for pseudo-random number generator

    Science.gov (United States)

    Anikin, I. V.; Alnajjar, Kh

    2018-01-01

In this paper we suggest a method for selecting primitive polynomials of a special type. Polynomials of this kind can be used efficiently as characteristic polynomials for linear feedback shift registers in pseudo-random number generators. The proposed method consists of two basic steps: finding minimum-cost irreducible polynomials of the desired degree and applying primitivity tests to get the primitive ones. Finally, two primitive polynomials found by the proposed method were used in the pseudo-random number generator based on fuzzy logic (FRNG) which had been suggested before by the authors. The sequences generated by the new version of FRNG have low correlation magnitude, high linear complexity, and lower power consumption, and are more balanced and have better statistical properties.
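As background to the record above: a primitive characteristic polynomial is exactly what gives an LFSR its maximal period of 2^n - 1. A small self-contained check using a well-known 16-bit primitive polynomial (a textbook example, not one of the polynomials from the paper):

```python
def lfsr16(state):
    """One step of a 16-bit Galois LFSR whose characteristic polynomial
    x^16 + x^14 + x^13 + x^11 + 1 (tap mask 0xB400) is primitive, so the
    state sequence visits every nonzero 16-bit value before repeating."""
    lsb = state & 1
    state >>= 1
    if lsb:
        state ^= 0xB400
    return state

# Walk the cycle from a nonzero seed and measure the period.
state, period = 1, 0
while True:
    state = lfsr16(state)
    period += 1
    if state == 1:
        break
```

A merely irreducible but non-primitive polynomial would split the nonzero states into several shorter cycles, which is why the paper's second step (primitivity testing) matters.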

  6. Materials selection for oxide-based resistive random access memories

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Yuzheng; Robertson, John [Engineering Department, Cambridge University, Cambridge CB2 1PZ (United Kingdom)

    2014-12-01

The energies of atomic processes in resistive random access memories (RRAMs) are calculated for four typical oxides, HfO2, TiO2, Ta2O5, and Al2O3, to define a materials selection process. O vacancies have the lowest defect formation energy in the O-poor limit and dominate the processes. A band diagram defines the operating Fermi energy and O chemical potential range. It is shown how the scavenger metal can be used to vary the O vacancy formation energy, via controlling the O chemical potential, and the mean Fermi energy. The high endurance of Ta2O5 RRAM is related to its more stable amorphous phase and the adaptive lattice rearrangements of its O vacancy.

  7. Optimizing Event Selection with the Random Grid Search

    Energy Technology Data Exchange (ETDEWEB)

    Bhat, Pushpalatha C. [Fermilab; Prosper, Harrison B. [Florida State U.; Sekmen, Sezen [Kyungpook Natl. U.; Stewart, Chip [Broad Inst., Cambridge

    2017-06-29

    The random grid search (RGS) is a simple, but efficient, stochastic algorithm to find optimal cuts that was developed in the context of the search for the top quark at Fermilab in the mid-1990s. The algorithm, and associated code, have been enhanced recently with the introduction of two new cut types, one of which has been successfully used in searches for supersymmetry at the Large Hadron Collider. The RGS optimization algorithm is described along with the recent developments, which are illustrated with two examples from particle physics. One explores the optimization of the selection of vector boson fusion events in the four-lepton decay mode of the Higgs boson and the other optimizes SUSY searches using boosted objects and the razor variables.
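The RGS idea, taking candidate cut points from the data itself rather than from a fixed grid, can be sketched as follows. This is an illustrative version, not the Fermilab code; the figure of merit s / sqrt(b + 1), the one-sided rectangular cuts, and the toy two-variable data are all assumptions:

```python
import math
import random

def random_grid_search(signal, background, n_cuts=200, seed=3):
    """Random grid search over rectangular cuts: each candidate cut is the
    attribute vector of a randomly chosen signal event; an event passes if
    every attribute exceeds the cut. Returns the cut maximizing a simple
    signal-significance figure of merit, s / sqrt(b + 1)."""
    rng = random.Random(seed)
    best_cut, best_fom = None, -1.0
    for _ in range(n_cuts):
        cut = rng.choice(signal)              # thresholds taken from the data
        s = sum(all(x >= c for x, c in zip(ev, cut)) for ev in signal)
        b = sum(all(x >= c for x, c in zip(ev, cut)) for ev in background)
        fom = s / math.sqrt(b + 1.0)
        if fom > best_fom:
            best_cut, best_fom = cut, fom
    return best_cut, best_fom

rng = random.Random(4)
signal = [(rng.gauss(2, 1), rng.gauss(2, 1)) for _ in range(500)]
background = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(500)]
cut, fom = random_grid_search(signal, background)
```

Because candidate cuts are drawn from the signal sample, the search automatically concentrates on the region of attribute space where signal actually lives, which is what makes RGS efficient compared with scanning a uniform grid.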

  8. Selective decontamination in pediatric liver transplants. A randomized prospective study.

    Science.gov (United States)

    Smith, S D; Jackson, R J; Hannakan, C J; Wadowsky, R M; Tzakis, A G; Rowe, M I

    1993-06-01

    Although it has been suggested that selective decontamination of the digestive tract (SDD) decreases postoperative aerobic Gram-negative and fungal infections in orthotopic liver transplantation (OLT), no controlled trials exist in pediatric patients. This prospective, randomized controlled study of 36 pediatric OLT patients examines the effect of short-term SDD on postoperative infection and digestive tract flora. Patients were randomized into two groups. The control group received perioperative parenteral antibiotics only. The SDD group received in addition polymyxin E, tobramycin, and amphotericin B enterally and by oropharyngeal swab postoperatively until oral intake was tolerated (6 +/- 4 days). Indications for operation, preoperative status, age, and intensive care unit and hospital length of stay were no different in SDD (n = 18) and control (n = 18) groups. A total of 14 Gram-negative infections (intraabdominal abscess 7, septicemia 5, pneumonia 1, urinary tract 1) developed in the 36 patients studied. Mortality was not significantly different in the two groups. However, there were significantly fewer patients with Gram-negative infections in the SDD group: 3/18 patients (11%) vs. 11/18 patients (50%) in the control group, P < 0.001. There was also significant reduction in aerobic Gram-negative flora in the stool and pharynx in patients receiving SDD. Gram-positive and anaerobic organisms were unaffected. We conclude that short-term postoperative SDD significantly reduces Gram-negative infections in pediatric OLT patients.

  9. Pediatric selective mutism therapy: a randomized controlled trial.

    Science.gov (United States)

    Esposito, Maria; Gimigliano, Francesca; Barillari, Maria R; Precenzano, Francesco; Ruberto, Maria; Sepe, Joseph; Barillari, Umberto; Gimigliano, Raffaele; Militerni, Roberto; Messina, Giovanni; Carotenuto, Marco

    2017-10-01

    Selective mutism (SM) is a rare disease in children, coded by DSM-5 as an anxiety disorder. Despite the disabling nature of the disease, there is still no specific treatment. The aims of this study were to verify the efficacy of a six-month standard psychomotor treatment, and the positive changes in lifestyle, in a population of children affected by SM. Randomized controlled trial registered in the European Clinical Trials Registry (EudraCT 2015-001161-36). University third-level centre (Child and Adolescent Neuropsychiatry Clinic). The study population was composed of 67 children in group A (psychomotor treatment) (35 M, mean age 7.84±1.15) and 71 children in group B (behavioral and educational counseling) (37 M, mean age 7.75±1.36). Psychomotor treatment was administered by trained child therapists in residential settings three times per week. Each child was treated for the whole period by the same therapist, and all the therapists shared the same protocol. Each standard psychomotor session lasted 45 minutes. At T0 and after 6 months of treatment (T1), patients underwent a behavioral and SM severity assessment. To verify the effects of the psychomotor management, the Child Behavior Checklist questionnaire (CBCL) and the Selective Mutism Questionnaire (SMQ) were administered to the parents. After 6 months of psychomotor treatment, SM children showed a significant reduction in CBCL scores such as social relations, anxious/depressed, social problems, and total problems. The present study identifies psychomotricity as a safe and effective therapy for pediatric selective mutism, even if further studies are needed.

  10. Can cannabis use be prevented by targeting personality risk in schools? Twenty-four-month outcome of the Adventure trial on cannabis use: a cluster-randomized controlled trial

    OpenAIRE

    Mahu, Ioan T.; Doucet, Christine; O'Leary-Barrett, Maeve; Conrod, Patricia J.

    2015-01-01

    Aims To examine the effectiveness of a personality-targeted intervention program (Adventure trial) delivered by trained teachers to high-risk (HR) high-school students on reducing marijuana use and frequency of use. Design A cluster-randomized controlled trial. Setting Secondary schools in London, UK. Participants Twenty-one secondary schools were randomized to intervention (n = 12) or control (n = 9) conditions, encompassing a total of 1038 HR students in the ninth grade [mean (standard devi...

  11. Twenty-three generations of mice bidirectionally selected for open-field thigmotaxis: selection response and repeated exposure to the open field.

    Science.gov (United States)

    Leppänen, Pia K; Ravaja, N; Ewalds-Kvist, S B M

    2006-03-01

    We examined: (a) the response to bidirectional selection for open-field (OF) thigmotaxis in mice for 23 generations and (b) the effects of repeated exposure (during 5 days) on different OF behaviors in the selectively bred high OF thigmotaxis (HOFT) and low OF thigmotaxis (LOFT) mice. A total of 2049 mice were used in the study. Prior to the testing in the selection experiment, the mice were exposed to the OF apparatus for approximately 2 min on each of 4 consecutive days. Thus, the selection was based on the scores registered on the 5th day after the four habituation periods. The HOFT mice were more thigmotactic than the LOFT mice in almost every generation. The HOFT mice also tended to rear less than the LOFT mice, which was explained by the inverse relationship between emotionality and exploratory tendencies. The lines did not generally differ in ambulation. Sex differences were found in thigmotaxis, ambulation, and rearing. In the repeated exposure experiment, the development of nine different OF behaviors across the 5 days of testing was addressed. Both lines ambulated, explored, and reared most on the 1st, 4th, and 5th days. Grooming and radial latency decreased and thigmotaxis increased linearly across the testing days. Line differences were found in ambulation, exploration, grooming, and rearing, while sex differences were manifested in ambulation and exploration. The line difference in thigmotaxis was evident only on the 5th day. Temporal changes were partially at variance with the general assumptions. OF thigmotaxis was found to be a powerful characteristic for producing two diverging lines of mice.
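
    Per-generation selection responses of the kind tabulated in studies like this are conventionally summarized with the breeder's equation, h² = R/S (realized heritability from response R and selection differential S). A minimal sketch with invented numbers, not values from the study:

```python
# Breeder's equation: realized heritability = response / selection differential.
# The figures below are purely illustrative, not taken from the mouse experiment.
def realized_heritability(response, selection_differential):
    return response / selection_differential

h2 = realized_heritability(response=0.8, selection_differential=2.0)
print(h2)  # 0.4
```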

  12. Blind Measurement Selection: A Random Matrix Theory Approach

    KAUST Repository

    Elkhalil, Khalil; Kammoun, Abla; Al-Naffouri, Tareq Y.; Alouini, Mohamed-Slim

    2016-01-01

    -aware fashions. We present two potential applications where the proposed algorithms can be used, namely antenna selection for uplink transmissions in large scale multi-user systems and sensor selection for wireless sensor networks. Numerical results are also

  13. Strategyproof Peer Selection using Randomization, Partitioning, and Apportionment

    OpenAIRE

    Aziz, Haris; Lev, Omer; Mattei, Nicholas; Rosenschein, Jeffrey S.; Walsh, Toby

    2016-01-01

    Peer review, evaluation, and selection is a fundamental aspect of modern science. Funding bodies the world over employ experts to review and select the best proposals of those submitted for funding. The problem of peer selection, however, is much more general: a professional society may want to give a subset of its members awards based on the opinions of all members; an instructor for a MOOC or online course may want to crowdsource grading; or a marketing company may select ideas from group b...

  14. Variable Selection in Time Series Forecasting Using Random Forests

    Directory of Open Access Journals (Sweden)

    Hristos Tyralis

    2017-10-01

    Full Text Available Time series forecasting using machine learning algorithms has gained popularity recently. Random forest is a machine learning algorithm implemented in time series forecasting; however, most of its forecasting properties have remained unexplored. Here we focus on assessing the performance of random forests (RF) in one-step forecasting using two large datasets of short time series, with the aim to suggest an optimal set of predictor variables. Furthermore, we compare its performance to benchmarking methods. The first dataset comprises 16,000 simulated time series from a variety of Autoregressive Fractionally Integrated Moving Average (ARFIMA) models. The second dataset consists of 135 mean annual temperature time series. The highest predictive performance of RF is observed when using a low number of recent lagged predictor variables. This outcome could be useful in relevant future applications, with the prospect of achieving higher predictive accuracy.
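
    The one-step setup the abstract describes reduces to building a matrix of lagged values as predictors. A minimal sketch (the function name and toy series are invented; in the paper the same construction would feed a random forest):

```python
def lagged_matrix(series, n_lags):
    """One-step-ahead training pairs: the n_lags most recent values
    predict the next value of the series."""
    X, y = [], []
    for t in range(n_lags, len(series)):
        X.append(series[t - n_lags:t])
        y.append(series[t])
    return X, y

series = [1, 2, 3, 4, 5, 6, 7, 8]
X, y = lagged_matrix(series, 2)
print(X[0], y[0])  # [1, 2] 3
```

    The paper's finding that a low number of recent lags works best corresponds to keeping n_lags small here.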

  15. Random-walk simulation of selected aspects of dissipative collisions

    International Nuclear Information System (INIS)

    Toeke, J.; Gobbi, A.; Matulewicz, T.

    1984-11-01

    Internuclear thermal equilibrium effects and shell structure effects in dissipative collisions are studied numerically within the framework of the model of stochastic exchanges by applying the random-walk technique. Effective blocking of the drift through the mass flux induced by the temperature difference, while leaving the variances of the mass distributions unaltered, is found to be possible, provided an internuclear potential barrier is present. The presence of shell structure is found to lead to characteristic correlations between consecutive exchanges. Experimental evidence for the predicted effects is discussed. (orig.)

  16. Application of random effects to the study of resource selection by animals.

    Science.gov (United States)

    Gillies, Cameron S; Hebblewhite, Mark; Nielsen, Scott E; Krawchuk, Meg A; Aldridge, Cameron L; Frair, Jacqueline L; Saher, D Joanne; Stevens, Cameron E; Jerde, Christopher L

    2006-07-01

    1. Resource selection estimated by logistic regression is used increasingly in studies to identify critical resources for animal populations and to predict species occurrence. 2. Most frequently, individual animals are monitored and pooled to estimate population-level effects without regard to group or individual-level variation. Pooling assumes that both observations and their errors are independent, and that resource selection is constant given individual variation in resource availability. 3. Although researchers have identified ways to minimize autocorrelation, variation between individuals caused by differences in selection or available resources, including functional responses in resource selection, has not been well addressed. 4. Here we review random-effects models and their application to resource selection modelling to overcome these common limitations. We present a simple case study of an analysis of resource selection by grizzly bears in the foothills of the Canadian Rocky Mountains with and without random effects. 5. Both categorical and continuous variables in the grizzly bear model differed in interpretation, both in statistical significance and coefficient sign, depending on how a random effect was included. We used a simulation approach to clarify the application of random effects under three common situations for telemetry studies: (a) discrepancies in sample sizes among individuals; (b) differences among individuals in selection where availability is constant; and (c) differences in availability with and without a functional response in resource selection. 6. We found that random intercepts accounted for unbalanced sample designs, and models with random intercepts and coefficients improved model fit given the variation in selection among individuals and functional responses in selection. Our empirical example and simulations demonstrate how including random effects in resource selection models can aid interpretation and address difficult assumptions.
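
    The effect of unbalanced sample designs (point a) can be illustrated with a toy simulation; the animal names, effect sizes, and location counts are all invented. Naive pooling is dominated by the most-sampled individual, while weighting individuals equally, roughly what a per-animal random intercept achieves, recovers the population-level mean.

```python
import random

random.seed(7)

# Three hypothetical animals with different true selection strengths and
# very unbalanced numbers of telemetry locations.
true_effects = {"bear_A": 1.0, "bear_B": 0.0, "bear_C": -1.0}
n_locations = {"bear_A": 1000, "bear_B": 10, "bear_C": 10}

obs = [(ind, true_effects[ind] + random.gauss(0, 0.1))
       for ind, n in n_locations.items() for _ in range(n)]

# Naive pooling: every location weighted equally, so bear_A dominates.
pooled = sum(v for _, v in obs) / len(obs)

# Equal weight per individual: close to the true population mean of 0.
per_ind = [sum(v for i, v in obs if i == ind) / n
           for ind, n in n_locations.items()]
balanced = sum(per_ind) / len(per_ind)
print(round(pooled, 2), round(balanced, 2))
```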

  17. Interference-aware random beam selection schemes for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed; Qaraqe, Khalid; Alouini, Mohamed-Slim

    2012-01-01

    users. In this work, we develop interference-aware random beam selection schemes that provide enhanced performance for the secondary network under the condition that the interference observed by the receivers of the primary network is below a

  18. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology

    Science.gov (United States)

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological datasets there is limited guidance on variable selection methods for RF modeling. Typically, e...

  19. Recovery of a lowland dipterocarp forest twenty two years after selective logging at Sekundur, Gunung Leuser National Park, North Sumatra, Indonesia

    Directory of Open Access Journals (Sweden)

    Dolly - Priatna

    2006-12-01

    Full Text Available PRIATNA, D.; KARTAWINATA, K.; ABDULHADI, R. 2004. Recovery of a lowland dipterocarp forest twenty two years after selective logging at Sekundur, Gunung Leuser National Park, North Sumatra, Indonesia. Reinwardtia 12 (3): 237–255. — A permanent 2-ha plot of lowland forest selectively logged in 1978 at Sekundur, Gunung Leuser National Park, which is also a Biosphere Reserve and a World Heritage Site, North Sumatra, was established and investigated in 1982. It was re-examined in 2000, when remeasurement and reidentification of all trees with DBH ≥ 10 cm were made. The areas of gap, building and mature phases of the canopy were also measured and mapped. Within this plot, 133 species, 87 genera and 39 families were recorded, with a total number of trees of 1145, or a density of 572.5/ha. Euphorbiaceae was the richest family, with 18 species (13.5% of the total) and a total number of trees of 248 (21.7% of the total), or a density of 124 trees/ha. The most important families were Dipterocarpaceae with IV (Importance Value) = 52.0, followed by Euphorbiaceae with IV = 51.8. The most prevalent species was Shorea kunstleri (Dipterocarpaceae) with IV = 24.4, followed by Macaranga diepenhorstii (Euphorbiaceae) with IV = 12.4. They were the species with the highest densities, 34 trees/ha and 23.5 trees/ha, respectively. During the period of 18 years there has been no shift in the richest families, most important families and most important species. Euphorbiaceae was the richest family and Dipterocarpaceae was the most important family, with Shorea kunstleri as the most important species with the highest importance value throughout the period. The number of species increased from 127 to 133, with an increase in density of 36.8%, from 418.5 trees/ha to 572.5 trees/ha. The mortality was 25.57%, or 1.4% per year. The diameter class distribution indicated that the forest recovery has not been complete. Trees were small, comprising 67.6% with diameters of 10-20 cm and only two trees

  20. The mathematics of random mutation and natural selection for multiple simultaneous selection pressures and the evolution of antimicrobial drug resistance.

    Science.gov (United States)

    Kleinman, Alan

    2016-12-20

    The random mutation and natural selection phenomenon acts in a mathematically predictable way, which, when understood, leads to approaches to reduce and prevent the failure of these selection pressures when treating infections and cancers. The underlying principle for impairing the random mutation and natural selection phenomenon is to use combination therapy, which forces the population to evolve under multiple selection pressures simultaneously, invoking the multiplication rule of probabilities. Recently, it has been seen that combination therapy for the treatment of malaria has failed to prevent the emergence of drug-resistant variants. Using this empirical example and the principles of probability theory, the derivation of the equations describing this treatment failure is carried out. These equations give guidance as to how to use combination therapy for the treatment of cancers and infectious diseases and prevent the emergence of drug resistance. Copyright © 2016 John Wiley & Sons, Ltd.

  1. Acceptance sampling using judgmental and randomly selected samples

    Energy Technology Data Exchange (ETDEWEB)

    Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl

    2010-09-01

    We present a Bayesian model for acceptance sampling where the population consists of two groups, each with a different level of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high- and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail and discuss the conditions necessary to ensure that required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems and, in particular, to environmental sampling where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.

  2. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sample selection by random number... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square... area created in accordance with paragraph (a) of this section, select two random numbers: one each for...
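
    The mechanics of the rule, one random number per axis to locate a sampling position on a two-dimensional square grid, amount to a couple of lines; the grid size here is an invented example, not a value from § 761.308:

```python
import random

random.seed(0)

grid_cells = 10  # hypothetical 10 x 10 grid over the area to be sampled
x = random.randrange(grid_cells)  # random column index
y = random.randrange(grid_cells)  # random row index
print((x, y))
```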

  3. Twenty weeks of isometric handgrip home training to lower blood pressure in hypertensive older adults: a study protocol for a randomized controlled trial.

    Science.gov (United States)

    Jørgensen, Martin Grønbech; Ryg, Jesper; Danielsen, Mathias Brix; Madeleine, Pascal; Andersen, Stig

    2018-02-09

    Hypertension markedly increases the risk of cardiovascular diseases and overall mortality. Lifestyle modifications, such as increased levels of physical activity, are recommended as the first line of anti-hypertensive treatment. A recent systematic review showed that isometric handgrip (IHG) training was superior to traditional endurance and strength training in lowering resting systolic blood pressure (SBP). The average length of previous IHG training studies is approximately 7.5 weeks, with the longest being 10 weeks. Therefore, it is presently unknown whether blood pressure levels can be lowered further beyond the 10-week mark. Recently, we developed a novel method for monitoring handgrip intensity using a standard Nintendo Wii Board (Wii). The primary aim of this study is to explore the effects of 20 weeks of IHG home training facilitated by a Wii in hypertensive older adults (50+ years of age) on lowering SBP compared to usual care. Secondary aims are to explore if/when a leveling-off effect on SBP will occur during the 20-week intervention period in the training group and to explore adherence and potential harms related to the IHG home training. Based on previous evidence, we calculated that 50 hypertensive (SBP between 140 and 179 mmHg) older adults (50+ years of age) are needed to achieve a power of 80% or more. Participants will be randomly assigned either to an intervention group (IHG home training + hypertension guidelines on lifestyle changes) or to a control group (hypertension guidelines on lifestyle changes). Participants in the intervention group will perform IHG home training (30% of maximum grip strength for a total of 8 min per day per hand) three times a week for 20 weeks. Resting blood pressure and maximal handgrip strength will be obtained by a blinded outcome assessor in both groups at specific time points (baseline, follow-up at 5, 10, 15, and 20 weeks) throughout the study. This assessor-blinded, randomized controlled

  4. Twenty-year perspective of randomized controlled trials for surgery of chronic nonspecific low back pain: citation bias and tangential knowledge.

    Science.gov (United States)

    Andrade, Nicholas S; Flynn, John P; Bartanusz, Viktor

    2013-11-01

    After decades of clinical research, the role of surgery for chronic nonspecific low back pain (CNLBP) remains equivocal. Despite significant intellectual, human, and economic investments into randomized controlled trials (RCTs) in the past two decades, the role of surgery in the treatment of CNLBP has not been clarified. To delineate the historical research agenda of surgical RCTs for CNLBP performed between 1993 and 2012, investigating whether conclusions from earlier published trials influenced the choice of research questions of subsequent RCTs on the role of surgery in the management of CNLBP. Literature review. We searched the literature for all RCTs involving surgery for CNLBP. We reviewed relevant studies to identify the study question, comparator arms, and sample size. Randomized controlled trials were classified as "indication" trials if they evaluated the effectiveness of surgical therapy versus nonoperative care, or as "technical" if they compared different surgical techniques, adjuncts, or procedures. We used citation analysis to determine the impact of trials on subsequent research in the field. Altogether, 33 technical RCTs (3,790 patients) and 6 indication RCTs (981 patients) have been performed. Since 2007, despite the unclear benefits of surgery reported by the first four indication trials published in 2001 to 2006, technical trials have continued to predominate (16 vs. 2). Of the technical trials, types of instrumentation (13 trials, 1,332 patients), bone graft materials and substitutes (11 trials, 833 patients), and disc arthroplasty versus fusion (5 trials, 1,337 patients) were the most common comparisons made. Surgeon authors have predominantly cited one of the indication trials that reported more favorable results for surgery, despite a lack of superior methodology or sample size. Trials evaluating bone morphogenic protein, instrumentation, and disc arthroplasty were all cited more frequently than the largest trial of surgical versus

  5. Non-random mating for selection with restricted rates of inbreeding and overlapping generations

    NARCIS (Netherlands)

    Sonesson, A.K.; Meuwissen, T.H.E.

    2002-01-01

    Minimum coancestry mating with a maximum of one offspring per mating pair (MC1) is compared with random mating schemes for populations with overlapping generations. Optimum contribution selection is used, whereby ΔF is restricted. For schemes with ΔF restricted to 0.25% per

  6. Applications of random forest feature selection for fine-scale genetic population assignment.

    Science.gov (United States)

    Sylvester, Emma V A; Bentzen, Paul; Bradbury, Ian R; Clément, Marie; Pearce, Jon; Horne, John; Beiko, Robert G

    2018-02-01

    Genetic population assignment used to inform wildlife management and conservation efforts requires panels of highly informative genetic markers and sensitive assignment tests. We explored the utility of machine-learning algorithms (random forest, regularized random forest and guided regularized random forest) compared with FST ranking for selection of single nucleotide polymorphisms (SNPs) for fine-scale population assignment. We applied these methods to an unpublished SNP data set for Atlantic salmon (Salmo salar) and a published SNP data set for Alaskan Chinook salmon (Oncorhynchus tshawytscha). In each species, we identified the minimum panel size required to obtain a self-assignment accuracy of at least 90%, using each method to create panels of 50-700 markers. Panels of SNPs identified using random forest-based methods performed up to 7.8 and 11.2 percentage points better than FST-selected panels of similar size for the Atlantic salmon and Chinook salmon data, respectively. Self-assignment accuracy ≥90% was obtained with panels of 670 and 384 SNPs for each data set, respectively, a level of accuracy never reached for these species using FST-selected panels. Our results demonstrate a role for machine-learning approaches in marker selection across large genomic data sets to improve assignment for management and conservation of exploited populations.
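
    The FST-ranking baseline the study compares against can be sketched directly: score each SNP by Wright's FST = (HT − HS)/HT from the two populations' allele frequencies and keep the top-ranked markers. The frequencies below are invented for illustration.

```python
# Rank SNPs by Wright's F_ST computed from allele frequencies in two populations.
def fst(p1, p2):
    p_bar = (p1 + p2) / 2
    h_t = 2 * p_bar * (1 - p_bar)                      # total expected heterozygosity
    h_s = (2 * p1 * (1 - p1) + 2 * p2 * (1 - p2)) / 2  # mean within-population
    return (h_t - h_s) / h_t if h_t > 0 else 0.0

# Invented allele frequencies (population 1, population 2) for three SNPs.
snps = {"snp1": (0.9, 0.1), "snp2": (0.5, 0.5), "snp3": (0.7, 0.3)}
panel = sorted(snps, key=lambda s: fst(*snps[s]), reverse=True)
print(panel)  # ['snp1', 'snp3', 'snp2']
```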

  7. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sampling 1 meter square surfaces by...(b)(3) § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as...

  8. Hebbian Learning in a Random Network Captures Selectivity Properties of the Prefrontal Cortex

    Science.gov (United States)

    Lindsay, Grace W.

    2017-01-01

    Complex cognitive behaviors, such as context-switching and rule-following, are thought to be supported by the prefrontal cortex (PFC). Neural activity in the PFC must thus be specialized to specific tasks while retaining flexibility. Nonlinear “mixed” selectivity is an important neurophysiological trait for enabling complex and context-dependent behaviors. Here we investigate (1) the extent to which the PFC exhibits computationally relevant properties, such as mixed selectivity, and (2) how such properties could arise via circuit mechanisms. We show that PFC cells recorded from male and female rhesus macaques during a complex task show a moderate level of specialization and structure that is not replicated by a model wherein cells receive random feedforward inputs. While random connectivity can be effective at generating mixed selectivity, the data show significantly more mixed selectivity than predicted by a model with otherwise matched parameters. A simple Hebbian learning rule applied to the random connectivity, however, increases mixed selectivity and enables the model to match the data more accurately. To explain how learning achieves this, we provide analysis along with a clear geometric interpretation of the impact of learning on selectivity. After learning, the model also matches the data on measures of noise, response density, clustering, and the distribution of selectivities. Of two styles of Hebbian learning tested, the simpler and more biologically plausible option better matches the data. These modeling results provide clues about how neural properties important for cognition can arise in a circuit and make clear experimental predictions regarding how various measures of selectivity would evolve during animal training. SIGNIFICANCE STATEMENT The prefrontal cortex is a brain region believed to support the ability of animals to engage in complex behavior. How neurons in this area respond to stimuli—and in particular, to combinations of stimuli (

  9. Performance Evaluation of User Selection Protocols in Random Networks with Energy Harvesting and Hardware Impairments

    Directory of Open Access Journals (Sweden)

    Tan Nhat Nguyen

    2016-01-01

    Full Text Available In this paper, we evaluate the performance of various user selection protocols under the impact of hardware impairments. In the considered protocols, a Base Station (BS) selects one of the available Users (USs) to serve, while the remaining USs harvest energy from the Radio Frequency (RF) signal transmitted by the BS. We assume that all of the USs appear randomly around the BS. In the Random Selection Protocol (RAN), the BS randomly selects a US to transmit the data. In the second proposed protocol, named the Minimum Distance Protocol (MIND), the US that is nearest to the BS is chosen. In the Optimal Selection Protocol (OPT), the US providing the highest channel gain between itself and the BS is served. For performance evaluation, we derive exact and asymptotic closed-form expressions of the average Outage Probability (OP) over Rayleigh fading channels. We also consider the average harvested energy per US. Finally, Monte Carlo simulations are performed to verify the theoretical results.
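
    The RAN versus OPT comparison can be checked with a quick Monte Carlo sketch: under Rayleigh fading the channel power gain is exponentially distributed, and an outage occurs when the served user's gain falls below a threshold. The user count, mean gain, and threshold are invented parameters, not the paper's, and hardware impairments are ignored here.

```python
import random

random.seed(42)

n_users, threshold, trials = 5, 0.5, 20000
outage_ran = outage_opt = 0
for _ in range(trials):
    # Rayleigh fading: power gains are i.i.d. exponential with unit mean.
    gains = [random.expovariate(1.0) for _ in range(n_users)]
    if random.choice(gains) < threshold:  # RAN: serve a randomly chosen user
        outage_ran += 1
    if max(gains) < threshold:            # OPT: serve the best-channel user
        outage_opt += 1

print(outage_ran / trials, outage_opt / trials)
```

    As expected, OPT's outage probability is roughly the RAN outage probability raised to the power of the number of users.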

  10. Simulated Performance Evaluation of a Selective Tracker Through Random Scenario Generation

    DEFF Research Database (Denmark)

    Hussain, Dil Muhammad Akbar

    2006-01-01

    The paper presents a simulation study on the performance of a target tracker using a selective track splitting filter algorithm through a random scenario implemented on a digital signal processor. In a typical track splitting filter, all the observations which fall inside a likelihood ellipse are used for update; in our proposed selective track splitting filter, however, a smaller number of observations is used for track update. Much of the previous performance work [1] has been done on specific (deterministic) scenarios. One of the reasons for considering the specific scenarios, which were … performance assessment. Therefore, a random target motion scenario is adopted. Its implementation, in particular for testing the proposed selective track splitting algorithm using Kalman filters, is investigated through a number of performance parameters which give the activity profile of the tracking scenario.

  11. TEHRAN AIR POLLUTANTS PREDICTION BASED ON RANDOM FOREST FEATURE SELECTION METHOD

    Directory of Open Access Journals (Sweden)

    A. Shamsoddini

    2017-09-01

    Full Text Available Air pollution, as one of the most serious forms of environmental pollution, poses a huge threat to human life. Air pollution leads to environmental instability and has harmful and undesirable effects on the environment. Modern methods for predicting pollutant concentrations are able to improve decision making and provide appropriate solutions. This study examines the performance of Random Forest feature selection in combination with multiple linear regression and Multilayer Perceptron Artificial Neural Network methods, in order to achieve an efficient model to estimate carbon monoxide, nitrogen dioxide, sulfur dioxide, and PM2.5 contents in the air. The results indicated that Artificial Neural Networks fed by the attributes selected by the Random Forest feature selection method performed more accurately than the other models for all pollutants. The estimation accuracy for sulfur dioxide emissions was lower than for the other air contaminants, whereas nitrogen dioxide was predicted more accurately than the other pollutants.

  12. Tehran Air Pollutants Prediction Based on Random Forest Feature Selection Method

    Science.gov (United States)

    Shamsoddini, A.; Aboodi, M. R.; Karami, J.

    2017-09-01

    Air pollution, as one of the most serious forms of environmental pollution, poses a huge threat to human life. Air pollution leads to environmental instability and has harmful and undesirable effects on the environment. Modern methods for predicting pollutant concentrations are able to improve decision making and provide appropriate solutions. This study examines the performance of Random Forest feature selection in combination with multiple linear regression and Multilayer Perceptron Artificial Neural Network methods, in order to achieve an efficient model to estimate carbon monoxide, nitrogen dioxide, sulfur dioxide, and PM2.5 contents in the air. The results indicated that Artificial Neural Networks fed by the attributes selected by the Random Forest feature selection method performed more accurately than the other models for all pollutants. The estimation accuracy for sulfur dioxide emissions was lower than for the other air contaminants, whereas nitrogen dioxide was predicted more accurately than the other pollutants.

  13. Continuous-Time Mean-Variance Portfolio Selection with Random Horizon

    International Nuclear Information System (INIS)

    Yu, Zhiyong

    2013-01-01

    This paper examines the continuous-time mean-variance optimal portfolio selection problem with random market parameters and random time horizon. Treating this problem as a linearly constrained stochastic linear-quadratic optimal control problem, I explicitly derive the efficient portfolios and efficient frontier in closed forms based on the solutions of two backward stochastic differential equations. Some related issues such as a minimum variance portfolio and a mutual fund theorem are also addressed. All the results are markedly different from those in the problem with deterministic exit time. A key part of my analysis involves proving the global solvability of a stochastic Riccati equation, which is interesting in its own right

  14. Continuous-Time Mean-Variance Portfolio Selection with Random Horizon

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Zhiyong, E-mail: yuzhiyong@sdu.edu.cn [Shandong University, School of Mathematics (China)

    2013-12-15

    This paper examines the continuous-time mean-variance optimal portfolio selection problem with random market parameters and random time horizon. Treating this problem as a linearly constrained stochastic linear-quadratic optimal control problem, I explicitly derive the efficient portfolios and efficient frontier in closed forms based on the solutions of two backward stochastic differential equations. Some related issues such as a minimum variance portfolio and a mutual fund theorem are also addressed. All the results are markedly different from those in the problem with deterministic exit time. A key part of my analysis involves proving the global solvability of a stochastic Riccati equation, which is interesting in its own right.
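
    As a reference point for the random-horizon results in these two records, the static, deterministic special case has an elementary closed form: with two risky assets of volatilities s1, s2 and correlation rho, the minimum-variance weight on asset 1 is w* = (s2^2 - rho*s1*s2) / (s1^2 + s2^2 - 2*rho*s1*s2). A sketch with invented parameters:

```python
# Two-asset minimum-variance portfolio weight (static special case, not the
# paper's continuous-time random-horizon solution).
def min_variance_weight(s1, s2, rho):
    cov = rho * s1 * s2
    return (s2 ** 2 - cov) / (s1 ** 2 + s2 ** 2 - 2 * cov)

# Invented parameters: volatilities 20% and 30%, correlation 0.25.
w = min_variance_weight(0.2, 0.3, 0.25)
print(round(w, 4))  # 0.75
```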

  15. Emergence of multilevel selection in the prisoner's dilemma game on coevolving random networks

    International Nuclear Information System (INIS)

    Szolnoki, Attila; Perc, Matjaz

    2009-01-01

    We study the evolution of cooperation in the prisoner's dilemma game, whereby a coevolutionary rule is introduced that molds the random topology of the interaction network in two ways. First, existing links are deleted whenever a player adopts a new strategy or its degree exceeds a threshold value; second, new links are added randomly after a given number of game iterations. These coevolutionary processes correspond to the generic formation of new links and deletion of existing links that, especially in human societies, appear frequently as a consequence of ongoing socialization, change of lifestyle or death. Due to the counteraction of deletions and additions of links the initial heterogeneity of the interaction network is qualitatively preserved, and thus cannot be held responsible for the observed promotion of cooperation. Indeed, the coevolutionary rule evokes the spontaneous emergence of a powerful multilevel selection mechanism, which despite the sustained random topology of the evolving network, maintains cooperation across the whole span of defection temptation values.

  16. Topology-selective jamming of fully-connected, code-division random-access networks

    Science.gov (United States)

    Polydoros, Andreas; Cheng, Unjeng

    1990-01-01

    The purpose is to introduce certain models of topology selective stochastic jamming and examine its impact on a class of fully-connected, spread-spectrum, slotted ALOHA-type random access networks. The theory covers dedicated as well as half-duplex units. The dominant role of the spatial duty factor is established, and connections with the dual concept of time selective jamming are discussed. The optimal choices of coding rate and link access parameters (from the users' side) and the jamming spatial fraction are numerically established for DS and FH spreading.

  17. Random drift versus selection in academic vocabulary: an evolutionary analysis of published keywords.

    Directory of Open Access Journals (Sweden)

    R Alexander Bentley

    Full Text Available The evolution of vocabulary in academic publishing is characterized via keyword frequencies recorded in the ISI Web of Science citations database. In four distinct case-studies, evolutionary analysis of keyword frequency change through time is compared to a model of random copying used as the null hypothesis, such that selection may be identified against it. The case studies from the physical sciences indicate greater selection in keyword choice than in the social sciences. Similar evolutionary analyses can be applied to a wide range of phenomena wherever the popularity of multiple items through time has been recorded, as with web searches or sales of popular music and books, for example.

  18. Random drift versus selection in academic vocabulary: an evolutionary analysis of published keywords.

    Science.gov (United States)

    Bentley, R Alexander

    2008-08-27

    The evolution of vocabulary in academic publishing is characterized via keyword frequencies recorded in the ISI Web of Science citations database. In four distinct case-studies, evolutionary analysis of keyword frequency change through time is compared to a model of random copying used as the null hypothesis, such that selection may be identified against it. The case studies from the physical sciences indicate greater selection in keyword choice than in the social sciences. Similar evolutionary analyses can be applied to a wide range of phenomena wherever the popularity of multiple items through time has been recorded, as with web searches or sales of popular music and books, for example.

  19. Comparative Evaluations of Randomly Selected Four Point-of-Care Glucometer Devices in Addis Ababa, Ethiopia.

    Science.gov (United States)

    Wolde, Mistire; Tarekegn, Getahun; Kebede, Tedla

    2018-05-01

    Point-of-care glucometer (PoCG) devices play a significant role in self-monitoring of the blood sugar level, particularly in the follow-up of high blood sugar therapeutic response. The aim of this study was to evaluate blood glucose test results obtained with four randomly selected glucometers on participants with diabetes and control subjects versus a standard wet chemistry (hexokinase) method in Addis Ababa, Ethiopia. A prospective cross-sectional study was conducted on 200 randomly selected study participants (100 participants with diabetes and 100 healthy controls). Four randomly selected PoCG devices (CareSens N, DIAVUE Prudential, On Call Extra, i-QARE DS-W) were evaluated against the hexokinase method and the ISO 15197:2003 and ISO 15197:2013 standards. The minimum and maximum blood sugar values were recorded by CareSens N (21 mg/dl) and the hexokinase method (498.8 mg/dl), respectively. The mean sugar values of all PoCG devices except On Call Extra showed significant differences compared with the reference hexokinase method. Meanwhile, all four PoCG devices had a strong positive correlation (>80%) with the reference method (hexokinase). On the other hand, none of the four PoCG devices fulfilled the minimum accuracy requirements set by the ISO 15197:2003 and ISO 15197:2013 standards. In addition, the linear regression analysis revealed that all four selected PoCG devices overestimated the glucose concentrations. Overall, measurements from the four selected PoCG devices correlated poorly with the standard reference method. Therefore, before introducing PoCG devices to the market, there should be a standardized evaluation platform for validation. Further similar large-scale studies on other PoCG devices also need to be undertaken.
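    The ISO accuracy criterion applied in the study can be illustrated as follows. This is a minimal sketch using the commonly cited form of the ISO 15197:2013 thresholds and invented readings, not the study's data or the standard's full protocol.

```python
# Commonly cited ISO 15197:2013 accuracy form: >=95% of meter readings
# within +/-15 mg/dl of the reference below 100 mg/dl, and within +/-15%
# at or above 100 mg/dl. Readings below are made up for illustration.
import numpy as np

def iso15197_2013_pass(meter, reference):
    meter, reference = np.asarray(meter, float), np.asarray(reference, float)
    low = reference < 100.0
    ok = np.where(low,
                  np.abs(meter - reference) <= 15.0,
                  np.abs(meter - reference) <= 0.15 * reference)
    return ok.mean() >= 0.95

ref = np.array([80.0, 90.0, 150.0, 300.0])
good = ref + np.array([5.0, -10.0, 20.0, -40.0])   # all within tolerance
bad = ref + np.array([20.0, -10.0, 20.0, -40.0])   # first reading off by 20 mg/dl
print(iso15197_2013_pass(good, ref), iso15197_2013_pass(bad, ref))
```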

  20. Selection bias and subject refusal in a cluster-randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Rochelle Yang

    2017-07-01

    Full Text Available Abstract Background Selection bias and non-participation bias are major methodological concerns which impact external validity. Cluster-randomized controlled trials are especially prone to selection bias as it is impractical to blind clusters to their allocation into intervention or control. This study assessed the impact of selection bias in a large cluster-randomized controlled trial. Methods The Improved Cardiovascular Risk Reduction to Enhance Rural Primary Care (ICARE) study examined the impact of a remote pharmacist-led intervention in twelve medical offices. To assess eligibility, a standardized form containing patient demographics and medical information was completed for each screened patient. Eligible patients were approached by the study coordinator for recruitment. Both the study coordinator and the patient were aware of the site’s allocation prior to consent. Patients who consented or declined to participate were compared across control and intervention arms for differing characteristics. Statistical significance was determined using a two-tailed, equal variance t-test and a chi-square test with adjusted Bonferroni p-values. Results were adjusted for random cluster variation. Results There were 2749 completed screening forms returned to research staff with 461 subjects who had either consented or declined participation. Patients with poorly controlled diabetes were found to be significantly more likely to decline participation in intervention sites compared to those in control sites. A higher mean diastolic blood pressure was seen in patients with uncontrolled hypertension who declined in the control sites compared to those who declined in the intervention sites. However, these findings were no longer significant after adjustment for random variation among the sites. After this adjustment, females were now found to be significantly more likely to consent than males (odds ratio = 1.41; 95% confidence interval = 1.03, 1

  1. Fuzzy Random λ-Mean SAD Portfolio Selection Problem: An Ant Colony Optimization Approach

    Science.gov (United States)

    Thakur, Gour Sundar Mitra; Bhattacharyya, Rupak; Mitra, Swapan Kumar

    2010-10-01

    To reach the investment goal, one has to select a combination of securities from portfolios containing a large number of securities. Past records of each security alone do not guarantee the future return. As there are many uncertain factors which directly or indirectly influence the stock market, and as some newer stock markets do not have enough historical data, experts' expectations and experience must be combined with the past records to generate an effective portfolio selection model. In this paper the return of a security is assumed to be a Fuzzy Random Variable Set (FRVS), where returns are sets of random numbers which are in turn fuzzy numbers. A new λ-Mean Semi Absolute Deviation (λ-MSAD) portfolio selection model is developed. The subjective opinions of the investors on the rate of return of each security are taken into consideration by introducing a pessimistic-optimistic parameter vector λ. The λ-MSAD model is preferred as it uses the absolute deviation of the rate of return of a portfolio, instead of the variance, as the measure of risk. As this model can be reduced to a Linear Programming Problem (LPP), it can be solved much faster than quadratic programming problems. Ant Colony Optimization (ACO) is used for solving the portfolio selection problem. ACO is a paradigm for designing meta-heuristic algorithms for combinatorial optimization problems. Data from BSE is used for illustration.
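    The reduction to a linear program can be sketched directly. The illustration below solves a plain downside mean semi-absolute deviation portfolio problem with an off-the-shelf LP solver rather than ACO; the synthetic return data, the required-return value, and the omission of the fuzzy λ parameter vector are all simplifying assumptions.

```python
# Downside semi-absolute deviation E[max(0, mean return - realized return)]
# as an LP: variables are n portfolio weights x plus T deviation slacks d.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
T, n = 60, 4                       # 60 return observations, 4 securities
r = rng.normal([0.01, 0.02, 0.015, 0.005], 0.03, size=(T, n))
mu = r.mean(axis=0)
rho = 0.01                         # required expected portfolio return

c = np.concatenate([np.zeros(n), np.full(T, 1.0 / T)])     # minimize mean slack
A_ub = np.block([[mu - r, -np.eye(T)],                     # (mu - r_t)@x - d_t <= 0
                 [-mu.reshape(1, -1), np.zeros((1, T))]])  # mu@x >= rho
b_ub = np.concatenate([np.zeros(T), [-rho]])
A_eq = np.concatenate([np.ones(n), np.zeros(T)]).reshape(1, -1)  # weights sum to 1
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
x = res.x[:n]
print(res.status, np.round(x, 3))
```

    The LP structure is what makes the model fast to solve; a heuristic such as ACO becomes interesting mainly when the fuzzy-random extensions break linearity.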

  2. Optimization of the Dutch Matrix Test by Random Selection of Sentences From a Preselected Subset

    Directory of Open Access Journals (Sweden)

    Rolph Houben

    2015-04-01

    Full Text Available Matrix tests are available for speech recognition testing in many languages. For an accurate measurement, a steep psychometric function of the speech materials is required. For existing tests, it would be beneficial if the available materials could be further optimized by increasing the function’s steepness. The objective is to show whether the steepness of the psychometric function of an existing matrix test can be increased by selecting a homogeneous subset of recordings with the steepest sentence-based psychometric functions. We took data from a previous multicenter evaluation of the Dutch matrix test (45 normal-hearing listeners). Based on half of the data set, the sentences (140 out of 311) with a similar speech reception threshold and with the steepest psychometric functions (≥9.7%/dB) were selected first. Subsequently, the steepness of the psychometric function for this selection was calculated from the remaining (unused) second half of the data set. The calculation showed that the slope increased from 10.2%/dB to 13.7%/dB. The resulting subset did not allow the construction of enough balanced test lists. Therefore, the measurement procedure was changed to select the sentences randomly during testing. Random selection may interfere with a representative occurrence of phonemes. However, in our material, the median phonemic occurrence remained close to that of the original test. This finding indicates that phonemic occurrence is not a critical factor. The work highlights the possibility that existing speech tests might be improved by selecting sentences with a steep psychometric function.
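    The sentence-screening step above rests on estimating the slope of each sentence's psychometric function. A hedged sketch, assuming a logistic form parameterized so that the slope at the speech reception threshold (SRT) appears directly; the data points are simulated, not from the Dutch matrix test:

```python
# Assumed logistic psychometric function p(L) = 1/(1 + exp(-4*s*(L - SRT))),
# whose slope at the SRT equals s (proportion/dB; x100 gives %/dB).
import numpy as np
from scipy.optimize import curve_fit

def logistic(L, srt, s):
    return 1.0 / (1.0 + np.exp(-4.0 * s * (L - srt)))

levels = np.linspace(-10, 4, 8)                      # presentation levels in dB
true_srt, true_s = -3.0, 0.12                        # i.e. 12 %/dB
rng = np.random.default_rng(2)
p_obs = np.clip(logistic(levels, true_srt, true_s) + rng.normal(0, 0.02, 8), 0, 1)

(srt_hat, s_hat), _ = curve_fit(logistic, levels, p_obs, p0=[0.0, 0.1])
print(round(100 * s_hat, 1), "%/dB at SRT", round(srt_hat, 1), "dB")
```

    Screening then amounts to fitting this per sentence and keeping those with `s_hat` above the chosen threshold (≥9.7%/dB in the study).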

  3. Using Random Forests to Select Optimal Input Variables for Short-Term Wind Speed Forecasting Models

    Directory of Open Access Journals (Sweden)

    Hui Wang

    2017-10-01

    Full Text Available Achieving relatively high-accuracy short-term wind speed forecasts is a precondition for the construction and grid-connected operation of wind power forecasting systems for wind farms. Currently, most research is focused on the structure of forecasting models and does not consider the selection of input variables, which can have significant impacts on forecasting performance. This paper presents an input variable selection method for wind speed forecasting models. The candidate input variables for various leading periods are selected and random forests (RF) is employed to evaluate the importance of all variables as features. The feature subset with the best evaluation performance is selected as the optimal feature set. Then, a kernel-based extreme learning machine is constructed to evaluate the performance of input variable selection based on RF. The results of the case study show that by removing the uncorrelated and redundant features, RF effectively extracts the most strongly correlated set of features from the candidate input variables. By finding the optimal feature combination to represent the original information, RF simplifies the structure of the wind speed forecasting model, shortens the training time required, and substantially improves the model’s accuracy and generalization ability, demonstrating that the input variables selected by RF are effective.
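    The selection step can be sketched on a toy series. This is an illustrative sketch with a synthetic wind-speed series and lagged candidates only; the paper's actual candidates and the kernel extreme learning machine stage are not reproduced.

```python
# Build lagged wind-speed candidates, rank them with Random Forest
# importances, and keep the strongest lags as model inputs.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
v = rng.normal(size=1200).cumsum() * 0.1 + 8.0      # synthetic wind-speed series
max_lag = 6
X = np.column_stack([v[max_lag - k:-k] for k in range(1, max_lag + 1)])
y = v[max_lag:]                                     # one-step-ahead target

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
ranking = np.argsort(rf.feature_importances_)[::-1] + 1   # lags, best first
print("most informative lag:", int(ranking[0]))
```

    For this near-random-walk series the most recent value dominates, so lag 1 ranks first; with real data the ranking is what motivates the feature subset search.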

  4. From Protocols to Publications: A Study in Selective Reporting of Outcomes in Randomized Trials in Oncology.

    Science.gov (United States)

    Raghav, Kanwal Pratap Singh; Mahajan, Sminil; Yao, James C; Hobbs, Brian P; Berry, Donald A; Pentz, Rebecca D; Tam, Alda; Hong, Waun K; Ellis, Lee M; Abbruzzese, James; Overman, Michael J

    2015-11-01

    The decision by journals to append protocols to published reports of randomized trials was a landmark event in clinical trial reporting. However, limited information is available on how this initiative affected transparency and selective reporting of clinical trial data. We analyzed 74 oncology-based randomized trials published in Journal of Clinical Oncology, the New England Journal of Medicine, and The Lancet in 2012. To ascertain integrity of reporting, we compared published reports with their respective appended protocols with regard to primary end points, nonprimary end points, unplanned end points, and unplanned analyses. A total of 86 primary end points were reported in 74 randomized trials; nine trials had more than one primary end point. Nine trials (12.2%) had some discrepancy between their planned and published primary end points. A total of 579 nonprimary end points (median, seven per trial) were planned, of which 373 (64.4%; median, five per trial) were reported. A significant positive correlation was found between the number of planned and nonreported nonprimary end points (Spearman r = 0.66; P medicine, additional initiatives are needed to minimize selective reporting. © 2015 by American Society of Clinical Oncology.

  5. Twenty lectures on thermodynamics

    CERN Document Server

    Buchdahl, H A

    2013-01-01

    Twenty Lectures on Thermodynamics is a course of lectures, parts of which the author has given various times over the last few years. The book gives the readers a bird's eye view of phenomenological and statistical thermodynamics. The book covers many areas in thermodynamics such as states and transition; adiabatic isolation; irreversibility; the first, second, third and zeroth laws of thermodynamics; entropy and the entropy law; the idea of the application of thermodynamics; pseudo-states; the quantum-statistical canonical and grand canonical ensembles; and semi-classical gaseous systems. The text

  6. On theoretical models of gene expression evolution with random genetic drift and natural selection.

    Directory of Open Access Journals (Sweden)

    Osamu Ogasawara

    2009-11-01

    Full Text Available The relative contributions of natural selection and random genetic drift are a major source of debate in the study of gene expression evolution, which is hypothesized to serve as a bridge from molecular to phenotypic evolution. It has been suggested that the conflict between views is caused by the lack of a definite model of the neutral hypothesis that can describe the long-run behavior of evolutionary change in mRNA abundance. Therefore, previous studies have used inadequate analogies with the neutral prediction of other phenomena, such as amino acid or nucleotide sequence evolution, as the null hypothesis of their statistical inference. In this study, we introduced two novel theoretical models, one based on neutral drift and the other assuming natural selection, by focusing on a common property of the distribution of mRNA abundance among a variety of eukaryotic cells, which reflects the result of long-term evolution. Our results demonstrated that (1) our models can reproduce two independently found phenomena simultaneously: the time development of gene expression divergence and Zipf's law of the transcriptome; (2) cytological constraints can be explicitly formulated to describe long-term evolution; (3) the model assuming that natural selection optimized relative mRNA abundance was more consistent with previously published observations than the model of optimized absolute mRNA abundances. The models introduced in this study give a formulation of evolutionary change in the mRNA abundance of each gene as a stochastic process, on the basis of previously published observations. This model provides a foundation for interpreting observed data in studies of gene expression evolution, including identifying an adequate time scale for discriminating the effect of natural selection from that of random genetic drift of selectively neutral variations.

  7. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology.

    Science.gov (United States)

    Fox, Eric W; Hill, Ryan A; Leibowitz, Scott G; Olsen, Anthony R; Thornbrugh, Darren J; Weber, Marc H

    2017-07-01

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological data sets, there is limited guidance on variable selection methods for RF modeling. Typically, either a preselected set of predictor variables is used, or stepwise procedures are employed which iteratively remove variables according to their importance measures. This paper investigates the application of variable selection methods to RF models for predicting probable biological stream condition. Our motivating data set consists of the good/poor condition of n = 1365 stream survey sites from the 2008/2009 National Rivers and Stream Assessment, and a large set (p = 212) of landscape features from the StreamCat data set as potential predictors. We compare two types of RF models: a full variable set model with all 212 predictors and a reduced variable set model selected using a backward elimination approach. We assess model accuracy using RF's internal out-of-bag estimate, and a cross-validation procedure with validation folds external to the variable selection process. We also assess the stability of the spatial predictions generated by the RF models to changes in the number of predictors and argue that model selection needs to consider both accuracy and stability. The results suggest that RF modeling is robust to the inclusion of many variables of moderate to low importance. We found no substantial improvement in cross-validated accuracy as a result of variable reduction. Moreover, the backward elimination procedure tended to select too few variables and exhibited numerous issues such as upwardly biased out-of-bag accuracy estimates and instabilities in the spatial predictions. We use simulations to further support and generalize results from the analysis of real data.
A main purpose of this work is to elucidate issues of model selection bias and instability to ecologists interested in
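    The backward-elimination scheme discussed above can be sketched on synthetic data. Details such as the number of trees, the dropping rule, and the stopping point are illustrative assumptions, not the study's settings.

```python
# Repeatedly drop the least important predictor and track the out-of-bag
# accuracy of the shrinking Random Forest model.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(4)
X = rng.normal(size=(400, 12))
y = (X[:, 0] + X[:, 1] > 0).astype(int)          # only 2 of 12 predictors matter

keep = list(range(12))
trace = []
while len(keep) > 2:
    rf = RandomForestClassifier(n_estimators=100, oob_score=True,
                                random_state=0).fit(X[:, keep], y)
    trace.append((len(keep), rf.oob_score_))
    keep.pop(int(np.argmin(rf.feature_importances_)))  # drop weakest variable

print("surviving predictors:", sorted(keep))
```

    Note the pitfall the paper documents: because the out-of-bag score is reused at every elimination step, the final score is optimistically biased; honest accuracy needs validation folds external to the whole selection loop.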

  8. Analysis and applications of a frequency selective surface via a random distribution method

    International Nuclear Information System (INIS)

    Xie Shao-Yi; Huang Jing-Jian; Yuan Nai-Chang; Liu Li-Guo

    2014-01-01

    A novel frequency selective surface (FSS) for reducing radar cross section (RCS) is proposed in this paper. This FSS is based on the random distribution method, so it can be called a random surface. In this paper, stacked patches serving as periodic elements are employed for RCS reduction. Previous work has demonstrated the efficiency of utilizing microstrip patches, especially for the reflectarray. First, the relevant theory of the method is described. Then a sample of a three-layer variable-sized stacked patch random surface with a dimension of 260 mm×260 mm is simulated, fabricated, and measured in order to demonstrate the validity of the proposed design. For normal incidence, an 8-dB RCS reduction is achieved in both simulation and measurement over 8 GHz–13 GHz. Oblique incidence at 30° is also investigated, for which a 7-dB RCS reduction is obtained over 8 GHz–14 GHz. (condensed matter: electronic structure, electrical, magnetic, and optical properties)

  9. Selective oropharyngeal decontamination versus selective digestive decontamination in critically ill patients: a meta-analysis of randomized controlled trials

    Directory of Open Access Journals (Sweden)

    Zhao D

    2015-07-01

    Full Text Available Di Zhao,1,* Jian Song,2,* Xuan Gao,3 Fei Gao,4 Yupeng Wu,2 Yingying Lu,5 Kai Hou1 1Department of Neurosurgery, The First Hospital of Hebei Medical University, 2Department of Neurosurgery, 3Department of Neurology, The Second Hospital of Hebei Medical University, 4Hebei Provincial Procurement Centers for Medical Drugs and Devices, 5Department of Neurosurgery, The Second Hospital of Hebei Medical University, Shijiazhuang, People’s Republic of China *These authors contributed equally to this work Background: Selective digestive decontamination (SDD) and selective oropharyngeal decontamination (SOD) are associated with reduced mortality and infection rates among patients in intensive care units (ICUs); however, whether SOD has a superior effect to SDD remains uncertain. Hence, we conducted a meta-analysis of randomized controlled trials (RCTs) to compare SOD with SDD in terms of clinical outcomes and antimicrobial resistance rates in patients who were critically ill. Methods: RCTs published in PubMed, Embase, and Web of Science were systematically reviewed to compare the effects of SOD and SDD in patients who were critically ill. Outcomes included day-28 mortality, length of ICU stay, length of hospital stay, duration of mechanical ventilation, ICU-acquired bacteremia, and prevalence of antibiotic-resistant Gram-negative bacteria. Results were expressed as risk ratios (RRs) with 95% confidence intervals (CIs), and weighted mean differences (WMDs) with 95% CIs. Pooled estimates were performed using a fixed-effects model or random-effects model, depending on the heterogeneity among studies. Results: A total of four RCTs involving 23,822 patients met the inclusion criteria and were included in this meta-analysis. Among patients whose admitting specialty was surgery, cardiothoracic surgery (57.3%) and neurosurgery (29.7%) were the two main types of surgery being performed. Pooled results showed that SOD had similar effects as SDD in day-28 mortality (RR =1

  10. Geography and genography: prediction of continental origin using randomly selected single nucleotide polymorphisms

    Directory of Open Access Journals (Sweden)

    Ramoni Marco F

    2007-03-01

    Full Text Available Abstract Background Recent studies have shown that when individuals are grouped on the basis of genetic similarity, group membership corresponds closely to continental origin. There has been considerable debate about the implications of these findings in the context of larger debates about race and the extent of genetic variation between groups. Some have argued that clustering according to continental origin demonstrates the existence of significant genetic differences between groups and that these differences may have important implications for differences in health and disease. Others argue that clustering according to continental origin requires the use of large amounts of genetic data or specifically chosen markers and is indicative only of very subtle genetic differences that are unlikely to have biomedical significance. Results We used small numbers of randomly selected single nucleotide polymorphisms (SNPs) from the International HapMap Project to train naïve Bayes classifiers for prediction of ancestral continent of origin. Predictive accuracy was tested on two independent data sets. Genetically similar groups should be difficult to distinguish, especially if only a small number of genetic markers are used. The genetic differences between continentally defined groups are sufficiently large that one can accurately predict ancestral continent of origin using only a minute, randomly selected fraction of the genetic variation present in the human genome. Genotype data from only 50 random SNPs was sufficient to predict ancestral continent of origin in our primary test data set with an average accuracy of 95%. Genetic variations informative about ancestry were common and widely distributed throughout the genome. Conclusion Accurate characterization of ancestry is possible using small numbers of randomly selected SNPs.
The results presented here show how investigators conducting genetic association studies can use small numbers of arbitrarily
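    The classification setup can be illustrated on synthetic genotypes. This is a sketch with invented allele frequencies for two populations, not HapMap data, and a Gaussian rather than a discrete naïve Bayes variant for simplicity.

```python
# Pick a small random subset of SNPs and train a naive Bayes classifier
# to predict group of origin from 0/1/2 genotype counts.
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(5)
n_snps, n_per_group = 1000, 150
# two populations with independently drawn allele frequencies per SNP
f = np.stack([rng.uniform(0.1, 0.9, n_snps) for _ in range(2)])
X = np.vstack([rng.binomial(2, f[g], size=(n_per_group, n_snps)) for g in (0, 1)])
y = np.repeat([0, 1], n_per_group)

snps = rng.choice(n_snps, size=50, replace=False)    # 50 randomly chosen SNPs
clf = GaussianNB().fit(X[::2][:, snps], y[::2])      # even rows train, odd test
acc = clf.score(X[1::2][:, snps], y[1::2])
print("accuracy with 50 random SNPs:", round(acc, 2))
```

    As in the paper's argument, no marker is individually very informative, yet a few dozen random markers combined classify almost perfectly.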

  11. Random forest variable selection in spatial malaria transmission modelling in Mpumalanga Province, South Africa

    Directory of Open Access Journals (Sweden)

    Thandi Kapwata

    2016-11-01

    Full Text Available Malaria is an environmentally driven disease. In order to quantify the spatial variability of malaria transmission, it is imperative to understand the interactions between environmental variables and malaria epidemiology at a micro-geographic level using a novel statistical approach. The random forest (RF) statistical learning method, a relatively new variable-importance ranking method, measures the variable importance of potentially influential parameters through the percent increase of the mean squared error. As this value increases, so does the relative importance of the associated variable. The principal aim of this study was to create predictive malaria maps generated using the selected variables based on the RF algorithm in the Ehlanzeni District of Mpumalanga Province, South Africa. From the seven environmental variables used [temperature, lag temperature, rainfall, lag rainfall, humidity, altitude, and the normalized difference vegetation index (NDVI)], altitude was identified as the most influential predictor variable due to its high selection frequency. It was selected as the top predictor for 4 out of 12 months of the year, followed by NDVI, temperature and lag rainfall, which were each selected twice. The combination of climatic variables that produced the highest prediction accuracy was altitude, NDVI, and temperature. This suggests that these three variables have high predictive capabilities in relation to malaria transmission. Furthermore, it is anticipated that the predictive maps generated from predictions made by the RF algorithm could be used to monitor the progression of malaria and assist in intervention and prevention efforts with respect to malaria.
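    The percent-increase-in-MSE importance measure described above can be sketched as a plain permutation test. Variables, data, and effect sizes below are invented stand-ins for the climatic predictors, not the study's data.

```python
# Permute one predictor at a time and record the resulting percent
# increase in mean squared error (a %IncMSE-style importance).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(6)
names = ["temp", "lag_temp", "rain", "lag_rain", "humidity", "altitude", "ndvi"]
X = rng.normal(size=(300, 7))
y = -2.0 * X[:, 5] + 0.5 * X[:, 6] + rng.normal(scale=0.2, size=300)

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
base = mean_squared_error(y, rf.predict(X))
inc = []
for j in range(7):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])           # break this variable's signal
    inc.append(100 * (mean_squared_error(y, rf.predict(Xp)) - base) / base)

print("top predictor:", names[int(np.argmax(inc))])
```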

  12. Distribution of orientation selectivity in recurrent networks of spiking neurons with different random topologies.

    Science.gov (United States)

    Sadeh, Sadra; Rotter, Stefan

    2014-01-01

    Neurons in the primary visual cortex are more or less selective for the orientation of a light bar used for stimulation. A broad distribution of individual grades of orientation selectivity has in fact been reported in all species. A possible reason for emergence of broad distributions is the recurrent network within which the stimulus is being processed. Here we compute the distribution of orientation selectivity in randomly connected model networks that are equipped with different spatial patterns of connectivity. We show that, for a wide variety of connectivity patterns, a linear theory based on firing rates accurately approximates the outcome of direct numerical simulations of networks of spiking neurons. Distance dependent connectivity in networks with a more biologically realistic structure does not compromise our linear analysis, as long as the linearized dynamics, and hence the uniform asynchronous irregular activity state, remain stable. We conclude that linear mechanisms of stimulus processing are indeed responsible for the emergence of orientation selectivity and its distribution in recurrent networks with functionally heterogeneous synaptic connectivity.

  13. Interference-aware random beam selection schemes for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed

    2012-10-19

    Spectrum sharing systems have been recently introduced to alleviate the problem of spectrum scarcity by allowing secondary unlicensed networks to share the spectrum with primary licensed networks under acceptable interference levels to the primary users. In this work, we develop interference-aware random beam selection schemes that provide enhanced performance for the secondary network under the condition that the interference observed by the receivers of the primary network is below a predetermined/acceptable value. We consider a secondary link composed of a transmitter equipped with multiple antennas and a single-antenna receiver sharing the same spectrum with a primary link composed of a single-antenna transmitter and a single-antenna receiver. The proposed schemes select a beam, among a set of power-optimized random beams, that maximizes the signal-to-interference-plus-noise ratio (SINR) of the secondary link while satisfying the primary interference constraint for different levels of feedback information describing the interference level at the primary receiver. For the proposed schemes, we develop a statistical analysis for the SINR statistics as well as the capacity and bit error rate (BER) of the secondary link.
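    The selection rule can be illustrated numerically. This toy sketch assumes scalar complex channels and made-up power levels; the paper's power optimization of the beams and its feedback quantization are not modeled.

```python
# Among random beams, keep those whose interference at the primary
# receiver is under the threshold, then pick the one maximizing the
# secondary link's SINR.
import numpy as np

rng = np.random.default_rng(7)
n_beams, n_tx = 8, 4
beams = rng.normal(size=(n_beams, n_tx)) + 1j * rng.normal(size=(n_beams, n_tx))
beams /= np.linalg.norm(beams, axis=1, keepdims=True)   # unit-power random beams

h_s = rng.normal(size=n_tx) + 1j * rng.normal(size=n_tx)  # secondary channel
h_p = rng.normal(size=n_tx) + 1j * rng.normal(size=n_tx)  # channel to primary RX
noise, i_max = 1.0, 3.0                                   # illustrative values

interference = np.abs(beams @ h_p) ** 2
sinr = np.abs(beams @ h_s) ** 2 / noise
feasible = np.flatnonzero(interference <= i_max)
best = feasible[np.argmax(sinr[feasible])]               # selected beam index
print("beam", int(best), "interference", round(float(interference[best]), 3))
```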

  14. Joint random beam and spectrum selection for spectrum sharing systems with partial channel state information

    KAUST Repository

    Abdallah, Mohamed M.

    2013-11-01

    In this work, we develop a joint interference-aware random beam and spectrum selection scheme that provides enhanced performance for the secondary network under the condition that the interference observed at the primary receiver is below a predetermined acceptable value. We consider a secondary link composed of a transmitter equipped with multiple antennas and a single-antenna receiver sharing the same spectrum with a set of primary links composed of a single-antenna transmitter and a single-antenna receiver. The proposed scheme jointly selects a beam, among a set of power-optimized random beams, as well as the primary spectrum that maximizes the signal-to-interference-plus-noise ratio (SINR) of the secondary link while satisfying the primary interference constraint. In particular, we consider the case where the interference level is described by a q-bit description of its magnitude, whereby we propose a technique to find the optimal quantizer thresholds in a mean square error (MSE) sense. © 2013 IEEE.

  15. Joint random beam and spectrum selection for spectrum sharing systems with partial channel state information

    KAUST Repository

    Abdallah, Mohamed M.; Sayed, Mostafa M.; Alouini, Mohamed-Slim; Qaraqe, Khalid A.

    2013-01-01

    In this work, we develop a joint interference-aware random beam and spectrum selection scheme that provides enhanced performance for the secondary network under the condition that the interference observed at the primary receiver is below a predetermined acceptable value. We consider a secondary link composed of a transmitter equipped with multiple antennas and a single-antenna receiver sharing the same spectrum with a set of primary links composed of a single-antenna transmitter and a single-antenna receiver. The proposed scheme jointly selects a beam, among a set of power-optimized random beams, as well as the primary spectrum that maximizes the signal-to-interference-plus-noise ratio (SINR) of the secondary link while satisfying the primary interference constraint. In particular, we consider the case where the interference level is described by a q-bit description of its magnitude, whereby we propose a technique to find the optimal quantizer thresholds in a mean square error (MSE) sense. © 2013 IEEE.
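    The q-bit quantizer design mentioned above can be sketched with a plain Lloyd-Max iteration. The interference distribution, the value of q, and the sample-based iteration are illustrative choices, not the paper's technique.

```python
# Lloyd-Max iteration minimizing MSE for samples of the interference level.
import numpy as np

rng = np.random.default_rng(8)
samples = rng.exponential(scale=1.0, size=20000)   # interference magnitudes
q = 2                                              # q-bit -> 2**q levels
levels = np.quantile(samples, (np.arange(2 ** q) + 0.5) / 2 ** q)  # init

for _ in range(50):                                # Lloyd-Max iterations
    thresholds = (levels[:-1] + levels[1:]) / 2    # midpoints between levels
    cell = np.digitize(samples, thresholds)
    levels = np.array([samples[cell == k].mean() for k in range(2 ** q)])

thresholds = (levels[:-1] + levels[1:]) / 2        # final decision thresholds
mse = np.mean((samples - levels[np.digitize(samples, thresholds)]) ** 2)
print("thresholds:", np.round(thresholds, 2), "mse:", round(mse, 4))
```

    With q bits of feedback, the secondary transmitter only learns which of the 2**q cells the primary's interference falls in, which is why the threshold placement matters.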

  16. Effects of one versus two bouts of moderate intensity physical activity on selective attention during a school morning in Dutch primary schoolchildren: A randomized controlled trial.

    Science.gov (United States)

    Altenburg, Teatske M; Chinapaw, Mai J M; Singh, Amika S

    2016-10-01

    Evidence suggests that physical activity is positively related to several aspects of cognitive functioning in children, among which is selective attention. To date, no information is available on the optimal frequency of physical activity on cognitive functioning in children. The current study examined the acute effects of one and two bouts of moderate-intensity physical activity on children's selective attention. Randomized controlled trial (ISRCTN97975679). Thirty boys and twenty-six girls, aged 10-13 years, were randomly assigned to three conditions: (A) sitting all morning working on simulated school tasks; (B) one 20-min physical activity bout after 90 min; and (C) two 20-min physical activity bouts, i.e. at the start and after 90 min. Selective attention was assessed at five time points during the morning (i.e. at baseline and after 20, 110, 130 and 220 min), using the 'Sky Search' subtest of the 'Test of Selective Attention in Children'. We used generalized estimating equation (GEE) analysis to examine differences in Sky Search scores between the three experimental conditions, adjusting for school, baseline scores, self-reported screen time and time spent in sports. Children who performed two 20-min bouts of moderate-intensity physical activity had significantly better Sky Search scores compared to children who performed one physical activity bout or remained seated the whole morning (B=-0.26; 95% CI=[-0.52; -0.00]). Our findings support the importance of repeated physical activity during the school day for beneficial effects on selective attention in children. Copyright © 2015 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  17. High-Dose Statin Pretreatment Decreases Periprocedural Myocardial Infarction and Cardiovascular Events in Patients Undergoing Elective Percutaneous Coronary Intervention: A Meta-Analysis of Twenty-Four Randomized Controlled Trials

    Science.gov (United States)

    Wang, Le; Peng, Pingan; Zhang, Ou; Xu, Xiaohan; Yang, Shiwei; Zhao, Yingxin; Zhou, Yujie

    2014-01-01

    Background Evidence suggests that high-dose statin pretreatment may reduce the risk of periprocedural myocardial infarction (PMI) and major adverse cardiac events (MACE) for certain patients; however, previous analyses have not considered patients with a history of statin maintenance treatment. In this meta-analysis of randomized controlled trials (RCTs), we reevaluated the efficacy of short-term high-dose statin pretreatment to prevent PMI and MACE in an expanded set of patients undergoing elective percutaneous coronary intervention. Methods We searched the PubMed/Medline database for RCTs that compared high-dose statin pretreatment with no statin or low-dose statin pretreatment as a prevention of PMI and MACE. We evaluated the incidence of PMI and MACE, including death, spontaneous myocardial infarction, and target vessel revascularization at the longest follow-up for each study for subgroups stratified by disease classification and prior low-dose statin treatment. Results Twenty-four RCTs with a total of 5,526 patients were identified. High-dose statin pretreatment was associated with a 59% relative reduction in PMI (odds ratio [OR]: 0.41; 95% confidence interval [CI]: 0.34–0.49). The effect of high-dose statin pretreatment on MACE was significant for statin-naive patients (OR: 0.69; 95% CI: 0.50–0.95; P = 0.02) and prior low-dose statin-treated patients (OR: 0.28; 95% CI: 0.12–0.65; P = 0.003), and for patients with acute coronary syndrome (OR: 0.52; 95% CI: 0.34–0.79; P = 0.003), but not for patients with stable angina (OR: 0.71; 95% CI 0.45–1.10; P = 0.12). Long-term effects on survival were less obvious. Conclusions High-dose statin pretreatment can result in a significant reduction in PMI and MACE for patients undergoing elective PCI. The positive effect of high-dose statin pretreatment on PMI and MACE is significant for statin-naive patients and patients with prior treatment. The positive effect of high-dose statin pretreatment on MACE is significant for

  18. Treatment selection in a randomized clinical trial via covariate-specific treatment effect curves.

    Science.gov (United States)

    Ma, Yunbei; Zhou, Xiao-Hua

    2017-02-01

    For time-to-event data in a randomized clinical trial, we proposed two new methods for selecting an optimal treatment for a patient based on the covariate-specific treatment effect curve, which is used to represent the clinical utility of a predictive biomarker. To select an optimal treatment for a patient with a specific biomarker value, we proposed pointwise confidence intervals for each covariate-specific treatment effect curve and the difference between covariate-specific treatment effect curves of two treatments. Furthermore, to select an optimal treatment for a future biomarker-defined subpopulation of patients, we proposed confidence bands for each covariate-specific treatment effect curve and the difference between each pair of covariate-specific treatment effect curves over a fixed interval of biomarker values. We constructed the confidence bands based on a resampling technique. We also conducted simulation studies to evaluate finite-sample properties of the proposed estimation methods. Finally, we illustrated the application of the proposed method in a real-world data set.
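The resampling idea behind such confidence intervals can be illustrated with a generic percentile bootstrap for the difference in mean treatment effect between two arms at a single biomarker value. This is a simplified sketch, not the authors' covariate-specific curve estimator; the function name and inputs are invented for illustration.

```python
import random
import statistics

def bootstrap_ci(effects_a, effects_b, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for the difference in mean
    treatment effect between arm A and arm B: resample each arm with
    replacement, recompute the difference of means, and read off the
    alpha/2 and 1-alpha/2 quantiles of the bootstrap distribution."""
    rng = random.Random(seed)
    diffs = []
    for _ in range(n_boot):
        resample_a = [rng.choice(effects_a) for _ in effects_a]
        resample_b = [rng.choice(effects_b) for _ in effects_b]
        diffs.append(statistics.fmean(resample_a) - statistics.fmean(resample_b))
    diffs.sort()
    lo = diffs[int((alpha / 2) * n_boot)]
    hi = diffs[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

A simultaneous confidence band over a biomarker interval would repeat this pointwise computation on a grid and widen the limits to control the joint coverage, which is where the paper's resampling technique does the real work.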

  19. Integrated Behavior Therapy for Selective Mutism: a randomized controlled pilot study.

    Science.gov (United States)

    Bergman, R Lindsey; Gonzalez, Araceli; Piacentini, John; Keller, Melody L

    2013-10-01

    To evaluate the feasibility, acceptability, and preliminary efficacy of a novel behavioral intervention for reducing symptoms of selective mutism and increasing functional speech. A total of 21 children ages 4 to 8 with primary selective mutism were randomized to 24 weeks of Integrated Behavior Therapy for Selective Mutism (IBTSM) or a 12-week Waitlist control. Clinical outcomes were assessed using blind independent evaluators, parent- and teacher-report, and an objective behavioral measure. Treatment recipients completed a three-month follow-up to assess durability of treatment gains. Data indicated increased functional speaking behavior post-treatment as rated by parents and teachers, with a high rate of treatment responders as rated by blind independent evaluators (75%). Conversely, children in the Waitlist comparison group did not experience significant improvements in speaking behaviors. Children who received IBTSM also demonstrated significant improvements in number of words spoken at school compared to baseline; however, significant group differences did not emerge. Treatment recipients also experienced significant reductions in social anxiety per parent, but not teacher, report. Clinical gains were maintained over the three-month follow-up. IBTSM appears to be a promising new intervention that is efficacious in increasing functional speaking behaviors, feasible, and acceptable to parents and teachers. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Two-year Randomized Clinical Trial of Self-etching Adhesives and Selective Enamel Etching.

    Science.gov (United States)

    Pena, C E; Rodrigues, J A; Ely, C; Giannini, M; Reis, A F

    2016-01-01

    The aim of this randomized, controlled prospective clinical trial was to evaluate the clinical effectiveness of restoring noncarious cervical lesions with two self-etching adhesive systems applied with or without selective enamel etching. A one-step self-etching adhesive (Xeno V(+)) and a two-step self-etching system (Clearfil SE Bond) were used. The effectiveness of phosphoric acid selective etching of enamel margins was also evaluated. Fifty-six cavities were restored with each adhesive system and divided into two subgroups (n=28; etch and non-etch). All 112 cavities were restored with the nanohybrid composite Esthet.X HD. The clinical effectiveness of restorations was recorded in terms of retention, marginal integrity, marginal staining, caries recurrence, and postoperative sensitivity after 3, 6, 12, 18, and 24 months (modified United States Public Health Service). The Friedman test detected significant differences only after 18 months for marginal staining in the groups Clearfil SE non-etch (p=0.009) and Xeno V(+) etch (p=0.004). One restoration was lost during the trial (Xeno V(+) etch; p>0.05). Although an increase in marginal staining was recorded for groups Clearfil SE non-etch and Xeno V(+) etch, the clinical effectiveness of restorations was considered acceptable for the single-step and two-step self-etching systems with or without selective enamel etching in this 24-month clinical trial.

  1. Selecting Optimal Parameters of Random Linear Network Coding for Wireless Sensor Networks

    DEFF Research Database (Denmark)

    Heide, J; Zhang, Qi; Fitzek, F H P

    2013-01-01

    This work studies how to select optimal code parameters of Random Linear Network Coding (RLNC) in Wireless Sensor Networks (WSNs). With Rateless Deluge [1] the authors proposed to apply Network Coding (NC) for Over-the-Air Programming (OAP) in WSNs, and demonstrated that with NC a significant reduction in the number of transmitted packets can be achieved. However, NC introduces additional computations and potentially a non-negligible transmission overhead, both of which depend on the chosen coding parameters. Therefore it is necessary to consider the trade-off that these coding parameters present in order to obtain the lowest energy consumption per transmitted bit. This problem is analyzed and suitable coding parameters are determined for the popular Tmote Sky platform. Compared to the use of traditional RLNC, these parameters enable a reduction in the energy spent per bit which grows...
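As a concrete picture of the coding machinery whose parameters are being tuned, the sketch below implements RLNC over the smallest field, GF(2), where encoding is just XOR of randomly chosen source packets and decodability is a rank condition on the coefficient vectors. Larger fields or more coded packets raise the decoding probability but also the computation and header overhead the abstract refers to. This is a generic textbook sketch, not the Tmote Sky implementation.

```python
import random

def rlnc_encode(packets, num_coded, seed=0):
    """RLNC over GF(2): each coded packet is the XOR of a random subset of
    the source packets; the 0/1 coefficient vector travels in the header."""
    rng = random.Random(seed)
    k, size = len(packets), len(packets[0])
    coded = []
    for _ in range(num_coded):
        coeffs = [rng.randint(0, 1) for _ in range(k)]
        if not any(coeffs):
            coeffs[rng.randrange(k)] = 1  # skip the useless all-zero vector
        payload = bytearray(size)
        for c, pkt in zip(coeffs, packets):
            if c:
                for i, byte in enumerate(pkt):
                    payload[i] ^= byte
        coded.append((coeffs, bytes(payload)))
    return coded

def gf2_rank(coeff_vectors):
    """Rank of the coefficient matrix over GF(2) via an XOR basis; a receiver
    can decode once the rank reaches the number of source packets."""
    basis = []  # reduced rows, kept sorted so leading bits descend
    for vec in coeff_vectors:
        x = int("".join(map(str, vec)), 2)
        for b in basis:       # Gaussian-elimination reduction step
            x = min(x, x ^ b)
        if x:
            basis.append(x)
            basis.sort(reverse=True)
    return len(basis)
```

The energy trade-off in the abstract shows up here directly: every extra coded packet costs transmissions and XOR work, while too few coded packets leave the rank short of `k` and force retransmissions.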

  2. Twenty years of JHEP

    International Nuclear Information System (INIS)

    Amati, Daniele; Bonora, Loriano

    2017-01-01

    On July 1st of twenty years ago we launched the first issue of JHEP. It was a real challenge to try the adventure of a scientific journal conceived and managed by scientists, independent of private or institutional publishers. The idea was that the physicists who were performing the work, writing the papers, and refereeing those of their peers could also edit them in an electronic format, at a limited cost which could be handled by academic or scientific institutions, giving open access to the published information and avoiding the heavy burden at that time placed on our libraries. In order to avoid the failure that, as we well remember, was predicted to us by publishing companies and institutions, we needed the active support of the scientific community: support in collaborating on the scrutiny of papers and scientific policies but, mainly, in sending us their good papers. We contacted in that sense several of our most active and renowned colleagues, restricting ourselves of course to our high energy physics field. We received many enthusiastic responses, as can be seen from the papers that appear in the first issues of the journal, as well as the respected names listed in the advisory and editorial boards featured on the opening page of the first issue of JHEP attached to this letter. We were optimists, but perhaps not so much as to foresee the success of the journal, which, in its first 20 years, turned out to be one of the most prestigious (if not the most prestigious) in our field. For this success we must, as first project chairman and executive editor, essentially thank the scientific community, who supported JHEP by considering it "their journal", as was our intention from the start. On this occasion we would like to recall the important contribution of Marco Fabbrichesi, who designed the software on which the journal is based, and the technical personnel who implemented it and made it work. It is our pleasure to mention

  3. Implications of structural genomics target selection strategies: Pfam5000, whole genome, and random approaches

    Energy Technology Data Exchange (ETDEWEB)

    Chandonia, John-Marc; Brenner, Steven E.

    2004-07-14

    The structural genomics project is an international effort to determine the three-dimensional shapes of all important biological macromolecules, with a primary focus on proteins. Target proteins should be selected according to a strategy which is medically and biologically relevant, of good value, and tractable. As an option to consider, we present the Pfam5000 strategy, which involves selecting the 5000 most important families from the Pfam database as sources for targets. We compare the Pfam5000 strategy to several other proposed strategies that would require similar numbers of targets. These include complete solution of several small to moderately sized bacterial proteomes, partial coverage of the human proteome, and random selection of approximately 5000 targets from sequenced genomes. We measure the impact that successful implementation of these strategies would have upon structural interpretation of the proteins in Swiss-Prot, TrEMBL, and 131 complete proteomes (including 10 of eukaryotes) from the Proteome Analysis database at EBI. Solving the structures of proteins from the 5000 largest Pfam families would allow accurate fold assignment for approximately 68 percent of all prokaryotic proteins (covering 59 percent of residues) and 61 percent of eukaryotic proteins (40 percent of residues). More fine-grained coverage which would allow accurate modeling of these proteins would require an order of magnitude more targets. The Pfam5000 strategy may be modified in several ways, for example to focus on larger families, bacterial sequences, or eukaryotic sequences; as long as secondary consideration is given to large families within Pfam, coverage results vary only slightly. In contrast, focusing structural genomics on a single tractable genome would have only a limited impact in structural knowledge of other proteomes: a significant fraction (about 30-40 percent of the proteins, and 40-60 percent of the residues) of each proteome is classified in small

  4. Day-ahead load forecast using random forest and expert input selection

    International Nuclear Information System (INIS)

    Lahouar, A.; Ben Hadj Slama, J.

    2015-01-01

    Highlights: • A model based on random forests for short term load forecast is proposed. • An expert feature selection is added to refine inputs. • Special attention is paid to customers' behavior, load profile and special holidays. • The model is flexible and able to handle complex load signals. • A technical comparison is performed to assess the forecast accuracy. - Abstract: The electrical load forecast is getting more and more important in recent years due to electricity market deregulation and the integration of renewable resources. To overcome the incoming challenges and ensure accurate power prediction for different time horizons, sophisticated intelligent methods are elaborated. Utilization of intelligent forecast algorithms is among the main characteristics of smart grids, and is an efficient tool to face uncertainty. Several crucial tasks of power operators, such as load dispatch, rely on the short term forecast, thus it should be as accurate as possible. To this end, this paper proposes a short term load predictor able to forecast the next 24 h of load. Using random forest, characterized by immunity to parameter variations and internal cross validation, the model is constructed following an online learning process. The inputs are refined by expert feature selection using a set of if–then rules, in order to include the user's own specifications about the country's weather or market, and to generalize the forecast ability. The proposed approach is tested on a real historical set from the Tunisian Power Company, and the simulation shows accurate and satisfactory results for one day in advance, with an average error rarely exceeding 2.3%. The model is validated for regular working days and weekends, and special attention is paid to moving holidays following a non-Gregorian calendar.
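The "expert feature selection" stage can be pictured as a small if–then rule base that refines the raw inputs (lagged loads, calendar flags, weather) before they reach the random forest. The rules and feature names below are invented for illustration and are not taken from the Tunisian data set.

```python
from datetime import datetime

def build_features(ts, load_history, temperature):
    """Assemble the refined input vector for one forecast hour.
    `load_history` is an hourly load series ending just before `ts`;
    the if-then rules add or drop inputs by domain knowledge."""
    feats = {
        "load_lag_24h": load_history[-24],    # same hour yesterday
        "load_lag_168h": load_history[-168],  # same hour last week
        "hour": ts.hour,
        "temperature": temperature,
    }
    # Rule: weekends follow a different load profile, so flag them.
    feats["is_weekend"] = 1 if ts.weekday() >= 5 else 0
    # Rule: extreme temperatures drive heating/cooling load nonlinearly,
    # so only then add a squared term.
    if temperature > 30 or temperature < 5:
        feats["temperature_sq"] = temperature ** 2
    return feats
```

Moving holidays from a non-Gregorian calendar would be handled the same way, with an extra rule consulting a holiday table and flagging the affected days.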

  5. Field-based random sampling without a sampling frame: control selection for a case-control study in rural Africa.

    Science.gov (United States)

    Crampin, A C; Mwinuka, V; Malema, S S; Glynn, J R; Fine, P E

    2001-01-01

    Selection bias, particularly of controls, is common in case-control studies and may materially affect the results. Methods of control selection should be tailored both for the risk factors and disease under investigation and for the population being studied. We present here a control selection method devised for a case-control study of tuberculosis in rural Africa (Karonga, northern Malawi) that selects an age/sex frequency-matched random sample of the population, with a geographical distribution in proportion to the population density. We also present an audit of the selection process, and discuss the potential of this method in other settings.
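Once households have been enumerated in the field, an age/sex frequency-matched random sample like the one described reduces to stratified sampling: reproduce the cases' joint age/sex distribution while drawing individuals at random within each stratum. The sketch below assumes simple dict records and leaves out the geographical population-density weighting the authors also apply.

```python
import random
from collections import Counter

def frequency_matched_controls(cases, population, seed=0):
    """Draw one control per case, frequency-matched on (age_group, sex):
    the control sample mirrors the cases' joint age/sex distribution, with
    individuals sampled at random (without replacement) within strata."""
    rng = random.Random(seed)
    by_stratum = {}
    for person in population:
        by_stratum.setdefault((person["age_group"], person["sex"]), []).append(person)
    controls = []
    needed = Counter((c["age_group"], c["sex"]) for c in cases)
    for stratum, count in needed.items():
        controls.extend(rng.sample(by_stratum[stratum], count))
    return controls
```

In a setting without a sampling frame, `population` would come from the field enumeration itself, which is exactly the gap the paper's method fills.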

  6. Modified random hinge transport mechanics and multiple scattering step-size selection in EGS5

    International Nuclear Information System (INIS)

    Wilderman, S.J.; Bielajew, A.F.

    2005-01-01

    The new transport mechanics in EGS5 allows for significantly longer electron transport step sizes and hence shorter computation times than required for identical problems in EGS4. But as with all Monte Carlo electron transport algorithms, certain classes of problems exhibit step-size dependencies even when operating within recommended ranges, sometimes making selection of step-sizes a daunting task for novice users. Further contributing to this problem, because of the decoupling of multiple scattering and continuous energy loss in the dual random hinge transport mechanics of EGS5, there are two independent step sizes in EGS5, one for multiple scattering and one for continuous energy loss, each of which influences speed and accuracy in a different manner. Further, whereas EGS4 used a single value of fractional energy loss (ESTEPE) to determine step sizes at all energies, to increase performance by decreasing the amount of effort expended simulating lower energy particles, EGS5 permits the fractional energy loss values which are used to determine both the multiple scattering and continuous energy loss step sizes to vary with energy. This results in requiring the user to specify four fractional energy loss values when optimizing computations for speed. Thus, in order to simplify step-size selection and to mitigate step-size dependencies, a method has been devised to automatically optimize step-size selection based on a single material dependent input related to the size of problem tally region. In this paper we discuss the new transport mechanics in EGS5 and describe the automatic step-size optimization algorithm. (author)

  7. The adverse effect of selective cyclooxygenase-2 inhibitor on random skin flap survival in rats.

    Directory of Open Access Journals (Sweden)

    Haiyong Ren

    Full Text Available BACKGROUND: Cyclooxygenase-2 (COX-2) inhibitors provide desired analgesic effects after injury or surgery, but evidence suggests they may also attenuate wound healing. This study investigates the effect of a COX-2 inhibitor on random skin flap survival. METHODS: The McFarlane flap model was established in 40 rats evaluated in two groups; each group received the same volume of Parecoxib or saline injection for 7 days. The necrotic area of the flap was measured, and flap specimens were stained with haematoxylin-eosin (HE) for histologic analysis. Immunohistochemical staining was performed to analyse the levels of VEGF and COX-2. RESULTS: 7 days after operation, the flap necrotic area ratio in the study group (66.65 ± 2.81%) was significantly larger than that of the control group (48.81 ± 2.33%) (P < 0.01). Histological analysis demonstrated angiogenesis, with mean vessel density per mm(2) being lower in the study group (15.4 ± 4.4) than in the control group (27.2 ± 4.1) (P < 0.05). The expression of COX-2 and VEGF protein in intermediate area II was evaluated in the two groups by immunohistochemistry. The expression of COX-2 was (1022.45 ± 153.1) in the study group and (2638.05 ± 132.2) in the control group (P < 0.01). The expression of VEGF in the study and control groups was (2779.45 ± 472.0) vs (4938.05 ± 123.6) (P < 0.01). In the COX-2 inhibitor group, the expressions of COX-2 and VEGF protein were remarkably down-regulated compared with the control group. CONCLUSION: The selective COX-2 inhibitor had an adverse effect on random skin flap survival. Suppression of neovascularization induced by a low level of VEGF is the presumed biological mechanism.

  8. Application of random coherence order selection in gradient-enhanced multidimensional NMR

    International Nuclear Information System (INIS)

    Bostock, Mark J.; Nietlispach, Daniel

    2016-01-01

    Development of multidimensional NMR is essential to many applications, for example in high resolution structural studies of biomolecules. Multidimensional techniques enable separation of NMR signals over several dimensions, improving signal resolution, whilst also allowing identification of new connectivities. However, these advantages come at a significant cost. The Fourier transform theorem requires acquisition of a grid of regularly spaced points to satisfy the Nyquist criterion, while frequency discrimination and acquisition of a pure phase spectrum require acquisition of both quadrature components for each time point in every indirect (non-acquisition) dimension, adding a factor of 2^(N-1) to the number of free-induction decays which must be acquired, where N is the number of dimensions. Compressed sensing (CS) ℓ1-norm minimisation in combination with non-uniform sampling (NUS) has been shown to be extremely successful in overcoming the Nyquist criterion. Previously, maximum entropy reconstruction has also been used to overcome the limitation of frequency discrimination, processing data acquired with only one quadrature component at a given time interval, known as random phase detection (RPD), allowing a factor of two reduction in the number of points for each indirect dimension (Maciejewski et al. 2011 PNAS 108 16640). However, whilst this approach can be easily applied in situations where the quadrature components are acquired as amplitude modulated data, the same principle is not easily extended to phase modulated (P-/N-type) experiments where data is acquired in the form exp(iωt) or exp(-iωt), and which make up many of the multidimensional experiments used in modern NMR. Here we demonstrate a modification of the CS ℓ1-norm approach to allow random coherence order selection (RCS) for phase modulated experiments; we generalise the nomenclature for RCS and RPD as random quadrature detection (RQD). With this method, the power of RQD can be extended

  9. Outlook: The Next Twenty Years

    Energy Technology Data Exchange (ETDEWEB)

    Murayama, Hitoshi

    2003-12-07

    I present an outlook for the next twenty years in particle physics. I start with the big questions in our field, broken down into four categories: horizontal, vertical, heaven, and hell. Then I discuss how we attack the big questions in each category during the next twenty years. I argue for a synergy between many different approaches taken in our field.

  10. Random genetic drift, natural selection, and noise in human cranial evolution.

    Science.gov (United States)

    Roseman, Charles C

    2016-08-01

    This study assesses the extent to which relationships among groups complicate comparative studies of adaptation in recent human cranial variation and the extent to which departures from neutral additive models of evolution hinder the reconstruction of population relationships among groups using cranial morphology. Using a maximum likelihood evolutionary model fitting approach and a mixed population genomic and cranial data set, I evaluate the relative fits of several widely used models of human cranial evolution. Moreover, I compare the goodness of fit of models of cranial evolution constrained by genomic variation to test hypotheses about population specific departures from neutrality. Models from population genomics are much better fits to cranial variation than are traditional models from comparative human biology. There is not enough evolutionary information in the cranium to reconstruct much of recent human evolution but the influence of population history on cranial variation is strong enough to cause comparative studies of adaptation serious difficulties. Deviations from a model of random genetic drift along a tree-like population history show the importance of environmental effects, gene flow, and/or natural selection on human cranial variation. Moreover, there is a strong signal of the effect of natural selection or an environmental factor on a group of humans from Siberia. The evolution of the human cranium is complex and no one evolutionary process has prevailed at the expense of all others. A holistic unification of phenome, genome, and environmental context, gives us a strong point of purchase on these problems, which is unavailable to any one traditional approach alone. Am J Phys Anthropol 160:582-592, 2016. © 2016 Wiley Periodicals, Inc.

  11. Multi-Label Learning via Random Label Selection for Protein Subcellular Multi-Locations Prediction.

    Science.gov (United States)

    Wang, Xiao; Li, Guo-Zheng

    2013-03-12

    Prediction of protein subcellular localization is an important but challenging problem, particularly when proteins may simultaneously exist at, or move between, two or more different subcellular location sites. Most of the existing protein subcellular localization methods are only used to deal with the single-location proteins. In the past few years, only a few methods have been proposed to tackle proteins with multiple locations. However, they only adopt a simple strategy, that is, transforming the multi-location proteins to multiple proteins with single location, which doesn't take correlations among different subcellular locations into account. In this paper, a novel method named RALS (multi-label learning via RAndom Label Selection), is proposed to learn from multi-location proteins in an effective and efficient way. Through five-fold cross validation test on a benchmark dataset, we demonstrate our proposed method with consideration of label correlations obviously outperforms the baseline BR method without consideration of label correlations, indicating correlations among different subcellular locations really exist and contribute to improvement of prediction performance. Experimental results on two benchmark datasets also show that our proposed methods achieve significantly higher performance than some other state-of-the-art methods in predicting subcellular multi-locations of proteins. The prediction web server is available at http://levis.tongji.edu.cn:8080/bioinfo/MLPred-Euk/ for the public usage.
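The core idea, training base learners on randomly selected label subsets so that correlations within a subset are modeled jointly, can be sketched as below. This is closer to a generic RAkEL-style random-subset scheme than to the exact RALS procedure, and the function names are illustrative.

```python
import random

def random_label_subsets(num_labels, subset_size, num_models, seed=0):
    """Draw the random label subsets that each base classifier models jointly,
    so label correlations inside a subset can be captured."""
    rng = random.Random(seed)
    return [sorted(rng.sample(range(num_labels), subset_size))
            for _ in range(num_models)]

def aggregate_votes(subset_predictions, num_labels, threshold=0.5):
    """Combine per-subset 0/1 label predictions by majority vote over the
    models whose subset includes each label. `subset_predictions` is a list
    of (subset, {label: 0/1}) pairs."""
    votes = [0] * num_labels
    seen = [0] * num_labels
    for subset, pred in subset_predictions:
        for label in subset:
            seen[label] += 1
            votes[label] += pred[label]
    return [1 if seen[l] and votes[l] / seen[l] >= threshold else 0
            for l in range(num_labels)]
```

Labels that co-occur inside the same subset are predicted by one joint model, which is how this family of methods exploits the location correlations that the simple binary-relevance (BR) baseline ignores.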

  12. Differential privacy-based evaporative cooling feature selection and classification with relief-F and random forests.

    Science.gov (United States)

    Le, Trang T; Simmons, W Kyle; Misaki, Masaya; Bodurka, Jerzy; White, Bill C; Savitz, Jonathan; McKinney, Brett A

    2017-09-15

    Classification of individuals into disease or clinical categories from high-dimensional biological data with low prediction error is an important challenge of statistical learning in bioinformatics. Feature selection can improve classification accuracy but must be incorporated carefully into cross-validation to avoid overfitting. Recently, feature selection methods based on differential privacy, such as differentially private random forests and reusable holdout sets, have been proposed. However, for domains such as bioinformatics, where the number of features is much larger than the number of observations (p ≫ n), these differential privacy methods are susceptible to overfitting. We introduce private Evaporative Cooling, a stochastic privacy-preserving machine learning algorithm that uses Relief-F for feature selection and random forest for privacy preserving classification that also prevents overfitting. We relate the privacy-preserving threshold mechanism to a thermodynamic Maxwell-Boltzmann distribution, where the temperature represents the privacy threshold. We use the thermal statistical physics concept of Evaporative Cooling of atomic gases to perform backward stepwise privacy-preserving feature selection. On simulated data with main effects and statistical interactions, we compare accuracies on holdout and validation sets for three privacy-preserving methods: the reusable holdout, reusable holdout with random forest, and private Evaporative Cooling, which uses Relief-F feature selection and random forest classification. In simulations where interactions exist between attributes, private Evaporative Cooling provides higher classification accuracy without overfitting based on an independent validation set. In simulations without interactions, thresholdout with random forest and private Evaporative Cooling give comparable accuracies. We also apply these privacy methods to human brain resting-state fMRI data from a study of major depressive disorder. Code
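The Relief-F component can be illustrated with the original single-neighbor Relief update: sample an instance, find its nearest same-class "hit" and nearest other-class "miss", and reward features that differ on the miss but agree on the hit. This is a simplified sketch (one neighbor, two classes) rather than the full multi-neighbor Relief-F used in the paper, and it says nothing about the privacy mechanism.

```python
import random

def relief_weights(samples, labels, num_iters=50, seed=0):
    """Relief-style feature weighting for a binary-class data set of numeric
    tuples: w[f] grows when feature f separates a point from its nearest
    miss and stays close to its nearest hit. Assumes at least two samples
    per class."""
    rng = random.Random(seed)
    n_feat = len(samples[0])
    weights = [0.0] * n_feat

    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    for _ in range(num_iters):
        i = rng.randrange(len(samples))
        x, y = samples[i], labels[i]
        hit = min((j for j in range(len(samples)) if j != i and labels[j] == y),
                  key=lambda j: sq_dist(x, samples[j]))
        miss = min((j for j in range(len(samples)) if labels[j] != y),
                   key=lambda j: sq_dist(x, samples[j]))
        for f in range(n_feat):
            weights[f] += (abs(x[f] - samples[miss][f])
                           - abs(x[f] - samples[hit][f]))
    return weights
```

Because the hit/miss contrast is local, Relief-style weights can detect statistical interactions that purely univariate filters miss, which is why the paper pairs Relief-F with the random forest classifier.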

  13. Bayesian dose selection design for a binary outcome using restricted response adaptive randomization.

    Science.gov (United States)

    Meinzer, Caitlyn; Martin, Renee; Suarez, Jose I

    2017-09-08

    In phase II trials, the most efficacious dose is usually not known. Moreover, given limited resources, it is difficult to robustly identify a dose while also testing for a signal of efficacy that would support a phase III trial. Recent designs have sought to be more efficient by exploring multiple doses through the use of adaptive strategies. However, the added flexibility may potentially increase the risk of making incorrect assumptions and reduce the total amount of information available across the dose range as a function of imbalanced sample size. To balance these challenges, a novel placebo-controlled design is presented in which a restricted Bayesian response adaptive randomization (RAR) is used to allocate a majority of subjects to the optimal dose of active drug, defined as the dose with the lowest probability of poor outcome. However, the allocation between subjects who receive active drug or placebo is held constant to retain the maximum possible power for a hypothesis test of overall efficacy comparing the optimal dose to placebo. The design properties and optimization of the design are presented in the context of a phase II trial for subarachnoid hemorrhage. For a fixed total sample size, a trade-off exists between the ability to select the optimal dose and the probability of rejecting the null hypothesis. This relationship is modified by the allocation ratio between active and control subjects, the choice of RAR algorithm, and the number of subjects allocated to an initial fixed allocation period. While a responsive RAR algorithm improves the ability to select the correct dose, there is an increased risk of assigning more subjects to a worse arm as a function of ephemeral trends in the data. A subarachnoid treatment trial is used to illustrate how this design can be customized for specific objectives and available data. Bayesian adaptive designs are a flexible approach to addressing multiple questions surrounding the optimal dose for treatment efficacy
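The restricted RAR idea can be sketched with a Beta-Bernoulli model: the placebo fraction is pinned, and the active fraction is split across doses in proportion to each dose's posterior probability of having the lowest rate of poor outcome, estimated by Monte Carlo draws from the posteriors. The function name, the uniform Beta(1, 1) prior, and the placebo fraction are illustrative assumptions, not the trial's actual specification.

```python
import random

def rar_allocate(outcomes, placebo_frac=0.4, n_draws=2000, seed=0):
    """Restricted Bayesian RAR sketch. `outcomes[d] = (num_poor, num_total)`
    observed so far on dose d. Returns allocation probabilities: placebo is
    fixed at `placebo_frac`; the remaining mass is split by each dose's
    posterior probability of having the lowest poor-outcome rate."""
    rng = random.Random(seed)
    doses = sorted(outcomes)
    wins = {d: 0 for d in doses}
    for _ in range(n_draws):
        draws = {d: rng.betavariate(1 + outcomes[d][0],
                                    1 + outcomes[d][1] - outcomes[d][0])
                 for d in doses}
        wins[min(draws, key=draws.get)] += 1  # lowest sampled poor-outcome rate
    active = 1.0 - placebo_frac
    return {"placebo": placebo_frac,
            **{d: active * wins[d] / n_draws for d in doses}}
```

Holding the placebo share constant is exactly the "restriction" in the design: the dose-versus-placebo comparison keeps its sample size (and hence power) no matter how the active allocation drifts.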

  14. Using ArcMap, Google Earth, and Global Positioning Systems to select and locate random households in rural Haiti.

    Science.gov (United States)

    Wampler, Peter J; Rediske, Richard R; Molla, Azizur R

    2013-01-18

A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground based surveys to collect household location data and can be performed in a rapid, cost-effective manner. The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. A total of 537 homes initially were mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed. 16.4% of the homes visited were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. The method used to generate and field locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent and only rarely was local knowledge required to identify and locate households.
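The study's sampling step (drawing 96 of 537 mapped homes in Excel) amounts to simple random sampling without replacement; a hypothetical equivalent in Python is shown below. The ID format and seed are illustrative, not from the study.

```python
import random

def draw_survey_homes(home_ids, k, seed=2012):
    """Draw k homes without replacement from the mapped-home list."""
    rng = random.Random(seed)           # fixed seed makes the draw reproducible
    return sorted(rng.sample(home_ids, k))

homes = [f"H{i:03d}" for i in range(1, 538)]   # 537 mapped homes
survey = draw_survey_homes(homes, 96)          # 96 potential survey locations
```

In the study the resulting list was loaded onto handheld GPS units; here it would simply be exported alongside each home's coordinates.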

  15. Using ArcMap, Google Earth, and Global Positioning Systems to select and locate random households in rural Haiti

    Directory of Open Access Journals (Sweden)

    Wampler Peter J

    2013-01-01

Full Text Available Abstract Background A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground based surveys to collect household location data and can be performed in a rapid, cost-effective manner. Methods The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. Results A total of 537 homes initially were mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed. 16.4% of the homes visited were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. Conclusions The method used to generate and field locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent and only rarely was local knowledge required to identify and locate households.

  16. Peculiarities of the statistics of spectrally selected fluorescence radiation in laser-pumped dye-doped random media

    Science.gov (United States)

    Yuvchenko, S. A.; Ushakova, E. V.; Pavlova, M. V.; Alonova, M. V.; Zimnyakov, D. A.

    2018-04-01

We consider the practical realization of a new optical probing method for random media, defined as reference-free path-length interferometry with intensity-moments analysis. A peculiarity in the statistics of spectrally selected fluorescence radiation in a laser-pumped dye-doped random medium is discussed. Previously established correlations between the second- and third-order moments of the intensity fluctuations in random interference patterns, the coherence function of the probe radiation, and the path-difference probability density for the interfering partial waves in the medium are confirmed. The correlations were verified using statistical analysis of the spectrally selected fluorescence radiation emitted by a laser-pumped dye-doped random medium. A water solution of Rhodamine 6G was applied as the doping fluorescent agent for ensembles of densely packed silica grains, which were pumped by the 532 nm radiation of a solid-state laser. The spectrum of the mean path length for a random medium was reconstructed.

  17. The basic science and mathematics of random mutation and natural selection.

    Science.gov (United States)

    Kleinman, Alan

    2014-12-20

The mutation and natural selection phenomenon can and often does cause the failure of antimicrobial, herbicidal, pesticide, and cancer treatments, which act as selection pressures. This phenomenon operates in a mathematically predictable manner, which, when understood, leads to approaches that reduce and prevent the failure of these selection pressures. The mathematical behavior of mutation and selection is derived using the principles of probability theory. The derivation of the equations describing the mutation and selection phenomenon is carried out in the context of an empirical example. Copyright © 2014 John Wiley & Sons, Ltd.

  18. Selecting for Fast Protein-Protein Association As Demonstrated on a Random TEM1 Yeast Library Binding BLIP.

    Science.gov (United States)

    Cohen-Khait, Ruth; Schreiber, Gideon

    2018-04-27

Protein-protein interactions mediate the vast majority of cellular processes. Though protein interactions obey basic chemical principles within the cell as well, the in vivo physiological environment may not allow equilibrium to be reached. Thus, in vitro measured thermodynamic affinity may not provide a complete picture of protein interactions in the biological context. Binding kinetics, composed of the association and dissociation rate constants, are relevant and important in the cell. Therefore, changes in protein-protein interaction kinetics have a significant impact on the in vivo activity of the proteins. The common protocol for the selection of tighter binders from a mutant library selects for protein complexes with slower dissociation rate constants. Here we describe a method to specifically select for variants with faster association rate constants by using pre-equilibrium selection, starting from a large random library. Toward this end, we refine the selection conditions of a TEM1-β-lactamase library against its natural nanomolar-affinity binder, β-lactamase inhibitor protein (BLIP). The optimal selection conditions depend on the ligand concentration and on the incubation time. In addition, we show that a second sort of the library helps to separate signal from noise, resulting in a higher percentage of faster binders in the selected library. Fast-associating protein variants are of particular interest for drug development and other biotechnological applications.

  19. r2VIM: A new variable selection method for random forests in genome-wide association studies.

    Science.gov (United States)

    Szymczak, Silke; Holzinger, Emily; Dasgupta, Abhijit; Malley, James D; Molloy, Anne M; Mills, James L; Brody, Lawrence C; Stambolian, Dwight; Bailey-Wilson, Joan E

    2016-01-01

    Machine learning methods and in particular random forests (RFs) are a promising alternative to standard single SNP analyses in genome-wide association studies (GWAS). RFs provide variable importance measures (VIMs) to rank SNPs according to their predictive power. However, in contrast to the established genome-wide significance threshold, no clear criteria exist to determine how many SNPs should be selected for downstream analyses. We propose a new variable selection approach, recurrent relative variable importance measure (r2VIM). Importance values are calculated relative to an observed minimal importance score for several runs of RF and only SNPs with large relative VIMs in all of the runs are selected as important. Evaluations on simulated GWAS data show that the new method controls the number of false-positives under the null hypothesis. Under a simple alternative hypothesis with several independent main effects it is only slightly less powerful than logistic regression. In an experimental GWAS data set, the same strong signal is identified while the approach selects none of the SNPs in an underpowered GWAS. The novel variable selection method r2VIM is a promising extension to standard RF for objectively selecting relevant SNPs in GWAS while controlling the number of false-positive results.
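The r2VIM selection rule described above can be sketched without a full RF pipeline. In this numpy-only illustration (the threshold and toy importance scores are assumptions, not the paper's settings), each run's scores are rescaled by the absolute value of that run's minimal importance, taken as the noise level, and a SNP is kept only if its relative importance clears the threshold in every run.

```python
import numpy as np

def r2vim_select(importances, threshold=1.0):
    """importances: runs x SNPs matrix of RF variable importance scores.

    Assumes each run contains at least one (slightly) negative score, so the
    per-run minimum serves as an estimate of the noise-level importance.
    """
    imp = np.asarray(importances, dtype=float)
    rel = imp / np.abs(imp.min(axis=1, keepdims=True))  # relative VIMs per run
    # keep only SNPs whose relative VIM is large in *all* runs
    return np.where((rel >= threshold).all(axis=0))[0]

imp = np.array([[5.0,  0.2, -0.1, 3.0],
                [4.0, -0.2,  0.1, 2.5],
                [6.0,  0.0, -0.2, 3.5]])
selected = r2vim_select(imp, threshold=3.0)   # only SNPs 0 and 3 pass in all runs
```

Requiring consistency across runs is what controls false positives: a SNP that scores highly in one run by chance is unlikely to do so in every run.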

  20. Feature selection and classification of mechanical fault of an induction motor using random forest classifier

    OpenAIRE

    Patel, Raj Kumar; Giri, V.K.

    2016-01-01

    Fault detection and diagnosis is the most important technology in condition-based maintenance (CBM) system for rotating machinery. This paper experimentally explores the development of a random forest (RF) classifier, a recently emerged machine learning technique, for multi-class mechanical fault diagnosis in bearing of an induction motor. Firstly, the vibration signals are collected from the bearing using accelerometer sensor. Parameters from the vibration signal are extracted in the form of...

  1. Comparison of confirmed inactive and randomly selected compounds as negative training examples in support vector machine-based virtual screening.

    Science.gov (United States)

    Heikamp, Kathrin; Bajorath, Jürgen

    2013-07-22

    The choice of negative training data for machine learning is a little explored issue in chemoinformatics. In this study, the influence of alternative sets of negative training data and different background databases on support vector machine (SVM) modeling and virtual screening has been investigated. Target-directed SVM models have been derived on the basis of differently composed training sets containing confirmed inactive molecules or randomly selected database compounds as negative training instances. These models were then applied to search background databases consisting of biological screening data or randomly assembled compounds for available hits. Negative training data were found to systematically influence compound recall in virtual screening. In addition, different background databases had a strong influence on the search results. Our findings also indicated that typical benchmark settings lead to an overestimation of SVM-based virtual screening performance compared to search conditions that are more relevant for practical applications.

  2. Novel Zn2+-chelating peptides selected from a fimbria-displayed random peptide library

    DEFF Research Database (Denmark)

    Kjærgaard, Kristian; Schembri, Mark; Klemm, Per

    2001-01-01

The display of peptide sequences on the surface of bacteria is a technology that offers exciting applications in biotechnology and medical research. Type 1 fimbriae are surface organelles of Escherichia coli which mediate D-mannose-sensitive binding to different host surfaces by virtue of the FimH adhesin. FimH is a component of the fimbrial organelle that can accommodate and display a diverse range of peptide sequences on the E. coli cell surface. In this study we have constructed a random peptide library in FimH. The library, consisting of approximately 40 million individual clones, was screened

  3. Predictive Validity of an Empirical Approach for Selecting Promising Message Topics: A Randomized-Controlled Study

    Science.gov (United States)

    Lee, Stella Juhyun; Brennan, Emily; Gibson, Laura Anne; Tan, Andy S. L.; Kybert-Momjian, Ani; Liu, Jiaying; Hornik, Robert

    2016-01-01

    Several message topic selection approaches propose that messages based on beliefs pretested and found to be more strongly associated with intentions will be more effective in changing population intentions and behaviors when used in a campaign. This study aimed to validate the underlying causal assumption of these approaches which rely on cross-sectional belief–intention associations. We experimentally tested whether messages addressing promising themes as identified by the above criterion were more persuasive than messages addressing less promising themes. Contrary to expectations, all messages increased intentions. Interestingly, mediation analyses showed that while messages deemed promising affected intentions through changes in targeted promising beliefs, messages deemed less promising also achieved persuasion by influencing nontargeted promising beliefs. Implications for message topic selection are discussed. PMID:27867218

  4. Oracle Efficient Variable Selection in Random and Fixed Effects Panel Data Models

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl

This paper generalizes the results for the Bridge estimator of Huang et al. (2008) to linear random and fixed effects panel data models which are allowed to grow in both dimensions. In particular we show that the Bridge estimator is oracle efficient. It can correctly distinguish between relevant and irrelevant variables, and the asymptotic distribution of the estimators of the coefficients of the relevant variables is the same as if only these had been included in the model, i.e. as if an oracle had revealed the true model prior to estimation. In the case of more explanatory variables than observations, we prove that the Marginal Bridge estimator can asymptotically correctly distinguish between relevant and irrelevant explanatory variables. We do this without restricting the dependence between covariates and without assuming sub-Gaussianity of the error terms, thereby generalizing the results

  5. Presence of psychoactive substances in oral fluid from randomly selected drivers in Denmark

    DEFF Research Database (Denmark)

    Simonsen, K. Wiese; Steentoft, A.; Hels, Tove

    2012-01-01

This roadside study is the Danish part of the EU-project DRUID (Driving under the Influence of Drugs, Alcohol, and Medicines) and included three representative regions in Denmark. Oral fluid samples (n = 3002) were collected randomly from drivers using a sampling scheme stratified by time, season, and road type. The oral fluid samples were screened for 29 illegal and legal psychoactive substances and metabolites as well as ethanol. Fourteen (0.5%) drivers were positive for ethanol alone (or in combination with drugs) at concentrations above 0.53 g/l (0.5 mg/g), which is the Danish legal limit. The percentage of drivers positive for medicinal drugs above the Danish legal concentration limit was 0.4%, while 0.3% of the drivers tested positive for one or more illicit drugs at concentrations exceeding the Danish legal limit. Tetrahydrocannabinol, cocaine, and amphetamine were the most frequent illicit

  6. Correlates of smoking with socioeconomic status, leisure time physical activity and alcohol consumption among Polish adults from randomly selected regions.

    Science.gov (United States)

    Woitas-Slubowska, Donata; Hurnik, Elzbieta; Skarpańska-Stejnborn, Anna

    2010-12-01

To determine the association between smoking status and leisure time physical activity (LTPA), alcohol consumption, and socioeconomic status (SES) among Polish adults, 466 randomly selected men and women (aged 18-66 years) responded to an anonymous questionnaire regarding smoking, alcohol consumption, LTPA, and SES. Multiple logistic regression was used to examine the association of smoking status with six socioeconomic measures, level of LTPA, and frequency and type of alcohol consumed. Smokers were defined as individuals smoking occasionally or daily. The odds of being a smoker were 9 times (men) and 27 times (women) higher among respondents who drink alcohol several times a week or every day in comparison to non-drinkers. Among men, the odds of smoking were higher for those with low educational attainment compared to those with high educational attainment (p = 0.007). Among women we observed that students were the most frequent smokers. Female students were almost three times more likely to smoke than non-professional women, and two times more likely than physical workers (p = 0.018). The findings of this study indicated that among randomly selected Polish men and women aged 18-66, smoking and alcohol consumption tended to cluster. These results imply that intervention strategies need to target multiple risk factors simultaneously. The highest risk of smoking was observed among low-educated men, female students, and both men and women drinking alcohol several times a week or every day. Information on subgroups with a high risk of smoking will help in planning future preventive strategies.

  7. Genome-wide association data classification and SNPs selection using two-stage quality-based Random Forests.

    Science.gov (United States)

    Nguyen, Thanh-Tung; Huang, Joshua; Wu, Qingyao; Nguyen, Thuy; Li, Mark

    2015-01-01

Single-nucleotide polymorphism (SNP) selection and identification are the most important tasks in genome-wide association data analysis. The problem is difficult because genome-wide association data are very high dimensional and a large portion of the SNPs in the data are irrelevant to the disease. Advanced machine learning methods have been successfully used in genome-wide association studies (GWAS) for identification of genetic variants that have relatively big effects in some common, complex diseases. Among them, the most successful one is Random Forests (RF). Despite performing well in terms of prediction accuracy on some data sets of moderate size, RF still suffers when used in GWAS for selecting informative SNPs and building accurate prediction models. In this paper, we propose a new two-stage quality-based sampling method in random forests, named ts-RF, for SNP subspace selection in GWAS. The method first applies p-value assessment to find a cut-off point that separates informative and irrelevant SNPs into two groups. The informative SNPs group is further divided into two sub-groups: highly informative and weakly informative SNPs. When sampling the SNP subspace for building trees for the forest, only SNPs from these two sub-groups are taken into account. The feature subspaces always contain highly informative SNPs when used to split a node of a tree. This approach enables one to generate more accurate trees with a lower prediction error, while possibly avoiding overfitting. It allows one to detect interactions of multiple SNPs with the diseases, and to reduce the dimensionality and the amount of genome-wide association data needed for learning the RF model.
Extensive experiments on two genome-wide SNP data sets (Parkinson case-control data comprised of 408,803 SNPs and Alzheimer case-control data comprised of 380,157 SNPs) and 10 gene data sets have demonstrated that the proposed model significantly reduced prediction errors and outperformed
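The two-stage grouping at the heart of ts-RF can be sketched as follows. This is a simplified illustration, not the paper's code: the p-value cutoffs, the half-and-half split of the subspace, and the toy p-values are assumptions.

```python
import random

def ts_rf_subspace(pvalues, mtry, strong_cut=1e-5, weak_cut=0.05, seed=42):
    """Draw a feature subspace of size mtry for one tree node.

    SNPs are partitioned by p-value into highly informative, weakly
    informative, and irrelevant groups; irrelevant SNPs are never sampled.
    """
    strong = [i for i, p in enumerate(pvalues) if p < strong_cut]
    weak = [i for i, p in enumerate(pvalues) if strong_cut <= p < weak_cut]
    rng = random.Random(seed)
    # draw half the subspace from each informative group, so every subspace
    # contains highly informative SNPs
    k = mtry // 2
    return rng.sample(strong, min(k, len(strong))) + \
           rng.sample(weak, min(mtry - k, len(weak)))

# SNPs 0-1 are highly informative, 2-3 weakly informative, 4-5 irrelevant
subspace = ts_rf_subspace([1e-6, 1e-7, 0.01, 0.02, 0.5, 0.9], mtry=4)
```

Because irrelevant SNPs are excluded before subspace sampling, each node split chooses among candidates that carry signal, which is what yields the more accurate trees described above.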

  8. Twenty-first century vaccines

    Science.gov (United States)

    Rappuoli, Rino

    2011-01-01

    In the twentieth century, vaccination has been possibly the greatest revolution in health. Together with hygiene and antibiotics, vaccination led to the elimination of many childhood infectious diseases and contributed to the increase in disability-free life expectancy that in Western societies rose from 50 to 78–85 years (Crimmins, E. M. & Finch, C. E. 2006 Proc. Natl Acad. Sci. USA 103, 498–503; Kirkwood, T. B. 2008 Nat. Med 10, 1177–1185). In the twenty-first century, vaccination will be expected to eliminate the remaining childhood infectious diseases, such as meningococcal meningitis, respiratory syncytial virus, group A streptococcus, and will address the health challenges of this century such as those associated with ageing, antibiotic resistance, emerging infectious diseases and poverty. However, for this to happen, we need to increase the public trust in vaccination so that vaccines can be perceived as the best insurance against most diseases across all ages. PMID:21893537

  9. Deformation quantization: Twenty years after

    International Nuclear Information System (INIS)

    Sternheimer, Daniel

    1998-01-01

    We first review the historical developments, both in physics and in mathematics, that preceded (and in some sense provided the background of) deformation quantization. Then we describe the birth of the latter theory and its evolution in the past twenty years, insisting on the main conceptual developments and keeping here as much as possible on the physical side. For the physical part the accent is put on its relations to, and relevance for, 'conventional' physics. For the mathematical part we concentrate on the questions of existence and equivalence, including most recent developments for general Poisson manifolds; we touch also noncommutative geometry and index theorems, and relations with group theory, including quantum groups. An extensive (though very incomplete) bibliography is appended and includes background mathematical literature

  10. Capturing the Flatness of a peer-to-peer lending network through random and selected perturbations

    Science.gov (United States)

    Karampourniotis, Panagiotis D.; Singh, Pramesh; Uparna, Jayaram; Horvat, Emoke-Agnes; Szymanski, Boleslaw K.; Korniss, Gyorgy; Bakdash, Jonathan Z.; Uzzi, Brian

Null models are established tools that have been used in network analysis to uncover various structural patterns. They quantify the deviance of an observed network measure from that given by the null model. We construct a null model for weighted, directed networks to identify biased links (carrying significantly different weights than expected according to the null model) and thus quantify the flatness of the system. Using this model, we study the flatness of Kiva, a large international crowdfinancing network of borrowers and lenders, aggregated to the country level. The dataset spans the years from 2006 to 2013. Our longitudinal analysis shows that the flatness of the system is reducing over time, meaning the proportion of biased inter-country links is growing. We extend our analysis by testing the robustness of the flatness of the network under perturbations of the links' weights or the nodes themselves. Examples of such perturbations are event shocks (e.g. erecting walls) or regulatory shocks (e.g. Brexit). We find that flatness is unaffected by random shocks, but changes after shocks target links with a large weight or bias. The methods we use to capture the flatness are based on analytics, simulations, and numerical computations using Shannon's maximum entropy. Supported by ARL NS-CTA.

  11. Participant-selected music and physical activity in older adults following cardiac rehabilitation: a randomized controlled trial.

    Science.gov (United States)

    Clark, Imogen N; Baker, Felicity A; Peiris, Casey L; Shoebridge, Georgie; Taylor, Nicholas F

    2017-03-01

To evaluate effects of participant-selected music on older adults' achievement of activity levels recommended in the physical activity guidelines following cardiac rehabilitation. A parallel group randomized controlled trial with measurements at Weeks 0, 6 and 26. A multisite outpatient rehabilitation programme of a publicly funded metropolitan health service. Adults aged 60 years and older who had completed a cardiac rehabilitation programme. Experimental participants selected music to support walking with guidance from a music therapist. Control participants received usual care only. The primary outcome was the proportion of participants achieving activity levels recommended in physical activity guidelines. Secondary outcomes compared amounts of physical activity, exercise capacity, cardiac risk factors, and exercise self-efficacy. A total of 56 participants, mean age 68.2 years (SD = 6.5), were randomized to the experimental (n = 28) and control groups (n = 28). There were no differences between groups in proportions of participants achieving activity recommended in physical activity guidelines at Week 6 or 26. Secondary outcomes demonstrated between-group differences in male waist circumference at both measurements (Week 6 difference -2.0 cm, 95% CI -4.0 to 0; Week 26 difference -2.8 cm, 95% CI -5.4 to -0.1), and observed effect sizes favoured the experimental group for amounts of physical activity (d = 0.30), exercise capacity (d = 0.48), and blood pressure (d = -0.32). Participant-selected music did not increase the proportion of participants achieving recommended amounts of physical activity, but may have contributed to exercise-related benefits.

  12. Mirnacle: machine learning with SMOTE and random forest for improving selectivity in pre-miRNA ab initio prediction.

    Science.gov (United States)

    Marques, Yuri Bento; de Paiva Oliveira, Alcione; Ribeiro Vasconcelos, Ana Tereza; Cerqueira, Fabio Ribeiro

    2016-12-15

MicroRNAs (miRNAs) are key gene expression regulators in plants and animals. Therefore, miRNAs are involved in several biological processes, making the study of these molecules one of the most relevant topics of molecular biology nowadays. However, characterizing miRNAs in vivo is still a complex task. As a consequence, in silico methods have been developed to predict miRNA loci. A common ab initio strategy to find miRNAs in genomic data is to search for sequences that can fold into the typical hairpin structure of miRNA precursors (pre-miRNAs). The current ab initio approaches, however, have selectivity issues, i.e., a high number of false positives is reported, which can lead to laborious and costly attempts to provide biological validation. This study presents an extension of the ab initio method miRNAFold, with the aim of improving selectivity through machine learning techniques, namely, random forest combined with the SMOTE procedure that copes with imbalanced datasets. By comparing our method, termed Mirnacle, with other important approaches in the literature, we demonstrate that Mirnacle substantially improves selectivity without compromising sensitivity. For the three datasets used in our experiments, our method achieved at least 97% sensitivity and could deliver a two-fold, 20-fold, and 6-fold increase in selectivity, respectively, compared with the best results of current computational tools. The extension of miRNAFold by the introduction of machine learning techniques significantly increases selectivity in pre-miRNA ab initio prediction, which optimally contributes to advanced studies on miRNAs, as the need for biological validations is diminished. Hopefully, new research, such as studies of severe diseases caused by miRNA malfunction, will benefit from the proposed computational tool.
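The SMOTE step that Mirnacle uses to balance its training data can be sketched in a few lines of numpy. Each synthetic minority sample is an interpolation between a minority point and one of its k nearest minority neighbours; the toy data and k below are assumptions, not Mirnacle's settings.

```python
import numpy as np

def smote(X_min, n_new, k=3, rng=None):
    """Generate n_new synthetic samples from the minority class X_min."""
    rng = rng or np.random.default_rng(1)
    X_min = np.asarray(X_min, dtype=float)
    out = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        nn = np.argsort(d)[1:k + 1]           # k nearest neighbours (not self)
        j = rng.choice(nn)
        lam = rng.random()                    # interpolation factor in [0, 1)
        out.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.vstack(out)

# four hypothetical minority-class feature vectors, oversampled to twelve
X_minority = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
X_synth = smote(X_minority, n_new=8)
```

The oversampled set (original plus synthetic points) is then what the random forest is trained on, so true pre-miRNAs are no longer swamped by the far more numerous hairpin-like negatives.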

  13. Reduced plasma aldosterone concentrations in randomly selected patients with insulin-dependent diabetes mellitus.

    LENUS (Irish Health Repository)

    Cronin, C C

    2012-02-03

Abnormalities of the renin-angiotensin system have been reported in patients with diabetes mellitus and with diabetic complications. In this study, plasma concentrations of prorenin, renin, and aldosterone were measured in a stratified random sample of 110 insulin-dependent (Type 1) diabetic patients attending our outpatient clinic. Fifty-four age- and sex-matched control subjects were also examined. Plasma prorenin concentration was higher in patients without complications than in control subjects when upright (geometric mean (95% confidence intervals (CI): 75.9 (55.0-105.6) vs 45.1 (31.6-64.3) mU l-1, p < 0.05). There was no difference in plasma prorenin concentration between patients without and with microalbuminuria and between patients without and with background retinopathy. Plasma renin concentration, both when supine and upright, was similar in control subjects, in patients without complications, and in patients with varying degrees of diabetic microangiopathy. Plasma aldosterone was suppressed in patients without complications in comparison to control subjects (74 (58-95) vs 167 (140-199) ng l-1, p < 0.001) and was also suppressed in patients with microvascular disease. Plasma potassium was significantly higher in patients than in control subjects (mean ± standard deviation: 4.10 ± 0.36 vs 3.89 ± 0.26 mmol l-1; p < 0.001) and plasma sodium was significantly lower (138 ± 4 vs 140 ± 2 mmol l-1; p < 0.001). We conclude that plasma prorenin is not a useful early marker for diabetic microvascular disease. Despite apparently normal plasma renin concentrations, plasma aldosterone is suppressed in insulin-dependent diabetic patients.

  14. A Permutation Importance-Based Feature Selection Method for Short-Term Electricity Load Forecasting Using Random Forest

    Directory of Open Access Journals (Sweden)

    Nantian Huang

    2016-09-01

Full Text Available The prediction accuracy of short-term load forecast (STLF) depends on prediction model choice and feature selection result. In this paper, a novel random forest (RF)-based feature selection method for STLF is proposed. First, 243 related features were extracted from historical load data and the time information of prediction points to form the original feature set. Subsequently, the original feature set was used to train an RF as the original model. After the training process, the prediction error of the original model on the test set was recorded and the permutation importance (PI) value of each feature was obtained. Then, an improved sequential backward search method was used to select the optimal forecasting feature subset based on the PI value of each feature. Finally, the optimal forecasting feature subset was used to train a new RF model as the final prediction model. Experiments showed that the prediction accuracy of the RF trained by the optimal forecasting feature subset was higher than that of the original model and of comparative models based on support vector regression and artificial neural network.
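The permutation importance (PI) score used above is model-agnostic and can be sketched directly: a feature's importance is the increase in test error after its column is shuffled. In this illustration a linear toy model stands in for the trained random forest, which is an assumption of the sketch.

```python
import numpy as np

def permutation_importance(predict, X, y, rng=None):
    """PI score per feature: permuted-column test MSE minus baseline MSE."""
    rng = rng or np.random.default_rng(0)
    base = np.mean((predict(X) - y) ** 2)
    scores = []
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] = rng.permutation(Xp[:, j])   # break feature j's association
        scores.append(np.mean((predict(Xp) - y) ** 2) - base)
    return np.array(scores)

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = 3.0 * X[:, 0] + 0.1 * X[:, 1]              # feature 2 is pure noise
pi = permutation_importance(lambda X: 3.0 * X[:, 0] + 0.1 * X[:, 1], X, y)
```

Features with PI near zero contribute nothing to the prediction and are the first candidates for removal in the sequential backward search.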

  15. Nitrates and bone turnover (NABT) - trial to select the best nitrate preparation: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Bucur, Roxana C; Reid, Lauren S; Hamilton, Celeste J; Cummings, Steven R; Jamal, Sophie A

    2013-09-08

'comparisons with the best' approach for data analyses, as this strategy allows practical considerations of ease of use and tolerability to guide selection of the preparation for future studies. Data from this protocol will be used to develop a randomized, controlled trial of nitrates to prevent osteoporotic fractures. ClinicalTrials.gov Identifier: NCT01387672. Controlled-Trials.com: ISRCTN08860742.

  16. Non-Random Inversion Landscapes in Prokaryotic Genomes Are Shaped by Heterogeneous Selection Pressures.

    Science.gov (United States)

    Repar, Jelena; Warnecke, Tobias

    2017-08-01

    Inversions are a major contributor to structural genome evolution in prokaryotes. Here, using a novel alignment-based method, we systematically compare 1,651 bacterial and 98 archaeal genomes to show that inversion landscapes are frequently biased toward (symmetric) inversions around the origin-terminus axis. However, symmetric inversion bias is not a universal feature of prokaryotic genome evolution but varies considerably across clades. At the extremes, inversion landscapes in Bacillus-Clostridium and Actinobacteria are dominated by symmetric inversions, while there is little or no systematic bias favoring symmetric rearrangements in archaea with a single origin of replication. Within clades, we find strong but clade-specific relationships between symmetric inversion bias and different features of adaptive genome architecture, including the distance of essential genes to the origin of replication and the preferential localization of genes on the leading strand. We suggest that heterogeneous selection pressures have converged to produce similar patterns of structural genome evolution across prokaryotes. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  17. Fast selection of miRNA candidates based on large-scale pre-computed MFE sets of randomized sequences.

    Science.gov (United States)

    Warris, Sven; Boymans, Sander; Muiser, Iwe; Noback, Michiel; Krijnen, Wim; Nap, Jan-Peter

    2014-01-13

Small RNAs are important regulators of genome function, yet their prediction in genomes is still a major computational challenge. Statistical analyses of pre-miRNA sequences indicated that their 2D structure tends to have a minimal free energy (MFE) significantly lower than the MFE values of equivalently randomized sequences with the same nucleotide composition, in contrast to other classes of non-coding RNA. The computation of many MFEs is, however, too intensive to allow for genome-wide screenings. Using a local grid infrastructure, MFE distributions of random sequences were pre-calculated on a large scale. These distributions follow a normal distribution and can be used to determine the MFE distribution for any given sequence composition by interpolation. This allows on-the-fly calculation of the normal distribution for any candidate sequence composition. The speedup achieved makes genome-wide screening with this characteristic of a pre-miRNA sequence practical. Although this property alone is not sufficiently discriminative to distinguish miRNAs from other sequences, the MFE-based P-value should be added to the parameters of choice to be included in the selection of potential miRNA candidates for experimental verification.
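
    The MFE-based P-value described above amounts to evaluating a normal CDF whose parameters are looked up (or interpolated) for the candidate's composition. A minimal sketch, with made-up background parameters standing in for the pre-computed grid results:

    ```python
    from statistics import NormalDist

    # Hypothetical pre-computed background: (mean, sd) of MFE for randomized
    # sequences of a given base composition. Real values would come from the
    # large-scale grid computation and interpolation described above.
    mfe_background = {
        (0.25, 0.25, 0.25, 0.25): (-30.0, 5.0),
        (0.30, 0.20, 0.30, 0.20): (-35.0, 6.0),
    }

    def mfe_p_value(candidate_mfe, composition):
        """P(random sequence of this composition has an MFE <= the candidate's)."""
        mean, sd = mfe_background[composition]
        return NormalDist(mean, sd).cdf(candidate_mfe)

    # A candidate whose MFE lies 3 sd below the random background:
    p = mfe_p_value(-45.0, (0.25, 0.25, 0.25, 0.25))
    print(round(p, 5))  # 0.00135
    ```

    A low P-value means the candidate folds markedly more stably than random sequences of the same composition, one property expected of a genuine pre-miRNA.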

  18. Selecting Optimal Random Forest Predictive Models: A Case Study on Predicting the Spatial Distribution of Seabed Hardness

    Science.gov (United States)

    Li, Jin; Tran, Maggie; Siwabessy, Justy

    2016-01-01

Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia’s marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy and can be inferred from underwater video footage at limited locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models to predict seabed hardness using random forest (RF) based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods, namely variable importance (VI), averaged variable importance (AVI), knowledge-informed AVI (KIAVI), Boruta and regularized RF (RRF), were tested based on predictive accuracy. Effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach used to pre-select predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to ‘small p and large n’ problems in environmental sciences. Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency and

  19. Performance of Universal Adhesive in Primary Molars After Selective Removal of Carious Tissue: An 18-Month Randomized Clinical Trial.

    Science.gov (United States)

    Lenzi, Tathiane Larissa; Pires, Carine Weber; Soares, Fabio Zovico Maxnuck; Raggio, Daniela Prócida; Ardenghi, Thiago Machado; de Oliveira Rocha, Rachel

    2017-09-15

To evaluate the 18-month clinical performance of a universal adhesive, applied under different adhesion strategies, after selective carious tissue removal in primary molars. Forty-four subjects (five to 10 years old) contributed 90 primary molars presenting moderately deep dentin carious lesions on occlusal or occluso-proximal surfaces, which were randomly assigned to either the self-etch or the etch-and-rinse protocol of Scotchbond Universal Adhesive (3M ESPE). Resin composite was incrementally inserted for all restorations. Restorations were evaluated at one, six, 12, and 18 months using the modified United States Public Health Service criteria. Survival estimates for the restorations' longevity were evaluated using the Kaplan-Meier method, and multivariate Cox regression analysis with shared frailty was used to assess the factors associated with failures (P<0.05). Adhesion strategy did not influence the restorations' longevity (P=0.06; 72.2 percent and 89.7 percent with etch-and-rinse and self-etch mode, respectively). Self-etch and etch-and-rinse strategies did not influence the clinical behavior of the universal adhesive used in primary molars after selective carious tissue removal, although there was a tendency toward a better outcome with the self-etch strategy.

  20. DNABP: Identification of DNA-Binding Proteins Based on Feature Selection Using a Random Forest and Predicting Binding Residues.

    Science.gov (United States)

    Ma, Xin; Guo, Jing; Sun, Xiao

    2016-01-01

    DNA-binding proteins are fundamentally important in cellular processes. Several computational-based methods have been developed to improve the prediction of DNA-binding proteins in previous years. However, insufficient work has been done on the prediction of DNA-binding proteins from protein sequence information. In this paper, a novel predictor, DNABP (DNA-binding proteins), was designed to predict DNA-binding proteins using the random forest (RF) classifier with a hybrid feature. The hybrid feature contains two types of novel sequence features, which reflect information about the conservation of physicochemical properties of the amino acids, and the binding propensity of DNA-binding residues and non-binding propensities of non-binding residues. The comparisons with each feature demonstrated that these two novel features contributed most to the improvement in predictive ability. Furthermore, to improve the prediction performance of the DNABP model, feature selection using the minimum redundancy maximum relevance (mRMR) method combined with incremental feature selection (IFS) was carried out during the model construction. The results showed that the DNABP model could achieve 86.90% accuracy, 83.76% sensitivity, 90.03% specificity and a Matthews correlation coefficient of 0.727. High prediction accuracy and performance comparisons with previous research suggested that DNABP could be a useful approach to identify DNA-binding proteins from sequence information. The DNABP web server system is freely available at http://www.cbi.seu.edu.cn/DNABP/.
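
    The mRMR step described above can be sketched with a greedy ranking loop. This is a simplified illustration, not the authors' implementation: absolute Pearson correlation stands in for the mutual-information measure mRMR normally uses, and the data are invented:

    ```python
    def pearson(a, b):
        # Pearson correlation; stands in for mutual information here.
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        va = sum((x - ma) ** 2 for x in a) ** 0.5
        vb = sum((y - mb) ** 2 for y in b) ** 0.5
        return cov / (va * vb) if va and vb else 0.0

    def mrmr_rank(features, target):
        """Greedily order features: maximize relevance to the target minus
        mean redundancy with the features already selected."""
        remaining = list(range(len(features)))
        selected = []
        while remaining:
            def score(f):
                rel = abs(pearson(features[f], target))
                red = (sum(abs(pearson(features[f], features[s])) for s in selected)
                       / len(selected)) if selected else 0.0
                return rel - red
            best = max(remaining, key=score)
            selected.append(best)
            remaining.remove(best)
        return selected

    # Toy data: feature 0 predicts the target, feature 1 duplicates it,
    # feature 2 is unrelated.
    f0 = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
    f1 = list(f0)                      # redundant copy
    f2 = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0]
    target = list(f0)

    order = mrmr_rank([f0, f1, f2], target)
    print(order[0])  # 0: the informative feature is ranked first
    ```

    Incremental feature selection (IFS) would then evaluate the prefixes order[:1], order[:2], ... and keep the prefix with the best cross-validated performance.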

  1. Implementation of client versus care-provider strategies to improve external cephalic version rates: a cluster randomized controlled trial

    NARCIS (Netherlands)

    Vlemmix, Floortje; Rosman, Ageeth N.; Rijnders, Marlies E.; Beuckens, Antje; Opmeer, Brent C.; Mol, Ben W. J.; Kok, Marjolein; Fleuren, Margot A. H.

    2015-01-01

    To determine the effectiveness of a client or care-provider strategy to improve the implementation of external cephalic version. Cluster randomized controlled trial. Twenty-five clusters; hospitals and their referring midwifery practices randomly selected in the Netherlands. Singleton breech

  2. Implementation of client versus care-provider strategies to improve external cephalic version rates: a cluster randomized controlled trial

    NARCIS (Netherlands)

    Vlemmix, F.; Rosman, A.N.; Rijnders, M.E.; Beuckens, A.; Opmeer, B.C.; Mol, B.W.J.; Kok, M.; Fleuren, M.A.H.

    2015-01-01

Objective: To determine the effectiveness of a client or care-provider strategy to improve the implementation of external cephalic version. Design: Cluster randomized controlled trial. Setting: Twenty-five clusters; hospitals and their referring midwifery practices randomly selected in the

  3. Biased random key genetic algorithm with insertion and gender selection for capacitated vehicle routing problem with time windows

    Science.gov (United States)

    Rochman, Auliya Noor; Prasetyo, Hari; Nugroho, Munajat Tri

    2017-06-01

Vehicle Routing Problem (VRP) often occurs when manufacturers need to distribute their products to customers/outlets. The distribution process is typically restricted by the capacity of the vehicle and the working hours at the distributor. This type of VRP is also known as the Capacitated Vehicle Routing Problem with Time Windows (CVRPTW). A Biased Random Key Genetic Algorithm (BRKGA) was designed and coded in MATLAB to solve a CVRPTW case of soft drink distribution. The standard BRKGA was then modified by applying chromosome insertion into the initial population and defining chromosome gender for parents undergoing the crossover operation. The performance of the established algorithms was then compared to a heuristic procedure for solving the soft drink distribution. Some findings are revealed: (1) the total distribution cost of BRKGA with insertion (BRKGA-I) results in a cost saving of 39% compared to the total cost of the heuristic method; (2) BRKGA with gender selection (BRKGA-GS) could further improve on the heuristic method, although it tends to yield worse results than the standard BRKGA.
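
    The defining ingredients of a BRKGA, random-key decoding and elitist-biased crossover, can be sketched briefly. This is a generic illustration, not the authors' MATLAB code; splitting the decoded order into capacity- and time-window-feasible routes is omitted:

    ```python
    import random

    def decode(chromosome):
        """Decode a random-key chromosome into a visiting order: each gene
        is a key in [0, 1), and sorting the keys yields the permutation."""
        return sorted(range(len(chromosome)), key=lambda i: chromosome[i])

    def biased_crossover(elite, non_elite, rho=0.7, rng=random):
        """The 'biased' part of BRKGA: each gene is inherited from the
        elite parent with probability rho > 0.5."""
        return [e if rng.random() < rho else o for e, o in zip(elite, non_elite)]

    keys = [0.72, 0.05, 0.91, 0.33]      # one chromosome, four customers
    print(decode(keys))                   # [1, 3, 0, 2]
    ```

    Because any vector of keys decodes to a valid permutation, crossover and mutation never produce infeasible orderings, which is the main appeal of the random-key encoding.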

  4. Sequence-Based Prediction of RNA-Binding Proteins Using Random Forest with Minimum Redundancy Maximum Relevance Feature Selection

    Directory of Open Access Journals (Sweden)

    Xin Ma

    2015-01-01

Full Text Available The prediction of RNA-binding proteins is one of the most challenging problems in computational biology. Although some studies have investigated this problem, the accuracy of prediction is still not sufficient. In this study, a highly accurate method was developed to predict RNA-binding proteins from amino acid sequences using random forests with the minimum redundancy maximum relevance (mRMR) method, followed by incremental feature selection (IFS). We incorporated conjoint triad features and three novel features: binding propensity (BP), nonbinding propensity (NBP), and evolutionary information combined with physicochemical properties (EIPP). The results showed that these novel features play important roles in improving the performance of the predictor. Using the mRMR-IFS method, our predictor achieved the best performance (86.62% accuracy and a 0.737 Matthews correlation coefficient). The high prediction accuracy and successful prediction performance suggest that our method can be a useful approach to identify RNA-binding proteins from sequence information.

  5. Optimal Subset Selection of Time-Series MODIS Images and Sample Data Transfer with Random Forests for Supervised Classification Modelling.

    Science.gov (United States)

    Zhou, Fuqun; Zhang, Aining

    2016-10-25

Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS) datasets including 8-day composites from NASA and 10-day composites from the Canada Centre for Remote Sensing (CCRS). It is challenging to efficiently use these time-series MODIS datasets for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will be greater when Sentinel 2-3 data become available. Another challenge that researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle these two important issues with a case study of land cover mapping using CCRS 10-day MODIS composites with the help of two Random Forests features: variable importance and outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is utilized for transferring sample data available from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that using only about half of the variables we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transferring could make supervised modelling possible for applications lacking sample data.

  6. Selective serotonin reuptake inhibitors (SSRIs) for post-partum depression (PPD): a systematic review of randomized clinical trials.

    Science.gov (United States)

    De Crescenzo, Franco; Perelli, Federica; Armando, Marco; Vicari, Stefano

    2014-01-01

The treatment of postpartum depression with selective serotonin reuptake inhibitors (SSRIs) has been claimed to be both efficacious and well tolerated, but no recent systematic reviews have been conducted. A qualitative systematic review of randomized clinical trials on women with postpartum depression comparing SSRIs to placebo and/or other treatments was performed. A comprehensive literature search of online databases, the bibliographies of published articles and grey literature was conducted. Data on efficacy, acceptability and tolerability were extracted and the quality of the trials was assessed. Six randomised clinical trials, comprising 595 patients, met quality criteria for inclusion in the analysis. Cognitive-behavioural intervention, psychosocial community-based intervention, psychodynamic therapy, cognitive behavioural therapy, a second-generation tricyclic antidepressant and placebo were used as comparisons. All studies demonstrated higher response and remission rates among those treated with SSRIs and greater mean changes on depression scales, although findings were not always statistically significant. Dropout rates were high in three of the trials but similar among treatment and comparison groups. In general, SSRIs were well tolerated and trial quality was good. There are few trials; the patients included in the trials were not representative of all patients with postpartum depression; dropout rates in three trials were high; and long-term efficacy and tolerability were assessed in only two trials. SSRIs appear to be efficacious and well tolerated in the treatment of postpartum depression, but the available evidence fails to demonstrate a clear superiority over other treatments. © 2013 Elsevier B.V. All rights reserved.

  7. The Long-Term Effectiveness of a Selective, Personality-Targeted Prevention Program in Reducing Alcohol Use and Related Harms: A Cluster Randomized Controlled Trial

    Science.gov (United States)

    Newton, Nicola C.; Conrod, Patricia J.; Slade, Tim; Carragher, Natacha; Champion, Katrina E.; Barrett, Emma L.; Kelly, Erin V.; Nair, Natasha K.; Stapinski, Lexine; Teesson, Maree

    2016-01-01

    Background: This study investigated the long-term effectiveness of Preventure, a selective personality-targeted prevention program, in reducing the uptake of alcohol, harmful use of alcohol, and alcohol-related harms over a 3-year period. Methods: A cluster randomized controlled trial was conducted to assess the effectiveness of Preventure.…

  8. Capital in the Twenty-First Century

    DEFF Research Database (Denmark)

    Hansen, Per H.

    2014-01-01

Review essay on: Capital in the Twenty-First Century. By Thomas Piketty. Translated by Arthur Goldhammer. Cambridge, Mass.: The Belknap Press of Harvard University Press, 2014. viii + 685 pp.

  9. Twenty Practices of an Entrepreneurial University

    DEFF Research Database (Denmark)

    Gjerding, Allan Næs; Wilderom, Celeste P.M.; Cameron, Shona P.B.

    2006-01-01

    studies twenty organisational practices against which a University's entrepreneurship can be measured. These twenty practices or factors in effect formed the basis for an entrepreneurship audit. During a series of interviews, the extent to which the universities are seen as entrepreneurial...

  10. Topics in random walks in random environment

    International Nuclear Information System (INIS)

    Sznitman, A.-S.

    2004-01-01

Over the last twenty-five years random motions in random media have been intensively investigated and some new general methods and paradigms have by now emerged. Random walks in random environment constitute one of the canonical models of the field. However, in dimension bigger than one they are still poorly understood and many of the basic issues remain to this day unresolved. The present series of lectures attempts to give an account of the progress which has been made over the last few years, especially in the study of multi-dimensional random walks in random environment with ballistic behavior. (author)
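
    As a toy illustration of the one-dimensional model (where the theory is well understood), each site independently draws a probability of stepping right, and the walker then moves in this frozen environment. The environment law used here, uniform on [0.3, 0.7], is an arbitrary choice:

    ```python
    import random

    random.seed(1)

    env = {}

    def p_right(x):
        # i.i.d. environment: each site draws its right-step probability once,
        # lazily, and keeps it forever (the environment is "frozen").
        return env.setdefault(x, random.uniform(0.3, 0.7))

    def walk(steps):
        x = 0
        for _ in range(steps):
            x += 1 if random.random() < p_right(x) else -1
        return x

    final = walk(1000)
    print(final)
    ```

    In this setting, the law of large numbers and the sign of the limiting velocity are governed by the distribution of log((1 - p) / p) over the environment; the multi-dimensional analogues of such questions are the ones the lectures address.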

  11. Evaluation of Randomly Selected Completed Medical Records Sheets in Teaching Hospitals of Jahrom University of Medical Sciences, 2009

    Directory of Open Access Journals (Sweden)

    Mohammad Parsa Mahjob

    2011-06-01

Full Text Available Background and objective: Medical record documentation is often used to protect patients' legal rights and also provides information for medical research, general studies, education of health care staff and qualitative surveys. There is a need to control the amount of data entered in patients' medical record sheets, since these sheets are often completed after service delivery to the patients. Therefore, in this study the completeness of medical history, operation report, and physician order sheets by different documenters in Jahrom teaching hospitals during 2009 was analyzed. Methods and Materials: In this descriptive, retrospective study, 400 medical record sheets of patients from the two teaching hospitals affiliated with Jahrom University of Medical Sciences were randomly selected. The data collection tool was a checklist based on the content of the medical history sheet, operation report and physician order sheets. The data were analyzed with SPSS (version 10) and Microsoft Office Excel 2003. Results: The average completeness of personal (demographic) data entered in the medical history, physician order and operation report sheets by department secretaries was 32.9, 35.8 and 40.18 percent, respectively. The average completeness of clinical data entered by physicians in the medical history sheet was 38 percent. Surgical data entered by the surgeon in the operation report sheet was 94.77 percent complete. The average completeness of data entered by operating room nurses in the operation report sheet was 36.78 percent, and the average completeness of physician order data entered by physicians in the physician order sheet was 99.3 percent. Conclusion: According to this study, the completeness of the record sheets reviewed in Jahrom teaching hospitals was not desirable and in some cases was very weak. This deficiency was due to different reasons, such as negligence by medical record documenters, lack of adequate education for documenters, high work

  12. Water chemistry in 179 randomly selected Swedish headwater streams related to forest production, clear-felling and climate.

    Science.gov (United States)

    Löfgren, Stefan; Fröberg, Mats; Yu, Jun; Nisell, Jakob; Ranneby, Bo

    2014-12-01

From a policy perspective, it is important to understand forestry effects on surface waters from a landscape perspective. The EU Water Framework Directive demands remedial actions if good ecological status is not achieved. In Sweden, 44% of the surface water bodies have moderate ecological status or worse. Many of these drain catchments with a mosaic of managed forests. It is important for the forestry sector and water authorities to be able to identify where, in the forested landscape, special precautions are necessary. The aim of this study was to quantify the relations between forestry parameters and headwater stream concentrations of nutrients, organic matter and acid-base chemistry. The results are put into the context of regional climate, sulphur and nitrogen deposition, as well as marine influences. Water chemistry was measured in 179 randomly selected headwater streams from two regions in southwest and central Sweden, corresponding to 10% of the Swedish land area. Forest status was determined from satellite images and Swedish National Forest Inventory data using the probabilistic classifier method, which was used to model stream water chemistry with Bayesian model averaging. The results indicate that concentrations of, e.g., nitrogen, phosphorus and organic matter are related to factors associated with forest production, but that it is not forestry per se that causes the excess losses. Instead, factors simultaneously affecting forest production and stream water chemistry, such as climate, extensive soil pools and nitrogen deposition, are the most likely candidates. The relationships with clear-felled and wetland areas are likely to be direct effects.

  13. Proceedings of the Twenty-Third Annual Software Engineering Workshop

    Science.gov (United States)

    1999-01-01

The Twenty-third Annual Software Engineering Workshop (SEW) provided 20 presentations designed to further the goals of the Software Engineering Laboratory (SEL) of NASA-GSFC. The presentations were selected for their creativity. The sessions, held on 2-3 December 1998, centered on the SEL, Experimentation, Inspections, Fault Prediction, Verification and Validation, and Embedded Systems and Safety-Critical Systems.

  14. Convergence analysis for Latin-hypercube lattice-sample selection strategies for 3D correlated random hydraulic-conductivity fields

    OpenAIRE

    Simuta-Champo, R.; Herrera-Zamarrón, G. S.

    2010-01-01

    The Monte Carlo technique provides a natural method for evaluating uncertainties. The uncertainty is represented by a probability distribution or by related quantities such as statistical moments. When the groundwater flow and transport governing equations are solved and the hydraulic conductivity field is treated as a random spatial function, the hydraulic head, velocities and concentrations also become random spatial functions. When that is the case, for the stochastic simulation of groundw...

  15. A comparison of random forest and its Gini importance with standard chemometric methods for the feature selection and classification of spectral data

    Directory of Open Access Journals (Sweden)

    Himmelreich Uwe

    2009-07-01

Full Text Available Abstract Background Regularized regression methods such as principal component or partial least squares regression perform well in learning tasks on high dimensional spectral data, but cannot explicitly eliminate irrelevant features. The random forest classifier with its associated Gini feature importance, on the other hand, allows for an explicit feature elimination, but may not be optimally adapted to spectral data due to the topology of its constituent classification trees, which are based on orthogonal splits in feature space. Results We propose to combine the best of both approaches, and evaluated the joint use of a feature selection based on a recursive feature elimination using the Gini importance of random forests together with regularized classification methods on spectral data sets from medical diagnostics, chemotaxonomy, biomedical analytics, food science, and synthetically modified spectral data. Here, a feature selection using the Gini feature importance with a regularized classification by discriminant partial least squares regression performed as well as or better than filtering according to different univariate statistical tests, or using regression coefficients in a backward feature elimination. It outperformed the direct application of the random forest classifier, or the direct application of the regularized classifiers on the full set of features. Conclusion The Gini importance of the random forest provided superior means for measuring feature relevance on spectral data, but, on an optimal subset of features, the regularized classifiers might be preferable over the random forest classifier, in spite of their limitation to model linear dependencies only. A feature selection based on Gini importance, however, may precede a regularized linear classification to identify this optimal subset of features, earning a double benefit of both dimensionality reduction and the elimination of noise from the classification task.
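
    The recursive feature elimination loop described above is simple to sketch. In practice the Gini importances would be re-estimated from a random forest refitted after each elimination; here a fixed, made-up importance table and hypothetical band names are used:

    ```python
    def recursive_feature_elimination(features, importances, n_keep):
        """Drop the least-important feature one at a time until n_keep remain."""
        active = list(features)
        while len(active) > n_keep:
            worst = min(active, key=lambda f: importances[f])
            active.remove(worst)
        return active

    # Hypothetical Gini importances for four spectral bands (made-up values;
    # a real run would take these from a fitted random forest and would
    # refit the forest after each elimination step).
    gini = {"band_a": 0.40, "band_b": 0.10, "band_c": 0.30, "band_d": 0.20}
    kept = recursive_feature_elimination(list(gini), gini, n_keep=2)
    print(kept)  # ['band_a', 'band_c']
    ```

    The surviving subset would then be handed to the regularized classifier (e.g., discriminant partial least squares), realizing the two-stage scheme the abstract recommends.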

  16. Affinity selection of Nipah and Hendra virus-related vaccine candidates from a complex random peptide library displayed on bacteriophage virus-like particles

    Energy Technology Data Exchange (ETDEWEB)

    Peabody, David S.; Chackerian, Bryce; Ashley, Carlee; Carnes, Eric; Negrete, Oscar

    2017-01-24

The invention relates to virus-like particles of bacteriophage MS2 (MS2 VLPs) displaying peptide epitopes or peptide mimics of epitopes of the Nipah Virus envelope glycoprotein that elicit an immune response against Nipah Virus upon vaccination of humans or animals. Affinity selection on Nipah Virus-neutralizing monoclonal antibodies using random sequence peptide libraries on MS2 VLPs selected peptides with sequence similarity to peptide sequences found within the envelope glycoprotein of Nipah itself, thus identifying the epitopes the antibodies recognize. The selected peptide sequences themselves are not necessarily identical in all respects to a sequence within the Nipah Virus glycoprotein, and therefore may be referred to as epitope mimics. VLPs displaying these epitope mimics can serve as a vaccine. On the other hand, display of the corresponding wild-type sequence derived from Nipah Virus, corresponding to the epitope mapped by affinity selection, may also be used as a vaccine.

  17. Selected CD133⁺ progenitor cells to promote angiogenesis in patients with refractory angina: final results of the PROGENITOR randomized trial.

    Science.gov (United States)

    Jimenez-Quevedo, Pilar; Gonzalez-Ferrer, Juan Jose; Sabate, Manel; Garcia-Moll, Xavier; Delgado-Bolton, Roberto; Llorente, Leopoldo; Bernardo, Esther; Ortega-Pozzi, Aranzazu; Hernandez-Antolin, Rosana; Alfonso, Fernando; Gonzalo, Nieves; Escaned, Javier; Bañuelos, Camino; Regueiro, Ander; Marin, Pedro; Fernandez-Ortiz, Antonio; Neves, Barbara Das; Del Trigo, Maria; Fernandez, Cristina; Tejerina, Teresa; Redondo, Santiago; Garcia, Eulogio; Macaya, Carlos

    2014-11-07

Refractory angina constitutes a clinical problem. The aim of this study was to assess the safety and feasibility of transendocardial injection of CD133(+) cells to foster angiogenesis in patients with refractory angina. In this randomized, double-blinded, multicenter controlled trial, eligible patients were treated with granulocyte colony-stimulating factor, underwent an apheresis and electromechanical mapping, and were randomized to receive treatment with CD133(+) cells or no treatment. The primary end point was the safety of transendocardial injection of CD133(+) cells, as measured by the occurrence of major adverse cardiac and cerebrovascular events at 6 months. Secondary end points analyzed the efficacy. Twenty-eight patients were included (n=19 treatment; n=9 control). At 6 months, 1 patient in each group had ventricular fibrillation and 1 patient in each group died. One patient (treatment group) had a cardiac tamponade during mapping. There were no significant differences between groups with respect to efficacy parameters; however, the comparison within groups showed a significant improvement in the number of angina episodes per month (median absolute difference, -8.5 [95% confidence interval, -15.0 to -4.0]) and in angina functional class in the treatment arm but not in the control group. At 6 months, only one single-photon emission computed tomography (SPECT) parameter, the summed score, improved significantly in the treatment group at rest and at stress (median absolute difference, -1.0 [95% confidence interval, -1.9 to -0.1]) but not in the control arm. Our findings support the feasibility and safety of transendocardial injection of CD133(+) cells in patients with refractory angina. The promising clinical results and favorable data observed in the SPECT summed score may set up the basis to test the efficacy of cell therapy in a larger randomized trial. © 2014 American Heart Association, Inc.

  18. Twenty-four hour care for schizophrenia.

    Science.gov (United States)

    Macpherson, Rob; Edwards, Thomas Rhys; Chilvers, Rupatharshini; David, Chris; Elliott, Helen J

    2009-04-15

Despite modern treatment approaches and a focus on community care, there remains a group of people who cannot easily be discharged from psychiatric hospital directly into the community. Twenty-four hour residential rehabilitation (a 'ward-in-a-house') is one model of care that has evolved in association with psychiatric hospital closure programmes. The objective was to determine the effects of 24 hour residential rehabilitation compared with standard treatment within a hospital setting. We searched the Cochrane Schizophrenia Group Trials Register (May 2002 and February 2004). We included all randomised or quasi-randomised trials that compared 24 hour residential rehabilitation with standard care for people with severe mental illness. Studies were reliably selected, quality assessed and data extracted. Data were excluded where more than 50% of participants in any group were lost to follow-up. For binary outcomes we calculated the relative risk and its 95% confidence interval. We identified and included one study with 22 participants that had important methodological shortcomings and limitations of reporting. The two-year controlled study evaluated "new long stay patients" in a hostel ward in the UK. One outcome, 'unable to manage in the placement', provided usable data (n=22, RR 7.0, 95% CI 0.4 to 121.4). The trial reported that hostel ward residents developed superior domestic skills, used more facilities in the community and were more likely to engage in constructive activities than those in hospital, although usable numerical data were not reported. These potential advantages were not purchased at a price. The limited economic data were not good, but the cost of providing 24 hour care did not seem clearly different from the standard care provided by the hospital, and it may have been less. From the single, small and ill-reported included study, the hostel ward type of facility appeared cheaper and positively effective.
Currently, the value of this way of supporting people - which could be
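For a binary outcome such as 'unable to manage in the placement', the relative risk and its 95% confidence interval are computed on the log scale, where log(RR) is approximately normal. A minimal sketch with hypothetical 2×2 counts (not the trial's actual data):

```python
import math

def relative_risk_ci(a, n1, c, n2, z=1.96):
    """Relative risk of an event in the treatment arm (a events out of n1)
    versus the control arm (c events out of n2), with a confidence interval
    built on the log scale. Requires a > 0 and c > 0."""
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)  # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: 5/10 events in one arm vs. 1/10 in the other
rr, lo, hi = relative_risk_ci(5, 10, 1, 10)
print(f"RR = {rr:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
```

A very wide interval like the trial's (0.4 to 121.4) is typical when, as here, only a handful of events are observed in each arm.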

  19. Randomized trial of switching from prescribed non-selective non-steroidal anti-inflammatory drugs to prescribed celecoxib

    DEFF Research Database (Denmark)

    Macdonald, Thomas M; Hawkey, Chris J; Ford, Ian

    2017-01-01

    BACKGROUND: Selective cyclooxygenase-2 inhibitors and conventional non-selective non-steroidal anti-inflammatory drugs (nsNSAIDs) have been associated with adverse cardiovascular (CV) effects. We compared the CV safety of switching to celecoxib vs. continuing nsNSAID therapy in a European setting...

  20. Empirical versus Random Item Selection in the Design of Intelligence Test Short Forms--The WISC-R Example.

    Science.gov (United States)

    Goh, David S.

    1979-01-01

    The advantages of using psychometric theory to design short forms of intelligence tests are demonstrated by comparing such usage to a systematic random procedure that has previously been used. The Wechsler Intelligence Scale for Children-Revised (WISC-R) Short Form is presented as an example. (JKS)

  1. Do vouchers lead to sorting under random private-school selection? Evidence from the Milwaukee voucher program

    OpenAIRE

    Chakrabarti, Rajashri

    2009-01-01

    This paper analyzes the effect of school vouchers on student sorting - defined as a flight to private schools by high-income and committed public-school students - and whether vouchers can be designed to reduce or eliminate it. Much of the existing literature investigates sorting in cases where private schools can screen students. However, publicly funded U.S. voucher programs require a private school to accept all students unless it is oversubscribed and to pick students randomly if it is ov...

  2. Identification and DNA fingerprinting of Legionella strains by randomly amplified polymorphic DNA analysis.

    OpenAIRE

    Bansal, N S; McDonell, F

    1997-01-01

    The randomly amplified polymorphic DNA (RAPD) technique was used in the development of a fingerprinting (typing) and identification protocol for Legionella strains. Twenty decamer random oligonucleotide primers were screened for their discriminatory abilities. Two candidate primers were selected. By using a combination of these primers, RAPD analysis allowed for the differentiation between all different species, between the serogroups, and further differentiation between subtypes of the same ...

  3. SNPs selected by information content outperform randomly selected microsatellite loci for delineating genetic identification and introgression in the endangered dark European honeybee (Apis mellifera mellifera).

    Science.gov (United States)

    Muñoz, Irene; Henriques, Dora; Jara, Laura; Johnston, J Spencer; Chávez-Galarza, Julio; De La Rúa, Pilar; Pinto, M Alice

    2017-07-01

    The honeybee (Apis mellifera) has been threatened by multiple factors including pests and pathogens, pesticides and loss of locally adapted gene complexes due to replacement and introgression. In western Europe, the genetic integrity of the native A. m. mellifera (M-lineage) is endangered due to trading and intensive queen breeding with commercial subspecies of eastern European ancestry (C-lineage). Effective conservation actions require reliable molecular tools to identify pure-bred A. m. mellifera colonies. Microsatellites have been preferred for identification of A. m. mellifera stocks across conservation centres. However, owing to high throughput, easy transferability between laboratories and low genotyping error, SNPs promise to become popular. Here, we compared the resolving power of a widely utilized microsatellite set to detect structure and introgression with that of different sets that combine a variable number of SNPs selected for their information content and genomic proximity to the microsatellite loci. Unlike every SNP data set, the microsatellites did not discriminate between the two lineages in the PCA space. Mean introgression proportions were identical across the two marker types, although at the individual level, microsatellites' performance was relatively poor at the upper range of Q-values, a result reflected by their lower precision. Our results suggest that SNPs are more accurate and powerful than microsatellites for identification of A. m. mellifera colonies, especially when they are selected by information content. © 2016 John Wiley & Sons Ltd.

  4. The concentration of heavy metals: zinc, cadmium, lead, copper, mercury, iron and calcium in head hair of a randomly selected sample of Kenyan people

    International Nuclear Information System (INIS)

    Wandiga, S.O.; Jumba, I.O.

    1982-01-01

    An intercomparative analysis of the concentration of heavy metals (zinc, cadmium, lead, copper, mercury, iron and calcium) in head hair of a randomly selected sample of Kenyan people, using the techniques of atomic absorption spectrophotometry (AAS) and differential pulse anodic stripping voltammetry (DPAS), has been undertaken. The percent relative standard deviation for each sample analysed using either of the techniques shows good sensitivity and correlation between the techniques. The DPAS was found to be slightly more sensitive than the AAS instrument used. The recalculated body burden ratios of Cd to Zn and Pb to Fe reveal no unusual health impairment symptoms and suggest a relatively clean environment in Kenya. (author)

  5. The Twenty Statement Test in Teacher Development

    Directory of Open Access Journals (Sweden)

    Ahmet Aypay

    2011-10-01

    Full Text Available The purpose of this study is to describe teacher characteristics using the Twenty Statements Test (TST). The study group includes a total of thirty-five individuals: teachers, guidance counselors and research assistants. The study used a qualitative approach to teacher identity. The TST is one of the qualitative methods used to determine individual self-conceptualization. The study group was requested to write twenty statements describing themselves, in response to the question “Who am I?”, in a free format. The findings indicated that teachers were overwhelmingly in group c (reflective). No differences were found in terms of gender or profession. Only a few significant differences were found based on marital status. The utility of the TST in teacher training and development was discussed.

  6. The Twenty Statement Test in Teacher Development

    Directory of Open Access Journals (Sweden)

    Ahmet Aypay

    2011-04-01

    Full Text Available The purpose of this study is to describe teacher characteristics using the Twenty Statements Test (TST). The study group includes a total of thirty-five individuals: teachers, guidance counselors and research assistants. The study used a qualitative approach to teacher identity. The TST is one of the qualitative methods used to determine individual self-conceptualization. The study group was requested to write twenty statements describing themselves, in response to the question “Who am I?”, in a free format. The findings indicated that teachers were overwhelmingly in group c (reflective). No differences were found in terms of gender or profession. Only a few significant differences were found based on marital status. The utility of the TST in teacher training and development was discussed.

  7. Twenty years of diffraction at the Tevatron

    International Nuclear Information System (INIS)

    Goulianos, K.; Rockefeller U.

    2005-01-01

    Results on diffractive particle interactions from the Fermilab Tevatron p̄p collider are placed in perspective through a QCD-inspired phenomenological approach, which exploits scaling and factorization properties observed in data. The results discussed are those obtained by the CDF Collaboration from a comprehensive set of single, double, and multigap soft and hard diffraction processes studied during the twenty-year period since 1985, when the CDF diffractive program was proposed and the first Blois Workshop was held.

  8. Safety regulation - twenty years after Chernobyl accident

    International Nuclear Information System (INIS)

    Aleksashin, P.P.; Bukrinskij, A.M.; Gordon, B.G.

    2006-01-01

    The main stages of development, successes, achievements and shortcomings of the supervision body's activity after the Chernobyl NPP accident are analysed. The changes made to the functions of the state supervision department are assessed. The results of the twenty-year period of improvement of the supervision body are summed up, and measures for increasing the efficiency of its operation are outlined

  9. "Open mesh" or "strictly selected population" recruitment? The experience of the randomized controlled MeMeMe trial

    Directory of Open Access Journals (Sweden)

    Cortellini M

    2017-07-01

    Full Text Available Mauro Cortellini, Franco Berrino, Patrizia Pasanisi Department of Preventive & Predictive Medicine, Foundation IRCCS National Cancer Institute of Milan, Milan, Italy Abstract: Among randomized controlled trials (RCTs, trials for primary prevention require large samples and long follow-up to obtain a high-quality outcome; therefore the recruitment process and the drop-out rates largely dictate the adequacy of the results. We are conducting a Phase III trial on persons with metabolic syndrome to test the hypothesis that comprehensive lifestyle changes and/or metformin treatment prevents age-related chronic diseases (the MeMeMe trial, EudraCT number: 2012-005427-32, also registered on ClinicalTrials.gov [NCT02960711]. Here, we briefly analyze and discuss the reasons which may lead to participants dropping out from trials. In our experience, participants may back out of a trial for different reasons. Drug-induced side effects are certainly the most compelling reason. But what are the other reasons, relating to the participants’ perception of the progress of the trial which led them to withdraw after randomization? What about the time-dependent drop-out rate in primary prevention trials? The primary outcome of this analysis is the point of drop-out from trial, defined as the time from the randomization date to the withdrawal date. Survival functions were non-parametrically estimated using the product-limit estimator. The curves were statistically compared using the log-rank test (P=0.64, not significant. Researchers involved in primary prevention RCTs seem to have to deal with the paradox of the proverbial “short blanket syndrome”. Recruiting only highly motivated candidates might be useful for the smooth progress of the trial but it may lead to a very low enrollment rate. On the other hand, what about enrolling all the eligible subjects without considering their motivation? This might boost the enrollment rate, but it can lead to biased

  10. "Open mesh" or "strictly selected population" recruitment? The experience of the randomized controlled MeMeMe trial.

    Science.gov (United States)

    Cortellini, Mauro; Berrino, Franco; Pasanisi, Patrizia

    2017-01-01

    Among randomized controlled trials (RCTs), trials for primary prevention require large samples and long follow-up to obtain a high-quality outcome; therefore the recruitment process and the drop-out rates largely dictate the adequacy of the results. We are conducting a Phase III trial on persons with metabolic syndrome to test the hypothesis that comprehensive lifestyle changes and/or metformin treatment prevents age-related chronic diseases (the MeMeMe trial, EudraCT number: 2012-005427-32, also registered on ClinicalTrials.gov [NCT02960711]). Here, we briefly analyze and discuss the reasons which may lead to participants dropping out from trials. In our experience, participants may back out of a trial for different reasons. Drug-induced side effects are certainly the most compelling reason. But what are the other reasons, relating to the participants' perception of the progress of the trial which led them to withdraw after randomization? What about the time-dependent drop-out rate in primary prevention trials? The primary outcome of this analysis is the point of drop-out from trial, defined as the time from the randomization date to the withdrawal date. Survival functions were non-parametrically estimated using the product-limit estimator. The curves were statistically compared using the log-rank test ( P =0.64, not significant). Researchers involved in primary prevention RCTs seem to have to deal with the paradox of the proverbial "short blanket syndrome". Recruiting only highly motivated candidates might be useful for the smooth progress of the trial but it may lead to a very low enrollment rate. On the other hand, what about enrolling all the eligible subjects without considering their motivation? This might boost the enrollment rate, but it can lead to biased results on account of large proportions of drop-outs. Our experience suggests that participants do not change their mind depending on the allocation group (intervention or control). There is no single
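The product-limit estimator named above is the Kaplan–Meier estimator of the survival function, here applied to time-to-drop-out rather than time-to-death. A minimal sketch with made-up withdrawal times (not MeMeMe data):

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.
    times:  time from randomization to withdrawal or censoring
    events: 1 = drop-out observed, 0 = censored (e.g. still enrolled at study end)
    Returns the step function as a list of (time, S(t)) pairs."""
    data = sorted(zip(times, events))
    at_risk = len(data)          # everyone is at risk at time zero
    s, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        d = 0                    # drop-outs observed at time t
        removed = 0              # all participants leaving the risk set at t
        while i < len(data) and data[i][0] == t:
            d += data[i][1]
            removed += 1
            i += 1
        if d:                    # the curve only steps down at event times
            s *= 1 - d / at_risk
            curve.append((t, s))
        at_risk -= removed
    return curve

# Hypothetical months-to-drop-out; 0 marks participants censored at study end
times  = [2, 3, 3, 5, 8, 8, 12, 12]
events = [1, 1, 0, 1, 1, 0,  0,  0]
for t, s in kaplan_meier(times, events):
    print(f"S({t}) = {s:.3f}")
```

Comparing two such curves (intervention vs. control) with the log-rank test is what produced the trial's P=0.64 above.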

  11. Evolution of Randomized Trials in Advanced/Metastatic Soft Tissue Sarcoma: End Point Selection, Surrogacy, and Quality of Reporting.

    Science.gov (United States)

    Zer, Alona; Prince, Rebecca M; Amir, Eitan; Abdul Razak, Albiruni

    2016-05-01

    Randomized controlled trials (RCTs) in soft tissue sarcoma (STS) have used varying end points. The surrogacy of intermediate end points, such as progression-free survival (PFS), response rate (RR), and 3-month and 6-month PFS (3moPFS and 6moPFS) with overall survival (OS), remains unknown. The quality of efficacy and toxicity reporting in these studies is also uncertain. A systematic review of systemic therapy RCTs in STS was performed. Surrogacy between intermediate end points and OS was explored using weighted linear regression for the hazard ratio for OS with the hazard ratio for PFS or the odds ratio for RR, 3moPFS, and 6moPFS. The quality of reporting for efficacy and toxicity was also evaluated. Fifty-two RCTs published between 1974 and 2014, comprising 9,762 patients, met the inclusion criteria. There were significant correlations between PFS and OS (R = 0.61) and between RR and OS (R = 0.51). Conversely, there were nonsignificant correlations between 3moPFS and 6moPFS with OS. A reduction in the use of RR as the primary end point was observed over time, favoring time-based events (P for trend = .02). In 14% of RCTs, the primary end point was not met, but the study was reported as being positive. Toxicity was comprehensively reported in 47% of RCTs, whereas 14% inadequately reported toxicity. In advanced STS, PFS and RR seem to be appropriate surrogates for OS. There is poor correlation between OS and both 3moPFS and 6moPFS. As such, caution is urged with the use of these as primary end points in randomized STS trials. The quality of toxicity reporting and interpretation of results is suboptimal. © 2016 by American Society of Clinical Oncology.
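Trial-level surrogacy analyses of this kind typically regress the log hazard ratio for OS on the log hazard ratio for the candidate surrogate, weighting each trial (for example by its size), and report the weighted correlation R. A minimal weighted least-squares sketch; all trial values and weights below are invented for illustration, not taken from the review:

```python
import math

def weighted_regression(x, y, w):
    """Weighted least squares of y on x; returns the slope, intercept,
    and weighted correlation coefficient R."""
    sw = sum(w)
    xb = sum(wi * xi for wi, xi in zip(w, x)) / sw   # weighted mean of x
    yb = sum(wi * yi for wi, yi in zip(w, y)) / sw   # weighted mean of y
    sxy = sum(wi * (xi - xb) * (yi - yb) for wi, xi, yi in zip(w, x, y))
    sxx = sum(wi * (xi - xb) ** 2 for wi, xi in zip(w, x))
    syy = sum(wi * (yi - yb) ** 2 for wi, yi in zip(w, y))
    slope = sxy / sxx
    intercept = yb - slope * xb
    r = sxy / math.sqrt(sxx * syy)
    return slope, intercept, r

# Invented per-trial hazard ratios (surrogate = PFS, outcome = OS),
# weighted by trial size
log_hr_pfs = [math.log(h) for h in (0.55, 0.70, 0.80, 0.95, 1.10)]
log_hr_os  = [math.log(h) for h in (0.70, 0.80, 0.85, 1.00, 1.05)]
weights    = [120, 250, 90, 300, 150]
slope, intercept, r = weighted_regression(log_hr_pfs, log_hr_os, weights)
print(f"slope = {slope:.2f}, R = {r:.2f}")
```

An R near 1 (as in this toy data) would support surrogacy; the modest R = 0.61 for PFS in the review above is why the authors urge caution.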

  12. Organic Ferroelectric-Based 1T1T Random Access Memory Cell Employing a Common Dielectric Layer Overcoming the Half-Selection Problem.

    Science.gov (United States)

    Zhao, Qiang; Wang, Hanlin; Ni, Zhenjie; Liu, Jie; Zhen, Yonggang; Zhang, Xiaotao; Jiang, Lang; Li, Rongjin; Dong, Huanli; Hu, Wenping

    2017-09-01

    Organic electronics based on poly(vinylidenefluoride/trifluoroethylene) (P(VDF-TrFE)) dielectric is facing great challenges in flexible circuits. As one indispensable part of integrated circuits, there is an urgent demand for low-cost and easy-fabrication nonvolatile memory devices. A breakthrough is made with a novel ferroelectric random access memory cell (1T1T FeRAM cell) consisting of one selection transistor and one ferroelectric memory transistor, in order to overcome the half-selection problem. Unlike complicated manufacturing using multiple dielectrics, this system simplifies 1T1T FeRAM cell fabrication by using one common dielectric. To achieve this goal, a strategy for semiconductor/insulator (S/I) interface modulation is put forward and applied to nonhysteretic selection transistors with high performances for driving or addressing purposes. As a result, high hole mobility of 3.81 cm² V⁻¹ s⁻¹ (average) for 2,6-diphenylanthracene (DPA) and electron mobility of 0.124 cm² V⁻¹ s⁻¹ (average) for N,N'-1H,1H-perfluorobutyl dicyanoperylenecarboxydiimide (PDI-FCN₂) are obtained in selection transistors. In this work, we demonstrate this technology's potential for organic ferroelectric-based pixelated memory module fabrication. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Varying levels of difficulty index of skills-test items randomly selected by examinees on the Korean emergency medical technician licensing examination

    Directory of Open Access Journals (Sweden)

    Bongyeun Koh

    2016-01-01

    Full Text Available Purpose: The goal of this study was to characterize the difficulty index of the items in the skills test components of the class I and II Korean emergency medical technician licensing examination (KEMTLE), which requires examinees to select items randomly. Methods: The results of 1,309 class I KEMTLE examinations and 1,801 class II KEMTLE examinations in 2013 were subjected to analysis. Items from the basic and advanced skills test sections of the KEMTLE were compared to determine whether some were significantly more difficult than others. Results: In the class I KEMTLE, all 4 of the items on the basic skills test showed significant variation in difficulty index (P<0.01), as well as 4 of the 5 items on the advanced skills test (P<0.05). In the class II KEMTLE, 4 of the 5 items on the basic skills test showed significantly different difficulty index (P<0.01), as well as all 3 of the advanced skills test items (P<0.01). Conclusion: In the skills test components of the class I and II KEMTLE, the procedure in which examinees randomly select questions should be revised to require examinees to respond to a set of fixed items in order to improve the reliability of the national licensing examination.

  14. Varying levels of difficulty index of skills-test items randomly selected by examinees on the Korean emergency medical technician licensing examination.

    Science.gov (United States)

    Koh, Bongyeun; Hong, Sunggi; Kim, Soon-Sim; Hyun, Jin-Sook; Baek, Milye; Moon, Jundong; Kwon, Hayran; Kim, Gyoungyong; Min, Seonggi; Kang, Gu-Hyun

    2016-01-01

    The goal of this study was to characterize the difficulty index of the items in the skills test components of the class I and II Korean emergency medical technician licensing examination (KEMTLE), which requires examinees to select items randomly. The results of 1,309 class I KEMTLE examinations and 1,801 class II KEMTLE examinations in 2013 were subjected to analysis. Items from the basic and advanced skills test sections of the KEMTLE were compared to determine whether some were significantly more difficult than others. In the class I KEMTLE, all 4 of the items on the basic skills test showed significant variation in difficulty index (P<0.01), as well as 4 of the 5 items on the advanced skills test (P<0.05). In the class II KEMTLE, 4 of the 5 items on the basic skills test showed significantly different difficulty index (P<0.01), as well as all 3 of the advanced skills test items (P<0.01). In the skills test components of the class I and II KEMTLE, the procedure in which examinees randomly select questions should be revised to require examinees to respond to a set of fixed items in order to improve the reliability of the national licensing examination.
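In classical test theory, an item's difficulty index is simply the proportion of examinees who perform the item correctly, so the fairness concern above is that randomly drawn items have unequal proportions. A small sketch with hypothetical counts (the item names and numbers are illustrative, not KEMTLE data):

```python
def difficulty_index(correct, attempted):
    """Classical item difficulty: proportion of examinees passing the item."""
    return correct / attempted

# Hypothetical per-item pass counts for one randomly assigned skills station
items = {"item A": (312, 400), "item B": (240, 400), "item C": (150, 400)}
p = {name: difficulty_index(c, n) for name, (c, n) in items.items()}
spread = max(p.values()) - min(p.values())
for name, v in p.items():
    print(f"{name}: P = {v:.2f}")
print(f"spread = {spread:.2f}")  # a large spread means unequal item difficulty
```

A large spread like this one is the situation the authors flag: an examinee's score partly depends on which item chance assigned.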

  15. Effects of choice architecture and chef-enhanced meals on the selection and consumption of healthier school foods: a randomized clinical trial.

    Science.gov (United States)

    Cohen, Juliana F W; Richardson, Scott A; Cluggish, Sarah A; Parker, Ellen; Catalano, Paul J; Rimm, Eric B

    2015-05-01

    Little is known about the long-term effect of a chef-enhanced menu on healthier food selection and consumption in school lunchrooms. In addition, it remains unclear if extended exposure to other strategies to promote healthier foods (eg, choice architecture) also improves food selection or consumption. To evaluate the short- and long-term effects of chef-enhanced meals and extended exposure to choice architecture on healthier school food selection and consumption. A school-based randomized clinical trial was conducted during the 2011-2012 school year among 14 elementary and middle schools in 2 urban, low-income school districts (intent-to-treat analysis). Included in the study were 2638 students in grades 3 through 8 attending participating schools (38.4% of eligible participants). Schools were first randomized to receive a professional chef to improve school meal palatability (chef schools) or to a delayed intervention (control group). To assess the effect of choice architecture (smart café), all schools after 3 months were then randomized to the smart café intervention or to the control group. School food selection was recorded, and consumption was measured using plate waste methods. After 3 months, vegetable selection increased in chef vs control schools (odds ratio [OR], 1.75; 95% CI, 1.36-2.24), but there was no effect on the selection of other components or on meal consumption. After long-term or extended exposure to the chef or smart café intervention, fruit selection increased in the chef (OR, 3.08; 95% CI, 2.23-4.25), smart café (OR, 1.45; 95% CI, 1.13-1.87), and chef plus smart café (OR, 3.10; 95% CI, 2.26-4.25) schools compared with the control schools, and consumption increased in the chef schools (OR, 0.17; 95% CI, 0.03-0.30 cups/d). Vegetable selection increased in the chef (OR, 2.54; 95% CI, 1.83-3.54), smart café (OR, 1.91; 95% CI, 1.46-2.50), and chef plus smart café schools (OR, 7.38, 95% CI, 5.26-10.35) compared with the control schools

  16. A randomized trial on mineralocorticoid receptor blockade in men: effects on stress responses, selective attention, and memory.

    Science.gov (United States)

    Cornelisse, Sandra; Joëls, Marian; Smeets, Tom

    2011-12-01

    Corticosteroids, released in high amounts after stress, exert their effects via two different receptors in the brain: glucocorticoid receptors (GRs) and mineralocorticoid receptors (MRs). GRs have a role in normalizing stress-induced effects and promoting consolidation, while MRs are thought to be important in determining the threshold for activation of the hypothalamic-pituitary-adrenal (HPA) axis. We investigated the effects of MR blockade on HPA axis responses to stress and stress-induced changes in cognitive function. In a double-blind, placebo-controlled study, 64 healthy young men received 400 mg of the MR antagonist spironolactone or placebo. After 1.5 h, they were exposed to either a Trier Social Stress Test or a non-stressful control task. Responses to stress were evaluated by hormonal, subjective, and physiological measurements. Afterwards, selective attention, working memory, and long-term memory performance were assessed. Spironolactone increased basal salivary cortisol levels as well as cortisol levels in response to stress. Furthermore, spironolactone significantly impaired selective attention, but only in the control group. The stress group receiving spironolactone showed impaired working memory performance. By contrast, long-term memory was enhanced in this group. These data support a role of MRs in the regulation of the HPA axis under basal conditions as well as in response to stress. The increased availability of cortisol after spironolactone treatment implies enhanced GR activation, which, in combination with MR blockade, presumably resulted in a decreased MR/GR activation ratio. This condition influences both selective attention and performance in various memory tasks.

  17. The next twenty years - IAEA's role

    International Nuclear Information System (INIS)

    Tape, G.F.

    1977-01-01

    The twentieth anniversary of an institution is an appropriate time to look back and to ask what has been achieved. It is also an appropriate time to look ahead and to ask what should be the mission for the future. How can the strengths of the International Atomic Energy Agency (IAEA) be best utilized, what new opportunities should be seized upon, and what challenges should the IAEA be prepared to meet in the next twenty years? Forward planning is a very necessary activity in today's world. There are so many demands on national or institutional resources that careful analysis of options is necessary to establish priorities and ultimately to provide for implementation. But such planning must be done carefully with full appreciation for the validity and sensitivity of the input assumptions and data. Furthermore, today's plan, while setting goals and directions, cannot be so inflexible that it cannot be responsive to ever-changing political, economic and technical constraints or opportunities. Thus in looking ahead, the plan must contain provisions for flexibility to provide for further modifications in the light of ever-changing knowledge, attitudes, and world conditions. The experience of the past five years in the energy field, and especially in nuclear energy, underscores this need. In looking ahead for the next twenty years, we are attempting to describe the International Atomic Energy Agency and its role through the twentieth century. In doing so, we are automatically laying the base for the Agency's work going into the twenty-first century. In short, we are trying to visualize a programme that can serve the coming generation and, in doing so, creating a base from which the needs of the succeeding generation can be met. This is a large order and the crystal ball is less than clear. (author)

  18. Twenty-five years of simulator training

    International Nuclear Information System (INIS)

    Anon.

    2002-01-01

    The first training simulator for nuclear power plant personnel in Germany was commissioned twenty-five years ago. The strategy of training by simulators was developed and pursued consistently and continuously in order to ensure sound training of nuclear power plant personnel. The present thirteen simulators cover a broad range of plants. A systematic training concept also helps to ensure a high level of competence and permanent qualification of plant personnel. The anniversary was marked by a festive event at which Erich K. Steiner read a paper on 'The Importance of Simulator Training', and Professor Dr. Adolf Birkhofer spoke about 'Nuclear Technology Education and Training'. (orig.)

  19. Radiation curing - twenty five years on

    International Nuclear Information System (INIS)

    Garnett, J.L.

    1995-01-01

    Progress in UV/EB curing during the past twenty-five years is briefly reviewed. During this time, developments in unique polymer chemistry and novel equipment design, together with the introduction of relevant educational programmes, have enabled radiation curing to become an established technology with specific strengths in certain industries. Possible reasons for the emergence of the technology in these niche markets are discussed. Despite the worldwide recession, radiation curing is shown to be expanding at 5% per annum, with the prospect of higher growth with improving economic conditions. (Author)

  20. EcmPred: Prediction of extracellular matrix proteins based on random forest with maximum relevance minimum redundancy feature selection

    KAUST Repository

    Kandaswamy, Krishna Kumar Umar

    2013-01-01

    The extracellular matrix (ECM) is a major component of tissues of multicellular organisms. It consists of secreted macromolecules, mainly polysaccharides and glycoproteins. Malfunctions of ECM proteins lead to severe disorders such as Marfan syndrome, osteogenesis imperfecta, numerous chondrodysplasias, and skin diseases. In this work, we report a random forest approach, EcmPred, for the prediction of ECM proteins from protein sequences. EcmPred was trained on a dataset containing 300 ECM and 300 non-ECM proteins and tested on a dataset containing 145 ECM and 4187 non-ECM proteins. EcmPred achieved 83% accuracy on the training and 77% on the test dataset. EcmPred predicted 15 out of 20 experimentally verified ECM proteins. By scanning the entire human proteome, we predicted novel ECM proteins, validated with gene ontology and InterPro. The dataset and a standalone version of the EcmPred software are available at http://www.inb.uni-luebeck.de/tools-demos/Extracellular_matrix_proteins/EcmPred. © 2012 Elsevier Ltd.
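Maximum relevance minimum redundancy (mRMR) selection greedily picks features that are informative about the class label while penalizing similarity to features already chosen. The sketch below substitutes absolute Pearson correlation for the mutual-information scores used in mRMR proper, and the data and feature roles are synthetic, not EcmPred's:

```python
import numpy as np

def mrmr_select(X, y, k):
    """Greedy mRMR-style selection with |Pearson correlation| as the score:
    relevance  = |corr(feature, label)|
    redundancy = mean |corr(feature, already-selected features)|"""
    n_features = X.shape[1]
    relevance = np.array(
        [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_features)]
    )
    selected = [int(np.argmax(relevance))]   # start with the most relevant
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(n_features):
            if j in selected:
                continue
            redundancy = np.mean(
                [abs(np.corrcoef(X[:, j], X[:, s])[0, 1]) for s in selected]
            )
            score = relevance[j] - redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected

# Toy data: feature 1 nearly duplicates feature 0, feature 2 is pure noise
rng = np.random.default_rng(0)
x0, x2, x3 = rng.standard_normal((3, 2000))
x1 = x0 + 0.05 * rng.standard_normal(2000)   # redundant copy of feature 0
y = (x0 + x3 > 0).astype(float)              # label depends on features 0 and 3
X = np.column_stack([x0, x1, x2, x3])
print(mrmr_select(X, y, k=2))  # feature 3 plus one of the twins 0/1, never both
```

The redundancy penalty is what keeps both near-duplicate features out of the selected set, which plain relevance ranking would not do.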

  1. Food pantry selection solutions: a randomized controlled trial in client-choice food pantries to nudge clients to targeted foods.

    Science.gov (United States)

    Wilson, Norbert L W; Just, David R; Swigert, Jeffery; Wansink, Brian

    2017-06-01

    Food pantries and food banks are interested in cost-effective methods to encourage the selection of targeted foods without restricting choices. Thus, this study evaluates the effectiveness of nudges toward targeted foods. In October/November 2014, we manipulated the display of a targeted product in a New York State food pantry. We evaluated the binary choice of the targeted good when we placed it in the front or the back of the category line (placement order) and when we presented the product in its original box or unboxed (packaging). The average uptake proportion for the back treatment was 0.231, 95% CI = 0.179, 0.29, n = 205, and for the front treatment, the proportion was 0.337, 95% CI = 0.272, 0.406, n = 238 with an odds ratio of 1.688, 95% CI = 1.088, 2.523. The average uptake for the unboxed treatment was 0.224, 95% CI = 0.174, 0.280, n = 255, and for the boxed intervention, the proportion was 0.356, 95% CI = 0.288, 0.429, n = 188 with an odds ratio of 1.923, 95% CI = 1.237, 2.991. Nudges increased uptake of the targeted food. The findings also hold when we control for a potential confounder. Low cost and unobtrusive nudges can be effective tools for food pantry organizers to encourage the selection of targeted foods. NCT02403882. © The Author 2016. Published by Oxford University Press on behalf of Faculty of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
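The odds ratios above compare the odds of uptake between the two display conditions. A quick sketch of that computation from the rounded proportions reported for the placement-order comparison (so it recovers the reported 1.688 only approximately):

```python
def odds_ratio(p1, p0):
    """Odds ratio for uptake proportion p1 (treatment) vs. p0 (comparison)."""
    return (p1 / (1 - p1)) / (p0 / (1 - p0))

# Rounded uptake proportions from the placement-order comparison above
front, back = 0.337, 0.231
print(f"OR = {odds_ratio(front, back):.2f}")  # ≈ 1.69, near the reported 1.688
```

The small discrepancy from 1.688 arises because the published figure was computed from raw counts rather than the rounded proportions used here.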

  2. Recruitment strategies should not be randomly selected: empirically improving recruitment success and diversity in developmental psychology research

    Science.gov (United States)

    Sugden, Nicole A.; Moulson, Margaret C.

    2015-01-01

    Psychological and developmental research has been critiqued for the lack of diversity of research samples. Because differences in culture, race, and ethnicity can influence participant behavior, limited diversity limits the generalizability of the findings. These differences may also impact how participants behave in response to recruitment attempts, which suggests that recruitment itself may be leveraged to increase sample diversity. The goal of the current study was to determine what factors, within a recruitment interaction, could be leveraged to increase success and diversity when recruiting families with children for developmental research. Study 1 found that three factors influenced success: (1) recruitment was more successful when other potential participants were also interested (i.e., recruiters were busy), (2) recruiters of particular races were more successful than recruiters of other races, and (3) differences in success were related to what the recruiter said to engage the potential participant (i.e., the script). The latter two factors interacted, suggesting some recruiters were using less optimal scripts. To improve success rates, study 2 randomly assigned scripts to recruiters and encouraged them to recruit more vigorously during busy periods. Study 2 found that two factors influenced success: (1) some scripts were more successful than others and (2) we were more successful at recruiting non-White potential participants than White participants. These two interacted, with some scripts being more successful with White and other scripts being more successful with non-White families. This intervention significantly increased recruitment success rate by 8.1% and the overall number of families recruited by 15.3%. These findings reveal that empirically evaluating and tailoring recruitment efforts based on the most successful strategies is effective in boosting diversity through increased participation of children from non-White families. PMID:25972829

  3. Suicide in Nepal: a modified psychological autopsy investigation from randomly selected police cases between 2013 and 2015.

    Science.gov (United States)

    Hagaman, Ashley K; Khadka, S; Lohani, S; Kohrt, B

    2017-12-01

    Yearly, 600,000 people complete suicide in low- and middle-income countries, accounting for 75% of the world's burden of suicide mortality. The highest regional rates are in South and East Asia. Nepal has one of the highest suicide rates in the world; however, few investigations exploring patterns surrounding both male and female suicides exist. This study used psychological autopsies to identify common factors, precipitating events, and warning signs in a diverse sample. From 302 police case reports filed over 24 months, 39 completed suicide cases in one urban and one rural region of Nepal were randomly sampled for psychological autopsy. In the total police sample (n = 302), 57.0% of deaths were male. Over 40% of deaths were 25 years or younger, including 65% of rural and 50.8% of female suicide deaths. We estimate the crude urban and rural suicide rates to be 16.1 and 22.8 per 100,000, respectively. Within our psychological autopsy sample, 38.5% met criteria for depression, and only 23.1% of informants believed that the deceased had thoughts of self-harm or suicide before death. Important warning signs included recent geographic migration, alcohol abuse, and family history of suicide. Suicide prevention strategies in Nepal should account for the lack of awareness about suicide risk among family members and the early age of suicide completion, especially in rural and female populations. Given the low rates of ideation disclosure to friends and family, educating the general public about other signs of suicide may help prevention efforts in Nepal.

  4. Misuse of randomization

    DEFF Research Database (Denmark)

    Liu, Jianping; Kjaergard, Lise Lotte; Gluud, Christian

    2002-01-01

    The quality of randomization of Chinese randomized trials on herbal medicines for hepatitis B was assessed. The search strategy and inclusion criteria were based on the published protocol. One hundred and seventy-six randomized clinical trials (RCTs) involving 20,452 patients with chronic hepatitis B virus (HBV) infection were identified that tested Chinese medicinal herbs. They were published in 49 Chinese journals. Only 10% (18/176) of the studies reported the method by which they randomized patients. Only two reported allocation concealment and were considered adequate. Twenty percent (30...

  5. Selection of drug resistant mutants from random library of Plasmodium falciparum dihydrofolate reductase in Plasmodium berghei model

    Directory of Open Access Journals (Sweden)

    Yuthavong Yongyuth

    2011-05-01

    Background The prevalence of drug resistance amongst the human malaria Plasmodium species has most commonly been associated with genomic mutation within the parasites. This phenomenon necessitates evolutionary predictive studies of possible resistance mutations, which may occur when a new drug is introduced. Therefore, identification of possible new Plasmodium falciparum dihydrofolate reductase (PfDHFR) mutants that confer resistance to antifolate drugs is essential in the process of antifolate anti-malarial drug development. Methods A system to identify mutations in the Pfdhfr gene that confer antifolate drug resistance was developed using an animal Plasmodium parasite model. By using error-prone PCR and Plasmodium transfection technologies, libraries of Pfdhfr mutants were generated and then episomally transfected into Plasmodium berghei parasites, from which pyrimethamine-resistant PfDHFR mutants were selected. Results The principal mutation found in this experiment was S108N, coincident with the first pyrimethamine-resistance mutation isolated from the field. A transgenic P. berghei, in which the endogenous Pbdhfr allele was replaced with the mutant PfdhfrS108N, was generated and confirmed to have a normal growth rate compared with the parental non-transgenic parasite and to confer resistance to pyrimethamine. Conclusion This study demonstrated the power of the transgenic P. berghei system to predict drug-resistant Pfdhfr mutations in an in vivo parasite/host setting. The system could be utilized for identification of possible novel drug-resistant mutants that could arise against new antifolate compounds and for predicting the evolution of resistance mutations.

  6. The characterization of twenty sequenced human genomes.

    Directory of Open Access Journals (Sweden)

    Kimberly Pelak

    2010-09-01

    We present the analysis of twenty human genomes to evaluate the prospects for identifying rare functional variants that contribute to a phenotype of interest. We sequenced at high coverage ten "case" genomes from individuals with severe hemophilia A and ten "control" genomes. We summarize the number of genetic variants emerging from a study of this magnitude, and provide a proof of concept for the identification of rare and highly penetrant functional variants by confirming that the cause of hemophilia A is easily recognizable in this data set. We also show that the number of novel single nucleotide variants (SNVs) discovered per genome seems to stabilize at about 144,000 new variants per genome, after the first 15 individuals have been sequenced. Finally, we find that, on average, each genome carries 165 homozygous protein-truncating or stop loss variants in genes representing a diverse set of pathways.

  7. Twenty five years of fundamental theory

    International Nuclear Information System (INIS)

    Bell, J.S.

    1980-01-01

    In reviewing the last twenty-five years in fundamental physics theory, it is noted that there has been no revolution in this field. In the absence of gravitation, Lorentz invariance remains a requirement on fundamental laws. Einstein's theory of gravitation inspires increasing conviction on the astronomical scale. Quantum theory remains the framework for all serious effort in microphysics, and quantum electrodynamics remains the model of a fully articulated microphysical theory, completely successful in its domain. However, a number of ideas have appeared, of great theoretical interest and some phenomenological success, which may well contribute to the next decisive step. Recent work on the following topics is mentioned: gravitational radiation, singularities, black-body radiation from black holes, gauge and hidden symmetry in quantum electrodynamics, the renormalization of electromagnetic and weak interaction theory, non-Abelian gauge theories, magnetic monopoles as the most striking example of solitons, and supersymmetry. (UK)

  8. Managing salinity in Upper Colorado River Basin streams: Selecting catchments for sediment control efforts using watershed characteristics and random forests models

    Science.gov (United States)

    Tillman, Fred; Anning, David W.; Heilman, Julian A.; Buto, Susan G.; Miller, Matthew P.

    2018-01-01

    Elevated concentrations of dissolved solids (salinity), including calcium, sodium, sulfate, and chloride, among others, in the Colorado River cause substantial problems for its water users. Previous efforts to reduce dissolved solids in upper Colorado River basin (UCRB) streams often focused on reducing suspended-sediment transport to streams, but few studies have investigated the relationship between suspended sediment and salinity, or evaluated which watershed characteristics might be associated with this relationship. Are there catchment properties that may help in identifying areas where control of suspended sediment will also reduce salinity transport to streams? A random forests classification analysis was performed on topographic, climate, land cover, geology, rock chemistry, soil, and hydrologic information in 163 UCRB catchments. Two random forests models were developed in this study: one for exploring stream and catchment characteristics associated with stream sites where dissolved solids increase with increasing suspended-sediment concentration, and the other for predicting where such sites are located in unmonitored reaches. Variable-importance results from the exploratory random forests model indicate that no simple source, geochemical process, or transport mechanism can easily explain the relationship between dissolved-solids and suspended-sediment concentrations at UCRB monitoring sites. Among the most important watershed characteristics in both models were measures of soil hydraulic conductivity, soil erodibility, minimum catchment elevation, catchment area, and the silt component of soil in the catchment. Predictions at key locations in the basin were combined with observations from selected monitoring sites and presented in map form to give a complete understanding of where catchment sediment-control practices would also benefit control of dissolved solids in streams.
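    The study's random forests analysis is not published as code; the toy sketch below (pure Python, with synthetic data and hypothetical feature names standing in for the paper's catchment characteristics) only illustrates the general mechanism the abstract describes: an ensemble of trees grown on bootstrap samples and random feature subsets, with variable importance read off from how often each predictor wins a split.

    ```python
    import random

    # Illustrative sketch only -- NOT the study's code. Synthetic data and
    # hypothetical feature names stand in for the paper's catchment
    # characteristics. A bagged ensemble of one-split "stumps" with random
    # feature subsetting shows how random-forest variable importance can
    # flag which predictors separate the two classes of stream sites.
    random.seed(1)
    FEATURES = ["soil_hydraulic_cond", "soil_erodibility", "min_elevation",
                "catchment_area", "silt_fraction"]

    def make_site():
        x = [random.random() for _ in FEATURES]
        y = 1 if x[0] + x[1] > 1.0 else 0   # synthetic rule: first two features matter
        return x, y

    data = [make_site() for _ in range(500)]

    def stump_error(sample, f, thr):
        # misclassification count of the better of the two one-split rules
        err = sum((x[f] > thr) != y for x, y in sample)
        return min(err, len(sample) - err)

    counts = {f: 0 for f in FEATURES}        # how often each feature wins a split
    for _ in range(300):                     # 300 bootstrap "trees"
        sample = [random.choice(data) for _ in range(len(data))]
        subset = random.sample(range(len(FEATURES)), 3)   # random feature subset
        _, best_f = min((stump_error(sample, f, t), f)
                        for f in subset for t in (0.25, 0.5, 0.75))
        counts[FEATURES[best_f]] += 1

    ranking = sorted(counts, key=counts.get, reverse=True)
    print(ranking)
    ```

    A real random forests model would grow full trees and typically use permutation or Gini importance; the stump ensemble only mirrors the importance-counting idea on a small scale.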

  9. Effect of a Counseling Session Bolstered by Text Messaging on Self-Selected Health Behaviors in College Students: A Preliminary Randomized Controlled Trial.

    Science.gov (United States)

    Sandrick, Janice; Tracy, Doreen; Eliasson, Arn; Roth, Ashley; Bartel, Jeffrey; Simko, Melanie; Bowman, Tracy; Harouse-Bell, Karen; Kashani, Mariam; Vernalis, Marina

    2017-05-17

    The college experience is often the first time when young adults live independently and make their own lifestyle choices. These choices affect dietary behaviors, exercise habits, techniques to deal with stress, and decisions on sleep time, all of which direct the trajectory of future health. There is a need for effective strategies that will encourage healthy lifestyle choices in young adults attending college. This preliminary randomized controlled trial tested the effect of coaching and text messages (short message service, SMS) on self-selected health behaviors in the domains of diet, exercise, stress, and sleep. A second analysis measured the ripple effect of the intervention on health behaviors not specifically selected as a goal by participants. Full-time students aged 18-30 years were recruited by word of mouth and campuswide advertisements (flyers, posters, mailings, university website) at a small university in western Pennsylvania from January to May 2015. Exclusions included pregnancy, eating disorders, chronic medical diagnoses, and prescription medications other than birth control. Of 60 participants, 30 were randomized to receive a single face-to-face meeting with a health coach to review results of behavioral questionnaires and to set a health behavior goal for the 8-week study period. The face-to-face meeting was followed by SMS text messages designed to encourage achievement of the behavioral goal. A total of 30 control subjects underwent the same health and behavioral assessments at intake and program end but did not receive coaching or SMS text messages. The texting app showed that 87.31% (2187/2505) of messages were viewed by intervention participants. Furthermore, 28 of the 30 intervention participants and all 30 control participants provided outcome data. Among intervention participants, 22 of 30 (73%) showed improvement in health behavior goal attainment, with the whole group (n=30) showing a mean improvement of 88% (95% CI 39-136). Mean

  10. A theory for the origin of a self-replicating chemical system. I - Natural selection of the autogen from short, random oligomers

    Science.gov (United States)

    White, D. H.

    1980-01-01

    A general theory is presented for the origin of a self-replicating chemical system, termed an autogen, which is capable of both crude replication and translation (protein synthesis). The theory requires the availability of free energy and monomers to the system, a significant background low-yield synthesis of kinetically stable oligopeptides and oligonucleotides, the localization of the oligomers, crude oligonucleotide selectivity of amino acids during oligopeptide synthesis, crude oligonucleotide replication, and two short peptide families which catalyze replication and translation, to produce a localized group of at least one copy each of two protogenes and two protoenzymes. The model posits a process of random oligomerization, followed by the random nucleation of functional components and the rapid autocatalytic growth of the functioning autogen to macroscopic amounts, to account for the origin of the first self-replicating system. Such a process contains steps of such high probability and short time periods that it is suggested that the emergence of an autogen in a laboratory experiment of reasonable time scale may be possible.

  11. Effects of Video Game Training on Measures of Selective Attention and Working Memory in Older Adults: Results from a Randomized Controlled Trial

    Science.gov (United States)

    Ballesteros, Soledad; Mayas, Julia; Prieto, Antonio; Ruiz-Marquez, Eloísa; Toril, Pilar; Reales, José M.

    2017-01-01

    Video game training with older adults potentially enhances aspects of cognition that decline with aging and could therefore offer a promising training approach. Although previously published studies suggest that training can produce transfer, many of them have certain shortcomings. This randomized controlled trial (RCT; Clinicaltrials.gov ID: NCT02796508) tried to overcome some of these limitations by incorporating an active control group and the assessment of motivation and expectations. Seventy-five older volunteers were randomly assigned to the experimental group trained for 16 sessions with non-action video games from Lumosity, a commercial platform (http://www.lumosity.com/), or to an active control group trained for the same number of sessions with simulation strategy games. The final sample included 55 older adults (30 in the experimental group and 25 in the active control group). Participants were tested individually before and after training to assess working memory (WM) and selective attention and also reported their perceived improvement, motivation and engagement. The results showed improved performance across the training sessions. The main results were: (1) the experimental group did not show greater improvements in measures of selective attention and working memory than the active control group (the opposite occurred in the oddball task); (2) a marginal training effect was observed for the N-back task, but not for the Stroop task, while both groups improved in the Corsi Blocks task. Based on these results, one can conclude that training with non-action games provides modest benefits for untrained tasks. The effect is not specific to that kind of training, as a similar effect was observed for strategy video games. Groups did not differ in motivation, engagement or expectations. PMID:29163136

  12. Effects of Video Game Training on Measures of Selective Attention and Working Memory in Older Adults: Results from a Randomized Controlled Trial

    Directory of Open Access Journals (Sweden)

    Soledad Ballesteros

    2017-11-01

    Video game training with older adults potentially enhances aspects of cognition that decline with aging and could therefore offer a promising training approach. Although previously published studies suggest that training can produce transfer, many of them have certain shortcomings. This randomized controlled trial (RCT; Clinicaltrials.gov ID: NCT02796508) tried to overcome some of these limitations by incorporating an active control group and the assessment of motivation and expectations. Seventy-five older volunteers were randomly assigned to the experimental group trained for 16 sessions with non-action video games from Lumosity, a commercial platform (http://www.lumosity.com/), or to an active control group trained for the same number of sessions with simulation strategy games. The final sample included 55 older adults (30 in the experimental group and 25 in the active control group). Participants were tested individually before and after training to assess working memory (WM) and selective attention and also reported their perceived improvement, motivation and engagement. The results showed improved performance across the training sessions. The main results were: (1) the experimental group did not show greater improvements in measures of selective attention and working memory than the active control group (the opposite occurred in the oddball task); (2) a marginal training effect was observed for the N-back task, but not for the Stroop task, while both groups improved in the Corsi Blocks task. Based on these results, one can conclude that training with non-action games provides modest benefits for untrained tasks. The effect is not specific to that kind of training, as a similar effect was observed for strategy video games. Groups did not differ in motivation, engagement or expectations.

  13. Effects of Video Game Training on Measures of Selective Attention and Working Memory in Older Adults: Results from a Randomized Controlled Trial.

    Science.gov (United States)

    Ballesteros, Soledad; Mayas, Julia; Prieto, Antonio; Ruiz-Marquez, Eloísa; Toril, Pilar; Reales, José M

    2017-01-01

    Video game training with older adults potentially enhances aspects of cognition that decline with aging and could therefore offer a promising training approach. Although previously published studies suggest that training can produce transfer, many of them have certain shortcomings. This randomized controlled trial (RCT; Clinicaltrials.gov ID: NCT02796508) tried to overcome some of these limitations by incorporating an active control group and the assessment of motivation and expectations. Seventy-five older volunteers were randomly assigned to the experimental group trained for 16 sessions with non-action video games from Lumosity, a commercial platform (http://www.lumosity.com/), or to an active control group trained for the same number of sessions with simulation strategy games. The final sample included 55 older adults (30 in the experimental group and 25 in the active control group). Participants were tested individually before and after training to assess working memory (WM) and selective attention and also reported their perceived improvement, motivation and engagement. The results showed improved performance across the training sessions. The main results were: (1) the experimental group did not show greater improvements in measures of selective attention and working memory than the active control group (the opposite occurred in the oddball task); (2) a marginal training effect was observed for the N-back task, but not for the Stroop task, while both groups improved in the Corsi Blocks task. Based on these results, one can conclude that training with non-action games provides modest benefits for untrained tasks. The effect is not specific to that kind of training, as a similar effect was observed for strategy video games. Groups did not differ in motivation, engagement or expectations.

  14. A patient and community-centered approach selecting endpoints for a randomized trial of a novel advance care planning tool

    Directory of Open Access Journals (Sweden)

    Bridges JFP

    2018-02-01

    John FP Bridges,1,2 Norah L Crossnohere,2 Anne L Schuster,1 Judith A Miller,3 Carolyn Pastorini,3,† Rebecca A Aslakson2,4,5 1Department of Health Policy and Management, The Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, 2Department of Health, Behavior, and Society, The Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, 3Patient-Centered Outcomes Research Institute (PCORI) Project, Baltimore, MD, 4Department of Anesthesiology and Critical Care Medicine, The Johns Hopkins School of Medicine, Baltimore, MD, 5Armstrong Institute for Patient Safety and Quality, The Johns Hopkins School of Medicine, Baltimore, MD, USA †Carolyn Pastorini passed away on August 24, 2015 Background: Despite a movement toward patient-centered outcomes, best practices on how to gather and refine patients’ perspectives on research endpoints are limited. Advance care planning (ACP) is inherently patient centered and would benefit from patient prioritization of endpoints for ACP-related tools and studies. Objective: This investigation sought to prioritize patient-centered endpoints for the content and evaluation of an ACP video being developed for patients undergoing major surgery. We also sought to highlight an approach using complementary engagement and research strategies to document priorities and preferences of patients and other stakeholders. Materials and methods: Endpoints identified from a previously published environmental scan were operationalized following rating by a caregiver co-investigator, refinement by a patient co-investigator, review by a stakeholder committee, and validation by patients and family members. Finalized endpoints were taken to a state fair, where members of the public who indicated that they or a loved one had undergone major surgery prioritized their most relevant endpoints and provided comments. Results: Of the initial 50 ACP endpoints identified from the review, 12 endpoints were selected for public

  15. Twenty-First Water Reactor Safety Information Meeting

    International Nuclear Information System (INIS)

    Monteleone, S.

    1994-04-01

    This three-volume report contains 90 papers out of the 102 that were presented at the Twenty-First Water Reactor Safety Information Meeting held at the Bethesda Marriott Hotel, Bethesda, Maryland, during the week of October 25-27, 1993. The papers are printed in the order of their presentation in each session and describe progress and results of programs in nuclear safety research conducted in this country and abroad. Foreign participation in the meeting included papers presented by researchers from France, Germany, Japan, Russia, Switzerland, Taiwan, and United Kingdom. The titles of the papers and the names of the authors have been updated and may differ from those that appeared in the final program of the meeting. Selected papers were indexed separately for inclusion in the Energy Science and Technology Database

  16. A randomized controlled trial investigating the use of a predictive nomogram for the selection of the FSH starting dose in IVF/ICSI cycles.

    Science.gov (United States)

    Allegra, Adolfo; Marino, Angelo; Volpes, Aldo; Coffaro, Francesco; Scaglione, Piero; Gullo, Salvatore; La Marca, Antonio

    2017-04-01

    The number of oocytes retrieved is a relevant intermediate outcome in women undergoing IVF/intracytoplasmic sperm injection (ICSI). This trial compared the efficiency of the selection of the FSH starting dose according to a nomogram based on multiple biomarkers (age, day 3 FSH, anti-Müllerian hormone) versus an age-based strategy. The primary outcome measure was the proportion of women with an optimal number of retrieved oocytes defined as 8-14. At their first IVF/ICSI cycle, 191 patients underwent a long gonadotrophin-releasing hormone agonist protocol and were randomized to receive a starting dose of recombinant (human) FSH, based on their age (150 IU if ≤35 years, 225 IU if >35 years) or based on the nomogram. Optimal response was observed in 58/92 patients (63%) in the nomogram group and in 42/99 (42%) in the control group (+21%, 95% CI = 0.07 to 0.35, P = 0.0037). No significant differences were found in the clinical pregnancy rate or the number of embryos cryopreserved per patient. The study showed that the FSH starting dose selected according to ovarian reserve is associated with an increase in the proportion of patients with an optimal response: large trials are recommended to investigate any possible effect on the live-birth rate. Copyright © 2017 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.
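    The reported effect can be re-derived from the abstract's own numbers: 58/92 optimal responders in the nomogram group versus 42/99 in the control group. A Wald (normal-approximation) two-proportion confidence interval, which may differ slightly from whatever method the authors actually used, reproduces the +21% difference and an interval close to the reported 0.07 to 0.35:

    ```python
    import math

    # Hedged re-check of the abstract's numbers (not the authors' analysis):
    # optimal response in 58/92 nomogram patients vs 42/99 controls,
    # Wald 95% CI for the difference in proportions.
    p1, p2 = 58 / 92, 42 / 99
    diff = p1 - p2                                        # ~0.206, i.e. +21%
    se = math.sqrt(p1 * (1 - p1) / 92 + p2 * (1 - p2) / 99)
    lo, hi = diff - 1.96 * se, diff + 1.96 * se
    print(f"difference = {diff:.3f}, 95% CI = ({lo:.2f}, {hi:.2f})")
    ```

    The upper bound lands at about 0.34-0.35 depending on rounding and the exact interval method, consistent with the published figure.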

  17. Novel β-lactamase-random peptide fusion libraries for phage display selection of cancer cell-targeting agents suitable for enzyme prodrug therapy

    Science.gov (United States)

    Shukla, Girja S.; Krag, David N.

    2010-01-01

    Novel phage-displayed random linear dodecapeptide (X12) and cysteine-constrained decapeptide (CX10C) libraries constructed in fusion to the amino-terminus of P99 β-lactamase molecules were used for identifying β-lactamase-linked cancer cell-specific ligands. The size and quality of both libraries were comparable to the standards of other reported phage display systems. Using a single-round panning method based on phage DNA recovery, we identified several β-lactamase fusion peptides that specifically bind to live human breast cancer MDA-MB-361 cells. The β-lactamase fusion to the peptides helped in conducting enzyme activity-based clone normalization and cell-binding screening in a very time- and cost-efficient manner. The methods were suitable for 96-well readout as well as microscopic imaging. The success of the biopanning was indicated by the presence of ~40% cancer cell-specific clones among the recovered phages. One of the binding clones appeared multiple times. The cancer cell-binding fusion peptides also shared several significant motifs. This opens a new way of preparing and selecting phage display libraries. The cancer cell-specific β-lactamase-linked affinity reagents selected from these libraries can be used for any application that requires a reporter for tracking the ligand molecules. Furthermore, these affinity reagents also have potential for direct use in the targeted enzyme prodrug therapy of cancer. PMID:19751096

  18. Mucositis reduction by selective elimination of oral flora in irradiated cancers of the head and neck: a placebo-controlled double-blind randomized study

    International Nuclear Information System (INIS)

    Wijers, Oda B.; Levendag, Peter C.; Harms, Erik; Gan-Teng, A.M.; Schmitz, Paul I.M.; Hendriks, W.D.H.; Wilms, Erik B.; Est, Henri van der; Visch, Leo L.

    2001-01-01

    Purpose: The aim of the study was to test the hypothesis that aerobic Gram-negative bacteria (AGNB) play a crucial role in the pathogenesis of radiation-induced mucositis; consequently, selective elimination of these bacteria from the oral flora should result in a reduction of the mucositis. Methods and Materials: Head-and-neck cancer patients, when scheduled for treatment by external beam radiation therapy (EBRT), were randomized for prophylactic treatment with an oral paste containing either a placebo or a combination of the antibiotics polymyxin E, tobramycin, and amphotericin B (PTA group). Weekly, the objective and subjective mucositis scores and microbiologic counts of the oral flora were noted. The primary study endpoint was the mucositis grade after 3 weeks of EBRT. Results: Seventy-seven patients were evaluable. No statistically significant difference for the objective and subjective mucositis scores was observed between the two study arms (p=0.33). The percentage of patients with positive cultures of AGNB was significantly reduced in the PTA group (p=0.01). However, complete eradication of AGNB was not achieved. Conclusions: Selective elimination of AGNB of the oral flora did not result in a reduction of radiation-induced mucositis and therefore does not support the hypothesis that these bacteria play a crucial role in the pathogenesis of mucositis

  19. Prevalence of at-risk genotypes for genotoxic effects decreases with age in a randomly selected population in Flanders: a cross sectional study

    Directory of Open Access Journals (Sweden)

    van Delft Joost HM

    2011-10-01

    Background We hypothesized that in Flanders (Belgium), the prevalence of at-risk genotypes for genotoxic effects decreases with age due to morbidity and mortality resulting from chronic diseases. Rather than polymorphisms in single genes, the interaction of multiple genetic polymorphisms in low-penetrance genes involved in genotoxic effects might be of relevance. Methods Genotyping was performed on 399 randomly selected adults (aged 50-65) and on 442 randomly selected adolescents. Based on their involvement in processes relevant to genotoxicity, 28 low-penetrance polymorphisms affecting the phenotype in 19 genes were selected (xenobiotic metabolism, oxidative stress defense, and DNA repair: 13, 6, and 9 polymorphisms, respectively). Polymorphisms which, based on the available literature, could not clearly be categorized a priori as leading to an 'increased risk' or a 'protective effect' were excluded. Results The mean number of risk alleles for all investigated polymorphisms was lower in the 'elderly' (17.0 ± 2.9) than the 'adolescent' (17.6 ± 3.1) subpopulation (P = 0.002). These results were not affected by gender or smoking. The prevalence of a high (> 17 = median) number of risk alleles was less frequent in the 'elderly' (40.6%) than the 'adolescent' (51.4%) subpopulation (P = 0.002). In particular for phase II enzymes, the mean number of risk alleles was lower in the 'elderly' (4.3 ± 1.6) than the 'adolescent' age group (4.8 ± 1.9; P < …). The prevalence of a high (> 4 = median) number of risk alleles was less frequent in the 'elderly' (41.3%) than the 'adolescent' subpopulation (56.3%, P < …), and the prevalence of a high (> 8 = median) number of risk alleles for DNA repair enzyme-coding genes was lower in the 'elderly' (37.3%) than the 'adolescent' subpopulation (45.6%, P = 0.017). Conclusions These observations are consistent with the hypothesis that, in Flanders, the prevalence of at-risk alleles in genes involved in genotoxic effects decreases with age, suggesting that persons carrying a higher number of
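    The headline comparison (17.0 ± 2.9 risk alleles in 399 'elderly' adults vs 17.6 ± 3.1 in 442 adolescents) can be sanity-checked from the summary statistics alone. The abstract does not say which test produced P = 0.002; a Welch-type z-statistic with a normal approximation, shown below as an assumption rather than the authors' method, gives a result of the same order:

    ```python
    import math

    # Rough check from reported summary statistics only (the abstract does
    # not state the test used; this Welch-type z with a normal approximation
    # is an assumption, not the authors' analysis).
    m1, s1, n1 = 17.0, 2.9, 399   # 'elderly'
    m2, s2, n2 = 17.6, 3.1, 442   # 'adolescent'
    z = (m2 - m1) / math.sqrt(s1**2 / n1 + s2**2 / n2)
    p = math.erfc(z / math.sqrt(2))          # two-sided p, normal approximation
    print(f"z = {z:.2f}, p = {p:.4f}")
    ```

    This gives z near 2.9 with a two-sided p of roughly 0.004, the same order as the reported P = 0.002; the small discrepancy plausibly reflects a different test choice in the original analysis.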

  20. PONTIAC (NT-proBNP selected prevention of cardiac events in a population of diabetic patients without a history of cardiac disease): a prospective randomized controlled trial.

    Science.gov (United States)

    Huelsmann, Martin; Neuhold, Stephanie; Resl, Michael; Strunk, Guido; Brath, Helmut; Francesconi, Claudia; Adlbrecht, Christopher; Prager, Rudolf; Luger, Anton; Pacher, Richard; Clodi, Martin

    2013-10-08

    The study sought to assess the primary preventive effect of neurohumoral therapy in high-risk diabetic patients selected by N-terminal pro-B-type natriuretic peptide (NT-proBNP). Few clinical trials have successfully demonstrated the prevention of cardiac events in patients with diabetes. One reason for this might be an inaccurate selection of patients. NT-proBNP has not been assessed in this context. A total of 300 patients with type 2 diabetes, elevated NT-proBNP (>125 pg/ml) but free of cardiac disease were randomized. The "control" group was cared for at 4 diabetes care units; the "intensified" group was additionally treated at a cardiac outpatient clinic for the up-titration of renin-angiotensin system (RAS) antagonists and beta-blockers. The primary endpoint was hospitalization/death due to cardiac disease after 2 years. At baseline, the mean age of the patients was 67.5 ± 9 years, duration of diabetes was 15 ± 12 years, 37% were male, HbA1c was 7 ± 1.1%, blood pressure was 151 ± 22 mm Hg, heart rate was 72 ± 11 beats/min, and median NT-proBNP was 265.5 pg/ml (interquartile range: 180.8 to 401.8 pg/ml). After 12 months there was a significant difference between groups in the number of patients treated with a RAS antagonist/beta-blocker and in the dosages reached (p < …). Up-titration of RAS antagonists and beta-blockers to maximum tolerated dosages is an effective and safe intervention for the primary prevention of cardiac events for diabetic patients pre-selected using NT-proBNP. (Nt-proBNP Guided Primary Prevention of CV Events in Diabetic Patients [PONTIAC]; NCT00562952). Copyright © 2013 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  1. The twenty-first century in space

    CERN Document Server

    Evans, Ben

    2015-01-01

    This final entry in the History of Human Space Exploration mini-series by Ben Evans continues with an in-depth look at the latter part of the 20th century and the start of the new millennium. Picking up where Partnership in Space left off, the story commemorating the evolution of manned space exploration unfolds in further detail. More than fifty years after Yuri Gagarin’s pioneering journey into space, Evans extends his overview of how that momentous voyage continued through the decades which followed. The Twenty-first Century in Space, the sixth book in the series, explores how the fledgling partnership between the United States and Russia in the 1990s gradually bore fruit and laid the groundwork for today’s International Space Station. The narrative follows the convergence of the Shuttle and Mir programs, together with standalone missions, including servicing the Hubble Space Telescope, many of whose technical and human lessons enabled the first efforts to build the ISS in orbit. The book also looks to...

  2. Common selective serotonin reuptake inhibitor side effects in older adults associated with genetic polymorphisms in the serotonin transporter and receptors: data from a randomized controlled trial.

    Science.gov (United States)

    Garfield, Lauren D; Dixon, David; Nowotny, Petra; Lotrich, Francis E; Pollock, Bruce G; Kristjansson, Sean D; Doré, Peter M; Lenze, Eric J

    2014-10-01

    Antidepressant side effects are a significant public health issue, associated with poor adherence, premature treatment discontinuation, and, rarely, significant harm. Older adults assume the largest and most serious burden of medication side effects. We investigated the association between antidepressant side effects and genetic variation in the serotonin system in anxious, older adults participating in a randomized, placebo-controlled trial of the selective serotonin reuptake inhibitor (SSRI) escitalopram. Adults (N = 177) aged ≥ 60 years were randomized to active treatment or placebo for 12 weeks. Side effects were assessed using the Udvalg for Kliniske Undersøgelser side-effect rating scale. Genetic polymorphisms were putative functional variants in the promoters of the serotonin transporter and 1A and 2A receptors (5-HTTLPR [L/S + rs25531], HTR1A rs6295, HTR2A rs6311, respectively). Four significant drug-placebo side-effect differences were found: increased duration of sleep, dry mouth, diarrhea, and diminished sexual desire. Analyses using putative high- versus low-transcription genotype groupings revealed six pharmacogenetic effects: greater dry mouth and decreased sexual desire for the low- and high-expressing serotonin transporter genotypes, respectively, and greater diarrhea with the 1A receptor low-transcription genotype. Diminished sexual desire was experienced significantly more by high-expressing genotypes in the serotonin transporter, 1A, or 2A receptors. There was no significant relationship between drug concentration and side effects, nor any mean difference in drug concentration between low- and high-expressing genotypes. Genetic variation in the serotonin system may predict who develops common SSRI side effects and why. More work is needed to further characterize this genetic modulation and to translate research findings into strategies useful for more personalized patient care. Published by Elsevier Inc.

  3. On the role of heat and mass transfer into laser processability during selective laser melting AlSi12 alloy based on a randomly packed powder-bed

    Science.gov (United States)

    Wang, Lianfeng; Yan, Biao; Guo, Lijie; Gu, Dongdong

    2018-04-01

    A new transient mesoscopic model with a randomly packed powder bed is proposed to investigate heat and mass transfer and laser processing quality between neighboring tracks during selective laser melting (SLM) of AlSi12 alloy by the finite volume method (FVM), accounting for the solid/liquid phase transition, temperature-dependent material properties, and interfacial forces. The results revealed that both the operating temperature and the resultant cooling rate were elevated by increasing the laser power. Accordingly, the viscosity of the liquid was significantly reduced at high laser power, and the melt exhibited large velocities, which tended to produce more intensive convection within the pool. In this case, sufficient heat and mass transfer occurred at the interface between the previously fabricated tracks and the track currently being built, producing strong spreading between neighboring tracks and a resultant high-quality surface without obvious porosity. By contrast, the surface quality of SLM-processed components at relatively low laser power was notably degraded, owing to the limited, insufficient heat and mass transfer at the interface of neighboring tracks. Furthermore, the experimentally observed morphologies of the top surface were in full accordance with the simulated results.

  4. Selection of single blastocysts for fresh transfer via standard morphology assessment alone and with array CGH for good prognosis IVF patients: results from a randomized pilot study

    Directory of Open Access Journals (Sweden)

    Yang Zhihong

    2012-05-01

    Full Text Available Abstract Background Single embryo transfer (SET) remains underutilized as a strategy to reduce multiple gestation risk in IVF, and its overall lower pregnancy rate underscores the need for improved techniques to select one embryo for fresh transfer. This study explored use of comprehensive chromosomal screening by array CGH (aCGH) to provide this advantage and improve the pregnancy rate from SET. Methods First-time IVF patients with a good prognosis were randomized to blastocyst selection by morphology plus aCGH (Group A) or by morphology alone (Group B). Results For patients in Group A (n = 55), 425 blastocysts were biopsied and analyzed via aCGH (7.7 blastocysts/patient). Aneuploidy was detected in 191/425 (44.9%) of blastocysts in this group. For patients in Group B (n = 48), 389 blastocysts were microscopically examined (8.1 blastocysts/patient). Clinical pregnancy rate was significantly higher in the morphology + aCGH group than in the morphology-only group (70.9% and 45.8%, respectively; p = 0.017); ongoing pregnancy rates for Groups A and B were 69.1% vs. 41.7%, respectively (p = 0.009). There were no twin pregnancies. Conclusion Although aCGH followed by frozen embryo transfer has been used to screen at-risk embryos (e.g., known parental chromosomal translocation or history of recurrent pregnancy loss), this is the first description of aCGH fully integrated with a clinical IVF program to select single blastocysts for fresh SET in good prognosis patients. The observed aneuploidy rate (44.9%) among biopsied blastocysts highlights the inherent imprecision of SET when conventional morphology is used alone. Embryos randomized to the aCGH group implanted with greater efficiency, resulted in clinical pregnancy more often, and yielded a lower miscarriage rate than those selected without aCGH. Additional studies are needed to verify our pilot data and confirm a role for on-site, rapid aCGH for IVF patients contemplating fresh SET.

  5. Early prevention of antisocial personality: long-term follow-up of two randomized controlled trials comparing indicated and selective approaches.

    Science.gov (United States)

    Scott, Stephen; Briskman, Jackie; O'Connor, Thomas G

    2014-06-01

    Antisocial personality is a common adult problem that imposes a major public health burden, but for which there is no effective treatment. Affected individuals exhibit persistent antisocial behavior and pervasive antisocial character traits, such as irritability, manipulativeness, and lack of remorse. Prevention of antisocial personality in childhood has been advocated, but evidence for effective interventions is lacking. The authors conducted two follow-up studies of randomized trials of group parent training. One involved 120 clinic-referred 3- to 7-year-olds with severe antisocial behavior for whom treatment was indicated, 93 of whom were reassessed between ages 10 and 17. The other involved 109 high-risk 4- to 6-year-olds with elevated antisocial behavior who were selectively screened from the community, 90 of whom were reassessed between ages 9 and 13. The primary psychiatric outcome measures were the two elements of antisocial personality, namely, antisocial behavior (assessed by a diagnostic interview) and antisocial character traits (assessed by a questionnaire). Also assessed were reading achievement (an important domain of youth functioning at work) and parent-adolescent relationship quality. In the indicated sample, both elements of antisocial personality were improved in the early intervention group at long-term follow-up compared with the control group (antisocial behavior: odds ratio of oppositional defiant disorder=0.20, 95% CI=0.06, 0.69; antisocial character traits: B=-4.41, 95% CI=-1.12, -8.64). Additionally, reading ability improved (B=9.18, 95% CI=0.58, 18.0). Parental expressed emotion was warmer (B=0.86, 95% CI=0.20, 1.41) and supervision was closer (B=-0.43, 95% CI=-0.11, -0.75), but direct observation of parenting showed no differences. Teacher-rated and self-rated antisocial behavior were unchanged. In contrast, in the selective high-risk sample, early intervention was not associated with improved long-term outcomes. 

  6. Proceedings: Twenty years of energy policy: Looking toward the twenty-first century

    International Nuclear Information System (INIS)

    1992-01-01

    In 1973, immediately following the Arab Oil Embargo, the Energy Resources Center, University of Illinois at Chicago initiated an innovative annual public service program called the Illinois Energy Conference. The objective was to provide a public forum each year to address an energy or environmental issue critical to the state, region and nation. Twenty years have passed since that inaugural program, and during that period we have covered a broad spectrum of issues including energy conservation, nuclear power, Illinois coal, energy policy options, natural gas, alternative fuels, new energy technologies, utility deregulation and the National Energy Strategy.

  7. Proceedings: Twenty years of energy policy: Looking toward the twenty-first century

    Energy Technology Data Exchange (ETDEWEB)

    1992-12-31

    In 1973, immediately following the Arab Oil Embargo, the Energy Resources Center, University of Illinois at Chicago initiated an innovative annual public service program called the Illinois Energy Conference. The objective was to provide a public forum each year to address an energy or environmental issue critical to the state, region and nation. Twenty years have passed since that inaugural program, and during that period we have covered a broad spectrum of issues including energy conservation, nuclear power, Illinois coal, energy policy options, natural gas, alternative fuels, new energy technologies, utility deregulation and the National Energy Strategy.

  8. The effects of the adjunctive bupropion on male sexual dysfunction induced by a selective serotonin reuptake inhibitor: a double-blind placebo-controlled and randomized study.

    Science.gov (United States)

    Safarinejad, Mohammad Reza

    2010-09-01

    To determine the safety and efficacy of adjunctive bupropion sustained-release (SR) for male sexual dysfunction (SD) induced by a selective serotonin reuptake inhibitor (SSRI), as SD is a common side-effect of SSRIs and the most effective treatments have yet to be determined. The randomized sample consisted of 234 euthymic men who were receiving some type of SSRI. The men were randomly assigned to bupropion SR (150 mg twice daily; n = 117) or placebo (twice daily; n = 117) for 12 weeks. Efficacy was evaluated using the Clinical Global Impression-Sexual Function (CGI-SF; the primary outcome measure), the International Index of Erectile Function (IIEF), the Arizona Sexual Experience Scale (ASEX), and the Erectile Dysfunction Inventory of Treatment Satisfaction (EDITS) (secondary outcome measures). Participants were followed biweekly during the study period. After 12 weeks of treatment, the mean (SD) scores for CGI-SF were significantly lower, i.e. better, in patients on bupropion SR, at 2.4 (1.2), than in the placebo group, at 3.9 (1.1) (P = 0.01). Men who received bupropion had a significant increase in the total IIEF score (54.4% vs 1.2%; P = 0.003), and in the five different domains of the IIEF. Total ASEX scores were significantly lower, i.e. better, among men who received bupropion than placebo, at 15.5 (4.3) vs 21.5 (4.7) (P = 0.002). The EDITS scores were 67.4 (10.2) for the bupropion and 36.3 (11.7) for the placebo group (P = 0.001). The ASEX score and CGI-SF score were correlated (P = 0.003). In linear regression analyses the CGI-SF score was not affected significantly by the duration of SD, the type of SSRI used, or age. Bupropion is an effective treatment for male SD induced by SSRIs. These results provide empirical support for conducting a further study of bupropion.
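
    The between-group CGI-SF comparison above can be checked roughly from the published summary statistics alone. The sketch below recomputes a Welch t statistic from the reported means, SDs, and arm sizes; it is a back-of-envelope illustration, not the trial's actual analysis, and the trial's own model is what yields the reported P = 0.01.

```python
import math

# Welch's t statistic from summary statistics: means m1/m2, standard
# deviations s1/s2, and per-arm sample sizes n1/n2.
def welch_t(m1, s1, n1, m2, s2, n2):
    se = math.sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)   # standard error of the difference
    t = (m1 - m2) / se
    # Welch-Satterthwaite approximation for the degrees of freedom
    df = (s1 ** 2 / n1 + s2 ** 2 / n2) ** 2 / (
        (s1 ** 2 / n1) ** 2 / (n1 - 1) + (s2 ** 2 / n2) ** 2 / (n2 - 1))
    return t, df

# Placebo mean 3.9 (SD 1.1) vs bupropion mean 2.4 (SD 1.2), n = 117 per arm.
t, df = welch_t(3.9, 1.1, 117, 2.4, 1.2, 117)
```

    With these inputs the difference is many standard errors wide, which is consistent with the trial's finding of a significant between-group difference.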

  9. Paleolithic nutrition: twenty-five years later.

    Science.gov (United States)

    Konner, Melvin; Eaton, S Boyd

    2010-12-01

    A quarter century has passed since the first publication of the evolutionary discordance hypothesis, according to which departures from the nutrition and activity patterns of our hunter-gatherer ancestors have contributed greatly and in specifically definable ways to the endemic chronic diseases of modern civilization. Refinements of the model have changed it in some respects, but anthropological evidence continues to indicate that ancestral human diets prevalent during our evolution were characterized by much lower levels of refined carbohydrates and sodium, much higher levels of fiber and protein, and comparable levels of fat (primarily unsaturated fat) and cholesterol. Physical activity levels were also much higher than current levels, resulting in higher energy throughput. We said at the outset that such evidence could only suggest testable hypotheses and that recommendations must ultimately rest on more conventional epidemiological, clinical, and laboratory studies. Such studies have multiplied and have supported many aspects of our model, to the extent that in some respects, official recommendations today have targets closer to those prevalent among hunter-gatherers than did comparable recommendations 25 years ago. Furthermore, doubts have been raised about the necessity for very low levels of protein, fat, and cholesterol intake common in official recommendations. Most impressively, randomized controlled trials have begun to confirm the value of hunter-gatherer diets in some high-risk groups, even as compared with routinely recommended diets. Much more research needs to be done, but the past quarter century has proven the interest and heuristic value, if not yet the ultimate validity, of the model.

  10. Twenty years after the Chernobyl accident

    International Nuclear Information System (INIS)

    2006-01-01

    Full text: The April 1986 accident at the Chernobyl nuclear power plant remains a painful memory in the lives of the hundreds of thousands of people who were most affected by the accident. In addition to the emergency rescue workers who died, thousands of children contracted thyroid cancer, and thousands of other individuals will eventually die of other cancers caused by the release of radiation. Vast areas of cropland, forests, rivers and urban centres were contaminated by environmental fallout. Hundreds of thousands of people were evacuated from these affected areas - forced to leave behind their homes, possessions, and livelihoods - and resettled elsewhere, in a traumatic outcome that has had long-lasting psychological and social impacts. The commemoration of the Chernobyl tragedy is taking place in many forums this month - in Minsk, in Kiev and in other locations. At the IAEA, it might be said that we have been responding to the accident and its consequences for twenty years, in a number of ways: first, through a variety of programmes designed to help mitigate the environmental and health consequences of the accident; second, by analyzing the lessons of what went wrong to allow such an accident to occur at all; and third, by working to prevent any such accident from occurring in the future. Building a strong and effective global nuclear safety regime is a central objective of our work. This requires effective international cooperation. The explosions that destroyed the Unit 4 reactor core, and discharged its contents in a cloud of radionuclides, made painfully clear that the safety risks associated with nuclear and radiological activities extend beyond national borders. International cooperation on nuclear safety matters - sharing information, setting clear safety standards, assisting with safety upgrades, and reviewing operational performance - has therefore become a hallmark of IAEA activity, particularly at a time when we are witnessing an expansion of

  11. Sequence based prediction of DNA-binding proteins based on hybrid feature selection using random forest and Gaussian naïve Bayes.

    Directory of Open Access Journals (Sweden)

    Wangchao Lou

    Full Text Available Developing an efficient method for identifying DNA-binding proteins, given their vital roles in gene regulation, is highly desirable, since it would be invaluable for advancing our understanding of protein functions. In this study, we proposed a new method for the prediction of DNA-binding proteins, by performing feature ranking using random forest and wrapper-based feature selection using a forward best-first search strategy. The features comprise information from the primary sequence, predicted secondary structure, predicted relative solvent accessibility, and the position-specific scoring matrix. The proposed method, called DBPPred, used Gaussian naïve Bayes as the underlying classifier since it outperformed five other classifiers, including decision tree, logistic regression, k-nearest neighbor, support vector machine with polynomial kernel, and support vector machine with radial basis function. As a result, the proposed DBPPred yields the highest average accuracy of 0.791 and average MCC of 0.583 according to five-fold cross-validation with ten runs on the training benchmark dataset PDB594. Subsequently, blind tests on the independent dataset PDB186 were performed with the proposed model trained on the entire PDB594 dataset and with five existing methods (including iDNA-Prot, DNA-Prot, DNAbinder, DNABIND and DBD-Threader), with the proposed DBPPred yielding the highest accuracy of 0.769, MCC of 0.538, and AUC of 0.790. Independent tests performed with the proposed DBPPred on a large non-DNA-binding protein dataset and two RNA-binding protein datasets also showed improved or comparable quality when compared with the relevant prediction methods. Moreover, we observed that the majority of the features selected by the proposed method differ significantly in mean value between the DNA-binding and non-DNA-binding proteins. All of the experimental results indicate that
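
    The wrapper idea described above (rank features, then grow a subset while the classifier's held-out accuracy improves) can be sketched in a self-contained toy. The Gaussian naïve Bayes here is hand-rolled, a greedy forward search stands in for best-first search, a single even/odd holdout stands in for five-fold cross-validation, and the data are synthetic; none of this is the DBPPred implementation.

```python
import math
import random

def gnb_fit(X, y):
    """Per-class prior plus per-feature mean and variance."""
    model = {}
    for c in set(y):
        rows = [x for x, lab in zip(X, y) if lab == c]
        n = len(rows)
        means = [sum(col) / n for col in zip(*rows)]
        varis = [max(sum((v - m) ** 2 for v in col) / n, 1e-9)
                 for col, m in zip(zip(*rows), means)]
        model[c] = (n / len(y), means, varis)
    return model

def gnb_predict(model, x):
    """Class with the highest Gaussian log-likelihood plus log-prior."""
    best, best_lp = None, None
    for c, (prior, means, varis) in model.items():
        lp = math.log(prior)
        for v, m, s2 in zip(x, means, varis):
            lp += -0.5 * math.log(2 * math.pi * s2) - (v - m) ** 2 / (2 * s2)
        if best_lp is None or lp > best_lp:
            best, best_lp = c, lp
    return best

def holdout_accuracy(feats, X, y):
    """Train on even-indexed rows, score on odd-indexed rows."""
    rows = [[r[f] for f in feats] for r in X]
    train = [(x, l) for i, (x, l) in enumerate(zip(rows, y)) if i % 2 == 0]
    test = [(x, l) for i, (x, l) in enumerate(zip(rows, y)) if i % 2 == 1]
    model = gnb_fit([x for x, _ in train], [l for _, l in train])
    return sum(gnb_predict(model, x) == l for x, l in test) / len(test)

def forward_select(X, y, n_features):
    """Greedily add one feature at a time while holdout accuracy improves."""
    selected, best_acc = [], 0.0
    improved = True
    while improved:
        improved, best_f = False, None
        for f in range(n_features):
            if f in selected:
                continue
            acc = holdout_accuracy(selected + [f], X, y)
            if acc > best_acc:
                best_acc, best_f, improved = acc, f, True
        if improved:
            selected.append(best_f)
    return selected, best_acc

# Synthetic data: feature 0 separates the classes, feature 1 is pure noise.
rng = random.Random(1)
X = ([[rng.gauss(0, 1), rng.gauss(0, 1)] for _ in range(40)]
     + [[rng.gauss(3, 1), rng.gauss(0, 1)] for _ in range(40)])
y = [0] * 40 + [1] * 40
feats, acc = forward_select(X, y, 2)
```

    On this data the search reliably picks up the informative feature and stops adding features once accuracy no longer improves, which is the behavior the wrapper approach relies on.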

  12. The CAP study, evaluation of integrated universal and selective prevention strategies for youth alcohol misuse: study protocol of a cluster randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Newton Nicola C

    2012-08-01

    Full Text Available Abstract Background Alcohol misuse amongst young people is a serious concern. The need for effective prevention is clear, yet there appear to be few evidence-based programs that prevent alcohol misuse and none that target both high- and low-risk youth. The CAP study addresses this gap by evaluating the efficacy of an integrated approach to alcohol misuse prevention, which combines the effective universal internet-based Climate Schools program with the effective selective personality-targeted Preventure program. This article describes the development and protocol of the CAP study, which aims to prevent alcohol misuse and related harms in Australian adolescents. Methods/Design A cluster randomized controlled trial (RCT) is being conducted with Year 8 students aged 13 to 14 years old from 27 secondary schools in New South Wales and Victoria, Australia. Blocked randomisation was used to assign schools to one of four groups: Climate Schools only, Preventure only, CAP (Climate Schools and Preventure), or Control (alcohol, drug and health education as usual). The primary outcomes of the trial will be the uptake and harmful use of alcohol and alcohol-related harms. Secondary outcomes will include alcohol- and cannabis-related knowledge, cannabis-related harms, intentions to use, and mental health symptomatology. All participants will complete assessments on five occasions: baseline, immediately post-intervention, and at 12, 24 and 36 months post-baseline. Discussion This study protocol presents the design and current implementation of a cluster RCT to evaluate the efficacy of the CAP study: an integrated universal and selective approach to prevent alcohol use and related harms among adolescents. Compared to students who receive the stand-alone universal Climate Schools program or alcohol and drug education as usual (Controls), we expect the students who receive the CAP intervention to have significantly less uptake of alcohol use, a reduction in average
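
    The blocked randomisation step described above can be sketched in a few lines. This is a generic illustration with one slot per arm in each block, not the CAP trial's actual allocation procedure; the arm names come from the protocol, while the seed and block handling are invented for the sketch.

```python
import random

# The four CAP trial arms (one allocation slot per arm in each block).
ARMS = ["Climate Schools", "Preventure", "CAP", "Control"]

def blocked_allocation(n_units, arms, rng):
    """Assign n_units clusters to arms in shuffled balanced blocks."""
    allocation = []
    while len(allocation) < n_units:
        block = arms[:]              # one slot per arm keeps arms balanced
        rng.shuffle(block)
        allocation.extend(block)
    return allocation[:n_units]      # truncate the final partial block

rng = random.Random(2012)            # illustrative seed, not the trial's
schools = blocked_allocation(27, ARMS, rng)
counts = {arm: schools.count(arm) for arm in ARMS}
# With 27 schools and block size 4, arm sizes can differ by at most one.
```

    Because each block contains every arm exactly once, the allocation stays balanced as it proceeds, which is the property blocked randomisation is used for; making the block size itself random (as discussed in record 1 above) additionally hides the allocation pattern from investigators.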

  13. Spatial and simultaneous seroepidemiology of anti-Leishmania spp. antibodies in dog owners and their dogs from randomly selected households in a major city of southern Brazil.

    Science.gov (United States)

    Benitez, Aline do Nascimento; Martins, Felippe Danyel Cardoso; Mareze, Marcelle; Nino, Beatriz de Souza Lima; Caldart, Eloiza Teles; Ferreira, Fernanda Pinto; Mitsuka-Breganó, Regina; Freire, Roberta Lemos; Galhardo, Juliana Arena; Martins, Camila Marinelli; Biondo, Alexander Welker; Navarro, Italmar Teodorico

    2018-06-01

    Although leishmaniasis has been described as a classic example of a zoonosis requiring a comprehensive approach for control, to date, no study has been conducted on the spatial distribution of simultaneous Leishmania spp. seroprevalence in dog owners and dogs from randomly selected households in urban settings. Accordingly, the present study aimed to simultaneously identify the seroprevalence, spatial distribution and associated factors of infection with Leishmania spp. in dog owners and their dogs in the city of Londrina, a county seat in southern Brazil with a population of half a million people and ranked 18th in population and 145th in the human development index (HDI) out of 5570 Brazilian cities. Overall, 564 households were surveyed and included 597 homeowners and their 729 dogs. Anti-Leishmania spp. antibodies were detected by ELISA in 9/597 (1.50%) dog owners and in 32/729 (4.38%) dogs, with significantly higher prevalence (p = 0.0042) in dogs. Spatial analysis revealed associations between seropositive dogs and households located up to 500 m from the local railway. No clusters were found for either owner or dog case distributions. In summary, the seroepidemiological and spatial results collectively show a lack of association of the factors for infection, and the results demonstrated higher exposure for dogs than their owners. However, railway areas may provide favorable conditions for the maintenance of infected phlebotomines, thereby causing infection in nearby domiciled dogs. In such an urban scenario, local sanitary barriers should be focused on the terrestrial routes of people and surrounding areas, particularly railways, via continuous vector surveillance and identification of phlebotomines infected by Leishmania spp. Copyright © 2018. Published by Elsevier B.V.
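
    The owner-versus-dog prevalence comparison (9/597 vs. 32/729 seropositive) can be checked with a standard two-proportion z-test using only the Python standard library. The paper reports p = 0.0042; the uncorrected z-test below gives a somewhat smaller p, so the authors presumably used a different test, such as a continuity-corrected or exact one. The sketch only illustrates the computation.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test (pooled, no continuity correction)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p = math.erfc(abs(z) / math.sqrt(2))   # two-sided normal tail probability
    return z, p

# 9/597 seropositive owners vs. 32/729 seropositive dogs.
z, p = two_proportion_z(9, 597, 32, 729)
```

    Either way the difference is significant at conventional levels, matching the paper's conclusion that dogs had significantly higher exposure than their owners.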

  14. An assessment of the quality of care for children in eighteen randomly selected district and sub-district hospitals in Bangladesh

    Directory of Open Access Journals (Sweden)

    Hoque Dewan ME

    2012-12-01

    Full Text Available Abstract Background Quality hospital care is important in ensuring that the needs of severely ill children are met to avert child mortality. However, the quality of hospital care for children in developing countries has often been found poor. As the first step of a country road map for improving hospital care for children, we assessed the baseline situation with respect to the quality of care provided to children under five years of age in district- and sub-district-level hospitals in Bangladesh. Methods Using adapted World Health Organization (WHO) hospital assessment tools and standards, an assessment of 18 randomly selected district (n=6) and sub-district (n=12) hospitals was undertaken. Teams of trained assessors used direct case observation, record review, interviews, and Management Information System (MIS) data to assess the quality of clinical case management and monitoring; infrastructure, processes and hospital administration; and essential hospital and laboratory supports, drugs and equipment. Results Findings demonstrate that the overall quality of care provided in these hospitals was poor. No hospital had a functioning triage system to prioritise those children most in need of immediate care. Laboratory supports and essential equipment were deficient. Only one hospital had all of the essential drugs for paediatric care. Less than a third of hospitals had a back-up power supply, and just under half had functioning arrangements for safe drinking water. Clinical case management was found to be sub-optimal for prevalent illnesses, as was the quality of neonatal care. Conclusion Action is needed to improve the quality of paediatric care in hospital settings in Bangladesh, with a particular need to invest in improving newborn care.

  15. Effectiveness of a selective intervention program targeting personality risk factors for alcohol misuse among young adolescents: results of a cluster randomized controlled trial

    NARCIS (Netherlands)

    Lammers, J.; Goossens, F.; Conrod, P.; Engels, R.C.M.E.; Wiers, R.W.H.J.; Kleinjan, M.

    2015-01-01

    Aim The effectiveness of Preventure was tested on drinking behaviour of young adolescents in secondary education in the Netherlands. Design A cluster randomized controlled trial was carried out, with participants assigned randomly to a two-session coping skills intervention or a control

  16. Randomizing Roaches: Exploring the "Bugs" of Randomization in Experimental Design

    Science.gov (United States)

    Wagler, Amy; Wagler, Ron

    2014-01-01

    Understanding the roles of random selection and random assignment in experimental design is a central learning objective in most introductory statistics courses. This article describes an activity, appropriate for a high school or introductory statistics course, designed to teach the concepts, values and pitfalls of random selection and assignment…
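
    The distinction the activity teaches can be made concrete in a few lines of Python: random selection draws a sample from a population, and random assignment then splits that sample between treatment arms. The population, sizes, and seed below are invented purely for illustration.

```python
import random

rng = random.Random(42)
population = list(range(1000))       # stand-in for a sampling frame

# Random SELECTION: draw 30 distinct units from the population.
sample = rng.sample(population, 30)

# Random ASSIGNMENT: shuffle the sample, then split it into two equal arms.
shuffled = sample[:]
rng.shuffle(shuffled)
treatment, control = shuffled[:15], shuffled[15:]
```

    Selection governs who the results generalize to; assignment governs whether arm differences can be attributed to the treatment. The activity's "bugs" arise when the two are conflated.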

  17. The Patient Deficit Model Overturned: a qualitative study of patients' perceptions of invitation to participate in a randomized controlled trial comparing selective bladder preservation against surgery in muscle invasive bladder cancer (SPARE, CRUK/07/011)

    Directory of Open Access Journals (Sweden)

    Moynihan Clare

    2012-11-01

    Full Text Available Abstract Background Evidence suggests that poor recruitment into clinical trials rests on a patient ‘deficit’ model – an inability to comprehend trial processes. Poor communication has also been cited as a possible barrier to recruitment. A qualitative patient interview study was included within the feasibility stage of a phase III non-inferiority Randomized Controlled Trial (RCT) (SPARE, CRUK/07/011) in muscle-invasive bladder cancer. The aim was to illuminate problems in the context of randomization. Methods The qualitative study used a ‘Framework Analysis’ that included ‘constant comparison’, in which semi-structured interviews are transcribed, analyzed, compared and contrasted both between and within transcripts. Three researchers coded and interpreted data. Results Twenty-four patients agreed to enter the interview study: 10 decliners of randomization and 14 accepters, of whom 2 subsequently declined their allocated treatment. The main theme applying to the majority of the sample was confusion and ambiguity. There was little indication that confusion directly impacted on decisions to enter the SPARE trial. However, confusion did appear to impact on ethical considerations surrounding ‘informed consent’, as well as cause a sense of alienation between patients and health personnel. Sub-optimal communication in many guises accounted for the confusion, together with the logistical elements of a trial that involved treatment options delivered in a number of geographical locations. Conclusions These data highlight the difficulty of providing balanced and clear trial information within the UK health system, despite best intentions. Involvement of multiple professionals can impact on communication processes with patients who are considering participation in RCTs. Our results led us to question the ‘deficit’ model of patient behavior. It is suggested that health professionals might consider facilitating a context in which patients

  18. [The Gulf War Syndrome twenty years on].

    Science.gov (United States)

    Auxéméry, Y

    2013-10-01

    opposition or continuity links between objective external exposures (smoke from burning oil wells, depleted uranium, biological agents, chemicals) and the share of inner emotion, albeit reactive and characterised by subjective stress. There was no lack of stress factors for the troops deployed: repeated alerts of chemical attacks; the hostility of the environment, with its sandstorms and venomous animals; climatic conditions making long hours of guard duty and static observation difficult; collecting bodies; lack of knowledge of the precise geography of their movements; and uncertainty about the duration of the conflict. The military nuclear-biological-chemical (NBC) protective suit admittedly provided protective confinement, shutting out the hostile world from which the threat would come, but at the same time this isolation increased the fear of a hypothetical risk, while internal perceptions were heightened and could open the way to future somatisations. In such a context, the somatic manifestations of anxiety (palpitations, sweating, paresthesia…) readily combine with somatised functional disorders, to which over-interpretations of bodily feelings following a hypochondriacal mechanism can also be attributed. Selective attention to somatic perceptions in the absence of mentalisation, reiterated requests for reassurance, and excessive use of the treatment system are diagnostic indices of these stress-induced symptoms. Rather than toxic exposure to any particular substance, the non-specific syndrome called "Gulf War Syndrome" is the result of exposure to the eponymous operational theatre. But if the psychological and psychosomatic suffering of veterans is immutable throughout history, the expression of these difficulties has specificities according to the cultural, political and scientific context of the time. In the example of GWS, the diffusion of the fear of a pathology caused by chemical weapons has promoted this phenomenon.

  19. A randomized control trial to evaluate the effect of adjuvant selective laser trabeculoplasty versus medication alone in primary open-angle glaucoma: preliminary results

    Directory of Open Access Journals (Sweden)

    Lee JWY

    2014-09-01

    Full Text Available Jacky WY Lee,1,2 Catherine WS Chan,2 Mandy OM Wong,3 Jonathan CH Chan,3 Qing Li,2 Jimmy SM Lai2 1The Department of Ophthalmology, Caritas Medical Centre, 2The Department of Ophthalmology, The University of Hong Kong, 3The Department of Ophthalmology, Queen Mary Hospital, Hong Kong Background: The objective of this study was to investigate the effects of adjuvant selective laser trabeculoplasty (SLT) versus medication alone on intraocular pressure (IOP) control, medication use, and quality of life in patients with primary open-angle glaucoma. Methods: This prospective, randomized control study recruited 41 consecutive primary open-angle glaucoma subjects with medically controlled IOP ≤21 mmHg. The SLT group (n=22) received a single 360-degree SLT treatment. The medication-only group (n=19) continued with their usual treatment regimen. In both groups, medication was titrated to maintain a target IOP defined as a 25% reduction from baseline IOP without medication, or <18 mmHg, whichever was lower. Outcomes, which were measured at baseline and at 6 months, included the Glaucoma Quality of Life-15 (GQL-15) and Comparison of Ophthalmic Medications for Tolerability (COMTOL) survey scores, IOP, and the number of antiglaucoma medicines. Results: The baseline IOP was 15.8±2.7 mmHg and 14.5±2.5 mmHg in the SLT and medication-only groups, respectively (P=0.04). Both groups had comparable numbers of baseline medications (P=0.2) and comparable GQL-15 (P=0.3) and COMTOL scores (P=0.7). At 6 months, the SLT group had a lower IOP (P=0.03) and required fewer medications compared with both baseline (P<0.0001) and the medication-only group (P=0.02). There was no statistically significant difference in the 6-month GQL-15 or COMTOL scores as compared to baseline (P≥0.4) or between the two treatment groups (P≥0.2). Conclusion: A single session of adjuvant SLT provided further reductions in IOP and medication without substantial changes in quality of life or medication tolerability at 6 months.

  20. On Random Numbers and Design

    Science.gov (United States)

    Ben-Ari, Morechai

    2004-01-01

    The term "random" is frequently used in discussion of the theory of evolution, even though the mathematical concept of randomness is problematic and of little relevance in the theory. Therefore, since the core concept of the theory of evolution is the non-random process of natural selection, the term random should not be used in teaching the…

  1. Twenty new ISO standards on dosimetry for radiation processing

    International Nuclear Information System (INIS)

    Farrar IV, H.

    2000-01-01

    Twenty standards on essentially all aspects of dosimetry for radiation processing were published as new ISO standards in December 1998. The standards are based on 20 standard practices and guides developed over the past 14 years by Subcommittee E10.01 of the American Society for Testing and Materials (ASTM). The transformation to ISO standards using the 'fast track' process under ISO Technical Committee 85 (ISO/TC85) commenced in 1995 and resulted in some overlap of technical information between three of the new standards and the existing ISO Standard 11137, Sterilization of health care products - Requirements for validation and routine control - Radiation sterilization. Although the technical information in these four standards was consistent, compromise wording was adopted in the scopes of the three new ISO standards to establish precedence for use. Two of the new ISO standards are specifically for food irradiation applications, but the majority apply to all forms of gamma, X-ray, and electron beam radiation processing, including dosimetry for sterilization of health care products and the radiation processing of fruit, vegetables, meats, spices, processed foods, plastics, inks, medical wastes, and paper. Most of the standards provide exact procedures for using individual dosimetry systems or for characterizing various types of irradiation facilities, but one covers the selection and calibration of dosimetry systems, and another covers the treatment of uncertainties using the new ISO Type A and Type B evaluations. Unfortunately, nine of the 20 standards just adopted by the ISO are not the most recent versions of these standards and are therefore already out of date. To help solve this problem, efforts are being made to develop procedures to coordinate the ASTM and ISO development and revision processes for these and future ASTM-originating dosimetry standards. 
In the meantime, an additional four dosimetry standards have recently been published by the ASTM but have

  2. Strategic Leader Competencies for the Twenty-First Century

    National Research Council Canada - National Science Library

    Becker, Bradley A

    2007-01-01

    ...: interpersonal skills, conceptual skills, and technical skills. From these three primary strategic leadership skills, there is a list of twenty-one competencies that a strategic leader should possess...

  3. EFFECT OF CORE TRAINING ON SELECTED HEMATOLOGICAL VARIABLES AMONG BASKETBALL PLAYERS

    OpenAIRE

    K. Rejinadevi; Dr. C. Ramesh

    2017-01-01

    The purpose of the study was to find out the effect of core training on selected haematological variables among basketball players. For the purpose of the study, forty men basketball players were selected as subjects at random from S.V.N College and Arul Anandar College, Madurai, Tamilnadu, and their age ranged from 18 to 25 years. The selected subjects were divided into two groups of twenty subjects each. Group I acted as the core training group and Group II acted as the control group. The experimenta...

  4. Pharmacodynamics and safety of the novel selective progesterone receptor modulator vilaprisan: a double-blind, randomized, placebo-controlled phase 1 trial in healthy women.

    Science.gov (United States)

    Schütt, Barbara; Kaiser, Andreas; Schultze-Mosgau, Marcus-Hillert; Seitz, Christian; Bell, David; Koch, Manuela; Rohde, Beate

    2016-08-01

    Does administration of vilaprisan (VPR) to healthy women for 12 weeks reduce menstrual bleeding? In this 12-week proof-of-concept phase 1 trial, most women (30/33, 90%) who received VPR at daily doses of 1-5 mg reported the absence of menstrual bleeding. Vilaprisan (BAY 1002670) is a novel, highly potent selective progesterone receptor modulator that markedly reduces the growth of human leiomyoma tissue in a preclinical model of uterine fibroids (UFs). In this double-blind, parallel-group study, 73 of the 163 healthy women enrolled were randomized to daily VPR 0.1 mg (n = 12), 0.5 mg (n = 12), 1 mg (n = 13), 2 mg (n = 12), 5 mg (n = 12) or placebo tablets (n = 12) for 12 weeks. Participants were followed up until the start of the second menstrual bleeding after the end of treatment. Trial simulations were used to determine the minimum sample size required to estimate the non-bleeding rate (i.e. self-assessed bleeding intensity of 'none' or 'spotting') using Bayesian dose-response estimation with incorporated prior information. It was estimated that 48 participants in the per-protocol analysis population would be sufficient. Women aged 18-45 years who had been sterilized by tubal ligation were enrolled between November 2011 and May 2012. Participants kept a daily diary of bleeding intensity. Blood and urine samples were taken, and transvaginal ultrasound was performed before treatment, during treatment and at follow-up. Endometrial biopsies were obtained during the pretreatment cycle, at the end of the treatment period and during the follow-up phase. The primary outcome was the estimated dose-response curve of the observed non-bleeding rate during Days 10-84 of treatment, excluding the endometrial biopsy day and 2 days after biopsy. Secondary outcomes included return of bleeding during follow-up, size of follicle-like structures and serum hormone levels. 
Safety assessments included adverse events (AEs), endometrial thickness and histology, laboratory parameters, vital

  5. Randomization tests

    CERN Document Server

    Edgington, Eugene

    2007-01-01

    Statistical Tests That Do Not Require Random Sampling Randomization Tests Numerical Examples Randomization Tests and Nonrandom Samples The Prevalence of Nonrandom Samples in Experiments The Irrelevance of Random Samples for the Typical Experiment Generalizing from Nonrandom Samples Intelligibility Respect for the Validity of Randomization Tests Versatility Practicality Precursors of Randomization Tests Other Applications of Permutation Tests Questions and Exercises Notes References Randomized Experiments Unique Benefits of Experiments Experimentation without Mani
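
    A minimal two-sample randomization test of the kind the book treats can be sketched as follows; the data, function name, and choice of the mean difference as test statistic are illustrative assumptions, not taken from the text:

    ```python
    import random

    def randomization_test(group_a, group_b, n_permutations=10000, seed=0):
        """Two-sample randomization test on the difference of means.

        Repeatedly reassigns the pooled observations to two groups at
        random and counts how often the permuted |mean difference| is at
        least as extreme as the observed one.  Returns the Monte Carlo
        p-value.  No random sampling from a population is assumed: only
        the random assignment of units to groups, which is exactly what
        justifies this test for nonrandom samples.
        """
        rng = random.Random(seed)
        pooled = list(group_a) + list(group_b)
        n_a = len(group_a)
        observed = abs(sum(group_a) / n_a - sum(group_b) / len(group_b))
        hits = 0
        for _ in range(n_permutations):
            rng.shuffle(pooled)
            perm_a, perm_b = pooled[:n_a], pooled[n_a:]
            diff = abs(sum(perm_a) / n_a - sum(perm_b) / len(perm_b))
            if diff >= observed - 1e-12:  # tolerance for float comparison
                hits += 1
        return hits / n_permutations
    ```

    With well-separated groups such as `randomization_test([10, 11, 12, 13], [1, 2, 3, 4])` the returned p-value is small, while identical groups give a p-value of 1.
    
    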

  6. Selepressin, a novel selective vasopressin V1A agonist, is an effective substitute for norepinephrine in a phase IIa randomized, placebo-controlled trial in septic shock patients

    DEFF Research Database (Denmark)

    Russell, James A; Vincent, Jean-Louis; Kjølbye, Anne Louise

    2017-01-01

    BACKGROUND: Vasopressin is widely used for vasopressor support in septic shock patients, but experimental evidence suggests that selective V1A agonists are superior. The initial pharmacodynamic effects, pharmacokinetics, and safety of selepressin, a novel V1A-selective vasopressin analogue, was e...

  7. The effect of barusiban, a selective oxytocin antagonist, in threatened preterm labor at late gestational age: a randomized, double-blind, placebo-controlled trial

    DEFF Research Database (Denmark)

    Thornton, Steven; Goodwin, Thomas M; Greisen, Gorm

    2009-01-01

    OBJECTIVE: The objective of the study was to compare barusiban with placebo in threatened preterm labor. STUDY DESIGN: This was a randomized, double-blind, placebo-controlled, multicenter study. One hundred sixty-three women at 34-35 weeks plus 6 days, and with 6 or more contractions of 30 seconds...

  8. Prevalence, diagnostics and management of musculoskeletal disorders in primary health care in Sweden : an investigation of 2000 randomly selected patient records

    OpenAIRE

    Wiitavaara, Birgitta; Fahlström, Martin; Djupsjöbacka, Mats

    2017-01-01

    Abstract Rationale, aims and objectives The aims of this study are to investigate the prevalence of patients seeking care due to different musculoskeletal disorders (MSDs) at primary health care centres (PHCs), to chart factors such as symptoms, diagnosis and actions prescribed for patients who visited the PHCs due to MSD, and to make comparisons regarding differences due to gender, age and rural or urban PHC. Methods Patient records (2000) for patients of working age were randomly s...

  9. Treatment of Implant Exposure due to Skin Necroses after Skin Sparing Mastectomy: Initial Experiences Using a Not Selective Random Epigastric Flap.

    Science.gov (United States)

    Echazarreta-Gallego, Estíbaliz; Pola-Bandrés, Guillermo; Arribas-Del Amo, María Dolores; Gil-Romea, Ismael; Sousa-Domínguez, Ramón; Güemes-Sánchez, Antonio

    2017-10-01

    Breast prosthesis exposure is probably the most devastating complication after a skin sparing mastectomy (SSM) and implant-based, one-stage breast reconstruction. This complication may occur in the immediate post-operative period or in the weeks and even months after the procedure. In most cases, the cause is poor skin coverage of the implant due to skin necrosis. Eight consecutive cases of implant exposure (or risk of exposure) due to skin necrosis in SSM patients were collected over a period of 5 years; all patients were treated by the same medical team using a random epigastric rotation flap. A random epigastric flap (island or conventional rotation flap) was used to cover the skin defect. All the patients completed the procedure and all prostheses were saved; there were no cases of flap necrosis or infection. Cases of skin necrosis after SSM and immediate implant reconstruction, in which the implant is at risk of exposure, can be successfully treated with a random epigastric rotation flap.

  10. The twenty-eight lodges (xiu 宿)

    OpenAIRE

    Morgan, Daniel Patrick

    2018-01-01

    The twenty-eight lodges (xiu 宿). Named and numbered in the central circle, the twenty-eight lodges represent uneven orange-segment-like divisions of the celestial sphere running from pole to pole through the ‘guide stars’ (juxing 距星) at/near the western extremity of the constellations after which they are named. In the inner circle are the modern identifications of those guide stars. In the outer circle are the equatorial lodge widths in contemporary use in du 度 / days, where the number of d...

  11. Effect of mirtazapine versus selective serotonin reuptake inhibitors on benzodiazepine use in patients with major depressive disorder: a pragmatic, multicenter, open-label, randomized, active-controlled, 24-week trial.

    Science.gov (United States)

    Hashimoto, Tasuku; Shiina, Akihiro; Hasegawa, Tadashi; Kimura, Hiroshi; Oda, Yasunori; Niitsu, Tomihisa; Ishikawa, Masatomo; Tachibana, Masumi; Muneoka, Katsumasa; Matsuki, Satoshi; Nakazato, Michiko; Iyo, Masaomi

    2016-01-01

    This study aimed to evaluate whether selecting mirtazapine as the first choice for a current depressive episode, instead of selective serotonin reuptake inhibitors (SSRIs), reduces benzodiazepine use in patients with major depressive disorder (MDD). We concurrently examined the relationship between clinical responses and serum mature brain-derived neurotrophic factor (BDNF) and its precursor, proBDNF. We conducted an open-label randomized trial in routine psychiatric practice settings. Seventy-seven MDD outpatients were randomly assigned to the mirtazapine or predetermined SSRIs groups, and investigators arbitrarily selected sertraline or paroxetine. The primary outcome was the proportion of benzodiazepine users at weeks 6, 12, and 24 in the two groups. We defined patients showing a ≥50 % reduction in Hamilton depression rating scale (HDRS) scores from baseline as responders. Blood samples were collected at baseline and at weeks 6, 12, and 24. Sixty-five patients prescribed benzodiazepines from prescription day 1 were analyzed for the primary outcome. The percentage of benzodiazepine users was significantly lower in the mirtazapine than in the SSRIs group at weeks 6, 12, and 24 (21.4 vs. 81.8 %; 11.1 vs. 85.7 %; both differences statistically significant). Selecting mirtazapine as the first choice for current depressive episodes may thus reduce benzodiazepine use in patients with MDD. Trial registration: UMIN000004144. Registered 2nd September 2010. The date of enrolment of the first participant to the trial was 24th August 2010. This study was retrospectively registered 9 days after the first participant was enrolled.

  12. Building On Builder: The Persistent Icarus Syndrome at Twenty Years

    Science.gov (United States)

    2013-06-01

    mission of the United States Air Force is to "fly, fight, and win…in air, space and cyberspace"--as an integral member of the Joint team that...Scenarios: A Military Futurist Explores War in the Twenty-First Century (New York: Bantam Books Trade Paperbacks, 2009), 17. 33 Carl H. Builder

  13. Management of twenty patients with neck trauma in Khartoum ENT ...

    African Journals Online (AJOL)

    Background: Neck trauma is a great surgical challenge, because there are multi organ and systems involved. Objective: To study the clinical presentation, management and outcome of twenty patients presented to Khartoum ENT Hospital with neck trauma. Methods: This is a prospective study conducted in Khartoum ENT ...

  14. Proceedings of the Twenty Second Nordic Seminar on Computational Mechanics

    DEFF Research Database (Denmark)

    This book contains the proceedings of the Twenty Second Nordic Seminar on Computational Mechanics (NSCM22), held 22-23 October 2009 at Aalborg University, Denmark. The papers presented at the Optimization Seminar in Honour of Niels Olhoff, held 21 October 2009 at Aalborg University, Denmark...

  15. Proceedings of the Twenty-Eighth General Assembly Beijing 2012

    Science.gov (United States)

    Montmerle, Thierry

    2015-09-01

    Preface; 1. Inaugural ceremony; 2. Twenty-eighth General Assembly business sessions; 3. Closing ceremony; 4. Resolutions; 5. Report of Executive Committee, 2009-2012; 6. Reports on Division, Commission, and Working Group meetings; 7. Statutes, bye-laws, and working rules; 8. New members admitted at the General Assembly; 9. Divisions and their Commissions.

  16. Twenty Years of French "Didactique" Viewed from the United States

    Science.gov (United States)

    Kilpatrick, Jeremy

    2003-01-01

    One cannot begin considering the topic of this colloquium without asking, why twenty years? Why not two hundred? Two hundred years ago, Silvestre François Lacroix was about to be named chief officer of the Commission Exécutive de L'Instruction Publique. Out of that experience, together with his long career in instruction, especially as professor of…

  17. Digital earth applications in the twenty-first century

    NARCIS (Netherlands)

    de By, R.A.; Georgiadou, P.Y.

    2014-01-01

    In these early years of the twenty-first century, we must look at how the truly cross-cutting information technology supports other innovations, and how it will fundamentally change the information positions of government, private sector and the scientific domain as well as the citizen. In those

  18. Afterword: Victorian Sculpture for the Twenty-First Century

    Directory of Open Access Journals (Sweden)

    David J. Getsy

    2016-06-01

    Full Text Available Commenting on the directions proposed by this issue of '19', the afterword discusses the broad trends in twenty-first century studies of Victorian sculpture and the opportunity for debate arising from the first attempt at a comprehensive exhibition.

  19. Investigating the Twenty Year Lag in the Vocational Rehabilitation Process.

    Science.gov (United States)

    Lowitt, Julian

    In the rehabilitation workshop there is insufficient attention to job development oriented to the current and future needs of industry. Many types of work which were done in vocational workshops in contract from industrial firms are now done by automation. Semiskilled labor is thus in diminished demand. There is a twenty year lag in the industrial…

  20. SEAPOWER: A GUIDE FOR THE TWENTY- FIRST CENTURY

    African Journals Online (AJOL)

    Abel

    $154,37 (amazon.com hardback). With the publication of Seapower: A Guide for the Twenty-First Century, Geoffrey Till has set the standard for publications on all things maritime. The updated and expanded new edition of the book is an essential guide for students of naval history and maritime strategy and provides ...

  1. Summary record of the twenty-seventh meeting

    International Nuclear Information System (INIS)

    Young, P.

    1990-01-01

    The topics presented and discussed at the twenty-seventh international meeting of the Nuclear Data Committee of the Nuclear Energy Agency are summarized. Relations with other committees and reports of data centers are analyzed. Problems concerning nuclear model codes are underlined. National evaluation efforts on data library and data file are reported. Reports from several laboratories and subcommittees are summarized

  2. The selective serotonin reuptake inhibitor fluoxetine does not change rectal sensitivity and symptoms in patients with irritable bowel syndrome: a double blind, randomized, placebo-controlled study

    NARCIS (Netherlands)

    Kuiken, Sjoerd D.; Tytgat, Guido N. J.; Boeckxstaens, Guy E. E.

    2003-01-01

    BACKGROUND & AIMS: Although widely prescribed, the evidence for the use of antidepressants for the treatment of irritable bowel syndrome (IBS) is limited. In this study, we hypothesized that fluoxetine (Prozac), a selective serotonin reuptake inhibitor, has visceral analgesic properties, leading to

  3. A sequential logic circuit for coincidences randomly distributed in 'time' and 'duration', with selection and total sampling

    International Nuclear Information System (INIS)

    Carnet, Bernard; Delhumeau, Michel

    1971-06-01

    The principles of binary analysis applied to the investigation of sequential circuits were used to design a two-way coincidence circuit whose inputs may be random or periodic variables of constant or variable duration. The output signal strictly reproduces the characteristics of the input signal triggering the coincidence. A coincidence between input signals does not produce any output signal if one of the signals has already triggered the output signal. The characteristics of the output signal in relation to those of the input signal are: minimum time jitter, excellent duration reproducibility, and maximum efficiency. Some rules are given for achieving these results. The symmetry, transitivity, and non-transitivity characteristics of the edges on the primitive graph are analyzed and lead to some rules for positioning the states on a secondary graph. It is from this graph that the equations of the circuits can be calculated. The development of the circuit and its dynamic testing are discussed. For this testing, the functioning of the circuit is simulated by feeding randomly generated signals into the input.
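
    The triggering rule described above (an overlap of the two inputs opens the output, which then follows the triggering signal, while a coincidence arriving during an active output is ignored) can be sketched as a discrete-time state machine. This is an illustrative software model, not the original synchronous circuit; the sampling into 0/1 time steps and the tie-breaking choice for simultaneous rising edges are assumptions:

    ```python
    def coincidence(a, b):
        """Discrete-time sketch of a two-way coincidence circuit.

        a, b: equal-length lists of 0/1 samples of the two inputs.
        On a coincidence while idle, the output reproduces the
        later-arriving (triggering) input for its remaining duration;
        coincidences occurring while the output is active are ignored.
        """
        out = []
        rise = {"a": -1, "b": -1}    # time of each input's last rising edge
        prev = {"a": 0, "b": 0}
        follow = None                # which input the output is reproducing
        for t, (x, y) in enumerate(zip(a, b)):
            if x and not prev["a"]:
                rise["a"] = t
            if y and not prev["b"]:
                rise["b"] = t
            if follow is None and x and y:
                # new coincidence while idle: follow the later-arriving input
                follow = "a" if rise["a"] >= rise["b"] else "b"
            level = {"a": x, "b": y}[follow] if follow is not None else 0
            out.append(int(level))
            if follow is not None and not level:
                follow = None        # triggering input ended; back to idle
            prev = {"a": x, "b": y}
        return out
    ```

    For example, if input b rises during a long pulse on input a, the output reproduces b for its full duration rather than the logical AND of the two inputs.
    
    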

  4. Impact of selected magnetic fields on the therapeutic effect in patients with lumbar discopathy: A prospective, randomized, single-blinded, and placebo-controlled clinical trial.

    Science.gov (United States)

    Taradaj, Jakub; Ozon, Marcin; Dymarek, Robert; Bolach, Bartosz; Walewicz, Karolina; Rosińczuk, Joanna

    2018-03-23

    Interdisciplinary physical therapy together with pharmacological treatment constitutes the conservative treatment strategy for low back pain (LBP). There is still a lack of high-quality studies aimed at an objective evaluation of physiotherapeutic procedures according to their effectiveness in LBP. The aim of this study was to carry out a prospective, randomized, single-blinded, and placebo-controlled clinical trial to evaluate the effectiveness of magnetic fields in discopathy-related LBP. A group of 177 patients was assessed for eligibility based on inclusion and exclusion criteria. In the end, 106 patients were randomly assigned into 5 comparative groups: A (n = 23; magnetic therapy: 10 mT, 50 Hz); B (n = 23; magnetic therapy: 5 mT, 50 Hz); C (n = 20; placebo magnetic therapy); D (n = 20; magnetic stimulation: 49.2 μT, 195 Hz); and E (n = 20; placebo magnetic stimulation). All patients were assessed using tests for pain intensity, degree of disability, and range of motion. Postural stability was also assessed using a stabilographic platform. In this study, positive changes in all clinical outcomes were demonstrated in group A (p < 0.05). It was determined that the application of magnetic therapy (10 mT, 50 Hz, 20 min) significantly reduces pain symptoms and leads to an improvement of functional ability in patients with LBP.

  5. Why the American public supports twenty-first century learning.

    Science.gov (United States)

    Sacconaghi, Michele

    2006-01-01

    Aware that constituent support is essential to any educational endeavor, the AOL Time Warner Foundation (now the Time Warner Foundation), in conjunction with two respected national research firms, measured Americans' attitudes toward the implementation of twenty-first century skills. The foundation's national research survey was intended to explore public perceptions of the need for changes in the educational system, in school and after school, with respect to the teaching of twenty-first century skills. The author summarizes the findings of the survey, which were released by the foundation in June 2003. One thousand adults were surveyed by telephone, including African Americans, Latinos, teachers, and business executives. In general, the survey found that Americans believe today's students need a "basics-plus" education, meaning communication, technology, and critical thinking skills in addition to the traditional basics of reading, writing, and math. In fact, 92 percent of respondents stated that students today need different skills from those of ten to twenty years ago. Also, after-school programs were found to be an appropriate vehicle to teach these skills. Furthermore, the survey explored how well the public perceives schools to be preparing youth for the workforce and postsecondary education, which twenty-first century skills are seen as being taught effectively, and the level of need for after-school and summer programs. The survey results provide conclusive evidence of national support for basics-plus education. Thus, a clear opportunity exists to build momentum for a new model of education for the twenty-first century.

  6. The RANDOM computer program: A linear congruential random number generator

    Science.gov (United States)

    Miles, R. F., Jr.

    1986-01-01

    The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) The RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) The RANCYCLE and ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
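
    The linear congruential form x_{n+1} = (a·x_n + c) mod m can be sketched in a few lines of Python. The parameters below are the well-known Park-Miller ("minimal standard") constants, used here only as an illustration; they are not necessarily the parameters selected by the RANDOM program:

    ```python
    def lcg(seed, a=16807, c=0, m=2**31 - 1):
        """Linear congruential generator: x_{n+1} = (a*x_n + c) mod m.

        With the default (Park-Miller) parameters, yields a stream of
        integers in [1, m-1]; dividing by m rescales them to (0, 1).
        """
        x = seed
        while True:
            x = (a * x + c) % m
            yield x

    # Draw a few values and rescale to the unit interval:
    gen = lcg(seed=1)
    values = [next(gen) / (2**31 - 1) for _ in range(3)]
    ```

    A full-period LCG of this kind cycles through every value in [1, m-1] before repeating, which is why parameter selection (the subject of RANCYCLE and ARITH) matters so much.
    
    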

  7. Random walk on random walks

    NARCIS (Netherlands)

    Hilário, M.; Hollander, den W.Th.F.; Sidoravicius, V.; Soares dos Santos, R.; Teixeira, A.

    2014-01-01

    In this paper we study a random walk in a one-dimensional dynamic random environment consisting of a collection of independent particles performing simple symmetric random walks in a Poisson equilibrium with density ρ ∈ (0, ∞). At each step the random walk performs a nearest-neighbour jump, moving to
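
    A toy simulation of such a walk is easy to write down. Everything below is an illustrative assumption rather than the paper's model: the window size, the densities, and in particular the jump rule (the walker drifts right with one probability when it shares its site with at least one environment particle and with another when it does not):

    ```python
    import math
    import random

    def poisson(rng, lam):
        """Sample a Poisson(lam) variate by inversion (Knuth's method)."""
        limit, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= limit:
                return k
            k += 1

    def simulate(steps=100, density=1.0, p_on=0.8, p_off=0.3, size=200, seed=42):
        """Toy 1-D random walk on top of a dynamic random environment.

        The environment is a Poisson cloud (mean `density` particles per
        site, restricted to a finite window) of independent simple
        symmetric random walks on Z.  At each step every environment
        particle jumps +/-1 with probability 1/2; the walker then jumps
        right with probability `p_on` if its site is occupied by at least
        one particle and `p_off` otherwise, left with the complementary
        probability.  Returns the walker's final position.
        """
        rng = random.Random(seed)
        particles = [site
                     for site in range(-size, size + 1)
                     for _ in range(poisson(rng, density))]
        x = 0
        for _ in range(steps):
            particles = [q + rng.choice((-1, 1)) for q in particles]
            p_right = p_on if x in set(particles) else p_off
            x += 1 if rng.random() < p_right else -1
        return x
    ```

    Averaging the final position over many seeds gives a crude estimate of the walker's speed, the kind of quantity whose existence such papers establish rigorously.
    
    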

  8. Efficacy and Safety of Axitinib Versus Sorafenib in Metastatic Renal Cell Carcinoma: Subgroup Analysis of Japanese Patients from the Global Randomized Phase 3 AXIS Trial

    OpenAIRE

    Ueda, Takeshi; Uemura, Hirotsugu; Tomita, Yoshihiko; Tsukamoto, Taiji; Kanayama, Hiroomi; Shinohara, Nobuo; Tarazi, Jamal; Chen, Connie; Kim, Sinil; Ozono, Seiichiro; Naito, Seiji; Akaza, Hideyuki

    2013-01-01

    Objective Axitinib is a potent and selective second-generation inhibitor of vascular endothelial growth factor receptors 1, 2 and 3. The efficacy and safety of axitinib in Japanese patients with metastatic renal cell carcinoma were evaluated. Methods A subgroup analysis was conducted in Japanese patients enrolled in the randomized Phase III trial of axitinib versus sorafenib after failure of one prior systemic therapy for metastatic renal cell carcinoma. Results Twenty-five (of 361) and 29 (o...

  9. Repeatability and number of growing seasons for the selection of custard apple progenies

    Directory of Open Access Journals (Sweden)

    Julio César Do Vale

    2011-01-01

    Full Text Available This study aimed to estimate the repeatability coefficient and determine the minimum number of samples required for effective selection for yield of custard apple. Twenty progenies were evaluated in randomized blocks, five replications and four plants per plot. The fruits were collected, counted and weighed every two days of the year. Estimates of the repeatability coefficients were obtained by the methods of analysis of variance - ANOVA and principal components - PC. The estimates from the repeatability analysis of biennial data are higher than those based on individual years. The estimates of the PC method were accurate even in the first harvest, unlike ANOVA. Four biennia were sufficient to ensure effective progeny selection of custard apple.
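
    The ANOVA route to a repeatability coefficient is essentially an intraclass correlation built from the between- and within-genotype mean squares, r = (MSB − MSW) / (MSB + (k − 1)·MSW) for k repeated measurements per genotype. A minimal sketch with made-up numbers (not the study's data or its exact model, which also includes block effects) could be:

    ```python
    def repeatability(measurements):
        """ANOVA-based repeatability (intraclass correlation).

        measurements: list of equal-length lists, one per genotype, each
        holding k repeated measurements (e.g. harvests) of that genotype.
        Returns r = (MSB - MSW) / (MSB + (k - 1) * MSW), where MSB and
        MSW are the between- and within-genotype mean squares.
        """
        g = len(measurements)            # number of genotypes
        k = len(measurements[0])         # measurements per genotype
        grand = sum(sum(m) for m in measurements) / (g * k)
        means = [sum(m) / k for m in measurements]
        msb = k * sum((mu - grand) ** 2 for mu in means) / (g - 1)
        msw = sum((x - mu) ** 2
                  for m, mu in zip(measurements, means)
                  for x in m) / (g * (k - 1))
        return (msb - msw) / (msb + (k - 1) * msw)
    ```

    Perfectly consistent repeated measurements give r = 1; the lower r is, the more growing seasons are needed before selection among progenies becomes reliable.
    
    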

  10. Twenty natural organic pigments for application in dye sensitized solar cells

    Science.gov (United States)

    Castillo, D.; Sánchez Juárez, A.; Espinosa Tapia, S.; Guaman, A.; Obregón Calderón, D.

    2016-09-01

    In this work we present the results of a study of twenty natural pigments obtained from plants and insects from southern Ecuador. Many of them can be considered potential natural sensitizers for the construction of DSSCs. The results indicate that these pigments have good performance in their absorbance and wavelength spectra. The four best pigments were selected for the construction of DSSCs: Rumex tolimensis Wedd, Raphanus sativus, Hibiscus sabdariffa, and Prunus serotina; however, the conversion efficiency is lower than 1%.

  11. Randomized Comparison of Selective Internal Radiotherapy (SIRT) Versus Drug-Eluting Bead Transarterial Chemoembolization (DEB-TACE) for the Treatment of Hepatocellular Carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Pitton, Michael B., E-mail: michael.pitton@unimedizin-mainz.de; Kloeckner, Roman [Johannes Gutenberg University Medical Center, Department of Diagnostic and Interventional Radiology (Germany); Ruckes, Christian [Johannes Gutenberg University Medical Center, IZKS (Germany); Wirth, Gesine M. [Johannes Gutenberg University Medical Center, Department of Diagnostic and Interventional Radiology (Germany); Eichhorn, Waltraud [Johannes Gutenberg University Medical Center, Department of Nuclear Medicine (Germany); Wörns, Marcus A.; Weinmann, Arndt [Johannes Gutenberg University Medical Center, Department of Internal Medicine (Germany); Schreckenberger, Mathias [Johannes Gutenberg University Medical Center, Department of Nuclear Medicine (Germany); Galle, Peter R. [Johannes Gutenberg University Medical Center, Department of Internal Medicine (Germany); Otto, Gerd [Johannes Gutenberg University Medical Center, Department of Transplantation Surgery (Germany); Dueber, Christoph [Johannes Gutenberg University Medical Center, Department of Diagnostic and Interventional Radiology (Germany)

    2015-04-15

    Purpose: To prospectively compare SIRT and DEB-TACE for treating hepatocellular carcinoma (HCC). Methods: From 04/2010 to 07/2012, 24 patients with histologically proven unresectable N0, M0 HCCs were randomized 1:1 to receive SIRT or DEB-TACE. SIRT could be repeated once in case of recurrence, while TACE was repeated every 6 weeks until no viable tumor tissue was detected by MRI or contraindications prohibited further treatment. Patients were followed up by MRI every 3 months; the final evaluation was 05/2013. Results: Both groups were comparable in demographics (SIRT: 8 males/4 females, mean age 72 ± 7 years; TACE: 10 males/2 females, mean age 71 ± 9 years), initial tumor load (1 patient ≥25 % in each group), and BCLC (Barcelona Clinic Liver Cancer) stage (SIRT: 12×B; TACE: 1×A, 11×B). Median progression-free survival (PFS) was 180 days for SIRT versus 216 days for TACE patients (p = 0.6193), with a median TTP of 371 days versus 336 days, respectively (p = 0.5764). Median OS was 592 days for SIRT versus 788 days for TACE patients (p = 0.9271). Seven patients died in each group. Causes of death were liver failure (n = 4, SIRT group), tumor progression (n = 4, TACE group), cardiovascular events, and inconclusive (n = 1 in each group). Conclusions: No significant differences were found in median PFS, OS, and TTP. The lower rate of tumor progression in the SIRT group was nullified by a greater incidence of liver failure. This pilot study is the first prospective randomized trial comparing SIRT and TACE for treating HCC, and results can be used for sample size calculations of future studies.

  12. Selection of peripheral intravenous catheters with 24-gauge side-holes versus those with 22-gauge end-hole for MDCT: A prospective randomized study

    Energy Technology Data Exchange (ETDEWEB)

    Tamura, Akio, E-mail: a.akahane@gmail.com [Department of Radiology, Iwate Medical University School of Medicine, 19-1 Uchimaru, Morioka 020-8505 (Japan); Kato, Kenichi, E-mail: kkato@iwate-med.ac.jp [Department of Radiology, Iwate Medical University School of Medicine, 19-1 Uchimaru, Morioka 020-8505 (Japan); Kamata, Masayoshi, E-mail: kamataaoi@yahoo.co.jp [Iwate Medical University Hospital, 19-1 Uchimaru, Morioka 020-8505 (Japan); Suzuki, Tomohiro, E-mail: suzukitomohiro123@gmail.com [Department of Radiology, Iwate Medical University School of Medicine, 19-1 Uchimaru, Morioka 020-8505 (Japan); Suzuki, Michiko, E-mail: mamimichiko@me.com [Department of Radiology, Iwate Medical University School of Medicine, 19-1 Uchimaru, Morioka 020-8505 (Japan); Nakayama, Manabu, E-mail: gakuymgt@yahoo.co.jp [Department of Radiology, Iwate Medical University School of Medicine, 19-1 Uchimaru, Morioka 020-8505 (Japan); Tomabechi, Makiko, E-mail: mtomabechi@mac.com [Department of Radiology, Iwate Medical University School of Medicine, 19-1 Uchimaru, Morioka 020-8505 (Japan); Nakasato, Tatsuhiko, E-mail: nakasato77@gmail.com [Department of Radiology, Southern Tohoku Research Institute for Neuroscience, 7-115 Yatsuyamada, Koriyama 963-8563 (Japan); Ehara, Shigeru, E-mail: ehara@iwate-med.ac.jp [Department of Radiology, Iwate Medical University School of Medicine, 19-1 Uchimaru, Morioka 020-8505 (Japan)

    2017-02-15

Highlights: • We compared 24-gauge side-hole and conventional 22-gauge end-hole catheters in MDCT. • The 24-gauge side-hole catheter is noninferior to the 22-gauge end-hole catheter. • The 24-gauge side-hole catheter is safe and facilitates optimal enhancement quality. • The 24-gauge side-hole catheter is suitable for patients with narrow or fragile veins. - Abstract: Purpose: To compare the 24-gauge side-holes catheter and conventional 22-gauge end-hole catheter in terms of safety, injection pressure, and contrast enhancement on multi-detector computed tomography (MDCT). Materials & methods: In a randomized single-center study, 180 patients were randomized to either the 24-gauge side-holes catheter or the 22-gauge end-hole catheter groups. The primary endpoint was safety during intravenous administration of contrast material for MDCT, using a non-inferiority analysis (lower limit of the 95% CI greater than the −10% non-inferiority margin for the group difference). The secondary endpoints were injection pressure and contrast enhancement. Results: A total of 174 patients were analyzed for safety during intravenous contrast material administration for MDCT. The overall extravasation rate was 1.1% (2/174 patients); 1 (1.2%) minor episode occurred in the 24-gauge side-holes catheter group and 1 (1.1%) in the 22-gauge end-hole catheter group (difference: 0.1%, 95% CI: −3.17% to 3.28%, non-inferiority P = 1). The mean maximum pressure was higher with the 24-gauge side-holes catheter than with the 22-gauge end-hole catheter (8.16 ± 0.95 kg/cm² vs. 4.79 ± 0.63 kg/cm², P < 0.001). The mean contrast enhancement of the abdominal aorta, celiac artery, superior mesenteric artery, and pancreatic parenchyma did not differ significantly between the two groups. Conclusion: In conclusion, our study showed that the 24-gauge side-holes catheter is safe and suitable for delivering iodine with a concentration of 300 mg/mL at a flow-rate of 3 mL/s, and it may contribute to
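The non-inferiority claim in this record rests on the lower limit of the 95% CI for the risk difference staying above the −10% margin. As a rough illustration (not the authors' code: the group sizes of 83 and 91 are our reading of the reported rates 1.2% = 1/83 and 1.1% = 1/91, and the paper may have used an exact rather than a Wald interval), the arithmetic can be sketched as:

```python
import math

def risk_diff_ci(e1, n1, e2, n2, z=1.96):
    """Wald 95% CI for the difference in event proportions (group 1 - group 2)."""
    p1, p2 = e1 / n1, e2 / n2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    d = p1 - p2
    return d, d - z * se, d + z * se

# Assumed group sizes (83 vs. 91); one extravasation event per arm.
d, lo, hi = risk_diff_ci(1, 83, 1, 91)
print(f"diff={d:.2%}, CI=({lo:.2%}, {hi:.2%}), noninferior={lo > -0.10}")
```

With one event per arm this reproduces the reported difference of about 0.1% and an interval close to the published −3.17% to 3.28%, comfortably above the −10% margin.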

  14. Twenty years of analysis of light elements at the LARN

    International Nuclear Information System (INIS)

    Demortier, G.

    1992-01-01

    We review the applications of ion beam analysis of light elements performed in the LARN during the last twenty years. The works mainly concern: helium bubbles in aluminum foils, Li in aluminum alloys, carbon in high purity MgO crystals and in olivines, nitrogen bubbles in glass and implanted nitrogen in iron and aluminum, oxygen in YBaCuO superconductors, fluorine in tooth enamel and implanted fluorine in metals. (orig.)

  15. Proceedings of the twenty-first LAMPF users group meeting

    International Nuclear Information System (INIS)

    1988-04-01

    The Twenty-First Annual LAMPF Users Group Meeting was held November 9-10, 1987, at the Clinton P. Anderson Meson Physics Facility. The program included a number of invited talks on various aspects of nuclear and particle physics as well as status reports on LAMPF and discussions of upgrade options. The LAMPF working groups met and discussed plans for the secondary beam lines, experimental programs, and computing facilities

  16. Proceedings of the twenty-second LAMPF users group meeting

    International Nuclear Information System (INIS)

    Marinuzzi, R.

    1989-04-01

    The Twenty-Second Annual LAMPF Users Group Meeting was held October 17--18, 1988, at the Clinton P. Anderson Meson Physics Facility. The program included a number of invited talks on various aspects of nuclear and particle physics as well as status reports on LAMPF and discussions of upgrade options. The LAMPF working groups met and discussed plans for the secondary beam lines, experimental programs, and computing facilities

  17. Twenty five years of clusters -- from Bochum to Strasbourg

    International Nuclear Information System (INIS)

    Betts, R.R.; Chicago Univ., IL

    1994-01-01

    Developments in the area of clustering aspects of nuclear structure and reactions over the past twenty-five years are reviewed. The viewpoint is that the nucleus is an assembly of clusters. The question is whether clusters actually exist in the nucleus. Although there is abundant evidence for this in light nuclei, the situation for more complex clusters in heavier nuclei is much worse. Differential cross sections for scattering of alpha particles and heavy ions are shown

  18. Technological sciences society of the twenty-first century

    International Nuclear Information System (INIS)

    1999-04-01

This book introduces the information-oriented society of the twenty-first century, connected by computer networks. Topics include the memory of dreams (F-RAM); the information-oriented society and new media; ISDN, the communications network of the next generation, and what ISDN is; the development of the information service industry; the path from office automation to the intelligent building of the future; home shopping and home banking; and the obstacles that hinder the information-oriented society.

  19. NATO’s Relevance in the Twenty-First Century

    Science.gov (United States)

    2012-03-22

Christopher Coker, Globalisation and Insecurity in the Twenty-first Century: NATO and the Management of Risk (The International Institute for Strategic

  20. Designing Vaccines for the Twenty-First Century Society

    OpenAIRE

    Finco, Oretta; Rappuoli, Rino

    2014-01-01

    The history of vaccination clearly demonstrates that vaccines have been highly successful in preventing infectious diseases, reducing significantly the incidence of childhood diseases and mortality. However, many infections are still not preventable with the currently available vaccines and they represent a major cause of mortality worldwide. In the twenty-first century, the innovation brought by novel technologies in antigen discovery and formulation together with a deeper knowledge of the h...

  1. Early twenty-first-century droughts during the warmest climate

    Directory of Open Access Journals (Sweden)

    Felix Kogan

    2016-01-01

The first 13 years of the twenty-first century have begun with a series of widespread, long and intensive droughts around the world. Extreme and severe-to-extreme intensity droughts covered 2%–6% and 7%–16% of the world land, respectively, affecting environment, economies and humans. These droughts reduced agricultural production, leading to food shortages, human health deterioration, poverty, regional disturbances, population migration and death. This feature article is a travelogue of the twenty-first-century global and regional droughts during the warmest years of the past 100 years. These droughts were identified and monitored with the National Oceanic and Atmospheric Administration operational space technology, called vegetation health (VH), which has the longest period of observation and provides good data quality. The VH method was used for assessment of vegetation condition or health, including drought early detection and monitoring. The VH method is based on operational satellite data estimating both land surface greenness (NDVI) and thermal conditions. The twenty-first-century droughts in the USA, Russia, Australia and the Horn of Africa were intensive, long, covered large areas and caused huge losses in agricultural production, which affected food security and led to food riots in some countries. This research also investigates drought dynamics, presenting no definite conclusion about drought intensification and/or expansion during the time of the warmest globe.
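The VH indices combine a greenness signal with thermal conditions; the greenness component is the standard NDVI, which is simple to state. A minimal sketch (ours, not NOAA's operational code):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from near-infrared and red reflectance."""
    return (nir - red) / (nir + red)

# Dense healthy vegetation reflects strongly in the near-infrared,
# so NDVI is high; bare or stressed land gives values near zero.
print(round(ndvi(0.50, 0.10), 2))  # → 0.67
```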

  2. Open Adoption Placement by Birth Mothers in Their Twenties.

    Science.gov (United States)

    Clutter, Lynn B

    The purpose of this study was to summarize birth mothers' descriptions of unplanned pregnancy experienced in their twenties and how open adoption influenced their lives. Naturalistic inquiry was used with purposive sampling from one agency and telephone interviews of women who experienced unplanned pregnancy in their twenties and relinquishment through open adoption. Recorded, transcribed, and deidentified interviews were analyzed for qualitative themes. Fifteen participants judiciously weighed the open adoption decision. Over half parented other children prior to placement. Most knew they could not have parented this child due to life stressors. Placement was a hard decision, but ongoing contact with birth child and adoptive family was valued. Open adoption processes made them stronger by being happy that their child experienced family life with greater opportunities than birth mothers could offer at the time. Summarized themes used the acronym COMMITTED: C-care deeply about what is best for the child, O-ongoing open adoption: good and hard, M-meeting together regularly, M-moving on in personal growth, accomplishments, and milestones, I-independence from previous stressors or crises, T-transitions, T-therapeutic support, E-emotions, D-depression giving way to deepened strength and personal direction. Open adoption is reinforced as a positive resolution of unintended pregnancy for birth mothers in their twenties.

  3. Optimal dose selection accounting for patient subpopulations in a randomized Phase II trial to maximize the success probability of a subsequent Phase III trial.

    Science.gov (United States)

    Takahashi, Fumihiro; Morita, Satoshi

    2018-02-08

    Phase II clinical trials are conducted to determine the optimal dose of the study drug for use in Phase III clinical trials while also balancing efficacy and safety. In conducting these trials, it may be important to consider subpopulations of patients grouped by background factors such as drug metabolism and kidney and liver function. Determining the optimal dose, as well as maximizing the effectiveness of the study drug by analyzing patient subpopulations, requires a complex decision-making process. In extreme cases, drug development has to be terminated due to inadequate efficacy or severe toxicity. Such a decision may be based on a particular subpopulation. We propose a Bayesian utility approach (BUART) to randomized Phase II clinical trials which uses a first-order bivariate normal dynamic linear model for efficacy and safety in order to determine the optimal dose and study population in a subsequent Phase III clinical trial. We carried out a simulation study under a wide range of clinical scenarios to evaluate the performance of the proposed method in comparison with a conventional method separately analyzing efficacy and safety in each patient population. The proposed method showed more favorable operating characteristics in determining the optimal population and dose.
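The paper's BUART model is a bivariate normal dynamic linear model, but its underlying decision rule, trading efficacy against toxicity through a utility and picking the maximizing dose, can be caricatured in a few lines. All numbers below are hypothetical, and the linear utility is a stand-in for the paper's actual utility function:

```python
# Toy illustration (not the paper's BUART model): pick the dose that
# maximizes a simple utility trading efficacy against toxicity.
doses = [10, 20, 40, 80]                  # hypothetical dose levels
post_eff = [0.20, 0.35, 0.50, 0.60]       # posterior mean response rates
post_tox = [0.05, 0.08, 0.15, 0.40]       # posterior mean toxicity rates
lam = 1.5                                 # penalty per unit of toxicity risk

utility = [e - lam * t for e, t in zip(post_eff, post_tox)]
best = doses[utility.index(max(utility))]
print(best)  # the highest dose loses because its toxicity outweighs its efficacy
```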

  4. Random magnetism

    International Nuclear Information System (INIS)

    Tahir-Kheli, R.A.

    1975-01-01

A few simple problems relating to random magnetic systems are presented. These systems are assumed to be translationally symmetric only on the macroscopic scale. On the microscopic scale, the parameters of the various regions form a random set obeying given probability distributions, whose forms are assumed known in all cases.

  5. Comparison between paricalcitol and active non-selective vitamin D receptor activator for secondary hyperparathyroidism in chronic kidney disease: a systematic review and meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Cai, Panpan; Tang, Xiaohong; Qin, Wei; Ji, Ling; Li, Zi

    2016-04-01

The goal of this systematic review is to evaluate the efficacy and safety of paricalcitol versus active non-selective vitamin D receptor activators (VDRAs) for the management of secondary hyperparathyroidism (SHPT) in chronic kidney disease (CKD) patients. PubMed, EMBASE, the Cochrane Central Register of Controlled Trials (CENTRAL), clinicaltrials.gov (inception to September 2015), and the ASN Web site were searched for relevant studies. A meta-analysis of randomized controlled trials (RCTs) and quasi-RCTs that assessed the effects and adverse events of paricalcitol and active non-selective VDRAs in adult CKD patients with SHPT was performed using Review Manager 5.2. A total of 10 trials involving 734 patients were identified for this review. The quality of the included trials was limited, and very few trials reported all-cause mortality or cardiovascular calcification; those that did found no differences between the two groups. Compared with active non-selective VDRAs, paricalcitol showed no significant difference in either PTH reduction (MD −7.78, 95% CI −28.59 to 13.03, P = 0.46) or the proportion of patients who achieved the target reduction of PTH (OR 1.27, 95% CI 0.87–1.85, P = 0.22). In addition, no statistical differences were found in terms of serum calcium, episodes of hypercalcemia, serum phosphorus, calcium × phosphorus products, and bone metabolism indices. Current evidence is insufficient to show that paricalcitol is superior to active non-selective VDRAs in lowering PTH or reducing the burden of mineral loading. Further trials are required to prove the tissue-selective effect of paricalcitol and to overcome the limitations of current research.
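Pooled mean differences like the MD quoted above come from inverse-variance weighting of the trial-level effects. A minimal fixed-effect sketch, assuming each trial reports a mean difference with a 95% CI (the review may have used a random-effects model, and the numbers in the example are made up, not taken from the review):

```python
import math

def pool_fixed(effects):
    """Inverse-variance fixed-effect pooling of mean differences.
    `effects` is a list of (md, ci_lo, ci_hi) tuples with 95% CIs."""
    weights, weighted = [], []
    for md, lo, hi in effects:
        se = (hi - lo) / (2 * 1.96)        # back-calculate the SE from the CI
        w = 1.0 / se ** 2                  # weight = inverse variance
        weights.append(w)
        weighted.append(w * md)
    pooled = sum(weighted) / sum(weights)
    se_pooled = 1.0 / math.sqrt(sum(weights))
    return pooled, pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled

# Two hypothetical trials; the tighter CI gets four times the weight.
print(pool_fixed([(-10.0, -30.0, 10.0), (-5.0, -15.0, 5.0)]))
```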

  6. Trigemino-gustatory interactions: a randomized controlled clinical trial assessing the effects of selective anesthesia of dental afferents on taste thresholds.

    Science.gov (United States)

    Lecor, Papa Abdou; Touré, Babacar; Boucher, Yves

    2018-03-01

This study aimed at analyzing the effect of the temporary removal of trigeminal dental afferents on electrogustometric thresholds (EGMt). EGMt were measured in 300 healthy subjects randomized into three groups, at nine loci on the right and left sides (RS, LS) of the tongue surface before and after anesthesia. Group IAN (n = 56 RS, n = 44 LS) received intraosseous local anesthesia of the inferior alveolar nerve (IAN). Group MdN received a mandibular nerve (MdN) block targeting the IAN before its entrance into the mandibular foramen (n = 60 RS, n = 40 LS); group MxN, receiving maxillary nerve (MxN) anesthesia (n = 56 RS, n = 44 LS), was the control group. Differences between mean EGMt were analyzed with the Wilcoxon test; correlation between type of anesthesia and EGMt was assessed with Spearman's rho, with the level of significance set at p ≤ 0.05. Significant EGMt (μA) differences before and after anesthesia were found in all loci with MdN and IAN on the ipsilateral side (p ≤ 0.05). Anesthesia of the MdN was positively correlated with the increase in EGMt (p ≤ 0.05), whereas anesthesia of the IAN was positively correlated only with the increase in EGMt measured at posterior and dorsal loci of the tongue surface (p ≤ 0.05). This increase in taste thresholds under anesthesia suggests a participation of dental afferents in taste perception. Extraction of teeth may thus impair food intake not only through reduced masticatory ability but also through alteration of trigemino-gustatory interactions. Trial registration: PACTR201602001452260.

  7. 10-Year Mortality Outcome of a Routine Invasive Strategy Versus a Selective Invasive Strategy in Non-ST-Segment Elevation Acute Coronary Syndrome: The British Heart Foundation RITA-3 Randomized Trial.

    Science.gov (United States)

    Henderson, Robert A; Jarvis, Christopher; Clayton, Tim; Pocock, Stuart J; Fox, Keith A A

    2015-08-04

The RITA-3 (Third Randomised Intervention Treatment of Angina) trial compared outcomes of a routine early invasive strategy (coronary arteriography and myocardial revascularization, as clinically indicated) to those of a selective invasive strategy (coronary arteriography for recurrent ischemia only) in patients with non-ST-segment elevation acute coronary syndrome (NSTEACS). At a median of 5 years' follow-up, the routine invasive strategy was associated with a 24% reduction in the odds of all-cause mortality. This study reports 10-year follow-up outcomes of the randomized cohort to determine the impact of a routine invasive strategy on longer-term mortality. We randomized 1,810 patients with NSTEACS to receive routine invasive or selective invasive strategies. All randomized patients had annual follow-up visits up to 5 years, and mortality was documented thereafter using data from the Office of National Statistics. Over 10 years, there were no differences in mortality between the 2 groups (all-cause deaths in 225 [25.1%] vs. 232 patients [25.4%]: p = 0.94; and cardiovascular deaths in 135 [15.1%] vs. 147 patients [16.1%]: p = 0.65 in the routine invasive and selective invasive groups, respectively). Multivariate analysis identified several independent predictors of 10-year mortality: age, previous myocardial infarction, heart failure, smoking status, diabetes, heart rate, and ST-segment depression. A modified post-discharge Global Registry of Acute Coronary Events (GRACE) score was used to calculate an individual risk score for each patient and to form low-risk, medium-risk, and high-risk groups. Risk of death within 10 years varied markedly from 14.4% in the low-risk group to 56.2% in the high-risk group. This mortality trend did not depend on the assigned treatment strategy. The advantage of reduced mortality of a routine early invasive strategy seen at 5 years was attenuated during later follow-up, with no evidence of a difference in outcome at 10 years.

  8. The selective beta 1-blocking agent metoprolol compared with antithyroid drug and thyroxine as preoperative treatment of patients with hyperthyroidism. Results from a prospective, randomized study.

    Science.gov (United States)

    Adlerberth, A; Stenström, G; Hasselgren, P O

    1987-01-01

Despite the increasing use of beta-blocking agents alone as preoperative treatment of patients with hyperthyroidism, there are no controlled clinical studies in which this regimen has been compared with a more conventional preoperative treatment. Thirty patients with newly diagnosed and untreated hyperthyroidism were randomized to preoperative treatment with methimazole in combination with thyroxine (Group I) or the beta 1-blocking agent metoprolol (Group II). Metoprolol was used since it has been demonstrated that the beneficial effect of beta-blockade in hyperthyroidism is mainly due to beta 1-blockade. The preoperative, intraoperative, and postoperative courses in the two groups were compared, and patients were followed up for 1 year after thyroidectomy. At the time of diagnosis, serum concentration of triiodothyronine (T3) was 6.1 +/- 0.59 nmol/L in Group I and 5.7 +/- 0.66 nmol/L in Group II (reference interval 1.5-3.0 nmol/L). Clinical improvement during preoperative treatment was similar in the two groups of patients, but serum T3 was normalized only in Group I. The median length of preoperative treatment was 12 weeks in Group I and 5 weeks in Group II (p < 0.01). There were no serious adverse effects of the drugs during preoperative preparation in either treatment group. Operating time, consistency and vascularity of the thyroid gland, and intraoperative blood loss were similar in the two groups. No anesthesiologic or cardiovascular complications occurred during operation in either group. One patient in Group I (7%) and three patients in Group II (20%) had clinical signs of hyperthyroid function during the first postoperative day. These symptoms were abolished by the administration of small doses of metoprolol, and no case of thyroid storm occurred. Postoperative hypocalcemia or recurrent laryngeal nerve paralysis did not occur in either group. During the first postoperative year, hypothyroidism developed in two patients in Group I (13%) and in six

  9. Promoting mobility after hip fracture (ProMo: study protocol and selected baseline results of a year-long randomized controlled trial among community-dwelling older people

    Directory of Open Access Journals (Sweden)

    Sipilä Sarianna

    2011-12-01

Background: To cope at home, community-dwelling older people surviving a hip fracture need a sufficient amount of functional ability and mobility. There is a lack of evidence on the best practices supporting recovery after hip fracture. The purpose of this article is to describe the design, intervention, and demographic baseline results of a study investigating the effects of a rehabilitation program aiming to restore mobility and functional capacity among community-dwelling participants after hip fracture. Methods/Design: A population-based sample of community-dwelling men and women over 60 years of age operated on for hip fracture (n = 81, mean age 79 years, 78% women) participated in this study and were randomly allocated to control (Standard Care) and ProMo intervention groups, on average 10 weeks post fracture and 6 weeks after discharge to home. Standard Care included a written home exercise program with 5-7 exercises for the lower limbs. Of all participants, 12 got a referral to physiotherapy. After discharge to home, only 50% adhered to Standard Care. None of the participants were followed up for Standard Care or mobility recovery. The ProMo intervention included Standard Care and a year-long program including evaluation/modification of environmental hazards, guidance for safe walking, pain management, a progressive home exercise program, and physical activity counseling. Measurements included a comprehensive battery of laboratory tests and self-reports on mobility limitation, disability, physical functional capacity, and health, as well as assessments of the key prerequisites for mobility, disability, and functional capacity. All assessments were performed blinded at the research laboratory. No significant differences were observed between intervention and control groups in any of the demographic variables. Discussion: Ten weeks post hip fracture, only half of the participants were compliant with Standard Care. No follow-up for Standard Care or

  10. Randomized random walk on a random walk

    International Nuclear Information System (INIS)

    Lee, P.A.

    1983-06-01

    This paper discusses generalizations of the model introduced by Kehr and Kunter of the random walk of a particle on a one-dimensional chain which in turn has been constructed by a random walk procedure. The superimposed random walk is randomised in time according to the occurrences of a stochastic point process. The probability of finding the particle in a particular position at a certain instant is obtained explicitly in the transform domain. It is found that the asymptotic behaviour for large time of the mean-square displacement of the particle depends critically on the assumed structure of the basic random walk, giving a diffusion-like term for an asymmetric walk or a square root law if the walk is symmetric. Many results are obtained in closed form for the Poisson process case, and these agree with those given previously by Kehr and Kunter. (author)
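The qualitative dichotomy described above, diffusion-like growth of the mean-square displacement for an asymmetric basic walk versus a square-root law for a symmetric one, is easy to check numerically. A Monte Carlo sketch of the symmetric case (ours, not from the paper): the substrate chain is one frozen ±1 random walk per trial, the particle walks symmetrically on the site index, and under the square-root law quadrupling the time should only double the mean-square displacement.

```python
import numpy as np

rng = np.random.default_rng(7)
trials, t_short, t_long = 2000, 250, 1000

# One quenched substrate walk per trial, defined over sites 0..2*t_long.
incr = rng.choice([-1, 1], size=(trials, 2 * t_long))
S = np.concatenate([np.zeros((trials, 1), dtype=np.int64),
                    np.cumsum(incr, axis=1)], axis=1)

# The particle performs a symmetric random walk on the site index,
# starting from the middle site; its physical position is S(site).
steps = rng.choice([-1, 1], size=(trials, t_long))
idx = t_long + np.cumsum(steps, axis=1)
rows = np.arange(trials)
start = S[rows, t_long]

def msd(t):
    """Mean-square physical displacement after t steps, averaged over trials."""
    return float(np.mean((S[rows, idx[:, t - 1]] - start) ** 2))

ratio = msd(t_long) / msd(t_short)   # ~2 for the sqrt law, ~4 for diffusion
print(round(msd(t_short), 1), round(msd(t_long), 1), round(ratio, 2))
```

With these (assumed) parameters the measured ratio comes out near 2, consistent with the square-root law sqrt(2t/π) rather than the diffusive value of 4.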

  11. Selective mutism.

    Science.gov (United States)

    Hua, Alexandra; Major, Nili

    2016-02-01

    Selective mutism is a disorder in which an individual fails to speak in certain social situations though speaks normally in other settings. Most commonly, this disorder initially manifests when children fail to speak in school. Selective mutism results in significant social and academic impairment in those affected by it. This review will summarize the current understanding of selective mutism with regard to diagnosis, epidemiology, cause, prognosis, and treatment. Studies over the past 20 years have consistently demonstrated a strong relationship between selective mutism and anxiety, most notably social phobia. These findings have led to the recent reclassification of selective mutism as an anxiety disorder in the Diagnostic and Statistical Manual of Mental Disorders, 5th Edition. In addition to anxiety, several other factors have been implicated in the development of selective mutism, including communication delays and immigration/bilingualism, adding to the complexity of the disorder. In the past few years, several randomized studies have supported the efficacy of psychosocial interventions based on a graduated exposure to situations requiring verbal communication. Less data are available regarding the use of pharmacologic treatment, though there are some studies that suggest a potential benefit. Selective mutism is a disorder that typically emerges in early childhood and is currently conceptualized as an anxiety disorder. The development of selective mutism appears to result from the interplay of a variety of genetic, temperamental, environmental, and developmental factors. Although little has been published about selective mutism in the general pediatric literature, pediatric clinicians are in a position to play an important role in the early diagnosis and treatment of this debilitating condition.

  12. Selective enhancement of Selényi rings induced by the cross-correlation between the interfaces of a two-dimensional randomly rough dielectric film

    Science.gov (United States)

    Banon, J.-P.; Hetland, Ø. S.; Simonsen, I.

    2018-02-01

    By the use of both perturbative and non-perturbative solutions of the reduced Rayleigh equation, we present a detailed study of the scattering of light from two-dimensional weakly rough dielectric films. It is shown that for several rough film configurations, Selényi interference rings exist in the diffusely scattered light. For film systems supported by dielectric substrates where only one of the two interfaces of the film is weakly rough and the other planar, Selényi interference rings are observed at angular positions that can be determined from simple phase arguments. For such single-rough-interface films, we find and explain by a single scattering model that the contrast in the interference patterns is better when the top interface of the film (the interface facing the incident light) is rough than when the bottom interface is rough. When both film interfaces are rough, Selényi interference rings exist but a potential cross-correlation of the two rough interfaces of the film can be used to selectively enhance some of the interference rings while others are attenuated and might even disappear. This feature may in principle be used in determining the correlation properties of interfaces of films that otherwise would be difficult to access.

  13. Drop-out from cardiovascular magnetic resonance in a randomized controlled trial of ST-elevation myocardial infarction does not cause selection bias on endpoints

    DEFF Research Database (Denmark)

    Laursen, Peter Nørkjær; Holmvang, L.; Kelbæk, H.

    2017-01-01

    Background: The extent of selection bias due to drop-out in clinical trials of ST-elevation myocardial infarction (STEMI) using cardiovascular magnetic resonance (CMR) as surrogate endpoints is unknown. We sought to interrogate the characteristics and prognosis of patients who dropped out before...... years of follow-up were assessed and compared between CMR-drop-outs and CMR-participants using the trial screening log and the Eastern Danish Heart Registry. Results: The drop-out rate from acute CMR was 28% (n = 92). These patients had a significantly worse clinical risk profile upon admission...... as evaluated by the TIMI-risk score (3.7 (± 2.1) vs 4.0 (± 2.6), p = 0.043) and by left ventricular ejection fraction (43 (± 9) vs. 47 (± 10), p = 0.029). CMR drop-outs had a higher incidence of known hypertension (39% vs. 35%, p = 0.043), known diabetes (14% vs. 7%, p = 0.025), known cardiac disease (11% vs...

  14. Twenty-first workshop on geothermal reservoir engineering: Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-01-26

PREFACE The Twenty-First Workshop on Geothermal Reservoir Engineering was held at the Holiday Inn, Palo Alto, on January 22-24, 1996. There were one hundred fifty-five registered participants. Participants came from twenty foreign countries: Argentina, Austria, Canada, Costa Rica, El Salvador, France, Iceland, Indonesia, Italy, Japan, Mexico, The Netherlands, New Zealand, Nicaragua, the Philippines, Romania, Russia, Switzerland, Turkey and the UK. The performance of many geothermal reservoirs outside the United States was described in several of the papers. Professor Roland N. Horne opened the meeting and welcomed visitors. The keynote speaker was Marshall Reed, who gave a brief overview of the Department of Energy's current plan. Sixty-six papers were presented in the technical sessions of the workshop. Technical papers were organized into twenty sessions concerning: reservoir assessment, modeling, geology/geochemistry, fracture modeling, hot dry rock, geoscience, low enthalpy, injection, well testing, drilling, adsorption and stimulation. Session chairmen were major contributors to the workshop, and we thank: Ben Barker, Bobbie Bishop-Gollan, Tom Box, Jim Combs, John Counsil, Sabodh Garg, Malcolm Grant, Marcelo Lippmann, Jim Lovekin, John Pritchett, Marshall Reed, Joel Renner, Subir Sanyal, Mike Shook, Alfred Truesdell and Ken Williamson. Jim Lovekin gave the post-dinner speech at the banquet and highlighted the exciting developments in the geothermal field which are taking place worldwide. The Workshop was organized by the Stanford Geothermal Program faculty, staff, and graduate students. We wish to thank our students who operated the audiovisual equipment. Shaun D. Fitzgerald, Program Manager.

  15. Twenty years of energy policy: What should we have learned?

    International Nuclear Information System (INIS)

    Greene, D.L.

    1994-07-01

This report examines the past twenty years of energy market events and energy policies to determine what may be useful for the future. The author focuses on two important lessons that should have been learned but which the author feels have been seriously misunderstood. The first is that oil price shocks were a very big and very real problem for oil importing countries, a problem that has not gone away. The second is that automobile fuel economy regulation has worked, and worked effectively, to reduce oil consumption and the externalities associated with it, and can still work effectively in the future.

  16. Twenty-first-century medical microbiology services in the UK.

    Science.gov (United States)

    Duerden, Brian

    2005-12-01

    With infection once again a high priority for the UK National Health Service (NHS), the medical microbiology and infection-control services require increased technology resources and more multidisciplinary staff. Clinical care and health protection need a coordinated network of microbiology services working to consistent standards, provided locally by NHS Trusts and supported by the regional expertise and national reference laboratories of the new Health Protection Agency. Here, I outline my thoughts on the need for these new resources and the ways in which clinical microbiology services in the UK can best meet the demands of the twenty-first century.

  17. Accelerators for the twenty-first century a review

    CERN Document Server

    Wilson, Edmund J N

    1990-01-01

    The development of the synchrotron, and later the storage ring, was based upon the electrical technology at the turn of this century, aided by the microwave radar techniques of World War II. This method of acceleration seems to have reached its limit. Even superconductivity is not likely to lead to devices that will satisfy physics needs into the twenty-first century. Unless a new principle for accelerating elementary particles is discovered soon, it is difficult to imagine that high-energy physics will continue to reach out to higher energies and luminosities.

  18. Twenty years of energy policy: What should we have learned?

    Energy Technology Data Exchange (ETDEWEB)

    Greene, D.L. [Oak Ridge National Lab., TN (United States). Center for Transportation Analysis

    1994-07-01

    This report examines the past twenty years of energy market events and energy policies to determine what may be useful for the future. The author focuses on two important lessons that should have been learned but which the author feels have been seriously misunderstood. The first is that oil price shocks were a very big and very real problem for oil importing countries, a problem that has not gone away. The second is that automobile fuel economy regulation has worked, and worked effectively, to reduce oil consumption and the externalities associated with it, and can still work effectively in the future.

  19. The design and protocol of heat-sensitive moxibustion for knee osteoarthritis: a multicenter randomized controlled trial on the rules of selecting moxibustion location

    Directory of Open Access Journals (Sweden)

    Chi Zhenhai

    2010-06-01

    Full Text Available Abstract Background Knee osteoarthritis is a major cause of pain and functional limitation. Complementary and alternative medical approaches have been employed to relieve symptoms and to avoid the side effects of conventional medication. Moxibustion has been widely used to treat patients with knee osteoarthritis. Our past research suggested heat-sensitive moxibustion might be superior to conventional moxibustion. Our objective is to investigate the effectiveness of heat-sensitive moxibustion compared with conventional moxibustion or conventional drug treatment. Methods This study consists of a multi-centre (four centers in China), randomised, controlled trial with three parallel arms (A: heat-sensitive moxibustion; B: conventional moxibustion; C: conventional drug group). The moxibustion locations differ between groups A and B. Group A selects a heat-sensitized acupoint from the region consisting of Yin Lingquan (SP9), Yang Lingquan (GB34), Liang Qiu (ST34), and Xue Hai (SP10). Meanwhile, fixed acupoints are used in group B, namely Xi Yan (EX-LE5) and He Ding (EX-LE2). The conventional drug group is treated with intra-articular Sodium Hyaluronate injection. The outcome measures will be assessed before the treatment, 30 days after the last moxibustion session, and 6 months after the last moxibustion session. Discussion This trial will utilize high quality trial methodologies in accordance with CONSORT guidelines. It will provide evidence for the effectiveness of moxibustion as a treatment for moderate and severe knee osteoarthritis. Moreover, the result will clarify the rules of heat-sensitive moxibustion location to improve the therapeutic effect with suspended moxibustion, and propose a new concept and a new theory of moxibustion to guide clinical practices. Trial Registration The trial is registered at Controlled Clinical Trials: ChiCTR-TRC-00000600.
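    Allocation to three parallel arms, as in the trial above, is commonly done with permuted-block randomization so that arm sizes stay balanced. The sketch below is a generic illustration under assumed names and a fixed seed; the protocol paper, not this sketch, defines the trial's actual randomization procedure.

```python
import random

def block_randomize(n_participants, arms, block_size, seed=None):
    """Permuted-block allocation: each block contains every arm equally
    often, so cumulative group sizes never drift far apart."""
    assert block_size % len(arms) == 0, "block must divide evenly among arms"
    rng = random.Random(seed)
    allocation = []
    while len(allocation) < n_participants:
        block = arms * (block_size // len(arms))  # one balanced block
        rng.shuffle(block)                        # random order within block
        allocation.extend(block)
    return allocation[:n_participants]

arms = ["A: heat-sensitive", "B: conventional", "C: drug"]
alloc = block_randomize(12, arms, block_size=6, seed=1)
print(alloc.count(arms[0]), alloc.count(arms[1]), alloc.count(arms[2]))  # → 4 4 4
```

    Because 12 participants fill two complete blocks of 6, each arm receives exactly 4 participants regardless of the seed.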

  20. Strategies for Teaching Maritime Archaeology in the Twenty First Century

    Science.gov (United States)

    Staniforth, Mark

    2008-12-01

    Maritime archaeology is a multi-faceted discipline that requires both theoretical learning and practical skills training. In the past most universities have approached the teaching of maritime archaeology as a full-time on-campus activity designed for ‘traditional’ graduate students; primarily those in their early twenties who have recently come from full-time undergraduate study and who are able to study on-campus. The needs of mature-age and other students who work and live in different places (or countries) and therefore cannot attend lectures on a regular basis (or at all) have largely been ignored. This paper provides a case study in the teaching of maritime archaeology from Australia that, in addition to ‘traditional’ on-campus teaching, includes four main components: (1) learning field methods through field schools; (2) skills training through the AIMA/NAS avocational training program; (3) distance learning topics available through CD-ROM and using the Internet; and (4) practicums, internships and fellowships. The author argues that programs to teach maritime archaeology in the twenty first century need to be flexible and to address the diverse needs of students who do not fit the ‘traditional’ model. This involves collaborative partnerships with other universities as well as government underwater cultural heritage management agencies and museums, primarily through field schools, practicums and internships.

  1. Random Fields

    Science.gov (United States)

    Vanmarcke, Erik

    1983-03-01

    Random variation over space and time is one of the few attributes that might safely be predicted as characterizing almost any given complex system. Random fields or "distributed disorder systems" confront astronomers, physicists, geologists, meteorologists, biologists, and other natural scientists. They appear in the artifacts developed by electrical, mechanical, civil, and other engineers. They even underlie the processes of social and economic change. The purpose of this book is to bring together existing and new methodologies of random field theory and indicate how they can be applied to these diverse areas where a "deterministic treatment is inefficient and conventional statistics insufficient." Many new results and methods are included. After outlining the extent and characteristics of the random field approach, the book reviews the classical theory of multidimensional random processes and introduces basic probability concepts and methods in the random field context. It next gives a concise account of the second-order analysis of homogeneous random fields, in both the space-time domain and the wave number-frequency domain. This is followed by a chapter on spectral moments and related measures of disorder and on level excursions and extremes of Gaussian and related random fields. After developing a new framework of analysis based on local averages of one-, two-, and n-dimensional processes, the book concludes with a chapter discussing ramifications in the important areas of estimation, prediction, and control. The mathematical prerequisite has been held to basic college-level calculus.

  2. Increasing precipitation volatility in twenty-first-century California

    Science.gov (United States)

    Swain, Daniel L.; Langenbrunner, Baird; Neelin, J. David; Hall, Alex

    2018-05-01

    Mediterranean climate regimes are particularly susceptible to rapid shifts between drought and flood, of which California's rapid transition from record multi-year dryness between 2012 and 2016 to extreme wetness during the 2016-2017 winter provides a dramatic example. Projected future changes in such dry-to-wet events, however, remain inadequately quantified, which we investigate here using the Community Earth System Model Large Ensemble of climate model simulations. Anthropogenic forcing is found to yield large twenty-first-century increases in the frequency of wet extremes, including a more than threefold increase in sub-seasonal events comparable to California's 'Great Flood of 1862'. Smaller but statistically robust increases in dry extremes are also apparent. As a consequence, a 25% to 100% increase in extreme dry-to-wet precipitation events is projected, despite only modest changes in mean precipitation. Such hydrological cycle intensification would seriously challenge California's existing water storage, conveyance and flood control infrastructure.

  3. The Turn to Precarity in Twenty-First Century Fiction

    Directory of Open Access Journals (Sweden)

    Morrison Jago

    2014-01-01

    Full Text Available Recent years have seen several attempts by writers and critics to understand the changed sensibility in post-9/11 fiction through a variety of new -isms. This essay explores this cultural shift in a different way, finding a ‘turn to precarity’ in twenty-first century fiction characterised by a renewal of interest in the flow and foreclosure of affect, the resurgence of questions about vulnerability and our relationships to the other, and a heightened awareness of the social dynamics of seeing. The essay draws these tendencies together via the work of Judith Butler in Frames of War, in an analysis of Trezza Azzopardi’s quasi-biographical study of precarious life, Remember Me.

  4. Twenty-First Water Reactor Safety Information Meeting

    International Nuclear Information System (INIS)

    Monteleone, S.

    1994-04-01

    This three-volume report contains 90 papers out of the 102 that were presented at the Twenty-First Water Reactor Safety Information Meeting held at the Bethesda Marriott Hotel, Bethesda, Maryland, during the week of October 25-27, 1993. The papers are printed in the order of their presentation in each session and describe progress and results of programs in nuclear safety research conducted in this country and abroad. Foreign participation in the meeting included papers presented by researchers from France, Germany, Japan, Russia, Switzerland, Taiwan, and the United Kingdom. The titles of the papers and the names of the authors have been updated and may differ from those that appeared in the final program of the meeting. Individual papers have been cataloged separately. This document, Volume 2, presents papers on severe accident research.

  5. The Dialectics of Discrimination in the Twenty-First Century

    Directory of Open Access Journals (Sweden)

    John Stone

    2007-12-01

    Full Text Available This article explores some of the latest developments in the scholarship on race relations and nationalism that seek to address the impact of globalization and the changed geo-political relations of the first decade of the twenty-first century. New patterns of identification, some of which challenge existing group boundaries and others that reinforce them, can be seen to flow from the effects of global market changes and the political counter-movements against them. The impact of the “war on terrorism”, the limits of the utility of hard power, and the need for new mechanisms of inter-racial and inter-ethnic conflict resolution are evaluated to emphasize the complexity of these group relations in the new world disorder.

  6. Proceedings of Twenty-Seventh Annual Institute on Mining Health, Safety and Research

    Energy Technology Data Exchange (ETDEWEB)

    Bockosh, G.R. [ed.] [Pittsburgh Research Center, US Dept. of Energy (United States); Langton, J. [ed.] [Mine Safety and Health Administration, US Dept. of Labor (United States); Karmis, M. [ed.] [Virginia Polytechnic Institute and State University. Dept. of Mining and Minerals Engineering, Blacksburg (United States)

    1996-12-31

    This Proceedings contains the presentations made during the program of the Twenty-Seventh Annual Institute on Mining Health, Safety and Research held at Virginia Polytechnic Institute and State University, Blacksburg, Virginia, on August 26-28, 1996. The Twenty-Seventh Annual Institute on Mining, Health, Safety and Research was the latest in a series of conferences held at Virginia Polytechnic Institute and State University, cosponsored by the Mine Safety and Health Administration, United States Department of Labor, and the Pittsburgh Research Center, United States Department of Energy (formerly part of the Bureau of Mines, U. S. Department of Interior). The Institute provides an information forum for mine operators, managers, superintendents, safety directors, engineers, inspectors, researchers, teachers, state agency officials, and others with a responsible interest in the important field of mining health, safety and research. In particular, the Institute is designed to help mine operating personnel gain a broader knowledge and understanding of the various aspects of mining health and safety, and to present them with methods of control and solutions developed through research. Selected papers have been processed separately for inclusion in the Energy Science and Technology database.

  7. Diurnal Variation and Twenty-Four Hour Sleep Deprivation Do Not Alter Supine Heart Rate Variability in Healthy Male Young Adults.

    Directory of Open Access Journals (Sweden)

    Daniel S Quintana

    Full Text Available Heart rate variability (HRV) has become an increasingly popular index of cardiac autonomic control in the biobehavioral sciences due to its relationship with mental illness and cognitive traits. However, the intraindividual stability of HRV in response to sleep and diurnal disturbances, which are commonly reported in mental illness, and its relationship with executive function are not well understood. Here, in 40 healthy adult males we calculated high frequency HRV, an index of parasympathetic nervous system (PNS) activity, using pulse oximetry during brain imaging, and assessed attentional and executive function performance in a subsequent behavioral test session at three time points: morning, evening, and the following morning. Twenty participants were randomly selected for total sleep deprivation whereas the other 20 participants slept as normal. Sleep deprivation and morning-to-night variation did not influence high frequency HRV at either a group or individual level; however, sleep deprivation abolished the relationship between orienting attention performance and HRV. We conclude that a day of wake and a night of laboratory-induced sleep deprivation do not alter supine high frequency HRV in young healthy male adults.

  8. United States Military Space: Into the Twenty-First Century

    Science.gov (United States)

    2002-01-01

    famous and articulate spokesmen for planetary science; Pale Blue Dot : A Vision of the Human Future in Space (New York: Random House, 1994) was one...and defining human characteristic. Carl Sagan is a primary spokesman for those who view spaceflight in scientific and ecological terms and see it as...Spacefaring Civilization (New York: Jeremy P. Tarcher/Putnam, 1999). Carl Sagan cofounded the Planetary Society in 1980 and was one of the most

  9. Secure Path Selection under Random Fading

    Directory of Open Access Journals (Sweden)

    Furqan Jameel

    2017-05-01

    Full Text Available Application-oriented Wireless Sensor Networks (WSNs) promise to be one of the most useful technologies of this century. However, secure communication between nodes in WSNs is still an unresolved issue. In this context, we propose two protocols, Optimal Secure Path (OSP) and Sub-optimal Secure Path (SSP), to minimize the outage probability of secrecy capacity in the presence of multiple eavesdroppers. We consider dissimilar fading at the main and wiretap links and provide a detailed evaluation of the impact of Nakagami-m and Rician-K factors on the secrecy performance of WSNs. Extensive simulations are performed to validate our findings. Although the optimal scheme ensures more security, the sub-optimal scheme proves to be a more practical approach to securing wireless links.

  10. Twenty years of RERTR in Russia: Past, present and future

    International Nuclear Information System (INIS)

    Arkhangelsky, N.

    2000-01-01

    The Russian RERTR Program started approximately 20 years ago. The USSR always supported the principles and goals of the policy of nonproliferation, and at the end of the 70's the Soviet Government decided to create such a national program. Twenty years is sufficient time to estimate the preliminary results of the program and its prospects for the future. After the first successes of the program, when the enrichment of uranium in fuel elements for foreign supplies was reduced from 80% to 36%, the realization of the program was suspended in connection with financial difficulties. But in the beginning of the 90's the program received a new impulse connected to the inclusion of Russian scientists and engineers in the international program. The basic directions now in the development of new kinds of fuel are fuel based on uranium dioxide and fuel based on U-Mo alloy. In the future, the management of spent nuclear fuel should be added to the basic goals of the program. The management of HEU at the final stage of the fuel cycle becomes an important objective of the program, since the basic amount of HEU is concentrated in storage of SNF. (author)

  11. Nuclear energy into the twenty-first century

    International Nuclear Information System (INIS)

    Hammond, G.P.

    1996-01-01

    The historical development of the civil nuclear power generation industry is examined in the light of the need to meet conflicting energy-supply and environmental pressures over recent decades. It is suggested that fission (thermal and fast) reactors will dominate the market up to the period 2010-2030, with fusion being relegated to the latter part of the twenty-first century. A number of issues affecting the use of nuclear electricity generation in Western Europe are considered including its cost, industrial strategy needs, and the public acceptability of nuclear power. The contribution of nuclear power stations to achieving CO2 targets aimed at relieving global warming is discussed in the context of alternative strategies for sustainable development, including renewable energy sources and energy-efficiency measures. Trends in the generation of nuclear electricity from fission reactors are finally considered in terms of the main geopolitical groupings that make up the world in the mid-1990s. Several recent, but somewhat conflicting, forecasts of the role of nuclear power in the fuel mix to about 2020 are reviewed. It is argued that the only major expansion in generating capacity will take place on the Asia-Pacific Rim and not in the developing countries generally. Nevertheless, the global nuclear industry overall will continue to be dominated by a small number of large nuclear electricity generating countries; principally the USA, France and Japan. (UK)

  12. Twenty years on: Poverty and hardship in urban Fiji

    Directory of Open Access Journals (Sweden)

    Jenny Bryant-Tokalau

    2012-09-01

    Full Text Available Through ‘official statistics’, academic and donor interpretations as well as the eyes of Suva residents, this paper presents an overview and case study of twenty years of growing poverty and hardship in the contemporary Pacific. Focusing on the past two decades, the paper notes how much, and yet so little, has changed for those attempting to make a living in the rapidly developing towns and cities. Changing interpretations of poverty and hardship are presented, moving from the ‘no such thing’ view, to simplification, and finally to an understanding that Pacific island countries, especially Fiji, are no longer an ‘extension’ of Australia and New Zealand, but independent nations actively trying to find solutions to their issues of economic, social and political hardship whilst facing challenges to traditional institutions and networks. Fiji is in some respects a very particular case as almost half of the population has limited access to secure land, but the very nature of that vulnerability to hardship and poverty holds useful lessons for wider analysis.

  13. Twenty-second Fungal Genetics Conference - Asilomar, 2003

    Energy Technology Data Exchange (ETDEWEB)

    Jonathan D. Walton

    2003-06-30

    The purpose of the Twenty Second Fungal Genetics Conference is to bring together scientists and students who are interested in genetic approaches to studying the biology of filamentous fungi. It is intended to stimulate thinking and discussion in an atmosphere that supports interactions between scientists at different levels and in different disciplines. Topics range from the basic to the applied. Filamentous fungi impact human affairs in many ways. In the environment they are the most important agents of decay and nutrient turnover. They are used extensively in the food industry for the production of food enzymes such as pectinase and food additives such as citric acid. They are used in the production of fermented foods such as alcoholic drinks, bread, cheese, and soy sauce. More than a dozen species of mushrooms are used as foods directly. Many of our most important antibiotics, such as penicillin, cyclosporin, and lovastatin, come from fungi. Fungi also have many negative impacts on human health and economics. Fungi are serious pathogens in immuno-compromised patients. Fungi are the single largest group of plant pathogens and thus a serious limit on crop productivity throughout the world. Many fungi are allergenic, and mold contamination of residences and commercial buildings is now recognized as a serious public health threat. As decomposers, fungi cause extensive damage to just about all natural and synthetic materials.

  14. The Antigerminative Activity of Twenty-Seven Monoterpenes

    Directory of Open Access Journals (Sweden)

    Laura De Martino

    2010-09-01

    Full Text Available Monoterpenes, the main constituents of essential oils, are known for their many biological activities. The present work studied the potential biological activity of twenty-seven monoterpenes, including monoterpene hydrocarbons and oxygenated ones, against seed germination and subsequent primary radicle growth of Raphanus sativus L. (radish) and Lepidium sativum L. (garden cress), under laboratory conditions. The compounds, belonging to different chemical classes, showed different potency in affecting both parameters evaluated. The assayed compounds demonstrated a good inhibitory activity in a dose-dependent way. In general, radish seed is more sensitive than garden cress and its germination appears more inhibited by alcohols; at the highest concentration tested, the more active substances were geraniol, borneol, (±)-β-citronellol and α-terpineol. Geraniol and carvone inhibited, in a significant way, the germination of garden cress at the highest concentration tested. Radicle elongation of the two test species was inhibited mainly by alcohols and ketones. Carvone inhibited the radicle elongation of both seeds at almost all concentrations assayed, while 1,8-cineole inhibited their radicle elongation at the lowest concentrations (10−5 M and 10−6 M).

  15. Accelerators for the twenty-first century - a review

    International Nuclear Information System (INIS)

    Wilson, E.J.N.

    1990-01-01

    Modern synchrotrons and storage rings are based upon the electrical technology of the 1900s boosted by the microwave radar techniques of World War II. This method of acceleration now seems to be approaching its practical limit. It is high time that we seek a new physical acceleration mechanism to provide the higher energies and luminosities needed to continue particle physics beyond the machines now on the stocks. Twenty years is a short time in which to invent, develop, and construct such a device. Without it, high-energy physics may well come to an end. Particle physicists and astrophysicists are invited to join accelerator specialists in the hunt for this new principle. This report analyses the present limitations of colliders and explores some of the directions in which one might look to find a new principle. Chapters cover proton colliders, electron-positron colliders, linear colliders, and two-beam accelerators; transverse fields, wake-field and beat-wave accelerators, ferroelectric crystals, and acceleration in astrophysics. (orig.)

  16. The twenty-first century challenges to sexuality and religion.

    Science.gov (United States)

    Turner, Yolanda; Stayton, William

    2014-04-01

    Clergy and religious leaders are facing a wide variety of sexual needs and concerns within their faith communities. Conflicts over sexual issues are growing across the entire spectrum of religious denominations, and clerics remain ill prepared to deal with them. As religious communities work to remain influential in public policy debates, clergy and the institutions that train them need to be properly prepared for twenty-first century challenges that impact sexuality and religion. Clergy are often the first point of contact for the sexual problems and concerns of their faith community members: complex issues centered on morals, spirituality, and ethics. Yet there still exists a significant lack of sexual curricula in the programs that are educating our future religious leaders. The resulting paucity of knowledge leaves these leaders unprepared to address the needs and concerns of their congregants. However, with accurate, relevant human sexuality curricula integrated into theological formation programs, future leaders will be equipped to competently serve their constituencies. This paper provides a rationale for the need for such training, an overview of the faith- and theology-based history of a pilot training project, and a description of how the Christian faith and the social sciences intersect in the pilot project's impetus and process.

  17. Twenty years of environmental opposition in the electric sector

    International Nuclear Information System (INIS)

    Molocchi, Andrea

    1997-01-01

    This article aims to provide a framework for analysing social opposition in Italy against the construction and management of electric power plants (nuclear and thermoelectric) and big electricity power lines over the past twenty years. First, the author provides a history of social environmental opposition in the electric sector. This is followed by a typology of reasons for opposition in terms of risk perception, which has been applied to about forty cases of social opposition against electric plants. This study uses an original experimental methodology which could also yield useful results when applied to other complex social phenomena. In the third phase of the study the author analyses the various roles of the social and institutional actors involved in the opposition, and the obstacles to future consensus building. The most interesting result of the study is the not only social but also political nature of the opposition. This factor necessitates integration of the traditional individual risk perception approach with an approach which analyses the political and social action of NGOs.

  18. [Scrotal temperature in 258 healthy men, randomly selected from a population of men aged 18 to 23 years old. Statistical analysis, epidemiologic observations, and measurement of the testicular diameters].

    Science.gov (United States)

    Valeri, A; Mianné, D; Merouze, F; Bujan, L; Altobelli, A; Masson, J

    1993-06-01

    Scrotal hyperthermia can induce certain alterations in spermatogenesis. The basal scrotal temperature used to define hyperthermia is usually 33 degrees C. However, no study conducted according to a strict methodology has validated this mean measurement. We therefore randomly selected 258 men between the ages of 18 and 23 years from a population of 2,000 young French men seen at the National Service Selection Centre in order to measure the scrotal temperature over each testis and in the median raphe, and to determine the mean and median values for these temperatures. For a mean room temperature of 23 +/- 0.5 degrees C with a range of 18 to 31 degrees C, the mean right and left scrotal temperature was 34.2 +/- 0.1 degree C and the mean medioscrotal temperature was 34.4 +/- 0.1 degree C. Scrotal temperature was very significantly correlated to room temperature and its variations. It was therefore impossible to define a normal value for scrotal temperature. Only measurement of scrotal temperature at neutral room temperature, between 21 and 25 degrees C, is able to provide a reference value for scrotal temperature. In this study, the mean scrotal temperature under these conditions was 34.4 +/- 0.2 degree C, i.e. 2.5 degrees C less than body temperature. In the 12.9% of cases with left varicocele, left scrotal temperature was significantly higher than in the absence of varicocele and was also higher than right scrotal temperature. The authors also determined the dimensions of the testes. (ABSTRACT TRUNCATED AT 250 WORDS)

  19. Genomic relations among 31 species of Mammillaria haworth (Cactaceae) using random amplified polymorphic DNA.

    Science.gov (United States)

    Mattagajasingh, Ilwola; Mukherjee, Arup Kumar; Das, Premananda

    2006-01-01

    Thirty-one species of Mammillaria were selected to study their molecular phylogeny using random amplified polymorphic DNA (RAPD) markers. The high amount of mucilage (gelling polysaccharides) present in Mammillaria was a major obstacle to isolating good quality genomic DNA. The CTAB (cetyl trimethyl ammonium bromide) method was modified to obtain good quality genomic DNA. Twenty-two random decamer primers resulted in 621 bands, all of which were polymorphic. The similarity matrix value varied from 0.109 to 0.622, indicating wide variability among the studied species. The dendrogram obtained from the unweighted pair group method using arithmetic averages (UPGMA) analysis revealed that some of the species did not follow the conventional classification. The present work shows the usefulness of RAPD markers for genetic characterization to establish phylogenetic relations among Mammillaria species.
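    The UPGMA dendrogram step mentioned above can be illustrated with a minimal average-linkage clustering sketch. The four taxa and distance values below (distance taken as 1 minus similarity) are invented for illustration; the study's actual matrix covers 31 species with similarities from 0.109 to 0.622.

```python
def upgma(names, dist):
    """Minimal UPGMA (average-linkage) clustering on a symmetric
    distance matrix, returning a nested-tuple tree."""
    clusters = [(n, 1) for n in names]  # (subtree, leaf count)
    d = [row[:] for row in dist]
    while len(clusters) > 1:
        # find the closest pair of clusters
        i, j = min(((a, b) for a in range(len(clusters))
                    for b in range(a + 1, len(clusters))),
                   key=lambda p: d[p[0]][p[1]])
        (ti, si), (tj, sj) = clusters[i], clusters[j]
        keep = [k for k in range(len(clusters)) if k not in (i, j)]
        # size-weighted average distance from the merged cluster to the rest
        merged_row = [(si * d[i][k] + sj * d[j][k]) / (si + sj) for k in keep]
        # rebuild the matrix without rows/cols i and j, then add the merge
        d = [[d[a][b] for b in keep] for a in keep]
        for row, v in zip(d, merged_row):
            row.append(v)
        d.append(merged_row + [0.0])
        clusters = [clusters[k] for k in keep] + [((ti, tj), si + sj)]
    return clusters[0][0]

# Hypothetical RAPD-style distances for four taxa: A-B and C-D are close.
names = ["A", "B", "C", "D"]
dist = [[0.00, 0.38, 0.80, 0.85],
        [0.38, 0.00, 0.75, 0.82],
        [0.80, 0.75, 0.00, 0.45],
        [0.85, 0.82, 0.45, 0.00]]
tree = upgma(names, dist)
print(tree)  # → (('A', 'B'), ('C', 'D'))
```

    The size-weighted update is what distinguishes UPGMA from single or complete linkage: the distance from a merged cluster to any other cluster is the mean over all cross-cluster leaf pairs.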

  20. Evaluation of the effect of aromatherapy with Rosa damascena Mill. on postoperative pain intensity in hospitalized children in selected hospitals affiliated to Isfahan University of Medical Sciences in 2013: A randomized clinical trial

    Science.gov (United States)

    Marofi, Maryam; Sirousfard, Motahareh; Moeini, Mahin; Ghanadi, Alireza

    2015-01-01

    Background: Pain is a common complication after surgery. The aim of this study was to evaluate the effect of aromatherapy with Rosa damascena Mill. on postoperative pain in children. Materials and Methods: In a double-blind, placebo-controlled clinical trial, we selected 64 children of 3–6 years of age through convenience sampling and divided them randomly into two groups. Patients in group A were given inhalation aromatherapy with R. damascena Mill., and in group B, the patients were given almond oil as a placebo. Inhalation aromatherapy was used at the time of the subjects' arrival to the ward and then at 3, 6, 9, and 12 h afterward. Common palliative treatments to relieve pain were used in both groups. Thirty minutes after aromatherapy, the postoperative pain in children was evaluated with the Toddler Preschooler Postoperative Pain Scale (TPPPS). Data were statistically analyzed using the Chi-square test, one-way analysis of variance (ANOVA), and repeated measures ANOVA. Results: There was no significant difference in pain scores at the time of the subjects' arrival to the ward (before receiving any aromatherapy or palliative care) between the two groups. After each aromatherapy session and at the end of treatment, the pain score was significantly reduced in the aromatherapy group with R. damascena Mill. compared to the placebo group. Conclusions: According to our results, aromatherapy with R. damascena Mill. can be used for postoperative pain in children, together with other common treatments, without any significant side effects. PMID:25878704

  1. Evaluating the effectiveness of selected community-level interventions on key maternal, child health, and prevention of mother-to-child transmission of HIV outcomes in three countries (the ACCLAIM Project): a study protocol for a randomized controlled trial.

    Science.gov (United States)

    Woelk, Godfrey B; Kieffer, Mary Pat; Walker, Damilola; Mpofu, Daphne; Machekano, Rhoderick

    2016-02-16

    original study design. We purposively selected facilities in the districts/regions, though originally the study clusters were to be randomly selected. Lifelong antiretroviral therapy for all HIV-positive pregnant and lactating women, Option B+, was implemented in the three countries during the study period, with the potential for a differential impact by study arm. Implementation, however, was done rapidly across the districts/regions, so this potential confounding is unlikely. We developed a system of monitoring and documentation of potentially confounding activities or actions, and these data will be incorporated into analyses at the conclusion of the project. Strengths of the study are that it tests multilevel interventions, utilizes program as well as study-specific and individual data, and is conducted under "real conditions", leading to more robust findings. Limitations of the protocol include the lack of a true control arm and inadequate control for the potential effect of Option B+, such as the intensification of messages on the importance of early ANC and male partner testing. ClinicalTrials.gov (study ID: NCT01971710). Protocol version 5, 30 July 2013, registered 13 August 2013.

  2. Using Random Numbers in Science Research Activities.

    Science.gov (United States)

    Schlenker, Richard M.; And Others

    1996-01-01

    Discusses the importance of science process skills and describes ways to select sets of random numbers for selection of subjects for a research study in an unbiased manner. Presents an activity appropriate for grades 5-12. (JRH)

  3. Twenty Common Testing Mistakes for EFL Teachers to Avoid

    Science.gov (United States)

    Henning, Grant

    2012-01-01

    To some extent, good testing procedure, like good language use, can be achieved through avoidance of errors. Almost any language-instruction program requires the preparation and administration of tests, and it is only to the extent that certain common testing mistakes have been avoided that such tests can be said to be worthwhile selection,…

  4. Selective reminding of prospective memory in Multiple Sclerosis.

    Science.gov (United States)

    McKeever, Joshua D; Schultheis, Maria T; Sim, Tiffanie; Goykhman, Jessica; Patrick, Kristina; Ehde, Dawn M; Woods, Steven Paul

    2017-04-19

    Multiple sclerosis (MS) is associated with prospective memory (PM) deficits, which may increase the risk of poor functional/health outcomes such as medication non-adherence. This study examined the potential benefits of selective reminding to enhance PM functioning in persons with MS. Twenty-one participants with MS and 22 healthy adults (HA) underwent a neuropsychological battery including a Selective Reminding PM (SRPM) experimental procedure. Participants were randomly assigned to either: (1) a selective reminding condition in which participants learn (to criterion) eight prospective memory tasks in a Selective Reminding format; or (2) a single trial encoding condition (1T). A significant interaction was demonstrated, with MS participants receiving greater benefit than HAs from the SR procedure in terms of PM performance. Across diagnostic groups, participants in the SR conditions (vs. 1T conditions) demonstrated significantly better PM performance. Individuals with MS were impaired relative to HAs in the 1T condition, but performance was statistically comparable in the SR condition. This preliminary study suggests that selective reminding can be used to enhance PM cue detection and retrieval in MS. The extent to which selective reminding of PM is effective in naturalistic settings and for health-related behaviours in MS remains to be determined.

  5. Twenty years of operation of Ljubljana's TRIGA Mark II reactor

    International Nuclear Information System (INIS)

    Dimic, V.

    1986-01-01

    Twenty years have now passed since the start of the TRIGA Mark II reactor in Ljubljana. The reactor went critical on May 31, 1966. The total energy produced until the end of May 1986 was 14,048 MWh or 585 MWd. For the first 14 years (until 1981) the yearly energy produced was about 600 MWh; since 1981 it has been about 1000 MWh, when routine production of radioactive isotopes started for medical use as well as for other industrial applications, such as doping and irradiation of silicon monocrystals with fast neutrons, production of level indicators (irradiated cobalt wire), production of radioactive iridium for gamma-radiography, leak detection in pipes using sodium, etc. Besides these, applied research around the reactor is being conducted in the following main fields, where many unique methods have been developed or have found their way into local industry or hospitals: neutron radiography, neutron-induced autoradiography using solid state nuclear track detectors, nondestructive methods for assessment of nuclear burn-up, neutron dosimetry, and calculation of core burn-up for an optimal in-core fuel management strategy. The solvent extraction method was developed for the everyday production of 99mTc, which is the most widely used radionuclide in diagnostic nuclear medicine. Methods were also developed for the production of the following isotopes: 18F, 85mKr, 24Na, 82Br, 64Zn, 125I. Neutron activation analysis represents one of the major uses of the TRIGA reactor. Basic research is being conducted in the following main fields: solid state physics (elastic and inelastic scattering of neutrons), neutron dosimetry, neutron radiography, reactor physics, and neutron activation analysis. The reactor is used very extensively as the main instrument of the Reactor Training Centre in Ljubljana, where manpower training for our nuclear power plant and other organisations has been performed.
Although the reactor was designed very carefully in order to be used for

  6. The deep, hot biosphere: Twenty-five years of retrospection.

    Science.gov (United States)

    Colman, Daniel R; Poudel, Saroj; Stamps, Blake W; Boyd, Eric S; Spear, John R

    2017-07-03

    Twenty-five years ago this month, Thomas Gold published a seminal manuscript suggesting the presence of a "deep, hot biosphere" in the Earth's crust. Since this publication, a considerable amount of attention has been given to the study of deep biospheres, their role in geochemical cycles, and their potential to inform on the origin of life and its potential outside of Earth. Overwhelming evidence now supports the presence of a deep biosphere ubiquitously distributed on Earth in both terrestrial and marine settings. Furthermore, it has become apparent that much of this life is dependent on lithogenically sourced high-energy compounds to sustain productivity. A vast diversity of uncultivated microorganisms has been detected in subsurface environments, and we show that H2, CH4, and CO feature prominently in many of their predicted metabolisms. Despite 25 years of intense study, key questions remain on life in the deep subsurface, including whether it is endemic and the extent of its involvement in the anaerobic formation and degradation of hydrocarbons. Emergent data from cultivation and next-generation sequencing approaches continue to provide promising new hints to answer these questions. As Gold suggested, and as has become increasingly evident, to better understand the subsurface is critical to further understanding the Earth, life, the evolution of life, and the potential for life elsewhere. To this end, we suggest the need to develop a robust network of interdisciplinary scientists and accessible field sites for long-term monitoring of the Earth's subsurface in the form of a deep subsurface microbiome initiative.

  7. Efficacy and tolerability balance of oxycodone/naloxone and tapentadol in chronic low back pain with a neuropathic component: a blinded end point analysis of randomly selected routine data from 12-week prospective open-label observations.

    Science.gov (United States)

    Ueberall, Michael A; Mueller-Schwefe, Gerhard H H

    2016-01-01

    To evaluate the benefit-risk profile (BRP) of oxycodone/naloxone (OXN) and tapentadol (TAP) in patients with chronic low back pain (cLBP) with a neuropathic component (NC) in routine clinical practice. This was a blinded end point analysis of randomly selected 12-week routine/open-label data of the German Pain Registry on adult patients with cLBP-NC who initiated an index treatment in compliance with the current German prescribing information between 1st January and 31st October 2015 (OXN/TAP, n=128/133). Primary end point was defined as a composite of three efficacy components (≥30% improvement of pain, pain-related disability, and quality of life each at the end of observation vs baseline) and three tolerability components (normal bowel function, absence of either central nervous system side effects, and treatment-emergent adverse event [TEAE]-related treatment discontinuation during the observation period) adopted to reflect BRP assessments under real-life conditions. Demographic as well as baseline and pretreatment characteristics were comparable for the randomly selected data sets of both index groups without any indicators for critical selection biases. Treatment with OXN resulted formally in a BRP noninferior to that of TAP and showed a significantly higher primary end point response vs TAP (39.8% vs 25.6%, odds ratio: 1.93; P = 0.014), due to superior analgesic effects. Between-group differences increased with stricter response definitions for all three efficacy components in favor of OXN: ≥30%/≥50%/≥70% response rates for OXN vs TAP were seen for pain intensity in 85.2%/67.2%/39.1% vs 83.5%/54.1%/15.8% (P = ns/0.031/<0.001), for pain-related disability in 78.1%/64.8%/43.8% vs 66.9%/50.4%/24.8% (P = 0.043/0.018/0.001), and for quality of life in 76.6%/68.0%/50.0% vs 63.9%/54.1%/34.6% (P = 0.026/0.022/0.017). Overall, OXN vs TAP treatments were well tolerated, and proportions of patients who either maintained a normal bowel function (68.0% vs 72
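The reported primary end point odds ratio can be approximately reproduced from the published response rates. The responder counts below (51 of 128 for OXN, 34 of 133 for TAP) are inferred from the reported 39.8% and 25.6%, so this is a plausibility check rather than the trial's exact computation:

```python
# Plausibility check of the reported primary end point odds ratio (1.93).
# The responder counts are inferred from the reported response rates
# (39.8% of n=128 OXN -> 51 responders, 25.6% of n=133 TAP -> 34 responders).
def odds_ratio(events_a, n_a, events_b, n_b):
    # OR = (events_a / non_events_a) / (events_b / non_events_b)
    return (events_a * (n_b - events_b)) / (events_b * (n_a - events_a))

print(round(odds_ratio(51, 128, 34, 133), 2))  # → 1.93
```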

  8. Goal selection versus process control while learning to use a brain-computer interface

    Science.gov (United States)

    Royer, Audrey S.; Rose, Minn L.; He, Bin

    2011-06-01

    A brain-computer interface (BCI) can be used to accomplish a task without requiring motor output. Two major control strategies used by BCIs during task completion are process control and goal selection. In process control, the user exerts continuous control and independently executes the given task. In goal selection, the user communicates their goal to the BCI and then receives assistance executing the task. A previous study has shown that goal selection is more accurate and faster in use. An unanswered question is, which control strategy is easier to learn? This study directly compares goal selection and process control while learning to use a sensorimotor rhythm-based BCI. Twenty young healthy human subjects were randomly assigned either to a goal selection or a process control-based paradigm for eight sessions. At the end of the study, the best user from each paradigm completed two additional sessions using all paradigms randomly mixed. The results of this study were that goal selection required a shorter training period for increased speed, accuracy, and information transfer over process control. These results held for the best subjects as well as in the general subject population. The demonstrated characteristics of goal selection make it a promising option to increase the utility of BCIs intended for both disabled and able-bodied users.

  9. Random magnetism

    International Nuclear Information System (INIS)

    Tsallis, C.

    1980-03-01

    The 'ingredients' which control a phase transition in well defined systems as well as in random ones (e.g. random magnetic systems) are listed and discussed within a somewhat unifying perspective. Among these 'ingredients' we find the couplings and elements responsible for the cooperative phenomenon, the topological connectivity as well as possible topological incompatibilities, the influence of new degrees of freedom, the order parameter dimensionality, the ground state degeneracy and finally the 'quanticity' of the system. The general trends, though illustrated in magnetic systems, essentially hold for all phase transitions, and give a basis for connection of this area with Field theory, Theory of dynamical systems, etc. (Author) [pt

  10. Random magnetism

    International Nuclear Information System (INIS)

    Tsallis, C.

    1981-01-01

    The 'ingredients' which control a phase transition in well defined systems as well as in random ones (e.g. random magnetic systems) are listed and discussed within a somewhat unifying perspective. Among these 'ingredients' the couplings and elements responsible for the cooperative phenomenon, the topological connectivity as well as possible topological incompatibilities, the influence of new degrees of freedom, the order parameter dimensionality, the ground state degeneracy and finally the 'quanticity' of the system are found. The general trends, though illustrated in magnetic systems, essentially hold for all phase transitions, and give a basis for connection of this area with Field theory, Theory of dynamical systems, etc. (Author) [pt

  11. Implementing multifactorial psychotherapy research in online virtual environments (IMPROVE-2): study protocol for a phase III trial of the MOST randomized component selection method for internet cognitive-behavioural therapy for depression.

    Science.gov (United States)

    Watkins, Edward; Newbold, Alexandra; Tester-Jones, Michelle; Javaid, Mahmood; Cadman, Jennifer; Collins, Linda M; Graham, John; Mostazir, Mohammod

    2016-10-06

    Depression is a global health challenge. Although there are effective psychological and pharmaceutical interventions, our best treatments achieve remission rates less than 1/3 and limited sustained recovery. Underpinning this efficacy gap is limited understanding of how complex psychological interventions for depression work. Recent reviews have argued that the active ingredients of therapy need to be identified so that therapy can be made briefer, more potent, and to improve scalability. This in turn requires the use of rigorous study designs that test the presence or absence of individual therapeutic elements, rather than standard comparative randomised controlled trials. One such approach is the Multiphase Optimization Strategy, which uses efficient experimentation such as factorial designs to identify active factors in complex interventions. This approach has been successfully applied to behavioural health but not yet to mental health interventions. A Phase III randomised, single-blind balanced fractional factorial trial, based in England and conducted on the internet, randomized at the level of the patient, will investigate the active ingredients of internet cognitive-behavioural therapy (CBT) for depression. Adults with depression (operationalized as PHQ-9 score ≥ 10), recruited directly from the internet and from a UK National Health Service Improving Access to Psychological Therapies service, will be randomized across seven experimental factors, each reflecting the presence versus absence of specific treatment components (activity scheduling, functional analysis, thought challenging, relaxation, concreteness training, absorption, self-compassion training) using a 32-condition balanced fractional factorial design (2IV^(7-2)). The primary outcome is symptoms of depression (PHQ-9) at 12 weeks. Secondary outcomes include symptoms of anxiety and process measures related to hypothesized mechanisms. Better understanding of the active ingredients of
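A 32-condition balanced fractional factorial design of the kind described (a 2^(7-2) design) can be sketched by enumerating five base factors fully and deriving the remaining two from generators. The generators below (F = ABCD, G = ABDE) are a standard resolution-IV choice used purely for illustration, not necessarily the aliasing structure of the IMPROVE-2 trial:

```python
from itertools import product

# Sketch of a 2^(7-2) fractional factorial: five base factors (A..E) are
# enumerated fully (32 runs) and the remaining two factors are defined by
# generators. F = ABCD and G = ABDE give resolution IV; this choice is an
# illustrative assumption, not taken from the trial protocol.
def fractional_factorial_2_7_2():
    runs = []
    for a, b, c, d, e in product((-1, 1), repeat=5):
        f = a * b * c * d          # generator F = ABCD
        g = a * b * d * e          # generator G = ABDE
        runs.append((a, b, c, d, e, f, g))
    return runs

design = fractional_factorial_2_7_2()
assert len(design) == 32                           # 32 conditions
assert all(sum(col) == 0 for col in zip(*design))  # each factor is balanced
```

Each of the seven columns is balanced: every component is present (+1) in exactly 16 of the 32 conditions, which is what lets main effects be estimated efficiently from so few cells.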

  12. Implementing multifactorial psychotherapy research in online virtual environments (IMPROVE-2): study protocol for a phase III trial of the MOST randomized component selection method for internet cognitive-behavioural therapy for depression

    Directory of Open Access Journals (Sweden)

    Edward Watkins

    2016-10-01

    Full Text Available Abstract Background Depression is a global health challenge. Although there are effective psychological and pharmaceutical interventions, our best treatments achieve remission rates less than 1/3 and limited sustained recovery. Underpinning this efficacy gap is limited understanding of how complex psychological interventions for depression work. Recent reviews have argued that the active ingredients of therapy need to be identified so that therapy can be made briefer, more potent, and to improve scalability. This in turn requires the use of rigorous study designs that test the presence or absence of individual therapeutic elements, rather than standard comparative randomised controlled trials. One such approach is the Multiphase Optimization Strategy, which uses efficient experimentation such as factorial designs to identify active factors in complex interventions. This approach has been successfully applied to behavioural health but not yet to mental health interventions. Methods/Design A Phase III randomised, single-blind balanced fractional factorial trial, based in England and conducted on the internet, randomized at the level of the patient, will investigate the active ingredients of internet cognitive-behavioural therapy (CBT) for depression. Adults with depression (operationalized as PHQ-9 score ≥ 10), recruited directly from the internet and from a UK National Health Service Improving Access to Psychological Therapies service, will be randomized across seven experimental factors, each reflecting the presence versus absence of specific treatment components (activity scheduling, functional analysis, thought challenging, relaxation, concreteness training, absorption, self-compassion training) using a 32-condition balanced fractional factorial design (2IV^(7-2)). The primary outcome is symptoms of depression (PHQ-9) at 12 weeks. Secondary outcomes include symptoms of anxiety and process measures related to hypothesized mechanisms

  13. A protocol for a three-arm cluster randomized controlled superiority trial investigating the effects of two pedagogical methodologies in Swedish preschool settings on language and communication, executive functions, auditive selective attention, socioemotional skills and early maths skills.

    Science.gov (United States)

    Gerholm, Tove; Hörberg, Thomas; Tonér, Signe; Kallioinen, Petter; Frankenberg, Sofia; Kjällander, Susanne; Palmer, Anna; Taguchi, Hillevi Lenz

    2018-06-19

    During the preschool years, children develop abilities and skills in areas crucial for later success in life. These abilities include language, executive functions, attention, and socioemotional skills. The pedagogical methods used in preschools hold the potential to enhance these abilities, but our knowledge of which pedagogical practices aid which abilities, and for which children, is limited. The aim of this paper is to describe an intervention study designed to evaluate and compare two pedagogical methodologies in terms of their effect on the above-mentioned skills in Swedish preschool children. The study is a randomized controlled trial (RCT) in which two pedagogical methodologies were tested to evaluate how they enhanced children's language, executive functions and attention, socioemotional skills, and early maths skills during an intensive 6-week intervention. Eighteen preschools including 28 units and 432 children were enrolled in a municipality close to Stockholm, Sweden. The children were between 4;0 and 6;0 years old and each preschool unit was randomly assigned to either of the interventions or to the control group. Background information on all children was collected via questionnaires completed by parents and preschools. Pre- and post-intervention testing consisted of a test battery including tests on language, executive functions, selective auditive attention, socioemotional skills and early maths skills. The interventions consisted of 6 weeks of intensive practice of either a socioemotional and material learning paradigm (SEMLA), for which group-based activities and interactional structures were the main focus, or an individual, digitally implemented attention and math training paradigm, which also included a set of self-regulation practices (DIL). All preschools were evaluated with the ECERS-3. If this intervention study shows evidence of a difference between group-based learning paradigms and individual training of specific skills in terms of
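Cluster-level randomization of the kind described, with 28 preschool units allocated across the two intervention arms and a control group, can be sketched as follows. Balancing by shuffling a near-equal list of arm labels is an illustrative assumption; the trial's actual allocation procedure may differ:

```python
import random

# Illustrative cluster-level allocation: 28 preschool units randomly assigned
# across the SEMLA arm, the DIL arm, and the control group. The unit names,
# seed, and balanced-shuffle scheme are assumptions for this sketch.
def assign_clusters(units, arms, seed=None):
    rng = random.Random(seed)
    labels = [arms[i % len(arms)] for i in range(len(units))]
    rng.shuffle(labels)               # random permutation of a balanced list
    return dict(zip(units, labels))

allocation = assign_clusters([f"unit{i:02d}" for i in range(1, 29)],
                             ["SEMLA", "DIL", "control"], seed=1)
```

Because the label list is built with near-equal counts before shuffling, the arms end up with 10, 9, and 9 units regardless of the seed, while each unit's assignment remains random.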

  14. Random Decrement

    DEFF Research Database (Denmark)

    Asmussen, J.C.; Ibrahim, S.R.; Brincker, Rune

    Abstract This paper demonstrates how to use the Random Decrement (RD) technique for identification of linear structures subjected to ambient excitation. The theory behind the technique will be presented and guidelines on how to choose the different variables will be given. This is done by introducing...

  15. Random Decrement

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Ibrahim, S. R.; Brincker, Rune

    This paper demonstrates how to use the Random Decrement (RD) technique for identification of linear structures subjected to ambient excitation. The theory behind the technique will be presented and guidelines how to choose the different variables will be given. This is done by introducing a new...

  16. Random Decrement

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Ibrahim, R.; Brincker, Rune

    1998-01-01

    This paper demonstrates how to use the Random Decrement (RD) technique for identification of linear structures subjected to ambient excitation. The theory behind the technique will be presented and guidelines how to choose the different variables will be given. This is done by introducing a new...

  17. Random dynamics

    International Nuclear Information System (INIS)

    Bennett, D.L.; Brene, N.; Nielsen, H.B.

    1986-06-01

    The goal of random dynamics is the derivation of the laws of Nature as we know them (standard model) from inessential assumptions. The inessential assumptions made here are expressed as sets of general models at extremely high energies: gauge glass and spacetime foam. Both sets of models lead tentatively to the standard model. (orig.)

  18. Random dynamics

    International Nuclear Information System (INIS)

    Bennett, D.L.

    1987-01-01

    The goal of random dynamics is the derivation of the laws of Nature as we know them (standard model) from inessential assumptions. The inessential assumptions made here are expressed as sets of general models at extremely high energies: Gauge glass and spacetime foam. Both sets of models lead tentatively to the standard model. (orig.)

  19. Random Dynamics

    Science.gov (United States)

    Bennett, D. L.; Brene, N.; Nielsen, H. B.

    1987-01-01

    The goal of random dynamics is the derivation of the laws of Nature as we know them (standard model) from inessential assumptions. The inessential assumptions made here are expressed as sets of general models at extremely high energies: gauge glass and spacetime foam. Both sets of models lead tentatively to the standard model.

  20. Twenty-Five Year Site Plan FY2013 - FY2037

    Energy Technology Data Exchange (ETDEWEB)

    Jones, William H. [Los Alamos National Laboratory

    2012-07-12

    Los Alamos National Laboratory (the Laboratory) is the nation's premier national security science laboratory. Its mission is to develop and apply science and technology to ensure the safety, security, and reliability of the United States (U.S.) nuclear stockpile; reduce the threat of weapons of mass destruction, proliferation, and terrorism; and solve national problems in defense, energy, and the environment. The fiscal year (FY) 2013-2037 Twenty-Five Year Site Plan (TYSP) is a vital component for planning to meet the National Nuclear Security Administration (NNSA) commitment to ensure the U.S. has a safe, secure, and reliable nuclear deterrent. The Laboratory also uses the TYSP as an integrated planning tool to guide development of an efficient and responsive infrastructure that effectively supports the Laboratory's missions and workforce. Emphasizing the Laboratory's core capabilities, this TYSP reflects the Laboratory's role as a prominent contributor to NNSA missions through its programs and campaigns. The Laboratory is aligned with Nuclear Security Enterprise (NSE) modernization activities outlined in the NNSA Strategic Plan (May 2011) which include: (1) ensuring laboratory plutonium space effectively supports pit manufacturing and enterprise-wide special nuclear materials consolidation; (2) constructing the Chemistry and Metallurgy Research Replacement Nuclear Facility (CMRR-NF); (3) establishing shared user facilities to more cost effectively manage high-value, experimental, computational and production capabilities; and (4) modernizing enduring facilities while reducing the excess facility footprint. This TYSP is viewed by the Laboratory as a vital planning tool to develop an efficient and responsive infrastructure. Long range facility and infrastructure development planning are critical to assure sustainment and modernization. Out-year re-investment is essential for sustaining existing facilities, and will be re-evaluated on an annual

  1. Understanding Contamination; Twenty Years of Simulating Radiological Contamination

    Energy Technology Data Exchange (ETDEWEB)

    Emily Snyder; John Drake; Ryan James

    2012-02-01

    A wide variety of simulated contamination methods have been developed by researchers to reproducibly test radiological decontamination methods. Some twenty years ago a method of non-radioactive contamination simulation was proposed at the Idaho National Laboratory (INL) that mimicked the character of radioactive cesium and zirconium contamination on stainless steel. It involved baking the contamination into the surface of the stainless steel in order to 'fix' it into a tenacious, tightly bound oxide layer. This type of contamination was particularly applicable to nuclear processing facilities (and nuclear reactors) where oxide growth and exchange of radioactive materials within the oxide layer became the predominant model for material/contaminant interaction. Additional simulation methods and their empirically derived basis (from a nuclear fuel reprocessing facility) are discussed. In the last ten years the INL, working with the Defense Advanced Research Projects Agency (DARPA) and the National Homeland Security Research Center (NHSRC), has continued to develop contamination simulation methodologies. The most notable of these newer methodologies was developed to compare the efficacy of different decontamination technologies against radiological dispersal device (RDD, 'dirty bomb') type of contamination. There are many different scenarios for how RDD contamination may be spread, but the most commonly used one at the INL involves the dispersal of an aqueous solution containing radioactive Cs-137. This method was chosen during the DARPA projects and has continued through the NHSRC series of decontamination trials and also gives a tenacious 'fixed' contamination. Much has been learned about the interaction of cesium contamination with building materials, particularly concrete, throughout these tests. The effects of porosity, cation-exchange capacity of the material and the amount of dirt and debris on the surface are very important factors

  2. Direct random insertion mutagenesis of Helicobacter pylori.

    NARCIS (Netherlands)

    Jonge, de R.; Bakker, D.; Vliet, van AH; Kuipers, E.J.; Vandenbroucke-Grauls, C.M.J.E.; Kusters, J.G.

    2003-01-01

    Random insertion mutagenesis is a widely used technique for the identification of bacterial virulence genes. Most strategies for random mutagenesis involve cloning in Escherichia coli for passage of plasmids or for phenotypic selection. This can result in biased selection due to restriction or

  3. Direct random insertion mutagenesis of Helicobacter pylori

    NARCIS (Netherlands)

    de Jonge, Ramon; Bakker, Dennis; van Vliet, Arnoud H. M.; Kuipers, Ernst J.; Vandenbroucke-Grauls, Christina M. J. E.; Kusters, Johannes G.

    2003-01-01

    Random insertion mutagenesis is a widely used technique for the identification of bacterial virulence genes. Most strategies for random mutagenesis involve cloning in Escherichia coli for passage of plasmids or for phenotypic selection. This can result in biased selection due to restriction or

  4. Lack of efficacy of resveratrol on C-reactive protein and selected cardiovascular risk factors--Results from a systematic review and meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Sahebkar, Amirhossein; Serban, Corina; Ursoniu, Sorin; Wong, Nathan D; Muntner, Paul; Graham, Ian M; Mikhailidis, Dimitri P; Rizzo, Manfredi; Rysz, Jacek; Sperling, Laurence S; Lip, Gregory Y H; Banach, Maciej

    2015-01-01

    Numerous studies have suggested that oral supplementation with resveratrol exerts cardioprotective effects, but evidence of the effects on C-reactive protein (CRP) plasma levels and other cardiovascular (CV) risk factors is inconclusive. Therefore, we performed a meta-analysis to evaluate the efficacy of resveratrol supplementation on plasma CRP concentrations and selected predictors of CV risk. The search included PUBMED, Cochrane Library, Web of Science, Scopus, and EMBASE (up to August 31, 2014) to identify RCTs investigating the effects of resveratrol supplementation on selected CV risk factors. Quantitative data synthesis was performed using a random-effects model, with weighted mean difference (WMD) and 95% confidence intervals (CI) as summary statistics. Meta-analysis of data from 10 RCTs (11 treatment arms) did not support a significant effect of resveratrol supplementation in altering plasma CRP concentrations (WMD: -0.144 mg/L, 95% CI: -0.968-0.680, p = 0.731). Resveratrol supplementation was not found to alter plasma levels of total cholesterol (WMD: 1.49 mg/dL, 95% CI: -14.96-17.93, p = 0.859), low density lipoprotein cholesterol (WMD: -0.31 mg/dL, 95% CI: -9.57-8.95, p = 0.948), triglycerides (WMD: 2.67 mg/dL, 95% CI: -28.34-33.67, p = 0.866), and glucose (WMD: 1.28 mg/dL, 95% CI: -5.28-7.84, p = 0.703). It also slightly reduced high density lipoprotein cholesterol concentrations (WMD: -4.18 mg/dL, 95% CI: -6.54 to -1.82, p = 0.001). Likewise, no significant effect was observed on systolic (WMD: 0.82 mmHg, 95% CI: -8.86-10.50, p = 0.868) and diastolic blood pressure (WMD: 1.72 mmHg, 95% CI: -6.29-9.73, p = 0.674). This meta-analysis of available RCTs does not suggest any benefit of resveratrol supplementation on CV risk factors. Larger, well-designed trials are necessary to confirm these results. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
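The random-effects pooling used for the WMD summaries above is commonly computed with the DerSimonian–Laird estimator: fixed-effect weights give a heterogeneity statistic Q, from which a between-study variance tau² is estimated and folded back into the weights. A sketch with illustrative placeholder numbers, not the trial data:

```python
import math

# DerSimonian–Laird random-effects pooling of weighted mean differences
# (WMDs). The study estimates and standard errors passed in at the bottom
# are made-up values for illustration only.
def dersimonian_laird(estimates, ses):
    w = [1.0 / se**2 for se in ses]                    # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * yi for wi, yi in zip(w, estimates)) / sw
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, estimates))
    c = sw - sum(wi**2 for wi in w) / sw
    tau2 = max(0.0, (q - (len(estimates) - 1)) / c)    # between-study variance
    w_re = [1.0 / (se**2 + tau2) for se in ses]        # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, estimates)) / sum(w_re)
    se_p = math.sqrt(1.0 / sum(w_re))
    return pooled, (pooled - 1.96 * se_p, pooled + 1.96 * se_p)

# Illustrative call with three hypothetical study WMDs and standard errors.
wmd, ci = dersimonian_laird([-0.3, 0.1, -0.2], [0.4, 0.5, 0.3])
```

When the studies are homogeneous (Q below its degrees of freedom), tau² truncates to zero and the result coincides with the fixed-effect inverse-variance pool.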

  5. Validity aspects in Chernobyl at twenty years of the accident

    International Nuclear Information System (INIS)

    Arredondo, C.

    2006-01-01

    inventory of rare gases of the core. The consequences of the accident have been studied over the twenty years that have elapsed since it happened. This work comments on the most recent findings reported on the health, environmental, and economic effects, as well as the current progress toward solving the problems with the sarcophagus. Other seldom-mentioned aspects that can be considered consequences of the accident are also discussed, such as the increase in nuclear safety of the reactors in operation throughout the world and the end of the Cold War with the consequent dismantlement of a great number of nuclear weapons. Finally, it is recalled that the lessons learned at Chernobyl should never be forgotten. (Author)

  6. Twenty-one lectures on complex analysis a first course

    CERN Document Server

    Isaev, Alexander

    2017-01-01

    At its core, this concise textbook presents standard material for a first course in complex analysis at the advanced undergraduate level. This distinctive text will prove most rewarding for students who have a genuine passion for mathematics as well as certain mathematical maturity. Primarily aimed at undergraduates with working knowledge of real analysis and metric spaces, this book can also be used to instruct a graduate course. The text uses a conversational style with topics purposefully apportioned into 21 lectures, providing a suitable format for either independent study or lecture-based teaching. Instructors are invited to rearrange the order of topics according to their own vision. A clear and rigorous exposition is supported by engaging examples and exercises unique to each lecture; a large number of exercises contain useful calculation problems. Hints are given for a selection of the more difficult exercises. This text furnishes the reader with a means of learning complex analysis as well as a subtl...

  7. Twenty years on: the impact of fragments on drug discovery.

    Science.gov (United States)

    Erlanson, Daniel A; Fesik, Stephen W; Hubbard, Roderick E; Jahnke, Wolfgang; Jhoti, Harren

    2016-09-01

    After 20 years of sometimes quiet growth, fragment-based drug discovery (FBDD) has become mainstream. More than 30 drug candidates derived from fragments have entered the clinic, with two approved and several more in advanced trials. FBDD has been widely applied in both academia and industry, as evidenced by the large number of papers from universities, non-profit research institutions, biotechnology companies and pharmaceutical companies. Moreover, FBDD draws on a diverse range of disciplines, from biochemistry and biophysics to computational and medicinal chemistry. As the promise of FBDD strategies becomes increasingly realized, now is an opportune time to draw lessons and point the way to the future. This Review briefly discusses how to design fragment libraries, how to select screening techniques and how to make the most of information gleaned from them. It also shows how concepts from FBDD have permeated and enhanced drug discovery efforts.

  8. Random number generation and creativity.

    Science.gov (United States)

    Bains, William

    2008-01-01

    A previous paper suggested that humans can generate genuinely random numbers. I tested this hypothesis by repeating the experiment with a larger number of highly numerate subjects, asking them to call out a sequence of digits selected from 0 through 9. The resulting sequences were substantially non-random, with an excess of sequential pairs of numbers and a deficit of repeats of the same number, in line with previous literature. However, the previous literature suggests that humans generate random numbers with substantial conscious effort, and distractions which reduce that effort reduce the randomness of the numbers. I reduced my subjects' concentration by asking them to call out in another language, and with alcohol - neither affected the randomness of their responses. This suggests that the ability to generate random numbers is a 'basic' function of the human mind, even if those numbers are not mathematically 'random'. I hypothesise that there is a 'creativity' mechanism which, while not truly random, provides novelty as part of the mind's defence against closed programming loops, and that testing for the effects seen here in people more or less familiar with numbers or with spontaneous creativity could identify more features of this process. It is possible that training to perform better at simple random generation tasks could help to increase creativity, through training people to reduce the conscious mind's suppression of the 'spontaneous', creative response to new questions.
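    The two departures from randomness reported above, an excess of sequential pairs and a deficit of immediate repeats, are straightforward to count; a minimal sketch (the function name is ours, not from the study):

```python
def adjacent_pair_counts(digits):
    """Count ascending sequential pairs (d, d+1 mod 10) and immediate
    repeats (d, d) among adjacent digits. For truly random digits, each
    kind of pair is expected in about 1/10 of adjacent positions."""
    pairs = list(zip(digits, digits[1:]))
    ascending = sum(1 for a, b in pairs if b == (a + 1) % 10)
    repeats = sum(1 for a, b in pairs if a == b)
    expected_each = len(pairs) / 10.0
    return ascending, repeats, expected_each
```

A human-generated sequence of the kind described would show `ascending` well above and `repeats` well below `expected_each`.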

  9. Three times twenty by twenty-twenty

    International Nuclear Information System (INIS)

    Heller, W.

    2007-01-01

    Under the German presidency, the European Council adopted a program of climate protection and energy policy on March 9, 2007. It contains these 3 objectives: (1) By 2020, greenhouse gas emissions in the EU will be reduced by at least 20 percent from their 1990 level. (2) The share of renewable energies in primary energy consumption in the EU will be raised to 20 percent by 2020. (3) To enhance energy efficiency, energy consumption in the EU is to be reduced by 20 percent relative to the forecasts for 2020. The reason given by the European Council for the ambitious energy efficiency goal is its conviction that, in addition to the considerable increase in renewables, a major improvement in energy efficiency will enhance the security of energy supply, attenuate the forecast rise in energy prices, and diminish greenhouse gas emissions. Nuclear power is mentioned only in passing and as a topic for national decisionmaking, which is absolutely unsatisfactory in the light of what nuclear power currently is, and could go on, contributing to the security of supply, the climate protection, and reasonably priced electricity supply. On the whole, the package of energy and environmental policy measures devised by the European Council may be termed ambitious. (orig.)

  10. Twenty years of social capital and health research: a glossary.

    Science.gov (United States)

    Moore, S; Kawachi, I

    2017-05-01

    Research on social capital in public health is approaching its 20th anniversary. Over this period, there have been rich and productive debates on the definition, measurement and importance of social capital for public health research and practice. As a result, the concepts and measures characterising social capital and health research have also evolved, often drawing from research in the social, political and behavioural sciences. The multidisciplinary adaptation of social capital-related concepts to study health has made it challenging for researchers to reach consensus on a common theoretical approach. This glossary thus aims to provide a general overview without recommending any particular approach. Based on our knowledge and research on social capital and health, we have selected key concepts and terms that have gained prominence over the last decade and complement an earlier glossary on social capital and health. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  11. Cost in medical education: one hundred and twenty years ago.

    Science.gov (United States)

    Walsh, Kieran

    2015-10-01

    The first full paper that is dedicated to cost in medical education appears in the BMJ in 1893. This paper, "The cost of a medical education", outlines the likely costs associated with undergraduate education at the end of the nineteenth century, and offers students guidance on financial planning. Many lessons can be gleaned from the paper about the cost and other aspects of nineteenth century medical education. Cost is viewed almost exclusively from the domain of the male gender. Cost is viewed not just from the perspective of a young man but of a young gentleman. There is a strong implication that medicine is a club and that you have to have money to join the club and then to take part in the club's activities. Cost affects choice of medical school and selection into schools. The paper places great emphasis on the importance of passing exams at their first sitting and progressing through each year in a timely manner-mainly to save costs. The subject of cost is viewed from the perspective of the payer-at this time students and their families. The paper encourages the reader to reflect on what has and has not changed in this field since 1893. Modern medical education is still expensive; its expense deters students; and we have only started to think about how to control costs or how to ensure value. Too much of the cost of medical education continues to burden students and their families.

  12. Drone Warfare: Twenty-First Century Empire and Communications

    Directory of Open Access Journals (Sweden)

    Kevin Howley

    2017-02-01

    Full Text Available This paper, part of a larger project that examines drones from a social-construction of technology perspective, considers drone warfare in light of Harold Innis’s seminal work on empire and communication. Leveraging leading-edge aeronautics with advanced optics, data processing, and networked communication, drones represent an archetypal “space-biased” technology. Indeed, by allowing remote operators and others to monitor, select, and strike targets from half a world away, and in real-time, these weapon systems epitomize the “pernicious neglect of time” Innis sought to identify and remedy in his later writing. With Innis’s time-space dialectic as a starting point, then, the paper considers drones in light of a longstanding paradox of American culture: the impulse to collapse the geographical distance between the United States and other parts of the globe, while simultaneously magnifying the cultural difference between Americans and other peoples and societies. In the midst of the worldwide proliferation of drones, this quintessentially sublime technology embodies this (dis)connect in important, profound, and ominous ways.

  13. Random tensors

    CERN Document Server

    Gurau, Razvan

    2017-01-01

    Written by the creator of the modern theory of random tensors, this book is the first self-contained introductory text to this rapidly developing theory. Starting from notions familiar to the average researcher or PhD student in mathematical or theoretical physics, the book presents in detail the theory and its applications to physics. The recent detections of the Higgs boson at the LHC and gravitational waves at LIGO mark new milestones in Physics confirming long standing predictions of Quantum Field Theory and General Relativity. These two experimental results only reinforce today the need to find an underlying common framework of the two: the elusive theory of Quantum Gravity. Over the past thirty years, several alternatives have been proposed as theories of Quantum Gravity, chief among them String Theory. While these theories are yet to be tested experimentally, key lessons have already been learned. Whatever the theory of Quantum Gravity may be, it must incorporate random geometry in one form or another....

  14. Tuning up mind's pattern to nature's own idea: Eddington's early twenties case for variational derivatives

    Science.gov (United States)

    Smadja, Ivahn

    This paper sets out to show how Eddington's early twenties case for variational derivatives significantly bears witness to a steady and consistent shift in focus from a resolute striving for objectivity towards "selective subjectivism" and structuralism. While framing his so-called "Hamiltonian derivatives" along the lines of previously available variational methods for deriving gravitational field equations from an action principle, Eddington assigned them a theoretical function of his own devising in The Mathematical Theory of Relativity (1923). I make clear that two stages should be marked out in Eddington's train of thought if the meaning of such variational derivatives is to be adequately assessed. As far as they were originally intended to embody the mind's collusion with nature by linking atomicity of matter with atomicity of action, variational derivatives were at first assigned a dual role requiring of them not only to express mind's craving for permanence but also to tune up mind's privileged pattern to "Nature's own idea". Whereas at a later stage, as affine field theory would provide a framework for world-building, such "Hamiltonian differentiation" would grow out of tune through gauge-invariance and, by disregarding how mathematical theory might precisely come into contact with actual world, would be turned into a mere heuristic device for structural knowledge.

  15. Indication to Open Anatrophic Nephrolithotomy in the Twenty-First Century: A Case Report

    Directory of Open Access Journals (Sweden)

    Alfredo Maria Bove

    2012-01-01

    Full Text Available Introduction. Advances in endourology have greatly reduced indications for open surgery in the treatment of staghorn kidney stones. Nevertheless, in our experience, open surgery still represents the treatment of choice in rare cases. Case Report. A 71-year-old morbidly obese female patient, complaining of occasional left flank pain and recurrent cystitis for many years, presented with bilateral staghorn kidney stones. Comorbidities were obesity (BMI 36.2), hypertension, type II diabetes, chronic obstructive pulmonary disease (COPD), and hyperlipidemia. Due to these comorbidities, endoscopic and laparoscopic approaches were not indicated. We offered the patient staged open anatrophic nephrolithotomy. Results. Operative time was 180 minutes. Blood loss was 500 cc, requiring one unit of packed red blood cells. Hospital stay was 7 days. The renal function was unaffected based on preoperative and postoperative serum creatinine levels. Stone-free status of the left kidney was confirmed after surgery with CT scan. Conclusions. Open surgery can represent a valid alternative for the treatment of staghorn kidney stones in very selected cases. A discussion of the current indications in the twenty-first century is presented.

  16. [Study on medical records of acupuncture-moxibustion in The Twenty-four Histories].

    Science.gov (United States)

    Huang, Kai-Wen

    2012-03-01

    Through a combination of manual and computerized retrieval, medical records of acupuncture-moxibustion in The Twenty-Four Histories were collected. Acupuncture cases from the Spring and Autumn Period (770-476 B.C.) to the end of the Ming Dynasty (1368-1644) were retrieved. From the medical records of acupuncture-moxibustion in Chinese official history books, it can be found that systemic diseases and emergent, severe diseases were already treated by physicians with a combination of acupuncture and medicine as early as the Spring and Autumn Period and the Warring States Period (475-221 B.C.). CANG Gong, a famous physician of the Western Han Dynasty (206 B.C.-A.D. 24), cured diseases by selecting points along the running courses of the meridians where the illness resided, which indicates that the theory of meridians and collaterals served as a guide for clinical practice as early as the Western Han Dynasty. Bloodletting therapy, which has a surprising effect, was often adopted by physicians of various historical periods to treat diseases. And treatment of diseases with a single point proved to be easy and effective.

  17. Antioxidant and antiproliferative activities of twenty-four Vitis vinifera grapes.

    Directory of Open Access Journals (Sweden)

    Zhenchang Liang

    Full Text Available Grapes are rich in phytochemicals with many proven health benefits. Phenolic profiles, antioxidant and antiproliferative activities of twenty-four selected Vitis vinifera grape cultivars were investigated in this study. Large ranges of variation were found in these cultivars for the contents of total phenolics (95.3 to 686.5 mg/100 g) and flavonoids (94.7 to 1055 mg/100 g), antioxidant activities (oxygen radical absorbance capacity 378.7 to 3386.0 mg of Trolox equivalents/100 g and peroxyl radical scavenging capacity 14.2 to 557 mg of vitamin C equivalents/100 g), cellular antioxidant activities (3.9 to 139.9 µmol of quercetin equivalents/100 g without PBS wash and 1.4 to 95.8 µmol of quercetin equivalents/100 g with PBS wash), and antiproliferative activities (25 to 82% at a concentration of 100 mg/mL of extracts). The total antioxidant activities were significantly correlated with the total phenolics and flavonoids. However, no significant correlations were found between antiproliferative activities and total phenolics or total flavonoids content. Wine grapes and colored grapes showed much higher levels of phytochemicals and antioxidant activities than table grapes and green/yellow grapes. Several germplasm accessions with very high contents of phenolics and flavonoids and high total antioxidant activity were identified. These accessions can be valuable sources of genes for breeding grape cultivars with better nutritional qualities of wine and table grapes in the future.

  18. Developing twenty-first century skills: insights from an intensive interdisciplinary workshop Mosaic of Life

    Directory of Open Access Journals (Sweden)

    Tamara Milosevic

    2013-11-01

    Full Text Available The Baltic Sea, one of the world’s largest semi-enclosed seas, with its very low salinity and quasi-isolation from the big oceans, cannot decide whether it is a sea or a large lake. This geologically-unique environment supports an even more surprising and delicate marine ecosystem, where a complex community of fishes, marine mammals and important microscopic organisms creates a magical mosaic of life. Humans have enjoyed the abundance of life in the Baltic Sea for thousands of years, and major Scandinavian and Baltic cities have oriented themselves towards this geo-ecosystem in order to develop and seek ecological, economical and cultural inspiration and wealth. The ‘Mosaic of Life’ workshop aimed at going beyond the obvious in examining the meaning of the Baltic Sea by gathering together a selection of young, creative minds from different backgrounds ranging from the arts and economics to geology and life sciences. This intensive workshop was designed as a unique training opportunity to develop essential twenty-first century skills – to introduce and develop creative, critical and interdisciplinary thinking and collaborative teamwork, as well as to foster a visual and scientific literacy, using project-based learning and hands-on activities. Our final goal has been to be inspired by the resulting connections, differences and unifying concepts, creating innovative, interdisciplinary projects which would look further than the sea – further than the eye can see and further into the future.

  19. A Critical Feminist and Race Critique of Thomas Piketty's "Capital in the Twenty-First Century"

    Science.gov (United States)

    Moeller, Kathryn

    2016-01-01

    Thomas Piketty's "Capital in the Twenty-first Century" documents the foreboding nature of rising wealth inequality in the twenty-first century. In an effort to promote a more just and democratic global society and rein in the unfettered accumulation of wealth by the few, Piketty calls for a global progressive annual tax on corporate…

  20. Twenty-One at TREC-8: using Language Technology for Information Retrieval

    NARCIS (Netherlands)

    Kraaij, Wessel; Pohlmann, Renée; Hiemstra, Djoerd; Voorhees, E.M; Harman, D.K.

    2000-01-01

    This paper describes the official runs of the Twenty-One group for TREC-8. The Twenty-One group participated in the Ad-hoc, CLIR, Adaptive Filtering and SDR tracks. The main focus of our experiments is the development and evaluation of retrieval methods that are motivated by natural language

  1. Guidelines to Design Engineering Education in the Twenty-First Century for Supporting Innovative Product Development

    Science.gov (United States)

    Violante, Maria Grazia; Vezzetti, Enrico

    2017-01-01

    In the twenty-first century, meeting our technological challenges demands educational excellence, a skilled populace that is ready for the critical challenges society faces. There is widespread consensus, however, that education systems are failing to adequately prepare all students with the essential twenty-first century knowledge and skills…

  2. Twenty-One at TREC-7: ad-hoc and cross-language track

    NARCIS (Netherlands)

    Hiemstra, Djoerd; Kraaij, Wessel; Voorhees, E.M; Harman, D.K.

    1999-01-01

    This paper describes the official runs of the Twenty-One group for TREC-7. The Twenty-One group participated in the ad-hoc and the cross-language track and made the following accomplishments: We developed a new weighting algorithm, which outperforms the popular Cornell version of BM25 on the ad-hoc

  3. Twenty-First Century Literacy: A Matter of Scale from Micro to Mega

    Science.gov (United States)

    Brown, Abbie; Slagter van Tryon, Patricia J.

    2010-01-01

    Twenty-first century technologies require educators to look for new ways to teach literacy skills. Current communication methods are combinations of traditional and newer, network-driven forms. This article describes the changes twenty-first century technologies cause in the perception of time, size, distance, audience, and available data, and…

  4. Establishing the R&D Agenda for Twenty-First Century Learning

    Science.gov (United States)

    Kay, Ken; Honey, Margaret

    2006-01-01

    Much ink has flowed over the past few years describing the need to incorporate twenty-first century skills into K-12 education. Preparing students to succeed as citizens, thinkers, and workers--the bedrock of any educational system--in this environment means arming them with more than a list of facts and important dates. Infusing twenty-first…

  5. Selected cerium phase diagrams

    International Nuclear Information System (INIS)

    Gschneidner, K.A. Jr.; Verkade, M.E.

    1974-09-01

    A compilation of cerium alloy phase equilibria data based on the most reliable information available is presented. The binary systems selected are those of cerium with each of the following twenty-nine elements which might be commonly found in steels: Al, Sb, As, Bi, Ca, C, Cr, Co, Nb, Cu, Fe, Pb, Mg, Mn, Mo, Ni, N, O, P, Se, Si, Ag, S, Te, Sn, Ti, W, and Zn. A brief discussion, a summary of crystal lattice parameters where applicable, and a list of references is included for each element surveyed. (U.S.)

  6. Twenty years bone banking in Macedonia application and legislation

    International Nuclear Information System (INIS)

    Karevski, L.; Videovski, G.; Nospal, T.

    1999-01-01

    and tissue banks function as non-profit, non-commercial organizations. The principle of donation is voluntary. Article 6 of the Law regulates which parts of the deceased (cadaver) can be taken for transplantation; that is, parts may be taken unless the deceased person during his life declared in written form that parts of his body may not be used for transplantation. This is probably a solution and compromise between 'opting in' and 'opting out', close to the concept of 'informed consent' proposed by Von Versen. We call this regulation 'voluntary negative selection'. We think that this regulation stimulates transplantation for the purpose of curing, while respecting ethical and moral concerns. Despite all the difficulties encountered, we try to work in accordance with the EATB standards and aim to accomplish results similar to those of other bone banks around the world by using the latest methods for bone procurement

  7. Twenty-Four-Hour Ambulatory Blood Pressure Monitoring in Hypertension

    Science.gov (United States)

    2012-01-01

    assessments, systematic reviews, meta-analyses, or randomized controlled trials. Exclusion Criteria: non-English papers; animal or in vitro studies; case reports, case series, or case-case studies; studies comparing different antihypertensive therapies and evaluating their antihypertensive effects using 24-hour ABPM; studies on home or self-monitoring of BP, and studies on automated office BP measurement; studies in high-risk subgroups (e.g. diabetes, pregnancy, kidney disease). Outcomes of Interest. Patient Outcomes: mortality, all cardiovascular events (e.g., myocardial infarction [MI], stroke); non-fatal, all cardiovascular events (e.g., MI, stroke); combined fatal and non-fatal, all cardiovascular events (e.g., MI, stroke); all non-cardiovascular events; control of BP (e.g. systolic and/or diastolic target level). Drug-Related Outcomes: percentage of patients who show a reduction in, or stop, drug treatment; percentage of patients who begin multi-drug treatment; drug therapy use (e.g. number, intensity of drug use); drug-related adverse events. Quality of Evidence: The quality of the body of evidence was assessed as high, moderate, low, or very low according to the GRADE Working Group criteria. As stated by the GRADE Working Group, the following definitions of quality were used in grading the quality of the evidence. High: Further research is very unlikely to change confidence in the estimate of effect. Moderate: Further research is likely to have an important impact on confidence in the estimate of effect and may change the estimate. Low: Further research is very likely to have an important impact on confidence in the estimate of effect and is likely to change the estimate. Very Low: Any estimate of effect is very uncertain. Summary of Findings. Short-Term Follow-Up Studies (Length of Follow-Up ≤ 1 Year): Based on very low quality of evidence, there is no difference between technologies for non-fatal cardiovascular events.
Based on moderate quality of evidence, ABPM resulted

  8. Pseudo-Random Number Generators

    Science.gov (United States)

    Howell, L. W.; Rheinfurth, M. H.

    1984-01-01

    Package features comprehensive selection of probabilistic distributions. Monte Carlo simulations resorted to whenever systems studied not amenable to deterministic analyses or when direct experimentation not feasible. Random numbers having certain specified distribution characteristic integral part of simulations. Package consists of collector of "pseudorandom" number generators for use in Monte Carlo simulations.
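    As a flavor of what such a collection of generators contains, here is a minimal linear congruential generator, one classic pseudo-random scheme; the constants are the widely used Numerical Recipes values, and the class itself is illustrative, not part of the package described:

```python
class LCG:
    """32-bit linear congruential pseudo-random number generator:
    state <- (A * state + C) mod 2**32."""
    A, C, M = 1664525, 1013904223, 2 ** 32

    def __init__(self, seed=1):
        self.state = seed % self.M

    def next_u32(self):
        """Advance the state and return it as an unsigned 32-bit integer."""
        self.state = (self.A * self.state + self.C) % self.M
        return self.state

    def uniform(self):
        """Pseudo-random float in [0, 1)."""
        return self.next_u32() / self.M
```

Because A, C, and M satisfy the Hull-Dobell conditions, this generator has full period 2**32, and the same seed always reproduces the same sequence, which is exactly what makes such generators useful for repeatable Monte Carlo simulations.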

  9. Random pulse generator

    International Nuclear Information System (INIS)

    Guo Ya'nan; Jin Dapeng; Zhao Dixin; Liu Zhen'an; Qiao Qiao; Chinese Academy of Sciences, Beijing

    2007-01-01

    Due to the randomness of radioactive decay and nuclear reactions, the signals from detectors are random in time, whereas a normal pulse generator generates periodic pulses. To measure the performance of nuclear electronic devices under random inputs, a random pulse generator is necessary. Types of random pulse generators are reviewed, and two digital random pulse generators are introduced. (authors)
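    Pulse trains with the randomness described, exponentially distributed inter-pulse intervals as in radioactive decay, can be simulated in software; a sketch under that assumption (function name and parameters are illustrative):

```python
import random

def random_pulse_times(rate_hz, duration_s, seed=None):
    """Pulse times of a homogeneous Poisson process: inter-pulse
    intervals are exponential with mean 1/rate_hz, so pulse counts in
    fixed windows are Poisson-distributed, as for radioactive decay."""
    rng = random.Random(seed)
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate_hz)  # draw the next inter-pulse interval
        if t >= duration_s:
            return times
        times.append(t)
```

For example, `random_pulse_times(1000.0, 1.0)` yields on the order of 1000 pulse times in one simulated second, with the irregular spacing a periodic generator cannot provide.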

  10. Random matrices and random difference equations

    International Nuclear Information System (INIS)

    Uppuluri, V.R.R.

    1975-01-01

    Mathematical models leading to products of random matrices and random difference equations are discussed. A one-compartment model with random behavior is introduced, and it is shown how the average concentration in the discrete time model converges to the exponential function. This is of relevance to understanding how radioactivity gets trapped in bone structure in blood--bone systems. The ideas are then generalized to two-compartment models and mammillary systems, where products of random matrices appear in a natural way. The appearance of products of random matrices in applications in demography and control theory is considered. Then random sequences motivated from the following problems are studied: constant pulsing and random decay models, random pulsing and constant decay models, and random pulsing and random decay models
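    The one-compartment convergence mentioned above can be checked numerically: with a random per-step retention factor, the average concentration decays like E[A]^n, a discrete exponential. A sketch assuming uniformly distributed retention (all names and the Uniform(0.8, 1.0) choice are illustrative):

```python
import random

def mean_concentration(n_steps, trials=20000, seed=1):
    """Average over trials of x_{k+1} = A_k * x_k with x_0 = 1 and
    independent retention factors A_k ~ Uniform(0.8, 1.0); the mean
    tracks E[A]**n = 0.9**n, an exponential decay in discrete time."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        x = 1.0
        for _ in range(n_steps):
            x *= rng.uniform(0.8, 1.0)  # one random retention step
        total += x
    return total / trials
```

After 10 steps the simulated mean sits close to 0.9**10 ≈ 0.349, illustrating how the random difference equation reproduces exponential decay on average.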

  11. Dual-task results and the lateralization of spatial orientation: artifact of test selection?

    Science.gov (United States)

    Bowers, C A; Milham, L M; Price, C

    1998-01-01

    An investigation was conducted to identify the degree to which results regarding the lateralization of spatial orientation among men and women are artifacts of test selection. A dual-task design was used to study possible lateralization differences, providing baseline and dual-task measures of spatial-orientation performance, right- and left-hand tapping, and vocalization of "cat, dog, horse." The Guilford-Zimmerman Test (Guilford & Zimmerman, 1953), the Eliot-Price Test (Eliot & Price, 1976), and the Stumpf-Fay Cube Perspectives Test (Stumpf & Fay, 1983) were the three spatial-orientation tests used to investigate possible artifacts of test selection. Twenty-eight right-handed male and 39 right-handed female undergraduates completed random baseline and dual-task sessions. Analyses indicated no significant sex-related differences in spatial-orientation ability for all three tests. Furthermore, there was no evidence of differential lateralization of spatial orientation between the sexes.

  12. Rationale and Design of a Randomized Clinical Comparison of Everolimus-Eluting (Xience V/Promus) and Sirolimus-Eluting (Cypher Select+) Coronary Stents in Unselected Patients with Coronary Heart Disease

    DEFF Research Database (Denmark)

    Jensen, Lisette Okkels; Thayssen, Per; Tilsted, Hans Henrik

    2010-01-01

    with Clinical Outcome (SORT OUT) IV trial was designed as a prospective, multi-center, open-label, all-comer, two-arm, randomized, non-inferiority study comparing the everolimus-eluting stent with the sirolimus-eluting stent in the treatment of atherosclerotic coronary artery lesions. Based on a non...

  13. Driving safety after brain damage: follow-up of twenty-two patients with matched controls.

    Science.gov (United States)

    Katz, R T; Golden, R S; Butter, J; Tepper, D; Rothke, S; Holmes, J; Sahgal, V

    1990-02-01

    Driving after brain damage is a vital issue, considering the large number of patients who suffer from cerebrovascular and traumatic encephalopathy. The ability to operate a motor vehicle is an integral part of independence for most adults and so should be preserved whenever possible. The physician may estimate a patient's ability to drive safely based on his own examination, the evaluation of a neuropsychologist, and a comprehensive driving evaluation--testing, driving simulation, behind-the-wheel observation--with a driving specialist. This study sought to evaluate the ability of brain-damaged individuals to operate a motor vehicle safely at follow-up. These patients had been evaluated (by a physician, a neuropsychologist, and a driving specialist) and were judged able to operate a motor vehicle safely after their cognitive insult. Twenty-two brain-damaged patients who were evaluated at our institution were successfully followed up to five years (mean interval of 2.67 years). Patients were interviewed by telephone. Their driving safety was compared with a control group consisting of a close friend or spouse of each patient. Statistical analysis revealed no difference between patient and control groups in the type of driving, the incidence of speeding tickets, near accidents, and accidents, and the cost of vehicle damage when accidents occurred. The patient group was further divided into those who had, and those who had not experienced driving difficulties so that initial neuropsychologic testing could be compared. No significant differences were noted in any aspect of the neuropsychologic test battery. We conclude that selected brain-damaged patients who have passed a comprehensive driving assessment as outlined were as fit to drive as were their normal matched controls.(ABSTRACT TRUNCATED AT 250 WORDS)

  14. Chapter Twenty

    African Journals Online (AJOL)

    User

    manipulation of their natural resource blessing from God, the crude oil, ... pictures of destitute, starving children, raped and battered women and cases of pollution. This ..... 70) This is very simple and straightforward demands that can easily be ...

  15. Molecular characterization of 93 genotypes of cocoa (Theobroma cacao L. with random amplified microsatellites RAMs

    Directory of Open Access Journals (Sweden)

    Yacenia Morillo C.

    2014-12-01

    Full Text Available Six random amplified microsatellite (RAM) markers were used to characterize 93 genotypes of cocoa in Tumaco (Colombia). One hundred twenty-seven bands were generated. The number of polymorphic loci varied between 11 and 25 for the AG and TG primers, respectively. This study differentiated the 93 genotypes into six groups at a similarity of 0.53, with a mean heterozygosity (He) of 0.28 for the population and a genetic differentiation coefficient (Fst) of 0.12±0.02. A significant level of genetic diversity was evident in the T. cacao genotypes. This resource would benefit selection programs of individual trees or plant breeding programs. The genotypes clustered in large proportion in accordance with the collection zone. This characteristic was associated with collection zones and along the rivers in the municipality of Tumaco. The RAM technique proved to be a useful tool for the determination of genetic diversity in Theobroma species.

  16. Guidelines to design engineering education in the twenty-first century for supporting innovative product development

    Science.gov (United States)

    Violante, Maria Grazia; Vezzetti, Enrico

    2017-11-01

In the twenty-first century, meeting our technological challenges demands educational excellence: a skilled populace that is ready for the critical challenges society faces. There is widespread consensus, however, that education systems are failing to adequately prepare all students with the essential twenty-first century knowledge and skills necessary to succeed in life, career, and citizenship. The purpose of this paper is to understand how twenty-first century knowledge and skills can be appropriately embedded in engineering education aimed at innovative product development using additive manufacturing (AM). The study designs a learning model by which to achieve effective AM education, to address the requirements of the twenty-first century and to offer students the opportunity to experiment with STEM (science, technology, engineering, and mathematics) concepts. The study is conducted using the quality function deployment (QFD) methodology.

  17. Twenty-fourth Semiannual Report of the Commission to the Congress, July 1958

    Energy Technology Data Exchange (ETDEWEB)

    Strauss, Lewis L.; McCone, John A.

    1958-07-31

    The document represents the twenty-fourth semiannual Atomic Energy Commission (AEC) report to Congress. The report sums up the major activities and developments in the national atomic energy program covering the period January - June 1958.

  18. 76 FR 72240 - Twenty-Seventh Meeting: RTCA Special Committee 206: Aeronautical Information and Meteorological...

    Science.gov (United States)

    2011-11-22

... DEPARTMENT OF TRANSPORTATION Federal Aviation Administration Twenty-Seventh Meeting: RTCA Special... Administration (FAA), U.S. Department of Transportation (DOT). ACTION: Notice of RTCA Special Committee 206..., 2011 FRAC OSED

  19. Book Review: Africa and Europe in the Twenty-First Century ...

    African Journals Online (AJOL)

    Abstract. Title: Africa and Europe in the Twenty-First Century. Author: Osita C. Eze and Amadu Sesay. Publisher: Nigerian Institute of International Affairs, 2010, xvi + 397pp, Tables, Index. ISBN: 978-002-102-7 ...

  20. Twenty-second Semiannual Report of the Commission to the Congress, July 1957

    Energy Technology Data Exchange (ETDEWEB)

    Strauss, Lewis L.

    1957-07-31

    The document represents the twenty-second semiannual Atomic Energy Commission (AEC) report to Congress. The report sums up the major activities and developments in the national atomic energy program covering the period January - June 1957.

  1. Nutrients, technological properties and genetic relationships among twenty cowpea landraces cultivated in West Africa

    NARCIS (Netherlands)

    Madode, Y.E.E.; Linnemann, A.R.; Nout, M.J.R.; Vosman, B.J.; Hounhouigan, D.J.; Boekel, van T.

    2012-01-01

    The genetic relationships among twenty phenotypically different cowpea landraces were unravelled regarding their suitability for preparing West African dishes. Amplified fragment length polymorphism classified unpigmented landraces (UPs) as highly similar (65%, one cluster), contrary to pigmented

  2. Spatial memory deficits in patients after unilateral selective amygdalohippocampectomy

    NARCIS (Netherlands)

    Kessels, R.P.C.; Hendriks, M.P.H.; Schouten, J.A.; Asselen, M. van; Postma, A.

    2004-01-01

    The present study investigated the differential involvement of the right and left hippocampus in various forms of spatial memory: spatial search, positional memory versus object-location binding, and coordinate versus categorical processing. Twenty-five epilepsy patients with selective

  3. China's iGeneration - Cinema and Moving Image Culture for the Twenty-First Century

    OpenAIRE

    Johnson, Matthew D.; Wagner, Keith B.; Yu, Tianqui; Vulpiani, Luke

    2014-01-01

    Collection of essays on twenty-first century Chinese cinema and moving image culture. This innovative collection of essays on twenty-first century Chinese cinema and moving image culture features contributions from an international community of scholars, critics, and practitioners. Taken together, their perspectives make a compelling case that the past decade has witnessed a radical transformation of conventional notions of cinema. Following China's accession to the WTO in 2001, personal ...

  4. Nuclear power in the twenty-first century - An assessment (Part 1)

    OpenAIRE

    von Hirschhausen, Christian

    2017-01-01

    Nuclear power was one of the most important discoveries of the twentieth century, and it continues to play an important role in twenty-first century discussions about the future energy mix, climate change, innovation, proliferation, geopolitics, and many other crucial policy topics. This paper addresses some key issues around the emergence of nuclear power in the twentieth century and perspectives going forward in the twenty-first, including questions of economics and competitiveness, the str...

  5. Twenty years of meta-analyses in orthopaedic surgery: has quality kept up with quantity?

    Science.gov (United States)

    Dijkman, Bernadette G; Abouali, Jihad A K; Kooistra, Bauke W; Conter, Henry J; Poolman, Rudolf W; Kulkarni, Abhaya V; Tornetta, Paul; Bhandari, Mohit

    2010-01-01

As the number of studies in the literature is increasing, orthopaedic surgeons depend heavily on meta-analyses as their primary source of scientific evidence. The objectives of this review were to assess the scientific quality and number of published meta-analyses on orthopaedics-related topics over time. We conducted, in duplicate and independently, a systematic review of published meta-analyses in orthopaedics in the years 2005 and 2008 and compared them with a previous systematic review of meta-analyses from 1969 to 1999. A search of electronic databases (MEDLINE, EMBASE, and the Cochrane Database of Systematic Reviews) was performed to identify meta-analyses published in 2005 and 2008. We searched bibliographies and contacted content experts to identify additional relevant studies. We included forty-five and forty-four meta-analyses from 2005 and 2008, respectively. While the number of meta-analyses increased fivefold from 1999 to 2008, the mean quality score did not change significantly over time (p = 0.067). In the later years, a significantly lower proportion of meta-analyses had methodological flaws (56% in 2005 and 68% in 2008) compared with meta-analyses published prior to 2000 (88%) (p = 0.006). In 2005 and 2008, respectively, 18% and 30% of the meta-analyses had major to extensive flaws in their methodology. Studies from 2008 with positive conclusions used and described appropriate criteria for the validity assessment less often than did those with negative results. The use of random-effects and fixed-effects models as pooling methods became more popular toward 2008. Although the methodological quality of orthopaedic meta-analyses has increased in the past twenty years, a substantial proportion continues to show major to extensive flaws. As the number of published meta-analyses is increasing, a routine checklist for

  6. Genetic variability, path-coefficient and correlation studies in twenty elite bread-wheat (triticum aestivum L.) lines

    International Nuclear Information System (INIS)

    Mujahid, M.Y.; Asif, M.; Ahmad, I.; Kisana, N.A.; Ahmad, Z.; Asim, M.

    2005-01-01

Twenty bread-wheat elite lines of diverse origin, developed by various research institutes in the country, were tested and evaluated at the National Agricultural Research Centre (NARC) under optimum irrigated conditions. Significant variation was observed for all the traits studied, viz: days to heading, days to maturity, kernel weight, test weight, and grain yield. Genotypic and phenotypic correlations were computed, and the direct and indirect contributions of each trait towards grain yield were determined. Grain yield showed significant association with test weight and kernel weight. Direct positive effects of kernel weight and test weight on grain yield suggest the effectiveness of these traits in selecting and identifying desirable wheat genotypes for a target environment. (author)

  7. Mechanical design of multiple zone plates precision alignment apparatus for hard X-ray focusing in twenty-nanometer scale

    Energy Technology Data Exchange (ETDEWEB)

    Shu, Deming; Liu, Jie; Gleber, Sophie C.; Vila-Comamala, Joan; Lai, Barry; Maser, Jorg M.; Roehrig, Christian; Wojcik, Michael J.; Vogt, Franz Stefan

    2017-04-04

    An enhanced mechanical design of multiple zone plates precision alignment apparatus for hard x-ray focusing in a twenty-nanometer scale is provided. The precision alignment apparatus includes a zone plate alignment base frame; a plurality of zone plates; and a plurality of zone plate holders, each said zone plate holder for mounting and aligning a respective zone plate for hard x-ray focusing. At least one respective positioning stage drives and positions each respective zone plate holder. Each respective positioning stage is mounted on the zone plate alignment base frame. A respective linkage component connects each respective positioning stage and the respective zone plate holder. The zone plate alignment base frame, each zone plate holder and each linkage component is formed of a selected material for providing thermal expansion stability and positioning stability for the precision alignment apparatus.

  8. RANDOM WALK HYPOTHESIS IN FINANCIAL MARKETS

    Directory of Open Access Journals (Sweden)

    Nicolae-Marius JULA

    2017-05-01

Full Text Available The random walk hypothesis states that stock market prices do not follow a predictable trajectory, but are simply random. Before attempting to predict what may be a random set of data, one should test for randomness, because, regardless of the power and complexity of the models used, the results cannot otherwise be trusted. Several methods exist for testing this hypothesis, and the computational power provided by the R environment makes the researcher's work easier and more cost-effective. The increasing power of computing and the continuous development of econometric tests should give potential investors new tools for selecting commodities and investing in efficient markets.
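A minimal sketch of one classic randomness check of the kind the abstract recommends, the Wald-Wolfowitz runs test applied to the signs of successive price changes. This is an illustration only; the paper itself works with econometric tests in R, and the function name here is my own.

```python
import math

def runs_test_z(series):
    """z-statistic of the runs test; |z| > 1.96 rejects randomness at the 5% level."""
    signs = [1 if b > a else 0 for a, b in zip(series, series[1:]) if b != a]
    n1 = sum(signs)              # number of up-moves
    n2 = len(signs) - n1         # number of down-moves
    if n1 == 0 or n2 == 0:
        return 0.0               # degenerate series: no alternation to test
    runs = 1 + sum(1 for a, b in zip(signs, signs[1:]) if a != b)
    n = n1 + n2
    mean = 2 * n1 * n2 / n + 1
    var = 2 * n1 * n2 * (2 * n1 * n2 - n) / (n ** 2 * (n - 1))
    return (runs - mean) / math.sqrt(var)

# A perfectly alternating "price" series has far too many runs to be random:
zigzag = [i % 2 for i in range(2000)]
print(abs(runs_test_z(zigzag)) > 1.96)  # True: randomness clearly rejected
```

A genuinely random walk would typically yield |z| below the critical value, while trending or oscillating series are flagged.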

  9. Fragmentation of random trees

    International Nuclear Information System (INIS)

    Kalay, Z; Ben-Naim, E

    2015-01-01

We study fragmentation of a random recursive tree into a forest by repeated removal of nodes. The initial tree consists of N nodes and it is generated by sequential addition of nodes, with each new node attaching to a randomly-selected existing node. As nodes are removed from the tree, one at a time, the tree dissolves into an ensemble of separate trees, namely, a forest. We study statistical properties of trees and nodes in this heterogeneous forest, and find that the fraction of remaining nodes m characterizes the system in the limit N → ∞. We obtain analytically the size density φ_s of trees of size s. The size density has a power-law tail φ_s ∼ s^(−α) with exponent α = 1 + 1/m. Therefore, the tail becomes steeper as further nodes are removed, and the fragmentation process is unusual in that the exponent α increases continuously with time. We also extend our analysis to the case where nodes are added as well as removed, and obtain the asymptotic size density for growing trees. (paper)
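An illustrative simulation of the fragmentation process described in the abstract: grow a random recursive tree, then delete nodes one at a time; each deletion detaches the subtrees rooted at the deleted node's children. Function names and sizes here are my own, not the paper's notation.

```python
import random
from collections import Counter

def random_recursive_tree(n, rng):
    """parent[i] is the attachment point of node i; node 0 is the root."""
    parent = [0] * n
    for i in range(1, n):
        parent[i] = rng.randrange(i)  # attach to a uniformly chosen existing node
    return parent

def forest_sizes(parent, removed):
    """Sizes of the trees remaining after deleting the nodes in `removed`."""
    removed = set(removed)
    def comp_root(v):
        # climb while the parent is still alive; stop at a detached root
        while v != 0 and parent[v] not in removed:
            v = parent[v]
        return v
    alive = [v for v in range(len(parent)) if v not in removed]
    return Counter(comp_root(v) for v in alive)

rng = random.Random(42)
n = 2000
parent = random_recursive_tree(n, rng)
removed = set(rng.sample(range(n), n // 2))   # remove half the nodes
sizes = forest_sizes(parent, removed)
# every surviving node belongs to exactly one tree of the resulting forest
print(sum(sizes.values()) == n - len(removed))  # True
```

Tabulating `sizes.values()` over many runs at a fixed remaining fraction is one way to eyeball the power-law tail the abstract derives.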

  10. Random ancestor trees

    International Nuclear Information System (INIS)

    Ben-Naim, E; Krapivsky, P L

    2010-01-01

We investigate a network growth model in which the genealogy controls the evolution. In this model, a new node selects a random target node and links either to this target node, or to its parent, or to its grandparent, etc.; all nodes from the target node to its most ancient ancestor are equiprobable destinations. The emerging random ancestor tree is very shallow: the fraction g_n of nodes at distance n from the root decreases super-exponentially with n, g_n = e^(−1)/(n − 1)!. We find that a macroscopic hub at the root coexists with highly connected nodes at higher generations. The maximal degree of a node at the nth generation grows algebraically as N^(1/β_n), where N is the system size. We obtain the series of nontrivial exponents which are roots of transcendental equations: β_1 ≈ 1.351746, β_2 ≈ 1.682201, etc. As a consequence, the fraction p_k of nodes with degree k has an algebraic tail, p_k ∼ k^(−γ), with γ = β_1 + 1 = 2.351746
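A sketch of the growth rule described above, under my own naming: each new node picks a uniformly random target and then links to one of the nodes on the path from the target up to the root, all equiprobable.

```python
import random

def random_ancestor_tree(n, rng):
    parent = [0] * n
    depth = [0] * n
    for i in range(1, n):
        t = rng.randrange(i)          # uniformly random target node
        chain = [t]
        while t != 0:                 # target, its parent, ..., the root
            t = parent[t]
            chain.append(t)
        p = rng.choice(chain)         # equiprobable destination on the chain
        parent[i] = p
        depth[i] = depth[p] + 1
    return parent, depth

rng = random.Random(3)
parent, depth = random_ancestor_tree(20000, rng)
# the tree is very shallow: the abstract's g_n formula puts roughly 1/e,
# about 37%, of the nodes at distance 1 from the root
frac_depth_1 = depth.count(1) / len(depth)
print(0.3 < frac_depth_1 < 0.45)  # True for any reasonable sample size
```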

  11. Random survival forests for competing risks

    DEFF Research Database (Denmark)

    Ishwaran, Hemant; Gerds, Thomas A; Kogalur, Udaya B

    2014-01-01

    We introduce a new approach to competing risks using random forests. Our method is fully non-parametric and can be used for selecting event-specific variables and for estimating the cumulative incidence function. We show that the method is highly effective for both prediction and variable selection...

  12. Biomarker analyses and final overall survival results from a phase III, randomized, open-label, first-line study of gefitinib versus carboplatin/paclitaxel in clinically selected patients with advanced non-small-cell lung cancer in Asia (IPASS).

    Science.gov (United States)

    Fukuoka, Masahiro; Wu, Yi-Long; Thongprasert, Sumitra; Sunpaweravong, Patrapim; Leong, Swan-Swan; Sriuranpong, Virote; Chao, Tsu-Yi; Nakagawa, Kazuhiko; Chu, Da-Tong; Saijo, Nagahiro; Duffield, Emma L; Rukazenkov, Yuri; Speake, Georgina; Jiang, Haiyi; Armour, Alison A; To, Ka-Fai; Yang, James Chih-Hsin; Mok, Tony S K

    2011-07-20

The results of the Iressa Pan-Asia Study (IPASS), which compared gefitinib and carboplatin/paclitaxel in previously untreated never-smokers and light ex-smokers with advanced pulmonary adenocarcinoma, were published previously. This report presents overall survival (OS) and efficacy according to epidermal growth factor receptor (EGFR) biomarker status. In all, 1,217 patients were randomly assigned. Biomarkers analyzed were EGFR mutation (amplification refractory mutation system; 437 patients evaluable), EGFR gene copy number (fluorescence in situ hybridization; 406 patients evaluable), and EGFR protein expression (immunohistochemistry; 365 patients evaluable). OS analysis was performed at 78% maturity. A Cox proportional hazards model was used to assess biomarker status by randomly assigned treatment interactions for progression-free survival (PFS) and OS. OS (954 deaths) was similar for gefitinib and carboplatin/paclitaxel, with no significant difference between treatments overall (hazard ratio [HR], 0.90; 95% CI, 0.79 to 1.02; P = .109) or in EGFR mutation-positive (HR, 1.00; 95% CI, 0.76 to 1.33; P = .990) or EGFR mutation-negative (HR, 1.18; 95% CI, 0.86 to 1.63; P = .309; treatment by EGFR mutation interaction P = .480) subgroups. A high proportion (64.3%) of EGFR mutation-positive patients randomly assigned to carboplatin/paclitaxel received subsequent EGFR tyrosine kinase inhibitors. PFS was significantly longer with gefitinib for patients whose tumors had both high EGFR gene copy number and EGFR mutation (HR, 0.48; 95% CI, 0.34 to 0.67) but significantly shorter when high EGFR gene copy number was not accompanied by EGFR mutation (HR, 3.85; 95% CI, 2.09 to 7.09). EGFR mutations are the strongest predictive biomarker for PFS and tumor response to first-line gefitinib versus carboplatin/paclitaxel. The predictive value of EGFR gene copy number was driven by coexisting EGFR mutation (post hoc analysis). Treatment-related differences observed for PFS in the EGFR

  13. Nonlinear Pricing with Random Participation

    OpenAIRE

    Jean-Charles Rochet; Lars A. Stole

    2002-01-01

    The canonical selection contracting programme takes the agent's participation decision as deterministic and finds the optimal contract, typically satisfying this constraint for the worst type. Upon weakening this assumption of known reservation values by introducing independent randomness into the agents' outside options, we find that some of the received wisdom from mechanism design and nonlinear pricing is not robust and the richer model which allows for stochastic participation affords a m...

  14. "The First Twenty Years," by Bernard J. Siegel. Annual Review of Anthropology, 22 (1993, pp. 1-34, Annual Reviews, Inc, Palo Alto

    Directory of Open Access Journals (Sweden)

    James A. Delle

    1995-05-01

Full Text Available After twenty years as editor of the Annual Review of Anthropology (ARA), Professor Siegel took on a daunting task with this article. In his words, he set out to "ponder the developments in the several subfields of anthropology over this period of time, as reflected in the topics selected for review in this enterprise" (p. 8). To this end Siegel, a cultural anthropologist, mined the collective knowledge contained within twenty years of the ARA. In his presentation, he considers the intellectual developments within each of the five subdisciplines separately (he includes applied anthropology), concluding with some brief remarks on the importance of maintaining a four- or five-field approach to anthropology. For our purposes here, I will limit my comments to his section on archaeology.

  15. Border Crossing in Contemporary Brazilian Culture: Global Perspectives from the Twenty-First Century Literary Scene

    Directory of Open Access Journals (Sweden)

    Cimara Valim de Melo

    2016-06-01

    Full Text Available Abstract: This paper investigates the process of internationalisation of Brazilian literature in the twenty-first century from the perspective of the publishing market. For this, we analyse how Brazil has responded to globalisation and what effects of cultural globalisation can be seen in the Brazilian literary scene, focusing on the novel. Observing the movement of the novelists throughout the globe, the reception of Brazilian literature in the United Kingdom and the relations between art and the literary market in Brazil, we intend to provoke some reflections on Brazilian cultural history in the light of the twenty-first century.

  16. Over-Selectivity as a Learned Response

    Science.gov (United States)

    Reed, Phil; Petrina, Neysa; McHugh, Louise

    2011-01-01

    An experiment investigated the effects of different levels of task complexity in pre-training on over-selectivity in a subsequent match-to-sample (MTS) task. Twenty human participants were divided into two groups; exposed either to a 3-element, or a 9-element, compound stimulus as a sample during MTS training. After the completion of training,…

  17. Random broadcast on random geometric graphs

    Energy Technology Data Exchange (ETDEWEB)

    Bradonjic, Milan [Los Alamos National Laboratory; Elsasser, Robert [UNIV OF PADERBORN; Friedrich, Tobias [ICSI/BERKELEY; Sauerwald, Tomas [ICSI/BERKELEY

    2009-01-01

In this work, we consider the random broadcast time on random geometric graphs (RGGs). The classic random broadcast model, also known as the push algorithm, is defined as follows: starting with one informed node, in each succeeding round every informed node chooses one of its neighbors uniformly at random and informs it. We consider the random broadcast time on RGGs when, with high probability: (i) the RGG is connected; (ii) there exists a giant component in the RGG. We show that the random broadcast time is bounded by O(√n + diam(component)), where diam(component) is the diameter of the entire graph, or of the giant component, for regimes (i) and (ii), respectively. In other words, for both regimes, we derive the broadcast time to be Θ(diam(G)), which is asymptotically optimal.
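A sketch of the classic push broadcast protocol the abstract defines: one node starts informed, and in each round every informed node picks a uniformly random neighbour and informs it. The graph here (a ring) and the sizes are illustrative placeholders, not the RGGs analysed in the paper.

```python
import math
import random

def push_broadcast_rounds(adj, start, rng):
    """Run push broadcast until everyone is informed; return (rounds, informed)."""
    informed = {start}
    rounds = 0
    while len(informed) < len(adj):
        for v in list(informed):               # snapshot: only already-informed push
            informed.add(rng.choice(adj[v]))   # push to one uniform neighbour
        rounds += 1
    return rounds, informed

n = 64
ring = [[(i - 1) % n, (i + 1) % n] for i in range(n)]  # cycle graph
rng = random.Random(7)
rounds, informed = push_broadcast_rounds(ring, 0, rng)
print(len(informed) == n)                  # True: everyone is eventually informed
# the informed set can at most double each round, so rounds >= log2(n);
# on a ring the diameter term dominates and the broadcast time is Theta(n)
print(rounds >= math.ceil(math.log2(n)))   # True
```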

  18. Quantumness, Randomness and Computability

    International Nuclear Information System (INIS)

    Solis, Aldo; Hirsch, Jorge G

    2015-01-01

    Randomness plays a central role in the quantum mechanical description of our interactions. We review the relationship between the violation of Bell inequalities, non signaling and randomness. We discuss the challenge in defining a random string, and show that algorithmic information theory provides a necessary condition for randomness using Borel normality. We close with a view on incomputablity and its implications in physics. (paper)

  19. How random is a random vector?

    Science.gov (United States)

    Eliazar, Iddo

    2015-12-01

Over 80 years ago Samuel Wilks proposed that the "generalized variance" of a random vector is the determinant of its covariance matrix. To date, the notion and use of the generalized variance is confined only to very specific niches in statistics. In this paper we establish that the "Wilks standard deviation" (the square root of the generalized variance) is indeed the standard deviation of a random vector. We further establish that the "uncorrelation index" (a derivative of the Wilks standard deviation) is a measure of the overall correlation between the components of a random vector. Both the Wilks standard deviation and the uncorrelation index are, respectively, special cases of two general notions that we introduce: "randomness measures" and "independence indices" of random vectors. In turn, these general notions give rise to "randomness diagrams": tangible planar visualizations that answer the question: How random is a random vector? The notion of "independence indices" yields a novel measure of correlation for Lévy laws. In general, the concepts and results presented in this paper are applicable to any field of science and engineering with random-vectors empirical data.
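A minimal numerical illustration of the "Wilks standard deviation" discussed above: the square root of the generalized variance, i.e. of the determinant of the covariance matrix. The ratio computed at the end is a crude stand-in in the spirit of the paper's "uncorrelation index"; the paper's exact definition may differ.

```python
import numpy as np

def wilks_std(cov):
    """Square root of the generalized variance (determinant of the covariance)."""
    return float(np.sqrt(np.linalg.det(cov)))

# Uncorrelated components: the determinant factorizes, so the Wilks
# standard deviation is just the product of the per-component sds (2 * 3).
cov_indep = np.array([[4.0, 0.0],
                      [0.0, 9.0]])
print(abs(wilks_std(cov_indep) - 6.0) < 1e-9)  # True

# Correlation shrinks the determinant, and hence the Wilks sd.
cov_corr = np.array([[4.0, 5.0],
                     [5.0, 9.0]])
print(wilks_std(cov_corr) < wilks_std(cov_indep))  # True
uncorrelation_index = wilks_std(cov_corr) / wilks_std(cov_indep)
print(0.0 < uncorrelation_index < 1.0)  # True: 1.0 would mean uncorrelated
```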

  20. How random is a random vector?

    International Nuclear Information System (INIS)

    Eliazar, Iddo

    2015-01-01

Over 80 years ago Samuel Wilks proposed that the “generalized variance” of a random vector is the determinant of its covariance matrix. To date, the notion and use of the generalized variance is confined only to very specific niches in statistics. In this paper we establish that the “Wilks standard deviation” (the square root of the generalized variance) is indeed the standard deviation of a random vector. We further establish that the “uncorrelation index” (a derivative of the Wilks standard deviation) is a measure of the overall correlation between the components of a random vector. Both the Wilks standard deviation and the uncorrelation index are, respectively, special cases of two general notions that we introduce: “randomness measures” and “independence indices” of random vectors. In turn, these general notions give rise to “randomness diagrams”: tangible planar visualizations that answer the question: How random is a random vector? The notion of “independence indices” yields a novel measure of correlation for Lévy laws. In general, the concepts and results presented in this paper are applicable to any field of science and engineering with random-vectors empirical data.

  1. Prediction of genetic gains by selection indices using mixed models in elephant grass for energy purposes.

    Science.gov (United States)

    Silva, V B; Daher, R F; Araújo, M S B; Souza, Y P; Cassaro, S; Menezes, B R S; Gravina, L M; Novo, A A C; Tardin, F D; Júnior, A T Amaral

    2017-09-27

Genetically improved cultivars of elephant grass need to be adapted to different ecosystems, with a faster growth speed and lower seasonality of biomass production over the year. This study aimed to use selection indices based on mixed models (REML/BLUP) for selecting families, and progenies within full-sib families, of elephant grass (Pennisetum purpureum) for biomass production. One hundred and twenty full-sib progenies were assessed from 2014 to 2015 in a randomized block design with three replications. During this period, the traits dry matter production, number of tillers, plant height, stem diameter, and neutral detergent fiber were assessed. Families 3 and 1 ranked best and are therefore the most suitable for selection. Progenies 40, 45, 46, and 49 occupied the first positions in the three indices assessed in the first cut. The gain for individual 40 was 161.76% using the Mulamba and Mock index. The use of selection indices based on mixed models is advantageous in elephant grass, since they provide high selection gains distributed among all the assessed traits, which is the situation most appropriate to breeding programs.

  2. Fast-food consumption among 17-year-olds in the Birth to Twenty ...

    African Journals Online (AJOL)

    Objectives: Assessment of fast-food consumption in urban black adolescents. Design: The current research was a descriptive cross-sectional study. Setting: Subjects attending the Birth to Twenty (Bt20) research facility at the Chris Hani Baragwanath Hospital in Soweto, Johannesburg between September 2007 and May ...

  3. Theoretical Contexts and Conceptual Frames for the Study of Twenty-First Century Capitalism

    DEFF Research Database (Denmark)

    Hull Kristensen, Peer; Morgan, Glenn

    2012-01-01

    This chapter argues that the comparative institutionalist approach requires rethinking in the light of developments in the twenty-first century. The chapter emphasizes the following features of the new environment: first, the rise of the BRIC and the emerging economies; secondly, the changed...

  4. Visual Literacy: Does It Enhance Leadership Abilities Required for the Twenty-First Century?

    Science.gov (United States)

    Bintz, Carol

    2016-01-01

    The twenty-first century hosts a well-established global economy, where leaders are required to have increasingly complex skills that include creativity, innovation, vision, relatability, critical thinking and well-honed communications methods. The experience gained by learning to be visually literate includes the ability to see, observe, analyze,…

  5. 76 FR 21741 - Twenty-First Century Communications and Video Programming Accessibility Act; Announcement of Town...

    Science.gov (United States)

    2011-04-18

    ... equipment distribution program for people who are deaf-blind. In addition, the law will fill accessibility... Programming Accessibility Act; Announcement of Town Hall Meeting AGENCY: Federal Communications Commission... The Twenty-First Century Communications and Video Programming Accessibility Act (the Act or CVAA...

  6. How Do Students Value the Importance of Twenty-First Century Skills?

    Science.gov (United States)

    Ahonen, Arto Kalevi; Kinnunen, Päivi

    2015-01-01

    Frameworks of twenty-first century skills have attained a central role in school development and curriculum changes all over the world. There is a common understanding of the need for meta-skills such as problem solving, reasoning, collaboration, and self-regulation. This article presents results from a Finnish study, in which 718 school pupils…

  7. Speaking American: Comparing Supreme Court and Hollywood Racial Interpretation in the Early Twenty-First Century

    Science.gov (United States)

    Hawkins, Paul Henry

    2010-01-01

    Apprehending that race is social, not biological, this study examines U.S. racial formation in the early twenty-first century. In particular, Hollywood and Supreme Court texts are analyzed as media for gathering, shaping and transmitting racial ideas. Representing Hollywood, the 2004 film "Crash" is analyzed. Representing the Supreme Court, the…

  8. Testing Students under Cognitive Capitalism: Knowledge Production of Twenty-First Century Skills

    Science.gov (United States)

    Morgan, Clara

    2016-01-01

    Scholars studying the global governance of education have noted the increasingly important role corporations play in educational policy making. I contribute to this scholarship by examining the Assessment and Teaching of twenty-first century skills (ATC21S™) project, a knowledge production apparatus operating under cognitive capitalism. I analyze…

  9. Humanities: The Unexpected Success Story of the Twenty-First Century

    Science.gov (United States)

    Davis, Virginia

    2012-01-01

    Humanities within universities faced challenges in the latter half of the twentieth century as their value in the modern world was questioned. This paper argues that there is strong potential for the humanities to thrive in the twenty-first century university sector. It outlines some of the managerial implications necessary to ensure that this…

  10. 78 FR 31627 - Twenty-Second Meeting: RTCA Special Committee 224, Airport Security Access Control Systems

    Science.gov (United States)

    2013-05-24

    ..., Management Analyst, NextGen, Business Operations Group, Federal Aviation Administration. [FR Doc. 2013-12460... Welcome, Introductions & Administrative Remarks Review and Approve Summary of the Twenty-first Meeting... Next Meeting Any Other Business Adjourn Attendance is open to the interested public but limited to...

  11. 78 FR 20168 - Twenty Fourth Meeting: RTCA Special Committee 203, Unmanned Aircraft Systems

    Science.gov (United States)

    2013-04-03

    ... Washington, DC, on March 28, 2013. Paige Williams, Management Analyst, NextGen, Business Operations Group... Introductions Review Meeting Agenda Review/Approval of Twenty Third Plenary Meeting Summary Leadership Update... for Unmanned Aircraft Systems and Minimum Aviation System Performance Standards Other Business Adjourn...

  12. Movies to the Rescue: Keeping the Cold War Relevant for Twenty-First-Century Students

    Science.gov (United States)

    Gokcek, Gigi; Howard, Alison

    2013-01-01

    What are the challenges of teaching Cold War politics to the twenty-first-century student? How might the millennial generation be educated about the political science theories and concepts associated with this period in history? A college student today, who grew up in the post-Cold War era with the Internet, Facebook, Twitter, smart phones,…

  13. Critical Remarks on Piketty's 'Capital in the Twenty-first Century'

    OpenAIRE

    Homburg, Stefan

    2014-01-01

    This paper discusses the central macroeconomic claims that are made in Thomas Piketty's book 'Capital in the Twenty-first Century'. The paper aims to show that Piketty's contentions are not only logically flawed but also contradicted by his own data.

  14. Thomas Piketty – The Adam Smith of the Twenty-First Century?

    Directory of Open Access Journals (Sweden)

    Jacob Dahl Rendtorff

    2014-11-01

Full Text Available Piketty’s book, Capital in the Twenty-First Century (2014), has become a bestseller around the world. Two months after its publication, it had sold more than 200,000 copies, and this success will surely continue for a long time. Piketty has established a new platform for discussing political economy.

  15. TPACK Updated to Measure Pre-Service Teachers' Twenty-First Century Skills

    Science.gov (United States)

    Valtonen, Teemu; Sointu, Erkko; Kukkonen, Jari; Kontkanen, Sini; Lambert, Matthew C.; Mäkitalo-Siegl, Kati

    2017-01-01

    Twenty-first century skills have attracted significant attention in recent years. Students of today and the future are expected to have the skills necessary for collaborating, problem solving, creative and innovative thinking, and the ability to take advantage of information and communication technology (ICT) applications. Teachers must be…

  16. The «Group of Twenty», IMF and EU and Reforming of Global Governance

    Directory of Open Access Journals (Sweden)

    Evgeniy J. Il'in

    2014-01-01

Full Text Available This article is devoted to the process of reforming the global financial system and world economic organizations, from the foundation of the International Monetary Fund at the Bretton Woods Conference in 1944 to the present. Special attention is given to the results of cooperation between the IMF and the "Group of Twenty" in the context of the world financial crisis of 2008-2009. The article covers the key benchmarks in the historical development of the world economy: the foundation of the Bretton Woods financial system, the rejection of the gold standard at the Jamaica Conference, the transition to floating exchange rates, the wave of crises in the 1990s, and the world financial crisis of 2008-2009. The evolution of the IMF within the framework of these global events is considered, as is the cooperation of the EU, the IMF, and the "Group of Twenty". The reforms of the IMF and their results are analyzed, and the policy of the IMF at different historical stages of its evolution is assessed. The article also deals with the formation and development of the "Group of Twenty" and its increasing role in global economic governance and in reforming the IMF. Particular emphasis is placed on the need for further IMF reforms and for greater participation of the G-20 in the world economic and political system.

  17. Vesico-vaginal fistula repair: experience with first twenty-three ...

    African Journals Online (AJOL)

    Vesico-vaginal fistula repair: experience with first twenty-three patients seen at a tertiary hospital in north-central Nigeria. Stephen D. Ngwan, Bassey E. Edem, Ajen S. Anzaku, Barnabas A. Eke, Mohammed A. Shittu, Solomon A. Obekpa ...

  18. 2010 Critical Success Factors for the North Carolina Community College System. Twenty First Annual Report

    Science.gov (United States)

    North Carolina Community College System (NJ1), 2010

    2010-01-01

    First mandated by the North Carolina General Assembly in 1989 (S.L. 1989; C. 752; S. 80), the Critical Success Factors report has evolved into the major accountability document for the North Carolina Community College System. This twenty first annual report on the critical success factors is the result of a process undertaken to streamline and…

  19. EXOGENOUS CHALLENGES FOR THE TOURISM INDUSTRY IN THE BEGINNING OF THE TWENTY FIRST CENTURY

    Directory of Open Access Journals (Sweden)

    Akosz Ozan

    2009-05-01

    Full Text Available Tourism is one of the fastest growing industries in the world. Besides its sustained growth, the tourism industry has shown in the first years of the twenty-first century that it can deal with political, military and natural disasters. The present paper ac…

  20. Improving the temporal transposability of lumped hydrological models on twenty diversified U.S. watersheds

    Directory of Open Access Journals (Sweden)

    G. Seiller

    2015-03-01

    Full Text Available Study region: Twenty diversified U.S. watersheds. Study focus: Identifying optimal parameter sets for hydrological modeling on a specific catchment remains an important challenge for numerous applied and research projects. This is particularly the case when working under contrasted climate conditions that call into question the temporal transposability of the parameters. Methodologies exist, mainly based on Differential Split Sample Tests, to examine this concern. This work assesses the improved temporal transposability of a multimodel implementation, based on twenty dissimilar lumped conceptual structures and on twenty U.S. watersheds, over the performance of the individual models. New hydrological insights for the region: Individual and collective temporal transposabilities are analyzed and compared on the twenty studied watersheds. Results show that the performances of individual models under contrasted climate conditions are very dissimilar depending on the test period and watershed, with no single model identifiable as the best solution in all circumstances. They also confirm that performance and robustness are clearly enhanced by using an ensemble of rainfall-runoff models instead of individual ones. The use of (calibrated) weight-averaged multimodels further improves temporal transposability over the simple averaged ensemble in most instances, confirming the added value of this approach but also the need to evaluate how individual models compensate for each other's errors. Keywords: Rainfall-runoff modeling, Multimodel approach, Differential Split Sample Test, Deterministic combination, Outputs averaging
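
    The output-averaging step described above can be sketched numerically. Below is a minimal illustration of a simple versus a calibrated weight-averaged multimodel combination, using synthetic data and ordinary least-squares weights (an assumption for illustration; the paper's actual calibration procedure may differ):

```python
import numpy as np

# Synthetic stand-in for observed streamflow and the outputs of five
# hypothetical lumped rainfall-runoff models (each = obs + noise).
rng = np.random.default_rng(42)
obs = 5.0 + 2.0 * rng.random(200)                      # "observed" flows
sims = np.stack([obs + rng.normal(0.0, 1.0, 200) for _ in range(5)])

# Simple ensemble: unweighted mean of the model outputs.
simple_avg = sims.mean(axis=0)

# Calibrated ensemble: least-squares weights fitted against observations
# (unconstrained OLS sketch; operational schemes often constrain weights).
weights, *_ = np.linalg.lstsq(sims.T, obs, rcond=None)
weighted_avg = weights @ sims

def rmse(sim, obs):
    return float(np.sqrt(np.mean((sim - obs) ** 2)))

# On the calibration period the weighted combination cannot do worse than
# the simple mean, since the equal-weight vector is one admissible choice.
print(rmse(simple_avg, obs), rmse(weighted_avg, obs))
```

    On an independent validation period the ranking can change, which is exactly the temporal-transposability question the study addresses.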

  1. Twenty Years of Australian Educational Computing: A Call for Modern Traditionalism

    Science.gov (United States)

    McDougall, Anne

    2005-01-01

    This article reflects on twenty or more years of development and research in educational computing. It argues that the emphasis on exploiting the technology in the service of contemporary ideas about learning held by many of the early workers has been lost to a focus on the technology itself and its capabilities. In schools this has led to an…

  2. METHYLMERCURY BIOACCUMULATION DEPENDENCE ON NORTHERN PIKE AGE AND SIZE IN TWENTY MINNESOTA LAKES

    Science.gov (United States)

    Mercury accumulation in northern pike muscle tissue (fillets) was found to be directly related to fish age and size. Measurements were made on 173 individual northern pike specimens from twenty lakes across Minnesota. Best fit regressions of mercury fillet concentration (wet wt.)...

  3. Leadership for Twenty-First-Century Schools and Student Achievement: Lessons Learned from Three Exemplary Cases

    Science.gov (United States)

    Schrum, Lynne; Levin, Barbara B.

    2013-01-01

    The purpose of this research was to understand ways exemplary award winning secondary school leaders have transformed their schools for twenty-first-century education and student achievement. This article presents three diverse case studies and identifies ways that each school's leader and leadership team reconfigured its culture and expectations,…

  4. Synthesis of Carbon Nano tubes: A Revolution in Material Science for the Twenty-First Century

    International Nuclear Information System (INIS)

    Allaf, Abd. W.

    2003-01-01

    The aim of this work is to explain the preparation procedures for single-walled carbon nanotubes using the arc discharge technique. The optimum conditions for carbon nanotube synthesis are given. It should be pointed out that this sort of material may well be the material of the twenty-first century.

  5. Twenty First Century Education: Transformative Education for Sustainability and Responsible Citizenship

    Science.gov (United States)

    Bell, David V. J.

    2016-01-01

    Many ministries of education focus on twenty-first century education but unless they are looking at this topic through a sustainability lens, they will be missing some of its most important elements. The usual emphasis on developing skills for employability in the current global economy begs the question whether the global economy is itself…

  6. The conundrum of religious schools in twenty-first-century Europe

    NARCIS (Netherlands)

    Merry, M.S.

    2015-01-01

    In this paper Merry examines in detail the continued - and curious - popularity of religious schools in an otherwise ‘secular’ twenty-first century Europe. To do this he considers a number of motivations underwriting the decision to place one's child in a religious school and delineates what are

  7. Twenty-One Ways to Use Music in Teaching the Language Arts.

    Science.gov (United States)

    Cardarelli, Aldo F.

    Twenty-one activities that integrate music and the language arts in order to capitalize on children's interests are described in this paper. Topics of the activities are as follows: alphabetical order, pantomime, vocabulary building from words of a favorite song, words that are "the most (whatever)" from songs, mood words, a configuration clue…

  8. Twenty-year trends in the prevalence of Down syndrome and other trisomies in Europe

    DEFF Research Database (Denmark)

    Loane, Maria; Morris, Joan K; Addor, Marie-Claude

    2013-01-01

    This study examines trends and geographical differences in total and live birth prevalence of trisomies 21, 18 and 13 with regard to increasing maternal age and prenatal diagnosis in Europe. Twenty-one population-based EUROCAT registries covering 6.1 million births between 1990 and 2009 participa...

  9. An investigation of twenty-one cases of low-frequency noise complaints

    DEFF Research Database (Denmark)

    Pedersen, Christian Sejer; Møller, Henrik; Persson-Waye, Kerstin

    2007-01-01

    Twenty-one cases of low-frequency noise complaints were thoroughly investigated with the aim of answering the question whether it is real physical sound or low-frequency tinnitus that causes the annoyance. Noise recordings were made in the homes of the complainants taking the spatial variation...

  10. Twenty-first Semiannual Report of the Commission to the Congress, January 1957

    Energy Technology Data Exchange (ETDEWEB)

    Strauss, Lewis L.

    1957-01-31

    The document represents the twenty-first semiannual Atomic Energy Commission (AEC) report to Congress. The report sums up the major activities and developments in the national atomic energy program covering the period July - December 1956. A special part two of this semiannual report addresses specifically Radiation Safety in Atomic Energy Activities.

  11. Teachers' Critical Reflective Practice in the Context of Twenty-First Century Learning

    Science.gov (United States)

    Benade, Leon

    2015-01-01

    In the twenty-first century, learning and teaching at school must prepare young people to engage in a complex and dynamic world deeply influenced by globalisation and the revolution in digital technology. In addition to the use of digital technologies, there is the development of flexible learning spaces. It is claimed that these developments demand,…

  12. Teaching and Learning in the Twenty-First Century: What Is an "Institute of Education" for?

    Science.gov (United States)

    Husbands, Chris

    2012-01-01

    As we begin the twenty-first century, schools and teachers are subject to enormous pressures for change. The revolution in digital technologies, the pressure to develop consistently high-performing schools systems, and the drive between excellence and equity all combine to raise profound questions about the nature of successful teaching and…

  13. An analysis of players' performances in the first cricket Twenty20 ...

    African Journals Online (AJOL)

    The purpose of this paper is to show how batting and bowling performance measures for one-day internationals can be adapted for use in Twenty20 matches, specifically in the case of a very small number of matches played. These measures are then used to give rankings of the batsmen and bowlers who performed best in ...

  14. Affective and cognitive determinants of intention to consume twenty foods that contribute to fat intake

    NARCIS (Netherlands)

    Stafleu, A.; Graaf, de C.; Staveren, van W.A.; Burema, J.

    2001-01-01

    Fishbein and Ajzen's theory of reasoned action was used as a framework to study beliefs and attitudes towards twenty foods that contribute to fat intake in a Netherlands sample population. Subjects between 18 and 75 years of age (n = 419, response rate 23%) filled out a self-administered…

  15. Towards a Rational Kingdom in Africa: Knowledge, Critical Rationality and Development in a Twenty-First Century African Cultural Context

    Directory of Open Access Journals (Sweden)

    Lawrence Ogbo Ugwuanyi

    2018-03-01

    Full Text Available This paper seeks to locate the kind of knowledge that is relevant for African development in the twenty-first-century African cultural context, and to propose a paradigm for achieving such knowledge. To do this, it advances the view that the concept of the twenty-first century in an African context must be situated within the colonial and post-colonial challenges of the African world and applied to serve African demands. Anchored on this position, the paper outlines and critiques the wrong assumption on which the modern state project was anchored in post-colonial Africa, and its development dividend, to suggest that this is the outcome of a flawed knowledge design that is foundational to the state project and which the project did not address. It proposes a shift in the knowledge paradigm in Africa and suggests critical self-consciousness as a more desirable knowledge design for Africa. It applies the term 'rational kingdom' (defined as a community of reason marked by critical conceptual self-awareness, driven by innovation and constructivism) to designate this paradigm. 'Innovation' is meant as the application of reason with an enlarged capacity to anticipate and address problems with fresh options, and 'constructivism' as the disposition to sustain innovation by advancing an alternative but more reliable worldview that can meet the exigencies of modernity in an African cultural context. The paper then proceeds to outline the nature of the rational kingdom and its anticipated gains and outcomes. It applies the method of inductive reasoning to advance its position, invoking selected but crucial areas of African life to show how the developmental demands of these aspects of life suggest a critical turn in African rationality.

  16. [Selective mutism].

    Science.gov (United States)

    Ytzhak, A; Doron, Y; Lahat, E; Livne, A

    2012-10-01

    Selective mutism is an uncommon disorder in young children, in which they selectively do not speak in certain social situations, while being capable of speaking easily in others. Many etiologies have been proposed for selective mutism, including psychodynamic, behavioral and familial ones. A developmental etiology that includes insights from all of the above is gaining support. According to this view, mild language impairment in a child with an anxiety trait may be at the root of developing selective mutism, and the behavior is reinforced by an avoidant pattern in the family. Early treatment and follow-up for children with selective mutism are important. The treatment includes non-pharmacological therapy (psychodynamic, behavioral and familial) and pharmacological therapy, mainly selective serotonin reuptake inhibitors (SSRIs).

  17. Twenty-first century learning for teachers: helping educators bring new skills into the classroom.

    Science.gov (United States)

    Wilson, John I

    2006-01-01

    The motivation behind every educator's dedication and hard work in the classroom is the knowledge that his or her teaching will result in students' success in life. Educators are committed to implementing twenty-first century skills; they have no question that students need such skills to be equipped for life beyond school. Members of the National Education Association are enthusiastic about the Partnership for 21st Century Skills framework, yet express frustration that many schools do not have adequate resources to make the necessary changes. Teaching these skills poses significant new responsibilities for schools and educators. To make it possible for teachers to build twenty-first century skills into the curriculum, physical and policy infrastructures must exist, professional development and curriculum materials must be offered, and meaningful assessments must be available. With an established understanding of what skills need to be infused into the classroom (problem solving, analysis, and communication) and educators' commitment to the new skill set, this chapter explores how to make such a dramatic reform happen. The author discusses existing strategies that will guide educators in infusing twenty-first century skills into traditional content areas such as math, English, geography, and science. Ultimately, public policy regarding educational standards, professional development, assessments, and physical school structures must exist to enable educators to employ twenty-first century skills, leading to student success in contemporary life. Any concern about the cost of bringing this nation's educational system up to par internationally should be offset by the price that not making twenty-first century skills a priority in the classroom will have on future economic well-being.

  18. Long-term efficacy of modified-release recombinant human TSH (MRrhTSH) augmented radioiodine (131I) therapy for benign multinodular goiter. Results from a multicenter international, randomized, placebo-controlled dose-selection study

    DEFF Research Database (Denmark)

    Fast, Søren; Hegedus, Laszlo; Pacini, Furio

    2014-01-01

    with 131I-therapy. Methods: In this phase II, single-blinded, placebo-controlled study, 95 patients (57.2±9.6 years old, 85% women, 83% Caucasians) with MNG (median size 96.0 ml (31.9 - 242.2 ml)) were randomized to receive placebo (n=32), 0.01 mg MRrhTSH (n=30) or 0.03 mg MRrhTSH (n=33), 24 hours before...... a calculated 131I activity. Thyroid volume (TV) and smallest cross-sectional area of trachea (SCAT) were measured (by CT-scan) at baseline, month 6 and month 36. Thyroid function and quality of life (QoL) was evaluated at 3 month and yearly intervals, respectively. Results: At 6 months, TV reduction...... was enhanced in the 0.03 mg MRrhTSH group (32.9% versus 23.1% in the placebo group, p=0.03), but not in the 0.01 mg MRrhTSH group. At month 36 the mean percent TV reduction from baseline was 44 ± 12.7% (SD) in the placebo group, 41 ± 21.0% in the 0.01 mg MRrhTSH-group and 53 ± 18.6% in the 0.03 mg MRrh...

  19. RANDOMIZED CONTROLLED TRIALS IN ORTHOPEDICS AND TRAUMATOLOGY: SYSTEMATIC ANALYSIS ON THE NATIONAL EVIDENCE

    Science.gov (United States)

    de Moraes, Vinícius Ynoe; Moreira, Cesar Domingues; Tamaoki, Marcel Jun Sugawara; Faloppa, Flávio; Belloti, Joao Carlos

    2015-01-01

    Objective: To assess whether there has been any improvement in the quality and quantity of randomized controlled trials (RCTs) in nationally published journals through the application of standardized and validated scores. Methods: We electronically selected all RCTs published in the two indexed Brazilian journals that focus on orthopedics, over the period 2000-2009: Acta Ortopédica Brasileira (AOB) and Revista Brasileira de Ortopedia (RBO). These RCTs were identified and scored by two independent researchers in accordance with the Jadad scale and the Cochrane Bone, Joint and Muscle Trauma Group score. The studies selected were grouped as follows: 1) publication period (2000-2004 or 2004-2009); 2) journal of publication (AOB or RBO). Results: Twenty-two papers were selected: 10 from AOB and 12 from RBO. No statistically significant differences were found between the proportions (nRCT/nTotal of published papers) of RCTs published in the two journals (p = 0.458), or in the Jadad score (p = 0.722) and Cochrane score (p = 0.630). Conclusion: The relative quality and quantity of RCTs in the journals analyzed were similar. There was a trend towards improvement of quality, but there was no increase in the number of RCTs between the two periods analyzed. PMID:27026971

  20. The effect of selection on genetic parameter estimates

    African Journals Online (AJOL)

    Unknown

    The South African Journal of Animal Science is available online at ... A simulation study was carried out to investigate the effect of selection on the estimation of genetic ... The model contained a fixed effect, random genetic and random.

  1. 47 CFR 1.1604 - Post-selection hearings.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Post-selection hearings. 1.1604 Section 1.1604 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1604 Post-selection hearings. (a) Following the random...

  2. Site selection

    CERN Multimedia

    CERN PhotoLab

    1968-01-01

    To help resolve the problem of site selection for the proposed 300 GeV machine, the Council selected "three wise men" (left to right, J H Bannier of the Netherlands, A Chavanne of Switzerland and L K Boggild of Denmark).

  3. Benchmark selection

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Tvede, Mich

    2002-01-01

    Within a production theoretic framework, this paper considers an axiomatic approach to benchmark selection. It is shown that two simple and weak axioms; efficiency and comprehensive monotonicity characterize a natural family of benchmarks which typically becomes unique. Further axioms are added...... in order to obtain a unique selection...

  4. Characterization of Pharmacologic and Pharmacokinetic Properties of CCX168, a Potent and Selective Orally Administered Complement 5a Receptor Inhibitor, Based on Preclinical Evaluation and Randomized Phase 1 Clinical Study.

    Science.gov (United States)

    Bekker, Pirow; Dairaghi, Daniel; Seitz, Lisa; Leleti, Manmohan; Wang, Yu; Ertl, Linda; Baumgart, Trageen; Shugarts, Sarah; Lohr, Lisa; Dang, Ton; Miao, Shichang; Zeng, Yibin; Fan, Pingchen; Zhang, Penglie; Johnson, Daniel; Powers, Jay; Jaen, Juan; Charo, Israel; Schall, Thomas J

    2016-01-01

    The complement 5a receptor has been an attractive therapeutic target for many autoimmune and inflammatory disorders. However, development of a selective and potent C5aR antagonist has been challenging. Here we describe the characterization of CCX168 (avacopan), an orally administered selective and potent C5aR inhibitor. CCX168 blocked the C5a binding, C5a-mediated migration, calcium mobilization, and CD11b upregulation in U937 cells as well as in freshly isolated human neutrophils. CCX168 retains high potency when present in human blood. A transgenic human C5aR knock-in mouse model allowed comparison of the in vitro and in vivo efficacy of the molecule. CCX168 effectively blocked migration in in vitro and ex vivo chemotaxis assays, and it blocked the C5a-mediated neutrophil vascular endothelial margination. CCX168 was effective in migration and neutrophil margination assays in cynomolgus monkeys. This thorough in vitro and preclinical characterization enabled progression of CCX168 into the clinic and testing of its safety, tolerability, pharmacokinetic, and pharmacodynamic profiles in a Phase 1 clinical trial in 48 healthy volunteers. CCX168 was shown to be well tolerated across a broad dose range (1 to 100 mg) and it showed dose-dependent pharmacokinetics. An oral dose of 30 mg CCX168 given twice daily blocked the C5a-induced upregulation of CD11b in circulating neutrophils by 94% or greater throughout the entire day, demonstrating essentially complete target coverage. This dose regimen is being tested in clinical trials in patients with anti-neutrophil cytoplasmic antibody-associated vasculitis. Trial Registration ISRCTN registry with trial ID ISRCTN13564773.

  5. Characterization of Pharmacologic and Pharmacokinetic Properties of CCX168, a Potent and Selective Orally Administered Complement 5a Receptor Inhibitor, Based on Preclinical Evaluation and Randomized Phase 1 Clinical Study.

    Directory of Open Access Journals (Sweden)

    Pirow Bekker

    Full Text Available The complement 5a receptor has been an attractive therapeutic target for many autoimmune and inflammatory disorders. However, development of a selective and potent C5aR antagonist has been challenging. Here we describe the characterization of CCX168 (avacopan), an orally administered selective and potent C5aR inhibitor. CCX168 blocked the C5a binding, C5a-mediated migration, calcium mobilization, and CD11b upregulation in U937 cells as well as in freshly isolated human neutrophils. CCX168 retains high potency when present in human blood. A transgenic human C5aR knock-in mouse model allowed comparison of the in vitro and in vivo efficacy of the molecule. CCX168 effectively blocked migration in in vitro and ex vivo chemotaxis assays, and it blocked the C5a-mediated neutrophil vascular endothelial margination. CCX168 was effective in migration and neutrophil margination assays in cynomolgus monkeys. This thorough in vitro and preclinical characterization enabled progression of CCX168 into the clinic and testing of its safety, tolerability, pharmacokinetic, and pharmacodynamic profiles in a Phase 1 clinical trial in 48 healthy volunteers. CCX168 was shown to be well tolerated across a broad dose range (1 to 100 mg) and it showed dose-dependent pharmacokinetics. An oral dose of 30 mg CCX168 given twice daily blocked the C5a-induced upregulation of CD11b in circulating neutrophils by 94% or greater throughout the entire day, demonstrating essentially complete target coverage. This dose regimen is being tested in clinical trials in patients with anti-neutrophil cytoplasmic antibody-associated vasculitis. Trial Registration ISRCTN registry with trial ID ISRCTN13564773.

  6. Twenty four year time trends in fats and cholesterol intake by adolescents. Warsaw Adolescents Study

    Directory of Open Access Journals (Sweden)

    Charzewska Jadwiga

    2015-06-01

    Full Text Available The objective of this study was to determine time trends (1982–2006) in total fat intake and changes in the structure of fatty acid intake in adolescents from Warsaw, in view of the increasing prevalence of obesity. Data come from four successive surveys of randomly selected samples of adolescents (aged 11–15 years) from the Warsaw region. In total, 9747 pupils were examined, with response rates varying from 55% to 87% depending on the year. Surveys were always done in the spring season. Food intake was assessed using the 24-hour recall method, covering all products (including enriched products), dishes, beverages and diet supplements consumed by the pupils in the 24 hours preceding the examination. The content of energy and nutrients was calculated by means of in-house computer software (DIET 2 and 4), taking into account successive revisions of the tables of food composition and nutritional values, as well as the current Polish DRI. A significant decreasing trend was found in the intake of total fat, saturated fatty acids (SFA) and cholesterol. The percentage of energy from total fat also decreased, both in boys (to 35.1%) and girls (to 33.7%), but failed to reach the recommended level of below 30% of energy from fat. The significant decrease in SFA consumption was likewise not sufficient to approach the recommended value of <10% of energy, as it ranged from 13% to 15%. The decreasing trend in fat intake was not in accordance with the trend in obesity prevalence among the adolescents, as average BMI is going up. To stabilize health-oriented changes, especially in the diets of adolescents, further activity from professionals working on the prevention of adolescent obesity is desired.

  7. A cluster-randomized controlled trial evaluating the effects of delaying onset of adolescent substance abuse on cognitive development and addiction following a selective, personality-targeted intervention programme: the Co-Venture trial.

    Science.gov (United States)

    O'Leary-Barrett, Maeve; Mâsse, Benoit; Pihl, Robert O; Stewart, Sherry H; Séguin, Jean R; Conrod, Patricia J

    2017-10-01

    Substance use and binge drinking during early adolescence are associated with neurocognitive abnormalities, mental health problems and an increased risk for future addiction. The trial aims to evaluate the protective effects of an evidence-based substance use prevention programme on the onset of alcohol and drug use in adolescence, as well as on cognitive, mental health and addiction outcomes over 5 years. Thirty-eight high schools will be recruited, with a final sample of 31 schools assigned to intervention or control conditions (3826 youth). Brief personality-targeted interventions will be delivered to high-risk youth attending intervention schools during the first year of the trial. Control school participants will receive no intervention above what is offered to them in the regular curriculum by their respective schools. Public/private French and English high schools in Montreal (Canada). All grade 7 students (12-13 years old) will be invited to participate. High-risk youth will be identified as those scoring one standard deviation or more above the school mean on one of the four personality subscales of the Substance Use Risk Profile Scale (40-45% of youth). Self-reported substance use, mental health symptoms and cognitive functioning will be measured annually throughout the 5 years. Primary outcomes are the onset of substance use disorders at 4 years post-intervention (year 5). Secondary intermediate outcomes are the onset of alcohol and substance use 2 years post-intervention and neuropsychological functions; namely, the protective effects of substance use prevention on cognitive functions generally, and executive functions and reward sensitivity specifically. This longitudinal, cluster-randomized controlled trial will investigate the impact of a brief personality-targeted intervention program on reducing the onset of addiction 4 years post-intervention. Results will tease apart the developmental sequences of uptake and growth in substance use and cognitive

  8. The Efficacy of Single-Agent Epidermal Growth Factor Receptor Tyrosine Kinase Inhibitor Therapy in Biologically Selected Patients with Non-Small-Cell Lung Cancer: A Meta-Analysis of 19 Randomized Controlled Trials.

    Science.gov (United States)

    Li, Guifang; Gao, Shunji; Sheng, Zhixin; Li, Bin

    2016-01-01

    To determine the efficacy of first-generation single-agent epidermal growth factor receptor (EGFR) tyrosine kinase inhibitor (TKI) therapy in advanced non-small-cell lung cancer patients with known EGFR mutation status, we undertook this pooled analysis. We searched for randomized controlled trials (RCTs) in Medline, Embase, the Cochrane Controlled Trials Register, the Science Citation Index, and the American Society of Clinical Oncology annual meetings. Out of 2,129 retrieved articles, 19 RCTs enrolling 2,016 patients with wild-type EGFR tumors and 1,034 patients with mutant EGFR tumors were identified. For these EGFR mutant patients, single-agent EGFR-TKI therapy improved progression-free survival (PFS) over chemotherapy: the summary hazard ratios (HRs) were 0.41 (p well as chemotherapy in the first-line setting (HR = 1.65, p = 0.03) and in the second-/third-line setting (HR = 1.27, p = 0.006). No statistically significant difference was observed in terms of overall survival (OS). Using platinum-based doublet chemotherapy as a common comparator, indirect comparison showed the superior efficacy of single-agent EGFR-TKI therapy over EGFR-TKIs added to chemotherapy in PFS [HR = 1.35 (1.03, 1.77), p = 0.03]. Additionally, a marginal trend towards the same direction was found in the OS analysis [HR = 1.16 (0.99, 1.35), p = 0.06]. Interestingly, for those EGFR wild-type tumors, single-agent EGFR-TKI therapy was inferior to EGFR-TKIs added to chemotherapy in PFS [HR = 0.38 (0.33, 0.44), p chemotherapy. However, single-agent EGFR-TKI therapy was inferior to chemotherapy in PFS for those EGFR wild-type patients. Single-agent EGFR-TKI therapy could improve PFS over the combination of EGFR-TKIs and chemotherapy in these EGFR mutant patients. However, EGFR-TKIs combined with chemotherapy could provide additive PFS and OS benefit over single-agent EGFR-TKI therapy in those EGFR wild-type patients. © 2016 S. Karger AG, Basel.
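
    The summary hazard ratios reported above are typically obtained by inverse-variance pooling of per-trial log hazard ratios. A generic fixed-effect sketch with synthetic numbers (not the study's data; `pool_hazard_ratios` is an illustrative helper, not code from the paper):

```python
import math

def pool_hazard_ratios(hrs, ses):
    """Fixed-effect inverse-variance pooling of hazard ratios.

    hrs: per-trial hazard ratios; ses: standard errors of log(HR).
    Returns the pooled HR and its 95% confidence interval.
    """
    logs = [math.log(hr) for hr in hrs]
    weights = [1.0 / se ** 2 for se in ses]          # inverse-variance weights
    pooled_log = sum(w * l for w, l in zip(weights, logs)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    ci = (math.exp(pooled_log - 1.96 * pooled_se),
          math.exp(pooled_log + 1.96 * pooled_se))
    return math.exp(pooled_log), ci

# Synthetic example: three trials favouring the treatment arm.
hr, (lo, hi) = pool_hazard_ratios([0.40, 0.45, 0.38], [0.10, 0.12, 0.15])
```

    The pooled estimate is a precision-weighted geometric mean, so it always lies between the smallest and largest per-trial hazard ratio.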

  9. Randomizer for High Data Rates

    Science.gov (United States)

    Garon, Howard; Sank, Victor J.

    2018-01-01

    NASA as well as a number of other space agencies now recognize that the current recommended CCSDS randomizer used for telemetry (TM) is too short. When multiple applications of the PN8 Maximal Length Sequence (MLS) are required in order to fully cover a channel access data unit (CADU), spectral problems in the form of elevated spurious discretes (spurs) appear. Originally the randomizer was called a bit transition generator (BTG) precisely because it was thought that its primary value was to insure sufficient bit transitions to allow the bit/symbol synchronizer to lock and remain locked. We, NASA, have shown that the old BTG concept is a limited view of the real value of the randomizer sequence and that the randomizer also aids in signal acquisition as well as minimizing the potential for false decoder lock. Under the guidelines we considered here there are multiple maximal length sequences under GF(2) which appear attractive in this application. Although there may be mitigating reasons why another MLS sequence could be selected, one sequence in particular possesses a combination of desired properties which offsets it from the others.
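
    The PN sequences discussed above are produced by linear-feedback shift registers (LFSRs) over GF(2). A minimal sketch of an 8-bit (PN8) maximal-length generator, using the textbook primitive polynomial x^8 + x^6 + x^5 + x^4 + 1 (an illustrative choice; the CCSDS randomizer specifies its own polynomial and seed):

```python
def pn8_sequence(seed=0xFF):
    """One period of an 8-bit maximal-length (PN8) sequence.

    Fibonacci LFSR for the primitive polynomial x^8 + x^6 + x^5 + x^4 + 1.
    With a nonzero seed the state runs through all 255 nonzero values,
    so the output period is 2**8 - 1 = 255 bits.
    """
    state = seed
    bits = []
    for _ in range(255):
        bits.append(state & 1)  # oldest bit leaves the register
        # Feedback: b[t+8] = b[t+6] ^ b[t+5] ^ b[t+4] ^ b[t]
        fb = ((state >> 6) ^ (state >> 5) ^ (state >> 4) ^ state) & 1
        state = (state >> 1) | (fb << 7)
    return bits

def randomize(frame_bits, pn):
    """XOR data with the PN sequence; applying it twice restores the data."""
    return [d ^ p for d, p in zip(frame_bits, pn)]

pn = pn8_sequence()
```

    Because XOR-ing with the PN sequence is an involution, the same operation randomizes at the transmitter and de-randomizes at the receiver; a maximal-length period also guarantees a near-balanced bit stream (128 ones, 127 zeros per period), which is what supplies the bit transitions the synchronizer needs.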

  10. Analysis of some selected toxic metals in registered herbal products ...

    African Journals Online (AJOL)

    Twenty brands of herbal remedies were purchased randomly from pharmacy shops in Lagos, digested with aqua regia (3:1 HCl:HNO3) and analysed using atomic absorption spectroscopy (Buck 205 Atomic Absorption Spectrophotometer). There was no detectable lead in any of the 20 herbal samples; however, ...

  11. Using Chemistry Simulations: Attention Capture, Selective Amnesia and Inattentional Blindness

    Science.gov (United States)

    Rodrigues, Susan

    2011-01-01

    Twenty-one convenience sample student volunteers aged between 14-15 years worked in pairs (and one group of three) with two randomly allocated high quality conceptual (molecular level) and operational (mimicking wet labs) simulations. The volunteers were told they had five minutes to play, repeat, review, restart or stop the simulation, which in…

  12. Chemical, Bioactive, and Antioxidant Potential of Twenty Wild Culinary Mushroom Species

    OpenAIRE

    Sharma, S. K.; Gautam, N.

    2015-01-01

    The chemical, bioactive, and antioxidant potential of twenty wild culinary mushroom species being consumed by the people of northern Himalayan regions has been evaluated for the first time in the present study. Nutrients analyzed include protein, crude fat, fibres, carbohydrates, and monosaccharides. Besides, preliminary study on the detection of toxic compounds was done on these species. Bioactive compounds evaluated are fatty acids, amino acids, tocopherol content, carotenoids (β-carotene, ...

  13. The covered distance for twenty five years of the Manche waste storage 1969 - 1994

    International Nuclear Information System (INIS)

    Gourden, J.M.

    2001-01-01

    The twenty-five years of the Manche plant are narrated, from the difficulties of the beginning through the problems caused by leaks and runoff, which led to improved knowledge of radioactive waste behaviour, to the solutions that were brought. The book covers the period from the construction of the plant to the management of its radioactive waste. (N.C.)

  14. Catholic school governance in the twenty-first century: continuity, incongruity and challenge

    OpenAIRE

    Storr, Christopher John

    2007-01-01

    This study has two main aspects: first, it reports the results of a survey of ninety nine governors working in Roman Catholic primary and secondary schools situated in four English Catholic dioceses, and publishes hitherto unknown information about them; and, second, it examines how, in seeking to maintain a distinctive educational ethos, these governors are responding both to the legislative changes of the last twenty years, and to changes in English social and cultural attitudes. It shows h...

  15. Autonomous Robotic Weapons: US Army Innovation for Ground Combat in the Twenty-First Century

    Science.gov (United States)

    2015-05-21

    1 Introduction Today the robot is an accepted fact, but the principle has not been pushed far enough. In the twenty-first century the...2013, accessed March 29, 2015, http://www.bbc.com/news/magazine-21576376?print=true. 113 Steven Kotler, “Say Hello to Comrade Terminator: Russia’s...of autonomous robotic weapons, black-marketed directed energy weapons, and/or commercially available software, potential adversaries may find

  16. Automation and robotics for Space Station in the twenty-first century

    Science.gov (United States)

    Willshire, K. F.; Pivirotto, D. L.

    1986-01-01

    Space Station telerobotics will evolve beyond the initial capability into a smarter and more capable system as we enter the twenty-first century. Current technology programs including several proposed ground and flight experiments to enable development of this system are described. Advancements in the areas of machine vision, smart sensors, advanced control architecture, manipulator joint design, end effector design, and artificial intelligence will provide increasingly more autonomous telerobotic systems.

  17. Genetic parameters and selection gains for Euterpe oleracea in juvenile phase

    Directory of Open Access Journals (Sweden)

    João Tomé de Farias Neto

    2012-09-01

    Genetic parameters and selection gains, obtained 36 months after planting, are presented and discussed for progenies of an open-pollinated population of açai palm for plant height (AP), plant diameter (DPC), number of live leaves (NFV) and tiller number (NP), based on the linear mixed model methodology (REML/BLUP). The thirty progenies were evaluated in a randomized block design with three replications and plots of five plants, spaced at 6 m x 4 m. The values obtained for individual heritability (0.55, 0.44, 0.38 and 0.43) and for progeny means (0.64, 0.54, 0.58 and 0.64) for AP, DPC, NFV and NP, respectively, were expressive, indicating the possibility of genetic progress with selection. The accuracy between the predicted and true genetic values was 0.802 for height, 0.736 for diameter, 0.760 for number of live leaves and 0.797 for tiller number. With the exception of the NFV character, the coefficients of individual genetic variation were high (>10%), confirming the potential of the population for selection. Predicted genetic gains of 89.3% were obtained for the character AP and 2.1% for DPC, with the selection of the twenty top individuals. Correlation was found between height and diameter of the plant. Among ages, for the same characters, positive correlations of medium magnitude were found.

  18. Irrigation quality of ground water of twenty villages in Lahore district

    Directory of Open Access Journals (Sweden)

    M.S. Ali

    2009-05-01

    A study was conducted in twenty villages of Lahore district to assess the suitability of ground water for irrigation. Three water samples were collected from each of the twenty villages and analyzed for electrical conductivity (EC), sodium adsorption ratio (SAR), residual sodium carbonate (RSC) and chloride concentration. Out of the total of 60 water samples, 7 (11.7%) were fit, 7 (11.7%) were marginally fit, and the remaining 46 (76.6%) were unfit for irrigation. Twenty-eight samples (46.6%) had electrical conductivity above the permissible limit (>1250 µS cm-1), 19 samples (31.6%) had high SAR (>10 (mmol L-1)0.5), 44 samples (73.3%) had high RSC (>2.5 me L-1) and 10 samples (16.6%) were found unfit for irrigation due to high chloride concentration (>3.9 me L-1). It can be inferred from the data that the quality of available ground water in most of the villages is not suitable for sustainable crop production and soil health.
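    The permissible limits quoted in the abstract (EC > 1250 µS cm-1, SAR > 10 (mmol L-1)0.5, RSC > 2.5 me L-1, Cl > 3.9 me L-1) lend themselves to a simple screening routine. The fit / marginally fit / unfit rule below is an illustrative assumption; the abstract does not state how the authors combined the four criteria.

```python
def classify_irrigation_water(ec_uS_cm, sar, rsc_me_L, cl_me_L):
    """Screen a ground-water sample against the limits quoted in the abstract.

    Returns a (status, exceeded) pair, where `exceeded` lists the criteria
    above their limits.  The three-way rule (no exceedance = fit, exactly
    one = marginally fit, more than one = unfit) is a hypothetical
    illustration, not the authors' exact criterion.
    """
    exceeded = [name for name, value, limit in [
        ("EC", ec_uS_cm, 1250),    # electrical conductivity, uS/cm
        ("SAR", sar, 10),          # sodium adsorption ratio, (mmol/L)^0.5
        ("RSC", rsc_me_L, 2.5),    # residual sodium carbonate, me/L
        ("Cl", cl_me_L, 3.9),      # chloride, me/L
    ] if value > limit]
    if not exceeded:
        return "fit", exceeded
    if len(exceeded) == 1:
        return "marginally fit", exceeded
    return "unfit", exceeded

status, exceeded = classify_irrigation_water(1500, 12, 3.0, 5.0)
```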

  19. On a randomly imperfect spherical cap pressurized by a random ...

    African Journals Online (AJOL)

    On a randomly imperfect spherical cap pressurized by a random dynamic load. ... In this paper, we investigate a dynamical system in a random setting of dual ... characterization of the random process for determining the dynamic buckling load ...

  20. Random walks, random fields, and disordered systems

    CERN Document Server

    Černý, Jiří; Kotecký, Roman

    2015-01-01

    Focusing on the mathematics that lies at the intersection of probability theory, statistical physics, combinatorics and computer science, this volume collects together lecture notes on recent developments in the area. The common ground of these subjects is perhaps best described by the three terms in the title: Random Walks, Random Fields and Disordered Systems. The specific topics covered include a study of Branching Brownian Motion from the perspective of disordered (spin-glass) systems, a detailed analysis of weakly self-avoiding random walks in four spatial dimensions via methods of field theory and the renormalization group, a study of phase transitions in disordered discrete structures using a rigorous version of the cavity method, a survey of recent work on interacting polymers in the ballisticity regime and, finally, a treatise on two-dimensional loop-soup models and their connection to conformally invariant systems and the Gaussian Free Field. The notes are aimed at early graduate students with a mod...

  1. Selective oxidation

    International Nuclear Information System (INIS)

    Cortes Henao, Luis F.; Castro F, Carlos A.

    2000-01-01

    A review and discussion is presented of the characteristics and factors that relate activity and selectivity in the catalytic and non-catalytic partial oxidation of methane, and of the effect of variables such as temperature and pressure on the conversion of methane to methanol. The use of modified zeolites for the catalytic oxidation of natural gas is also considered.

  2. Selective gossip

    NARCIS (Netherlands)

    Üstebay, D.; Castro, R.M.; Rabbat, M.

    2009-01-01

    Motivated by applications in compression and distributed transform coding, we propose a new gossip algorithm called Selective Gossip to efficiently compute sparse approximations of network data. We consider running parallel gossip algorithms on the elements of a vector of transform coefficients.

  3. Therapist facilitative interpersonal skills and training status: A randomized clinical trial on alliance and outcome.

    Science.gov (United States)

    Anderson, Timothy; Crowley, Mary Ellen J; Himawan, Lina; Holmberg, Jennifer K; Uhlin, Brian D

    2016-09-01

    Therapist effects, independent of the treatment provided, have emerged as a contributor to psychotherapy outcomes. However, past research largely has not identified which therapist factors might be contributing to these effects, though research on psychotherapy implicates relational characteristics. The present randomized clinical trial tested the efficacy of therapists selected for their facilitative interpersonal skills (FIS) and training status. Sixty-five clients were selected from 2713 undergraduates using a screening and clinical interview procedure. Twenty-three therapists met with 2 clients for 7 sessions and 20 participants served in a no-treatment control group. Outcome and alliance differences for training status were negligible. High FIS therapists had greater pre-post client outcome, and higher rates of change across sessions, than low FIS therapists. All clients treated by therapists improved more than the silent control, but effects were greater with high FIS than low FIS therapists. From the first session, high FIS therapists also had higher alliances than low FIS therapists as well as significant improvements on client-rated alliance. Results were consistent with the hypothesis that therapists' common relational skills are independent contributors to therapeutic alliance and outcome.

  4. The design and performance of a twenty barrel hydrogen pellet injector for Alcator C-Mod

    International Nuclear Information System (INIS)

    Urbahn, J.A.

    1994-05-01

    A twenty-barrel hydrogen pellet injector has been designed, built and tested both in the laboratory and on the Alcator C-Mod Tokamak at MIT. The injector functions by firing pellets of frozen hydrogen or deuterium deep into the plasma discharge for the purpose of fueling the plasma, modifying the density profile and increasing the global energy confinement time. The design goals of the injector are: (1) operational flexibility, (2) high reliability, (3) remote operation with minimal maintenance. These requirements have led to a single-stage, pipe-gun design with twenty barrels. Pellets are formed by in situ condensation of the fuel gas, thus avoiding moving parts at cryogenic temperatures. The injector is the first to dispense with the need for cryogenic fluids and instead uses a closed-cycle refrigerator to cool the thermal system components. The twenty barrels of the injector produce pellets of four different size groups and allow for a high degree of flexibility in fueling experiments. Operation of the injector is under PLC control, allowing for remote operation, interlocked safety features and automated pellet manufacturing. The injector has been extensively tested and shown to produce pellets reliably with velocities up to 1400 m/sec. During the period from September to November of 1993, the injector was successfully used to fire pellets into over fifty plasma discharges. Experimental results include data on the pellet penetration into the plasma using an advanced pellet tracking diagnostic with improved time and spatial response. Data from the tracker indicate pellet penetrations were between 30 and 86 percent of the plasma minor radius.

  5. Why American business demands twenty-first century skills: an industry perspective.

    Science.gov (United States)

    Bruett, Karen

    2006-01-01

    Public education is the key to individual and business prosperity. With a vested stake in education, educators, employers, parents, policymakers, and the public should question how this nation's public education system is faring. Knowing that recent international assessments have shown little or no gains in American students' achievement, the author asserts the clear need for change. As both a large American corporate employer and a provider of technology for schools, Dell is concerned with ensuring that youth will thrive in their adult lives. Changing workplace expectations lead to a new list of skills students will need to acquire before completing their schooling. Through technology, Dell supports schools in meeting educational goals, striving to supply students with the necessary skills, referred to as twenty-first century skills. The Partnership for 21st Century Skills, of which Dell is a member, has led an initiative to define what twenty-first century learning should entail. Through extensive research, the partnership has built a framework outlining twenty-first century skills: analytical thinking, communication, collaboration, global awareness, and technological and economic literacy. Dell and the partnership are working state by state to promote the integration of these skills into curricula, professional development for teachers, and classroom environments. The author describes two current initiatives, one in Virginia, the other in Texas, which both use technology to help student learning. All stakeholders can take part in preparing young people to compete in the global economy. Educators and administrators, legislators, parents, and employers must play their role in helping students be ready for what the workforce and the world have in store for them.

  6. Trends in CPAP adherence over twenty years of data collection: a flattened curve.

    Science.gov (United States)

    Rotenberg, Brian W; Murariu, Dorian; Pang, Kenny P

    2016-08-19

    Obstructive sleep apnea (OSA) is a common disorder, and continuous positive airway pressure (CPAP) is considered to be the gold standard of therapy. CPAP however is known to have problems with adherence, with many patients eventually abandoning the device. The purpose of this paper is to assess secular trends in CPAP adherence over the long term to see if there have been meaningful improvements in adherence in light of the multiple interventions proposed to do so. A comprehensive systematic literature review was conducted using the Medline-Ovid, Embase, and Pubmed databases, searching for data regarding CPAP adherence over a twenty-year time frame (1994-2015). Data were assessed for quality and then extracted. The main outcome measure was reported CPAP non-adherence. Secondary outcomes included changes in CPAP non-adherence when comparing short versus long-term, and changes in terms of behavioral counseling. Eighty-two papers met study inclusion/exclusion criteria. The overall CPAP non-adherence rate based on a 7-h/night sleep time that was reported in studies conducted over the twenty-year time frame was 34.1%. There was no significant improvement over the time frame. Behavioral intervention improved adherence rates by ~1 h per night on average. The rate of CPAP adherence remains persistently low over twenty years' worth of reported data. No clinically significant improvement in CPAP adherence was seen even in recent years despite efforts toward behavioral intervention and patient coaching. This low rate of adherence is problematic, and calls into question the concept of CPAP as gold-standard of therapy for OSA.

  7. A history of meniscal surgery: from ancient times to the twenty-first century.

    Science.gov (United States)

    Di Matteo, B; Moran, C J; Tarabella, V; Viganò, A; Tomba, P; Marcacci, M; Verdonk, R

    2016-05-01

    The science and surgery of the meniscus have evolved significantly over time. Surgeons and scientists always enjoy looking forward to novel therapies. However, as part of the ongoing effort at optimizing interventions and outcomes, it may also be useful to reflect on important milestones from the past. The aim of the present manuscript was to explore the history of meniscal surgery across the ages, from ancient times to the twenty-first century. Herein, some of the investigations of the pioneers in orthopaedics are described, to underline how their work has influenced the management of the injured meniscus in modern times. Level of evidence V.

  8. Elevated dioxin levels in chloracne cases twenty years after the Seveso, Italy accident

    Energy Technology Data Exchange (ETDEWEB)

    Baccarelli, A.; Pesatori, A.C.; Consonni, D.; Bonzini, M.; Giacomini, S.M.; Bertazzi, P.A. [EPOCA Research Center, Univ. of Milan (Italy); Mocarelli, P. [Dept. of Lab. Medicine, Univ. of Milan-Bicocca, Desio (Italy); Patterson, D.G. Jr. [Centers for Disease Control and Prevention, Atlanta, GA (United States); Caporaso, N.E.; Landi, M.T. [Div. of Cancer Epidemiology and Genetics, National Cancer Inst., NIH, DHHS, Bethesda, MD (United States)

    2004-09-15

    In July 1976, an industrial accident contaminated a residential area surrounding Seveso, Italy, with high levels of 2,3,7,8-Tetrachlorodibenzo-p-dioxin (TCDD). The exposure was acute, relatively pure, and affected more than 45,000 men, women, and children. By February 1978, 193 chloracne cases, mostly children, had been identified in the exposed population. Twenty years later, we conducted a case-control study of subjects diagnosed with chloracne and control subjects who had not developed chloracne after the accident, to evaluate their TCDD plasma levels, as well as the exposure-response relationship and possible determinants of susceptibility to TCDD effects in this population.

  9. Twenty-Nail Transverse Melanonychia Induced by Hydroxyurea: Case Report and Review of the Literature.

    Science.gov (United States)

    Osemwota, Osamuede; Uhlemann, John; Rubin, Adam

    2017-08-01

    Twenty-nail transverse melanonychia from hydroxyurea is a rare phenomenon, only reported four times previously. Here we describe a 51-year-old female who presented with 20-nail transverse melanonychia 3 months after initiating hydroxyurea therapy. Transverse melanonychia is a benign process but can cause patients significant distress, and thus is an entity that dermatologists should recognize. We then review the cutaneous manifestations, differential diagnosis, and clinical considerations when evaluating patients with transverse melanonychia from hydroxyurea or other causes. J Drugs Dermatol. 2017;16(8):814-815.

  10. Twenty Years of Research on RNS for DSP: Lessons Learned and Future Perspectives

    DEFF Research Database (Denmark)

    Albicocco, Pietro; Cardarilli, Gian Carlo; Nannarelli, Alberto

    2014-01-01

    In this paper, we discuss a number of issues that emerged from our twenty-year experience in applying the Residue Number System (RNS) to DSP systems. In the early days, RNS was mainly used to reach maximum performance in speed. Today, RNS is also used to obtain power-efficient (speed-power tradeoff) and reliable systems (redundant RNS). Advances in microelectronics and CAD tools play an important role in favoring one technology over another, and a winning choice of the past may be at a disadvantage today. In this paper, we address a number of factors influencing the choice of RNS as the winning solution...

  11. Ecomorphological correlates of twenty dominant fish species of Amazonian floodplain lakes

    Directory of Open Access Journals (Sweden)

    F. K. Siqueira-Souza

    Abstract Fishes inhabiting Amazonian floodplain lakes exhibit a great variety of body shapes, which was a key advantage in colonizing the several habitats that compose these areas adjacent to the large Amazon rivers. In this paper, we performed an ecomorphological analysis of twenty abundant species, sampled in May and August 2011, in two floodplain lakes of the lower stretch of the Solimões River. The analysis detected differences among species, which could probably be associated with swimming ability and habitat use preferences.

  12. New Korean Record of Twenty Eight Species of the Family Ichneumonidae (Hymenoptera

    Directory of Open Access Journals (Sweden)

    Choi, Jin-Kyung

    2014-04-01

    We report twenty-eight ichneumonid species new to Korea. These species belong to seven subfamilies. Among them, five subfamilies, Diacritinae Townes, 1965, Microleptinae Townes, 1958, Orthocentrinae Förster, 1869, Orthopelmatinae Schmiedeknecht, 1910, and Phrudinae Townes and Townes, 1949, are newly introduced to the Korean fauna. All specimens are held in the insect collection of the animal systematics laboratory at the Yeungnam University Gyeongsan Campus. Photographs of the habitus of the newly recorded subfamilies, diagnoses of the 28 species and host information are provided.

  13. The twenty-five maiden ladies' tomb and predicaments of the feminist movement in Taiwan

    OpenAIRE

    Lee, Anru; Tang, Wen-hui Anna

    2010-01-01

    “The Twenty-five Maiden Ladies’ Tomb” is the collective burial site of the female workers who died in a ferry accident on their way to work in 1973. The fact that of the more than 70 passengers on board all 25 who died were unmarried young women, and the taboo in Taiwanese culture that shuns unmarried female ghosts, made the Tomb a fearsome place. Feminists in Gaoxiong (高雄) had for some years wanted the city government to change the tomb’s public image....

  14. Twenty years' application of agricultural countermeasures following the Chernobyl accident: lessons learned

    Energy Technology Data Exchange (ETDEWEB)

    Fesenko, S V [International Atomic Energy Agency, 1400 Vienna (Austria); Alexakhin, R M [Russian Institute of Agricultural Radiology and Agroecology, 249020 Obninsk (Russian Federation); Balonov, M I [International Atomic Energy Agency, 1400 Vienna (Austria); Bogdevich, I M [Research Institute for Soil Science and Agrochemistry, Minsk (Belarus); Howard, B J [Centre for Ecology and Hydrology, Lancaster Environment Centre, Library Avenue, Bailrigg, Lancaster LAI 4AP (United Kingdom); Kashparov, V A [Ukrainian Institute of Agricultural Radiology (UIAR), Mashinostroiteley Street 7, Chabany, Kiev Region 08162 (Ukraine); Sanzharova, N I [Russian Institute of Agricultural Radiology and Agroecology, 249020 Obninsk (Russian Federation); Panov, A V [Russian Institute of Agricultural Radiology and Agroecology, 249020 Obninsk (Russian Federation); Voigt, G [International Atomic Energy Agency, 1400 Vienna (Austria); Zhuchenka, Yu M [Research Institute of Radiology, 246000 Gomel (Belarus)

    2006-12-15

    The accident at the Chernobyl NPP (nuclear power plant) was the most serious ever to have occurred in the history of nuclear energy. The consumption of contaminated foodstuffs in affected areas was a significant source of irradiation for the population. A wide range of different countermeasures have been used to reduce exposure of people and to mitigate the consequences of the Chernobyl accident for agriculture in affected regions in Belarus, Russia and Ukraine. This paper for the first time summarises key data on countermeasure application over twenty years for all three countries and describes key lessons learnt from this experience. (review)

  15. Twenty years of „post-communism”: the radiography of a political failure

    Directory of Open Access Journals (Sweden)

    Aurora Martin

    2011-11-01

    This empirical view of the current „state” of Romania has as its main objective to show that the contemporary economic crisis actually revealed several other escalated crises at the very heart of Romanian society. The vicious circle formed by three crises – one political, another social-institutional and, last but not least, a cultural one – has continuously weakened the state for the last twenty years. Romania will overcome the economic crisis, but the key to breaking out of the above-mentioned circle would be the real transformation of its core systems – political, institutional and educational.

  16. Twenty years' application of agricultural countermeasures following the Chernobyl accident: lessons learned

    International Nuclear Information System (INIS)

    Fesenko, S V; Alexakhin, R M; Balonov, M I; Bogdevich, I M; Howard, B J; Kashparov, V A; Sanzharova, N I; Panov, A V; Voigt, G; Zhuchenka, Yu M

    2006-01-01

    The accident at the Chernobyl NPP (nuclear power plant) was the most serious ever to have occurred in the history of nuclear energy. The consumption of contaminated foodstuffs in affected areas was a significant source of irradiation for the population. A wide range of different countermeasures have been used to reduce exposure of people and to mitigate the consequences of the Chernobyl accident for agriculture in affected regions in Belarus, Russia and Ukraine. This paper for the first time summarises key data on countermeasure application over twenty years for all three countries and describes key lessons learnt from this experience. (review)

  17. Effects of energy constraints on transportation systems. [Twenty-six papers

    Energy Technology Data Exchange (ETDEWEB)

    Mittal, R. K. [ed.

    1977-12-01

    Twenty-six papers are presented on a variety of topics including: energy and transportation facts and figures; long-range planning under energy constraints; technology assessment of alternative fuels; energy efficiency of intercity passenger and freight movement; energy efficiency of intracity passenger movement; federal role; electrification of railroads; energy impact of the electric car in an urban environment; research needs and projects in progress--federal viewpoint; research needs in transportation energy conservation--data needs; and energy intensity of various transportation modes--an overview. A separate abstract was prepared for each of the papers for inclusion in Energy Research Abstracts (ERA) and in Energy Abstracts for Policy Analysis (EAPA).

  18. Neurogenetics in Child Neurology: Redefining a Discipline in the Twenty-first Century.

    Science.gov (United States)

    Kaufmann, Walter E

    2016-12-01

    Increasing knowledge of the genetic etiology of pediatric neurologic disorders is affecting the practice of the specialty. I review here the history of pediatric neurologic disorder classification and the role of genetics in the process. I also discuss the concept of clinical neurogenetics, with its role in clinical practice, education, and research. Finally, I propose a flexible model for clinical neurogenetics in child neurology in the twenty-first century. In combination with disorder-specific clinical programs, clinical neurogenetics can become a home for complex clinical issues, a repository of genetic diagnostic advances, an educational resource, and a research engine in child neurology.

  19. Managing the twenty-first century reference department challenges and prospects

    CERN Document Server

    Katz, Linda S

    2014-01-01

    Learn the skills needed to update and manage a reference department that efficiently meets the needs of clients today and tomorrow! Managing the Twenty-First Century Reference Department: Challenges and Prospects provides librarians with the knowledge and skills they need to manage an effective reference service. Full of useful and practical ideas, this book presents successful methods for recruiting and retaining capable reference department staff and management, training new employees and adapting current services to an evolving field. Expert practitioners address the changing role of the r

  20. Twenty-five years of post-Bretton Woods experience: some lessons

    Directory of Open Access Journals (Sweden)

    H. ASKARI

    1999-03-01

    In 1971, many academic economists were predicting that the Bretton Woods system of fixed parities would collapse. Some, most notably Milton Friedman, were excited about the possibility of a floating system, because the benefits of international capital mobility can only be achieved through flexibility in the exchange rate. These economists argued that a floating exchange rate system could deliver better outcomes than the fixed-parities system. Twenty-five years later, however, there is still no consensus on the matter. The author reviews the post-Bretton Woods experience to highlight some policies and approaches that might be helpful for the future.

  1. Lines of Descent Under Selection

    Science.gov (United States)

    Baake, Ellen; Wakolbinger, Anton

    2017-11-01

    We review recent progress on ancestral processes related to mutation-selection models, both in the deterministic and the stochastic setting. We mainly rely on two concepts, namely, the killed ancestral selection graph and the pruned lookdown ancestral selection graph. The killed ancestral selection graph gives a representation of the type of a random individual from a stationary population, based upon the individual's potential ancestry back until the mutations that define the individual's type. The pruned lookdown ancestral selection graph allows one to trace the ancestry of individuals from a stationary distribution back into the distant past, thus leading to the stationary distribution of ancestral types. We illustrate the results by applying them to a prototype model for the error threshold phenomenon.

  2. Benchmarking Variable Selection in QSAR.

    Science.gov (United States)

    Eklund, Martin; Norinder, Ulf; Boyer, Scott; Carlsson, Lars

    2012-02-01

    Variable selection is important in QSAR modeling since it can improve model performance and transparency, as well as reduce the computational cost of model fitting and predictions. Which variable selection methods perform well in QSAR settings is largely unknown. To address this question, in a total of 1728 benchmarking experiments, we rigorously investigated how eight variable selection methods affect the predictive performance and transparency of random forest models fitted to seven QSAR datasets covering different endpoints, descriptor sets, types of response variables, and numbers of chemical compounds. The results show that univariate variable selection methods are suboptimal and that the number of variables in the benchmarked datasets can be reduced by about 60% without significant loss in model performance when using multivariate adaptive regression splines (MARS) and forward selection. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
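    As a concrete illustration of the forward-selection approach the abstract found competitive, the sketch below wraps a random forest in scikit-learn's SequentialFeatureSelector on a synthetic regression dataset. The dataset, model sizes and the choice of four retained descriptors are arbitrary stand-ins, not the benchmark settings of the paper.

```python
from sklearn.datasets import make_friedman1
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import SequentialFeatureSelector

# Synthetic stand-in for a QSAR dataset: 10 descriptors, of which only the
# first 5 drive the response (by make_friedman1's construction).
X, y = make_friedman1(n_samples=200, n_features=10, random_state=0)

rf = RandomForestRegressor(n_estimators=50, random_state=0)

# Greedy forward selection: add one descriptor at a time, keeping the one
# that most improves cross-validated model performance.
selector = SequentialFeatureSelector(
    rf, n_features_to_select=4, direction="forward", cv=3)
selector.fit(X, y)

kept = selector.get_support()      # boolean mask over the 10 descriptors
X_reduced = selector.transform(X)  # dataset restricted to the kept columns
```

    The same wrapper works with `direction="backward"`, and the retained mask makes the resulting model more transparent, which is the benefit the abstract emphasizes.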

  3. Generating equilateral random polygons in confinement III

    International Nuclear Information System (INIS)

    Diao, Y; Ernst, C; Montemayor, A; Ziegler, U

    2012-01-01

    In this paper we continue our earlier studies (Diao et al 2011 J. Phys. A: Math. Theor. 44 405202, Diao et al J. Phys. A: Math. Theor. 45 275203) on the generation methods of random equilateral polygons confined in a sphere. The first half of this paper is concerned with the generation of confined equilateral random walks. We show that if the selection of a vertex is uniform subject to the position of its previous vertex and the confining condition, then the distributions of the vertices are not uniform, although there exists a distribution such that if the initial vertex is selected following this distribution, then all vertices of the random walk follow this same distribution. Thus in order to generate a confined equilateral random walk, the selection of a vertex cannot be uniform subject to the position of its previous vertex and the confining condition. We provide a simple algorithm capable of generating confined equilateral random walks whose vertex distribution is almost uniform in the confinement sphere. In the second half of this paper we show that any process generating confined equilateral random walks can be turned into a process generating confined equilateral random polygons with the property that the vertex distribution of the polygons approaches the vertex distribution of the walks as the polygons get longer and longer. In our earlier studies, the starting point of the confined polygon is fixed at the center of the sphere. The new approach here allows us to move the starting point of the confined polygon off the center of the sphere. (paper)
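    The "uniform subject to the position of the previous vertex and the confining condition" construction discussed in the abstract can be sketched as a rejection sampler: propose a uniformly random unit step and resample until the new vertex stays inside the sphere. As the authors show, the resulting vertex distribution is not uniform in the confinement; the code below is meant only to make the construction concrete.

```python
import math
import random

def random_unit_vector(rng):
    """Uniform direction on the unit sphere, via normalized Gaussian draws."""
    v = (rng.gauss(0, 1), rng.gauss(0, 1), rng.gauss(0, 1))
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def confined_equilateral_walk(n_steps, radius, rng=None):
    """Unit-step random walk started at the sphere's centre.

    Each new vertex is proposed uniformly on the unit sphere around the
    current vertex and rejected until it falls inside the confining sphere
    of the given radius (which must be >= 1 for the first step to succeed).
    """
    rng = rng or random.Random(0)
    walk = [(0.0, 0.0, 0.0)]
    for _ in range(n_steps):
        x, y, z = walk[-1]
        while True:
            dx, dy, dz = random_unit_vector(rng)
            cand = (x + dx, y + dy, z + dz)
            if math.sqrt(sum(c * c for c in cand)) <= radius:
                walk.append(cand)
                break
    return walk

# A 50-step unit-step walk confined to a sphere of radius 1.5
walk = confined_equilateral_walk(50, 1.5)
```

    Closing such a walk into an equilateral polygon, and correcting the non-uniformity of the vertex distribution, are exactly the harder problems the paper addresses.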

  4. Achievements in the past twenty years and perspective outlook of crop space breeding in China

    International Nuclear Information System (INIS)

    Liu Luxiang; Guo Huijun; Zhao Linshu; Gu Jiayu; Zhao Shirong

    2007-01-01

    Space breeding is a novel and effective approach to crop mutational improvement, first developed by Chinese scientists in 1987. A national collaborative research network has been established, and significant achievements have been made during the past twenty years. More than forty new mutant varieties derived from space mutagenesis in rice, wheat, cotton, sweet pepper, tomato, sesame and alfalfa have been developed, officially released and put into production. A series of rare and useful mutant germplasms, which may enable major breakthroughs in crop grain yield and/or quality improvement, has been obtained. Good progress has been made in technique innovations for space breeding and in ground-based simulation of space environmental factors. Intellectual property protection and industrialization of space mutation techniques and mutant varieties, as well as exploration of the mechanism of space mutation induction, have also steadily advanced. In this paper, the main achievements of crop space breeding over the past twenty years are reviewed, and future development strategies for space breeding are discussed. (authors)

  5. SERVICE QUALITY DIMENSIONS IN SPECTATOR SPORT: AN ANALYSIS OF THE TWENTY-TWENTY CRICKET LEAGUE MATCHES IN SOUTH AFRICA

    Directory of Open Access Journals (Sweden)

    B. A Mokoena

    2017-01-01

    Full Text Available Since their inception, twenty-twenty (T20) cricket league matches have gained tremendous popularity and are heralded in South Africa as having the potential of increasing the spectator support base in cricket. The purpose of this study was to enhance an understanding of service quality by examining the service quality dimensions in T20 cricket events in South Africa. A convenience sample of 250 T20 cricket spectators in the Gauteng province of South Africa participated in a survey using a structured self-administered questionnaire. The factor analysis procedure resulted in the extraction of six primary dimensions, namely service personnel, game atmosphere, facility access, facility aesthetics, home team quality and opposing team characteristics. The model was tested using confirmatory factor analysis, which showed a good fit of the data to the model. Among the service quality dimensions, facility access was rated the most important dimension. The study recommends that managers use the suggested dimensional framework to measure service quality in T20 cricket. Managers can also use this framework and measurement scale as a diagnostic tool to identify strengths and weaknesses in their services, thus providing guidance for potential areas of improvement.

  6. Nephrotic syndrome induced by dibasic sodium phosphate injections for twenty-eight days in rats.

    Science.gov (United States)

    Tsuchiya, Noriko; Torii, Mikinori; Narama, Isao; Matsui, Takane

    2009-04-01

    Sprague-Dawley rats received once-daily tail-vein injections of 360 mM dibasic sodium phosphate solution at 8 mL/kg for fourteen or twenty-eight days. Clinical examination revealed persistent proteinuria from three days after the first dose, progressing to severe proteinuria from eight days onward in the phosphate-treated groups. Proteinuria persisted without remission even after a fourteen-day withdrawal in the fourteen-day dosed group. Phosphate-treated animals developed lipemia, hypercholesterolemia, anemia, higher serum fibrinogen levels, and lower serum albumin/globulin ratios on day 29. Renal weight increased significantly compared with control animals, and the kidneys appeared pale and enlarged with a rough surface. Histopathologically, glomerular changes consisted of mineralization in whole glomeruli, glomerular capillary dilatation, partial adhesion of glomerular tufts to Bowman's capsule, and mesangiolysis. Ultrastructural lesions such as an increased number of microvilli, effacement of foot processes, and thickening of the glomerular basement membrane, as well as immunocytochemical changes in podocytes, mainly decreased podoplanin-positive cells and increased desmin expression, were also conspicuous in rats treated with phosphate for twenty-eight days. Marked tubulointerstitial lesions included tubular regeneration and dilatation, protein casts, mineralization in the basement membrane, focal interstitial inflammation, and fibrosis in the cortex. These clinical and morphological changes were similar to features of human nephrotic syndrome.

  7. Over Twenty Years Of Experience In ITU TRIGA MARK-II Reactor

    International Nuclear Information System (INIS)

    Yavuz, Hasbi

    2008-01-01

    The I.T.U. TRIGA MARK-II Training and Research Reactor, rated at 250 kW steady-state and 1200 MW pulsing power, is the only research and training reactor owned and operated by a university in Turkey. The reactor has been operating since March 11, 1979, and has therefore been operating successfully for more than twenty years. Over those twenty years of operation: - The tangential beam tube was equipped with a neutron radiography facility, which consists of a divergent collimator and an exposure room; - A computerized data acquisition system was designed and installed so that all parameters of the reactor observed from the console can be monitored in both normal and pulse operations; - An electrical power calibration system was built for the thermal power calibration of the reactor; - Publications related to the I.T.U. TRIGA MARK-II Training and Research Reactor are listed in the Appendix; - Two major undesired shutdowns occurred; - The I.T.U. TRIGA MARK-II Training and Research Reactor is still in operation at the moment. (authors)

  8. Ricardian selection

    OpenAIRE

    Finicelli, Andrea; Pagano, Patrizio; Sbracia, Massimo

    2009-01-01

    We analyze the foundations of the relationship between trade and total factor productivity (TFP) in the Ricardian model. Under general assumptions about the autarky distributions of industry productivities, trade openness raises TFP. This is due to the selection effect of international competition, driven by comparative advantages, which makes "some" high- and "many" low-productivity industries exit the market. We derive a model-based measure of this effect that requires only production...

  9. The Summit of «Group of Twenty» – 2016: geopolitical assessments

    Directory of Open Access Journals (Sweden)

    O. S. Vonsovych

    2016-08-01

    Full Text Available The article investigates geopolitical assessments of the results of the Summit of the «Group of Twenty», which took place in Hangzhou from 4 to 5 September 2016. The main idea of this event was the development of an innovative, healthy, coherent and inclusive economy. Moreover, important for the geopolitical consolidation of the member countries' efforts in the context of further development of the world economy was the adoption of such documents as: Contours of the innovative growth of the «Group of Twenty», the Action plan of the «Group of Twenty» in connection with a new industrial revolution, the Hangzhou action plan, which set out a strategy for overall and confident growth, and the initiative of the «Group of Twenty» on development and cooperation in the field of the digital economy. The analysis of the Summit’s results for the world’s regions is also important from a geopolitical point of view. We can predict that the geopolitical dialogue between the EU and the US in the economy will become more vital for both sides and will contribute to meeting the more global challenges of the world economy. The Summit of the «Group of Twenty» largely determined the prospects of global economic processes and the building of a new world economic order with the introduction of innovative approaches and elements of the digital economy. Activity on the functioning of the global economy should be based on the collective efforts of all participants and take into account all the realities of the international stage. No less important is that the Summit proved the presence of leading geopolitical centers that affect not only the balance of power in the world but also the further development of key economic processes. Each of these centers sees its future not only in the global economy but in the global system as a whole, as well as its geopolitical influence in certain regions and countries. On the one hand, as for the development of

  10. Selection of autochthonous Oenococcus oeni strains according to their oenological properties and vinification results.

    Science.gov (United States)

    Ruiz, Patricia; Izquierdo, Pedro Miguel; Seseña, Susana; Palop, María Llanos

    2010-02-28

    The goal of this study was to characterize 84 Oenococcus oeni strains isolated from Tempranillo wine samples taken at cellars in Castilla-La Mancha, in order to select those showing the highest potential as oenological starter cultures. Various oenological properties were analyzed, and the ability of some of these strains to grow and to carry out malolactic fermentation (MLF) in simulated laboratory microvinifications was tested. Twenty-two strains were selected on the basis of fermentation assays, and the eight that produced the best results in the chemical analysis of the wines were chosen for further assays. None of the eight strains was able to produce biogenic amines or displayed tannase or anthocyanase activities. On the other hand, all presented activity against p-NP-beta-glucopyranoside, p-NP-alpha-glucopyranoside and p-NP-beta-xylopyranoside. Randomly Amplified Polymorphic DNA (RAPD)-PCR was used to determine the colonizing ability of the inoculated strains. The C22L9 and D13L13 strains showed the highest implantation values. On the basis of this characterization, two strains have been selected which are suitable as starter cultures for MLF of Tempranillo wine. Use of these strains will ensure that MLF proceeds successfully and allows retention of the organoleptic characteristics of wines made in Castilla-La Mancha. (c) 2009 Elsevier B.V. All rights reserved.

  11. Randomized Prediction Games for Adversarial Machine Learning.

    Science.gov (United States)

    Rota Bulo, Samuel; Biggio, Battista; Pillai, Ignazio; Pelillo, Marcello; Roli, Fabio

    In spam and malware detection, attackers exploit randomization to obfuscate malicious data and increase their chances of evading detection at test time, e.g., malware code is typically obfuscated using random strings or byte sequences to hide known exploits. Interestingly, randomization has also been proposed to improve security of learning algorithms against evasion attacks, as it results in hiding information about the classifier to the attacker. Recent work has proposed game-theoretical formulations to learn secure classifiers, by simulating different evasion attacks and modifying the classification function accordingly. However, both the classification function and the simulated data manipulations have been modeled in a deterministic manner, without accounting for any form of randomization. In this paper, we overcome this limitation by proposing a randomized prediction game, namely, a noncooperative game-theoretic formulation in which the classifier and the attacker make randomized strategy selections according to some probability distribution defined over the respective strategy set. We show that our approach allows one to improve the tradeoff between attack detection and false alarms with respect to the state-of-the-art secure classifiers, even against attacks that are different from those hypothesized during design, on application examples including handwritten digit recognition, spam, and malware detection.

  12. Random surfaces and strings

    International Nuclear Information System (INIS)

    Ambjoern, J.

    1987-08-01

    The theory of strings is the theory of random surfaces. I review the present attempts to regularize the world sheet of the string by triangulation. The corresponding statistical theory of triangulated random surfaces has a surprisingly rich structure, but the connection to conventional string theory seems non-trivial. (orig.)

  13. Derandomizing from random strings

    NARCIS (Netherlands)

    Buhrman, H.; Fortnow, L.; Koucký, M.; Loff, B.

    2010-01-01

    In this paper we show that BPP is truth-table reducible to the set of Kolmogorov random strings R(K). It was previously known that PSPACE, and hence BPP, is Turing-reducible to R(K). The earlier proof relied on the adaptivity of the Turing-reduction to find a Kolmogorov-random string of polynomial

  14. Correlated randomness and switching phenomena

    Science.gov (United States)

    Stanley, H. E.; Buldyrev, S. V.; Franzese, G.; Havlin, S.; Mallamace, F.; Kumar, P.; Plerou, V.; Preis, T.

    2010-08-01

    One challenge of biology, medicine, and economics is that the systems treated by these serious scientific disciplines have no perfect metronome in time and no perfect spatial architecture, crystalline or otherwise. Nonetheless, as if by magic, out of nothing but randomness one finds remarkably fine-tuned processes in time and remarkably fine-tuned structures in space. Further, many of these processes and structures have the remarkable feature of “switching” from one behavior to another as if by magic. The past century has, philosophically, been concerned with placing aside the human tendency to see the universe as a fine-tuned machine. Here we will address the challenge of uncovering how, through randomness (albeit, as we shall see, strongly correlated randomness), one can arrive at some of the many spatial and temporal patterns in biology, medicine, and economics, and even begin to characterize the switching phenomena that enable a system to pass from one state to another. Inspired by principles developed by A. Nihat Berker and scores of other statistical physicists in recent years, we discuss some applications of correlated randomness to understand switching phenomena in various fields. Specifically, we present evidence from experiments and from computer simulations supporting the hypothesis that water’s anomalies are related to a switching point (which is not unlike the “tipping point” immortalized by Malcolm Gladwell), and that the bubbles in economic phenomena that occur on all scales are not “outliers” (another Gladwell immortalization). Though more speculative, we support the idea of disease as arising from some kind of yet-to-be-understood complex switching phenomenon, by discussing data on selected examples, including heart disease and Alzheimer disease.

  15. Distributional and efficiency results for subset selection

    NARCIS (Netherlands)

    Laan, van der P.

    1996-01-01

    Assume k (k ≥ 2) populations are given. The associated independent random variables have continuous distribution functions with an unknown location parameter. The statistical selection goal is to select a non-empty subset which contains the best population, that is the population with

  16. Analysis of swaps in Radix selection

    DEFF Research Database (Denmark)

    Elmasry, Amr Ahmed Abd Elmoneim; Mahmoud, Hosam

    2011-01-01

    Radix Sort is a sorting algorithm based on analyzing digital data. We study the number of swaps made by Radix Select (a one-sided version of Radix Sort) to find an element with a randomly selected rank. This kind of grand average provides a smoothing over all individual distributions for specific...
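The one-sided idea behind Radix Select can be sketched as follows. Note that the paper analyzes the in-place, swap-based variant; this illustrative version partitions into lists instead of swapping, and the 8-bit key width is an assumption.

```python
def radix_select(a, k, bit=7):
    """Return the element of rank k (0-based) in list a of non-negative
    8-bit integers.  Partition on the current bit, most significant
    first, and recurse only into the side that contains rank k — the
    one-sided analogue of Radix Sort."""
    if bit < 0 or len(a) <= 1:
        return sorted(a)[k]   # remaining keys share all inspected bits
    zeros = [x for x in a if not (x >> bit) & 1]
    ones = [x for x in a if (x >> bit) & 1]
    if k < len(zeros):
        return radix_select(zeros, k, bit - 1)
    return radix_select(ones, k - len(zeros), bit - 1)

data = [200, 3, 57, 130, 89, 7, 255, 0]
print(radix_select(data, 3))  # the 4th-smallest element
```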

  17. Selection and characterization of DNA aptamers

    NARCIS (Netherlands)

    Ruigrok, V.J.B.

    2013-01-01

    This thesis focusses on the selection and characterisation of DNA aptamers and the various aspects related to their selection from large pools of randomized oligonucleotides. Aptamers are affinity tools that can specifically recognize and bind predefined target molecules; this ability, however,

  18. Quantum random number generator

    Science.gov (United States)

    Soubusta, Jan; Haderka, Ondrej; Hendrych, Martin

    2001-03-01

    Since reflection or transmission of a quantum particle on a beamsplitter is an inherently random quantum process, a device built on this principle does not suffer from the drawbacks of either pseudo-random computer generators or classical noise sources. Nevertheless, a number of physical conditions necessary for high-quality random number generation must be satisfied. Luckily, in quantum optics realizations they can be well controlled. We present a simple random number generator based on the division of weak light pulses on a beamsplitter. The randomness of the generated bit stream is supported by passing the data through a series of 15 statistical tests. The device generates at a rate of 109.7 kbit/s.
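The principle of the generator, a weak pulse that is either transmitted or reflected at a 50:50 beamsplitter, can be imitated with a classical pseudo-random simulation (which of course lacks the quantum randomness that motivates the device). A minimal sketch with an illustrative balance check in the spirit of a statistical test battery:

```python
import random
from collections import Counter

random.seed(1)  # deterministic stand-in; the real source is quantum

def beamsplitter_bits(n, p_transmit=0.5):
    """Toy model of the generator: each weak pulse is either transmitted
    (bit 1) or reflected (bit 0) at a 50:50 beamsplitter."""
    return [1 if random.random() < p_transmit else 0 for _ in range(n)]

bits = beamsplitter_bits(100_000)

# Crude frequency (monobit) check: an unbiased source should produce
# roughly equal counts of zeros and ones.
counts = Counter(bits)
print(counts[0], counts[1])
```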

  19. Quantum random number generator

    Science.gov (United States)

    Pooser, Raphael C.

    2016-05-10

    A quantum random number generator (QRNG) and a photon generator for a QRNG are provided. The photon generator may be operated in a spontaneous mode below a lasing threshold to emit photons. Photons emitted from the photon generator may have at least one random characteristic, which may be monitored by the QRNG to generate a random number. In one embodiment, the photon generator may include a photon emitter and an amplifier coupled to the photon emitter. The amplifier may enable the photon generator to be used in the QRNG without introducing significant bias in the random number and may enable multiplexing of multiple random numbers. The amplifier may also desensitize the photon generator to fluctuations in power supplied thereto while operating in the spontaneous mode. In one embodiment, the photon emitter and amplifier may be a tapered diode amplifier.

  20. Genetic parameters for residual feed intake in a random population of Pekin duck

    Directory of Open Access Journals (Sweden)

    Yunsheng Zhang

    2017-02-01

    Full Text Available Objective The feed intake (FI) and feed efficiency are economically important traits in ducks. To gain insight into these traits, we designed an experiment based on the residual feed intake (RFI) and feed conversion ratio (FCR) of a random population of Pekin ducks. Methods Two thousand and twenty pedigreed random-population Pekin ducks were established from 90 males mated to 450 females in two hatches. Traits analyzed in the study were body weight at the 42nd day (BW42), 15- to 42-day average daily gain (ADG), 15- to 42-day FI, 15- to 42-day FCR, and 15- to 42-day RFI, to assess their genetic inter-relationships. The genetic parameters for feed efficiency traits were estimated using restricted maximum likelihood (REML) methodology applied to a sire-dam model for all traits using the ASREML software. Results Estimated heritabilities of BW42, ADG, FI, FCR, and RFI were 0.39, 0.38, 0.33, 0.38, and 0.41, respectively. The genetic correlation was high between RFI and FI (0.77) and moderate between RFI and FCR (0.54). The genetic correlations were high between FCR and ADG (−0.80), and moderate between FCR and BW42 (−0.64) and between FCR and FI (0.49). Conclusion Selection on RFI is thus expected to improve feed efficiency and reduce FI without impairing growth rate.
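Under a sire-dam model of the kind used above, the sire and dam variance components each capture roughly a quarter of the additive genetic variance, so heritability can be computed from the estimated components. The sketch below uses hypothetical variance components, not the paper's REML estimates:

```python
def h2_sire_dam(var_sire, var_dam, var_resid):
    """Heritability from a sire-dam model: sire and dam variances each
    capture one quarter of the additive genetic variance, so
    V_A = 2 * (var_sire + var_dam) and V_P = var_sire + var_dam + var_resid."""
    va = 2.0 * (var_sire + var_dam)
    vp = var_sire + var_dam + var_resid
    return va / vp

# Hypothetical variance components (not the paper's estimates):
print(round(h2_sire_dam(0.10, 0.11, 0.79), 2))  # -> 0.42
```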

  1. Five-year follow-up of harms and benefits of behavioral infant sleep intervention: randomized trial.

    Science.gov (United States)

    Price, Anna M H; Wake, Melissa; Ukoumunne, Obioha C; Hiscock, Harriet

    2012-10-01

    Randomized trials have demonstrated the short- to medium-term effectiveness of behavioral infant sleep interventions. However, concerns persist that they may harm children's emotional development and subsequent mental health. This study aimed to determine long-term harms and/or benefits of an infant behavioral sleep program at age 6 years on (1) child, (2) child-parent, and (3) maternal outcomes. Three hundred twenty-six children (173 intervention) with parent-reported sleep problems at age 7 months were selected from a population sample of 692 infants recruited from well-child centers. The study was a 5-year follow-up of a population-based cluster-randomized trial. Allocation was concealed and researchers (but not parents) were blinded to group allocation. Behavioral techniques were delivered over 1 to 3 individual nurse consultations at infant age 8 to 10 months, versus usual care. The main outcomes measured were (1) child mental health, sleep, psychosocial functioning, stress regulation; (2) child-parent relationship; and (3) maternal mental health and parenting styles. Two hundred twenty-five families (69%) participated. There was no evidence of differences between intervention and control families for any outcome, including (1) children's emotional (P = .8) and conduct behavior scores (P = .6), sleep problems (9% vs 7%, P = .2), sleep habits score (P = .4), parent- (P = .7) and child-reported (P = .8) psychosocial functioning, chronic stress (29% vs 22%, P = .4); (2) child-parent closeness (P = .1) and conflict (P = .4), global relationship (P = .9), disinhibited attachment (P = .3); and (3) parent depression, anxiety, and stress scores (P = .9) or authoritative parenting (63% vs 59%, P = .5). Behavioral sleep techniques have no marked long-lasting effects (positive or negative). Parents and health professionals can confidently use these techniques to reduce the short- to medium-term burden of infant sleep problems and maternal depression.

  2. Autonomous Byte Stream Randomizer

    Science.gov (United States)

    Paloulian, George K.; Woo, Simon S.; Chow, Edward T.

    2013-01-01

    Net-centric networking environments are often faced with limited resources and must utilize bandwidth as efficiently as possible. In networking environments that span wide areas, data transmission has to be efficient, without any redundant or exuberant metadata. The Autonomous Byte Stream Randomizer software provides an extra level of security on top of existing data encryption methods. Randomizing the data's byte stream adds an extra layer to existing data protection methods, thus making it harder for an attacker to decrypt protected data. Based on a generated cryptographically secure random seed, a random sequence of numbers is used to intelligently and efficiently swap the organization of bytes in data using the unbiased and memory-efficient in-place Fisher-Yates shuffle method. Swapping bytes and reorganizing the crucial structure of the byte data renders the data file unreadable and leaves the data in a deconstructed state. This deconstruction adds an extra level of security, requiring the byte stream to be reconstructed with the random seed in order to be readable. Once the data byte stream has been randomized, the software enables the data to be distributed to N nodes in an environment. Each piece of the data in randomized and distributed form is a separate entity, unreadable in its own right, but when combined with all N pieces can be reconstructed back into one. Reconstruction requires possession of the key used for randomizing the bytes, leading to the generation of the same cryptographically secure random sequence of numbers used to randomize the data. This software is a cornerstone capability, able to generate the same cryptographically secure sequence on different machines and at different time intervals, thus allowing it to be used more heavily in net-centric environments where data transfer bandwidth is limited.
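The shuffle-and-reconstruct scheme described above can be sketched with a seeded Fisher-Yates shuffle and its inverse. This is an illustrative stand-in, not the flight software: Python's random.Random is used in place of a cryptographically secure generator, and the message is hypothetical.

```python
import random

def shuffle_bytes(data: bytes, seed: int) -> bytes:
    """Fisher-Yates shuffle driven by a seeded PRNG.  (The software
    described above uses a cryptographically secure generator;
    random.Random is a stand-in for illustration.)"""
    buf = bytearray(data)
    rng = random.Random(seed)
    for i in range(len(buf) - 1, 0, -1):
        j = rng.randrange(i + 1)
        buf[i], buf[j] = buf[j], buf[i]
    return bytes(buf)

def unshuffle_bytes(data: bytes, seed: int) -> bytes:
    """Regenerate the same swap sequence from the seed and undo the
    swaps in reverse order to reconstruct the original byte stream."""
    buf = bytearray(data)
    rng = random.Random(seed)
    swaps = [(i, rng.randrange(i + 1)) for i in range(len(buf) - 1, 0, -1)]
    for i, j in reversed(swaps):
        buf[i], buf[j] = buf[j], buf[i]
    return bytes(buf)

msg = b"confidential telemetry"        # hypothetical payload
scrambled = shuffle_bytes(msg, seed=42)
assert unshuffle_bytes(scrambled, seed=42) == msg
```

Without the seed, an attacker holding the scrambled stream (or any single distributed piece) sees only a permutation of the bytes; the seed replays the identical swap sequence on any machine, which is what allows reconstruction after distribution to N nodes.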

  3. The frequency of drugs in randomly selected drivers in Denmark

    DEFF Research Database (Denmark)

    Simonsen, Kirsten Wiese; Steentoft, Anni; Hels, Tove

    Introduction: Driving under the influence of alcohol and drugs is a global problem. In Denmark, as well as in other countries, there is an increasing focus on impaired driving. Little is known about the occurrence of psychoactive drugs in the general traffic. Therefore the European commission… is the Danish legal limit. The percentage of drivers positive for medicinal drugs above the Danish legal concentration limit was 0.4%, while 0.3% of the drivers tested positive for one or more illicit drugs at concentrations exceeding the Danish legal limit. Tetrahydrocannabinol, cocaine, and amphetamine were the most frequent illicit drugs detected above the limit of quantitation (LOQ), while codeine, tramadol, zopiclone, and benzodiazepines were the most frequent legal drugs. Middle-aged men (median age 47.5 years) dominated the drunk-driving group, while the drivers positive for illegal drugs consisted…

  4. Model Selection with the Linear Mixed Model for Longitudinal Data

    Science.gov (United States)

    Ryoo, Ji Hoon

    2011-01-01

    Model building or model selection with linear mixed models (LMMs) is complicated by the presence of both fixed effects and random effects. The fixed effects structure and random effects structure are codependent, so selection of one influences the other. Most presentations of LMM in psychology and education are based on a multilevel or…

  5. Conversion of the random amplified polymorphic DNA (RAPD ...

    African Journals Online (AJOL)

    Conversion of the random amplified polymorphic DNA (RAPD) marker UBC#116 linked to Fusarium crown and root rot resistance gene (Frl) into a co-dominant sequence characterized amplified region (SCAR) marker for marker-assisted selection of tomato.

  6. Randomized phase II study of paclitaxel/carboplatin intercalated with gefitinib compared to paclitaxel/carboplatin alone for chemotherapy-naïve non-small cell lung cancer in a clinically selected population excluding patients with non-smoking adenocarcinoma or mutated EGFR

    International Nuclear Information System (INIS)

    Choi, Yoon Ji; Lee, Dae Ho; Choi, Chang Min; Lee, Jung Shin; Lee, Seung Jin; Ahn, Jin-Hee; Kim, Sang-We

    2015-01-01

    Considering cell-cycle-dependent cytotoxicity, intercalation of chemotherapy and an epidermal growth factor receptor (EGFR) tyrosine kinase inhibitor (TKI) may be a treatment option in non-small cell lung cancer (NSCLC). This randomized phase 2 study compared the efficacy of paclitaxel and carboplatin (PC) intercalated with gefitinib (G) versus PC alone in a selected, chemotherapy-naïve population of advanced NSCLC patients with a history of smoking or wild-type EGFR. Eligible patients were chemotherapy-naïve advanced NSCLC patients with an Eastern Cooperative Oncology Group performance status of 0 to 2. Non-smoking patients with adenocarcinoma and patients with an activating EGFR mutation were excluded because they could benefit from gefitinib alone. Eligible patients were randomized to one of the following treatment arms: PCG, P 175 mg/m2 and C AUC 5 administered intravenously on day 1, intercalated with G 250 mg orally on days 2 through 15 every 3 weeks for four cycles, followed by G 250 mg orally until progressive disease; or PC, the same dosing schedule for four cycles only. The primary endpoint was the objective response rate (ORR), and the secondary endpoints included progression-free survival (PFS), overall survival (OS), and toxicity profile. A total of 90 patients participated in the study. The ORRs were 41.9% (95% confidence interval (CI) 27.0–57.9%) for the PCG arm and 39.5% (95% CI 25.0–55.6%) for the PC arm (P = 0.826). No differences in PFS (4.1 vs. 4.1 months, P = 0.781) or OS (9.3 vs. 10.5 months, P = 0.827) were observed between the PCG and PC arms. Safety analyses showed a similar incidence of drug-related grade 3/4 toxicity. Rash and pruritus were more frequent in the PCG than in the PC arm. PCG did not improve ORR, PFS, or OS compared to PC chemotherapy alone for NSCLC in a clinically selected population excluding non-smoking adenocarcinoma or mutated EGFR. The study is registered with ClinicalTrials.gov (NCT01196234). Registration date is 08/09/2010

  7. Lymphoid irradiation in intractable rheumatoid arthritis. A double-blind, randomized study comparing 750-rad treatment with 2,000-rad treatment

    International Nuclear Information System (INIS)

    Hanly, J.G.; Hassan, J.; Moriarty, M.; Barry, C.; Molony, J.; Casey, E.; Whelan, A.; Feighery, C.; Bresnihan, B.

    1986-01-01

    Twenty patients with intractable rheumatoid arthritis were treated with 750-rad or 2,000-rad lymphoid irradiation in a randomized double-blind comparative study. Over a 12-month followup period, there was a significant improvement in 4 of 7 and 6 of 7 standard parameters of disease activity following treatment with 750 rads and 2,000 rads, respectively. Transient, short-term toxicity was less frequent with the lower dose. In both groups, there was a sustained peripheral blood lymphopenia, a selective depletion of T helper (Leu-3a+) lymphocytes, and reduced in vitro mitogen responses. These changes did not occur, however, in synovial fluid. These results suggest that 750-rad lymphoid irradiation is as effective as, but less toxic than, that with 2,000 rads in the management of patients with intractable rheumatoid arthritis

  8. Selective Europeanization

    DEFF Research Database (Denmark)

    Hoch Jovanovic, Tamara; Lynggaard, Kennet

    2014-01-01

    … and rules. The article examines the reasons for both resistance and selectiveness to Europeanization of the Danish minority policy through a “path dependency” perspective accentuating decision makers’ reluctance to deviate from existing institutional commitments, even in subsequently significantly altered… political contexts at the European level. We further show how the “translation” of international norms to a domestic context has worked to reinforce the original institutional setup, dating back to the mid-1950s. The translation of European-level minority policy developed in the 1990s and 2000s works most…

  9. Selective Reproduction

    DEFF Research Database (Denmark)

    Svendsen, Mette N.

    2015-01-01

    This article employs a multi-species perspective in investigating how life's worth is negotiated in the field of neonatology in Denmark. It does so by comparing decision-making processes about human infants in the Danish neonatal intensive care unit with those associated with piglets who serve as… as expectations within linear or predictive time frames are key markers in both sites. Exploring selective reproductive processes across human infants and research piglets can help us uncover aspects of the cultural production of viability that we would not otherwise see or acknowledge…

  10. Developmental memory capacity resources of typical children retrieving picture communication symbols using direct selection and visual linear scanning with fixed communication displays.

    Science.gov (United States)

    Wagner, Barry T; Jackson, Heather M

    2006-02-01

    This study examined the cognitive demands of two selection techniques in augmentative and alternative communication (AAC), direct selection and visual linear scanning, by determining the memory retrieval abilities of typically developing children when presented with fixed communication displays. One hundred twenty typical children from kindergarten, 1st, and 3rd grades were randomly assigned to either a direct selection or a visual linear scanning group. Memory retrieval was assessed through word span using Picture Communication Symbols (PCSs). Participants were presented with various numbers and arrays of PCSs and asked to retrieve them by placing identical graphic symbols on fixed communication displays with grid layouts. The results revealed that participants were able to retrieve more PCSs during direct selection than during scanning. Additionally, 3rd-grade children retrieved more PCSs than kindergarten and 1st-grade children. An analysis of the types of errors during retrieval indicated that children were more successful at retrieving the correct PCSs than the designated locations of those symbols on fixed communication displays. AAC practitioners should consider using direct selection over scanning whenever possible and account for anticipatory monitoring and pulses when scanning is used in the service delivery of children with little or no functional speech. Also, researchers should continue to investigate AAC selection techniques in relationship to working memory resources.

  11. Agriculture in West Africa in the Twenty-First Century: Climate Change and Impacts Scenarios, and Potential for Adaptation

    Science.gov (United States)

    Sultan, Benjamin; Gaetani, Marco

    2016-01-01

West Africa is known to be particularly vulnerable to climate change due to high climate variability, high reliance on rain-fed agriculture, and limited economic and institutional capacity to respond to climate variability and change. In this context, better knowledge of how the climate will change in West Africa and how such changes will impact crop productivity is crucial to inform policies that may counteract the adverse effects. This review paper provides a comprehensive overview of climate change impacts on agriculture in West Africa based on the recent scientific literature. West Africa is currently experiencing rapid climate change, characterized by widespread warming, a recovery of the monsoonal precipitation, and an increase in the occurrence of climate extremes. The observed climate tendencies are also projected to continue in the twenty-first century under moderate and high emission scenarios, although large uncertainties still affect simulations of the future West African climate, especially regarding the summer precipitation. However, despite diverging future projections of the monsoonal rainfall, which is essential for rain-fed agriculture, robust evidence of yield loss in West Africa emerges. This yield loss is driven mainly by increased mean temperature, while potentially wetter or drier conditions as well as elevated CO2 concentrations can modulate this effect. The potential for adaptation is illustrated for major crops in West Africa through a selection of studies that use process-based crop models to adjust cropping systems (changes in varieties, sowing dates and density, irrigation, and fertilizer management) to the future climate. The results of the cited studies are crop- and region-specific, and no clear conclusions can be drawn regarding the most effective adaptation options. Further efforts are needed to improve modeling of the monsoon system and to better quantify the uncertainty in its changes under a warmer climate, in the response of the crops to such

  12. EDITORIAL: Nanotechnological selection Nanotechnological selection

    Science.gov (United States)

    Demming, Anna

    2013-01-01

At the nanoscale, measures can move from mass-scale analogue calibration to counts of discrete units. This shift redefines the levels of control that can be achieved in a system if adequate selectivity can be imposed. As an example, as ionic substances pass through nanoscale pores, the quantity of ions is low enough that the pore can contain either negative or positive ions. Yet precise control over this selectivity still raises difficulties. In this issue researchers address the challenge of how to regulate the ionic selectivity of negative and positive charges with the use of an external charge. The approach may be useful for controlling the behaviour, properties and chemical composition of liquids and has possible technical applications for nanofluidic field effect transistors [1]. Selectivity is a critical advantage in the administration of drugs. Nanoparticles functionalized with targeting moieties can allow delivery of anti-cancer drugs to tumour cells, whilst avoiding healthy cells and hence reducing some of the debilitating side effects of cancer treatments [2]. Researchers in Belarus and the US developed a new theranostic approach, combining therapy and diagnosis, to support the evident benefits of cellular selectivity that can be achieved when nanoparticles are applied in medicine [3]. Their process uses nanobubbles of photothermal vapour, referred to as plasmonic nanobubbles, generated by plasmonic excitations in gold nanoparticles conjugated to diagnosis-specific antibodies. The intracellular plasmonic nanobubbles are controlled by laser fluence so that the response can be tuned in individual living cells. Lower fluence allows non-invasive, highly sensitive imaging for diagnosis, and higher fluence can disrupt the cellular membrane for treatment. The selective response of carbon nanotubes to different gases has lent them to use within various types of sensors, as summarized in a review by researchers at the University of

  13. Does epicatechin contribute to the acute vascular function effects of dark chocolate? A randomized, crossover study

    NARCIS (Netherlands)

    Dower, James I.; Geleijnse, Marianne; Kroon, Paul A.; Philo, Mark; Mensink, Marco; Kromhout, Daan; Hollman, Peter C.H.

    2016-01-01

    Scope: Cocoa, rich in flavan-3-ols, improves vascular function, but the contribution of specific flavan-3-ols is unknown. We compared the effects of pure epicatechin, a major cocoa flavan-3-ol, and chocolate. Methods and results: In a randomized crossover study, twenty healthy men (40-80 years)

  14. EMDR versus CBT for children with self-esteem and behavioral problems: a randomized controlled trial

    NARCIS (Netherlands)

    Wanders, F.; Serra, M.; de Jongh, A.

    2008-01-01

    This study compared eye movement desensitization and reprocessing (EMDR) with cognitive-behavioral therapy (CBT). Twenty-six children (average age 10.4 years) with behavioral problems were randomly assigned to receive either 4 sessions of EMDR or CBT prior to usual treatment provided in outpatient

  15. Chemical, Bioactive, and Antioxidant Potential of Twenty Wild Culinary Mushroom Species

    Science.gov (United States)

    Sharma, S. K.; Gautam, N.

    2015-01-01

    The chemical, bioactive, and antioxidant potential of twenty wild culinary mushroom species being consumed by the people of northern Himalayan regions has been evaluated for the first time in the present study. Nutrients analyzed include protein, crude fat, fibres, carbohydrates, and monosaccharides. Besides, preliminary study on the detection of toxic compounds was done on these species. Bioactive compounds evaluated are fatty acids, amino acids, tocopherol content, carotenoids (β-carotene, lycopene), flavonoids, ascorbic acid, and anthocyanidins. Fruitbodies extract of all the species was tested for different types of antioxidant assays. Although differences were observed in the net values of individual species all the species were found to be rich in protein, and carbohydrates and low in fat. Glucose was found to be the major monosaccharide. Predominance of UFA (65–70%) over SFA (30–35%) was observed in all the species with considerable amounts of other bioactive compounds. All the species showed higher effectiveness for antioxidant capacities. PMID:26199938

  16. Chemical, Bioactive, and Antioxidant Potential of Twenty Wild Culinary Mushroom Species

    Directory of Open Access Journals (Sweden)

    S. K. Sharma

    2015-01-01

Full Text Available The chemical, bioactive, and antioxidant potential of twenty wild culinary mushroom species being consumed by the people of northern Himalayan regions has been evaluated for the first time in the present study. Nutrients analyzed include protein, crude fat, fibres, carbohydrates, and monosaccharides. Besides, preliminary study on the detection of toxic compounds was done on these species. Bioactive compounds evaluated are fatty acids, amino acids, tocopherol content, carotenoids (β-carotene, lycopene), flavonoids, ascorbic acid, and anthocyanidins. Fruitbodies extract of all the species was tested for different types of antioxidant assays. Although differences were observed in the net values of individual species all the species were found to be rich in protein, and carbohydrates and low in fat. Glucose was found to be the major monosaccharide. Predominance of UFA (65–70%) over SFA (30–35%) was observed in all the species with considerable amounts of other bioactive compounds. All the species showed higher effectiveness for antioxidant capacities.

  17. Report of the twenty-first session, London, 18-22 February 1991

    International Nuclear Information System (INIS)

    1991-01-01

    The Joint Group of Experts on the Scientific Aspects of Marine Pollution (GESAMP) held its twenty-first session at the Headquarters of the International Maritime Organization (IMO), London, from 18 to 22 February 1991. Marine pollution is primarily linked to coastal development. The most serious problems are those associated with inadequately controlled coastal development and intensive human settlement of the coastal zone. GESAMP emphasizes the importance of the following problems and issues: State of the marine environment; comprehensive framework for the assessment and regulation of waste disposal in the marine environment; information on preparations for the United Nations Conference on Environment and Development; review of potentially harmful substances: 1. Carcinogenic substances. 2. Mutagenic substances. 3. Teratogenic substances. 4. Organochlorine compounds. 5. Oil, and other hydrocarbons including used lubricating oils, oil spill dispersants and chemicals used in offshore oil exploration and exploitation; environmental impacts of coastal aquaculture; global change and the air/sea exchange of chemicals; future work programme

  18. Chemical, Bioactive, and Antioxidant Potential of Twenty Wild Culinary Mushroom Species.

    Science.gov (United States)

    Sharma, S K; Gautam, N

    2015-01-01

    The chemical, bioactive, and antioxidant potential of twenty wild culinary mushroom species being consumed by the people of northern Himalayan regions has been evaluated for the first time in the present study. Nutrients analyzed include protein, crude fat, fibres, carbohydrates, and monosaccharides. Besides, preliminary study on the detection of toxic compounds was done on these species. Bioactive compounds evaluated are fatty acids, amino acids, tocopherol content, carotenoids (β-carotene, lycopene), flavonoids, ascorbic acid, and anthocyanidins. Fruitbodies extract of all the species was tested for different types of antioxidant assays. Although differences were observed in the net values of individual species all the species were found to be rich in protein, and carbohydrates and low in fat. Glucose was found to be the major monosaccharide. Predominance of UFA (65-70%) over SFA (30-35%) was observed in all the species with considerable amounts of other bioactive compounds. All the species showed higher effectiveness for antioxidant capacities.

  19. Twenty-five years of the common market in coal, 1953--1978. [genesis and growth

    Energy Technology Data Exchange (ETDEWEB)

    1978-01-01

    Twenty-five years have passed since the European Coal and Steel commmunity was established. An attempt is made to show what economic integration is, what problems have arisen, and how the community has tried to overcome them. Three phases can be distinguished during the period under review--a first phase of growth in the coal industry between 1953 and 1957; a second phase marked by a plentiful supply of cheap hydrocarbons and a rapid reduction in coal output despite exceptional growth, linked with a parallel increase in overall energy requirements; and a third phase from 1973, marked by sharp price increases by the oil producing countries with repercussions on the world market in coal.

  20. A Farewell to Innocence? African Youth and Violence in the Twenty-First Century

    Directory of Open Access Journals (Sweden)

    Charles Ugochukwu Ukeje

    2012-12-01

Full Text Available This is a broad examination of the issue of youth violence in twenty-first-century Africa, looking at the context within which a youth culture of violence has evolved and attempting to understand the underlying discourses of hegemony and power that drive it. The article focuses specifically on youth violence as a political response to the dynamics of (dis)empowerment, exclusion, and economic crisis and uses (post)conflict states like Liberia, Sierra Leone, and Nigeria to explain not just the overall challenge of youth violence but also the nature of responses that it has elicited from established structures of authority. Youth violence is in many ways an expression of youth agency in the context of a social and economic system that provides little opportunity.

  1. Ecological restoration should be redefined for the twenty-first century.

    Science.gov (United States)

    Martin, David M

    2017-09-24

    Forty years ago, ecological restoration was conceptualized through a natural science lens. Today, ecological restoration has evolved into a social and scientific concept. The duality of ecological restoration is acknowledged in guidance documents on the subject but is not apparent in its definition. Current definitions reflect our views about what ecological restoration does but not why we do it. This viewpoint does not give appropriate credit to contributions from social sciences, nor does it provide compelling goals for people with different motivating rationales to engage in or support restoration. In this study, I give a concise history of the conceptualization and definition of ecological restoration, and I propose an alternative definition and corresponding viewpoint on restoration goal-setting to meet twenty-first century scientific and public inquiry.

  2. Global threats from invasive alien species in the twenty-first century and national response capacities

    Science.gov (United States)

    Early, Regan; Bradley, Bethany A.; Dukes, Jeffrey S.; Lawler, Joshua J.; Olden, Julian D.; Blumenthal, Dana M.; Gonzalez, Patrick; Grosholz, Edwin D.; Ibañez, Ines; Miller, Luke P.; Sorte, Cascade J. B.; Tatem, Andrew J.

    2016-01-01

    Invasive alien species (IAS) threaten human livelihoods and biodiversity globally. Increasing globalization facilitates IAS arrival, and environmental changes, including climate change, facilitate IAS establishment. Here we provide the first global, spatial analysis of the terrestrial threat from IAS in light of twenty-first century globalization and environmental change, and evaluate national capacities to prevent and manage species invasions. We find that one-sixth of the global land surface is highly vulnerable to invasion, including substantial areas in developing economies and biodiversity hotspots. The dominant invasion vectors differ between high-income countries (imports, particularly of plants and pets) and low-income countries (air travel). Uniting data on the causes of introduction and establishment can improve early-warning and eradication schemes. Most countries have limited capacity to act against invasions. In particular, we reveal a clear need for proactive invasion strategies in areas with high poverty levels, high biodiversity and low historical levels of invasion. PMID:27549569

  3. Niels Bohr and the philosophy of physics twenty-first century perspectives

    CERN Document Server

    Folse, Henry

    2017-01-01

    Niels Bohr and Philosophy of Physics: Twenty-First Century Perspectives examines the philosophical views, influences and legacy of the Nobel Prize physicist and philosophical spokesman of the quantum revolution, Niels Bohr. The sixteen contributions in this collection by some of the best contemporary philosophers and physicists writing on Bohr's philosophy today all carefully distinguish his subtle and unique interpretation of quantum mechanics from views often imputed to him under the banner of the “Copenhagen Interpretation.” With respect to philosophical influences on Bohr's outlook, the contributors analyse prominent similarities between his viewpoint and Kantian ways of thinking, the views of the Danish philosopher Harald Høffding, and themes characteristic of American pragmatism. In recognizing the importance of Bohr's epistemological naturalism they examine his defence of the indispensability of classical concepts from a variety of different perspectives. This collection shows us that Bohr's int...

  4. Golf science research at the beginning of the twenty-first century.

    Science.gov (United States)

    Farrally, M R; Cochran, A J; Crews, D J; Hurdzan, M J; Price, R J; Snow, J T; Thomas, P R

    2003-09-01

At the beginning of the twenty-first century, there are 30,000 golf courses and 55 million people who play golf worldwide. In the USA alone, the value of golf club memberships sold in the 1990s was US$3.2 billion. Underpinning this significant human activity is a wide variety of people researching and applying science to sustain and develop the game. The 11 golf science disciplines recognized by the World Scientific Congress of Golf have reported 311 papers at four world congresses since 1990. Additionally, scientific papers have been published in discipline-specific peer-reviewed journals, research has been sponsored by the two governing bodies of golf, the Royal and Ancient Golf Club of St. Andrews and the United States Golf Association, and confidential research is undertaken by commercial companies, especially equipment manufacturers. This paper reviews much of this human endeavour and points the way forward for future research into golf.

  5. Civil Rights Laws as Tools to Advance Health in the Twenty-First Century.

    Science.gov (United States)

    McGowan, Angela K; Lee, Mary M; Meneses, Cristina M; Perkins, Jane; Youdelman, Mara

    2016-01-01

    To improve health in the twenty-first century, to promote both access to and quality of health care services and delivery, and to address significant health disparities, legal and policy approaches, specifically those focused on civil rights, could be used more intentionally and strategically. This review describes how civil rights laws, and their implementation and enforcement, help to encourage health in the United States, and it provides examples for peers around the world. The review uses a broad lens to define health for both classes of individuals and their communities--places where people live, learn, work, and play. Suggestions are offered for improving health and equity broadly, especially within societal groups and marginalized populations. These recommendations include multisectorial approaches that focus on the social determinants of health.

  6. Twenty years of operation of WWER 440/230 units in Jaslovske Bohunice

    International Nuclear Information System (INIS)

    Tomek, J.

    1998-01-01

This year marks twenty years since the first WWER 440 unit of the Slovak nuclear power plant at Jaslovske Bohunice was commissioned. Four WWER 440 units are in operation at the Jaslovske Bohunice site: the first two units of the older Soviet PWR design V-230 (also known as V-1) and two further units of the newer V-213 type (also known as V-2). The goal of this presentation is to summarize and evaluate the operation of Units 1 and 2 over this period and, mainly, to describe what has been done and what is planned to increase the nuclear safety and operational reliability of both units. The operating organization and the regulatory authority assume that an internationally acceptable level of safety will be reached by accomplishing the upgrading program. (author)

  7. Twenty novel polymorphic microsatellite primers in the critically endangered Melastoma tetramerum var. tetramerum (Melastomataceae).

    Science.gov (United States)

    Narita, Ayu; Izuno, Ayako; Komaki, Yoshiteru; Tanaka, Takefumi; Murata, Jin; Isagi, Yuji

    2016-09-01

    Microsatellite markers were identified for Melastoma tetramerum var. tetramerum (Melastomataceae), a critically endangered shrub endemic to the Bonin Islands, to reveal genetic characteristics in wild and restored populations. Using next-generation sequencing, 27 microsatellite markers were identified. Twenty of these markers were polymorphic in M. tetramerum var. tetramerum, with two to nine alleles per locus and expected heterozygosity ranging from 0.10 to 0.71. Among the 20 polymorphic markers, 15 were applicable to other closely related taxa, namely M. tetramerum var. pentapetalum, M. candidum var. candidum, and M. candidum var. alessandrense. These markers can be potentially useful to investigate the genetic diversity, population genetic structure, and reproductive ecology of M. tetramerum var. tetramerum as well as of the three related taxa to provide appropriate genetic information for conservation.
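    The per-locus expected heterozygosity reported above is a simple function of allele frequencies: He = 1 - sum(p_i^2). A minimal sketch, using invented allele counts rather than data from the study:

    ```python
    # Expected heterozygosity: He = 1 - sum(p_i^2) over allele frequencies p_i,
    # the per-locus statistic quoted in the abstract above. The allele counts
    # below are hypothetical, for illustration only.
    def expected_heterozygosity(allele_counts):
        total = sum(allele_counts)
        return 1.0 - sum((c / total) ** 2 for c in allele_counts)

    # A nearly fixed locus (two alleles at 90/10) is weakly informative,
    # while four equally common alleles give a highly polymorphic locus.
    print(round(expected_heterozygosity([90, 10]), 2))          # 0.18
    print(round(expected_heterozygosity([25, 25, 25, 25]), 2))  # 0.75
    ```

    Values in roughly this range (0.10 to 0.71 in the study) are what make a marker useful for population genetic inference.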

  8. TWENTIES Project. Wind power for wide-area control of the grid

    Energy Technology Data Exchange (ETDEWEB)

    Perez, Juan Carlos; Combarros, Clara; Veguillas, Roberto; Hermosa, Mikel Joseba [Iberdrola Renovables, Madrid (Spain); Rubio, David [Iberdrola Ingenieria y Construccion (Spain); Egido, Ignacio [Comillas Univ. (ES). Inst. de Investigacion Tecnologica (IIT)

    2011-07-01

Europe faces a great challenge with the 2020 scenario, in which the renewable energy installed capacity in Europe should increase from its present value of approximately 80 GW to 230 GW in 2020. The future high penetration levels of wind and other renewable energies in the power system require decision makers and stakeholders of the electrical sector to work together to develop new ancillary services and to make the necessary changes to the grid infrastructure in Europe. This background is in line with the SYSERWIND demonstration led by Iberdrola Renovables and included in the TWENTIES project, with three more partners taking part in this package: Red Electrica de Espana (REE), IIT and Gamesa Eolica. This paper introduces a first phase of preliminary work to define, install and test a Secondary Frequency Control and a Voltage Management System in a wide area, along a transport line. (orig.)

  9. Uncertainty in Twenty-First-Century CMIP5 Sea Level Projections

    Science.gov (United States)

    Little, Christopher M.; Horton, Radley M.; Kopp, Robert E.; Oppenheimer, Michael; Yip, Stan

    2015-01-01

    The representative concentration pathway (RCP) simulations included in phase 5 of the Coupled Model Intercomparison Project (CMIP5) quantify the response of the climate system to different natural and anthropogenic forcing scenarios. These simulations differ because of 1) forcing, 2) the representation of the climate system in atmosphere-ocean general circulation models (AOGCMs), and 3) the presence of unforced (internal) variability. Global and local sea level rise projections derived from these simulations, and the emergence of distinct responses to the four RCPs depend on the relative magnitude of these sources of uncertainty at different lead times. Here, the uncertainty in CMIP5 projections of sea level is partitioned at global and local scales, using a 164-member ensemble of twenty-first-century simulations. Local projections at New York City (NYSL) are highlighted. The partition between model uncertainty, scenario uncertainty, and internal variability in global mean sea level (GMSL) is qualitatively consistent with that of surface air temperature, with model uncertainty dominant for most of the twenty-first century. Locally, model uncertainty is dominant through 2100, with maxima in the North Atlantic and the Arctic Ocean. The model spread is driven largely by 4 of the 16 AOGCMs in the ensemble; these models exhibit outlying behavior in all RCPs and in both GMSL and NYSL. The magnitude of internal variability varies widely by location and across models, leading to differences of several decades in the local emergence of RCPs. The AOGCM spread, and its sensitivity to model exclusion and/or weighting, has important implications for sea level assessments, especially if a local risk management approach is utilized.
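    The partition of projection spread into model uncertainty, scenario uncertainty, and internal variability can be sketched as a variance decomposition over a model-by-scenario-by-run ensemble. Everything below (ensemble sizes, offsets, noise levels) is synthetic and invented for illustration; it is not CMIP5 output.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_models, n_scenarios, n_runs = 16, 4, 5  # loosely mirrors a 16-AOGCM, 4-RCP ensemble

    # Synthetic sea level anomalies: a shared signal plus a per-model offset
    # (model uncertainty), a per-scenario trend (scenario uncertainty), and
    # run-to-run noise (internal variability).
    model_offset = rng.normal(0.0, 0.10, size=(n_models, 1, 1))
    scenario_offset = np.linspace(0.0, 0.3, n_scenarios).reshape(1, n_scenarios, 1)
    noise = rng.normal(0.0, 0.03, size=(n_models, n_scenarios, n_runs))
    proj = 0.5 + model_offset + scenario_offset + noise

    # Decompose: variance across models of the model means, variance across
    # scenarios of the scenario means, and mean within-cell variance across runs.
    var_model = proj.mean(axis=(1, 2)).var()
    var_scenario = proj.mean(axis=(0, 2)).var()
    var_internal = proj.var(axis=2).mean()
    total = var_model + var_scenario + var_internal
    for name, v in [("model", var_model), ("scenario", var_scenario), ("internal", var_internal)]:
        print(f"{name}: {v / total:.0%}")
    ```

    The relative sizes of the three terms, and how they shift with lead time and location, drive the conclusions above about when distinct RCP responses emerge.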

  10. Twenty-first century learning after school: the case of Junior Achievement Worldwide.

    Science.gov (United States)

    Box, John M

    2006-01-01

    Efforts to increase after-school programming indicate the nation's concern about how youth are engaged during out-of-school time. There are clear benefits to extending the learning that goes on during the school day. Research from the U.S. Departments of Education and Justice shows that after-school participants do better in school and have stronger expectations for the future than youth who are not occupied after school. And the need is evident: 14.3 million students return to an empty house after school, yet only 6.5 million children are currently enrolled in after-school programs. If an after-school program were available, parents of 15.3 million would enroll their child. JA Worldwide began in 1919 and has been rooted in the afterschool arena from its origins. Its after-school programs teach students about the free enterprise system through curriculum focusing on business, citizenship, economics, entrepreneurship, ethics and character, financial literacy, and career development. At the same time, JA Worldwide incorporates hands-on learning and engagement with adults as role models, both key elements to a successful after-school program. Now focused on developing curriculum emphasizing skills needed for the twenty-first century, JA adopted the key elements laid out for after-school programs by the Partnership for 21st Century Skills. To ensure that the next generation of students enters the workforce prepared, America's education system must provide the required knowledge, skills, and attitudes. Programs such as JA Worldwide serve as models of how to provide the twenty-first century skills that all students need to succeed.

  11. Parieto-occipital encephalomalacia in children; clinical and electrophysiological features of twenty-seven cases.

    Science.gov (United States)

    Karaoğlu, Pakize; Polat, Ayşe İpek; Yiş, Uluç; Hız, Semra

    2015-01-01

Brain injuries occurring at a particular time may cause damage in well-defined regions of the brain. Perinatal hypoxic-ischemic encephalopathy and hypoglycemia are among the most common causes of such injuries. Neonatal hypoglycemia can cause abnormal myelination in the parietal and occipital lobes, resulting in parieto-occipital encephalomalacia. There are only a few studies of the clinical and electroencephalographic (EEG) features of children with parieto-occipital encephalomalacia. These children might have important neurologic sequelae such as cortical visual loss, seizures, and psychomotor retardation. We aimed to evaluate the causes of parieto-occipital encephalomalacia and the clinical and electrophysiological features of affected children. We evaluated the clinical features and EEGs of 27 children with parieto-occipital encephalomalacia. Descriptive statistics were used. Hospitalization during the neonatal period was the most common antecedent (88.9%) of parieto-occipital brain injury. Eleven patients (40.7%) had a history of neonatal hypoglycemia. Twenty-three patients (85.2%) had epilepsy, and nine of the epileptic patients (39%) had refractory seizures. Most patients had bilateral (50%) epileptic discharges originating from the temporal, parietal, and occipital lobes (56.2%). However, some patients had frontal sharp waves and some had continuous spike-and-wave discharges during sleep. Visual abnormalities were evident in 15 (55.6%) patients. Twenty-two (81.5%) had psychomotor retardation. Fine motor skills, social contact, and language development were impaired more than gross motor skills. In our study, most patients with parieto-occipital encephalomalacia had an eventful perinatal history. Epilepsy, psychomotor retardation, and visual problems were common neurologic complications.

  12. Twenty-first century learning in states: the case of the Massachusetts educational system.

    Science.gov (United States)

    Driscoll, David P

    2006-01-01

    A current crisis in education is leaving students less prepared to succeed in the working world than any generation before them. Increasingly complex external, nonacademic pressures have an impact on many of today's students, often causing them to drop out of school. Only 76 percent of Massachusetts high school students graduate, and only 29 percent earn a college degree. National figures are worse. Most educational institutions share a common goal to support students in becoming skilled, productive, successful members of society, but the author argues that this goal is not being met. Despite the constant changes in the world, educational practices have remained static. Most public schools are not adapting to meet the shifting needs of students. Universities are not able to prepare the right mix of prospective employees for the demands of the job market; for example, schools are graduating only 10 percent of the needed engineers. Institutions of higher learning cannot keep up with employers' needs in an evolving global market: strong math, science, and writing abilities; critical thinking skills; and the ability to work in teams. The author draws on exemplary efforts at work in his home state of Massachusetts--whose improvements in student achievement outcomes have been some of the best in the nation--to suggest there is promise in twenty-first century learning. Middle school students involved in a NASA-funded project write proposals, work in teams, and engage in peer review. Older students participate in enhanced, hands-on cooperative school-to-work and after-school programs. Schools are starting to offer expanded day learning, increasing the number of hours they are engaged in formal learning. Yet such programs have not reached significant levels of scale. The author calls for a major shift in education to help today's students be successful in the twenty-first century.

  13. Black Hats and White Hats: The Effect of Organizational Culture and Institutional Identity on the Twenty-third Air Force

    National Research Council Canada - National Science Library

    Koskinas, Ioannis

    2006-01-01

    .... Although brief, the Twenty-third Air Force's experience provides sufficient data for a thorough analysis of the effect of organizational culture and institutional agendas on the evolution of a nascent organization...

  14. Random number generation

    International Nuclear Information System (INIS)

    Coveyou, R.R.

    1974-01-01

    The subject of random number generation is currently controversial. Differing opinions on this subject seem to stem from implicit or explicit differences in philosophy; in particular, from differing ideas concerning the role of probability in the real world of physical processes, electronic computers, and Monte Carlo calculations. An attempt is made here to reconcile these views. The role of stochastic ideas in mathematical models is discussed. In illustration of these ideas, a mathematical model of the use of random number generators in Monte Carlo calculations is constructed. This model is used to set up criteria for the comparison and evaluation of random number generators. (U.S.)
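    The tension the abstract describes, deterministic machines producing "random" numbers for Monte Carlo work, is concrete in a linear congruential generator. A minimal sketch using the widely known Numerical Recipes constants; the uniformity check at the end is a toy version of the evaluation criteria the paper sets up.

    ```python
    # A linear congruential generator: x_{k+1} = (a*x_k + c) mod m.
    # Entirely deterministic, yet its output is treated as random in
    # Monte Carlo calculations. Constants are the well-known
    # Numerical Recipes choices for a 32-bit modulus.
    def lcg(seed, n, a=1664525, c=1013904223, m=2**32):
        x = seed
        for _ in range(n):
            x = (a * x + c) % m
            yield x / m  # scale to [0, 1)

    # A crude evaluation criterion: the sample mean of uniform draws
    # should approach 1/2 as the sample grows.
    draws = list(lcg(seed=42, n=10_000))
    print(round(sum(draws) / len(draws), 3))
    ```

    Real comparison criteria for generators go far beyond the mean (spectral tests, equidistribution in higher dimensions), which is the kind of model-based evaluation the paper constructs.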

  15. Quantum random access memory

    OpenAIRE

    Giovannetti, Vittorio; Lloyd, Seth; Maccone, Lorenzo

    2007-01-01

    A random access memory (RAM) uses n bits to randomly address N=2^n distinct memory cells. A quantum random access memory (qRAM) uses n qubits to address any quantum superposition of N memory cells. We present an architecture that exponentially reduces the requirements for a memory call: O(log N) switches need be thrown instead of the N used in conventional (classical or quantum) RAM designs. This yields a more robust qRAM algorithm, as it in general requires entanglement among exponentially l...
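    The scaling claim above, O(log N) switches instead of N, comes from binary-tree addressing: each of the n address bits steers one level of the tree. A purely classical counting sketch (it models only the switch count, not the quantum superposition of addresses):

    ```python
    # Reaching one of N = 2**n leaves in a binary addressing tree activates
    # only the n switches on the root-to-leaf path. This classical sketch
    # counts switches; it does not model superposed quantum addresses.
    def switches_on_path(address, n_bits):
        # One (level, branch) pair per address bit, most significant bit first.
        return [(level, (address >> (n_bits - 1 - level)) & 1)
                for level in range(n_bits)]

    n_bits = 10                          # 10 address bits -> 1024 cells
    path = switches_on_path(address=613, n_bits=n_bits)
    print(len(path), 2**n_bits)          # switches touched vs. total cells
    ```

    The qRAM architecture described above pushes this log-versus-linear gap into the quantum regime, where fewer active switches also means less decoherence per memory call.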

  16. Randomization of inspections

    International Nuclear Information System (INIS)

    Markin, J.T.

    1989-01-01

    As the numbers and complexity of nuclear facilities increase, limitations on resources for international safeguards may restrict attainment of safeguards goals. One option for improving the efficiency of limited resources is to expand the current inspection regime to include random allocation of the amount and frequency of inspection effort to material strata or to facilities. This paper identifies the changes in safeguards policy, administrative procedures, and operational procedures that would be necessary to accommodate randomized inspections and identifies those situations where randomization can improve inspection efficiency and those situations where the current nonrandom inspections should be maintained. 9 refs., 1 tab
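    One way to realize the random allocation of inspection effort discussed above is weighted sampling without replacement: higher-significance strata are inspected more often on average, while every facility keeps a nonzero chance of selection. The facility names and weights below are invented for illustration.

    ```python
    import random

    # Hypothetical randomized inspection schedule: each facility's chance of
    # receiving one of the limited inspection slots is proportional to a
    # significance weight (e.g. material throughput). Sampling is without
    # replacement so no facility is picked twice in one period.
    facilities = {"A": 5.0, "B": 3.0, "C": 1.5, "D": 0.5}
    n_inspections = 2

    rng = random.Random(2024)  # fixed seed: reproducible for the inspectorate,
                               # unpredictable to operators if kept secret
    chosen = []
    pool = dict(facilities)
    for _ in range(n_inspections):
        names, weights = zip(*pool.items())
        pick = rng.choices(names, weights=weights, k=1)[0]
        chosen.append(pick)
        del pool[pick]  # sample without replacement

    print(sorted(chosen))
    ```

    The unpredictability of the draw is the point: an operator cannot anticipate which strata will be inspected, which is what makes the randomized regime more efficient than a fixed schedule at equal effort.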

  17. Random phenomena; Phenomenes aleatoires

    Energy Technology Data Exchange (ETDEWEB)

    Bonnet, G. [Commissariat a l' energie atomique et aux energies alternatives - CEA, C.E.N.G., Service d' Electronique, Section d' Electronique, Grenoble (France)]

    1963-07-01

    This document gathers a set of lectures presented in 1962. The first offers a mathematical introduction to the analysis of random phenomena. The second presents an axiomatic treatment of probability calculus. The third gives an overview of one-dimensional random variables. The fourth addresses random pairs and presents basic theorems on the algebra of mathematical expectations. The fifth discusses several probability laws: the binomial, Poisson, and Laplace-Gauss distributions. The last deals with stochastic convergence and asymptotic distributions.
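The connection between the probability laws and the asymptotic results covered by these lectures can be illustrated with a classical example: as n grows with n·p held fixed, the binomial law approaches the Poisson law. The specific rate λ = 3 and the point k = 2 below are arbitrary choices for illustration.

```python
from math import comb, exp, factorial

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

def poisson_pmf(k, lam):
    """P(X = k) for X ~ Poisson(lam)."""
    return exp(-lam) * lam ** k / factorial(k)

# With n*p = 3 fixed, the binomial pmf converges to the Poisson pmf
# as n grows: the printed differences shrink toward zero.
lam = 3.0
for n in (10, 100, 1000):
    p = lam / n
    print(n, abs(binomial_pmf(2, n, p) - poisson_pmf(2, lam)))
```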

  18. Selected writings

    CERN Document Server

    Galilei, Galileo

    2012-01-01

    'Philosophy is written in this great book which is continually open before our eyes - I mean the universe...' Galileo's astronomical discoveries changed the way we look at the world, and our place in the universe. Threatened by the Inquisition for daring to contradict the literal truth of the Bible, Galileo ignited a scientific revolution when he asserted that the Earth moves. This generous selection from his writings contains all the essential texts for a reader to appreciate his lasting significance. Mark Davie's new translation renders Galileo's vigorous Italian prose into clear modern English, while William R. Shea's version of the Latin Sidereal Message makes accessible the book that created a sensation in 1610 with its account of Galileo's observations using the newly invented telescope. All Galileo's contributions to the debate on science and religion are included, as well as key documents from his trial before the Inquisition in 1633. A lively introduction and clear notes give an overview of Galileo's...

  19. Site selection

    International Nuclear Information System (INIS)

    Olsen, C.W.

    1983-07-01

    The conditions and criteria for selecting a site for a nuclear weapons test at the Nevada Test Site are summarized. Factors considered are: (1) scheduling of drill rigs, (2) scheduling of site preparation (dirt work, auger hole, surface casing, cementing), (3) schedule of event (when are drill hole data needed), (4) depth range of proposed W.P., (5) geologic structure (faults, Pz contact, etc.), (6) stratigraphy (alluvium, location of Grouse Canyon Tuff, etc.), (7) material properties (particularly montmorillonite and CO2 content), (8) water table depth, (9) potential drilling problems (caving), (10) adjacent collapse craters and chimneys, (11) adjacent expended but uncollapsed sites, (12) adjacent post-shot or other small diameter holes, (13) adjacent stockpile emplacement holes, (14) adjacent planned events (including LANL), (15) projected needs of Test Program for various DOBs and operational separations, and (16) optimal use of NTS real estate

  20. Birefringence characteristics in sperm heads allow for the selection of reacted spermatozoa for intracytoplasmic sperm injection.

    Science.gov (United States)

    Gianaroli, Luca; Magli, M Cristina; Ferraretti, Anna P; Crippa, Andor; Lappi, Michela; Capitani, Serena; Baccetti, Baccio

    2010-02-01

    To verify clinical outcome after injection of spermatozoa that have undergone the acrosome reaction (reacted spermatozoa) vs. those still having an intact acrosome (nonreacted spermatozoa). Prospective, randomized study. Reproductive Medicine Unit, Italian Society for the Study of Reproductive Medicine, Bologna, Italy. According to a prospective randomization including 71 couples with severe male factor infertility, intracytoplasmic sperm injection (ICSI) was performed under polarized light that permitted analysis of the pattern of birefringence in the sperm head. Twenty-three patients had their oocytes injected with reacted spermatozoa, 26 patients' oocytes were injected with nonreacted spermatozoa, and in 22 patients both reacted and nonreacted spermatozoa were injected. Intracytoplasmic sperm injection was performed under polarized light to selectively inject acrosome-reacted and acrosome-nonreacted spermatozoa. Rates of fertilization, cleavage, pregnancy, implantation, and ongoing implantation. There was no effect on the fertilizing capacity and embryo development of either type of sperm, whereas the implantation rate was higher in oocytes injected with reacted spermatozoa (39.0%) vs. those injected with nonreacted spermatozoa (8.6%). The implantation rate was 24.4% in the group injected with both reacted and nonreacted spermatozoa. The delivery rate per cycle followed the same trend. Spermatozoa that have undergone the acrosome reaction seem to be more prone to supporting the development of viable ICSI embryos. Copyright 2010 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.