WorldWideScience

Sample records for randomly selected high

  1. High Entropy Random Selection Protocols

    NARCIS (Netherlands)

    H. Buhrman (Harry); M. Christandl (Matthias); M. Koucky (Michal); Z. Lotker (Zvi); B. Patt-Shamir; M. Charikar; K. Jansen; O. Reingold; J. Rolim

    2007-01-01

In this paper, we construct protocols for two parties that do not trust each other to generate random variables with high Shannon entropy. We improve known bounds for the trade-off between the number of rounds, the length of communication, and the entropy of the outcome.

  2. Blocked Randomization with Randomly Selected Block Sizes

    Directory of Open Access Journals (Sweden)

    Jimmy Efird

    2010-12-01

When planning a randomized clinical trial, careful consideration must be given to how participants are selected for the various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has an equal likelihood of being assigned to the treatment versus the referent group. However, by chance an unequal number of individuals may be assigned to each arm of the study, decreasing the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Still, the allocation process may be predictable, for example, when the investigator is not blinded and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes.
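
The scheme this record describes can be sketched in a few lines. A minimal illustration, assuming two arms and candidate block sizes that are multiples of the number of arms (the function name, arm labels, and sizes are illustrative, not from the paper):

```python
import random

def blocked_randomization(n_participants, arms=("treatment", "control"),
                          block_sizes=(4, 6, 8), seed=None):
    """Build an allocation sequence from blocks of randomly chosen size.

    Each block assigns an equal number of participants to every arm;
    drawing the block length at random makes the next assignment harder
    to predict than with a fixed block size.
    """
    rng = random.Random(seed)
    sequence = []
    while len(sequence) < n_participants:
        # Only block sizes divisible by the number of arms keep blocks balanced.
        size = rng.choice([s for s in block_sizes if s % len(arms) == 0])
        block = list(arms) * (size // len(arms))
        rng.shuffle(block)
        sequence.extend(block)
    return sequence[:n_participants]
```

Within every completed block the arms are exactly balanced, so the final group sizes can differ by at most one block's worth of assignments.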

  3. Selectivity and sparseness in randomly connected balanced networks.

    Directory of Open Access Journals (Sweden)

    Cengiz Pehlevan

Neurons in sensory cortex show stimulus selectivity and sparse population responses, even in cases where no strong functionally specific structure in connectivity can be detected. This raises the question of whether selectivity and sparseness can be generated and maintained in randomly connected networks. We consider a recurrent network of excitatory and inhibitory spiking neurons with random connectivity, driven by random projections from an input layer of stimulus-selective neurons. In this architecture, the stimulus-to-stimulus and neuron-to-neuron modulation of total synaptic input is weak compared to the mean input. Surprisingly, we show that in the balanced state the network can still support high stimulus selectivity and a sparse population response. In the balanced state, strong synapses amplify the variation in synaptic input and recurrent inhibition cancels the mean. Functional specificity in connectivity emerges from the inhomogeneity caused by the generative statistical rule used to build the network. We further elucidate the underlying mechanism and evaluate the effects of model parameters on population sparseness and stimulus selectivity. Network response to mixtures of stimuli is investigated. It is shown that a balanced state with unselective inhibition can be achieved with densely connected input to the inhibitory population. Balanced networks exhibit the "paradoxical" effect: an increase in excitatory drive to inhibition leads to a decreased inhibitory population firing rate. We compare and contrast the selectivity and sparseness generated by the balanced network with those of randomly connected unbalanced networks. Finally, we discuss our results in light of experiments.

  4. 47 CFR 1.1602 - Designation for random selection.

    Science.gov (United States)

    2010-10-01

47 CFR Part 1 (2010-10-01), Federal Communications Commission, General Practice and Procedure, Random Selection Procedures for Mass Media Services, General Procedures, § 1.1602 Designation for random selection…

  5. 47 CFR 1.1603 - Conduct of random selection.

    Science.gov (United States)

    2010-10-01

47 CFR Part 1 (2010-10-01), Federal Communications Commission, General Practice and Procedure, Random Selection Procedures for Mass Media Services, General Procedures, § 1.1603 Conduct of random selection. The…

  6. The signature of positive selection at randomly chosen loci.

    OpenAIRE

    Przeworski, Molly

    2002-01-01

    In Drosophila and humans, there are accumulating examples of loci with a significant excess of high-frequency-derived alleles or high levels of linkage disequilibrium, relative to a neutral model of a random-mating population of constant size. These are features expected after a recent selective sweep. Their prevalence suggests that positive directional selection may be widespread in both species. However, as I show here, these features do not persist long after the sweep ends: The high-frequ...

  7. The signature of positive selection at randomly chosen loci.

    Science.gov (United States)

    Przeworski, Molly

    2002-03-01

    In Drosophila and humans, there are accumulating examples of loci with a significant excess of high-frequency-derived alleles or high levels of linkage disequilibrium, relative to a neutral model of a random-mating population of constant size. These are features expected after a recent selective sweep. Their prevalence suggests that positive directional selection may be widespread in both species. However, as I show here, these features do not persist long after the sweep ends: The high-frequency alleles drift to fixation and no longer contribute to polymorphism, while linkage disequilibrium is broken down by recombination. As a result, loci chosen without independent evidence of recent selection are not expected to exhibit either of these features, even if they have been affected by numerous sweeps in their genealogical history. How then can we explain the patterns in the data? One possibility is population structure, with unequal sampling from different subpopulations. Alternatively, positive selection may not operate as is commonly modeled. In particular, the rate of fixation of advantageous mutations may have increased in the recent past.

  8. Applications of random forest feature selection for fine-scale genetic population assignment.

    Science.gov (United States)

    Sylvester, Emma V A; Bentzen, Paul; Bradbury, Ian R; Clément, Marie; Pearce, Jon; Horne, John; Beiko, Robert G

    2018-02-01

Genetic population assignment used to inform wildlife management and conservation efforts requires panels of highly informative genetic markers and sensitive assignment tests. We explored the utility of machine-learning algorithms (random forest, regularized random forest and guided regularized random forest) compared with FST ranking for selection of single nucleotide polymorphisms (SNPs) for fine-scale population assignment. We applied these methods to an unpublished SNP data set for Atlantic salmon (Salmo salar) and a published SNP data set for Alaskan Chinook salmon (Oncorhynchus tshawytscha). In each species, we identified the minimum panel size required to obtain a self-assignment accuracy of at least 90%, using each method to create panels of 50-700 markers. Panels of SNPs identified using random forest-based methods performed up to 7.8 and 11.2 percentage points better than FST-selected panels of similar size for the Atlantic salmon and Chinook salmon data, respectively. Self-assignment accuracy ≥90% was obtained with panels of 670 and 384 SNPs for each data set, respectively, a level of accuracy never reached for these species using FST-selected panels. Our results demonstrate a role for machine-learning approaches in marker selection across large genomic data sets to improve assignment for management and conservation of exploited populations.
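
The ranking-then-panel step can be illustrated with a plain random forest on synthetic genotypes (the study also evaluates regularized and guided regularized variants, not shown here; the data, informative loci, and panel size below are invented for the sketch):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n, p = 300, 50
X = rng.integers(0, 3, size=(n, p)).astype(float)              # SNP genotypes coded 0/1/2
y = (X[:, 0] + X[:, 1] + rng.normal(0, 1, n) > 2).astype(int)  # loci 0 and 1 carry signal

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
ranked = np.argsort(rf.feature_importances_)[::-1]             # loci ranked by importance
panel = ranked[:10]                                            # candidate marker panel
```

In the study, panels of increasing size built from such rankings are then scored by self-assignment accuracy.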

  9. Testing, Selection, and Implementation of Random Number Generators

    National Research Council Canada - National Science Library

    Collins, Joseph C

    2008-01-01

    An exhaustive evaluation of state-of-the-art random number generators with several well-known suites of tests provides the basis for selection of suitable random number generators for use in stochastic simulations...

  10. Selection for altruism through random drift in variable size populations

    Directory of Open Access Journals (Sweden)

    Houchmandzadeh Bahram

    2012-05-01

Background: Altruistic behavior is defined as helping others at a cost to oneself and a lowered fitness. The lower fitness implies that altruists should be selected against, which is in contradiction with their widespread presence in nature. Present models of selection for altruism (kin or multilevel) show that altruistic behaviors can have 'hidden' advantages if the 'common good' produced by altruists is restricted to some related or unrelated groups. These models are mostly deterministic, or assume a frequency-dependent fitness. Results: Evolutionary dynamics is a competition between deterministic selection pressure and stochastic events due to random sampling from one generation to the next. We show here that an altruistic allele extending the carrying capacity of the habitat can win by increasing the random drift of "selfish" alleles. In other words, the fixation probability of altruistic genes can be higher than that of selfish ones, even though altruists have a smaller fitness. Moreover, when populations are geographically structured, the altruists' advantage can be highly amplified and the fixation probability of selfish genes can tend toward zero. The above results are obtained by both numerical and analytical calculations. Analytical results are obtained in the limit of large populations. Conclusions: The theory we present does not involve kin or multilevel selection, but is based on the existence of random drift in variable-size populations. The model is a generalization of the original Fisher-Wright and Moran models in which the carrying capacity depends on the number of altruists.
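
The central claim, that a costly allele can still reach fixation when it enlarges the population its rival must drift through, can be illustrated with a toy Wright-Fisher-style simulation (a simplification for illustration only, not the paper's Moran-type model; all parameter names are invented):

```python
import random

def fixation_probability(n0=50, s=0.02, b=1.0, trials=500, seed=1):
    """Estimate the fixation probability of a single altruistic allele.

    Altruists pay a fitness cost s but raise the carrying capacity by a
    factor (1 + b * altruist_fraction), so drift acts on a smaller
    population when the selfish allele dominates. Toy model only.
    """
    rng = random.Random(seed)
    fixed = 0
    for _ in range(trials):
        n_alt, n_total = 1, n0
        while 0 < n_alt < n_total:
            x = n_alt / n_total
            n_total = max(2, round(n0 * (1 + b * x)))  # variable carrying capacity
            w_alt = (1 - s) * x                        # selection against altruists
            p = w_alt / (w_alt + (1 - x))
            n_alt = sum(rng.random() < p for _ in range(n_total))
        if n_alt == n_total:
            fixed += 1
    return fixed / trials

p_fix = fixation_probability(n0=30, trials=300, seed=1)
```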

  11. Application of random effects to the study of resource selection by animals.

    Science.gov (United States)

    Gillies, Cameron S; Hebblewhite, Mark; Nielsen, Scott E; Krawchuk, Meg A; Aldridge, Cameron L; Frair, Jacqueline L; Saher, D Joanne; Stevens, Cameron E; Jerde, Christopher L

    2006-07-01

1. Resource selection estimated by logistic regression is used increasingly in studies to identify critical resources for animal populations and to predict species occurrence. 2. Most frequently, individual animals are monitored and pooled to estimate population-level effects without regard to group or individual-level variation. Pooling assumes that both observations and their errors are independent, and that resource selection is constant given individual variation in resource availability. 3. Although researchers have identified ways to minimize autocorrelation, variation between individuals caused by differences in selection or available resources, including functional responses in resource selection, has not been well addressed. 4. Here we review random-effects models and their application to resource selection modelling to overcome these common limitations. We present a simple case study of an analysis of resource selection by grizzly bears in the foothills of the Canadian Rocky Mountains with and without random effects. 5. Both categorical and continuous variables in the grizzly bear model differed in interpretation, both in statistical significance and coefficient sign, depending on how a random effect was included. We used a simulation approach to clarify the application of random effects under three common situations for telemetry studies: (a) discrepancies in sample sizes among individuals; (b) differences among individuals in selection where availability is constant; and (c) differences in availability with and without a functional response in resource selection. 6. We found that random intercepts accounted for unbalanced sample designs, and models with random intercepts and coefficients improved model fit given the variation in selection among individuals and functional responses in selection. Our empirical example and simulations demonstrate how including random effects in resource selection models can aid interpretation and address difficult assumptions.

  12. Local randomization in neighbor selection improves PRM roadmap quality

    KAUST Repository

    McMahon, Troy; Jacobs, Sam; Boyd, Bryan; Tapia, Lydia; Amato, Nancy M.

    2012-01-01

    Probabilistic Roadmap Methods (PRMs) are one of the most used classes of motion planning methods. These sampling-based methods generate robot configurations (nodes) and then connect them to form a graph (roadmap) containing representative feasible pathways. A key step in PRM roadmap construction involves identifying a set of candidate neighbors for each node. Traditionally, these candidates are chosen to be the k-closest nodes based on a given distance metric. In this paper, we propose a new neighbor selection policy called LocalRand(k,K'), that first computes the K' closest nodes to a specified node and then selects k of those nodes at random. Intuitively, LocalRand attempts to benefit from random sampling while maintaining the higher levels of local planner success inherent to selecting more local neighbors. We provide a methodology for selecting the parameters k and K'. We perform an experimental comparison which shows that for both rigid and articulated robots, LocalRand results in roadmaps that are better connected than the traditional k-closest policy or a purely random neighbor selection policy. The cost required to achieve these results is shown to be comparable to k-closest. © 2012 IEEE.
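
The LocalRand(k, K') policy itself is only a few lines. A sketch for point configurations under a Euclidean metric (the helper name is invented; a PRM would apply this per node during the connection step):

```python
import math
import random

def local_rand_neighbors(nodes, q, k, k_prime, rng=random):
    """LocalRand(k, K'): find the K' nodes closest to q under the given
    (here Euclidean) metric, then return k of them chosen at random."""
    closest = sorted((n for n in nodes if n != q),
                     key=lambda n: math.dist(n, q))[:k_prime]
    return rng.sample(closest, k)
```

Setting k_prime equal to k recovers the classical k-closest policy, and setting it to the roadmap size gives purely random selection, the two baselines compared in the paper.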

  13. Local randomization in neighbor selection improves PRM roadmap quality

    KAUST Repository

    McMahon, Troy

    2012-10-01

Probabilistic Roadmap Methods (PRMs) are one of the most used classes of motion planning methods. These sampling-based methods generate robot configurations (nodes) and then connect them to form a graph (roadmap) containing representative feasible pathways. A key step in PRM roadmap construction involves identifying a set of candidate neighbors for each node. Traditionally, these candidates are chosen to be the k-closest nodes based on a given distance metric. In this paper, we propose a new neighbor selection policy called LocalRand(k,K'), that first computes the K' closest nodes to a specified node and then selects k of those nodes at random. Intuitively, LocalRand attempts to benefit from random sampling while maintaining the higher levels of local planner success inherent to selecting more local neighbors. We provide a methodology for selecting the parameters k and K'. We perform an experimental comparison which shows that for both rigid and articulated robots, LocalRand results in roadmaps that are better connected than the traditional k-closest policy or a purely random neighbor selection policy. The cost required to achieve these results is shown to be comparable to k-closest. © 2012 IEEE.

  14. High-Tg Polynorbornene-Based Block and Random Copolymers for Butanol Pervaporation Membranes

    Science.gov (United States)

    Register, Richard A.; Kim, Dong-Gyun; Takigawa, Tamami; Kashino, Tomomasa; Burtovyy, Oleksandr; Bell, Andrew

    Vinyl addition polymers of substituted norbornene (NB) monomers possess desirably high glass transition temperatures (Tg); however, until very recently, the lack of an applicable living polymerization chemistry has precluded the synthesis of such polymers with controlled architecture, or copolymers with controlled sequence distribution. We have recently synthesized block and random copolymers of NB monomers bearing hydroxyhexafluoroisopropyl and n-butyl substituents (HFANB and BuNB) via living vinyl addition polymerization with Pd-based catalysts. Both series of polymers were cast into the selective skin layers of thin film composite (TFC) membranes, and these organophilic membranes investigated for the isolation of n-butanol from dilute aqueous solution (model fermentation broth) via pervaporation. The block copolymers show well-defined microphase-separated morphologies, both in bulk and as the selective skin layers on TFC membranes, while the random copolymers are homogeneous. Both block and random vinyl addition copolymers are effective as n-butanol pervaporation membranes, with the block copolymers showing a better flux-selectivity balance. While polyHFANB has much higher permeability and n-butanol selectivity than polyBuNB, incorporating BuNB units into the polymer (in either a block or random sequence) limits the swelling of the polyHFANB and thereby improves the n-butanol pervaporation selectivity.

  15. Randomizing Roaches: Exploring the "Bugs" of Randomization in Experimental Design

    Science.gov (United States)

    Wagler, Amy; Wagler, Ron

    2014-01-01

    Understanding the roles of random selection and random assignment in experimental design is a central learning objective in most introductory statistics courses. This article describes an activity, appropriate for a high school or introductory statistics course, designed to teach the concepts, values and pitfalls of random selection and assignment…

  16. Differential privacy-based evaporative cooling feature selection and classification with relief-F and random forests.

    Science.gov (United States)

    Le, Trang T; Simmons, W Kyle; Misaki, Masaya; Bodurka, Jerzy; White, Bill C; Savitz, Jonathan; McKinney, Brett A

    2017-09-15

Classification of individuals into disease or clinical categories from high-dimensional biological data with low prediction error is an important challenge of statistical learning in bioinformatics. Feature selection can improve classification accuracy but must be incorporated carefully into cross-validation to avoid overfitting. Recently, feature selection methods based on differential privacy, such as differentially private random forests and reusable holdout sets, have been proposed. However, for domains such as bioinformatics, where the number of features is much larger than the number of observations (p ≫ n), these differential privacy methods are susceptible to overfitting. We introduce private Evaporative Cooling, a stochastic privacy-preserving machine learning algorithm that uses Relief-F for feature selection and random forest for privacy-preserving classification while also preventing overfitting. We relate the privacy-preserving threshold mechanism to a thermodynamic Maxwell-Boltzmann distribution, where the temperature represents the privacy threshold. We use the thermal statistical physics concept of evaporative cooling of atomic gases to perform backward stepwise privacy-preserving feature selection. On simulated data with main effects and statistical interactions, we compare accuracies on holdout and validation sets for three privacy-preserving methods: the reusable holdout, the reusable holdout with random forest, and private Evaporative Cooling, which uses Relief-F feature selection and random forest classification. In simulations where interactions exist between attributes, private Evaporative Cooling provides higher classification accuracy without overfitting, based on an independent validation set. In simulations without interactions, thresholdout with random forest and private Evaporative Cooling give comparable accuracies. We also apply these privacy methods to human brain resting-state fMRI data from a study of major depressive disorder.
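
The Relief-F ingredient can be illustrated with the basic binary-class Relief scorer it generalizes (a sketch only: Relief-F proper averages over k neighbors per class, and the paper wraps the scores in privacy and temperature machinery omitted here):

```python
import numpy as np

def relief(X, y):
    """Basic Relief scores for binary classes: a feature is rewarded for
    differing at each sample's nearest miss and penalized for differing
    at its nearest hit (L1 distance)."""
    n, p = X.shape
    w = np.zeros(p)
    for i in range(n):
        d = np.abs(X - X[i]).sum(axis=1)
        d[i] = np.inf                      # exclude the sample itself
        hit = np.argmin(np.where(y == y[i], d, np.inf))
        miss = np.argmin(np.where(y != y[i], d, np.inf))
        w += np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])
    return w / n
```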

  17. Primitive polynomials selection method for pseudo-random number generator

    Science.gov (United States)

    Anikin, I. V.; Alnajjar, Kh

    2018-01-01

In this paper we suggest a method for selecting primitive polynomials of a special type. Such polynomials can be efficiently used as characteristic polynomials for linear feedback shift registers in pseudo-random number generators. The proposed method consists of two basic steps: finding minimum-cost irreducible polynomials of the desired degree, and applying primitivity tests to obtain the primitive ones. Finally, two primitive polynomials found by the proposed method were used in the pseudo-random number generator based on fuzzy logic (FRNG) suggested earlier by the authors. The sequences generated by the new version of FRNG have low correlation magnitude and high linear complexity, consume less power, are better balanced, and have better statistical properties.
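
The primitivity test in the second step is, in essence, a period check: a degree-m characteristic polynomial is primitive exactly when its LFSR cycles through all 2^m − 1 nonzero states. A sketch using one common Fibonacci LFSR convention (helper name invented; real implementations factor 2^m − 1 rather than brute-forcing the period):

```python
def lfsr_period(taps, degree, seed=1):
    """Count the states a Fibonacci LFSR visits before returning to `seed`.

    `taps` lists the exponents of the characteristic polynomial (the x^0
    term is implied). A primitive polynomial of degree m yields the
    maximal period 2**m - 1.
    """
    state, period = seed, 0
    while True:
        bit = 0
        for e in taps:
            bit ^= (state >> (degree - e)) & 1     # tap for the x^e term
        state = (state >> 1) | (bit << (degree - 1))
        period += 1
        if state == seed:
            return period
```

For example, x^4 + x^3 + 1 and x^4 + x + 1 are primitive and give period 15, while the reducible x^4 + x^2 + 1 falls short.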

  18. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Science.gov (United States)

    2010-07-01

40 CFR 761.308 (2010-07-01), Protection of Environment, § 761.79(b)(3), § 761.308 Sample selection by random number generation on any two-dimensional square grid. …area created in accordance with paragraph (a) of this section, select two random numbers: one each for…
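
The quoted rule amounts to drawing two uniform random numbers per grid square, one per coordinate, to place a sampling point inside it. A sketch (function and argument names invented, not the regulation's wording):

```python
import random

def sample_points(squares, side, seed=None):
    """For each grid square, given by its lower-left corner (x0, y0),
    draw two random numbers to locate one sampling point inside it."""
    rng = random.Random(seed)
    return [(x0 + rng.random() * side, y0 + rng.random() * side)
            for (x0, y0) in squares]
```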

  19. Interference-aware random beam selection for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed M.

    2012-09-01

Spectrum sharing systems have been introduced to alleviate the problem of spectrum scarcity by allowing secondary unlicensed networks to share the spectrum with primary licensed networks under acceptable interference levels to the primary users. In this paper, we develop interference-aware random beam selection schemes that provide enhanced throughput for the secondary link under the condition that the interference observed at the primary link is within a predetermined acceptable value. For a secondary transmitter equipped with multiple antennas, our schemes select a random beam, among a set of power-optimized orthogonal random beams, that maximizes the capacity of the secondary link while satisfying the interference constraint at the primary receiver for different levels of feedback information describing the interference level at the primary receiver. For the proposed schemes, we develop a statistical analysis for the signal-to-noise and interference ratio (SINR) statistics as well as the capacity of the secondary link. Finally, we present numerical results that study the effect of system parameters, including the number of beams and the maximum transmission power, on the capacity of the secondary link attained using the proposed schemes. © 2012 IEEE.
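
The selection rule can be sketched for a single secondary transmitter: build orthonormal random beams, discard any whose interference at the primary receiver exceeds the cap, and keep the admissible beam with the largest secondary-link gain (a toy model; the paper's schemes also power-optimize the beams and work from quantized feedback):

```python
import numpy as np

def select_beam(n_ant=4, n_beams=4, i_max=0.1, seed=0):
    """Interference-aware random beam selection, single-user sketch."""
    rng = np.random.default_rng(seed)
    # Orthonormal random beams: Q factor of a complex Gaussian matrix.
    g = rng.standard_normal((n_ant, n_beams)) + 1j * rng.standard_normal((n_ant, n_beams))
    beams, _ = np.linalg.qr(g)
    h_s = rng.standard_normal(n_ant) + 1j * rng.standard_normal(n_ant)  # secondary channel
    h_p = rng.standard_normal(n_ant) + 1j * rng.standard_normal(n_ant)  # primary channel
    best, best_gain = None, -1.0
    for b in beams.T:
        if abs(h_p.conj() @ b) ** 2 > i_max:   # interference cap at primary receiver
            continue
        gain = abs(h_s.conj() @ b) ** 2        # secondary-link channel gain
        if gain > best_gain:
            best, best_gain = b, gain
    return best, best_gain
```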

  20. Interference-aware random beam selection for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed M.; Sayed, Mostafa M.; Alouini, Mohamed-Slim; Qaraqe, Khalid A.

    2012-01-01

In this paper, we develop interference-aware random beam selection schemes that provide enhanced throughput for the secondary link under the condition that the interference observed at the primary link is within a predetermined acceptable value. For a secondary…

  1. Materials selection for oxide-based resistive random access memories

    International Nuclear Information System (INIS)

    Guo, Yuzheng; Robertson, John

    2014-01-01

The energies of atomic processes in resistive random access memories (RRAMs) are calculated for four typical oxides, HfO2, TiO2, Ta2O5, and Al2O3, to define a materials selection process. O vacancies have the lowest defect formation energy in the O-poor limit and dominate the processes. A band diagram defines the operating Fermi energy and O chemical potential range. It is shown how the scavenger metal can be used to vary the O vacancy formation energy, via controlling the O chemical potential, and the mean Fermi energy. The high endurance of Ta2O5 RRAM is related to its more stable amorphous phase and the adaptive lattice rearrangements of its O vacancies.

  2. Blind Measurement Selection: A Random Matrix Theory Approach

    KAUST Repository

    Elkhalil, Khalil

    2016-12-14

This paper considers the problem of selecting a set of $k$ measurements from $n$ available sensor observations. The selected measurements should minimize a certain error function assessing the error in estimating a certain $m$-dimensional parameter vector. The exhaustive search inspecting each of the $\binom{n}{k}$ possible choices would require a very high computational complexity and as such is not practical for large $n$ and $k$. Alternative methods with low complexity have recently been investigated, but their main drawbacks are that 1) they require perfect knowledge of the measurement matrix and 2) they need to be applied at the pace of change of the measurement matrix. To overcome these issues, we consider the asymptotic regime in which $k$, $n$ and $m$ grow large at the same pace. Tools from random matrix theory are then used to approximate in closed form the most important error measures that are commonly used. The asymptotic approximations are then leveraged to properly select $k$ measurements exhibiting low values for the asymptotic error measures. Two heuristic algorithms are proposed: the first one merely consists of applying convex relaxation to the asymptotic error measure. The second algorithm is a low-complexity greedy algorithm that attempts to find a sufficiently good solution to the original minimization problem. The greedy algorithm can be applied to both the exact and the asymptotic error measures and can thus be implemented in blind and channel-aware fashions. We present two potential applications where the proposed algorithms can be used, namely antenna selection for uplink transmissions in large-scale multi-user systems and sensor selection for wireless sensor networks. Numerical results are also presented and confirm the efficiency of the proposed blind methods in reaching the performance of channel-aware algorithms.
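
The greedy algorithm can be sketched in its channel-aware form with the classical A-optimality cost; the blind variant would score candidates with the paper's closed-form asymptotic error measure instead (the function name and the regularizer are invented):

```python
import numpy as np

def greedy_select(H, k, reg=1e-6):
    """Greedily pick k rows of the n-by-m measurement matrix H, each time
    adding the row that most reduces trace((H_S^T H_S + reg*I)^-1)."""
    n, m = H.shape
    chosen = []
    for _ in range(k):
        best_i, best_cost = None, np.inf
        for i in range(n):
            if i in chosen:
                continue
            Hs = H[chosen + [i]]
            cost = np.trace(np.linalg.inv(Hs.T @ Hs + reg * np.eye(m)))
            if cost < best_cost:
                best_i, best_cost = i, cost
        chosen.append(best_i)
    return chosen
```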

  3. Simulated Performance Evaluation of a Selective Tracker Through Random Scenario Generation

    DEFF Research Database (Denmark)

    Hussain, Dil Muhammad Akbar

    2006-01-01

The paper presents a simulation study on the performance of a target tracker using a selective track-splitting filter algorithm through a random scenario implemented on a digital signal processor. In a typical track-splitting filter, all observations which fall inside a likelihood ellipse are used for the update; in our proposed selective track-splitting filter, however, fewer observations are used for the track update. Much of the previous performance work [1] has been done on specific (deterministic) scenarios, which are of limited value for performance assessment. Therefore, a random target motion scenario is adopted. Its implementation, in particular for testing the proposed selective track-splitting algorithm using Kalman filters, is investigated through a number of performance parameters which give the activity profile of the tracking scenario.

  4. Minimization over randomly selected lines

    Directory of Open Access Journals (Sweden)

    Ismet Sahin

    2013-07-01

This paper presents a population-based evolutionary optimization method for minimizing a given cost function. The mutation operator of this method selects randomly oriented lines in the cost function domain, constructs quadratic functions interpolating the cost function at three different points over each line, and uses the extrema of the quadratics as mutated points. The crossover operator modifies each mutated point based on components of two points in the population, instead of one point as is usually done in other evolutionary algorithms. The stopping criterion of this method depends on the number of almost degenerate quadratics. We demonstrate that the proposed method with these mutation and crossover operations achieves faster and more robust convergence than the well-known Differential Evolution and Particle Swarm algorithms.
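
One application of the mutation operator can be sketched directly from the description above: pick a random line through the current point, interpolate the cost at three points along it with a quadratic, and accept the quadratic's minimizer when it improves the cost (the population and crossover machinery are omitted):

```python
import numpy as np

def line_quadratic_step(f, x, step=1.0, rng=None):
    """One mutation: fit y ≈ a t² + b t + c to f along a random line
    through x and move to the parabola's vertex if that lowers f."""
    rng = rng or np.random.default_rng()
    d = rng.standard_normal(x.shape)
    d /= np.linalg.norm(d)                     # random unit direction
    ts = np.array([-step, 0.0, step])
    ys = np.array([f(x + t * d) for t in ts])
    a, b, _ = np.polyfit(ts, ys, 2)            # exact fit through 3 points
    if a <= 0:                                 # no minimum along this line
        return x
    cand = x + (-b / (2 * a)) * d
    return cand if f(cand) < f(x) else x
```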

  5. Selection bias and subject refusal in a cluster-randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Rochelle Yang

    2017-07-01

Background: Selection bias and non-participation bias are major methodological concerns which impact external validity. Cluster-randomized controlled trials are especially prone to selection bias as it is impractical to blind clusters to their allocation into intervention or control. This study assessed the impact of selection bias in a large cluster-randomized controlled trial. Methods: The Improved Cardiovascular Risk Reduction to Enhance Rural Primary Care (ICARE) study examined the impact of a remote pharmacist-led intervention in twelve medical offices. To assess eligibility, a standardized form containing patient demographics and medical information was completed for each screened patient. Eligible patients were approached by the study coordinator for recruitment. Both the study coordinator and the patient were aware of the site's allocation prior to consent. Patients who consented or declined to participate were compared across control and intervention arms for differing characteristics. Statistical significance was determined using a two-tailed, equal-variance t-test and a chi-square test with adjusted Bonferroni p-values. Results were adjusted for random cluster variation. Results: There were 2749 completed screening forms returned to research staff, with 461 subjects who had either consented or declined participation. Patients with poorly controlled diabetes were found to be significantly more likely to decline participation in intervention sites compared to those in control sites. A higher mean diastolic blood pressure was seen in patients with uncontrolled hypertension who declined in the control sites compared to those who declined in the intervention sites. However, these findings were no longer significant after adjustment for random variation among the sites. After this adjustment, females were found to be significantly more likely to consent than males (odds ratio = 1.41; 95% confidence interval = 1.03, 1

  6. Materials selection for oxide-based resistive random access memories

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Yuzheng; Robertson, John [Engineering Department, Cambridge University, Cambridge CB2 1PZ (United Kingdom)

    2014-12-01

    The energies of atomic processes in resistive random access memories (RRAMs) are calculated for four typical oxides, HfO{sub 2}, TiO{sub 2}, Ta{sub 2}O{sub 5}, and Al{sub 2}O{sub 3}, to define a materials selection process. O vacancies have the lowest defect formation energy in the O-poor limit and dominate the processes. A band diagram defines the operating Fermi energy and O chemical potential range. It is shown how the scavenger metal can be used to vary the O vacancy formation energy, via controlling the O chemical potential, and the mean Fermi energy. The high endurance of Ta{sub 2}O{sub 5} RRAM is related to its more stable amorphous phase and the adaptive lattice rearrangements of its O vacancy.

  7. Comparative Evaluations of Randomly Selected Four Point-of-Care Glucometer Devices in Addis Ababa, Ethiopia.

    Science.gov (United States)

    Wolde, Mistire; Tarekegn, Getahun; Kebede, Tedla

    2018-05-01

Point-of-care glucometer (PoCG) devices play a significant role in self-monitoring of blood sugar levels, particularly in the follow-up of high blood sugar therapeutic response. The aim of this study was to evaluate blood glucose test results performed with four randomly selected glucometers on diabetes and control subjects versus the standard wet chemistry (hexokinase) method in Addis Ababa, Ethiopia. A prospective cross-sectional study was conducted on 200 randomly selected study participants (100 participants with diabetes and 100 healthy controls). Four randomly selected PoCG devices (CareSens N, DIAVUE Prudential, On Call Extra, i-QARE DS-W) were evaluated against the hexokinase method and the ISO 15197:2003 and ISO 15197:2013 standards. The minimum and maximum blood sugar values were recorded by CareSens N (21 mg/dl) and the hexokinase method (498.8 mg/dl), respectively. The mean sugar values of all PoCG devices except On Call Extra showed significant differences compared with the reference hexokinase method. Meanwhile, all four PoCG devices had a strong positive correlation (>80%) with the reference method (hexokinase). On the other hand, none of the four PoCG devices fulfilled the minimum accuracy requirements set by the ISO 15197:2003 and ISO 15197:2013 standards. In addition, linear regression analysis revealed that all four selected PoCG devices overestimated the glucose concentrations. Overall, measurements from the four selected PoCG devices correlated poorly with the standard reference method. Therefore, before introducing PoCG devices to the market, there should be a standardized evaluation platform for validation. Further similar large-scale studies on other PoCG devices also need to be undertaken.
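
The accuracy criterion the devices failed can be checked mechanically. A sketch using the commonly quoted ISO 15197:2013 thresholds, within ±15 mg/dl of the reference below 100 mg/dl and within ±15% at or above it, for at least 95% of readings; treat the numbers as this summary's assumption, since the standard's full protocol (including consensus-error-grid criteria) is not reproduced:

```python
def iso15197_2013_pass(meter, reference):
    """Return (fraction of readings within the accuracy bound, whether
    the 95% criterion is met). Values in mg/dl."""
    ok = 0
    for m, r in zip(meter, reference):
        limit = 15.0 if r < 100 else 0.15 * r   # assumed 2013 thresholds
        ok += abs(m - r) <= limit
    frac = ok / len(reference)
    return frac, frac >= 0.95
```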

  8. Randomizer for High Data Rates

    Science.gov (United States)

    Garon, Howard; Sank, Victor J.

    2018-01-01

    NASA, as well as a number of other space agencies, now recognizes that the currently recommended CCSDS randomizer used for telemetry (TM) is too short. When multiple applications of the PN8 Maximal Length Sequence (MLS) are required in order to fully cover a channel access data unit (CADU), spectral problems in the form of elevated spurious discretes (spurs) appear. Originally the randomizer was called a bit transition generator (BTG) precisely because it was thought that its primary value was to ensure sufficient bit transitions to allow the bit/symbol synchronizer to lock and remain locked. We have shown that the old BTG concept is a limited view of the real value of the randomizer sequence: the randomizer also aids in signal acquisition and minimizes the potential for false decoder lock. Under the guidelines considered here, there are multiple maximal length sequences over GF(2) that appear attractive in this application. Although there may be mitigating reasons why another MLS could be selected, one sequence in particular possesses a combination of desired properties that sets it apart from the others.
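    A maximal length sequence such as PN8 is produced by a Fibonacci linear-feedback shift register whose feedback taps form a primitive polynomial over GF(2). The sketch below uses the illustrative tap set (8, 6, 5, 4), not necessarily the CCSDS polynomial, and checks the defining MLS property: the register visits all 2^8 − 1 nonzero states before repeating:

```python
def lfsr_period(taps, nbits, seed=1):
    """Count distinct states a Fibonacci LFSR visits before repeating.
    With a primitive feedback polynomial the count is 2**nbits - 1."""
    mask = (1 << nbits) - 1
    state, seen = seed, set()
    while state not in seen:
        seen.add(state)
        fb = 0
        for t in taps:                        # XOR the tapped bit positions
            fb ^= (state >> (t - 1)) & 1
        state = ((state << 1) | fb) & mask    # shift left, feed back into LSB
    return len(seen)

# Taps (8, 6, 5, 4) are one maximal-length choice for an 8-bit register
period = lfsr_period((8, 6, 5, 4), 8)         # 255 = 2**8 - 1
```

    A non-primitive tap set would fall short of the full period, which is exactly the spectral and acquisition deficiency the longer-sequence proposal addresses.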

  9. Interference-aware random beam selection schemes for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed; Qaraqe, Khalid; Alouini, Mohamed-Slim

    2012-01-01

    users. In this work, we develop interference-aware random beam selection schemes that provide enhanced performance for the secondary network under the condition that the interference observed by the receivers of the primary network is below a

  10. Continuous-Time Mean-Variance Portfolio Selection with Random Horizon

    International Nuclear Information System (INIS)

    Yu, Zhiyong

    2013-01-01

    This paper examines the continuous-time mean-variance optimal portfolio selection problem with random market parameters and random time horizon. Treating this problem as a linearly constrained stochastic linear-quadratic optimal control problem, I explicitly derive the efficient portfolios and efficient frontier in closed forms based on the solutions of two backward stochastic differential equations. Some related issues such as a minimum variance portfolio and a mutual fund theorem are also addressed. All the results are markedly different from those in the problem with deterministic exit time. A key part of my analysis involves proving the global solvability of a stochastic Riccati equation, which is interesting in its own right
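    The problem class can be written out explicitly; the notation below is standard mean-variance notation, mine rather than necessarily the paper's: x(t) is wealth, π(t) the amounts held in the risky assets, r the interest rate, b the excess returns, σ the volatility, and z the target mean.

```latex
% Mean-variance portfolio selection; the paper's contribution is solving
% this with a *random* exit time \tau in place of a deterministic T.
\begin{aligned}
&\min_{\pi(\cdot)} \ \operatorname{Var}\left[x(\tau)\right]
\quad \text{subject to} \quad \mathbb{E}\left[x(\tau)\right] = z,\\
&dx(t) = \bigl[r(t)\,x(t) + b(t)^{\top}\pi(t)\bigr]\,dt
       + \pi(t)^{\top}\sigma(t)\,dW(t), \qquad x(0) = x_0 .
\end{aligned}
```

    With a deterministic exit time this is the classical linearly constrained stochastic LQ problem; the paper solves it when τ is random, expressing the efficient frontier through two backward stochastic differential equations and a stochastic Riccati equation.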

  11. Continuous-Time Mean-Variance Portfolio Selection with Random Horizon

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Zhiyong, E-mail: yuzhiyong@sdu.edu.cn [Shandong University, School of Mathematics (China)

    2013-12-15

    This paper examines the continuous-time mean-variance optimal portfolio selection problem with random market parameters and random time horizon. Treating this problem as a linearly constrained stochastic linear-quadratic optimal control problem, I explicitly derive the efficient portfolios and efficient frontier in closed forms based on the solutions of two backward stochastic differential equations. Some related issues such as a minimum variance portfolio and a mutual fund theorem are also addressed. All the results are markedly different from those in the problem with deterministic exit time. A key part of my analysis involves proving the global solvability of a stochastic Riccati equation, which is interesting in its own right.

  12. TEHRAN AIR POLLUTANTS PREDICTION BASED ON RANDOM FOREST FEATURE SELECTION METHOD

    Directory of Open Access Journals (Sweden)

    A. Shamsoddini

    2017-09-01

    Full Text Available Air pollution, as one of the most serious forms of environmental pollution, poses a huge threat to human life. Air pollution leads to environmental instability and has harmful and undesirable effects on the environment. Modern methods for predicting pollutant concentrations can improve decision making and provide appropriate solutions. This study examines the performance of Random Forest feature selection in combination with multiple linear regression and Multilayer Perceptron Artificial Neural Network methods, in order to achieve an efficient model to estimate carbon monoxide, nitrogen dioxide, sulfur dioxide, and PM2.5 contents in the air. The results indicated that Artificial Neural Networks fed with the attributes chosen by the Random Forest feature selection method performed more accurately than the other models for all pollutants. The estimation accuracy for sulfur dioxide was lower than for the other air contaminants, whereas nitrogen dioxide was predicted more accurately than the other pollutants.
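    The two-stage scheme this abstract describes — rank candidate inputs with a random forest, then feed only the top-ranked inputs to a neural network — can be sketched with scikit-learn on synthetic data (the dataset, feature count, and cut-off of five features are illustrative, not the study's):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in for meteorological/temporal predictors of one pollutant
X, y = make_regression(n_samples=500, n_features=20, n_informative=5,
                       noise=1.0, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# Stage 1: rank candidate inputs by random-forest feature importance
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(Xtr, ytr)
top = np.argsort(rf.feature_importances_)[::-1][:5]   # keep the 5 best

# Stage 2: train the neural network only on the selected inputs
mlp = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                   random_state=0).fit(Xtr[:, top], ytr)
r2 = mlp.score(Xte[:, top], yte)   # held-out R^2 of the reduced model
```

    Repeating stage 2 per pollutant (CO, NO2, SO2, PM2.5) with its own selected subset mirrors the study's per-pollutant modeling.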

  13. Tehran Air Pollutants Prediction Based on Random Forest Feature Selection Method

    Science.gov (United States)

    Shamsoddini, A.; Aboodi, M. R.; Karami, J.

    2017-09-01

    Air pollution, as one of the most serious forms of environmental pollution, poses a huge threat to human life. Air pollution leads to environmental instability and has harmful and undesirable effects on the environment. Modern methods for predicting pollutant concentrations can improve decision making and provide appropriate solutions. This study examines the performance of Random Forest feature selection in combination with multiple linear regression and Multilayer Perceptron Artificial Neural Network methods, in order to achieve an efficient model to estimate carbon monoxide, nitrogen dioxide, sulfur dioxide, and PM2.5 contents in the air. The results indicated that Artificial Neural Networks fed with the attributes chosen by the Random Forest feature selection method performed more accurately than the other models for all pollutants. The estimation accuracy for sulfur dioxide was lower than for the other air contaminants, whereas nitrogen dioxide was predicted more accurately than the other pollutants.

  14. Hebbian Learning in a Random Network Captures Selectivity Properties of the Prefrontal Cortex

    Science.gov (United States)

    Lindsay, Grace W.

    2017-01-01

    Complex cognitive behaviors, such as context-switching and rule-following, are thought to be supported by the prefrontal cortex (PFC). Neural activity in the PFC must thus be specialized to specific tasks while retaining flexibility. Nonlinear “mixed” selectivity is an important neurophysiological trait for enabling complex and context-dependent behaviors. Here we investigate (1) the extent to which the PFC exhibits computationally relevant properties, such as mixed selectivity, and (2) how such properties could arise via circuit mechanisms. We show that PFC cells recorded from male and female rhesus macaques during a complex task show a moderate level of specialization and structure that is not replicated by a model wherein cells receive random feedforward inputs. While random connectivity can be effective at generating mixed selectivity, the data show significantly more mixed selectivity than predicted by a model with otherwise matched parameters. A simple Hebbian learning rule applied to the random connectivity, however, increases mixed selectivity and enables the model to match the data more accurately. To explain how learning achieves this, we provide analysis along with a clear geometric interpretation of the impact of learning on selectivity. After learning, the model also matches the data on measures of noise, response density, clustering, and the distribution of selectivities. Of two styles of Hebbian learning tested, the simpler and more biologically plausible option better matches the data. These modeling results provide clues about how neural properties important for cognition can arise in a circuit and make clear experimental predictions regarding how various measures of selectivity would evolve during animal training. SIGNIFICANCE STATEMENT The prefrontal cortex is a brain region believed to support the ability of animals to engage in complex behavior. How neurons in this area respond to stimuli—and in particular, to combinations of stimuli (
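    A toy version of the mechanism described — random feedforward weights reshaped by a Hebbian rule, i.e. weight change proportional to presynaptic times postsynaptic activity, with row normalization to keep weights bounded — might look like the following; the network sizes, stimuli, and learning rate are arbitrary stand-ins, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 20, 50
W = rng.normal(scale=0.1, size=(n_out, n_in))   # random feedforward weights

def response(W, x):
    return np.maximum(W @ x, 0.0)               # rectified population response

# Two stimulus patterns standing in for two task conditions
stim_a = rng.normal(size=n_in)
stim_b = rng.normal(size=n_in)

# Hebbian rule: repeatedly pair each stimulus with the response it evokes
eta = 0.01
for _ in range(100):
    for x in (stim_a, stim_b):
        r = response(W, x)
        W += eta * np.outer(r, x)               # dW proportional to post * pre
        W /= np.linalg.norm(W, axis=1, keepdims=True)  # keep weights bounded

# Per-cell selectivity: response difference between the two conditions
sel = np.abs(response(W, stim_a) - response(W, stim_b))
```

    In the paper the analogous update is applied on top of the random-connectivity model and the resulting selectivity distributions are compared against macaque PFC recordings.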

  15. Performance Evaluation of User Selection Protocols in Random Networks with Energy Harvesting and Hardware Impairments

    Directory of Open Access Journals (Sweden)

    Tan Nhat Nguyen

    2016-01-01

    Full Text Available In this paper, we evaluate the performance of various user selection protocols under the impact of hardware impairments. In the considered protocols, a Base Station (BS) selects one of the available Users (USs) to serve, while the remaining USs harvest energy from the Radio Frequency (RF) signal transmitted by the BS. We assume that all of the USs appear randomly around the BS. In the Random Selection Protocol (RAN), the BS randomly selects a US for data transmission. In the second proposed protocol, named the Minimum Distance Protocol (MIND), the US nearest to the BS is chosen. In the Optimal Selection Protocol (OPT), the US with the highest channel gain to the BS is served. For performance evaluation, we derive exact and asymptotic closed-form expressions for the average Outage Probability (OP) over Rayleigh fading channels. We also consider the average harvested energy per US. Finally, Monte Carlo simulations are performed to verify the theoretical results.
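    The three selection rules can be compared with a quick Monte-Carlo sketch. Uniform user drops in a unit disc, Rayleigh fading, and a path-loss exponent of 3 are my assumptions here, and hardware impairments and energy harvesting are omitted; OPT can never do worse than MIND per trial, and MIND in turn beats RAN on average:

```python
import random, math

def simulate_outage(n_users=5, gamma_th=1.0, trials=20000, seed=1):
    """Monte-Carlo outage probability of RAN/MIND/OPT selection over
    Rayleigh fading with distance-dependent mean channel gain."""
    rng = random.Random(seed)
    out = {"RAN": 0, "MIND": 0, "OPT": 0}
    for _ in range(trials):
        # users dropped uniformly in a disc of radius 1 around the BS
        d = [math.sqrt(1.0 - rng.random()) for _ in range(n_users)]
        # exponential channel gains with mean d**-3 (path-loss exponent 3)
        g = [rng.expovariate(1.0) * d_i ** -3 for d_i in d]
        picks = {"RAN": rng.randrange(n_users),
                 "MIND": min(range(n_users), key=lambda i: d[i]),
                 "OPT": max(range(n_users), key=lambda i: g[i])}
        for proto, i in picks.items():
            if g[i] < gamma_th:           # SNR below threshold -> outage
                out[proto] += 1
    return {p: c / trials for p, c in out.items()}

op = simulate_outage()
```

    The same harness, extended with an impairment term in the effective SNR, would reproduce the qualitative ordering the closed-form OP expressions predict.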

  16. The reliability of randomly selected final year pharmacy students in ...

    African Journals Online (AJOL)

    Employing ANOVA, factorial experimental analysis, and the theory of error, reliability studies were conducted on the assessment of the drug product chloroquine phosphate tablets. The G–Study employed equal numbers of the factors for uniform control, and involved three analysts (randomly selected final year Pharmacy ...

  17. Random selection of items. Selection of n1 samples among N items composing a stratum

    International Nuclear Information System (INIS)

    Jaech, J.L.; Lemaire, R.J.

    1987-02-01

    STR-224 provides generalized procedures to determine required sample sizes, for instance in the course of a Physical Inventory Verification at Bulk Handling Facilities. The present report describes procedures to generate random numbers and select groups of items to be verified in a given stratum through each of the measurement methods involved in the verification. (author). 3 refs
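    The selection step itself reduces to drawing a simple random sample of n1 items without replacement from the N items composing the stratum; a minimal sketch (the item labels are invented for illustration):

```python
import random

def select_items(stratum_ids, n1, seed=None):
    """Draw a simple random sample of n1 items, without replacement,
    from the N items composing a stratum."""
    if n1 > len(stratum_ids):
        raise ValueError("sample size exceeds stratum size")
    rng = random.Random(seed)            # seeded for a reproducible draw
    return sorted(rng.sample(stratum_ids, n1))

# e.g. select 10 of the 80 items in a stratum for one measurement method
chosen = select_items([f"ITEM-{i:03d}" for i in range(1, 81)], 10, seed=42)
```

    Repeating the draw with an independent seed per measurement method gives each method its own random subset, as the procedure describes.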

  18. The mathematics of random mutation and natural selection for multiple simultaneous selection pressures and the evolution of antimicrobial drug resistance.

    Science.gov (United States)

    Kleinman, Alan

    2016-12-20

    The random mutation and natural selection phenomenon acts in a mathematically predictable manner, which, when understood, leads to approaches that reduce and prevent the failure of these selection pressures when treating infections and cancers. The underlying principle for impairing the random mutation and natural selection phenomenon is to use combination therapy, which forces the population to evolve against multiple selection pressures simultaneously, invoking the multiplication rule of probabilities. Recently, it has been seen that combination therapy for the treatment of malaria has failed to prevent the emergence of drug-resistant variants. Using this empirical example and the principles of probability theory, the equations describing this treatment failure are derived. These equations give guidance on how to use combination therapy for the treatment of cancers and infectious diseases and how to prevent the emergence of drug resistance. Copyright © 2016 John Wiley & Sons, Ltd.
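    The multiplication rule invoked here is easy to make concrete: if per-drug resistance mutations arise independently, the probability that a single organism resists all drugs at once is the product of the per-drug probabilities, and the chance that any organism in a large population does so follows by complementation. The mutation rates and population size below are illustrative only:

```python
import math

def p_any_fully_resistant(per_drug_probs, n_pop):
    """Probability that at least one of n_pop organisms carries resistance
    mutations for every drug simultaneously. Independence -> multiply the
    per-drug probabilities; then take the complement over the population
    (expm1/log1p keep the tiny probabilities numerically accurate)."""
    p_all = 1.0
    for p in per_drug_probs:
        p_all *= p
    return -math.expm1(n_pop * math.log1p(-p_all))

mono  = p_any_fully_resistant([1e-9], 1e10)        # monotherapy: near-certain
combo = p_any_fully_resistant([1e-9, 1e-9], 1e10)  # two drugs: vanishingly rare
```

    The contrast between `mono` and `combo` is the whole argument for simultaneous (rather than sequential) selection pressures: sequential use lets the population defeat one drug's probability term at a time.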

  19. Integrated Behavior Therapy for Selective Mutism: a randomized controlled pilot study.

    Science.gov (United States)

    Bergman, R Lindsey; Gonzalez, Araceli; Piacentini, John; Keller, Melody L

    2013-10-01

    To evaluate the feasibility, acceptability, and preliminary efficacy of a novel behavioral intervention for reducing symptoms of selective mutism and increasing functional speech. A total of 21 children ages 4 to 8 with primary selective mutism were randomized to 24 weeks of Integrated Behavior Therapy for Selective Mutism (IBTSM) or a 12-week Waitlist control. Clinical outcomes were assessed using blind independent evaluators, parent- and teacher-report, and an objective behavioral measure. Treatment recipients completed a three-month follow-up to assess durability of treatment gains. Data indicated increased functional speaking behavior post-treatment as rated by parents and teachers, with a high rate of treatment responders as rated by blind independent evaluators (75%). Conversely, children in the Waitlist comparison group did not experience significant improvements in speaking behaviors. Children who received IBTSM also demonstrated significant improvements in number of words spoken at school compared to baseline; however, significant group differences did not emerge. Treatment recipients also experienced significant reductions in social anxiety per parent, but not teacher, report. Clinical gains were maintained over the three-month follow-up. IBTSM appears to be a promising new intervention that is efficacious in increasing functional speaking behaviors, feasible, and acceptable to parents and teachers. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. A Bayesian random effects discrete-choice model for resource selection: Population-level selection inference

    Science.gov (United States)

    Thomas, D.L.; Johnson, D.; Griffith, B.

    2006-01-01

    Modeling the probability of use of land units characterized by discrete and continuous measures, we present a Bayesian random-effects model to assess resource selection. This model provides simultaneous estimation of both individual- and population-level selection. Deviance information criterion (DIC), a Bayesian alternative to AIC that is sample-size specific, is used for model selection. Aerial radiolocation data from 76 adult female caribou (Rangifer tarandus) and calf pairs during 1 year on an Arctic coastal plain calving ground were used to illustrate models and assess population-level selection of landscape attributes, as well as individual heterogeneity of selection. Landscape attributes included elevation, NDVI (a measure of forage greenness), and land cover-type classification. Results from the first of a 2-stage model-selection procedure indicated that there is substantial heterogeneity among cow-calf pairs with respect to selection of the landscape attributes. In the second stage, selection of models that included heterogeneity indicated that, at the population level, NDVI and land cover class were significant attributes for selection of different landscapes by pairs on the calving ground. Population-level selection coefficients indicate that the pairs generally select landscapes with higher levels of NDVI, but the relationship is quadratic. The highest rate of selection occurs at values of NDVI less than the maximum observed. Results for land cover-class selection coefficients indicate that wet sedge, moist sedge, herbaceous tussock tundra, and shrub tussock tundra are selected at approximately the same rate, while alpine and sparsely vegetated landscapes are selected at a lower rate. Furthermore, the variability in selection by individual caribou for moist sedge and sparsely vegetated landscapes is large relative to the variability in selection of other land cover types.
The example analysis illustrates that, while sometimes computationally intense, a

  1. Inference for feature selection using the Lasso with high-dimensional data

    DEFF Research Database (Denmark)

    Brink-Jensen, Kasper; Ekstrøm, Claus Thorn

    2014-01-01

    Penalized regression models such as the Lasso have proved useful for variable selection in many fields, especially in situations with high-dimensional data where the number of predictors far exceeds the number of observations. These methods identify and rank variables of importance but do not generally provide any inference for the selected variables. Thus, the variables selected might be the "most important" but need not be significant. We propose a significance test for the selection found by the Lasso. We introduce a procedure that computes inference and p-values for features chosen by the Lasso. This method rephrases the null hypothesis and uses a randomization approach which ensures that the error rate is controlled even for small samples. We demonstrate the ability of the algorithm to compute p-values of the expected magnitude with simulated data using a multitude of scenarios...
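    One way to picture such a randomization approach (a simplified stand-in, not the authors' exact procedure) is to compare the strongest Lasso coefficient on the real response against its null distribution over permuted responses:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 60, 100                           # p >> n, the Lasso's home turf
X = rng.normal(size=(n, p))
y = 2.0 * X[:, 0] + rng.normal(size=n)   # only feature 0 is truly active

lasso = Lasso(alpha=0.1).fit(X, y)
j = int(np.argmax(np.abs(lasso.coef_)))  # strongest selected feature
stat = abs(lasso.coef_[j])

# Randomization null: refit on permuted responses, record the max coefficient
null = []
for _ in range(100):
    y_perm = rng.permutation(y)
    null.append(np.max(np.abs(Lasso(alpha=0.1).fit(X, y_perm).coef_)))
pval = (1 + sum(s >= stat for s in null)) / (1 + len(null))
```

    Because the null is built from refits on permuted data, the resulting p-value controls the error rate by construction even at this small sample size, which is the property the abstract emphasizes.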

  2. Genome-wide association data classification and SNPs selection using two-stage quality-based Random Forests.

    Science.gov (United States)

    Nguyen, Thanh-Tung; Huang, Joshua; Wu, Qingyao; Nguyen, Thuy; Li, Mark

    2015-01-01

    Single-nucleotide polymorphism (SNP) selection and identification are the most important tasks in genome-wide association data analysis. The problem is difficult because genome-wide association data are very high dimensional and a large portion of the SNPs in the data are irrelevant to the disease. Advanced machine learning methods have been successfully used in genome-wide association studies (GWAS) for identification of genetic variants that have relatively big effects in some common, complex diseases. Among them, the most successful one is Random Forests (RF). Despite performing well in terms of prediction accuracy on some data sets of moderate size, RF still struggles in GWAS with selecting informative SNPs and building accurate prediction models. In this paper, we propose a new two-stage quality-based sampling method for random forests, named ts-RF, for SNP subspace selection in GWAS. The method first applies p-value assessment to find a cut-off point that separates informative and irrelevant SNPs into two groups. The informative SNP group is further divided into two sub-groups: highly informative and weakly informative SNPs. When sampling the SNP subspace for building trees for the forest, only SNPs from these two sub-groups are taken into account. The feature subspaces always contain highly informative SNPs when used to split a node of a tree. This approach enables one to generate more accurate trees with a lower prediction error while possibly avoiding overfitting. It allows one to detect interactions of multiple SNPs with the diseases, and to reduce the dimensionality and the amount of genome-wide association data needed for learning the RF model.
Extensive experiments on two genome-wide SNP data sets (Parkinson case-control data comprised of 408,803 SNPs and Alzheimer case-control data comprised of 380,157 SNPs) and 10 gene data sets have demonstrated that the proposed model significantly reduced prediction errors and outperformed
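    The two-stage sampling idea can be sketched in a few lines (the p-value cut-offs and the half-and-half subspace split are placeholders of mine; the paper's actual thresholds and group proportions differ):

```python
import random

def snp_groups(pvals, alpha=1e-3, strong=1e-6):
    """Stage 1 (sketch): p-value cut-offs split SNPs into highly
    informative, weakly informative, and irrelevant groups."""
    high = [i for i, p in enumerate(pvals) if p < strong]
    weak = [i for i, p in enumerate(pvals) if strong <= p < alpha]
    return high, weak          # everything else is treated as irrelevant

def sample_subspace(high, weak, mtry=5, seed=0):
    """Stage 2 (sketch): each node's candidate subspace mixes both groups,
    so every subspace contains some highly informative SNPs."""
    rng = random.Random(seed)
    n_high = max(1, mtry // 2)
    picks = rng.sample(high, min(n_high, len(high)))
    picks += rng.sample(weak, min(mtry - len(picks), len(weak)))
    return picks

pvals = [1e-8, 1e-7, 5e-4, 2e-4, 0.3, 0.8, 1e-9, 0.04]
high, weak = snp_groups(pvals)
subspace = sample_subspace(high, weak)
```

    Irrelevant SNPs never enter a subspace, which is what shrinks the effective dimensionality before any tree is grown.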

  3. Interference-aware random beam selection schemes for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed

    2012-10-19

    Spectrum sharing systems have been recently introduced to alleviate the problem of spectrum scarcity by allowing secondary unlicensed networks to share the spectrum with primary licensed networks under acceptable interference levels to the primary users. In this work, we develop interference-aware random beam selection schemes that provide enhanced performance for the secondary network under the condition that the interference observed by the receivers of the primary network is below a predetermined/acceptable value. We consider a secondary link composed of a transmitter equipped with multiple antennas and a single-antenna receiver sharing the same spectrum with a primary link composed of a single-antenna transmitter and a single-antenna receiver. The proposed schemes select a beam, among a set of power-optimized random beams, that maximizes the signal-to-interference-plus-noise ratio (SINR) of the secondary link while satisfying the primary interference constraint for different levels of feedback information describing the interference level at the primary receiver. For the proposed schemes, we develop a statistical analysis for the SINR statistics as well as the capacity and bit error rate (BER) of the secondary link.
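    The selection rule itself — keep only beams whose interference at the primary receiver is acceptable, then take the SINR-maximizing survivor — can be sketched as follows (unit-mean exponential gains model the Rayleigh fading; the beam count, noise level, and interference threshold are arbitrary choices of mine):

```python
import random

def select_beam(n_beams=8, i_max=0.5, noise=0.1, seed=3):
    """Sketch: among random beams whose interference to the primary
    receiver stays below i_max, pick the one maximizing secondary SINR."""
    rng = random.Random(seed)
    best, best_sinr = None, -1.0
    for b in range(n_beams):
        g_sec = rng.expovariate(1.0)   # beam gain toward secondary receiver
        g_pri = rng.expovariate(1.0)   # beam gain toward primary receiver
        if g_pri > i_max:              # violates the primary constraint
            continue
        sinr = g_sec / (noise + g_pri)
        if sinr > best_sinr:
            best, best_sinr = b, sinr
    return best, best_sinr             # (None, -1.0) if no beam qualifies

beam, sinr = select_beam()
```

    The paper's schemes differ in how much feedback about `g_pri` is available at the secondary transmitter; this sketch corresponds to the full-information case.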

  4. Topology-selective jamming of fully-connected, code-division random-access networks

    Science.gov (United States)

    Polydoros, Andreas; Cheng, Unjeng

    1990-01-01

    The purpose is to introduce certain models of topology selective stochastic jamming and examine its impact on a class of fully-connected, spread-spectrum, slotted ALOHA-type random access networks. The theory covers dedicated as well as half-duplex units. The dominant role of the spatial duty factor is established, and connections with the dual concept of time selective jamming are discussed. The optimal choices of coding rate and link access parameters (from the users' side) and the jamming spatial fraction are numerically established for DS and FH spreading.

  5. Peculiarities of the statistics of spectrally selected fluorescence radiation in laser-pumped dye-doped random media

    Science.gov (United States)

    Yuvchenko, S. A.; Ushakova, E. V.; Pavlova, M. V.; Alonova, M. V.; Zimnyakov, D. A.

    2018-04-01

    We consider the practical realization of a new optical probe method for random media, defined as reference-free path-length interferometry with intensity-moments analysis. A peculiarity in the statistics of the spectrally selected fluorescence radiation in a laser-pumped dye-doped random medium is discussed. Previously established correlations between the second- and third-order moments of the intensity fluctuations in the random interference patterns, the coherence function of the probe radiation, and the path-difference probability density for the interfering partial waves in the medium are confirmed. The correlations were verified using statistical analysis of the spectrally selected fluorescence radiation emitted by a laser-pumped dye-doped random medium. A water solution of Rhodamine 6G was applied as the doping fluorescent agent for ensembles of densely packed silica grains, which were pumped by the 532 nm radiation of a solid-state laser. The spectrum of the mean path length for a random medium was reconstructed.

  6. Random forest variable selection in spatial malaria transmission modelling in Mpumalanga Province, South Africa

    Directory of Open Access Journals (Sweden)

    Thandi Kapwata

    2016-11-01

    Full Text Available Malaria is an environmentally driven disease. In order to quantify the spatial variability of malaria transmission, it is imperative to understand the interactions between environmental variables and malaria epidemiology at a micro-geographic level using a novel statistical approach. The random forest (RF) statistical learning method, a relatively new variable-importance ranking method, measures the importance of potentially influential parameters through the percent increase of the mean squared error. As this value increases, so does the relative importance of the associated variable. The principal aim of this study was to create predictive malaria maps generated using the variables selected by the RF algorithm in the Ehlanzeni District of Mpumalanga Province, South Africa. Of the seven environmental variables used [temperature, lag temperature, rainfall, lag rainfall, humidity, altitude, and the normalized difference vegetation index (NDVI)], altitude was identified as the most influential predictor variable due to its high selection frequency. It was selected as the top predictor for 4 out of 12 months of the year, followed by NDVI, temperature and lag rainfall, which were each selected twice. The combination of climatic variables that produced the highest prediction accuracy was altitude, NDVI, and temperature. This suggests that these three variables have high predictive capabilities in relation to malaria transmission. Furthermore, it is anticipated that the predictive maps generated from predictions made by the RF algorithm could be used to monitor the progression of malaria and assist in intervention and prevention efforts with respect to malaria.

  7. Using Random Forests to Select Optimal Input Variables for Short-Term Wind Speed Forecasting Models

    Directory of Open Access Journals (Sweden)

    Hui Wang

    2017-10-01

    Full Text Available Achieving relatively high-accuracy short-term wind speed forecasts is a precondition for the construction and grid-connected operation of wind power forecasting systems for wind farms. Currently, most research is focused on the structure of forecasting models and does not consider the selection of input variables, which can have significant impacts on forecasting performance. This paper presents an input variable selection method for wind speed forecasting models. The candidate input variables for various leading periods are selected and random forests (RF) is employed to evaluate the importance of all variables as features. The feature subset with the best evaluation performance is selected as the optimal feature set. Then, a kernel-based extreme learning machine is constructed to evaluate the performance of input variable selection based on RF. The results of the case study show that by removing the uncorrelated and redundant features, RF effectively extracts the most strongly correlated set of features from the candidate input variables. By finding the optimal feature combination to represent the original information, RF simplifies the structure of the wind speed forecasting model, shortens the training time required, and substantially improves the model's accuracy and generalization ability, demonstrating that the input variables selected by RF are effective.

  8. PUFKEY: A High-Security and High-Throughput Hardware True Random Number Generator for Sensor Networks

    Directory of Open Access Journals (Sweden)

    Dongfang Li

    2015-10-01

    Full Text Available Random number generators (RNG) play an important role in many sensor network systems and applications, such as those requiring secure and robust communications. In this paper, we develop a high-security and high-throughput hardware true random number generator, called PUFKEY, which consists of two kinds of physical unclonable function (PUF) elements. Combined with a conditioning algorithm, true random seeds are extracted from the noise on the start-up pattern of SRAM memories. These true random seeds contain full entropy. Then, the true random seeds are used as the input for a non-deterministic hardware RNG to generate a stream of true random bits with a throughput as high as 803 Mbps. The experimental results show that the bitstream generated by the proposed PUFKEY can pass all standard national institute of standards and technology (NIST) randomness tests and is resilient to a wide range of security attacks.
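    The seed-extraction pipeline described — noisy SRAM start-up pattern in, conditioned full-entropy seed out — can be caricatured as below. The noise model is invented for illustration, and SHA-256 stands in for the paper's conditioning algorithm, which is not specified here:

```python
import hashlib, random

def sram_startup(n_bytes=1024, flip=0.05, cycle_seed=1):
    """Model an SRAM power-up pattern: a stable per-device fingerprint
    plus a few noisy cells that flip on each power cycle."""
    device = random.Random(0)                    # fixed device fingerprint
    base = bytes(device.getrandbits(8) for _ in range(n_bytes))
    noise = random.Random(cycle_seed)            # this power-up's noise
    def flip_byte(b):
        mask = 0
        for i in range(8):                       # flip each bit w.p. `flip`
            if noise.random() < flip:
                mask |= 1 << i
        return b ^ mask
    return bytes(flip_byte(b) for b in base)

def condition(pattern):
    """Conditioning step: compress the noisy pattern into a 256-bit seed."""
    return hashlib.sha256(pattern).digest()

seed_a = condition(sram_startup(cycle_seed=1))   # seed from one power cycle
seed_b = condition(sram_startup(cycle_seed=2))   # seed from another cycle
```

    In PUFKEY these seeds then feed a non-deterministic hardware RNG; the point of the sketch is only that cycle-to-cycle cell noise, once conditioned, yields a fresh seed per power-up.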

  9. PUFKEY: a high-security and high-throughput hardware true random number generator for sensor networks.

    Science.gov (United States)

    Li, Dongfang; Lu, Zhaojun; Zou, Xuecheng; Liu, Zhenglin

    2015-10-16

    Random number generators (RNG) play an important role in many sensor network systems and applications, such as those requiring secure and robust communications. In this paper, we develop a high-security and high-throughput hardware true random number generator, called PUFKEY, which consists of two kinds of physical unclonable function (PUF) elements. Combined with a conditioning algorithm, true random seeds are extracted from the noise on the start-up pattern of SRAM memories. These true random seeds contain full entropy. Then, the true random seeds are used as the input for a non-deterministic hardware RNG to generate a stream of true random bits with a throughput as high as 803 Mbps. The experimental results show that the bitstream generated by the proposed PUFKEY can pass all standard national institute of standards and technology (NIST) randomness tests and is resilient to a wide range of security attacks.

  10. Towards a high-speed quantum random number generator

    Science.gov (United States)

    Stucki, Damien; Burri, Samuel; Charbon, Edoardo; Chunnilall, Christopher; Meneghetti, Alessio; Regazzoni, Francesco

    2013-10-01

    Randomness is of fundamental importance in various fields, such as cryptography, numerical simulations, or the gaming industry. Quantum physics, which is fundamentally probabilistic, is the best option for a physical random number generator. In this article, we will present the work carried out in various projects in the context of the development of a commercial and certified high speed random number generator.

  11. High-speed, random-access fluorescence microscopy: I. High-resolution optical recording with voltage-sensitive dyes and ion indicators.

    Science.gov (United States)

    Bullen, A; Patel, S S; Saggau, P

    1997-07-01

    The design and implementation of a high-speed, random-access, laser-scanning fluorescence microscope configured to record fast physiological signals from small neuronal structures with high spatiotemporal resolution is presented. The laser-scanning capability of this nonimaging microscope is provided by two orthogonal acousto-optic deflectors under computer control. Each scanning point can be randomly accessed and has a positioning time of 3-5 microseconds. Sampling time is also computer-controlled and can be varied to maximize the signal-to-noise ratio. Acquisition rates up to 200k samples/s at 16-bit digitizing resolution are possible. The spatial resolution of this instrument is determined by the minimal spot size at the level of the preparation (i.e., 2-7 microns). Scanning points are selected interactively from a reference image collected with differential interference contrast optics and a video camera. Frame rates up to 5 kHz are easily attainable. Intrinsic variations in laser light intensity and scanning spot brightness are overcome by an on-line signal-processing scheme. Representative records obtained with this instrument by using voltage-sensitive dyes and calcium indicators demonstrate the ability to make fast, high-fidelity measurements of membrane potential and intracellular calcium at high spatial resolution (2 microns) without any temporal averaging.
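    The quoted frame rates follow from simple arithmetic: each scanned site costs one acousto-optic repositioning interval (3-5 microseconds) plus one dwell/sampling interval. For example, 20 sites at 10 microseconds per site permit 5 kHz frames:

```python
def max_frame_rate(n_points, reposition_us=5.0, dwell_us=5.0):
    """Upper bound on frame rate when each of n_points costs one AOD
    repositioning interval plus one dwell/sampling interval per frame."""
    frame_us = n_points * (reposition_us + dwell_us)
    return 1e6 / frame_us          # frames per second

rate = max_frame_rate(20)          # 20 sites x 10 us = 200 us -> 5000 Hz
```

    The dwell time is the tunable term: lengthening it trades frame rate for signal-to-noise ratio, which is exactly the computer-controlled sampling-time adjustment the abstract describes.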

  12. Using ArcMap, Google Earth, and Global Positioning Systems to select and locate random households in rural Haiti.

    Science.gov (United States)

    Wampler, Peter J; Rediske, Richard R; Molla, Azizur R

    2013-01-18

    A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground-based surveys to collect household location data and can be performed in a rapid, cost-effective manner. The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. A total of 537 homes initially were mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed. 16.4% of the homes visited were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. The method used to generate and field locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent and only rarely was local knowledge required to identify and locate households.
This

  13. Using ArcMap, Google Earth, and Global Positioning Systems to select and locate random households in rural Haiti

    Directory of Open Access Journals (Sweden)

    Wampler Peter J

    2013-01-01

    Full Text Available Abstract Background A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground-based surveys to collect household location data and can be performed in a rapid, cost-effective manner. Methods The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel, which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. Results A total of 537 homes initially were mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed. 16.4% of the homes visited were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. Conclusions The method used to generate and field locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent, and only rarely was local knowledge required to identify and locate households.
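The sampling workflow described in the record above (map all homes, then draw a random subset for field visits) can be sketched in Python; the coordinates and seed below are hypothetical stand-ins for the Google Earth/Excel workflow, not the authors' code:

```python
import random

# Hypothetical mapped homes: id -> (latitude, longitude), as digitized in Google Earth.
homes = {f"H{i:03d}": (18.9 + random.random() * 0.1, -72.5 + random.random() * 0.1)
         for i in range(537)}

# Draw a reproducible random subset of 96 candidate survey locations,
# mirroring the 96-of-537 selection in the study.
rng = random.Random(2012)
survey_ids = rng.sample(sorted(homes), k=96)

# Waypoints in the order they would be loaded onto a handheld GPS unit.
waypoints = [(hid, homes[hid]) for hid in survey_ids]
print(len(waypoints))  # 96 unique households
```

Sorting the home IDs before sampling keeps the draw reproducible regardless of dictionary iteration order.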

  14. Optimizing Event Selection with the Random Grid Search

    Energy Technology Data Exchange (ETDEWEB)

    Bhat, Pushpalatha C. [Fermilab; Prosper, Harrison B. [Florida State U.; Sekmen, Sezen [Kyungpook Natl. U.; Stewart, Chip [Broad Inst., Cambridge

    2017-06-29

    The random grid search (RGS) is a simple, but efficient, stochastic algorithm to find optimal cuts that was developed in the context of the search for the top quark at Fermilab in the mid-1990s. The algorithm, and associated code, have been enhanced recently with the introduction of two new cut types, one of which has been successfully used in searches for supersymmetry at the Large Hadron Collider. The RGS optimization algorithm is described along with the recent developments, which are illustrated with two examples from particle physics. One explores the optimization of the selection of vector boson fusion events in the four-lepton decay mode of the Higgs boson and the other optimizes SUSY searches using boosted objects and the razor variables.
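The core RGS idea, drawing candidate cut values from the signal events themselves rather than from a regular grid, can be sketched as follows; the toy Gaussian data and the s/sqrt(s+b) figure of merit are illustrative assumptions, not the authors' implementation:

```python
import math
import random

rng = random.Random(7)

# Toy 2D events: signal peaks at higher values of both variables than background.
signal = [(rng.gauss(2.0, 1.0), rng.gauss(2.0, 1.0)) for _ in range(500)]
background = [(rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)) for _ in range(5000)]

def passes(event, cuts):
    """One-sided cuts: keep the event if every variable exceeds its threshold."""
    return all(x > c for x, c in zip(event, cuts))

best_fom, best_cuts = -1.0, None
# Each signal event defines one candidate set of cut values: the "random grid".
for cuts in signal:
    s = sum(passes(e, cuts) for e in signal)
    b = sum(passes(e, cuts) for e in background)
    if s + b == 0:
        continue
    fom = s / math.sqrt(s + b)  # simple significance-like figure of merit
    if fom > best_fom:
        best_fom, best_cuts = fom, cuts

print(best_cuts, round(best_fom, 2))
```

Because candidates are drawn from the signal distribution itself, the search concentrates effort where signal-rich cuts actually lie, which is what makes RGS efficient compared with a regular grid.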

  15. Non-random mating for selection with restricted rates of inbreeding and overlapping generations

    NARCIS (Netherlands)

    Sonesson, A.K.; Meuwissen, T.H.E.

    2002-01-01

    Minimum coancestry mating with a maximum of one offspring per mating pair (MC1) is compared with random mating schemes for populations with overlapping generations. Optimum contribution selection is used, whereby ΔF is restricted. For schemes with ΔF restricted to 0.25% per

  16. Geography and genography: prediction of continental origin using randomly selected single nucleotide polymorphisms

    Directory of Open Access Journals (Sweden)

    Ramoni Marco F

    2007-03-01

    Full Text Available Abstract Background Recent studies have shown that when individuals are grouped on the basis of genetic similarity, group membership corresponds closely to continental origin. There has been considerable debate about the implications of these findings in the context of larger debates about race and the extent of genetic variation between groups. Some have argued that clustering according to continental origin demonstrates the existence of significant genetic differences between groups and that these differences may have important implications for differences in health and disease. Others argue that clustering according to continental origin requires the use of large amounts of genetic data or specifically chosen markers and is indicative only of very subtle genetic differences that are unlikely to have biomedical significance. Results We used small numbers of randomly selected single nucleotide polymorphisms (SNPs from the International HapMap Project to train naïve Bayes classifiers for prediction of ancestral continent of origin. Predictive accuracy was tested on two independent data sets. Genetically similar groups should be difficult to distinguish, especially if only a small number of genetic markers are used. The genetic differences between continentally defined groups are sufficiently large that one can accurately predict ancestral continent of origin using only a minute, randomly selected fraction of the genetic variation present in the human genome. Genotype data from only 50 random SNPs was sufficient to predict ancestral continent of origin in our primary test data set with an average accuracy of 95%. Genetic variations informative about ancestry were common and widely distributed throughout the genome. Conclusion Accurate characterization of ancestry is possible using small numbers of randomly selected SNPs. 
The results presented here show how investigators conducting genetic association studies can use small numbers of arbitrarily
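A toy version of the classifier described above, naïve Bayes over a handful of SNP genotypes coded as 0/1/2 copies of the minor allele, might look like this; the allele frequencies are invented for illustration (the study used HapMap data):

```python
import math
import random
from collections import defaultdict

rng = random.Random(0)

def simulate(freqs, n):
    """Draw n individuals; genotype at each SNP is Binomial(2, allele frequency)."""
    return [[sum(rng.random() < f for _ in range(2)) for f in freqs] for _ in range(n)]

# Invented minor-allele frequencies for two hypothetical continental groups.
freqs_a = [0.1, 0.8, 0.3, 0.7, 0.2]
freqs_b = [0.6, 0.2, 0.7, 0.3, 0.8]
train = {"A": simulate(freqs_a, 200), "B": simulate(freqs_b, 200)}

# Per-class, per-SNP genotype counts for the naive Bayes model.
counts = {c: [defaultdict(int) for _ in freqs_a] for c in train}
for c, people in train.items():
    for person in people:
        for i, g in enumerate(person):
            counts[c][i][g] += 1

def predict(person):
    best, best_lp = None, -math.inf
    for c in train:
        lp = 0.0
        for i, g in enumerate(person):
            # Laplace smoothing over the 3 genotype levels (0, 1, 2 copies).
            lp += math.log((counts[c][i][g] + 1) / (len(train[c]) + 3))
        if lp > best_lp:
            best, best_lp = c, lp
    return best

held_out = [(p, "A") for p in simulate(freqs_a, 100)] + [(p, "B") for p in simulate(freqs_b, 100)]
accuracy = sum(predict(p) == label for p, label in held_out) / len(held_out)
print(accuracy)  # high accuracy even with only five simulated SNPs
```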

  17. Causal Effects of Single-Sex Schools on College Entrance Exams and College Attendance: Random Assignment in Seoul High Schools

    OpenAIRE

    Park, Hyunjoon; Behrman, Jere R.; Choi, Jaesung

    2013-01-01

    Despite the voluminous literature on the potentials of single-sex schools, there is no consensus on the effects of single-sex schools because of student selection of school types. We exploit a unique feature of schooling in Seoul—the random assignment of students into single-sex versus coeducational high schools—to assess causal effects of single-sex schools on college entrance exam scores and college attendance. Our validation of the random assignment shows comparable socioeconomic backgroun...

  18. High-power random distributed feedback fiber laser: From science to application

    Energy Technology Data Exchange (ETDEWEB)

    Du, Xueyuan [College of Optoelectronic Science and Engineering, National University of Defense Technology, Changsha 410073 (China); Naval Academy of Armament, Beijing 100161 (China); Zhang, Hanwei; Xiao, Hu; Ma, Pengfei; Wang, Xiaolin; Zhou, Pu; Liu, Zejin [College of Optoelectronic Science and Engineering, National University of Defense Technology, Changsha 410073 (China)

    2016-10-15

    A fiber laser based on random distributed feedback has attracted increasing attention in recent years, as it has become an important photonic device and has found wide applications in fiber communications and sensing. In this article, recent advances in high-power random distributed feedback fiber lasers are reviewed, including theoretical analyses, experimental approaches, a discussion of practical applications, and an outlook. It is found that a random distributed feedback fiber laser can not only act as an information photonics device, but is also feasible for high-efficiency/high-power generation, which makes it competitive with conventional high-power laser sources. In addition, the high-power random distributed feedback fiber laser has been successfully applied for mid-infrared lasing, frequency doubling to the visible, and high-quality imaging. It is believed that the high-power random distributed feedback fiber laser could become a promising light source with simple and economic configurations. (copyright 2016 by WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  19. Fuzzy Random λ-Mean SAD Portfolio Selection Problem: An Ant Colony Optimization Approach

    Science.gov (United States)

    Thakur, Gour Sundar Mitra; Bhattacharyya, Rupak; Mitra, Swapan Kumar

    2010-10-01

    To reach the investment goal, one has to select a combination of securities among different portfolios containing large numbers of securities. The past records of each security alone do not guarantee the future return. As there are many uncertain factors which directly or indirectly influence the stock market, and there are also some newer stock markets which do not have enough historical data, experts' expectation and experience must be combined with the past records to generate an effective portfolio selection model. In this paper the return of a security is assumed to be a Fuzzy Random Variable Set (FRVS), where returns are sets of random numbers which are in turn fuzzy numbers. A new λ-Mean Semi Absolute Deviation (λ-MSAD) portfolio selection model is developed. The subjective opinions of the investors on the rate of return of each security are taken into consideration by introducing a pessimistic-optimistic parameter vector λ. The λ-MSAD model is preferred as it uses the absolute deviation of the rate of return of a portfolio, instead of the variance, as the measure of risk. As this model can be reduced to a Linear Programming Problem (LPP), it can be solved much faster than quadratic programming problems. Ant Colony Optimization (ACO) is used for solving the portfolio selection problem. ACO is a paradigm for designing meta-heuristic algorithms for combinatorial optimization problems. Data from the BSE is used for illustration.
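The risk measure underlying the model, the semi-absolute deviation of portfolio returns below their mean, is easy to state in code; the return series is a toy example and the λ weighting of expert views is omitted:

```python
# Toy historical returns for three securities over six periods (invented numbers).
returns = [
    [0.02, -0.01, 0.03],
    [0.01, 0.02, -0.02],
    [-0.03, 0.01, 0.04],
    [0.02, 0.00, 0.01],
    [0.00, 0.03, -0.01],
    [0.01, -0.02, 0.02],
]
weights = [0.5, 0.3, 0.2]  # a candidate portfolio

# Portfolio return in each period.
port = [sum(w * r for w, r in zip(weights, row)) for row in returns]
mean = sum(port) / len(port)

# Semi-absolute deviation: only shortfalls below the mean count as risk.
sad = sum(max(0.0, mean - r) for r in port) / len(port)
print(round(sad, 5))
```

Because each shortfall term is linear in the weights, minimizing this measure over the weights can be written as a linear program, which is what makes the model faster to solve than a variance-based (quadratic) one.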

  20. Pediatric selective mutism therapy: a randomized controlled trial.

    Science.gov (United States)

    Esposito, Maria; Gimigliano, Francesca; Barillari, Maria R; Precenzano, Francesco; Ruberto, Maria; Sepe, Joseph; Barillari, Umberto; Gimigliano, Raffaele; Militerni, Roberto; Messina, Giovanni; Carotenuto, Marco

    2017-10-01

    Selective mutism (SM) is a rare disease in children coded by DSM-5 as an anxiety disorder. Despite the disabling nature of the disease, there is still no specific treatment. The aims of this study were to verify the efficacy of six-month standard psychomotor treatment and the positive changes in lifestyle in a population of children affected by SM. Randomized controlled trial registered in the European Clinical Trials Registry (EudraCT 2015-001161-36). University third-level Centre (Child and Adolescent Neuropsychiatry Clinic). The study population was composed of 67 children in group A (psychomotricity treatment) (35 M, mean age 7.84±1.15) and 71 children in group B (behavioral and educational counseling) (37 M, mean age 7.75±1.36). Psychomotor treatment was administered by trained child therapists in residential settings three times per week. Each child was treated for the whole period by the same therapist, and all the therapists shared the same protocol. The standard psychomotor session length is 45 minutes. At T0 and after 6 months (T1) of treatment, patients underwent a behavioral and SM severity assessment. To verify the effects of the psychomotor management, the Child Behavior Checklist questionnaire (CBCL) and Selective Mutism Questionnaire (SMQ) were administered to the parents. After 6 months of psychomotor treatment, SM children showed a significant reduction in CBCL scores such as social relations, anxious/depressed, social problems and total problems, suggesting a benefit in selective mutism, even if further studies are needed. The present study identifies psychomotricity as a safe and effective therapy for pediatric selective mutism.

  1. A comparison of random forest and its Gini importance with standard chemometric methods for the feature selection and classification of spectral data

    Directory of Open Access Journals (Sweden)

    Himmelreich Uwe

    2009-07-01

    Full Text Available Abstract Background Regularized regression methods such as principal component or partial least squares regression perform well in learning tasks on high-dimensional spectral data, but cannot explicitly eliminate irrelevant features. The random forest classifier with its associated Gini feature importance, on the other hand, allows for an explicit feature elimination, but may not be optimally adapted to spectral data due to the topology of its constituent classification trees, which are based on orthogonal splits in feature space. Results We propose to combine the best of both approaches, and evaluate the joint use of a feature selection based on a recursive feature elimination using the Gini importance of random forests together with regularized classification methods on spectral data sets from medical diagnostics, chemotaxonomy, biomedical analytics, food science, and synthetically modified spectral data. Here, a feature selection using the Gini feature importance with a regularized classification by discriminant partial least squares regression performed as well as or better than a filtering according to different univariate statistical tests, or using regression coefficients in a backward feature elimination. It outperformed the direct application of the random forest classifier, or the direct application of the regularized classifiers on the full set of features. Conclusion The Gini importance of the random forest provided superior means for measuring feature relevance on spectral data, but – on an optimal subset of features – the regularized classifiers might be preferable over the random forest classifier, in spite of their limitation to model linear dependencies only. A feature selection based on Gini importance, however, may precede a regularized linear classification to identify this optimal subset of features, and to earn a double benefit of both dimensionality reduction and the elimination of noise from the classification task.
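The recursive feature elimination evaluated above follows a simple loop: rank features by importance, drop the weakest fraction, refit, repeat. A generic sketch, with a fixed stand-in importance score in place of a trained random forest's Gini importance:

```python
def recursive_elimination(features, importance_fn, keep_min=2, drop_frac=0.5):
    """Iteratively drop the least important features, as in RF-based RFE.

    features: list of feature names; importance_fn: name -> score
    (in the paper this score is the random forest's Gini importance,
    recomputed after each refit).
    """
    history = [list(features)]
    while len(features) > keep_min:
        ranked = sorted(features, key=importance_fn, reverse=True)
        n_keep = max(keep_min, int(len(ranked) * (1 - drop_frac)))
        features = ranked[:n_keep]
        history.append(list(features))
    return history

# Stand-in importances for eight hypothetical spectral channels.
scores = {"f1": 0.30, "f2": 0.25, "f3": 0.15, "f4": 0.10,
          "f5": 0.08, "f6": 0.06, "f7": 0.04, "f8": 0.02}
history = recursive_elimination(list(scores), scores.get)
print(history[-1])  # the two most important features survive
```

In a real pipeline the importance scores change at every iteration because the forest is refit on the surviving features; the fixed dictionary here only illustrates the elimination schedule.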

  2. Supplementary arterial embolization an option in high-risk ulcer bleeding--a randomized study.

    Science.gov (United States)

    Laursen, Stig Borbjerg; Hansen, Jane Møller; Andersen, Poul Erik; Schaffalitzky de Muckadell, Ove B

    2014-01-01

    One of the major challenges in peptic ulcer bleeding (PUB) is rebleeding, which is associated with up to a fivefold increase in mortality. We examined if supplementary transcatheter arterial embolization (STAE) performed after achieved endoscopic hemostasis improves outcome in patients with high-risk ulcers. The study was designed as a non-blinded, parallel-group, randomized controlled trial and performed in a university hospital setting. Patients admitted with PUB from Forrest Ia - IIb ulcers controlled by endoscopic therapy were randomized (1:1 ratio) to STAE of the bleeding artery within 24 h or continued standard treatment. Randomization was stratified according to stigmata of hemorrhage. Patients were followed for 30 days. The primary outcome was a composite endpoint where patients were classified into five groups based on transfusion requirement, development of rebleeding, need of hemostatic intervention and mortality. Secondary outcomes were rebleeding, number of blood transfusions received, duration of admission and mortality. A total of 105 patients were included. Of the 49 patients allocated to STAE, 31 underwent successful STAE. There was no difference in the composite endpoint. Two versus eight patients re-bled in the STAE and control groups, respectively (intention-to-treat analysis; p = .10). After adjustment for possible imbalances, a strong trend was noted between STAE and rate of rebleeding (p = .079). STAE is potentially useful for preventing rebleeding in high-risk PUB and can safely be performed in selected cases with high risk of rebleeding. Further studies are needed in order to confirm these findings (ClinicalTrials.gov number NCT01125852).

  3. Wheel-running activity and energy metabolism in relation to ambient temperature in mice selected for high wheel-running activity

    NARCIS (Netherlands)

    Vaanholt, Lobke M.; Garland, Theodore; Daan, Serge; Visser, G. Henk; Garland Jr., Theodore; Heldmaier, G.

    Interrelationships between ambient temperature, activity, and energy metabolism were explored in mice that had been selectively bred for high spontaneous wheel-running activity and their random-bred controls. Animals were exposed to three different ambient temperatures (10, 20 and 30 degrees C) and

  4. Random survival forests for competing risks

    DEFF Research Database (Denmark)

    Ishwaran, Hemant; Gerds, Thomas A; Kogalur, Udaya B

    2014-01-01

    We introduce a new approach to competing risks using random forests. Our method is fully non-parametric and can be used for selecting event-specific variables and for estimating the cumulative incidence function. We show that the method is highly effective for both prediction and variable selection...

  5. Emergence of multilevel selection in the prisoner's dilemma game on coevolving random networks

    International Nuclear Information System (INIS)

    Szolnoki, Attila; Perc, Matjaz

    2009-01-01

    We study the evolution of cooperation in the prisoner's dilemma game, whereby a coevolutionary rule is introduced that molds the random topology of the interaction network in two ways. First, existing links are deleted whenever a player adopts a new strategy or its degree exceeds a threshold value; second, new links are added randomly after a given number of game iterations. These coevolutionary processes correspond to the generic formation of new links and deletion of existing links that, especially in human societies, appear frequently as a consequence of ongoing socialization, change of lifestyle or death. Due to the counteraction of deletions and additions of links the initial heterogeneity of the interaction network is qualitatively preserved, and thus cannot be held responsible for the observed promotion of cooperation. Indeed, the coevolutionary rule evokes the spontaneous emergence of a powerful multilevel selection mechanism, which despite the sustained random topology of the evolving network, maintains cooperation across the whole span of defection temptation values.

  6. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology

    Science.gov (United States)

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological datasets there is limited guidance on variable selection methods for RF modeling. Typically, e...

  7. Selecting Optimal Random Forest Predictive Models: A Case Study on Predicting the Spatial Distribution of Seabed Hardness

    Science.gov (United States)

    Li, Jin; Tran, Maggie; Siwabessy, Justy

    2016-01-01

    Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia’s marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy, and can be inferred from underwater video footage at only a limited number of locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models to predict seabed hardness using random forest (RF) based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods, namely variable importance (VI), averaged variable importance (AVI), knowledge-informed AVI (KIAVI), Boruta and regularized RF (RRF), were tested based on predictive accuracy. Effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach used to pre-select predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to ‘small p and large n’ problems in environmental sciences. Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency and

  8. High-level 13C-enrichment of random and synchronous populations of Chlamydomonas reinhardii

    International Nuclear Information System (INIS)

    Price, R.L.; Crissman, H.A.; Martin, J.C.; Kollman, V.H.

    1975-01-01

    The alga Chlamydomonas reinhardii was grown in suspension culture at high levels of 13C-enrichment (98 mol percent), both in synchronous and random populations, for the purpose of investigating possible macro- and ultrastructural changes in the cell as induced by essentially total carbon replacement. The algae, grown in spinner flasks, were analyzed using a newly developed multiparameter flow-system technique applied to characterizing various algal genera. The versatility of this technique provides for measuring and processing several cell characteristics simultaneously and separating cells according to selected combinations of parameters. In these studies, cell volume (by Coulter aperture) and DNA and chlorophyll content were determined simultaneously. Cell ultrastructure was examined at various levels of isotope enrichment and time periods by electron microscopy. The data presented for synchronous growth of this organism demonstrate the absence of biological effects (considering the parameters measured) due to the almost total replacement of cellular 12C with 13C. Interpretational problems encountered when looking for biological effects in random populations are discussed

  9. Optimization of the Dutch Matrix Test by Random Selection of Sentences From a Preselected Subset

    Directory of Open Access Journals (Sweden)

    Rolph Houben

    2015-04-01

    Full Text Available Matrix tests are available for speech recognition testing in many languages. For an accurate measurement, a steep psychometric function of the speech materials is required. For existing tests, it would be beneficial if it were possible to further optimize the available materials by increasing the function's steepness. The objective is to show whether the steepness of the psychometric function of an existing matrix test can be increased by selecting a homogeneous subset of recordings with the steepest sentence-based psychometric functions. We took data from a previous multicenter evaluation of the Dutch matrix test (45 normal-hearing listeners). Based on half of the data set, the sentences (140 out of 311) with a similar speech reception threshold and with the steepest psychometric function (≥9.7%/dB) were first selected. Subsequently, the steepness of the psychometric function for this selection was calculated from the remaining (unused) second half of the data set. The calculation showed that the slope increased from 10.2%/dB to 13.7%/dB. The resulting subset did not allow the construction of enough balanced test lists. Therefore, the measurement procedure was changed to randomly select the sentences during testing. Random selection may interfere with a representative occurrence of phonemes. However, in our material, the median phonemic occurrence remained close to that of the original test. This finding indicates that phonemic occurrence is not a critical factor. The work highlights the possibility that existing speech tests might be improved by selecting sentences with a steep psychometric function.
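The select-then-randomly-draw procedure can be sketched as follows; the per-sentence slopes are invented, with only the 311-sentence count and the 9.7%/dB threshold taken from the record:

```python
import random

rng = random.Random(1)

# Hypothetical per-sentence slopes of the psychometric function, in %/dB.
slopes = {f"s{i:03d}": rng.uniform(6.0, 16.0) for i in range(311)}

# Step 1: keep only sentences with a sufficiently steep psychometric function.
subset = [s for s, slope in slopes.items() if slope >= 9.7]

# Step 2: during testing, draw each list of 20 sentences at random from the
# subset, instead of using fixed, pre-balanced test lists.
test_list = rng.sample(subset, k=20)
print(len(subset), len(test_list))
```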

  10. Factors Associated with High Use of a Workplace Web-Based Stress Management Program in a Randomized Controlled Intervention Study

    Science.gov (United States)

    Hasson, H.; Brown, C.; Hasson, D.

    2010-01-01

    In web-based health promotion programs, large variations in participant engagement are common. The aim was to investigate determinants of high use of a worksite self-help web-based program for stress management. Two versions of the program were offered to randomly selected departments in IT and media companies. A static version of the program…

  11. Selection of mRNA 5'-untranslated region sequence with high translation efficiency through ribosome display

    International Nuclear Information System (INIS)

    Mie, Masayasu; Shimizu, Shun; Takahashi, Fumio; Kobatake, Eiry

    2008-01-01

    The 5'-untranslated region (5'-UTR) of mRNAs functions as a translation enhancer, promoting translation efficiency. Many in vitro translation systems exhibit reduced efficiency in protein translation due to decreased translation initiation. The use of a 5'-UTR sequence with high translation efficiency greatly enhances protein production in these systems. In this study, we have developed an in vitro selection system that favors 5'-UTRs with high translation efficiency using a ribosome display technique. A random 5'-UTR library, comprising 5'-UTRs fused to a His-tag and Renilla luciferase (R-luc), was translated in vitro in rabbit reticulocytes. By limiting the translation period, only mRNAs with high translation efficiency were translated. During translation, the mRNA, the ribosome, and the translated His-tagged R-luc formed ternary complexes, which were collected via the translated His-tag using Ni-particles. mRNA extracted from the ternary complexes was amplified using RT-PCR and sequenced. Finally, a 5'-UTR with high translation efficiency was obtained from the random 5'-UTR library.

  12. Random drift versus selection in academic vocabulary: an evolutionary analysis of published keywords.

    Science.gov (United States)

    Bentley, R Alexander

    2008-08-27

    The evolution of vocabulary in academic publishing is characterized via keyword frequencies recorded in the ISI Web of Science citations database. In four distinct case-studies, evolutionary analysis of keyword frequency change through time is compared to a model of random copying used as the null hypothesis, such that selection may be identified against it. The case studies from the physical sciences indicate greater selection in keyword choice than in the social sciences. Similar evolutionary analyses can be applied to a wide range of phenomena; wherever the popularity of multiple items through time has been recorded, as with web searches, or sales of popular music and books, for example.
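The random-copying null model is essentially a neutral Wright-Fisher process: each new paper adopts a keyword by copying a random earlier paper, with a small innovation rate, so any frequency change is pure drift. A minimal simulation with illustrative parameters:

```python
import random

rng = random.Random(42)

def random_copying(n_papers=1000, mu=0.01, generations=50):
    """Neutral model: keywords spread by copying, not by selection."""
    population = list(range(n_papers))  # start with all-distinct keywords
    next_label = n_papers
    for _ in range(generations):
        new = []
        for _ in range(n_papers):
            if rng.random() < mu:           # innovation: coin a new keyword
                new.append(next_label)
                next_label += 1
            else:                           # copying: imitate a random earlier paper
                new.append(rng.choice(population))
        population = new
    return population

pop = random_copying()
distinct = len(set(pop))
print(distinct)  # diversity falls well below the initial 1000 under pure drift
```

Selection is then identified empirically as keyword-frequency dynamics that deviate from what repeated runs of this null model produce.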

  13. Application of random coherence order selection in gradient-enhanced multidimensional NMR

    International Nuclear Information System (INIS)

    Bostock, Mark J.; Nietlispach, Daniel

    2016-01-01

    Development of multidimensional NMR is essential to many applications, for example in high-resolution structural studies of biomolecules. Multidimensional techniques enable separation of NMR signals over several dimensions, improving signal resolution, whilst also allowing identification of new connectivities. However, these advantages come at a significant cost. The Fourier transform theorem requires acquisition of a grid of regularly spaced points to satisfy the Nyquist criterion, while frequency discrimination and acquisition of a pure-phase spectrum require acquisition of both quadrature components for each time point in every indirect (non-acquisition) dimension, adding a factor of 2^(N-1) to the number of free-induction decays which must be acquired, where N is the number of dimensions. Compressed sensing (CS) ℓ1-norm minimisation in combination with non-uniform sampling (NUS) has been shown to be extremely successful in overcoming the Nyquist criterion. Previously, maximum entropy reconstruction has also been used to overcome the limitation of frequency discrimination, processing data acquired with only one quadrature component at a given time interval, known as random phase detection (RPD), allowing a factor of two reduction in the number of points for each indirect dimension (Maciejewski et al., 2011, PNAS 108, 16640). However, whilst this approach can be easily applied in situations where the quadrature components are acquired as amplitude-modulated data, the same principle is not easily extended to phase-modulated (P-/N-type) experiments where data is acquired in the form exp(iωt) or exp(-iωt), and which make up many of the multidimensional experiments used in modern NMR. Here we demonstrate a modification of the CS ℓ1-norm approach to allow random coherence order selection (RCS) for phase-modulated experiments; we generalise the nomenclature for RCS and RPD as random quadrature detection (RQD). With this method, the power of RQD can be extended

  14. Open-field behavior of house mice selectively bred for high voluntary wheel-running.

    Science.gov (United States)

    Bronikowski, A M; Carter, P A; Swallow, J G; Girard, I A; Rhodes, J S; Garland, T

    2001-05-01

    Open-field behavioral assays are commonly used to test both locomotor activity and emotionality in rodents. We performed open-field tests on house mice (Mus domesticus) from four replicate lines genetically selected for high voluntary wheel-running for 22 generations and from four replicate random-bred control lines. Individual mice were recorded by video camera for 3 min in a 1-m² open-field arena on 2 consecutive days. Mice from selected lines showed no statistical differences from control mice with respect to distance traveled, defecation, time spent in the interior, or average distance from the center of the arena during the trial. Thus, we found little evidence that open-field behavior, as traditionally defined, is genetically correlated with wheel-running behavior. This result is a useful converse test of classical studies that report no increased wheel-running in mice selected for increased open-field activity. However, mice from selected lines turned less in their travel paths than did control-line mice, and females from selected lines had slower travel times (longer latencies) to reach the wall. We discuss these results in the context of the historical open-field test and newly defined measures of open-field activity.

  15. Data-Driven Derivation of an "Informer Compound Set" for Improved Selection of Active Compounds in High-Throughput Screening.

    Science.gov (United States)

    Paricharak, Shardul; IJzerman, Adriaan P; Jenkins, Jeremy L; Bender, Andreas; Nigsch, Florian

    2016-09-26

    Despite the usefulness of high-throughput screening (HTS) in drug discovery, for some systems low assay throughput or high screening cost can prohibit the screening of large numbers of compounds. In such cases, iterative cycles of screening involving active learning (AL) are employed, creating the need for smaller "informer sets" that can be routinely screened to build predictive models for selecting compounds from the screening collection for follow-up screens. Here, we present a data-driven derivation of an informer compound set with improved predictivity of active compounds in HTS, and we validate its benefit over randomly selected training sets on 46 PubChem assays comprising at least 300,000 compounds and covering a wide range of assay biology. The informer compound set showed improvement in BEDROC(α = 100), PRAUC, and ROCAUC values averaged over all assays of 0.024, 0.014, and 0.016, respectively, compared to randomly selected training sets, all with significant paired t-test p-values; the set was derived in an assay-agnostic fashion. This approach led to a consistent improvement in hit rates in follow-up screens without compromising scaffold retrieval. The informer set is adjustable in size depending on the number of compounds one intends to screen, as performance gains are realized for sets with more than 3,000 compounds, and this set is therefore applicable to a variety of situations. Finally, our results indicate that random sampling may not adequately cover descriptor space, drawing attention to the importance of the composition of the training set for predicting actives.

  16. Broadband plasmonic silver nanoflowers for high-performance random lasing covering visible region

    Directory of Open Access Journals (Sweden)

    Chang Qing

    2017-05-01

    Multicolor random lasing has broad potential applications in the fields of imaging, sensing, and optoelectronics. Here, silver nanoflowers (Ag NFs) with abundant nanogaps are fabricated by a rapid one-step solution-phase synthesis method and are proposed for the first time as effective broadband plasmonic scatterers for achieving random lasing of different colors. With abundant nanogaps and spiky tips near the surface and the interparticle coupling effect, Ag NFs greatly enhance the local electromagnetic field and induce broadband plasmonic scattering spectra over the whole visible range. An extremely low working threshold of 0.24 MW cm⁻² and a high quality factor of 11,851 are thus demonstrated for Ag NF-based random lasers. Further, coherent colorful random lasing covering the visible range is realized using the dye molecules oxazine (red), Coumarin 440 (blue), and Coumarin 153 (green), showing quality factors of more than 10,000. All these features show that Ag NFs are highly efficient scatterers for high-performance coherent random lasing and colorful random lasers.

  17. Evolving artificial metalloenzymes via random mutagenesis

    Science.gov (United States)

    Yang, Hao; Swartz, Alan M.; Park, Hyun June; Srivastava, Poonam; Ellis-Guardiola, Ken; Upp, David M.; Lee, Gihoon; Belsare, Ketaki; Gu, Yifan; Zhang, Chen; Moellering, Raymond E.; Lewis, Jared C.

    2018-03-01

    Random mutagenesis has the potential to optimize the efficiency and selectivity of protein catalysts without requiring detailed knowledge of protein structure; however, introducing synthetic metal cofactors complicates the expression and screening of enzyme libraries, and activity arising from free cofactor must be eliminated. Here we report an efficient platform to create and screen libraries of artificial metalloenzymes (ArMs) via random mutagenesis, which we use to evolve highly selective dirhodium cyclopropanases. Error-prone PCR and combinatorial codon mutagenesis enabled multiplexed analysis of random mutations, including at sites distal to the putative ArM active site that are difficult to identify using targeted mutagenesis approaches. Variants that exhibited significantly improved selectivity for each of the cyclopropane product enantiomers were identified, and higher activity than previously reported ArM cyclopropanases obtained via targeted mutagenesis was also observed. This improved selectivity carried over to other dirhodium-catalysed transformations, including N-H, S-H and Si-H insertion, demonstrating that ArMs evolved for one reaction can serve as starting points to evolve catalysts for others.

  18. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Science.gov (United States)

    2010-07-01

    40 CFR 761.306, Protection of Environment: Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as...

  19. Plasma adiponectin is increased in mice selectively bred for high wheel-running activity, but not by wheel running per se

    NARCIS (Netherlands)

    Vaanholt, L. M.; Meerlo, P.; Garland, T.; Visser, G. H.; van Dijk, G.

    2007-01-01

    Mice selectively bred for high wheel-running activity (S) have decreased fat content compared to mice from randomly bred control (C) lines. We explored whether this difference was associated with alterations in levels of circulating hormones involved in regulation of food intake and energy balance,

  20. From Protocols to Publications: A Study in Selective Reporting of Outcomes in Randomized Trials in Oncology

    Science.gov (United States)

    Raghav, Kanwal Pratap Singh; Mahajan, Sminil; Yao, James C.; Hobbs, Brian P.; Berry, Donald A.; Pentz, Rebecca D.; Tam, Alda; Hong, Waun K.; Ellis, Lee M.; Abbruzzese, James; Overman, Michael J.

    2015-01-01

    Purpose The decision by journals to append protocols to published reports of randomized trials was a landmark event in clinical trial reporting. However, limited information is available on how this initiative affected transparency and selective reporting of clinical trial data. Methods We analyzed 74 oncology-based randomized trials published in Journal of Clinical Oncology, the New England Journal of Medicine, and The Lancet in 2012. To ascertain integrity of reporting, we compared published reports with their respective appended protocols with regard to primary end points, nonprimary end points, unplanned end points, and unplanned analyses. Results A total of 86 primary end points were reported in 74 randomized trials; nine trials had more than one primary end point. Nine trials (12.2%) had some discrepancy between their planned and published primary end points. A total of 579 nonprimary end points (median, seven per trial) were planned, of which 373 (64.4%; median, five per trial) were reported. A significant positive correlation was found between the number of planned and nonreported nonprimary end points (Spearman r = 0.66; P < .001). Twenty-eight studies (37.8%) reported a total of 65 unplanned end points, 52 (80.0%) of which were not identified as unplanned. Thirty-one (41.9%) and 19 (25.7%) of 74 trials reported a total of 52 unplanned analyses involving primary end points and 33 unplanned analyses involving nonprimary end points, respectively. Studies reported positive unplanned end points and unplanned analyses more frequently than negative outcomes in abstracts (unplanned end points odds ratio, 6.8; P = .002; unplanned analyses odds ratio, 8.4; P = .007). Conclusion Despite public and reviewer access to protocols, selective outcome reporting persists and is a major concern in the reporting of randomized clinical trials. To foster credible evidence-based medicine, additional initiatives are needed to minimize selective reporting. PMID:26304898

  1. Joint random beam and spectrum selection for spectrum sharing systems with partial channel state information

    KAUST Repository

    Abdallah, Mohamed M.

    2013-11-01

    In this work, we develop a joint interference-aware random beam and spectrum selection scheme that provides enhanced performance for the secondary network under the condition that the interference observed at the primary receiver is below a predetermined acceptable value. We consider a secondary link composed of a transmitter equipped with multiple antennas and a single-antenna receiver sharing the same spectrum with a set of primary links, each composed of a single-antenna transmitter and a single-antenna receiver. The proposed scheme jointly selects a beam, among a set of power-optimized random beams, as well as the primary spectrum that maximizes the signal-to-interference-plus-noise ratio (SINR) of the secondary link while satisfying the primary interference constraint. In particular, we consider the case where the interference level is described by a q-bit description of its magnitude, whereby we propose a technique to find the optimal quantizer thresholds in a mean square error (MSE) sense. © 2013 IEEE.
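
    The joint selection described above amounts to a constrained argmax over beam/spectrum pairs. A minimal sketch, with made-up SINR and interference tables rather than the paper's channel model or quantization scheme:

```python
def select_beam_and_spectrum(sinr, interference, threshold):
    """Pick the (beam, channel) pair with the highest secondary-link SINR
    among pairs whose interference at the primary receiver stays below
    the acceptable threshold. Returns None if no pair qualifies."""
    best = None
    for b, row in enumerate(sinr):
        for s, value in enumerate(row):
            if interference[b][s] <= threshold:
                if best is None or value > best[2]:
                    best = (b, s, value)
    return best

# Toy example: 3 beams x 2 channels, illustrative numbers only.
sinr = [[5.0, 2.0], [9.0, 1.0], [4.0, 7.0]]
interference = [[0.1, 0.5], [0.9, 0.2], [0.3, 0.4]]
choice = select_beam_and_spectrum(sinr, interference, threshold=0.35)
print(choice)  # (0, 0, 5.0): beam 1 has higher SINR but violates the constraint
```

    Note that the unconstrained maximum (beam 1, channel 0, SINR 9.0) is rejected because its interference (0.9) exceeds the threshold, which is exactly the trade-off the scheme manages.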

  2. Joint random beam and spectrum selection for spectrum sharing systems with partial channel state information

    KAUST Repository

    Abdallah, Mohamed M.; Sayed, Mostafa M.; Alouini, Mohamed-Slim; Qaraqe, Khalid A.

    2013-01-01

    In this work, we develop a joint interference-aware random beam and spectrum selection scheme that provides enhanced performance for the secondary network under the condition that the interference observed at the primary receiver is below a predetermined acceptable value. We consider a secondary link composed of a transmitter equipped with multiple antennas and a single-antenna receiver sharing the same spectrum with a set of primary links, each composed of a single-antenna transmitter and a single-antenna receiver. The proposed scheme jointly selects a beam, among a set of power-optimized random beams, as well as the primary spectrum that maximizes the signal-to-interference-plus-noise ratio (SINR) of the secondary link while satisfying the primary interference constraint. In particular, we consider the case where the interference level is described by a q-bit description of its magnitude, whereby we propose a technique to find the optimal quantizer thresholds in a mean square error (MSE) sense. © 2013 IEEE.

  3. The RANDOM computer program: A linear congruential random number generator

    Science.gov (United States)

    Miles, R. F., Jr.

    1986-01-01

    The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) the RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) the RANCYCLE and ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
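
    The linear congruential recurrence such a program tests is short enough to sketch directly. The constants below are the widely cited Numerical Recipes parameters, used here for illustration only; they are not necessarily the parameters the RANDOM program selects:

```python
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Linear congruential generator: x_{n+1} = (a*x_n + c) mod m.
    Yields an endless stream of integers in [0, m)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

gen = lcg(seed=1)
first = next(gen)
print(first)  # 1015568748
```

    Parameter selection matters because a poor choice of a, c, and m gives short periods or strong lattice structure, which is precisely what companion tools like RANCYCLE and ARITH help diagnose.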

  4. Analysis and applications of a frequency selective surface via a random distribution method

    International Nuclear Information System (INIS)

    Xie Shao-Yi; Huang Jing-Jian; Yuan Nai-Chang; Liu Li-Guo

    2014-01-01

    A novel frequency selective surface (FSS) for reducing radar cross section (RCS) is proposed in this paper. This FSS is based on the random distribution method, so it can be called a random surface. In this paper, stacked patches serving as periodic elements are employed for RCS reduction. Previous work has demonstrated the efficiency of utilizing microstrip patches, especially for the reflectarray. First, the relevant theory of the method is described. Then a sample of a three-layer variable-sized stacked patch random surface with a dimension of 260 mm×260 mm is simulated, fabricated, and measured in order to demonstrate the validity of the proposed design. For normal incidence, an 8-dB RCS reduction is achieved both in simulation and in measurement over 8 GHz–13 GHz. Oblique incidence at 30° is also investigated, for which a 7-dB RCS reduction is obtained in a frequency range of 8 GHz–14 GHz. (condensed matter: electronic structure, electrical, magnetic, and optical properties)

  5. Random drift versus selection in academic vocabulary: an evolutionary analysis of published keywords.

    Directory of Open Access Journals (Sweden)

    R Alexander Bentley

    The evolution of vocabulary in academic publishing is characterized via keyword frequencies recorded in the ISI Web of Science citations database. In four distinct case studies, evolutionary analysis of keyword frequency change through time is compared to a model of random copying used as the null hypothesis, such that selection may be identified against it. The case studies from the physical sciences indicate greater selection in keyword choice than in the social sciences. Similar evolutionary analyses can be applied to a wide range of phenomena wherever the popularity of multiple items through time has been recorded, as with web searches or sales of popular music and books, for example.
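
    The random-copying null model can be sketched as a simple neutral (drift-only) simulation: each generation, every individual adopts the keyword of a randomly chosen member of the previous generation. Population size, labels, and the innovation rate mu below are illustrative assumptions, not the paper's fitted values:

```python
import random

def random_copying_step(population, mu=0.0, rng=random):
    """One generation of the neutral random-copying model: each
    individual copies a random member of the previous generation;
    with probability mu it invents a brand-new variant instead."""
    new = []
    for _ in population:
        if rng.random() < mu:
            new.append(object())  # unique new variant
        else:
            new.append(rng.choice(population))
    return new

random.seed(42)
pop = ["alpha"] * 50 + ["beta"] * 50
for _ in range(100):
    pop = random_copying_step(pop, mu=0.0)
# Under pure drift, variant frequencies wander until one fixes by
# chance; with no innovation, no variants outside the initial two
# can ever appear, and population size is conserved.
print(len(pop), set(pop) <= {"alpha", "beta"})
```

    Observed keyword-frequency trajectories that deviate systematically from ensembles of such simulations are the signature of selection in this framework.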

  6. On theoretical models of gene expression evolution with random genetic drift and natural selection.

    Directory of Open Access Journals (Sweden)

    Osamu Ogasawara

    2009-11-01

    The relative contributions of natural selection and random genetic drift are a major source of debate in the study of gene expression evolution, which is hypothesized to serve as a bridge from molecular to phenotypic evolution. It has been suggested that the conflict between views is caused by the lack of a definite model of the neutral hypothesis, which can describe the long-run behavior of evolutionary change in mRNA abundance. Therefore previous studies have used inadequate analogies with the neutral prediction of other phenomena, such as amino acid or nucleotide sequence evolution, as the null hypothesis of their statistical inference. In this study, we introduced two novel theoretical models, one based on neutral drift and the other assuming natural selection, by focusing on a common property of the distribution of mRNA abundance among a variety of eukaryotic cells, which reflects the result of long-term evolution. Our results demonstrated that (1) our models can reproduce two independently found phenomena simultaneously: the time development of gene expression divergence and Zipf's law of the transcriptome; (2) cytological constraints can be explicitly formulated to describe long-term evolution; and (3) the model assuming that natural selection optimized relative mRNA abundance was more consistent with previously published observations than the model of optimized absolute mRNA abundances. The models introduced in this study give a formulation of evolutionary change in the mRNA abundance of each gene as a stochastic process, on the basis of previously published observations. This model provides a foundation for interpreting observed data in studies of gene expression evolution, including identifying an adequate time scale for discriminating the effect of natural selection from that of random genetic drift of selectively neutral variations.

  7. From Protocols to Publications: A Study in Selective Reporting of Outcomes in Randomized Trials in Oncology.

    Science.gov (United States)

    Raghav, Kanwal Pratap Singh; Mahajan, Sminil; Yao, James C; Hobbs, Brian P; Berry, Donald A; Pentz, Rebecca D; Tam, Alda; Hong, Waun K; Ellis, Lee M; Abbruzzese, James; Overman, Michael J

    2015-11-01

    The decision by journals to append protocols to published reports of randomized trials was a landmark event in clinical trial reporting. However, limited information is available on how this initiative affected transparency and selective reporting of clinical trial data. We analyzed 74 oncology-based randomized trials published in Journal of Clinical Oncology, the New England Journal of Medicine, and The Lancet in 2012. To ascertain integrity of reporting, we compared published reports with their respective appended protocols with regard to primary end points, nonprimary end points, unplanned end points, and unplanned analyses. A total of 86 primary end points were reported in 74 randomized trials; nine trials had more than one primary end point. Nine trials (12.2%) had some discrepancy between their planned and published primary end points. A total of 579 nonprimary end points (median, seven per trial) were planned, of which 373 (64.4%; median, five per trial) were reported. A significant positive correlation was found between the number of planned and nonreported nonprimary end points (Spearman r = 0.66; P < .001). Studies reported positive unplanned end points and unplanned analyses more frequently than negative outcomes in abstracts. Despite public and reviewer access to protocols, selective outcome reporting persists and is a major concern in the reporting of randomized clinical trials. To foster credible evidence-based medicine, additional initiatives are needed to minimize selective reporting. © 2015 by American Society of Clinical Oncology.

  8. A high-speed on-chip pseudo-random binary sequence generator for multi-tone phase calibration

    Science.gov (United States)

    Gommé, Liesbeth; Vandersteen, Gerd; Rolain, Yves

    2011-07-01

    An on-chip reference generator is conceived by adopting the technique of decimating a pseudo-random binary sequence (PRBS) signal into parallel sequences. This is of great benefit when high-speed generation of PRBS and PRBS-derived signals is the objective. The design is implemented in standard CMOS logic available in commercial libraries, which provides the logic functions for the generator. The design allows the user to select the periodicity of the PRBS and the PRBS-derived signals. The characterization of the on-chip generator marks its performance and reveals promising specifications.
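
    As a rough software model of the kind of PRBS signal being generated and decimated, here is a maximal-length PRBS7 sequence from a Fibonacci LFSR. The generator polynomial x^7 + x^6 + 1 is a common standard choice (ITU-T O.150) and may differ from the polynomial used on the chip:

```python
def prbs7(seed=0x7F):
    """One full period (127 bits) of a PRBS7 sequence from the
    7-bit LFSR with feedback polynomial x^7 + x^6 + 1.
    Any nonzero seed yields the same maximal-length cycle."""
    state = seed & 0x7F
    bits = []
    for _ in range(127):
        bits.append(state & 1)                    # output the LSB
        fb = ((state >> 6) ^ (state >> 5)) & 1    # taps at bits 7 and 6
        state = ((state << 1) | fb) & 0x7F
    return bits

seq = prbs7()
print(len(seq))  # 127
```

    Because the polynomial is primitive, the register cycles through all 127 nonzero states before repeating, which is why a maximal-length PRBS7 period contains exactly 64 ones and 63 zeros.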

  9. A high-speed on-chip pseudo-random binary sequence generator for multi-tone phase calibration

    International Nuclear Information System (INIS)

    Gommé, Liesbeth; Vandersteen, Gerd; Rolain, Yves

    2011-01-01

    An on-chip reference generator is conceived by adopting the technique of decimating a pseudo-random binary sequence (PRBS) signal into parallel sequences. This is of great benefit when high-speed generation of PRBS and PRBS-derived signals is the objective. The design is implemented in standard CMOS logic available in commercial libraries, which provides the logic functions for the generator. The design allows the user to select the periodicity of the PRBS and the PRBS-derived signals. The characterization of the on-chip generator marks its performance and reveals promising specifications.

  10. Correlates of smoking with socioeconomic status, leisure time physical activity and alcohol consumption among Polish adults from randomly selected regions.

    Science.gov (United States)

    Woitas-Slubowska, Donata; Hurnik, Elzbieta; Skarpańska-Stejnborn, Anna

    2010-12-01

    To determine the association between smoking status and leisure time physical activity (LTPA), alcohol consumption, and socioeconomic status (SES) among Polish adults, 466 randomly selected men and women (aged 18-66 years) responded to an anonymous questionnaire regarding smoking, alcohol consumption, LTPA, and SES. Multiple logistic regression was used to examine the association of smoking status with six socioeconomic measures, level of LTPA, and frequency and type of alcohol consumed. Smokers were defined as individuals smoking occasionally or daily. The odds of being a smoker were 9 times (men) and 27 times (women) higher among respondents who drink alcohol several times per week or every day in comparison to non-drinkers. Among men, the odds of smoking were higher for those with low educational attainment than for those with high educational attainment (p = 0.007). Among women, we observed that students were the most frequent smokers. Female students were almost three times more likely to smoke than non-professional women, and two times more likely than physical workers (p = 0.018). The findings of this study indicated that among randomly selected Polish men and women aged 18-66, smoking and alcohol consumption tended to cluster. These results imply that intervention strategies need to target multiple risk factors simultaneously. The highest risk of smoking was observed among low-educated men, female students, and both men and women drinking alcohol several times a week or every day. Information on subgroups with a high risk of smoking will help in planning future preventive strategies.

  11. Field-based random sampling without a sampling frame: control selection for a case-control study in rural Africa.

    Science.gov (United States)

    Crampin, A C; Mwinuka, V; Malema, S S; Glynn, J R; Fine, P E

    2001-01-01

    Selection bias, particularly of controls, is common in case-control studies and may materially affect the results. Methods of control selection should be tailored both for the risk factors and disease under investigation and for the population being studied. We present here a control selection method devised for a case-control study of tuberculosis in rural Africa (Karonga, northern Malawi) that selects an age/sex frequency-matched random sample of the population, with a geographical distribution in proportion to the population density. We also present an audit of the selection process, and discuss the potential of this method in other settings.

  12. Treatment selection in a randomized clinical trial via covariate-specific treatment effect curves.

    Science.gov (United States)

    Ma, Yunbei; Zhou, Xiao-Hua

    2017-02-01

    For time-to-event data in a randomized clinical trial, we proposed two new methods for selecting an optimal treatment for a patient based on the covariate-specific treatment effect curve, which is used to represent the clinical utility of a predictive biomarker. To select an optimal treatment for a patient with a specific biomarker value, we proposed pointwise confidence intervals for each covariate-specific treatment effect curve and for the difference between the covariate-specific treatment effect curves of two treatments. Furthermore, to select an optimal treatment for a future biomarker-defined subpopulation of patients, we proposed confidence bands for each covariate-specific treatment effect curve and for the difference between each pair of covariate-specific treatment effect curves over a fixed interval of biomarker values. We constructed the confidence bands based on a resampling technique. We also conducted simulation studies to evaluate finite-sample properties of the proposed estimation methods. Finally, we illustrated the application of the proposed method in a real-world data set.
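
    The resampling idea behind such confidence bands can be illustrated with a scalar percentile bootstrap. The paper resamples whole treatment effect curves, so this is only a simplified analogue, with made-up data and a plain mean as the statistic:

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.mean, n_boot=2000, alpha=0.05, rng=random):
    """Percentile bootstrap confidence interval for a statistic:
    resample the data with replacement n_boot times, recompute the
    statistic each time, and take the empirical alpha/2 and
    1 - alpha/2 quantiles of the replicates."""
    reps = []
    for _ in range(n_boot):
        sample = [rng.choice(data) for _ in data]
        reps.append(stat(sample))
    reps.sort()
    lo = reps[int((alpha / 2) * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

random.seed(0)
data = [4.8, 5.1, 5.0, 4.9, 5.3, 5.2, 4.7, 5.0]  # illustrative measurements
lo, hi = bootstrap_ci(data)
print(lo <= 5.0 <= hi)  # True: the interval brackets the sample mean
```

    Extending this to a confidence band means resampling at the patient level, re-estimating the whole effect curve per replicate, and taking quantiles pointwise (with an adjustment for simultaneous coverage).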

  13. Selective excitation of atoms or molecules to high-lying states

    International Nuclear Information System (INIS)

    Ducas, T.W.

    1978-01-01

    This specification relates to the selective excitation of atoms or molecules to high-lying states and a method of separating different isotopes of the same element by selective excitation of the isotopes. (U.K.)

  14. Organic Ferroelectric-Based 1T1T Random Access Memory Cell Employing a Common Dielectric Layer Overcoming the Half-Selection Problem.

    Science.gov (United States)

    Zhao, Qiang; Wang, Hanlin; Ni, Zhenjie; Liu, Jie; Zhen, Yonggang; Zhang, Xiaotao; Jiang, Lang; Li, Rongjin; Dong, Huanli; Hu, Wenping

    2017-09-01

    Organic electronics based on poly(vinylidenefluoride/trifluoroethylene) (P(VDF-TrFE)) dielectric is facing great challenges in flexible circuits. As one indispensable part of integrated circuits, there is an urgent demand for low-cost and easy-fabrication nonvolatile memory devices. A breakthrough is made on a novel ferroelectric random access memory cell (1T1T FeRAM cell) consisting of one selection transistor and one ferroelectric memory transistor in order to overcome the half-selection problem. Unlike complicated manufacturing using multiple dielectrics, this system simplifies 1T1T FeRAM cell fabrication using one common dielectric. To achieve this goal, a strategy for semiconductor/insulator (S/I) interface modulation is put forward and applied to nonhysteretic selection transistors with high performances for driving or addressing purposes. As a result, high hole mobility of 3.81 cm² V⁻¹ s⁻¹ (average) for 2,6-diphenylanthracene (DPA) and electron mobility of 0.124 cm² V⁻¹ s⁻¹ (average) for N,N′-1H,1H-perfluorobutyl dicyanoperylenecarboxydiimide (PDI-FCN₂) are obtained in selection transistors. In this work, we demonstrate this technology's potential for organic ferroelectric-based pixelated memory module fabrication. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Selective decontamination in pediatric liver transplants. A randomized prospective study.

    Science.gov (United States)

    Smith, S D; Jackson, R J; Hannakan, C J; Wadowsky, R M; Tzakis, A G; Rowe, M I

    1993-06-01

    Although it has been suggested that selective decontamination of the digestive tract (SDD) decreases postoperative aerobic Gram-negative and fungal infections in orthotopic liver transplantation (OLT), no controlled trials exist in pediatric patients. This prospective, randomized controlled study of 36 pediatric OLT patients examines the effect of short-term SDD on postoperative infection and digestive tract flora. Patients were randomized into two groups. The control group received perioperative parenteral antibiotics only. The SDD group received in addition polymyxin E, tobramycin, and amphotericin B enterally and by oropharyngeal swab postoperatively until oral intake was tolerated (6 +/- 4 days). Indications for operation, preoperative status, age, and intensive care unit and hospital length of stay were no different in SDD (n = 18) and control (n = 18) groups. A total of 14 Gram-negative infections (intraabdominal abscess 7, septicemia 5, pneumonia 1, urinary tract 1) developed in the 36 patients studied. Mortality was not significantly different in the two groups. However, there were significantly fewer patients with Gram-negative infections in the SDD group: 3/18 patients (11%) vs. 11/18 patients (50%) in the control group, P < 0.001. There was also significant reduction in aerobic Gram-negative flora in the stool and pharynx in patients receiving SDD. Gram-positive and anaerobic organisms were unaffected. We conclude that short-term postoperative SDD significantly reduces Gram-negative infections in pediatric OLT patients.

  16. A school-based randomized controlled trial to improve physical activity among Iranian high school girls

    Directory of Open Access Journals (Sweden)

    Ghofranipour Fazloalha

    2008-04-01

    Background: Physical activity (PA) rates decline precipitously during the high school years and are consistently lower among adolescent girls than adolescent boys. Due to cultural barriers, this problem might be exacerbated in female Iranian adolescents. However, little intervention research has been conducted to try to increase PA participation rates with this population. Because PA interventions in schools have the potential to reach many children and adolescents, this study reports on PA intervention research conducted in all-female Iranian high schools. Methods: A randomized controlled trial was conducted to examine the effects of two six-month tailored interventions on potential determinants of PA and PA behavior. Students (N = 161) were randomly allocated to one of three conditions: an intervention based on Pender's Health Promotion model (HP), an intervention based on an integration of the Health Promotion model and selected constructs from the Transtheoretical model (THP), and a control group (CON). Measures were administered prior to the intervention, at post-intervention, and at a six-month follow-up. Results: Repeated measure ANOVAs showed a significant interaction between group and time for perceived benefits, self-efficacy, interpersonal norms, social support, behavioral processes, and PA behavior, indicating that both intervention groups significantly improved across the 24-week intervention, whereas the control group did not. Participants in the THP group showed greater use of counter-conditioning and stimulus control at post-intervention and at follow-up. While there were no significant differences in PA between the HP and CON groups at follow-up, a significant difference was still found between the THP and the CON group. Conclusion: This study provides the first evidence of the effectiveness of a PA intervention based on Pender's HP model combined with selected aspects of the TTM on potential determinants to increase PA among

  17. Day-ahead load forecast using random forest and expert input selection

    International Nuclear Information System (INIS)

    Lahouar, A.; Ben Hadj Slama, J.

    2015-01-01

    Highlights: • A model based on random forests for short-term load forecasting is proposed. • An expert feature selection is added to refine inputs. • Special attention is paid to customer behavior, load profiles, and special holidays. • The model is flexible and able to handle complex load signals. • A technical comparison is performed to assess the forecast accuracy. - Abstract: The electrical load forecast is getting more and more important in recent years due to electricity market deregulation and the integration of renewable resources. To overcome the incoming challenges and ensure accurate power prediction for different time horizons, sophisticated intelligent methods are elaborated. Utilization of intelligent forecast algorithms is among the main characteristics of smart grids, and is an efficient tool to face uncertainty. Several crucial tasks of power operators, such as load dispatch, rely on the short-term forecast, thus it should be as accurate as possible. To this end, this paper proposes a short-term load predictor able to forecast the next 24 h of load. Using random forest, characterized by immunity to parameter variations and internal cross validation, the model is constructed following an online learning process. The inputs are refined by expert feature selection using a set of if–then rules, in order to include the user's own specifications about the country's weather or market, and to generalize the forecast ability. The proposed approach is tested on a real historical set from the Tunisian Power Company, and the simulation shows accurate and satisfactory results for one day in advance, with an average error rarely exceeding 2.3%. The model is validated for regular working days and weekends, and special attention is paid to moving holidays, which follow a non-Gregorian calendar.
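
    The expert if-then input refinement can be sketched as a rule-based feature builder whose output would feed the random forest (the forest itself is omitted here). The feature names, rules, and data below are illustrative assumptions, not the paper's exact rule set:

```python
import datetime

def expert_features(ts, load_history, holidays=frozenset()):
    """Build refined forecaster inputs from a timestamp and an hourly
    load history (most recent value last), applying simple if-then
    rules for calendar effects. Rules and names are illustrative."""
    feats = {
        "hour": ts.hour,
        "weekday": ts.weekday(),
        "lag_24h": load_history[-24],    # same hour yesterday
        "lag_168h": load_history[-168],  # same hour last week
    }
    # If-then expert rules encoding customer behavior and special days.
    feats["is_weekend"] = 1 if ts.weekday() >= 5 else 0
    feats["is_holiday"] = 1 if ts.date() in holidays else 0
    return feats

# Synthetic hourly load with a daily pattern, and a Friday timestamp.
history = [1000 + (h % 24) * 10 for h in range(200)]
ts = datetime.datetime(2015, 7, 17, 12)
feats = expert_features(ts, history)
print(feats["is_weekend"], feats["lag_24h"])  # 0 1080
```

    Moving holidays from a non-Gregorian calendar would simply be supplied as the `holidays` set after conversion, which is one way such rules keep the learner itself calendar-agnostic.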

  18. Distribution of orientation selectivity in recurrent networks of spiking neurons with different random topologies.

    Science.gov (United States)

    Sadeh, Sadra; Rotter, Stefan

    2014-01-01

    Neurons in the primary visual cortex are more or less selective for the orientation of a light bar used for stimulation. A broad distribution of individual grades of orientation selectivity has in fact been reported in all species. A possible reason for emergence of broad distributions is the recurrent network within which the stimulus is being processed. Here we compute the distribution of orientation selectivity in randomly connected model networks that are equipped with different spatial patterns of connectivity. We show that, for a wide variety of connectivity patterns, a linear theory based on firing rates accurately approximates the outcome of direct numerical simulations of networks of spiking neurons. Distance dependent connectivity in networks with a more biologically realistic structure does not compromise our linear analysis, as long as the linearized dynamics, and hence the uniform asynchronous irregular activity state, remain stable. We conclude that linear mechanisms of stimulus processing are indeed responsible for the emergence of orientation selectivity and its distribution in recurrent networks with functionally heterogeneous synaptic connectivity.

  19. Selection mechanisms underlying high impact biomedical research--a qualitative analysis and causal model.

    Directory of Open Access Journals (Sweden)

    Hilary Zelko

    BACKGROUND: Although scientific innovation has been a long-standing topic of interest for historians, philosophers and cognitive scientists, few studies in biomedical research have examined, from researchers' perspectives, how high impact publications are developed and why they are consistently produced by a small group of researchers. Our objective was therefore to interview a group of researchers with a track record of high impact publications to explore what mechanisms they believe contribute to the generation of high impact publications. METHODOLOGY/PRINCIPAL FINDINGS: Researchers were located in universities all over the globe and interviews were conducted by phone. All interviews were transcribed using standard qualitative methods. A Grounded Theory approach was used to code each transcript, later aggregating concepts and categories into an overarching explanation model. The model was then translated into a System Dynamics mathematical model to represent its structure and behavior. Five emerging themes were found in our study. First, researchers used heuristics or rules of thumb that came naturally to them. Second, these heuristics were reinforced by positive feedback from their peers and mentors. Third, good communication skills allowed researchers to provide feedback to their peers, thus closing a positive feedback loop. Fourth, researchers exhibited a number of psychological attributes such as curiosity or open-mindedness that constantly motivated them, even when faced with discouraging situations. Fifth, the system is dominated by randomness and serendipity and is far from a linear and predictable environment. Some researchers, however, took advantage of this randomness by incorporating mechanisms that would allow them to benefit from random findings. The aggregation of these themes into a policy model represented the overall expected behavior of publications and their impact achieved by high impact researchers. CONCLUSIONS: The proposed

  20. Sequence-Based Prediction of RNA-Binding Proteins Using Random Forest with Minimum Redundancy Maximum Relevance Feature Selection

    Directory of Open Access Journals (Sweden)

    Xin Ma

    2015-01-01

    Full Text Available The prediction of RNA-binding proteins is one of the most challenging problems in computational biology. Although some studies have investigated this problem, the accuracy of prediction is still not sufficient. In this study, a highly accurate method was developed to predict RNA-binding proteins from amino acid sequences using random forests with the minimum redundancy maximum relevance (mRMR) method, followed by incremental feature selection (IFS). We incorporated conjoint triad features and three novel features: binding propensity (BP), nonbinding propensity (NBP), and evolutionary information combined with physicochemical properties (EIPP). The results showed that these novel features play important roles in improving the performance of the predictor. Using the mRMR-IFS method, our predictor achieved the best performance (86.62% accuracy and a 0.737 Matthews correlation coefficient). The high prediction accuracy and successful prediction performance suggest that our method can be a useful approach to identifying RNA-binding proteins from sequence information.
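
The mRMR step described above can be sketched as a greedy ranking that trades feature relevance against redundancy with already-selected features. A minimal sketch, with invented scores; the feature names echo the abstract's feature groups but the numbers are purely illustrative:

```python
# Greedy minimum-redundancy-maximum-relevance (mRMR) ranking sketch.
# All relevance and redundancy values below are hypothetical.

def mrmr_rank(relevance, redundancy):
    """Order features: at each step pick the feature maximising
    relevance minus mean redundancy to the already-selected set."""
    remaining = set(relevance)
    selected = []
    while remaining:
        def score(f):
            if not selected:
                return relevance[f]
            mean_red = sum(redundancy[frozenset((f, s))] for s in selected) / len(selected)
            return relevance[f] - mean_red
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected

relevance = {"BP": 0.9, "NBP": 0.85, "EIPP": 0.7, "triad": 0.55}
redundancy = {
    frozenset(("BP", "NBP")): 0.8,
    frozenset(("BP", "EIPP")): 0.2,
    frozenset(("BP", "triad")): 0.1,
    frozenset(("NBP", "EIPP")): 0.25,
    frozenset(("NBP", "triad")): 0.15,
    frozenset(("EIPP", "triad")): 0.3,
}

ranking = mrmr_rank(relevance, redundancy)
```

Note how the highly relevant but redundant NBP drops to the bottom of the ranking: its strong correlation with the already-selected BP outweighs its individual relevance.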

  1. Selective oropharyngeal decontamination versus selective digestive decontamination in critically ill patients: a meta-analysis of randomized controlled trials

    Directory of Open Access Journals (Sweden)

    Zhao D

    2015-07-01

    Full Text Available Di Zhao,1,* Jian Song,2,* Xuan Gao,3 Fei Gao,4 Yupeng Wu,2 Yingying Lu,5 Kai Hou1 1Department of Neurosurgery, The First Hospital of Hebei Medical University, 2Department of Neurosurgery, 3Department of Neurology, The Second Hospital of Hebei Medical University, 4Hebei Provincial Procurement Centers for Medical Drugs and Devices, 5Department of Neurosurgery, The Second Hospital of Hebei Medical University, Shijiazhuang, People’s Republic of China *These authors contributed equally to this work Background: Selective digestive decontamination (SDD) and selective oropharyngeal decontamination (SOD) are associated with reduced mortality and infection rates among patients in intensive care units (ICUs); however, whether SOD is superior to SDD remains uncertain. Hence, we conducted a meta-analysis of randomized controlled trials (RCTs) to compare SOD with SDD in terms of clinical outcomes and antimicrobial resistance rates in patients who were critically ill. Methods: RCTs published in PubMed, Embase, and Web of Science were systematically reviewed to compare the effects of SOD and SDD in patients who were critically ill. Outcomes included day-28 mortality, length of ICU stay, length of hospital stay, duration of mechanical ventilation, ICU-acquired bacteremia, and prevalence of antibiotic-resistant Gram-negative bacteria. Results were expressed as risk ratios (RRs) with 95% confidence intervals (CIs), and weighted mean differences (WMDs) with 95% CIs. Pooled estimates were computed using a fixed-effects or random-effects model, depending on the heterogeneity among studies. Results: A total of four RCTs involving 23,822 patients met the inclusion criteria and were included in this meta-analysis. Among patients whose admitting specialty was surgery, cardiothoracic surgery (57.3%) and neurosurgery (29.7%) were the two main types of surgery being performed. Pooled results showed that SOD had similar effects as SDD in day-28 mortality (RR =1
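
The pooled risk-ratio estimates reported above come from inverse-variance weighting; a generic fixed-effect sketch follows. The four study risk ratios and standard errors below are invented for illustration, not the trial data from this meta-analysis:

```python
# Fixed-effect inverse-variance pooling of risk ratios (sketch).
import math

def pooled_risk_ratio(rrs, ses):
    """Combine per-study risk ratios (with standard errors of log RR)
    into a fixed-effect pooled RR and a 95% CI."""
    weights = [1 / se ** 2 for se in ses]                      # inverse-variance weights
    log_pooled = sum(w * math.log(rr) for w, rr in zip(weights, rrs)) / sum(weights)
    se_pooled = math.sqrt(1 / sum(weights))
    ci = (math.exp(log_pooled - 1.96 * se_pooled),
          math.exp(log_pooled + 1.96 * se_pooled))
    return math.exp(log_pooled), ci

# Hypothetical per-study values (RR, SE of log RR):
rr, (lo, hi) = pooled_risk_ratio([0.95, 1.05, 1.10, 0.99],
                                 [0.05, 0.08, 0.12, 0.07])
```

A random-effects model would additionally inflate each study's variance by a between-study heterogeneity term before weighting.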

  2. Mice divergently selected for high and low basal metabolic rates evolved different cell size and organ mass.

    Science.gov (United States)

    Maciak, S; Bonda-Ostaszewska, E; Czarnołęski, M; Konarzewski, M; Kozłowski, J

    2014-03-01

    Evolution of metabolic rates of multicellular organisms is hypothesized to reflect the evolution of their cell architecture. This is likely to stem from a tight link between the sizes of cells and nuclei, which are expected to be inversely related to cell metabolism. Here, we analysed basal metabolic rate (BMR), internal organ masses and the cell/nucleus size in different tissues of laboratory mice divergently selected for high/low mass-corrected BMR and four random-bred mouse lines. Random-bred lines had intermediate levels of BMR as compared to low- and high-BMR lines. Yet, this pattern was only partly consistent with the between-line differences in cell/nucleus sizes. Erythrocytes and skin epithelium cells were smaller in the high-BMR line than in other lines, but the cells of low-BMR and random-bred mice were similar in size. On the other hand, the size of hepatocytes, kidney proximal tubule cells and duodenum enterocytes were larger in high-BMR mice than other lines. All cell and nucleus sizes were positively correlated, which supports the role of the nucleus in cell size regulation. Our results suggest that the evolution of high BMR involves a reduction in cell size in specialized tissues, whose functions are primarily dictated by surface-to-volume ratios, such as erythrocytes. High BMR may, however, also incur an increase in cell size in tissues with an intense transcription and translation, such as hepatocytes. © 2014 The Authors. Journal of Evolutionary Biology © 2014 European Society For Evolutionary Biology.

  3. Scaling Limit of Symmetric Random Walk in High-Contrast Periodic Environment

    Science.gov (United States)

    Piatnitski, A.; Zhizhina, E.

    2017-11-01

    The paper deals with the asymptotic properties of a symmetric random walk in a high contrast periodic medium in Z^d, d≥1. From the existing homogenization results it follows that under diffusive scaling the limit behaviour of this random walk need not be Markovian. The goal of this work is to show that if in addition to the coordinate of the random walk in Z^d we introduce an extra variable that characterizes the position of the random walk inside the period then the limit dynamics of this two-component process is Markov. We describe the limit process and observe that the components of the limit process are coupled. We also prove the convergence in the path space for the said random walk.

  4. High-Performance Pseudo-Random Number Generation on Graphics Processing Units

    OpenAIRE

    Nandapalan, Nimalan; Brent, Richard P.; Murray, Lawrence M.; Rendell, Alistair

    2011-01-01

    This work considers the deployment of pseudo-random number generators (PRNGs) on graphics processing units (GPUs), developing an approach based on the xorgens generator to rapidly produce pseudo-random numbers of high statistical quality. The chosen algorithm has configurable state size and period, making it ideal for tuning to the GPU architecture. We present a comparison of both speed and statistical quality with other common parallel, GPU-based PRNGs, demonstrating favourable performance o...
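
The xorgens family is built from xorshift operations; a minimal sketch of a plain xorshift64 generator is shown below. The shift triple (13, 7, 17) is a standard published xorshift choice, not the exact xorgens parameterisation, and a real GPU deployment would keep one such state per thread:

```python
# Minimal xorshift64 pseudo-random number generator (sketch).
MASK64 = (1 << 64) - 1

def xorshift64(seed):
    """Yield a stream of 64-bit pseudo-random integers from a non-zero seed."""
    x = seed & MASK64
    assert x != 0, "xorshift state must be non-zero"
    while True:
        x ^= (x << 13) & MASK64   # left shift, truncated to 64 bits
        x ^= x >> 7
        x ^= (x << 17) & MASK64
        yield x

gen = xorshift64(88172645463325252)
sample = [next(gen) for _ in range(5)]
```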

  5. Pseudo-random-number generators and the square site percolation threshold.

    Science.gov (United States)

    Lee, Michael J

    2008-09-01

    Selected pseudo-random-number generators are applied to a Monte Carlo study of the two-dimensional square-lattice site percolation model. A generator suitable for high precision calculations is identified from an application specific test of randomness. After extended computation and analysis, an ostensibly reliable value of p_{c}=0.59274598(4) is obtained for the percolation threshold.

  6. Acceptance sampling using judgmental and randomly selected samples

    Energy Technology Data Exchange (ETDEWEB)

    Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl

    2010-09-01

    We present a Bayesian model for acceptance sampling where the population consists of two groups, each with a different level of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high- and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail, and discuss the conditions necessary to ensure that required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems and, in particular, to environmental sampling where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.
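
The simpler Bayesian special case mentioned above can be sketched for the all-acceptable situation: with a Beta(1, 1) prior on the defect rate and n randomly sampled items all found acceptable, the posterior is Beta(1, n + 1), and we ask how likely the defect rate is to lie below a limit. Numbers below are illustrative only:

```python
# Posterior probability that the defect rate is below a limit,
# by trapezoidal integration of the Beta posterior density.
import math

def posterior_prob_defect_below(limit, n_clean, a=1.0, b=1.0):
    """P(defect rate < limit) under a Beta(a, b + n_clean) posterior."""
    a_post, b_post = a, b + n_clean
    norm = math.gamma(a_post + b_post) / (math.gamma(a_post) * math.gamma(b_post))
    steps = 10000
    h = limit / steps
    total = 0.0
    for i in range(steps + 1):
        x = i * h
        w = 0.5 if i in (0, steps) else 1.0   # trapezoid endpoint weights
        total += w * x ** (a_post - 1) * (1 - x) ** (b_post - 1)
    return norm * total * h

# With 59 clean random samples, how sure are we the defect rate is under 5%?
p = posterior_prob_defect_below(0.05, n_clean=59)
```

For this uniform-prior case the answer has a closed form, 1 − (1 − limit)^(n+1), which the integration reproduces.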

  7. Ceramic-supported thin PVA pervaporation membranes combining high flux and high selectivity : contradicting the flux-selectivity paradigm

    NARCIS (Netherlands)

    Peters, T.A.; Poeth, C.H.S.; Benes, N.E.; Buijs, H.C.W.M.; Vercauteren, F.F.; Keurentjes, J.T.F.

    2006-01-01

    Thin, high-flux and highly selective cross-linked poly(vinyl) alcohol water-selective layers have been prepared on top of hollow fibre ceramic supports. The supports consist of an alpha-Al2O3 hollow fibre substrate and an intermediate gamma-Al2O3 layer, which provides a sufficiently smooth surface for

  8. Implications of structural genomics target selection strategies: Pfam5000, whole genome, and random approaches

    Energy Technology Data Exchange (ETDEWEB)

    Chandonia, John-Marc; Brenner, Steven E.

    2004-07-14

    The structural genomics project is an international effort to determine the three-dimensional shapes of all important biological macromolecules, with a primary focus on proteins. Target proteins should be selected according to a strategy which is medically and biologically relevant, of good value, and tractable. As an option to consider, we present the Pfam5000 strategy, which involves selecting the 5000 most important families from the Pfam database as sources for targets. We compare the Pfam5000 strategy to several other proposed strategies that would require similar numbers of targets. These include complete solution of several small to moderately sized bacterial proteomes, partial coverage of the human proteome, and random selection of approximately 5000 targets from sequenced genomes. We measure the impact that successful implementation of these strategies would have upon structural interpretation of the proteins in Swiss-Prot, TrEMBL, and 131 complete proteomes (including 10 of eukaryotes) from the Proteome Analysis database at EBI. Solving the structures of proteins from the 5000 largest Pfam families would allow accurate fold assignment for approximately 68 percent of all prokaryotic proteins (covering 59 percent of residues) and 61 percent of eukaryotic proteins (40 percent of residues). More fine-grained coverage, which would allow accurate modeling of these proteins, would require an order of magnitude more targets. The Pfam5000 strategy may be modified in several ways, for example to focus on larger families, bacterial sequences, or eukaryotic sequences; as long as secondary consideration is given to large families within Pfam, coverage results vary only slightly. In contrast, focusing structural genomics on a single tractable genome would have only a limited impact on structural knowledge of other proteomes: a significant fraction (about 30-40 percent of the proteins, and 40-60 percent of the residues) of each proteome is classified in small
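
The coverage comparison above can be sketched by treating families as sets of protein identifiers and measuring the fraction of a proteome whose members fall in a solved family. All data below are invented for illustration:

```python
# Proteome coverage of a target-selection strategy (sketch).
def coverage(families_solved, proteome):
    """Fraction of the proteome covered by the union of solved families."""
    covered = set().union(*families_solved) if families_solved else set()
    return len(covered & proteome) / len(proteome)

proteome = set(range(100))          # 100 hypothetical proteins
fam_a = set(range(0, 40))           # one large family
fam_b = set(range(35, 60))          # overlapping mid-size family
fam_c = set(range(90, 95))          # small family

big_first = coverage([fam_a, fam_b], proteome)   # Pfam5000-style: largest families first
random_pick = coverage([fam_c], proteome)        # a small random pick covers little
```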

  9. Different Levels of DNA Methylation Detected in Human Sperms after Morphological Selection Using High Magnification Microscopy

    Directory of Open Access Journals (Sweden)

    Nino Guy Cassuto

    2016-01-01

    Full Text Available Objective. To analyze DNA methylation levels between two groups of spermatozoa taken from the same sample, following morphological selection by high magnification (HM) at 6100x microscopy. A prospective study was conducted on 876 spermatozoa from 10 randomly selected men. Sperm morphology was characterized at HM according to previously established criteria. High-scoring (Score 6; S6) and low-scoring (Score 0; S0) sperm were selected. Sperm DNA methylation level was assessed using an immunoassay method targeting 5-methylcytosine residues by fluorescence microscopy, with an image analysis system to detect DNA methylation in a single spermatozoon. Results. In total, 448 S6 spermatozoa and 428 S0 spermatozoa were analyzed. A strong relationship was found between sperm DNA methylation levels and sperm morphology observed at HM. The sperm DNA methylation level in the S6 group was significantly lower than that in the S0 group (p < 10⁻⁶, OR = 2.4; and p < 0.001, as determined using the Wilcoxon test). Conclusion. Differences in DNA methylation levels are associated with sperm morphology variations as observed at HM, which allows spermatozoa with abnormal levels to be discarded and may ultimately decrease birth defects, malformations, and epigenetic diseases that could be transmitted from sperm to offspring in ICSI.

  10. Sparse Bayesian classification and feature selection for biological expression data with high correlations.

    Directory of Open Access Journals (Sweden)

    Xian Yang

    Full Text Available Classification models built on biological expression data are increasingly used to predict distinct disease subtypes. Selected features that separate sample groups can be candidate biomarkers, helping us to discover biological functions/pathways. However, three challenges are associated with building a robust classification and feature selection model: (1) the number of significant biomarkers is much smaller than that of measured features, for which the search will be exhaustive; (2) current biological expression data are big in both sample size and feature size, which worsens the scalability of any search algorithm; and (3) expression profiles of certain features are typically highly correlated, which may prevent the predominant features from being distinguished. Unfortunately, most existing algorithms address only some of these challenges, not all of them as a whole. In this paper, we propose a unified framework to address the above challenges. The classification and feature selection problem is first formulated as a nonconvex optimisation problem. The problem is then relaxed and solved iteratively by a sequence of convex optimisation procedures, which can be computed in a distributed fashion and therefore allow efficient implementation on advanced infrastructures. To illustrate the advantage of our method over others, we first analyse a randomly generated simulation dataset under various conditions. We then analyse a real gene expression dataset on embryonal tumour. Further downstream analyses, such as functional annotation and pathway analysis, are performed on the selected features, which elucidate several biological findings.
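
The abstract does not specify the convex subproblems; as a generic illustration of one convex building block commonly used in sparse feature selection, here is a lasso-style soft-thresholding (proximal) update, applied to invented coefficients. This is not the paper's algorithm, only a sketch of the kind of step such a relaxation sequence iterates:

```python
# Soft-thresholding: the proximal operator of the L1 penalty, which
# shrinks coefficients toward zero and zeroes out the small ones.
def soft_threshold(w, lam):
    return [max(abs(x) - lam, 0.0) * (1 if x > 0 else -1) for x in w]

# Hypothetical coefficient vector; features 2 and 3 are weak and get pruned.
w = soft_threshold([0.9, -0.2, 0.05, -1.4], lam=0.25)
```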

  11. Participant-selected music and physical activity in older adults following cardiac rehabilitation: a randomized controlled trial.

    Science.gov (United States)

    Clark, Imogen N; Baker, Felicity A; Peiris, Casey L; Shoebridge, Georgie; Taylor, Nicholas F

    2017-03-01

    To evaluate effects of participant-selected music on older adults' achievement of activity levels recommended in the physical activity guidelines following cardiac rehabilitation. A parallel group randomized controlled trial with measurements at Weeks 0, 6 and 26. A multisite outpatient rehabilitation programme of a publicly funded metropolitan health service. Adults aged 60 years and older who had completed a cardiac rehabilitation programme. Experimental participants selected music to support walking with guidance from a music therapist. Control participants received usual care only. The primary outcome was the proportion of participants achieving activity levels recommended in physical activity guidelines. Secondary outcomes compared amounts of physical activity, exercise capacity, cardiac risk factors, and exercise self-efficacy. A total of 56 participants, mean age 68.2 years (SD = 6.5), were randomized to the experimental (n = 28) and control (n = 28) groups. There were no differences between groups in proportions of participants achieving activity recommended in physical activity guidelines at Week 6 or 26. Secondary outcomes demonstrated between-group differences in male waist circumference at both measurements (Week 6 difference -2.0 cm, 95% CI -4.0 to 0; Week 26 difference -2.8 cm, 95% CI -5.4 to -0.1), and observed effect sizes favoured the experimental group for amounts of physical activity (d = 0.30), exercise capacity (d = 0.48), and blood pressure (d = -0.32). Participant-selected music did not increase the proportion of participants achieving recommended amounts of physical activity, but may have contributed to exercise-related benefits.

  12. r2VIM: A new variable selection method for random forests in genome-wide association studies.

    Science.gov (United States)

    Szymczak, Silke; Holzinger, Emily; Dasgupta, Abhijit; Malley, James D; Molloy, Anne M; Mills, James L; Brody, Lawrence C; Stambolian, Dwight; Bailey-Wilson, Joan E

    2016-01-01

    Machine learning methods and in particular random forests (RFs) are a promising alternative to standard single SNP analyses in genome-wide association studies (GWAS). RFs provide variable importance measures (VIMs) to rank SNPs according to their predictive power. However, in contrast to the established genome-wide significance threshold, no clear criteria exist to determine how many SNPs should be selected for downstream analyses. We propose a new variable selection approach, recurrent relative variable importance measure (r2VIM). Importance values are calculated relative to an observed minimal importance score for several runs of RF and only SNPs with large relative VIMs in all of the runs are selected as important. Evaluations on simulated GWAS data show that the new method controls the number of false-positives under the null hypothesis. Under a simple alternative hypothesis with several independent main effects it is only slightly less powerful than logistic regression. In an experimental GWAS data set, the same strong signal is identified while the approach selects none of the SNPs in an underpowered GWAS. The novel variable selection method r2VIM is a promising extension to standard RF for objectively selecting relevant SNPs in GWAS while controlling the number of false-positive results.
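
The r2VIM selection rule described above, rescaling each run's importances by the absolute value of its observed minimal importance and keeping only variables that exceed a factor in every run, can be sketched as follows. The importance scores are invented:

```python
# r2VIM-style variable selection sketch across several random-forest runs.
def r2vim_select(runs, factor=1.0):
    """runs: list of {variable: importance} dicts, one per RF run.
    Keep variables whose relative importance >= factor in every run."""
    relative = []
    for imp in runs:
        floor = abs(min(imp.values()))           # observed minimal (negative) importance
        relative.append({v: s / floor for v, s in imp.items()})
    variables = runs[0].keys()
    return [v for v in variables
            if all(rel[v] >= factor for rel in relative)]

# Hypothetical importances from two RF runs over three SNPs:
runs = [
    {"snp1": 5.0, "snp2": 0.4, "snp3": -0.5},
    {"snp1": 4.0, "snp2": 1.2, "snp3": -0.8},
]
selected = r2vim_select(runs, factor=1.0)
```

Only snp1 clears the relative-importance bar in both runs; snp2's borderline score in the first run excludes it, which is the intended false-positive control.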

  13. How High School Students Select a College.

    Science.gov (United States)

    Gilmour, Joseph E., Jr.; And Others

    The college selection process used by high school students was studied and a paradigm that describes the process was developed, based on marketing theory concerning consumer behavior. Primarily college freshmen and high school seniors were interviewed, and a few high school juniors and upper-level college students were surveyed to determine…

  14. A simple highly sensitive and selective aptamer-based colorimetric sensor for environmental toxins microcystin-LR in water samples.

    Science.gov (United States)

    Li, Xiuyan; Cheng, Ruojie; Shi, Huijie; Tang, Bo; Xiao, Hanshuang; Zhao, Guohua

    2016-03-05

    A simple and highly sensitive aptamer-based colorimetric sensor was developed for the selective detection of microcystin-LR (MC-LR). The aptamer (ABA) was employed as the recognition element, which could bind MC-LR with high affinity, while gold nanoparticles (AuNPs) served as the sensing material, whose plasmon resonance absorption peak red-shifts upon binding of the targets at a high concentration of sodium chloride. With the addition of MC-LR, the random-coil aptamer adsorbed on the AuNPs folded into an ordered structure to form MC-LR-aptamer complexes and detached from the surface of the AuNPs, leading to aggregation of the AuNPs; the color changed from red to blue due to interparticle plasmon coupling. Results showed that our aptamer-based colorimetric sensor exhibited rapid and sensitive detection of MC-LR, with a linear range from 0.5 nM to 7.5 μM and a detection limit of 0.37 nM. Meanwhile, the pollutants usually coexisting with MC-LR in polluted water samples did not interfere with the detection of MC-LR. A mechanism was also proposed, suggesting that the high-affinity interaction between the aptamer and MC-LR significantly enhanced the sensitivity and selectivity of MC-LR detection. In addition, the established method was applied to the analysis of real water samples, where excellent sensitivity and selectivity were likewise obtained. Copyright © 2015 Elsevier B.V. All rights reserved.
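
Detection limits like the 0.37 nM quoted above are conventionally derived from a calibration curve as LOD = 3·sd(blank)/slope. A generic sketch with invented calibration numbers (not the paper's data):

```python
# 3-sigma limit-of-detection calculation from blank readings and slope.
def lod_3sigma(blank_readings, slope):
    """LOD = 3 * sample standard deviation of blanks / calibration slope."""
    mean = sum(blank_readings) / len(blank_readings)
    sd = (sum((x - mean) ** 2 for x in blank_readings)
          / (len(blank_readings) - 1)) ** 0.5
    return 3 * sd / slope

# Hypothetical blank absorbance-ratio readings and slope (per nM):
lod = lod_3sigma([0.101, 0.103, 0.099, 0.102, 0.100], slope=0.004)
```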

  15. A theory for the origin of a self-replicating chemical system. I - Natural selection of the autogen from short, random oligomers

    Science.gov (United States)

    White, D. H.

    1980-01-01

    A general theory is presented for the origin of a self-replicating chemical system, termed an autogen, which is capable of both crude replication and translation (protein synthesis). The theory requires the availability of free energy and monomers to the system, a significant background low-yield synthesis of kinetically stable oligopeptides and oligonucleotides, the localization of the oligomers, crude oligonucleotide selectivity for amino acids during oligopeptide synthesis, crude oligonucleotide replication, and two short peptide families which catalyze replication and translation, to produce a localized group of at least one copy each of two protogenes and two protoenzymes. The model posits a process of random oligomerization, followed by the random nucleation of functional components and the rapid autocatalytic growth of the functioning autogen to macroscopic amounts, to account for the origin of the first self-replicating system. Such a process involves steps of sufficiently high probability and sufficiently short duration that the emergence of an autogen in a laboratory experiment of reasonable time scale may be possible.

  16. Random number generation and creativity.

    Science.gov (United States)

    Bains, William

    2008-01-01

    A previous paper suggested that humans can generate genuinely random numbers. I tested this hypothesis by repeating the experiment with a larger number of highly numerate subjects, asking them to call out a sequence of digits selected from 0 through 9. The resulting sequences were substantially non-random, with an excess of sequential pairs of numbers and a deficit of repeats of the same number, in line with previous literature. However, the previous literature suggests that humans generate random numbers with substantial conscious effort, and that distractions which reduce that effort reduce the randomness of the numbers. I reduced my subjects' concentration by asking them to call out in another language, and with alcohol - neither affected the randomness of their responses. This suggests that the ability to generate random numbers is a 'basic' function of the human mind, even if those numbers are not mathematically 'random'. I hypothesise that there is a 'creativity' mechanism which, while not truly random, provides novelty as part of the mind's defence against closed programming loops, and that testing for the effects seen here in people more or less familiar with numbers or with spontaneous creativity could identify more features of this process. It is possible that training to perform better at simple random generation tasks could help to increase creativity, by training people to reduce the conscious mind's suppression of the 'spontaneous', creative response to new questions.
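
The two departures from randomness reported above, an excess of sequential pairs and a deficit of immediate repeats, can be counted directly over adjacent pairs. The digit sequence below is hypothetical:

```python
# Count sequential pairs and immediate repeats in a called-out digit sequence.
def pair_stats(digits):
    """Return (sequential_pairs, repeats, expected_count_per_relation).
    Under uniform random digits, each specific relation (e.g. "next digit
    is previous + 1 mod 10") has probability 1/10 per adjacent pair."""
    pairs = list(zip(digits, digits[1:]))
    seq = sum(1 for a, b in pairs if (a + 1) % 10 == b)
    rep = sum(1 for a, b in pairs if a == b)
    return seq, rep, len(pairs) / 10

digits = [3, 4, 5, 9, 0, 2, 7, 8, 1, 6, 7, 3]   # hypothetical subject output
seq, rep, expected = pair_stats(digits)
```

Here the hypothetical sequence shows 5 sequential pairs against an expectation of about 1.1, and no repeats, the same qualitative pattern the abstract describes.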

  17. Mirnacle: machine learning with SMOTE and random forest for improving selectivity in pre-miRNA ab initio prediction.

    Science.gov (United States)

    Marques, Yuri Bento; de Paiva Oliveira, Alcione; Ribeiro Vasconcelos, Ana Tereza; Cerqueira, Fabio Ribeiro

    2016-12-15

    MicroRNAs (miRNAs) are key gene expression regulators in plants and animals. miRNAs are therefore involved in several biological processes, making the study of these molecules one of the most relevant topics of molecular biology today. However, characterizing miRNAs in vivo is still a complex task. As a consequence, in silico methods have been developed to predict miRNA loci. A common ab initio strategy to find miRNAs in genomic data is to search for sequences that can fold into the typical hairpin structure of miRNA precursors (pre-miRNAs). The current ab initio approaches, however, have selectivity issues, i.e., a high number of false positives is reported, which can lead to laborious and costly attempts at biological validation. This study presents an extension of the ab initio method miRNAFold, with the aim of improving selectivity through machine learning techniques, namely, random forest combined with the SMOTE procedure, which copes with imbalanced datasets. By comparing our method, termed Mirnacle, with other important approaches in the literature, we demonstrate that Mirnacle substantially improves selectivity without compromising sensitivity. For the three datasets used in our experiments, our method achieved at least 97% sensitivity and delivered a two-fold, 20-fold, and 6-fold increase in selectivity, respectively, compared with the best results of current computational tools. The extension of miRNAFold by the introduction of machine learning techniques significantly increases selectivity in pre-miRNA ab initio prediction, which optimally contributes to advanced studies on miRNAs, as the need for biological validation is diminished. Hopefully, new research, such as studies of severe diseases caused by miRNA malfunction, will benefit from the proposed computational tool.
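
The SMOTE procedure named above generates synthetic minority-class points by interpolating between a minority sample and one of its minority-class nearest neighbours. A toy 2-D sketch of the idea (not the implementation used in Mirnacle):

```python
# Toy SMOTE: interpolate between a minority point and a near minority neighbour.
import random

def smote(minority, n_new, k=2, rng=None):
    """Generate n_new synthetic points from a list of 2-D minority samples."""
    rng = rng or random.Random(0)
    synthetic = []
    for _ in range(n_new):
        base = rng.choice(minority)
        # k nearest minority neighbours of `base` (excluding itself)
        neigh = sorted((p for p in minority if p is not base),
                       key=lambda p: (p[0] - base[0]) ** 2 + (p[1] - base[1]) ** 2)[:k]
        other = rng.choice(neigh)
        gap = rng.random()                     # position along the segment
        synthetic.append(tuple(b + gap * (o - b) for b, o in zip(base, other)))
    return synthetic

minority = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
new_points = smote(minority, n_new=6)
```

Because each synthetic point lies on a segment between two existing minority points, the oversampled class stays inside its original region rather than simply duplicating samples.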

  18. Two-year Randomized Clinical Trial of Self-etching Adhesives and Selective Enamel Etching.

    Science.gov (United States)

    Pena, C E; Rodrigues, J A; Ely, C; Giannini, M; Reis, A F

    2016-01-01

    The aim of this randomized, controlled prospective clinical trial was to evaluate the clinical effectiveness of restoring noncarious cervical lesions with two self-etching adhesive systems applied with or without selective enamel etching. A one-step self-etching adhesive (Xeno V(+)) and a two-step self-etching system (Clearfil SE Bond) were used. The effectiveness of phosphoric acid selective etching of enamel margins was also evaluated. Fifty-six cavities were restored with each adhesive system and divided into two subgroups (n=28; etch and non-etch). All 112 cavities were restored with the nanohybrid composite Esthet.X HD. The clinical effectiveness of restorations was recorded in terms of retention, marginal integrity, marginal staining, caries recurrence, and postoperative sensitivity after 3, 6, 12, 18, and 24 months (modified United States Public Health Service). The Friedman test detected significant differences only after 18 months for marginal staining in the groups Clearfil SE non-etch (p=0.009) and Xeno V(+) etch (p=0.004). One restoration was lost during the trial (Xeno V(+) etch; p>0.05). Although an increase in marginal staining was recorded for groups Clearfil SE non-etch and Xeno V(+) etch, the clinical effectiveness of restorations was considered acceptable for the single-step and two-step self-etching systems with or without selective enamel etching in this 24-month clinical trial.

  19. Optimized bioregenerative space diet selection with crew choice

    Science.gov (United States)

    Vicens, Carrie; Wang, Carolyn; Olabi, Ammar; Jackson, Peter; Hunter, Jean

    2003-01-01

    Previous studies on the optimization of crew diets have not accounted for choice. A diet selection model with crew choice was developed. Scenario analyses were conducted to assess the feasibility and cost of certain crew preferences, such as preferences for numerous desserts and for high-salt or high-acceptability foods. For comparison purposes, no-choice and random-choice scenarios were considered. The model was found to be feasible in terms of food variety and overall costs. The numerous-desserts, high-acceptability, and random-choice scenarios all resulted in feasible solutions costing between 13.2 and 17.3 kg ESM/person-day. Only the high-sodium scenario yielded an infeasible solution. This occurred when the foods highest in salt content were selected for the crew-choice portion of the diet. This infeasibility can be avoided by limiting the total sodium content in the crew-choice portion of the diet. Cost savings were found by reducing food variety in scenarios where the preference bias strongly affected nutritional content.
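
The scenario test can be sketched as a tiny brute-force version of the diet model: choose the crew-choice subset that maximises acceptability subject to a sodium cap, returning infeasibility when no subset satisfies it, as in the high-sodium scenario. All food data below are invented:

```python
# Brute-force crew-choice diet selection under a sodium constraint (sketch).
from itertools import combinations

foods = {                       # name: (acceptability score, sodium mg, ESM kg)
    "dessert_a": (9, 50, 0.4),
    "dessert_b": (8, 40, 0.5),
    "soup":      (6, 900, 0.3),
    "crackers":  (5, 450, 0.2),
    "fruit":     (7, 5, 0.6),
}

def best_choice(k, sodium_cap):
    """Best k-food crew-choice subset under the sodium cap, or None if infeasible."""
    feasible = [c for c in combinations(foods, k)
                if sum(foods[f][1] for f in c) <= sodium_cap]
    if not feasible:
        return None             # the scenario is infeasible, as with high salt
    return max(feasible, key=lambda c: sum(foods[f][0] for f in c))

picked = best_choice(3, sodium_cap=600)
```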

  20. Random effect selection in generalised linear models

    DEFF Research Database (Denmark)

    Denwood, Matt; Houe, Hans; Forkman, Björn

    We analysed abattoir recordings of meat inspection codes with possible relevance to on-farm animal welfare in cattle. Random-effects logistic regression models were used to describe individual-level data obtained from 461,406 cattle slaughtered in Denmark. Our results demonstrate that the largest

  1. Familial versus mass selection in small populations

    Directory of Open Access Journals (Sweden)

    Couvet Denis

    2003-07-01

    Full Text Available Abstract We used diffusion approximations and a Markov-chain approach to investigate the consequences of familial selection on the viability of small populations both in the short and in the long term. The outcome of familial selection was compared to the case of a random mating population under mass selection. In small populations, the higher effective size associated with familial selection resulted in higher fitness for slightly deleterious and/or highly recessive alleles. Conversely, because familial selection leads to a lower rate of directional selection, a lower fitness was observed for more detrimental genes that are not highly recessive, and with high population sizes. However, in the long term, genetic load was almost identical under both mass and familial selection for populations of up to 200 individuals. In terms of mean time to extinction, familial selection did not have any negative effect, at least for small populations (N ≤ 50). Overall, familial selection could be proposed for use in management programs of small populations since it increases genetic variability and short-term viability without impairing overall persistence times.

  2. Comparison of Psychological and Physiological Responses to Imposed vs. Self-selected High-Intensity Interval Training.

    Science.gov (United States)

    Kellogg, Erin; Cantacessi, Cheyann; McNamer, Olivia; Holmes, Heather; von Bargen, Robert; Ramirez, Richard; Gallagher, Daren; Vargas, Stacy; Santia, Ben; Rodriguez, Karen; Astorino, Todd A

    2018-05-08

    Kellogg, E, Cantacessi, C, McNamer, O, Holmes, H, von Bargen, R, Ramirez, R, Gallagher, D, Vargas, S, Santia, B, Rodriguez, K, and Astorino, TA. Comparison of psychological and physiological responses to imposed vs. self-selected high-intensity interval training. J Strength Cond Res XX(X): 000-000, 2018-High-intensity interval training elicits physiological adaptations similar to those of moderate intensity continuous training (MICT). Some studies report greater enjoyment of a bout of high-intensity interval exercise (HIIE) vs. MICT, which is surprising considering that HIIE is more intense and typically imposed on the participant. This study compared physiological and perceptual responses between imposed and self-selected HIIE. Fourteen adults (age = 24 ± 3 years) unfamiliar with HIIE initially performed ramp exercise to exhaustion to measure maximal oxygen uptake (V̇O2max), followed by 2 subsequent sessions whose order was randomized. Imposed HIIE (HIIEIMP) consisted of eight 60-second bouts at 80 percent peak power output (%PPO) separated by 60-second recovery at 10 %PPO. Self-selected HIIE (HIIESS) followed the same structure, but participants freely selected intensity in increments of 10 %PPO to achieve a rating of perceived exertion (RPE) ≥7. During exercise, heart rate, V̇O2, blood lactate concentration (BLa), affect (+5 to -5), and RPE were assessed. The Physical Activity Enjoyment Scale was administered after exercise. Results showed higher V̇O2 (+10%, p = 0.013), BLa (p = 0.001), and RPE (p = 0.001) in HIIESS vs. HIIEIMP, and lower affect (p = 0.01) and enjoyment (87.6 ± 15.7 vs. 95.7 ± 11.7, p = 0.04). There was a significantly higher power output in self-selected vs. imposed HIIE (263.9 ± 81.4 W vs. 225.2 ± 59.6 W, p < 0.001). The data suggest that intensity mediates affective responses rather than the mode of HIIE performed by the participant.

  3. Cyclic cholecystokinin analogues with high selectivity for central receptors

    International Nuclear Information System (INIS)

    Charpentier, B.; Pelaprat, D.; Durieux, C.; Dor, A.; Roques, B.P.; Reibaud, M.; Blanchard, J.C.

    1988-01-01

    Taking as a model the N-terminal folding of the cholecystokinin tyrosine-sulfated octapeptide deduced from conformational studies, two cyclic cholecystokinin (CCK) analogues were synthesized by conventional peptide synthesis. The binding characteristics of these peptides were investigated on brain cortex membranes and pancreatic acini of guinea pig. Compounds I and II were competitive inhibitors of [³H]Boc[Ahx28,31]CCK-(27-33) binding to central CCK receptors and showed a high degree of selectivity for these binding sites. This high selectivity was associated with a high affinity for central CCK receptors. Similar affinities and selectivities were found when ¹²⁵I-Bolton-Hunter-labeled CCK-8 was used as a ligand. Moreover, these compounds were only weakly active in the stimulation of amylase release from guinea pig pancreatic acini and were unable to induce contractions in the guinea pig ileum. The two cyclic CCK analogues, therefore, appear to be synthetic ligands exhibiting both high affinity and high selectivity for central CCK binding sites. These compounds could help clarify the respective roles of central and peripheral receptors in various CCK-8-induced pharmacological effects.

  4. Selecting Optimal Parameters of Random Linear Network Coding for Wireless Sensor Networks

    DEFF Research Database (Denmark)

    Heide, J; Zhang, Qi; Fitzek, F H P

    2013-01-01

    This work studies how to select optimal code parameters of Random Linear Network Coding (RLNC) in Wireless Sensor Networks (WSNs). With Rateless Deluge [1] the authors proposed to apply Network Coding (NC) for Over-the-Air Programming (OAP) in WSNs, and demonstrated that with NC a significant reduction in the number of transmitted packets can be achieved. However, NC introduces additional computations and potentially a non-negligible transmission overhead, both of which depend on the chosen coding parameters. Therefore it is necessary to consider the trade-off that these coding parameters present in order to obtain the lowest energy consumption per transmitted bit. This problem is analyzed and suitable coding parameters are determined for the popular Tmote Sky platform. Compared to the use of traditional RLNC, these parameters enable a reduction in the energy spent per bit which grows…
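The coding-parameter trade-off described above can be made concrete with a toy RLNC implementation over GF(2), the cheapest field for sensor-class hardware. This is an illustrative sketch, not the authors' Tmote Sky code; packet contents and sizes are placeholders.

```python
import random

def rlnc_encode(packets, rng):
    """One coded packet: XOR of a random nonzero subset of the k source
    packets, i.e., random linear coding over GF(2)."""
    k = len(packets)
    coeffs = [rng.randint(0, 1) for _ in range(k)]
    if not any(coeffs):
        coeffs[rng.randrange(k)] = 1  # skip the useless all-zero vector
    payload = 0
    for c, p in zip(coeffs, packets):
        if c:
            payload ^= p
    return coeffs, payload

def rlnc_decode(coded, k):
    """Gauss-Jordan elimination over GF(2): returns the k source packets
    once the received coefficient vectors reach full rank, else None."""
    rows = [(c[:], p) for c, p in coded]
    r = 0
    for col in range(k):
        sel = next((i for i in range(r, len(rows)) if rows[i][0][col]), None)
        if sel is None:
            return None  # not yet full rank; wait for more coded packets
        rows[r], rows[sel] = rows[sel], rows[r]
        for i in range(len(rows)):
            if i != r and rows[i][0][col]:
                ci, pi = rows[i]
                cr, pr = rows[r]
                rows[i] = ([a ^ b for a, b in zip(ci, cr)], pi ^ pr)
        r += 1
    return [rows[i][1] for i in range(k)]

# Recover 4 source packets (bytes modeled as ints) from coded packets.
rng = random.Random(1)
packets = [0x12, 0x34, 0x56, 0x78]
coded, decoded = [], None
while decoded is None:
    coded.append(rlnc_encode(packets, rng))
    decoded = rlnc_decode(coded, len(packets))
assert decoded == packets
```

GF(2) keeps per-packet cost at k coefficient bits plus XORs, but linearly dependent packets occur more often than over a larger field such as GF(2^8), so more transmissions are needed on average; that is precisely the energy-per-bit trade-off the paper analyzes.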

  5. On Random Numbers and Design

    Science.gov (United States)

    Ben-Ari, Morechai

    2004-01-01

    The term "random" is frequently used in discussion of the theory of evolution, even though the mathematical concept of randomness is problematic and of little relevance in the theory. Therefore, since the core concept of the theory of evolution is the non-random process of natural selection, the term random should not be used in teaching the…

  6. Selecting for Fast Protein-Protein Association As Demonstrated on a Random TEM1 Yeast Library Binding BLIP.

    Science.gov (United States)

    Cohen-Khait, Ruth; Schreiber, Gideon

    2018-04-27

    Protein-protein interactions mediate the vast majority of cellular processes. Though protein interactions obey basic chemical principles also within the cell, the in vivo physiological environment may not allow for equilibrium to be reached. Thus, in vitro measured thermodynamic affinity may not provide a complete picture of protein interactions in the biological context. Binding kinetics composed of the association and dissociation rate constants are relevant and important in the cell. Therefore, changes in protein-protein interaction kinetics have a significant impact on the in vivo activity of the proteins. The common protocol for the selection of tighter binders from a mutant library selects for protein complexes with slower dissociation rate constants. Here we describe a method to specifically select for variants with faster association rate constants by using pre-equilibrium selection, starting from a large random library. Toward this end, we refine the selection conditions of a TEM1-β-lactamase library against its natural nanomolar affinity binder β-lactamase inhibitor protein (BLIP). The optimal selection conditions depend on the ligand concentration and on the incubation time. In addition, we show that a second sort of the library helps to separate signal from noise, resulting in a higher percent of faster binders in the selected library. Fast associating protein variants are of particular interest for drug development and other biotechnological applications.

  7. Bayesian Multiresolution Variable Selection for Ultra-High Dimensional Neuroimaging Data.

    Science.gov (United States)

    Zhao, Yize; Kang, Jian; Long, Qi

    2018-01-01

    Ultra-high dimensional variable selection has become increasingly important in analysis of neuroimaging data. For example, in the Autism Brain Imaging Data Exchange (ABIDE) study, neuroscientists are interested in identifying important biomarkers for early detection of the autism spectrum disorder (ASD) using high resolution brain images that include hundreds of thousands of voxels. However, most existing methods are not feasible for solving this problem due to their extensive computational costs. In this work, we propose a novel multiresolution variable selection procedure under a Bayesian probit regression framework. It recursively uses posterior samples for coarser-scale variable selection to guide the posterior inference on finer-scale variable selection, leading to very efficient Markov chain Monte Carlo (MCMC) algorithms. The proposed algorithms are computationally feasible for ultra-high dimensional data. Also, our model incorporates two levels of structural information into variable selection using Ising priors: the spatial dependence between voxels and the functional connectivity between anatomical brain regions. Applied to the resting state functional magnetic resonance imaging (R-fMRI) data in the ABIDE study, our methods identify voxel-level imaging biomarkers highly predictive of the ASD, which are biologically meaningful and interpretable. Extensive simulations also show that our methods achieve better performance in variable selection compared to existing methods.

  8. High-speed true random number generation based on paired memristors for security electronics

    Science.gov (United States)

    Zhang, Teng; Yin, Minghui; Xu, Changmin; Lu, Xiayan; Sun, Xinhao; Yang, Yuchao; Huang, Ru

    2017-11-01

    True random number generator (TRNG) is a critical component in hardware security that is increasingly important in the era of mobile computing and internet of things. Here we demonstrate a TRNG using intrinsic variation of memristors as a natural source of entropy that is otherwise undesirable in most applications. The random bits were produced by cyclically switching a pair of tantalum oxide based memristors and comparing their resistance values in the off state, taking advantage of the more pronounced resistance variation compared with that in the on state. Using an alternating read scheme in the designed TRNG circuit, the unbiasedness of the random numbers was significantly improved, and the bitstream passed standard randomness tests. The Pt/TaOx/Ta memristors fabricated in this work have fast programming/erasing speeds of ~30 ns, suggesting a high random number throughput. The approach proposed here thus holds great promise for physically-implemented random number generation.
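The comparison-based bit generation and the alternating read scheme can be mimicked in a short simulation. The Gaussian off-state resistance model and the deliberate device mismatch below are assumptions for illustration, not measurements from the paper; flipping every other bit is mathematically equivalent to swapping which device is read first.

```python
import random

def read_off_resistance(mean, spread, rng):
    """One off-state read: cycle-to-cycle resistance variation (modeled
    here as Gaussian) is the entropy source."""
    return rng.gauss(mean, spread)

def generate_bits(n, rng, alternating=True):
    bits = []
    for cycle in range(n):
        r_a = read_off_resistance(1.05e6, 1e5, rng)  # device A: offset high
        r_b = read_off_resistance(1.00e6, 1e5, rng)  # device B
        bit = 1 if r_a > r_b else 0
        if alternating and cycle % 2 == 1:
            bit ^= 1  # alternate comparison order: fixed offset cancels
        bits.append(bit)
    return bits

raw = generate_bits(10_000, random.Random(7), alternating=False)
alt = generate_bits(10_000, random.Random(7), alternating=True)
raw_ones = sum(raw) / len(raw)  # pulled away from 0.5 by the mismatch
alt_ones = sum(alt) / len(alt)  # close to 0.5 despite the mismatch
```

A real implementation would still validate the output bitstream with a standard randomness test suite, as the authors did.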

  9. A Permutation Importance-Based Feature Selection Method for Short-Term Electricity Load Forecasting Using Random Forest

    Directory of Open Access Journals (Sweden)

    Nantian Huang

    2016-09-01

    Full Text Available The prediction accuracy of short-term load forecast (STLF depends on prediction model choice and feature selection result. In this paper, a novel random forest (RF-based feature selection method for STLF is proposed. First, 243 related features were extracted from historical load data and the time information of prediction points to form the original feature set. Subsequently, the original feature set was used to train an RF as the original model. After the training process, the prediction error of the original model on the test set was recorded and the permutation importance (PI value of each feature was obtained. Then, an improved sequential backward search method was used to select the optimal forecasting feature subset based on the PI value of each feature. Finally, the optimal forecasting feature subset was used to train a new RF model as the final prediction model. Experiments showed that the prediction accuracy of RF trained by the optimal forecasting feature subset was higher than that of the original model and comparative models based on support vector regression and artificial neural network.
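The permutation-importance (PI) step generalizes beyond random forests: fit any model, then measure how much test error grows when a single feature column is shuffled. In this sketch a plain least-squares linear model stands in for the RF to avoid extra dependencies; the data and feature roles are invented for illustration.

```python
import numpy as np

def permutation_importance(predict, X, y, rng, n_repeats=5):
    """PI of feature j = mean increase in MSE after shuffling column j."""
    base = np.mean((predict(X) - y) ** 2)
    imps = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        for _ in range(n_repeats):
            Xp = X.copy()
            Xp[:, j] = Xp[rng.permutation(len(Xp)), j]  # break j's link to y
            imps[j] += np.mean((predict(Xp) - y) ** 2) - base
    return imps / n_repeats

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.1, size=500)  # x2 is noise

w, *_ = np.linalg.lstsq(X, y, rcond=None)
imps = permutation_importance(lambda M: M @ w, X, y, rng)
# The pure-noise feature scores near zero and would be the first candidate
# dropped by the sequential backward search described in the abstract.
```

The backward search then repeatedly removes the lowest-PI feature and refits, keeping the subset with the best held-out error.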

  10. Towards a pro-health food-selection model for gatekeepers in ...

    African Journals Online (AJOL)

    The purpose of this study was to develop a pro-health food selection model for gatekeepers of Bulawayo high-density suburbs in Zimbabwe. Gatekeepers in five suburbs constituted the study population from which a sample of 250 subjects was randomly selected. Of the total respondents (N= 182), 167 had their own ...

  11. Direct random insertion mutagenesis of Helicobacter pylori

    NARCIS (Netherlands)

    de Jonge, Ramon; Bakker, Dennis; van Vliet, Arnoud H. M.; Kuipers, Ernst J.; Vandenbroucke-Grauls, Christina M. J. E.; Kusters, Johannes G.

    2003-01-01

    Random insertion mutagenesis is a widely used technique for the identification of bacterial virulence genes. Most strategies for random mutagenesis involve cloning in Escherichia coli for passage of plasmids or for phenotypic selection. This can result in biased selection due to restriction or

  12. Direct random insertion mutagenesis of Helicobacter pylori.

    NARCIS (Netherlands)

    Jonge, de R.; Bakker, D.; Vliet, van AH; Kuipers, E.J.; Vandenbroucke-Grauls, C.M.J.E.; Kusters, J.G.

    2003-01-01

    Random insertion mutagenesis is a widely used technique for the identification of bacterial virulence genes. Most strategies for random mutagenesis involve cloning in Escherichia coli for passage of plasmids or for phenotypic selection. This can result in biased selection due to restriction or

  13. Summarized Costs, Placement Of Quality Stars, And Other Online Displays Can Help Consumers Select High-Value Health Plans.

    Science.gov (United States)

    Greene, Jessica; Hibbard, Judith H; Sacks, Rebecca M

    2016-04-01

    Starting in 2017, all state and federal health insurance exchanges will present quality data on health plans in addition to cost information. We analyzed variations in the current design of information on state exchanges to identify presentation approaches that encourage consumers to take quality as well as cost into account when selecting a health plan. Using an online sample of 1,025 adults, we randomly assigned participants to view the same comparative information on health plans, displayed in different ways. We found that consumers were much more likely to select a high-value plan when cost information was summarized instead of detailed, when quality stars were displayed adjacent to cost information, when consumers understood that quality stars signified the quality of medical care, and when high-value plans were highlighted with a check mark or blue ribbon. These approaches, which were equally effective for participants with higher and lower numeracy, can inform the development of future displays of plan information in the exchanges. Project HOPE—The People-to-People Health Foundation, Inc.

  14. Discrete least squares polynomial approximation with random evaluations - application to PDEs with Random parameters

    KAUST Repository

    Nobile, Fabio

    2015-01-07

    We consider a general problem F(u, y) = 0 where u is the unknown solution, possibly Hilbert space valued, and y a set of uncertain parameters. We specifically address the situation in which the parameter-to-solution map u(y) is smooth, however y could be very high (or even infinite) dimensional. In particular, we are interested in cases in which F is a differential operator, u a Hilbert space valued function, and y a distributed, space and/or time varying, random field. We aim at reconstructing the parameter-to-solution map u(y) from random noise-free or noisy observations in random points by discrete least squares on polynomial spaces. The noise-free case is relevant whenever the technique is used to construct metamodels, based on polynomial expansions, for the output of computer experiments. In the case of PDEs with random parameters, the metamodel is then used to approximate statistics of the output quantity. We discuss the stability of discrete least squares on random points and show convergence estimates both in expectation and in probability. We also present possible strategies to select, either a priori or by adaptive algorithms, sequences of approximating polynomial spaces that allow one to reduce, and in some cases break, the curse of dimensionality.
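In one dimension the reconstruction step reduces to a few lines: draw random evaluation points, evaluate the smooth map, and fit a polynomial by discrete least squares with an oversampled design. The function, degree, and sample count below are placeholders standing in for the parameter-to-solution map u(y).

```python
import numpy as np

rng = np.random.default_rng(42)

f = np.exp                 # stand-in for the smooth map y -> u(y)
m, degree = 200, 5         # oversampling m >> degree + 1 keeps the fit stable
y_pts = rng.uniform(-1.0, 1.0, size=m)   # random (noise-free) observations
u_pts = f(y_pts)

# Discrete least squares on the polynomial space of dimension degree + 1.
coeffs = np.polyfit(y_pts, u_pts, degree)

# The resulting polynomial metamodel approximates f on the whole range.
grid = np.linspace(-1.0, 1.0, 1001)
max_err = np.max(np.abs(np.polyval(coeffs, grid) - f(grid)))
```

In higher dimensions the same recipe is applied on tensor-product or total-degree polynomial spaces; choosing that sequence of spaces well, a priori or adaptively, is exactly what the abstract's strategies address.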

  15. Strategyproof Peer Selection using Randomization, Partitioning, and Apportionment

    OpenAIRE

    Aziz, Haris; Lev, Omer; Mattei, Nicholas; Rosenschein, Jeffrey S.; Walsh, Toby

    2016-01-01

    Peer review, evaluation, and selection is a fundamental aspect of modern science. Funding bodies the world over employ experts to review and select the best proposals of those submitted for funding. The problem of peer selection, however, is much more general: a professional society may want to give a subset of its members awards based on the opinions of all members; an instructor for a MOOC or online course may want to crowdsource grading; or a marketing company may select ideas from group b...

  16. Predictive distractor context facilitates attentional selection of high, but not intermediate and low, salience targets.

    Science.gov (United States)

    Töllner, Thomas; Conci, Markus; Müller, Hermann J

    2015-03-01

    It is well established that we can focally attend to a specific region in visual space without shifting our eyes, so as to extract action-relevant sensory information from covertly attended locations. The underlying mechanisms that determine how fast we engage our attentional spotlight in visual-search scenarios, however, remain controversial. One dominant view advocated by perceptual decision-making models holds that the times taken for focal-attentional selection are mediated by an internal template that biases perceptual coding and selection decisions exclusively through target-defining feature coding. This notion directly predicts that search times remain unaffected whether or not participants can anticipate the upcoming distractor context. Here we tested this hypothesis by employing an illusory-figure localization task that required participants to search for an invariant target amongst a variable distractor context, which gradually changed--either randomly or predictably--as a function of distractor-target similarity. We observed a graded decrease in internal focal-attentional selection times--correlated with external behavioral latencies--for distractor contexts of higher relative to lower similarity to the target. Critically, for low but not intermediate and high distractor-target similarity, these context-driven effects were cortically and behaviorally amplified when participants could reliably predict the type of distractors. This interactive pattern demonstrates that search guidance signals can integrate information about distractor, in addition to target, identities to optimize distractor-target competition for focal-attentional selection. © 2014 Wiley Periodicals, Inc.

  17. DNABP: Identification of DNA-Binding Proteins Based on Feature Selection Using a Random Forest and Predicting Binding Residues.

    Science.gov (United States)

    Ma, Xin; Guo, Jing; Sun, Xiao

    2016-01-01

    DNA-binding proteins are fundamentally important in cellular processes. Several computational-based methods have been developed to improve the prediction of DNA-binding proteins in previous years. However, insufficient work has been done on the prediction of DNA-binding proteins from protein sequence information. In this paper, a novel predictor, DNABP (DNA-binding proteins), was designed to predict DNA-binding proteins using the random forest (RF) classifier with a hybrid feature. The hybrid feature contains two types of novel sequence features, which reflect information about the conservation of physicochemical properties of the amino acids, and the binding propensity of DNA-binding residues and non-binding propensities of non-binding residues. The comparisons with each feature demonstrated that these two novel features contributed most to the improvement in predictive ability. Furthermore, to improve the prediction performance of the DNABP model, feature selection using the minimum redundancy maximum relevance (mRMR) method combined with incremental feature selection (IFS) was carried out during the model construction. The results showed that the DNABP model could achieve 86.90% accuracy, 83.76% sensitivity, 90.03% specificity and a Matthews correlation coefficient of 0.727. High prediction accuracy and performance comparisons with previous research suggested that DNABP could be a useful approach to identify DNA-binding proteins from sequence information. The DNABP web server system is freely available at http://www.cbi.seu.edu.cn/DNABP/.

  18. Effects of prey abundance, distribution, visual contrast and morphology on selection by a pelagic piscivore

    Science.gov (United States)

    Hansen, Adam G.; Beauchamp, David A.

    2014-01-01

    Most predators eat only a subset of possible prey. However, studies evaluating diet selection rarely measure prey availability in a manner that accounts for temporal–spatial overlap with predators, the sensory mechanisms employed to detect prey, and constraints on prey capture. We evaluated the diet selection of cutthroat trout (Oncorhynchus clarkii) feeding on a diverse planktivore assemblage in Lake Washington to test the hypothesis that the diet selection of piscivores would reflect random (opportunistic) as opposed to non-random (targeted) feeding, after accounting for predator–prey overlap, visual detection and capture constraints. Diets of cutthroat trout were sampled in autumn 2005, when the abundance of transparent, age-0 longfin smelt (Spirinchus thaleichthys) was low, and 2006, when the abundance of smelt was nearly seven times higher. Diet selection was evaluated separately using depth-integrated and depth-specific (accounting for predator–prey overlap) prey abundance. The abundance of different prey was then adjusted for differences in detectability and vulnerability to predation to see whether these factors could explain diet selection. In 2005, cutthroat trout fed non-randomly by selecting against the smaller, transparent age-0 longfin smelt, but for the larger age-1 longfin smelt. After adjusting prey abundance for visual detection and capture, cutthroat trout fed randomly. In 2006, depth-integrated and depth-specific abundance explained the diets of cutthroat trout well, indicating random feeding. Feeding became non-random after adjusting for visual detection and capture. Cutthroat trout selected strongly for age-0 longfin smelt, but against similar-sized threespine stickleback (Gasterosteus aculeatus) and larger age-1 longfin smelt in 2006. Overlap with juvenile sockeye salmon (O. nerka) was minimal in both years, and sockeye salmon were rare in the diets of cutthroat trout. The direction of the shift between random and non-random selection…

  19. 47 CFR 1.1604 - Post-selection hearings.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Post-selection hearings. 1.1604 Section 1.1604 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1604 Post-selection hearings. (a) Following the random...

  20. Statistical properties of random clique networks

    Science.gov (United States)

    Ding, Yi-Min; Meng, Jun; Fan, Jing-Fang; Ye, Fang-Fu; Chen, Xiao-Song

    2017-10-01

    In this paper, a random clique network model to mimic the large clustering coefficient and the modular structure that exist in many real complex networks, such as social networks, artificial networks, and protein interaction networks, is introduced by combining the random selection rule of the Erdős and Rényi (ER) model and the concept of cliques. We find that random clique networks having a small average degree differ from the ER network in that they have a large clustering coefficient and a power law clustering spectrum, while networks having a high average degree have similar properties as the ER model. In addition, we find that the relation between the clustering coefficient and the average degree shows a non-monotonic behavior and that the degree distributions can be fit by multiple Poisson curves; we explain the origin of such novel behaviors and degree distributions.
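A minimal generator for such networks follows the construction described above: repeatedly draw a small node set uniformly at random (the ER-style selection rule) and connect it into a clique. The network size, clique size, and clique count below are arbitrary illustrative choices.

```python
import random

def random_clique_network(n, n_cliques, clique_size, rng):
    """Adjacency sets for a graph built by overlaying random cliques."""
    adj = {v: set() for v in range(n)}
    for _ in range(n_cliques):
        members = rng.sample(range(n), clique_size)
        for i in members:
            for j in members:
                if i != j:
                    adj[i].add(j)
    return adj

def avg_clustering(adj):
    """Mean local clustering coefficient over nodes with degree >= 2."""
    total = count = 0
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue
        links = sum(1 for a in nbrs for b in nbrs if a < b and b in adj[a])
        total += 2 * links / (k * (k - 1))
        count += 1
    return total / count

rng = random.Random(3)
net = random_clique_network(200, 150, 4, rng)
cc = avg_clustering(net)   # large, despite a small average degree
```

An ER graph with the same average degree (here roughly 9) would have clustering near ⟨k⟩/n ≈ 0.045, so the clique overlay reproduces the large clustering coefficient the paper reports for networks with small average degree.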

  1. Random genetic drift, natural selection, and noise in human cranial evolution.

    Science.gov (United States)

    Roseman, Charles C

    2016-08-01

    This study assesses the extent to which relationships among groups complicate comparative studies of adaptation in recent human cranial variation and the extent to which departures from neutral additive models of evolution hinder the reconstruction of population relationships among groups using cranial morphology. Using a maximum likelihood evolutionary model fitting approach and a mixed population genomic and cranial data set, I evaluate the relative fits of several widely used models of human cranial evolution. Moreover, I compare the goodness of fit of models of cranial evolution constrained by genomic variation to test hypotheses about population specific departures from neutrality. Models from population genomics are much better fits to cranial variation than are traditional models from comparative human biology. There is not enough evolutionary information in the cranium to reconstruct much of recent human evolution but the influence of population history on cranial variation is strong enough to cause comparative studies of adaptation serious difficulties. Deviations from a model of random genetic drift along a tree-like population history show the importance of environmental effects, gene flow, and/or natural selection on human cranial variation. Moreover, there is a strong signal of the effect of natural selection or an environmental factor on a group of humans from Siberia. The evolution of the human cranium is complex and no one evolutionary process has prevailed at the expense of all others. A holistic unification of phenome, genome, and environmental context, gives us a strong point of purchase on these problems, which is unavailable to any one traditional approach alone. Am J Phys Anthropol 160:582-592, 2016. © 2016 Wiley Periodicals, Inc. © 2016 Wiley Periodicals, Inc.

  2. Polyatomic Trilobite Rydberg Molecules in a Dense Random Gas.

    Science.gov (United States)

    Luukko, Perttu J J; Rost, Jan-Michael

    2017-11-17

    Trilobites are exotic giant dimers with enormous dipole moments. They consist of a Rydberg atom and a distant ground-state atom bound together by short-range electron-neutral attraction. We show that highly polar, polyatomic trilobite states unexpectedly persist and thrive in a dense ultracold gas of randomly positioned atoms. This is caused by perturbation-induced quantum scarring and the localization of electron density on randomly occurring atom clusters. At certain densities these states also mix with an s state, overcoming selection rules that hinder the photoassociation of ordinary trilobites.

  3. Fabrication of high efficacy selective solar absorbers

    CSIR Research Space (South Africa)

    Tile, N

    2012-03-01

    Full Text Available High efficiency tandem selective solar absorber materials of carbon in nickel oxide (C-NiO) composite were fabricated on an aluminium substrate using a simple and cost effective sol-gel process. The process involved preparation of carbon and nickel...

  4. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology.

    Science.gov (United States)

    Fox, Eric W; Hill, Ryan A; Leibowitz, Scott G; Olsen, Anthony R; Thornbrugh, Darren J; Weber, Marc H

    2017-07-01

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological data sets, there is limited guidance on variable selection methods for RF modeling. Typically, either a preselected set of predictor variables are used or stepwise procedures are employed which iteratively remove variables according to their importance measures. This paper investigates the application of variable selection methods to RF models for predicting probable biological stream condition. Our motivating data set consists of the good/poor condition of n = 1365 stream survey sites from the 2008/2009 National Rivers and Stream Assessment, and a large set (p = 212) of landscape features from the StreamCat data set as potential predictors. We compare two types of RF models: a full variable set model with all 212 predictors and a reduced variable set model selected using a backward elimination approach. We assess model accuracy using RF's internal out-of-bag estimate, and a cross-validation procedure with validation folds external to the variable selection process. We also assess the stability of the spatial predictions generated by the RF models to changes in the number of predictors and argue that model selection needs to consider both accuracy and stability. The results suggest that RF modeling is robust to the inclusion of many variables of moderate to low importance. We found no substantial improvement in cross-validated accuracy as a result of variable reduction. Moreover, the backward elimination procedure tended to select too few variables and exhibited numerous issues such as upwardly biased out-of-bag accuracy estimates and instabilities in the spatial predictions. We use simulations to further support and generalize results from the analysis of real data. A main purpose of this work is to elucidate issues of model selection bias and instability to ecologists interested in…
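The abstract's insistence on validation folds external to the variable-selection process can be demonstrated with a toy simulation: on pure-noise features, picking the "best" features on the full data before cross-validating yields optimistically biased accuracy. A nearest-centroid classifier stands in for the random forest here; all dimensions and seeds are illustrative choices.

```python
import numpy as np

def top_k_by_corr(X, y, k):
    """Indices of the k features most associated with labels (y in -1/+1)."""
    score = np.abs((X * y[:, None]).mean(axis=0))
    return np.argsort(score)[-k:]

def centroid_cv_accuracy(X, y, k, folds=5, internal_selection=True):
    """5-fold CV accuracy of a nearest-centroid classifier on k features.
    internal_selection=False reuses features chosen on ALL data (leaks!)."""
    n = len(y)
    idx = np.arange(n)
    correct = 0
    for f in range(folds):
        test = idx[f::folds]
        train = np.setdiff1d(idx, test)
        sel = (top_k_by_corr(X[train], y[train], k) if internal_selection
               else top_k_by_corr(X, y, k))
        mu_pos = X[train][y[train] == 1][:, sel].mean(axis=0)
        mu_neg = X[train][y[train] == -1][:, sel].mean(axis=0)
        d_pos = ((X[test][:, sel] - mu_pos) ** 2).sum(axis=1)
        d_neg = ((X[test][:, sel] - mu_neg) ** 2).sum(axis=1)
        pred = np.where(d_pos < d_neg, 1, -1)
        correct += int((pred == y[test]).sum())
    return correct / n

rng = np.random.default_rng(5)
X = rng.normal(size=(40, 500))        # pure noise: no real class signal
y = np.array([1, -1] * 20)
biased = centroid_cv_accuracy(X, y, k=10, internal_selection=False)
honest = centroid_cv_accuracy(X, y, k=10, internal_selection=True)
# biased lands well above chance; honest hovers near 0.5
```

The same leakage mechanism inflates out-of-bag estimates when backward elimination is run on the full data, which is the bias the abstract warns about.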

  5. Altruism Can Proliferate through Population Viscosity despite High Random Gene Flow

    Science.gov (United States)

    Schonmann, Roberto H.; Vicente, Renato; Caticha, Nestor

    2013-01-01

    The ways in which natural selection can allow the proliferation of cooperative behavior have long been seen as a central problem in evolutionary biology. Most of the literature has focused on interactions between pairs of individuals and on linear public goods games. This emphasis has led to the conclusion that even modest levels of migration would pose a serious problem to the spread of altruism through population viscosity in group structured populations. Here we challenge this conclusion, by analyzing evolution in a framework which allows for complex group interactions and random migration among groups. We conclude that contingent forms of strong altruism that benefits equally all group members, regardless of kinship and without greenbeard effects, can spread when rare under realistic group sizes and levels of migration, due to the assortment of genes resulting only from population viscosity. Our analysis combines group-centric and gene-centric perspectives, allows for arbitrary strength of selection, and leads to extensions of Hamilton’s rule for the spread of altruistic alleles, applicable under broad conditions. PMID:23991035

  6. Altruism can proliferate through population viscosity despite high random gene flow.

    Directory of Open Access Journals (Sweden)

    Roberto H Schonmann

    Full Text Available The ways in which natural selection can allow the proliferation of cooperative behavior have long been seen as a central problem in evolutionary biology. Most of the literature has focused on interactions between pairs of individuals and on linear public goods games. This emphasis has led to the conclusion that even modest levels of migration would pose a serious problem to the spread of altruism through population viscosity in group structured populations. Here we challenge this conclusion, by analyzing evolution in a framework which allows for complex group interactions and random migration among groups. We conclude that contingent forms of strong altruism that benefits equally all group members, regardless of kinship and without greenbeard effects, can spread when rare under realistic group sizes and levels of migration, due to the assortment of genes resulting only from population viscosity. Our analysis combines group-centric and gene-centric perspectives, allows for arbitrary strength of selection, and leads to extensions of Hamilton's rule for the spread of altruistic alleles, applicable under broad conditions.

  7. High spatial resolution and high contrast visualization of brain arteries and veins. Impact of blood pool contrast agent and water-selective excitation imaging at 3T

    International Nuclear Information System (INIS)

    Spuentrup, E.; Jacobs, J.E.; Kleimann, J.F.

    2010-01-01

    Purpose: To investigate a blood pool contrast agent and water-selective excitation imaging at 3 T for high spatial resolution and high contrast imaging of brain vessels including the veins. Methods and Results: 48 clinical patients (47 ± 18 years old) were included. Based on clinical findings, twenty-four patients received a single dose of the standard extracellular agent Gadoterate meglumine (Dotarem®) and 24 received the blood pool contrast agent Gadofosveset (Vasovist®). After finishing routine MR protocols, all patients were investigated with two high spatial resolution (0.15 mm³ voxel size) gradient-echo sequences in random order in the equilibrium phase (steady state), as approved by the review board: a standard RF-spoiled gradient-echo sequence (HR-SS, TR/TE 5.1/2.3 msec, FA 30°) and a fat-suppressed gradient-echo sequence with water-selective excitation (HR-FS, 1-3-3-1 binomial pulse, TR/TE 8.8/3.8 msec, FA 30°). The images were subjectively assessed (image quality with vessel contrast, artifacts, depiction of lesions) by two investigators, and contrast-to-noise ratios (CNR) were compared using Student's t-test. Image quality and CNR in the HR-FS were significantly superior to the HR-SS for both contrast agents (p < 0.05). The CNR was also improved when using the blood pool agent, but only to a minor extent, while subjective image quality was similar for both contrast agents. Conclusion: The sequence with water-selective excitation improved image quality and CNR in high spatial resolution imaging of brain arteries and veins. The blood pool contrast agent improved the CNR only to a minor extent over the extracellular contrast agent. (orig.)

  8. Effect of a Selected Physical Exercise on the Development of Displacement Movement Skills in Highly Functional Autistic Children

    Directory of Open Access Journals (Sweden)

    Fatemeh Keyhani

    2014-09-01

    Full Text Available Background: The study examines the effect of selected physical exercises on the development of displacement skills in high-functioning autistic (HFA) children. Materials and Methods: Ten children (7.9 ± 1.4 years) were randomly selected, based on their pre-test scores, from among 33 children with HFA in Sahr-e-Kord city (in Iran). The measuring tool was the Test of Gross Motor Development-2000 (TGMD-2). The selected motor program (the SPARK motor program) includes motor-strengthening activities, games, and sports for children, and was performed by the subjects for 12 sessions. Normal distribution of the data was checked with the K-S test, and Levene's test and ANOVA (dependent and independent types) were used to compare mean values (α = 0.05). Results: Twelve sessions of selected physical exercise training produced significant differences in some research variables in the experimental group, but not in the control group. There were significant differences in running (p=0.002), trotting (p=0.08), jumping (p=0.002) and gliding (p=0.004), and non-significant differences in hopping (p=0.035) and leaping (p=0.02). Conclusion: According to the results of this research, we suggest that selected physical exercise programs derived from the SPARK motor program can improve displacement motor skills in children with HFA.

  9. POD NUMBER AND PHOTOSYNTHESIS AS PHYSIOLOGICAL SELECTION CRITERIA IN SOYBEAN (Glycine max L. Merrill BREEDING FOR HIGH YIELD

    Directory of Open Access Journals (Sweden)

    S.M. Sitompul

    2015-02-01

    Field studies were conducted over two years using 638 F2 and 1185 F3 lines derived from 16 selected F1 and 15 F2 parent lines (≥80 pods plant⁻¹) to evaluate pod number and CO2 exchange rate (CER) as selection criteria. Pod and seed number and seed weight of individual lines were recorded at harvest, and the CER of 32 randomly selected F2 and 30 F3 lines was measured at the initial seed-filling stage. Selection of F2 lines based on pod number to generate F3 lines increased average seed yield by 39% and pod number by 77% in F3 lines compared with F2 lines. A close relationship was found between seed weight and pod or seed number per plant. Net CER responded sensitively to a short-term reduction of light; 78% of F2 lines and all F3 lines showed a maximum CER (Pmax) ≥ 20 μmol CO2 m⁻² s⁻¹. The ratio of pod number per plant to Pmax varied among lines and was used to group lines, resulting in close relationships between Pmax and pod number. It is concluded that the use of pod number and CER (Pmax) as selection criteria offers an alternative approach in soybean breeding for high yield.

  10. (90)Y -PET imaging: Exploring limitations and accuracy under conditions of low counts and high random fraction.

    Science.gov (United States)

    Carlier, Thomas; Willowson, Kathy P; Fourkal, Eugene; Bailey, Dale L; Doss, Mohan; Conti, Maurizio

    2015-07-01

    … Point spread function (PSF) correction and TOF reconstruction in general reduce background variability and noise and increase recovered concentration. Results for patient data indicated a good correlation between the expected and PET-reconstructed activities. A linear relationship between the expected and measured activities in the organ of interest was observed for all reconstruction methods used: a linearity coefficient of 0.89 ± 0.05 for the Biograph mCT and 0.81 ± 0.05 for the Biograph TruePoint. Due to the low counts and high random fraction, accurate image quantification of (90)Y during selective internal radionuclide therapy is affected by random coincidence estimation, scatter correction, and any positivity constraint of the algorithm. Nevertheless, phantom and patient studies showed that the impact of the number of true and random coincidences on quantitative results was limited as long as ordinary Poisson ordered-subsets expectation maximization reconstruction algorithms with random smoothing were used. Adding PSF correction and TOF information to the reconstruction greatly improves image quality in terms of bias, variability, noise reduction, and detectability. In the patient studies, the total activity in the field of view was in general accurately measured by the Biograph mCT and slightly overestimated by the Biograph TruePoint.

  11. 90Y -PET imaging: Exploring limitations and accuracy under conditions of low counts and high random fraction

    International Nuclear Information System (INIS)

    Carlier, Thomas; Willowson, Kathy P.; Fourkal, Eugene; Bailey, Dale L.; Doss, Mohan; Conti, Maurizio

    2015-01-01

    … iterative algorithms. Point spread function (PSF) correction and TOF reconstruction in general reduce background variability and noise and increase recovered concentration. Results for patient data indicated a good correlation between the expected and PET-reconstructed activities. A linear relationship between the expected and measured activities in the organ of interest was observed for all reconstruction methods used: a linearity coefficient of 0.89 ± 0.05 for the Biograph mCT and 0.81 ± 0.05 for the Biograph TruePoint. Conclusions: Due to the low counts and high random fraction, accurate image quantification of 90Y during selective internal radionuclide therapy is affected by random coincidence estimation, scatter correction, and any positivity constraint of the algorithm. Nevertheless, phantom and patient studies showed that the impact of the number of true and random coincidences on quantitative results was limited as long as ordinary Poisson ordered-subsets expectation maximization reconstruction algorithms with random smoothing were used. Adding PSF correction and TOF information to the reconstruction greatly improves image quality in terms of bias, variability, noise reduction, and detectability. In the patient studies, the total activity in the field of view was in general accurately measured by the Biograph mCT and slightly overestimated by the Biograph TruePoint.

  12. Early prevention of antisocial personality: long-term follow-up of two randomized controlled trials comparing indicated and selective approaches.

    Science.gov (United States)

    Scott, Stephen; Briskman, Jackie; O'Connor, Thomas G

    2014-06-01

    Antisocial personality is a common adult problem that imposes a major public health burden, but for which there is no effective treatment. Affected individuals exhibit persistent antisocial behavior and pervasive antisocial character traits, such as irritability, manipulativeness, and lack of remorse. Prevention of antisocial personality in childhood has been advocated, but evidence for effective interventions is lacking. The authors conducted two follow-up studies of randomized trials of group parent training. One involved 120 clinic-referred 3- to 7-year-olds with severe antisocial behavior for whom treatment was indicated, 93 of whom were reassessed between ages 10 and 17. The other involved 109 high-risk 4- to 6-year-olds with elevated antisocial behavior who were selectively screened from the community, 90 of whom were reassessed between ages 9 and 13. The primary psychiatric outcome measures were the two elements of antisocial personality, namely, antisocial behavior (assessed by a diagnostic interview) and antisocial character traits (assessed by a questionnaire). Also assessed were reading achievement (an important domain of youth functioning at work) and parent-adolescent relationship quality. In the indicated sample, both elements of antisocial personality were improved in the early intervention group at long-term follow-up compared with the control group (antisocial behavior: odds ratio of oppositional defiant disorder=0.20, 95% CI=0.06, 0.69; antisocial character traits: B=-4.41, 95% CI=-1.12, -8.64). Additionally, reading ability improved (B=9.18, 95% CI=0.58, 18.0). Parental expressed emotion was warmer (B=0.86, 95% CI=0.20, 1.41) and supervision was closer (B=-0.43, 95% CI=-0.11, -0.75), but direct observation of parenting showed no differences. Teacher-rated and self-rated antisocial behavior were unchanged. In contrast, in the selective high-risk sample, early intervention was not associated with improved long-term outcomes. 
Early intervention with …

  13. High selection pressure promotes increase in cumulative adaptive culture.

    Directory of Open Access Journals (Sweden)

    Carolin Vegvari

    The evolution of cumulative adaptive culture has received widespread interest in recent years, especially the factors promoting its occurrence. Current evolutionary models suggest that an increase in population size may lead to an increase in cultural complexity via a higher rate of cultural transmission and innovation. However, relatively little attention has been paid to the role of natural selection in the evolution of cultural complexity. Here we use an agent-based simulation model to demonstrate that high selection pressure in the form of resource pressure promotes the accumulation of adaptive culture in spite of small population sizes and high innovation costs. We argue that the interaction of demography and selection is important, and that neither can be considered in isolation. We predict that an increase in cultural complexity is most likely to occur under conditions of population pressure relative to resource availability. Our model may help to explain why culture change can occur without major environmental change. We suggest that understanding the interaction between shifting selective pressures and demography is essential for explaining the evolution of cultural complexity.
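The record describes an agent-based model in which resource pressure drives the accumulation of adaptive culture. The simulation below is a toy sketch of that idea, not the authors' model: agents carry trait counts, innovation carries a cost, and scarcity (resources below population size) imposes truncation selection. All parameters are invented for illustration:

```python
import random

random.seed(1)

def simulate(pop_size, resources, innovation_cost=0.1, generations=200):
    """Toy agent-based sketch: each agent holds a count of cultural traits.
    Resource pressure (resources < pop_size) imposes truncation selection:
    only agents with the largest repertoires survive scarcity."""
    traits = [1 for _ in range(pop_size)]  # trait count per agent
    for _ in range(generations):
        # Innovation: adding a trait succeeds with a cost-discounted probability.
        traits = [t + (1 if random.random() < 0.5 * (1 - innovation_cost) else 0)
                  for t in traits]
        # Selection: under pressure, keep the agents with the most traits.
        survivors = min(len(traits), resources)
        traits = sorted(traits, reverse=True)[:survivors]
        # Reproduction back to pop_size; offspring copy a surviving parent.
        traits = [random.choice(traits) for _ in range(pop_size)]
    return sum(traits) / len(traits)  # mean cumulative complexity

low_pressure = simulate(pop_size=50, resources=50)   # no scarcity
high_pressure = simulate(pop_size=50, resources=25)  # strong resource pressure
print(low_pressure, high_pressure)
```

Under these toy settings the selected population accumulates traits faster than the drifting one, mirroring the qualitative claim of the paper.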

  14. Selective high-affinity polydentate ligands and methods of making such

    Energy Technology Data Exchange (ETDEWEB)

    Denardo, Sally J.; Denardo, Gerald L.; Balhorn, Rodney L.

    2018-02-06

    This invention provides novel polydentate selective high affinity ligands (SHALs) that can be used in a variety of applications in a manner analogous to the use of antibodies. SHALs typically comprise a multiplicity of ligands that each bind different regions on the target molecule. The ligands are joined directly or through a linker, thereby forming a polydentate moiety that typically binds the target molecule with high selectivity and avidity.

  15. Blind Measurement Selection: A Random Matrix Theory Approach

    KAUST Repository

    Elkhalil, Khalil; Kammoun, Abla; Al-Naffouri, Tareq Y.; Alouini, Mohamed-Slim

    2016-01-01

    … -aware fashions. We present two potential applications where the proposed algorithms can be used, namely antenna selection for uplink transmissions in large-scale multi-user systems and sensor selection for wireless sensor networks. Numerical results are also …

  16. Combining rational and random strategies in β-glucosidase Zm-p60.1 protein library construction.

    Directory of Open Access Journals (Sweden)

    Dušan Turek

    Saturation mutagenesis is a cornerstone technique in protein engineering because of its utility (in conjunction with appropriate analytical techniques) for assessing effects of varying residues at selected positions on proteins' structures and functions. Site-directed mutagenesis with degenerate primers is the simplest and most rapid saturation mutagenesis technique. Thus, it is highly appropriate for assessing whether or not variation at certain sites is permissible, but not necessarily the most time- and cost-effective technique for detailed assessment of variations' effects. Thus, in the presented study we applied the technique to randomize position W373 in β-glucosidase Zm-p60.1, which is highly conserved among β-glucosidases. Unexpectedly, β-glucosidase activity screening of the generated variants showed that most variants were active, although they generally had significantly lower activity than the wild type enzyme. Further characterization of the library led us to conclude that a carefully selected combination of randomized codon-based saturation mutagenesis and site-directed mutagenesis may be most efficient, particularly when constructing and investigating randomized libraries with high fractions of positive hits.

  17. Combining rational and random strategies in β-glucosidase Zm-p60.1 protein library construction.

    Science.gov (United States)

    Turek, Dušan; Klimeš, Pavel; Mazura, Pavel; Brzobohatý, Břetislav

    2014-01-01

    Saturation mutagenesis is a cornerstone technique in protein engineering because of its utility (in conjunction with appropriate analytical techniques) for assessing effects of varying residues at selected positions on proteins' structures and functions. Site-directed mutagenesis with degenerate primers is the simplest and most rapid saturation mutagenesis technique. Thus, it is highly appropriate for assessing whether or not variation at certain sites is permissible, but not necessarily the most time- and cost-effective technique for detailed assessment of variations' effects. Thus, in the presented study we applied the technique to randomize position W373 in β-glucosidase Zm-p60.1, which is highly conserved among β-glucosidases. Unexpectedly, β-glucosidase activity screening of the generated variants showed that most variants were active, although they generally had significantly lower activity than the wild type enzyme. Further characterization of the library led us to conclude that a carefully selected combination of randomized codon-based saturation mutagenesis and site-directed mutagenesis may be most efficient, particularly when constructing and investigating randomized libraries with high fractions of positive hits.
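The two records above describe randomizing one codon position with degenerate primers. A common degenerate codon for such site-saturation libraries is NNK (N = A/C/G/T, K = G/T); the paper does not state which scheme was used, so the sketch below is only illustrative. It uses the standard genetic code to show that the 32 NNK codons cover all 20 amino acids plus a single stop:

```python
from itertools import product

# Standard genetic code (DNA codons -> one-letter amino acids, '*' = stop),
# listed in TCAG order with the first base varying slowest.
BASES = "TCAG"
AMINO = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON_TABLE = {a + b + c: aa
               for (a, b, c), aa in zip(product(BASES, BASES, BASES), AMINO)}

DEGENERATE = {"N": "ACGT", "K": "GT"}  # IUPAC codes used in an NNK primer

def expand(degenerate_codon):
    """All concrete codons encoded by a degenerate codon such as 'NNK'."""
    pools = [DEGENERATE.get(ch, ch) for ch in degenerate_codon]
    return ["".join(c) for c in product(*pools)]

nnk = expand("NNK")
translated = {CODON_TABLE[c] for c in nnk}
print(len(nnk), sorted(translated))  # 32 codons; 20 amino acids + '*'
```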

  18. Stock selection of high-dose-irradiation-resistant materials for filter press under high-dose irradiation operation

    International Nuclear Information System (INIS)

    Ishiyama, Shintaro; Minami, Mamoru; Hara, Kouji; Yamashita, Manabu

    2015-01-01

    In a volume reduction process for the decontamination of contaminated soil, performance degradation of the filter press is expected owing to material deterioration under high-dose irradiation. A selection among eleven stocks of candidate materials, including polymers, fibers, and rubbers, was conducted for the filter press to achieve high-performance volume reduction of contaminated soil, and the following results were derived: crude rubber and nylon were selected as prime candidates for the packing, diaphragm, and filter plate materials, and polyethylene was selected as the prime candidate for the filter cloth material. (author)

  19. Effects of choice architecture and chef-enhanced meals on the selection and consumption of healthier school foods: a randomized clinical trial.

    Science.gov (United States)

    Cohen, Juliana F W; Richardson, Scott A; Cluggish, Sarah A; Parker, Ellen; Catalano, Paul J; Rimm, Eric B

    2015-05-01

    Little is known about the long-term effect of a chef-enhanced menu on healthier food selection and consumption in school lunchrooms. In addition, it remains unclear if extended exposure to other strategies to promote healthier foods (eg, choice architecture) also improves food selection or consumption. To evaluate the short- and long-term effects of chef-enhanced meals and extended exposure to choice architecture on healthier school food selection and consumption. A school-based randomized clinical trial was conducted during the 2011-2012 school year among 14 elementary and middle schools in 2 urban, low-income school districts (intent-to-treat analysis). Included in the study were 2638 students in grades 3 through 8 attending participating schools (38.4% of eligible participants). Schools were first randomized to receive a professional chef to improve school meal palatability (chef schools) or to a delayed intervention (control group). To assess the effect of choice architecture (smart café), all schools after 3 months were then randomized to the smart café intervention or to the control group. School food selection was recorded, and consumption was measured using plate waste methods. After 3 months, vegetable selection increased in chef vs control schools (odds ratio [OR], 1.75; 95% CI, 1.36-2.24), but there was no effect on the selection of other components or on meal consumption. After long-term or extended exposure to the chef or smart café intervention, fruit selection increased in the chef (OR, 3.08; 95% CI, 2.23-4.25), smart café (OR, 1.45; 95% CI, 1.13-1.87), and chef plus smart café (OR, 3.10; 95% CI, 2.26-4.25) schools compared with the control schools, and consumption increased in the chef schools (OR, 0.17; 95% CI, 0.03-0.30 cups/d). Vegetable selection increased in the chef (OR, 2.54; 95% CI, 1.83-3.54), smart café (OR, 1.91; 95% CI, 1.46-2.50), and chef plus smart café schools (OR, 7.38, 95% CI, 5.26-10.35) compared with the control schools

  20. Modified random hinge transport mechanics and multiple scattering step-size selection in EGS5

    International Nuclear Information System (INIS)

    Wilderman, S.J.; Bielajew, A.F.

    2005-01-01

    The new transport mechanics in EGS5 allows for significantly longer electron transport step sizes and hence shorter computation times than required for identical problems in EGS4. But as with all Monte Carlo electron transport algorithms, certain classes of problems exhibit step-size dependencies even when operating within recommended ranges, sometimes making selection of step-sizes a daunting task for novice users. Further contributing to this problem, because of the decoupling of multiple scattering and continuous energy loss in the dual random hinge transport mechanics of EGS5, there are two independent step sizes in EGS5, one for multiple scattering and one for continuous energy loss, each of which influences speed and accuracy in a different manner. Further, whereas EGS4 used a single value of fractional energy loss (ESTEPE) to determine step sizes at all energies, to increase performance by decreasing the amount of effort expended simulating lower energy particles, EGS5 permits the fractional energy loss values which are used to determine both the multiple scattering and continuous energy loss step sizes to vary with energy. This results in requiring the user to specify four fractional energy loss values when optimizing computations for speed. Thus, in order to simplify step-size selection and to mitigate step-size dependencies, a method has been devised to automatically optimize step-size selection based on a single material dependent input related to the size of problem tally region. In this paper we discuss the new transport mechanics in EGS5 and describe the automatic step-size optimization algorithm. (author)

  1. High-temperature series expansions for random Potts models

    Directory of Open Access Journals (Sweden)

    M.Hellmund

    2005-01-01

    We discuss recently generated high-temperature series expansions for the free energy and the susceptibility of random-bond q-state Potts models on hypercubic lattices. Using the star-graph expansion technique, quenched disorder averages can be calculated exactly for arbitrary uncorrelated coupling distributions while keeping the disorder strength p as well as the dimension d as symbolic parameters. We present analyses of the new series for the susceptibility of the Ising (q=2) and 4-state Potts models in three dimensions up to order 19 and 18, respectively, and compare our findings with results from field-theoretical renormalization group studies and Monte Carlo simulations.

  2. A New Random Walk for Replica Detection in WSNs

    Science.gov (United States)

    Aalsalem, Mohammed Y.; Saad, N. M.; Hossain, Md. Shohrab; Atiquzzaman, Mohammed; Khan, Muhammad Khurram

    2016-01-01

    Wireless Sensor Networks (WSNs) are vulnerable to node replication (clone) attacks. Among existing clone detection protocols in WSNs, RAWL shows the most promising results by employing a Simple Random Walk (SRW). More recently, RAND outperformed RAWL by incorporating network division with SRW. Both RAND and RAWL use SRW for random selection of witness nodes, which is problematic because the walk frequently revisits previously passed nodes, leading to longer delays, higher energy expenditure, and a lower probability that witness nodes intersect. To circumvent this problem, we propose a new kind of constrained random walk, the Single Stage Memory Random Walk, and present a distributed technique called SSRWND (Single Stage Memory Random Walk with Network Division). In SSRWND, the single stage memory random walk is combined with network division, aiming to decrease communication and memory costs while keeping the detection probability high. Intensive simulations verify that SSRWND guarantees higher witness-node security with moderate communication and memory overheads. SSRWND is expedient for security-oriented application fields of WSNs such as military and medical. PMID:27409082

  3. A New Random Walk for Replica Detection in WSNs.

    Science.gov (United States)

    Aalsalem, Mohammed Y; Khan, Wazir Zada; Saad, N M; Hossain, Md Shohrab; Atiquzzaman, Mohammed; Khan, Muhammad Khurram

    2016-01-01

    Wireless Sensor Networks (WSNs) are vulnerable to node replication (clone) attacks. Among existing clone detection protocols in WSNs, RAWL shows the most promising results by employing a Simple Random Walk (SRW). More recently, RAND outperformed RAWL by incorporating network division with SRW. Both RAND and RAWL use SRW for random selection of witness nodes, which is problematic because the walk frequently revisits previously passed nodes, leading to longer delays, higher energy expenditure, and a lower probability that witness nodes intersect. To circumvent this problem, we propose a new kind of constrained random walk, the Single Stage Memory Random Walk, and present a distributed technique called SSRWND (Single Stage Memory Random Walk with Network Division). In SSRWND, the single stage memory random walk is combined with network division, aiming to decrease communication and memory costs while keeping the detection probability high. Intensive simulations verify that SSRWND guarantees higher witness-node security with moderate communication and memory overheads. SSRWND is expedient for security-oriented application fields of WSNs such as military and medical.
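The single stage memory random walk described in these records keeps one step of memory: it never immediately backtracks to the node it just came from. A minimal illustrative sketch of that walk on a generic graph (not the SSRWND protocol code; the grid graph is an arbitrary example):

```python
import random

random.seed(7)

def single_stage_memory_walk(adj, start, steps):
    """Random walk with one step of memory: never immediately backtracks
    to the node it just came from (unless it is the only neighbor)."""
    path = [start]
    prev = None
    for _ in range(steps):
        here = path[-1]
        choices = [n for n in adj[here] if n != prev] or adj[here]
        prev = here
        path.append(random.choice(choices))
    return path

# 4x4 grid graph as adjacency lists.
adj = {(x, y): [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if 0 <= x + dx < 4 and 0 <= y + dy < 4]
       for x in range(4) for y in range(4)}

path = single_stage_memory_walk(adj, (0, 0), 200)
```

Compared with a simple random walk, excluding the previous node pushes the walk outward, which is the mechanism the paper credits for better witness-node dispersion.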

  4. Randomized clinical trials in dentistry: Risks of bias, risks of random errors, reporting quality, and methodologic quality over the years 1955-2013.

    Directory of Open Access Journals (Sweden)

    Humam Saltaji

    To examine the risks of bias, risks of random errors, reporting quality, and methodological quality of randomized clinical trials of oral health interventions and the development of these aspects over time. We included 540 randomized clinical trials from 64 selected systematic reviews. We extracted, in duplicate, details from each of the selected randomized clinical trials with respect to publication and trial characteristics, reporting and methodologic characteristics, and Cochrane risk of bias domains. We analyzed data using logistic regression and chi-square statistics. Sequence generation was assessed to be inadequate (at unclear or high risk of bias) in 68% (n = 367) of the trials, while allocation concealment was inadequate in the majority of trials (n = 464; 85.9%). Blinding of participants and blinding of the outcome assessment were judged to be inadequate in 28.5% (n = 154) and 40.5% (n = 219) of the trials, respectively. A sample size calculation before the initiation of the study was not performed or reported in 79.1% (n = 427) of the trials, while the sample size was assessed as adequate in only 17.6% (n = 95) of the trials. Two thirds of the trials were not described as double blinded (n = 358; 66.3%), while the method of blinding was appropriate in 53% (n = 286) of the trials. We identified a significant decrease over time (1955-2013) in the proportion of trials assessed as having inadequately addressed methodological quality items (P < 0.05) in 30 out of the 40 quality criteria, or as being inadequate (at high or unclear risk of bias) in five domains of the Cochrane risk of bias tool: sequence generation, allocation concealment, incomplete outcome data, other sources of bias, and overall risk of bias. The risks of bias, risks of random errors, reporting quality, and methodological quality of randomized clinical trials of oral health interventions have improved over time; however, further efforts that contribute to the development of more stringent …

  5. Antenna Selection for Full-Duplex MIMO Two-Way Communication Systems

    KAUST Repository

    Wilson-Nunn, Daniel; Chaaban, Anas; Sezgin, Aydin; Alouini, Mohamed-Slim

    2017-01-01

    Antenna selection for full-duplex communication between two nodes, each equipped with a predefined number of antennae and transmit/receive chains, is studied. Selection algorithms are proposed based on magnitude, orthogonality, and determinant criteria. The algorithms are compared to optimal selection obtained by exhaustive search as well as random selection, and are shown to yield performance fairly close to optimal at a much lower complexity. Performance comparison for a Rayleigh fading symmetric channel reveals that selecting a single transmit antenna is best at low signal-to-noise ratio (SNR), while selecting an equal number of transmit and receive antennae is best at high SNR.

  6. Antenna Selection for Full-Duplex MIMO Two-Way Communication Systems

    KAUST Repository

    Wilson-Nunn, Daniel

    2017-03-11

    Antenna selection for full-duplex communication between two nodes, each equipped with a predefined number of antennae and transmit/receive chains, is studied. Selection algorithms are proposed based on magnitude, orthogonality, and determinant criteria. The algorithms are compared to optimal selection obtained by exhaustive search as well as random selection, and are shown to yield performance fairly close to optimal at a much lower complexity. Performance comparison for a Rayleigh fading symmetric channel reveals that selecting a single transmit antenna is best at low signal-to-noise ratio (SNR), while selecting an equal number of transmit and receive antennae is best at high SNR.
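The selection criteria in the two records above can be illustrated in a toy MIMO setting: exhaustive search over transmit-antenna subsets versus a simple magnitude (column-norm) criterion and random selection, each scored by the usual log-det capacity. The dimensions, SNR, and equal-power normalization below are illustrative assumptions, not the paper's exact setup:

```python
from itertools import combinations
import numpy as np

rng = np.random.default_rng(3)

def capacity(H, cols, snr):
    """log2 det(I + (snr/k) H_S H_S^H) for the selected transmit antennas."""
    Hs = H[:, list(cols)]
    k = Hs.shape[1]
    G = np.eye(H.shape[0]) + (snr / k) * Hs @ Hs.conj().T
    return np.log2(np.linalg.det(G).real)

Nr, Nt, k, snr = 4, 6, 2, 10.0
# Rayleigh fading channel: i.i.d. complex Gaussian entries.
H = (rng.standard_normal((Nr, Nt)) + 1j * rng.standard_normal((Nr, Nt))) / np.sqrt(2)

# Optimal selection by exhaustive search over all antenna subsets.
exhaustive = max(combinations(range(Nt), k), key=lambda c: capacity(H, c, snr))
# Magnitude criterion: keep the k columns with the largest norms.
magnitude = tuple(np.argsort(np.linalg.norm(H, axis=0))[-k:])
# Random selection as the baseline.
randomly = tuple(rng.choice(Nt, size=k, replace=False))
```

By construction the exhaustive choice upper-bounds the other two; the paper's point is that low-complexity criteria come close to it.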

  7. Random broadcast on random geometric graphs

    Energy Technology Data Exchange (ETDEWEB)

    Bradonjic, Milan [Los Alamos National Laboratory; Elsasser, Robert [UNIV OF PADERBORN; Friedrich, Tobias [ICSI/BERKELEY; Sauerwald, Tomas [ICSI/BERKELEY

    2009-01-01

    In this work, we consider the random broadcast time on random geometric graphs (RGGs). The classic random broadcast model, also known as the push algorithm, is defined as: starting with one informed node, in each succeeding round every informed node chooses one of its neighbors uniformly at random and informs it. We consider the random broadcast time on RGGs, when with high probability: (i) the RGG is connected, (ii) there exists a giant component in the RGG. We show that the random broadcast time is bounded by O(√n + diam(component)), where diam(component) is the diameter of the entire graph, or of the giant component, for the regimes (i), or (ii), respectively. In other words, for both regimes, we derive the broadcast time to be Θ(diam(G)), which is asymptotically optimal.
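The push model defined in this record is easy to simulate directly. The sketch below runs it on a cycle graph as a stand-in for a sparse connected graph (the graph family and size are illustrative choices, not the RGGs of the paper):

```python
import random

random.seed(5)

def push_broadcast(adj, source):
    """Classic push model: per round, every informed node picks one neighbor
    uniformly at random and informs it. Returns the number of rounds until
    every node is informed."""
    informed = {source}
    rounds = 0
    while len(informed) < len(adj):
        rounds += 1
        newly = {random.choice(adj[v]) for v in informed}
        informed |= newly
    return rounds

# Cycle graph on n nodes: diameter n/2, so the bound Theta(diam) dominates.
n = 32
cycle = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
rounds = push_broadcast(cycle, 0)
print(rounds)
```

On the cycle, information travels at most one hop per round in each direction, so the round count is at least the diameter n/2, consistent with the Θ(diam(G)) result.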

  8. Why the null matters: statistical tests, random walks and evolution.

    Science.gov (United States)

    Sheets, H D; Mitchell, C E

    2001-01-01

    A number of statistical tests have been developed to determine what type of dynamics underlie observed changes in morphology in evolutionary time series, based on the pattern of change within the time series. The theory of the 'scaled maximum', the 'log-rate-interval' (LRI) method, and the Hurst exponent all operate on the same principle of comparing the maximum change, or rate of change, in the observed dataset to the maximum change expected of a random walk. Less change in a dataset than expected of a random walk has been interpreted as indicating stabilizing selection, while more change implies directional selection. The 'runs test' in contrast, operates on the sequencing of steps, rather than on excursion. Applications of these tests to computer generated, simulated time series of known dynamical form and various levels of additive noise indicate that there is a fundamental asymmetry in the rate of type II errors of the tests based on excursion: they are all highly sensitive to noise in models of directional selection that result in a linear trend within a time series, but are largely noise immune in the case of a simple model of stabilizing selection. Additionally, the LRI method has a lower sensitivity than originally claimed, due to the large range of LRI rates produced by random walks. Examination of the published results of these tests show that they have seldom produced a conclusion that an observed evolutionary time series was due to directional selection, a result which needs closer examination in light of the asymmetric response of these tests.
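Of the tests discussed in this record, the runs test is the simplest to state: count runs in the signs of successive changes and compare with the number expected of a random sequence. A compact sketch using the Wald-Wolfowitz normal approximation (the example series are synthetic, chosen only to show the two extremes):

```python
import math

def runs_test(series):
    """Wald-Wolfowitz runs test on the signs of successive differences.
    Returns (observed runs, expected runs, z-score). Too few runs suggests
    trend-like behavior (directional change); too many suggests oscillation."""
    signs = [1 if b > a else -1 for a, b in zip(series, series[1:]) if b != a]
    n1 = signs.count(1)
    n2 = signs.count(-1)
    runs = 1 + sum(1 for a, b in zip(signs, signs[1:]) if a != b)
    n = n1 + n2
    mu = 2 * n1 * n2 / n + 1
    var = 2 * n1 * n2 * (2 * n1 * n2 - n) / (n ** 2 * (n - 1))
    z = (runs - mu) / math.sqrt(var)
    return runs, mu, z

trendy = list(range(10)) + list(range(10, 0, -1))  # one long rise, one long fall
wiggle = [0, 1] * 10                               # strict alternation

print(runs_test(trendy))  # few runs -> negative z
print(runs_test(wiggle))  # many runs -> positive z
```

Note this sketch captures only the sequencing aspect the record attributes to the runs test; the excursion-based tests (scaled maximum, LRI, Hurst exponent) compare magnitudes of change instead.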

  9. Robust estimation of the expected survival probabilities from high-dimensional Cox models with biomarker-by-treatment interactions in randomized clinical trials

    Directory of Open Access Journals (Sweden)

    Nils Ternès

    2017-05-01

    Abstract Background Thanks to the advances in genomics and targeted treatments, more and more prediction models based on biomarkers are being developed to predict potential benefit from treatments in a randomized clinical trial. Although the methodological framework for the development and validation of prediction models in a high-dimensional setting is increasingly well established, no clear guidance exists yet on how to estimate expected survival probabilities in a penalized model with biomarker-by-treatment interactions. Methods Based on a parsimonious biomarker selection in a penalized high-dimensional Cox model (lasso or adaptive lasso), we propose a unified framework to: estimate internally the predictive accuracy metrics of the developed model (using double cross-validation); estimate the individual survival probabilities at a given timepoint; construct confidence intervals thereof (analytical or bootstrap); and visualize them graphically (pointwise or smoothed with splines). We compared these strategies through a simulation study covering scenarios with or without biomarker effects. We applied the strategies to a large randomized phase III clinical trial that evaluated the effect of adding trastuzumab to chemotherapy in 1574 early breast cancer patients, for which the expression of 462 genes was measured. Results In our simulations, penalized regression models using the adaptive lasso estimated the survival probability of new patients with low bias and standard error; bootstrapped confidence intervals had empirical coverage probability close to the nominal level across very different scenarios. The double cross-validation performed on the training data set closely mimicked the predictive accuracy of the selected models in external validation data. We also propose a useful visual representation of the expected survival probabilities using splines. In the breast cancer trial, the adaptive lasso penalty selected a prediction model with 4 …

  10. Influence of Maximum Inbreeding Avoidance under BLUP EBV Selection on Pinzgau Population Diversity

    Directory of Open Access Journals (Sweden)

    Radovan Kasarda

    2011-05-01

    We evaluated the effect of mating strategy (random mating vs. maximum avoidance of inbreeding) under a BLUP EBV selection strategy. The existing population structure was analyzed by Monte Carlo stochastic simulation with the aim of minimizing the increase of inbreeding. Maximum avoidance of inbreeding under BLUP selection resulted in an increase of inbreeding comparable to random mating over an average of 10 generations of development. After 10 generations of the simulated mating strategy, ΔF = 6.51% (2 sires), 5.20% (3 sires), 3.22% (4 sires), and 2.94% (5 sires) was observed. A decrease of inbreeding was observed as the number of selected sires increased. With the use of 4 or 5 sires, the increase of inbreeding was comparable to random mating with phenotypic selection. To preserve genetic diversity and prevent population loss, it is important to minimize the increase of inbreeding in small populations. The classical approach was based on balancing the ratio of sires and dams in the mating program; in contrast, in most commercial populations a small number of sires is used with a high mating ratio.
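The reported trend, fewer sires giving a faster increase of inbreeding, matches the classical effective-population-size relation ΔF = 1/(8Nm) + 1/(8Nf). The sketch below evaluates that textbook per-generation formula for 2 to 5 sires with a hypothetical number of dams; it is not the paper's Monte Carlo simulation:

```python
def delta_f_per_generation(n_sires, n_dams):
    """Classical rate of inbreeding from effective population size:
    Ne = 4*Nm*Nf/(Nm+Nf), so delta_F = 1/(2*Ne) = 1/(8*Nm) + 1/(8*Nf)."""
    return 1 / (8 * n_sires) + 1 / (8 * n_dams)

n_dams = 100  # hypothetical dam herd size, for illustration only
rates = {s: delta_f_per_generation(s, n_dams) for s in (2, 3, 4, 5)}
for s, df in rates.items():
    print(f"{s} sires: delta_F = {df:.4%} per generation")
```

With few sires, the 1/(8Nm) term dominates, reproducing the monotone decrease of ΔF with sire number seen in the record.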

  11. New detection systems of bacteria using highly selective media designed by SMART: selective medium-design algorithm restricted by two constraints.

    Directory of Open Access Journals (Sweden)

    Takeshi Kawanishi

    Culturing is an indispensable technique in microbiological research, and culturing with selective media has played a crucial role in the detection of pathogenic microorganisms and the isolation of commercially useful microorganisms from environmental samples. Although numerous selective media have been developed in empirical studies, unintended microorganisms often grow on such media, probably because of the enormous numbers of microorganisms in the environment. Here, we present a novel strategy for designing highly selective media based on two selective agents, a carbon source and antimicrobials. We named our strategy SMART, for highly Selective Medium-design Algorithm Restricted by Two constraints. To test whether the SMART method is applicable to a wide range of microorganisms, we developed selective media for Burkholderia glumae, Acidovorax avenae, Pectobacterium carotovorum, Ralstonia solanacearum, and Xanthomonas campestris. The series of media developed by SMART specifically allowed growth of the targeted bacteria. Because these selective media exhibited high specificity for growth of the target bacteria compared to established selective media, we applied three notable detection technologies: paper-based, flow cytometry-based, and color-change-based detection systems for the target bacterial species. SMART facilitates not only the development of novel techniques for detecting specific bacteria, but also our understanding of the ecology and epidemiology of the targeted bacteria.

  12. Predicting disease risks from highly imbalanced data using random forest

    Directory of Open Access Journals (Sweden)

    Chakraborty Sounak

    2011-07-01

    Full Text Available Abstract Background We present a method utilizing the Healthcare Cost and Utilization Project (HCUP) dataset for predicting disease risk of individuals based on their medical diagnosis history. The presented methodology may be incorporated in a variety of applications such as risk management, tailored health communication and decision support systems in healthcare. Methods We employed the National Inpatient Sample (NIS) data, which is publicly available through the Healthcare Cost and Utilization Project (HCUP), to train random forest (RF) classifiers for disease prediction. Since the HCUP data is highly imbalanced, we employed an ensemble learning approach based on repeated random sub-sampling. This technique divides the training data into multiple sub-samples, while ensuring that each sub-sample is fully balanced. We compared the performance of support vector machine (SVM), bagging, boosting and RF to predict the risk of eight chronic diseases. Results We predicted eight disease categories. Overall, the RF ensemble learning method outperformed SVM, bagging and boosting in terms of the area under the receiver operating characteristic (ROC) curve (AUC). In addition, RF has the advantage of computing the importance of each variable in the classification process. Conclusions In combining repeated random sub-sampling with RF, we were able to overcome the class imbalance problem and achieve promising results. Using the national HCUP data set, we predicted eight disease categories with an average AUC of 88.79%.
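    The repeated random sub-sampling idea described above can be sketched as follows. This is a minimal illustration on synthetic data, assuming scikit-learn is available; it is not the authors' code, and the dataset, class sizes and parameters are all hypothetical stand-ins for the HCUP data:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)

    # Synthetic imbalanced data: 1000 "healthy" (class 0) vs. 50 "disease" (class 1).
    X_maj = rng.normal(0.0, 1.0, size=(1000, 5))
    X_min = rng.normal(1.5, 1.0, size=(50, 5))
    X = np.vstack([X_maj, X_min])
    y = np.array([0] * 1000 + [1] * 50)

    def balanced_rf_ensemble(X, y, n_subsamples=10, seed=0):
        """Train one random forest per fully balanced sub-sample of the data."""
        rng = np.random.default_rng(seed)
        minority = np.flatnonzero(y == 1)
        majority = np.flatnonzero(y == 0)
        models = []
        for _ in range(n_subsamples):
            # Each sub-sample keeps all minority cases plus an equally sized
            # random draw from the majority class, so it is fully balanced.
            maj_sub = rng.choice(majority, size=minority.size, replace=False)
            idx = np.concatenate([minority, maj_sub])
            clf = RandomForestClassifier(n_estimators=100, random_state=0)
            models.append(clf.fit(X[idx], y[idx]))
        return models

    def ensemble_risk(models, X):
        """Average the predicted disease probability over the ensemble."""
        return np.mean([m.predict_proba(X)[:, 1] for m in models], axis=0)

    models = balanced_rf_ensemble(X, y)
    scores = ensemble_risk(models, X)
    ```

    Averaging the per-model probabilities lets every majority-class example contribute to some sub-sample while each individual forest still trains on balanced classes.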

  13. Computation of High-Frequency Waves with Random Uncertainty

    KAUST Repository

    Malenova, Gabriela

    2016-01-06

    We consider the forward propagation of uncertainty in high-frequency waves, described by the second order wave equation with highly oscillatory initial data. The main sources of uncertainty are the wave speed and/or the initial phase and amplitude, described by a finite number of random variables with known joint probability distribution. We propose a stochastic spectral asymptotic method [1] for computing the statistics of uncertain output quantities of interest (QoIs), which are often linear or nonlinear functionals of the wave solution and its spatial/temporal derivatives. The numerical scheme combines two techniques: a high-frequency method based on Gaussian beams [2, 3] and a sparse stochastic collocation method [4]. The fast spectral convergence of the proposed method depends crucially on the presence of high stochastic regularity of the QoI independent of the wave frequency. In general, the high-frequency wave solutions to parametric hyperbolic equations are highly oscillatory and non-smooth in both physical and stochastic spaces. Consequently, the stochastic regularity of the QoI, which is a functional of the wave solution, may in principle be low and depend on the frequency. In the present work, we provide theoretical arguments and numerical evidence that physically motivated QoIs based on local averages of |u^ε|² are smooth, with derivatives in the stochastic space uniformly bounded in ε, where u^ε and ε denote the highly oscillatory wave solution and the short wavelength, respectively. This observable-related regularity makes the proposed approach more efficient than current asymptotic approaches based on Monte Carlo sampling techniques.

  14. Universal Prevention for Anxiety and Depressive Symptoms in Children: A Meta-analysis of Randomized and Cluster-Randomized Trials.

    Science.gov (United States)

    Ahlen, Johan; Lenhard, Fabian; Ghaderi, Ata

    2015-12-01

    Although under-diagnosed, anxiety and depression are among the most prevalent psychiatric disorders in children and adolescents, leading to severe impairment, increased risk of future psychiatric problems, and a high economic burden to society. Universal prevention may be a potent way to address these widespread problems. There are several benefits to universal relative to targeted interventions because there is limited knowledge as to how to screen for anxiety and depression in the general population. Earlier meta-analyses of the prevention of depression and anxiety symptoms among children suffer from methodological inadequacies such as combining universal, selective, and indicated interventions in the same analyses, and comparing cluster-randomized trials with randomized trials without any correction for clustering effects. The present meta-analysis attempted to determine the effectiveness of universal interventions to prevent anxiety and depressive symptoms after correcting for clustering effects. A systematic search of randomized studies in PsychINFO, Cochrane Library, and Google Scholar resulted in 30 eligible studies meeting inclusion criteria, namely peer-reviewed, randomized or cluster-randomized trials of universal interventions for anxiety and depressive symptoms in school-aged children. Sixty-three percent of the studies reported outcome data regarding anxiety and 87 % reported outcome data regarding depression. Seventy percent of the studies used randomization at the cluster level. There were small but significant effects regarding anxiety (.13) and depressive (.11) symptoms as measured at immediate posttest. At follow-up, which ranged from 3 to 48 months, effects were significantly larger than zero regarding depressive (.07) but not anxiety (.11) symptoms. There was no significant moderation effect of the following pre-selected variables: the primary aim of the intervention (anxiety or depression), deliverer of the intervention, gender distribution

  15. High Mortality without ESCAPE: The Registry of Heart Failure Patients Receiving Pulmonary Artery Catheters without Randomization

    Science.gov (United States)

    Allen, Larry A.; Rogers, Joseph G.; Warnica, J. Wayne; DiSalvo, Thomas G.; Tasissa, Gudaye; Binanay, Cynthia; O’Connor, Christopher M.; Califf, Robert M.; Leier, Carl V.; Shah, Monica R.; Stevenson, Lynne W.

    2008-01-01

    Background In ESCAPE, there was no difference in days alive and out of the hospital for patients with decompensated heart failure (HF) randomly assigned to therapy guided by pulmonary artery catheter (PAC) plus clinical assessment versus clinical assessment alone. The external validity of these findings is debated. Methods and Results ESCAPE sites enrolled 439 patients receiving PAC without randomization in a prospective registry. Baseline characteristics, pertinent trial exclusion criteria, reasons for PAC use, hemodynamics, and complications were collected. Survival was determined from the National Death Index and the Alberta Registry. On average, registry patients had lower blood pressure, worse renal function, less neurohormonal antagonist therapy, and higher use of intravenous inotropes as compared with trial patients. Although clinical assessment anticipated less volume overload and greater hypoperfusion among the registry population, measured filling pressures were similarly elevated in the registry and trial, while measured perfusion was slightly higher among registry patients. Registry patients had longer hospitalization (13 vs. 6 days, p <0.001) and higher 6-month mortality (34% vs. 20%, p < 0.001) than trial patients. Conclusions The decision to use PAC without randomization identified a population with higher disease severity and risk of mortality. This prospective registry highlights the complex context of patient selection for randomized trials. PMID:18926438

  16. Comparison of confirmed inactive and randomly selected compounds as negative training examples in support vector machine-based virtual screening.

    Science.gov (United States)

    Heikamp, Kathrin; Bajorath, Jürgen

    2013-07-22

    The choice of negative training data for machine learning is a little explored issue in chemoinformatics. In this study, the influence of alternative sets of negative training data and different background databases on support vector machine (SVM) modeling and virtual screening has been investigated. Target-directed SVM models have been derived on the basis of differently composed training sets containing confirmed inactive molecules or randomly selected database compounds as negative training instances. These models were then applied to search background databases consisting of biological screening data or randomly assembled compounds for available hits. Negative training data were found to systematically influence compound recall in virtual screening. In addition, different background databases had a strong influence on the search results. Our findings also indicated that typical benchmark settings lead to an overestimation of SVM-based virtual screening performance compared to search conditions that are more relevant for practical applications.
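    The core experiment above, training the same SVM with differently composed negative sets and comparing how a screening database is ranked, can be sketched as follows. This is a toy illustration assuming scikit-learn, with synthetic 2D "descriptors" standing in for real chemical fingerprints; none of it is the authors' code or data:

    ```python
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(5)

    # Synthetic 2D "descriptors": actives cluster together; confirmed inactives
    # lie close to the actives, while random database compounds spread widely.
    actives = rng.normal(loc=2.0, scale=0.5, size=(50, 2))
    confirmed_inactives = rng.normal(loc=1.0, scale=0.5, size=(50, 2))
    random_compounds = rng.normal(loc=0.0, scale=2.0, size=(50, 2))

    def train_svm(negatives):
        """Fit an SVM on actives (label 1) vs. a chosen negative set (label 0)."""
        X = np.vstack([actives, negatives])
        y = np.array([1] * len(actives) + [0] * len(negatives))
        return SVC(kernel="rbf").fit(X, y)

    model_inactives = train_svm(confirmed_inactives)
    model_random = train_svm(random_compounds)

    # Score an external screening set with both models; the resulting rankings
    # differ, reflecting the systematic influence of the negative training data.
    screen = rng.normal(loc=1.5, scale=1.0, size=(200, 2))
    ranking_a = np.argsort(-model_inactives.decision_function(screen))
    ranking_b = np.argsort(-model_random.decision_function(screen))
    ```

    Because confirmed inactives sit near the decision boundary while random compounds do not, the two models place that boundary differently, which is the effect the study measures as changed compound recall.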

  17. Variable Selection in Time Series Forecasting Using Random Forests

    Directory of Open Access Journals (Sweden)

    Hristos Tyralis

    2017-10-01

    Full Text Available Time series forecasting using machine learning algorithms has gained popularity recently. Random forest (RF) is a machine learning algorithm implemented in time series forecasting; however, most of its forecasting properties have remained unexplored. Here we focus on assessing the performance of random forests in one-step forecasting using two large datasets of short time series, with the aim of suggesting an optimal set of predictor variables. Furthermore, we compare its performance to benchmarking methods. The first dataset is composed of 16,000 simulated time series from a variety of Autoregressive Fractionally Integrated Moving Average (ARFIMA) models. The second dataset consists of 135 mean annual temperature time series. The highest predictive performance of RF is observed when using a low number of recent lagged predictor variables. This outcome could be useful in relevant future applications, with the prospect to achieve higher predictive accuracy.
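    One-step random forest forecasting with a few recent lags, the setup the study finds to work best, can be sketched like this. A toy AR(1) series stands in for the ARFIMA/temperature data, scikit-learn is assumed, and the lag count and split point are arbitrary choices:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(1)

    # A toy AR(1) series stands in for the paper's ARFIMA / temperature data.
    n = 300
    series = np.zeros(n)
    for t in range(1, n):
        series[t] = 0.7 * series[t - 1] + rng.normal(scale=0.5)

    def make_lagged(series, n_lags):
        """Embed the series so row t holds the n_lags values preceding y[t]."""
        X = np.column_stack(
            [series[i:len(series) - n_lags + i] for i in range(n_lags)]
        )
        y = series[n_lags:]
        return X, y

    X, y = make_lagged(series, 3)  # a low number of recent lags, per the study
    split = 250
    rf = RandomForestRegressor(n_estimators=200, random_state=0)
    rf.fit(X[:split], y[:split])
    preds = rf.predict(X[split:])
    rmse = float(np.sqrt(np.mean((preds - y[split:]) ** 2)))
    ```

    Adding many older lags mostly adds noise to the trees; the study's finding is that a short recent-lag window like this one tends to maximize one-step accuracy.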

  18. High Selectivity Oxygen Delignification

    Energy Technology Data Exchange (ETDEWEB)

    Lucian A. Lucia

    2005-11-15

    Project Objective: The objectives of this project are as follows: (1) Examine the physical and chemical characteristics of a partner mill pre- and post-oxygen delignified pulp and compare them to lab generated oxygen delignified pulps; (2) Apply the chemical selectivity enhancement system to the partner pre-oxygen delignified pulps under mill conditions (with and without any predetermined amounts of carryover) to determine how efficiently viscosity is preserved, how well selectivity is enhanced, if strength is improved, measure any yield differences and/or bleachability differences; and (3) Initiate a mill scale oxygen delignification run using the selectivity enhancement agent, collect the mill data, analyze it, and propose any future plans for implementation.

  19. The Long-Term Effectiveness of a Selective, Personality-Targeted Prevention Program in Reducing Alcohol Use and Related Harms: A Cluster Randomized Controlled Trial

    Science.gov (United States)

    Newton, Nicola C.; Conrod, Patricia J.; Slade, Tim; Carragher, Natacha; Champion, Katrina E.; Barrett, Emma L.; Kelly, Erin V.; Nair, Natasha K.; Stapinski, Lexine; Teesson, Maree

    2016-01-01

    Background: This study investigated the long-term effectiveness of Preventure, a selective personality-targeted prevention program, in reducing the uptake of alcohol, harmful use of alcohol, and alcohol-related harms over a 3-year period. Methods: A cluster randomized controlled trial was conducted to assess the effectiveness of Preventure.…

  20. Prevalence of at-risk genotypes for genotoxic effects decreases with age in a randomly selected population in Flanders: a cross sectional study

    Directory of Open Access Journals (Sweden)

    van Delft Joost HM

    2011-10-01

    Full Text Available Abstract Background We hypothesized that in Flanders (Belgium), the prevalence of at-risk genotypes for genotoxic effects decreases with age due to morbidity and mortality resulting from chronic diseases. Rather than polymorphisms in single genes, the interaction of multiple genetic polymorphisms in low penetrance genes involved in genotoxic effects might be of relevance. Methods Genotyping was performed on 399 randomly selected adults (aged 50-65) and on 442 randomly selected adolescents. Based on their involvement in processes relevant to genotoxicity, 28 low penetrance polymorphisms affecting the phenotype in 19 genes were selected (xenobiotic metabolism, oxidative stress defense and DNA repair; respectively 13, 6 and 9 polymorphisms). Polymorphisms which, based on available literature, could not clearly be categorized a priori as leading to an 'increased risk' or a 'protective effect' were excluded. Results The mean number of risk alleles for all investigated polymorphisms was found to be lower in the 'elderly' (17.0 ± 2.9) than the 'adolescent' (17.6 ± 3.1) subpopulation (P = 0.002). These results were affected by neither gender nor smoking. The prevalence of a high (> 17 = median) number of risk alleles was less frequent in the 'elderly' (40.6%) than the 'adolescent' (51.4%) subpopulation (P = 0.002). In particular for phase II enzymes, the mean number of risk alleles was lower in the 'elderly' (4.3 ± 1.6) than the 'adolescent' age group (4.8 ± 1.9), and a high (> 4 = median) number of risk alleles was less frequent in the 'elderly' (41.3%) than the adolescent subpopulation (56.3%). A high (> 8 = median) number of risk alleles for DNA repair enzyme-coding genes was also less frequent in the 'elderly' (37.3%) than the 'adolescent' subpopulation (45.6%, P = 0.017). Conclusions These observations are consistent with the hypothesis that, in Flanders, the prevalence of at-risk alleles in genes involved in genotoxic effects decreases with age, suggesting that persons carrying a higher number of

  1. Classification of high resolution remote sensing image based on geo-ontology and conditional random fields

    Science.gov (United States)

    Hong, Liang

    2013-10-01

    The availability of high spatial resolution remote sensing data provides new opportunities for urban land-cover classification. More geometric detail can be observed in high resolution remote sensing images, and ground objects display rich texture, structure, shape and hierarchical semantic characteristics, with more landscape elements represented by small groups of pixels. In recent years, the object-based remote sensing analysis methodology has become widely accepted and applied in high resolution remote sensing image processing. A classification method based on geo-ontology and conditional random fields is presented in this paper. The proposed method is made up of four blocks: (1) a hierarchical semantic framework of ground objects is constructed based on geo-ontology; (2) image objects are generated by mean-shift segmentation, which yields boundary-preserving and spectrally homogeneous over-segmented regions; (3) the relations between the hierarchical ground-object semantics and the over-segmented regions are defined within a conditional random field framework; and (4) hierarchical classification results are obtained based on the geo-ontology and conditional random fields. Finally, high-resolution remote sensing image data (GeoEye) are used to verify the performance of the presented method. The experimental results show the superiority of this method over the eCognition method in both effectiveness and accuracy, which implies it is suitable for the classification of high resolution remote sensing images.

  2. Generating equilateral random polygons in confinement III

    International Nuclear Information System (INIS)

    Diao, Y; Ernst, C; Montemayor, A; Ziegler, U

    2012-01-01

    In this paper we continue our earlier studies (Diao et al 2011 J. Phys. A: Math. Theor. 44 405202, Diao et al J. Phys. A: Math. Theor. 45 275203) on the generation methods of random equilateral polygons confined in a sphere. The first half of this paper is concerned with the generation of confined equilateral random walks. We show that if the selection of a vertex is uniform subject to the position of its previous vertex and the confining condition, then the distributions of the vertices are not uniform, although there exists a distribution such that if the initial vertex is selected following this distribution, then all vertices of the random walk follow this same distribution. Thus in order to generate a confined equilateral random walk, the selection of a vertex cannot be uniform subject to the position of its previous vertex and the confining condition. We provide a simple algorithm capable of generating confined equilateral random walks whose vertex distribution is almost uniform in the confinement sphere. In the second half of this paper we show that any process generating confined equilateral random walks can be turned into a process generating confined equilateral random polygons with the property that the vertex distribution of the polygons approaches the vertex distribution of the walks as the polygons get longer and longer. In our earlier studies, the starting point of the confined polygon is fixed at the center of the sphere. The new approach here allows us to move the starting point of the confined polygon off the center of the sphere. (paper)
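    The naive generation method analyzed above, selecting each new vertex uniformly subject to the position of its predecessor and the confining sphere, can be sketched in plain Python. This is an illustration of that simple rejection scheme (the one the authors show produces non-uniform vertex distributions), not the corrected algorithm from the paper; the radius and step count are arbitrary:

    ```python
    import math
    import random

    def random_unit_vector(rng):
        """Uniform direction in 3D via the standard (z, phi) construction."""
        z = rng.uniform(-1.0, 1.0)
        phi = rng.uniform(0.0, 2.0 * math.pi)
        r = math.sqrt(1.0 - z * z)
        return (r * math.cos(phi), r * math.sin(phi), z)

    def confined_walk(n_steps, radius, rng):
        """Naive confined equilateral walk: from each vertex, resample a unit
        step until the new vertex stays inside the confining sphere."""
        assert radius >= 1.0, "sphere must admit a unit step from the center"
        walk = [(0.0, 0.0, 0.0)]  # start at the center of the sphere
        for _ in range(n_steps):
            x, y, z = walk[-1]
            while True:
                dx, dy, dz = random_unit_vector(rng)
                nxt = (x + dx, y + dy, z + dz)
                if nxt[0] ** 2 + nxt[1] ** 2 + nxt[2] ** 2 <= radius ** 2:
                    walk.append(nxt)
                    break
        return walk

    rng = random.Random(42)
    walk = confined_walk(200, 2.0, rng)
    ```

    Every step has unit length and every vertex stays inside the sphere, yet, as the paper proves, the resulting vertex distribution is not uniform in the confinement sphere.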

  3. Reference satellite selection method for GNSS high-precision relative positioning

    Directory of Open Access Journals (Sweden)

    Xiao Gao

    2017-03-01

    Full Text Available Selecting the optimal reference satellite is an important component of high-precision relative positioning because the reference satellite directly influences the strength of the normal equation. Reference satellite selection methods based on elevation and on the positional dilution of precision (PDOP) value were compared; the results show that neither method reliably selects the optimal reference satellite. We therefore introduce the condition number of the design matrix into the reference satellite selection method to improve the structure of the normal equation, since the condition number indicates how ill-conditioned the normal equation is. The experimental results show that the new method can improve positioning accuracy and reliability in precise relative positioning.
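    The condition-number criterion can be illustrated with a simplified double-difference design matrix built from line-of-sight unit vectors. This is a hedged sketch, assuming NumPy; the geometry is random rather than real GNSS data, and the matrix construction is a textbook simplification, not the paper's exact formulation:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Hypothetical receiver-to-satellite unit line-of-sight vectors, 6 satellites.
    los = rng.normal(size=(6, 3))
    los /= np.linalg.norm(los, axis=1, keepdims=True)

    def dd_condition(los, ref):
        """Condition number of a simplified double-difference design matrix:
        one row (e_i - e_ref) for every satellite i other than the reference."""
        rows = np.delete(los, ref, axis=0) - los[ref]
        return float(np.linalg.cond(rows))

    # Pick the reference satellite that best conditions the normal equation.
    conds = [dd_condition(los, r) for r in range(len(los))]
    best_ref = int(np.argmin(conds))
    ```

    A lower condition number means the least-squares normal equation formed from this design is less sensitive to measurement noise, which is the rationale the paper gives for preferring it over elevation or PDOP alone.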

  4. Affinity selection of Nipah and Hendra virus-related vaccine candidates from a complex random peptide library displayed on bacteriophage virus-like particles

    Energy Technology Data Exchange (ETDEWEB)

    Peabody, David S.; Chackerian, Bryce; Ashley, Carlee; Carnes, Eric; Negrete, Oscar

    2017-01-24

    The invention relates to virus-like particles of bacteriophage MS2 (MS2 VLPs) displaying peptide epitopes or peptide mimics of epitopes of the Nipah Virus envelope glycoprotein that elicit an immune response against Nipah Virus upon vaccination of humans or animals. Affinity selection on Nipah Virus-neutralizing monoclonal antibodies using random sequence peptide libraries on MS2 VLPs selected peptides with sequence similarity to peptide sequences found within the envelope glycoprotein of Nipah itself, thus identifying the epitopes the antibodies recognize. The selected peptide sequences themselves are not necessarily identical in all respects to a sequence within the Nipah Virus glycoprotein, and may therefore be referred to as epitope mimics; VLPs displaying these epitope mimics can serve as a vaccine. On the other hand, display of the corresponding wild-type sequence derived from Nipah Virus, corresponding to the epitope mapped by affinity selection, may also be used as a vaccine.

  5. Prediction-Oriented Marker Selection (PROMISE): With Application to High-Dimensional Regression.

    Science.gov (United States)

    Kim, Soyeon; Baladandayuthapani, Veerabhadran; Lee, J Jack

    2017-06-01

    In personalized medicine, biomarkers are used to select therapies with the highest likelihood of success based on an individual patient's biomarker/genomic profile. Two goals are to choose important biomarkers that accurately predict treatment outcomes and to cull unimportant biomarkers to reduce the cost of biological and clinical verifications. These goals are challenging due to the high dimensionality of genomic data. Variable selection methods based on penalized regression (e.g., the lasso and elastic net) have yielded promising results. However, selecting the right amount of penalization is critical to simultaneously achieving these two goals. Standard approaches based on cross-validation (CV) typically provide high prediction accuracy with high true positive rates but at the cost of too many false positives. Alternatively, stability selection (SS) controls the number of false positives, but at the cost of yielding too few true positives. To circumvent these issues, we propose prediction-oriented marker selection (PROMISE), which combines SS with CV to conflate the advantages of both methods. Our application of PROMISE with the lasso and elastic net in data analysis shows that, compared to CV, PROMISE produces sparse solutions, few false positives, and small type I + type II error, and maintains good prediction accuracy, with a marginal decrease in the true positive rates. Compared to SS, PROMISE offers better prediction accuracy and true positive rates. In summary, PROMISE can be applied in many fields to select regularization parameters when the goals are to minimize false positives and maximize prediction accuracy.
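    The idea of combining cross-validation (for prediction accuracy) with stability selection (for false-positive control) can be sketched in a simplified form. This is not the authors' PROMISE implementation: it assumes scikit-learn, uses synthetic data with two true markers, and the subsampling scheme and 0.8 frequency threshold are illustrative choices:

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso, LassoCV

    rng = np.random.default_rng(3)
    n, p = 100, 50
    X = rng.normal(size=(n, p))
    y = 3.0 * X[:, 0] + 2.0 * X[:, 1] + rng.normal(scale=0.5, size=n)

    # Step 1 (CV half): pick the lasso penalty by cross-validation.
    alpha = LassoCV(cv=5, random_state=0).fit(X, y).alpha_

    # Step 2 (stability half): refit on random half-samples and count how
    # often each coefficient survives; keep only consistently selected markers.
    n_rounds = 50
    counts = np.zeros(p)
    for _ in range(n_rounds):
        idx = rng.choice(n, size=n // 2, replace=False)
        coef = Lasso(alpha=alpha).fit(X[idx], y[idx]).coef_
        counts += coef != 0
    freq = counts / n_rounds
    selected = np.flatnonzero(freq >= 0.8)
    ```

    Markers that are only selected on a few lucky subsamples fall below the frequency threshold, which is how this hybrid trades a small loss in true positives for far fewer false positives than CV alone.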

  6. High-throughput selection for cellulase catalysts using chemical complementation.

    Science.gov (United States)

    Peralta-Yahya, Pamela; Carter, Brian T; Lin, Hening; Tao, Haiyan; Cornish, Virginia W

    2008-12-24

    Efficient enzymatic hydrolysis of lignocellulosic material remains one of the major bottlenecks to cost-effective conversion of biomass to ethanol. Improvement of glycosylhydrolases, however, is limited by existing medium-throughput screening technologies. Here, we report the first high-throughput selection for cellulase catalysts. This selection was developed by adapting chemical complementation to provide a growth assay for bond cleavage reactions. First, a URA3 counter selection was adapted to link chemical dimerizer activated gene transcription to cell death. Next, the URA3 counter selection was shown to detect cellulase activity based on cleavage of a tetrasaccharide chemical dimerizer substrate and decrease in expression of the toxic URA3 reporter. Finally, the utility of the cellulase selection was assessed by isolating cellulases with improved activity from a cellulase library created by family DNA shuffling. This application provides further evidence that chemical complementation can be readily adapted to detect different enzymatic activities for important chemical transformations for which no natural selection exists. Because of the large number of enzyme variants that selections can now test as compared to existing medium-throughput screens for cellulases, this assay has the potential to impact the discovery of improved cellulases and other glycosylhydrolases for biomass conversion from libraries of cellulases created by mutagenesis or obtained from natural biodiversity.

  7. A Primer on High-Throughput Computing for Genomic Selection

    Directory of Open Access Journals (Sweden)

    Xiao-Lin eWu

    2011-02-01

    Full Text Available High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl and R, are also very useful to devise pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on the central processors, performing general purpose computation on a graphics processing unit (GPU) provides a new-generation approach to massive parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin – Madison, which can be leveraged for genomic selection, in terms of central processing unit (CPU) capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general purpose computing environments will further expand our capability to meet increasing computing demands posed by unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of

  8. Random ancestor trees

    International Nuclear Information System (INIS)

    Ben-Naim, E; Krapivsky, P L

    2010-01-01

    We investigate a network growth model in which the genealogy controls the evolution. In this model, a new node selects a random target node and links either to this target node, or to its parent, or to its grandparent, etc.; all nodes from the target node to its most ancient ancestor are equiprobable destinations. The emerging random ancestor tree is very shallow: the fraction g_n of nodes at distance n from the root decreases super-exponentially with n, g_n = e^{-1}/(n-1)!. We find that a macroscopic hub at the root coexists with highly connected nodes at higher generations. The maximal degree of a node at the nth generation grows algebraically as N^{1/β_n}, where N is the system size. We obtain the series of nontrivial exponents which are roots of transcendental equations: β_1 ≅ 1.351746, β_2 ≅ 1.682201, etc. As a consequence, the fraction p_k of nodes with degree k has an algebraic tail, p_k ∼ k^{-γ}, with γ = β_1 + 1 = 2.351746.
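    The growth rule described above is easy to simulate directly: each new node draws a uniform target and then attaches to a uniformly chosen node on the path from that target to the root. The sketch below is plain Python (not the authors' code); by the abstract's result, the fraction of nodes at distance 1 from the root should approach g_1 = e^{-1} ≈ 0.368:

    ```python
    import random

    def random_ancestor_tree(n_nodes, rng):
        """Grow the tree: each new node picks a uniform random target, then
        attaches to one of the nodes on the path target -> root, equiprobably."""
        parent = [None]  # node 0 is the root
        for new in range(1, n_nodes):
            target = rng.randrange(new)
            # Collect the ancestor chain: target, parent(target), ..., root.
            chain = [target]
            while parent[chain[-1]] is not None:
                chain.append(parent[chain[-1]])
            parent.append(rng.choice(chain))
        return parent

    def depths(parent):
        """Distance of every node from the root (parents precede children)."""
        d = [0] * len(parent)
        for v in range(1, len(parent)):
            d[v] = d[parent[v]] + 1
        return d

    rng = random.Random(0)
    parent = random_ancestor_tree(20000, rng)
    d = depths(parent)
    ```

    With 20,000 nodes the tree is strikingly shallow, consistent with the super-exponential decay of g_n quoted in the abstract.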

  9. A primer on high-throughput computing for genomic selection.

    Science.gov (United States)

    Wu, Xiao-Lin; Beissinger, Timothy M; Bauck, Stewart; Woodward, Brent; Rosa, Guilherme J M; Weigel, Kent A; Gatti, Natalia de Leon; Gianola, Daniel

    2011-01-01

    High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long, and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl, and R, are also very useful to devise pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on the central processors, performing general-purpose computation on a graphics processing unit provides a new-generation approach to massive parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin-Madison, which can be leveraged for genomic selection, in terms of central processing unit capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general-purpose computing environments will further expand our capability to meet increasing computing demands posed by unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of marker panels to realized

  10. High-level radioactive waste repositories site selection plan

    International Nuclear Information System (INIS)

    Castanon, A.; Recreo, F.

    1985-01-01

    A general overview is given of the site selection processes for high-level nuclear waste (HLNW) and/or spent nuclear fuel facilities, according to the guidelines of the main international nuclear safety regulatory bodies and the experience of those countries whose national nuclear programs are most developed. (author)

  11. 10-Year Mortality Outcome of a Routine Invasive Strategy Versus a Selective Invasive Strategy in Non-ST-Segment Elevation Acute Coronary Syndrome: The British Heart Foundation RITA-3 Randomized Trial.

    Science.gov (United States)

    Henderson, Robert A; Jarvis, Christopher; Clayton, Tim; Pocock, Stuart J; Fox, Keith A A

    2015-08-04

    The RITA-3 (Third Randomised Intervention Treatment of Angina) trial compared outcomes of a routine early invasive strategy (coronary arteriography and myocardial revascularization, as clinically indicated) to those of a selective invasive strategy (coronary arteriography for recurrent ischemia only) in patients with non-ST-segment elevation acute coronary syndrome (NSTEACS). At a median of 5 years' follow-up, the routine invasive strategy was associated with a 24% reduction in the odds of all-cause mortality. This study reports 10-year follow-up outcomes of the randomized cohort to determine the impact of a routine invasive strategy on longer-term mortality. We randomized 1,810 patients with NSTEACS to receive routine invasive or selective invasive strategies. All randomized patients had annual follow-up visits up to 5 years, and mortality was documented thereafter using data from the Office of National Statistics. Over 10 years, there were no differences in mortality between the 2 groups (all-cause deaths in 225 [25.1%] vs. 232 patients [25.4%]: p = 0.94; and cardiovascular deaths in 135 [15.1%] vs. 147 patients [16.1%]: p = 0.65 in the routine invasive and selective invasive groups, respectively). Multivariate analysis identified several independent predictors of 10-year mortality: age, previous myocardial infarction, heart failure, smoking status, diabetes, heart rate, and ST-segment depression. A modified post-discharge Global Registry of Acute Coronary Events (GRACE) score was used to calculate an individual risk score for each patient and to form low-risk, medium-risk, and high-risk groups. Risk of death within 10 years varied markedly from 14.4% in the low-risk group to 56.2% in the high-risk group. This mortality trend did not depend on the assigned treatment strategy. The advantage of reduced mortality of routine early invasive strategy seen at 5 years was attenuated during later follow-up, with no evidence of a difference in outcome at 10 years.

  12. High-dimensional model estimation and model selection

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    I will review concepts and algorithms from high-dimensional statistics for linear model estimation and model selection. I will particularly focus on the so-called p>>n setting where the number of variables p is much larger than the number of samples n. I will focus mostly on regularized statistical estimators that produce sparse models. Important examples include the LASSO and its matrix extension, the Graphical LASSO, and more recent non-convex methods such as the TREX. I will show the applicability of these estimators in a diverse range of scientific applications, such as sparse interaction graph recovery and high-dimensional classification and regression problems in genomics.
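    The p >> n setting with a sparse regularized estimator can be illustrated with the LASSO on synthetic data. A minimal sketch assuming scikit-learn; the dimensions, penalty, and sparse ground truth are all arbitrary illustrative choices:

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)
    n, p = 50, 500  # far more variables than samples (p >> n)
    X = rng.normal(size=(n, p))
    beta = np.zeros(p)
    beta[:5] = [4, -3, 3, -2, 2]  # a 5-sparse ground truth
    y = X @ beta + rng.normal(scale=0.3, size=n)

    # The L1 penalty drives most coefficients exactly to zero,
    # producing a sparse model even though p is tenfold larger than n.
    model = Lasso(alpha=0.2).fit(X, y)
    support = np.flatnonzero(model.coef_)
    ```

    Ordinary least squares is not even well defined here (the design matrix has rank at most n), whereas the LASSO returns a small, interpretable support set.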

  13. Variant selection of martensites in steel welded joints with low transformation temperature weld metals

    International Nuclear Information System (INIS)

    Takahashi, Masaru; Yasuda, Hiroyuki Y.

    2013-01-01

    Highlights: ► We examined the variant selection of martensites in the weld metals. ► We also measured the residual stress developed in the butt and box welded joints. ► 24 martensite variants were randomly selected in the butt welded joint. ► High tensile residual stress in the box welded joint led to the strong variant selection. ► We discussed the rule of the variant selection focusing on the residual stress. -- Abstract: Martensitic transformation behavior in steel welded joints with low transformation temperature weld (LTTW) metal was examined focusing on the variant selection of martensites. The butt and box welded joints were prepared with LTTW metals and 980 MPa grade high strength steels. The residual stress of the welded joints, which was measured by a neutron diffraction technique, was effectively reduced by the expansion of the LTTW metals by the martensitic transformation during cooling after the welding process. In the LTTW metals, the retained austenite and martensite phases have the Kurdjumov–Sachs (K–S) orientation relationship. The variant selection of the martensites in the LTTW metals depended strongly on the type of welded joints. In the butt welded joint, 24 K–S variants were almost randomly selected while a few variants were preferentially chosen in the box welded joint. This suggests that the high residual stress developed in the box welded joint accelerated the formation of specific variants during the cooling process, in contrast to the butt welded joint with low residual stress

  14. Fast selection of miRNA candidates based on large-scale pre-computed MFE sets of randomized sequences.

    Science.gov (United States)

    Warris, Sven; Boymans, Sander; Muiser, Iwe; Noback, Michiel; Krijnen, Wim; Nap, Jan-Peter

    2014-01-13

    Small RNAs are important regulators of genome function, yet their prediction in genomes is still a major computational challenge. Statistical analyses of pre-miRNA sequences indicated that their 2D structure tends to have a minimal free energy (MFE) significantly lower than the MFE values of equivalently randomized sequences with the same nucleotide composition, in contrast to other classes of non-coding RNA. The computation of many MFEs is, however, too intensive to allow for genome-wide screenings. Using a local grid infrastructure, MFE distributions of random sequences were pre-calculated on a large scale. These distributions follow a normal distribution and can be used to determine the MFE distribution for any given sequence composition by interpolation, allowing on-the-fly calculation of the normal distribution for any candidate sequence composition. The speedup achieved makes genome-wide screening with this characteristic of a pre-miRNA sequence practical. Although this property alone will not be sufficiently discriminative to distinguish miRNAs from other sequences, the MFE-based P-value should be added to the set of parameters used to select potential miRNA candidates for experimental verification.
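
The screening idea above reduces to a one-sided normal tail probability once the mean and standard deviation of the MFE distribution for a given nucleotide composition have been interpolated from the pre-computed sets. A minimal sketch (the function name and all numbers below are invented for illustration, not taken from the study):

```python
import math

def mfe_p_value(observed_mfe, mu, sigma):
    """One-sided P-value: probability that a randomized sequence with the
    same nucleotide composition has an MFE at or below the observed MFE,
    assuming the pre-computed MFE distribution is normal(mu, sigma)."""
    z = (observed_mfe - mu) / sigma
    # Normal CDF via the error function (lower tail: more negative MFE = rarer).
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical numbers: a candidate whose MFE of -40 kcal/mol lies well below
# a randomized-background mean of -25 kcal/mol (sigma = 5).
p = mfe_p_value(-40.0, mu=-25.0, sigma=5.0)
print(round(p, 4))  # 0.0013
```

A candidate whose observed MFE sits several standard deviations below the randomized-background mean yields a small P-value and is retained for further screening.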

  15. Selective serotonin reuptake inhibitors (SSRIs) for post-partum depression (PPD): a systematic review of randomized clinical trials.

    Science.gov (United States)

    De Crescenzo, Franco; Perelli, Federica; Armando, Marco; Vicari, Stefano

    2014-01-01

    The treatment of postpartum depression with selective serotonin reuptake inhibitors (SSRIs) has been claimed to be both efficacious and well tolerated, but no recent systematic reviews have been conducted. A qualitative systematic review of randomized clinical trials on women with postpartum depression comparing SSRIs to placebo and/or other treatments was performed. A comprehensive literature search of online databases, the bibliographies of published articles and grey literature were conducted. Data on efficacy, acceptability and tolerability were extracted and the quality of the trials was assessed. Six randomised clinical trials, comprising 595 patients, met quality criteria for inclusion in the analysis. Cognitive-behavioural intervention, psychosocial community-based intervention, psychodynamic therapy, cognitive behavioural therapy, a second-generation tricyclic antidepressant and placebo were used as comparisons. All studies demonstrated higher response and remission rates among those treated with SSRIs and greater mean changes on depression scales, although findings were not always statistically significant. Dropout rates were high in three of the trials but similar among treatment and comparison groups. In general, SSRIs were well tolerated and trial quality was good. There are few trials, patients included in the trials were not representative of all patients with postpartum depression, dropout rates in three trials were high, and long-term efficacy and tolerability were assessed in only two trials. SSRIs appear to be efficacious and well tolerated in the treatment of postpartum depression, but the available evidence fails to demonstrate a clear superiority over other treatments. © 2013 Elsevier B.V. All rights reserved.

  16. {sup 90}Y -PET imaging: Exploring limitations and accuracy under conditions of low counts and high random fraction

    Energy Technology Data Exchange (ETDEWEB)

    Carlier, Thomas, E-mail: thomas.carlier@chu-nantes.fr [Department of Nuclear Medicine, University Hospital of Nantes, Place Alexis Ricordeau, Nantes 44093, France and CRCNA–UMR 892 INSERM 6299 CNRS, 8 quai Moncousu BP 70721, Nantes 44007 (France); Willowson, Kathy P. [Institute of Medical Physics, University of Sydney, Camperdown, New South Wales 2006 (Australia); Fourkal, Eugene [Department of Radiation Oncology, Allegheny General Hospital, Pittsburgh, Pennsylvania 15212 (United States); Bailey, Dale L. [Faculty of Health Sciences, University of Sydney, Lidcombe 2141, Australia and Department of Nuclear Medicine, Royal North Shore Hospital, St Leonards, New South Wales 2065 (Australia); Doss, Mohan [Department of Diagnostic Imaging, Fox Chase Cancer Center, Philadelphia, Pennsylvania 19111 (United States); Conti, Maurizio [Siemens Healthcare Molecular Imaging, 810 Innovation Drive, Knoxville, Tennessee 37932 (United States)

    2015-07-15

    high random fraction and low counts for iterative algorithms. Point spread function (PSF) correction and TOF reconstruction in general reduce background variability and noise and increase recovered concentration. Results for patient data indicated a good correlation between the expected and PET reconstructed activities. A linear relationship between the expected and the measured activities in the organ of interest was observed for all reconstruction methods used: a linearity coefficient of 0.89 ± 0.05 for the Biograph mCT and 0.81 ± 0.05 for the Biograph TruePoint. Conclusions: Due to the low counts and high random fraction, accurate image quantification of {sup 90}Y during selective internal radionuclide therapy is affected by random coincidence estimation, scatter correction, and any positivity constraint of the algorithm. Nevertheless, phantom and patient studies showed that the impact of the number of true and random coincidences on quantitative results was limited as long as ordinary Poisson ordered subsets expectation maximization reconstruction algorithms with random smoothing are used. Adding PSF correction and TOF information to the reconstruction greatly improves the image quality in terms of bias, variability, noise reduction, and detectability. In the patient studies, the total activity in the field of view was in general accurately measured by the Biograph mCT and slightly overestimated by the Biograph TruePoint.

  17. Targeted reduction of highly abundant transcripts using pseudo-random primers.

    Science.gov (United States)

    Arnaud, Ophélie; Kato, Sachi; Poulain, Stéphane; Plessy, Charles

    2016-04-01

    Transcriptome studies based on quantitative sequencing can estimate levels of gene expression by measuring target RNA abundance in sequencing libraries. Sequencing costs are proportional to the total number of sequenced reads, and in order to cover rare RNAs, considerable quantities of abundant and identical reads are needed. This major limitation can be addressed by depleting a proportion of the most abundant sequences from the library. However, such depletion strategies involve either extra handling of the input RNA sample or use of a large number of reverse transcription primers, termed not-so-random (NSR) primers, which are costly to synthesize. Taking advantage of the high tolerance of reverse transcriptase to mis-prime, we found that it is possible to use as few as 40 pseudo-random (PS) reverse transcription primers to decrease the rate of undesirable abundant sequences within a library without affecting the overall transcriptome diversity. PS primers are simple to design and can be used to deplete several undesirable RNAs simultaneously, thus creating a flexible tool for enriching transcriptome libraries for rare transcript sequences.

  18. Performance of Novel Randomly Oriented High Graphene Carbon in Lithium Ion Capacitors

    Directory of Open Access Journals (Sweden)

    Rahul S. Kadam

    2018-01-01

    The structure of the carbon material comprising the anode is key to the performance of a lithium ion capacitor. In addition to determining the capacity, the structure of the carbon material also determines the diffusion rate of the lithium ion into the anode, which in turn controls power density, vital in high-rate applications. This paper covers a systematic investigation of the performance of a structurally novel carbon, called Randomly Oriented High Graphene (ROHG) carbon, and graphite in a high-rate device, the lithium ion capacitor. Electrochemical impedance spectroscopy shows that ROHG is less resistive and has faster lithium ion diffusion rates (393.7 × 10−3 S·s^(1/2)) compared to graphite (338.1 × 10−3 S·s^(1/2)). The impedance spectroscopy data are supported by the cell data showing that the ROHG carbon based device has an energy density of 22.8 Wh/l with a power density of 4349.3 W/l, whereas the baseline graphite based device has an energy density of 5 Wh/l and a power density of 4243.3 W/l. These data clearly show the advantage of the randomly oriented graphene platelet structure of ROHG in lithium ion capacitor performance.

  19. Tracking and flavour tagging selection in the ATLAS High Level Trigger

    CERN Document Server

    Calvetti, Milene; The ATLAS collaboration

    2017-01-01

    In high-energy physics experiments, track-based selection in the online environment is crucial for the efficient real-time selection of the rare physics processes of interest. This is of particular importance at the Large Hadron Collider (LHC), where the increasingly harsh collision environment is challenging the experiments to improve the performance of their online selection. Principal among these challenges is the increasing number of interactions per bunch crossing, known as pileup. In the ATLAS experiment the challenge has been addressed with multiple strategies. Firstly, specific trigger objects have been improved by building algorithms using detailed tracking and vertexing in specific detector regions to improve background rejection without losing signal efficiency. Secondly, since 2015 all trigger areas have benefited from a new high performance Inner Detector (ID) software tracking system implemented in the High Level Trigger. Finally, performance will be further enhanced in future by the installation...

  20. Phage display peptide libraries: deviations from randomness and correctives

    Science.gov (United States)

    Ryvkin, Arie; Ashkenazy, Haim; Weiss-Ottolenghi, Yael; Piller, Chen; Pupko, Tal; Gershoni, Jonathan M

    2018-01-01

    Abstract Peptide-expressing phage display libraries are widely used for the interrogation of antibodies. Affinity-selected peptides are then analyzed to discover epitope mimetics, or are subjected to computational algorithms for epitope prediction. A critical assumption for these applications is the random representation of amino acids in the initial naïve peptide library. In a previous study, we implemented next generation sequencing to evaluate a naïve library and discovered severe deviations from randomness in UAG codon over-representation as well as in high G phosphoramidite abundance, causing amino acid distribution biases. In this study, we demonstrate that the UAG over-representation can be attributed to the burden imposed on the phage upon the assembly of the recombinant Protein 8 subunits. This was corrected by constructing the libraries using supE44-containing bacteria, which suppress the UAG-driven abortive termination. We also demonstrate that the overabundance of G stems from variable synthesis efficiency and can be corrected using compensating oligonucleotide mixtures calibrated by mass spectrometry. Construction of libraries implementing these correctives results in markedly improved libraries that display random distribution of amino acids, thus ensuring that enriched peptides obtained in biopanning represent a genuine selection event, a fundamental assumption for phage display applications. PMID:29420788

  1. The High/Scope Perry Preschool Study: A Case Study in Random Assignment.

    Science.gov (United States)

    Schweinhart, Lawrence J.

    2000-01-01

    Studied the long-term benefits of preschool programs for young children living in poverty in the High/Scope Perry Preschool Study, which examined the lives of 123 African Americans randomly divided into a preschool treatment group and a no-preschool comparison group. Cost-benefit analyses of data on these students to age 27 show beneficial effects…

  2. [Intel random number generator-based true random number generator].

    Science.gov (United States)

    Huang, Feng; Shen, Hong

    2004-09-01

    This study aimed to establish a true random number generator based on certain Intel chips. The random numbers were acquired by programming in Microsoft Visual C++ 6.0 via register reading from the random number generator (RNG) unit of an Intel 815 chipset-based computer with the Intel Security Driver (ISD). We tested the generator with 500 random numbers using the NIST FIPS 140-1 tests and a χ² goodness-of-fit test, and the results showed that the random numbers it generated satisfied the demands of independence and uniform distribution. We also statistically compared the random numbers generated by the Intel RNG-based true random number generator with those from a random number table, using the same amount of 7,500 random numbers in the same value domain, which showed that the SD, SE, and CV of the Intel RNG-based generator were less than those of the random number table. A u test of the two CVs revealed no significant difference between the two methods. The Intel RNG-based random number generator can produce high-quality random numbers with good independence and uniform distribution, and avoids some of the problems associated with acquiring random numbers from a random number table.
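
A χ² uniformity check of the kind used above can be sketched in a few lines; the bin count and sample size are illustrative choices, and Python's Mersenne Twister stands in for the hardware RNG:

```python
import random

def chi_square_uniformity(samples, bins=10):
    """Chi-square statistic for uniformity of values in [0, 1):
    compares observed bin counts with the expected count per bin."""
    counts = [0] * bins
    for x in samples:
        counts[min(int(x * bins), bins - 1)] += 1
    expected = len(samples) / bins
    return sum((c - expected) ** 2 / expected for c in counts)

random.seed(42)  # deterministic software stand-in for the hardware RNG
stat = chi_square_uniformity([random.random() for _ in range(500)])
# With 10 bins there are 9 degrees of freedom; the 5% critical value is
# about 16.92, so a much larger statistic would suggest non-uniformity.
print(round(stat, 2))
```

A perfectly even spread gives a statistic of exactly 0; a generator is consistent with uniformity at the 5% level when the statistic stays below the critical value.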

  3. Varying levels of difficulty index of skills-test items randomly selected by examinees on the Korean emergency medical technician licensing examination.

    Science.gov (United States)

    Koh, Bongyeun; Hong, Sunggi; Kim, Soon-Sim; Hyun, Jin-Sook; Baek, Milye; Moon, Jundong; Kwon, Hayran; Kim, Gyoungyong; Min, Seonggi; Kang, Gu-Hyun

    2016-01-01

    The goal of this study was to characterize the difficulty index of the items in the skills test components of the class I and II Korean emergency medical technician licensing examination (KEMTLE), which requires examinees to select items randomly. The results of 1,309 class I KEMTLE examinations and 1,801 class II KEMTLE examinations in 2013 were subjected to analysis. Items from the basic and advanced skills test sections of the KEMTLE were compared to determine whether some were significantly more difficult than others. In the class I KEMTLE, all 4 of the items on the basic skills test showed significant variation in difficulty index (P<0.01), as well as 4 of the 5 items on the advanced skills test (P<0.05). In the class II KEMTLE, 4 of the 5 items on the basic skills test showed significantly different difficulty index (P<0.01), as well as all 3 of the advanced skills test items (P<0.01). In the skills test components of the class I and II KEMTLE, the procedure in which examinees randomly select questions should be revised to require examinees to respond to a set of fixed items in order to improve the reliability of the national licensing examination.
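
The difficulty index analyzed in this study is the standard item-analysis statistic: the proportion of examinees who pass an item. A minimal sketch with invented pass/fail data (1 = pass):

```python
def difficulty_index(results):
    """Item difficulty index: the fraction of examinees who passed the item.
    Higher values indicate an easier item."""
    return sum(results) / len(results)

# Invented records for three randomly assigned skills-test items.
items = {
    "item_A": [1, 1, 1, 0, 1, 1, 1, 1],  # easy
    "item_B": [1, 0, 1, 0, 1, 0, 1, 0],  # moderate
    "item_C": [0, 0, 1, 0, 0, 1, 0, 0],  # hard
}
for name, results in items.items():
    print(name, difficulty_index(results))
# item_A 0.875, item_B 0.5, item_C 0.25: when examinees draw items at
# random, such spread means candidates face tests of unequal difficulty.
```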

  4. Varying levels of difficulty index of skills-test items randomly selected by examinees on the Korean emergency medical technician licensing examination

    Directory of Open Access Journals (Sweden)

    Bongyeun Koh

    2016-01-01

    Purpose: The goal of this study was to characterize the difficulty index of the items in the skills test components of the class I and II Korean emergency medical technician licensing examination (KEMTLE), which requires examinees to select items randomly. Methods: The results of 1,309 class I KEMTLE examinations and 1,801 class II KEMTLE examinations in 2013 were subjected to analysis. Items from the basic and advanced skills test sections of the KEMTLE were compared to determine whether some were significantly more difficult than others. Results: In the class I KEMTLE, all 4 of the items on the basic skills test showed significant variation in difficulty index (P<0.01), as well as 4 of the 5 items on the advanced skills test (P<0.05). In the class II KEMTLE, 4 of the 5 items on the basic skills test showed significantly different difficulty index (P<0.01), as well as all 3 of the advanced skills test items (P<0.01). Conclusion: In the skills test components of the class I and II KEMTLE, the procedure in which examinees randomly select questions should be revised to require examinees to respond to a set of fixed items in order to improve the reliability of the national licensing examination.

  5. A Community-Based Randomized Trial of Hepatitis B Screening Among High-Risk Vietnamese Americans.

    Science.gov (United States)

    Ma, Grace X; Fang, Carolyn Y; Seals, Brenda; Feng, Ziding; Tan, Yin; Siu, Philip; Yeh, Ming Chin; Golub, Sarit A; Nguyen, Minhhuyen T; Tran, Tam; Wang, Minqi

    2017-03-01

    To evaluate the effectiveness of a community-based liver cancer prevention program on hepatitis B virus (HBV) screening among low-income, underserved Vietnamese Americans at high risk. We conducted a cluster randomized trial involving 36 Vietnamese community-based organizations and 2337 participants in Pennsylvania, New Jersey, and New York City between 2009 and 2014. We randomly assigned 18 community-based organizations to a community-based multilevel HBV screening intervention (n = 1131). We randomly assigned the remaining 18 community-based organizations to a general cancer education program (n = 1206), which included information about HBV-related liver cancer prevention. We assessed HBV screening rates at 6-month follow-up. Intervention participants were significantly more likely to have undergone HBV screening (88.1%) than were control group participants (4.6%). In a Cochran-Mantel-Haenszel analysis, the intervention effect on screening outcomes remained statistically significant after adjustment for demographic and health care access variables, including income, having health insurance, having a regular health provider, and English proficiency. A community-based, culturally appropriate, multilevel HBV screening intervention effectively increases screening rates in a high-risk, hard-to-reach Vietnamese American population.

  6. Automatic Recognition of Chinese Personal Name Using Conditional Random Fields and Knowledge Base

    Directory of Open Access Journals (Sweden)

    Chuan Gu

    2015-01-01

    According to the features of Chinese personal names, we present an approach for Chinese personal name recognition based on conditional random fields (CRF) and a knowledge base. The method builds multiple features of the CRF model by adopting the Chinese character as the processing unit, selects useful features based on a knowledge-base selection algorithm and an incremental feature template, and finally implements the automatic recognition of Chinese personal names in Chinese documents. Experimental results on an open real-world corpus demonstrated the effectiveness of our method, which achieved high accuracy and recall rates of recognition.

  7. Generation of pseudo-random sequences for spread spectrum systems

    Science.gov (United States)

    Moser, R.; Stover, J.

    1985-05-01

    The characteristics of pseudo-random radio signal sequences (PRS) are explored. The randomness of the PRS is a matter of artificially altering the sequence of binary digits broadcast. Autocorrelations of two sequences shifted in time, if high, determine whether the signals are the same and thus allow for position identification. Cross-correlation can also be calculated between sequences. Correlations closest to zero are obtained with a large volume of prime numbers in the sequences. Techniques for selecting optimal and maximal lengths for the sequences are reviewed. If the correlations between sequences are near zero, then signal channels can accommodate multiple users. Finally, Gold codes are discussed as a technique for maximizing code lengths.
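
The autocorrelation behavior described above can be illustrated with a maximal-length (m-)sequence generated by a linear-feedback shift register; the 5-stage tap set below is a known maximal configuration. For any m-sequence of period N mapped to ±1, the periodic autocorrelation is N at zero shift and -1 at every other shift, which is what makes time-shifted copies easy to distinguish:

```python
def lfsr_msequence(taps, nbits):
    """Generate one period of a maximal-length sequence from a Fibonacci LFSR.
    `taps` are 1-indexed register positions XORed into the feedback."""
    state = [1] * nbits
    seq = []
    for _ in range(2 ** nbits - 1):
        seq.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t - 1]
        state = [fb] + state[:-1]
    return seq

def periodic_autocorrelation(bits, shift):
    """Periodic autocorrelation of the +/-1 mapped sequence at a cyclic shift."""
    n = len(bits)
    s = [1 - 2 * b for b in bits]  # map 0 -> +1, 1 -> -1
    return sum(s[i] * s[(i + shift) % n] for i in range(n))

seq = lfsr_msequence(taps=[5, 3], nbits=5)  # period 2**5 - 1 = 31
print(periodic_autocorrelation(seq, 0))   # 31 at zero shift
print(periodic_autocorrelation(seq, 7))   # -1 at any nonzero shift
```

The sharp peak at zero shift against a flat -1 floor is exactly the property exploited for ranging and for separating users on a shared channel.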

  8. What can we learn from the neutron clinical experience for improving ion-beam techniques and high-LET patient selection?

    International Nuclear Information System (INIS)

    Wambersie, A.; Jones, D.T.L.; Gueulette, J.; Gahbauer, R.; DeLuca, P.M.

    2010-01-01

    Historically, improvements in radiotherapy have been mainly due to improvements in physical selectivity: beam penetration, collimation, dosimetry, treatment planning, and advances in imaging. Neutrons were the first high-LET (linear energy transfer) radiation to be used clinically and showed improvement in the differential response of radiation-resistant tumors and normal tissues. The benefits of fast neutrons (and other forms of high-LET radiations) are due to their biological effects: a reduction of the OER, a reduction in the differential cell radiosensitivity related to position in the mitotic cycle, and a reduction in cellular repair capacity (thus less importance of fractionation). The poor physical selectivity of the early neutron therapy beams introduced a systematic bias in comparison with the photon treatments and created a negative perception of neutron therapy. However, significant improvements in neutron therapy equipment resulted in a physical selectivity similar to modern MV photon therapy. The tumor types or sites where the best therapeutic results were obtained included inoperable or recurrent salivary gland tumors, locally extended prostatic adenocarcinomas, and slowly growing well-differentiated sarcomas. The benefit of neutrons for some other well-defined groups of patients was demonstrated in randomized trials. It was estimated that about 20% of all radiotherapy patients could benefit from fast neutrons (if neutrons are delivered under satisfactory physical conditions). An important issue for fast neutron therapy is the selection of the types of patients who could most benefit from high-LET radiations. The same issue is raised today with other high-LET radiations (e.g., 12 C ions). It is reasonable to assume that the same types of patients would benefit from 12 C irradiation. Of course the better physical selectivity of ion beams enhances the treatment possibilities, but this is true for both high-LET and low-LET radiations.

  9. Somatic mitochondrial DNA mutations in cancer escape purifying selection and high pathogenicity mutations lead to the oncocytic phenotype: pathogenicity analysis of reported somatic mtDNA mutations in tumors

    International Nuclear Information System (INIS)

    Pereira, Luísa; Soares, Pedro; Máximo, Valdemar; Samuels, David C

    2012-01-01

    The presence of somatic mitochondrial DNA (mtDNA) mutations in cancer cells has been interpreted in controversial ways, ranging from random neutral accumulation of mutations, to positive selection for high pathogenicity, or conversely to purifying selection against high-pathogenicity variants as occurs at the population level. Here we evaluated the predicted pathogenicity of somatic mtDNA mutations described in cancer and compared these to the distribution of variations observed in the global human population and all possible protein variations that could occur in human mtDNA. We focus on oncocytic tumors, which are clearly associated with mitochondrial dysfunction. The protein variant pathogenicity was predicted using two computational methods, MutPred and SNPs&GO. The pathogenicity scores of the somatic mtDNA variants were significantly higher in oncocytic tumors compared to non-oncocytic tumors. Variations in subunits of Complex I of the electron transfer chain were significantly more common in tumors with the oncocytic phenotype, while variations in Complex V subunits were significantly more common in non-oncocytic tumors. Our results show that the somatic mtDNA mutations reported over all tumors are indistinguishable from a random selection from the set of all possible amino acid variations, and have therefore escaped the effects of purifying selection that act strongly at the population level. We show that the pathogenicity of somatic mtDNA mutations is a determining factor for the oncocytic phenotype. The opposite associations of the Complex I and Complex V variants with the oncocytic and non-oncocytic tumors imply that low mitochondrial membrane potential may play an important role in determining the oncocytic phenotype.

  10. Pion radiation for high grade astrocytoma: results of a randomized study

    International Nuclear Information System (INIS)

    Pickles, Tom; Goodman, George B.; Rheaume, Dorianne E.; Duncan, Graeme G.; Fryer, Chris J.; Bhimji, Shamim; Ludgate, Charles; Syndikus, Isabel; Graham, Peter; Dimitrov, Mario; Bowen, Julie

    1997-01-01

    Purpose: This study attempted to compare within a randomized study the outcome of pion radiation therapy vs. conventional photon irradiation for the treatment of high-grade astrocytomas. Methods and Materials: Eighty-four patients were randomized to pion therapy (33-34.5 Gyπ), or conventional photon irradiation (60 Gy). Entry criteria included astrocytoma (modified Kernohan high Grade 3 or Grade 4), age 18-70, Karnofsky performance status (KPS) ≥50, ability to start irradiation within 30 days of surgery, unifocal tumor, and treatment volume < 850 cc. The high-dose volume in both arms was computed tomography enhancement plus a 2-cm margin. The study was designed with the power to detect a twofold difference between arms. Results: Eighty-one eligible patients were equally balanced for all known prognostic variables. Pion patients started radiation 7 days earlier on average than photon patients, but other treatment-related variables did not differ. There were no significant differences for either early or late radiation toxicity between treatment arms. Actuarial survival analysis shows no differences in terms of time to local recurrence or overall survival where median survival was 10 months in both arms (p = 0.22). The physician-assessed KPS and patient-assessed quality of life (QOL) measurements were generally maintained within 10 percentage points until shortly before tumor recurrence. There was no apparent difference in the serial KPS or QOL scores between treatment arms. Conclusion: In contrast to high linear energy transfer (LET) therapy for central nervous system tumors, such as neutron or neon therapy, the safety of pion therapy, which is of intermediate LET, has been reaffirmed. However, this study has demonstrated no therapeutic gain for pion therapy of glioblastoma

  11. Selection of Highly Expressed Gene Variants in Escherichia coli Using Translationally Coupled Antibiotic Selection Markers

    DEFF Research Database (Denmark)

    Rennig, Maja; Daley, Daniel O.; Nørholm, Morten H. H.

    2018-01-01

    Strategies to select highly expressed variants of a protein coding sequence are usually based on trial-and-error approaches, which are time-consuming and expensive. We address this problem using translationally coupled antibiotic resistance markers. The system requires that the target gene can...

  12. vuv fluorescence from selective high-order multiphoton excitation of N2

    International Nuclear Information System (INIS)

    Coffee, Ryan N.; Gibson, George N.

    2004-01-01

    Recent fluorescence studies suggest that ultrashort pulse laser excitation may be highly selective. Selective high-intensity laser excitation holds important consequences for the physics of multiphoton processes. To establish the extent of this selectivity, we performed a detailed comparative study of the vacuum ultraviolet fluorescence resulting from the interaction of N 2 and Ar with high-intensity infrared ultrashort laser pulses. Both N 2 and Ar reveal two classes of transitions, inner-valence and ns′l′. From their pressure dependence, we associate each transition with either plasma or direct laser excitation. Furthermore, we qualitatively confirm such associations with the time dependence of the fluorescence signal. Remarkably, only N 2 presents evidence of direct laser excitation. This direct excitation produces ionic nitrogen fragments with inner-valence (2s) holes, two unidentified transitions, and one molecular transition, the N 2 + X 2 Σ g + – 2 Σ u + . We discuss these results in the light of a recently proposed model for multiphoton excitation.

  13. A Feature Subset Selection Method Based On High-Dimensional Mutual Information

    Directory of Open Access Journals (Sweden)

    Chee Keong Kwoh

    2011-04-01

    Feature selection is an important step in building accurate classifiers and provides a better understanding of the data sets. In this paper, we propose a feature subset selection method based on high-dimensional mutual information. We also propose to use the entropy of the class attribute as a criterion to determine the appropriate subset of features when building classifiers. We prove that if the mutual information between a feature set X and the class attribute Y equals the entropy of Y, then X is a Markov blanket of Y. We show that in some cases it is infeasible to approximate the high-dimensional mutual information with algebraic combinations of pairwise mutual information in any form. In addition, an exhaustive search of all combinations of features is a prerequisite for finding the optimal feature subsets for classifying these kinds of data sets. We show that our approach outperforms existing filter feature subset selection methods for most of the 24 selected benchmark data sets.
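
The abstract's two central claims, the Markov-blanket criterion I(X;Y) = H(Y) and the failure of pairwise approximations, can both be seen on a toy XOR data set (invented for illustration):

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy (bits) of a discrete variable given as a list."""
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in Counter(values).values())

def mutual_information(features, y):
    """I(X; Y) = H(X) + H(Y) - H(X, Y) for a (possibly tuple-valued) X."""
    joint = list(zip(features, y))
    return entropy(features) + entropy(y) - entropy(joint)

# Invented data: y is the XOR of f1 and f2, so neither feature alone is
# informative, but the pair {f1, f2} satisfies I(X; Y) = H(Y).
f1 = [0, 0, 1, 1]
f2 = [0, 1, 0, 1]
y = [a ^ b for a, b in zip(f1, f2)]

pair = list(zip(f1, f2))
print(mutual_information(f1, y))    # 0.0 bits: f1 alone tells nothing
print(mutual_information(pair, y))  # 1.0 bits = H(y): the pair is a Markov blanket
```

Because every pairwise mutual information with y is zero while the joint value attains H(y), no algebraic combination of pairwise scores recovers the high-dimensional quantity here, which is exactly the failure mode the abstract describes.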

  14. Microfluidic sensor for ultra high redox cycling amplification for highly selective electrochemical measurements

    NARCIS (Netherlands)

    Odijk, Mathieu; Straver, Martin; Olthuis, Wouter; van den Berg, Albert

    2011-01-01

    In this contribution a SU8/glass-based microfluidic sensor is described with two closely spaced parallel electrodes for highly selective measurements using the redox cycling (RC) effect. Using this sensor, a RC amplification of ~2000x is measured using the ferrocyanide redox couple, which is much

  15. Quantum random flip-flop and its applications in random frequency synthesis and true random number generation

    Energy Technology Data Exchange (ETDEWEB)

    Stipčević, Mario, E-mail: mario.stipcevic@irb.hr [Photonics and Quantum Optics Research Unit, Center of Excellence for Advanced Materials and Sensing Devices, Ruđer Bošković Institute, Bijenička 54, 10000 Zagreb (Croatia)

    2016-03-15

    In this work, a new type of elementary logic circuit, named random flip-flop (RFF), is proposed, experimentally realized, and studied. Unlike conventional Boolean logic circuits whose action is deterministic and highly reproducible, the action of a RFF is intentionally made maximally unpredictable and, in the proposed realization, derived from a fundamentally random process of emission and detection of light quanta. We demonstrate novel applications of RFF in randomness preserving frequency division, random frequency synthesis, and random number generation. Possible usages of these applications in the information and communication technology, cryptographic hardware, and testing equipment are discussed.
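The frequency-division application can be illustrated with a toy software model (an assumption on our part, not the paper's optical circuit): a flip-flop that toggles on each clock edge with probability 1/2 halves the input frequency on average while making the output edge timing unpredictable.

```python
# Toy random flip-flop (RFF): each input clock edge toggles the output Q with
# probability 1/2. In hardware the decision comes from detection of light
# quanta; here a seeded PRNG stands in for that quantum process.
import random

def rff_divide(n_edges, seed=42):
    rng = random.Random(seed)
    q, toggles = 0, 0
    for _ in range(n_edges):
        if rng.random() < 0.5:  # quantum-random decision in the real device
            q ^= 1
            toggles += 1
    return q, toggles
```

Over many edges the toggle rate converges to 1/2, so the mean output frequency is half the input frequency, yet no individual output edge is predictable.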

  16. Review of Mid- to High-Temperature Solar Selective Absorber Materials

    Energy Technology Data Exchange (ETDEWEB)

    Kennedy, C. E.

    2002-07-01

    This report describes concentrating solar power (CSP) systems, which use solar absorbers to convert concentrated sunlight to thermal electric power. Efficient photothermal conversion requires absorber surfaces with a high solar absorptance (α) for solar radiation and a low thermal emittance (ε) at the operational temperature. Spectrally selective surfaces are characterized by a low reflectance (ρ ≈ 0) at wavelengths λ < 3 μm and a high reflectance (ρ ≈ 1) at λ > 3 μm. The operational temperature ranges of these materials for solar applications can be categorized as low temperature (T < 100 °C), mid-temperature (100 °C < T < 400 °C), and high temperature (T > 400 °C). Mid- and high-temperature applications are needed for CSP. For CSP applications, the ideal spectrally selective surface would be low-cost and easy to manufacture, chemically and thermally stable in air at elevated operating temperatures (T ≥ 500 °C), and have a solar absorptance ≥ 0.98 and a thermal emittance ≤ 0.05 at 500 °C.

  17. Very high performance pseudo-random number generation on DAP

    Science.gov (United States)

    Smith, K. A.; Reddaway, S. F.; Scott, D. M.

    1985-07-01

    Since the National DAP Service began at QMC in 1980, extensive use has been made of pseudo-random numbers in Monte Carlo simulation. Matrices of uniform numbers have been produced by various generators: (a) multiplicative (x_{n+1} = 13^13 x_n mod 2^59); (b) very long period shift register (x^4423 + x^271 + 1); (c) multiple shorter period (x^127 + x^7 + 1) shift registers generating several matrices per iteration. The above uniform generators can also feed a normal distribution generator that uses the Box-Muller transformation. This paper briefly describes the generators, their implementation, and their speed. Generator (b) has been greatly speeded up by re-implementation, and now produces more than 100 × 10^6 high-quality 16-bit numbers/s. Generator (c) is under development and will achieve even higher performance, mainly due to producing data in greater bulk. High-quality numbers are expected, and performance will range from 400 to 800 × 10^6 numbers/s, depending on how the generator is used.
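The recurrences quoted in the abstract translate directly into a software sketch. This plain Python version (our own, nowhere near DAP speeds) shows generator (a) and the trinomial shift-register family behind (b) and (c):

```python
# (a) Multiplicative congruential generator x_{n+1} = 13^13 * x_n mod 2^59.
# The seed should be odd for the full period of a power-of-two modulus.
MULT = 13 ** 13
MOD = 2 ** 59

def lcg(seed=1, n=5):
    x, out = seed, []
    for _ in range(n):
        x = (MULT * x) % MOD
        out.append(x / MOD)  # uniform deviate in [0, 1)
    return out

# (b)/(c) Fibonacci-style shift register for the trinomial x^p + x^q + 1,
# i.e. the bit recurrence b_n = b_{n-p} XOR b_{n-q}.
def lfsr(p=127, q=7, seed=1, n=16):
    bits = [(seed >> i) & 1 for i in range(p)]  # initial register contents
    out = []
    for _ in range(n):
        b = bits[-p] ^ bits[-q]
        bits.append(b)
        out.append(b)
    return out
```

Uniform deviates from either stream can feed a Box-Muller step to obtain normal deviates, as the abstract notes.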

  18. Polybenzimidazole-based mixed membranes with exceptional high water vapor permeability and selectivity

    KAUST Repository

    Akhtar, Faheem Hassan

    2017-09-13

    Polybenzimidazole (PBI), a thermally and chemically stable polymer, is commonly used to fabricate membranes for applications like hydrogen recovery at temperatures of more than 300 °C, fuel cells working in a highly acidic environment, and nanofiltration in aggressive solvents. This report shows for the first time the use of PBI dense membranes for water vapor/gas separation applications. They showed an excellent selectivity and high water vapor permeability. Incorporation of inorganic hydrophilic titanium-based nano-fillers into the PBI matrix further increased the water vapor permeability and water vapor/N2 selectivity. The most selective mixed matrix membrane, with 0.5 wt% loading of TiO2 nanotubes, yielded a water vapor permeability of 6.8×10^4 Barrer and a H2O/N2 selectivity of 3.9×10^6. The most permeable membrane, with 1 wt% loading of carboxylated TiO2 nanoparticles, had a 7.1×10^4 Barrer water vapor permeability and a H2O/N2 selectivity of 3.1×10^6. The performance of these membranes in terms of water vapor transport and selectivity is among the highest reported. The remarkable ability of PBI to efficiently permeate water versus other gases opens the possibility of fabricating membranes for dehumidification of streams in harsh environments, including the removal of water from high-temperature reaction mixtures to shift the equilibrium towards products.

  19. Polybenzimidazole-based mixed membranes with exceptional high water vapor permeability and selectivity

    KAUST Repository

    Akhtar, Faheem Hassan; Kumar, Mahendra; Villalobos, Luis Francisco; Shevate, Rahul; Vovusha, Hakkim; Schwingenschlögl, Udo; Peinemann, Klaus-Viktor

    2017-01-01

    Polybenzimidazole (PBI), a thermally and chemically stable polymer, is commonly used to fabricate membranes for applications like hydrogen recovery at temperatures of more than 300 °C, fuel cells working in a highly acidic environment, and nanofiltration in aggressive solvents. This report shows for the first time the use of PBI dense membranes for water vapor/gas separation applications. They showed an excellent selectivity and high water vapor permeability. Incorporation of inorganic hydrophilic titanium-based nano-fillers into the PBI matrix further increased the water vapor permeability and water vapor/N2 selectivity. The most selective mixed matrix membrane, with 0.5 wt% loading of TiO2 nanotubes, yielded a water vapor permeability of 6.8×10^4 Barrer and a H2O/N2 selectivity of 3.9×10^6. The most permeable membrane, with 1 wt% loading of carboxylated TiO2 nanoparticles, had a 7.1×10^4 Barrer water vapor permeability and a H2O/N2 selectivity of 3.1×10^6. The performance of these membranes in terms of water vapor transport and selectivity is among the highest reported. The remarkable ability of PBI to efficiently permeate water versus other gases opens the possibility of fabricating membranes for dehumidification of streams in harsh environments, including the removal of water from high-temperature reaction mixtures to shift the equilibrium towards products.

  20. Pure-Phase Selective Excitation in Fast-Relaxing Systems

    Science.gov (United States)

    Zangger, Klaus; Oberer, Monika; Sterk, Heinz

    2001-09-01

    Selective pulses have been used frequently for small molecules. However, their application to proteins and other macromolecules has been limited. The long duration of shaped selective pulses and the short T2 relaxation times in proteins often prohibited the use of highly selective pulses, especially on larger biomolecules. A very selective excitation can be obtained within a short time by using the selective excitation sequence presented in this paper. Instead of using a shaped low-intensity radiofrequency pulse, a cluster of hard 90° pulses, delays of free precession, and pulsed field gradients can be used to selectively excite a narrow chemical shift range within a relatively short time. Off-resonance magnetization, which is allowed to evolve freely during the free precession intervals, is thereby destroyed by the gradient pulses. Off-resonance excitation artifacts can be removed by random variation of the interpulse delays. This leads to an excitation profile with selectivity as well as phase and relaxation behavior superior to that of commonly used shaped selective pulses. Since the evolution of scalar coupling is inherently suppressed during the double-selective excitation of two different scalar-coupled nuclei, the presented pulse cluster is especially suited for simultaneous highly selective excitation of N-H and C-H fragments. Experimental examples are demonstrated on hen egg white lysozyme (14 kDa) and the bacterial antidote ParD (19 kDa).

  1. Using Random Numbers in Science Research Activities.

    Science.gov (United States)

    Schlenker, Richard M.; And Others

    1996-01-01

    Discusses the importance of science process skills and describes ways to select sets of random numbers for selection of subjects for a research study in an unbiased manner. Presents an activity appropriate for grades 5-12. (JRH)

  2. Low or High Fractionation Dose β-Radiotherapy for Pterygium? A Randomized Clinical Trial

    Energy Technology Data Exchange (ETDEWEB)

    Viani, Gustavo Arruda, E-mail: gusviani@gmail.com [Department of Radiation Oncology, Marilia Medicine School, Sao Paulo, SP (Brazil); De Fendi, Ligia Issa; Fonseca, Ellen Carrara [Department of Ophthalmology, Marilia Medicine School, Sao Paulo, SP (Brazil); Stefano, Eduardo Jose [Department of Radiation Oncology, Marilia Medicine School, Sao Paulo, SP (Brazil)

    2012-02-01

    Purpose: Postoperative adjuvant treatment using β-radiotherapy (RT) is a proven technique for reducing the recurrence of pterygium. A randomized trial was conducted to determine whether a low fractionation dose of 2 Gy within 10 fractions would provide local control similar to that after a high fractionation dose of 5 Gy within 7 fractions for surgically resected pterygium. Methods: A randomized trial was conducted in 200 patients (216 pterygia) between February 2006 and July 2007. Only patients with fresh pterygium resected using a bare sclera method and given RT within 3 days were included. Postoperative RT was delivered using a strontium-90 eye applicator. The pterygia were randomly treated using either 5 Gy within 7 fractions (Group 1) or 2 Gy within 10 fractions (Group 2). The local control rate was calculated from the date of surgery. Results: Of the 216 pterygia included, 112 were allocated to Group 1 and 104 to Group 2. The 3-year local control rate for Groups 1 and 2 was 93.8% and 92.3%, respectively (p = .616). A statistically significant difference for cosmetic effect (p = .034), photophobia (p = .02), irritation (p = .001), and scleromalacia (p = .017) was noted in favor of Group 2. Conclusions: No better local control rate for postoperative pterygium was obtained using high-dose fractionation vs. low-dose fractionation. However, a low-dose fractionation schedule produced better cosmetic effects and resulted in fewer symptoms than high-dose fractionation. Moreover, pterygia can be safely treated in terms of local recurrence using RT schedules with a biologically effective dose of 24-52.5 Gy_10.
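The dose range quoted in the conclusions follows from the standard linear-quadratic biologically effective dose, BED = n·d·(1 + d/(α/β)), evaluated at α/β = 10 Gy (hence the Gy_10 notation). A quick check (the function name is ours):

```python
# Linear-quadratic biologically effective dose for n fractions of d Gy each.
def bed(n_fractions, dose_per_fraction, alpha_beta=10.0):
    return n_fractions * dose_per_fraction * (1 + dose_per_fraction / alpha_beta)

print(bed(7, 5))   # Group 1: 7 x 5 Gy  -> 52.5 Gy_10
print(bed(10, 2))  # Group 2: 10 x 2 Gy -> 24.0 Gy_10
```

The two trial arms thus bracket exactly the 24-52.5 Gy_10 range cited in the abstract.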

  3. Is it beneficial to selectively boost high-risk tumor subvolumes? A comparison of selectively boosting high-risk tumor subvolumes versus homogeneous dose escalation of the entire tumor based on equivalent EUD plans

    International Nuclear Information System (INIS)

    Kim, Yusung; Tomé, Wolfgang A.

    2008-01-01

    Purpose. To quantify and compare expected local tumor control and expected normal tissue toxicities between selective boosting IMRT and homogeneous dose escalation IMRT for the case of prostate cancer. Methods. Four different selective boosting scenarios and three different high-risk tumor subvolume geometries were designed to compare selective boosting and homogeneous dose escalation IMRT plans delivering the same equivalent uniform dose (EUD) to the entire PTV. For each scenario, differences in tumor control probability between both boosting strategies were calculated for the high-risk tumor subvolume and the remaining low-risk PTV, and were visualized using voxel-based iso-TCP maps. Differences in expected rectal and bladder complications were quantified using radiobiological indices (generalized EUD (gEUD) and normal tissue complication probability (NTCP)) as well as %-volumes. Results. For all investigated scenarios and high-risk tumor subvolume geometries, selective boosting IMRT improves expected TCP compared to homogeneous dose escalation IMRT, especially when lack of control of the high-risk tumor subvolume could be the cause of tumor recurrence. By employing selective boosting IMRT, significant increases in expected TCP can be achieved for the high-risk tumor subvolumes. The three conventional selective boosting IMRT strategies, employing physical dose objectives, did not show significant improvement in rectal and bladder sparing as compared to their counterpart homogeneous dose escalation plans. However, risk-adaptive optimization, utilizing radiobiological objective functions, resulted in a reduction in NTCP for the rectum when compared to its corresponding homogeneous dose escalation plan. Conclusions. Selective boosting is a more effective method than homogeneous dose escalation for achieving optimal treatment outcomes. Furthermore, risk-adaptive optimization increases the therapeutic ratio as compared to conventional selective boosting IMRT.
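The generalized EUD used above as a radiobiological index has the closed form gEUD = (Σᵢ vᵢ Dᵢᵃ)^(1/a) over fractional-volume dose bins. A minimal sketch (the function and toy dose bins are ours, not from the paper):

```python
# Generalized equivalent uniform dose over fractional-volume dose bins.
# a = 1 reduces to the mean dose; a large positive a approaches the maximum
# dose, which models serial organs at risk such as the rectum.
def geud(doses, volumes, a):
    assert abs(sum(volumes) - 1.0) < 1e-9, "fractional volumes must sum to 1"
    return sum(v * d ** a for d, v in zip(doses, volumes)) ** (1.0 / a)
```

For a toy two-bin distribution, `geud([60, 70], [0.5, 0.5], 1)` returns the mean dose 65 Gy, while a = 20 pulls the value toward the 70 Gy hot spot, mimicking serial-organ behavior.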

  4. Take a look at the bright side: Effects of positive body exposure on selective visual attention in women with high body dissatisfaction.

    Science.gov (United States)

    Glashouwer, Klaske A; Jonker, Nienke C; Thomassen, Karen; de Jong, Peter J

    2016-08-01

    Women with high body dissatisfaction look less at their 'beautiful' body parts than their 'ugly' body parts. This study tested the robustness of this selective viewing pattern and examined the influence of positive body exposure on body-dissatisfied women's attention for 'ugly' and 'beautiful' body parts. In women with high body dissatisfaction (N = 28) and women with low body dissatisfaction (N = 14) eye-tracking was used to assess visual attention towards pictures of their own and other women's bodies. Participants with high body dissatisfaction were randomly assigned to 5 weeks positive body exposure (n = 15) or a no-treatment condition (n = 13). Attention bias was assessed again after 5 weeks. Body-dissatisfied women looked longer at 'ugly' than 'beautiful' body parts of themselves and others, while participants with low body dissatisfaction attended equally long to own/others' 'beautiful' and 'ugly' body parts. Although positive body exposure was very effective in improving participants' body satisfaction, it did not systematically change participants' viewing pattern. The tendency to preferentially allocate attention towards one's 'ugly' body parts seems a robust phenomenon in women with body dissatisfaction. Yet, modifying this selective viewing pattern seems not a prerequisite for successfully improving body satisfaction via positive body exposure. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Four minutes of in-class high-intensity interval activity improves selective attention in 9- to 11-year olds.

    Science.gov (United States)

    Ma, Jasmin K; Le Mare, Lucy; Gurd, Brendon J

    2015-03-01

    The amount of time allocated to physical activity in schools is declining. Time-efficient physical activity solutions that demonstrate their impact on academic achievement-related outcomes are needed to prioritize physical activity within the school curricula. "FUNtervals" are 4-min, high-intensity interval activities that use whole-body actions to complement a storyline. The purpose of this study was to (i) explore whether FUNtervals can improve selective attention, an executive function posited to be essential for learning and academic success; and (ii) examine whether this relationship is predicted by students' classroom off-task behaviour. Seven grade 3-5 classes (n = 88) were exposed to a single-group, repeated cross-over design where each student's selective attention was compared between no-activity and FUNtervals days. In week 1, students were familiarized with the d2 test of attention and FUNterval activities, and baseline off-task behaviour was observed. In both weeks 2 and 3 students completed the d2 test of attention following either a FUNterval break or a no-activity break. The order of these breaks was randomized and counterbalanced between weeks. Neither motor nor passive off-task behaviour predicted changes in selective attention following FUNtervals; however, a weak relationship was observed for verbal off-task behaviour and improvements in d2 test performance. More importantly, students made fewer errors during the d2 test following FUNtervals. In supporting the priority of physical activity inclusion within schools, FUNtervals, a time efficient and easily implemented physical activity break, can improve selective attention in 9- to 11-year olds.

  6. Natural Selection on Genes Related to Cardiovascular Health in High-Altitude Adapted Andeans.

    Science.gov (United States)

    Crawford, Jacob E; Amaru, Ricardo; Song, Jihyun; Julian, Colleen G; Racimo, Fernando; Cheng, Jade Yu; Guo, Xiuqing; Yao, Jie; Ambale-Venkatesh, Bharath; Lima, João A; Rotter, Jerome I; Stehlik, Josef; Moore, Lorna G; Prchal, Josef T; Nielsen, Rasmus

    2017-11-02

    The increase in red blood cell mass (polycythemia) due to the reduced oxygen availability (hypoxia) of residence at high altitude or other conditions is generally thought to be beneficial in terms of increasing tissue oxygen supply. However, the extreme polycythemia and accompanying increased mortality due to heart failure in chronic mountain sickness most likely reduces fitness. Tibetan highlanders have adapted to high altitude, possibly in part via the selection of genetic variants associated with reduced polycythemic response to hypoxia. In contrast, high-altitude-adapted Quechua- and Aymara-speaking inhabitants of the Andean Altiplano are not protected from high-altitude polycythemia in the same way, yet they exhibit other adaptive features for which the genetic underpinnings remain obscure. Here, we used whole-genome sequencing to scan high-altitude Andeans for signals of selection. The genes showing the strongest evidence of selection-including BRINP3, NOS2, and TBX5-are associated with cardiovascular development and function but are not in the response-to-hypoxia pathway. Using association mapping, we demonstrated that the haplotypes under selection are associated with phenotypic variations related to cardiovascular health. We hypothesize that selection in response to hypoxia in Andeans could have vascular effects and could serve to mitigate the deleterious effects of polycythemia rather than reduce polycythemia itself. Copyright © 2017. Published by Elsevier Inc.

  7. High throughput route selection in multi-rate wireless mesh networks

    Institute of Scientific and Technical Information of China (English)

    WEI Yi-fei; GUO Xiang-li; SONG Mei; SONG Jun-de

    2008-01-01

    Most existing Ad-hoc routing protocols use the shortest path algorithm with a hop count metric to select paths. It is appropriate in single-rate wireless networks, but has a tendency to select paths containing long-distance links that have low data rates and reduced reliability in multi-rate networks. This article introduces a high throughput routing algorithm utilizing the multi-rate capability and some mesh characteristics in wireless fidelity (WiFi) mesh networks. It uses the medium access control (MAC) transmission time as the routing metric, which is estimated by the information passed up from the physical layer. When the proposed algorithm is adopted, the Ad-hoc on-demand distance vector (AODV) routing can be improved as high throughput AODV (HT-AODV). Simulation results show that HT-AODV is capable of establishing a route that has high data-rate, short end-to-end delay and great network throughput.
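The route-selection idea above, minimizing summed per-hop transmission time rather than hop count, can be sketched with Dijkstra's algorithm on a toy multi-rate mesh. The topology, packet size, and function names are assumptions for illustration, not HT-AODV itself:

```python
# Shortest path by estimated transmission time (packet_bits / link_rate) on a
# directed multi-rate mesh given as {node: [(neighbor, rate_bps), ...]}.
import heapq

def best_path(links, src, dst, packet_bits=8000):
    dist, prev, pq = {src: 0.0}, {}, [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, rate in links.get(u, []):
            nd = d + packet_bits / rate  # per-hop transmission time
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[dst]
```

With one slow 1 Mbps direct link versus two fast 54 Mbps hops, hop count would choose the direct link, but the transmission-time metric correctly prefers the two-hop route.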

  8. Highly selective enrichment of phosphorylated peptides using titanium dioxide

    DEFF Research Database (Denmark)

    Thingholm, Tine; Jørgensen, Thomas J D; Jensen, Ole N

    2006-01-01

    We describe a protocol for selective phosphopeptide enrichment using titanium dioxide (TiO2) chromatography. The selectivity toward phosphopeptides is obtained by loading the sample in a 2,5-dihydroxybenzoic acid (DHB) or phthalic acid solution containing acetonitrile and trifluoroacetic acid (TFA) onto a TiO2 microcolumn. Although phosphopeptide enrichment can be achieved by using TFA and acetonitrile alone, the selectivity is dramatically enhanced by adding DHB or phthalic acid since these compounds, in conjunction with the low pH caused by TFA, prevent binding of nonphosphorylated peptides to TiO2. Using an alkaline solution (pH ≥ 10.5), both monophosphorylated and multiphosphorylated peptides are eluted from the TiO2 beads. This highly efficient method for purification of phosphopeptides is well suited for the characterization of phosphoproteins from both in vitro and in vivo studies in combination with mass spectrometry.

  9. [Employees in high-reliability organizations: systematic selection of personnel as a final criterion].

    Science.gov (United States)

    Oubaid, V; Anheuser, P

    2014-05-01

    Employees represent an important safety factor in high-reliability organizations. The combination of clear organizational structures, a nonpunitive safety culture, and psychological personnel selection guarantee a high level of safety. The cockpit personnel selection process of a major German airline is presented in order to demonstrate a possible transferability into medicine and urology.

  10. Theory and implementation of a very high throughput true random number generator in field programmable gate array

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yonggang, E-mail: wangyg@ustc.edu.cn; Hui, Cong; Liu, Chong; Xu, Chao [Department of Modern Physics, University of Science and Technology of China, Hefei 230026 (China)

    2016-04-15

    The contribution of this paper is proposing a new entropy extraction mechanism based on sampling phase jitter in ring oscillators to make a high throughput true random number generator in a field programmable gate array (FPGA) practical. Starting from experimental observation and analysis of the entropy source in FPGA, a multi-phase sampling method is exploited to harvest the clock jitter with a maximum entropy and fast sampling speed. This parametrized design is implemented in a Xilinx Artix-7 FPGA, where the carry chains in the FPGA are explored to realize the precise phase shifting. The generator circuit is simple and resource-saving, so that multiple generation channels can run in parallel to scale the output throughput for specific applications. The prototype integrates 64 circuit units in the FPGA to provide a total output throughput of 7.68 Gbps, which meets the requirement of current high-speed quantum key distribution systems. The randomness evaluation, as well as its robustness to ambient temperature, confirms that the new method in a purely digital fashion can provide high-speed high-quality random bit sequences for a variety of embedded applications.
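A toy model of the entropy source described above (our assumption, not the paper's carry-chain circuit): bits sampled from a square wave whose period accumulates Gaussian jitter become unpredictable once the accumulated jitter spans a period.

```python
# Sample a jittery square wave: accumulated phase jitter makes the sampled
# level near an edge unpredictable, which is what a jitter-based TRNG harvests.
import random

def jitter_bits(n, period=1.0, jitter=0.3, seed=1):
    rng = random.Random(seed)
    t, bits = 0.0, []
    for _ in range(n):
        t += period + rng.gauss(0.0, jitter)  # accumulated timing jitter
        bits.append(int(t * 2) & 1)           # which half-period are we in?
    return bits
```

The paper's throughput figure is consistent with its channel count: 64 parallel generation units at 120 Mbps each give 64 × 0.12 Gbps = 7.68 Gbps.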

  11. Holographic memories with encryption-selectable function

    Science.gov (United States)

    Su, Wei-Chia; Lee, Xuan-Hao

    2006-03-01

    Volume holographic storage has received increasing attention owing to its potential high storage capacity and access rate. Meanwhile, encrypted holographic memory using the random phase encoding technique is attractive to the optical community due to the growing demand for protection of information. In this paper, encryption-selectable holographic storage algorithms in LiNbO3 using angular multiplexing are proposed and demonstrated. Encryption-selectable holographic memory is an advanced concept of security storage for content protection. It offers the flexibility to encrypt the data, or not, optionally during the recording process. In our system design, the choice between encrypted and non-encrypted storage is switched between a random phase pattern and a uniform phase pattern. Based on a 90-degree geometry, the input patterns, for both encrypted and non-encrypted storage, are stored via angular multiplexing with reference plane waves at different incident angles. An image is encrypted optionally by sliding the ground glass into one of the recording waves or removing it in each exposure. The ground glass is the encryption key; it is also the key available to an authorized user to decrypt the encrypted information.

  12. High-flavanol and high-theobromine versus low-flavanol and low-theobromine chocolate to improve uterine artery pulsatility index: a double blind randomized clinical trial.

    Science.gov (United States)

    Bujold, Emmanuel; Leblanc, Vicky; Lavoie-Lebel, Élise; Babar, Asma; Girard, Mario; Poungui, Lionel; Blanchet, Claudine; Marc, Isabelle; Lemieux, Simone; Belkacem, Abdous; Sidi, Elhadji Laouan; Dodin, Sylvie

    2017-09-01

    To evaluate the impact of high-flavanol and high-theobromine (HFHT) chocolate in women at risk of preeclampsia (PE), we conducted a single-center randomized controlled trial including women with singleton pregnancy between 11 and 14 weeks gestation who had bilateral abnormal uterine artery (UtA) waveforms (notching) and elevated pulsatility index (PI). Participants were randomized to either HFHT or low-flavanol and low-theobromine (LFLT) chocolate (30 grams daily for a total of 12 weeks). UtA PI, reported as multiples of the median (MoM) adjusted for gestational age, was assessed at baseline and 12 weeks after randomization. One hundred thirty-one women were randomized, with a mean gestational age of 12.4 ± 0.6 weeks and a mean UtA PI of 1.39 ± 0.31 MoM. UtA PI adjusted for gestational age significantly decreased from baseline to the second visit (12 weeks later) in the two groups. Compared with LFLT chocolate, daily intake of HFHT chocolate was not associated with significant changes of UtA PI. Nevertheless, the improvement observed in both groups suggests that chocolate could improve placental function independently of flavanol and/or theobromine content.

  13. A High-throughput Selection for Cellulase Catalysts Using Chemical Complementation

    Science.gov (United States)

    Peralta-Yahya, Pamela; Carter, Brian T.; Lin, Hening; Tao, Haiyan; Cornish, Virginia W.

    2010-01-01

    Efficient enzymatic hydrolysis of lignocellulosic material remains one of the major bottlenecks to cost-effective conversion of biomass to ethanol. Improvement of glycosylhydrolases however is limited by existing medium-throughput screening technologies. Here, we report the first high-throughput selection for cellulase catalysts. This selection was developed by adapting chemical complementation to provide a growth assay for bond cleavage reactions. First, a URA3 counter selection was adapted to link chemical dimerizer activated gene transcription to cell death. Next, the URA3 counter selection was shown to detect cellulase activity based on cleavage of a tetrasaccharide chemical dimerizer substrate and decrease in expression of the toxic URA3 reporter. Finally, the utility of the cellulase selection was assessed by isolating cellulases with improved activity from a cellulase library created by family DNA shuffling. This application provides further evidence that chemical complementation can be readily adapted to detect different enzymatic activities for important chemical transformations for which no natural selection exists. Due to the large number of enzyme variants selections can test compared to existing medium-throughput screens for cellulases, this assay has the potential to impact the discovery of improved cellulases and other glycosylhydrolases for biomass conversion from libraries of cellulases created by mutagenesis or obtained from natural biodiversity. PMID:19053460

  14. Balancing treatment allocations by clinician or center in randomized trials allows unacceptable levels of treatment prediction.

    Science.gov (United States)

    Hills, Robert K; Gray, Richard; Wheatley, Keith

    2009-08-01

    Randomized controlled trials are the standard method for comparing treatments because they avoid the selection bias that might arise if clinicians were free to choose which treatment a patient would receive. In practice, allocation of treatments in randomized controlled trials is often not wholly random, with various 'pseudo-randomization' methods, such as minimization or balanced blocks, used to ensure good balance between treatments within potentially important prognostic or predictive subgroups. These methods avoid selection bias so long as full concealment of the next treatment allocation is maintained. There is concern, however, that pseudo-random methods may allow clinicians to predict future treatment allocations from previous allocation history, particularly if allocations are balanced by clinician or center. We investigate here to what extent treatment prediction is possible. Using computer simulations of minimization and balanced block randomizations, the success rates of various prediction strategies were investigated for varying numbers of stratification variables, including the patient's clinician. Prediction rates for minimization and balanced block randomization typically exceed 60% when clinician is included as a stratification variable and, under certain circumstances, can exceed 80%. Increasing the number of clinicians and other stratification variables did not greatly reduce the prediction rates. Without clinician as a stratification variable, prediction rates are poor unless few clinicians participate. Prediction rates are unacceptably high when allocations are balanced by clinician or by center. This could easily lead to selection bias that might suggest spurious, or mask real, treatment effects. Unless treatment is blinded, randomization should not be balanced by clinician (or by center), and clinician-center effects should be allowed for instead by retrospectively stratified analyses.
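The prediction problem described above can be reproduced in a small simulation (all parameters are our own invention): pure deterministic minimization balanced within clinician, against a guesser who always picks the currently under-allocated arm for that clinician, with ties broken at random.

```python
# Simulate two-arm minimization stratified by clinician. The observer guesses
# the arm with fewer prior allocations for the incoming patient's clinician.
import random

def simulate(n_patients=1000, n_clinicians=10, seed=0):
    rng = random.Random(seed)
    counts = {c: [0, 0] for c in range(n_clinicians)}  # per-clinician arm tallies
    correct = 0
    for _ in range(n_patients):
        c = rng.randrange(n_clinicians)
        a, b = counts[c]
        guess = 0 if a < b else 1 if b < a else rng.randrange(2)
        arm = 0 if a < b else 1 if b < a else rng.randrange(2)  # minimization; ties random
        correct += (guess == arm)
        counts[c][arm] += 1
    return correct / n_patients
```

In this setup roughly half of each clinician's allocations are forced by the imbalance, so the guesser's success rate lands near 75%, comfortably above the 60% figure quoted in the abstract.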

  15. Evaluation and selection of candidate high-level waste forms

    International Nuclear Information System (INIS)

    1982-03-01

    Seven candidate waste forms being developed under the direction of the Department of Energy's National High-Level Waste (HLW) Technology Program were evaluated as potential media for the immobilization and geologic disposal of high-level nuclear wastes. The evaluation combined preliminary waste form evaluations conducted at DOE defense waste sites and independent laboratories, peer review assessments, a product performance evaluation, and a processability analysis. Based on the combined results of these four inputs, two of the seven forms, borosilicate glass and a titanate-based ceramic, SYNROC, were selected as the reference and alternative forms for continued development and evaluation in the National HLW Program. Both the glass and ceramic forms are viable candidates for use at each of the DOE defense waste sites; they are also potential candidates for immobilization of commercial reprocessing wastes. This report describes the waste form screening process and discusses each of the four major inputs considered in the selection of the two forms.

  16. On dark matter selected high-scale supersymmetry

    Energy Technology Data Exchange (ETDEWEB)

    Zheng, Sibo [Department of Physics, Chongqing University,Chongqing 401331 (China)

    2015-03-11

    The prediction for the Higgs mass in dark matter selected high-scale SUSY is explored. We show the bounds on the SUSY-breaking scale in models of SM+w̃ and SM+h̃/s̃ due to the observed Higgs mass at the LHC. We propose that the effective theory below scale m̃ described by SM+w̃ is possibly realized in gauge mediation with multiple spurion fields that exhibit a significant mass hierarchy, and that the SM+h̃/s̃ case can be realized with a direct singlet-messenger-messenger coupling for singlet Yukawa coupling λ ∼ (v/m̃)^{1/2} g_SM. Finally, the constraint on high-scale SUSY is investigated in the light of inflation physics if these two subjects are directly related.

  17. Strategic project selection based on evidential reasoning approach for high-end equipment manufacturing industry

    Directory of Open Access Journals (Sweden)

    Lu Guangyan

    2017-01-01

    Full Text Available. With the rapid development of science and technology, emerging information technologies have significantly changed people's daily lives. In this context, strategic project selection for high-end equipment manufacturing industries faces growing complexity and uncertainty across several criteria. For example, a group of experts rather than a single expert should be invited to select strategic projects for high-end equipment manufacturing industries, and the experts may find it difficult to express their preferences towards different strategic projects due to their limited cognitive capabilities. In order to handle these complexities and uncertainties, a criteria framework for strategic project selection is first constructed based on the characteristics of high-end equipment manufacturing industries, and the evidential reasoning (ER) approach is then introduced in this paper to help experts express their uncertain preferences and to aggregate these preferences to identify an appropriate strategic project. A real case of strategic project selection in a high-speed train manufacturing enterprise is investigated to demonstrate the validity of the ER approach in solving the strategic project selection problem.

  18. Effects of one versus two bouts of moderate intensity physical activity on selective attention during a school morning in Dutch primary schoolchildren: A randomized controlled trial.

    Science.gov (United States)

    Altenburg, Teatske M; Chinapaw, Mai J M; Singh, Amika S

    2016-10-01

    Evidence suggests that physical activity is positively related to several aspects of cognitive functioning in children, among which is selective attention. To date, no information is available on the optimal frequency of physical activity on cognitive functioning in children. The current study examined the acute effects of one and two bouts of moderate-intensity physical activity on children's selective attention. Randomized controlled trial (ISRCTN97975679). Thirty boys and twenty-six girls, aged 10-13 years, were randomly assigned to three conditions: (A) sitting all morning working on simulated school tasks; (B) one 20-min physical activity bout after 90min; and (C) two 20-min physical activity bouts, i.e. at the start and after 90min. Selective attention was assessed at five time points during the morning (i.e. at baseline and after 20, 110, 130 and 220min), using the 'Sky Search' subtest of the 'Test of Selective Attention in Children'. We used GEE analysis to examine differences in Sky Search scores between the three experimental conditions, adjusting for school, baseline scores, self-reported screen time and time spent in sports. Children who performed two 20-min bouts of moderate-intensity physical activity had significantly better Sky Search scores compared to children who performed one physical activity bout or remained seated the whole morning (B=-0.26; 95% CI=[-0.52; -0.00]). Our findings support the importance of repeated physical activity during the school day for beneficial effects on selective attention in children. Copyright © 2015 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  19. Common selective serotonin reuptake inhibitor side effects in older adults associated with genetic polymorphisms in the serotonin transporter and receptors: data from a randomized controlled trial.

    Science.gov (United States)

    Garfield, Lauren D; Dixon, David; Nowotny, Petra; Lotrich, Francis E; Pollock, Bruce G; Kristjansson, Sean D; Doré, Peter M; Lenze, Eric J

    2014-10-01

    Antidepressant side effects are a significant public health issue, associated with poor adherence, premature treatment discontinuation, and, rarely, significant harm. Older adults assume the largest and most serious burden of medication side effects. We investigated the association between antidepressant side effects and genetic variation in the serotonin system in anxious, older adults participating in a randomized, placebo-controlled trial of the selective serotonin reuptake inhibitor (SSRI) escitalopram. Adults (N = 177) aged ≥ 60 years were randomized to active treatment or placebo for 12 weeks. Side effects were assessed using the Udvalg for Kliniske Undersøgelser side-effect rating scale. Genetic polymorphisms were putative functional variants in the promoters of the serotonin transporter and 1A and 2A receptors (5-HTTLPR [L/S + rs25531], HTR1A rs6295, HTR2A rs6311, respectively). Four significant drug-placebo side-effect differences were found: increased duration of sleep, dry mouth, diarrhea, and diminished sexual desire. Analyses using putative high- versus low-transcription genotype groupings revealed six pharmacogenetic effects: greater dry mouth and decreased sexual desire for the low- and high-expressing serotonin transporter genotypes, respectively, and greater diarrhea with the 1A receptor low-transcription genotype. Diminished sexual desire was experienced significantly more by high-expressing genotypes in the serotonin transporter, 1A, or 2A receptors. There was no significant relationship between drug concentration and side effects, nor a mean difference in drug concentration between low- and high-expressing genotypes. Genetic variation in the serotonin system may predict who develops common SSRI side effects and why. More work is needed to further characterize this genetic modulation and to translate research findings into strategies useful for more personalized patient care. Published by Elsevier Inc.

  20. Causal Effects of Single-Sex Schools on College Entrance Exams and College Attendance: Random Assignment in Seoul High Schools

    Science.gov (United States)

    Park, Hyunjoon; Behrman, Jere R.; Choi, Jaesung

    2012-01-01

    Despite the voluminous literature on the potentials of single-sex schools, there is no consensus on the effects of single-sex schools because of student selection of school types. We exploit a unique feature of schooling in Seoul—the random assignment of students into single-sex versus coeducational high schools—to assess causal effects of single-sex schools on college entrance exam scores and college attendance. Our validation of the random assignment shows comparable socioeconomic backgrounds and prior academic achievement of students attending single-sex schools and coeducational schools, which increases the credibility of our causal estimates of single-sex school effects. The three-level hierarchical model shows that attending all-boys schools or all-girls schools, rather than coeducational schools, is significantly associated with higher average scores on Korean and English test scores. Applying the school district fixed-effects models, we find that single-sex schools produce a higher percentage of graduates who attended four-year colleges and a lower percentage of graduates who attended two-year junior colleges than do coeducational schools. The positive effects of single-sex schools remain substantial, even after we take into account various school-level variables, such as teacher quality, the student-teacher ratio, the proportion of students receiving lunch support, and whether the schools are public or private. PMID:23073751

  1. Causal effects of single-sex schools on college entrance exams and college attendance: random assignment in Seoul high schools.

    Science.gov (United States)

    Park, Hyunjoon; Behrman, Jere R; Choi, Jaesung

    2013-04-01

    Despite the voluminous literature on the potentials of single-sex schools, there is no consensus on the effects of single-sex schools because of student selection of school types. We exploit a unique feature of schooling in Seoul-the random assignment of students into single-sex versus coeducational high schools-to assess causal effects of single-sex schools on college entrance exam scores and college attendance. Our validation of the random assignment shows comparable socioeconomic backgrounds and prior academic achievement of students attending single-sex schools and coeducational schools, which increases the credibility of our causal estimates of single-sex school effects. The three-level hierarchical model shows that attending all-boys schools or all-girls schools, rather than coeducational schools, is significantly associated with higher average scores on Korean and English test scores. Applying the school district fixed-effects models, we find that single-sex schools produce a higher percentage of graduates who attended four-year colleges and a lower percentage of graduates who attended two-year junior colleges than do coeducational schools. The positive effects of single-sex schools remain substantial, even after we take into account various school-level variables, such as teacher quality, the student-teacher ratio, the proportion of students receiving lunch support, and whether the schools are public or private.

  2. Robust state estimation for double pantographs with random missing measurements in high-speed railway

    DEFF Research Database (Denmark)

    Lu, Xiaobing; Liu, Zhigang; Wang, Yanbo

    2016-01-01

    Active control of pantograph could be performed to decrease the fluctuation in pantograph-catenary contact force (PCCF) in high-speed railway. However, it is difficult to obtain the states of the pantograph when state feedback control is implemented. And the measurements may randomly miss due...

  3. Randomized trial of low versus high carbon dioxide insufflation pressures in posterior retroperitoneoscopic adrenalectomy.

    Science.gov (United States)

    Fraser, Sheila; Norlén, Olov; Bender, Kyle; Davidson, Joanne; Bajenov, Sonya; Fahey, David; Li, Shawn; Sidhu, Stan; Sywak, Mark

    2018-05-01

    Posterior retroperitoneoscopic adrenalectomy has gained widespread acceptance for the removal of benign adrenal tumors. Higher insufflation pressures using carbon dioxide (CO2) are required, although the ideal starting pressure is unclear. This prospective, randomized, single-blinded study aims to compare physiologic differences between 2 different CO2 insufflation pressures during posterior retroperitoneoscopic adrenalectomy. Participants were randomly assigned to a starting insufflation pressure of 20 mm Hg (low pressure) or 25 mm Hg (high pressure). The primary outcome measure was partial pressure of arterial CO2 at 60 minutes. Secondary outcomes included end-tidal CO2, arterial pH, blood pressure, and peak airway pressure. Breaches of protocol to change insufflation pressure were permitted if required and were recorded. A prospective randomized trial including 31 patients (low pressure: n = 16; high pressure: n = 15) was undertaken. At 60 minutes, the high pressure group had greater mean partial pressure of arterial CO2 (64 vs 50 mm Hg, P = .003) and end-tidal CO2 (54 vs 45 mm Hg, P = .008) and a lesser pH (7.21 vs 7.29, P = .0005). There were no significant differences in base excess, peak airway pressure, operative time, or duration of hospital stay. Clinically indicated protocol breaches were more common in the low pressure than the high pressure group (8 vs 3, P = .03). In posterior retroperitoneoscopic adrenalectomy, greater insufflation pressures are associated with greater partial pressure of arterial CO2 and end-tidal CO2 and a lesser pH at 60 minutes, although the clinical significance of these differences is unclear. Commencing with lesser CO2 insufflation pressures decreases intraoperative acidosis. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Mitochondrial haplotypes are not associated with mice selectively bred for high voluntary wheel running.

    Science.gov (United States)

    Wone, Bernard W M; Yim, Won C; Schutz, Heidi; Meek, Thomas H; Garland, Theodore

    2018-04-04

    Mitochondrial haplotypes have been associated with human and rodent phenotypes, including nonshivering thermogenesis capacity, learning capability, and disease risk. Although the mammalian mitochondrial D-loop is highly polymorphic, D-loops in laboratory mice are identical, and variation occurs elsewhere mainly between nucleotides 9820 and 9830. Part of this region codes for the tRNA-Arg gene and is associated with mitochondrial densities and number of mtDNA copies. We hypothesized that the capacity for high levels of voluntary wheel-running behavior would be associated with mitochondrial haplotype. Here, we analyzed the mtDNA polymorphic region in mice from each of four replicate lines selectively bred for 54 generations for high voluntary wheel running (HR) and from four control lines (Control) randomly bred for 54 generations. Sequencing the polymorphic region revealed a variable number of adenine repeats. Single nucleotide polymorphisms (SNPs) varied from 2 to 3 adenine insertions, resulting in three haplotypes. We found significant genetic differentiations between the HR and Control groups (F_ST = 0.779, p ≤ 0.0001), as well as among the replicate lines of mice within groups (F_SC = 0.757, p ≤ 0.0001). Haplotypes, however, were not strongly associated with voluntary wheel running (revolutions run per day), nor with either body mass or litter size. This system provides a useful experimental model to dissect the physiological processes linking mitochondrial SNPs, genomic SNPs, epigenetics, or nuclear-mitochondrial cross-talk to exercise activity. Copyright © 2018. Published by Elsevier B.V.

  5. Design of highly selective ethanol dehydration nanocatalysts for ethylene production.

    Science.gov (United States)

    Austin, Natalie; Kostetskyy, Pavlo; Mpourmpakis, Giannis

    2018-02-22

    Rational design of catalysts for selective conversion of alcohols to olefins is key since product selectivity remains an issue due to competing etherification reactions. Using first principles calculations and chemical rules, we designed novel metal-oxide-protected metal nanoclusters (M13X4O12, with M = Cu, Ag, and Au and X = Al, Ga, and In) exhibiting strong Lewis acid sites on their surface, active for the selective formation of olefins from alcohols. These symmetrical nanocatalysts, due to their curvature, show unfavorable etherification chemistries, while favoring the olefin production. Furthermore, we determined that water removal and regeneration of the nanocatalysts is more feasible compared to the equivalent strong acid sites on solid acids used for alcohol dehydration. Our results demonstrate an exceptional stability of these new nanostructures, with the most energetically favorable being Cu-based. Thus, the high selectivity and stability of these in-silico-predicted novel nanoclusters (e.g. Cu13Al4O12) make them attractive catalysts for the selective dehydration of alcohols to olefins.

  6. Phase associations and potential selective extraction methods for selected high-tech metals from ferromanganese nodules and crusts with siderophores

    International Nuclear Information System (INIS)

    Mohwinkel, Dennis; Kleint, Charlotte; Koschinsky, Andrea

    2014-01-01

    Highlights: • Phase associations of metals in marine Fe–Mn nodules and crusts were determined. • Selective leaching experiments with siderophore desferrioxamine B were conducted. • Siderophores selectively mobilize high-tech metals associated with Fe carrier phases. • Base metal liberation including Fe and Mn is limited. • Siderophores have promising potential for application in ore processing industries. - Abstract: Deep-sea ferromanganese deposits contain a wide range of economically important metals. Ferromanganese crusts and nodules represent an important future resource, since they not only contain base metals such as Mn, Ni, Co, Cu and Zn, but are also enriched in critical or rare high-technology elements such as Li, Mo, Nb, W, the rare earth elements and yttrium (REY). These metals could be extracted from nodules and crusts as a by-product to the base metal production. However, there are no proper separation techniques available that selectively extract certain metals out of the carrier phases. By sequential leaching, we demonstrated that, except for Li, which is present in an easily soluble form, all other high-tech metals enriched in ferromanganese nodules and crusts are largely associated with the Fe-oxyhydroxide phases and only to subordinate extents with Mn-oxide phases. Based on this fact, we conducted selective leaching experiments with the Fe-specific organic ligand desferrioxamine-B, a naturally occurring and ubiquitous siderophore. We showed by leaching of ferromanganese nodules and crusts with desferrioxamine-B that a significant and selective extraction of high-tech metals such as Li, Mo, Zr, Hf and Ta is possible, while other elements like Fe and the base metals Mn, Ni, Cu, Co and Zn are not extracted to large extents. The set of selectively extracted elements can be extended to Nb and W if Mn and carbonate phases are stripped from the bulk nodule or crust prior to the siderophore leach by e.g. a sequential leaching technique. This

  7. High-capacity, selective solid sequestrants for innovative chemical separation: Inorganic ion exchange approach

    International Nuclear Information System (INIS)

    Bray, L.

    1995-01-01

    The approach of this task is to develop high-capacity, selective solid inorganic ion exchangers for the recovery of cesium and strontium from nuclear alkaline and acid wastes. To achieve this goal, Pacific Northwest Laboratories (PNL) is collaborating with industry and university participants to develop high-capacity, selective, solid ion exchangers for the removal of specific contaminants from nuclear waste streams.

  8. Randomized algorithms in automatic control and data mining

    CERN Document Server

    Granichin, Oleg; Toledano-Kitai, Dvora

    2015-01-01

    In the fields of data mining and control, the huge amount of unstructured data and the presence of uncertainty in system descriptions have always been critical issues. The book Randomized Algorithms in Automatic Control and Data Mining introduces the readers to the fundamentals of randomized algorithm applications in data mining (especially clustering) and in automatic control synthesis. The methods proposed in this book guarantee that the computational complexity of classical algorithms and the conservativeness of standard robust control techniques will be reduced. It is shown that when a problem requires "brute force" in selecting among options, algorithms based on random selection of alternatives offer good results with certain probability for a restricted time and significantly reduce the volume of operations.
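The "random selection of alternatives" idea summarized above admits a simple back-of-the-envelope guarantee. The sketch below is our own illustration of that bound, not an algorithm from the book: sampling k options uniformly at random lands at least one sample in the top ε fraction with probability 1 − (1 − ε)^k, regardless of how many options exist.

```python
# Hedged sketch: why random selection among options gives good results
# "with certain probability for a restricted time". With k uniform
# random samples, the chance that at least one lies in the top eps
# fraction of ALL options is 1 - (1 - eps)^k, independent of the total
# number of options. (Illustrative bound; names are our own.)
def p_top_fraction(eps: float, k: int) -> float:
    """Probability that the best of k uniform samples is in the top eps."""
    return 1.0 - (1.0 - eps) ** k

# Only 59 random samples are needed to hit the top 5% of all options
# with roughly 95% probability:
p = p_top_fraction(0.05, 59)  # ~0.95
```

Note that the required sample count depends only on ε and the target confidence, which is the source of the complexity reduction claimed for randomized algorithms.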

  9. The basic science and mathematics of random mutation and natural selection.

    Science.gov (United States)

    Kleinman, Alan

    2014-12-20

    The mutation and natural selection phenomenon can and often does cause the failure of antimicrobial, herbicidal, pesticide, and cancer treatments, which act as selection pressures. This phenomenon operates in a mathematically predictable manner which, when understood, leads to approaches that reduce and prevent the failure of these selection pressures. The mathematical behavior of mutation and selection is derived using the principles of probability theory. The derivation of the equations describing the mutation and selection phenomenon is carried out in the context of an empirical example. Copyright © 2014 John Wiley & Sons, Ltd.
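The kind of probability-theory calculation the paper builds on can be sketched minimally: the chance that at least one of n independent replications carries a particular resistance mutation. The rate and population sizes below are illustrative, not the paper's fitted values.

```python
# Minimal sketch of the mutation-and-selection arithmetic: the
# probability that at least one of n independent replications acquires
# a specific mutation, given per-replication rate mu.
# Numbers are illustrative, not taken from the paper.

def p_at_least_one_mutation(mu: float, n: int) -> float:
    """P(>=1 occurrence in n replications) = 1 - (1 - mu)^n."""
    return 1.0 - (1.0 - mu) ** n

# With a per-site, per-replication rate of ~1e-9, a specific resistance
# mutation is unlikely in a small population but near-certain in a
# large one -- which is why population size drives treatment failure:
p_small = p_at_least_one_mutation(1e-9, 10**6)   # ~0.001
p_large = p_at_least_one_mutation(1e-9, 10**10)  # ~0.99995
```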

  10. Vast Volatility Matrix Estimation using High Frequency Data for Portfolio Selection*

    Science.gov (United States)

    Fan, Jianqing; Li, Yingying; Yu, Ke

    2012-01-01

    Portfolio allocation with gross-exposure constraint is an effective method to increase the efficiency and stability of portfolios selection among a vast pool of assets, as demonstrated in Fan et al. (2011). The required high-dimensional volatility matrix can be estimated by using high frequency financial data. This enables us to better adapt to the local volatilities and local correlations among vast number of assets and to increase significantly the sample size for estimating the volatility matrix. This paper studies the volatility matrix estimation using high-dimensional high-frequency data from the perspective of portfolio selection. Specifically, we propose the use of “pairwise-refresh time” and “all-refresh time” methods based on the concept of “refresh time” proposed by Barndorff-Nielsen et al. (2008) for estimation of vast covariance matrix and compare their merits in the portfolio selection. We establish the concentration inequalities of the estimates, which guarantee desirable properties of the estimated volatility matrix in vast asset allocation with gross exposure constraints. Extensive numerical studies are made via carefully designed simulations. Comparing with the methods based on low frequency daily data, our methods can capture the most recent trend of the time varying volatility and correlation, hence provide more accurate guidance for the portfolio allocation in the next time period. The advantage of using high-frequency data is significant in our simulation and empirical studies, which consist of 50 simulated assets and 30 constituent stocks of Dow Jones Industrial Average index. PMID:23264708
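The "refresh time" synchronization that the pairwise-refresh and all-refresh estimators rest on can be sketched for two asynchronously observed assets. This is only an illustration of the Barndorff-Nielsen et al. (2008) concept; the function name and timestamps are our own.

```python
# Sketch of the "refresh time" idea: given two sorted lists of
# observation timestamps, each refresh time is the first instant at
# which BOTH assets have produced a new observation since the previous
# refresh time. (Illustrative; not the paper's implementation.)

def refresh_times(t1, t2):
    """Return the refresh-time grid for two sorted timestamp lists."""
    out = []
    i = j = 0
    while i < len(t1) and j < len(t2):
        r = max(t1[i], t2[j])  # both assets have observed by time r
        out.append(r)
        # discard all observations up to and including r
        while i < len(t1) and t1[i] <= r:
            i += 1
        while j < len(t2) and t2[j] <= r:
            j += 1
    return out

# Asset 1 ticks at 1, 3, 4, 7; asset 2 at 2, 3, 6. The first refresh
# time is max(1, 2) = 2, then 3, then max(4, 6) = 6:
grid = refresh_times([1, 3, 4, 7], [2, 3, 6])  # [2, 3, 6]
```

Synchronizing only pairs ("pairwise-refresh") rather than all assets at once ("all-refresh") retains more data per covariance entry, which is the trade-off the paper compares.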

  11. Vast Volatility Matrix Estimation using High Frequency Data for Portfolio Selection.

    Science.gov (United States)

    Fan, Jianqing; Li, Yingying; Yu, Ke

    2012-01-01

    Portfolio allocation with gross-exposure constraint is an effective method to increase the efficiency and stability of portfolios selection among a vast pool of assets, as demonstrated in Fan et al. (2011). The required high-dimensional volatility matrix can be estimated by using high frequency financial data. This enables us to better adapt to the local volatilities and local correlations among vast number of assets and to increase significantly the sample size for estimating the volatility matrix. This paper studies the volatility matrix estimation using high-dimensional high-frequency data from the perspective of portfolio selection. Specifically, we propose the use of "pairwise-refresh time" and "all-refresh time" methods based on the concept of "refresh time" proposed by Barndorff-Nielsen et al. (2008) for estimation of vast covariance matrix and compare their merits in the portfolio selection. We establish the concentration inequalities of the estimates, which guarantee desirable properties of the estimated volatility matrix in vast asset allocation with gross exposure constraints. Extensive numerical studies are made via carefully designed simulations. Comparing with the methods based on low frequency daily data, our methods can capture the most recent trend of the time varying volatility and correlation, hence provide more accurate guidance for the portfolio allocation in the next time period. The advantage of using high-frequency data is significant in our simulation and empirical studies, which consist of 50 simulated assets and 30 constituent stocks of Dow Jones Industrial Average index.

  12. RANDOM WALK HYPOTHESIS IN FINANCIAL MARKETS

    Directory of Open Access Journals (Sweden)

    Nicolae-Marius JULA

    2017-05-01

    Full Text Available. The random walk hypothesis states that stock market prices do not follow a predictable trajectory but are simply random. Before attempting to predict a seemingly random set of data, one should test it for randomness, because, despite the power and complexity of the models used, the results otherwise cannot be trusted. There are several methods for testing these hypotheses, and the computational power provided by the R environment makes the researcher's work easier and more cost-effective. The increasing power of computing and the continuous development of econometric tests should give potential investors new tools for selecting commodities and investing in efficient markets.
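One of the simplest randomness checks of the kind the article refers to is a runs test on the signs of price changes. The article works in R; the sketch below uses Python for illustration, and the function name is our own.

```python
# Hedged sketch: Wald-Wolfowitz runs test on the signs of returns.
# Under the random-walk hypothesis, the number of sign runs is
# approximately normal with the mean and variance used below; a large
# |z| suggests the series is not random.
import math

def runs_test_z(returns):
    """z-statistic of the runs test applied to the signs of returns."""
    signs = [r > 0 for r in returns if r != 0]
    n1 = sum(signs)              # positive changes
    n2 = len(signs) - n1         # negative changes
    runs = 1 + sum(1 for a, b in zip(signs, signs[1:]) if a != b)
    mean = 2 * n1 * n2 / (n1 + n2) + 1
    var = (2 * n1 * n2 * (2 * n1 * n2 - n1 - n2)) / (
        (n1 + n2) ** 2 * (n1 + n2 - 1))
    return (runs - mean) / math.sqrt(var)

# A strictly alternating series has far more runs than chance allows,
# so |z| is large and randomness is rejected:
z = runs_test_z([1, -1] * 50)
```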

  13. Maximizing the Diversity of Ensemble Random Forests for Tree Genera Classification Using High Density LiDAR Data

    Directory of Open Access Journals (Sweden)

    Connie Ko

    2016-08-01

    Full Text Available. Recent research into improving the effectiveness of forest inventory management using airborne LiDAR data has focused on developing advanced theories in data analytics. Furthermore, supervised learning as a predictive model for classifying tree genera (and species, where possible) has been gaining popularity in order to minimize this labor-intensive task. However, bottlenecks remain that hinder the immediate adoption of supervised learning methods. With supervised classification, training samples are required for learning the parameters that govern the performance of a classifier, yet the selection of training data is often subjective and the quality of such samples is critically important. For LiDAR scanning in forest environments, the quantification of data quality is somewhat abstract, normally referring to some metric related to the completeness of individual tree crowns; however, this is not an issue that has received much attention in the literature. Intuitively, the choice of training samples of varying quality will affect classification accuracy. In this paper a Diversity Index (DI) is proposed that characterizes the diversity of data quality (Qi) among selected training samples required for constructing a classification model of tree genera. The training sample is diversified in terms of data quality as opposed to the number of samples per class. The diversified training sample allows the classifier to better learn the positive and negative instances and, therefore, has a higher classification accuracy in discriminating the "unknown" class samples from the "known" samples. Our algorithm is implemented within the Random Forests base classifiers with six derived geometric features from LiDAR data. The training sample contains three tree genera (pine, poplar, and maple) and the validation samples contain four labels (pine, poplar, maple, and "unknown"). Classification accuracy improved from 72.8%, when training samples were
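One simple way to quantify how diverse a training set is in terms of per-sample data quality is Shannon entropy over quality bins. The paper's actual Diversity Index (DI) may be defined differently; the sketch below only illustrates the idea of preferring quality-diverse training samples.

```python
# Hedged sketch of a quality-diversity measure for training samples:
# Shannon entropy of binned per-sample quality scores Qi in [0, 1].
# This is an assumed stand-in for the paper's DI, for illustration only.
import math
from collections import Counter

def diversity_index(qualities, n_bins=5):
    """Entropy of the distribution of quality scores across n_bins bins."""
    bins = Counter(min(int(q * n_bins), n_bins - 1) for q in qualities)
    total = len(qualities)
    return -sum((c / total) * math.log(c / total) for c in bins.values())

uniform_quality = [0.9] * 20                   # all samples alike -> 0
mixed_quality = [0.1, 0.3, 0.5, 0.7, 0.9] * 4  # spread over bins -> ln(5)
```

A training set whose samples all share one quality level scores zero; one spread evenly across quality levels maximizes the index, matching the intuition that the classifier should see both complete and incomplete tree crowns.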

  14. Is neutron evaporation from highly excited nuclei a Poisson random process

    International Nuclear Information System (INIS)

    Simbel, M.H.

    1982-01-01

    It is suggested that neutron emission from highly excited nuclei follows a Poisson random process. The continuous variable of the process is the excitation energy excess over the binding energy of the emitted neutrons and the discrete variable is the number of emitted neutrons. Cross sections for (HI,xn) reactions are analyzed using a formula containing a Poisson distribution function. The post- and pre-equilibrium components of the cross section are treated separately. The agreement between the predictions of this formula and the experimental results is very good. (orig.)
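The Poisson picture suggested above can be made concrete: the discrete variable x (number of emitted neutrons) follows a Poisson distribution whose mean grows with the excitation-energy excess. The mean value below is illustrative, not the paper's fitted parameter.

```python
# Sketch of the Poisson model for neutron emission: the probability of
# emitting exactly x neutrons when the process mean is mean_neutrons.
# The mean's dependence on excitation energy is not modeled here.
import math

def p_emit(x: int, mean_neutrons: float) -> float:
    """Poisson probability P(x) = e^(-m) * m^x / x!."""
    return math.exp(-mean_neutrons) * mean_neutrons ** x / math.factorial(x)

# With an assumed mean of 3 emitted neutrons, the probabilities over x
# sum to 1 and the distribution peaks near the mean:
probs = [p_emit(x, 3.0) for x in range(20)]
```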

  15. Supplementary arteriel embolization an option in high-risk ulcer bleeding--a randomized study

    DEFF Research Database (Denmark)

    Laursen, Stig Borbjerg; Hansen, Jane Møller; Andersen, Poul Erik

    2014-01-01

    OBJECTIVE: One of the major challenges in peptic ulcer bleeding (PUB) is rebleeding, which is associated with up to a fivefold increase in mortality. We examined if supplementary transcatheter arterial embolization (STAE) performed after achieved endoscopic hemostasis improves outcome in patients with high-risk ulcers. MATERIAL AND METHODS: The study was designed as a non-blinded, parallel-group, randomized controlled trial and performed in a university hospital setting. Patients admitted with PUB from Forrest Ia - IIb ulcers controlled by endoscopic therapy were randomized (1:1 ratio) to STAE ... of rebleeding, need of hemostatic intervention and mortality. Secondary outcomes were rebleeding, number of blood transfusions received, duration of admission and mortality. RESULTS: A total of 105 patients were included. Of the 49 patients allocated to STAE, 31 underwent successful STAE. There was no difference

  16. The Goodness of Covariance Selection Problem from AUC Bounds

    OpenAIRE

    Khajavi, Navid Tafaghodi; Kuh, Anthony

    2016-01-01

    We conduct a study of graphical models and discuss the quality of model selection approximation by formulating the problem as a detection problem and examining the area under the curve (AUC). We are specifically looking at the model selection problem for jointly Gaussian random vectors. For Gaussian random vectors, this problem simplifies to the covariance selection problem which is widely discussed in literature by Dempster [1]. In this paper, we give the definition for the correlation appro...

  17. Epidermis Microstructure Inspired Graphene Pressure Sensor with Random Distributed Spinosum for High Sensitivity and Large Linearity.

    Science.gov (United States)

    Pang, Yu; Zhang, Kunning; Yang, Zhen; Jiang, Song; Ju, Zhenyi; Li, Yuxing; Wang, Xuefeng; Wang, Danyang; Jian, Muqiang; Zhang, Yingying; Liang, Renrong; Tian, He; Yang, Yi; Ren, Tian-Ling

    2018-03-27

    Recently, wearable pressure sensors have attracted tremendous attention because of their potential applications in monitoring physiological signals for human healthcare. Sensitivity and linearity are the two most essential parameters for pressure sensors. Although various designed micro/nanostructure morphologies have been introduced, the trade-off between sensitivity and linearity has not been well balanced. Human skin, which contains force receptors in a reticular layer, has a high sensitivity even for large external stimuli. Herein, inspired by the skin epidermis with high-performance force sensing, we have proposed a special surface morphology with spinosum microstructure of random distribution via the combination of an abrasive paper template and reduced graphene oxide. The sensitivity of the graphene pressure sensor with random distribution spinosum (RDS) microstructure is as high as 25.1 kPa⁻¹ in a wide linearity range of 0-2.6 kPa. Our pressure sensor exhibits superior comprehensive properties compared with previous surface-modified pressure sensors. According to simulation and mechanism analyses, the spinosum microstructure and random distribution contribute to the high sensitivity and large linearity range, respectively. In addition, the pressure sensor shows promising potential in detecting human physiological signals, such as heartbeat, respiration, phonation, and human motions of a pushup, arm bending, and walking. The wearable pressure sensor array was further used to detect gait states of supination, neutral, and pronation. The RDS microstructure provides an alternative strategy to improve the performance of pressure sensors and extend their potential applications in monitoring human activities.

  18. High-throughput phenotyping and genomic selection: the frontiers of crop breeding converge.

    Science.gov (United States)

    Cabrera-Bosquet, Llorenç; Crossa, José; von Zitzewitz, Jarislav; Serret, María Dolors; Araus, José Luis

    2012-05-01

    Genomic selection (GS) and high-throughput phenotyping have recently been captivating the interest of the crop breeding community from both the public and private sectors world-wide. Both approaches promise to revolutionize the prediction of complex traits, including growth, yield and adaptation to stress. Whereas high-throughput phenotyping may help to improve understanding of crop physiology, the most powerful techniques for high-throughput field phenotyping are empirical rather than analytical, and in that respect comparable to genomic selection. Despite the fact that the two methodological approaches represent the extremes of what is understood as the breeding process (phenotype versus genome), they both consider the targeted traits (e.g. grain yield, growth, phenology, plant adaptation to stress) as a black box instead of dissecting them as a set of secondary traits (i.e. physiological) putatively related to the target trait. Both GS and high-throughput phenotyping have in common their empirical approach, enabling breeders to use genome profiles or phenotypes without understanding the underlying biology. This short review discusses the main aspects of both approaches and focuses on the case of genomic selection of maize flowering traits and on near-infrared spectroscopy (NIRS) and plant spectral reflectance as high-throughput field phenotyping methods for complex traits such as crop growth and yield. © 2012 Institute of Botany, Chinese Academy of Sciences.

  19. A Traceless Selection: Counter-selection System That Allows Efficient Generation of Transposon and CRISPR-modified T-cell Products

    Directory of Open Access Journals (Sweden)

    Riccardo Mezzadra

    2016-01-01

    Full Text Available Recent years have seen major breakthroughs in genome-engineering systems, such as transposon-mediated gene delivery systems and CRISPR-Cas9-mediated genome-editing tools. In these systems, transient expression of auxiliary genes is responsible for permanent genomic modification. For both systems, it would be valuable to select for cells that are likely to undergo stable genome modification. Importantly, in particular for clinical applications of genome-engineered cell products, it will also be important to remove those cells that, due to random vector integration, display unwanted stable expression of the auxiliary gene. Here, we develop a traceless selection system that on the one hand allows efficient enrichment of modified cells, and on the other hand can be used to select against cells that retain expression of the auxiliary gene. The value of this system for producing highly enriched, auxiliary-gene-free cell products is demonstrated.

  20. Effect of tetracycline dose and treatment-mode on selection of resistant coliform bacteria in nursery pigs

    DEFF Research Database (Denmark)

    Græsbøll, Kaare; Damborg, Peter; Mellerup, Anders

    2017-01-01

    This study describes results of a randomized clinical trial investigating the effect of oxytetracycline treatment dose and mode of administration on selection of antibiotic resistant coliform bacteria in fecal samples from nursery pigs. Nursery pigs (pigs of 4-7 weeks of age) were treated...... with oxytetracycline against Lawsonia intracellularis induced diarrhea in five pig herds. Each group was randomly allocated to one of five treatment groups: oral flock treatment with (i) high (20 mg/kg), (ii) medium (10 mg/kg) and (iii) low (5 mg/kg) dosage, (iv) oral-pen-wise (small group) treatment (10 mg...... significant changes in number or proportion of tetracycline resistant coliforms. Selection for tetracycline-resistant coliforms was significantly correlated to selection for ampicillin- and sulfonamide-resistant, but not to cefotaxime-resistant strains. In conclusion, difference in dose of oxytetracycline...

  1. High cycle fatigue of austenitic stainless steels under random loading

    International Nuclear Information System (INIS)

    Gauthier, J.P.; Petrequin, P.

    1987-08-01

    To investigate reactor components, load-controlled random fatigue tests were performed at 300 °C and 550 °C on specimens taken in the transverse orientation from austenitic stainless steel plates. Random loadings are produced on closed-loop servo-hydraulic machines by a minicomputer which generates random load sequences by use of a reduced Markovian matrix. The method has the advantage of taking into account the mean load for each cycle. The loadings generated are those of a stationary Gaussian process. Fatigue tests have been performed mainly in the endurance region of the fatigue curve, with scatter determined using the staircase method. The experimental results have been analysed with the aim of determining design curves for component calculations, depending on the irregularity factor and temperature. Analysis in terms of the root-mean-square fatigue limit shows that random loading is more damaging than constant-amplitude loading. Damage calculations following the Miner rule have been made using the probability density function for the case where the irregularity factor is nearest to 100%. The Miner rule is too conservative for our results. A method using design curves that include random loading effects, with the irregularity factor as an indexing parameter, is proposed.
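
The Miner-rule damage summation used in that analysis can be sketched in a few lines; the S-N curve form and constants below are illustrative placeholders, not values fitted to the steels of the study:

```python
def sn_cycles(stress, C=1e12, m=3.0):
    """Basquin-type S-N curve, N = C * S**(-m).

    C and m are illustrative placeholders, not values fitted to the
    austenitic steels discussed in the record above."""
    return C * stress ** -m

def miner_damage(blocks, C=1e12, m=3.0):
    """Miner's linear damage rule: D = sum(n_i / N_i) over stress blocks
    (amplitude S_i, applied cycles n_i); failure is predicted at D >= 1."""
    return sum(n / sn_cycles(S, C, m) for S, n in blocks)

# A random-loading history reduced to counted cycles (e.g. by rainflow
# counting) becomes a list of (stress amplitude, cycle count) blocks.
history = [(200.0, 1.0e4), (150.0, 5.0e4), (100.0, 2.0e5)]
damage = miner_damage(history)   # about 0.449 here, i.e. below failure
```

A design method that is "too conservative" in the sense above would predict failure at D well below 1 for the measured random-loading lives.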

  2. Selection of high hectolitre weight mutants of winter wheat

    International Nuclear Information System (INIS)

    Crowley, C.; Jones, P.

    1989-01-01

    Grain quality in wheat includes hectolitre weight (HLW) besides protein content and thousand-grain weight (TGW). The British winter wheat variety 'Guardian' has a very high yield potential. Although the long grain of 'Guardian' results in a desirable high TGW, the HLW is too low. To select mutants exhibiting increased HLW, the character was first analysed to identify traits that could more easily be screened for using M₂ seeds. In a comparison of 6 wheat cultivars, correlation analyses with HLW resulted in coefficients of -0.86 (grain length, L:P) … M₂ seeds for shorter, less prolate grains. Mutagenesis was carried out using EMS (ethyl methanesulphonate; 1.8 or 3.6%), sodium azide (2 or 20 mM) or X-rays (7.5 or 20 kR). 69 M₂ grains with altered shape were selected. Examination of the M₃ progeny confirmed 6 grain-shape mutants, most of them resulting from the EMS treatment (Table). Two of the mutants showed TGW values significantly below the parental variety, but three mutants exhibited HLW and TGW values significantly greater than those of the parental variety. Microplot yield trials on selected M₃ lines are in progress. The influence of physical grain characteristics on HLW offers prospects for mechanical fractionation of large M₂ populations. The application of gravity separators (fractionation on the basis of grain density) and sieves (fractionation on the basis of grain length) in screening mutants possessing improved grain quality is being investigated

  3. Randomized interpolative decomposition of separated representations

    Science.gov (United States)

    Biagioni, David J.; Beylkin, Daniel; Beylkin, Gregory

    2015-01-01

    We introduce an algorithm to compute tensor interpolative decomposition (dubbed CTD-ID) for the reduction of the separation rank of Canonical Tensor Decompositions (CTDs). Tensor ID selects, for a user-defined accuracy ɛ, a near optimal subset of terms of a CTD to represent the remaining terms via a linear combination of the selected terms. CTD-ID can be used as an alternative to or in combination with the Alternating Least Squares (ALS) algorithm. We present examples of its use within a convergent iteration to compute inverse operators in high dimensions. We also briefly discuss the spectral norm as a computational alternative to the Frobenius norm in estimating approximation errors of tensor ID. We reduce the problem of finding tensor IDs to that of constructing interpolative decompositions of certain matrices. These matrices are generated via randomized projection of the terms of the given tensor. We provide cost estimates and several examples of the new approach to the reduction of separation rank.
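
The matrix step described above, randomized projection followed by selection of a pivot subset of columns and a least-squares fit of the rest, can be sketched as a toy column ID. This is an illustration of the idea only, not the CTD-ID implementation of the paper; all names and defaults are ours:

```python
import numpy as np

def interpolative_decomposition(A, k, oversample=5, seed=0):
    """Randomized column ID sketch: pick k column indices J of A and a
    coefficient matrix P such that A ~= A[:, J] @ P.

    Illustrative only: pivoting here is greedy Gram-Schmidt on a random
    sketch, not the strong rank-revealing QR used in production codes."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Randomized projection: compress the rows so that pivoting acts on
    # a small (k + oversample) x n matrix instead of A itself.
    Y = rng.standard_normal((k + oversample, m)) @ A
    # Greedy pivoted Gram-Schmidt: repeatedly take the column with the
    # largest residual norm, then deflate that direction.
    R = Y.copy()
    J = []
    for _ in range(k):
        j = int(np.argmax(np.linalg.norm(R, axis=0)))
        q = R[:, j] / np.linalg.norm(R[:, j])
        R = R - np.outer(q, q @ R)
        J.append(j)
    # Express every column of the sketch in terms of the chosen columns.
    P, *_ = np.linalg.lstsq(Y[:, J], Y, rcond=None)
    return J, P

# Exactly rank-3 test matrix: three chosen columns should reproduce it.
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 3)) @ rng.standard_normal((3, 40))
J, P = interpolative_decomposition(A, k=3)
err = np.linalg.norm(A - A[:, J] @ P) / np.linalg.norm(A)
```

Because the residual of a chosen column is deflated to (numerically) zero, the greedy loop selects k distinct columns, and for an exactly rank-k input the relative error is at the level of floating-point roundoff.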

  4. Random-walk simulation of selected aspects of dissipative collisions

    International Nuclear Information System (INIS)

    Toeke, J.; Gobbi, A.; Matulewicz, T.

    1984-11-01

    Internuclear thermal equilibrium effects and shell structure effects in dissipative collisions are studied numerically within the framework of the model of stochastic exchanges by applying the random-walk technique. It is found that the drift through the mass flux induced by the temperature difference can be effectively blocked, while leaving the variances of the mass distributions unaltered, provided an internuclear potential barrier is present. The presence of shell structure is found to lead to characteristic correlations between consecutive exchanges. Experimental evidence for the predicted effects is discussed. (orig.)
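
The random-walk technique behind such stochastic-exchange models can be sketched with a toy one-dimensional walk; this is purely illustrative of the method (symmetric unit steps, no physical rates), not the reduced Markovian model of the cited work:

```python
import random

def stochastic_exchange(n_steps, p_gain=0.5, seed=0, n_trials=2000):
    """Toy random-walk model of mass exchange: each step transfers one
    nucleon toward the projectile-like fragment with probability p_gain,
    otherwise away from it.  Returns the mean and variance of the final
    net mass transfer over many simulated collision events."""
    rng = random.Random(seed)
    finals = []
    for _ in range(n_trials):
        mass = 0
        for _ in range(n_steps):
            mass += 1 if rng.random() < p_gain else -1
        finals.append(mass)
    mean = sum(finals) / n_trials
    var = sum((m - mean) ** 2 for m in finals) / n_trials
    return mean, var

# With p_gain = 0.5 the drift vanishes (blocked, in the language of the
# abstract) while the variance still grows linearly with the number of
# exchanges: var is approximately n_steps.
mean, var = stochastic_exchange(100)
```

This mirrors the qualitative point of the abstract: drift and variance of the mass distribution can be controlled independently, since the drift depends on the step asymmetry while the variance accumulates with the number of exchanges.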

  5. FOOD SECURITY SITUATION OF SELECTED HIGHLY DEVELOPED COUNTRIES AGAINST DEVELOPING COUNTRIES

    OpenAIRE

    Karolina Pawlak

    2016-01-01

    The aim of the paper is to present the food security situation in selected highly developed countries and to identify consumption disparities between them and developing countries. The research is based on the data from the United Nations Food and Agriculture Organization (FAO), the Statistical Office of the European Union (Eurostat), the United Nations Statistics Division, the Organisation for Economic Co-operation and Development (OECD), World Food Programme (WFP) and selected measures used...

  6. Minimization of storage and disposal volumes by treatment of liquids by highly selective ion exchangers

    International Nuclear Information System (INIS)

    Tusa, E.; Harjula, R.; Lehto, J.

    2000-01-01

    Novel highly selective inorganic ion exchangers provide new efficient methods for the treatment of nuclear waste liquids. These methods have several advantages compared to conventional technologies such as evaporation, direct solidification or treatment by organic ion exchange resins. Due to the high selectivity, the radionuclides can be concentrated to a very small volume even from high-salt effluents. This means that the waste volume will be very small compared to other methods, which brings considerable savings in the costs of intermediate storage and final disposal. The process equipment is highly compact and requires little supervision, which brings down the capital and operating costs. The new selective inorganic ion exchangers CsTreat, SrTreat and CoTreat (manufactured by Fortum Engineering Ltd., Finland) have the highest selectivities and processing capacities, exceeding those of zeolites by several orders of magnitude. The materials are now in use at a number of nuclear sites worldwide, including sites in the USA, Europe and Japan. Installations include mobile and stationary systems. Considerable experience has been gained in the use of these new materials. Lessons learned, as well as the advantages and economic benefits of these highly selective exchangers, are discussed in this paper. (authors)

  7. Selection of DNA aptamers against epidermal growth factor receptor with high affinity and specificity

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Deng-Liang [The First Clinical Medical College of Fujian Medical University, Fuzhou (China); Department of Neurosurgery, The First Affiliated Hospital of Fujian Medical University, Fuzhou (China); Song, Yan-Ling; Zhu, Zhi; Li, Xi-Lan; Zou, Yuan [State Key Laboratory for Physical Chemistry of Solid Surfaces, Key Laboratory for Chemical Biology of Fujian Province, Key Laboratory of Analytical Chemistry, and Department of Chemical Biology, College of Chemistry and Chemical Engineering, Xiamen University, Xiamen 361005 (China); Yang, Hai-Tao; Wang, Jiang-Jie [The First Clinical Medical College of Fujian Medical University, Fuzhou (China); Yao, Pei-Sen [Department of Neurosurgery, The First Affiliated Hospital of Fujian Medical University, Fuzhou (China); Pan, Ru-Jun [The First Clinical Medical College of Fujian Medical University, Fuzhou (China); Yang, Chaoyong James, E-mail: cyyang@xmu.edu.cn [State Key Laboratory for Physical Chemistry of Solid Surfaces, Key Laboratory for Chemical Biology of Fujian Province, Key Laboratory of Analytical Chemistry, and Department of Chemical Biology, College of Chemistry and Chemical Engineering, Xiamen University, Xiamen 361005 (China); Kang, De-Zhi, E-mail: kdzy99988@163.com [The First Clinical Medical College of Fujian Medical University, Fuzhou (China); Department of Neurosurgery, The First Affiliated Hospital of Fujian Medical University, Fuzhou (China)

    2014-10-31

    Highlights: • This is the first report of DNA aptamer against EGFR in vitro. • Aptamer can bind targets with high affinity and selectivity. • DNA aptamers are more stable, cheap and efficient than RNA aptamers. • Our selected DNA aptamer against EGFR has high affinity with K_d = 56 ± 7.3 nM. • Our selected DNA aptamer against EGFR has high selectivity. - Abstract: Epidermal growth factor receptor (EGFR/HER1/c-ErbB1) is overexpressed in many solid cancers, such as epidermoid carcinomas, malignant gliomas, etc. EGFR plays roles in proliferation, invasion, angiogenesis and metastasis of malignant cancer cells and is an ideal antigen for clinical applications in cancer detection, imaging and therapy. Aptamers, the output of the systematic evolution of ligands by exponential enrichment (SELEX), are DNA/RNA oligonucleotides which can bind protein and other substances with specificity. RNA aptamers are undesirable due to their instability and high cost of production. Conversely, DNA aptamers have aroused researchers' attention because they are easily synthesized, stable, selective, have high binding affinity and are cost-effective to produce. In this study, we have successfully identified DNA aptamers with high binding affinity and selectivity to EGFR. The aptamer named TuTu22, with K_d = 56 ± 7.3 nM, was chosen from the identified DNA aptamers for further study. Flow cytometry analysis results indicated that the TuTu22 aptamer was able to specifically recognize a variety of cancer cells expressing EGFR but did not bind to the EGFR-negative cells. With all of the aforementioned advantages, the DNA aptamers reported here against the cancer biomarker EGFR will facilitate the development of novel targeted cancer detection, imaging and therapy.

  8. Selection of DNA aptamers against epidermal growth factor receptor with high affinity and specificity

    International Nuclear Information System (INIS)

    Wang, Deng-Liang; Song, Yan-Ling; Zhu, Zhi; Li, Xi-Lan; Zou, Yuan; Yang, Hai-Tao; Wang, Jiang-Jie; Yao, Pei-Sen; Pan, Ru-Jun; Yang, Chaoyong James; Kang, De-Zhi

    2014-01-01

    Highlights: • This is the first report of DNA aptamer against EGFR in vitro. • Aptamer can bind targets with high affinity and selectivity. • DNA aptamers are more stable, cheap and efficient than RNA aptamers. • Our selected DNA aptamer against EGFR has high affinity with K_d = 56 ± 7.3 nM. • Our selected DNA aptamer against EGFR has high selectivity. - Abstract: Epidermal growth factor receptor (EGFR/HER1/c-ErbB1) is overexpressed in many solid cancers, such as epidermoid carcinomas, malignant gliomas, etc. EGFR plays roles in proliferation, invasion, angiogenesis and metastasis of malignant cancer cells and is an ideal antigen for clinical applications in cancer detection, imaging and therapy. Aptamers, the output of the systematic evolution of ligands by exponential enrichment (SELEX), are DNA/RNA oligonucleotides which can bind protein and other substances with specificity. RNA aptamers are undesirable due to their instability and high cost of production. Conversely, DNA aptamers have aroused researchers' attention because they are easily synthesized, stable, selective, have high binding affinity and are cost-effective to produce. In this study, we have successfully identified DNA aptamers with high binding affinity and selectivity to EGFR. The aptamer named TuTu22, with K_d = 56 ± 7.3 nM, was chosen from the identified DNA aptamers for further study. Flow cytometry analysis results indicated that the TuTu22 aptamer was able to specifically recognize a variety of cancer cells expressing EGFR but did not bind to the EGFR-negative cells. With all of the aforementioned advantages, the DNA aptamers reported here against the cancer biomarker EGFR will facilitate the development of novel targeted cancer detection, imaging and therapy.

  9. Selection and characterization of DNA aptamers

    NARCIS (Netherlands)

    Ruigrok, V.J.B.

    2013-01-01

    This thesis focusses on the selection and characterisation of DNA aptamers and the various aspects related to their selection from large pools of randomized oligonucleotides. Aptamers are affinity tools that can specifically recognize and bind predefined target molecules; this ability, however,

  10. Tracking and flavour tagging selection in the ATLAS High Level Trigger

    CERN Document Server

    Calvetti, Milene; The ATLAS collaboration

    2017-01-01

    In high-energy physics experiments, track-based selection in the online environment is crucial for the detection of physics processes of interest for further study. This is of particular importance at the Large Hadron Collider (LHC), where the increasingly harsh collision environment is challenging participating experiments to improve the performance of their online selection. Principal among these challenges is the increasing number of interactions per bunch crossing, known as pileup. In the ATLAS experiment the challenge has been addressed with multiple strategies. Firstly, individual trigger groups focusing on specific physics objects have implemented novel algorithms which make use of the detailed tracking and vertexing performed within the trigger to improve rejection without losing efficiency. Secondly, since 2015 all trigger areas have also benefited from a new high-performance inner detector software tracking system implemented in the High Level Trigger. Finally, performance will be further enhanced i...

  11. Pseudo-Random Number Generators

    Science.gov (United States)

    Howell, L. W.; Rheinfurth, M. H.

    1984-01-01

    The package features a comprehensive selection of probabilistic distributions. Monte Carlo simulations are resorted to whenever the systems studied are not amenable to deterministic analyses or when direct experimentation is not feasible. Random numbers having certain specified distribution characteristics are an integral part of such simulations. The package consists of a collection of "pseudorandom" number generators for use in Monte Carlo simulations.
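
Generators with specified distribution characteristics are typically built on top of a uniform pseudorandom source. A minimal sketch using Python's standard library (inverse-transform sampling for the exponential distribution and the Box-Muller transform for the normal; our own illustrative function names, not the NASA package's API):

```python
import math
import random

def exponential(rng, rate):
    """Inverse-transform sampling: if U ~ Uniform(0,1), then
    -ln(1 - U) / rate follows an Exponential(rate) distribution.
    (1 - rng.random() lies in (0, 1], so the log is always defined.)"""
    return -math.log(1.0 - rng.random()) / rate

def normal_box_muller(rng, mu=0.0, sigma=1.0):
    """Box-Muller transform: two independent uniforms mapped to a
    standard normal deviate, then shifted and scaled."""
    u1 = 1.0 - rng.random()          # avoid log(0)
    u2 = rng.random()
    z = math.sqrt(-2.0 * math.log(u1)) * math.cos(2.0 * math.pi * u2)
    return mu + sigma * z

# Tiny Monte Carlo check: the sample mean of Exponential(rate=2.0)
# should approach 1/rate = 0.5 as the number of draws grows.
rng = random.Random(42)
mean = sum(exponential(rng, 2.0) for _ in range(100_000)) / 100_000
```

The same pattern (uniform source plus a distribution-specific transform) underlies most collections of pseudorandom generators for simulation work.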

  12. Do vouchers lead to sorting under random private-school selection? Evidence from the Milwaukee voucher program

    OpenAIRE

    Chakrabarti, Rajashri

    2009-01-01

    This paper analyzes the effect of school vouchers on student sorting - defined as a flight to private schools by high-income and committed public-school students - and whether vouchers can be designed to reduce or eliminate it. Much of the existing literature investigates sorting in cases where private schools can screen students. However, publicly funded U.S. voucher programs require a private school to accept all students unless it is oversubscribed and to pick students randomly if it is ov...

  13. Model Selection with the Linear Mixed Model for Longitudinal Data

    Science.gov (United States)

    Ryoo, Ji Hoon

    2011-01-01

    Model building or model selection with linear mixed models (LMMs) is complicated by the presence of both fixed effects and random effects. The fixed effects structure and random effects structure are codependent, so selection of one influences the other. Most presentations of LMM in psychology and education are based on a multilevel or…

  14. Reduced basis ANOVA methods for partial differential equations with high-dimensional random inputs

    Energy Technology Data Exchange (ETDEWEB)

    Liao, Qifeng, E-mail: liaoqf@shanghaitech.edu.cn [School of Information Science and Technology, ShanghaiTech University, Shanghai 200031 (China); Lin, Guang, E-mail: guanglin@purdue.edu [Department of Mathematics & School of Mechanical Engineering, Purdue University, West Lafayette, IN 47907 (United States)

    2016-07-15

    In this paper we present a reduced basis ANOVA approach for partial differential equations (PDEs) with random inputs. The ANOVA method combined with stochastic collocation methods provides model reduction in high-dimensional parameter space through decomposing high-dimensional inputs into unions of low-dimensional inputs. In this work, to further reduce the computational cost, we investigate spatial low-rank structures in the ANOVA-collocation method, and develop efficient spatial model reduction techniques using hierarchically generated reduced bases. We present a general mathematical framework of the methodology, validate its accuracy and demonstrate its efficiency with numerical experiments.

  15. Highly selective gas sensor arrays based on thermally reduced graphene oxide.

    Science.gov (United States)

    Lipatov, Alexey; Varezhnikov, Alexey; Wilson, Peter; Sysoev, Victor; Kolmakov, Andrei; Sinitskii, Alexander

    2013-06-21

    The electrical properties of reduced graphene oxide (rGO) have been previously shown to be very sensitive to surface adsorbates, thus making rGO a very promising platform for highly sensitive gas sensors. However, poor selectivity of rGO-based gas sensors remains a major problem for their practical use. In this paper, we address the selectivity problem by employing an array of rGO-based integrated sensors instead of focusing on the performance of a single sensing element. Each rGO-based device in such an array has a unique sensor response due to the irregular structure of rGO films at different levels of organization, ranging from nanoscale to macroscale. The resulting rGO-based gas sensing system could reliably recognize analytes of nearly the same chemical nature. In our experiments rGO-based sensor arrays demonstrated a high selectivity that was sufficient to discriminate between different alcohols, such as methanol, ethanol and isopropanol, at a 100% success rate. We also discuss a possible sensing mechanism that provides the basis for analyte differentiation.

  16. Low versus high volume of culture medium during embryo transfer: a randomized clinical trial.

    Science.gov (United States)

    Sigalos, George Α; Michalopoulos, Yannis; Kastoras, Athanasios G; Triantafyllidou, Olga; Vlahos, Nikos F

    2018-04-01

    The aim of this prospective randomized controlled trial was to evaluate whether the use of two different volumes (20-25 vs 40-45 μl) of media used for embryo transfer affects the clinical outcomes in fresh in vitro fertilization (IVF) cycles. In total, 236 patients were randomized into two groups, i.e., a "low volume" group (n = 118) transferring the embryos with 20-25 μl of medium and a "high volume" group (n = 118) transferring the embryos with 40-45 μl of medium. The clinical pregnancy, implantation, and ongoing pregnancy rates were compared between the two groups. No statistically significant differences were observed in clinical pregnancy (46.8 vs 54.3%, p = 0.27), implantation (23.7 vs 27.8%, p = 0.30), and ongoing pregnancy (33.3 vs 40.0%, p = 0.31) rates between the low and high volume groups, respectively. A higher volume of culture medium used to load the embryos into the catheter during embryo transfer does not influence the clinical outcome in fresh IVF cycles. NCT03350646.

  17. High-volume infiltration analgesia in total knee arthroplasty: a randomized, double-blind, placebo-controlled trial

    DEFF Research Database (Denmark)

    Andersen, L.O.; Husted, H.; Otte, K.S.

    2008-01-01

    with a detailed description of the infiltration technique. METHODS: In a randomized, double-blind, placebo-controlled trial in 12 patients undergoing bilateral knee arthroplasty, saline or high-volume (170 ml) ropivacaine (0.2%) with epinephrine was infiltrated around each knee, with repeated doses administered...

  18. The adverse effect of selective cyclooxygenase-2 inhibitor on random skin flap survival in rats.

    Directory of Open Access Journals (Sweden)

    Haiyong Ren

    Full Text Available BACKGROUND: Cyclooxygenase-2 (COX-2) inhibitors provide desired analgesic effects after injury or surgery, but evidence suggests they also attenuate wound healing. This study investigates the effect of a COX-2 inhibitor on random skin flap survival. METHODS: The McFarlane flap model was established in 40 rats evaluated in two groups; for 7 days one group was given Parecoxib and the other the same volume of saline injection. The necrotic area of the flap was measured, and specimens of the flap were stained with haematoxylin-eosin (HE) for histologic analysis. Immunohistochemical staining was performed to analyse the levels of VEGF and COX-2. RESULTS: 7 days after operation, the flap necrotic area ratio in the study group (66.65 ± 2.81%) was significantly larger than that of the control group (48.81 ± 2.33%) (P < 0.01). Histological analysis demonstrated angiogenesis, with mean vessel density per mm² being lower in the study group (15.4 ± 4.4) than in the control group (27.2 ± 4.1) (P < 0.05). The expression of COX-2 and VEGF protein in intermediate area II was evaluated in the two groups by immunohistochemistry. The expression of COX-2 in the study group was 1022.45 ± 153.1, and in the control group 2638.05 ± 132.2 (P < 0.01). The expression of VEGF in the study and control groups was 2779.45 ± 472.0 vs 4938.05 ± 123.6 (P < 0.01). In the COX-2 inhibitor group, the expression of COX-2 and VEGF protein was remarkably down-regulated compared with the control group. CONCLUSION: The selective COX-2 inhibitor had an adverse effect on random skin flap survival. Suppression of neovascularization induced by a low level of VEGF is the presumed biological mechanism.

  19. Adolescent alcohol use in rural South African high schools | Onya ...

    African Journals Online (AJOL)

    Objective: To examine psychosocial correlates of lifetime alcohol use among adolescents in rural South African high schools. Method: Questionnaires were administered to 1600 students from 20 randomly selected high schools in the Mankweng district within Limpopo province. Self-report data on alcohol use, demographic, ...

  20. Selectivity in Postencoding Connectivity with High-Level Visual Cortex Is Associated with Reward-Motivated Memory.

    Science.gov (United States)

    Murty, Vishnu P; Tompary, Alexa; Adcock, R Alison; Davachi, Lila

    2017-01-18

    Reward motivation has been demonstrated to enhance declarative memory by facilitating systems-level consolidation. Although high-reward information is often intermixed with lower reward information during an experience, memory for high value information is prioritized. How is this selectivity achieved? One possibility is that postencoding consolidation processes bias memory strengthening to those representations associated with higher reward. To test this hypothesis, we investigated the influence of differential reward motivation on the selectivity of postencoding markers of systems-level memory consolidation. Human participants encoded intermixed, trial-unique memoranda that were associated with either high or low value during fMRI acquisition. Encoding was interleaved with periods of rest, allowing us to investigate experience-dependent changes in connectivity as they related to later memory. Behaviorally, we found that reward motivation enhanced 24 h associative memory. Analysis of patterns of postencoding connectivity showed that, even though learning trials were intermixed, there was significantly greater connectivity with regions of high-level, category-selective visual cortex associated with high-reward trials. Specifically, increased connectivity of category-selective visual cortex with both the VTA and the anterior hippocampus predicted associative memory for high- but not low-reward memories. Critically, these results were independent of encoding-related connectivity and univariate activity measures. Thus, these findings support a model by which the selective stabilization of memories for salient events is supported by postencoding interactions with sensory cortex associated with reward. Reward motivation is thought to promote memory by supporting memory consolidation. Yet, little is known as to how the brain selects relevant information for subsequent consolidation based on reward. We show that experience-dependent changes in connectivity of both the

  1. Bias in random forest variable importance measures: Illustrations, sources and a solution

    Directory of Open Access Journals (Sweden)

    Hothorn Torsten

    2007-01-01

    Full Text Available Abstract Background Variable importance measures for random forests have been receiving increased attention as a means of variable selection in many classification tasks in bioinformatics and related scientific fields, for instance to select a subset of genetic markers relevant for the prediction of a certain disease. We show that random forest variable importance measures are a sensible means for variable selection in many applications, but are not reliable in situations where potential predictor variables vary in their scale of measurement or their number of categories. This is particularly important in genomics and computational biology, where predictors often include variables of different types, for example when predictors include both sequence data and continuous variables such as folding energy, or when amino acid sequence data show different numbers of categories. Results Simulation studies are presented illustrating that, when random forest variable importance measures are used with data of varying types, the results are misleading because suboptimal predictor variables may be artificially preferred in variable selection. The two mechanisms underlying this deficiency are biased variable selection in the individual classification trees used to build the random forest on one hand, and effects induced by bootstrap sampling with replacement on the other hand. Conclusion We propose to employ an alternative implementation of random forests, that provides unbiased variable selection in the individual classification trees. When this method is applied using subsampling without replacement, the resulting variable importance measures can be used reliably for variable selection even in situations where the potential predictor variables vary in their scale of measurement or their number of categories. The usage of both random forest algorithms and their variable importance measures in the R system for statistical computing is illustrated and
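
The permutation-importance idea underlying these measures can be sketched model-agnostically: shuffle one predictor at a time and record the drop in predictive accuracy. This toy version illustrates plain permutation importance only, not the unbiased conditional variant the article proposes, and all names are ours:

```python
import random

def accuracy(model, X, y):
    """Fraction of rows the model classifies correctly."""
    return sum(model(row) == label for row, label in zip(X, y)) / len(y)

def permutation_importance(model, X, y, n_repeats=20, seed=0):
    """For each feature j, shuffle column j (breaking its association
    with the labels) and report the mean accuracy drop over repeats."""
    rng = random.Random(seed)
    base = accuracy(model, X, y)
    importances = []
    for j in range(len(X[0])):
        drops = []
        for _ in range(n_repeats):
            col = [row[j] for row in X]
            rng.shuffle(col)
            Xp = [row[:j] + [v] + row[j + 1:] for row, v in zip(X, col)]
            drops.append(base - accuracy(model, Xp, y))
        importances.append(sum(drops) / n_repeats)
    return importances

# Toy data: feature 0 determines the label, feature 1 is pure noise.
rng = random.Random(1)
X = [[rng.random(), rng.random()] for _ in range(500)]
y = [1 if row[0] > 0.5 else 0 for row in X]
model = lambda row: 1 if row[0] > 0.5 else 0   # a "fitted" decision stump
imp = permutation_importance(model, X, y)
# imp[0] is large (shuffling the informative feature destroys accuracy);
# imp[1] is zero (the model never looks at the noise feature).
```

The bias discussed in the article arises one level earlier, in how the trees themselves select split variables; permuting features of a biased forest inherits that bias, which is why the authors pair unbiased tree induction with subsampling without replacement.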

  2. Selection of design basis event for modular high temperature gas-cooled reactor

    International Nuclear Information System (INIS)

    Sato, Hiroyuki; Nakagawa, Shigeaki; Ohashi, Hirofumi

    2016-06-01

    Japan Atomic Energy Agency (JAEA) has been investigating safety requirements and a basic approach for safety guidelines for the modular High Temperature Gas-cooled Reactor (HTGR), aiming to increase its international contribution to nuclear safety by developing an international HTGR safety standard under the International Atomic Energy Agency. In this study, we investigate a deterministic approach to select design basis events utilizing information obtained from a probabilistic approach. In addition, selection of design basis events is conducted for a commercial HTGR designed by JAEA. As a result, an approach for selecting design basis events considering multiple failures of safety systems is established, which has not been considered as a design basis in the safety guidelines for existing nuclear facilities. Furthermore, the selection of design basis events for the commercial HTGR has been completed. This report provides an approach and procedure for selecting design basis events of modular HTGRs as well as the selected events for the commercial HTGR, GTHTR300. (author)

  3. Multi-Label Learning via Random Label Selection for Protein Subcellular Multi-Locations Prediction.

    Science.gov (United States)

    Wang, Xiao; Li, Guo-Zheng

    2013-03-12

    Prediction of protein subcellular localization is an important but challenging problem, particularly when proteins may simultaneously exist at, or move between, two or more different subcellular location sites. Most of the existing protein subcellular localization methods are only used to deal with single-location proteins. In the past few years, only a few methods have been proposed to tackle proteins with multiple locations. However, they adopt only a simple strategy, that is, transforming the multi-location proteins to multiple proteins with a single location, which doesn't take correlations among different subcellular locations into account. In this paper, a novel method named RALS (multi-label learning via RAndom Label Selection) is proposed to learn from multi-location proteins in an effective and efficient way. Through a five-fold cross-validation test on a benchmark dataset, we demonstrate that our proposed method, with consideration of label correlations, obviously outperforms the baseline BR method without consideration of label correlations, indicating that correlations among different subcellular locations really exist and contribute to the improvement of prediction performance. Experimental results on two benchmark datasets also show that our proposed methods achieve significantly higher performance than some other state-of-the-art methods in predicting subcellular multi-locations of proteins. The prediction web server is available at http://levis.tongji.edu.cn:8080/bioinfo/MLPred-Euk/ for the public usage.

  4. Holey graphene frameworks for highly selective post-combustion carbon capture

    Science.gov (United States)

    Chowdhury, Shamik; Balasubramanian, Rajasekhar

    2016-02-01

    Atmospheric CO2 concentrations continue to rise rapidly in response to increased combustion of fossil fuels, contributing to global climate change. In order to mitigate the effects of global warming, development of new materials for cost-effective and energy-efficient CO2 capture is critically important. Graphene-based porous materials are an emerging class of solid adsorbents for selectively removing CO2 from flue gases. Herein, we report a simple and scalable approach to produce three-dimensional holey graphene frameworks with tunable porosity and pore geometry, and demonstrate their application as high-performance CO2 adsorbents. These holey graphene macrostructures exhibit a significantly improved specific surface area and pore volume compared to their pristine counterparts, and can be effectively used in post-combustion CO2 adsorption systems because of their intrinsic hydrophobicity together with good gravimetric storage capacities, rapid removal capabilities, superior cycling stabilities, and moderate initial isosteric heats. In addition, an exceptionally high CO2 over N2 selectivity can be achieved under conditions relevant to capture from the dry exhaust gas stream of a coal burning power plant, suggesting the possibility of recovering highly pure CO2 for long-term sequestration and/or utilization for downstream applications.

  5. Human norovirus inactivation in oysters by high hydrostatic pressure processing: A randomized double-blinded study

    Science.gov (United States)

    This randomized, double-blinded, clinical trial assessed the effect of high hydrostatic pressure processing (HPP) on genogroup I.1 human norovirus (HuNoV) inactivation in virus-seeded oysters when ingested by subjects. The safety and efficacy of HPP treatments were assessed in three study phases wi...

  6. Survivor bias in Mendelian randomization analysis

    DEFF Research Database (Denmark)

    Vansteelandt, Stijn; Dukes, Oliver; Martinussen, Torben

    2017-01-01

    Mendelian randomization studies employ genotypes as experimental handles to infer the effect of genetically modified exposures (e.g. vitamin D exposure) on disease outcomes (e.g. mortality). The statistical analysis of these studies makes use of the standard instrumental variables framework. Many of these studies focus on elderly populations, thereby ignoring the problem of left truncation, which arises because the selection of study participants is conditional upon surviving up to the time of study onset. Such selection, in general, invalidates the assumptions on which the instrumental variables analysis rests. We show that Mendelian randomization studies of adult or elderly populations will therefore, in general, return biased estimates of the exposure effect when the considered genotype affects mortality; in contrast, standard tests of the causal null hypothesis that the exposure does not affect...

  7. Analysis of severe feather pecking behavior in a high feather pecking selection line

    DEFF Research Database (Denmark)

    Labouriau, R; Kjaer, J B; Abreu, G C G

    2009-01-01

    Even though feather pecking (FP) in laying hens has been extensively studied, a good solution to prevent chickens from this behavior under commercial circumstances has not been found. Selection against FP behavior is possible, but for a more effective selection across different populations, it is necessary to characterize the genetic mechanism associated with this behavior. In this study, we use a high FP selection line, which has been selected for 8 generations. We present evidence of the presence of a major dominant allele affecting the FP behavior by using an argument based on the presence...

  8. Characterization of selective solar absorber under high vacuum.

    Science.gov (United States)

    Russo, Roberto; Monti, Matteo; di Giamberardino, Francesco; Palmieri, Vittorio G

    2018-05-14

    Total absorption and emission coefficients of selective solar absorbers are measured under high vacuum conditions from room temperature up to the stagnation temperature. The sample under investigation is illuminated under vacuum at 1000 W/m², and the sample temperature is recorded during heat-up, equilibrium, and cool-down. During stagnation, the absorber temperature exceeds 300 °C without concentration. Data analysis allows evaluation of the solar absorptance and thermal emittance at different temperatures. These in turn are useful for predicting evacuated solar panel performance at operating conditions.

  9. Random coil chemical shifts in acidic 8 M urea: Implementation of random coil shift data in NMRView

    International Nuclear Information System (INIS)

    Schwarzinger, Stephan; Kroon, Gerard J.A.; Foss, Ted R.; Wright, Peter E.; Dyson, H. Jane

    2000-01-01

    Studies of proteins unfolded in acid or chemical denaturant can help in unraveling events during the earliest phases of protein folding. In order for meaningful comparisons to be made of residual structure in unfolded states, it is necessary to use random coil chemical shifts that are valid for the experimental system under study. We present a set of random coil chemical shifts obtained for model peptides under experimental conditions used in studies of denatured proteins. This new set, together with previously published data sets, has been incorporated into a software interface for NMRView, allowing selection of the random coil data set that fits the experimental conditions best

  10. Discriminative Projection Selection Based Face Image Hashing

    Science.gov (United States)

    Karabat, Cagatay; Erdogan, Hakan

    Face image hashing is an emerging method used in biometric verification systems. In this paper, we propose a novel face image hashing method based on a new technique called discriminative projection selection. We apply the Fisher criterion for selecting the rows of a random projection matrix in a user-dependent fashion. Moreover, another contribution of this paper is to employ a bimodal Gaussian mixture model at the quantization step. Our simulation results on three different databases demonstrate that the proposed method has superior performance in comparison to previously proposed random projection based methods.
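
As a rough illustration of the selection step, the sketch below scores each row of a random projection matrix by a Fisher ratio (between-class over within-class variance of the projected feature) and keeps the most discriminative rows. The final sign quantization stands in for the paper's bimodal Gaussian mixture quantizer, and all data, sizes, and names are hypothetical:

```python
import numpy as np

def fisher_scores(X, y, A):
    """Fisher ratio of each projected feature:
    between-class variance over mean within-class variance."""
    P = X @ A.T                      # project features with candidate matrix A
    classes = np.unique(y)
    scores = []
    for j in range(P.shape[1]):
        v = P[:, j]
        means = np.array([v[y == c].mean() for c in classes])
        within = np.mean([v[y == c].var() for c in classes])
        scores.append(means.var() / (within + 1e-12))
    return np.array(scores)

rng = np.random.default_rng(0)
# toy "face features": two users with 16-dimensional feature vectors
X = np.vstack([rng.normal(0, 1, (20, 16)), rng.normal(2, 1, (20, 16))])
y = np.array([0] * 20 + [1] * 20)
A = rng.normal(size=(32, 16))                     # candidate random projection rows
keep = np.argsort(fisher_scores(X, y, A))[-8:]    # keep most discriminative rows
H = (X @ A[keep].T > 0).astype(int)               # simple sign quantization
```

Selecting rows per user makes the hash user-dependent, which is the discriminative element the abstract describes.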

  11. The effect of selection on genetic parameter estimates

    African Journals Online (AJOL)

    Unknown

    The South African Journal of Animal Science is available online at ... A simulation study was carried out to investigate the effect of selection on the estimation of genetic ... The model contained a fixed effect, random genetic and random.

  12. On the mechanism of high product selectivity for HCOOH using Pb in CO2 electroreduction.

    Science.gov (United States)

    Back, Seoin; Kim, Jun-Hyuk; Kim, Yong-Tae; Jung, Yousung

    2016-04-14

    While achieving high product selectivity is one of the major challenges of the CO2 electroreduction technology in general, Pb is one of the few examples with high selectivity that produces formic acid almost exclusively (versus H2, CO, or other byproducts). In this work, we study the mechanism of CO2 electroreduction reactions using Pb to understand the origin of high formic acid selectivity. In particular, we first assess the proton-assisted mechanism proposed in the literature using density functional calculations and find that it cannot fully explain the previous selectivity experiments for the Pb electrode. We then suggest an alternative proton-coupled-electron-transfer mechanism consistent with existing observations, and further validate a new mechanism by experimentally measuring and comparing the onset potentials for CO2 reduction vs. H2 production. We find that the origin of a high selectivity of the Pb catalyst for HCOOH production over CO and H2 lies in the strong O-affinitive and weak C-, H-affinitive characteristics of Pb, leading to the involvement of the *OCHO species as a key intermediate to produce HCOOH exclusively and preventing unwanted H2 production at the same time.

  13. Development, evaluation, and selection of candidate high-level waste forms

    International Nuclear Information System (INIS)

    Bernadzikowski, T.A.; Allender, J.S.; Gordon, D.E.; Gould, T.H. Jr.

    1982-01-01

    The seven candidate waste forms evaluated as potential media for the immobilization and geologic disposal of high-level nuclear wastes were borosilicate glass, SYNROC, tailored ceramic, high-silica glass, FUETAP concrete, coated sol-gel particles, and glass marbles in a lead matrix. The evaluation, completed on August 1, 1981, combined preliminary waste form evaluations conducted at Department of Energy (DOE) defense waste sites and at independent laboratories, peer review assessments, a product performance evaluation, and a processability analysis. Based on the combined results of these four inputs, two of the seven forms, borosilicate glass and a titanate-based ceramic, SYNROC, were selected as the reference and alternative forms, respectively, for continued development and evaluation in the National HLW Program. The borosilicate glass and ceramic forms were further compared during FY-1982 on the basis of risk assessments, cost comparisons, properties comparisons, and conformance with proposed regulatory and repository criteria. Both the glass and ceramic forms are viable candidates for use at DOE defense HLW sites; they are also candidates for the immobilization of commercial reprocessing wastes. This paper describes the waste form screening process, discusses each of the four major inputs considered in the selection of the two forms in 1981, and presents a brief summary of the comparisons of the two forms during 1982 and the selection process to determine the final form for SRP defense HLW.

  14. Rhodium Nanoparticle-mesoporous Silicon Nanowire Nanohybrids for Hydrogen Peroxide Detection with High Selectivity

    Science.gov (United States)

    Song, Zhiqian; Chang, Hucheng; Zhu, Weiqin; Xu, Chenlong; Feng, Xinjian

    2015-01-01

    Developing nanostructured electrocatalysts with low overpotential, high selectivity, and high activity has fundamental and technical importance in many fields. We report here rhodium nanoparticle and mesoporous silicon nanowire (RhNP@mSiNW) hybrids for hydrogen peroxide (H2O2) detection with high electrocatalytic activity and selectivity. By employing electrodes loaded with RhNP@mSiNW nanohybrids, interference from both electroactive substances and dissolved oxygen was eliminated by electrochemical assaying at an optimal potential of +75 mV. Furthermore, the electrodes exhibited a high detection sensitivity of 0.53 μA/mM and fast response (<5 s). This high-performance nanohybrid electrocatalyst has great potential for future practical application in various oxidase-based biosensors. PMID:25588953

  15. K-Means Algorithm Performance Analysis With Determining The Value Of Starting Centroid With Random And KD-Tree Method

    Science.gov (United States)

    Sirait, Kamson; Tulus; Budhiarti Nababan, Erna

    2017-12-01

    Clustering methods that have high accuracy and time efficiency are necessary for the filtering process. One method that is well known and widely applied in clustering is K-Means clustering. In its application, the determination of the initial cluster centers greatly affects the results of the K-Means algorithm. This research discusses the results of K-Means clustering with the starting centroids determined by a random method and by a KD-Tree method. Random determination of the initial centroids on a data set of 1,000 students' academic records, used to classify potential dropouts, gave an SSE value of 952,972 for the quality variable and 232.48 for the GPA variable, whereas initial centroid determination by KD-Tree gave an SSE value of 504,302 for the quality variable and 214.37 for the GPA variable. The smaller SSE values indicate that K-Means clustering with initial KD-Tree centroid selection has better accuracy than K-Means clustering with random initial centroid selection.
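
A minimal numpy sketch of the comparison's random-initialization arm, assuming standard Lloyd iterations and the same SSE quality measure; the KD-Tree seeding itself is not reproduced here, and the data are synthetic:

```python
import numpy as np

def kmeans(X, init_centroids, n_iter=50):
    """Lloyd's algorithm starting from a given set of initial centroids."""
    centroids = init_centroids.astype(float).copy()
    k = len(centroids)
    for _ in range(n_iter):
        # assign each point to its nearest centroid
        d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        # move each centroid to the mean of its assigned points
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                centroids[j] = pts.mean(axis=0)
    d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    labels = d.argmin(axis=1)
    sse = ((X - centroids[labels]) ** 2).sum()   # the SSE quality measure
    return labels, centroids, sse

rng = np.random.default_rng(42)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(5, 0.5, (50, 2))])
# random initialization: sample k data points as starting centroids
init = X[rng.choice(len(X), size=2, replace=False)]
labels, centroids, sse = kmeans(X, init)
```

A KD-Tree-based seeding would replace the `init` line with centroids drawn from dense leaf regions of the tree; everything downstream, including the SSE comparison, stays the same.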

  16. Design of Highly Selective Gas Sensors via Physicochemical Modification of Oxide Nanowires: Overview

    Directory of Open Access Journals (Sweden)

    Hyung-Sik Woo

    2016-09-01

    Strategies for the enhancement of gas sensing properties, and specifically the improvement of gas selectivity, of metal oxide semiconductor nanowire (NW) networks grown by chemical vapor deposition and thermal evaporation are reviewed. Highly crystalline NWs grown by vapor-phase routes have various advantages, and thus have been applied in the field of gas sensors over the years. In particular, n-type NWs such as SnO2, ZnO, and In2O3 are widely studied because of their simple synthetic preparation and high gas response. However, due to their usually high responses to C2H5OH and NO2, the selective detection of other harmful and toxic gases using oxide NWs remains a challenging issue. Various strategies—such as doping/loading of noble metals, decorating/doping of catalytic metal oxides, and the formation of core–shell structures—have been explored to enhance gas selectivity and sensitivity, and are discussed herein. Additional methods such as the transformation of n-type into p-type NWs and the formation of catalyst-doped hierarchical structures by branch growth have also proven to be promising for the enhancement of gas selectivity. Accordingly, the physicochemical modification of oxide NWs via various methods provides new strategies to achieve the selective detection of a specific gas and, after further investigation, this approach could pave a new way in the field of NW-based semiconductor-type gas sensors.

  17. The genealogy of samples in models with selection.

    Science.gov (United States)

    Neuhauser, C; Krone, S M

    1997-02-01

    We introduce the genealogy of a random sample of genes taken from a large haploid population that evolves according to random reproduction with selection and mutation. Without selection, the genealogy is described by Kingman's well-known coalescent process. In the selective case, the genealogy of the sample is embedded in a graph with a coalescing and branching structure. We describe this graph, called the ancestral selection graph, and point out differences and similarities with Kingman's coalescent. We present simulations for a two-allele model with symmetric mutation in which one of the alleles has a selective advantage over the other. We find that when the allele frequencies in the population are already in equilibrium, the genealogy does not differ much from the neutral case. This is supported by rigorous results. Furthermore, we describe the ancestral selection graph for other selective models with finitely many selection classes, such as the K-allele models, infinitely-many-alleles models, DNA sequence models, and infinitely-many-sites models, and briefly discuss the diploid case.
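
The ancestral selection graph itself is harder to sketch briefly, but the neutral baseline it is compared against, Kingman's coalescent, can be simulated in a few lines (a minimal sketch; time is measured in coalescent units of the population size):

```python
import random

def kingman_times(n, seed=0):
    """Successive coalescence times for a sample of n lineages under the
    neutral Kingman coalescent."""
    rng = random.Random(seed)
    t, times = 0.0, []
    for k in range(n, 1, -1):
        # with k lineages, each of the k*(k-1)/2 pairs coalesces at rate 1,
        # so the waiting time to the next merger is exponential with that rate
        t += rng.expovariate(k * (k - 1) / 2)
        times.append(t)
    return times

times = kingman_times(10)   # 9 mergers reduce 10 lineages to a common ancestor
```

In the selective case, branching events would be interleaved with these mergers, producing the graph structure described in the abstract rather than a simple tree.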

  18. Randomized Trial Comparing R-CHOP Versus High-Dose Sequential Chemotherapy in High-Risk Patients With Diffuse Large B-Cell Lymphomas.

    Science.gov (United States)

    Cortelazzo, Sergio; Tarella, Corrado; Gianni, Alessandro Massimo; Ladetto, Marco; Barbui, Anna Maria; Rossi, Andrea; Gritti, Giuseppe; Corradini, Paolo; Di Nicola, Massimo; Patti, Caterina; Mulé, Antonino; Zanni, Manuela; Zoli, Valerio; Billio, Atto; Piccin, Andrea; Negri, Giovanni; Castellino, Claudia; Di Raimondo, Francesco; Ferreri, Andrés J M; Benedetti, Fabio; La Nasa, Giorgio; Gini, Guido; Trentin, Livio; Frezzato, Maurizio; Flenghi, Leonardo; Falorio, Simona; Chilosi, Marco; Bruna, Riccardo; Tabanelli, Valentina; Pileri, Stefano; Masciulli, Arianna; Delaini, Federica; Boschini, Cristina; Rambaldi, Alessandro

    2016-11-20

    Purpose The benefit of high-dose chemotherapy with autologous stem-cell transplantation (ASCT) as first-line treatment in patients with diffuse large B-cell lymphomas is still a matter of debate. To address this point, we designed a randomized phase III trial to compare rituximab plus cyclophosphamide, doxorubicin, vincristine, and prednisone (R-CHOP)-14 (eight cycles) with rituximab plus high-dose sequential chemotherapy (R-HDS) with ASCT. Patients and Methods From June 2005 to June 2011, 246 high-risk patients with a high-intermediate (56%) or high (44%) International Prognostic Index score were randomly assigned to the R-CHOP or R-HDS arm, and 235 were analyzed by intent to treat. The primary efficacy end point of the study was 3-year event-free survival, and results were analyzed on an intent-to-treat basis. Results Clinical response (complete response, 78% v 76%; partial response, 5% v 9%) and failures (no response, 15% v 11%; early treatment-related mortality, 2% v 3%) were similar after R-CHOP versus R-HDS, respectively. After a median follow-up of 5 years, the 3-year event-free survival was 62% versus 65% (P = .83). At 3 years, compared with the R-CHOP arm, the R-HDS arm had better disease-free survival (79% v 91%, respectively; P = .034), but this advantage subsequently vanished because of late-occurring treatment-related deaths. No difference was detected in terms of progression-free survival (65% v 75%, respectively; P = .12) or overall survival (74% v 77%, respectively; P = .64). Significantly higher hematologic toxicity (P < .001) and more infectious complications (P < .001) were observed in the R-HDS arm. Conclusion In this study, front-line intensive R-HDS chemotherapy with ASCT did not improve the outcome of high-risk patients with diffuse large B-cell lymphomas.

  19. Selection gradients, the opportunity for selection, and the coefficient of determination.

    Science.gov (United States)

    Moorad, Jacob A; Wade, Michael J

    2013-03-01

    We derive the relationship between R² (the coefficient of determination), selection gradients, and the opportunity for selection for univariate and multivariate cases. Our main result is to show that the portion of the opportunity for selection that is caused by variation in any trait is equal to the product of its selection gradient and its selection differential. This relationship is a corollary of the first and second fundamental theorems of natural selection, and it permits one to investigate the portions of the total opportunity for selection that are involved in directional selection, stabilizing (and diversifying) selection, and correlational selection, which is important to morphological integration. It also allows one to determine the fraction of fitness variation not explained by variation in measured phenotypes and therefore attributable to random (or, at least, unknown) influences. We apply our methods to a human data set to show how sex-specific mating success as a component of fitness variance can be decoupled from that owing to prereproductive mortality. By quantifying linear and quadratic sources of sexual selection, we illustrate that the former is stronger in males, while the latter is stronger in females.
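
The main identity can be checked numerically: with relative fitness regressed on traits, the variance in fitness explained by the traits equals the sum over traits of selection gradient times selection differential. A sketch with synthetic data (the coefficients and sample size below are illustrative assumptions, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
Z = rng.normal(size=(n, 2))                   # two measured traits
w = 1 + 0.3 * Z[:, 0] - 0.2 * Z[:, 1] + rng.normal(0, 0.5, size=n)
w = w / w.mean()                              # relative fitness

I = w.var()                                   # opportunity for selection
X = np.column_stack([np.ones(n), Z])
beta, *_ = np.linalg.lstsq(X, w, rcond=None)
gradients = beta[1:]                          # selection gradients (OLS slopes)
differentials = np.array([np.cov(w, Z[:, i], ddof=0)[0, 1]
                          for i in range(2)]) # selection differentials

# explained variance in fitness = sum of gradient * differential
w_hat = X @ beta
explained = (gradients * differentials).sum()
r_squared = explained / I                     # coefficient of determination
```

The residual fraction `1 - r_squared` is then the part of the opportunity for selection attributable to unmeasured or random influences, as stated in the abstract.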

  20. Application of high Tc superconductors as frequency selective surfaces: Experiment and theory

    International Nuclear Information System (INIS)

    Dawei Zhang; Yahya Rahmat-Samii; Fetterman, H.R.

    1993-01-01

    YBa₂Cu₃O₇₋ₓ and Tl₂CaBa₂Cu₂O₈ high temperature superconducting thin films were utilized to fabricate frequency selective surfaces (FSS) at millimeter-wave frequencies (75-110 GHz). An analytical/numerical model, using a Floquet expansion and the Method of Moments, was applied to analyze bandstop superconducting frequency selective surfaces. Experimental results were compared with the model and showed good agreement, with the resonant frequency predicted to an accuracy of better than 1%. The use of superconducting frequency selective surfaces as quasi-optical millimeter-wave bandpass filters was also demonstrated

  1. Study on the partner selecting method of strategic alliance in high and new technology enterprises

    Institute of Scientific and Technical Information of China (English)

    王宏起; 唐宇; 迟运领

    2004-01-01

    A successful and effective strategic alliance involves many factors, of which selecting a proper partner is the most important for the success of the alliance. In view of the characteristics of strategic alliances in high and new technology enterprises, and based on an analysis of partner-selection criteria and of the factors behind alliance success, this paper investigates partner selection and the alliance evaluation process in greater depth from the perspective of different strategic levels, using a fuzzy comprehensive evaluation method, thus providing a method for selecting alliance partners for high and new technology enterprises in China.
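
The abstract does not give the paper's actual criteria or weights; the following sketch only illustrates the generic fuzzy comprehensive evaluation step, in which criterion weights are combined with a membership matrix over rating grades and the partner's grade is read off by maximum membership (all criteria names and numbers below are hypothetical):

```python
import numpy as np

# criterion weights for partner evaluation (hypothetical):
# technological capability, complementary resources, cultural fit, cost
W = np.array([0.4, 0.3, 0.2, 0.1])

# membership matrix R: each row gives one criterion's membership degrees
# over the rating grades (excellent, good, fair, poor)
R = np.array([[0.5, 0.3, 0.2, 0.0],
              [0.3, 0.4, 0.2, 0.1],
              [0.2, 0.5, 0.2, 0.1],
              [0.1, 0.3, 0.4, 0.2]])

B = W @ R          # weighted-average fuzzy composition operator M(·, +)
grade = B.argmax() # max-membership principle picks the overall rating
```

With these illustrative numbers the candidate's aggregate membership vector peaks at the "good" grade; in a multi-level hierarchy, `B` for a sub-criterion group would itself become a row of the next level's membership matrix.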

  2. Randomized Controlled Trial of Teaching Methods: Do Classroom Experiments Improve Economic Education in High Schools?

    Science.gov (United States)

    Eisenkopf, Gerald; Sulser, Pascal A.

    2016-01-01

    The authors present results from a comprehensive field experiment at Swiss high schools in which they compare the effectiveness of teaching methods in economics. They randomly assigned classes into an experimental and a conventional teaching group, or a control group that received no specific instruction. Both teaching treatments improve economic…

  3. Nitrates and bone turnover (NABT) - trial to select the best nitrate preparation: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Bucur, Roxana C; Reid, Lauren S; Hamilton, Celeste J; Cummings, Steven R; Jamal, Sophie A

    2013-09-08

    'comparisons with the best' approach for data analyses, as this strategy allows practical considerations of ease of use and tolerability to guide selection of the preparation for future studies. Data from this protocol will be used to develop a randomized, controlled trial of nitrates to prevent osteoporotic fractures. ClinicalTrials.gov Identifier: NCT01387672. Controlled-Trials.com: ISRCTN08860742.

  4. Influence of Random Inclusion of Coconut Fibres on the Short term Strength of Highly Compressible Clay

    Science.gov (United States)

    Ramani Sujatha, Evangelin; SaiSree, S.; Prabalini, C.; Aysha Farsana, Z.

    2017-07-01

    The choice of natural fibres for soil stabilization provides an economical, safe and eco-friendly alternative to improve the properties of soil, and is an important step toward sustainable development. An attempt was made to study the influence of the random addition of untreated coconut fibres on the short-term strength of soil, its stress-strain behavior, compaction characteristics and index properties. The soil selected for the study is a highly compressible clay with a liquid limit of 52.5% and a plasticity index of 38%. The soil has no organic content. The study reveals that the compaction curves tend to shift to the right with the addition of fibres, indicating more plastic behavior. The addition of fibres also reorients the soil structure in a more dispersed fashion. A significant increase in the unconfined compressive strength is also observed: an increase of nearly 51% at 0.75% coir inclusion. The stress-strain behavior of the soil shows a shift toward more plastic behavior. The mode of failure of the soil specimen is by cracking; with fibre inclusion, the length of the failure cracks is restrained as the fibres tend to hold the cracks together, resulting in shorter cracks with significant bulging of the specimen at failure.

  5. Performance of Universal Adhesive in Primary Molars After Selective Removal of Carious Tissue: An 18-Month Randomized Clinical Trial.

    Science.gov (United States)

    Lenzi, Tathiane Larissa; Pires, Carine Weber; Soares, Fabio Zovico Maxnuck; Raggio, Daniela Prócida; Ardenghi, Thiago Machado; de Oliveira Rocha, Rachel

    2017-09-15

    To evaluate the 18-month clinical performance of a universal adhesive, applied under different adhesion strategies, after selective carious tissue removal in primary molars. Forty-four subjects (five to 10 years old) contributed 90 primary molars presenting moderately deep dentin carious lesions on occlusal or occluso-proximal surfaces, which were randomly assigned to either the self-etch or the etch-and-rinse protocol of Scotchbond Universal Adhesive (3M ESPE). Resin composite was incrementally inserted for all restorations. Restorations were evaluated at one, six, 12, and 18 months using the modified United States Public Health Service criteria. Survival estimates for the restorations' longevity were evaluated using the Kaplan-Meier method, and multivariate Cox regression analysis with shared frailty was used to assess the factors associated with failures (P < .05). The adhesion strategy did not influence the restorations' longevity (P = .06; 72.2 percent and 89.7 percent with etch-and-rinse and self-etch mode, respectively). Self-etch and etch-and-rinse strategies did not influence the clinical behavior of the universal adhesive used in primary molars after selective carious tissue removal, although there was a tendency toward a better outcome with the self-etch strategy.

  6. Analysis of swaps in Radix selection

    DEFF Research Database (Denmark)

    Elmasry, Amr Ahmed Abd Elmoneim; Mahmoud, Hosam

    2011-01-01

    Radix Sort is a sorting algorithm based on analyzing digital data. We study the number of swaps made by Radix Select (a one-sided version of Radix Sort) to find an element with a randomly selected rank. This kind of grand average provides a smoothing over all individual distributions for specific...

  7. High density submicron magnetoresistive random access memory (invited)

    Science.gov (United States)

    Tehrani, S.; Chen, E.; Durlam, M.; DeHerrera, M.; Slaughter, J. M.; Shi, J.; Kerszykowski, G.

    1999-04-01

    Various giant magnetoresistance material structures were patterned and studied for their potential as memory elements. The preferred memory element, based on pseudo-spin valve structures, was designed with two magnetic stacks (NiFeCo/CoFe) of different thickness with Cu as an interlayer. The difference in thickness results in dissimilar switching fields due to the shape anisotropy at deep submicron dimensions. It was found that a lower switching current can be achieved when the bits have a word line that wraps around the bit 1.5 times. Submicron memory elements integrated with complementary metal-oxide-semiconductor (CMOS) transistors maintained their characteristics and no degradation to the CMOS devices was observed. Selectivity between memory elements in high-density arrays was demonstrated.

  8. When selection ratios are high: predicting the expatriation willingness of prospective domestic entry-level job applicants

    NARCIS (Netherlands)

    Mol, S.T.; Born, M.P.; Willemsen, M.E.; van der Molen, H.T.; Derous, E.

    2009-01-01

    High expatriate selection ratios thwart the ability of multinational organizations to select expatriates. Reducing the selection ratio may be accomplished by selecting those applicants for entry level domestic positions who have expatriate aspirations. Regression analyses conducted on data from a

  9. High-efficiency single cell encapsulation and size selective capture of cells in picoliter droplets based on hydrodynamic micro-vortices.

    Science.gov (United States)

    Kamalakshakurup, Gopakumar; Lee, Abraham P

    2017-12-05

    Single cell analysis has emerged as a paradigm shift in cell biology to understand the heterogeneity of individual cells in a clone for pathological interrogation. Microfluidic droplet technology is a compelling platform to perform single cell analysis by encapsulating single cells inside picoliter-nanoliter (pL-nL) volume droplets. However, one of the primary challenges for droplet-based single cell assays is single cell encapsulation in droplets, currently achieved either randomly, dictated by Poisson statistics, or by hydrodynamic techniques. In this paper, we present an interfacial hydrodynamic technique which initially traps the cells in micro-vortices and later releases them one-to-one into the droplets, controlled by the width of the outer streamline that separates the vortex from the flow through the streaming passage adjacent to the aqueous-oil interface (d_gap). One-to-one encapsulation is achieved at a d_gap equal to the radius of the cell, whereas complete trapping of the cells is realized at a d_gap smaller than the radius of the cell. The unique features of this technique are that it can perform (1) high-efficiency single cell encapsulation and (2) size-selective capture of cells, at low cell loading densities. Here we demonstrate these two capabilities with a 50% single cell encapsulation efficiency and size-selective separation of platelets, RBCs and WBCs from a 10× diluted blood sample (WBC capture efficiency at 70%). The results suggest a passive, hydrodynamic micro-vortex-based technique capable of performing high-efficiency single cell encapsulation for cell-based assays.

  10. In the eye of the beholder: Inhomogeneous distribution of high-resolution shapes within the random-walk ensemble

    Science.gov (United States)

    Müller, Christian L.; Sbalzarini, Ivo F.; van Gunsteren, Wilfred F.; Žagrović, Bojan; Hünenberger, Philippe H.

    2009-06-01

    The concept of high-resolution shapes (also referred to as folds or states, depending on the context) of a polymer chain plays a central role in polymer science, structural biology, bioinformatics, and biopolymer dynamics. However, although the idea of shape is intuitively very useful, there is no unambiguous mathematical definition for this concept. In the present work, the distributions of high-resolution shapes within the ideal random-walk ensembles with N = 3,…,6 beads (or up to N = 10 for some properties) are investigated using a systematic (grid-based) approach based on a simple working definition of shapes relying on the root-mean-square atomic positional deviation as a metric (i.e., to define the distance between pairs of structures) and a single cutoff criterion for the shape assignment. Although the random-walk ensemble appears to represent the paramount of homogeneity and randomness, this analysis reveals that the distribution of shapes within this ensemble, i.e., in the total absence of interatomic interactions characteristic of a specific polymer (beyond the generic connectivity constraint), is significantly inhomogeneous. In particular, a specific (densest) shape occurs with a local probability that is 1.28, 1.79, 2.94, and 10.05 times (N = 3,…,6) higher than the corresponding average over all possible shapes (these results can tentatively be extrapolated to a factor as large as about 10^28 for N = 100). The qualitative results of this analysis lead to a few rather counterintuitive suggestions, namely, that, e.g., (i) a fold classification analysis applied to the random-walk ensemble would lead to the identification of random-walk "folds;" (ii) a clustering analysis applied to the random-walk ensemble would also lead to the identification of random-walk "states" and associated relative free energies; and (iii) a random-walk ensemble of polymer chains could lead to well-defined diffraction patterns in hypothetical fiber or crystal diffraction experiments.
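
A minimal Monte-Carlo sketch of the working definition used here, assuming Kabsch superposition for the RMSD metric and a greedy single-cutoff shape assignment (the cutoff and ensemble size are illustrative; the paper's systematic grid-based enumeration is not reproduced):

```python
import numpy as np

def random_walk(n_beads, rng):
    """Ideal random walk: unit steps in uniformly random 3D directions."""
    steps = rng.normal(size=(n_beads - 1, 3))
    steps /= np.linalg.norm(steps, axis=1, keepdims=True)
    return np.vstack([np.zeros(3), np.cumsum(steps, axis=0)])

def kabsch_rmsd(P, Q):
    """RMSD between two bead chains after optimal rigid superposition."""
    P = P - P.mean(axis=0)
    Q = Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(U @ Vt))      # avoid improper (mirror) rotations
    R = U @ np.diag([1.0, 1.0, d]) @ Vt
    return np.sqrt(((P @ R - Q) ** 2).sum() / len(P))

def assign_shapes(walks, cutoff):
    """Greedy assignment: a walk joins the first representative within the
    RMSD cutoff, otherwise it founds a new shape."""
    reps, labels = [], []
    for w in walks:
        for i, r in enumerate(reps):
            if kabsch_rmsd(w, r) < cutoff:
                labels.append(i)
                break
        else:
            reps.append(w)
            labels.append(len(reps) - 1)
    return labels, reps

rng = np.random.default_rng(0)
walks = [random_walk(6, rng) for _ in range(200)]   # N = 6 beads
labels, reps = assign_shapes(walks, cutoff=0.7)
```

Tallying how many walks fall into each shape (via `labels`) gives the kind of occupancy distribution whose inhomogeneity the abstract quantifies.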

  11. A 3-week multimodal intervention involving high-intensity interval training in female cancer survivors: a randomized controlled trial

    OpenAIRE

    Schmitt, Joachim; Lindner, Nathalie; Reuss-Borst, Monika; Holmberg, Hans-Christer; Sperlich, Billy

    2016-01-01

    To compare the effects of a 3-week multimodal rehabilitation involving supervised high-intensity interval training (HIIT) on female breast cancer survivors with respect to key variables of aerobic fitness, body composition, energy expenditure, cancer-related fatigue, and quality of life to those of a standard multimodal rehabilitation program. A randomized controlled trial design was administered. Twenty-eight women who had been treated for cancer were randomly assigned to either a ...

  12. Phenylmercuric hydroxide. A highly selective reagent for the hydration of nonconjugated terminal alkynes

    International Nuclear Information System (INIS)

    Janout, V.; Regen, S.L.

    1982-01-01

    This article describes an unusual and highly selective method for hydrating nonconjugated terminal alkynes based on the use of phenylmercuric hydroxide as a reagent. Unlike classical mercury catalyzed procedures, sigma-bonded mercury acetylides are formed initially as stable intermediates and subsequently reacted with water under neutral pH to form the corresponding methyl ketone. Isolated yields which have been obtained by using this approach lie in the range of 49-65%. The high selectivity toward nonconjugated terminal alkynes which characterizes the procedure described herein should make it a useful supplement to existing hydration methods

  13. High carotenoids content can enhance resistance of selected Pinctada fucata families to high temperature stress.

    Science.gov (United States)

    Meng, Zihao; Zhang, Bo; Liu, Baosuo; Li, Haimei; Fan, Sigang; Yu, Dahui

    2017-02-01

    Carotenoids are a class of natural antioxidants widely found in aquatic organisms, and they have significant effects on the growth, survival, and immunity of these organisms. To investigate the mechanisms of carotenoids in high temperature resistance, we observed the immune response of selected pearl oyster Pinctada fucata (Akoya pearl oyster) families with different carotenoids contents to high temperature stress. The results indicated that the survival rate (SR) of P. fucata decreased significantly with increase in temperature from 26 °C to 34 °C and with the decrease of total carotenoids content (TCC); when the TCC was higher, the SR tended to be higher. TCC and total antioxidant capacity (TAC) decreased significantly at 30 °C with increasing stress time. Correlation analysis indicated that TAC was positively and linearly correlated with TCC, and SR was S-type correlated with TCC and TAC. Immune analysis indicated that levels of superoxide dismutase (SOD), catalase (CAT), and malondialdehyde (MDA) in selected families (with higher TCC) under temperature stress (at 30 °C) were generally significantly lower than in the control group (with lowest TCC) and from 0 to 96 h, the levels of each of these substances varied significantly. Levels of SOD, CAT, and MDA within each family first rose from 0 to 3 h, then decreased to their lowest point after 24 h, and then rose again to their highest levels at 96 h. When TCC was higher, the levels of SOD, CAT, and MDA tended to be lower. These findings indicated that carotenoids play an important role in improving survival rates of P. fucata under high temperature stress by enhancing the animals' antioxidant system, and could serve as an index for breeding stress-resistant lines in selective breeding practices. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Integral Histogram with Random Projection for Pedestrian Detection.

    Directory of Open Access Journals (Sweden)

    Chang-Hua Liu

    Full Text Available In this paper, we give a systematic study to report several deep insights into the HOG, one of the most widely used features in modern computer vision and image processing applications. We first show that its magnitudes of gradient can be randomly projected with a random matrix. To handle over-fitting, an integral histogram based on the differences of randomly selected blocks is proposed. The experiments show that both the random projection and the integral histogram clearly outperform the HOG feature. Finally, the two ideas are combined into a new descriptor termed IHRP, which outperforms the HOG feature with fewer dimensions and higher speed.
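
    The random projection of HOG gradient magnitudes can be sketched with a Johnson–Lindenstrauss-style ±1 matrix. This is a generic construction, not the paper's exact scheme; the dimensions and the toy histogram below are illustrative:

```python
import math
import random

def random_projection(x, k, seed=0):
    """Project vector x to k dimensions with a seeded random +/-1 matrix,
    scaled by 1/sqrt(k) so pairwise distances are approximately preserved."""
    rng = random.Random(seed)
    d = len(x)
    R = [[rng.choice((-1, 1)) for _ in range(d)] for _ in range(k)]
    return [sum(R[i][j] * x[j] for j in range(d)) / math.sqrt(k) for i in range(k)]

# Toy 9-bin gradient-magnitude histogram for one HOG block (values invented)
hog_block = [0.4, 0.1, 0.0, 0.7, 0.2, 0.9, 0.3, 0.5, 0.6]
proj = random_projection(hog_block, 4)
assert len(proj) == 4
```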

  15. Territory and nest site selection patterns by Grasshopper Sparrows in southeastern Arizona

    Science.gov (United States)

    Ruth, Janet M.; Skagen, Susan K.

    2017-01-01

    Grassland bird populations are showing some of the greatest rates of decline of any North American birds, prompting measures to protect and improve important habitat. We assessed how vegetation structure and composition, habitat features often targeted for management, affected territory and nest site selection by Grasshopper Sparrows (Ammodramus savannarum ammolegus) in southeastern Arizona. To identify features important to males establishing territories, we compared vegetation characteristics of known territories and random samples on 2 sites over 5 years. We examined habitat selection patterns of females by comparing characteristics of nest sites with territories over 3 years. Males selected territories in areas of sparser vegetation structure and more tall shrubs (>2 m) than random plots on the site with low shrub densities. Males did not select territories based on the proportion of exotic grasses. Females generally located nest sites in areas with lower small shrub (1–2 m tall) densities than territories overall when possible and preferentially selected native grasses for nest construction. Whether habitat selection was apparent depended upon the range of vegetation structure that was available. We identified an upper threshold above which grass structure seemed to be too high and dense for Grasshopper Sparrows. Our results suggest that some management that reduces vegetative structure may benefit this species in desert grasslands at the nest and territory scale. However, we did not assess initial male habitat selection at a broader landscape scale where their selection patterns may be different and could be influenced by vegetation density and structure outside the range of values sampled in this study.

  16. Unwilling or Unable to Cheat? Evidence from a Randomized Tax Audit Experiment in Denmark

    OpenAIRE

    Henrik J. Kleven; Martin B. Knudsen; Claus T. Kreiner; Søren Pedersen; Emmanuel Saez

    2010-01-01

    This paper analyzes a randomized tax enforcement experiment in Denmark. In the base year, a stratified and representative sample of over 40,000 individual income tax filers was selected for the experiment. Half of the tax filers were randomly selected to be thoroughly audited, while the rest were deliberately not audited. The following year, "threat-of-audit" letters were randomly assigned and sent to tax filers in both groups. Using comprehensive administrative tax data, we present four main...

  17. Directional enhancement of selected high-order-harmonics from intense laser irradiated blazed grating targets.

    Science.gov (United States)

    Zhang, Guobo; Chen, Min; Liu, Feng; Yuan, Xiaohui; Weng, Suming; Zheng, Jun; Ma, Yanyun; Shao, Fuqiu; Sheng, Zhengming; Zhang, Jie

    2017-10-02

    Relativistically intense laser solid target interaction has been proved to be a promising way to generate high-order harmonics, which can be used to diagnose ultrafast phenomena. However, their emission direction and spectra still lack tunability. Based upon two-dimensional particle-in-cell simulations, we show that directional enhancement of selected high-order-harmonics can be realized using blazed grating targets. Such targets can select harmonics with frequencies being integer times of the grating frequency. Meanwhile, the radiation intensity and emission area of the harmonics are increased. The emission direction is controlled by tailoring the local blazed structure. Theoretical and electron dynamics analysis for harmonics generation, selection and directional enhancement from the interaction between multi-cycle laser and grating target are carried out. These studies will benefit the generation and application of laser plasma-based high order harmonics.

  18. Optimal classifier selection and negative bias in error rate estimation: an empirical study on high-dimensional prediction

    Directory of Open Access Journals (Sweden)

    Boulesteix Anne-Laure

    2009-12-01

    Full Text Available Abstract Background In biometric practice, researchers often apply a large number of different methods in a "trial-and-error" strategy to get as much as possible out of their data and, due to publication pressure or pressure from the consulting customer, present only the most favorable results. This strategy may induce a substantial optimistic bias in prediction error estimation, which is quantitatively assessed in the present manuscript. The focus of our work is on class prediction based on high-dimensional data (e.g. microarray data, since such analyses are particularly exposed to this kind of bias. Methods In our study we consider a total of 124 variants of classifiers (possibly including variable selection or tuning steps within a cross-validation evaluation scheme. The classifiers are applied to original and modified real microarray data sets, some of which are obtained by randomly permuting the class labels to mimic non-informative predictors while preserving their correlation structure. Results We assess the minimal misclassification rate over the different variants of classifiers in order to quantify the bias arising when the optimal classifier is selected a posteriori in a data-driven manner. The bias resulting from the parameter tuning (including gene selection parameters as a special case and the bias resulting from the choice of the classification method are examined both separately and jointly. Conclusions The median minimal error rate over the investigated classifiers was as low as 31% and 41% based on permuted uninformative predictors from studies on colon cancer and prostate cancer, respectively. We conclude that the strategy to present only the optimal result is not acceptable because it yields a substantial bias in error rate estimation, and suggest alternative approaches for properly reporting classification accuracy.
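
    The optimistic bias from a-posteriori classifier selection can be illustrated with a toy simulation (not the study's actual protocol): on permuted, uninformative labels every classifier's true error is 50%, yet reporting only the minimum observed test error looks far better.

```python
import random

def min_observed_error(n_classifiers, n_samples, seed=0):
    """On labels that are pure noise (true error 50% for every classifier),
    report the minimum test error across all classifier variants --
    i.e., the a-posteriori "optimal classifier" selection."""
    rng = random.Random(seed)
    best = 1.0
    for _ in range(n_classifiers):
        errors = sum(rng.random() < 0.5 for _ in range(n_samples))
        best = min(best, errors / n_samples)
    return best

# With 124 variants and a small test set, the reported "best" error
# falls well below the true 50%, purely by chance.
assert min_observed_error(124, 60) < 0.45
```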

  19. High-Speed Device-Independent Quantum Random Number Generation without a Detection Loophole

    Science.gov (United States)

    Liu, Yang; Yuan, Xiao; Li, Ming-Han; Zhang, Weijun; Zhao, Qi; Zhong, Jiaqiang; Cao, Yuan; Li, Yu-Huai; Chen, Luo-Kan; Li, Hao; Peng, Tianyi; Chen, Yu-Ao; Peng, Cheng-Zhi; Shi, Sheng-Cai; Wang, Zhen; You, Lixing; Ma, Xiongfeng; Fan, Jingyun; Zhang, Qiang; Pan, Jian-Wei

    2018-01-01

    Quantum mechanics provides the means of generating genuine randomness that is impossible with deterministic classical processes. Remarkably, the unpredictability of randomness can be certified in a manner that is independent of implementation devices. Here, we present an experimental study of device-independent quantum random number generation based on a detection-loophole-free Bell test with entangled photons. In the randomness analysis, without the independent identical distribution assumption, we consider the worst-case scenario in which the adversary launches the most powerful attacks against the quantum device. After considering statistical fluctuations and applying an 80 Gb × 45.6 Mb Toeplitz matrix hashing, we achieve a final random bit rate of 114 bits/s, with a failure probability less than 10^-5. This marks a critical step towards realistic applications in cryptography and fundamental physics tests.
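
    The Toeplitz-matrix hashing step used for randomness extraction can be sketched at toy scale (the experiment's 80 Gb × 45.6 Mb matrix is vastly larger; the sizes and bit values below are illustrative):

```python
import random

def toeplitz_hash(raw_bits, diag_bits, m):
    """Extract m near-uniform bits from n raw bits via an m x n Toeplitz
    matrix over GF(2); diag_bits holds the m+n-1 entries defining the matrix."""
    n = len(raw_bits)
    assert len(diag_bits) == m + n - 1
    out = []
    for i in range(m):
        acc = 0
        for j in range(n):
            # Toeplitz structure: entry (i, j) depends only on i - j
            acc ^= diag_bits[i - j + n - 1] & raw_bits[j]
        out.append(acc)
    return out

rng = random.Random(1)
raw = [rng.randint(0, 1) for _ in range(64)]            # raw detector bits
seed = [rng.randint(0, 1) for _ in range(16 + 64 - 1)]  # seed defining the matrix
extracted = toeplitz_hash(raw, seed, 16)
assert len(extracted) == 16 and all(b in (0, 1) for b in extracted)
```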

  20. Quantum random number generator

    Science.gov (United States)

    Soubusta, Jan; Haderka, Ondrej; Hendrych, Martin

    2001-03-01

    Since reflection or transmission of a quantum particle on a beamsplitter is an inherently random quantum process, a device built on this principle does not suffer from the drawbacks of either pseudo-random computer generators or classical noise sources. Nevertheless, a number of physical conditions necessary for high-quality random number generation must be satisfied. Luckily, in a quantum optics realization they can be well controlled. We present a simple random number generator based on the division of weak light pulses on a beamsplitter. The randomness of the generated bit stream is supported by passing the data through a series of 15 statistical tests. The device generates at a rate of 109.7 kbit/s.

  1. Stochastic space interval as a link between quantum randomness and macroscopic randomness?

    Science.gov (United States)

    Haug, Espen Gaarder; Hoff, Harald

    2018-03-01

    For many stochastic phenomena, we observe statistical distributions that have fat-tails and high-peaks compared to the Gaussian distribution. In this paper, we will explain how observable statistical distributions in the macroscopic world could be related to the randomness in the subatomic world. We show that fat-tailed (leptokurtic) phenomena in our everyday macroscopic world are ultimately rooted in Gaussian - or very close to Gaussian-distributed subatomic particle randomness, but they are not, in a strict sense, Gaussian distributions. By running a truly random experiment over a three and a half-year period, we observed a type of random behavior in trillions of photons. Combining our results with simple logic, we find that fat-tailed and high-peaked statistical distributions are exactly what we would expect to observe if the subatomic world is quantized and not continuously divisible. We extend our analysis to the fact that one typically observes fat-tails and high-peaks relative to the Gaussian distribution in stocks and commodity prices and many aspects of the natural world; these instances are all observable and documentable macro phenomena that strongly suggest that the ultimate building blocks of nature are discrete (e.g. they appear in quanta).

  2. Methods for model selection in applied science and engineering.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2004-10-01

    Mathematical models are developed and used to study the properties of complex systems and/or modify these systems to satisfy some performance requirements in just about every area of applied science and engineering. A particular reason for developing a model, e.g., performance assessment or design, is referred to as the model use. Our objective is the development of a methodology for selecting a model that is sufficiently accurate for an intended use. Information on the system being modeled is, in general, incomplete, so that there may be two or more models consistent with the available information. The collection of these models is called the class of candidate models. Methods are developed for selecting the optimal member from a class of candidate models for the system. The optimal model depends on the available information, the selected class of candidate models, and the model use. Classical methods for model selection, including the method of maximum likelihood and Bayesian methods, as well as a method employing a decision-theoretic approach, are formulated to select the optimal model for numerous applications. There is no requirement that the candidate models be random. Classical methods for model selection ignore model use and require data to be available. Examples are used to show that these methods can be unreliable when data is limited. The decision-theoretic approach to model selection does not have these limitations, and model use is included through an appropriate utility function. This is especially important when modeling high risk systems, where the consequences of using an inappropriate model for the system can be disastrous. The decision-theoretic method for model selection is developed and applied for a series of complex and diverse applications. These include the selection of the: (1) optimal order of the polynomial chaos approximation for non-Gaussian random variables and stationary stochastic processes, (2) optimal pressure load model to be

  3. Selection of common bean lines with high grain yield and high grain calcium and iron concentrations

    Directory of Open Access Journals (Sweden)

    Nerinéia Dalfollo Ribeiro

    2014-02-01

    Full Text Available Genetic improvement of common bean nutritional quality has advantages in marketing and can contribute to society as a food source. The objective of this study was to evaluate the genetic variability for grain yield, calcium and iron concentrations in grains of inbred common bean lines obtained by different breeding methods. For this, 136 F7 inbred lines were obtained using the Pedigree method and 136 F7 inbred lines were obtained using the Single-Seed Descent (SSD method. The lines showed genetic variability for grain yield, and concentrations of calcium and iron independently of the method of advancing segregating populations. The Pedigree method allows obtaining a greater number of lines with high grain yield. Selection using the SSD method allows the identification of a larger number of lines with high concentrations of calcium and iron in grains. Weak negative correlations were found between grain yield and calcium concentration (r = -0.0994 and grain yield and iron concentration (r = -0.3926. Several lines show genetic superiority for grain yield and concentrations of calcium and iron in grains and their selection can result in new common bean cultivars with high nutritional quality.

  4. Bayesian dose selection design for a binary outcome using restricted response adaptive randomization.

    Science.gov (United States)

    Meinzer, Caitlyn; Martin, Renee; Suarez, Jose I

    2017-09-08

    In phase II trials, the most efficacious dose is usually not known. Moreover, given limited resources, it is difficult to robustly identify a dose while also testing for a signal of efficacy that would support a phase III trial. Recent designs have sought to be more efficient by exploring multiple doses through the use of adaptive strategies. However, the added flexibility may potentially increase the risk of making incorrect assumptions and reduce the total amount of information available across the dose range as a function of imbalanced sample size. To balance these challenges, a novel placebo-controlled design is presented in which a restricted Bayesian response adaptive randomization (RAR) is used to allocate a majority of subjects to the optimal dose of active drug, defined as the dose with the lowest probability of poor outcome. However, the allocation between subjects who receive active drug or placebo is held constant to retain the maximum possible power for a hypothesis test of overall efficacy comparing the optimal dose to placebo. The design properties and optimization of the design are presented in the context of a phase II trial for subarachnoid hemorrhage. For a fixed total sample size, a trade-off exists between the ability to select the optimal dose and the probability of rejecting the null hypothesis. This relationship is modified by the allocation ratio between active and control subjects, the choice of RAR algorithm, and the number of subjects allocated to an initial fixed allocation period. While a responsive RAR algorithm improves the ability to select the correct dose, there is an increased risk of assigning more subjects to a worse arm as a function of ephemeral trends in the data. A subarachnoid treatment trial is used to illustrate how this design can be customized for specific objectives and available data. 
Bayesian adaptive designs are a flexible approach to addressing multiple questions surrounding the optimal dose for treatment efficacy.
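
    A minimal sketch of a restricted response adaptive randomization rule, assuming Beta–Bernoulli posteriors with uniform priors and Thompson-style sampling (the trial's actual RAR algorithm, priors, and allocation fractions are not given in the abstract):

```python
import random

def rar_allocation(successes, failures, placebo_frac=0.25, n_draws=2000, seed=0):
    """Restricted RAR sketch: the placebo share stays fixed, while active-dose
    shares are proportional to the posterior probability (Beta-Bernoulli)
    that each dose has the lowest poor-outcome rate."""
    rng = random.Random(seed)
    k = len(successes)
    wins = [0] * k
    for _ in range(n_draws):
        # Sample each dose's poor-outcome probability from its posterior
        draws = [rng.betavariate(1 + failures[d], 1 + successes[d]) for d in range(k)]
        wins[draws.index(min(draws))] += 1
    active = 1.0 - placebo_frac
    return [placebo_frac] + [active * w / n_draws for w in wins]

# Dose 2 shows fewer poor outcomes, so it receives the larger active-arm share
probs = rar_allocation(successes=[5, 8], failures=[5, 2])
assert abs(sum(probs) - 1.0) < 1e-9
assert probs[2] > probs[1]
```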

  5. Distributional and efficiency results for subset selection

    NARCIS (Netherlands)

    Laan, van der P.

    1996-01-01

    Assume k (k ≥ 2) populations are given. The associated independent random variables have continuous distribution functions with an unknown location parameter. The statistical selection goal is to select a non-empty subset which contains the best population, that is the population with

  6. A Ranking Approach to Genomic Selection.

    Science.gov (United States)

    Blondel, Mathieu; Onogi, Akio; Iwata, Hiroyoshi; Ueda, Naonori

    2015-01-01

    Genomic selection (GS) is a recent selective breeding method which uses predictive models based on whole-genome molecular markers. Until now, existing studies formulated GS as the problem of modeling an individual's breeding value for a particular trait of interest, i.e., as a regression problem. To assess predictive accuracy of the model, the Pearson correlation between observed and predicted trait values was used. In this paper, we propose to formulate GS as the problem of ranking individuals according to their breeding value. Our proposed framework allows us to employ machine learning methods for ranking which had previously not been considered in the GS literature. To assess ranking accuracy of a model, we introduce a new measure originating from the information retrieval literature called normalized discounted cumulative gain (NDCG). NDCG rewards more strongly models which assign a high rank to individuals with high breeding value. Therefore, NDCG reflects a prerequisite objective in selective breeding: accurate selection of individuals with high breeding value. We conducted a comparison of 10 existing regression methods and 3 new ranking methods on 6 datasets, consisting of 4 plant species and 25 traits. Our experimental results suggest that tree-based ensemble methods including McRank, Random Forests and Gradient Boosting Regression Trees achieve excellent ranking accuracy. RKHS regression and RankSVM also achieve good accuracy when used with an RBF kernel. Traditional regression methods such as Bayesian lasso, wBSR and BayesC were found less suitable for ranking. Pearson correlation was found to correlate poorly with NDCG. Our study suggests two important messages. First, ranking methods are a promising research direction in GS. Second, NDCG can be a useful evaluation measure for GS.
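
    NDCG can be computed directly. The sketch below uses the plain-gain variant, gain = relevance / log2(rank + 1); the paper may use the exponential-gain form instead, so treat the exact formula as an assumption:

```python
import math

def ndcg(relevances, k=None):
    """Normalized discounted cumulative gain for a ranked list whose i-th
    entry is the true breeding value of the individual ranked i-th."""
    k = k or len(relevances)
    def dcg(rels):
        return sum(r / math.log2(i + 2) for i, r in enumerate(rels[:k]))
    ideal = dcg(sorted(relevances, reverse=True))
    return dcg(relevances) / ideal if ideal > 0 else 0.0

# A ranking that places the highest breeding values first scores 1.0 ...
assert ndcg([3, 2, 1]) == 1.0
# ... and misranking the top individuals lowers the score
assert ndcg([1, 2, 3]) < 1.0
```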

  7. High selectivity ZIF-93 hollow fiber membranes for gas separation.

    Science.gov (United States)

    Cacho-Bailo, Fernando; Caro, Guillermo; Etxeberría-Benavides, Miren; Karvan, Oğuz; Téllez, Carlos; Coronas, Joaquín

    2015-06-30

    Zeolitic imidazolate framework-93 (ZIF-93) continuous membranes were synthesized on the inner side of P84 co-polyimide hollow fiber supports by microfluidics. MOFs and polymers showed high compatibility and the membrane exhibited H2-CH4 and CO2-CH4 separation selectivities of 97 (100 °C) and 17 (35 °C), respectively.

  8. Enhancing Security of Double Random Phase Encoding Based on Random S-Box

    Science.gov (United States)

    Girija, R.; Singh, Hukum

    2018-06-01

    In this paper, we propose a novel asymmetric cryptosystem for double random phase encoding (DRPE) using a random S-Box. Utilising an S-Box on its own is not reliable, and DRPE does not support non-linearity, so our system unites the effectiveness of the S-Box with an asymmetric DRPE system (through the Fourier transform). The uniqueness of the proposed cryptosystem lies in employing a highly sensitive dynamic S-Box for our DRPE system. The randomness and scalability achieved by the applied technique are an additional feature of the proposed solution. The strength of the random S-Box is investigated in terms of performance parameters such as non-linearity, the strict avalanche criterion, the bit independence criterion, linear and differential approximation probabilities, etc. S-Boxes convey non-linearity to cryptosystems, which is a significant parameter and essential for DRPE. The strength of the proposed cryptosystem has been analysed using various parameters such as MSE, PSNR, correlation coefficient analysis, noise analysis, SVD analysis, etc. Experimental results are presented in detail to show that the proposed cryptosystem is highly secure.
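
    A random bijective S-Box of the kind the scheme relies on can be generated as a seeded permutation of the byte values (a generic construction; the paper's actual S-Box generation procedure is not detailed in the abstract):

```python
import random

def random_sbox(seed, size=256):
    """A random bijective S-Box: a seeded shuffle of the byte values,
    giving an invertible nonlinear substitution layer."""
    rng = random.Random(seed)
    box = list(range(size))
    rng.shuffle(box)
    return box

sbox = random_sbox(seed=42)
inverse = [0] * 256
for i, v in enumerate(sbox):
    inverse[v] = i
assert sorted(sbox) == list(range(256))                # bijective
assert all(inverse[sbox[i]] == i for i in range(256))  # invertible
```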

  9. Random Number Generation for High Performance Computing

    Science.gov (United States)

    2015-01-01

    number streams, a quality metric for the parallel random number streams.

  10. Material Selection and Characterization for High Gradient RF Applications

    CERN Document Server

    Arnau-Izquierdo, G; Heikkinen, S; Ramsvik, T; Sgobba, Stefano; Taborelli, M; Wuensch, W

    2007-01-01

    The selection of candidate materials for the accelerating cavities of the Compact Linear Collider (CLIC) is carried out in parallel with high-power RF testing. The maximum DC breakdown fields of copper, copper alloys, refractory metals, aluminium and titanium have been measured with a dedicated setup. Higher maximum fields are obtained for refractory metals and for titanium, which exhibits, however, important damage after conditioning. The fatigue behaviour of copper alloys has been studied for surface and bulk by pulsed laser irradiation and ultrasonic excitation, respectively. The selected copper alloys show consistently higher fatigue resistance than copper in both experiments. In order to obtain the best local properties in the device, a possible solution is a bi-metallic assembly. Junctions of molybdenum and copper-zirconium UNS C15000 alloy, achieved by HIP (Hot Isostatic Pressing) diffusion bonding or explosion bonding, were evaluated for their mechanical strength. The reliability of the results obtained wit...

  11. Randomized Prediction Games for Adversarial Machine Learning.

    Science.gov (United States)

    Rota Bulo, Samuel; Biggio, Battista; Pillai, Ignazio; Pelillo, Marcello; Roli, Fabio

    In spam and malware detection, attackers exploit randomization to obfuscate malicious data and increase their chances of evading detection at test time, e.g., malware code is typically obfuscated using random strings or byte sequences to hide known exploits. Interestingly, randomization has also been proposed to improve security of learning algorithms against evasion attacks, as it results in hiding information about the classifier to the attacker. Recent work has proposed game-theoretical formulations to learn secure classifiers, by simulating different evasion attacks and modifying the classification function accordingly. However, both the classification function and the simulated data manipulations have been modeled in a deterministic manner, without accounting for any form of randomization. In this paper, we overcome this limitation by proposing a randomized prediction game, namely, a noncooperative game-theoretic formulation in which the classifier and the attacker make randomized strategy selections according to some probability distribution defined over the respective strategy set. We show that our approach allows one to improve the tradeoff between attack detection and false alarms with respect to the state-of-the-art secure classifiers, even against attacks that are different from those hypothesized during design, on application examples including handwritten digit recognition, spam, and malware detection.

  12. Pseudo-random number generation using a 3-state cellular automaton

    Science.gov (United States)

    Bhattacharjee, Kamalika; Paul, Dipanjyoti; Das, Sukanta

    This paper investigates the potential of a 3-neighborhood, 3-state cellular automaton (CA) under periodic boundary conditions for pseudo-random number generation. Theoretical and empirical tests are performed on the numbers generated by the CA to assess its quality as a pseudo-random number generator (PRNG). We analyze the strengths and weaknesses of the proposed PRNG and conclude that the selected CA is a good random number generator.
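
    A 3-neighborhood, 3-state CA under a periodic boundary can be sketched as below. The rule here (left + 2·center + right mod 3) is an arbitrary placeholder, since the abstract does not give the selected CA's actual rule table:

```python
def ca_step(cells, rule):
    """One synchronous update of a 3-neighborhood, 3-state CA
    under a periodic boundary condition."""
    n = len(cells)
    return [rule(cells[(i - 1) % n], cells[i], cells[(i + 1) % n]) for i in range(n)]

def ca_prng(seed_cells, rule, steps, tap=0):
    """Emit one trit per step by sampling a fixed cell, a common CA-PRNG construction."""
    cells, out = list(seed_cells), []
    for _ in range(steps):
        cells = ca_step(cells, rule)
        out.append(cells[tap])
    return out

# Placeholder rule; the paper's rule table is not given in the abstract
rule = lambda left, center, right: (left + 2 * center + right) % 3
stream = ca_prng([0, 1, 2, 1, 0, 2, 2, 1], rule, 100)
assert len(stream) == 100 and all(t in (0, 1, 2) for t in stream)
```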

  13. "Open mesh" or "strictly selected population" recruitment? The experience of the randomized controlled MeMeMe trial.

    Science.gov (United States)

    Cortellini, Mauro; Berrino, Franco; Pasanisi, Patrizia

    2017-01-01

    Among randomized controlled trials (RCTs), trials for primary prevention require large samples and long follow-up to obtain a high-quality outcome; therefore the recruitment process and the drop-out rates largely dictate the adequacy of the results. We are conducting a Phase III trial on persons with metabolic syndrome to test the hypothesis that comprehensive lifestyle changes and/or metformin treatment prevents age-related chronic diseases (the MeMeMe trial, EudraCT number: 2012-005427-32, also registered on ClinicalTrials.gov [NCT02960711]). Here, we briefly analyze and discuss the reasons which may lead to participants dropping out of trials. In our experience, participants may back out of a trial for different reasons. Drug-induced side effects are certainly the most compelling reason. But what are the other reasons, relating to the participants' perception of the progress of the trial, which led them to withdraw after randomization? What about the time-dependent drop-out rate in primary prevention trials? The primary outcome of this analysis is the point of drop-out from the trial, defined as the time from the randomization date to the withdrawal date. Survival functions were non-parametrically estimated using the product-limit estimator. The curves were statistically compared using the log-rank test (P = 0.64, not significant). Researchers involved in primary prevention RCTs seem to have to deal with the paradox of the proverbial "short blanket syndrome". Recruiting only highly motivated candidates might be useful for the smooth progress of the trial but may lead to a very low enrollment rate. On the other hand, what about enrolling all the eligible subjects without considering their motivation? This might boost the enrollment rate, but it can lead to biased results on account of large proportions of drop-outs. Our experience suggests that participants do not change their mind depending on the allocation group (intervention or control). There is no single
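
    The non-parametric product-limit (Kaplan–Meier) estimate used for the drop-out curves can be computed as below (a textbook sketch with invented participant data, not the trial's dataset):

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) estimate of the survival function.
    times: time from randomization to withdrawal or censoring;
    events: 1 = dropped out, 0 = censored (still in the trial)."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk, surv, curve = len(times), 1.0, []
    for i in order:
        if events[i]:
            surv *= (at_risk - 1) / at_risk
            curve.append((times[i], surv))
        at_risk -= 1
    return curve

# Five participants (invented data); the one at t=4 is censored
curve = kaplan_meier([1, 2, 3, 4, 5], [1, 1, 1, 0, 1])
assert curve[0] == (1, 0.8)
assert abs(curve[-1][1]) < 1e-9
```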

  14. Selecting Optimal Feature Set in High-Dimensional Data by Swarm Search

    Directory of Open Access Journals (Sweden)

    Simon Fong

    2013-01-01

    Full Text Available Selecting the right set of features from data of high dimensionality for inducing an accurate classification model is a tough computational challenge. It is an almost NP-hard problem, as the number of feature combinations escalates exponentially with the number of features. Unfortunately in data mining, as well as in other engineering applications and bioinformatics, some data are described by a long array of features. Many feature subset selection algorithms have been proposed in the past, but not all of them are effective. Since it takes seemingly forever to exhaustively try every possible combination of features by brute force, stochastic optimization may be a solution. In this paper, we propose a new feature selection scheme called Swarm Search to find an optimal feature set by using metaheuristics. The advantage of Swarm Search is its flexibility in integrating any classifier into its fitness function and plugging in any metaheuristic algorithm to facilitate heuristic search. Simulation experiments are carried out by testing Swarm Search on several high-dimensional datasets, with different classification algorithms and various metaheuristic algorithms. The comparative experimental results show that Swarm Search is able to attain relatively low error rates in classification without shrinking the size of the feature subset to its minimum.
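
    The flavor of a metaheuristic subset search can be sketched with a simple seeded stochastic search over feature masks. This is a stand-in, not the paper's Swarm Search algorithm, and the fitness function is a toy scoring rule rather than a wrapped classifier:

```python
import random

def swarm_search(n_features, fitness, n_particles=12, n_iters=30, seed=0):
    """Minimal seeded stochastic subset search: each particle holds a feature
    mask and drifts toward the best mask found so far, with random mutation."""
    rng = random.Random(seed)
    particles = [[rng.random() < 0.5 for _ in range(n_features)]
                 for _ in range(n_particles)]
    best = max(particles, key=fitness)
    for _ in range(n_iters):
        for p in particles:
            # Copy each bit of the current best with prob 0.7, else flip it
            trial = [b if rng.random() < 0.7 else not b for b in best]
            if fitness(trial) > fitness(p):
                p[:] = trial
        best = max(particles + [best], key=fitness)  # best never degrades
    return best

# Toy fitness: features 0 and 3 are "useful", with a small size penalty
fitness = lambda mask: 2 * mask[0] + 2 * mask[3] - 0.1 * sum(mask)
best = swarm_search(6, fitness)
assert best[0] or best[3]
```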

  15. High-Q Defect-Free 2D Photonic Crystal Cavity from Random Localised Disorder

    Directory of Open Access Journals (Sweden)

    Kelvin Chung

    2014-07-01

    Full Text Available We propose a high-Q photonic crystal cavity formed by introducing random disorder to the central region of an otherwise defect-free photonic crystal (PhC) slab. Three-dimensional finite-difference time-domain simulations determine the frequency, quality factor, Q, and modal volume, V, of the localized modes formed by the disorder. Relatively large Purcell factors of 500–800 are calculated for these cavities, which can be achieved for a wide range of degrees of disorder.

  16. Hybrid feature selection for supporting lightweight intrusion detection systems

    Science.gov (United States)

    Song, Jianglong; Zhao, Wentao; Liu, Qiang; Wang, Xin

    2017-08-01

    Redundant and irrelevant features not only cause high resource consumption but also degrade the performance of Intrusion Detection Systems (IDS), especially when coping with big data. These features slow down the process of training and testing in network traffic classification. Therefore, a hybrid feature selection approach combining wrapper and filter selection is designed in this paper to build a lightweight intrusion detection system. Two main phases are involved in this method. The first phase conducts a preliminary search for an optimal subset of features, in which chi-square feature selection is utilized. The set of features selected in the previous phase is further refined in the second phase in a wrapper manner, in which Random Forest (RF) is used to guide the selection process and retain an optimized set of features. After that, we build an RF-based detection model and make a fair comparison with other approaches. The experimental results on the NSL-KDD datasets show that our approach results in higher detection accuracy as well as faster training and testing processes.
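The two-phase idea (a cheap filter first, then wrapper refinement against a model) can be sketched on a toy dataset. The chi-square scoring matches the standard 2x2 formula, but the lookup-table "model" stands in for the paper's Random Forest, and the data are synthetic, not NSL-KDD.

```python
import random
from collections import defaultdict, Counter

random.seed(0)

# Toy binary data: the label depends only on features 0 and 2.
def make_data(n=400, d=8):
    X, y = [], []
    for _ in range(n):
        row = [random.randint(0, 1) for _ in range(d)]
        y.append(1 if row[0] + row[2] >= 1 else 0)
        X.append(row)
    return X, y

# Phase 1 filter: chi-square statistic for a binary feature vs. binary label.
def chi2_score(X, y, j):
    a = b = c = d = 0
    for row, label in zip(X, y):
        if row[j] and label: a += 1
        elif row[j]: b += 1
        elif label: c += 1
        else: d += 1
    n = a + b + c + d
    denom = (a + b) * (c + d) * (a + c) * (b + d)
    return 0.0 if denom == 0 else n * (a * d - b * c) ** 2 / denom

# Wrapper fitness: training accuracy of a majority-vote lookup model
# (a stand-in for cross-validated Random Forest accuracy).
def accuracy(X, y, feats):
    table = defaultdict(Counter)
    for row, label in zip(X, y):
        table[tuple(row[j] for j in feats)][label] += 1
    correct = sum(cnt.most_common(1)[0][1] for cnt in table.values())
    return correct / len(y)

X, y = make_data()
# Phase 1: keep the 4 highest-scoring features.
ranked = sorted(range(8), key=lambda j: chi2_score(X, y, j), reverse=True)
subset = ranked[:4]
# Phase 2: greedily drop features whose removal does not hurt accuracy.
for j in sorted(subset, key=lambda j: chi2_score(X, y, j)):
    trial = [f for f in subset if f != j]
    if trial and accuracy(X, y, trial) >= accuracy(X, y, subset):
        subset = trial
print("selected:", sorted(subset))
```

The filter phase prunes the search space cheaply; the wrapper phase then pays the model-evaluation cost only on the surviving candidates, which is what makes the combined approach "lightweight".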

  17. Partial hydrogenation of alkynes on highly selective nano-structured mesoporous silica MCM-41 composite catalyst

    International Nuclear Information System (INIS)

    Kojoori, R.K.

    2016-01-01

    In this research, we have developed a silica MCM-41/metformin/Pd(II) nanocomposite catalyst for the selective hydrogenation of alkynes to the corresponding (Z)-alkenes under the mild conditions of atmospheric pressure and room temperature. First, a functionalized Si-MCM-41 metformin catalyst with optimum performance was prepared. The synthesized catalyst was then characterized by X-ray powder diffraction, BET surface area, FT-IR spectrophotometry, scanning electron microscopy (SEM) and transmission electron microscopy (TEM), and applied in the partial hydrogenation of different alkynes with high selectivity and high yield. The products were characterized by 1H-NMR, 13C-NMR, FT-IR, and mass spectrometry (MS), which strongly confirmed the (Z)-double bond configuration of the produced alkenes. The prepared catalyst is competitive with the best palladium catalysts known for the selective liquid-phase hydrogenation of alkynes and can be easily recovered and regenerated, retaining high activity and selectivity over at least three cycles with a simple regeneration procedure. (author)

  18. Ultrathin and Ion-Selective Janus Membranes for High-Performance Osmotic Energy Conversion.

    Science.gov (United States)

    Zhang, Zhen; Sui, Xin; Li, Pei; Xie, Ganhua; Kong, Xiang-Yu; Xiao, Kai; Gao, Longcheng; Wen, Liping; Jiang, Lei

    2017-07-05

    The osmotic energy existing in fluids is recognized as a promising "blue" energy source that can help solve the global issues of energy shortage and environmental pollution. Recently, nanofluidic channels have shown great potential for capturing this worldwide energy because of their novel transport properties conferred by nanoconfinement. However, for membrane-scale porous systems, high resistance and undesirable ion selectivity remain bottlenecks impeding their applications. Developing thinner, low-resistance membranes while promoting their ion selectivity is a necessity. Here, we engineered ultrathin and ion-selective Janus membranes prepared via the phase separation of two block copolymers, which enable osmotic energy conversion with power densities of approximately 2.04 W/m² by mixing natural seawater and river water. Both experiments and continuum simulation help us to understand the mechanism by which membrane thickness and channel structure dominate the ion transport process and overall device performance, which can serve as a general guiding principle for the future design of nanochannel membranes for high-energy concentration cells.

  19. Relationship between High School Students' Facebook Addiction and Loneliness Status

    Science.gov (United States)

    Karakose, Turgut; Yirci, Ramazan; Uygun, Harun; Ozdemir, Tuncay Yavuz

    2016-01-01

    This study was conducted in order to analyze the relation between high school students' Facebook addiction and loneliness levels. The study was conducted with the relational screening model. The sample of the study consists of 712 randomly selected high school students. The data was collected using the Bergen Facebook Addiction Scale (BFAS) to…

  20. Effectivity of artrihpi irrigation for diabetic ulcer healing: A randomized controlled trial

    Science.gov (United States)

    Gayatri, Dewi; Asmorohadi, Aries; Dahlia, Debie

    2018-02-01

    The healing process of diabetic ulcers is often impeded by inflammation, infection, and a decreased immune state. High-pressure irrigation (10-15 psi) may be used to control the level of infection. This research was designed to identify the effectiveness of the artrihpi irrigation device on diabetic ulcers in public hospitals in Central Java. This research is a randomized controlled trial with a crossover design. Sixty-four subjects were selected using a block randomization technique and were divided into control and intervention groups. The intervention was given over 6 days, with wound healing evaluated every 3 days. The results demonstrated a significant decrease in healing scores after treatment, although the difference in healing scores between the two groups was not statistically significant. Nevertheless, the mean difference indicated that wound healing in the artrihpi intervention group was better than with the syringe. These results suggest that the artrihpi may be a practical way of applying high-pressure irrigation to support the healing process of diabetic ulcers.

  1. Theory of Randomized Search Heuristics in Combinatorial Optimization

    DEFF Research Database (Denmark)

    The rigorous mathematical analysis of randomized search heuristics (RSHs) with respect to their expected runtime is a growing research area where many results have been obtained in recent years. This class of heuristics includes well-known approaches such as Randomized Local Search (RLS) and the Metr... analysis of randomized algorithms to RSHs. Mostly, the expected runtime of RSHs on selected problems is analyzed. Thereby, we understand why and when RSHs are efficient optimizers and, conversely, when they cannot be efficient. The tutorial will give an overview on the analysis of RSHs for solving...

  2. Natural selection and algorithmic design of mRNA.

    Science.gov (United States)

    Cohen, Barry; Skiena, Steven

    2003-01-01

    Messenger RNA (mRNA) sequences serve as templates for proteins according to the triplet code, in which each of the 4³ = 64 different codons (sequences of three consecutive nucleotide bases) in RNA either terminate transcription or map to one of the 20 different amino acids (or residues) which build up proteins. Because there are more codons than residues, there is inherent redundancy in the coding. Certain residues (e.g., tryptophan) have only a single corresponding codon, while other residues (e.g., arginine) have as many as six corresponding codons. This freedom implies that the number of possible RNA sequences coding for a given protein grows exponentially in the length of the protein. Thus nature has wide latitude to select among mRNA sequences which are informationally equivalent, but structurally and energetically divergent. In this paper, we explore how nature takes advantage of this freedom and how to algorithmically design structures more energetically favorable than have been built through natural selection. In particular: (1) Natural Selection--we perform the first large-scale computational experiment comparing the stability of mRNA sequences from a variety of organisms to random synonymous sequences which respect the codon preferences of the organism. This experiment was conducted on over 27,000 sequences from 34 microbial species with 36 genomic structures. We provide evidence that in all genomic structures highly stable sequences are disproportionately abundant, and in 19 of 36 cases highly unstable sequences are disproportionately abundant. This suggests that the stability of mRNA sequences is subject to natural selection. (2) Artificial Selection--motivated by these biological results, we examine the algorithmic problem of designing the most stable and unstable mRNA sequences which code for a target protein. We give a polynomial-time dynamic programming solution to the most stable sequence problem (MSSP), which is asymptotically no more complex

  3. Vector Triggering Random Decrement for High Identification Accuracy

    DEFF Research Database (Denmark)

    Ibrahim, S. R.; Asmussen, J. C.; Brincker, Rune

    Using the Random Decrement (RD) technique to obtain free response estimates and combining this with time domain modal identification methods to obtain the poles and the mode shapes is acknowledged as a fast and accurate way of analysing measured responses of structures subject to ambient loads. W...

  4. Evolution in fluctuating environments: decomposing selection into additive components of the Robertson-Price equation.

    Science.gov (United States)

    Engen, Steinar; Saether, Bernt-Erik

    2014-03-01

    We analyze the stochastic components of the Robertson-Price equation for the evolution of quantitative characters that enables decomposition of the selection differential into components due to demographic and environmental stochasticity. We show how these two types of stochasticity affect the evolution of multivariate quantitative characters by defining demographic and environmental variances as components of individual fitness. The exact covariance formula for selection is decomposed into three components, the deterministic mean value, as well as stochastic demographic and environmental components. We show that demographic and environmental stochasticity generate random genetic drift and fluctuating selection, respectively. This provides a common theoretical framework for linking ecological and evolutionary processes. Demographic stochasticity can cause random variation in selection differentials independent of fluctuating selection caused by environmental variation. We use this model of selection to illustrate that the effect on the expected selection differential of random variation in individual fitness is dependent on population size, and that the strength of fluctuating selection is affected by how environmental variation affects the covariance in Malthusian fitness between individuals with different phenotypes. Thus, our approach enables us to partition out the effects of fluctuating selection from the effects of selection due to random variation in individual fitness caused by demographic stochasticity. © 2013 The Author(s). Evolution © 2013 The Society for the Study of Evolution.
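For orientation, the selection differential that the Robertson-Price equation decomposes is the covariance between fitness and the trait; in standard notation (which may differ from the authors' exact symbols):

```latex
% Selection differential as a fitness-trait covariance (standard form)
S \;=\; \frac{\operatorname{cov}(W, z)}{\bar{W}},
% and the abstract's three-way split, written schematically:
S \;=\; S_{\mathrm{det}} \;+\; S_{\mathrm{dem}} \;+\; S_{\mathrm{env}}
```

where, following the abstract, the deterministic term is the mean-value component, the demographic term generates random genetic drift, and the environmental term generates fluctuating selection; the subscript labels here are illustrative, not the paper's.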

  5. Fundamental criteria for the design of high-performance Josephson nondestructive readout random access memory cells and experimental confirmation

    International Nuclear Information System (INIS)

    Henkels, W.H.

    1979-01-01

    Fundamental design criteria for Josephson nondestructive readout random access memory (NDRO RAM) cells are presented, within the context of an LSI array environment. Emphasis is placed upon principles which are relevant to high performance. The criteria are elucidated via a specific design which is simulated and then experimentally evaluated in a technology with a smallest critical dimension of 5 μm. The specific cell differs from previously tested Josephson NDRO cells in several respects; namely, the cell stores only ≈8Φ₀, employs interferometer gates and an external damping resistor, allows switching into device resonances, and eliminates the need for a special initialization cycle. The cell-selection scheme, employing triple coincidence, results in larger operating margins and smaller operating currents than have previously been achieved. The large operating margins and all basic cell design criteria were experimentally verified. The experimental interferometer gate characteristics were analyzed in detail and found to be describable by simple models. In addition, it was discovered that single flux quantum transitions in the interferometer gates could be exploited beneficially in order to enhance the insensitivity of operating margins to fabrication tolerances

  6. Programmable disorder in random DNA tilings

    Science.gov (United States)

    Tikhomirov, Grigory; Petersen, Philip; Qian, Lulu

    2017-03-01

    Scaling up the complexity and diversity of synthetic molecular structures will require strategies that exploit the inherent stochasticity of molecular systems in a controlled fashion. Here we demonstrate a framework for programming random DNA tilings and show how to control the properties of global patterns through simple, local rules. We constructed three general forms of planar network—random loops, mazes and trees—on the surface of self-assembled DNA origami arrays on the micrometre scale with nanometre resolution. Using simple molecular building blocks and robust experimental conditions, we demonstrate control of a wide range of properties of the random networks, including the branching rules, the growth directions, the proximity between adjacent networks and the size distribution. Much as combinatorial approaches for generating random one-dimensional chains of polymers have been used to revolutionize chemical synthesis and the selection of functional nucleic acids, our strategy extends these principles to random two-dimensional networks of molecules and creates new opportunities for fabricating more complex molecular devices that are organized by DNA nanostructures.

  7. Key Aspects of Nucleic Acid Library Design for in Vitro Selection

    Science.gov (United States)

    Vorobyeva, Maria A.; Davydova, Anna S.; Vorobjev, Pavel E.; Pyshnyi, Dmitrii V.; Venyaminova, Alya G.

    2018-01-01

    Nucleic acid aptamers capable of selectively recognizing their target molecules have nowadays been established as powerful and tunable tools for biospecific applications, be it therapeutics, drug delivery systems or biosensors. It is now generally acknowledged that in vitro selection enables one to generate aptamers to almost any target of interest. However, the success of selection and the affinity of the resulting aptamers depend to a large extent on the nature and design of an initial random nucleic acid library. In this review, we summarize and discuss the most important features of the design of nucleic acid libraries for in vitro selection such as the nature of the library (DNA, RNA or modified nucleotides), the length of a randomized region and the presence of fixed sequences. We also compare and contrast different randomization strategies and consider computer methods of library design and some other aspects. PMID:29401748

  8. Metal-organic framework based highly selective fluorescence turn-on probe for hydrogen sulphide

    Science.gov (United States)

    Nagarkar, Sanjog S.; Saha, Tanmoy; Desai, Aamod V.; Talukdar, Pinaki; Ghosh, Sujit K.

    2014-11-01

    Hydrogen sulphide (H2S) is known to play a vital role in human physiology and pathology, which has stimulated interest in understanding the complex behaviour of H2S. Discerning the pathways of H2S production and its mode of action is still a challenge owing to its volatile and reactive nature. Herein we report an azide-functionalized metal-organic framework (MOF) as a selective turn-on fluorescent probe for H2S detection. The MOF shows a highly selective and fast response towards H2S even in the presence of other relevant biomolecules. Low cytotoxicity and H2S detection in live cells demonstrate the potential of the MOF for monitoring H2S chemistry in biological systems. To the best of our knowledge, this is the first example of a MOF that exhibits a fast and highly selective fluorescence turn-on response towards H2S under physiological conditions.

  9. A Heckman Selection-t Model

    KAUST Repository

    Marchenko, Yulia V.

    2012-03-01

    Sample selection arises often in practice as a result of the partial observability of the outcome of interest in a study. In the presence of sample selection, the observed data do not represent a random sample from the population, even after controlling for explanatory variables. That is, data are missing not at random. Thus, standard analysis using only complete cases will lead to biased results. Heckman introduced a sample selection model to analyze such data and proposed a full maximum likelihood estimation method under the assumption of normality. The method was criticized in the literature because of its sensitivity to the normality assumption. In practice, data, such as income or expenditure data, often violate the normality assumption because of heavier tails. We first establish a new link between sample selection models and recently studied families of extended skew-elliptical distributions. Then, this allows us to introduce a selection-t (SLt) model, which models the error distribution using a Student's t distribution. We study its properties and investigate the finite-sample performance of the maximum likelihood estimators for this model. We compare the performance of the SLt model to the conventional Heckman selection-normal (SLN) model and apply it to analyze ambulatory expenditures. Unlike the SLN model, our analysis using the SLt model provides statistical evidence for the existence of sample selection bias in these data. We also investigate the performance of the test for sample selection bias based on the SLt model and compare it with the performances of several tests used with the SLN model. Our findings indicate that the latter tests can be misleading in the presence of heavy-tailed data. © 2012 American Statistical Association.

  10. Feature-selective attention in healthy old age: a selective decline in selective attention?

    Science.gov (United States)

    Quigley, Cliodhna; Müller, Matthias M

    2014-02-12

    Deficient selection against irrelevant information has been proposed to underlie age-related cognitive decline. We recently reported evidence for maintained early sensory selection when older and younger adults used spatial selective attention to perform a challenging task. Here we explored age-related differences when spatial selection is not possible and feature-selective attention must be deployed. We additionally compared the integrity of feedforward processing by exploiting the well established phenomenon of suppression of visual cortical responses attributable to interstimulus competition. Electroencephalogram was measured while older and younger human adults responded to brief occurrences of coherent motion in an attended stimulus composed of randomly moving, orientation-defined, flickering bars. Attention was directed to horizontal or vertical bars by a pretrial cue, after which two orthogonally oriented, overlapping stimuli or a single stimulus were presented. Horizontal and vertical bars flickered at different frequencies and thereby elicited separable steady-state visual-evoked potentials, which were used to examine the effect of feature-based selection and the competitive influence of a second stimulus on ongoing visual processing. Age differences were found in feature-selective attentional modulation of visual responses: older adults did not show consistent modulation of magnitude or phase. In contrast, the suppressive effect of a second stimulus was robust and comparable in magnitude across age groups, suggesting that bottom-up processing of the current stimuli is essentially unchanged in healthy old age. Thus, it seems that visual processing per se is unchanged, but top-down attentional control is compromised in older adults when space cannot be used to guide selection.

  11. High-dose estradiol improves cognition for women with AD: results of a randomized study.

    Science.gov (United States)

    Asthana, S; Baker, L D; Craft, S; Stanczyk, F Z; Veith, R C; Raskind, M A; Plymate, S R

    2001-08-28

    To characterize the cognitive and neuroendocrine response to treatment with a high dose of estrogen for postmenopausal women with AD. Twenty postmenopausal women with AD were randomized to receive either 0.10 mg/day of 17 beta-estradiol by skin patch or a placebo patch for 8 weeks. Subjects were evaluated at baseline, at weeks 3, 5, and 8 during treatment, and again 8 weeks after treatment termination. During each visit, cognition was assessed with a battery of neuropsychological tests, and blood samples were collected to measure plasma estradiol as well as several other neuroendocrine markers of interest. Significant effects of estrogen treatment were observed on attention (Stroop Color Word Interference Test), verbal memory (Buschke Selective Reminding Test), and visual memory (Figure Copy/Memory). In addition, women treated with estrogen demonstrated improved performance on a test of semantic memory (Boston Naming Test) compared with subjects who received a placebo. Estrogen appeared to have a suppressive effect on the insulin-like growth factor (IGF) system such that plasma concentration of IGF binding protein-3 was significantly reduced and plasma levels of estradiol and IGF-I were negatively correlated during estrogen treatment. Administration of a higher dose of estrogen may enhance attention and memory for postmenopausal women with AD. Although these findings provide further clinical evidence to support a cognitive benefit of estrogen for women with AD, studies evaluating the effect of estradiol administration, in particular, using larger sample sizes and for longer treatment durations are warranted before the therapeutic potential of estrogen replacement for women with AD can be firmly established.

  12. Site selection procedure for high level radioactive waste disposal in Bulgaria

    International Nuclear Information System (INIS)

    Evstatiev, D.; Vachev, B.

    1993-01-01

    A combined site selection approach is implemented. Bulgaria's territory has been classified into three categories, presented on a 1:500000 scale map. The number of suitable sites has been reduced to 20 using the method of successive screening. The formulated site selection problem is a typical discrete multi-criteria decision-making problem under uncertainty. A 5-level procedure using Expert Choice Rating and relative models is created. It is part of a common procedure for the evaluation and choice of variants for high-level radwaste disposal construction. On this basis, the 7-8 most preferable sites are identified. New knowledge and information are gained about the relative importance of the criteria and their subsets, about the level of criteria uncertainty, and about reliability. This is very useful for planning and managing the next, final stages of the site selection procedure. 7 figs., 8 refs., 4 suppls. (author)

  13. Quantitative Analysis of High-Quality Officer Selection by Commandants Career-Level Education Board

    Science.gov (United States)

    2017-03-01

    impact on the organization and allocate resources to improve the human capital of this select group. From 2011 onward, CCLEB revamped the application... Quantitative Analysis of High-Quality Officer Selection by Commandant's Career-Level Education Board, by Clifton N. Rateike, March 2017.

  14. Frequency selective surfaces based high performance microstrip antenna

    CERN Document Server

    Narayan, Shiv; Jha, Rakesh Mohan

    2016-01-01

    This book focuses on the performance enhancement of printed antennas using frequency selective surface (FSS) technology. The growing demand for stealth technology in strategic areas requires high-performance low-RCS (radar cross section) antennas. Such requirements may be accomplished by incorporating FSS into the antenna structure, either in its ground plane or as the superstrate, owing to the filter characteristics of the FSS structure. In view of this, a novel approach based on FSS technology is presented in this book to enhance the performance of printed antennas, including out-of-band structural RCS reduction. In this endeavor, the EM design of microstrip patch antennas (MPA) loaded with FSS-based (i) high impedance surface (HIS) ground planes, and (ii) superstrates is discussed in detail. The EM analysis of the proposed FSS-based antenna structures has been carried out using transmission line analogy, in combination with the reciprocity theorem. Further, various types of novel FSS structures are considered in desi...

  15. Opportunistic Relay Selection with Cooperative Macro Diversity

    Directory of Open Access Journals (Sweden)

    Yu Chia-Hao

    2010-01-01

    Full Text Available We apply a fully opportunistic relay selection scheme to study cooperative diversity in a semianalytical manner. In our framework, idle Mobile Stations (MSs) are capable of being used as Relay Stations (RSs), and no relaying is required if the direct path is strong. Our relay selection scheme is fully selection based: either the direct path or one of the relaying paths is selected. Macro diversity, which is often ignored in analytical works, is taken into account together with micro diversity by using a complete channel model that includes both shadow fading and fast fading effects. The stochastic geometry of the network is taken into account by having a random number of randomly located MSs. The outage probability analysis of the selection differs from the case where only fast fading is considered. Under our framework, the distribution of the received power is formulated using different Channel State Information (CSI) assumptions to simulate both optimistic and practical environments. The results show that the relay selection gain can be significant given a suitable number of candidate RSs. Also, while relay selection according to incomplete CSI is diversity-suboptimal compared to relay selection based on full CSI, the loss in average throughput is not too significant. This is a consequence of the dominance of geometry over fast fading.

  16. Statistical auditing and randomness test of lotto k/N-type games

    Science.gov (United States)

    Coronel-Brizio, H. F.; Hernández-Montoya, A. R.; Rapallo, F.; Scalas, E.

    2008-11-01

    One of the most popular lottery games worldwide is the so-called “lotto k/N”. It considers N numbers 1,2,…,N from which k are drawn randomly, without replacement. A player selects k or more numbers and the first prize is shared amongst those players whose selected numbers match all of the k randomly drawn. Exact rules may vary in different countries. In this paper, mean values and covariances for the random variables representing the numbers drawn from this kind of game are presented, with the aim of using them to audit statistically the consistency of a given sample of historical results with theoretical values coming from a hypergeometric statistical model. The method can be adapted to test pseudorandom number generators.
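The moments described above follow from indicator variables: with I_i = 1 if number i is drawn, E[I_i] = k/N and, for i ≠ j, cov(I_i, I_j) = k(k-1)/(N(N-1)) - (k/N)², which is slightly negative because drawing one number makes drawing another less likely. A quick simulation check for an illustrative 6/49 game:

```python
import random

random.seed(42)
N, k, trials = 49, 6, 200_000

sum1 = sum12 = 0  # draws containing number 1; draws containing both 1 and 2
for _ in range(trials):
    draw = set(random.sample(range(1, N + 1), k))
    i1 = 1 in draw
    sum1 += i1
    sum12 += i1 and (2 in draw)

p1 = sum1 / trials    # estimates k/N = 6/49
p12 = sum12 / trials  # estimates k(k-1)/(N(N-1))
# Estimated covariance of the two indicators (E[I1] = E[I2] by symmetry)
cov = p12 - p1 ** 2

print(p1, cov)
```

Comparing such sample moments against the exact hypergeometric values is precisely the kind of consistency audit the paper proposes for historical lottery results.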

  17. Experimental High Speed Milling of the Selected Thin-Walled Component

    Directory of Open Access Journals (Sweden)

    Jozef Zajac

    2017-11-01

    Full Text Available In a technical practice, it is possible to meet thin-walled parts more and more often. These parts are most commonly used in the automotive industry or aircraft industry to reduce the weight of different design part of cars or aircraft. Presented article is focused on experimental high speed milling of selected thin-walled component. The introduction of this article presents description of high speed machining and specification of thin – walled parts. The experiments were carried out using a CNC machine Pinnacle VMC 650S and C45 material - plain carbon steel for automotive components and mechanical engineering. In the last part of the article, described are the arrangements to reduction of deformation of thin-walled component during the experimental high speed milling.

  18. Accuracy of genomic selection in European maize elite breeding populations.

    Science.gov (United States)

    Zhao, Yusheng; Gowda, Manje; Liu, Wenxin; Würschum, Tobias; Maurer, Hans P; Longin, Friedrich H; Ranc, Nicolas; Reif, Jochen C

    2012-03-01

    Genomic selection is a promising breeding strategy for rapid improvement of complex traits. The objective of our study was to investigate the prediction accuracy of genomic breeding values through cross validation. The study was based on experimental data of six segregating populations from a half-diallel mating design with 788 testcross progenies from an elite maize breeding program. The plants were intensively phenotyped in multi-location field trials and fingerprinted with 960 SNP markers. We used random regression best linear unbiased prediction in combination with fivefold cross validation. The prediction accuracy across populations was higher for grain moisture (0.90) than for grain yield (0.58). The accuracy of genomic selection realized for grain yield corresponds to the precision of phenotyping at unreplicated field trials in 3-4 locations. As for maize up to three generations are feasible per year, selection gain per unit time is high and, consequently, genomic selection holds great promise for maize breeding programs.
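Random regression BLUP amounts to ridge regression of phenotypes on marker genotypes. A minimal cross-validation sketch on synthetic data (not the paper's maize data, and with an arbitrary shrinkage parameter rather than one derived from variance components):

```python
import numpy as np

rng = np.random.default_rng(7)

n, p = 200, 50  # individuals, SNP markers
X = rng.integers(0, 3, size=(n, p)).astype(float)    # genotypes coded 0/1/2
beta = rng.normal(0, 1, p)                           # simulated marker effects
y = X @ beta + rng.normal(0, 2.0, n)                 # phenotype = genetic value + noise

lam = 1.0  # illustrative shrinkage parameter

def ridge_fit(X, y, lam):
    # Closed-form ridge solution: (X'X + lam*I)^{-1} X'y
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Five-fold cross-validation; "prediction accuracy" is the correlation
# between predicted and observed phenotypes in the held-out fold.
idx = rng.permutation(n)
folds = np.array_split(idx, 5)
accs = []
for fold in folds:
    train = np.setdiff1d(idx, fold)
    b = ridge_fit(X[train], y[train], lam)
    accs.append(np.corrcoef(X[fold] @ b, y[fold])[0, 1])

print("mean prediction accuracy:", np.mean(accs))
```

In practice the accuracy depends strongly on trait heritability, which is why the paper reports much higher accuracy for grain moisture (0.90) than for grain yield (0.58).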

  19. Random and non-random mating populations: Evolutionary dynamics in meiotic drive.

    Science.gov (United States)

    Sarkar, Bijan

    2016-01-01

    Game theoretic tools are utilized to analyze a one-locus continuous selection model of sex-specific meiotic drive by considering the nonequivalence of the viabilities of reciprocal heterozygotes that might be noticed at an imprinted locus. The model draws attention to the role of viability selection of different types in examining the stable nature of the polymorphic equilibrium. A bridge between population genetics and evolutionary game theory has been built by applying the concept of the Fundamental Theorem of Natural Selection. In addition to pointing out the influences of male and female segregation ratios on selection, the configuration structure reveals some notable results, e.g., Hardy-Weinberg frequencies hold in replicator dynamics, faster evolution occurs at maximized variance in fitness, a mixed Evolutionarily Stable Strategy (ESS) exists in asymmetric games, and evolution tends to follow not only a 1:1 sex ratio but also a 1:1 ratio of different alleles at a particular gene locus. Through the construction of replicator dynamics in the group selection framework, our selection model redefines the basis of game theory to incorporate non-random mating, where a mating parameter associated with population structure depends on the social structure. The model also exposes the fact that the number of polymorphic equilibria depends on the algebraic expression of the population structure. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. Velocity and Dispersion for a Two-Dimensional Random Walk

    International Nuclear Information System (INIS)

    Li Jinghui

    2009-01-01

    In the paper, we consider the transport of a two-dimensional random walk. The velocity and the dispersion of this two-dimensional random walk are derived. We mainly show that: (i) by controlling the values of the transition rates, the direction of the random walk can be reversed; (ii) for some suitably selected transition rates, our two-dimensional random walk can be efficient in comparison with the one-dimensional random walk. Our work is motivated in part by the challenge of explaining the unidirectional transport of motor proteins. When motor proteins move at the turn points of their tracks (i.e., the cytoskeleton filaments and the DNA molecular tubes), some of our results in this paper can be used to deal with the problem. (general)
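
A quick Monte Carlo sketch makes the two quantities concrete: simulate many biased walks on the square lattice and estimate the drift velocity and dispersion along one axis. The transition probabilities below are illustrative, not the rates analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
# per-step transition probabilities for the moves +x, -x, +y, -y
probs = np.array([0.4, 0.1, 0.25, 0.25])
moves = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]])

n_walkers, n_steps = 2000, 500
choices = rng.choice(4, size=(n_walkers, n_steps), p=probs)
paths = moves[choices].cumsum(axis=1)          # shape (walkers, steps, 2)

x_final = paths[:, -1, 0].astype(float)
velocity_x = x_final.mean() / n_steps          # drift velocity along x
diffusion_x = x_final.var() / (2 * n_steps)    # dispersion along x

# Swapping the +x and -x rates flips the sign of velocity_x: the
# "direction reversal" mechanism mentioned in the abstract.
```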

  1. Frequency-Wavenumber (FK)-Based Data Selection in High-Frequency Passive Surface Wave Survey

    Science.gov (United States)

    Cheng, Feng; Xia, Jianghai; Xu, Zongbo; Hu, Yue; Mi, Binbin

    2018-04-01

    Passive surface wave methods have gained much attention from geophysical and civil engineering communities because of the limited application of traditional seismic surveys in highly populated urban areas. Considering that they can provide high-frequency phase velocity information up to several tens of Hz, the active surface wave survey would be omitted and the amount of field work could be dramatically reduced. However, the measured dispersion energy image in the passive surface wave survey would usually be polluted by a type of "crossed" artifacts at high frequencies. It is common in the bidirectional noise distribution case with a linear receiver array deployed along roads or railways. We review several frequently used passive surface wave methods and derive the underlying physics for the existence of the "crossed" artifacts. We prove that the "crossed" artifacts would cross the true surface wave energy at fixed points in the f-v domain and propose a FK-based data selection technique to attenuate the artifacts in order to retrieve the high-frequency information. Numerical tests further demonstrate the existence of the "crossed" artifacts and indicate that the well-known wave field separation method, FK filter, does not work for the selection of directional noise data. Real-world applications manifest the feasibility of the proposed FK-based technique to improve passive surface wave methods by a priori data selection. Finally, we discuss the applicability of our approach.
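
The directional-selection idea can be sketched with a plain 2-D FFT: transform a time-offset record to the f-k domain and zero the quadrants occupied by waves travelling in one direction along the array. This is a minimal stand-in for the proposed technique, not the authors' implementation.

```python
import numpy as np

def fk_directional_filter(record, keep_positive_direction=True):
    """record: 2-D array (n_time, n_receivers). Zero the f-k quadrants of
    waves travelling in one direction and return the filtered record."""
    spec = np.fft.fft2(record)
    f = np.fft.fftfreq(record.shape[0])[:, None]   # temporal frequency axis
    k = np.fft.fftfreq(record.shape[1])[None, :]   # wavenumber axis
    # a wave with phase (f*t - k*x) occupies the quadrants where the FFT
    # signs of f and k are opposite; keep one quadrant pair, zero the other
    mask = (f * k <= 0) if keep_positive_direction else (f * k >= 0)
    return np.fft.ifft2(spec * mask).real

# two plane waves travelling in opposite directions along the array
t = np.arange(256)[:, None]
x = np.arange(64)[None, :]
wave_fwd = np.sin(2 * np.pi * (0.0625 * t - 0.125 * x))
wave_back = np.sin(2 * np.pi * (0.0625 * t + 0.125 * x))
filtered = fk_directional_filter(wave_fwd + wave_back)
# the backward-travelling wave (the source of the "crossed" artifacts in
# the bidirectional-noise case) is suppressed; the forward wave survives
```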

  3. High-throughput screening of metal-porphyrin-like graphenes for selective capture of carbon dioxide

    OpenAIRE

    Hyeonhu Bae; Minwoo Park; Byungryul Jang; Yura Kang; Jinwoo Park; Hosik Lee; Haegeun Chung; ChiHye Chung; Suklyun Hong; Yongkyung Kwon; Boris I. Yakobson; Hoonkyung Lee

    2016-01-01

    Nanostructured materials, such as zeolites and metal-organic frameworks, have been considered to capture CO2. However, their application has been limited largely because they exhibit poor selectivity for flue gases and low capture capacity under low pressures. We perform a high-throughput screening for selective CO2 capture from flue gases by using first principles thermodynamics. We find that elements with empty d orbitals selectively attract CO2 from gaseous mixtures under low CO2 pressures...

  4. Mean-field analysis of orientation selectivity in inhibition-dominated networks of spiking neurons.

    Science.gov (United States)

    Sadeh, Sadra; Cardanobile, Stefano; Rotter, Stefan

    2014-01-01

    Mechanisms underlying the emergence of orientation selectivity in the primary visual cortex are highly debated. Here we study the contribution of inhibition-dominated random recurrent networks to orientation selectivity, and more generally to sensory processing. By simulating and analyzing large-scale networks of spiking neurons, we investigate tuning amplification and contrast invariance of orientation selectivity in these networks. In particular, we show how selective attenuation of the common mode and amplification of the modulation component take place in these networks. Selective attenuation of the baseline, which is governed by the exceptional eigenvalue of the connectivity matrix, removes the unspecific, redundant signal component and ensures the invariance of selectivity across different contrasts. Selective amplification of modulation, which is governed by the operating regime of the network and depends on the strength of coupling, amplifies the informative signal component and thus increases the signal-to-noise ratio. Here, we perform a mean-field analysis which accounts for this process.

  5. Risk Attitudes, Sample Selection and Attrition in a Longitudinal Field Experiment

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Lau, Morten Igel

    with respect to risk attitudes. Our design builds in explicit randomization on the incentives for participation. We show that there are significant sample selection effects on inferences about the extent of risk aversion, but that the effects of subsequent sample attrition are minimal. Ignoring sample selection leads to inferences that subjects in the population are more risk averse than they actually are. Correcting for sample selection and attrition affects utility curvature, but does not affect inferences about probability weighting. Properly accounting for sample selection and attrition effects leads to findings of temporal stability in overall risk aversion. However, that stability is around different levels of risk aversion than one might naively infer without the controls for sample selection and attrition we are able to implement. This evidence of “randomization bias” from sample selection...

  6. Effects of Video Game Training on Measures of Selective Attention and Working Memory in Older Adults: Results from a Randomized Controlled Trial

    Science.gov (United States)

    Ballesteros, Soledad; Mayas, Julia; Prieto, Antonio; Ruiz-Marquez, Eloísa; Toril, Pilar; Reales, José M.

    2017-01-01

    Video game training with older adults potentially enhances aspects of cognition that decline with aging and could therefore offer a promising training approach. Although previously published studies suggest that training can produce transfer, many of them have certain shortcomings. This randomized controlled trial (RCT; Clinicaltrials.gov ID: NCT02796508) tried to overcome some of these limitations by incorporating an active control group and the assessment of motivation and expectations. Seventy-five older volunteers were randomly assigned to the experimental group trained for 16 sessions with non-action video games from Lumosity, a commercial platform (http://www.lumosity.com/) or to an active control group trained for the same number of sessions with simulation strategy games. The final sample included 55 older adults (30 in the experimental group and 25 in the active control group). Participants were tested individually before and after training to assess working memory (WM) and selective attention and also reported their perceived improvement, motivation and engagement. The results showed improved performance across the training sessions. The main results were: (1) the experimental group did not show greater improvements in measures of selective attention and working memory than the active control group (the opposite occurred in the oddball task); (2) a marginal training effect was observed for the N-back task, but not for the Stroop task while both groups improved in the Corsi Blocks task. Based on these results, one can conclude that training with non-action games provides modest benefits for untrained tasks. The effect is not specific for that kind of training as a similar effect was observed for strategy video games. Groups did not differ in motivation, engagement or expectations. PMID:29163136

  9. A random phased array device for delivery of high intensity focused ultrasound.

    Science.gov (United States)

    Hand, J W; Shaw, A; Sadhoo, N; Rajagopal, S; Dickinson, R J; Gavrilov, L R

    2009-10-07

    Randomized phased arrays can offer electronic steering of a single focus and simultaneous multiple foci concomitant with low levels of secondary maxima and are potentially useful as sources of high intensity focused ultrasound (HIFU). This work describes laboratory testing of a 1 MHz random phased array consisting of 254 elements on a spherical shell of radius of curvature 130 mm and diameter 170 mm. Acoustic output power and efficiency are measured for a range of input electrical powers, and field distributions for various single- and multiple-focus conditions are evaluated by a novel technique using an infrared camera to provide rapid imaging of temperature changes on the surface of an absorbing target. Experimental results show that the array can steer a single focus laterally to at least ±15 mm off axis and axially to more than ±15 mm from the centre of curvature of the array and patterns of four and five simultaneous foci ±10 mm laterally and axially whilst maintaining low intensity levels in secondary maxima away from the targeted area in good agreement with linear theoretical predictions. Experiments in which pork meat was thermally ablated indicate that contiguous lesions several cm³ in volume can be produced using the patterns of multiple foci.

  11. Applying a weighted random forests method to extract karst sinkholes from LiDAR data

    Science.gov (United States)

    Zhu, Junfeng; Pierskalla, William P.

    2016-02-01

    Detailed mapping of sinkholes provides critical information for mitigating sinkhole hazards and understanding groundwater and surface water interactions in karst terrains. LiDAR (Light Detection and Ranging) measures the earth's surface at high resolution and high density and has shown great potential to drastically improve locating and delineating sinkholes. However, processing LiDAR data to extract sinkholes requires separating sinkholes from other depressions, which can be laborious because of the sheer number of the depressions commonly generated from LiDAR data. In this study, we applied the random forests, a machine learning method, to automatically separate sinkholes from other depressions in a karst region in central Kentucky. The sinkhole-extraction random forest was grown on a training dataset built from an area where LiDAR-derived depressions were manually classified through a visual inspection and field verification process. Based on the geometry of depressions, as well as natural and human factors related to sinkholes, 11 parameters were selected as predictive variables to form the dataset. Because the training dataset was imbalanced with the majority of depressions being non-sinkholes, a weighted random forests method was used to improve the accuracy of predicting sinkholes. The weighted random forest achieved an average accuracy of 89.95% for the training dataset, demonstrating that the random forest can be an effective sinkhole classifier. Testing of the random forest in another area, however, resulted in moderate success with an average accuracy rate of 73.96%. This study suggests that an automatic sinkhole extraction procedure like the random forest classifier can significantly reduce time and labor costs and make it more tractable to map sinkholes using LiDAR data over large areas. However, the random forests method cannot totally replace manual procedures, such as visual inspection and field verification.
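
A minimal sketch of the weighting idea, using scikit-learn's class_weight option as a stand-in for the paper's weighted random forests; the 11 synthetic features and the class imbalance below are illustrative, not the study's data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_sink, n_other = 60, 540              # imbalanced: ~10% true sinkholes
X = np.vstack([rng.normal(1.0, 1.0, (n_sink, 11)),    # 11 predictors
               rng.normal(0.0, 1.0, (n_other, 11))])
y = np.array([1] * n_sink + [0] * n_other)

# class_weight="balanced" up-weights the rare sinkhole class so the
# forest is not dominated by the majority non-sinkhole depressions
clf = RandomForestClassifier(n_estimators=200, class_weight="balanced",
                             oob_score=True, random_state=0)
clf.fit(X, y)
oob_accuracy = clf.oob_score_          # out-of-bag accuracy estimate
```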

  12. High amino acid diversity and positive selection at a putative coral immunity gene (tachylectin-2)

    Directory of Open Access Journals (Sweden)

    Hellberg Michael E

    2010-05-01

    Full Text Available Abstract Background Genes involved in immune functions, including pathogen recognition and the activation of innate defense pathways, are among the most genetically variable known, and the proteins that they encode are often characterized by high rates of amino acid substitutions, a hallmark of positive selection. The high levels of variation characteristic of immunity genes make them useful tools for conservation genetics. To date, highly variable immunity genes have yet to be found in corals, keystone organisms of the world's most diverse marine ecosystem, the coral reef. Here, we examine variation in and selection on a putative innate immunity gene from Oculina, a coral genus previously used as a model for studies of coral disease and bleaching. Results In a survey of 244 Oculina alleles, we find high nonsynonymous variation and a signature of positive selection, consistent with a putative role in immunity. Using computational protein structure prediction, we generate a structural model of the Oculina protein that closely matches the known structure of tachylectin-2 from the Japanese horseshoe crab (Tachypleus tridentatus), a protein with demonstrated function in microbial recognition and agglutination. We also demonstrate that at least three other genera of anthozoan cnidarians (Acropora, Montastrea and Nematostella) possess proteins structurally similar to tachylectin-2. Conclusions Taken together, the evidence of high amino acid diversity, positive selection and structural correspondence to the horseshoe crab tachylectin-2 suggests that this protein is (1) part of Oculina's innate immunity repertoire, and (2) evolving adaptively, possibly under selective pressure from coral-associated microorganisms. Tachylectin-2 may serve as a candidate locus to screen coral populations for their capacity to respond adaptively to future environmental change.

  13. Lines of Descent Under Selection

    Science.gov (United States)

    Baake, Ellen; Wakolbinger, Anton

    2017-11-01

    We review recent progress on ancestral processes related to mutation-selection models, both in the deterministic and the stochastic setting. We mainly rely on two concepts, namely, the killed ancestral selection graph and the pruned lookdown ancestral selection graph. The killed ancestral selection graph gives a representation of the type of a random individual from a stationary population, based upon the individual's potential ancestry back until the mutations that define the individual's type. The pruned lookdown ancestral selection graph allows one to trace the ancestry of individuals from a stationary distribution back into the distant past, thus leading to the stationary distribution of ancestral types. We illustrate the results by applying them to a prototype model for the error threshold phenomenon.

  14. Multi-scale analyses of nest site selection and fledging success by marbled murrelets (Brachyramphus marmoratus) in British Columbia

    OpenAIRE

    Silvergieter, Michael Paul

    2009-01-01

    I studied nesting habitat selection and fledging success by marbled murrelets, a seabird that nests in old-growth forests of high economic value, at two regions of southwestern British Columbia. At Clayoquot Sound, habitat occurs in larger stands, and murrelets selected steeper slopes and patches with more platform trees, and shorter trees, than at random sites. At Desolation Sound, where smaller forest stands predominate, patch scale variables were less important; increased canopy complexity...

  15. Random Subspace Aggregation for Cancer Prediction with Gene Expression Profiles

    Directory of Open Access Journals (Sweden)

    Liying Yang

    2016-01-01

    Full Text Available Background. Precisely predicting cancer is crucial for cancer treatment. Gene expression profiles make it possible to analyze patterns between genes and cancers on the genome-wide scale. Gene expression data analysis, however, is confronted with enormous challenges owing to its characteristics, such as high dimensionality, small sample size, and low Signal-to-Noise Ratio. Results. This paper proposes a method, termed RS_SVM, to predict gene expression profiles via aggregating SVMs trained on random subspaces. After choosing gene features through statistical analysis, RS_SVM randomly selects feature subsets to yield random subspaces, trains SVM classifiers on them, and then aggregates the SVM classifiers to capture the advantage of ensemble learning. Experiments on eight real gene expression datasets are performed to validate the RS_SVM method. Experimental results show that RS_SVM achieved better classification accuracy and generalization performance in contrast with single SVM, K-nearest neighbor, decision tree, Bagging, AdaBoost, and the state-of-the-art methods. Experiments also explored the effect of subspace size on prediction performance. Conclusions. The proposed RS_SVM method yielded superior performance in analyzing gene expression profiles, which demonstrates that RS_SVM provides a good channel for such biological data.
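
The RS_SVM scheme described above can be sketched directly: draw random feature subspaces, train one SVM per subspace, and aggregate by majority vote. The data below are synthetic stand-ins for gene expression profiles, and the subspace and ensemble sizes are arbitrary choices, not the paper's settings.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_per_class, n_genes, subspace_size, n_members = 50, 500, 50, 25
# two synthetic classes whose expression profiles differ in mean level
X = np.vstack([rng.normal(0.0, 1.0, (n_per_class, n_genes)),
               rng.normal(0.5, 1.0, (n_per_class, n_genes))])
y = np.repeat([0, 1], n_per_class)
perm = rng.permutation(2 * n_per_class)
X, y = X[perm], y[perm]
X_train, y_train, X_test, y_test = X[:70], y[:70], X[70:], y[70:]

members = []
for _ in range(n_members):
    # each member sees only a random subset of the genes
    feats = rng.choice(n_genes, size=subspace_size, replace=False)
    clf = SVC(kernel="linear").fit(X_train[:, feats], y_train)
    members.append((feats, clf))

# aggregate the subspace classifiers by majority vote
votes = np.mean([clf.predict(X_test[:, feats]) for feats, clf in members],
                axis=0)
y_pred = (votes > 0.5).astype(int)
accuracy = float((y_pred == y_test).mean())
```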

  16. Feature Selection with the Boruta Package

    OpenAIRE

    Kursa, Miron B.; Rudnicki, Witold R.

    2010-01-01

    This article describes the R package Boruta, implementing a novel feature selection algorithm for finding all relevant variables. The algorithm is designed as a wrapper around a Random Forest classification algorithm. It iteratively removes the features which are proved by a statistical test to be less relevant than random probes. The Boruta package provides a convenient interface to the algorithm. A short description of the algorithm and examples of its application are presented.
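
The core of Boruta's random-probe test can be sketched in a few lines: append column-wise permuted "shadow" copies of every feature, fit a random forest, and keep the real features whose importance exceeds the best shadow importance. This single-pass sketch on synthetic data omits the iterative statistical testing of the actual package.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n, p = 300, 10
X = rng.normal(size=(n, p))
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # only features 0 and 1 matter

# shadow features: each column permuted independently, destroying any
# link to the target while preserving each feature's distribution
shadows = rng.permuted(X, axis=0)
X_aug = np.hstack([X, shadows])

forest = RandomForestClassifier(n_estimators=300, random_state=0)
forest.fit(X_aug, y)
importances = forest.feature_importances_
best_shadow = importances[p:].max()
# a real feature is flagged relevant if it beats every random probe
selected = [j for j in range(p) if importances[j] > best_shadow]
```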

  17. The MIXMAX random number generator

    Science.gov (United States)

    Savvidy, Konstantin G.

    2015-11-01

    In this paper, we study the randomness properties of unimodular matrix random number generators. Under well-known conditions, these discrete-time dynamical systems have the highly desirable K-mixing properties which guarantee high quality random numbers. It is found that some widely used random number generators have poor Kolmogorov entropy and consequently fail in empirical tests of randomness. These tests show that the lowest acceptable value of the Kolmogorov entropy is around 50. Next, we provide a solution to the problem of determining the maximal period of unimodular matrix generators of pseudo-random numbers. We formulate the necessary and sufficient condition to attain the maximum period and present a family of specific generators in the MIXMAX family with superior performance and excellent statistical properties. Finally, we construct three efficient algorithms for operations with the MIXMAX matrix, which is a multi-dimensional generalization of the famous cat map: the first computes multiplication by the MIXMAX matrix in O(N) operations; the second recursively computes its characteristic polynomial in O(N²) operations; and the third applies skips of a large number of steps S to the sequence in O(N² log(S)) operations.
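
A toy matrix congruential generator conveys the mechanism: multiply an integer state vector by a unimodular matrix modulo a large prime at each step. The 2x2 matrix below is a stand-in with determinant 1, not the actual MIXMAX matrix, whose special structure is what enables the fast operations described in the abstract.

```python
P = 2**61 - 1                 # the Mersenne prime modulus used by MIXMAX
A = [[1, 1],
     [1, 2]]                  # a small unimodular stand-in (det = 1)

def step(state):
    """Advance the generator: multiply the state vector by A modulo P."""
    return tuple(sum(a * s for a, s in zip(row, state)) % P for row in A)

state = (12345, 67890)
outputs = []
for _ in range(5):
    state = step(state)
    outputs.append(state[0] / P)   # uniform variates in [0, 1)
```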

  18. Multitrait, Random Regression, or Simple Repeatability Model in High-Throughput Phenotyping Data Improve Genomic Prediction for Wheat Grain Yield.

    Science.gov (United States)

    Sun, Jin; Rutkoski, Jessica E; Poland, Jesse A; Crossa, José; Jannink, Jean-Luc; Sorrells, Mark E

    2017-07-01

    High-throughput phenotyping (HTP) platforms can be used to measure traits that are genetically correlated with wheat (Triticum aestivum L.) grain yield across time. Incorporating such secondary traits in the multivariate pedigree and genomic prediction models would be desirable to improve indirect selection for grain yield. In this study, we evaluated three statistical models, simple repeatability (SR), multitrait (MT), and random regression (RR), for the longitudinal data of secondary traits and compared the impact of the proposed models for secondary traits on their predictive abilities for grain yield. Grain yield and secondary traits, canopy temperature (CT) and normalized difference vegetation index (NDVI), were collected in five diverse environments for 557 wheat lines with available pedigree and genomic information. A two-stage analysis was applied for pedigree and genomic selection (GS). First, secondary traits were fitted by SR, MT, or RR models, separately, within each environment. Then, best linear unbiased predictions (BLUPs) of secondary traits from the above models were used in the multivariate prediction models to compare predictive abilities for grain yield. Predictive ability was substantially improved by 70%, on average, from multivariate pedigree and genomic models when including secondary traits in both training and test populations. Additionally, (i) predictive abilities slightly varied for MT, RR, or SR models in this data set, (ii) results indicated that including BLUPs of secondary traits from the MT model was the best in severe drought, and (iii) the RR model was slightly better than SR and MT models in the drought environment. Copyright © 2017 Crop Science Society of America.

  19. A randomized trial of treatments for high-utilizing somatizing patients.

    Science.gov (United States)

    Barsky, Arthur J; Ahern, David K; Bauer, Mark R; Nolido, Nyryan; Orav, E John

    2013-11-01

    Somatization and hypochondriacal health anxiety are common sources of distress, impairment, and costly medical utilization in primary care practice. A range of interventions is needed to improve the care of these patients. To determine the effectiveness of two cognitive behavioral interventions for high-utilizing, somatizing patients, using the resources found in a routine care setting. Patients were randomly assigned to a two-step cognitive behavioral treatment program accompanied by a training seminar for their primary care physicians, or to relaxation training. Providers routinely working in these patients' primary care practices delivered the cognitive behavior therapy and relaxation training. Assessments were completed immediately prior to treatment and 6 and 12 months later. Eighty-nine medical outpatients with elevated levels of somatization, hypochondriacal health anxiety, and medical care utilization participated. Somatization and hypochondriasis, overall psychiatric distress, and role impairment were assessed with well-validated, self-report questionnaires. Outpatient visits and medical care costs before and after the intervention were obtained from the encounter claims database. At 6-month and 12-month follow-up, both intervention groups showed significant improvements in somatization, hypochondriacal symptoms, overall psychiatric distress, and role function. They also reduced the ambulatory visits and costs of these high-utilizing outpatients.

  20. Randomized Controlled Trial of "Mind Reading" and In Vivo Rehearsal for High-Functioning Children with ASD

    Science.gov (United States)

    Thomeer, Marcus L.; Smith, Rachael A.; Lopata, Christopher; Volker, Martin A.; Lipinski, Alanna M.; Rodgers, Jonathan D.; McDonald, Christin A.; Lee, Gloria K.

    2015-01-01

    This randomized controlled trial evaluated the efficacy of a computer software (i.e., "Mind Reading") and in vivo rehearsal treatment on the emotion decoding and encoding skills, autism symptoms, and social skills of 43 children, ages 7-12 years with high-functioning autism spectrum disorder (HFASD). Children in treatment (n = 22)…

  1. Effect of intra-operative high inspired oxygen fraction on surgical site infection: a meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Yang, W; Liu, Y; Zhang, Y; Zhao, Q-H; He, S-F

    2016-08-01

    Surgical site infection (SSI) causes significant mortality and morbidity. Administration of a high inspired oxygen fraction (FiO2) to patients undergoing surgery may represent a potential preventive strategy. To conduct a meta-analysis of randomized controlled trials in which high FiO2 was compared with normal FiO2 in patients undergoing surgery to estimate the effect on the development of SSI. A comprehensive search was undertaken for randomized controlled trials (until December 2015) that compared high FiO2 with normal FiO2 in adults undergoing surgery with general anaesthesia and reported on SSI. This study included 17 randomized controlled trials with 8093 patients. Infection rates were 13.11% in the control group and 11.53% in the hyperoxic group, while the overall risk ratio was 0.893 [95% confidence interval (CI) 0.794-1.003; P = 0.057]. Subgroup analyses stratified by country, definition of SSI, and type of surgery were also performed, and showed similar results. However, high FiO2 was found to be of significant benefit in patients undergoing colorectal surgery, with a risk ratio of 0.735 (95% CI 0.573-0.944; P=0.016). There is moderate evidence to suggest that administration of high FiO2 to patients undergoing surgery, especially colorectal surgery, reduces the risk of SSI. Further studies with better adherence to the intervention may affect the results of this meta-analysis. Copyright © 2016 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.
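
For reference, the effect measure reported above can be computed from 2x2 counts with the standard log-normal approximation. The event counts below are hypothetical, chosen only for illustration, not taken from the trials in the review.

```python
import math

def risk_ratio(events_trt, n_trt, events_ctl, n_ctl):
    """Risk ratio with a 95% CI via the log-normal approximation."""
    rr = (events_trt / n_trt) / (events_ctl / n_ctl)
    # standard error of log(RR) from the usual delta-method formula
    se_log = math.sqrt(1 / events_trt - 1 / n_trt
                       + 1 / events_ctl - 1 / n_ctl)
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lo, hi

# hypothetical SSI counts: 46/400 with high FiO2 vs 62/400 with normal FiO2
rr, lo, hi = risk_ratio(46, 400, 62, 400)
```

A CI that spans 1.0, as for the review's overall estimate of 0.893 [0.794-1.003], means the effect is not statistically significant at the 5% level; the colorectal subgroup's 0.735 [0.573-0.944] excludes 1.0 and therefore is.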

  2. High-throughput screening of metal-porphyrin-like graphenes for selective capture of carbon dioxide.

    Science.gov (United States)

    Bae, Hyeonhu; Park, Minwoo; Jang, Byungryul; Kang, Yura; Park, Jinwoo; Lee, Hosik; Chung, Haegeun; Chung, ChiHye; Hong, Suklyun; Kwon, Yongkyung; Yakobson, Boris I; Lee, Hoonkyung

    2016-02-23

Nanostructured materials, such as zeolites and metal-organic frameworks, have been considered to capture CO2. However, their application has been limited largely because they exhibit poor selectivity for flue gases and low capture capacity under low pressures. We perform a high-throughput screening for selective CO2 capture from flue gases by using first principles thermodynamics. We find that elements with empty d orbitals selectively attract CO2 from gaseous mixtures under low CO2 pressures (~10^-3 bar) at 300 K and release it at ~450 K. CO2 binding to elements involves hybridization of the metal d orbitals with the CO2 π orbitals, and CO2-transition metal complexes were observed in experiments. This result allows us to perform high-throughput screening to discover novel promising CO2 capture materials with empty d orbitals (e.g., Sc- or V-porphyrin-like graphene) and predict their capture performance under various conditions. Moreover, these findings provide physical insights into selective CO2 capture and open a new path to explore CO2 capture materials.

  4. Hybrid Feature Selection Approach Based on GRASP for Cancer Microarray Data

    Directory of Open Access Journals (Sweden)

    Arpita Nagpal

    2017-01-01

Microarray data usually contain a large number of genes but a small number of samples. Feature subset selection for microarray data aims at reducing the number of genes so that useful information can be extracted from the samples; reducing the dimensionality of the data also improves the computational efficiency of the learning model. In this paper, we propose a modified algorithm that uses tabu search as the local search procedure within a Greedy Randomized Adaptive Search Procedure (GRASP) for high-dimensional microarray data sets. The proposed algorithm is named TGRASP. In TGRASP, a new parameter named Tabu Tenure has been introduced, and the existing parameters NumIter and size have been modified; we observed that different parameter settings affect the quality of the optimum. The second proposed algorithm, FFGRASP (Firefly Greedy Randomized Adaptive Search Procedure), uses a firefly optimization algorithm in the local search phase of GRASP. The firefly algorithm is a powerful algorithm for optimizing multimodal problems. Experimental results show that the proposed TGRASP and FFGRASP algorithms outperform an existing algorithm with respect to three performance measures: accuracy, run time, and the number of selected features. We also compared both approaches using a unified metric (Extended Adjusted Ratio of Ratios), which showed that TGRASP outperforms the existing approach on six of nine cancer microarray datasets and FFGRASP performs better on seven of nine.
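For readers unfamiliar with the GRASP template that TGRASP and FFGRASP build on (greedy randomized construction followed by local search), here is a minimal, dependency-free sketch for feature-subset selection. The toy fitness function, the parameter names `alpha` and `iters`, and the plain swap-based local search are invented for illustration; they stand in for a real classifier score and for the paper's tabu and firefly local-search variants.

```python
import random

def grasp_feature_select(features, score, k, alpha=0.3, iters=20, seed=0):
    """Toy GRASP for selecting k features that maximize `score`.

    Construction: repeatedly pick a feature at random from the restricted
    candidate list (the top-alpha fraction by marginal score). Local search:
    swap chosen features for unchosen ones while the score improves.
    """
    rng = random.Random(seed)
    best, best_val = None, float("-inf")
    for _ in range(iters):
        # Greedy randomized construction
        subset = set()
        while len(subset) < k:
            cands = sorted((f for f in features if f not in subset),
                           key=lambda f: score(frozenset(subset | {f})),
                           reverse=True)
            rcl = cands[:max(1, int(alpha * len(cands)))]
            subset.add(rng.choice(rcl))
        # Swap-based local search (first improving move, repeat to optimum)
        improved = True
        while improved:
            improved = False
            for f_in in list(subset):
                for f_out in features:
                    if f_out in subset:
                        continue
                    trial = frozenset(subset - {f_in} | {f_out})
                    if score(trial) > score(frozenset(subset)):
                        subset, improved = set(trial), True
                        break
                if improved:
                    break
        val = score(frozenset(subset))
        if val > best_val:
            best, best_val = frozenset(subset), val
    return best, best_val

# Toy fitness: reward features {0, 2, 5} (stand-in for informative genes),
# with a small penalty per selected feature
target = {0, 2, 5}
fit = lambda s: len(s & target) - 0.01 * len(s)
sel, val = grasp_feature_select(list(range(10)), fit, k=3)
print(sel, val)
```

On this toy fitness the search recovers the rewarded subset {0, 2, 5}; in the paper's setting the fitness would be a cross-validated classifier score over the microarray samples.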

  5. Simultaneous selection for cowpea (Vigna unguiculata L.) genotypes with adaptability and yield stability using mixed models.

    Science.gov (United States)

    Torres, F E; Teodoro, P E; Rodrigues, E V; Santos, A; Corrêa, A M; Ceccon, G

    2016-04-29

    The aim of this study was to select erect cowpea (Vigna unguiculata L.) genotypes simultaneously for high adaptability, stability, and yield grain in Mato Grosso do Sul, Brazil using mixed models. We conducted six trials of different cowpea genotypes in 2005 and 2006 in Aquidauana, Chapadão do Sul, Dourados, and Primavera do Leste. The experimental design was randomized complete blocks with four replications and 20 genotypes. Genetic parameters were estimated by restricted maximum likelihood/best linear unbiased prediction, and selection was based on the harmonic mean of the relative performance of genetic values method using three strategies: selection based on the predicted breeding value, having considered the performance mean of the genotypes in all environments (no interaction effect); the performance in each environment (with an interaction effect); and the simultaneous selection for grain yield, stability, and adaptability. The MNC99542F-5 and MNC99-537F-4 genotypes could be grown in various environments, as they exhibited high grain yield, adaptability, and stability. The average heritability of the genotypes was moderate to high and the selective accuracy was 82%, indicating an excellent potential for selection.

  6. Metabolic risk factors in mice divergently selected for BMR fed high fat and high carb diets.

    Science.gov (United States)

    Sadowska, Julita; Gębczyński, Andrzej K; Konarzewski, Marek

    2017-01-01

    Factors affecting contribution of spontaneous physical activity (SPA; activity associated with everyday tasks) to energy balance of humans are not well understood, as it is not clear whether low activity is related to dietary habits, precedes obesity or is a result of thereof. In particular, human studies on SPA and basal metabolic rates (BMR, accounting for >50% of human energy budget) and their associations with diet composition, metabolic thrift and obesity are equivocal. To clarify these ambiguities we used a unique animal model-mice selected for divergent BMR rates (the H-BMR and L-BMR line type) presenting a 50% between-line type difference in the primary selected trait. Males of each line type were divided into three groups and fed either a high fat, high carb or a control diet. They then spent 4 months in individual cages under conditions emulating human "sedentary lifestyle", with SPA followed every month and measurements of metabolic risk indicators (body fat mass %, blood lipid profile, fasting blood glucose levels and oxidative damage in the livers, kidneys and hearts) taken at the end of study. Mice with genetically determined high BMR assimilated more energy and had higher SPA irrespective of type of diet. H-BMR individuals were characterized by lower dry body fat mass %, better lipid profile and lower fasting blood glucose levels, but higher oxidative damage in the livers and hearts. Genetically determined high BMR may be a protective factor against diet-induced obesity and most of the metabolic syndrome indicators. Elevated spontaneous activity is correlated with high BMR, and constitutes an important factor affecting individual capability to sustain energy balance even under energy dense diets.

  7. "Open mesh" or "strictly selected population" recruitment? The experience of the randomized controlled MeMeMe trial

    Directory of Open Access Journals (Sweden)

    Cortellini M

    2017-07-01

Mauro Cortellini, Franco Berrino, Patrizia Pasanisi Department of Preventive & Predictive Medicine, Foundation IRCCS National Cancer Institute of Milan, Milan, Italy Abstract: Among randomized controlled trials (RCTs), trials for primary prevention require large samples and long follow-up to obtain a high-quality outcome; therefore the recruitment process and the drop-out rates largely dictate the adequacy of the results. We are conducting a Phase III trial on persons with metabolic syndrome to test the hypothesis that comprehensive lifestyle changes and/or metformin treatment prevents age-related chronic diseases (the MeMeMe trial, EudraCT number: 2012-005427-32, also registered on ClinicalTrials.gov [NCT02960711]). Here, we briefly analyze and discuss the reasons that may lead participants to drop out of trials. In our experience, participants may back out of a trial for different reasons. Drug-induced side effects are certainly the most compelling reason. But what are the other reasons, relating to the participants' perception of the progress of the trial, which led them to withdraw after randomization? What about the time-dependent drop-out rate in primary prevention trials? The primary outcome of this analysis is the point of drop-out from the trial, defined as the time from the randomization date to the withdrawal date. Survival functions were non-parametrically estimated using the product-limit estimator. The curves were statistically compared using the log-rank test (P = 0.64, not significant). Researchers involved in primary prevention RCTs seem to have to deal with the paradox of the proverbial "short blanket syndrome". Recruiting only highly motivated candidates might be useful for the smooth progress of the trial, but it may lead to a very low enrollment rate. On the other hand, what about enrolling all the eligible subjects without considering their motivation? This might boost the enrollment rate, but it can lead to biased

  8. Effect of mirtazapine versus selective serotonin reuptake inhibitors on benzodiazepine use in patients with major depressive disorder: a pragmatic, multicenter, open-label, randomized, active-controlled, 24-week trial.

    Science.gov (United States)

    Hashimoto, Tasuku; Shiina, Akihiro; Hasegawa, Tadashi; Kimura, Hiroshi; Oda, Yasunori; Niitsu, Tomihisa; Ishikawa, Masatomo; Tachibana, Masumi; Muneoka, Katsumasa; Matsuki, Satoshi; Nakazato, Michiko; Iyo, Masaomi

    2016-01-01

    This study aimed to evaluate whether selecting mirtazapine as the first choice for current depressive episode instead of selective serotonin reuptake inhibitors (SSRIs) reduces benzodiazepine use in patients with major depressive disorder (MDD). We concurrently examined the relationship between clinical responses and serum mature brain-derived neurotrophic factor (BDNF) and its precursor, proBDNF. We conducted an open-label randomized trial in routine psychiatric practice settings. Seventy-seven MDD outpatients were randomly assigned to the mirtazapine or predetermined SSRIs groups, and investigators arbitrarily selected sertraline or paroxetine. The primary outcome was the proportion of benzodiazepine users at weeks 6, 12, and 24 between the groups. We defined patients showing a ≥50 % reduction in Hamilton depression rating scale (HDRS) scores from baseline as responders. Blood samples were collected at baseline, weeks 6, 12, and 24. Sixty-five patients prescribed benzodiazepines from prescription day 1 were analyzed for the primary outcome. The percentage of benzodiazepine users was significantly lower in the mirtazapine than in the SSRIs group at weeks 6, 12, and 24 (21.4 vs. 81.8 %; 11.1 vs. 85.7 %, both P  depressive episodes may reduce benzodiazepine use in patients with MDD. Trial registration UMIN000004144. Registered 2nd September 2010. The date of enrolment of the first participant to the trial was 24th August 2010. This study was retrospectively registered 9 days after the first participant was enrolled.

  9. Pseudo-random number generator for the Sigma 5 computer

    Science.gov (United States)

    Carroll, S. N.

    1983-01-01

    A technique is presented for developing a pseudo-random number generator based on the linear congruential form. The two numbers used for the generator are a prime number and a corresponding primitive root, where the prime is the largest prime number that can be accurately represented on a particular computer. The primitive root is selected by applying Marsaglia's lattice test. The technique presented was applied to write a random number program for the Sigma 5 computer. The new program, named S:RANDOM1, is judged to be superior to the older program named S:RANDOM. For applications requiring several independent random number generators, a table is included showing several acceptable primitive roots. The technique and programs described can be applied to any computer having word length different from that of the Sigma 5.
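The generator described above is a multiplicative linear congruential (Lehmer) generator. A minimal sketch follows, using the widely documented Park-Miller constants m = 2^31 - 1 (a Mersenne prime) and a = 16807 (a primitive root modulo m) rather than any Sigma 5-specific values:

```python
def lehmer(seed, a=16807, m=2**31 - 1):
    """Multiplicative linear congruential generator (Lehmer form).

    m is the Mersenne prime 2^31 - 1 and a = 16807 is a primitive root
    modulo m, so the state visits every value in 1..m-1 before repeating
    (full period m - 1).
    """
    x = seed % m
    while True:
        x = (a * x) % m
        yield x

g = lehmer(1)
print(next(g))  # 16807
# Divide by m for uniform floats in (0, 1):
u = next(g) / (2**31 - 1)
```

With seed 1, the 10,000th output is 1043618065, the standard published check value for these constants, which makes a convenient self-test for any port of the generator.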

  10. Highly efficient cobalt-doped carbon nitride polymers for solvent-free selective oxidation of cyclohexane

    Directory of Open Access Journals (Sweden)

    Yu Fu

    2017-04-01

Selective oxidation of saturated hydrocarbons with molecular oxygen has been of great interest in catalysis, and the development of highly efficient catalysts for this process is a crucial challenge. A new kind of heterogeneous catalyst, cobalt-doped carbon nitride polymer (g-C3N4), was harnessed for the selective oxidation of cyclohexane. X-ray diffraction, Fourier transform infrared spectra and high-resolution transmission electron microscopy revealed that Co species were highly dispersed in the g-C3N4 matrix and that the characteristic structure of polymeric g-C3N4 was retained after Co-doping, although doping caused incomplete polymerization to some extent. Ultraviolet-visible, Raman and X-ray photoelectron spectroscopy further proved the successful doping of Co into the g-C3N4 matrix in the form of Co(II)-N bonds. For the selective oxidation of cyclohexane, Co-doping markedly promoted the catalytic performance of the g-C3N4 catalyst owing to the synergistic effect of the Co species and the g-C3N4 hybrid. Furthermore, the Co content largely affected the activity of the Co-doped g-C3N4 catalysts, among which the catalyst with 9.0 wt% Co exhibited the highest yield (9.0%) of cyclohexanone and cyclohexanol, as well as high stability. The reaction mechanism over Co-doped g-C3N4 catalysts is also elaborated. Keywords: Selective oxidation of cyclohexane, Oxygen oxidant, Carbon nitride, Co-doping

  11. Highly selective coulometric method and equipment for the automated determination of plutonium

    International Nuclear Information System (INIS)

    Jackson, D.D.; Hollen, R.M.; Roensch, F.R.; Rein, J.E.

    1977-01-01

    A highly selective, controlled-potential coulometric method has been developed for the determination of plutonium. An automated instrument, consisting of commercial electronic components under control of a programmable calculator, is being constructed. Half-cell potentials and interfering anions are listed

  12. Feature Selection via Chaotic Antlion Optimization.

    Directory of Open Access Journals (Sweden)

    Hossam M Zawbaa

Selecting a subset of relevant properties from a large set of features that describe a dataset is a challenging machine learning task. In biology, for instance, advances in the available technologies enable the generation of a very large number of biomarkers that describe the data. Choosing the more informative markers while performing high-accuracy classification over the data can be a daunting task, particularly if the data are high dimensional. An often adopted approach is to formulate the feature selection problem as a biobjective optimization problem, with the aim of maximizing the performance of the data analysis model (the quality of the fit to the training data) while minimizing the number of features used. We propose an optimization approach for the feature selection problem that considers a "chaotic" version of the antlion optimizer, a nature-inspired algorithm that mimics the hunting mechanism of antlions. The balance between exploration of the search space and exploitation of the best solutions is a challenge in multi-objective optimization. The exploration/exploitation rate is controlled by the parameter I, which limits the random walk range of the ants/prey; this variable is increased iteratively in a quasi-linear manner to decrease the exploration rate as the optimization progresses. The quasi-linear change in I may lead to premature convergence in some cases and trapping in local minima in others. The chaotic system proposed here attempts to improve the tradeoff between exploration and exploitation. The methodology is evaluated using different chaotic maps on a number of feature selection datasets. To ensure generality, we used ten biological datasets as well as other types of data from various sources. The results are compared with the particle swarm optimizer and with genetic algorithm variants for feature selection using a set of quality metrics.
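The role of the walk-range parameter I can be illustrated with a small sketch. The update rule and all constants below are invented for illustration (the paper's exact formula is not reproduced here); the point is only the contrast between a quasi-linear schedule and one modulated by a chaotic logistic map, which keeps the exploration rate from decaying monotonically.

```python
def logistic_map(x0=0.7, r=4.0):
    """Logistic map in its chaotic regime (r = 4); iterates stay in (0, 1)."""
    x = x0
    while True:
        x = r * x * (1 - x)
        yield x

def walk_shrink_factors(max_iter, chaotic=True):
    """Per-iteration shrink factor I for the antlion random-walk radius.

    The baseline schedule grows I quasi-linearly with iteration t (shrinking
    the walk range); the 'chaotic' variant perturbs that schedule with a
    logistic map so the exploration rate varies non-monotonically.
    """
    chaos = logistic_map()
    out = []
    for t in range(1, max_iter + 1):
        base = 1 + 10 * t / max_iter        # quasi-linear growth of I
        c = next(chaos) if chaotic else 1.0
        out.append(base * (0.5 + c))        # modulation factor stays positive
    return out

I_chaotic = walk_shrink_factors(100)
I_plain = walk_shrink_factors(100, chaotic=False)
```

The plain schedule is strictly increasing, so the walk range only ever shrinks; the chaotic schedule occasionally widens it again, which is the mechanism the paper relies on to escape local minima.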

  13. Computational design for a wide-angle cermet-based solar selective absorber for high temperature applications

    International Nuclear Information System (INIS)

    Sakurai, Atsushi; Tanikawa, Hiroya; Yamada, Makoto

    2014-01-01

The purpose of this study is to computationally design a wide-angle cermet-based solar selective absorber for high-temperature applications by using a characteristic matrix method and a genetic algorithm. The present study investigates a solar selective absorber with a tungsten–silica (W–SiO2) cermet. Multilayer structures of 1, 2, 3, and 4 layers and a wide range of metal volume fractions are optimized. The predicted radiative properties show good solar performance: thermal emittances, especially beyond 2 μm, are quite low, while solar absorptance remains high over a wide angular range, so that solar photons are effectively absorbed and infrared radiative heat loss is decreased. Highlights: • Electromagnetic simulation of radiative properties by the characteristic matrix method. • Optimization of a multilayered W–SiO2 cermet-based absorber by a genetic algorithm. • A solar selective absorber with high solar performance over a wide angular range is proposed.
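The characteristic matrix method mentioned above can be sketched for normal incidence on a lossless thin-film stack. The refractive indices and the quarter-wave test case below are textbook values chosen for illustration, not the paper's W–SiO2 cermet data:

```python
import cmath

def reflectance(n_layers, d_layers, wavelength, n_in=1.0, n_sub=1.5):
    """Normal-incidence reflectance of a thin-film stack via the
    characteristic matrix method (lossless, non-dispersive indices assumed).

    n_layers: refractive index of each layer, incidence side first;
    d_layers: physical thickness in the same units as wavelength.
    """
    M = [[1, 0], [0, 1]]                     # start from the identity matrix
    for n, d in zip(n_layers, d_layers):
        delta = 2 * cmath.pi * n * d / wavelength      # phase thickness
        m = [[cmath.cos(delta), 1j * cmath.sin(delta) / n],
             [1j * n * cmath.sin(delta), cmath.cos(delta)]]
        # Multiply the accumulated matrix by this layer's matrix
        M = [[M[0][0]*m[0][0] + M[0][1]*m[1][0], M[0][0]*m[0][1] + M[0][1]*m[1][1]],
             [M[1][0]*m[0][0] + M[1][1]*m[1][0], M[1][0]*m[0][1] + M[1][1]*m[1][1]]]
    # [B; C] = M @ [1; n_sub], then the amplitude reflection coefficient
    B = M[0][0] + M[0][1] * n_sub
    C = M[1][0] + M[1][1] * n_sub
    r = (n_in * B - C) / (n_in * B + C)
    return abs(r) ** 2

# Quarter-wave layer (n = 1.38, MgF2-like) on glass: classic antireflection case
R_bare = reflectance([], [], 550.0)
R_ar = reflectance([1.38], [550.0 / (4 * 1.38)], 550.0)
print(R_bare, R_ar)
```

For the bare substrate this returns the familiar 4% reflectance of glass, and the quarter-wave layer reproduces the textbook result R = ((n0*ns - n1^2)/(n0*ns + n1^2))^2; a genetic algorithm would wrap such a routine in a fitness function evaluated over wavelength and angle.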

  14. The Naples High- and Low-Excitability rats: selective breeding, behavioral profile, morphometry, and molecular biology of the mesocortical dopamine system.

    Science.gov (United States)

    Viggiano, Davide; Vallone, Daniela; Welzl, Hans; Sadile, Adolfo G

    2002-09-01

    The Naples High- (NHE) and Low-Excitability (NLE) rat lines have been selected since 1976 on the basis of behavioral arousal to novelty (Làt-maze). Selective breeding has been conducted under continuous genetic pressure, with no brother-sister mating. The behavioral analyses presented here deal with (1) activity in environments of different complexity, i.e., holeboard and Làt maze; (2) maze learning in hexagonal tunnel, Olton, and Morris water mazes and; (3) two-way active avoidance and conditioned taste aversion tests. Morphometric analyses deal with central dopaminergic systems at their origin and target sites, as well as the density of dopamine transporter immunoreactivity. Molecular biology analyses are also presented, dealing with recent experiments on the prefrontal cortex (PFc), cloning and identifying differentially expressed genes using subtractive libraries and RNAase protection. The divergence between NLE and NHE rats varies as a function of the complexity level of the environment, with an impaired working and reference memory in both lines compared to random bred (NRB) controls. Moreover, data from the PFc of NHE rats show a hyperdopaminergic innervation, with overexpression of mRNA species involved in basal metabolism, and down-regulation of dopamine D1 receptors. Altogether, the evidence gathered so far supports a hyperfunctioning mesocorticolimbic system that makes NHE rats a useful tool for the study of hyperactivity and attention deficit, learning and memory disabilities, and drug abuse.

  15. Review of Random Phase Encoding in Volume Holographic Storage

    Directory of Open Access Journals (Sweden)

    Wei-Chia Su

    2012-09-01

Random phase encoding is a unique technique for volume holograms, which can be applied to various applications such as holographic multiplexing storage, image encryption, and optical sensing. In this review article, we first review and discuss the diffraction selectivity of random phase encoding in volume holograms, which is the most important parameter related to the multiplexing capacity of volume holographic storage. We then review an image encryption system based on random phase encoding. The alignment of the phase key for decryption of the encoded image stored in holographic memory is analyzed and discussed. In the latter part of the review, an all-optical sensing system implemented by random phase encoding and holographic interconnection is presented.

  16. Highly selective electrocoagulation therapy: an innovative treatment for lymphangioma circumscriptum.

    Science.gov (United States)

    Yang, Xi; Jin, Yunbo; Chen, Hui; Li, Suolan; Ma, Gang; Hu, Xiaojie; Qiu, Yajing; Yu, Wenxin; Chang, Lei; Wang, Tianyou; Lin, Xiaoxi

    2014-08-01

    Lymphangioma circumscriptum (LC) is a type of microcystic lymphatic malformation involving the skin and mucosa that presents as translucent vesicles of varying size with a pink, red, or black hue. Lymphangioma circumscriptum causes not only cosmetic problems but also refractory rupture, infection, lymphorrhea, and bleeding. Various invasive methods, such as surgical excision, lasers, and sclerotherapy, have been used in the past to treat LC with varying success. Herein, we report a new treatment for the management of LC. This study reports the outcomes of 12 patients (aged 4-31 years) with LC treated by electrocoagulation using a special isolated needle. Patient demographics, lesion characteristics, radiologic findings, treatment course, and clinical responses are recorded. All 12 patients who were treated with the highly selective electrocoagulation therapy achieved near-complete clearance. Minimal intra- and postoperative sequelae were observed. The local complications included mild pain (n = 9), proliferous scarring (n = 1), and ulceration (n = 1) with no systemic side effects. The mean follow-up period was 8.25 months (3-14 months). Highly selective electrocoagulation therapy is an innovative, minimally invasive technique that seems to be safe and effective for the treatment of LC; the results from our limited study population seem promising, and the observed complications are acceptable.

  17. Managing salinity in Upper Colorado River Basin streams: Selecting catchments for sediment control efforts using watershed characteristics and random forests models

    Science.gov (United States)

    Tillman, Fred; Anning, David W.; Heilman, Julian A.; Buto, Susan G.; Miller, Matthew P.

    2018-01-01

    Elevated concentrations of dissolved-solids (salinity) including calcium, sodium, sulfate, and chloride, among others, in the Colorado River cause substantial problems for its water users. Previous efforts to reduce dissolved solids in upper Colorado River basin (UCRB) streams often focused on reducing suspended-sediment transport to streams, but few studies have investigated the relationship between suspended sediment and salinity, or evaluated which watershed characteristics might be associated with this relationship. Are there catchment properties that may help in identifying areas where control of suspended sediment will also reduce salinity transport to streams? A random forests classification analysis was performed on topographic, climate, land cover, geology, rock chemistry, soil, and hydrologic information in 163 UCRB catchments. Two random forests models were developed in this study: one for exploring stream and catchment characteristics associated with stream sites where dissolved solids increase with increasing suspended-sediment concentration, and the other for predicting where these sites are located in unmonitored reaches. Results of variable importance from the exploratory random forests models indicate that no simple source, geochemical process, or transport mechanism can easily explain the relationship between dissolved solids and suspended sediment concentrations at UCRB monitoring sites. Among the most important watershed characteristics in both models were measures of soil hydraulic conductivity, soil erodibility, minimum catchment elevation, catchment area, and the silt component of soil in the catchment. Predictions at key locations in the basin were combined with observations from selected monitoring sites, and presented in map-form to give a complete understanding of where catchment sediment control practices would also benefit control of dissolved solids in streams.

  18. Direct conversion of CO2 into liquid fuels with high selectivity over a bifunctional catalyst

    Science.gov (United States)

    Gao, Peng; Li, Shenggang; Bu, Xianni; Dang, Shanshan; Liu, Ziyu; Wang, Hui; Zhong, Liangshu; Qiu, Minghuang; Yang, Chengguang; Cai, Jun; Wei, Wei; Sun, Yuhan

    2017-10-01

    Although considerable progress has been made in carbon dioxide (CO2) hydrogenation to various C1 chemicals, it is still a great challenge to synthesize value-added products with two or more carbons, such as gasoline, directly from CO2 because of the extreme inertness of CO2 and a high C-C coupling barrier. Here we present a bifunctional catalyst composed of reducible indium oxides (In2O3) and zeolites that yields a high selectivity to gasoline-range hydrocarbons (78.6%) with a very low methane selectivity (1%). The oxygen vacancies on the In2O3 surfaces activate CO2 and hydrogen to form methanol, and C-C coupling subsequently occurs inside zeolite pores to produce gasoline-range hydrocarbons with a high octane number. The proximity of these two components plays a crucial role in suppressing the undesired reverse water gas shift reaction and giving a high selectivity for gasoline-range hydrocarbons. Moreover, the pellet catalyst exhibits a much better performance during an industry-relevant test, which suggests promising prospects for industrial applications.

  19. Use of Novel Highly Selective Ion Exchange Media for Minimizing the Waste Arising from Different NPP and Other Liquids

    International Nuclear Information System (INIS)

    Tusa, Esko; Harjula, Risto; Lehto, Jukka

    2003-01-01

Highly selective inorganic ion exchangers offer new possibilities for implementing and operating innovative treatment systems for radioactive liquids. Because of their high selectivity, these ion exchangers can be used even in liquids with high salt concentrations. Only the selected target nuclides are separated, and inactive salts are left in the liquid, which can then be released or recategorized. Thus, it is possible to reduce the volume of radioactive waste dramatically. Moreover, only a small volume of highly selective material is required, which makes it possible to design totally new types of compact treatment systems. The major benefit of selective ion exchange media comes from the very large volume reduction of radioactive waste in final disposal. It is also possible to save on investment costs, because small ion exchanger volumes can be used and handled in a very small facility. This paper describes different applications of these highly selective ion exchangers, both commercial full-scale applications and laboratory tests, to illustrate their efficiency for different liquids.

  20. Selective Predation of a Stalking Predator on Ungulate Prey.

    Directory of Open Access Journals (Sweden)

    Marco Heurich

Prey selection is a key factor shaping animal populations and evolutionary dynamics. An optimal forager should target prey that offers the highest benefits in terms of energy content at the lowest costs. Predators are therefore expected to select for prey of optimal size. Stalking predators do not pursue their prey for long, which may lead to a more random choice of prey individuals. Owing to the difficulty of assessing the composition of available prey populations, data on prey selection by stalking carnivores are still scarce. We show how the stalking predator Eurasian lynx (Lynx lynx) selects prey individuals based on species identity, age, sex and individual behaviour. To address the difficulties in assessing prey population structure, we confirm the inferred selection patterns using two independent data sets: (1) data on 387 documented kills by radio-collared lynx were compared to the prey population structure retrieved from systematic camera trapping using Manly's standardized selection ratio alpha, and (2) data on 120 radio-collared roe deer were analysed using a Cox proportional hazards model. Among the larger red deer prey, lynx selected against adult males, the largest and potentially most dangerous prey individuals. In roe deer, lynx preyed selectively on males and did not select for a specific age class. Activity during high-risk periods reduced the risk of falling victim to a lynx attack. Our results suggest that the stalking predator lynx actively selects for size, while prey behaviour induces selection via encounter and stalking success rates.

  1. Feature Selection with the Boruta Package

    Directory of Open Access Journals (Sweden)

    Miron B. Kursa

    2010-10-01

This article describes the R package Boruta, implementing a novel feature selection algorithm for finding all relevant variables. The algorithm is designed as a wrapper around a Random Forest classification algorithm. It iteratively removes the features that a statistical test shows to be less relevant than random probes. The Boruta package provides a convenient interface to the algorithm. A short description of the algorithm and examples of its application are presented.
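The core idea behind Boruta, comparing each real feature against "shadow" features obtained by permutation, can be sketched without R or a random forest. Here the absolute Pearson correlation stands in for the forest's importance measure; that substitution, the trial count, and the toy data below are illustrative assumptions only.

```python
import random

def boruta_sketch(X, y, n_trials=50, seed=0):
    """Boruta-style relevance test with a correlation proxy for importance.

    Real Boruta uses random-forest importances; |Pearson r| stands in here
    so the sketch stays dependency-free. A feature is flagged relevant if
    its importance beats the best shadow (permuted) feature in most trials.
    """
    rng = random.Random(seed)
    n, p = len(X), len(X[0])

    def abs_corr(col):
        mx = sum(col) / n
        my = sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(col, y))
        vx = sum((a - mx) ** 2 for a in col) ** 0.5
        vy = sum((b - my) ** 2 for b in y) ** 0.5
        return abs(cov / (vx * vy)) if vx and vy else 0.0

    cols = [[row[j] for row in X] for j in range(p)]
    hits = [0] * p
    for _ in range(n_trials):
        # Shadow features: permuted copies destroy any real association
        shadow_max = max(abs_corr(rng.sample(c, n)) for c in cols)
        for j, c in enumerate(cols):
            if abs_corr(c) > shadow_max:
                hits[j] += 1
    return [h > n_trials // 2 for h in hits]

# Toy data: feature 0 drives y, feature 1 is pure noise
data_rng = random.Random(1)
X = [[data_rng.random(), data_rng.random()] for _ in range(200)]
y = [row[0] * 2 + data_rng.gauss(0, 0.1) for row in X]
print(boruta_sketch(X, y))
```

The real package repeats this comparison with a binomial test and three-way decisions (confirmed, rejected, tentative) rather than the simple majority vote used above.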

  2. Visual inspection requirements for high-reliability random-access memories

    International Nuclear Information System (INIS)

    Andrade, A.; McHenery, J.

    1981-09-01

    Visual inspection requirements are given for random-access memories for deep-space satellite electronics. The requirements, based primarily on Military Standard 883B, are illustrated in the order of their manufacturing operation to clarify and facilitate inspection procedures

  3. A Study on Site Selecting for National Project including High Level Radioactive Waste Disposal

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kilyoo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

Many national projects are stalled because sites for them have not been determined. Site selections are blocked either by NIMBY ("not in my backyard") opposition to unpleasant facilities or by PIMFY ("please in my front yard") competition for preferable facilities among local governments. Typical examples are the following. NIMBY projects: high-level radioactive waste disposal, THAAD, nuclear power plants (NPPs), etc. PIMFY projects: the South-east new airport, KTX stations, a research center for NPP decommissioning, etc. Site selection for high-level radioactive waste disposal is the more difficult problem, and the government has not decided, postponing the issue into a dead-end street. Since there seems to be no solution to the siting of high-level radioactive waste disposal because of NIMBY attitudes among local governments, a solution method is proposed in this paper. To decide a high-level radioactive waste disposal site, the first step is to invite bids by offering a package deal that includes PIMFY projects such as a research center for NPP decommissioning. Potential host local governments could be asked to submit sealed bids indicating the minimum compensation for which they would accept the high-level radioactive waste disposal site. If more than one local government puts in a bid, an adequate site is then decided by considering both the accumulated PESS points and the technical evaluation results. By considering how fairly preferable and unpleasant national projects are distributed among local governments, a site selection scheme for NIMBY or PIMFY facilities is suggested. For NIMBY national projects, risk and cost-benefit analysis is useful and necessary, since it generates the cost values to be used in the PESS. In many cases the suggested method may not be adequate; however, a similar one should be prepared and serve as the basis for deciding sites for NIMBY or PIMFY national projects.

  4. A Monte Carlo study of adsorption of random copolymers on random surfaces

    CERN Document Server

    Moghaddam, M S

    2003-01-01

We study the adsorption problem of a random copolymer on a random surface, in which a self-avoiding walk in three dimensions interacts with a plane defining a half-space to which the walk is confined. Each vertex of the walk is randomly labelled A with probability p_p or B with probability 1 - p_p, and only vertices labelled A are attracted to the surface plane. Each lattice site on the plane is likewise labelled either A with probability p_s or B with probability 1 - p_s, and only lattice sites labelled A interact with the walk. We study two variations of this model: in the first case, the A-vertices of the walk interact only with the A-sites on the surface. In the second case the constraint of selective binding is removed; that is, any contact between the walk and the surface that involves an A-labelling, either from the surface or from the walk, is counted as a visit to the surface. The system is quenched in both cases, i.e. the labellings of the walk and of the surface are fixed as thermodynam...

  5. Selection of aptamers for Candida albicans by cell-SELEX

    International Nuclear Information System (INIS)

    Miranda, Alessandra Nunes Duarte

    2017-01-01

The growing concern with invasive fungal infections, responsible for an alarming mortality rate in immunosuppressed patients and in intensive care units, shows the need for a fast and specific method for Candida albicans detection, since this species is identified as one of the main causes of septicemia. It is commonly a challenge for clinicians to determine the primary infection focus, the degree of dissemination, or whether the site of a particular surgery is involved. Although scintigraphic imaging represents a promising tool for the detection of infectious foci, a methodology for C. albicans diagnosis is still lacking, owing to the absence of specific radiotracers for this microorganism. Aptamers are molecules with almost ideal properties for use as diagnostic radiopharmaceuticals, such as high specificity for their molecular targets, lack of immunogenicity and toxicity, high tissue penetration and rapid blood clearance. Aptamers can also be labeled with different radionuclides. This work aims to obtain aptamers that bind specifically to C. albicans cells, for future application as a radiopharmaceutical. A variation of the SELEX (Systematic Evolution of Ligands by EXponential Enrichment) technique, termed cell-SELEX, in which whole cells are the selection targets, was used. A selection protocol was standardized using a random library of single-stranded oligonucleotides, each containing two fixed regions flanking a sequence of 40 random nucleotides. This library was incubated with C. albicans cells in the presence of competitors. The binding sequences were then separated by centrifugation, resuspended and amplified by PCR. The amplification was confirmed by agarose gel electrophoresis. After that, the ligands were purified to obtain a new pool of ssDNA, with which a new incubation was carried out. The selection parameters were gradually made more stringent. This cycle was repeated 12 times to allow the selection of sequences with the maximum

  6. Site selection and characterization processes for deep geologic disposal of high level nuclear waste

    International Nuclear Information System (INIS)

    Costin, L.S.

    1997-10-01

In this paper, the major elements of the site selection and characterization processes used in the US high level waste program are discussed. While much of the evolution of these processes has been driven by the unique nature of the US program, the processes, which are well defined and documented, could serve as an initial basis for developing site screening, selection, and characterization programs in other countries. This paper therefore focuses more on the process elements than on the specific details of the US program.

  7. Decoys Selection in Benchmarking Datasets: Overview and Perspectives

    Science.gov (United States)

    Réau, Manon; Langenfeld, Florent; Zagury, Jean-François; Lagarde, Nathalie; Montes, Matthieu

    2018-01-01

Virtual Screening (VS) is designed to prospectively help identify potential hits, i.e., compounds capable of interacting with a given target and potentially modulating its activity, out of large compound collections. Among the variety of methodologies, it is crucial to select the protocol that is best adapted to the query/target system under study and that yields the most reliable output. To this aim, the performance of VS methods is commonly evaluated and compared by computing their ability to retrieve active compounds in benchmarking datasets. Benchmarking datasets contain a subset of known active compounds together with a subset of decoys, i.e., assumed non-active molecules. The composition of both the active and the decoy compound subsets is critical for limiting biases in the evaluation of VS methods. In this review, we focus on the selection of decoy compounds, which has changed considerably over the years, from randomly selected compounds to highly customized or experimentally validated negative compounds. We first outline the evolution of decoy selection in benchmarking databases, as well as current benchmarking databases that tend to minimize the introduction of biases, and second, we propose recommendations for the selection and design of benchmarking datasets. PMID:29416509

  8. Variance Component Selection With Applications to Microbiome Taxonomic Data

    Directory of Open Access Journals (Sweden)

    Jing Zhai

    2018-03-01

Full Text Available High-throughput sequencing technology has enabled population-based studies of the role of the human microbiome in disease etiology and exposure response. Microbiome data are summarized as counts or composition of the bacterial taxa at different taxonomic levels. An important problem is to identify the bacterial taxa that are associated with a response. One method is to test the association of a specific taxon with phenotypes in a linear mixed effect model, which incorporates phylogenetic information among bacterial communities. Another type of approach considers all taxa in a joint model and achieves selection via a penalization method, which ignores phylogenetic information. In this paper, we consider regression analysis treating bacterial taxa at different levels as multiple random effects. For each taxon, a kernel matrix is calculated based on distance measures in the phylogenetic tree and acts as one variance component in the joint model. Taxonomic selection is then achieved by the lasso (least absolute shrinkage and selection operator) penalty on the variance components. Our method integrates biological information into the variable selection problem and greatly improves selection accuracy. Simulation studies demonstrate the superiority of our method over existing methods such as the group lasso. Finally, we apply our method to a longitudinal microbiome study of Human Immunodeficiency Virus (HIV) infected patients. We implement our method in the high performance computing language Julia. Software and detailed documentation are freely available at https://github.com/JingZhai63/VCselection.
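The kernel-construction step described in the abstract, turning phylogenetic distances among taxa into a variance-component kernel, can be sketched as follows. The Gaussian transform, the bandwidth, and the toy distance matrix are illustrative assumptions; the paper's actual kernels may be defined differently.

```python
import numpy as np

def distance_to_kernel(D, bandwidth=1.0):
    """Convert a symmetric distance matrix D among taxa into a positive
    semi-definite (PSD) kernel via a Gaussian transform, then project onto
    the PSD cone by clipping any negative eigenvalues."""
    K = np.exp(-(D ** 2) / (2.0 * bandwidth ** 2))
    w, V = np.linalg.eigh((K + K.T) / 2.0)   # symmetrize, eigendecompose
    w = np.clip(w, 0.0, None)                # enforce PSD
    return V @ np.diag(w) @ V.T

# Toy pairwise phylogenetic distances among 4 taxa (two close pairs)
D = np.array([[0.0, 0.5, 2.0, 2.2],
              [0.5, 0.0, 1.9, 2.1],
              [2.0, 1.9, 0.0, 0.3],
              [2.2, 2.1, 0.3, 0.0]])
K = distance_to_kernel(D)
print(np.linalg.eigvalsh(K).min())  # should be >= 0 up to rounding
```

Phylogenetically close taxa (e.g., taxa 0 and 1) end up with larger kernel entries than distant ones, which is what lets the variance component borrow strength across related taxa.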

  9. AVN-492, A Novel Highly Selective 5-HT6R Antagonist: Preclinical Evaluation.

    Science.gov (United States)

    Ivachtchenko, Alexandre V; Okun, Ilya; Aladinskiy, Vladimir; Ivanenkov, Yan; Koryakova, Angela; Karapetyan, Ruben; Mitkin, Oleg; Salimov, Ramiz; Ivashchenko, Andrey

    2017-01-01

Discovery of the 5-HT6 receptor subtype and its exclusive localization within the central nervous system led to extensive investigation of its role in Alzheimer's disease, schizophrenia, and obesity. In the present study, we present the preclinical evaluation of a novel, highly potent and highly selective 5-HT6R antagonist, AVN-492. The affinity of AVN-492 for 5-HT6R (Ki = 91 pM) was more than three orders of magnitude higher than that for the only other appreciable target, 5-HT2BR (Ki = 170 nM). Thus, the compound displayed great 5-HT6R selectivity against all other serotonin receptor subtypes, and is extremely specific against other receptors such as adrenergic, GABAergic, dopaminergic, and histaminergic ones. AVN-492 demonstrates a good in vitro and in vivo ADME profile, with high oral bioavailability and good brain permeability in rodents. In behavioral tests, AVN-492 shows an anxiolytic effect in the elevated plus-maze model, prevents apomorphine-induced disruption of startle pre-pulse inhibition (the PPI model) and reverses scopolamine- and MK-801-induced memory deficits in a passive avoidance model. No anti-obesity effect of AVN-492 was found in a murine model. The data presented here strongly indicate that, owing to its high oral bioavailability, extremely high selectivity, and potency in blocking the 5-HT6 receptor, AVN-492 is a very promising tool for evaluating the role the 5-HT6 receptor might play in cognitive and neurodegenerative impairments. AVN-492 is an excellent drug candidate for the treatment of such diseases, and is currently being tested in Phase I trials.

  10. High-capacity thermo-responsive magnetic molecularly imprinted polymers for selective extraction of curcuminoids.

    Science.gov (United States)

    You, Qingping; Zhang, Yuping; Zhang, Qingwen; Guo, Junfang; Huang, Weihua; Shi, Shuyun; Chen, Xiaoqin

    2014-08-08

Thermo-responsive magnetic molecularly imprinted polymers (TMMIPs) for selective recognition of curcuminoids with high capacity and selectivity have been developed for the first time. The resulting TMMIPs were characterized by TEM, FT-IR, TGA, VSM and UV, which indicated that the TMMIPs show thermo-responsiveness [lower critical solution temperature (LCST) at 33.71°C] and rapid magnetic separation (5 s). The polymerization, adsorption and release conditions were optimized in detail to obtain the highest binding capacity, selectivity and release ratio. We found that the thermo-responsive monomer [N-isopropylacrylamide (NIPAm)] could act not only as an inert polymer backbone providing thermo-responsiveness but also as a functional co-monomer that, in combination with the basic monomer (4-VP), creates more specific binding sites when ethanol is added to the binding solution. The maximum adsorption capacity with the highest selectivity for curcumin was 440.3 μg/g (1.93 times that on MMIPs with no thermosensitivity) at 45°C (above the LCST) in 20% (v/v) ethanol solution on shrunken TMMIPs, and the maximum release proportion was about 98% at 20°C (below the LCST) in methanol-acetic acid (9/1, v/v) solution on swollen TMMIPs. The adsorption process between curcumin and the TMMIPs followed a Langmuir adsorption isotherm and pseudo-first-order reaction kinetics. The prepared TMMIPs also showed high reproducibility (RSD < 6% for batch-to-batch evaluation) and stability (only a 7% decrease after five cycles). The TMMIPs were then successfully applied to the selective extraction of curcuminoids from a complex natural product, Curcuma longa. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. High energy X-ray phase and dark-field imaging using a random absorption mask.

    Science.gov (United States)

    Wang, Hongchang; Kashyap, Yogesh; Cai, Biao; Sawhney, Kawal

    2016-07-28

High energy X-ray imaging has a unique advantage over conventional X-ray imaging, since it enables higher penetration into materials with significantly reduced radiation damage. However, the absorption contrast in the high energy region is considerably low owing to the reduced X-ray absorption cross section of most materials. Even though X-ray phase and dark-field imaging techniques can provide substantially increased contrast and complementary information, fabricating dedicated optics for high energies remains a challenge. To address this issue, we present an alternative X-ray imaging approach that produces transmission, phase and scattering signals at high X-ray energies using a random absorption mask. Importantly, in addition to a synchrotron radiation source, this approach has been demonstrated for practical imaging applications with a laboratory-based microfocus X-ray source. This new imaging method could be useful for studying thick samples or heavy materials in advanced materials science research.

  12. Assessment of pesticide residues on selected vegetables of Pakistan

    International Nuclear Information System (INIS)

    Khan, M.S.; Shah, M.M.

    2011-01-01

The present study was conducted to determine pesticide residues on selected summer vegetables. Five vegetables were grown, with three replicates, in a split-plot randomized complete block design. Pesticides were sprayed on the vegetables three times at regular 15-day intervals. At maturity, pesticide residues were extracted from the edible and leaf portions using anhydrous sodium sulfate and ethyl acetate, while adsorption chromatography was used for cleanup. The extracts were subjected to high performance liquid chromatography (HPLC) for separation and analysis of the compounds. Significant differences (p<0.05) were found in the pesticide residues on edible portions, whereas highly significant differences (p<0.001) were observed for the leafy portions. The residual level of cypermethrin was highest (16.2 mg kg⁻¹) in the edible portion of bitter gourd, while lambda-cyhalothrin and mancozeb residues were high (4.50 mg kg⁻¹ and 6.26 mg kg⁻¹) in the edible portions of bitter gourd and cucumber, respectively. Cypermethrin residues were high (1.86 mg kg⁻¹) in okra leaves. Mancozeb and lambda-cyhalothrin residual levels were high (1.23 mg kg⁻¹ and 0.0002 mg kg⁻¹) in chili and tomato leaves, respectively. Cypermethrin residues were readily detected in the edible and leaf portions of the selected vegetables. (author)

  13. Large motor units are selectively affected following a stroke.

    Science.gov (United States)

    Lukács, M; Vécsei, L; Beniczky, S

    2008-11-01

    Previous studies have revealed a loss of functioning motor units in stroke patients. However, it remained unclear whether the motor units are affected randomly or in some specific pattern. We assessed whether there is a selective loss of the large (high recruitment threshold) or the small (low recruitment threshold) motor units following a stroke. Forty-five stroke patients and 40 healthy controls participated in the study. Macro-EMG was recorded from the abductor digiti minimi muscle at two levels of force output (low and high). The median macro motor unit potential (macro-MUP) amplitude on the paretic side was compared with those on the unaffected side and in the controls. In the control group and on the unaffected side, the macro-MUPs were significantly larger at the high force output than at the low one. However, on the paretic side the macro-MUPs at the high force output had the same amplitude as those recorded at the low force output. These changes correlated with the severity of the paresis. Following a stroke, there is a selective functional loss of the large, high-threshold motor units. These changes are related to the severity of the symptoms. Our findings furnish further insight into the pathophysiology of the motor deficit following a stroke.

  14. Selection for the compactness of highly expressed genes in Gallus gallus

    Directory of Open Access Journals (Sweden)

    Zhou Ming

    2010-05-01

(n = 1105), and compared the first intron length and the average intron length between highly expressed genes (the top 5% of expressed genes) and weakly expressed genes (the bottom 5% of expressed genes). We found that the first intron length and the average intron length in highly expressed genes are not different from those in weakly expressed genes. We also made a comparison between ubiquitously expressed genes and narrowly expressed somatic genes with similar expression levels. Our data demonstrate that ubiquitously expressed genes are less compact than narrowly expressed genes with similar expression levels. Obviously, these observations cannot be explained by mutational bias hypotheses either. We also found that the significant trend between genes' compactness and expression level is not affected by local mutational biases. We argue that the selection-for-economy model is the most likely one to explain the relationship between gene expression and gene characteristics in the chicken genome. Conclusion Natural selection appears to favor the compactness of highly expressed genes in the chicken genome. This observation can be explained by the selection-for-economy model. Reviewers This article was reviewed by Dr. Gavin Huttley, Dr. Liran Carmel (nominated by Dr. Eugene V. Koonin) and Dr. Araxi Urrutia (nominated by Dr. Laurence D. Hurst).

  15. High Dimensional Classification Using Features Annealed Independence Rules.

    Science.gov (United States)

    Fan, Jianqing; Fan, Yingying

    2008-01-01

Classification using high-dimensional features arises frequently in many contemporary statistical studies, such as tumor classification using microarray or other high-throughput data. The impact of dimensionality on classification is poorly understood. In a seminal paper, Bickel and Levina (2004) showed that the Fisher discriminant performs poorly due to diverging spectra, and they proposed using the independence rule to overcome the problem. We first demonstrate that, even for the independence classification rule, classification using all the features can be as bad as random guessing, due to noise accumulation in estimating population centroids in high-dimensional feature space. In fact, we demonstrate further that almost all linear discriminants can perform as badly as random guessing. Thus, it is paramount to select a subset of important features for high-dimensional classification, resulting in Features Annealed Independence Rules (FAIR). We establish the conditions under which all the important features can be selected by the two-sample t-statistic. The choice of the optimal number of features, or equivalently the threshold value of the test statistic, is proposed based on an upper bound of the classification error. Simulation studies and real data analysis support our theoretical results and demonstrate convincingly the advantage of our new classification procedure.
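The screening step described above, ranking features by the two-sample t-statistic before classifying, can be illustrated with a small sketch. The synthetic data, the signal strength, and the fixed `n_keep` cutoff are assumptions for illustration; FAIR itself chooses the number of features from an upper bound on the classification error rather than fixing it in advance.

```python
import numpy as np

def t_statistic_ranking(X, y, n_keep):
    """Rank features by absolute two-sample (Welch) t-statistic, keep top n_keep.
    X: (n_samples, n_features) matrix; y: binary labels (0/1)."""
    X0, X1 = X[y == 0], X[y == 1]
    n0, n1 = len(X0), len(X1)
    t = (X1.mean(axis=0) - X0.mean(axis=0)) / np.sqrt(
        X1.var(axis=0, ddof=1) / n1 + X0.var(axis=0, ddof=1) / n0)
    return np.argsort(-np.abs(t))[:n_keep]

rng = np.random.default_rng(0)
n, p = 100, 1000
y = rng.integers(0, 2, size=n)
X = rng.normal(size=(n, p))
X[y == 1, :5] += 3.0          # only the first 5 features carry signal
selected = t_statistic_ranking(X, y, n_keep=5)
print(sorted(selected))       # the informative features should dominate
```

With a mean shift this large, the t-statistics of the five informative features are far above the noise level of the remaining 995, so screening recovers them reliably.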

  16. Selective attention to sound location or pitch studied with fMRI.

    Science.gov (United States)

    Degerman, Alexander; Rinne, Teemu; Salmi, Juha; Salonen, Oili; Alho, Kimmo

    2006-03-10

    We used 3-T functional magnetic resonance imaging to compare the brain mechanisms underlying selective attention to sound location and pitch. In different tasks, the subjects (N = 10) attended to a designated sound location or pitch or to pictures presented on the screen. In the Attend Location conditions, the sound location varied randomly (left or right), while the pitch was kept constant (high or low). In the Attend Pitch conditions, sounds of randomly varying pitch (high or low) were presented at a constant location (left or right). Both attention to location and attention to pitch produced enhanced activity (in comparison with activation caused by the same sounds when attention was focused on the pictures) in widespread areas of the superior temporal cortex. Attention to either sound feature also activated prefrontal and inferior parietal cortical regions. These activations were stronger during attention to location than during attention to pitch. Attention to location but not to pitch produced a significant increase of activation in the premotor/supplementary motor cortices of both hemispheres and in the right prefrontal cortex, while no area showed activity specifically related to attention to pitch. The present results suggest some differences in the attentional selection of sounds on the basis of their location and pitch consistent with the suggested auditory "what" and "where" processing streams.

  17. Selection of High Oil Yielding Trees of Millettia pinnata (L.) Panigrahi, Vegetative Propagation and Growth in the Field

    OpenAIRE

    Ni Luh Arpiwi; I Made Sutha Negara; I Nengah Simpen

    2017-01-01

    Millettia pinnata (L.) Panigrahi is a potential legume tree that produces seed oil for biodiesel feedstock. The initial step for raising a large-scale plantation of the species is selection of high oil yielding trees from the natural habitat. This is followed by vegetative propagation of the selected trees and then testing the growth of the clone in the field. The aim of the present study was to select high-oil yielding trees of M. pinnata, to propagate the selected trees by budding and to e...

  18. Natural selection on individual variation in tolerance of gastrointestinal nematode infection.

    Directory of Open Access Journals (Sweden)

    Adam D Hayward

    2014-07-01

Full Text Available Hosts may mitigate the impact of parasites by two broad strategies: resistance, which limits parasite burden, and tolerance, which limits the fitness or health cost of increasing parasite burden. The degree and causes of variation in both resistance and tolerance are expected to influence host-parasite evolutionary and epidemiological dynamics and to inform disease management, yet very little empirical work has addressed tolerance in wild vertebrates. Here, we applied random regression models to longitudinal data from an unmanaged population of Soay sheep to estimate individual tolerance, defined as the rate of decline in body weight with increasing burden of highly prevalent gastrointestinal nematode parasites. On average, individuals lost weight as parasite burden increased, but whereas some lost weight slowly as burden increased (exhibiting high tolerance), other individuals lost weight significantly more rapidly (exhibiting low tolerance). We then investigated associations between tolerance and fitness using selection gradients that accounted for selection on correlated traits, including body weight. We found evidence for positive phenotypic selection on tolerance: on average, individuals who lost weight more slowly with increasing parasite burden had higher lifetime breeding success. This variation did not have an additive genetic basis. These results reveal that selection on tolerance operates under natural conditions. They also support theoretical predictions for the erosion of additive genetic variance of traits under strong directional selection and the fixation of genes conferring tolerance. Our findings provide the first evidence of selection on individual tolerance of infection in animals and suggest practical applications in animal and human disease management in the face of highly prevalent parasites.

  19. Effectiveness of ancestral irradiation on the direct and correlated responses to selection for body weight in rats

    International Nuclear Information System (INIS)

    Gianola, D.

    1975-01-01

The effects of ancestral irradiation of rat spermatogonia (a cumulative total of 4050 R of X-rays) were studied in a highly inbred line of rats to explore the feasibility of using irradiation to enhance the effectiveness of selection. Six generations after irradiation was terminated, a selection experiment for body weight at six weeks of age was started in both the ancestrally irradiated and the non-irradiated populations. There were two non-contemporaneous replicates in each population. Within each ancestral treatment-replicate combination, one line was selected for high body weight at six weeks of age, one for low body weight, and a third line was maintained by random selection. In each line, mating of animals with grandparents in common was avoided where possible. Data on the first ten progeny generations of selection were included in this study. Five types of covariances among relatives were used to estimate causal components of variance for five different genetic models within the "non-irradiated" and "irradiated" randomly selected lines. The parameters in the genetic models were estimated by generalized least-squares. This analysis suggested that a genetic model including direct genetic and maternal genetic effects was adequate to describe body weights at 3, 6 and 10 weeks of age and the weight gains between these ages. Ancestral irradiation seemed to have enhanced the maternal genetic variance and the covariance between the direct genetic and the maternal genetic effects. On the basis of the above analysis, it was deduced that mass selection should have been more effective in the descendants of the irradiated males than in those of the non-irradiated males, as a consequence of greater phenotypic variability in their progeny and an enhancement in the regression of the genetic value on the selection criterion.

  20. A Cluster-Randomized Trial of Restorative Practices: An Illustration to Spur High-Quality Research and Evaluation.

    Science.gov (United States)

    Acosta, Joie D; Chinman, Matthew; Ebener, Patricia; Phillips, Andrea; Xenakis, Lea; Malone, Patrick S

    2016-01-01

Restorative practices in schools lack rigorous evaluation studies. As an example of rigorous school-based research, this paper describes the first randomized controlled trial of restorative practices to date, the Study of Restorative Practices. It is a five-year, cluster-randomized controlled trial (RCT) of the Restorative Practices Intervention (RPI) in 14 middle schools in Maine, designed to assess whether RPI affects both positive developmental outcomes and problem behaviors, and whether the effects persist during the transition from middle to high school. The two-year RPI intervention began in the 2014-2015 school year. The study's rationale and theoretical concerns are discussed, along with methodological concerns including teacher professional development. The theoretical rationale and description of the methods from this study may be useful to others conducting rigorous research and evaluation in this area.
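Cluster randomization of this kind, assigning whole schools rather than individual students to study arms, can be sketched in a few lines. The school names, the seed, and the balanced 7/7 split are illustrative assumptions, not details taken from the trial.

```python
import random

def cluster_randomize(clusters, n_treatment, seed=None):
    """Randomly assign whole clusters (e.g., schools) to treatment vs.
    control, with exactly n_treatment clusters in the treatment arm."""
    rng = random.Random(seed)
    shuffled = clusters[:]          # copy so the input order is untouched
    rng.shuffle(shuffled)
    treatment = set(shuffled[:n_treatment])
    return {c: ("treatment" if c in treatment else "control") for c in clusters}

schools = [f"school_{i:02d}" for i in range(1, 15)]   # 14 middle schools
arms = cluster_randomize(schools, n_treatment=7, seed=2014)
print(sum(v == "treatment" for v in arms.values()))   # 7
```

Randomizing at the cluster level avoids contamination between arms within a school, at the cost of statistical power, which is why cluster RCTs need analyses that account for within-cluster correlation.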

  1. Natural Selection as an Emergent Process: Instructional Implications

    Science.gov (United States)

    Cooper, Robert A.

    2017-01-01

    Student reasoning about cases of natural selection is often plagued by errors that stem from miscategorising selection as a direct, causal process, misunderstanding the role of randomness, and from the intuitive ideas of intentionality, teleology and essentialism. The common thread throughout many of these reasoning errors is a failure to apply…

  2. High temperature and bacteriophages can indirectly select for bacterial pathogenicity in environmental reservoirs.

    Directory of Open Access Journals (Sweden)

    Ville-Petri Friman

    2011-03-01

Full Text Available The coincidental evolution hypothesis predicts that traits connected to bacterial pathogenicity could be indirectly selected outside the host as a correlated response to abiotic environmental conditions or to different biotic species interactions. To investigate this, an opportunistic bacterial pathogen, Serratia marcescens, was cultured in the absence and presence of the lytic bacteriophage PPV (Podoviridae) at 25°C and 37°C for four weeks (N = 5). At the end, we measured changes in bacterial phage-resistance and potential virulence traits, and determined the pathogenicity of all bacterial selection lines in the Parasemia plantaginis insect model in vivo. Selection at 37°C increased bacterial motility and pathogenicity, but only in the absence of phages. Exposure to phages increased the phage-resistance of the bacteria, and this was costly in terms of decreased maximum population size in the absence of phages. However, this small-magnitude growth cost was not greater in bacteria that had evolved in the high temperature regime, and no trade-off was found between phage-resistance and growth rate. As a result, phages constrained the evolution of a temperature-mediated increase in bacterial pathogenicity, presumably by preferentially infecting the highly motile and virulent bacteria. From a more general perspective, our results suggest that traits connected to bacterial pathogenicity could be indirectly selected as a correlated response to abiotic and biotic factors in environmental reservoirs.

  3. Tuning PIM-PI-Based Membranes for Highly Selective Transport of Propylene/Propane

    KAUST Repository

    Swaidan, Ramy J.

    2016-12-06

To date there is a great deal of energetic and economic inefficiency in the separation of olefins from paraffins, because the principal means of achieving industrial purity requirements is very energy-intensive cryogenic distillation. Mitigating the severe energy intensity of the propylene/propane separation has been identified as one of seven chemical separations that could change the landscape of global energy use, and membranes have been targeted as an emerging technology because they offer scalability and lower capital and operating costs. The focus of this work was to evaluate a new direction of material development for the industrially important propylene/propane separation using membranes. The objective was to develop a rational design approach for generating highly selective membranes using a relatively new platform of materials known as polyimides of intrinsic microporosity (PIM-PIs), whose prospects have never been examined for the propylene/propane separation. Structurally, PIMs comprise relatively inflexible macromolecular architectures integrating contortion sites that help disrupt packing and trap microporous free volume elements (< 20 Å). To date, most of the work reported in the literature on this separation is based on conventional low-free-volume 6FDA-based polyimides, which in the best case show moderate C3H6/C3H8 selectivities (<20) with C3H6 permeabilities too low to garner industrial interest. Because of propylene's and propane's relatively large molecular sizes, we hypothesized that more open structures can provide the pore accessibility needed to enhance membrane sieving and flux. It has been shown for numerous key gas separations that introducing microporosity into a polymer structure can defy the notorious permeability/selectivity tradeoff curve and induce simultaneous boosts in both permeability and selectivity. The cornerstone approach to designing state of the art high

  4. Parent Training with High-Risk Immigrant Chinese Families: A Pilot Group Randomized Trial Yielding Practice-Based Evidence

    Science.gov (United States)

    Lau, Anna S.; Fung, Joey J.; Ho, Lorinda Y.; Liu, Lisa L.; Gudino, Omar G.

    2011-01-01

    We studied the efficacy and implementation outcomes of a culturally responsive parent training (PT) program. Fifty-four Chinese American parents participated in a wait-list controlled group randomized trial (32 immediate treatment, 22 delayed treatment) of a 14-week intervention designed to address the needs of high-risk immigrant families.…

  5. Conversion of the random amplified polymorphic DNA (RAPD ...

    African Journals Online (AJOL)

    Conversion of the random amplified polymorphic DNA (RAPD) marker UBC#116 linked to Fusarium crown and root rot resistance gene (Frl) into a co-dominant sequence characterized amplified region (SCAR) marker for marker-assisted selection of tomato.

  6. Random matrices and random difference equations

    International Nuclear Information System (INIS)

    Uppuluri, V.R.R.

    1975-01-01

Mathematical models leading to products of random matrices and random difference equations are discussed. A one-compartment model with random behavior is introduced, and it is shown how the average concentration in the discrete-time model converges to the exponential function. This is relevant to understanding how radioactivity gets trapped in bone structure in blood-bone systems. The ideas are then generalized to two-compartment models and mammillary systems, where products of random matrices appear in a natural way. The appearance of products of random matrices in applications in demography and control theory is considered. Random sequences motivated by the following problems are then studied: constant pulsing and random decay models, random pulsing and constant decay models, and random pulsing and random decay models
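The one-compartment claim, that the average concentration in the discrete-time random-decay model follows an exponential in the number of steps, can be checked numerically. A minimal sketch, where the uniform decay-factor distribution is an assumption chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
x0 = 1.0            # initial concentration
steps = 30          # discrete time steps
trials = 200_000    # independent realizations

# Random decay: at each step the concentration is multiplied by an
# independent factor R drawn uniformly from [0.6, 1.0], so E[R] = 0.8.
R = rng.uniform(0.6, 1.0, size=(trials, steps))
paths = x0 * np.cumprod(R, axis=1)

avg = paths.mean(axis=0)
expected = x0 * 0.8 ** np.arange(1, steps + 1)   # E[x_n] = x0 * E[R]^n

# The empirical average tracks the exponential E[R]^n closely
print(np.max(np.abs(avg - expected)))
```

Because the factors are independent, E[x_n] = x0 E[R]^n exactly; writing E[R] = e^{-λ} shows the mean decays as x0 e^{-λn}, the exponential the abstract refers to.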

  7. Employing online quantum random number generators for generating truly random quantum states in Mathematica

    Science.gov (United States)

    Miszczak, Jarosław Adam

    2013-01-01

    The presented package for the Mathematica computing system allows the harnessing of quantum random number generators (QRNG) for investigating the statistical properties of quantum states. The described package implements a number of functions for generating random states. The new version of the package adds the ability to use the on-line quantum random number generator service and implements new functions for retrieving lists of random numbers. Thanks to the introduced improvements, the new version provides faster access to high-quality sources of random numbers and can be used in simulations requiring large amounts of random data. New version program summary: Program title: TRQS; Catalogue identifier: AEKA_v2_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKA_v2_0.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html; No. of lines in distributed program, including test data, etc.: 18 134; No. of bytes in distributed program, including test data, etc.: 2 520 49; Distribution format: tar.gz; Programming language: Mathematica, C; Computer: Any supporting Mathematica in version 7 or higher; Operating system: Any platform supporting Mathematica, tested with GNU/Linux (32 and 64 bit); RAM: Case-dependent; Supplementary material: Fig. 1 mentioned below can be downloaded; Classification: 4.15; External routines: Quantis software library (http://www.idquantique.com/support/quantis-trng.html); Catalogue identifier of previous version: AEKA_v1_0; Journal reference of previous version: Comput. Phys. Comm. 183(2012)118; Does the new version supersede the previous version?: Yes. Nature of problem: Generation of random density matrices and utilization of high-quality random numbers for the purpose of computer simulation. Solution method: Use of a physical quantum random number generator and an on-line service providing access to the source of true random

  8. PONTIAC (NT-proBNP selected prevention of cardiac events in a population of diabetic patients without a history of cardiac disease): a prospective randomized controlled trial.

    Science.gov (United States)

    Huelsmann, Martin; Neuhold, Stephanie; Resl, Michael; Strunk, Guido; Brath, Helmut; Francesconi, Claudia; Adlbrecht, Christopher; Prager, Rudolf; Luger, Anton; Pacher, Richard; Clodi, Martin

    2013-10-08

    The study sought to assess the primary preventive effect of neurohumoral therapy in high-risk diabetic patients selected by N-terminal pro-B-type natriuretic peptide (NT-proBNP). Few clinical trials have successfully demonstrated the prevention of cardiac events in patients with diabetes. One reason for this might be an inaccurate selection of patients. NT-proBNP has not been assessed in this context. A total of 300 patients with type 2 diabetes and elevated NT-proBNP (>125 pg/ml) but free of cardiac disease were randomized. The "control" group was cared for at 4 diabetes care units; the "intensified" group was additionally treated at a cardiac outpatient clinic for the up-titration of renin-angiotensin system (RAS) antagonists and beta-blockers. The primary endpoint was hospitalization/death due to cardiac disease after 2 years. At baseline, the mean age of the patients was 67.5 ± 9 years, duration of diabetes was 15 ± 12 years, 37% were male, HbA1c was 7 ± 1.1%, blood pressure was 151 ± 22 mm Hg, heart rate was 72 ± 11 beats/min, and median NT-proBNP was 265.5 pg/ml (interquartile range: 180.8 to 401.8 pg/ml). After 12 months there was a significant difference between groups in the number of patients treated with a RAS antagonist/beta-blocker and in the dosages reached. Up-titration of RAS antagonists and beta-blockers to maximum tolerated dosages is an effective and safe intervention for the primary prevention of cardiac events for diabetic patients pre-selected using NT-proBNP. (Nt-proBNP Guided Primary Prevention of CV Events in Diabetic Patients [PONTIAC]; NCT00562952). Copyright © 2013 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  9. High Dimensional Spectral Graph Theory and Non-backtracking Random Walks on Graphs

    Science.gov (United States)

    Kempton, Mark

    This thesis has two primary areas of focus. First, we study connection graphs, which are weighted graphs in which each edge is associated with a d-dimensional rotation matrix for some fixed dimension d, in addition to a scalar weight. Second, we study non-backtracking random walks on graphs, which are random walks with the additional constraint that they cannot return to the immediately previous state at any given step. Our work in connection graphs is centered on the notion of consistency, that is, the product of rotations moving from one vertex to another is independent of the path taken, and a generalization called epsilon-consistency. We present higher dimensional versions of the combinatorial Laplacian matrix and normalized Laplacian matrix from spectral graph theory, and give results characterizing the consistency of a connection graph in terms of the spectra of these matrices. We generalize several tools from classical spectral graph theory, such as PageRank and effective resistance, to apply to connection graphs. We use these tools to give algorithms for sparsification, clustering, and noise reduction on connection graphs. In non-backtracking random walks, we address the question raised by Alon et al. concerning how the mixing rate of a non-backtracking random walk to its stationary distribution compares to the mixing rate for an ordinary random walk. Alon et al. address this question for regular graphs. We take a different approach, and use a generalization of Ihara's Theorem to give a new proof of Alon's result for regular graphs, and to extend the result to biregular graphs. Finally, we give a non-backtracking version of Pólya's Random Walk Theorem for 2-dimensional grids.
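    A non-backtracking random walk, as studied in this thesis, is simple to state in code. The sketch below is an illustration under the standard definition, not the author's implementation: it walks an undirected graph while forbidding an immediate return to the previous vertex, falling back to backtracking only at a degree-1 dead end.

```python
import random

def non_backtracking_walk(adj, start, steps, seed=None):
    """Random walk on an undirected graph (adjacency dict mapping each
    vertex to a list of neighbours) that never returns to the immediately
    previous vertex unless that is the only available move."""
    rng = random.Random(seed)
    path = [start]
    prev = None
    for _ in range(steps):
        here = path[-1]
        choices = [v for v in adj[here] if v != prev]
        if not choices:           # degree-1 dead end: backtracking is forced
            choices = adj[here]
        prev = here
        path.append(rng.choice(choices))
    return path
```

On any graph with minimum degree 2 (for example a cycle), the constraint guarantees `path[i] != path[i-2]` at every step, which is exactly what distinguishes this walk from an ordinary random walk.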

  10. Highly Selective Liquid-Phase Benzylation of Anisole with Solid-Acid Zeolite Catalysts

    DEFF Research Database (Denmark)

    Poreddy, Raju; Shunmugavel, Saravanamurugan; Riisager, Anders

    2015-01-01

    Zeolites were evaluated as solid acid catalysts for the liquid-phase benzylation of anisole with benzyl alcohol, benzyl bromide, and benzyl chloride at 80 °C. Among the examined zeolites, H-mordenite-10 (H-MOR-10) demonstrated particularly high activity (>99 %) and excellent selectivity (>96...

  11. High resolution x-ray fluorescence spectroscopy - a new technique for site- and spin-selectivity

    International Nuclear Information System (INIS)

    Wang, Xin

    1996-12-01

    X-ray spectroscopy has long been used to elucidate electronic and structural information of molecules. One of the weaknesses of x-ray absorption is its sensitivity to all of the atoms of a particular element in a sample. Throughout this thesis, a new technique for enhancing the site- and spin-selectivity of x-ray absorption has been developed. By high resolution fluorescence detection, the chemical sensitivity of K emission spectra can be used to identify oxidation and spin states; it can also be used to facilitate site-selective X-ray Absorption Near Edge Structure (XANES) and site-selective Extended X-ray Absorption Fine Structure (EXAFS). The spin polarization in K fluorescence could be used to generate spin selective XANES or spin-polarized EXAFS, which provides a new measure of the spin density, or the nature of magnetic neighboring atoms. Finally, dramatic line-sharpening effects by the combination of absorption and emission processes allow observation of structure that is normally unobservable. All these unique characteristics can enormously simplify a complex x-ray spectrum. Applications of this novel technique have generated information from various transition-metal model compounds to metalloproteins. The absorption and emission spectra by high resolution fluorescence detection are interdependent. The ligand field multiplet model has been used for the analysis of Kα and Kβ emission spectra. A first demonstration on different chemical states of Fe compounds has shown the applicability of site selectivity and spin polarization. Different interatomic distances of the same element in different chemical forms have been detected using site-selective EXAFS

  12. Effect of a Counseling Session Bolstered by Text Messaging on Self-Selected Health Behaviors in College Students: A Preliminary Randomized Controlled Trial.

    Science.gov (United States)

    Sandrick, Janice; Tracy, Doreen; Eliasson, Arn; Roth, Ashley; Bartel, Jeffrey; Simko, Melanie; Bowman, Tracy; Harouse-Bell, Karen; Kashani, Mariam; Vernalis, Marina

    2017-05-17

    The college experience is often the first time when young adults live independently and make their own lifestyle choices. These choices affect dietary behaviors, exercise habits, techniques to deal with stress, and decisions on sleep time, all of which direct the trajectory of future health. There is a need for effective strategies that will encourage healthy lifestyle choices in young adults attending college. This preliminary randomized controlled trial tested the effect of coaching and text messages (short message service, SMS) on self-selected health behaviors in the domains of diet, exercise, stress, and sleep. A second analysis measured the ripple effect of the intervention on health behaviors not specifically selected as a goal by participants. Full-time students aged 18-30 years were recruited by word of mouth and campuswide advertisements (flyers, posters, mailings, university website) at a small university in western Pennsylvania from January to May 2015. Exclusions included pregnancy, eating disorders, chronic medical diagnoses, and prescription medications other than birth control. Of 60 participants, 30 were randomized to receive a single face-to-face meeting with a health coach to review results of behavioral questionnaires and to set a health behavior goal for the 8-week study period. The face-to-face meeting was followed by SMS text messages designed to encourage achievement of the behavioral goal. A total of 30 control subjects underwent the same health and behavioral assessments at intake and program end but did not receive coaching or SMS text messages. The texting app showed that 87.31% (2187/2505) of messages were viewed by intervention participants. Furthermore, 28 of the 30 intervention participants and all 30 control participants provided outcome data. Among intervention participants, 22 of 30 (73%) showed improvement in health behavior goal attainment, with the whole group (n=30) showing a mean improvement of 88% (95% CI 39-136). Mean

  13. Sampling large random knots in a confined space

    International Nuclear Information System (INIS)

    Arsuaga, J; Blackstone, T; Diao, Y; Hinson, K; Karadayi, E; Saito, M

    2007-01-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications
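    The uniform random polygon model used in these records is straightforward to sample: draw n vertices uniformly from the unit cube, join them in order into a closed polygon, and project to a plane to obtain a knot diagram. The sketch below is an illustrative reconstruction, not the authors' code; it generates such a polygon and counts crossings between non-adjacent edges of the projected diagram, the quantity whose average grows on the order of O(n^2).

```python
import random

def uniform_random_polygon(n, seed=None):
    """n vertices drawn independently and uniformly from the unit cube,
    joined in order (and back to the start) into a closed polygon."""
    rng = random.Random(seed)
    return [(rng.random(), rng.random(), rng.random()) for _ in range(n)]

def _segments_cross(p, q, r, s):
    # Proper-intersection test for 2D segments pq and rs via orientations.
    def orient(a, b, c):
        d = (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
        return (d > 0) - (d < 0)
    return (orient(p, q, r) * orient(p, q, s) < 0 and
            orient(r, s, p) * orient(r, s, q) < 0)

def diagram_crossings(polygon):
    """Project the polygon onto the xy-plane and count crossings between
    non-adjacent edges of the resulting closed diagram."""
    n = len(polygon)
    pts = [(x, y) for x, y, _ in polygon]
    edges = [(pts[i], pts[(i + 1) % n]) for i in range(n)]
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            if j == i + 1 or (i == 0 and j == n - 1):
                continue  # adjacent edges share a vertex; skip them
            if _segments_cross(*edges[i], *edges[j]):
                count += 1
    return count
```

Counting crossings this way is an O(n^2) pairwise check, which is what makes the model practical for sampling large diagrams.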

  14. Sampling large random knots in a confined space

    Science.gov (United States)

    Arsuaga, J.; Blackstone, T.; Diao, Y.; Hinson, K.; Karadayi, E.; Saito, M.

    2007-09-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.

  15. Sampling large random knots in a confined space

    Energy Technology Data Exchange (ETDEWEB)

    Arsuaga, J [Department of Mathematics, San Francisco State University, 1600 Holloway Ave, San Francisco, CA 94132 (United States); Blackstone, T [Department of Computer Science, San Francisco State University, 1600 Holloway Ave., San Francisco, CA 94132 (United States); Diao, Y [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Hinson, K [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Karadayi, E [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States); Saito, M [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States)

    2007-09-28

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.

  16. Efficient and Highly Aldehyde Selective Wacker Oxidation

    KAUST Repository

    Teo, Peili; Wickens, Zachary K.; Dong, Guangbin; Grubbs, Robert H.

    2012-01-01

    A method for efficient and aldehyde-selective Wacker oxidation of aryl-substituted olefins using PdCl2(MeCN)2, 1,4-benzoquinone, and t-BuOH in air is described. Up to a 96% yield of aldehyde can be obtained, and up to 99% selectivity can be achieved with styrene-related substrates. © 2012 American Chemical Society.

  17. Efficient and Highly Aldehyde Selective Wacker Oxidation

    KAUST Repository

    Teo, Peili

    2012-07-06

    A method for efficient and aldehyde-selective Wacker oxidation of aryl-substituted olefins using PdCl2(MeCN)2, 1,4-benzoquinone, and t-BuOH in air is described. Up to a 96% yield of aldehyde can be obtained, and up to 99% selectivity can be achieved with styrene-related substrates. © 2012 American Chemical Society.

  18. Yoga Improves Academic Performance in Urban High School Students Compared to Physical Education: A Randomized Controlled Trial

    Science.gov (United States)

    Hagins, Marshall; Rundle, Andrew

    2016-01-01

    Yoga programs within schools have become more widespread but research regarding the potential effect on academic achievement remains limited. This study cluster-randomized 112 students within a single New York City public high school to participate in either school-based yoga or physical education (PE) for an entire academic year. The primary…

  19. Site selection and characterization processes for deep geologic disposal of high level nuclear waste

    International Nuclear Information System (INIS)

    Costin, L.S.

    1997-01-01

    In this paper, the major elements of the site selection and characterization processes used in the U. S. high level waste program are discussed. While much of the evolution of the site selection and characterization processes have been driven by the unique nature of the U. S. program, these processes, which are well-defined and documented, could be used as an initial basis for developing site screening, selection, and characterization programs in other countries. Thus, this paper focuses more on the process elements than the specific details of the U. S. program. (author). 3 refs., 2 tabs., 5 figs

  20. Site selection and characterization processes for deep geologic disposal of high level nuclear waste

    Energy Technology Data Exchange (ETDEWEB)

    Costin, L.S. [Sandia National Labs., Albuquerque, NM (United States)

    1997-12-31

    In this paper, the major elements of the site selection and characterization processes used in the U. S. high level waste program are discussed. While much of the evolution of the site selection and characterization processes have been driven by the unique nature of the U. S. program, these processes, which are well-defined and documented, could be used as an initial basis for developing site screening, selection, and characterization programs in other countries. Thus, this paper focuses more on the process elements than the specific details of the U. S. program. (author). 3 refs., 2 tabs., 5 figs.

  1. Bridging Emergent Attributes and Darwinian Principles in Teaching Natural Selection

    Science.gov (United States)

    Xu, Dongchen; Chi, Michelene T. H.

    2016-01-01

    Students often have misconceptions about natural selection as they misuse a direct causal schema to explain the process. Natural selection is in fact an emergent process where random interactions lead to changes in a population. The misconceptions stem from students' lack of emergent schema for natural selection. In order to help students…

  2. Selection of representative calibration sample sets for near-infrared reflectance spectroscopy to predict nitrogen concentration in grasses

    DEFF Research Database (Denmark)

    Shetty, Nisha; Rinnan, Åsmund; Gislum, René

    2012-01-01

    ) algorithm were used and compared. Both the Puchwein and CADEX methods provide a calibration set equally distributed in space, and both methods require minimal prior knowledge. The samples were also selected randomly using complete random, cultivar random (year fixed), year random (cultivar fixed......) and interaction (cultivar × year fixed) random procedures to see the influence of different factors on sample selection. Puchwein's method performed best, with the lowest RMSEP, followed by CADEX, interaction random, year random, cultivar random and complete random. Out of 118 samples of the complete calibration set...... effectively enhance the cost-effectiveness of NIR spectral analysis by reducing the number of analyzed samples in the calibration set by more than 80%, which substantially reduces the effort of laboratory analyses with no significant loss in prediction accuracy....
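    The CADEX method referred to above is the Kennard-Stone procedure: seed the calibration set with the two mutually most distant samples, then repeatedly add the sample whose nearest already-selected neighbour is farthest away, yielding a set spread evenly through descriptor space. A minimal sketch (illustrative only; plain numeric tuples stand in for NIR spectra):

```python
def kennard_stone(samples, k):
    """Select k sample indices spread evenly in feature space (CADEX /
    Kennard-Stone scheme): start from the two mutually most distant points,
    then repeatedly add the sample whose nearest selected neighbour is
    farthest away."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    n = len(samples)
    # Seed the selection with the most distant pair.
    i0, j0 = max(((i, j) for i in range(n) for j in range(i + 1, n)),
                 key=lambda ij: dist2(samples[ij[0]], samples[ij[1]]))
    selected = [i0, j0]
    remaining = set(range(n)) - set(selected)
    while len(selected) < k and remaining:
        best = max(remaining,
                   key=lambda r: min(dist2(samples[r], samples[s])
                                     for s in selected))
        selected.append(best)
        remaining.remove(best)
    return selected
```

Because each new sample maximizes its distance to the current set, the calibration set covers the descriptor space uniformly, which is the property the abstract credits to both the Puchwein and CADEX schemes.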

  3. Selective mutism.

    Science.gov (United States)

    Hua, Alexandra; Major, Nili

    2016-02-01

    Selective mutism is a disorder in which an individual fails to speak in certain social situations though speaks normally in other settings. Most commonly, this disorder initially manifests when children fail to speak in school. Selective mutism results in significant social and academic impairment in those affected by it. This review will summarize the current understanding of selective mutism with regard to diagnosis, epidemiology, cause, prognosis, and treatment. Studies over the past 20 years have consistently demonstrated a strong relationship between selective mutism and anxiety, most notably social phobia. These findings have led to the recent reclassification of selective mutism as an anxiety disorder in the Diagnostic and Statistical Manual of Mental Disorders, 5th Edition. In addition to anxiety, several other factors have been implicated in the development of selective mutism, including communication delays and immigration/bilingualism, adding to the complexity of the disorder. In the past few years, several randomized studies have supported the efficacy of psychosocial interventions based on a graduated exposure to situations requiring verbal communication. Less data are available regarding the use of pharmacologic treatment, though there are some studies that suggest a potential benefit. Selective mutism is a disorder that typically emerges in early childhood and is currently conceptualized as an anxiety disorder. The development of selective mutism appears to result from the interplay of a variety of genetic, temperamental, environmental, and developmental factors. Although little has been published about selective mutism in the general pediatric literature, pediatric clinicians are in a position to play an important role in the early diagnosis and treatment of this debilitating condition.

  4. High-Lift Propeller System Configuration Selection for NASA's SCEPTOR Distributed Electric Propulsion Flight Demonstrator

    Science.gov (United States)

    Patterson, Michael D.; Derlaga, Joseph M.; Borer, Nicholas K.

    2016-01-01

    Although the primary function of propellers is typically to produce thrust, aircraft equipped with distributed electric propulsion (DEP) may utilize propellers whose main purpose is to act as a form of high-lift device. These "high-lift propellers" can be placed upstream of a wing such that, when the higher-velocity flow in the propellers' slipstreams interacts with the wing, the lift is increased. This technique is a main design feature of a new NASA advanced design project called Scalable Convergent Electric Propulsion Technology Operations Research (SCEPTOR). The goal of the SCEPTOR project is to design, build, and fly a DEP aircraft to demonstrate that such an aircraft can be much more efficient than conventional designs. This paper provides details on the high-lift propeller system configuration selection for the SCEPTOR flight demonstrator. The methods used in the high-lift propeller system conceptual design and the tradeoffs considered in selecting the number of propellers are discussed.

  5. Benchmarking Variable Selection in QSAR.

    Science.gov (United States)

    Eklund, Martin; Norinder, Ulf; Boyer, Scott; Carlsson, Lars

    2012-02-01

    Variable selection is important in QSAR modeling since it can improve model performance and transparency, as well as reduce the computational cost of model fitting and predictions. Which variable selection methods perform well in QSAR settings is largely unknown. To address this question we, in a total of 1728 benchmarking experiments, rigorously investigated how eight variable selection methods affect the predictive performance and transparency of random forest models fitted to seven QSAR datasets covering different endpoints, descriptor sets, types of response variables, and numbers of chemical compounds. The results show that univariate variable selection methods are suboptimal and that the number of variables in the benchmarked datasets can be reduced by about 60% without significant loss in model performance when using multivariate adaptive regression splines (MARS) and forward selection. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
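    Forward selection, one of the better-performing methods in this benchmark, is a greedy loop: starting from the empty variable set, add the variable that most improves a model-quality score, and stop when no addition helps. The sketch below is a generic illustration with a pluggable score function; it does not reproduce the paper's random forest setup.

```python
def forward_selection(variables, score, min_gain=0.0):
    """Greedy forward selection: starting from the empty set, repeatedly add
    the variable that most improves score(subset), stopping when no addition
    improves the score by more than min_gain."""
    selected = []
    current = score(selected)
    while True:
        candidates = [v for v in variables if v not in selected]
        if not candidates:
            break
        best = max(candidates, key=lambda v: score(selected + [v]))
        gain = score(selected + [best]) - current
        if gain <= min_gain:
            break  # no candidate improves the model enough
        selected.append(best)
        current += gain
    return selected
```

In practice `score` would be a cross-validated model metric (e.g. negative RMSE of a random forest refit on the candidate subset); here any callable taking a variable list works.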

  6. Feature Selection for Chemical Sensor Arrays Using Mutual Information

    Science.gov (United States)

    Wang, X. Rosalind; Lizier, Joseph T.; Nowotny, Thomas; Berna, Amalia Z.; Prokopenko, Mikhail; Trowell, Stephen C.

    2014-01-01

    We address the problem of feature selection for classifying a diverse set of chemicals using an array of metal oxide sensors. Our aim is to evaluate a filter approach to feature selection with reference to previous work, which used a wrapper approach on the same data set, and established best features and upper bounds on classification performance. We selected feature sets that exhibit the maximal mutual information with the identity of the chemicals. The selected features closely match those found to perform well in the previous study using a wrapper approach to conduct an exhaustive search of all permitted feature combinations. By comparing the classification performance of support vector machines (using features selected by mutual information) with the performance observed in the previous study, we found that while our approach does not always give the maximum possible classification performance, it always selects features that achieve classification performance approaching the optimum obtained by exhaustive search. We performed further classification using the selected feature set with some common classifiers and found that, for the selected features, Bayesian Networks gave the best performance. Finally, we compared the observed classification performances with the performance of classifiers using randomly selected features. We found that the selected features consistently outperformed randomly selected features for all tested classifiers. The mutual information filter approach is therefore a computationally efficient method for selecting near optimal features for chemical sensor arrays. PMID:24595058
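    The filter approach described in this record ranks features by their mutual information with the class identity and keeps the top-scoring ones. A minimal sketch for discrete-valued features (an illustration of the general technique, not the authors' code):

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """I(X;Y) in bits for two paired sequences of discrete values."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def select_features(features, labels, k):
    """Rank feature columns by mutual information with the labels and keep
    the indices of the top k (the filter approach to feature selection)."""
    mi = [(mutual_information(col, labels), i) for i, col in enumerate(features)]
    mi.sort(reverse=True)
    return [i for _, i in mi[:k]]
```

Real sensor responses are continuous, so in practice they would first be discretized (or a continuous MI estimator used); the ranking step is unchanged.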

  7. Quality pseudo-random number generator

    International Nuclear Information System (INIS)

    Tarasiuk, J.

    1996-01-01

    The pseudo-random number generator (RNG) was written to match the needs of nuclear and high-energy physics computations, which in some cases require very long and independent random number sequences. In this random number generator the repetition period is about 10^36, which should be sufficient for all computers in the world. In this article, test results on RNG correlation, speed and identity of computations across PC, Sun4 and VAX computers are presented
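    The record does not describe the generator's construction. One standard way to obtain very long periods is to combine several component generators whose periods are pairwise coprime, so that the combined period is their least common multiple; the classic Wichmann-Hill generator sketched below (not the generator of this article) combines three small linear congruential generators this way.

```python
from math import lcm

# Wichmann-Hill moduli; the component periods m-1 are pairwise coprime
# apart from powers of 2, so the combined period is their lcm.
m1, m2, m3 = 30269, 30307, 30323

class CombinedRNG:
    """Three multiplicative LCGs combined by summing their fractional
    outputs modulo 1 (the classic Wichmann-Hill construction)."""
    def __init__(self, seed=(1, 1, 1)):
        self.s1, self.s2, self.s3 = seed
    def random(self):
        self.s1 = (171 * self.s1) % m1
        self.s2 = (172 * self.s2) % m2
        self.s3 = (170 * self.s3) % m3
        return (self.s1 / m1 + self.s2 / m2 + self.s3 / m3) % 1.0

period = lcm(m1 - 1, m2 - 1, m3 - 1)  # roughly 7e12 for these moduli
```

Reaching a period near 10^36, as the abstract claims, requires either larger component generators or more of them; the combining principle is the same.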

  8. An Integrated Model for Supplier Selection for a High-Tech Manufacturer

    Science.gov (United States)

    Lee, Amy H. I.; Kang, He-Yau; Lin, Chun-Yu

    2011-11-01

    Global competitiveness has become the biggest concern of manufacturing companies, especially in high-tech industries. Improving competitive edges in an environment with rapidly changing technological innovations and dynamic customer needs is essential for a firm to survive and to acquire a decent profit. Thus, the introduction of successful new products is a source of new sales and profits and is a necessity in the intense competitive international market. After a product is developed, a firm needs the cooperation of upstream suppliers to provide satisfactory components and parts for manufacturing final products. Therefore, the selection of suitable suppliers has also become a very important decision. In this study, an analytical approach is proposed to select the most appropriate critical-part suppliers in order to maintain a high reliability of the supply chain. A fuzzy analytic network process (FANP) model, which incorporates the benefits, opportunities, costs and risks (BOCR) concept, is constructed to evaluate various aspects of suppliers. The proposed model is adopted in a TFT-LCD manufacturer in Taiwan in evaluating the expected performance of suppliers with respect to each important factor, and an overall ranking of the suppliers can be generated as a result.

  9. Selection of antigenic markers on a GFP-Cκ fusion scaffold with high sensitivity by eukaryotic ribosome display

    International Nuclear Information System (INIS)

    Yang Yongmin; Barankiewicz, Teresa J.; He Mingyue; Taussig, Michael J.; Chen, Swey-Shen

    2007-01-01

    Ribosome display is a cell-free system permitting gene selection through the physical association of genetic material (mRNA) and its phenotypic (protein) product. While often used to select single-chain antibodies from large libraries by panning against immobilized antigens, we have adapted ribosome display for use in the 'reverse' format in order to select high affinity antigenic determinants against solid-phase antibody. To create an antigenic scaffold, DNA encoding green fluorescent protein (GFP) was fused to a light chain constant domain (Cκ) with stop codon deleted, and with 5' signals (T7 promoter, Kozak) enabling coupled transcription/translation in a eukaryotic cell-free system. Epitopes on either GFP (5') or Cκ (3') were selected by anti-GFP or anti-Cκ antibodies, respectively, coupled to magnetic beads. After selection, mRNA was amplified directly from protein-ribosome-mRNA (PRM) complexes by in situ PCR followed by internal amplification and reassembly PCR. As little as 10 fg of the 1 kb DNA construct, i.e. approximately 7500 molecules, could be recovered following a single round of interaction with solid-phase anti-GFP antibody. This platform is highly specific and sensitive for the antigen-antibody interaction and may permit selection and reshaping of high affinity antigenic variants of scaffold proteins

  10. Selective enhancement of orientation tuning before saccades.

    Science.gov (United States)

    Ohl, Sven; Kuper, Clara; Rolfs, Martin

    2017-11-01

    Saccadic eye movements cause a rapid sweep of the visual image across the retina and bring the saccade's target into high-acuity foveal vision. Even before saccade onset, visual processing is selectively prioritized at the saccade target. To determine how this presaccadic attention shift exerts its influence on visual selection, we compare the dynamics of perceptual tuning curves before movement onset at the saccade target and in the opposite hemifield. Participants monitored a 30-Hz sequence of randomly oriented gratings for a target orientation. Combining a reverse correlation technique previously used to study orientation tuning in neurons and general additive mixed modeling, we found that perceptual reports were tuned to the target orientation. The gain of orientation tuning increased markedly within the last 100 ms before saccade onset. In addition, we observed finer orientation tuning right before saccade onset. This increase in gain and tuning occurred at the saccade target location and was not observed at the incongruent location in the opposite hemifield. The present findings suggest, therefore, that presaccadic attention exerts its influence on vision in a spatially and feature-selective manner, enhancing performance and sharpening feature tuning at the future gaze location before the eyes start moving.

  11. A selective electrocatalyst-based direct methanol fuel cell operated at high concentrations of methanol.

    Science.gov (United States)

    Feng, Yan; Liu, Hui; Yang, Jun

    2017-06-01

    Owing to the serious crossover of methanol from the anode to the cathode through the polymer electrolyte membrane, direct methanol fuel cells (DMFCs) usually use dilute methanol solutions as fuel. However, the use of high-concentration methanol is highly demanded to improve the energy density of a DMFC system. Instead of the conventional strategies (for example, improving the fuel-feed system, membrane development, modification of electrode, and water management), we demonstrate the use of selective electrocatalysts to run a DMFC at high concentrations of methanol. In particular, at an operating temperature of 80°C, the as-fabricated DMFC with core-shell-shell Au@Ag2S@Pt nanocomposites at the anode and core-shell Au@Pd nanoparticles at the cathode produces a maximum power density of 89.7 mW cm−2 at a methanol feed concentration of 10 M and maintains good performance at a methanol concentration of up to 15 M. The high selectivity of the electrocatalysts achieved through structural construction accounts for the successful operation of the DMFC at high concentrations of methanol.

  13. Highly Selective and Sensitive Self-Powered Glucose Sensor Based on Capacitor Circuit.

    Science.gov (United States)

    Slaughter, Gymama; Kulkarni, Tanmay

    2017-05-03

    Enzymatic glucose biosensors are being developed to incorporate nanoscale materials with the biological recognition elements to assist in the rapid and sensitive detection of glucose. Here we present a highly sensitive and selective glucose sensor based on a capacitor circuit that is capable of selectively sensing glucose while simultaneously powering a small microelectronic device. Multi-walled carbon nanotubes (MWCNTs) are chemically modified with pyrroloquinoline quinone glucose dehydrogenase (PQQ-GDH) and bilirubin oxidase (BOD) at the anode and cathode, respectively, in the biofuel cell arrangement. The input voltage from the biofuel cell (as low as 0.25 V) is stepped up and charges the capacitor to 1.8 V. The frequency of the charge/discharge cycle of the capacitor corresponded to the oxidation of glucose. The biofuel cell structure-based glucose sensor synergizes the advantages of both the glucose biosensor and the biofuel cell. In addition, this glucose sensor favored a very high selectivity towards glucose in the presence of competing and non-competing analytes. It exhibited an unprecedented sensitivity of 37.66 Hz/mM·cm² and a linear range of 1 to 20 mM. This innovative self-powered glucose sensor opens new doors for the implementation of biofuel cells and capacitor circuits for medical diagnosis and powering therapeutic devices.
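
    Reading out such a frequency-coded sensor reduces to a linear calibration: fit frequency against known concentrations, then invert the line. A stdlib least-squares sketch with invented calibration points; none of these numbers are measurements from the study, and the fitted slope merely plays the role of the reported sensitivity:

```python
# Least-squares calibration of a frequency-output sensor
# (hypothetical calibration data).
conc = [1.0, 5.0, 10.0, 15.0, 20.0]        # mM
freq = [40.0, 190.0, 380.0, 565.0, 755.0]  # Hz

n = len(conc)
mx = sum(conc) / n
my = sum(freq) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(conc, freq))
         / sum((x - mx) ** 2 for x in conc))
intercept = my - slope * mx

def estimate_concentration(f_hz):
    """Invert the calibration line to read out concentration in mM."""
    return (f_hz - intercept) / slope
```

    Within the linear range, an observed charge/discharge frequency maps back to a glucose concentration through `estimate_concentration`.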

  14. Sequence based prediction of DNA-binding proteins based on hybrid feature selection using random forest and Gaussian naïve Bayes.

    Directory of Open Access Journals (Sweden)

    Wangchao Lou

    Full Text Available Developing an efficient method for the determination of DNA-binding proteins, due to their vital roles in gene regulation, is highly desired since it would be invaluable to advancing our understanding of protein functions. In this study, we proposed a new method for the prediction of DNA-binding proteins, by performing feature ranking using random forest and wrapper-based feature selection using a forward best-first search strategy. The features comprise information from the primary sequence, predicted secondary structure, predicted relative solvent accessibility, and position specific scoring matrix. The proposed method, called DBPPred, used Gaussian naïve Bayes as the underlying classifier since it outperformed five other classifiers, including decision tree, logistic regression, k-nearest neighbor, support vector machine with polynomial kernel, and support vector machine with radial basis function. As a result, the proposed DBPPred yields the highest average accuracy of 0.791 and average MCC of 0.583 according to the five-fold cross validation with ten runs on the training benchmark dataset PDB594. Subsequently, blind tests on the independent dataset PDB186 by the proposed model trained on the entire PDB594 dataset and by five other existing methods (including iDNA-Prot, DNA-Prot, DNAbinder, DNABIND and DBD-Threader) were performed, with the proposed DBPPred yielding the highest accuracy of 0.769, MCC of 0.538, and AUC of 0.790. The independent tests performed by the proposed DBPPred on a large non-DNA-binding protein dataset and two RNA-binding protein datasets also showed improved or comparable quality when compared with the relevant prediction methods. Moreover, we observed that the majority of the selected features by the proposed method are statistically significantly different between the mean feature values of the DNA-binding and the non-DNA-binding proteins. All of the experimental results indicate that
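
    The wrapper strategy described above, greedily growing a feature subset scored by a Gaussian naïve Bayes classifier, can be sketched in pure Python on synthetic data. This is a minimal illustration of forward best-first selection, not the DBPPred pipeline itself; the synthetic data, the stopping rule, and the use of training accuracy as the wrapper score are all simplifications:

```python
import math
import random

random.seed(0)

# Synthetic binary-labelled data: feature 0 carries signal, 1-3 are noise.
labels = [i % 2 for i in range(400)]
X = [[random.gauss(2.0 if y else 0.0, 1.0)] +
     [random.gauss(0.0, 1.0) for _ in range(3)] for y in labels]

def gnb_accuracy(feats):
    """Training accuracy of a Gaussian naive Bayes restricted to `feats`."""
    stats = {}
    for c in (0, 1):
        rows = [x for x, y in zip(X, labels) if y == c]
        per_feat = []
        for f in feats:
            vals = [r[f] for r in rows]
            m = sum(vals) / len(vals)
            var = sum((v - m) ** 2 for v in vals) / len(vals) + 1e-9
            per_feat.append((m, var))
        stats[c] = per_feat
    correct = 0
    for x, y in zip(X, labels):
        def loglik(c):
            return sum(-0.5 * math.log(2 * math.pi * var)
                       - (x[f] - m) ** 2 / (2 * var)
                       for f, (m, var) in zip(feats, stats[c]))
        correct += (0 if loglik(0) >= loglik(1) else 1) == y
    return correct / len(X)

# Greedy forward (best-first) wrapper selection.
selected, best = [], 0.0
while True:
    cands = [f for f in range(4) if f not in selected]
    scores = {f: gnb_accuracy(selected + [f]) for f in cands}
    if not scores or max(scores.values()) <= best:
        break
    f_best = max(scores, key=scores.get)
    selected, best = selected + [f_best], scores[f_best]
```

    A real pipeline would score candidates with cross-validation rather than training accuracy, and would pre-rank features (e.g. by random forest importance) to cut the search space.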

  15. Prediction of broadband ground-motion time histories: Hybrid low/high-frequency method with correlated random source parameters

    Science.gov (United States)

    Liu, P.; Archuleta, R.J.; Hartzell, S.H.

    2006-01-01

    We present a new method for calculating broadband time histories of ground motion based on a hybrid low-frequency/high-frequency approach with correlated source parameters. Using a finite-difference method we calculate low-frequency synthetics (< 1 Hz) in a 3D velocity structure. We also compute broadband synthetics in a 1D velocity model using a frequency-wavenumber method. The low frequencies from the 3D calculation are combined with the high frequencies from the 1D calculation by using matched filtering at a crossover frequency of 1 Hz. The source description, common to both the 1D and 3D synthetics, is based on correlated random distributions for the slip amplitude, rupture velocity, and rise time on the fault. This source description allows for the specification of source parameters independent of any a priori inversion results. In our broadband modeling we include correlation between slip amplitude, rupture velocity, and rise time, as suggested by dynamic fault modeling. The method of using correlated random source parameters is flexible and can be easily modified to adjust to our changing understanding of earthquake ruptures. A realistic attenuation model is common to both the 3D and 1D calculations that form the low- and high-frequency components of the broadband synthetics. The value of Q is a function of the local shear-wave velocity. To produce more accurate high-frequency amplitudes and durations, the 1D synthetics are corrected with a randomized, frequency-dependent radiation pattern. The 1D synthetics are further corrected for local site and nonlinear soil effects by using a 1D nonlinear propagation code and generic velocity structure appropriate for the site’s National Earthquake Hazards Reduction Program (NEHRP) site classification. The entire procedure is validated by comparison with the 1994 Northridge, California, strong ground motion data set. The bias and error found here for response spectral acceleration are similar to the best results that have been published by
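
    The low/high-frequency combination can be illustrated with a pair of complementary filters at the crossover frequency: keep the low-pass part of one trace and the high-pass part of the other, then sum. The sketch below uses simple single-pole filters as a stand-in for the matched filters of the actual method, and all signals and constants are illustrative:

```python
import math

DT = 0.01  # s, sample interval (illustrative)
FC = 1.0   # Hz, crossover frequency

def lowpass(x, fc=FC, dt=DT):
    """Single-pole low-pass filter (stand-in for the matched-filter pair)."""
    rc = 1.0 / (2 * math.pi * fc)
    alpha = dt / (rc + dt)
    out, prev = [], 0.0
    for v in x:
        prev = prev + alpha * (v - prev)
        out.append(prev)
    return out

def combine_broadband(lf_synthetic, hf_synthetic):
    """Sum the <1 Hz part of one trace with the >1 Hz part of the other."""
    lf_part = lowpass(lf_synthetic)
    hf_part = [v - l for v, l in zip(hf_synthetic, lowpass(hf_synthetic))]
    return [a + b for a, b in zip(lf_part, hf_part)]

t = [i * DT for i in range(2000)]
slow = [math.sin(2 * math.pi * 0.2 * ti) for ti in t]   # proxy 3D synthetic
fast = [math.sin(2 * math.pi * 10.0 * ti) for ti in t]  # proxy 1D synthetic
broadband = combine_broadband(slow, fast)
```

    Each input trace then contributes only on its own side of the 1 Hz crossover, which is the essence of the hybrid combination.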

  16. Data re-arranging techniques leading to proper variable selections in high energy physics

    Science.gov (United States)

    Kůs, Václav; Bouř, Petr

    2017-12-01

    We introduce a new data-based approach to homogeneity testing and variable selection carried out in high energy physics experiments, where one of the basic tasks is to test the homogeneity of weighted samples, mainly the Monte Carlo simulations (weighted) and real data measurements (unweighted). This technique is called 'data re-arranging' and it enables variable selection performed by means of the classical statistical homogeneity tests such as Kolmogorov-Smirnov, Anderson-Darling, or Pearson's chi-square divergence test. P-values of our variants of homogeneity tests are investigated and the empirical verification through 46-dimensional high energy particle physics data sets is accomplished under newly proposed (equiprobable) quantile binning. Particularly, the procedure of homogeneity testing is applied to re-arranged Monte Carlo samples and real DATA sets measured at the Tevatron accelerator at Fermilab in the DØ experiment, originating from top-antitop quark pair production in two decay channels (electron, muon) with 2, 3, or 4+ jets detected. Finally, the variable selections in the electron and muon channels induced by the re-arranging procedure for homogeneity testing are provided for Tevatron top-antitop quark data sets.
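
    Equiprobable quantile binning combined with a classical homogeneity test can be sketched in a few lines of stdlib Python: derive bin edges from the pooled sample so that each bin holds roughly equal counts, then compute a Pearson chi-square statistic over the two binned samples. This is a generic unweighted illustration; the weighted Monte Carlo case treated in the paper is not reproduced here:

```python
import random

random.seed(1)

def equiprobable_edges(pooled, k):
    """Bin edges placing ~equal counts of the pooled sample into k bins."""
    s = sorted(pooled)
    return [s[(i * len(s)) // k] for i in range(1, k)]

def bin_counts(sample, edges):
    c = [0] * (len(edges) + 1)
    for v in sample:
        c[sum(v >= e for e in edges)] += 1
    return c

def chi2_homogeneity(a, b, k=10):
    """Pearson chi-square statistic for homogeneity of two samples."""
    edges = equiprobable_edges(a + b, k)
    ca, cb = bin_counts(a, edges), bin_counts(b, edges)
    na, nb = len(a), len(b)
    stat = 0.0
    for oa, ob in zip(ca, cb):
        ea = (oa + ob) * na / (na + nb)
        eb = (oa + ob) * nb / (na + nb)
        stat += (oa - ea) ** 2 / ea + (ob - eb) ** 2 / eb
    return stat

base = [random.gauss(0.0, 1.0) for _ in range(2000)]
same = [random.gauss(0.0, 1.0) for _ in range(2000)]
shifted = [random.gauss(0.5, 1.0) for _ in range(2000)]
stat_null = chi2_homogeneity(base, same)
stat_alt = chi2_homogeneity(base, shifted)
```

    Under homogeneity the statistic is approximately chi-square distributed with k − 1 degrees of freedom, from which a p-value follows; variables whose Monte Carlo and data distributions fail the test are candidates for exclusion.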

  17. Selection of High Performance Alloy for Gas Turbine Blade Using Multiphysics Analysis

    Directory of Open Access Journals (Sweden)

    H Khawaja

    2016-09-01

    Full Text Available With the extensive increase in the utilization of energy resources in the modern era, the need for energy extraction from various resources has become more pronounced in recent years. Thus comprehensive efforts have been made around the globe in the technological development of turbomachines, where the means of energy extraction is energized fluids. This development gave the aviation industry a power boost due to better performing engines. Meanwhile, the structural conformability requirements relative to the functional requirements have also increased with the advent of newer, better performing materials. Thus there is a need to study material behavior and its usage with the idea of selecting the best possible material for an application. In this work a gas turbine blade of a small turbofan engine, for which geometry and aerodynamic data were available, was analyzed for its structural behavior in the proposed mission envelope, where the engine turbine is subjected to high thermal, inertial and aerodynamic loads. Multiphysics Finite Element (FE) linear stress analysis was carried out on the turbine blade. The results revealed the upper limit of Ultimate Tensile Strength (UTS) for the blade. Based on the limiting factor, high performance alloys were selected from the literature. The two most recommended alloy categories for gas turbine blades are NIMONIC and INCONEL, from which a total of 21 types of INCONEL alloys and 12 NIMONIC alloys, available on a commercial basis, were analyzed individually to meet the structural requirements. After applying the selection criteria, four alloys were finalized from the NIMONIC and INCONEL alloys for further analysis. On the basis of the stress-strain behavior of the finalized alloys, Multiphysics FE nonlinear stress analysis was then carried out for the selection of the individual alloy by imposing a restriction of Ultimate Factor of Safety (UFOS) of 1.33 and yield strength. Final selection is made keeping in view other factors

  18. Selective CO Methanation on Highly Active Ru/TiO2 Catalysts: Identifying the Physical Origin of the Observed Activation/Deactivation and Loss in Selectivity

    DEFF Research Database (Denmark)

    Abdel-Mageed, Ali M.; Widmann, Daniel; Olesen, Sine Ellemann

    2018-01-01

    Ru/TiO2 catalysts are highly active and selective in the selective methanation of CO in the presence of large amounts of CO2, but suffer from a considerable deactivation and loss of selectivity during time on stream. Aiming at a fundamental understanding of these processes, we have systematically...... different effects such as structural effects, adlayer effects such as site blocking effects and changes in the chemical (surface) composition of the catalysts. Operando XANES/EXAFS measurements revealed that an initial activation phase is largely due to the reduction of oxidized Ru species, together...

  19. Random function representation of stationary stochastic vector processes for probability density evolution analysis of wind-induced structures

    Science.gov (United States)

    Liu, Zhangjun; Liu, Zenghui

    2018-06-01

    This paper develops a hybrid approach of spectral representation and random function for simulating stationary stochastic vector processes. In the proposed approach, the high-dimensional random variables included in the original spectral representation (OSR) formula can be effectively reduced to only two elementary random variables by introducing random functions that serve as random constraints. Based on this, a satisfactory simulation accuracy can be guaranteed by selecting a small representative point set of the elementary random variables. The probability information of the stochastic excitations can be fully captured with just several hundred sample functions generated by the proposed approach. Therefore, combined with the probability density evolution method (PDEM), it can implement dynamic response analysis and reliability assessment of engineering structures. For illustrative purposes, a stochastic turbulence wind velocity field acting on a frame-shear-wall structure is simulated by constructing three types of random functions to demonstrate the accuracy and efficiency of the proposed approach. Careful and in-depth studies concerning the probability density evolution analysis of the wind-induced structure have been conducted so as to better illustrate the application prospects of the proposed approach. Numerical examples also show that the proposed approach possesses good robustness.
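
    For orientation, the baseline spectral representation that the paper compresses looks like the sketch below: a sum of cosines whose amplitudes are set by the target spectrum and whose phases are independent random variables. The paper's contribution, replacing these many random phases with random functions of only two elementary random variables, is not reproduced here; the spectrum and discretization are illustrative:

```python
import math
import random

random.seed(3)

def spectral_sample(S, w_max, N, times):
    """One path of a zero-mean stationary process via the spectral
    representation X(t) = sum_k sqrt(2 S(w_k) dw) cos(w_k t + phi_k)."""
    dw = w_max / N
    w = [(k + 0.5) * dw for k in range(N)]
    phi = [random.uniform(0.0, 2.0 * math.pi) for _ in range(N)]
    return [sum(math.sqrt(2.0 * S(wk) * dw) * math.cos(wk * t + pk)
                for wk, pk in zip(w, phi)) for t in times]

S = lambda w: 1.0 / (1.0 + w ** 2)   # illustrative one-sided spectrum
ts = [0.1 * i for i in range(500)]
x = spectral_sample(S, w_max=20.0, N=200, times=ts)
```

    The sample variance of a path should approach the integral of the spectrum over the discretized band; in the OSR formula each of the N phases is an independent random variable, which is exactly the dimensionality the random-function constraints collapse.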

  20. The Effect of Speed Alterations on Tempo Note Selection.

    Science.gov (United States)

    Madsen, Clifford K.; And Others

    1986-01-01

    Investigated the tempo note preferences of 100 randomly selected college-level musicians using familiar orchestral music as stimuli. Subjects heard selections at increased, decreased, and unaltered tempi. Results showed musicians were not accurate in estimating original tempo and showed consistent preference for faster than actual tempo.…

  1. A synbio approach for selection of highly expressed gene variants in Gram-positive bacteria

    DEFF Research Database (Denmark)

    Ferro, Roberto; Rennig, Maja; Hernández Rollán, Cristina

    2018-01-01

    with a long history in food fermentation. We have developed a synbio approach for increasing gene expression in two Gram-positive bacteria. First of all, the gene of interest was coupled to an antibiotic resistance gene to create a growth-based selection system. We then randomised the translation initiation region (TIR) preceding the gene of interest and selected clones that produced high protein titres, as judged by their ability to survive on high concentrations of antibiotic. Using this approach, we were able to significantly increase production of two industrially relevant proteins: sialidase in B. subtilis and tyrosine ammonia lyase in L. lactis. Gram-positive bacteria are widely used to produce industrial enzymes. High titres are necessary to make the production economically feasible. The synbio approach presented here is a simple and inexpensive way to increase protein titres, which can be carried...

  2. High-Resolution Manometry Improves the Diagnosis of Esophageal Motility Disorders in Patients With Dysphagia: A Randomized Multicenter Study.

    Science.gov (United States)

    Roman, Sabine; Huot, Laure; Zerbib, Frank; Bruley des Varannes, Stanislas; Gourcerol, Guillaume; Coffin, Benoit; Ropert, Alain; Roux, Adeline; Mion, François

    2016-03-01

    High-resolution manometry (HRM) might be superior to conventional manometry (CM) for diagnosing esophageal motility disorders. We aimed to compare the diagnoses obtained with HRM and CM and confirmed at 6 months in a multicenter randomized trial. Patients with unexplained dysphagia were randomized to undergo either CM or HRM. Motility disorders were diagnosed using the Castell and Spechler classification for CM and the Chicago classification for HRM. Diagnosis confirmation was based on clinical outcome and response to treatment after 6-month follow-up. The initial diagnosis and percentage of confirmed diagnoses were compared between the two arms (CM and HRM). In total, 247 patients were randomized and 245 analyzed: 122 in the CM arm and 123 in the HRM arm. A manometric diagnosis was more frequently achieved initially with HRM than with CM (97% vs. 84%), and esophageal motility disorders could be identified earlier with HRM than with CM (ClinicalTrials.gov, NCT01284894).
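
    Comparing diagnostic yields such as 97% vs. 84% between two randomized arms is a two-proportion comparison; a stdlib Pearson chi-square sketch follows. The event counts below are assumptions chosen only to approximate the reported percentages and arm sizes, not the trial's actual tabulated data:

```python
def two_proportion_chi2(x1, n1, x2, n2):
    """Pearson chi-square (1 df, no continuity correction) comparing
    two independent proportions x1/n1 and x2/n2."""
    p = (x1 + x2) / (n1 + n2)
    observed = [x1, n1 - x1, x2, n2 - x2]
    expected = [n1 * p, n1 * (1 - p), n2 * p, n2 * (1 - p)]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical event counts mimicking 97% of 123 vs. 84% of 122.
stat = two_proportion_chi2(119, 123, 102, 122)
```

    The statistic is compared against the chi-square critical value with 1 degree of freedom (3.84 at the 5% level).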

  3. Random walk on random walks

    NARCIS (Netherlands)

    Hilário, M.; Hollander, den W.Th.F.; Sidoravicius, V.; Soares dos Santos, R.; Teixeira, A.

    2014-01-01

    In this paper we study a random walk in a one-dimensional dynamic random environment consisting of a collection of independent particles performing simple symmetric random walks in a Poisson equilibrium with density ρ ∈ (0, ∞). At each step the random walk performs a nearest-neighbour jump, moving to

  4. Overweight and obesity in Slovak high school students and body composition indicators: a non-randomized cross-sectional study

    Directory of Open Access Journals (Sweden)

    Bibiana Vadasova

    2016-08-01

    Full Text Available Abstract Background Physical development can be considered an indicator of the overall health status of the youth population. Currently, it appears that the increasing trend in the prevalence of obesity among children and youths has stopped in a number of countries worldwide. Studies point to the fact that adolescence is a critical period for the development of obesity. Body mass index (BMI) seems to be an orientation parameter in the assessment of the prevalence of obesity, which is not sufficient for more accurate identification of at-risk individuals. The purpose of this study was to evaluate the association between BMI percentile zones as health risks for being overweight and obese and body composition indicators in high-school students from the Prešov region (Slovakia). Methods A non-randomized cross-sectional study in high school students from the Prešov region (Slovakia) was conducted. The research sample consisted of 1014 participants (boys n = 466, girls n = 549). Body composition was measured using direct segmental multi-frequency bioelectrical impedance analysis (DSM-BIA). To examine the association between obesity and selected body composition indicators, Kruskal-Wallis ANOVA and Eta² were used. The relationship between selected body composition indicators and percentile BMI zones was determined using the Kendall tau correlation. Results In groups with different BMI percentile zones (normal weight, overweight, obese), ANOVA showed significant differences for girls and boys (p < .05) with high effect size (η² < .26) in body weight, body fat mass index, body fat percentage, fat free mass index, fat-free mass percentage, visceral fat area, waist-to-hip ratio, waist circumference, protein mass and mineral mass. The highest degree of correlation among boys was between BMI values indicating overweight and obesity and fat free mass index and waist circumference, respectively (τ = .71, τ = .70, respectively). In girls, the highest
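
    The Kendall tau correlation used above to relate BMI to body-composition indicators can be computed directly. A stdlib sketch with invented BMI/waist pairs; this is tau-a, without the tie correction (tau-b) that real, tied data would call for:

```python
def kendall_tau(x, y):
    """Kendall rank correlation (tau-a, no tie correction)."""
    n = len(x)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            concordant += s > 0
            discordant += s < 0
    return (concordant - discordant) / (n * (n - 1) / 2)

# Invented BMI / waist-circumference pairs for illustration only.
bmi = [18.4, 21.0, 23.5, 26.1, 29.8, 31.2]
waist = [63.0, 70.0, 74.0, 82.0, 94.0, 101.0]
```

    Perfectly concordant pairs give τ = 1; reversing one variable gives τ = −1, and values near the reported .70-.71 indicate strong but imperfect agreement in ranking.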

  5. A DYNAMIC FEATURE SELECTION METHOD FOR DOCUMENT RANKING WITH RELEVANCE FEEDBACK APPROACH

    Directory of Open Access Journals (Sweden)

    K. Latha

    2010-07-01

    Full Text Available Ranking search results is essential for information retrieval and Web search. Search engines need to not only return highly relevant results, but also be fast to satisfy users. As a result, not all available features can be used for ranking, and in fact only a small percentage of these features can be used. Thus, it is crucial to have a feature selection mechanism that can find a subset of features that both meets latency requirements and achieves high relevance. In this paper we describe a 0/1 knapsack procedure for automatically selecting features to use within a Generalization model for Document Ranking. We propose an approach to Relevance Feedback using the Expectation Maximization method and evaluate the algorithm on the TREC Collection for describing classes of feedback textual information retrieval features. Experimental results, evaluated on the standard TREC-9 part of the OHSUMED collections, show that our feature selection algorithm produces models that are either significantly more effective than, or equally effective as, models such as the Markov Random Field model, the Correlation Coefficient and the Count Difference method.
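
    A 0/1 knapsack selection of features under a latency budget can be sketched with the standard dynamic program: maximize total relevance subject to total cost not exceeding the budget, then backtrack to recover the chosen subset. The relevance scores and integer costs below are hypothetical, not values from the paper:

```python
def knapsack_select(values, costs, budget):
    """0/1 knapsack by dynamic programming: maximize total relevance
    subject to total (integer) cost <= budget; returns (subset, value)."""
    n = len(values)
    best = [[0] * (budget + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for b in range(budget + 1):
            best[i][b] = best[i - 1][b]
            if costs[i - 1] <= b:
                cand = best[i - 1][b - costs[i - 1]] + values[i - 1]
                if cand > best[i][b]:
                    best[i][b] = cand
    chosen, b = [], budget
    for i in range(n, 0, -1):   # backtrack to recover the chosen indices
        if best[i][b] != best[i - 1][b]:
            chosen.append(i - 1)
            b -= costs[i - 1]
    return sorted(chosen), best[n][budget]

# Hypothetical relevance scores and latency costs for five features.
feats, total = knapsack_select([6, 10, 12, 7, 3], [1, 2, 3, 2, 1], budget=5)
```

    Here the budget plays the role of the latency constraint: the cheapest high-relevance combination wins even when the single most relevant feature is too expensive to pair with others.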

  6. Early selection in open-pollinated Eucalyptus families based on competition covariates

    Directory of Open Access Journals (Sweden)

    Bruno Ettore Pavan

    2014-06-01

    Full Text Available The objective of this work was to evaluate the influence of intergenotypic competition in open-pollinated families of Eucalyptus and its effects on early selection efficiency. Two experiments were carried out, in which the timber volume was evaluated at three ages, in a randomized complete block design. Data from the three years of evaluation (experiment 1, at 2, 4, and 7 years; and experiment 2, at 2, 5, and 7 years) were analyzed using mixed models. The following were estimated: variance components, genetic parameters, selection gains, effective number, early selection efficiency, selection gain per unit time, and coincidence of selection with and without the use of competition covariates. The competition effect was nonsignificant for ages under three years, and adjustment using competition covariates was unnecessary. Early selection for families is effective; families that have a late growth spurt are more vulnerable to competition, which markedly impairs ranking at the end of the cycle. Early selection is efficient according to all adopted criteria, and the age of around three years is the most recommended, given the high efficiency and accuracy rate in the indication of trees and families. The addition of competition covariates at the end of the cycle improves early selection efficiency for almost all studied criteria.

  7. A terbium(III)-organic framework for highly selective sensing of cytidine triphosphate.

    Science.gov (United States)

    Zhao, Xi Juan; He, Rong Xing; Li, Yuan Fang

    2012-11-21

    Highly selective sensing of cytidine triphosphate (CTP) against other triphosphate nucleosides, including ATP, GTP and UTP, is successfully achieved with a luminescent terbium(III)-organic framework (TbOF) of [Tb2(2,3-pzdc)2(ox)(H2O)2]n (2,3-pzdc2− = 2,3-pyrazinedicarboxylate, ox2− = oxalate).

  8. Brief Cognitive-Behavioral Depression Prevention Program for High-Risk Adolescents Outperforms Two Alternative Interventions: A Randomized Efficacy Trial

    Science.gov (United States)

    Stice, Eric; Rohde, Paul; Seeley, John R.; Gau, Jeff M.

    2008-01-01

    In this depression prevention trial, 341 high-risk adolescents (mean age = 15.6 years, SD = 1.2) with elevated depressive symptoms were randomized to a brief group cognitive-behavioral (CB) intervention, group supportive-expressive intervention, bibliotherapy, or assessment-only control condition. CB participants showed significantly greater…

  9. Biased random key genetic algorithm with insertion and gender selection for capacitated vehicle routing problem with time windows

    Science.gov (United States)

    Rochman, Auliya Noor; Prasetyo, Hari; Nugroho, Munajat Tri

    2017-06-01

    Vehicle Routing Problem (VRP) often occurs when manufacturers need to distribute their product to some customers/outlets. The distribution process is typically restricted by the capacity of the vehicle and the working hours at the distributor. This type of VRP is also known as the Capacitated Vehicle Routing Problem with Time Windows (CVRPTW). A Biased Random Key Genetic Algorithm (BRKGA) was designed and coded in MATLAB to solve the CVRPTW case of soft drink distribution. The standard BRKGA was then modified by applying chromosome insertion into the initial population and defining chromosome gender for parents undergoing the crossover operation. The performance of the established algorithms was then compared to a heuristic procedure for solving a soft drink distribution. Some findings are revealed: (1) the total distribution cost of BRKGA with insertion (BRKGA-I) results in a cost saving of 39% compared to the total cost of the heuristic method; (2) BRKGA with gender selection (BRKGA-GS) could further improve the performance of the heuristic method. However, the BRKGA-GS tends to yield worse results compared to those obtained from the standard BRKGA.
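
    The heart of any BRKGA for routing is the decoder that maps a random-key chromosome to a feasible solution. A common decoder pattern, sketched below, sorts customers by their keys into a giant tour and splits it greedily by vehicle capacity. The insertion and gender-selection modifications from the paper are not shown, time windows are omitted, and all keys/demands are illustrative:

```python
def decode_routes(keys, demands, capacity):
    """Decode a random-key chromosome into capacitated routes: sort
    customers by key into a giant tour, then split greedily whenever
    the vehicle capacity would be exceeded."""
    order = sorted(range(len(keys)), key=lambda i: keys[i])
    routes, current, load = [], [], 0
    for cust in order:
        if load + demands[cust] > capacity:
            routes.append(current)
            current, load = [], 0
        current.append(cust)
        load += demands[cust]
    if current:
        routes.append(current)
    return routes

# Illustrative chromosome for six customers (keys drawn from [0, 1)).
keys = [0.72, 0.05, 0.91, 0.33, 0.48, 0.10]
demands = [4, 3, 2, 5, 4, 3]
routes = decode_routes(keys, demands, capacity=10)
```

    Because any key vector decodes to a feasible solution, the genetic operators (crossover, mutation, elite copying) can act on plain real vectors while the decoder guarantees capacity feasibility.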

  10. Genomic relations among 31 species of Mammillaria haworth (Cactaceae) using random amplified polymorphic DNA.

    Science.gov (United States)

    Mattagajasingh, Ilwola; Mukherjee, Arup Kumar; Das, Premananda

    2006-01-01

    Thirty-one species of Mammillaria were selected to study the molecular phylogeny using random amplified polymorphic DNA (RAPD) markers. High amount of mucilage (gelling polysaccharides) present in Mammillaria was a major obstacle in isolating good quality genomic DNA. The CTAB (cetyl trimethyl ammonium bromide) method was modified to obtain good quality genomic DNA. Twenty-two random decamer primers resulted in 621 bands, all of which were polymorphic. The similarity matrix value varied from 0.109 to 0.622 indicating wide variability among the studied species. The dendrogram obtained from the unweighted pair group method using arithmetic averages (UPGMA) analysis revealed that some of the species did not follow the conventional classification. The present work shows the usefulness of RAPD markers for genetic characterization to establish phylogenetic relations among Mammillaria species.
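
    The UPGMA dendrogram construction behind such an analysis can be sketched directly: repeatedly merge the two closest clusters and average their distances weighted by cluster size. RAPD similarity values would first be converted to distances (e.g. d = 1 − s); the toy matrix below is invented, not the study's data:

```python
def upgma(names, dist):
    """UPGMA clustering: repeatedly merge the two closest clusters,
    averaging their distances weighted by cluster size."""
    labels = list(names)
    sizes = [1] * len(labels)
    d = [row[:] for row in dist]
    while len(labels) > 1:
        # Find the closest pair (i < j).
        i, j = min(((a, b) for a in range(len(labels))
                    for b in range(a + 1, len(labels))),
                   key=lambda p: d[p[0]][p[1]])
        merged = (labels[i], labels[j])
        si, sj = sizes[i], sizes[j]
        new_row = [(si * d[i][k] + sj * d[j][k]) / (si + sj)
                   for k in range(len(labels)) if k not in (i, j)]
        for idx in (j, i):              # drop row/column j first, then i
            labels.pop(idx)
            sizes.pop(idx)
            d.pop(idx)
            for row in d:
                row.pop(idx)
        labels.append(merged)
        sizes.append(si + sj)
        for row, v in zip(d, new_row):
            row.append(v)
        d.append(new_row + [0.0])
    return labels[0]

# Toy distance matrix for three taxa (invented, not RAPD data).
tree = upgma(["A", "B", "C"], [[0.0, 2.0, 6.0],
                               [2.0, 0.0, 6.0],
                               [6.0, 6.0, 0.0]])
```

    The nested tuples returned here encode the dendrogram topology: the closest pair merges first, exactly as in the published UPGMA analysis.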

  11. Random phase approximations for the screening function in high Tc superconductors

    International Nuclear Information System (INIS)

    Lopez-Aguilar, F.; Costa-Quintana, J.; Sanchez, A.; Puig, T.; Aurell, M.T.; Martinez, L.M.; Munoz, J.S.

    1990-01-01

    This paper reports on the electronic transfer from the CuO2 sheets toward the CuO3 linear chains, which locates electrons in the py/pz orbitals of O4/O1 and the dz²−y² orbital of Cu1, and holes in the dx²−y²−pz/py orbitals of Cu2-O2/O3. These hole states present large interatomic overlap. In this paper, we determine the screening function within the random phase approximation applied to the high-Tc superconductors. This screening function vanishes for particular values of the frequency which correspond to renormalized plasmon frequencies. These frequencies depend on the band parameters, and their knowledge is essential for determining the self-energy. This self-energy is deduced, and it contains independent terms for each of the channels for the localization

  12. Evolution of Near-Surface Internal and External Oxide Morphology During High-Temperature Selective Oxidation of Steels

    Science.gov (United States)

    Story, Mary E.; Webler, Bryan A.

    2018-05-01

    In this work we examine some observations made using high-temperature confocal scanning laser microscopy (HT-CSLM) during selective oxidation experiments. A plain carbon steel and advanced high-strength steel (AHSS) were selectively oxidized at high temperature (850-900°C) in either low oxygen or water vapor atmospheres. Surface evolution, including thermal grooving along grain boundaries and oxide growth, was viewed in situ during heating. Experiments investigated the influence of the microstructure and oxidizing atmosphere on selective oxidation behavior. Sequences of CSLM still frames collected during the experiment were processed with ImageJ to obtain histograms that showed a general darkening trend indicative of oxidation over time with all samples. Additional ex situ scanning electron microscopy and energy dispersive spectroscopy analysis supported in situ observations. Distinct oxidation behavior was observed for each case. Segregation, grain orientation, and extent of internal oxidation were all found to strongly influence surface evolution.

  13. Nonlinear Pricing with Random Participation

    OpenAIRE

    Jean-Charles Rochet; Lars A. Stole

    2002-01-01

    The canonical selection contracting programme takes the agent's participation decision as deterministic and finds the optimal contract, typically satisfying this constraint for the worst type. Upon weakening this assumption of known reservation values by introducing independent randomness into the agents' outside options, we find that some of the received wisdom from mechanism design and nonlinear pricing is not robust and the richer model which allows for stochastic participation affords a m...

  14. Reversal of profound, high-dose rocuronium-induced neuromuscular blockade by sugammadex at two different time points - An international, multicenter, randomized, dose-finding, safety assessor-blinded, phase II trial

    DEFF Research Database (Denmark)

    Puhringer, F.K.; Rex, C.; Sielenkamper, A.W.

    2008-01-01

    Background: Sugammadex (Org 25969), a novel, selective relaxant binding agent, was specifically designed to rapidly reverse rocuronium-induced neuromuscular blockade. The efficacy and safety of sugammadex for the reversal of profound, high-dose rocuronium-induced neuromuscular blockade was evaluated. Methods: A total of 176 adult patients were randomly assigned to receive sugammadex (2, 4, 8, 12, or 16 mg/kg) or placebo at 3 or 15 min after high-dose rocuronium (1.0 or 1.2 mg/kg) during propofol anesthesia. The primary endpoint was time to recovery of the train-of-four ratio to 0.9. Neuromuscular monitoring was performed using acceleromyography. Results: Sugammadex administered 3 or 15 min after injection of 1 mg/kg rocuronium decreased the median recovery time of the train-of-four ratio to 0.9 in a dose-dependent manner from 111.1 min and 91.0 min (placebo) to 1.6 min and 0.9 min (16 mg...

  15. Highly selective and sensitive detection of neurotransmitters using receptor-modified single-walled carbon nanotube sensors

    Science.gov (United States)

    Kim, Byeongju; Song, Hyun Seok; Jin, Hye Jun; Park, Eun Jin; Lee, Sang Hun; Lee, Byung Yang; Park, Tai Hyun; Hong, Seunghun

    2013-07-01

    We present receptor-modified carbon nanotube sensors for the highly selective and sensitive detection of acetylcholine (ACh), one kind of neurotransmitter. Here, we successfully expressed the M1 muscarinic acetylcholine receptor (M1 mAChR), a family of G protein-coupled receptors (GPCRs), in E. coli and coated single-walled carbon nanotube (swCNT)-field effect transistors (FETs) with lipid membrane including the receptor, enabling highly selective and sensitive ACh detection. Using this sensor, we could detect ACh at 100 pM concentration. Moreover, we showed that this sensor could selectively detect ACh among other neurotransmitters. This is the first demonstration of the real-time detection of ACh using specific binding between ACh and M1 mAChR, and it may lead to breakthroughs for various applications such as disease diagnosis and drug screening.

  16. Highly selective and sensitive detection of neurotransmitters using receptor-modified single-walled carbon nanotube sensors

    International Nuclear Information System (INIS)

    Kim, Byeongju; Jin, Hye Jun; Park, Eun Jin; Hong, Seunghun; Song, Hyun Seok; Lee, Sang Hun; Park, Tai Hyun; Lee, Byung Yang

    2013-01-01

    We present receptor-modified carbon nanotube sensors for the highly selective and sensitive detection of acetylcholine (ACh), a neurotransmitter. Here, we successfully expressed the M1 muscarinic acetylcholine receptor (M1 mAChR), a member of the G protein-coupled receptor (GPCR) family, in E. coli and coated single-walled carbon nanotube (swCNT) field-effect transistors (FETs) with a lipid membrane containing the receptor, enabling highly selective and sensitive ACh detection. Using this sensor, we could detect ACh at a concentration of 100 pM. Moreover, we showed that this sensor could selectively detect ACh among other neurotransmitters. This is the first demonstration of the real-time detection of ACh using specific binding between ACh and M1 mAChR, and it may lead to breakthroughs for various applications such as disease diagnosis and drug screening. (paper)

  17. Genetic search feature selection for affective modeling

    DEFF Research Database (Denmark)

    Martínez, Héctor P.; Yannakakis, Georgios N.

    2010-01-01

    Automatic feature selection is a critical step towards the generation of successful computational models of affect. This paper presents a genetic search-based feature selection method which is developed as a global-search algorithm for improving the accuracy of the affective models built. The method is tested and compared against sequential forward feature selection and random search in a dataset derived from a game survey experiment which contains bimodal input features (physiological and gameplay) and expressed pairwise preferences of affect. Results suggest that the proposed method...

  18. A novel lentiviral scFv display library for rapid optimization and selection of high affinity antibodies.

    Science.gov (United States)

    Qudsia, Sehar; Merugu, Siva B; Mangukiya, Hitesh B; Hema, Negi; Wu, Zhenghua; Li, Dawei

    2018-04-30

    Antibody display libraries have become a popular technique to screen monoclonal antibodies for therapeutic purposes. An important aspect of display technology is to generate an optimization library by changing antibody affinity to antigen through mutagenesis and screening for the high-affinity antibody. In this study, we report a novel lentivirus-display-based optimization library in which Agtuzumab scFv is displayed on the cell membrane of HEK-293T cells. To generate an optimization library, hotspot mutagenesis was performed to achieve a diverse antibody library. Based on sequence analysis of randomly selected clones, the library size was estimated to be approximately 1.6 × 10⁶. A lentivirus display vector was used to display scFv antibody on the cell surface, and flow cytometry was performed to check the antibody affinity to antigen. Membrane-bound scFv antibodies were then converted to secreted antibody through cre/loxP recombination. One of the mutant clones, M8, showed higher affinity to antigen in flow cytometry analysis. Further characterization of cellular and secreted scFv through western blot showed that antibody affinity was increased threefold after mutagenesis. This study shows successful construction of a novel antibody library and suggests that hotspot mutagenesis could prove a useful and rapid optimization tool to generate similar libraries with various degrees of antigen affinity. Copyright © 2018 Elsevier Inc. All rights reserved.

  19. Impact of selective genotyping in the training population on accuracy and bias of genomic selection.

    Science.gov (United States)

    Zhao, Yusheng; Gowda, Manje; Longin, Friedrich H; Würschum, Tobias; Ranc, Nicolas; Reif, Jochen C

    2012-08-01

    Estimating marker effects based on routinely generated phenotypic data of breeding programs is a cost-effective strategy to implement genomic selection. Truncation selection in breeding populations, however, could have a strong impact on the accuracy to predict genomic breeding values. The main objective of our study was to investigate the influence of phenotypic selection on the accuracy and bias of genomic selection. We used experimental data of 788 testcross progenies from an elite maize breeding program. The testcross progenies were evaluated in unreplicated field trials in ten environments and fingerprinted with 857 SNP markers. Random regression best linear unbiased prediction method was used in combination with fivefold cross-validation based on genotypic sampling. We observed a substantial loss in the accuracy to predict genomic breeding values in unidirectional selected populations. In contrast, estimating marker effects based on bidirectional selected populations led to only a marginal decrease in the prediction accuracy of genomic breeding values. We concluded that bidirectional selection is a valuable approach to efficiently implement genomic selection in applied plant breeding programs.
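    Random regression BLUP as used in the study amounts to ridge regression of phenotypes on marker codes. The following is a self-contained sketch on simulated data (marker counts, effect sizes, and noise level are all invented; real analyses use dedicated mixed-model software and, as in the study, estimate accuracy on held-out cross-validation folds rather than in-sample):

```python
import random

rng = random.Random(7)
n, m, lam = 40, 8, 1.0   # individuals, markers, ridge (shrinkage) parameter

# Simulated SNP genotypes coded -1/0/1 and phenotypes y = X·u + noise.
X = [[rng.choice([-1, 0, 1]) for _ in range(m)] for _ in range(n)]
true_u = [rng.gauss(0, 1) for _ in range(m)]
y = [sum(x * u for x, u in zip(row, true_u)) + rng.gauss(0, 0.3) for row in X]

# Ridge normal equations: (X'X + lam*I) u_hat = X'y, solved by elimination.
A = [[sum(X[k][i] * X[k][j] for k in range(n)) + (lam if i == j else 0.0)
     for j in range(m)] for i in range(m)]
b = [sum(X[k][i] * y[k] for k in range(n)) for i in range(m)]
for i in range(m):                          # forward elimination
    for j in range(i + 1, m):
        f = A[j][i] / A[i][i]
        A[j] = [aj - f * ai for aj, ai in zip(A[j], A[i])]
        b[j] -= f * b[i]
u_hat = [0.0] * m
for i in reversed(range(m)):                # back substitution
    u_hat[i] = (b[i] - sum(A[i][j] * u_hat[j] for j in range(i + 1, m))) / A[i][i]

# Predicted genomic breeding values and their correlation with phenotypes.
gebv = [sum(x * u for x, u in zip(row, u_hat)) for row in X]
my, mg = sum(y) / n, sum(gebv) / n
accuracy = (sum((a - c1) * (g - c2) for a, g, c1, c2 in
                ((yi, gi, my, mg) for yi, gi in zip(y, gebv)))
            / (sum((a - my) ** 2 for a in y) ** 0.5
               * sum((g - mg) ** 2 for g in gebv) ** 0.5))
print(round(accuracy, 3))
```

The in-sample accuracy here is optimistic by construction; the point of the study's fivefold cross-validation, and of its selective-genotyping comparison, is precisely to measure how this correlation degrades on individuals not used for estimating the marker effects.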

  20. Genetic algorithm for the design of high frequency diffraction gratings for high power laser applications

    Science.gov (United States)

    Thomson, Martin J.; Waddie, Andrew J.; Taghizadeh, Mohammad R.

    2006-04-01

    We present a genetic algorithm with small population sizes for the design of diffraction gratings in the rigorous domain. A general crossover and mutation scheme is defined, forming fifteen offspring from 3 parents, which enables the algorithm to be used for designing gratings with diverse optical properties by careful definition of the merit function. The initial parents are randomly selected and the parents of the subsequent generations are selected by survival of the fittest. The performance of the algorithm is demonstrated by designing diffraction gratings with specific application to high power laser beam lines. Gratings are designed that act as beam deflectors, polarisers, polarising beam splitters, harmonic separation gratings and pulse compression gratings. By imposing fabrication constraints within the design process, we determine which of these elements have true potential for application within high power laser beam lines.
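    A generic small-population scheme of the kind described, with 3 parents producing 15 offspring per generation and survival of the fittest, can be sketched in a few lines. The toy merit function below (squared distance to a target parameter vector) is a stand-in for the rigorous grating simulation, which is far beyond a short example:

```python
import random

rng = random.Random(0)

def evolve(merit, dim, generations=200):
    """3 parents -> 15 offspring per generation; the 3 fittest of the
    combined pool survive (keeping parents in the pool preserves the
    best-so-far solution)."""
    parents = [[rng.random() for _ in range(dim)] for _ in range(3)]
    for _ in range(generations):
        offspring = []
        for _ in range(15):
            a, b = rng.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]     # blend crossover
            child[rng.randrange(dim)] += rng.gauss(0, 0.1)  # point mutation
            offspring.append(child)
        pool = parents + offspring
        pool.sort(key=merit)      # lower merit value = fitter design
        parents = pool[:3]        # survival of the fittest
    return parents[0]

target = [0.2, 0.8, 0.5]   # hypothetical ideal grating parameters
best = evolve(lambda g: sum((x - t) ** 2 for x, t in zip(g, target)), dim=3)
print(best)
```

As in the paper, all problem-specific behaviour lives in the merit function: swapping in a rigorous diffraction model turns the same loop into a designer for deflectors, polarisers, or pulse-compression gratings.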

  1. All-optical fast random number generator.

    Science.gov (United States)

    Li, Pu; Wang, Yun-Cai; Zhang, Jian-Zhong

    2010-09-13

    We propose a scheme for an all-optical random number generator (RNG), which consists of an ultra-wide bandwidth (UWB) chaotic laser, an all-optical sampler and an all-optical comparator. Free from electronic-device bandwidth limitations, it can generate 10 Gbit/s random numbers in our simulation. The high-speed bit sequences can pass standard statistical tests for randomness after an all-optical exclusive-or (XOR) operation.
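    The role of the XOR step can be seen in a short simulation: XOR-ing two independent bit streams that each emit 1 with probability p yields a stream with P(1) = 2p(1−p), closer to 1/2. This is a sketch of the statistical idea only (the 0.6 bias and stream lengths are invented for demonstration; the paper's XOR is performed optically):

```python
import random

random.seed(0)  # reproducible demonstration

def xor_whiten(bits_a, bits_b):
    """XOR two independent raw bit streams pairwise; if each has bias p,
    the result has P(1) = 2p(1-p), which is closer to 1/2."""
    return [a ^ b for a, b in zip(bits_a, bits_b)]

# Two simulated physical bit sources with a deliberate bias P(1) = 0.6.
raw1 = [1 if random.random() < 0.6 else 0 for _ in range(100_000)]
raw2 = [1 if random.random() < 0.6 else 0 for _ in range(100_000)]
whitened = xor_whiten(raw1, raw2)

print(sum(raw1) / len(raw1))          # close to 0.60
print(sum(whitened) / len(whitened))  # close to 2*0.6*0.4 = 0.48
```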

  2. Blind column selection protocol for two-dimensional high performance liquid chromatography.

    Science.gov (United States)

    Burns, Niki K; Andrighetto, Luke M; Conlan, Xavier A; Purcell, Stuart D; Barnett, Neil W; Denning, Jacquie; Francis, Paul S; Stevenson, Paul G

    2016-07-01

    The selection of two orthogonal columns for two-dimensional high performance liquid chromatography (LC×LC) separation of natural product extracts can be a labour-intensive and time-consuming process and in many cases is an entirely trial-and-error approach. This paper introduces a blind optimisation method for column selection when the mixture is treated as a black box of unknown constituent components. A data processing pipeline, created in the open source application OpenMS®, was developed to map the components within the mixture of equal mass across a library of HPLC columns; LC×LC separation space utilisation was compared by measuring the fractional surface coverage, fcoverage. It was found that for a test mixture from an opium poppy (Papaver somniferum) extract, the combination of diphenyl and C18 stationary phases provided a predicted fcoverage of 0.48, matched by a measured value of 0.43. OpenMS®, in conjunction with algorithms designed in-house, allowed for a significantly quicker selection of two orthogonal columns, optimised for LC×LC separation of crude extracts of plant material. Copyright © 2016 Elsevier B.V. All rights reserved.
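    The fractional surface coverage used to rank column pairs can be estimated by gridding the normalized 2-D retention space and counting occupied cells. A minimal sketch (the grid size, component count, and retention times are invented; OpenMS itself is not involved):

```python
import random

def fractional_coverage(rt1, rt2, bins=10):
    """f_coverage estimate: fraction of cells in a bins×bins grid over the
    normalized LC×LC separation space containing at least one component."""
    occupied = {
        (min(int(x * bins), bins - 1), min(int(y * bins), bins - 1))
        for x, y in zip(rt1, rt2)
    }
    return len(occupied) / bins**2

random.seed(1)
# Hypothetical normalized retention times of 60 components on two columns.
rt_c18 = [random.random() for _ in range(60)]
rt_diphenyl = [random.random() for _ in range(60)]
print(fractional_coverage(rt_c18, rt_diphenyl))
```

Strongly correlated retention on the two columns concentrates points along the diagonal and drives the coverage down, which is why orthogonal stationary-phase pairs such as diphenyl/C18 score well on this metric.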

  3. Highly sensitive and selective detection of dopamine using one-pot synthesized highly photoluminescent silicon nanoparticles.

    Science.gov (United States)

    Zhang, Xiaodong; Chen, Xiaokai; Kai, Siqi; Wang, Hong-Yin; Yang, Jingjing; Wu, Fu-Gen; Chen, Zhan

    2015-03-17

    A simple and highly efficient method for dopamine (DA) detection using water-soluble silicon nanoparticles (SiNPs) was reported. The SiNPs with a high quantum yield of 23.6% were synthesized by using a one-pot microwave-assisted method. The fluorescence quenching capability of a variety of molecules on the synthesized SiNPs has been tested; only DA molecules were found to be able to quench the fluorescence of these SiNPs effectively. Therefore, such a quenching effect can be used to selectively detect DA. All other molecules tested have little interference with the dopamine detection, including ascorbic acid, which commonly exists in cells and can possibly affect the dopamine detection. The ratio of the fluorescence intensity difference between the quenched and unquenched cases versus the fluorescence intensity without quenching (ΔI/I) was observed to be linearly proportional to the DA analyte concentration in the range from 0.005 to 10.0 μM, with a detection limit of 0.3 nM (S/N = 3). To the best of our knowledge, this is the lowest limit for DA detection reported so far. The mechanism of fluorescence quenching is attributed to the energy transfer from the SiNPs to the oxidized dopamine molecules through Förster resonance energy transfer. The reported method of SiNP synthesis is very simple and cheap, making the above sensitive and selective DA detection approach using SiNPs practical for many applications.
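    The calibration logic described, a linear ΔI/I response versus concentration and an S/N = 3 detection limit, can be sketched as follows; the intensity readings and blank noise below are fabricated placeholders chosen to mimic the reported numbers, not the paper's data:

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

conc_um = [0.005, 0.05, 0.5, 1.0, 5.0, 10.0]        # μM, range from the abstract
di_over_i = [0.0004, 0.004, 0.04, 0.08, 0.4, 0.8]   # hypothetical ΔI/I readings
slope, intercept = linear_fit(conc_um, di_over_i)

sigma_blank = 1e-5                  # hypothetical std. dev. of the blank signal
lod_um = 3 * sigma_blank / slope    # 3σ/slope convention corresponds to S/N = 3
print(slope, lod_um)                # slope 0.08 per μM; LOD 3.75e-4 μM (0.375 nM)
```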

  4. Curved Microneedle Array-Based sEMG Electrode for Robust Long-Term Measurements and High Selectivity

    Directory of Open Access Journals (Sweden)

    Minjae Kim

    2015-07-01

    Surface electromyography is widely used in many fields to infer human intention. However, conventional electrodes are not appropriate for long-term measurements and are easily influenced by the environment, so the range of applications of sEMG is limited. In this paper, we propose a flexible band-integrated, curved microneedle array electrode for robust long-term measurements, high selectivity, and easy applicability. Signal quality, in terms of long-term usability and sensitivity to perspiration, was investigated. Its motion-discriminating performance was also evaluated. The results show that the proposed electrode is robust to perspiration and can maintain a high-quality measuring ability for over 8 h. The proposed electrode also has high selectivity for motion compared with a commercial wet electrode and dry electrode.

  5. Random numbers spring from alpha decay

    International Nuclear Information System (INIS)

    Frigerio, N.A.; Sanathanan, L.P.; Morley, M.; Clark, N.A.; Tyler, S.A.

    1980-05-01

    Congruential random number generators, which are widely used in Monte Carlo simulations, are deficient in that the numbers they generate are concentrated in a relatively small number of hyperplanes. While this deficiency may not be a limitation in small Monte Carlo studies involving a few variables, it introduces a significant bias in large simulations requiring high resolution. This bias was recognized and assessed during preparations for an accident analysis study of nuclear power plants. This report describes a random number device based on the radioactive decay of alpha particles from a ²³⁵U source in a high-resolution gas proportional counter. The signals were fed to a 4096-channel analyzer and for each channel the frequency of signals registered in a 20,000-microsecond interval was recorded. The parity bits of these frequency counts (0 for an even count and 1 for an odd count) were then assembled in sequence to form 31-bit binary random numbers and transcribed to a magnetic tape. This cycle was repeated as many times as was necessary to create 3 million random numbers. The frequency distribution of counts from the present device conforms to the Brockwell-Moyal distribution, which takes into account the dead time of the counter (both the dead time and the decay constant of the underlying Poisson process were estimated). Analysis of the count data and tests of randomness on a sample set of the 31-bit binary numbers indicate that this random number device is a highly reliable source of truly random numbers. Its use is, therefore, recommended in Monte Carlo simulations for which congruential pseudorandom number generators are found to be inadequate. 6 figures, 5 tables
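    The report's scheme — take the parity of successive event counts and pack 31 of them into one binary number — can be mimicked with simulated Poisson counts. A sketch under illustrative assumptions (the mean count per interval and the sample sizes are invented, and software simulation obviously replaces the physical counter):

```python
import math
import random

def poisson_count(lam, rng):
    """Knuth's algorithm for drawing a Poisson-distributed count
    (adequate for small lambda)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def parity_bit_numbers(n_numbers, mean_count=20.0, bits=31, seed=42):
    """Assemble the parity bits of successive interval counts
    into fixed-width binary random numbers."""
    rng = random.Random(seed)
    numbers = []
    for _ in range(n_numbers):
        value = 0
        for _ in range(bits):
            # Keep only the least significant (parity) bit of each count.
            value = (value << 1) | (poisson_count(mean_count, rng) & 1)
        numbers.append(value)
    return numbers

nums = parity_bit_numbers(5)
print(nums)  # five 31-bit integers
```

The parity of a Poisson count is very nearly unbiased once the mean count is moderately large, which is why the report could use the raw parity bits directly.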

  6. Nutrition Information at the Point of Selection in High Schools Does Not Affect Purchases

    Science.gov (United States)

    Rainville, Alice Jo; Choi, Kyunghee; Ragg, Mark; King, Amber; Carr, Deborah H.

    2010-01-01

    Purpose/Objectives: Nutrition information can be an important component of local wellness policies. There are very few studies regarding nutrition information at the point of selection (POS) in high schools. The purpose of this study was to investigate the effects of posting entree nutrition information at the POS in high schools nationwide.…

  7. Origin of high oxide to nitride polishing selectivity of ceria-based slurry in the presence of picolinic acid

    Institute of Scientific and Technical Information of China (English)

    Wang Liang-Yong; Liu Bo; Song Zhi-Tang; Liu Wei-Li; Feng Song-Lin; David Huang; S.V Babu

    2011-01-01

    We report on the investigation of the origin of the high oxide-to-nitride polishing selectivity of ceria-based slurry in the presence of picolinic acid. The oxide-to-nitride removal selectivity of the ceria slurry with picolinic acid is as high as 76.6 in chemical mechanical polishing. By using a zeta potential analyzer, a particle size analyzer, a horizon profilometer, thermogravimetric analysis and Fourier transform infrared spectroscopy, the pre- and post-polished wafer surfaces as well as the pre- and post-used ceria-based slurries are compared. A possible mechanism for the high oxide-to-nitride selectivity of ceria-based slurry with picolinic acid is discussed.

  8. Selection and Clonal Propagation of High Artemisinin Genotypes of Artemisia annua

    Science.gov (United States)

    Wetzstein, Hazel Y.; Porter, Justin A.; Janick, Jules; Ferreira, Jorge F. S.; Mutui, Theophilus M.

    2018-01-01

    Artemisinin, produced in the glandular trichomes of Artemisia annua L. is a vital antimalarial drug effective against Plasmodium falciparum resistant to quinine-derived medicines. Although work has progressed on the semi-synthetic production of artemisinin, field production of A. annua remains the principal commercial source of the compound. Crop production of artemisia must be increased to meet the growing worldwide demand for artemisinin combination therapies (ACTs) to treat malaria. Grower artemisinin yields rely on plants generated from seeds from open-pollinated parents. Although selection has considerably increased plant artemisinin concentration in the past 15 years, seed-generated plants have highly variable artemisinin content that lowers artemisinin yield per hectare. Breeding efforts to produce improved F1 hybrids have been hampered by the inability to produce inbred lines due to self-incompatibility. An approach combining conventional hybridization and selection with clonal propagation of superior genotypes is proposed as a means to enhance crop yield and artemisinin production. Typical seed-propagated artemisia plants produce less than 1% (dry weight) artemisinin with yields below 25 kg/ha. Genotypes were identified producing high artemisinin levels of over 2% and possessing improved agronomic characteristics such as high leaf area and shoot biomass production. Field studies of clonally-propagated high-artemisinin plants verified enhanced plant uniformity and an estimated gross primary productivity of up to 70 kg/ha artemisinin, with a crop density of one plant per m². Tissue culture and cutting protocols for the mass clonal propagation of A. annua were developed for shoot regeneration, rooting, acclimatization, and field cultivation. Proof-of-concept studies showed that both tissue culture-regenerated plants and rooted cuttings performed better than plants derived from seed in terms of uniformity, yield, and consistently high artemisinin content.

  9. Noncontextuality with Marginal Selectivity in Reconstructing Mental Architectures

    Directory of Open Access Journals (Sweden)

    Ru Zhang

    2015-06-01

    We present a general theory of series-parallel mental architectures with selectively influenced stochastically non-independent components. A mental architecture is a hypothetical network of processes aimed at performing a task, of which we only observe the overall time it takes under variable parameters of the task. It is usually assumed that the network contains several processes selectively influenced by different experimental factors, and the question is then asked as to how these processes are arranged within the network, e.g., whether they are concurrent or sequential. One way of doing this is to consider the distribution functions for the overall processing time and compute certain linear combinations thereof (interaction contrasts). The theory of selective influences in psychology can be viewed as a special application of the interdisciplinary theory of (non)contextuality having its origins and main applications in quantum theory. In particular, lack of contextuality is equivalent to the existence of a hidden random entity of which all the random variables in play are functions. Consequently, for any given value of this common random entity, the processing times and their compositions (minima, maxima, or sums) become deterministic quantities. These quantities, in turn, can be treated as random variables with (shifted) Heaviside distribution functions, for which one can easily compute various linear combinations across different treatments, including interaction contrasts. This mathematical fact leads to a simple method, more general than those previously used, to investigate and characterize the interaction contrast for different types of series-parallel architectures.

  10. Thermal-hydraulic code selection for modular high temperature gas-cooled reactors

    Energy Technology Data Exchange (ETDEWEB)

    Komen, E M.J.; Bogaard, J.P.A. van den

    1995-06-01

    In order to study the transient thermal-hydraulic system behaviour of modular high temperature gas-cooled reactors, the thermal-hydraulic computer codes RELAP5, MELCOR, THATCH, MORECA, and VSOP are considered at the Netherlands Energy Research Foundation ECN. This report presents the selection of the most appropriate codes. To cover the range of relevant accidents, a suite of three codes is recommended for analyses of HTR-M and MHTGR reactors. (orig.).

  11. A 3-week multimodal intervention involving high-intensity interval training in female cancer survivors: a randomized controlled trial.

    Science.gov (United States)

    Schmitt, Joachim; Lindner, Nathalie; Reuss-Borst, Monika; Holmberg, Hans-Christer; Sperlich, Billy

    2016-02-01

    To compare the effects of a 3-week multimodal rehabilitation involving supervised high-intensity interval training (HIIT) on female breast cancer survivors with respect to key variables of aerobic fitness, body composition, energy expenditure, cancer-related fatigue, and quality of life to those of a standard multimodal rehabilitation program. A randomized controlled trial design was administered. Twenty-eight women who had been treated for cancer were randomly assigned to either a group performing exercise of low-to-moderate intensity (LMIE; n = 14) or a group performing high-intensity interval training (HIIT; n = 14) as part of a 3-week multimodal rehabilitation program. No adverse events related to the exercise were reported. Work economy improved following both HIIT and LMIE, with improved peak oxygen uptake following LMIE. HIIT reduced mean total body fat mass with no change in body mass, muscle or fat-free mass. Total energy expenditure (P = 0.45) did not differ between the groups, and both groups improved quality of life to a similar high extent and lessened cancer-related fatigue. This randomized controlled study demonstrates that HIIT can be performed by female cancer survivors without adverse health effects. Here, HIIT and LMIE both improved work economy, quality of life and cancer-related fatigue, with comparable effects on body composition and energy expenditure. Since the outcomes were similar, but HIIT takes less time, this may be a time-efficient strategy for improving certain aspects of the health of female cancer survivors. © 2016 The Authors. Physiological Reports published by Wiley Periodicals, Inc. on behalf of the American Physiological Society and The Physiological Society.

  12. Random demographic household surveys in highly mobile pastoral communities in Chad.

    Science.gov (United States)

    Weibel, Daniel; Béchir, Mahamat; Hattendorf, Jan; Bonfoh, Bassirou; Zinsstag, Jakob; Schelling, Esther

    2011-05-01

    Reliable demographic data is a central requirement for health planning and management, and for the implementation of adequate interventions. This study addresses the lack of demographic data on mobile pastoral communities in the Sahel. A total of 1081 Arab, Fulani and Gorane women and 2541 children (1336 boys and 1205 girls) were interviewed and registered by a biometric fingerprint scanner in five repeated random transect demographic and health surveys conducted from March 2007 to January 2008 in the Lake Chad region in Chad. Important determinants for the planning and implementation of household surveys among mobile pastoral communities include: environmental factors; availability of women for interviews; difficulties in defining "own" children; the need for information-education-communication campaigns; and informed consent of husbands in typically patriarchal societies. Due to their high mobility, only 5% (56/1081) of registered women were encountered twice. Therefore, it was not possible to establish a demographic and health cohort. Prospective demographic and health cohorts are the most accurate method to assess child mortality and other demographic indices. However, their feasibility in a highly mobile pastoral setting remains to be shown. Future interdisciplinary scientific efforts need to target innovative methods, tools and approaches to include marginalized communities in operational health and demographic surveillance systems.

  13. Selective population of high-j states via heavy-ion-induced transfer reactions

    International Nuclear Information System (INIS)

    Bond, P.D.

    1982-01-01

    One of the early hopes of heavy-ion-induced transfer reactions was to populate states not seen easily or at all by other means. To date, however, I believe it is fair to say that spectroscopic studies of previously unknown states have had, at best, limited success. Despite the early demonstration of selectivity with cluster transfer to high-lying states in light nuclei, the study of heavy-ion-induced transfer reactions has emphasized the reaction mechanism. The value of using two of these reactions for spectroscopy of high-spin states is demonstrated: ¹⁴³Nd(¹⁶O,¹⁵O)¹⁴⁴Nd and ¹⁷⁰Er(¹⁶O,¹⁵Oγ)¹⁷¹Er

  14. Plans, Patterns, and Move Categories Guiding a Highly Selective Search

    Science.gov (United States)

    Trippen, Gerhard

    In this paper we present our ideas for an Arimaa-playing program (also called a bot) that uses plans and pattern matching to guide a highly selective search. We restrict move generation to moves in certain move categories to reduce the number of moves considered by the bot significantly. Arimaa is a modern board game that can be played with a standard Chess set. However, the rules of the game are not at all like those of Chess. Furthermore, Arimaa was designed to be as simple and intuitive as possible for humans, yet challenging for computers. While all established Arimaa bots use alpha-beta search with a variety of pruning techniques and other heuristics ending in an extensive positional leaf node evaluation, our new bot, Rat, starts with a positional evaluation of the current position. Based on features found in the current position - supported by pattern matching using a directed position graph - our bot Rat decides which of a given set of plans to follow. The plan then dictates what types of moves can be chosen. This is another major difference from bots that generate "all" possible moves for a particular position. Rat is only allowed to generate moves that belong to certain categories. Leaf nodes are evaluated only by a straightforward material evaluation to help avoid moves that lose material. This highly selective search looks, on average, at only 5 moves out of 5,000 to over 40,000 possible moves in a middle game position.

  15. Highly selective population of two excited states in nonresonant two-photon absorption

    International Nuclear Information System (INIS)

    Zhang Hui; Zhang Shi-An; Sun Zhen-Rong

    2011-01-01

    A nonresonant two-photon absorption process can be manipulated by tailoring the ultra-short laser pulse. In this paper, we theoretically demonstrate a highly selective population of two excited states in the nonresonant two-photon absorption process by rationally designing a spectral phase distribution. Our results show that one excited state is maximally populated while the other state population is widely tunable from zero to the maximum value. We believe that the theoretical results may play an important role in the selective population of a more complex nonlinear process comprising nonresonant two-photon absorption, such as resonance-mediated (2+1)-three-photon absorption and (2+1)-resonant multiphoton ionization. (atomic and molecular physics)

  16. DNA-based random number generation in security circuitry.

    Science.gov (United States)

    Gearheart, Christy M; Arazi, Benjamin; Rouchka, Eric C

    2010-06-01

    DNA-based circuit design is an area of research in which traditional silicon-based technologies are replaced by naturally occurring phenomena taken from biochemistry and molecular biology. This research focuses on further developing DNA-based methodologies to mimic digital data manipulation. While exhibiting fundamental principles, this work was done in conjunction with the vision that DNA-based circuitry, when the technology matures, will form the basis for a tamper-proof security module, revolutionizing the meaning and concept of tamper-proofing and possibly preventing it altogether based on accurate scientific observations. A paramount part of such a solution would be self-generation of random numbers. A novel prototype schema employs solid phase synthesis of oligonucleotides for random construction of DNA sequences; temporary storage and retrieval is achieved through plasmid vectors. A discussion of how to evaluate sequence randomness is included, as well as how these techniques are applied to a simulation of the random number generation circuitry. Simulation results show generated sequences successfully pass three selected NIST random number generation tests specified for security applications.
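    The NIST statistical tests mentioned are published with worked examples; the simplest of them, the frequency (monobit) test, fits in a few lines. This sketch reproduces the worked example from NIST SP 800-22 (the 10-bit sequence 1011010101, whose documented p-value is 0.527089):

```python
import math

def monobit_test(bits):
    """NIST SP 800-22 frequency (monobit) test.
    Returns the p-value; the sequence passes at the 1% level if p >= 0.01."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)   # map 0 -> -1, 1 -> +1 and sum
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))

p = monobit_test([1, 0, 1, 1, 0, 1, 0, 1, 0, 1])
print(round(p, 6))  # 0.527089, matching the worked example in SP 800-22
```

The full suite adds runs, block-frequency, and many other tests, but all of them follow this shape: reduce the bit sequence to a statistic with a known reference distribution and report a p-value.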

  17. Selective suppression of high-order harmonics within phase-matched spectral regions.

    Science.gov (United States)

    Lerner, Gavriel; Diskin, Tzvi; Neufeld, Ofer; Kfir, Ofer; Cohen, Oren

    2017-04-01

    Phase matching in high-harmonic generation leads to enhancement of multiple harmonics. It is sometimes desired to control the spectral structure within the phase-matched spectral region. We propose a scheme for selective suppression of high-order harmonics within the phase-matched spectral region while weakly influencing the other harmonics. The method is based on addition of phase-mismatched segments within a phase-matched medium. We demonstrate the method numerically in two examples. First, we show that one phase-mismatched segment can significantly suppress harmonic orders 9, 15, and 21. Second, we show that two phase-mismatched segments can efficiently suppress circularly polarized harmonics with one helicity over the other when driven by a bi-circular field. The new method may be useful for various applications, including the generation of highly helical bright attosecond pulses.
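    The selectivity can be seen in a toy interference model: if the mismatched segment imposes an extra phase q·δ on harmonic q between two phase-matched zones, then harmonics with q·δ ≡ π (mod 2π) cancel. With δ = π/3, exactly the harmonics 9, 15, and 21 named above are suppressed while their neighbours survive (the two-zone geometry and the value of δ are illustrative assumptions, not the paper's parameters):

```python
import cmath
import math

def two_zone_amplitude(q, phase_slip):
    """|1 + exp(i*q*phase_slip)|: coherent sum of harmonic q emitted by two
    equal phase-matched zones separated by a phase-mismatched segment."""
    return abs(1 + cmath.exp(1j * q * phase_slip))

slip = math.pi / 3
for q in range(9, 23, 2):
    print(q, round(two_zone_amplitude(q, slip), 2))
# harmonics 9, 15, 21 -> 0.0 (cancelled); 11, 13, 17, 19 -> 1.73
```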

  18. Multistage Selection and the Financing of New Ventures

    OpenAIRE

    Jonathan T. Eckhardt; Scott Shane; Frédéric Delmar

    2006-01-01

    Using a random sample of 221 new Swedish ventures initiated in 1998, we examine why some new ventures are more likely than others to successfully be awarded capital from external sources. We examine venture financing as a staged selection process in which two sequential selection events systematically winnow the population of ventures and influence which ventures receive financing. For a venture to receive external financing its founders must first select it as a candidate for external funding...

  19. Highly selective oxidative dehydrogenation of ethane with supported molten chloride catalysts

    Energy Technology Data Exchange (ETDEWEB)

    Gaertner, C.A.; Veen, A.C. van; Lercher, J.A. [Technische Univ. Muenchen (Germany). Catalysis Research Center

    2011-07-01

    Ethene production is one of the most important transformations in the chemical industry, given that C₂H₄ serves as a building block for many mass-market products. Besides conventional thermal processes like steam cracking of ethane, ethene can be produced selectively by catalytic processes. One class of catalysts reported in the literature as active and highly selective for the oxidative dehydrogenation of ethane is that of supported molten chloride catalysts, containing an alkali chloride overlayer on a solid support. This work deals with fundamental aspects of the catalytic action in the latter class of catalysts. Results from kinetic reaction studies are related to observations in detailed characterization and lead to a comprehensive mechanistic understanding. Of fundamental importance towards mechanistic insights is the oxygen storage capacity of the catalysts, which has been determined by transient step experiments. (orig.)

  20. Rural Women's Preference For Selected Programmes Of The ...

    African Journals Online (AJOL)

    The study focused on the rural women's preference for selected programmes of the National Special Programme for Food Security (NSPFS) in Imo State, Nigeria. Data were collected with the aid of structured interviews from 150 randomly selected women in the study area. Results from the study showed that respondents ...