Blocked Randomization with Randomly Selected Block Sizes
Directory of Open Access Journals (Sweden)
Jimmy Efird
2010-12-01
Full Text Available When planning a randomized clinical trial, careful consideration must be given to how participants are selected for various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has equal likelihood of being assigned to treatment versus referent groups. However, by chance an unequal number of individuals may be assigned to each arm of the study and thus decrease the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Still, the allocation process may be predictable, for example, when the investigator is not blind and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes.
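The scheme described in the abstract can be sketched in a few lines; the arm names, the candidate block sizes, and the seed below are illustrative choices, not prescriptions from the paper.

```python
import random

def blocked_randomization(n_participants, arms=("treatment", "control"),
                          block_sizes=(2, 4, 6), seed=None):
    """Allocate participants in shuffled blocks whose sizes are drawn at
    random: each block contains an equal number of assignments per arm, so
    group sizes stay balanced, while the randomly varying block size keeps
    the next assignment unpredictable to an unblinded investigator.
    Block sizes must be multiples of the number of arms."""
    rng = random.Random(seed)
    sequence = []
    while len(sequence) < n_participants:
        size = rng.choice(block_sizes)            # randomly selected block size
        block = list(arms) * (size // len(arms))  # equal assignments per arm
        rng.shuffle(block)                        # random order within the block
        sequence.extend(block)
    return sequence[:n_participants]
```

Because every complete block is balanced, the arm counts can differ at most by what the final, possibly truncated block contributes.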
Minimization over randomly selected lines
Directory of Open Access Journals (Sweden)
Ismet Sahin
2013-07-01
Full Text Available This paper presents a population-based evolutionary optimization method for minimizing a given cost function. The mutation operator of this method selects randomly oriented lines in the cost function domain, constructs quadratic functions interpolating the cost function at three different points over each line, and uses extrema of the quadratics as mutated points. The crossover operator modifies each mutated point based on components of two points in the population, rather than one point as in most other evolutionary algorithms. The stopping criterion of this method depends on the number of almost degenerate quadratics. We demonstrate that the proposed method with these mutation and crossover operations achieves faster and more robust convergence than the well-known Differential Evolution and Particle Swarm algorithms.
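The mutation step described above (quadratic interpolation along a randomly oriented line) can be sketched as follows; the step size, the three sample points, and the degeneracy tolerance are illustrative assumptions, not the paper's settings.

```python
import random

def quadratic_line_step(cost, x, seed=None):
    """One mutation step: pick a randomly oriented line through x, fit a
    quadratic through the cost evaluated at three points on the line, and
    return the quadratic's stationary point as the mutated point."""
    rng = random.Random(seed)
    d = [rng.gauss(0.0, 1.0) for _ in x]          # random line orientation
    f = [cost([xi + t * di for xi, di in zip(x, d)]) for t in (-1.0, 0.0, 1.0)]
    a = (f[0] + f[2]) / 2.0 - f[1]                # fit f(t) ~ a t^2 + b t + c
    b = (f[2] - f[0]) / 2.0
    if abs(a) < 1e-12:                            # almost degenerate quadratic
        return list(x)
    t_star = -b / (2.0 * a)                       # extremum of the fitted quadratic
    return [xi + t_star * di for xi, di in zip(x, d)]
```

For a cost that is itself quadratic, the three-point fit is exact, so a single step lands on the line's minimizer.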
High Entropy Random Selection Protocols
H. Buhrman (Harry); M. Christandl (Matthias); M. Koucky (Michal); Z. Lotker (Zvi); B. Patt-Shamir; M. Charikar; K. Jansen; O. Reingold; J. Rolim
2007-01-01
In this paper, we construct protocols for two parties that do not trust each other, to generate random variables with high Shannon entropy. We improve known bounds for the trade-off between the number of rounds, the length of communication, and the entropy of the outcome.
47 CFR 1.1603 - Conduct of random selection.
2010-10-01
... 47 Telecommunication 1 2010-10-01 2010-10-01 false Conduct of random selection. 1.1603 Section 1.1603 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1603 Conduct of random selection. The...
47 CFR 1.1602 - Designation for random selection.
2010-10-01
... 47 Telecommunication 1 2010-10-01 2010-10-01 false Designation for random selection. 1.1602 Section 1.1602 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1602 Designation for random selection...
Testing, Selection, and Implementation of Random Number Generators
National Research Council Canada - National Science Library
Collins, Joseph C
2008-01-01
An exhaustive evaluation of state-of-the-art random number generators with several well-known suites of tests provides the basis for selection of suitable random number generators for use in stochastic simulations...
Random effect selection in generalised linear models
DEFF Research Database (Denmark)
Denwood, Matt; Houe, Hans; Forkman, Björn
We analysed abattoir recordings of meat inspection codes with possible relevance to on-farm animal welfare in cattle. Random effects logistic regression models were used to describe individual-level data obtained from 461,406 cattle slaughtered in Denmark. Our results demonstrate that the largest...
Interference-aware random beam selection for spectrum sharing systems
Abdallah, Mohamed M.; Sayed, Mostafa M.; Alouini, Mohamed-Slim; Qaraqe, Khalid A.
2012-01-01
In this paper, we develop interference-aware random beam selection schemes that provide enhanced throughput for the secondary link under the condition that the interference observed at the primary link is within a predetermined acceptable value. For a secondary
Selectivity and sparseness in randomly connected balanced networks.
Directory of Open Access Journals (Sweden)
Cengiz Pehlevan
Full Text Available Neurons in sensory cortex show stimulus selectivity and sparse population response, even in cases where no strong functionally specific structure in connectivity can be detected. This raises the question of whether selectivity and sparseness can be generated and maintained in randomly connected networks. We consider a recurrent network of excitatory and inhibitory spiking neurons with random connectivity, driven by random projections from an input layer of stimulus-selective neurons. In this architecture, the stimulus-to-stimulus and neuron-to-neuron modulation of total synaptic input is weak compared to the mean input. Surprisingly, we show that in the balanced state the network can still support high stimulus selectivity and sparse population response. In the balanced state, strong synapses amplify the variation in synaptic input and recurrent inhibition cancels the mean. Functional specificity in connectivity emerges due to the inhomogeneity caused by the generative statistical rule used to build the network. We further elucidate the mechanism behind this and evaluate the effects of model parameters on population sparseness and stimulus selectivity. Network response to mixtures of stimuli is investigated. It is shown that a balanced state with unselective inhibition can be achieved with densely connected input to the inhibitory population. Balanced networks exhibit the "paradoxical" effect: an increase in excitatory drive to inhibition leads to a decreased inhibitory population firing rate. We compare and contrast selectivity and sparseness generated by the balanced network to those of randomly connected unbalanced networks. Finally, we discuss our results in light of experiments.
The signature of positive selection at randomly chosen loci.
Przeworski, Molly
2002-01-01
In Drosophila and humans, there are accumulating examples of loci with a significant excess of high-frequency-derived alleles or high levels of linkage disequilibrium, relative to a neutral model of a random-mating population of constant size. These are features expected after a recent selective sweep. Their prevalence suggests that positive directional selection may be widespread in both species. However, as I show here, these features do not persist long after the sweep ends: The high-frequ...
The reliability of randomly selected final year pharmacy students in ...
African Journals Online (AJOL)
Employing ANOVA, factorial experimental analysis, and the theory of error, reliability studies were conducted on the assessment of the drug product chloroquine phosphate tablets. The G–Study employed equal numbers of the factors for uniform control, and involved three analysts (randomly selected final year Pharmacy ...
Local randomization in neighbor selection improves PRM roadmap quality
McMahon, Troy
2012-10-01
Probabilistic Roadmap Methods (PRMs) are one of the most used classes of motion planning methods. These sampling-based methods generate robot configurations (nodes) and then connect them to form a graph (roadmap) containing representative feasible pathways. A key step in PRM roadmap construction involves identifying a set of candidate neighbors for each node. Traditionally, these candidates are chosen to be the k-closest nodes based on a given distance metric. In this paper, we propose a new neighbor selection policy called LocalRand(k,K'), that first computes the K' closest nodes to a specified node and then selects k of those nodes at random. Intuitively, LocalRand attempts to benefit from random sampling while maintaining the higher levels of local planner success inherent to selecting more local neighbors. We provide a methodology for selecting the parameters k and K'. We perform an experimental comparison which shows that for both rigid and articulated robots, LocalRand results in roadmaps that are better connected than the traditional k-closest policy or a purely random neighbor selection policy. The cost required to achieve these results is shown to be comparable to k-closest. © 2012 IEEE.
Local randomization in neighbor selection improves PRM roadmap quality
McMahon, Troy; Jacobs, Sam; Boyd, Bryan; Tapia, Lydia; Amato, Nancy M.
2012-01-01
Probabilistic Roadmap Methods (PRMs) are one of the most used classes of motion planning methods. These sampling-based methods generate robot configurations (nodes) and then connect them to form a graph (roadmap) containing representative feasible pathways. A key step in PRM roadmap construction involves identifying a set of candidate neighbors for each node. Traditionally, these candidates are chosen to be the k-closest nodes based on a given distance metric. In this paper, we propose a new neighbor selection policy called LocalRand(k,K'), that first computes the K' closest nodes to a specified node and then selects k of those nodes at random. Intuitively, LocalRand attempts to benefit from random sampling while maintaining the higher levels of local planner success inherent to selecting more local neighbors. We provide a methodology for selecting the parameters k and K'. We perform an experimental comparison which shows that for both rigid and articulated robots, LocalRand results in roadmaps that are better connected than the traditional k-closest policy or a purely random neighbor selection policy. The cost required to achieve these results is shown to be comparable to k-closest. © 2012 IEEE.
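The LocalRand(k, K') policy described above is simple to express directly; Euclidean distance stands in here for the planner-specific distance metrics PRMs actually use, and the node representation as coordinate tuples is an assumption for illustration.

```python
import math
import random

def local_rand(node, nodes, k, k_prime, seed=None):
    """LocalRand(k, K') candidate-neighbor selection: find the K' nodes
    closest to `node`, then return k of them chosen at random, combining
    local-planner-friendly proximity with randomization."""
    rng = random.Random(seed)
    others = [n for n in nodes if n != node]
    closest = sorted(others, key=lambda n: math.dist(node, n))[:k_prime]
    return rng.sample(closest, k)
```

Setting k_prime equal to k recovers the traditional k-closest policy, while letting k_prime cover all nodes recovers purely random neighbor selection.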
Selection for altruism through random drift in variable size populations
Directory of Open Access Journals (Sweden)
Houchmandzadeh Bahram
2012-05-01
Full Text Available Abstract Background Altruistic behavior is defined as helping others at a cost to oneself and a lowered fitness. The lower fitness implies that altruists should be selected against, which is in contradiction with their widespread presence in nature. Present models of selection for altruism (kin or multilevel) show that altruistic behaviors can have ‘hidden’ advantages if the ‘common good’ produced by altruists is restricted to some related or unrelated groups. These models are mostly deterministic, or assume a frequency-dependent fitness. Results Evolutionary dynamics is a competition between deterministic selection pressure and stochastic events due to random sampling from one generation to the next. We show here that an altruistic allele extending the carrying capacity of the habitat can win by increasing the random drift of “selfish” alleles. In other terms, the fixation probability of altruistic genes can be higher than that of selfish ones, even though altruists have a smaller fitness. Moreover, when populations are geographically structured, the altruists’ advantage can be highly amplified and the fixation probability of selfish genes can tend toward zero. The above results are obtained both by numerical and analytical calculations. Analytical results are obtained in the limit of large populations. Conclusions The theory we present does not involve kin or multilevel selection, but is based on the existence of random drift in variable size populations. The model is a generalization of the original Fisher-Wright and Moran models where the carrying capacity depends on the number of altruists.
Interference-aware random beam selection for spectrum sharing systems
Abdallah, Mohamed M.
2012-09-01
Spectrum sharing systems have been introduced to alleviate the problem of spectrum scarcity by allowing secondary unlicensed networks to share the spectrum with primary licensed networks under acceptable interference levels to the primary users. In this paper, we develop interference-aware random beam selection schemes that provide enhanced throughput for the secondary link under the condition that the interference observed at the primary link is within a predetermined acceptable value. For a secondary transmitter equipped with multiple antennas, our schemes select a random beam, among a set of power- optimized orthogonal random beams, that maximizes the capacity of the secondary link while satisfying the interference constraint at the primary receiver for different levels of feedback information describing the interference level at the primary receiver. For the proposed schemes, we develop a statistical analysis for the signal-to-noise and interference ratio (SINR) statistics as well as the capacity of the secondary link. Finally, we present numerical results that study the effect of system parameters including number of beams and the maximum transmission power on the capacity of the secondary link attained using the proposed schemes. © 2012 IEEE.
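The selection rule described above (maximize secondary capacity subject to the primary interference constraint) can be sketched with scalar channel gains standing in for the paper's fading model; representing each power-optimized beam by its transmit power alone is a simplifying assumption for illustration.

```python
import math

def select_beam(beam_powers, gain_secondary, gain_primary, noise, max_interference):
    """Among candidate random beams, pick the one maximizing the secondary
    link's capacity while keeping the interference produced at the primary
    receiver within the predetermined acceptable value."""
    best_capacity, best_power = -1.0, None
    for power in beam_powers:
        if power * gain_primary > max_interference:   # violates the constraint
            continue
        sinr = power * gain_secondary / noise
        capacity = math.log2(1.0 + sinr)              # secondary-link capacity
        if capacity > best_capacity:
            best_capacity, best_power = capacity, power
    return best_power                                  # None if no beam is feasible
```

Raising the transmit power increases both capacity and interference, which is exactly the trade-off the constraint arbitrates.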
Random selection of items. Selection of n1 samples among N items composing a stratum
International Nuclear Information System (INIS)
Jaech, J.L.; Lemaire, R.J.
1987-02-01
STR-224 provides generalized procedures to determine required sample sizes, for instance in the course of a Physical Inventory Verification at Bulk Handling Facilities. The present report describes procedures to generate random numbers and select groups of items to be verified in a given stratum through each of the measurement methods involved in the verification. (author). 3 refs
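The selection step described above (drawing groups of items from a stratum for each measurement method) can be sketched as follows; the method names and sample sizes are illustrative placeholders, not values from the report.

```python
import random

def allocate_to_methods(stratum_items, sample_sizes, seed=None):
    """Randomly select disjoint groups of items from the N items composing
    a stratum, one group per measurement method. `sample_sizes` maps each
    method name to its required sample size (n1, n2, ...)."""
    rng = random.Random(seed)
    items = list(stratum_items)
    rng.shuffle(items)                    # a random order yields a random selection
    allocation, start = {}, 0
    for method, n in sample_sizes.items():
        allocation[method] = items[start:start + n]
        start += n
    return allocation
```

Shuffling once and slicing guarantees the groups are disjoint, so no item is verified by two methods.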
The signature of positive selection at randomly chosen loci.
Przeworski, Molly
2002-03-01
In Drosophila and humans, there are accumulating examples of loci with a significant excess of high-frequency-derived alleles or high levels of linkage disequilibrium, relative to a neutral model of a random-mating population of constant size. These are features expected after a recent selective sweep. Their prevalence suggests that positive directional selection may be widespread in both species. However, as I show here, these features do not persist long after the sweep ends: The high-frequency alleles drift to fixation and no longer contribute to polymorphism, while linkage disequilibrium is broken down by recombination. As a result, loci chosen without independent evidence of recent selection are not expected to exhibit either of these features, even if they have been affected by numerous sweeps in their genealogical history. How then can we explain the patterns in the data? One possibility is population structure, with unequal sampling from different subpopulations. Alternatively, positive selection may not operate as is commonly modeled. In particular, the rate of fixation of advantageous mutations may have increased in the recent past.
Thomas, D.L.; Johnson, D.; Griffith, B.
2006-01-01
We present a Bayesian random-effects model to assess resource selection, modeling the probability of use of land units characterized by discrete and continuous measures. This model provides simultaneous estimation of both individual- and population-level selection. Deviance information criterion (DIC), a Bayesian alternative to AIC that is sample-size specific, is used for model selection. Aerial radiolocation data from 76 adult female caribou (Rangifer tarandus) and calf pairs during 1 year on an Arctic coastal plain calving ground were used to illustrate models and assess population-level selection of landscape attributes, as well as individual heterogeneity of selection. Landscape attributes included elevation, NDVI (a measure of forage greenness), and land cover-type classification. Results from the first of a 2-stage model-selection procedure indicated that there is substantial heterogeneity among cow-calf pairs with respect to selection of the landscape attributes. In the second stage, selection of models with heterogeneity included indicated that at the population level, NDVI and land cover class were significant attributes for selection of different landscapes by pairs on the calving ground. Population-level selection coefficients indicate that the pairs generally select landscapes with higher levels of NDVI, but the relationship is quadratic. The highest rate of selection occurs at values of NDVI less than the maximum observed. Results for land cover-class selection coefficients indicate that wet sedge, moist sedge, herbaceous tussock tundra, and shrub tussock tundra are selected at approximately the same rate, while alpine and sparsely vegetated landscapes are selected at a lower rate. Furthermore, the variability in selection by individual caribou for moist sedge and sparsely vegetated landscapes is large relative to the variability in selection of other land cover types. The example analysis illustrates that, while sometimes computationally intense, a
Blind Measurement Selection: A Random Matrix Theory Approach
Elkhalil, Khalil
2016-12-14
This paper considers the problem of selecting a set of $k$ measurements from $n$ available sensor observations. The selected measurements should minimize a certain error function assessing the error in estimating a certain $m$ dimensional parameter vector. The exhaustive search inspecting each of the $\binom{n}{k}$ possible choices would require a very high computational complexity and as such is not practical for large $n$ and $k$. Alternative methods with low complexity have recently been investigated but their main drawbacks are that 1) they require perfect knowledge of the measurement matrix and 2) they need to be applied at the pace of change of the measurement matrix. To overcome these issues, we consider the asymptotic regime in which $k$, $n$ and $m$ grow large at the same pace. Tools from random matrix theory are then used to approximate in closed-form the most important error measures that are commonly used. The asymptotic approximations are then leveraged to properly select $k$ measurements exhibiting low values for the asymptotic error measures. Two heuristic algorithms are proposed: the first one merely consists in applying the convex optimization artifice to the asymptotic error measure. The second algorithm is a low-complexity greedy algorithm that attempts to look for a sufficiently good solution for the original minimization problem. The greedy algorithm can be applied to both the exact and the asymptotic error measures and can be thus implemented in blind and channel-aware fashions. We present two potential applications where the proposed algorithms can be used, namely antenna selection for uplink transmissions in large scale multi-user systems and sensor selection for wireless sensor networks. Numerical results are also presented and sustain the efficiency of the proposed blind methods in reaching the performances of channel-aware algorithms.
Materials selection for oxide-based resistive random access memories
International Nuclear Information System (INIS)
Guo, Yuzheng; Robertson, John
2014-01-01
The energies of atomic processes in resistive random access memories (RRAMs) are calculated for four typical oxides, HfO2, TiO2, Ta2O5, and Al2O3, to define a materials selection process. O vacancies have the lowest defect formation energy in the O-poor limit and dominate the processes. A band diagram defines the operating Fermi energy and O chemical potential range. It is shown how the scavenger metal can be used to vary the O vacancy formation energy, via controlling the O chemical potential, and the mean Fermi energy. The high endurance of Ta2O5 RRAM is related to its more stable amorphous phase and the adaptive lattice rearrangements of its O vacancy.
Primitive polynomials selection method for pseudo-random number generator
Anikin, I. V.; Alnajjar, Kh
2018-01-01
In this paper we suggest a method for selecting primitive polynomials of a special type. This kind of polynomial can be efficiently used as a characteristic polynomial for linear feedback shift registers in pseudo-random number generators. The proposed method consists of two basic steps: finding minimum-cost irreducible polynomials of the desired degree and applying primitivity tests to obtain the primitive ones. Finally, two primitive polynomials found by the proposed method were used in the pseudo-random number generator based on fuzzy logic (FRNG) suggested earlier by the authors. The sequences generated by the new version of FRNG have low correlation magnitude, high linear complexity, and better statistical properties, are more balanced, and require less power consumption.
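The role a primitive polynomial plays in an LFSR can be illustrated with a short sketch; the degree-4 register, the tap set, and the tap-to-polynomial convention below are assumptions chosen for illustration (under the shift convention used here, taps (1, 4) realize the primitive polynomial x^4 + x^3 + 1), not the polynomials found in the paper.

```python
def lfsr_bits(taps, state, n_bits):
    """Fibonacci LFSR: the feedback bit is the XOR of the tapped stages and
    the last stage is output. When the characteristic polynomial is
    primitive of degree m, every nonzero start state yields the maximal
    period 2**m - 1, which is why primitivity is tested for."""
    s, out = list(state), []
    for _ in range(n_bits):
        out.append(s[-1])             # output the last stage
        fb = 0
        for t in taps:
            fb ^= s[t - 1]            # XOR of tapped stages
        s = [fb] + s[:-1]             # shift, inserting the feedback bit
    return out
```

A maximal-length (m-sequence) output of degree 4 repeats every 15 bits and contains 8 ones and 7 zeros per period, one of the balance properties the abstract alludes to.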
Materials selection for oxide-based resistive random access memories
Energy Technology Data Exchange (ETDEWEB)
Guo, Yuzheng; Robertson, John [Engineering Department, Cambridge University, Cambridge CB2 1PZ (United Kingdom)
2014-12-01
The energies of atomic processes in resistive random access memories (RRAMs) are calculated for four typical oxides, HfO{sub 2}, TiO{sub 2}, Ta{sub 2}O{sub 5}, and Al{sub 2}O{sub 3}, to define a materials selection process. O vacancies have the lowest defect formation energy in the O-poor limit and dominate the processes. A band diagram defines the operating Fermi energy and O chemical potential range. It is shown how the scavenger metal can be used to vary the O vacancy formation energy, via controlling the O chemical potential, and the mean Fermi energy. The high endurance of Ta{sub 2}O{sub 5} RRAM is related to its more stable amorphous phase and the adaptive lattice rearrangements of its O vacancy.
Optimizing Event Selection with the Random Grid Search
Energy Technology Data Exchange (ETDEWEB)
Bhat, Pushpalatha C. [Fermilab; Prosper, Harrison B. [Florida State U.; Sekmen, Sezen [Kyungpook Natl. U.; Stewart, Chip [Broad Inst., Cambridge
2017-06-29
The random grid search (RGS) is a simple, but efficient, stochastic algorithm to find optimal cuts that was developed in the context of the search for the top quark at Fermilab in the mid-1990s. The algorithm, and associated code, have been enhanced recently with the introduction of two new cut types, one of which has been successfully used in searches for supersymmetry at the Large Hadron Collider. The RGS optimization algorithm is described along with the recent developments, which are illustrated with two examples from particle physics. One explores the optimization of the selection of vector boson fusion events in the four-lepton decay mode of the Higgs boson and the other optimizes SUSY searches using boosted objects and the razor variables.
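The core RGS idea, taking candidate cut points from the observed events themselves rather than from a fixed grid, can be sketched as follows; the two-variable rectangular cut and the approximate significance score s/sqrt(s+b) are illustrative assumptions, not the paper's exact cut types or figure of merit.

```python
import math
import random

def random_grid_search(signal, background, n_points=200, seed=None):
    """Random grid search over rectangular cuts: each candidate cut
    (x >= x0, y >= y0) is defined by a randomly chosen signal event, and
    the cut maximizing an approximate significance is returned."""
    rng = random.Random(seed)
    best_score, best_cut = float("-inf"), None
    for _ in range(n_points):
        x0, y0 = rng.choice(signal)   # cut thresholds drawn from a signal event
        s = sum(1 for x, y in signal if x >= x0 and y >= y0)
        b = sum(1 for x, y in background if x >= x0 and y >= y0)
        score = s / math.sqrt(s + b)  # s >= 1: the chosen event passes its own cut
        if score > best_score:
            best_score, best_cut = score, (x0, y0)
    return best_cut
```

Sampling cut points from the signal distribution concentrates the search where signal efficiency is nonzero, which is what makes the stochastic search efficient.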
Selective decontamination in pediatric liver transplants. A randomized prospective study.
Smith, S D; Jackson, R J; Hannakan, C J; Wadowsky, R M; Tzakis, A G; Rowe, M I
1993-06-01
Although it has been suggested that selective decontamination of the digestive tract (SDD) decreases postoperative aerobic Gram-negative and fungal infections in orthotopic liver transplantation (OLT), no controlled trials exist in pediatric patients. This prospective, randomized controlled study of 36 pediatric OLT patients examines the effect of short-term SDD on postoperative infection and digestive tract flora. Patients were randomized into two groups. The control group received perioperative parenteral antibiotics only. The SDD group received in addition polymyxin E, tobramycin, and amphotericin B enterally and by oropharyngeal swab postoperatively until oral intake was tolerated (6 +/- 4 days). Indications for operation, preoperative status, age, and intensive care unit and hospital length of stay were no different in SDD (n = 18) and control (n = 18) groups. A total of 14 Gram-negative infections (intraabdominal abscess 7, septicemia 5, pneumonia 1, urinary tract 1) developed in the 36 patients studied. Mortality was not significantly different in the two groups. However, there were significantly fewer patients with Gram-negative infections in the SDD group: 3/18 patients (11%) vs. 11/18 patients (50%) in the control group, P < 0.001. There was also significant reduction in aerobic Gram-negative flora in the stool and pharynx in patients receiving SDD. Gram-positive and anaerobic organisms were unaffected. We conclude that short-term postoperative SDD significantly reduces Gram-negative infections in pediatric OLT patients.
Pediatric selective mutism therapy: a randomized controlled trial.
Esposito, Maria; Gimigliano, Francesca; Barillari, Maria R; Precenzano, Francesco; Ruberto, Maria; Sepe, Joseph; Barillari, Umberto; Gimigliano, Raffaele; Militerni, Roberto; Messina, Giovanni; Carotenuto, Marco
2017-10-01
Selective mutism (SM) is a rare disease in children coded by DSM-5 as an anxiety disorder. Despite the disabling nature of the disease, there is still no specific treatment. The aims of this study were to verify the efficacy of a six-month standard psychomotor treatment and the positive changes in lifestyle in a population of children affected by SM. Randomized controlled trial registered in the European Clinical Trials Registry (EuDract 2015-001161-36). University third-level Centre (Child and Adolescent Neuropsychiatry Clinic). The study population was composed of 67 children in group A (psychomotricity treatment) (35 M, mean age 7.84±1.15) and 71 children in group B (behavioral and educational counseling) (37 M, mean age 7.75±1.36). Psychomotor treatment was administered by trained child therapists in residential settings three times per week. Each child was treated for the whole period by the same therapist, and all the therapists shared the same protocol. The standard psychomotor session length is 45 minutes. At T0 and after 6 months (T1) of treatment, patients underwent a behavioral and SM severity assessment. To verify the effects of the psychomotor management, the Child Behavior Checklist questionnaire (CBCL) and Selective Mutism Questionnaire (SMQ) were administered to the parents. After 6 months of psychomotor treatment, SM children showed a significant reduction in CBCL scores such as social relations, anxious/depressed, social problems, and total problems, suggesting a benefit in selective mutism, even if further studies are needed. The present study identifies psychomotricity as a safe and effective therapy for pediatric selective mutism.
Blind Measurement Selection: A Random Matrix Theory Approach
Elkhalil, Khalil; Kammoun, Abla; Al-Naffouri, Tareq Y.; Alouini, Mohamed-Slim
2016-01-01
We present two potential applications where the proposed algorithms can be used, namely antenna selection for uplink transmissions in large scale multi-user systems and sensor selection for wireless sensor networks. Numerical results are also
Strategyproof Peer Selection using Randomization, Partitioning, and Apportionment
Aziz, Haris; Lev, Omer; Mattei, Nicholas; Rosenschein, Jeffrey S.; Walsh, Toby
2016-01-01
Peer review, evaluation, and selection is a fundamental aspect of modern science. Funding bodies the world over employ experts to review and select the best proposals of those submitted for funding. The problem of peer selection, however, is much more general: a professional society may want to give a subset of its members awards based on the opinions of all members; an instructor for a MOOC or online course may want to crowdsource grading; or a marketing company may select ideas from group b...
Variable Selection in Time Series Forecasting Using Random Forests
Directory of Open Access Journals (Sweden)
Hristos Tyralis
2017-10-01
Full Text Available Time series forecasting using machine learning algorithms has gained popularity recently. Random forest is a machine learning algorithm that has been applied to time series forecasting; however, most of its forecasting properties have remained unexplored. Here we focus on assessing the performance of random forests in one-step forecasting using two large datasets of short time series, with the aim to suggest an optimal set of predictor variables. Furthermore, we compare its performance to benchmarking methods. The first dataset is composed of 16,000 simulated time series from a variety of Autoregressive Fractionally Integrated Moving Average (ARFIMA) models. The second dataset consists of 135 mean annual temperature time series. The highest predictive performance of random forests is observed when using a low number of recent lagged predictor variables. This outcome could be useful in relevant future applications, with the prospect to achieve higher predictive accuracy.
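The "lagged predictor variables" the study tunes are just a sliding-window design matrix over the series; a minimal sketch of that construction (the function name is illustrative) is:

```python
def lagged_design(series, n_lags):
    """One-step-ahead design matrix: each row holds the n_lags most recent
    values of the series, and the target is the next value. The number of
    lags is the hyperparameter whose optimal (low) value the study reports."""
    rows, targets = [], []
    for i in range(n_lags, len(series)):
        rows.append(list(series[i - n_lags:i]))  # predictors: recent lags
        targets.append(series[i])                # target: the next value
    return rows, targets
```

The resulting rows and targets can be fed to any regressor, a random forest included.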
Random-walk simulation of selected aspects of dissipative collisions
International Nuclear Information System (INIS)
Toeke, J.; Gobbi, A.; Matulewicz, T.
1984-11-01
Internuclear thermal equilibrium effects and shell structure effects in dissipative collisions are studied numerically within the framework of the model of stochastic exchanges by applying the random-walk technique. It is found that the drift through the mass flux induced by the temperature difference can be effectively blocked, while leaving the variances of the mass distributions unaltered, provided an internuclear potential barrier is present. The presence of shell structure is found to lead to characteristic correlations between consecutive exchanges. Experimental evidence for the predicted effects is discussed. (orig.)
Application of random effects to the study of resource selection by animals.
Gillies, Cameron S; Hebblewhite, Mark; Nielsen, Scott E; Krawchuk, Meg A; Aldridge, Cameron L; Frair, Jacqueline L; Saher, D Joanne; Stevens, Cameron E; Jerde, Christopher L
2006-07-01
1. Resource selection estimated by logistic regression is used increasingly in studies to identify critical resources for animal populations and to predict species occurrence. 2. Most frequently, individual animals are monitored and pooled to estimate population-level effects without regard to group or individual-level variation. Pooling assumes that both observations and their errors are independent, and resource selection is constant given individual variation in resource availability. 3. Although researchers have identified ways to minimize autocorrelation, variation between individuals caused by differences in selection or available resources, including functional responses in resource selection, have not been well addressed. 4. Here we review random-effects models and their application to resource selection modelling to overcome these common limitations. We present a simple case study of an analysis of resource selection by grizzly bears in the foothills of the Canadian Rocky Mountains with and without random effects. 5. Both categorical and continuous variables in the grizzly bear model differed in interpretation, both in statistical significance and coefficient sign, depending on how a random effect was included. We used a simulation approach to clarify the application of random effects under three common situations for telemetry studies: (a) discrepancies in sample sizes among individuals; (b) differences among individuals in selection where availability is constant; and (c) differences in availability with and without a functional response in resource selection. 6. We found that random intercepts accounted for unbalanced sample designs, and models with random intercepts and coefficients improved model fit given the variation in selection among individuals and functional responses in selection. Our empirical example and simulations demonstrate how including random effects in resource selection models can aid interpretation and address difficult assumptions
Interference-aware random beam selection schemes for spectrum sharing systems
Abdallah, Mohamed; Qaraqe, Khalid; Alouini, Mohamed-Slim
2012-01-01
users. In this work, we develop interference-aware random beam selection schemes that provide enhanced performance for the secondary network under the condition that the interference observed by the receivers of the primary network is below a
Kleinman, Alan
2016-12-20
The random mutation and natural selection phenomenon acts in a mathematically predictable way, which, when understood, leads to approaches to reduce and prevent the failure of the use of these selection pressures when treating infections and cancers. The underlying principle to impair the random mutation and natural selection phenomenon is to use combination therapy, which forces the population to evolve to multiple selection pressures simultaneously, invoking the multiplication rule of probabilities. Recently, it has been seen that combination therapy for the treatment of malaria has failed to prevent the emergence of drug-resistant variants. Using this empirical example and the principles of probability theory, the derivation of the equations describing this treatment failure is carried out. These equations give guidance as to how to use combination therapy for the treatment of cancers and infectious diseases and prevent the emergence of drug resistance. Copyright © 2016 John Wiley & Sons, Ltd.
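The multiplication rule the abstract invokes can be illustrated with a small sketch. The mutation rates and population size below are hypothetical, for illustration only:

```python
import math

# Hypothetical per-replication probabilities that a lineage acquires the
# resistance mutation for each drug in a combination (illustrative values).
per_drug_rates = [1e-9, 1e-9]
population_size = 1e12  # hypothetical pathogen population within a host

# Multiplication rule: simultaneous resistance to all drugs requires the
# joint event, so the individual probabilities multiply.
p_joint = math.prod(per_drug_rates)

# Expected number of simultaneously resistant variants arising.
expected_variants = p_joint * population_size

# With two drugs the joint probability is 1e-18, so even a very large
# population is unlikely to produce a doubly resistant variant in one step.
```

Sequential monotherapy, by contrast, lets the population meet each selection pressure one at a time, so the probabilities never multiply; that is the failure mode the equations in the paper formalize.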
Acceptance sampling using judgmental and randomly selected samples
Energy Technology Data Exchange (ETDEWEB)
Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl
2010-09-01
We present a Bayesian model for acceptance sampling where the population consists of two groups, each with different levels of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high- and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail and discuss the conditions necessary to ensure that required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems and, in particular, to environmental sampling where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.
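The paper's two-group model is more elaborate, but the flavor of the inference can be sketched in the simplest single-group special case (our simplification, not the paper's model): with a uniform prior and n randomly sampled items all found acceptable, the posterior for the acceptance probability theta is Beta(n+1, 1), which gives a closed form for the probability that theta exceeds a target fraction.

```python
def prob_theta_exceeds(n_all_acceptable, q):
    """P(theta > q) under a Beta(n+1, 1) posterior, i.e. a uniform
    Beta(1, 1) prior updated with n acceptable and 0 unacceptable items."""
    return 1.0 - q ** (n_all_acceptable + 1)

# How many all-acceptable samples until we are 95% sure theta > 0.99?
n = 0
while prob_theta_exceeds(n, 0.99) < 0.95:
    n += 1
# under this simplified model, n ends at 298
```

The full model replaces the single Beta posterior with a mixture over the judgmentally and randomly sampled groups, which is what allows the required sample size to depend on the population size.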
40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.
2010-07-01
40 CFR Protection of Environment, § 761.79(b)(3), § 761.308 Sample selection by random number generation on any two-dimensional square grid. ... area created in accordance with paragraph (a) of this section, select two random numbers: one each for ...
Non-random mating for selection with restricted rates of inbreeding and overlapping generations
Sonesson, A.K.; Meuwissen, T.H.E.
2002-01-01
Minimum coancestry mating with a maximum of one offspring per mating pair (MC1) is compared with random mating schemes for populations with overlapping generations. Optimum contribution selection is used, whereby $\Delta F$ is restricted. For schemes with $\Delta F$ restricted to 0.25% per
Applications of random forest feature selection for fine-scale genetic population assignment.
Sylvester, Emma V A; Bentzen, Paul; Bradbury, Ian R; Clément, Marie; Pearce, Jon; Horne, John; Beiko, Robert G
2018-02-01
Genetic population assignment used to inform wildlife management and conservation efforts requires panels of highly informative genetic markers and sensitive assignment tests. We explored the utility of machine-learning algorithms (random forest, regularized random forest and guided regularized random forest) compared with F_ST ranking for selection of single nucleotide polymorphisms (SNPs) for fine-scale population assignment. We applied these methods to an unpublished SNP data set for Atlantic salmon (Salmo salar) and a published SNP data set for Alaskan Chinook salmon (Oncorhynchus tshawytscha). In each species, we identified the minimum panel size required to obtain a self-assignment accuracy of at least 90%, using each method to create panels of 50-700 markers. Panels of SNPs identified using random forest-based methods performed up to 7.8 and 11.2 percentage points better than F_ST-selected panels of similar size for the Atlantic salmon and Chinook salmon data, respectively. Self-assignment accuracy ≥90% was obtained with panels of 670 and 384 SNPs for each data set, respectively, a level of accuracy never reached for these species using F_ST-selected panels. Our results demonstrate a role for machine-learning approaches in marker selection across large genomic data sets to improve assignment for management and conservation of exploited populations.
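The F_ST ranking baseline the authors compare against can be sketched for biallelic SNPs; the allele frequencies below are made up for illustration:

```python
def fst(p1, p2):
    """Wright's Fst for one biallelic SNP from allele frequencies in two
    equally sized populations: (Ht - Hs) / Ht."""
    pbar = (p1 + p2) / 2.0
    ht = 2.0 * pbar * (1.0 - pbar)  # expected total heterozygosity
    hs = (2.0 * p1 * (1.0 - p1) + 2.0 * p2 * (1.0 - p2)) / 2.0  # within-pop mean
    return 0.0 if ht == 0.0 else (ht - hs) / ht

# Hypothetical SNPs: (allele frequency in population 1, in population 2)
snps = {"snp_a": (0.9, 0.1), "snp_b": (0.55, 0.45), "snp_c": (0.5, 0.5)}
ranked = sorted(snps, key=lambda s: fst(*snps[s]), reverse=True)
# the highly differentiated snp_a ranks first; the undifferentiated snp_c last
```

A marker panel is then built by taking the top-k SNPs from `ranked`; the paper's point is that random forest importance scores, which capture interactions among loci, can beat this one-locus-at-a-time ranking.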
40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.
2010-07-01
40 CFR Protection of Environment, ... (b)(3), § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as ...
Hebbian Learning in a Random Network Captures Selectivity Properties of the Prefrontal Cortex
Lindsay, Grace W.
2017-01-01
Complex cognitive behaviors, such as context-switching and rule-following, are thought to be supported by the prefrontal cortex (PFC). Neural activity in the PFC must thus be specialized to specific tasks while retaining flexibility. Nonlinear “mixed” selectivity is an important neurophysiological trait for enabling complex and context-dependent behaviors. Here we investigate (1) the extent to which the PFC exhibits computationally relevant properties, such as mixed selectivity, and (2) how such properties could arise via circuit mechanisms. We show that PFC cells recorded from male and female rhesus macaques during a complex task show a moderate level of specialization and structure that is not replicated by a model wherein cells receive random feedforward inputs. While random connectivity can be effective at generating mixed selectivity, the data show significantly more mixed selectivity than predicted by a model with otherwise matched parameters. A simple Hebbian learning rule applied to the random connectivity, however, increases mixed selectivity and enables the model to match the data more accurately. To explain how learning achieves this, we provide analysis along with a clear geometric interpretation of the impact of learning on selectivity. After learning, the model also matches the data on measures of noise, response density, clustering, and the distribution of selectivities. Of two styles of Hebbian learning tested, the simpler and more biologically plausible option better matches the data. These modeling results provide clues about how neural properties important for cognition can arise in a circuit and make clear experimental predictions regarding how various measures of selectivity would evolve during animal training. SIGNIFICANCE STATEMENT The prefrontal cortex is a brain region believed to support the ability of animals to engage in complex behavior. How neurons in this area respond to stimuli—and in particular, to combinations of stimuli (
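The effect of a Hebbian rule on random feedforward weights can be illustrated with a toy sketch (not the paper's model): a single output unit with binary inputs, one of which always co-occurs with post-synaptic activity.

```python
import random

rng = random.Random(0)
n_in = 20
w = [rng.gauss(0.0, 0.1) for _ in range(n_in)]  # random feedforward weights
lr = 0.01

for _ in range(500):
    x = [float(rng.random() < 0.5) for _ in range(n_in)]
    x[0] = 1.0                         # input 0 is active on every trial
    y = sum(wi * xi for wi, xi in zip(w, x))
    if y > 0.0:                        # simple post-synaptic threshold
        for i in range(n_in):
            w[i] += lr * y * x[i]      # Hebbian: delta-w proportional to pre x post

# the always-co-active input accumulates the most reinforcement, so the
# unit becomes selective for it on top of its random connectivity
```

This captures the qualitative claim: purely random connectivity yields some selectivity, but correlation-driven Hebbian updates sharpen it toward task-relevant inputs.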
Directory of Open Access Journals (Sweden)
Tan Nhat Nguyen
2016-01-01
Full Text Available In this paper, we evaluate the performance of various user selection protocols under the impact of hardware impairments. In the considered protocols, a Base Station (BS) selects one of the available Users (USs) to serve, while the remaining USs harvest energy from the Radio Frequency (RF) signals transmitted by the BS. We assume that all USs appear randomly around the BS. In the Random Selection Protocol (RAN), the BS randomly selects a US to transmit the data. In the second proposed protocol, named the Minimum Distance Protocol (MIND), the US that is nearest to the BS is chosen. In the Optimal Selection Protocol (OPT), the US providing the highest channel gain between itself and the BS is served. For performance evaluation, we derive exact and asymptotic closed-form expressions of the average Outage Probability (OP) over Rayleigh fading channels. We also consider the average harvested energy per US. Finally, Monte-Carlo simulations are performed to verify the theoretical results.
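The gap between random (RAN) and optimal (OPT) user selection can be reproduced with a quick Monte-Carlo sketch over Rayleigh fading. Unlike the paper, ideal hardware is assumed here, and the user count, SNR threshold, and trial count are illustrative choices:

```python
import math
import random

def outage_probability(select, n_users=4, snr_threshold=1.0,
                       trials=20000, seed=1):
    """Fraction of trials in which the selected user's channel power gain
    falls below the threshold. Under unit-mean Rayleigh fading the power
    gain is exponentially distributed."""
    rng = random.Random(seed)
    outages = 0
    for _ in range(trials):
        gains = [rng.expovariate(1.0) for _ in range(n_users)]
        if select(gains, rng) < snr_threshold:
            outages += 1
    return outages / trials

p_ran = outage_probability(lambda g, rng: rng.choice(g))  # random selection
p_opt = outage_probability(lambda g, rng: max(g))         # best channel gain
```

Analytically, RAN gives 1 - exp(-1) ≈ 0.632 at this threshold, while OPT gives (1 - exp(-1))^4 ≈ 0.160, matching the simulation and showing the diversity gain of channel-aware selection.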
Simulated Performance Evaluation of a Selective Tracker Through Random Scenario Generation
DEFF Research Database (Denmark)
Hussain, Dil Muhammad Akbar
2006-01-01
The paper presents a simulation study on the performance of a target tracker using a selective track splitting filter algorithm through a random scenario implemented on a digital signal processor. In a typical track splitting filter, all the observations which fall inside a likelihood ellipse are used for update; however, in our proposed selective track splitting filter fewer observations are used for track update. Much of the previous performance work [1] has been done on specific (deterministic) scenarios. One of the reasons for considering the specific scenarios, which were ... performance assessment. Therefore, a random target motion scenario is adopted. Its implementation, in particular for testing the proposed selective track splitting algorithm using Kalman filters, is investigated through a number of performance parameters which give the activity profile of the tracking scenario.
TEHRAN AIR POLLUTANTS PREDICTION BASED ON RANDOM FOREST FEATURE SELECTION METHOD
Directory of Open Access Journals (Sweden)
A. Shamsoddini
2017-09-01
Full Text Available Air pollution, as one of the most serious forms of environmental pollution, poses a huge threat to human life. Air pollution leads to environmental instability and has harmful and undesirable effects on the environment. Modern methods for predicting pollutant concentrations are able to improve decision making and provide appropriate solutions. This study examines the performance of Random Forest feature selection in combination with multiple linear regression and Multilayer Perceptron Artificial Neural Network methods, in order to achieve an efficient model to estimate carbon monoxide, nitrogen dioxide, sulfur dioxide and PM2.5 contents in the air. The results indicated that Artificial Neural Networks fed by the attributes selected by the Random Forest feature selection method performed more accurately than the other models for all pollutants. The estimation accuracy for sulfur dioxide was lower than for the other air contaminants, whereas nitrogen dioxide was predicted more accurately than the other pollutants.
Tehran Air Pollutants Prediction Based on Random Forest Feature Selection Method
Shamsoddini, A.; Aboodi, M. R.; Karami, J.
2017-09-01
Air pollution, as one of the most serious forms of environmental pollution, poses a huge threat to human life. Air pollution leads to environmental instability and has harmful and undesirable effects on the environment. Modern methods for predicting pollutant concentrations are able to improve decision making and provide appropriate solutions. This study examines the performance of Random Forest feature selection in combination with multiple linear regression and Multilayer Perceptron Artificial Neural Network methods, in order to achieve an efficient model to estimate carbon monoxide, nitrogen dioxide, sulfur dioxide and PM2.5 contents in the air. The results indicated that Artificial Neural Networks fed by the attributes selected by the Random Forest feature selection method performed more accurately than the other models for all pollutants. The estimation accuracy for sulfur dioxide was lower than for the other air contaminants, whereas nitrogen dioxide was predicted more accurately than the other pollutants.
Continuous-Time Mean-Variance Portfolio Selection with Random Horizon
International Nuclear Information System (INIS)
Yu, Zhiyong
2013-01-01
This paper examines the continuous-time mean-variance optimal portfolio selection problem with random market parameters and random time horizon. Treating this problem as a linearly constrained stochastic linear-quadratic optimal control problem, I explicitly derive the efficient portfolios and efficient frontier in closed forms based on the solutions of two backward stochastic differential equations. Some related issues such as a minimum variance portfolio and a mutual fund theorem are also addressed. All the results are markedly different from those in the problem with deterministic exit time. A key part of my analysis involves proving the global solvability of a stochastic Riccati equation, which is interesting in its own right.
Continuous-Time Mean-Variance Portfolio Selection with Random Horizon
Energy Technology Data Exchange (ETDEWEB)
Yu, Zhiyong, E-mail: yuzhiyong@sdu.edu.cn [Shandong University, School of Mathematics (China)
2013-12-15
This paper examines the continuous-time mean-variance optimal portfolio selection problem with random market parameters and random time horizon. Treating this problem as a linearly constrained stochastic linear-quadratic optimal control problem, I explicitly derive the efficient portfolios and efficient frontier in closed forms based on the solutions of two backward stochastic differential equations. Some related issues such as a minimum variance portfolio and a mutual fund theorem are also addressed. All the results are markedly different from those in the problem with deterministic exit time. A key part of my analysis involves proving the global solvability of a stochastic Riccati equation, which is interesting in its own right.
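The problem the abstract describes can be sketched in notation standard for this literature (our paraphrase, not necessarily the author's exact formulation): over a random horizon $\tau$, choose an admissible portfolio $u(\cdot)$ to

```latex
\min_{u(\cdot)} \ \operatorname{Var}\!\left(x_{\tau}\right)
\quad \text{subject to} \quad
\mathbb{E}\!\left[x_{\tau}\right] = z,
\qquad
\mathrm{d}x_t = \bigl(r_t x_t + b_t^{\top} u_t\bigr)\,\mathrm{d}t
              + u_t^{\top} \sigma_t \,\mathrm{d}W_t ,
```

where $x_t$ is wealth, $r_t$, $b_t$, $\sigma_t$ are the (random) market parameters, and the efficient frontier is traced out by varying the target mean $z$. Handling the mean constraint with a Lagrange multiplier turns this into the linearly constrained stochastic linear-quadratic control problem that the author solves via backward stochastic differential equations.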
Emergence of multilevel selection in the prisoner's dilemma game on coevolving random networks
International Nuclear Information System (INIS)
Szolnoki, Attila; Perc, Matjaz
2009-01-01
We study the evolution of cooperation in the prisoner's dilemma game, whereby a coevolutionary rule is introduced that molds the random topology of the interaction network in two ways. First, existing links are deleted whenever a player adopts a new strategy or its degree exceeds a threshold value; second, new links are added randomly after a given number of game iterations. These coevolutionary processes correspond to the generic formation of new links and deletion of existing links that, especially in human societies, appear frequently as a consequence of ongoing socialization, change of lifestyle or death. Due to the counteraction of deletions and additions of links, the initial heterogeneity of the interaction network is qualitatively preserved and thus cannot be held responsible for the observed promotion of cooperation. Indeed, the coevolutionary rule evokes the spontaneous emergence of a powerful multilevel selection mechanism which, despite the sustained random topology of the evolving network, maintains cooperation across the whole span of defection temptation values.
Topology-selective jamming of fully-connected, code-division random-access networks
Polydoros, Andreas; Cheng, Unjeng
1990-01-01
The purpose is to introduce certain models of topology selective stochastic jamming and examine its impact on a class of fully-connected, spread-spectrum, slotted ALOHA-type random access networks. The theory covers dedicated as well as half-duplex units. The dominant role of the spatial duty factor is established, and connections with the dual concept of time selective jamming are discussed. The optimal choices of coding rate and link access parameters (from the users' side) and the jamming spatial fraction are numerically established for DS and FH spreading.
Directory of Open Access Journals (Sweden)
R Alexander Bentley
Full Text Available The evolution of vocabulary in academic publishing is characterized via keyword frequencies recorded in the ISI Web of Science citations database. In four distinct case studies, evolutionary analysis of keyword frequency change through time is compared to a model of random copying used as the null hypothesis, such that selection may be identified against it. The case studies from the physical sciences indicate greater selection in keyword choice than in the social sciences. Similar evolutionary analyses can be applied to a wide range of phenomena, wherever the popularity of multiple items through time has been recorded, as with web searches or sales of popular music and books, for example.
Bentley, R Alexander
2008-08-27
The evolution of vocabulary in academic publishing is characterized via keyword frequencies recorded in the ISI Web of Science citations database. In four distinct case studies, evolutionary analysis of keyword frequency change through time is compared to a model of random copying used as the null hypothesis, such that selection may be identified against it. The case studies from the physical sciences indicate greater selection in keyword choice than in the social sciences. Similar evolutionary analyses can be applied to a wide range of phenomena, wherever the popularity of multiple items through time has been recorded, as with web searches or sales of popular music and books, for example.
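The random-copying null model used here is essentially neutral Wright-Fisher copying with innovation; a minimal sketch (population size, innovation rate, and generation count below are made-up illustrative values):

```python
import random
from collections import Counter

def random_copy_step(population, mu, rng, new_label):
    """Each 'author' picks a keyword by copying a random member of the
    previous generation, or with probability mu invents a new keyword."""
    return [new_label() if rng.random() < mu else rng.choice(population)
            for _ in population]

rng = random.Random(42)
labels = iter(range(10**6))
pop = ["kw0"] * 50 + ["kw1"] * 50  # 100 'authors', two initial keywords
for _ in range(200):
    pop = random_copy_step(pop, 0.01, rng, lambda: "kw_new%d" % next(labels))

freqs = Counter(pop)
# under pure copying, keyword frequencies change only by undirected drift;
# systematic departures from that drift expectation signal selection
```

Comparing observed keyword-frequency trajectories against an ensemble of such neutral runs is what lets selection be "identified against" the null.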
Wolde, Mistire; Tarekegn, Getahun; Kebede, Tedla
2018-05-01
Point-of-care glucometer (PoCG) devices play a significant role in self-monitoring of the blood sugar level, particularly in the follow-up of high blood sugar therapeutic response. The aim of this study was to evaluate blood glucose test results performed with four randomly selected glucometers on diabetes and control subjects versus the standard wet chemistry (hexokinase) method in Addis Ababa, Ethiopia. A prospective cross-sectional study was conducted on 200 randomly selected study participants (100 participants with diabetes and 100 healthy controls). Four randomly selected PoCG devices (CareSens N, DIAVUE Prudential, On Call Extra, i-QARE DS-W) were evaluated against the hexokinase method and the ISO 15197:2003 and ISO 15197:2013 standards. The minimum and maximum blood sugar values were recorded by CareSens N (21 mg/dl) and the hexokinase method (498.8 mg/dl), respectively. The mean sugar values of all PoCG devices except On Call Extra showed significant differences compared with the reference hexokinase method. Meanwhile, all four PoCG devices had a strong positive correlation (>80%) with the reference method (hexokinase). On the other hand, none of the four PoCG devices fulfilled the minimum accuracy requirements set by the ISO 15197:2003 and ISO 15197:2013 standards. In addition, linear regression analysis revealed that all four selected PoCG devices overestimated the glucose concentrations. Overall, measurements from the four selected PoCG devices correlated poorly with the standard reference method. Therefore, before introducing PoCG devices to the market, there should be a standardized evaluation platform for validation. Further similar large-scale studies on other PoCG devices also need to be undertaken.
Selection bias and subject refusal in a cluster-randomized controlled trial
Directory of Open Access Journals (Sweden)
Rochelle Yang
2017-07-01
Full Text Available Abstract Background Selection bias and non-participation bias are major methodological concerns which impact external validity. Cluster-randomized controlled trials are especially prone to selection bias as it is impractical to blind clusters to their allocation into intervention or control. This study assessed the impact of selection bias in a large cluster-randomized controlled trial. Methods The Improved Cardiovascular Risk Reduction to Enhance Rural Primary Care (ICARE) study examined the impact of a remote pharmacist-led intervention in twelve medical offices. To assess eligibility, a standardized form containing patient demographics and medical information was completed for each screened patient. Eligible patients were approached by the study coordinator for recruitment. Both the study coordinator and the patient were aware of the site’s allocation prior to consent. Patients who consented or declined to participate were compared across control and intervention arms for differing characteristics. Statistical significance was determined using a two-tailed, equal variance t-test and a chi-square test with adjusted Bonferroni p-values. Results were adjusted for random cluster variation. Results There were 2749 completed screening forms returned to research staff, with 461 subjects who had either consented or declined participation. Patients with poorly controlled diabetes were found to be significantly more likely to decline participation in intervention sites compared to those in control sites. A higher mean diastolic blood pressure was seen in patients with uncontrolled hypertension who declined in the control sites compared to those who declined in the intervention sites. However, these findings were no longer significant after adjustment for random variation among the sites. After this adjustment, females were now found to be significantly more likely to consent than males (odds ratio = 1.41; 95% confidence interval = 1.03, 1
Fuzzy Random λ-Mean SAD Portfolio Selection Problem: An Ant Colony Optimization Approach
Thakur, Gour Sundar Mitra; Bhattacharyya, Rupak; Mitra, Swapan Kumar
2010-10-01
To reach an investment goal, one has to select a combination of securities from portfolios containing a large number of securities. Past records of each security alone do not guarantee future returns. As there are many uncertain factors which directly or indirectly influence the stock market, and there are also some newer stock markets which do not have enough historical data, experts' expectations and experience must be combined with past records to generate an effective portfolio selection model. In this paper the return of a security is assumed to be a Fuzzy Random Variable Set (FRVS), where returns are sets of random numbers which are in turn fuzzy numbers. A new λ-Mean Semi Absolute Deviation (λ-MSAD) portfolio selection model is developed. The subjective opinions of the investors on the rate of return of each security are taken into consideration by introducing a pessimistic-optimistic parameter vector λ. The λ-MSAD model is preferred as it uses the absolute deviation of the rate of return of a portfolio instead of the variance as the measure of risk. As this model can be reduced to a Linear Programming Problem (LPP), it can be solved much faster than quadratic programming problems. Ant Colony Optimization (ACO) is used for solving the portfolio selection problem. ACO is a paradigm for designing meta-heuristic algorithms for combinatorial optimization problems. Data from the BSE is used for illustration.
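The λ-weighted mean versus semi-absolute-deviation trade-off at the heart of the model can be sketched without any solver by scoring candidate portfolios over return scenarios. The asset returns and the grid search below are toy stand-ins; the paper solves the full fuzzy-random model with ACO:

```python
def mean_semi_abs_dev(weights, scenarios):
    """Portfolio mean return and semi-absolute deviation (mean shortfall
    below the mean) over historical or sampled return scenarios."""
    port = [sum(w * r for w, r in zip(weights, row)) for row in scenarios]
    m = sum(port) / len(port)
    semi = sum(max(0.0, m - x) for x in port) / len(port)
    return m, semi

def score(weights, scenarios, lam):
    m, semi = mean_semi_abs_dev(weights, scenarios)
    return lam * m - (1.0 - lam) * semi  # pessimistic-optimistic trade-off

# two hypothetical assets, four return scenarios
scenarios = [[0.10, 0.02], [0.05, 0.03], [-0.04, 0.02], [0.13, 0.01]]
grid = [(k / 100.0, 1.0 - k / 100.0) for k in range(101)]
best = max(grid, key=lambda w: score(w, scenarios, lam=0.5))
```

Because both the mean and the semi-absolute deviation are piecewise linear in the weights, the real model admits an LP formulation, which is why it solves faster than variance-based quadratic programs.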
Raghav, Kanwal Pratap Singh; Mahajan, Sminil; Yao, James C.; Hobbs, Brian P.; Berry, Donald A.; Pentz, Rebecca D.; Tam, Alda; Hong, Waun K.; Ellis, Lee M.; Abbruzzese, James; Overman, Michael J.
2015-01-01
Purpose The decision by journals to append protocols to published reports of randomized trials was a landmark event in clinical trial reporting. However, limited information is available on how this initiative affected transparency and selective reporting of clinical trial data. Methods We analyzed 74 oncology-based randomized trials published in Journal of Clinical Oncology, the New England Journal of Medicine, and The Lancet in 2012. To ascertain integrity of reporting, we compared published reports with their respective appended protocols with regard to primary end points, nonprimary end points, unplanned end points, and unplanned analyses. Results A total of 86 primary end points were reported in 74 randomized trials; nine trials had greater than one primary end point. Nine trials (12.2%) had some discrepancy between their planned and published primary end points. A total of 579 nonprimary end points (median, seven per trial) were planned, of which 373 (64.4%; median, five per trial) were reported. A significant positive correlation was found between the number of planned and nonreported nonprimary end points (Spearman r = 0.66; P < .001). Twenty-eight studies (37.8%) reported a total of 65 unplanned end points; 52 (80.0%) of which were not identified as unplanned. Thirty-one (41.9%) and 19 (25.7%) of 74 trials reported a total of 52 unplanned analyses involving primary end points and 33 unplanned analyses involving nonprimary end points, respectively. Studies reported positive unplanned end points and unplanned analyses more frequently than negative outcomes in abstracts (unplanned end points odds ratio, 6.8; P = .002; unplanned analyses odds ratio, 8.4; P = .007). Conclusion Despite public and reviewer access to protocols, selective outcome reporting persists and is a major concern in the reporting of randomized clinical trials. To foster credible evidence-based medicine, additional initiatives are needed to minimize selective reporting. PMID:26304898
Optimization of the Dutch Matrix Test by Random Selection of Sentences From a Preselected Subset
Directory of Open Access Journals (Sweden)
Rolph Houben
2015-04-01
Full Text Available Matrix tests are available for speech recognition testing in many languages. For an accurate measurement, a steep psychometric function of the speech materials is required. For existing tests, it would be beneficial if it were possible to further optimize the available materials by increasing the function’s steepness. The objective is to show whether the steepness of the psychometric function of an existing matrix test can be increased by selecting a homogeneous subset of recordings with the steepest sentence-based psychometric functions. We took data from a previous multicenter evaluation of the Dutch matrix test (45 normal-hearing listeners). Based on half of the data set, we first selected the sentences (140 out of 311) with a similar speech reception threshold and with the steepest psychometric functions (≥9.7%/dB). Subsequently, the steepness of the psychometric function for this selection was calculated from the remaining (unused) second half of the data set. The calculation showed that the slope increased from 10.2%/dB to 13.7%/dB. The resulting subset did not allow the construction of enough balanced test lists. Therefore, the measurement procedure was changed to randomly select the sentences during testing. Random selection may interfere with a representative occurrence of phonemes. However, in our material, the median phonemic occurrence remained close to that of the original test. This finding indicates that phonemic occurrence is not a critical factor. The work highlights the possibility that existing speech tests might be improved by selecting sentences with a steep psychometric function.
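The slopes quoted (10.2%/dB rising to 13.7%/dB) are the steepness of the psychometric function at the speech reception threshold (SRT). With a logistic parameterization (a common choice, not necessarily the exact one used in this study), the slope pins down the width parameter:

```python
import math

def psychometric(level_db, srt_db, slope_at_srt):
    """Logistic psychometric function. slope_at_srt is in proportion
    correct per dB at the midpoint, e.g. 0.137 for 13.7 %/dB."""
    width = 1.0 / (4.0 * slope_at_srt)  # logistic slope at midpoint = 1/(4*width)
    return 1.0 / (1.0 + math.exp(-(level_db - srt_db) / width))

p_mid = psychometric(0.0, 0.0, 0.137)  # 0.5 at the SRT by construction
h = 1e-4
num_slope = (psychometric(h, 0.0, 0.137)
             - psychometric(-h, 0.0, 0.137)) / (2 * h)
```

A steeper function means the measured proportion correct changes faster per dB around the SRT, so an adaptive procedure converges on the SRT with fewer sentences, which is why selecting steep-slope sentences improves measurement efficiency.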
Using Random Forests to Select Optimal Input Variables for Short-Term Wind Speed Forecasting Models
Directory of Open Access Journals (Sweden)
Hui Wang
2017-10-01
Full Text Available Achieving relatively high-accuracy short-term wind speed forecasts is a precondition for the construction and grid-connected operation of wind power forecasting systems for wind farms. Currently, most research is focused on the structure of forecasting models and does not consider the selection of input variables, which can have significant impacts on forecasting performance. This paper presents an input variable selection method for wind speed forecasting models. The candidate input variables for various leading periods are selected and random forests (RF) are employed to evaluate the importance of all variables as features. The feature subset with the best evaluation performance is selected as the optimal feature set. Then, a kernel-based extreme learning machine is constructed to evaluate the performance of input variable selection based on RF. The results of the case study show that by removing uncorrelated and redundant features, RF effectively extracts the most strongly correlated set of features from the candidate input variables. By finding the optimal feature combination to represent the original information, RF simplifies the structure of the wind speed forecasting model, shortens the required training time, and substantially improves the model’s accuracy and generalization ability, demonstrating that the input variables selected by RF are effective.
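The importance-based screening idea can be sketched with permutation importance and a deliberately tiny stand-in classifier; a full random forest, as used in the paper, would plug into `permutation_importance` the same way. The data below are synthetic:

```python
import random

def accuracy(predict, X, y):
    return sum(predict(row) == t for row, t in zip(X, y)) / len(y)

def permutation_importance(predict, X, y, rng):
    """Importance of feature j = accuracy drop when column j is shuffled,
    breaking that feature's relationship with the target."""
    base = accuracy(predict, X, y)
    importances = []
    for j in range(len(X[0])):
        col = [row[j] for row in X]
        rng.shuffle(col)
        Xp = [row[:j] + [v] + row[j + 1:] for row, v in zip(X, col)]
        importances.append(base - accuracy(predict, Xp, y))
    return importances

rng = random.Random(7)
# synthetic data: feature 0 determines the label, feature 1 is pure noise
X = [[rng.gauss(1.0 if k % 2 else -1.0, 0.3), rng.gauss(0.0, 1.0)]
     for k in range(200)]
y = [k % 2 for k in range(200)]
predict = lambda row: int(row[0] > 0.0)  # toy threshold 'model'

imps = permutation_importance(predict, X, y, rng)
# the informative feature gets a large importance; the noise feature gets ~0
```

Ranking candidate inputs by such importance scores and keeping only the top-scoring subset is the screening step the paper performs before training the extreme learning machine.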
Raghav, Kanwal Pratap Singh; Mahajan, Sminil; Yao, James C; Hobbs, Brian P; Berry, Donald A; Pentz, Rebecca D; Tam, Alda; Hong, Waun K; Ellis, Lee M; Abbruzzese, James; Overman, Michael J
2015-11-01
The decision by journals to append protocols to published reports of randomized trials was a landmark event in clinical trial reporting. However, limited information is available on how this initiative affected transparency and selective reporting of clinical trial data. We analyzed 74 oncology-based randomized trials published in Journal of Clinical Oncology, the New England Journal of Medicine, and The Lancet in 2012. To ascertain integrity of reporting, we compared published reports with their respective appended protocols with regard to primary end points, nonprimary end points, unplanned end points, and unplanned analyses. A total of 86 primary end points were reported in 74 randomized trials; nine trials had greater than one primary end point. Nine trials (12.2%) had some discrepancy between their planned and published primary end points. A total of 579 nonprimary end points (median, seven per trial) were planned, of which 373 (64.4%; median, five per trial) were reported. A significant positive correlation was found between the number of planned and nonreported nonprimary end points (Spearman r = 0.66; P < .001). Twenty-eight studies (37.8%) reported a total of 65 unplanned end points, 52 (80.0%) of which were not identified as unplanned. Thirty-one (41.9%) and 19 (25.7%) of 74 trials reported a total of 52 unplanned analyses involving primary end points and 33 unplanned analyses involving nonprimary end points, respectively. Studies reported positive unplanned end points and unplanned analyses more frequently than negative outcomes in abstracts (unplanned end points odds ratio, 6.8; P = .002; unplanned analyses odds ratio, 8.4; P = .007). Despite public and reviewer access to protocols, selective outcome reporting persists and is a major concern in the reporting of randomized clinical trials. To foster credible evidence-based medicine, additional initiatives are needed to minimize selective reporting. © 2015 by American Society of Clinical Oncology.
On theoretical models of gene expression evolution with random genetic drift and natural selection.
Directory of Open Access Journals (Sweden)
Osamu Ogasawara
2009-11-01
Full Text Available The relative contributions of natural selection and random genetic drift are a major source of debate in the study of gene expression evolution, which is hypothesized to serve as a bridge from molecular to phenotypic evolution. It has been suggested that the conflict between views is caused by the lack of a definite model of the neutral hypothesis that can describe the long-run behavior of evolutionary change in mRNA abundance. Therefore, previous studies have used inadequate analogies with the neutral prediction of other phenomena, such as amino acid or nucleotide sequence evolution, as the null hypothesis of their statistical inference. In this study, we introduced two novel theoretical models, one based on neutral drift and the other assuming natural selection, by focusing on a common property of the distribution of mRNA abundance among a variety of eukaryotic cells, which reflects the result of long-term evolution. Our results demonstrated that (1) our models can reproduce two independently found phenomena simultaneously: the time development of gene expression divergence and Zipf's law of the transcriptome; (2) cytological constraints can be explicitly formulated to describe long-term evolution; (3) the model assuming that natural selection optimized relative mRNA abundance was more consistent with previously published observations than the model of optimized absolute mRNA abundances. The models introduced in this study give a formulation of evolutionary change in the mRNA abundance of each gene as a stochastic process, on the basis of previously published observations. This model provides a foundation for interpreting observed data in studies of gene expression evolution, including identifying an adequate time scale for discriminating the effect of natural selection from that of random genetic drift of selectively neutral variations.
Fox, Eric W; Hill, Ryan A; Leibowitz, Scott G; Olsen, Anthony R; Thornbrugh, Darren J; Weber, Marc H
2017-07-01
Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological data sets, there is limited guidance on variable selection methods for RF modeling. Typically, either a preselected set of predictor variables is used, or stepwise procedures are employed that iteratively remove variables according to their importance measures. This paper investigates the application of variable selection methods to RF models for predicting probable biological stream condition. Our motivating data set consists of the good/poor condition of n = 1365 stream survey sites from the 2008/2009 National Rivers and Stream Assessment, and a large set (p = 212) of landscape features from the StreamCat data set as potential predictors. We compare two types of RF models: a full variable set model with all 212 predictors and a reduced variable set model selected using a backward elimination approach. We assess model accuracy using RF's internal out-of-bag estimate, and a cross-validation procedure with validation folds external to the variable selection process. We also assess the stability of the spatial predictions generated by the RF models to changes in the number of predictors and argue that model selection needs to consider both accuracy and stability. The results suggest that RF modeling is robust to the inclusion of many variables of moderate to low importance. We found no substantial improvement in cross-validated accuracy as a result of variable reduction. Moreover, the backward elimination procedure tended to select too few variables and exhibited numerous issues such as upwardly biased out-of-bag accuracy estimates and instabilities in the spatial predictions. We use simulations to further support and generalize results from the analysis of real data. A main purpose of this work is to elucidate issues of model selection bias and instability to ecologists interested in
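The backward elimination procedure the abstract critiques can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: a toy nearest-centroid classifier stands in for the random forest, and the cross-validated scorer, synthetic data, and feature counts are all invented for the example.

```python
import random

def cv_accuracy(X, y, features, folds=5):
    """Toy cross-validated accuracy of a nearest-centroid classifier
    restricted to the given feature subset (stand-in for a random forest)."""
    n = len(X)
    idx = list(range(n))
    correct = 0
    for f in range(folds):
        test = idx[f::folds]
        train = [i for i in idx if i not in test]
        cents = {}
        for c in set(y[i] for i in train):
            rows = [X[i] for i in train if y[i] == c]
            cents[c] = [sum(r[j] for r in rows) / len(rows) for j in features]
        for i in test:
            pred = min(cents, key=lambda c: sum(
                (X[i][j] - cents[c][k]) ** 2 for k, j in enumerate(features)))
            correct += pred == y[i]
    return correct / n

def backward_eliminate(X, y, n_features):
    """Repeatedly drop the feature whose removal hurts accuracy least."""
    kept = list(range(n_features))
    path = [(list(kept), cv_accuracy(X, y, kept))]
    while len(kept) > 1:
        scored = [(cv_accuracy(X, y, [f for f in kept if f != d]), d) for d in kept]
        best_score, drop = max(scored)
        kept.remove(drop)
        path.append((list(kept), best_score))
    return path

# Two informative features (0, 1) plus one pure-noise feature (2).
random.seed(0)
X = [[random.gauss(y_i, 0.3), random.gauss(-y_i, 0.3), random.gauss(0, 1)]
     for y_i in [0, 1] * 30]
y = [0, 1] * 30
path = backward_eliminate(X, y, 3)
final_set, final_acc = path[-1]
```

The accuracy path illustrates the paper's point: on well-separated toy data the full model and the reduced models score almost identically, so elimination cannot distinguish "safe to drop" from "barely worth keeping".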
Analysis and applications of a frequency selective surface via a random distribution method
International Nuclear Information System (INIS)
Xie Shao-Yi; Huang Jing-Jian; Yuan Nai-Chang; Liu Li-Guo
2014-01-01
A novel frequency selective surface (FSS) for reducing radar cross section (RCS) is proposed in this paper. This FSS is based on the random distribution method, so it can be called a random surface. In this paper, stacked patches serving as periodic elements are employed for RCS reduction. Previous work has demonstrated the efficiency of utilizing microstrip patches, especially for the reflectarray. First, the relevant theory of the method is described. Then a sample of a three-layer variable-sized stacked-patch random surface with a dimension of 260 mm×260 mm is simulated, fabricated, and measured in order to demonstrate the validity of the proposed design. For normal incidence, an 8-dB RCS reduction is achieved in both simulation and measurement over 8 GHz–13 GHz. Oblique incidence at 30° is also investigated, for which a 7-dB RCS reduction is obtained over 8 GHz–14 GHz. (condensed matter: electronic structure, electrical, magnetic, and optical properties)
Directory of Open Access Journals (Sweden)
Zhao D
2015-07-01
Full Text Available Di Zhao,1,* Jian Song,2,* Xuan Gao,3 Fei Gao,4 Yupeng Wu,2 Yingying Lu,5 Kai Hou1 1Department of Neurosurgery, The First Hospital of Hebei Medical University, 2Department of Neurosurgery, 3Department of Neurology, The Second Hospital of Hebei Medical University, 4Hebei Provincial Procurement Centers for Medical Drugs and Devices, 5Department of Neurosurgery, The Second Hospital of Hebei Medical University, Shijiazhuang, People's Republic of China *These authors contributed equally to this work Background: Selective digestive decontamination (SDD) and selective oropharyngeal decontamination (SOD) are associated with reduced mortality and infection rates among patients in intensive care units (ICUs); however, whether SOD is superior to SDD remains uncertain. Hence, we conducted a meta-analysis of randomized controlled trials (RCTs) to compare SOD with SDD in terms of clinical outcomes and antimicrobial resistance rates in patients who were critically ill. Methods: RCTs published in PubMed, Embase, and Web of Science were systematically reviewed to compare the effects of SOD and SDD in patients who were critically ill. Outcomes included day-28 mortality, length of ICU stay, length of hospital stay, duration of mechanical ventilation, ICU-acquired bacteremia, and prevalence of antibiotic-resistant Gram-negative bacteria. Results were expressed as risk ratios (RRs) with 95% confidence intervals (CIs), and weighted mean differences (WMDs) with 95% CIs. Pooled estimates were performed using a fixed-effects model or random-effects model, depending on the heterogeneity among studies. Results: A total of four RCTs involving 23,822 patients met the inclusion criteria and were included in this meta-analysis. Among patients whose admitting specialty was surgery, cardiothoracic surgery (57.3%) and neurosurgery (29.7%) were the two main types of surgery being performed. Pooled results showed that SOD had similar effects as SDD in day-28 mortality (RR =1
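The fixed-effects pooling the meta-analysis describes is inverse-variance weighting of log risk ratios. A minimal sketch follows; the four studies' event counts are hypothetical placeholders, not the trial data from the review:

```python
import math

def pooled_risk_ratio(studies):
    """Fixed-effects (inverse-variance) pooled risk ratio with 95% CI.
    Each study is (events_treated, n_treated, events_control, n_control)."""
    num = den = 0.0
    for a, n1, c, n2 in studies:
        rr = (a / n1) / (c / n2)
        # variance of log RR via the delta method
        var = 1 / a - 1 / n1 + 1 / c - 1 / n2
        w = 1 / var
        num += w * math.log(rr)
        den += w
    log_rr = num / den
    se = math.sqrt(1 / den)
    ci = (math.exp(log_rr - 1.96 * se), math.exp(log_rr + 1.96 * se))
    return math.exp(log_rr), ci

# hypothetical day-28 mortality counts for four trials (illustration only)
studies = [(120, 1000, 130, 1000),
           (45, 400, 44, 410),
           (300, 2500, 310, 2450),
           (80, 700, 85, 720)]
rr, (lo, hi) = pooled_risk_ratio(studies)
```

A random-effects model would additionally estimate between-study variance (e.g. DerSimonian-Laird) and widen the weights accordingly; the fixed-effects version above is the simpler of the two models the authors switch between.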
Directory of Open Access Journals (Sweden)
Ramoni Marco F
2007-03-01
Full Text Available Abstract Background Recent studies have shown that when individuals are grouped on the basis of genetic similarity, group membership corresponds closely to continental origin. There has been considerable debate about the implications of these findings in the context of larger debates about race and the extent of genetic variation between groups. Some have argued that clustering according to continental origin demonstrates the existence of significant genetic differences between groups and that these differences may have important implications for differences in health and disease. Others argue that clustering according to continental origin requires the use of large amounts of genetic data or specifically chosen markers and is indicative only of very subtle genetic differences that are unlikely to have biomedical significance. Results We used small numbers of randomly selected single nucleotide polymorphisms (SNPs from the International HapMap Project to train naïve Bayes classifiers for prediction of ancestral continent of origin. Predictive accuracy was tested on two independent data sets. Genetically similar groups should be difficult to distinguish, especially if only a small number of genetic markers are used. The genetic differences between continentally defined groups are sufficiently large that one can accurately predict ancestral continent of origin using only a minute, randomly selected fraction of the genetic variation present in the human genome. Genotype data from only 50 random SNPs was sufficient to predict ancestral continent of origin in our primary test data set with an average accuracy of 95%. Genetic variations informative about ancestry were common and widely distributed throughout the genome. Conclusion Accurate characterization of ancestry is possible using small numbers of randomly selected SNPs. The results presented here show how investigators conducting genetic association studies can use small numbers of arbitrarily
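A naïve Bayes classifier over categorical SNP genotypes, as used in this study, can be sketched compactly. The genotype encoding (0/1/2 minor-allele counts), the toy populations, and the Laplace smoothing constant below are illustrative assumptions, not the HapMap setup:

```python
import math
from collections import defaultdict

def train_nb(genotypes, labels, alpha=1.0):
    """Naive Bayes over categorical SNP genotypes (0/1/2 minor-allele counts).
    Returns a predictor mapping a genotype vector to a class label."""
    classes = set(labels)
    prior = {c: labels.count(c) / len(labels) for c in classes}
    counts = {c: [defaultdict(float) for _ in range(len(genotypes[0]))]
              for c in classes}
    for g, c in zip(genotypes, labels):
        for j, v in enumerate(g):
            counts[c][j][v] += 1

    def predict(g):
        best, best_lp = None, -math.inf
        for c in classes:
            n_c = labels.count(c)
            lp = math.log(prior[c])
            for j, v in enumerate(g):
                # Laplace smoothing over the 3 possible genotype values
                lp += math.log((counts[c][j][v] + alpha) / (n_c + 3 * alpha))
            if lp > best_lp:
                best, best_lp = c, lp
        return best

    return predict

# toy data: population A enriched for genotype 2, population B for genotype 0
train_g = [[2, 2, 1], [2, 1, 2], [1, 2, 2], [0, 0, 1], [0, 1, 0], [1, 0, 0]]
train_y = ["A", "A", "A", "B", "B", "B"]
predict = train_nb(train_g, train_y)
pred_a = predict([2, 2, 2])
pred_b = predict([0, 0, 0])
```

With real data, each of the randomly selected SNPs contributes a small log-likelihood nudge toward one continent; the abstract's point is that even ~50 such nudges add up to high accuracy.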
Directory of Open Access Journals (Sweden)
Thandi Kapwata
2016-11-01
Full Text Available Malaria is an environmentally driven disease. In order to quantify the spatial variability of malaria transmission, it is imperative to understand the interactions between environmental variables and malaria epidemiology at a micro-geographic level using a novel statistical approach. The random forest (RF) statistical learning method, a relatively new variable-importance ranking method, measures the variable importance of potentially influential parameters through the percent increase of the mean squared error. As this value increases, so does the relative importance of the associated variable. The principal aim of this study was to create predictive malaria maps generated using the selected variables based on the RF algorithm in the Ehlanzeni District of Mpumalanga Province, South Africa. From the seven environmental variables used [temperature, lag temperature, rainfall, lag rainfall, humidity, altitude, and the normalized difference vegetation index (NDVI)], altitude was identified as the most influential predictor variable due to its high selection frequency. It was selected as the top predictor for 4 out of 12 months of the year, followed by NDVI, temperature and lag rainfall, which were each selected twice. The combination of climatic variables that produced the highest prediction accuracy was altitude, NDVI, and temperature. This suggests that these three variables have high predictive capabilities in relation to malaria transmission. Furthermore, it is anticipated that the predictive maps generated from predictions made by the RF algorithm could be used to monitor the progression of malaria and assist in intervention and prevention efforts with respect to malaria.
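The percent-increase-in-MSE importance measure mentioned above is permutation importance: shuffle one predictor's column and record how much the mean squared error grows. A minimal sketch, with a hand-built stand-in for a trained forest and invented data:

```python
import random

def permutation_importance(predict, X, y, seed=0):
    """Percent increase in mean squared error when one predictor's column
    is randomly shuffled, the idea behind the random forest %IncMSE measure."""
    rng = random.Random(seed)

    def mse(rows):
        return sum((predict(r) - t) ** 2 for r, t in zip(rows, y)) / len(y)

    base = mse(X)
    scores = []
    for j in range(len(X[0])):
        col = [r[j] for r in X]
        rng.shuffle(col)
        permuted = [r[:j] + [col[i]] + r[j + 1:] for i, r in enumerate(X)]
        scores.append(100.0 * (mse(permuted) - base) / base)
    return scores

# toy stand-in for a fitted model: the response depends only on feature 0
# (think "altitude"), while feature 1 is irrelevant
random.seed(1)
X = [[random.random(), random.random()] for _ in range(200)]
y = [3.0 * r[0] + random.gauss(0, 0.1) for r in X]

def model(r):
    return 3.0 * r[0]

imp = permutation_importance(model, X, y)
```

Shuffling the influential predictor inflates the error dramatically, while shuffling the irrelevant one leaves it unchanged, which is exactly the ranking signal the study uses to pick altitude, NDVI, and temperature.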
Sadeh, Sadra; Rotter, Stefan
2014-01-01
Neurons in the primary visual cortex are more or less selective for the orientation of a light bar used for stimulation. A broad distribution of individual grades of orientation selectivity has in fact been reported in all species. A possible reason for emergence of broad distributions is the recurrent network within which the stimulus is being processed. Here we compute the distribution of orientation selectivity in randomly connected model networks that are equipped with different spatial patterns of connectivity. We show that, for a wide variety of connectivity patterns, a linear theory based on firing rates accurately approximates the outcome of direct numerical simulations of networks of spiking neurons. Distance dependent connectivity in networks with a more biologically realistic structure does not compromise our linear analysis, as long as the linearized dynamics, and hence the uniform asynchronous irregular activity state, remain stable. We conclude that linear mechanisms of stimulus processing are indeed responsible for the emergence of orientation selectivity and its distribution in recurrent networks with functionally heterogeneous synaptic connectivity.
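The linear firing-rate theory referred to here solves r = h + W r for a weakly tuned feedforward input h and random recurrent weights W, then measures each neuron's orientation selectivity across stimulus orientations. The toy sketch below is only loosely inspired by the paper: the network size, coupling strength, tuning depth, and the circular-variance OSI definition are all illustrative choices.

```python
import cmath
import math
import random

random.seed(2)
N = 60                     # neurons
G = 0.5                    # recurrent coupling strength (spectral radius < 1)
theta = [math.pi * i / N for i in range(N)]   # preferred orientations
W = [[random.gauss(0.0, G / math.sqrt(N)) for _ in range(N)] for _ in range(N)]

def rates(phi, iters=50):
    """Steady state of r = h + W r for stimulus orientation phi, found by
    fixed-point iteration (converges since the spectral radius of W is < 1)."""
    h = [1.0 + 0.2 * math.cos(2.0 * (t - phi)) for t in theta]
    r = h[:]
    for _ in range(iters):
        r = [h[i] + sum(W[i][j] * r[j] for j in range(N)) for i in range(N)]
    return [max(x, 0.0) for x in r]   # rectify before computing selectivity

phis = [math.pi * k / 8 for k in range(8)]    # probe orientations
resp = [rates(p) for p in phis]
osi = []                                      # orientation selectivity index
for i in range(N):
    vec = sum(resp[k][i] * cmath.exp(2j * phis[k]) for k in range(8))
    tot = sum(resp[k][i] for k in range(8))
    osi.append(abs(vec) / tot if tot > 0 else 0.0)
```

Even though every neuron receives input with identical tuning depth, the random recurrence spreads the selectivity indices into a broad distribution, the qualitative effect the abstract describes.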
Interference-aware random beam selection schemes for spectrum sharing systems
Abdallah, Mohamed
2012-10-19
Spectrum sharing systems have been recently introduced to alleviate the problem of spectrum scarcity by allowing secondary unlicensed networks to share the spectrum with primary licensed networks under acceptable interference levels to the primary users. In this work, we develop interference-aware random beam selection schemes that provide enhanced performance for the secondary network under the condition that the interference observed by the receivers of the primary network is below a predetermined/acceptable value. We consider a secondary link composed of a transmitter equipped with multiple antennas and a single-antenna receiver sharing the same spectrum with a primary link composed of a single-antenna transmitter and a single-antenna receiver. The proposed schemes select a beam, among a set of power-optimized random beams, that maximizes the signal-to-interference-plus-noise ratio (SINR) of the secondary link while satisfying the primary interference constraint for different levels of feedback information describing the interference level at the primary receiver. For the proposed schemes, we develop a statistical analysis for the SINR statistics as well as the capacity and bit error rate (BER) of the secondary link.
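The selection rule can be sketched as: draw random unit-norm beams, discard any whose interference power at the primary receiver exceeds the threshold, and keep the feasible beam with the best secondary-link quality. The sketch below simplifies to SNR rather than full SINR and invents the channel model, threshold, and antenna counts; it is not the paper's power-optimized beam construction.

```python
import math
import random

random.seed(3)
NT = 4            # transmit antennas at the secondary transmitter (assumed)
NB = 16           # random beams drawn per selection round (assumed)
I_MAX = 0.5       # interference threshold at the primary receiver (assumed)
NOISE = 0.01      # receiver noise power (assumed)

def cn(n):
    """n i.i.d. unit-variance complex Gaussian entries (Rayleigh fading)."""
    return [complex(random.gauss(0, math.sqrt(0.5)),
                    random.gauss(0, math.sqrt(0.5))) for _ in range(n)]

h_s = cn(NT)      # channel: secondary Tx -> secondary Rx
h_p = cn(NT)      # channel: secondary Tx -> primary Rx (interference path)

def unit_beam():
    w = cn(NT)
    nrm = math.sqrt(sum(abs(x) ** 2 for x in w))
    return [x / nrm for x in w]

def power(h, w):
    """Received power |h^H w|^2 for beamforming vector w."""
    return abs(sum(hi.conjugate() * wi for hi, wi in zip(h, w))) ** 2

# keep only beams meeting the primary interference constraint,
# then pick the one maximizing the secondary link quality
scored = [(power(h_s, w) / NOISE, power(h_p, w)) for w in
          (unit_beam() for _ in range(NB))]
feasible = [(snr, leak) for snr, leak in scored if leak <= I_MAX]
if feasible:
    sel_snr, sel_leak = max(feasible)
else:
    sel_snr, sel_leak = 0.0, 0.0   # outage: no beam satisfies the constraint
```

The paper's contribution is analyzing this kind of constrained selection statistically (SINR distribution, capacity, BER) under different amounts of feedback about the leak term; the sketch only shows the selection mechanics.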
Abdallah, Mohamed M.
2013-11-01
In this work, we develop a joint interference-aware random beam and spectrum selection scheme that provides enhanced performance for the secondary network under the condition that the interference observed at the primary receiver is below a predetermined acceptable value. We consider a secondary link composed of a transmitter equipped with multiple antennas and a single-antenna receiver sharing the same spectrum with a set of primary links composed of a single-antenna transmitter and a single-antenna receiver. The proposed scheme jointly selects a beam, among a set of power-optimized random beams, as well as the primary spectrum that maximizes the signal-to-interference-plus-noise ratio (SINR) of the secondary link while satisfying the primary interference constraint. In particular, we consider the case where the interference level is described by a q-bit description of its magnitude, whereby we propose a technique to find the optimal quantizer thresholds in a mean square error (MSE) sense. © 2013 IEEE.
Abdallah, Mohamed M.; Sayed, Mostafa M.; Alouini, Mohamed-Slim; Qaraqe, Khalid A.
2013-01-01
In this work, we develop a joint interference-aware random beam and spectrum selection scheme that provides enhanced performance for the secondary network under the condition that the interference observed at the primary receiver is below a predetermined acceptable value. We consider a secondary link composed of a transmitter equipped with multiple antennas and a single-antenna receiver sharing the same spectrum with a set of primary links composed of a single-antenna transmitter and a single-antenna receiver. The proposed scheme jointly selects a beam, among a set of power-optimized random beams, as well as the primary spectrum that maximizes the signal-to-interference-plus-noise ratio (SINR) of the secondary link while satisfying the primary interference constraint. In particular, we consider the case where the interference level is described by a q-bit description of its magnitude, whereby we propose a technique to find the optimal quantizer thresholds in a mean square error (MSE) sense. © 2013 IEEE.
Treatment selection in a randomized clinical trial via covariate-specific treatment effect curves.
Ma, Yunbei; Zhou, Xiao-Hua
2017-02-01
For time-to-event data in a randomized clinical trial, we proposed two new methods for selecting an optimal treatment for a patient based on the covariate-specific treatment effect curve, which is used to represent the clinical utility of a predictive biomarker. To select an optimal treatment for a patient with a specific biomarker value, we proposed pointwise confidence intervals for each covariate-specific treatment effect curve and for the difference between the covariate-specific treatment effect curves of two treatments. Furthermore, to select an optimal treatment for a future biomarker-defined subpopulation of patients, we proposed confidence bands for each covariate-specific treatment effect curve and for the difference between each pair of covariate-specific treatment effect curves over a fixed interval of biomarker values. We constructed the confidence bands based on a resampling technique. We also conducted simulation studies to evaluate the finite-sample properties of the proposed estimation methods. Finally, we illustrated the application of the proposed method in a real-world data set.
Integrated Behavior Therapy for Selective Mutism: a randomized controlled pilot study.
Bergman, R Lindsey; Gonzalez, Araceli; Piacentini, John; Keller, Melody L
2013-10-01
To evaluate the feasibility, acceptability, and preliminary efficacy of a novel behavioral intervention for reducing symptoms of selective mutism and increasing functional speech. A total of 21 children ages 4 to 8 with primary selective mutism were randomized to 24 weeks of Integrated Behavior Therapy for Selective Mutism (IBTSM) or a 12-week Waitlist control. Clinical outcomes were assessed using blind independent evaluators, parent- and teacher-report, and an objective behavioral measure. Treatment recipients completed a three-month follow-up to assess durability of treatment gains. Data indicated increased functional speaking behavior post-treatment as rated by parents and teachers, with a high rate of treatment responders as rated by blind independent evaluators (75%). Conversely, children in the Waitlist comparison group did not experience significant improvements in speaking behaviors. Children who received IBTSM also demonstrated significant improvements in number of words spoken at school compared to baseline; however, significant group differences did not emerge. Treatment recipients also experienced significant reductions in social anxiety per parent, but not teacher, report. Clinical gains were maintained over the 3-month follow-up. IBTSM appears to be a promising new intervention that is efficacious in increasing functional speaking behaviors, feasible, and acceptable to parents and teachers. Copyright © 2013 Elsevier Ltd. All rights reserved.
Two-year Randomized Clinical Trial of Self-etching Adhesives and Selective Enamel Etching.
Pena, C E; Rodrigues, J A; Ely, C; Giannini, M; Reis, A F
2016-01-01
The aim of this randomized, controlled prospective clinical trial was to evaluate the clinical effectiveness of restoring noncarious cervical lesions with two self-etching adhesive systems applied with or without selective enamel etching. A one-step self-etching adhesive (Xeno V(+)) and a two-step self-etching system (Clearfil SE Bond) were used. The effectiveness of phosphoric acid selective etching of enamel margins was also evaluated. Fifty-six cavities were restored with each adhesive system and divided into two subgroups (n=28; etch and non-etch). All 112 cavities were restored with the nanohybrid composite Esthet.X HD. The clinical effectiveness of restorations was recorded in terms of retention, marginal integrity, marginal staining, caries recurrence, and postoperative sensitivity after 3, 6, 12, 18, and 24 months (modified United States Public Health Service). The Friedman test detected significant differences only after 18 months for marginal staining in the groups Clearfil SE non-etch (p=0.009) and Xeno V(+) etch (p=0.004). One restoration was lost during the trial (Xeno V(+) etch; p>0.05). Although an increase in marginal staining was recorded for groups Clearfil SE non-etch and Xeno V(+) etch, the clinical effectiveness of restorations was considered acceptable for the single-step and two-step self-etching systems with or without selective enamel etching in this 24-month clinical trial.
Selecting Optimal Parameters of Random Linear Network Coding for Wireless Sensor Networks
DEFF Research Database (Denmark)
Heide, J; Zhang, Qi; Fitzek, F H P
2013-01-01
This work studies how to select optimal code parameters of Random Linear Network Coding (RLNC) in Wireless Sensor Networks (WSNs). With Rateless Deluge [1] the authors proposed to apply Network Coding (NC) for Over-the-Air Programming (OAP) in WSNs, and demonstrated that with NC a significant reduction in the number of transmitted packets can be achieved. However, NC introduces additional computations and potentially a non-negligible transmission overhead, both of which depend on the chosen coding parameters. Therefore it is necessary to consider the trade-off that these coding parameters present in order to obtain the lowest energy consumption per transmitted bit. This problem is analyzed and suitable coding parameters are determined for the popular Tmote Sky platform. Compared to the use of traditional RLNC, these parameters enable a reduction in the energy spent per bit which grows…
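RLNC itself is easy to sketch: coded packets are random linear combinations of source packets, and a receiver decodes by Gaussian elimination once it has collected enough linearly independent combinations. The sketch below works over GF(2) with single-byte payloads, a deliberate simplification of the parameter space (field size, generation size) that the paper optimizes for energy per bit.

```python
import random

def rlnc_encode(packets, n_coded, rng):
    """Each coded packet is a random GF(2) linear combination (XOR subset)
    of the k source packets, sent along with its coefficient vector."""
    k = len(packets)
    coded = []
    for _ in range(n_coded):
        coeffs = [rng.randint(0, 1) for _ in range(k)]
        payload = 0
        for c, p in zip(coeffs, packets):
            if c:
                payload ^= p
        coded.append((coeffs, payload))
    return coded

def rlnc_decode(coded, k):
    """Gaussian elimination over GF(2).
    Returns the source packets, or None while rank < k."""
    lead_rows = {}                       # leading column -> (coeffs, payload)
    for coeffs, payload in coded:
        c, p = coeffs[:], payload
        for lead in sorted(lead_rows):   # reduce against existing pivots
            if c[lead]:
                bc, bp = lead_rows[lead]
                c = [x ^ y for x, y in zip(c, bc)]
                p ^= bp
        if any(c):                       # linearly independent: new pivot row
            lead_rows[c.index(1)] = (c, p)
    if len(lead_rows) < k:
        return None
    for lead in sorted(lead_rows, reverse=True):   # back-substitution
        bc, bp = lead_rows[lead]
        for l2 in sorted(lead_rows):
            if l2 != lead:
                c2, p2 = lead_rows[l2]
                if c2[lead]:
                    lead_rows[l2] = ([x ^ y for x, y in zip(c2, bc)], p2 ^ bp)
    return [lead_rows[l][1] for l in sorted(lead_rows)]

# rateless operation: keep sending coded packets until the receiver can decode
rng = random.Random(4)
packets = [0x48, 0x69, 0x21, 0x7E]       # four one-byte source packets
coded, decoded = [], None
while decoded is None:
    coded.extend(rlnc_encode(packets, 1, rng))
    decoded = rlnc_decode(coded, len(packets))
n_used = len(coded)
```

The gap between `n_used` and the number of source packets is the transmission overhead of GF(2) coding; larger fields shrink that overhead but raise the per-packet computation cost, which is precisely the energy trade-off the paper quantifies for the Tmote Sky.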
Energy Technology Data Exchange (ETDEWEB)
Chandonia, John-Marc; Brenner, Steven E.
2004-07-14
The structural genomics project is an international effort to determine the three-dimensional shapes of all important biological macromolecules, with a primary focus on proteins. Target proteins should be selected according to a strategy which is medically and biologically relevant, of good value, and tractable. As an option to consider, we present the Pfam5000 strategy, which involves selecting the 5000 most important families from the Pfam database as sources for targets. We compare the Pfam5000 strategy to several other proposed strategies that would require similar numbers of targets. These include complete solution of several small to moderately sized bacterial proteomes, partial coverage of the human proteome, and random selection of approximately 5000 targets from sequenced genomes. We measure the impact that successful implementation of these strategies would have upon structural interpretation of the proteins in Swiss-Prot, TrEMBL, and 131 complete proteomes (including 10 of eukaryotes) from the Proteome Analysis database at EBI. Solving the structures of proteins from the 5000 largest Pfam families would allow accurate fold assignment for approximately 68 percent of all prokaryotic proteins (covering 59 percent of residues) and 61 percent of eukaryotic proteins (40 percent of residues). More fine-grained coverage which would allow accurate modeling of these proteins would require an order of magnitude more targets. The Pfam5000 strategy may be modified in several ways, for example to focus on larger families, bacterial sequences, or eukaryotic sequences; as long as secondary consideration is given to large families within Pfam, coverage results vary only slightly. In contrast, focusing structural genomics on a single tractable genome would have only a limited impact in structural knowledge of other proteomes: a significant fraction (about 30-40 percent of the proteins, and 40-60 percent of the residues) of each proteome is classified in small
Day-ahead load forecast using random forest and expert input selection
International Nuclear Information System (INIS)
Lahouar, A.; Ben Hadj Slama, J.
2015-01-01
Highlights: • A model based on random forests for short term load forecast is proposed. • An expert feature selection is added to refine inputs. • Special attention is paid to customer behavior, load profile and special holidays. • The model is flexible and able to handle complex load signals. • A technical comparison is performed to assess the forecast accuracy. - Abstract: The electrical load forecast has become more and more important in recent years due to electricity market deregulation and the integration of renewable resources. To overcome the incoming challenges and ensure accurate power prediction for different time horizons, sophisticated intelligent methods are elaborated. Utilization of intelligent forecast algorithms is among the main characteristics of smart grids, and is an efficient tool to face uncertainty. Several crucial tasks of power operators, such as load dispatch, rely on the short term forecast, thus it should be as accurate as possible. To this end, this paper proposes a short term load predictor able to forecast the next 24 h of load. Using random forest, characterized by immunity to parameter variations and internal cross validation, the model is constructed following an online learning process. The inputs are refined by expert feature selection using a set of if-then rules, in order to include the user's own specifications about the country's weather or market, and to generalize the forecast ability. The proposed approach is tested on a real historical set from the Tunisian Power Company, and the simulation shows accurate and satisfactory results for one day in advance, with an average error that rarely exceeds 2.3%. The model is validated for regular working days and weekends, and special attention is paid to moving holidays, which follow a non-Gregorian calendar
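The expert if-then input refinement described above can be pictured as a rule layer that prunes the candidate feature list before the forecaster is trained. All rule conditions and feature names below are hypothetical illustrations, not the utility's actual rules:

```python
def select_inputs(candidates, context):
    """Expert if-then rules that prune/augment the candidate input list
    before it is handed to the random forest forecaster (toy rules)."""
    keep = set(candidates)
    # Rule 1: on moving (non-Gregorian) holidays, same-weekday lags mislead
    if context.get("moving_holiday"):
        keep.discard("load_same_weekday_lag")
        keep.add("load_last_holiday")
    # Rule 2: in a mild climate, the temperature forecast adds little signal
    if context.get("climate") == "mild":
        keep.discard("temperature_forecast")
    # Rule 3: weekends follow their own load profile, not the workday one
    if context.get("day_type") == "weekend":
        keep.discard("load_workday_lag")
        keep.add("load_last_weekend")
    return sorted(keep)

candidates = ["load_24h_lag", "load_same_weekday_lag", "load_workday_lag",
              "temperature_forecast", "hour_of_day"]
inputs_holiday = select_inputs(candidates, {"moving_holiday": True})
inputs_weekend = select_inputs(candidates, {"day_type": "weekend"})
```

Encoding such domain knowledge as explicit rules keeps the forecaster's input set small and situation-appropriate, which is the "expert input selection" the title refers to.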
Crampin, A C; Mwinuka, V; Malema, S S; Glynn, J R; Fine, P E
2001-01-01
Selection bias, particularly of controls, is common in case-control studies and may materially affect the results. Methods of control selection should be tailored both for the risk factors and disease under investigation and for the population being studied. We present here a control selection method devised for a case-control study of tuberculosis in rural Africa (Karonga, northern Malawi) that selects an age/sex frequency-matched random sample of the population, with a geographical distribution in proportion to the population density. We also present an audit of the selection process, and discuss the potential of this method in other settings.
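The control selection scheme (an age/sex frequency-matched random sample whose geographic spread follows population density) can be sketched as follows. Drawing persons uniformly at random automatically weights areas by their population density; the strata, counts, and village structure below are invented for illustration:

```python
import random

def select_controls(population, cases, rng):
    """Age/sex frequency-matched random controls. A uniform draw over
    persons yields a geographic spread proportional to population density."""
    # target counts per age/sex stratum, taken from the case series
    need = {}
    for person in cases:
        key = (person["age_group"], person["sex"])
        need[key] = need.get(key, 0) + 1
    pool = population[:]
    rng.shuffle(pool)
    controls = []
    for person in pool:
        key = (person["age_group"], person["sex"])
        if need.get(key, 0) > 0:
            controls.append(person)
            need[key] -= 1
    return controls

# toy population: 20 villages, 10 persons per age/sex stratum per village
rng = random.Random(5)
population = [{"age_group": a, "sex": s, "village": v}
              for v in range(20) for a in ("15-34", "35-54") for s in "MF"
              for _ in range(10)]
cases = ([{"age_group": "15-34", "sex": "M"}] * 6 +
         [{"age_group": "35-54", "sex": "F"}] * 4)
controls = select_controls(population, cases, rng)
```

The returned controls mirror the cases' age/sex frequencies exactly while remaining a random, density-weighted sample of the source population, which is the property the audit in the paper verifies.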
Modified random hinge transport mechanics and multiple scattering step-size selection in EGS5
International Nuclear Information System (INIS)
Wilderman, S.J.; Bielajew, A.F.
2005-01-01
The new transport mechanics in EGS5 allows for significantly longer electron transport step sizes and hence shorter computation times than required for identical problems in EGS4. But as with all Monte Carlo electron transport algorithms, certain classes of problems exhibit step-size dependencies even when operating within recommended ranges, sometimes making selection of step sizes a daunting task for novice users. Further contributing to this problem, because of the decoupling of multiple scattering and continuous energy loss in the dual random hinge transport mechanics of EGS5, there are two independent step sizes in EGS5, one for multiple scattering and one for continuous energy loss, each of which influences speed and accuracy in a different manner. Further, whereas EGS4 used a single value of fractional energy loss (ESTEPE) to determine step sizes at all energies, to increase performance by decreasing the amount of effort expended simulating lower energy particles, EGS5 permits the fractional energy loss values which are used to determine both the multiple scattering and continuous energy loss step sizes to vary with energy. This results in requiring the user to specify four fractional energy loss values when optimizing computations for speed. Thus, in order to simplify step-size selection and to mitigate step-size dependencies, a method has been devised to automatically optimize step-size selection based on a single material-dependent input related to the size of the problem tally region. In this paper we discuss the new transport mechanics in EGS5 and describe the automatic step-size optimization algorithm. (author)
The adverse effect of selective cyclooxygenase-2 inhibitor on random skin flap survival in rats.
Directory of Open Access Journals (Sweden)
Haiyong Ren
Full Text Available BACKGROUND: Cyclooxygenase-2 (COX-2) inhibitors provide desired analgesic effects after injury or surgery, but evidence suggests they also attenuate wound healing. This study investigates the effect of a COX-2 inhibitor on random skin flap survival. METHODS: The McFarlane flap model was established in 40 rats evaluated in two groups; each group was given the same volume of parecoxib or saline injection for 7 days. The necrotic area of the flap was measured, and specimens of the flap were stained with haematoxylin-eosin (HE) for histologic analysis. Immunohistochemical staining was performed to analyse the levels of VEGF and COX-2. RESULTS: 7 days after operation, the flap necrotic area ratio in the study group (66.65 ± 2.81%) was significantly larger than that of the control group (48.81 ± 2.33%) (P<0.01). Histological analysis demonstrated angiogenesis, with mean vessel density per mm(2) being lower in the study group (15.4 ± 4.4) than in the control group (27.2 ± 4.1) (P<0.05). The expression of COX-2 and VEGF protein in intermediate area II was evaluated in the two groups by immunohistochemistry. The expression of COX-2 in the study group was 1022.45 ± 153.1, versus 2638.05 ± 132.2 in the control group (P<0.01). The expression of VEGF in the study and control groups was 2779.45 ± 472.0 vs 4938.05 ± 123.6 (P<0.01). In the COX-2 inhibitor group, the expression of COX-2 and VEGF protein was remarkably down-regulated compared with the control group. CONCLUSION: The selective COX-2 inhibitor had an adverse effect on random skin flap survival. Suppression of neovascularization induced by a low level of VEGF is the presumed biological mechanism.
Application of random coherence order selection in gradient-enhanced multidimensional NMR
International Nuclear Information System (INIS)
Bostock, Mark J.; Nietlispach, Daniel
2016-01-01
Development of multidimensional NMR is essential to many applications, for example in high resolution structural studies of biomolecules. Multidimensional techniques enable separation of NMR signals over several dimensions, improving signal resolution, whilst also allowing identification of new connectivities. However, these advantages come at a significant cost. The Fourier transform theorem requires acquisition of a grid of regularly spaced points to satisfy the Nyquist criterion, while frequency discrimination and acquisition of a pure phase spectrum require acquisition of both quadrature components for each time point in every indirect (non-acquisition) dimension, adding a factor of 2^(N-1) to the number of free-induction decays which must be acquired, where N is the number of dimensions. Compressed sensing (CS) ℓ1-norm minimisation in combination with non-uniform sampling (NUS) has been shown to be extremely successful in overcoming the Nyquist criterion. Previously, maximum entropy reconstruction has also been used to overcome the limitation of frequency discrimination, processing data acquired with only one quadrature component at a given time interval, known as random phase detection (RPD), allowing a factor of two reduction in the number of points for each indirect dimension (Maciejewski et al. 2011 PNAS 108 16640). However, whilst this approach can be easily applied in situations where the quadrature components are acquired as amplitude modulated data, the same principle is not easily extended to phase modulated (P-/N-type) experiments where data is acquired in the form exp(iωt) or exp(-iωt), and which make up many of the multidimensional experiments used in modern NMR. Here we demonstrate a modification of the CS ℓ1-norm approach to allow random coherence order selection (RCS) for phase modulated experiments; we generalise the nomenclature for RCS and RPD as random quadrature detection (RQD). With this method, the power of RQD can be extended
Random genetic drift, natural selection, and noise in human cranial evolution.
Roseman, Charles C
2016-08-01
This study assesses the extent to which relationships among groups complicate comparative studies of adaptation in recent human cranial variation and the extent to which departures from neutral additive models of evolution hinder the reconstruction of population relationships among groups using cranial morphology. Using a maximum likelihood evolutionary model fitting approach and a mixed population genomic and cranial data set, I evaluate the relative fits of several widely used models of human cranial evolution. Moreover, I compare the goodness of fit of models of cranial evolution constrained by genomic variation to test hypotheses about population specific departures from neutrality. Models from population genomics are much better fits to cranial variation than are traditional models from comparative human biology. There is not enough evolutionary information in the cranium to reconstruct much of recent human evolution but the influence of population history on cranial variation is strong enough to cause comparative studies of adaptation serious difficulties. Deviations from a model of random genetic drift along a tree-like population history show the importance of environmental effects, gene flow, and/or natural selection on human cranial variation. Moreover, there is a strong signal of the effect of natural selection or an environmental factor on a group of humans from Siberia. The evolution of the human cranium is complex and no one evolutionary process has prevailed at the expense of all others. A holistic unification of phenome, genome, and environmental context, gives us a strong point of purchase on these problems, which is unavailable to any one traditional approach alone. Am J Phys Anthropol 160:582-592, 2016. © 2016 Wiley Periodicals, Inc.
Multi-Label Learning via Random Label Selection for Protein Subcellular Multi-Locations Prediction.
Wang, Xiao; Li, Guo-Zheng
2013-03-12
Prediction of protein subcellular localization is an important but challenging problem, particularly when proteins may simultaneously exist at, or move between, two or more different subcellular location sites. Most existing protein subcellular localization methods deal only with single-location proteins. In the past few years, only a few methods have been proposed to tackle proteins with multiple locations. However, they adopt a simple strategy of transforming multi-location proteins into multiple proteins with a single location each, which does not take correlations among different subcellular locations into account. In this paper, a novel method named RALS (multi-label learning via RAndom Label Selection) is proposed to learn from multi-location proteins in an effective and efficient way. Through a five-fold cross-validation test on a benchmark dataset, we demonstrate that our proposed method, which takes label correlations into consideration, clearly outperforms the baseline BR method, which does not, indicating that correlations among different subcellular locations really exist and contribute to improved prediction performance. Experimental results on two benchmark datasets also show that our proposed method achieves significantly higher performance than some other state-of-the-art methods in predicting subcellular multi-locations of proteins. The prediction web server is available at http://levis.tongji.edu.cn:8080/bioinfo/MLPred-Euk/ for public use.
Le, Trang T; Simmons, W Kyle; Misaki, Masaya; Bodurka, Jerzy; White, Bill C; Savitz, Jonathan; McKinney, Brett A
2017-09-15
Classification of individuals into disease or clinical categories from high-dimensional biological data with low prediction error is an important challenge of statistical learning in bioinformatics. Feature selection can improve classification accuracy but must be incorporated carefully into cross-validation to avoid overfitting. Recently, feature selection methods based on differential privacy, such as differentially private random forests and reusable holdout sets, have been proposed. However, for domains such as bioinformatics, where the number of features is much larger than the number of observations (p ≫ n), these differential privacy methods are susceptible to overfitting. We introduce private Evaporative Cooling, a stochastic privacy-preserving machine learning algorithm that uses Relief-F for feature selection and random forests for classification while preserving privacy and preventing overfitting. We relate the privacy-preserving threshold mechanism to a thermodynamic Maxwell-Boltzmann distribution, where the temperature represents the privacy threshold. We use the thermal statistical physics concept of Evaporative Cooling of atomic gases to perform backward stepwise privacy-preserving feature selection. On simulated data with main effects and statistical interactions, we compare accuracies on holdout and validation sets for three privacy-preserving methods: the reusable holdout, reusable holdout with random forest, and private Evaporative Cooling, which uses Relief-F feature selection and random forest classification. In simulations where interactions exist between attributes, private Evaporative Cooling provides higher classification accuracy without overfitting based on an independent validation set. In simulations without interactions, thresholdout with random forest and private Evaporative Cooling give comparable accuracies. We also apply these privacy methods to human brain resting-state fMRI data from a study of major depressive disorder. Code
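The Maxwell-Boltzmann analogy in this abstract can be sketched as follows: low-importance features are "evaporated" during backward stepwise selection with probability weighted by exp(-score/T), where the temperature T plays the role of the privacy threshold. This is an illustrative reconstruction, not the authors' implementation; the importance scores and the temperature value are invented.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical Relief-F-style importance scores for six features
scores = np.array([0.30, 0.02, 0.25, 0.01, 0.28, 0.05])
T = 0.02                                   # "temperature": the privacy-threshold knob

idx = list(range(len(scores)))
removed = []
while len(idx) > 3:                        # backward stepwise elimination
    # Boltzmann weights: low-importance features are most likely to "evaporate"
    w = np.exp(-scores[idx] / T)
    victim = rng.choice(len(idx), p=w / w.sum())
    removed.append(idx.pop(victim))
print("evaporated:", removed, "kept:", idx)
```

The stochastic (rather than deterministic arg-min) removal is what connects the selection step to a thermodynamic sampling picture: at high T all features evaporate nearly uniformly, while at low T removal concentrates on the weakest features.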
Meinzer, Caitlyn; Martin, Renee; Suarez, Jose I
2017-09-08
In phase II trials, the most efficacious dose is usually not known. Moreover, given limited resources, it is difficult to robustly identify a dose while also testing for a signal of efficacy that would support a phase III trial. Recent designs have sought to be more efficient by exploring multiple doses through the use of adaptive strategies. However, the added flexibility may potentially increase the risk of making incorrect assumptions and reduce the total amount of information available across the dose range as a function of imbalanced sample size. To balance these challenges, a novel placebo-controlled design is presented in which a restricted Bayesian response adaptive randomization (RAR) is used to allocate a majority of subjects to the optimal dose of active drug, defined as the dose with the lowest probability of poor outcome. However, the allocation between subjects who receive active drug or placebo is held constant to retain the maximum possible power for a hypothesis test of overall efficacy comparing the optimal dose to placebo. The design properties and optimization of the design are presented in the context of a phase II trial for subarachnoid hemorrhage. For a fixed total sample size, a trade-off exists between the ability to select the optimal dose and the probability of rejecting the null hypothesis. This relationship is modified by the allocation ratio between active and control subjects, the choice of RAR algorithm, and the number of subjects allocated to an initial fixed allocation period. While a responsive RAR algorithm improves the ability to select the correct dose, there is an increased risk of assigning more subjects to a worse arm as a function of ephemeral trends in the data. A subarachnoid treatment trial is used to illustrate how this design can be customized for specific objectives and available data. Bayesian adaptive designs are a flexible approach to addressing multiple questions surrounding the optimal dose for treatment efficacy
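The restricted allocation idea in this abstract can be sketched with Beta-Binomial posteriors over each active arm's probability of poor outcome: the active-drug share is split in proportion to each dose's posterior probability of being best (lowest poor-outcome risk), while the placebo share stays fixed to preserve power for the efficacy comparison. The outcome counts and the placebo share below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Poor-outcome counts (events, n) per active dose arm
arms = {"low": (8, 20), "mid": (4, 20), "high": (9, 20)}
placebo_share = 0.4                 # fixed fraction assigned to placebo

# Beta(1, 1) prior; sample posteriors of each arm's poor-outcome probability
draws = {a: rng.beta(1 + e, 1 + n - e, size=10_000) for a, (e, n) in arms.items()}
mat = np.column_stack([draws[a] for a in arms])

# P(arm is best) = P(arm has the lowest poor-outcome probability)
p_best = np.mean(mat.argmin(axis=1)[:, None] == np.arange(len(arms)), axis=0)

# Restricted RAR: active-drug share split by P(dose is best); placebo fixed
alloc = {a: (1 - placebo_share) * p for a, p in zip(arms, p_best)}
alloc["placebo"] = placebo_share
print({a: round(v, 3) for a, v in alloc.items()})
```

Because the placebo fraction never adapts, the trial retains its planned power for the optimal-dose-versus-placebo test even as allocation among active doses shifts toward the apparent winner.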
Wampler, Peter J; Rediske, Richard R; Molla, Azizur R
2013-01-18
A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground-based surveys to collect household location data and can be performed in a rapid, cost-effective manner. The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel, which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. A total of 537 homes were initially mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Of the homes visited, only 3.6% of occupants declined to be interviewed, and 16.4% were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. The method used to generate and field locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent and only rarely was local knowledge required to identify and locate households. This
Yuvchenko, S. A.; Ushakova, E. V.; Pavlova, M. V.; Alonova, M. V.; Zimnyakov, D. A.
2018-04-01
We consider the practical realization of a new optical probe method for random media, defined as reference-free path-length interferometry with intensity-moment analysis. A peculiarity in the statistics of the spectrally selected fluorescence radiation in a laser-pumped dye-doped random medium is discussed. Previously established correlations between the second- and third-order moments of the intensity fluctuations in the random interference patterns, the coherence function of the probe radiation, and the path-difference probability density for the interfering partial waves in the medium are confirmed. The correlations were verified using statistical analysis of the spectrally selected fluorescence radiation emitted by a laser-pumped dye-doped random medium. An aqueous solution of Rhodamine 6G was applied as the doping fluorescent agent for ensembles of densely packed silica grains, which were pumped by the 532 nm radiation of a solid-state laser. The spectrum of the mean path length for a random medium was reconstructed.
The basic science and mathematics of random mutation and natural selection.
Kleinman, Alan
2014-12-20
The mutation and natural selection phenomenon can and often does cause the failure of antimicrobial, herbicidal, pesticide, and cancer treatments, all of which act as selection pressures. This phenomenon operates in a mathematically predictable manner which, when understood, leads to approaches that reduce and prevent the failure of these selection pressures. The mathematical behavior of mutation and selection is derived using the principles given by probability theory. The derivation of the equations describing the mutation and selection phenomenon is carried out in the context of an empirical example. Copyright © 2014 John Wiley & Sons, Ltd.
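A standard first step in this kind of derivation (textbook probability reasoning consistent with the abstract, not necessarily the paper's exact equations): if a particular resistance mutation occurs with probability $\mu$ per replication, the probability that it occurs at least once in $n$ replications is

```latex
P(\text{at least one occurrence}) = 1 - (1 - \mu)^n .
```

If surviving a combination treatment requires two independent mutations with rates $\mu_1$ and $\mu_2$ in the same lineage, the per-replication probability drops to the product $\mu_1 \mu_2$, which is the quantitative rationale for applying multiple selection pressures simultaneously.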
Cohen-Khait, Ruth; Schreiber, Gideon
2018-04-27
Protein-protein interactions mediate the vast majority of cellular processes. Though protein interactions obey basic chemical principles also within the cell, the in vivo physiological environment may not allow for equilibrium to be reached. Thus, in vitro measured thermodynamic affinity may not provide a complete picture of protein interactions in the biological context. Binding kinetics composed of the association and dissociation rate constants are relevant and important in the cell. Therefore, changes in protein-protein interaction kinetics have a significant impact on the in vivo activity of the proteins. The common protocol for the selection of tighter binders from a mutant library selects for protein complexes with slower dissociation rate constants. Here we describe a method to specifically select for variants with faster association rate constants by using pre-equilibrium selection, starting from a large random library. Toward this end, we refine the selection conditions of a TEM1-β-lactamase library against its natural nanomolar affinity binder β-lactamase inhibitor protein (BLIP). The optimal selection conditions depend on the ligand concentration and on the incubation time. In addition, we show that a second sort of the library helps to separate signal from noise, resulting in a higher percent of faster binders in the selected library. Fast associating protein variants are of particular interest for drug development and other biotechnological applications.
r2VIM: A new variable selection method for random forests in genome-wide association studies.
Szymczak, Silke; Holzinger, Emily; Dasgupta, Abhijit; Malley, James D; Molloy, Anne M; Mills, James L; Brody, Lawrence C; Stambolian, Dwight; Bailey-Wilson, Joan E
2016-01-01
Machine learning methods and in particular random forests (RFs) are a promising alternative to standard single SNP analyses in genome-wide association studies (GWAS). RFs provide variable importance measures (VIMs) to rank SNPs according to their predictive power. However, in contrast to the established genome-wide significance threshold, no clear criteria exist to determine how many SNPs should be selected for downstream analyses. We propose a new variable selection approach, recurrent relative variable importance measure (r2VIM). Importance values are calculated relative to an observed minimal importance score for several runs of RF and only SNPs with large relative VIMs in all of the runs are selected as important. Evaluations on simulated GWAS data show that the new method controls the number of false-positives under the null hypothesis. Under a simple alternative hypothesis with several independent main effects it is only slightly less powerful than logistic regression. In an experimental GWAS data set, the same strong signal is identified while the approach selects none of the SNPs in an underpowered GWAS. The novel variable selection method r2VIM is a promising extension to standard RF for objectively selecting relevant SNPs in GWAS while controlling the number of false-positive results.
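The r2VIM selection rule can be sketched roughly as follows. The toy genotype data, the relative-importance threshold of 3, and the use of scikit-learn's permutation importance are illustrative stand-ins for the paper's RF implementation; the core idea is scaling importances by the magnitude of the minimal (noise-level) importance and requiring a large relative value in every run.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n, p = 400, 40
X = rng.integers(0, 3, size=(n, p)).astype(float)   # toy SNP matrix (0/1/2 genotypes)
y = (X[:, 0] + X[:, 1] + rng.normal(0, 0.5, n) > 2).astype(int)  # SNPs 0 and 1 causal

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

relative = []
for seed in range(3):                               # several RF runs, as in r2VIM
    rf = RandomForestClassifier(n_estimators=300, random_state=seed).fit(Xtr, ytr)
    imp = permutation_importance(rf, Xte, yte, n_repeats=10, random_state=seed)
    vim = imp.importances_mean
    # importance relative to the observed minimal (noise-level) importance
    relative.append(vim / max(abs(vim.min()), 1e-12))

relative = np.array(relative)
selected = np.where((relative >= 3.0).all(axis=0))[0]   # large relative VIM in *all* runs
print(selected)                                          # causal SNPs should survive
```

Requiring the threshold in all runs is what damps the run-to-run randomness of RF importances and controls false positives under the null.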
Patel, Raj Kumar; Giri, V.K.
2016-01-01
Fault detection and diagnosis is the most important technology in condition-based maintenance (CBM) system for rotating machinery. This paper experimentally explores the development of a random forest (RF) classifier, a recently emerged machine learning technique, for multi-class mechanical fault diagnosis in bearing of an induction motor. Firstly, the vibration signals are collected from the bearing using accelerometer sensor. Parameters from the vibration signal are extracted in the form of...
Heikamp, Kathrin; Bajorath, Jürgen
2013-07-22
The choice of negative training data for machine learning is a little explored issue in chemoinformatics. In this study, the influence of alternative sets of negative training data and different background databases on support vector machine (SVM) modeling and virtual screening has been investigated. Target-directed SVM models have been derived on the basis of differently composed training sets containing confirmed inactive molecules or randomly selected database compounds as negative training instances. These models were then applied to search background databases consisting of biological screening data or randomly assembled compounds for available hits. Negative training data were found to systematically influence compound recall in virtual screening. In addition, different background databases had a strong influence on the search results. Our findings also indicated that typical benchmark settings lead to an overestimation of SVM-based virtual screening performance compared to search conditions that are more relevant for practical applications.
Novel Zn2+-chelating peptides selected from a fimbria-displayed random peptide library
DEFF Research Database (Denmark)
Kjærgaard, Kristian; Schembri, Mark; Klemm, Per
2001-01-01
The display of peptide sequences on the surface of bacteria is a technology that offers exciting applications in biotechnology and medical research. Type 1 fimbriae are surface organelles of Escherichia coli which mediate D-mannose-sensitive binding to different host surfaces by virtue of the FimH adhesin. FimH is a component of the fimbrial organelle that can accommodate and display a diverse range of peptide sequences on the E. coli cell surface. In this study we have constructed a random peptide library in FimH. The library, consisting of approximately 40 million individual clones, was screened
Lee, Stella Juhyun; Brennan, Emily; Gibson, Laura Anne; Tan, Andy S. L.; Kybert-Momjian, Ani; Liu, Jiaying; Hornik, Robert
2016-01-01
Several message topic selection approaches propose that messages based on beliefs pretested and found to be more strongly associated with intentions will be more effective in changing population intentions and behaviors when used in a campaign. This study aimed to validate the underlying causal assumption of these approaches which rely on cross-sectional belief–intention associations. We experimentally tested whether messages addressing promising themes as identified by the above criterion were more persuasive than messages addressing less promising themes. Contrary to expectations, all messages increased intentions. Interestingly, mediation analyses showed that while messages deemed promising affected intentions through changes in targeted promising beliefs, messages deemed less promising also achieved persuasion by influencing nontargeted promising beliefs. Implications for message topic selection are discussed. PMID:27867218
Oracle Efficient Variable Selection in Random and Fixed Effects Panel Data Models
DEFF Research Database (Denmark)
Kock, Anders Bredahl
This paper generalizes the results for the Bridge estimator of Huang et al. (2008) to linear random and fixed effects panel data models which are allowed to grow in both dimensions. In particular we show that the Bridge estimator is oracle efficient. It can correctly distinguish between relevant and irrelevant variables, and the asymptotic distribution of the estimators of the coefficients of the relevant variables is the same as if only these had been included in the model, i.e. as if an oracle had revealed the true model prior to estimation. In the case of more explanatory variables than observations, we prove that the Marginal Bridge estimator can asymptotically correctly distinguish between relevant and irrelevant explanatory variables. We do this without restricting the dependence between covariates and without assuming sub-Gaussianity of the error terms, thereby generalizing the results...
Presence of psychoactive substances in oral fluid from randomly selected drivers in Denmark
DEFF Research Database (Denmark)
Simonsen, K. Wiese; Steentoft, A.; Hels, Tove
2012-01-01
This roadside study is the Danish part of the EU project DRUID (Driving under the Influence of Drugs, Alcohol, and Medicines) and included three representative regions in Denmark. Oral fluid samples (n = 3002) were collected randomly from drivers using a sampling scheme stratified by time, season, and road type. The oral fluid samples were screened for 29 illegal and legal psychoactive substances and metabolites as well as ethanol. Fourteen (0.5%) drivers were positive for ethanol (alone or in combination with drugs) at concentrations above 0.53 g/l (0.5 mg/g), which is the Danish legal limit. The percentage of drivers positive for medicinal drugs above the Danish legal concentration limit was 0.4%, while 0.3% of the drivers tested positive for one or more illicit drugs at concentrations exceeding the Danish legal limit. Tetrahydrocannabinol, cocaine, and amphetamine were the most frequent illicit...
Woitas-Slubowska, Donata; Hurnik, Elzbieta; Skarpańska-Stejnborn, Anna
2010-12-01
To determine the association between smoking status and leisure time physical activity (LTPA), alcohol consumption, and socioeconomic status (SES) among Polish adults. A total of 466 randomly selected men and women (aged 18-66 years) responded to an anonymous questionnaire regarding smoking, alcohol consumption, LTPA, and SES. Multiple logistic regression was used to examine the association of smoking status with six socioeconomic measures, level of LTPA, and frequency and type of alcohol consumed. Smokers were defined as individuals smoking occasionally or daily. The odds of being a smoker were 9 times (men) and 27 times (women) higher among respondents who drink alcohol several times/week or every day in comparison to non-drinkers (p times higher compared to those with high educational attainment (p = 0.007). Among women we observed that students were the most frequent smokers. Female students were almost three times more likely to smoke than non-professional women, and two times more likely than physical workers (p = 0.018). The findings of this study indicated that among randomly selected Polish men and women aged 18-66, smoking and alcohol consumption tended to cluster. These results imply that intervention strategies need to target multiple risk factors simultaneously. The highest risk of smoking was observed among low-educated men, female students, and both men and women drinking alcohol several times a week or every day. Information on subgroups with a high risk of smoking will help in planning future preventive strategies.
Nguyen, Thanh-Tung; Huang, Joshua; Wu, Qingyao; Nguyen, Thuy; Li, Mark
2015-01-01
Single-nucleotide polymorphisms (SNPs) selection and identification are the most important tasks in Genome-wide association data analysis. The problem is difficult because genome-wide association data is very high dimensional and a large portion of SNPs in the data is irrelevant to the disease. Advanced machine learning methods have been successfully used in Genome-wide association studies (GWAS) for identification of genetic variants that have relatively big effects in some common, complex diseases. Among them, the most successful one is Random Forests (RF). Despite of performing well in terms of prediction accuracy in some data sets with moderate size, RF still suffers from working in GWAS for selecting informative SNPs and building accurate prediction models. In this paper, we propose to use a new two-stage quality-based sampling method in random forests, named ts-RF, for SNP subspace selection for GWAS. The method first applies p-value assessment to find a cut-off point that separates informative and irrelevant SNPs in two groups. The informative SNPs group is further divided into two sub-groups: highly informative and weak informative SNPs. When sampling the SNP subspace for building trees for the forest, only those SNPs from the two sub-groups are taken into account. The feature subspaces always contain highly informative SNPs when used to split a node at a tree. This approach enables one to generate more accurate trees with a lower prediction error, meanwhile possibly avoiding overfitting. It allows one to detect interactions of multiple SNPs with the diseases, and to reduce the dimensionality and the amount of Genome-wide association data needed for learning the RF model. Extensive experiments on two genome-wide SNP data sets (Parkinson case-control data comprised of 408,803 SNPs and Alzheimer case-control data comprised of 380,157 SNPs) and 10 gene data sets have demonstrated that the proposed model significantly reduced prediction errors and outperformed
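The two-stage grouping behind ts-RF can be sketched as below. scikit-learn's chi2 scores stand in for the paper's p-value assessment, and the 0.05 cut-off, the half split, and the subspace rule are illustrative simplifications of the described method (informative SNPs only, with highly informative SNPs always present in the subspace).

```python
import numpy as np
from sklearn.feature_selection import chi2

rng = np.random.default_rng(3)
n, p = 300, 100
X = rng.integers(0, 3, size=(n, p))            # toy genotype matrix (0/1/2)
y = (X[:, 0] + X[:, 2] > 2).astype(int)        # SNPs 0 and 2 informative

_, pvals = chi2(X, y)                          # stage 1: per-SNP association p-values
informative = np.where(pvals < 0.05)[0]        # cut-off separates the two groups
irrelevant = np.where(pvals >= 0.05)[0]        # never enters any tree's subspace

# stage 2: split the informative group into highly vs weakly informative halves
order = informative[np.argsort(pvals[informative])]
half = max(1, len(order) // 2)
highly, weakly = order[:half], order[half:]

# feature subspace for one tree: drawn only from the informative sub-groups,
# always containing the highly informative SNPs
mtry = int(np.sqrt(p))
n_fill = max(0, min(len(weakly), mtry - len(highly)))
fill = rng.choice(weakly, size=n_fill, replace=False) if n_fill > 0 else np.array([], dtype=int)
subspace = np.concatenate([highly, fill])
print(sorted(int(i) for i in subspace))
```

Because irrelevant SNPs never enter a subspace, each split candidate set is enriched for signal, which is how the approach lowers prediction error while limiting overfitting in p ≫ n settings.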
Capturing the Flatness of a peer-to-peer lending network through random and selected perturbations
Karampourniotis, Panagiotis D.; Singh, Pramesh; Uparna, Jayaram; Horvat, Emoke-Agnes; Szymanski, Boleslaw K.; Korniss, Gyorgy; Bakdash, Jonathan Z.; Uzzi, Brian
Null models are established tools that have been used in network analysis to uncover various structural patterns. They quantify the deviation of an observed network measure from that given by the null model. We construct a null model for weighted, directed networks to identify biased links (carrying significantly different weights than expected according to the null model) and thus quantify the flatness of the system. Using this model, we study the flatness of Kiva, a large international crowdfinancing network of borrowers and lenders, aggregated to the country level. The dataset spans the years 2006 to 2013. Our longitudinal analysis shows that the flatness of the system is decreasing over time, meaning the proportion of biased inter-country links is growing. We extend our analysis by testing the robustness of the flatness of the network under perturbations of the links' weights or of the nodes themselves. Examples of such perturbations are event shocks (e.g. erecting walls) or regulatory shocks (e.g. Brexit). We find that flatness is unaffected by random shocks, but changes after shocks that target links with a large weight or bias. The methods we use to capture the flatness are based on analytics, simulations, and numerical computations using Shannon's maximum entropy. Supported by ARL NS-CTA.
Clark, Imogen N; Baker, Felicity A; Peiris, Casey L; Shoebridge, Georgie; Taylor, Nicholas F
2017-03-01
To evaluate effects of participant-selected music on older adults' achievement of activity levels recommended in the physical activity guidelines following cardiac rehabilitation. A parallel-group randomized controlled trial with measurements at Weeks 0, 6 and 26. A multisite outpatient rehabilitation programme of a publicly funded metropolitan health service. Adults aged 60 years and older who had completed a cardiac rehabilitation programme. Experimental participants selected music to support walking with guidance from a music therapist. Control participants received usual care only. The primary outcome was the proportion of participants achieving activity levels recommended in physical activity guidelines. Secondary outcomes compared amounts of physical activity, exercise capacity, cardiac risk factors, and exercise self-efficacy. A total of 56 participants, mean age 68.2 years (SD = 6.5), were randomized to the experimental (n = 28) and control (n = 28) groups. There were no differences between groups in proportions of participants achieving activity recommended in physical activity guidelines at Week 6 or 26. Secondary outcomes demonstrated between-group differences in male waist circumference at both measurements (Week 6 difference -2.0 cm, 95% CI -4.0 to 0; Week 26 difference -2.8 cm, 95% CI -5.4 to -0.1), and observed effect sizes favoured the experimental group for amounts of physical activity (d = 0.30), exercise capacity (d = 0.48), and blood pressure (d = -0.32). Participant-selected music did not increase the proportion of participants achieving recommended amounts of physical activity, but may have contributed to exercise-related benefits.
Marques, Yuri Bento; de Paiva Oliveira, Alcione; Ribeiro Vasconcelos, Ana Tereza; Cerqueira, Fabio Ribeiro
2016-12-15
MicroRNAs (miRNAs) are key gene expression regulators in plants and animals. Therefore, miRNAs are involved in several biological processes, making the study of these molecules one of the most relevant topics of molecular biology nowadays. However, characterizing miRNAs in vivo is still a complex task. As a consequence, in silico methods have been developed to predict miRNA loci. A common ab initio strategy to find miRNAs in genomic data is to search for sequences that can fold into the typical hairpin structure of miRNA precursors (pre-miRNAs). The current ab initio approaches, however, have selectivity issues, i.e., a high number of false positives is reported, which can lead to laborious and costly attempts to provide biological validation. This study presents an extension of the ab initio method miRNAFold, with the aim of improving selectivity through machine learning techniques, namely, random forest combined with the SMOTE procedure that copes with imbalance datasets. By comparing our method, termed Mirnacle, with other important approaches in the literature, we demonstrate that Mirnacle substantially improves selectivity without compromising sensitivity. For the three datasets used in our experiments, our method achieved at least 97% of sensitivity and could deliver a two-fold, 20-fold, and 6-fold increase in selectivity, respectively, compared with the best results of current computational tools. The extension of miRNAFold by the introduction of machine learning techniques, significantly increases selectivity in pre-miRNA ab initio prediction, which optimally contributes to advanced studies on miRNAs, as the need of biological validations is diminished. Hopefully, new research, such as studies of severe diseases caused by miRNA malfunction, will benefit from the proposed computational tool.
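The class-imbalance handling this abstract relies on can be illustrated with a minimal hand-rolled SMOTE-style oversampler (interpolating between a minority sample and one of its nearest minority neighbours). This is a sketch on synthetic data, not the SMOTE implementation Mirnacle uses; a random forest would then be trained on the balanced output.

```python
import numpy as np

rng = np.random.default_rng(4)

# Imbalanced toy data: few positives (pre-miRNA-like) vs many negatives
X_pos = rng.normal(2.0, 1.0, size=(15, 5))
X_neg = rng.normal(0.0, 1.0, size=(200, 5))

def smote_like(X, n_new, k=5, rng=rng):
    """Minimal SMOTE-style oversampling: interpolate between a minority
    sample and one of its k nearest minority neighbours."""
    out = []
    for _ in range(n_new):
        i = rng.integers(len(X))
        d = np.linalg.norm(X - X[i], axis=1)
        nbrs = np.argsort(d)[1:k + 1]          # skip the point itself
        j = rng.choice(nbrs)
        out.append(X[i] + rng.random() * (X[j] - X[i]))
    return np.array(out)

X_syn = smote_like(X_pos, n_new=len(X_neg) - len(X_pos))
X = np.vstack([X_pos, X_syn, X_neg])
y = np.array([1] * (len(X_pos) + len(X_syn)) + [0] * len(X_neg))
print(np.bincount(y))                          # balanced classes
```

Balancing the training set this way keeps the classifier from defaulting to the majority (non-miRNA) class, which is the selectivity problem the abstract is addressing.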
LENUS (Irish Health Repository)
Cronin, C C
2012-02-03
Abnormalities of the renin-angiotensin system have been reported in patients with diabetes mellitus and with diabetic complications. In this study, plasma concentrations of prorenin, renin, and aldosterone were measured in a stratified random sample of 110 insulin-dependent (Type 1) diabetic patients attending our outpatient clinic. Fifty-four age- and sex-matched control subjects were also examined. Plasma prorenin concentration was higher in patients without complications than in control subjects when upright (geometric mean (95% confidence intervals, CI): 75.9 (55.0-105.6) vs 45.1 (31.6-64.3) mU l-1, p < 0.05). There was no difference in plasma prorenin concentration between patients without and with microalbuminuria and between patients without and with background retinopathy. Plasma renin concentration, both when supine and upright, was similar in control subjects, in patients without complications, and in patients with varying degrees of diabetic microangiopathy. Plasma aldosterone was suppressed in patients without complications in comparison to control subjects (74 (58-95) vs 167 (140-199) ng l-1, p < 0.001) and was also suppressed in patients with microvascular disease. Plasma potassium was significantly higher in patients than in control subjects (mean ± standard deviation: 4.10 ± 0.36 vs 3.89 ± 0.26 mmol l-1; p < 0.001) and plasma sodium was significantly lower (138 ± 4 vs 140 ± 2 mmol l-1; p < 0.001). We conclude that plasma prorenin is not a useful early marker for diabetic microvascular disease. Despite apparently normal plasma renin concentrations, plasma aldosterone is suppressed in insulin-dependent diabetic patients.
Directory of Open Access Journals (Sweden)
Nantian Huang
2016-09-01
Full Text Available The prediction accuracy of short-term load forecast (STLF depends on prediction model choice and feature selection result. In this paper, a novel random forest (RF-based feature selection method for STLF is proposed. First, 243 related features were extracted from historical load data and the time information of prediction points to form the original feature set. Subsequently, the original feature set was used to train an RF as the original model. After the training process, the prediction error of the original model on the test set was recorded and the permutation importance (PI value of each feature was obtained. Then, an improved sequential backward search method was used to select the optimal forecasting feature subset based on the PI value of each feature. Finally, the optimal forecasting feature subset was used to train a new RF model as the final prediction model. Experiments showed that the prediction accuracy of RF trained by the optimal forecasting feature subset was higher than that of the original model and comparative models based on support vector regression and artificial neural network.
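The permutation-importance-guided backward search in this abstract might be sketched as below, on synthetic load-like data with an RF regressor. The paper's improved sequential backward search is simplified here to plain greedy elimination, and the features (hour, temperature, noise columns) are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n = 500
hour = rng.integers(0, 24, n)
temp = rng.normal(15, 8, n)
noise_feats = rng.normal(size=(n, 6))          # irrelevant candidate features
load = 100 + 20 * np.sin(2 * np.pi * hour / 24) + 1.5 * temp + rng.normal(0, 2, n)

X = np.column_stack([hour, temp, noise_feats])  # feature 0 = hour, 1 = temp
Xtr, Xte, ytr, yte = train_test_split(X, load, random_state=0)

feats = list(range(X.shape[1]))
best_err, best_set = np.inf, feats[:]
while len(feats) > 1:
    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(Xtr[:, feats], ytr)
    err = np.mean((rf.predict(Xte[:, feats]) - yte) ** 2)
    if err < best_err:                         # remember the best subset seen
        best_err, best_set = err, feats[:]
    pi = permutation_importance(rf, Xte[:, feats], yte, n_repeats=5, random_state=0)
    feats.pop(int(np.argmin(pi.importances_mean)))   # drop the least important feature

print(sorted(best_set))                        # should retain hour (0) and temp (1)
```

The final model is then retrained on the remembered best subset, mirroring the paper's step of training a new RF on the optimal forecasting feature subset.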
Bucur, Roxana C; Reid, Lauren S; Hamilton, Celeste J; Cummings, Steven R; Jamal, Sophie A
2013-09-08
'comparisons with the best' approach for data analyses, as this strategy allows practical considerations of ease of use and tolerability to guide selection of the preparation for future studies. Data from this protocol will be used to develop a randomized, controlled trial of nitrates to prevent osteoporotic fractures. ClinicalTrials.gov Identifier: NCT01387672. Controlled-Trials.com: ISRCTN08860742.
Repar, Jelena; Warnecke, Tobias
2017-08-01
Inversions are a major contributor to structural genome evolution in prokaryotes. Here, using a novel alignment-based method, we systematically compare 1,651 bacterial and 98 archaeal genomes to show that inversion landscapes are frequently biased toward (symmetric) inversions around the origin-terminus axis. However, symmetric inversion bias is not a universal feature of prokaryotic genome evolution but varies considerably across clades. At the extremes, inversion landscapes in Bacillus-Clostridium and Actinobacteria are dominated by symmetric inversions, while there is little or no systematic bias favoring symmetric rearrangements in archaea with a single origin of replication. Within clades, we find strong but clade-specific relationships between symmetric inversion bias and different features of adaptive genome architecture, including the distance of essential genes to the origin of replication and the preferential localization of genes on the leading strand. We suggest that heterogeneous selection pressures have converged to produce similar patterns of structural genome evolution across prokaryotes. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
Warris, Sven; Boymans, Sander; Muiser, Iwe; Noback, Michiel; Krijnen, Wim; Nap, Jan-Peter
2014-01-13
Small RNAs are important regulators of genome function, yet their prediction in genomes is still a major computational challenge. Statistical analyses of pre-miRNA sequences indicated that their 2D structure tends to have a minimal free energy (MFE) significantly lower than the MFE values of equivalently randomized sequences with the same nucleotide composition, in contrast to other classes of non-coding RNA. The computation of many MFEs is, however, too intensive to allow for genome-wide screenings. Using a local grid infrastructure, MFE distributions of random sequences were pre-calculated on a large scale. These MFE values follow a normal distribution, which can be used to determine the MFE distribution for any given sequence composition by interpolation. This allows on-the-fly calculation of the normal distribution for any candidate sequence composition. The speedup achieved makes genome-wide screening with this characteristic of a pre-miRNA sequence practical. Although this property alone is not sufficiently discriminative to distinguish miRNAs from other sequences, the MFE-based P-value should be added to the parameters of choice when selecting potential miRNA candidates for experimental verification.
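The interpolation-plus-normal-distribution idea can be illustrated with a small sketch. The table values, the GC-fraction-only composition key, and the function names below are hypothetical stand-ins for the authors' large-scale pre-calculated distributions:

```python
import math

def normal_cdf(x, mu, sigma):
    """Left-tail probability of a normal distribution via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Hypothetical precomputed table: GC fraction -> (mean MFE, std of MFE)
# for randomized sequences of a fixed length (values are illustrative only).
MFE_TABLE = {0.3: (-18.0, 4.0), 0.4: (-22.0, 4.5),
             0.5: (-27.0, 5.0), 0.6: (-33.0, 5.5)}

def interpolate_params(gc):
    """Linearly interpolate (mean, std) between the two nearest table entries."""
    keys = sorted(MFE_TABLE)
    lo = max(k for k in keys if k <= gc)
    hi = min(k for k in keys if k >= gc)
    if lo == hi:
        return MFE_TABLE[lo]
    t = (gc - lo) / (hi - lo)
    m1, s1 = MFE_TABLE[lo]
    m2, s2 = MFE_TABLE[hi]
    return m1 + t * (m2 - m1), s1 + t * (s2 - s1)

def mfe_p_value(mfe, gc):
    """P(random sequence folds at least this low): left-tail normal probability."""
    mu, sigma = interpolate_params(gc)
    return normal_cdf(mfe, mu, sigma)

p = mfe_p_value(-40.0, 0.45)   # candidate with unusually low MFE
```

A small P-value flags a candidate whose structure is more stable than expected from randomized sequences of the same composition.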
Li, Jin; Tran, Maggie; Siwabessy, Justy
2016-01-01
Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia’s marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy, and can be inferred from underwater video footage at only a limited number of locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models of seabed hardness using random forest (RF) based on point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods, namely variable importance (VI), averaged variable importance (AVI), knowledge-informed AVI (KIAVI), Boruta and regularized RF (RRF), were tested based on predictive accuracy. Effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach of pre-selecting predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods should select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to ‘small p and large n’ problems in environmental sciences. Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency and
Lenzi, Tathiane Larissa; Pires, Carine Weber; Soares, Fabio Zovico Maxnuck; Raggio, Daniela Prócida; Ardenghi, Thiago Machado; de Oliveira Rocha, Rachel
2017-09-15
To evaluate the 18-month clinical performance of a universal adhesive, applied under different adhesion strategies, after selective carious tissue removal in primary molars. Forty-four subjects (five to 10 years old) contributed 90 primary molars presenting moderately deep dentin carious lesions on occlusal or occluso-proximal surfaces, which were randomly assigned to either the self-etch or the etch-and-rinse protocol of Scotchbond Universal Adhesive (3M ESPE). Resin composite was incrementally inserted for all restorations. Restorations were evaluated at one, six, 12, and 18 months using the modified United States Public Health Service criteria. Survival estimates for restorations' longevity were evaluated using the Kaplan-Meier method, and multivariate Cox regression analysis with shared frailty was used to assess the factors associated with failures. Adhesion strategy did not influence the restorations' longevity (P=0.06; 72.2 percent and 89.7 percent with etch-and-rinse and self-etch modes, respectively). Self-etch and etch-and-rinse strategies did not influence the clinical behavior of the universal adhesive used in primary molars after selective carious tissue removal, although there was a tendency for better outcomes with the self-etch strategy.
Ma, Xin; Guo, Jing; Sun, Xiao
2016-01-01
DNA-binding proteins are fundamentally important in cellular processes. Several computational methods have been developed in recent years to improve the prediction of DNA-binding proteins. However, insufficient work has been done on predicting DNA-binding proteins from protein sequence information. In this paper, a novel predictor, DNABP (DNA-binding proteins), was designed to predict DNA-binding proteins using the random forest (RF) classifier with a hybrid feature. The hybrid feature contains two types of novel sequence features, which reflect information about the conservation of physicochemical properties of the amino acids, the binding propensity of DNA-binding residues, and the non-binding propensity of non-binding residues. Comparisons with each feature demonstrated that these two novel features contributed most to the improvement in predictive ability. Furthermore, to improve the prediction performance of the DNABP model, feature selection using the minimum redundancy maximum relevance (mRMR) method combined with incremental feature selection (IFS) was carried out during model construction. The results showed that the DNABP model could achieve 86.90% accuracy, 83.76% sensitivity, 90.03% specificity and a Matthews correlation coefficient of 0.727. High prediction accuracy and performance comparisons with previous research suggested that DNABP could be a useful approach to identify DNA-binding proteins from sequence information. The DNABP web server system is freely available at http://www.cbi.seu.edu.cn/DNABP/.
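The incremental feature selection (IFS) loop used during model construction can be sketched generically: features are added one at a time in ranked (e.g. mRMR) order, and the prefix with the best evaluation score is kept. The scorer and its values below are invented for illustration; a real run would train an RF classifier on each candidate subset.

```python
def incremental_feature_selection(ranked_features, evaluate):
    """Incremental feature selection (IFS): grow the feature set in ranked
    order and keep the prefix that scores best under `evaluate`."""
    best_score, best_subset = float("-inf"), []
    subset = []
    for f in ranked_features:
        subset.append(f)
        score = evaluate(subset)
        if score > best_score:
            best_score, best_subset = score, list(subset)
    return best_subset, best_score

# Toy scorer: accuracy improves up to the 3rd feature, then extra
# features only add noise (values are illustrative).
scores = {1: 0.70, 2: 0.80, 3: 0.87, 4: 0.85, 5: 0.84}
evaluate = lambda subset: scores[len(subset)]

subset, score = incremental_feature_selection(
    ["f1", "f2", "f3", "f4", "f5"], evaluate)
```

Here the search stops growing the kept subset once additional features no longer improve the score.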
Rochman, Auliya Noor; Prasetyo, Hari; Nugroho, Munajat Tri
2017-06-01
The Vehicle Routing Problem (VRP) often occurs when manufacturers need to distribute their products to customers/outlets. The distribution process is typically restricted by the capacity of the vehicle and the working hours at the distributor. This type of VRP is also known as the Capacitated Vehicle Routing Problem with Time Windows (CVRPTW). A Biased Random Key Genetic Algorithm (BRKGA) was designed and coded in MATLAB to solve a CVRPTW case of soft drink distribution. The standard BRKGA was then modified by applying chromosome insertion into the initial population and defining chromosome gender for parents undergoing the crossover operation. The performance of the established algorithms was then compared to a heuristic procedure for solving soft drink distribution. Some findings are revealed: (1) the total distribution cost of BRKGA with insertion (BRKGA-I) results in a cost saving of 39% compared to the total cost of the heuristic method; (2) BRKGA with gender selection (BRKGA-GS) could further improve on the heuristic method, although it tends to yield worse results than the standard BRKGA.
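A minimal sketch of how a BRKGA chromosome of random keys can be decoded into capacitated routes. The instance data are invented, and the published algorithm also handles time windows and distance costs, which are omitted here:

```python
def decode_random_keys(keys, demands, capacity):
    """Decode a BRKGA chromosome (one random key per customer) into CVRP
    routes: sort customers by key, then split greedily by vehicle capacity."""
    order = sorted(range(len(keys)), key=lambda i: keys[i])
    routes, route, load = [], [], 0.0
    for cust in order:
        if load + demands[cust] > capacity and route:
            routes.append(route)      # vehicle is full: start a new route
            route, load = [], 0.0
        route.append(cust)
        load += demands[cust]
    if route:
        routes.append(route)
    return routes

# Hypothetical instance: 6 customers, small demands, vehicle capacity 10.
keys = [0.91, 0.12, 0.55, 0.34, 0.77, 0.05]
demands = [4, 3, 5, 2, 6, 3]
routes = decode_random_keys(keys, demands, capacity=10)
```

Because any vector of keys in [0, 1) decodes to a feasible route set, the genetic operators (crossover, mutation, insertion) never produce invalid solutions.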
Directory of Open Access Journals (Sweden)
Xin Ma
2015-01-01
Full Text Available The prediction of RNA-binding proteins is one of the most challenging problems in computational biology. Although some studies have investigated this problem, the accuracy of prediction is still not sufficient. In this study, a highly accurate method was developed to predict RNA-binding proteins from amino acid sequences using random forests with the minimum redundancy maximum relevance (mRMR) method, followed by incremental feature selection (IFS). We incorporated conjoint triad features and three novel features: binding propensity (BP), nonbinding propensity (NBP), and evolutionary information combined with physicochemical properties (EIPP). The results showed that these novel features have important roles in improving the performance of the predictor. Using the mRMR-IFS method, our predictor achieved the best performance (86.62% accuracy and a 0.737 Matthews correlation coefficient). High prediction accuracy and successful prediction performance suggest that our method can be a useful approach to identify RNA-binding proteins from sequence information.
Zhou, Fuqun; Zhang, Aining
2016-10-25
Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS) datasets including 8-day composites from NASA and 10-day composites from the Canada Centre for Remote Sensing (CCRS). It is challenging to use these time-series MODIS datasets efficiently for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will become greater when Sentinel 2-3 data become available. Another challenge that researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle these two important issues with a case study of land cover mapping using CCRS 10-day MODIS composites with the help of two features of Random Forests: variable importance and outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is utilized for transferring sample data available from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that, using only about half of the variables, we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transferring could make supervised modelling possible for applications lacking sample data.
De Crescenzo, Franco; Perelli, Federica; Armando, Marco; Vicari, Stefano
2014-01-01
The treatment of postpartum depression with selective serotonin reuptake inhibitors (SSRIs) has been claimed to be both efficacious and well tolerated, but no recent systematic reviews have been conducted. A qualitative systematic review of randomized clinical trials on women with postpartum depression comparing SSRIs to placebo and/or other treatments was performed. A comprehensive literature search of online databases, the bibliographies of published articles and grey literature were conducted. Data on efficacy, acceptability and tolerability were extracted and the quality of the trials was assessed. Six randomised clinical trials, comprising 595 patients, met quality criteria for inclusion in the analysis. Cognitive-behavioural intervention, psychosocial community-based intervention, psychodynamic therapy, cognitive behavioural therapy, a second-generation tricyclic antidepressant and placebo were used as comparisons. All studies demonstrated higher response and remission rates among those treated with SSRIs and greater mean changes on depression scales, although findings were not always statistically significant. Dropout rates were high in three of the trials but similar among treatment and comparison groups. In general, SSRIs were well tolerated and trial quality was good. There are few trials, patients included in the trials were not representative of all patients with postpartum depression, dropout rates in three trials were high, and long-term efficacy and tolerability were assessed in only two trials. SSRIs appear to be efficacious and well tolerated in the treatment of postpartum depression, but the available evidence fails to demonstrate a clear superiority over other treatments. © 2013 Elsevier B.V. All rights reserved.
Newton, Nicola C.; Conrod, Patricia J.; Slade, Tim; Carragher, Natacha; Champion, Katrina E.; Barrett, Emma L.; Kelly, Erin V.; Nair, Natasha K.; Stapinski, Lexine; Teesson, Maree
2016-01-01
Background: This study investigated the long-term effectiveness of Preventure, a selective personality-targeted prevention program, in reducing the uptake of alcohol, harmful use of alcohol, and alcohol-related harms over a 3-year period. Methods: A cluster randomized controlled trial was conducted to assess the effectiveness of Preventure.…
Directory of Open Access Journals (Sweden)
Mohammad Parsa Mahjob
2011-06-01
Full Text Available Background and objective: Medical record documentation is often used to protect patients' legal rights, and also provides information for medical research, general studies, education of health care staff, and qualitative surveys. There is a need to control the amount of data entered in patients' medical record sheets, considering that these sheets are often completed after service delivery to the patient. Therefore, this study analyzed the completeness of medical history, operation report, and physician order sheets by different documenters in Jahrom teaching hospitals during 2009. Methods and Materials: In this descriptive, retrospective study, 400 medical record sheets of patients from the two teaching hospitals affiliated with Jahrom Medical University were randomly selected. The data collection tool was a checklist based on the content of the medical history, operation report, and physician order sheets. The data were analyzed with SPSS (version 10) and Microsoft Office Excel 2003. Results: The average amount of personal (demographic) data entered in the medical history, physician order, and operation report sheets by departments' secretaries was 32.9, 35.8, and 40.18 percent, respectively. The average amount of clinical data entered by physicians in the medical history sheet was 38 percent. Surgical data entered by the surgeon in the operation report sheet reached 94.77 percent. The average amount of data entered by the operating room's nurse in the operation report sheet was 36.78 percent, and the average amount of physician order data entered by physicians in the physician order sheet was 99.3 percent. Conclusion: According to this study, the completeness of the record sheets reviewed in Jahrom teaching hospitals was not desirable and in some cases was very weak and incomplete. This deficiency was due to different reasons such as documenters' negligence, lack of adequate education for documenters, high work
Löfgren, Stefan; Fröberg, Mats; Yu, Jun; Nisell, Jakob; Ranneby, Bo
2014-12-01
From a policy perspective, it is important to understand forestry effects on surface waters from a landscape perspective. The EU Water Framework Directive demands remedial actions when good ecological status is not achieved. In Sweden, 44% of the surface water bodies have moderate ecological status or worse. Many of these drain catchments with a mosaic of managed forests. It is important for the forestry sector and water authorities to be able to identify where, in the forested landscape, special precautions are necessary. The aim of this study was to quantify the relations between forestry parameters and headwater stream concentrations of nutrients, organic matter and acid-base chemistry. The results are put into the context of regional climate, sulphur and nitrogen deposition, as well as marine influences. Water chemistry was measured in 179 randomly selected headwater streams from two regions in southwest and central Sweden, corresponding to 10% of the Swedish land area. Forest status was determined from satellite images and Swedish National Forest Inventory data using the probabilistic classifier method, which was used to model stream water chemistry with Bayesian model averaging. The results indicate that concentrations of e.g. nitrogen, phosphorus and organic matter are related to factors associated with forest production, but that it is not forestry per se that causes the excess losses. Instead, factors simultaneously affecting forest production and stream water chemistry, such as climate, extensive soil pools and nitrogen deposition, are the most likely candidates. The relationships with clear-felled and wetland areas are likely to be direct effects.
Simuta-Champo, R.; Herrera-Zamarrón, G. S.
2010-01-01
The Monte Carlo technique provides a natural method for evaluating uncertainties. The uncertainty is represented by a probability distribution or by related quantities such as statistical moments. When the groundwater flow and transport governing equations are solved and the hydraulic conductivity field is treated as a random spatial function, the hydraulic head, velocities and concentrations also become random spatial functions. When that is the case, for the stochastic simulation of groundw...
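The Monte Carlo procedure described here can be sketched with a toy stand-in for the flow model. The lognormal conductivity parameters and the specific-discharge formula below are illustrative assumptions, not the study's actual groundwater model:

```python
import random
import statistics

def monte_carlo_moments(simulate, sample_input, n=5000, seed=42):
    """Propagate input uncertainty through a model by brute-force sampling
    and summarize the output by its first two statistical moments."""
    rng = random.Random(seed)
    outputs = [simulate(sample_input(rng)) for _ in range(n)]
    return statistics.mean(outputs), statistics.pstdev(outputs)

# Toy stand-in for a groundwater model: specific discharge q = K * i,
# with lognormal hydraulic conductivity K and a fixed gradient i = 0.01.
sample_K = lambda rng: rng.lognormvariate(mu=0.0, sigma=0.5)
simulate = lambda K: K * 0.01

mean_q, std_q = monte_carlo_moments(simulate, sample_K)
```

When conductivity is a random spatial function rather than a scalar, each realization would be a full field and the simulator a flow/transport solver, but the moment estimation is the same.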
Directory of Open Access Journals (Sweden)
Himmelreich Uwe
2009-07-01
Full Text Available Abstract Background Regularized regression methods such as principal component or partial least squares regression perform well in learning tasks on high dimensional spectral data, but cannot explicitly eliminate irrelevant features. The random forest classifier with its associated Gini feature importance, on the other hand, allows for an explicit feature elimination, but may not be optimally adapted to spectral data due to the topology of its constituent classification trees, which are based on orthogonal splits in feature space. Results We propose to combine the best of both approaches, and evaluated the joint use of a feature selection based on a recursive feature elimination using the Gini importance of random forests together with regularized classification methods on spectral data sets from medical diagnostics, chemotaxonomy, biomedical analytics, food science, and synthetically modified spectral data. Here, a feature selection using the Gini feature importance with a regularized classification by discriminant partial least squares regression performed as well as or better than a filtering according to different univariate statistical tests, or using regression coefficients in a backward feature elimination. It outperformed the direct application of the random forest classifier, or the direct application of the regularized classifiers on the full set of features. Conclusion The Gini importance of the random forest provided superior means for measuring feature relevance on spectral data, but – on an optimal subset of features – the regularized classifiers might be preferable over the random forest classifier, in spite of their limitation to model linear dependencies only. A feature selection based on Gini importance, however, may precede a regularized linear classification to identify this optimal subset of features, and to earn a double benefit of both dimensionality reduction and the elimination of noise from the classification task.
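The recursive feature elimination driven by importance scores can be sketched generically. The importance oracle below returns fixed, invented scores in place of a random forest's Gini importances, which in a real run would be recomputed on each reduced feature set:

```python
def recursive_feature_elimination(features, importance, n_keep):
    """Backward elimination: repeatedly drop the least important feature
    (by the supplied importance scores) until `n_keep` features remain."""
    kept = list(features)
    while len(kept) > n_keep:
        scores = importance(kept)                 # one score per kept feature
        worst = min(range(len(kept)), key=lambda i: scores[i])
        kept.pop(worst)
    return kept

# Toy importance oracle: fixed per-feature relevance, standing in for the
# Gini importances a random forest would produce (illustrative values).
relevance = {"a": 0.40, "b": 0.05, "c": 0.30, "d": 0.10, "e": 0.15}
importance = lambda kept: [relevance[f] for f in kept]

selected = recursive_feature_elimination(list(relevance), importance, n_keep=2)
```

The surviving subset would then be handed to the regularized classifier, as the abstract's two-stage scheme describes.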
Energy Technology Data Exchange (ETDEWEB)
Peabody, David S.; Chackerian, Bryce; Ashley, Carlee; Carnes, Eric; Negrete, Oscar
2017-01-24
The invention relates to virus-like particles of bacteriophage MS2 (MS2 VLPs) displaying peptide epitopes or peptide mimics of epitopes of the Nipah Virus envelope glycoprotein that elicit an immune response against Nipah Virus upon vaccination of humans or animals. Affinity selection on Nipah Virus-neutralizing monoclonal antibodies using random sequence peptide libraries on MS2 VLPs selected peptides with sequence similarity to peptide sequences found within the envelope glycoprotein of Nipah itself, thus identifying the epitopes the antibodies recognize. The selected peptide sequences themselves are not necessarily identical in all respects to a sequence within the Nipah Virus glycoprotein, and therefore may be referred to as epitope mimics. VLPs displaying these epitope mimics can serve as a vaccine. On the other hand, display of the corresponding wild-type sequence derived from Nipah Virus, corresponding to the epitope mapped by affinity selection, may also be used as a vaccine.
DEFF Research Database (Denmark)
Macdonald, Thomas M; Hawkey, Chris J; Ford, Ian
2017-01-01
BACKGROUND: Selective cyclooxygenase-2 inhibitors and conventional non-selective non-steroidal anti-inflammatory drugs (nsNSAIDs) have been associated with adverse cardiovascular (CV) effects. We compared the CV safety of switching to celecoxib vs. continuing nsNSAID therapy in a European setting...
Goh, David S.
1979-01-01
The advantages of using psychometric theory to design short forms of intelligence tests are demonstrated by comparing such usage to a systematic random procedure that has previously been used. The Wechsler Intelligence Scale for Children Revised (WISC-R) Short Form is presented as an example. (JKS)
Chakrabarti, Rajashri
2009-01-01
This paper analyzes the effect of school vouchers on student sorting - defined as a flight to private schools by high-income and committed public-school students - and whether vouchers can be designed to reduce or eliminate it. Much of the existing literature investigates sorting in cases where private schools can screen students. However, publicly funded U.S. voucher programs require a private school to accept all students unless it is oversubscribed and to pick students randomly if it is ov...
Muñoz, Irene; Henriques, Dora; Jara, Laura; Johnston, J Spencer; Chávez-Galarza, Julio; De La Rúa, Pilar; Pinto, M Alice
2017-07-01
The honeybee (Apis mellifera) has been threatened by multiple factors including pests and pathogens, pesticides and loss of locally adapted gene complexes due to replacement and introgression. In western Europe, the genetic integrity of the native A. m. mellifera (M-lineage) is endangered due to trading and intensive queen breeding with commercial subspecies of eastern European ancestry (C-lineage). Effective conservation actions require reliable molecular tools to identify pure-bred A. m. mellifera colonies. Microsatellites have been preferred for identification of A. m. mellifera stocks across conservation centres. However, owing to high throughput, easy transferability between laboratories and low genotyping error, SNPs promise to become popular. Here, we compared the resolving power of a widely utilized microsatellite set to detect structure and introgression with that of different sets that combine a variable number of SNPs selected for their information content and genomic proximity to the microsatellite loci. Contrary to every SNP data set, microsatellites did not discriminate between the two lineages in the PCA space. Mean introgression proportions were identical across the two marker types, although at the individual level, microsatellites' performance was relatively poor at the upper range of Q-values, a result reflected by their lower precision. Our results suggest that SNPs are more accurate and powerful than microsatellites for identification of A. m. mellifera colonies, especially when they are selected by information content. © 2016 John Wiley & Sons Ltd.
Altenburg, Teatske M; Chinapaw, Mai J M; Singh, Amika S
2016-10-01
Evidence suggests that physical activity is positively related to several aspects of cognitive functioning in children, among which is selective attention. To date, no information is available on the optimal frequency of physical activity on cognitive functioning in children. The current study examined the acute effects of one and two bouts of moderate-intensity physical activity on children's selective attention. Randomized controlled trial (ISRCTN97975679). Thirty boys and twenty-six girls, aged 10-13 years, were randomly assigned to three conditions: (A) sitting all morning working on simulated school tasks; (B) one 20-min physical activity bout after 90min; and (C) two 20-min physical activity bouts, i.e. at the start and after 90min. Selective attention was assessed at five time points during the morning (i.e. at baseline and after 20, 110, 130 and 220min), using the 'Sky Search' subtest of the 'Test of Selective Attention in Children'. We used GEE analysis to examine differences in Sky Search scores between the three experimental conditions, adjusting for school, baseline scores, self-reported screen time and time spent in sports. Children who performed two 20-min bouts of moderate-intensity physical activity had significantly better Sky Search scores compared to children who performed one physical activity bout or remained seated the whole morning (B=-0.26; 95% CI=[-0.52; -0.00]). Our findings support the importance of repeated physical activity during the school day for beneficial effects on selective attention in children. Copyright © 2015 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
International Nuclear Information System (INIS)
Wandiga, S.O.; Jumba, I.O.
1982-01-01
An intercomparative analysis of the concentrations of the heavy metals zinc, cadmium, lead, copper, mercury, iron and calcium in head hair of a randomly selected sample of Kenyan people was undertaken using the techniques of atomic absorption spectrophotometry (AAS) and differential pulse anodic stripping voltammetry (DPAS). The percent relative standard deviation for each sample analysed using either technique shows good sensitivity and correlation between the techniques. The DPAS was found to be slightly more sensitive than the AAS instrument used. The recalculated body burden ratios of Cd to Zn and Pb to Fe reveal no unusual health impairment symptoms and suggest a relatively clean environment in Kenya. (author)
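The percent relative standard deviation used to compare the precision of the two techniques is straightforward to compute. The replicate readings below are invented for illustration and do not come from the study:

```python
import statistics

def percent_rsd(values):
    """Percent relative standard deviation (%RSD) of replicate measurements,
    a standard figure of merit for comparing analytical techniques."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical replicate zinc readings (ppm) for one hair sample.
aas = [182.0, 185.0, 179.0, 183.0]     # atomic absorption spectrophotometry
dpas = [181.0, 182.0, 180.5, 181.5]    # anodic stripping voltammetry

rsd_aas = percent_rsd(aas)
rsd_dpas = percent_rsd(dpas)
```

A lower %RSD for one technique on the same sample indicates tighter replicate precision.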
Directory of Open Access Journals (Sweden)
Cortellini M
2017-07-01
Full Text Available Mauro Cortellini, Franco Berrino, Patrizia Pasanisi Department of Preventive & Predictive Medicine, Foundation IRCCS National Cancer Institute of Milan, Milan, Italy Abstract: Among randomized controlled trials (RCTs), trials for primary prevention require large samples and long follow-up to obtain a high-quality outcome; therefore the recruitment process and the drop-out rates largely dictate the adequacy of the results. We are conducting a Phase III trial on persons with metabolic syndrome to test the hypothesis that comprehensive lifestyle changes and/or metformin treatment prevent age-related chronic diseases (the MeMeMe trial, EudraCT number: 2012-005427-32, also registered on ClinicalTrials.gov [NCT02960711]). Here, we briefly analyze and discuss the reasons which may lead to participants dropping out of trials. In our experience, participants may back out of a trial for different reasons. Drug-induced side effects are certainly the most compelling reason. But what are the other reasons, relating to the participants’ perception of the progress of the trial, which lead them to withdraw after randomization? What about the time-dependent drop-out rate in primary prevention trials? The primary outcome of this analysis is the point of drop-out from the trial, defined as the time from the randomization date to the withdrawal date. Survival functions were non-parametrically estimated using the product-limit estimator. The curves were statistically compared using the log-rank test (P=0.64, not significant). Researchers involved in primary prevention RCTs seem to have to deal with the paradox of the proverbial “short blanket syndrome”. Recruiting only highly motivated candidates might be useful for the smooth progress of the trial but it may lead to a very low enrollment rate. On the other hand, what about enrolling all the eligible subjects without considering their motivation? This might boost the enrollment rate, but it can lead to biased results on account of large proportions of drop-outs.
Cortellini, Mauro; Berrino, Franco; Pasanisi, Patrizia
2017-01-01
Among randomized controlled trials (RCTs), trials for primary prevention require large samples and long follow-up to obtain a high-quality outcome; therefore the recruitment process and the drop-out rates largely dictate the adequacy of the results. We are conducting a Phase III trial on persons with metabolic syndrome to test the hypothesis that comprehensive lifestyle changes and/or metformin treatment prevents age-related chronic diseases (the MeMeMe trial, EudraCT number: 2012-005427-32, also registered on ClinicalTrials.gov [NCT02960711]). Here, we briefly analyze and discuss the reasons which may lead to participants dropping out from trials. In our experience, participants may back out of a trial for different reasons. Drug-induced side effects are certainly the most compelling reason. But what are the other reasons, relating to the participants' perception of the progress of the trial which led them to withdraw after randomization? What about the time-dependent drop-out rate in primary prevention trials? The primary outcome of this analysis is the point of drop-out from trial, defined as the time from the randomization date to the withdrawal date. Survival functions were non-parametrically estimated using the product-limit estimator. The curves were statistically compared using the log-rank test (P=0.64, not significant). Researchers involved in primary prevention RCTs seem to have to deal with the paradox of the proverbial "short blanket syndrome". Recruiting only highly motivated candidates might be useful for the smooth progress of the trial but it may lead to a very low enrollment rate. On the other hand, what about enrolling all the eligible subjects without considering their motivation? This might boost the enrollment rate, but it can lead to biased results on account of large proportions of drop-outs. Our experience suggests that participants do not change their mind depending on the allocation group (intervention or control). There is no single
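The product-limit (Kaplan-Meier) estimator used for the drop-out curves can be sketched in a few lines. The drop-out times below are hypothetical, purely to show the mechanics:

```python
def product_limit(times, events):
    """Kaplan-Meier (product-limit) survival estimate.
    `times`  : follow-up time for each participant
    `events` : 1 if a drop-out happened at that time, 0 if censored
    Returns [(t, S(t))] at each observed drop-out time."""
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])
    survival, s, at_risk = [], 1.0, n
    i = 0
    while i < n:
        t = times[order[i]]
        d = 0                       # drop-outs exactly at time t
        m = 0                       # all subjects leaving the risk set at t
        while i < n and times[order[i]] == t:
            d += events[order[i]]
            m += 1
            i += 1
        if d:
            s *= 1.0 - d / at_risk  # product-limit update
            survival.append((t, s))
        at_risk -= m
    return survival

# Hypothetical drop-out data (months in trial; event 0 = still enrolled).
times = [2, 3, 3, 5, 8, 8, 10, 12]
events = [1, 1, 0, 1, 1, 0, 0, 0]
curve = product_limit(times, events)
```

Two such curves (e.g. intervention vs. control) would then be compared with a log-rank test, as in the abstract.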
Zer, Alona; Prince, Rebecca M; Amir, Eitan; Abdul Razak, Albiruni
2016-05-01
Randomized controlled trials (RCTs) in soft tissue sarcoma (STS) have used varying end points. The surrogacy of intermediate end points, such as progression-free survival (PFS), response rate (RR), and 3-month and 6-month PFS (3moPFS and 6moPFS) with overall survival (OS), remains unknown. The quality of efficacy and toxicity reporting in these studies is also uncertain. A systematic review of systemic therapy RCTs in STS was performed. Surrogacy between intermediate end points and OS was explored using weighted linear regression for the hazard ratio for OS with the hazard ratio for PFS or the odds ratio for RR, 3moPFS, and 6moPFS. The quality of reporting for efficacy and toxicity was also evaluated. Fifty-two RCTs published between 1974 and 2014, comprising 9,762 patients, met the inclusion criteria. There were significant correlations between PFS and OS (R = 0.61) and between RR and OS (R = 0.51). Conversely, there were nonsignificant correlations between 3moPFS and 6moPFS with OS. A reduction in the use of RR as the primary end point was observed over time, favoring time-based events (P for trend = .02). In 14% of RCTs, the primary end point was not met, but the study was reported as being positive. Toxicity was comprehensively reported in 47% of RCTs, whereas 14% inadequately reported toxicity. In advanced STS, PFS and RR seem to be appropriate surrogates for OS. There is poor correlation between OS and both 3moPFS and 6moPFS. As such, caution is urged with the use of these as primary end points in randomized STS trials. The quality of toxicity reporting and interpretation of results is suboptimal. © 2016 by American Society of Clinical Oncology.
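The weighted linear regression used to assess surrogacy (relating per-trial treatment effects on the surrogate to effects on OS) can be sketched as follows. The per-trial log hazard ratios and weights are invented for illustration:

```python
def weighted_linear_regression(x, y, w):
    """Weighted least squares fit y ~ a + b*x, with weights w (e.g. trial
    sizes), plus the weighted correlation r between x and y."""
    sw = sum(w)
    xm = sum(wi * xi for wi, xi in zip(w, x)) / sw
    ym = sum(wi * yi for wi, yi in zip(w, y)) / sw
    sxy = sum(wi * (xi - xm) * (yi - ym) for wi, xi, yi in zip(w, x, y))
    sxx = sum(wi * (xi - xm) ** 2 for wi, xi in zip(w, x))
    syy = sum(wi * (yi - ym) ** 2 for wi, yi in zip(w, y))
    b = sxy / sxx
    a = ym - b * xm
    r = sxy / (sxx * syy) ** 0.5
    return a, b, r

# Illustrative per-trial log hazard ratios (surrogate PFS vs OS),
# weighted by hypothetical trial sizes.
log_hr_pfs = [-0.40, -0.10, -0.60, 0.05, -0.25]
log_hr_os = [-0.30, -0.05, -0.45, 0.10, -0.20]
weights = [120, 80, 200, 60, 150]

a, b, r = weighted_linear_regression(log_hr_pfs, log_hr_os, weights)
```

A weighted correlation near 1 across trials supports the surrogate; values like the R = 0.61 reported above indicate a moderate association.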
Zhao, Qiang; Wang, Hanlin; Ni, Zhenjie; Liu, Jie; Zhen, Yonggang; Zhang, Xiaotao; Jiang, Lang; Li, Rongjin; Dong, Huanli; Hu, Wenping
2017-09-01
Organic electronics based on the poly(vinylidenefluoride/trifluoroethylene) (P(VDF-TrFE)) dielectric faces great challenges in flexible circuits. As an indispensable part of integrated circuits, nonvolatile memory devices are in urgent demand, ideally low-cost and easy to fabricate. A breakthrough is made on a novel ferroelectric random access memory cell (1T1T FeRAM cell) consisting of one selection transistor and one ferroelectric memory transistor, designed to overcome the half-selection problem. Unlike complicated manufacturing using multiple dielectrics, this system simplifies 1T1T FeRAM cell fabrication by using one common dielectric. To achieve this goal, a strategy for semiconductor/insulator (S/I) interface modulation is put forward and applied to nonhysteretic selection transistors with high performance for driving or addressing purposes. As a result, a high average hole mobility of 3.81 cm² V⁻¹ s⁻¹ for 2,6-diphenylanthracene (DPA) and an average electron mobility of 0.124 cm² V⁻¹ s⁻¹ for N,N'-1H,1H-perfluorobutyl dicyanoperylenecarboxydiimide (PDI-FCN₂) are obtained in selection transistors. In this work, we demonstrate this technology's potential for organic ferroelectric-based pixelated memory module fabrication. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Directory of Open Access Journals (Sweden)
Bongyeun Koh
2016-01-01
Full Text Available Purpose: The goal of this study was to characterize the difficulty index of the items in the skills test components of the class I and II Korean emergency medical technician licensing examination (KEMTLE), which requires examinees to select items randomly. Methods: The results of 1,309 class I KEMTLE examinations and 1,801 class II KEMTLE examinations in 2013 were subjected to analysis. Items from the basic and advanced skills test sections of the KEMTLE were compared to determine whether some were significantly more difficult than others. Results: In the class I KEMTLE, all 4 of the items on the basic skills test showed significant variation in difficulty index (P<0.01), as well as 4 of the 5 items on the advanced skills test (P<0.05). In the class II KEMTLE, 4 of the 5 items on the basic skills test showed significantly different difficulty indices (P<0.01), as well as all 3 of the advanced skills test items (P<0.01). Conclusion: In the skills test components of the class I and II KEMTLE, the procedure in which examinees randomly select questions should be revised to require examinees to respond to a set of fixed items in order to improve the reliability of the national licensing examination.
Koh, Bongyeun; Hong, Sunggi; Kim, Soon-Sim; Hyun, Jin-Sook; Baek, Milye; Moon, Jundong; Kwon, Hayran; Kim, Gyoungyong; Min, Seonggi; Kang, Gu-Hyun
2016-01-01
The goal of this study was to characterize the difficulty index of the items in the skills test components of the class I and II Korean emergency medical technician licensing examination (KEMTLE), which requires examinees to select items randomly. The results of 1,309 class I KEMTLE examinations and 1,801 class II KEMTLE examinations in 2013 were subjected to analysis. Items from the basic and advanced skills test sections of the KEMTLE were compared to determine whether some were significantly more difficult than others. In the class I KEMTLE, all 4 of the items on the basic skills test showed significant variation in difficulty index (P<0.01), as well as 4 of the 5 items on the advanced skills test (P<0.05). In the class II KEMTLE, 4 of the 5 items on the basic skills test showed significantly different difficulty index (P<0.01), as well as all 3 of the advanced skills test items (P<0.01). In the skills test components of the class I and II KEMTLE, the procedure in which examinees randomly select questions should be revised to require examinees to respond to a set of fixed items in order to improve the reliability of the national licensing examination.
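Whether pass rates (difficulty indices) differ significantly across randomly assigned items can be checked with a chi-square test on an items × pass/fail contingency table. A small sketch with hypothetical counts, not the KEMTLE data:

```python
def chi_square(table):
    """Pearson chi-square statistic for a 2D contingency table.

    Rows might be test items, columns pass/fail counts; a large statistic
    (relative to df = (rows-1)*(cols-1)) indicates unequal difficulty.
    """
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = rows[i] * cols[j] / total   # expected count under independence
            stat += (obs - exp) ** 2 / exp
    return stat

# Hypothetical pass/fail counts for two randomly selected items
table = [[10, 20], [20, 10]]   # rows: items; columns: pass, fail
stat = chi_square(table)       # ≈ 6.67 on 1 df: difficulty differs
```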
Cohen, Juliana F W; Richardson, Scott A; Cluggish, Sarah A; Parker, Ellen; Catalano, Paul J; Rimm, Eric B
2015-05-01
Little is known about the long-term effect of a chef-enhanced menu on healthier food selection and consumption in school lunchrooms. In addition, it remains unclear if extended exposure to other strategies to promote healthier foods (eg, choice architecture) also improves food selection or consumption. To evaluate the short- and long-term effects of chef-enhanced meals and extended exposure to choice architecture on healthier school food selection and consumption. A school-based randomized clinical trial was conducted during the 2011-2012 school year among 14 elementary and middle schools in 2 urban, low-income school districts (intent-to-treat analysis). Included in the study were 2638 students in grades 3 through 8 attending participating schools (38.4% of eligible participants). Schools were first randomized to receive a professional chef to improve school meal palatability (chef schools) or to a delayed intervention (control group). To assess the effect of choice architecture (smart café), all schools after 3 months were then randomized to the smart café intervention or to the control group. School food selection was recorded, and consumption was measured using plate waste methods. After 3 months, vegetable selection increased in chef vs control schools (odds ratio [OR], 1.75; 95% CI, 1.36-2.24), but there was no effect on the selection of other components or on meal consumption. After long-term or extended exposure to the chef or smart café intervention, fruit selection increased in the chef (OR, 3.08; 95% CI, 2.23-4.25), smart café (OR, 1.45; 95% CI, 1.13-1.87), and chef plus smart café (OR, 3.10; 95% CI, 2.26-4.25) schools compared with the control schools, and consumption increased in the chef schools (OR, 0.17; 95% CI, 0.03-0.30 cups/d). Vegetable selection increased in the chef (OR, 2.54; 95% CI, 1.83-3.54), smart café (OR, 1.91; 95% CI, 1.46-2.50), and chef plus smart café schools (OR, 7.38, 95% CI, 5.26-10.35) compared with the control schools
Cornelisse, Sandra; Joëls, Marian; Smeets, Tom
2011-12-01
Corticosteroids, released in high amounts after stress, exert their effects via two different receptors in the brain: glucocorticoid receptors (GRs) and mineralocorticoid receptors (MRs). GRs have a role in normalizing stress-induced effects and promoting consolidation, while MRs are thought to be important in determining the threshold for activation of the hypothalamic-pituitary-adrenal (HPA) axis. We investigated the effects of MR blockade on HPA axis responses to stress and stress-induced changes in cognitive function. In a double-blind, placebo-controlled study, 64 healthy young men received 400 mg of the MR antagonist spironolactone or placebo. After 1.5 h, they were exposed to either a Trier Social Stress Test or a non-stressful control task. Responses to stress were evaluated by hormonal, subjective, and physiological measurements. Afterwards, selective attention, working memory, and long-term memory performance were assessed. Spironolactone increased basal salivary cortisol levels as well as cortisol levels in response to stress. Furthermore, spironolactone significantly impaired selective attention, but only in the control group. The stress group receiving spironolactone showed impaired working memory performance. By contrast, long-term memory was enhanced in this group. These data support a role of MRs in the regulation of the HPA axis under basal conditions as well as in response to stress. The increased availability of cortisol after spironolactone treatment implies enhanced GR activation, which, in combination with MR blockade, presumably resulted in a decreased MR/GR activation ratio. This condition influences both selective attention and performance in various memory tasks.
Kandaswamy, Krishna Kumar Umar
2013-01-01
The extracellular matrix (ECM) is a major component of tissues of multicellular organisms. It consists of secreted macromolecules, mainly polysaccharides and glycoproteins. Malfunctions of ECM proteins lead to severe disorders such as marfan syndrome, osteogenesis imperfecta, numerous chondrodysplasias, and skin diseases. In this work, we report a random forest approach, EcmPred, for the prediction of ECM proteins from protein sequences. EcmPred was trained on a dataset containing 300 ECM and 300 non-ECM and tested on a dataset containing 145 ECM and 4187 non-ECM proteins. EcmPred achieved 83% accuracy on the training and 77% on the test dataset. EcmPred predicted 15 out of 20 experimentally verified ECM proteins. By scanning the entire human proteome, we predicted novel ECM proteins validated with gene ontology and InterPro. The dataset and standalone version of the EcmPred software is available at http://www.inb.uni-luebeck.de/tools-demos/Extracellular_matrix_proteins/EcmPred. © 2012 Elsevier Ltd.
Wilson, Norbert L W; Just, David R; Swigert, Jeffery; Wansink, Brian
2017-06-01
Food pantries and food banks are interested in cost-effective methods to encourage the selection of targeted foods without restricting choices. Thus, this study evaluates the effectiveness of nudges toward targeted foods. In October/November 2014, we manipulated the display of a targeted product in a New York State food pantry. We evaluated the binary choice of the targeted good when we placed it in the front or the back of the category line (placement order) and when we presented the product in its original box or unboxed (packaging). The average uptake proportion for the back treatment was 0.231, 95% CI = 0.179, 0.29, n = 205, and for the front treatment, the proportion was 0.337, 95% CI = 0.272, 0.406, n = 238 with an odds ratio of 1.688, 95% CI = 1.088, 2.523. The average uptake for the unboxed treatment was 0.224, 95% CI = 0.174, 0.280, n = 255, and for the boxed intervention, the proportion was 0.356, 95% CI = 0.288, 0.429, n = 188 with an odds ratio of 1.923, 95% CI = 1.237, 2.991. Nudges increased uptake of the targeted food. The findings also hold when we control for a potential confounder. Low cost and unobtrusive nudges can be effective tools for food pantry organizers to encourage the selection of targeted foods. NCT02403882. © The Author 2016. Published by Oxford University Press on behalf of Faculty of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
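The reported odds ratios can be approximately recovered from the uptake proportions themselves: an odds ratio is simply the ratio of the two groups' odds p/(1-p). A small sketch; because the published proportions are rounded, the results differ slightly from the reported 1.688 and 1.923:

```python
def odds_ratio(p_treat, p_control):
    """Odds ratio comparing two uptake proportions."""
    odds_t = p_treat / (1 - p_treat)
    odds_c = p_control / (1 - p_control)
    return odds_t / odds_c

or_front = odds_ratio(0.337, 0.231)   # ≈ 1.69 (reported: 1.688)
or_boxed = odds_ratio(0.356, 0.224)   # ≈ 1.92 (reported: 1.923)
```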
Sugden, Nicole A.; Moulson, Margaret C.
2015-01-01
Psychological and developmental research has been critiqued for the lack of diversity of research samples. Because differences in culture, race, and ethnicity can influence participant behavior, limited diversity limits the generalizability of the findings. These differences may also affect how participants respond to recruitment attempts, which suggests that recruitment itself may be leveraged to increase sample diversity. The goal of the current study was to determine what factors, within a recruitment interaction, could be leveraged to increase success and diversity when recruiting families with children for developmental research. Study 1 found that three factors influenced success: (1) recruitment was more successful when other potential participants were also interested (i.e., recruiters were busy), (2) recruiters of particular races were more successful than recruiters of other races, and (3) differences in success were related to what the recruiter said to engage the potential participant (i.e., the script). The latter two factors interacted, suggesting some recruiters were using less optimal scripts. To improve success rates, study 2 randomly assigned scripts to recruiters and encouraged them to recruit more vigorously during busy periods. Study 2 found that two factors influenced success: (1) some scripts were more successful than others and (2) we were more successful at recruiting non-White potential participants than White participants. These two factors interacted, with some scripts being more successful with White families and other scripts more successful with non-White families. This intervention significantly increased the recruitment success rate by 8.1% and the overall number of families recruited by 15.3%. These findings reveal that empirically evaluating and tailoring recruitment efforts based on the most successful strategies is effective in boosting diversity through increased participation of children from non-White families. PMID:25972829
Hagaman, Ashley K; Khadka, S; Lohani, S; Kohrt, B
2017-12-01
Yearly, 600,000 people complete suicide in low- and middle-income countries, accounting for 75% of the world's burden of suicide mortality. The highest regional rates are in South and East Asia. Nepal has one of the highest suicide rates in the world; however, few investigations exploring patterns surrounding both male and female suicides exist. This study used psychological autopsies to identify common factors, precipitating events, and warning signs in a diverse sample. Randomly sampled from 302 police case reports over 24 months, psychological autopsies were conducted for 39 completed suicide cases in one urban and one rural region of Nepal. In the total police sample (n = 302), 57.0% of deaths were male. Over 40% of deaths were 25 years or younger, including 65% of rural and 50.8% of female suicide deaths. We estimate the crude urban and rural suicide rates to be 16.1 and 22.8 per 100,000, respectively. Within our psychological autopsy sample, 38.5% met criteria for depression and only 23.1% informants believed that the deceased had thoughts of self-harm or suicide before death. Important warning signs include recent geographic migration, alcohol abuse, and family history of suicide. Suicide prevention strategies in Nepal should account for the lack of awareness about suicide risk among family members and early age of suicide completion, especially in rural and female populations. Given the low rates of ideation disclosure to friends and family, educating the general public about other signs of suicide may help prevention efforts in Nepal.
Directory of Open Access Journals (Sweden)
Yuthavong Yongyuth
2011-05-01
Full Text Available Abstract Background The prevalence of drug resistance amongst the human malaria Plasmodium species has most commonly been associated with genomic mutation within the parasites. This phenomenon necessitates evolutionary predictive studies of possible resistance mutations, which may occur when a new drug is introduced. Therefore, identification of possible new Plasmodium falciparum dihydrofolate reductase (PfDHFR) mutants that confer resistance to antifolate drugs is essential in the process of antifolate anti-malarial drug development. Methods A system to identify mutations in the Pfdhfr gene that confer antifolate drug resistance was developed using an animal Plasmodium parasite model. Using error-prone PCR and Plasmodium transfection technologies, libraries of Pfdhfr mutants were generated and then episomally transfected into Plasmodium berghei parasites, from which pyrimethamine-resistant PfDHFR mutants were selected. Results The principal mutation found in this experiment was S108N, coincident with the first pyrimethamine-resistance mutation isolated from the field. A transgenic P. berghei, in which the endogenous Pbdhfr allele was replaced with the mutant PfdhfrS108N, was generated and confirmed to have a normal growth rate compared to the parental non-transgenic parasite, and also to confer resistance to pyrimethamine. Conclusion This study demonstrated the power of the transgenic P. berghei system to predict drug-resistant Pfdhfr mutations in an in vivo parasite/host setting. The system could be utilized for the identification of possible novel drug-resistant mutants that could arise against new antifolate compounds and for predicting the evolution of resistance mutations.
Tillman, Fred; Anning, David W.; Heilman, Julian A.; Buto, Susan G.; Miller, Matthew P.
2018-01-01
Elevated concentrations of dissolved-solids (salinity) including calcium, sodium, sulfate, and chloride, among others, in the Colorado River cause substantial problems for its water users. Previous efforts to reduce dissolved solids in upper Colorado River basin (UCRB) streams often focused on reducing suspended-sediment transport to streams, but few studies have investigated the relationship between suspended sediment and salinity, or evaluated which watershed characteristics might be associated with this relationship. Are there catchment properties that may help in identifying areas where control of suspended sediment will also reduce salinity transport to streams? A random forests classification analysis was performed on topographic, climate, land cover, geology, rock chemistry, soil, and hydrologic information in 163 UCRB catchments. Two random forests models were developed in this study: one for exploring stream and catchment characteristics associated with stream sites where dissolved solids increase with increasing suspended-sediment concentration, and the other for predicting where these sites are located in unmonitored reaches. Results of variable importance from the exploratory random forests models indicate that no simple source, geochemical process, or transport mechanism can easily explain the relationship between dissolved solids and suspended sediment concentrations at UCRB monitoring sites. Among the most important watershed characteristics in both models were measures of soil hydraulic conductivity, soil erodibility, minimum catchment elevation, catchment area, and the silt component of soil in the catchment. Predictions at key locations in the basin were combined with observations from selected monitoring sites, and presented in map-form to give a complete understanding of where catchment sediment control practices would also benefit control of dissolved solids in streams.
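Variable importance in random forests is often assessed by permutation: scramble one predictor column and measure how much predictive accuracy drops. A minimal, model-agnostic sketch; the "model" and data are stand-ins, and the column is reversed rather than randomly shuffled so the example is deterministic:

```python
def permutation_importance(predict, X, y, col):
    """Accuracy drop after scrambling one predictor column.

    For reproducibility this sketch reverses the column instead of
    randomly shuffling it; the idea is the same.
    """
    def accuracy(rows):
        return sum(predict(r) == t for r, t in zip(rows, y)) / len(y)

    base = accuracy(X)
    scrambled = [row[col] for row in X][::-1]
    X_scr = [row[:col] + [v] + row[col + 1:] for row, v in zip(X, scrambled)]
    return base - accuracy(X_scr)

# Stand-in "model": predicts from feature 0 only
def predict(row):
    return 1 if row[0] > 0.5 else 0

X = [[i % 2, 0.0] for i in range(100)]   # feature 0 alternates; feature 1 constant
y = [i % 2 for i in range(100)]

imp0 = permutation_importance(predict, X, y, 0)   # large drop: feature 0 matters
imp1 = permutation_importance(predict, X, y, 1)   # no drop: feature 1 is ignored
```

Rankings like "soil hydraulic conductivity" or "catchment area" in the study above come from exactly this kind of per-predictor accuracy-loss comparison, averaged over the forest's trees.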
Sandrick, Janice; Tracy, Doreen; Eliasson, Arn; Roth, Ashley; Bartel, Jeffrey; Simko, Melanie; Bowman, Tracy; Harouse-Bell, Karen; Kashani, Mariam; Vernalis, Marina
2017-05-17
The college experience is often the first time when young adults live independently and make their own lifestyle choices. These choices affect dietary behaviors, exercise habits, techniques to deal with stress, and decisions on sleep time, all of which direct the trajectory of future health. There is a need for effective strategies that will encourage healthy lifestyle choices in young adults attending college. This preliminary randomized controlled trial tested the effect of coaching and text messages (short message service, SMS) on self-selected health behaviors in the domains of diet, exercise, stress, and sleep. A second analysis measured the ripple effect of the intervention on health behaviors not specifically selected as a goal by participants. Full-time students aged 18-30 years were recruited by word of mouth and campuswide advertisements (flyers, posters, mailings, university website) at a small university in western Pennsylvania from January to May 2015. Exclusions included pregnancy, eating disorders, chronic medical diagnoses, and prescription medications other than birth control. Of 60 participants, 30 were randomized to receive a single face-to-face meeting with a health coach to review results of behavioral questionnaires and to set a health behavior goal for the 8-week study period. The face-to-face meeting was followed by SMS text messages designed to encourage achievement of the behavioral goal. A total of 30 control subjects underwent the same health and behavioral assessments at intake and program end but did not receive coaching or SMS text messages. The texting app showed that 87.31% (2187/2505) of messages were viewed by intervention participants. Furthermore, 28 of the 30 intervention participants and all 30 control participants provided outcome data. Among intervention participants, 22 of 30 (73%) showed improvement in health behavior goal attainment, with the whole group (n=30) showing a mean improvement of 88% (95% CI 39-136). Mean
White, D. H.
1980-01-01
A general theory is presented for the origin of a self-replicating chemical system, termed an autogen, which is capable of both crude replication and translation (protein synthesis). The theory requires the availability of free energy and monomers to the system, a significant background low-yield synthesis of kinetically stable oligopeptides and oligonucleotides, the localization of the oligomers, crude oligonucleotide selectivity of amino acids during oligopeptide synthesis, crude oligonucleotide replication, and two short peptide families which catalyze replication and translation, to produce a localized group of at least one copy each of two protogenes and two protoenzymes. The model posits a process of random oligomerization, followed by the random nucleation of functional components and the rapid autocatalytic growth of the functioning autogen to macroscopic amounts, to account for the origin of the first self-replicating system. Such a process contains steps of such high probability and short time periods that it is suggested that the emergence of an autogen in a laboratory experiment of reasonable time scale may be possible.
Ballesteros, Soledad; Mayas, Julia; Prieto, Antonio; Ruiz-Marquez, Eloísa; Toril, Pilar; Reales, José M.
2017-01-01
Video game training with older adults potentially enhances aspects of cognition that decline with aging and could therefore offer a promising training approach. Although previously published studies suggest that training can produce transfer, many of them have certain shortcomings. This randomized controlled trial (RCT; Clinicaltrials.gov ID: NCT02796508) tried to overcome some of these limitations by incorporating an active control group and the assessment of motivation and expectations. Seventy-five older volunteers were randomly assigned to the experimental group, trained for 16 sessions with non-action video games from Lumosity, a commercial platform (http://www.lumosity.com/), or to an active control group trained for the same number of sessions with simulation strategy games. The final sample included 55 older adults (30 in the experimental group and 25 in the active control group). Participants were tested individually before and after training to assess working memory (WM) and selective attention, and also reported their perceived improvement, motivation and engagement. The results showed improved performance across the training sessions. The main results were: (1) the experimental group did not show greater improvements in measures of selective attention and working memory than the active control group (the opposite occurred in the oddball task); (2) a marginal training effect was observed for the N-back task, but not for the Stroop task, while both groups improved in the Corsi Blocks task. Based on these results, one can conclude that training with non-action games provides modest benefits for untrained tasks. The effect is not specific to that kind of training, as a similar effect was observed for strategy video games. Groups did not differ in motivation, engagement or expectations. PMID:29163136
Directory of Open Access Journals (Sweden)
Bridges JFP
2018-02-01
Full Text Available John FP Bridges,1,2 Norah L Crossnohere,2 Anne L Schuster,1 Judith A Miller,3 Carolyn Pastorini,3,† Rebecca A Aslakson2,4,5 1Department of Health Policy and Management, The Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, 2Department of Health, Behavior, and Society, The Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, 3Patient-Centered Outcomes Research Institute (PCORI) Project, Baltimore, MD, 4Department of Anesthesiology and Critical Care Medicine, The Johns Hopkins School of Medicine, Baltimore, MD, 5Armstrong Institute for Patient Safety and Quality, The Johns Hopkins School of Medicine, Baltimore, MD, USA †Carolyn Pastorini passed away on August 24, 2015. Background: Despite a movement toward patient-centered outcomes, best practices on how to gather and refine patients' perspectives on research endpoints are limited. Advanced care planning (ACP) is inherently patient centered and would benefit from patient prioritization of endpoints for ACP-related tools and studies. Objective: This investigation sought to prioritize patient-centered endpoints for the content and evaluation of an ACP video being developed for patients undergoing major surgery. We also sought to highlight an approach using complementary engagement and research strategies to document the priorities and preferences of patients and other stakeholders. Materials and methods: Endpoints identified from a previously published environmental scan were operationalized following rating by a caregiver co-investigator, refinement by a patient co-investigator, review by a stakeholder committee, and validation by patients and family members. Finalized endpoints were taken to a state fair, where members of the public who indicated that they or a loved one had undergone major surgery prioritized their most relevant endpoints and provided comments. Results: Of the initial 50 ACP endpoints identified from the review, 12 endpoints were selected for public
Allegra, Adolfo; Marino, Angelo; Volpes, Aldo; Coffaro, Francesco; Scaglione, Piero; Gullo, Salvatore; La Marca, Antonio
2017-04-01
The number of oocytes retrieved is a relevant intermediate outcome in women undergoing IVF/intracytoplasmic sperm injection (ICSI). This trial compared the efficiency of the selection of the FSH starting dose according to a nomogram based on multiple biomarkers (age, day 3 FSH, anti-Müllerian hormone) versus an age-based strategy. The primary outcome measure was the proportion of women with an optimal number of retrieved oocytes defined as 8-14. At their first IVF/ICSI cycle, 191 patients underwent a long gonadotrophin-releasing hormone agonist protocol and were randomized to receive a starting dose of recombinant (human) FSH, based on their age (150 IU if ≤35 years, 225 IU if >35 years) or based on the nomogram. Optimal response was observed in 58/92 patients (63%) in the nomogram group and in 42/99 (42%) in the control group (+21%, 95% CI = 0.07 to 0.35, P = 0.0037). No significant differences were found in the clinical pregnancy rate or the number of embryos cryopreserved per patient. The study showed that the FSH starting dose selected according to ovarian reserve is associated with an increase in the proportion of patients with an optimal response: large trials are recommended to investigate any possible effect on the live-birth rate. Copyright © 2017 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.
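The reported +21% difference in optimal response (95% CI 0.07 to 0.35) can be approximately reproduced from the raw counts (58/92 vs 42/99) with a Wald interval for a difference of proportions; the authors' exact method may differ slightly in the last decimal:

```python
import math

def risk_difference_ci(x1, n1, x2, n2, z=1.96):
    """Difference in proportions with a Wald 95% confidence interval."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Nomogram group: 58/92 optimal responses; control group: 42/99
diff, lo, hi = risk_difference_ci(58, 92, 42, 99)
# diff ≈ 0.206, CI ≈ (0.068, 0.345) — close to the reported +21%, 0.07 to 0.35
```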
Shukla, Girja S.; Krag, David N.
2010-01-01
Novel phage-displayed random linear dodecapeptide (X12) and cysteine-constrained decapeptide (CX10C) libraries constructed in fusion to the amino-terminus of P99 β-lactamase molecules were used for identifying β-lactamase-linked cancer cell-specific ligands. The size and quality of both libraries were comparable to the standards of other reported phage display systems. Using the single-round panning method based on phage DNA recovery, we identified several β-lactamase fusion peptides that specifically bind to live human breast cancer MDA-MB-361 cells. The β-lactamase fusion to the peptides helped in conducting the enzyme activity-based clone normalization and cell-binding screening in a very time- and cost-efficient manner. The methods were suitable for 96-well readout as well as microscopic imaging. The success of the biopanning was indicated by the presence of ~40% cancer cell-specific clones among the recovered phages. One of the binding clones appeared multiple times. The cancer cell-binding fusion peptides also shared several significant motifs. This opens a new way of preparing and selecting phage display libraries. The cancer cell-specific β-lactamase-linked affinity reagents selected from these libraries can be used for any application that requires a reporter for tracking the ligand molecules. Furthermore, these affinity reagents also have potential for direct use in the targeted enzyme prodrug therapy of cancer. PMID:19751096
International Nuclear Information System (INIS)
Wijers, Oda B.; Levendag, Peter C.; Harms, Erik; Gan-Teng, A.M.; Schmitz, Paul I.M.; Hendriks, W.D.H.; Wilms, Erik B.; Est, Henri van der; Visch, Leo L.
2001-01-01
Purpose: The aim of the study was to test the hypothesis that aerobic Gram-negative bacteria (AGNB) play a crucial role in the pathogenesis of radiation-induced mucositis; consequently, selective elimination of these bacteria from the oral flora should result in a reduction of the mucositis. Methods and Materials: Head-and-neck cancer patients, when scheduled for treatment by external beam radiation therapy (EBRT), were randomized for prophylactic treatment with an oral paste containing either a placebo or a combination of the antibiotics polymyxin E, tobramycin, and amphotericin B (PTA group). Weekly, the objective and subjective mucositis scores and microbiologic counts of the oral flora were noted. The primary study endpoint was the mucositis grade after 3 weeks of EBRT. Results: Seventy-seven patients were evaluable. No statistically significant difference in the objective and subjective mucositis scores was observed between the two study arms (p=0.33). The percentage of patients with positive cultures of AGNB was significantly reduced in the PTA group (p=0.01). However, complete eradication of AGNB was not achieved. Conclusions: Selective elimination of AGNB from the oral flora did not result in a reduction of radiation-induced mucositis and therefore does not support the hypothesis that these bacteria play a crucial role in the pathogenesis of mucositis.
Directory of Open Access Journals (Sweden)
van Delft Joost HM
2011-10-01
Full Text Available Abstract Background We hypothesized that in Flanders (Belgium), the prevalence of at-risk genotypes for genotoxic effects decreases with age due to morbidity and mortality resulting from chronic diseases. Rather than polymorphisms in single genes, the interaction of multiple genetic polymorphisms in low-penetrance genes involved in genotoxic effects might be of relevance. Methods Genotyping was performed on 399 randomly selected adults (aged 50-65) and on 442 randomly selected adolescents. Based on their involvement in processes relevant to genotoxicity, 28 low-penetrance polymorphisms affecting the phenotype in 19 genes were selected (xenobiotic metabolism, oxidative stress defense and DNA repair; respectively 13, 6 and 9 polymorphisms). Polymorphisms which, based on available literature, could not clearly be categorized a priori as leading to an 'increased risk' or a 'protective effect' were excluded. Results The mean number of risk alleles for all investigated polymorphisms was found to be lower in the 'elderly' (17.0 ± 2.9) than in the 'adolescent' (17.6 ± 3.1) subpopulation (P = 0.002). These results were not affected by gender or smoking. The prevalence of a high (> 17 = median) number of risk alleles was less frequent in the 'elderly' (40.6%) than in the 'adolescent' (51.4%) subpopulation (P = 0.002). In particular for phase II enzymes, the mean number of risk alleles was lower in the 'elderly' (4.3 ± 1.6) than in the 'adolescent' age group (4.8 ± 1.9), and the prevalence of a high (> 4 = median) number of risk alleles was less frequent in the 'elderly' (41.3%) than in the 'adolescent' subpopulation (56.3%). The prevalence of a high (> 8 = median) number of risk alleles for DNA repair enzyme-coding genes was lower in the 'elderly' (37.3%) than in the 'adolescent' subpopulation (45.6%, P = 0.017). Conclusions These observations are consistent with the hypothesis that, in Flanders, the prevalence of at-risk alleles in genes involved in genotoxic effects decreases with age.
Huelsmann, Martin; Neuhold, Stephanie; Resl, Michael; Strunk, Guido; Brath, Helmut; Francesconi, Claudia; Adlbrecht, Christopher; Prager, Rudolf; Luger, Anton; Pacher, Richard; Clodi, Martin
2013-10-08
The study sought to assess the primary preventive effect of neurohumoral therapy in high-risk diabetic patients selected by N-terminal pro-B-type natriuretic peptide (NT-proBNP). Few clinical trials have successfully demonstrated the prevention of cardiac events in patients with diabetes. One reason for this might be an inaccurate selection of patients. NT-proBNP has not been assessed in this context. A total of 300 patients with type 2 diabetes and elevated NT-proBNP (>125 pg/ml) but free of cardiac disease were randomized. The "control" group was cared for at 4 diabetes care units; the "intensified" group was additionally treated at a cardiac outpatient clinic for the up-titration of renin-angiotensin system (RAS) antagonists and beta-blockers. The primary endpoint was hospitalization/death due to cardiac disease after 2 years. At baseline, the mean age of the patients was 67.5 ± 9 years, duration of diabetes was 15 ± 12 years, 37% were male, HbA1c was 7 ± 1.1%, blood pressure was 151 ± 22 mm Hg, heart rate was 72 ± 11 beats/min, and median NT-proBNP was 265.5 pg/ml (interquartile range: 180.8 to 401.8 pg/ml). After 12 months there was a significant difference between groups in the number of patients treated with a RAS antagonist/beta-blocker and in the dosages reached. Up-titration of RAS antagonists and beta-blockers to maximum tolerated dosages is an effective and safe intervention for the primary prevention of cardiac events in diabetic patients pre-selected using NT-proBNP. (NT-proBNP Guided Primary Prevention of CV Events in Diabetic Patients [PONTIAC]; NCT00562952). Copyright © 2013 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
Garfield, Lauren D; Dixon, David; Nowotny, Petra; Lotrich, Francis E; Pollock, Bruce G; Kristjansson, Sean D; Doré, Peter M; Lenze, Eric J
2014-10-01
Antidepressant side effects are a significant public health issue, associated with poor adherence, premature treatment discontinuation, and, rarely, significant harm. Older adults assume the largest and most serious burden of medication side effects. We investigated the association between antidepressant side effects and genetic variation in the serotonin system in anxious older adults participating in a randomized, placebo-controlled trial of the selective serotonin reuptake inhibitor (SSRI) escitalopram. Adults (N = 177) aged ≥ 60 years were randomized to active treatment or placebo for 12 weeks. Side effects were assessed using the Udvalg for Kliniske Undersøgelser side-effect rating scale. Genetic polymorphisms were putative functional variants in the promoters of the serotonin transporter and 1A and 2A receptors (5-HTTLPR [L/S + rs25531], HTR1A rs6295, and HTR2A rs6311, respectively). Four significant drug-placebo side-effect differences were found: increased duration of sleep, dry mouth, diarrhea, and diminished sexual desire. Analyses using putative high- versus low-transcription genotype groupings revealed six pharmacogenetic effects: greater dry mouth and decreased sexual desire for the low- and high-expressing serotonin transporter genotypes, respectively, and greater diarrhea with the 1A receptor low-transcription genotype. Diminished sexual desire was experienced significantly more by high-expressing genotypes in the serotonin transporter, 1A, or 2A receptors. There was no significant relationship between drug concentration and side effects, nor a mean difference in drug concentration between low- and high-expressing genotypes. Genetic variation in the serotonin system may predict who develops common SSRI side effects and why. More work is needed to further characterize this genetic modulation and to translate research findings into strategies useful for more personalized patient care. Published by Elsevier Inc.
Wang, Lianfeng; Yan, Biao; Guo, Lijie; Gu, Dongdong
2018-04-01
A new transient mesoscopic model with a randomly packed powder bed is proposed to investigate the heat and mass transfer and laser processing quality between neighboring tracks during selective laser melting (SLM) of AlSi12 alloy by the finite volume method (FVM), considering the solid/liquid phase transition, temperature-dependent material properties and interfacial forces. The results revealed that both the operating temperature and the resultant cooling rate were elevated by increasing the laser power. Accordingly, the viscosity of the liquid was significantly reduced under a large laser power and the melt was characterized by a large velocity, which tended to produce more intensive convection within the pool. In this case, sufficient heat and mass transfer occurred at the interface between the previously fabricated tracks and the track currently being built, yielding strong spreading between neighboring tracks and a resultant high-quality surface without obvious porosity. By contrast, the surface quality of SLM-processed components at a relatively low laser power was notably poorer due to the limited and insufficient heat and mass transfer at the interface of neighboring tracks. Furthermore, the experimental surface morphologies of the top surface were in full accordance with the results calculated via simulation.
Directory of Open Access Journals (Sweden)
Yang Zhihong
2012-05-01
Full Text Available Abstract Background Single embryo transfer (SET) remains underutilized as a strategy to reduce multiple-gestation risk in IVF, and its overall lower pregnancy rate underscores the need for improved techniques to select one embryo for fresh transfer. This study explored use of comprehensive chromosomal screening by array CGH (aCGH) to provide this advantage and improve the pregnancy rate from SET. Methods First-time IVF patients with a good prognosis were randomized to blastocyst selection by morphology plus aCGH (Group A) or by morphology alone (Group B). Results For patients in Group A (n = 55), 425 blastocysts were biopsied and analyzed via aCGH (7.7 blastocysts/patient). Aneuploidy was detected in 191/425 (44.9%) of blastocysts in this group. For patients in Group B (n = 48), 389 blastocysts were microscopically examined (8.1 blastocysts/patient). The clinical pregnancy rate was significantly higher in the morphology + aCGH group than in the morphology-only group (70.9% and 45.8%, respectively; p = 0.017); ongoing pregnancy rates for Groups A and B were 69.1% vs. 41.7%, respectively (p = 0.009). There were no twin pregnancies. Conclusion Although aCGH followed by frozen embryo transfer has been used to screen at-risk embryos (e.g., known parental chromosomal translocation or history of recurrent pregnancy loss), this is the first description of aCGH fully integrated with a clinical IVF program to select single blastocysts for fresh SET in good-prognosis patients. The observed aneuploidy rate (44.9%) among biopsied blastocysts highlights the inherent imprecision of SET when conventional morphology is used alone. Embryos randomized to the aCGH group implanted with greater efficiency, resulted in clinical pregnancy more often, and yielded a lower miscarriage rate than those selected without aCGH. Additional studies are needed to verify our pilot data and confirm a role for on-site, rapid aCGH for IVF patients contemplating fresh SET.
Scott, Stephen; Briskman, Jackie; O'Connor, Thomas G
2014-06-01
Antisocial personality is a common adult problem that imposes a major public health burden, but for which there is no effective treatment. Affected individuals exhibit persistent antisocial behavior and pervasive antisocial character traits, such as irritability, manipulativeness, and lack of remorse. Prevention of antisocial personality in childhood has been advocated, but evidence for effective interventions is lacking. The authors conducted two follow-up studies of randomized trials of group parent training. One involved 120 clinic-referred 3- to 7-year-olds with severe antisocial behavior for whom treatment was indicated, 93 of whom were reassessed between ages 10 and 17. The other involved 109 high-risk 4- to 6-year-olds with elevated antisocial behavior who were selectively screened from the community, 90 of whom were reassessed between ages 9 and 13. The primary psychiatric outcome measures were the two elements of antisocial personality, namely, antisocial behavior (assessed by a diagnostic interview) and antisocial character traits (assessed by a questionnaire). Also assessed were reading achievement (an important domain of youth functioning at work) and parent-adolescent relationship quality. In the indicated sample, both elements of antisocial personality were improved in the early intervention group at long-term follow-up compared with the control group (antisocial behavior: odds ratio of oppositional defiant disorder=0.20, 95% CI=0.06, 0.69; antisocial character traits: B=-4.41, 95% CI=-8.64, -1.12). Additionally, reading ability improved (B=9.18, 95% CI=0.58, 18.0). Parental expressed emotion was warmer (B=0.86, 95% CI=0.20, 1.41) and supervision was closer (B=-0.43, 95% CI=-0.75, -0.11), but direct observation of parenting showed no differences. Teacher-rated and self-rated antisocial behavior were unchanged. In contrast, in the selective high-risk sample, early intervention was not associated with improved long-term outcomes.
Safarinejad, Mohammad Reza
2010-09-01
To determine the safety and efficacy of adjunctive bupropion sustained-release (SR) on male sexual dysfunction (SD) induced by a selective serotonin reuptake inhibitor (SSRI), as SD is a common side-effect of SSRIs and the most effective treatments have yet to be determined. The randomized sample consisted of 234 euthymic men who were receiving some type of SSRI. The men were randomly assigned to bupropion SR (150 mg twice daily; n = 117) or placebo (twice daily; n = 117) for 12 weeks. Efficacy was evaluated using the Clinical Global Impression-Sexual Function (CGI-SF; the primary outcome measure), the International Index of Erectile Function (IIEF), the Arizona Sexual Experience Scale (ASEX), and the Erectile Dysfunction Inventory of Treatment Satisfaction (EDITS) (secondary outcome measures). Participants were followed biweekly during the study period. After 12 weeks of treatment, the mean (sd) scores for CGI-SF were significantly lower, i.e. better, in patients on bupropion SR, at 2.4 (1.2), than in the placebo group, at 3.9 (1.1) (P= 0.01). Men who received bupropion had a significant increase in the total IIEF score (54.4% vs 1.2%; P= 0.003), and in the five different domains of the IIEF. Total ASEX scores were significantly lower, i.e. better, among men who received bupropion than placebo, at 15.5 (4.3) vs 21.5 (4.7) (P= 0.002). The EDITS scores were 67.4 (10.2) for the bupropion and 36.3 (11.7) for the placebo group (P= 0.001). The ASEX score and CGI-SF score were correlated (P= 0.003). In linear regression analyses the CGI-SF score was not affected significantly by the duration of SD, the type of SSRI used, or age. Bupropion is an effective treatment for male SD induced by SSRIs. These results provide empirical support for conducting a further study of bupropion.
Directory of Open Access Journals (Sweden)
Wangchao Lou
Full Text Available Developing an efficient method for determination of DNA-binding proteins, given their vital roles in gene regulation, is highly desirable, since it would be invaluable for advancing our understanding of protein functions. In this study, we propose a new method for the prediction of DNA-binding proteins that performs feature ranking using random forest and wrapper-based feature selection using a forward best-first search strategy. The features comprise information from the primary sequence, predicted secondary structure, predicted relative solvent accessibility, and the position-specific scoring matrix. The proposed method, called DBPPred, uses Gaussian naïve Bayes as the underlying classifier, since it outperformed five other classifiers, including decision tree, logistic regression, k-nearest neighbor, support vector machine with polynomial kernel, and support vector machine with radial basis function. As a result, the proposed DBPPred yields the highest average accuracy of 0.791 and average MCC of 0.583 according to five-fold cross-validation with ten runs on the training benchmark dataset PDB594. Subsequently, blind tests on the independent dataset PDB186 were performed by the proposed model trained on the entire PDB594 dataset and by five other existing methods (including iDNA-Prot, DNA-Prot, DNAbinder, DNABIND and DBD-Threader); the proposed DBPPred yielded the highest accuracy of 0.769, MCC of 0.538, and AUC of 0.790. Independent tests performed by the proposed DBPPred on a large non-DNA-binding protein dataset and two RNA-binding protein datasets also showed improved or comparable quality when compared with the relevant prediction methods. Moreover, the majority of the features selected by the proposed method differ significantly in mean value between the DNA-binding and non-DNA-binding proteins.
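The pipeline described above — ranking features with a random forest, then a greedy forward wrapper search scored by a Gaussian naïve Bayes classifier under five-fold cross-validation — can be sketched as follows. This is a minimal illustration on synthetic data, assuming scikit-learn is available; it is not the authors' DBPPred implementation, and the candidate limit and parameters are arbitrary choices for the sketch.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

# Toy stand-in for the sequence-derived feature matrix (PSSM, predicted
# secondary structure, solvent accessibility, etc. in the real method).
X, y = make_classification(n_samples=300, n_features=40, n_informative=8,
                           random_state=0)

# Step 1: rank all features by random-forest importance.
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
ranking = np.argsort(rf.feature_importances_)[::-1]

# Step 2: greedy forward (best-first style) wrapper selection, keeping a
# candidate feature only if it improves the 5-fold CV accuracy of a
# Gaussian naive Bayes classifier.
selected, best_score = [], 0.0
for f in ranking[:20]:              # scan only the top-ranked candidates
    trial = selected + [int(f)]
    score = cross_val_score(GaussianNB(), X[:, trial], y, cv=5).mean()
    if score > best_score:
        selected, best_score = trial, score

print(f"kept {len(selected)} features, CV accuracy {best_score:.3f}")
```

In the real method the scan would continue over the full ranked list and the score would be averaged over ten CV runs; the structure of the search is the same.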
Directory of Open Access Journals (Sweden)
Newton Nicola C
2012-08-01
Full Text Available Abstract Background Alcohol misuse amongst young people is a serious concern. The need for effective prevention is clear, yet there appear to be few evidence-based programs that prevent alcohol misuse and none that target both high- and low-risk youth. The CAP study addresses this gap by evaluating the efficacy of an integrated approach to alcohol misuse prevention, which combines the effective universal internet-based Climate Schools program with the effective selective personality-targeted Preventure program. This article describes the development and protocol of the CAP study, which aims to prevent alcohol misuse and related harms in Australian adolescents. Methods/Design A cluster randomized controlled trial (RCT) is being conducted with Year 8 students aged 13 to 14 years old from 27 secondary schools in New South Wales and Victoria, Australia. Blocked randomisation was used to assign schools to one of four groups: Climate Schools only, Preventure only, CAP (Climate Schools and Preventure), or Control (alcohol, drug and health education as usual). The primary outcomes of the trial will be the uptake and harmful use of alcohol and alcohol-related harms. Secondary outcomes will include alcohol- and cannabis-related knowledge, cannabis-related harms, intentions to use, and mental health symptomatology. All participants will complete assessments on five occasions: baseline; immediately post intervention; and at 12, 24 and 36 months post baseline. Discussion This study protocol presents the design and current implementation of a cluster RCT to evaluate the efficacy of the CAP study, an integrated universal and selective approach to prevent alcohol use and related harms among adolescents. Compared to students who receive the stand-alone universal Climate Schools program or alcohol and drug education as usual (Controls), we expect the students who receive the CAP intervention to have significantly less uptake of alcohol use.
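Blocked randomisation of the kind mentioned in the protocol — allocating 27 schools across four arms while keeping arm sizes nearly balanced — can be sketched in a few lines. This is a generic illustration, not the CAP trial's actual allocation procedure; the school labels and seed are hypothetical, and with an allocation ratio of 1:1:1:1 the natural block size is a multiple of four.

```python
import random

random.seed(7)

arms = ["Climate Schools", "Preventure", "CAP", "Control"]
schools = [f"school_{i:02d}" for i in range(1, 28)]  # 27 clusters, as in the trial

# Blocked randomization: within each successive block of 4 schools,
# shuffle one copy of each arm label, so arm sizes can never drift
# apart by more than one as enrolment proceeds.
assignment = {}
for start in range(0, len(schools), len(arms)):
    block = arms[:]
    random.shuffle(block)
    for school, arm in zip(schools[start:start + len(arms)], block):
        assignment[school] = arm

counts = {arm: list(assignment.values()).count(arm) for arm in arms}
print(counts)  # with 27 schools, each arm receives 6 or 7
```

As the head note on blocked randomization points out, a fixed block size makes upcoming assignments predictable to an unblinded investigator; varying the block size at random (e.g. drawing each block's size from {4, 8}) removes that predictability while preserving balance.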
Benitez, Aline do Nascimento; Martins, Felippe Danyel Cardoso; Mareze, Marcelle; Nino, Beatriz de Souza Lima; Caldart, Eloiza Teles; Ferreira, Fernanda Pinto; Mitsuka-Breganó, Regina; Freire, Roberta Lemos; Galhardo, Juliana Arena; Martins, Camila Marinelli; Biondo, Alexander Welker; Navarro, Italmar Teodorico
2018-06-01
Although leishmaniasis has been described as a classic example of a zoonosis requiring a comprehensive approach for control, to date, no study has been conducted on the spatial distribution of simultaneous Leishmania spp. seroprevalence in dog owners and dogs from randomly selected households in urban settings. Accordingly, the present study aimed to simultaneously identify the seroprevalence, spatial distribution and associated factors of infection with Leishmania spp. in dog owners and their dogs in the city of Londrina, a county seat in southern Brazil with a population of half a million people, ranked 18th in population and 145th in the human development index (HDI) out of 5570 Brazilian cities. Overall, 564 households were surveyed, comprising 597 dog owners and their 729 dogs. Anti-Leishmania spp. antibodies were detected by ELISA in 9/597 (1.50%) dog owners and in 32/729 (4.38%) dogs, with significantly higher prevalence (p = 0.0042) in dogs. Spatial analysis revealed associations between seropositive dogs and households located up to 500 m from the local railway. No clusters were found for either owner or dog case distributions. In summary, the seroepidemiological and spatial results show a lack of association among the factors for infection and demonstrate higher exposure for dogs than for their owners. However, railway areas may provide favorable conditions for the maintenance of infected phlebotomines, thereby causing infection in nearby domiciled dogs. In such an urban scenario, local sanitary barriers should be focused on the terrestrial routes of people and surrounding areas, particularly railways, via continuous vector surveillance and identification of phlebotomines infected by Leishmania spp. Copyright © 2018. Published by Elsevier B.V.
Directory of Open Access Journals (Sweden)
Hoque Dewan ME
2012-12-01
Full Text Available Abstract Background Quality hospital care is important in ensuring that the needs of severely ill children are met to avert child mortality. However, the quality of hospital care for children in developing countries has often been found poor. As the first step of a country road map for improving hospital care for children, we assessed the baseline situation with respect to the quality of care provided to children under five years of age in district- and sub-district-level hospitals in Bangladesh. Methods Using adapted World Health Organization (WHO) hospital assessment tools and standards, an assessment of 18 randomly selected district (n=6) and sub-district (n=12) hospitals was undertaken. Teams of trained assessors used direct case observation, record review, interviews, and Management Information System (MIS) data to assess the quality of clinical case management and monitoring; infrastructure, processes and hospital administration; and essential hospital and laboratory supports, drugs and equipment. Results Findings demonstrate that the overall quality of care provided in these hospitals was poor. No hospital had a functioning triage system to prioritise those children most in need of immediate care. Laboratory supports and essential equipment were deficient. Only one hospital had all of the essential drugs for paediatric care. Less than a third of hospitals had a back-up power supply, and just under half had functioning arrangements for safe drinking water. Clinical case management was found to be sub-optimal for prevalent illnesses, as was the quality of neonatal care. Conclusion Action is needed to improve the quality of paediatric care in hospital settings in Bangladesh, with a particular need to invest in improving newborn care.
Lammers, J.; Goossens, F.; Conrod, P.; Engels, R.C.M.E.; Wiers, R.W.H.J.; Kleinjan, M.
2015-01-01
Aim The effectiveness of Preventure was tested on the drinking behaviour of young adolescents in secondary education in the Netherlands. Design A cluster randomized controlled trial was carried out, with participants assigned randomly to a two-session coping-skills intervention or a control condition.
Randomizing Roaches: Exploring the "Bugs" of Randomization in Experimental Design
Wagler, Amy; Wagler, Ron
2014-01-01
Understanding the roles of random selection and random assignment in experimental design is a central learning objective in most introductory statistics courses. This article describes an activity, appropriate for a high school or introductory statistics course, designed to teach the concepts, values and pitfalls of random selection and assignment…
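The distinction the activity teaches — random selection (drawing a sample from a population, which supports generalizing back to that population) versus random assignment (allocating the sample to conditions, which supports causal conclusions) — can be made concrete in a few lines of code. The roster below is hypothetical and stands in for whatever population the classroom activity uses.

```python
import random

random.seed(42)

# A hypothetical roster of 40 students (the population of interest).
population = [f"student_{i}" for i in range(40)]

# Random SELECTION: draw a simple random sample from the population.
# This is what licenses generalizing results back to the population.
sample = random.sample(population, 20)

# Random ASSIGNMENT: split the selected sample into treatment and
# control groups. This is what licenses causal claims about treatment.
shuffled = sample[:]
random.shuffle(shuffled)
treatment, control = shuffled[:10], shuffled[10:]

print(len(treatment), len(control))  # 10 10
```

A study can have either, both, or neither: a convenience sample with random assignment still supports causal inference about that sample, while a random sample with self-selected groups does not.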
Directory of Open Access Journals (Sweden)
Lee JWY
2014-09-01
Full Text Available Jacky WY Lee,1,2 Catherine WS Chan,2 Mandy OM Wong,3 Jonathan CH Chan,3 Qing Li,2 Jimmy SM Lai2 1The Department of Ophthalmology, Caritas Medical Centre, 2The Department of Ophthalmology, The University of Hong Kong, 3The Department of Ophthalmology, Queen Mary Hospital, Hong Kong Background: The objective of this study was to investigate the effects of adjuvant selective laser trabeculoplasty (SLT) versus medication alone on intraocular pressure (IOP) control, medication use, and quality of life in patients with primary open-angle glaucoma. Methods: This prospective, randomized control study recruited 41 consecutive primary open-angle glaucoma subjects with medically controlled IOP ≤21 mmHg. The SLT group (n=22) received a single 360-degree SLT treatment. The medication-only group (n=19) continued with their usual treatment regimen. In both groups, medication was titrated to maintain a target IOP defined as a 25% reduction from baseline IOP without medication, or <18 mmHg, whichever was lower. Outcomes, which were measured at baseline and at 6 months, included the Glaucoma Quality of Life-15 (GQL-15) and Comparison of Ophthalmic Medications for Tolerability (COMTOL) survey scores, IOP, and the number of antiglaucoma medications. Results: The baseline IOP was 15.8±2.7 mmHg and 14.5±2.5 mmHg in the SLT and medication-only groups, respectively (P=0.04). Both groups had a comparable number of baseline medications (P=0.2), GQL-15 scores (P=0.3), and COMTOL scores (P=0.7). At 6 months, the SLT group had a lower IOP (P=0.03) and required fewer medications compared with both baseline (P<0.0001) and the medication-only group (P=0.02). There was no statistically significant difference in the 6-month GQL-15 or COMTOL scores compared to baseline (P≥0.4) or between the two treatment groups (P≥0.2). Conclusion: A single session of adjuvant SLT provided further reductions in IOP and medication without substantial changes in quality of life or medication tolerability at 6 months.
Ben-Ari, Morechai
2004-01-01
The term "random" is frequently used in discussion of the theory of evolution, even though the mathematical concept of randomness is problematic and of little relevance in the theory. Therefore, since the core concept of the theory of evolution is the non-random process of natural selection, the term random should not be used in teaching the…
Schütt, Barbara; Kaiser, Andreas; Schultze-Mosgau, Marcus-Hillert; Seitz, Christian; Bell, David; Koch, Manuela; Rohde, Beate
2016-08-01
Does administration of vilaprisan (VPR) to healthy women for 12 weeks reduce menstrual bleeding? In this 12-week proof-of-concept phase 1 trial, most women (30/33, 90%) who received VPR at daily doses of 1-5 mg reported the absence of menstrual bleeding. Vilaprisan (BAY 1002670) is a novel, highly potent selective progesterone receptor modulator that markedly reduces the growth of human leiomyoma tissue in a preclinical model of uterine fibroids (UFs). In this double-blind, parallel-group study, of the 163 healthy women enrolled, 73 were randomized to daily VPR 0.1 mg (n = 12), 0.5 mg (n = 12), 1 mg (n = 13), 2 mg (n = 12), 5 mg (n = 12) or placebo tablets (n = 12) for 12 weeks. Participants were followed up until the start of the second menstrual bleeding after the end of treatment. Trial simulations were used to determine the minimum sample size required to estimate the non-bleeding rate (i.e. self-assessed bleeding intensity of 'none' or 'spotting') using Bayesian dose-response estimation with incorporated prior information. It was estimated that 48 participants in the per-protocol analysis population would be sufficient. Women aged 18-45 years who had been sterilized by tubal ligation were enrolled between November 2011 and May 2012. Participants kept a daily diary of bleeding intensity. Blood and urine samples were taken, and transvaginal ultrasound was performed before treatment, during treatment and at follow-up. Endometrial biopsies were obtained during the pretreatment cycle, at the end of the treatment period and during the follow-up phase. The primary outcome was the estimated dose-response curve of the observed non-bleeding rate during Days 10-84 of treatment, excluding the endometrial biopsy day and 2 days after biopsy. Secondary outcomes included return of bleeding during follow-up, size of follicle-like structures and serum hormone levels. Safety assessments included adverse events (AEs), endometrial thickness and histology, laboratory parameters, and vital signs.
Edgington, Eugene
2007-01-01
Contents: Statistical Tests That Do Not Require Random Sampling; Randomization Tests; Numerical Examples; Randomization Tests and Nonrandom Samples; The Prevalence of Nonrandom Samples in Experiments; The Irrelevance of Random Samples for the Typical Experiment; Generalizing from Nonrandom Samples; Intelligibility; Respect for the Validity of Randomization Tests; Versatility; Practicality; Precursors of Randomization Tests; Other Applications of Permutation Tests; Questions and Exercises; Notes; References; Randomized Experiments; Unique Benefits of Experiments; Experimentation without Mani
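The book's central technique, a randomization (permutation) test, compares an observed group difference against the distribution obtained by re-randomizing the pooled data. A minimal numerical sketch follows; the data are invented, and the Monte Carlo loop stands in for the full enumeration of all possible assignments.

```python
import random

random.seed(1)

# Two small experimental groups (hypothetical scores).
a = [12.1, 9.8, 11.5, 13.0, 10.2]
b = [8.9, 9.4, 7.7, 10.1, 8.3]

observed = sum(a) / len(a) - sum(b) / len(b)  # observed mean difference

# Randomization test: repeatedly re-randomize the pooled scores into two
# groups of the original sizes and count how often the absolute mean
# difference is at least as extreme as the observed one.
pooled = a + b
n_iter = 10000
count = 0
for _ in range(n_iter):
    random.shuffle(pooled)
    pa, pb = pooled[:len(a)], pooled[len(a):]
    if abs(sum(pa) / len(pa) - sum(pb) / len(pb)) >= abs(observed):
        count += 1

p_value = count / n_iter
print(f"observed difference {observed:.2f}, two-sided p ≈ {p_value:.4f}")
```

Note that the justification here is the random assignment of units to groups, not random sampling from a population — which is exactly the point the book argues at length.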
DEFF Research Database (Denmark)
Russell, James A; Vincent, Jean-Louis; Kjølbye, Anne Louise
2017-01-01
BACKGROUND: Vasopressin is widely used for vasopressor support in septic shock patients, but experimental evidence suggests that selective V1A agonists are superior. The initial pharmacodynamic effects, pharmacokinetics, and safety of selepressin, a novel V1A-selective vasopressin analogue, was e...
DEFF Research Database (Denmark)
Thornton, Steven; Goodwin, Thomas M; Greisen, Gorm
2009-01-01
OBJECTIVE: The objective of the study was to compare barusiban with placebo in threatened preterm labor. STUDY DESIGN: This was a randomized, double-blind, placebo-controlled, multicenter study. One hundred sixty-three women at 34-35 weeks plus 6 days, and with 6 or more contractions of 30 seconds...
Wiitavaara, Birgitta; Fahlström, Martin; Djupsjöbacka, Mats
2017-01-01
Abstract Rationale, aims and objectives The aims of this study are to investigate the prevalence of patients seeking care due to different musculoskeletal disorders (MSDs) at primary health care centres (PHCs), to chart factors such as symptoms, diagnoses and actions prescribed for patients who visited the PHCs due to MSDs, and to make comparisons regarding differences due to gender, age and rural or urban PHC. Methods Patient records (2000) for patients of working age were randomly s...
Echazarreta-Gallego, Estíbaliz; Pola-Bandrés, Guillermo; Arribas-Del Amo, María Dolores; Gil-Romea, Ismael; Sousa-Domínguez, Ramón; Güemes-Sánchez, Antonio
2017-10-01
Breast prosthesis exposure is probably the most devastating complication after a skin-sparing mastectomy (SSM) and implant-based, one-stage breast reconstruction. This complication may occur in the immediate post-operative period or in the weeks and even months after the procedure. In most cases, the cause is poor skin coverage of the implant due to skin necrosis. We report eight consecutive cases of implant exposure (or risk of exposure) due to skin necrosis in SSM patients over a period of 5 years; all patients were treated using a random epigastric rotation flap, executed by the same medical team. A random epigastric flap (island or conventional rotation flap) was used to cover the skin defect. All the patients completed the procedure and all prostheses were saved; there were no cases of flap necrosis or infection. Cases of skin necrosis after SSM and immediate implant reconstruction, in which the implant is at risk of exposure, can be successfully treated with a random epigastric rotation flap.
Hashimoto, Tasuku; Shiina, Akihiro; Hasegawa, Tadashi; Kimura, Hiroshi; Oda, Yasunori; Niitsu, Tomihisa; Ishikawa, Masatomo; Tachibana, Masumi; Muneoka, Katsumasa; Matsuki, Satoshi; Nakazato, Michiko; Iyo, Masaomi
2016-01-01
This study aimed to evaluate whether selecting mirtazapine as the first choice for a current depressive episode, instead of selective serotonin reuptake inhibitors (SSRIs), reduces benzodiazepine use in patients with major depressive disorder (MDD). We concurrently examined the relationship between clinical responses and serum mature brain-derived neurotrophic factor (BDNF) and its precursor, proBDNF. We conducted an open-label randomized trial in routine psychiatric practice settings. Seventy-seven MDD outpatients were randomly assigned to the mirtazapine or predetermined SSRIs group, and investigators arbitrarily selected sertraline or paroxetine. The primary outcome was the proportion of benzodiazepine users at weeks 6, 12, and 24 in the two groups. We defined patients showing a ≥50% reduction in Hamilton depression rating scale (HDRS) scores from baseline as responders. Blood samples were collected at baseline and at weeks 6, 12, and 24. Sixty-five patients prescribed benzodiazepines from prescription day 1 were analyzed for the primary outcome. The percentage of benzodiazepine users was significantly lower in the mirtazapine group than in the SSRIs group (21.4% vs. 81.8% and 11.1% vs. 85.7%, respectively). Selecting mirtazapine as the first choice for a current depressive episode may reduce benzodiazepine use in patients with MDD. Trial registration UMIN000004144. Registered 2nd September 2010. The date of enrolment of the first participant in the trial was 24th August 2010. This study was retrospectively registered 9 days after the first participant was enrolled.
Jimenez-Quevedo, Pilar; Gonzalez-Ferrer, Juan Jose; Sabate, Manel; Garcia-Moll, Xavier; Delgado-Bolton, Roberto; Llorente, Leopoldo; Bernardo, Esther; Ortega-Pozzi, Aranzazu; Hernandez-Antolin, Rosana; Alfonso, Fernando; Gonzalo, Nieves; Escaned, Javier; Bañuelos, Camino; Regueiro, Ander; Marin, Pedro; Fernandez-Ortiz, Antonio; Neves, Barbara Das; Del Trigo, Maria; Fernandez, Cristina; Tejerina, Teresa; Redondo, Santiago; Garcia, Eulogio; Macaya, Carlos
2014-11-07
Refractory angina constitutes a clinical problem. The aim of this study was to assess the safety and the feasibility of transendocardial injection of CD133(+) cells to foster angiogenesis in patients with refractory angina. In this randomized, double-blinded, multicenter controlled trial, eligible patients were treated with granulocyte colony-stimulating factor, underwent an apheresis and electromechanical mapping, and were randomized to receive treatment with CD133(+) cells or no treatment. The primary end point was the safety of transendocardial injection of CD133(+) cells, as measured by the occurrence of major adverse cardiac and cerebrovascular events at 6 months. Secondary end points analyzed the efficacy. Twenty-eight patients were included (n=19 treatment; n=9 control). At 6 months, 1 patient in each group had ventricular fibrillation and 1 patient in each group died. One patient (treatment group) had a cardiac tamponade during mapping. There were no significant differences between groups with respect to efficacy parameters; however, the comparison within groups showed a significant improvement in the number of angina episodes per month (median absolute difference, -8.5 [95% confidence interval, -15.0 to -4.0]) and in angina functional class in the treatment arm but not in the control group. At 6 months, only 1 single-photon emission computed tomography (SPECT) parameter, the summed score, improved significantly in the treatment group at rest and at stress (median absolute difference, -1.0 [95% confidence interval, -1.9 to -0.1]) but not in the control arm. Our findings support the feasibility and safety of transendocardial injection of CD133(+) cells in patients with refractory angina. The promising clinical results and favorable data observed in the SPECT summed score may set up the basis to test the efficacy of cell therapy in a larger randomized trial. © 2014 American Heart Association, Inc.
Kuiken, Sjoerd D.; Tytgat, Guido N. J.; Boeckxstaens, Guy E. E.
2003-01-01
BACKGROUND & AIMS: Although widely prescribed, the evidence for the use of antidepressants for the treatment of irritable bowel syndrome (IBS) is limited. In this study, we hypothesized that fluoxetine (Prozac), a selective serotonin reuptake inhibitor, has visceral analgesic properties, leading to
International Nuclear Information System (INIS)
Carnet, Bernard; Delhumeau, Michel
1971-06-01
The principles of binary analysis applied to the investigation of sequential circuits were used to design a two-way coincidence circuit whose inputs may be random or periodic variables of constant or variable duration. The output signal strictly reproduces the characteristics of the input signal triggering the coincidence. A coincidence between input signals does not produce any output signal if one of the signals has already triggered the output signal. The characteristics of the output signal in relation to those of the input signal are: minimum time jitter, excellent duration reproducibility, and maximum efficiency. Some rules are given for achieving these results. The symmetry, transitivity and non-transitivity characteristics of the edges on the primitive graph are analyzed and lead to some rules for positioning the states on a secondary graph. It is from this graph that the equations of the circuits can be calculated. The development of the circuit and its dynamic testing are discussed. For this testing, the functioning of the circuit is simulated by feeding randomly generated signals into the input.
Taradaj, Jakub; Ozon, Marcin; Dymarek, Robert; Bolach, Bartosz; Walewicz, Karolina; Rosińczuk, Joanna
2018-03-23
Interdisciplinary physical therapy together with pharmacological treatment constitute conservative treatment strategies for low back pain (LBP). There is still a lack of high-quality studies aimed at an objective evaluation of physiotherapeutic procedures according to their effectiveness in LBP. The aim of this study is to carry out a prospective, randomized, single-blinded, and placebo-controlled clinical trial to evaluate the effectiveness of magnetic fields in discopathy-related LBP. A group of 177 patients was assessed for eligibility based on inclusion and exclusion criteria. In the end, 106 patients were randomly assigned into 5 comparative groups: A (n = 23; magnetic therapy: 10 mT, 50 Hz); B (n = 23; magnetic therapy: 5 mT, 50 Hz); C (n = 20; placebo magnetic therapy); D (n = 20; magnetic stimulation: 49.2 μT, 195 Hz); and E (n = 20; placebo magnetic stimulation). All patients were assessed using tests for pain intensity, degree of disability and range of motion. Also, postural stability was assessed using a stabilographic platform. In this study, positive changes in all clinical outcomes were demonstrated in group A (p < 0.05). It was determined that the application of magnetic therapy (10 mT, 50 Hz, 20 min) significantly reduces pain symptoms and leads to an improvement of functional ability in patients with LBP.
The RANDOM computer program: A linear congruential random number generator
Miles, R. F., Jr.
1986-01-01
The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) the RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) the RANCYCLE and ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
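The linear congruential recurrence that RANDOM implements and tests is compact enough to sketch directly. The multiplier and modulus below are the classic "minimal standard" Lehmer parameters, used here for illustration only; they are not necessarily the parameters the RANDOM program selects.

```python
class LCG:
    """Linear congruential generator: x_{n+1} = (a*x_n + c) mod m."""

    def __init__(self, seed, a=16807, c=0, m=2**31 - 1):
        self.a, self.c, self.m = a, c, m
        self.state = seed % m

    def next_int(self):
        # Advance the recurrence and return the new integer state.
        self.state = (self.a * self.state + self.c) % self.m
        return self.state

    def next_float(self):
        # Uniform variate in (0, 1).
        return self.next_int() / self.m

gen = LCG(seed=42)
samples = [gen.next_float() for _ in range(5)]
```

Parameter-selection tools like RANCYCLE are concerned with choices of a, c, and m that give a full (or maximal) period; with a multiplicative LCG and prime modulus, as above, the period divides m − 1.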
Hilário, M.; Hollander, den W.Th.F.; Sidoravicius, V.; Soares dos Santos, R.; Teixeira, A.
2014-01-01
In this paper we study a random walk in a one-dimensional dynamic random environment consisting of a collection of independent particles performing simple symmetric random walks in a Poisson equilibrium with density ρ ∈ (0, ∞). At each step the random walk performs a nearest-neighbour jump, moving to
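A Monte Carlo sketch of such a dynamic random environment is straightforward to set up. The transition rule below (step right with probability P_OCC on an occupied site, P_VAC on a vacant one) and the uniform placement of roughly ρ particles per site are illustrative assumptions, not the paper's exact specification.

```python
import random

random.seed(0)
SITES, STEPS = 201, 100    # periodic lattice; walker starts at the centre
RHO = 1.0                  # mean particle density of the environment
P_OCC, P_VAC = 0.7, 0.3    # illustrative drift rules (assumption)

# Environment: about RHO particles per site, each performing an independent
# simple symmetric random walk on the same lattice.
particles = [random.randrange(SITES) for _ in range(int(RHO * SITES))]
walker = SITES // 2

for _ in range(STEPS):
    # Every environment particle makes a nearest-neighbour symmetric jump.
    particles = [(p + random.choice((-1, 1))) % SITES for p in particles]
    # The walker's jump distribution depends on the current occupancy.
    p_right = P_OCC if walker in set(particles) else P_VAC
    walker = (walker + (1 if random.random() < p_right else -1)) % SITES

displacement = walker - SITES // 2
```

Averaging `displacement` over many independent runs would estimate the walker's speed, which in this class of models depends on the density ρ and on the occupancy-dependent drift.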
Energy Technology Data Exchange (ETDEWEB)
Pitton, Michael B., E-mail: michael.pitton@unimedizin-mainz.de; Kloeckner, Roman [Johannes Gutenberg University Medical Center, Department of Diagnostic and Interventional Radiology (Germany); Ruckes, Christian [Johannes Gutenberg University Medical Center, IZKS (Germany); Wirth, Gesine M. [Johannes Gutenberg University Medical Center, Department of Diagnostic and Interventional Radiology (Germany); Eichhorn, Waltraud [Johannes Gutenberg University Medical Center, Department of Nuclear Medicine (Germany); Wörns, Marcus A.; Weinmann, Arndt [Johannes Gutenberg University Medical Center, Department of Internal Medicine (Germany); Schreckenberger, Mathias [Johannes Gutenberg University Medical Center, Department of Nuclear Medicine (Germany); Galle, Peter R. [Johannes Gutenberg University Medical Center, Department of Internal Medicine (Germany); Otto, Gerd [Johannes Gutenberg University Medical Center, Department of Transplantation Surgery (Germany); Dueber, Christoph [Johannes Gutenberg University Medical Center, Department of Diagnostic and Interventional Radiology (Germany)
2015-04-15
Purpose To prospectively compare SIRT and DEB-TACE for treating hepatocellular carcinoma (HCC). Methods From 04/2010–07/2012, 24 patients with histologically proven unresectable N0, M0 HCCs were randomized 1:1 to receive SIRT or DEB-TACE. SIRT could be repeated once in case of recurrence, while TACE was repeated every 6 weeks until no viable tumor tissue was detected by MRI or contraindications prohibited further treatment. Patients were followed up by MRI every 3 months; the final evaluation was 05/2013. Results Both groups were comparable in demographics (SIRT: 8 males/4 females, mean age 72 ± 7 years; TACE: 10 males/2 females, mean age 71 ± 9 years), initial tumor load (1 patient ≥25 % in each group), and BCLC (Barcelona Clinic Liver Cancer) stage (SIRT: 12×B; TACE: 1×A, 11×B). Median progression-free survival (PFS) was 180 days for SIRT versus 216 days for TACE patients (p = 0.6193), with a median TTP of 371 days versus 336 days, respectively (p = 0.5764). Median OS was 592 days for SIRT versus 788 days for TACE patients (p = 0.9271). Seven patients died in each group. Causes of death were liver failure (n = 4, SIRT group), tumor progression (n = 4, TACE group), cardiovascular events, and inconclusive (n = 1 in each group). Conclusions No significant differences were found in median PFS, OS, and TTP. The lower rate of tumor progression in the SIRT group was nullified by a greater incidence of liver failure. This pilot study is the first prospective randomized trial comparing SIRT and TACE for treating HCC, and results can be used for sample size calculations of future studies.
Energy Technology Data Exchange (ETDEWEB)
Tamura, Akio, E-mail: a.akahane@gmail.com [Department of Radiology, Iwate Medical University School of Medicine, 19-1 Uchimaru, Morioka 020-8505 (Japan); Kato, Kenichi, E-mail: kkato@iwate-med.ac.jp [Department of Radiology, Iwate Medical University School of Medicine, 19-1 Uchimaru, Morioka 020-8505 (Japan); Kamata, Masayoshi, E-mail: kamataaoi@yahoo.co.jp [Iwate Medical University Hospital, 19-1 Uchimaru, Morioka 020-8505 (Japan); Suzuki, Tomohiro, E-mail: suzukitomohiro123@gmail.com [Department of Radiology, Iwate Medical University School of Medicine, 19-1 Uchimaru, Morioka 020-8505 (Japan); Suzuki, Michiko, E-mail: mamimichiko@me.com [Department of Radiology, Iwate Medical University School of Medicine, 19-1 Uchimaru, Morioka 020-8505 (Japan); Nakayama, Manabu, E-mail: gakuymgt@yahoo.co.jp [Department of Radiology, Iwate Medical University School of Medicine, 19-1 Uchimaru, Morioka 020-8505 (Japan); Tomabechi, Makiko, E-mail: mtomabechi@mac.com [Department of Radiology, Iwate Medical University School of Medicine, 19-1 Uchimaru, Morioka 020-8505 (Japan); Nakasato, Tatsuhiko, E-mail: nakasato77@gmail.com [Department of Radiology, Southern Tohoku Research Institute for Neuroscience, 7-115 Yatsuyamada, Koriyama 963-8563 (Japan); Ehara, Shigeru, E-mail: ehara@iwate-med.ac.jp [Department of Radiology, Iwate Medical University School of Medicine, 19-1 Uchimaru, Morioka 020-8505 (Japan)
2017-02-15
Highlights: • We compared 24-gauge side-hole and conventional 22-gauge end-hole catheters in MDCT. • The 24-gauge side-hole catheter is noninferior to the 22-gauge end-hole catheter. • The 24-gauge side-hole catheter is safe and facilitates optimal enhancement quality. • The 24-gauge side-hole catheter is suitable for patients with narrow or fragile veins. - Abstract: Purpose: To compare the 24-gauge side-holes catheter and conventional 22-gauge end-hole catheter in terms of safety, injection pressure, and contrast enhancement on multi-detector computed tomography (MDCT). Materials & methods: In a randomized single-center study, 180 patients were randomized to either the 24-gauge side-holes catheter or the 22-gauge end-hole catheter groups. The primary endpoint was safety during intravenous administration of contrast material for MDCT, using a non-inferiority analysis (lower limit 95% CI greater than −10% non-inferiority margin for the group difference). The secondary endpoints were injection pressure and contrast enhancement. Results: A total of 174 patients were analyzed for safety during intravenous contrast material administration for MDCT. The overall extravasation rate was 1.1% (2/174 patients); 1 (1.2%) minor episode occurred in the 24-gauge side-holes catheter group and 1 (1.1%) in the 22-gauge end-hole catheter group (difference: 0.1%, 95% CI: −3.17% to 3.28%, non-inferiority P = 1). The mean maximum pressure was higher with the 24-gauge side-holes catheter than with the 22-gauge end-hole catheter (8.16 ± 0.95 kg/cm{sup 2} vs. 4.79 ± 0.63 kg/cm{sup 2}, P < 0.001). The mean contrast enhancement of the abdominal aorta, celiac artery, superior mesenteric artery, and pancreatic parenchyma in the two groups were not significantly different. Conclusion: In conclusion, our study showed that the 24-gauge side-holes catheter is safe and suitable for delivering iodine with a concentration of 300 mg/mL at a flow-rate of 3 mL/s, and it may contribute to
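The non-inferiority comparison above reduces to a confidence interval for a difference in proportions checked against the −10% margin. The sketch below uses a simple Wald interval, with group sizes reconstructed from the reported percentages of the 174 analyzed patients (an assumption for illustration; the trial's exact method may differ slightly, which is why the lower limit lands near, not exactly on, the reported −3.17%).

```python
import math

# Extravasation events / group sizes (sizes inferred from the reported
# 1.2% and 1.1% of 174 analyzed patients -- an assumption).
x1, n1 = 1, 83   # 24-gauge side-holes catheter
x2, n2 = 1, 91   # 22-gauge end-hole catheter
margin = -0.10   # non-inferiority margin: -10 percentage points

p1, p2 = x1 / n1, x2 / n2
diff = p1 - p2
se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
ci = (diff - 1.96 * se, diff + 1.96 * se)

# Non-inferiority is concluded when the entire CI lies above the margin.
non_inferior = ci[0] > margin
```

With events this rare a Wald interval is a rough approximation; exact or score-based intervals (e.g. Newcombe's method) are often preferred in practice.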
Takahashi, Fumihiro; Morita, Satoshi
2018-02-08
Phase II clinical trials are conducted to determine the optimal dose of the study drug for use in Phase III clinical trials while also balancing efficacy and safety. In conducting these trials, it may be important to consider subpopulations of patients grouped by background factors such as drug metabolism and kidney and liver function. Determining the optimal dose, as well as maximizing the effectiveness of the study drug by analyzing patient subpopulations, requires a complex decision-making process. In extreme cases, drug development has to be terminated due to inadequate efficacy or severe toxicity. Such a decision may be based on a particular subpopulation. We propose a Bayesian utility approach (BUART) to randomized Phase II clinical trials which uses a first-order bivariate normal dynamic linear model for efficacy and safety in order to determine the optimal dose and study population in a subsequent Phase III clinical trial. We carried out a simulation study under a wide range of clinical scenarios to evaluate the performance of the proposed method in comparison with a conventional method separately analyzing efficacy and safety in each patient population. The proposed method showed more favorable operating characteristics in determining the optimal population and dose.
International Nuclear Information System (INIS)
Tahir-Kheli, R.A.
1975-01-01
A few simple problems relating to random magnetic systems are presented. Translational symmetry is assumed for these systems only on the macroscopic scale. On the microscopic scale, the parameters of the various regions of these systems form a random set obeying given probability distributions. Knowledge of the form of these probability distributions is assumed in all cases.
Cai, Panpan; Tang, Xiaohong; Qin, Wei; Ji, Ling; Li, Zi
2016-04-01
The goal of this systematic review is to evaluate the efficacy and safety of paricalcitol versus active non-selective vitamin D receptor activators (VDRAs) for secondary hyperparathyroidism (SHPT) management in chronic kidney disease (CKD) patients. PubMed, EMBASE, Cochrane Central Register of Controlled Trials (CENTRAL), clinicaltrials.gov (inception to September 2015), and the ASN Web site were searched for relevant studies. A meta-analysis of randomized controlled trials (RCTs) and quasi-RCTs that assessed the effects and adverse events of paricalcitol and active non-selective VDRAs in adult CKD patients with SHPT was performed using Review Manager 5.2. A total of 10 trials involving 734 patients were identified for this review. The quality of included trials was limited, and very few trials reported all-cause mortality or cardiovascular calcification, without any differences between the two groups. Compared with active non-selective VDRAs, paricalcitol showed no significant difference in both PTH reduction (MD -7.78, 95% CI -28.59-13.03, P = 0.46) and the proportion of patients who achieved the target reduction of PTH (OR 1.27, 95% CI 0.87-1.85, P = 0.22). In addition, no statistical differences were found in terms of serum calcium, episodes of hypercalcemia, serum phosphorus, calcium × phosphorus products, and bone metabolism index. Current evidence is insufficient to show that paricalcitol is superior to active non-selective VDRAs in lowering PTH or reducing the burden of mineral loading. Further trials are required to prove the tissue-selective effect of paricalcitol and to overcome the limitations of current research.
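Pooled estimates such as the MD of −7.78 (95% CI −28.59 to 13.03) above come from inverse-variance weighting of per-study effects. A minimal fixed-effect sketch follows, with invented study values (the review pooled ten real trials; Review Manager's fixed-effect model uses the same arithmetic):

```python
import math

# (mean difference, standard error) per study -- values invented for
# illustration, not taken from the review.
studies = [(-5.0, 8.0), (-10.0, 12.0), (-7.5, 9.0)]

# Each study is weighted by the inverse of its variance.
weights = [1.0 / se**2 for _, se in studies]
pooled = sum(w * md for (md, _), w in zip(studies, weights)) / sum(weights)
se_pooled = math.sqrt(1.0 / sum(weights))
ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
```

A random-effects model would additionally inflate each study's variance by a between-study heterogeneity component (e.g. the DerSimonian-Laird estimate) before weighting.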
Lecor, Papa Abdou; Touré, Babacar; Boucher, Yves
2018-03-01
This study aimed at analyzing the effect of the temporary removal of trigeminal dental afferents on electrogustometric thresholds (EGMt). EGMt were measured in 300 healthy subjects randomized in three groups, in nine loci on the right and left sides (RS, LS) of the tongue surface before and after anesthesia. Group IAN (n = 56 RS, n = 44 LS) received intraosseous local anesthesia of the inferior alveolar nerve (IAN). Group MdN received a mandibular nerve (MdN) block targeting the IAN before its entrance into the mandibular foramen (n = 60 RS and n = 40 LS); group MxN, receiving maxillary nerve (MxN) anesthesia (n = 56 RS and n = 44 LS), was the control group. Differences between mean EGMt were analyzed with the Wilcoxon test; correlation between type of anesthesia and EGMt was assessed with Spearman's rho, all with a level of significance set at p ≤ 0.05. Significant EGMt (μA) differences before and after anesthesia were found in all loci with MdN and IAN on the ipsilateral side. Anesthesia of the MdN was positively correlated with the increase in EGMt, whereas anesthesia of the IAN was positively correlated only with the increase in EGMt measured at posterior and dorsal loci of the tongue surface. The increase in EGMt after anesthesia suggests a participation of dental afferents in taste perception. Extraction of teeth may impair food intake not only due to impaired masticatory ability but also due to alteration of neurological trigemino-gustatory interactions. PACTR201602001452260.
Henderson, Robert A; Jarvis, Christopher; Clayton, Tim; Pocock, Stuart J; Fox, Keith A A
2015-08-04
The RITA-3 (Third Randomised Intervention Treatment of Angina) trial compared outcomes of a routine early invasive strategy (coronary arteriography and myocardial revascularization, as clinically indicated) to those of a selective invasive strategy (coronary arteriography for recurrent ischemia only) in patients with non-ST-segment elevation acute coronary syndrome (NSTEACS). At a median of 5 years' follow-up, the routine invasive strategy was associated with a 24% reduction in the odds of all-cause mortality. This study reports 10-year follow-up outcomes of the randomized cohort to determine the impact of a routine invasive strategy on longer-term mortality. We randomized 1,810 patients with NSTEACS to receive routine invasive or selective invasive strategies. All randomized patients had annual follow-up visits up to 5 years, and mortality was documented thereafter using data from the Office of National Statistics. Over 10 years, there were no differences in mortality between the 2 groups (all-cause deaths in 225 [25.1%] vs. 232 patients [25.4%]: p = 0.94; and cardiovascular deaths in 135 [15.1%] vs. 147 patients [16.1%]: p = 0.65 in the routine invasive and selective invasive groups, respectively). Multivariate analysis identified several independent predictors of 10-year mortality: age, previous myocardial infarction, heart failure, smoking status, diabetes, heart rate, and ST-segment depression. A modified post-discharge Global Registry of Acute Coronary Events (GRACE) score was used to calculate an individual risk score for each patient and to form low-risk, medium-risk, and high-risk groups. Risk of death within 10 years varied markedly from 14.4 % in the low-risk group to 56.2% in the high-risk group. This mortality trend did not depend on the assigned treatment strategy. The advantage of reduced mortality of routine early invasive strategy seen at 5 years was attenuated during later follow-up, with no evidence of a difference in outcome at 10 years
Adlerberth, A; Stenström, G; Hasselgren, P O
1987-01-01
Despite the increasing use of beta-blocking agents alone as preoperative treatment of patients with hyperthyroidism, there are no controlled clinical studies in which this regimen has been compared with a more conventional preoperative treatment. Thirty patients with newly diagnosed and untreated hyperthyroidism were randomized to preoperative treatment with methimazole in combination with thyroxine (Group I) or the beta 1-blocking agent metoprolol (Group II). Metoprolol was used since it has been demonstrated that the beneficial effect of beta-blockade in hyperthyroidism is mainly due to beta 1-blockade. The preoperative, intraoperative, and postoperative courses in the two groups were compared, and patients were followed up for 1 year after thyroidectomy. At the time of diagnosis, serum concentration of triiodothyronine (T3) was 6.1 +/- 0.59 nmol/L in Group I and 5.7 +/- 0.66 nmol/L in Group II (reference interval 1.5-3.0 nmol/L). Clinical improvement during preoperative treatment was similar in the two groups of patients, but serum T3 was normalized only in Group I. The median length of preoperative treatment was 12 weeks in Group I and 5 weeks in Group II (p less than 0.01). There were no serious adverse effects of the drugs during preoperative preparation in either treatment group. Operating time, consistency and vascularity of the thyroid gland, and intraoperative blood loss were similar in the two groups. No anesthesiologic or cardiovascular complications occurred during operation in either group. One patient in Group I (7%) and three patients in Group II (20%) had clinical signs of hyperthyroid function during the first postoperative day. These symptoms were abolished by the administration of small doses of metoprolol, and no case of thyroid storm occurred. Postoperative hypocalcemia or recurrent laryngeal nerve paralysis did not occur in either group. During the first postoperative year, hypothyroidism developed in two patients in Group I (13%) and in six
Directory of Open Access Journals (Sweden)
Sipilä Sarianna
2011-12-01
Full Text Available Abstract Background To cope at their homes, community-dwelling older people surviving a hip fracture need a sufficient amount of functional ability and mobility. There is a lack of evidence on the best practices supporting recovery after hip fracture. The purpose of this article is to describe the design, intervention and demographic baseline results of a study investigating the effects of a rehabilitation program aiming to restore mobility and functional capacity among community-dwelling participants after hip fracture. Methods/Design A population-based sample of over 60-year-old community-dwelling men and women operated on for hip fracture (n = 81, mean age 79 years, 78% women) participated in this study and were randomly allocated into control (Standard Care) and ProMo intervention groups on average 10 weeks post fracture and 6 weeks after discharge to home. Standard Care included a written home exercise program with 5-7 exercises for the lower limbs. Of all participants, 12 got a referral to physiotherapy. After discharge to home, only 50% adhered to Standard Care. None of the participants were followed up for Standard Care or mobility recovery. The ProMo intervention included Standard Care and a year-long program including evaluation/modification of environmental hazards, guidance for safe walking, pain management, a progressive home exercise program and physical activity counseling. Measurements included a comprehensive battery of laboratory tests and self-reports on mobility limitation, disability, physical functional capacity and health, as well as assessments of the key prerequisites for mobility, disability and functional capacity. All assessments were performed blinded at the research laboratory. No significant differences were observed between intervention and control groups in any of the demographic variables. Discussion Ten weeks post hip fracture only half of the participants were compliant with Standard Care. No follow-up for Standard Care or
Randomized random walk on a random walk
International Nuclear Information System (INIS)
Lee, P.A.
1983-06-01
This paper discusses generalizations of the model introduced by Kehr and Kunter of the random walk of a particle on a one-dimensional chain which in turn has been constructed by a random walk procedure. The superimposed random walk is randomised in time according to the occurrences of a stochastic point process. The probability of finding the particle in a particular position at a certain instant is obtained explicitly in the transform domain. It is found that the asymptotic behaviour for large time of the mean-square displacement of the particle depends critically on the assumed structure of the basic random walk, giving a diffusion-like term for an asymmetric walk or a square root law if the walk is symmetric. Many results are obtained in closed form for the Poisson process case, and these agree with those given previously by Kehr and Kunter. (author)
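The dependence of the mean-square displacement on the structure of the basic walk can be checked numerically. In the sketch below the chain is laid out by a base random walk (symmetric, or purely directed as a degenerate asymmetric case), and the superimposed walk jumps at the event times of a Poisson process; all parameter values are illustrative.

```python
import random

random.seed(1)

def mean_square_displacement(t_max=20.0, trials=300, symmetric_chain=True):
    """Real-space MSD of a particle hopping between neighbouring chain
    sites at Poisson(rate=1) event times, the chain itself having been
    constructed by a random walk procedure (sketch of the model above)."""
    total = 0.0
    n_sites, origin = 401, 200
    for _ in range(trials):
        # Real-space coordinate of each chain site.
        coords = [0] * n_sites
        for k in range(origin + 1, n_sites):
            coords[k] = coords[k - 1] + (random.choice((-1, 1))
                                         if symmetric_chain else 1)
        for k in range(origin - 1, -1, -1):
            coords[k] = coords[k + 1] - (random.choice((-1, 1))
                                         if symmetric_chain else 1)
        # Superimposed walk, randomised in time by a Poisson point process.
        site, t = origin, random.expovariate(1.0)
        while t < t_max:
            site += random.choice((-1, 1))
            t += random.expovariate(1.0)
        total += (coords[site] - coords[origin]) ** 2
    return total / trials

msd_sym = mean_square_displacement(symmetric_chain=True)
msd_asym = mean_square_displacement(symmetric_chain=False)  # coords[k] = k - origin
```

For the directed chain the MSD grows linearly in time (ordinary diffusion), while for the symmetric chain it follows the slower square-root law, so `msd_sym` should come out well below `msd_asym`.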
Hua, Alexandra; Major, Nili
2016-02-01
Selective mutism is a disorder in which an individual fails to speak in certain social situations though speaks normally in other settings. Most commonly, this disorder initially manifests when children fail to speak in school. Selective mutism results in significant social and academic impairment in those affected by it. This review will summarize the current understanding of selective mutism with regard to diagnosis, epidemiology, cause, prognosis, and treatment. Studies over the past 20 years have consistently demonstrated a strong relationship between selective mutism and anxiety, most notably social phobia. These findings have led to the recent reclassification of selective mutism as an anxiety disorder in the Diagnostic and Statistical Manual of Mental Disorders, 5th Edition. In addition to anxiety, several other factors have been implicated in the development of selective mutism, including communication delays and immigration/bilingualism, adding to the complexity of the disorder. In the past few years, several randomized studies have supported the efficacy of psychosocial interventions based on a graduated exposure to situations requiring verbal communication. Less data are available regarding the use of pharmacologic treatment, though there are some studies that suggest a potential benefit. Selective mutism is a disorder that typically emerges in early childhood and is currently conceptualized as an anxiety disorder. The development of selective mutism appears to result from the interplay of a variety of genetic, temperamental, environmental, and developmental factors. Although little has been published about selective mutism in the general pediatric literature, pediatric clinicians are in a position to play an important role in the early diagnosis and treatment of this debilitating condition.
Banon, J.-P.; Hetland, Ø. S.; Simonsen, I.
2018-02-01
By the use of both perturbative and non-perturbative solutions of the reduced Rayleigh equation, we present a detailed study of the scattering of light from two-dimensional weakly rough dielectric films. It is shown that for several rough film configurations, Selényi interference rings exist in the diffusely scattered light. For film systems supported by dielectric substrates where only one of the two interfaces of the film is weakly rough and the other planar, Selényi interference rings are observed at angular positions that can be determined from simple phase arguments. For such single-rough-interface films, we find and explain by a single scattering model that the contrast in the interference patterns is better when the top interface of the film (the interface facing the incident light) is rough than when the bottom interface is rough. When both film interfaces are rough, Selényi interference rings exist but a potential cross-correlation of the two rough interfaces of the film can be used to selectively enhance some of the interference rings while others are attenuated and might even disappear. This feature may in principle be used in determining the correlation properties of interfaces of films that otherwise would be difficult to access.
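The "simple phase arguments" for single-rough-interface films amount to requiring that diffusely scattered light interferes constructively after the round trip through the film: 2·n_film·d·cos θ_t = m·λ, with θ_t the propagation angle inside the film and the observation angle in vacuum obtained from Snell's law. A small sketch with illustrative film parameters (these values are assumptions, not the paper's):

```python
import math

n_film = 1.5          # film refractive index (illustrative)
d = 2.0e-6            # film thickness in metres (illustrative)
lam = 632.8e-9        # vacuum wavelength (He-Ne laser)

ring_angles = []      # polar scattering angles in vacuum, in degrees
for m in range(1, int(2 * n_film * d / lam) + 1):
    # Constructive-interference condition fixes the internal angle.
    cos_t = m * lam / (2 * n_film * d)
    if cos_t > 1.0:
        continue
    # Refract back into vacuum; angles past total internal reflection
    # never emerge and produce no ring.
    sin_out = n_film * math.sin(math.acos(cos_t))
    if sin_out <= 1.0:
        ring_angles.append(math.degrees(math.asin(sin_out)))
```

Only the orders that survive the total-internal-reflection cut appear as rings, which is why thicker films and higher-index films produce different ring counts at a given wavelength.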
DEFF Research Database (Denmark)
Laursen, Peter Nørkjær; Holmvang, L.; Kelbæk, H.
2017-01-01
Background: The extent of selection bias due to drop-out in clinical trials of ST-elevation myocardial infarction (STEMI) using cardiovascular magnetic resonance (CMR) as surrogate endpoints is unknown. We sought to interrogate the characteristics and prognosis of patients who dropped out before...... years of follow-up were assessed and compared between CMR-drop-outs and CMR-participants using the trial screening log and the Eastern Danish Heart Registry. Results: The drop-out rate from acute CMR was 28% (n = 92). These patients had a significantly worse clinical risk profile upon admission...... as evaluated by the TIMI-risk score (3.7 (± 2.1) vs 4.0 (± 2.6), p = 0.043) and by left ventricular ejection fraction (43 (± 9) vs. 47 (± 10), p = 0.029). CMR drop-outs had a higher incidence of known hypertension (39% vs. 35%, p = 0.043), known diabetes (14% vs. 7%, p = 0.025), known cardiac disease (11% vs...
Directory of Open Access Journals (Sweden)
Chi Zhenhai
2010-06-01
Full Text Available Abstract Background Knee osteoarthritis is a major cause of pain and functional limitation. Complementary and alternative medical approaches have been employed to relieve symptoms and to avoid the side effects of conventional medication. Moxibustion has been widely used to treat patients with knee osteoarthritis. Our past research suggested heat-sensitive moxibustion might be superior to conventional moxibustion. Our objective is to investigate the effectiveness of heat-sensitive moxibustion compared with conventional moxibustion or conventional drug treatment. Methods This study consists of a multi-centre (four centres in China), randomised, controlled trial with three parallel arms (A: heat-sensitive moxibustion; B: conventional moxibustion; C: conventional drug group). The moxibustion locations differ between groups A and B. Group A selects heat-sensitized acupoints from the region consisting of Yin Lingquan (SP9), Yang Lingquan (GB34), Liang Qiu (ST34), and Xue Hai (SP10). Meanwhile, fixed acupoints are used in group B, namely Xi Yan (EX-LE5) and He Ding (EX-LE2). The conventional drug group is treated with intra-articular sodium hyaluronate injection. The outcome measures will be assessed before treatment, 30 days after the last moxibustion session, and 6 months after the last moxibustion session. Discussion This trial will utilize high-quality trial methodologies in accordance with CONSORT guidelines. It will provide evidence for the effectiveness of moxibustion as a treatment for moderate and severe knee osteoarthritis. Moreover, the result will clarify the rules of heat-sensitive moxibustion location to improve the therapeutic effect of suspended moxibustion, and propose a new concept and a new theory of moxibustion to guide clinical practice. Trial Registration The trial is registered at Controlled Clinical Trials: ChiCTR-TRC-00000600.
Vanmarcke, Erik
1983-03-01
Random variation over space and time is one of the few attributes that might safely be predicted as characterizing almost any given complex system. Random fields or "distributed disorder systems" confront astronomers, physicists, geologists, meteorologists, biologists, and other natural scientists. They appear in the artifacts developed by electrical, mechanical, civil, and other engineers. They even underlie the processes of social and economic change. The purpose of this book is to bring together existing and new methodologies of random field theory and indicate how they can be applied to these diverse areas where a "deterministic treatment is inefficient and conventional statistics insufficient." Many new results and methods are included. After outlining the extent and characteristics of the random field approach, the book reviews the classical theory of multidimensional random processes and introduces basic probability concepts and methods in the random field context. It next gives a concise account of the second-order analysis of homogeneous random fields, in both the space-time domain and the wave number-frequency domain. This is followed by a chapter on spectral moments and related measures of disorder and on level excursions and extremes of Gaussian and related random fields. After developing a new framework of analysis based on local averages of one-, two-, and n-dimensional processes, the book concludes with a chapter discussing ramifications in the important areas of estimation, prediction, and control. The mathematical prerequisite has been held to basic college-level calculus.
Secure Path Selection under Random Fading
Directory of Open Access Journals (Sweden)
Furqan Jameel
2017-05-01
Full Text Available Application-oriented Wireless Sensor Networks (WSNs) promise to be one of the most useful technologies of this century. However, secure communication between nodes in WSNs is still an unresolved issue. In this context, we propose two protocols (i.e., Optimal Secure Path (OSP) and Sub-optimal Secure Path (SSP)) to minimize the outage probability of secrecy capacity in the presence of multiple eavesdroppers. We consider dissimilar fading at the main and wiretap links and provide a detailed evaluation of the impact of Nakagami-m and Rician-K factors on the secrecy performance of WSNs. Extensive simulations are performed to validate our findings. Although the optimal scheme ensures more security, the sub-optimal scheme proves to be a more practical approach to securing wireless links.
Valeri, A; Mianné, D; Merouze, F; Bujan, L; Altobelli, A; Masson, J
1993-06-01
Scrotal hyperthermia can induce certain alterations in spermatogenesis. The basal scrotal temperature used to define hyperthermia is usually 33 degrees C. However, no study conducted according to a strict methodology has validated this mean measurement. We therefore randomly selected 258 men between the ages of 18 and 23 years from a population of 2,000 young French men seen at the National Service Selection Centre in order to measure the scrotal temperature over each testis and in the median raphe, and to determine the mean and median values for these temperatures. For a mean room temperature of 23 +/- 0.5 degrees C with a range of 18 to 31 degrees C, the mean right and left scrotal temperature was 34.2 +/- 0.1 degree C and the mean medioscrotal temperature was 34.4 +/- 0.1 degree C. Scrotal temperature was very significantly correlated with room temperature and its variations. It was therefore impossible to define a normal value for scrotal temperature. Only measurement of scrotal temperature at neutral room temperature, between 21 and 25 degrees C, is able to provide a reference value for scrotal temperature. In this study, the mean scrotal temperature under these conditions was 34.4 +/- 0.2 degree C, i.e. 2.5 degrees C less than body temperature. In the 12.9% of cases with left varicocele, left scrotal temperature was significantly higher than in the absence of varicocele and was also higher than right scrotal temperature. The authors also determined the dimensions of the testes. (ABSTRACT TRUNCATED AT 250 WORDS)
Marofi, Maryam; Sirousfard, Motahareh; Moeini, Mahin; Ghanadi, Alireza
2015-01-01
Background: Pain is the common complication after a surgery. The aim of this study was to evaluate the effect of aromatherapy with Rosa damascena Mill. on the postoperative pain in children. Materials and Methods: In a double-blind, placebo-controlled clinical trial, we selected 64 children of 3–6 years of age through convenient sampling and divided them randomly into two groups. Patients in group A were given inhalation aromatherapy with R. damascena Mill., and in group B, the patients were given almond oil as a placebo. Inhalation aromatherapy was used at the first time of subjects’ arrival to the ward and then at 3, 6, 9, and 12 h afterward. Common palliative treatments to relieve pain were used in both groups. Thirty minutes after aromatherapy, the postoperative pain in children was evaluated with the Toddler Preschooler Postoperative Pain Scale (TPPPS). Data were statistically analyzed using Chi-square test, one-way analysis of variance (ANOVA), and repeated measures ANOVA. Results: There was no significant difference in pain scores at the first time of subjects’ arrival to the ward (before receiving any aromatherapy or palliative care) between the two groups. After each time of aromatherapy and at the end of treatment, the pain score was significantly reduced in the aromatherapy group with R. damascena Mill. compared to the placebo group. Conclusions: According to our results, aromatherapy with R. damascena Mill. can be used in postoperative pain in children, together with other common treatments without any significant side effects. PMID:25878704
Woelk, Godfrey B; Kieffer, Mary Pat; Walker, Damilola; Mpofu, Daphne; Machekano, Rhoderick
2016-02-16
original study design. We purposively selected facilities in the districts/regions, though originally the study clusters were to be randomly selected. Lifelong antiretroviral therapy for all HIV-positive pregnant and lactating women, Option B+, was implemented in the three countries during the study period, with the potential for a differential impact by study arm. Implementation, however, was done rapidly across the districts/regions, so this potential confounding is unlikely. We developed a system for monitoring and documenting potentially confounding activities or actions, and these data will be incorporated into analyses at the conclusion of the project. Strengths of the study are that it tests multilevel interventions, utilizes program as well as study-specific and individual data, and is conducted under "real conditions", leading to more robust findings. Limitations of the protocol include the lack of a true control arm and inadequate control for the potential effect of Option B+, such as the intensification of messages on the importance of early ANC and male partner testing. ClinicalTrials.gov (study ID: NCT01971710). Protocol version 5, 30 July 2013, registered 13 August 2013.
Using Random Numbers in Science Research Activities.
Schlenker, Richard M.; And Others
1996-01-01
Discusses the importance of science process skills and describes ways to select sets of random numbers for selection of subjects for a research study in an unbiased manner. Presents an activity appropriate for grades 5-12. (JRH)
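The classroom selection procedure described above reduces to simple random sampling without replacement. A minimal sketch in Python (the function name and the pool of 30 numbered subjects are illustrative, not from the article):

```python
import random

def select_subjects(pool_size, sample_size, seed=None):
    """Draw `sample_size` distinct subject IDs from 1..pool_size so that
    every subject is equally likely to be chosen (simple random sampling)."""
    rng = random.Random(seed)  # seeding makes the draw reproducible
    return sorted(rng.sample(range(1, pool_size + 1), sample_size))

# Example: choose 5 of 30 numbered students for a study group.
print(select_subjects(30, 5, seed=42))
```

Seeding is optional: a fixed seed makes the selection reproducible for auditing, while omitting it yields a fresh unbiased draw each run.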
Ueberall, Michael A; Mueller-Schwefe, Gerhard H H
2016-01-01
To evaluate the benefit-risk profile (BRP) of oxycodone/naloxone (OXN) and tapentadol (TAP) in patients with chronic low back pain (cLBP) with a neuropathic component (NC) in routine clinical practice. This was a blinded end point analysis of randomly selected 12-week routine/open-label data of the German Pain Registry on adult patients with cLBP-NC who initiated an index treatment in compliance with the current German prescribing information between 1st January and 31st October 2015 (OXN/TAP, n=128/133). Primary end point was defined as a composite of three efficacy components (≥30% improvement of pain, pain-related disability, and quality of life each at the end of observation vs baseline) and three tolerability components (normal bowel function, absence of either central nervous system side effects, and treatment-emergent adverse event [TEAE]-related treatment discontinuation during the observation period) adopted to reflect BRP assessments under real-life conditions. Demographic as well as baseline and pretreatment characteristics were comparable for the randomly selected data sets of both index groups without any indicators for critical selection biases. Treatment with OXN resulted formally in a BRP noninferior to that of TAP and showed a significantly higher primary end point response vs TAP (39.8% vs 25.6%, odds ratio: 1.93; P =0.014), due to superior analgesic effects. Between-group differences increased with stricter response definitions for all three efficacy components in favor of OXN: ≥30%/≥50%/≥70% response rates for OXN vs TAP were seen for pain intensity in 85.2%/67.2%/39.1% vs 83.5%/54.1%/15.8% ( P = ns/0.031/<0.001), for pain-related disability in 78.1%/64.8%/43.8% vs 66.9%/50.4%/24.8% ( P =0.043/0.018/0.001), and for quality of life in 76.6%/68.0%/50.0% vs 63.9%/54.1%/34.6% ( P =0.026/0.022/0.017). Overall, OXN vs TAP treatments were well tolerated, and proportions of patients who either maintained a normal bowel function (68.0% vs 72
International Nuclear Information System (INIS)
Tsallis, C.
1980-03-01
The 'ingredients' which control a phase transition in well defined systems as well as in random ones (e.g. random magnetic systems) are listed and discussed within a somewhat unifying perspective. Among these 'ingredients' we find the couplings and elements responsible for the cooperative phenomenon, the topological connectivity as well as possible topological incompatibilities, the influence of new degrees of freedom, the order parameter dimensionality, the ground state degeneracy and finally the 'quanticity' of the system. The general trends, though illustrated in magnetic systems, essentially hold for all phase transitions, and give a basis for connection of this area with Field theory, Theory of dynamical systems, etc. (Author) [pt
International Nuclear Information System (INIS)
Tsallis, C.
1981-01-01
The 'ingredients' which control a phase transition in well defined systems as well as in random ones (e.g. random magnetic systems) are listed and discussed within a somewhat unifying perspective. Among these 'ingredients' we find the couplings and elements responsible for the cooperative phenomenon, the topological connectivity as well as possible topological incompatibilities, the influence of new degrees of freedom, the order parameter dimensionality, the ground state degeneracy and finally the 'quanticity' of the system. The general trends, though illustrated in magnetic systems, essentially hold for all phase transitions, and give a basis for connection of this area with Field theory, Theory of dynamical systems, etc. (Author) [pt
Watkins, Edward; Newbold, Alexandra; Tester-Jones, Michelle; Javaid, Mahmood; Cadman, Jennifer; Collins, Linda M; Graham, John; Mostazir, Mohammod
2016-10-06
Depression is a global health challenge. Although there are effective psychological and pharmaceutical interventions, our best treatments achieve remission rates less than 1/3 and limited sustained recovery. Underpinning this efficacy gap is limited understanding of how complex psychological interventions for depression work. Recent reviews have argued that the active ingredients of therapy need to be identified so that therapy can be made briefer, more potent, and to improve scalability. This in turn requires the use of rigorous study designs that test the presence or absence of individual therapeutic elements, rather than standard comparative randomised controlled trials. One such approach is the Multiphase Optimization Strategy, which uses efficient experimentation such as factorial designs to identify active factors in complex interventions. This approach has been successfully applied to behavioural health but not yet to mental health interventions. A Phase III randomised, single-blind balanced fractional factorial trial, based in England and conducted on the internet, randomized at the level of the patient, will investigate the active ingredients of internet cognitive-behavioural therapy (CBT) for depression. Adults with depression (operationalized as PHQ-9 score ≥ 10), recruited directly from the internet and from an UK National Health Service Improving Access to Psychological Therapies service, will be randomized across seven experimental factors, each reflecting the presence versus absence of specific treatment components (activity scheduling, functional analysis, thought challenging, relaxation, concreteness training, absorption, self-compassion training) using a 32-condition balanced fractional factorial design (2 IV 7-2 ). The primary outcome is symptoms of depression (PHQ-9) at 12 weeks. Secondary outcomes include symptoms of anxiety and process measures related to hypothesized mechanisms. Better understanding of the active ingredients of
Directory of Open Access Journals (Sweden)
Edward Watkins
2016-10-01
Full Text Available Abstract Background Depression is a global health challenge. Although there are effective psychological and pharmaceutical interventions, our best treatments achieve remission rates less than 1/3 and limited sustained recovery. Underpinning this efficacy gap is limited understanding of how complex psychological interventions for depression work. Recent reviews have argued that the active ingredients of therapy need to be identified so that therapy can be made briefer, more potent, and to improve scalability. This in turn requires the use of rigorous study designs that test the presence or absence of individual therapeutic elements, rather than standard comparative randomised controlled trials. One such approach is the Multiphase Optimization Strategy, which uses efficient experimentation such as factorial designs to identify active factors in complex interventions. This approach has been successfully applied to behavioural health but not yet to mental health interventions. Methods/Design A Phase III randomised, single-blind balanced fractional factorial trial, based in England and conducted on the internet, randomized at the level of the patient, will investigate the active ingredients of internet cognitive-behavioural therapy (CBT) for depression. Adults with depression (operationalized as PHQ-9 score ≥ 10), recruited directly from the internet and from an UK National Health Service Improving Access to Psychological Therapies service, will be randomized across seven experimental factors, each reflecting the presence versus absence of specific treatment components (activity scheduling, functional analysis, thought challenging, relaxation, concreteness training, absorption, self-compassion training) using a 32-condition balanced fractional factorial design (2 IV 7-2). The primary outcome is symptoms of depression (PHQ-9) at 12 weeks. Secondary outcomes include symptoms of anxiety and process measures related to hypothesized mechanisms
Gerholm, Tove; Hörberg, Thomas; Tonér, Signe; Kallioinen, Petter; Frankenberg, Sofia; Kjällander, Susanne; Palmer, Anna; Taguchi, Hillevi Lenz
2018-06-19
During the preschool years, children develop abilities and skills in areas crucial for later success in life. These abilities include language, executive functions, attention, and socioemotional skills. The pedagogical methods used in preschools hold the potential to enhance these abilities, but our knowledge of which pedagogical practices aid which abilities, and for which children, is limited. The aim of this paper is to describe an intervention study designed to evaluate and compare two pedagogical methodologies in terms of their effect on the above-mentioned skills in Swedish preschool children. The study is a randomized control trial (RCT) where two pedagogical methodologies were tested to evaluate how they enhanced children's language, executive functions and attention, socioemotional skills, and early maths skills during an intensive 6-week intervention. Eighteen preschools including 28 units and 432 children were enrolled in a municipality close to Stockholm, Sweden. The children were between 4;0 and 6;0 years old and each preschool unit was randomly assigned to either of the interventions or to the control group. Background information on all children was collected via questionnaires completed by parents and preschools. Pre- and post-intervention testing consisted of a test battery including tests on language, executive functions, selective auditive attention, socioemotional skills and early maths skills. The interventions consisted of 6 weeks of intensive practice of either a socioemotional and material learning paradigm (SEMLA), for which group-based activities and interactional structures were the main focus, or an individual, digitally implemented attention and math training paradigm, which also included a set of self-regulation practices (DIL). All preschools were evaluated with the ECERS-3. If this intervention study shows evidence of a difference between group-based learning paradigms and individual training of specific skills in terms of
DEFF Research Database (Denmark)
Asmussen, J.C.; Ibrahim, S.R.; Brincker, Rune
Abstract This paper demonstrates how to use the Random Decrement (RD) technique for identification of linear structures subjected to ambient excitation. The theory behind the technique will be presented and guidelines on how to choose the different variables will be given. This is done by introducing...
DEFF Research Database (Denmark)
Asmussen, J. C.; Ibrahim, S. R.; Brincker, Rune
This paper demonstrates how to use the Random Decrement (RD) technique for identification of linear structures subjected to ambient excitation. The theory behind the technique will be presented and guidelines on how to choose the different variables will be given. This is done by introducing a new...
DEFF Research Database (Denmark)
Asmussen, J. C.; Ibrahim, R.; Brincker, Rune
1998-01-01
This paper demonstrates how to use the Random Decrement (RD) technique for identification of linear structures subjected to ambient excitation. The theory behind the technique will be presented and guidelines on how to choose the different variables will be given. This is done by introducing a new...
International Nuclear Information System (INIS)
Bennett, D.L.; Brene, N.; Nielsen, H.B.
1986-06-01
The goal of random dynamics is the derivation of the laws of Nature as we know them (standard model) from inessential assumptions. The inessential assumptions made here are expressed as sets of general models at extremely high energies: gauge glass and spacetime foam. Both sets of models lead tentatively to the standard model. (orig.)
International Nuclear Information System (INIS)
Bennett, D.L.
1987-01-01
The goal of random dynamics is the derivation of the laws of Nature as we know them (standard model) from inessential assumptions. The inessential assumptions made here are expressed as sets of general models at extremely high energies: Gauge glass and spacetime foam. Both sets of models lead tentatively to the standard model. (orig.)
Bennett, D. L.; Brene, N.; Nielsen, H. B.
1987-01-01
The goal of random dynamics is the derivation of the laws of Nature as we know them (standard model) from inessential assumptions. The inessential assumptions made here are expressed as sets of general models at extremely high energies: gauge glass and spacetime foam. Both sets of models lead tentatively to the standard model.
Direct random insertion mutagenesis of Helicobacter pylori.
Jonge, de R.; Bakker, D.; Vliet, van AH; Kuipers, E.J.; Vandenbroucke-Grauls, C.M.J.E.; Kusters, J.G.
2003-01-01
Random insertion mutagenesis is a widely used technique for the identification of bacterial virulence genes. Most strategies for random mutagenesis involve cloning in Escherichia coli for passage of plasmids or for phenotypic selection. This can result in biased selection due to restriction or
Direct random insertion mutagenesis of Helicobacter pylori
de Jonge, Ramon; Bakker, Dennis; van Vliet, Arnoud H. M.; Kuipers, Ernst J.; Vandenbroucke-Grauls, Christina M. J. E.; Kusters, Johannes G.
2003-01-01
Random insertion mutagenesis is a widely used technique for the identification of bacterial virulence genes. Most strategies for random mutagenesis involve cloning in Escherichia coli for passage of plasmids or for phenotypic selection. This can result in biased selection due to restriction or
Sahebkar, Amirhossein; Serban, Corina; Ursoniu, Sorin; Wong, Nathan D; Muntner, Paul; Graham, Ian M; Mikhailidis, Dimitri P; Rizzo, Manfredi; Rysz, Jacek; Sperling, Laurence S; Lip, Gregory Y H; Banach, Maciej
2015-01-01
Numerous studies have suggested that oral supplementation with resveratrol exerts cardioprotective effects, but evidence of the effects on C-reactive protein (CRP) plasma levels and other cardiovascular (CV) risk factors is inconclusive. Therefore, we performed a meta-analysis to evaluate the efficacy of resveratrol supplementation on plasma CRP concentrations and selected predictors of CV risk. The search included PUBMED, Cochrane Library, Web of Science, Scopus, and EMBASE (up to August 31, 2014) to identify RCTs investigating the effects of resveratrol supplementation on selected CV risk factors. Quantitative data synthesis was performed using a random-effects model, with weighted mean difference (WMD) and 95% confidence intervals (CI) as summary statistics. Meta-analysis of data from 10 RCTs (11 treatment arms) did not support a significant effect of resveratrol supplementation in altering plasma CRP concentrations (WMD: -0.144 mg/L, 95% CI: -0.968-0.680, p = 0.731). Resveratrol supplementation was not found to alter plasma levels of total cholesterol (WMD: 1.49 mg/dL, 95% CI: -14.96-17.93, p = 0.859), low density lipoprotein cholesterol (WMD: -0.31 mg/dL, 95% CI: -9.57-8.95, p = 0.948), triglycerides (WMD: 2.67 mg/dL, 95% CI: -28.34-33.67, p = 0.866), and glucose (WMD: 1.28 mg/dL, 95% CI: -5.28-7.84, p = 0.703). It also slightly reduced high density lipoprotein cholesterol concentrations (WMD: -4.18 mg/dL, 95% CI: -6.54 to -1.82, p = 0.001). Likewise, no significant effect was observed on systolic (WMD: 0.82 mmHg, 95% CI: -8.86-10.50, p = 0.868) and diastolic blood pressure (WMD: 1.72 mm Hg, 95% CI: -6.29-9.73, p=0.674). This meta-analysis of available RCTs does not suggest any benefit of resveratrol supplementation on CV risk factors. Larger, well-designed trials are necessary to confirm these results. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Random number generation and creativity.
Bains, William
2008-01-01
A previous paper suggested that humans can generate genuinely random numbers. I tested this hypothesis by repeating the experiment with a larger number of highly numerate subjects, asking them to call out a sequence of digits selected from 0 through 9. The resulting sequences were substantially non-random, with an excess of sequential pairs of numbers and a deficit of repeats of the same number, in line with previous literature. However, the previous literature suggests that humans generate random numbers with substantial conscious effort, and distractions which reduce that effort reduce the randomness of the numbers. I reduced my subjects' concentration by asking them to call out in another language, and with alcohol - neither affected the randomness of their responses. This suggests that the ability to generate random numbers is a 'basic' function of the human mind, even if those numbers are not mathematically 'random'. I hypothesise that there is a 'creativity' mechanism which, while not truly random, provides novelty as part of the mind's defence against closed programming loops, and that testing for the effects seen here in people more or less familiar with numbers or with spontaneous creativity could identify more features of this process. It is possible that training to perform better at simple random generation tasks could help to increase creativity, by training people to reduce the conscious mind's suppression of the 'spontaneous', creative response to new questions.
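The two departures from randomness reported here — an excess of sequential pairs and a deficit of immediate repeats — can be counted directly from a called-out sequence. A sketch under the assumption that digits wrap around (9 followed by 0 counts as sequential); the function name and the example digits are illustrative, not the study's data:

```python
def pair_counts(digits):
    """Count immediate repeats (d, d) and ascending sequential pairs
    (d, (d + 1) mod 10) among adjacent digits. For a uniform random
    sequence, each kind occurs at an expected rate of 1/10 per pair."""
    repeats = sum(1 for a, b in zip(digits, digits[1:]) if b == a)
    sequential = sum(1 for a, b in zip(digits, digits[1:]) if b == (a + 1) % 10)
    return repeats, sequential

called_out = [3, 4, 5, 9, 0, 1, 7, 7, 2, 3]  # hypothetical subject output
print(pair_counts(called_out))  # → (1, 5): 1 repeat, 5 sequential pairs
```

Comparing these counts against the 1/10 baseline over many subjects is what reveals the biases the abstract describes.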
Gurau, Razvan
2017-01-01
Written by the creator of the modern theory of random tensors, this book is the first self-contained introductory text to this rapidly developing theory. Starting from notions familiar to the average researcher or PhD student in mathematical or theoretical physics, the book presents in detail the theory and its applications to physics. The recent detections of the Higgs boson at the LHC and gravitational waves at LIGO mark new milestones in Physics confirming long standing predictions of Quantum Field Theory and General Relativity. These two experimental results only reinforce today the need to find an underlying common framework of the two: the elusive theory of Quantum Gravity. Over the past thirty years, several alternatives have been proposed as theories of Quantum Gravity, chief among them String Theory. While these theories are yet to be tested experimentally, key lessons have already been learned. Whatever the theory of Quantum Gravity may be, it must incorporate random geometry in one form or another....
Pseudo-Random Number Generators
Howell, L. W.; Rheinfurth, M. H.
1984-01-01
The package features a comprehensive selection of probabilistic distributions. Monte Carlo simulations are resorted to whenever the systems studied are not amenable to deterministic analyses or when direct experimentation is not feasible. Random numbers having certain specified distribution characteristics are an integral part of such simulations. The package consists of a collection of "pseudorandom" number generators for use in Monte Carlo simulations.
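A generator for a specified distribution is typically built on top of a uniform pseudorandom source by inverse-transform sampling. A minimal sketch for the exponential distribution (an illustration of the general technique, not the package's actual interface):

```python
import math
import random

def exponential_variates(rate, n, seed=None):
    """Inverse-transform sampling: if U ~ Uniform(0, 1), then
    -ln(1 - U) / rate follows an Exponential(rate) distribution."""
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / rate for _ in range(n)]

samples = exponential_variates(rate=2.0, n=100_000, seed=1)
print(sum(samples) / len(samples))  # sample mean, close to 1/rate = 0.5
```

The same recipe works for any distribution whose inverse cumulative distribution function is computable; otherwise rejection sampling is the usual fallback.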
International Nuclear Information System (INIS)
Guo Ya'nan; Jin Dapeng; Zhao Dixin; Liu Zhen'an; Qiao Qiao; Chinese Academy of Sciences, Beijing
2007-01-01
Due to the randomness of radioactive decay and nuclear reactions, the signals from detectors are random in time, whereas a normal pulse generator generates periodic pulses. To measure the performance of nuclear electronic devices under random inputs, a random pulse generator is necessary. Types of random pulse generators are reviewed, and 2 digital random pulse generators are introduced. (authors)
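Radioactive decay yields pulses whose arrival times follow a Poisson process, i.e. exponentially distributed gaps between pulses. A software sketch of this timing model (illustrative only; the generators described in the paper are hardware devices):

```python
import random

def poisson_pulse_times(mean_rate_hz, duration_s, seed=None):
    """Generate pulse timestamps with exponentially distributed
    inter-arrival times, mimicking the random timing of detector signals."""
    rng = random.Random(seed)
    t, times = 0.0, []
    while True:
        t += rng.expovariate(mean_rate_hz)  # gap ~ Exponential(rate)
        if t > duration_s:
            return times
        times.append(t)

pulses = poisson_pulse_times(mean_rate_hz=1000.0, duration_s=1.0, seed=7)
print(len(pulses))  # roughly 1000 pulses in one simulated second
```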
Random matrices and random difference equations
International Nuclear Information System (INIS)
Uppuluri, V.R.R.
1975-01-01
Mathematical models leading to products of random matrices and random difference equations are discussed. A one-compartment model with random behavior is introduced, and it is shown how the average concentration in the discrete time model converges to the exponential function. This is of relevance to understanding how radioactivity gets trapped in bone structure in blood-bone systems. The ideas are then generalized to two-compartment models and mammillary systems, where products of random matrices appear in a natural way. The appearance of products of random matrices in applications in demography and control theory is considered. Then random sequences motivated from the following problems are studied: constant pulsing and random decay models, random pulsing and constant decay models, and random pulsing and random decay models
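The convergence claimed for the one-compartment model can be checked by simulation: if each particle independently survives each discrete step of length Δt with probability 1 − λΔt, the average retained fraction after t/Δt steps approaches e^(−λt). A sketch with illustrative parameter names, not the paper's notation:

```python
import random

def mean_retention(rate, t, steps, particles, seed=None):
    """Discrete-time one-compartment model: each particle survives each
    step of length t/steps with probability 1 - rate * (t/steps).
    As steps grows, the mean retained fraction tends to exp(-rate * t)."""
    rng = random.Random(seed)
    survive = 1.0 - rate * (t / steps)
    alive = particles
    for _ in range(steps):
        alive = sum(1 for _ in range(alive) if rng.random() < survive)
    return alive / particles

print(mean_retention(rate=1.0, t=1.0, steps=100, particles=20_000, seed=3))
# close to exp(-1) ≈ 0.368
```

With 100 steps the deterministic limit is (1 − 0.01)^100 ≈ 0.366, already near e^(−1); refining the step size closes the remaining gap.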
DEFF Research Database (Denmark)
Jensen, Lisette Okkels; Thayssen, Per; Tilsted, Hans Henrik
2010-01-01
with Clinical Outcome (SORT OUT) IV trial was designed as a prospective, multi-center, open-label, all-comer, two-arm, randomized, non-inferiority study comparing the everolimus-eluting stent with the sirolimus-eluting stent in the treatment of atherosclerotic coronary artery lesions. Based on a non...
Topics in random walks in random environment
International Nuclear Information System (INIS)
Sznitman, A.-S.
2004-01-01
Over the last twenty-five years random motions in random media have been intensively investigated and some new general methods and paradigms have by now emerged. Random walks in random environment constitute one of the canonical models of the field. However in dimension bigger than one they are still poorly understood and many of the basic issues remain to this day unresolved. The present series of lectures attempt to give an account of the progresses which have been made over the last few years, especially in the study of multi-dimensional random walks in random environment with ballistic behavior. (author)
Directory of Open Access Journals (Sweden)
Moynihan Clare
2012-11-01
Full Text Available Abstract Background Evidence suggests that poor recruitment into clinical trials rests on a patient 'deficit' model – an inability to comprehend trial processes. Poor communication has also been cited as a possible barrier to recruitment. A qualitative patient interview study was included within the feasibility stage of a phase III non-inferiority Randomized Controlled Trial (RCT) (SPARE, CRUK/07/011) in muscle invasive bladder cancer. The aim was to illuminate problems in the context of randomization. Methods The qualitative study used a 'Framework Analysis' that included 'constant comparison', in which semi-structured interviews are transcribed, analyzed, compared and contrasted both between and within transcripts. Three researchers coded and interpreted the data. Results Twenty-four patients agreed to enter the interview study: 10 decliners of randomization and 14 accepters, of whom 2 subsequently declined their allocated treatment. The main theme applying to the majority of the sample was confusion and ambiguity. There was little indication that confusion directly impacted on decisions to enter the SPARE trial. However, confusion did appear to impact on ethical considerations surrounding 'informed consent', as well as cause a sense of alienation between patients and health personnel. Sub-optimal communication in many guises accounted for the confusion, together with the logistical elements of a trial that involved treatment options delivered in a number of geographical locations. Conclusions These data highlight the difficulty of providing balanced and clear trial information within the UK health system, despite best intentions. Involvement of multiple professionals can impact on communication processes with patients who are considering participation in RCTs. Our results led us to question the 'deficit' model of patient behavior. It is suggested that health professionals might consider facilitating a context in which patients
RANDOM WALK HYPOTHESIS IN FINANCIAL MARKETS
Directory of Open Access Journals (Sweden)
Nicolae-Marius JULA
2017-05-01
Full Text Available The random walk hypothesis states that stock market prices do not follow a predictable trajectory but are simply random. Before trying to predict a set of data, one should test it for randomness, because, despite the power and complexity of the models used, the results cannot otherwise be trustworthy. There are several methods for testing these hypotheses, and the computational power provided by the R environment makes the researcher's work easier and cost-effective. The increasing power of computing and the continuous development of econometric tests should give potential investors new tools for selecting commodities and investing in efficient markets.
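One standard randomness check of the kind recommended above is the Wald-Wolfowitz runs test on the signs of successive price changes. A sketch in Python rather than R (the helper name is illustrative; the normal z-approximation assumes a reasonably long series containing both signs):

```python
import math
import random

def runs_test_z(changes):
    """Wald-Wolfowitz runs test on the signs of a series of changes.
    Under the random walk hypothesis the z-score is approximately
    standard normal; |z| > 1.96 suggests non-randomness at the 5% level."""
    signs = [c > 0 for c in changes if c != 0]  # drop zero changes
    n1 = sum(signs)                             # positive changes
    n2 = len(signs) - n1                        # negative changes
    runs = 1 + sum(1 for a, b in zip(signs, signs[1:]) if a != b)
    mu = 2.0 * n1 * n2 / (n1 + n2) + 1.0        # expected number of runs
    var = (mu - 1.0) * (mu - 2.0) / (n1 + n2 - 1.0)
    return (runs - mu) / math.sqrt(var)

rng = random.Random(0)
iid_changes = [rng.gauss(0.0, 1.0) for _ in range(2000)]
print(abs(runs_test_z(iid_changes)) < 1.96)  # True with high probability
```

A strongly trending or strictly alternating series produces far too few or far too many runs and hence a large |z|, which is exactly the signal that a predictive model might be worth fitting.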
International Nuclear Information System (INIS)
Kalay, Z; Ben-Naim, E
2015-01-01
We study fragmentation of a random recursive tree into a forest by repeated removal of nodes. The initial tree consists of N nodes and is generated by sequential addition of nodes, each new node attaching to a randomly-selected existing node. As nodes are removed from the tree, one at a time, the tree dissolves into an ensemble of separate trees, namely, a forest. We study statistical properties of trees and nodes in this heterogeneous forest, and find that the fraction of remaining nodes m characterizes the system in the limit N → ∞. We obtain analytically the size density ϕ_s of trees of size s. The size density has a power-law tail ϕ_s ∼ s^(−α) with exponent α = 1 + (1/m). Therefore, the tail becomes steeper as further nodes are removed, and the fragmentation process is unusual in that the exponent α increases continuously with time. We also extend our analysis to the case where nodes are added as well as removed, and obtain the asymptotic size density for growing trees. (paper)
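The fragmentation process described here is straightforward to simulate: grow a random recursive tree, delete a random subset of nodes, and measure the surviving components. A sketch with illustrative names; the run below exercises the mechanics rather than fitting the full power-law tail:

```python
import random
from collections import Counter

def forest_sizes(n, m, seed=None):
    """Grow a random recursive tree on n nodes (each new node attaches to
    a uniformly chosen existing node), delete random nodes until a fraction
    m remains, and return the sizes of the trees in the resulting forest."""
    rng = random.Random(seed)
    parent = [None] + [rng.randrange(i) for i in range(1, n)]
    order = list(range(n))
    rng.shuffle(order)
    alive = set(order[n - int(m * n):])  # keep the last m*n shuffled nodes
    # Union-find over surviving parent-child edges to label components.
    root = {v: v for v in alive}
    def find(v):
        while root[v] != v:
            root[v] = root[root[v]]  # path halving
            v = root[v]
        return v
    for v in alive:
        p = parent[v]
        if p is not None and p in alive:
            root[find(v)] = find(p)
    return Counter(find(v) for v in alive)  # component label -> tree size

sizes = forest_sizes(n=20_000, m=0.5, seed=11)
print(len(sizes), max(sizes.values()))  # number of trees, largest tree
```

Repeating this over many seeds and fitting the histogram of tree sizes is one way to check the predicted tail exponent α = 1 + 1/m.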
International Nuclear Information System (INIS)
Ben-Naim, E; Krapivsky, P L
2010-01-01
We investigate a network growth model in which the genealogy controls the evolution. In this model, a new node selects a random target node and links either to this target node, or to its parent, or to its grandparent, etc.; all nodes from the target node to its most ancient ancestor are equiprobable destinations. The emerging random ancestor tree is very shallow: the fraction g_n of nodes at distance n from the root decreases super-exponentially with n, g_n = e^(−1)/(n − 1)!. We find that a macroscopic hub at the root coexists with highly connected nodes at higher generations. The maximal degree of a node at the nth generation grows algebraically as N^(1/β_n), where N is the system size. We obtain the series of nontrivial exponents, which are roots of transcendental equations: β_1 ≈ 1.351746, β_2 ≈ 1.682201, etc. As a consequence, the fraction p_k of nodes with degree k has an algebraic tail, p_k ∼ k^(−γ), with γ = β_1 + 1 = 2.351746.
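A quick simulation of this growth rule can check the predicted g_1 = e^(−1) ≈ 0.368. The Python sketch below (function names and sample size are my own choices) grows the ancestor tree and measures the fraction of nodes attached directly to the root:

```python
import math
import random

def grow_ancestor_tree(n_nodes, rng):
    """Each new node picks a uniform target and attaches to a node drawn
    uniformly from the target's ancestral line (target, parent, ..., root)."""
    parent, depth = [-1], [0]              # node 0 is the root
    for _ in range(1, n_nodes):
        line = [rng.randrange(len(depth))]
        while parent[line[-1]] != -1:      # walk up to the root
            line.append(parent[line[-1]])
        dest = rng.choice(line)
        parent.append(dest)
        depth.append(depth[dest] + 1)
    return depth

rng = random.Random(3)
depth = grow_ancestor_tree(50000, rng)
frac_gen1 = depth.count(1) / len(depth)
print(frac_gen1, math.exp(-1))     # fraction at distance 1 vs predicted e^-1
```

Since the tree is very shallow, the walk to the root is short and the whole simulation runs in roughly linear time; the mean depth implied by g_n = e^(−1)/(n−1)! is 2.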
Random survival forests for competing risks
DEFF Research Database (Denmark)
Ishwaran, Hemant; Gerds, Thomas A; Kogalur, Udaya B
2014-01-01
We introduce a new approach to competing risks using random forests. Our method is fully non-parametric and can be used for selecting event-specific variables and for estimating the cumulative incidence function. We show that the method is highly effective for both prediction and variable selection...
Fukuoka, Masahiro; Wu, Yi-Long; Thongprasert, Sumitra; Sunpaweravong, Patrapim; Leong, Swan-Swan; Sriuranpong, Virote; Chao, Tsu-Yi; Nakagawa, Kazuhiko; Chu, Da-Tong; Saijo, Nagahiro; Duffield, Emma L; Rukazenkov, Yuri; Speake, Georgina; Jiang, Haiyi; Armour, Alison A; To, Ka-Fai; Yang, James Chih-Hsin; Mok, Tony S K
2011-07-20
The results of the Iressa Pan-Asia Study (IPASS), which compared gefitinib and carboplatin/paclitaxel in previously untreated never-smokers and light ex-smokers with advanced pulmonary adenocarcinoma, were published previously. This report presents overall survival (OS) and efficacy according to epidermal growth factor receptor (EGFR) biomarker status. In all, 1,217 patients were randomly assigned. Biomarkers analyzed were EGFR mutation (amplification refractory mutation system; 437 patients evaluable), EGFR gene copy number (fluorescent in situ hybridization; 406 patients evaluable), and EGFR protein expression (immunohistochemistry; 365 patients evaluable). OS analysis was performed at 78% maturity. A Cox proportional hazards model was used to assess biomarker status by randomly assigned treatment interactions for progression-free survival (PFS) and OS. OS (954 deaths) was similar for gefitinib and carboplatin/paclitaxel with no significant difference between treatments overall (hazard ratio [HR], 0.90; 95% CI, 0.79 to 1.02; P = .109) or in EGFR mutation-positive (HR, 1.00; 95% CI, 0.76 to 1.33; P = .990) or EGFR mutation-negative (HR, 1.18; 95% CI, 0.86 to 1.63; P = .309; treatment by EGFR mutation interaction P = .480) subgroups. A high proportion (64.3%) of EGFR mutation-positive patients randomly assigned to carboplatin/paclitaxel received subsequent EGFR tyrosine kinase inhibitors. PFS was significantly longer with gefitinib for patients whose tumors had both high EGFR gene copy number and EGFR mutation (HR, 0.48; 95% CI, 0.34 to 0.67) but significantly shorter when high EGFR gene copy number was not accompanied by EGFR mutation (HR, 3.85; 95% CI, 2.09 to 7.09). EGFR mutations are the strongest predictive biomarker for PFS and tumor response to first-line gefitinib versus carboplatin/paclitaxel. The predictive value of EGFR gene copy number was driven by coexisting EGFR mutation (post hoc analysis). Treatment-related differences observed for PFS in the EGFR
Nonlinear Pricing with Random Participation
Jean-Charles Rochet; Lars A. Stole
2002-01-01
The canonical selection contracting programme takes the agent's participation decision as deterministic and finds the optimal contract, typically satisfying this constraint for the worst type. Upon weakening this assumption of known reservation values by introducing independent randomness into the agents' outside options, we find that some of the received wisdom from mechanism design and nonlinear pricing is not robust and the richer model which allows for stochastic participation affords a m...
Random broadcast on random geometric graphs
Energy Technology Data Exchange (ETDEWEB)
Bradonjic, Milan [Los Alamos National Laboratory; Elsasser, Robert [UNIV OF PADERBORN; Friedrich, Tobias [ICSI/BERKELEY; Sauerwald, Tomas [ICSI/BERKELEY
2009-01-01
In this work, we consider the random broadcast time on random geometric graphs (RGGs). The classic random broadcast model, also known as the push algorithm, is defined as follows: starting with one informed node, in each succeeding round every informed node chooses one of its neighbors uniformly at random and informs it. We consider the random broadcast time on RGGs, when with high probability: (i) the RGG is connected, (ii) there exists a giant component in the RGG. We show that the random broadcast time is bounded by O(√n + diam(component)), where diam(component) is the diameter of the entire graph, or of the giant component, for regimes (i) and (ii), respectively. In other words, for both regimes, we derive the broadcast time to be Θ(diam(G)), which is asymptotically optimal.
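The push protocol is easy to simulate on a sampled RGG. In the sketch below the radius and graph size are chosen by me so that connectivity is likely (they are not from the paper), and the helper returns None if the sampled graph happens to be disconnected:

```python
import math
import random

def push_broadcast_rounds(n, radius, rng, max_rounds=10000):
    """Rounds for the push protocol to inform all nodes of an RGG on the
    unit square; returns None if the sampled graph is not connected."""
    pts = [(rng.random(), rng.random()) for _ in range(n)]
    nbrs = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(pts[i], pts[j]) <= radius:
                nbrs[i].append(j)
                nbrs[j].append(i)
    informed, rounds = {0}, 0
    while len(informed) < n and rounds < max_rounds:
        rounds += 1
        # every informed node pushes the message to one uniform neighbor
        for u in list(informed):
            if nbrs[u]:
                informed.add(rng.choice(nbrs[u]))
    return rounds if len(informed) == n else None

rng = random.Random(5)
rounds = push_broadcast_rounds(300, 0.2, rng)
print(rounds)
```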
Quantumness, Randomness and Computability
International Nuclear Information System (INIS)
Solis, Aldo; Hirsch, Jorge G
2015-01-01
Randomness plays a central role in the quantum mechanical description of our interactions. We review the relationship between the violation of Bell inequalities, non-signaling and randomness. We discuss the challenge in defining a random string, and show that algorithmic information theory provides a necessary condition for randomness using Borel normality. We close with a view on incomputability and its implications in physics. (paper)
How random is a random vector?
Eliazar, Iddo
2015-12-01
Over 80 years ago Samuel Wilks proposed that the "generalized variance" of a random vector is the determinant of its covariance matrix. To date, the notion and use of the generalized variance is confined only to very specific niches in statistics. In this paper we establish that the "Wilks standard deviation" -the square root of the generalized variance-is indeed the standard deviation of a random vector. We further establish that the "uncorrelation index" -a derivative of the Wilks standard deviation-is a measure of the overall correlation between the components of a random vector. Both the Wilks standard deviation and the uncorrelation index are, respectively, special cases of two general notions that we introduce: "randomness measures" and "independence indices" of random vectors. In turn, these general notions give rise to "randomness diagrams"-tangible planar visualizations that answer the question: How random is a random vector? The notion of "independence indices" yields a novel measure of correlation for Lévy laws. In general, the concepts and results presented in this paper are applicable to any field of science and engineering with random-vectors empirical data.
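For intuition, the Wilks standard deviation is simply sqrt(det(Σ)) for the sample covariance matrix Σ. A small pure-Python sketch for two-dimensional data (helper names are mine):

```python
import math

def covariance_matrix(data):
    """Sample covariance matrix of a list of equal-length vectors."""
    n, d = len(data), len(data[0])
    means = [sum(col) / n for col in zip(*data)]
    return [[sum((row[i] - means[i]) * (row[j] - means[j]) for row in data)
             / (n - 1) for j in range(d)] for i in range(d)]

def wilks_std_2d(data):
    """Wilks standard deviation, sqrt(det(cov)), for 2-D data."""
    c = covariance_matrix(data)
    det = c[0][0] * c[1][1] - c[0][1] * c[1][0]
    return math.sqrt(max(det, 0.0))       # clamp tiny negative round-off

# perfectly correlated components make the covariance matrix singular,
# so the Wilks standard deviation collapses to (numerically) zero;
# the uncorrelated unit-square corners give det = 1/9, i.e. 1/3
singular = wilks_std_2d([(x, 2 * x) for x in range(10)])
square = wilks_std_2d([(0, 0), (1, 0), (0, 1), (1, 1)])
print(singular, square)
```

The collapse to zero for perfectly correlated data illustrates why the determinant, rather than the trace, captures "overall randomness" of a vector.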
Ytzhak, A; Doron, Y; Lahat, E; Livne, A
2012-10-01
Selective mutism is an uncommon disorder of young children, in which a child selectively does not speak in certain social situations while speaking easily in others. Many etiologies have been proposed for selective mutism, including psychodynamic, behavioral and familial ones. A developmental etiology that includes insights from all of the above is gaining support. Accordingly, a mild language impairment in a child with an anxiety trait may be at the root of developing selective mutism. The behavior is then reinforced by an avoidant pattern in the family. Early treatment and follow-up for children with selective mutism is important. The treatment includes non-pharmacological therapy (psychodynamic, behavioral and familial) and pharmacologic therapy, mainly selective serotonin reuptake inhibitors (SSRIs).
DEFF Research Database (Denmark)
Fast, Søren; Hegedus, Laszlo; Pacini, Furio
2014-01-01
with 131I-therapy. Methods: In this phase II, single-blinded, placebo-controlled study, 95 patients (57.2±9.6 years old, 85% women, 83% Caucasians) with MNG (median size 96.0 ml (31.9 - 242.2 ml)) were randomized to receive placebo (n=32), 0.01 mg MRrhTSH (n=30) or 0.03 mg MRrhTSH (n=33), 24 hours before...... a calculated 131I activity. Thyroid volume (TV) and smallest cross-sectional area of trachea (SCAT) were measured (by CT-scan) at baseline, month 6 and month 36. Thyroid function and quality of life (QoL) were evaluated at 3-month and yearly intervals, respectively. Results: At 6 months, TV reduction...... was enhanced in the 0.03 mg MRrhTSH group (32.9% versus 23.1% in the placebo group, p=0.03), but not in the 0.01 mg MRrhTSH group. At month 36 the mean percent TV reduction from baseline was 44 ± 12.7% (SD) in the placebo group, 41 ± 21.0% in the 0.01 mg MRrhTSH-group and 53 ± 18.6% in the 0.03 mg MRrh
The effect of selection on genetic parameter estimates
African Journals Online (AJOL)
Unknown
The South African Journal of Animal Science is available online at ... A simulation study was carried out to investigate the effect of selection on the estimation of genetic ... The model contained a fixed effect, random genetic and random.
47 CFR 1.1604 - Post-selection hearings.
2010-10-01
... 47 Telecommunication 1 (revised as of 2010-10-01) Post-selection hearings. Section 1.1604, Telecommunication, FEDERAL COMMUNICATIONS COMMISSION, GENERAL PRACTICE AND PROCEDURE, Random Selection Procedures for Mass Media Services, General Procedures § 1.1604 Post-selection hearings. (a) Following the random...
CERN PhotoLab
1968-01-01
To help resolve the problem of site selection for the proposed 300 GeV machine, the Council selected "three wise men" (left to right, J H Bannier of the Netherlands, A Chavanne of Switzerland and L K Boggild of Denmark).
DEFF Research Database (Denmark)
Hougaard, Jens Leth; Tvede, Mich
2002-01-01
Within a production theoretic framework, this paper considers an axiomatic approach to benchmark selection. It is shown that two simple and weak axioms, efficiency and comprehensive monotonicity, characterize a natural family of benchmarks which typically becomes unique. Further axioms are added...... in order to obtain a unique selection...
Bekker, Pirow; Dairaghi, Daniel; Seitz, Lisa; Leleti, Manmohan; Wang, Yu; Ertl, Linda; Baumgart, Trageen; Shugarts, Sarah; Lohr, Lisa; Dang, Ton; Miao, Shichang; Zeng, Yibin; Fan, Pingchen; Zhang, Penglie; Johnson, Daniel; Powers, Jay; Jaen, Juan; Charo, Israel; Schall, Thomas J
2016-01-01
The complement 5a receptor has been an attractive therapeutic target for many autoimmune and inflammatory disorders. However, development of a selective and potent C5aR antagonist has been challenging. Here we describe the characterization of CCX168 (avacopan), an orally administered selective and potent C5aR inhibitor. CCX168 blocked the C5a binding, C5a-mediated migration, calcium mobilization, and CD11b upregulation in U937 cells as well as in freshly isolated human neutrophils. CCX168 retains high potency when present in human blood. A transgenic human C5aR knock-in mouse model allowed comparison of the in vitro and in vivo efficacy of the molecule. CCX168 effectively blocked migration in in vitro and ex vivo chemotaxis assays, and it blocked the C5a-mediated neutrophil vascular endothelial margination. CCX168 was effective in migration and neutrophil margination assays in cynomolgus monkeys. This thorough in vitro and preclinical characterization enabled progression of CCX168 into the clinic and testing of its safety, tolerability, pharmacokinetic, and pharmacodynamic profiles in a Phase 1 clinical trial in 48 healthy volunteers. CCX168 was shown to be well tolerated across a broad dose range (1 to 100 mg) and it showed dose-dependent pharmacokinetics. An oral dose of 30 mg CCX168 given twice daily blocked the C5a-induced upregulation of CD11b in circulating neutrophils by 94% or greater throughout the entire day, demonstrating essentially complete target coverage. This dose regimen is being tested in clinical trials in patients with anti-neutrophil cytoplasmic antibody-associated vasculitis. Trial Registration ISRCTN registry with trial ID ISRCTN13564773.
O'Leary-Barrett, Maeve; Mâsse, Benoit; Pihl, Robert O; Stewart, Sherry H; Séguin, Jean R; Conrod, Patricia J
2017-10-01
Substance use and binge drinking during early adolescence are associated with neurocognitive abnormalities, mental health problems and an increased risk for future addiction. The trial aims to evaluate the protective effects of an evidence-based substance use prevention programme on the onset of alcohol and drug use in adolescence, as well as on cognitive, mental health and addiction outcomes over 5 years. Thirty-eight high schools will be recruited, with a final sample of 31 schools assigned to intervention or control conditions (3826 youth). Brief personality-targeted interventions will be delivered to high-risk youth attending intervention schools during the first year of the trial. Control school participants will receive no intervention above what is offered to them in the regular curriculum by their respective schools. Public/private French and English high schools in Montreal (Canada). All grade 7 students (12-13 years old) will be invited to participate. High-risk youth will be identified as those scoring one standard deviation or more above the school mean on one of the four personality subscales of the Substance Use Risk Profile Scale (40-45% youth). Self-reported substance use and mental health symptoms and cognitive functioning measured annually throughout 5 years. Primary outcomes are the onset of substance use disorders at 4 years post-intervention (year 5). Secondary intermediate outcomes are the onset of alcohol and substance use 2 years post-intervention and neuropsychological functions; namely, the protective effects of substance use prevention on cognitive functions generally, and executive functions and reward sensitivity specifically. This longitudinal, cluster-randomized controlled trial will investigate the impact of a brief personality-targeted intervention program on reducing the onset of addiction 4 years-post intervention. Results will tease apart the developmental sequences of uptake and growth in substance use and cognitive
Li, Guifang; Gao, Shunji; Sheng, Zhixin; Li, Bin
2016-01-01
To determine the efficacy of first-generation single-agent epidermal growth factor receptor (EGFR) tyrosine kinase inhibitor (TKI) therapy in advanced non-small-cell lung cancer patients with known EGFR mutation status, we undertook this pooled analysis. We searched for randomized controlled trials (RCTs) in Medline, Embase, the Cochrane Controlled Trials Register, the Science Citation Index, and the American Society of Clinical Oncology annual meetings. Out of 2,129 retrieved articles, 19 RCTs enrolling 2,016 patients with wild-type EGFR tumors and 1,034 patients with mutant EGFR tumors were identified. For these EGFR mutant patients, single-agent EGFR-TKI therapy improved progression-free survival (PFS) over chemotherapy: the summary hazard ratios (HRs) were 0.41 (p well as chemotherapy in the first-line setting (HR = 1.65, p = 0.03) and in the second-/third-line setting (HR = 1.27, p = 0.006). No statistically significant difference was observed in terms of overall survival (OS). Using platinum-based doublet chemotherapy as a common comparator, indirect comparison showed the superior efficacy of single-agent EGFR-TKI therapy over EGFR-TKIs added to chemotherapy in PFS [HR = 1.35 (1.03, 1.77), p = 0.03]. Additionally, a marginal trend towards the same direction was found in the OS analysis [HR = 1.16 (0.99, 1.35), p = 0.06]. Interestingly, for those EGFR wild-type tumors, single-agent EGFR-TKI therapy was inferior to EGFR-TKIs added to chemotherapy in PFS [HR = 0.38 (0.33, 0.44), p chemotherapy. However, single-agent EGFR-TKI therapy was inferior to chemotherapy in PFS for those EGFR wild-type patients. Single-agent EGFR-TKI therapy could improve PFS over the combination of EGFR-TKIs and chemotherapy in these EGFR mutant patients. However, EGFR-TKIs combined with chemotherapy could provide additive PFS and OS benefit over single-agent EGFR-TKI therapy in those EGFR wild-type patients. © 2016 S. Karger AG, Basel.
Randomizer for High Data Rates
Garon, Howard; Sank, Victor J.
2018-01-01
NASA as well as a number of other space agencies now recognize that the current recommended CCSDS randomizer used for telemetry (TM) is too short. When multiple applications of the PN8 maximal length sequence (MLS) are required in order to fully cover a channel access data unit (CADU), spectral problems in the form of elevated spurious discretes (spurs) appear. Originally the randomizer was called a bit transition generator (BTG), precisely because it was thought that its primary value was to ensure sufficient bit transitions to allow the bit/symbol synchronizer to lock and remain locked. We, NASA, have shown that the old BTG concept is a limited view of the real value of the randomizer sequence and that the randomizer also aids in signal acquisition as well as minimizing the potential for false decoder lock. Under the guidelines considered here, there are multiple maximal length sequences over GF(2) which appear attractive in this application. Although there may be mitigating reasons why another MLS sequence could be selected, one sequence in particular possesses a combination of desired properties which sets it apart from the others.
On a randomly imperfect spherical cap pressurized by a random ...
African Journals Online (AJOL)
On a randomly imperfect spherical cap pressurized by a random dynamic load. ... In this paper, we investigate a dynamical system in a random setting of dual ... characterization of the random process for determining the dynamic buckling load ...
Random walks, random fields, and disordered systems
Černý, Jiří; Kotecký, Roman
2015-01-01
Focusing on the mathematics that lies at the intersection of probability theory, statistical physics, combinatorics and computer science, this volume collects together lecture notes on recent developments in the area. The common ground of these subjects is perhaps best described by the three terms in the title: Random Walks, Random Fields and Disordered Systems. The specific topics covered include a study of Branching Brownian Motion from the perspective of disordered (spin-glass) systems, a detailed analysis of weakly self-avoiding random walks in four spatial dimensions via methods of field theory and the renormalization group, a study of phase transitions in disordered discrete structures using a rigorous version of the cavity method, a survey of recent work on interacting polymers in the ballisticity regime and, finally, a treatise on two-dimensional loop-soup models and their connection to conformally invariant systems and the Gaussian Free Field. The notes are aimed at early graduate students with a mod...
International Nuclear Information System (INIS)
Cortes Henao, Luis F.; Castro F, Carlos A.
2000-01-01
A review and discussion is presented of the characteristics and factors that relate activity and selectivity in the catalytic and non-catalytic partial oxidation of methane, and of the effect of variables such as temperature and pressure on the conversion of methane to methanol. The use of modified zeolites for the catalytic oxidation of natural gas is also considered.
Üstebay, D.; Castro, R.M.; Rabbat, M.
2009-01-01
Motivated by applications in compression and distributed transform coding, we propose a new gossip algorithm called Selective Gossip to efficiently compute sparse approximations of network data. We consider running parallel gossip algorithms on the elements of a vector of transform coefficients.
DEFF Research Database (Denmark)
Liu, Jianping; Kjaergard, Lise Lotte; Gluud, Christian
2002-01-01
The quality of randomization of Chinese randomized trials on herbal medicines for hepatitis B was assessed. Search strategy and inclusion criteria were based on the published protocol. One hundred and seventy-six randomized clinical trials (RCTs) involving 20,452 patients with chronic hepatitis B...... virus (HBV) infection were identified that tested Chinese medicinal herbs. They were published in 49 Chinese journals. Only 10% (18/176) of the studies reported the method by which they randomized patients. Only two reported allocation concealment and were considered as adequate. Twenty percent (30...
Lines of Descent Under Selection
Baake, Ellen; Wakolbinger, Anton
2017-11-01
We review recent progress on ancestral processes related to mutation-selection models, both in the deterministic and the stochastic setting. We mainly rely on two concepts, namely, the killed ancestral selection graph and the pruned lookdown ancestral selection graph. The killed ancestral selection graph gives a representation of the type of a random individual from a stationary population, based upon the individual's potential ancestry back until the mutations that define the individual's type. The pruned lookdown ancestral selection graph allows one to trace the ancestry of individuals from a stationary distribution back into the distant past, thus leading to the stationary distribution of ancestral types. We illustrate the results by applying them to a prototype model for the error threshold phenomenon.
Benchmarking Variable Selection in QSAR.
Eklund, Martin; Norinder, Ulf; Boyer, Scott; Carlsson, Lars
2012-02-01
Variable selection is important in QSAR modeling since it can improve model performance and transparency, as well as reduce the computational cost of model fitting and predictions. Which variable selection methods perform well in QSAR settings is largely unknown. To address this question we, in a total of 1728 benchmarking experiments, rigorously investigated how eight variable selection methods affect the predictive performance and transparency of random forest models fitted to seven QSAR datasets covering different endpoints, descriptor sets, types of response variables, and numbers of chemical compounds. The results show that univariate variable selection methods are suboptimal and that the number of variables in the benchmarked datasets can be reduced by about 60% without significant loss in model performance when using multivariate adaptive regression splines (MARS) and forward selection. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
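Forward selection of the kind benchmarked above can be sketched in a few lines. The pure-Python example below substitutes ordinary least squares for the random forest scorer (a deliberate simplification) and uses synthetic descriptors; all names and parameters are my own:

```python
import random

def ols_rss(X, y):
    """RSS of a least-squares fit via the normal equations
    (Gauss-Jordan elimination with partial pivoting; no intercept)."""
    d = len(X[0])
    A = [[sum(r[p] * r[q] for r in X) for q in range(d)]
         + [sum(r[p] * t for r, t in zip(X, y))] for p in range(d)]
    for c in range(d):
        piv = max(range(c, d), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        for r in range(d):
            if r != c and A[c][c]:
                f = A[r][c] / A[c][c]
                A[r] = [a - f * b for a, b in zip(A[r], A[c])]
    beta = [A[p][d] / A[p][p] for p in range(d)]
    return sum((t - sum(b * v for b, v in zip(beta, r))) ** 2
               for r, t in zip(X, y))

def forward_select(X, y, k):
    """Greedily add, k times, the descriptor that most reduces the RSS."""
    chosen, rest = [], list(range(len(X[0])))
    for _ in range(k):
        best = min(rest, key=lambda j: ols_rss(
            [[r[c] for c in chosen + [j]] for r in X], y))
        chosen.append(best)
        rest.remove(best)
    return chosen

rng = random.Random(0)
# six synthetic descriptors; only columns 0 and 3 drive the response
X = [[rng.gauss(0, 1) for _ in range(6)] for _ in range(200)]
y = [3 * r[0] - 2 * r[3] + rng.gauss(0, 0.1) for r in X]
selected = forward_select(X, y, 2)
print(selected)     # expected to recover columns 0 and 3
```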
Generating equilateral random polygons in confinement III
International Nuclear Information System (INIS)
Diao, Y; Ernst, C; Montemayor, A; Ziegler, U
2012-01-01
In this paper we continue our earlier studies (Diao et al 2011 J. Phys. A: Math. Theor. 44 405202, Diao et al J. Phys. A: Math. Theor. 45 275203) on the generation methods of random equilateral polygons confined in a sphere. The first half of this paper is concerned with the generation of confined equilateral random walks. We show that if the selection of a vertex is uniform subject to the position of its previous vertex and the confining condition, then the distributions of the vertices are not uniform, although there exists a distribution such that if the initial vertex is selected following this distribution, then all vertices of the random walk follow this same distribution. Thus in order to generate a confined equilateral random walk, the selection of a vertex cannot be uniform subject to the position of its previous vertex and the confining condition. We provide a simple algorithm capable of generating confined equilateral random walks whose vertex distribution is almost uniform in the confinement sphere. In the second half of this paper we show that any process generating confined equilateral random walks can be turned into a process generating confined equilateral random polygons with the property that the vertex distribution of the polygons approaches the vertex distribution of the walks as the polygons get longer and longer. In our earlier studies, the starting point of the confined polygon is fixed at the center of the sphere. The new approach here allows us to move the starting point of the confined polygon off the center of the sphere. (paper)
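The conditional-uniform process the authors analyze (each vertex uniform given its predecessor and the confinement) can be generated by rejection sampling. The sketch below is that baseline process, not the paper's improved algorithm; parameters are illustrative:

```python
import math
import random

def confined_equilateral_walk(n_steps, radius, rng):
    """Unit-step walk started at the centre of a sphere of the given
    radius (radius >= 1 required); each step direction is uniform on
    the unit sphere and is redrawn (rejection) until the next vertex
    stays confined.  As the paper shows, this conditioning makes the
    resulting vertex distribution non-uniform in the sphere."""
    pts = [(0.0, 0.0, 0.0)]
    while len(pts) <= n_steps:
        x, y, z = pts[-1]
        u = rng.uniform(-1.0, 1.0)             # cosine of polar angle
        phi = rng.uniform(0.0, 2.0 * math.pi)
        s = math.sqrt(1.0 - u * u)
        cand = (x + s * math.cos(phi), y + s * math.sin(phi), z + u)
        if math.dist(cand, (0.0, 0.0, 0.0)) <= radius:
            pts.append(cand)
    return pts

walk = confined_equilateral_walk(100, 1.5, random.Random(11))
print(len(walk))     # 101 vertices: the start plus 100 unit steps
```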
Finicelli, Andrea; Pagano, Patrizio; Sbracia, Massimo
2009-01-01
We analyze the foundations of the relationship between trade and total factor productivity (TFP) in the Ricardian model. Under general assumptions about the autarky distributions of industry productivities, trade openness raises TFP. This is due to the selection effect of international competition – driven by comparative advantages – which makes "some" high- and "many" low-productivity industries exit the market. We derive a model-based measure of this effect that requires only production...
Randomized Prediction Games for Adversarial Machine Learning.
Rota Bulo, Samuel; Biggio, Battista; Pillai, Ignazio; Pelillo, Marcello; Roli, Fabio
In spam and malware detection, attackers exploit randomization to obfuscate malicious data and increase their chances of evading detection at test time, e.g., malware code is typically obfuscated using random strings or byte sequences to hide known exploits. Interestingly, randomization has also been proposed to improve security of learning algorithms against evasion attacks, as it results in hiding information about the classifier to the attacker. Recent work has proposed game-theoretical formulations to learn secure classifiers, by simulating different evasion attacks and modifying the classification function accordingly. However, both the classification function and the simulated data manipulations have been modeled in a deterministic manner, without accounting for any form of randomization. In this paper, we overcome this limitation by proposing a randomized prediction game, namely, a noncooperative game-theoretic formulation in which the classifier and the attacker make randomized strategy selections according to some probability distribution defined over the respective strategy set. We show that our approach allows one to improve the tradeoff between attack detection and false alarms with respect to the state-of-the-art secure classifiers, even against attacks that are different from those hypothesized during design, on application examples including handwritten digit recognition, spam, and malware detection.
International Nuclear Information System (INIS)
Ambjoern, J.
1987-08-01
The theory of strings is the theory of random surfaces. I review the present attempts to regularize the world sheet of the string by triangulation. The corresponding statistical theory of triangulated random surfaces has a surprisingly rich structure, but the connection to conventional string theory seems non-trivial. (orig.)
Derandomizing from random strings
Buhrman, H.; Fortnow, L.; Koucký, M.; Loff, B.
2010-01-01
In this paper we show that BPP is truth-table reducible to the set of Kolmogorov random strings R(K). It was previously known that PSPACE, and hence BPP, is Turing-reducible to R(K). The earlier proof relied on the adaptivity of the Turing reduction to find a Kolmogorov-random string of polynomial
Correlated randomness and switching phenomena
Stanley, H. E.; Buldyrev, S. V.; Franzese, G.; Havlin, S.; Mallamace, F.; Kumar, P.; Plerou, V.; Preis, T.
2010-08-01
One challenge of biology, medicine, and economics is that the systems treated by these serious scientific disciplines have no perfect metronome in time and no perfect spatial architecture, crystalline or otherwise. Nonetheless, as if by magic, out of nothing but randomness one finds remarkably fine-tuned processes in time and remarkably fine-tuned structures in space. Further, many of these processes and structures have the remarkable feature of “switching” from one behavior to another as if by magic. The past century has, philosophically, been concerned with placing aside the human tendency to see the universe as a fine-tuned machine. Here we will address the challenge of uncovering how, through randomness (albeit, as we shall see, strongly correlated randomness), one can arrive at some of the many spatial and temporal patterns in biology, medicine, and economics and even begin to characterize the switching phenomena that enable a system to pass from one state to another. Inspired by principles developed by A. Nihat Berker and scores of other statistical physicists in recent years, we discuss some applications of correlated randomness to understand switching phenomena in various fields. Specifically, we present evidence from experiments and from computer simulations supporting the hypothesis that water’s anomalies are related to a switching point (which is not unlike the “tipping point” immortalized by Malcolm Gladwell), and that the bubbles in economic phenomena that occur on all scales are not “outliers” (another Gladwell immortalization). Though more speculative, we support the idea of disease as arising from some kind of yet-to-be-understood complex switching phenomenon, by discussing data on selected examples, including heart disease and Alzheimer disease.
Distributional and efficiency results for subset selection
Laan, van der P.
1996-01-01
Assume k (k ≥ 2) populations are given. The associated independent random variables have continuous distribution functions with an unknown location parameter. The statistical selection goal is to select a non-empty subset which contains the best population, that is, the population with
Analysis of swaps in Radix selection
DEFF Research Database (Denmark)
Elmasry, Amr Ahmed Abd Elmoneim; Mahmoud, Hosam
2011-01-01
Radix Sort is a sorting algorithm based on analyzing digital data. We study the number of swaps made by Radix Select (a one-sided version of Radix Sort) to find an element with a randomly selected rank. This kind of grand average provides a smoothing over all individual distributions for specific...
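The one-sided recursion behind Radix Select can be sketched as follows; this is a minimal illustration of the selection logic under the assumption of non-negative integer keys, not the in-place swap accounting that the paper analyzes:

```python
def radix_select(arr, k, bit=31):
    """Return the element of rank k (0-based) by partitioning on one bit at a
    time and recursing only into the side that contains the target rank."""
    if len(arr) == 1 or bit < 0:
        return sorted(arr)[k]  # remaining bits exhausted (ties) or singleton
    zeros = [x for x in arr if not (x >> bit) & 1]
    ones = [x for x in arr if (x >> bit) & 1]
    if k < len(zeros):
        return radix_select(zeros, k, bit - 1)
    return radix_select(ones, k - len(zeros), bit - 1)
```

Like Quickselect, only one side of each partition is visited, which is what makes the "grand average" over a randomly selected rank meaningful.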
Selection and characterization of DNA aptamers
Ruigrok, V.J.B.
2013-01-01
This thesis focusses on the selection and characterisation of DNA aptamers and the various aspects related to their selection from large pools of randomized oligonucleotides. Aptamers are affinity tools that can specifically recognize and bind predefined target molecules; this ability, however,
Quantum random number generator
Soubusta, Jan; Haderka, Ondrej; Hendrych, Martin
2001-03-01
Since reflection or transmission of a quantum particle at a beamsplitter is an inherently random quantum process, a device built on this principle suffers from the drawbacks of neither pseudo-random computer generators nor classical noise sources. Nevertheless, a number of physical conditions necessary for high-quality random number generation must be satisfied. Luckily, in a quantum optics realization they can be well controlled. We present a simple random number generator based on the division of weak light pulses at a beamsplitter. The randomness of the generated bit stream is supported by passing the data through a series of 15 statistical tests. The device generates at a rate of 109.7 kbit/s.
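The principle can be mimicked classically: each weak pulse is either reflected or transmitted, yielding one raw bit, and post-processing removes bias from an imperfect split. The sketch below uses von Neumann debiasing as one illustrative option; it is a toy model, not the authors' device:

```python
import random

def beamsplitter_bits(n, p_reflect=0.5, seed=None):
    # Each weak pulse is either reflected (1) or transmitted (0);
    # a pseudo-random generator stands in for the quantum process here.
    rng = random.Random(seed)
    return [1 if rng.random() < p_reflect else 0 for _ in range(n)]

def von_neumann_debias(bits):
    # Keep the first bit of each unequal pair; discard equal pairs.
    # This removes bias from an unbalanced beamsplitter at the cost of rate.
    return [a for a, b in zip(bits[::2], bits[1::2]) if a != b]
```

Even with a deliberately unbalanced split, the debiased stream is close to 50/50, at the cost of discarding most raw bits.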
Quantum random number generator
Pooser, Raphael C.
2016-05-10
A quantum random number generator (QRNG) and a photon generator for a QRNG are provided. The photon generator may be operated in a spontaneous mode below a lasing threshold to emit photons. Photons emitted from the photon generator may have at least one random characteristic, which may be monitored by the QRNG to generate a random number. In one embodiment, the photon generator may include a photon emitter and an amplifier coupled to the photon emitter. The amplifier may enable the photon generator to be used in the QRNG without introducing significant bias in the random number and may enable multiplexing of multiple random numbers. The amplifier may also desensitize the photon generator to fluctuations in power supplied thereto while operating in the spontaneous mode. In one embodiment, the photon emitter and amplifier may be a tapered diode amplifier.
Autonomous Byte Stream Randomizer
Paloulian, George K.; Woo, Simon S.; Chow, Edward T.
2013-01-01
Net-centric networking environments are often faced with limited resources and must utilize bandwidth as efficiently as possible. In networking environments that span wide areas, the data transmission has to be efficient without any redundant or exuberant metadata. The Autonomous Byte Stream Randomizer software provides an extra level of security on top of existing data encryption methods. Randomizing the data's byte stream adds an extra layer to existing data protection methods, thus making it harder for an attacker to decrypt protected data. Based on a generated cryptographically secure random seed, a random sequence of numbers is used to intelligently and efficiently swap the organization of bytes in data using the unbiased and memory-efficient in-place Fisher-Yates shuffle method. Swapping bytes and reorganizing the crucial structure of the byte data renders the data file unreadable and leaves the data in a deconstructed state. This deconstruction adds an extra level of security requiring the byte stream to be reconstructed with the random seed in order to be readable. Once the data byte stream has been randomized, the software enables the data to be distributed to N nodes in an environment. Each piece of the data in randomized and distributed form is a separate entity, unreadable in its own right, but when all N pieces are combined, the data can be reconstructed into one whole. Reconstruction requires possession of the key used for randomizing the bytes, leading to the generation of the same cryptographically secure random sequence of numbers used to randomize the data. This software is a cornerstone capability possessing the ability to generate the same cryptographically secure sequence on different machines and time intervals, thus allowing this software to be used more heavily in net-centric environments where data transfer bandwidth is limited.
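The core idea, a seeded Fisher-Yates shuffle whose swap sequence can be regenerated and undone, can be sketched as follows. This toy version uses Python's `random.Random` where the flight software uses a cryptographically secure generator, so it illustrates the mechanism only:

```python
import random

def randomize(data: bytes, seed: int) -> bytes:
    buf = bytearray(data)
    rng = random.Random(seed)
    # In-place Fisher-Yates shuffle driven by a seeded PRNG.
    for i in range(len(buf) - 1, 0, -1):
        j = rng.randrange(i + 1)
        buf[i], buf[j] = buf[j], buf[i]
    return bytes(buf)

def reconstruct(data: bytes, seed: int) -> bytes:
    buf = bytearray(data)
    rng = random.Random(seed)
    # Regenerate the identical swap sequence from the seed...
    swaps = [(i, rng.randrange(i + 1)) for i in range(len(buf) - 1, 0, -1)]
    # ...then undo the swaps in reverse order to restore the original stream.
    for i, j in reversed(swaps):
        buf[i], buf[j] = buf[j], buf[i]
    return bytes(buf)
```

Any party holding the seed can regenerate the same swap sequence on a different machine at a different time, which is the reproducibility property the abstract emphasizes.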
The frequency of drugs in randomly selected drivers in Denmark
DEFF Research Database (Denmark)
Simonsen, Kirsten Wiese; Steentoft, Anni; Hels, Tove
Introduction Driving under the influence of alcohol and drugs is a global problem. In Denmark as well as in other countries there is an increasing focus on impaired driving. Little is known about the occurrence of psychoactive drugs in the general traffic. Therefore the European commission...... is the Danish legal limit. The percentage of drivers positive for medicinal drugs above the Danish legal concentration limit was 0.4%; while, 0.3% of the drivers tested positive for one or more illicit drugs at concentrations exceeding the Danish legal limit. Tetrahydrocannabinol, cocaine, and amphetamine were...... the most frequent illicit drugs detected above the limit of quantitation (LOQ); while, codeine, tramadol, zopiclone, and benzodiazepines were the most frequent legal drugs. Middle-aged men (median age 47.5 years) dominated the drunk driving group, while the drivers positive for illegal drugs consisted......
Model Selection with the Linear Mixed Model for Longitudinal Data
Ryoo, Ji Hoon
2011-01-01
Model building or model selection with linear mixed models (LMMs) is complicated by the presence of both fixed effects and random effects. The fixed effects structure and random effects structure are codependent, so selection of one influences the other. Most presentations of LMM in psychology and education are based on a multilevel or…
Conversion of the random amplified polymorphic DNA (RAPD ...
African Journals Online (AJOL)
Conversion of the random amplified polymorphic DNA (RAPD) marker UBC#116 linked to Fusarium crown and root rot resistance gene (Frl) into a co-dominant sequence characterized amplified region (SCAR) marker for marker-assisted selection of tomato.
International Nuclear Information System (INIS)
Choi, Yoon Ji; Lee, Dae Ho; Choi, Chang Min; Lee, Jung Shin; Lee, Seung Jin; Ahn, Jin-Hee; Kim, Sang-We
2015-01-01
Considering cell cycle dependent cytotoxicity, intercalation of chemotherapy and epidermal growth factor receptor (EGFR) tyrosine kinase inhibitor (TKI) may be a treatment option in non-small cell lung cancer (NSCLC). This randomized phase 2 study compared the efficacy of paclitaxel and carboplatin (PC) intercalated with gefitinib (G) versus PC alone in a selected, chemotherapy-naïve population of advanced NSCLC patients with a history of smoking or wild-type EGFR. Eligible patients were chemotherapy-naïve advanced NSCLC patients with Eastern Cooperative Oncology Group performance status of 0-2. Non-smoking patients with adenocarcinoma or patients with activating EGFR mutation were excluded because they could benefit from gefitinib alone. Eligible patients were randomized to one of the following treatment arms: PCG, paclitaxel 175 mg/m2 and carboplatin AUC 5 administered intravenously on day 1, intercalated with gefitinib 250 mg orally on days 2 through 15, every 3 weeks for four cycles, followed by gefitinib 250 mg orally until progressive disease; or PC, the same dosing schedule for four cycles only. The primary endpoint was the objective response rate (ORR), and the secondary endpoints included progression-free survival (PFS), overall survival (OS), and toxicity profile. A total of 90 patients participated in the study. The ORRs were 41.9 % (95 % confidence interval (CI) 27.0-57.9 %) for the PCG arm and 39.5 % (95 % CI 25.0-55.6 %) for the PC arm (P = 0.826). No differences in PFS (4.1 vs. 4.1 months, P = 0.781) or OS (9.3 vs. 10.5 months, P = 0.827) were observed between the PCG and PC arms. Safety analyses showed a similar incidence of drug-related grade 3/4 toxicity. Rash and pruritus were more frequent in the PCG than in the PC arm. PCG did not improve ORR, PFS, or OS compared to PC chemotherapy alone for NSCLC in a clinically selected population excluding non-smoking adenocarcinoma or mutated EGFR. The study is registered with ClinicalTrials.gov (NCT01196234). Registration date is 08/09/2010
DEFF Research Database (Denmark)
Hoch Jovanovic, Tamara; Lynggaard, Kennet
2014-01-01
and rules. The article examines the reasons for both resistance and selectiveness to Europeanization of the Danish minority policy through a “path dependency” perspective accentuating decision makers’ reluctance to deviate from existing institutional commitments, even in subsequently significantly altered...... political contexts at the European level. We further show how the “translation” of international norms to a domestic context has worked to reinforce the original institutional setup, dating back to the mid-1950s. The translation of European-level minority policy developed in the 1990s and 2000s works most...
DEFF Research Database (Denmark)
Svendsen, Mette N.
2015-01-01
This article employs a multi-species perspective in investigating how life's worth is negotiated in the field of neonatology in Denmark. It does so by comparing decision-making processes about human infants in the Danish neonatal intensive care unit with those associated with piglets who serve as...... as expectations within linear or predictive time frames are key markers in both sites. Exploring selective reproductive processes across human infants and research piglets can help us uncover aspects of the cultural production of viability that we would not otherwise see or acknowledge....
EDITORIAL: Nanotechnological selection
Demming, Anna
2013-01-01
At the nanoscale, measurements can move from mass-scale analogue calibration to counts of discrete units. The shift redefines the possible levels of control that can be achieved in a system if adequate selectivity can be imposed. As an example, as ionic substances pass through nanoscale pores, the quantity of ions is low enough that the pore can contain either negative or positive ions. Yet precise control over this selectivity still raises difficulties. In this issue researchers address the challenge of how to regulate the ionic selectivity of negative and positive charges with the use of an external charge. The approach may be useful for controlling the behaviour, properties and chemical composition of liquids and has possible technical applications for nanofluidic field effect transistors [1]. Selectivity is a critical advantage in the administration of drugs. Nanoparticles functionalized with targeting moieties can allow delivery of anti-cancer drugs to tumour cells, whilst avoiding healthy cells and hence reducing some of the debilitating side effects of cancer treatments [2]. Researchers in Belarus and the US developed a new theranostic approach, combining therapy and diagnosis, to support the evident benefits of cellular selectivity that can be achieved when nanoparticles are applied in medicine [3]. Their process uses nanobubbles of photothermal vapour, referred to as plasmonic nanobubbles, generated by plasmonic excitations in gold nanoparticles conjugated to diagnosis-specific antibodies. The intracellular plasmonic nanobubbles are controlled by laser fluence so that the response can be tuned in individual living cells. Lower fluence allows non-invasive high-sensitivity imaging for diagnosis and higher fluence can disrupt the cellular membrane for treatments. The selective response of carbon nanotubes to different gases has led to their use in various types of sensors, as summarized in a review by researchers at the University of
International Nuclear Information System (INIS)
Coveyou, R.R.
1974-01-01
The subject of random number generation is currently controversial. Differing opinions on this subject seem to stem from implicit or explicit differences in philosophy; in particular, from differing ideas concerning the role of probability in the real world of physical processes, electronic computers, and Monte Carlo calculations. An attempt is made here to reconcile these views. The role of stochastic ideas in mathematical models is discussed. In illustration of these ideas, a mathematical model of the use of random number generators in Monte Carlo calculations is constructed. This model is used to set up criteria for the comparison and evaluation of random number generators. (U.S.)
Giovannetti, Vittorio; Lloyd, Seth; Maccone, Lorenzo
2007-01-01
A random access memory (RAM) uses n bits to randomly address N=2^n distinct memory cells. A quantum random access memory (qRAM) uses n qubits to address any quantum superposition of N memory cells. We present an architecture that exponentially reduces the requirements for a memory call: O(log N) switches need be thrown instead of the N used in conventional (classical or quantum) RAM designs. This yields a more robust qRAM algorithm, as it in general requires entanglement among exponentially l...
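The saving can be illustrated with a classical sketch of tree-based addressing: one address bit is consumed at each level of a binary routing tree, so only n = log2(N) routing nodes lie on the activated path. This is a hedged toy model of the counting argument, not the quantum bucket-brigade scheme itself:

```python
def routing_path(address, n_bits):
    """Return the (level, node, bit) switches along the path that routes a
    query to cell `address` in a tree with 2**n_bits leaves: one per level."""
    path, node = [], 0
    for level in range(n_bits):
        bit = (address >> (n_bits - 1 - level)) & 1  # consume one address bit
        path.append((level, node, bit))
        node = 2 * node + bit                        # descend left or right
    return path
```

Addressing one of 256 cells throws 8 switches rather than 256, which is the exponential reduction the abstract describes.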
International Nuclear Information System (INIS)
Markin, J.T.
1989-01-01
As the numbers and complexity of nuclear facilities increase, limitations on resources for international safeguards may restrict attainment of safeguards goals. One option for improving the efficiency of limited resources is to expand the current inspection regime to include random allocation of the amount and frequency of inspection effort to material strata or to facilities. This paper identifies the changes in safeguards policy, administrative procedures, and operational procedures that would be necessary to accommodate randomized inspections and identifies those situations where randomization can improve inspection efficiency and those situations where the current nonrandom inspections should be maintained. 9 refs., 1 tab
Random phenomena; Phenomenes aleatoires
Energy Technology Data Exchange (ETDEWEB)
Bonnet, G. [Commissariat a l' energie atomique et aux energies alternatives - CEA, C.E.N.G., Service d' Electronique, Section d' Electronique, Grenoble (France)
1963-07-01
This document gathers a set of conferences presented in 1962. A first one proposes a mathematical introduction to the analysis of random phenomena. The second one presents an axiomatic of probability calculation. The third one proposes an overview of one-dimensional random variables. The fourth one addresses random pairs, and presents basic theorems regarding the algebra of mathematical expectations. The fifth conference discusses some probability laws: binomial distribution, the Poisson distribution, and the Laplace-Gauss distribution. The last one deals with the issues of stochastic convergence and asymptotic distributions.
Galilei, Galileo
2012-01-01
'Philosophy is written in this great book which is continually open before our eyes - I mean the universe...' Galileo's astronomical discoveries changed the way we look at the world, and our place in the universe. Threatened by the Inquisition for daring to contradict the literal truth of the Bible, Galileo ignited a scientific revolution when he asserted that the Earth moves. This generous selection from his writings contains all the essential texts for a reader to appreciate his lasting significance. Mark Davie's new translation renders Galileo's vigorous Italian prose into clear modern English, while William R. Shea's version of the Latin Sidereal Message makes accessible the book that created a sensation in 1610 with its account of Galileo's observations using the newly invented telescope. All Galileo's contributions to the debate on science and religion are included, as well as key documents from his trial before the Inquisition in 1633. A lively introduction and clear notes give an overview of Galileo's...
International Nuclear Information System (INIS)
Olsen, C.W.
1983-07-01
The conditions and criteria for selecting a site for a nuclear weapons test at the Nevada Test Site are summarized. Factors considered are: (1) scheduling of drill rigs, (2) scheduling of site preparation (dirt work, auger hole, surface casing, cementing), (3) schedule of event (when are drill hole data needed), (4) depth range of proposed W.P., (5) geologic structure (faults, Pz contact, etc.), (6) stratigraphy (alluvium, location of Grouse Canyon Tuff, etc.), (7) material properties (particularly montmorillonite and CO2 content), (8) water table depth, (9) potential drilling problems (caving), (10) adjacent collapse craters and chimneys, (11) adjacent expended but uncollapsed sites, (12) adjacent post-shot or other small diameter holes, (13) adjacent stockpile emplacement holes, (14) adjacent planned events (including LANL), (15) projected needs of Test Program for various DOB's and operational separations, and (16) optimal use of NTS real estate
Brain Tumor Segmentation Based on Random Forest
Directory of Open Access Journals (Sweden)
László Lefkovits
2016-09-01
Full Text Available In this article we present a discriminative model for tumor detection from multimodal MR images. The main part of the model is built around the random forest (RF classifier. We created an optimization algorithm able to select the important features for reducing the dimensionality of data. This method is also used to find out the training parameters used in the learning phase. The algorithm is based on random feature properties for evaluating the importance of the variable, the evolution of learning errors and the proximities between instances. The detection performances obtained have been compared with the most recent systems, offering similar results.
International Nuclear Information System (INIS)
Lumay, G; Vandewalle, N
2007-01-01
We present an experimental protocol that allows one to tune the packing fraction η of a random pile of ferromagnetic spheres from a value close to the lower limit of random loose packing η_RLP ≈ 0.56 to the upper limit of random close packing η_RCP ≈ 0.64. This broad range of packing fraction values is obtained under normal gravity in air, by adjusting a magnetic cohesion between the grains during the formation of the pile. Attractive and repulsive magnetic interactions are found to strongly affect the internal structure and the stability of the sphere packing. After the formation of the pile, the induced cohesion is decreased continuously along a linear decreasing ramp. The controlled collapse of the pile is found to generate various and reproducible values of the random packing fraction η
Nakagawa, Toshio
2014-01-01
Exploring random maintenance models, this book provides an introduction to the implementation of random maintenance, and it is one of the first books to be written on this subject. It aims to help readers learn new techniques for applying random policies to actual reliability models, and it provides new theoretical analyses of various models including classical replacement, preventive maintenance and inspection policies. These policies are applied to scheduling problems, backup policies of database systems, maintenance policies of cumulative damage models, and reliability of random redundant systems. Reliability theory is a major concern for engineers and managers, and in light of Japan’s recent earthquake, the reliability of large-scale systems has increased in importance. This also highlights the need for a new notion of maintenance and reliability theory, and how this can practically be applied to systems. Providing an essential guide for engineers and managers specializing in reliability maintenance a...
Molchanov, Ilya
2017-01-01
This monograph, now in a thoroughly revised second edition, offers the latest research on random sets. It has been extended to include substantial developments achieved since 2005, some of them motivated by applications of random sets to econometrics and finance. The present volume builds on the foundations laid by Matheron and others, including the vast advances in stochastic geometry, probability theory, set-valued analysis, and statistical inference. It shows the various interdisciplinary relationships of random set theory within other parts of mathematics, and at the same time fixes terminology and notation that often vary in the literature, establishing it as a natural part of modern probability theory and providing a platform for future development. It is completely self-contained, systematic and exhaustive, with the full proofs that are necessary to gain insight. Aimed at research level, Theory of Random Sets will be an invaluable reference for probabilists; mathematicians working in convex and integ...
Marchenko, Yulia V.
2012-03-01
Sample selection arises often in practice as a result of the partial observability of the outcome of interest in a study. In the presence of sample selection, the observed data do not represent a random sample from the population, even after controlling for explanatory variables. That is, data are missing not at random. Thus, standard analysis using only complete cases will lead to biased results. Heckman introduced a sample selection model to analyze such data and proposed a full maximum likelihood estimation method under the assumption of normality. The method was criticized in the literature because of its sensitivity to the normality assumption. In practice, data, such as income or expenditure data, often violate the normality assumption because of heavier tails. We first establish a new link between sample selection models and recently studied families of extended skew-elliptical distributions. Then, this allows us to introduce a selection-t (SLt) model, which models the error distribution using a Student's t distribution. We study its properties and investigate the finite-sample performance of the maximum likelihood estimators for this model. We compare the performance of the SLt model to the conventional Heckman selection-normal (SLN) model and apply it to analyze ambulatory expenditures. Unlike the SLN model, our analysis using the SLt model provides statistical evidence for the existence of sample selection bias in these data. We also investigate the performance of the test for sample selection bias based on the SLt model and compare it with the performances of several tests used with the SLN model. Our findings indicate that the latter tests can be misleading in the presence of heavy-tailed data. © 2012 American Statistical Association.
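The mechanism, data missing not at random because selection shares an error component with the outcome, can be illustrated with a short simulation. This is a generic sketch with made-up coefficients, not the authors' SLt estimator:

```python
import random
import statistics

rng = random.Random(0)
full, selected = [], []
for _ in range(20000):
    u = rng.gauss(0, 1)                  # latent error shared by both equations
    y = 1.0 + u                          # outcome of interest
    s = 0.5 * u + rng.gauss(0, 1)        # selection propensity, correlated with y
    full.append(y)
    if s > 0:                            # outcome observed only when selected
        selected.append(y)

# Selected cases over-represent large u, so the naive complete-case mean
# is biased upward relative to the population mean.
bias = statistics.fmean(selected) - statistics.fmean(full)
```

Even though every unit's outcome exists, conditioning on selection shifts the observed distribution, which is exactly why complete-case analysis is biased here.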
Immigration And Self-Selection
George J. Borjas
1988-01-01
Self-selection plays a dominant role in determining the size and composition of immigrant flows. The United States competes with other potential host countries in the "immigration market". Host countries vary in their "offers" of economic opportunities and also differ in the way they ration entry through their immigration policies. Potential immigrants compare the various opportunities and are non-randomly sorted by the immigration market among the various host countries. This paper presents ...
Quantum randomness and unpredictability
Energy Technology Data Exchange (ETDEWEB)
Jaeger, Gregg [Quantum Communication and Measurement Laboratory, Department of Electrical and Computer Engineering and Division of Natural Science and Mathematics, Boston University, Boston, MA (United States)
2017-06-15
Quantum mechanics is a physical theory supplying probabilities corresponding to expectation values for measurement outcomes. Indeed, its formalism can be constructed with measurement as a fundamental process, as was done by Schwinger, provided that individual measurement outcomes occur in a random way. The randomness appearing in quantum mechanics, as with other forms of randomness, has often been considered equivalent to a form of indeterminism. Here, it is argued that quantum randomness should instead be understood as a form of unpredictability because, amongst other things, indeterminism is not a necessary condition for randomness. For concreteness, an explication of the randomness of quantum mechanics as the unpredictability of quantum measurement outcomes is provided. Finally, it is shown how this view can be combined with the recently introduced view that the very appearance of individual quantum measurement outcomes can be grounded in the Plenitude principle of Leibniz, a principle variants of which have been utilized in physics by Dirac and Gell-Mann in relation to the fundamental processes. This move provides further support to Schwinger's "symbolic" derivation of quantum mechanics from measurement. (copyright 2016 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)
Evolving artificial metalloenzymes via random mutagenesis
Yang, Hao; Swartz, Alan M.; Park, Hyun June; Srivastava, Poonam; Ellis-Guardiola, Ken; Upp, David M.; Lee, Gihoon; Belsare, Ketaki; Gu, Yifan; Zhang, Chen; Moellering, Raymond E.; Lewis, Jared C.
2018-03-01
Random mutagenesis has the potential to optimize the efficiency and selectivity of protein catalysts without requiring detailed knowledge of protein structure; however, introducing synthetic metal cofactors complicates the expression and screening of enzyme libraries, and activity arising from free cofactor must be eliminated. Here we report an efficient platform to create and screen libraries of artificial metalloenzymes (ArMs) via random mutagenesis, which we use to evolve highly selective dirhodium cyclopropanases. Error-prone PCR and combinatorial codon mutagenesis enabled multiplexed analysis of random mutations, including at sites distal to the putative ArM active site that are difficult to identify using targeted mutagenesis approaches. Variants that exhibited significantly improved selectivity for each of the cyclopropane product enantiomers were identified, and higher activity than previously reported ArM cyclopropanases obtained via targeted mutagenesis was also observed. This improved selectivity carried over to other dirhodium-catalysed transformations, including N-H, S-H and Si-H insertion, demonstrating that ArMs evolved for one reaction can serve as starting points to evolve catalysts for others.
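The library-generation step can be caricatured in a few lines: error-prone PCR introduces substitutions at each base independently with some error rate. This is a hedged toy simulation (the error rate and sequence are arbitrary), not the paper's experimental protocol:

```python
import random

BASES = "ACGT"

def error_prone_pcr(seq, error_rate, rng):
    # Each base mutates independently to one of the three other bases
    # with probability error_rate, mimicking error-prone PCR.
    return "".join(
        rng.choice([b for b in BASES if b != base])
        if rng.random() < error_rate else base
        for base in seq
    )

def mutant_library(seq, size, error_rate=0.02, seed=0):
    # A library is simply many independent error-prone copies.
    rng = random.Random(seed)
    return [error_prone_pcr(seq, error_rate, rng) for _ in range(size)]
```

Because every position can mutate, such libraries sample sites distal to the active site that targeted mutagenesis would miss, which is the point the abstract makes.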
Elgot, Calvin C
1982-01-01
Cal Elgot was a very serious and thoughtful researcher, who with great determination attempted to find basic explanations for certain mathematical phenomena, as the selection of papers in this volume well illustrates. His approach was, for the most part, rather finitist and constructivist, and he was inevitably drawn to studies of the process of computation. It seems to me that his early work on decision problems relating automata and logic, starting with his thesis under Roger Lyndon and continuing with joint work with Büchi, Wright, Copi, Rutledge, Mezei, and then later with Rabin, set the stage for his attack on the theory of computation through the abstract treatment of the notion of a machine. This is also apparent in his joint work with A. Robinson reproduced here and in his joint papers with John Shepherdson. Of course in the light of subsequent work on decision problems by Büchi, Rabin, Shelah, and many, many others, the subject has been placed on a completely different plane from what it was whe...
The Goodness of Covariance Selection Problem from AUC Bounds
Khajavi, Navid Tafaghodi; Kuh, Anthony
2016-01-01
We conduct a study of graphical models and discuss the quality of model selection approximation by formulating the problem as a detection problem and examining the area under the curve (AUC). We are specifically looking at the model selection problem for jointly Gaussian random vectors. For Gaussian random vectors, this problem simplifies to the covariance selection problem which is widely discussed in literature by Dempster [1]. In this paper, we give the definition for the correlation appro...
Integral Histogram with Random Projection for Pedestrian Detection.
Directory of Open Access Journals (Sweden)
Chang-Hua Liu
Full Text Available In this paper, we present a systematic study reporting several insights into the HOG, one of the most widely used features in modern computer vision and image processing applications. We first show that its gradient magnitudes can be randomly projected with a random matrix. To handle over-fitting, an integral histogram based on the differences of randomly selected blocks is proposed. The experiments show that both the random projection and the integral histogram clearly outperform the HOG feature. Finally, the two ideas are combined into a new descriptor termed IHRP, which outperforms the HOG feature with fewer dimensions and higher speed.
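Random projection of a feature vector is just multiplication by a random matrix, here with Gaussian entries in the Johnson-Lindenstrauss style. This is a generic sketch of the operation, not the paper's exact construction:

```python
import random

def random_projection(vec, out_dim, seed=0):
    """Project a feature vector to out_dim dimensions using a random
    Gaussian matrix, generated deterministically from the seed."""
    rng = random.Random(seed)
    scale = 1.0 / out_dim ** 0.5  # standard JL-style normalization
    return [scale * sum(rng.gauss(0, 1) * v for v in vec)
            for _ in range(out_dim)]
```

Because the matrix is regenerated from the seed, the same projection can be applied consistently to every descriptor at train and test time.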
Statistical properties of random clique networks
Ding, Yi-Min; Meng, Jun; Fan, Jing-Fang; Ye, Fang-Fu; Chen, Xiao-Song
2017-10-01
In this paper, a random clique network model to mimic the large clustering coefficient and the modular structure that exist in many real complex networks, such as social networks, artificial networks, and protein interaction networks, is introduced by combining the random selection rule of the Erdős and Rényi (ER) model and the concept of cliques. We find that random clique networks having a small average degree differ from the ER network in that they have a large clustering coefficient and a power law clustering spectrum, while networks having a high average degree have similar properties as the ER model. In addition, we find that the relation between the clustering coefficient and the average degree shows a non-monotonic behavior and that the degree distributions can be fit by multiple Poisson curves; we explain the origin of such novel behaviors and degree distributions.
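The construction can be sketched directly: instead of inserting single random edges as in the ER model, insert whole cliques on randomly selected node sets, which is what drives the large clustering coefficient. A minimal sketch, with sizes chosen arbitrarily:

```python
import random
from itertools import combinations

def random_clique_graph(n, n_cliques, clique_size, seed=0):
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    for _ in range(n_cliques):
        members = rng.sample(range(n), clique_size)  # ER-style random selection
        for u, v in combinations(members, 2):        # wire the clique completely
            adj[u].add(v)
            adj[v].add(u)
    return adj

def clustering(adj, v):
    # Local clustering: fraction of neighbor pairs that are themselves linked.
    nb = adj[v]
    k = len(nb)
    if k < 2:
        return 0.0
    links = sum(1 for a, b in combinations(nb, 2) if b in adj[a])
    return 2 * links / (k * (k - 1))
```

A node belonging to a single clique has local clustering exactly 1, so sparse clique networks retain high clustering where an ER graph of the same density would not.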
Feature Selection with the Boruta Package
Kursa, Miron B.; Rudnicki, Witold R.
2010-01-01
This article describes an R package, Boruta, implementing a novel feature selection algorithm for finding all relevant variables. The algorithm is designed as a wrapper around a Random Forest classification algorithm. It iteratively removes the features which are proved by a statistical test to be less relevant than random probes. The Boruta package provides a convenient interface to the algorithm. A short description of the algorithm and examples of its application are presented.
Feature Selection with the Boruta Package
Directory of Open Access Journals (Sweden)
Miron B. Kursa
2010-10-01
Full Text Available This article describes an R package, Boruta, implementing a novel feature selection algorithm for finding all relevant variables. The algorithm is designed as a wrapper around a Random Forest classification algorithm. It iteratively removes the features which are proved by a statistical test to be less relevant than random probes. The Boruta package provides a convenient interface to the algorithm. A short description of the algorithm and examples of its application are presented.
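The shadow-feature idea behind Boruta can be sketched without R: shuffle a copy of each feature to destroy its relation to the target, then accept only real features whose importance beats the best shadow. In this sketch, |Pearson correlation| stands in for the Random Forest importance that Boruta actually uses, so it illustrates the principle only:

```python
import random
import statistics

def abs_corr(xs, ys):
    # |Pearson correlation|, a stand-in for Random Forest importance.
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs)
           * sum((y - my) ** 2 for y in ys)) ** 0.5
    return abs(num / den) if den else 0.0

def boruta_like(features, target, rng=None):
    rng = rng or random.Random(0)
    shadow_scores = []
    for col in features:
        sh = col[:]              # shadow = shuffled copy of the feature,
        rng.shuffle(sh)          # so any apparent relevance is pure chance
        shadow_scores.append(abs_corr(sh, target))
    threshold = max(shadow_scores)
    return [i for i, col in enumerate(features)
            if abs_corr(col, target) > threshold]
```

Features whose score does not beat the best random probe are treated as unproven, which is how the "all relevant" (rather than minimal-optimal) selection goal is operationalized.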
Survivor bias in Mendelian randomization analysis
DEFF Research Database (Denmark)
Vansteelandt, Stijn; Dukes, Oliver; Martinussen, Torben
2017-01-01
Mendelian randomization studies employ genotypes as experimental handles to infer the effect of genetically modified exposures (e.g. vitamin D exposure) on disease outcomes (e.g. mortality). The statistical analysis of these studies makes use of the standard instrumental variables framework. Many...... of these studies focus on elderly populations, thereby ignoring the problem of left truncation, which arises due to the selection of study participants being conditional upon surviving up to the time of study onset. Such selection, in general, invalidates the assumptions on which the instrumental variables...... analysis rests. We show that Mendelian randomization studies of adult or elderly populations will therefore, in general, return biased estimates of the exposure effect when the considered genotype affects mortality; in contrast, standard tests of the causal null hypothesis that the exposure does not affect...
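The instrumental-variables logic can be sketched with the simplest MR estimator, the Wald ratio. This toy simulation uses made-up coefficients and ignores the survival/left-truncation issue that is the article's point; it only shows why confounding motivates MR in the first place:

```python
import random
import statistics

def slope(xs, ys):
    # Ordinary least-squares slope of ys on xs.
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return num / sum((x - mx) ** 2 for x in xs)

rng = random.Random(1)
G, X, Y = [], [], []
for _ in range(5000):
    g = rng.choice([0, 1, 2])            # genotype: the instrument
    c = rng.gauss(0, 1)                  # unmeasured confounder of X and Y
    x = 0.5 * g + c + rng.gauss(0, 1)    # exposure
    y = 0.3 * x + c + rng.gauss(0, 1)    # outcome; true causal effect = 0.3
    G.append(g)
    X.append(x)
    Y.append(y)

naive = slope(X, Y)                      # confounded regression estimate
wald = slope(G, Y) / slope(G, X)         # Wald ratio: IV estimate via genotype
```

The naive regression is pulled away from 0.3 by the confounder, while the Wald ratio recovers it, provided the instrument assumptions hold; selection on survival is one way those assumptions fail.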
International Nuclear Information System (INIS)
Yeong, C.L.; Torquato, S.
1998-01-01
We formulate a procedure to reconstruct the structure of general random heterogeneous media from limited morphological information by extending the methodology of Rintoul and Torquato [J. Colloid Interface Sci. 186, 467 (1997)] developed for dispersions. The procedure has the advantages that it is simple to implement and generally applicable to multidimensional, multiphase, and anisotropic structures. Furthermore, an extremely useful feature is that it can incorporate any type and number of correlation functions in order to provide as much morphological information as is necessary for accurate reconstruction. We consider a variety of one- and two-dimensional reconstructions, including periodic and random arrays of rods, various distributions of disks, Debye random media, and a Fontainebleau sandstone sample. We also use our algorithm to construct heterogeneous media from specified hypothetical correlation functions, including an exponentially damped, oscillating function as well as physically unrealizable ones. copyright 1998 The American Physical Society
Intermittency and random matrices
Sokoloff, Dmitry; Illarionov, E. A.
2015-08-01
A spectacular phenomenon of intermittency, i.e. a progressive growth of higher statistical moments of a physical field excited by an instability in a random medium, attracted the attention of Zeldovich in the last years of his life. At that time, the mathematical aspects underlying the physical description of this phenomenon were still under development and relations between various findings in the field remained obscure. Contemporary results from the theory of the product of independent random matrices (the Furstenberg theory) allowed the elaboration of the phenomenon of intermittency in a systematic way. We consider applications of the Furstenberg theory to some problems in cosmology and dynamo theory.
International Nuclear Information System (INIS)
Bruzda, Wojciech; Cappellini, Valerio; Sommers, Hans-Juergen; Zyczkowski, Karol
2009-01-01
We define a natural ensemble of trace preserving, completely positive quantum maps and present algorithms to generate them at random. Spectral properties of the superoperator Φ associated with a given quantum map are investigated and a quantum analogue of the Frobenius-Perron theorem is proved. We derive a general formula for the density of eigenvalues of Φ and show the connection with the Ginibre ensemble of real non-symmetric random matrices. Numerical investigations of the spectral gap imply that a generic state of the system iterated several times by a fixed generic map converges exponentially to an invariant state.
Random a-adic groups and random net fractals
Energy Technology Data Exchange (ETDEWEB)
Li Yin [Department of Mathematics, Nanjing University, Nanjing 210093 (China)], E-mail: Lyjerry7788@hotmail.com; Su Weiyi [Department of Mathematics, Nanjing University, Nanjing 210093 (China)], E-mail: suqiu@nju.edu.cn
2008-08-15
Based on random a-adic groups, this paper investigates the relationship between the existence conditions of a positive flow in a random network and the estimation of the Hausdorff dimension of a proper random net fractal. Subsequently we describe some particular random fractals to which our results can be applied. Finally, the Mauldin and Williams theorem is shown to be a very important example for a random Cantor set, with applications in physics as shown in E-infinity theory.
Efficient robust conditional random fields.
Song, Dongjin; Liu, Wei; Zhou, Tianyi; Tao, Dacheng; Meyer, David A
2015-10-01
Conditional random fields (CRFs) are a flexible yet powerful probabilistic approach and have shown advantages for popular applications in various areas, including text analysis, bioinformatics, and computer vision. Traditional CRF models, however, are incapable of selecting relevant features or suppressing noise in the original features. Moreover, conventional optimization methods often converge slowly when training CRFs, and degrade significantly for tasks with a large number of samples and features. In this paper, we propose robust CRFs (RCRFs) that simultaneously select relevant features and suppress noise. An optimal gradient method (OGM) is further designed to train RCRFs efficiently. Specifically, the proposed RCRFs employ the l1 norm of the model parameters to regularize the objective used by traditional CRFs, thereby enabling discovery of the relevant unary and pairwise features of CRFs. In each iteration of OGM, the gradient direction is determined jointly by the current gradient and the historical gradients, and the Lipschitz constant is leveraged to specify the proper step size. We show that OGM can tackle RCRF model training very efficiently, achieving the optimal convergence rate [Formula: see text] (where k is the number of iterations). This convergence rate is theoretically superior to the convergence rate O(1/k) of previous first-order optimization methods. Extensive experiments performed on three practical image segmentation tasks demonstrate the efficacy of OGM in training our proposed RCRFs.
[Intel random number generator-based true random number generator].
Huang, Feng; Shen, Hong
2004-09-01
To establish a true random number generator on the basis of certain Intel chips. The random numbers were acquired by programming using Microsoft Visual C++ 6.0 via register reading from the random number generator (RNG) unit of an Intel 815 chipset-based computer with Intel Security Driver (ISD). We tested the generator with 500 random numbers using the NIST FIPS 140-1 and χ² tests, and the results showed that the random numbers it generated satisfied the demands of independence and uniform distribution. We also statistically compared the random numbers generated by the Intel RNG-based true random number generator with those from a random number table, using the same amount of 7500 random numbers in the same value domain, which showed that the SD, SE and CV of the Intel RNG-based random number generator were less than those of the random number table. The u test of the two CVs revealed no significant difference between the two methods. The Intel RNG-based random number generator can produce high-quality random numbers with good independence and uniform distribution, and solves some problems with random number tables in the acquisition of random numbers.
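The χ² uniformity check mentioned above can be sketched with the standard library alone; the bin count and the critical value quoted in the comment are illustrative choices, not those of the cited study:

```python
import random

def chi_square_uniformity(samples, bins=10):
    """Pearson chi-square statistic for uniformity of values in [0, 1)."""
    counts = [0] * bins
    for x in samples:
        counts[min(int(x * bins), bins - 1)] += 1
    expected = len(samples) / bins
    return sum((c - expected) ** 2 / expected for c in counts)

random.seed(42)
stat = chi_square_uniformity([random.random() for _ in range(10000)], bins=10)
# compare against the chi-square critical value for 9 degrees of freedom
# (16.92 at the 5% level): smaller values are consistent with uniformity
print(round(stat, 2))
```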
Discriminative Projection Selection Based Face Image Hashing
Karabat, Cagatay; Erdogan, Hakan
Face image hashing is an emerging method used in biometric verification systems. In this paper, we propose a novel face image hashing method based on a new technique called discriminative projection selection. We apply the Fisher criterion for selecting the rows of a random projection matrix in a user-dependent fashion. Moreover, another contribution of this paper is to employ a bimodal Gaussian mixture model at the quantization step. Our simulation results on three different databases demonstrate that the proposed method has superior performance in comparison to previously proposed random projection based methods.
Programmable disorder in random DNA tilings
Tikhomirov, Grigory; Petersen, Philip; Qian, Lulu
2017-03-01
Scaling up the complexity and diversity of synthetic molecular structures will require strategies that exploit the inherent stochasticity of molecular systems in a controlled fashion. Here we demonstrate a framework for programming random DNA tilings and show how to control the properties of global patterns through simple, local rules. We constructed three general forms of planar network—random loops, mazes and trees—on the surface of self-assembled DNA origami arrays on the micrometre scale with nanometre resolution. Using simple molecular building blocks and robust experimental conditions, we demonstrate control of a wide range of properties of the random networks, including the branching rules, the growth directions, the proximity between adjacent networks and the size distribution. Much as combinatorial approaches for generating random one-dimensional chains of polymers have been used to revolutionize chemical synthesis and the selection of functional nucleic acids, our strategy extends these principles to random two-dimensional networks of molecules and creates new opportunities for fabricating more complex molecular devices that are organized by DNA nanostructures.
Uniform random number generators
Farr, W. R.
1971-01-01
Methods are presented for the generation of random numbers with uniform and normal distributions. Subprogram listings of Fortran generators for the Univac 1108, SDS 930, and CDC 3200 digital computers are also included. The generators are of the mixed multiplicative type, and the mathematical method employed is that of Marsaglia and Bray.
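A mixed multiplicative (linear congruential) generator of the kind described takes one line per draw; the constants below are the widely quoted Numerical Recipes values, used here for illustration rather than the constants of the Univac/SDS/CDC listings:

```python
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Mixed multiplicative congruential generator: x -> (a*x + c) mod m,
    scaled to a uniform float in [0, 1)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m

g = lcg(12345)
print([round(next(g), 4) for _ in range(3)])
```

Normal deviates are then typically obtained from such uniform draws, e.g. by the Marsaglia and Bray method the abstract mentions.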
On randomly interrupted diffusion
International Nuclear Information System (INIS)
Luczka, J.
1993-01-01
Processes driven by randomly interrupted Gaussian white noise are considered. An evolution equation for single-event probability distributions is presented. Stationary states are considered as solutions of a second-order ordinary differential equation with two imposed conditions. A linear model is analyzed and its stationary distributions are explicitly given. (author). 10 refs
DEFF Research Database (Denmark)
Paolini, Enrico; Stefanovic, Cedomir; Liva, Gianluigi
2015-01-01
The rise of machine-to-machine communications has rekindled the interest in random access protocols as a support for a massive number of uncoordinatedly transmitting devices. The legacy ALOHA approach is developed under a collision model, where slots containing collided packets are considered as ...
Random eigenvalue problems revisited
Indian Academy of Sciences (India)
statistical distributions; linear stochastic systems. 1. ... dimensional multivariate Gaussian random vector with mean µ ∈ Rm and covariance ... 5, the proposed analytical methods are applied to a three degree-of-freedom system and the ...... The joint pdf ofω1 andω3 is however close to a bivariate Gaussian density function.
Bridging Emergent Attributes and Darwinian Principles in Teaching Natural Selection
Xu, Dongchen; Chi, Michelene T. H.
2016-01-01
Students often have misconceptions about natural selection as they misuse a direct causal schema to explain the process. Natural selection is in fact an emergent process where random interactions lead to changes in a population. The misconceptions stem from students' lack of emergent schema for natural selection. In order to help students…
Odagaki, Takashi; Kasuya, Keisuke
2017-09-01
Using the Monte Carlo simulation, we investigate a memory-impaired self-avoiding walk on a square lattice in which a random walker marks each visited site with a given probability p and makes a random walk avoiding the marked sites. Namely, p = 0 and p = 1 correspond to the simple random walk and the self-avoiding walk, respectively. When p > 0, there is a finite probability that the walker is trapped. We show that the trap time distribution can be well fitted by Stacy's Weibull distribution b (a/b)^{(a+1)/b} [Γ((a+1)/b)]^{-1} x^a exp(-(a/b) x^b), where a and b are fitting parameters depending on p. We also find that the mean trap time diverges at p = 0 as p^{-α} with α = 1.89. In order to produce a sufficient number of long walks, we exploit the pivot algorithm and obtain the mean square displacement and its Flory exponent ν(p) as functions of p. We find that the exponent determined for 1000-step walks interpolates both limits ν(0) for the simple random walk and ν(1) for the self-avoiding walk as [ν(p) - ν(0)] / [ν(1) - ν(0)] = p^β with β = 0.388 when p ≪ 0.1 and β = 0.0822 when p ≫ 0.1. Contribution to the Topical Issue "Continuous Time Random Walk Still Trendy: Fifty-year History, Current State and Outlook", edited by Ryszard Kutner and Jaume Masoliver.
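The walk defined in this abstract is easy to simulate directly; the sketch below uses the lattice, marking rule and trapping condition as described (all function and variable names are ours):

```python
import random

def walk_until_trapped(p, max_steps=10000, seed=None):
    """Memory-impaired self-avoiding walk on a square lattice: each visited
    site is marked with probability p, and marked sites are never revisited.
    Returns the number of steps taken before trapping (or max_steps)."""
    rng = random.Random(seed)
    pos, marked = (0, 0), set()
    for step in range(max_steps):
        if rng.random() < p:
            marked.add(pos)
        moves = [(pos[0] + dx, pos[1] + dy)
                 for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                 if (pos[0] + dx, pos[1] + dy) not in marked]
        if not moves:
            return step  # trapped: all four neighbours are marked
        pos = rng.choice(moves)
    return max_steps

print(walk_until_trapped(p=1.0, seed=1))  # p = 1: every visited site is avoided
```

Averaging the returned trap times over many seeds for a grid of p values reproduces the kind of statistics the authors fit with Stacy's distribution.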
Unwilling or Unable to Cheat? Evidence from a Randomized Tax Audit Experiment in Denmark
Henrik J. Kleven; Martin B. Knudsen; Claus T. Kreiner; Søren Pedersen; Emmanuel Saez
2010-01-01
This paper analyzes a randomized tax enforcement experiment in Denmark. In the base year, a stratified and representative sample of over 40,000 individual income tax filers was selected for the experiment. Half of the tax filers were randomly selected to be thoroughly audited, while the rest were deliberately not audited. The following year, "threat-of-audit" letters were randomly assigned and sent to tax filers in both groups. Using comprehensive administrative tax data, we present four main...
Random vibrations theory and practice
Wirsching, Paul H; Ortiz, Keith
1995-01-01
Random Vibrations: Theory and Practice covers the theory and analysis of mechanical and structural systems undergoing random oscillations due to any number of phenomena— from engine noise, turbulent flow, and acoustic noise to wind, ocean waves, earthquakes, and rough pavement. For systems operating in such environments, a random vibration analysis is essential to the safety and reliability of the system. By far the most comprehensive text available on random vibrations, Random Vibrations: Theory and Practice is designed for readers who are new to the subject as well as those who are familiar with the fundamentals and wish to study a particular topic or use the text as an authoritative reference. It is divided into three major sections: fundamental background, random vibration development and applications to design, and random signal analysis. Introductory chapters cover topics in probability, statistics, and random processes that prepare the reader for the development of the theory of random vibrations a...
Portfolio Selection with Jumps under Regime Switching
Directory of Open Access Journals (Sweden)
Lin Zhao
2010-01-01
Full Text Available We investigate a continuous-time version of the mean-variance portfolio selection model with jumps under regime switching. The portfolio selection is proposed and analyzed for a market consisting of one bank account and multiple stocks. The random regime switching is assumed to be independent of the underlying Brownian motion and jump processes. A Markov chain modulated diffusion formulation is employed to model the problem.
A computational model of selection by consequences.
McDowell, J J
2004-01-01
Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of computational experiments that arranged reinforcement according to random-interval (RI) schedules. The quantitative features of the model were varied o...
Voiculescu, Dan; Nica, Alexandru
1992-01-01
This book presents the first comprehensive introduction to free probability theory, a highly noncommutative probability theory with independence based on free products instead of tensor products. Basic examples of this kind of theory are provided by convolution operators on free groups and by the asymptotic behavior of large Gaussian random matrices. The probabilistic approach to free products has led to a recent surge of new results on the von Neumann algebras of free groups. The book is ideally suited as a textbook for an advanced graduate course and could also provide material for a seminar. In addition to researchers and graduate students in mathematics, this book will be of interest to physicists and others who use random matrices.
Independent random sampling methods
Martino, Luca; Míguez, Joaquín
2018-01-01
This book systematically addresses the design and analysis of efficient techniques for independent random sampling. Both general-purpose approaches, which can be used to generate samples from arbitrary probability distributions, and tailored techniques, designed to efficiently address common real-world practical problems, are introduced and discussed in detail. In turn, the monograph presents fundamental results and methodologies in the field, elaborating and developing them into the latest techniques. The theory and methods are illustrated with a varied collection of examples, which are discussed in detail in the text and supplemented with ready-to-run computer code. The main problem addressed in the book is how to generate independent random samples from an arbitrary probability distribution with the weakest possible constraints or assumptions in a form suitable for practical implementation. The authors review the fundamental results and methods in the field, address the latest methods, and emphasize the li...
Directory of Open Access Journals (Sweden)
Anwer Khurshid
2012-07-01
Full Text Available In this paper, it is shown that a complex multivariate random variable is a complex multivariate normal random variable of dimensionality if and only if all nondegenerate complex linear combinations of have a complex univariate normal distribution. The characteristic function of has been derived, and simpler forms of some theorems have been given using this characterization theorem without assuming that the variance-covariance matrix of the vector is Hermitian positive definite. Marginal distributions of have been given. In addition, a complex multivariate t-distribution has been defined and the density derived. A characterization of the complex multivariate t-distribution is given. A few possible uses of this distribution have been suggested.
International Nuclear Information System (INIS)
Reuss, J.D.; Misguich, J.H.
1993-02-01
The Campbell process is a stationary random process which can have various correlation functions, according to the choice of an elementary response function. The statistical properties of this process are presented. A numerical algorithm and a subroutine for generating such a process are built up and tested, for the physically interesting case of a Campbell process with Gaussian correlations. The (non-Gaussian) probability distribution appears to be similar to the Gamma distribution.
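A Campbell (shot-noise) process with a prescribed response function can be generated as sketched below, assuming a Poisson pulse train and a Gaussian response to obtain Gaussian-like correlations; this is our illustration, not the cited subroutine:

```python
import math, random

def campbell_process(t_grid, rate, response, seed=None):
    """Shot-noise (Campbell) process: a sum of copies of an elementary
    response function launched at Poisson-distributed arrival times."""
    rng = random.Random(seed)
    t_max = t_grid[-1]
    arrivals, t = [], 0.0
    while True:
        t += rng.expovariate(rate)  # exponential inter-arrival times
        if t > t_max:
            break
        arrivals.append(t)
    return [sum(response(u - s) for s in arrivals) for u in t_grid]

gauss = lambda u: math.exp(-u * u)  # Gaussian response -> Gaussian correlations
x = campbell_process([i * 0.1 for i in range(100)], rate=5.0,
                     response=gauss, seed=3)
print(len(x))
```

The one-sided histogram of such a sample exhibits the skewed, Gamma-like shape the abstract describes.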
Certified randomness in quantum physics.
Acín, Antonio; Masanes, Lluis
2016-12-07
The concept of randomness plays an important part in many disciplines. On the one hand, the question of whether random processes exist is fundamental for our understanding of nature. On the other, randomness is a resource for cryptography, algorithms and simulations. Standard methods for generating randomness rely on assumptions about the devices that are often not valid in practice. However, quantum technologies enable new methods for generating certified randomness, based on the violation of Bell inequalities. These methods are referred to as device-independent because they do not rely on any modelling of the devices. Here we review efforts to design device-independent randomness generators and the associated challenges.
Cross over of recurrence networks to random graphs and random ...
Indian Academy of Sciences (India)
2017-01-27
Jan 27, 2017 ... that all recurrence networks can cross over to random geometric graphs by adding sufficient amount of noise to .... municative [19] or social [20], deviate from the random ..... He has shown that the spatial effects become.
A random number generator for continuous random variables
Guerra, V. M.; Tapia, R. A.; Thompson, J. R.
1972-01-01
A FORTRAN 4 routine is given which may be used to generate random observations of a continuous real valued random variable. Normal distribution of F(x), X, E(akimas), and E(linear) is presented in tabular form.
On a randomly imperfect spherical cap pressurized by a random ...
African Journals Online (AJOL)
In this paper, we investigate a dynamical system in a random setting of dual randomness in space and time variables, in which both the imperfection of the structure and the load function are considered random, each with a statistical zero mean. The auto-covariance of the load is correlated as an exponentially decaying ...
A comparison of random walks in dependent random environments
Scheinhardt, Willem R.W.; Kroese, Dirk
We provide exact computations for the drift of random walks in dependent random environments, including $k$-dependent and moving average environments. We show how the drift can be characterized and evaluated using Perron–Frobenius theory. Comparing random walks in various dependent environments, we
Feature-selective attention in healthy old age: a selective decline in selective attention?
Quigley, Cliodhna; Müller, Matthias M
2014-02-12
Deficient selection against irrelevant information has been proposed to underlie age-related cognitive decline. We recently reported evidence for maintained early sensory selection when older and younger adults used spatial selective attention to perform a challenging task. Here we explored age-related differences when spatial selection is not possible and feature-selective attention must be deployed. We additionally compared the integrity of feedforward processing by exploiting the well established phenomenon of suppression of visual cortical responses attributable to interstimulus competition. Electroencephalogram was measured while older and younger human adults responded to brief occurrences of coherent motion in an attended stimulus composed of randomly moving, orientation-defined, flickering bars. Attention was directed to horizontal or vertical bars by a pretrial cue, after which two orthogonally oriented, overlapping stimuli or a single stimulus were presented. Horizontal and vertical bars flickered at different frequencies and thereby elicited separable steady-state visual-evoked potentials, which were used to examine the effect of feature-based selection and the competitive influence of a second stimulus on ongoing visual processing. Age differences were found in feature-selective attentional modulation of visual responses: older adults did not show consistent modulation of magnitude or phase. In contrast, the suppressive effect of a second stimulus was robust and comparable in magnitude across age groups, suggesting that bottom-up processing of the current stimuli is essentially unchanged in healthy old age. Thus, it seems that visual processing per se is unchanged, but top-down attentional control is compromised in older adults when space cannot be used to guide selection.
Opportunistic Relay Selection with Cooperative Macro Diversity
Directory of Open Access Journals (Sweden)
Yu Chia-Hao
2010-01-01
Full Text Available We apply a fully opportunistic relay selection scheme to study cooperative diversity in a semianalytical manner. In our framework, idle Mobile Stations (MSs) are capable of being used as Relay Stations (RSs) and no relaying is required if the direct path is strong. Our relay selection scheme is fully selection based: either the direct path or one of the relaying paths is selected. Macro diversity, which is often ignored in analytical works, is taken into account together with micro diversity by using a complete channel model that includes both shadow fading and fast fading effects. The stochastic geometry of the network is taken into account by having a random number of randomly located MSs. The outage probability analysis of the selection differs from the case where only fast fading is considered. Under our framework, the distribution of the received power is formulated using different Channel State Information (CSI) assumptions to simulate both optimistic and practical environments. The results show that the relay selection gain can be significant given a suitable number of candidate RSs. Also, while relay selection according to incomplete CSI is diversity suboptimal compared to relay selection based on full CSI, the loss in average throughput is not too significant. This is a consequence of the dominance of geometry over fast fading.
Stirling Engine Configuration Selection
Directory of Open Access Journals (Sweden)
Jose Egas
2018-03-01
Full Text Available Unlike internal combustion engines, Stirling engines can be designed to work with many drive mechanisms based on the three primary configurations, alpha, beta and gamma. Hundreds of different combinations of configuration and mechanical drive have been proposed. Few succeed beyond prototypes. One reason for this poor success is the use of inappropriate configurations and drive mechanisms, which leads to a low power-to-weight ratio and reduced economic viability. The large number of options, the lack of an objective comparison method, and the absence of selection criteria force designers to make random choices. In this article, the pressure–volume diagrams and compression ratios of machines of equal dimensions, using the main (alpha, beta and gamma) crank-based configurations as well as rhombic drive and Ross yoke mechanisms, are obtained. The existence of a direct relation between the optimum compression ratio and the temperature ratio is derived from the ideal Stirling cycle, and the usability of an empirical low-temperature-difference compression-ratio equation for high-temperature-difference applications is tested using experimental data. It is shown that each machine has a different compression ratio, making it more or less suitable for a specific application, depending on the reachable temperature difference.
2007-01-01
Not much effort needed, just willpower In order to keep the cost of disposing of waste materials as low as possible, CERN provides two types of recipient at the entrance to each building: a green plastic one for paper/cardboard and a metal one for general refuse. For some time now we have noticed, to our great regret, a growing negligence as far as selective sorting is concerned, with, for example, the green recipients being filled with a mixture of cardboard boxes full of polystyrene or protective wrappers, plastic bottles, empty yogurt pots, etc. …We have been able to ascertain, after careful checking, that this haphazard mixing of waste cannot be attributed to the cleaning staff but rather to members of the personnel who unscrupulously throw away their rubbish in a completely random manner. Non-sorted waste entails heavy costs for CERN. For information, once a non-compliant item is found in a green recipient, the entire contents are sent off for incineration rather than recycling… We are all concerned...
Malaria parasitemia amongst pregnant women attending selected ...
African Journals Online (AJOL)
A cross-sectional study to determine malaria parasitemia amongst 300 randomly selected pregnant women attending government and private healthcare facilities in Rivers State was carried out. Blood samples were obtained through venous procedure and the presence or absence of Plasmodium was determined ...
Random numbers from vacuum fluctuations
International Nuclear Information System (INIS)
Shi, Yicheng; Kurtsiefer, Christian; Chng, Brenda
2016-01-01
We implement a quantum random number generator based on a balanced homodyne measurement of vacuum fluctuations of the electromagnetic field. The digitized signal is directly processed with a fast randomness extraction scheme based on a linear feedback shift register. The random bit stream is continuously read in a computer at a rate of about 480 Mbit/s and passes an extended test suite for random numbers.
Random numbers from vacuum fluctuations
Energy Technology Data Exchange (ETDEWEB)
Shi, Yicheng; Kurtsiefer, Christian, E-mail: christian.kurtsiefer@gmail.com [Department of Physics, National University of Singapore, 2 Science Drive 3, Singapore 117542 (Singapore); Center for Quantum Technologies, National University of Singapore, 3 Science Drive 2, Singapore 117543 (Singapore); Chng, Brenda [Center for Quantum Technologies, National University of Singapore, 3 Science Drive 2, Singapore 117543 (Singapore)
2016-07-25
We implement a quantum random number generator based on a balanced homodyne measurement of vacuum fluctuations of the electromagnetic field. The digitized signal is directly processed with a fast randomness extraction scheme based on a linear feedback shift register. The random bit stream is continuously read in a computer at a rate of about 480 Mbit/s and passes an extended test suite for random numbers.
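The randomness-extraction step described above (a linear feedback shift register fed by the digitized signal) can be sketched as follows; the tap positions are the classic 16-bit Fibonacci LFSR taps, chosen for illustration and not necessarily those used by the authors:

```python
def lfsr_extract(bits, taps=(16, 14, 13, 11), width=16):
    """Toy LFSR-based randomness extractor: raw (possibly biased) bits are
    XORed into the feedback path while the register clocks, and the
    register's low bit is emitted as the extracted stream."""
    state = 0xACE1  # arbitrary nonzero initial fill
    out = []
    for b in bits:
        fb = 0
        for t in taps:
            fb ^= (state >> (t - 1)) & 1
        state = ((state << 1) | (fb ^ b)) & ((1 << width) - 1)
        out.append(state & 1)
    return out

raw = [1, 1, 1, 0, 1, 1, 1, 1, 0, 1, 1, 1]  # deliberately biased raw bits
print(lfsr_extract(raw))
```

In hardware this runs at line rate, which is how the cited setup sustains its ~480 Mbit/s output.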
Pseudo-random number generator for the Sigma 5 computer
Carroll, S. N.
1983-01-01
A technique is presented for developing a pseudo-random number generator based on the linear congruential form. The two numbers used for the generator are a prime number and a corresponding primitive root, where the prime is the largest prime number that can be accurately represented on a particular computer. The primitive root is selected by applying Marsaglia's lattice test. The technique presented was applied to write a random number program for the Sigma 5 computer. The new program, named S:RANDOM1, is judged to be superior to the older program named S:RANDOM. For applications requiring several independent random number generators, a table is included showing several acceptable primitive roots. The technique and programs described can be applied to any computer having word length different from that of the Sigma 5.
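The construction described (largest representable prime as modulus, with a primitive root as multiplier) can be illustrated with the well-known "minimal standard" constants m = 2^31 - 1 and a = 16807, where 16807 is a primitive root of m; the actual Sigma 5 constants would depend on its own word length:

```python
def lehmer(seed, a=16807, m=2**31 - 1):
    """Multiplicative congruential generator x -> (a*x) mod m with m the
    Mersenne prime 2^31 - 1 and a = 16807 a primitive root mod m."""
    x = seed
    while True:
        x = (a * x) % m
        yield x

g = lehmer(1)
print(next(g), next(g))  # 16807 282475249
```

Because a is a primitive root, the sequence cycles through all of 1..m-1 before repeating, the full-period property the lattice test helps vet.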
Theory of Randomized Search Heuristics in Combinatorial Optimization
DEFF Research Database (Denmark)
The rigorous mathematical analysis of randomized search heuristics (RSHs) with respect to their expected runtime is a growing research area where many results have been obtained in recent years. This class of heuristics includes well-known approaches such as Randomized Local Search (RLS) and the Metr... analysis of randomized algorithms to RSHs. Mostly, the expected runtime of RSHs on selected problems is analyzed. Thereby, we understand why and when RSHs are efficient optimizers and, conversely, when they cannot be efficient. The tutorial will give an overview on the analysis of RSHs for solving...
Velocity and Dispersion for a Two-Dimensional Random Walk
International Nuclear Information System (INIS)
Li Jinghui
2009-01-01
In the paper, we consider the transport of a two-dimensional random walk. The velocity and the dispersion of this two-dimensional random walk are derived. It mainly shows that: (i) by controlling the values of the transition rates, the direction of the random walk can be reversed; (ii) for some suitably selected transition rates, our two-dimensional random walk can be efficient in comparison with the one-dimensional random walk. Our work is motivated in part by the challenge to explain the unidirectional transport of motor proteins. When the motor proteins move at the turn points of their tracks (i.e., the cytoskeleton filaments and the DNA molecular tubes), some of our results in this paper can be used to deal with the problem. (general)
Randomness at the root of things 1: Random walks
Ogborn, Jon; Collins, Simon; Brown, Mick
2003-09-01
This is the first of a pair of articles about randomness in physics. In this article, we use some variations on the idea of a `random walk' to consider first the path of a particle in Brownian motion, and then the random variation to be expected in radioactive decay. The arguments are set in the context of the general importance of randomness both in physics and in everyday life. We think that the ideas could usefully form part of students' A-level work on random decay and quantum phenomena, as well as being good for their general education. In the second article we offer a novel and simple approach to Poisson sequences.
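The radioactive-decay side of the article can be illustrated in a few lines; the atom count and per-step decay probability below are illustrative numbers, not taken from the article:

```python
import random

def decay_counts(n_atoms, p_decay, n_steps, seed=None):
    """Each surviving nucleus decays independently with probability p_decay
    per time step, so the survivor count falls off exponentially on average,
    with random scatter around the mean."""
    rng = random.Random(seed)
    survivors, history = n_atoms, []
    for _ in range(n_steps):
        decayed = sum(1 for _ in range(survivors) if rng.random() < p_decay)
        survivors -= decayed
        history.append(survivors)
    return history

h = decay_counts(10000, 0.1, 10, seed=7)
print(h[0], h[-1])  # near 9000 and near 10000 * 0.9**10 ~ 3487, with scatter
```

Re-running with different seeds shows the fluctuations around exponential decay that make the process genuinely random rather than deterministic.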
Investigating the Randomness of Numbers
Pendleton, Kenn L.
2009-01-01
The use of random numbers is pervasive in today's world. Random numbers have practical applications in such far-flung arenas as computer simulations, cryptography, gambling, the legal system, statistical sampling, and even the war on terrorism. Evaluating the randomness of extremely large samples is a complex, intricate process. However, the…
Strong Decomposition of Random Variables
DEFF Research Database (Denmark)
Hoffmann-Jørgensen, Jørgen; Kagan, Abram M.; Pitt, Loren D.
2007-01-01
A random variable X is strongly decomposable if X=Y+Z, where Y=Φ(X) and Z=X-Φ(X) are independent non-degenerate random variables (called the components). It is shown that at least one of the components is singular, and we derive a necessary and sufficient condition for strong decomposability of a discrete random variable.
Random Numbers and Quantum Computers
McCartney, Mark; Glass, David
2002-01-01
The topic of random numbers is investigated in such a way as to illustrate links between mathematics, physics and computer science. First, the generation of random numbers by a classical computer using the linear congruential generator and logistic map is considered. It is noted that these procedures yield only pseudo-random numbers since…
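Both generators named in the abstract are deterministic recurrences, which is why their outputs are only pseudo-random: restarting from the same seed reproduces the sequence exactly. A minimal sketch of each (the LCG constants are the widely used Numerical Recipes parameters, chosen here for illustration):

```python
from itertools import islice

def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Linear congruential generator: x_{n+1} = (a*x_n + c) mod m,
    normalized to [0, 1)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m

def logistic_map(x0=0.2, r=4.0):
    """Logistic map x_{n+1} = r*x_n*(1-x_n); chaotic for r = 4."""
    x = x0
    while True:
        x = r * x * (1.0 - x)
        yield x

lcg_sample = list(islice(lcg(seed=42), 5))
log_sample = list(islice(logistic_map(), 5))
```

Re-running `lcg(seed=42)` yields the identical five values, the defining feature of a pseudo-random generator.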
Rabi N. Bhattacharya selected papers
Waymire, Edward
2016-01-01
This volume presents some of the most influential papers published by Rabi N. Bhattacharya, along with commentaries from international experts, demonstrating his knowledge, insight, and influence in the field of probability and its applications. For more than three decades, Bhattacharya has made significant contributions in areas ranging from theoretical statistics via analytical probability theory, Markov processes, and random dynamics to applied topics in statistics, economics, and geophysics. Selected reprints of Bhattacharya’s papers are divided into three sections: Modes of Approximation, Large Times for Markov Processes, and Stochastic Foundations in Applied Sciences. The accompanying articles by the contributing authors not only help to position his work in the context of other achievements, but also provide a unique assessment of the state of their individual fields, both historically and for the next generation of researchers. Rabi N. Bhattacharya: Selected Papers will be a valuable resource for yo...
Social Selection and Religiously Selective Faith Schools
Pettinger, Paul
2014-01-01
This article reviews recent research looking at the socio-economic profile of pupils at faith schools and the contribution religiously selective admission arrangements make. It finds that selection by faith leads to greater social segregation and is open to manipulation. It urges that such selection should end, making the state-funded school…
Lawler, Gregory F.; Ferreras, José A. Trujillo
2004-01-01
The Brownian loop soup introduced in Lawler and Werner (2004) is a Poissonian realization from a sigma-finite measure on unrooted loops. This measure satisfies both conformal invariance and a restriction property. In this paper, we define a random walk loop soup and show that it converges to the Brownian loop soup. In fact, we give a strong approximation result making use of the strong approximation result of Komlós, Major, and Tusnády. To make the paper self-contained, we include a proof...
Deift, Percy
2009-01-01
This book features a unified derivation of the mathematical theory of the three classical types of invariant random matrix ensembles-orthogonal, unitary, and symplectic. The authors follow the approach of Tracy and Widom, but the exposition here contains a substantial amount of additional material, in particular, facts from functional analysis and the theory of Pfaffians. The main result in the book is a proof of universality for orthogonal and symplectic ensembles corresponding to generalized Gaussian type weights following the authors' prior work. New, quantitative error estimates are derive
International Nuclear Information System (INIS)
Audenaert, Koenraad M R; Scheel, Stefan
2008-01-01
In this paper, we provide necessary and sufficient conditions for a completely positive trace-preserving (CPT) map to be decomposable into a convex combination of unitary maps. Additionally, we set out to define a proper distance measure between a given CPT map and the set of random unitary maps, and methods for calculating it. In this way one could determine whether non-classical error mechanisms such as spontaneous decay or photon loss dominate over classical uncertainties, for example, in a phase parameter. The present paper is a step towards achieving this goal
DEFF Research Database (Denmark)
Wanscher, Jørgen Bundgaard; Sørensen, Majken Vildrik
2006-01-01
Random numbers are used for a great variety of applications in almost any field of computer and economic sciences today. Examples range from stock market forecasting in economics, through stochastic traffic modelling in operations research, to photon and ray tracing in graphics. The construction ... distributions into others with most of the required characteristics. In essence, a uniform sequence is transformed into a new sequence with the required distribution. The subject of this article is to consider the well-known, highly uniform Halton sequence and modifications to it. The intent is to generate...
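The Halton sequence mentioned above is built from the radical-inverse function: write the index in a given base and mirror its digits about the radix point. The bases and point count below are illustrative; multi-dimensional Halton points use pairwise coprime bases.

```python
def halton(index, base):
    """Radical inverse of `index` in `base`: mirror the base-b digits
    about the radix point, giving a low-discrepancy value in (0, 1)."""
    f, result = 1.0, 0.0
    i = index
    while i > 0:
        f /= base
        result += f * (i % base)
        i //= base
    return result

# 2D Halton points from the coprime bases 2 and 3
points = [(halton(i, 2), halton(i, 3)) for i in range(1, 9)]
```

For example, index 1 in base 2 gives 0.5 and index 4 (binary 100, mirrored to 0.001) gives 0.125, so successive points fill the unit square far more evenly than independent uniform draws.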
Mobile access to virtual randomization for investigator-initiated trials.
Deserno, Thomas M; Keszei, András P
2017-08-01
Background/aims: Randomization is indispensable in clinical trials in order to provide unbiased treatment allocation and valid statistical inference. Improper handling of allocation lists can be avoided using central systems, for example, human-based services. However, central systems are unaffordable for investigator-initiated trials and might be inaccessible from some places where study subjects need allocations. We propose mobile access to virtual randomization, where the randomization lists are non-existent and the appropriate allocation is computed on demand. Methods: The core of the system architecture is an electronic data capture system or a clinical trial management system, which is extended by an R interface connecting the R server using the Java R Interface. Mobile devices communicate via representational state transfer web services. Furthermore, a simple web-based setup allows non-statisticians to configure the appropriate statistics. Our comprehensive R script supports simple randomization, restricted randomization using a random allocation rule, block randomization, and stratified randomization for un-blinded, single-blinded, and double-blinded trials. For each trial, the electronic data capture system or the clinical trial management system stores the randomization parameters and the subject assignments. Results: Apps are provided for iOS and Android, and subjects are randomized using smartphones. After logging onto the system, the user selects the trial and the subject, and the allocation number and treatment arm are displayed instantaneously and stored in the core system. So far, 156 subjects have been allocated from mobile devices serving five investigator-initiated trials. Conclusion: Transforming pre-printed allocation lists into virtual ones ensures the correct conduct of trials and guarantees strictly sequential processing in all trial sites. Covering 88% of all randomization models that are used in recent trials, virtual randomization
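The core idea, computing an allocation on demand instead of storing a list, can be sketched in a few lines. This is an illustrative reconstruction, not the paper's R implementation: seeding a PRNG with the trial identifier makes every site reproduce the identical blocked sequence (here with randomly selected block sizes, as discussed in the blocked-randomization abstract above), so no pre-printed list ever needs to exist.

```python
import random

def virtual_allocation(subject_index, trial_seed, arms=("A", "B"),
                       block_sizes=(2, 4, 6)):
    """Compute one subject's treatment arm on demand.

    Block randomization with randomly chosen block sizes; every call
    with the same trial_seed regenerates the same virtual sequence.
    Block sizes must be multiples of the number of arms.
    """
    rng = random.Random(trial_seed)
    sequence = []
    while len(sequence) <= subject_index:
        size = rng.choice(block_sizes)            # random block size
        block = list(arms) * (size // len(arms))  # balanced block
        rng.shuffle(block)
        sequence.extend(block)
    return sequence[subject_index]

allocs = [virtual_allocation(i, trial_seed=2017) for i in range(12)]
```

Because blocks are internally balanced, the arm counts can never diverge by more than one block's worth, while the random block size keeps the next assignment unpredictable to an unblinded investigator.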
Genetic search feature selection for affective modeling
DEFF Research Database (Denmark)
Martínez, Héctor P.; Yannakakis, Georgios N.
2010-01-01
Automatic feature selection is a critical step towards the generation of successful computational models of affect. This paper presents a genetic search-based feature selection method which is developed as a global-search algorithm for improving the accuracy of the affective models built. ... The method is tested and compared against sequential forward feature selection and random search in a dataset derived from a game survey experiment which contains bimodal input features (physiological and gameplay) and expressed pairwise preferences of affect. Results suggest that the proposed method...
Random coil chemical shifts in acidic 8 M urea: Implementation of random coil shift data in NMRView
International Nuclear Information System (INIS)
Schwarzinger, Stephan; Kroon, Gerard J.A.; Foss, Ted R.; Wright, Peter E.; Dyson, H. Jane
2000-01-01
Studies of proteins unfolded in acid or chemical denaturant can help in unraveling events during the earliest phases of protein folding. In order for meaningful comparisons to be made of residual structure in unfolded states, it is necessary to use random coil chemical shifts that are valid for the experimental system under study. We present a set of random coil chemical shifts obtained for model peptides under experimental conditions used in studies of denatured proteins. This new set, together with previously published data sets, has been incorporated into a software interface for NMRView, allowing selection of the random coil data set that fits the experimental conditions best
Mak, Chi H.; Pham, Phuong; Afif, Samir A.; Goodman, Myron F.
2015-09-01
Enzymes that rely on random walk to search for substrate targets in a heterogeneously dispersed medium can leave behind complex spatial profiles of their catalyzed conversions. The catalytic signatures of these random-walk enzymes are the result of two coupled stochastic processes: scanning and catalysis. Here we develop analytical models to understand the conversion profiles produced by these enzymes, comparing an intrusive model, in which scanning and catalysis are tightly coupled, against a loosely coupled passive model. Diagrammatic theory and path-integral solutions of these models revealed clearly distinct predictions. Comparison to experimental data from catalyzed deaminations deposited on single-stranded DNA by the enzyme activation-induced deoxycytidine deaminase (AID) demonstrates that catalysis and diffusion are strongly intertwined, where the chemical conversions give rise to new stochastic trajectories that were absent if the substrate DNA was homogeneous. The C →U deamination profiles in both analytical predictions and experiments exhibit a strong contextual dependence, where the conversion rate of each target site is strongly contingent on the identities of other surrounding targets, with the intrusive model showing an excellent fit to the data. These methods can be applied to deduce sequence-dependent catalytic signatures of other DNA modification enzymes, with potential applications to cancer, gene regulation, and epigenetics.
Funaki, Tadahisa
2016-01-01
Interfaces are created to separate two distinct phases in a situation in which phase coexistence occurs. This book discusses randomly fluctuating interfaces in several different settings and from several points of view: discrete/continuum, microscopic/macroscopic, and static/dynamic theories. The following four topics in particular are dealt with in the book. Assuming that the interface is represented as a height function measured from a fixed-reference discretized hyperplane, the system is governed by the Hamiltonian of gradient of the height functions. This is a kind of effective interface model called ∇φ-interface model. The scaling limits are studied for Gaussian (or non-Gaussian) random fields with a pinning effect under a situation in which the rate functional of the corresponding large deviation principle has non-unique minimizers. Young diagrams determine decreasing interfaces, and their dynamics are introduced. The large-scale behavior of such dynamics is studied from the points of view of the hyd...
Random catalytic reaction networks
Stadler, Peter F.; Fontana, Walter; Miller, John H.
1993-03-01
We study networks that are a generalization of replicator (or Lotka-Volterra) equations. They model the dynamics of a population of object types whose binary interactions determine the specific type of interaction product. Such a system always reduces its dimension to a subset that contains production pathways for all of its members. The network equation can be rewritten at a level of collectives in terms of two basic interaction patterns: replicator sets and cyclic transformation pathways among sets. Although the system contains well-known cases that exhibit very complicated dynamics, the generic behavior of randomly generated systems is found (numerically) to be extremely robust: convergence to a globally stable rest point. It is easy to tailor networks that display replicator interactions where the replicators are entire self-sustaining subsystems, rather than structureless units. A numerical scan of random systems highlights the special properties of elementary replicators: they reduce the effective interconnectedness of the system, resulting in enhanced competition, and strong correlations between the concentrations.
Pradillo, Gerardo; Heintz, Aneesh; Vlahovska, Petia
2017-11-01
The spontaneous rotation of a sphere in an applied uniform DC electric field (Quincke effect) has been utilized to engineer self-propelled particles: if the sphere is initially resting on a surface, it rolls. The Quincke rollers have been widely used as a model system to study collective behavior in ``active'' suspensions. If the applied field is DC, an isolated Quincke roller follows a straight line trajectory. In this talk, we discuss the design of a Quincke roller that executes a random-walk-like behavior. We utilize AC field - upon reversal of the field direction a fluctuation in the axis of rotation (which is degenerate in the plane perpendicular to the field and parallel to the surface) introduces randomness in the direction of motion. The MSD of an isolated Quincke walker depends on frequency, amplitude, and waveform of the electric field. Experiment and theory are compared. We also investigate the collective behavior of Quincke walkers,the transport of inert particles in a bath of Quincke walkers, and the spontaneous motion of a drop containing Quincke active particle. supported by NSF Grant CBET 1437545.
Subjective randomness as statistical inference.
Griffiths, Thomas L; Daniels, Dylan; Austerweil, Joseph L; Tenenbaum, Joshua B
2018-06-01
Some events seem more random than others. For example, when tossing a coin, a sequence of eight heads in a row does not seem very random. Where do these intuitions about randomness come from? We argue that subjective randomness can be understood as the result of a statistical inference assessing the evidence that an event provides for having been produced by a random generating process. We show how this account provides a link to previous work relating randomness to algorithmic complexity, in which random events are those that cannot be described by short computer programs. Algorithmic complexity is both incomputable and too general to capture the regularities that people can recognize, but viewing randomness as statistical inference provides two paths to addressing these problems: considering regularities generated by simpler computing machines, and restricting the set of probability distributions that characterize regularity. Building on previous work exploring these different routes to a more restricted notion of randomness, we define strong quantitative models of human randomness judgments that apply not just to binary sequences - which have been the focus of much of the previous work on subjective randomness - but also to binary matrices and spatial clustering. Copyright © 2018 Elsevier Inc. All rights reserved.
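The inference framed above can be made concrete with a toy model. The "regular" process below (each bit repeats the previous one with probability delta) is an assumed stand-in for the paper's richer space of regular generators; subjective randomness is then the evidence, in bits, that a sequence came from the uniform random process rather than the regular one.

```python
import math

def log2_prob_random(bits):
    """Under the random process every length-n binary sequence has
    probability 2^-n."""
    return -float(len(bits))

def log2_prob_regular(bits, delta=0.9):
    """Toy 'regular' process: first bit uniform, then each bit repeats
    the previous one with probability delta. (A richer regular class
    would also capture strict alternation; this is a sketch.)"""
    lp = -1.0
    for prev, cur in zip(bits, bits[1:]):
        lp += math.log2(delta if cur == prev else 1.0 - delta)
    return lp

def subjective_randomness(bits):
    """Log-likelihood ratio favoring the random generating process;
    higher values mean the sequence looks 'more random'."""
    return log2_prob_random(bits) - log2_prob_regular(bits)

r_run = subjective_randomness([1, 1, 1, 1, 1, 1, 1, 1])  # eight heads
r_mix = subjective_randomness([0, 1, 1, 0, 1, 0, 0, 1])
r_alt = subjective_randomness([0, 1, 0, 1, 0, 1, 0, 1])
```

Consistent with the coin-tossing intuition in the abstract, eight heads in a row scores lowest: it is strong evidence for the regular process, not the random one.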
Noginov, Mikhail A
2005-01-01
Random lasers are the simplest sources of stimulated emission without cavity, with the feedback provided by scattering in a gain medium. First proposed in the late 60’s, random lasers have grown to a large research field. This book reviews the history and the state of the art of random lasers, provides an outline of the basic models describing their behavior, and describes the recent advances in the field. The major focus of the book is on solid-state random lasers. However, it also briefly describes random lasers based on liquid dyes with scatterers. The chapters of the book are almost independent of each other. So, the scientists or engineers interested in any particular aspect of random lasers can read directly the relevant section. Researchers entering the field of random lasers will find in the book an overview of the field of study. Scientists working in the field can use the book as a reference source.
Nobile, Fabio
2015-01-07
We consider a general problem F(u, y) = 0 where u is the unknown solution, possibly Hilbert space valued, and y a set of uncertain parameters. We specifically address the situation in which the parameter-to-solution map u(y) is smooth, however y could be very high (or even infinite) dimensional. In particular, we are interested in cases in which F is a differential operator, u a Hilbert space valued function, and y a distributed, space- and/or time-varying, random field. We aim at reconstructing the parameter-to-solution map u(y) from random noise-free or noisy observations in random points by discrete least squares on polynomial spaces. The noise-free case is relevant whenever the technique is used to construct metamodels, based on polynomial expansions, for the output of computer experiments. In the case of PDEs with random parameters, the metamodel is then used to approximate statistics of the output quantity. We discuss the stability of discrete least squares on random points and show convergence estimates both in expectation and in probability. We also present possible strategies to select, either a priori or by adaptive algorithms, sequences of approximating polynomial spaces that allow one to reduce, and in some cases break, the curse of dimensionality.
Sampling large random knots in a confined space
International Nuclear Information System (INIS)
Arsuaga, J; Blackstone, T; Diao, Y; Hinson, K; Karadayi, E; Saito, M
2007-01-01
DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^(n^2)). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way in sampling large (prime) knots, which can be useful in various applications
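The 2D uniform random polygon used above is simple to sample: draw n vertices uniformly in the unit square and join them in order into a closed loop. The sketch below also counts pairwise crossings between non-adjacent edges, the quantity whose average the paper shows to be of order O(n^2); the sizes and seed are illustrative.

```python
import random

def uniform_random_polygon(n, rng):
    """n vertices uniform in the unit square, joined in cyclic order."""
    return [(rng.random(), rng.random()) for _ in range(n)]

def _ccw(a, b, c):
    """Signed area test: >0 if a,b,c turn counter-clockwise."""
    return (b[0]-a[0])*(c[1]-a[1]) - (b[1]-a[1])*(c[0]-a[0])

def segments_cross(p1, p2, p3, p4):
    """Proper (interior) intersection test for segments p1p2, p3p4."""
    d1, d2 = _ccw(p3, p4, p1), _ccw(p3, p4, p2)
    d3, d4 = _ccw(p1, p2, p3), _ccw(p1, p2, p4)
    return d1 * d2 < 0 and d3 * d4 < 0

def crossing_number(poly):
    """Count crossings among non-adjacent edges of the closed polygon."""
    n = len(poly)
    edges = [(poly[i], poly[(i + 1) % n]) for i in range(n)]
    count = 0
    for i in range(n):
        for j in range(i + 2, n):
            if i == 0 and j == n - 1:
                continue  # these edges are adjacent around the loop
            if segments_cross(*edges[i], *edges[j]):
                count += 1
    return count

rng = random.Random(7)
c10 = crossing_number(uniform_random_polygon(10, rng))
c100 = crossing_number(uniform_random_polygon(100, rng))
```

A 100-vertex polygon yields on the order of a thousand crossings versus a handful for 10 vertices, consistent with quadratic growth in n.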
Sampling large random knots in a confined space
Arsuaga, J.; Blackstone, T.; Diao, Y.; Hinson, K.; Karadayi, E.; Saito, M.
2007-09-01
DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way in sampling large (prime) knots, which can be useful in various applications.
Sampling large random knots in a confined space
Energy Technology Data Exchange (ETDEWEB)
Arsuaga, J [Department of Mathematics, San Francisco State University, 1600 Holloway Ave, San Francisco, CA 94132 (United States); Blackstone, T [Department of Computer Science, San Francisco State University, 1600 Holloway Ave., San Francisco, CA 94132 (United States); Diao, Y [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Hinson, K [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Karadayi, E [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States); Saito, M [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States)
2007-09-28
DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^(n^2)). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way in sampling large (prime) knots, which can be useful in various applications.
How random are random numbers generated using photons?
International Nuclear Information System (INIS)
Solis, Aldo; Angulo Martínez, Alí M; Ramírez Alarcón, Roberto; Cruz Ramírez, Hector; U’Ren, Alfred B; Hirsch, Jorge G
2015-01-01
Randomness is fundamental in quantum theory, with many philosophical and practical implications. In this paper we discuss the concept of algorithmic randomness, which provides a quantitative method to assess the Borel normality of a given sequence of numbers, a necessary condition for it to be considered random. We use Borel normality as a tool to investigate the randomness of ten sequences of bits generated from the differences between detection times of photon pairs generated by spontaneous parametric downconversion. These sequences are shown to fulfil the randomness criteria without difficulties. As deviations from Borel normality for photon-generated random number sequences have been reported in previous work, a strategy to understand these diverging findings is outlined. (paper)
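The Borel normality criterion used in this paper is directly computable: for each short word length k, the frequency of every k-bit word (counted in non-overlapping blocks) must lie within sqrt(log2(n)/n) of 2^-k. The sketch below tests synthetic sequences rather than photon data; the tolerance follows Calude's finite-sequence formulation, and max_k=3 is an illustrative choice.

```python
import math
import random
from itertools import product

def borel_deviation(bits, k):
    """Max |frequency - 2^-k| over all 2^k words of length k,
    counted in non-overlapping blocks."""
    nblocks = len(bits) // k
    counts = {}
    for i in range(nblocks):
        w = tuple(bits[i * k:(i + 1) * k])
        counts[w] = counts.get(w, 0) + 1
    return max(abs(counts.get(w, 0) / nblocks - 2.0 ** -k)
               for w in product((0, 1), repeat=k))

def is_borel_normal(bits, max_k=3):
    """True if every word frequency up to length max_k is within the
    sqrt(log2(n)/n) tolerance of its ideal value 2^-k."""
    n = len(bits)
    tol = math.sqrt(math.log(n, 2) / n)
    return all(borel_deviation(bits, k) <= tol
               for k in range(1, max_k + 1))

rng = random.Random(0)
bits = [rng.randint(0, 1) for _ in range(2 ** 16)]            # fair bits
biased = [1 if rng.random() < 0.6 else 0 for _ in range(2 ** 16)]
```

A fair sequence of 2^16 bits passes comfortably, while a 60/40-biased sequence fails already at word length 1, its deviation of about 0.1 far exceeding the ~0.016 tolerance.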
Tailored Random Graph Ensembles
International Nuclear Information System (INIS)
Roberts, E S; Annibale, A; Coolen, A C C
2013-01-01
Tailored graph ensembles are a developing bridge between biological networks and statistical mechanics. The aim is to use this concept to generate a suite of rigorous tools that can be used to quantify and compare the topology of cellular signalling networks, such as protein-protein interaction networks and gene regulation networks. We calculate exact and explicit formulae for the leading orders in the system size of the Shannon entropies of random graph ensembles constrained with degree distribution and degree-degree correlation. We also construct an ergodic detailed balance Markov chain with non-trivial acceptance probabilities which converges to a strictly uniform measure and is based on edge swaps that conserve all degrees. The acceptance probabilities can be generalized to define Markov chains that target any alternative desired measure on the space of directed or undirected graphs, in order to generate graphs with more sophisticated topological features.
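The degree-conserving edge-swap move at the heart of the Markov chain described above can be sketched briefly. Note this is a minimal version with plain rejection of self-loops and multi-edges; the paper's non-trivial acceptance probabilities, which are what make the stationary measure strictly uniform, are omitted here.

```python
import random
from collections import Counter

def edge_swap_chain(edges, steps, seed=0):
    """Attempt `steps` double-edge swaps (a,b),(c,d) -> (a,d),(c,b)
    on a simple undirected graph. Each accepted swap conserves every
    vertex degree; moves creating self-loops or multi-edges are
    rejected. Returns the new edge set."""
    rng = random.Random(seed)
    E = {frozenset(e) for e in edges}
    for _ in range(steps):
        (a, b), (c, d) = [tuple(e) for e in
                          rng.sample(sorted(E, key=sorted), 2)]
        if len({a, b, c, d}) < 4:
            continue  # shared endpoint would create a self-loop
        new1, new2 = frozenset((a, d)), frozenset((c, b))
        if new1 in E or new2 in E:
            continue  # would create a multi-edge
        E -= {frozenset((a, b)), frozenset((c, d))}
        E |= {new1, new2}
    return E

def degrees(E):
    """Degree of every vertex in an edge set."""
    deg = Counter()
    for e in E:
        for v in e:
            deg[v] += 1
    return deg

edges0 = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (0, 2), (1, 3)]
E1 = edge_swap_chain(edges0, steps=200, seed=3)
```

After any number of steps the degree sequence is unchanged, which is exactly the constraint the tailored ensemble is built on.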
Malarz, K.; Szvetelszky, Z.; Szekf, B.; Kulakowski, K.
2006-11-01
We consider the average probability X of being informed of a gossip in a given social network. The network is modeled within the random graph theory of Erdős and Rényi. In this theory, a network is characterized by two parameters: the size N and the link probability p. Our experimental data suggest three levels of social inclusion of friendship. The critical value p_c, for which half of the agents are informed, scales with the system size as N^(-γ) with γ ≈ 0.68. Computer simulations show that the probability X varies with p as a sigmoidal curve. The influence of correlations between neighbors is also evaluated: with increasing clustering coefficient C, X decreases.
Vempala, Santosh S
2005-01-01
Random projection is a simple geometric technique for reducing the dimensionality of a set of points in Euclidean space while preserving pairwise distances approximately. The technique plays a key role in several breakthrough developments in the field of algorithms. In other cases, it provides elegant alternative proofs. The book begins with an elementary description of the technique and its basic properties. Then it develops the method in the context of applications, which are divided into three groups. The first group consists of combinatorial optimization problems such as maxcut, graph coloring, minimum multicut, graph bandwidth and VLSI layout. Presented in this context is the theory of Euclidean embeddings of graphs. The next group is machine learning problems, specifically, learning intersections of halfspaces and learning large margin hypotheses. The projection method is further refined for the latter application. The last set consists of problems inspired by information retrieval, namely, nearest neig...
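The basic technique is short enough to show directly: multiply the points by a random Gaussian matrix scaled by 1/sqrt(k). By the Johnson-Lindenstrauss lemma, pairwise distances are preserved up to a factor (1 ± ε) with high probability once k = O(log n / ε²). The dimensions and seed below are illustrative.

```python
import math
import random

def random_projection(points, k, seed=0):
    """Project d-dimensional points to k dimensions using a random
    Gaussian matrix with entries N(0, 1/k)."""
    rng = random.Random(seed)
    d = len(points[0])
    R = [[rng.gauss(0, 1) / math.sqrt(k) for _ in range(d)]
         for _ in range(k)]
    return [[sum(R[i][j] * p[j] for j in range(d)) for i in range(k)]
            for p in points]

def dist(u, v):
    """Euclidean distance."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

rng = random.Random(1)
pts = [[rng.gauss(0, 1) for _ in range(500)] for _ in range(10)]
proj = random_projection(pts, k=200)
ratios = [dist(proj[i], proj[j]) / dist(pts[i], pts[j])
          for i in range(10) for j in range(i + 1, 10)]
```

Going from 500 to 200 dimensions, every pairwise distance ratio stays close to 1, while the typical fluctuation shrinks like 1/sqrt(k).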
Energy Technology Data Exchange (ETDEWEB)
Fukuma, Masafumi; Sugishita, Sotaro; Umeda, Naoya [Department of Physics, Kyoto University,Kitashirakawa Oiwake-cho, Kyoto 606-8502 (Japan)
2015-07-17
We propose a class of models which generate three-dimensional random volumes, where each configuration consists of triangles glued together along multiple hinges. The models have matrices as the dynamical variables and are characterized by semisimple associative algebras A. Although most of the diagrams represent configurations which are not manifolds, we show that the set of possible diagrams can be drastically reduced such that only (and all of the) three-dimensional manifolds with tetrahedral decompositions appear, by introducing a color structure and taking an appropriate large N limit. We examine the analytic properties when A is a matrix ring or a group ring, and show that the models with matrix ring have a novel strong-weak duality which interchanges the roles of triangles and hinges. We also give a brief comment on the relationship of our models with the colored tensor models.
Random Intercept and Random Slope 2-Level Multilevel Models
Directory of Open Access Journals (Sweden)
Rehan Ahmad Khan
2012-11-01
Full Text Available The random intercept model and the random intercept & random slope model, carrying two levels of hierarchy in the population, are presented and compared with the traditional regression approach. The impact of students' satisfaction on their grade point average (GPA) was explored with and without controlling for teacher influence. The variation at level 1 can be controlled by introducing higher levels of hierarchy in the model. The fanning movement of the fitted lines shows the variation of student grades around teachers.
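The two-level structure described above can be simulated and its teacher-level variance share estimated with a classical one-way ANOVA estimator of the intraclass correlation. All names and effect sizes below are illustrative, not the paper's estimates, and the ANOVA estimator is a simple stand-in for a full mixed-model fit.

```python
import random

def simulate_gpa(n_teachers=40, n_students=25, sd_teacher=0.5,
                 sd_within=1.0, beta_satisfaction=0.3, seed=9):
    """Two-level data: GPA = grand mean + teacher random intercept
    + satisfaction effect + student residual."""
    rng = random.Random(seed)
    groups = []
    for _ in range(n_teachers):
        u = rng.gauss(0, sd_teacher)      # teacher-level intercept
        students = []
        for _ in range(n_students):
            x = rng.gauss(0, 1)           # satisfaction (centred)
            students.append(2.8 + u + beta_satisfaction * x
                            + rng.gauss(0, sd_within))
        groups.append(students)
    return groups

def icc_anova(groups):
    """One-way ANOVA estimator of the intraclass correlation:
    the share of variance sitting at the teacher level."""
    k, n = len(groups), len(groups[0])
    N = k * n
    grand = sum(sum(g) for g in groups) / N
    ssb = sum(n * (sum(g) / n - grand) ** 2 for g in groups)
    ssw = sum((y - sum(g) / n) ** 2 for g in groups for y in g)
    msb, msw = ssb / (k - 1), ssw / (N - k)
    var_between = max(0.0, (msb - msw) / n)
    return var_between / (var_between + msw)

icc = icc_anova(simulate_gpa())
```

With these parameters roughly a fifth of the GPA variance sits between teachers, which is the variation a random intercept captures and a single-level regression ignores.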
Random walk of passive tracers among randomly moving obstacles
Gori, Matteo; Donato, Irene; Floriani, Elena; Nardecchia, Ilaria; Pettini, Marco
2016-01-01
Background: This study is mainly motivated by the need of understanding how the diffusion behaviour of a biomolecule (or even of a larger object) is affected by other moving macromolecules, organelles, and so on, inside a living cell, whence the possibility of understanding whether or not a randomly walking biomolecule is also subject to a long-range force field driving it to its target. Method: By means of the Continuous Time Random Walk (CTRW) technique the topic of random walk in random en...
Selected papers on noise and stochastic processes
1954-01-01
Six classic papers on stochastic process, selected to meet the needs of physicists, applied mathematicians, and engineers. Contents: 1.Chandrasekhar, S.: Stochastic Problems in Physics and Astronomy. 2. Uhlenbeck, G. E. and Ornstein, L. S.: On the Theory of the Browninan Motion. 3. Ming Chen Wang and Uhlenbeck, G. E.: On the Theory of the Browninan Motion II. 4. Rice, S. O.: Mathematical Analysis of Random Noise. 5. Kac, Mark: Random Walk and the Theory of Brownian Motion. 6. Doob, J. L.: The Brownian Movement and Stochastic Equations. Unabridged republication of the Dover reprint (1954). Pre
Random lasing in human tissues
International Nuclear Information System (INIS)
Polson, Randal C.; Vardeny, Z. Valy
2004-01-01
A random collection of scatterers in a gain medium can produce coherent laser emission lines dubbed 'random lasing'. We show that biological tissues, including human tissues, can support coherent random lasing when infiltrated with a concentrated laser dye solution. To extract a typical random resonator size within the tissue we average the power Fourier transform of random laser spectra collected from many excitation locations in the tissue; we verified this procedure by a computer simulation. Surprisingly, we found that malignant tissues show many more laser lines compared to healthy tissues taken from the same organ. Consequently, the obtained typical random resonator was found to be different for healthy and cancerous tissues, and this may lead to a technique for separating malignant from healthy tissues for diagnostic imaging
Selection gradients, the opportunity for selection, and the coefficient of determination.
Moorad, Jacob A; Wade, Michael J
2013-03-01
We derive the relationship between R² (the coefficient of determination), selection gradients, and the opportunity for selection for univariate and multivariate cases. Our main result is to show that the portion of the opportunity for selection that is caused by variation for any trait is equal to the product of its selection gradient and its selection differential. This relationship is a corollary of the first and second fundamental theorems of natural selection, and it permits one to investigate the portions of the total opportunity for selection that are involved in directional selection, stabilizing (and diversifying) selection, and correlational selection, which is important to morphological integration. It also allows one to determine the fraction of fitness variation not explained by variation in measured phenotypes and therefore attributable to random (or, at least, unknown) influences. We apply our methods to a human data set to show how sex-specific mating success as a component of fitness variance can be decoupled from that owing to prereproductive mortality. By quantifying linear sources of sexual selection and quadratic sources of sexual selection, we illustrate that the former is stronger in males, while the latter is stronger in females.
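The univariate version of the main result can be checked numerically: with relative fitness w/wbar, the opportunity for selection is I = Var(w/wbar), the selection differential is S = Cov(w/wbar, z), the gradient is β = S/Var(z), and the trait's share of I is β·S, which is the R² of the fitness-on-trait regression. The simulated trait and fitness function below are illustrative.

```python
import math
import random

def mean(xs):
    return sum(xs) / len(xs)

def cov(xs, ys):
    mx, my = mean(xs), mean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

rng = random.Random(5)
z = [rng.gauss(0, 1) for _ in range(5000)]               # trait values
# fitness: directional selection on z plus unrelated noise
w = [math.exp(0.4 * zi) + rng.random() for zi in z]
wbar = mean(w)
w_rel = [wi / wbar for wi in w]                          # relative fitness

I = cov(w_rel, w_rel)        # opportunity for selection, Var(w/wbar)
S = cov(w_rel, z)            # selection differential
beta = S / cov(z, z)         # univariate selection gradient
R2 = (beta * S) / I          # trait's share of the opportunity
```

The product β·S never exceeds I; the remainder I - β·S is the fitness variation attributable to unmeasured (or random) influences, exactly the decomposition the abstract describes.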
Groupies in multitype random graphs
Shang, Yilun
2016-01-01
A groupie in a graph is a vertex whose degree is not less than the average degree of its neighbors. Under some mild conditions, we show that the proportion of groupies is very close to 1/2 in multitype random graphs (such as stochastic block models), which include Erdős-Rényi random graphs, random bipartite, and multipartite graphs as special examples. Numerical examples are provided to illustrate the theoretical results.
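The ~1/2 proportion stated in this record is easy to check numerically. Below is a minimal sketch for the Erdős-Rényi special case; all function names are illustrative, and the treatment of isolated vertices (counted as groupies vacuously) is an assumption, not taken from the paper.

```python
import random

def erdos_renyi(n, p, seed=0):
    """Adjacency lists of an Erdős-Rényi G(n, p) random graph."""
    rng = random.Random(seed)
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].append(j)
                adj[j].append(i)
    return adj

def groupie_fraction(adj):
    """Fraction of vertices whose degree is >= the mean degree of their neighbours."""
    groupies = 0
    for nbrs in adj:
        if not nbrs:
            groupies += 1  # isolated vertex: vacuously a groupie (our convention)
        elif len(nbrs) >= sum(len(adj[v]) for v in nbrs) / len(nbrs):
            groupies += 1
    return groupies / len(adj)

frac = groupie_fraction(erdos_renyi(1000, 0.02, seed=1))
print(round(frac, 2))  # close to 1/2, as the record predicts
```

Running this for larger n or other edge densities keeps the fraction near 0.5, consistent with the stated result.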
Natural Selection as an Emergent Process: Instructional Implications
Cooper, Robert A.
2017-01-01
Student reasoning about cases of natural selection is often plagued by errors that stem from miscategorising selection as a direct, causal process, from misunderstanding the role of randomness, and from the intuitive ideas of intentionality, teleology and essentialism. The common thread throughout many of these reasoning errors is a failure to apply…
The Effect of Speed Alterations on Tempo Note Selection.
Madsen, Clifford K.; And Others
1986-01-01
Investigated the tempo note preferences of 100 randomly selected college-level musicians using familiar orchestral music as stimuli. Subjects heard selections at increased, decreased, and unaltered tempi. Results showed musicians were not accurate in estimating original tempo and showed consistent preference for faster than actual tempo.…
40 CFR 205.57-2 - Test vehicle sample selection.
2010-07-01
... pursuant to a test request in accordance with this subpart will be selected in the manner specified in the... then using a table of random numbers to select the number of vehicles as specified in paragraph (c) of... with the designated AQL are contained in Appendix I, Table II. (c) The appropriate batch sample size...
Rural Women's Preference For Selected Programmes Of The ...
African Journals Online (AJOL)
The study focused on the rural women's preference for selected programmes of the National Special Programme for Food Security (NSPFS) in Imo State, Nigeria. Data was collected with the aid of structured interview from 150 randomly selected women in the study area. Results from the study showed that respondents ...
Adoption of selected innovations in rice production and their effect ...
African Journals Online (AJOL)
Adoption of selected innovations in rice production and their effect on farmers living standard in Bauchi local government area, Bauchi state, Nigeria. ... International Journal of Natural and Applied Sciences ... Simple random sampling technique was used for the selection of 82 rice growers from these villages. The data ...
Dissecting the circle, at random*
Directory of Open Access Journals (Sweden)
Curien Nicolas
2014-01-01
Full Text Available Random laminations of the disk are the continuous limits of random non-crossing configurations of regular polygons. We provide an expository account on this subject. Initiated by the work of Aldous on the Brownian triangulation, this field now possesses many characters such as the random recursive triangulation, the stable laminations and the Markovian hyperbolic triangulation of the disk. We will review the properties and constructions of these objects as well as the close relationships they enjoy with the theory of continuous random trees. Some open questions are scattered along the text.
Random Decrement Based FRF Estimation
DEFF Research Database (Denmark)
Brincker, Rune; Asmussen, J. C.
1997-01-01
to speed and quality. The basis of the new method is the Fourier transformation of the Random Decrement functions which can be used to estimate the frequency response functions. The investigations are based on load and response measurements of a laboratory model of a 3 span bridge. By applying both methods...... that the Random Decrement technique is based on a simple controlled averaging of time segments of the load and response processes. Furthermore, the Random Decrement technique is expected to produce reliable results. The Random Decrement technique will reduce leakage, since the Fourier transformation...
Management Matters. Selection Policies
Pappas, Marjorie L.
2003-01-01
One of the most important policy documents for a school library media center is the selection policy or the collection development policy. A well-developed selection policy provides a rationale for the selection decisions made by the school library media specialist. A selection policy represents the criteria against which a challenged book is…
Familial versus mass selection in small populations
Directory of Open Access Journals (Sweden)
Couvet Denis
2003-07-01
Full Text Available Abstract We used diffusion approximations and a Markov-chain approach to investigate the consequences of familial selection on the viability of small populations both in the short and in the long term. The outcome of familial selection was compared to the case of a random mating population under mass selection. In small populations, the higher effective size, associated with familial selection, resulted in higher fitness for slightly deleterious and/or highly recessive alleles. Conversely, because familial selection leads to a lower rate of directional selection, a lower fitness was observed for more detrimental genes that are not highly recessive, and with high population sizes. However, in the long term, genetic load was almost identical for both mass and familial selection for populations of up to 200 individuals. In terms of mean time to extinction, familial selection did not have any negative effect at least for small populations (N ≤ 50. Overall, familial selection could be proposed for use in management programs of small populations since it increases genetic variability and short-term viability without impairing the overall persistence times.
77 FR 2606 - Pipeline Safety: Random Drug Testing Rate
2012-01-18
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID PHMSA-2012-0004] Pipeline Safety: Random Drug Testing Rate AGENCY: Pipeline and Hazardous Materials... pipelines and operators of liquefied natural gas facilities must select and test a percentage of covered...
75 FR 9018 - Pipeline Safety: Random Drug Testing Rate
2010-02-26
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID PHMSA-2010-0034] Pipeline Safety: Random Drug Testing Rate AGENCY: Pipeline and Hazardous Materials... pipelines and operators of liquefied natural gas facilities must select and test a percentage of covered...
Atomic structure calculations using the relativistic random phase approximation
International Nuclear Information System (INIS)
Cheng, K.T.; Johnson, W.R.
1981-01-01
A brief review is given for the relativistic random phase approximation (RRPA) applied to atomic transition problems. Selected examples of RRPA calculations on discrete excitations and photoionization are given to illustrate the need of relativistic many-body theories in dealing with atomic processes where both relativity and correlation are important
Randomized algorithms in automatic control and data mining
Granichin, Oleg; Toledano-Kitai, Dvora
2015-01-01
In the fields of data mining and control, the huge amount of unstructured data and the presence of uncertainty in system descriptions have always been critical issues. The book Randomized Algorithms in Automatic Control and Data Mining introduces the readers to the fundamentals of randomized algorithm applications in data mining (especially clustering) and in automatic control synthesis. The methods proposed in this book guarantee that the computational complexity of classical algorithms and the conservativeness of standard robust control techniques will be reduced. It is shown that when a problem requires "brute force" in selecting among options, algorithms based on random selection of alternatives offer good results with certain probability for a restricted time and significantly reduce the volume of operations.
A comparison of random walks in dependent random environments
Scheinhardt, Willem R.W.; Kroese, Dirk
2015-01-01
Although the theoretical behavior of one-dimensional random walks in random environments is well understood, the actual evaluation of various characteristics of such processes has received relatively little attention. This paper develops new methodology for the exact computation of the drift in such
Random matrix ensembles with random interactions: Results for ...
Indian Academy of Sciences (India)
Random matrix ensembles with random interactions: Results for EGUE(2)-(4). Manan Vyas. Pramana – Journal of Physics, Volume 73, Issue 3, September 2009, pp. 521-531.
Pseudo-random number generator based on asymptotic deterministic randomness
Wang, Kai; Pei, Wenjiang; Xia, Haishan; Cheung, Yiu-ming
2008-06-01
A novel approach to generate the pseudorandom-bit sequence from the asymptotic deterministic randomness system is proposed in this Letter. We study the characteristic of multi-value correspondence of the asymptotic deterministic randomness constructed by the piecewise linear map and the noninvertible nonlinearity transform, and then give the discretized systems in the finite digitized state space. The statistic characteristics of the asymptotic deterministic randomness are investigated numerically, such as stationary probability density function and random-like behavior. Furthermore, we analyze the dynamics of the symbolic sequence. Both theoretical and experimental results show that the symbolic sequence of the asymptotic deterministic randomness possesses very good cryptographic properties, which improve the security of chaos based PRBGs and increase the resistance against entropy attacks and symbolic dynamics attacks.
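The general idea behind chaos-based pseudo-random bit generators can be illustrated with a much simpler construction than the paper's: threshold the orbit of a piecewise linear map. This is a generic sketch only, not the asymptotic-deterministic-randomness scheme described above; the tent map, the slope value, and the threshold are all illustrative assumptions.

```python
def tent_map_bits(x0, n, mu=1.99999):
    """Generate n pseudo-random bits by thresholding a tent-map orbit.

    Generic chaos-based PRBG sketch (NOT the paper's construction):
    iterate x -> mu * min(x, 1 - x) on [0, 1] and emit 1 when x >= 1/2.
    """
    x, bits = x0, []
    for _ in range(n):
        x = mu * min(x, 1.0 - x)  # piecewise linear (tent) map
        bits.append(1 if x >= 0.5 else 0)
    return bits

bits = tent_map_bits(0.123456789, 10000)
print(sum(bits) / len(bits))  # roughly balanced, near 0.5
```

A real design, as the record emphasizes, needs digitized finite-state analysis and defenses against symbolic-dynamics attacks; this sketch shows only the map-to-bits step.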
Pseudo-random number generator based on asymptotic deterministic randomness
International Nuclear Information System (INIS)
Wang Kai; Pei Wenjiang; Xia Haishan; Cheung Yiuming
2008-01-01
A novel approach to generate the pseudorandom-bit sequence from the asymptotic deterministic randomness system is proposed in this Letter. We study the characteristic of multi-value correspondence of the asymptotic deterministic randomness constructed by the piecewise linear map and the noninvertible nonlinearity transform, and then give the discretized systems in the finite digitized state space. The statistic characteristics of the asymptotic deterministic randomness are investigated numerically, such as stationary probability density function and random-like behavior. Furthermore, we analyze the dynamics of the symbolic sequence. Both theoretical and experimental results show that the symbolic sequence of the asymptotic deterministic randomness possesses very good cryptographic properties, which improve the security of chaos based PRBGs and increase the resistance against entropy attacks and symbolic dynamics attacks
Random walk of passive tracers among randomly moving obstacles.
Gori, Matteo; Donato, Irene; Floriani, Elena; Nardecchia, Ilaria; Pettini, Marco
2016-04-14
This study is mainly motivated by the need of understanding how the diffusion behavior of a biomolecule (or even of a larger object) is affected by other moving macromolecules, organelles, and so on, inside a living cell, whence the possibility of understanding whether or not a randomly walking biomolecule is also subject to a long-range force field driving it to its target. By means of the Continuous Time Random Walk (CTRW) technique the topic of random walk in random environment is here considered in the case of a passively diffusing particle among randomly moving and interacting obstacles. The relevant physical quantity which is worked out is the diffusion coefficient of the passive tracer which is computed as a function of the average inter-obstacles distance. The results reported here suggest that if a biomolecule, let us call it a test molecule, moves towards its target in the presence of other independently interacting molecules, its motion can be considerably slowed down.
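The basic quantity the record works out, a diffusion coefficient extracted from tracer motion, can be estimated for the simplest possible case: a free 2D lattice walk with no obstacles, using the relation MSD = 4·D·t. This is a baseline sketch under stated assumptions (unit steps, unit time, no obstacles), not the CTRW-with-obstacles calculation of the paper.

```python
import random

def msd_diffusion_2d(n_walkers, n_steps, rng=None):
    """Estimate D for a free 2D lattice random walk from the mean squared
    displacement, using MSD = 4 * D * t (unit step length, unit time step)."""
    rng = rng or random.Random(0)
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    msd = 0.0
    for _ in range(n_walkers):
        x = y = 0
        for _ in range(n_steps):
            dx, dy = rng.choice(moves)
            x += dx
            y += dy
        msd += x * x + y * y
    msd /= n_walkers
    return msd / (4.0 * n_steps)

D = msd_diffusion_2d(2000, 200, random.Random(1))
print(round(D, 2))  # close to 0.25 for a free 2D lattice walk
```

Introducing moving obstacles, as in the paper, would reduce the measured D as a function of the inter-obstacle distance; the estimator itself stays the same.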
Random distributed feedback fibre lasers
Energy Technology Data Exchange (ETDEWEB)
Turitsyn, Sergei K., E-mail: s.k.turitsyn@aston.ac.uk [Aston Institute of Photonic Technologies, Aston University, Birmingham B4 7ET (United Kingdom); Novosibirsk State University, 2 Pirogova str., 630090, Novosibirsk (Russian Federation); Babin, Sergey A. [Novosibirsk State University, 2 Pirogova str., 630090, Novosibirsk (Russian Federation); Institute of Automation and Electrometry SB RAS, 1 Ac. Koptug. ave., 630090, Novosibirsk (Russian Federation); Churkin, Dmitry V. [Aston Institute of Photonic Technologies, Aston University, Birmingham B4 7ET (United Kingdom); Novosibirsk State University, 2 Pirogova str., 630090, Novosibirsk (Russian Federation); Institute of Automation and Electrometry SB RAS, 1 Ac. Koptug. ave., 630090, Novosibirsk (Russian Federation); Vatnik, Ilya D.; Nikulin, Maxim [Institute of Automation and Electrometry SB RAS, 1 Ac. Koptug. ave., 630090, Novosibirsk (Russian Federation); Podivilov, Evgenii V. [Novosibirsk State University, 2 Pirogova str., 630090, Novosibirsk (Russian Federation); Institute of Automation and Electrometry SB RAS, 1 Ac. Koptug. ave., 630090, Novosibirsk (Russian Federation)
2014-09-10
The concept of random lasers exploiting multiple scattering of photons in an amplifying disordered medium in order to generate coherent light without a traditional laser resonator has attracted a great deal of attention in recent years. This research area lies at the interface of the fundamental theory of disordered systems and laser science. The idea was originally proposed in the context of astrophysics in the 1960s by V.S. Letokhov, who studied scattering with “negative absorption” of the interstellar molecular clouds. Research on random lasers has since developed into a mature experimental and theoretical field. A simple design of such lasers would be promising for potential applications. However, in traditional random lasers the properties of the output radiation are typically characterized by complex features in the spatial, spectral and time domains, making them less attractive than standard laser systems in terms of practical applications. Recently, an interesting and novel type of one-dimensional random laser that operates in a conventional telecommunication fibre without any pre-designed resonator mirrors–random distributed feedback fibre laser–was demonstrated. The positive feedback required for laser generation in random fibre lasers is provided by the Rayleigh scattering from the inhomogeneities of the refractive index that are naturally present in silica glass. In the proposed laser concept, the randomly backscattered light is amplified through the Raman effect, providing distributed gain over distances up to 100 km. Although an effective reflection due to the Rayleigh scattering is extremely small (∼0.1%), the lasing threshold may be exceeded when a sufficiently large distributed Raman gain is provided. Such a random distributed feedback fibre laser has a number of interesting and attractive features. The fibre waveguide geometry provides transverse confinement, and effectively one-dimensional random distributed feedback leads to the
Random distributed feedback fibre lasers
International Nuclear Information System (INIS)
Turitsyn, Sergei K.; Babin, Sergey A.; Churkin, Dmitry V.; Vatnik, Ilya D.; Nikulin, Maxim; Podivilov, Evgenii V.
2014-01-01
The concept of random lasers exploiting multiple scattering of photons in an amplifying disordered medium in order to generate coherent light without a traditional laser resonator has attracted a great deal of attention in recent years. This research area lies at the interface of the fundamental theory of disordered systems and laser science. The idea was originally proposed in the context of astrophysics in the 1960s by V.S. Letokhov, who studied scattering with “negative absorption” of the interstellar molecular clouds. Research on random lasers has since developed into a mature experimental and theoretical field. A simple design of such lasers would be promising for potential applications. However, in traditional random lasers the properties of the output radiation are typically characterized by complex features in the spatial, spectral and time domains, making them less attractive than standard laser systems in terms of practical applications. Recently, an interesting and novel type of one-dimensional random laser that operates in a conventional telecommunication fibre without any pre-designed resonator mirrors–random distributed feedback fibre laser–was demonstrated. The positive feedback required for laser generation in random fibre lasers is provided by the Rayleigh scattering from the inhomogeneities of the refractive index that are naturally present in silica glass. In the proposed laser concept, the randomly backscattered light is amplified through the Raman effect, providing distributed gain over distances up to 100 km. Although an effective reflection due to the Rayleigh scattering is extremely small (∼0.1%), the lasing threshold may be exceeded when a sufficiently large distributed Raman gain is provided. Such a random distributed feedback fibre laser has a number of interesting and attractive features. The fibre waveguide geometry provides transverse confinement, and effectively one-dimensional random distributed feedback leads to the
Exploring pseudo- and chaotic random Monte Carlo simulations
Blais, J. A. Rod; Zhang, Zhan
2011-07-01
Computer simulations are an increasingly important area of geoscience research and development. At the core of stochastic or Monte Carlo simulations are the random number sequences that are assumed to be distributed with specific characteristics. Computer-generated random numbers, uniformly distributed on (0, 1), can be very different depending on the selection of pseudo-random number (PRN) or chaotic random number (CRN) generators. In the evaluation of some definite integrals, the resulting error variances can even be of different orders of magnitude. Furthermore, practical techniques for variance reduction such as importance sampling and stratified sampling can be applied in most Monte Carlo simulations and significantly improve the results. A comparative analysis of these strategies has been carried out for computational applications in planar and spatial contexts. Based on these experiments, and on some practical examples of geodetic direct and inverse problems, conclusions and recommendations concerning their performance and general applicability are included.
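The variance-reduction claim above is easy to demonstrate on a one-dimensional definite integral. The sketch below compares plain Monte Carlo against stratified sampling for ∫₀¹ eˣ dx; the function and sample sizes are illustrative choices, not taken from the paper.

```python
import math
import random

def plain_mc(f, n, rng):
    """Plain Monte Carlo estimate of the integral of f over (0, 1)."""
    return sum(f(rng.random()) for _ in range(n)) / n

def stratified_mc(f, n, rng):
    """Stratified sampling: one uniform sample in each of n equal-width strata."""
    return sum(f((i + rng.random()) / n) for i in range(n)) / n

rng = random.Random(42)
exact = math.e - 1.0                     # integral of e^x over (0, 1)
plain = plain_mc(math.exp, 1000, rng)
strat = stratified_mc(math.exp, 1000, rng)
print(abs(plain - exact), abs(strat - exact))  # stratified error is typically far smaller
```

For a smooth integrand the stratified error shrinks much faster than the O(n^-1/2) plain-MC error, which is the "significantly improve the results" effect the record describes.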
Record statistics of financial time series and geometric random walks.
Sabir, Behlool; Santhanam, M S
2014-09-01
The study of record statistics of correlated series in physics, such as random walks, is gaining momentum, and several analytical results have been obtained in the past few years. In this work, we study the record statistics of correlated empirical data for which random walk models have relevance. We obtain results for the record statistics of select stock market data and the geometric random walk, primarily through simulations. We show that the distribution of the age of records is a power law with the exponent α lying in the range 1.5≤α≤1.8. Further, the longest record ages follow the Fréchet distribution of extreme value theory. The record statistics of the geometric random walk series are in good agreement with those obtained from empirical stock data.
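The two ingredients of the simulation study described above, a geometric random walk and the ages of its upper records, can be sketched in a few lines. Parameter values (step volatility, series length) are illustrative assumptions, not the paper's.

```python
import math
import random

def geometric_walk(n, sigma=0.01, s0=100.0, rng=None):
    """Geometric random walk: s_{t+1} = s_t * exp(N(0, sigma^2))."""
    rng = rng or random.Random(0)
    path, s = [s0], s0
    for _ in range(n - 1):
        s *= math.exp(rng.gauss(0.0, sigma))
        path.append(s)
    return path

def record_ages(series):
    """Waiting times between successive upper records of a series."""
    ages, best, last = [], series[0], 0
    for t in range(1, len(series)):
        if series[t] > best:
            ages.append(t - last)
            best, last = series[t], t
    return ages

ages = record_ages(geometric_walk(5000, rng=random.Random(7)))
print(len(ages), max(ages))
```

Collecting `ages` over many realizations and fitting the tail of their distribution is how one would probe the power-law exponent α the record reports.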
Review of Random Phase Encoding in Volume Holographic Storage
Directory of Open Access Journals (Sweden)
Wei-Chia Su
2012-09-01
Full Text Available Random phase encoding is a unique technique for volume holograms which can be applied to various applications such as holographic multiplexing storage, image encryption, and optical sensing. In this review article, we first review and discuss the diffraction selectivity of random phase encoding in volume holograms, which is the most important parameter related to the multiplexing capacity of volume holographic storage. We then review an image encryption system based on random phase encoding. The alignment of the phase key for decryption of the encoded image stored in holographic memory is analyzed and discussed. In the latter part of the review, an all-optical sensing system implemented by random phase encoding and holographic interconnection is presented.
Random and non-random mating populations: Evolutionary dynamics in meiotic drive.
Sarkar, Bijan
2016-01-01
Game theoretic tools are utilized to analyze a one-locus continuous selection model of sex-specific meiotic drive by considering nonequivalence of the viabilities of reciprocal heterozygotes that might be noticed at an imprinted locus. The model draws attention to the role of viability selections of different types to examine the stable nature of polymorphic equilibrium. A bridge between population genetics and evolutionary game theory has been built up by applying the concept of the Fundamental Theorem of Natural Selection. In addition to pointing out the influences of male and female segregation ratios on selection, configuration structure reveals some noted results, e.g., Hardy-Weinberg frequencies hold in replicator dynamics, occurrence of faster evolution at the maximized variance fitness, existence of mixed Evolutionarily Stable Strategy (ESS) in asymmetric games, and the tendency of evolution to approach not only a 1:1 sex ratio but also a 1:1 allele ratio at a particular gene locus. Through construction of replicator dynamics in the group selection framework, our selection model introduces a redefined basis of game theory to incorporate non-random mating, where a mating parameter associated with population structure is dependent on the social structure. Also, the model exposes the fact that the number of polymorphic equilibria will depend on the algebraic expression of population structure. Copyright © 2015 Elsevier Inc. All rights reserved.
The random continued fraction transformation
Kalle, Charlene; Kempton, Tom; Verbitskiy, Evgeny
2017-03-01
We introduce a random dynamical system related to continued fraction expansions. It uses random combinations of the Gauss map and the Rényi (or backwards) continued fraction map. We explore the continued fraction expansions that this system produces, as well as the dynamical properties of the system.
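The random dynamical system described in this record can be sketched directly: at each step apply either the Gauss map or the Rényi (backwards) map, chosen at random. The mixing probability and orbit length below are illustrative assumptions.

```python
import math
import random

def gauss_map(x):
    """Gauss continued-fraction map: x -> 1/x mod 1, for x in (0, 1)."""
    return 1.0 / x % 1.0

def renyi_map(x):
    """Rényi (backwards) continued-fraction map: x -> 1/(1-x) mod 1."""
    return 1.0 / (1.0 - x) % 1.0

def random_cf_orbit(x0, n, p=0.5, rng=None):
    """Apply the Gauss map with probability p, else the Rényi map, for n steps."""
    rng = rng or random.Random(0)
    x, orbit = x0, [x0]
    for _ in range(n):
        x = gauss_map(x) if rng.random() < p else renyi_map(x)
        if x == 0.0:        # orbit hit an endpoint (only possible for rationals)
            break
        orbit.append(x)
    return orbit

orbit = random_cf_orbit(math.sqrt(2) - 1, 50, rng=random.Random(3))
print(len(orbit), all(0.0 < x < 1.0 for x in orbit))
```

Reading off which map was applied at each step yields the random continued fraction expansion of the starting point, the object whose dynamics the paper studies.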
Bell inequalities for random fields
Energy Technology Data Exchange (ETDEWEB)
Morgan, Peter [Physics Department, Yale University, CT 06520 (United States)
2006-06-09
The assumptions required for the derivation of Bell inequalities are not satisfied for random field models in which there are any thermal or quantum fluctuations, in contrast to the general satisfaction of the assumptions for classical two point particle models. Classical random field models that explicitly include the effects of quantum fluctuations on measurement are possible for experiments that violate Bell inequalities.
Bell inequalities for random fields
Morgan, Peter
2004-01-01
The assumptions required for the derivation of Bell inequalities are not usually satisfied for random fields in which there are any thermal or quantum fluctuations, in contrast to the general satisfaction of the assumptions for classical two point particle models. Classical random field models that explicitly include the effects of quantum fluctuations on measurement are possible for experiments that violate Bell inequalities.
Object grammars and random generation
Directory of Open Access Journals (Sweden)
I. Dutour
1998-12-01
Full Text Available This paper presents a new systematic approach for the uniform random generation of combinatorial objects. The method is based on the notion of object grammars which give recursive descriptions of objects and generalize context-free grammars. The application of particular valuations to these grammars leads to enumeration and random generation of objects according to non-algebraic parameters.
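The recursive-description idea above is the same one behind the classical recursive method for uniform generation: count the objects derivable from each grammar rule, then draw a rule with probability proportional to its count. A minimal sketch for binary trees (a one-rule object grammar, counted by the Catalan numbers) follows; it illustrates the principle only, not the paper's valuation machinery.

```python
import random
from functools import lru_cache

@lru_cache(maxsize=None)
def count(n):
    """Number of binary trees with n internal nodes (the Catalan numbers)."""
    if n == 0:
        return 1
    return sum(count(k) * count(n - 1 - k) for k in range(n))

def random_tree(n, rng):
    """Draw a binary tree with n internal nodes uniformly at random."""
    if n == 0:
        return None
    r = rng.randrange(count(n))
    for k in range(n):                       # choose the left-subtree size k
        w = count(k) * count(n - 1 - k)      # trees with that split
        if r < w:
            return (random_tree(k, rng), random_tree(n - 1 - k, rng))
        r -= w

def size(t):
    return 0 if t is None else 1 + size(t[0]) + size(t[1])

rng = random.Random(11)
t = random_tree(8, rng)
print(count(8), size(t))  # 1430 trees with 8 internal nodes; sample has exactly 8
```

Because each split is drawn with probability proportional to the number of trees it yields, every tree of the target size is produced with equal probability.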
International Nuclear Information System (INIS)
Itzykson, C.
1983-10-01
We review the formulation of field theory and statistical mechanics on a Poissonian random lattice. Topics discussed include random geometry, the construction of field equations for arbitrary spin, the free field spectrum and the question of localization illustrated in the one dimensional case
Multistage Selection and the Financing of New Ventures
Jonathan T. Eckhardt; Scott Shane; Frédéric Delmar
2006-01-01
Using a random sample of 221 new Swedish ventures initiated in 1998, we examine why some new ventures are more likely than others to successfully be awarded capital from external sources. We examine venture financing as a staged selection process in which two sequential selection events systematically winnow the population of ventures and influence which ventures receive financing. For a venture to receive external financing its founders must first select it as a candidate for external fundin...
Directory of Open Access Journals (Sweden)
MS Yıldırım
2016-02-01
Full Text Available The aim of this study was to compare the effects of static stretching, proprioceptive neuromuscular facilitation (PNF) stretching and the Mulligan technique on hip flexion range of motion (ROM) in subjects with bilateral hamstring tightness. A total of 40 students (mean age: 21.5±1.3 years, mean body height: 172.8±8.2 cm, mean body mass index: 21.9±3.0 kg·m⁻²) with bilateral hamstring tightness were enrolled in this randomized trial, of whom 26 completed the study. Subjects were divided into 4 groups performing (I) typical static stretching, (II) PNF stretching, (III) the Mulligan traction straight leg raise (TSLR) technique, or (IV) no intervention. Hip flexion ROM was measured using a digital goniometer with the passive straight leg raise test before and after 4 weeks by two physiotherapists blinded to the groups. 52 extremities of 26 subjects were analyzed. Hip flexion ROM increased in all three intervention groups (p<0.05) but not in the no-intervention group after 4 weeks. A statistically significant difference in initial–final assessment changes of hip flexion ROM was found between groups (p<0.001) in favour of PNF stretching and the Mulligan TSLR technique in comparison to typical static stretching (p=0.016 and p=0.02, respectively). No significant difference was found between the Mulligan TSLR technique and PNF stretching (p=0.920). The initial–final assessment difference of hip flexion ROM was similar for typical static stretching and no intervention (p=0.491). A 4-week stretching intervention is beneficial for increasing hip flexion ROM in bilateral hamstring tightness. However, PNF stretching and the Mulligan TSLR technique are superior to typical static stretching. These two interventions can be used as alternatives for stretching in hamstring tightness.
Levy flights and random searches
Energy Technology Data Exchange (ETDEWEB)
Raposo, E P [Laboratorio de Fisica Teorica e Computacional, Departamento de Fisica, Universidade Federal de Pernambuco, Recife-PE, 50670-901 (Brazil); Buldyrev, S V [Department of Physics, Yeshiva University, New York, 10033 (United States); Da Luz, M G E [Departamento de Fisica, Universidade Federal do Parana, Curitiba-PR, 81531-990 (Brazil); Viswanathan, G M [Instituto de Fisica, Universidade Federal de Alagoas, Maceio-AL, 57072-970 (Brazil); Stanley, H E [Center for Polymer Studies and Department of Physics, Boston University, Boston, MA 02215 (United States)
2009-10-30
In this work we discuss some recent contributions to the random search problem. Our analysis includes superdiffusive Levy processes and correlated random walks in several regimes of target site density, mobility and revisitability. We present results in the context of mean-field-like and closed-form average calculations, as well as numerical simulations. We then consider random searches performed in regular lattices and lattices with defects, and we discuss a necessary criterion for distinguishing true superdiffusion from correlated random walk processes. We invoke energy considerations in relation to critical survival states on the edge of extinction, and we analyze the emergence of Levy behavior in deterministic search walks. Finally, we comment on the random search problem in the context of biological foraging.
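The superdiffusive Lévy processes discussed above hinge on power-law distributed step lengths, which can be sampled by inverse-transform sampling from a Pareto-type density. The exponent, cutoff, and sample size below are illustrative assumptions, not values from the paper.

```python
import random

def levy_steps(n, mu=2.0, lmin=1.0, rng=None):
    """Step lengths with density P(l) ~ l^(-mu) for l >= lmin, mu > 1,
    drawn by inverting the Pareto CDF: l = lmin * (1 - u)^(-1/(mu-1))."""
    rng = rng or random.Random(0)
    # 1 - rng.random() lies in (0, 1], so the power is always well defined
    return [lmin * (1.0 - rng.random()) ** (-1.0 / (mu - 1.0)) for _ in range(n)]

steps = levy_steps(100000, mu=2.0, rng=random.Random(5))
print(min(steps) >= 1.0, max(steps) > 100.0)  # heavy tail: rare huge steps occur
```

The occasional very long steps in `steps` are what distinguish a Lévy search from a correlated random walk, whose step lengths have a finite variance.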
Computer generation of random deviates
International Nuclear Information System (INIS)
Cormack, John
1991-01-01
The need for random deviates arises in many scientific applications. In medical physics, Monte Carlo simulations have been used in radiology, radiation therapy and nuclear medicine. Specific instances include the modelling of x-ray scattering processes and the addition of random noise to images or curves in order to assess the effects of various processing procedures. Reliable sources of random deviates with statistical properties indistinguishable from true random deviates are a fundamental necessity for such tasks. This paper provides a review of computer algorithms which can be used to generate uniform random deviates and other distributions of interest to medical physicists, along with a few caveats relating to various problems and pitfalls which can occur. Source code listings for the generators discussed (in FORTRAN, Turbo-PASCAL and Data General ASSEMBLER) are available on request from the authors. 27 refs., 3 tabs., 5 figs
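A classic example of the generators this review covers is the Box-Muller transform, which turns pairs of uniform deviates into pairs of independent standard normal deviates. The sketch below is a textbook version, not code from the paper.

```python
import math
import random

def box_muller(n, rng=None):
    """Standard normal deviates from uniform ones via the Box-Muller transform."""
    rng = rng or random.Random(0)
    out = []
    for _ in range((n + 1) // 2):
        u1 = 1.0 - rng.random()   # in (0, 1]: avoids log(0)
        u2 = rng.random()
        r = math.sqrt(-2.0 * math.log(u1))
        out.append(r * math.cos(2.0 * math.pi * u2))
        out.append(r * math.sin(2.0 * math.pi * u2))
    return out[:n]

z = box_muller(100000, random.Random(9))
mean = sum(z) / len(z)
var = sum(x * x for x in z) / len(z) - mean * mean
print(round(mean, 2), round(var, 2))  # near 0 and 1, as expected for N(0, 1)
```

The caveats the review raises still apply: the quality of the output is only as good as the underlying uniform generator.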
Extrapolating Weak Selection in Evolutionary Games
Wu, Bin; García, Julián; Hauert, Christoph; Traulsen, Arne
2013-01-01
In evolutionary games, reproductive success is determined by payoffs. Weak selection means that even large differences in game outcomes translate into small fitness differences. Many results have been derived using weak selection approximations, in which perturbation analysis facilitates the derivation of analytical results. Here, we ask whether results derived under weak selection are also qualitatively valid for intermediate and strong selection. By “qualitatively valid” we mean that the ranking of strategies induced by an evolutionary process does not change when the intensity of selection increases. For two-strategy games, we show that the ranking obtained under weak selection cannot be carried over to higher selection intensity if the number of players exceeds two. For games with three (or more) strategies, previous examples for multiplayer games have shown that the ranking of strategies can change with the intensity of selection. In particular, rank changes imply that the most abundant strategy at one intensity of selection can become the least abundant for another. We show that this applies already to pairwise interactions for a broad class of evolutionary processes. Even when both weak and strong selection limits lead to consistent predictions, rank changes can occur for intermediate intensities of selection. To analyze how common such games are, we show numerically that for randomly drawn two-player games with three or more strategies, rank changes frequently occur and their likelihood increases rapidly with the number of strategies. In particular, rank changes become almost certain for larger numbers of strategies, which jeopardizes the predictive power of results derived for weak selection. PMID:24339769
Feature Selection for Chemical Sensor Arrays Using Mutual Information
Wang, X. Rosalind; Lizier, Joseph T.; Nowotny, Thomas; Berna, Amalia Z.; Prokopenko, Mikhail; Trowell, Stephen C.
2014-01-01
We address the problem of feature selection for classifying a diverse set of chemicals using an array of metal oxide sensors. Our aim is to evaluate a filter approach to feature selection with reference to previous work, which used a wrapper approach on the same data set, and established best features and upper bounds on classification performance. We selected feature sets that exhibit the maximal mutual information with the identity of the chemicals. The selected features closely match those found to perform well in the previous study using a wrapper approach to conduct an exhaustive search of all permitted feature combinations. By comparing the classification performance of support vector machines (using features selected by mutual information) with the performance observed in the previous study, we found that while our approach does not always give the maximum possible classification performance, it always selects features that achieve classification performance approaching the optimum obtained by exhaustive search. We performed further classification using the selected feature set with some common classifiers and found that, for the selected features, Bayesian Networks gave the best performance. Finally, we compared the observed classification performances with the performance of classifiers using randomly selected features. We found that the selected features consistently outperformed randomly selected features for all tested classifiers. The mutual information filter approach is therefore a computationally efficient method for selecting near optimal features for chemical sensor arrays. PMID:24595058
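The filter idea can be sketched in a few lines: estimate the mutual information of each (discretized) feature with the class labels and keep the top-scoring features. The code below is a minimal plug-in estimator on invented synthetic data, not the authors' sensor-array pipeline:

```python
import math
import random
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits from paired discrete samples."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def select_features(features, labels, k):
    """Rank feature columns by MI with the labels; return indices of the top k."""
    scores = [mutual_information(col, labels) for col in features]
    return sorted(range(len(features)), key=lambda i: scores[i], reverse=True)[:k]

rng = random.Random(0)
labels = [rng.randrange(2) for _ in range(500)]
informative = labels[:]                          # copies the label: maximal MI
noise = [rng.randrange(2) for _ in range(500)]   # independent of the label
top = select_features([noise, informative], labels, k=1)   # picks the informative column
```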
Officer Selection (la Selection des officiers)
National Research Council Canada - National Science Library
2000-01-01
.... The theme of this workshop, officer selection, is an issue of central importance to the military forces of all countries, since it determines which individuals, with what characteristics, will...
Pseudo-random number generation using a 3-state cellular automaton
Bhattacharjee, Kamalika; Paul, Dipanjyoti; Das, Sukanta
This paper investigates the potential of a 3-neighborhood 3-state cellular automaton (CA) under periodic boundary conditions for pseudo-random number generation. Theoretical and empirical tests are performed on the numbers generated by the CA to assess its quality as a pseudo-random number generator (PRNG). We analyze the strengths and weaknesses of the proposed PRNG and conclude that the selected CA is a good random number generator.
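The general scheme can be sketched as follows. The particular 3-state update rule below is an arbitrary illustrative choice, not the rule analysed in the paper:

```python
import random

def ca3_prng(width, steps, seed=0):
    """Toy 3-state, 3-neighborhood cellular automaton under periodic
    boundary conditions.  Update rule (illustrative choice only):
        s'_i = (s_{i-1} + 2*s_i + s_{i+1}) mod 3.
    Emits the center cell's state (a base-3 digit) after each step."""
    rng = random.Random(seed)
    cells = [rng.randrange(3) for _ in range(width)]
    out = []
    for _ in range(steps):
        cells = [(cells[i - 1] + 2 * cells[i] + cells[(i + 1) % width]) % 3
                 for i in range(width)]
        out.append(cells[width // 2])
    return out

digits = ca3_prng(width=31, steps=3000)
freq = [digits.count(v) / len(digits) for v in (0, 1, 2)]
```

A real PRNG assessment would subject such a digit stream to the theoretical and empirical test batteries the abstract refers to; this sketch only shows the CA mechanics.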
Role of selective interaction in wealth distribution
International Nuclear Information System (INIS)
Gupta, A.K.
2005-08-01
In our simplified description, 'money' is wealth. A kinetic theory model of money is investigated where two agents interact (trade) selectively and exchange a random amount of money between them while keeping the total money of all the agents constant. The probability distribution of individual money (P(m) vs. m) is seen to be influenced by certain modes of selective interaction. The distributions shift away from the Boltzmann-Gibbs-like exponential distribution, and in some cases distributions emerge with power-law tails known as Pareto's law (P(m) ∝ m^-(1+α)). (author)
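The non-selective baseline that such models build on can be sketched directly: two randomly chosen agents pool their money and split it at a random fraction, conserving the total. Its stationary distribution is the Boltzmann-Gibbs exponential law; the selective-interaction rules studied in the paper would modify the partner choice in this sketch:

```python
import random

def wealth_exchange(n_agents=1000, n_trades=200000, m0=1.0, seed=1):
    """Baseline kinetic exchange (no selectivity): two agents pool their
    money and split it at a uniformly random fraction.  Total money is
    conserved; the stationary law is exponential, P(m) ~ exp(-m/<m>)."""
    rng = random.Random(seed)
    money = [m0] * n_agents
    for _ in range(n_trades):
        i, j = rng.randrange(n_agents), rng.randrange(n_agents)
        if i == j:
            continue
        pool = money[i] + money[j]
        eps = rng.random()
        money[i], money[j] = eps * pool, (1 - eps) * pool
    return money

m = wealth_exchange()
total = sum(m)                                  # conserved: n_agents * m0
poor = sum(1 for x in m if x < 1.0) / len(m)    # exponential law: about 1 - e^-1
```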
Coupled continuous time-random walks in quenched random environment
Magdziarz, M.; Szczotka, W.
2018-02-01
We introduce a coupled continuous-time random walk with coupling which is characteristic for Lévy walks. Additionally we assume that the walker moves in a quenched random environment, i.e. the site disorder at each lattice point is fixed in time. We analyze the scaling limit of such a random walk. We show that for large times the behaviour of the analyzed process is exactly the same as in the case of uncoupled quenched trap model for Lévy flights.
Machine learning techniques to select variable stars
Directory of Open Access Journals (Sweden)
García-Varela Alejandro
2017-01-01
Full Text Available In order to perform a supervised classification of variable stars, we propose and evaluate a set of six features extracted from the magnitude density of the light curves. They are used to train automatic classification systems using state-of-the-art classifiers implemented in the R statistical computing environment. We find that random forests is the most successful method to select variables.
Random ensemble learning for EEG classification.
Hosseini, Mohammad-Parsa; Pompili, Dario; Elisevich, Kost; Soltanian-Zadeh, Hamid
2018-01-01
Real-time detection of seizure activity in epilepsy patients is critical in averting seizure activity and improving patients' quality of life. Accurate evaluation, presurgical assessment, seizure prevention, and emergency alerts all depend on the rapid detection of seizure onset. A new method of feature selection and classification for rapid and precise seizure detection is discussed wherein informative components of electroencephalogram (EEG)-derived data are extracted and an automatic method is presented using infinite independent component analysis (I-ICA) to select independent features. The feature space is divided into subspaces via random selection and multichannel support vector machines (SVMs) are used to classify these subspaces. The result of each classifier is then combined by majority voting to establish the final output. In addition, a random subspace ensemble using a combination of SVM, multilayer perceptron (MLP) neural network and an extended k-nearest neighbors (k-NN), called extended nearest neighbor (ENN), is developed for the EEG and electrocorticography (ECoG) big data problem. To evaluate the solution, a benchmark ECoG of eight patients with temporal and extratemporal epilepsy was implemented in a distributed computing framework as a multitier cloud-computing architecture. Using leave-one-out cross-validation, the accuracy, sensitivity, specificity, and both false positive and false negative ratios of the proposed method were found to be 0.97, 0.98, 0.96, 0.04, and 0.02, respectively. Application of the solution to cases under investigation with ECoG has also been effected to demonstrate its utility. Copyright © 2017 Elsevier B.V. All rights reserved.
Pseudo-random-number generators and the square site percolation threshold.
Lee, Michael J
2008-09-01
Selected pseudo-random-number generators are applied to a Monte Carlo study of the two-dimensional square-lattice site percolation model. A generator suitable for high precision calculations is identified from an application specific test of randomness. After extended computation and analysis, an ostensibly reliable value of p_{c}=0.59274598(4) is obtained for the percolation threshold.
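The underlying measurement can be sketched with a minimal Monte Carlo routine (illustrative only, far from the paper's high-precision methodology): occupy sites of a square lattice at random and test for a top-to-bottom spanning cluster with union-find. Well below and well above p_c the spanning probability is near 0 and near 1, respectively:

```python
import random

def percolates(L, p, rng):
    """Occupy each site of an L x L square lattice with probability p and
    test for a top-to-bottom cluster using union-find with path halving."""
    occupied = [[rng.random() < p for _ in range(L)] for _ in range(L)]
    parent = list(range(L * L + 2))
    TOP, BOTTOM = L * L, L * L + 1          # virtual nodes for the two edges

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for r in range(L):
        for c in range(L):
            if not occupied[r][c]:
                continue
            idx = r * L + c
            if r == 0:
                union(idx, TOP)
            if r == L - 1:
                union(idx, BOTTOM)
            if r > 0 and occupied[r - 1][c]:
                union(idx, (r - 1) * L + c)
            if c > 0 and occupied[r][c - 1]:
                union(idx, r * L + c - 1)
    return find(TOP) == find(BOTTOM)

rng = random.Random(42)
trials = 200
frac_low = sum(percolates(32, 0.40, rng) for _ in range(trials)) / trials
frac_high = sum(percolates(32, 0.80, rng) for _ in range(trials)) / trials
```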
Meyer, Ursina; Schindler, Christian; Bloesch, Tamara; Schmocker, Eliane; Zahner, Lukas; Puder, Jardena J; Kriemler, Susi
2014-01-01
PURPOSE: Negative lifestyle factors are known to be associated with increased cardiovascular risk (CVR) in children, but research on their combined impact on a general population of children is sparse. Therefore, we aimed to quantify the combined impact of easily assessable negative lifestyle factors on the CVR scores of randomly selected children after 4 years. METHODS: Of the 540 randomly selected 6- to 13-year-old children, 502 children participated in a baseline health assessment, and ...
An introduction to random sets
Nguyen, Hung T
2006-01-01
The study of random sets is a large and rapidly growing area with connections to many areas of mathematics and applications in widely varying disciplines, from economics and decision theory to biostatistics and image analysis. The drawback to such diversity is that the research reports are scattered throughout the literature, with the result that in science and engineering, and even in the statistics community, the topic is not well known and much of the enormous potential of random sets remains untapped. An Introduction to Random Sets provides a friendly but solid initiation into the theory of random sets. It builds the foundation for studying random set data, which, viewed as imprecise or incomplete observations, are ubiquitous in today's technological society. The author, widely known for his best-selling A First Course in Fuzzy Logic text as well as his pioneering work in random sets, explores motivations, such as coarse data analysis and uncertainty analysis in intelligent systems, for studying random s...
Quantifiers for randomness of chaotic pseudo-random number generators.
De Micco, L; Larrondo, H A; Plastino, A; Rosso, O A
2009-08-28
We deal with randomness quantifiers and concentrate on their ability to discern the hallmark of chaos in time series used in connection with pseudo-random number generators (PRNGs). Workers in the field are motivated to use chaotic maps for generating PRNGs because of the simplicity of their implementation. Although there exist very efficient general-purpose benchmarks for testing PRNGs, we feel that the analysis provided here sheds additional didactic light on the importance of the main statistical characteristics of a chaotic map, namely (i) its invariant measure and (ii) the mixing constant. This is of help in answering two questions that arise in applications: (i) which is the best PRNG among the available ones? and (ii) if a given PRNG turns out not to be good enough and a randomization procedure must still be applied to it, which is the best applicable randomization procedure? Our answer provides a comparative analysis of several quantifiers advanced in the extant literature.
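The role of the invariant measure, point (i) above, can be made concrete with the fully chaotic logistic map. Its iterates follow the arcsine-shaped invariant density 1/(π√(x(1-x))), which piles up near 0 and 1, so the raw sequence is far from uniform; the known conjugacy u = (2/π) arcsin(√x) is one "randomization procedure" that makes it uniform. A sketch (illustrative, not taken from the paper):

```python
import math

def logistic_orbit(x0, n, burn=100):
    """Iterate the fully chaotic logistic map x -> 4x(1-x)."""
    x = x0
    for _ in range(burn):
        x = 4 * x * (1 - x)
    orbit = []
    for _ in range(n):
        x = 4 * x * (1 - x)
        orbit.append(x)
    return orbit

xs = logistic_orbit(0.123456, 20000)
# Under the arcsine invariant density, only about 1/3 of the iterates fall
# in the middle half (0.25, 0.75); for a uniform sequence it would be 1/2.
us = [(2 / math.pi) * math.asin(math.sqrt(x)) for x in xs]
frac_mid_x = sum(1 for x in xs if 0.25 < x < 0.75) / len(xs)   # about 1/3
frac_mid_u = sum(1 for u in us if 0.25 < u < 0.75) / len(us)   # about 1/2
```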
DEFF Research Database (Denmark)
Khaldari, Majid; Yeganeh, Hassan Mehrabani; Pakdel, Abbas
2011-01-01
An experiment was conducted to investigate the effect of short-term selection for 4-week breast weight (4wk BRW), and to estimate genetic parameters of body weight and carcass traits. A selection (S) line and a control (C) line were randomly selected from a base population. Data were collected over...... was 0.35±0.06. There was a significant difference for BW and carcass weights, but not for carcass percent components, between lines (P...... carcass and leg weights were 0.46, 0.41 and 0.47, and 13.2, 16.2, 4.4%, respectively....... The genetic correlations of BRW with BW, carcass, leg, and back weights were 0.85, 0.88 and 0.72, respectively. Selection for 4wk BRW improved feed conversion ratio (FCR) by about 0.19 units over the selection period. Inbreeding caused an insignificant decline in the mean of some traits. Results from...
Random linear codes in steganography
Directory of Open Access Journals (Sweden)
Kamil Kaczyński
2016-12-01
Full Text Available Syndrome coding using linear codes is a technique that allows improvement in the parameters of steganographic algorithms. The use of random linear codes gives great flexibility in choosing the parameters of the linear code; in parallel, it offers easy generation of the parity-check matrix. In this paper, a modification of the LSB algorithm is presented. A random linear code [8, 2] was used as the base for the algorithm modification. The proposed algorithm was implemented, along with a practical evaluation of its parameters based on test images. Keywords: steganography, random linear codes, RLC, LSB
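Syndrome coding can be sketched as matrix embedding: the message is carried in the syndrome of the cover bits, and only a minimum-weight flip pattern is applied. The toy below uses a systematic parity-check matrix with a random part and brute-force decoding; it is a hedged stand-in for, not a reproduction of, the paper's [8, 2] construction:

```python
import random
from itertools import product

def syndrome(H, bits):
    """H * bits over GF(2)."""
    return tuple(sum(h & b for h, b in zip(row, bits)) % 2 for row in H)

def embed(H, cover, message):
    """Flip a minimum-weight pattern e with H*e = H*cover XOR message, so the
    stego bits carry the message in their syndrome.  Brute force: toy sizes only."""
    target = tuple(a ^ b for a, b in zip(syndrome(H, cover), message))
    best = None
    for e in product((0, 1), repeat=len(cover)):
        if syndrome(H, e) == target and (best is None or sum(e) < sum(best)):
            best = e
    return [c ^ e for c, e in zip(cover, best)]

rng = random.Random(7)
n, k = 8, 2                      # an [8, 2] code hides n-k = 6 message bits
# systematic parity-check H = [I_6 | R], full rank by construction, R random
H = [[1 if j == i else (rng.randrange(2) if j >= n - k else 0) for j in range(n)]
     for i in range(n - k)]
cover = [rng.randrange(2) for _ in range(n)]
message = [1, 0, 1, 1, 0, 0]
stego = embed(H, cover, message)
recovered = list(syndrome(H, stego))                  # equals the message bits
changed = sum(c != s for c, s in zip(cover, stego))   # few LSBs actually flipped
```

In the LSB setting, `cover` would be the least significant bits of pixel values and the receiver recovers the message by computing the syndrome alone, without knowing which bits were flipped.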
Orthogonal polynomials and random matrices
Deift, Percy
2000-01-01
This volume expands on a set of lectures held at the Courant Institute on Riemann-Hilbert problems, orthogonal polynomials, and random matrix theory. The goal of the course was to prove universality for a variety of statistical quantities arising in the theory of random matrix models. The central question was the following: Why do very general ensembles of random n × n matrices exhibit universal behavior as n → ∞? The main ingredient in the proof is the steepest descent method for oscillatory Riemann-Hilbert problems.
Random processes in nuclear reactors
Williams, M M R
1974-01-01
Random Processes in Nuclear Reactors describes the problems that a nuclear engineer may meet which involve random fluctuations and sets out in detail how they may be interpreted in terms of various models of the reactor system. Chapters set out to discuss topics on the origins of random processes and sources; the general technique to zero-power problems and bring out the basic effect of fission, and fluctuations in the lifetime of neutrons, on the measured response; the interpretation of power reactor noise; and associated problems connected with mechanical, hydraulic and thermal noise sources
Curvature of random walks and random polygons in confinement
International Nuclear Information System (INIS)
Diao, Y; Ernst, C; Montemayor, A; Ziegler, U
2013-01-01
The purpose of this paper is to study the curvature of equilateral random walks and polygons that are confined in a sphere. Curvature is one of several basic geometric properties that can be used to describe random walks and polygons. We show that confinement affects curvature quite strongly, and in the limit case where the confinement diameter equals the edge length the unconfined expected curvature value doubles from π/2 to π. To study curvature a simple model of an equilateral random walk in spherical confinement in dimensions 2 and 3 is introduced. For this simple model we derive explicit integral expressions for the expected value of the total curvature in both dimensions. These expressions are functions that depend only on the radius R of the confinement sphere. We then show that the values obtained by numeric integration of these expressions agrees with numerical average curvature estimates obtained from simulations of random walks. Finally, we compare the confinement effect on curvature of random walks with random polygons. (paper)
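The unconfined reference value quoted above, an expected turning angle of π/2 per vertex, is easy to check by simulation. This is a sketch of the unconfined case only, not of the paper's confined-walk model:

```python
import math
import random

def random_unit_vector(rng):
    """Uniform direction on the unit sphere via normalized Gaussians."""
    while True:
        v = (rng.gauss(0, 1), rng.gauss(0, 1), rng.gauss(0, 1))
        r = math.sqrt(sum(c * c for c in v))
        if r > 1e-12:
            return tuple(c / r for c in v)

def mean_turning_angle(n_steps, seed=3):
    """Average angle between consecutive steps of an unconfined equilateral
    random walk in R^3; the expected value is pi/2."""
    rng = random.Random(seed)
    prev = random_unit_vector(rng)
    total = 0.0
    for _ in range(n_steps):
        cur = random_unit_vector(rng)
        dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(prev, cur))))
        total += math.acos(dot)
        prev = cur
    return total / n_steps

angle = mean_turning_angle(20000)   # close to pi/2
```

Reproducing the paper's result would additionally require rejecting steps that leave the confinement sphere, which biases consecutive steps toward reversal and pushes the mean angle toward π as the sphere shrinks.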
DEFF Research Database (Denmark)
Knudsen, Thorbjørn
2003-01-01
The present article provides a minimal description of the causal structure of economic selection theory and outlines how the internal selection dynamics of business organisations can be reconciled with selection in competitive markets. In addition to generic similarity in terms of the Darwinian principles of variation, continuity and selection, it is argued that economic selection theory should mimic the causal structure of neo-Darwinian theory. Two of the most influential explanations of economic evolution, Alchian's and Nelson and Winter's, are used to illustrate how this could be achieved.
Selective Reproductive Technologies
DEFF Research Database (Denmark)
Gammeltoft, Tine; Wahlberg, Ayo
2014-01-01
From a historical perspective, selective reproduction is nothing new. Infanticide, abandonment, and selective neglect of children have a long history, and the widespread deployment of sterilization and forced abortion in the twentieth century has been well documented. Yet in recent decades select... (ARTs), what we term selective reproductive technologies (SRTs) are of a more specific nature: Rather than aiming to overcome infertility, they are used to prevent or allow the birth of certain kinds of children. This review highlights anthropological research into SRTs in different parts of the world, discussing how selective reproduction engages with issues of long-standing theoretical concern in anthropology, such as politics, kinship, gender, religion, globalization, and inequality.
DEFF Research Database (Denmark)
Pedersen, Keld
2016-01-01
for initiation. Most of the research on project selection is normative, suggesting new methods, but available empirical studies indicate that many methods are seldom used in practice. This paper addresses the issue by providing an increased understanding of IT project selection practice, thereby facilitating the development of methods that better fit current practice. The study is based on naturalistic decision-making theory and interviews with experienced project portfolio managers who, when selecting projects, primarily rely on political skills, experience and personal networks rather than on formal IT project-selection methods. These findings point to new areas for developing new methodological support for IT project selection.
Polyatomic Trilobite Rydberg Molecules in a Dense Random Gas.
Luukko, Perttu J J; Rost, Jan-Michael
2017-11-17
Trilobites are exotic giant dimers with enormous dipole moments. They consist of a Rydberg atom and a distant ground-state atom bound together by short-range electron-neutral attraction. We show that highly polar, polyatomic trilobite states unexpectedly persist and thrive in a dense ultracold gas of randomly positioned atoms. This is caused by perturbation-induced quantum scarring and the localization of electron density on randomly occurring atom clusters. At certain densities these states also mix with an s state, overcoming selection rules that hinder the photoassociation of ordinary trilobites.
Local randomness: Examples and application
Fu, Honghao; Miller, Carl A.
2018-03-01
When two players achieve a superclassical score at a nonlocal game, their outputs must contain intrinsic randomness. This fact has many useful implications for quantum cryptography. Recently it has been observed [C. Miller and Y. Shi, Quantum Inf. Computat. 17, 0595 (2017)] that such scores also imply the existence of local randomness—that is, randomness known to one player but not to the other. This has potential implications for cryptographic tasks between two cooperating but mistrustful players. In the current paper we bring this notion toward practical realization, by offering near-optimal bounds on local randomness for the CHSH game, and also proving the security of a cryptographic application of local randomness (single-bit certified deletion).
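The classical ceiling at the CHSH game, the threshold a "superclassical" score must exceed before any randomness is certified, can be verified by brute force over all deterministic strategies:

```python
from itertools import product

def chsh_win_probability(alice, bob):
    """Winning probability of a deterministic strategy pair at CHSH:
    on uniform input bits (x, y), outputs must satisfy a XOR b = x AND y.
    alice and bob are lookup tables indexed by the input bit."""
    wins = sum((alice[x] ^ bob[y]) == (x & y)
               for x, y in product((0, 1), repeat=2))
    return wins / 4

best_classical = max(chsh_win_probability(a, b)
                     for a in product((0, 1), repeat=2)
                     for b in product((0, 1), repeat=2))
# best_classical is 3/4; quantum strategies reach cos^2(pi/8) ~ 0.8536, and
# any observed score above 3/4 therefore certifies intrinsic randomness.
```

Shared classical randomness cannot help, since a mixture of deterministic strategies can never beat the best one, which is why 3/4 is the classical bound.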
Random walks on reductive groups
Benoist, Yves
2016-01-01
The classical theory of Random Walks describes the asymptotic behavior of sums of independent identically distributed random real variables. This book explains the generalization of this theory to products of independent identically distributed random matrices with real coefficients. Under the assumption that the action of the matrices is semisimple – or, equivalently, that the Zariski closure of the group generated by these matrices is reductive - and under suitable moment assumptions, it is shown that the norm of the products of such random matrices satisfies a number of classical probabilistic laws. This book includes necessary background on the theory of reductive algebraic groups, probability theory and operator theory, thereby providing a modern introduction to the topic.
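The central phenomenon, exponential growth of the norm of products of iid random matrices, can be illustrated numerically by estimating the top Lyapunov exponent. The 2 x 2 Gaussian-entry ensemble below is an illustrative assumption; the book's setting is far more general:

```python
import math
import random

def lyapunov_estimate(n_steps=20000, seed=5):
    """Estimate lambda = lim (1/n) log ||A_n ... A_1 v|| for products of
    iid 2x2 matrices with standard Gaussian entries, renormalizing the
    carried vector at each step to avoid overflow."""
    rng = random.Random(seed)
    v = (1.0, 0.0)
    log_growth = 0.0
    for _ in range(n_steps):
        a, b, c, d = (rng.gauss(0, 1) for _ in range(4))
        w = (a * v[0] + b * v[1], c * v[0] + d * v[1])
        norm = math.hypot(*w)
        log_growth += math.log(norm)
        v = (w[0] / norm, w[1] / norm)
    return log_growth / n_steps

lam = lyapunov_estimate()   # positive: the product norm grows exponentially
```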
Microcomputer Unit: Generating Random Numbers.
Haigh, William E.
1986-01-01
Presents an activity, suitable for students in grades 6-12, on generating random numbers. Objectives, equipment needed, list of prerequisite experiences, instructional strategies, and ready-to-copy student worksheets are included. (JN)
Chaotic systems are dynamically random
International Nuclear Information System (INIS)
Svozil, K.
1988-01-01
The idea is put forward that the significant route to chaos is driven by recursive iterations of suitable evolution functions. The corresponding formal notion of randomness is based on dynamic rather than static complexity. 24 refs. (Author)
A Randomized Central Limit Theorem
International Nuclear Information System (INIS)
Eliazar, Iddo; Klafter, Joseph
2010-01-01
The Central Limit Theorem (CLT), one of the most elemental pillars of Probability Theory and Statistical Physics, asserts that: the universal probability law of large aggregates of independent and identically distributed random summands with zero mean and finite variance, scaled by the square root of the aggregate-size (√(n)), is Gaussian. The scaling scheme of the CLT is deterministic and uniform - scaling all aggregate-summands by the common and deterministic factor √(n). This Letter considers scaling schemes which are stochastic and non-uniform, and presents a 'Randomized Central Limit Theorem' (RCLT): we establish a class of random scaling schemes which yields universal probability laws of large aggregates of independent and identically distributed random summands. The RCLT universal probability laws, in turn, are the one-sided and the symmetric Levy laws.
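The deterministic √n scaling that the Letter generalizes can be sketched directly. The code below demonstrates only the classical CLT baseline (the stochastic scaling schemes of the RCLT are not implemented here):

```python
import random
import statistics

def scaled_sums(n_summands, n_samples, seed=9):
    """Sums of iid Uniform(-1, 1) summands (zero mean, variance 1/3),
    scaled by the deterministic CLT factor sqrt(n)."""
    rng = random.Random(seed)
    return [sum(rng.uniform(-1, 1) for _ in range(n_summands)) / n_summands ** 0.5
            for _ in range(n_samples)]

samples = scaled_sums(100, 5000)
m = statistics.fmean(samples)        # near 0
v = statistics.pvariance(samples)    # near 1/3, the summand variance
```

By the CLT these scaled sums are approximately Gaussian with the summand variance; the RCLT replaces the common factor √n with random, non-uniform scalings, which pushes the limit laws from the Gaussian to the Levy family.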
Electromagnetic scattering from random media
Field, Timothy R
2009-01-01
The book develops the dynamical theory of scattering from random media from first principles. Its key findings are to characterize the time evolution of the scattered field in terms of stochastic differential equations, and to illustrate this framework
Cluster randomization and political philosophy.
Chwang, Eric
2012-11-01
In this paper, I will argue that, while the ethical issues raised by cluster randomization can be challenging, they are not new. My thesis divides neatly into two parts. In the first, easier part I argue that many of the ethical challenges posed by cluster randomized human subjects research are clearly present in other types of human subjects research, and so are not novel. In the second, more difficult part I discuss the thorniest ethical challenge for cluster randomized research--cases where consent is genuinely impractical to obtain. I argue that once again these cases require no new analytic insight; instead, we should look to political philosophy for guidance. In other words, the most serious ethical problem that arises in cluster randomized research also arises in political philosophy. © 2011 Blackwell Publishing Ltd.
Quantum-noise randomized ciphers
International Nuclear Information System (INIS)
Nair, Ranjith; Yuen, Horace P.; Kumar, Prem; Corndorf, Eric; Eguchi, Takami
2006-01-01
We review the notion of a classical random cipher and its advantages. We sharpen the usual description of random ciphers to a particular mathematical characterization suggested by the salient feature responsible for their increased security. We describe a concrete system known as αη and show that it is equivalent to a random cipher in which the required randomization is affected by coherent-state quantum noise. We describe the currently known security features of αη and similar systems, including lower bounds on the unicity distances against ciphertext-only and known-plaintext attacks. We show how αη used in conjunction with any standard stream cipher such as the Advanced Encryption Standard provides an additional, qualitatively different layer of security from physical encryption against known-plaintext attacks on the key. We refute some claims in the literature that αη is equivalent to a nonrandom stream cipher
Random matrix improved subspace clustering
Couillet, Romain; Kammoun, Abla
2017-01-01
This article introduces a spectral method for statistical subspace clustering. The method is built upon standard kernel spectral clustering techniques, however carefully tuned by theoretical understanding arising from random matrix findings. We show
Random sequential adsorption of cubes
Cieśla, Michał; Kubala, Piotr
2018-01-01
Random packings built of cubes are studied numerically using a random sequential adsorption algorithm. To compare the obtained results with previous reports, three different models of cube orientation sampling were used. Also, three different cube-cube intersection algorithms were tested to find the most efficient one. The study focuses on the mean saturated packing fraction as well as kinetics of packing growth. Microstructural properties of packings were analyzed using density autocorrelation function.
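In its simplest variant, with all cubes aligned and the problem reduced to two dimensions, the algorithm is a short rejection loop. This is an illustrative sketch, not the paper's three-dimensional oriented-cube study:

```python
import random

def rsa_squares(box=20.0, attempts=20000, seed=11):
    """Random sequential adsorption of axis-aligned unit squares in a
    box x box region: propose uniform positions, keep a square only if
    it overlaps none of the squares already placed."""
    rng = random.Random(seed)
    placed = []
    for _ in range(attempts):
        x = rng.uniform(0.5, box - 0.5)   # keep squares fully inside the box
        y = rng.uniform(0.5, box - 0.5)
        # two aligned unit squares overlap iff both center offsets are < 1
        if all(abs(x - px) >= 1.0 or abs(y - py) >= 1.0 for px, py in placed):
            placed.append((x, y))
    return len(placed) / (box * box)

phi = rsa_squares()   # packing fraction, below the aligned-square saturation (~0.56)
```

The saturated fraction is approached only logarithmically slowly, which is why studies such as this one focus on the kinetics of packing growth as well as the saturation value itself.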
Random walk through fractal environments
Isliker, H.; Vlahos, L.
2002-01-01
We analyze random walk through fractal environments, embedded in 3-dimensional, permeable space. Particles travel freely and are scattered off into random directions when they hit the fractal. The statistical distribution of the flight increments (i.e. of the displacements between two consecutive hittings) is analytically derived from a common, practical definition of fractal dimension, and it turns out to approximate quite well a power-law in the case where the dimension D of the fractal is ...
Randomness in Contemporary Graphic Art
Zavřelová, Veronika
2016-01-01
Veronika Zavřelová Bachelor thesis Charles University in Prague, Faculty of Education, Department of Art Education Randomness in contemporary graphic art imaginative picture card game ANNOTATION This (bachelor) thesis concerns itself with a connection between verbal and visual character system within the topic of Randomness in contemporary graphic art - imaginative picture card game. The thesis is mainly based on the practical part - exclusively created card game Piktim. The card game uses as...
Staggered chiral random matrix theory
International Nuclear Information System (INIS)
Osborn, James C.
2011-01-01
We present a random matrix theory for the staggered lattice QCD Dirac operator. The staggered random matrix theory is equivalent to the zero-momentum limit of the staggered chiral Lagrangian and includes all taste breaking terms at their leading order. This is an extension of previous work which only included some of the taste breaking terms. We will also present some results for the taste breaking contributions to the partition function and the Dirac eigenvalues.
Digital random-number generator
Brocker, D. H.
1973-01-01
For a binary digit array of N bits, N noise sources are used to feed N nonlinear operators; each flip-flop in the digit array is set by a nonlinear operator to reflect whether the amplitude of the generator which feeds it is above or below the mean value of the generated noise. A fixed-point uniform-distribution random-number generation method can also be used to generate random numbers with other than uniform distributions.
DEFF Research Database (Denmark)
Shetty, Nisha; Rinnan, Åsmund; Gislum, René
2012-01-01
) algorithm were used and compared. Both the Puchwein and CADEX methods provide a calibration set equally distributed in space, and both methods require a minimum of prior knowledge. The samples were also selected randomly using complete random, cultivar random (year fixed), year random (cultivar fixed) and interaction (cultivar × year fixed) random procedures to see the influence of different factors on sample selection. Puchwein's method performed best with the lowest RMSEP, followed by CADEX, interaction random, year random, cultivar random and complete random. Out of 118 samples of the complete calibration set...... effectively enhance the cost-effectiveness of NIR spectral analysis by reducing the number of analyzed samples in the calibration set by more than 80%, which substantially reduces the effort of laboratory analyses with no significant loss in prediction accuracy.
An introduction to random interlacements
Drewitz, Alexander; Sapozhnikov, Artëm
2014-01-01
This book gives a self-contained introduction to the theory of random interlacements. The intended reader of the book is a graduate student with a background in probability theory who wants to learn about the fundamental results and methods of this rapidly emerging field of research. The model was introduced by Sznitman in 2007 in order to describe the local picture left by the trace of a random walk on a large discrete torus when it runs up to times proportional to the volume of the torus. Random interlacements is a new percolation model on the d-dimensional lattice. The main results covered by the book include the full proof of the local convergence of random walk trace on the torus to random interlacements and the full proof of the percolation phase transition of the vacant set of random interlacements in all dimensions. The reader will become familiar with the techniques relevant to working with the underlying Poisson Process and the method of multi-scale renormalization, which helps in overcoming the ch...
The MIXMAX random number generator
Savvidy, Konstantin G.
2015-11-01
In this paper, we study the randomness properties of unimodular matrix random number generators. Under well-known conditions, these discrete-time dynamical systems have the highly desirable K-mixing properties which guarantee high-quality random numbers. It is found that some widely used random number generators have poor Kolmogorov entropy and consequently fail empirical tests of randomness. These tests show that the lowest acceptable value of the Kolmogorov entropy is around 50. Next, we provide a solution to the problem of determining the maximal period of unimodular matrix generators of pseudo-random numbers. We formulate the necessary and sufficient condition to attain the maximum period and present a family of specific generators in the MIXMAX family with superior performance and excellent statistical properties. Finally, we construct three efficient algorithms for operations with the MIXMAX matrix, which is a multi-dimensional generalization of the famous cat map: the first computes multiplication by the MIXMAX matrix in O(N) operations; the second recursively computes its characteristic polynomial in O(N^2) operations; and the third applies skips of a large number of steps S to the sequence in O(N^2 log(S)) operations.
Perceptions of randomized security schedules.
Scurich, Nicholas; John, Richard S
2014-04-01
Security of infrastructure is a major concern. Traditional security schedules are unable to provide omnipresent coverage; consequently, adversaries can exploit predictable vulnerabilities to their advantage. Randomized security schedules, which randomly deploy security measures, overcome these limitations, but public perceptions of such schedules have not been examined. In this experiment, participants were asked to make a choice between attending a venue that employed a traditional (i.e., search everyone) or a random (i.e., a probability of being searched) security schedule. The absolute probability of detecting contraband was manipulated (i.e., 1/10, 1/4, 1/2) but equivalent between the two schedule types. In general, participants were indifferent to either security schedule, regardless of the probability of detection. The randomized schedule was deemed more convenient, but the traditional schedule was considered fairer and safer. There were no differences between traditional and random schedule in terms of perceived effectiveness or deterrence. Policy implications for the implementation and utilization of randomized schedules are discussed. © 2013 Society for Risk Analysis.
A New Random Walk for Replica Detection in WSNs
Aalsalem, Mohammed Y.; Saad, N. M.; Hossain, Md. Shohrab; Atiquzzaman, Mohammed; Khan, Muhammad Khurram
2016-01-01
Wireless Sensor Networks (WSNs) are vulnerable to Node Replication attacks or Clone attacks. Among all the existing clone detection protocols in WSNs, RAWL shows the most promising results by employing Simple Random Walk (SRW). More recently, RAND outperforms RAWL by incorporating Network Division with SRW. Both RAND and RAWL have used SRW for random selection of witness nodes which is problematic because of frequently revisiting the previously passed nodes that leads to longer delays, high expenditures of energy with lower probability that witness nodes intersect. To circumvent this problem, we propose to employ a new kind of constrained random walk, namely Single Stage Memory Random Walk and present a distributed technique called SSRWND (Single Stage Memory Random Walk with Network Division). In SSRWND, single stage memory random walk is combined with network division aiming to decrease the communication and memory costs while keeping the detection probability higher. Through intensive simulations it is verified that SSRWND guarantees higher witness node security with moderate communication and memory overheads. SSRWND is expedient for security oriented application fields of WSNs like military and medical. PMID:27409082
[Silvicultural treatments and their selection effects].
Vincent, G
1973-01-01
Selection can be defined in terms of its observable consequences as the non-random differential reproduction of genotypes (Lerner 1958). In forest stands we select, during improvement fellings and reproduction treatments, the individuals surpassing others in growth or in production of first-class timber. However, silvicultural treatments guarantee a permanent increase of forest production only if they are carried out according to the principles of directional (dynamic) selection. These principles require that the trees retained for further growth and for forest regeneration be selected by their hereditary properties, i.e. by their genotypes. To make this selection feasible, our study deals with the genetic parameters and gives examples of the application of the response to selection, the selection differential, heritability in the narrow and in the broad sense, and the genetic and genotypic gain. On the strength of these parameters we can estimate the economic success of various silvicultural treatments in forest stands. The examples demonstrate that selection measures of higher intensity are manifested in a higher selection differential and a higher genetic and genotypic gain, and that these measures show more distinct effects in variable populations - in natural forest - than in populations characterized by smaller variability, e.g. in many uniform artificially established stands. The examples of the influence of different selection regimes on the genotype composition of populations show that genetics teaches us to differentiate among genotypes of the same species and at the same time gives us new criteria for evaluating selection treatments. From an economic point of view it is necessary to consider these criteria in silviculture, if only because they allow us to judge the genetic composition of forest stands
Virial expansion for almost diagonal random matrices
Yevtushenko, Oleg; Kravtsov, Vladimir E.
2003-08-01
Energy level statistics of Hermitian random matrices Ĥ with Gaussian independent random entries H_ij (i ≥ j) is studied for a generic ensemble of almost diagonal random matrices with ⟨|H_ii|^2⟩ ~ 1 and ⟨|H_i...
Non-compact random generalized games and random quasi-variational inequalities
Yuan, Xian-Zhi
1994-01-01
In this paper, existence theorems of random maximal elements, random equilibria for the random one-person game and random generalized game with a countable number of players are given as applications of random fixed point theorems. By employing existence theorems of random generalized games, we deduce the existence of solutions for non-compact random quasi-variational inequalities. These in turn are used to establish several existence theorems of noncompact generalized random ...
[Genomic selection and its application].
Li, Heng-De; Bao, Zhen-Min; Sun, Xiao-Wen
2011-12-01
Selective breeding is very important in agricultural production, and breeding value estimation is the core of selective breeding. With the development of genetic markers, especially high-throughput genotyping technology, it has become possible to estimate breeding value at the genome level, i.e. genomic selection (GS). In this review, the methods of GS were categorized into two groups: one predicts the genomic estimated breeding value (GEBV) from allele effects, using methods such as least squares, random regression - best linear unbiased prediction (RR-BLUP), Bayesian methods, and principal component analysis; the other predicts GEBV with a genetic relationship matrix, constructing the matrix from high-throughput genetic markers and then predicting GEBV through a linear mixed model, i.e. GBLUP. The basic principles of these methods are introduced according to the above two classifications. Factors affecting GS accuracy include marker type and density, haplotype length, the size of the reference population, the extent of linkage between markers and QTL, and so on. Among the methods of GS, Bayesian methods and GBLUP are usually more accurate than the others, and least squares is the worst. GBLUP is time-efficient and can combine pedigree with genotypic information, hence it is superior to other methods. Although progress has been made in GS, there are still challenges, for example, united breeding, long-term genetic gain with GS, and disentangling markers with and without contribution to the traits. GS has been applied in animal and plant breeding practice and also has the potential to predict genetic predisposition in humans and to study evolutionary dynamics. GS, which is more precise than the traditional method, is a breakthrough in measuring genetic relationship. Therefore, GS will be a revolutionary event in the history of animal and plant breeding.
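The GBLUP branch described above rests on a genomic relationship matrix built from markers. A minimal sketch of a VanRaden-style relationship matrix on simulated genotypes (the dimensions, seed, and coding are assumptions for illustration, not from the review):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated marker matrix: 100 individuals x 500 SNPs coded 0/1/2
# (number of copies of the reference allele).
M = rng.integers(0, 3, size=(100, 500)).astype(float)

# VanRaden-style genomic relationship matrix G used by GBLUP:
# center each marker by twice its allele frequency, then scale.
p = M.mean(axis=0) / 2.0              # estimated allele frequencies
Z = M - 2.0 * p                       # centered genotype codes
G = (Z @ Z.T) / (2.0 * (p * (1.0 - p)).sum())
```

G then replaces the pedigree-based numerator relationship matrix in the linear mixed model, which is what lets GBLUP combine pedigree and genotype information.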
Systematic random sampling of the comet assay.
McArt, Darragh G; Wasson, Gillian R; McKerr, George; Saetzler, Kurt; Reed, Matt; Howard, C Vyvyan
2009-07-01
The comet assay is a technique used to quantify DNA damage and repair at a cellular level. In the assay, cells are embedded in agarose and the cellular content is stripped away, leaving only the DNA trapped in an agarose cavity which can then be electrophoresed. The damaged DNA can enter the agarose and migrate while the undamaged DNA cannot and is retained. DNA damage is measured as the proportion of the migratory 'tail' DNA compared to the total DNA in the cell. The fundamental basis of these arbitrary values is obtained in the comet acquisition phase using fluorescence microscopy with a stoichiometric stain in tandem with image analysis software. Current acquisition methods are assumed to select comets both objectively and at random. In this paper we examine the 'randomness' of the acquisition phase and suggest an alternative method that offers both objective and unbiased comet selection. In order to achieve this, we have adopted a survey sampling approach widely used in stereology, which offers a method of systematic random sampling (SRS). This is desirable as it offers an impartial and reproducible method of comet analysis that can be used either manually or automated. By making use of an unbiased sampling frame and using microscope verniers, we are able to increase the precision of estimates of DNA damage. Results obtained from a multiple-user pooled variation experiment showed that the SRS technique attained a lower variability than that of the traditional approach. The analysis of a single user with repetition experiment showed greater individual variances while not being detrimental to overall averages. This would suggest that the SRS method offers a better reflection of DNA damage for a given slide and also offers better user reproducibility.
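Systematic random sampling as used above draws one random starting position and then samples at a fixed interval, giving every item the same inclusion probability. A minimal sketch (the item list stands in for comet positions on a slide; all names are illustrative):

```python
import random

def systematic_sample(items, n, rng=None):
    """Systematic random sampling: one random start in [0, k),
    then every k-th item, where k = len(items) / n."""
    rng = rng or random.Random()
    k = len(items) / n                 # sampling interval
    start = rng.uniform(0, k)          # single source of randomness
    return [items[int(start + i * k)] for i in range(n)]

comets = list(range(200))              # e.g. comet positions on a slide
sample = systematic_sample(comets, 20, random.Random(1))
```

Unlike picking 20 comets ad hoc, the scheme is reproducible (one random number fixes the whole sample) and unbiased, which is the property the stereological approach exploits.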
Lane detection using Randomized Hough Transform
Mongkonyong, Peerawat; Nuthong, Chaiwat; Siddhichai, Supakorn; Yamakita, Masaki
2018-01-01
According to the report of the Royal Thai Police between 2006 and 2015, unintentional lane changing is one of the leading causes of accidents. To address this problem, many methods have been considered. The Lane Departure Warning System (LDWS) is considered to be one of the potential solutions. LDWS is a mechanism designed to warn the driver when the vehicle begins to move out of its current lane. LDWS contains many parts, including lane boundary detection, driver warning, and lane marker tracking. This article focuses on the lane boundary detection part. The proposed lane boundary detection extracts the lines of each image from the input video and selects the lane marker of the road surface from those lines. The Standard Hough Transform (SHT) and the Randomized Hough Transform (RHT) are considered in this article. They are used to extract lines from an image. SHT extracts the lines from all of the edge pixels. RHT extracts only the lines voted for by randomly picked point pairs from the edge pixels, which reduces time and memory usage compared with SHT. Increasing the threshold value in RHT raises the vote limit for lines that are likely to be lane markers, but it also consumes more time and memory. To compare SHT and RHT at different threshold values, 500 frames of input video from a front-facing car camera are processed. The accuracy and the computational time of RHT are similar to those of SHT in the result of the comparison.
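The core RHT idea above - vote with random point pairs instead of every edge pixel - can be sketched in a few lines. This is an illustrative toy (slope-intercept parameters, synthetic edge map, and the quantization by rounding are all assumptions, not the paper's implementation):

```python
import random
from collections import Counter

def randomized_hough_lines(edge_pixels, iterations, rng=None):
    """Randomized Hough Transform sketch: repeatedly pick a random
    pair of edge pixels, compute the line through them, and vote
    for the quantized (slope, intercept) parameters."""
    rng = rng or random.Random()
    votes = Counter()
    for _ in range(iterations):
        (x1, y1), (x2, y2) = rng.sample(edge_pixels, 2)
        if x1 == x2:
            continue                   # this sketch skips vertical lines
        m = (y2 - y1) / (x2 - x1)
        b = y1 - m * x1
        votes[(round(m, 1), round(b))] += 1
    return votes

# Synthetic edge map: points on y = 2x + 1 plus two outliers.
pixels = [(x, 2 * x + 1) for x in range(20)] + [(3, 17), (11, 2)]
votes = randomized_hough_lines(pixels, 2000, random.Random(0))
best = votes.most_common(1)[0][0]      # dominant line parameters
```

Because most random pairs land on the dominant line, its bin accumulates by far the most votes, while SHT would have spent a full parameter-space vote on every single edge pixel.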
Wave propagation and scattering in random media
Ishimaru, Akira
1978-01-01
Wave Propagation and Scattering in Random Media, Volume 2, presents the fundamental formulations of wave propagation and scattering in random media in a unified and systematic manner. The topics covered in this book may be grouped into three categories: waves in random scatterers, waves in random continua, and rough surface scattering. Random scatterers are random distributions of many particles. Examples are rain, fog, smog, hail, ocean particles, red blood cells, polymers, and other particles in a state of Brownian motion. Random continua are the media whose characteristics vary randomly an
Restaurant Selection in Dublin
Cullen, Frank
2012-01-01
The primary objective of this research was to investigate the selection process used by consumers when choosing a restaurant to dine. This study examined literature on consumer behaviour, restaurant selection, and decision-making, underpinning the contention that service quality is linked to the consumer’s selection of a restaurant. It supports the utility theories that consumers buy bundles of attributes that simultaneously combined represent a certain level of service quality at a certain p...
Compressors selection and sizing
Brown, Royce N
2005-01-01
This practical reference provides in-depth information required to understand and properly estimate compressor capabilities and to select the proper designs. Engineers and students will gain a thorough understanding of compression principles, equipment, applications, selection, sizing, installation, and maintenance. The many examples clearly illustrate key aspects to help readers understand the ""real world"" of compressor technology.Compressors: Selection and Sizing, third edition is completely updated with new API standards. Additions requested by readers include a new section on di
A Monte Carlo study of adsorption of random copolymers on random surfaces
Moghaddam, M S
2003-01-01
We study the adsorption problem of a random copolymer on a random surface in which a self-avoiding walk in three dimensions interacts with a plane defining a half-space to which the walk is confined. Each vertex of the walk is randomly labelled A with probability p_p or B with probability 1 - p_p, and only vertices labelled A are attracted to the surface plane. Each lattice site on the plane is also labelled either A with probability p_s or B with probability 1 - p_s, and only lattice sites labelled A interact with the walk. We study two variations of this model: in the first case the A-vertices of the walk interact only with the A-sites on the surface. In the second case the constraint of selective binding is removed; that is, any contact between the walk and the surface that involves an A-labelling, either from the surface or from the walk, is counted as a visit to the surface. The system is quenched in both cases, i.e. the labellings of the walk and of the surface are fixed as thermodynam...
National Aeronautics and Space Administration — Selective surfaces have wavelength dependent emissivity/absorption. These surfaces can be designed to reflect solar radiation, while maximizing infrared emittance,...
National Research Council Canada - National Science Library
Halstead, John B
2006-01-01
.... The research uses a combination of statistical learning, feature selection methods, and multivariate statistics to determine the better prediction function approximation with features obtained...
National Research Council Canada - National Science Library
Institute of Medicine; Board on Population Health and Public Health Practice; Institute of Medicine; National Academy of Sciences
2006-01-01
...: Selected Health Effects. This committee was charged with addressing whether asbestos exposure is causally related to adverse health consequences in addition to asbestosis, mesothelioma, and lung cancer. Asbestos...
Random walk through fractal environments
International Nuclear Information System (INIS)
Isliker, H.; Vlahos, L.
2003-01-01
We analyze random walk through fractal environments, embedded in three-dimensional, permeable space. Particles travel freely and are scattered off into random directions when they hit the fractal. The statistical distribution of the flight increments (i.e., of the displacements between two consecutive hittings) is analytically derived from a common, practical definition of fractal dimension, and it turns out to approximate quite well a power-law in the case where the dimension D_F of the fractal is less than 2; there is though, always a finite rate of unaffected escape. Random walks through fractal sets with D_F ≤ 2 can thus be considered as defective Levy walks. The distribution of jump increments for D_F > 2 is decaying exponentially. The diffusive behavior of the random walk is analyzed in the frame of continuous time random walk, which we generalize to include the case of defective distributions of walk increments. It is shown that the particles undergo anomalous, enhanced diffusion for D_F < 2, whereas the diffusion for D_F > 2 is normal for large times, enhanced though for small and intermediate times. In particular, it follows that fractals generated by a particular class of self-organized criticality models give rise to enhanced diffusion. The analytical results are illustrated by Monte Carlo simulations
Advances in randomized parallel computing
Rajasekaran, Sanguthevar
1999-01-01
The technique of randomization has been employed to solve numerous problems of computing, both sequentially and in parallel. Examples of randomized algorithms that are asymptotically better than their deterministic counterparts in solving various fundamental problems abound. Randomized algorithms have the advantages of simplicity and better performance both in theory and often in practice. This book is a collection of articles written by renowned experts in the area of randomized parallel computing. A brief introduction to randomized algorithms: In the analysis of algorithms, at least three different measures of performance can be used: the best case, the worst case, and the average case. Often, the average case run time of an algorithm is much smaller than the worst case. For instance, the worst case run time of Hoare's quicksort is O(n^2), whereas its average case run time is only O(n log n). The average case analysis is conducted with an assumption on the input space. The assumption made to arrive at t...
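The quicksort example above is the classic illustration: choosing the pivot at random makes the expected O(n log n) bound hold for every input, with no assumption on the input distribution. A minimal sketch (an out-of-place variant for clarity, not Hoare's in-place partition):

```python
import random

def randomized_quicksort(a, rng=None):
    """Quicksort with a uniformly random pivot: expected O(n log n)
    comparisons on every input, avoiding the O(n^2) worst case that
    a fixed pivot hits on adversarial (e.g. already sorted) inputs."""
    rng = rng or random.Random()
    if len(a) <= 1:
        return list(a)
    pivot = rng.choice(a)
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return randomized_quicksort(less, rng) + equal + randomized_quicksort(greater, rng)

data = [5, 3, 8, 1, 9, 2, 7]
result = randomized_quicksort(data)
```

The randomization effectively moves the averaging from the input space into the algorithm itself, which is exactly the shift the introduction describes.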
Cover times of random searches
Chupeau, Marie; Bénichou, Olivier; Voituriez, Raphaël
2015-10-01
How long must one undertake a random search to visit all sites of a given domain? This time, known as the cover time, is a key observable to quantify the efficiency of exhaustive searches, which require a complete exploration of an area and not only the discovery of a single target. Examples range from immune-system cells chasing pathogens to animals harvesting resources, from robotic exploration for cleaning or demining to the task of improving search algorithms. Despite its broad relevance, the cover time has remained elusive and so far explicit results have been scarce and mostly limited to regular random walks. Here we determine the full distribution of the cover time for a broad range of random search processes, including Lévy strategies, intermittent strategies, persistent random walks and random walks on complex networks, and reveal its universal features. We show that for all these examples the mean cover time can be minimized, and that the corresponding optimal strategies also minimize the mean search time for a single target, unambiguously pointing towards their robustness.
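The cover time discussed above is easy to probe numerically for the simplest case the abstract mentions, a regular random walk. A toy simulation on a ring of sites (domain size, seed, and sample count are arbitrary choices for illustration):

```python
import random

def cover_time(n_sites, rng=None):
    """Simulate a simple symmetric random walk on a ring of n_sites
    and return the number of steps needed to visit every site."""
    rng = rng or random.Random()
    pos, visited, steps = 0, {0}, 0
    while len(visited) < n_sites:
        pos = (pos + rng.choice((-1, 1))) % n_sites
        visited.add(pos)
        steps += 1
    return steps

rng = random.Random(42)
times = [cover_time(20, rng) for _ in range(200)]
mean_cover = sum(times) / len(times)   # theory for a ring: n(n-1)/2 = 190
```

For the ring the mean cover time is known exactly, n(n - 1)/2, so the simulation doubles as a sanity check; the paper's contribution is the full distribution for far more general search processes, where no such closed form was available.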
Lamplighter model of a random copolymer adsorption on a line
Directory of Open Access Journals (Sweden)
L.I. Nazarov
2014-09-01
Full Text Available We present a model of an AB-diblock random copolymer sequential self-packaging with local quenched interactions on a one-dimensional infinite sticky substrate. It is assumed that the A-A and B-B contacts are favorable, while A-B are not. The position of a newly added monomer is selected in view of the local contact energy minimization. The model demonstrates a self-organization behavior with the nontrivial dependence of the total energy E (the number of unfavorable contacts) on the number of chain monomers N: E ~ N^(3/4) for a quenched random equally probable distribution of A- and B-monomers along the chain. The model is treated by mapping it onto the "lamplighter" random walk and the diffusion-controlled chemical reaction of X + X → 0 type with the subdiffusive motion of reagents.
A randomized controlled trial of an electronic informed consent process.
Rothwell, Erin; Wong, Bob; Rose, Nancy C; Anderson, Rebecca; Fedor, Beth; Stark, Louisa A; Botkin, Jeffrey R
2014-12-01
A pilot study assessed an electronic informed consent model within a randomized controlled trial (RCT). Participants who were recruited for the parent RCT project were randomly selected and randomized to either an electronic consent group (n = 32) or a simplified paper-based consent group (n = 30). Participants in the electronic consent group reported significantly higher understanding of the purpose of the study, alternatives to participation, and whom to contact if they had questions or concerns about the study. However, participants in the paper-based control group reported higher mean scores on some survey items. This research suggests that an electronic informed consent presentation may improve participant understanding for some aspects of a research study. © The Author(s) 2014.
Using a Calendar and Explanatory Instructions to Aid Within-Household Selection in Mail Surveys
Stange, Mathew; Smyth, Jolene D.; Olson, Kristen
2016-01-01
Although researchers can easily select probability samples of addresses using the U.S. Postal Service's Delivery Sequence File, randomly selecting respondents within households for surveys remains challenging. Researchers often place within-household selection instructions, such as the next or last birthday methods, in survey cover letters to…
Free probability and random matrices
Mingo, James A
2017-01-01
This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.
Generating random networks and graphs
Coolen, Ton; Roberts, Ekaterina
2017-01-01
This book supports researchers who need to generate random networks, or who are interested in the theoretical study of random graphs. The coverage includes exponential random graphs (where the targeted probability of each network appearing in the ensemble is specified), growth algorithms (i.e. preferential attachment and the stub-joining configuration model), special constructions (e.g. geometric graphs and Watts Strogatz models) and graphs on structured spaces (e.g. multiplex networks). The presentation aims to be a complete starting point, including details of both theory and implementation, as well as discussions of the main strengths and weaknesses of each approach. It includes extensive references for readers wishing to go further. The material is carefully structured to be accessible to researchers from all disciplines while also containing rigorous mathematical analysis (largely based on the techniques of statistical mechanics) to support those wishing to further develop or implement the theory of rand...
Kusnerz, Peggy A., Comp.; Pollack, Ann M., Comp.
This select bibliography lists books, monographs, journals and newsletters which relate to feminism, women's studies, and other perspectives on women. Selections are organized by topic: general, bibliographies, art and literature, biography/autobiography, economics, education, family and marriage, history, politics and sex roles. Also included is…
Nieuwenhuis, B.P.S.
2012-01-01
Sexual selection is an important factor that drives evolution, in which fitness is increased, not by increasing survival or viability, but by acquiring more or better mates. Sexual selection favours traits that increase the ability of an individual to obtain more matings than other individuals
Sgobba, T.; Landon, L.B.; Marciacq, J.B.; Groen, E.L.; Tikhonov, N.; Torchia, F.
2018-01-01
Selection and training represent two means of ensuring flight crew members are qualified and prepared to perform safely and effectively in space. The first part of the chapter looks at astronaut selection beginning with the evolutionary changes in the US and Russian programs. A discussion of the
Kin Selection - Mutation Balance
DEFF Research Database (Denmark)
Dyken, J. David Van; Linksvayer, Timothy Arnold; Wade, Michael J.
2011-01-01
selection-mutation balance, which provides an evolutionary null hypothesis for the statics and dynamics of cheating. When social interactions have linear fitness effects and Hamilton's rule is satisfied, selection is never strong enough to eliminate recurrent cheater mutants from a population, but cheater...
A Selective CPS Transformation
DEFF Research Database (Denmark)
Nielsen, Lasse Riechstein
2001-01-01
characterize this involvement as a control effect and we present a selective CPS transformation that makes functions and expressions continuation-passing if they have a control effect, and that leaves the rest of the program in direct style. We formalize this selective CPS transformation with an operational...
Impact of selective genotyping in the training population on accuracy and bias of genomic selection.
Zhao, Yusheng; Gowda, Manje; Longin, Friedrich H; Würschum, Tobias; Ranc, Nicolas; Reif, Jochen C
2012-08-01
Estimating marker effects based on routinely generated phenotypic data of breeding programs is a cost-effective strategy to implement genomic selection. Truncation selection in breeding populations, however, could have a strong impact on the accuracy to predict genomic breeding values. The main objective of our study was to investigate the influence of phenotypic selection on the accuracy and bias of genomic selection. We used experimental data of 788 testcross progenies from an elite maize breeding program. The testcross progenies were evaluated in unreplicated field trials in ten environments and fingerprinted with 857 SNP markers. Random regression best linear unbiased prediction method was used in combination with fivefold cross-validation based on genotypic sampling. We observed a substantial loss in the accuracy to predict genomic breeding values in unidirectional selected populations. In contrast, estimating marker effects based on bidirectional selected populations led to only a marginal decrease in the prediction accuracy of genomic breeding values. We concluded that bidirectional selection is a valuable approach to efficiently implement genomic selection in applied plant breeding programs.
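The random regression BLUP (RR-BLUP) step above amounts to a ridge-type shrinkage estimate of all marker effects jointly. A minimal sketch on simulated data (the 200 x 50 marker matrix, the shrinkage parameter, and the seed are illustrative stand-ins for the paper's maize testcross data, not its actual analysis):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated training set: 200 individuals x 50 SNP markers (0/1/2),
# phenotypes driven by 10 causal markers plus noise.
X = rng.integers(0, 3, size=(200, 50)).astype(float)
true_effects = np.zeros(50)
true_effects[:10] = rng.normal(0, 1, 10)
y = X @ true_effects + rng.normal(0, 1, 200)

# RR-BLUP-style shrinkage estimate of marker effects:
#   beta = (X'X + lambda * I)^(-1) X'y   on centered data.
lam = 10.0
Xc = X - X.mean(axis=0)
yc = y - y.mean()
beta = np.linalg.solve(Xc.T @ Xc + lam * np.eye(50), Xc.T @ yc)

# Genomic estimated breeding values for the training individuals.
gebv = Xc @ beta
accuracy = float(np.corrcoef(gebv, yc)[0, 1])
```

The paper's point can be reproduced in this setting by truncating the training set (e.g. keeping only the top half by phenotype) before estimating `beta`: one-sided truncation shrinks the phenotypic variance the model can learn from, degrading prediction accuracy far more than selecting both tails would.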
Green Supplier Selection Criteria
DEFF Research Database (Denmark)
Nielsen, Izabela Ewa; Banaeian, Narges; Golinska, Paulina
2014-01-01
Green supplier selection (GSS) criteria arise from an organization inclination to respond to any existing trends in environmental issues related to business management and processes, so GSS is integrating environmental thinking into conventional supplier selection. This research is designed...... to determine prevalent general and environmental supplier selection criteria and develop a framework which can help decision makers to determine and prioritize suitable green supplier selection criteria (general and environmental). In this research we considered several parameters (evaluation objectives......) to establish suitable criteria for GSS such as their production type, requirements, policy and objectives instead of applying common criteria. At first a comprehensive and deep review on prevalent and green supplier selection literatures performed. Then several evaluation objectives defined to assess the green...
LPTAU, Quasi Random Sequence Generator
International Nuclear Information System (INIS)
Sobol, Ilya M.
1993-01-01
1 - Description of program or function: LPTAU generates quasi random sequences. These are uniformly distributed sets of L = 2^M points in the N-dimensional unit cube: I^N = [0,1]x...x[0,1]. These sequences are used as nodes for multidimensional integration; as searching points in global optimization; as trial points in multi-criteria decision making; as quasi-random points for quasi Monte Carlo algorithms. 2 - Method of solution: Uses LP-TAU sequence generation (see references). 3 - Restrictions on the complexity of the problem: The number of points that can be generated is L ≤ 2^30. The dimension of the space cannot exceed 51
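The LP-tau (Sobol) construction itself is involved, but the defining property - points that fill the unit cube far more evenly than pseudo-random draws - is easy to show with the simplest low-discrepancy sequence, the base-2 van der Corput radical inverse. This is a stand-in illustration, not Sobol's LP-tau algorithm:

```python
def van_der_corput(i, base=2):
    """Radical-inverse (van der Corput) sequence: mirror the digits
    of i about the radix point to get a point in [0, 1)."""
    q, denom = 0.0, 1.0
    while i > 0:
        denom *= base
        i, r = divmod(i, base)
        q += r / denom
    return q

# Successive points repeatedly bisect the unit interval:
# 1/2, 1/4, 3/4, 1/8, 5/8, 3/8, 7/8, 1/16, ...
points = [van_der_corput(i) for i in range(1, 9)]
```

LP-tau sequences extend this digit-scrambling idea to many dimensions via direction numbers, which is why the number of points is naturally a power of two (L = 2^M) and the implementation caps the dimension (51 here).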
Random walks in Euclidean space
Varjú, Péter Pál
2012-01-01
Consider a sequence of independent random isometries of Euclidean space with a previously fixed probability law. Apply these isometries successively to the origin and consider the sequence of random points that we obtain this way. We prove a local limit theorem under a suitable moment condition and a necessary non-degeneracy condition. Under stronger hypothesis, we prove a limit theorem on a wide range of scales: between e^(-cl^(1/4)) and l^(1/2), where l is the number of steps.
Aprendizaje supervisado mediante random forests
Molero del Río, María Cristina
2017-01-01
Many real-life problems can be modeled as classification problems, such as the early detection of diseases or the decision to grant credit to a given individual. Supervised Classification addresses this type of problem: it learns from a sample with the final objective of inferring future observations. Nowadays, a wide range of Supervised Classification techniques exists. In this work we focus on Random Forests. The Random Forest...
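A minimal working example of the technique the thesis studies, using scikit-learn's random forest on a standard dataset (iris stands in for any classification problem such as early disease detection; the split and hyperparameters are arbitrary illustrative choices):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Learn from a sample, then infer (classify) held-out observations.
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
score = clf.score(X_te, y_te)   # accuracy on unseen data
```

Each of the 100 trees is trained on a bootstrap resample with random feature subsets at each split, and the forest classifies by majority vote, which is what gives random forests their robustness over a single decision tree.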
Algebraic polynomials with random coefficients
Directory of Open Access Journals (Sweden)
K. Farahmand
2002-01-01
Full Text Available This paper provides an asymptotic value for the mathematical expected number of points of inflection of a random polynomial of the form a_0(ω) + a_1(ω)(n choose 1)^{1/2} x + a_2(ω)(n choose 2)^{1/2} x^2 + … + a_n(ω)(n choose n)^{1/2} x^n when n is large. The coefficients {a_j(ω)}_{j=0}^{n}, ω ∈ Ω, are assumed to be a sequence of independent normally distributed random variables with means zero and variance one, each defined on a fixed probability space (A, Ω, Pr). A special case of dependent coefficients is also studied.
Reserves Represented by Random Walks
International Nuclear Information System (INIS)
Filipe, J A; Ferreira, M A M; Andrade, M
2012-01-01
The reserves problem is studied through models based on random walks. Random walks are a classical particular case in the analysis of stochastic processes. They are not used only to study reserves evolution models; they are also used to build more complex systems and, from a theoretical standpoint, as instruments for analyzing other kinds of systems. In studying reserves, the main objective is to verify and guarantee that pension funds remain sustainable. Since the use of these models for this goal is a classical approach in the study of pension funds, this work draws conclusions about the reserves problem. A concrete example is presented.
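A toy version of the reserves-as-random-walk model above: the reserve level moves up or down one unit per period (contributions vs. payouts), and sustainability is read off as the fraction of simulated paths that never go negative. All parameters here are illustrative assumptions, not from the paper:

```python
import random

def reserve_survival(initial, n_steps, n_paths, rng=None):
    """Simulate the reserve as a symmetric +/-1 random walk and
    return the fraction of paths that stay non-negative throughout."""
    rng = rng or random.Random()
    survived = 0
    for _ in range(n_paths):
        level = initial
        for _ in range(n_steps):
            level += rng.choice((-1, 1))
            if level < 0:
                break                  # fund ruined on this path
        else:
            survived += 1
    return survived / n_paths

p = reserve_survival(initial=10, n_steps=100, n_paths=2000,
                     rng=random.Random(3))
```

Varying the initial reserve in this sketch shows the basic trade-off the reserves problem formalizes: a larger buffer raises the survival probability, at the cost of capital held idle.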
The Wasteland of Random Supergravities
Marsh, David; McAllister, Liam; Wrase, Timm
2011-01-01
We show that in a general 𝒩 = 1 supergravity with N ≫ 1 scalar fields, an exponentially small fraction of the de Sitter critical points are metastable vacua. Taking the superpotential and Kähler potential to be random functions, we construct a random matrix model for the Hessian matrix, which is well-approximated by the sum of a Wigner matrix and two Wishart matrices. We compute the eigenvalue spectrum analytically from the free convolution of the constituent spectra and find that in ...
Multipartite nonlocality and random measurements
de Rosier, Anna; Gruca, Jacek; Parisio, Fernando; Vértesi, Tamás; Laskowski, Wiesław
2017-07-01
We present an exhaustive numerical analysis of violations of local realism by families of multipartite quantum states. As an indicator of nonclassicality we employ the probability of violation for randomly sampled observables. Surprisingly, it rapidly increases with the number of parties or settings and even for relatively small values local realism is violated for almost all observables. We have observed this effect to be typical in the sense that it emerged for all investigated states including some with randomly drawn coefficients. We also present the probability of violation as a witness of genuine multipartite entanglement.
Bose condensation in (random) traps
Directory of Open Access Journals (Sweden)
V.A. Zagrebnov
2009-01-01
Full Text Available We study a non-interacting (perfect) Bose gas in random external potentials (traps). It is shown that a generalized Bose-Einstein condensation in the random eigenstates manifests itself if and only if the same occurs in the one-particle kinetic-energy eigenstates, which corresponds to the generalized condensation of the free Bose gas. Moreover, we prove that the amounts of both condensate densities are equal. This statement is relevant for the justification of the Bogoliubov approximation in the theory of disordered boson systems.
Random photonic crystal optical memory
International Nuclear Information System (INIS)
Wirth Lima Jr, A; Sombra, A S B
2012-01-01
Currently, optical cross-connects (OXCs) working in wavelength division multiplexing systems are based on optical-fiber delay-line buffering. We designed and analyzed a novel photonic crystal optical memory, which replaces the fiber delay lines of the current optical cross-connect buffer. Optical buffering systems based on random photonic crystal optical memory behave similarly to electronic buffering systems based on electronic RAM. In this paper, we show that OXCs with optical buffering based on random photonic crystal optical memories provide better performance than current optical cross-connects. (paper)
Phenotypic selection in natural populations: what limits directional selection?
Kingsolver, Joel G; Diamond, Sarah E
2011-03-01
Studies of phenotypic selection document directional selection in many natural populations. What factors reduce total directional selection and the cumulative evolutionary responses to selection? We combine two data sets for phenotypic selection, representing more than 4,600 distinct estimates of selection from 143 studies, to evaluate the potential roles of fitness trade-offs, indirect (correlated) selection, temporally varying selection, and stabilizing selection for reducing net directional selection and cumulative responses to selection. We detected little evidence that trade-offs among different fitness components reduced total directional selection in most study systems. Comparisons of selection gradients and selection differentials suggest that correlated selection frequently reduced total selection on size but not on other types of traits. The direction of selection on a trait often changes over time in many temporally replicated studies, but these fluctuations have limited impact in reducing cumulative directional selection in most study systems. Analyses of quadratic selection gradients indicated stabilizing selection on body size in at least some studies but provided little evidence that stabilizing selection is more common than disruptive selection for most traits or study systems. Our analyses provide little evidence that fitness trade-offs, correlated selection, or stabilizing selection strongly constrains the directional selection reported for most quantitative traits.
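The selection differentials and gradients that such surveys compare can be computed with the standard Lande-Arnold regression: differentials are covariances of relative fitness with each trait, gradients solve β = P⁻¹s. A minimal sketch with simulated data (the trait and fitness model here are illustrative assumptions, not data from the compiled studies):

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate two standardized phenotypic traits with correlation 0.7.
n = 5000
cov = np.array([[1.0, 0.7], [0.7, 1.0]])
z = rng.multivariate_normal([0, 0], cov, size=n)

# Fitness depends directly on trait 1 only.
W = np.exp(0.5 * z[:, 0])
w = W / W.mean()                       # relative fitness

# Selection differentials: covariance of relative fitness with each trait.
s = np.array([np.cov(w, z[:, i])[0, 1] for i in range(2)])

# Selection gradients: beta = P^{-1} s (Lande-Arnold multiple regression).
P = np.cov(z, rowvar=False)
beta = np.linalg.solve(P, s)

print("differentials s:", s.round(2))    # both traits appear selected
print("gradients beta:", beta.round(2))  # only trait 1 is a direct target
```

The nonzero differential on trait 2 despite a near-zero gradient is exactly the correlated selection that the survey finds can reduce total selection on traits like size.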
Ahlen, Johan; Lenhard, Fabian; Ghaderi, Ata
2015-12-01
Although under-diagnosed, anxiety and depression are among the most prevalent psychiatric disorders in children and adolescents, leading to severe impairment, increased risk of future psychiatric problems, and a high economic burden to society. Universal prevention may be a potent way to address these widespread problems. Universal interventions also have several benefits relative to targeted interventions, because there is limited knowledge as to how to screen for anxiety and depression in the general population. Earlier meta-analyses of the prevention of depression and anxiety symptoms among children suffer from methodological inadequacies, such as combining universal, selective, and indicated interventions in the same analyses, and comparing cluster-randomized trials with randomized trials without any correction for clustering effects. The present meta-analysis attempted to determine the effectiveness of universal interventions to prevent anxiety and depressive symptoms after correcting for clustering effects. A systematic search of randomized studies in PsycINFO, the Cochrane Library, and Google Scholar resulted in 30 eligible studies meeting the inclusion criteria, namely peer-reviewed, randomized or cluster-randomized trials of universal interventions for anxiety and depressive symptoms in school-aged children. Sixty-three percent of the studies reported outcome data regarding anxiety and 87% reported outcome data regarding depression. Seventy percent of the studies used randomization at the cluster level. There were small but significant effects regarding anxiety (.13) and depressive (.11) symptoms as measured at immediate posttest. At follow-up, which ranged from 3 to 48 months, effects were significantly larger than zero regarding depressive (.07) but not anxiety (.11) symptoms. There was no significant moderation effect of the following pre-selected variables: the primary aim of the intervention (anxiety or depression), deliverer of the intervention, gender distribution
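A common way to correct cluster-randomized trials before pooling effect sizes is the design-effect adjustment, DEFF = 1 + (m − 1)·ICC. The abstract does not state which correction the authors applied, so the sketch below is a generic illustration of the idea:

```python
def design_effect(avg_cluster_size, icc):
    """Design effect for cluster-randomized trials:
    DEFF = 1 + (m - 1) * ICC, where m is the average cluster size
    and ICC the intraclass correlation."""
    return 1.0 + (avg_cluster_size - 1.0) * icc

def effective_sample_size(n, avg_cluster_size, icc):
    """Sample size deflated for clustering; using the raw n would
    overstate the precision of a cluster-randomized trial."""
    return n / design_effect(avg_cluster_size, icc)

# A hypothetical school-based trial: 1200 pupils in classes of 25, ICC = 0.02.
deff = design_effect(25, 0.02)
n_eff = effective_sample_size(1200, 25, 0.02)
print(f"DEFF = {deff:.2f}, effective N = {n_eff:.0f}")
```

Even a small ICC shrinks the effective sample size substantially, which is why pooling cluster-randomized and individually randomized trials without correction inflates apparent precision.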
Randomized interpolative decomposition of separated representations
Biagioni, David J.; Beylkin, Daniel; Beylkin, Gregory
2015-01-01
We introduce an algorithm to compute tensor interpolative decomposition (dubbed CTD-ID) for the reduction of the separation rank of Canonical Tensor Decompositions (CTDs). Tensor ID selects, for a user-defined accuracy ɛ, a near optimal subset of terms of a CTD to represent the remaining terms via a linear combination of the selected terms. CTD-ID can be used as an alternative to or in combination with the Alternating Least Squares (ALS) algorithm. We present examples of its use within a convergent iteration to compute inverse operators in high dimensions. We also briefly discuss the spectral norm as a computational alternative to the Frobenius norm in estimating approximation errors of tensor ID. We reduce the problem of finding tensor IDs to that of constructing interpolative decompositions of certain matrices. These matrices are generated via randomized projection of the terms of the given tensor. We provide cost estimates and several examples of the new approach to the reduction of separation rank.
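The reduction to matrix interpolative decomposition via randomized projection can be sketched as follows; the greedy column selection here is a simple stand-in for the pivoted QR typically used, and the matrix is a generic illustration rather than one arising from a CTD:

```python
import numpy as np

rng = np.random.default_rng(1)

def pivoted_column_select(M, k):
    """Greedy pivoted Gram-Schmidt: repeatedly pick the column with
    the largest residual norm (a simple stand-in for pivoted QR)."""
    R = np.array(M, dtype=float)
    cols = []
    for _ in range(k):
        j = int(np.argmax(np.sum(R * R, axis=0)))
        cols.append(j)
        q = R[:, j] / np.linalg.norm(R[:, j])
        R -= np.outer(q, q @ R)
    return cols

def randomized_column_id(A, k, oversample=5):
    """Interpolative decomposition sketch: compress the rows of A with
    a small random projection, select k representative columns on the
    sketch, then express all columns through the selected ones."""
    sketch = rng.normal(size=(k + oversample, A.shape[0])) @ A
    cols = pivoted_column_select(sketch, k)
    T, *_ = np.linalg.lstsq(A[:, cols], A, rcond=None)
    return cols, T

# A rank-3 matrix disguised as 50 x 40.
A = rng.normal(size=(50, 3)) @ rng.normal(size=(3, 40))
cols, T = randomized_column_id(A, k=3)
err = np.linalg.norm(A - A[:, cols] @ T) / np.linalg.norm(A)
print(f"relative reconstruction error: {err:.2e}")
```

The selected columns play the role of the retained CTD terms, and T gives the linear combination representing the discarded ones.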
Simulation of selected genealogies.
Slade, P F
2000-02-01
Algorithms for generating genealogies with selection conditional on the sample configuration of n genes in one-locus, two-allele haploid and diploid models are presented. Enhanced integro-recursions using the ancestral selection graph, introduced by S. M. Krone and C. Neuhauser (1997, Theor. Popul. Biol. 51, 210-237), which is the non-neutral analogue of the coalescent, enables accessible simulation of the embedded genealogy. A Monte Carlo simulation scheme based on that of R. C. Griffiths and S. Tavaré (1996, Math. Comput. Modelling 23, 141-158), is adopted to consider the estimation of ancestral times under selection. Simulations show that selection alters the expected depth of the conditional ancestral trees, depending on a mutation-selection balance. As a consequence, branch lengths are shown to be an ineffective criterion for detecting the presence of selection. Several examples are given which quantify the effects of selection on the conditional expected time to the most recent common ancestor. Copyright 2000 Academic Press.
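For orientation, the neutral baseline against which such selective genealogies are compared is the Kingman coalescent, whose time to the most recent common ancestor is easy to simulate. This sketch omits the ancestral selection graph entirely and is not conditioned on a sample configuration:

```python
import random

def neutral_tmrca(n, rng):
    """Time to the most recent common ancestor of an n-sample under
    the neutral Kingman coalescent (time in units of N generations):
    while k lineages remain, they coalesce at rate k(k-1)/2."""
    t, k = 0.0, n
    while k > 1:
        rate = k * (k - 1) / 2.0
        t += rng.expovariate(rate)
        k -= 1
    return t

# Monte Carlo estimate of E[T_MRCA]; neutral theory gives 2 * (1 - 1/n).
n = 10
est = sum(neutral_tmrca(n, random.Random(i)) for i in range(20000)) / 20000
print(f"E[TMRCA] ~= {est:.3f} (theory: {2 * (1 - 1 / n):.3f})")
```

Departures of simulated conditional tree depths from this neutral expectation are precisely what the paper uses to quantify the effect of selection.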
DEFF Research Database (Denmark)
Jørgensen, Allan Grønlund; Larsen, Kasper Green
2011-01-01
Range selection is the problem of preprocessing an input array A of n unique integers, such that given a query (i, j, k), one can report the k'th smallest integer in the subarray A[i], A[i+1], ..., A[j]. In this paper we consider static data structures in the word-RAM for range selection and several natural special cases thereof. The first special case is known as range median, which arises when k is fixed to ⌊(j − i + 1)/2⌋. The second case, denoted prefix selection, arises when i is fixed to 0. Finally, we also consider the bounded rank prefix selection problem and the fixed rank range selection problem. In the former, data structures must support prefix selection queries under the assumption that k is at most some value, itself at most n, given at construction time, while in the latter, data structures must support range selection queries where k is fixed beforehand for all queries. We prove cell probe lower bounds...
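The queries can be stated concretely with a naive reference implementation; the paper's contribution is lower bounds for fast preprocessed data structures, not this brute-force version:

```python
def range_select(A, i, j, k):
    """Report the k'th smallest element (1-indexed) in A[i..j], both
    endpoints inclusive.  This naive query sorts the subarray; the
    data structures studied in the paper answer it much faster after
    preprocessing."""
    return sorted(A[i:j + 1])[k - 1]

def range_median(A, i, j):
    """Special case: k fixed to floor((j - i + 1) / 2)."""
    return range_select(A, i, j, (j - i + 1) // 2)

def prefix_select(A, j, k):
    """Special case: i fixed to 0."""
    return range_select(A, 0, j, k)

A = [7, 2, 9, 4, 1, 8]
print(range_select(A, 1, 4, 2), range_median(A, 1, 4), prefix_select(A, 3, 1))
```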
DNA-based random number generation in security circuitry.
Gearheart, Christy M; Arazi, Benjamin; Rouchka, Eric C
2010-06-01
DNA-based circuit design is an area of research in which traditional silicon-based technologies are replaced by naturally occurring phenomena taken from biochemistry and molecular biology. This research focuses on further developing DNA-based methodologies to mimic digital data manipulation. While exhibiting fundamental principles, this work was done in conjunction with the vision that DNA-based circuitry, when the technology matures, will form the basis for a tamper-proof security module, revolutionizing the meaning and concept of tamper-proofing and possibly preventing it altogether based on accurate scientific observations. A paramount part of such a solution would be self-generation of random numbers. A novel prototype schema employs solid phase synthesis of oligonucleotides for random construction of DNA sequences; temporary storage and retrieval is achieved through plasmid vectors. A discussion of how to evaluate sequence randomness is included, as well as how these techniques are applied to a simulation of the random number generation circuitry. Simulation results show generated sequences successfully pass three selected NIST random number generation tests specified for security applications.
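One of the NIST SP 800-22 tests such generated sequences must pass is the frequency (monobit) test. A sketch follows; the purine/pyrimidine bit encoding is an illustrative assumption, not the paper's scheme:

```python
import math

def monobit_test(bits):
    """NIST SP 800-22 frequency (monobit) test: the proportion of
    ones in a random sequence should be close to 1/2.  Returns the
    p-value; p >= 0.01 is the usual pass criterion."""
    n = len(bits)
    s = abs(sum(1 if b else -1 for b in bits))
    return math.erfc(s / math.sqrt(2.0 * n))

# Map a DNA sequence to bits; here (an assumption for illustration)
# purines (A, G) -> 1 and pyrimidines (C, T) -> 0.
seq = "AGCTTAGCCGATAGCTAGGCTAACGTATGCCA" * 8
bits = [1 if base in "AG" else 0 for base in seq]
p = monobit_test(bits)
print(f"monobit p-value: {p:.3f}, pass: {p >= 0.01}")
```

The full NIST suite layers many further tests (runs, block frequency, entropy) on top of this basic check.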
Randomized Oversampling for Generalized Multiscale Finite Element Methods
Calo, Victor M.
2016-03-23
In this paper, we develop efficient multiscale methods for flows in heterogeneous media. We use the generalized multiscale finite element (GMsFEM) framework. GMsFEM approximates the solution space locally using a few multiscale basis functions. This approximation selects an appropriate snapshot space and a local spectral decomposition, e.g., the use of oversampled regions, in order to achieve an efficient model reduction. However, the successful construction of snapshot spaces may be costly if too many local problems need to be solved in order to obtain these spaces. We use a moderate quantity of local solutions (or snapshot vectors) with random boundary conditions on oversampled regions with zero forcing to deliver an efficient methodology. Motivated by the randomized algorithm presented in [P. G. Martinsson, V. Rokhlin, and M. Tygert, A Randomized Algorithm for the approximation of Matrices, YALEU/DCS/TR-1361, Yale University, 2006], we consider a snapshot space which consists of harmonic extensions of random boundary conditions defined in a domain larger than the target region. Furthermore, we perform an eigenvalue decomposition in this small space. We study the application of randomized sampling for GMsFEM in conjunction with adaptivity, where local multiscale spaces are adaptively enriched. Convergence analysis is provided. We present representative numerical results to validate the method proposed.
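The cited randomized algorithm underlying the snapshot construction can be sketched as a randomized range finder: probe the operator with a few random vectors and orthonormalize the responses. This is a generic matrix analogue, not the GMsFEM implementation with harmonic extensions:

```python
import numpy as np

rng = np.random.default_rng(7)

def randomized_range_finder(A, k, oversample=5):
    """Randomized range approximation in the spirit of Martinsson,
    Rokhlin & Tygert: apply A to a few random vectors (analogous to
    random boundary conditions) and orthonormalize the results."""
    Y = A @ rng.normal(size=(A.shape[1], k + oversample))
    Q, _ = np.linalg.qr(Y)
    return Q

# A numerically low-rank stand-in for a local solution operator.
U = rng.normal(size=(200, 8))
A = U @ rng.normal(size=(8, 200))
Q = randomized_range_finder(A, k=8)
err = np.linalg.norm(A - Q @ (Q.T @ A)) / np.linalg.norm(A)
print(f"relative range-approximation error: {err:.2e}")
```

A handful of random probes captures the range of a low-rank operator, which is why far fewer local solves suffice for the snapshot space.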
Effects of Random Values for Particle Swarm Optimization Algorithm
Directory of Open Access Journals (Sweden)
Hou-Ping Dai
2018-02-01
Full Text Available Particle swarm optimization (PSO) algorithms are generally improved by adaptively adjusting the inertia weight or combining with other evolutionary algorithms. However, in most modified PSO algorithms, the random values are always generated by a uniform distribution in the range of [0, 1]. In this study, random values generated by uniform distributions in the ranges [0, 1] and [−1, 1], and by a Gauss distribution with mean 0 and variance 1 (U[0,1], U[−1,1], and G(0,1)), are respectively used in the standard PSO and linear decreasing inertia weight (LDIW) PSO algorithms. For comparison, the deterministic PSO algorithm, in which the random values are set as 0.5, is also investigated in this study. Some benchmark functions and the pressure vessel design problem are selected to test these algorithms with different types of random values in three space dimensions (10, 30, and 100). The experimental results show that the standard PSO and LDIW-PSO algorithms with random values generated by U[−1,1] or G(0,1) are more likely to avoid falling into local optima and quickly obtain the global optima. This is because the large-scale random values can expand the range of particle velocity, making a particle more likely to escape from local optima and obtain the global optima. Although the random values generated by U[−1,1] or G(0,1) are beneficial to improving the global searching ability, the local searching ability for a low-dimensional practical optimization problem may be decreased due to the finite number of particles.
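A minimal LDIW-PSO sketch on the sphere benchmark shows where the random factors r1, r2 plug in; the coefficients, swarm size, and iteration budget below are illustrative assumptions, not the paper's experimental setup:

```python
import numpy as np

def ldiw_pso(rand, dim=10, n_particles=30, iters=300, seed=0):
    """Linear-decreasing-inertia-weight PSO on the sphere function,
    with the r1, r2 random factors drawn by `rand(rng, shape)`."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.sum(x**2, axis=1)
    g = pbest[np.argmin(pbest_f)].copy()
    for t in range(iters):
        w = 0.9 - 0.5 * t / iters          # inertia weight: 0.9 -> 0.4
        r1, r2 = rand(rng, x.shape), rand(rng, x.shape)
        v = w * v + 2.0 * r1 * (pbest - x) + 2.0 * r2 * (g - x)
        x = x + v
        f = np.sum(x**2, axis=1)
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[np.argmin(pbest_f)].copy()
    return pbest_f.min()

# Swap in the distributions the paper compares.
u01 = ldiw_pso(lambda rng, s: rng.uniform(0, 1, s))
u11 = ldiw_pso(lambda rng, s: rng.uniform(-1, 1, s))
print(f"best f: U[0,1] -> {u01:.2e}, U[-1,1] -> {u11:.2e}")
```

Passing `lambda rng, s: rng.normal(0, 1, s)` gives the G(0,1) variant, and `lambda rng, s: np.full(s, 0.5)` the deterministic one.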
Groupies in random bipartite graphs
Yilun Shang
2010-01-01
A vertex $v$ of a graph $G$ is called a groupie if its degree is not less than the average of the degrees of its neighbors. In this paper we study the influence of the bipartition $(B_1,B_2)$ on groupies in random bipartite graphs $G(B_1,B_2,p)$, with both fixed $p$ and $p$ tending to zero.
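The definition can be checked directly on sampled graphs; in this small sketch the part sizes and edge probability are arbitrary choices for illustration:

```python
import random

def groupies(adj):
    """Count groupie vertices: degree not less than the average
    degree of their neighbours (isolated vertices are skipped)."""
    count = 0
    for v, nbrs in adj.items():
        if not nbrs:
            continue
        avg = sum(len(adj[u]) for u in nbrs) / len(nbrs)
        if len(nbrs) >= avg:
            count += 1
    return count

def random_bipartite(n1, n2, p, seed=0):
    """G(B1, B2, p): each of the n1 * n2 possible edges between the
    two parts appears independently with probability p."""
    rng = random.Random(seed)
    adj = {("a", i): set() for i in range(n1)}
    adj.update({("b", j): set() for j in range(n2)})
    for i in range(n1):
        for j in range(n2):
            if rng.random() < p:
                adj[("a", i)].add(("b", j))
                adj[("b", j)].add(("a", i))
    return adj

# Unbalanced parts: high-degree side vertices are almost all groupies.
g = random_bipartite(50, 200, 0.1)
print(f"groupies: {groupies(g)} of {len(g)} vertices")
```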
Stalled ERP at Random Textiles
Brumberg, Robert; Kops, Eric; Little, Elizabeth; Gamble, George; Underbakke, Jesse; Havelka, Douglas
2016-01-01
Andre Raymond, Executive Vice President of Sales and Marketing for Random Textiles Co. Inc. (RTC), stood in front of the podium to address his team of 70 sales consultants in Las Vegas, NV. The organization had increased market share and achieved record sales over the past three years; however, in the shadow of this success lurked an obstacle that…
Fatigue Reliability under Random Loads
DEFF Research Database (Denmark)
Talreja, R.
1979-01-01
We consider the problem of estimating the probability of survival (non-failure) and the probability of safe operation (strength greater than a limiting value) of structures subjected to random loads. These probabilities are formulated in terms of the probability distributions of the loads... propagation stage. The consequences of this behaviour on the fatigue reliability are discussed...
Privacy preserving randomized gossip algorithms
Hanzely, Filip; Konečný , Jakub; Loizou, Nicolas; Richtarik, Peter; Grishchenko, Dmitry
2017-01-01
In this work we present three different randomized gossip algorithms for solving the average consensus problem while at the same time protecting the information about the initial private values stored at the nodes. We give iteration complexity bounds for all methods, and perform extensive numerical experiments.
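The average consensus problem itself can be illustrated with plain pairwise gossip on a complete graph. Note that this sketch has no privacy protection, which is the paper's actual contribution:

```python
import random

def randomized_gossip(values, n_steps=20000, seed=0):
    """Pairwise randomized gossip for average consensus: at each step
    a random pair of nodes replace both of their values with the
    pair's average.  Every update preserves the global sum, so all
    values converge to the network-wide mean."""
    rng = random.Random(seed)
    x = list(values)
    n = len(x)
    for _ in range(n_steps):
        i, j = rng.sample(range(n), 2)
        x[i] = x[j] = (x[i] + x[j]) / 2.0
    return x

initial = [3.0, 7.0, 1.0, 9.0, 5.0]
final = randomized_gossip(initial)
print(final)  # every entry close to the mean of the initial values
```

The privacy issue the paper addresses is visible here: each exchange reveals a node's current value to its partner, so the protocol must be modified to hide the initial private values.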
Random packing of digitized particles
Korte, de A.C.J.; Brouwers, H.J.H.
2013-01-01
The random packing of regularly and irregularly shaped particles has been studied extensively. Within this paper, packing is studied from the perspective of digitized particles. These digitized particles are developed for and used in cellular automata systems, which are employed for the simple
Alternative model of random surfaces
International Nuclear Information System (INIS)
Ambartzumian, R.V.; Sukiasian, G.S.; Savvidy, G.K.; Savvidy, K.G.
1992-01-01
We analyse models of triangulated random surfaces and demand that geometrically nearby configurations of these surfaces must have close actions. The inclusion of this principle drives us to suggest a new action, which is a modified Steiner functional. General arguments, based on the Minkowski inequality, show that the maximal contribution to the partition function comes from surfaces close to the sphere. (orig.)
Chaos, complexity, and random matrices
Cotler, Jordan; Hunter-Jones, Nicholas; Liu, Junyu; Yoshida, Beni
2017-11-01
Chaos and complexity entail an entropic and computational obstruction to describing a system, and thus are intrinsically difficult to characterize. In this paper, we consider time evolution by Gaussian Unitary Ensemble (GUE) Hamiltonians and analytically compute out-of-time-ordered correlation functions (OTOCs) and frame potentials to quantify scrambling, Haar-randomness, and circuit complexity. While our random matrix analysis gives a qualitatively correct prediction of the late-time behavior of chaotic systems, we find unphysical behavior at early times including an O(1) scrambling time and the apparent breakdown of spatial and temporal locality. The salient feature of GUE Hamiltonians which gives us computational traction is the Haar-invariance of the ensemble, meaning that the ensemble-averaged dynamics look the same in any basis. Motivated by this property of the GUE, we introduce k-invariance as a precise definition of what it means for the dynamics of a quantum system to be described by random matrix theory. We envision that the dynamical onset of approximate k-invariance will be a useful tool for capturing the transition from early-time chaos, as seen by OTOCs, to late-time chaos, as seen by random matrix theory.
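A small numerical sketch of the GUE side of this analysis: the ensemble-averaged 2-point spectral form factor equals 1 at t = 0 and plateaus near 1/N at late times. The normalization conventions here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

def gue(n):
    """Sample an n x n Hermitian matrix from the Gaussian Unitary
    Ensemble (up to overall normalization conventions)."""
    A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return (A + A.conj().T) / 2.0

def form_factor(t, n=32, samples=200):
    """Ensemble-averaged 2-point spectral form factor
    <|Tr exp(-i H t)|^2> / n^2 for GUE Hamiltonians."""
    vals = []
    for _ in range(samples):
        eig = np.linalg.eigvalsh(gue(n))
        z = np.sum(np.exp(-1j * eig * t))
        vals.append(abs(z) ** 2)
    return np.mean(vals) / n**2

early, late = form_factor(0.0), form_factor(50.0)
print(f"form factor: t=0 -> {early:.3f}, t=50 -> {late:.4f}")
```

The decay from 1 toward the late-time plateau of order 1/n is the slope-ramp-plateau structure that diagnoses random-matrix behavior in chaotic dynamics.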
Random packing of digitized particles
de Korte, A.C.J.; Brouwers, Jos
2012-01-01
The random packing of regularly and irregularly shaped particles has been studied extensively. Within this paper, packing is studied from the perspective of digitized particles. These digitized particles are developed for and used in cellular automata systems, which are employed for the simple
Thermophoresis as persistent random walk
International Nuclear Information System (INIS)
Plyukhin, A.V.
2009-01-01
In a simple model of a continuous random walk a particle moves in one dimension with the velocity fluctuating between +v and -v. If v is associated with the thermal velocity of a Brownian particle and allowed to be position dependent, the model accounts readily for the particle's drift along the temperature gradient and recovers basic results of the conventional thermophoresis theory.
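The underlying telegraph process is easy to simulate. This sketch keeps v constant, whereas the thermophoretic drift in the paper arises when v is position dependent; the parameter values are arbitrary:

```python
import random

def persistent_walk(steps=200, v=1.0, flip_rate=0.5, dt=0.05, seed=1):
    """Persistent (telegraph) random walk in one dimension: the
    particle moves with velocity +v or -v and reverses direction
    at Poisson-distributed random times."""
    rng = random.Random(seed)
    x, s = 0.0, 1
    path = [x]
    for _ in range(steps):
        if rng.random() < flip_rate * dt:   # direction-reversal event
            s = -s
        x += s * v * dt
        path.append(x)
    return path

path = persistent_walk()
print(f"final position: {path[-1]:+.2f} after {len(path) - 1} steps")
```

Replacing the constant `v` with a function `v(x)` that mimics a local thermal velocity reproduces the position-dependent setting of the model, in which a temperature gradient induces a net drift.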
Randomized Item Response Theory Models
Fox, Gerardus J.A.
2005-01-01
The randomized response (RR) technique is often used to obtain answers to sensitive questions. A new method is developed to measure latent variables using the RR technique, because direct questioning leads to biased results. Within the RR technique, the probability of the true response is modeled by
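The classic Warner randomized response scheme illustrates how a population proportion can be recovered even though no individual answer is informative. This is the generic RR technique, not the randomized item response theory model the abstract develops:

```python
import random

def warner_rr_survey(true_answers, p=0.7, seed=0):
    """Warner's randomized response: with probability p the respondent
    answers the sensitive question truthfully, otherwise answers the
    negation.  No single answer reveals the respondent's truth."""
    rng = random.Random(seed)
    return [a if rng.random() < p else (not a) for a in true_answers]

def estimate_prevalence(responses, p=0.7):
    """Unbiased estimate of the true proportion from RR answers:
    pi_hat = (lambda_hat - (1 - p)) / (2p - 1)."""
    lam = sum(responses) / len(responses)
    return (lam - (1.0 - p)) / (2.0 * p - 1.0)

# 10000 respondents, 20% truly in the sensitive category.
truth = [i < 2000 for i in range(10000)]
pi_hat = estimate_prevalence(warner_rr_survey(truth))
print(f"estimated prevalence: {pi_hat:.3f}")
```

The randomized IRT models in the paper embed this same truth-perturbation probability inside a latent variable model rather than estimating a single proportion.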
Qubits in a random environment
International Nuclear Information System (INIS)
Akhalwaya, I; Fannes, M; Petruccione, F
2007-01-01
Decoherence phenomena in a small quantum system coupled to a complex environment can be modelled with random matrices. We propose a simple deterministic model in the limit of a high-dimensional environment. The model is investigated numerically and some analytically addressable questions are singled out.
Survey of random surface theory
International Nuclear Information System (INIS)
Froehlich, J.
1985-01-01
The author describes some recent results in random surface theory. Attention is focused on those developments which are relevant for a quantum theory of strings. Some general remarks on the status of mathematical quantum field theory are included at the beginning. (orig.)
Privacy preserving randomized gossip algorithms
Hanzely, Filip
2017-06-23
In this work we present three different randomized gossip algorithms for solving the average consensus problem while at the same time protecting the information about the initial private values stored at the nodes. We give iteration complexity bounds for all methods, and perform extensive numerical experiments.
Algorithmic randomness and physical entropy
International Nuclear Information System (INIS)
Zurek, W.H.
1989-01-01
Algorithmic randomness provides a rigorous, entropylike measure of disorder of an individual, microscopic, definite state of a physical system. It is defined by the size (in binary digits) of the shortest message specifying the microstate uniquely up to the assumed resolution. Equivalently, algorithmic randomness can be expressed as the number of bits in the smallest program for a universal computer that can reproduce the state in question (for instance, by plotting it with the assumed accuracy). In contrast to the traditional definitions of entropy, algorithmic randomness can be used to measure disorder without any recourse to probabilities. Algorithmic randomness is typically very difficult to calculate exactly but relatively easy to estimate. In large systems, probabilistic ensemble definitions of entropy (e.g., coarse-grained entropy of Gibbs and Boltzmann's entropy H=lnW, as well as Shannon's information-theoretic entropy) provide accurate estimates of the algorithmic entropy of an individual system or its average value for an ensemble. One is thus able to rederive much of thermodynamics and statistical mechanics in a setting very different from the usual. Physical entropy, I suggest, is a sum of (i) the missing information measured by Shannon's formula and (ii) of the algorithmic information content---algorithmic randomness---present in the available data about the system. This definition of entropy is essential in describing the operation of thermodynamic engines from the viewpoint of information gathering and using systems. These Maxwell demon-type entities are capable of acquiring and processing information and therefore can ''decide'' on the basis of the results of their measurements and computations the best strategy for extracting energy from their surroundings. From their internal point of view the outcome of each measurement is definite
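Although algorithmic randomness is uncomputable exactly, compressed size gives the kind of practical upper-bound estimate the abstract alludes to ("difficult to calculate exactly but relatively easy to estimate"). A sketch using a general-purpose compressor as the stand-in for the shortest program:

```python
import random
import zlib

def compression_ratio(data: bytes) -> float:
    """Crude upper-bound proxy for algorithmic information content:
    compressed size over original size.  Orderly data compresses
    well; algorithmically random data does not."""
    return len(zlib.compress(data, 9)) / len(data)

orderly = b"AB" * 4096                      # highly regular microstate
random.seed(0)
noisy = bytes(random.getrandbits(8) for _ in range(8192))  # incompressible

print(f"orderly: {compression_ratio(orderly):.3f}, "
      f"random: {compression_ratio(noisy):.3f}")
```

The orderly string has low estimated algorithmic content, the random one close to maximal, mirroring the two contributions (missing information plus algorithmic content of the known data) in the proposed definition of physical entropy.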
Random scalar fields and hyperuniformity
Ma, Zheng; Torquato, Salvatore
2017-06-01
Disordered many-particle hyperuniform systems are exotic amorphous states of matter that lie between crystals and liquids. Hyperuniform systems have attracted recent attention because they are endowed with novel transport and optical properties. Recently, the hyperuniformity concept has been generalized to characterize two-phase media, scalar fields, and random vector fields. In this paper, we devise methods to explicitly construct hyperuniform scalar fields. Specifically, we analyze spatial patterns generated from Gaussian random fields, which have been used to model the microwave background radiation and heterogeneous materials, the Cahn-Hilliard equation for spinodal decomposition, and Swift-Hohenberg equations that have been used to model emergent pattern formation, including Rayleigh-Bénard convection. We show that the Gaussian random scalar fields can be constructed to be hyperuniform. We also numerically study the time evolution of spinodal decomposition patterns and demonstrate that they are hyperuniform in the scaling regime. Moreover, we find that labyrinth-like patterns generated by the Swift-Hohenberg equation are effectively hyperuniform. We show that thresholding (level-cutting) a hyperuniform Gaussian random field to produce a two-phase random medium tends to destroy the hyperuniformity of the progenitor scalar field. We then propose guidelines to achieve effectively hyperuniform two-phase media derived from thresholded non-Gaussian fields. Our investigation paves the way for new research directions to characterize the large-structure spatial patterns that arise in physics, chemistry, biology, and ecology. Moreover, our theoretical results are expected to guide experimentalists to synthesize new classes of hyperuniform materials with novel physical properties via coarsening processes and using state-of-the-art techniques, such as stereolithography and 3D printing.
Why the null matters: statistical tests, random walks and evolution.
Sheets, H D; Mitchell, C E
2001-01-01
A number of statistical tests have been developed to determine what type of dynamics underlie observed changes in morphology in evolutionary time series, based on the pattern of change within the time series. The theory of the 'scaled maximum', the 'log-rate-interval' (LRI) method, and the Hurst exponent all operate on the same principle of comparing the maximum change, or rate of change, in the observed dataset to the maximum change expected of a random walk. Less change in a dataset than expected of a random walk has been interpreted as indicating stabilizing selection, while more change implies directional selection. The 'runs test' in contrast, operates on the sequencing of steps, rather than on excursion. Applications of these tests to computer generated, simulated time series of known dynamical form and various levels of additive noise indicate that there is a fundamental asymmetry in the rate of type II errors of the tests based on excursion: they are all highly sensitive to noise in models of directional selection that result in a linear trend within a time series, but are largely noise immune in the case of a simple model of stabilizing selection. Additionally, the LRI method has a lower sensitivity than originally claimed, due to the large range of LRI rates produced by random walks. Examination of the published results of these tests show that they have seldom produced a conclusion that an observed evolutionary time series was due to directional selection, a result which needs closer examination in light of the asymmetric response of these tests.
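The excursion-based logic of these tests can be sketched as a Monte Carlo comparison against a matched random-walk null; this is a simplified stand-in for the scaled-maximum and LRI machinery, with illustrative simulated data:

```python
import random

def max_excursion(series):
    """Largest absolute departure from the starting value."""
    return max(abs(x - series[0]) for x in series)

def excursion_p_value(series, n_sims=2000, seed=0):
    """Monte Carlo test of the random-walk null: simulate unbiased
    walks with the observed step variance and ask how often their
    maximum excursion reaches that of the data.  A small p suggests
    directional change; excursions smaller than the null have been
    read as stabilizing selection."""
    rng = random.Random(seed)
    steps = [b - a for a, b in zip(series, series[1:])]
    sd = (sum(s * s for s in steps) / len(steps)) ** 0.5
    obs = max_excursion(series)
    hits = 0
    for _ in range(n_sims):
        x, walk = 0.0, [0.0]
        for _ in steps:
            x += rng.gauss(0.0, sd)
            walk.append(x)
        if max_excursion(walk) >= obs:
            hits += 1
    return hits / n_sims

# A strong linear trend produces a far larger excursion than the null.
trend = [0.1 * t for t in range(50)]
noisy_trend = [x + random.Random(t).gauss(0, 0.05)
               for t, x in enumerate(trend)]
p_trend = excursion_p_value(noisy_trend)
print(f"p-value under random-walk null: {p_trend:.3f}")
```

The asymmetry the paper documents shows up here too: added observational noise inflates the excursion of a trend-free series, so directional dynamics are flagged spuriously much more readily than stabilizing dynamics are missed.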
Selection of Celebrity Endorsers
DEFF Research Database (Denmark)
Hollensen, Svend; Schimmelpfennig, Christian
2013-01-01
Purpose – This research aims at shedding some light on the various avenues marketers can undertake until finally an endorsement contract is signed. The focus of the study lies on verifying the generally held assumption that endorser selection is usually taken care of by creative agencies, which vet several candidates by means of subtle evaluation procedures. Design/methodology/approach – Case study research was carried out among companies experienced in celebrity endorsements to learn more about the endorser selection process in practice. Based on these cases, theory is inductively developed. Findings – Our research suggests that the generally held assumption that endorsers are selected and thoroughly vetted by a creative agency may not be universally valid. A normative model to illustrate the continuum of the selection process in practice is suggested, and the two polar case studies (Swiss brand...
Feature Selection by Reordering
Czech Academy of Sciences Publication Activity Database
Jiřina, Marcel; Jiřina jr., M.
2005-01-01
Roč. 2, č. 1 (2005), s. 155-161 ISSN 1738-6438 Institutional research plan: CEZ:AV0Z10300504 Keywords : feature selection * data reduction * ordering of features Subject RIV: BA - General Mathematics
Selective information sampling
Directory of Open Access Journals (Sweden)
Peter A. F. Fraser-Mackenzie
2009-06-01
Full Text Available This study investigates the amount and valence of information selected during single-item evaluation. One hundred and thirty-five participants evaluated a cell phone by reading hypothetical customer reports. Some participants were first asked to provide a preliminary rating based on a picture of the phone and some technical specifications. The participants who were given the customer reports only after they made a preliminary rating exhibited valence bias in their selection of customer reports. In contrast, the participants who did not make an initial rating sought subsequent information in a more balanced, albeit still selective, manner. The preliminary raters used the least amount of information in their final decision, resulting in faster decision times. The study appears to support the notion that selective exposure is utilized in order to develop cognitive coherence.
Directory of Open Access Journals (Sweden)
Piotr FOLĘGA
2014-03-01
Full Text Available The variety of harmonic drive types and sizes currently in production makes their rational selection difficult. A properly selected harmonic drive must meet certain requirements during operation and achieve the anticipated service life. The paper discusses the problems associated with the selection of a harmonic drive and presents an algorithm for its correct selection. The main objective of this study was to develop a computer program that supports the correct choice of a harmonic drive using the developed algorithm.
Optimization methods for activities selection problems
Mahad, Nor Faradilah; Alias, Suriana; Yaakop, Siti Zulaika; Arshad, Norul Amanina Mohd; Mazni, Elis Sofia
2017-08-01
Co-curriculum activities must be joined by every student in Malaysia, and these activities bring a lot of benefits to the students. By joining these activities, the students can learn about time management and develop many useful skills. This project focuses on the selection of co-curriculum activities in secondary school using optimization methods, namely the Analytic Hierarchy Process (AHP) and Zero-One Goal Programming (ZOGP). A secondary school in Negeri Sembilan, Malaysia was chosen as a case study. A set of questionnaires was distributed randomly to calculate the weight for each activity based on the 3 chosen criteria, which are soft skills, interesting activities, and performance. The weights were calculated by using AHP, and the results showed that the most important criterion is soft skills. Then, the ZOGP model was analyzed by using LINGO Software version 15.0. There are two priorities to be considered. The first priority, which is to minimize the budget for the activities, is achieved, since the total budget can be reduced by RM233.00. Therefore, the total budget to implement the selected activities is RM11,195.00. The second priority, which is to select the co-curriculum activities, is also achieved. The results showed that 9 out of 15 activities were selected. Thus, it can be concluded that the AHP and ZOGP approaches can be used as optimization methods for the activity selection problem.
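The AHP weighting step can be sketched as a principal-eigenvector computation on a pairwise comparison matrix; the judgments below are hypothetical stand-ins, not the questionnaire data from the study:

```python
import numpy as np

def ahp_weights(pairwise):
    """Approximate AHP priority weights as the principal eigenvector
    of the pairwise comparison matrix, via power iteration with
    normalization at each step."""
    M = np.asarray(pairwise, dtype=float)
    w = np.ones(M.shape[0])
    for _ in range(100):
        w = M @ w
        w /= w.sum()
    return w

# Hypothetical judgments for soft skills vs. interest vs. performance;
# an entry of 3 means the row criterion is moderately more important.
pairwise = [[1.0,     3.0, 5.0],
            [1 / 3.0, 1.0, 2.0],
            [1 / 5.0, 1 / 2.0, 1.0]]
w = ahp_weights(pairwise)
print("criterion weights:", w.round(3))  # soft skills dominate
```

These normalized weights would then enter the ZOGP objective that trades off budget against the weighted value of the selected activities.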
Energy Technology Data Exchange (ETDEWEB)
NONE
1998-08-01
Spectrally selective glazing is window glass that permits some portions of the solar spectrum to enter a building while blocking others. This high-performance glazing admits as much daylight as possible while preventing transmission of as much solar heat as possible. By controlling solar heat gains in summer, preventing loss of interior heat in winter, and allowing occupants to reduce electric lighting use by making maximum use of daylight, spectrally selective glazing significantly reduces building energy consumption and peak demand. Because new spectrally selective glazings can have a virtually clear appearance, they admit more daylight and permit much brighter, more open views to the outside while still providing the solar control of the dark, reflective energy-efficient glass of the past. This Federal Technology Alert provides detailed information and procedures for Federal energy managers to consider spectrally selective glazings. The principle of spectrally selective glazings is explained. Benefits related to energy efficiency and other architectural criteria are delineated. Guidelines are provided for appropriate application of spectrally selective glazing, and step-by-step instructions are given for estimating energy savings. Case studies are also presented to illustrate actual costs and energy savings. Current manufacturers, technology users, and references for further reading are included for users who have questions not fully addressed here.
Selection in artistic gymnastics
Directory of Open Access Journals (Sweden)
Maria Olaru
2009-06-01
Full Text Available This study envisages the analysis of the specific aspects of the selection process in artistic gymnastics, focusing particularly on the selection in Romania in recent years. In our opinion, the shift of artistic gymnastics, an extremely popular sport in our country 20 years ago, into a cone of darkness is also rooted in the orientation of children to other fields – unfortunately many of them outside sports and physical activities in general. In the present study, we present the stages of selection in artistic gymnastics, as its importance for subsequent performance was proven a long time ago, together with the plethora of qualities and skills which are necessary to select a child for gymnastics and those that this sport develops when performed as a spare-time activity. The case studied in this endeavour is one of the main centers for gymnast recruitment in Romania; the attention paid by the trainers to the selection for this sport makes the data regarding the number of children involved increase once more. This is a satisfactory fact, as it is well known that a wide-ranging primary selection sets a serious basis for the secondary and tertiary selection, envisaging future performance, and concurrently ensures the involvement of more children in a physical activity that will prepare them, both physically and mentally, for a healthy life.
Strong Selective Adsorption of Polymers.
Ge, Ting; Rubinstein, Michael
2015-06-09
A scaling theory is developed for selective adsorption of polymers induced by the strong binding between specific monomers and complementary surface adsorption sites. By "selective" we mean specific attraction between a subset of all monomers, called "sticky", and a subset of surface sites, called "adsorption sites". We demonstrate that, in addition to the expected dependence on the polymer volume fraction ϕ_bulk in the bulk solution, selective adsorption strongly depends on the ratio between two characteristic length scales: the root-mean-square distance l between neighboring sticky monomers along the polymer, and the average distance d between neighboring surface adsorption sites. The role of the ratio l/d arises from the fact that a polymer needs to deform to enable the spatial commensurability between its sticky monomers and the surface adsorption sites for selective adsorption. We study strong selective adsorption of both telechelic polymers, with two sticky end monomers, and multisticker polymers, with many sticky monomers between sticky ends. For telechelic polymers, we identify four adsorption regimes at l/d < 1. For l/d > 1, we expect that the adsorption layer at exponentially low ϕ_bulk consists of separated unstretched loops, while as ϕ_bulk increases the layer crosses over to a brush of extended loops with a second layer of weakly overlapping tails. For multisticker chains, in the limit of exponentially low ϕ_bulk, adsorbed polymers are well separated from each other. As l/d increases, the conformation of an individual polymer changes from a single-end-adsorbed "mushroom" to a random walk of loops. For high ϕ_bulk, adsorbed polymers at small l/d are mushrooms that cover all the adsorption sites. At sufficiently large l/d, adsorbed multisticker polymers strongly overlap. We anticipate the formation of a self-similar carpet and, with increasing l/d, a two-layer structure with a brush of loops covered by a self-similar carpet. As l/d exceeds the