WorldWideScience

Sample records for randomly selected point

  1. Comparative Evaluations of Randomly Selected Four Point-of-Care Glucometer Devices in Addis Ababa, Ethiopia.

    Science.gov (United States)

    Wolde, Mistire; Tarekegn, Getahun; Kebede, Tedla

    2018-05-01

    Point-of-care glucometer (PoCG) devices play a significant role in self-monitoring of the blood sugar level, particularly in the follow-up of high blood sugar therapeutic response. The aim of this study was to evaluate blood glucose test results performed with four randomly selected glucometers on diabetes and control subjects versus standard wet chemistry (hexokinase) methods in Addis Ababa, Ethiopia. A prospective cross-sectional study was conducted on 200 randomly selected study participants (100 participants with diabetes and 100 healthy controls). Four randomly selected PoCG devices (CareSens N, DIAVUE Prudential, On Call Extra, i-QARE DS-W) were evaluated against the hexokinase method and the ISO 15197:2003 and ISO 15197:2013 standards. The minimum and maximum blood sugar values were recorded by CareSens N (21 mg/dl) and the hexokinase method (498.8 mg/dl), respectively. The mean sugar values of all PoCG devices except On Call Extra showed significant differences compared with the reference hexokinase method. Meanwhile, all four PoCG devices had a strong positive correlation (>80%) with the reference method (hexokinase). On the other hand, none of the four PoCG devices fulfilled the minimum accuracy requirements set by the ISO 15197:2003 and ISO 15197:2013 standards. In addition, linear regression analysis revealed that all four selected PoCG devices overestimated the glucose concentrations. Overall, the measurements of the four selected PoCG devices correlated poorly with the standard reference method. Therefore, before introducing PoCG devices to the market, there should be a standardized evaluation platform for validation. Further similar large-scale studies on other PoCG devices also need to be undertaken.
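
    The ISO 15197 comparison described above reduces to a mechanical check once meter and reference readings are paired. Below is a minimal sketch of the commonly cited ISO 15197:2013 system-accuracy rule (at least 95% of results within ±15 mg/dl of the reference below 100 mg/dl, and within ±15% otherwise); the function name and sample readings are illustrative, not from the study.

    ```python
    import numpy as np

    def iso15197_2013_pass(meter, reference):
        """Check the ISO 15197:2013 system-accuracy criterion (assumed here as:
        >= 95% of meter results within +/-15 mg/dl of the reference when the
        reference is < 100 mg/dl, and within +/-15% otherwise)."""
        meter, reference = np.asarray(meter, float), np.asarray(reference, float)
        tol = np.where(reference < 100.0, 15.0, 0.15 * reference)
        within = np.abs(meter - reference) <= tol
        return within.mean(), within.mean() >= 0.95

    # Illustrative readings (mg/dl): meter vs. hexokinase reference
    meter = [92, 110, 250, 68, 180]
    reference = [85, 120, 230, 70, 200]
    fraction, ok = iso15197_2013_pass(meter, reference)
    print(f"{fraction:.0%} within tolerance -> {'pass' if ok else 'fail'}")
    ```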

  2. Minimization over randomly selected lines

    Directory of Open Access Journals (Sweden)

    Ismet Sahin

    2013-07-01

    Full Text Available This paper presents a population-based evolutionary optimization method for minimizing a given cost function. The mutation operator of this method selects randomly oriented lines in the cost function domain, constructs quadratic functions interpolating the cost function at three different points over each line, and uses the extrema of these quadratics as mutated points. The crossover operator modifies each mutated point based on components of two points in the population, instead of one point as is usually done in other evolutionary algorithms. The stopping criterion of this method depends on the number of almost degenerate quadratics. We demonstrate that the proposed method with these mutation and crossover operations achieves faster and more robust convergence than the well-known Differential Evolution and Particle Swarm algorithms.
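
    The mutation step above is compact enough to state in code: draw a random direction, evaluate the cost at three points along the line, and move to the vertex of the interpolating parabola. A rough sketch under assumed settings; the step size, greedy acceptance rule, and Rosenbrock test function are illustrative choices, not the paper's.

    ```python
    import numpy as np

    def line_quadratic_mutation(f, x, rng, step=0.5, eps=1e-12):
        """Mutate point x toward the minimum of a parabola fitted to f
        at three points along a randomly oriented line through x."""
        u = rng.standard_normal(x.size)
        u /= np.linalg.norm(u)                      # random orientation
        t = np.array([-step, 0.0, step])            # three points on the line
        y = np.array([f(x + ti * u) for ti in t])
        a, b, _ = np.polyfit(t, y, 2)               # y ~ a t^2 + b t + c
        if a <= eps:                                # (near-)degenerate quadratic:
            return None                             # callers can count these to stop
        return x + (-b / (2.0 * a)) * u             # vertex of the parabola

    rng = np.random.default_rng(0)
    rosen = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
    x = np.array([-1.0, 2.0])
    for _ in range(200):                            # greedy accept-if-better loop
        m = line_quadratic_mutation(rosen, x, rng)
        if m is not None and rosen(m) < rosen(x):
            x = m
    print(x, rosen(x))
    ```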

  3. Fast randomized point location without preprocessing in two- and three-dimensional Delaunay triangulations

    Energy Technology Data Exchange (ETDEWEB)

    Muecke, E.P.; Saias, I.; Zhu, B.

    1996-05-01

    This paper studies the point location problem in Delaunay triangulations without preprocessing and additional storage. The proposed procedure locates the query point simply by walking through the triangulation, after selecting a good starting point by random sampling. The analysis generalizes and extends a recent result for d = 2 dimensions by proving this procedure to take expected time close to O(n^(1/(d+1))) for point location in Delaunay triangulations of n random points in d = 3 dimensions. Empirical results in both two and three dimensions show that this procedure is efficient in practice.
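
    A minimal 2-D prototype of this "jump and walk" strategy, assuming scipy's Delaunay structures: jump to the nearest of roughly n^(1/3) random sample points (the d = 2 case of n^(1/(d+1))), then walk across neighboring triangles toward the query. The walk below is a plain visibility walk, which terminates on Delaunay triangulations; sample sizes and data are illustrative.

    ```python
    import numpy as np
    from scipy.spatial import Delaunay

    def orient(p, q, r):
        """Twice the signed area of triangle (p, q, r)."""
        return (q[0]-p[0])*(r[1]-p[1]) - (q[1]-p[1])*(r[0]-p[0])

    def jump_and_walk(tri, q, rng):
        pts = tri.points
        n = len(pts)
        # Jump: best starting vertex among ~n^(1/3) random samples (d = 2).
        sample = rng.choice(n, size=max(1, int(round(n ** (1/3)))), replace=False)
        start = sample[np.argmin(((pts[sample] - q) ** 2).sum(axis=1))]
        simplex = tri.vertex_to_simplex[start]
        # Walk: repeatedly cross the edge that separates the triangle from q.
        while True:
            verts = tri.simplices[simplex]
            for k in range(3):
                a, b = pts[verts[(k + 1) % 3]], pts[verts[(k + 2) % 3]]
                c = pts[verts[k]]                        # vertex opposite edge (a, b)
                if orient(a, b, q) * orient(a, b, c) < 0:
                    simplex = tri.neighbors[simplex][k]  # neighbor opposite vertex k
                    break
            else:
                return simplex                           # q lies in this triangle

    rng = np.random.default_rng(1)
    tri = Delaunay(rng.random((1000, 2)))
    q = np.array([0.5, 0.5])
    assert jump_and_walk(tri, q, rng) == int(tri.find_simplex(q))
    ```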

  4. Blocked Randomization with Randomly Selected Block Sizes

    Directory of Open Access Journals (Sweden)

    Jimmy Efird

    2010-12-01

    Full Text Available When planning a randomized clinical trial, careful consideration must be given to how participants are selected for various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has equal likelihood of being assigned to treatment versus referent groups. However, by chance an unequal number of individuals may be assigned to each arm of the study and thus decrease the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Yet still, the allocation process may be predictable, for example, when the investigator is not blind and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes.
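
    As a concrete illustration of the scheme described above, here is a minimal sketch that allocates participants in randomly permuted blocks whose sizes are drawn at random; the block sizes and arm labels are illustrative.

    ```python
    import random

    def blocked_randomization(n_participants, arms=("treatment", "control"),
                              block_sizes=(4, 6, 8), seed=2023):
        """Allocation list using randomly permuted blocks of randomly chosen size.

        Block sizes must be multiples of the number of arms so that every block
        is balanced; randomizing the size keeps the sequence unpredictable."""
        rng = random.Random(seed)
        allocation = []
        while len(allocation) < n_participants:
            size = rng.choice(block_sizes)
            block = list(arms) * (size // len(arms))   # balanced block
            rng.shuffle(block)                          # random order within block
            allocation.extend(block)
        return allocation[:n_participants]

    print(blocked_randomization(10))
    ```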

  5. From Protocols to Publications: A Study in Selective Reporting of Outcomes in Randomized Trials in Oncology

    Science.gov (United States)

    Raghav, Kanwal Pratap Singh; Mahajan, Sminil; Yao, James C.; Hobbs, Brian P.; Berry, Donald A.; Pentz, Rebecca D.; Tam, Alda; Hong, Waun K.; Ellis, Lee M.; Abbruzzese, James; Overman, Michael J.

    2015-01-01

    Purpose The decision by journals to append protocols to published reports of randomized trials was a landmark event in clinical trial reporting. However, limited information is available on how this initiative affected transparency and selective reporting of clinical trial data. Methods We analyzed 74 oncology-based randomized trials published in Journal of Clinical Oncology, the New England Journal of Medicine, and The Lancet in 2012. To ascertain integrity of reporting, we compared published reports with their respective appended protocols with regard to primary end points, nonprimary end points, unplanned end points, and unplanned analyses. Results A total of 86 primary end points were reported in 74 randomized trials; nine trials had greater than one primary end point. Nine trials (12.2%) had some discrepancy between their planned and published primary end points. A total of 579 nonprimary end points (median, seven per trial) were planned, of which 373 (64.4%; median, five per trial) were reported. A significant positive correlation was found between the number of planned and nonreported nonprimary end points (Spearman r = 0.66; P < .001). Twenty-eight studies (37.8%) reported a total of 65 unplanned end points, 52 (80.0%) of which were not identified as unplanned. Thirty-one (41.9%) and 19 (25.7%) of 74 trials reported a total of 52 unplanned analyses involving primary end points and 33 unplanned analyses involving nonprimary end points, respectively. Studies reported positive unplanned end points and unplanned analyses more frequently than negative outcomes in abstracts (unplanned end points odds ratio, 6.8; P = .002; unplanned analyses odds ratio, 8.4; P = .007). Conclusion Despite public and reviewer access to protocols, selective outcome reporting persists and is a major concern in the reporting of randomized clinical trials. To foster credible evidence-based medicine, additional initiatives are needed to minimize selective reporting. PMID:26304898

  6. Scattering analysis of point processes and random measures

    International Nuclear Information System (INIS)

    Hanisch, K.H.

    1984-01-01

    In the present paper scattering analysis of point processes and random measures is studied. Known formulae which connect the scattering intensity with the pair distribution function of the studied structures are proved in a rigorous manner with tools of the theory of point processes and random measures. For some special fibre processes the scattering intensity is computed. For a class of random measures, namely for 'grain-germ-models', a new formula is proved which yields the pair distribution function of the 'grain-germ-model' in terms of the pair distribution function of the underlying point process (the 'germs') and of the mean structure factor and the mean squared structure factor of the particles (the 'grains'). (author)

  7. From Protocols to Publications: A Study in Selective Reporting of Outcomes in Randomized Trials in Oncology.

    Science.gov (United States)

    Raghav, Kanwal Pratap Singh; Mahajan, Sminil; Yao, James C; Hobbs, Brian P; Berry, Donald A; Pentz, Rebecca D; Tam, Alda; Hong, Waun K; Ellis, Lee M; Abbruzzese, James; Overman, Michael J

    2015-11-01

    The decision by journals to append protocols to published reports of randomized trials was a landmark event in clinical trial reporting. However, limited information is available on how this initiative affected transparency and selective reporting of clinical trial data. We analyzed 74 oncology-based randomized trials published in Journal of Clinical Oncology, the New England Journal of Medicine, and The Lancet in 2012. To ascertain integrity of reporting, we compared published reports with their respective appended protocols with regard to primary end points, nonprimary end points, unplanned end points, and unplanned analyses. A total of 86 primary end points were reported in 74 randomized trials; nine trials had greater than one primary end point. Nine trials (12.2%) had some discrepancy between their planned and published primary end points. A total of 579 nonprimary end points (median, seven per trial) were planned, of which 373 (64.4%; median, five per trial) were reported. A significant positive correlation was found between the number of planned and nonreported nonprimary end points (Spearman r = 0.66; P < .001). Twenty-eight studies (37.8%) reported a total of 65 unplanned end points, 52 (80.0%) of which were not identified as unplanned. Thirty-one (41.9%) and 19 (25.7%) of 74 trials reported a total of 52 unplanned analyses involving primary end points and 33 unplanned analyses involving nonprimary end points, respectively. Studies reported positive unplanned end points and unplanned analyses more frequently than negative outcomes in abstracts (unplanned end points odds ratio, 6.8; P = .002; unplanned analyses odds ratio, 8.4; P = .007). Despite public and reviewer access to protocols, selective outcome reporting persists and is a major concern in the reporting of randomized clinical trials. To foster credible evidence-based medicine, additional initiatives are needed to minimize selective reporting. © 2015 by American Society of Clinical Oncology.

  8. EBTR design-point selection

    International Nuclear Information System (INIS)

    Krakowski, R.A.; Bathke, C.G.

    1981-01-01

    The procedure used to select the design point for the ELMO Bumpy Torus Reactor (EBTR) study is described. The models used in each phase of the selection process are described, with an emphasis placed on the parametric design curves produced by each model. The tradeoffs related to burn physics, stability/equilibrium, electron-ring physics, and magnetics design are discussed. The resulting design point indicates a plasma with a 35-m major radius and a 1-m minor radius operating at an average core-plasma beta of 0.17, which at approx. 30 keV produces an average neutron wall loading of 1.4 MW/m² while maintaining key magnet (<10 T) and total power (≤4000 MWt) constraints

  9. 47 CFR 1.1602 - Designation for random selection.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Designation for random selection. 1.1602 Section 1.1602 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1602 Designation for random selection...

  10. Applications of random forest feature selection for fine-scale genetic population assignment.

    Science.gov (United States)

    Sylvester, Emma V A; Bentzen, Paul; Bradbury, Ian R; Clément, Marie; Pearce, Jon; Horne, John; Beiko, Robert G

    2018-02-01

    Genetic population assignment used to inform wildlife management and conservation efforts requires panels of highly informative genetic markers and sensitive assignment tests. We explored the utility of machine-learning algorithms (random forest, regularized random forest and guided regularized random forest) compared with F_ST ranking for selection of single nucleotide polymorphisms (SNPs) for fine-scale population assignment. We applied these methods to an unpublished SNP data set for Atlantic salmon (Salmo salar) and a published SNP data set for Alaskan Chinook salmon (Oncorhynchus tshawytscha). In each species, we identified the minimum panel size required to obtain a self-assignment accuracy of at least 90%, using each method to create panels of 50-700 markers. Panels of SNPs identified using random forest-based methods performed up to 7.8 and 11.2 percentage points better than F_ST-selected panels of similar size for the Atlantic salmon and Chinook salmon data, respectively. Self-assignment accuracy ≥90% was obtained with panels of 670 and 384 SNPs for each data set, respectively, a level of accuracy never reached for these species using F_ST-selected panels. Our results demonstrate a role for machine-learning approaches in marker selection across large genomic data sets to improve assignment for management and conservation of exploited populations.
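
    A hedged sketch of the core loop: rank SNPs by random forest importance and grow the panel until cross-validated self-assignment reaches 90%. It uses scikit-learn and synthetic genotypes; the regularized and guided random forest variants and the F_ST ranking used in the study are not reproduced here.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    def minimal_panel(genotypes, populations, sizes=(50, 100, 200, 400, 700),
                      target=0.90, seed=0):
        """Rank SNPs by random forest importance, then return the smallest panel
        whose cross-validated self-assignment accuracy reaches the target."""
        rf = RandomForestClassifier(n_estimators=500, random_state=seed)
        rf.fit(genotypes, populations)
        ranked = np.argsort(rf.feature_importances_)[::-1]   # best SNPs first
        for size in sizes:
            panel = ranked[:size]
            acc = cross_val_score(
                RandomForestClassifier(n_estimators=500, random_state=seed),
                genotypes[:, panel], populations, cv=5).mean()
            if acc >= target:
                return panel, acc
        return ranked[:sizes[-1]], acc          # target not reached: largest panel

    # Illustrative data: 200 individuals x 1000 SNPs coded 0/1/2, 4 populations
    rng = np.random.default_rng(0)
    X = rng.integers(0, 3, size=(200, 1000)).astype(float)
    y = rng.integers(0, 4, size=200)
    panel, acc = minimal_panel(X, y)
    print(len(panel), round(acc, 2))
    ```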

  11. 47 CFR 1.1603 - Conduct of random selection.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Conduct of random selection. 1.1603 Section 1.1603 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1603 Conduct of random selection. The...

  12. Smooth random change point models.

    Science.gov (United States)

    van den Hout, Ardo; Muniz-Terrera, Graciela; Matthews, Fiona E

    2011-03-15

    Change point models are used to describe processes over time that show a change in direction. An example of such a process is cognitive ability, where a decline a few years before death is sometimes observed. A broken-stick model consists of two linear parts and a breakpoint where the two lines intersect. Alternatively, models can be formulated that imply a smooth change between the two linear parts. Change point models can be extended by adding random effects to account for variability between subjects. A new smooth change point model is introduced and examples are presented that show how change point models can be estimated using functions in R for mixed-effects models. Bayesian inference using WinBUGS is also discussed. The methods are illustrated using data from a population-based longitudinal study of ageing, the Cambridge City over 75 Cohort Study. The aim is to identify how many years before death individuals experience a change in the rate of decline of their cognitive ability. Copyright © 2010 John Wiley & Sons, Ltd.
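
    One way to make the "smooth change between two linear parts" concrete: replace the broken-stick kink max(0, t - tau) with a smooth approximation of adjustable width and fit by nonlinear least squares. This fixed-effects sketch on synthetic data omits the random effects and the Bayesian machinery of the paper; all parameter values are illustrative.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def smooth_broken_stick(t, b0, b1, b2, tau, gamma):
        """Two linear phases joined smoothly at tau: max(0, t - tau) is replaced
        by a smooth approximation whose width is controlled by gamma."""
        s = 0.5 * ((t - tau) + np.sqrt((t - tau) ** 2 + gamma ** 2))
        return b0 + b1 * t + b2 * s

    # Illustrative cognitive scores: steeper decline from ~4 years before death
    rng = np.random.default_rng(3)
    t = np.linspace(-12.0, 0.0, 80)                 # years relative to death
    y = 25 - 0.2 * t - 1.3 * np.clip(t + 4.0, 0.0, None) + rng.normal(0, 0.4, t.size)
    popt, _ = curve_fit(smooth_broken_stick, t, y, p0=[25.0, -0.2, -1.0, -3.0, 1.0])
    print("estimated change point:", round(popt[3], 2))   # ~ -4
    ```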

  13. Speeding up Coarse Point Cloud Registration by Threshold-Independent Baysac Match Selection

    Science.gov (United States)

    Kang, Z.; Lindenbergh, R.; Pu, S.

    2016-06-01

    This paper presents an algorithm for the automatic registration of terrestrial point clouds by match selection using an efficient conditional sampling method, threshold-independent BaySAC (BAYes SAmpling Consensus), and employs the error metric of the average point-to-surface residual to reduce the random measurement error and thereby approach the real registration error. BaySAC and other basic sampling algorithms usually need an artificially determined threshold by which inlier points are identified, which leads to a threshold-dependent verification process. Therefore, we applied the LMedS method to construct the cost function that is used to determine the optimum model, to reduce the influence of human factors and improve the robustness of the model estimate. Point-to-point and point-to-surface error metrics are most commonly used. However, the point-to-point error in general consists of at least two components: random measurement error and systematic error resulting from a remaining error in the estimated rigid body transformation. Thus we employ the average point-to-surface residual to evaluate the registration accuracy. The proposed approaches, together with a traditional RANSAC approach, are tested on four data sets acquired by three different scanners in terms of their computational efficiency and the quality of the final registration. The registration results show that the standard deviation of the average point-to-surface residuals is reduced from 1.4 cm (plain RANSAC) to 0.5 cm (threshold-independent BaySAC). The results also show that, compared to the performance of RANSAC, our BaySAC strategies lead to fewer iterations and lower computational cost when the hypothesis set is contaminated with more outliers.
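
    For orientation, the LMedS idea is easy to isolate: score each hypothesized rigid transform by the median of its squared residuals, so no inlier threshold is needed. The sketch below uses plain random minimal samples and point-to-point residuals; BaySAC itself instead draws hypothesis sets from the currently most probable inliers, and the paper scores point-to-surface residuals.

    ```python
    import numpy as np

    def kabsch(P, Q):
        """Least-squares rigid transform (R, t) mapping point set P onto Q."""
        cP, cQ = P.mean(axis=0), Q.mean(axis=0)
        H = (P - cP).T @ (Q - cQ)
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))        # avoid reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        return R, cQ - R @ cP

    def lmeds_rigid(P, Q, iters=500, seed=0):
        """Among minimal 3-point hypotheses, keep the transform with the lowest
        median squared residual (the LMedS cost) -- no inlier threshold needed."""
        rng = np.random.default_rng(seed)
        best, best_cost = None, np.inf
        for _ in range(iters):
            idx = rng.choice(len(P), size=3, replace=False)
            R, t = kabsch(P[idx], Q[idx])
            cost = np.median(np.sum((Q - (P @ R.T + t)) ** 2, axis=1))
            if cost < best_cost:
                best, best_cost = (R, t), cost
        return best, best_cost

    # Illustrative correspondences: a rotation about z plus 30% gross outliers
    rng = np.random.default_rng(1)
    P = rng.random((100, 3))
    th = 0.3
    R0 = np.array([[np.cos(th), -np.sin(th), 0], [np.sin(th), np.cos(th), 0], [0, 0, 1]])
    Q = P @ R0.T + np.array([0.1, -0.2, 0.05])
    Q[:30] += rng.normal(0, 0.5, (30, 3))
    (R, t), cost = lmeds_rigid(P, Q)
    print(round(cost, 6))                             # near zero despite outliers
    ```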

  14. Robust tie points selection for InSAR image coregistration

    Science.gov (United States)

    Skanderi, Takieddine; Chabira, Boulerbah; Afifa, Belkacem; Belhadj Aissa, Aichouche

    2013-10-01

    Image coregistration is an important step in SAR interferometry, which is a well-known method for DEM generation and surface displacement monitoring. A practical and widely used automatic coregistration algorithm is based on selecting a number of tie points in the master image and looking for the correspondence of each point in the slave image using a correlation technique. The characteristics of these points, their number and their distribution have a great impact on the reliability of the estimated transformation. In this work, we present a method for automatic selection of suitable tie points that are well distributed over the common area without decreasing the desired number of tie points. First we select candidate points using the Harris operator. Then from these points we select tie points according to their cornerness measure (the highest first). Once a tie point is selected, its correspondence is searched for in the slave image; if the maximum of the similarity measure is below a given threshold or lies at the border of the search window, the point is discarded and we proceed to the next Harris point. Otherwise, the cornerness values of the remaining candidate Harris points are multiplied by a radially increasing function centered at the selected point, to disadvantage points in a neighborhood whose radius is determined from the size of the common area and the desired number of points. This is repeated until the desired number of points is selected. Results for an ERS1/2 tandem pair are presented and discussed.
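
    The selection loop described above can be sketched in a few lines; the clipped-linear radial weight is one simple choice of "radially increasing function", and match_fn stands in for the correlation-based matching step. All names and parameters are illustrative.

    ```python
    import numpy as np

    def select_tie_points(coords, cornerness, n_points, radius, match_fn):
        """Greedy selection of well-distributed tie points.

        coords: (N, 2) Harris candidate positions; cornerness: (N,) scores;
        match_fn(p) returns True when a reliable correspondence is found in the
        slave image (correlation peak above threshold, not on the window border)."""
        score = cornerness.astype(float).copy()
        selected = []
        while len(selected) < n_points and np.any(score > 0):
            i = int(np.argmax(score))              # strongest remaining candidate
            score[i] = 0.0                         # never consider it twice
            if not match_fn(coords[i]):
                continue                           # discard unmatched point
            selected.append(i)
            d = np.linalg.norm(coords - coords[i], axis=1)
            score *= np.clip(d / radius, 0.0, 1.0) # suppress nearby candidates
        return selected
    ```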

  15. Evolution of Randomized Trials in Advanced/Metastatic Soft Tissue Sarcoma: End Point Selection, Surrogacy, and Quality of Reporting.

    Science.gov (United States)

    Zer, Alona; Prince, Rebecca M; Amir, Eitan; Abdul Razak, Albiruni

    2016-05-01

    Randomized controlled trials (RCTs) in soft tissue sarcoma (STS) have used varying end points. The surrogacy of intermediate end points, such as progression-free survival (PFS), response rate (RR), and 3-month and 6-month PFS (3moPFS and 6moPFS), with overall survival (OS) remains unknown. The quality of efficacy and toxicity reporting in these studies is also uncertain. A systematic review of systemic therapy RCTs in STS was performed. Surrogacy between intermediate end points and OS was explored using weighted linear regression for the hazard ratio for OS with the hazard ratio for PFS or the odds ratio for RR, 3moPFS, and 6moPFS. The quality of reporting for efficacy and toxicity was also evaluated. Fifty-two RCTs published between 1974 and 2014, comprising 9,762 patients, met the inclusion criteria. There were significant correlations between PFS and OS (R = 0.61) and between RR and OS (R = 0.51). Conversely, there were nonsignificant correlations between 3moPFS and 6moPFS with OS. A reduction in the use of RR as the primary end point was observed over time, favoring time-based events (P for trend = .02). In 14% of RCTs, the primary end point was not met, but the study was reported as being positive. Toxicity was comprehensively reported in 47% of RCTs, whereas 14% inadequately reported toxicity. In advanced STS, PFS and RR seem to be appropriate surrogates for OS. There is poor correlation between OS and both 3moPFS and 6moPFS. As such, caution is urged with the use of these as primary end points in randomized STS trials. The quality of toxicity reporting and interpretation of results is suboptimal. © 2016 by American Society of Clinical Oncology.

  16. Analysis of tree stand horizontal structure using random point field methods

    Directory of Open Access Journals (Sweden)

    O. P. Sekretenko

    2015-06-01

    Full Text Available This paper uses a model approach to analyze the horizontal structure of forest stands. The main types of models of random point fields and statistical procedures that can be used to analyze the spatial patterns of trees in uneven- and even-aged stands are described. We show how modern methods of spatial statistics can be used to address one of the objectives of forestry – to clarify the laws of natural thinning of a forest stand and the corresponding changes in its spatial structure over time. Studying natural forest thinning, we describe the consecutive stages of modeling: selection of the appropriate parametric model, parameter estimation and generation of point patterns in accordance with the selected model, selection of statistical functions to describe the horizontal structure of forest stands, and testing of statistical hypotheses. We show the capabilities of a specialized software package, spatstat, which is designed to meet the challenges of spatial statistics and provides software support for modern methods of analysis of spatial data. We show that a model of stand thinning that does not consider inter-tree interaction can project the size distribution of the trees properly, but the spatial pattern of the modeled stand is not quite consistent with observed data. Using data from three even-aged pine stands of 25, 55, and 90 years of age, we demonstrate that spatial point process models are useful for combining measurements from forest stands of different ages to study natural forest stand thinning.

  17. Consumers' price awareness at the point-of-selection

    DEFF Research Database (Denmark)

    Jensen, Birger Boutrup

    This paper focuses on consumers' price information processing at the point-of-selection. Specifically, it updates past results on consumers' price awareness at the point-of-selection - applying both a price-recall and a price-recognition test - and tests hypotheses on potential determinants of consumers' price awareness at the point-of-selection. Both price-memory tests resulted in higher measured price awareness than in any of the past studies. Results also indicate that price recognition is not the most appropriate measure. Finally, a discriminant analysis shows that consumers who are aware of the price at the point-of-selection are more deal prone, more low-price prone, and bought a special-priced item. Implications are discussed.

  18. Some common random fixed point theorems for contractive type conditions in cone random metric spaces

    Directory of Open Access Journals (Sweden)

    Saluja Gurucharan S.

    2016-08-01

    Full Text Available In this paper, we establish some common random fixed point theorems for contractive type conditions in the setting of cone random metric spaces. Our results unify, extend and generalize many known results from the existing literature.

  19. About the problem of generating three-dimensional pseudo-random points.

    Science.gov (United States)

    Carpintero, D. D.

    The author demonstrates that a popular pseudo-random number generator is not adequate in some circumstances to generate n-dimensional random points, n > 2. This problem is particularly noxious when direction cosines are generated. He proposes several solutions, among them a good generator that satisfies all statistical criteria.
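
    A standard remedy, for context: normalizing independent Gaussian components yields uniformly distributed direction cosines in any dimension and avoids the lattice planes of poor linear congruential generators; numpy's default PCG64 generator passes modern test batteries. This sketch is generic and is not the specific generator the author proposes.

    ```python
    import numpy as np

    def random_direction_cosines(n, dim=3, seed=0):
        """Uniform random unit vectors: normalize i.i.d. Gaussian components.
        Unlike ad hoc LCG-based recipes, this has no preferred planes in 3-D."""
        rng = np.random.default_rng(seed)       # PCG64 under the hood
        v = rng.standard_normal((n, dim))
        return v / np.linalg.norm(v, axis=1, keepdims=True)

    u = random_direction_cosines(10000)
    print(u.mean(axis=0))                       # ~ (0, 0, 0) for a uniform sphere
    ```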

  20. Selectivity and sparseness in randomly connected balanced networks.

    Directory of Open Access Journals (Sweden)

    Cengiz Pehlevan

    Full Text Available Neurons in sensory cortex show stimulus selectivity and sparse population response, even in cases where no strong functionally specific structure in connectivity can be detected. This raises the question whether selectivity and sparseness can be generated and maintained in randomly connected networks. We consider a recurrent network of excitatory and inhibitory spiking neurons with random connectivity, driven by random projections from an input layer of stimulus selective neurons. In this architecture, the stimulus-to-stimulus and neuron-to-neuron modulation of total synaptic input is weak compared to the mean input. Surprisingly, we show that in the balanced state the network can still support high stimulus selectivity and sparse population response. In the balanced state, strong synapses amplify the variation in synaptic input and recurrent inhibition cancels the mean. Functional specificity in connectivity emerges due to the inhomogeneity caused by the generative statistical rule used to build the network. We further elucidate the underlying mechanism and evaluate the effects of model parameters on population sparseness and stimulus selectivity. Network response to mixtures of stimuli is investigated. It is shown that a balanced state with unselective inhibition can be achieved with densely connected input to the inhibitory population. Balanced networks exhibit the "paradoxical" effect: an increase in excitatory drive to inhibition leads to decreased inhibitory population firing rate. We compare and contrast selectivity and sparseness generated by the balanced network to randomly connected unbalanced networks. Finally, we discuss our results in light of experiments.

  1. On tests of randomness for spatial point patterns

    International Nuclear Information System (INIS)

    Doguwa, S.I.

    1990-11-01

    New tests of randomness for spatial point patterns are introduced. These test statistics are then compared in a power study with the existing alternatives. The results of the power study suggest that one of the proposed tests is extremely powerful against both aggregated and regular alternatives. (author). 9 refs, 7 figs, 3 tabs

  2. Selective Integration in the Material-Point Method

    DEFF Research Database (Denmark)

    Andersen, Lars; Andersen, Søren; Damkilde, Lars

    2009-01-01

    The paper deals with stress integration in the material-point method. In order to avoid parasitic shear in bending, a formulation is proposed, based on selective integration in the background grid that is used to solve the governing equations. The suggested integration scheme is compared to a traditional material-point-method computation in which the stresses are evaluated at the material points. The deformation of a cantilever beam is analysed, assuming elastic or elastoplastic material behaviour.

  3. Testing, Selection, and Implementation of Random Number Generators

    National Research Council Canada - National Science Library

    Collins, Joseph C

    2008-01-01

    An exhaustive evaluation of state-of-the-art random number generators with several well-known suites of tests provides the basis for selection of suitable random number generators for use in stochastic simulations...

  4. Random electrodynamics: the theory of classical electrodynamics with classical electromagnetic zero-point radiation

    International Nuclear Information System (INIS)

    Boyer, T.H.

    1975-01-01

    The theory of classical electrodynamics with classical electromagnetic zero-point radiation is outlined here under the title random electrodynamics. The work represents a reanalysis of the bounds of validity of classical electron theory which should sharpen the understanding of the connections and distinctions between classical and quantum theories. The new theory of random electrodynamics is a classical electron theory involving Newton's equations for particle motion due to the Lorentz force, and Maxwell's equations for the electromagnetic fields with point particles as sources. However, the theory departs from the classical electron theory of Lorentz in that it adopts a new boundary condition on Maxwell's equations. It is assumed that the homogeneous boundary condition involves random classical electromagnetic radiation with a Lorentz-invariant spectrum, classical electromagnetic zero-point radiation. The implications of random electrodynamics for atomic structure, atomic spectra, and particle-interference effects are discussed on an order-of-magnitude or heuristic level. Some detailed mathematical connections and some merely heuristic connections are noted between random electrodynamics and quantum theory. (U.S.)

  5. Statistical properties of several models of fractional random point processes

    Science.gov (United States)

    Bendjaballah, C.

    2011-08-01

    Statistical properties of several models of fractional random point processes have been analyzed from the counting and time interval statistics points of view. Based on the criterion of the reduced variance, it is seen that such processes exhibit nonclassical properties. The conditions for these processes to be treated as conditional Poisson processes are examined. Numerical simulations illustrate part of the theoretical calculations.

  6. A MOSUM procedure for the estimation of multiple random change points

    OpenAIRE

    Eichinger, Birte; Kirch, Claudia

    2018-01-01

    In this work, we investigate statistical properties of change point estimators based on moving sum statistics. We extend results for testing in a classical situation with multiple deterministic change points by allowing for random exogenous change points that arise in Hidden Markov or regime switching models among others. To this end, we consider a multiple mean change model with possible time series errors and prove that the number and location of change points are estimated consistently by ...

  7. Statistics of stationary points of random finite polynomial potentials

    International Nuclear Information System (INIS)

    Mehta, Dhagash; Niemerg, Matthew; Sun, Chuang

    2015-01-01

    The stationary points (SPs) of the potential energy landscapes (PELs) of multivariate random potentials (RPs) have found applications in many areas of Physics, Chemistry and Mathematical Biology. However, there are few reliable methods available which can find all the SPs accurately. Hence, one has to rely on indirect methods such as Random Matrix theory. With a combination of the numerical polynomial homotopy continuation method and a certification method, we obtain all the certified SPs of the most general polynomial RP for each sample chosen from the Gaussian distribution with mean 0 and variance 1. While obtaining many novel results for the finite size case of the RP, we also discuss the implications of our results for the mathematics of random systems and string theory landscapes. (paper)

  8. ERROR DISTRIBUTION EVALUATION OF THE THIRD VANISHING POINT BASED ON RANDOM STATISTICAL SIMULATION

    Directory of Open Access Journals (Sweden)

    C. Li

    2012-07-01

    Full Text Available POS, integrated by GPS / INS (Inertial Navigation Systems), has allowed rapid and accurate determination of position and attitude of remote sensing equipment for MMS (Mobile Mapping Systems). However, not only does INS have system error, but it is also very expensive. Therefore, in this paper error distributions of vanishing points are studied and tested in order to substitute INS for MMS in some special land-based scenes, such as ground façades where usually only two vanishing points can be detected. Thus, the traditional calibration approach based on three orthogonal vanishing points is being challenged. In this article, firstly, the line clusters, which are parallel to each other in object space and correspond to the vanishing points, are detected based on RANSAC (Random Sample Consensus) and a parallelism geometric constraint. Secondly, condition adjustment with parameters is utilized to estimate the nonlinear error equations of two vanishing points (VX, VY). How to set initial weights for the adjustment solution of single-image vanishing points is presented. Vanishing points are solved and their error distributions estimated based on an iteration method with variable weights, the co-factor matrix and error ellipse theory. Thirdly, under the condition of known error ellipses of two vanishing points (VX, VY) and on the basis of the triangle geometric relationship of three vanishing points, the error distribution of the third vanishing point (VZ) is calculated and evaluated by random statistical simulation, ignoring camera distortion. Moreover, Monte Carlo methods utilized for random statistical estimation are presented. Finally, experimental results for the vanishing point coordinates and their error distributions are shown and analyzed.

  9. Error Distribution Evaluation of the Third Vanishing Point Based on Random Statistical Simulation

    Science.gov (United States)

    Li, C.

    2012-07-01

    POS, integrated by GPS / INS (Inertial Navigation Systems), has allowed rapid and accurate determination of position and attitude of remote sensing equipment for MMS (Mobile Mapping Systems). However, not only does INS have system error, but it is also very expensive. Therefore, in this paper error distributions of vanishing points are studied and tested in order to substitute INS for MMS in some special land-based scenes, such as ground façades where usually only two vanishing points can be detected. Thus, the traditional calibration approach based on three orthogonal vanishing points is being challenged. In this article, firstly, the line clusters, which are parallel to each other in object space and correspond to the vanishing points, are detected based on RANSAC (Random Sample Consensus) and a parallelism geometric constraint. Secondly, condition adjustment with parameters is utilized to estimate the nonlinear error equations of two vanishing points (VX, VY). How to set initial weights for the adjustment solution of single-image vanishing points is presented. Vanishing points are solved and their error distributions estimated based on an iteration method with variable weights, the co-factor matrix and error ellipse theory. Thirdly, under the condition of known error ellipses of two vanishing points (VX, VY) and on the basis of the triangle geometric relationship of three vanishing points, the error distribution of the third vanishing point (VZ) is calculated and evaluated by random statistical simulation, ignoring camera distortion. Moreover, Monte Carlo methods utilized for random statistical estimation are presented. Finally, experimental results for the vanishing point coordinates and their error distributions are shown and analyzed.
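
    A hedged sketch of the Monte Carlo step, under assumptions the abstract leaves implicit (known principal point, zero skew, square pixels): orthogonality gives (vi - p)·(vj - p) = -f² for each pair of vanishing points, so VZ follows from VX and VY by solving a 2 × 2 linear system, and the error ellipses enter as Gaussian covariances. All coordinates below are illustrative.

    ```python
    import numpy as np

    def third_vanishing_point(vx, vy, p):
        """VZ from VX, VY for a camera with known principal point p, zero skew
        and square pixels: (vi - p).(vj - p) = -f^2 for orthogonal directions."""
        a, b = vx - p, vy - p
        f2 = -np.dot(a, b)                    # squared focal length
        A = np.stack([a, b])
        return p + np.linalg.solve(A, np.array([-f2, -f2]))

    def vz_error_cloud(vx, vy, cov_x, cov_y, p, n=20000, seed=0):
        """Monte Carlo propagation of the VX/VY error ellipses to VZ."""
        rng = np.random.default_rng(seed)
        sx = rng.multivariate_normal(vx, cov_x, n)
        sy = rng.multivariate_normal(vy, cov_y, n)
        vz = np.array([third_vanishing_point(a, b, p) for a, b in zip(sx, sy)])
        return vz.mean(axis=0), np.cov(vz.T)

    p = np.array([960.0, 540.0])              # principal point (pixels)
    vx, vy = np.array([3200.0, 700.0]), np.array([-1500.0, 800.0])
    cov = np.diag([25.0, 25.0])               # 5-pixel std error ellipses
    mean_vz, cov_vz = vz_error_cloud(vx, vy, cov, cov, p)
    print(mean_vz, np.sqrt(np.diag(cov_vz)))  # VZ estimate and its spread
    ```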

  10. A new diagnostic accuracy measure and cut-point selection criterion.

    Science.gov (United States)

    Dong, Tuochuan; Attwood, Kristopher; Hutson, Alan; Liu, Song; Tian, Lili

    2017-12-01

    Most diagnostic accuracy measures and criteria for selecting optimal cut-points are only applicable to diseases with two or three stages. Currently, there exist two diagnostic measures for diseases with a general number k of stages: the hypervolume under the manifold and the generalized Youden index. While the hypervolume under the manifold cannot be used for cut-point selection, the generalized Youden index is only defined upon correct classification rates. This paper proposes a new measure named the maximum absolute determinant for diseases with k stages. This comprehensive new measure utilizes all the available classification information and serves as a cut-point selection criterion as well. Both the geometric and probabilistic interpretations of the new measure are examined. Power and simulation studies are carried out to investigate its performance as a measure of diagnostic accuracy as well as a cut-point selection criterion. A real data set from the Alzheimer's Disease Neuroimaging Initiative is analyzed using the proposed maximum absolute determinant.
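
    A minimal reading of the measure in code, assuming |det C| of the k × k classification-rate matrix is the quantity to maximize (consistent with the measure's name, though the paper's exact definition may differ in detail): grid-search the k - 1 cut-points for a synthetic three-stage marker.

    ```python
    import numpy as np
    from itertools import combinations

    def class_rate_matrix(samples, cuts):
        """C[i, j]: fraction of stage-i subjects whose marker value falls in
        interval j defined by the ordered cut-points."""
        edges = [-np.inf, *cuts, np.inf]
        return np.array([[np.mean((s > edges[j]) & (s <= edges[j + 1]))
                          for j in range(len(edges) - 1)] for s in samples])

    def max_abs_det_cuts(samples, grid):
        """Grid search for the cut-points maximizing |det(C)| (k-stage case)."""
        k = len(samples)
        best, best_cuts = -1.0, None
        for cuts in combinations(grid, k - 1):     # ordered cut-point tuples
            v = abs(np.linalg.det(class_rate_matrix(samples, cuts)))
            if v > best:
                best, best_cuts = v, cuts
        return best_cuts, best

    # Illustrative 3-stage biomarker data
    rng = np.random.default_rng(0)
    samples = [rng.normal(m, 1.0, 300) for m in (0.0, 1.5, 3.0)]
    cuts, val = max_abs_det_cuts(samples, np.linspace(-1, 4, 26))
    print(cuts, round(val, 3))
    ```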

  11. Application of random effects to the study of resource selection by animals.

    Science.gov (United States)

    Gillies, Cameron S; Hebblewhite, Mark; Nielsen, Scott E; Krawchuk, Meg A; Aldridge, Cameron L; Frair, Jacqueline L; Saher, D Joanne; Stevens, Cameron E; Jerde, Christopher L

    2006-07-01

    1. Resource selection estimated by logistic regression is used increasingly in studies to identify critical resources for animal populations and to predict species occurrence. 2. Most frequently, individual animals are monitored and pooled to estimate population-level effects without regard to group or individual-level variation. Pooling assumes that both observations and their errors are independent, and resource selection is constant given individual variation in resource availability. 3. Although researchers have identified ways to minimize autocorrelation, variation between individuals caused by differences in selection or available resources, including functional responses in resource selection, have not been well addressed. 4. Here we review random-effects models and their application to resource selection modelling to overcome these common limitations. We present a simple case study of an analysis of resource selection by grizzly bears in the foothills of the Canadian Rocky Mountains with and without random effects. 5. Both categorical and continuous variables in the grizzly bear model differed in interpretation, both in statistical significance and coefficient sign, depending on how a random effect was included. We used a simulation approach to clarify the application of random effects under three common situations for telemetry studies: (a) discrepancies in sample sizes among individuals; (b) differences among individuals in selection where availability is constant; and (c) differences in availability with and without a functional response in resource selection. 6. We found that random intercepts accounted for unbalanced sample designs, and models with random intercepts and coefficients improved model fit given the variation in selection among individuals and functional responses in selection. Our empirical example and simulations demonstrate how including random effects in resource selection models can aid interpretation and address difficult assumptions

  12. Local randomization in neighbor selection improves PRM roadmap quality

    KAUST Repository

    McMahon, Troy; Jacobs, Sam; Boyd, Bryan; Tapia, Lydia; Amato, Nancy M.

    2012-01-01

    Probabilistic Roadmap Methods (PRMs) are one of the most used classes of motion planning methods. These sampling-based methods generate robot configurations (nodes) and then connect them to form a graph (roadmap) containing representative feasible pathways. A key step in PRM roadmap construction involves identifying a set of candidate neighbors for each node. Traditionally, these candidates are chosen to be the k-closest nodes based on a given distance metric. In this paper, we propose a new neighbor selection policy called LocalRand(k,K'), that first computes the K' closest nodes to a specified node and then selects k of those nodes at random. Intuitively, LocalRand attempts to benefit from random sampling while maintaining the higher levels of local planner success inherent to selecting more local neighbors. We provide a methodology for selecting the parameters k and K'. We perform an experimental comparison which shows that for both rigid and articulated robots, LocalRand results in roadmaps that are better connected than the traditional k-closest policy or a purely random neighbor selection policy. The cost required to achieve these results is shown to be comparable to k-closest. © 2012 IEEE.
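
    The policy itself is two lines on top of a k-d tree; a sketch with scipy, where the node data and parameters are illustrative:

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def local_rand_neighbors(nodes, i, k, K, rng):
        """LocalRand(k, K'): take the K' closest nodes to node i, then keep a
        random subset of k of them as connection candidates."""
        tree = cKDTree(nodes)
        # K + 1 because the query point is returned as its own nearest neighbor
        _, idx = tree.query(nodes[i], k=K + 1)
        idx = idx[idx != i][:K]                   # drop self, keep K' closest
        return rng.choice(idx, size=k, replace=False)

    rng = np.random.default_rng(0)
    nodes = rng.random((500, 6))                  # 500 configurations in C-space
    print(local_rand_neighbors(nodes, 0, k=5, K=20, rng=rng))
    ```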

  13. Local randomization in neighbor selection improves PRM roadmap quality

    KAUST Repository

    McMahon, Troy

    2012-10-01

    Probabilistic Roadmap Methods (PRMs) are one of the most used classes of motion planning methods. These sampling-based methods generate robot configurations (nodes) and then connect them to form a graph (roadmap) containing representative feasible pathways. A key step in PRM roadmap construction involves identifying a set of candidate neighbors for each node. Traditionally, these candidates are chosen to be the k-closest nodes based on a given distance metric. In this paper, we propose a new neighbor selection policy called LocalRand(k,K'), that first computes the K' closest nodes to a specified node and then selects k of those nodes at random. Intuitively, LocalRand attempts to benefit from random sampling while maintaining the higher levels of local planner success inherent to selecting more local neighbors. We provide a methodology for selecting the parameters k and K'. We perform an experimental comparison which shows that for both rigid and articulated robots, LocalRand results in roadmaps that are better connected than the traditional k-closest policy or a purely random neighbor selection policy. The cost required to achieve these results is shown to be comparable to k-closest. © 2012 IEEE.

  14. Shape Modelling Using Markov Random Field Restoration of Point Correspondences

    DEFF Research Database (Denmark)

    Paulsen, Rasmus Reinhold; Hilger, Klaus Baggesen

    2003-01-01

    A method for building statistical point distribution models is proposed. The novelty in this paper is the adaptation of Markov random field regularization of the correspondence field over the set of shapes. The new approach leads to a generative model that produces highly homogeneous polygonized sh...

  15. Influencing Food Selection with Point-of-Choice Nutrition Information.

    Science.gov (United States)

    Davis-Chervin, Doryn; And Others

    1985-01-01

    Evaluated the effectiveness of a point-of-choice nutrition information program that used a comprehensive set of communication functions in its design. Results indicate that point-of-choice information without direct tangible rewards can (to a moderate degree) modify food-selection behavior of cafeteria patrons. (JN)

  16. Application of random coherence order selection in gradient-enhanced multidimensional NMR

    International Nuclear Information System (INIS)

    Bostock, Mark J.; Nietlispach, Daniel

    2016-01-01

    Development of multidimensional NMR is essential to many applications, for example in high resolution structural studies of biomolecules. Multidimensional techniques enable separation of NMR signals over several dimensions, improving signal resolution, whilst also allowing identification of new connectivities. However, these advantages come at a significant cost. The Fourier transform theorem requires acquisition of a grid of regularly spaced points to satisfy the Nyquist criterion, while frequency discrimination and acquisition of a pure phase spectrum require acquisition of both quadrature components for each time point in every indirect (non-acquisition) dimension, adding a factor of 2^(N-1) to the number of free-induction decays which must be acquired, where N is the number of dimensions. Compressed sensing (CS) ℓ1-norm minimisation in combination with non-uniform sampling (NUS) has been shown to be extremely successful in overcoming the Nyquist criterion. Previously, maximum entropy reconstruction has also been used to overcome the limitation of frequency discrimination, processing data acquired with only one quadrature component at a given time interval, known as random phase detection (RPD), allowing a factor of two reduction in the number of points for each indirect dimension (Maciejewski et al. 2011 PNAS 108 16640). However, whilst this approach can be easily applied in situations where the quadrature components are acquired as amplitude modulated data, the same principle is not easily extended to phase modulated (P-/N-type) experiments where data is acquired in the form exp(iωt) or exp(-iωt), and which make up many of the multidimensional experiments used in modern NMR. Here we demonstrate a modification of the CS ℓ1-norm approach to allow random coherence order selection (RCS) for phase modulated experiments; we generalise the nomenclature for RCS and RPD as random quadrature detection (RQD). With this method, the power of RQD can be extended
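
    For context, the CS reconstruction underlying such NUS processing can be prototyped with a plain ISTA solver: measure a sparse spectrum at random time points through a partial inverse DFT and recover it by ℓ1-regularized least squares. This generic sketch does not implement the RCS/RQD coherence selection itself, and all sizes and constants are illustrative.

    ```python
    import numpy as np

    def ista_l1(A, y, lam, eta, iters=1000):
        """ISTA for min_x 0.5*||Ax - y||^2 + lam*||x||_1 (complex-valued)."""
        x = np.zeros(A.shape[1], dtype=complex)
        for _ in range(iters):
            r = x + eta * A.conj().T @ (y - A @ x)          # gradient step
            x = np.maximum(np.abs(r) - eta * lam, 0) * np.exp(1j * np.angle(r))
        return x

    n, m = 256, 64                                   # 25% non-uniform sampling
    rng = np.random.default_rng(0)
    keep = np.sort(rng.choice(n, m, replace=False))  # random time points
    F = np.fft.ifft(np.eye(n), norm="ortho")         # inverse DFT matrix
    A = F[keep]                                      # partial measurement operator
    x_true = np.zeros(n, dtype=complex)
    x_true[[20, 75, 140]] = [1.0, 0.7j, 0.5]         # three spectral lines
    y = A @ x_true
    x_hat = ista_l1(A, y, lam=0.01, eta=1.0)
    print(np.sort(np.argsort(np.abs(x_hat))[-3:]))   # recovers [20, 75, 140]
    ```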

  17. Simultaneous feature selection and parameter optimisation using an artificial ant colony: case study of melting point prediction

    Directory of Open Access Journals (Sweden)

    Nigsch Florian

    2008-10-01

    Full Text Available Abstract Background We present a novel feature selection algorithm, Winnowing Artificial Ant Colony (WAAC), that performs simultaneous feature selection and model parameter optimisation for the development of predictive quantitative structure-property relationship (QSPR) models. The WAAC algorithm is an extension of the modified ant colony algorithm of Shen et al. (J Chem Inf Model 2005, 45: 1024–1029). We test the ability of the algorithm to develop a predictive partial least squares model for the Karthikeyan dataset (J Chem Inf Model 2005, 45: 581–590) of melting point values. We also test its ability to perform feature selection on a support vector machine model for the same dataset. Results Starting from an initial set of 203 descriptors, the WAAC algorithm selected a PLS model with 68 descriptors which has an RMSE on an external test set of 46.6°C and R2 of 0.51. The number of components chosen for the model was 49, which was close to optimal for this feature selection. The selected SVM model has 28 descriptors (cost of 5, ε of 0.21) and an RMSE of 45.1°C and R2 of 0.54. This model outperforms a kNN model (RMSE of 48.3°C, R2 of 0.47) for the same data and has similar performance to a Random Forest model (RMSE of 44.5°C, R2 of 0.55). However it is much less prone to bias at the extremes of the range of melting points as shown by the slope of the line through the residuals: -0.43 for WAAC/SVM, -0.53 for Random Forest. Conclusion With a careful choice of objective function, the WAAC algorithm can be used to optimise machine learning and regression models that suffer from overfitting. Where model parameters also need to be tuned, as is the case with support vector machine and partial least squares models, it can optimise these simultaneously. The moving probabilities used by the algorithm are easily interpreted in terms of the best and current models of the ants, and the winnowing procedure promotes the removal of irrelevant descriptors.

  18. Simultaneous feature selection and parameter optimisation using an artificial ant colony: case study of melting point prediction.

    Science.gov (United States)

    O'Boyle, Noel M; Palmer, David S; Nigsch, Florian; Mitchell, John B O

    2008-10-29

    We present a novel feature selection algorithm, Winnowing Artificial Ant Colony (WAAC), that performs simultaneous feature selection and model parameter optimisation for the development of predictive quantitative structure-property relationship (QSPR) models. The WAAC algorithm is an extension of the modified ant colony algorithm of Shen et al. (J Chem Inf Model 2005, 45: 1024-1029). We test the ability of the algorithm to develop a predictive partial least squares model for the Karthikeyan dataset (J Chem Inf Model 2005, 45: 581-590) of melting point values. We also test its ability to perform feature selection on a support vector machine model for the same dataset. Starting from an initial set of 203 descriptors, the WAAC algorithm selected a PLS model with 68 descriptors which has an RMSE on an external test set of 46.6 degrees C and R2 of 0.51. The number of components chosen for the model was 49, which was close to optimal for this feature selection. The selected SVM model has 28 descriptors (cost of 5, epsilon of 0.21) and an RMSE of 45.1 degrees C and R2 of 0.54. This model outperforms a kNN model (RMSE of 48.3 degrees C, R2 of 0.47) for the same data and has similar performance to a Random Forest model (RMSE of 44.5 degrees C, R2 of 0.55). However it is much less prone to bias at the extremes of the range of melting points as shown by the slope of the line through the residuals: -0.43 for WAAC/SVM, -0.53 for Random Forest. With a careful choice of objective function, the WAAC algorithm can be used to optimise machine learning and regression models that suffer from overfitting. Where model parameters also need to be tuned, as is the case with support vector machine and partial least squares models, it can optimise these simultaneously. The moving probabilities used by the algorithm are easily interpreted in terms of the best and current models of the ants, and the winnowing procedure promotes the removal of irrelevant descriptors.

  19. Random fixed point equations and inverse problems using "collage method" for contraction mappings

    Science.gov (United States)

    Kunze, H. E.; La Torre, D.; Vrscay, E. R.

    2007-10-01

    In this paper we are interested in the direct and inverse problems for the following class of random fixed point equations: T(w, x(w)) = x(w), where T : Ω × X → X is a given operator, Ω is a probability space and X is a Polish metric space. The inverse problem is solved by recourse to the collage theorem for contractive maps. We then consider two applications: (i) random integral equations, and (ii) random iterated function systems with greyscale maps (RIFSM), for which noise is added to the classical IFSM.

  20. Statistical theory of dislocation configurations in a random array of point obstacles

    International Nuclear Information System (INIS)

    Labusch, R.

    1977-01-01

    The stable configurations of a dislocation in an infinite random array of point obstacles are analyzed using the mathematical methods of statistical mechanics. The theory provides exact distribution functions of the forces on pinning points and of the link lengths between points on the line. The expected number of stable configurations is a function of the applied stress. This number drops to zero at the critical stress. Due to a degeneracy problem in the line count, the value of the flow stress cannot be determined rigorously, but we can give a good approximation that is very close to the empirical value

  1. Saddle-points of a two dimensional random lattice theory

    International Nuclear Information System (INIS)

    Pertermann, D.

    1985-07-01

    A two dimensional random lattice theory with a free massless scalar field is considered. We analyse the field theoretic generating functional for any given choice of positions of the lattice sites. Asking for saddle-points of this generating functional with respect to the positions, we find the hexagonal lattice and a triangulated version of the hypercubic lattice as candidates. The investigation of the neighbourhood of a single lattice site yields triangulated rectangles and regular polygons extremizing the above generating functional on the local level. (author)

  2. Entropy Based Test Point Evaluation and Selection Method for Analog Circuit Fault Diagnosis

    Directory of Open Access Journals (Sweden)

    Yuan Gao

    2014-01-01

    Full Text Available By simplifying the tolerance problem and treating faulty voltages on different test points as independent variables, the integer-coded table technique is proposed to simplify the test point selection process. However, simplifying the tolerance problem may induce a wrong solution, while the independence assumption results in an overly conservative one. To address these problems, the tolerance problem is thoroughly considered in this paper, and the dependency relationship between different test points is considered at the same time. A heuristic graph search method is proposed to facilitate the test point selection process. First, the information theoretic concept of entropy is used to evaluate the optimality of a test point. The entropy is calculated using the ambiguity sets and the faulty voltage distribution, determined by component tolerance. Second, the selected optimal test point is used to expand the current graph node by using the dependence relationship between the test point and the graph node. Simulated results indicate that the proposed method finds the optimal set of test points more accurately than other methods; therefore, it is a good solution for minimizing the size of the test point set. To simplify and clarify the proposed method, only catastrophic and some specific parametric faults are discussed in this paper.
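
    A small sketch of entropy-guided selection on an integer-coded table (rows: candidate faults; columns: test points; entries: coded voltage bands, with tolerance folded into the coding). Combining each candidate test point with the codes already chosen captures the dependence between test points; the table and the greedy strategy are illustrative, not the paper's exact graph search.

    ```python
    import numpy as np

    def test_point_entropy(codes):
        """Entropy of the fault partition induced by a test point: faults that
        share a code fall into the same ambiguity set."""
        _, counts = np.unique(codes, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    def greedy_test_points(table):
        """Repeatedly pick the max-entropy test point until all faults separate."""
        n_faults, n_tp = table.shape
        chosen, signature = [], [""] * n_faults
        while len(set(signature)) < n_faults and len(chosen) < n_tp:
            remaining = [j for j in range(n_tp) if j not in chosen]
            # Signature combines codes of already-chosen points with candidate j,
            # so the entropy accounts for dependence between test points.
            combined = lambda j: [f"{s}|{table[i, j]}" for i, s in enumerate(signature)]
            j_best = max(remaining, key=lambda j: test_point_entropy(combined(j)))
            chosen.append(j_best)
            signature = combined(j_best)
        return chosen

    table = np.array([[1, 0, 2],
                      [1, 1, 2],
                      [0, 1, 2],
                      [0, 0, 1]])          # 4 faults x 3 test points
    print(greedy_test_points(table))       # [0, 1]: two points separate all faults
    ```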

  3. Selection rule for Dirac-like points in two-dimensional dielectric photonic crystals

    KAUST Repository

    Li, Yan

    2013-01-01

    We developed a selection rule for Dirac-like points in two-dimensional dielectric photonic crystals. The rule is derived from a perturbation theory and states that a non-zero mode-coupling integral between the degenerate Bloch states guarantees a Dirac-like point, regardless of the type of the degeneracy. In fact, the selection rule can also be determined from the symmetry of the Bloch states even without computing the integral. Thus, the existence of Dirac-like points can be quickly and conclusively predicted for various photonic crystals independent of wave polarization, lattice structure, and composition. © 2013 Optical Society of America.

  4. Gradients estimation from random points with volumetric tensor in turbulence

    Science.gov (United States)

    Watanabe, Tomoaki; Nagata, Koji

    2017-12-01

    We present an estimation method of fully-resolved/coarse-grained gradients from randomly distributed points in turbulence. The method is based on a linear approximation of spatial gradients expressed with the volumetric tensor, which is a 3 × 3 matrix determined by the geometric distribution of the points. The coarse-grained gradient can be considered as a low-pass filtered gradient, whose cutoff is estimated with the eigenvalues of the volumetric tensor. The present method, the volumetric tensor approximation, is tested for velocity and passive scalar gradients in an incompressible planar jet and mixing layer. Comparison with a finite difference approximation on a Cartesian grid shows that the volumetric tensor approximation computes the coarse-grained gradients fairly well at a moderate computational cost under various conditions of spatial distributions of points. We also show that imposing the solenoidal condition improves the accuracy of the present method for solenoidal vectors, such as the velocity vector in incompressible flows, especially when the number of points is not large. The volumetric tensor approximation with 4 points poorly estimates the gradient because of the anisotropic distribution of the points. Increasing the number of points from 4 significantly improves the accuracy. Although the coarse-grained gradient changes with the cutoff length, the volumetric tensor approximation yields a coarse-grained gradient whose magnitude is close to the one obtained by the finite difference. We also show that the velocity gradient estimated with the present method well captures turbulence characteristics such as local flow topology, amplification of enstrophy and strain, and energy transfer across scales.
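
    The estimator is a small least-squares problem: minimizing the sum of (δf_i - g·δx_i)² over the gradient g gives V g = Σ δf_i δx_i with the volumetric tensor V = Σ δx_i δx_iᵀ, whose eigenvalues indicate the effective filter width. A sketch verified on a linear field, with illustrative data:

    ```python
    import numpy as np

    def volumetric_tensor_gradient(x0, f0, xs, fs):
        """Least-squares gradient at x0 from scattered samples (xs, fs).

        Minimizing sum_i (fs_i - f0 - g.(xs_i - x0))^2 yields V g = m with the
        volumetric tensor V = sum_i dx_i dx_i^T and m = sum_i df_i dx_i."""
        dx = xs - x0                       # (N, 3) offsets
        df = fs - f0                       # (N,) increments
        V = dx.T @ dx                      # 3 x 3 volumetric tensor
        m = dx.T @ df
        return np.linalg.solve(V, m)       # conditioning of V sets the accuracy

    # Check on a known linear field f(x) = a.x
    rng = np.random.default_rng(0)
    a = np.array([1.0, -2.0, 0.5])
    xs = rng.normal(size=(8, 3)) * 0.01    # 8 nearby random points
    fs = xs @ a
    print(volumetric_tensor_gradient(np.zeros(3), 0.0, xs, fs))   # ~ a
    ```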

  5. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sample selection by random number... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square... area created in accordance with paragraph (a) of this section, select two random numbers: one each for...

  6. Interference-aware random beam selection for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed M.

    2012-09-01

    Spectrum sharing systems have been introduced to alleviate the problem of spectrum scarcity by allowing secondary unlicensed networks to share the spectrum with primary licensed networks under acceptable interference levels to the primary users. In this paper, we develop interference-aware random beam selection schemes that provide enhanced throughput for the secondary link under the condition that the interference observed at the primary link is within a predetermined acceptable value. For a secondary transmitter equipped with multiple antennas, our schemes select a random beam, among a set of power-optimized orthogonal random beams, that maximizes the capacity of the secondary link while satisfying the interference constraint at the primary receiver for different levels of feedback information describing the interference level at the primary receiver. For the proposed schemes, we develop a statistical analysis for the signal-to-noise and interference ratio (SINR) statistics as well as the capacity of the secondary link. Finally, we present numerical results that study the effect of system parameters including number of beams and the maximum transmission power on the capacity of the secondary link attained using the proposed schemes. © 2012 IEEE.
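
    A simplified sketch of the selection rule, assuming full channel knowledge (so the feedback-level machinery of the paper is not modeled): generate orthonormal random beams, discard those violating the interference constraint at the primary receiver, and keep the admissible beam with the largest secondary capacity. All channel values and parameters are illustrative.

    ```python
    import numpy as np

    def interference_aware_beam(h_s, h_p, n_beams, power, i_max, noise=1.0, seed=0):
        """Return (beam index, capacity) of the best admissible random beam,
        or None if every beam violates the interference constraint."""
        rng = np.random.default_rng(seed)
        nt = h_s.size
        beams, _ = np.linalg.qr(rng.standard_normal((nt, nt))
                                + 1j * rng.standard_normal((nt, nt)))
        beams = beams[:, :n_beams]                  # orthonormal random beams
        best = None
        for b in range(n_beams):
            w = beams[:, b]
            interference = power * abs(np.vdot(h_p, w)) ** 2   # at primary RX
            if interference > i_max:
                continue                            # constraint violated
            cap = np.log2(1 + power * abs(np.vdot(h_s, w)) ** 2 / noise)
            if best is None or cap > best[1]:
                best = (b, cap)
        return best

    # Illustrative 4-antenna secondary transmitter, Rayleigh channels
    rng = np.random.default_rng(1)
    h_s = (rng.standard_normal(4) + 1j * rng.standard_normal(4)) / np.sqrt(2)
    h_p = (rng.standard_normal(4) + 1j * rng.standard_normal(4)) / np.sqrt(2)
    print(interference_aware_beam(h_s, h_p, n_beams=4, power=10.0, i_max=1.0))
    ```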

  7. Interference-aware random beam selection for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed M.; Sayed, Mostafa M.; Alouini, Mohamed-Slim; Qaraqe, Khalid A.

    2012-01-01

    In this paper, we develop interference-aware random beam selection schemes that provide enhanced throughput for the secondary link under the condition that the interference observed at the primary link is within a predetermined acceptable value. For a secondary

  8. Exact two-point resistance, and the simple random walk on the complete graph minus N edges

    Energy Technology Data Exchange (ETDEWEB)

    Chair, Noureddine, E-mail: n.chair@ju.edu.jo

    2012-12-15

    An analytical approach is developed to obtain the exact expressions for the two-point resistance and the total effective resistance of the complete graph minus N edges of the opposite vertices. These expressions are written in terms of certain numbers that we introduce, which we call the Bejaia and the Pisa numbers; these numbers are the natural generalizations of the bisected Fibonacci and Lucas numbers. The correspondence between random walks and resistor networks is then used to obtain the exact expressions for the first passage and mean first passage times on this graph. - Highlights: • We obtain exact formulas for the two-point resistance of the complete graph minus N edges. • We also obtain the total effective resistance of this graph. • We modified Schwatt's formula on trigonometrical power sums to suit our computations. • We introduced the generalized bisected Fibonacci and Lucas numbers: the Bejaia and the Pisa numbers. • The first passage and mean first passage times of the random walks have exact expressions.
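
    The closed-form results can be cross-checked numerically with the standard identity relating effective resistance to the Moore-Penrose pseudoinverse of the graph Laplacian, R_uv = L+_uu + L+_vv - 2 L+_uv. The sketch below applies it to one plausible reading of the construction (removing N edges between opposite vertices of the complete graph); the concrete graph details are an assumption for illustration.

```python
import numpy as np

def two_point_resistance(adj, u, v):
    """Effective resistance between u and v from the graph Laplacian's
    Moore-Penrose pseudoinverse: R_uv = L+_uu + L+_vv - 2 L+_uv."""
    lap = np.diag(adj.sum(axis=1)) - adj
    lp = np.linalg.pinv(lap)
    return lp[u, u] + lp[v, v] - 2 * lp[u, v]

n = 8                                  # vertices of the complete graph K_n
adj = np.ones((n, n)) - np.eye(n)
# Remove edges between "opposite" vertices i and i + n//2 (one reading of
# the construction in the paper; purely illustrative).
for i in range(n // 2):
    adj[i, i + n // 2] = adj[i + n // 2, i] = 0
print(two_point_resistance(adj, 0, n // 2))
```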

  9. A random point process model for the score in sport matches

    Czech Academy of Sciences Publication Activity Database

    Volf, Petr

    2009-01-01

    Roč. 20, č. 2 (2009), s. 121-131 ISSN 1471-678X R&D Projects: GA AV ČR(CZ) IAA101120604 Institutional research plan: CEZ:AV0Z10750506 Keywords : sport statistics * scoring intensity * Cox’s regression model Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2009/SI/volf-a random point process model for the score in sport matches.pdf

  10. The signature of positive selection at randomly chosen loci.

    OpenAIRE

    Przeworski, Molly

    2002-01-01

    In Drosophila and humans, there are accumulating examples of loci with a significant excess of high-frequency-derived alleles or high levels of linkage disequilibrium, relative to a neutral model of a random-mating population of constant size. These are features expected after a recent selective sweep. Their prevalence suggests that positive directional selection may be widespread in both species. However, as I show here, these features do not persist long after the sweep ends: The high-frequ...

  11. Simulated Performance Evaluation of a Selective Tracker Through Random Scenario Generation

    DEFF Research Database (Denmark)

    Hussain, Dil Muhammad Akbar

    2006-01-01

    The paper presents a simulation study on the performance of a target tracker using a selective track-splitting filter algorithm through a random scenario implemented on a digital signal processor. In a typical track-splitting filter, all the observations which fall inside a likelihood ellipse are used for the update; in our proposed selective track-splitting filter, however, a smaller number of observations is used for the track update. Much of the previous performance work [1] has been done on specific (deterministic) scenarios. One of the reasons for considering the specific scenarios, which were ... performance assessment. Therefore, a random target motion scenario is adopted. Its implementation, in particular for testing the proposed selective track-splitting algorithm using Kalman filters, is investigated through a number of performance parameters which give the activity profile of the tracking scenario.
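
    The likelihood-ellipse test mentioned above is a Mahalanobis-distance (chi-square) gate. The following is a minimal sketch with invented innovation covariance and threshold; a selective track-splitting filter would additionally keep only the few best-scoring observations among those that pass the gate.

```python
import numpy as np

def gate(observations, z_pred, s_cov, gamma=9.21):
    """Return observations inside the likelihood ellipse
    (z - z_pred)^T S^-1 (z - z_pred) <= gamma, a chi-square gate
    (9.21 is the 99% point for 2-D measurements)."""
    s_inv = np.linalg.inv(s_cov)
    d = observations - z_pred
    d2 = np.einsum('ni,ij,nj->n', d, s_inv, d)  # squared Mahalanobis distances
    return observations[d2 <= gamma], d2

obs = np.array([[1.1, 0.9], [3.0, 3.2], [0.8, 1.2]])
z_pred = np.array([1.0, 1.0])      # predicted measurement from the Kalman filter
s_cov = np.eye(2) * 0.25           # innovation covariance (invented)
inside, scores = gate(obs, z_pred, s_cov)
print(inside)                      # the two observations near (1, 1) pass
```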

  12. Effect of Electroacupuncture at The Zusanli Point (Stomach-36) on Dorsal Random Pattern Skin Flap Survival in a Rat Model.

    Science.gov (United States)

    Wang, Li-Ren; Cai, Le-Yi; Lin, Ding-Sheng; Cao, Bin; Li, Zhi-Jie

    2017-10-01

    Random skin flaps are commonly used for wound repair and reconstruction. Electroacupuncture at the Zusanli point could enhance microcirculation and blood perfusion in random skin flaps. To determine whether electroacupuncture at the Zusanli point can improve the survival of random skin flaps in a rat model, thirty-six male Sprague-Dawley rats were randomly divided into 3 groups: a control group (no electroacupuncture), Group A (electroacupuncture at a nonacupoint near the Zusanli point), and Group B (electroacupuncture at the Zusanli point). McFarlane flaps were established. On postoperative Day 2, malondialdehyde (MDA) and superoxide dismutase were measured. The flap survival rate was evaluated, inflammation was examined in hematoxylin and eosin-stained slices, and the expression of vascular endothelial growth factor (VEGF) was measured immunohistochemically on Day 7. The mean survival area of the flaps in Group B was significantly larger than that in the control group and Group A. Superoxide dismutase activity and VEGF expression level were significantly higher in Group B than in the control group and Group A, whereas MDA and inflammation levels in Group B were significantly lower than those in the other 2 groups. Electroacupuncture at the Zusanli point can effectively improve random flap survival.

  13. Selection for altruism through random drift in variable size populations

    Directory of Open Access Journals (Sweden)

    Houchmandzadeh Bahram

    2012-05-01

    Full Text Available Abstract Background Altruistic behavior is defined as helping others at a cost to oneself and a lowered fitness. The lower fitness implies that altruists should be selected against, which is in contradiction with their widespread presence in nature. Present models of selection for altruism (kin or multilevel) show that altruistic behaviors can have ‘hidden’ advantages if the ‘common good’ produced by altruists is restricted to some related or unrelated groups. These models are mostly deterministic, or assume a frequency-dependent fitness. Results Evolutionary dynamics is a competition between deterministic selection pressure and stochastic events due to random sampling from one generation to the next. We show here that an altruistic allele extending the carrying capacity of the habitat can win by increasing the random drift of “selfish” alleles. In other terms, the fixation probability of altruistic genes can be higher than that of selfish ones, even though altruists have a smaller fitness. Moreover, when populations are geographically structured, the altruists' advantage can be highly amplified and the fixation probability of selfish genes can tend toward zero. The above results are obtained both by numerical and analytical calculations. Analytical results are obtained in the limit of large populations. Conclusions The theory we present does not involve kin or multilevel selection, but is based on the existence of random drift in variable size populations. The model is a generalization of the original Fisher-Wright and Moran models where the carrying capacity depends on the number of altruists.
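
    The interplay of drift and selection behind these results can be illustrated with a toy Monte-Carlo estimate of fixation probabilities in a fixed-size Wright-Fisher population. This is only a sketch: the paper's model additionally lets altruists change the carrying capacity, which is not reproduced here, and all parameter values are invented.

```python
import numpy as np

def fixation_probability(n_pop, s, p0, trials=1000, seed=0):
    """Monte-Carlo fixation probability of an allele with selection
    coefficient s in a Wright-Fisher population of constant size n_pop.
    Toy model with invented parameters; the paper's model additionally
    lets altruists extend the carrying capacity."""
    rng = np.random.default_rng(seed)
    fixed = 0
    for _ in range(trials):
        p = p0
        while 0.0 < p < 1.0:
            # Selection biases the sampling frequency; drift resamples it.
            w = p * (1 + s) / (p * (1 + s) + (1 - p))
            p = rng.binomial(n_pop, w) / n_pop
        fixed += p == 1.0
    return fixed / trials

# A mildly deleterious ("altruistic") allele still fixes by drift,
# and does so more easily in smaller populations.
print(fixation_probability(n_pop=50, s=-0.01, p0=0.02))
print(fixation_probability(n_pop=500, s=-0.01, p0=0.02))
```

    The comparison shows the paper's starting point: a mildly deleterious allele still fixes with appreciable probability when drift is strong (small populations), and fixation becomes rarer as the population grows.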

  14. Speeding up coarse point cloud registration by threshold-independent baysac match selection

    NARCIS (Netherlands)

    Kang, Z.; Lindenbergh, R.C.; Pu, S

    2016-01-01

    This paper presents an algorithm for the automatic registration of terrestrial point clouds by match selection using an efficient conditional sampling method, Threshold-independent BaySAC (BAYes SAmpling Consensus), and employs the error metric of average point-to-surface residual to reduce
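
    BaySAC itself orders hypotheses by prior inlier probabilities, which is not reproduced here; the sketch below only illustrates the generic hypothesize-and-score loop and the average point-to-surface residual used as the error metric, on an invented plane-plus-outliers cloud.

```python
import numpy as np

def fit_plane(pts):
    """Least-squares plane through pts: returns (unit normal n, offset d)
    with the plane defined by n . x = d."""
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    return normal, normal @ centroid

def avg_point_to_plane_residual(pts, normal, d):
    return np.mean(np.abs(pts @ normal - d))

rng = np.random.default_rng(0)
cloud = np.c_[rng.uniform(size=(200, 2)), rng.normal(scale=0.01, size=200)]
cloud[::10] += rng.normal(scale=0.5, size=(20, 3))   # inject outliers

best = (np.inf, None)
for _ in range(50):                                  # hypothesis loop
    sample = cloud[rng.choice(len(cloud), 3, replace=False)]
    n, d = fit_plane(sample)
    r = avg_point_to_plane_residual(cloud, n, d)
    if r < best[0]:
        best = (r, (n, d))
print("best average point-to-plane residual:", best[0])
```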

  15. Selection bias and subject refusal in a cluster-randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Rochelle Yang

    2017-07-01

    Full Text Available Abstract Background Selection bias and non-participation bias are major methodological concerns which impact external validity. Cluster-randomized controlled trials are especially prone to selection bias as it is impractical to blind clusters to their allocation into intervention or control. This study assessed the impact of selection bias in a large cluster-randomized controlled trial. Methods The Improved Cardiovascular Risk Reduction to Enhance Rural Primary Care (ICARE) study examined the impact of a remote pharmacist-led intervention in twelve medical offices. To assess eligibility, a standardized form containing patient demographics and medical information was completed for each screened patient. Eligible patients were approached by the study coordinator for recruitment. Both the study coordinator and the patient were aware of the site’s allocation prior to consent. Patients who consented or declined to participate were compared across control and intervention arms for differing characteristics. Statistical significance was determined using a two-tailed, equal variance t-test and a chi-square test with adjusted Bonferroni p-values. Results were adjusted for random cluster variation. Results There were 2749 completed screening forms returned to research staff with 461 subjects who had either consented or declined participation. Patients with poorly controlled diabetes were found to be significantly more likely to decline participation in intervention sites compared to those in control sites. A higher mean diastolic blood pressure was seen in patients with uncontrolled hypertension who declined in the control sites compared to those who declined in the intervention sites. However, these findings were no longer significant after adjustment for random variation among the sites. After this adjustment, females were now found to be significantly more likely to consent than males (odds ratio = 1.41; 95% confidence interval = 1.03, 1

  16. Some results of the spectra of random Schroedinger operators and their application to random point interaction models in one and three dimensions

    International Nuclear Information System (INIS)

    Kirsch, W.; Martinelli, F.

    1981-01-01

    After the derivation of weak conditions under which the potential for the Schroedinger operator is well defined, the authors state an ergodicity assumption on this potential which ensures that the spectrum of this operator is a fixed non-random set. Then random point interaction Hamiltonians are considered in this framework. Finally, the authors consider a model where, for sufficiently small fluctuations around the equilibrium positions, a finite number of gaps appears. (HSI)

  17. Unbiased stereological estimation of d-dimensional volume in R^n from an isotropic random slice through a fixed point

    DEFF Research Database (Denmark)

    Jensen, Eva B. Vedel; Kiêu, K

    1994-01-01

    Unbiased stereological estimators of d-dimensional volume in R^n are derived, based on information from an isotropic random r-slice through a specified point. The content of the slice can be subsampled by means of a spatial grid. The estimators depend only on spatial distances. As a fundamental lemma, an explicit formula for the probability that an isotropic random r-slice in R^n through 0 hits a fixed point in R^n is given.

  18. Interference-aware random beam selection schemes for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed; Qaraqe, Khalid; Alouini, Mohamed-Slim

    2012-01-01

    users. In this work, we develop interference-aware random beam selection schemes that provide enhanced performance for the secondary network under the condition that the interference observed by the receivers of the primary network is below a

  19. Efficacy and tolerability balance of oxycodone/naloxone and tapentadol in chronic low back pain with a neuropathic component: a blinded end point analysis of randomly selected routine data from 12-week prospective open-label observations.

    Science.gov (United States)

    Ueberall, Michael A; Mueller-Schwefe, Gerhard H H

    2016-01-01

    To evaluate the benefit-risk profile (BRP) of oxycodone/naloxone (OXN) and tapentadol (TAP) in patients with chronic low back pain (cLBP) with a neuropathic component (NC) in routine clinical practice. This was a blinded end point analysis of randomly selected 12-week routine/open-label data of the German Pain Registry on adult patients with cLBP-NC who initiated an index treatment in compliance with the current German prescribing information between 1st January and 31st October 2015 (OXN/TAP, n=128/133). The primary end point was defined as a composite of three efficacy components (≥30% improvement of pain, pain-related disability, and quality of life, each at the end of observation vs baseline) and three tolerability components (normal bowel function, absence of central nervous system side effects, and absence of treatment-emergent adverse event [TEAE]-related treatment discontinuation during the observation period), adopted to reflect BRP assessments under real-life conditions. Demographic as well as baseline and pretreatment characteristics were comparable for the randomly selected data sets of both index groups without any indicators for critical selection biases. Treatment with OXN resulted formally in a BRP noninferior to that of TAP and showed a significantly higher primary end point response vs TAP (39.8% vs 25.6%, odds ratio: 1.93; P = 0.014), due to superior analgesic effects. Between-group differences increased with stricter response definitions for all three efficacy components in favor of OXN: ≥30%/≥50%/≥70% response rates for OXN vs TAP were seen for pain intensity in 85.2%/67.2%/39.1% vs 83.5%/54.1%/15.8% (P = ns/0.031/<0.001), for pain-related disability in 78.1%/64.8%/43.8% vs 66.9%/50.4%/24.8% (P = 0.043/0.018/0.001), and for quality of life in 76.6%/68.0%/50.0% vs 63.9%/54.1%/34.6% (P = 0.026/0.022/0.017). Overall, OXN vs TAP treatments were well tolerated, and proportions of patients who either maintained a normal bowel function (68.0% vs 72

  20. A Permutation Importance-Based Feature Selection Method for Short-Term Electricity Load Forecasting Using Random Forest

    Directory of Open Access Journals (Sweden)

    Nantian Huang

    2016-09-01

    Full Text Available The prediction accuracy of short-term load forecasting (STLF) depends on the prediction model choice and the feature selection result. In this paper, a novel random forest (RF)-based feature selection method for STLF is proposed. First, 243 related features were extracted from historical load data and the time information of prediction points to form the original feature set. Subsequently, the original feature set was used to train an RF as the original model. After the training process, the prediction error of the original model on the test set was recorded and the permutation importance (PI) value of each feature was obtained. Then, an improved sequential backward search method was used to select the optimal forecasting feature subset based on the PI value of each feature. Finally, the optimal forecasting feature subset was used to train a new RF model as the final prediction model. Experiments showed that the prediction accuracy of the RF trained on the optimal forecasting feature subset was higher than that of the original model and of comparative models based on support vector regression and artificial neural networks.
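
    A compact sketch of the permutation-importance step follows, using scikit-learn's random forest on synthetic data (scikit-learn assumed available; the paper's 243-feature load-forecasting pipeline and its improved sequential backward search are not reproduced). PI is measured as the drop in test score after shuffling one feature.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))
y = 3 * X[:, 0] + np.sin(X[:, 1]) + 0.1 * rng.normal(size=500)  # cols 2-5 are noise

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
base = rf.score(X_te, y_te)

pi = []
for j in range(X.shape[1]):
    Xp = X_te.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])   # break the feature-target link
    pi.append(base - rf.score(Xp, y_te))   # drop in R^2 = permutation importance
print("features ranked by PI:", np.argsort(pi)[::-1])
# A backward search would now drop the lowest-PI features one at a time,
# retraining and keeping the subset with the best validation error.
```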

  1. Generating equilateral random polygons in confinement III

    International Nuclear Information System (INIS)

    Diao, Y; Ernst, C; Montemayor, A; Ziegler, U

    2012-01-01

    In this paper we continue our earlier studies (Diao et al 2011 J. Phys. A: Math. Theor. 44 405202, Diao et al J. Phys. A: Math. Theor. 45 275203) on the generation methods of random equilateral polygons confined in a sphere. The first half of this paper is concerned with the generation of confined equilateral random walks. We show that if the selection of a vertex is uniform subject to the position of its previous vertex and the confining condition, then the distributions of the vertices are not uniform, although there exists a distribution such that if the initial vertex is selected following this distribution, then all vertices of the random walk follow this same distribution. Thus in order to generate a confined equilateral random walk, the selection of a vertex cannot be uniform subject to the position of its previous vertex and the confining condition. We provide a simple algorithm capable of generating confined equilateral random walks whose vertex distribution is almost uniform in the confinement sphere. In the second half of this paper we show that any process generating confined equilateral random walks can be turned into a process generating confined equilateral random polygons with the property that the vertex distribution of the polygons approaches the vertex distribution of the walks as the polygons get longer and longer. In our earlier studies, the starting point of the confined polygon is fixed at the center of the sphere. The new approach here allows us to move the starting point of the confined polygon off the center of the sphere. (paper)
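
    The confinement constraint itself is easy to state in code. The sketch below uses the naive rule discussed above, a uniform unit step resampled until it stays inside the sphere; as the paper proves, this conditional-uniform rule does not produce uniform vertex distributions, so the sketch illustrates the generation constraint only, not the corrected algorithm.

```python
import numpy as np

def confined_equilateral_walk(n_steps, radius, rng=None):
    """Equilateral (unit-step) random walk confined to a sphere of given
    radius, generated by naive rejection: propose a uniform unit step and
    resample until the new vertex stays inside the sphere."""
    rng = rng or np.random.default_rng(0)
    walk = [np.zeros(3)]
    for _ in range(n_steps):
        while True:
            step = rng.normal(size=3)
            step /= np.linalg.norm(step)   # uniform direction, unit length
            nxt = walk[-1] + step
            if np.linalg.norm(nxt) <= radius:
                walk.append(nxt)
                break
    return np.array(walk)

w = confined_equilateral_walk(100, radius=2.0)
print(np.linalg.norm(w, axis=1).max())     # never exceeds the radius
```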

  2. Continuous-Time Mean-Variance Portfolio Selection with Random Horizon

    International Nuclear Information System (INIS)

    Yu, Zhiyong

    2013-01-01

    This paper examines the continuous-time mean-variance optimal portfolio selection problem with random market parameters and random time horizon. Treating this problem as a linearly constrained stochastic linear-quadratic optimal control problem, I explicitly derive the efficient portfolios and efficient frontier in closed forms based on the solutions of two backward stochastic differential equations. Some related issues such as a minimum variance portfolio and a mutual fund theorem are also addressed. All the results are markedly different from those in the problem with deterministic exit time. A key part of my analysis involves proving the global solvability of a stochastic Riccati equation, which is interesting in its own right

  3. Continuous-Time Mean-Variance Portfolio Selection with Random Horizon

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Zhiyong, E-mail: yuzhiyong@sdu.edu.cn [Shandong University, School of Mathematics (China)

    2013-12-15

    This paper examines the continuous-time mean-variance optimal portfolio selection problem with random market parameters and random time horizon. Treating this problem as a linearly constrained stochastic linear-quadratic optimal control problem, I explicitly derive the efficient portfolios and efficient frontier in closed forms based on the solutions of two backward stochastic differential equations. Some related issues such as a minimum variance portfolio and a mutual fund theorem are also addressed. All the results are markedly different from those in the problem with deterministic exit time. A key part of my analysis involves proving the global solvability of a stochastic Riccati equation, which is interesting in its own right.

  4. TEHRAN AIR POLLUTANTS PREDICTION BASED ON RANDOM FOREST FEATURE SELECTION METHOD

    Directory of Open Access Journals (Sweden)

    A. Shamsoddini

    2017-09-01

    Full Text Available Air pollution, as one of the most serious forms of environmental pollution, poses a huge threat to human life. Air pollution leads to environmental instability and has harmful and undesirable effects on the environment. Modern methods for predicting pollutant concentrations are able to improve decision making and provide appropriate solutions. This study examines the performance of Random Forest feature selection in combination with multiple linear regression and Multilayer Perceptron Artificial Neural Network methods, in order to achieve an efficient model for estimating carbon monoxide, nitrogen dioxide, sulfur dioxide and PM2.5 contents in the air. The results indicated that Artificial Neural Networks fed by the attributes selected by the Random Forest feature selection method performed more accurately than the other models for all pollutants. The estimation accuracy for sulfur dioxide emissions was lower than for the other air contaminants, whereas nitrogen dioxide was predicted more accurately than the other pollutants.

  5. Tehran Air Pollutants Prediction Based on Random Forest Feature Selection Method

    Science.gov (United States)

    Shamsoddini, A.; Aboodi, M. R.; Karami, J.

    2017-09-01

    Air pollution, as one of the most serious forms of environmental pollution, poses a huge threat to human life. Air pollution leads to environmental instability and has harmful and undesirable effects on the environment. Modern methods for predicting pollutant concentrations are able to improve decision making and provide appropriate solutions. This study examines the performance of Random Forest feature selection in combination with multiple linear regression and Multilayer Perceptron Artificial Neural Network methods, in order to achieve an efficient model for estimating carbon monoxide, nitrogen dioxide, sulfur dioxide and PM2.5 contents in the air. The results indicated that Artificial Neural Networks fed by the attributes selected by the Random Forest feature selection method performed more accurately than the other models for all pollutants. The estimation accuracy for sulfur dioxide emissions was lower than for the other air contaminants, whereas nitrogen dioxide was predicted more accurately than the other pollutants.
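
    A minimal stand-in for this kind of pipeline, rank features with a random forest and feed the top-ranked ones to a neural network regressor, is sketched below on synthetic data with scikit-learn (assumed available); it is not the paper's data or exact configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 10))
y = X[:, 0] - 2 * X[:, 3] + 0.5 * X[:, 7] + 0.1 * rng.normal(size=600)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
top = np.argsort(rf.feature_importances_)[::-1][:3]   # keep 3 best features
print("selected features:", top)

mlp = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                   random_state=0).fit(X_tr[:, top], y_tr)
print("R^2 on held-out data:", mlp.score(X_te[:, top], y_te))
```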

  6. Can Ashi points stimulation have specific effects on shoulder pain? A systematic review of randomized controlled trials.

    Science.gov (United States)

    Wang, Kang-Feng; Zhang, Li-Juan; Lu, Feng; Lu, Yong-Hui; Yang, Chuan-Hua

    2016-06-01

    To provide an evidence-based overview of the efficacy of Ashi points stimulation for the treatment of shoulder pain. A comprehensive search [PubMed, Chinese Biomedical Literature Database, China National Knowledge Infrastructure (CNKI), Chongqing Weipu Database for Chinese Technical Periodicals (VIP) and Wanfang Database] was conducted to identify randomized or quasi-randomized controlled trials that evaluated the effectiveness of Ashi points stimulation for shoulder pain compared with conventional treatment. The methodological quality of the included studies was assessed using the Cochrane risk of bias tool. RevMan 5.0 was used for data synthesis. Nine trials were included. Seven studies assessed the effectiveness of Ashi points stimulation on response rate compared with conventional acupuncture. Their results suggested a significant effect in favour of Ashi points stimulation [odds ratio (OR): 5.89, 95% confidence interval (CI): 2.97 to 11.67, P ...]. A firm conclusion could not be reached until further studies of high quality are available.

  7. Hematological clozapine monitoring with a point-of-care device: A randomized cross-over trial

    DEFF Research Database (Denmark)

    Nielsen, Jimmi; Thode, Dorrit; Stenager, Elsebeth

    for several reasons, perhaps most importantly because of the mandatory hematological monitoring. The Chempaq Express Blood Counter (Chempaq XBC) is a point-of-care device providing counts of white blood cells (WBC) and granulocytes based on a capillary blood sampling. A randomized cross-over trial design...

  8. Hebbian Learning in a Random Network Captures Selectivity Properties of the Prefrontal Cortex

    Science.gov (United States)

    Lindsay, Grace W.

    2017-01-01

    Complex cognitive behaviors, such as context-switching and rule-following, are thought to be supported by the prefrontal cortex (PFC). Neural activity in the PFC must thus be specialized to specific tasks while retaining flexibility. Nonlinear “mixed” selectivity is an important neurophysiological trait for enabling complex and context-dependent behaviors. Here we investigate (1) the extent to which the PFC exhibits computationally relevant properties, such as mixed selectivity, and (2) how such properties could arise via circuit mechanisms. We show that PFC cells recorded from male and female rhesus macaques during a complex task show a moderate level of specialization and structure that is not replicated by a model wherein cells receive random feedforward inputs. While random connectivity can be effective at generating mixed selectivity, the data show significantly more mixed selectivity than predicted by a model with otherwise matched parameters. A simple Hebbian learning rule applied to the random connectivity, however, increases mixed selectivity and enables the model to match the data more accurately. To explain how learning achieves this, we provide analysis along with a clear geometric interpretation of the impact of learning on selectivity. After learning, the model also matches the data on measures of noise, response density, clustering, and the distribution of selectivities. Of two styles of Hebbian learning tested, the simpler and more biologically plausible option better matches the data. These modeling results provide clues about how neural properties important for cognition can arise in a circuit and make clear experimental predictions regarding how various measures of selectivity would evolve during animal training. SIGNIFICANCE STATEMENT The prefrontal cortex is a brain region believed to support the ability of animals to engage in complex behavior. How neurons in this area respond to stimuli—and in particular, to combinations of stimuli (
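
    A toy version of the modeling ingredient described here, random feedforward weights reshaped by a Hebbian update of the form dW = eta * outer(post, pre), can be sketched as follows. The network sizes, the normalization and the two-context stimulus model are invented for illustration and are far simpler than the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out, eta = 50, 20, 0.01
W = rng.normal(scale=1 / np.sqrt(n_in), size=(n_out, n_in))  # random feedforward weights

def relu(x):
    return np.maximum(x, 0.0)

# Two task "contexts": stimuli drawn from context-specific distributions.
means = rng.normal(size=(2, n_in))
for _ in range(1000):
    ctx = rng.integers(2)
    x = means[ctx] + 0.3 * rng.normal(size=n_in)   # presynaptic activity
    y = relu(W @ x)                                # postsynaptic activity
    W += eta * np.outer(y, x) / n_in               # Hebbian: fire together, wire together
    W /= np.linalg.norm(W, axis=1, keepdims=True)  # normalization keeps weights bounded

# After learning, responses to the two contexts separate more than under
# purely random weights, a crude analogue of increased selectivity.
r0, r1 = relu(W @ means[0]), relu(W @ means[1])
print("context response correlation:", np.corrcoef(r0, r1)[0, 1])
```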

  9. Performance Evaluation of User Selection Protocols in Random Networks with Energy Harvesting and Hardware Impairments

    Directory of Open Access Journals (Sweden)

    Tan Nhat Nguyen

    2016-01-01

    Full Text Available In this paper, we evaluate the performance of various user selection protocols under the impact of hardware impairments. In the considered protocols, a Base Station (BS) selects one of the available Users (USs) to serve, while the remaining USs harvest energy from the Radio Frequency (RF) signal transmitted by the BS. We assume that all of the USs appear randomly around the BS. In the Random Selection Protocol (RAN), the BS randomly selects a US for data transmission. In the second proposed protocol, named the Minimum Distance Protocol (MIND), the US that is nearest to the BS will be chosen. In the Optimal Selection Protocol (OPT), the US providing the highest channel gain between itself and the BS will be served. For performance evaluation, we derive exact and asymptotic closed-form expressions for the average Outage Probability (OP) over Rayleigh fading channels. We also consider the average harvested energy per US. Finally, Monte-Carlo simulations are performed to verify the theoretical results.
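
    The three selection rules are easy to state in code. The sketch below draws users at random around the BS with an invented path-loss-plus-Rayleigh-fading gain model (the paper's analysis, hardware impairments and energy-harvesting bookkeeping are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(0)
n_users = 10
# Users scattered at random around the BS; path loss times Rayleigh fading.
dist = rng.uniform(5.0, 50.0, size=n_users)
fading = rng.exponential(size=n_users)       # Rayleigh power fading
gain = fading * dist ** -3.0                 # path-loss exponent 3 (illustrative)

ran = rng.integers(n_users)                  # RAN: pick any user at random
mind = int(np.argmin(dist))                  # MIND: nearest user
opt = int(np.argmax(gain))                   # OPT: best channel gain
for name, u in [("RAN", ran), ("MIND", mind), ("OPT", opt)]:
    print(f"{name}: user {u}, gain {gain[u]:.2e}")
```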

  10. Repeated tender point injections of granisetron alleviate chronic myofascial pain--a randomized, controlled, double-blinded trial.

    Science.gov (United States)

    Christidis, Nikolaos; Omrani, Shahin; Fredriksson, Lars; Gjelset, Mattias; Louca, Sofia; Hedenberg-Magnusson, Britt; Ernberg, Malin

    2015-01-01

    Serotonin (5-HT) mediates pain via peripheral 5-HT3 receptors. Results from a few studies indicate that intramuscular injections of 5-HT3 antagonists may reduce musculoskeletal pain. The aim of this study was to investigate whether repeated intramuscular tender-point injections of the 5-HT3 antagonist granisetron alleviate pain in patients with myofascial temporomandibular disorders (M-TMD). This prospective, randomized, controlled, double-blind, parallel-arm trial (RCT) was carried out at two centers in Stockholm, Sweden. The randomization was performed, with an internet-based application ( www.randomization.com ), by a researcher who did not participate in data collection. 40 patients with a diagnosis of M-TMD according to the Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD) were randomized to receive repeated injections, one week apart, with either granisetron (GRA; 3 mg) or isotonic saline as control (CTR). The median weekly pain intensities decreased significantly at all follow-ups (1, 2 and 6 months) in the GRA group (Friedman test; P ...). The numbers needed to treat (NNT) were 4 at the 1- and 6-month follow-ups, and 3.3 at the 2-month follow-up, in favor of granisetron. Repeated intramuscular tender-point injections with the serotonin type 3 antagonist granisetron thus provide a new pharmacological treatment possibility for myofascial pain patients, with a clinically relevant pain-reducing effect in the temporomandibular region in both the short and the long term. European Clinical Trials Database 2005-006042-41 as well as at Clinical Trials NCT02230371 .

  11. Shape measurement system for single point incremental forming (SPIF) manufacts by using trinocular vision and random pattern

    International Nuclear Information System (INIS)

    Setti, Francesco; Bini, Ruggero; Lunardelli, Massimo; Bosetti, Paolo; Bruschi, Stefania; De Cecco, Mariolino

    2012-01-01

    Many contemporary works show the interest of the scientific community in measuring the shape of artefacts made by single point incremental forming. In this paper, we present an algorithm able to detect feature points with a random pattern, check the compatibility of associations by exploiting multi-stereo constraints, reject outliers, and perform a 3D reconstruction from dense random patterns. The algorithm is suitable for real-time application; in fact, it needs just three images and synchronous, relatively fast processing. The proposed method has been tested on a simple geometry and the results have been compared with a coordinate measuring machine acquisition. (paper)

  12. The reliability of randomly selected final year pharmacy students in ...

    African Journals Online (AJOL)

    Employing ANOVA, factorial experimental analysis, and the theory of error, reliability studies were conducted on the assessment of the drug product chloroquine phosphate tablets. The G–Study employed equal numbers of the factors for uniform control, and involved three analysts (randomly selected final year Pharmacy ...

  13. Markov Random Field Restoration of Point Correspondences for Active Shape Modelling

    DEFF Research Database (Denmark)

    Hilger, Klaus Baggesen; Paulsen, Rasmus Reinhold; Larsen, Rasmus

    2004-01-01

    In this paper it is described how to build a statistical shape model using a training set with a sparse set of landmarks. A well-defined model mesh is selected and fitted to all shapes in the training set using thin plate spline warping. This is followed by a projection of the points of the warped

  14. Customer Order Decoupling Point Selection Model in Mass Customization Based on MAS

    Institute of Scientific and Technical Information of China (English)

    XU Xuanguo; LI Xiangyang

    2006-01-01

    Mass customization relates to the ability to provide individually designed products or services to customers with high process flexibility or integration. The literature on mass customization has focused on the mechanisms of MC, but little on customer order decoupling point selection. The aim of this paper is to present a model for customer order decoupling point selection based on domain knowledge interactions between enterprises and customers in mass customization. Based on the analysis of other researchers' achievements, and combining the demand problems of customer and enterprise, a group-decision model for customer order decoupling point selection is constructed based on quality function deployment and a multi-agent system. Treating the decision makers of independent functional departments as independent decision agents, a decision agent set is added as the third dimension to the house of quality, forming a cubic quality function deployment. The decision-making consists of two procedures: the first is to build a plane house of quality in each functional department to express its opinions; the other is to evaluate and aggregate the foregoing sub-decisions through a new plane quality function deployment. Thus, department-level decision-making can make good use of its domain knowledge via ontology, and the overall decision-making remains simple by avoiding too many customer requirements.

  15. Selective oxidation of dual phase steel after annealing at different dew points

    Science.gov (United States)

    Lins, Vanessa de Freitas Cunha; Madeira, Laureanny; Vilela, Jose Mario Carneiro; Andrade, Margareth Spangler; Buono, Vicente Tadeu Lopes; Guimarães, Juliana Porto; Alvarenga, Evandro de Azevedo

    2011-04-01

    Hot galvanized steels have been extensively used in the automotive industry. Selective oxidation on the steel surface affects the wettability of zinc on steel and the grain orientation of the inhibition layer (Fe-Al-Zn alloy), and reduces iron diffusion into the zinc layer. The aim of this work is to identify and quantify selective oxidation on the surface of a dual phase steel, and of an experimental steel with a lower manganese content, annealed at different dew points. The techniques employed were atomic force microscopy, X-ray photoelectron spectroscopy, and glow discharge optical emission spectroscopy. External selective oxidation was observed for phosphorus on the steel surface annealed at a dew point of 0 °C, and for manganese, silicon, and aluminum at a lower dew point. The concentration of manganese was higher on the dual phase steel surface than on the surface of the experimental steel. The concentration of molybdenum on the surface of both steels increased as the depth increased.

  16. Random selection of items. Selection of n1 samples among N items composing a stratum

    International Nuclear Information System (INIS)

    Jaech, J.L.; Lemaire, R.J.

    1987-02-01

    STR-224 provides generalized procedures to determine required sample sizes, for instance in the course of a Physical Inventory Verification at Bulk Handling Facilities. The present report describes procedures to generate random numbers and select groups of items to be verified in a given stratum through each of the measurement methods involved in the verification. (author). 3 refs
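
    The core operation, drawing n1 distinct items from the N items of a stratum, is a sampling-without-replacement step; a minimal sketch (the report itself prescribes specific random-number procedures) could be:

```python
import random

def select_items(n_total, n_sample, seed=None):
    """Randomly select n_sample of n_total items in a stratum,
    returning sorted item numbers (illustrative helper; the report's
    tabulated procedures are authoritative)."""
    rng = random.Random(seed)
    return sorted(rng.sample(range(1, n_total + 1), n_sample))

print(select_items(n_total=200, n_sample=12, seed=7))
```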

  17. Multiple ECG Fiducial Points-Based Random Binary Sequence Generation for Securing Wireless Body Area Networks.

    Science.gov (United States)

    Zheng, Guanglou; Fang, Gengfa; Shankaran, Rajan; Orgun, Mehmet A; Zhou, Jie; Qiao, Li; Saleem, Kashif

    2017-05-01

    Generating random binary sequences (BSes) is a fundamental requirement in cryptography. A BS is a sequence of N bits, and each bit has a value of 0 or 1. For securing sensors within wireless body area networks (WBANs), electrocardiogram (ECG)-based BS generation methods have been widely investigated in which interpulse intervals (IPIs) from each heartbeat cycle are processed to produce BSes. Using these IPI-based methods to generate a 128-bit BS in real time normally takes around half a minute. In order to improve the time efficiency of such methods, this paper presents an ECG multiple fiducial-points based binary sequence generation (MFBSG) algorithm. The technique of discrete wavelet transforms is employed to detect arrival time of these fiducial points, such as P, Q, R, S, and T peaks. Time intervals between them, including RR, RQ, RS, RP, and RT intervals, are then calculated based on this arrival time, and are used as ECG features to generate random BSes with low latency. According to our analysis on real ECG data, these ECG feature values exhibit the property of randomness and, thus, can be utilized to generate random BSes. Compared with the schemes that solely rely on IPIs to generate BSes, this MFBSG algorithm uses five feature values from one heart beat cycle, and can be up to five times faster than the solely IPI-based methods. So, it achieves a design goal of low latency. According to our analysis, the complexity of the algorithm is comparable to that of fast Fourier transforms. These randomly generated ECG BSes can be used as security keys for encryption or authentication in a WBAN system.
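
    A crude stand-in for the interval-to-bits step is sketched below: take the intervals between detected fiducial points and keep a few low-order bits of each, since those bits carry most of the physiological randomness. The peak times are invented, and a real system would first run the wavelet-based detector described above to obtain the P/Q/R/S/T arrival times and the RR, RQ, RS, RP and RT intervals.

```python
import numpy as np

def intervals_to_bits(event_times_ms, bits_per_interval=4):
    """Turn intervals between detected fiducial points into a binary
    sequence by keeping the low-order bits of each interval in ms
    (a crude stand-in for the paper's MFBSG scheme)."""
    iv = np.diff(np.asarray(event_times_ms)).astype(int)
    out = []
    for v in iv:
        out.extend((v >> k) & 1 for k in range(bits_per_interval))
    return out

# Hypothetical R-peak arrival times in milliseconds.
r_peaks = [0, 812, 1633, 2441, 3260, 4077]
print(intervals_to_bits(r_peaks))
```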

  18. Application of random-point processes to the detection of radiation sources

    International Nuclear Information System (INIS)

    Woods, J.W.

    1978-01-01

    In this report the mathematical theory of random-point processes is reviewed and it is shown how use of the theory can obtain optimal solutions to the problem of detecting radiation sources. As noted, the theory also applies to image processing in low-light-level or low-count-rate situations. Paralleling Snyder's work, the theory is extended to the multichannel case of a continuous, two-dimensional (2-D), energy-time space. This extension essentially involves showing that the data are doubly stochastic Poisson (DSP) point processes in energy as well as time. Further, a new 2-D recursive formulation is presented for the radiation-detection problem, with large computational savings over nonrecursive techniques when the number of channels is large (greater than or equal to 30). Finally, some adaptive strategies for on-line "learning" of unknown, time-varying signal and background-intensity parameters and statistics are presented and discussed. These adaptive procedures apply when a complete statistical description is not available a priori.

  19. Learning a constrained conditional random field for enhanced segmentation of fallen trees in ALS point clouds

    Science.gov (United States)

    Polewski, Przemyslaw; Yao, Wei; Heurich, Marco; Krzystek, Peter; Stilla, Uwe

    2018-06-01

    In this study, we present a method for improving the quality of automatic single fallen tree stem segmentation in ALS data by applying a specialized constrained conditional random field (CRF). The entire processing pipeline is composed of two steps. First, short stem segments of equal length are detected and a subset of them is selected for further processing, while in the second step the chosen segments are merged to form entire trees. The first step is accomplished using the specialized CRF defined on the space of segment labelings, capable of finding segment candidates which are easier to merge subsequently. To achieve this, the CRF considers not only the features of every candidate individually, but incorporates pairwise spatial interactions between adjacent segments into the model. In particular, pairwise interactions include a collinearity/angular deviation probability which is learned from training data as well as the ratio of spatial overlap, whereas unary potentials encode a learned probabilistic model of the laser point distribution around each segment. Each of these components enters the CRF energy with its own balance factor. To process previously unseen data, we first calculate the subset of segments for merging on a grid of balance factors by minimizing the CRF energy. Then, we perform the merging and rank the balance configurations according to the quality of their resulting merged trees, obtained from a learned tree appearance model. The final result is derived from the top-ranked configuration. We tested our approach on 5 plots from the Bavarian Forest National Park using reference data acquired in a field inventory. Compared to our previous segment selection method without pairwise interactions, an increase in detection correctness and completeness of up to 7 and 9 percentage points, respectively, was observed.
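
    The energy being minimized can be pictured with a minimal stand-in: per-segment unary costs plus a balance-weighted pairwise term on adjacent segment pairs, minimized here by brute force over a handful of binary labels. The numbers are invented, and the paper's learned potentials, collinearity model and balance-factor grid are not reproduced.

```python
import numpy as np
from itertools import product

def crf_energy(labels, unary, pairs, pairwise, beta):
    """Energy of a binary labeling: per-segment unary costs plus a
    beta-weighted pairwise cost on adjacent segment pairs that are
    both kept (label 1)."""
    e = sum(unary[i][l] for i, l in enumerate(labels))
    e += beta * sum(pairwise[k] for k, (i, j) in enumerate(pairs)
                    if labels[i] == 1 and labels[j] == 1)
    return e

unary = np.array([[0.2, 0.9], [0.8, 0.1], [0.7, 0.2], [0.3, 0.8]])
pairs = [(0, 1), (1, 2), (2, 3)]     # adjacent segment candidates
pairwise = [-0.5, -0.6, 0.4]         # negative = collinear pairs attract
beta = 1.0                           # one point on the balance-factor grid

best = min(product((0, 1), repeat=4),
           key=lambda lab: crf_energy(lab, unary, pairs, pairwise, beta))
print("selected segments:", best)    # 1 = keep the segment for merging
```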

  20. Hybrid collaborative optimization based on selection strategy of initial point and adaptive relaxation

    Energy Technology Data Exchange (ETDEWEB)

    Ji, Aimin; Yin, Xu; Yuan, Minghai [Hohai University, Changzhou (China)

    2015-09-15

    There are two problems in Collaborative optimization (CO): (1) local optima arising from the selection of an inappropriate initial point; (2) low efficiency and accuracy rooted in inappropriate relaxation factors. To solve these problems, we first develop the Latin hypercube design (LHD) to determine an initial point of optimization, and then use non-linear programming by quadratic Lagrangian (NLPQL) to search for the global solution. The effectiveness of the initial point selection strategy is verified on three benchmark functions of various dimensions and complexities. Then we propose the Adaptive relaxation collaborative optimization (ARCO) algorithm to solve the inconsistency between the system level and the discipline level; in this method, the relaxation factors are determined according to the three separate stages of CO respectively. The performance of the ARCO algorithm is compared with the standard collaborative algorithm and the constant relaxation collaborative algorithm on a typical numerical example, which indicates that the ARCO algorithm is more efficient and accurate. Finally, we propose a Hybrid collaborative optimization (HCO) approach, which integrates the selection strategy of the initial point with the ARCO algorithm. The results show that HCO can achieve the global optimal solution without a user-supplied initial value, and that it also has advantages in convergence, accuracy and robustness. Therefore, the proposed HCO approach can solve CO problems with applications in the spindle and the speed reducer.

  1. Hybrid collaborative optimization based on selection strategy of initial point and adaptive relaxation

    International Nuclear Information System (INIS)

    Ji, Aimin; Yin, Xu; Yuan, Minghai

    2015-01-01

    There are two problems in Collaborative optimization (CO): (1) local optima arising from the selection of an inappropriate initial point; (2) low efficiency and accuracy rooted in inappropriate relaxation factors. To solve these problems, we first develop the Latin hypercube design (LHD) to determine an initial point of optimization, and then use non-linear programming by quadratic Lagrangian (NLPQL) to search for the global solution. The effectiveness of the initial point selection strategy is verified on three benchmark functions of various dimensions and complexities. Then we propose the Adaptive relaxation collaborative optimization (ARCO) algorithm to solve the inconsistency between the system level and the discipline level; in this method, the relaxation factors are determined according to the three separate stages of CO respectively. The performance of the ARCO algorithm is compared with the standard collaborative algorithm and the constant relaxation collaborative algorithm on a typical numerical example, which indicates that the ARCO algorithm is more efficient and accurate. Finally, we propose a Hybrid collaborative optimization (HCO) approach, which integrates the selection strategy of the initial point with the ARCO algorithm. The results show that HCO can achieve the global optimal solution without a user-supplied initial value, and that it also has advantages in convergence, accuracy and robustness. Therefore, the proposed HCO approach can solve CO problems with applications in the spindle and the speed reducer.
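
    The initial-point strategy can be sketched with SciPy's quasi-Monte Carlo module (assumed available, SciPy 1.7 or later): draw a Latin hypercube design, take its best point as the starting guess, and hand it to a sequential-quadratic-programming local optimizer. SLSQP is used below as a stand-in for NLPQL, and the test function is invented.

```python
import numpy as np
from scipy.stats import qmc
from scipy.optimize import minimize

def f(x):                      # a multimodal test function (illustrative)
    return np.sum(x**2) + 10 * np.sum(np.cos(2 * np.pi * x))

lo, hi = -5.0, 5.0
sampler = qmc.LatinHypercube(d=2, seed=0)
cands = qmc.scale(sampler.random(n=50), lo, hi)   # space-filling candidates
x0 = cands[np.argmin([f(c) for c in cands])]      # best LHD point = initial guess

# SLSQP stands in for NLPQL (both are sequential quadratic programming).
res = minimize(f, x0, method="SLSQP", bounds=[(lo, hi)] * 2)
print("start:", x0, "-> optimum:", res.x, "value:", res.fun)
```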

  2. ON THE ESTIMATION OF DISTANCE DISTRIBUTION FUNCTIONS FOR POINT PROCESSES AND RANDOM SETS

    Directory of Open Access Journals (Sweden)

    Dietrich Stoyan

    2011-05-01

    Full Text Available This paper discusses various estimators for the nearest neighbour distance distribution function D of a stationary point process and for the quadratic contact distribution function Hq of a stationary random closed set. It recommends the use of Hanisch's estimator of D, which is of Horvitz-Thompson type, and the minus-sampling estimator of Hq. This recommendation is based on simulations for Poisson processes and Boolean models.
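
    A minimal version of a minus-sampling (border-corrected) estimator for the nearest-neighbour distance distribution D(r) is sketched below for points in the unit square: for each r, only points at least r away from the window boundary contribute, which removes the edge-effect bias. The Poisson reference curve is included for comparison.

```python
import numpy as np

def nn_dist_cdf_minus(points, r, lo=0.0, hi=1.0):
    """Minus-sampling estimator of the nearest-neighbour distance
    distribution D(r) for points in [lo, hi]^2: only points at least
    r away from the window boundary contribute."""
    pts = np.asarray(points)
    d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    nn = d.min(axis=1)                         # nearest-neighbour distances
    interior = np.all((pts >= lo + r) & (pts <= hi - r), axis=1)
    return np.mean(nn[interior] <= r) if interior.any() else np.nan

rng = np.random.default_rng(0)
pts = rng.uniform(size=(400, 2))               # Poisson-like pattern
for r in (0.01, 0.03, 0.05):
    # For a Poisson process of intensity lam, D(r) = 1 - exp(-lam*pi*r^2).
    print(r, nn_dist_cdf_minus(pts, r), 1 - np.exp(-400 * np.pi * r**2))
```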

  3. The mathematics of random mutation and natural selection for multiple simultaneous selection pressures and the evolution of antimicrobial drug resistance.

    Science.gov (United States)

    Kleinman, Alan

    2016-12-20

    The random mutation and natural selection phenomenon acts in a mathematically predictable way, which, when understood, leads to approaches that reduce and prevent the failure of these selection pressures when treating infections and cancers. The underlying principle for impairing the random mutation and natural selection phenomenon is to use combination therapy, which forces the population to evolve against multiple selection pressures simultaneously and thereby invokes the multiplication rule of probabilities. Recently, it has been seen that combination therapy for the treatment of malaria has failed to prevent the emergence of drug-resistant variants. Using this empirical example and the principles of probability theory, the equations describing this treatment failure are derived. These equations give guidance on how to use combination therapy for the treatment of cancers and infectious diseases and how to prevent the emergence of drug resistance. Copyright © 2016 John Wiley & Sons, Ltd.
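
    The multiplication rule at the heart of the argument is simple to state: with independent per-replication resistance probabilities p1 and p2 for two drugs, a doubly resistant variant arises with probability p1*p2 per replication, so the population size needed for resistance to emerge grows enormously. A sketch with invented numbers:

```python
from math import expm1, log1p

def p_multi_resistant(n_replications, per_drug_probs):
    """Probability that at least one of n replications produces a variant
    simultaneously resistant to all drugs, assuming independent per-drug
    resistance mutations (multiplication rule); numbers illustrative."""
    p_joint = 1.0
    for p in per_drug_probs:
        p_joint *= p                 # multiplication rule of probabilities
    # 1 - (1 - p_joint)^n, computed stably for tiny p_joint.
    return -expm1(n_replications * log1p(-p_joint))

n = 1e9                                    # replications in the infection
print(p_multi_resistant(n, [1e-9]))        # single drug: about 0.63
print(p_multi_resistant(n, [1e-9, 1e-9]))  # two drugs: about 1e-9
```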

  4. A Bayesian random effects discrete-choice model for resource selection: Population-level selection inference

    Science.gov (United States)

    Thomas, D.L.; Johnson, D.; Griffith, B.

    2006-01-01

    Modeling the probability of use of land units characterized by discrete and continuous measures, we present a Bayesian random-effects model to assess resource selection. This model provides simultaneous estimation of both individual- and population-level selection. Deviance information criterion (DIC), a Bayesian alternative to AIC that is sample-size specific, is used for model selection. Aerial radiolocation data from 76 adult female caribou (Rangifer tarandus) and calf pairs during 1 year on an Arctic coastal plain calving ground were used to illustrate models and assess population-level selection of landscape attributes, as well as individual heterogeneity of selection. Landscape attributes included elevation, NDVI (a measure of forage greenness), and land cover-type classification. Results from the first of a 2-stage model-selection procedure indicated that there is substantial heterogeneity among cow-calf pairs with respect to selection of the landscape attributes. In the second stage, selection of models with heterogeneity included indicated that at the population-level, NDVI and land cover class were significant attributes for selection of different landscapes by pairs on the calving ground. Population-level selection coefficients indicate that the pairs generally select landscapes with higher levels of NDVI, but the relationship is quadratic. The highest rate of selection occurs at values of NDVI less than the maximum observed. Results for land cover-class selections coefficients indicate that wet sedge, moist sedge, herbaceous tussock tundra, and shrub tussock tundra are selected at approximately the same rate, while alpine and sparsely vegetated landscapes are selected at a lower rate. Furthermore, the variability in selection by individual caribou for moist sedge and sparsely vegetated landscapes is large relative to the variability in selection of other land cover types. The example analysis illustrates that, while sometimes computationally intense, a

  5. Evolving artificial metalloenzymes via random mutagenesis

    Science.gov (United States)

    Yang, Hao; Swartz, Alan M.; Park, Hyun June; Srivastava, Poonam; Ellis-Guardiola, Ken; Upp, David M.; Lee, Gihoon; Belsare, Ketaki; Gu, Yifan; Zhang, Chen; Moellering, Raymond E.; Lewis, Jared C.

    2018-03-01

    Random mutagenesis has the potential to optimize the efficiency and selectivity of protein catalysts without requiring detailed knowledge of protein structure; however, introducing synthetic metal cofactors complicates the expression and screening of enzyme libraries, and activity arising from free cofactor must be eliminated. Here we report an efficient platform to create and screen libraries of artificial metalloenzymes (ArMs) via random mutagenesis, which we use to evolve highly selective dirhodium cyclopropanases. Error-prone PCR and combinatorial codon mutagenesis enabled multiplexed analysis of random mutations, including at sites distal to the putative ArM active site that are difficult to identify using targeted mutagenesis approaches. Variants that exhibited significantly improved selectivity for each of the cyclopropane product enantiomers were identified, and higher activity than previously reported ArM cyclopropanases obtained via targeted mutagenesis was also observed. This improved selectivity carried over to other dirhodium-catalysed transformations, including N-H, S-H and Si-H insertion, demonstrating that ArMs evolved for one reaction can serve as starting points to evolve catalysts for others.

  6. Word Length Selection Method for Controller Implementation on FPGAs Using the VHDL-2008 Fixed-Point and Floating-Point Packages

    Directory of Open Access Journals (Sweden)

    Urriza I

    2010-01-01

    Full Text Available Abstract This paper presents a word length selection method for the implementation of digital controllers in both fixed-point and floating-point hardware on FPGAs. This method uses the new types defined in the VHDL-2008 fixed-point and floating-point packages. These packages allow customizing the word length of fixed and floating point representations and shorten the design cycle simplifying the design of arithmetic operations. The method performs bit-true simulations in order to determine the word length to represent the constant coefficients and the internal signals of the digital controller while maintaining the control system specifications. A mixed-signal simulation tool is used to simulate the closed loop system as a whole in order to analyze the impact of the quantization effects and loop delays on the control system performance. The method is applied to implement a digital controller for a switching power converter. The digital circuit is implemented on an FPGA, and the simulations are experimentally verified.
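
    As a rough software analogue of the bit-true search described here (the paper works with the VHDL-2008 fixed- and floating-point types; the Python below is only a stand-in), one can grow the fractional word length of quantized filter coefficients until the output stays within a tolerance of the double-precision reference:

```python
import numpy as np

def quantize(x, frac_bits):
    """Round to a fixed-point grid with frac_bits fractional bits."""
    return np.round(x * 2.0**frac_bits) / 2.0**frac_bits

def min_frac_bits(coeffs, stimulus, tol):
    """Smallest fractional word length whose quantized-coefficient FIR
    output stays within tol of the double-precision reference (a crude
    software analogue of a bit-true word-length search)."""
    ref = np.convolve(stimulus, coeffs)
    for frac_bits in range(2, 32):
        out = np.convolve(stimulus, quantize(coeffs, frac_bits))
        if np.max(np.abs(out - ref)) < tol:
            return frac_bits
    return None

rng = np.random.default_rng(0)
coeffs = np.array([0.0625, 0.25, 0.375, 0.25, 0.0625])   # small low-pass FIR
stimulus = rng.uniform(-1, 1, size=1000)
print(min_frac_bits(coeffs, stimulus, tol=1e-3))
```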

  7. The signature of positive selection at randomly chosen loci.

    Science.gov (United States)

    Przeworski, Molly

    2002-03-01

    In Drosophila and humans, there are accumulating examples of loci with a significant excess of high-frequency-derived alleles or high levels of linkage disequilibrium, relative to a neutral model of a random-mating population of constant size. These are features expected after a recent selective sweep. Their prevalence suggests that positive directional selection may be widespread in both species. However, as I show here, these features do not persist long after the sweep ends: The high-frequency alleles drift to fixation and no longer contribute to polymorphism, while linkage disequilibrium is broken down by recombination. As a result, loci chosen without independent evidence of recent selection are not expected to exhibit either of these features, even if they have been affected by numerous sweeps in their genealogical history. How then can we explain the patterns in the data? One possibility is population structure, with unequal sampling from different subpopulations. Alternatively, positive selection may not operate as is commonly modeled. In particular, the rate of fixation of advantageous mutations may have increased in the recent past.

  8. Using ANFIS for selection of more relevant parameters to predict dew point temperature

    International Nuclear Information System (INIS)

    Mohammadi, Kasra; Shamshirband, Shahaboddin; Petković, Dalibor; Yee, Por Lip; Mansor, Zulkefli

    2016-01-01

    Highlights: • ANFIS is used to select the most relevant variables for dew point temperature prediction. • Two cities from the central and south central parts of Iran are selected as case studies. • Influence of 5 parameters on dew point temperature is evaluated. • Appropriate selection of input variables has a notable effect on prediction. • Considering the most relevant combination of 2 parameters would be more suitable. - Abstract: In this research work, for the first time, the adaptive neuro-fuzzy inference system (ANFIS) is employed to propose an approach for identifying the most significant parameters for the prediction of daily dew point temperature (T_dew). The ANFIS process for variable selection is implemented, which includes a number of ways to recognize the parameters offering favorable predictions. According to the physical factors influencing dew formation, 8 variables are considered to investigate their effects on T_dew: the daily minimum, maximum and average air temperatures (T_min, T_max and T_avg), relative humidity (R_h), atmospheric pressure (P), water vapor pressure (V_P), sunshine hours (n) and horizontal global solar radiation (H). The data used include 7 years of daily measurements from two Iranian cities located in the central and south central parts of the country. The results indicate that, despite the climate difference between the considered case studies, V_P is the most influential variable for both stations while R_h is the least relevant element. Furthermore, the combination of T_min and V_P is recognized as the most influential set for predicting T_dew. The conducted examinations show that there is a remarkable difference between the errors achieved for the most and least relevant input parameters, which highlights the importance of appropriate selection of input variables. The use of more than two inputs may not be advisable or appropriate; thus, considering the most relevant combination of 2 parameters would be more suitable.

  9. Differential privacy-based evaporative cooling feature selection and classification with relief-F and random forests.

    Science.gov (United States)

    Le, Trang T; Simmons, W Kyle; Misaki, Masaya; Bodurka, Jerzy; White, Bill C; Savitz, Jonathan; McKinney, Brett A

    2017-09-15

    Classification of individuals into disease or clinical categories from high-dimensional biological data with low prediction error is an important challenge of statistical learning in bioinformatics. Feature selection can improve classification accuracy but must be incorporated carefully into cross-validation to avoid overfitting. Recently, feature selection methods based on differential privacy, such as differentially private random forests and reusable holdout sets, have been proposed. However, for domains such as bioinformatics, where the number of features is much larger than the number of observations p≫n , these differential privacy methods are susceptible to overfitting. We introduce private Evaporative Cooling, a stochastic privacy-preserving machine learning algorithm that uses Relief-F for feature selection and random forest for privacy preserving classification that also prevents overfitting. We relate the privacy-preserving threshold mechanism to a thermodynamic Maxwell-Boltzmann distribution, where the temperature represents the privacy threshold. We use the thermal statistical physics concept of Evaporative Cooling of atomic gases to perform backward stepwise privacy-preserving feature selection. On simulated data with main effects and statistical interactions, we compare accuracies on holdout and validation sets for three privacy-preserving methods: the reusable holdout, reusable holdout with random forest, and private Evaporative Cooling, which uses Relief-F feature selection and random forest classification. In simulations where interactions exist between attributes, private Evaporative Cooling provides higher classification accuracy without overfitting based on an independent validation set. In simulations without interactions, thresholdout with random forest and private Evaporative Cooling give comparable accuracies. We also apply these privacy methods to human brain resting-state fMRI data from a study of major depressive disorder. Code

  10. Fixed-Rate Compressed Floating-Point Arrays.

    Science.gov (United States)

    Lindstrom, Peter

    2014-12-01

    Current compression schemes for floating-point data commonly take fixed-precision values and compress them to a variable-length bit stream, complicating memory management and random access. We present a fixed-rate, near-lossless compression scheme that maps small blocks of 4^d values in d dimensions to a fixed, user-specified number of bits per block, thereby allowing read and write random access to compressed floating-point data at block granularity. Our approach is inspired by fixed-rate texture compression methods widely adopted in graphics hardware, but has been tailored to the high dynamic range and precision demands of scientific applications. Our compressor is based on a new, lifted, orthogonal block transform and embedded coding, allowing each per-block bit stream to be truncated at any point if desired, thus facilitating bit rate selection using a single compression scheme. To avoid compression or decompression upon every data access, we employ a software write-back cache of uncompressed blocks. Our compressor has been designed with computational simplicity and speed in mind to allow for the possibility of a hardware implementation, and uses only a small number of fixed-point arithmetic operations per compressed value. We demonstrate the viability and benefits of lossy compression in several applications, including visualization, quantitative data analysis, and numerical simulation.
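
    The sketch below shows a much-simplified fixed-rate idea, block floating point: each small block stores one shared exponent plus fixed-width mantissas, so any block can be decoded independently at a known bit offset. The paper's orthogonal block transform and embedded coding are deliberately not reproduced; names and parameters are illustrative.

```python
import numpy as np

def encode_block(block, mant_bits=12):
    """Toy fixed-rate coder: one shared exponent per block plus a
    fixed-width signed mantissa per value (assumes the block is not
    all zeros; far simpler than the paper's transform coder)."""
    e = int(np.ceil(np.log2(np.max(np.abs(block)))))
    scale = 2.0 ** (mant_bits - 1 - e)
    q = np.clip(np.round(block * scale),
                -2**(mant_bits - 1), 2**(mant_bits - 1) - 1)
    return e, q.astype(np.int32)

def decode_block(e, q, mant_bits=12):
    return q / 2.0 ** (mant_bits - 1 - e)

rng = np.random.default_rng(0)
x = rng.normal(size=4) * 100          # one block of 4 values
e, q = encode_block(x)
print(x)
print(decode_block(e, q))             # near-lossless within the block's scale
```

    Because every block occupies the same number of bits, a reader can seek directly to block k without decoding its predecessors, which is the random-access property motivating the fixed-rate design.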

  11. Amorphous topological insulators constructed from random point sets

    Science.gov (United States)

    Mitchell, Noah P.; Nash, Lisa M.; Hexner, Daniel; Turner, Ari M.; Irvine, William T. M.

    2018-04-01

    The discovery that the band structure of electronic insulators may be topologically non-trivial has revealed distinct phases of electronic matter with novel properties [1,2]. Recently, mechanical lattices have been found to have similarly rich structure in their phononic excitations [3,4], giving rise to protected unidirectional edge modes [5-7]. In all of these cases, however, as well as in other topological metamaterials [3,8], the underlying structure was finely tuned, be it through periodicity, quasi-periodicity or isostaticity. Here we show that amorphous Chern insulators can be readily constructed from arbitrary underlying structures, including hyperuniform, jammed, quasi-crystalline and uniformly random point sets. While our findings apply to mechanical and electronic systems alike, we focus on networks of interacting gyroscopes as a model system. Local decorations control the topology of the vibrational spectrum, endowing amorphous structures with protected edge modes with a chirality of choice. Using a real-space generalization of the Chern number, we investigate the topology of our structures numerically, analytically and experimentally. The robustness of our approach enables the topological design and self-assembly of non-crystalline topological metamaterials on the micro and macro scale.

  12. Convergence estimates in probability and in expectation for discrete least squares with noisy evaluations at random points

    KAUST Repository

    Migliorati, Giovanni; Nobile, Fabio; Tempone, Raul

    2015-01-01

    We study the accuracy of the discrete least-squares approximation on a finite dimensional space of a real-valued target function from noisy pointwise evaluations at independent random points distributed according to a given sampling probability

  13. Interference-aware random beam selection schemes for spectrum sharing systems

    KAUST Repository

    Abdallah, Mohamed

    2012-10-19

    Spectrum sharing systems have been recently introduced to alleviate the problem of spectrum scarcity by allowing secondary unlicensed networks to share the spectrum with primary licensed networks under acceptable interference levels to the primary users. In this work, we develop interference-aware random beam selection schemes that provide enhanced performance for the secondary network under the condition that the interference observed by the receivers of the primary network is below a predetermined/acceptable value. We consider a secondary link composed of a transmitter equipped with multiple antennas and a single-antenna receiver sharing the same spectrum with a primary link composed of a single-antenna transmitter and a single-antenna receiver. The proposed schemes select a beam, among a set of power-optimized random beams, that maximizes the signal-to-interference-plus-noise ratio (SINR) of the secondary link while satisfying the primary interference constraint for different levels of feedback information describing the interference level at the primary receiver. For the proposed schemes, we develop a statistical analysis for the SINR statistics as well as the capacity and bit error rate (BER) of the secondary link.
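
    A Monte Carlo sketch of the selection rule is below. It is an illustration under assumed parameters (powers, noise, interference threshold), not the paper's analysis, and the interference from the primary transmitter to the secondary receiver is omitted for brevity.

        import numpy as np

        rng = np.random.default_rng(1)
        Nt, L = 4, 8                   # transmit antennas, candidate random beams
        h = (rng.normal(size=Nt) + 1j * rng.normal(size=Nt)) / np.sqrt(2)  # to secondary Rx
        g = (rng.normal(size=Nt) + 1j * rng.normal(size=Nt)) / np.sqrt(2)  # to primary Rx
        beams = rng.normal(size=(L, Nt)) + 1j * rng.normal(size=(L, Nt))
        beams /= np.linalg.norm(beams, axis=1, keepdims=True)  # unit-norm random beams

        P, noise, I_max = 1.0, 0.1, 0.5    # power, noise, interference limit (assumed)
        sinr = P * np.abs(beams @ h.conj()) ** 2 / noise
        interf = P * np.abs(beams @ g.conj()) ** 2
        ok = interf <= I_max               # beams satisfying the primary constraint
        best = int(np.argmax(np.where(ok, sinr, -np.inf)))  # index 0 if none feasible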

  14. Evaluating gaze-based interface tools to facilitate point-and-select tasks with small targets

    DEFF Research Database (Denmark)

    Skovsgaard, Henrik; Mateo, Julio C.; Hansen, John Paulin

    2011-01-01

    ...point-and-select tasks. We conducted two experiments comparing the performance of dwell, magnification and zoom methods in point-and-select tasks with small targets in single- and multiple-target layouts. Both magnification and zoom showed higher hit rates than dwell. Hit rates were higher when using magnification than...

  15. Genome-wide association data classification and SNPs selection using two-stage quality-based Random Forests.

    Science.gov (United States)

    Nguyen, Thanh-Tung; Huang, Joshua; Wu, Qingyao; Nguyen, Thuy; Li, Mark

    2015-01-01

    Single-nucleotide polymorphism (SNP) selection and identification are the most important tasks in genome-wide association data analysis. The problem is difficult because genome-wide association data is very high dimensional and a large portion of SNPs in the data is irrelevant to the disease. Advanced machine learning methods have been successfully used in genome-wide association studies (GWAS) for identification of genetic variants that have relatively big effects in some common, complex diseases. Among them, the most successful one is Random Forests (RF). Despite performing well in terms of prediction accuracy in some data sets of moderate size, RF still suffers when working in GWAS for selecting informative SNPs and building accurate prediction models. In this paper, we propose to use a new two-stage quality-based sampling method in random forests, named ts-RF, for SNP subspace selection for GWAS. The method first applies p-value assessment to find a cut-off point that separates informative and irrelevant SNPs in two groups. The informative SNPs group is further divided into two sub-groups: highly informative and weakly informative SNPs. When sampling the SNP subspace for building trees for the forest, only those SNPs from the two sub-groups are taken into account. The feature subspaces always contain highly informative SNPs when used to split a node of a tree. This approach enables one to generate more accurate trees with a lower prediction error, meanwhile possibly avoiding overfitting. It allows one to detect interactions of multiple SNPs with the diseases, and to reduce the dimensionality and the amount of genome-wide association data needed for learning the RF model. Extensive experiments on two genome-wide SNP data sets (Parkinson case-control data comprised of 408,803 SNPs and Alzheimer case-control data comprised of 380,157 SNPs) and 10 gene data sets have demonstrated that the proposed model significantly reduced prediction errors and outperformed
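
    The two-stage sampling scheme can be sketched as follows. This is a simplified, hypothetical rendering (toy genotypes, assumed p-value cut-offs of 0.05 and 0.01, chi-square tests), not the authors' ts-RF code.

        import numpy as np
        from scipy.stats import chi2_contingency
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(2)
        X = rng.integers(0, 3, size=(200, 1000))     # toy genotypes coded 0/1/2
        y = rng.integers(0, 2, size=200)

        def snp_pvalue(snp, y):
            table = np.array([[np.sum((snp == g) & (y == c)) for g in (0, 1, 2)]
                              for c in (0, 1)]) + 1  # +1 avoids empty cells
            return chi2_contingency(table)[1]

        p = np.array([snp_pvalue(X[:, j], y) for j in range(X.shape[1])])
        informative = np.where(p < 0.05)[0]          # stage 1: informative vs. irrelevant
        strong = informative[p[informative] < 0.01]  # stage 2: strong vs. weak
        weak = np.setdiff1d(informative, strong)

        trees = []
        for _ in range(100):   # every subspace mixes strong and weak informative SNPs
            k = max(1, int(np.sqrt(len(informative))))
            feats = np.concatenate(
                [rng.choice(strong, min(k, len(strong)), replace=False),
                 rng.choice(weak, min(k, len(weak)), replace=False)])
            boot = rng.integers(0, len(y), len(y))   # bootstrap sample
            trees.append((feats, DecisionTreeClassifier().fit(X[boot][:, feats], y[boot])))
        votes = np.mean([t.predict(X[:, f]) for f, t in trees], axis=0)  # forest vote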

  16. Discrete least squares polynomial approximation with random evaluations - application to PDEs with Random parameters

    KAUST Repository

    Nobile, Fabio

    2015-01-07

    We consider a general problem F(u, y) = 0 where u is the unknown solution, possibly Hilbert space valued, and y a set of uncertain parameters. We specifically address the situation in which the parameter-to-solution map u(y) is smooth, however y could be very high (or even infinite) dimensional. In particular, we are interested in cases in which F is a differential operator, u a Hilbert space valued function and y a distributed, space and/or time varying, random field. We aim at reconstructing the parameter-to-solution map u(y) from random noise-free or noisy observations in random points by discrete least squares on polynomial spaces. The noise-free case is relevant whenever the technique is used to construct metamodels, based on polynomial expansions, for the output of computer experiments. In the case of PDEs with random parameters, the metamodel is then used to approximate statistics of the output quantity. We discuss the stability of discrete least squares on random points and show convergence estimates both in expectation and probability. We also present possible strategies to select, either a priori or by adaptive algorithms, sequences of approximating polynomial spaces that allow one to reduce, and in some cases break, the curse of dimensionality.
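
    A small numerical sketch of the noisy case is given below, with an assumed target function, a uniform sampling measure on [-1, 1], and a Legendre basis. It illustrates the discrete least-squares projection itself, not the paper's convergence analysis.

        import numpy as np

        rng = np.random.default_rng(3)
        f = lambda t: np.exp(t) / (1 + 25 * t ** 2)   # assumed target map
        n, m = 200, 10                   # random evaluation points, polynomial degree
        pts = rng.uniform(-1, 1, n)      # independent points from the sampling measure
        obs = f(pts) + 0.01 * rng.normal(size=n)      # noisy pointwise evaluations

        V = np.polynomial.legendre.legvander(pts, m)  # design matrix, Legendre basis
        coef, *_ = np.linalg.lstsq(V, obs, rcond=None)  # discrete least squares
        grid = np.linspace(-1, 1, 201)
        err = np.max(np.abs(np.polynomial.legendre.legval(grid, coef) - f(grid)))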

  17. Topology of random points (Yogeshwaran, D.)

    Indian Academy of Sciences (India)

    Balls grow at unit rate centred at the points of the point cloud/process. ... Idea of persistence: keep track of births and deaths of topological features. ... holes, Betti numbers, etc., one will be more interested in the distribution of such objects on ...

  18. Topology-selective jamming of fully-connected, code-division random-access networks

    Science.gov (United States)

    Polydoros, Andreas; Cheng, Unjeng

    1990-01-01

    The purpose is to introduce certain models of topology-selective stochastic jamming and to examine their impact on a class of fully-connected, spread-spectrum, slotted ALOHA-type random access networks. The theory covers dedicated as well as half-duplex units. The dominant role of the spatial duty factor is established, and connections with the dual concept of time-selective jamming are discussed. The optimal choices of coding rate and link access parameters (from the users' side) and the jamming spatial fraction are numerically established for DS and FH spreading.

  19. Reliability of impingement sampling designs: An example from the Indian Point station

    International Nuclear Information System (INIS)

    Mattson, M.T.; Waxman, J.B.; Watson, D.A.

    1988-01-01

    A 4-year data base (1976-1979) of daily fish impingement counts at the Indian Point electric power station on the Hudson River was used to compare the precision and reliability of three random-sampling designs: (1) simple random, (2) seasonally stratified, and (3) empirically stratified. The precision of daily impingement estimates improved logarithmically for each design as more days in the year were sampled. Simple random sampling was the least precise design and empirically stratified sampling the most precise, and the difference in precision between the two stratified designs was small. Computer-simulated sampling was used to estimate the reliability of the two stratified-random-sampling designs. A seasonally stratified sampling design was selected as the most appropriate reduced-sampling program for the Indian Point station because: (1) reasonably precise and reliable impingement estimates were obtained using this design for all species combined and for eight common Hudson River fish by sampling only 30% of the days in a year (110 d); and (2) seasonal strata may be more precise and reliable than empirical strata if future changes in annual impingement patterns occur. The seasonally stratified design applied to the 1976-1983 Indian Point impingement data showed that selection of sampling dates based on daily species-specific impingement variability gave results that were more precise, but not more consistently reliable, than sampling allocations based on the variability of all fish species combined. 14 refs., 1 fig., 6 tabs
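
    The precision gain from stratification is easy to reproduce in a toy simulation. The seasonal pattern, stratum boundaries, and sample size of 110 days below are illustrative assumptions, not the station's data.

        import numpy as np

        rng = np.random.default_rng(4)
        days = np.arange(365)
        season = days * 4 // 365                   # four crude seasonal strata
        daily = rng.poisson(lam=10 * (1 + (season == 1) * 4))  # one peak season

        def annual_estimate(sample_days):
            return daily[sample_days].mean() * 365

        srs = [annual_estimate(rng.choice(365, 110, replace=False))
               for _ in range(2000)]               # simple random sampling
        strat = []
        for _ in range(2000):                      # proportional allocation per stratum
            idx = np.concatenate([rng.choice(np.where(season == s)[0],
                                             int(110 * np.mean(season == s)),
                                             replace=False) for s in range(4)])
            strat.append(annual_estimate(idx))
        print(np.std(srs), np.std(strat))          # stratified estimate varies less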

  20. Random genetic drift, natural selection, and noise in human cranial evolution.

    Science.gov (United States)

    Roseman, Charles C

    2016-08-01

    This study assesses the extent to which relationships among groups complicate comparative studies of adaptation in recent human cranial variation and the extent to which departures from neutral additive models of evolution hinder the reconstruction of population relationships among groups using cranial morphology. Using a maximum likelihood evolutionary model fitting approach and a mixed population genomic and cranial data set, I evaluate the relative fits of several widely used models of human cranial evolution. Moreover, I compare the goodness of fit of models of cranial evolution constrained by genomic variation to test hypotheses about population-specific departures from neutrality. Models from population genomics are much better fits to cranial variation than are traditional models from comparative human biology. There is not enough evolutionary information in the cranium to reconstruct much of recent human evolution, but the influence of population history on cranial variation is strong enough to cause comparative studies of adaptation serious difficulties. Deviations from a model of random genetic drift along a tree-like population history show the importance of environmental effects, gene flow, and/or natural selection on human cranial variation. Moreover, there is a strong signal of the effect of natural selection or an environmental factor on a group of humans from Siberia. The evolution of the human cranium is complex and no one evolutionary process has prevailed at the expense of all others. A holistic unification of phenome, genome, and environmental context gives us a strong point of purchase on these problems, which is unavailable to any one traditional approach alone. Am J Phys Anthropol 160:582-592, 2016. © 2016 Wiley Periodicals, Inc.

  1. Peculiarities of the statistics of spectrally selected fluorescence radiation in laser-pumped dye-doped random media

    Science.gov (United States)

    Yuvchenko, S. A.; Ushakova, E. V.; Pavlova, M. V.; Alonova, M. V.; Zimnyakov, D. A.

    2018-04-01

    We consider the practical realization of a new optical probe method for random media, defined as reference-free path-length interferometry with intensity-moments analysis. A peculiarity in the statistics of the spectrally selected fluorescence radiation in a laser-pumped dye-doped random medium is discussed. Previously established correlations between the second- and third-order moments of the intensity fluctuations in the random interference patterns, the coherence function of the probe radiation, and the path-difference probability density for the interfering partial waves in the medium are confirmed. The correlations were verified using statistical analysis of the spectrally selected fluorescence radiation emitted by a laser-pumped dye-doped random medium. An aqueous solution of Rhodamine 6G was applied as the doping fluorescent agent for ensembles of densely packed silica grains, which were pumped by the 532 nm radiation of a solid-state laser. The spectrum of the mean path length for a random medium was reconstructed.

  2. The Research and Application of SURF Algorithm Based on Feature Point Selection Algorithm

    Directory of Open Access Journals (Sweden)

    Zhang Fang Hu

    2014-04-01

    As the pixel information of a depth image is derived from distance information, there can be some mismatched pairs in the palm area when implementing the SURF algorithm with a Kinect sensor for static sign language recognition. This paper proposes a feature point selection algorithm that filters the SURF feature points step by step, based on the number of feature points within an adaptive radius r and the distance between two points. It not only greatly improves the recognition rate, but also ensures robustness to environmental factors such as skin color, illumination intensity, complex background, and angle and scale changes. The experiment results show that the improved SURF algorithm can effectively improve the recognition rate and has good robustness.

  3. Distributed Fair Access Point Selection for Multi-Rate IEEE 802.11 WLANs

    Science.gov (United States)

    Gong, Huazhi; Nahm, Kitae; Kim, Jongwon

    In IEEE 802.11 networks, access point (AP) selection based on the strongest signal strength often results in extremely unfair bandwidth allocation among mobile users (MUs). In this paper, we propose a distributed AP selection algorithm to achieve a fair bandwidth allocation for MUs. The proposed algorithm gradually balances the AP loads based on max-min fairness for the available multiple bit rate choices in a distributed manner. We analyze the stability and overhead of the proposed algorithm, and show the improvement of the fairness via computer simulation.

  4. [Sacroiliac joint injury treated with oblique insertion at anatomical points: a randomized controlled trial].

    Science.gov (United States)

    Kuang, Jiayi; Li, Yuxuan; He, Yufeng; Gan, Lin; Wang, Aiming; Chen, Yanhua; Li, Xiaoting; Guo, Lin; Tang, Rongjun

    2016-04-01

    To compare the effects of oblique insertion at anatomical points and conventional acupuncture for sacroiliac joint injury. Eighty patients were randomly divided into an observation group and a control group, 40 cases in each one. In the observation group, oblique insertion therapy at anatomical points was used, and the 9 points of equal division (anatomical points) marked by palpating the anatomical landmarks were treated as the insertion acupoints. In the control group, conventional acupuncture was applied, and perpendicular insertion was adopted at Huantiao (GB 30), Zhibian (BL 54) and Weizhong (BL 40), etc. In the two groups, the treatment was given once a day and 5 times per week. Ten treatments made up one course and two courses were required. The clinical effects and the changes of visual analogue scale (VAS) and Oswestry disability index (ODI) scores before and after treatment were observed in the two groups. The total effective rate of the observation group was 90.0% (36/40), which was better than 72.5% (29/40) of the control group (P < 0.05). The effect of oblique insertion at anatomical points for sacroiliac joint injury is superior to that of conventional acupuncture; it can effectively relieve pain and improve the dysfunction.

  5. Thermoresponsive Poly(2-Oxazoline) Molecular Brushes by Living Ionic Polymerization: Modulation of the Cloud Point by Random and Block Copolymer Pendant Chains

    KAUST Repository

    Zhang, Ning; Luxenhofer, Robert; Jordan, Rainer

    2012-01-01

    random and block copolymers. Their aqueous solutions displayed a distinct thermoresponsive behavior as a function of the side-chain composition and sequence. The cloud point (CP) of MBs with random copolymer side chains is a linear function

  6. The Effects of Point or Polygon Based Training Data on RandomForest Classification Accuracy of Wetlands

    Directory of Open Access Journals (Sweden)

    Jennifer Corcoran

    2015-04-01

    Wetlands are dynamic in space and time, providing varying ecosystem services. Field reference data for both training and assessment of wetland inventories in the State of Minnesota are typically collected as GPS points over wide geographical areas and at infrequent intervals. This status quo makes it difficult to keep updated maps of wetlands with adequate accuracy, efficiency, and consistency to monitor change. Furthermore, point reference data may not be representative of the prevailing land cover type for an area, due to point location or heterogeneity within the ecosystem of interest. In this research, we present techniques for training a land cover classification for two study sites in different ecoregions by implementing the RandomForest classifier in three ways: (1) field- and photo-interpreted points; (2) a fixed window surrounding the points; and (3) image objects that intersect the points. Additional assessments are made to identify the key input variables. We conclude that the image object area training method is the most accurate, and the most important variables include: compound topographic index, summer season green and blue bands, and grid statistics from LiDAR point cloud data, especially those that relate to the height of the return.

  7. Exact two-point resistance, and the simple random walk on the complete graph minus N edges

    International Nuclear Information System (INIS)

    Chair, Noureddine

    2012-01-01

    An analytical approach is developed to obtain the exact expressions for the two-point resistance and the total effective resistance of the complete graph minus N edges of the opposite vertices. These expressions are written in terms of certain numbers that we introduce, which we call the Bejaia and the Pisa numbers; these numbers are the natural generalizations of the bisected Fibonacci and Lucas numbers. The correspondence between random walks and resistor networks is then used to obtain the exact expressions for the first passage and mean first passage times on this graph. Highlights: We obtain exact formulas for the two-point resistance of the complete graph minus N edges. We also obtain the total effective resistance of this graph. We modified Schwatt's formula on trigonometrical power sums to suit our computations. We introduced the generalized bisected Fibonacci and Lucas numbers: the Bejaia and the Pisa numbers. The first passage and mean first passage times of the random walks have exact expressions.

  8. A randomized controlled trial of single point acupuncture in primary dysmenorrhea.

    Science.gov (United States)

    Liu, Cun-Zhi; Xie, Jie-Ping; Wang, Lin-Peng; Liu, Yu-Qi; Song, Jia-Shan; Chen, Yin-Ying; Shi, Guang-Xia; Zhou, Wei; Gao, Shu-Zhong; Li, Shi-Liang; Xing, Jian-Min; Ma, Liang-Xiao; Wang, Yan-Xia; Zhu, Jiang; Liu, Jian-Ping

    2014-06-01

    Acupuncture is often used for primary dysmenorrhea. But there is no convincing evidence due to low methodological quality. We aim to assess immediate effect of acupuncture at specific acupoint compared with unrelated acupoint and nonacupoint on primary dysmenorrhea. The Acupuncture Analgesia Effect in Primary Dysmenorrhoea-II is a multicenter controlled trial conducted in six large hospitals of China. Patients who met inclusion criteria were randomly assigned to classic acupoint (N = 167), unrelated acupoint (N = 167), or non-acupoint (N = 167) group on a 1:1:1 basis. They received three sessions with electro-acupuncture at a classic acupoint (Sanyinjiao, SP6), or an unrelated acupoint (Xuanzhong, GB39), or nonacupoint location, respectively. The primary outcome was subjective pain as measured by a 100-mm visual analog scale (VAS). Measurements were obtained at 0, 5, 10, 30, and 60 minutes following the first intervention. In addition, patients scored changes of general complaints using Cox retrospective symptom scales (RSS-Cox) and 7-point verbal rating scale (VRS) during three menstrual cycles. Secondary outcomes included VAS score for average pain, pain total time, additional in-bed time, and proportion of participants using analgesics during three menstrual cycles. Five hundred and one people underwent random assignment. The primary comparison of VAS scores following the first intervention demonstrated that classic acupoint group was more effective both than unrelated acupoint (-4.0 mm, 95% CI -7.1 to -0.9, P = 0.010) and nonacupoint (-4.0 mm, 95% CI -7.0 to -0.9, P = 0.012) groups. However, no significant differences were detected among the three acupuncture groups for RSS-Cox or VRS outcomes. The per-protocol analysis showed similar pattern. No serious adverse events were noted. Specific acupoint acupuncture produced a statistically, but not clinically, significant effect compared with unrelated acupoint and nonacupoint acupuncture in

  9. Using histograms to introduce randomization in the generation of ensembles of decision trees

    Science.gov (United States)

    Kamath, Chandrika; Cantu-Paz, Erick; Littau, David

    2005-02-22

    A system for decision tree ensembles that includes a module to read the data, a module to create a histogram, a module to evaluate a potential split according to some criterion using the histogram, a module to select a split point randomly in an interval around the best split, a module to split the data, and a module to combine multiple decision trees in ensembles. The decision tree method includes the steps of reading the data; creating a histogram; evaluating a potential split according to some criterion using the histogram; selecting a split point randomly in an interval around the best split; splitting the data; and combining multiple decision trees in ensembles.
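
    A sketch of the central trick follows: per-bin class counts replace sorting when scoring splits, and the final split point is drawn at random from the interval around the best bin boundary. The bin count, toy data, and the Gini criterion are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(5)
        x = rng.normal(size=1000)
        y = (x + 0.3 * rng.normal(size=1000) > 0).astype(int)

        _, edges = np.histogram(x, bins=32)            # histogram replaces sorting
        h1, _ = np.histogram(x[y == 1], bins=edges)    # per-bin class counts
        h0, _ = np.histogram(x[y == 0], bins=edges)
        c1, c0 = np.cumsum(h1), np.cumsum(h0)          # left-side counts per boundary
        n1, n0 = c1[-1], c0[-1]

        def gini(i):                                   # weighted Gini at boundary i
            lt, rt = c1[i] + c0[i], (n1 - c1[i]) + (n0 - c0[i])
            pl, pr = c1[i] / lt, (n1 - c1[i]) / rt
            return (lt * 2 * pl * (1 - pl) + rt * 2 * pr * (1 - pr)) / (lt + rt)

        scores = [gini(i) for i in range(len(h1) - 1)]
        best = int(np.argmin(scores))
        split = rng.uniform(edges[best], edges[best + 2])  # random point near best split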

  10. Blind Measurement Selection: A Random Matrix Theory Approach

    KAUST Repository

    Elkhalil, Khalil

    2016-12-14

    This paper considers the problem of selecting a set of $k$ measurements from $n$ available sensor observations. The selected measurements should minimize a certain error function assessing the error in estimating a certain $m$ dimensional parameter vector. The exhaustive search inspecting each of the $\binom{n}{k}$ possible choices would require a very high computational complexity and as such is not practical for large $n$ and $k$. Alternative methods with low complexity have recently been investigated but their main drawbacks are that 1) they require perfect knowledge of the measurement matrix and 2) they need to be applied at the pace of change of the measurement matrix. To overcome these issues, we consider the asymptotic regime in which $k$, $n$ and $m$ grow large at the same pace. Tools from random matrix theory are then used to approximate in closed form the most important error measures that are commonly used. The asymptotic approximations are then leveraged to select properly $k$ measurements exhibiting low values for the asymptotic error measures. Two heuristic algorithms are proposed: the first one merely consists in applying the convex optimization artifice to the asymptotic error measure. The second algorithm is a low-complexity greedy algorithm that attempts to look for a sufficiently good solution for the original minimization problem. The greedy algorithm can be applied to both the exact and the asymptotic error measures and can be thus implemented in blind and channel-aware fashions. We present two potential applications where the proposed algorithms can be used, namely antenna selection for uplink transmissions in large scale multi-user systems and sensor selection for wireless sensor networks. Numerical results are also presented and confirm the efficiency of the proposed blind methods in reaching the performances of channel-aware algorithms.
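
    The channel-aware greedy baseline that the asymptotic (random-matrix) approximations aim to replace can be sketched in a few lines. The error measure here is the trace of the inverse Gram matrix (the A-optimality/MSE criterion), and all sizes are assumed toy values.

        import numpy as np

        rng = np.random.default_rng(6)
        n, m, k = 50, 5, 10           # sensors, parameters, measurements to select
        A = rng.normal(size=(n, m))   # measurement matrix (known in this baseline)

        chosen = []
        for _ in range(k):            # greedy selection on the exact error measure
            best, best_err = None, np.inf
            for i in set(range(n)) - set(chosen):
                S = A[chosen + [i]]
                G = S.T @ S + 1e-9 * np.eye(m)    # ridge guards early singular steps
                err = np.trace(np.linalg.inv(G))  # MSE of the least-squares estimate
                if err < best_err:
                    best, best_err = i, err
            chosen.append(best)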

  11. Sensitivity of landscape resistance estimates based on point selection functions to scale and behavioral state: Pumas as a case study

    Science.gov (United States)

    Katherine A. Zeller; Kevin McGarigal; Paul Beier; Samuel A. Cushman; T. Winston Vickers; Walter M. Boyce

    2014-01-01

    Estimating landscape resistance to animal movement is the foundation for connectivity modeling, and resource selection functions based on point data are commonly used to empirically estimate resistance. In this study, we used GPS data points acquired at 5-min intervals from radiocollared pumas in southern California to model context-dependent point selection...

  12. Using ArcMap, Google Earth, and Global Positioning Systems to select and locate random households in rural Haiti.

    Science.gov (United States)

    Wampler, Peter J; Rediske, Richard R; Molla, Azizur R

    2013-01-18

    A remote sensing technique was developed which combines a Geographic Information System (GIS); Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground based surveys to collect household location data and can be performed in a rapid, cost-effective manner. The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. A total of 537 homes initially were mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed. 16.4% of the homes visited were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. The method used to generate and field locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent and only rarely was local knowledge required to identify and locate households. This
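
    The random-selection step itself reduces to a few lines once homes have been digitized. The sketch below assumes a hypothetical CSV export (the file name and column names are invented here) and writes a waypoint list for a handheld GPS.

        import csv
        import random

        random.seed(42)
        with open("mapped_homes.csv") as f:       # hypothetical Google Earth export:
            homes = list(csv.DictReader(f))       # columns id, latitude, longitude

        survey = random.sample(homes, 96)         # randomized subset of 96 homes
        with open("gps_waypoints.csv", "w", newline="") as f:
            out = csv.writer(f)
            out.writerow(["name", "lat", "lon"])  # waypoints for the handheld unit
            for h in survey:
                out.writerow([h["id"], h["latitude"], h["longitude"]])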

  13. Using ArcMap, Google Earth, and Global Positioning Systems to select and locate random households in rural Haiti

    Directory of Open Access Journals (Sweden)

    Wampler Peter J

    2013-01-01

    Background: A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground-based surveys to collect household location data and can be performed in a rapid, cost-effective manner. Methods: The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. Results: A total of 537 homes initially were mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed. 16.4% of the homes visited were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. Conclusions: The method used to generate and field locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent and only

  14. Optimizing Event Selection with the Random Grid Search

    Energy Technology Data Exchange (ETDEWEB)

    Bhat, Pushpalatha C. [Fermilab; Prosper, Harrison B. [Florida State U.; Sekmen, Sezen [Kyungpook Natl. U.; Stewart, Chip [Broad Inst., Cambridge

    2017-06-29

    The random grid search (RGS) is a simple, but efficient, stochastic algorithm to find optimal cuts that was developed in the context of the search for the top quark at Fermilab in the mid-1990s. The algorithm, and associated code, have been enhanced recently with the introduction of two new cut types, one of which has been successfully used in searches for supersymmetry at the Large Hadron Collider. The RGS optimization algorithm is described along with the recent developments, which are illustrated with two examples from particle physics. One explores the optimization of the selection of vector boson fusion events in the four-lepton decay mode of the Higgs boson and the other optimizes SUSY searches using boosted objects and the razor variables.
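
    The essence of RGS fits in a few lines: every signal event defines a candidate cut point, each candidate is applied to both samples, and the best cut under some figure of merit is kept. The two-feature toy data and the s/sqrt(s+b) figure of merit below are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(7)
        sig = rng.normal([2.0, 1.5], 1.0, size=(1000, 2))   # toy signal features
        bkg = rng.normal([0.0, 0.0], 1.0, size=(5000, 2))   # toy background features

        best, best_fom = None, -np.inf
        for cut in sig[rng.choice(len(sig), 200, replace=False)]:  # cuts from signal
            s = np.sum((sig > cut).all(axis=1))    # events passing (x > cx) & (y > cy)
            b = np.sum((bkg > cut).all(axis=1))
            fom = s / np.sqrt(s + b) if s + b > 0 else -np.inf
            if fom > best_fom:
                best, best_fom = cut, fom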

  15. Non-random mating for selection with restricted rates of inbreeding and overlapping generations

    NARCIS (Netherlands)

    Sonesson, A.K.; Meuwissen, T.H.E.

    2002-01-01

    Minimum coancestry mating with a maximum of one offspring per mating pair (MC1) is compared with random mating schemes for populations with overlapping generations. Optimum contribution selection is used, whereby $\Delta F$ is restricted. For schemes with $\Delta F$ restricted to 0.25% per

  16. Depression treatment for impoverished mothers by point-of-care providers: A randomized controlled trial.

    Science.gov (United States)

    Segre, Lisa S; Brock, Rebecca L; O'Hara, Michael W

    2015-04-01

    Depression in low-income, ethnic-minority women of childbearing age is prevalent and compromises infant and child development. Yet numerous barriers prevent treatment delivery. Listening Visits (LV), an empirically supported intervention developed for delivery by British home-visiting nurses, could address this unmet mental health need. This randomized controlled trial (RCT) evaluated the effectiveness of LV delivered at a woman's usual point-of-care, including home visits or an ob-gyn office. Listening Visits were delivered to depressed pregnant women or mothers of young children by their point-of-care provider (e.g., home visitor or physician's assistant), all of whom had low levels of prior counseling experience. Three quarters of the study's participants were low-income. Of those who reported ethnicity, all identified themselves as minorities. Participants from 4 study sites (N = 66) were randomized in a 2:1 ratio, to LV or a wait-list control group (WLC). Assessments, conducted at baseline and 8 weeks, evaluated depression, quality of life, and treatment satisfaction. Depressive severity, depressive symptoms, and quality of life significantly improved among LV recipients as compared with women receiving standard social/health services. Women valued LV as evidenced by their high attendance rates and treatment satisfaction ratings. In a stepped model of depression care, LV can provide an accessible, acceptable, and effective first-line treatment option for at-risk women who otherwise are unlikely to receive treatment. (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  17. Geography and genography: prediction of continental origin using randomly selected single nucleotide polymorphisms

    Directory of Open Access Journals (Sweden)

    Ramoni Marco F

    2007-03-01

    Background: Recent studies have shown that when individuals are grouped on the basis of genetic similarity, group membership corresponds closely to continental origin. There has been considerable debate about the implications of these findings in the context of larger debates about race and the extent of genetic variation between groups. Some have argued that clustering according to continental origin demonstrates the existence of significant genetic differences between groups and that these differences may have important implications for differences in health and disease. Others argue that clustering according to continental origin requires the use of large amounts of genetic data or specifically chosen markers and is indicative only of very subtle genetic differences that are unlikely to have biomedical significance. Results: We used small numbers of randomly selected single nucleotide polymorphisms (SNPs) from the International HapMap Project to train naïve Bayes classifiers for prediction of ancestral continent of origin. Predictive accuracy was tested on two independent data sets. Genetically similar groups should be difficult to distinguish, especially if only a small number of genetic markers are used. The genetic differences between continentally defined groups are sufficiently large that one can accurately predict ancestral continent of origin using only a minute, randomly selected fraction of the genetic variation present in the human genome. Genotype data from only 50 random SNPs was sufficient to predict ancestral continent of origin in our primary test data set with an average accuracy of 95%. Genetic variations informative about ancestry were common and widely distributed throughout the genome. Conclusion: Accurate characterization of ancestry is possible using small numbers of randomly selected SNPs. The results presented here show how investigators conducting genetic association studies can use small numbers of arbitrarily
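
    The experiment is straightforward to mimic on synthetic genotypes. The sketch below invents three populations with slightly different allele frequencies (all sizes and frequencies are assumptions) and trains a naive Bayes classifier on 50 randomly selected SNPs.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.naive_bayes import CategoricalNB

        rng = np.random.default_rng(8)
        n_per, p = 300, 5000
        freqs = rng.uniform(0.1, 0.9, size=(3, p))    # per-group allele frequencies
        X = np.vstack([rng.binomial(2, freqs[c], size=(n_per, p)) for c in range(3)])
        y = np.repeat([0, 1, 2], n_per)               # ancestral label

        snps = rng.choice(p, size=50, replace=False)  # 50 randomly selected SNPs
        Xtr, Xte, ytr, yte = train_test_split(X[:, snps], y, random_state=0)
        acc = CategoricalNB(min_categories=3).fit(Xtr, ytr).score(Xte, yte)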

  18. Acupuncture-Point Stimulation for Postoperative Pain Control: A Systematic Review and Meta-Analysis of Randomized Controlled Trials

    Directory of Open Access Journals (Sweden)

    Xian-Liang Liu

    2015-01-01

    The purpose of this study was to evaluate the effectiveness of acupuncture-point stimulation (APS) in postoperative pain control compared with sham/placebo acupuncture or standard treatments (usual care or no treatment). Only randomized controlled trials (RCTs) were included. Meta-analysis results indicated that APS interventions improved VAS scores significantly and also reduced total morphine consumption. No serious APS-related adverse effects (AEs) were reported. There is Level I evidence for the effectiveness of body-points plaster therapy, and Level II evidence for body-points electroacupuncture (EA), body-points acupressure, body-points APS for abdominal surgery patients, auricular-points seed embedding, manual auricular acupuncture, and auricular EA. We obtained Level III evidence for body-points APS in patients who underwent cardiac surgery and cesarean section and for auricular-point stimulation in patients who underwent abdominal surgery. There is insufficient evidence to conclude that APS is an effective postoperative pain therapy in surgical patients, although the evidence does support the conclusion that APS can reduce analgesic requirements without AEs. The best level of evidence was not adequate in most subgroups. Some limitations of this study may have affected the results, possibly leading to an overestimation of APS effects.

  19. Chromosome break points of T-lymphocytes from atomic bomb survivors

    International Nuclear Information System (INIS)

    Tanaka, Kimio; Kamada, Nanao; Kuramoto, Atsushi; Ohkita, Takeshi

    1980-01-01

    Chromosome break points of T-lymphocytes were investigated in 9 atomic bomb survivors estimated to have been irradiated with 100-630 rad. Chromosome aberrations were found in 199 of the 678 cells investigated, with a non-random distribution. The types of chromosome aberration were translocation: 56%, deletion: 38%, addition: 3%, and inversion: 2%. High incidences of chromosome aberrations were observed on chromosomes 22, 21, and 13, and low incidences on 11, 12, and 4. The aberration numbers per arm were high in 22q, 21q, and 18p and low in 11q, 5p, and 12q. Within a chromosome, 50.7% of the break points were observed at the terminal portion and 73% at the pale bands revealed by the Q-partial-staining method, suggesting a non-random distribution. The incidence of aberrations in 22q was statistically significant (P < 0.01); this break point corresponds to the Ph1 chromosome in chronic myelocytic leukemia. The non-random distribution of chromosome break points seemed to reflect a selection effect since irradiation. (Nakanishi, T.)

  20. Segmentation of Large Unstructured Point Clouds Using Octree-Based Region Growing and Conditional Random Fields

    Science.gov (United States)

    Bassier, M.; Bonduel, M.; Van Genechten, B.; Vergauwen, M.

    2017-11-01

    Point cloud segmentation is a crucial step in scene understanding and interpretation. The goal is to decompose the initial data into sets of workable clusters with similar properties. Additionally, it is a key aspect in the automated procedure from point cloud data to BIM. Current approaches typically only segment a single type of primitive such as planes or cylinders. Also, current algorithms suffer from oversegmenting the data and are often sensor or scene dependent. In this work, a method is presented to automatically segment large unstructured point clouds of buildings. More specifically, the segmentation is formulated as a graph optimisation problem. First, the data is oversegmented with a greedy octree-based region growing method. The growing is conditioned on the segmentation of planes as well as smooth surfaces. Next, the candidate clusters are represented by a Conditional Random Field, after which the most likely configuration of candidate clusters is computed given a set of local and contextual features. The experiments prove that the proposed method is a fast and reliable framework for unstructured point cloud segmentation. Processing speeds up to 40,000 points per second are recorded for the region growing. Additionally, the recall and precision of the graph clustering are approximately 80%. Overall, nearly 22% of oversegmentation is reduced by clustering the data. These clusters will be classified and used as a basis for the reconstruction of BIM models.

  1. Velocity and Dispersion for a Two-Dimensional Random Walk

    International Nuclear Information System (INIS)

    Li Jinghui

    2009-01-01

    In this paper, we consider the transport of a two-dimensional random walk. The velocity and the dispersion of this two-dimensional random walk are derived. It mainly shows that: (i) by controlling the values of the transition rates, the direction of the random walk can be reversed; (ii) for some suitably selected transition rates, our two-dimensional random walk can be efficient in comparison with the one-dimensional random walk. Our work is motivated in part by the challenge of explaining the unidirectional transport of motor proteins. When the motor proteins move at the turn points of their tracks (i.e., the cytoskeleton filaments and the DNA molecular tubes), some of our results in this paper can be used to deal with the problem. (general)
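
    Both quantities are easy to estimate by direct simulation. The step probabilities below are assumed values chosen to give a rightward drift, and the dispersion is read off as Var(x)/2t along one axis.

        import numpy as np

        rng = np.random.default_rng(9)
        p = np.array([0.4, 0.1, 0.25, 0.25])       # right/left/up/down rates (assumed)
        steps = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]])

        T, walkers = 1000, 5000
        choices = rng.choice(4, size=(walkers, T), p=p)
        x_final = steps[choices].sum(axis=1)[:, 0]  # x displacement after T steps
        velocity = x_final.mean() / T               # drift velocity along x
        dispersion = x_final.var() / (2 * T)        # diffusion coefficient along x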

  2. Fuzzy Random λ-Mean SAD Portfolio Selection Problem: An Ant Colony Optimization Approach

    Science.gov (United States)

    Thakur, Gour Sundar Mitra; Bhattacharyya, Rupak; Mitra, Swapan Kumar

    2010-10-01

    To reach the investment goal, one has to select a combination of securities among different portfolios containing large numbers of securities. Past records of each security alone do not guarantee future returns. As there are many uncertain factors which directly or indirectly influence the stock market, and there are also some newer stock markets which do not have enough historical data, experts' expectations and experience must be combined with the past records to generate an effective portfolio selection model. In this paper the return of a security is assumed to be a Fuzzy Random Variable Set (FRVS), where returns are sets of random numbers which are in turn fuzzy numbers. A new λ-Mean Semi Absolute Deviation (λ-MSAD) portfolio selection model is developed. The subjective opinions of the investors on the rates of return of each security are taken into consideration by introducing a pessimistic-optimistic parameter vector λ. The λ-MSAD model is preferred as it uses the absolute deviation of the rate of returns of a portfolio instead of the variance as the measure of risk. As this model can be reduced to a Linear Programming Problem (LPP), it can be solved much faster than quadratic programming problems. Ant Colony Optimization (ACO) is used for solving the portfolio selection problem. ACO is a paradigm for designing meta-heuristic algorithms for combinatorial optimization problems. Data from the BSE are used for illustration.

  3. Treatment of myofascial trigger points in female patients with chronic tension-type headache - A randomized controlled trial

    DEFF Research Database (Denmark)

    Berggreen, S.; Wiik, E.; Lund, Hans

    2012-01-01

    The aim of this study was to evaluate the efficacy of myofascial trigger point massage in the muscles of the head, neck and shoulders regarding pain in the treatment of females with chronic tension-type headache. They were randomized into either a treatment group (n = 20) (one session of trigger point massage per week for 10 weeks) or a control group receiving no treatment (n = 19). The patients kept a diary to record their pain on a visual analogue scale (VAS) and the daily intake of drugs (mg) during the 4 weeks before and after the treatment period. The McGill Pain Questionnaire ... (8.8, 95% CI 0.1-17.4, p = 0.047). Furthermore, a significant decrease in the number of trigger points was observed in the treatment group compared with the control group. Myofascial trigger point massage has a beneficial effect on pain in female patients with chronic tension-type headache.

  4. Correlations between PANCE performance, physician assistant program grade point average, and selection criteria.

    Science.gov (United States)

    Brown, Gina; Imel, Brittany; Nelson, Alyssa; Hale, LaDonna S; Jansen, Nick

    2013-01-01

    The purpose of this study was to examine correlations between first-time Physician Assistant National Certifying Exam (PANCE) scores and pass/fail status, physician assistant (PA) program didactic grade point average (GPA), and specific selection criteria. This retrospective study evaluated graduating classes from 2007, 2008, and 2009 at a single program (N = 119). There was no correlation between PANCE performance and undergraduate grade point average (GPA), science prerequisite GPA, or health care experience. There was a moderate correlation between PANCE pass/fail and where students took science prerequisites (r = 0.27, P = .003) but not with the PANCE score. PANCE scores were correlated with overall PA program GPA (r = 0.67), PA pharmacology grade (r = 0.68), and PA anatomy grade (r = 0.41) but not with PANCE pass/fail. Correlations between selection criteria and PANCE performance were limited, but further research regarding the influence of prerequisite institution type may be warranted and may improve admission decisions. PANCE scores and PA program GPA correlations may guide academic advising and remediation decisions for current students.

  5. Pediatric selective mutism therapy: a randomized controlled trial.

    Science.gov (United States)

    Esposito, Maria; Gimigliano, Francesca; Barillari, Maria R; Precenzano, Francesco; Ruberto, Maria; Sepe, Joseph; Barillari, Umberto; Gimigliano, Raffaele; Militerni, Roberto; Messina, Giovanni; Carotenuto, Marco

    2017-10-01

    Selective mutism (SM) is a rare disease in children coded by DSM-5 as an anxiety disorder. Despite the disabling nature of the disease, there is still no specific treatment. The aims of this study were to verify the efficacy of a six-month standard psychomotor treatment and the positive changes in lifestyle in a population of children affected by SM. Randomized controlled trial registered in the European Clinical Trials Registry (EuDract 2015-001161-36). University third-level Centre (Child and Adolescent Neuropsychiatry Clinic). The study population was composed of 67 children in group A (psychomotricity treatment) (35 M, mean age 7.84±1.15) and 71 children in group B (behavioral and educational counseling) (37 M, mean age 7.75±1.36). Psychomotor treatment was administered by trained child therapists in residential settings three times per week. Each child was treated for the whole period by the same therapist and all the therapists shared the same protocol. The standard psychomotor session length is 45 minutes. At T0 and after 6 months (T1) of treatment, patients underwent a behavioral and SM severity assessment. To verify the effects of the psychomotor management, the Child Behavior Checklist questionnaire (CBCL) and the Selective Mutism Questionnaire (SMQ) were administered to the parents. After 6 months of psychomotor treatment, SM children showed a significant reduction in CBCL scores such as social relations, anxious/depressed, social problems and total problems (P < 0.05), supporting the efficacy of the treatment for selective mutism, even if further studies are needed. The present study identifies psychomotricity as a safe and effective therapy for pediatric selective mutism.

  6. A Randomized Clinical Trial of Auricular Point Acupressure for Chronic Low Back Pain: A Feasibility Study

    Directory of Open Access Journals (Sweden)

    Chao Hsing Yeh

    2013-01-01

    Objectives: This prospective, randomized clinical trial (RCT) was designed to investigate the feasibility and effects of a 4-week auricular point acupressure (APA) treatment for chronic low back pain (CLBP). Methods: Participants were randomized to either true APA (taped seeds on the ear points designated for CLBP) or sham APA (taped seeds on locations other than those designated for CLBP). The duration of treatment was four weeks. Participants were assessed before treatment, weekly during treatment, and 1 month following treatment. Results: Participants in the true APA group who completed the 4-week APA treatment had a 70% reduction in worst pain intensity, a 75% reduction in overall pain intensity, and a 42% improvement in disability due to back pain from baseline assessment. The reductions of worst pain and overall pain intensity in the true APA group were statistically greater than in the sham group (P < 0.01) at the completion of the 4-week APA and at the 1-month follow-up. Discussion: The preliminary findings of this feasibility study showed a reduction in pain intensity and improvement in physical function, suggesting that APA may be a promising treatment for patients with CLBP.

  7. Primitive polynomials selection method for pseudo-random number generator

    Science.gov (United States)

    Anikin, I. V.; Alnajjar, Kh

    2018-01-01

    In this paper we suggest a method for selecting primitive polynomials of a special type. Such polynomials can be efficiently used as characteristic polynomials for linear feedback shift registers in pseudo-random number generators. The proposed method consists of two basic steps: finding minimum-cost irreducible polynomials of the desired degree and applying primitivity tests to get the primitive ones. Finally, two primitive polynomials found by the proposed method were used in a pseudo-random number generator based on fuzzy logic (FRNG), which had been suggested before by the authors. The sequences generated by the new version of FRNG have low correlation magnitude, high linear complexity and lower power consumption, and are more balanced with better statistical properties.
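
    The order test behind such a search can be sketched compactly. Here irreducibility is folded into the primitivity check (x has order 2^n - 1 modulo f only if f is irreducible and primitive), whereas the paper screens minimum-cost irreducibles first; the degree n = 16 is an assumed value. Polynomials are stored as bit masks over GF(2).

        from sympy import factorint        # used only to factor 2**n - 1

        def gf2_mul(a, b, f):              # multiply modulo f over GF(2)
            r, top = 0, f.bit_length() - 1
            while b:
                if b & 1:
                    r ^= a
                b >>= 1
                a <<= 1
                if (a >> top) & 1:
                    a ^= f                 # reduce when degree reaches deg f
            return r

        def gf2_pow(a, e, f):              # square-and-multiply exponentiation
            r = 1
            while e:
                if e & 1:
                    r = gf2_mul(r, a, f)
                a = gf2_mul(a, a, f)
                e >>= 1
            return r

        def is_primitive(f, n):
            order = 2 ** n - 1    # x generates GF(2^n)* iff its order is 2^n - 1
            return gf2_pow(2, order, f) == 1 and all(
                gf2_pow(2, order // q, f) != 1 for q in factorint(order))

        n = 16
        f = next(p for p in range(2 ** n + 1, 2 ** (n + 1), 2) if is_primitive(p, n))
        print(f"{f:#x}")                   # first degree-16 primitive polynomial found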

  8. The genealogy of samples in models with selection.

    Science.gov (United States)

    Neuhauser, C; Krone, S M

    1997-02-01

    We introduce the genealogy of a random sample of genes taken from a large haploid population that evolves according to random reproduction with selection and mutation. Without selection, the genealogy is described by Kingman's well-known coalescent process. In the selective case, the genealogy of the sample is embedded in a graph with a coalescing and branching structure. We describe this graph, called the ancestral selection graph, and point out differences and similarities with Kingman's coalescent. We present simulations for a two-allele model with symmetric mutation in which one of the alleles has a selective advantage over the other. We find that when the allele frequencies in the population are already in equilibrium, then the genealogy does not differ much from the neutral case. This is supported by rigorous results. Furthermore, we describe the ancestral selection graph for other selective models with finitely many selection classes, such as the K-allele models, infinitely-many-alleles models, DNA sequence models, and infinitely-many-sites models, and briefly discuss the diploid case.

  9. Post-model selection inference and model averaging

    Directory of Open Access Journals (Sweden)

    Georges Nguefack-Tsague

    2011-07-01

    Although model selection is routinely used in practice nowadays, little is known about its precise effects on any subsequent inference that is carried out. The same goes for the effects induced by the closely related technique of model averaging. This paper is concerned with the use of the same data first to select a model and then to carry out inference, in particular point estimation and point prediction. The properties of the resulting estimator, called a post-model-selection estimator (PMSE), are hard to derive. Using selection criteria such as hypothesis testing, AIC, BIC, HQ and Cp, we illustrate that, in terms of risk function, no single PMSE dominates the others. The same conclusion holds more generally for any penalised likelihood information criterion. We also compare various model averaging schemes and show that no single one dominates the others in terms of risk function. Since PMSEs can be regarded as a special case of model averaging, with 0-1 random weights, we propose a connection between the two theories, in the frequentist approach, by taking account of the selection procedure when performing model averaging. We illustrate the point by simulating a simple linear regression model.

  10. 3D reconstruction of laser projective point with projection invariant generated from five points on 2D target.

    Science.gov (United States)

    Xu, Guan; Yuan, Jing; Li, Xiaotao; Su, Jian

    2017-08-01

    Vision measurement on the basis of structured light plays a significant role in optical inspection research. A 2D target fixed with a line laser projector is designed to realize the transformations among the world coordinate system, the camera coordinate system and the image coordinate system. The laser projective point and five non-collinear points that are randomly selected from the target are adopted to construct a projection invariant. The closed-form solutions of the 3D laser points are solved from the homogeneous linear equations generated by the projection invariants. The optimization function is created from the parameterized re-projection errors of the laser points and the target points in the image coordinate system. Furthermore, the nonlinear optimization solutions of the world coordinates of the projection points, the camera parameters and the lens distortion coefficients are obtained by minimizing the optimization function. The accuracy of the 3D reconstruction is evaluated by comparing the displacements of the reconstructed laser points with the actual displacements. The effects of the image quantity, the lens distortion and the noise are investigated in the experiments, which demonstrate that the reconstruction approach supports accurate testing in the measurement system.

  11. Mahalanobis Distance Based Iterative Closest Point

    DEFF Research Database (Denmark)

    Hansen, Mads Fogtmann; Blas, Morten Rufus; Larsen, Rasmus

    2007-01-01

    ... the notion of a Mahalanobis distance map upon a point set with associated covariance matrices which, in addition to providing correlation-weighted distance, implicitly provides a method for assigning correspondence during alignment. This distance map provides an easy formulation of the ICP problem that permits ... a fast optimization. Initially, the covariance matrices are set to the identity matrix, and all shapes are aligned to a randomly selected shape (equivalent to standard ICP). From this point the algorithm iterates between the steps: (a) obtain mean shape and new estimates of the covariance matrices from the aligned shapes, (b) align shapes to the mean shape. Three different methods for estimating the mean shape with associated covariance matrices are explored in the paper. The proposed methods are validated experimentally on two separate datasets (IMM face dataset and femur-bones). The superiority of ICP...
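
    A toy version of the iteration is sketched below on pre-matched 2D landmarks (no correspondence search). The full Mahalanobis alignment has no closed-form Procrustes solution, so scalar trace weights stand in for the covariance weighting, and all sizes are assumed values; this is an illustration of the alternation, not the paper's method.

        import numpy as np

        def align(src, dst, cov_inv):
            # one weighted Procrustes step approximating the Mahalanobis objective
            w = np.array([np.trace(C) for C in cov_inv])
            w /= w.sum()
            ms, md = (w[:, None] * src).sum(0), (w[:, None] * dst).sum(0)
            H = (src - ms).T @ (np.diag(w) @ (dst - md))
            U, _, Vt = np.linalg.svd(H)
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:          # keep a proper rotation
                Vt[-1] *= -1
                R = Vt.T @ U.T
            return src @ R.T + (md - ms @ R.T)

        rng = np.random.default_rng(10)
        shapes = [rng.normal(size=(30, 2)) for _ in range(5)]  # matched landmarks
        cov_inv = [np.eye(2)] * 30            # identity start = standard ICP
        ref = shapes[0]                       # align first to a randomly selected shape
        for _ in range(10):                   # iterate: align, then refit mean/covariances
            shapes = [align(s, ref, cov_inv) for s in shapes]
            ref = np.mean(shapes, axis=0)
            cov_inv = [np.linalg.inv(np.cov(np.stack([s[i] for s in shapes]).T)
                                     + 1e-6 * np.eye(2)) for i in range(30)]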

  12. Random covering of the circle: the configuration-space of the free deposition process

    Energy Technology Data Exchange (ETDEWEB)

    Huillet, Thierry [Laboratoire de Physique Theorique et Modelisation, CNRS-UMR 8089 et Universite de Cergy-Pontoise, 5 mail Gay-Lussac, 95031, Neuville sur Oise (France)

    2003-12-12

    Consider a circle of circumference 1. Throw at random n points, sequentially, on this circle and append clockwise an arc (or rod) of length s to each such point. The resulting random set (the free gas of rods) is a collection of a random number of clusters with random sizes. It models a free deposition process on a 1D substrate. For such processes, we shall consider the occurrence times (number of rods) and probabilities, as n grows, of the following configurations: those avoiding rod overlap (the hard-rod gas), those for which the largest gap is smaller than rod length s (the packing gas), those (parking configurations) for which hard-rod and packing constraints are both fulfilled, and covering configurations. Special attention is paid to the statistical properties of each such (rare) configuration in the asymptotic density domain when ns = ρ, for some finite density ρ of points. Using results from spacings in the random division of the circle, explicit large deviation rate functions can be computed in each case from state equations. Lastly, a process consisting in selecting at random one of these specific equilibrium configurations (called the observable) can be modelled. When particularized to the parking model, this system produces parking configurations differently from Renyi's random sequential adsorption model.
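
    The configurations are simple to sample directly. The density below is an assumed value, and note that the covering condition (every gap at most s) coincides with the packing condition up to boundary cases.

        import numpy as np

        rng = np.random.default_rng(11)
        n, s = 20, 0.01                  # rods and rod length; density rho = n*s = 0.2
        trials, hard, packing, nclust = 5000, 0, 0, []
        for _ in range(trials):
            x = np.sort(rng.random(n))                 # n random points on the circle
            gaps = np.diff(np.append(x, x[0] + 1))     # circular spacings
            hard += (gaps >= s).all()                  # hard-rod gas: no overlaps
            packing += (gaps <= s).all()               # packing gas: largest gap < s
            nclust.append(max(1, int((gaps > s).sum())))  # clusters of overlapping rods
        print(hard / trials, packing / trials, np.mean(nclust))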

  13. A heuristic for the distribution of point counts for random curves over a finite field.

    Science.gov (United States)

    Achter, Jeffrey D; Erman, Daniel; Kedlaya, Kiran S; Wood, Melanie Matchett; Zureick-Brown, David

    2015-04-28

    How many rational points are there on a random algebraic curve of large genus g over a given finite field Fq? We propose a heuristic for this question motivated by a (now proven) conjecture of Mumford on the cohomology of moduli spaces of curves; this heuristic suggests a Poisson distribution with mean q+1+1/(q-1). We prove a weaker version of this statement in which g and q tend to infinity, with q much larger than g. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
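
    A quick numerical reading of the heuristic (the field size is illustrative):

```python
import numpy as np

q = 9                                  # example finite field size for F_q
mean = q + 1 + 1 / (q - 1)             # heuristic mean number of points
samples = np.random.default_rng(0).poisson(mean, 100_000)
print(mean, samples.mean())            # both close to 10.125
```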

  14. INTRA- AND INTER-OBSERVER RELIABILITY IN SELECTION OF THE HEART RATE DEFLECTION POINT DURING INCREMENTAL EXERCISE: COMPARISON TO A COMPUTER-GENERATED DEFLECTION POINT

    Directory of Open Access Journals (Sweden)

    Bridget A. Duoos

    2002-12-01

    Full Text Available This study was designed to 1) determine the relative frequency of occurrence of a heart rate deflection point (HRDP), when compared to a linear relationship, during progressive exercise, 2) measure the reproducibility of a visual assessment of the HRDP, both within and between observers, and 3) compare visual and computer-assessed deflection points. Subjects consisted of 73 competitive male cyclists with mean age of 31.4 ± 6.3 years, mean height 178.3 ± 4.8 cm and weight 74.0 ± 4.4 kg. Tests were conducted on an electrically-braked cycle ergometer beginning at 25 watts and progressing 25 watts per minute to fatigue. Heart rates were recorded during the last 10 seconds of each stage and at fatigue. Scatter plots of heart rate versus watts were computer-generated and given to 3 observers on two different occasions. A computer program was developed to assess whether data points were best represented by a single line or two lines; the HRDP represented the intersection of the two lines. Results of this study showed that 1) computer-assessed HRDP showed that 44 of 73 subjects (60.3%) had scatter plots best represented by a straight line with no HRDP, 2) in those subjects having HRDP, all 3 observers showed significant differences (p = 0.048, p = 0.007, p = 0.001) in reproducibility of their HRDP selection, and differences in HRDP selection were significant for two of the three comparisons between observers (p = 0.002, p = 0.305, p = 0.0003), and 3) computer-generated HRDP was significantly different from visual HRDP for 2 of 3 observers (p = 0.0016, p = 0.513, p = 0.0001). It is concluded that 1) HRDP occurs in a minority of subjects, 2) significant differences exist, both within and between observers, in selection of HRDP, and 3) differences in agreement between visual and computer-generated HRDP indicate that, when HRDP exists, it should be computer-assessed.
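
    A minimal sketch of the computer assessment described: fit a single line and the best two-segment model to the heart-rate-versus-watts data and report a deflection point only when the two-line fit is clearly better. The 0.9 improvement threshold is an assumption for illustration, not the study's criterion.

```python
import numpy as np

def hrdp(watts, hr):
    """Grid-search a two-segment linear fit; the segment boundary is the
    candidate heart rate deflection point (HRDP)."""
    best_sse, best_bp = np.inf, None
    for i in range(3, len(watts) - 3):         # candidate breakpoints
        res1 = np.polyfit(watts[:i], hr[:i], 1, full=True)[1]
        res2 = np.polyfit(watts[i:], hr[i:], 1, full=True)[1]
        sse = float(res1[0] + res2[0]) if len(res1) and len(res2) else np.inf
        if sse < best_sse:
            best_sse, best_bp = sse, watts[i]
    sse_one = float(np.polyfit(watts, hr, 1, full=True)[1][0])
    # Declare "no HRDP" when two lines barely improve on a single line.
    return best_bp if best_sse < 0.9 * sse_one else None

w = np.arange(25, 301, 25, dtype=float)        # workload stages (watts)
hr = np.minimum(60 + 0.45 * w, 160.0)          # synthetic HR with deflection
hr += np.random.default_rng(0).normal(0, 1.0, hr.size)
print(hrdp(w, hr))                             # breakpoint near 225 W
```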

  15. Critical points of multidimensional random Fourier series: variance estimates

    OpenAIRE

    Nicolaescu, Liviu I.

    2013-01-01

    To any positive number $\varepsilon$ and any nonnegative even Schwartz function $w:\mathbb{R}\to\mathbb{R}$ we associate the random function $u^\varepsilon$ on the $m$-torus $T^m_\varepsilon := \mathbb{R}^m/(\varepsilon^{-1}\mathbb{Z})^m$ defined as the real part of the random Fourier series…

  16. Randomizing Roaches: Exploring the "Bugs" of Randomization in Experimental Design

    Science.gov (United States)

    Wagler, Amy; Wagler, Ron

    2014-01-01

    Understanding the roles of random selection and random assignment in experimental design is a central learning objective in most introductory statistics courses. This article describes an activity, appropriate for a high school or introductory statistics course, designed to teach the concepts, values and pitfalls of random selection and assignment…

  17. Effects of one versus two bouts of moderate intensity physical activity on selective attention during a school morning in Dutch primary schoolchildren: A randomized controlled trial.

    Science.gov (United States)

    Altenburg, Teatske M; Chinapaw, Mai J M; Singh, Amika S

    2016-10-01

    Evidence suggests that physical activity is positively related to several aspects of cognitive functioning in children, among which is selective attention. To date, no information is available on the optimal frequency of physical activity on cognitive functioning in children. The current study examined the acute effects of one and two bouts of moderate-intensity physical activity on children's selective attention. Randomized controlled trial (ISRCTN97975679). Thirty boys and twenty-six girls, aged 10-13 years, were randomly assigned to three conditions: (A) sitting all morning working on simulated school tasks; (B) one 20-min physical activity bout after 90 min; and (C) two 20-min physical activity bouts, i.e. at the start and after 90 min. Selective attention was assessed at five time points during the morning (i.e. at baseline and after 20, 110, 130 and 220 min), using the 'Sky Search' subtest of the 'Test of Selective Attention in Children'. We used GEE analysis to examine differences in Sky Search scores between the three experimental conditions, adjusting for school, baseline scores, self-reported screen time and time spent in sports. Children who performed two 20-min bouts of moderate-intensity physical activity had significantly better Sky Search scores compared to children who performed one physical activity bout or remained seated the whole morning (B=-0.26; 95% CI=[-0.52; -0.00]). Our findings support the importance of repeated physical activity during the school day for beneficial effects on selective attention in children. Copyright © 2015 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  18. The mean distance to the nth neighbour in a uniform distribution of random points: an application of probability theory

    International Nuclear Information System (INIS)

    Bhattacharyya, Pratip; Chakrabarti, Bikas K

    2008-01-01

    We study different ways of determining the mean distance (r n ) between a reference point and its nth neighbour among random points distributed with uniform density in a D-dimensional Euclidean space. First, we present a heuristic method; though this method provides only a crude mathematical result, it shows a simple way of estimating (r n ). Next, we describe two alternative means of deriving the exact expression of (r n ): we review the method using absolute probability and develop an alternative method using conditional probability. Finally, we obtain an approximation to (r n ) from the mean volume between the reference point and its nth neighbour and compare it with the heuristic and exact results
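
    The exact expressions can be checked by simulation; a Monte Carlo sketch (the box-size heuristic and seed are arbitrary choices):

```python
import numpy as np

def mean_nth_neighbour(D, n, density=1.0, trials=2000, rng=None):
    """Monte Carlo estimate of <r_n> for uniform random points in D
    dimensions, measured from the centre of a box chosen large enough
    to make edge effects negligible."""
    if rng is None:
        rng = np.random.default_rng(0)
    L = (50 * n / density) ** (1 / D)          # heuristic box side
    out = []
    while len(out) < trials:
        N = rng.poisson(density * L**D)        # random number of points
        if N < n:
            continue
        pts = rng.uniform(-L / 2, L / 2, (N, D))
        r = np.sort(np.linalg.norm(pts, axis=1))
        out.append(r[n - 1])
    return float(np.mean(out))

print(mean_nth_neighbour(D=2, n=1))            # ~0.5 = 1/(2*sqrt(rho)) in 2D
```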

  19. Distribution majorization of corner points by reinforcement learning for moving object detection

    Science.gov (United States)

    Wu, Hao; Yu, Hao; Zhou, Dongxiang; Cheng, Yongqiang

    2018-04-01

    Corner points play an important role in moving object detection, especially in the case of a free-moving camera. Corner points provide more accurate information than other pixels and reduce unnecessary computation. Previous works use only intensity information to locate corner points; however, the information provided by preceding and subsequent frames can also be used. We utilize this information to focus on more valuable areas and ignore less valuable ones. The proposed algorithm is based on reinforcement learning, which regards the detection of corner points as a Markov process. In the Markov model, the video to be detected is regarded as the environment, the selections of blocks for one corner point are regarded as actions, and the performance of detection is regarded as the state. Corner points are assigned to blocks which are separated from the original whole image. Experimentally, we select a conventional method that uses matching and the Random Sample Consensus algorithm to obtain objects as the main framework and utilize our algorithm to improve the result. The comparison between the conventional method and the same method with our algorithm shows that our algorithm reduces false detections by 70%.

  20. Materials selection for oxide-based resistive random access memories

    International Nuclear Information System (INIS)

    Guo, Yuzheng; Robertson, John

    2014-01-01

    The energies of atomic processes in resistive random access memories (RRAMs) are calculated for four typical oxides, HfO 2 , TiO 2 , Ta 2 O 5 , and Al 2 O 3 , to define a materials selection process. O vacancies have the lowest defect formation energy in the O-poor limit and dominate the processes. A band diagram defines the operating Fermi energy and O chemical potential range. It is shown how the scavenger metal can be used to vary the O vacancy formation energy, via controlling the O chemical potential, and the mean Fermi energy. The high endurance of Ta 2 O 5 RRAM is related to its more stable amorphous phase and the adaptive lattice rearrangements of its O vacancy

  1. Image subsampling and point scoring approaches for large-scale marine benthic monitoring programs

    Science.gov (United States)

    Perkins, Nicholas R.; Foster, Scott D.; Hill, Nicole A.; Barrett, Neville S.

    2016-07-01

    Benthic imagery is an effective tool for quantitative description of ecologically and economically important benthic habitats and biota. The recent development of autonomous underwater vehicles (AUVs) allows surveying of spatial scales that were previously unfeasible. However, an AUV collects a large number of images, the scoring of which is time and labour intensive. There is a need to optimise the way that subsamples of imagery are chosen and scored to gain meaningful inferences for ecological monitoring studies. We examine the trade-off between the number of images selected within transects and the number of random points scored within images on the percent cover of target biota, the typical output of such monitoring programs. We also investigate the efficacy of various image selection approaches, such as systematic or random, on the bias and precision of cover estimates. We use simulated biotas that have varying size, abundance and distributional patterns. We find that a relatively small sampling effort is required to minimise bias. An increased precision for groups that are likely to be the focus of monitoring programs is best gained through increasing the number of images sampled rather than the number of points scored within images. For rare species, sampling using point count approaches is unlikely to provide sufficient precision, and alternative sampling approaches may need to be employed. The approach by which images are selected (simple random sampling, regularly spaced etc.) had no discernible effect on mean and variance estimates, regardless of the distributional pattern of biota. Field validation of our findings is provided through Monte Carlo resampling analysis of a previously scored benthic survey from temperate waters. We show that point count sampling approaches are capable of providing relatively precise cover estimates for candidate groups that are not overly rare. The amount of sampling required, in terms of both the number of images and
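
    The images-versus-points trade-off can be reproduced with a toy model; the beta-distributed between-image variability below is an assumption for illustration, not the paper's simulated biotas.

```python
import numpy as np

rng = np.random.default_rng(42)
true_cover = 0.15                        # regional percent cover of the biota

def survey(n_images, n_points):
    """Score n_points random points in each of n_images of a patchy seabed."""
    local = rng.beta(2, 2 / true_cover - 2, size=n_images)  # mean true_cover
    hits = rng.binomial(n_points, local)
    return (hits / n_points).mean()      # estimated percent cover

# Precision gains come mainly from more images, not more points per image:
for ni, npts in [(20, 25), (20, 100), (80, 25)]:
    est = [survey(ni, npts) for _ in range(500)]
    print(ni, npts, round(float(np.std(est)), 4))
```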

  2. Automated corresponding point candidate selection for image registration using wavelet transformation neural network with rotation invariant inputs and context information about neighboring candidates

    Science.gov (United States)

    Okumura, Hiroshi; Suezaki, Masashi; Sueyasu, Hideki; Arai, Kohei

    2003-03-01

    An automated method that can select corresponding point candidates is developed. This method has the following three features: 1) employment of the RIN-net for corresponding point candidate selection; 2) employment of multi-resolution analysis with the Haar wavelet transformation to improve selection accuracy and noise tolerance; 3) employment of context information about corresponding point candidates for screening of selected candidates. Here, 'RIN-net' means a back-propagation-trained feed-forward 3-layer artificial neural network that takes rotation invariants as input data. In our system, pseudo-Zernike moments are employed as the rotation invariants. The RIN-net has an N x N pixel field of view (FOV). Some experiments are conducted to evaluate the corresponding point candidate selection capability of the proposed method using various kinds of remotely sensed images. The experimental results show that the proposed method requires fewer training patterns and less training time, and achieves higher selection accuracy, than the conventional method.

  3. Active control on high-order coherence and statistic characterization on random phase fluctuation of two classical point sources.

    Science.gov (United States)

    Hong, Peilong; Li, Liming; Liu, Jianji; Zhang, Guoquan

    2016-03-29

    Young's double-slit or two-beam interference is of fundamental importance for understanding various interference effects, in which the stationary phase difference between two beams plays the key role in first-order coherence. Different from the case of first-order coherence, in high-order optical coherence the statistical behavior of the optical phase plays the key role. In this article, by employing a fundamental interfering configuration with two classical point sources, we showed that the high-order optical coherence between two classical point sources can be actively designed by controlling the statistical behavior of the relative phase difference between the two point sources. Synchronous position Nth-order subwavelength interference with an effective wavelength of λ/M was demonstrated, in which λ is the wavelength of the point sources and M is an integer not larger than N. Interestingly, we found that the synchronous position Nth-order interference fringe fingerprints the statistical trace of the random phase fluctuation of the two classical point sources; therefore, it provides an effective way to characterize the statistical properties of phase fluctuation for incoherent light sources.

  4. Emergence of multilevel selection in the prisoner's dilemma game on coevolving random networks

    International Nuclear Information System (INIS)

    Szolnoki, Attila; Perc, Matjaz

    2009-01-01

    We study the evolution of cooperation in the prisoner's dilemma game, whereby a coevolutionary rule is introduced that molds the random topology of the interaction network in two ways. First, existing links are deleted whenever a player adopts a new strategy or its degree exceeds a threshold value; second, new links are added randomly after a given number of game iterations. These coevolutionary processes correspond to the generic formation of new links and deletion of existing links that, especially in human societies, appear frequently as a consequence of ongoing socialization, change of lifestyle or death. Due to the counteraction of deletions and additions of links the initial heterogeneity of the interaction network is qualitatively preserved, and thus cannot be held responsible for the observed promotion of cooperation. Indeed, the coevolutionary rule evokes the spontaneous emergence of a powerful multilevel selection mechanism, which despite the sustained random topology of the evolving network, maintains cooperation across the whole span of defection temptation values.

  5. Nutrition Information at the Point of Selection in High Schools Does Not Affect Purchases

    Science.gov (United States)

    Rainville, Alice Jo; Choi, Kyunghee; Ragg, Mark; King, Amber; Carr, Deborah H.

    2010-01-01

    Purpose/Objectives: Nutrition information can be an important component of local wellness policies. There are very few studies regarding nutrition information at the point of selection (POS) in high schools. The purpose of this study was to investigate the effects of posting entree nutrition information at the POS in high schools nationwide.…

  6. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology

    Science.gov (United States)

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological datasets there is limited guidance on variable selection methods for RF modeling. Typically, e...

  7. Optimization of the Dutch Matrix Test by Random Selection of Sentences From a Preselected Subset

    Directory of Open Access Journals (Sweden)

    Rolph Houben

    2015-04-01

    Full Text Available Matrix tests are available for speech recognition testing in many languages. For an accurate measurement, a steep psychometric function of the speech materials is required. For existing tests, it would be beneficial if it were possible to further optimize the available materials by increasing the function's steepness. The objective is to show whether the steepness of the psychometric function of an existing matrix test can be increased by selecting a homogeneous subset of recordings with the steepest sentence-based psychometric functions. We took data from a previous multicenter evaluation of the Dutch matrix test (45 normal-hearing listeners). Based on half of the data set, the sentences (140 out of 311) with a similar speech reception threshold and with the steepest psychometric functions (≥9.7%/dB) were first selected. Subsequently, the steepness of the psychometric function for this selection was calculated from the remaining (unused) second half of the data set. The calculation showed that the slope increased from 10.2%/dB to 13.7%/dB. The resulting subset did not allow the construction of enough balanced test lists. Therefore, the measurement procedure was changed to randomly select the sentences during testing. Random selection may interfere with a representative occurrence of phonemes. However, in our material, the median phonemic occurrence remained close to that of the original test. This finding indicates that phonemic occurrence is not a critical factor. The work highlights the possibility that existing speech tests might be improved by selecting sentences with a steep psychometric function.

  8. Effects of dew point on selective oxidation of TRIP steels containing Si, Mn, and B

    Science.gov (United States)

    Lee, Suk-Kyu; Kim, Jong-Sang; Choi, Jin-Won; Kang, Namhyun; Cho, Kyung-Mox

    2011-04-01

    The selective oxidation of Si, Mn, and B on TRIP steel surfaces is a widely known phenomenon that occurs during heat treatment. However, the relationship between oxide formation and the annealing factors is not completely understood. This study examines the effect of the annealing conditions (dew point and annealing temperature) on oxide formation. A low dew point of -40 °C leads to the formation of Si-based oxides on the surface. A high dew point of -20 °C changes the oxide type to Mn-based oxides because the formation of Si oxides on the surface is suppressed by internal oxidation. Mn-based oxides exhibit superior wettability due to aluminothermic reduction during galvanizing.

  9. Evenly spaced Detrended Fluctuation Analysis: Selecting the number of points for the diffusion plot

    Science.gov (United States)

    Liddy, Joshua J.; Haddad, Jeffrey M.

    2018-02-01

    Detrended Fluctuation Analysis (DFA) has become a widely-used tool to examine the correlation structure of a time series and has provided insights into neuromuscular health and disease states. As the popularity of utilizing DFA in the human behavioral sciences has grown, understanding its limitations and how to properly determine parameters is becoming increasingly important. DFA examines the correlation structure of variability in a time series by computing α, the slope of the log SD versus log n diffusion plot. When using the traditional DFA algorithm, the timescales, n, are often selected as a set of integers between a minimum and maximum length based on the number of data points in the time series. This produces non-uniformly distributed values of n in logarithmic scale, which influences the estimation of α due to a disproportionate weighting of the long-timescale regions of the diffusion plot. Recently, the evenly spaced DFA and evenly spaced average DFA algorithms were introduced. Both algorithms compute α by selecting k points for the diffusion plot based on the minimum and maximum timescales of interest, and improve the consistency of α estimates for simulated fractional Gaussian noise and fractional Brownian motion time series. Two issues that remain unaddressed are (1) how to select k and (2) whether the evenly spaced DFA algorithms show similar benefits when assessing human behavioral data. We manipulated k and examined its effects on the accuracy, consistency, and confidence limits of α in simulated and experimental time series. We demonstrate that the accuracy and consistency of α are relatively unaffected by the selection of k. However, the confidence limits of α narrow as k increases, dramatically reducing measurement uncertainty for single trials. We provide guidelines for selecting k and discuss potential uses of the evenly spaced DFA algorithms when assessing human behavioral data.
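
    A sketch of the timescale selection that the evenly spaced algorithms share: k values of n uniformly spaced in log scale between the minimum and maximum timescale of interest (duplicate integers removed).

```python
import numpy as np

def evenly_spaced_scales(n_min, n_max, k):
    """k timescales evenly spaced in log scale, rounded to integers."""
    return np.unique(np.round(
        np.exp(np.linspace(np.log(n_min), np.log(n_max), k))).astype(int))

print(evenly_spaced_scales(16, 1024, 10))
```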

  10. Non-compact random generalized games and random quasi-variational inequalities

    OpenAIRE

    Yuan, Xian-Zhi

    1994-01-01

    In this paper, existence theorems of random maximal elements, random equilibria for the random one-person game and random generalized game with a countable number of players are given as applications of random fixed point theorems. By employing existence theorems of random generalized games, we deduce the existence of solutions for non-compact random quasi-variational inequalities. These in turn are used to establish several existence theorems of noncompact generalized random ...

  11. Random drift versus selection in academic vocabulary: an evolutionary analysis of published keywords.

    Science.gov (United States)

    Bentley, R Alexander

    2008-08-27

    The evolution of vocabulary in academic publishing is characterized via keyword frequencies recorded in the ISI Web of Science citations database. In four distinct case-studies, evolutionary analysis of keyword frequency change through time is compared to a model of random copying used as the null hypothesis, such that selection may be identified against it. The case studies from the physical sciences indicate greater selection in keyword choice than in the social sciences. Similar evolutionary analyses can be applied to a wide range of phenomena; wherever the popularity of multiple items through time has been recorded, as with web searches, or sales of popular music and books, for example.

  12. Collocation methods for uncertainty quantification in PDE models with random data

    KAUST Repository

    Nobile, Fabio

    2014-01-06

    In this talk we consider Partial Differential Equations (PDEs) whose input data are modeled as random fields to account for their intrinsic variability or our lack of knowledge. After parametrizing the input random fields by finitely many independent random variables, we exploit the high regularity of the solution of the PDE as a function of the input random variables and consider sparse polynomial approximations in probability (Polynomial Chaos expansion) by collocation methods. We first address interpolatory approximations, where the PDE is solved on a sparse grid of Gauss points in the probability space and the solutions thus obtained are interpolated by multivariate polynomials. We present recent results on optimized sparse grids in which the selection of points is based on a knapsack approach and relies on sharp estimates of the decay of the coefficients of the polynomial chaos expansion of the solution. Secondly, we consider regression approaches, where the PDE is evaluated on randomly chosen points in the probability space and a polynomial approximation is constructed by the least squares method. We present recent theoretical results on the stability and optimality of the approximation under suitable conditions between the number of sampling points and the dimension of the polynomial space. In particular, we show that for uniform random variables, the number of sampling points has to scale quadratically with the dimension of the polynomial space to maintain the stability and optimality of the approximation. Numerical results show that this condition is sharp in the monovariate case but seems to be over-constraining in higher dimensions. The regression technique therefore seems attractive in higher dimensions.
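
    A small experiment with a univariate stand-in for the PDE solution illustrates the stability condition for uniform random variables: with the sample size linear in the dimension of the polynomial space the least-squares fit can blow up, while a quadratic sample size stays stable.

```python
import numpy as np

rng = np.random.default_rng(0)
u = lambda y: np.exp(y) / (2 + y)           # stand-in for the PDE solution

def lsq_poly_error(degree, n_samples, n_test=2000):
    y = rng.uniform(-1, 1, n_samples)       # random evaluation points
    V = np.polynomial.legendre.legvander(y, degree)
    coef, *_ = np.linalg.lstsq(V, u(y), rcond=None)
    yt = rng.uniform(-1, 1, n_test)
    err = np.polynomial.legendre.legval(yt, coef) - u(yt)
    return float(np.sqrt(np.mean(err**2)))

dim = 11                                    # dimension of polynomial space
print(lsq_poly_error(dim - 1, dim))         # m ~ dim: can be unstable
print(lsq_poly_error(dim - 1, dim**2))      # m ~ dim^2: stable, near-optimal
```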

  13. Materials selection for oxide-based resistive random access memories

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Yuzheng; Robertson, John [Engineering Department, Cambridge University, Cambridge CB2 1PZ (United Kingdom)

    2014-12-01

    The energies of atomic processes in resistive random access memories (RRAMs) are calculated for four typical oxides, HfO{sub 2}, TiO{sub 2}, Ta{sub 2}O{sub 5}, and Al{sub 2}O{sub 3}, to define a materials selection process. O vacancies have the lowest defect formation energy in the O-poor limit and dominate the processes. A band diagram defines the operating Fermi energy and O chemical potential range. It is shown how the scavenger metal can be used to vary the O vacancy formation energy, via controlling the O chemical potential, and the mean Fermi energy. The high endurance of Ta{sub 2}O{sub 5} RRAM is related to its more stable amorphous phase and the adaptive lattice rearrangements of its O vacancy.

  14. Visual evoked potentials and selective attention to points in space

    Science.gov (United States)

    Van Voorhis, S.; Hillyard, S. A.

    1977-01-01

    Visual evoked potentials (VEPs) were recorded to sequences of flashes delivered to the right and left visual fields while subjects responded promptly to designated stimuli in one field at a time (focused attention), in both fields at once (divided attention), or to neither field (passive). Three stimulus schedules were used: the first was a replication of a previous study (Eason, Harter, and White, 1969) where left- and right-field flashes were delivered quasi-independently, while in the other two the flashes were delivered to the two fields in random order (Bernoulli sequence). VEPs to attended-field stimuli were enhanced at both occipital (O2) and central (Cz) recording sites under all stimulus sequences, but different components were affected at the two scalp sites. It was suggested that the VEP at O2 may reflect modality-specific processing events, while the response at Cz, like its auditory homologue, may index more general aspects of selective attention.

  15. Double point source W-phase inversion: Real-time implementation and automated model selection

    Science.gov (United States)

    Nealy, Jennifer; Hayes, Gavin

    2015-01-01

    Rapid and accurate characterization of an earthquake source is an extremely important and ever evolving field of research. Within this field, source inversion of the W-phase has recently been shown to be an effective technique, which can be efficiently implemented in real-time. An extension to the W-phase source inversion is presented in which two point sources are derived to better characterize complex earthquakes. A single source inversion followed by a double point source inversion with centroid locations fixed at the single source solution location can be efficiently run as part of earthquake monitoring network operational procedures. In order to determine the most appropriate solution, i.e., whether an earthquake is most appropriately described by a single source or a double source, an Akaike information criterion (AIC) test is performed. Analyses of all earthquakes of magnitude 7.5 and greater occurring since January 2000 were performed with extended analyses of the September 29, 2009 magnitude 8.1 Samoa earthquake and the April 19, 2014 magnitude 7.5 Papua New Guinea earthquake. The AIC test is shown to be able to accurately select the most appropriate model and the selected W-phase inversion is shown to yield reliable solutions that match published analyses of the same events.
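
    A sketch of the model selection step, with synthetic residual vectors standing in for the single- and double-source W-phase misfits; the parameter counts are illustrative.

```python
import numpy as np

def aic(residuals, n_params):
    """Gaussian AIC up to an additive constant: n*log(RSS/n) + 2k."""
    n = residuals.size
    return n * np.log(float(np.sum(residuals**2)) / n) + 2 * n_params

r_single = np.random.default_rng(0).normal(0, 1.0, 500)   # single-source fit
r_double = np.random.default_rng(1).normal(0, 0.8, 500)   # double-source fit
k_single, k_double = 6, 12          # e.g. one vs. two moment tensors
pick = "double" if aic(r_double, k_double) < aic(r_single, k_single) else "single"
print(pick)                         # the double source is kept only if the
                                    # extra parameters pay for themselves
```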

  16. Statistical MOSFET Parameter Extraction with Parameter Selection for Minimal Point Measurement

    Directory of Open Access Journals (Sweden)

    Marga Alisjahbana

    2013-11-01

    Full Text Available A method is presented to statistically extract MOSFET model parameters from a minimal number of transistor I(V) characteristic curve measurements, taken during fabrication process monitoring. It includes a sensitivity analysis of the model, test/measurement point selection, and a parameter extraction experiment on the process data. The actual extraction is based on a linear error model, the sensitivity of the MOSFET model with respect to the parameters, and Newton-Raphson iterations. Simulated results showed good accuracy of parameter extraction and I(V) curve fit for parameter deviations of up to 20% from nominal values, including for a process shift of 10% from nominal.
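
    A minimal sketch of such an extraction loop (linear error model, finite-difference sensitivity matrix, Newton-Raphson-style updates), using a toy square-law MOSFET model rather than the paper's:

```python
import numpy as np

def extract(p0, model, measured, n_iter=15, eps=1e-6):
    """Linearize the model around the current parameters and solve the
    least-squares normal equations for the update (Gauss-Newton)."""
    p = np.asarray(p0, float)
    for _ in range(n_iter):
        f = model(p)
        J = np.empty((f.size, p.size))          # sensitivity matrix
        for j in range(p.size):
            dp = np.zeros_like(p)
            dp[j] = eps * max(abs(p[j]), 1.0)
            J[:, j] = (model(p + dp) - f) / dp[j]
        step, *_ = np.linalg.lstsq(J, measured - f, rcond=None)
        p = p + step
    return p

vgs = np.array([1.0, 1.5, 2.0])                 # minimal measurement set
model = lambda p: p[0] * (vgs - p[1]) ** 2      # Id = K*(Vgs - Vth)^2
true = np.array([0.5, 0.7])                     # (K, Vth) to recover
print(extract([0.3, 0.5], model, model(true)))  # -> approx [0.5, 0.7]
```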

  17. Selecting Optimal Random Forest Predictive Models: A Case Study on Predicting the Spatial Distribution of Seabed Hardness

    Science.gov (United States)

    Li, Jin; Tran, Maggie; Siwabessy, Justy

    2016-01-01

    Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia's marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy and can be inferred from underwater video footage at limited locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models to predict seabed hardness using random forest (RF) based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods, namely variable importance (VI), averaged variable importance (AVI), knowledge-informed AVI (KIAVI), Boruta and regularized RF (RRF), were tested based on predictive accuracy. Effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach used to pre-select predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to 'small p and large n' problems in environmental sciences. Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency and…
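
    A sketch of the variable importance (VI) starting point for such feature selection, using scikit-learn on synthetic stand-in data; the features, classes and model settings are illustrative only.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))        # stand-ins for multibeam predictors
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int) + 2 * (X[:, 2] > 0)  # 4 classes

rf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
rf.fit(X, y)
# VI ranking: the starting point for AVI/KIAVI-style selection, in which
# importances are averaged over repeated fits and low-ranked predictors
# are dropped iteratively while accuracy is monitored.
print(np.argsort(rf.feature_importances_)[::-1], round(rf.oob_score_, 3))
```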

  18. Consumers' price awareness at the point-of-selection: What constitutes the most appropriate measure of consumers' price awareness and what determines the differences?

    DEFF Research Database (Denmark)

    Jensen, Birger Boutrup

    This paper focuses on consumers' price information processing at the point-of-selection. Specifically, it updates past results on consumers' price awareness at the point-of-selection - applying both a price-recall and a price-recognition test - and tests hypotheses on potential determinants...... of consumers' price awareness at the point-of-selection. Both price-memory tests resulted in higher measured price awareness than in any of the past studies. Results also indicate that price recognition is not the most appropriate measure. Finally, a discriminant analysis shows that consumers who are aware...

  19. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Science.gov (United States)

    2010-07-01

    § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as...

  20. Joint random beam and spectrum selection for spectrum sharing systems with partial channel state information

    KAUST Repository

    Abdallah, Mohamed M.

    2013-11-01

    In this work, we develop a joint interference-aware random beam and spectrum selection scheme that provides enhanced performance for the secondary network under the condition that the interference observed at the primary receiver is below a predetermined acceptable value. We consider a secondary link composed of a transmitter equipped with multiple antennas and a single-antenna receiver sharing the same spectrum with a set of primary links, each composed of a single-antenna transmitter and a single-antenna receiver. The proposed scheme jointly selects a beam, among a set of power-optimized random beams, as well as the primary spectrum that maximizes the signal-to-interference-plus-noise ratio (SINR) of the secondary link while satisfying the primary interference constraint. In particular, we consider the case where the interference level is described by a q-bit description of its magnitude, whereby we propose a technique to find the optimal quantizer thresholds in a mean square error (MSE) sense. © 2013 IEEE.
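
    A sketch of the joint selection rule with synthetic SINR and interference values; the power-optimized beam construction and the q-bit quantization of the interference level are omitted.

```python
import numpy as np

rng = np.random.default_rng(7)
n_beams, n_bands, i_max = 8, 4, 0.1    # random beams, primary bands, limit

sinr = rng.exponential(1.0, (n_beams, n_bands))           # secondary link
interference = rng.exponential(0.05, (n_beams, n_bands))  # at primary rx

# Pick the (beam, band) pair with the highest secondary SINR among the
# pairs meeting the primary interference constraint.
feasible = np.where(interference <= i_max, sinr, -np.inf)
beam, band = np.unravel_index(np.argmax(feasible), feasible.shape)
print(beam, band)
```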

  1. Joint random beam and spectrum selection for spectrum sharing systems with partial channel state information

    KAUST Repository

    Abdallah, Mohamed M.; Sayed, Mostafa M.; Alouini, Mohamed-Slim; Qaraqe, Khalid A.

    2013-01-01

    In this work, we develop a joint interference-aware random beam and spectrum selection scheme that provides enhanced performance for the secondary network under the condition that the interference observed at the primary receiver is below a predetermined acceptable value. We consider a secondary link composed of a transmitter equipped with multiple antennas and a single-antenna receiver sharing the same spectrum with a set of primary links, each composed of a single-antenna transmitter and a single-antenna receiver. The proposed scheme jointly selects a beam, among a set of power-optimized random beams, as well as the primary spectrum that maximizes the signal-to-interference-plus-noise ratio (SINR) of the secondary link while satisfying the primary interference constraint. In particular, we consider the case where the interference level is described by a q-bit description of its magnitude, whereby we propose a technique to find the optimal quantizer thresholds in a mean square error (MSE) sense. © 2013 IEEE.

  2. The RANDOM computer program: A linear congruential random number generator

    Science.gov (United States)

    Miles, R. F., Jr.

    1986-01-01

    The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) the RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) the RANCYCLE and ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
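
    For reference, an LCG computes x_{n+1} = (a*x_n + c) mod m; a minimal sketch with illustrative glibc-style constants, not the parameters selected by the RANDOM program:

```python
def lcg(seed, a=1103515245, c=12345, m=2**31):
    """Minimal linear congruential generator yielding uniforms in [0, 1)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m

g = lcg(42)
print([round(next(g), 6) for _ in range(3)])
```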

  3. Analysis and applications of a frequency selective surface via a random distribution method

    International Nuclear Information System (INIS)

    Xie Shao-Yi; Huang Jing-Jian; Yuan Nai-Chang; Liu Li-Guo

    2014-01-01

    A novel frequency selective surface (FSS) for reducing radar cross section (RCS) is proposed in this paper. This FSS is based on the random distribution method, so it can be called random surface. In this paper, the stacked patches serving as periodic elements are employed for RCS reduction. Previous work has demonstrated the efficiency by utilizing the microstrip patches, especially for the reflectarray. First, the relevant theory of the method is described. Then a sample of a three-layer variable-sized stacked patch random surface with a dimension of 260 mm×260 mm is simulated, fabricated, and measured in order to demonstrate the validity of the proposed design. For the normal incidence, the 8-dB RCS reduction can be achieved both by the simulation and the measurement in 8 GHz–13 GHz. The oblique incidence of 30° is also investigated, in which the 7-dB RCS reduction can be obtained in a frequency range of 8 GHz–14 GHz. (condensed matter: electronic structure, electrical, magnetic, and optical properties)

  4. Random drift versus selection in academic vocabulary: an evolutionary analysis of published keywords.

    Directory of Open Access Journals (Sweden)

    R Alexander Bentley

    Full Text Available The evolution of vocabulary in academic publishing is characterized via keyword frequencies recorded in the ISI Web of Science citations database. In four distinct case-studies, evolutionary analysis of keyword frequency change through time is compared to a model of random copying used as the null hypothesis, such that selection may be identified against it. The case studies from the physical sciences indicate greater selection in keyword choice than in the social sciences. Similar evolutionary analyses can be applied to a wide range of phenomena; wherever the popularity of multiple items through time has been recorded, as with web searches, or sales of popular music and books, for example.

  5. On theoretical models of gene expression evolution with random genetic drift and natural selection.

    Directory of Open Access Journals (Sweden)

    Osamu Ogasawara

    2009-11-01

    Full Text Available The relative contributions of natural selection and random genetic drift are a major source of debate in the study of gene expression evolution, which is hypothesized to serve as a bridge from molecular to phenotypic evolution. It has been suggested that the conflict between views is caused by the lack of a definite model of the neutral hypothesis, which can describe the long-run behavior of evolutionary change in mRNA abundance. Therefore previous studies have used inadequate analogies with the neutral prediction of other phenomena, such as amino acid or nucleotide sequence evolution, as the null hypothesis of their statistical inference. In this study, we introduced two novel theoretical models, one based on neutral drift and the other assuming natural selection, by focusing on a common property of the distribution of mRNA abundance among a variety of eukaryotic cells, which reflects the result of long-term evolution. Our results demonstrated that (1) our models can reproduce two independently found phenomena simultaneously: the time development of gene expression divergence and Zipf's law of the transcriptome; (2) cytological constraints can be explicitly formulated to describe long-term evolution; (3) the model assuming that natural selection optimized relative mRNA abundance was more consistent with previously published observations than the model of optimized absolute mRNA abundances. The models introduced in this study give a formulation of evolutionary change in the mRNA abundance of each gene as a stochastic process, on the basis of previously published observations. This model provides a foundation for interpreting observed data in studies of gene expression evolution, including identifying an adequate time scale for discriminating the effect of natural selection from that of random genetic drift of selectively neutral variations.

  6. MODIS 250m burned area mapping based on an algorithm using change point detection and Markov random fields.

    Science.gov (United States)

    Mota, Bernardo; Pereira, Jose; Campagnolo, Manuel; Killick, Rebeca

    2013-04-01

    Area burned in tropical savannas of Brazil was mapped using MODIS-AQUA daily 250 m resolution imagery by adapting one of the European Space Agency fire_CCI project burned area algorithms, based on change point detection and Markov random fields. The study area covers 1.44 Mkm², and the analysis was performed with data from 2005. The daily 1000 m image quality layer was used for cloud and cloud shadow screening. The algorithm addresses each pixel as a time series and detects changes in the statistical properties of NIR reflectance values to identify potential burning dates. The first step of the algorithm is robust filtering, to exclude outlier observations, followed by application of the Pruned Exact Linear Time (PELT) change point detection technique. Near-infrared (NIR) spectral reflectance changes between time segments, and post-change NIR reflectance values, are combined into a fire likelihood score. Change points corresponding to an increase in reflectance are dismissed as potential burn events, as are those occurring outside of a pre-defined fire season. In the last step of the algorithm, monthly burned area probability maps and detection date maps are converted to dichotomous (burned-unburned) maps using Markov random fields, which take into account both spatial and temporal relations in the potential burned area maps. A preliminary assessment of our results is performed by comparison with data from the MODIS 1 km active fires and the 500 m burned area products, taking into account differences in spatial resolution between the two sensors.
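
    A sketch of the PELT step on a toy reflectance series, assuming the third-party ruptures package is available; the cost model and penalty are illustrative, not those of the fire_CCI algorithm.

```python
import numpy as np
import ruptures as rpt          # third-party change point library with PELT

rng = np.random.default_rng(0)
# Toy NIR series: reflectance drops at the (unknown) burn date, day 120.
nir = np.r_[rng.normal(0.30, 0.02, 120), rng.normal(0.18, 0.02, 80)]

algo = rpt.Pelt(model="l2", min_size=5).fit(nir)
print(algo.predict(pen=0.05))   # e.g. [120, 200]; last index is series end
```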

  7. Bubble point pressures of the selected model system for CatLiq® bio-oil process

    DEFF Research Database (Denmark)

    Toor, Saqib Sohail; Rosendahl, Lasse; Baig, Muhammad Noman

    2010-01-01

    The CatLiq® process is a second-generation catalytic liquefaction process for the production of bio-oil from WDGS (Wet Distillers Grains with Solubles) at subcritical conditions (280-350 °C and 225-250 bar) in the presence of a homogeneous alkaline and a heterogeneous zirconia catalyst. In this work, the bubble point pressures of a selected model mixture (CO2 + H2O + Ethanol + Acetic acid + Octanoic acid) were measured to investigate the phase boundaries of the CatLiq® process. The bubble points were measured in the JEFRI-DBR high-pressure PVT phase behavior system. The experimental results…

  8. Statistical mechanics of the $N$-point vortex system with random intensities on $R^2$

    Directory of Open Access Journals (Sweden)

    Cassio Neri

    2005-01-01

    Full Text Available The system of $N$-point vortices on $\mathbb{R}^2$ is considered under the hypothesis that vortex intensities are independent and identically distributed random variables with respect to a law $P$ supported on $(0,1]$. It is shown that, in the limit as $N$ approaches $\infty$, the 1-vortex distribution is a minimizer of the free energy functional and is associated to (some solutions of) the following non-linear Poisson equation: $$-\Delta u(x) = C^{-1}\int_{(0,1]} r\,\mathrm{e}^{-\beta r u(x) - \gamma r|x|^2}\,P(\mathrm{d}r), \quad \forall x \in \mathbb{R}^2,$$ where $C = \int_{(0,1]}\int_{\mathbb{R}^2} \mathrm{e}^{-\beta r u(y) - \gamma r|y|^2}\,\mathrm{d}y\,P(\mathrm{d}r)$.

  9. A novel knot selection method for the error-bounded B-spline curve fitting of sampling points in the measuring process

    International Nuclear Information System (INIS)

    Liang, Fusheng; Zhao, Ji; Ji, Shijun; Zhang, Bing; Fan, Cheng

    2017-01-01

    The B-spline curve has been widely used in the reconstruction of measurement data. Error-bounded reconstruction of sampling points can be achieved by B-spline curve fitting based on the knot addition method (KAM). In KAM, the selection pattern of the initial knot vector determines the number of knots ultimately required. This paper provides a novel initial knot selection method to condense the knot vector required for error-bounded B-spline curve fitting. The initial knots are determined by the distribution of features, including the chord length (arc length) and bending degree (curvature), contained in the discrete sampling points. First, the sampling points are fitted to an approximate B-spline curve Gs with an intensively uniform knot vector to substitute for the description of the features of the sampling points. The feature integral of Gs is built as a monotonically increasing function in analytic form. The initial knots are then selected according to constant increments of the feature integral. After that, an iterative knot insertion (IKI) process starting from the initial knots is introduced to improve the fitting precision, and the ultimate knot vector for the error-bounded B-spline curve fitting is achieved. Lastly, two simulations and a measurement experiment are provided, and the results indicate that the proposed knot selection method can reduce the number of knots required.
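
    A simplified sketch of knot placement by constant increments of a feature integral; the arc-length-plus-curvature feature used here is a stand-in for the paper's definition, and the final least-squares fit uses scipy.

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

x = np.linspace(0, 4 * np.pi, 400)
y = np.sin(x) * np.exp(-0.1 * x)            # sampled curve to reconstruct

# Cumulative feature: arc length plus a bending (curvature-like) term, so
# regions that bend more receive more knots.
dy = np.gradient(y, x)
d2y = np.gradient(dy, x)
feature = np.cumsum(np.sqrt(1 + dy**2) + np.abs(d2y))
feature /= feature[-1]

k = 12                                      # number of interior knots
targets = np.arange(1, k + 1) / (k + 1)     # constant feature increments
knots = x[np.searchsorted(feature, targets)]

spline = LSQUnivariateSpline(x, y, knots)   # least-squares B-spline fit
print(round(float(np.abs(spline(x) - y).max()), 6))
```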

  10. High Entropy Random Selection Protocols

    NARCIS (Netherlands)

    H. Buhrman (Harry); M. Christandl (Matthias); M. Koucky (Michal); Z. Lotker (Zvi); B. Patt-Shamir; M. Charikar; K. Jansen; O. Reingold; J. Rolim

    2007-01-01

    In this paper, we construct protocols for two parties that do not trust each other to generate random variables with high Shannon entropy. We improve known bounds for the trade-off between the number of rounds, the length of communication and the entropy of the outcome.

  11. A single point acupuncture treatment at large intestine meridian: a randomized controlled trial in acute tonsillitis and pharyngitis.

    Science.gov (United States)

    Fleckenstein, Johannes; Lill, Christian; Lüdtke, Rainer; Gleditsch, Jochen; Rasp, Gerd; Irnich, Dominik

    2009-09-01

    One out of 4 patients visiting a general practitioner reports a sore throat associated with pain on swallowing. This study was established to examine the immediate pain-alleviating effect of a single point acupuncture treatment applied to the large intestine meridian of patients with sore throat. Sixty patients with acute tonsillitis and pharyngitis were enrolled in this randomized placebo-controlled trial. They received either acupuncture or sham laser acupuncture, directed to the large intestine meridian section between acupuncture points LI 8 and LI 10. The main outcome measure was the change of pain intensity on swallowing a sip of water, evaluated by a visual analog scale 15 minutes after treatment. A credibility assessment regarding the respective treatment was performed. The pain intensity for the acupuncture group before and immediately after therapy was 5.6±2.8 and 3.0±3.0, and for the sham group 5.6±2.5 and 3.8±2.5, respectively. Although the acupuncture group articulated a more pronounced improvement, there was no significant difference between groups (Δ=0.9; confidence interval: −0.2 to 2.0; P=0.12; analysis of covariance). Patients' satisfaction was high in both treatment groups. The study was prematurely terminated due to a subsequent lack of suitable patients. A single acupuncture treatment applied to a selected area of the large intestine meridian was no more effective in the alleviation of pain associated with clinical sore throat than sham laser acupuncture applied to the same area. Nevertheless, clinically relevant improvement was achieved in both groups. Pain alleviation might partly be due to the intense palpation of the large intestine meridian. The benefit of a comprehensive acupuncture treatment protocol in this condition should be subject to further trials.

  12. Robust surface registration using N-points approximate congruent sets

    Directory of Open Access Journals (Sweden)

    Yao Jian

    2011-01-01

    Full Text Available Scans acquired by 3D sensors are typically represented in a local coordinate system. When multiple scans, taken from different locations, represent the same scene, these must be registered to a common reference frame. We propose a fast and robust registration approach to automatically align two scans by finding two sets of N points that are approximately congruent under rigid transformation and lead to a good estimate of the transformation between their corresponding point clouds. Given two scans, our algorithm randomly searches for the best sets of congruent groups of points using a RANSAC-based approach. To successfully and reliably align two scans when there is only a small overlap, we improve the basic RANSAC random selection step by employing a weight function that approximates the probability of each pair of points in one scan to match one pair in the other. The search time to find pairs of congruent sets of N points is greatly reduced by employing a fast search codebook based on both binary and multi-dimensional lookup tables. Moreover, we introduce a novel indicator of the overlapping region quality which is used to verify the estimated rigid transformation and to improve the alignment robustness. Our framework is general enough to incorporate and efficiently combine different point descriptors derived from geometric and texture-based feature points or scene geometrical characteristics. We also present a method to improve the matching effectiveness of texture feature descriptors by extracting them from an atlas of rectified images recovered from the scan reflectance image. Our algorithm is robust with respect to different sampling densities and also resilient to noise and outliers. We demonstrate its robustness and efficiency on several challenging scan datasets with varying degrees of noise, outliers, and extent of overlap, acquired from indoor and outdoor scenarios.

  13. The Effect of Acupressure on Sanyinjiao and Hugo Points on Labor Pain in Nulliparous Women : A Randomized Clinical Trial

    Directory of Open Access Journals (Sweden)

    Reza Heshmat

    2013-06-01

    Full Text Available Introduction: Most women experience childbirth and its pain, which is inevitable. If this pain is not controlled, it leads to prolonged labor and injury to the mother and fetus. This study was conducted to identify the effect of acupressure on the sanyinjiao and hugo points on delivery pain in nulliparous women. Methods: This was a randomized controlled clinical trial on 84 nulliparous women in hospitals of Ardebil, Iran. The participants were divided by randomized blocks of 4 and 6 into two groups. The intervention was applying pressure at the sanyinjiao and hugo points at different dilatations. The intensity of the pain before and after the intervention was recorded by a visual pain assessment scale. To determine the effect of pressure on the intensity of labor pain, descriptive and analytical tests were conducted in SPSS (version 13). Results: There was a significant decrease in mean intensity of pain after each intervention in the experimental group at different dilatations (4, 6, 8, and 10 cm). Moreover, Student's independent t-test results indicated that the mean intensity of pain in the experimental group after the intervention at all four dilatations was significantly lower than in the control group. A repeated measures ANOVA test indicated that in both experimental and control groups, across the four time periods, there was a statistically significant difference. Conclusion: Acupressure on the sanyinjiao and hugo points decreases labor pain. Therefore, this method can be used effectively in the labor process.

  14. Evaluation of Short-Term Changes in Serum Creatinine Level as a Meaningful End Point in Randomized Clinical Trials.

    Science.gov (United States)

    Coca, Steven G; Zabetian, Azadeh; Ferket, Bart S; Zhou, Jing; Testani, Jeffrey M; Garg, Amit X; Parikh, Chirag R

    2016-08-01

    Observational studies have shown that acute change in kidney function (specifically, AKI) is a strong risk factor for poor outcomes. Thus, the outcome of acute change in serum creatinine level, regardless of underlying biology or etiology, is frequently used in clinical trials as both efficacy and safety end points. We performed a meta-analysis of clinical trials to quantify the relationship between positive or negative short-term effects of interventions on change in serum creatinine level and more meaningful clinical outcomes. After a thorough literature search, we included 14 randomized trials of interventions that altered risk for an acute increase in serum creatinine level and had reported between-group differences in CKD and/or mortality rate ≥3 months after randomization. Seven trials assessed interventions that, compared with placebo, increased risk of acute elevation in serum creatinine level (pooled relative risk, 1.52; 95% confidence interval, 1.22 to 1.89), and seven trials assessed interventions that, compared with placebo, reduced risk of acute elevation in serum creatinine level (pooled relative risk, 0.57; 95% confidence interval, 0.44 to 0.74). However, pooled risks for CKD and mortality associated with interventions did not differ from those with placebo in either group. In conclusion, several interventions that affect risk of acute, mild to moderate, often temporary elevation in serum creatinine level in placebo-controlled randomized trials showed no appreciable effect on CKD or mortality months later, raising questions about the value of using small to moderate changes in serum creatinine level as end points in clinical trials. Copyright © 2016 by the American Society of Nephrology.

  15. Point-driven Mathematics Teaching. Studying and Intervening in Danish Classrooms

    DEFF Research Database (Denmark)

    Mogensen, Arne

    secondary schools emphasize such points in their teaching. Thus, 50 randomly selected mathematics teachers are filmed in one grade 8 math lesson and the dialogue investigated. The study identifies large variations and many influential components. There seems to be room for improvement. In order to examine...... possibilities to strengthen the presence and role of mathematical points in teaching two intervention studies are conducted. First a focus group of 5 of the original 50 teachers from each school are offered peer coaching by the researcher. This study indicates that different teachers appreciate peer coaching...... be supported in significant changes to a point-oriented mathematics teaching. The teachers emphasized joint planning of study lessons, and they regarded the peer coaching after each of these lessons as valuable. The studies with the two teacher groups indicate different opportunities and challenges...

  16. Integrated Behavior Therapy for Selective Mutism: a randomized controlled pilot study.

    Science.gov (United States)

    Bergman, R Lindsey; Gonzalez, Araceli; Piacentini, John; Keller, Melody L

    2013-10-01

    To evaluate the feasibility, acceptability, and preliminary efficacy of a novel behavioral intervention for reducing symptoms of selective mutism and increasing functional speech. A total of 21 children ages 4 to 8 with primary selective mutism were randomized to 24 weeks of Integrated Behavior Therapy for Selective Mutism (IBTSM) or a 12-week Waitlist control. Clinical outcomes were assessed using blind independent evaluators, parent-, and teacher-report, and an objective behavioral measure. Treatment recipients completed a three-month follow-up to assess durability of treatment gains. Data indicated increased functional speaking behavior post-treatment as rated by parents and teachers, with a high rate of treatment responders as rated by blind independent evaluators (75%). Conversely, children in the Waitlist comparison group did not experience significant improvements in speaking behaviors. Children who received IBTSM also demonstrated significant improvements in number of words spoken at school compared to baseline, however, significant group differences did not emerge. Treatment recipients also experienced significant reductions in social anxiety per parent, but not teacher, report. Clinical gains were maintained over 3 month follow-up. IBTSM appears to be a promising new intervention that is efficacious in increasing functional speaking behaviors, feasible, and acceptable to parents and teachers. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Field-based random sampling without a sampling frame: control selection for a case-control study in rural Africa.

    Science.gov (United States)

    Crampin, A C; Mwinuka, V; Malema, S S; Glynn, J R; Fine, P E

    2001-01-01

    Selection bias, particularly of controls, is common in case-control studies and may materially affect the results. Methods of control selection should be tailored both for the risk factors and disease under investigation and for the population being studied. We present here a control selection method devised for a case-control study of tuberculosis in rural Africa (Karonga, northern Malawi) that selects an age/sex frequency-matched random sample of the population, with a geographical distribution in proportion to the population density. We also present an audit of the selection process, and discuss the potential of this method in other settings.

  18. Treatment selection in a randomized clinical trial via covariate-specific treatment effect curves.

    Science.gov (United States)

    Ma, Yunbei; Zhou, Xiao-Hua

    2017-02-01

    For time-to-event data in a randomized clinical trial, we proposed two new methods for selecting an optimal treatment for a patient based on the covariate-specific treatment effect curve, which is used to represent the clinical utility of a predictive biomarker. To select an optimal treatment for a patient with a specific biomarker value, we proposed pointwise confidence intervals for each covariate-specific treatment effect curve and for the difference between the covariate-specific treatment effect curves of two treatments. Furthermore, to select an optimal treatment for a future biomarker-defined subpopulation of patients, we proposed confidence bands for each covariate-specific treatment effect curve and for the difference between each pair of covariate-specific treatment effect curves over a fixed interval of biomarker values. We constructed the confidence bands based on a resampling technique. We also conducted simulation studies to evaluate the finite-sample properties of the proposed estimation methods. Finally, we illustrated the application of the proposed methods in a real-world data set.

  19. Analysis of random point images with the use of symbolic computation codes and generalized Catalan numbers

    Science.gov (United States)

    Reznik, A. L.; Tuzikov, A. V.; Solov'ev, A. A.; Torgov, A. V.

    2016-11-01

    Original codes and combinatorial-geometrical computational schemes are presented, which are developed and applied for finding exact analytical formulas that describe the probability of errorless readout of random point images recorded by a scanning aperture with a limited number of threshold levels. Combinatorial problems encountered in the course of the study and associated with the new generalization of Catalan numbers are formulated and solved. An attempt is made to find the explicit analytical form of these numbers, which is, on the one hand, a necessary stage of solving the basic research problem and, on the other hand, an independent self-consistent problem.

  20. Species selective preconcentration and quantification of gold nanoparticles using cloud point extraction and electrothermal atomic absorption spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Hartmann, Georg, E-mail: georg.hartmann@tum.de [Department of Chemistry, Technische Universitaet Muenchen, 85748 Garching (Germany); Schuster, Michael, E-mail: michael.schuster@tum.de [Department of Chemistry, Technische Universitaet Muenchen, 85748 Garching (Germany)

    2013-01-25

    Highlights: • We optimized cloud point extraction and ET-AAS parameters for Au-NP measurement. • A selective ligand (sodium thiosulphate) is introduced for species separation. • A limit of detection of 5 ng Au-NP per L is achieved for aqueous samples. • Measurement of samples with high natural organic matter content is possible. • Real water samples including wastewater treatment plant effluent were analyzed. - Abstract: The determination of metallic nanoparticles in environmental samples requires sample pretreatment that ideally combines pre-concentration and species selectivity. With cloud point extraction (CPE) using the surfactant Triton X-114 we present a simple and cost-effective separation technique that meets both criteria. Effective separation of ionic gold species and Au nanoparticles (Au-NPs) is achieved by using sodium thiosulphate as a complexing agent. The extraction efficiency for Au-NPs ranged from 1.01 ± 0.06 (particle size 2 nm) to 0.52 ± 0.16 (particle size 150 nm). An enrichment factor of 80 and a low limit of detection of 5 ng L⁻¹ are achieved using electrothermal atomic absorption spectrometry (ET-AAS) for quantification. TEM measurements showed that the particle size is not affected by the CPE process. Natural organic matter (NOM) is tolerated up to a concentration of 10 mg L⁻¹. The precision of the method, expressed as the standard deviation of 12 replicates at an Au-NP concentration of 100 ng L⁻¹, is 9.5%. A relation between particle concentration and extraction efficiency was not observed. Spiking experiments showed recoveries higher than 91% for environmental water samples.

  1. Variation in the annual unsatisfactory rates of selected pathogens and indicators in ready-to-eat food sampled from the point of sale or service in Wales, United Kingdom.

    Science.gov (United States)

    Meldrum, R J; Garside, J; Mannion, P; Charles, D; Ellis, P

    2012-12-01

    The Welsh Food Microbiological Forum "shopping basket" survey is a long running, structured surveillance program examining ready-to-eat food randomly sampled from the point of sale or service in Wales, United Kingdom. The annual unsatisfactory rates for selected indicators and pathogens for 1998 through 2008 were examined. All the annual unsatisfactory rates for the selected pathogens were <0.5%, and no pattern in the annual rates was observed. There was also no discernible trend in the annual rates of Listeria spp. (not monocytogenes), with all rates <0.5%. However, there was a trend for Escherichia coli, with a decrease in rate between 1998 and 2003, rapid in the first few years, and then a gradual increase in rate up to 2008. It was concluded that there was no discernible pattern to the annual unsatisfactory rates for Listeria spp. (not monocytogenes), L. monocytogenes, Staphylococcus aureus, and Bacillus cereus, but that a definite trend had been observed for E. coli.

  2. Random and non-random mating populations: Evolutionary dynamics in meiotic drive.

    Science.gov (United States)

    Sarkar, Bijan

    2016-01-01

    Game theoretic tools are utilized to analyze a one-locus continuous selection model of sex-specific meiotic drive by considering nonequivalence of the viabilities of reciprocal heterozygotes that might be noticed at an imprinted locus. The model draws attention to the role of viability selections of different types in examining the stable nature of polymorphic equilibrium. A bridge between population genetics and evolutionary game theory is built by applying the concept of the Fundamental Theorem of Natural Selection. In addition to pointing out the influences of male and female segregation ratios on selection, the configuration structure reveals some notable results, e.g., Hardy-Weinberg frequencies hold in replicator dynamics, faster evolution occurs at maximized variance fitness, mixed Evolutionarily Stable Strategies (ESS) exist in asymmetric games, and evolution tends to follow not only a 1:1 sex ratio but also a 1:1 ratio of different alleles at a particular gene locus. Through construction of replicator dynamics in the group selection framework, our selection model introduces a redefined basis for game theory to incorporate non-random mating, where a mating parameter associated with population structure depends on the social structure. The model also exposes the fact that the number of polymorphic equilibria depends on the algebraic expression of population structure. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Randomized random walk on a random walk

    International Nuclear Information System (INIS)

    Lee, P.A.

    1983-06-01

    This paper discusses generalizations of the model introduced by Kehr and Kutner of the random walk of a particle on a one-dimensional chain which in turn has been constructed by a random walk procedure. The superimposed random walk is randomised in time according to the occurrences of a stochastic point process. The probability of finding the particle in a particular position at a certain instant is obtained explicitly in the transform domain. It is found that the asymptotic behaviour for large time of the mean-square displacement of the particle depends critically on the assumed structure of the basic random walk, giving a diffusion-like term for an asymmetric walk or a square root law if the walk is symmetric. Many results are obtained in closed form for the Poisson process case, and these agree with those given previously by Kehr and Kutner. (author)

  4. Method and apparatus for producing and selectively directing x-rays to different points on an object

    International Nuclear Information System (INIS)

    Haimson, J.

    1981-01-01

    The invention relates to apparatus suitable for use in a computer tomography X-ray scanner. High intensity X-rays are produced and directed towards the object of interest from any of a plurality of preselected coplanar points spaced from the object and spaced radially about a line through the object. There are no moving parts. The electron beam, which produces X-rays as a consequence of impact with the target, is directed selectively to preselected points on the stationary target. Beam-direction compensates for the beam spreading effect of space charge forces acting on the beam, and beam-shaping shapes the beam to a predetermined cross-sectional configuration at its point of incidence with the target. Beam aberrations including sextupole aberrations are corrected. (U.K.)

  5. Selective decontamination in pediatric liver transplants. A randomized prospective study.

    Science.gov (United States)

    Smith, S D; Jackson, R J; Hannakan, C J; Wadowsky, R M; Tzakis, A G; Rowe, M I

    1993-06-01

    Although it has been suggested that selective decontamination of the digestive tract (SDD) decreases postoperative aerobic Gram-negative and fungal infections in orthotopic liver transplantation (OLT), no controlled trials exist in pediatric patients. This prospective, randomized controlled study of 36 pediatric OLT patients examines the effect of short-term SDD on postoperative infection and digestive tract flora. Patients were randomized into two groups. The control group received perioperative parenteral antibiotics only. The SDD group received in addition polymyxin E, tobramycin, and amphotericin B enterally and by oropharyngeal swab postoperatively until oral intake was tolerated (6 +/- 4 days). Indications for operation, preoperative status, age, and intensive care unit and hospital length of stay were no different in SDD (n = 18) and control (n = 18) groups. A total of 14 Gram-negative infections (intraabdominal abscess 7, septicemia 5, pneumonia 1, urinary tract 1) developed in the 36 patients studied. Mortality was not significantly different in the two groups. However, there were significantly fewer patients with Gram-negative infections in the SDD group: 3/18 patients (11%) vs. 11/18 patients (50%) in the control group, P < 0.001. There was also significant reduction in aerobic Gram-negative flora in the stool and pharynx in patients receiving SDD. Gram-positive and anaerobic organisms were unaffected. We conclude that short-term postoperative SDD significantly reduces Gram-negative infections in pediatric OLT patients.

  6. Statistical analysis of the influence of wheat black point kernels on selected indicators of wheat flour quality

    Directory of Open Access Journals (Sweden)

    Petrov Verica D.

    2011-01-01

    Full Text Available The influence of wheat black point kernels on selected indicators of wheat flour quality - farinograph and extensograph indicators, amylolytic activity, wet gluten and flour ash content - was examined in this study. The examinations were conducted on samples of wheat harvested in 2007 and 2008 from the area of Central Banat, in four treatments: a control (without black point flour) and treatments with 2, 4 and 10% black point flour added as a replacement for part of the control sample. Statistically significant differences between treatments were observed in dough stability, falling number and extensibility. The samples with 10% black point flour had the lowest dough stability and the highest amylolytic activity and extensibility. There was a trend of increasing 15-min drop and water absorption with an increased share of black point flour. Extensograph area, resistance and the ratio of resistance to extensibility decreased with the addition of black point flour, though not consistently. Mahalanobis distance indicates that the addition of 10% black point flour had the greatest influence on the observed quality indicators, confirming that black point affects the technological quality of wheat, i.e., flour.

  7. Day-ahead load forecast using random forest and expert input selection

    International Nuclear Information System (INIS)

    Lahouar, A.; Ben Hadj Slama, J.

    2015-01-01

    Highlights: • A model based on random forests for short term load forecast is proposed. • An expert feature selection is added to refine inputs. • Special attention is paid to customer behavior, load profiles and special holidays. • The model is flexible and able to handle complex load signals. • A technical comparison is performed to assess the forecast accuracy. - Abstract: The electrical load forecast is getting more and more important in recent years due to electricity market deregulation and the integration of renewable resources. To overcome the incoming challenges and ensure accurate power prediction for different time horizons, sophisticated intelligent methods are elaborated. Utilization of intelligent forecast algorithms is among the main characteristics of smart grids, and is an efficient tool to face uncertainty. Several crucial tasks of power operators, such as load dispatch, rely on the short term forecast, thus it should be as accurate as possible. To this end, this paper proposes a short term load predictor able to forecast the next 24 h of load. Using random forests, characterized by immunity to parameter variations and internal cross validation, the model is constructed following an online learning process. The inputs are refined by expert feature selection using a set of if-then rules, in order to include the user's own specifications about the country's weather or market, and to generalize the forecast ability. The proposed approach is tested on a real historical data set from the Tunisian Power Company, and the simulation shows accurate and satisfactory results for one day in advance, with an average error rarely exceeding 2.3%. The model is validated for regular working days and weekends, and special attention is paid to moving holidays that follow a non-Gregorian calendar.
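
    As an illustration of the general recipe described above (not the authors' actual pipeline), the following Python sketch trains a random forest on calendar features and lagged loads to predict the load 24 hours ahead; the synthetic series, the lag choices and the holdout size are all assumptions made for the example.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)

        # Synthetic hourly load with daily/weekly periodicity plus noise
        # (a stand-in for a real utility history such as the Tunisian set).
        hours = np.arange(24 * 365)
        load = (100 + 20 * np.sin(2 * np.pi * hours / 24)
                + 10 * np.sin(2 * np.pi * hours / (24 * 7))
                + rng.normal(0, 3, hours.size))

        # Expert-style inputs: hour of day, day of week, and lagged loads
        # (previous hour, same hour yesterday, same hour last week).
        def features(t):
            return np.column_stack([
                t % 24,                  # hour of day
                (t // 24) % 7,           # day of week
                load[t - 1],             # lag 1 h
                load[t - 24],            # lag 24 h
                load[t - 168],           # lag 168 h
            ])

        t = hours[168:-24]               # leave room for lags and the target
        X, y = features(t), load[t + 24] # predict load 24 h ahead

        model = RandomForestRegressor(n_estimators=300, random_state=0)
        model.fit(X[:-500], y[:-500])    # train on all but the last ~3 weeks
        pred = model.predict(X[-500:])
        mape = np.mean(np.abs(pred - y[-500:]) / y[-500:]) * 100
        print(f"holdout MAPE: {mape:.2f}%")

    In a real deployment the lag set would itself be refined by expert if-then rules, as the abstract describes.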

  8. A Uniform Energy Consumption Algorithm for Wireless Sensor and Actuator Networks Based on Dynamic Polling Point Selection

    Science.gov (United States)

    Li, Shuo; Peng, Jun; Liu, Weirong; Zhu, Zhengfa; Lin, Kuo-Chi

    2014-01-01

    Recent research has indicated that using the mobility of the actuator in wireless sensor and actuator networks (WSANs) to achieve mobile data collection can greatly increase the sensor network lifetime. However, mobile data collection may result in unacceptable collection delays in the network if the path of the actuator is too long. Because real-time network applications require meeting data collection delay constraints, planning the path of the actuator is a very important issue to balance the prolongation of the network lifetime and the reduction of the data collection delay. In this paper, a multi-hop routing mobile data collection algorithm is proposed based on dynamic polling point selection with delay constraints to address this issue. The algorithm can actively update the selection of the actuator's polling points according to the sensor nodes' residual energies and their locations while also considering the collection delay constraint. It also dynamically constructs the multi-hop routing trees rooted by these polling points to balance the sensor node energy consumption and the extension of the network lifetime. The effectiveness of the algorithm is validated by simulation. PMID:24451455

  9. A logistic regression estimating function for spatial Gibbs point processes

    DEFF Research Database (Denmark)

    Baddeley, Adrian; Coeurjolly, Jean-François; Rubak, Ege

    We propose a computationally efficient logistic regression estimating function for spatial Gibbs point processes. The sample points for the logistic regression consist of the observed point pattern together with a random pattern of dummy points. The estimating function is closely related to the pseudolikelihood.
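
    A minimal sketch of the dummy-point logistic device in Python, applied to the simplest Gibbs model (an inhomogeneous Poisson process with log-linear intensity): a point is labelled 1 if it is a data point and 0 if it is a dummy point, and a logistic regression with offset -log(rho) recovers the intensity parameters. The intensity form, the dummy intensity rho, and the use of statsmodels are assumptions for illustration, not the authors' implementation.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)

        # Simulate an inhomogeneous Poisson pattern on [0,1]^2 with
        # intensity lambda(x, y) = exp(b0 + b1 * x) by thinning.
        b0, b1 = 4.0, 1.5
        lam_max = np.exp(b0 + b1)
        prop = rng.random((rng.poisson(lam_max), 2))
        keep = rng.random(len(prop)) < np.exp(b0 + b1 * prop[:, 0]) / lam_max
        data = prop[keep]

        # Dummy points: an independent Poisson pattern with intensity rho.
        rho = 4 * len(data)                  # a common rule of thumb
        dummy = rng.random((rng.poisson(rho), 2))

        # Logistic regression: response 1 at data points, 0 at dummy points,
        # with offset -log(rho); the fitted coefficients estimate (b0, b1).
        pts = np.vstack([data, dummy])
        y = np.r_[np.ones(len(data)), np.zeros(len(dummy))]
        X = sm.add_constant(pts[:, 0])
        fit = sm.GLM(y, X, family=sm.families.Binomial(),
                     offset=-np.log(rho) * np.ones(len(pts))).fit()
        print(fit.params)                    # approx. (b0, b1)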

  10. Distribution of orientation selectivity in recurrent networks of spiking neurons with different random topologies.

    Science.gov (United States)

    Sadeh, Sadra; Rotter, Stefan

    2014-01-01

    Neurons in the primary visual cortex are more or less selective for the orientation of a light bar used for stimulation. A broad distribution of individual grades of orientation selectivity has in fact been reported in all species. A possible reason for emergence of broad distributions is the recurrent network within which the stimulus is being processed. Here we compute the distribution of orientation selectivity in randomly connected model networks that are equipped with different spatial patterns of connectivity. We show that, for a wide variety of connectivity patterns, a linear theory based on firing rates accurately approximates the outcome of direct numerical simulations of networks of spiking neurons. Distance dependent connectivity in networks with a more biologically realistic structure does not compromise our linear analysis, as long as the linearized dynamics, and hence the uniform asynchronous irregular activity state, remain stable. We conclude that linear mechanisms of stimulus processing are indeed responsible for the emergence of orientation selectivity and its distribution in recurrent networks with functionally heterogeneous synaptic connectivity.

  11. Mixed-Poisson Point Process with Partially-Observed Covariates: Ecological Momentary Assessment of Smoking.

    Science.gov (United States)

    Neustifter, Benjamin; Rathbun, Stephen L; Shiffman, Saul

    2012-01-01

    Ecological Momentary Assessment is an emerging method of data collection in behavioral research that may be used to capture the times of repeated behavioral events on electronic devices, and information on subjects' psychological states through the electronic administration of questionnaires at times selected from a probability-based design as well as the event times. A method for fitting a mixed Poisson point process model is proposed for the impact of partially-observed, time-varying covariates on the timing of repeated behavioral events. A random frailty is included in the point-process intensity to describe variation among subjects in baseline rates of event occurrence. Covariate coefficients are estimated using estimating equations constructed by replacing the integrated intensity in the Poisson score equations with a design-unbiased estimator. An estimator is also proposed for the variance of the random frailties. Our estimators are robust in the sense that no model assumptions are made regarding the distribution of the time-varying covariates or the distribution of the random effects. However, subject effects are estimated under gamma frailties using an approximate hierarchical likelihood. The proposed approach is illustrated using smoking data.

  12. Influence of Maximum Inbreeding Avoidance under BLUP EBV Selection on Pinzgau Population Diversity

    Directory of Open Access Journals (Sweden)

    Radovan Kasarda

    2011-05-01

    Full Text Available The effect of mating (random vs. maximum avoidance of inbreeding) under a BLUP EBV selection strategy was evaluated. The existing population structure was analyzed by Monte Carlo stochastic simulation with the aim of minimizing the increase of inbreeding. Maximum avoidance of inbreeding under BLUP selection resulted in an increase of inbreeding comparable to random mating over an average of 10 generations of development. After 10 generations of the simulated mating strategy, the observed increases were ΔF = 6.51% (2 sires), 5.20% (3 sires), 3.22% (4 sires) and 2.94% (5 sires). With an increased number of sires selected, a decrease of inbreeding was observed. With the use of 4 or 5 sires, the increase of inbreeding was comparable to random mating with phenotypic selection. To preserve genetic diversity and prevent population loss it is important to minimize the increase of inbreeding in small populations. The classical approach was based on balancing the ratio of sires and dams in the mating program. By contrast, in most commercial populations a small number of sires is used with a high mating ratio.

  13. Domain Adaptation for Machine Translation with Instance Selection

    Directory of Open Access Journals (Sweden)

    Biçici Ergun

    2015-04-01

    Full Text Available Domain adaptation for machine translation (MT) can be achieved by selecting training instances close to the test set from a larger set of instances. We consider 7 different domain adaptation strategies and answer 7 research questions, which give us a recipe for domain adaptation in MT. We perform English-to-German statistical MT (SMT) experiments in a setting where test and training sentences can come from different corpora, and one of our goals is to learn the parameters of the sampling process. Domain adaptation with training instance selection can obtain a 22% increase in target 2-gram recall and can gain up to 3.55 BLEU points compared with random selection. Domain adaptation with the feature decay algorithm (FDA) not only achieves the highest target 2-gram recall and BLEU performance but also perfectly learns the test sample distribution parameter, with correlation 0.99. Moses SMT systems built with 10K training sentences selected by FDA are able to obtain F1 results as good as baselines that use up to 2M sentences. Moses SMT systems built with 50K training sentences selected by FDA obtain F1 results one point better than the baselines.
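
    A hedged sketch of instance selection with feature decay in Python: training sentences are scored by their n-gram overlap with the test set, and each n-gram's weight decays every time it is covered by an already selected sentence, which encourages diversity. The decay rule, scoring and data here are simplified assumptions for illustration, not the published FDA algorithm in full.

        from collections import Counter

        def ngrams(sent, n=2):
            toks = sent.split()
            return [tuple(toks[i:i + n]) for i in range(len(toks) - n + 1)]

        def fda_select(train, test, k, decay=0.5):
            """Greedily pick k training sentences whose bigrams cover the
            test-side bigrams, decaying each feature's weight once used."""
            weight = Counter(g for s in test for g in ngrams(s))
            chosen, pool = [], list(train)
            for _ in range(min(k, len(pool))):
                best = max(pool, key=lambda s: sum(weight[g]
                                                   for g in ngrams(s)))
                pool.remove(best)
                chosen.append(best)
                for g in ngrams(best):       # decay covered features
                    weight[g] *= decay
            return chosen

        # Toy example with hypothetical corpora.
        train = ["the cat sat on the mat", "stocks fell sharply today",
                 "the dog sat on the rug", "rain is expected tomorrow"]
        test = ["the cat sat on the rug"]
        print(fda_select(train, test, k=2))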

  14. Demerit points systems.

    NARCIS (Netherlands)

    2006-01-01

    In 2012, 21 of the 27 EU Member States had some form of demerit points system. In theory, demerit points systems contribute to road safety through three mechanisms: 1) prevention of unsafe behaviour through the risk of receiving penalty points, 2) selection and suspension of the most frequent offenders, …

  15. A protein-targeting strategy used to develop a selective inhibitor of the E17K point mutation in the PH domain of Akt1

    Science.gov (United States)

    Deyle, Kaycie M.; Farrow, Blake; Qiao Hee, Ying; Work, Jeremy; Wong, Michelle; Lai, Bert; Umeda, Aiko; Millward, Steven W.; Nag, Arundhati; Das, Samir; Heath, James R.

    2015-05-01

    Ligands that can bind selectively to proteins with single amino-acid point mutations offer the potential to detect or treat an abnormal protein in the presence of the wild type (WT). However, it is difficult to develop a selective ligand if the point mutation is not associated with an addressable location, such as a binding pocket. Here we report an all-chemical synthetic epitope-targeting strategy that we used to discover a 5-mer peptide with selectivity for the E17K-transforming point mutation in the pleckstrin homology domain of the Akt1 oncoprotein. A fragment of Akt1 that contained the E17K mutation and an I19[propargylglycine] substitution was synthesized to form an addressable synthetic epitope. Azide-presenting peptides that clicked covalently onto this alkyne-presenting epitope were selected from a library using in situ screening. One peptide exhibits a 10:1 in vitro selectivity for the oncoprotein relative to the WT, with a similar selectivity in cells. This 5-mer peptide was expanded into a larger ligand that selectively blocks the E17K Akt1 interaction with its PIP3 (phosphatidylinositol (3,4,5)-trisphosphate) substrate.

  16. Friction massage versus kinesiotaping for short-term management of latent trigger points in the upper trapezius: a randomized controlled trial.

    Science.gov (United States)

    Mohamadi, Marzieh; Piroozi, Soraya; Rashidi, Iman; Hosseinifard, Saeed

    2017-01-01

    Latent trigger points in the upper trapezius muscle may disrupt muscle movement patterns and cause problems such as cramping and decreased muscle strength. Because latent trigger points may spontaneously become active trigger points, they should be addressed and treated to prevent further problems. In this study we compared the short-term effect of kinesiotaping versus friction massage on latent trigger points in the upper trapezius muscle. Fifty-eight male students, enrolled with a stratified sampling method, participated in this single-blind randomized clinical trial (Registration ID: IRCT2016080126674N3) in 2016. Pressure pain threshold was recorded with a pressure algometer and grip strength with a Collin dynamometer. The participants were randomly assigned to two treatment groups: kinesiotape or friction massage. Friction massage was performed daily for 3 sessions and kinesiotape was applied for 72 h. One hour after the last session of friction massage or removal of the kinesiotape, pressure pain threshold and grip strength were evaluated again. Pressure pain threshold decreased significantly after both friction massage (2.66 ± 0.89 to 2.25 ± 0.76; P = 0.02) and kinesiotaping (2.00 ± 0.74 to 1.71 ± 0.65; P = 0.01). Grip strength increased significantly after friction massage (40.78 ± 9.55 to 42.17 ± 10.68; P = 0.03); however, there was no significant change in the kinesiotape group (39.72 ± 6.42 to 40.65 ± 7.3; P = 0.197). There were no significant differences in pressure pain threshold (2.10 ± 0.11 & 1.87 ± 0.11; P = 0.66) or grip strength (42.17 ± 10.68 & 40.65 ± 7.3; P = 0.53) between the two study groups. Friction massage and kinesiotaping had identical short-term effects on latent trigger points in the upper trapezius: three sessions of either intervention did not improve latent trigger points. Registration ID in IRCT: IRCT2016080126674N3.

  17. College grade point average as a personnel selection device: ethnic group differences and potential adverse impact.

    Science.gov (United States)

    Roth, P L; Bobko, P

    2000-06-01

    College grade point average (GPA) is often used in a variety of ways in personnel selection. Unfortunately, there is little empirical research literature in human resource management that informs researchers or practitioners about the magnitude of ethnic group differences and any potential adverse impact implications when using cumulative GPA for selection. Data from a medium-sized university in the Southeast (N = 7,498) indicate that the standardized average Black-White difference for cumulative GPA in the senior year is d = 0.78. The authors also conducted analyses at 3 GPA screens (3.00, 3.25, and 3.50) to demonstrate that employers (or educators) might face adverse impact at all 3 levels if GPA continues to be implemented as part of a selection system. Implications and future research are discussed.

  18. RandomSpot: A web-based tool for systematic random sampling of virtual slides.

    Science.gov (United States)

    Wright, Alexander I; Grabsch, Heike I; Treanor, Darren E

    2015-01-01

    This paper describes work presented at the Nordic Symposium on Digital Pathology 2014, Linköping, Sweden. Systematic random sampling (SRS) is a stereological tool, which provides a framework to quickly build an accurate estimation of the distribution of objects or classes within an image, whilst minimizing the number of observations required. RandomSpot is a web-based tool for SRS in stereology, which systematically places equidistant points within a given region of interest on a virtual slide. Each point can then be visually inspected by a pathologist in order to generate an unbiased sample of the distribution of classes within the tissue. Further measurements can then be derived from the distribution, such as the ratio of tumor to stroma. RandomSpot replicates the fundamental principle of traditional light microscope grid-shaped graticules, with the added benefits associated with virtual slides, such as facilitated collaboration and automated navigation between points. Once the sample points have been added to the region(s) of interest, users can download the annotations and view them locally using their virtual slide viewing software. Since its introduction, RandomSpot has been used extensively for international collaborative projects, clinical trials and independent research projects. So far, the system has been used to generate over 21,000 sample sets, and has been used to generate data for use in multiple publications, identifying significant new prognostic markers in colorectal, upper gastro-intestinal and breast cancer. Data generated using RandomSpot also has significant value for training image analysis algorithms using sample point coordinates and pathologist classifications.
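
    The core placement step is easy to emulate. Below is a hedged Python sketch of systematic random sampling on a region of interest: an equidistant grid with a single uniform random offset, the same principle as a grid-shaped graticule. The mask, the spacing, and the function name srs_points are hypothetical, not part of the RandomSpot software.

        import numpy as np

        def srs_points(roi_mask, spacing, rng=None):
            """Place an equidistant grid of sample points with a uniform
            random start offset, keeping only points inside the ROI mask.
            roi_mask: 2D boolean array (True = inside region of interest).
            spacing:  grid spacing in pixels.
            """
            rng = rng or np.random.default_rng()
            dy, dx = rng.uniform(0, spacing, size=2)   # random grid origin
            ys = np.arange(dy, roi_mask.shape[0], spacing).astype(int)
            xs = np.arange(dx, roi_mask.shape[1], spacing).astype(int)
            grid = [(y, x) for y in ys for x in xs if roi_mask[y, x]]
            return np.array(grid)

        # Example: a circular ROI on a 1000 x 1000 virtual-slide thumbnail.
        yy, xx = np.mgrid[:1000, :1000]
        mask = (yy - 500) ** 2 + (xx - 500) ** 2 < 400 ** 2
        pts = srs_points(mask, spacing=50, rng=np.random.default_rng(7))
        print(len(pts), "sample points")

    Each returned point would then be shown to a pathologist for classification, and class ratios (e.g., tumor to stroma) estimated from the labels.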

  19. Hierarchical random additive process and logarithmic scaling of generalized high order, two-point correlations in turbulent boundary layer flow

    Science.gov (United States)

    Yang, X. I. A.; Marusic, I.; Meneveau, C.

    2016-06-01

    Townsend [Townsend, The Structure of Turbulent Shear Flow (Cambridge University Press, Cambridge, UK, 1976)] hypothesized that the logarithmic region in high-Reynolds-number wall-bounded flows consists of space-filling, self-similar attached eddies. Invoking this hypothesis, we express streamwise velocity fluctuations in the inertial layer in high-Reynolds-number wall-bounded flows as a hierarchical random additive process (HRAP): u_z^+ = Σ_{i=1}^{N_z} a_i. Here u is the streamwise velocity fluctuation, + indicates normalization in wall units, z is the wall-normal distance, and the a_i are independent, identically distributed random additives, each associated with an attached eddy in the wall-attached hierarchy. The number of random additives is N_z ~ ln(δ/z), where δ is the boundary layer thickness and ln is the natural logarithm. Due to its simplified structure, such a process leads to predictions of the scaling behaviors of various turbulence statistics in the logarithmic layer. Besides reproducing the known logarithmic scaling of moments, structure functions, and the two-point correlation function ⟨u_z(x) u_z(x+r)⟩, new logarithmic laws in high-order two-point statistics such as ⟨u_z^2(x) u_z^2(x+r)⟩^{1/2}, ⟨u_z^3(x) u_z^3(x+r)⟩^{1/3}, etc. can be derived using the HRAP formalism. Supporting empirical evidence for the logarithmic scaling in such statistics is found in the Melbourne High Reynolds Number Boundary Layer Wind Tunnel measurements. We also show that, at high Reynolds numbers, the above-mentioned new logarithmic laws can be derived by assuming the arrival of an attached eddy at a generic point in the flow field to be a Poisson process [Woodcock and Marusic, Phys. Fluids 27, 015104 (2015), 10.1063/1.4905301]. Taken together, the results provide new evidence supporting the essential ingredients of the attached eddy hypothesis to describe streamwise velocity fluctuations of large, momentum-transporting eddies in wall-bounded turbulence, while observed deviations suggest the need for further extensions of the model.
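
    The additive structure is simple enough to check numerically. The following Python sketch (an illustration only, with Gaussian additives and parameter values chosen arbitrarily) draws u_z^+ as a sum of N_z ≈ ln(δ/z) i.i.d. additives and prints the variance, which should grow linearly in ln(δ/z) as the attached-eddy argument predicts.

        import numpy as np

        rng = np.random.default_rng(2)
        delta = 1.0                              # boundary layer thickness
        zs = np.geomspace(0.01, 0.3, 8) * delta  # wall distances, log layer
        n_samples = 200_000

        for z in zs:
            # Number of attached eddies contributing at height z.
            n_z = max(int(round(np.log(delta / z))), 1)
            # u_z^+ as a sum of i.i.d. additives (unit-variance Gaussians).
            u = rng.normal(0.0, 1.0, (n_samples, n_z)).sum(axis=1)
            # HRAP prediction: <u^2> grows linearly in ln(delta/z).
            print(f"z/delta={z/delta:.3f}  "
                  f"ln(delta/z)={np.log(delta/z):.2f}  var={u.var():.2f}")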

  20. LPTAU, Quasi Random Sequence Generator

    International Nuclear Information System (INIS)

    Sobol, Ilya M.

    1993-01-01

    1 - Description of program or function: LPTAU generates quasi random sequences. These are uniformly distributed sets of L points in the N-dimensional unit cube I^N = [0,1] × … × [0,1]. These sequences are used as nodes for multidimensional integration, as searching points in global optimization, as trial points in multi-criteria decision making, and as quasi-random points for quasi Monte Carlo algorithms. 2 - Method of solution: Uses LP-tau sequence generation (see references). 3 - Restrictions on the complexity of the problem: The number of points that can be generated is L ≤ 2^30. The dimension of the space cannot exceed 51.
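
    LP-tau sequences are the low-discrepancy sequences of I. M. Sobol, and an equivalent generator is available in SciPy; the sketch below (dimension, point count, and integrand are arbitrary choices for illustration, not part of the LPTAU package) draws 2^10 points in [0,1)^5 and uses them for quasi Monte Carlo integration.

        import numpy as np
        from scipy.stats import qmc

        # LP-tau sequences are Sobol sequences; scipy's generator plays the
        # same role as LPTAU (here d = 5 <= 51 dimensions, 2**10 points).
        gen = qmc.Sobol(d=5, scramble=False)
        points = gen.random_base2(m=10)          # 1024 points in [0,1)^5

        # Quasi Monte Carlo example: estimate the integral of
        # f(x) = prod_i x_i over the unit cube (exact value is (1/2)**5).
        estimate = np.prod(points, axis=1).mean()
        print(estimate, 0.5 ** 5)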

  1. Using Random Forests to Select Optimal Input Variables for Short-Term Wind Speed Forecasting Models

    Directory of Open Access Journals (Sweden)

    Hui Wang

    2017-10-01

    Full Text Available Achieving relatively high-accuracy short-term wind speed forecasting estimates is a precondition for the construction and grid-connected operation of wind power forecasting systems for wind farms. Currently, most research is focused on the structure of forecasting models and does not consider the selection of input variables, which can have significant impacts on forecasting performance. This paper presents an input variable selection method for wind speed forecasting models. The candidate input variables for various leading periods are selected and random forests (RF) are employed to evaluate the importance of all variables as features. The feature subset with the best evaluation performance is selected as the optimal feature set. Then, a kernel-based extreme learning machine is constructed to evaluate the performance of input variable selection based on RF. The results of the case study show that by removing the uncorrelated and redundant features, RF effectively extracts the most strongly correlated set of features from the candidate input variables. By finding the optimal feature combination to represent the original information, RF simplifies the structure of the wind speed forecasting model, shortens the training time required, and substantially improves the model's accuracy and generalization ability, demonstrating that the input variables selected by RF are effective.
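
    A hedged sketch of the RF ranking step in Python with scikit-learn: candidate lag features are scored by impurity-based importance, and the smallest subset carrying most of the total importance is kept. The synthetic data, the 95% cutoff, and the use of impurity importance are assumptions for illustration; the paper's exact scoring scheme and candidate set are not reproduced here.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(3)

        # Synthetic stand-in for candidate inputs at various leading
        # periods: only lag features 0, 2 and 23 drive the target speed.
        n = 3000
        X = rng.normal(size=(n, 48))             # 48 candidate lag features
        y = (0.6 * X[:, 0] + 0.3 * X[:, 2] + 0.2 * X[:, 23]
             + rng.normal(0, 0.3, n))

        rf = RandomForestRegressor(n_estimators=500, random_state=0)
        rf.fit(X, y)
        order = np.argsort(rf.feature_importances_)[::-1]
        print("top features:", order[:5])        # 0, 2, 23 should lead

        # Keep the smallest prefix of ranked features that accounts for
        # ~95% of the total importance.
        cum = np.cumsum(rf.feature_importances_[order])
        subset = order[: int(np.searchsorted(cum, 0.95)) + 1]
        print("selected subset size:", subset.size)

    The selected subset would then feed the downstream forecasting model (a kernel-based extreme learning machine in the paper).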

  2. Ultrahigh Dimensional Variable Selection for Interpolation of Point Referenced Spatial Data: A Digital Soil Mapping Case Study

    Science.gov (United States)

    Lamb, David W.; Mengersen, Kerrie

    2016-01-01

    Modern soil mapping is characterised by the need to interpolate point referenced (geostatistical) observations and the availability of large numbers of environmental characteristics for consideration as covariates to aid this interpolation. Modelling tasks of this nature also occur in other fields such as biogeography and environmental science. This analysis employs the Least Angle Regression (LAR) algorithm for fitting Least Absolute Shrinkage and Selection Operator (LASSO) penalized Multiple Linear Regression models, and demonstrates the efficiency of the LAR algorithm at selecting covariates to aid the interpolation of geostatistical soil carbon observations. Where an exhaustive search of the models that could be constructed from 800 potential covariate terms and 60 observations would be prohibitively demanding, LASSO variable selection is accomplished with trivial computational investment. PMID:27603135
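
    The p >> n setting described in the abstract (800 candidate covariate terms, 60 observations) is easy to reproduce in miniature. The Python sketch below uses scikit-learn's LassoLarsCV, which computes the LASSO path via the LAR algorithm and picks the penalty by cross-validation; the simulated design and coefficients are assumptions for illustration, not the soil data.

        import numpy as np
        from sklearn.linear_model import LassoLarsCV

        rng = np.random.default_rng(4)

        # 60 soil observations, 800 candidate environmental covariates,
        # of which only 4 carry signal -- the p >> n regime of the paper.
        n, p = 60, 800
        X = rng.normal(size=(n, p))
        beta = np.zeros(p)
        beta[[10, 50, 300, 700]] = [2.0, -1.5, 1.0, 0.8]
        y = X @ beta + rng.normal(0, 0.5, n)

        # LAR computes the whole LASSO path cheaply; CV picks the penalty.
        model = LassoLarsCV(cv=5).fit(X, y)
        selected = np.flatnonzero(model.coef_)
        print("selected covariates:", selected)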

  3. Vendor compliance with Ontario's tobacco point of sale legislation.

    Science.gov (United States)

    Dubray, Jolene M; Schwartz, Robert M; Garcia, John M; Bondy, Susan J; Victor, J Charles

    2009-01-01

    On May 31, 2006, Ontario joined a small group of international jurisdictions to implement legislative restrictions on tobacco point of sale promotions. This study compares the presence of point of sale promotions in the retail tobacco environment from three surveys: one prior to and two following implementation of the legislation. Approximately 1,575 tobacco vendors were randomly selected for each survey. Each regionally-stratified sample included equal numbers of tobacco vendors categorized into four trade classes: chain convenience, independent convenience and discount, gas stations, and grocery. Data regarding the six restricted point of sale promotions were collected using standardized protocols and inspection forms. Weighted estimates and 95% confidence intervals were produced at the provincial, regional and vendor trade class level using the bootstrap method for estimating variance. At baseline, the proportion of tobacco vendors who did not engage in each of the six restricted point of sale promotions ranged from 41% to 88%. Within four months following implementation of the legislation, compliance with each of the six restricted point of sale promotions exceeded 95%. Similar levels of compliance were observed one year later. Grocery stores had the fewest point of sale promotions displayed at baseline. Compliance rates did not differ across vendor trade classes at either follow-up survey. Point of sale promotions did not differ across regions in any of the three surveys. Within a short period of time, a high level of compliance with six restricted point of sale promotions was achieved.

  4. Inverse problems for random differential equations using the collage method for random contraction mappings

    Science.gov (United States)

    Kunze, H. E.; La Torre, D.; Vrscay, E. R.

    2009-01-01

    In this paper we are concerned with differential equations with random coefficients which will be considered as random fixed point equations of the form T(ω, x(ω)) = x(ω), ω ∈ Ω. Here T : Ω × X → X is a random integral operator, Ω is a probability space and X is a complete metric space. We consider the following inverse problem for such equations: given a set of realizations of the fixed point of T (possibly the interpolations of different observational data sets), determine the operator T or the mean value of its random components, as appropriate. We solve the inverse problem for this class of equations by using the collage theorem for contraction mappings.

  5. Random Intercept and Random Slope 2-Level Multilevel Models

    Directory of Open Access Journals (Sweden)

    Rehan Ahmad Khan

    2012-11-01

    Full Text Available Random intercept models and random intercept & random slope models carrying two levels of hierarchy in the population are presented and compared with the traditional regression approach. The impact of students' satisfaction on their grade point average (GPA) was explored with and without controlling for teacher influence. The variation at level 1 can be controlled by introducing higher levels of hierarchy in the model. The fanning movement of the fitted lines illustrates the variation of student grades around teachers.
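
    A minimal sketch of the two-level model in Python with statsmodels, on simulated students-within-teachers data (all names, sizes and coefficients are hypothetical): the re_formula argument requests the random intercept & random slope model, while omitting it gives the random-intercept-only model.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(5)

        # Two-level data: students (level 1) nested in teachers (level 2).
        teachers = np.repeat(np.arange(30), 20)     # 30 teachers x 20 pupils
        u0 = rng.normal(0, 0.3, 30)[teachers]       # random intercepts
        u1 = rng.normal(0, 0.1, 30)[teachers]       # random slopes
        satisfaction = rng.uniform(1, 5, teachers.size)
        gpa = (2.0 + 0.3 * satisfaction + u0 + u1 * satisfaction
               + rng.normal(0, 0.2, teachers.size))
        df = pd.DataFrame({"teacher": teachers,
                           "satisfaction": satisfaction, "gpa": gpa})

        # Random intercept & slope: both the intercept and the effect of
        # satisfaction are allowed to vary across teachers.
        fit = smf.mixedlm("gpa ~ satisfaction", df, groups="teacher",
                          re_formula="~satisfaction").fit()
        print(fit.summary())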

  6. Coupled continuous time-random walks in quenched random environment

    Science.gov (United States)

    Magdziarz, M.; Szczotka, W.

    2018-02-01

    We introduce a coupled continuous-time random walk with coupling which is characteristic for Lévy walks. Additionally we assume that the walker moves in a quenched random environment, i.e. the site disorder at each lattice point is fixed in time. We analyze the scaling limit of such a random walk. We show that for large times the behaviour of the analyzed process is exactly the same as in the case of uncoupled quenched trap model for Lévy flights.

  7. Turning point or selection? The effect of rustication on subsequent health for the Chinese Cultural Revolution cohort.

    Science.gov (United States)

    Fan, Wen

    2016-05-01

    During the Chinese Cultural Revolution (1966-76), Chairman Mao sent 17 million urban youth to rural areas to be "reeducated." These "sent-down" youth spent years working alongside peasants, enduring inadequate diets, shelter and medical attention. What were the consequences for subsequent health? Was there a benefit to individuals in the leading or trailing edges of this cohort? Was this a fundamental turning point, or were selection processes at work? Drawing on the 1994 State and Life Chances in Urban China Survey, I find the health disadvantage at midlife is mostly borne by members of the trailing-edge sub-cohort who lived in the countryside for more than five years. Results from propensity-score analysis indicate a selection process: those who suffered most came from disadvantaged backgrounds. Life chances following the rusticates' return home, however, either do not differ from those who stayed in cities or do not relate to health, refuting the turning-point view, at least in terms of midlife health. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Salient Point Detection in Protrusion Parts of 3D Object Robust to Isometric Variations

    Science.gov (United States)

    Mirloo, Mahsa; Ebrahimnezhad, Hosein

    2018-03-01

    In this paper, a novel method is proposed to detect 3D object salient points robust to isometric variations and stable against scaling and noise. Salient points can be used as representative points from object protrusion parts in order to improve object matching and retrieval algorithms. The proposed algorithm starts by determining the first salient point of the model based on the average geodesic distance to several random points. Then, according to the previously selected salient points, a new point is added to the set in each iteration. With every salient point added, the decision function is updated; this imposes a condition ensuring that the next point is not extracted from the same protrusion part, so that a representative point is drawn from every protrusion part. The method is stable against model variations under isometric transformations, scaling, and noise of different strengths, because it uses a feature robust to isometric variations and considers the relation between the salient points. In addition, the number of points used in the averaging process is decreased, which leads to lower computational complexity in comparison with other salient point detection algorithms.
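
    A simplified variant of the selection loop can be sketched in Python (this illustrates the farthest-point idea, not the paper's exact decision function): geodesic distances are approximated by shortest paths on a k-nearest-neighbour graph, the first salient point maximizes the average geodesic distance to random seed points, and each subsequent point maximizes its geodesic distance to the points already chosen, which keeps selections on distinct protrusion parts.

        import numpy as np
        from scipy.sparse.csgraph import dijkstra
        from sklearn.neighbors import kneighbors_graph

        rng = np.random.default_rng(6)

        # Point cloud standing in for mesh vertices; geodesics are
        # approximated by shortest paths on a k-NN graph.
        pts = rng.normal(size=(500, 3))
        graph = kneighbors_graph(pts, n_neighbors=8, mode="distance")

        # First salient point: largest average geodesic distance to a
        # handful of random seed vertices.
        seeds = rng.choice(len(pts), 10, replace=False)
        d_seed = dijkstra(graph, indices=seeds, directed=False)
        salient = [int(d_seed.mean(axis=0).argmax())]

        # Farthest-point sampling: each new point maximizes its geodesic
        # distance to all points selected so far.
        for _ in range(4):
            d = dijkstra(graph, indices=salient, directed=False)
            salient.append(int(d.min(axis=0).argmax()))
        print("salient vertex indices:", salient)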

  9. Application of the Wiener-Hermite functional method to point reactor kinetics driven by random reactivity fluctuations

    International Nuclear Information System (INIS)

    Behringer, K.; Pineyro, J.; Mennig, J.

    1990-06-01

    The Wiener-Hermite functional (WHF) method has been applied to the point reactor kinetic equation excited by Gaussian random reactivity noise under stationary conditions. Delayed neutrons and any feedback effects are disregarded. The neutron steady-state value and the power spectral density (PSD) of the neutron flux have been calculated in a second-order (WHF-2) approximation. Two cases are considered: in the first case, the noise source is low-pass white noise; … In both cases the WHF-2 approximation of the neutron PSDs leads to relatively simple analytical expressions. The accuracy of the approach is determined by comparison with exact solutions of the problem. The investigations show that the WHF method is a powerful approximative tool for studying nonlinear effects in the stochastic differential equation. (author) 5 figs., 29 refs

  10. Implications of structural genomics target selection strategies: Pfam5000, whole genome, and random approaches

    Energy Technology Data Exchange (ETDEWEB)

    Chandonia, John-Marc; Brenner, Steven E.

    2004-07-14

    The structural genomics project is an international effort to determine the three-dimensional shapes of all important biological macromolecules, with a primary focus on proteins. Target proteins should be selected according to a strategy which is medically and biologically relevant, of good value, and tractable. As an option to consider, we present the Pfam5000 strategy, which involves selecting the 5000 most important families from the Pfam database as sources for targets. We compare the Pfam5000 strategy to several other proposed strategies that would require similar numbers of targets. These include complete solution of several small to moderately sized bacterial proteomes, partial coverage of the human proteome, and random selection of approximately 5000 targets from sequenced genomes. We measure the impact that successful implementation of these strategies would have upon structural interpretation of the proteins in Swiss-Prot, TrEMBL, and 131 complete proteomes (including 10 of eukaryotes) from the Proteome Analysis database at EBI. Solving the structures of proteins from the 5000 largest Pfam families would allow accurate fold assignment for approximately 68 percent of all prokaryotic proteins (covering 59 percent of residues) and 61 percent of eukaryotic proteins (40 percent of residues). More fine-grained coverage which would allow accurate modeling of these proteins would require an order of magnitude more targets. The Pfam5000 strategy may be modified in several ways, for example to focus on larger families, bacterial sequences, or eukaryotic sequences; as long as secondary consideration is given to large families within Pfam, coverage results vary only slightly. In contrast, focusing structural genomics on a single tractable genome would have only a limited impact on structural knowledge of other proteomes: a significant fraction (about 30-40 percent of the proteins, and 40-60 percent of the residues) of each proteome is classified in small families.

  11. Analysis of multi-species point patterns using multivariate log Gaussian Cox processes

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Guan, Yongtao; Jalilian, Abdollah

    Multivariate log Gaussian Cox processes are flexible models for multivariate point patterns. However, they have so far only been applied in bivariate cases. In this paper we move beyond the bivariate case in order to model multi-species point patterns of tree locations. In particular we address the problems of identifying parsimonious models and of extracting biologically relevant information from the fitted models. The latent multivariate Gaussian field is decomposed into components given in terms of random fields common to all species and components which are species specific. This allows … of the data. The selected number of common latent fields provides an index of complexity of the multivariate covariance structure. Hierarchical clustering is used to identify groups of species with similar patterns of dependence on the common latent fields.

  12. Participant-selected music and physical activity in older adults following cardiac rehabilitation: a randomized controlled trial.

    Science.gov (United States)

    Clark, Imogen N; Baker, Felicity A; Peiris, Casey L; Shoebridge, Georgie; Taylor, Nicholas F

    2017-03-01

    To evaluate effects of participant-selected music on older adults' achievement of activity levels recommended in the physical activity guidelines following cardiac rehabilitation. A parallel group randomized controlled trial with measurements at Weeks 0, 6 and 26. A multisite outpatient rehabilitation programme of a publicly funded metropolitan health service. Adults aged 60 years and older who had completed a cardiac rehabilitation programme. Experimental participants selected music to support walking with guidance from a music therapist. Control participants received usual care only. The primary outcome was the proportion of participants achieving activity levels recommended in physical activity guidelines. Secondary outcomes compared amounts of physical activity, exercise capacity, cardiac risk factors, and exercise self-efficacy. A total of 56 participants, mean age 68.2 years (SD = 6.5), were randomized to the experimental ( n = 28) and control groups ( n = 28). There were no differences between groups in proportions of participants achieving activity recommended in physical activity guidelines at Week 6 or 26. Secondary outcomes demonstrated between-group differences in male waist circumference at both measurements (Week 6 difference -2.0 cm, 95% CI -4.0 to 0; Week 26 difference -2.8 cm, 95% CI -5.4 to -0.1), and observed effect sizes favoured the experimental group for amounts of physical activity (d = 0.30), exercise capacity (d = 0.48), and blood pressure (d = -0.32). Participant-selected music did not increase the proportion of participants achieving recommended amounts of physical activity, but may have contributed to exercise-related benefits.

  13. Random walk generated by random permutations of {1, 2, 3, ..., n + 1}

    International Nuclear Information System (INIS)

    Oshanin, G; Voituriez, R

    2004-01-01

    We study properties of a non-Markovian random walk X_l^{(n)}, l = 0, 1, 2, …, n, evolving in discrete time l on a one-dimensional lattice of integers, whose moves to the right or to the left are prescribed by the rise-and-descent sequences characterizing random permutations π of [n + 1] = {1, 2, 3, …, n + 1}. We determine exactly the probability of finding the end-point X_n = X_n^{(n)} of the trajectory of such a permutation-generated random walk (PGRW) at site X, and show that in the limit n → ∞ it converges to a normal distribution with a diffusion coefficient smaller than that of the conventional Polya random walk. We formulate, as well, an auxiliary stochastic process whose distribution is identical to the distribution of the intermediate points X_l^{(n)}, l < n, which enables us to obtain the probability measure of different excursions and to define the asymptotic distribution of the number of 'turns' of the PGRW trajectories.
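
    The PGRW is straightforward to simulate. The Python sketch below draws random permutations, converts their rise-and-descent sequences into ±1 steps, and estimates the end-point variance; the sizes are arbitrary, and the printed ratio should be near 1/3 (versus 1 for the conventional walk), consistent with the smaller diffusion coefficient reported in the abstract.

        import numpy as np

        rng = np.random.default_rng(8)
        n, trials = 200, 20_000

        # PGRW: step +1 at a rise (pi[l+1] > pi[l]) and -1 at a descent,
        # for a random permutation pi of {1, ..., n + 1}.
        ends = np.empty(trials)
        for t in range(trials):
            pi = rng.permutation(n + 1)
            steps = np.where(np.diff(pi) > 0, 1, -1)
            ends[t] = steps.sum()

        # The conventional Polya walk has end-point variance n; the PGRW
        # ratio below should come out near 1/3.
        print("PGRW end-point variance / n:", ends.var() / n)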

  14. r2VIM: A new variable selection method for random forests in genome-wide association studies.

    Science.gov (United States)

    Szymczak, Silke; Holzinger, Emily; Dasgupta, Abhijit; Malley, James D; Molloy, Anne M; Mills, James L; Brody, Lawrence C; Stambolian, Dwight; Bailey-Wilson, Joan E

    2016-01-01

    Machine learning methods and in particular random forests (RFs) are a promising alternative to standard single SNP analyses in genome-wide association studies (GWAS). RFs provide variable importance measures (VIMs) to rank SNPs according to their predictive power. However, in contrast to the established genome-wide significance threshold, no clear criteria exist to determine how many SNPs should be selected for downstream analyses. We propose a new variable selection approach, recurrent relative variable importance measure (r2VIM). Importance values are calculated relative to an observed minimal importance score for several runs of RF and only SNPs with large relative VIMs in all of the runs are selected as important. Evaluations on simulated GWAS data show that the new method controls the number of false-positives under the null hypothesis. Under a simple alternative hypothesis with several independent main effects it is only slightly less powerful than logistic regression. In an experimental GWAS data set, the same strong signal is identified while the approach selects none of the SNPs in an underpowered GWAS. The novel variable selection method r2VIM is a promising extension to standard RF for objectively selecting relevant SNPs in GWAS while controlling the number of false-positive results.
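
    A rough Python sketch of the r2VIM idea on simulated genotypes: importances from several RF runs are scaled by the magnitude of the smallest (noise-level) importance, and a SNP is kept only if its relative score clears a threshold in every run. The permutation importance used as the noise reference, the threshold of 3, and the data are assumptions for illustration; the published method's exact importance measure may differ.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.inspection import permutation_importance

        rng = np.random.default_rng(9)

        # Synthetic GWAS-like data: 400 subjects x 300 SNPs (0/1/2
        # genotypes), with three causal SNPs driving case/control status.
        X = rng.integers(0, 3, size=(400, 300)).astype(float)
        logit = 1.0 * X[:, 5] + 0.8 * X[:, 42] + 0.8 * X[:, 250] - 2.5
        y = (rng.random(400) < 1 / (1 + np.exp(-logit))).astype(int)

        runs, threshold = 5, 3.0
        rel = np.empty((runs, X.shape[1]))
        for r in range(runs):
            rf = RandomForestClassifier(n_estimators=300, random_state=r,
                                        n_jobs=-1).fit(X, y)
            imp = permutation_importance(rf, X, y, n_repeats=3,
                                         random_state=r).importances_mean
            noise = max(abs(imp.min()), 1e-12)   # noise-level reference
            rel[r] = imp / noise                 # relative importance
        selected = np.flatnonzero((rel >= threshold).all(axis=0))
        print("selected SNPs:", selected)        # ideally 5, 42, 250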

  15. Convergence estimates in probability and in expectation for discrete least squares with noisy evaluations at random points

    KAUST Repository

    Migliorati, Giovanni

    2015-08-28

    We study the accuracy of the discrete least-squares approximation on a finite dimensional space of a real-valued target function from noisy pointwise evaluations at independent random points distributed according to a given sampling probability measure. The convergence estimates are given in mean-square sense with respect to the sampling measure. The noise may be correlated with the location of the evaluation and may have nonzero mean (offset). We consider both cases of bounded or square-integrable noise / offset. We prove conditions between the number of sampling points and the dimension of the underlying approximation space that ensure a stable and accurate approximation. Particular focus is on deriving estimates in probability within a given confidence level. We analyze how the best approximation error and the noise terms affect the convergence rate and the overall confidence level achieved by the convergence estimate. The proofs of our convergence estimates in probability use arguments from the theory of large deviations to bound the noise term. Finally we address the particular case of multivariate polynomial approximation spaces with any density in the beta family, including uniform and Chebyshev.
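
    The regime these estimates describe can be explored with a toy experiment: sample points from the Chebyshev (arcsine) density on [-1, 1], add small noise to the evaluations, and fit a polynomial space of dimension m by discrete least squares with the sample size n proportional to m. Everything in the sketch (target function, noise level, oversampling factor 3) is an arbitrary choice for illustration.

        import numpy as np

        rng = np.random.default_rng(10)
        f = lambda x: np.exp(x) * np.sin(3 * x)   # target on [-1, 1]

        for m in [10, 20, 40]:                    # approximation dimension
            n = 3 * m                             # sampling budget n ~ c*m
            x = np.cos(np.pi * rng.random(n))     # Chebyshev density
            y = f(x) + rng.normal(0, 1e-3, n)     # noisy evaluations
            # Discrete least squares in the Legendre basis.
            V = np.polynomial.legendre.legvander(x, m - 1)
            coef, *_ = np.linalg.lstsq(V, y, rcond=None)
            xt = np.linspace(-1, 1, 2000)
            err = np.max(np.abs(np.polynomial.legendre.legval(xt, coef)
                                - f(xt)))
            print(f"dim={m:3d}  n={n:4d}  sup error={err:.2e}")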

  16. Comparison between the effects of trigger point mesotherapy versus acupuncture points mesotherapy in the treatment of chronic low back pain: a short term randomized controlled trial.

    Science.gov (United States)

    Di Cesare, Annalisa; Giombini, Arrigo; Di Cesare, Mariachiara; Ripani, Maurizio; Vulpiani, Maria Chiara; Saraceni, Vincenzo Maria

    2011-02-01

    The goal of this study was to compare the effects of trigger point (TRP) mesotherapy and acupuncture point (ACP) mesotherapy in the treatment of patients with chronic low back pain. Short-term randomized controlled trial. 62 subjects with chronic low back pain were recruited at the outpatient Physical Medicine and Rehabilitation Clinic at the University of Rome "La Sapienza" between July 2006 and May 2008. Study subjects were assigned to receive 4 weeks of treatment with either trigger point mesotherapy (TRP mesotherapy, n=29) or acupoint mesotherapy (ACP mesotherapy, n=33). Pain intensity was measured with a pain visual analogue scale (VAS) and verbal rating scale (VRS), and pain disability with the McGill Pain Questionnaire Short Form (SFMPQ), Roland Morris Disability Questionnaire (RMQ) and Oswestry Low Back Pain Disability Questionnaire (ODQ). ACP mesotherapy showed more effective results in the VRS and VAS measures at follow-up than the TRP mesotherapy group. Our results suggest that the response to ACP mesotherapy may be greater than the response to TRP mesotherapy in the short-term follow-up. This technique could nevertheless be a viable option as an adjunct treatment in an overall treatment plan for CLBP. Copyright © 2010 Elsevier Ltd. All rights reserved.

  17. Auricular Point Acupressure to Manage Chronic Low Back Pain in Older Adults: A Randomized Controlled Pilot Study

    Directory of Open Access Journals (Sweden)

    Chao Hsing Yeh

    2014-01-01

    Full Text Available This prospective, randomized clinical trial (RCT) pilot study was designed to (1) assess the feasibility and tolerability of an easily administered auricular point acupressure (APA) intervention and (2) provide an initial assessment of effect size as compared to a sham treatment. Thirty-seven subjects were randomized to receive either the real or sham APA treatment. All participants were treated once a week for 4 weeks. Self-report measures were obtained at baseline, weekly during treatment, at end-of-intervention (EOI), and at a 1-month follow-up. A dropout rate of 26% in the real APA group and 50% in the sham group was observed. The reduction in worst pain from baseline to EOI was 41% for the real and 5% for the sham group, with a Cohen's effect size of 1.22 (P<0.001). Disability scores on the Roland Morris Disability Questionnaire (RMDQ) decreased in the real group by 29% and were unchanged in the sham group (+3%; P<0.001). Given the high dropout rate, results must be interpreted with caution; nevertheless, our results suggest that APA may provide an inexpensive and effective complementary approach for the management of back pain in older adults, and further study is warranted.

  18. Stethoscope versus point-of-care ultrasound in the differential diagnosis of dyspnea: a randomized trial.

    Science.gov (United States)

    Özkan, Behzat; Ünlüer, Erden E; Akyol, Pinar Y; Karagöz, Arif; Bayata, Mehmet S; Akoğlu, Haldun; Oyar, Orhan; Dalli, Ayşe; Topal, Fatih E

    2015-12-01

    We aimed to determine the accuracies of point-of-care ultrasound (PoCUS) and the stethoscope as part of the physical examination of patients with dyspnea. Three emergency medicine specialists in each of the two groups of ultrasound and stethoscope performers underwent didactic and hands-on training on PoCUS and stethoscope use. All enrolled patients were randomized to one of the two predetermined PoCUS or stethoscope groups. The diagnostic performance of ultrasonography was higher than that of the stethoscope in the diagnosis of heart failure (90 vs. 86%, 1.00 vs. 0.89, and 5.00 vs. 4.92, respectively) and pneumonia (90 vs. 86.7%, 0.75 vs. 0.73, and 16.50 vs. 13.82, respectively), although no significant differences were observed in the utility parameters of the two modalities for these diagnoses. While some authors argue that it is time to abandon the 'archaic tools' of past centuries, we believe that it is too early to discontinue the use of the stethoscope.

  19. Cost-Based Design and Selection of Point Absorber Devices for the Mediterranean Sea

    Directory of Open Access Journals (Sweden)

    Vincenzo Piscopo

    2018-04-01

    Sea wave energy is one of the most promising renewable sources, even though the relevant technology is not mature enough for the global energy market and is not yet competitive with solar, wind and tidal current devices. Among the variety of wave energy converters developed in the last decade, heaving point absorbers represent one of the most feasible and most studied technologies, as shown by the small-scale tests and full-scale prototypes deployed throughout the world in recent years. Nevertheless, the need for a further reduction of energy production costs requires a specialized design of wave energy converters, accounting for the restraints imposed by the power take-off unit and the device operational profile. Hence, the present analysis focuses on a new cost-based design procedure for heaving point absorbers. The device is equipped with a floating buoy, with an optional fully submerged mass, connected by means of a tensioned line to the power take-off unit, which consists of a permanent magnet linear generator lying on the seabed and equipped with a gravity-based foundation. The proposed procedure is applied to several candidate deployment sites located in the Mediterranean Sea; the influence of the power take-off restraint and of the converter operational profile is fully investigated, and some recommendations for the preliminary design of wave energy converter devices are provided. Current results show that there is wide scope to make the wave energy sector more competitive on the international market by properly selecting the main design parameters of point absorbers on the basis of the met-ocean conditions at the deployment site.

  20. Two-year Randomized Clinical Trial of Self-etching Adhesives and Selective Enamel Etching.

    Science.gov (United States)

    Pena, C E; Rodrigues, J A; Ely, C; Giannini, M; Reis, A F

    2016-01-01

    The aim of this randomized, controlled prospective clinical trial was to evaluate the clinical effectiveness of restoring noncarious cervical lesions with two self-etching adhesive systems applied with or without selective enamel etching. A one-step self-etching adhesive (Xeno V(+)) and a two-step self-etching system (Clearfil SE Bond) were used. The effectiveness of phosphoric acid selective etching of enamel margins was also evaluated. Fifty-six cavities were restored with each adhesive system and divided into two subgroups (n=28; etch and non-etch). All 112 cavities were restored with the nanohybrid composite Esthet.X HD. The clinical effectiveness of the restorations was recorded in terms of retention, marginal integrity, marginal staining, caries recurrence, and postoperative sensitivity after 3, 6, 12, 18, and 24 months, using modified United States Public Health Service criteria. The Friedman test detected significant differences only after 18 months, for marginal staining in the Clearfil SE non-etch (p=0.009) and Xeno V(+) etch (p=0.004) groups. One restoration was lost during the trial (Xeno V(+) etch; p>0.05). Although an increase in marginal staining was recorded for the Clearfil SE non-etch and Xeno V(+) etch groups, the clinical effectiveness of the restorations was considered acceptable for both the single-step and the two-step self-etching systems, with or without selective enamel etching, in this 24-month clinical trial.

  1. A comparison of random forest and its Gini importance with standard chemometric methods for the feature selection and classification of spectral data

    Directory of Open Access Journals (Sweden)

    Himmelreich Uwe

    2009-07-01

    Abstract. Background: Regularized regression methods such as principal component or partial least squares regression perform well in learning tasks on high-dimensional spectral data, but cannot explicitly eliminate irrelevant features. The random forest classifier with its associated Gini feature importance, on the other hand, allows for explicit feature elimination, but may not be optimally adapted to spectral data due to the topology of its constituent classification trees, which are based on orthogonal splits in feature space. Results: We propose to combine the best of both approaches, and evaluated the joint use of feature selection based on recursive feature elimination using the Gini importance of random forests together with regularized classification methods on spectral data sets from medical diagnostics, chemotaxonomy, biomedical analytics, food science, and synthetically modified spectral data. Here, feature selection using the Gini feature importance combined with regularized classification by discriminant partial least squares regression performed as well as or better than filtering according to different univariate statistical tests or using regression coefficients in a backward feature elimination. It outperformed both the direct application of the random forest classifier and the direct application of the regularized classifiers on the full set of features. Conclusion: The Gini importance of the random forest provided a superior means of measuring feature relevance on spectral data, but on an optimal subset of features the regularized classifiers may be preferable to the random forest classifier, in spite of their limitation to modeling linear dependencies only. A feature selection based on Gini importance may therefore precede a regularized linear classification to identify this optimal subset of features, earning a double benefit of dimensionality reduction and the elimination of noise from the classification task.
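
    A minimal scikit-learn sketch of the general recipe described here (Gini-importance ranking followed by discriminant PLS on the retained features) might look as follows; the synthetic data, the fixed subset size of 20, and the single ranking pass (instead of the paper's recursive elimination) are all simplifications:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for a two-class spectral data set.
    X, y = make_classification(n_samples=200, n_features=300,
                               n_informative=10, random_state=0)
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

    # Rank features by random forest Gini importance and keep a top subset;
    # a full recursive elimination would repeat this on shrinking subsets.
    rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(Xtr, ytr)
    top = np.argsort(rf.feature_importances_)[::-1][:20]

    # Discriminant PLS on the selected features: regress on -1/+1 class codes.
    pls = PLSRegression(n_components=2).fit(Xtr[:, top], 2 * ytr - 1)
    pred = (pls.predict(Xte[:, top]).ravel() > 0).astype(int)
    print("held-out accuracy:", np.mean(pred == yte))
    ```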

  2. Consideration regarding the scheduling of unannounced or randomized inspections

    International Nuclear Information System (INIS)

    Sanborn, J.

    2001-01-01

    Full text: Randomized inspection strategies, including unannounced, short-notice, or randomly selected scheduled inspections, can play a useful role in integrated safeguards by allowing a reduction in the number of inspections without sacrificing coverage of diversion scenarios. The Agency and member states have proposed such strategies as important elements of integrated safeguards proposals at reactors as well as bulk handling facilities. The Agency, however, has limited experience with such inspections, and a number of issues need to be addressed before effective implementation can occur; how these issues are resolved will determine how effective the inspections will be. This paper focuses on the question of how to determine the timing of such inspections. It is pointed out that there are a large number of variants of the idea of 'randomized inspection', and that each option will have advantages and disadvantages from the points of view of the operator, the logistics of inspection scheduling, and the capabilities for detection. The method chosen should depend on the type of scenarios that the Agency wishes to detect. The mathematically purest form of randomized schedule will have broad theoretical applicability, but may prove more difficult to put into practice, and may be unnecessary, or even sub-optimal, depending on the inspection objective. On the other hand, each restriction on inspections that provides the operator with information on when an inspection will occur must be taken into account when assessing detection probability. The paper reviews a number of scheduling approaches in the context of different objectives and considers effectiveness, operational impact, and practicality. (author)

  3. Random effect selection in generalised linear models

    DEFF Research Database (Denmark)

    Denwood, Matt; Houe, Hans; Forkman, Björn

    We analysed abattoir recordings of meat inspection codes with possible relevance to on-farm animal welfare in cattle. Random effects logistic regression models were used to describe individual-level data obtained from 461,406 cattle slaughtered in Denmark. Our results demonstrate that the largest

  4. Discrete least squares polynomial approximation with random evaluations - application to PDEs with Random parameters

    KAUST Repository

    Nobile, Fabio

    2015-01-01

    …the parameter-to-solution map u(y) from random noise-free or noisy observations at random points by discrete least squares on polynomial spaces. The noise-free case is relevant whenever the technique is used to construct metamodels, based on polynomial

  5. Efficacy and Effectiveness of Exercise on Tender Points in Adults with Fibromyalgia: A Meta-Analysis of Randomized Controlled Trials

    Directory of Open Access Journals (Sweden)

    George A. Kelley

    2011-01-01

    Fibromyalgia is a major public health problem affecting an estimated 200 to 400 million people worldwide. The purpose of this study was to use the meta-analytic approach to determine the efficacy and effectiveness of randomized controlled exercise intervention trials (aerobic, strength training, or both) on tender points (TPs) in adults with fibromyalgia. Using random-effects models and 95% confidence intervals (CI), a statistically significant reduction in TPs was observed based on per-protocol analyses (8 studies representing 322 participants) but not intention-to-treat analyses (5 studies representing 338 participants) (per-protocol: −0.68, 95% CI −1.16 to −0.20; intention-to-treat: −0.24, 95% CI −0.62 to 0.15). Changes were equivalent to relative reductions of 10.9% and 6.9% for per-protocol and intention-to-treat analyses, respectively. It was concluded that exercise is efficacious for reducing TPs in women with FM. However, a need exists for additional well-designed and reported studies on this topic.
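
    For readers who want to reproduce this style of pooling, the sketch below implements a standard DerSimonian-Laird random-effects aggregation in Python; the per-study effect sizes and variances are made-up placeholders, not the data of the trials pooled above:

    ```python
    import numpy as np

    # Made-up per-study effect sizes (e.g., standardized mean changes in
    # tender points) and their variances; not the data of the trials above.
    effects = np.array([-0.9, -0.4, -1.2, -0.3, -0.7])
    variances = np.array([0.10, 0.08, 0.15, 0.06, 0.12])

    # DerSimonian-Laird estimate of the between-study variance tau^2.
    w = 1.0 / variances
    fixed = np.sum(w * effects) / np.sum(w)
    q_stat = np.sum(w * (effects - fixed) ** 2)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q_stat - (len(effects) - 1)) / c)

    # Random-effects pooled estimate and 95% confidence interval.
    w_re = 1.0 / (variances + tau2)
    pooled = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    print(f"pooled effect {pooled:.2f}, "
          f"95% CI [{pooled - 1.96 * se:.2f}, {pooled + 1.96 * se:.2f}]")
    ```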

  6. Efficacy of electroacupuncture at Zhongliao point (BL33) for mild and moderate benign prostatic hyperplasia: study protocol for a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Ding Yulong

    2011-09-01

    Abstract. Background: Acu-point specificity is a key issue in acupuncture. To date there has not been any satisfactory trial which can ratify the specific effect of acupuncture. This trial will evaluate the specific effect of BL33 for mild and moderate benign prostatic hyperplasia (BPH) on the basis of its effectiveness. The non-specific effect will be excluded and the therapeutic effect will be evaluated. Method: This is a double-blinded randomized controlled trial. 100 patients will be randomly allocated into the treatment group (n = 50) and the control group (n = 50). The treatment group receives needling at BL33 and the control group receives needling at a non-point. The needling depth, angle, direction, achievement of De Qi and parameters of electroacupuncture are exactly the same in both groups. The primary outcome measure is the reduction of the International Prostate Symptom Score (IPSS) at the 6th week, and the secondary outcome measures are the reduction of bladder residual urine, the increase in maximum urinary flow rate at the 6th week, and the reduction of IPSS at the 18th week. Discussion: This trial will assess the specific therapeutic effect of electroacupuncture at BL33 for mild and moderate BPH. Trial registration: ClinicalTrials.gov Protocol Registration System NCT01218243

  7. Vector control of wind turbine on the basis of the fuzzy selective neural net

    Science.gov (United States)

    Engel, E. A.; Kovalev, I. V.; Engel, N. E.

    2016-04-01

    This article describes vector control of a wind turbine based on a fuzzy selective neural net. Based on the wind turbine system's state, the fuzzy selective neural net tracks the maximum power point under random perturbations. Numerical simulations are carried out to clarify the applicability and advantages of the proposed vector control of the wind turbine on the basis of the fuzzy selective neural net. The simulation results show that the proposed intelligent control of the wind turbine achieves real-time control speed and competitive performance compared to a classical control model with PID controllers based on the traditional maximum torque control strategy.

  8. Impact of selected troposphere models on Precise Point Positioning convergence

    Science.gov (United States)

    Kalita, Jakub; Rzepecka, Zofia

    2016-04-01

    The Precise Point Positioning (PPP) absolute method is currently being intensively investigated in order to reach fast convergence times. Among the various sources that influence PPP convergence, the tropospheric delay is one of the most important. Numerous models of the tropospheric delay have been developed and applied to PPP processing. However, with rare exceptions, the quality of those models does not allow fixing the zenith path delay tropospheric parameter, leaving the difference between the nominal and the final value to the estimation process. Here we present a comparison of several PPP result sets, each based on a different troposphere model. The respective nominal values are adopted from the VMF1, GPT2w, MOPS and ZERO-WET models. The PPP solution admitted as reference is based on the final troposphere product from the International GNSS Service (IGS). The VMF1 mapping function was used for all processing variants in order to make the impact of the applied nominal values comparable. The worst case initializes the zenith wet delay with a zero value (ZERO-WET). The impact of all possible models for tropospheric nominal values should fit between the IGS and ZERO-WET border variants. The analysis is based on data from seven IGS stations located in the mid-latitude European region from the year 2014. For each station, several days with the most active troposphere were selected. All the PPP solutions were determined using the gLAB open-source software, with the Kalman filter implemented independently by the authors of this work. The processing was performed on 1-hour slices of observation data. In addition to the analysis of the output processing files, the presented study contains a detailed analysis of the tropospheric conditions for the selected data. The overall results show that for the height component the VMF1 model outperforms GPT2w and MOPS by 35-40% and the ZERO-WET variant by 150%. In most cases all solutions converge to the same values during the first

  9. Strategic Manoeuvring and the Selection of Starting Points in the Pragma-Dialectical Framework

    Directory of Open Access Journals (Sweden)

    Forgács Gábor

    2014-03-01

    The article analyzes strategic manoeuvring within the pragma-dialectical framework with respect to the selection of starting points in the opening stage to frame the arguments. The Terri Schiavo case, which provides interesting insights concerning this issue, is presented. I show that resolution of the difference of opinion requires the resolution of a subordinate difference of opinion concerning how to label her medical state, and why the discussants were not able to resolve this subordinate difference of opinion. Afterwards, the conflict that arises between critical reasonableness and rhetorical effectiveness is examined, along with how strategic manoeuvring aims to resolve this conflict. In the final part of the paper I argue that the problems raised can be dealt with within the framework of pragma-dialectics.

  10. Prone position as prevention of lung injury in comatose patients: a prospective, randomized, controlled study.

    Science.gov (United States)

    Beuret, Pascal; Carton, Marie-Jose; Nourdine, Karim; Kaaki, Mahmoud; Tramoni, Gerard; Ducreux, Jean-Claude

    2002-05-01

    Comatose patients frequently exhibit pulmonary function worsening, especially in cases of pulmonary infection, and this appears to have a deleterious effect on neurologic outcome. We therefore conducted a randomized trial to determine whether daily prone positioning would prevent lung worsening in these patients. Prospective, randomized, controlled study. Sixteen-bed intensive care unit. Fifty-one patients who required invasive mechanical ventilation because of coma with Glasgow coma scores of 9 or less. In the prone position (PP) group: prone positioning for 4 h once daily until the patients could get up to sit in an armchair; in the supine position (SP) group: supine positioning. The primary end point was the incidence of lung worsening, defined by an increase in the Lung Injury Score of at least 1 point since the time of randomization. The secondary end point was the incidence of ventilator-associated pneumonia (VAP). A total of 25 patients were randomly assigned to the PP group and 26 patients to the SP group. The characteristics of the patients in the two groups were similar at randomization. The incidence of lung worsening was lower in the PP group (12%) than in the SP group (50%) (p=0.003). The incidence of VAP was 20% in the PP group and 38.4% in the SP group (p=0.14). There was no serious complication attributable to prone positioning; however, there was a significant increase of intracranial pressure in the PP group. In a selected population of comatose ventilated patients, daily prone positioning reduced the incidence of lung worsening.

  11. Random forest variable selection in spatial malaria transmission modelling in Mpumalanga Province, South Africa

    Directory of Open Access Journals (Sweden)

    Thandi Kapwata

    2016-11-01

    Malaria is an environmentally driven disease. In order to quantify the spatial variability of malaria transmission, it is imperative to understand the interactions between environmental variables and malaria epidemiology at a micro-geographic level using a novel statistical approach. The random forest (RF) statistical learning method, a relatively new variable-importance ranking method, measures the variable importance of potentially influential parameters through the percent increase of the mean squared error: as this value increases, so does the relative importance of the associated variable. The principal aim of this study was to create predictive malaria maps generated using the variables selected by the RF algorithm in the Ehlanzeni District of Mpumalanga Province, South Africa. From the seven environmental variables used [temperature, lag temperature, rainfall, lag rainfall, humidity, altitude, and the normalized difference vegetation index (NDVI)], altitude was identified as the most influential predictor variable due to its high selection frequency. It was selected as the top predictor for 4 out of 12 months of the year, followed by NDVI, temperature and lag rainfall, which were each selected twice. The combination of climatic variables that produced the highest prediction accuracy was altitude, NDVI, and temperature. This suggests that these three variables have high predictive capabilities in relation to malaria transmission. Furthermore, it is anticipated that the predictive maps generated from the predictions of the RF algorithm could be used to monitor the progression of malaria and assist in intervention and prevention efforts.

  12. Selecting Optimal Parameters of Random Linear Network Coding for Wireless Sensor Networks

    DEFF Research Database (Denmark)

    Heide, J; Zhang, Qi; Fitzek, F H P

    2013-01-01

    This work studies how to select optimal code parameters of Random Linear Network Coding (RLNC) in Wireless Sensor Networks (WSNs). With Rateless Deluge [1] the authors proposed to apply Network Coding (NC) for Over-the-Air Programming (OAP) in WSNs, and demonstrated that with NC a significant reduction in the number of transmitted packets can be achieved. However, NC introduces additional computations and potentially a non-negligible transmission overhead, both of which depend on the chosen coding parameters. Therefore it is necessary to consider the trade-off that these coding parameters present in order to obtain the lowest energy consumption per transmitted bit. This problem is analyzed and suitable coding parameters are determined for the popular Tmote Sky platform. Compared to the use of traditional RLNC, these parameters enable a reduction in the energy spent per bit which grows…
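
    One ingredient of this trade-off can be computed in a few lines: the expected transmission overhead of RLNC as a function of the generation size g and the field size q, using the standard observation that a fresh random coded packet is innovative with probability 1 - q^(r-g) when the receiver already holds rank r. The parameter grid below is illustrative and unrelated to the Tmote Sky figures of the paper:

    ```python
    # Expected number of coded packets a receiver needs to decode a generation
    # of g source packets under RLNC over GF(q): a fresh random packet is
    # innovative with probability 1 - q**(r - g) when the current rank is r.
    def expected_packets(g: int, q: int) -> float:
        return sum(1.0 / (1.0 - q ** (r - g)) for r in range(g))

    for q in (2, 4, 16, 256):
        for g in (16, 32, 64):
            overhead = expected_packets(g, q) - g
            print(f"q={q:3d}  g={g:3d}  expected overhead: {overhead:.3f} packets")
    ```

    The sum makes explicit why small fields (q = 2) cost a few extra packets per generation while larger fields cost almost none, which is exactly what must be weighed against the higher per-symbol computation cost.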

  13. Automatic markerless registration of point clouds with semantic-keypoint-based 4-points congruent sets

    Science.gov (United States)

    Ge, Xuming

    2017-08-01

    The coarse registration of point clouds from urban building scenes has become a key topic in applications of terrestrial laser scanning technology. Sampling-based algorithms in the random sample consensus (RANSAC) model have emerged as mainstream solutions to address coarse registration problems. In this paper, we propose a novel combined solution to automatically align two markerless point clouds from building scenes. Firstly, the method segments non-ground points from ground points. Secondly, the proposed method detects feature points from each cross section and then obtains semantic keypoints by connecting feature points with specific rules. Finally, the detected semantic keypoints from two point clouds act as inputs to a modified 4PCS algorithm. Examples are presented and the results compared with those of K-4PCS to demonstrate the main contributions of the proposed method, which are the extension of the original 4PCS to handle heavy datasets and the use of semantic keypoints to improve K-4PCS in relation to registration accuracy and computational efficiency.

  14. Random Forest-Based Approach for Maximum Power Point Tracking of Photovoltaic Systems Operating under Actual Environmental Conditions.

    Science.gov (United States)

    Shareef, Hussain; Mutlag, Ammar Hussein; Mohamed, Azah

    2017-01-01

    Many maximum power point tracking (MPPT) algorithms have been developed in recent years to maximize the produced PV energy. These algorithms are not sufficiently robust because of fast-changing environmental conditions, efficiency, accuracy at the steady-state value, and the dynamics of the tracking algorithm. Thus, this paper proposes a new random forest (RF) model to improve MPPT performance. The RF model has the ability to capture the nonlinear association of patterns between predictors, such as irradiance and temperature, to determine the accurate maximum power point. An RF-based tracker is designed for 25 SolarTIFSTF-120P6 PV modules, with a peak capacity of 3 kW, using two high-speed sensors. For this purpose, a complete PV system is modeled using 300,000 data samples and simulated using the MATLAB/SIMULINK package. The proposed RF-based MPPT is then tested under actual environmental conditions for 24 days to validate the accuracy and dynamic response. The response of the RF-based MPPT model is also compared with that of the artificial neural network and adaptive neurofuzzy inference system algorithms for further validation. The results show that the proposed MPPT technique gives significant improvement compared with the other techniques. In addition, the RF model passes the Bland-Altman test, with more than 95 percent acceptability.
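
    The core idea (regressing the maximum power point on the two sensor inputs) can be sketched with scikit-learn as follows; the synthetic irradiance/temperature-to-voltage relation is invented for illustration and is not the SolarTIF module model of the study:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(1)

    # Synthetic training data: irradiance (W/m^2) and module temperature (C)
    # as predictors of the voltage at the maximum power point. The functional
    # form below is a made-up stand-in for measured PV data.
    G = rng.uniform(100, 1000, 5000)
    T = rng.uniform(10, 60, 5000)
    v_mpp = (30 + 2.0 * np.log(G / 1000) - 0.12 * (T - 25)
             + 0.1 * rng.standard_normal(5000))

    rf = RandomForestRegressor(n_estimators=200, random_state=0)
    rf.fit(np.column_stack([G, T]), v_mpp)

    # At run time the tracker sets the operating point predicted from the two
    # sensor readings instead of perturbing and observing.
    print("predicted V_mpp:", rf.predict([[850.0, 41.0]])[0])
    ```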

  15. Use of electronic healthcare records in large-scale simple randomized trials at the point of care for the documentation of value-based medicine.

    Science.gov (United States)

    van Staa, T-P; Klungel, O; Smeeth, L

    2014-06-01

    A solid foundation of evidence of the effects of an intervention is a prerequisite of evidence-based medicine. The best source of such evidence is considered to be randomized trials, which are able to avoid confounding. However, they may not always estimate effectiveness in clinical practice. Databases that collate anonymized electronic health records (EHRs) from different clinical centres have been widely used for many years in observational studies. Randomized point-of-care trials have been initiated recently to recruit and follow patients using the data from EHR databases. In this review, we describe how EHR databases can be used for conducting large-scale simple trials and discuss the advantages and disadvantages of their use. © 2014 The Association for the Publication of the Journal of Internal Medicine.

  16. On Random Numbers and Design

    Science.gov (United States)

    Ben-Ari, Morechai

    2004-01-01

    The term "random" is frequently used in discussion of the theory of evolution, even though the mathematical concept of randomness is problematic and of little relevance in the theory. Therefore, since the core concept of the theory of evolution is the non-random process of natural selection, the term random should not be used in teaching the…

  17. Uncertainty analysis of point-by-point sampling of complex surfaces using touch probe CMMs: DOE for complex surface verification with CMMs

    DEFF Research Database (Denmark)

    Barini, Emanuele Modesto; Tosello, Guido; De Chiffre, Leonardo

    2010-01-01

    The paper describes a study concerning point-by-point sampling of complex surfaces using tactile CMMs. A four-factor, two-level completely randomized factorial experiment was carried out, involving measurements on a complex surface configuration item comprising a sphere, a cylinder and a cone, co…
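
    The design itself is easy to reproduce: a two-level, four-factor completely randomized factorial experiment has 2^4 = 16 treatment combinations whose run order is shuffled before execution. The factor names in this Python sketch are placeholders, not the factors of the study:

    ```python
    import itertools
    import random

    # The 16 treatment combinations of a two-level, four-factor design in
    # coded -1/+1 units; the factor names are placeholders, not the study's.
    factors = ["probe_speed", "point_density", "approach_dir", "part_orientation"]
    runs = list(itertools.product([-1, 1], repeat=len(factors)))

    random.seed(42)
    random.shuffle(runs)            # complete randomization of the run order

    for i, levels in enumerate(runs, start=1):
        print(i, dict(zip(factors, levels)))
    ```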

  18. An X-point ergodic divertor

    International Nuclear Information System (INIS)

    Chu, M.S.; Jensen, T.H.; La Haye, R.J.; Taylor, T.S.; Evans, T.E.

    1991-10-01

    A new ergodic divertor is proposed. It utilizes a system of external (n = 3) coils arranged to generate overlapping magnetic islands in the edge region of a diverted tokamak and connect the randomized field lines to the external (cold) divertor plate. The novel feature in the configuration is the placement of the external coils close to the X-point. A realistic design of the external coil set is studied by using the field line tracing method for a low aspect ratio (A ≅ 3) tokamak. Two types of effects are observed. First, by placing the coils close to the X-point, where the poloidal magnetic field is weak and the rational surfaces are closely packed only a moderate amount of current in the external coils is needed to ergodize the edge region. This ergodized edge enhances the edge transport in the X-point region and leads to the potential of edge profile control and the avoidance of edge localized modes (ELMs). Furthermore, the trajectories of the field lines close to the X-point are modified by the external coil set, causing the hit points on the external divertor plates to be randomized and spread out in the major radius direction. A time-dependent modulation of the currents in the external (n = 3) coils can potentially spread the heat flux more uniformly on the divertor plate avoiding high concentration of the heat flux. 10 refs., 9 figs

  19. A Walnut-Enriched Diet Reduces Lipids in Healthy Caucasian Subjects, Independent of Recommended Macronutrient Replacement and Time Point of Consumption: a Prospective, Randomized, Controlled Trial.

    Science.gov (United States)

    Bamberger, Charlotte; Rossmeier, Andreas; Lechner, Katharina; Wu, Liya; Waldmann, Elisa; Stark, Renée G; Altenhofer, Julia; Henze, Kerstin; Parhofer, Klaus G

    2017-10-06

    Studies indicate a positive association between walnut intake and improvements in plasma lipids. We evaluated the effect of an isocaloric replacement of macronutrients with walnuts and of the time point of consumption on plasma lipids. We included 194 healthy subjects (134 females, age 63 ± 7 years, BMI 25.1 ± 4.0 kg/m²) in a randomized, controlled, prospective, cross-over study. Following a nut-free run-in period, subjects were randomized to two diet phases (8 weeks each). Ninety-six subjects first followed a walnut-enriched diet (43 g walnuts/day) and then switched to a nut-free diet; ninety-eight subjects followed the diets in reverse order. Subjects were also randomized to reduce either carbohydrates (n = 62), fat (n = 65), or both (n = 67) during the walnut diet, and instructed to consume the walnuts either as a meal or as a snack. The walnut diet resulted in significant reductions in fasting cholesterol (walnut vs. control diet: -8.5 ± 37.2 vs. -1.1 ± 35.4 mg/dL; p = 0.002), non-HDL cholesterol (-10.3 ± 35.5 vs. -1.4 ± 33.1 mg/dL; p ≤ 0.001), LDL-cholesterol (-7.4 ± 32.4 vs. -1.7 ± 29.7 mg/dL; p = 0.029), triglycerides (-5.0 ± 47.5 vs. 3.7 ± 48.5 mg/dL; p = 0.015) and apoB (-6.7 ± 22.4 vs. -0.5 ± 37.7; p ≤ 0.001), while HDL-cholesterol and lipoprotein(a) did not change significantly. Neither the macronutrient replacement nor the time point of consumption significantly affected the effect of walnuts on lipids. Thus, 43 g of walnuts per day improved the lipid profile independent of the recommended macronutrient replacement and the time point of consumption.

  20. Selecting for Fast Protein-Protein Association As Demonstrated on a Random TEM1 Yeast Library Binding BLIP.

    Science.gov (United States)

    Cohen-Khait, Ruth; Schreiber, Gideon

    2018-04-27

    Protein-protein interactions mediate the vast majority of cellular processes. Though protein interactions obey basic chemical principles within the cell as well, the in vivo physiological environment may not allow equilibrium to be reached. Thus, in vitro measured thermodynamic affinity may not provide a complete picture of protein interactions in the biological context. Binding kinetics, composed of the association and dissociation rate constants, are relevant and important in the cell. Therefore, changes in protein-protein interaction kinetics have a significant impact on the in vivo activity of the proteins. The common protocol for the selection of tighter binders from a mutant library selects for protein complexes with slower dissociation rate constants. Here we describe a method to specifically select for variants with faster association rate constants by using pre-equilibrium selection, starting from a large random library. Toward this end, we refined the selection conditions of a TEM1 β-lactamase library against its natural nanomolar-affinity binder, the β-lactamase inhibitor protein (BLIP). The optimal selection conditions depend on the ligand concentration and on the incubation time. In addition, we show that a second sort of the library helps to separate signal from noise, resulting in a higher percentage of faster binders in the selected library. Fast-associating protein variants are of particular interest for drug development and other biotechnological applications.

  1. An automated three-dimensional detection and segmentation method for touching cells by integrating concave points clustering and random walker algorithm.

    Directory of Open Access Journals (Sweden)

    Yong He

    Characterizing cytoarchitecture is crucial for understanding brain functions and neural diseases. In neuroanatomy, it is an important task to accurately extract cell populations' centroids and contours. Recent advances have permitted imaging at single-cell resolution for an entire mouse brain using the Nissl staining method. However, it is difficult to precisely segment numerous cells, especially those cells touching each other. As presented herein, we have developed an automated three-dimensional detection and segmentation method applied to Nissl staining data, with the following two key steps: (1) concave point clustering to determine the seed points of touching cells; and (2) random walker segmentation to obtain cell contours. We have also evaluated the performance of our proposed method with several mouse brain datasets, which were captured with the micro-optical sectioning tomography imaging system and include closely touching cells. Compared with traditional detection and segmentation methods, our approach shows promising detection accuracy and high robustness.
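
    The second step maps directly onto a readily available implementation: scikit-image ships a random walker segmenter that grows labels outward from seed markers. In the sketch below the seeds are placed by hand on a toy image of two touching blobs, standing in for the seeds that the paper derives from concave-point clustering:

    ```python
    import numpy as np
    from skimage.segmentation import random_walker

    # Toy image: two overlapping Gaussian "cells" on a dark background.
    yy, xx = np.mgrid[0:100, 0:100]
    img = (np.exp(-((xx - 38) ** 2 + (yy - 50) ** 2) / 200.0)
           + np.exp(-((xx - 62) ** 2 + (yy - 50) ** 2) / 200.0))

    # Markers: 0 = unlabeled, 1 = background, 2 and 3 = one seed per cell.
    # Here the seeds are hand-placed; in the paper they come from concave
    # point clustering on the contour of the touching cells.
    markers = np.zeros(img.shape, dtype=np.uint8)
    markers[img < 0.05] = 1
    markers[50, 38] = 2
    markers[50, 62] = 3

    labels = random_walker(img, markers, beta=250)
    print("pixels per label:", np.bincount(labels.ravel()))
    ```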

  2. Mechanistic spatio-temporal point process models for marked point processes, with a view to forest stand data

    DEFF Research Database (Denmark)

    Møller, Jesper; Ghorbani, Mohammad; Rubak, Ege Holger

    We show how a spatial point process, where to each point there is associated a random quantitative mark, can be identified with a spatio-temporal point process specified by a conditional intensity function. For instance, the points can be tree locations, the marks can express the size of trees, and the conditional intensity function can describe the distribution of a tree (i.e., its location and size) conditionally on the larger trees. This enables us to construct parametric statistical models which are easily interpretable and where likelihood-based inference is tractable. In particular, we consider maximum

  3. Solution to random differential equations with boundary conditions

    Directory of Open Access Journals (Sweden)

    Fairouz Tchier

    2017-04-01

    We study a family of random differential equations with boundary conditions. Using a random fixed point theorem, we prove an existence theorem that yields a unique random solution.

  4. Physical characteristics and quality of water from selected springs and wells in the Lincoln Point-Bird Island area, Utah Lake, Utah

    Science.gov (United States)

    Baskin, R.L.; Spangler, L.E.; Holmes, W.F.

    1994-01-01

    From February 1991 to October 1992, the U.S. Geological Survey, in cooperation with the Central Utah Water Conservancy District, investigated the hydrology of the Lincoln Point - Bird Island area in the southeast part of Utah Lake, Utah. The investigation included measurements of the discharge of selected springs and measurements of the physical and chemical characteristics of water from selected springs and wells in the Lincoln Point - Bird Island area. This report contains data for twenty-one distinct springs in the study area, including two springs beneath the surface of Utah Lake at Bird Island. Data from this study, combined with data from previous studies, indicate that the location of springs in the Lincoln Point - Bird Island area probably is controlled by fractures that are the result of faulting. Measured discharge of springs in the Lincoln Point - Bird Island area ranged from less than 0.01 cubic foot per second to 0.84 cubic foot per second. Total discharge in the study area, including known unmeasured springs and seeps, is estimated to be about 5 cubic feet per second. Reported and measured temperatures of water from springs and wells in the Lincoln Point - Bird Island area ranged from 16.0 degrees Celsius to 36.5 degrees Celsius. Dissolved-solids concentrations ranged from 444 milligrams per liter to 7,932 milligrams per liter, and pH ranged from 6.3 to 8.1. Physical and chemical characteristics of spring and well water from the west side of Lincoln Point were virtually identical to those of water from the submerged Bird Island springs, indicating a similar source for the water. Water chemistry, isotope analyses, and geothermometer calculations indicate deep circulation of the water discharging from the springs and indicate that the source of recharge for the springs at Lincoln Point and Bird Island does not appear to be localized in the Lincoln Point - Bird Island area.

  5. Bridging the gap between a stationary point process and its Palm distribution

    NARCIS (Netherlands)

    Nieuwenhuis, G.

    1994-01-01

    In the context of stationary point processes measurements are usually made from a time point chosen at random or from an occurrence chosen at random. That is, either the stationary distribution P or its Palm distribution P° is the ruling probability measure. In this paper an approach is presented to

  6. Dynamics and bifurcations of random circle diffeomorphisms

    NARCIS (Netherlands)

    Zmarrou, H.; Homburg, A.J.

    2008-01-01

    We discuss iterates of random circle diffeomorphisms with identically distributed noise, where the noise is bounded and absolutely continuous. Using arguments of B. Deroin, V.A. Kleptsyn and A. Navas, we provide precise conditions under which random attracting fixed points or random attracting

  7. Criticality and entanglement in random quantum systems

    International Nuclear Information System (INIS)

    Refael, G; Moore, J E

    2009-01-01

    We review studies of entanglement entropy in systems with quenched randomness, concentrating on universal behavior at strongly random quantum critical points. The disorder-averaged entanglement entropy provides insight into the quantum criticality of these systems and an understanding of their relationship to non-random ('pure') quantum criticality. The entanglement near many such critical points in one dimension shows a logarithmic divergence in subsystem size, similar to that in the pure case but with a different universal coefficient. Such universal coefficients are examples of universal critical amplitudes in a random system. Possible measurements are reviewed along with the one-particle entanglement scaling at certain Anderson localization transitions. We also comment briefly on higher dimensions and challenges for the future.

  8. The statistics of the points where nodal lines intersect a reference curve

    International Nuclear Information System (INIS)

    Aronovitch, Amit; Smilansky, Uzy

    2007-01-01

    We study the intersection points of a fixed planar curve Γ with the nodal set of a translationally invariant and isotropic Gaussian random field Ψ(r) and the zeros of its normal derivative across the curve. The intersection points form a discrete random process which is the object of this study. The field probability distribution function is completely specified by the correlation G(|r - r'|) = ⟨Ψ(r)Ψ(r')⟩. Given an arbitrary G(|r - r'|), we compute the two-point correlation function of the point process on the line, and derive other statistical measures (repulsion, rigidity) which characterize the short- and long-range correlations of the intersection points. We use these statistical measures to quantitatively characterize the complex patterns displayed by various kinds of nodal networks. We apply these statistics in particular to nodal patterns of random waves and of eigenfunctions of chaotic billiards. Of special interest is the observation that for monochromatic random waves, the number variance of the intersections with long straight segments grows like L ln L, as opposed to the linear growth predicted by the percolation model, which was successfully used to predict other long-range nodal properties of that field.

  9. Generalized treatment of point reactor kinetics driven by random reactivity fluctuations via the Wiener-Hermite functional method

    International Nuclear Information System (INIS)

    Behringer, K.

    1991-02-01

    In a recent paper by Behringer et al. (1990), the Wiener-Hermite Functional (WHF) method has been applied to point reactor kinetics excited by Gaussian random reactivity noise under stationary conditions, in order to calculate the neutron steady-state value and the neutron power spectral density (PSD) in a second-order (WHF-2) approximation. For simplicity, delayed neutrons and any feedback effects have been disregarded. The present study is a straightforward continuation of the previous one, treating the problem more generally by including any number of delayed neutron groups. For the case of white reactivity noise, the accuracy of the approach is determined by comparison with the exact solution available from the Fokker-Planck method. In the numerical comparisons, the first-order (WHF-1) approximation of the PSD is also considered. (author) 4 figs., 10 refs

  10. Direct random insertion mutagenesis of Helicobacter pylori

    NARCIS (Netherlands)

    de Jonge, Ramon; Bakker, Dennis; van Vliet, Arnoud H. M.; Kuipers, Ernst J.; Vandenbroucke-Grauls, Christina M. J. E.; Kusters, Johannes G.

    2003-01-01

    Random insertion mutagenesis is a widely used technique for the identification of bacterial virulence genes. Most strategies for random mutagenesis involve cloning in Escherichia coli for passage of plasmids or for phenotypic selection. This can result in biased selection due to restriction or

  12. Point specificity in acupuncture

    Directory of Open Access Journals (Sweden)

    Choi Emma M

    2012-02-01

    Abstract. The existence of point specificity in acupuncture is controversial, because many acupuncture studies using this principle to select control points have found that sham acupoints have effects similar to those of verum acupoints. Furthermore, the results of pain-related studies based on visual analogue scales have not supported the concept of point specificity. In contrast, hemodynamic, functional magnetic resonance imaging and neurophysiological studies evaluating the responses to stimulation of multiple points on the body surface have shown that point-specific actions are present. This review article focuses on clinical and laboratory studies supporting the existence of point specificity in acupuncture and also addresses studies that do not support this concept. Further research is needed to elucidate the point-specific actions of acupuncture.

  13. Professional SharePoint 2010 Development

    CERN Document Server

    Rizzo, Tom; Fried, Jeff; Swider, Paul J; Hillier, Scot; Schaefer, Kenneth

    2012-01-01

    Updated guidance on how to take advantage of the newest features of SharePoint programmability. More than simply a portal, SharePoint is Microsoft's popular content management solution for building intranets and websites or hosting wikis and blogs. Offering broad coverage of all aspects of development for the SharePoint platform, this comprehensive book shows you exactly what SharePoint does, how to build solutions, and what features are accessible within SharePoint. Written by a team of SharePoint experts, this new edition offers an extensive selection of field-tested best practices that show

  14. Optimal production lot size and reorder point of a two-stage supply chain while random demand is sensitive with sales teams' initiatives

    Science.gov (United States)

    Sankar Sana, Shib

    2016-01-01

    The paper develops a production-inventory model of a two-stage supply chain consisting of one manufacturer and one retailer to study the production lot size/order quantity, the reorder point and sales teams' initiatives, where the demand of the end customers depends simultaneously on a random variable and on the sales teams' initiatives. The manufacturer produces the retailer's order quantity in one lot, in which the procurement cost per unit quantity follows a realistic convex function of the production lot size. In the chain, the cost of the sales team's initiatives/promotion efforts and the wholesale price of the manufacturer are negotiated at points such that their optimum profits come close to their target profits. This study suggests how the management of firms can determine the optimal order quantity/production quantity, reorder point and sales teams' initiatives/promotional effort in order to achieve their maximum profits. An analytical method is applied to determine the optimal values of the decision variables. Finally, numerical examples with graphical presentation and a sensitivity analysis of the key parameters are presented to give more insight into the model.
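
    As a rough illustration of this kind of decision problem (with invented functional forms and constants, not the paper's analytical model), one can maximize a Monte Carlo estimate of expected profit over the lot size, the reorder point, and the promotion effort:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(7)
    noise = rng.standard_normal(20000)       # common random numbers keep the
    lead_noise = rng.standard_normal(20000)  # Monte Carlo objective smooth

    # Assumed toy model: demand grows concavely with the promotion effort e,
    # the unit procurement cost is convex in the lot size q, and the reorder
    # point r balances holding against shortage during the lead time.
    def neg_expected_profit(x):
        q, r, e = x
        demand = np.maximum(100 + 40 * np.sqrt(e) + 15 * noise, 0)
        lead_demand = np.maximum(20 + 5 * lead_noise, 0)
        unit_cost = 5.0 + 0.002 * q                        # convex in the lot size
        profit = (12.0 * np.minimum(demand, q)             # sales revenue
                  - unit_cost * q - 2.0 * e                # procurement + promotion
                  - 0.3 * np.maximum(r - lead_demand, 0)   # holding at the reorder point
                  - 4.0 * np.maximum(lead_demand - r, 0))  # shortage penalty
        return -profit.mean()

    res = minimize(neg_expected_profit, x0=[120.0, 25.0, 4.0],
                   bounds=[(1, 500), (0, 100), (1e-6, 100)])
    print("optimal (Q, r, effort):", np.round(res.x, 2))
    ```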

  15. Human action analysis with randomized trees

    CERN Document Server

    Yu, Gang; Liu, Zicheng

    2014-01-01

    This book provides a comprehensive overview of human action analysis with randomized trees, covering both supervised and unsupervised random trees. When a sufficient amount of labeled data is available, supervised random trees provide a fast method for space-time interest point matching. When labeled data is minimal, as in the case of example-based action search, unsupervised random trees are used to leverage the unlabelled data. We describe how randomized trees can be used for action classification, action detection, action search, and action prediction.

  17. Escitalopram in painful polyneuropathy: A randomized, placebo-controlled, cross-over trial

    DEFF Research Database (Denmark)

    Otto, Marit; Bach, Flemming W; Jensen, Troels S

    2008-01-01

    Serotonin (5-HT) is involved in pain modulation via descending pathways in the central nervous system. The aim of this study was to test whether escitalopram, a selective serotonin reuptake inhibitor (SSRI), would relieve pain in polyneuropathy. The study design was a randomized, double-blind, placebo-controlled cross-over trial. The daily dose of escitalopram was 20 mg once daily. During the two treatment periods of 5 weeks' duration, patients rated pain relief (the primary outcome variable) on a 6-point ordered nominal scale. Secondary outcome measures comprised total pain and different pain symptoms (touch

  18. Selective oropharyngeal decontamination versus selective digestive decontamination in critically ill patients: a meta-analysis of randomized controlled trials

    Directory of Open Access Journals (Sweden)

    Zhao D

    2015-07-01

    Di Zhao,1,* Jian Song,2,* Xuan Gao,3 Fei Gao,4 Yupeng Wu,2 Yingying Lu,5 Kai Hou1 (1Department of Neurosurgery, The First Hospital of Hebei Medical University; 2Department of Neurosurgery and 3Department of Neurology, The Second Hospital of Hebei Medical University; 4Hebei Provincial Procurement Centers for Medical Drugs and Devices; 5Department of Neurosurgery, The Second Hospital of Hebei Medical University, Shijiazhuang, People's Republic of China; *these authors contributed equally to this work). Background: Selective digestive decontamination (SDD) and selective oropharyngeal decontamination (SOD) are associated with reduced mortality and infection rates among patients in intensive care units (ICUs); however, whether SOD has a superior effect to SDD remains uncertain. Hence, we conducted a meta-analysis of randomized controlled trials (RCTs) to compare SOD with SDD in terms of clinical outcomes and antimicrobial resistance rates in critically ill patients. Methods: RCTs published in PubMed, Embase, and Web of Science were systematically reviewed to compare the effects of SOD and SDD in critically ill patients. Outcomes included day-28 mortality, length of ICU stay, length of hospital stay, duration of mechanical ventilation, ICU-acquired bacteremia, and prevalence of antibiotic-resistant Gram-negative bacteria. Results were expressed as risk ratios (RRs) with 95% confidence intervals (CIs) and weighted mean differences (WMDs) with 95% CIs. Pooled estimates were computed using a fixed-effects or random-effects model, depending on the heterogeneity among studies. Results: A total of four RCTs involving 23,822 patients met the inclusion criteria and were included in this meta-analysis. Among patients whose admitting specialty was surgery, cardiothoracic surgery (57.3%) and neurosurgery (29.7%) were the two main types of surgery performed. Pooled results showed that SOD had similar effects to SDD in day-28 mortality (RR =1

  19. Strategyproof Peer Selection using Randomization, Partitioning, and Apportionment

    OpenAIRE

    Aziz, Haris; Lev, Omer; Mattei, Nicholas; Rosenschein, Jeffrey S.; Walsh, Toby

    2016-01-01

    Peer review, evaluation, and selection is a fundamental aspect of modern science. Funding bodies the world over employ experts to review and select the best proposals of those submitted for funding. The problem of peer selection, however, is much more general: a professional society may want to give a subset of its members awards based on the opinions of all members; an instructor for a MOOC or online course may want to crowdsource grading; or a marketing company may select ideas from group b...

  20. Poisson branching point processes

    International Nuclear Information System (INIS)

    Matsuo, K.; Teich, M.C.; Saleh, B.E.A.

    1984-01-01

    We investigate the statistical properties of a special branching point process. The initial process is assumed to be a homogeneous Poisson point process (HPP). The initiating events at each branching stage are carried forward to the following stage. In addition, each initiating event independently contributes a nonstationary Poisson point process (whose rate is a specified function) located at that point. The additional contributions from all points of a given stage constitute a doubly stochastic Poisson point process (DSPP) whose rate is a filtered version of the initiating point process at that stage. The process studied is a generalization of a Poisson branching process in which random time delays are permitted in the generation of events. Particular attention is given to the limit in which the number of branching stages is infinite while the average number of added events per event of the previous stage is infinitesimal. In the special case when the branching is instantaneous this limit of continuous branching corresponds to the well-known Yule--Furry process with an initial Poisson population. The Poisson branching point process provides a useful description for many problems in various scientific disciplines, such as the behavior of electron multipliers, neutron chain reactions, and cosmic ray showers
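
    A finite-stage version of the process is straightforward to simulate: start from a homogeneous Poisson process, then let every event of a stage spawn a Poisson number of time-delayed offspring that together form the next stage. The rate, mean offspring count, and exponential delay density below are illustrative choices, not parameters from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    T = 100.0     # observation window [0, T]
    rate0 = 0.2   # rate of the initial homogeneous Poisson process
    mu = 0.8      # mean number of offspring per event (subcritical)
    tau = 2.0     # mean of the exponential delay density
    stages = 6    # finite number of branching stages

    # Stage 0: homogeneous Poisson process on [0, T]
    points = [np.sort(rng.uniform(0, T, rng.poisson(rate0 * T)))]

    # Every event contributes a Poisson number of delayed offspring; the
    # offspring of all events of one stage form the next stage.
    for _ in range(stages):
        parents = points[-1]
        kids = (np.concatenate([t + rng.exponential(tau, rng.poisson(mu))
                                for t in parents])
                if parents.size else np.array([]))
        points.append(kids[kids < T])

    print("events per stage:", [p.size for p in points])
    print("total events in [0, T]:", sum(p.size for p in points))
    ```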

  1. Correlated randomness and switching phenomena

    Science.gov (United States)

    Stanley, H. E.; Buldyrev, S. V.; Franzese, G.; Havlin, S.; Mallamace, F.; Kumar, P.; Plerou, V.; Preis, T.

    2010-08-01

    One challenge of biology, medicine, and economics is that the systems treated by these serious scientific disciplines have no perfect metronome in time and no perfect spatial architecture, crystalline or otherwise. Nonetheless, as if by magic, out of nothing but randomness one finds remarkably fine-tuned processes in time and remarkably fine-tuned structures in space. Further, many of these processes and structures have the remarkable feature of “switching” from one behavior to another as if by magic. The past century has, philosophically, been concerned with placing aside the human tendency to see the universe as a fine-tuned machine. Here we will address the challenge of uncovering how, through randomness (albeit, as we shall see, strongly correlated randomness), one can arrive at some of the many spatial and temporal patterns in biology, medicine, and economics, and even begin to characterize the switching phenomena that enable a system to pass from one state to another. Inspired by principles developed by A. Nihat Berker and scores of other statistical physicists in recent years, we discuss some applications of correlated randomness to understanding switching phenomena in various fields. Specifically, we present evidence from experiments and from computer simulations supporting the hypothesis that water’s anomalies are related to a switching point (which is not unlike the “tipping point” immortalized by Malcolm Gladwell), and that the bubbles in economic phenomena that occur on all scales are not “outliers” (another Gladwell immortalization). Though more speculative, we support the idea of disease as arising from some kind of yet-to-be-understood complex switching phenomenon by discussing data on selected examples, including heart disease and Alzheimer disease.

  2. 47 CFR 1.1604 - Post-selection hearings.

    Science.gov (United States)

    2010-10-01

    Post-selection hearings, 47 CFR 1.1604 (2010-10-01). Federal Communications Commission, General Practice and Procedure, Random Selection Procedures for Mass Media Services, General Procedures, § 1.1604. (a) Following the random…

  3. The Effects of Using Microsoft Power Point on EFL Learners’ Attitude and Anxiety

    Directory of Open Access Journals (Sweden)

    Boualem Benghalem

    2015-12-01

    This study aims to investigate the effects of using ICT tools such as Microsoft PowerPoint on EFL students’ attitude and anxiety. The participants in this study were 40 Master 2 students of Didactics of English as a Foreign Language at Djillali Liabes University, Sidi Bel Abbes, Algeria. In order to find out the effects of Microsoft PowerPoint on EFL students’ attitude and anxiety, two main research tools were employed: a questionnaire addressed to the 40 Master 2 students and an interview with 10 participants randomly selected from the forty who answered the questionnaire. After the data had been analysed, the results revealed a positive attitude and low anxiety among students towards Microsoft PowerPoint. These results promote the use of ICT and encourage EFL teachers to use these tools in the most beneficial way to improve students’ level of English and motivate them. Keywords: Information and Communication Technology (ICT), attitude, anxiety, English as a foreign language, teaching, learning

  4. [Clinical observation on auricular point magnetotherapy for treatment of senile low back pain].

    Science.gov (United States)

    Sun, Gui-Ping

    2007-02-01

    To compare the therapeutic effects of auricular point magnetotherapy and auricular point sticking of Vaccaria seeds on senile low back pain. Sixty cases with back pain, aged 60 years or over, were randomly divided into 2 groups, a control group and a test group. The control group was treated with auricular sticking of Vaccaria seeds with no pressing, and the test group with sticking of magnetic beads of 66 gauss each with no pressing. The auricular points Shenmen, Kidney, Bladder, Yaodizhui, Gluteus, Liver and Spleen were selected. Three weeks constituted one course. The effects before, during and after the course were assessed by a questionnaire about back pain. Compared with the control group, the back pain in the test group was more effectively improved, including reduced pain and numbness in the back and legs, a decreased disorder of physical strength induced by the disease, and improved daily life quality of the patients. A follow-up survey at 2-4 weeks showed the effects were maintained. Auricular magnetotherapy can effectively improve senile back pain.

  5. Digital random-number generator

    Science.gov (United States)

    Brocker, D. H.

    1973-01-01

    For a binary digit array of N bits, N noise sources feed N nonlinear operators; each flip-flop in the digit array is set by its nonlinear operator to reflect whether the amplitude of the generator feeding it is above or below the mean value of the generated noise. The fixed-point uniform-distribution method can also be used to generate random numbers with other than uniform distributions.
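
    As a software illustration of this comparator scheme (a minimal sketch in Python; the array width, sample counts, and the exponential target distribution are illustrative choices, not taken from the original report), each bit is set by testing the latest sample of its own noise source against that source's mean, and an inverse-CDF step maps uniform values to a non-uniform law:

        import numpy as np

        rng = np.random.default_rng(0)

        def noise_bits(n_bits, n_samples=1024):
            # One noise source per bit; each "flip-flop" records whether the
            # latest amplitude exceeds the mean value of its generator's noise.
            noise = rng.normal(size=(n_bits, n_samples))
            means = noise.mean(axis=1)
            return (noise[:, -1] > means).astype(int)

        def uniform_to_exponential(u, lam=1.0):
            # Inverse-CDF transform: uniform numbers to an exponential law,
            # one way to obtain "other than uniform" distributions.
            return -np.log1p(-np.asarray(u)) / lam

        bits = noise_bits(16)
        word = int(bits @ (2 ** np.arange(15, -1, -1)))  # assemble the 16-bit word
        print(bits, word, uniform_to_exponential(word / 2**16))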

  6. Palm theory for random time changes

    Directory of Open Access Journals (Sweden)

    Masakiyo Miyazawa

    2001-01-01

    Full Text Available Palm distributions are basic tools when studying stationarity in the context of point processes, queueing systems, fluid queues or random measures. The framework varies with the random phenomenon of interest, but usually a one-dimensional group of measure-preserving shifts is the starting point. In the present paper, by alternatively using a framework involving random time changes (RTCs and a two-dimensional family of shifts, we are able to characterize all of the above systems in a single framework. Moreover, this leads to what we call the detailed Palm distribution (DPD which is stationary with respect to a certain group of shifts. The DPD has a very natural interpretation as the distribution seen at a randomly chosen position on the extended graph of the RTC, and satisfies a general duality criterion: the DPD of the DPD gives the underlying probability P in return.

  7. Lévy based Cox point processes

    DEFF Research Database (Denmark)

    Hellmund, Gunnar; Prokesová, Michaela; Jensen, Eva Bjørn Vedel

    2008-01-01

    In this paper we introduce Lévy-driven Cox point processes (LCPs) as Cox point processes with driving intensity function Λ defined by a kernel smoothing of a Lévy basis (an independently scattered, infinitely divisible random measure). We also consider log Lévy-driven Cox point processes (LLCPs) with Λ equal to the exponential of such a kernel smoothing. Special cases are shot noise Cox processes, log Gaussian Cox processes, and log shot noise Cox processes. We study the theoretical properties of Lévy-based Cox processes, including moment properties described by nth-order product densities...

  8. Thermoresponsive Poly(2-Oxazoline) Molecular Brushes by Living Ionic Polymerization: Modulation of the Cloud Point by Random and Block Copolymer Pendant Chains

    KAUST Repository

    Zhang, Ning

    2012-08-10

    Molecular brushes (MBs) of poly(2-oxazoline)s were prepared by living anionic polymerization of 2-isopropenyl-2-oxazoline to form the backbone and living cationic ring-opening polymerization of 2-n-propyl-2-oxazoline and 2-methyl-2-oxazoline to form random and block copolymers. Their aqueous solutions displayed a distinct thermoresponsive behavior as a function of the side-chain composition and sequence. The cloud point (CP) of MBs with random copolymer side chains is a linear function of the hydrophilic monomer content and can be modulated in a wide range. For MBs with block copolymer side chains, it was found that the block sequence had a strong and surprising effect on the CP. While MBs with a distal hydrophobic block had a CP at 70 °C, MBs with hydrophilic outer blocks already precipitated at 32 °C. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology.

    Science.gov (United States)

    Fox, Eric W; Hill, Ryan A; Leibowitz, Scott G; Olsen, Anthony R; Thornbrugh, Darren J; Weber, Marc H

    2017-07-01

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological data sets, there is limited guidance on variable selection methods for RF modeling. Typically, either a preselected set of predictor variables is used, or stepwise procedures are employed which iteratively remove variables according to their importance measures. This paper investigates the application of variable selection methods to RF models for predicting probable biological stream condition. Our motivating data set consists of the good/poor condition of n = 1365 stream survey sites from the 2008/2009 National Rivers and Stream Assessment, and a large set (p = 212) of landscape features from the StreamCat data set as potential predictors. We compare two types of RF models: a full variable set model with all 212 predictors and a reduced variable set model selected using a backward elimination approach. We assess model accuracy using RF's internal out-of-bag estimate, and a cross-validation procedure with validation folds external to the variable selection process. We also assess the stability of the spatial predictions generated by the RF models to changes in the number of predictors and argue that model selection needs to consider both accuracy and stability. The results suggest that RF modeling is robust to the inclusion of many variables of moderate to low importance. We found no substantial improvement in cross-validated accuracy as a result of variable reduction. Moreover, the backward elimination procedure tended to select too few variables and exhibited numerous issues such as upwardly biased out-of-bag accuracy estimates and instabilities in the spatial predictions. We use simulations to further support and generalize results from the analysis of real data. A main purpose of this work is to elucidate issues of model selection bias and instability to ecologists interested in
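
    To make the backward-elimination loop examined here concrete, the following minimal sketch (Python with scikit-learn; synthetic data standing in for the stream-condition set, and all parameter values illustrative) repeatedly drops the least important predictor while tracking the out-of-bag score, the very selection-on-OOB pattern the authors warn can bias accuracy estimates upward:

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier

        # Synthetic stand-in for the good/poor stream-condition data:
        # many candidate predictors, only a few of them informative.
        X, y = make_classification(n_samples=500, n_features=30,
                                   n_informative=6, random_state=0)
        kept = list(range(X.shape[1]))
        trace = []
        while len(kept) > 2:
            rf = RandomForestClassifier(n_estimators=200, oob_score=True,
                                        random_state=0).fit(X[:, kept], y)
            trace.append((len(kept), rf.oob_score_))
            # Drop the least important remaining predictor; selecting on the
            # same OOB data is what biases the accuracy estimate upward.
            kept.pop(int(np.argmin(rf.feature_importances_)))
        for n, oob in trace:
            print(f"{n:2d} predictors: OOB accuracy {oob:.3f}")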

  10. Influence of olfactory and visual cover on nest site selection and nest success for grassland-nesting birds.

    Science.gov (United States)

    Fogarty, Dillon T; Elmore, R Dwayne; Fuhlendorf, Samuel D; Loss, Scott R

    2017-08-01

    Habitat selection by animals is influenced by and mitigates the effects of predation and environmental extremes. For birds, nest site selection is crucial to offspring production because nests are exposed to extreme weather and predation pressure. Predators that forage using olfaction often dominate nest predator communities; therefore, factors that influence olfactory detection (e.g., airflow and weather variables, including turbulence and moisture) should influence nest site selection and survival. However, few studies have assessed the importance of olfactory cover for habitat selection and survival. We assessed whether ground-nesting birds select nest sites based on visual and/or olfactory cover. Additionally, we assessed the importance of visual cover and airflow and weather variables associated with olfactory cover in influencing nest survival. In managed grasslands in Oklahoma, USA, we monitored nests of Northern Bobwhite ( Colinus virginianus ), Eastern Meadowlark ( Sturnella magna ), and Grasshopper Sparrow ( Ammodramus savannarum ) during 2015 and 2016. To assess nest site selection, we compared cover variables between nests and random points. To assess factors influencing nest survival, we used visual cover and olfactory-related measurements (i.e., airflow and weather variables) to model daily nest survival. For nest site selection, nest sites had greater overhead visual cover than random points, but no other significant differences were found. Weather variables hypothesized to influence olfactory detection, specifically precipitation and relative humidity, were the best predictors of and were positively related to daily nest survival. Selection for overhead cover likely contributed to mitigation of thermal extremes and possibly reduced detectability of nests. For daily nest survival, we hypothesize that major nest predators focused on prey other than the monitored species' nests during high moisture conditions, thus increasing nest survival on these

  11. Tobacco promotions at point-of-sale: the last hurrah.

    Science.gov (United States)

    Cohen, Joanna E; Planinac, Lynn C; Griffin, Kara; Robinson, Daniel J; O'Connor, Shawn C; Lavack, Anne; Thompson, Francis E; Di Nardo, Joanne

    2008-01-01

    The retail environment provides important opportunities for tobacco industry communication with current, former, and potential smokers. This study documented the extent of tobacco promotions at the retail point-of-sale and examined associations between the extent of tobacco promotions and relevant city and store characteristics. In each of 20 Ontario cities, 24 establishments were randomly selected from lists of convenience stores, gas stations, and grocery stores. Trained observers captured the range, type and intensity of tobacco promotions from April to July 2005. The extent of tobacco promotions was described using weighted descriptive statistics. Weighted t-tests and ANOVAs, and hierarchical linear modeling, were used to examine the relationships between tobacco promotions and city and store characteristics. Extensive tobacco promotions were found in Ontario stores one year prior to the implementation of a partial ban on retail displays, particularly in chain convenience stores, gas station convenience stores and independent convenience stores. The multivariate hierarchical linear model confirmed differences in the extent of tobacco promotions by store type. Public health messages about the harms of tobacco use may be compromised by the pervasiveness of these promotions at the point-of-sale.

  12. Comparison of cutting and pencil-point spinal needle in spinal anesthesia regarding postdural puncture headache

    Science.gov (United States)

    Xu, Hong; Liu, Yang; Song, WenYe; Kan, ShunLi; Liu, FeiFei; Zhang, Di; Ning, GuangZhi; Feng, ShiQing

    2017-01-01

    Abstract Background: Postdural puncture headache (PDPH), mainly resulting from the loss of cerebral spinal fluid (CSF), is a well-known iatrogenic complication of spinal anesthesia and diagnostic lumbar puncture. Spinal needles have been modified to minimize complications. Modifiable risk factors of PDPH mainly include needle size and needle shape. However, whether the incidence of PDPH differs significantly between cutting-point and pencil-point needles was controversial, so we performed a meta-analysis to compare the incidence of PDPH between cutting and pencil-point spinal needles. Methods: We included as eligible studies all randomized trials assessing clinical outcomes in patients given elective spinal anesthesia or diagnostic lumbar puncture with either a cutting or a pencil-point spinal needle. All selected studies and their risk of bias were assessed by 2 investigators. Clinical outcomes including success rates, frequency of PDPH, reported severe PDPH, and the use of epidural blood patch (EBP) were recorded as primary results. Results were evaluated using risk ratio (RR) with 95% confidence interval (CI) for dichotomous variables. RevMan software (version 5.3) was used to analyze all appropriate data. Results: Twenty-five randomized controlled trials (RCTs) were included in our study. The analysis revealed that pencil-point spinal needles resulted in a lower rate of PDPH (RR 2.50; 95% CI [1.96, 3.19]; P < 0.00001) and severe PDPH (RR 3.27; 95% CI [2.15, 4.96]; P < 0.00001). Furthermore, EBP was used less in the pencil-point spinal needle group (RR 3.69; 95% CI [1.96, 6.95]; P < 0.0001). Conclusions: Current evidence suggests that pencil-point spinal needles are significantly superior to cutting spinal needles regarding the frequency of PDPH, PDPH severity, and the use of EBP. In view of this, we recommend the use of pencil-point spinal needles in spinal anesthesia and lumbar puncture. PMID:28383416

  13. Three-dimensional imaging of individual point defects using selective detection angles in annular dark field scanning transmission electron microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jared M.; Im, Soohyun; Windl, Wolfgang; Hwang, Jinwoo, E-mail: hwang.458@osu.edu

    2017-01-15

    We propose a new scanning transmission electron microscopy (STEM) technique that can realize the three-dimensional (3D) characterization of vacancies and of lighter and heavier dopants with high precision. Using multislice STEM imaging and diffraction simulations of β-Ga₂O₃ and SrTiO₃, we show that selecting a small range of low scattering angles can make the contrast of the defect-containing atomic columns substantially more depth-dependent. The origin of the depth-dependence is the de-channeling of electrons due to the existence of a point defect in the atomic column, which creates extra “ripples” at low scattering angles. The highest contrast of the point defect can be achieved when the de-channeling signal is captured using the 20–40 mrad detection angle range. The effects of sample thickness, crystal orientation, local strain, probe convergence angle, and experimental uncertainty on the depth-dependent contrast of the point defect will also be discussed. The proposed technique therefore opens new possibilities for highly precise 3D structural characterization of individual point defects in functional materials. - Highlights: • A new electron microscopy technique that can visualize the 3D position of point defects is proposed. • The technique relies on the electron de-channeling signal at low scattering angles. • The technique enables precise determination of the depth of vacancies and lighter impurity atoms.

  14. Random geometry and Yang-Mills theory

    International Nuclear Information System (INIS)

    Froehlich, J.

    1981-01-01

    The author states various problems and discusses a very few preliminary rigorous results in a branch of mathematics and mathematical physics which one might call random (or stochastic) geometry. Furthermore, he points out why random geometry is important in the quantization of Yang-Mills theory. (Auth.)

  15. Modern Statistics for Spatial Point Processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Waagepetersen, Rasmus

    2007-01-01

    We summarize and discuss the current state of spatial point process theory and directions for future research, making an analogy with generalized linear models and random effect models, and illustrating the theory with various examples of applications. In particular, we consider Poisson, Gibbs...

  17. A RANDOMIZED TRIAL TO STUDY THE COMPARISON OF TRIGGER POINT DRY NEEDLING VERSUS KINESIO TAPING TECHNIQUE IN MYOFASCIAL PAIN SYNDROME DURING A 3-MONTH FOLLOW UP

    Directory of Open Access Journals (Sweden)

    Emrullah Hayta

    2016-10-01

    Full Text Available Background: Management of myofascial pain syndrome (MPS) is a current research subject, since there is only a small number of randomized studies comparing different management techniques. Multiple studies have attempted to assess various treatment options, including trigger point dry needling and kinesiotaping. We compared the effects of trigger point dry needling and kinesiotaping in the management of myofascial pain syndrome during a 3-month follow-up period. Methods: In this prospective randomized study in MPS patients with upper trapezius muscle trigger points, the effects of dry needling (n=28) and kinesiotaping (n=27) were compared with regard to the visual analog scale (VAS), neck disability index (NDI), and Nottingham health profile (NHP) scores measured at weeks 0, 4, and 12. Results: Both dry needling and kinesiotaping comparably reduced VAS scores measured at weeks 4 and 12, and their efficacies were more remarkable at week 12 (p<0.05). These interventions significantly reduced the NDI and NHP scores, and their effects were also more remarkable at week 12; however, dry needling was found more effective (p<0.05). Conclusion: Overall, in current clinical settings, pain in MPS can be reduced comparably by both dry needling and kinesiotaping; however, restriction of the range of motion in the neck region and quality of life are more remarkably improved by dry needling. Both dry needling and kinesiotaping can provide increasing effectiveness for up to 12 weeks.

  18. The correlation function for density perturbations in an expanding universe. III The three-point and predictions of the four-point and higher order correlation functions

    Science.gov (United States)

    Mcclelland, J.; Silk, J.

    1978-01-01

    Higher-order correlation functions for the large-scale distribution of galaxies in space are investigated. It is demonstrated that the three-point correlation function observed by Peebles and Groth (1975) is not consistent with a distribution of perturbations that at present are randomly distributed in space. The two-point correlation function is shown to be independent of how the perturbations are distributed spatially, and a model of clustered perturbations is developed which incorporates a nonuniform perturbation distribution and which explains the three-point correlation function. A model with hierarchical perturbations incorporating the same nonuniform distribution is also constructed; it is found that this model also explains the three-point correlation function, but predicts different results for the four-point and higher-order correlation functions than does the model with clustered perturbations. It is suggested that the model of hierarchical perturbations might be explained by the single assumption of having density fluctuations or discrete objects all of the same mass randomly placed at some initial epoch.

  19. Blind Measurement Selection: A Random Matrix Theory Approach

    KAUST Repository

    Elkhalil, Khalil; Kammoun, Abla; Al-Naffouri, Tareq Y.; Alouini, Mohamed-Slim

    2016-01-01

    -aware fashions. We present two potential applications where the proposed algorithms can be used, namely antenna selection for uplink transmissions in large scale multi-user systems and sensor selection for wireless sensor networks. Numerical results are also

  20. The use of random amplified polymorphic DNA to evaluate the genetic variability of Ponkan mandarin (Citrus reticulata Blanco) accessions

    Directory of Open Access Journals (Sweden)

    Coletta Filho Helvécio Della

    2000-01-01

    Full Text Available RAPD analysis of 19 Ponkan mandarin accessions was performed using 25 random primers. Of 112 amplification products selected, only 32 were polymorphic across five accessions. The absence of genetic variability among the other 14 accessions suggested that they were either clonal propagations with different local names, or that they had undetectable genetic variability, such as point mutations which cannot be detected by RAPD.

  1. Effects of choice architecture and chef-enhanced meals on the selection and consumption of healthier school foods: a randomized clinical trial.

    Science.gov (United States)

    Cohen, Juliana F W; Richardson, Scott A; Cluggish, Sarah A; Parker, Ellen; Catalano, Paul J; Rimm, Eric B

    2015-05-01

    Little is known about the long-term effect of a chef-enhanced menu on healthier food selection and consumption in school lunchrooms. In addition, it remains unclear if extended exposure to other strategies to promote healthier foods (eg, choice architecture) also improves food selection or consumption. To evaluate the short- and long-term effects of chef-enhanced meals and extended exposure to choice architecture on healthier school food selection and consumption. A school-based randomized clinical trial was conducted during the 2011-2012 school year among 14 elementary and middle schools in 2 urban, low-income school districts (intent-to-treat analysis). Included in the study were 2638 students in grades 3 through 8 attending participating schools (38.4% of eligible participants). Schools were first randomized to receive a professional chef to improve school meal palatability (chef schools) or to a delayed intervention (control group). To assess the effect of choice architecture (smart café), all schools after 3 months were then randomized to the smart café intervention or to the control group. School food selection was recorded, and consumption was measured using plate waste methods. After 3 months, vegetable selection increased in chef vs control schools (odds ratio [OR], 1.75; 95% CI, 1.36-2.24), but there was no effect on the selection of other components or on meal consumption. After long-term or extended exposure to the chef or smart café intervention, fruit selection increased in the chef (OR, 3.08; 95% CI, 2.23-4.25), smart café (OR, 1.45; 95% CI, 1.13-1.87), and chef plus smart café (OR, 3.10; 95% CI, 2.26-4.25) schools compared with the control schools, and consumption increased in the chef schools (OR, 0.17; 95% CI, 0.03-0.30 cups/d). Vegetable selection increased in the chef (OR, 2.54; 95% CI, 1.83-3.54), smart café (OR, 1.91; 95% CI, 1.46-2.50), and chef plus smart café schools (OR, 7.38, 95% CI, 5.26-10.35) compared with the control schools

  2. Bell inequalities for random fields

    Energy Technology Data Exchange (ETDEWEB)

    Morgan, Peter [Physics Department, Yale University, CT 06520 (United States)

    2006-06-09

    The assumptions required for the derivation of Bell inequalities are not satisfied for random field models in which there are any thermal or quantum fluctuations, in contrast to the general satisfaction of the assumptions for classical two point particle models. Classical random field models that explicitly include the effects of quantum fluctuations on measurement are possible for experiments that violate Bell inequalities.

  3. Bell inequalities for random fields

    OpenAIRE

    Morgan, Peter

    2004-01-01

    The assumptions required for the derivation of Bell inequalities are not usually satisfied for random fields in which there are any thermal or quantum fluctuations, in contrast to the general satisfaction of the assumptions for classical two point particle models. Classical random field models that explicitly include the effects of quantum fluctuations on measurement are possible for experiments that violate Bell inequalities.

  4. Modelling estimation and analysis of dynamic processes from image sequences using temporal random closed sets and point processes with application to the cell exocytosis and endocytosis

    OpenAIRE

    Díaz Fernández, Ester

    2010-01-01

    In this thesis, new models and methodologies are introduced for the analysis of dynamic processes characterized by image sequences with spatial temporal overlapping. The spatial temporal overlapping exists in many natural phenomena and should be addressed properly in several Science disciplines such as Microscopy, Material Sciences, Biology, Geostatistics or Communication Networks. This work is related to the Point Process and Random Closed Set theories, within Stochastic Ge...

  5. Modified random hinge transport mechanics and multiple scattering step-size selection in EGS5

    International Nuclear Information System (INIS)

    Wilderman, S.J.; Bielajew, A.F.

    2005-01-01

    The new transport mechanics in EGS5 allows for significantly longer electron transport step sizes, and hence shorter computation times, than required for identical problems in EGS4. But as with all Monte Carlo electron transport algorithms, certain classes of problems exhibit step-size dependencies even when operating within recommended ranges, sometimes making step-size selection a daunting task for novice users. Further contributing to this problem, because multiple scattering and continuous energy loss are decoupled in the dual random hinge transport mechanics of EGS5, there are two independent step sizes in EGS5, one for multiple scattering and one for continuous energy loss, each of which influences speed and accuracy in a different manner. Further, whereas EGS4 used a single value of fractional energy loss (ESTEPE) to determine step sizes at all energies, EGS5 permits the fractional energy loss values used to determine both the multiple scattering and continuous energy loss step sizes to vary with energy, in order to increase performance by decreasing the effort expended on simulating lower-energy particles. This results in the user having to specify four fractional energy loss values when optimizing computations for speed. Thus, in order to simplify step-size selection and to mitigate step-size dependencies, a method has been devised to automatically optimize step-size selection based on a single material-dependent input related to the size of the problem tally region. In this paper we discuss the new transport mechanics in EGS5 and describe the automatic step-size optimization algorithm. (author)

  6. Effectiveness of trigger point dry needling for plantar heel pain: a meta-analysis of seven randomized controlled trials

    Directory of Open Access Journals (Sweden)

    He C

    2017-08-01

    Full Text Available Chunhui He,1,* Hua Ma2,* 1Internal Medicine of Traditional Chinese Medicine, 2Medical Image Center, The First Affiliated Hospital of Xinjiang Medical University, Wulumuqi, People’s Republic of China *These authors contributed equally to this work Background: Plantar heel pain can be managed with dry needling of myofascial trigger points (MTrPs); however, whether MTrP needling is effective remains controversial. Thus, we conducted this meta-analysis to evaluate the effect of MTrP needling in patients with plantar heel pain. Materials and methods: PubMed, Embase, Web of Science, SinoMed (Chinese BioMedical Literature Service System, People’s Republic of China), and CNKI (National Knowledge Infrastructure, People’s Republic of China) databases were systematically reviewed for randomized controlled trials (RCTs) that assessed the effects of MTrP needling. Pooled weighted mean difference (WMD) with 95% CIs was calculated for change in visual analog scale (VAS) score, and pooled risk ratio (RR) with 95% CIs was calculated for success rate for pain and incidence of adverse events. A fixed-effects model or random-effects model was used to pool the estimates, depending on the heterogeneity among the included studies. Results: An extensive literature search yielded 1,941 articles, of which only seven RCTs met the inclusion criteria and were included in this meta-analysis. The pooled results showed that MTrP needling significantly reduced the VAS score (WMD = –15.50, 95% CI: –19.48, –11.53; P<0.001) compared with control, but it had a success rate for pain similar to control (risk ratio [RR] = 1.15, 95% CI: 0.87, 1.51; P=0.320). Moreover, MTrP needling was associated with an incidence of adverse events similar to control (RR = 1.89, 95% CI: 0.38, 9.39; P=0.438). Conclusion: MTrP needling effectively reduced the heel pain due to plantar fasciitis. However, considering the potential limitations in this study, more large-scale, adequately powered, good
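
    For readers unfamiliar with the pooling referred to above, this sketch (Python; the per-trial effects and variances are invented, not the trial data) computes the inverse-variance fixed-effect estimate and a DerSimonian-Laird random-effects estimate, the two model choices named in the abstract:

        import numpy as np

        def pool_inverse_variance(effects, variances):
            # Fixed-effect pooled estimate, then a DerSimonian-Laird
            # random-effects version with a 95% confidence interval.
            e, v = np.asarray(effects, float), np.asarray(variances, float)
            w = 1.0 / v
            fixed = np.sum(w * e) / np.sum(w)
            q = np.sum(w * (e - fixed) ** 2)          # Cochran's Q heterogeneity
            c = np.sum(w) - np.sum(w**2) / np.sum(w)
            tau2 = max(0.0, (q - (len(e) - 1)) / c)   # between-trial variance
            w_star = 1.0 / (v + tau2)
            rand = np.sum(w_star * e) / np.sum(w_star)
            se = np.sqrt(1.0 / np.sum(w_star))
            return fixed, rand, (rand - 1.96 * se, rand + 1.96 * se)

        # Hypothetical per-trial mean VAS changes and variances.
        print(pool_inverse_variance([-14.0, -18.5, -12.2], [4.0, 6.5, 3.1]))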

  7. Health information needs of professional nurses required at the point of care.

    Science.gov (United States)

    Ricks, Esmeralda; ten Ham, Wilma

    2015-06-11

    Professional nurses work in dynamic environments and need to keep up to date with relevant information for practice in nursing to render quality patient care. Keeping up to date with current information is often challenging because of heavy workload, diverse information needs and the accessibility of the required information at the point of care. The aim of the study was to explore and describe the information needs of professional nurses at the point of care in order to make recommendations to stakeholders to develop a mobile library accessible by means of smart phones when needed. The researcher utilised a quantitative, descriptive survey design to conduct this study. The target population comprised 757 professional nurses employed at a state hospital. Simple random sampling was used to select a sample of the wards, units and departments for inclusion in the study. A convenience sample of 250 participants was selected. Two hundred and fifty structured self-administered questionnaires were distributed amongst the participants. Descriptive statistics were used to analyse the data. A total of 136 completed questionnaires were returned. The findings highlighted the types and accessible sources of information. Information needs of professional nurses were identified such as: extremely drug-resistant tuberculosis, multi-drug-resistant tuberculosis, HIV, antiretrovirals and all chronic lifestyle diseases. This study has enabled the researcher to identify the information needs required by professional nurses at the point of care to enhance the delivery of patient care. The research results were used to develop a mobile library that could be accessed by professional nurses.

  8. Thinning spatial point processes into Poisson processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Schoenberg, Frederic Paik

    This paper describes methods for randomly thinning certain classes of spatial point processes. In the case of a Markov point process, the proposed method involves a dependent thinning of a spatial birth-and-death process, where clans of ancestors associated with the original points are identified, and where one simulates backwards and forwards in order to obtain the thinned process. In the case of a Cox process, a simple independent thinning technique is proposed. In both cases, the thinning results in a Poisson process if and only if the true Papangelou conditional intensity is used, and thus can be used as a diagnostic for assessing the goodness-of-fit of a spatial point process model. Several examples, including clustered and inhibitive point processes, are considered.

  9. Pseudo-random data acquisition geometry in 3D seismic survey; Sanjigen jishin tansa ni okeru giji random data shutoku reiauto ni tsuite

    Energy Technology Data Exchange (ETDEWEB)

    Minegishi, M; Tsuburaya, Y [Japan National Oil Corp., Tokyo (Japan). Technology Research Center

    1996-10-01

    The influence of pseudo-random geometry on imaging for 3D seismic exploration data acquisition has been investigated using a simple model and compared with regular geometry. When constituting the wave front by the interference of elemental waves, pseudo-random geometry data did not always provide good results. In the case of a point diffractor, the imaging operation, where the constituted wave front was returned to the point diffractor by the interference of elemental waves for the spatial alias records, did not always give clear images. In the case of multi-point diffractors, good images were obtained with less noise generation in spite of alias records. There are a lot of diffractors in actual geological structures, which corresponds to the case of multi-point diffractors. Finally, better images could be obtained by inputting records acquired using the pseudo-random geometry rather than by inputting spatial alias records acquired using the regular geometry. 7 refs., 6 figs.

  10. Influence of random setup error on dose distribution

    International Nuclear Information System (INIS)

    Zhai Zhenyu

    2008-01-01

    Objective: To investigate the influence of random setup error on dose distribution in radiotherapy and determine the margin from ITV to PTV. Methods: A random sample approach was used to simulate the field positions in the target coordinate system. The cumulative effect of random setup error was the sum of the dose distributions of all individual treatment fractions. Analysis of 100 cumulative effects yielded the shift sizes of the 90% dose point position. Margins from ITV to PTV caused by random setup error were chosen at 95% probability. Spearman's correlation was used to analyze the influence of each factor. Results: The average shift sizes of the 90% dose point position were 0.62, 1.84, 3.13, 4.78, 6.34 and 8.03 mm if the random setup error was 1, 2, 3, 4, 5 and 6 mm, respectively. Univariate analysis showed the size of the margin was associated only with the size of the random setup error. Conclusions: The margin from ITV to PTV is 1.2 times the random setup error for head-and-neck cancer and 1.5 times for thoracic and abdominal cancer. Field size, energy and target depth, unlike random setup error, have no relation with the size of the margin. (authors)
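
    The random-sample approach can be illustrated with a toy one-dimensional dose profile (Python; the field shape, grid, and fraction count are invented for illustration): each fraction shifts the profile by a Gaussian setup error, the fractions are averaged, and the displacement of the 90% dose point is read off:

        import numpy as np

        rng = np.random.default_rng(1)

        def mean_shift_90(sigma_mm, fractions=30, trials=100):
            # Toy 1-D dose profile: flat top with a linear penumbra.
            x = np.arange(-60.0, 60.0, 0.5)                 # mm grid
            field = lambda c: np.clip((30.0 - np.abs(x - c)) / 5.0, 0.0, 1.0)
            ref = x[np.argmin(np.abs(field(0.0) - 0.9))]    # static 90% point
            shifts = []
            for _ in range(trials):
                errs = rng.normal(0.0, sigma_mm, size=fractions)
                dose = np.mean([field(e) for e in errs], axis=0)  # cumulative
                edge = x[np.argmin(np.abs(dose - 0.9 * dose.max()))]
                shifts.append(abs(edge - ref))
            return np.mean(shifts)

        for s in (1, 2, 3):
            print(f"sigma = {s} mm -> mean 90% dose point shift = "
                  f"{mean_shift_90(s):.2f} mm")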

  11. Students' perception of the usage of PowerPoint in learning calculus

    Science.gov (United States)

    Othman, Zarith Sofiah; Tarmuji, Nor Habibah; Hilmi, Zulkifli Ab Ghani

    2017-04-01

    Mathematics is a core subject in most of the science and technology courses and in some social sciences programs. However, the low achievement of students in the subject, especially in topics such as Differentiation and Integration, is a persistent issue. Many factors contribute to the low performance, such as motivation, environment, method of learning, academic background and others. The purpose of this paper is to determine the perception of learning mathematics using PowerPoint on Integration concepts at the undergraduate level with respect to mathematics anxiety, learning enjoyment, mobility and learning satisfaction. The main content of the PowerPoint presentation focused on the integration method, with historical elements as an added value. The study was conducted on 48 students randomly selected from students in computer and applied sciences programs as the experimental group. Questionnaires were distributed to students to explore their learning experiences. Another 51 students, who were taught using the traditional chalkboard method, were used as the control group. Both groups were given a test on Integration. The statistical methods used were descriptive statistics and an independent-sample t-test between the experimental and the control group. The findings showed that most students responded positively to the PowerPoint presentations with respect to mobility and learning satisfaction. The experimental group performed better than the control group.

  12. HIV Salvage Therapy Does Not Require Nucleoside Reverse Transcriptase Inhibitors: A Randomized, Controlled Trial.

    Science.gov (United States)

    Tashima, Karen T; Smeaton, Laura M; Fichtenbaum, Carl J; Andrade, Adriana; Eron, Joseph J; Gandhi, Rajesh T; Johnson, Victoria A; Klingman, Karin L; Ritz, Justin; Hodder, Sally; Santana, Jorge L; Wilkin, Timothy; Haubrich, Richard H

    2015-12-15

    Nucleoside reverse transcriptase inhibitors (NRTIs) are often included in antiretroviral regimens in treatment-experienced patients in the absence of data from randomized trials. To compare treatment success between participants who omit versus those who add NRTIs to an optimized antiretroviral regimen of 3 or more agents. Multicenter, randomized, controlled trial. (ClinicalTrials.gov: NCT00537394). Outpatient HIV clinics. Treatment-experienced patients with HIV infection and viral resistance. Open-label optimized regimens (not including NRTIs) were selected on the basis of treatment history and susceptibility testing. Participants were randomly assigned to omit or add NRTIs. The primary efficacy outcome was regimen failure through 48 weeks using a noninferiority margin of 15%. The primary safety outcome was time to initial episode of a severe sign, symptom, or laboratory abnormality before discontinuation of NRTI assignment. 360 participants were randomly assigned, and 93% completed a 48-week visit. The cumulative probability of regimen failure was 29.8% in the omit-NRTIs group versus 25.9% in the add-NRTIs group (difference, 3.2 percentage points [95% CI, -6.1 to 12.5 percentage points]). No significant between-group differences were found in the primary safety end points or the proportion of participants with HIV RNA level less than 50 copies/mL. No deaths occurred in the omit-NRTIs group compared with 7 deaths in the add-NRTIs group. Unblinded study design, and the study may not be applicable to resource-poor settings. Treatment-experienced patients with HIV infection starting a new optimized regimen can safely omit NRTIs without compromising virologic efficacy. Omitting NRTIs will reduce pill burden, cost, and toxicity in this patient population. National Institute of Allergy and Infectious Diseases, Boehringer Ingelheim, Janssen, Merck, ViiV Healthcare, Roche, and Monogram Biosciences (LabCorp).

  13. Automated Coarse Registration of Point Clouds in 3d Urban Scenes Using Voxel Based Plane Constraint

    Science.gov (United States)

    Xu, Y.; Boerner, R.; Yao, W.; Hoegner, L.; Stilla, U.

    2017-09-01

    For obtaining full coverage of 3D scans in a large-scale urban area, registration between point clouds acquired via terrestrial laser scanning (TLS) is normally mandatory. However, due to the complex urban environment, the automatic registration of different scans is still a challenging problem. In this work, we propose an automatic marker-free method for fast and coarse registration between point clouds using the geometric constraints of planar patches under a voxel structure. Our proposed method consists of four major steps: the voxelization of the point cloud, the approximation of planar patches, the matching of corresponding patches, and the estimation of transformation parameters. In the voxelization step, the point cloud of each scan is organized with a 3D voxel structure, by which the entire point cloud is partitioned into small individual patches. In the following step, we represent the points of each voxel by an approximated plane function and select those patches resembling planar surfaces. Afterwards, a RANSAC-based strategy is applied for matching corresponding patches. Among all the planar patches of a scan, we randomly select a set of three planar patches in order to build a coordinate frame from their normal vectors and their intersection point. The transformation parameters between scans are calculated from these two coordinate frames. The set whose transformation parameters yield the largest number of coplanar patches is identified as the optimal candidate for estimating the correct transformation parameters. The experimental results using TLS datasets of different scenes reveal that our proposed method is both effective and efficient for the coarse registration task. In particular, for fast orientation between scans, our proposed method achieves a registration error of less than around 2 degrees on the testing datasets and is much more efficient than classical baseline methods.
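
    A minimal sketch of the frame-construction step (Python; the matched plane triples are hypothetical, and voxelization and patch matching are omitted): Gram-Schmidt turns three plane normals and their intersection point into an orthonormal frame per scan, from which the rigid transform follows:

        import numpy as np

        def frame_from_planes(normals, point):
            # Orthonormal frame from a triple of plane normals and their
            # intersection point (the third normal only guards degeneracy).
            n1, n2, n3 = (np.asarray(n, float) / np.linalg.norm(n) for n in normals)
            x = n1
            y = n2 - (n2 @ x) * x          # Gram-Schmidt step
            y /= np.linalg.norm(y)
            z = np.cross(x, y)
            assert abs(z @ n3) > 0.1, "planes are close to degenerate"
            return np.column_stack([x, y, z]), np.asarray(point, float)

        def transform_between(frame_a, frame_b):
            # Rigid transform mapping scan A's frame onto scan B's frame.
            (Ra, ta), (Rb, tb) = frame_a, frame_b
            R = Rb @ Ra.T
            return R, tb - R @ ta

        # Hypothetical matched plane triples from two scans.
        A = frame_from_planes([(1, 0, 0), (0, 1, 0), (0, 0, 1)], (0, 0, 0))
        B = frame_from_planes([(0, 1, 0), (-1, 0, 0), (0, 0, 1)], (1, 2, 0))
        R, t = transform_between(A, B)
        print(np.round(R, 3), t)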

  14. Multi-Class Simultaneous Adaptive Segmentation and Quality Control of Point Cloud Data

    Directory of Open Access Journals (Sweden)

    Ayman Habib

    2016-01-01

    Full Text Available 3D modeling of a given site is an important activity for a wide range of applications including urban planning, as-built mapping of industrial sites, heritage documentation, military simulation, and outdoor/indoor analysis of airflow. Point clouds, which could be derived from either passive or active imaging systems, are an important source for 3D modeling. Such point clouds need to undergo a sequence of data processing steps to derive the necessary information for the 3D modeling process. Segmentation is usually the first step in the data processing chain. This paper presents a region-growing multi-class simultaneous segmentation procedure, where planar, pole-like, and rough regions are identified while considering the internal characteristics (i.e., local point density/spacing and noise level) of the point cloud in question. The segmentation starts with point cloud organization into a kd-tree data structure and a characterization process to estimate the local point density/spacing. Then, proceeding from randomly-distributed seed points, a set of seed regions is derived through distance-based region growing, which is followed by modeling of such seed regions into planar and pole-like features. Starting from optimally-selected seed regions, planar and pole-like features are then segmented. The paper also introduces a list of hypothesized artifacts/problems that might take place during the region-growing process. Finally, a quality control process is devised to detect, quantify, and mitigate instances of partially/fully misclassified planar and pole-like features. Experimental results from airborne and terrestrial laser scanning as well as image-based point clouds are presented to illustrate the performance of the proposed segmentation and quality control framework.

  15. Fat suppression strategies in MR imaging of breast cancer at 3.0 T. Comparison of the two-point Dixon technique and the frequency selective inversion method

    International Nuclear Information System (INIS)

    Kaneko Mikami, Wakako; Kazama, Toshiki; Sato, Hirotaka

    2013-01-01

    The purpose of this study was to compare two fat suppression methods in contrast-enhanced MR imaging of breast cancer at 3.0 T: the two-point Dixon method and the frequency selective inversion method. Forty female patients with breast cancer underwent contrast-enhanced three-dimensional T1-weighted MR imaging at 3.0 T. Both the two-point Dixon method and the frequency selective inversion method were applied. Quantitative analyses of the residual fat signal-to-noise ratio and the contrast-to-noise ratio (CNR) of lesion-to-breast parenchyma, lesion-to-fat, and parenchyma-to-fat were performed. Qualitative analyses of the uniformity of fat suppression, image contrast, and the visibility of breast lesions and axillary metastatic adenopathy were performed. The residual fat signal-to-noise ratio was significantly lower with the two-point Dixon method (P<0.001). All CNR values were significantly higher with the two-point Dixon method (P<0.001 and P=0.001, respectively). According to the qualitative analysis, both the uniformity of fat suppression and the image contrast with the two-point Dixon method were significantly higher (P<0.001 and P=0.002, respectively). Visibility of breast lesions and metastatic adenopathy was significantly better with the two-point Dixon method (P<0.001 and P=0.03, respectively). The two-point Dixon method suppressed the fat signal more potently and improved contrast and visibility of the breast lesions and axillary adenopathy. (author)

  16. Self-exciting point process in modeling earthquake occurrences

    International Nuclear Information System (INIS)

    Pratiwi, H.; Slamet, I.; Respatiwulan; Saputro, D. R. S.

    2017-01-01

    In this paper, we present a procedure for modeling earthquakes based on a spatial-temporal point process. The magnitude distribution is expressed as a truncated exponential, and the event frequency is modeled with a spatial-temporal point process that is characterized uniquely by its associated conditional intensity process. Earthquakes can be regarded as point patterns that have a temporal clustering feature, so we use a self-exciting point process for modeling the conditional intensity function. The choice of main shocks is conducted via the window algorithm of Gardner and Knopoff, and the model can be fitted by the maximum likelihood method for three random variables. (paper)
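
    The conditional intensity of such a self-exciting (Hawkes-type) process, together with an Ogata-style thinning simulation, can be sketched as follows (Python; the parameter values are illustrative, and the spatial and magnitude components of a full earthquake model are omitted):

        import numpy as np

        def hawkes_intensity(t, history, mu=0.5, alpha=0.8, beta=1.2):
            # Background rate plus exponentially decaying boosts from
            # past events: lambda(t) = mu + sum_i alpha*exp(-beta*(t - t_i)).
            past = np.asarray([s for s in history if s < t])
            return mu + alpha * np.exp(-beta * (t - past)).sum()

        def simulate(t_max=50.0, seed=2):
            # Ogata thinning: the intensity decays between events, so its
            # value just after the current time (plus a margin) is a bound.
            rng, events, t = np.random.default_rng(seed), [], 0.0
            while t < t_max:
                lam_bar = hawkes_intensity(t + 1e-9, events) + 0.8
                t += rng.exponential(1.0 / lam_bar)
                if rng.uniform() < hawkes_intensity(t, events) / lam_bar:
                    events.append(t)
            return events

        ev = simulate()
        print(len(ev), "events; intensity at t_max:",
              round(hawkes_intensity(50.0, ev), 3))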

  17. Transforming spatial point processes into Poisson processes using random superposition

    DEFF Research Database (Denmark)

    Møller, Jesper; Berthelsen, Kasper Klitgaaard

    A given spatial point process X is superposed with a complementary spatial point process Y to obtain a Poisson process X∪Y with intensity function β. Underlying this is a bivariate spatial birth-death process (Xt,Yt) which converges towards the distribution of (X,Y). We study the joint distribution of X and Y, and their marginal and conditional distributions. In particular, we introduce a fast and easy simulation procedure for Y conditional on X. This may be used for model checking: given a model for the Papangelou intensity of the original spatial point process, this model is used to generate the complementary process, and the resulting superposition is a Poisson process with intensity function β if and only if the true Papangelou intensity is used. Whether the superposition is actually such a Poisson process can easily be examined using well known results and fast simulation procedures for Poisson processes. We illustrate this approach to model checking...

  18. Point mutations in the post-M2 region of human alpha-ENaC regulate cation selectivity.

    Science.gov (United States)

    Ji, H L; Parker, S; Langloh, A L; Fuller, C M; Benos, D J

    2001-07-01

    We tested the hypothesis that an arginine-rich region immediately following the second transmembrane domain may constitute part of the inner mouth of the epithelial Na+ channel (ENaC) pore and, hence, influence conduction and/or selectivity properties of the channel by expressing double point mutants in Xenopus oocytes. Double point mutations of arginines in this post-M2 region of the human α-ENaC (α-hENaC) led to a decrease and increase in the macroscopic conductance of αR586E,R587Eβγ- and αR589E,R591Eβγ-hENaC, respectively, but had no effect on the single-channel conductance of either double point mutant. However, the apparent equilibrium dissociation constant for Na+ was decreased for both αR586E,R587Eβγ- and αR589E,R591Eβγ-hENaC, and the maximum amiloride-sensitive Na+ current was decreased for αR586E,R587Eβγ-hENaC and increased for αR589E,R591Eβγ-hENaC. The relative permeabilities of Li+ and K+ vs. Na+ were increased 11.25- to 27.57-fold for αR586E,R587Eβγ-hENaC compared with wild type. The relative ion permeability of these double mutants and wild-type ENaC was inversely related to the crystal diameter of the permeant ions. Thus the region of positive charge is important for the ion permeation properties of the channel and may form part of the pore itself.

  19. Coverage of space by random sets

    Indian Academy of Sciences (India)

    Consider the non-negative integer line. For each integer point we toss a coin. If the toss at location i is Heads, we place an interval (of random length) there and move to location i + 1; if Tails, we move to location i + 1.
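
    A quick simulation of this coin-tossing coverage scheme (Python; the coin bias and the interval-length law are arbitrary choices) estimates the fraction of the line left uncovered:

        import random

        def uncovered_fraction(n=10_000, p=0.5, max_len=5, seed=3):
            # Toss a coin at each integer; on Heads drop an interval of
            # random length starting there; report the uncovered fraction.
            random.seed(seed)
            covered = [False] * n
            for i in range(n):
                if random.random() < p:                  # Heads at location i
                    length = random.randint(1, max_len)  # random interval length
                    for j in range(i, min(i + length, n)):
                        covered[j] = True
            return 1 - sum(covered) / n

        print("uncovered fraction:", uncovered_fraction())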

  20. AUTOMATIC RECOGNITION OF INDOOR NAVIGATION ELEMENTS FROM KINECT POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    L. Zeng

    2017-09-01

    Full Text Available This paper automatically recognizes the navigation elements defined by the indoorGML data standard – door, stairway and wall. The data used are indoor 3D point clouds collected by a Kinect v2, launched in 2011, through the means of ORB-SLAM. By contrast, it is cheaper and more convenient than lidar, but the point clouds also have the problems of noise, registration error and large data volume. Hence, we adopt a shape descriptor – the histogram of distances between two randomly chosen points, proposed by Osada – merged with other descriptors, in conjunction with a random forest classifier, to recognize the navigation elements (door, stairway and wall) from Kinect point clouds. This research acquires navigation elements and their 3-d location information from each single data frame through segmentation of point clouds, boundary extraction, feature calculation and classification. Finally, this paper utilizes the acquired navigation elements and their information to generate the state data of the indoor navigation module automatically. The experimental results demonstrate a high recognition accuracy of the proposed method.

  1. Automatic Recognition of Indoor Navigation Elements from Kinect Point Clouds

    Science.gov (United States)

    Zeng, L.; Kang, Z.

    2017-09-01

    This paper automatically recognizes the navigation elements defined by the indoorGML data standard - door, stairway and wall. The data used are indoor 3D point clouds collected by a Kinect v2, launched in 2011, through the means of ORB-SLAM. By contrast, it is cheaper and more convenient than lidar, but the point clouds also have the problems of noise, registration error and large data volume. Hence, we adopt a shape descriptor - the histogram of distances between two randomly chosen points, proposed by Osada - merged with other descriptors, in conjunction with a random forest classifier, to recognize the navigation elements (door, stairway and wall) from Kinect point clouds. This research acquires navigation elements and their 3-d location information from each single data frame through segmentation of point clouds, boundary extraction, feature calculation and classification. Finally, this paper utilizes the acquired navigation elements and their information to generate the state data of the indoor navigation module automatically. The experimental results demonstrate a high recognition accuracy of the proposed method.
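
    The Osada-style descriptor used in both versions of this record, a histogram of distances between randomly chosen point pairs fed to a random forest, can be sketched as follows (Python with scikit-learn; the synthetic patches merely stand in for segmented wall, door, and stairway points):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(4)

        def d2_descriptor(points, n_pairs=2000, n_bins=32, r_max=3.0):
            # Osada's D2 shape distribution: normalized histogram of
            # distances between randomly chosen point pairs.
            i = rng.integers(0, len(points), size=(n_pairs, 2))
            d = np.linalg.norm(points[i[:, 0]] - points[i[:, 1]], axis=1)
            h, _ = np.histogram(d, bins=n_bins, range=(0.0, r_max))
            return h / h.sum()

        def fake_patch(kind, n=500):
            # Synthetic stand-ins for segmented wall/door/stairway patches.
            if kind == 0:    # wall-like wide plane
                p = rng.uniform(0, 1, (n, 3)) * [2.0, 2.0, 0.02]
            elif kind == 1:  # door-like narrow plane
                p = rng.uniform(0, 1, (n, 3)) * [0.9, 2.0, 0.02]
            else:            # stairway-like stepped surface
                p = rng.uniform(0, 1, (n, 3)) * [1.0, 2.0, 0.02]
                p[:, 2] += np.floor(p[:, 1] / 0.4) * 0.2
            return p

        X = np.array([d2_descriptor(fake_patch(k))
                      for k in np.repeat([0, 1, 2], 40)])
        y = np.repeat([0, 1, 2], 40)
        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
        print("training accuracy:", clf.score(X, y))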

  2. Thinning spatial point processes into Poisson processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Schoenberg, Frederic Paik

    2010-01-01

    In this paper we describe methods for randomly thinning certain classes of spatial point processes. In the case of a Markov point process, the proposed method involves a dependent thinning of a spatial birth-and-death process, where clans of ancestors associated with the original points are identified, and where we simulate backwards and forwards in order to obtain the thinned process. In the case of a Cox process, a simple independent thinning technique is proposed. In both cases, the thinning results in a Poisson process if and only if the true Papangelou conditional intensity is used, and, thus, can be used as a graphical exploratory tool for inspecting the goodness-of-fit of a spatial point process model. Several examples, including clustered and inhibitive point processes, are considered.

  3. Explore Stochastic Instabilities of Periodic Points by Transition Path Theory

    Science.gov (United States)

    Cao, Yu; Lin, Ling; Zhou, Xiang

    2016-06-01

    We consider the noise-induced transitions from a linearly stable periodic orbit consisting of T periodic points in a randomly perturbed discrete logistic map. Traditional large deviation theory and asymptotic analysis at the small noise limit cannot distinguish the quantitative difference in noise-induced stochastic instabilities among the T periodic points. To attack this problem, we generalize the transition path theory to the discrete-time continuous-space stochastic process. In our first criterion to quantify the relative instability among the T periodic points, we use the distribution of the last passage location related to the transitions from the whole periodic orbit to a prescribed disjoint set. This distribution is related to the individual contributions to the transition rate from each periodic point. The second criterion is based on the competency of the transition paths associated with each periodic point. Both criteria utilize the reactive probability current in the transition path theory. Our numerical results for the logistic map reveal the transition mechanism of escaping from the stable periodic orbit and identify which periodic point is more prone to lose stability so as to make successful transitions under random perturbations.

  4. Species selective preconcentration and quantification of gold nanoparticles using cloud point extraction and electrothermal atomic absorption spectrometry

    International Nuclear Information System (INIS)

    Hartmann, Georg; Schuster, Michael

    2013-01-01

    Highlights: ► We optimized cloud point extraction and ET-AAS parameters for Au-NPs measurement. ► A selective ligand (sodium thiosulphate) is introduced for species separation. ► A limit of detection of 5 ng Au-NP per L is achieved for aqueous samples. ► Measurement of samples with high natural organic mater content is possible. ► Real water samples including wastewater treatment plant effluent were analyzed. - Abstract: The determination of metallic nanoparticles in environmental samples requires sample pretreatment that ideally combines pre-concentration and species selectivity. With cloud point extraction (CPE) using the surfactant Triton X-114 we present a simple and cost effective separation technique that meets both criteria. Effective separation of ionic gold species and Au nanoparticles (Au-NPs) is achieved by using sodium thiosulphate as a complexing agent. The extraction efficiency for Au-NP ranged from 1.01 ± 0.06 (particle size 2 nm) to 0.52 ± 0.16 (particle size 150 nm). An enrichment factor of 80 and a low limit of detection of 5 ng L −1 is achieved using electrothermal atomic absorption spectrometry (ET-AAS) for quantification. TEM measurements showed that the particle size is not affected by the CPE process. Natural organic matter (NOM) is tolerated up to a concentration of 10 mg L −1 . The precision of the method expressed as the standard deviation of 12 replicates at an Au-NP concentration of 100 ng L −1 is 9.5%. A relation between particle concentration and the extraction efficiency was not observed. Spiking experiments showed a recovery higher than 91% for environmental water samples.

  5. Collaborative Indoor Access Point Localization Using Autonomous Mobile Robot Swarm.

    Science.gov (United States)

    Awad, Fahed; Naserllah, Muhammad; Omar, Ammar; Abu-Hantash, Alaa; Al-Taj, Abrar

    2018-01-31

    Localization of access points has become an important research problem due to the wide range of applications it addresses such as dismantling critical security threats caused by rogue access points or optimizing wireless coverage of access points within a service area. Existing proposed solutions have mostly relied on theoretical hypotheses or computer simulation to demonstrate the efficiency of their methods. The techniques that rely on estimating the distance using samples of the received signal strength usually assume prior knowledge of the signal propagation characteristics of the indoor environment in hand and tend to take a relatively large number of uniformly distributed random samples. This paper presents an efficient and practical collaborative approach to detect the location of an access point in an indoor environment without any prior knowledge of the environment. The proposed approach comprises a swarm of wirelessly connected mobile robots that collaboratively and autonomously collect a relatively small number of non-uniformly distributed random samples of the access point's received signal strength. These samples are used to efficiently and accurately estimate the location of the access point. The experimental testing verified that the proposed approach can identify the location of the access point in an accurate and efficient manner.
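
    For comparison with the collaborative scheme above, the crudest RSS-based estimator is a weighted centroid of the sampling positions (a rough sketch in Python; the positions and RSS readings are invented): converting dBm to linear power lets strong samples dominate:

        import numpy as np

        def weighted_centroid(positions, rss_dbm):
            # Access-point estimate from sampling positions and RSS values:
            # stronger (less negative) readings pull the estimate harder.
            p = np.asarray(positions, float)
            w = 10 ** (np.asarray(rss_dbm, float) / 10.0)  # dBm -> linear power
            return (w[:, None] * p).sum(axis=0) / w.sum()

        # Hypothetical samples collected by three robots around an AP near (5, 5).
        pos = [(2, 3), (7, 4), (5, 8), (4, 5), (6, 6)]
        rss = [-62, -58, -60, -50, -52]
        print(weighted_centroid(pos, rss))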

  6. Extreme values, regular variation and point processes

    CERN Document Server

    Resnick, Sidney I

    1987-01-01

    Extreme Values, Regular Variation and Point Processes is a readable and efficient account of the fundamental mathematical and stochastic process techniques needed to study the behavior of extreme values of phenomena based on independent and identically distributed random variables and vectors. It presents a coherent treatment of the distributional and sample path fundamental properties of extremes and records. It emphasizes the core primacy of three topics necessary for understanding extremes: the analytical theory of regularly varying functions; the probabilistic theory of point processes and random measures; and the link to asymptotic distribution approximations provided by the theory of weak convergence of probability measures in metric spaces. The book is self-contained and requires an introductory measure-theoretic course in probability as a prerequisite. Almost all sections have an extensive list of exercises which extend developments in the text, offer alternate approaches, test mastery and provide for enj...

  7. Myofascial trigger point-focused head and neck massage for recurrent tension-type headache: A randomized, placebo-controlled clinical trial

    Science.gov (United States)

    Moraska, Albert F.; Stenerson, Lea; Butryn, Nathan; Krutsch, Jason P.; Schmiege, Sarah J.; Mann, J. Douglas

    2014-01-01

    Objective Myofascial trigger points (MTrPs) are focal disruptions in skeletal muscle that can refer pain to the head and reproduce the pain patterns of tension-type headache (TTH). The present study applied massage focused on MTrPs of subjects with TTH in a placebo-controlled clinical trial to assess efficacy in reducing headache pain. Methods Fifty-six subjects with TTH were randomized to receive 12 massage or placebo (detuned ultrasound) sessions over six weeks, or to a wait-list. Trigger point release (TPR) massage focused on MTrPs in the cervical musculature. Headache pain (frequency, intensity and duration) was recorded in a daily headache diary. Additional outcome measures included self-report of perceived clinical change in headache pain and pressure-pain threshold (PPT) at MTrPs in the upper trapezius and sub-occipital muscles. Results From diary recordings, group differences across time were detected in headache frequency (p=0.026), but not for intensity or duration. Post hoc analysis indicated headache frequency decreased from baseline for both the massage and placebo groups; self-reported clinical change in headache pain was greater for massage than for the placebo or wait-list groups (p=0.002). PPT improved in all muscles tested for massage only. These results suggest that 1) massage directed at MTrPs can benefit the treatment of TTH, and 2) TTH, like other chronic conditions, is responsive to placebo. Clinical trials on headache that do not include a placebo group are at risk of overestimating the specific contribution from the active intervention. PMID:25329141

  8. Laser assisted fabrication of random rough surfaces for optoelectronics

    Energy Technology Data Exchange (ETDEWEB)

    Brissonneau, V., E-mail: vincent.brissonneau@im2np.fr [Thales Optronique SA, Avenue Gay-Lussac, 78995 Elancourt (France); Institut Materiaux Microelectronique Nanosciences de Provence, Aix Marseille Universite, Avenue Escadrille Normandie Niemen, 13397 Marseille (France); Escoubas, L. [Institut Materiaux Microelectronique Nanosciences de Provence, Aix Marseille Universite, Avenue Escadrille Normandie Niemen, 13397 Marseille (France); Flory, F. [Institut Materiaux Microelectronique Nanosciences de Provence, Ecole Centrale Marseille, Marseille (France); Berginc, G. [Thales Optronique SA, Avenue Gay-Lussac, 78995 Elancourt (France); Maire, G.; Giovannini, H. [Institut Fresnel, Aix Marseille Universite, Avenue Escadrille Normandie Niemen, 13397 Marseille (France)

    2012-09-15

    Highlights: • Random rough surfaces are photofabricated using an argon ion laser. • Speckle and surface correlation function are linked. • Exposure beam is modified, allowing tuning of the correlation. • Theoretical examples are presented. • Experimental results are compared with theoretical expectation. - Abstract: Optical surface structuring is of great interest for antireflective or scattering properties. Generally, fabricated surface structures are periodical, but random surfaces offer new degrees of freedom and possibilities through the control of their statistical properties. We propose an experimental method to create random rough surfaces on silicon by laser processing followed by etching. A photoresist is spin coated onto a silicon substrate and then exposed to the scattering of a modified laser beam. The beam modification is performed by using a micromirror matrix allowing laser beam shaping. An example of tuning is presented. An image composed of two white circles with a black background is displayed and the theoretical shape of the correlation is calculated. Experimental surfaces are elaborated and the correlation function calculated from height mapping. We finally compare the experimental and theoretical correlation functions.

  9. Comparison of confirmed inactive and randomly selected compounds as negative training examples in support vector machine-based virtual screening.

    Science.gov (United States)

    Heikamp, Kathrin; Bajorath, Jürgen

    2013-07-22

    The choice of negative training data for machine learning is a little explored issue in chemoinformatics. In this study, the influence of alternative sets of negative training data and different background databases on support vector machine (SVM) modeling and virtual screening has been investigated. Target-directed SVM models have been derived on the basis of differently composed training sets containing confirmed inactive molecules or randomly selected database compounds as negative training instances. These models were then applied to search background databases consisting of biological screening data or randomly assembled compounds for available hits. Negative training data were found to systematically influence compound recall in virtual screening. In addition, different background databases had a strong influence on the search results. Our findings also indicated that typical benchmark settings lead to an overestimation of SVM-based virtual screening performance compared to search conditions that are more relevant for practical applications.

  10. Variable Selection in Time Series Forecasting Using Random Forests

    Directory of Open Access Journals (Sweden)

    Hristos Tyralis

    2017-10-01

    Full Text Available Time series forecasting using machine learning algorithms has gained popularity recently. Random forest is a machine learning algorithm implemented in time series forecasting; however, most of its forecasting properties have remained unexplored. Here we focus on assessing the performance of random forests (RF) in one-step forecasting using two large datasets of short time series, with the aim of suggesting an optimal set of predictor variables. Furthermore, we compare its performance to benchmarking methods. The first dataset is composed of 16,000 simulated time series from a variety of Autoregressive Fractionally Integrated Moving Average (ARFIMA) models. The second dataset consists of 135 mean annual temperature time series. The highest predictive performance of RF is observed when using a low number of recent lagged predictor variables. This outcome could be useful in relevant future applications, with the prospect of achieving higher predictive accuracy.
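
    As a brief illustration of the setup described above, the following sketch (ours, not the study's code; the toy series and all parameter values are illustrative) fits a random forest on a small number of recent lags and issues a one-step forecast:

      # One-step-ahead forecasting with a random forest on a few recent lags.
      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      def lagged_matrix(series, n_lags):
          # row j holds [x_j, ..., x_{j+n_lags-1}], target is x_{j+n_lags}
          X = np.column_stack([series[i:len(series) - n_lags + i]
                               for i in range(n_lags)])
          return X, series[n_lags:]

      rng = np.random.default_rng(1)
      series = np.sin(np.linspace(0, 20, 300)) + rng.normal(0, 0.1, 300)
      X, y = lagged_matrix(series, n_lags=3)      # low number of recent lags
      rf = RandomForestRegressor(n_estimators=300, random_state=0)
      rf.fit(X[:-1], y[:-1])                      # hold out the last step
      print("forecast:", rf.predict(X[-1:])[0], "actual:", y[-1])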

  11. The Long-Term Effectiveness of a Selective, Personality-Targeted Prevention Program in Reducing Alcohol Use and Related Harms: A Cluster Randomized Controlled Trial

    Science.gov (United States)

    Newton, Nicola C.; Conrod, Patricia J.; Slade, Tim; Carragher, Natacha; Champion, Katrina E.; Barrett, Emma L.; Kelly, Erin V.; Nair, Natasha K.; Stapinski, Lexine; Teesson, Maree

    2016-01-01

    Background: This study investigated the long-term effectiveness of Preventure, a selective personality-targeted prevention program, in reducing the uptake of alcohol, harmful use of alcohol, and alcohol-related harms over a 3-year period. Methods: A cluster randomized controlled trial was conducted to assess the effectiveness of Preventure.…

  12. A Thin Plate Spline-Based Feature-Preserving Method for Reducing Elevation Points Derived from LiDAR

    Directory of Open Access Journals (Sweden)

    Chuanfa Chen

    2015-09-01

    Full Text Available Light detection and ranging (LiDAR) is currently one of the most important techniques for collecting high-density elevation points in the context of digital elevation model (DEM) construction. However, the high density data always leads to serious time and memory consumption problems in data processing. In this paper, we have developed a thin plate spline (TPS) based feature-preserving (TPS-F) method for LiDAR-derived ground data reduction, which selects a certain amount of significant terrain points and extracts geomorphological features from the raw dataset to keep the accuracy of the constructed DEMs as high as possible while maximally preserving terrain features. We employed four study sites with different topographies (i.e., flat, undulating, hilly and mountainous terrains) to analyze the performance of TPS-F for LiDAR data reduction in the context of DEM construction. These results were compared with those of the TPS-based algorithm without features (TPS-W) and two classical data selection methods, namely maximum z-tolerance (Max-Z) and the random method. Results show that irrespective of terrain characteristics, the two versions of the TPS-based approach (i.e., TPS-F and TPS-W) are always more accurate than the classical methods in terms of error range and root mean square error. Moreover, in terms of streamline matching rate (SMR), TPS-F has a better ability to preserve geomorphological features, especially for mountainous terrain. For example, the average SMR of TPS-F is 89.2% in the mountainous area, while those of TPS-W, Max-Z and the random method are 56.6%, 34.7% and 35.3%, respectively.
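
    The core TPS machinery behind such reduction methods is easy to demonstrate. The sketch below is an illustration under our own assumptions, not the paper's TPS-F implementation (which additionally extracts geomorphological features): it fits a thin plate spline to a candidate reduced subset and scores the reconstruction error on the full point set.

      # Fit a TPS to a reduced subset and measure reconstruction error.
      import numpy as np
      from scipy.interpolate import RBFInterpolator

      rng = np.random.default_rng(0)
      xy = rng.uniform(0, 100, size=(500, 2))
      z = np.sin(xy[:, 0] / 15.0) * np.cos(xy[:, 1] / 20.0)   # toy terrain

      subset = rng.choice(500, size=100, replace=False)       # candidate reduction
      tps = RBFInterpolator(xy[subset], z[subset], kernel='thin_plate_spline')
      rmse = np.sqrt(np.mean((tps(xy) - z) ** 2))
      print("RMSE of the reduced set:", rmse)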

  13. On the spectral properties of random finite difference operators

    International Nuclear Information System (INIS)

    Kunz, H.; Souillard, B.

    1980-01-01

    We study a class of random finite difference operators, a typical example of which is the finite difference Schroedinger operator with a random potential which arises in solid state physics in the tight binding approximation. We obtain with probability one, in various situations, the exact location of the spectrum, and criteria for a given part of the spectrum to be pure point or purely continuous, or for the static electric conductivity to vanish. A general formalism is developed which transforms the study of these random operators into that of the asymptotics of a multiple integral constructed from a given recipe. Finally we apply our criteria and formalism to prove that, with probability one, the one-dimensional finite difference Schroedinger operator with a random potential has pure point spectrum and develops no static conductivity. (orig.)

  14. Generation and Analysis of Constrained Random Sampling Patterns

    DEFF Research Database (Denmark)

    Pierzchlewski, Jacek; Arildsen, Thomas

    2016-01-01

    Random sampling is a technique for signal acquisition which is gaining popularity in practical signal processing systems. Nowadays, event-driven analog-to-digital converters make random sampling feasible in practical applications. A process of random sampling is defined by a sampling pattern, which...... indicates signal sampling points in time. Practical random sampling patterns are constrained by ADC characteristics and application requirements. In this paper, we introduce statistical methods which evaluate random sampling pattern generators with emphasis on practical applications. Furthermore, we propose...... an algorithm that generates random sampling patterns dedicated to event-driven ADCs better than existing sampling pattern generators. Finally, implementation issues of random sampling patterns are discussed....
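
    A minimal example of a constrained pattern generator in the spirit of the abstract (our sketch; the constraint chosen here, a minimum spacing between sampling instants as imposed by a real ADC, is illustrative):

      # Draw random sampling instants subject to a minimum-gap constraint.
      # Rejection sampling: may loop for a long time if over-constrained.
      import numpy as np

      def constrained_pattern(n_points, t_max, t_min_gap, rng):
          picks = []
          while len(picks) < n_points:
              t = rng.uniform(0, t_max)
              if all(abs(t - p) >= t_min_gap for p in picks):
                  picks.append(t)
          return np.sort(picks)

      rng = np.random.default_rng(0)
      print(constrained_pattern(10, t_max=1e-3, t_min_gap=2e-5, rng=rng))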

  15. Affinity selection of Nipah and Hendra virus-related vaccine candidates from a complex random peptide library displayed on bacteriophage virus-like particles

    Energy Technology Data Exchange (ETDEWEB)

    Peabody, David S.; Chackerian, Bryce; Ashley, Carlee; Carnes, Eric; Negrete, Oscar

    2017-01-24

    The invention relates to virus-like particles of bacteriophage MS2 (MS2 VLPs) displaying peptide epitopes or peptide mimics of epitopes of the Nipah Virus envelope glycoprotein that elicit an immune response against Nipah Virus upon vaccination of humans or animals. Affinity selection on Nipah Virus-neutralizing monoclonal antibodies using random sequence peptide libraries on MS2 VLPs selected peptides with sequence similarity to peptide sequences found within the envelope glycoprotein of Nipah itself, thus identifying the epitopes the antibodies recognize. The selected peptide sequences themselves are not necessarily identical in all respects to a sequence within the Nipah Virus glycoprotein, and therefore may be referred to as epitope mimics. VLPs displaying these epitope mimics can serve as a vaccine. On the other hand, display of the corresponding wild-type sequence derived from Nipah Virus, corresponding to the epitope mapped by affinity selection, may also be used as a vaccine.

  16. Collaborative Indoor Access Point Localization Using Autonomous Mobile Robot Swarm

    Directory of Open Access Journals (Sweden)

    Fahed Awad

    2018-01-01

    Full Text Available Localization of access points has become an important research problem due to the wide range of applications it addresses such as dismantling critical security threats caused by rogue access points or optimizing wireless coverage of access points within a service area. Existing proposed solutions have mostly relied on theoretical hypotheses or computer simulation to demonstrate the efficiency of their methods. The techniques that rely on estimating the distance using samples of the received signal strength usually assume prior knowledge of the signal propagation characteristics of the indoor environment at hand and tend to take a relatively large number of uniformly distributed random samples. This paper presents an efficient and practical collaborative approach to detect the location of an access point in an indoor environment without any prior knowledge of the environment. The proposed approach comprises a swarm of wirelessly connected mobile robots that collaboratively and autonomously collect a relatively small number of non-uniformly distributed random samples of the access point’s received signal strength. These samples are used to efficiently and accurately estimate the location of the access point. The experimental testing verified that the proposed approach can identify the location of the access point in an accurate and efficient manner.
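
    The estimation step at the heart of such RSS-based localization can be sketched in a few lines. The code below is our illustration, not the authors' swarm implementation: it assumes a log-distance path-loss model, rss = p0 - 10*n*log10(d), with unknown p0 and n, and solves for the access point position by nonlinear least squares over non-uniformly distributed samples.

      # Estimate an access point's (x, y) from scattered RSS samples.
      import numpy as np
      from scipy.optimize import least_squares

      def residuals(theta, xy, rss):
          x, y, p0, n = theta
          d = np.hypot(xy[:, 0] - x, xy[:, 1] - y) + 1e-9
          return rss - (p0 - 10.0 * n * np.log10(d))

      def locate_ap(xy, rss):
          guess = [xy[:, 0].mean(), xy[:, 1].mean(), -30.0, 2.0]
          return least_squares(residuals, guess, args=(xy, rss)).x[:2]

      # Samples clustered near the AP (non-uniform); true AP at (3, 4).
      rng = np.random.default_rng(0)
      pts = rng.normal([3, 4], 2.0, size=(40, 2))
      d = np.hypot(pts[:, 0] - 3, pts[:, 1] - 4) + 1e-9
      rss = -30 - 20.0 * np.log10(d) + rng.normal(0, 1, 40)
      print(locate_ap(pts, rss))        # close to [3, 4]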

  17. Automating Risk Assessments of Hazardous Material Shipments for Transportation Routes and Mode Selection

    International Nuclear Information System (INIS)

    Dolphin, Barbara H.; Richins, William D.; Novascone, Stephen R.

    2010-01-01

    The METEOR project at Idaho National Laboratory (INL) successfully addresses the difficult problem in risk assessment analyses of combining the results from bounding deterministic simulations with probabilistic (Monte Carlo) risk assessment techniques. This paper describes a software suite designed to perform sensitivity and cost/benefit analyses on selected transportation routes and vehicles to minimize the risk associated with the shipment of hazardous materials. METEOR uses Monte Carlo techniques to estimate the probability of an accidental release of a hazardous substance along a proposed transportation route. A METEOR user selects the mode of transportation, origin and destination points, and charts the route using interactive graphics. Inputs to METEOR (many selections built in) include crash rates for the specific aircraft, soil/rock type and population densities over the proposed route, and bounding limits for potential accident types (velocity, temperature, etc.). New vehicle, materials, and location data are added when available. If the risk estimates are unacceptable, the risks associated with alternate transportation modes or routes can be quickly evaluated and compared. Systematic optimizing methods will provide the user with the route and vehicle selection identified with the lowest risk of hazardous material release. The effects of a selected range of potential accidents such as vehicle impact, fire, fuel explosions, excessive containment pressure, flooding, etc. are evaluated primarily using hydrocodes capable of accurately simulating the material response of critical containment components. Bounding conditions that represent credible accidents (e.g., for an impact event: velocity, orientations, and soil conditions) are used as input parameters to the hydrocode models, yielding correlation functions relating accident parameters to component damage. The Monte Carlo algorithms use random number generators to make selections at the various decision points.

  18. Computer simulation of vortex pinning in type II superconductors. II. Random point pins

    International Nuclear Information System (INIS)

    Brandt, E.H.

    1983-01-01

    Pinning of vortices in a type II superconductor by randomly positioned identical point pins is simulated using the two-dimensional method described in a previous paper (Part I). The system is characterized by the vortex and pin numbers (N_v, N_p), the vortex and pin interaction ranges (R_v, R_p), and the amplitude of the pin potential A_p. The computation is performed for many cases: dilute or dense, sharp or soft, attractive or repulsive, weak or strong pins, and ideal or amorphous vortex lattice. The total pinning force F as a function of the mean vortex displacement X increases first linearly (over a distance usually much smaller than the vortex spacing and than R_p) and then saturates, fluctuating about its average F̄. We interpret F̄ as the maximum pinning force j_c·B of a large specimen. For weak pins the prediction of Larkin and Ovchinnikov for two-dimensional collective pinning is confirmed: F̄ = const · W̄/(R_p · c_66), where W̄ is the mean square pinning force and c_66 is the shear modulus of the vortex lattice. If the initial vortex lattice is chosen highly defective ("amorphous"), the constant is 1.3-3 times larger than for the ideal triangular lattice. This finding may explain the often observed "history effect." The function F̄(A_p) exhibits a jump, which for dilute, sharp, attractive pins occurs close to the "threshold value" predicted for isolated pins by Labusch. This jump reflects the onset of plastic deformation of the vortex lattice, and in some cases of vortex trapping, but is not a genuine threshold.

  19. Fixed-Point Configurable Hardware Components

    Directory of Open Access Journals (Sweden)

    Rocher Romuald

    2006-01-01

    Full Text Available To reduce the gap between VLSI technology capability and designer productivity, design reuse based on IP (intellectual property) blocks is commonly used. In terms of arithmetic accuracy, the generated architecture can generally only be configured through the input and output word lengths. In this paper, a new method to optimize fixed-point arithmetic IPs is proposed. The architecture cost is minimized under accuracy constraints defined by the user. Our approach allows exploring the fixed-point search space and the algorithm-level search space to select the optimized structure and fixed-point specification. To significantly reduce the optimization and design times, analytical models are used for the fixed-point optimization process.

  20. Csf Based Non-Ground Points Extraction from LIDAR Data

    Science.gov (United States)

    Shen, A.; Zhang, W.; Shi, H.

    2017-09-01

    Region growing is a classical method of point cloud segmentation. Based on the idea of collecting pixels with similar properties to form regions, region growing is widely used in many fields such as medicine, forestry and remote sensing. In this algorithm there are two core problems: one is the selection of seed points, the other is the setting of the growth constraints, of which the selection of the seed points is the foundation. In this paper, we propose a CSF (Cloth Simulation Filtering) based method to extract the non-ground seed points effectively. The experiments have shown that this method can obtain a reliable group of seed points compared with traditional methods. It is a new attempt at extracting seed points.
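
    For contrast with the CSF-based selector, a classical grid-minimum seed selector can be written in a few lines (our sketch; the cell size and the lowest-point heuristic are illustrative, and the paper's cloth simulation step is not shown):

      # Pick the lowest point per XY grid cell as a candidate ground seed.
      import numpy as np

      def grid_min_seeds(points, cell=5.0):
          keys = np.floor(points[:, :2] / cell).astype(int)
          seeds = {}
          for p, k in zip(points, map(tuple, keys)):
              if k not in seeds or p[2] < seeds[k][2]:
                  seeds[k] = p
          return np.array(list(seeds.values()))

      pts = np.random.default_rng(0).uniform(0, 50, size=(1000, 3))
      print(grid_min_seeds(pts).shape)      # one seed per occupied cell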

  1. Investigation of Random Switching Driven by a Poisson Point Process

    DEFF Research Database (Denmark)

    Simonsen, Maria; Schiøler, Henrik; Leth, John-Josef

    2015-01-01

    This paper investigates the switching mechanism of a two-dimensional switched system, when the switching events are generated by a Poisson point process. A model, in the shape of a stochastic process, for such a system is derived and the distribution of the trajectory's position is developed...... together with marginal density functions for the coordinate functions. Furthermore, the joint probability distribution is given explicitly....

  2. Random function representation of stationary stochastic vector processes for probability density evolution analysis of wind-induced structures

    Science.gov (United States)

    Liu, Zhangjun; Liu, Zenghui

    2018-06-01

    This paper develops a hybrid approach of spectral representation and random function for simulating stationary stochastic vector processes. In the proposed approach, the high-dimensional random variables included in the original spectral representation (OSR) formula can be effectively reduced to only two elementary random variables by introducing random functions that serve as random constraints. Based on this, a satisfactory simulation accuracy can be guaranteed by selecting a small representative point set of the elementary random variables. The probability information of the stochastic excitations can be fully captured through just several hundred sample functions generated by the proposed approach. Therefore, combined with the probability density evolution method (PDEM), it is possible to implement dynamic response analysis and reliability assessment of engineering structures. For illustrative purposes, a stochastic turbulence wind velocity field acting on a frame-shear-wall structure is simulated by constructing three types of random functions to demonstrate the accuracy and efficiency of the proposed approach. Careful and in-depth studies concerning the probability density evolution analysis of the wind-induced structure have been conducted so as to better illustrate the application prospects of the proposed approach. Numerical examples also show that the proposed approach possesses good robustness.
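
    The classical spectral representation the paper starts from has a compact form: X(t) = Σ_k sqrt(2·S(ω_k)·Δω) · cos(ω_k·t + φ_k) with independent random phases φ_k ~ U(0, 2π). A one-dimensional sketch follows (ours; the target spectrum is a toy choice, and the paper's random-function dimension reduction is not shown):

      # Simulate a stationary process by the spectral representation method.
      import numpy as np

      def spectral_sample(S, w_max, n_freq, t, rng):
          w = np.linspace(w_max / n_freq, w_max, n_freq)   # frequency grid
          dw = w[1] - w[0]
          phi = rng.uniform(0, 2 * np.pi, n_freq)          # random phases
          amp = np.sqrt(2 * S(w) * dw)
          return (amp[None, :] * np.cos(np.outer(t, w) + phi)).sum(axis=1)

      S = lambda w: 1.0 / (1.0 + w ** 2)                   # toy target spectrum
      t = np.linspace(0, 10, 1000)
      x = spectral_sample(S, w_max=20.0, n_freq=512, t=t,
                          rng=np.random.default_rng(0))
      print(x.std())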

  3. Fast selection of miRNA candidates based on large-scale pre-computed MFE sets of randomized sequences.

    Science.gov (United States)

    Warris, Sven; Boymans, Sander; Muiser, Iwe; Noback, Michiel; Krijnen, Wim; Nap, Jan-Peter

    2014-01-13

    Small RNAs are important regulators of genome function, yet their prediction in genomes is still a major computational challenge. Statistical analyses of pre-miRNA sequences indicated that their 2D structure tends to have a minimal free energy (MFE) significantly lower than the MFE values of equivalently randomized sequences with the same nucleotide composition, in contrast to other classes of non-coding RNA. The computation of many MFEs is, however, too intensive to allow for genome-wide screenings. Using a local grid infrastructure, MFE distributions of random sequences were pre-calculated on a large scale. These distributions follow a normal distribution and can be used to determine the MFE distribution for any given sequence composition by interpolation. This allows on-the-fly calculation of the normal distribution for any candidate sequence composition. The speedup achieved makes genome-wide screening with this characteristic of a pre-miRNA sequence practical. Although this particular property alone is not sufficiently discriminative to distinguish miRNAs from other sequences, the MFE-based P-value should be added to the parameters of choice to be included in the selection of potential miRNA candidates for experimental verification.
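
    The scoring step this enables is simple: with the pre-computed mean and standard deviation of the MFE distribution for a given sequence composition, a candidate is ranked by its normal P-value. A sketch (ours; the numbers are invented for illustration, and the lookup of the pre-computed moments is assumed):

      # Normal-distribution P-value of a candidate's MFE against the
      # pre-computed background for its nucleotide composition.
      import math

      def mfe_p_value(mfe, mean_random, std_random):
          z = (mfe - mean_random) / std_random
          return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

      # A hairpin at -35 kcal/mol against a background of -20 +/- 5:
      print(mfe_p_value(-35.0, -20.0, 5.0))   # ~0.0013, miRNA-like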

  4. MODELLING AND SIMULATION OF A NEUROPHYSIOLOGICAL EXPERIMENT BY SPATIO-TEMPORAL POINT PROCESSES

    Directory of Open Access Journals (Sweden)

    Viktor Beneš

    2011-05-01

    Full Text Available We present a stochastic model of an experiment monitoring the spiking activity of a place cell of the hippocampus of an experimental animal moving in an arena. A doubly stochastic spatio-temporal point process is used to model and quantify overdispersion. The stochastic intensity is modelled by a Lévy-based random field, while the animal path is simplified to a discrete random walk. In a simulation study, first a method suggested previously is used. Then it is shown that a solution of the filtering problem yields the desired inference for the random intensity. Two approaches are suggested, and the new one, based on finite point process density, is applied. Using Markov chain Monte Carlo we obtain numerical results from the simulated model. The methodology is discussed.

  5. Scaling Argument of Anisotropic Random Walk

    International Nuclear Information System (INIS)

    Xu Bingzhen; Jin Guojun; Wang Feifeng

    2005-01-01

    In this paper, we analytically discuss the scaling properties of the average square end-to-end distance ⟨R²⟩ for an anisotropic random walk in D-dimensional space (D ≥ 2), and the returning probability P_n(r_0) for the walker to return into a certain neighborhood of the origin. We not only give the calculating formulas for ⟨R²⟩ and P_n(r_0), but also point out that if there is a symmetric axis for the distribution of the probability density of a single step displacement, we always obtain ⟨R_{⊥,n}²⟩ ∼ n, where ⊥ refers to the projections of the displacement perpendicular to each symmetric axis of the walk; in D-dimensional space with D mutually perpendicular symmetric axes, we always have ⟨R_n²⟩ ∼ n and the random walk will be like a purely random motion; if the number of mutually perpendicular symmetric axes is smaller than the dimension of the space, we must have ⟨R_n²⟩ ∼ n² for very large n and the walk will be like a ballistic motion. It is worthwhile to point out that unlike the isotropic random walk in one and two dimensions, which is certain to return into the neighborhood of the origin, there is generally only a nonzero probability for the anisotropic random walker in two dimensions to return to the neighborhood.
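
    The two scaling regimes are easy to check numerically. The following sketch (ours) compares a symmetric 2-D walk with a drifted one, estimating the exponent of ⟨R_n²⟩ ∼ n^α from a log-log fit:

      # Estimate the scaling exponent of <R_n^2> for symmetric vs drifted walks.
      import numpy as np

      rng = np.random.default_rng(0)
      n, walkers = 1000, 2000
      iso = rng.normal(0, 1, (walkers, n, 2))         # symmetric steps
      aniso = iso + np.array([0.5, 0.0])              # constant drift
      for name, steps in [("isotropic", iso), ("anisotropic", aniso)]:
          R2 = (steps.cumsum(axis=1) ** 2).sum(axis=2).mean(axis=0)
          expo = np.polyfit(np.log(np.arange(1, n + 1)), np.log(R2), 1)[0]
          print(name, "exponent ~", round(expo, 2))   # ~1 (diffusive) vs ->2 (ballistic)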

  6. 47 CFR 73.158 - Directional antenna monitoring points.

    Science.gov (United States)

    2010-10-01

    ... construction or other disturbances to the measured field, an application to change the monitoring point location, including FCC Form 302-AM, is to be promptly submitted to the FCC in Washington, DC. (1) If the..., the licensee shall select a new monitoring point from the points measured in the last full proof of...

  7. Lectures on random interfaces

    CERN Document Server

    Funaki, Tadahisa

    2016-01-01

    Interfaces are created to separate two distinct phases in a situation in which phase coexistence occurs. This book discusses randomly fluctuating interfaces in several different settings and from several points of view: discrete/continuum, microscopic/macroscopic, and static/dynamic theories. The following four topics in particular are dealt with in the book. Assuming that the interface is represented as a height function measured from a fixed-reference discretized hyperplane, the system is governed by the Hamiltonian of gradient of the height functions. This is a kind of effective interface model called ∇φ-interface model. The scaling limits are studied for Gaussian (or non-Gaussian) random fields with a pinning effect under a situation in which the rate functional of the corresponding large deviation principle has non-unique minimizers. Young diagrams determine decreasing interfaces, and their dynamics are introduced. The large-scale behavior of such dynamics is studied from the points of view of the hyd...

  8. Varying levels of difficulty index of skills-test items randomly selected by examinees on the Korean emergency medical technician licensing examination.

    Science.gov (United States)

    Koh, Bongyeun; Hong, Sunggi; Kim, Soon-Sim; Hyun, Jin-Sook; Baek, Milye; Moon, Jundong; Kwon, Hayran; Kim, Gyoungyong; Min, Seonggi; Kang, Gu-Hyun

    2016-01-01

    The goal of this study was to characterize the difficulty index of the items in the skills test components of the class I and II Korean emergency medical technician licensing examination (KEMTLE), which requires examinees to select items randomly. The results of 1,309 class I KEMTLE examinations and 1,801 class II KEMTLE examinations in 2013 were subjected to analysis. Items from the basic and advanced skills test sections of the KEMTLE were compared to determine whether some were significantly more difficult than others. In the class I KEMTLE, all 4 of the items on the basic skills test showed significant variation in difficulty index (P<0.01), as well as 4 of the 5 items on the advanced skills test (P<0.05). In the class II KEMTLE, 4 of the 5 items on the basic skills test showed significantly different difficulty index (P<0.01), as well as all 3 of the advanced skills test items (P<0.01). In the skills test components of the class I and II KEMTLE, the procedure in which examinees randomly select questions should be revised to require examinees to respond to a set of fixed items in order to improve the reliability of the national licensing examination.

  9. Varying levels of difficulty index of skills-test items randomly selected by examinees on the Korean emergency medical technician licensing examination

    Directory of Open Access Journals (Sweden)

    Bongyeun Koh

    2016-01-01

    Full Text Available Purpose: The goal of this study was to characterize the difficulty index of the items in the skills test components of the class I and II Korean emergency medical technician licensing examination (KEMTLE), which requires examinees to select items randomly. Methods: The results of 1,309 class I KEMTLE examinations and 1,801 class II KEMTLE examinations in 2013 were subjected to analysis. Items from the basic and advanced skills test sections of the KEMTLE were compared to determine whether some were significantly more difficult than others. Results: In the class I KEMTLE, all 4 of the items on the basic skills test showed significant variation in difficulty index (P<0.01), as well as 4 of the 5 items on the advanced skills test (P<0.05). In the class II KEMTLE, 4 of the 5 items on the basic skills test showed significantly different difficulty indices (P<0.01), as well as all 3 of the advanced skills test items (P<0.01). Conclusion: In the skills test components of the class I and II KEMTLE, the procedure in which examinees randomly select questions should be revised to require examinees to respond to a set of fixed items in order to improve the reliability of the national licensing examination.

  10. Cross-point-type spin-transfer-torque magnetoresistive random access memory cell with multi-pillar vertical body channel MOSFET

    Science.gov (United States)

    Sasaki, Taro; Endoh, Tetsuo

    2018-04-01

    In this paper, from the viewpoint of cell size and sensing margin, the impact of a novel cross-point-type one transistor and one magnetic tunnel junction (1T-1MTJ) spin-transfer-torque magnetoresistive random access memory (STT-MRAM) cell with a multi-pillar vertical body channel (BC) MOSFET is shown for high density and wide sensing margin STT-MRAM, with a 10 ns writing period and 1.2 V V_DD. For that purpose, all combinations of n/p-type MOSFETs and bottom/top-pin MTJs are compared, where the diameter of the MTJ (D_MTJ) is scaled down from 55 to 15 nm and the tunnel magnetoresistance (TMR) ratio is increased from 100 to 200%. The results show that, benefiting from the proposed STT-MRAM cell with no back bias effect, an MTJ with a high TMR ratio (200%) can be used in the design of smaller STT-MRAM cells (over 72.6% cell size reduction), which is a difficult task for conventional planar MOSFET based designs.

  11. Imaging atomic-level random walk of a point defect in graphene

    Science.gov (United States)

    Kotakoski, Jani; Mangler, Clemens; Meyer, Jannik C.

    2014-05-01

    Deviations from the perfect atomic arrangements in crystals play an important role in affecting their properties. Similarly, diffusion of such deviations is behind many microstructural changes in solids. However, observation of point defect diffusion is hindered both by the difficulties related to direct imaging of non-periodic structures and by the timescales involved in the diffusion process. Here, instead of imaging thermal diffusion, we stimulate and follow the migration of a divacancy through graphene lattice using a scanning transmission electron microscope operated at 60 kV. The beam-activated process happens on a timescale that allows us to capture a significant part of the structural transformations and trajectory of the defect. The low voltage combined with ultra-high vacuum conditions ensure that the defect remains stable over long image sequences, which allows us for the first time to directly follow the diffusion of a point defect in a crystalline material.

  12. Constraints on grip selection in hemiparetic cerebral palsy: effects of lesional side, end-point accuracy, and context.

    Science.gov (United States)

    Steenbergen, Bert; Meulenbroek, Ruud G J; Rosenbaum, David A

    2004-04-01

    This study was concerned with selection criteria used for grip planning in adolescents with left or right hemiparetic cerebral palsy. In the first experiment, we asked participants to pick up a pencil and place the tip in a pre-defined target region. We varied the size of the target to test the hypothesis that increased end-point precision demands would favour the use of a grip that affords end-state comfort. In the second experiment, we studied grip planning in three task contexts that were chosen to let us test the hypothesis that a more functional task context would likewise promote the end-state comfort effect. When movements were performed with the impaired hand, we found that participants with right hemiparesis (i.e., left brain damage) aimed for postural comfort at the start rather than at the end of the object-manipulation phase in both experiments. By contrast, participants with left hemiparesis (i.e., right brain damage) did not favour a particular selection criterion with the impaired hand in the first experiment, but aimed for postural comfort at the start in the second experiment. When movements were performed with the unimpaired hand, grip selection criteria again differed for right and left hemiparetic participants. Participants with right hemiparesis did not favour a particular selection criterion with the unimpaired hand in the first experiment and only showed the end-state comfort effect in the most functional tasks of the second experiment. By contrast, participants with left hemiparesis showed the end-state comfort effect in all conditions of both experiments. These data suggest that the left hemisphere plays a special role in action planning, as has been recognized before, and that one of the deficits accompanying left brain damage is a deficit in forward movement planning, which has not been recognized before. Our findings have both theoretical and clinical implications.

  13. Focus point in dark matter selected high-scale supersymmetry

    Energy Technology Data Exchange (ETDEWEB)

    Zheng, Sibo [Department of Physics, Chongqing University,Chongqing, 401331 P.R. (China)

    2015-03-19

    In this paper, we explore conditions for the focus point in high-scale supersymmetry with weak-scale gaugino masses. In this context, the tension between naturalness and the LHC 2013 data on supersymmetry, as well as the cold dark matter candidate, are addressed simultaneously. It is shown that the observed Higgs mass can be accommodated in a wide class of new models, which are realized by employing non-minimal gauge mediation.

  15. Application of Vector Triggering Random Decrement

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Ibrahim, S. R.; Brincker, Rune

    1997-01-01

    This paper deals with applications of the vector triggering Random Decrement technique. This technique is new and developed with the aim of minimizing estimation time and identification errors. The theory behind the technique is discussed in an accompanying paper. The results presented in this paper should be regarded as a further documentation of the technique. The key point in Random Decrement estimation is the formulation of a triggering condition. If the triggering condition is fulfilled a time segment from each measurement is picked out and averaged with previous time segments. The final result is a Random Decrement function from each measurement. In traditional Random Decrement estimation the triggering condition is a scalar condition, which should only be fulfilled in a single measurement. In vector triggering Random Decrement the triggering condition is a vector condition...

  16. Selected CD133⁺ progenitor cells to promote angiogenesis in patients with refractory angina: final results of the PROGENITOR randomized trial.

    Science.gov (United States)

    Jimenez-Quevedo, Pilar; Gonzalez-Ferrer, Juan Jose; Sabate, Manel; Garcia-Moll, Xavier; Delgado-Bolton, Roberto; Llorente, Leopoldo; Bernardo, Esther; Ortega-Pozzi, Aranzazu; Hernandez-Antolin, Rosana; Alfonso, Fernando; Gonzalo, Nieves; Escaned, Javier; Bañuelos, Camino; Regueiro, Ander; Marin, Pedro; Fernandez-Ortiz, Antonio; Neves, Barbara Das; Del Trigo, Maria; Fernandez, Cristina; Tejerina, Teresa; Redondo, Santiago; Garcia, Eulogio; Macaya, Carlos

    2014-11-07

    Refractory angina constitutes a clinical problem. The aim of this study was to assess the safety and the feasibility of transendocardial injection of CD133(+) cells to foster angiogenesis in patients with refractory angina. In this randomized, double-blinded, multicenter controlled trial, eligible patients were treated with granulocyte colony-stimulating factor, underwent an apheresis and electromechanical mapping, and were randomized to receive treatment with CD133(+) cells or no treatment. The primary end point was the safety of transendocardial injection of CD133(+) cells, as measured by the occurrence of major adverse cardiac and cerebrovascular events at 6 months. Secondary end points analyzed the efficacy. Twenty-eight patients were included (n=19 treatment; n=9 control). At 6 months, 1 patient in each group had ventricular fibrillation and 1 patient in each group died. One patient (treatment group) had a cardiac tamponade during mapping. There were no significant differences between groups with respect to efficacy parameters; however, the comparison within groups showed a significant improvement in the number of angina episodes per month (median absolute difference, -8.5 [95% confidence interval, -15.0 to -4.0]) and in angina functional class in the treatment arm but not in the control group. At 6 months, only one single-photon emission computed tomography (SPECT) parameter, the summed score, improved significantly in the treatment group at rest and at stress (median absolute difference, -1.0 [95% confidence interval, -1.9 to -0.1]) but not in the control arm. Our findings support the feasibility and safety of transendocardial injection of CD133(+) cells in patients with refractory angina. The promising clinical results and favorable data observed in the SPECT summed score may set up the basis to test the efficacy of cell therapy in a larger randomized trial. © 2014 American Heart Association, Inc.

  17. The Wasteland of Random Supergravities

    OpenAIRE

    Marsh, David; McAllister, Liam; Wrase, Timm

    2011-01-01

    We show that in a general 𝒩 = 1 supergravity with N ≫ 1 scalar fields, an exponentially small fraction of the de Sitter critical points are metastable vacua. Taking the superpotential and Kahler potential to be random functions, we construct a random matrix model for the Hessian matrix, which is well-approximated by the sum of a Wigner matrix and two Wishart matrices. We compute the eigenvalue spectrum analytically from the free convolution of the constituent spectra and find that in ...

  18. Random walks in Euclidean space

    OpenAIRE

    Varjú, Péter Pál

    2012-01-01

    Consider a sequence of independent random isometries of Euclidean space with a previously fixed probability law. Apply these isometries successively to the origin and consider the sequence of random points that we obtain this way. We prove a local limit theorem under a suitable moment condition and a necessary non-degeneracy condition. Under stronger hypotheses, we prove a limit theorem on a wide range of scales: between e^(-cl^(1/4)) and l^(1/2), where l is the number of steps.

  19. Point based interactive image segmentation using multiquadrics splines

    Science.gov (United States)

    Meena, Sachin; Duraisamy, Prakash; Palniappan, Kannappan; Seetharaman, Guna

    2017-05-01

    Multiquadrics (MQ) are radial basis spline functions that can provide an efficient interpolation of data points located in a high dimensional space. MQ were developed by Hardy to approximate geographical surfaces and terrain modelling. In this paper we frame the task of interactive image segmentation as a semi-supervised interpolation where an interpolating function learned from the user-provided seed points is used to predict the labels of unlabeled pixels, and the spline function used in the semi-supervised interpolation is MQ. This semi-supervised interpolation framework has a nice closed form solution, which, along with the fact that MQ is a radial basis spline function, leads to a very fast interactive image segmentation process. Quantitative and qualitative results on the standard datasets show that MQ outperforms other regression based methods (GEBS, Ridge Regression and Logistic Regression) and popular methods like Graph Cut, Random Walk and Random Forest.
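
    Since the MQ framework has a closed-form solution, the whole pipeline fits in a few lines. The sketch below is our illustration of the idea (feature vectors, labels and the shape parameter c are invented), not the authors' segmentation code:

      # Semi-supervised label interpolation with Hardy's multiquadric kernel.
      import numpy as np

      def mq_kernel(A, B, c=1.0):
          d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
          return np.sqrt(d2 + c * c)          # multiquadric: sqrt(r^2 + c^2)

      def fit_predict(seeds, labels, query, c=1.0):
          w = np.linalg.solve(mq_kernel(seeds, seeds, c), labels)
          return mq_kernel(query, seeds, c) @ w

      seeds = np.array([[0.1, 0.1], [0.9, 0.8], [0.2, 0.9]])  # seed pixels
      labels = np.array([1.0, -1.0, 1.0])                     # fg=+1, bg=-1
      query = np.array([[0.15, 0.2], [0.8, 0.7]])
      print(np.sign(fit_predict(seeds, labels, query)))       # [ 1. -1.]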

  20. Sequential function approximation on arbitrarily distributed point sets

    Science.gov (United States)

    Wu, Kailiang; Xiu, Dongbin

    2018-02-01

    We present a randomized iterative method for approximating an unknown function sequentially on an arbitrary point set. The method is based on a recently developed sequential approximation (SA) method, which approximates a target function using one data point at each step and avoids matrix operations. The focus of this paper is on data sets with a highly irregular distribution of the points. We present a nearest neighbor replacement (NNR) algorithm, which allows one to sample the irregular data sets in a near optimal manner. We provide mathematical justification and error estimates for the NNR algorithm. Extensive numerical examples are also presented to demonstrate that the NNR algorithm can deliver satisfactory convergence for the SA method on data sets with high irregularity in their point distributions.

  1. Random survival forests for competing risks

    DEFF Research Database (Denmark)

    Ishwaran, Hemant; Gerds, Thomas A; Kogalur, Udaya B

    2014-01-01

    We introduce a new approach to competing risks using random forests. Our method is fully non-parametric and can be used for selecting event-specific variables and for estimating the cumulative incidence function. We show that the method is highly effective for both prediction and variable selection...

  2. New analytically solvable models of relativistic point interactions

    International Nuclear Information System (INIS)

    Gesztesy, F.; Seba, P.

    1987-01-01

    Two new analytically solvable models of relativistic point interactions in one dimension (natural extensions of the nonrelativistic δ- and δ'-interactions, respectively) are considered. Their spectral properties in the case of finitely many point interactions, as well as in the periodic case, are fully analyzed. Moreover, the spectrum is explicitly determined in the case of independent, identically distributed random coupling constants, and the analog of the Saxon and Hutner conjecture concerning gaps in the energy spectrum of such systems is derived.

  3. Using Random Numbers in Science Research Activities.

    Science.gov (United States)

    Schlenker, Richard M.; And Others

    1996-01-01

    Discusses the importance of science process skills and describes ways to select sets of random numbers for selection of subjects for a research study in an unbiased manner. Presents an activity appropriate for grades 5-12. (JRH)

  4. Schroedinger operators with point interactions and short range expansions

    International Nuclear Information System (INIS)

    Albeverio, S.; Hoegh-Krohn, R.; Oslo Univ.

    1984-01-01

    We give a survey of recent results concerning Schroedinger operators with point interactions in R^3. In the case where the point interactions are located at a discrete set of points we discuss results about the resolvent, the spectrum, the resonances and the scattering quantities. We also discuss the approximation of point interactions by short range local potentials (short range or low energy expansions) and the one-electron model of a 3-dimensional crystal. Moreover we discuss Schroedinger operators with Coulomb plus point interactions, with applications to the determination of scattering lengths and of level shifts in mesic atoms. Further applications to the multiple well problem, to multiparticle systems, to crystals with random impurities, to polymers and quantum fields are also briefly discussed. (orig.)

  5. Optimal Strategy for Integrated Dynamic Inventory Control and Supplier Selection in Unknown Environment via Stochastic Dynamic Programming

    International Nuclear Information System (INIS)

    Sutrisno; Widowati; Solikhin

    2016-01-01

    In this paper, we propose a mathematical model in stochastic dynamic optimization form to determine the optimal strategy for an integrated single-product inventory control problem and supplier selection problem where the demand and purchasing cost parameters are random. For each time period, by using the proposed model, we decide the optimal supplier and calculate the optimal product volume purchased from that supplier so that the inventory level is located as close as possible to the reference point with minimal cost. We use stochastic dynamic programming to solve this problem and give several numerical experiments to evaluate the model. From the results, for each time period, the proposed model generated the optimal supplier, and the inventory level tracked the reference point well. (paper)
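
    A toy, one-period version of the decision the model makes each period can be sketched as follows (ours; the costs, penalty weight and demand distribution are invented, and the full stochastic dynamic program over the horizon is not shown):

      # Pick the supplier and order volume minimizing expected cost of
      # deviating from the reference inventory level under random demand.
      import numpy as np

      rng = np.random.default_rng(0)
      suppliers = {"A": 2.0, "B": 2.4}           # unit purchase costs
      demand = rng.poisson(50, 10000)            # sampled random demand
      stock, ref_level, penalty = 15, 20, 5.0    # reference-point tracking

      best = None
      for name, cost in suppliers.items():
          for q in range(0, 101):
              dev = stock + q - demand - ref_level
              exp_cost = cost * q + penalty * np.abs(dev).mean()
              if best is None or exp_cost < best[0]:
                  best = (exp_cost, name, q)
      print("supplier %s, order %d, expected cost %.1f" % (best[1], best[2], best[0]))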

  6. Some Limits Using Random Slope Models to Measure Academic Growth

    Directory of Open Access Journals (Sweden)

    Daniel B. Wright

    2017-11-01

    Full Text Available Academic growth is often estimated using a random slope multilevel model with several years of data. However, if there are few time points, the estimates can be unreliable. While using random slope multilevel models can lower the variance of the estimates, these procedures can produce estimates that are more highly erroneous (zero or even negative correlations with the true underlying growth) than ordinary least squares estimates calculated for each student or school individually. An example is provided where schools with increasing graduation rates are estimated to have negative growth and vice versa. The estimation is worse when the underlying data are skewed. It is recommended that there be at least six time points for estimating growth if using a random slope model. A combination of methods can be used to avoid some of the aberrant results if it is not possible to have six or more time points.
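
    The contrast drawn above can be reproduced with standard tools. A sketch (ours; the simulated data and variable names are illustrative) compares per-school OLS slopes with the shrunken slopes of a random slope model fitted to only four time points:

      # Per-group OLS slopes vs a random slope (mixed) model on short series.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      rows = []
      for g in range(30):                        # 30 schools
          slope = rng.normal(0.5, 0.3)           # true growth per school
          for t in range(4):                     # only four time points
              rows.append((g, t, 10 + slope * t + rng.normal(0, 1)))
      df = pd.DataFrame(rows, columns=["school", "t", "score"])

      ols = df.groupby("school").apply(
          lambda d: np.polyfit(d["t"], d["score"], 1)[0])
      mix = smf.mixedlm("score ~ t", df, groups=df["school"],
                        re_formula="~t").fit()
      print("mean OLS slope:", ols.mean(), " mixed-model slope:", mix.params["t"])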

  7. A randomized, controlled trial of oral propranolol in infantile hemangioma.

    Science.gov (United States)

    Léauté-Labrèze, Christine; Hoeger, Peter; Mazereeuw-Hautier, Juliette; Guibaud, Laurent; Baselga, Eulalia; Posiunas, Gintas; Phillips, Roderic J; Caceres, Hector; Lopez Gutierrez, Juan Carlos; Ballona, Rosalia; Friedlander, Sheila Fallon; Powell, Julie; Perek, Danuta; Metz, Brandie; Barbarot, Sebastien; Maruani, Annabel; Szalai, Zsuzsanna Zsofia; Krol, Alfons; Boccara, Olivia; Foelster-Holst, Regina; Febrer Bosch, Maria Isabel; Su, John; Buckova, Hana; Torrelo, Antonio; Cambazard, Frederic; Grantzow, Rainer; Wargon, Orli; Wyrzykowski, Dariusz; Roessler, Jochen; Bernabeu-Wittel, Jose; Valencia, Adriana M; Przewratil, Przemyslaw; Glick, Sharon; Pope, Elena; Birchall, Nicholas; Benjamin, Latanya; Mancini, Anthony J; Vabres, Pierre; Souteyrand, Pierre; Frieden, Ilona J; Berul, Charles I; Mehta, Cyrus R; Prey, Sorilla; Boralevi, Franck; Morgan, Caroline C; Heritier, Stephane; Delarue, Alain; Voisard, Jean-Jacques

    2015-02-19

    Oral propranolol has been used to treat complicated infantile hemangiomas, although data from randomized, controlled trials to inform its use are limited. We performed a multicenter, randomized, double-blind, adaptive, phase 2-3 trial assessing the efficacy and safety of a pediatric-specific oral propranolol solution in infants 1 to 5 months of age with proliferating infantile hemangioma requiring systemic therapy. Infants were randomly assigned to receive placebo or one of four propranolol regimens (1 or 3 mg of propranolol base per kilogram of body weight per day for 3 or 6 months). A preplanned interim analysis was conducted to identify the regimen to study for the final efficacy analysis. The primary end point was success (complete or nearly complete resolution of the target hemangioma) or failure of trial treatment at week 24, as assessed by independent, centralized, blinded evaluations of standardized photographs. Of 460 infants who underwent randomization, 456 received treatment. On the basis of an interim analysis of the first 188 patients who completed 24 weeks of trial treatment, the regimen of 3 mg of propranolol per kilogram per day for 6 months was selected for the final efficacy analysis. The frequency of successful treatment was higher with this regimen than with placebo (60% vs. 4%, P<0.001). A total of 88% of patients who received the selected propranolol regimen showed improvement by week 5, versus 5% of patients who received placebo. A total of 10% of patients in whom treatment with propranolol was successful required systemic retreatment during follow-up. Known adverse events associated with propranolol (hypoglycemia, hypotension, bradycardia, and bronchospasm) occurred infrequently, with no significant difference in frequency between the placebo group and the groups receiving propranolol. This trial showed that propranolol was effective at a dose of 3 mg per kilogram per day for 6 months in the treatment of infantile hemangioma. (Funded by

  8. Preliminary Studies on Existing Scenario of Selected Soil Property in Cheddikulam DS Division Vavuniya, Sri Lanka

    OpenAIRE

    M.A. R. Aashifa; P. Loganathan

    2017-01-01

    This study was conducted to quantify the spatial variability of soil properties, use this information to produce an accurate map by means of ordinary kriging, and find ways to reclaim the problem soil and make suggestions for cultivating crop varieties suitable for the existing soil properties. 70 sampling points were selected for this research using a stratified random sampling method. Stratification was based on the type of land cover, and the following land cover patterns were identified f...

  9. CADASTER QSPR Models for Predictions of Melting and Boiling Points of Perfluorinated Chemicals.

    Science.gov (United States)

    Bhhatarai, Barun; Teetz, Wolfram; Liu, Tao; Öberg, Tomas; Jeliazkova, Nina; Kochev, Nikolay; Pukalov, Ognyan; Tetko, Igor V; Kovarich, Simona; Papa, Ester; Gramatica, Paola

    2011-03-14

    Quantitative structure property relationship (QSPR) studies on per- and polyfluorinated chemicals (PFCs) for melting point (MP) and boiling point (BP) are presented. The training and prediction chemicals used for developing and validating the models were selected from the Syracuse PhysProp database and the literature. The available experimental data sets were split in two different ways: a) random selection on response value, and b) structural similarity verified by self-organizing map (SOM), in order to propose reliable predictive models, developed only on the training sets and externally verified on the prediction sets. Individual models based on linear and non-linear approaches, developed by different CADASTER partners on 0D-2D Dragon descriptors, E-state descriptors and fragment-based descriptors, as well as a consensus model and their predictions, are presented. In addition, the predictive performance of the developed models was verified on a blind external validation set (EV-set) prepared using the PERFORCE database, with 15 MP and 25 BP data points respectively. This database contains only long-chain perfluoroalkylated chemicals, particularly monitored by regulatory agencies like US-EPA and EU-REACH. QSPR models with internal and external validation on two different external prediction/validation sets, and a study of the applicability domain highlighting the robustness and high accuracy of the models, are discussed. Finally, MPs for an additional 303 PFCs and BPs for 271 PFCs, for which experimental measurements are unknown, were predicted. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Point, surface and volumetric heat sources in the thermal modelling of selective laser melting

    Science.gov (United States)

    Yang, Yabin; Ayas, Can

    2017-10-01

    Selective laser melting (SLM) is a powder based additive manufacturing technique suitable for producing high precision metal parts. However, distortions and residual stresses within products arise during SLM because of the high temperature gradients created by the laser heating. Residual stresses limit the load resistance of the product and may even lead to fracture during the build process. It is therefore of paramount importance to predict the level of part distortion and residual stress as a function of SLM process parameters, which requires reliable thermal modelling of the SLM process. Consequently, a key question arises: how should the laser source be represented appropriately? Reasonable simplification of the laser representation is crucial for the computational efficiency of the thermal model of the SLM process. In this paper, first a semi-analytical thermal modelling approach is described. Subsequently, the laser heating is modelled using point, surface and volumetric sources, in order to compare the influence of different laser source geometries on the thermal history predicted by the model. The present work provides guidelines on the appropriate representation of the laser source in thermal modelling of the SLM process.
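
    The point-source idealization mentioned above has a classical closed form: Rosenthal's moving point-source solution for a semi-infinite body, T = T0 + Q/(2πkR)·exp(-v(R+x)/(2α)), with x the coordinate along the scan direction in the moving frame. A worked example follows (ours; the material constants are illustrative, and the solution diverges near R = 0):

      # Rosenthal moving point-source temperature field (semi-infinite body).
      import numpy as np

      def rosenthal_T(x, y, z, Q=200.0, v=0.5, T0=300.0, k=20.0, a=5e-6):
          # Q: absorbed power [W], v: scan speed [m/s], k: conductivity
          # [W/m/K], a: thermal diffusivity [m^2/s]; x, y, z in metres.
          R = np.sqrt(x**2 + y**2 + z**2)
          return T0 + Q / (2 * np.pi * k * R) * np.exp(-v * (R + x) / (2 * a))

      # 0.5 mm behind the beam centre, 0.1 mm below the surface:
      print(rosenthal_T(-5e-4, 0.0, 1e-4))   # roughly a couple of thousand K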

  11. Learning stochastic reward distributions in a speeded pointing task.

    Science.gov (United States)

    Seydell, Anna; McCann, Brian C; Trommershäuser, Julia; Knill, David C

    2008-04-23

    Recent studies have shown that humans effectively take into account task variance caused by intrinsic motor noise when planning fast hand movements. However, previous evidence suggests that humans have greater difficulty accounting for arbitrary forms of stochasticity in their environment, both in economic decision making and sensorimotor tasks. We hypothesized that humans can learn to optimize movement strategies when environmental randomness can be experienced and thus implicitly learned over several trials, especially if it mimics the kinds of randomness for which subjects might have generative models. We tested the hypothesis using a task in which subjects had to rapidly point at a target region partly covered by three stochastic penalty regions introduced as "defenders." At movement completion, each defender jumped to a new position drawn randomly from fixed probability distributions. Subjects earned points when they hit the target, unblocked by a defender, and lost points otherwise. Results indicate that after approximately 600 trials, subjects approached optimal behavior. We further tested whether subjects simply learned a set of stimulus-contingent motor plans or the statistics of defenders' movements by training subjects with one penalty distribution and then testing them on a new penalty distribution. Subjects immediately changed their strategy to achieve the same average reward as subjects who had trained with the second penalty distribution. These results indicate that subjects learned the parameters of the defenders' jump distributions and used this knowledge to optimally plan their hand movements under conditions involving stochastic rewards and penalties.
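
    The planning computation implied by such tasks, choosing the aim point that maximizes expected score under motor noise and the learned jump distributions, can be approximated by Monte Carlo. A sketch under our own toy geometry (target, defender and noise parameters are invented):

      # Monte Carlo expected score over candidate aim points.
      import numpy as np

      rng = np.random.default_rng(0)

      def expected_score(aim, n=20000):
          hits = aim + rng.normal(0, 0.5, (n, 2))                # motor noise
          defender = rng.normal([0.0, 0.0], [1.0, 0.2], (n, 2))  # jump law
          in_target = np.abs(hits).max(axis=1) < 2.0
          blocked = np.hypot(*(hits - defender).T) < 0.4
          return np.where(in_target & ~blocked, 1.0, -1.0).mean()

      aims = [np.array([x, 0.0]) for x in np.linspace(-1.5, 1.5, 7)]
      best = max(aims, key=expected_score)
      print("best aim point:", best)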

  12. Assessment of osteoporotic vertebral fractures using specialized workflow software for 6-point morphometry

    International Nuclear Information System (INIS)

    Guglielmi, Giuseppe; Palmieri, Francesco; Placentino, Maria Grazia; D'Errico, Francesco; Stoppino, Luca Pio

    2009-01-01

    Purpose: To evaluate the time required, the accuracy and the precision of a model-based image analysis software tool for the diagnosis of osteoporotic fractures using a 6-point morphometry protocol. Materials and methods: Lateral dorsal and lumbar radiographs were performed on 92 elderly women (mean age 69.2 ± 5.7 years). Institutional review board approval and patient informed consent were obtained for all subjects. The semi-automated and the manually corrected annotations of 6-point placement were compared to calculate the time consumed and the accuracy of the software. Twenty test images were randomly selected, and the data obtained by multiple perturbed initialisation points on the same image were compared to assess the precision of the system. Results: The time required by the semi-automated system (420 ± 67 s) was statistically different (p < 0.05) from that of manual placement (900 ± 77 s). In the accuracy test, the mean reproducibility error for semi-automatic 6-point placement was 2.50 ± 0.72% [95% CI] for the anterior-posterior reference and 2.16 ± 0.5% [95% CI] for the superior-inferior reference. In the precision test, the mean error averaged over all vertebrae was 2.6 ± 1.3% in terms of vertebral width. Conclusions: The technique is time effective, accurate and precise and can, therefore, be recommended in large epidemiological studies and pharmaceutical trials for the reporting of osteoporotic vertebral fractures.

  13. Correlates of smoking with socioeconomic status, leisure time physical activity and alcohol consumption among Polish adults from randomly selected regions.

    Science.gov (United States)

    Woitas-Slubowska, Donata; Hurnik, Elzbieta; Skarpańska-Stejnborn, Anna

    2010-12-01

    To determine the association between smoking status and leisure time physical activity (LTPA), alcohol consumption, and socioeconomic status (SES) among Polish adults. 466 randomly selected men and women (aged 18-66 years) responded to an anonymous questionnaire regarding smoking, alcohol consumption, LTPA, and SES. Multiple logistic regression was used to examine the association of smoking status with six socioeconomic measures, level of LTPA, and frequency and type of alcohol consumed. Smokers were defined as individuals smoking occasionally or daily. The odds of being a smoker were 9 times (men) and 27 times (women) higher among respondents who drank alcohol several times a week or every day than among non-drinkers. Men with low educational attainment had higher odds of smoking than men with high educational attainment (p = 0.007). Among women, students were the most frequent smokers: female students were almost three times more likely to smoke than non-professional women, and two times more likely than physical workers (p = 0.018). The findings of this study indicate that among randomly selected Polish men and women aged 18-66, smoking and alcohol consumption tended to cluster. These results imply that intervention strategies need to target multiple risk factors simultaneously. The highest risk of smoking was observed among low-educated men, female students, and both men and women drinking alcohol several times a week or every day. Information on subgroups with a high risk of smoking will help in planning future preventive strategies.

  14. Quantum optics in multiple scattering random media

    DEFF Research Database (Denmark)

    Lodahl, Peter; Lagendijk, Ad

    2005-01-01

    Quantum Optics in Multiple Scattering Random Media. Peter Lodahl, Research Center COM, Technical University of Denmark, DK-2800 Lyngby, Denmark. Coherent transport of light in a disordered random medium has attracted enormous attention both from a fundamental and an application point of view. [...]-tions that should be readily attainable experimentally is devised. Figure 1 (not reproduced): inverse total transmission of shot noise (left) and technical noise (right) as a function of the thickness of the random medium; the experimental data are well explained by theory (curves). [1] J. Tworzydlo and C.W.J. Beenakker, Phys. Rev. [...]

  15. Modest blood pressure reduction with valsartan in acute ischemic stroke: a prospective, randomized, open-label, blinded-end-point trial.

    Science.gov (United States)

    Oh, Mi Sun; Yu, Kyung-Ho; Hong, Keun-Sik; Kang, Dong-Wha; Park, Jong-Moo; Bae, Hee-Joon; Koo, Jaseong; Lee, Juneyoung; Lee, Byung-Chul

    2015-07-01

    To assess the efficacy and safety of modest blood pressure (BP) reduction with valsartan within 48 h after symptom onset in patients with acute ischemic stroke and high BP. This was a multicenter, prospective, randomized, open-label, blinded-end-point trial. A total of 393 subjects were recruited at 28 centers and then randomly assigned in a 1:1 ratio to receive valsartan (n = 195) or no treatment (n = 198) for seven days after presentation. The primary outcome was death or dependency, defined as a score of 3-6 on the modified Rankin Scale (mRS) at 90 days after symptom onset. Early neurological deterioration (END) within seven days and 90-day major vascular events were also assessed. There were 372 patients who completed the 90-day follow-up. The valsartan group had 46 of 187 patients (24.6%) with a 90-day mRS of 3-6, compared with 42 of 185 patients (22.6%) in the control group (odds ratio [OR], 1.11; 95% confidence interval [CI], 0.69-1.79; P = 0.667). The rate of major vascular events did not differ between groups (OR, 1.41; 95% CI, 0.44-4.49; P = 0.771). There was a significant increase of END in the valsartan group (OR, 2.43; 95% CI, 1.25-4.73; P = 0.008). Early reduction of BP with valsartan did not reduce death or dependency or major vascular events at 90 days, but increased the risk of END. © 2015 World Stroke Organization.

  16. Inflation with a graceful exit in a random landscape

    International Nuclear Information System (INIS)

    Pedro, F.G.; Westphal, A.

    2016-11-01

    We develop a stochastic description of small-field inflationary histories with a graceful exit in a random potential whose Hessian is a Gaussian random matrix as a model of the unstructured part of the string landscape. The dynamical evolution in such a random potential from a small-field inflation region towards a viable late-time de Sitter (dS) minimum maps to the dynamics of Dyson Brownian motion describing the relaxation of non-equilibrium eigenvalue spectra in random matrix theory. We analytically compute the relaxation probability in a saddle point approximation of the partition function of the eigenvalue distribution of the Wigner ensemble describing the mass matrices of the critical points. When applied to small-field inflation in the landscape, this leads to an exponentially strong bias against small-field ranges and an upper bound N<<10 on the number of light fields N participating during inflation from the non-observation of negative spatial curvature.

  17. Inflation with a graceful exit in a random landscape

    Energy Technology Data Exchange (ETDEWEB)

    Pedro, F.G. [Univ. Autonoma de Madrid (Spain). Dept. de Fisica Teorica y Inst. de Fisica Teorica UAM/CSIC; Westphal, A. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany). Theory Group

    2016-11-15

    We develop a stochastic description of small-field inflationary histories with a graceful exit in a random potential whose Hessian is a Gaussian random matrix as a model of the unstructured part of the string landscape. The dynamical evolution in such a random potential from a small-field inflation region towards a viable late-time de Sitter (dS) minimum maps to the dynamics of Dyson Brownian motion describing the relaxation of non-equilibrium eigenvalue spectra in random matrix theory. We analytically compute the relaxation probability in a saddle point approximation of the partition function of the eigenvalue distribution of the Wigner ensemble describing the mass matrices of the critical points. When applied to small-field inflation in the landscape, this leads to an exponentially strong bias against small-field ranges and an upper bound N<<10 on the number of light fields N participating during inflation from the non-observation of negative spatial curvature.

  18. Inflation with a graceful exit in a random landscape

    Energy Technology Data Exchange (ETDEWEB)

    Pedro, F.G. [Departamento de Física Teórica and Instituto de Física Teórica UAM/CSIC,Universidad Autónoma de Madrid,Cantoblanco, 28049 Madrid (Spain); Westphal, A. [Deutsches Elektronen-Synchrotron DESY, Theory Group,D-22603 Hamburg (Germany)

    2017-03-30

    We develop a stochastic description of small-field inflationary histories with a graceful exit in a random potential whose Hessian is a Gaussian random matrix as a model of the unstructured part of the string landscape. The dynamical evolution in such a random potential from a small-field inflation region towards a viable late-time de Sitter (dS) minimum maps to the dynamics of Dyson Brownian motion describing the relaxation of non-equilibrium eigenvalue spectra in random matrix theory. We analytically compute the relaxation probability in a saddle point approximation of the partition function of the eigenvalue distribution of the Wigner ensemble describing the mass matrices of the critical points. When applied to small-field inflation in the landscape, this leads to an exponentially strong bias against small-field ranges and an upper bound N≪10 on the number of light fields N participating during inflation from the non-observation of negative spatial curvature.

  19. Asymptotic properties of a simple random motion

    International Nuclear Information System (INIS)

    Ravishankar, K.

    1988-01-01

    A random walker in R^N is considered. At each step the walker picks a point in R^N from a fixed finite set of destination points. Having chosen the point, the walker moves a fraction r (r < 1) of the distance toward the point along a straight line. Assuming that the successive destination points are chosen independently, it is shown that the asymptotic distribution of the walker's position has the same mean as the destination point distribution. An estimate is obtained for the fraction of time the walker stays within a ball centered at the mean value for almost every destination sequence. Examples show that the asymptotic distribution could have intricate structure.
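
    The record above describes a simple iterated random motion; the following minimal NumPy sketch (my own illustration, not code from the paper) simulates it and checks that the long-run mean of the walker's position matches the mean of the destination-point distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# fixed finite set of destination points in R^2 (illustrative choice)
destinations = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])
r = 0.5               # fraction of the distance covered per step (r < 1)
n_steps = 100_000

x = np.zeros(2)       # arbitrary starting position
positions = np.empty((n_steps, 2))
for i in range(n_steps):
    target = destinations[rng.integers(len(destinations))]  # i.i.d. choice
    x = x + r * (target - x)     # move fraction r toward the chosen point
    positions[i] = x

# discard a burn-in and compare means: they should agree asymptotically
print("empirical mean  :", positions[n_steps // 2:].mean(axis=0))
print("destination mean:", destinations.mean(axis=0))
```

    With r = 1/2 and three destination points this is the classical "chaos game", whose stationary distribution is supported on a Sierpinski-type fractal, illustrating the "intricate structure" the abstract mentions.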

  20. Application of point-to-point matching algorithms for background correction in on-line liquid chromatography-Fourier transform infrared spectrometry (LC-FTIR).

    Science.gov (United States)

    Kuligowski, J; Quintás, G; Garrigues, S; de la Guardia, M

    2010-03-15

    A new background correction method for the on-line coupling of gradient liquid chromatography and Fourier transform infrared spectrometry has been developed. It is based on the use of a point-to-point matching algorithm that compares the absorption spectra of the sample data set with those of a previously recorded reference data set in order to select an appropriate reference spectrum. The spectral range used for the point-to-point comparison is selected with minimal user interaction, thus considerably facilitating the application of the whole method. The background correction method has been successfully tested on a chromatographic separation of four nitrophenols running acetonitrile (0.08%, v/v TFA):water (0.08%, v/v TFA) gradients with compositions ranging from 35 to 85% (v/v) acetonitrile, giving accurate results for both baseline-resolved and overlapped peaks. Copyright (c) 2009 Elsevier B.V. All rights reserved.
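
    A hedged sketch of the point-to-point matching idea (not the authors' code; the array names and the squared-difference metric are my assumptions): for each sample spectrum, the most similar reference spectrum over a selected spectral window is found and subtracted as background.

```python
import numpy as np

def background_correct(sample_specs, ref_specs, window):
    """Subtract, from each sample spectrum, the reference spectrum that is
    closest to it point-to-point over the selected spectral range."""
    corrected = np.empty_like(sample_specs)
    for i, s in enumerate(sample_specs):
        # point-to-point distance to every reference spectrum, restricted
        # to the selected spectral window
        d = np.sum((ref_specs[:, window] - s[window]) ** 2, axis=1)
        best = np.argmin(d)                 # most similar eluent spectrum
        corrected[i] = s - ref_specs[best]  # subtract it as the background
    return corrected

# toy usage with synthetic spectra (rows = time points, cols = wavenumbers)
rng = np.random.default_rng(1)
ref = rng.normal(size=(50, 200))
sample = ref[rng.integers(50, size=30)] + 0.01 * rng.normal(size=(30, 200))
out = background_correct(sample, ref, window=slice(20, 80))
print(out.shape)
```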

  1. Random skew plane partitions and the Pearcey process

    DEFF Research Database (Denmark)

    Reshetikhin, Nicolai; Okounkov, Andrei

    2007-01-01

    We study random skew 3D partitions weighted by q^{vol} and, specifically, the q → 1 asymptotics of local correlations near various points of the limit shape. We obtain sine-kernel asymptotics for correlations in the bulk of the disordered region, Airy kernel asymptotics near a general point of the frozen boundary, and Pearcey process asymptotics near a cusp of the frozen boundary.

  2. Crossover and maximal fat-oxidation points in sedentary healthy subjects: methodological issues.

    Science.gov (United States)

    Gmada, N; Marzouki, H; Haboubi, M; Tabka, Z; Shephard, R J; Bouhlel, E

    2012-02-01

    Our study aimed to assess the influence of protocol on the crossover point and maximal fat-oxidation (LIPOX(max)) values in sedentary, but otherwise healthy, young men. Maximal oxygen intake was assessed in 23 subjects, using a progressive maximal cycle ergometer test. Twelve sedentary males (aged 20.5±1.0 years) whose directly measured maximal aerobic power (MAP) values were lower than their theoretical maximal values (tMAP) were selected from this group. These individuals performed, in random sequence, three submaximal graded exercise tests, separated by three-day intervals; work rates were based on the tMAP in one test and on MAP in the remaining two. The third test was used to assess the reliability of data. Heart rate, respiratory parameters, blood lactate, the crossover point and LIPOX(max) values were measured during each of these tests. The crossover point and LIPOX(max) values were significantly lower when the testing protocol was based on tMAP rather than on MAP, with lower values at 30, 40, 50 and 60% of maximal aerobic power when work rates were based on tMAP rather than MAP (P<0.001). During the first 5 min of recovery, EPOC(5 min) and blood lactate were significantly correlated (r=0.89; P<0.001). Our data show that, to assess the crossover point and LIPOX(max) values for research purposes, the protocol must be based on the measured MAP rather than on a theoretical value. Such a determination should improve individualization of training for initially sedentary subjects. Copyright © 2011 Elsevier Masson SAS. All rights reserved.

  3. Comparison of Dose When Prescribed to Point A and Point H for Brachytherapy in Cervical Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Gang, Ji Hyeong; Gim, Il Hwan; Hwang, Seon Boong; Kim, Woong; Im, Hyeong Seo; Gang, Jin Mook; Gim, Gi Hwan; Lee, Ah Ram [Dept. of Radiation Oncology, Korea Institute of Radiological and Medical Sciences, Seou (Korea, Republic of)

    2012-09-15

    The purpose of this study is to compare plans prescribed to point A with those prescribed to point H, the point recommended by the ABS (American Brachytherapy Society), in high-dose-rate intracavitary brachytherapy for cervical carcinoma. This study selected 103 patients who received HDR (high dose rate) brachytherapy using tandem and ovoids from March 2010 to January 2012. Point A, the bladder point, and the rectal point conform with the Manchester System; point H conforms with the ABS recommendation. Sigmoid colon and vaginal points were established arbitrarily. We examined the distance between point A and point H. The percent dose at point A was calculated when a 100% dose was prescribed to point H. Additionally, the percent doses at each reference point were calculated when dose was prescribed to point H and to point A. The relative dose at point A was lower when point H was located inferior to point A. The relative doses at the bladder, rectal, sigmoid colon, and vaginal points were higher when point H was located superior to point A, and lower when point H was located inferior to point A. This study found that the more superior point H was located relative to point A, the higher the absorbed dose to surrounding normal organs became, and the more inferior, the lower. These differences do not seem to affect the treatment. However, we suggest this new point is worth considering for HDR treatment if the dose distribution and the absorbed dose to normal organs differ substantially between prescriptions to point A and point H.

  4. Marked point process for modelling seismic activity (case study in Sumatra and Java)

    Science.gov (United States)

    Pratiwi, Hasih; Sulistya Rini, Lia; Wayan Mangku, I.

    2018-05-01

    Earthquakes are natural phenomena that are random and irregular in space and time. The occurrence of an earthquake at a given location is still difficult to forecast, so the development of earthquake-forecast methodology continues, both from the seismological and the stochastic point of view. To describe such random natural phenomena, both in space and time, a point process approach can be used. There are two types of point processes: temporal point processes and spatial point processes. A temporal point process relates to events observed over time as a sequence, whereas a spatial point process describes the locations of objects in two- or three-dimensional space. The points of a point process can be labelled with additional information called marks. A marked point process can be considered as a pair (x, m), where x is the location of a point and m is the mark attached to that point. This study aims to model a marked point process indexed by time on earthquake data from Sumatra Island and Java Island. This model can be used to analyse seismic activity through its intensity function, conditioned on the history of the process up to time t. Based on data obtained from the U.S. Geological Survey from 1973 to 2017 with magnitude threshold 5, we obtained maximum likelihood estimates for the parameters of the intensity function. The parameter estimates show that seismic activity in Sumatra Island is greater than in Java Island.
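
    As a concrete illustration of a history-dependent intensity, the sketch below fits a self-exciting (Hawkes) temporal point process with an exponential kernel by maximum likelihood. This is a standard choice for seismicity, but the specific kernel, the parameters, and the toy data are my assumptions, not the authors' model.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(params, t, T):
    """Negative log-likelihood of a Hawkes process with intensity
    lambda(t) = mu + alpha * sum_{t_j < t} exp(-beta * (t - t_j))."""
    mu, alpha, beta = np.exp(params)          # positivity via log-parameters
    A = np.zeros(len(t))                      # recursive excitation term
    for i in range(1, len(t)):
        A[i] = np.exp(-beta * (t[i] - t[i - 1])) * (1.0 + A[i - 1])
    log_intensity = np.log(mu + alpha * A).sum()
    compensator = mu * T + (alpha / beta) * (1.0 - np.exp(-beta * (T - t))).sum()
    return -(log_intensity - compensator)

# toy event times on [0, T]; real data would be earthquake occurrence times,
# and marks (magnitudes) could enter via a magnitude-dependent productivity
rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0.0, 1000.0, size=300))
T = 1000.0
fit = minimize(neg_log_lik, x0=np.log([0.1, 0.5, 1.0]), args=(t, T))
print("MLE (mu, alpha, beta):", np.exp(fit.x))
```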

  5. A NEW METHOD HIGHLIGHTING PSYCHOMOTOR SKILLS AND COGNITIVE ATTRIBUTES IN ATHLETE SELECTIONS

    Directory of Open Access Journals (Sweden)

    Engin Sagdilek

    2015-05-01

    Full Text Available Talents are extraordinary but not completely developed characteristics in a field. These attributes cover a relatively wide range in sports. The tests used in the selection of athletes are generally motoric sports tests and measure predominantly conditional attributes. It is known that performance in sports is related to cognitive skills as well as physical features and motor skills. This study explored a new method that could be utilized in the selection of athletes and in tracking their level of improvement, evaluating their attention, perception and learning levels, in athlete and non-athlete female students. Nine female table tennis athletes who had trained 16 hours per week for the last 5 years and nine female students who had never played any sport, aged between 10 and 14 years, participated in our study. For the Selective Action Array, developed for this study, a table tennis robot was utilized. The robot was set up to send a total of 26 balls in 3 different colors (6 white, 10 yellow, 10 pink) to different areas of the table, in random color order and at a rate of 90 balls per minute. The participants were asked to ignore the white balls, to touch the yellow balls and to grab the pink balls using their dominant hands. After the task was explained to the participants, two consecutive trials were executed and recorded using a camera. Every action performed or not performed by the participants was transformed into points in the scoring system. First-trial total points in the Selective Action Array were 104±17 for athletes and 102±19 for non-athletes, whereas on the second trial total points were 122±11 and 105±20, respectively. The higher scores obtained in the second trial were significant for the athletes; the difference in the scores for non-athletes was minor. Non-athletes scored 33% better for the white balls as compared to the table tennis athletes. For the yellow balls, athletes and non-athletes scored similar points on the first trial, whereas

  6. [Silvicultural treatments and their selection effects].

    Science.gov (United States)

    Vincent, G

    1973-01-01

    Selection can be defined in terms of its observable consequences as the non-random differential reproduction of genotypes (Lerner 1958). In forest stands, during improvement fellings and reproduction treatments we select the individuals that surpass others in growth or in production of first-class timber. However, silvicultural treatments guarantee a permanent increase of forest production only if they are carried out according to the principles of directional (dynamic) selection. These principles require that the trees retained for further growth and for forest regeneration be selected by their hereditary properties, i.e., by their genotypes. To make this selection feasible, our study deals with the genetic parameters and gives examples of the application of the response to selection, the selection differential, heritability in the narrow and in the broad sense, and the genetic and genotypic gain. From these parameters, the economic success of various silvicultural treatments in forest stands can be estimated. The examples demonstrate that selection measures of higher intensity are manifested in a higher selection differential and a higher genetic and genotypic gain, and that such measures show more distinct effects in variable populations - in natural forest - than in populations characterized by smaller variability, e.g., many uniform artificially established stands. The examples of the influence of different selection regimes on the genotype composition of a population show that genetics teaches us to differentiate the genotypes within the same species and at the same time provides new criteria for evaluating selection treatments. From an economic point of view, these criteria are worth considering in silviculture, if only because they allow us to judge the genetic composition of forest stands.

  7. Photobiomodulation in the Prevention of Tooth Sensitivity Caused by In-Office Dental Bleaching. A Randomized Placebo Preliminary Study.

    Science.gov (United States)

    Calheiros, Andrea Paiva Corsetti; Moreira, Maria Stella; Gonçalves, Flávia; Aranha, Ana Cecília Correa; Cunha, Sandra Ribeiro; Steiner-Oliveira, Carolina; Eduardo, Carlos de Paula; Ramalho, Karen Müller

    2017-08-01

    To analyze the effect of photobiomodulation in the prevention of tooth sensitivity after in-office dental bleaching. Tooth sensitivity is a common clinical consequence of dental bleaching, and therapies for the prevention of sensitivity have been investigated in the literature. This study was designed as a randomized, blinded, placebo-controlled clinical trial. Fifty patients were selected and randomly divided into five groups (n = 10 per group): (1) control, (2) placebo, (3) laser before bleaching, (4) laser after bleaching, and (5) laser before and after bleaching. Irradiation was performed perpendicularly, in contact, on each tooth for 10 sec per point at two points. The first point was positioned in the middle of the tooth crown and the second in the periapical region. Photobiomodulation was applied using the following parameters: 780 nm, 40 mW, 10 J/cm², 0.4 J per point. Pain was analyzed before, immediately after, and on each of the seven subsequent days after bleaching. Patients were instructed to report pain using the scale: 0 = no tooth sensitivity, 1 = gentle sensitivity, 2 = moderate sensitivity, 3 = severe sensitivity. There were no statistical differences between groups at any time (p > 0.05). More studies, with other parameters and different methods of tooth sensitivity analysis, should be performed to complement these results. Within the limitations of the present study, the photobiomodulation laser parameters tested were not efficient in preventing tooth sensitivity after in-office bleaching.

  8. Pilot points method for conditioning multiple-point statistical facies simulation on flow data

    Science.gov (United States)

    Ma, Wei; Jafarpour, Behnam

    2018-05-01

    We propose a new pilot points method for conditioning discrete multiple-point statistical (MPS) facies simulation on dynamic flow data. While conditioning MPS simulation on static hard data is straightforward, calibration against nonlinear flow data is nontrivial. The proposed method generates conditional models from a conceptual model of geologic connectivity, known as a training image (TI), by strategically placing and estimating pilot points. To place pilot points, a score map is generated based on three sources of information: (i) the uncertainty in the facies distribution, (ii) the model response sensitivity information, and (iii) the observed flow data. Once the pilot points are placed, the facies values at these points are inferred from production data and then used, along with available hard data at well locations, to simulate a new set of conditional facies realizations. While facies estimation at the pilot points can be performed using different inversion algorithms, in this study the ensemble smoother (ES) is adopted to update permeability maps from production data, which are then used to statistically infer facies types at the pilot point locations. The developed method combines the information in the flow data and the TI by using the former to infer facies values at selected locations away from the wells and the latter to ensure consistent facies structure and connectivity away from measurement locations. Several numerical experiments are used to evaluate the performance of the developed method and to discuss its important properties.
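
    A toy sketch of the score-map idea (entirely my own construction; the combination rule and the stand-in sensitivity and mismatch maps are invented for illustration): combine the three information sources into one map and place pilot points at the highest-scoring cells away from wells.

```python
import numpy as np

rng = np.random.default_rng(3)
nx, ny, n_real = 32, 32, 50

realizations = rng.integers(0, 2, size=(n_real, nx, ny))  # ensemble of binary facies maps
p = realizations.mean(axis=0)
uncertainty = 1.0 - np.abs(2.0 * p - 1.0)   # ~1 where the realizations disagree
sensitivity = rng.random((nx, ny))          # stand-in for model response sensitivities
mismatch = rng.random((nx, ny))             # stand-in for localized flow-data misfit

score = uncertainty * sensitivity * mismatch  # one simple way to combine the three
wells = [(5, 5), (25, 25)]
for (i, j) in wells:                          # exclude hard-data (well) locations
    score[i, j] = -np.inf

n_pilot = 10
flat = np.argsort(score.ravel())[::-1][:n_pilot]      # top-scoring cells
pilot_points = np.column_stack(np.unravel_index(flat, score.shape))
print(pilot_points)
```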

  9. Providing full point-to-point communications among compute nodes of an operational group in a global combining network of a parallel computer

    Energy Technology Data Exchange (ETDEWEB)

    Archer, Charles J.; Faraj, Daniel A.; Inglett, Todd A.; Ratterman, Joseph D.

    2018-01-30

    Methods, apparatus, and products are disclosed for providing full point-to-point communications among compute nodes of an operational group in a global combining network of a parallel computer, each compute node connected to each adjacent compute node in the global combining network through a link, that include: receiving a network packet in a compute node, the network packet specifying a destination compute node; selecting, in dependence upon the destination compute node, at least one of the links for the compute node along which to forward the network packet toward the destination compute node; and forwarding the network packet along the selected link to the adjacent compute node connected to the compute node through the selected link.
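
    On a tree-shaped network such as a global combining network, link selection reduces to finding the neighbor on the unique path toward the destination; the sketch below illustrates that idea (an illustration only, not the patented implementation; it assumes the source and destination nodes differ).

```python
from collections import deque

def select_link(adj, src, dst):
    """Return the neighbor of `src` on the unique path to `dst` (BFS on a tree)."""
    parent = {src: None}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        if u == dst:
            break
        for v in adj[u]:
            if v not in parent:
                parent[v] = u
                queue.append(v)
    node = dst                      # walk back until the node whose parent is src
    while parent[node] != src:
        node = parent[node]
    return node

# toy combining tree: node 0 at the root, packets forwarded hop by hop
adj = {0: [1, 2], 1: [0, 3, 4], 2: [0], 3: [1], 4: [1]}
print(select_link(adj, 0, 4))  # packet at node 0 headed for node 4 -> forward via node 1
```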

  10. A comparison of the conditional inference survival forest model to random survival forests based on a simulation study as well as on two applications with time-to-event data.

    Science.gov (United States)

    Nasejje, Justine B; Mwambi, Henry; Dheda, Keertan; Lesosky, Maia

    2017-07-28

    Random survival forest (RSF) models have been identified as alternative methods to the Cox proportional hazards model in analysing time-to-event data. These methods, however, have been criticised for the bias that results from favouring covariates with many split-points, and hence conditional inference forests for time-to-event data have been suggested. Conditional inference forests (CIF) are known to correct the bias in RSF models by separating the procedure for choosing the best covariate to split on from the search for the best split point for the selected covariate. In this study, we compare the random survival forest model to the conditional inference forest (CIF) model using twenty-two simulated time-to-event datasets. We also analysed two real time-to-event datasets. The first dataset is based on the survival of children under five years of age in Uganda and consists of categorical covariates, most of them having more than two levels (many split-points). The second dataset is based on the survival of patients with extensively drug-resistant tuberculosis (XDR TB) and consists mainly of categorical covariates with two levels (few split-points). The study findings indicate that the conditional inference forest model is superior to random survival forest models in analysing time-to-event data that consist of covariates with many split-points, based on the values of the bootstrap cross-validated estimates of the integrated Brier score. However, conditional inference forests perform comparably to random survival forest models in analysing time-to-event data consisting of covariates with fewer split-points. Although survival forests are promising methods for analysing time-to-event data, it is important to identify the best forest model for analysis based on the nature of the covariates of the dataset in question.

  11. A comparison of the conditional inference survival forest model to random survival forests based on a simulation study as well as on two applications with time-to-event data

    Directory of Open Access Journals (Sweden)

    Justine B. Nasejje

    2017-07-01

    Full Text Available Abstract Background Random survival forest (RSF) models have been identified as alternative methods to the Cox proportional hazards model in analysing time-to-event data. These methods, however, have been criticised for the bias that results from favouring covariates with many split-points, and hence conditional inference forests for time-to-event data have been suggested. Conditional inference forests (CIF) are known to correct the bias in RSF models by separating the procedure for choosing the best covariate to split on from the search for the best split point for the selected covariate. Methods In this study, we compare the random survival forest model to the conditional inference forest (CIF) model using twenty-two simulated time-to-event datasets. We also analysed two real time-to-event datasets. The first dataset is based on the survival of children under five years of age in Uganda and consists of categorical covariates, most of them having more than two levels (many split-points). The second dataset is based on the survival of patients with extensively drug-resistant tuberculosis (XDR TB) and consists mainly of categorical covariates with two levels (few split-points). Results The study findings indicate that the conditional inference forest model is superior to random survival forest models in analysing time-to-event data that consist of covariates with many split-points, based on the values of the bootstrap cross-validated estimates of the integrated Brier score. However, conditional inference forests perform comparably to random survival forest models in analysing time-to-event data consisting of covariates with fewer split-points. Conclusion Although survival forests are promising methods for analysing time-to-event data, it is important to identify the best forest model for analysis based on the nature of the covariates of the dataset in question.

  12. Channel Islands, Kelp Forest Monitoring, Survey, Random Point Contact, 1982-2007

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset from the Channel Islands National Park's Kelp Forest Monitoring Program has estimates of substrate composition and percent cover of selected algal and...

  13. Manhattan-World Urban Reconstruction from Point Clouds

    KAUST Repository

    Li, Minglei; Wonka, Peter; Nan, Liangliang

    2016-01-01

    Manhattan-world urban scenes are common in the real world. We propose a fully automatic approach for reconstructing such scenes from 3D point samples. Our key idea is to represent the geometry of the buildings in the scene using a set of well-aligned boxes. We first extract plane hypotheses from the points, followed by an iterative refinement step. Then, candidate boxes are obtained by partitioning the space of the point cloud into a non-uniform grid. After that, we choose an optimal subset of the candidate boxes to approximate the geometry of the buildings. The contribution of our work is that we transform scene reconstruction into a labeling problem that is solved based on a novel Markov Random Field formulation. Unlike previous methods designed for particular types of input point clouds, our method can obtain faithful reconstructions from a variety of data sources. Experiments demonstrate that our method is superior to state-of-the-art methods. © Springer International Publishing AG 2016.

  14. Manhattan-World Urban Reconstruction from Point Clouds

    KAUST Repository

    Li, Minglei

    2016-09-16

    Manhattan-world urban scenes are common in the real world. We propose a fully automatic approach for reconstructing such scenes from 3D point samples. Our key idea is to represent the geometry of the buildings in the scene using a set of well-aligned boxes. We first extract plane hypotheses from the points, followed by an iterative refinement step. Then, candidate boxes are obtained by partitioning the space of the point cloud into a non-uniform grid. After that, we choose an optimal subset of the candidate boxes to approximate the geometry of the buildings. The contribution of our work is that we transform scene reconstruction into a labeling problem that is solved based on a novel Markov Random Field formulation. Unlike previous methods designed for particular types of input point clouds, our method can obtain faithful reconstructions from a variety of data sources. Experiments demonstrate that our method is superior to state-of-the-art methods. © Springer International Publishing AG 2016.

  15. Mirnacle: machine learning with SMOTE and random forest for improving selectivity in pre-miRNA ab initio prediction.

    Science.gov (United States)

    Marques, Yuri Bento; de Paiva Oliveira, Alcione; Ribeiro Vasconcelos, Ana Tereza; Cerqueira, Fabio Ribeiro

    2016-12-15

    MicroRNAs (miRNAs) are key gene expression regulators in plants and animals, involved in several biological processes, making the study of these molecules one of the most relevant topics of molecular biology today. However, characterizing miRNAs in vivo is still a complex task. As a consequence, in silico methods have been developed to predict miRNA loci. A common ab initio strategy to find miRNAs in genomic data is to search for sequences that can fold into the typical hairpin structure of miRNA precursors (pre-miRNAs). Current ab initio approaches, however, have selectivity issues, i.e., a high number of false positives is reported, which can lead to laborious and costly attempts at biological validation. This study presents an extension of the ab initio method miRNAFold, with the aim of improving selectivity through machine learning techniques, namely, random forest combined with the SMOTE procedure for coping with imbalanced datasets. By comparing our method, termed Mirnacle, with other important approaches in the literature, we demonstrate that Mirnacle substantially improves selectivity without compromising sensitivity. For the three datasets used in our experiments, our method achieved at least 97% sensitivity and delivered two-fold, 20-fold, and 6-fold increases in selectivity, respectively, compared with the best results of current computational tools. The extension of miRNAFold by the introduction of machine learning techniques significantly increases selectivity in pre-miRNA ab initio prediction, which optimally contributes to advanced studies on miRNAs, as the need for biological validation is diminished. Hopefully, new research, such as studies of severe diseases caused by miRNA malfunction, will benefit from the proposed computational tool.
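
    The SMOTE-plus-random-forest combination named above can be sketched generically with scikit-learn and imbalanced-learn; the synthetic data below stand in for Mirnacle's actual sequence and structure features, which are not reproduced here.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report
from imblearn.over_sampling import SMOTE

# imbalanced two-class problem: ~5% positives, mimicking rare true pre-miRNAs
X, y = make_classification(n_samples=5000, n_features=20,
                           weights=[0.95, 0.05], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# oversample the minority class on the training split only, then fit the forest
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_tr, y_tr)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_bal, y_bal)
print(classification_report(y_te, clf.predict(X_te)))
```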

  16. Tobacco point-of-sale advertising in Guatemala City, Guatemala and Buenos Aires, Argentina.

    Science.gov (United States)

    Barnoya, Joaquin; Mejia, Raul; Szeinman, Debora; Kummerfeldt, Carlos E

    2010-08-01

    To determine tobacco point of sale advertising prevalence in Guatemala City, Guatemala and Buenos Aires, Argentina. Convenience stores (120 per city) were chosen from randomly selected blocks in low, middle and high socioeconomic neighbourhoods. To assess tobacco point of sale advertising we used a checklist developed in Canada that was translated into Spanish and validated in both countries studied. Analysis was conducted by neighbourhood and store type. All stores sold cigarettes and most had tobacco products in close proximity to confectionery. In Guatemala, 60% of stores had cigarette ads. High and middle socioeconomic status neighbourhood stores had more indoor cigarette ads, but these differences were determined by store type: gas stations and supermarkets were more prevalent in high socioeconomic status neighbourhoods and had more indoor cigarette ads. In poorer areas, however, more ads could be seen from outside the stores, more stores were located within 100 metres of schools and fewer stores had 'No smoking' or 'No sales to minors' signs. In Argentina, 80% of stores had cigarette ads and few differences were observed by neighbourhood socioeconomic status. Compared to Guatemala, 'No sales to minors' signs were more prevalent in Argentina. Tobacco point of sale advertising is highly prevalent in these two cities of Guatemala and Argentina. An advertising ban should also include this type of advertising.

  17. Reduction of the Random Variables of the Turbulent Wind Field

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.

    2012-01-01

    Applicability of the Probability Density Evolution Method (PDEM) for realizing the evolution of the probability density for wind turbines puts rather strict bounds on the number of random variables involved in the model. The efficiency of most Advanced Monte Carlo (AMC) methods, i.e. Importance Sampling (IS) or Subset Simulation (SS), deteriorates on problems with many random variables. The problem with PDEM is that a multidimensional integral has to be carried out over the space defined by the random variables of the system. The numerical procedure requires discretization of the integral domain; this becomes increasingly difficult as the dimensions of the integral domain increase. On the other hand, the efficiency of the AMC methods is closely dependent on the design points of the problem. The presence of many random variables may increase the number of design points, hence affecting the efficiency of these methods as well.

  18. Floating point only SIMD instruction set architecture including compare, select, Boolean, and alignment operations

    Science.gov (United States)

    Gschwind, Michael K [Chappaqua, NY

    2011-03-01

    Mechanisms for implementing a floating point only single instruction multiple data instruction set architecture are provided. A processor is provided that comprises an issue unit, an execution unit coupled to the issue unit, and a vector register file coupled to the execution unit. The execution unit has logic that implements a floating point (FP) only single instruction multiple data (SIMD) instruction set architecture (ISA). The floating point vector registers of the vector register file store both scalar and floating point values as vectors having a plurality of vector elements. The processor may be part of a data processing system.

  19. Cover times of random searches

    Science.gov (United States)

    Chupeau, Marie; Bénichou, Olivier; Voituriez, Raphaël

    2015-10-01

    How long must one undertake a random search to visit all sites of a given domain? This time, known as the cover time, is a key observable to quantify the efficiency of exhaustive searches, which require a complete exploration of an area and not only the discovery of a single target. Examples range from immune-system cells chasing pathogens to animals harvesting resources, from robotic exploration for cleaning or demining to the task of improving search algorithms. Despite its broad relevance, the cover time has remained elusive and so far explicit results have been scarce and mostly limited to regular random walks. Here we determine the full distribution of the cover time for a broad range of random search processes, including Lévy strategies, intermittent strategies, persistent random walks and random walks on complex networks, and reveal its universal features. We show that for all these examples the mean cover time can be minimized, and that the corresponding optimal strategies also minimize the mean search time for a single target, unambiguously pointing towards their robustness.
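
    For a concrete baseline, the sketch below estimates the cover-time distribution of an unbiased random walk on a cycle of n sites, one of the regular-random-walk cases the paper generalizes beyond (the parameters are illustrative).

```python
import numpy as np

def cover_time_cycle(n, rng):
    """Steps until a nearest-neighbor random walk on a cycle of n sites
    has visited every site at least once."""
    pos, visited, steps = 0, {0}, 0
    while len(visited) < n:
        pos = (pos + rng.choice((-1, 1))) % n
        visited.add(pos)
        steps += 1
    return steps

rng = np.random.default_rng(4)
samples = [cover_time_cycle(50, rng) for _ in range(500)]
# the exact mean cover time of a cycle is n*(n-1)/2 = 1225 for n = 50
print("mean cover time:", np.mean(samples))
```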

  20. Pareto genealogies arising from a Poisson branching evolution model with selection.

    Science.gov (United States)

    Huillet, Thierry E

    2014-02-01

    We study a class of coalescents derived from a sampling procedure out of N i.i.d. Pareto(α) random variables, normalized by their sum, including β-size-biasing on total length effects. Depending on α, this leads either to a Poisson-Dirichlet (α, -β) Ξ-coalescent (α ∈ [0, 1)), to a family of continuous-time Beta(2 - α, α - β) Λ-coalescents (α ∈ [1, 2)), or to the Kingman coalescent (α ≥ 2). We indicate that this class of coalescent processes (and their scaling limits) may be viewed as the genealogical processes of some forward-in-time evolving branching population models including selection effects. In such constant-size population models, the reproduction step, which is based on a fitness-dependent Poisson point process with scaling power-law(α) intensity, is coupled to a selection step consisting of sorting out the N fittest individuals issued from the reproduction step.
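
    The sampling procedure described above can be illustrated in a few lines (my own sketch, with illustrative parameters): draw N i.i.d. Pareto(α) weights, normalize them by their sum, and use the result as offspring probabilities for one generation of a constant-size population.

```python
import numpy as np

rng = np.random.default_rng(8)
N, alpha = 100, 1.5

# numpy's pareto() draws Lomax samples; adding 1 gives classical Pareto(alpha)
weights = rng.pareto(alpha, size=N) + 1.0
probs = weights / weights.sum()             # normalized reproduction probabilities
offspring = rng.multinomial(N, probs)       # who parents the next generation

# heavy-tailed weights occasionally produce very large families, which is
# what drives the multiple-merger genealogies for alpha < 2
print("max family size:", offspring.max())
```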

  1. Molten salt oxidation as a technique for decommissioning: selection of low melting point salt mixtures

    International Nuclear Information System (INIS)

    Lainetti, Paulo E.O.; Garcia, Vitor F.; Benvegnu, Guilherme

    2013-01-01

    During the 1970s and 1980s, IPEN built several pilot-scale facilities aimed at mastering the technology of the nuclear fuel cycle. In the 1990s, radical changes in Brazilian nuclear policy led to the interruption of these activities and the shutdown of the pilot plants. IPEN now faces the problem of dismantling and decommissioning its old nuclear fuel cycle facilities. The CELESTE-I facility at IPEN is a laboratory where reprocessing studies were carried out during the 1980s and the beginning of the 1990s; the last operations occurred in 1992-93. The research activities generated radioactive wastes in the form of organic and aqueous solutions of different compositions and concentrations. For the treatment of these liquid wastes, a study of waste thermal decomposition based on the molten salt oxidation process was proposed. Decomposition tests of different organic wastes have been performed in laboratory equipment developed at IPEN, in the temperature range of 900 to 1020 deg C, demonstrating the complete oxidation of the compounds. Reducing the process temperatures would be of crucial importance, and the selection of lower-melting-point salt mixtures would have an important impact in reducing equipment costs. Several experiments were performed to determine the most suitable salt mixtures, optimizing costs while keeping melting temperatures as low as possible. This paper describes the main characteristics of the molten salt oxidation process and the selection of binary and ternary salt mixtures, respectively Na2CO3-NaOH and Na2CO3-K2CO3-Li2CO3. (author)

  2. Archeological Data Recovery at Algiers Point. Volume 1.

    Science.gov (United States)

    1984-10-15

    [List-of-figures residue from the scanned report: Additional makers' marks from Algiers Point (p. 155); 62. Selected glass bottles from Algiers Point (p. 162); 63. Butcher marks on bones; reference fragments omitted.] ...depending instead upon domestic livestock, primarily beef, pork, poultry, goat, and mutton. As noted previously, Algiers Point initially was settled in 1718.

  3. Prediction of soil CO2 flux in sugarcane management systems using the Random Forest approach

    Directory of Open Access Journals (Sweden)

    Rose Luiza Moraes Tavares

    Full Text Available ABSTRACT: The Random Forest algorithm is a data mining technique used for classifying attributes in order of importance to explain the variation in an attribute-target, such as soil CO2 flux. This study aimed to identify predictor variables of soil CO2 flux in sugarcane management systems using the machine-learning algorithm Random Forest. Two different sugarcane management areas in the state of São Paulo, Brazil, were selected: burned and green. In each area, we assembled a sampling grid with 81 georeferenced points to assess soil CO2 flux using an automated portable soil gas chamber with infrared measuring spectroscopy during the dry season of 2011 and the rainy season of 2012. In addition, we sampled the soil to evaluate physical, chemical, and microbiological attributes. For data interpretation, we used the Random Forest algorithm, based on a combination of predicted decision trees (machine-learning algorithms in which every tree depends on the values of a random vector sampled independently, with the same distribution, for all trees in the forest). The results indicated that the clay content of the soil was the most important attribute for explaining the CO2 flux in the areas studied during the evaluated period. The use of the Random Forest algorithm produced a model with a good fit (R2 = 0.80) between predicted and observed values.
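
    A hedged sketch of this use of Random Forest with scikit-learn: rank predictors of CO2 flux by impurity-based feature importance. The column names and synthetic data below are invented stand-ins, not the study's measurements.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(5)
n = 81  # one value per grid point, as in the sampling grids described above
X = pd.DataFrame({
    "clay": rng.uniform(10, 60, n),
    "soil_moisture": rng.uniform(5, 40, n),
    "organic_carbon": rng.uniform(0.5, 3.0, n),
    "microbial_biomass": rng.uniform(50, 400, n),
})
# synthetic target: clay dominates, as the study found for its real data
co2_flux = 0.05 * X["clay"] + 0.02 * X["soil_moisture"] + rng.normal(0, 0.3, n)

model = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0)
model.fit(X, co2_flux)
importance = pd.Series(model.feature_importances_, index=X.columns)
print(importance.sort_values(ascending=False))
print("OOB R^2:", model.oob_score_)
```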

  4. Randomized algorithms in automatic control and data mining

    CERN Document Server

    Granichin, Oleg; Toledano-Kitai, Dvora

    2015-01-01

    In the fields of data mining and control, the huge amount of unstructured data and the presence of uncertainty in system descriptions have always been critical issues. The book Randomized Algorithms in Automatic Control and Data Mining introduces readers to the fundamentals of randomized algorithm applications in data mining (especially clustering) and in automatic control synthesis. The methods proposed in this book guarantee a reduction in the computational complexity of classical algorithms and in the conservativeness of standard robust control techniques. It is shown that when a problem requires "brute force" in selecting among options, algorithms based on random selection of alternatives offer good results with a certain probability in restricted time and significantly reduce the volume of operations.

  5. The Effect of Random Error on Diagnostic Accuracy Illustrated with the Anthropometric Diagnosis of Malnutrition

    Science.gov (United States)

    2016-01-01

    Background: It is often thought that random measurement error has a minor effect upon the results of an epidemiological survey. Theoretically, errors of measurement should always increase the spread of a distribution. Defining an illness by having a measurement outside an established healthy range will therefore lead to an inflated prevalence of that condition if there are measurement errors. Methods and results: A Monte Carlo simulation was conducted of anthropometric assessment of children with malnutrition. Random errors of increasing magnitude were imposed upon the populations and showed an increase in the standard deviation with each of the errors that became exponentially greater with the magnitude of the error. The potential magnitude of the resulting error in reported prevalence of malnutrition was compared with published international data and found to be of sufficient magnitude to make a number of surveys, and the numerous reports and analyses that used these data, unreliable. Conclusions: The effect of random error in public health surveys, and in the data upon which diagnostic cut-off points are derived to define "health", has been underestimated. Even quite modest random errors can more than double the reported prevalence of conditions such as malnutrition. Increasing sample size does not address this problem, and may even result in less accurate estimates. More attention needs to be paid to the selection, calibration and maintenance of instruments, measurer selection, training and supervision, routine estimation of the likely magnitude of errors using standardization tests, use of statistical likelihood of error to exclude data from analysis, and full reporting of these procedures in order to judge the reliability of survey reports. PMID:28030627
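
    The mechanism is easy to reproduce; the Monte Carlo sketch below (parameters illustrative, not those of the paper) shows how zero-mean random error added to z-scores widens the distribution and inflates the prevalence below a -2 SD cut-off.

```python
import numpy as np

rng = np.random.default_rng(6)

# true anthropometric z-score distribution (illustrative mean and SD)
true_z = rng.normal(loc=-0.5, scale=1.0, size=1_000_000)
print("true prevalence (z < -2):", (true_z < -2).mean())

# zero-mean measurement error widens the observed distribution, so more
# observations fall below the cut-off even though no child got worse
for error_sd in (0.25, 0.5, 1.0):
    observed = true_z + rng.normal(0.0, error_sd, size=true_z.size)
    print(f"error SD {error_sd}: observed prevalence {(observed < -2).mean():.3f}")
```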

  6. The basic science and mathematics of random mutation and natural selection.

    Science.gov (United States)

    Kleinman, Alan

    2014-12-20

    The mutation and natural selection phenomenon can and often does cause the failure of antimicrobial, herbicidal, pesticide, and cancer treatments, which act as selection pressures. This phenomenon operates in a mathematically predictable way which, when understood, leads to approaches that reduce and prevent the failure of these selection pressures. The mathematical behavior of mutation and selection is derived using the principles of probability theory. The derivation of the equations describing the mutation and selection phenomenon is carried out in the context of an empirical example. Copyright © 2014 John Wiley & Sons, Ltd.
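
    A worked example of the kind of probability calculation involved (numbers illustrative, not taken from the paper): if each replication produces a specific resistance mutation with probability mu, the chance that at least one copy arises among n replications is 1 - (1 - mu)^n.

```python
# probability of the specific beneficial mutation per replication (illustrative)
mu = 1e-9

for n in (1e8, 1e9, 1e10):  # number of replications under the selection pressure
    p_at_least_one = 1.0 - (1.0 - mu) ** n
    print(f"n = {n:.0e}: P(at least one mutant) = {p_at_least_one:.4f}")
```

    The steep rise of this probability with population size is one reason combination treatments, which require several independent mutations simultaneously, are far harder for a population to defeat.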

  7. Structure Based Thermostability Prediction Models for Protein Single Point Mutations with Machine Learning Tools.

    Directory of Open Access Journals (Sweden)

    Lei Jia

    Full Text Available The thermostability of protein point mutations is a common concern in protein engineering. An application which predicts the thermostability of mutants can be helpful for guiding the decision-making process in protein design via mutagenesis. An in silico point mutation scanning method is frequently used to find "hot spots" in proteins for focused mutagenesis. ProTherm (http://gibk26.bio.kyutech.ac.jp/jouhou/Protherm/protherm.html) is a public database that consists of thousands of protein mutants' experimentally measured thermostability. Two data sets based on two differently measured thermostability properties of protein single point mutations, namely the unfolding free energy change (ddG) and the melting temperature change (dTm), were obtained from this database. Folding free energy change calculations from Rosetta, structural information of the point mutations, as well as amino acid physical properties were obtained for building thermostability prediction models with informatics modeling tools. Five supervised machine learning methods (support vector machine, random forests, artificial neural network, naïve Bayes classifier, K nearest neighbor) and partial least squares regression are used for building the prediction models. Binary and ternary classifications as well as regression models were built and evaluated. Data set redundancy and balancing, the reverse mutations technique, feature selection, and comparison to other published methods were discussed. Rosetta-calculated folding free energy change ranked as the most influential feature in all prediction models. Other descriptors also made significant contributions to increasing the accuracy of the prediction models.

  8. UHE point source survey at Cygnus experiment

    International Nuclear Information System (INIS)

    Lu, X.; Yodh, G.B.; Alexandreas, D.E.; Allen, R.C.; Berley, D.; Biller, S.D.; Burman, R.L.; Cady, R.; Chang, C.Y.; Dingus, B.L.; Dion, G.M.; Ellsworth, R.W.; Gilra, M.K.; Goodman, J.A.; Haines, T.J.; Hoffman, C.M.; Kwok, P.; Lloyd-Evans, J.; Nagle, D.E.; Potter, M.E.; Sandberg, V.D.; Stark, M.J.; Talaga, R.L.; Vishwanath, P.R.; Zhang, W.

    1991-01-01

    A new method of searching for UHE point sources has been developed. With a data sample of 150 million events, we have surveyed the sky for point sources over 3314 locations (1.4 degree < δ < 70.4 degree). It was found that their distribution is consistent with random fluctuation. In addition, fifty-two known potential sources, including pulsars and binary x-ray sources, were studied. The source with the largest positive excess is the Crab Nebula: an excess of 2.5 sigma above the background is observed in a bin of 2.3 degree by 2.5 degree in declination and right ascension, respectively.

  9. RANDOM WALK HYPOTHESIS IN FINANCIAL MARKETS

    Directory of Open Access Journals (Sweden)

    Nicolae-Marius JULA

    2017-05-01

    Full Text Available The random walk hypothesis states that stock market prices do not follow a predictable trajectory but are simply random. Before trying to predict what may be a random set of data, one should test for randomness, because, despite the power and complexity of the models used, the results cannot otherwise be trusted. There are several methods for testing this hypothesis, and the computational power provided by the R environment makes the researcher's work easier and more cost-effective. The increasing power of computing and the continuous development of econometric tests should give potential investors new tools for selecting commodities and investing in efficient markets.
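
    Although the paper works in the R environment, one classical randomness check, the Wald-Wolfowitz runs test, is easy to sketch in Python (implemented directly so no special package is required; the simulated returns are illustrative).

```python
import numpy as np
from math import erf, sqrt

def runs_test(x):
    """Wald-Wolfowitz runs test on the signs of x relative to its median."""
    signs = x > np.median(x)
    n1, n2 = signs.sum(), (~signs).sum()
    runs = 1 + np.count_nonzero(signs[1:] != signs[:-1])
    mean = 2 * n1 * n2 / (n1 + n2) + 1
    var = (mean - 1) * (mean - 2) / (n1 + n2 - 1)
    z = (runs - mean) / sqrt(var)
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided normal p-value
    return z, p

rng = np.random.default_rng(7)
returns = rng.normal(size=1000)        # i.i.d. returns: should look random
z, p = runs_test(returns)
print(f"z = {z:.2f}, p = {p:.3f}")     # large p: no evidence against randomness
```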

  10. Bubble-point pressures of some selected methane + synthetic C6+ mixtures

    Energy Technology Data Exchange (ETDEWEB)

    Shariati, A.; Moshfeghian, M. [Shiraz Univ. (Iran, Islamic Republic of). Dept. of Chemical Engineering; Peters, C.J. [Delft Univ. of Technology (Netherlands). Lab. of Applied Thermodynamics and Phase Equilibria

    1998-03-01

    In this work, a series of bubble-point measurements was carried out on some synthetic C6+ mixtures in the presence of methane. These synthetic mixtures included alkanes, cycloalkanes, and aromatics. The experiments were carried out using the Cailletet apparatus, and bubble-point pressures were measured in a temperature range of 311-470 K. The corresponding pressures were predicted using the Peng-Robinson equation of state, and the relative errors were estimated. It is shown that such synthetic C6+ mixtures can be simulated reasonably well by this equation of state.

  11. Chromatic polynomials of random graphs

    International Nuclear Information System (INIS)

    Van Bussel, Frank; Fliegner, Denny; Timme, Marc; Ehrlich, Christoph; Stolzenberg, Sebastian

    2010-01-01

    Chromatic polynomials and related graph invariants are central objects in both graph theory and statistical physics. Computational difficulties, however, have so far restricted studies of such polynomials to graphs that were either very small, very sparse or highly structured. Recent algorithmic advances (Timme et al 2009 New J. Phys. 11 023001) now make it possible to compute chromatic polynomials for moderately sized graphs of arbitrary structure and number of edges. Here we present chromatic polynomials of ensembles of random graphs with up to 30 vertices, over the entire range of edge density. We specifically focus on the locations of the zeros of the polynomial in the complex plane. The results indicate that the chromatic zeros of random graphs have a very consistent layout. In particular, the crossing point, the point at which the chromatic zeros with non-zero imaginary part approach the real axis, scales linearly with the average degree over most of the density range. While the scaling laws obtained are purely empirical, if they continue to hold in general there are significant implications: the crossing points of chromatic zeros in the thermodynamic limit separate systems with zero ground state entropy from systems with positive ground state entropy, the latter an exception to the third law of thermodynamics.
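
    For very small graphs, a chromatic polynomial can be computed exactly by deletion-contraction, as the sketch below shows (my own illustration using sympy, not the algorithm of Timme et al; it is feasible only for small graphs, which is precisely the limitation discussed above).

```python
import sympy as sp

k = sp.symbols("k")

def chromatic_poly(edges, n_vertices):
    """P(G, k) via deletion-contraction: P(G) = P(G - e) - P(G / e).
    Base case: the empty graph on n vertices has polynomial k**n."""
    if not edges:
        return k ** n_vertices
    (u, v), rest = edges[0], edges[1:]
    deleted = chromatic_poly(rest, n_vertices)
    # contract v into u: relabel v as u, drop loops and duplicate edges
    contracted = []
    for (a, b) in rest:
        a, b = (u if a == v else a), (u if b == v else b)
        if a != b and (a, b) not in contracted and (b, a) not in contracted:
            contracted.append((a, b))
    return sp.expand(deleted - chromatic_poly(contracted, n_vertices - 1))

triangle = [(0, 1), (1, 2), (0, 2)]
print(chromatic_poly(triangle, 3))   # k**3 - 3*k**2 + 2*k = k(k-1)(k-2)
```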

  12. The Aged Residential Care Healthcare Utilization Study (ARCHUS): a multidisciplinary, cluster randomized controlled trial designed to reduce acute avoidable hospitalizations from long-term care facilities.

    Science.gov (United States)

    Connolly, Martin J; Boyd, Michal; Broad, Joanna B; Kerse, Ngaire; Lumley, Thomas; Whitehead, Noeline; Foster, Susan

    2015-01-01

    To assess the effect of a complex, multidisciplinary intervention aimed at reducing avoidable acute hospitalization of residents of residential aged care (RAC) facilities. Cluster randomized controlled trial. RAC facilities with higher-than-expected hospitalizations in Auckland, New Zealand, were recruited and randomized to intervention or control. A total of 1998 residents of 18 intervention facilities and 18 control facilities. A facility-based complex intervention of 9 months' duration. The intervention comprised gerontology nurse specialist (GNS)-led staff education, facility benchmarking, GNS resident review, and multidisciplinary (geriatrician, primary-care physician, pharmacist, GNS, and facility nurse) discussion of residents selected using standard criteria. The primary end point was avoidable hospitalizations. Secondary end points were all acute admissions, mortality, and acute bed-days. Follow-up was for a total of 14 months. The intervention did not affect the main study end points: the number of acute avoidable hospital admissions (RR 1.07; 95% CI 0.85-1.36; P = .59) or mortality (RR 1.11; 95% CI 0.76-1.61; P = .62). This multidisciplinary intervention, packaging selected case review and staff education, had no overall impact on acute hospital admissions or mortality. This may have considerable implications for resourcing in the acute and RAC sectors in the face of population aging. Australian and New Zealand Clinical Trials Registry (ACTRN12611000187943). Copyright © 2015 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.

  13. The Goodness of Covariance Selection Problem from AUC Bounds

    OpenAIRE

    Khajavi, Navid Tafaghodi; Kuh, Anthony

    2016-01-01

    We conduct a study of graphical models and discuss the quality of model selection approximation by formulating the problem as a detection problem and examining the area under the curve (AUC). We look specifically at the model selection problem for jointly Gaussian random vectors. For Gaussian random vectors, this problem simplifies to the covariance selection problem, widely discussed in the literature by Dempster [1]. In this paper, we give the definition for the correlation appro...

  14. "Open mesh" or "strictly selected population" recruitment? The experience of the randomized controlled MeMeMe trial.

    Science.gov (United States)

    Cortellini, Mauro; Berrino, Franco; Pasanisi, Patrizia

    2017-01-01

    Among randomized controlled trials (RCTs), trials for primary prevention require large samples and long follow-up to obtain a high-quality outcome; therefore the recruitment process and the drop-out rates largely dictate the adequacy of the results. We are conducting a Phase III trial on persons with metabolic syndrome to test the hypothesis that comprehensive lifestyle changes and/or metformin treatment prevents age-related chronic diseases (the MeMeMe trial, EudraCT number: 2012-005427-32, also registered on ClinicalTrials.gov [NCT02960711]). Here, we briefly analyze and discuss the reasons that may lead to participants dropping out of trials. In our experience, participants may back out of a trial for different reasons. Drug-induced side effects are certainly the most compelling reason. But what are the other reasons, relating to the participants' perception of the progress of the trial, which led them to withdraw after randomization? What about the time-dependent drop-out rate in primary prevention trials? The primary outcome of this analysis is the point of drop-out from the trial, defined as the time from the randomization date to the withdrawal date. Survival functions were non-parametrically estimated using the product-limit estimator. The curves were statistically compared using the log-rank test (P = 0.64, not significant). Researchers involved in primary prevention RCTs seem to have to deal with the paradox of the proverbial "short blanket syndrome". Recruiting only highly motivated candidates might be useful for the smooth progress of the trial but it may lead to a very low enrollment rate. On the other hand, what about enrolling all the eligible subjects without considering their motivation? This might boost the enrollment rate, but it can lead to biased results on account of large proportions of drop-outs. Our experience suggests that participants do not change their mind depending on the allocation group (intervention or control). There is no single
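
    For readers who want to reproduce this kind of drop-out analysis, the sketch below estimates per-arm Kaplan-Meier (product-limit) curves for time from randomization to withdrawal and compares them with a log-rank test. The data are synthetic and the lifelines package is an assumed dependency, not something the trial itself reports using.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
n = 200
arm = rng.integers(0, 2, n)                    # 0 = control, 1 = intervention
time_to_dropout = rng.exponential(36.0, n)     # hypothetical months to withdrawal
observed = time_to_dropout < 48.0              # drop-out seen within follow-up
durations = np.minimum(time_to_dropout, 48.0)  # administrative censoring

kmf = KaplanMeierFitter()
for a, label in [(0, "control"), (1, "intervention")]:
    kmf.fit(durations[arm == a], event_observed=observed[arm == a], label=label)
    print(label, "median time to drop-out:", kmf.median_survival_time_)

res = logrank_test(durations[arm == 0], durations[arm == 1],
                   event_observed_A=observed[arm == 0],
                   event_observed_B=observed[arm == 1])
print("log-rank p-value:", res.p_value)        # the trial above reported P = 0.64
```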

  15. Strong disorder RG approach of random systems

    International Nuclear Information System (INIS)

    Igloi, Ferenc; Monthus, Cecile

    2005-01-01

    There is a large variety of quantum and classical systems in which quenched disorder plays a dominant role over quantum, thermal, or stochastic fluctuations: these systems display strong spatial heterogeneities, and many averaged observables are actually governed by rare regions. A unifying approach to treating the dynamical and/or static singularities of these systems has emerged recently, following the pioneering RG idea of Ma and Dasgupta and the detailed analysis by Fisher, who showed that the Ma-Dasgupta RG rules yield asymptotically exact results if the broadness of the disorder grows indefinitely at large scales. Here we report these new developments, starting with an introduction to the main ingredients of the strong disorder RG method. We describe the basic properties of infinite disorder fixed points, which are realized at critical points, and of strong disorder fixed points, which control the singular behaviors in the Griffiths phases. We then review in detail applications of the RG method to various disordered models, either (i) quantum models, such as random spin chains, ladders and higher dimensional spin systems, or (ii) classical models, such as diffusion in a random potential, equilibrium at low temperature and coarsening dynamics of classical random spin chains, trap models, the delocalization transition of a random polymer from an interface, driven lattice gases and reaction-diffusion models in the presence of quenched disorder. For several one-dimensional systems, the Ma-Dasgupta RG rules yield very detailed analytical results, whereas for other, mainly higher-dimensional problems, the RG rules have to be implemented numerically. Where available, the strong disorder RG results are compared with other exact or numerical calculations.
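
    The Ma-Dasgupta decimation rules are simple enough to state in a few lines of code. The sketch below applies them to one standard example from this class, the random transverse-field Ising chain, iteratively eliminating the largest coupling; it is a bare-bones illustration of the numerical implementation mentioned above, with an arbitrary uniform initial disorder.

```python
# Strong-disorder RG for the random transverse-field Ising chain.
# At each step decimate the largest coupling Omega = max(all J, all h):
#   field h_i largest : remove site i, effective bond J = J_l * J_r / h_i
#   bond J_i largest  : merge sites i, i+1 into a cluster with
#                       effective field h = h_i * h_{i+1} / J_i
import numpy as np

rng = np.random.default_rng(42)
h = list(rng.uniform(0.0, 1.0, 1000))   # transverse fields (one per site)
J = list(rng.uniform(0.0, 1.0, 999))    # bonds (open chain)

while len(h) > 2:
    i_h, i_J = int(np.argmax(h)), int(np.argmax(J))
    if h[i_h] >= J[i_J]:                          # site decimation
        if 0 < i_h < len(h) - 1:
            J[i_h - 1:i_h + 1] = [J[i_h - 1] * J[i_h] / h[i_h]]
        elif i_h == 0:
            del J[0]                              # edge site: no new bond
        else:
            del J[-1]
        del h[i_h]
    else:                                         # bond decimation (cluster)
        h[i_J] = h[i_J] * h[i_J + 1] / J[i_J]
        del h[i_J + 1]
        del J[i_J]

print("last surviving fields:", h)  # their distribution broadens under the flow
```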

  16. Algebraic polynomials with random coefficients

    Directory of Open Access Journals (Sweden)

    K. Farahmand

    2002-01-01

    This paper provides an asymptotic value for the mathematical expected number of points of inflection of a random polynomial of the form $a_0(\omega) + a_1(\omega)\binom{n}{1}^{1/2}x + a_2(\omega)\binom{n}{2}^{1/2}x^2 + \cdots + a_n(\omega)\binom{n}{n}^{1/2}x^n$ when n is large. The coefficients $\{a_j(\omega)\}_{j=0}^{n}$, $\omega\in\Omega$, are assumed to be a sequence of independent normally distributed random variables with means zero and variance one, each defined on a fixed probability space $(\Omega, A, \Pr)$. A special case of dependent coefficients is also studied.
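
    A quick Monte Carlo check of such a result is straightforward: draw the Gaussian coefficients, build the polynomial with the binomial weights, and count the real zeros of its second derivative. The sketch below does exactly that; the trial count and tolerance are arbitrary choices.

```python
import numpy as np
from math import comb

def expected_inflections(n, trials=500, seed=0):
    """Monte Carlo estimate of E[#real zeros of P''] for the polynomial above."""
    rng = np.random.default_rng(seed)
    weights = np.array([comb(n, j) ** 0.5 for j in range(n + 1)])
    total = 0
    for _ in range(trials):
        a = rng.standard_normal(n + 1)              # a_j ~ N(0, 1), independent
        p = np.polynomial.Polynomial(a * weights)   # lowest degree first
        roots = p.deriv(2).roots()
        total += int(np.sum(np.abs(roots.imag) < 1e-9))
    return total / trials

for n in (10, 20, 40):
    print(n, expected_inflections(n))               # grows with n
```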

  17. Quantifying Biomass from Point Clouds by Connecting Representations of Ecosystem Structure

    Science.gov (United States)

    Hendryx, S. M.; Barron-Gafford, G.

    2017-12-01

    Quantifying terrestrial ecosystem biomass is an essential part of monitoring carbon stocks and fluxes within the global carbon cycle and optimizing natural resource management. Point cloud data such as from lidar and structure from motion can be effective for quantifying biomass over large areas, but significant challenges remain in developing effective models that allow for such predictions. Inference models that estimate biomass from point clouds are established in many environments, yet, are often scale-dependent, needing to be fitted and applied at the same spatial scale and grid size at which they were developed. Furthermore, training such models typically requires large in situ datasets that are often prohibitively costly or time-consuming to obtain. We present here a scale- and sensor-invariant framework for efficiently estimating biomass from point clouds. Central to this framework, we present a new algorithm, assignPointsToExistingClusters, that has been developed for finding matches between in situ data and clusters in remotely-sensed point clouds. The algorithm can be used for assessing canopy segmentation accuracy and for training and validating machine learning models for predicting biophysical variables. We demonstrate the algorithm's efficacy by using it to train a random forest model of above ground biomass in a shrubland environment in Southern Arizona. We show that by learning a nonlinear function to estimate biomass from segmented canopy features we can reduce error, especially in the presence of inaccurate clusterings, when compared to a traditional, deterministic technique to estimate biomass from remotely measured canopies. Our random forest on cluster features model extends established methods of training random forest regressions to predict biomass of subplots but requires significantly less training data and is scale invariant. The random forest on cluster features model reduced mean absolute error, when evaluated on all test data in leave
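
    The matching step and the regression step can be sketched compactly. The code below is a hedged stand-in, not the authors' assignPointsToExistingClusters algorithm: it assigns each in situ measurement to the nearest cluster centroid within a distance cutoff, then fits a scikit-learn random forest from synthetic per-cluster features (height, crown area, and point count are hypothetical choices) to biomass.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def assign_points_to_clusters(points_xy, centroids_xy, max_dist=3.0):
    """Nearest-centroid cluster index per field point; -1 if farther than max_dist."""
    d = np.linalg.norm(points_xy[:, None, :] - centroids_xy[None, :, :], axis=2)
    idx = d.argmin(axis=1)
    idx[d.min(axis=1) > max_dist] = -1          # unmatched field measurement
    return idx

rng = np.random.default_rng(3)
centroids = rng.uniform(0, 50, (20, 2))         # segmented canopy clusters (m)
field_pts = rng.uniform(0, 50, (15, 2))         # in situ stem locations (m)
print(assign_points_to_clusters(field_pts, centroids))

# "Random forest on cluster features": hypothetical height/area/count features
features = rng.uniform(0.5, 4.0, (200, 3))
biomass = 0.8 * features[:, 0] * features[:, 1] + rng.normal(0, 0.2, 200)
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(features, biomass)
print("train R^2:", rf.score(features, biomass))
```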

  18. Investigation on the Weighted RANSAC Approaches for Building Roof Plane Segmentation from LiDAR Point Clouds

    Directory of Open Access Journals (Sweden)

    Bo Xu

    2015-12-01

    RANdom SAmple Consensus (RANSAC) is a widely adopted method for LiDAR point cloud segmentation because of its robustness to noise and outliers. However, RANSAC has a tendency to generate false segments consisting of points from several nearly coplanar surfaces. To address this problem, we formulate a weighted RANSAC approach for point cloud segmentation. In our proposed solution, the hard threshold voting function, which considers both the point-plane distance and the normal vector consistency, is transformed into a soft threshold voting function based on two weight functions. To improve weighted RANSAC's ability to distinguish planes, we designed the weight functions according to the difference in the error distribution between proper and improper plane hypotheses, based on which an outlier suppression ratio was also defined. Using the ratio, a thorough comparison was conducted between the different weight functions to determine the best performing one. The selected weight function was then compared to the existing weighted RANSAC methods, the original RANSAC, and a representative region growing (RG) method. Experiments with two airborne LiDAR datasets of varying densities show that the various weighted methods improve the segmentation quality to different degrees, but the dedicated weight functions can significantly improve both the segmentation accuracy and the topological correctness. Moreover, their robustness is much better when compared to the RG method.
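
    The core idea, replacing the hard inlier count with a soft, two-factor weight, can be written down in a few lines. The Gaussian weight functions below are an illustrative choice, not the specific functions selected in the paper.

```python
import numpy as np

def soft_votes(points, normals, plane_n, plane_d,
               sigma_d=0.05, sigma_a=np.deg2rad(10.0)):
    """Soft-threshold score of a plane n.x + d = 0 (|n| = 1) over a point cloud."""
    dist = np.abs(points @ plane_n + plane_d)            # point-plane distance
    angle = np.arccos(np.clip(np.abs(normals @ plane_n), 0.0, 1.0))
    w = np.exp(-0.5 * (dist / sigma_d) ** 2) * np.exp(-0.5 * (angle / sigma_a) ** 2)
    return w.sum()

def fit_plane(p3):
    n = np.cross(p3[1] - p3[0], p3[2] - p3[0])
    n = n / np.linalg.norm(n)
    return n, -n @ p3[0]

rng = np.random.default_rng(0)
pts = rng.normal(size=(500, 3))
pts[:, 2] *= 0.01                                        # noisy z = 0 plane
nrm = np.tile([0.0, 0.0, 1.0], (500, 1))                 # per-point normals
best = max((fit_plane(pts[rng.choice(500, 3, replace=False)]) for _ in range(200)),
           key=lambda pl: soft_votes(pts, nrm, *pl))
print("best plane normal:", np.round(best[0], 3))
```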

  19. Organic Ferroelectric-Based 1T1T Random Access Memory Cell Employing a Common Dielectric Layer Overcoming the Half-Selection Problem.

    Science.gov (United States)

    Zhao, Qiang; Wang, Hanlin; Ni, Zhenjie; Liu, Jie; Zhen, Yonggang; Zhang, Xiaotao; Jiang, Lang; Li, Rongjin; Dong, Huanli; Hu, Wenping

    2017-09-01

    Organic electronics based on poly(vinylidenefluoride/trifluoroethylene) (P(VDF-TrFE)) dielectrics is facing great challenges in flexible circuits. As one indispensable part of integrated circuits, nonvolatile memory devices with low cost and easy fabrication are in urgent demand. A breakthrough is made on a novel ferroelectric random access memory cell (1T1T FeRAM cell) consisting of one selection transistor and one ferroelectric memory transistor in order to overcome the half-selection problem. Unlike complicated manufacturing using multiple dielectrics, this system simplifies 1T1T FeRAM cell fabrication using one common dielectric. To achieve this goal, a strategy for semiconductor/insulator (S/I) interface modulation is put forward and applied to nonhysteretic selection transistors with high performance for driving or addressing purposes. As a result, a high hole mobility of 3.81 cm² V⁻¹ s⁻¹ (average) for 2,6-diphenylanthracene (DPA) and an electron mobility of 0.124 cm² V⁻¹ s⁻¹ (average) for N,N′-1H,1H-perfluorobutyl dicyanoperylenecarboxydiimide (PDI-FCN₂) are obtained in the selection transistors. In this work, we demonstrate this technology's potential for organic ferroelectric-based pixelated memory module fabrication. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Hematological clozapine monitoring with a point-of-care device

    DEFF Research Database (Denmark)

    Nielsen, Jimmi; Thode, Dorrit; Stenager, Elsebeth

    2012-01-01

    for several reasons, perhaps most importantly because of the mandatory hematological monitoring. The Chempaq Express Blood Counter (Chempaq XBC) is a point-of-care device providing counts of white blood cells (WBC) and granulocytes based on a capillary blood sampling. A randomized cross-over trial design...

  1. Remote Effect of Lower Limb Acupuncture on Latent Myofascial Trigger Point of Upper Trapezius Muscle: A Pilot Study

    Directory of Open Access Journals (Sweden)

    Kai-Hua Chen

    2013-01-01

    Objectives. To demonstrate the use of acupuncture in the lower limbs to treat myofascial pain of the upper trapezius muscles via a remote effect. Methods. Five adults with latent myofascial trigger points (MTrPs) of the bilateral upper trapezius muscles received acupuncture at Weizhong (UB40) and Yanglingquan (GB34) points in the lower limbs. Modified acupuncture was applied at these points on a randomly selected ipsilateral lower limb (experimental side) versus sham needling on the contralateral lower limb (control side) in each subject. Each subject received two treatments within a one-week interval. To evaluate the remote effect of acupuncture, the range of motion (ROM) upon bending the contralateral side of the cervical spine was assessed before and after each treatment. Results. There was significant improvement in cervical ROM after the second treatment (P = 0.03) in the experimental group, and the increased ROM on the modified acupuncture side was greater compared to the sham needling side (P = 0.036). Conclusions. A remote effect of acupuncture was demonstrated in this pilot study. Using modified acupuncture needling at remote acupuncture points in the ipsilateral lower limb, our treatments released tightness due to latent MTrPs of the upper trapezius muscle.

  2. Remote Effect of Lower Limb Acupuncture on Latent Myofascial Trigger Point of Upper Trapezius Muscle: A Pilot Study

    Science.gov (United States)

    Chen, Kai-Hua; Hsiao, Kuang-Yu; Lin, Chu-Hsu; Chang, Wen-Ming; Hsu, Hung-Chih; Hsieh, Wei-Chi

    2013-01-01

    Objectives. To demonstrate the use of acupuncture in the lower limbs to treat myofascial pain of the upper trapezius muscles via a remote effect. Methods. Five adults with latent myofascial trigger points (MTrPs) of bilateral upper trapezius muscles received acupuncture at Weizhong (UB40) and Yanglingquan (GB34) points in the lower limbs. Modified acupuncture was applied at these points on a randomly selected ipsilateral lower limb (experimental side) versus sham needling on the contralateral lower limb (control side) in each subject. Each subject received two treatments within a one-week interval. To evaluate the remote effect of acupuncture, the range of motion (ROM) upon bending the contralateral side of the cervical spine was assessed before and after each treatment. Results. There was significant improvement in cervical ROM after the second treatment (P = 0.03) in the experimental group, and the increased ROM on the modified acupuncture side was greater compared to the sham needling side (P = 0.036). Conclusions. A remote effect of acupuncture was demonstrated in this pilot study. Using modified acupuncture needling at remote acupuncture points in the ipsilateral lower limb, our treatments released tightness due to latent MTrPs of the upper trapezius muscle. PMID:23710218

  3. Correlations of pseudo-random numbers of multiplicative sequence

    International Nuclear Information System (INIS)

    Bukin, A.D.

    1989-01-01

    An algorithm is suggested for finding, by computer, the sets of planes in the unit n-dimensional cube on which all points fall whose coordinates are composed of n successive pseudo-random numbers of a multiplicative sequence. This effect should be taken into account in Monte Carlo calculations with a definite constructive dimension. The parameters of these planes are obtained for three random number generators. 2 refs.; 2 tabs
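
    The classic illustration of this effect is RANDU, the multiplicative generator x_{k+1} = 65539 x_k mod 2^31: because 65539^2 - 6*65539 + 9 is divisible by 2^31, every triple of successive outputs satisfies x_{k+2} - 6 x_{k+1} + 9 x_k = 0 (mod 2^31), so all triples fall on at most 15 parallel planes in the unit cube. The sketch below verifies this.

```python
M = 2 ** 31

def randu(seed, n):
    x = seed                      # seed must be odd
    for _ in range(n):
        x = (65539 * x) % M
        yield x

xs = list(randu(1, 3000))
triples = list(zip(xs, xs[1:], xs[2:]))

# The lattice identity holds exactly for every triple of successive outputs:
assert all((z - 6 * y + 9 * x) % M == 0 for x, y, z in triples)

# In the unit cube, 9u - 6v + w is an integer in (-6, 10): at most 15 planes.
planes = {round(9 * x / M - 6 * y / M + z / M) for x, y, z in triples}
print(len(planes), sorted(planes))
```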

  4. Random-walk simulation of selected aspects of dissipative collisions

    International Nuclear Information System (INIS)

    Toeke, J.; Gobbi, A.; Matulewicz, T.

    1984-11-01

    Internuclear thermal equilibrium effects and shell structure effects in dissipative collisions are studied numerically within the framework of the model of stochastic exchanges by applying the random-walk technique. It is found that effective blocking of the drift through the mass flux induced by the temperature difference, while leaving the variances of the mass distributions unaltered, is possible provided an internuclear potential barrier is present. The presence of shell structure is found to lead to characteristic correlations between consecutive exchanges. Experimental evidence for the predicted effects is discussed. (orig.)

  5. The Canopy Graph and Level Statistics for Random Operators on Trees

    International Nuclear Information System (INIS)

    Aizenman, Michael; Warzel, Simone

    2006-01-01

    For operators with homogeneous disorder, it is generally expected that there is a relation between the spectral characteristics of a random operator in the infinite setup and the distribution of the energy gaps in its finite volume versions, in corresponding energy ranges. Whereas pure point spectrum of the infinite operator goes along with Poisson level statistics, it is expected that purely absolutely continuous spectrum would be associated with gap distributions resembling the corresponding random matrix ensemble. We prove that on regular rooted trees, which exhibit both spectral types, the eigenstate point process always has a Poissonian limit. However, we also find that this does not contradict the picture described above if that is carefully interpreted, as the relevant limit of finite trees is not the infinite homogeneous tree graph but rather a single-ended 'canopy graph.' For this tree graph, the random Schroedinger operator is proven here to have only pure-point spectrum at any strength of the disorder. For more general single-ended trees it is shown that the spectrum is always singular - pure point, possibly with a singular continuous component, which is proven to occur in some cases.

  6. Random number generation and creativity.

    Science.gov (United States)

    Bains, William

    2008-01-01

    A previous paper suggested that humans can generate genuinely random numbers. I tested this hypothesis by repeating the experiment with a larger number of highly numerate subjects, asking them to call out a sequence of digits selected from 0 through 9. The resulting sequences were substantially non-random, with an excess of sequential pairs of numbers and a deficit of repeats of the same number, in line with previous literature. However, the previous literature suggests that humans generate random numbers with substantial conscious effort, and distractions which reduce that effort reduce the randomness of the numbers. I reduced my subjects' concentration by asking them to call out in another language, and with alcohol - neither affected the randomness of their responses. This suggests that the ability to generate random numbers is a 'basic' function of the human mind, even if those numbers are not mathematically 'random'. I hypothesise that there is a 'creativity' mechanism which, while not truly random, provides novelty as part of the mind's defence against closed programming loops, and that testing for the effects seen here in people more or less familiar with numbers or with spontaneous creativity could identify more features of this process. It is possible that training to perform better at simple random generation tasks could help to increase creativity, through training people to reduce the conscious mind's suppression of the 'spontaneous', creative response to new questions.

  7. Use acupuncture to treat functional constipation: study protocol for a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Li Ying

    2012-07-01

    Abstract Background Whether acupuncture is effective for patients with functional constipation is still unclear. Therefore, we report the protocol of a randomized controlled trial of using acupuncture to treat functional constipation. Design A randomized, controlled, four-arm design, large-scale trial is currently underway in China. Seven hundred participants are randomly assigned to three acupuncture treatment groups and a Mosapride Citrate control group in a 1:1:1:1 ratio. Participants in the acupuncture groups receive 16 sessions of acupuncture treatment, and are followed up for a period of 9 weeks after randomization. The acupuncture groups are: (1) Back-Shu and Front-Mu acupoints of the Large Intestine meridians (Shu-Mu points group); (2) He-Sea and Lower He-Sea acupoints of the Large Intestine meridians (He points group); (3) combined use of Back-Shu, Front-Mu, He-Sea, and Lower He-Sea acupoints of the Large Intestine meridians (Shu-Mu-He points group). The control group is the Mosapride Citrate group. The primary outcome is the frequency of defecation per week at the fourth week after randomization. The secondary outcomes include the Bristol stool scale, the extent of difficulty during defecation, the MOS 36-item Short Form health survey (SF-36), the Self-Rating Anxiety Scale (SAS), and the Self-Rating Depression Scale (SDS). The first two secondary outcomes are measured 1 week before randomization and 2, 4, and 8 weeks after randomization. The other secondary outcomes are measured 1 week before randomization and 2 and 4 weeks after randomization, except that the SF-36 is measured at randomization and 4 weeks after randomization. Discussion The result of this trial (which will be available in 2012) will confirm whether acupuncture is effective for treating functional constipation and whether traditional acupuncture theories play an important role in it. Trial registration: ClinicalTrials.gov NCT01411501

  8. Sampling point selection for energy estimation in the quasicontinuum method

    NARCIS (Netherlands)

    Beex, L.A.A.; Peerlings, R.H.J.; Geers, M.G.D.

    2010-01-01

    The quasicontinuum (QC) method reduces computational costs of atomistic calculations by using interpolation between a small number of so-called repatoms to represent the displacements of the complete lattice and by selecting a small number of sampling atoms to estimate the total potential energy of

  9. Quantum phase transitions in random XY spin chains

    International Nuclear Information System (INIS)

    Bunder, J.E.; McKenzie, R.H.

    2000-01-01

    Full text: The XY spin chain in a transverse field is one of the simplest quantum spin models. It is a reasonable model for heavy fermion materials such as CeCu6-xAux. It has two quantum phase transitions: the Ising transition and the anisotropic transition. Quantum phase transitions occur at zero temperature. We are investigating what effect the introduction of randomness has on these quantum phase transitions. Disordered systems which undergo quantum phase transitions can exhibit new universality classes. The universality class of a phase transition is defined by the set of critical exponents. In a random system with quantum phase transitions we can observe Griffiths-McCoy singularities. Such singularities are observed in regions which have no long range order, so they are not classified as critical regions, yet they display phenomena normally associated with critical points, such as a diverging susceptibility. Griffiths-McCoy phases are due to rare regions with stronger than average interactions and may be present far from the quantum critical point. We show how the random XY spin chain may be mapped onto a random Dirac equation. This allows us to calculate the density of states without making any approximations. From the density of states we can describe the conditions which should allow a Griffiths-McCoy phase. We find that for the Ising transition the dynamic critical exponent, z, is not universal. It is proportional to the disorder strength and inversely proportional to the energy gap, hence z becomes infinite at the critical point where the energy gap vanishes

  10. A fixed-point farrago

    CERN Document Server

    Shapiro, Joel H

    2016-01-01

    This text provides an introduction to some of the best-known fixed-point theorems, with an emphasis on their interactions with topics in analysis. The level of exposition increases gradually throughout the book, building from a basic requirement of undergraduate proficiency to graduate-level sophistication. Appendices provide an introduction to (or refresher on) some of the prerequisite material and exercises are integrated into the text, contributing to the volume’s ability to be used as a self-contained text. Readers will find the presentation especially useful for independent study or as a supplement to a graduate course in fixed-point theory. The material is split into four parts: the first introduces the Banach Contraction-Mapping Principle and the Brouwer Fixed-Point Theorem, along with a selection of interesting applications; the second focuses on Brouwer’s theorem and its application to John Nash’s work; the third applies Brouwer’s theorem to spaces of infinite dimension; and the fourth rests ...

  11. Selection and characterization of DNA aptamers

    NARCIS (Netherlands)

    Ruigrok, V.J.B.

    2013-01-01

    This thesis focusses on the selection and characterisation of DNA aptamers and the various aspects related to their selection from large pools of randomized oligonucleotides. Aptamers are affinity tools that can specifically recognize and bind predefined target molecules; this ability, however,

  12. Scattering and absorption of particles emitted by a point source in a cluster of point scatterers

    International Nuclear Information System (INIS)

    Liljequist, D.

    2012-01-01

    A theory for the scattering and absorption of particles isotropically emitted by a point source in a cluster of point scatterers is described and related to the theory for the scattering of an incident particle beam. The quantum mechanical probability of escape from the cluster in different directions is calculated, as well as the spatial distribution of absorption events within the cluster. A source strength renormalization procedure is required. The average quantum scattering in clusters with randomly shifting scatterer positions is compared to trajectory simulation with the aim of studying the validity of the trajectory method. Differences between the results of the quantum and trajectory methods are found primarily for wavelengths larger than the average distance between nearest-neighbour scatterers. The average quantum results include, for example, a local minimum in the number of absorption events at the location of the point source and interference patterns in the angle-dependent escape probability as well as in the distribution of absorption events. The relative error of the trajectory method is in general, though not universally, of similar magnitude to that obtained for beam scattering.

  13. Pseudo-Random Number Generators

    Science.gov (United States)

    Howell, L. W.; Rheinfurth, M. H.

    1984-01-01

    Package features a comprehensive selection of probabilistic distributions. Monte Carlo simulations are resorted to whenever the systems studied are not amenable to deterministic analyses or when direct experimentation is not feasible. Random numbers having certain specified distribution characteristics are an integral part of such simulations. Package consists of a collection of "pseudorandom" number generators for use in Monte Carlo simulations.
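
    The essence of such a package is turning a stream of uniform pseudo-random numbers into draws from a specified distribution. Below is a minimal sketch of the inverse-transform approach, using the exponential distribution as an example; the original package is Fortran-era, so this Python rendering is purely illustrative.

```python
import math
import random

def exponential_from_uniform(rate, u):
    """Inverse CDF of Exp(rate): F^{-1}(u) = -ln(1 - u) / rate."""
    return -math.log(1.0 - u) / rate

rng = random.Random(12345)            # underlying pseudo-random generator
samples = [exponential_from_uniform(2.0, rng.random()) for _ in range(100000)]
print(sum(samples) / len(samples))    # should approach the mean 1/rate = 0.5
```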

  14. Point-of-care cluster randomized trial in stroke secondary prevention using electronic health records

    NARCIS (Netherlands)

    Dregan, Alex; van Staa, Tjeerd P; McDermott, Lisa; McCann, Gerard; Ashworth, Mark; Charlton, Judith; Wolfe, Charles D A; Rudd, Anthony; Yardley, Lucy; Gulliford, Martin C

    BACKGROUND AND PURPOSE: The aim of this study was to evaluate whether the remote introduction of electronic decision support tools into family practices improves risk factor control after first stroke. This study also aimed to develop methods to implement cluster randomized trials in stroke using

  15. Preliminary Studies on Existing Scenario of Selected Soil Property in Cheddikulam DS Division Vavuniya, Sri Lanka

    Directory of Open Access Journals (Sweden)

    M.A. R. Aashifa

    2017-01-01

    This study was conducted to quantify the spatial variability of soil properties, use this information to produce accurate maps by means of ordinary kriging, and find ways to reclaim the problem soils and suggest crop varieties suitable for the existing soil properties. 70 sampling points were selected using a stratified random sampling method. Stratification was based on the type of land cover, and the following land cover patterns were identified: forest patches, agricultural land patches, grassland patches and catchments. Sampling points were randomly selected from each land cover type. The minimum distance between two adjacent sampling points was 500 m. Soil samples were analyzed for pH, EC, exchangeable K, and available P. In each location, soils were collected from the top 30 cm (root zone) using a core sampler, and sub-soil samples were collected around the geo-referenced point to obtain a composite sample. The geostatistical tool of the software (ArcGIS 10.2.2 trial version) was used to construct semi-variograms and perform spatial structure analysis for the variables. Geostatistical estimation was done by kriging. 13% of the agricultural land area had acidic soil and 5.7% alkaline soil. 13% of the agricultural land area was identified as saline soil. 67.11% of agricultural lands contain a phosphorus concentration above the optimum range. 3.4% of agricultural lands contain a potassium concentration above the optimum range. 98% of forest lands and 100% of grasslands contain phosphorus concentrations above the optimum range, but forest lands and catchments show lower levels of potassium concentration. 22% of grasslands contain more potassium than the optimum level. Agricultural practices lead to changes in the soil; hence the identified soil problems should be reclaimed in order to maintain the fertility of the soil for sustainable production. Proper management of soil can be a better solution for supporting the
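
    For readers unfamiliar with the interpolation step, the sketch below performs ordinary kriging of point-sampled soil pH onto a grid. Coordinates and values are synthetic, and the pykrige package is an assumed substitute for the ArcGIS geostatistical tool used in the study.

```python
import numpy as np
from pykrige.ok import OrdinaryKriging

rng = np.random.default_rng(7)
x = rng.uniform(0, 5000, 70)            # 70 sampling points (m), as in the study
y = rng.uniform(0, 5000, 70)
ph = 6.5 + 0.0002 * x - 0.0001 * y + rng.normal(0, 0.2, 70)  # synthetic soil pH

ok = OrdinaryKriging(x, y, ph, variogram_model="spherical")
gridx = np.linspace(0, 5000, 50)
gridy = np.linspace(0, 5000, 50)
ph_map, variance = ok.execute("grid", gridx, gridy)   # kriged surface + variance
print("share of grid below pH 6:", float((ph_map < 6.0).mean()))
```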

  16. Model Selection with the Linear Mixed Model for Longitudinal Data

    Science.gov (United States)

    Ryoo, Ji Hoon

    2011-01-01

    Model building or model selection with linear mixed models (LMMs) is complicated by the presence of both fixed effects and random effects. The fixed effects structure and random effects structure are codependent, so selection of one influences the other. Most presentations of LMM in psychology and education are based on a multilevel or…

  17. SDE based regression for random PDEs

    KAUST Repository

    Bayer, Christian

    2016-01-01

    A simulation based method for the numerical solution of PDE with random coefficients is presented. By the Feynman-Kac formula, the solution can be represented as conditional expectation of a functional of a corresponding stochastic differential equation driven by independent noise. A time discretization of the SDE for a set of points in the domain and a subsequent Monte Carlo regression lead to an approximation of the global solution of the random PDE. We provide an initial error and complexity analysis of the proposed method along with numerical examples illustrating its behaviour.
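
    A toy version of this pipeline fits in a few lines. The sketch below treats the backward heat equation u_t + (a/2) u_xx = 0 with terminal condition u(T, x) = max(x, 0) for one fixed realization a of the random coefficient: Euler-Maruyama paths give pointwise Feynman-Kac estimates u(0, x) = E[g(x + sqrt(a) W_T)], and a polynomial regression over the evaluation points yields a global approximation. All concrete choices (coefficient, terminal condition, regression degree) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
T, a = 1.0, 0.7                        # one realization of the random coefficient
g = lambda x: np.maximum(x, 0.0)       # terminal condition

x_pts = np.linspace(-2.0, 2.0, 30)     # scattered evaluation points in the domain
n_paths, n_steps = 2000, 50
dt = T / n_steps

u_mc = np.empty_like(x_pts)
for i, x0 in enumerate(x_pts):
    X = np.full(n_paths, x0)
    for _ in range(n_steps):           # Euler-Maruyama for dX = sqrt(a) dW
        X += np.sqrt(a * dt) * rng.standard_normal(n_paths)
    u_mc[i] = g(X).mean()              # pointwise Feynman-Kac estimate

# Regression step: a global polynomial approximation of u(0, .)
coeffs = np.polynomial.polynomial.polyfit(x_pts, u_mc, deg=6)
u_hat = lambda x: np.polynomial.polynomial.polyval(x, coeffs)
print(u_hat(0.0))                      # exact value sqrt(a*T/(2*pi)) ~ 0.334
```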

  18. SDE based regression for random PDEs

    KAUST Repository

    Bayer, Christian

    2016-01-06

    A simulation based method for the numerical solution of PDE with random coefficients is presented. By the Feynman-Kac formula, the solution can be represented as conditional expectation of a functional of a corresponding stochastic differential equation driven by independent noise. A time discretization of the SDE for a set of points in the domain and a subsequent Monte Carlo regression lead to an approximation of the global solution of the random PDE. We provide an initial error and complexity analysis of the proposed method along with numerical examples illustrating its behaviour.

  19. Hypothesis testing of a change point during cognitive decline among Alzheimer's disease patients.

    Science.gov (United States)

    Ji, Ming; Xiong, Chengjie; Grundman, Michael

    2003-10-01

    In this paper, we present a statistical hypothesis test for detecting a change point over the course of cognitive decline among Alzheimer's disease patients. The model under the null hypothesis assumes a constant rate of cognitive decline over time and the model under the alternative hypothesis is a general bilinear model with an unknown change point. When the change point is unknown, however, the null distribution of the test statistics is not analytically tractable and has to be simulated by parametric bootstrap. When the alternative hypothesis that a change point exists is accepted, we propose an estimate of its location based on the Akaike Information Criterion. We applied our method to a data set from the Neuropsychological Database Initiative by implementing our hypothesis testing method to analyze Mini-Mental State Examination (MMSE) scores based on a random-slope and random-intercept model with a bilinear fixed effect. Our results show that, despite a large amount of missing data, accelerated decline did occur in MMSE scores among AD patients. Our finding supports the clinical belief of the existence of a change point during cognitive decline among AD patients and suggests the use of change point models for the longitudinal modeling of cognitive decline in AD research.
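
    Stripped of the mixed-model machinery, the change-point comparison can be illustrated with ordinary least squares: fit a single-slope model and a bilinear model over a grid of candidate change points, then compare by AIC. The sketch below uses simulated scores; the paper's actual analysis adds random intercepts/slopes and a parametric bootstrap for the null distribution.

```python
import numpy as np

def fit_rss(y, X):
    """Residual sum of squares of an OLS fit."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta
    return float(resid @ resid)

def aic(rss, n, k):
    return n * np.log(rss / n) + 2 * k

rng = np.random.default_rng(0)
t = np.linspace(0.0, 6.0, 80)                        # years of follow-up
y = 28 - 1.0 * t - 2.5 * np.maximum(t - 3.0, 0) + rng.normal(0, 1.0, 80)

n = len(t)
rss_lin = fit_rss(y, np.column_stack([np.ones(n), t]))        # H0: one slope
tau_grid = np.linspace(0.5, 5.5, 51)                          # candidate points
best = min(((tau, fit_rss(y, np.column_stack(
                [np.ones(n), t, np.maximum(t - tau, 0)])))
            for tau in tau_grid), key=lambda p: p[1])         # H1: bilinear
print("AIC linear  :", aic(rss_lin, n, 2))
print("AIC bilinear:", aic(best[1], n, 4), "| change point at", round(best[0], 2))
```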

  20. Pointing control for LDR

    Science.gov (United States)

    Yam, Y.; Briggs, C.

    1988-01-01

    One important aspect of the LDR control problem is the possible excitations of structural modes due to random disturbances, mirror chopping, and slewing maneuvers. An analysis was performed to yield a first order estimate of the effects of such dynamic excitations. The analysis involved a study of slewing jitters, chopping jitters, disturbance responses, and pointing errors, making use of a simplified planar LDR model which describes the LDR dynamics on a plane perpendicular to the primary reflector. Briefly, the results indicate that the command slewing profile plays an important role in minimizing the resultant jitter, even to a level acceptable without any control action. An optimal profile should therefore be studied.

  1. Transient selection in multicellular immune networks

    Science.gov (United States)

    Ivanchenko, M. V.

    2011-03-01

    We analyze the dynamics of a multi-clonotype naive T-cell population competing for survival signals from antigen-presenting cells. We find that this competition provides an efficacious selection of clonotypes, driving the less able and more repetitive ones extinct. We uncover the scaling principles that the extinction rate obeys for large systems and calibrate the model parameters to their experimental counterparts. For the first time, we estimate the physiological values of the T-cell receptor-antigen presentation profile recognition probability and the T-cell clonotype niche overlap. We demonstrate that, while the ultimate state is a stable fixed point, sequential transients dominate the dynamics over large timescales that may span years, if not decades, in real time. We argue that what is currently viewed as "homeostasis" is a complex sequential transient process, which is quasi-stationary only in the total number of T-cells. The discovered type of sequential transient dynamics in large random networks is a novel alternative to the stable heteroclinic channel mechanism.

  2. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  3. Ferrimagnetic Properties of Bond Dilution Mixed Blume-Capel Model with Random Single-Ion Anisotropy

    International Nuclear Information System (INIS)

    Liu Lei; Yan Shilei

    2005-01-01

    We study the ferrimagnetic properties of spin-1/2 and spin-1 systems by means of the effective field theory. The system is considered in the framework of the bond dilution mixed Blume-Capel model (BCM) with random single-ion anisotropy. The investigation of phase diagrams and magnetization curves indicates the existence of induced magnetic ordering and of single or multiple compensation points. Special emphasis is placed on the influence of bond dilution and random single-ion anisotropy on normal or induced magnetic ordering states and on single or multiple compensation points. Normal magnetic ordering states take on new phase diagrams with increasing randomness (bond and anisotropy), while anisotropy-induced magnetic ordering states always occur, no matter whether the concentration of anisotropy is large or small. The existence and disappearance of compensation points rely strongly on bond dilution and random single-ion anisotropy. Some results have not been revealed in previous papers nor predicted by the Néel theory of ferrimagnetism.

  4. Diffusion in randomly perturbed dissipative dynamics

    Science.gov (United States)

    Rodrigues, Christian S.; Chechkin, Aleksei V.; de Moura, Alessandro P. S.; Grebogi, Celso; Klages, Rainer

    2014-11-01

    Dynamical systems having many coexisting attractors present interesting properties from both fundamental theoretical and modelling points of view. When such dynamics is under bounded random perturbations, the basins of attraction are no longer invariant and there is the possibility of transport among them. Here we introduce a basic theoretical setting which enables us to study this hopping process from the perspective of anomalous transport using the concept of a random dynamical system with holes. We apply it to a simple model by investigating the role of hyperbolicity for the transport among basins. We show numerically that our system exhibits non-Gaussian position distributions, power-law escape times, and subdiffusion. Our simulation results are reproduced consistently from stochastic continuous time random walk theory.

  5. The adverse effect of selective cyclooxygenase-2 inhibitor on random skin flap survival in rats.

    Directory of Open Access Journals (Sweden)

    Haiyong Ren

    BACKGROUND: Cyclooxygenase-2 (COX-2) inhibitors provide desired analgesic effects after injury or surgery, but evidence suggests they also attenuate wound healing. This study investigates the effect of a COX-2 inhibitor on random skin flap survival. METHODS: The McFarlane flap model was established in 40 rats evaluated in two groups; for 7 days, one group received Parecoxib and the other the same volume of saline injection. The necrotic area of the flap was measured, and specimens of the flap were stained with haematoxylin-eosin (HE) for histologic analysis. Immunohistochemical staining was performed to analyse the levels of VEGF and COX-2. RESULTS: 7 days after the operation, the flap necrotic area ratio in the study group (66.65 ± 2.81%) was significantly larger than that of the control group (48.81 ± 2.33%) (P < 0.01). Histological analysis demonstrated angiogenesis, with the mean vessel density per mm² being lower in the study group (15.4 ± 4.4) than in the control group (27.2 ± 4.1) (P < 0.05). The expression of COX-2 and VEGF protein in intermediate area II was evaluated in the two groups by immunohistochemistry. The expression of COX-2 in the study group was 1022.45 ± 153.1, and in the control group 2638.05 ± 132.2 (P < 0.01). The expression of VEGF in the study and control groups was 2779.45 ± 472.0 vs 4938.05 ± 123.6 (P < 0.01). In the COX-2 inhibitor group, the expression of COX-2 and VEGF protein was remarkably down-regulated compared with the control group. CONCLUSION: The selective COX-2 inhibitor had an adverse effect on random skin flap survival. Suppression of neovascularization induced by the low level of VEGF is the presumed biological mechanism.

  6. Bias in random forest variable importance measures: Illustrations, sources and a solution

    Directory of Open Access Journals (Sweden)

    Hothorn Torsten

    2007-01-01

    Background Variable importance measures for random forests have been receiving increased attention as a means of variable selection in many classification tasks in bioinformatics and related scientific fields, for instance to select a subset of genetic markers relevant for the prediction of a certain disease. We show that random forest variable importance measures are a sensible means for variable selection in many applications, but are not reliable in situations where potential predictor variables vary in their scale of measurement or their number of categories. This is particularly important in genomics and computational biology, where predictors often include variables of different types, for example when predictors include both sequence data and continuous variables such as folding energy, or when amino acid sequence data show different numbers of categories. Results Simulation studies are presented illustrating that, when random forest variable importance measures are used with data of varying types, the results are misleading because suboptimal predictor variables may be artificially preferred in variable selection. The two mechanisms underlying this deficiency are biased variable selection in the individual classification trees used to build the random forest on one hand, and effects induced by bootstrap sampling with replacement on the other hand. Conclusion We propose to employ an alternative implementation of random forests, that provides unbiased variable selection in the individual classification trees. When this method is applied using subsampling without replacement, the resulting variable importance measures can be used reliably for variable selection even in situations where the potential predictor variables vary in their scale of measurement or their number of categories. The usage of both random forest algorithms and their variable importance measures in the R system for statistical computing is illustrated and
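
    The bias is easy to reproduce. The sketch below uses scikit-learn rather than the R party/cforest implementation the authors propose: with a response that is pure noise, impurity-based importances still favour the predictors with more categories or values, while permutation importance evaluated on held-out data stays near zero for all of them.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.integers(0, 2, n),        # binary predictor
    rng.integers(0, 20, n),       # 20-category predictor
    rng.uniform(0, 1, n),         # continuous predictor
]).astype(float)
y = rng.integers(0, 2, n)         # response independent of every predictor

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
print("impurity importances   :", rf.feature_importances_)  # inflated for cols 1, 2
perm = permutation_importance(rf, X_te, y_te, n_repeats=20, random_state=0)
print("permutation importances:", perm.importances_mean)    # all near zero
```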

  7. Existence of solutions for quasilinear random impulsive neutral differential evolution equation

    Directory of Open Access Journals (Sweden)

    B. Radhakrishnan

    2018-07-01

    This paper deals with the existence of solutions for quasilinear random impulsive neutral functional differential evolution equation in Banach spaces and the results are derived by using the analytic semigroup theory, fractional powers of operators and the Schauder fixed point approach. An application is provided to illustrate the theory. Keywords: Quasilinear differential equation, Analytic semigroup, Random impulsive neutral differential equation, Fixed point theorem, 2010 Mathematics Subject Classification: 34A37, 47H10, 47H20, 34K40, 34K45, 35R12

  8. Multi-Label Learning via Random Label Selection for Protein Subcellular Multi-Locations Prediction.

    Science.gov (United States)

    Wang, Xiao; Li, Guo-Zheng

    2013-03-12

    Prediction of protein subcellular localization is an important but challenging problem, particularly when proteins may simultaneously exist at, or move between, two or more different subcellular location sites. Most of the existing protein subcellular localization methods deal only with single-location proteins. In the past few years, only a few methods have been proposed to tackle proteins with multiple locations. However, they adopt only a simple strategy, that is, transforming the multi-location proteins to multiple proteins with single locations, which does not take the correlations among different subcellular locations into account. In this paper, a novel method named RALS (multi-label learning via RAndom Label Selection) is proposed to learn from multi-location proteins in an effective and efficient way. Through a five-fold cross validation test on a benchmark dataset, we demonstrate that our proposed method, with its consideration of label correlations, obviously outperforms the baseline BR method without consideration of label correlations, indicating that correlations among different subcellular locations really exist and contribute to the improvement of prediction performance. Experimental results on two benchmark datasets also show that our proposed methods achieve significantly higher performance than some other state-of-the-art methods in predicting subcellular multi-locations of proteins. The prediction web server is available at http://levis.tongji.edu.cn:8080/bioinfo/MLPred-Euk/ for public usage.

  9. On the product and ratio of Bessel random variables

    Directory of Open Access Journals (Sweden)

    Saralees Nadarajah

    2005-01-01

    Full Text Available The distributions of products and ratios of random variables are of interest in many areas of the sciences. In this paper, the exact distributions of the product |XY| and the ratio |X/Y| are derived when X and Y are independent Bessel function random variables. An application of the results is provided by tabulating the associated percentage points.

  10. PointCloudExplore 2: Visual exploration of 3D gene expression

    Energy Technology Data Exchange (ETDEWEB)

    International Research Training Group Visualization of Large and Unstructured Data Sets, University of Kaiserslautern, Germany; Institute for Data Analysis and Visualization, University of California, Davis, CA; Computational Research Division, Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA; Genomics Division, LBNL; Computer Science Department, University of California, Irvine, CA; Computer Science Division,University of California, Berkeley, CA; Life Sciences Division, LBNL; Department of Molecular and Cellular Biology and the Center for Integrative Genomics, University of California, Berkeley, CA; Ruebel, Oliver; Rubel, Oliver; Weber, Gunther H.; Huang, Min-Yu; Bethel, E. Wes; Keranen, Soile V.E.; Fowlkes, Charless C.; Hendriks, Cris L. Luengo; DePace, Angela H.; Simirenko, L.; Eisen, Michael B.; Biggin, Mark D.; Hagen, Hand; Malik, Jitendra; Knowles, David W.; Hamann, Bernd

    2008-03-31

    To better understand how developmental regulatory networks are defined in the genome sequence, the Berkeley Drosophila Transcription Network Project (BDTNP) has developed a suite of methods to describe 3D gene expression data, i.e., the output of the network at cellular resolution for multiple time points. To allow researchers to explore these novel data sets we have developed PointCloudXplore (PCX). In PCX we have linked physical and information visualization views via the concept of brushing (cell selection). For each view, dedicated operations for performing selection of cells are available. In PCX, all cell selections are stored in a central management system. Cells selected in one view can in this way be highlighted in any view, allowing further cell subset properties to be determined. Complex cell queries can be defined by combining different cell selections using logical operations such as AND, OR, and NOT. Here we provide an overview of PointCloudXplore 2 (PCX2), the latest publicly available version of PCX. PCX2 has shown to be an effective tool for visual exploration of 3D gene expression data. We discuss (i) all views available in PCX2, (ii) different strategies to perform cell selection, (iii) the basic architecture of PCX2, and (iv) illustrate the usefulness of PCX2 using selected examples.

  11. A Comparison of the Hot Spot and the Average Cancer Cell Counting Methods and the Optimal Cutoff Point of the Ki-67 Index for Luminal Type Breast Cancer.

    Science.gov (United States)

    Arima, Nobuyuki; Nishimura, Reiki; Osako, Tomofumi; Nishiyama, Yasuyuki; Fujisue, Mamiko; Okumura, Yasuhiro; Nakano, Masahiro; Tashima, Rumiko; Toyozumi, Yasuo

    2016-01-01

    In this case-control study, we investigated the most suitable cell counting area and the optimal cutoff point of the Ki-67 index. Thirty recurrent cases were selected among hormone receptor (HR)-positive/HER2-negative breast cancer patients. As controls, 90 nonrecurrent cases were randomly selected by allotting 3 controls to each recurrent case based on the following criteria: age, nodal status, tumor size, and adjuvant endocrine therapy alone. Both the hot spot and the average area of the tumor were evaluated on a Ki-67 immunostaining slide. The median Ki-67 index values at the hot spot and the average area were 25.0% and 14.5%, respectively. Irrespective of the area counted, the Ki-67 index value was significantly higher in all of the recurrent cases, and the hot spot yielded the most suitable cutoff point for predicting recurrence. Moreover, a higher ΔKi-67 index value (the difference between the hot spot and the average area, ≥10%) and lower progesterone receptor expression strongly correlated with recurrence, and the optimal cutoff point at the hot spot was found to be 20%. © 2015 S. Karger AG, Basel.

  12. Exceptional points near first- and second-order quantum phase transitions.

    Science.gov (United States)

    Stránský, Pavel; Dvořák, Martin; Cejnar, Pavel

    2018-01-01

    We study the impact of quantum phase transitions (QPTs) on the distribution of exceptional points (EPs) of the Hamiltonian in the complex-extended parameter domain. Analyzing first- and second-order QPTs in the Lipkin-Meshkov-Glick model we find an exponentially and polynomially close approach of EPs to the respective critical point with increasing size of the system. If the critical Hamiltonian is subject to random perturbations of various kinds, the averaged distribution of EPs close to the critical point still carries decisive information on the QPT type. We therefore claim that properties of the EP distribution represent a parametrization-independent signature of criticality in quantum systems.

  13. Survivor bias in Mendelian randomization analysis

    DEFF Research Database (Denmark)

    Vansteelandt, Stijn; Dukes, Oliver; Martinussen, Torben

    2017-01-01

    Mendelian randomization studies employ genotypes as experimental handles to infer the effect of genetically modified exposures (e.g. vitamin D exposure) on disease outcomes (e.g. mortality). The statistical analysis of these studies makes use of the standard instrumental variables framework. Many of these studies focus on elderly populations, thereby ignoring the problem of left truncation, which arises due to the selection of study participants being conditional upon surviving up to the time of study onset. Such selection, in general, invalidates the assumptions on which the instrumental variables analysis rests. We show that Mendelian randomization studies of adult or elderly populations will therefore, in general, return biased estimates of the exposure effect when the considered genotype affects mortality; in contrast, standard tests of the causal null hypothesis that the exposure does not affect

  14. Performance of Power Systems under Sustained Random Perturbations

    Directory of Open Access Journals (Sweden)

    Humberto Verdejo

    2014-01-01

    This paper studies linear systems under sustained additive random perturbations. The stable operating point of an electric power system is replaced by an attracting stationary solution if the system is subjected to (small) random additive perturbations. The invariant distribution of this stationary solution gives rise to several performance indices that measure how well the system copes with the randomness. These indices are introduced, showing how they can be used for the optimal tuning of system parameters in the presence of noise. Results on a four-generator two-area system are presented and discussed.

  15. High-Tg Polynorbornene-Based Block and Random Copolymers for Butanol Pervaporation Membranes

    Science.gov (United States)

    Register, Richard A.; Kim, Dong-Gyun; Takigawa, Tamami; Kashino, Tomomasa; Burtovyy, Oleksandr; Bell, Andrew

    Vinyl addition polymers of substituted norbornene (NB) monomers possess desirably high glass transition temperatures (Tg); however, until very recently, the lack of an applicable living polymerization chemistry has precluded the synthesis of such polymers with controlled architecture, or copolymers with controlled sequence distribution. We have recently synthesized block and random copolymers of NB monomers bearing hydroxyhexafluoroisopropyl and n-butyl substituents (HFANB and BuNB) via living vinyl addition polymerization with Pd-based catalysts. Both series of polymers were cast into the selective skin layers of thin film composite (TFC) membranes, and these organophilic membranes investigated for the isolation of n-butanol from dilute aqueous solution (model fermentation broth) via pervaporation. The block copolymers show well-defined microphase-separated morphologies, both in bulk and as the selective skin layers on TFC membranes, while the random copolymers are homogeneous. Both block and random vinyl addition copolymers are effective as n-butanol pervaporation membranes, with the block copolymers showing a better flux-selectivity balance. While polyHFANB has much higher permeability and n-butanol selectivity than polyBuNB, incorporating BuNB units into the polymer (in either a block or random sequence) limits the swelling of the polyHFANB and thereby improves the n-butanol pervaporation selectivity.

  16. Fault Diagnosis for Hydraulic Servo System Using Compressed Random Subspace Based ReliefF

    Directory of Open Access Journals (Sweden)

    Yu Ding

    2018-01-01

    Playing an important role in electromechanical systems, the hydraulic servo system is crucial to mechanical systems such as engineering machinery, metallurgical machinery, ships, and other equipment. Fault diagnosis based on monitoring and sensory signals plays an important role in avoiding catastrophic accidents and enormous economic losses. This study presents a fault diagnosis scheme for the hydraulic servo system using the compressed random subspace based ReliefF (CRSR) method. From the point of view of feature selection, the scheme utilizes the CRSR method to determine the most stable feature combination that simultaneously contains the most adequate information. Based on the feature selection structure of ReliefF, CRSR employs feature integration rules in the compressed domain. Meanwhile, CRSR substitutes information entropy and fuzzy membership for the traditional distance measurement index. The proposed CRSR method is able to enhance the robustness of the feature information against interference while selecting the feature combination with balanced information expressing ability. To demonstrate the effectiveness of the proposed CRSR method, a hydraulic servo system joint simulation model is constructed using HyPneu and Simulink, and three fault modes are injected to generate the validation data.

  17. Radiographic Progression-Free Survival as a Clinically Meaningful End Point in Metastatic Castration-Resistant Prostate Cancer: The PREVAIL Randomized Clinical Trial.

    Science.gov (United States)

    Rathkopf, Dana E; Beer, Tomasz M; Loriot, Yohann; Higano, Celestia S; Armstrong, Andrew J; Sternberg, Cora N; de Bono, Johann S; Tombal, Bertrand; Parli, Teresa; Bhattacharya, Suman; Phung, De; Krivoshik, Andrew; Scher, Howard I; Morris, Michael J

    2018-05-01

    Drug development for metastatic castration-resistant prostate cancer has been limited by a lack of clinically relevant trial end points short of overall survival (OS). Radiographic progression-free survival (rPFS) as defined by the Prostate Cancer Clinical Trials Working Group 2 (PCWG2) is a candidate end point that represents a clinically meaningful benefit to patients. To demonstrate the robustness of the PCWG2 definition and to examine the relationship between rPFS and OS. PREVAIL was a phase 3, randomized, double-blind, placebo-controlled multinational study that enrolled 1717 chemotherapy-naive men with metastatic castration-resistant prostate cancer from September 2010 through September 2012. The data were analyzed in November 2016. Patients were randomized 1:1 to enzalutamide 160 mg or placebo until confirmed radiographic disease progression or a skeletal-related event and initiation of either cytotoxic chemotherapy or an investigational agent for prostate cancer treatment. Sensitivity analyses (SAs) of investigator-assessed rPFS were performed using the final rPFS data cutoff (May 6, 2012; 439 events; SA1) and the interim OS data cutoff (September 16, 2013; 540 events; SA2). Additional SAs using investigator-assessed rPFS from the final rPFS data cutoff assessed the impact of skeletal-related events (SA3), clinical progression (SA4), a confirmatory scan for soft-tissue disease progression (SA5), and all deaths regardless of time after study drug discontinuation (SA6). Correlations between investigator-assessed rPFS (SA2) and OS were calculated using Spearman ρ and Kendall τ via Clayton copula. In the 1717 men (mean age, 72.0 [range, 43.0-93.0] years in enzalutamide arm and 71.0 [range, 42.0-93.0] years in placebo arm), enzalutamide significantly reduced risk of radiographic progression or death in all SAs, with hazard ratios of 0.22 (SA1; 95% CI, 0.18-0.27), 0.31 (SA2; 95% CI, 0.27-0.35), 0.21 (SA3; 95% CI, 0.18-0.26), 0.21 (SA4; 95% CI, 0.17-0.26), 0

  18. New Interval-Valued Intuitionistic Fuzzy Behavioral MADM Method and Its Application in the Selection of Photovoltaic Cells

    Directory of Open Access Journals (Sweden)

    Xiaolu Zhang

    2016-10-01

    Full Text Available As one of the emerging renewable resources, the photovoltaic cell holds promise for offering clean and plentiful energy. The selection of the best photovoltaic cell plays a significant role for a promoter in terms of maximizing income, minimizing costs, and ensuring high maturity and reliability, which is a typical multiple attribute decision making (MADM) problem. Although many prominent MADM techniques have been developed, most of them select the optimal alternative under the hypothesis that the decision maker or expert is completely rational and that the decision data are represented by crisp values. However, in the selection process for photovoltaic cells the decision maker is usually boundedly rational and the ratings of alternatives are usually imprecise and vague. To address these kinds of complex and common issues, in this paper we develop a new interval-valued intuitionistic fuzzy behavioral MADM method. We employ interval-valued intuitionistic fuzzy numbers (IVIFNs) to express the imprecise ratings of alternatives, and we construct LINMAP-based nonlinear programming models to identify the reference points under IVIFN contexts, which avoids the subjective randomness of selecting the reference points. Finally we develop a prospect theory-based ranking method to identify the optimal alternative, which fully takes into account the decision maker's behavioral characteristics such as reference dependence, diminishing sensitivity and loss aversion in the decision making process.
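
    The prospect-theoretic ranking step weighs gains and losses against a reference point; a small sketch with the Tversky-Kahneman value function follows (the coefficients are the commonly cited textbook estimates, not values taken from this paper):

        def prospect_value(outcome, reference, alpha=0.88, beta=0.88, lam=2.25):
            """Prospect theory value function: outcomes are judged relative
            to a reference point, with diminishing sensitivity and losses
            weighted more heavily than equal-sized gains (lam > 1)."""
            gain = outcome - reference
            if gain >= 0:
                return gain ** alpha
            return -lam * (-gain) ** beta

        # Against a reference of 0.5, a rating of 0.6 is valued ~0.13 while
        # a rating of 0.4 is valued ~-0.30: losses loom larger than gains.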

  19. Probabilistic analysis of structures involving random stress-strain behavior

    Science.gov (United States)

    Millwater, H. R.; Thacker, B. H.; Harren, S. V.

    1991-01-01

    The present methodology for analysis of structures with random stress-strain behavior characterizes the uniaxial stress-strain curve in terms of (1) elastic modulus, (2) engineering stress at initial yield, (3) initial plastic-hardening slope, (4) engineering stress at the point of ultimate load, and (5) engineering strain at the point of ultimate load. The methodology is incorporated into the Numerical Evaluation of Stochastic Structures Under Stress code for probabilistic structural analysis. The illustrative problem of a thick cylinder under internal pressure, where both the internal pressure and the stress-strain curve are random, is addressed by means of the code. The response value is the cumulative distribution function of the equivalent plastic strain at the inner radius.

  20. Lane detection using Randomized Hough Transform

    Science.gov (United States)

    Mongkonyong, Peerawat; Nuthong, Chaiwat; Siddhichai, Supakorn; Yamakita, Masaki

    2018-01-01

    According to reports of the Royal Thai Police between 2006 and 2015, unintended lane departure is one of the leading causes of accidents. To address this problem, many methods have been considered, and the Lane Departure Warning System (LDWS) is one of the potential solutions. LDWS is a mechanism designed to warn the driver when the vehicle begins to move out of its current lane. It contains many parts, including lane boundary detection, driver warning, and lane marker tracking; this article focuses on the lane boundary detection part. The proposed lane boundary detection extracts lines from the frames of the input video and selects the lane markers of the road surface from among those lines. Both the Standard Hough Transform (SHT) and the Randomized Hough Transform (RHT) are considered for extracting lines from an image. SHT accumulates votes from all of the edge pixels, whereas RHT votes only on lines defined by point pairs randomly picked from the edge pixels, which reduces time and memory usage. Increasing the threshold value in RHT raises the vote limit a line must reach before it is accepted as a likely lane marker, but it also consumes more time and memory. To compare SHT against RHT with different threshold values, 500 frames of input video from a front-facing car camera are processed. The comparison shows that the accuracy and the computational time of RHT are similar to those of SHT.
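
    A minimal sketch of the RHT voting idea in Python (illustrative only; the bin sizes, trial count, and vote limit are arbitrary choices, not the article's settings):

        import numpy as np

        def randomized_hough_lines(edge_points, n_trials=2000, rho_bin=2.0,
                                   vote_limit=30, rng=None):
            """RHT sketch: sample random point pairs, compute the (rho, theta)
            of the line through each pair, and vote in a sparse dictionary
            instead of a full SHT accumulator array."""
            rng = np.random.default_rng(rng)
            pts = np.asarray(edge_points, dtype=float)
            votes, lines = {}, []
            for _ in range(n_trials):
                (x1, y1), (x2, y2) = pts[rng.choice(len(pts), size=2, replace=False)]
                theta = np.arctan2(x1 - x2, y2 - y1)    # direction of the line normal
                rho = x1 * np.cos(theta) + y1 * np.sin(theta)
                key = (round(rho / rho_bin), round(theta, 2))
                votes[key] = votes.get(key, 0) + 1
                if votes[key] == vote_limit:            # enough evidence: accept line
                    lines.append((rho, theta))
            return lines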

  1. Random coil chemical shifts in acidic 8 M urea: Implementation of random coil shift data in NMRView

    International Nuclear Information System (INIS)

    Schwarzinger, Stephan; Kroon, Gerard J.A.; Foss, Ted R.; Wright, Peter E.; Dyson, H. Jane

    2000-01-01

    Studies of proteins unfolded in acid or chemical denaturant can help in unraveling events during the earliest phases of protein folding. In order for meaningful comparisons to be made of residual structure in unfolded states, it is necessary to use random coil chemical shifts that are valid for the experimental system under study. We present a set of random coil chemical shifts obtained for model peptides under experimental conditions used in studies of denatured proteins. This new set, together with previously published data sets, has been incorporated into a software interface for NMRView, allowing selection of the random coil data set that best fits the experimental conditions.

  2. Pervasive randomness in physics: an introduction to its modelling and spectral characterisation

    Science.gov (United States)

    Howard, Roy

    2017-10-01

    An introduction to the modelling and spectral characterisation of random phenomena is detailed at a level consistent with a first exposure to the subject at an undergraduate level. A signal framework for defining a random process is provided and this underpins an introduction to common random processes including the Poisson point process, the random walk, the random telegraph signal, shot noise, information signalling random processes, jittered pulse trains, birth-death random processes and Markov chains. An introduction to the spectral characterisation of signals and random processes, via either an energy spectral density or a power spectral density, is detailed. The important case of defining a white noise random process concludes the paper.
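
    As a concrete taste of one of the processes listed above, the sketch below simulates a random telegraph signal and estimates its power spectral density with Welch's method (an illustration, not material from the paper; the rates are arbitrary):

        import numpy as np
        from scipy.signal import welch

        rng = np.random.default_rng(0)

        # Random telegraph signal: value +/-1, flipping with Poisson-like
        # switching approximated by a per-sample switch probability lam/fs.
        fs, lam, n = 1000.0, 20.0, 200_000
        switches = rng.random(n) < lam / fs
        x = np.where(np.cumsum(switches) % 2 == 0, 1.0, -1.0)

        f, pxx = welch(x, fs=fs, nperseg=4096)   # expected: Lorentzian-shaped PSD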

  3. Discriminative Projection Selection Based Face Image Hashing

    Science.gov (United States)

    Karabat, Cagatay; Erdogan, Hakan

    Face image hashing is an emerging method used in biometric verification systems. In this paper, we propose a novel face image hashing method based on a new technique called discriminative projection selection. We apply the Fisher criterion for selecting the rows of a random projection matrix in a user-dependent fashion. Moreover, another contribution of this paper is to employ a bimodal Gaussian mixture model at the quantization step. Our simulation results on three different databases demonstrate that the proposed method has superior performance in comparison to previously proposed random projection based methods.

  4. The effect of selection on genetic parameter estimates

    African Journals Online (AJOL)

    Unknown

    A simulation study was carried out to investigate the effect of selection on the estimation of genetic ... The model contained a fixed effect, random genetic and random ...

  5. A theory for the origin of a self-replicating chemical system. I - Natural selection of the autogen from short, random oligomers

    Science.gov (United States)

    White, D. H.

    1980-01-01

    A general theory is presented for the origin of a self-replicating chemical system, termed an autogen, which is capable of both crude replication and translation (protein synthesis). The theory requires the availability of free energy and monomers to the system, a significant background low-yield synthesis of kinetically stable oligopeptides and oligonucleotides, the localization of the oligomers, crude oligonucleotide selectivity of amino acids during oligopeptide synthesis, crude oligonucleotide replication, and two short peptide families which catalyze replication and translation, to produce a localized group of at least one copy each of two protogenes and two protoenzymes. The model posits a process of random oligomerization, followed by the random nucleation of functional components and the rapid autocatalytic growth of the functioning autogen to macroscopic amounts, to account for the origin of the first self-replicating system. Such a process contains steps of such high probability and short time periods that it is suggested that the emergence of an autogen in a laboratory experiment of reasonable time scale may be possible.

  6. The Theory of Random Laser Systems

    International Nuclear Information System (INIS)

    Xunya Jiang

    2002-01-01

    Studies of random laser systems are a new direction with promising potential applications and theoretical interest. The research is based on the theories of localization and laser physics. So far, the research shows that there are random lasing modes inside the systems, which are quite different from the modes of common laser systems. From the properties of the random lasing modes, one can understand the phenomena observed in the experiments, such as multi-peak and anisotropic spectra, lasing mode number saturation, mode competition, and dynamic processes. To summarize, this dissertation has contributed the following to the study of random laser systems: (1) by comparing the Lamb theory with the Letokhov theory, general formulas for the threshold length or gain of random laser systems were obtained; (2) the vital weakness of previous time-independent methods in random laser research was pointed out; (3) a new model combining the FDTD method and semi-classical laser theory was constructed, whose solutions explained the experimental results of multi-peak and anisotropic emission spectra and predicted the saturation of the number of lasing modes and the length of localized lasing modes; (4) theoretical (Lamb theory) and numerical (FDTD and transfer-matrix calculation) studies of the origin of localized lasing modes in random laser systems were carried out; and (5) the use of random lasing modes was proposed as a new path to study wave localization in random systems, with a prediction of the lasing threshold discontinuity at the mobility edge.

  7. Private randomness expansion with untrusted devices

    International Nuclear Information System (INIS)

    Colbeck, Roger; Kent, Adrian

    2011-01-01

    Randomness is an important resource for many applications, from gambling to secure communication. However, guaranteeing that the output from a candidate random source could not have been predicted by an outside party is a challenging task, and many supposedly random sources used today provide no such guarantee. Quantum solutions to this problem exist, for example a device which internally sends a photon through a beamsplitter and observes on which side it emerges, but, presently, such solutions require the user to trust the internal workings of the device. Here, we seek to go beyond this limitation by asking whether randomness can be generated using untrusted devices-even ones created by an adversarial agent-while providing a guarantee that no outside party (including the agent) can predict it. Since this is easily seen to be impossible unless the user has an initially private random string, the task we investigate here is private randomness expansion. We introduce a protocol for private randomness expansion with untrusted devices which is designed to take as input an initially private random string and produce as output a longer private random string. We point out that private randomness expansion protocols are generally vulnerable to attacks that can render the initial string partially insecure, even though that string is used only inside a secure laboratory; our protocol is designed to remove this previously unconsidered vulnerability by privacy amplification. We also discuss extensions of our protocol designed to generate an arbitrarily long random string from a finite initially private random string. The security of these protocols against the most general attacks is left as an open question.
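
    The quantum side of such a protocol cannot be captured in a few lines, but the classical post-processing it relies on, privacy amplification, is typically two-universal hashing; the Toeplitz-matrix sketch below is a standard textbook construction offered for illustration, not the authors' protocol:

        import numpy as np

        def toeplitz_extract(raw_bits, seed_bits, n_out):
            """Privacy amplification sketch: compress n partially private bits
            to n_out more private bits using a random Toeplitz matrix built
            from a short seed of n + n_out - 1 bits (two-universal hashing)."""
            n = len(raw_bits)
            assert len(seed_bits) == n + n_out - 1
            # T[i][j] = seed_bits[i - j + n - 1] is constant along diagonals.
            T = np.array([[seed_bits[i - j + n - 1] for j in range(n)]
                          for i in range(n_out)])
            return (T @ np.asarray(raw_bits)) % 2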

  8. Private randomness expansion with untrusted devices

    Science.gov (United States)

    Colbeck, Roger; Kent, Adrian

    2011-03-01

    Randomness is an important resource for many applications, from gambling to secure communication. However, guaranteeing that the output from a candidate random source could not have been predicted by an outside party is a challenging task, and many supposedly random sources used today provide no such guarantee. Quantum solutions to this problem exist, for example a device which internally sends a photon through a beamsplitter and observes on which side it emerges, but, presently, such solutions require the user to trust the internal workings of the device. Here, we seek to go beyond this limitation by asking whether randomness can be generated using untrusted devices—even ones created by an adversarial agent—while providing a guarantee that no outside party (including the agent) can predict it. Since this is easily seen to be impossible unless the user has an initially private random string, the task we investigate here is private randomness expansion. We introduce a protocol for private randomness expansion with untrusted devices which is designed to take as input an initially private random string and produce as output a longer private random string. We point out that private randomness expansion protocols are generally vulnerable to attacks that can render the initial string partially insecure, even though that string is used only inside a secure laboratory; our protocol is designed to remove this previously unconsidered vulnerability by privacy amplification. We also discuss extensions of our protocol designed to generate an arbitrarily long random string from a finite initially private random string. The security of these protocols against the most general attacks is left as an open question.

  9. Private randomness expansion with untrusted devices

    Energy Technology Data Exchange (ETDEWEB)

    Colbeck, Roger; Kent, Adrian, E-mail: rcolbeck@perimeterinstitute.ca, E-mail: a.p.a.kent@damtp.cam.ac.uk [Perimeter Institute for Theoretical Physics, 31 Caroline Street North, Waterloo, ON N2L 2Y5 (Canada)

    2011-03-04

    Randomness is an important resource for many applications, from gambling to secure communication. However, guaranteeing that the output from a candidate random source could not have been predicted by an outside party is a challenging task, and many supposedly random sources used today provide no such guarantee. Quantum solutions to this problem exist, for example a device which internally sends a photon through a beamsplitter and observes on which side it emerges, but, presently, such solutions require the user to trust the internal workings of the device. Here, we seek to go beyond this limitation by asking whether randomness can be generated using untrusted devices-even ones created by an adversarial agent-while providing a guarantee that no outside party (including the agent) can predict it. Since this is easily seen to be impossible unless the user has an initially private random string, the task we investigate here is private randomness expansion. We introduce a protocol for private randomness expansion with untrusted devices which is designed to take as input an initially private random string and produce as output a longer private random string. We point out that private randomness expansion protocols are generally vulnerable to attacks that can render the initial string partially insecure, even though that string is used only inside a secure laboratory; our protocol is designed to remove this previously unconsidered vulnerability by privacy amplification. We also discuss extensions of our protocol designed to generate an arbitrarily long random string from a finite initially private random string. The security of these protocols against the most general attacks is left as an open question.

  10. Fuzzy logic prediction of dew point pressure of selected Iranian gas condensate reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Nowroozi, Saeed [Shahid Bahonar Univ. of Kerman (Iran); Iranian Offshore Oil Company (I.O.O.C.) (Iran); Ranjbar, Mohammad; Hashemipour, Hassan; Schaffie, Mahin [Shahid Bahonar Univ. of Kerman (Iran)

    2009-12-15

    The experimental determination of dew point pressure in a window PVT cell is often difficult, especially in the case of lean retrograde gas condensate. Alongside statistical, graphical, and experimental methods, the fuzzy logic method can be a useful and more reliable way to estimate reservoir properties, since it can overcome the uncertainty present in many of them. Complexity, non-linearity, and vagueness are reservoir parameter characteristics that fuzzy logic handles simply. The fuzzy logic dew point pressure modeling system used in this study is a multi-input single-output (MISO) Mamdani system. The model was developed using constant volume depletion (CVD) measurements of samples from some Iranian fields. The performance of the model is compared against that of some of the most accurate and general correlations for dew point pressure calculation. Results show that this novel method is more accurate and reliable, with an average absolute deviation of 1.33% and 2.68% for the development and checking data, respectively. (orig.)

  11. Generating random networks and graphs

    CERN Document Server

    Coolen, Ton; Roberts, Ekaterina

    2017-01-01

    This book supports researchers who need to generate random networks, or who are interested in the theoretical study of random graphs. The coverage includes exponential random graphs (where the targeted probability of each network appearing in the ensemble is specified), growth algorithms (i.e. preferential attachment and the stub-joining configuration model), special constructions (e.g. geometric graphs and Watts-Strogatz models) and graphs on structured spaces (e.g. multiplex networks). The presentation aims to be a complete starting point, including details of both theory and implementation, as well as discussions of the main strengths and weaknesses of each approach. It includes extensive references for readers wishing to go further. The material is carefully structured to be accessible to researchers from all disciplines while also containing rigorous mathematical analysis (largely based on the techniques of statistical mechanics) to support those wishing to further develop or implement the theory of rand...
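
    For a hands-on starting point, two of the constructions covered by the book can be generated with the networkx library (an assumed dependency; all parameters below are arbitrary examples):

        import networkx as nx

        # Watts-Strogatz small world: 100 nodes on a ring, each joined to its
        # 4 nearest neighbours, each edge rewired with probability 0.1.
        ws = nx.watts_strogatz_graph(n=100, k=4, p=0.1, seed=42)

        # Stub-joining configuration model: pair half-edges drawn from a
        # prescribed degree sequence (may create self-loops and multi-edges).
        degrees = [3] * 50 + [1] * 50           # degree sum must be even
        cm = nx.configuration_model(degrees, seed=42)
        cm = nx.Graph(cm)                       # collapse parallel edges
        cm.remove_edges_from(nx.selfloop_edges(cm))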

  12. Selection gradients, the opportunity for selection, and the coefficient of determination.

    Science.gov (United States)

    Moorad, Jacob A; Wade, Michael J

    2013-03-01

    We derive the relationship between R² (the coefficient of determination), selection gradients, and the opportunity for selection for univariate and multivariate cases. Our main result is to show that the portion of the opportunity for selection that is caused by variation in any trait is equal to the product of its selection gradient and its selection differential. This relationship is a corollary of the first and second fundamental theorems of natural selection, and it permits one to investigate the portions of the total opportunity for selection that are involved in directional selection, stabilizing (and diversifying) selection, and correlational selection, which is important to morphological integration. It also allows one to determine the fraction of fitness variation not explained by variation in measured phenotypes and therefore attributable to random (or, at least, unknown) influences. We apply our methods to a human data set to show how sex-specific mating success as a component of fitness variance can be decoupled from that owing to prereproductive mortality. By quantifying linear sources of sexual selection and quadratic sources of sexual selection, we illustrate that the former is stronger in males, while the latter is stronger in females.

  13. "Open mesh" or "strictly selected population" recruitment? The experience of the randomized controlled MeMeMe trial

    Directory of Open Access Journals (Sweden)

    Cortellini M

    2017-07-01

    Full Text Available Mauro Cortellini, Franco Berrino, Patrizia Pasanisi Department of Preventive & Predictive Medicine, Foundation IRCCS National Cancer Institute of Milan, Milan, Italy Abstract: Among randomized controlled trials (RCTs), trials for primary prevention require large samples and long follow-up to obtain a high-quality outcome; therefore the recruitment process and the drop-out rates largely dictate the adequacy of the results. We are conducting a Phase III trial on persons with metabolic syndrome to test the hypothesis that comprehensive lifestyle changes and/or metformin treatment prevents age-related chronic diseases (the MeMeMe trial, EudraCT number: 2012-005427-32, also registered on ClinicalTrials.gov [NCT02960711]). Here, we briefly analyze and discuss the reasons which may lead to participants dropping out of trials. In our experience, participants may back out of a trial for different reasons. Drug-induced side effects are certainly the most compelling reason. But what are the other reasons, relating to the participants’ perception of the progress of the trial, which lead them to withdraw after randomization? What about the time-dependent drop-out rate in primary prevention trials? The primary outcome of this analysis is the point of drop-out from the trial, defined as the time from the randomization date to the withdrawal date. Survival functions were non-parametrically estimated using the product-limit estimator. The curves were statistically compared using the log-rank test (P=0.64, not significant). Researchers involved in primary prevention RCTs seem to have to deal with the paradox of the proverbial “short blanket syndrome”. Recruiting only highly motivated candidates might be useful for the smooth progress of the trial but it may lead to a very low enrollment rate. On the other hand, what about enrolling all the eligible subjects without considering their motivation? This might boost the enrollment rate, but it can lead to biased ...

  14. A method simulating random magnetic field in interplanetary space by an autoregressive method

    International Nuclear Information System (INIS)

    Kato, Masahito; Sakai, Takasuke

    1985-01-01

    With an autoregressive method, we tried to generate random noise fitting a given power spectrum that can be analytically Fourier-transformed into an autocorrelation function. Although we cannot directly compare our method with the FFT method of Owens (1978), we can point out the following: the FFT method must first fix the number of data points N, that is, the total length to be generated, and cannot generate more than N random data points, because beyond NΔy the generated data repeat the same pattern as below NΔy, where Δy is the minimum interval of the random noise. So if one wants to change or increase N after generating the random noise, the generation must be restarted from the first step. Judging from the generating method, the characteristics of the generated random numbers may depend on N. Once the prediction-error filters are determined, our method can produce random numbers successively; that is, N can be extended to infinity without any extra effort. (author)
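
    A modern analogue of the same autoregressive idea (an illustrative sketch, not the authors' code; the coefficients are arbitrary but stable) generates noise with a prescribed rational spectrum by filtering white noise, and can be extended sample by sample without fixing N in advance:

        import numpy as np
        from scipy.signal import lfilter

        rng = np.random.default_rng(0)

        # AR(2) process: x[t] = a1*x[t-1] + a2*x[t-2] + e[t]. The spectrum
        # is fixed by (a1, a2); the series can be prolonged indefinitely.
        a1, a2 = 1.5, -0.9                      # roots inside the unit circle
        white = rng.standard_normal(10_000)
        x = lfilter([1.0], [1.0, -a1, -a2], white)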

  15. Nitrates and bone turnover (NABT) - trial to select the best nitrate preparation: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Bucur, Roxana C; Reid, Lauren S; Hamilton, Celeste J; Cummings, Steven R; Jamal, Sophie A

    2013-09-08

    ... 'comparisons with the best' approach for data analyses, as this strategy allows practical considerations of ease of use and tolerability to guide selection of the preparation for future studies. Data from this protocol will be used to develop a randomized, controlled trial of nitrates to prevent osteoporotic fractures. ClinicalTrials.gov identifier: NCT01387672. Controlled-Trials.com: ISRCTN08860742.

  16. Performance of Universal Adhesive in Primary Molars After Selective Removal of Carious Tissue: An 18-Month Randomized Clinical Trial.

    Science.gov (United States)

    Lenzi, Tathiane Larissa; Pires, Carine Weber; Soares, Fabio Zovico Maxnuck; Raggio, Daniela Prócida; Ardenghi, Thiago Machado; de Oliveira Rocha, Rachel

    2017-09-15

    To evaluate the 18-month clinical performance of a universal adhesive, applied under different adhesion strategies, after selective carious tissue removal in primary molars. Forty-four subjects (five to 10 years old) contributed 90 primary molars presenting moderately deep dentin carious lesions on occlusal or occluso-proximal surfaces, which were randomly assigned to either the self-etch or the etch-and-rinse protocol of Scotchbond Universal Adhesive (3M ESPE). Resin composite was incrementally inserted for all restorations. Restorations were evaluated at one, six, 12, and 18 months using the modified United States Public Health Service criteria. Survival estimates for the restorations' longevity were evaluated using the Kaplan-Meier method, and multivariate Cox regression analysis with shared frailty was used to assess the factors associated with failures (P<0.05). The adhesion strategy did not influence the restorations' longevity (P=0.06; 72.2 percent and 89.7 percent with etch-and-rinse and self-etch mode, respectively). Self-etch and etch-and-rinse strategies did not influence the clinical behavior of the universal adhesive used in primary molars after selective carious tissue removal, although there was a tendency for a better outcome with the self-etch strategy.

  17. IT Project Selection

    DEFF Research Database (Denmark)

    Pedersen, Keld

    2016-01-01

    ... for initiation. Most of the research on project selection is normative, suggesting new methods, but available empirical studies indicate that many methods are seldom used in practice. This paper addresses the issue by providing increased understanding of IT project selection practice, thereby facilitating the development of methods that better fit current practice. The study is based on naturalistic decision-making theory and interviews with experienced project portfolio managers who, when selecting projects, primarily rely on political skills, experience and personal networks rather than on formal IT project-selection methods; these findings point to new areas for developing methodological support for IT project selection.

  18. Analysis of swaps in Radix selection

    DEFF Research Database (Denmark)

    Elmasry, Amr Ahmed Abd Elmoneim; Mahmoud, Hosam

    2011-01-01

    Radix Sort is a sorting algorithm based on analyzing digital data. We study the number of swaps made by Radix Select (a one-sided version of Radix Sort) to find an element with a randomly selected rank. This kind of grand average provides a smoothing over all individual distributions for specific...

  19. [Efficacy on hemiplegic spasticity treated with plum blossom needle tapping therapy at the key points and Bobath therapy: a randomized controlled trial].

    Science.gov (United States)

    Wang, Fei; Zhang, Lijuan; Wang, Jianhua; Shi, Yan; Zheng, Liya

    2015-08-01

    To evaluate the efficacy for hemiplegic spasticity after cerebral infarction of plum blossom needle tapping therapy at the key points combined with Bobath therapy. Eighty patients meeting the inclusion criteria for hemiplegic spasticity after cerebral infarction were enrolled and randomized into an observation group and a control group, 40 cases in each. In the control group, Bobath manipulation therapy was adopted to relieve spasticity, and 8 weeks of treatment were required. In the observation group, on the basis of the same treatment as the control group, tapping therapy with the plum blossom needle was applied to the key points, namely Jianyu (LI 15), Jianliao (LI 14), Jianzhen (SI 9), Hegu (LI 4), Chengfu (BL 36), Zusanli (ST 36), Xiyangguan (GB 33), etc. The treatment was given for 15 min each time, once a day. Before treatment and after 4 and 8 weeks of treatment, the Fugl-Meyer assessment (FMA) and Barthel index (BI) were adopted to evaluate the motor function of the extremities and the activity of daily life in the patients of the two groups, and the modified Ashworth scale was used to evaluate the anti-spasticity effect. After 4 and 8 weeks of treatment, FMA scores and BI scores were significantly increased compared with those before treatment in both groups (both P<0.05). Tapping therapy at the key points combined with Bobath therapy effectively relieves hemiplegic spasticity in patients with cerebral infarction and improves the motor function of the extremities and the activity of daily life.

  20. Integral Histogram with Random Projection for Pedestrian Detection.

    Directory of Open Access Journals (Sweden)

    Chang-Hua Liu

    Full Text Available In this paper, we give a systematic study reporting several deep insights into HOG, one of the most widely used features in modern computer vision and image processing applications. We first show that its gradient magnitudes can be randomly projected with a random matrix. To handle over-fitting, an integral histogram based on the differences of randomly selected blocks is proposed. The experiments show that both the random projection and the integral histogram clearly outperform the original HOG feature. Finally, the two ideas are combined into a new descriptor termed IHRP, which outperforms the HOG feature with fewer dimensions and higher speed.
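
    A minimal sketch of the random-projection step (illustrative; the Gaussian matrix, the 3780-dimensional descriptor size, and the output dimension are assumptions, and the integral-histogram construction is not reproduced):

        import numpy as np

        rng = np.random.default_rng(0)

        def random_project(features, out_dim, rng):
            """Project HOG-like descriptors with a random Gaussian matrix;
            pairwise distances are roughly preserved (Johnson-Lindenstrauss),
            which is why classification accuracy survives the compression."""
            d = features.shape[1]
            R = rng.standard_normal((d, out_dim)) / np.sqrt(out_dim)
            return features @ R

        hog = rng.random((500, 3780))            # 500 stand-in descriptors
        compact = random_project(hog, 256, rng)  # -> shape (500, 256)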

  1. Effects of point configuration on the accuracy in 3D reconstruction from biplane images

    International Nuclear Information System (INIS)

    Dmochowski, Jacek; Hoffmann, Kenneth R.; Singh, Vikas; Xu Jinhui; Nazareth, Daryl P.

    2005-01-01

    Two or more angiograms are frequently used in medical imaging to reconstruct locations in three-dimensional (3D) space, e.g., for reconstruction of 3D vascular trees, implanted electrodes, or patient positioning. A number of techniques have been proposed for this task. In this simulation study, we investigate the effect of the shape of the configuration of the points in 3D (the 'cloud' of points) on reconstruction errors for one of these techniques, developed in our laboratory. Five types of configurations (a ball, an elongated ellipsoid (cigar), a flattened ball (pancake), a flattened cigar, and a flattened ball with a single distant point) are used in the evaluations. For each shape, 100 random configurations were generated, with point coordinates chosen from Gaussian distributions having a covariance matrix corresponding to the desired shape. The 3D data were projected into the image planes using a known imaging geometry. Gaussian distributed errors were introduced in the x and y coordinates of these projected points. Gaussian distributed errors were also introduced into the gantry information used to calculate the initial imaging geometry. The imaging geometries and 3D positions were iteratively refined using the enhanced-Metz-Fencil technique. The image data were also used to evaluate the feasible R-t solution volume. The 3D errors between the calculated and true positions were determined. The effects of the shape of the configuration, the number of points, the initial geometry error, and the input image error were evaluated. The results for the number of points, initial geometry error, and image error are in agreement with previously reported results, i.e., increasing the number of points and reducing the initial geometry and/or image error improves the accuracy of the reconstructed data. The shape of the 3D configuration of points also affects the error of the reconstructed 3D configuration; specifically, errors decrease as the 'volume' of the 3D configuration ...
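
    A sketch of how such shaped point clouds can be drawn (illustrative; the per-axis standard deviations are arbitrary stand-ins for the covariance matrices used in the study):

        import numpy as np

        rng = np.random.default_rng(0)

        def point_cloud(shape_sds, n=100):
            """Random 3D configuration whose 'shape' is set by per-axis
            standard deviations, e.g. (1,1,1) ball, (3,1,1) cigar,
            (3,3,0.3) pancake."""
            cov = np.diag(np.square(shape_sds))
            return rng.multivariate_normal(np.zeros(3), cov, size=n)

        ball = point_cloud((1.0, 1.0, 1.0))
        cigar = point_cloud((3.0, 1.0, 1.0))
        pancake = point_cloud((3.0, 3.0, 0.3))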

  2. Discrete Approximations of Determinantal Point Processes on Continuous Spaces: Tree Representations and Tail Triviality

    Science.gov (United States)

    Osada, Hirofumi; Osada, Shota

    2018-01-01

    We prove tail triviality of determinantal point processes μ on continuous spaces. Tail triviality has been proved for such processes only on discrete spaces, and hence we have generalized the result to continuous spaces. To do this, we construct tree representations, that is, discrete approximations of determinantal point processes enjoying a determinantal structure. There are many interesting examples of determinantal point processes on continuous spaces, such as the zero points of the hyperbolic Gaussian analytic function with Bergman kernel, and the thermodynamic limit of eigenvalues of Gaussian random matrices for the Sine_2, Airy_2, Bessel_2, and Ginibre point processes. Our main theorem proves that all these point processes are tail trivial.

  3. Unwilling or Unable to Cheat? Evidence from a Randomized Tax Audit Experiment in Denmark

    OpenAIRE

    Henrik J. Kleven; Martin B. Knudsen; Claus T. Kreiner; Søren Pedersen; Emmanuel Saez

    2010-01-01

    This paper analyzes a randomized tax enforcement experiment in Denmark. In the base year, a stratified and representative sample of over 40,000 individual income tax filers was selected for the experiment. Half of the tax filers were randomly selected to be thoroughly audited, while the rest were deliberately not audited. The following year, "threat-of-audit" letters were randomly assigned and sent to tax filers in both groups. Using comprehensive administrative tax data, we present four main...

  4. Direct Measurements of Human Colon Crypt Stem Cell Niche Genetic Fidelity: The Role of Chance in Non-Darwinian Mutation Selection

    Directory of Open Access Journals (Sweden)

    Haeyoun Kang

    2013-10-01

    Full Text Available Perfect human stem cell genetic fidelity would prevent aging and cancer. However, perfection would be difficult to achieve, and aging is universal and cancers common. A hypothesis is that because mutations are inevitable over a human lifetime, downstream mechanisms have evolved to manage the deleterious effects of beneficial and lethal mutations. In the colon, a crypt stem cell architecture reduces the number of mitotic cells at risk for mutation accumulation, and multiple niche stem cells ensure that a lethal mutation within any single stem cell does not lead to crypt death. In addition, the architecture of the colon crypt stem cell niche may harness probability or chance to randomly discard many beneficial mutations that might lead to cancer. An analysis of somatic chromosome copy number alterations (CNAs reveals a lack of perfect fidelity in individual normal human crypts, with age-related increases and higher frequencies in ulcerative colitis, a proliferative, inflammatory disease. The age-related increase in somatic CNAs appears consistent with relatively normal replication error and cell division rates. Surprisingly, and similar to point mutations in cancer genomes, the types of crypt mutations were more consistent with random fixation rather than selection. In theory, a simple non-Darwinian way to nullify selection is to reduce the size of the reproducing population. Fates are more determined by chance rather than selection in very small populations, and therefore selection may be minimized within small crypt niches. The desired effect is that many beneficial mutations that might lead to cancer are randomly lost by drift rather than fixed by selection. The subdivision of the colon into multiple very small stem cell niches may trade Darwinian evolution for non-Darwinian somatic cell evolution, capitulating to aging but reducing cancer risks.

  5. High-speed, random-access fluorescence microscopy: I. High-resolution optical recording with voltage-sensitive dyes and ion indicators.

    Science.gov (United States)

    Bullen, A; Patel, S S; Saggau, P

    1997-07-01

    The design and implementation of a high-speed, random-access, laser-scanning fluorescence microscope configured to record fast physiological signals from small neuronal structures with high spatiotemporal resolution is presented. The laser-scanning capability of this nonimaging microscope is provided by two orthogonal acousto-optic deflectors under computer control. Each scanning point can be randomly accessed and has a positioning time of 3-5 microseconds. Sampling time is also computer-controlled and can be varied to maximize the signal-to-noise ratio. Acquisition rates up to 200k samples/s at 16-bit digitizing resolution are possible. The spatial resolution of this instrument is determined by the minimal spot size at the level of the preparation (i.e., 2-7 microns). Scanning points are selected interactively from a reference image collected with differential interference contrast optics and a video camera. Frame rates up to 5 kHz are easily attainable. Intrinsic variations in laser light intensity and scanning spot brightness are overcome by an on-line signal-processing scheme. Representative records obtained with this instrument by using voltage-sensitive dyes and calcium indicators demonstrate the ability to make fast, high-fidelity measurements of membrane potential and intracellular calcium at high spatial resolution (2 microns) without any temporal averaging.

  6. [Plaque segmentation of intracoronary optical coherence tomography images based on K-means and improved random walk algorithm].

    Science.gov (United States)

    Wang, Guanglei; Wang, Pengyu; Han, Yechen; Liu, Xiuling; Li, Yan; Lu, Qian

    2017-06-01

    In recent years, optical coherence tomography (OCT) has developed into a popular coronary imaging technology at home and abroad. The segmentation of plaque regions in coronary OCT images has great significance for vulnerable plaque recognition and research. In this paper, a new algorithm based on K-means clustering and an improved random walk is proposed, and semi-automated segmentation of calcified plaque, fibrotic plaque, and lipid pools is achieved. The weight function of the random walk is improved: the distance between pixel edges in the image and the seed points is added to its definition, which increases weak edge weights and prevents over-segmentation. Based on the above methods, OCT images of 9 coronary atherosclerotic patients were selected for plaque segmentation. Comparing the doctors' manual segmentation results with this method proved that the method has good robustness and accuracy. It is hoped that this method can be helpful for the clinical diagnosis of coronary heart disease.
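
    A sketch of the modified weight idea (a hypothetical form with made-up parameters; the paper's exact definition is not reproduced here):

        import numpy as np

        def edge_weight(g_i, g_j, d_seed, beta=90.0, gamma=0.05):
            """Hypothetical improved random-walk weight: the standard
            intensity term exp(-beta*(g_i - g_j)**2) is modulated by d_seed,
            the distance of the pixel pair from the nearest seed, so that
            seed distance enters the weight alongside intensity contrast."""
            return np.exp(-beta * (g_i - g_j) ** 2 - gamma * d_seed)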

  7. On plasma stability under anisotropic random electric field influence

    International Nuclear Information System (INIS)

    Rabich, L.N.; Sosenko, P.P.

    1987-01-01

    The influence of an anisotropic random field on plasma stability is studied. The thresholds and instability increments are obtained. The stabilizing influence of frequency mismatch and external magnetic field is pointed out.

  8. Bayesian dose selection design for a binary outcome using restricted response adaptive randomization.

    Science.gov (United States)

    Meinzer, Caitlyn; Martin, Renee; Suarez, Jose I

    2017-09-08

    In phase II trials, the most efficacious dose is usually not known. Moreover, given limited resources, it is difficult to robustly identify a dose while also testing for a signal of efficacy that would support a phase III trial. Recent designs have sought to be more efficient by exploring multiple doses through the use of adaptive strategies. However, the added flexibility may potentially increase the risk of making incorrect assumptions and reduce the total amount of information available across the dose range as a function of imbalanced sample size. To balance these challenges, a novel placebo-controlled design is presented in which a restricted Bayesian response adaptive randomization (RAR) is used to allocate a majority of subjects to the optimal dose of active drug, defined as the dose with the lowest probability of poor outcome. However, the allocation between subjects who receive active drug or placebo is held constant to retain the maximum possible power for a hypothesis test of overall efficacy comparing the optimal dose to placebo. The design properties and optimization of the design are presented in the context of a phase II trial for subarachnoid hemorrhage. For a fixed total sample size, a trade-off exists between the ability to select the optimal dose and the probability of rejecting the null hypothesis. This relationship is modified by the allocation ratio between active and control subjects, the choice of RAR algorithm, and the number of subjects allocated to an initial fixed allocation period. While a responsive RAR algorithm improves the ability to select the correct dose, there is an increased risk of assigning more subjects to a worse arm as a function of ephemeral trends in the data. A subarachnoid treatment trial is used to illustrate how this design can be customized for specific objectives and available data. Bayesian adaptive designs are a flexible approach to addressing multiple questions surrounding the optimal dose for treatment efficacy
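
    A minimal sketch of the restricted allocation idea under a Beta-Bernoulli model (illustrative; the function, priors, and fixed control share are assumptions, not the trial's algorithm):

        import numpy as np

        def rar_allocation(failures, n, control_share=0.5, n_draws=4000, rng=None):
            """Restricted RAR sketch: the placebo share stays fixed, while the
            active share is split across doses in proportion to the posterior
            probability that each dose has the lowest poor-outcome rate."""
            rng = np.random.default_rng(rng)
            # Posterior draws of each dose's poor-outcome probability, Beta(1+f, 1+n-f).
            draws = rng.beta(1 + failures, 1 + n - failures, size=(n_draws, len(n)))
            p_best = np.bincount(draws.argmin(axis=1), minlength=len(n)) / n_draws
            return control_share, (1 - control_share) * p_best

        # e.g. three doses with 8/20, 5/20 and 9/20 poor outcomes observed so far:
        ctrl, active = rar_allocation(np.array([8, 5, 9]), np.array([20, 20, 20]))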

  9. Distributional and efficiency results for subset selection

    NARCIS (Netherlands)

    Laan, van der P.

    1996-01-01

    Assume k (k ≥ 2) populations are given. The associated independent random variables have continuous distribution functions with an unknown location parameter. The statistical selection goal is to select a non-empty subset which contains the best population, that is, the population with ...

  10. Black holes and random matrices

    Energy Technology Data Exchange (ETDEWEB)

    Cotler, Jordan S.; Gur-Ari, Guy [Stanford Institute for Theoretical Physics, Stanford University,Stanford, CA 94305 (United States); Hanada, Masanori [Stanford Institute for Theoretical Physics, Stanford University,Stanford, CA 94305 (United States); Yukawa Institute for Theoretical Physics, Kyoto University,Kyoto 606-8502 (Japan); The Hakubi Center for Advanced Research, Kyoto University,Kyoto 606-8502 (Japan); Polchinski, Joseph [Department of Physics, University of California,Santa Barbara, CA 93106 (United States); Kavli Institute for Theoretical Physics, University of California,Santa Barbara, CA 93106 (United States); Saad, Phil; Shenker, Stephen H. [Stanford Institute for Theoretical Physics, Stanford University,Stanford, CA 94305 (United States); Stanford, Douglas [Institute for Advanced Study,Princeton, NJ 08540 (United States); Streicher, Alexandre [Stanford Institute for Theoretical Physics, Stanford University,Stanford, CA 94305 (United States); Department of Physics, University of California,Santa Barbara, CA 93106 (United States); Tezuka, Masaki [Department of Physics, Kyoto University,Kyoto 606-8501 (Japan)

    2017-05-22

    We argue that the late time behavior of horizon fluctuations in large anti-de Sitter (AdS) black holes is governed by the random matrix dynamics characteristic of quantum chaotic systems. Our main tool is the Sachdev-Ye-Kitaev (SYK) model, which we use as a simple model of a black hole. We use an analytically continued partition function |Z(β+it)|² as well as correlation functions as diagnostics. Using numerical techniques we establish random matrix behavior at late times. We determine the early time behavior exactly in a double scaling limit, giving us a plausible estimate for the crossover time to random matrix behavior. We use these ideas to formulate a conjecture about general large AdS black holes, like those dual to 4D super-Yang-Mills theory, giving a provisional estimate of the crossover time. We make some preliminary comments about challenges to understanding the late time dynamics from a bulk point of view.

  11. Sequence-Based Prediction of RNA-Binding Proteins Using Random Forest with Minimum Redundancy Maximum Relevance Feature Selection

    Directory of Open Access Journals (Sweden)

    Xin Ma

    2015-01-01

    Full Text Available The prediction of RNA-binding proteins is one of the most challenging problems in computational biology. Although some studies have investigated this problem, the accuracy of prediction is still not sufficient. In this study, a highly accurate method was developed to predict RNA-binding proteins from amino acid sequences using random forests with the minimum redundancy maximum relevance (mRMR) method, followed by incremental feature selection (IFS). We incorporated conjoint triad features and three novel features: binding propensity (BP), nonbinding propensity (NBP), and evolutionary information combined with physicochemical properties (EIPP). The results showed that these novel features play important roles in improving the performance of the predictor. Using the mRMR-IFS method, our predictor achieved its best performance (86.62% accuracy and a 0.737 Matthews correlation coefficient). The high prediction accuracy suggests that our method can be a useful approach for identifying RNA-binding proteins from sequence information.
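
    A rough scikit-learn stand-in for the pipeline (an assumption for illustration: it ranks features by mutual information only, omitting mRMR's redundancy penalty, and the sequence-derived features themselves are not reproduced):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.feature_selection import mutual_info_classif
        from sklearn.model_selection import cross_val_score

        def incremental_selection(X, y, max_features=50):
            """Rank features by relevance, then grow the subset one feature
            at a time (IFS) and keep the size with the best cross-validated
            accuracy of a random forest."""
            order = np.argsort(mutual_info_classif(X, y))[::-1]
            best_k, best_acc = 1, 0.0
            for k in range(1, max_features + 1):
                clf = RandomForestClassifier(n_estimators=200, random_state=0)
                acc = cross_val_score(clf, X[:, order[:k]], y, cv=5).mean()
                if acc > best_acc:
                    best_k, best_acc = k, acc
            return order[:best_k], best_acc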

  12. Improving Adherence to Smoking Cessation Treatment: Smoking Outcomes in a Web-based Randomized Trial.

    Science.gov (United States)

    Graham, Amanda L; Papandonatos, George D; Cha, Sarah; Erar, Bahar; Amato, Michael S

    2018-03-15

    Partial adherence in Internet smoking cessation interventions presents treatment and evaluation challenges. Increasing adherence may improve outcomes. To present smoking outcomes from an Internet randomized trial of two strategies to encourage adherence to tobacco dependence treatment components: (i) a social network (SN) strategy to integrate smokers into an online community and (ii) free nicotine replacement therapy (NRT). In addition to intent-to-treat analyses, we used novel statistical methods to distinguish the impact of treatment assignment from treatment utilization. A total of 5,290 current smokers on a cessation website (WEB) were randomized to WEB, WEB + SN, WEB + NRT, or WEB + SN + NRT. The main outcome was 30-day point prevalence abstinence at 3 and 9 months post-randomization. Adherence measures included self-reported medication use (meds), and website metrics of skills training (sk) and community use (comm). Inverse Probability of Retention Weighting and Inverse Probability of Treatment Weighting jointly addressed dropout and treatment selection. Propensity weights were used to calculate Average Treatment effects on the Treated. Treatment assignment analyses showed no effects on abstinence for either adherence strategy. Abstinence rates were 25.7%-32.2% among participants that used all three treatment components (sk+comm+meds). Treatment utilization analyses revealed that among such participants, sk+comm+meds yielded large percentage point increases in 3-month abstinence rates over sk alone across arms: WEB = 20.6 (95% CI = 10.8, 30.4), WEB + SN = 19.2 (95% CI = 11.1, 27.3), WEB + NRT = 13.1 (95% CI = 4.1, 22.0), and WEB + SN + NRT = 20.0 (95% CI = 12.2, 27.7). Novel propensity weighting approaches can serve as a model for establishing efficacy of Internet interventions and yield important insights about mechanisms. NCT01544153.

  13. Spectral dimensionality of random superconducting networks

    International Nuclear Information System (INIS)

    Day, A.R.; Xia, W.; Thorpe, M.F.

    1988-01-01

    We compute the spectral dimensionality d̃ of random superconducting-normal networks by directly examining the low-frequency density of states at the percolation threshold. We find that d̃ = 4.1 ± 0.2 and 5.8 ± 0.3 in two and three dimensions, respectively, which confirms the scaling relation d̃ = 2d/(2 - s/ν), where d is the Euclidean dimension, s is the superconducting exponent, and ν the correlation-length exponent for percolation. We also consider the one-dimensional problem, where scaling arguments predict, and our numerical simulations confirm, that d̃ = 0. A simple argument provides an expression for the density of states of the localized high-frequency modes in this special case. We comment on the connection between our calculations and the 'termite' problem of a random walker on a random superconducting-normal network and point out difficulties in inferring d̃ from simulations of the termite problem.

  14. Randomized Prediction Games for Adversarial Machine Learning.

    Science.gov (United States)

    Rota Bulo, Samuel; Biggio, Battista; Pillai, Ignazio; Pelillo, Marcello; Roli, Fabio

    In spam and malware detection, attackers exploit randomization to obfuscate malicious data and increase their chances of evading detection at test time, e.g., malware code is typically obfuscated using random strings or byte sequences to hide known exploits. Interestingly, randomization has also been proposed to improve security of learning algorithms against evasion attacks, as it results in hiding information about the classifier to the attacker. Recent work has proposed game-theoretical formulations to learn secure classifiers, by simulating different evasion attacks and modifying the classification function accordingly. However, both the classification function and the simulated data manipulations have been modeled in a deterministic manner, without accounting for any form of randomization. In this paper, we overcome this limitation by proposing a randomized prediction game, namely, a noncooperative game-theoretic formulation in which the classifier and the attacker make randomized strategy selections according to some probability distribution defined over the respective strategy set. We show that our approach allows one to improve the tradeoff between attack detection and false alarms with respect to the state-of-the-art secure classifiers, even against attacks that are different from those hypothesized during design, on application examples including handwritten digit recognition, spam, and malware detection.

  15. First steps in random walks from tools to applications

    CERN Document Server

    Klafter, J

    2011-01-01

    The name ""random walk"" for a problem of a displacement of a point in a sequence of independent random steps was coined by Karl Pearson in 1905 in a question posed to readers of ""Nature"". The same year, a similar problem was formulated by Albert Einstein in one of his Annus Mirabilis works. Even earlier such a problem was posed by Louis Bachelier in his thesis devoted to the theory of financial speculations in 1900. Nowadays the theory of random walks has proved useful in physics andchemistry (diffusion, reactions, mixing in flows), economics, biology (from animal spread to motion of subcel

  16. Pseudo-random bit generator based on lag time series

    Science.gov (United States)

    García-Martínez, M.; Campos-Cantón, E.

    2014-12-01

    In this paper, we present a pseudo-random bit generator (PRBG) based on two lag time series of the logistic map, using positive and negative values of the bifurcation parameter. In order to hide the map used to build the pseudo-random series, we introduce a delay in the generation of the time series. When these new series are mapped, x_n against x_{n+1}, they present a cloud of points unrelated to the logistic map. Finally, the pseudo-random sequences were tested with the NIST suite, giving satisfactory results for use in stream ciphers.
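
    A simplified single-series sketch of the construction (illustrative; the paper uses two lag series with positive and negative bifurcation-parameter values, and the constants below are arbitrary):

        import numpy as np

        def lag_logistic_prbg(n_bits, mu=3.99, x0=0.41, lag=7, warmup=1000):
            """PRBG sketch: iterate the logistic map, discard a warm-up,
            threshold each value into a bit, and emit the lagged series so
            that consecutive outputs are not simply x_n and x_{n+1}."""
            x, samples = x0, []
            for _ in range(n_bits + lag + warmup):
                x = mu * x * (1.0 - x)
                samples.append(x)
            delayed = samples[warmup + lag:]         # lag-shifted series
            return np.array([1 if v >= 0.5 else 0 for v in delayed])

        bits = lag_logistic_prbg(1_000_000)          # then run the NIST suite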

  17. Pseudo-random number generation using a 3-state cellular automaton

    Science.gov (United States)

    Bhattacharjee, Kamalika; Paul, Dipanjyoti; Das, Sukanta

    This paper investigates the potential of a 3-neighborhood, 3-state cellular automaton (CA) under periodic boundary conditions for pseudo-random number generation. Theoretical and empirical tests are performed on the numbers generated by the CA to assess its quality as a pseudo-random number generator (PRNG). We analyze the strengths and weaknesses of the proposed PRNG and conclude that the selected CA is a good random number generator.
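
    A minimal sketch of such a generator (illustrative; the paper selects a specific well-performing rule, whereas a random 27-entry rule table stands in here):

        import numpy as np

        def ca_prng_stream(rule, state, steps):
            """3-neighbourhood, 3-state CA under periodic boundary conditions:
            each cell's next value is a rule-table lookup on (left, self,
            right); one fixed cell is tapped per step as the output symbol."""
            out = np.empty(steps, dtype=np.uint8)
            for t in range(steps):
                left, right = np.roll(state, 1), np.roll(state, -1)
                state = rule[left * 9 + state * 3 + right]   # 27 neighbourhoods
                out[t] = state[0]
            return out

        rng = np.random.default_rng(1)
        rule = rng.integers(0, 3, size=27, dtype=np.uint8)   # stand-in rule
        cells = rng.integers(0, 3, size=64, dtype=np.uint8)
        stream = ca_prng_stream(rule, cells, steps=1000)     # symbols in {0,1,2}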

  18. Random crystal field effects on the integer and half-integer mixed-spin system

    Science.gov (United States)

    Yigit, Ali; Albayrak, Erhan

    2018-05-01

    In this work, we have focused on the effects of a random crystal field on the phase diagrams of the mixed spin-1 and spin-5/2 Ising system, obtained by utilizing the exact recursion relations (ERR) on the Bethe lattice (BL). The distribution function P(D_i) = p δ[D_i - D(1 + α)] + (1 - p) δ[D_i - D(1 - α)] is used to randomize the crystal field. The phase diagrams are found to exhibit second- and first-order phase transitions depending on the values of α, D and p. It is also observed that the model displays a tricritical point, an isolated point, a critical end point and three compensation temperatures for suitable values of the system parameters.

  19. Theory of Randomized Search Heuristics in Combinatorial Optimization

    DEFF Research Database (Denmark)

    The rigorous mathematical analysis of randomized search heuristics (RSHs) with respect to their expected runtime is a growing research area where many results have been obtained in recent years. This class of heuristics includes well-known approaches such as Randomized Local Search (RLS), the Metr... analysis of randomized algorithms to RSHs. Mostly, the expected runtime of RSHs on selected problems is analyzed. Thereby, we understand why and when RSHs are efficient optimizers and, conversely, when they cannot be efficient. The tutorial will give an overview on the analysis of RSHs for solving...

  20. PolyFit: Polygonal Surface Reconstruction from Point Clouds

    KAUST Repository

    Nan, Liangliang; Wonka, Peter

    2017-01-01

    We propose a novel framework for reconstructing lightweight polygonal surfaces from point clouds. Unlike traditional methods that focus on either extracting good geometric primitives or obtaining proper arrangements of primitives, the emphasis of this work lies in intersecting the primitives (planes only) and seeking an appropriate combination of them to obtain a manifold polygonal surface model without boundary. We show that reconstruction from point clouds can be cast as a binary labeling problem. Our method is based on a hypothesizing and selection strategy. We first generate a reasonably large set of face candidates by intersecting the extracted planar primitives. Then an optimal subset of the candidate faces is selected through optimization. Our optimization is based on a binary linear programming formulation under hard constraints that enforce the final polygonal surface model to be manifold and watertight. Experiments on point clouds from various sources demonstrate that our method can generate lightweight polygonal surface models of arbitrary piecewise planar objects. Besides, our method is capable of recovering sharp features and is robust to noise, outliers, and missing data.

  1. PolyFit: Polygonal Surface Reconstruction from Point Clouds

    KAUST Repository

    Nan, Liangliang

    2017-12-25

    We propose a novel framework for reconstructing lightweight polygonal surfaces from point clouds. Unlike traditional methods that focus on either extracting good geometric primitives or obtaining proper arrangements of primitives, the emphasis of this work lies in intersecting the primitives (planes only) and seeking an appropriate combination of them to obtain a manifold polygonal surface model without boundary. We show that reconstruction from point clouds can be cast as a binary labeling problem. Our method is based on a hypothesizing and selection strategy. We first generate a reasonably large set of face candidates by intersecting the extracted planar primitives. Then an optimal subset of the candidate faces is selected through optimization. Our optimization is based on a binary linear programming formulation under hard constraints that enforce the final polygonal surface model to be manifold and watertight. Experiments on point clouds from various sources demonstrate that our method can generate lightweight polygonal surface models of arbitrary piecewise planar objects. Besides, our method is capable of recovering sharp features and is robust to noise, outliers, and missing data.

  2. Additional Effect of Static Ultrasound and Diadynamic Currents on Myofascial Trigger Points in a Manual Therapy Program for Patients With Chronic Neck Pain: A Randomized Clinical Trial.

    Science.gov (United States)

    Dibai-Filho, Almir Vieira; de Oliveira, Alessandra Kelly; Girasol, Carlos Eduardo; Dias, Fabiana Rodrigues Cancio; Guirro, Rinaldo Roberto de Jesus

    2017-04-01

    To assess the additional effect of static ultrasound and diadynamic currents on myofascial trigger points in a manual therapy program to treat individuals with chronic neck pain. A single-blind randomized trial was conducted. Both men and women, between ages 18 and 45, with chronic neck pain and active myofascial trigger points in the upper trapezius were included in the study. Subjects were assigned to 3 different groups: group 1 (n = 20) was treated with manual therapy; group 2 (n = 20) was treated with manual therapy and static ultrasound; group 3 (n = 20) was treated with manual therapy and diadynamic currents. Individuals were assessed before the first treatment session, 48 hours after the first treatment session, 48 hours after the tenth treatment session, and 4 weeks after the last session. There was no group-versus-time interaction for Numeric Rating Scale, Neck Disability Index, Pain-Related Self-Statement Scale, pressure pain threshold, cervical range of motion, and skin temperature (F-value range, 0.089-1.961; P-value range, 0.106-0.977). Moreover, we found no differences between groups regarding electromyographic activity (P > 0.05). The use of static ultrasound or diadynamic currents on myofascial trigger points in upper trapezius associated with a manual therapy program did not generate greater benefits than manual therapy alone.

  3. Parallel point-multiplication architecture using combined group operations for high-speed cryptographic applications.

    Directory of Open Access Journals (Sweden)

    Md Selim Hossain

    Full Text Available In this paper, we propose a novel parallel architecture for fast hardware implementation of elliptic curve point multiplication (ECPM), which is the key operation of an elliptic curve cryptography processor. The point multiplication over binary fields is synthesized on both FPGA and ASIC technology by designing fast elliptic curve group operations in Jacobian projective coordinates. A novel combined point doubling and point addition (PDPA) architecture is proposed for group operations to achieve high speed and low hardware requirements for ECPM. It has been implemented over binary fields recommended by the National Institute of Standards and Technology (NIST). The proposed ECPM supports both Koblitz and random curves for key sizes of 233 and 163 bits. For group operations, a finite-field arithmetic operation, e.g. multiplication, is designed on a polynomial basis. The delay of a 233-bit point multiplication is only 3.05 and 3.56 μs, in a Xilinx Virtex-7 FPGA, for Koblitz and random curves, respectively, and 0.81 μs in an ASIC 65-nm technology, which are the fastest hardware implementation results reported in the literature to date. In addition, a 163-bit point multiplication is also implemented in FPGA and ASIC for fair comparison, which takes around 0.33 and 0.46 μs, respectively. The area-time product of the proposed point multiplication is very low compared to similar designs. The performance ([Formula: see text]) and Area × Time × Energy (ATE) product of the proposed design are far better than those of the most significant studies found in the literature.
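
    For readers unfamiliar with ECPM, the following sketch shows the underlying double-and-add scalar multiplication on a toy short-Weierstrass curve over a small prime field. The paper's design instead targets NIST binary-field curves in Jacobian projective coordinates and combines doubling and addition in hardware, so this is only the textbook algorithm, not the proposed architecture.

    ```python
    p, a, b = 97, 2, 3                   # toy curve y^2 = x^3 + 2x + 3 over F_97

    def point_add(P, Q):
        """Affine group law; None represents the point at infinity."""
        if P is None: return Q
        if Q is None: return P
        (x1, y1), (x2, y2) = P, Q
        if x1 == x2 and (y1 + y2) % p == 0:
            return None
        if P == Q:
            lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # doubling slope
        else:
            lam = (y2 - y1) * pow(x2 - x1, -1, p) % p          # addition slope
        x3 = (lam * lam - x1 - x2) % p
        return (x3, (lam * (x1 - x3) - y1) % p)

    def point_mul(k, P):
        """Left-to-right double-and-add: one doubling per key bit."""
        R = None
        for bit in bin(k)[2:]:
            R = point_add(R, R)
            if bit == "1":
                R = point_add(R, P)
        return R

    print(point_mul(5, (3, 6)))          # (3, 6) lies on the toy curve
    ```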

  4. Evolution in fluctuating environments: decomposing selection into additive components of the Robertson-Price equation.

    Science.gov (United States)

    Engen, Steinar; Saether, Bernt-Erik

    2014-03-01

    We analyze the stochastic components of the Robertson-Price equation for the evolution of quantitative characters that enables decomposition of the selection differential into components due to demographic and environmental stochasticity. We show how these two types of stochasticity affect the evolution of multivariate quantitative characters by defining demographic and environmental variances as components of individual fitness. The exact covariance formula for selection is decomposed into three components, the deterministic mean value, as well as stochastic demographic and environmental components. We show that demographic and environmental stochasticity generate random genetic drift and fluctuating selection, respectively. This provides a common theoretical framework for linking ecological and evolutionary processes. Demographic stochasticity can cause random variation in selection differentials independent of fluctuating selection caused by environmental variation. We use this model of selection to illustrate that the effect on the expected selection differential of random variation in individual fitness is dependent on population size, and that the strength of fluctuating selection is affected by how environmental variation affects the covariance in Malthusian fitness between individuals with different phenotypes. Thus, our approach enables us to partition out the effects of fluctuating selection from the effects of selection due to random variation in individual fitness caused by demographic stochasticity. © 2013 The Author(s). Evolution © 2013 The Society for the Study of Evolution.

  5. Treatment of myofascial trigger points in common shoulder disorders by physical therapy: a randomized controlled trial [ISRCTN75722066].

    Science.gov (United States)

    Bron, Carel; Wensing, Michel; Franssen, Jo Lm; Oostendorp, Rob Ab

    2007-11-05

    Shoulder disorders are a common health problem in western societies. Several treatment protocols have been developed for the clinical management of persons with shoulder pain. However, available evidence does not support any protocol as being superior over others. Systematic reviews provide some evidence that certain physical therapy interventions (i.e. supervised exercises and mobilisation) are effective in particular shoulder disorders (i.e. rotator cuff disorders, mixed shoulder disorders and adhesive capsulitis), but there is an ongoing need for high quality trials of physical therapy interventions. Usually, physical therapy consists of active exercises intended to strengthen the shoulder muscles as stabilizers of the glenohumeral joint, or of mobilisations to improve restricted mobility of the glenohumeral or adjacent joints (shoulder girdle). It is generally accepted that atraumatic shoulder problems are the result of impingement of the subacromial structures, such as the bursa or rotator cuff tendons. Myofascial trigger points (MTrPs) in shoulder muscles may also lead to a complex of symptoms that are often seen in patients diagnosed with subacromial impingement or rotator cuff tendinopathy. Little is known about the treatment of MTrPs in patients with shoulder disorders. The primary aim of this study is to investigate whether physical therapy modalities to inactivate MTrPs can reduce symptoms and improve shoulder function in daily activities in a population of chronic atraumatic shoulder patients when compared to a wait-and-see strategy. In addition, we investigate the recurrence rate during a one-year follow-up period. This paper presents the design for a randomized controlled trial to be conducted between September 2007 and September 2008, evaluating the effectiveness of a physical therapy treatment for non-traumatic shoulder complaints. One hundred subjects are included in this study. All subjects have unilateral shoulder pain for at least six months.

  6. Antitumor activity and safety of tivozanib (AV-951) in a phase II randomized discontinuation trial in patients with renal cell carcinoma.

    Science.gov (United States)

    Nosov, Dmitry A; Esteves, Brooke; Lipatov, Oleg N; Lyulko, Alexei A; Anischenko, A A; Chacko, Raju T; Doval, Dinesh C; Strahs, Andrew; Slichenmyer, William J; Bhargava, Pankaj

    2012-05-10

    The antitumor activity and safety of tivozanib, which is a potent and selective vascular endothelial growth factor receptor-1, -2, and -3 inhibitor, was assessed in patients with advanced/metastatic renal cell carcinoma (RCC). In this phase II, randomized discontinuation trial, 272 patients received open-label tivozanib 1.5 mg/d (one cycle equaled three treatment weeks followed by a 1-week break) orally for 16 weeks. Thereafter, 78 patients who demonstrated ≥ 25% tumor shrinkage continued to take tivozanib, and 118 patients with less than 25% tumor change were randomly assigned to receive tivozanib or a placebo in a double-blind manner; patients with ≥ 25% tumor growth were discontinued. Primary end points included safety, the objective response rate (ORR) at 16 weeks, and the percentage of randomly assigned patients who remained progression free after 12 weeks of double-blind treatment; secondary end points included progression-free survival (PFS). Of 272 patients enrolled onto the study, 83% of patients had clear-cell histology, 73% of patients had undergone nephrectomy, and 54% of patients were treatment naive. The ORR after 16 weeks of tivozanib treatment was 18% (95% CI, 14% to 23%). Of the 118 randomized patients, significantly more patients who were randomly assigned to receive double-blind tivozanib remained progression free after 12 weeks versus patients who received the placebo (49% v 21%; P = .001). Throughout the study, the ORR was 24% (95% CI, 19% to 30%), and the median PFS was 11.7 months (95% CI, 8.3 to 14.3 months) in the overall study population. The most common grade 3 and 4 treatment-related adverse event was hypertension (12%). Tivozanib was active and well tolerated in patients with advanced RCC. These data support additional development of tivozanib in advanced RCC.

  7. Selection-Mutation Dynamics of Signaling Games

    Directory of Open Access Journals (Sweden)

    Josef Hofbauer

    2015-01-01

    Full Text Available We study the structure of the rest points of signaling games and their dynamic behavior under selection-mutation dynamics by taking the case of three signals as our canonical example. Many rest points of the replicator dynamics of signaling games are not isolated and, therefore, not robust under perturbations. However, some of them attract open sets of initial conditions. We prove the existence of certain rest points of the selection-mutation dynamics close to Nash equilibria of the signaling game and show that all but the perturbed rest points close to strict Nash equilibria are dynamically unstable. This is an important result for the evolution of signaling behavior, since it shows that the second-order forces that are governed by mutation can increase the chances of successful signaling.
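
    A minimal numerical sketch of such selection-mutation (replicator-mutator) dynamics on three strategies is given below; the payoff matrix and the uniform mutation term are placeholders, not the signaling-game payoffs analyzed in the paper.

    ```python
    import numpy as np
    from scipy.integrate import odeint

    A = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])      # placeholder payoff matrix
    eps = 0.01                           # mutation rate

    def rhs(x, t):
        f = A @ x                        # strategy fitnesses
        phi = x @ f                      # mean fitness
        # replicator term plus uniform mutation drift (sums to zero, so the
        # state stays on the probability simplex)
        return x * (f - phi) + eps * (1.0 / len(x) - x)

    x0 = np.array([0.5, 0.3, 0.2])
    traj = odeint(rhs, x0, np.linspace(0.0, 50.0, 500))
    print(traj[-1])                      # a perturbed rest point for eps > 0
    ```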

  8. A Stochastic Collocation Method for Elliptic Partial Differential Equations with Random Input Data

    KAUST Repository

    Babuška, Ivo; Nobile, Fabio; Tempone, Raul

    2010-01-01

    This work proposes and analyzes a stochastic collocation method for solving elliptic partial differential equations with random coefficients and forcing terms. These input data are assumed to depend on a finite number of random variables. The method consists of a Galerkin approximation in space and a collocation in the zeros of suitable tensor product orthogonal polynomials (Gauss points) in the probability space, and naturally leads to the solution of uncoupled deterministic problems as in the Monte Carlo approach. It treats easily a wide range of situations, such as input data that depend nonlinearly on the random variables, diffusivity coefficients with unbounded second moments, and random variables that are correlated or even unbounded. We provide a rigorous convergence analysis and demonstrate exponential convergence of the “probability error” with respect to the number of Gauss points in each direction of the probability space, under some regularity assumptions on the random input data. Numerical examples show the effectiveness of the method. Finally, we include a section with developments posterior to the original publication of this work. There we review sparse grid stochastic collocation methods, which are effective collocation strategies for problems that depend on a moderately large number of random variables.
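
    As a one-dimensional illustration of the collocation idea (my simplification: a single uniform random input Y ~ U(-1, 1) instead of a PDE coefficient), the expectation of a quantity of interest can be approximated by sampling it at Gauss-Legendre points:

    ```python
    import numpy as np

    def expectation_by_collocation(u, n_points=8):
        """Approximate E[u(Y)], Y ~ U(-1, 1), by Gauss-Legendre collocation."""
        nodes, weights = np.polynomial.legendre.leggauss(n_points)
        # leggauss integrates over [-1, 1] with weight 1; divide by 2 to
        # average against the uniform density.
        return 0.5 * np.dot(weights, u(nodes))

    approx = expectation_by_collocation(np.exp)
    exact = 0.5 * (np.e - np.exp(-1.0))   # (1/2) * integral of e^y over [-1, 1]
    print(approx, exact)                  # error decays exponentially in n_points
    ```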

  9. Dynamic probability of reinforcement for cooperation: Random game termination in the centipede game.

    Science.gov (United States)

    Krockow, Eva M; Colman, Andrew M; Pulford, Briony D

    2018-03-01

    Experimental games have previously been used to study principles of human interaction. Many such games are characterized by iterated or repeated designs that model dynamic relationships, including reciprocal cooperation. To enable the study of infinite game repetitions and to avoid endgame effects of lower cooperation toward the final game round, investigators have introduced random termination rules. This study extends previous research that has focused narrowly on repeated Prisoner's Dilemma games by conducting a controlled experiment of two-player, random termination Centipede games involving probabilistic reinforcement and characterized by the longest decision sequences reported in the empirical literature to date (24 decision nodes). Specifically, we assessed mean exit points and cooperation rates, and compared the effects of four different termination rules: no random game termination, random game termination with constant termination probability, random game termination with increasing termination probability, and random game termination with decreasing termination probability. We found that although mean exit points were lower for games with shorter expected game lengths, the subjects' cooperativeness was significantly reduced only in the most extreme condition with decreasing computer termination probability and an expected game length of two decision nodes. © 2018 Society for the Experimental Analysis of Behavior.
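
    The termination rules compared above are easy to simulate; the sketch below is a toy model of my own construction (not the study's software) of a centipede game in which each player passes until a planned exit node while the computer may end the game at random after every node. The termination probability here is arbitrary; a constant 1/12 per node implies an expected termination after roughly 12 nodes, ignoring the 24-node cap.

    ```python
    import random

    def play(exit_a, exit_b, n_nodes=24, p_terminate=1 / 12, rng=random):
        """Return (last node, reason); players move at alternating nodes."""
        for node in range(1, n_nodes + 1):
            mover_exit = exit_a if node % 2 == 1 else exit_b
            if node >= mover_exit:
                return node, "take"      # the mover stops and takes the payoff
            if rng.random() < p_terminate:
                return node, "random"    # random game termination
        return n_nodes, "end"

    print(play(exit_a=19, exit_b=24))
    ```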

  10. Social media to supplement point-of-care ultrasound courses: the "sandwich e-learning" approach. A randomized trial.

    Science.gov (United States)

    Hempel, Dorothea; Haunhorst, Stephanie; Sinnathurai, Sivajini; Seibel, Armin; Recker, Florian; Heringer, Frank; Michels, Guido; Breitkreutz, Raoul

    2016-12-01

    Point-of-care ultrasound (POC-US) is gaining importance in almost all specialties. E-learning has been used to teach theoretical knowledge and pattern recognition. As social media are universally available, they can be utilized for educational purposes. We wanted to evaluate the utility of the sandwich e-learning approach defined as a pre-course e-learning and a post-course learning activity using Facebook after a one-day point-of-care ultrasound (POC-US) course and its effect on the retention of knowledge. A total of 62 medial students were recruited for this study and randomly assigned to one of four groups. All groups received an identical hands-on training and performed several tests during the study period. The hands-on training was performed in groups of five students per instructor with the students scanning each other. Group 1 had access to pre-course e-learning, but not to post-course e-learning. Instead of a pre-course e-learning, group 2 listened to presentations at the day of the course (classroom teaching) and had access to the post-course learning activity using Facebook. Group 3 had access to both pre- and post-course e-learning (sandwich e-learning) activities, while group 4 listened classroom presentations only (classroom teaching only). Therefore only groups 2 and 3 had access to post-course learning via Facebook by joining a secured group. Posts containing ultrasound pictures and videos were published to this group. The students were asked to "like" the posts to monitor attendance. Knowledge retention was assessed 6 weeks after the course. After 6 weeks, group 3 achieved comparable results when compared to group 2 (82.2 % + -8.2 vs. 84.3 + -8.02) (p = 0.3). Students who participated in the post-course activity were more satisfied with the overall course than students without post-course learning (5.5 vs. 5.3 on a range from 1 to 6). In this study, the sandwich e-learning approach led to equal rates of knowledge retention compared to

  11. Programmable disorder in random DNA tilings

    Science.gov (United States)

    Tikhomirov, Grigory; Petersen, Philip; Qian, Lulu

    2017-03-01

    Scaling up the complexity and diversity of synthetic molecular structures will require strategies that exploit the inherent stochasticity of molecular systems in a controlled fashion. Here we demonstrate a framework for programming random DNA tilings and show how to control the properties of global patterns through simple, local rules. We constructed three general forms of planar network—random loops, mazes and trees—on the surface of self-assembled DNA origami arrays on the micrometre scale with nanometre resolution. Using simple molecular building blocks and robust experimental conditions, we demonstrate control of a wide range of properties of the random networks, including the branching rules, the growth directions, the proximity between adjacent networks and the size distribution. Much as combinatorial approaches for generating random one-dimensional chains of polymers have been used to revolutionize chemical synthesis and the selection of functional nucleic acids, our strategy extends these principles to random two-dimensional networks of molecules and creates new opportunities for fabricating more complex molecular devices that are organized by DNA nanostructures.

  12. The phase diagrams of a ferromagnetic thin film in a random magnetic field

    Energy Technology Data Exchange (ETDEWEB)

    Zaim, N.; Zaim, A., E-mail: ah_zaim@yahoo.fr; Kerouad, M., E-mail: m.kerouad@fs-umi.ac.ma

    2016-10-07

    In this paper, the magnetic properties and the phase diagrams of a ferromagnetic thin film with a thickness N in a random magnetic field (RMF) are investigated by using the Monte Carlo simulation technique based on the Metropolis algorithm. The effects of the RMF and the surface exchange interaction on the critical behavior are studied. A variety of multicritical points such as tricritical points, isolated critical points, and triple points are obtained. It is also found that the double reentrant phenomenon can appear for appropriate values of the system parameters. Highlights:
    • Phase diagrams of a ferromagnetic thin film are examined by Monte Carlo simulation.
    • The effect of the random magnetic field on the magnetic properties is studied.
    • Different types of phase diagrams are obtained.
    • The dependence of the magnetization and susceptibility on temperature is investigated.
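
    A minimal sketch of the Metropolis update used in such simulations is shown below for a single 16 x 16 Ising layer with quenched random fields; the film geometry, surface exchange coupling, and parameter values of the paper are omitted for brevity.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    L, J, T, h_max = 16, 1.0, 2.0, 0.5
    spins = rng.choice([-1, 1], size=(L, L))
    h = rng.uniform(-h_max, h_max, size=(L, L))   # quenched random fields

    def metropolis_sweep(spins, h):
        """One Monte Carlo sweep: L*L single-spin-flip Metropolis attempts."""
        for _ in range(spins.size):
            i, j = rng.integers(L), rng.integers(L)
            nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                  + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            dE = 2 * spins[i, j] * (J * nn + h[i, j])   # cost of flipping (i, j)
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                spins[i, j] *= -1
        return spins

    for sweep in range(100):
        metropolis_sweep(spins, h)
    print(abs(spins.mean()))             # magnetization per spin
    ```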

  13. Key Aspects of Nucleic Acid Library Design for in Vitro Selection

    Science.gov (United States)

    Vorobyeva, Maria A.; Davydova, Anna S.; Vorobjev, Pavel E.; Pyshnyi, Dmitrii V.; Venyaminova, Alya G.

    2018-01-01

    Nucleic acid aptamers capable of selectively recognizing their target molecules have nowadays been established as powerful and tunable tools for biospecific applications, be it therapeutics, drug delivery systems or biosensors. It is now generally acknowledged that in vitro selection enables one to generate aptamers to almost any target of interest. However, the success of selection and the affinity of the resulting aptamers depend to a large extent on the nature and design of an initial random nucleic acid library. In this review, we summarize and discuss the most important features of the design of nucleic acid libraries for in vitro selection such as the nature of the library (DNA, RNA or modified nucleotides), the length of a randomized region and the presence of fixed sequences. We also compare and contrast different randomization strategies and consider computer methods of library design and some other aspects. PMID:29401748
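
    The basic library layout discussed above (a randomized core flanked by fixed primer-binding regions) is simple to emulate in silico; the flank sequences below are invented placeholders, not primers from the review.

    ```python
    import random

    FIXED_5 = "GGGAGACAAGAATAAACGCTCAA"    # hypothetical constant 5' region
    FIXED_3 = "TTCGACAGGAGGCTCACAACAGGC"   # hypothetical constant 3' region

    def random_library_member(n_random=40, rng=random):
        """One library molecule: fixed flanks around an n_random-nt random core."""
        core = "".join(rng.choice("ACGT") for _ in range(n_random))
        return FIXED_5 + core + FIXED_3

    library = [random_library_member() for _ in range(5)]
    print(library[0], len(library[0]))
    ```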

  14. Use of cluster analysis and preference mapping to evaluate consumer acceptability of choice and select bovine M. longissimus lumborum steaks cooked to various end-point temperatures.

    Science.gov (United States)

    Schmidt, T B; Schilling, M W; Behrends, J M; Battula, V; Jackson, V; Sekhon, R K; Lawrence, T E

    2010-01-01

    Consumer research was conducted to evaluate the acceptability of choice and select steaks from the Longissimus lumborum that were cooked to varying degrees of doneness, using demographic information, cluster analysis and descriptive analysis. On average, using data from approximately 155 panelists, no differences (P > 0.05) existed in consumer acceptability among select and choice steaks, and all treatment means ranged between like slightly and like moderately (6-7) on the hedonic scale. Individual consumers were highly variable in their perception of acceptability, and consumers were grouped into clusters (eight for select and seven for choice) based on their preference and liking of steaks. The largest consumer groups liked steaks from all treatments, but other groups preferred steaks from particular treatments (P < 0.05). Consumers could be grouped together according to preference, liking and descriptive sensory attributes (juiciness, tenderness, bloody, metallic, and roasted) to further understand consumer perception of steaks that were cooked to different end-point temperatures.
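
    A toy version of the clustering step is sketched below with invented ratings on a 9-point hedonic scale; the study's actual preference-mapping procedure is not reproduced.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    ratings = rng.integers(1, 10, size=(155, 6))   # 155 panelists x 6 toy treatments

    clusters = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(ratings)
    # Mean rating per treatment within each cluster shows which degrees of
    # doneness each consumer segment prefers.
    for c in range(8):
        print(c, ratings[clusters == c].mean(axis=0).round(1))
    ```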

  15. Detecting bilateral motor associated areas with resting state functional magnetic resonance: the effect of different seed points selection on the results

    International Nuclear Information System (INIS)

    Yi Huiming; Yang Mingming; Meng Liangliang; Zhang Jing

    2011-01-01

    Objective: To investigate the effect of different seed point selection on localizing bilateral hand motor associated areas in resting-state functional magnetic resonance imaging. Methods: Thirty-one subjects were recruited (15 male, 16 female); all of them underwent both a block-designed fMRI scan while performing a bilateral hand motor task and a resting-state fMRI scan. DPARSF V2.0 and SPM8 were used to process the data. The peak voxels in the activity map of the task scan were selected as seeds to compute functional connectivity maps of the resting-state scan. Spatial correlation analysis was performed to compare the activity map of the task scan and the connectivity map of the resting-state scan. Results: Fifteen isolated clusters were picked to generate the peak voxels, which were selected as seeds to compute functional connectivity maps. Among all the functional connectivity maps, those generated by the supplementary motor area (SMA) presented the most consistent spatial distribution with the task-associated activity map, and the functional connectivity maps generated by the primary motor cortex (M1) and the dorsal premotor cortex (PMd) consisted of bilateral M1 and SMA. The functional connectivity maps generated by the putamen (Pu), thalamus (Th), cerebellum anterior lobe (CbAL) and cerebellum posterior lobe (CbPL) consisted of the areas around the seeds and the mirror areas in the contralateral cortex. Conclusion: Using the SMA as seed to compute the resting-state functional connectivity map may produce the best spatial coherence with the activity map generated by a bilateral hand motor task, and selecting M1 and PMd as seeds may best delineate the primary motor cortex in the connectivity map. (authors)
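
    Seed-based connectivity itself reduces to correlating one voxel's time series with all others; the sketch below is a generic illustration on synthetic data, not the DPARSF/SPM8 pipeline used in the study.

    ```python
    import numpy as np

    def seed_connectivity(bold, seed_idx):
        """bold: (n_voxels, n_timepoints); returns the Pearson r of each
        voxel's time series with the seed voxel's time series."""
        seed = bold[seed_idx] - bold[seed_idx].mean()
        bold_c = bold - bold.mean(axis=1, keepdims=True)
        num = bold_c @ seed
        den = np.sqrt((bold_c ** 2).sum(axis=1) * (seed ** 2).sum())
        return num / den

    bold = np.random.default_rng(1).standard_normal((500, 240))  # toy resting-state data
    r_map = seed_connectivity(bold, seed_idx=42)   # e.g. the SMA peak voxel as seed
    print(r_map[:5].round(3))
    ```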

  16. From elongated spanning trees to vicious random walks

    OpenAIRE

    Gorsky, A.; Nechaev, S.; Poghosyan, V. S.; Priezzhev, V. B.

    2012-01-01

    Given a spanning forest on a large square lattice, we consider by combinatorial methods a correlation function of $k$ paths ($k$ is odd) along branches of trees or, equivalently, of $k$ loop-erased random walks. The starting and ending points of the paths are grouped in the fashion of a $k$-leg watermelon. For large distance $r$ between the groups of starting and ending points, the ratio of the number of watermelon configurations to the total number of spanning trees decays as a power law $r^{-\nu}$, with an exponent $\nu$ that depends on $k$.
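
    For intuition, a loop-erased random walk is easy to generate directly: walk at random and erase each loop as soon as it closes. The sketch below (lattice size and stopping rule are my choices) produces one such path on a torus.

    ```python
    import random

    def loop_erased_walk(start, is_target, size, rng=random):
        """Random walk on a size x size torus, erasing loops as they form,
        until is_target(site) is True; returns the loop-free path."""
        path, index = [start], {start: 0}
        while not is_target(path[-1]):
            x, y = path[-1]
            dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            nxt = ((x + dx) % size, (y + dy) % size)
            if nxt in index:                  # loop closed: erase it
                cut = index[nxt] + 1
                for site in path[cut:]:
                    del index[site]
                path = path[:cut]
            else:
                index[nxt] = len(path)
                path.append(nxt)
        return path

    walk = loop_erased_walk((0, 0), lambda s: s[0] == 15, size=32)
    print(len(walk))
    ```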

  17. SURFACE FITTING FILTERING OF LIDAR POINT CLOUD WITH WAVEFORM INFORMATION

    Directory of Open Access Journals (Sweden)

    S. Xing

    2017-09-01

    Full Text Available Full-waveform LiDAR is an active technology of photogrammetry and remote sensing. It provides more detailed information about objects along the path of a laser pulse than discrete-return topographic LiDAR. Point clouds and waveform information of high quality can be obtained by waveform decomposition, which can contribute to accurate filtering. A surface-fitting filtering method that uses waveform information is proposed to exploit this advantage. Firstly, the discrete point cloud and waveform parameters are resolved by globally convergent Levenberg-Marquardt decomposition. Secondly, ground seed points are selected, and abnormal ones among them are detected by waveform parameters and robust estimation. Thirdly, the terrain surface is fitted, and the height difference threshold is determined in consideration of window size and mean square error. Finally, the points are classified gradually as the window size rises; the filtering process finishes once the window size exceeds the threshold. Waveform data from urban, farmland and mountain areas of the "WATER" (Watershed Allied Telemetry Experimental Research) campaign were selected for experiments. Results prove that, compared with the traditional method, the accuracy of point cloud filtering is further improved, and the proposed method has high practical value.
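
    A stripped-down version of the surface-fitting idea (without the waveform parameters the method adds) is sketched below: grid minima serve as ground seeds, a plane is fitted to them, and points within a height threshold of the fitted surface are labeled ground.

    ```python
    import numpy as np

    def filter_ground(points, cell=5.0, dz=0.3):
        """points: (N, 3) array of x, y, z; returns a boolean ground mask."""
        ij = np.floor(points[:, :2] / cell).astype(int)
        seeds = {}
        for k, key in enumerate(map(tuple, ij)):
            if key not in seeds or points[k, 2] < points[seeds[key], 2]:
                seeds[key] = k                        # lowest point per grid cell
        s = points[list(seeds.values())]
        # least-squares plane z = a*x + b*y + c through the seed points
        coef, *_ = np.linalg.lstsq(
            np.c_[s[:, 0], s[:, 1], np.ones(len(s))], s[:, 2], rcond=None)
        z_fit = points[:, :2] @ coef[:2] + coef[2]
        return np.abs(points[:, 2] - z_fit) < dz

    pts = np.random.default_rng(0).uniform(0, 100, size=(1000, 3))  # toy cloud
    print(filter_ground(pts).sum(), "points labeled ground")
    ```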

  18. DNABP: Identification of DNA-Binding Proteins Based on Feature Selection Using a Random Forest and Predicting Binding Residues.

    Science.gov (United States)

    Ma, Xin; Guo, Jing; Sun, Xiao

    2016-01-01

    DNA-binding proteins are fundamentally important in cellular processes. Several computational methods have been developed to improve the prediction of DNA-binding proteins in recent years. However, insufficient work has been done on predicting DNA-binding proteins from protein sequence information. In this paper, a novel predictor, DNABP (DNA-binding proteins), was designed to predict DNA-binding proteins using the random forest (RF) classifier with a hybrid feature. The hybrid feature contains two types of novel sequence features, which reflect information about the conservation of physicochemical properties of the amino acids, and the binding propensity of DNA-binding residues and non-binding propensity of non-binding residues. The comparisons with each feature demonstrated that these two novel features contributed most to the improvement in predictive ability. Furthermore, to improve the prediction performance of the DNABP model, feature selection using the minimum redundancy maximum relevance (mRMR) method combined with incremental feature selection (IFS) was carried out during model construction. The results showed that the DNABP model could achieve 86.90% accuracy, 83.76% sensitivity, 90.03% specificity and a Matthews correlation coefficient of 0.727. The high prediction accuracy and performance comparisons with previous research suggest that DNABP could be a useful approach to identify DNA-binding proteins from sequence information. The DNABP web server system is freely available at http://www.cbi.seu.edu.cn/DNABP/.
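
    A toy version of the classification core (a random forest on sequence-derived features) looks as follows; the features and labels are synthetic stand-ins, so accuracy here is near chance, and a greedy ranking by feature importance stands in for the paper's mRMR + IFS selection.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 40))   # 40 hypothetical sequence features
    y = rng.integers(0, 2, 200)          # 1 = DNA-binding, 0 = non-binding

    clf = RandomForestClassifier(n_estimators=300, random_state=0)
    print(cross_val_score(clf, X, y, cv=5).mean())   # cross-validated accuracy

    clf.fit(X, y)
    ranked = np.argsort(clf.feature_importances_)[::-1]
    print(ranked[:10])                   # candidate features for incremental selection
    ```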

  19. Time-lapse culture with morphokinetic embryo selection improves pregnancy and live birth chances and reduces early pregnancy loss: a meta-analysis.

    Science.gov (United States)

    Pribenszky, Csaba; Nilselid, Anna-Maria; Montag, Markus

    2017-11-01

    Embryo evaluation and selection is fundamental in clinical IVF. Time-lapse follow-up of embryo development comprises undisturbed culture and the application of the visual information to support embryo evaluation. A meta-analysis of randomized controlled trials was carried out to study whether time-lapse monitoring with the prospective use of a morphokinetic algorithm for selection of embryos improves overall clinical outcome (pregnancy, early pregnancy loss, stillbirth and live birth rate) compared with embryo selection based on single time-point morphology in IVF cycles. The meta-analysis of five randomized controlled trials (n = 1637) showed that the application of time-lapse monitoring was associated with a significantly higher ongoing clinical pregnancy rate (51.0% versus 39.9%; pooled odds ratio 1.542), a significantly reduced early pregnancy loss (15.3% versus 21.3%; OR 0.662; P = 0.019) and a significantly increased live birth rate (44.2% versus 31.3%; OR 1.668; P = 0.009). The difference in stillbirth was not significant between groups (4.7% versus 2.4%). Quality of the evidence was moderate to low owing to inconsistencies across the studies. Selective application and variability were also limitations. Although time-lapse monitoring is shown to significantly improve overall clinical outcome, further high-quality evidence is needed before universal conclusions can be drawn. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
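
    The pooled odds ratios reported above come from standard meta-analytic arithmetic; the sketch below computes a fixed-effect (inverse-variance) pooled OR from made-up 2 x 2 trial counts, purely to illustrate the calculation.

    ```python
    import math

    def log_or_and_var(a, b, c, d):
        """a/b: events/non-events in the time-lapse arm; c/d: control arm."""
        return math.log((a * d) / (b * c)), 1 / a + 1 / b + 1 / c + 1 / d

    trials = [(51, 49, 40, 60), (82, 78, 64, 96)]   # hypothetical counts
    num = den = 0.0
    for t in trials:
        lor, var = log_or_and_var(*t)
        w = 1.0 / var                     # inverse-variance weight
        num += w * lor
        den += w
    print(math.exp(num / den))            # fixed-effect pooled odds ratio
    ```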

  20. FIGHT ZONE WITH POINTS OF THE SHOTOKAN KARATE FEMALE COMPETITION

    OpenAIRE

    Nelson Kautzner Marques Junior

    2014-01-01

    The objective of the study was to determine the fight zones in which points were scored during female competition kumite. A quantitative approach was used to identify the fight zones in which a point (ippon or waza-ari) was or was not scored during female competition kumite. Several kumite championships of the JKA and of the ITKF were selected from the Internet. The study detected a high probability of a point being scored in zone 7 and in zone 2. The study determined that most points at the corner occurred when ...