WorldWideScience

Sample records for random sampling pattern

  1. Generation and Analysis of Constrained Random Sampling Patterns

    DEFF Research Database (Denmark)

    Pierzchlewski, Jacek; Arildsen, Thomas

    2016-01-01

    Random sampling is a technique for signal acquisition which is gaining popularity in practical signal processing systems. Nowadays, event-driven analog-to-digital converters make random sampling feasible in practical applications. A process of random sampling is defined by a sampling pattern, which... indicates signal sampling points in time. Practical random sampling patterns are constrained by ADC characteristics and application requirements. In this paper, we introduce statistical methods which evaluate random sampling pattern generators with emphasis on practical applications. Furthermore, we propose... algorithm generates random sampling patterns dedicated to event-driven ADCs better than existing sampling pattern generators. Finally, implementation issues of random sampling patterns are discussed...

  2. BWIP-RANDOM-SAMPLING, Random Sample Generation for Nuclear Waste Disposal

    International Nuclear Information System (INIS)

    Sagar, B.

    1989-01-01

    1 - Description of program or function: Random samples for different distribution types are generated. Distribution types as required for performance assessment modeling of geologic nuclear waste disposal are provided. These are: - Uniform, - Log-uniform (base 10 or natural), - Normal, - Lognormal (base 10 or natural), - Exponential, - Bernoulli, - User defined continuous distribution. 2 - Method of solution: A linear congruential generator is used for uniform random numbers. A set of functions is used to transform the uniform distribution to the other distributions. Stratified, rather than random, sampling can be chosen. Truncation limits can be specified for many distributions whose usual definition has infinite support. 3 - Restrictions on the complexity of the problem: Generation of correlated random variables is not included.
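    The method of solution described above (a linear congruential generator for uniforms, plus transform functions for the other distributions) can be sketched as follows. The LCG constants and helper names here are illustrative assumptions, not BWIP's actual code:

```python
import math

class LCG:
    """Linear congruential generator: x_{n+1} = (a*x_n + c) mod m."""

    def __init__(self, seed=12345):
        self.state = seed

    def uniform(self):
        # Constants from Numerical Recipes; illustrative, not BWIP's own.
        self.state = (1664525 * self.state + 1013904223) % 2**32
        return self.state / 2**32

def exponential(rng, rate):
    # Inverse transform: if U ~ Uniform(0,1), then -ln(1-U)/rate ~ Exp(rate).
    return -math.log(1.0 - rng.uniform()) / rate

def log_uniform(rng, low, high, base=10.0):
    # Uniform on the log (base 10 or natural) scale between low and high.
    lo, hi = math.log(low, base), math.log(high, base)
    return base ** (lo + (hi - lo) * rng.uniform())

rng = LCG(seed=42)
draws = [exponential(rng, rate=2.0) for _ in range(5)]
```

    The same inverse-transform idea extends to the other listed distributions; truncation can be imposed by rejecting draws outside the specified limits.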

  3. A Bayesian Justification for Random Sampling in Sample Survey

    Directory of Open Access Journals (Sweden)

    Glen Meeden

    2012-07-01

    Full Text Available In the usual Bayesian approach to survey sampling the sampling design plays a minimal role, at best. Although a close relationship between exchangeable prior distributions and simple random sampling has been noted, how to formally integrate simple random sampling into the Bayesian paradigm is not clear. Recently it has been argued that the sampling design can be thought of as part of a Bayesian's prior distribution. We will show here that under this scenario simple random sampling can be given a Bayesian justification in survey sampling.

  4. An unbiased estimator of the variance of simple random sampling using mixed random-systematic sampling

    OpenAIRE

    Padilla, Alberto

    2009-01-01

    Systematic sampling is a commonly used technique due to its simplicity and ease of implementation. The drawback of this simplicity is that it is not possible to estimate the design variance without bias. There are several ways to circumvent this problem. One method is to suppose that the variable of interest has a random order in the population, so the sample variance of simple random sampling without replacement is used. By means of a mixed random-systematic sample, an unbiased estimator o...

  5. INTERACTIONS BETWEEN MODULATED LUMINANCE PATTERNS AND RANDOM-DOT PATTERNS

    NARCIS (Netherlands)

    CORNELISSEN, FW; KOOIJMAN, AC

    1994-01-01

    It has been suggested that density modulated random-dot patterns can be used to study higher order pattern vision [Van Meeteren and Barlow (1981) Vision Research, 21, 765-777]. The high contrast dots of which the pattern is composed, are assumed to be reliably transduced-and transmitted by the lower

  6. Systematic versus random sampling in stereological studies.

    Science.gov (United States)

    West, Mark J

    2012-12-01

    The sampling that takes place at all levels of an experimental design must be random if the estimate is to be unbiased in a statistical sense. There are two fundamental ways by which one can make a random sample of the sections and positions to be probed on the sections. Using a card-sampling analogy, one can pick any card at all out of a deck of cards. This is referred to as independent random sampling because the sampling of any one card is made without reference to the position of the other cards. The other approach to obtaining a random sample is to pick one card at random from within a set number of cards and then take the others at equal intervals through the deck. Systematic sampling along one axis of many biological structures is more efficient than random sampling, because most biological structures are not randomly organized. This article discusses the merits of systematic versus random sampling in stereological studies.
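    The card-deck analogy can be made concrete with a short sketch; the helper names are hypothetical, and `n_items` plays the role of the deck:

```python
import random

def independent_random_sample(n_items, k, seed=None):
    # Pick any k "cards" without reference to each other's positions.
    rng = random.Random(seed)
    return sorted(rng.sample(range(n_items), k))

def systematic_random_sample(n_items, k, seed=None):
    # Pick one position at random within the first interval, then take
    # the remaining positions at equal intervals through the "deck".
    rng = random.Random(seed)
    interval = n_items // k
    start = rng.randrange(interval)
    return list(range(start, n_items, interval))[:k]

sections = 52  # e.g. 52 histological sections
independent = independent_random_sample(sections, 4, seed=1)
systematic = systematic_random_sample(sections, 4, seed=1)
```

    Both plans give every section the same inclusion probability; the systematic plan simply constrains the sampled positions to be equidistant.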

  7. k-Means: Random Sampling Procedure

    Indian Academy of Sciences (India)

    k-Means: Random Sampling Procedure. The optimal 1-mean is approximated by the centroid (Inaba et al.): if S is a random sample of size O(1/ε), then the centroid of S is a (1+ε)-approximate centroid of P with constant probability.
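    The claim attributed to Inaba et al. above can be illustrated numerically; the point set, sample size, and cost function below are illustrative choices, not taken from the talk:

```python
import random

def centroid(points):
    # Coordinate-wise mean of a list of points.
    d = len(points[0])
    return [sum(p[i] for p in points) / len(points) for i in range(d)]

def one_mean_cost(points, c):
    # 1-mean objective: sum of squared distances to candidate center c.
    return sum(sum((p[i] - c[i]) ** 2 for i in range(len(c))) for p in points)

rng = random.Random(0)
P = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(10_000)]

true_c = centroid(P)          # exact optimal 1-mean center
S = rng.sample(P, 100)        # sample size plays the role of O(1/eps)
approx_c = centroid(S)        # centroid of the random sample

# With constant probability this ratio is at most 1 + eps.
ratio = one_mean_cost(P, approx_c) / one_mean_cost(P, true_c)
```

    Larger samples correspond to smaller ε: the cost of the sample centroid approaches the optimal 1-mean cost as the sample grows.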

  8. Natural Radioactivity Pattern of Surabaya Water Environmental Samples

    International Nuclear Information System (INIS)

    Rosidi; Agus Taftazani

    2007-01-01

    The gross β radioactivity and natural radionuclide patterns of Surabaya environmental samples have been evaluated. The environmental samples were chosen randomly at 12 locations. The samples were water (fresh, estuary and coastal), sediment, eichhornia crassipes (Mart) Solms, mangrove (Rhizophora stylosa), (Moolgarda delicatus) fish and (Johnius (Johnieops) borneensis) (sharpnose hammer croaker) fish. The water samples were evaporated, the sediment samples were dried and ground, and the biotic samples were burnt at a temperature of 500 °C. Gross β was measured using a GM detector and the radionuclides were identified by γ spectrometry. From the investigation it could be concluded that the natural radioactivity of the environmental samples was very low: the gross β activities of the water samples were lower than the threshold value of local government regulation of Surabaya no. 2 year 2004 (1 Bq/L). The gross β activity of eichhornia crassipes (Mart) Solms was higher than that of the other biotic, water and sediment samples, showing that accumulation of radionuclides in water organisms has taken place. Identification by γ spectrometry detected 7 radionuclides, i.e. 210Pb, 212Pb, 214Pb, 208Tl, 214Bi, 228Ac and 40K, in all samples. The distribution factor of sediment F_D was less than the bioaccumulation factor of biota F_B, which indicates that radionuclide accumulation follows the migration pattern water - sediment - biotic sample. (author)

  9. Sampling problems for randomly broken sticks

    Energy Technology Data Exchange (ETDEWEB)

    Huillet, Thierry [Laboratoire de Physique Theorique et Modelisation, CNRS-UMR 8089 et Universite de Cergy-Pontoise, 5 mail Gay-Lussac, 95031, Neuville sur Oise (France)

    2003-04-11

    Consider the random partitioning model of a population (represented by a stick of length 1) into n species (fragments) with identically distributed random weights (sizes). Upon ranking the fragments' weights according to ascending sizes, let S_{m:n} be the size of the mth smallest fragment. Assume that some observer is sampling such populations as follows: drop at random k points (the sample size) onto this stick and record the corresponding numbers of visited fragments. We shall investigate the following sampling problems: (1) what is the sample size if the sampling is carried out until the first visit of the smallest fragment (size S_{1:n})? (2) For a given sample size, have all the fragments of the stick been visited at least once or not? This question is related to Feller's random coupon collector problem. (3) In what order are new fragments being discovered, and what is the random number of samples separating the discovery of consecutive new fragments until exhaustion of the list? For this problem, the distribution of the size-biased permutation of the species' weights (the sequence of their weights in their order of appearance) is needed and studied.
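    Question (2), whether a sample of k points visits all fragments, can be explored by simulation. This is an illustrative Monte Carlo sketch using uniform stick-breaking, not the paper's analytical treatment:

```python
import bisect
import random

def broken_stick(n, rng):
    # Partition [0, 1) into n fragments via n - 1 uniform cut points;
    # return the left endpoints of the n fragments.
    cuts = sorted(rng.random() for _ in range(n - 1))
    return [0.0] + cuts

def visited_fragments(left_ends, k, rng):
    # Drop k sample points on the stick; record which fragments they hit.
    return {bisect.bisect_right(left_ends, rng.random()) - 1 for _ in range(k)}

rng = random.Random(1)
n, k, trials = 5, 40, 2000
all_visited = sum(
    len(visited_fragments(broken_stick(n, rng), k, rng)) == n
    for _ in range(trials)
)
coverage_prob = all_visited / trials  # Monte Carlo estimate for question (2)
```

    The same simulation scaffolding extends to questions (1) and (3) by recording when the smallest fragment is first hit, or the order in which new fragments appear.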

  10. Independent random sampling methods

    CERN Document Server

    Martino, Luca; Míguez, Joaquín

    2018-01-01

    This book systematically addresses the design and analysis of efficient techniques for independent random sampling. Both general-purpose approaches, which can be used to generate samples from arbitrary probability distributions, and tailored techniques, designed to efficiently address common real-world practical problems, are introduced and discussed in detail. In turn, the monograph presents fundamental results and methodologies in the field, elaborating and developing them into the latest techniques. The theory and methods are illustrated with a varied collection of examples, which are discussed in detail in the text and supplemented with ready-to-run computer code. The main problem addressed in the book is how to generate independent random samples from an arbitrary probability distribution with the weakest possible constraints or assumptions in a form suitable for practical implementation. The authors review the fundamental results and methods in the field, address the latest methods, and emphasize the li...
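    As an example of the kind of general-purpose method such a book covers, here is a minimal rejection sampler, a classic technique for drawing independent samples from an arbitrary density; the triangular-density example is our own illustration, not taken from the book:

```python
import random

def rejection_sample(target_pdf, proposal_sample, proposal_pdf, M, rng):
    # Draw from the target by accepting proposals with probability
    # target_pdf(x) / (M * proposal_pdf(x)). Requires the envelope
    # condition target_pdf(x) <= M * proposal_pdf(x) for all x.
    while True:
        x = proposal_sample(rng)
        if rng.random() * M * proposal_pdf(x) <= target_pdf(x):
            return x

# Example: sample from the triangular density f(x) = 2x on [0, 1]
# using a uniform proposal with envelope constant M = 2.
rng = random.Random(0)
draws = [
    rejection_sample(
        target_pdf=lambda x: 2.0 * x,
        proposal_sample=lambda r: r.random(),
        proposal_pdf=lambda x: 1.0,
        M=2.0,
        rng=rng,
    )
    for _ in range(5000)
]
mean = sum(draws) / len(draws)  # E[X] = 2/3 for f(x) = 2x
```

    Each accepted draw is an exact, independent sample from the target; the price is the rejected proposals, on average M per accepted sample.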

  11. Sampling large random knots in a confined space

    International Nuclear Information System (INIS)

    Arsuaga, J; Blackstone, T; Diao, Y; Hinson, K; Karadayi, E; Saito, M

    2007-01-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is of the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.

  12. Sampling large random knots in a confined space

    Science.gov (United States)

    Arsuaga, J.; Blackstone, T.; Diao, Y.; Hinson, K.; Karadayi, E.; Saito, M.

    2007-09-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is of the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.

  13. Sampling large random knots in a confined space

    Energy Technology Data Exchange (ETDEWEB)

    Arsuaga, J [Department of Mathematics, San Francisco State University, 1600 Holloway Ave, San Francisco, CA 94132 (United States); Blackstone, T [Department of Computer Science, San Francisco State University, 1600 Holloway Ave., San Francisco, CA 94132 (United States); Diao, Y [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Hinson, K [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Karadayi, E [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States); Saito, M [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States)

    2007-09-28

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is of the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.
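    The basic construction behind the uniform random polygon model can be sketched in a few lines; the vertex count and confinement to the unit cube are illustrative choices, and the knot-invariant computations of the paper are well beyond this sketch:

```python
import random

def uniform_random_polygon(n, rng):
    # n vertices drawn independently and uniformly from the unit cube;
    # consecutive vertices are joined by straight segments, and the last
    # vertex is joined back to the first to close the polygon.
    return [(rng.random(), rng.random(), rng.random()) for _ in range(n)]

def edges(polygon):
    # Consecutive vertex pairs, including the closing edge.
    return list(zip(polygon, polygon[1:] + polygon[:1]))

rng = random.Random(7)
poly = uniform_random_polygon(50, rng)
poly_edges = edges(poly)
```

    Projecting such a polygon to 2D and counting segment crossings is the starting point for studying the diagram complexity discussed in the abstract.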

  14. Colorization-Based RGB-White Color Interpolation using Color Filter Array with Randomly Sampled Pattern.

    Science.gov (United States)

    Oh, Paul; Lee, Sukho; Kang, Moon Gi

    2017-06-28

    Recently, several RGB-White (RGBW) color filter arrays (CFAs) have been proposed, which have extra white (W) pixels in the filter array that are highly sensitive. Due to the high sensitivity, the W pixels have better SNR (Signal to Noise Ratio) characteristics than the other color pixels in the filter array, especially in low-light conditions. However, most RGBW CFAs are designed so that the acquired RGBW pattern image can be converted into the conventional Bayer pattern image, which is then again converted into the final color image by using conventional demosaicing methods, i.e., color interpolation techniques. In this paper, we propose a new RGBW color filter array based on a totally different color interpolation technique, the colorization algorithm. The colorization algorithm was initially proposed for colorizing a gray image into a color image using a small number of color seeds. Here, we adopt this algorithm as a color interpolation technique, so that the RGBW color filter array can be designed with a very large number of W pixels to make the most of the highly sensitive characteristics of the W channel. The resulting RGBW color filter array has a pattern with a large proportion of W pixels, while the small number of RGB pixels are randomly distributed over the array. The colorization algorithm makes it possible to reconstruct the colors from such a small number of RGB values. Due to the large proportion of W pixels, the reconstructed color image has a high SNR value, higher than those of conventional CFAs, especially in low-light conditions. Experimental results show that much important information that is not perceptible in color images reconstructed with conventional CFAs is perceptible in the images reconstructed with the proposed method.
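    A sampling pattern of the kind described, mostly W pixels with sparse, randomly placed RGB pixels, can be generated with a short sketch; the 25% RGB fraction and the uniform choice among R, G, and B are illustrative assumptions, not the paper's design:

```python
import random

def random_rgbw_cfa(width, height, rgb_fraction=0.25, rng=None):
    # Sketch in the spirit of the paper: a filter array that is mostly
    # highly sensitive W pixels, with a small fraction of R/G/B pixels
    # scattered at random positions (the color seeds for colorization).
    rng = rng or random.Random()
    n = width * height
    rgb_sites = set(rng.sample(range(n), int(n * rgb_fraction)))
    return [
        ["RGB"[rng.randrange(3)] if y * width + x in rgb_sites else "W"
         for x in range(width)]
        for y in range(height)
    ]

cfa = random_rgbw_cfa(8, 8, rng=random.Random(2))
w_count = sum(row.count("W") for row in cfa)
```

    The sparse RGB sites play the role of the colorization algorithm's color seeds, while the dense W sites carry the luminance.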

  15. spsann - optimization of sample patterns using spatial simulated annealing

    Science.gov (United States)

    Samuel-Rosa, Alessandro; Heuvelink, Gerard; Vasques, Gustavo; Anjos, Lúcia

    2015-04-01

    There are many algorithms and computer programs to optimize sample patterns, some private and others publicly available. A few have only been presented in scientific articles and textbooks. This dispersion and somewhat poor availability hold back their wider adoption and further development. We introduce spsann, a new R-package for the optimization of sample patterns using spatial simulated annealing. R is the most popular environment for data processing and analysis. Spatial simulated annealing is a well-known method in widespread use for solving optimization problems in the soil and geo-sciences, mainly due to its robustness against local optima and ease of implementation. spsann offers many optimization criteria for sampling for variogram estimation (number of points or point-pairs per lag distance class - PPL), trend estimation (association/correlation and marginal distribution of the covariates - ACDC), and spatial interpolation (mean squared shortest distance - MSSD). spsann also includes the mean or maximum universal kriging variance (MUKV) as an optimization criterion, which is used when the model of spatial variation is known. PPL, ACDC and MSSD were combined (PAN) for sampling when the model of spatial variation is unknown. spsann solves this multi-objective optimization problem by scaling the objective function values using their maximum absolute value or the mean value computed over 1000 random samples. Scaled values are aggregated using the weighted-sum method. A graphical display allows the user to follow how the sample pattern is perturbed during the optimization, as well as the evolution of its energy state. It is possible to start by perturbing many points and exponentially reduce the number of perturbed points. The maximum perturbation distance decreases linearly with the number of iterations. The acceptance probability also decreases exponentially with the number of iterations. 
R is memory hungry and spatial simulated annealing is a
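    The annealing scheme described above (perturb points, shrink the perturbation distance, decay the acceptance probability) can be sketched for the MSSD criterion; this is a toy illustration under our own schedule constants, not spsann's implementation:

```python
import math
import random

def mssd(sample, grid):
    # Mean squared shortest distance from each grid node to the sample.
    return sum(
        min((gx - sx) ** 2 + (gy - sy) ** 2 for sx, sy in sample)
        for gx, gy in grid
    ) / len(grid)

def anneal(sample, grid, iters, rng, t0=1e-2, t_end=1e-6, max_jitter=0.2):
    # Jitter one point at a time; accept worse states with a probability
    # that decays as the temperature cools exponentially, while the
    # maximum perturbation distance shrinks linearly with the iterations.
    current, energy = list(sample), mssd(sample, grid)
    best, best_e = list(current), energy
    for i in range(iters):
        frac = i / iters
        temp = t0 * (t_end / t0) ** frac
        jitter = max_jitter * (1.0 - frac)
        cand = list(current)
        j = rng.randrange(len(cand))
        x, y = cand[j]
        cand[j] = (min(1.0, max(0.0, x + rng.uniform(-jitter, jitter))),
                   min(1.0, max(0.0, y + rng.uniform(-jitter, jitter))))
        e = mssd(cand, grid)
        if e < energy or rng.random() < math.exp(-(e - energy) / temp):
            current, energy = cand, e
            if e < best_e:
                best, best_e = list(cand), e
    return best, best_e

rng = random.Random(3)
grid = [(i / 9, j / 9) for i in range(10) for j in range(10)]
start = [(rng.random(), rng.random()) for _ in range(5)]
optimized, final_e = anneal(start, grid, iters=500, rng=rng)
```

    Minimizing MSSD spreads the sample points evenly over the prediction grid, which is the spatial-interpolation criterion mentioned in the record.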

  16. Patterns of Childhood Abuse and Neglect in a Representative German Population Sample.

    Directory of Open Access Journals (Sweden)

    Christoph Schilling

    Full Text Available Different types of childhood maltreatment, like emotional abuse, emotional neglect, physical abuse, physical neglect and sexual abuse, are interrelated because of their co-occurrence. Different patterns of childhood abuse and neglect are associated with the degree of severity of mental disorders in adulthood. The purpose of this study was (a) to identify different patterns of childhood maltreatment in a representative German community sample, (b) to replicate the patterns of childhood neglect and abuse recently found in a clinical German sample, (c) to examine whether participants reporting exposure to specific patterns of child maltreatment would report different levels of psychological distress, and (d) to compare the results of the typological approach and the results of a cumulative risk model based on our data set. In a cross-sectional survey conducted in 2010, a representative random sample of 2504 German participants aged between 14 and 92 years completed the Childhood Trauma Questionnaire (CTQ). General anxiety and depression were assessed by standardized questionnaires (GAD-2, PHQ-2). Cluster analysis was conducted with the CTQ subscales to identify different patterns of childhood maltreatment. Three different patterns of childhood abuse and neglect could be identified by cluster analysis. Cluster one showed low values on all CTQ scales. Cluster two showed high values in emotional and physical neglect. Only cluster three showed high values in physical and sexual abuse. The three patterns of childhood maltreatment showed different degrees of depression (PHQ-2) and anxiety (GAD-2). Cluster one showed the lowest levels of psychological distress; cluster three showed the highest levels of mental distress. The results show that different types of childhood maltreatment are interrelated and can be grouped into specific patterns of childhood abuse and neglect, which are associated with differing severity of psychological distress in adulthood. The results

  17. Patterns of Childhood Abuse and Neglect in a Representative German Population Sample

    Science.gov (United States)

    Schilling, Christoph; Weidner, Kerstin; Brähler, Elmar; Glaesmer, Heide; Häuser, Winfried; Pöhlmann, Karin

    2016-01-01

    Background Different types of childhood maltreatment, like emotional abuse, emotional neglect, physical abuse, physical neglect and sexual abuse, are interrelated because of their co-occurrence. Different patterns of childhood abuse and neglect are associated with the degree of severity of mental disorders in adulthood. The purpose of this study was (a) to identify different patterns of childhood maltreatment in a representative German community sample, (b) to replicate the patterns of childhood neglect and abuse recently found in a clinical German sample, (c) to examine whether participants reporting exposure to specific patterns of child maltreatment would report different levels of psychological distress, and (d) to compare the results of the typological approach and the results of a cumulative risk model based on our data set. Methods In a cross-sectional survey conducted in 2010, a representative random sample of 2504 German participants aged between 14 and 92 years completed the Childhood Trauma Questionnaire (CTQ). General anxiety and depression were assessed by standardized questionnaires (GAD-2, PHQ-2). Cluster analysis was conducted with the CTQ subscales to identify different patterns of childhood maltreatment. Results Three different patterns of childhood abuse and neglect could be identified by cluster analysis. Cluster one showed low values on all CTQ scales. Cluster two showed high values in emotional and physical neglect. Only cluster three showed high values in physical and sexual abuse. The three patterns of childhood maltreatment showed different degrees of depression (PHQ-2) and anxiety (GAD-2). Cluster one showed the lowest levels of psychological distress; cluster three showed the highest levels of mental distress. 
Conclusion The results show that different types of childhood maltreatment are interrelated and can be grouped into specific patterns of childhood abuse and neglect, which are associated with differing severity of psychological distress in

  18. Efficient sampling of complex network with modified random walk strategies

    Science.gov (United States)

    Xie, Yunya; Chang, Shuhua; Zhang, Zhipeng; Zhang, Mi; Yang, Lei

    2018-02-01

    We present two novel random walk strategies: choosing seed node (CSN) random walk and no-retracing (NR) random walk. Different from classical random walk sampling, the CSN and NR strategies focus on the influence of the seed node choice and of path overlap, respectively. The three random walk samplings are applied to the Erdős-Rényi (ER), Barabási-Albert (BA), Watts-Strogatz (WS), and weighted USAir networks. Then the major properties of the sampled subnets, such as sampling efficiency, degree distributions, average degree and average clustering coefficient, are studied. Similar conclusions can be reached with all three random walk strategies. First, networks with small scales and simple structures are conducive to sampling. Second, the average degree and the average clustering coefficient of the sampled subnet tend to the corresponding values of the original networks within a limited number of steps. Third, all the degree distributions of the subnets are slightly biased toward the high-degree side. However, the NR strategy performs better for the average clustering coefficient of the subnet. In the real weighted USAir network, some distinctive characteristics, such as the larger clustering coefficient and the fluctuation of the degree distribution, are reproduced well by these random walk strategies.
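    One plausible reading of the no-retracing (NR) strategy, avoiding stepping straight back along the edge just traversed, can be sketched as follows; the interpretation and the toy graph are our assumptions, not the authors' code:

```python
import random

def nr_random_walk(adj, start, steps, rng):
    # "No-retracing" walk: avoid returning directly along the edge just
    # traversed whenever any other neighbor is available.
    walk = [start]
    prev = None
    for _ in range(steps):
        node = walk[-1]
        choices = [n for n in adj[node] if n != prev] or adj[node]
        prev = node
        walk.append(rng.choice(choices))
    return walk

# Toy graph: a 5-cycle (each node linked to its two neighbors).
adj = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}
rng = random.Random(0)
walk = nr_random_walk(adj, 0, 20, rng)
sampled_nodes = set(walk)  # node set of the sampled subnet
```

    Forbidding immediate backtracking reduces path overlap, so the walk tends to reach fresh nodes with fewer steps than a classical random walk.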

  19. RandomSpot: A web-based tool for systematic random sampling of virtual slides.

    Science.gov (United States)

    Wright, Alexander I; Grabsch, Heike I; Treanor, Darren E

    2015-01-01

    This paper describes work presented at the Nordic Symposium on Digital Pathology 2014, Linköping, Sweden. Systematic random sampling (SRS) is a stereological tool, which provides a framework to quickly build an accurate estimation of the distribution of objects or classes within an image, whilst minimizing the number of observations required. RandomSpot is a web-based tool for SRS in stereology, which systematically places equidistant points within a given region of interest on a virtual slide. Each point can then be visually inspected by a pathologist in order to generate an unbiased sample of the distribution of classes within the tissue. Further measurements can then be derived from the distribution, such as the ratio of tumor to stroma. RandomSpot replicates the fundamental principle of traditional light microscope grid-shaped graticules, with the added benefits associated with virtual slides, such as facilitated collaboration and automated navigation between points. Once the sample points have been added to the region(s) of interest, users can download the annotations and view them locally using their virtual slide viewing software. Since its introduction, RandomSpot has been used extensively for international collaborative projects, clinical trials and independent research projects. So far, the system has been used to generate over 21,000 sample sets, and has been used to generate data for use in multiple publications, identifying significant new prognostic markers in colorectal, upper gastro-intestinal and breast cancer. Data generated using RandomSpot also has significant value for training image analysis algorithms using sample point coordinates and pathologist classifications.
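    The underlying principle, equidistant points placed with a single random offset over a region of interest, can be sketched as follows; this is a simplified rectangular region, whereas RandomSpot itself works on virtual slides:

```python
import random

def srs_points(width, height, spacing, rng):
    # Systematic random sampling: a regular grid of equidistant points
    # shifted by one random offset, like a grid graticule dropped at a
    # random start position over the region of interest.
    ox, oy = rng.uniform(0, spacing), rng.uniform(0, spacing)
    points = []
    y = oy
    while y < height:
        x = ox
        while x < width:
            points.append((x, y))
            x += spacing
        y += spacing
    return points

pts = srs_points(1000, 800, spacing=100, rng=random.Random(5))
# Each point would then be classified by a pathologist (e.g. tumor vs
# stroma); the class proportions estimate ratios such as tumor to stroma.
```

    The single random offset is what makes the estimate unbiased: every location in the region has the same probability of receiving a point.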

  20. Acceptance sampling using judgmental and randomly selected samples

    Energy Technology Data Exchange (ETDEWEB)

    Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl

    2010-09-01

    We present a Bayesian model for acceptance sampling where the population consists of two groups, each with different levels of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail, and discuss the conditions necessary to ensure that required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems, and, in particular, to environmental sampling where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.
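    A conjugate single-group special case of this setting (all observed samples acceptable, a Beta prior on the acceptable fraction, large population) has a closed form; the sketch below is our simplification, not the paper's two-group model:

```python
def prob_exceeds(n_acceptable, p, a=1.0):
    # Beta(a, 1) prior on the acceptable fraction; after n randomly
    # sampled items that are all acceptable, the posterior is
    # Beta(a + n, 1), whose CDF at p is p**(a + n).
    return 1.0 - p ** (a + n_acceptable)

def samples_needed(p, confidence, a=1.0):
    # Smallest n with posterior P(acceptable fraction > p) >= confidence.
    n = 0
    while prob_exceeds(n, p, a) < confidence:
        n += 1
    return n

# All-acceptable samples needed to be 95% sure that over 99% of the
# (large) population is acceptable, under a uniform Beta(1, 1) prior.
n = samples_needed(0.99, 0.95)
```

    The judgmentally sampled high-risk group in the paper modifies this calculation; the point of the sketch is only the shape of the zero-failure inference.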

  1. Conditional Random Fields for Pattern Recognition Applied to Structured Data

    Directory of Open Access Journals (Sweden)

    Tom Burr

    2015-07-01

    Full Text Available Pattern recognition uses measurements from an input domain, X, to predict their labels from an output domain, Y. Image analysis is one setting where one might want to infer whether a pixel patch contains an object that is "manmade" (such as a building) or "natural" (such as a tree). Suppose the label for a pixel patch is "manmade"; the label for a nearby pixel patch is then more likely to be "manmade", so there is structure in the output domain that can be exploited to improve pattern recognition performance. Modeling P(X) is difficult because features between parts of the model are often correlated. Therefore, conditional random fields (CRFs) model structured data using the conditional distribution P(Y|X = x), without specifying a model for P(X), and are well suited for applications with dependent features. This paper has two parts. First, we overview CRFs and their application to pattern recognition in structured problems. Our primary examples are image analysis applications in which there is dependence among samples (pixel patches) in the output domain. Second, we identify research topics and present numerical examples.

  2. Social patterns revealed through random matrix theory

    Science.gov (United States)

    Sarkar, Camellia; Jalan, Sarika

    2014-11-01

    Despite the tremendous advancements in the field of network theory, very few studies have taken into consideration the weights of interactions, which emerge naturally in all real-world systems. Using random matrix analysis of a weighted social network, we demonstrate the profound impact of weights in interactions on emerging structural properties. The analysis reveals that randomness existing in a particular time frame affects the decisions of individuals, rendering them more freedom of choice in situations of financial security. While the structural organization of networks remains the same throughout all datasets, random matrix theory provides insight into the interaction pattern of individuals of the society in situations of crisis. It has also been contemplated that individual accountability in terms of weighted interactions remains a key to success unless segregation of tasks comes into play.

  3. A random spatial sampling method in a rural developing nation

    Science.gov (United States)

    Michelle C. Kondo; Kent D.W. Bream; Frances K. Barg; Charles C. Branas

    2014-01-01

    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method...

  4. Determination of the influence of dispersion pattern of pesticide-resistant individuals on the reliability of resistance estimates using different sampling plans.

    Science.gov (United States)

    Shah, R; Worner, S P; Chapman, R B

    2012-10-01

    Pesticide resistance monitoring includes resistance detection and subsequent documentation/measurement. Resistance detection requires at least one (≥1) resistant individual to be present in a sample to initiate management strategies. Resistance documentation, on the other hand, attempts to estimate the entire population (≥90%) of the resistant individuals. A computer simulation model was used to compare the efficiency of simple random and systematic sampling plans to detect resistant individuals and to document their frequencies when the resistant individuals were randomly or patchily distributed. A patchy dispersion pattern of resistant individuals influenced the sampling efficiency of systematic sampling plans, while the efficiency of random sampling was independent of such patchiness. When resistant individuals were randomly distributed, the sample sizes required to detect at least one resistant individual (resistance detection) with a probability of 0.95 were 300 (1%) and 50 (10% and 20%); whereas, when resistant individuals were patchily distributed, using systematic sampling, the sample sizes required for such detection were 6000 (1%), 600 (10%) and 300 (20%). Sample sizes of 900 and 400 would be required to detect ≥90% of resistant individuals (resistance documentation) with a probability of 0.95 when resistant individuals were randomly dispersed and present at a frequency of 10% and 20%, respectively; whereas, when resistant individuals were patchily distributed, using systematic sampling, sample sizes of 3000 and 1500, respectively, were necessary. Small sample sizes either underestimated or overestimated the resistance frequency. A simple random sampling plan is therefore recommended for insecticide resistance detection and subsequent documentation.
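    For randomly dispersed resistant individuals, the standard detection calculation solves 1 - (1 - f)^n >= 0.95 for the sample size n. The sketch below reproduces the reported figure at a 1% frequency (about 300); the study's figures at higher frequencies (50) also reflect its simulation design, so this is a baseline illustration only:

```python
import math

def detection_sample_size(freq, prob=0.95):
    # Smallest simple random sample that contains at least one resistant
    # individual with the given probability, assuming resistant
    # individuals occur at frequency `freq` and are randomly dispersed:
    # solve 1 - (1 - freq)**n >= prob for n.
    return math.ceil(math.log(1.0 - prob) / math.log(1.0 - freq))

sizes = {f: detection_sample_size(f) for f in (0.01, 0.10, 0.20)}
```

    Under patchy dispersion this formula no longer applies to systematic plans, which is precisely why the study reports much larger systematic sample sizes in that case.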

  5. Memory for Random Time Patterns in Audition, Touch, and Vision.

    Science.gov (United States)

    Kang, HiJee; Lancelin, Denis; Pressnitzer, Daniel

    2018-03-22

    Perception deals with temporal sequences of events, like series of phonemes for audition, dynamic changes in pressure for touch textures, or moving objects for vision. Memory processes are thus needed to make sense of the temporal patterning of sensory information. Recently, we have shown that auditory temporal patterns could be learned rapidly and incidentally with repeated exposure [Kang et al., 2017]. Here, we tested whether rapid incidental learning of temporal patterns was specific to audition, or if it was a more general property of sensory systems. We used the same behavioral task in three modalities: audition, touch, and vision, for stimuli having identical temporal statistics. Participants were presented with sequences of acoustic pulses for audition, motion pulses to the fingertips for touch, or light pulses for vision. Pulses were randomly and irregularly spaced, with all inter-pulse intervals in the sub-second range and all constrained to be longer than the temporal acuity in any modality. This led to pulse sequences with an average inter-pulse interval of 166 ms, a minimum inter-pulse interval of 60 ms, and a total duration of 1.2 s. Results showed that, if a random temporal pattern re-occurred at random times during an experimental block, it was rapidly learned, whatever the sensory modality. Moreover, patterns first learned in the auditory modality displayed transfer of learning to either touch or vision. This suggests that sensory systems may be exquisitely tuned to incidentally learn re-occurring temporal patterns, with possible cross-talk between the senses. Copyright © 2018 IBRO. Published by Elsevier Ltd. All rights reserved.

  6. A performance comparison of sampling methods in the assessment of species composition patterns and environment–vegetation relationships in species-rich grasslands

    OpenAIRE

    Grzegorz Swacha; Zoltán Botta-Dukát; Zygmunt Kącki; Daniel Pruchniewicz; Ludwik Żołnierz

    2017-01-01

    The influence that different sampling methods have on the results and the interpretation of vegetation analysis has been much debated, but little is yet known about how the spatial arrangement of samples affects patterns of species composition and environment–vegetation relationships within the same vegetation type. We compared three data sets of the same sample size obtained by three standard sampling methods: preferential, random, and systematic. These different sampling methods were applied...

  7. On tests of randomness for spatial point patterns

    International Nuclear Information System (INIS)

    Doguwa, S.I.

    1990-11-01

    New tests of randomness for spatial point patterns are introduced. These test statistics are then compared in a power study with the existing alternatives. The results of the power study suggest that one of the tests proposed is extremely powerful against both aggregated and regular alternatives. (author). 9 refs, 7 figs, 3 tabs

  8. SOME SYSTEMATIC SAMPLING STRATEGIES USING MULTIPLE RANDOM STARTS

    OpenAIRE

    Sampath Sundaram; Ammani Sivaraman

    2010-01-01

    In this paper an attempt is made to extend linear systematic sampling using multiple random starts due to Gautschi (1957) to various types of systematic sampling schemes available in the literature, namely (i) Balanced Systematic Sampling (BSS) of Sethi (1965) and (ii) Modified Systematic Sampling (MSS) of Singh, Jindal, and Garg (1968). Further, the proposed methods were compared with the Yates corrected estimator developed with reference to Gautschi's linear systematic samplin...

  9. Using forbidden ordinal patterns to detect determinism in irregularly sampled time series.

    Science.gov (United States)

    Kulp, C W; Chobot, J M; Niskala, B J; Needhammer, C J

    2016-02-01

    It is known that when symbolizing a time series into ordinal patterns using the Bandt-Pompe (BP) methodology, there will be ordinal patterns called forbidden patterns that do not occur in a deterministic series. The existence of forbidden patterns can be used to identify deterministic dynamics. In this paper, the ability to use forbidden patterns to detect determinism in irregularly sampled time series is tested on data generated from a continuous model system. The study is done in three parts. First, the effects of sampling time on the number of forbidden patterns are studied on regularly sampled time series. The next two parts focus on two types of irregular sampling: missing data and timing jitter. It is shown that forbidden patterns can be used to detect determinism in irregularly sampled time series for low degrees of sampling irregularity (as defined in the paper). In addition, comments are made about the appropriateness of using the BP methodology to symbolize irregularly sampled time series.
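    As a minimal illustration of the Bandt-Pompe idea (not the authors' missing-data or jitter analysis), the sketch below symbolizes a regularly sampled series into order-3 ordinal patterns and shows that the fully chaotic logistic map never produces the strictly decreasing pattern:

```python
def ordinal_patterns(series, d=3):
    """Return the set of order-d ordinal (Bandt-Pompe) patterns
    occurring in a regularly sampled series. A pattern is the
    permutation of indices that sorts each length-d window."""
    found = set()
    for i in range(len(series) - d + 1):
        window = series[i:i + d]
        found.add(tuple(sorted(range(d), key=lambda j: window[j])))
    return found

# Fully chaotic logistic map: deterministic dynamics, so some of
# the 3! = 6 possible patterns never occur ("forbidden" patterns).
x, series = 0.1, []
for _ in range(5000):
    x = 4.0 * x * (1.0 - x)
    series.append(x)

pats = ordinal_patterns(series, d=3)
print(len(pats))              # 5 of the 6 patterns occur
print((2, 1, 0) in pats)      # False: two consecutive decreases are impossible
```

For the logistic map at r = 4, a decrease requires x > 3/4, but any point above 3/4 maps below 3/4, so the strictly decreasing window (pattern (2, 1, 0)) is forbidden; a white-noise series of the same length would eventually exhibit all six patterns.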

  10. Associations between Dietary Patterns and Blood Pressure in a Clinical Sample of Overweight Adults.

    Science.gov (United States)

    Ndanuko, Rhoda N; Tapsell, Linda C; Charlton, Karen E; Neale, Elizabeth P; Batterham, Marijka J

    2017-02-01

    Dietary pattern analysis provides important evidence revealing diet-disease relationships. It may be especially useful in areas less well researched, such as diet and hypertension in clinical populations. The aim of this study was to identify the association between dietary patterns and blood pressure (BP) in a sample of overweight adults volunteering for a clinical trial for weight loss. This cross-sectional analysis used baseline data from the HealthTrack study, a 12-month randomized controlled trial. Dietary intake was evaluated with 4-day food records. Participants were 328 adults recruited from the Illawarra region of New South Wales, Australia, between May 2014 and April 2015. Resting BP and 24-hour urine sodium and potassium were measured. Dietary patterns were derived by principal component analysis from 21 food groups. Multiple regression analysis was performed to assess the association between the extracted dietary patterns and BP. The participants' mean age was 43.6±8.0 years, mean body mass index was 32.4±4.2, and mean systolic BP/diastolic BP was 124.9±14.5/73.3±9.9 mm Hg. Six major dietary patterns were identified: "nuts, seeds, fruit, and fish," "milk and meat," "breads, cereals, and snacks," "cereal-based products, fats, and oils," "alcohol, eggs, and legumes," and "savoury sauces, condiments, and meat." The "nuts, seeds, fruit, and fish" dietary pattern was significantly and inversely associated with systolic BP (F(7,320)=15.248; P<0.001). A dietary pattern rich in nuts, seeds, fruit, and fish was inversely associated with blood pressure in this clinical sample. The findings suggest that current dietary guidelines are relevant to an overweight clinical population and support the value of dietary pattern analysis when exploring the diet-disease relationship. Copyright © 2017 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.

  11. Health plan auditing: 100-percent-of-claims vs. random-sample audits.

    Science.gov (United States)

    Sillup, George P; Klimberg, Ronald K

    2011-01-01

    The objective of this study was to examine the relative efficacy of two different methodologies for auditing self-funded medical claim expenses: 100-percent-of-claims auditing versus random-sampling auditing. Multiple data sets of claim errors or 'exceptions' from two Fortune-100 corporations were analysed and compared to 100 simulated audits of 300- and 400-claim random samples. Random-sample simulations failed to identify a significant number and amount of the errors that ranged from $200,000 to $750,000. These results suggest that health plan expenses of corporations could be significantly reduced if they audited 100% of claims and embraced a zero-defect approach.
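    A hedged sketch of why a 300-claim random sample can miss rare errors; the claim population, error count, and dollar amounts below are hypothetical, not the study's data:

```python
import random

random.seed(7)

# Hypothetical claim population: 10,000 claims, 20 of which carry
# an erroneous overpayment (amounts are purely illustrative).
claims = [0.0] * 10_000
for i in random.sample(range(10_000), 20):
    claims[i] = random.uniform(5_000, 25_000)

full_audit_errors = sum(1 for c in claims if c > 0)   # 100%-of-claims audit
sample = random.sample(claims, 300)                   # 300-claim random audit
sample_errors = sum(1 for c in sample if c > 0)

print(full_audit_errors, sample_errors)
```

With only 20 erroneous claims in 10,000, a 300-claim sample expects to contain about 0.6 of them, so most simulated audits recover none of the error amount, which is the effect the study reports.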

  12. Programmable pseudo-random detector-pulse-pattern generator

    International Nuclear Information System (INIS)

    Putten, R. van der; Nationaal Inst. voor Kernfysica en Hoge-Energiefysica

    1990-01-01

    This report discusses the design and realization of the digital part of the programmable pseudo-random detector pulse-pattern generator. For the design and realization, use has been made of F-TTL and high-speed special-purpose ICs, in particular FALs (15 ns). The design possibilities offered by the software for programming the FALs have been utilized as much as possible. In this way counters, registers and a state machine with extended control possibilities have been designed, and an advanced 8-channel pulse generator has been developed which is controlled via the VME system bus. The generator possesses an internal clock oscillator of 16 MHz. The moment when a pulse is generated can be adjusted with a step size of 250 ps. 2000 different periods (time windows) can be stored for generating a pattern. (author). 37 refs.; 6 figs

  13. Improved orientation sampling for indexing diffraction patterns of polycrystalline materials

    DEFF Research Database (Denmark)

    Larsen, Peter Mahler; Schmidt, Søren

    2017-01-01

    Orientation mapping is a widely used technique for revealing the microstructure of a polycrystalline sample. The crystalline orientation at each point in the sample is determined by analysis of the diffraction pattern, a process known as pattern indexing. A recent development in pattern indexing... in the presence of noise, it has very high computational requirements. In this article, the computational burden is reduced by developing a method for nearly optimal sampling of orientations. By using the quaternion representation of orientations, it is shown that the optimal sampling problem is equivalent to that of optimally distributing points on a four-dimensional sphere. In doing so, the number of orientation samples needed to achieve a desired indexing accuracy is significantly reduced. Orientation sets at a range of sizes are generated in this way for all Laue groups and are made available online for easy use.

  14. Estimates of Inequality Indices Based on Simple Random, Ranked Set, and Systematic Sampling

    OpenAIRE

    Bansal, Pooja; Arora, Sangeeta; Mahajan, Kalpana K.

    2013-01-01

    Gini index, Bonferroni index, and Absolute Lorenz index are some popular indices of inequality showing different features of inequality measurement. In general, a simple random sampling procedure is commonly used to estimate the inequality indices and their related inference. The key condition that the samples must be drawn via a simple random sampling procedure makes calculations much simpler, but this assumption is often violated in practice as the data does not always yield a simple random ...

  15. An alternative procedure for estimating the population mean in simple random sampling

    Directory of Open Access Journals (Sweden)

    Housila P. Singh

    2012-03-01

    This paper deals with the problem of estimating the finite population mean using auxiliary information in simple random sampling. Firstly we have suggested a correction to the mean squared error of the estimator proposed by Gupta and Shabbir [On improvement in estimating the population mean in simple random sampling. Jour. Appl. Statist. 35(5) (2008), pp. 559-566]. Later we have proposed a ratio-type estimator and its properties are studied in simple random sampling. Numerically we have shown that the proposed class of estimators is more efficient than different known estimators including the Gupta and Shabbir (2008) estimator.
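    The classical ratio estimator that this family of estimators builds on, ȳ_R = ȳ(X̄/x̄), can be sketched as follows (synthetic data; this is the textbook estimator, not the specific class proposed in the paper):

```python
import random

random.seed(42)

# Hypothetical finite population where y is roughly proportional to
# a known auxiliary variable x -- the setting in which a ratio
# estimator outperforms the plain sample mean.
N = 1000
x_pop = [random.uniform(10, 50) for _ in range(N)]
y_pop = [2.0 * x + random.gauss(0, 2) for x in x_pop]
X_bar = sum(x_pop) / N                 # auxiliary population mean, assumed known

idx = random.sample(range(N), 50)      # simple random sample, n = 50
x_bar = sum(x_pop[i] for i in idx) / 50
y_bar = sum(y_pop[i] for i in idx) / 50

y_ratio = y_bar * (X_bar / x_bar)      # classical ratio estimator
print(round(y_ratio, 2), round(sum(y_pop) / N, 2))
```

Because y tracks x closely, the sampling error in x̄ largely cancels the sampling error in ȳ, which is why ratio-type estimators are more efficient in this setting.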

  16. Systematic random sampling of the comet assay.

    Science.gov (United States)

    McArt, Darragh G; Wasson, Gillian R; McKerr, George; Saetzler, Kurt; Reed, Matt; Howard, C Vyvyan

    2009-07-01

    The comet assay is a technique used to quantify DNA damage and repair at a cellular level. In the assay, cells are embedded in agarose and the cellular content is stripped away leaving only the DNA trapped in an agarose cavity which can then be electrophoresed. The damaged DNA can enter the agarose and migrate while the undamaged DNA cannot and is retained. DNA damage is measured as the proportion of the migratory 'tail' DNA compared to the total DNA in the cell. The fundamental basis of these arbitrary values is obtained in the comet acquisition phase using fluorescence microscopy with a stoichiometric stain in tandem with image analysis software. Current methods of comet acquisition are expected to be both objective and random. In this paper we examine the 'randomness' of the acquisition phase and suggest an alternative method that offers both objective and unbiased comet selection. In order to achieve this, we have adopted a survey sampling approach widely used in stereology, which offers a method of systematic random sampling (SRS). This is desirable as it offers an impartial and reproducible method of comet analysis that can be used either manually or in automated form. By making use of an unbiased sampling frame and using microscope verniers, we are able to increase the precision of estimates of DNA damage. Results obtained from a multiple-user pooled variation experiment showed that the SRS technique attained a lower variability than that of the traditional approach. The analysis of a single user with repetition experiment showed greater individual variances while not being detrimental to overall averages. This would suggest that the SRS method offers a better reflection of DNA damage for a given slide and also offers better user reproducibility.
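    The core of systematic random sampling, independent of the comet-assay context, is a single random start followed by a fixed sampling interval; a minimal sketch:

```python
import random

def systematic_sample(N, n, rng=random):
    """Indices of a systematic random sample of n units from N:
    one random start in [0, k), then every k-th unit (k = N // n)."""
    k = N // n
    start = rng.randrange(k)
    return list(range(start, N, k))[:n]

random.seed(1)
idx = systematic_sample(N=1000, n=25)
print(idx[:4])    # evenly spaced indices with a random offset
```

Every unit has the same inclusion probability n/N, yet only one random number is drawn, which is what makes the scheme both unbiased and easy to reproduce on a microscope stage.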

  17. Correlated random sampling for multivariate normal and log-normal distributions

    International Nuclear Information System (INIS)

    Žerovnik, Gašper; Trkov, Andrej; Kodeli, Ivan A.

    2012-01-01

    A method for correlated random sampling is presented. Representative samples for multivariate normal or log-normal distribution can be produced. Furthermore, any combination of normally and log-normally distributed correlated variables may be sampled to any requested accuracy. Possible applications of the method include sampling of resonance parameters which are used for reactor calculations.
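    A common way to realize correlated normal and log-normal sampling (not necessarily the authors' exact algorithm) is to apply a Cholesky factor of the target covariance to independent standard normals; the log-normal case exponentiates the correlated normals, so the correlation is imposed in log space:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target covariance: unit variances, correlation 0.8.
cov = np.array([[1.0, 0.8],
                [0.8, 1.0]])
L = np.linalg.cholesky(cov)

z = rng.standard_normal((100_000, 2))
normal_samples = z @ L.T                   # correlated multivariate normal
lognormal_samples = np.exp(normal_samples)  # correlated log-normal

print(np.corrcoef(normal_samples.T)[0, 1])  # close to 0.8
```

Note that the Pearson correlation of the log-normal samples differs from 0.8, since exponentiation distorts linear correlation; 0.8 is the correlation of their logarithms.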

  18. Random walk-based similarity measure method for patterns in complex object

    Directory of Open Access Journals (Sweden)

    Liu Shihu

    2017-04-01

    This paper discusses the similarity of the patterns in complex objects. The complex object is composed both of the attribute information of patterns and the relational information between patterns. Bearing in mind the specificity of complex objects, a random walk-based similarity measurement method for patterns is constructed. In this method, the reachability of any two patterns with respect to the relational information is fully studied, so that the similarity of patterns with respect to the relational information can be calculated. On this basis, an integrated similarity measurement method is proposed, and Algorithms 1 and 2 show the calculation procedure. One can find that this method makes full use of the attribute information and relational information. Finally, a synthetic example shows that the proposed similarity measurement method is validated.

  19. Prevalence of different palatal rugae patterns in a sample Lucknow population

    Directory of Open Access Journals (Sweden)

    Mohammed Asdullah

    2014-01-01

    Introduction: Palatoscopy is the study of the palatal rugae pattern and it can be used to establish the identity of a person. The rugae are unique internal structures protected by the lips, buccal pad of fat and teeth that resist decomposition, thus serving an important role in forensic identification. The appearance of palatal rugae is constant, as they do not demonstrate changes after the completion of growth of the individual, thus increasing their utility as a personal identification tool. Aims and Objectives: To study the prevalence of different palatal rugae patterns in a sample of the Lucknow, India population. Materials and Methods: The study comprised 100 healthy subjects (50 males and 50 females) of age between 18 and 40 years, who were randomly selected. In this study we used the classification of rugae patterns suggested by Kapali et al. (straight, curved, wavy and circular) and Thomas and Kotze (unification and nonspecific). Results: The mean rugae number in females was slightly more than in males. The most prevalent forms in both genders were the curved type followed by wavy, straight and diverging types. Straight and curved forms were significantly more numerous in females than in males. Conclusion: This is a preliminary study on a localized Indian population. Scope still exists for further studies to determine gender as well as personal identification. We believe that rugae can be used as a reliable guide in forensic identification.

  20. Correlation between k-space sampling pattern and MTF in compressed sensing MRSI.

    Science.gov (United States)

    Heikal, A A; Wachowicz, K; Fallone, B G

    2016-10-01

    To investigate the relationship between the k-space sampling patterns used for compressed sensing MR spectroscopic imaging (CS-MRSI) and the modulation transfer function (MTF) of the metabolite maps. This relationship may allow the desired frequency content of the metabolite maps to be quantitatively tailored when designing an undersampling pattern. Simulations of a phantom were used to calculate the MTF of Nyquist sampled (NS) 32 × 32 MRSI, and four-times undersampled CS-MRSI reconstructions. The dependence of the CS-MTF on the k-space sampling pattern was evaluated for three sets of k-space sampling patterns generated using different probability distribution functions (PDFs). CS-MTFs were also evaluated for three more sets of patterns generated using a modified algorithm where the sampling ratios are constrained to adhere to PDFs. Strong visual correlation as well as high R² was found between the MTF of CS-MRSI and the product of the frequency-dependent sampling ratio and the NS 32 × 32 MTF. Also, PDF-constrained sampling patterns led to higher reproducibility of the CS-MTF, and stronger correlations to the above-mentioned product. The relationship established in this work provides the user with a theoretical solution for the MTF of CS MRSI that is both predictable and customizable to the user's needs.
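    A minimal 1D sketch of a PDF-driven undersampling mask of the kind discussed above; the 32-line grid, 4x acceleration, and Gaussian fall-off are illustrative assumptions, not the paper's PDFs:

```python
import numpy as np

rng = np.random.default_rng(3)

# 32 phase-encode lines, 4x undersampling (8 lines kept), with the
# selection probability falling off away from the k-space centre.
n, keep = 32, 8
k = np.arange(n) - n // 2
pdf = np.exp(-(k / 8.0) ** 2)
pdf /= pdf.sum()

lines = rng.choice(n, size=keep, replace=False, p=pdf)
mask = np.zeros(n, dtype=bool)
mask[lines] = True
print(np.flatnonzero(mask))
```

Weighting the PDF toward the centre retains the low spatial frequencies that dominate the MTF, which is why the shape of the PDF directly controls the frequency response of the reconstruction.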

  1. Systematic Onset of Periodic Patterns in Random Disk Packings

    Science.gov (United States)

    Topic, Nikola; Pöschel, Thorsten; Gallas, Jason A. C.

    2018-04-01

    We report evidence of a surprising systematic onset of periodic patterns in very tall piles of disks deposited randomly between rigid walls. Independently of the pile width, periodic structures are always observed in monodisperse deposits containing up to 10⁷ disks. The probability density function of the lengths of disordered transient phases that precede the onset of periodicity displays an approximately exponential tail. These disordered transients may become very large when the channel width grows without bound. For narrow channels, the probability density of finding periodic patterns of a given period displays a series of discrete peaks, which, however, are washed out completely when the channel width grows.

  2. Random sampling of evolution time space and Fourier transform processing

    International Nuclear Information System (INIS)

    Kazimierczuk, Krzysztof; Zawadzka, Anna; Kozminski, Wiktor; Zhukov, Igor

    2006-01-01

    Application of the Fourier transform for processing 3D NMR spectra with random sampling of the evolution time space is presented. The 2D FT is calculated for pairs of frequencies, instead of the conventional sequence of one-dimensional transforms. Signal-to-noise ratios and linewidths for different random distributions were investigated by simulations and experiments. The experimental examples include 3D HNCA, HNCACB and ¹⁵N-edited NOESY-HSQC spectra of a ¹³C,¹⁵N-labeled ubiquitin sample. The obtained results revealed general applicability of the proposed method and a significant improvement of resolution in comparison with conventional spectra recorded in the same time.
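    For irregularly spaced samples, the Fourier transform can be evaluated as an explicit sum over the sampled time points; a small sketch of this generic nonuniform DFT (not the paper's 2D FT processing pipeline, and with an illustrative single-tone signal):

```python
import numpy as np

rng = np.random.default_rng(5)

# Random, non-uniform sample times in [0, 1) s -- no grid required,
# at the cost of an O(N * M) explicit sum instead of an FFT.
t = np.sort(rng.random(256))
f0 = 25.0
signal = np.cos(2 * np.pi * f0 * t)

freqs = np.arange(0, 101)        # 0..100 Hz evaluation grid
spectrum = np.abs(signal @ np.exp(-2j * np.pi * t[:, None] * freqs))

print(freqs[np.argmax(spectrum)])   # peak recovered at 25 Hz
```

Random sampling spreads aliasing energy into a low-level noise floor instead of coherent alias peaks, which is what allows the signal frequency to be recovered well beyond the average sampling rate.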

  3. A random sampling procedure for anisotropic distributions

    International Nuclear Information System (INIS)

    Nagrajan, P.S.; Sethulakshmi, P.; Raghavendran, C.P.; Bhatia, D.P.

    1975-01-01

    A procedure is described for sampling the scattering angle of neutrons as per specified angular distribution data. The cosine of the scattering angle is written as a double Legendre expansion in the incident neutron energy and a random number. The coefficients of the expansion are given for C, N, O, Si, Ca, Fe and Pb and these elements are of interest in dosimetry and shielding. (author)
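    The paper's double Legendre expansion is specific to its nuclear data, but the underlying idea of mapping a random number to a scattering cosine can be sketched with generic inverse-transform sampling from a tabulated anisotropic density (the forward-peaked density below is illustrative only):

```python
import numpy as np

rng = np.random.default_rng(11)

# Tabulated (anisotropic, forward-peaked) angular density on [-1, 1].
mu_grid = np.linspace(-1.0, 1.0, 201)
density = np.exp(2.0 * mu_grid)              # illustrative shape only

cdf = np.cumsum(density)
cdf = (cdf - cdf[0]) / (cdf[-1] - cdf[0])    # normalise CDF to [0, 1]

u = rng.random(10_000)
mu = np.interp(u, cdf, mu_grid)              # uniform random number -> cosine

print(mu.min() >= -1.0, mu.max() <= 1.0)
```

Each uniform random number is mapped through the inverse CDF, so the sampled cosines follow the tabulated distribution; the forward peak shows up as a positive mean cosine.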

  4. Random sampling of elementary flux modes in large-scale metabolic networks.

    Science.gov (United States)

    Machado, Daniel; Soons, Zita; Patil, Kiran Raosaheb; Ferreira, Eugénio C; Rocha, Isabel

    2012-09-15

    The description of a metabolic network in terms of elementary (flux) modes (EMs) provides an important framework for metabolic pathway analysis. However, their application to large networks has been hampered by the combinatorial explosion in the number of modes. In this work, we develop a method for generating random samples of EMs without computing the whole set. Our algorithm is an adaptation of the canonical basis approach, where we add an additional filtering step which, at each iteration, selects a random subset of the new combinations of modes. In order to obtain an unbiased sample, all candidates are assigned the same probability of getting selected. This approach avoids the exponential growth of the number of modes during computation, thus generating a random sample of the complete set of EMs within reasonable time. We generated samples of different sizes for a metabolic network of Escherichia coli, and observed that they preserve several properties of the full EM set. It is also shown that EM sampling can be used for rational strain design. A well distributed sample, that is representative of the complete set of EMs, should be suitable to most EM-based methods for analysis and optimization of metabolic networks. Source code for a cross-platform implementation in Python is freely available at http://code.google.com/p/emsampler. dmachado@deb.uminho.pt Supplementary data are available at Bioinformatics online.
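    The authors' filtering step is specific to elementary-mode computation, but the general idea of drawing a uniform random sample without materializing the whole set can be illustrated with reservoir sampling (Algorithm R), shown here as a generic sketch rather than the paper's algorithm:

```python
import random

def reservoir_sample(stream, k, rng=random):
    """Uniform random sample of k items from an iterable of unknown
    length, holding only k items in memory (Algorithm R)."""
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)
        else:
            # Item i is kept with probability k / (i + 1).
            j = rng.randrange(i + 1)
            if j < k:
                reservoir[j] = item
    return reservoir

random.seed(2)
sample = reservoir_sample(range(1_000_000), 5)
print(sample)
```

Like the EM sampler, this keeps every candidate equally likely to be selected while avoiding storage of the combinatorially large full set.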

  5. Perceived beauty of random texture patterns: A preference for complexity.

    Science.gov (United States)

    Friedenberg, Jay; Liby, Bruce

    2016-07-01

    We report two experiments on the perceived aesthetic quality of random density texture patterns. In each experiment a square grid was filled with a progressively larger number of elements. Grid size in Experiment 1 was 10×10 with elements added to create a variety of textures ranging from 10%-100% fill levels. Participants rated the beauty of the patterns. Average judgments across all observers showed an inverted U-shaped function that peaked near middle densities. In Experiment 2 grid size was increased to 15×15 to see if observers preferred patterns with a fixed density or a fixed number of elements. The results of the second experiment were nearly identical to that of the first showing a preference for density over fixed element number. Ratings in both studies correlated positively with a GIF compression metric of complexity and with edge length. Within the range of stimuli used, observers judge more complex patterns to be more beautiful. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Fast egg collection method greatly improves randomness of egg sampling in Drosophila melanogaster

    DEFF Research Database (Denmark)

    Schou, Mads Fristrup

    2013-01-01

    When obtaining samples for population genetic studies, it is essential that the sampling is random. For Drosophila, one of the crucial steps in sampling experimental flies is the collection of eggs. Here an egg collection method is presented, which randomizes the eggs in a water column...... and diminishes environmental variance. This method was compared with a traditional egg collection method where eggs are collected directly from the medium. Within each method the observed and expected standard deviations of egg-to-adult viability were compared, whereby the difference in the randomness...... and to obtain a representative collection of genotypes, the method presented here is strongly recommended when collecting eggs from Drosophila....

  7. Universal principles governing multiple random searchers on complex networks: The logarithmic growth pattern and the harmonic law

    Science.gov (United States)

    Weng, Tongfeng; Zhang, Jie; Small, Michael; Harandizadeh, Bahareh; Hui, Pan

    2018-03-01

    We propose a unified framework to evaluate and quantify the search time of multiple random searchers traversing independently and concurrently on complex networks. We find that the intriguing behaviors of multiple random searchers are governed by two basic principles—the logarithmic growth pattern and the harmonic law. Specifically, the logarithmic growth pattern characterizes how the search time increases with the number of targets, while the harmonic law explores how the search time of multiple random searchers varies relative to that needed by individual searchers. Numerical and theoretical results demonstrate these two universal principles established across a broad range of random search processes, including generic random walks, maximal entropy random walks, intermittent strategies, and persistent random walks. Our results reveal two fundamental principles governing the search time of multiple random searchers, which are expected to facilitate investigation of diverse dynamical processes like synchronization and spreading.

  8. STATISTICAL LANDMARKS AND PRACTICAL ISSUES REGARDING THE USE OF SIMPLE RANDOM SAMPLING IN MARKET RESEARCHES

    Directory of Open Access Journals (Sweden)

    CODRUŢA DURA

    2010-01-01

    The sample represents a particular segment of the statistical population chosen to represent it as a whole. The representativeness of the sample determines the accuracy for estimations made on the basis of calculating the research indicators and the inferential statistics. The method of random sampling is part of probabilistic methods which can be used within marketing research and it is characterized by the fact that it imposes the requirement that each unit belonging to the statistical population should have an equal chance of being selected for the sampling process. When the simple random sampling is meant to be rigorously put into practice, it is recommended to use the technique of random number tables in order to configure the sample which will provide information that the marketer needs. The paper also details the practical procedure implemented in order to create a sample for a marketing research by generating random numbers using the facilities offered by Microsoft Excel.
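    In place of random number tables or Microsoft Excel, the same simple random selection can be performed in a few lines of Python; the sampling frame below is hypothetical:

```python
import random

random.seed(2024)

# Sampling frame: identifiers for a statistical population of
# 5,000 consumers (hypothetical market-research setting).
frame = [f"respondent_{i:04d}" for i in range(5000)]

# Simple random sampling without replacement: every unit has the
# same chance of selection, replacing manual random-number tables.
sample = random.sample(frame, 400)

print(len(sample), sample[0])
```

`random.sample` draws without replacement, so each of the 5,000 units has the same 400/5000 inclusion probability that the random-number-table procedure is designed to guarantee.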

  9. CONSISTENCY UNDER SAMPLING OF EXPONENTIAL RANDOM GRAPH MODELS.

    Science.gov (United States)

    Shalizi, Cosma Rohilla; Rinaldo, Alessandro

    2013-04-01

    The growing availability of network data and of scientific interest in distributed systems has led to the rapid development of statistical models of network structure. Typically, however, these are models for the entire network, while the data consists only of a sampled sub-network. Parameters for the whole network, which is what is of interest, are estimated by applying the model to the sub-network. This assumes that the model is consistent under sampling, or, in terms of the theory of stochastic processes, that it defines a projective family. Focusing on the popular class of exponential random graph models (ERGMs), we show that this apparently trivial condition is in fact violated by many popular and scientifically appealing models, and that satisfying it drastically limits ERGMs' expressive power. These results are actually special cases of more general results about exponential families of dependent random variables, which we also prove. Using such results, we offer easily checked conditions for the consistency of maximum likelihood estimation in ERGMs, and discuss some possible constructive responses.

  10. Identification of dietary patterns associated with blood pressure in a sample of overweight Australian adults.

    Science.gov (United States)

    Anil, S; Charlton, K E; Tapsell, L C; Probst, Y; Ndanuko, R; Batterham, M J

    2016-11-01

    The dietary approaches to stop hypertension (DASH) diet provides strong evidence for an optimal dietary pattern for blood pressure (BP) control; however, investigation at the level of key foods in a dietary pattern is sparse. This study aimed to assess the relationship between dietary patterns driven by key foods with BP in a sample of obese Australian adults. Secondary analysis was conducted on baseline data of 118 participants (45.1±8.4 years, mean BP=124.1±15.8/72.6±9.2 mm Hg) recruited in a weight reduction randomized controlled trial (ACTRN12608000425392). Dietary assessment was by a validated diet history interview. The average of three office BP measurements was taken. Factor analysis extracted dietary patterns and their relation to systolic BP (SBP) and diastolic BP (DBP) was analysed using multiple linear regression. Eight dietary patterns were identified based on leading foods: meat and alcohol; seafood; fats; fruits and nuts; legumes; confectionery; sweet foods; and yeast extracts and seasonings. A lower SBP was associated with alignment with the fruit and nuts pattern (β=-4.1 (95% confidence interval -7.5 to -0.7) mm Hg) and with seafood for DBP (β=-2.4 (-4.6 to -0.3) mm Hg). SBP and DBP were higher with yeast extract and seasonings (β=4.3 (1.4-7.3); 2.5 (0.9-4.0) mm Hg, respectively). In obese adults attending for weight loss, dietary patterns that included larger amounts of fruits and nuts and/or seafood were associated with lower BP at baseline, whereas patterns that were characterised by yeast extract and seasonings were associated with higher BP.

  11. Characterization of electron microscopes with binary pseudo-random multilayer test samples

    Science.gov (United States)

    Yashchuk, Valeriy V.; Conley, Raymond; Anderson, Erik H.; Barber, Samuel K.; Bouet, Nathalie; McKinney, Wayne R.; Takacs, Peter Z.; Voronov, Dmitriy L.

    2011-09-01

    Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested [1,2] and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [5]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WSi₂/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.
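    Binary pseudo-random gratings of this kind are typically derived from maximal-length binary sequences; a hedged sketch of one standard generator, a Fibonacci LFSR with taps [7, 6] (the primitive polynomial x⁷ + x⁶ + 1 — an illustrative choice, not necessarily the sequence used for these samples):

```python
def lfsr_sequence(taps, nbits, seed=1):
    """Fibonacci LFSR over GF(2). With feedback taps forming a
    primitive polynomial, it emits a maximal-length binary
    sequence of period 2**nbits - 1."""
    state = seed
    out = []
    while True:
        bit = 0
        for t in taps:                    # XOR of the tapped bits
            bit ^= (state >> (t - 1)) & 1
        out.append(state & 1)             # output the low bit
        state = (state >> 1) | (bit << (nbits - 1))
        if state == seed:                 # full cycle completed
            return out

# x^7 + x^6 + 1 is primitive, so the period is 2^7 - 1 = 127.
seq = lfsr_sequence(taps=[7, 6], nbits=7)
print(len(seq))    # 127
```

Maximal-length sequences have a flat, "white" power spectrum, which is exactly the property that makes BPR test patterns suitable for MTF calibration across all spatial frequencies.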

  12. Characterization of electron microscopes with binary pseudo-random multilayer test samples

    International Nuclear Information System (INIS)

    Yashchuk, Valeriy V.; Conley, Raymond; Anderson, Erik H.; Barber, Samuel K.; Bouet, Nathalie; McKinney, Wayne R.; Takacs, Peter Z.; Voronov, Dmitriy L.

    2011-01-01

    Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [5]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.

  13. LOD score exclusion analyses for candidate genes using random population samples.

    Science.gov (United States)

    Deng, H W; Li, J; Recker, R R

    2001-05-01

    While extensive analyses have been conducted to test for the importance of candidate genes with random population samples, no formal analyses have been conducted to test against it. We develop a LOD score approach for exclusion analyses of candidate genes with random population samples. Under this approach, specific genetic effects and inheritance models at candidate genes can be analysed and, if a LOD score is ≤ -2.0, the locus can be excluded from having an effect larger than that specified. Computer simulations show that, with sample sizes often employed in association studies, this approach has high power to exclude a gene from having moderate genetic effects. In contrast to regular association analyses, population admixture will not affect the robustness of our analyses; in fact, it renders our analyses more conservative and thus any significant exclusion result is robust. Our exclusion analysis complements association analysis for candidate genes in random population samples and is parallel to the exclusion mapping analyses that may be conducted in linkage analyses with pedigrees or relative pairs. The usefulness of the approach is demonstrated by an application to test the importance of the vitamin D receptor and estrogen receptor genes underlying the differential risk to osteoporotic fractures.
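
A minimal sketch of the exclusion criterion described above: a LOD score is the base-10 log likelihood ratio of the specified genetic model against the no-effect model, and the locus is excluded when LOD ≤ -2.0. The log-likelihood values below are hypothetical:

```python
import math

def lod_score(loglik_model, loglik_null):
    """LOD = log10 likelihood ratio of the specified genetic model vs. the null."""
    return (loglik_model - loglik_null) / math.log(10)

# Hypothetical log-likelihoods from fitting a candidate-gene model with a
# specified effect size versus a no-effect model to a random population sample.
lod = lod_score(loglik_model=-1210.4, loglik_null=-1204.6)
excluded = lod <= -2.0  # the exclusion criterion used in the abstract
print(round(lod, 2), excluded)
```

A LOD of -2 corresponds to the data being 100 times more likely under the null than under the specified effect, mirroring the conventional +3 threshold for declaring linkage.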

  14. Generating Random Samples of a Given Size Using Social Security Numbers.

    Science.gov (United States)

    Erickson, Richard C.; Brauchle, Paul E.

    1984-01-01

    The purposes of this article are (1) to present a method by which social security numbers may be used to draw cluster samples of a predetermined size and (2) to describe procedures used to validate this method of drawing random samples. (JOW)
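
The idea of drawing a cluster sample of a predetermined size from social security numbers can be illustrated by matching randomly chosen two-digit endings, which are effectively random digits. This is a sketch of the general approach with fabricated SSN-like strings, not necessarily Erickson and Brauchle's exact procedure:

```python
import random

def ssn_cluster_sample(records, sample_fraction, seed=None):
    """Draw a cluster sample by matching the last two digits of each SSN.

    Choosing k of the 100 possible two-digit endings selects, on average,
    k percent of the population, because SSN endings are close to uniformly
    distributed.
    """
    rng = random.Random(seed)
    k = round(sample_fraction * 100)
    endings = set(rng.sample(range(100), k))
    return [r for r in records if int(r["ssn"][-2:]) in endings]

# Hypothetical roster with fabricated 9-digit identifiers; a 10% sample.
roster = [{"ssn": f"{i:09d}"} for i in range(1000)]
sample = ssn_cluster_sample(roster, sample_fraction=0.10, seed=1)
print(len(sample))  # 100 here, since each two-digit ending occurs exactly 10 times
```

Because membership depends only on the SSN ending, the same endings can be reused to draw the identical sample from an updated roster.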

  15. Random On-Board Pixel Sampling (ROPS) X-Ray Camera

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Zhehui [Los Alamos; Iaroshenko, O. [Los Alamos; Li, S. [Los Alamos; Liu, T. [Fermilab; Parab, N. [Argonne (main); Chen, W. W. [Purdue U.; Chu, P. [Los Alamos; Kenyon, G. [Los Alamos; Lipton, R. [Fermilab; Sun, K.-X. [Nevada U., Las Vegas

    2017-09-25

    Recent advances in compressed sensing theory and algorithms offer new possibilities for high-speed X-ray camera design. In many CMOS cameras, each pixel has an independent on-board circuit that includes an amplifier, noise rejection, signal shaper, an analog-to-digital converter (ADC), and optional in-pixel storage. When X-ray images are sparse, i.e., when (a) the number of pixels with true X-ray hits is much smaller than the total number of pixels, (b) the X-ray information is redundant, or (c) some prior knowledge about the X-ray images exists, sparse sampling may be allowed. Here we first illustrate the feasibility of random on-board pixel sampling (ROPS) using an existing set of X-ray images, followed by a discussion of signal-to-noise ratio as a function of pixel size. Next, we describe a possible circuit architecture to achieve random pixel access and in-pixel storage. The combination of a multilayer architecture, sparse on-chip sampling, and computational imaging techniques is expected to facilitate the development and applications of high-speed X-ray camera technology.
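
The sampling idea can be sketched in a few lines: when hits are sparse, reading out only a random subset of pixels still captures most of the information. The image, hit count, and 25% readout fraction below are invented for illustration; this shows the sampling step only, not the camera circuit or any compressed-sensing reconstruction:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical 64x64 sparse "X-ray image": a handful of bright hits on a zero background.
image = np.zeros((64, 64))
hits = rng.choice(64 * 64, size=20, replace=False)
image.flat[hits] = 1.0

# Random on-board pixel sampling: digitize only a random 25% subset of pixels.
mask = rng.random(image.shape) < 0.25
readout = np.where(mask, image, np.nan)  # unread pixels are simply not digitized

n_read = int(mask.sum())                 # pixels actually read out (~25% of 4096)
n_hits_seen = int(np.nansum(readout))    # true hits captured by the sparse readout
print(n_read, n_hits_seen)
```

Reading a quarter of the pixels cuts readout time accordingly; the expected fraction of hits captured equals the sampling fraction, and prior knowledge or redundancy is what lets reconstruction recover the rest.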

  16. An empirically derived dietary pattern associated with breast cancer risk is validated in a nested case-control cohort from a randomized primary prevention trial.

    Science.gov (United States)

    Hidaka, Brandon H; Kimler, Bruce F; Fabian, Carol J; Carlson, Susan E

    2017-02-01

    We reported an association between cytologic atypia, a reversible biomarker of breast cancer risk, and lower omega-3/omega-6 fatty acid ratio in blood and breast tissue. Our goal was to develop and validate a dietary pattern index in this high-risk sample of U.S. women, and test its capacity to predict incidence in a nested case-control cohort of Canadian women from a randomized trial of a low-fat dietary intervention for primary prevention of breast cancer. Food intake was measured by food frequency questionnaire in the U.S. sample (n = 65) and multiple dietary recalls in the Canadian sample (n = 220 cases; 440 controls). Principal component analysis identified a dietary pattern associated with atypia. We measured differences among dietary pattern tertiles in (a) fatty acid composition in blood lipids and breast tissue in the U.S. sample, and (b) risk of breast cancer subtypes in the Canadian cohort. Registered under ClinicalTrials.gov Identifier: NCT00148057. A Modern diet was characterized as consuming more grains, dairy, and sugar and less vegetables, fish and poultry; these women had lower tissue omega-3 fatty acids and higher omega-6 and trans fatty acids. The low-fat intervention increased the likelihood of a Modern diet after randomization. A Modern diet at baseline and post-randomization was associated with estrogen-receptor negative (ER-) breast cancer risk among those at least 160 cm tall. A Traditional diet (the reciprocal of Modern) at baseline was associated with lower ER-positive (ER+) risk in the comparison group, but not the low-fat intervention group. A Modern diet (high in grains, dairy, and sugar and low in vegetables, fish, and poultry) is associated with ER- breast cancer risk among taller women. Recommending dietary fat reduction may have untoward effects on breast cancer risk. Copyright © 2016 European Society for Clinical Nutrition and Metabolism. Published by Elsevier Ltd. All rights reserved.

  17. Visual memory for random block patterns defined by luminance and color contrast

    NARCIS (Netherlands)

    Cornelissen, FW; Greenlee, MW

    2000-01-01

    We studied the ability of human subjects to memorize the visual information in computer-generated random block patterns defined either by luminance contrast, by color contrast, or by both. Memory performance declines rapidly with increasing inter-stimulus interval, showing a half-life of …

  18. Random sequential adsorption: from continuum to lattice and pre-patterned substrates

    International Nuclear Information System (INIS)

    Cadilhe, A; Araujo, N A M; Privman, Vladimir

    2007-01-01

    The random sequential adsorption (RSA) model has served as a paradigm for diverse phenomena in physical chemistry, as well as in other areas such as biology, ecology, and sociology. In the present work, we survey aspects of the RSA model with emphasis on the approach to and properties of jammed states obtained for large times in continuum deposition versus that on lattice substrates, and on pre-patterned surfaces. The latter model has been of recent interest in the context of efforts to use pre-patterning as a tool to improve self-assembly in micro- and nanoscale surface structure engineering

  19. Misrepresenting random sampling? A systematic review of research papers in the Journal of Advanced Nursing.

    Science.gov (United States)

    Williamson, Graham R

    2003-11-01

    This paper discusses the theoretical limitations of the use of random sampling and probability theory in the production of a significance level (or P-value) in nursing research. Potential alternatives, in the form of randomization tests, are proposed. Research papers in nursing, medicine and psychology frequently misrepresent their statistical findings, as the P-values reported assume random sampling. In this systematic review of studies published between January 1995 and June 2002 in the Journal of Advanced Nursing, 89 (68%) studies broke this assumption because they used convenience samples or entire populations. As a result, some of the findings may be questionable. The key ideas of random sampling and probability theory for statistical testing (for generating a P-value) are outlined. The result of a systematic review of research papers published in the Journal of Advanced Nursing is then presented, showing how frequently random sampling appears to have been misrepresented. Useful alternative techniques that might overcome these limitations are then discussed. REVIEW LIMITATIONS: This review is limited in scope because it is applied to one journal, and so the findings cannot be generalized to other nursing journals or to nursing research in general. However, it is possible that other nursing journals are also publishing research articles based on the misrepresentation of random sampling. The review is also limited because in several of the articles the sampling method was not completely clearly stated, and in this circumstance a judgment has been made as to the sampling method employed, based on the indications given by author(s). Quantitative researchers in nursing should be very careful that the statistical techniques they use are appropriate for the design and sampling methods of their studies. If the techniques they employ are not appropriate, they run the risk of misinterpreting findings by using inappropriate, unrepresentative and biased samples.

  20. Smoothing the redshift distributions of random samples for the baryon acoustic oscillations: applications to the SDSS-III BOSS DR12 and QPM mock samples

    Science.gov (United States)

    Wang, Shao-Jiang; Guo, Qi; Cai, Rong-Gen

    2017-12-01

    We investigate the impact of different redshift distributions of random samples on the baryon acoustic oscillation (BAO) measurements of D_V(z) r_d^fid / r_d from the two-point correlation functions of galaxies in Data Release 12 of the Baryon Oscillation Spectroscopic Survey (BOSS). Big surveys such as BOSS usually assign redshifts to the random samples by randomly drawing values from the measured redshift distributions of the data, which necessarily introduces fiducial signals of fluctuations into the random samples and weakens the BAO signals if the cosmic variance cannot be ignored. We propose a smooth function of redshift distribution that fits the data well to populate the random galaxy samples. The resulting cosmological parameters match the input parameters of the mock catalogue very well. The significance of the BAO signals is improved by 0.33σ for a low-redshift sample and by 0.03σ for a constant-stellar-mass sample, though the absolute values do not change significantly. Given the precision of current measurements of cosmological parameters, such improvements will be valuable for future measurements of galaxy clustering.
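
The proposed remedy, fitting a smooth n(z) to the data and drawing random-catalogue redshifts from it rather than from the noisy measured distribution, can be sketched with a polynomial fit and inverse-CDF sampling. The abstract does not specify the authors' smooth functional form, so the low-order polynomial here is an illustrative stand-in:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical "data" redshifts drawn from a smooth underlying distribution.
z_data = rng.normal(0.5, 0.1, size=5000).clip(0.2, 0.8)

# Bin the data, then fit a smooth curve to the counts (a low-order polynomial
# stands in for the smooth n(z) proposed in the paper) instead of resampling
# the noisy histogram directly.
counts, edges = np.histogram(z_data, bins=40)
centers = 0.5 * (edges[:-1] + edges[1:])
coeffs = np.polyfit(centers, counts, deg=4)
smooth = np.clip(np.polyval(coeffs, centers), 1e-6, None)  # keep n(z) positive

# Populate the random catalogue by inverse-CDF sampling from the smooth n(z).
cdf = np.cumsum(smooth)
cdf /= cdf[-1]
z_random = np.interp(rng.random(20000), cdf, centers)

print(round(float(z_random.mean()), 2))
```

Drawing from the smooth fit rather than the measured histogram prevents the random catalogue from inheriting the data's own radial fluctuations, which is the effect the paper identifies as diluting the BAO signal.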

  1. Evaluating the Applicability of Data-Driven Dietary Patterns to Independent Samples with a Focus on Measurement Tools for Pattern Similarity.

    Science.gov (United States)

    Castelló, Adela; Buijsse, Brian; Martín, Miguel; Ruiz, Amparo; Casas, Ana M; Baena-Cañada, Jose M; Pastor-Barriuso, Roberto; Antolín, Silvia; Ramos, Manuel; Muñoz, Monserrat; Lluch, Ana; de Juan-Ferré, Ana; Jara, Carlos; Lope, Virginia; Jimeno, María A; Arriola-Arellano, Esperanza; Díaz, Elena; Guillem, Vicente; Carrasco, Eva; Pérez-Gómez, Beatriz; Vioque, Jesús; Pollán, Marina

    2016-12-01

    Diet is a key modifiable risk factor for many chronic diseases, but it remains unclear whether dietary patterns from one study sample are generalizable to other independent populations. The primary objective of this study was to assess whether data-driven dietary patterns from one study sample are applicable to other populations. The secondary objective was to assess the validity of two criteria of pattern similarity. Six dietary patterns (three Western patterns plus Mediterranean, Prudent, and Healthy) from three published studies on breast cancer were reconstructed in a case-control study of 973 breast cancer patients and 973 controls. Three more internal patterns (Western, Prudent, and Mediterranean) were derived from this case-control study's own data. Applicability was assessed by comparing the six reconstructed patterns with the three internal dietary patterns, using the congruence coefficient (CC) between pattern loadings. Where any pair met either of two commonly used criteria for declaring patterns similar (CC ≥0.85 or a statistically significant correlation), the similarity of the dietary patterns was double-checked by comparing their associations with breast cancer risk, to assess whether those two criteria of similarity are actually reliable. Five of the six reconstructed dietary patterns showed high congruence (CC >0.9) with their corresponding dietary pattern derived from the case-control study's data. Similar associations with breast cancer risk were found in all pairs of dietary patterns that had high CC, but not in all pairs of dietary patterns with statistically significant correlations. Similar dietary patterns can be found in independent samples. The P value of a correlation coefficient is less reliable than the CC as a criterion for declaring two dietary patterns similar. This study shows that diet scores based on a particular study are generalizable to other populations. Copyright © 2016 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
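
The congruence coefficient between two sets of pattern loadings is Tucker's CC. A minimal implementation, with hypothetical loadings for a "Western" pattern reconstructed in two samples:

```python
import numpy as np

def congruence_coefficient(x, y):
    """Tucker's congruence coefficient between two vectors of factor loadings.

    CC = sum(x*y) / sqrt(sum(x^2) * sum(y^2)). Unlike Pearson's r it does not
    center the loadings, so it compares the full sign-and-magnitude pattern
    rather than just the co-variation around the means.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(np.sum(x * y) / np.sqrt(np.sum(x**2) * np.sum(y**2)))

# Hypothetical loadings of one dietary pattern in two independent samples.
a = [0.6, 0.5, -0.4, 0.3, -0.2]
b = [0.5, 0.6, -0.3, 0.2, -0.3]
cc = congruence_coefficient(a, b)
print(round(cc, 3))  # a value near 1 indicates the same pattern
```

With these loadings, CC clears the ≥0.85 similarity criterion discussed in the abstract.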

  2. Randomized branch sampling to estimate fruit production in Pecan trees cv. ‘Barton’

    Directory of Open Access Journals (Sweden)

    Filemom Manoel Mokochinski

    Sampling techniques to quantify the production of fruits are still very scarce, creating a gap in crop development research. This study was conducted on a rural property in the county of Cachoeira do Sul - RS to estimate the efficiency of randomized branch sampling (RBS) in quantifying the production of pecan fruit at three different ages (5, 7, and 10 years). Two selection techniques were tested: probability proportional to diameter (PPD) and uniform probability (UP), applied to nine trees, three from each age, randomly chosen. The RBS underestimated fruit production for all ages, and its main drawback was the high sampling error (125.17% for PPD and 111.04% for UP). The UP was regarded as more efficient than the PPD, though both techniques estimated similar production and similar experimental errors. In conclusion, branch sampling was inaccurate for this case study, and new studies are required to produce estimates with smaller sampling error.
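
Randomized branch sampling selects a single path through the tree, choosing a branch at each fork with probability proportional to a size variable (diameter for the PPD variant, equal weights for UP), and expands the fruit count on the terminal branch by the inverse of the path's selection probability. A sketch on a hypothetical two-branch tree, with invented diameters and fruit counts:

```python
import random

def rbs_estimate(tree, rng, weight=lambda b: b["diameter"]):
    """One randomized-branch-sampling path through a tree.

    At each fork a branch is chosen with probability proportional to a size
    variable (here diameter); the fruit count on the terminal branch divided
    by the cumulative selection probability is an unbiased estimate of
    whole-tree production.
    """
    node, prob = tree, 1.0
    while node.get("children"):
        weights = [weight(c) for c in node["children"]]
        total = sum(weights)
        r = rng.random() * total
        for child, w in zip(node["children"], weights):
            r -= w
            if r <= 0:
                prob *= w / total
                node = child
                break
    return node["fruit"] / prob

# Hypothetical pecan tree: one fork, fruit counted only on terminal branches.
tree = {"diameter": 30, "children": [
    {"diameter": 20, "fruit": 120, "children": []},
    {"diameter": 10, "fruit": 45, "children": []},
]}
rng = random.Random(42)
estimates = [rbs_estimate(tree, rng) for _ in range(2000)]
print(round(sum(estimates) / len(estimates)))  # unbiased for the true total of 165
```

The estimator is unbiased, but single-path estimates vary widely between runs, which is consistent with the large sampling errors reported in the abstract.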

  3. Human mammary epithelial cells exhibit a bimodal correlated random walk pattern.

    Science.gov (United States)

    Potdar, Alka A; Jeon, Junhwan; Weaver, Alissa M; Quaranta, Vito; Cummings, Peter T

    2010-03-10

    Organisms, at scales ranging from unicellular to mammals, have been known to exhibit foraging behavior described by random walks whose segments conform to Lévy or exponential distributions. For the first time, we present evidence that single cells (mammary epithelial cells) that exist in multi-cellular organisms (humans) follow a bimodal correlated random walk (BCRW). Cellular tracks of MCF-10A pBabe, neuN and neuT random migration on 2-D plastic substrates, analyzed using bimodal analysis, were found to reveal the BCRW pattern. We find two types of exponentially distributed correlated flights (corresponding to what we refer to as the directional and re-orientation phases) each having its own correlation between move step-lengths within flights. The exponential distribution of flight lengths was confirmed using different analysis methods (logarithmic binning with normalization, survival frequency plots and maximum likelihood estimation). Because of the presence of non-uniform turn angle distribution of move step-lengths within a flight and two different types of flights, we propose that the epithelial random walk is a BCRW comprising two alternating modes with varying degrees of correlation, rather than a simple persistent random walk. A BCRW model rather than a simple persistent random walk correctly matches the super-diffusivity in the cell migration paths as indicated by simulations based on the BCRW model.

  4. Human mammary epithelial cells exhibit a bimodal correlated random walk pattern.

    Directory of Open Access Journals (Sweden)

    Alka A Potdar

    2010-03-01

    Organisms, at scales ranging from unicellular to mammals, have been known to exhibit foraging behavior described by random walks whose segments conform to Lévy or exponential distributions. For the first time, we present evidence that single cells (mammary epithelial cells) that exist in multi-cellular organisms (humans) follow a bimodal correlated random walk (BCRW). Cellular tracks of MCF-10A pBabe, neuN and neuT random migration on 2-D plastic substrates, analyzed using bimodal analysis, were found to reveal the BCRW pattern. We find two types of exponentially distributed correlated flights (corresponding to what we refer to as the directional and re-orientation phases), each having its own correlation between move step-lengths within flights. The exponential distribution of flight lengths was confirmed using different analysis methods (logarithmic binning with normalization, survival frequency plots and maximum likelihood estimation). Because of the presence of non-uniform turn angle distribution of move step-lengths within a flight and two different types of flights, we propose that the epithelial random walk is a BCRW comprising two alternating modes with varying degrees of correlation, rather than a simple persistent random walk. A BCRW model rather than a simple persistent random walk correctly matches the super-diffusivity in the cell migration paths, as indicated by simulations based on the BCRW model.
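
A bimodal correlated random walk of the kind described in these two records can be simulated directly: two alternating movement modes, each with exponentially distributed step lengths and its own turning-angle behavior. The mode parameters and switching probability below are illustrative choices, not values fitted from the paper:

```python
import math
import random

def bcrw(n_steps, rng, p_switch=0.1):
    """Minimal bimodal correlated random walk with two alternating modes.

    Directional mode: long exponential steps with small turning angles
    (high persistence). Re-orientation mode: short exponential steps with
    nearly uniform turning angles.
    """
    x = y = heading = 0.0
    mode = "directional"
    path = [(x, y)]
    for _ in range(n_steps):
        if rng.random() < p_switch:
            mode = "re-orientation" if mode == "directional" else "directional"
        if mode == "directional":
            step = rng.expovariate(1 / 5.0)              # mean step length 5
            heading += rng.gauss(0, 0.2)                 # strongly correlated headings
        else:
            step = rng.expovariate(1 / 1.0)              # mean step length 1
            heading += rng.uniform(-math.pi, math.pi)    # near-random re-orientation
        x += step * math.cos(heading)
        y += step * math.sin(heading)
        path.append((x, y))
    return path

path = bcrw(500, random.Random(0))
print(len(path))  # 501 positions
```

Persistent directional flights interrupted by re-orientation bursts are what give such walks their super-diffusive mean squared displacement relative to a simple persistent random walk.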

  5. ArF photo resist pattern sample preparation method using FIB without protective coating

    Science.gov (United States)

    Okushima, Hirohisa; Onozuka, Toshihiko; Kuroda, Yasushi; Yaguchi, Toshie; Umemura, Kaoru; Tamochi, Ryuichiro; Watanabe, Kenji; Hasegawa, Norio; Kawata, Isao; Rijpers, Bart

    2006-03-01

    This paper presents a novel method of FIB (focused ion beam) sample preparation to accurately evaluate critical dimensions and profiles of ArF photo resist patterns without the use of a protective coating on the photo resist. To accomplish this, the FIB micro-sampling method, an effective FIB milling and fabrication technique, was employed. First, a Si cap is picked up from a silicon wafer and fixed to the ArF photo resist patterns to protect them against ion beam irradiation. Then a micro-sample, a piece of Si-capped ArF photo resist, is extracted from the bulk ArF photo resist. Throughout this procedure, the silicon cap protects the ArF photo resist patterns against ion beam irradiation. Next, the micro-sample is fixed to a needle stub of the FIB-STEM (scanning transmission electron microscopy) compatible rotation holder. The sample on the needle stub is rotated 180 degrees and milled from the side of the Si substrate. Lastly, the sample is milled to a thickness of 2 μm. In this process, the ion beam irradiates from the silicon substrate side to minimize ion beam irradiation damage to the ArF photo resist patterns. EDX (energy-dispersive X-ray spectroscopy) analysis proved that no gallium ions were detected on the surface of the ArF photo resist patterns. The feasibility of high-accelerating-voltage STEM observation of the line edge roughness of a thick (2 μm) sample without shrinkage was also demonstrated.

  6. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Science.gov (United States)

    2010-07-01

    40 CFR Protection of Environment (2010-07-01 edition), § 761.308 — Sample selection by random number generation on any two-dimensional square grid, implementing the sampling required under § 761.79(b)(3). For each grid area created in accordance with paragraph (a) of this section, select two random numbers: one each for...
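
The regulation's two-random-number scheme reduces to drawing one random coordinate per grid axis. A minimal sketch (the grid dimensions are hypothetical; the regulation text above is truncated, so this illustrates only the stated selection step):

```python
import random

def random_grid_point(n_rows, n_cols, rng):
    """Pick a sampling cell on a two-dimensional square grid by drawing two
    random numbers, one for each axis, as in the 40 CFR 761.308 scheme."""
    return rng.randrange(n_rows), rng.randrange(n_cols)

rng = random.Random(0)
row, col = random_grid_point(10, 10, rng)
print(0 <= row < 10 and 0 <= col < 10)  # True
```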

  7. Large-area imaging reveals biologically driven non-random spatial patterns of corals at a remote reef

    Science.gov (United States)

    Edwards, Clinton B.; Eynaud, Yoan; Williams, Gareth J.; Pedersen, Nicole E.; Zgliczynski, Brian J.; Gleason, Arthur C. R.; Smith, Jennifer E.; Sandin, Stuart A.

    2017-12-01

    For sessile organisms such as reef-building corals, differences in the degree of dispersion of individuals across a landscape may result from important differences in life-history strategies or may reflect patterns of habitat availability. Descriptions of spatial patterns can thus be useful not only for the identification of key biological and physical mechanisms structuring an ecosystem, but also by providing the data necessary to generate and test ecological theory. Here, we used an in situ imaging technique to create large-area photomosaics of 16 plots at Palmyra Atoll, central Pacific, each covering 100 m2 of benthic habitat. We mapped the location of 44,008 coral colonies and identified each to the lowest taxonomic level possible. Using metrics of spatial dispersion, we tested for departures from spatial randomness. We also used targeted model fitting to explore candidate processes leading to differences in spatial patterns among taxa. Most taxa were clustered and the degree of clustering varied by taxon. A small number of taxa did not significantly depart from randomness and none revealed evidence of spatial uniformity. Importantly, taxa that readily fragment or tolerate stress through partial mortality were more clustered. With little exception, clustering patterns were consistent with models of fragmentation and dispersal limitation. In some taxa, dispersion was linearly related to abundance, suggesting density dependence of spatial patterning. The spatial patterns of stony corals are non-random and reflect fundamental life-history characteristics of the taxa, suggesting that the reef landscape may, in many cases, have important elements of spatial predictability.
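
One common metric of spatial dispersion used to test for departures from spatial randomness is the Clark-Evans nearest-neighbour ratio. The abstract does not name the specific metrics the authors used, so this is an illustrative choice (without edge correction), applied to fabricated point patterns:

```python
import math
import random

def clark_evans_index(points, area):
    """Clark-Evans ratio R: observed mean nearest-neighbour distance divided
    by the value expected under complete spatial randomness.

    R < 1 indicates clustering, R ~ 1 randomness, R > 1 uniform spacing.
    """
    n = len(points)
    mean_nn = sum(
        min(math.dist(p, q) for q in points if q is not p) for p in points
    ) / n
    expected = 0.5 / math.sqrt(n / area)
    return mean_nn / expected

rng = random.Random(1)
# Random scatter of 200 "colonies" on a 10 m x 10 m plot: R should be near 1.
random_pts = [(rng.uniform(0, 10), rng.uniform(0, 10)) for _ in range(200)]
# Clustered scatter: the same points squeezed into one corner of the plot.
clustered_pts = [(x / 5, y / 5) for x, y in random_pts]

print(round(clark_evans_index(random_pts, 100.0), 2))
print(round(clark_evans_index(clustered_pts, 100.0), 2))  # well below 1
```

Values well below 1, as for the clustered pattern here, correspond to the clustering the authors report for most coral taxa.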

  8. Monitoring oil persistence on beaches : SCAT versus stratified random sampling designs

    International Nuclear Information System (INIS)

    Short, J.W.; Lindeberg, M.R.; Harris, P.M.; Maselko, J.M.; Pella, J.J.; Rice, S.D.

    2003-01-01

    In the event of a coastal oil spill, shoreline clean-up assessment teams (SCAT) commonly rely on visual inspection of the entire affected area to monitor the persistence of the oil on beaches. Occasionally, pits are excavated to evaluate the persistence of subsurface oil. This approach is practical for directing clean-up efforts directly following a spill. However, sampling of the 1989 Exxon Valdez oil spill in Prince William Sound 12 years later has shown that visual inspection combined with pit excavation does not offer estimates of contaminated beach area or of stranded oil volumes. This information is needed to statistically evaluate the significance of change with time. Assumptions regarding the correlation of visually-evident surface oil and cryptic subsurface oil are usually not evaluated as part of the SCAT mandate. Stratified random sampling can avoid such problems and could produce precise estimates of oiled area and volume that allow for statistical assessment of major temporal trends and the extent of the impact. The 2001 sampling of the shoreline of Prince William Sound showed that 15 per cent of surface oil occurrences were associated with subsurface oil. This study demonstrates the usefulness of the stratified random sampling method and shows how sampling design parameters impact statistical outcome. Power analysis based on the study results indicates that optimum power is derived when unnecessary stratification is avoided. It was emphasized that sampling effort should be balanced between choosing sufficient beaches for sampling and the intensity of sampling.
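
Stratified random sampling estimates a total by sampling each stratum independently and expanding the stratum sample mean by the stratum size. A sketch with fabricated oiling data (stratum definitions, segment counts, and oiled areas are all invented for illustration):

```python
import random

def stratified_estimate(strata, n_per_stratum, rng):
    """Estimate the total oiled area from a stratified random sample of beach
    segments: sample each stratum independently, expand by stratum size."""
    total = 0.0
    for segments in strata.values():
        sample = rng.sample(segments, n_per_stratum)
        mean_oiled = sum(sample) / len(sample)
        total += mean_oiled * len(segments)
    return total

rng = random.Random(4)
# Hypothetical oiled areas (m^2) per segment in heavy/medium/light strata.
strata = {
    "heavy": [rng.uniform(50, 100) for _ in range(40)],
    "medium": [rng.uniform(10, 50) for _ in range(120)],
    "light": [rng.uniform(0, 10) for _ in range(400)],
}
true_total = sum(sum(s) for s in strata.values())
estimate = stratified_estimate(strata, n_per_stratum=12, rng=rng)
print(round(estimate / true_total, 2))  # close to 1
```

Because each stratum contributes its own variance, adding strata that do not separate genuinely different oiling levels only spends sample size without reducing variance, which matches the paper's point that unnecessary stratification costs statistical power.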

  9. Implementation of Sample Graphic Patterns on Derived Scientific/Technologic Documentary Figures

    Institute of Scientific and Technical Information of China (English)

    MENG Xiang-bao; WANG Xiao-yu; WANG Lei

    2013-01-01

    This work deals with the implementation of sample graphic patterns derived from published scientific/technological documentation figures on numerical simulation of multiphase flow and FEM analysis of thin-walled mechanical structures. First, geometric plane patterns in rectangular/circular form were demonstrated, contrasted with gradual-change styles in color and graphic configuration. Next, selected artistic/graphic sample patterns were implemented in logo concept design and visual brainstorming. The original figures were edited by 2D symmetry, geometric rectangular arrays, and color inversion in software such as Photoshop. The objective of this work is to cultivate, explore and discipline trainees' visual abilities in artistic/aesthetic appreciation, graphic communication, and industrial design and application, thus building close ties among university students from majors across science, engineering, arts and the humanities.

  10. Are most samples of animals systematically biased? Consistent individual trait differences bias samples despite random sampling.

    Science.gov (United States)

    Biro, Peter A

    2013-02-01

    Sampling animals from the wild for study is something nearly every biologist has done, but despite our best efforts to obtain random samples of animals, 'hidden' trait biases may still exist. For example, consistent behavioral traits can affect trappability/catchability, independent of obvious factors such as size and gender, and these traits are often correlated with other repeatable physiological and/or life history traits. If so, systematic sampling bias may exist for any of these traits. The extent to which this is a problem, of course, depends on the magnitude of bias, which is presently unknown because the underlying trait distributions in populations are usually unknown, or unknowable. Indeed, our present knowledge about sampling bias comes from samples (not complete population censuses), which can possess bias to begin with. I had the unique opportunity to create naturalized populations of fish by seeding each of four small fishless lakes with equal densities of slow-, intermediate-, and fast-growing fish. Using sampling methods that are not size-selective, I observed that fast-growing fish were up to two-times more likely to be sampled than slower-growing fish. This indicates substantial and systematic bias with respect to an important life history trait (growth rate). If correlations between behavioral, physiological and life-history traits are as widespread as the literature suggests, then many animal samples may be systematically biased with respect to these traits (e.g., when collecting animals for laboratory use), and affect our inferences about population structure and abundance. I conclude with a discussion on ways to minimize sampling bias for particular physiological/behavioral/life-history types within animal populations.
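
The growth-rate sampling bias reported above can be mimicked with a toy capture model in which per-session trappability doubles from slow- to fast-growing fish. All probabilities and population sizes are invented for illustration:

```python
import random

rng = random.Random(2)

# Hypothetical population: equal numbers of slow, intermediate and fast growers,
# with capture probability per trapping session tied to behaviour/growth type.
capture_prob = {"slow": 0.05, "intermediate": 0.075, "fast": 0.10}
population = [t for t in capture_prob for _ in range(1000)]

caught = [t for t in population if rng.random() < capture_prob[t]]
counts = {t: caught.count(t) for t in capture_prob}
ratio = counts["fast"] / counts["slow"]
print(counts, round(ratio, 1))  # fast growers ~2x as frequent in the sample
```

Even though every individual had a chance of capture, the sample composition systematically misrepresents the population whenever trappability correlates with the trait of interest.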

  11. A simple and efficient alternative to implementing systematic random sampling in stereological designs without a motorized microscope stage.

    Science.gov (United States)

    Melvin, Neal R; Poda, Daniel; Sutherland, Robert J

    2007-10-01

    When properly applied, stereology is a very robust and efficient method to quantify a variety of parameters from biological material. A common sampling strategy in stereology is systematic random sampling, which involves choosing a random start point outside the structure of interest, and sampling relevant objects at sites that are placed at pre-determined, equidistant intervals. This has proven to be a very efficient sampling strategy, and is used widely in stereological designs. At the microscopic level, this is most often achieved through the use of a motorized stage that facilitates the systematic random stepping across the structure of interest. Here, we report a simple, precise and cost-effective software-based alternative for accomplishing systematic random sampling under the microscope. We believe that this approach will facilitate the use of stereological designs that employ systematic random sampling in laboratories that lack the resources to acquire costly, fully automated systems.
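
Systematic random sampling as described, a single random start followed by equidistant sites, can be sketched in one dimension. The extent and interval below are arbitrary, and the random start is placed inside the first interval (a standard convention; the abstract does not detail the authors' software):

```python
import random

def systematic_sample_positions(extent, interval, rng):
    """Systematic random sampling along one axis: one random start inside the
    first interval, then equidistant sites across the structure's extent."""
    start = rng.uniform(0, interval)
    positions = []
    x = start
    while x < extent:
        positions.append(round(x, 3))
        x += interval
    return positions

rng = random.Random(5)
sites = systematic_sample_positions(extent=1000.0, interval=150.0, rng=rng)
print(len(sites), sites[:3])
```

Only the start is random; every other site is determined by it, which is what makes the design both unbiased (over random starts) and highly efficient to execute with a stepping stage or software overlay.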

  12. Accounting for Sampling Error in Genetic Eigenvalues Using Random Matrix Theory.

    Science.gov (United States)

    Sztepanacz, Jacqueline L; Blows, Mark W

    2017-07-01

    The distribution of genetic variance in multivariate phenotypes is characterized by the empirical spectral distribution of the eigenvalues of the genetic covariance matrix. Empirical estimates of genetic eigenvalues from random effects linear models are known to be overdispersed by sampling error, where large eigenvalues are biased upward, and small eigenvalues are biased downward. The overdispersion of the leading eigenvalues of sample covariance matrices has been demonstrated to conform to the Tracy-Widom (TW) distribution. Here we show that genetic eigenvalues estimated using restricted maximum likelihood (REML) in a multivariate random effects model with an unconstrained genetic covariance structure will also conform to the TW distribution after empirical scaling and centering. However, where estimation procedures using either REML or MCMC impose boundary constraints, the resulting genetic eigenvalues tend not to be TW distributed. We show how using confidence intervals from sampling distributions of genetic eigenvalues without reference to the TW distribution is insufficient protection against mistaking sampling error for genetic variance, particularly when eigenvalues are small. By scaling such sampling distributions to the appropriate TW distribution, the critical value of the TW statistic can be used to determine if the magnitude of a genetic eigenvalue exceeds the sampling error for each eigenvalue in the spectral distribution of a given genetic covariance matrix. Copyright © 2017 by the Genetics Society of America.
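
The eigenvalue overdispersion that motivates the TW-based correction is easy to demonstrate: even when every population eigenvalue equals exactly 1, the sample spectrum spreads. This sketch shows only the overdispersion, not the TW scaling itself; the trait and sample counts are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(11)

# Traits are truly uncorrelated with unit variance, so every population
# eigenvalue of the covariance matrix equals 1.
n_traits, n_individuals = 10, 50
data = rng.normal(size=(n_individuals, n_traits))

# Eigenvalues of the sample covariance matrix, largest first.
sample_eigs = np.sort(np.linalg.eigvalsh(np.cov(data, rowvar=False)))[::-1]

# The largest sample eigenvalue exceeds 1 and the smallest falls below 1
# purely through sampling error -- the bias the TW-based scaling guards against.
print(round(float(sample_eigs[0]), 2), round(float(sample_eigs[-1]), 2))
```

Mistaking the inflated leading eigenvalue here for real structure is exactly the error the paper warns about when eigenvalues are interpreted without reference to their sampling distribution.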

  13. Automated CBED processing: Sample thickness estimation based on analysis of zone-axis CBED pattern

    Energy Technology Data Exchange (ETDEWEB)

    Klinger, M., E-mail: klinger@post.cz; Němec, M.; Polívka, L.; Gärtnerová, V.; Jäger, A.

    2015-03-15

    An automated processing of convergent beam electron diffraction (CBED) patterns is presented. The proposed methods are used in an automated tool for estimating the thickness of transmission electron microscopy (TEM) samples by matching an experimental zone-axis CBED pattern with a series of patterns simulated for known thicknesses. The proposed tool detects CBED disks, localizes a pattern in detected disks and unifies the coordinate system of the experimental pattern with the simulated one. The experimental pattern is then compared disk-by-disk with a series of simulated patterns each corresponding to different known thicknesses. The thickness of the most similar simulated pattern is then taken as the thickness estimate. The tool was tested on [0 1 1] Si, [0 1 0] α-Ti and [0 1 1] α-Ti samples prepared using different techniques. Results of the presented approach were compared with thickness estimates based on analysis of CBED patterns in two beam conditions. The mean difference between these two methods was 4.1% for the FIB-prepared silicon samples, 5.2% for the electro-chemically polished titanium and 7.9% for Ar{sup +} ion-polished titanium. The proposed techniques can also be employed in other established CBED analyses. Apart from the thickness estimation, it can potentially be used to quantify lattice deformation, structure factors, symmetry, defects or extinction distance. - Highlights: • Automated TEM sample thickness estimation using zone-axis CBED is presented. • Computer vision and artificial intelligence are employed in CBED processing. • This approach reduces operator effort, analysis time and increases repeatability. • Individual parts can be employed in other analyses of CBED/diffraction pattern.
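
    After registration, the matching step described above reduces to a nearest-pattern search over the simulated thickness series. A hypothetical sketch (the tool's disk detection, coordinate unification and actual similarity metric are not specified in the abstract; mean squared difference stands in for the disk-by-disk comparison):

```python
import numpy as np

def estimate_thickness(experimental, simulated_by_thickness):
    """Return the thickness (dict key) whose simulated CBED pattern is
    most similar to the experimental pattern, assuming both are already
    in a common coordinate system. Similarity is a stand-in mean
    squared difference over the disk intensities."""
    def mse(a, b):
        return float(np.mean((np.asarray(a) - np.asarray(b)) ** 2))
    return min(simulated_by_thickness,
               key=lambda t: mse(experimental, simulated_by_thickness[t]))
```
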

  14. A random sampling approach for robust estimation of tissue-to-plasma ratio from extremely sparse data.

    Science.gov (United States)

    Chu, Hui-May; Ette, Ene I

    2005-09-02

    This study was performed to develop a new nonparametric approach for the estimation of a robust tissue-to-plasma ratio from extremely sparsely sampled paired data (i.e., one sample each from plasma and tissue per subject). Tissue-to-plasma ratio was estimated from paired/unpaired experimental data using the independent time points approach, area under the curve (AUC) values calculated with the naive data averaging approach, and AUC values calculated using sampling-based approaches (e.g., the pseudoprofile-based bootstrap [PpbB] approach and the random sampling approach [our proposed approach]). The random sampling approach involves the use of a 2-phase algorithm. The convergence of the sampling/resampling approaches was investigated, as well as the robustness of the estimates produced by the different approaches. To evaluate the latter, new data sets were generated by introducing outlier(s) into the real data set. One to 2 concentration values were inflated by 10% to 40% from their original values to produce the outliers. Tissue-to-plasma ratios computed using the independent time points approach varied between 0 and 50 across time points. The ratio obtained from AUC values acquired using the naive data averaging approach was not associated with any measure of uncertainty or variability. Calculating the ratio without regard to pairing yielded poorer estimates. The random sampling and pseudoprofile-based bootstrap approaches yielded tissue-to-plasma ratios with uncertainty and variability. However, the random sampling approach, because of the 2-phase nature of its algorithm, yielded more robust estimates and required fewer replications. Therefore, a 2-phase random sampling approach is proposed for the robust estimation of tissue-to-plasma ratio from extremely sparsely sampled data.
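
    The abstract does not reproduce the 2-phase algorithm itself; the following is a simplified pseudoprofile-style resampling sketch of the general idea — draw one tissue/plasma pair per time point, form AUCs by the trapezoidal rule, and collect the distribution of ratios as a measure of uncertainty:

```python
import random

def trapezoid_auc(times, conc):
    """Area under the concentration-time curve by the trapezoidal rule."""
    return sum((conc[i] + conc[i + 1]) / 2.0 * (times[i + 1] - times[i])
               for i in range(len(times) - 1))

def resampled_ratios(times, paired_obs, n_rep=1000, rng=random):
    """paired_obs[i] holds the (tissue, plasma) pairs observed at
    times[i], one pair per subject. Each replicate draws one pair per
    time point, builds a pseudoprofile, and records
    AUC_tissue / AUC_plasma, yielding a ratio with variability."""
    ratios = []
    for _ in range(n_rep):
        picks = [rng.choice(obs) for obs in paired_obs]
        auc_t = trapezoid_auc(times, [t for t, _ in picks])
        auc_p = trapezoid_auc(times, [p for _, p in picks])
        ratios.append(auc_t / auc_p)
    return ratios
```
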

  15. Evolving cellular automata for diversity generation and pattern recognition: deterministic versus random strategy

    International Nuclear Information System (INIS)

    De Menezes, Marcio Argollo; Brigatti, Edgardo; Schwämmle, Veit

    2013-01-01

    Microbiological systems evolve to fulfil their tasks with maximal efficiency. The immune system is a remarkable example, where the distinction between self and non-self is made by means of molecular interaction between self-proteins and antigens, triggering affinity-dependent systemic actions. Specificity of this binding and the infinitude of potential antigenic patterns call for novel mechanisms to generate antibody diversity. Inspired by this problem, we develop a genetic algorithm where agents evolve their strings in the presence of random antigenic strings and reproduce with affinity-dependent rates. We ask what is the best strategy to generate diversity if agents can rearrange their strings a finite number of times. We find that endowing each agent with an inheritable cellular automaton rule for performing rearrangements makes the system more efficient in pattern-matching than if transformations are totally random. In the former implementation, the population evolves to a stationary state where agents with different automata rules coexist. (paper)

  16. Examining drivers' eye glance patterns during distracted driving: Insights from scanning randomness and glance transition matrix.

    Science.gov (United States)

    Wang, Yuan; Bao, Shan; Du, Wenjun; Ye, Zhirui; Sayer, James R

    2017-12-01

    Visual attention to the driving environment is of great importance for road safety. Eye glance behavior has been used as an indicator of distracted driving. This study examined and quantified drivers' glance patterns and features during distracted driving. Data from an existing naturalistic driving study were used. Entropy rate was calculated and used to assess the randomness associated with drivers' scanning patterns. A glance-transition proportion matrix was defined to quantify visual search patterns transitioning among four main eye glance locations while driving (i.e., forward on-road, phone, mirrors and others). All measurements were calculated within a 5-s time window under both cell phone and non-cell phone use conditions. Results of the glance data analyses showed different patterns between distracted and non-distracted driving, featuring a higher entropy rate value and highly biased attention transferring between forward and phone locations during distracted driving. Drivers in general had a higher number of glance transitions, and their on-road glance duration was significantly shorter, during distracted driving when compared to non-distracted driving. Results suggest that drivers have a higher scanning randomness/disorder level and shift their main attention from surrounding areas towards the phone area when engaging in visual-manual tasks. Drivers' visual search patterns during visual-manual distraction, with a high scanning randomness and a high proportion of eye glance transitions towards the location of the phone, provide insight into driver distraction detection. This will help to inform the design of in-vehicle human-machine interfaces/systems. Copyright © 2017. Published by Elsevier Ltd.
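
    Assuming the standard Markov-chain definition of entropy rate (row entropies of the glance-transition matrix weighted by the stationary distribution — the paper's exact estimator and windowing are not reproduced here), the randomness measure can be sketched as:

```python
import numpy as np

def entropy_rate(P):
    """Entropy rate (bits per transition) of a Markov chain with
    row-stochastic transition matrix P:
    H = -sum_i pi_i * sum_j P_ij * log2(P_ij),
    where pi is the stationary distribution (left eigenvector of P
    for eigenvalue 1). Rows/columns could represent glance locations
    such as forward on-road, phone, mirrors, others."""
    P = np.asarray(P, dtype=float)
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    pi = pi / pi.sum()
    logP = np.zeros_like(P)
    mask = P > 0
    logP[mask] = np.log2(P[mask])  # 0 * log 0 treated as 0
    return float(-np.sum(pi[:, None] * P * logP))
```

    Higher values indicate more disordered scanning; a fully predictable glance sequence has an entropy rate of 0.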

  17. Random Sampling of Correlated Parameters – a Consistent Solution for Unfavourable Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Žerovnik, G., E-mail: gasper.zerovnik@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Trkov, A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Kodeli, I.A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Capote, R. [International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Smith, D.L. [Argonne National Laboratory, 1710 Avenida del Mundo, Coronado, CA 92118-3073 (United States)

    2015-01-15

    Two methods for random sampling according to a multivariate lognormal distribution – the correlated sampling method and the method of transformation of correlation coefficients – are briefly presented. The methods are mathematically exact and enable consistent sampling of correlated inherently positive parameters with given information on the first two distribution moments. Furthermore, a weighted sampling method to accelerate the convergence of parameters with extremely large relative uncertainties is described. However, the method is efficient only for a limited number of correlated parameters.
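
    For the lognormal case, consistent sampling with given first two moments can be achieved by moment-matching the underlying normal distribution and exponentiating. A minimal sketch of that idea (not the authors' exact formulation, and omitting the weighted acceleration scheme):

```python
import numpy as np

def sample_correlated_lognormal(mean, cov, n, seed=None):
    """Draw n vectors of correlated, inherently positive parameters
    whose mean vector and covariance matrix match `mean` and `cov`,
    by transforming the moments to those of the underlying
    multivariate normal and exponentiating the samples."""
    mean = np.asarray(mean, dtype=float)
    cov = np.asarray(cov, dtype=float)
    cov_n = np.log1p(cov / np.outer(mean, mean))  # normal-space covariance
    mu_n = np.log(mean) - 0.5 * np.diag(cov_n)    # normal-space mean
    rng = np.random.default_rng(seed)
    return np.exp(rng.multivariate_normal(mu_n, cov_n, size=n))
```
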

  18. The Bootstrap, the Jackknife, and the Randomization Test: A Sampling Taxonomy.

    Science.gov (United States)

    Rodgers, J L

    1999-10-01

    A simple sampling taxonomy is defined that shows the differences between and relationships among the bootstrap, the jackknife, and the randomization test. Each method has as its goal the creation of an empirical sampling distribution that can be used to test statistical hypotheses, estimate standard errors, and/or create confidence intervals. Distinctions between the methods can be made based on the sampling approach (with replacement versus without replacement) and the sample size (replacing the whole original sample versus replacing a subset of the original sample). The taxonomy is useful for teaching the goals and purposes of resampling schemes. An extension of the taxonomy implies other possible resampling approaches that have not previously been considered. Univariate and multivariate examples are presented.
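
    The taxonomy's two axes — with versus without replacement, and whole sample versus subset — map directly onto how each resample is built. An illustrative sketch:

```python
import random

def bootstrap_resample(sample, rng=random):
    """Bootstrap: draw WITH replacement, replacing the whole sample."""
    return [rng.choice(sample) for _ in sample]

def jackknife_resample(sample, leave_out):
    """Jackknife: WITHOUT replacement, a subset of size n - 1
    (the sample minus one observation)."""
    return sample[:leave_out] + sample[leave_out + 1:]

def randomization_resample(group_a, group_b, rng=random):
    """Randomization test: WITHOUT replacement, the whole pooled
    sample re-divided as if treatment labels were reassigned."""
    pooled = group_a + group_b
    shuffled = rng.sample(pooled, len(pooled))
    return shuffled[:len(group_a)], shuffled[len(group_a):]
```

    Repeating any of these and recomputing the statistic of interest builds the empirical sampling distribution the taxonomy refers to.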

  20. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Science.gov (United States)

    L'Engle, Kelly; Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. 
Random digit dialing of mobile

  2. Pseudo-random arranged color filter array for controlling moiré patterns in display.

    Science.gov (United States)

    Zhou, Yangui; Fan, Hang; An, Sengzhong; Li, Juntao; Wang, Jiahui; Zhou, Jianying; Liu, Yikun

    2015-11-16

    Optical display quality can be degraded by the appearance of a moiré pattern in a display system consisting of a basic matrix superimposed with a functional structured optical layer. We propose in this paper a novel pseudo-randomly arranged color filter array with the table number arranged according to an optimal design scenario. We show that the moiré pattern can be significantly reduced by the introduction of the special color filter array. The idea is tested with an experiment that gives rise to a substantially reduced moiré pattern in a display system. It is believed that such novel functional optical structures will have a significant impact on complex structured display systems in general, and on autostereoscopic and integrated display systems in particular.

  3. Patterns of particle distribution in multiparticle systems by random walks with memory enhancement and decay

    Science.gov (United States)

    Tan, Zhi-Jie; Zou, Xian-Wu; Huang, Sheng-You; Zhang, Wei; Jin, Zhun-Zhi

    2002-07-01

    We investigate the pattern of particle distribution and its evolution with time in multiparticle systems using the model of random walks with memory enhancement and decay. This model describes some biologically intelligent walks. With a decrease in the memory decay exponent α, the distribution of particles changes from a random dispersive pattern to a locally dense one, and then returns to the random one. Correspondingly, the fractal dimension D_f,p characterizing the distribution of particle positions increases from a low value to a maximum and then decreases to the low one again. This is determined by the degree of overlap of regions consisting of sites with remanent information. The second moment of the density, ρ(2), was introduced to investigate the inhomogeneity of the particle distribution. The dependence of ρ(2) on α is similar to that of D_f,p on α. ρ(2) increases with time as a power law in the process of adjusting the particle distribution, and then ρ(2) tends to a stable equilibrium value.

  4. Occupational position and its relation to mental distress in a random sample of Danish residents

    DEFF Research Database (Denmark)

    Rugulies, Reiner Ernst; Madsen, Ida E H; Nielsen, Maj Britt D

    2010-01-01

    PURPOSE: To analyze the distribution of depressive, anxiety, and somatization symptoms across different occupational positions in a random sample of Danish residents. METHODS: The study sample consisted of 591 Danish residents (50% women), aged 20-65, drawn from an age- and gender-stratified random sample of the Danish population. Participants filled out a survey that included the 92-item version of the Hopkins Symptom Checklist (SCL-92). We categorized occupational position into seven groups: high- and low-grade non-manual workers, skilled and unskilled manual workers, high- and low-grade self...

  5. Estimation of Sensitive Proportion by Randomized Response Data in Successive Sampling

    Directory of Open Access Journals (Sweden)

    Bo Yu

    2015-01-01

    This paper considers the problem of estimation for binomial proportions of sensitive or stigmatizing attributes in the population of interest. Randomized response techniques are suggested for protecting the privacy of respondents and reducing the response bias while eliciting information on sensitive attributes. In many sensitive question surveys, the same population is often sampled repeatedly on each occasion. In this paper, we apply a successive sampling scheme to improve the estimation of the sensitive proportion on the current occasion.
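
    As a concrete instance of the technique (the paper's successive-sampling refinement is not reproduced here), Warner's classic randomized response design lets the sensitive proportion be recovered from the observed "yes" rate. A sketch assuming that design:

```python
def warner_estimate(n_yes, n, p_design):
    """Warner randomized-response estimator. Each respondent answers
    the sensitive question with probability p_design (p_design != 0.5)
    and its complement otherwise, so no single answer reveals group
    membership. Observed 'yes' rate: lam = p*pi + (1 - p)*(1 - pi);
    invert for pi and attach the usual variance estimate."""
    lam = n_yes / n
    pi_hat = (lam - (1.0 - p_design)) / (2.0 * p_design - 1.0)
    var_hat = lam * (1.0 - lam) / (n * (2.0 * p_design - 1.0) ** 2)
    return pi_hat, var_hat
```
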

  6. Adaptive importance sampling of random walks on continuous state spaces

    International Nuclear Information System (INIS)

    Baggerly, K.; Cox, D.; Picard, R.

    1998-01-01

    The authors consider adaptive importance sampling for a random walk with scoring in a general state space. Conditions under which exponential convergence occurs to the zero-variance solution are reviewed. These results generalize previous work for finite, discrete state spaces in Kollman (1993) and in Kollman, Baggerly, Cox, and Picard (1996). This paper is intended for nonstatisticians and includes considerable explanatory material

  7. Radiation pattern synthesis of planar antennas using the iterative sampling method

    Science.gov (United States)

    Stutzman, W. L.; Coffey, E. L.

    1975-01-01

    A synthesis method is presented for determining an excitation of an arbitrary (but fixed) planar source configuration. The desired radiation pattern is specified over all or part of the visible region. It may have multiple and/or shaped main beams with low sidelobes. The iterative sampling method is used to find an excitation of the source which yields a radiation pattern that approximates the desired pattern to within a specified tolerance. In this paper the method is used to calculate excitations for line sources, linear arrays (equally and unequally spaced), rectangular apertures, rectangular arrays (arbitrary spacing grid), and circular apertures. Examples using these sources to form patterns with shaped main beams, multiple main beams, shaped sidelobe levels, and combinations thereof are given.

  8. Shape measurement system for single point incremental forming (SPIF) manufacts by using trinocular vision and random pattern

    International Nuclear Information System (INIS)

    Setti, Francesco; Bini, Ruggero; Lunardelli, Massimo; Bosetti, Paolo; Bruschi, Stefania; De Cecco, Mariolino

    2012-01-01

    Many contemporary works show the interest of the scientific community in measuring the shape of artefacts made by single point incremental forming. In this paper, we will present an algorithm able to detect feature points with a random pattern, check the compatibility of associations by exploiting multi-stereo constraints, reject outliers, and perform a 3D reconstruction from dense random patterns. The algorithm is suitable for a real-time application; in fact, it needs just three images and synchronous, relatively fast processing. The proposed method has been tested on a simple geometry and the results have been compared with a coordinate measurement machine acquisition. (paper)

  9. Introducing a rainfall compound distribution model based on weather patterns sub-sampling

    Directory of Open Access Journals (Sweden)

    F. Garavaglia

    2010-06-01

    This paper presents a probabilistic model for daily rainfall, using sub-sampling based on meteorological circulation. We classified eight typical but contrasted synoptic situations (weather patterns) for France and surrounding areas, using a "bottom-up" approach, i.e. from the shape of the rain field to the synoptic situations described by geopotential fields. These weather patterns (WP) provide a discriminating variable that is consistent with French climatology, and allow seasonal rainfall records to be split into more homogeneous sub-samples in terms of meteorological genesis.

    First results show how the combination of seasonal and WP sub-sampling strongly influences the identification of the asymptotic behaviour of rainfall probabilistic models. Furthermore, with this level of stratification, an asymptotic exponential behaviour of each sub-sample appears as a reasonable hypothesis. This first part is illustrated with two daily rainfall records from SE of France.

    The distribution of the multi-exponential weather patterns (MEWP is then defined as the composition, for a given season, of all WP sub-sample marginal distributions, weighted by the relative frequency of occurrence of each WP. This model is finally compared to Exponential and Generalized Pareto distributions, showing good features in terms of robustness and accuracy. These final statistical results are computed from a wide dataset of 478 rainfall chronicles spread on the southern half of France. All these data cover the 1953–2005 period.

  10. Precision of systematic and random sampling in clustered populations: habitat patches and aggregating organisms.

    Science.gov (United States)

    McGarvey, Richard; Burch, Paul; Matthews, Janet M

    2016-01-01

    Natural populations of plants and animals spatially cluster because (1) suitable habitat is patchy, and (2) within suitable habitat, individuals aggregate further into clusters of higher density. We compare the precision of random and systematic field sampling survey designs under these two processes of species clustering. Second, we evaluate the performance of 13 estimators for the variance of the sample mean from a systematic survey. Replicated simulated surveys, as counts from 100 transects allocated either randomly or systematically within the study region, were used to estimate population density in six spatial point populations including habitat patches and Matérn circular clustered aggregations of organisms, together and in combination. The standard one-start aligned systematic survey design, a uniform 10 × 10 grid of transects, was much more precise. Variances of the 10 000 replicated systematic survey mean densities were one-third to one-fifth of those from randomly allocated transects, implying transect sample sizes giving equivalent precision by random survey would need to be three to five times larger. Organisms being restricted to patches of habitat was alone sufficient to yield this precision advantage for the systematic design. But this improved precision for systematic sampling in clustered populations is underestimated by the standard variance estimators used to compute confidence intervals. True variance for the survey sample mean was computed from the variance of 10 000 simulated survey mean estimates. Testing 10 published and three newly proposed variance estimators, the two estimators that corrected for inter-transect correlation (ν₈ and ν_W) were the most accurate and also the most precise in clustered populations. These greatly outperformed the two "post-stratification" variance estimators (ν₂ and ν₃) that are now more commonly applied in systematic surveys. Similar variance estimator performance rankings were found with
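
    The precision advantage of systematic placement in a patchy population can be reproduced with a toy simulation. This sketch uses a hypothetical one-dimensional habitat with two patches, not the paper's six spatial point populations:

```python
import random

def density_at(x):
    """Toy clustered population on [0, 1): high density inside two
    habitat patches, low density elsewhere."""
    return 10.0 if (0.13 <= x < 0.34) or (0.58 <= x < 0.77) else 0.5

def survey_mean(positions):
    return sum(density_at(x) for x in positions) / len(positions)

def compare_designs(n_transects=100, n_surveys=2000, seed=1):
    """Variance of the survey mean under random vs. one-start aligned
    systematic transect placement over [0, 1)."""
    rng = random.Random(seed)
    step = 1.0 / n_transects
    rand_means, sys_means = [], []
    for _ in range(n_surveys):
        rand_means.append(
            survey_mean([rng.random() for _ in range(n_transects)]))
        start = rng.uniform(0.0, step)  # single random start
        sys_means.append(
            survey_mean([start + i * step for i in range(n_transects)]))
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    return var(rand_means), var(sys_means)
```

    With these settings the systematic design's variance comes out far smaller than the random design's, in line with the precision advantage reported above.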

  11. A Table-Based Random Sampling Simulation for Bioluminescence Tomography

    Directory of Open Access Journals (Sweden)

    Xiaomeng Zhang

    2006-01-01

    The Monte Carlo (MC) method is a popular simulation of photon propagation in turbid media, but its main problem is cumbersome computation. In this work a table-based random sampling simulation (TBRS) is proposed. The key idea of TBRS is to simplify the multiple steps of scattering to a single-step process through random table querying, thus greatly reducing the computing complexity of the conventional MC algorithm and expediting the computation. The TBRS simulation is a fast alternative to the conventional MC simulation of photon propagation. It retains the flexibility and accuracy of the conventional MC method and adapts well to complex geometric media and various source shapes. Both MC simulations were conducted in a homogeneous medium in our work. We also present a reconstruction approach to estimate the position of the fluorescent source, based on trial and error, as a validation of the TBRS algorithm. Good agreement is found between the conventional MC simulation and the TBRS simulation.
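
    The table idea itself is simple: precompute a cumulative table once, then replace repeated analytic sampling with a lookup. A minimal discrete-distribution sketch of that pattern (not the photon-scattering specifics of the TBRS tool):

```python
import bisect
import random

class TableSampler:
    """Table-based random sampling: build the cumulative distribution
    table once, then draw each sample with a single uniform random
    number and a binary search into the table."""

    def __init__(self, values, weights):
        total = float(sum(weights))
        self.values = list(values)
        self.cdf = []
        acc = 0.0
        for w in weights:
            acc += w / total
            self.cdf.append(acc)
        self.cdf[-1] = 1.0  # guard against floating-point shortfall

    def draw(self, rng=random):
        return self.values[bisect.bisect_left(self.cdf, rng.random())]
```
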

  12. Pattern imprinting in deep sub-micron static random access memories induced by total dose irradiation

    Science.gov (United States)

    Zheng, Qi-Wen; Yu, Xue-Feng; Cui, Jiang-Wei; Guo, Qi; Ren, Di-Yuan; Cong, Zhong-Chao; Zhou, Hang

    2014-10-01

    Pattern imprinting in deep sub-micron static random access memories (SRAMs) during total dose irradiation is investigated in detail. As the dose accumulates, the data pattern loaded in the memory cells during irradiation is gradually imprinted on their background data pattern. We build a relationship between a memory cell's static noise margin (SNM) and its background data, and study the influence of irradiation on the probability density function of ΔSNM, the difference between the two data sides' SNMs, to discuss the cause of pattern imprinting. Finally, we demonstrate that for both micron and deep sub-micron devices the mechanism of pattern imprinting is the bias-dependent threshold shift of the transistor, but for a deep sub-micron device the shift results from charge trapping in the shallow trench isolation (STI) oxide rather than in the gate oxide, as in the micron device.

  13. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Science.gov (United States)

    2010-07-01

    § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as...

  14. A Nationwide Random Sampling Survey of Potential Complicated Grief in Japan

    Science.gov (United States)

    Mizuno, Yasunao; Kishimoto, Junji; Asukai, Nozomu

    2012-01-01

    To investigate the prevalence of significant loss, potential complicated grief (CG), and its contributing factors, we conducted a nationwide random sampling survey of Japanese adults aged 18 or older (N = 1,343) using a self-rating Japanese-language version of the Complicated Grief Brief Screen. Among them, 37.0% experienced their most significant…

  15. Reinforcing Sampling Distributions through a Randomization-Based Activity for Introducing ANOVA

    Science.gov (United States)

    Taylor, Laura; Doehler, Kirsten

    2015-01-01

    This paper examines the use of a randomization-based activity to introduce the ANOVA F-test to students. The two main goals of this activity are to successfully teach students to comprehend ANOVA F-tests and to increase student comprehension of sampling distributions. Four sections of students in an advanced introductory statistics course…

  16. Optimizing sampling approaches along ecological gradients

    DEFF Research Database (Denmark)

    Schweiger, Andreas; Irl, Severin D. H.; Steinbauer, Manuel

    2016-01-01

    1. Natural scientists and especially ecologists use manipulative experiments or field observations along gradients to differentiate patterns driven by processes from those caused by random noise. A well-conceived sampling design is essential for identifying, analysing and reporting underlying patterns in a statistically solid and reproducible manner, given the normal restrictions in labour, time and money. However, a technical guideline about an adequate sampling design to maximize prediction success under restricted resources is lacking. This study aims at developing such a solid and reproducible guideline for sampling along gradients in all fields of ecology and science in general. 2. We conducted simulations with artificial data for five common response types known in ecology, each represented by a simple function (no response, linear, exponential, symmetric unimodal and asymmetric...

  17. Lensless digital holography with diffuse illumination through a pseudo-random phase mask.

    Science.gov (United States)

    Bernet, Stefan; Harm, Walter; Jesacher, Alexander; Ritsch-Marte, Monika

    2011-12-05

    Microscopic imaging with a setup consisting of a pseudo-random phase mask and an open CMOS camera, without an imaging objective, is demonstrated. The pseudo-random phase mask acts as a diffuser for an incoming laser beam, scattering a speckle pattern onto a CMOS chip, which is recorded once as a reference. A sample which is afterwards inserted somewhere in the optical beam path changes the speckle pattern. A single (non-iterative) image processing step, comparing the modified speckle pattern with the previously recorded one, generates a sharp image of the sample. After a first calibration the method works in real time and allows quantitative imaging of complex (amplitude and phase) samples in an extended three-dimensional volume. Since no lenses are used, the method is free from lens aberrations. Compared to standard inline holography, the diffuse sample illumination improves the axial sectioning capability by increasing the effective numerical aperture in the illumination path, and it suppresses the undesired so-called twin images. For demonstration, a high resolution spatial light modulator (SLM) is programmed to act as the pseudo-random phase mask. We show experimental results, imaging microscopic biological samples, e.g. insects, within an extended volume at a distance of 15 cm with a transverse and longitudinal resolution of about 60 μm and 400 μm, respectively.

  18. Randomization tests

    CERN Document Server

    Edgington, Eugene

    2007-01-01

    Statistical Tests That Do Not Require Random Sampling Randomization Tests Numerical Examples Randomization Tests and Nonrandom Samples The Prevalence of Nonrandom Samples in Experiments The Irrelevance of Random Samples for the Typical Experiment Generalizing from Nonrandom Samples Intelligibility Respect for the Validity of Randomization Tests Versatility Practicality Precursors of Randomization Tests Other Applications of Permutation Tests Questions and Exercises Notes References Randomized Experiments Unique Benefits of Experiments Experimentation without Mani

  19. Optimization of sampling pattern and the design of Fourier ptychographic illuminator.

    Science.gov (United States)

    Guo, Kaikai; Dong, Siyuan; Nanda, Pariksheet; Zheng, Guoan

    2015-03-09

Fourier ptychography (FP) is a recently developed imaging approach that facilitates high-resolution imaging beyond the cutoff frequency of the employed optics. In the original FP approach, a periodic LED array is used for sample illumination, and therefore, the scanning pattern is a uniform grid in the Fourier space. Such a uniform sampling scheme leads to 3 major problems for FP, namely: 1) it requires a large number of raw images, 2) it introduces raster grid artifacts in the reconstruction process, and 3) it requires a high-dynamic-range detector. Here, we investigate scanning sequences and sampling patterns to optimize the FP approach. For most biological samples, signal energy is concentrated in the low-frequency region, and as such, we can perform non-uniform Fourier sampling in FP by considering the signal structure. In contrast, conventional ptychography performs uniform sampling over the entire real space. To implement the non-uniform Fourier sampling scheme in FP, we have designed and built an illuminator using LEDs mounted on a 3D-printed plastic case. The advantages of this illuminator are threefold in that: 1) it reduces the number of image acquisitions by at least 50% (68 raw images versus 137 in the original FP setup), 2) it departs from the translational symmetry of sampling to solve the raster grid artifact problem, and 3) it reduces the dynamic range of the captured images 6 fold. The results reported in this paper significantly shorten the acquisition time and improve the quality of FP reconstructions. It may provide new insights for developing Fourier ptychographic imaging platforms and find important applications in digital pathology.
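The non-uniform sampling idea in this abstract — sample densely where the signal energy lives (low spatial frequencies), sparsely elsewhere — can be sketched in a few lines. This is an illustrative stand-in, not the authors' actual LED layout: the ring counts, geometric growth factor, and function name are invented for the example.

```python
import math

def nonuniform_fourier_positions(n_rings=6, points_per_ring=12, growth=1.5):
    """Return 2D Fourier-space sampling positions that are denser near
    the origin (low spatial frequencies) than at the periphery: ring
    radii grow geometrically, so sample density falls with frequency."""
    positions = [(0.0, 0.0)]  # always sample the DC component
    radius = 1.0
    for ring in range(1, n_rings + 1):
        for k in range(points_per_ring):
            theta = 2 * math.pi * k / points_per_ring
            # stagger alternate rings to break translational symmetry,
            # echoing the raster-grid-artifact fix described above
            theta += (math.pi / points_per_ring) * (ring % 2)
            positions.append((radius * math.cos(theta),
                              radius * math.sin(theta)))
        radius *= growth
    return positions

pts = nonuniform_fourier_positions()
```

With the defaults this yields 73 positions, 25 of which lie within radius 2 of the origin — over a third of the budget spent on the innermost region.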

  20. LOD score exclusion analyses for candidate QTLs using random population samples.

    Science.gov (United States)

    Deng, Hong-Wen

    2003-11-01

    While extensive analyses have been conducted to test for, no formal analyses have been conducted to test against, the importance of candidate genes as putative QTLs using random population samples. Previously, we developed an LOD score exclusion mapping approach for candidate genes for complex diseases. Here, we extend this LOD score approach for exclusion analyses of candidate genes for quantitative traits. Under this approach, specific genetic effects (as reflected by heritability) and inheritance models at candidate QTLs can be analyzed and if an LOD score is < or = -2.0, the locus can be excluded from having a heritability larger than that specified. Simulations show that this approach has high power to exclude a candidate gene from having moderate genetic effects if it is not a QTL and is robust to population admixture. Our exclusion analysis complements association analysis for candidate genes as putative QTLs in random population samples. The approach is applied to test the importance of Vitamin D receptor (VDR) gene as a potential QTL underlying the variation of bone mass, an important determinant of osteoporosis.

  1. The study of combining Latin Hypercube Sampling method and LU decomposition method (LULHS method) for constructing spatial random field

    Science.gov (United States)

    WANG, P. T.

    2015-12-01

Groundwater modeling requires assigning hydrogeological properties to every numerical grid. Due to the lack of detailed information and the inherent spatial heterogeneity, geological properties can be treated as random variables. A hydrogeological property is assumed to follow a multivariate distribution with spatial correlations. By sampling random numbers from a given statistical distribution and assigning a value to each grid, a random field for modeling can be completed. Therefore, statistical sampling plays an important role in the efficiency of the modeling procedure. Latin Hypercube Sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random procedure from LHS with simulation using LU decomposition to form LULHS. Both conditional and unconditional simulations of LULHS were developed. The simulation efficiency and spatial correlation of LULHS are compared to three other simulation methods. The results show that for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort. Fewer realizations are required to achieve the required statistical accuracy and spatial correlation.
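The LHS half of the LULHS scheme is simple to sketch for independent variables: split each variable's range into equal-probability strata, draw one point per stratum, and shuffle the strata independently per variable. The LU-decomposition step that imposes spatial correlation is omitted here, and the function and parameter names are our own, not the paper's.

```python
import random

def latin_hypercube(n_samples, n_vars, seed=0):
    """Latin hypercube sample on the unit hypercube: each variable's
    range is split into n_samples equal strata, one point is drawn
    uniformly inside each stratum, and the stratum order is shuffled
    independently per variable."""
    rng = random.Random(seed)
    columns = []
    for _ in range(n_vars):
        strata = [(k + rng.random()) / n_samples for k in range(n_samples)]
        rng.shuffle(strata)  # decouple the variables' stratum orderings
        columns.append(strata)
    return list(zip(*columns))  # one row per sample point

pts = latin_hypercube(10, 2)
```

Unlike plain Monte Carlo, every stratum of every variable is guaranteed to be hit exactly once, which is the source of LHS's efficiency gain noted in the abstract.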

  2. The Dirichlet-Multinomial model for multivariate randomized response data and small samples

    NARCIS (Netherlands)

    Avetisyan, Marianna; Fox, Gerardus J.A.

    2012-01-01

    In survey sampling the randomized response (RR) technique can be used to obtain truthful answers to sensitive questions. Although the individual answers are masked due to the RR technique, individual (sensitive) response rates can be estimated when observing multivariate response data. The

  3. Pattern imprinting in deep sub-micron static random access memories induced by total dose irradiation

    International Nuclear Information System (INIS)

    Zheng Qi-Wen; Yu Xue-Feng; Cui Jiang-Wei; Guo Qi; Ren Di-Yuan; Cong Zhong-Chao; Zhou Hang

    2014-01-01

Pattern imprinting in deep sub-micron static random access memories (SRAMs) during total dose irradiation is investigated in detail. As the dose accumulates, the data pattern loaded into the memory cells during irradiation is gradually imprinted on their background data pattern. We build a relationship between the memory cell's static noise margin (SNM) and the background data, and study the influence of irradiation on the probability density function of ΔSNM, which is the difference between the two data sides' SNMs, to discuss the reason for pattern imprinting. Finally, we demonstrate that, for micron and deep sub-micron devices, the mechanism of pattern imprinting is the bias-dependent threshold shift of the transistor, but for a deep sub-micron device the shift results from charge trapping in the shallow trench isolation (STI) oxide rather than in the gate oxide, as in micron-scale devices. (condensed matter: structural, mechanical, and thermal properties)

  4. The influence of sampling unit size and spatial arrangement patterns on neighborhood-based spatial structure analyses of forest stands

    Energy Technology Data Exchange (ETDEWEB)

    Wang, H.; Zhang, G.; Hui, G.; Li, Y.; Hu, Y.; Zhao, Z.

    2016-07-01

Aim of study: Neighborhood-based stand spatial structure parameters can quantify and characterize forest spatial structure effectively. How these neighborhood-based structure parameters are influenced by the selection of different numbers of nearest-neighbor trees is unclear, and there is some disagreement in the literature regarding the appropriate number of nearest-neighbor trees to sample around reference trees. Understanding how to efficiently characterize forest structure is critical for forest management. Area of study: Multi-species uneven-aged forests of Northern China. Material and methods: We simulated stands with different spatial structural characteristics and systematically compared their structure parameters when two to eight neighboring trees were selected. Main results: Results showed that values of the uniform angle index calculated in the same stand differed with the size of the structure unit. When tree species and sizes were completely randomly interspersed, different numbers of neighbors had little influence on the mingling and dominance indices. Changes in the mingling or dominance indices caused by different numbers of neighbors occurred when the tree species or size classes were not randomly interspersed, and these changes can be characterized according to the spatial arrangement patterns of tree species and sizes. Research highlights: The number of neighboring trees selected for analyzing stand spatial structure parameters should be fixed. We propose that the four-tree structure unit is the best compromise between sampling accuracy and costs for practical forest management. (Author)

  5. On the expected duration of a search for a fixed pattern in random data

    DEFF Research Database (Denmark)

    Nielsen, Peter Tolstrup

    1973-01-01

An expression is obtained for the expected duration of a search to find a given L-ary sequence in a semi-infinite stream of random L-ary data. The search time is found to be an increasing function of the lengths of the "bifices" of the pattern, where the term bifix denotes a sequence which is both...

  6. Pattern analysis of total item score and item response of the Kessler Screening Scale for Psychological Distress (K6) in a nationally representative sample of US adults

    Directory of Open Access Journals (Sweden)

    Shinichiro Tomitaka

    2017-02-01

Full Text Available Background Several recent studies have shown that total scores on depressive symptom measures in a general population approximate an exponential pattern except for the lower end of the distribution. Furthermore, we confirmed that the exponential pattern is present for the individual item responses on the Center for Epidemiologic Studies Depression Scale (CES-D). To confirm the reproducibility of such findings, we investigated the total score distribution and item responses of the Kessler Screening Scale for Psychological Distress (K6) in a nationally representative study. Methods Data were drawn from the National Survey of Midlife Development in the United States (MIDUS), which comprises four subsamples: (1) a national random digit dialing (RDD) sample, (2) oversamples from five metropolitan areas, (3) siblings of individuals from the RDD sample, and (4) a national RDD sample of twin pairs. K6 items are scored using a 5-point scale: "none of the time," "a little of the time," "some of the time," "most of the time," and "all of the time." The patterns of the total score distribution and item responses were analyzed using graphical analysis and an exponential regression model. Results The total score distributions of the four subsamples exhibited an exponential pattern with similar rate parameters. The item responses of the K6 approximated a linear pattern from "a little of the time" to "all of the time" on log-normal scales, while the "none of the time" response was not related to this exponential pattern. Discussion The total score distribution and item responses of the K6 showed exponential patterns, consistent with other depressive symptom scales.

  7. Betting on Illusory Patterns: Probability Matching in Habitual Gamblers.

    Science.gov (United States)

    Gaissmaier, Wolfgang; Wilke, Andreas; Scheibehenne, Benjamin; McCanney, Paige; Barrett, H Clark

    2016-03-01

    Why do people gamble? A large body of research suggests that cognitive distortions play an important role in pathological gambling. Many of these distortions are specific cases of a more general misperception of randomness, specifically of an illusory perception of patterns in random sequences. In this article, we provide further evidence for the assumption that gamblers are particularly prone to perceiving illusory patterns. In particular, we compared habitual gamblers to a matched sample of community members with regard to how much they exhibit the choice anomaly 'probability matching'. Probability matching describes the tendency to match response proportions to outcome probabilities when predicting binary outcomes. It leads to a lower expected accuracy than the maximizing strategy of predicting the most likely event on each trial. Previous research has shown that an illusory perception of patterns in random sequences fuels probability matching. So does impulsivity, which is also reported to be higher in gamblers. We therefore hypothesized that gamblers will exhibit more probability matching than non-gamblers, which was confirmed in a controlled laboratory experiment. Additionally, gamblers scored much lower than community members on the cognitive reflection task, which indicates higher impulsivity. This difference could account for the difference in probability matching between the samples. These results suggest that gamblers are more willing to bet impulsively on perceived illusory patterns.
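The accuracy gap that makes probability matching a "choice anomaly" is easy to compute. The sketch below is a toy calculation under an assumed outcome probability of p = 0.75, not the study's actual task or data; the function names are ours.

```python
import random

def expected_accuracy_matching(p):
    """Probability matching: predict the frequent outcome with
    probability p, so expected accuracy = p*p + (1-p)*(1-p)."""
    return p * p + (1 - p) * (1 - p)

def expected_accuracy_maximizing(p):
    """Maximizing: always predict the more likely outcome."""
    return max(p, 1 - p)

def simulate_matching(p, trials=100_000, seed=1):
    """Monte Carlo check of the matching accuracy."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        outcome = rng.random() < p
        prediction = rng.random() < p  # matcher mimics outcome frequencies
        hits += outcome == prediction
    return hits / trials
```

For p = 0.75, matching yields 62.5% expected accuracy versus 75% for maximizing — the gap that makes matching the inferior strategy the abstract describes.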

  8. Additive non-uniform random sampling in superimposed fiber Bragg grating strain gauge

    Science.gov (United States)

    Ma, Y. C.; Liu, H. Y.; Yan, S. B.; Yang, Y. H.; Yang, M. W.; Li, J. M.; Tang, J.

    2013-05-01

    This paper demonstrates an additive non-uniform random sampling and interrogation method for dynamic and/or static strain gauge using a reflection spectrum from two superimposed fiber Bragg gratings (FBGs). The superimposed FBGs are designed to generate non-equidistant space of a sensing pulse train in the time domain during dynamic strain gauge. By combining centroid finding with smooth filtering methods, both the interrogation speed and accuracy are improved. A 1.9 kHz dynamic strain is measured by generating an additive non-uniform randomly distributed 2 kHz optical sensing pulse train from a mean 500 Hz triangular periodically changing scanning frequency.

  9. Path integral methods for primordial density perturbations - sampling of constrained Gaussian random fields

    International Nuclear Information System (INIS)

    Bertschinger, E.

    1987-01-01

    Path integrals may be used to describe the statistical properties of a random field such as the primordial density perturbation field. In this framework the probability distribution is given for a Gaussian random field subjected to constraints such as the presence of a protovoid or supercluster at a specific location in the initial conditions. An algorithm has been constructed for generating samples of a constrained Gaussian random field on a lattice using Monte Carlo techniques. The method makes possible a systematic study of the density field around peaks or other constrained regions in the biased galaxy formation scenario, and it is effective for generating initial conditions for N-body simulations with rare objects in the computational volume. 21 references

  10. An efficient method of randomly sampling the coherent angular scatter distribution

    International Nuclear Information System (INIS)

    Williamson, J.F.; Morin, R.L.

    1983-01-01

    Monte Carlo simulations of photon transport phenomena require random selection of an interaction process at each collision site along the photon track. Possible choices are usually limited to photoelectric absorption and incoherent scatter as approximated by the Klein-Nishina distribution. A technique is described for sampling the coherent angular scatter distribution, for the benefit of workers in medical physics. (U.K.)
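The abstract does not reproduce the authors' specific technique, but the generic way to randomly sample a tabulated angular distribution — inverse-transform sampling from a precomputed CDF — can be sketched as follows. The forward-peaked toy distribution is an assumption for illustration, not the actual coherent scatter form factor.

```python
import bisect
import math
import random

def build_cdf(pdf_values):
    """Cumulative table from unnormalized pdf samples on a uniform grid."""
    cdf, total = [], 0.0
    for v in pdf_values:
        total += v
        cdf.append(total)
    return [c / total for c in cdf]

def sample_angle(cdf, angles, rng):
    """Inverse-transform sampling: draw u ~ U(0,1), locate it in the CDF."""
    u = rng.random()
    i = bisect.bisect_left(cdf, u)
    return angles[min(i, len(angles) - 1)]

# toy forward-peaked angular distribution on [0, pi], weighted by sin(theta)
n = 180
angles = [math.pi * k / (n - 1) for k in range(n)]
pdf = [math.exp(-3 * a) * math.sin(a) for a in angles]
cdf = build_cdf(pdf)
rng = random.Random(0)
draws = [sample_angle(cdf, angles, rng) for _ in range(10_000)]
```

The table lookup costs O(log n) per sample via `bisect`, which is why tabulated inverse-transform methods suit Monte Carlo transport codes that sample at every collision site.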

  11. Energy Preserved Sampling for Compressed Sensing MRI

    Directory of Open Access Journals (Sweden)

    Yudong Zhang

    2014-01-01

Full Text Available The sampling patterns, cost functions, and reconstruction algorithms play important roles in optimizing compressed sensing magnetic resonance imaging (CS-MRI). Simple random sampling patterns do not take into account the energy distribution in k-space and result in suboptimal reconstruction of MR images. Therefore, a variety of variable density (VD) based sampling patterns have been developed. To further improve on these, we propose a novel energy preserving sampling (ePRESS) method. In addition, we improve the cost function by introducing phase correction and a region-of-support matrix, and we propose an iterative thresholding algorithm (ITA) to solve the improved cost function. We evaluate the proposed ePRESS sampling method, improved cost function, and ITA reconstruction algorithm on a 2D digital phantom and 2D in vivo MR brains of healthy volunteers. These assessments demonstrate that the proposed ePRESS method performs better than VD, POWER, and BKO; the improved cost function achieves better reconstruction quality than the conventional cost function; and the ITA is faster than SISTA and competitive with FISTA in terms of computation time.

  12. Random vs. systematic sampling from administrative databases involving human subjects.

    Science.gov (United States)

    Hagino, C; Lo, R J

    1998-09-01

Two sampling techniques, simple random sampling (SRS) and systematic sampling (SS), were compared to determine whether they yield similar and accurate distributions for the following four factors: age, gender, geographic location and years in practice. Any point estimate within 7 yr or 7 percentage points of its reference standard (SRS or the entire data set, i.e., the target population) was considered "acceptably similar" to the reference standard. The sampling frame was the entire membership database of the Canadian Chiropractic Association. The two sampling methods were tested using eight different sample sizes of n (50, 100, 150, 200, 250, 300, 500, 800). From the profile/characteristics summaries of four known factors [gender, average age, number (%) of chiropractors in each province and years in practice], between- and within-method chi-squared tests and unpaired t-tests were performed to determine whether any of the differences [descriptively greater than 7% or 7 yr] were also statistically significant. The strengths of the agreements between the provincial distributions were quantified by calculating the percent agreement for each (provincial pairwise-comparison method). Any percent agreement less than 70% was judged to be unacceptable. Our assessments of the two sampling methods (SRS and SS) for the different sample sizes tested suggest that SRS and SS yielded acceptably similar results. Both methods started to yield "correct" sample profiles at approximately the same sample size (n > 200). SS is not only convenient, it can be recommended for sampling from large databases in which the data are listed without any inherent order biases other than alphabetical listing by surname.
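The SRS-versus-SS comparison can be mimicked on a synthetic membership list. The sketch below is an illustration with fabricated ages and fixed seeds, not the study's database; when records are listed in an order unrelated to the variable of interest (as the abstract concludes), both methods recover the population mean.

```python
import random
import statistics

def simple_random_sample(population, n, rng):
    """SRS: every subset of size n is equally likely."""
    return rng.sample(population, n)

def systematic_sample(population, n):
    """SS: take every k-th record (start fixed at 0 for determinism)."""
    k = len(population) // n
    return population[::k][:n]

rng = random.Random(42)
# synthetic membership database: ages listed in an order unrelated to age
ages = [rng.randint(25, 70) for _ in range(5000)]
pop_mean = statistics.mean(ages)
srs = simple_random_sample(ages, 250, rng)
sys_ = systematic_sample(ages, 250)
```

SS would break down if the list order correlated with age (e.g. records sorted by graduation year), which is why the abstract's recommendation is conditional on the absence of inherent order biases.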

  13. Stage-specific sampling by pattern recognition receptors during Candida albicans phagocytosis.

    Directory of Open Access Journals (Sweden)

    Sigrid E M Heinsbroek

    2008-11-01

    Full Text Available Candida albicans is a medically important pathogen, and recognition by innate immune cells is critical for its clearance. Although a number of pattern recognition receptors have been shown to be involved in recognition and phagocytosis of this fungus, the relative role of these receptors has not been formally examined. In this paper, we have investigated the contribution of the mannose receptor, Dectin-1, and complement receptor 3; and we have demonstrated that Dectin-1 is the main non-opsonic receptor involved in fungal uptake. However, both Dectin-1 and complement receptor 3 were found to accumulate at the site of uptake, while mannose receptor accumulated on C. albicans phagosomes at later stages. These results suggest a potential role for MR in phagosome sampling; and, accordingly, MR deficiency led to a reduction in TNF-alpha and MCP-1 production in response to C. albicans uptake. Our data suggest that pattern recognition receptors sample the fungal phagosome in a sequential fashion.

  14. Eating patterns in a population-based sample of children aged 5 to 7 years

    DEFF Research Database (Denmark)

    Micali, Nadia; Simonoff, Emily; Elberling, Hanne

    2011-01-01

    There is still a lack of research on childhood eating patterns and their correlates in relation to psychopathology and parentally perceived impact in general population samples. We aimed to determine which eating patterns were more likely to be identified as problematic by parents, and their impact...

  15. Additive non-uniform random sampling in superimposed fiber Bragg grating strain gauge

    International Nuclear Information System (INIS)

    Ma, Y C; Liu, H Y; Yan, S B; Li, J M; Tang, J; Yang, Y H; Yang, M W

    2013-01-01

    This paper demonstrates an additive non-uniform random sampling and interrogation method for dynamic and/or static strain gauge using a reflection spectrum from two superimposed fiber Bragg gratings (FBGs). The superimposed FBGs are designed to generate non-equidistant space of a sensing pulse train in the time domain during dynamic strain gauge. By combining centroid finding with smooth filtering methods, both the interrogation speed and accuracy are improved. A 1.9 kHz dynamic strain is measured by generating an additive non-uniform randomly distributed 2 kHz optical sensing pulse train from a mean 500 Hz triangular periodically changing scanning frequency. (paper)

  16. Visual discrimination and short-term memory for random patterns in patients with a focal cortical lesion

    NARCIS (Netherlands)

    Greenlee, MW; Koessler, M; Cornelissen, FW; Mergner, T

    1997-01-01

    Visual discrimination and short-term recognition memory for computer-generated random patterns were explored in 23 patients with a postsurgical lesion in one of the cortical hemispheres. Their results are compared with those of 23 age-matched volunteers. In a same-different forced-choice

  17. The Dirichlet-Multinomial Model for Multivariate Randomized Response Data and Small Samples

    Science.gov (United States)

    Avetisyan, Marianna; Fox, Jean-Paul

    2012-01-01

    In survey sampling the randomized response (RR) technique can be used to obtain truthful answers to sensitive questions. Although the individual answers are masked due to the RR technique, individual (sensitive) response rates can be estimated when observing multivariate response data. The beta-binomial model for binary RR data will be generalized…

  18. Landslide Susceptibility Assessment Using Frequency Ratio Technique with Iterative Random Sampling

    Directory of Open Access Journals (Sweden)

    Hyun-Joo Oh

    2017-01-01

Full Text Available This paper assesses the performance of landslide susceptibility analysis using the frequency ratio (FR) with iterative random sampling. A pair of before-and-after digital aerial photographs with 50 cm spatial resolution was used to detect landslide occurrences in the Yongin area, Korea. Iterative random sampling was run ten times in total, and each time it was applied to the training and validation datasets. Thirteen landslide causative factors were derived from the topographic, soil, forest, and geological maps. The FR scores were calculated from the causative factors and training occurrences repeatedly, ten times. The ten landslide susceptibility maps were obtained from the integration of the causative factors with their assigned FR scores. The landslide susceptibility maps were validated using each validation dataset. The FR method achieved susceptibility accuracies from 89.48% to 93.21%, consistently higher than 89%. Moreover, the ten-fold iterative FR modeling may contribute to a better understanding of the regularized relationship between the causative factors and landslide susceptibility. This makes it possible to incorporate knowledge-driven considerations of the causative factors into the landslide susceptibility analysis, and the approach can also be extended to other areas.
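The frequency ratio statistic itself is a one-line computation: the proportion of landslide cells falling in a factor class divided by the proportion of all cells in that class. A minimal sketch on a toy grid (the cell ids and class labels are invented for illustration):

```python
from collections import Counter

def frequency_ratio(class_of_cell, landslide_cells, all_cells):
    """Frequency ratio per factor class:
    FR = (landslide cells in class / total landslide cells)
         / (cells in class / total cells).
    FR > 1 marks a class positively related to landslide occurrence."""
    land_counts = Counter(class_of_cell[c] for c in landslide_cells)
    all_counts = Counter(class_of_cell[c] for c in all_cells)
    n_land, n_all = len(landslide_cells), len(all_cells)
    return {cls: (land_counts.get(cls, 0) / n_land) / (all_counts[cls] / n_all)
            for cls in all_counts}

# toy grid: cells 0-4 are 'steep', 5-9 are 'gentle'; landslides on steep cells
class_of_cell = {c: "steep" if c < 5 else "gentle" for c in range(10)}
fr = frequency_ratio(class_of_cell, landslide_cells=[0, 1, 2, 3],
                     all_cells=list(range(10)))
```

A susceptibility map is then the cell-wise sum of FR scores over all causative factors; the iterative part of the paper repeats this with ten random training/validation splits.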

  19. Differences in Movement Pattern and Detectability between Males and Females Influence How Common Sampling Methods Estimate Sex Ratio.

    Directory of Open Access Journals (Sweden)

    João Fabrício Mota Rodrigues

Full Text Available Sampling biodiversity is an essential step for conservation, and understanding the efficiency of sampling methods allows us to estimate the quality of our biodiversity data. Sex ratio is an important population characteristic, but until now, no study has evaluated how efficiently the sampling methods commonly used in biodiversity surveys estimate the sex ratio of populations. We used a virtual ecologist approach to investigate whether active and passive capture methods are able to accurately sample a population's sex ratio and whether differences in movement pattern and detectability between males and females produce biased estimates of sex ratios when using these methods. Our simulation allowed the recognition of individuals, similar to mark-recapture studies. We found that differences in both movement patterns and detectability between males and females produce biased estimates of sex ratios. However, increasing the sampling effort or the number of sampling days improves the ability of passive or active capture methods to properly sample the sex ratio. Thus, prior knowledge regarding movement patterns and detectability of species is important information to guide field studies aiming to understand sex-ratio-related patterns.

  20. Differences in Movement Pattern and Detectability between Males and Females Influence How Common Sampling Methods Estimate Sex Ratio.

    Science.gov (United States)

    Rodrigues, João Fabrício Mota; Coelho, Marco Túlio Pacheco

    2016-01-01

Sampling biodiversity is an essential step for conservation, and understanding the efficiency of sampling methods allows us to estimate the quality of our biodiversity data. Sex ratio is an important population characteristic, but until now, no study has evaluated how efficiently the sampling methods commonly used in biodiversity surveys estimate the sex ratio of populations. We used a virtual ecologist approach to investigate whether active and passive capture methods are able to accurately sample a population's sex ratio and whether differences in movement pattern and detectability between males and females produce biased estimates of sex ratios when using these methods. Our simulation allowed the recognition of individuals, similar to mark-recapture studies. We found that differences in both movement patterns and detectability between males and females produce biased estimates of sex ratios. However, increasing the sampling effort or the number of sampling days improves the ability of passive or active capture methods to properly sample the sex ratio. Thus, prior knowledge regarding movement patterns and detectability of species is important information to guide field studies aiming to understand sex-ratio-related patterns.

  1. A simple sample size formula for analysis of covariance in cluster randomized trials.

    NARCIS (Netherlands)

    Teerenstra, S.; Eldridge, S.; Graff, M.J.; Hoop, E. de; Borm, G.F.

    2012-01-01

    For cluster randomized trials with a continuous outcome, the sample size is often calculated as if an analysis of the outcomes at the end of the treatment period (follow-up scores) would be performed. However, often a baseline measurement of the outcome is available or feasible to obtain. An

  2. Application of a stratified random sampling technique to the estimation and minimization of respirable quartz exposure to underground miners

    International Nuclear Information System (INIS)

    Makepeace, C.E.; Horvath, F.J.; Stocker, H.

    1981-11-01

The aim of a stratified random sampling plan is to provide the best estimate (in the absence of full-shift personal gravimetric sampling) of personal exposure to respirable quartz among underground miners. One also gains information about the exposure distribution of all the miners at the same time. Three variables (or strata) are considered in the present scheme: locations, occupations and times of sampling. Random sampling within each stratum ensures that each location, occupation and time of sampling has an equal opportunity of being selected without bias. Following implementation of the plan and analysis of the collected data, one can determine the individual exposures and the mean. This information can then be used to identify those groups whose exposure contributes significantly to the collective exposure. In turn, this identification, along with other considerations, allows the mine operator to carry out a cost-benefit optimization and eventual implementation of engineering controls for these groups. This optimization and engineering control procedure, together with the random sampling plan, can then be used in an iterative manner to minimize the mean value of the distribution and the collective exposures.
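A stratification scheme like the one described — random draws within every location × occupation × time cell — can be sketched with a dictionary of strata and per-stratum random sampling. The roster, field names, and per-stratum quota below are hypothetical, chosen only to make the mechanics concrete.

```python
import itertools
import random

def stratified_sample(workers, strata_keys, per_stratum, seed=0):
    """Group workers by stratum (e.g. location x occupation) and draw
    the same number of workers uniformly at random within each stratum,
    so every stratum is represented without selection bias."""
    rng = random.Random(seed)
    strata = {}
    for w in workers:
        strata.setdefault(tuple(w[k] for k in strata_keys), []).append(w)
    sample = []
    for members in strata.values():
        sample.extend(rng.sample(members, min(per_stratum, len(members))))
    return sample

# hypothetical roster: 3 locations x 2 occupations, 10 workers per stratum
combos = itertools.cycle(itertools.product("ABC", ("driller", "hauler")))
workers = [{"id": i, "location": loc, "occupation": occ}
           for i, (loc, occ) in enumerate(itertools.islice(combos, 60))]
sample = stratified_sample(workers, ("location", "occupation"), per_stratum=2)
```

Randomizing within strata rather than over the whole roster is what guarantees the "equal opportunity of being selected" property the plan relies on.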

  3. Event-triggered synchronization for reaction-diffusion complex networks via random sampling

    Science.gov (United States)

    Dong, Tao; Wang, Aijuan; Zhu, Huiyun; Liao, Xiaofeng

    2018-04-01

In this paper, the synchronization problem of reaction-diffusion complex networks (RDCNs) with Dirichlet boundary conditions is considered, where the data is sampled randomly. An event-triggered controller based on the sampled data is proposed, which can reduce the number of controller updates and the communication load. Under this strategy, the synchronization problem of the diffusion complex network is equivalently converted to the stability of a reaction-diffusion complex dynamical system with time delay. By using the matrix inequality technique and the Lyapunov method, synchronization conditions for the RDCNs are derived, which are dependent on the diffusion term. Moreover, it is found that the proposed control strategy naturally rules out Zeno behavior. Finally, a numerical example is given to verify the obtained results.

  4. Eating patterns in a population-based sample of children aged 5 to 7 years

    DEFF Research Database (Denmark)

    Micali, Nadia; Simonoff, Emily; Elberling, Hanne

    2011-01-01

    There is still a lack of research on childhood eating patterns and their correlates in relation to psychopathology and parentally perceived impact in general population samples. We aimed to determine which eating patterns were more likely to be identified as problematic by parents, and their impact...... and association with childhood psychopathology (emotional, behavioral, and pervasive developmental disorders) in a general population child cohort....

  5. Evaluation of a Class of Simple and Effective Uncertainty Methods for Sparse Samples of Random Variables and Functions

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bonney, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Schroeder, Benjamin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Weirs, V. Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-11-01

When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: the central 95% of the response; and a 10^-4 probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depend on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large database and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
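One simple, distribution-free example of the kind of sparse-sample bounding the report studies uses order statistics: the probability that the maximum of n i.i.d. samples from any continuous distribution exceeds (and so bounds) a given quantile is 1 - q^n, independent of the distribution's shape. The sketch below illustrates that standard formula; its use as an example here is ours, not necessarily one of the report's methods.

```python
def coverage_of_sample_max(n, quantile=0.95):
    """Probability that the maximum of n i.i.d. samples from *any*
    continuous distribution exceeds the true `quantile` of that
    distribution: P(max >= q-quantile) = 1 - quantile**n.
    This is distribution-free (no shape assumption)."""
    return 1 - quantile ** n
```

With only 10 samples, the sample maximum bounds the 95th percentile about 40% of the time; 59 samples are needed before it does so with 95% confidence (the classic distribution-free 95/95 rule), which quantifies the cost-versus-reliability tradeoff the abstract describes.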

  6. Fuzzy C-Means Clustering Model Data Mining For Recognizing Stock Data Sampling Pattern

    Directory of Open Access Journals (Sweden)

    Sylvia Jane Annatje Sumarauw

    2007-06-01

    Full Text Available Abstract The capital market has been beneficial to companies and investors. For investors, the capital market provides two economic advantages, namely dividends and capital gains, and a non-economic one, a voting share in the Shareholders' General Meeting. But it can also penalize the share owners. To protect themselves from this risk, investors should predict the prospects of their companies. Because shares are an abstract commodity, share quality is determined by the validity of the company profile information. Information on stock value fluctuations from the Jakarta Stock Exchange can be a useful consideration and a good measurement for data analysis. In the context of protecting shareholders from risk, this research focuses on stock data sample categories, or stock data sample patterns, using the Fuzzy c-Means Clustering Model, which provides useful information for the investors. The research analyses stock data such as Individual Index, Volume and Amount for the Property and Real Estate Emitter Group at the Jakarta Stock Exchange from January 1 till December 31 of 2004. The mining process follows the Cross Industry Standard Process model for Data Mining (CRISP-DM) in the form of a cycle with these steps: Business Understanding, Data Understanding, Data Preparation, Modelling, Evaluation and Deployment. At the modelling step, the Fuzzy c-Means Clustering Model is applied. The Data Mining Fuzzy c-Means Clustering Model can analyze stock data in a big database with many complex variables, especially for finding the data sample pattern, and then build a Fuzzy Inference System that maps inputs to outputs based on fuzzy logic by recognising the pattern. Keywords: Data Mining, Fuzzy c-Means Clustering Model, Pattern Recognition
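A minimal fuzzy c-means iteration, in the spirit of the clustering model described above, can be sketched as follows. This is a 1-D toy version with made-up data, not the paper's stock-index implementation:

```python
import random

def fuzzy_c_means(xs, c=2, m=2.0, iters=100, seed=0):
    """Minimal 1-D fuzzy c-means: returns cluster centers and the
    membership matrix u[i][j] (degree of point i in cluster j)."""
    rng = random.Random(seed)
    # random initial memberships; each row normalised to sum to 1
    u = []
    for _ in xs:
        row = [rng.random() for _ in range(c)]
        s = sum(row)
        u.append([v / s for v in row])
    centers = [0.0] * c
    for _ in range(iters):
        # update centers as fuzzily weighted means
        for j in range(c):
            w = [u[i][j] ** m for i in range(len(xs))]
            centers[j] = sum(wi * x for wi, x in zip(w, xs)) / sum(w)
        # update memberships from inverse relative distances
        for i, x in enumerate(xs):
            d = [abs(x - cj) + 1e-12 for cj in centers]
            for j in range(c):
                u[i][j] = 1.0 / sum((d[j] / d[k]) ** (2.0 / (m - 1.0))
                                    for k in range(c))
    return centers, u

# toy "index" values forming two obvious groups
data = [1.0, 1.1, 0.9, 5.0, 5.2, 4.8]
centers, u = fuzzy_c_means(data)
```

With the fuzzifier m close to 1 the memberships harden toward ordinary k-means; larger m gives softer, more overlapping clusters.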

  7. Prognostic interaction patterns in diabetes mellitus II: A random-matrix-theory relation

    Science.gov (United States)

    Rai, Aparna; Pawar, Amit Kumar; Jalan, Sarika

    2015-08-01

    We analyze protein-protein interactions in diabetes mellitus II and its normal counterpart under the combined framework of random matrix theory and network biology. This disease is the fifth-leading cause of death in high-income countries and an epidemic in developing countries, affecting around 8 % of the total adult population in the world. Treatment at the advanced stage is difficult and challenging, making early detection a high priority in the cure of the disease. Our investigation reveals specific structural patterns important for the occurrence of the disease. In addition to the structural parameters, the spectral properties reveal the top contributing nodes from localized eigenvectors, which turn out to be significant for the occurrence of the disease. Our analysis is time-efficient and cost-effective, bringing a new horizon in the field of medicine by highlighting major pathways involved in the disease. The analysis provides a direction for the development of novel drugs and therapies in curing the disease by targeting specific interaction patterns instead of a single protein.

  8. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

    Full Text Available Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, the confidence level, the expected proportion of the outcome variable (for categorical variables) or the standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) of the study. The greater the required precision, the greater the required sample size. Sampling Techniques: The probability sampling techniques applied in health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are recommended over nonprobability sampling techniques, because the results of the study can be generalized to the target population.
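The sample-size estimation for a proportion described above is commonly computed with Cochran's formula n = z²p(1-p)/d², where z is the normal deviate for the confidence level, p the expected proportion, and d the margin of error. A short sketch (the finite-population correction implied by "size of the study population" is omitted here):

```python
import math

def sample_size_proportion(p=0.5, margin=0.05, z=1.96):
    """Cochran's formula: n = z^2 * p * (1 - p) / margin^2, rounded up."""
    return math.ceil(z * z * p * (1.0 - p) / (margin * margin))

n_default = sample_size_proportion()            # classic 385 for p=0.5, +/-5%, 95% CI
n_tighter = sample_size_proportion(margin=0.03)  # tighter precision needs more subjects
```

The tighter-margin call illustrates the article's point that greater precision demands a larger sample: shrinking the margin from 5% to 3% roughly triples the required size.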

  9. Detection of sunn pest-damaged wheat samples using visible/near-infrared spectroscopy based on pattern recognition.

    Science.gov (United States)

    Basati, Zahra; Jamshidi, Bahareh; Rasekh, Mansour; Abbaspour-Gilandeh, Yousef

    2018-05-30

    The presence of sunn pest-damaged grains in a wheat mass reduces the quality of flour and bread produced from it. Therefore, it is essential to assess the quality of samples at wheat collection and storage centers and flour mills. In this research, the capability of visible/near-infrared (Vis/NIR) spectroscopy combined with pattern recognition methods was investigated for discrimination of wheat samples with different percentages of sunn pest-damaged grains. To this end, various samples belonging to five classes (healthy and 5%, 10%, 15% and 20% unhealthy) were analyzed using Vis/NIR spectroscopy (wavelength range of 350-1000 nm) based on both supervised and unsupervised pattern recognition methods. Principal component analysis (PCA) and hierarchical cluster analysis (HCA) were used as the unsupervised techniques, and soft independent modeling of class analogies (SIMCA) and partial least squares-discriminant analysis (PLS-DA) as the supervised methods. The results showed that Vis/NIR spectra of healthy samples were correctly clustered using both PCA and HCA. Due to the high overlap between the four unhealthy classes (5%, 10%, 15% and 20%), it was not possible to discriminate all the unhealthy samples into individual classes. However, when considering only the two main categories of healthy and unhealthy, an acceptable degree of separation between the classes was obtained after classification with the supervised pattern recognition methods SIMCA and PLS-DA. SIMCA based on PCA modeling correctly classified samples into the two classes of healthy and unhealthy with a classification accuracy of 100%. Moreover, the wavelengths of 839 nm, 918 nm and 995 nm had greater discriminating power than other wavelengths for separating the two classes of healthy and unhealthy. It was also concluded that PLS-DA provides excellent classification of healthy and unhealthy samples (R² = 0.973 and RMSECV = 0.057). Therefore, Vis/NIR spectroscopy based on pattern recognition techniques is a promising tool for discriminating sunn pest-damaged wheat samples.

  10. Randomization of grab-sampling strategies for estimating the annual exposure of U miners to Rn daughters.

    Science.gov (United States)

    Borak, T B

    1986-04-01

    Periodic grab sampling in combination with time-of-occupancy surveys has been the accepted procedure for estimating the annual exposure of underground U miners to Rn daughters. Temporal variations in the concentration of potential alpha energy in the mine generate uncertainties in this process. A system to randomize the selection of locations for measurement is described which can reduce uncertainties and eliminate systematic biases in the data. In general, a sample frequency of 50 measurements per year is sufficient to satisfy the criteria that the annual exposure be determined in working level months to within +/- 50% of the true value with a 95% level of confidence. Suggestions for implementing this randomization scheme are presented.

  11. A systematic examination of a random sampling strategy for source apportionment calculations.

    Science.gov (United States)

    Andersson, August

    2011-12-15

    Estimating the relative contributions from multiple potential sources of a specific component in a mixed environmental matrix is a general challenge in diverse fields such as atmospheric, environmental and earth sciences. Perhaps the most common strategy for tackling such problems is by setting up a system of linear equations for the fractional influence of different sources. Even though an algebraic solution of this approach is possible for the common situation with N+1 sources and N source markers, such methodology introduces a bias, since it is implicitly assumed that the calculated fractions and the corresponding uncertainties are independent of the variability of the source distributions. Here, a random sampling (RS) strategy for accounting for such statistical bias is examined by investigating rationally designed synthetic data sets. This random sampling methodology is found to be robust and accurate with respect to reproducibility and predictability. This method is also compared to a numerical integration solution for a two-source situation where source variability also is included. A general observation from this examination is that the variability of the source profiles not only affects the calculated precision but also the mean/median source contributions. Copyright © 2011 Elsevier B.V. All rights reserved.
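The bias the author describes can be illustrated with a toy two-source, one-marker version of the random sampling (RS) strategy: re-draw the source signatures from their distributions on each trial, solve the linear mixing equation, and collect the resulting fractional contributions. All numbers below are invented for illustration (loosely styled on an isotope-like marker); this is not the paper's implementation:

```python
import random
import statistics

def source_fractions(obs, s1_mu, s1_sd, s2_mu, s2_sd, n=10000, seed=1):
    """Two sources, one marker: solve obs = f*s1 + (1-f)*s2 for f,
    re-drawing the source signatures each trial so their variability
    propagates into the estimated fraction."""
    rng = random.Random(seed)
    fs = []
    for _ in range(n):
        s1 = rng.gauss(s1_mu, s1_sd)
        s2 = rng.gauss(s2_mu, s2_sd)
        if abs(s1 - s2) < 1e-9:
            continue  # degenerate draw: sources indistinguishable
        f = (obs - s2) / (s1 - s2)
        fs.append(min(1.0, max(0.0, f)))  # clip to the physical range [0, 1]
    return statistics.mean(fs), statistics.median(fs)

mean_f, median_f = source_fractions(obs=-26.0, s1_mu=-28.0, s1_sd=1.0,
                                    s2_mu=-21.0, s2_sd=1.0)
```

Because the fraction is a nonlinear (ratio) function of the source signatures, the mean and median of the sampled fractions can shift away from the point estimate obtained from the mean signatures alone, which is exactly the effect the study quantifies.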

  12. A Systematic Review of Surgical Randomized Controlled Trials: Part 2. Funding Source, Conflict of Interest, and Sample Size in Plastic Surgery.

    Science.gov (United States)

    Voineskos, Sophocles H; Coroneos, Christopher J; Ziolkowski, Natalia I; Kaur, Manraj N; Banfield, Laura; Meade, Maureen O; Chung, Kevin C; Thoma, Achilleas; Bhandari, Mohit

    2016-02-01

    The authors examined industry support, conflict of interest, and sample size in plastic surgery randomized controlled trials that compared surgical interventions. They hypothesized that industry-funded trials demonstrate statistically significant outcomes more often, and randomized controlled trials with small sample sizes report statistically significant results more frequently. An electronic search identified randomized controlled trials published between 2000 and 2013. Independent reviewers assessed manuscripts and performed data extraction. Funding source, conflict of interest, primary outcome direction, and sample size were examined. Chi-squared and independent-samples t tests were used in the analysis. The search identified 173 randomized controlled trials, of which 100 (58 percent) did not acknowledge funding status. A relationship between funding source and trial outcome direction was not observed. Both funding status and conflict of interest reporting improved over time. Only 24 percent (six of 25) of industry-funded randomized controlled trials reported authors to have independent control of data and manuscript contents. The mean number of patients randomized was 73 per trial (median, 43; minimum, 3; maximum, 936). Small trials were not found to be positive more often than large trials (p = 0.87). Randomized controlled trials with small sample size were common; however, this provides great opportunity for the field to engage in further collaboration and produce larger, more definitive trials. Reporting of trial funding and conflict of interest is historically poor, but it greatly improved over the study period. Underreporting at author and journal levels remains a limitation when assessing the relationship between funding source and trial outcomes. Improved reporting and manuscript control should be goals that both authors and journals can actively achieve.

  13. Meat Consumption Patterns among Different Income Groups in Imo ...

    African Journals Online (AJOL)

    This research examined meat consumption patterns among different income groups in Imo State, Nigeria. A combination of purposive and simple random sampling techniques was used to select the markets and 200 respondents. The result of cross price elasticity of meat and fish showed that they were substitutes, with cross ...

  14. Distributed fiber sparse-wideband vibration sensing by sub-Nyquist additive random sampling

    Science.gov (United States)

    Zhang, Jingdong; Zheng, Hua; Zhu, Tao; Yin, Guolu; Liu, Min; Bai, Yongzhong; Qu, Dingrong; Qiu, Feng; Huang, Xianbing

    2018-05-01

    The round-trip time of the light pulse limits the maximum detectable vibration frequency response range of phase-sensitive optical time domain reflectometry (φ-OTDR). Unlike the uniform laser pulse interval in conventional φ-OTDR, we randomly modulate the pulse interval, so that an equivalent sub-Nyquist additive random sampling (sNARS) is realized for every sensing point of the long interrogation fiber. For a φ-OTDR system with 10 km sensing length, the sNARS method is optimized by theoretical analysis and Monte Carlo simulation, and the experimental results verify that a sparse wide-band signal can be identified and reconstructed. Such a method can broaden the vibration frequency response range of φ-OTDR, which is of great significance in sparse-wideband-frequency vibration signal detection, such as rail track monitoring and metal defect detection.
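Additive random sampling of the kind described can be sketched numerically: draw each sampling interval as a mean spacing plus a random term, then evaluate a nonuniform DFT on the resulting sample times. In the toy example below (all parameters invented, unrelated to the 10 km system), a 40 Hz tone is identified even though the average sampling rate of 50 Hz gives a nominal Nyquist limit of only 25 Hz, because the random intervals decorrelate the alias:

```python
import cmath
import math
import random

random.seed(0)

# average rate 50 Hz (nominal Nyquist 25 Hz); the tone sits at 40 Hz
f0, mean_dt, jitter, n = 40.0, 0.02, 0.009, 400
times, t = [], 0.0
for _ in range(n):
    t += mean_dt + random.uniform(-jitter, jitter)  # additive random interval
    times.append(t)
samples = [math.cos(2 * math.pi * f0 * tk) for tk in times]

def spectrum(freq):
    """Nonuniform-DFT magnitude at a single candidate frequency."""
    return abs(sum(x * cmath.exp(-2j * math.pi * freq * tk)
                   for x, tk in zip(samples, times)))

# the strongest line on a 1-60 Hz grid should be the 40 Hz tone
peak = max(range(1, 61), key=spectrum)
```

With uniform 50 Hz sampling the 40 Hz tone would alias onto 10 Hz; the randomized intervals spread that alias energy into a low noise floor instead.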

  15. Spatial pattern of Baccharis platypoda shrub as determined by sex and life stages

    Science.gov (United States)

    Fonseca, Darliana da Costa; de Oliveira, Marcio Leles Romarco; Pereira, Israel Marinho; Gonzaga, Anne Priscila Dias; de Moura, Cristiane Coelho; Machado, Evandro Luiz Mendonça

    2017-11-01

    Spatial patterns of dioecious species can be determined by their nutritional requirements and intraspecific competition, apart from being a response to environmental heterogeneity. The aim of the study was to evaluate the spatial pattern of populations of a dioecious shrub with respect to sex and reproductive stage of individuals. Sampling was carried out in three areas located in the meridional portion of Serra do Espinhaço, wherein individuals of the studied species were mapped. The spatial pattern was determined through O-ring analysis and Ripley's K-function, and the distribution of individuals' frequencies was verified through the χ² test. Populations in two areas showed an aggregate spatial pattern tending towards random or uniform according to the observed scale. Male and female adults presented an aggregate pattern at smaller scales, while random and uniform patterns were verified above 20 m for individuals of both sexes in areas A2 and A3. Young individuals presented an aggregate pattern in all areas and spatial independence in relation to adult individuals, especially female plants. The interactions between individuals of both genders presented spatial independence with respect to spatial distribution. Baccharis platypoda showed characteristics in accordance with the spatial distribution of savannic and dioecious species, whereas the population was aggregated tending towards random at greater spatial scales. Young individuals showed an aggregated pattern at different scales compared to adults, without positive association between them. Female and male adult individuals presented similar characteristics, confirming that adult individuals at greater scales are randomly distributed despite their distinct preferences for environments with moisture variation.

  16. Effect of intradermal human recombinant copper-zinc superoxide dismutase on random pattern flaps in rats.

    Science.gov (United States)

    Schein, Ophir; Westreich, Melvyn; Shalom, Avshalom

    2013-09-01

    Studies have focused on enhancing flap viability using superoxide dismutase (SOD), but only a few used SOD of human origin, and most gave the compound systemically. We evaluated the ability of SOD to improve random skin flap survival using human recombinant copper-zinc superoxide dismutase (Hr-CuZnSOD) in variable doses, injected intradermally into the flap. Seventy male Sprague Dawley rats were randomly assigned to 4 groups. Cephalic random pattern flaps were elevated on their backs, and intradermal injections of different dosages of Hr-CuZnSOD were given 15 minutes before surgery. Flap survival was evaluated by fluorescein fluorescence. Analysis of variance (ANOVA) and t test statistical analyses were performed. Flap survival in all treated groups was significantly better than in the controls. The beneficial effect of Hr-CuZnSOD on flap survival is attained when it is given intradermally into the flap tissue. Theoretically, Hr-CuZnSOD delivered with local anesthetics used in flap elevation may be a valuable clinical tool. Copyright © 2012 Wiley Periodicals, Inc.

  17. Climatologies from satellite measurements: the impact of orbital sampling on the standard error of the mean

    Directory of Open Access Journals (Sweden)

    M. Toohey

    2013-04-01

    Full Text Available Climatologies of atmospheric observations are often produced by binning measurements according to latitude and calculating zonal means. The uncertainty in these climatological means is characterised by the standard error of the mean (SEM). However, the usual estimator of the SEM, i.e., the sample standard deviation divided by the square root of the sample size, holds only for uncorrelated, randomly sampled measurements. Measurements of the atmospheric state along a satellite orbit cannot always be considered independent because (a) the time-space interval between two nearest observations is often smaller than the typical scale of variations in the atmospheric state, and (b) the regular time-space sampling pattern of a satellite instrument strongly deviates from random sampling. We have developed a numerical experiment where global chemical fields from a chemistry climate model are sampled according to real sampling patterns of satellite-borne instruments. As case studies, the model fields are sampled using sampling patterns of the Michelson Interferometer for Passive Atmospheric Sounding (MIPAS) and Atmospheric Chemistry Experiment Fourier-Transform Spectrometer (ACE-FTS) satellite instruments. Through an iterative subsampling technique, and by incorporating information on the random errors of the MIPAS and ACE-FTS measurements, we produce empirical estimates of the standard error of monthly mean zonal mean model O3 in 5° latitude bins. We find that generally the classic SEM estimator is a conservative estimate of the SEM, i.e., the empirical SEM is often less than or approximately equal to the classic estimate. Exceptions occur only when natural variability is larger than the random measurement error, and specifically in instances where the zonal sampling distribution shows non-uniformity with a similar zonal structure as variations in the sampled field, leading to maximum sensitivity to arbitrary phase shifts between the sample distribution and the sampled field.
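As a baseline for the comparison above, the classic SEM estimator can be checked by Monte Carlo in the idealized case it assumes, i.e., uncorrelated random sampling. This is a toy sketch, not the paper's orbital-sampling experiment:

```python
import random
import statistics

random.seed(42)

def empirical_sem(draw, n, trials=2000):
    """SD of the sample mean over many repeated random samplings."""
    means = [statistics.mean(draw() for _ in range(n)) for _ in range(trials)]
    return statistics.stdev(means)

n = 50
# independent sampling from U(0,1): sigma is known, so the classic
# sigma / sqrt(n) value is exact and the empirical SEM should match it
classic = (1.0 / 12.0) ** 0.5 / n ** 0.5
empirical = empirical_sem(lambda: random.random(), n)
ratio = empirical / classic
```

When the samples are instead correlated or follow a regular space-time pattern, the two quantities diverge, which is precisely the effect the subsampling experiment measures.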

  18. Comparison of sample preparation procedures on metal(loid) fractionation patterns in lichens.

    Science.gov (United States)

    Kroukamp, E M; Godeto, T W; Forbes, P B C

    2017-08-13

    The effects of different sample preparation strategies and storage on metal(loid) fractionation trends in plant material are largely underresearched. In this study, a bulk sample of the lichen Parmotrema austrosinense (Zahlbr.) Hale was analysed for its total extractable metal(loid) content by ICP-MS and was determined to be adequately homogenous. Samples were prepared utilising a range of sample preservation techniques and subjected to a modified sequential extraction procedure or to total metal extraction. Both experiments were repeated after 1-month storage at 4 °C. Cryogenic freezing gave the best reproducibility for total extractable elemental concentrations between months, indicating this to be the most suitable method of sample preparation in such studies. The combined extraction efficiencies were >82% for As, Cu, Mn, Pb, Sr and Zn but poor for other elements, where the sample preparation strategies 'no sample preparation' and 'dried in a desiccator' had the best extraction recoveries. Cryogenic freezing procedures had a significant effect on fractionation results, underscoring the importance of sample cleaning and preservation when species fractionation patterns are of interest. This study also shows that the assumption that species stability can be ensured through cryopreservation and freeze drying techniques needs to be revisited.

  19. Oral Health Patterns among Schoolchildren in Mafraq Governorate, Jordan

    Science.gov (United States)

    ALBashtawy, Mohammed

    2012-01-01

    Little is known about the oral hygiene patterns among schoolchildren in Jordan. A school-based cross-sectional study was performed from January to March 2010. A simple random sampling method was used. Each student participant completed a detailed questionnaire regarding oral hygiene habits. Data were coded and analyzed using SPSS software version…

  20. Behavioural sampling techniques and activity pattern of Indian Pangolin Manis crassicaudata (Mammalia: Manidae in captivity

    Directory of Open Access Journals (Sweden)

    R.K. Mohapatra

    2013-12-01

    Full Text Available The study presents data on six Indian Pangolins Manis crassicaudata observed in captivity at the Pangolin Conservation Breeding Centre, Nandankanan, Odisha, India, over 1377 hours of video recordings for each pangolin between 1500hr and 0800hr on 81 consecutive observational days. Video recordings were made through digital systems assisted by infrared-enabled CCTV cameras. The data highlight patterns relating to 12 different behaviours and enclosure utilization. Different interval periods for sampling of instantaneous behaviour from video recordings have been evaluated to develop optimal study methods for the future. The activity budgets of pangolins displayed natural patterns of nocturnal activity with a peak between 20:00-21:00 hr. When out of their burrow, they spent about 59% of the time walking in the enclosure, and 14% of the time feeding. The repeatability of the behaviours has a significant negative correlation with the mean time spent in that behaviour. Focal behavioural samples significantly correlated with instantaneous samples up to a 15-minute interval. The correlation values gradually decreased with the increase in sampling interval. The results indicate that results obtained from focal sampling and instantaneous sampling with relatively shorter intervals (≤5 minutes) are about equally reliable. The study suggests use of focal sampling, instead of instantaneous sampling, to record behaviour relating to social interactions.

  1. Tobacco smoking surveillance: is quota sampling an efficient tool for monitoring national trends? A comparison with a random cross-sectional survey.

    Directory of Open Access Journals (Sweden)

    Romain Guignard

    Full Text Available OBJECTIVES: It is crucial for policy makers to monitor the evolution of tobacco smoking prevalence. In France, this monitoring is based on a series of cross-sectional general population surveys, the Health Barometers, conducted every five years and based on random samples. A methodological study has been carried out to assess the reliability of a monitoring system based on regular quota sampling surveys for smoking prevalence. DESIGN / OUTCOME MEASURES: In 2010, current and daily tobacco smoking prevalences obtained in a quota survey on 8,018 people were compared with those of the 2010 Health Barometer carried out on 27,653 people. Prevalences were assessed separately according to the telephone equipment of the interviewee (landline phone owner vs "mobile-only"), and logistic regressions were conducted in the pooled database to assess the impact of the telephone equipment and of the survey mode on the prevalences found. Finally, logistic regressions adjusted for sociodemographic characteristics were conducted in the random sample in order to determine the impact of the number of calls needed to interview "hard-to-reach" people on the prevalence found. RESULTS: Current and daily prevalences were higher in the random sample (respectively 33.9% and 27.5% among 15-75 year-olds) than in the quota sample (respectively 30.2% and 25.3%). In both surveys, current and daily prevalences were lower among landline phone owners (respectively 31.8% and 25.5% in the random sample and 28.9% and 24.0% in the quota survey). The required number of calls was slightly related to the smoking status after adjustment for sociodemographic characteristics. CONCLUSION: Random sampling appears to be more effective than quota sampling, mainly by making it possible to interview hard-to-reach populations.

  2. A collision avoidance model for two-pedestrian groups: Considering random avoidance patterns

    Science.gov (United States)

    Zhou, Zhuping; Cai, Yifei; Ke, Ruimin; Yang, Jiwei

    2017-06-01

    Grouping is a common phenomenon in pedestrian crowds, and group modeling is still an open, challenging problem. When grouping pedestrians avoid each other, different patterns can be observed: pedestrians can keep close to group members and avoid other groups as a cluster, or they can avoid other groups separately. Considering this randomness in avoidance patterns, we propose a collision avoidance model for two-pedestrian groups. In our model, the avoidance model is first built on the velocity obstacle method. The grouping model is then established using the Distance Constrained Line (DCL); by transforming the DCL into the velocity-obstacle framework, the avoidance model and grouping model are successfully put into one unified calculation structure. Within this structure, an algorithm is developed to solve the problem when solutions of the two models conflict with each other. Two groups of bidirectional pedestrian experiments are designed to verify the model. The accuracy of avoidance behavior and grouping behavior is validated at the microscopic level, while the lane formation phenomenon and fundamental diagrams are validated at the macroscopic level. The experimental results show our model is convincing and has good expansibility to describe three or more pedestrian groups.

  3. Light-absorbent liquid immersion angled exposure for patterning 3D samples with vertical sidewalls

    International Nuclear Information System (INIS)

    Kumagai, Shinya; Kubo, Hironori; Sasaki, Minoru

    2017-01-01

    To make photolithography patterns on 3D samples, the angled (inclined) exposure technique has been used so far. However, technological issues have emerged in making photolithography patterns on the surfaces of trench structures. The surface of a trench structure can be covered with a photoresist film by spray-coating, but the photoresist film deposited on the sidewalls and bottom of the trench is generally thin. The thin photoresist film deposited inside the trench is easily overdosed. Moreover, irregular patterns have frequently been formed by light reflected inside the trench. In this study, we have developed liquid immersion photolithography using a light-absorbent material. The light reflection inside the trench was suppressed. Various patterns were transferred into the photoresist film deposited on trench structures with an aspect ratio of 0.74. Compared to immersion photolithography using pure water under p-polarization light control, the light-absorbent liquid immersion photolithography developed here patterned the surfaces of the trench sidewalls and bottom well. (paper)

  4. Iterative algorithm of discrete Fourier transform for processing randomly sampled NMR data sets

    International Nuclear Information System (INIS)

    Stanek, Jan; Kozminski, Wiktor

    2010-01-01

    Spectra obtained by application of multidimensional Fourier Transformation (MFT) to sparsely sampled nD NMR signals are usually corrupted due to missing data. In the present paper this phenomenon is investigated on simulations and experiments. An effective iterative algorithm for artifact suppression for sparse on-grid NMR data sets is discussed in detail. It includes automated peak recognition based on statistical methods. The results enable one to study NMR spectra with a high dynamic range of peak intensities while preserving the benefits of random sampling, namely the superior resolution in indirectly measured dimensions. Experimental examples include 3D ¹⁵N- and ¹³C-edited NOESY-HSQC spectra of human ubiquitin.
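The iterative artifact-suppression idea can be illustrated with a CLEAN-style toy reconstruction on a sparsely sampled 1-D signal: repeatedly locate the strongest spectral peak in the residual's nonuniform DFT and subtract a fraction of its modeled contribution at the sampled points. This is a simplified sketch with invented data, not the authors' algorithm (which additionally performs statistical peak recognition):

```python
import cmath
import random

random.seed(3)
N, M = 64, 24                       # full grid size, number of measured points
true = {7: 1.0, 20: 0.6}            # spectral lines: bin -> amplitude
pts = sorted(random.sample(range(N), M))
data = [sum(a * cmath.exp(2j * cmath.pi * b * t / N) for b, a in true.items())
        for t in pts]

def dirty_spectrum(resid):
    """Per-bin nonuniform DFT of the residual at the sampled points."""
    return [sum(r * cmath.exp(-2j * cmath.pi * b * t / N)
                for r, t in zip(resid, pts)) / M for b in range(N)]

model = [0j] * N
resid = list(data)
for _ in range(40):                 # CLEAN-style loop with loop gain 0.5
    spec = dirty_spectrum(resid)
    b = max(range(N), key=lambda k: abs(spec[k]))
    step = 0.5 * spec[b]
    model[b] += step                # grow the model at the strongest bin
    resid = [r - step * cmath.exp(2j * cmath.pi * b * t / N)
             for r, t in zip(resid, pts)]

recovered = {b for b in range(N) if abs(model[b]) > 0.3}
```

Each pass removes part of the strongest line together with its sampling-induced sidelobes, so weaker lines that were initially buried under artifacts become identifiable, which is the essence of handling a high dynamic range of peak intensities.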

  5. Generalized Pareto for Pattern-Oriented Random Walk Modelling of Organisms' Movements.

    Directory of Open Access Journals (Sweden)

    Sophie Bertrand

    Full Text Available How organisms move and disperse is crucial to understand how population dynamics relates to the spatial heterogeneity of the environment. Random walk (RW models are typical tools to describe movement patterns. Whether Lévy or alternative RW better describes forager movements is keenly debated. We get around this issue using the Generalized Pareto Distribution (GPD. GPD includes as specific cases Normal, exponential and power law distributions, which underlie Brownian, Poisson-like and Lévy walks respectively. Whereas previous studies typically confronted a limited set of candidate models, GPD lets the most likely RW model emerge from the data. We illustrate the wide applicability of the method using GPS-tracked seabird foraging movements and fishing vessel movements tracked by Vessel Monitoring System (VMS, both collected in the Peruvian pelagic ecosystem. The two parameters from the fitted GPD, a scale and a shape parameter, provide a synoptic characterization of the observed movement in terms of characteristic scale and diffusive property. They reveal and quantify the variability, among species and individuals, of the spatial strategies selected by predators foraging on a common prey field. The GPD parameters constitute relevant metrics for (1 providing a synthetic and pattern-oriented description of movement, (2 using top predators as ecosystem indicators and (3 studying the variability of spatial behaviour among species or among individuals with different personalities.

  6. Generalized Pareto for Pattern-Oriented Random Walk Modelling of Organisms' Movements.

    Science.gov (United States)

    Bertrand, Sophie; Joo, Rocío; Fablet, Ronan

    2015-01-01

    How organisms move and disperse is crucial to understand how population dynamics relates to the spatial heterogeneity of the environment. Random walk (RW) models are typical tools to describe movement patterns. Whether Lévy or alternative RW better describes forager movements is keenly debated. We get around this issue using the Generalized Pareto Distribution (GPD). GPD includes as specific cases Normal, exponential and power law distributions, which underlie Brownian, Poisson-like and Lévy walks respectively. Whereas previous studies typically confronted a limited set of candidate models, GPD lets the most likely RW model emerge from the data. We illustrate the wide applicability of the method using GPS-tracked seabird foraging movements and fishing vessel movements tracked by Vessel Monitoring System (VMS), both collected in the Peruvian pelagic ecosystem. The two parameters from the fitted GPD, a scale and a shape parameter, provide a synoptic characterization of the observed movement in terms of characteristic scale and diffusive property. They reveal and quantify the variability, among species and individuals, of the spatial strategies selected by predators foraging on a common prey field. The GPD parameters constitute relevant metrics for (1) providing a synthetic and pattern-oriented description of movement, (2) using top predators as ecosystem indicators and (3) studying the variability of spatial behaviour among species or among individuals with different personalities.
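The model-selection idea, letting the RW model emerge from the fitted GPD shape parameter, can be sketched with a grid-search maximum-likelihood fit: exponential step lengths (a Poisson-like walk) should yield a shape estimate near zero, while a heavy-tailed walk would push it positive. A toy illustration, not the authors' estimation code:

```python
import math
import random

random.seed(7)

def gpd_loglik(xs, shape, scale):
    """Log-likelihood of the Generalized Pareto density
    f(x) = (1/scale) * (1 + shape*x/scale)^(-1/shape - 1), x >= 0;
    shape -> 0 recovers the exponential, shape > 0 a power-law tail."""
    ll = 0.0
    for x in xs:
        if abs(shape) < 1e-9:
            ll += -math.log(scale) - x / scale    # exponential limit
        else:
            z = 1.0 + shape * x / scale
            if z <= 0.0:
                return -math.inf                  # outside the support
            ll += -math.log(scale) - (1.0 / shape + 1.0) * math.log(z)
    return ll

# step lengths from an exponential distribution, i.e. a Poisson-like walk
steps = [random.expovariate(1.0) for _ in range(1500)]
grid = [(xi / 100.0, sigma / 100.0)
        for xi in range(-30, 31, 5) for sigma in range(50, 151, 10)]
shape_hat, scale_hat = max(grid, key=lambda p: gpd_loglik(steps, *p))
```

Here the fitted shape stays near zero, correctly "selecting" the exponential special case; GPS step lengths from a Lévy-like forager would instead drive the shape estimate clearly positive.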

  7. Reliability of impingement sampling designs: An example from the Indian Point station

    International Nuclear Information System (INIS)

    Mattson, M.T.; Waxman, J.B.; Watson, D.A.

    1988-01-01

    A 4-year data base (1976-1979) of daily fish impingement counts at the Indian Point electric power station on the Hudson River was used to compare the precision and reliability of three random-sampling designs: (1) simple random, (2) seasonally stratified, and (3) empirically stratified. The precision of daily impingement estimates improved logarithmically for each design as more days in the year were sampled. Simple random sampling was the least, and empirically stratified sampling was the most precise design, and the difference in precision between the two stratified designs was small. Computer-simulated sampling was used to estimate the reliability of the two stratified-random-sampling designs. A seasonally stratified sampling design was selected as the most appropriate reduced-sampling program for Indian Point station because: (1) reasonably precise and reliable impingement estimates were obtained using this design for all species combined and for eight common Hudson River fish by sampling only 30% of the days in a year (110 d); and (2) seasonal strata may be more precise and reliable than empirical strata if future changes in annual impingement patterns occur. The seasonally stratified design applied to the 1976-1983 Indian Point impingement data showed that selection of sampling dates based on daily species-specific impingement variability gave results that were more precise, but not more consistently reliable, than sampling allocations based on the variability of all fish species combined. 14 refs., 1 fig., 6 tabs
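
    The precision ranking reported above (stratified designs beating simple random sampling) can be reproduced on simulated daily counts; the seasonal Poisson data below are hypothetical, not the Indian Point record:

```python
# Sketch comparing simple random sampling (SRS) with seasonally
# stratified sampling for estimating an annual impingement total.
import numpy as np

rng = np.random.default_rng(0)
days = np.arange(365)
# Hypothetical seasonal signal: counts peak once per year, plus noise.
daily = rng.poisson(50 + 40 * np.cos(2 * np.pi * days / 365))

n = 110  # sample ~30% of days, as in the study


def srs_estimate(r):
    idx = r.choice(365, size=n, replace=False)
    return 365 * daily[idx].mean()


def stratified_estimate(r):
    total = 0.0
    for q in range(4):  # four seasonal strata of ~91 days each
        stratum = daily[q * 91:(q + 1) * 91 if q < 3 else 365]
        idx = r.choice(len(stratum), size=n // 4, replace=False)
        total += len(stratum) * stratum[idx].mean()
    return total


reps = 2000
srs = np.array([srs_estimate(rng) for _ in range(reps)])
strat = np.array([stratified_estimate(rng) for _ in range(reps)])
print(srs.std(), strat.std())  # stratification reduces sampling error
```

    Both estimators are unbiased for the annual total; stratification removes the between-season component of the variance, which is what drives the precision gain the abstract reports.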

  8. Computer code ENDSAM for random sampling and validation of the resonance parameters covariance matrices of some major nuclear data libraries

    International Nuclear Information System (INIS)

    Plevnik, Lucijan; Žerovnik, Gašper

    2016-01-01

    Highlights: • Methods for random sampling of correlated parameters. • Link to open-source code for sampling of resonance parameters in ENDF-6 format. • Validation of the code on realistic and artificial data. • Validation of covariances in three major contemporary nuclear data libraries. - Abstract: Methods for random sampling of correlated parameters are presented. The methods are implemented for sampling of resonance parameters in ENDF-6 format and a link to the open-source code ENDSAM is given. The code has been validated on realistic data. Additionally, consistency of covariances of resonance parameters of three major contemporary nuclear data libraries (JEFF-3.2, ENDF/B-VII.1 and JENDL-4.0u2) has been checked.
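
    The core of random sampling of correlated parameters can be sketched with a Cholesky factor of the covariance matrix (the three-parameter mean and covariance below are illustrative, not actual resonance data, and ENDSAM itself is not used):

```python
# Sketch: draw independent standard normals and correlate them with the
# Cholesky factor L of the covariance matrix, so that cov(L z) = L L^T.
import numpy as np

rng = np.random.default_rng(1)
mean = np.array([2.0, -1.0, 0.5])          # hypothetical parameter means
cov = np.array([[1.0, 0.6, 0.2],
                [0.6, 2.0, 0.3],
                [0.2, 0.3, 0.5]])          # must be positive definite

L = np.linalg.cholesky(cov)
z = rng.standard_normal((100_000, 3))
samples = mean + z @ L.T                   # each row: one correlated draw

print(np.cov(samples.T))                   # approximates cov
```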

  9. Patterned feeding experience for preterm infants: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Pickler, Rita H; Wetzel, Paul A; Meinzen-Derr, Jareen; Tubbs-Cooley, Heather L; Moore, Margo

    2015-06-04

    Neurobehavioral disabilities occur in 5-15% of preterm infants with an estimated 50-70% of very low birth weight preterm infants experiencing later dysfunction, including cognitive, behavioral, and social delays that often persist into adulthood. Factors implicated in poor neurobehavioral and developmental outcomes are hospitalization in the neonatal intensive care unit (NICU) and inconsistent caregiving patterns. Although much underlying brain damage occurs in utero or shortly after birth, neuroprotective strategies can stop lesions from progressing, particularly when these strategies are used during the most sensitive periods of neural plasticity occurring months before term age. The purpose of this randomized trial is to test the effect of a patterned feeding experience on preterm infants' neurobehavioral organization and development, cognitive function, and clinical outcomes. This trial uses an experimental, longitudinal, 2-group design with 120 preterm infants. Infants are enrolled within the first week of life and randomized to an experimental group receiving a patterned feeding experience from the first gavage feeding through discharge or to a control group receiving usual feeding care experience. The intervention involves a continuity of tactile experiences associated with feeding to train and build neuronal networks supportive of normal infant feeding experience. Primary outcomes are neurobehavioral organization as measured by Neurobehavioral Assessment of the Preterm Infant at 3 time points: the transition to oral feedings, NICU discharge, and 2 months corrected age. Secondary aims are cognitive function measured using the Bayley Scales of Infant and Toddler Development, Third Edition at 6 months corrected age, neurobehavioral development (sucking organization, feeding performance, and heart rate variability), and clinical outcomes (length of NICU stay and time to full oral feeding). The potential effects of demographic and biobehavioral factors

  10. Corrigendum: Information Search in Decisions From Experience: Do Our Patterns of Sampling Foreshadow Our Decisions?

    Science.gov (United States)

    2017-09-01

    Original article: Hills, T. T., & Hertwig, R. (2010). Information search in decisions from experience: Do our patterns of sampling foreshadow our decisions? Psychological Science, 21, 1787-1792. doi:10.1177/0956797610387443.

  11. Random sampling of quantum states: a survey of methods and some issues regarding the Overparametrized Method

    International Nuclear Information System (INIS)

    Maziero, Jonas

    2015-01-01

    The numerical generation of random quantum states (RQS) is an important procedure for investigations in quantum information science. Here, we review some methods that may be used for performing that task. We start by presenting a simple procedure for generating random state vectors, for which the main tool is the random sampling of unbiased discrete probability distributions (DPD). Afterwards, the creation of random density matrices is addressed. In this context, we first present the standard method, which consists in using the spectral decomposition of a quantum state for getting RQS from random DPDs and random unitary matrices. Next, the Bloch vector parametrization method is described. This approach, despite being useful in several instances, is not in general convenient for RQS generation. In the last part of the article, we consider the overparametrized method (OPM) and the related Ginibre and Bures techniques. The OPM can be used to create random positive semidefinite matrices with unit trace from randomly produced general complex matrices in a simple way that is friendly for numerical implementations. We consider a physically relevant issue related to the possible domains that may be used for the real and imaginary parts of the elements of such general complex matrices. Subsequently, we note an excessively fast concentration of measure in the quantum state space that appears in this parametrization. (author)
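
    The Ginibre technique mentioned above amounts, in essence, to normalizing G G† for a random complex matrix G; a minimal sketch for dimension 4 (an arbitrary illustrative choice):

```python
# Sketch: a random density matrix from the Ginibre ensemble.
# rho = G G† / tr(G G†) is Hermitian, positive semidefinite, unit trace.
import numpy as np

rng = np.random.default_rng(7)
d = 4
G = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
rho = G @ G.conj().T
rho /= np.trace(rho).real

eigs = np.linalg.eigvalsh(rho)
print(np.trace(rho).real, eigs)  # unit trace, nonnegative spectrum
```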

  12. Improving ambulatory saliva-sampling compliance in pregnant women: a randomized controlled study.

    Directory of Open Access Journals (Sweden)

    Julian Moeller

    Full Text Available OBJECTIVE: Noncompliance with scheduled ambulatory saliva sampling is common and has been associated with biased cortisol estimates in nonpregnant subjects. This study is the first to investigate in pregnant women strategies to improve ambulatory saliva-sampling compliance, and the association between sampling noncompliance and saliva cortisol estimates. METHODS: We instructed 64 pregnant women to collect eight scheduled saliva samples on two consecutive days each. Objective compliance with scheduled sampling times was assessed with a Medication Event Monitoring System and self-reported compliance with a paper-and-pencil diary. In a randomized controlled study, we estimated whether a disclosure intervention (informing women about objective compliance monitoring) and a reminder intervention (use of acoustical reminders) improved compliance. A mixed model analysis was used to estimate associations between women's objective compliance and their diurnal cortisol profiles, and between deviation from scheduled sampling and the cortisol concentration measured in the related sample. RESULTS: Self-reported compliance with the saliva-sampling protocol was 91%, and objective compliance was 70%. The disclosure intervention was associated with improved objective compliance (informed: 81%, noninformed: 60%, F(1,60) = 17.64, p<0.001), but the reminder intervention was not (reminders: 68%, without reminders: 72%, F(1,60) = 0.78, p = 0.379). Furthermore, a woman's increased objective compliance was associated with a higher diurnal cortisol profile, F(2,64) = 8.22, p<0.001. Altered cortisol levels were observed in less objectively compliant samples, F(1,705) = 7.38, p = 0.007, with delayed sampling associated with lower cortisol levels. CONCLUSIONS: The results suggest that in pregnant women, objective noncompliance with scheduled ambulatory saliva sampling is common and is associated with biased cortisol estimates. To improve sampling compliance, results suggest

  13. Field-based random sampling without a sampling frame: control selection for a case-control study in rural Africa.

    Science.gov (United States)

    Crampin, A C; Mwinuka, V; Malema, S S; Glynn, J R; Fine, P E

    2001-01-01

    Selection bias, particularly of controls, is common in case-control studies and may materially affect the results. Methods of control selection should be tailored both for the risk factors and disease under investigation and for the population being studied. We present here a control selection method devised for a case-control study of tuberculosis in rural Africa (Karonga, northern Malawi) that selects an age/sex frequency-matched random sample of the population, with a geographical distribution in proportion to the population density. We also present an audit of the selection process, and discuss the potential of this method in other settings.

  14. Dietary patterns in obese pregnant women; influence of a behavioral intervention of diet and physical activity in the UPBEAT randomized controlled trial.

    Science.gov (United States)

    Flynn, Angela C; Seed, Paul T; Patel, Nashita; Barr, Suzanne; Bell, Ruth; Briley, Annette L; Godfrey, Keith M; Nelson, Scott M; Oteng-Ntim, Eugene; Robinson, Sian M; Sanders, Thomas A; Sattar, Naveed; Wardle, Jane; Poston, Lucilla; Goff, Louise M

    2016-11-29

    Understanding dietary patterns in obese pregnant women will inform future intervention strategies to improve pregnancy outcomes and the health of the child. The aim of this study was to investigate the effect of a behavioral intervention of diet and physical activity advice on dietary patterns in obese pregnant women participating in the UPBEAT study, and to explore associations of dietary patterns with pregnancy outcomes. In the UPBEAT randomized controlled trial, pregnant obese women from eight UK multi-ethnic, inner-city populations were randomly assigned to receive a diet/physical activity intervention or standard antenatal care. The dietary intervention aimed to reduce glycemic load and saturated fat intake. Diet was assessed using a food frequency questionnaire (FFQ) at baseline (15+0 to 18+6 weeks' gestation), post intervention (27+0 to 28+6 weeks) and in late pregnancy (34+0 to 36+0 weeks). Dietary patterns were characterized using factor analysis of the baseline FFQ data, and changes compared in the control and intervention arms. Patterns were related to pregnancy outcomes in the combined control/intervention cohort (n = 1023). Four distinct baseline dietary patterns were defined: Fruit and vegetables, African/Caribbean, Processed, and Snacks, which were differently associated with social and demographic factors. The UPBEAT intervention significantly reduced the Processed (-0.14; 95% CI -0.19, -0.08) pattern score. In obese pregnant women an intensive dietary intervention improved Processed and Snack dietary pattern scores. African/Caribbean and Processed patterns were associated with an increased risk of gestational diabetes, and provide potential targets for future interventions. Current controlled trials; ISRCTN89971375.

  15. Sample design effects in landscape genetics

    Science.gov (United States)

    Oyler-McCance, Sara J.; Fedy, Bradley C.; Landguth, Erin L.

    2012-01-01

    An important research gap in landscape genetics is the impact of different field sampling designs on the ability to detect the effects of landscape pattern on gene flow. We evaluated how five different sampling regimes (random, linear, systematic, cluster, and single study site) affected the probability of correctly identifying the generating landscape process of population structure. Sampling regimes were chosen to represent a suite of designs common in field studies. We used genetic data generated from a spatially-explicit, individual-based program and simulated gene flow in a continuous population across a landscape with gradual spatial changes in resistance to movement. Additionally, we evaluated the sampling regimes using realistic and obtainable numbers of loci (10 and 20), numbers of alleles per locus (5 and 10), numbers of individuals sampled (10-300), and generational times after the landscape was introduced (20 and 400). For a simulated continuously distributed species, we found that random, linear, and systematic sampling regimes performed well with high sample sizes (>200), levels of polymorphism (10 alleles per locus), and numbers of molecular markers (20). The cluster and single study site sampling regimes were not able to correctly identify the generating process under any conditions and thus are not advisable strategies for scenarios similar to our simulations. Our research emphasizes the importance of sampling data at ecologically appropriate spatial and temporal scales and suggests careful consideration for sampling near landscape components that are likely to most influence the genetic structure of the species. In addition, simulating sampling designs a priori could help guide field data collection efforts.
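
    Three of the five regimes compared above can be sketched as point layouts on a hypothetical 100 x 100 landscape (purely illustrative coordinates, not the study's simulation framework):

```python
# Sketch of three field sampling regimes: random, systematic (grid),
# and cluster (points scattered around a few centres).
import numpy as np

rng = np.random.default_rng(3)
n = 100

random_pts = rng.uniform(0, 100, size=(n, 2))          # random regime

g = np.linspace(5, 95, 10)                              # systematic grid
systematic_pts = np.array([(x, y) for x in g for y in g])

centres = rng.uniform(10, 90, size=(4, 2))              # cluster regime
cluster_pts = (centres[rng.integers(0, 4, n)]
               + rng.normal(0, 3, size=(n, 2)))

print(random_pts.std(axis=0), cluster_pts.std(axis=0))
```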

  16. Modern survey sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

    Exposure to Sampling: Introduction; Concepts of Population, Sample, and Sampling. Initial Ramifications: Introduction; Sampling Design, Sampling Scheme; Random Numbers and Their Uses in Simple Random Sampling (SRS); Drawing Simple Random Samples with and without Replacement; Estimation of Mean, Total, Ratio of Totals/Means: Variance and Variance Estimation; Determination of Sample Sizes; Appendix to Chapter 2: More on Equal Probability Sampling; Horvitz-Thompson Estimator; Sufficiency; Likelihood; Non-Existence Theorem. More Intricacies: Introduction; Unequal Probability Sampling Strategies; PPS Sampling. Exploring Improved Ways: Introduction; Stratified Sampling; Cluster Sampling; Multi-Stage Sampling; Multi-Phase Sampling: Ratio and Regression Estimation; Controlled Sampling. Modeling: Introduction; Super-Population Modeling; Prediction Approach; Model-Assisted Approach; Bayesian Methods; Spatial Smoothing; Sampling on Successive Occasions: Panel Rotation; Non-Response and Not-at-Homes; Weighting Adj...

  17. Pattern of somatostatin receptors expression in normal and bladder cancer tissue samples.

    Science.gov (United States)

    Karavitakis, Markos; Msaouel, Pavlos; Michalopoulos, Vassilis; Koutsilieris, Michael

    2014-06-01

    Known risk factors for bladder cancer progression and recurrence are limited in their prognostic ability. Identification of molecular determinants of disease progression could therefore provide more specific prognostic information and could be translated into new approaches for biomarker development. In the present study we evaluated the expression patterns of somatostatin receptors 1-5 (SSTRs) in normal and tumor bladder tissues. The expression of SSTR1-5 was characterized in 45 normal and bladder cancer tissue samples using reverse transcriptase-polymerase chain reaction (RT-PCR). SSTR1 was expressed in 24 samples, SSTR2 in 15, SSTR3 in 23, SSTR4 in 16 and SSTR5 in all but one sample. Bladder cancer tissue samples expressed lower levels of SSTR3. Co-expression of SSTRs was associated with superficial disease. Our results demonstrate, for the first time, that there is expression of SSTRs in normal and bladder cancer urothelium. Further studies are required to evaluate the prognostic and therapeutic significance of these findings. Copyright© 2014 International Institute of Anticancer Research (Dr. John G. Delinassios), All rights reserved.

  18. Revisiting random walk based sampling in networks: evasion of burn-in period and frequent regenerations.

    Science.gov (United States)

    Avrachenkov, Konstantin; Borkar, Vivek S; Kadavankandy, Arun; Sreedharan, Jithin K

    2018-01-01

    In the framework of network sampling, random walk (RW) based estimation techniques provide many pragmatic solutions while uncovering the unknown network as little as possible. Despite several theoretical advances in this area, RW based sampling techniques usually make a strong assumption that the samples are in the stationary regime, and hence are impelled to leave out the samples collected during the burn-in period. This work proposes two sampling schemes without a burn-in time constraint to estimate the average of an arbitrary function defined on the network nodes, for example, the average age of users in a social network. The central idea of the algorithms lies in exploiting regeneration of RWs at revisits to an aggregated super-node or to a set of nodes, and in strategies to enhance the frequency of such regenerations either by contracting the graph or by making the hitting set larger. Our first algorithm, which is based on reinforcement learning (RL), uses stochastic approximation to derive an estimator. This method can be seen as intermediate between purely stochastic Markov chain Monte Carlo iterations and deterministic relative value iterations. The second algorithm, which we call the Ratio with Tours (RT)-estimator, is a modified form of respondent-driven sampling (RDS) that accommodates the idea of regeneration. We study the methods via simulations on real networks. We observe that the trajectories of the RL-estimator are much more stable than those of standard random walk based estimation procedures, and its error performance is comparable to that of respondent-driven sampling (RDS), which has a smaller asymptotic variance than many other estimators. Simulation studies also show that the mean squared error of the RT-estimator decays much faster than that of RDS with time. The newly developed RW based estimators (RL- and RT-estimators) make it possible to avoid the burn-in period, provide better control of stability along the sample path, and overall reduce the estimation time. Our
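
    The regeneration idea behind the tour-based estimator can be sketched on a toy graph: restart the accounting at every revisit to a fixed node and form a degree-corrected ratio estimate. This is a schematic reconstruction under stated assumptions, not the paper's RL or RT code:

```python
# Sketch: tour-based random-walk estimation of the uniform average of a
# node function f, with regeneration at every return to a start node.
# The walk's stationary law is degree-biased, so terms are weighted by
# 1/deg; the ratio num/den then converges to the uniform average of f.
import random

# Small undirected graph as adjacency lists (hypothetical network).
graph = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2, 4], 4: [3]}
f = {v: float(v) for v in graph}           # e.g. "age" of each node
true_avg = sum(f.values()) / len(graph)    # uniform average: 2.0

random.seed(5)
start, num_tours = 0, 20000
num, den = 0.0, 0.0
v = start
for _ in range(num_tours):
    while True:                            # one tour: start -> ... -> start
        num += f[v] / len(graph[v])        # degree-corrected reward
        den += 1.0 / len(graph[v])
        v = random.choice(graph[v])
        if v == start:
            break                          # regeneration: tour complete

print(num / den)  # ratio estimator of the uniform average of f
```

    No burn-in is discarded: sums over successive tours are independent and identically distributed by the regeneration argument, which is the property the abstract exploits.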

  19. Functional redundancy patterns reveal non-random assembly rules in a species-rich marine assemblage.

    Directory of Open Access Journals (Sweden)

    Nicolas Guillemot

    Full Text Available The relationship between species and the functional diversity of assemblages is fundamental in ecology because it contains key information on functional redundancy, and functionally redundant ecosystems are thought to be more resilient, resistant and stable. However, this relationship is poorly understood and undocumented for species-rich coastal marine ecosystems. Here, we used underwater visual censuses to examine the patterns of functional redundancy for one of the most diverse vertebrate assemblages, the coral reef fishes of New Caledonia, South Pacific. First, we found that the relationship between functional and species diversity displayed a non-asymptotic power-shaped curve, implying that rare functions and species mainly occur in highly diverse assemblages. Second, we showed that the distribution of species amongst possible functions was significantly different from a random distribution up to a threshold of ∼90 species/transect. Redundancy patterns for each function further revealed that some functions displayed fast rates of increase in redundancy at low species diversity, whereas others were only becoming redundant past a certain threshold. This suggested non-random assembly rules and the existence of some primordial functions that would need to be fulfilled in priority so that coral reef fish assemblages can gain a basic ecological structure. Last, we found little effect of habitat on the shape of the functional-species diversity relationship and on the redundancy of functions, although habitat is known to largely determine assemblage characteristics such as species composition, biomass, and abundance. Our study shows that low functional redundancy is characteristic of this highly diverse fish assemblage, and, therefore, that even species-rich ecosystems such as coral reefs may be vulnerable to the removal of a few keystone species.

  20. Fixed-location hydroacoustic monitoring designs for estimating fish passage using stratified random and systematic sampling

    International Nuclear Information System (INIS)

    Skalski, J.R.; Hoffman, A.; Ransom, B.H.; Steig, T.W.

    1993-01-01

    Five alternative sampling designs are compared using 15 d of 24-h continuous hydroacoustic data to identify the most favorable approach to fixed-location hydroacoustic monitoring of salmonid outmigrants. Four alternative approaches to systematic sampling are compared among themselves and with stratified random sampling (STRS). Stratifying systematic sampling (STSYS) on a daily basis is found to reduce sampling error in multiday monitoring studies. Although sampling precision was predictable with varying levels of effort in STRS, neither the magnitude nor the direction of change in precision was predictable when effort was varied in systematic sampling (SYS). Furthermore, modifying systematic sampling to include replicated (e.g., nested) sampling (RSYS) is further shown to provide unbiased point and variance estimates, as does STRS. Numerous short sampling intervals (e.g., 12 samples of 1-min duration per hour) must be monitored hourly using RSYS to provide efficient, unbiased point and interval estimates. For equal levels of effort, STRS outperformed all variations of SYS examined. Parametric approaches to confidence interval estimation are found to be superior to nonparametric interval estimates (i.e., bootstrap and jackknife) in estimating total fish passage. 10 refs., 1 fig., 8 tabs

  1. Asymptotic laws for random knot diagrams

    Science.gov (United States)

    Chapman, Harrison

    2017-06-01

    We study random knotting by considering knot and link diagrams as decorated, (rooted) topological maps on spheres and pulling them uniformly from among sets of a given number of vertices n, as first established in recent work with Cantarella and Mastin. The knot diagram model is an exciting new model which captures both the random geometry of space curve models of knotting as well as the ease of computing invariants from diagrams. We prove that unknot diagrams are asymptotically exponentially rare, an analogue of Sumners and Whittington’s landmark result for self-avoiding polygons. Our proof uses the same key idea: we first show that knot diagrams obey a pattern theorem, which describes their fractal structure. We examine how quickly this behavior occurs in practice. As a consequence, almost all diagrams are asymmetric, simplifying sampling from this model. We conclude with experimental data on knotting in this model. This model of random knotting is similar to those studied by Diao et al, and Dunfield et al.

  2. QUASI-RANDOM TESTING OF COMPUTER SYSTEMS

    Directory of Open Access Journals (Sweden)

    S. V. Yarmolik

    2013-01-01

    Full Text Available Various modified random testing approaches have been proposed for computer system testing in the black-box environment. Their effectiveness has been evaluated on typical failure patterns by employing three measures, namely, P-measure, E-measure and F-measure. Quasi-random testing, a modified version of random testing, has been proposed and analyzed. Quasi-random Sobol sequences and modified Sobol sequences are used as the test patterns. Some new methods for Sobol sequence generation have been proposed and analyzed.
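
    Quasi-random test patterns of the kind described above can be generated from a Sobol sequence; a minimal sketch assuming SciPy's `scipy.stats.qmc` module (the 2-D input domain is an arbitrary illustration):

```python
# Sketch: 64 quasi-random test points in [0, 1)^2 from an unscrambled
# Sobol sequence. For a power-of-two sample size, each coordinate is
# perfectly balanced: exactly half the points fall in each half-interval,
# unlike plain pseudo-random test patterns.
from scipy.stats import qmc

sobol = qmc.Sobol(d=2, scramble=False)
pts = sobol.random(64)

print((pts[:, 0] < 0.5).sum(), (pts[:, 1] < 0.5).sum())
```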

  3. At convenience and systematic random sampling: effects on the prognostic value of nuclear area assessments in breast cancer patients.

    Science.gov (United States)

    Jannink, I; Bennen, J N; Blaauw, J; van Diest, P J; Baak, J P

    1995-01-01

    This study compares the influence of two different nuclear sampling methods on the prognostic value of assessments of mean and standard deviation of nuclear area (MNA, SDNA) in 191 consecutive invasive breast cancer patients with long term follow up. The first sampling method used was 'at convenience' sampling (ACS); the second, systematic random sampling (SRS). Both sampling methods were tested with a sample size of 50 nuclei (ACS-50 and SRS-50). To determine whether, besides the sampling methods, sample size had impact on prognostic value as well, the SRS method was also tested using a sample size of 100 nuclei (SRS-100). SDNA values were systematically lower for ACS, obviously due to (unconsciously) not including small and large nuclei. Testing prognostic value of a series of cut off points, MNA and SDNA values assessed by the SRS method were prognostically significantly stronger than the values obtained by the ACS method. This was confirmed in Cox regression analysis. For the MNA, the Mantel-Cox p-values from SRS-50 and SRS-100 measurements were not significantly different. However, for the SDNA, SRS-100 yielded significantly lower p-values than SRS-50. In conclusion, compared with the 'at convenience' nuclear sampling method, systematic random sampling of nuclei is not only superior with respect to reproducibility of results, but also provides a better prognostic value in patients with invasive breast cancer.
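
    Why 'at convenience' sampling depresses the SDNA can be shown with simulated nuclear areas. The lognormal areas are hypothetical, and the 'convenience' rule here (picking the nuclei nearest the median size) is a deliberately extreme stand-in for the unconscious exclusion of small and large nuclei described above:

```python
# Sketch: systematic random sampling vs a 'convenience' sample that
# skips extreme nuclei; the latter underestimates the SD of nuclear area.
import numpy as np

rng = np.random.default_rng(9)
areas = rng.lognormal(mean=4.0, sigma=0.5, size=2000)  # hypothetical um^2

# Systematic random sample of 50: every k-th nucleus after a random start.
k = len(areas) // 50
start = rng.integers(k)
srs50 = areas[start::k][:50]

# 'Convenience' sample: the 50 nuclei closest to the median size.
order = np.argsort(np.abs(areas - np.median(areas)))
acs50 = areas[order[:50]]

print(acs50.std(), srs50.std())  # convenience SD is systematically lower
```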

  4. Segmentation of turbo generator and reactor coolant pump vibratory patterns: a syntactic pattern recognition approach

    International Nuclear Information System (INIS)

    Tira, Z.

    1993-02-01

    This study was undertaken in the context of turbogenerator and reactor coolant pump vibration surveillance. Vibration meters are used to monitor equipment condition. An anomaly will modify the signal mean. At the present time, the expert system DIVA, developed to automate diagnosis, requests the operator to identify the nature of the pattern change thus indicated. In order to minimize operator intervention, we have to automate on the one hand classification and on the other hand, detection and segmentation of the patterns. The purpose of this study is to develop a new automatic system for the segmentation and classification of signals. The segmentation is based on syntactic pattern recognition. For the classification, a decision tree is used. The signals to process are the rms values of the vibrations measured on rotating machines. These signals are randomly sampled. All processing is automatic and no a priori statistical knowledge on the signals is required. The segmentation performances are assessed by tests on vibratory signals. (author). 31 figs

  5. Comparison of the commercial color LCD and the medical monochrome LCD using randomized object test patterns.

    Directory of Open Access Journals (Sweden)

    Jay Wu

    Full Text Available Workstations and electronic display devices in a picture archiving and communication system (PACS) provide a convenient and efficient platform for medical diagnosis. The performance of display devices has to be verified to ensure that image quality is not degraded. In this study, we designed a set of randomized object test patterns (ROTPs) consisting of randomly located spheres with various image characteristics to evaluate the performance of a 2.5 mega-pixel (MP) commercial color LCD and a 3 MP diagnostic monochrome LCD in several aspects, including the contrast, resolution, point spread effect, and noise. The ROTPs were then merged into 120 abdominal CT images. Five radiologists were invited to review the CT images, and receiver operating characteristic (ROC) analysis was carried out using a five-point rating scale. In the high background patterns of ROTPs, the sensitivity performance was comparable between both monitors in terms of contrast and resolution, whereas, in the low background patterns, the performance of the commercial color LCD was significantly poorer than that of the diagnostic monochrome LCD in all aspects. The average area under the ROC curve (AUC) for reviewing abdominal CT images was 0.717±0.0200 and 0.740±0.0195 for the color monitor and the diagnostic monitor, respectively. The observation time (OT) was 145±27.6 min and 127±19.3 min, respectively. No significant differences appeared in AUC (p = 0.265) and OT (p = 0.07). The overall results indicate that ROTPs can be implemented as a quality control tool to evaluate the intrinsic characteristics of display devices. Although there is still a gap in technology between different types of LCDs, commercial color LCDs could replace diagnostic monochrome LCDs as a platform for reviewing abdominal CT images after monitor calibration.
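
    The AUC from five-point confidence ratings can be computed via the Mann-Whitney relation; the ratings below are invented for illustration and are not the five radiologists' scores:

```python
# Sketch: AUC from ordinal ratings, using
# AUC = P(rating_present > rating_absent) + 0.5 * P(tie).
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical 1-5 ratings for 60 signal-present and 60 signal-absent images.
present = rng.choice([2, 3, 4, 5, 5], size=60)
absent = rng.choice([1, 1, 2, 3, 4], size=60)

diff = present[:, None] - absent[None, :]        # all 60 x 60 pairings
auc = (diff > 0).mean() + 0.5 * (diff == 0).mean()
print(round(auc, 3))
```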

  6. Random selection of items. Selection of n1 samples among N items composing a stratum

    International Nuclear Information System (INIS)

    Jaech, J.L.; Lemaire, R.J.

    1987-02-01

    STR-224 provides generalized procedures to determine required sample sizes, for instance in the course of a Physical Inventory Verification at Bulk Handling Facilities. The present report describes procedures to generate random numbers and select groups of items to be verified in a given stratum through each of the measurement methods involved in the verification. (author). 3 refs
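
    The item-selection step can be sketched as drawing n1 = 25 of N = 500 items without replacement from a linear congruential generator (LCG) stream; the LCG constants are a common textbook choice, not those of any specific verification code:

```python
# Sketch: random selection of n1 distinct item labels from 1..N using a
# simple LCG and rejection of repeats (sampling without replacement).
N, n1 = 500, 25
seed = 12345


def lcg():
    # Common 32-bit LCG constants (illustrative); returns a value in [0, 1).
    global seed
    seed = (1664525 * seed + 1013904223) % 2**32
    return seed / 2**32


chosen = set()
while len(chosen) < n1:
    chosen.add(int(lcg() * N) + 1)   # item labels 1..N; repeats ignored

print(sorted(chosen))
```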

  7. Characterisation of taro (Colocasia esculenta) based on morphological and isozymic pattern markers

    Directory of Open Access Journals (Sweden)

    SUGIYARTO

    2011-03-01

    Full Text Available Trimanto, Sajidan, Sugiyarto. 2011. Characterization of taro (Colocasia esculenta) based on morphological and isozymic pattern markers. Nusantara Bioscience: 7-14. The aims of this research were to find out: (i) the varieties of Colocasia esculenta based on morphological characteristics; (ii) the varieties of C. esculenta based on isozymic banding patterns; and (iii) the correlation of genetic distance based on morphological characteristics and isozymic banding patterns. Survey research was conducted in the Karanganyar district, covering high, medium and low altitudes. Samples were taken using a random purposive sampling technique at 9 sampling points. The morphological data were elaborated descriptively and summarized in a dendrogram. The data on isozymic banding patterns were analyzed quantitatively based on the presence or absence of bands on the gel, and likewise summarized in a dendrogram. The correlation between morphological characteristics and isozymic banding patterns was analyzed with the product-moment correlation coefficient under the goodness-of-fit criterion. The results showed: (i) 10 varieties of C. esculenta were found in Karanganyar; (ii) morphological characteristics were not affected by altitude; (iii) peroxidase formed 14 banding patterns, esterase 11 and shikimic dehydrogenase 15; (iv) the correlation between the morphological data and the isozymic banding pattern of peroxidase was good (0.893542288), while the esterase and shikimic dehydrogenase isozymes showed very good correlations (0.917557716 and 0.9121985446); (v) the isozymic banding pattern data support the morphological character data.

  8. Randomized comparison of vaginal self-sampling by standard vs. dry swabs for Human papillomavirus testing

    International Nuclear Information System (INIS)

    Eperon, Isabelle; Vassilakos, Pierre; Navarria, Isabelle; Menoud, Pierre-Alain; Gauthier, Aude; Pache, Jean-Claude; Boulvain, Michel; Untiet, Sarah; Petignat, Patrick

    2013-01-01

    To evaluate if human papillomavirus (HPV) self-sampling (Self-HPV) using a dry vaginal swab is a valid alternative for HPV testing. Women attending colposcopy clinic were recruited to collect two consecutive Self-HPV samples: a Self-HPV using a dry swab (S-DRY) and a Self-HPV using a standard wet transport medium (S-WET). These samples were analyzed for HPV using real time PCR (Roche Cobas). Participants were randomized to determine the order of the tests. Questionnaires assessing preferences and acceptability for both tests were conducted. Subsequently, women were invited for colposcopic examination; a physician collected a cervical sample (physician-sampling) with a broom-type device and placed it into a liquid-based cytology medium. Specimens were then processed for the production of cytology slides and a Hybrid Capture HPV DNA test (Qiagen) was performed from the residual liquid. Biopsies were performed if indicated. Unweighted kappa statistics (κ) and McNemar tests were used to measure the agreement among the sampling methods. A total of 120 women were randomized. Overall HPV prevalence was 68.7% (95% Confidence Interval (CI) 59.3–77.2) by S-WET, 54.4% (95% CI 44.8–63.9) by S-DRY and 53.8% (95% CI 43.8–63.7) by HC. Among paired samples (S-WET and S-DRY), the overall agreement was good (85.7%; 95% CI 77.8–91.6) and the κ was substantial (0.70; 95% CI 0.57-0.70). The proportion of positive type-specific HPV agreement was also good (77.3%; 95% CI 68.2-84.9). No differences in sensitivity for cervical intraepithelial neoplasia grade one (CIN1) or worse between the two Self-HPV tests were observed. Women reported the two Self-HPV tests as highly acceptable. Self-HPV using dry swab transfer does not appear to compromise specimen integrity. Further study in a large screening population is needed. ClinicalTrials.gov: http://clinicaltrials.gov/show/NCT01316120
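
The unweighted kappa used to measure agreement between the paired S-WET and S-DRY results can be sketched as follows; this is an illustrative Python implementation for paired binary test outcomes, not the study's own analysis code, and the sample results in the usage note are hypothetical:

```python
def unweighted_kappa(a, b):
    """Cohen's unweighted kappa for two paired binary test results
    (1 = HPV positive, 0 = HPV negative)."""
    assert len(a) == len(b) and a
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n  # observed agreement
    pa, pb = sum(a) / n, sum(b) / n                # positivity rate of each test
    p_exp = pa * pb + (1 - pa) * (1 - pb)          # agreement expected by chance
    return (p_obs - p_exp) / (1 - p_exp)
```

For example, `unweighted_kappa([1, 1, 1, 0, 0], [1, 1, 0, 0, 0])` corrects the raw 80% agreement for chance, giving about 0.62; identical result vectors give exactly 1.0.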

  9. Dietary patterns and depressive mood in a multiethnic representative sample of Texas eighth graders

    Science.gov (United States)

    The purpose of the study was to examine the relation between dietary patterns and depressive mood among 8th grade students in Texas. Data were from the 2004–2005 School Physical Activity and Nutrition study, a multistage probability-based sample of Texas 8th graders. Participants (n=8827; 14.7% Afri...

  10. Pattern transfer on fused silica samples using sub-aperture reactive ion beam etching

    Energy Technology Data Exchange (ETDEWEB)

    Miessler, Andre; Arnold, Thomas [Leibniz-Institut fuer Oberflaechenmodifizierung (IOM), Permoserstrasse 15, D-04318 Leipzig (Germany)

    2012-07-01

    In comparison to sole Ar ion beam sputtering, Reactive Ion Beam Etching (RIBE) offers the main advantage of increased selectivity between different kinds of materials due to chemical contributions to the material removal. RIBE is therefore an excellent candidate for pattern transfer applications. The goal of the present study is to apply a sub-aperture reactive ion beam for pattern transfer on large fused silica samples. In this context, the etching behavior in the ion beam periphery plays a decisive role. Using a Kaufman-type ion source with NF{sub 3} as reactive gas, XPS measurements of the modified surface reveal impurities such as Ni, Fe and Cr, which stem from chemically eroded material of the plasma pot, as well as the formation of a silicon nitride layer that hampers the etching process, mainly in the beam periphery where the sputtering contribution decreases. These side effects influence the pattern transfer of trench structures, produced in AZ MIR 701 photoresist by lithography on a 2'' fused silica plate, by changing the selectivity due to modified chemical reactions of the resist layer. We therefore investigate an RF ion source for sub-aperture reactive ion beam applications, and finally we examine the pattern transfer on large fused silica plates using NF{sub 3} sub-aperture RIBE.

  11. Accounting for sampling patterns reverses the relative importance of trade and climate for the global sharing of exotic plants

    Science.gov (United States)

    Sofaer, Helen R.; Jarnevich, Catherine S.

    2017-01-01

    Aim: The distributions of exotic species reflect patterns of human-mediated dispersal, species climatic tolerances and a suite of other biotic and abiotic factors. The relative importance of each of these factors will shape how the spread of exotic species is affected by ongoing economic globalization and climate change. However, patterns of trade may be correlated with variation in scientific sampling effort globally, potentially confounding studies that do not account for sampling patterns. Location: Global. Time period: Museum records, generally from the 1800s up to 2015. Major taxa studied: Plant species exotic to the United States. Methods: We used data from the Global Biodiversity Information Facility (GBIF) to summarize the number of plant species with exotic occurrences in the United States that also occur in each other country world-wide. We assessed the relative importance of trade and climatic similarity for explaining variation in the number of shared species while evaluating several methods to account for variation in sampling effort among countries. Results: Accounting for variation in sampling effort reversed the relative importance of trade and climate for explaining numbers of shared species. Trade was strongly correlated with numbers of shared U.S. exotic plants between the United States and other countries before, but not after, accounting for sampling variation among countries. Conversely, accounting for sampling effort strengthened the relationship between climatic similarity and species sharing. Using the number of records as a measure of sampling effort provided a straightforward approach for the analysis of occurrence data, whereas species richness estimators and rarefaction were less effective at removing sampling bias. Main conclusions: Our work provides support for broad-scale climatic limitation on the distributions of exotic species, illustrates the need to account for variation in sampling effort in large biodiversity databases, and highlights the

  12. Determination of Initial Conditions for the Safety Analysis by Random Sampling of Operating Parameters

    International Nuclear Information System (INIS)

    Jeong, Hae-Yong; Park, Moon-Ghu

    2015-01-01

    In most existing evaluation methodologies, which follow a conservative approach, the most conservative initial conditions are searched for each transient scenario through an extensive assessment of wide operating windows or the limiting conditions for operation (LCO) allowed by the operating guidelines. In this procedure a user effect could be involved, and considerable time and human resources are consumed. In the present study, we investigated a more effective statistical method for the selection of the most conservative initial condition by the use of random sampling of the operating parameters affecting the initial conditions. A method for the determination of initial conditions based on random sampling of plant design parameters is proposed. This method is expected to be applied for the selection of the most conservative initial plant conditions in safety analysis using a conservative evaluation methodology. In the method, it is suggested that the initial conditions of reactor coolant flow rate, pressurizer level, pressurizer pressure, and SG level are adjusted by controlling the pump rated flow and the setpoints of the PLCS, PPCS, and FWCS, respectively. The proposed technique is expected to help eliminate the human factors introduced in the conventional safety analysis procedure and also to reduce the human resources invested in the safety evaluation of nuclear power plants.
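
A minimal sketch of this random-sampling idea follows. The parameter names, operating windows, and surrogate figure of merit below are all illustrative assumptions, not actual plant LCO data or the authors' procedure:

```python
import random

# Hypothetical operating windows: parameter -> (low, high). These names
# and ranges are illustrative only, not real limiting conditions.
WINDOWS = {
    "coolant_flow": (0.95, 1.05),   # fraction of pump rated flow
    "pzr_level":    (0.50, 0.60),   # pressurizer level, fraction of span
    "pzr_pressure": (15.2, 15.8),   # pressurizer pressure, MPa
    "sg_level":     (0.45, 0.55),   # steam generator level, fraction of span
}

def most_conservative(n_samples, figure_of_merit, seed=0):
    """Randomly sample initial conditions inside the operating windows and
    return the one maximizing a user-supplied surrogate figure of merit,
    i.e. the most conservative initial condition found."""
    rng = random.Random(seed)
    best, best_val = None, float("-inf")
    for _ in range(n_samples):
        cond = {p: rng.uniform(lo, hi) for p, (lo, hi) in WINDOWS.items()}
        val = figure_of_merit(cond)
        if val > best_val:
            best, best_val = cond, val
    return best
```

For instance, if low coolant flow is conservative for a given transient, `most_conservative(1000, lambda c: -c["coolant_flow"])` returns a sampled condition with flow near the bottom of its window.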

  13. A Combined Weighting Method Based on Hybrid of Interval Evidence Fusion and Random Sampling

    Directory of Open Access Journals (Sweden)

    Ying Yan

    2017-01-01

    Due to the complexity of the system and lack of expertise, epistemic uncertainties may be present in the experts' judgment on the importance of certain indices during group decision-making. A novel combination weighting method is proposed to solve the index weighting problem when various uncertainties are present in expert comments. Based on the ideas of evidence theory, various types of uncertain evaluation information are uniformly expressed through interval evidence structures. A similarity matrix between interval evidences is constructed, and the experts' information is fused. Comment grades are quantified using interval numbers, and a cumulative probability function for evaluating the importance of indices is constructed from the fused information. Finally, index weights are obtained by Monte Carlo random sampling. The method can process expert information with varying degrees of uncertainty and possesses good compatibility. It avoids the difficulty of effectively fusing high-conflict group decision-making information and the large information loss after fusion. Original expert judgments are retained rather objectively throughout the processing procedure. Constructing the cumulative probability function and random sampling require no human intervention or judgment, and the method is easily implemented in computer programs, giving it a clear advantage in evaluation practices for fairly large index systems.
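
The final step, obtaining weights by Monte Carlo sampling from per-index cumulative probability functions, can be sketched as a simple inverse-transform sampler. This is a simplified illustration; the interval-evidence fusion that produces the CDFs is not shown, and the proportional-to-mean weighting rule is an assumption of this sketch:

```python
import bisect
import random

def sample_weights(cdfs, n_draws=10000, seed=1):
    """cdfs: one (grid, cdf) pair per index, where grid[i] is a grade value
    and cdf[i] the cumulative probability up to it (cdf[-1] == 1.0).
    Draws grades by inverse-transform sampling and returns weights
    proportional to each index's mean sampled grade."""
    rng = random.Random(seed)
    means = []
    for grid, cdf in cdfs:
        total = 0.0
        for _ in range(n_draws):
            u = rng.random()  # u in [0, 1): invert the CDF at u
            total += grid[bisect.bisect_left(cdf, u)]
        means.append(total / n_draws)
    s = sum(means)
    return [m / s for m in means]
```

With degenerate single-point CDFs the result is exact, e.g. grades fixed at 1 and 3 yield weights 0.25 and 0.75.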

  14. Patterns of Drug Use in a Sample of 200 Young Drug Users in London

    Science.gov (United States)

    McCambridge, Jim; Strang, John

    2004-01-01

    A cross-sectional analysis of baseline data collected during a secondary prevention intervention study was conducted to describe patterns of drug use in a non-treatment sample of young drug users recruited in ten further-education colleges across inner London. Participants were 200 young people who were either weekly cannabis users and/or who had…

  15. Human values and the emergence of a sustainable consumption pattern: A panel study

    DEFF Research Database (Denmark)

    Thøgersen, John; Ølander, Carl Folke

    2002-01-01

    In this study, data from a random sample of Danish consumers are used to test the hypothesis that the emergence of a sustainable consumption pattern is influenced by individual value priorities. By the use of a cross-lagged panel design and structural equation modelling it is possible to draw...

  16. Low-fat dietary pattern and cardiovascular disease: results from the Women's Health Initiative randomized controlled trial.

    Science.gov (United States)

    Prentice, Ross L; Aragaki, Aaron K; Van Horn, Linda; Thomson, Cynthia A; Beresford, Shirley Aa; Robinson, Jennifer; Snetselaar, Linda; Anderson, Garnet L; Manson, JoAnn E; Allison, Matthew A; Rossouw, Jacques E; Howard, Barbara V

    2017-07-01

    Background: The influence of a low-fat dietary pattern on the cardiovascular health of postmenopausal women continues to be of public health interest. Objective: This report evaluates low-fat dietary pattern influences on cardiovascular disease (CVD) incidence and mortality during the intervention and postintervention phases of the Women's Health Initiative Dietary Modification Trial. Design: Participants comprised 48,835 postmenopausal women aged 50-79 y; 40% were randomly assigned to a low-fat dietary pattern intervention (target of 20% of energy from fat), and 60% were randomly assigned to a usual diet comparison group. The 8.3-y intervention period ended in March 2005, after which >80% of surviving participants consented to additional active follow-up through September 2010; all participants were followed for mortality through 2013. Breast and colorectal cancer were the primary trial outcomes, and coronary heart disease (CHD) and overall CVD were additional designated outcomes. Results: Incidence rates for CHD and total CVD did not differ between the intervention and comparison groups in either the intervention or postintervention period. However, CHD HRs comparing these groups varied strongly with baseline CVD and hypertension status. Participants without prior CVD had an intervention period CHD HR of 0.70 (95% CI: 0.56, 0.87) or 1.04 (95% CI: 0.90, 1.19) if they were normotensive or hypertensive, respectively (P-interaction = 0.003). The CHD benefit among healthy normotensive women was partially offset by an increase in ischemic stroke risk. Corresponding HRs in the postintervention period were close to null. Participants with CVD at baseline (3.4%) had CHD HRs of 1.47 (95% CI: 1.12, 1.93) and 1.61 (95% CI: 1.02, 2.55) in the intervention and postintervention periods, respectively. However, various lines of evidence suggest that results in women with CVD or hypertension at baseline are confounded by postrandomization use of cholesterol-lowering medications.

  17. Self-reference and random sampling approach for label-free identification of DNA composition using plasmonic nanomaterials.

    Science.gov (United States)

    Freeman, Lindsay M; Pang, Lin; Fainman, Yeshaiahu

    2018-05-09

    The analysis of DNA has led to revolutionary advancements in the fields of medical diagnostics, genomics, prenatal screening, and forensic science, with the global DNA testing market expected to reach revenues of USD 10.04 billion per year by 2020. However, the current methods for DNA analysis remain dependent on fluorophores or conjugated proteins, leading to high costs associated with consumable materials and manual labor. Here, we demonstrate a potential label-free DNA composition detection method using surface-enhanced Raman spectroscopy (SERS) in which we identify the composition of cytosine and adenine within single strands of DNA. This approach exploits the fact that there is one phosphate backbone per nucleotide, which we use as a reference to compensate for systematic measurement variations. We utilize plasmonic nanomaterials with random Raman sampling to perform label-free detection of the nucleotide composition within DNA strands, generating a calibration curve from standard samples of DNA and demonstrating the capability of resolving the nucleotide composition. The work represents an innovative way to detect the nucleotide composition of DNA strands without the need for attached labels, offering a highly sensitive and reproducible method that factors in random sampling to minimize error.

  18. The Effects of 6 Isocaloric Meals Pattern on Blood Lipid Profile, Glucose, Hemoglobin A1c, Insulin and Malondialdehyde in Type 2 Diabetic Patients: A Randomized Clinical Trial

    Directory of Open Access Journals (Sweden)

    Moosa Salehi

    2014-09-01

    Background: The present clinical trial investigates the effect of consuming the daily energy intake in 6 isocaloric meals, in comparison with the current meal pattern (3 meals and 2 small snacks per day), on type 2 diabetes risk markers over a 3-month period. Methods: Eighty-four type 2 diabetes patients were randomly assigned to either a 6 isocaloric meal diet or a balanced diet (3 meals and 2 snacks, their previous meal pattern). The planned reduced-calorie diets for both groups were identical except for the meal pattern. Blood samples were analyzed before and after the investigation for fasting blood sugar (FBS), two-hour post-prandial glucose (2hPP), insulin, hemoglobin A1c (HbA1c), total cholesterol, triglyceride, HDL-C, LDL-C, and malondialdehyde (MDA) concentrations. Results: HbA1c (P=0.00) and body mass index (BMI) (P=0.04) values decreased significantly in the 6 isocaloric meal pattern group compared with the controls. There were no significant differences in fasting serum glucose (P=0.09), insulin (P=0.65), total cholesterol (P=0.32), LDL-C (P=0.43), HDL-C (P=0.40), triglyceride (P=0.40), MDA (P=0.13) or 2hPP serum glucose (P=0.30) concentrations between the 6 isocaloric meal and the traditional meal pattern. Conclusion: The 6 isocaloric meal pattern, in comparison with the current meal pattern, led to weight loss and improved glycemic control. Serum lipid profile and MDA did not change significantly. Trial Registration Number: IRCT201205179780N1

  19. Frequency-Selective Signal Sensing with Sub-Nyquist Uniform Sampling Scheme

    DEFF Research Database (Denmark)

    Pierzchlewski, Jacek; Arildsen, Thomas

    2015-01-01

    In this paper the authors discuss a problem of acquisition and reconstruction of a signal polluted by adjacent-channel interference. The authors propose a method to find a sub-Nyquist uniform sampling pattern which allows for correct reconstruction of selected frequencies. The method is inspired...... by the Restricted Isometry Property, which is known from the field of compressed sensing. Then, compressed sensing is used to successfully reconstruct a wanted signal even if some of the uniform samples were randomly lost, e.g. due to ADC saturation. An experiment which tests the proposed method in practice...
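
The recovery side of this idea can be illustrated as follows: given a sub-Nyquist set of uniform sample instants and the frequency bins of interest, the selected DFT coefficients are obtained by least squares on a partial Fourier matrix. This is a schematic NumPy sketch of the principle, not the authors' RIP-inspired pattern search:

```python
import numpy as np

def recover_selected_freqs(x_samples, sample_idx, freq_bins, N):
    """Least-squares recovery of selected length-N DFT coefficients from
    a sub-Nyquist set of uniform time-domain samples."""
    n = np.asarray(sample_idx)[:, None]     # observed sample instants (rows)
    k = np.asarray(freq_bins)[None, :]      # DFT bins of interest (columns)
    A = np.exp(2j * np.pi * n * k / N) / N  # partial inverse-DFT matrix
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(x_samples), rcond=None)
    return coeffs
```

For a single tone at bin 3 of a length-64 signal observed at every 4th sample, the sketch recovers the coefficient at bin 3 and returns (near) zero for a second candidate bin, provided the two bins do not alias onto each other under the chosen decimation.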

  20. Comparing cluster-level dynamic treatment regimens using sequential, multiple assignment, randomized trials: Regression estimation and sample size considerations.

    Science.gov (United States)

    NeCamp, Timothy; Kilbourne, Amy; Almirall, Daniel

    2017-08-01

    Cluster-level dynamic treatment regimens can be used to guide sequential treatment decision-making at the cluster level in order to improve outcomes at the individual or patient-level. In a cluster-level dynamic treatment regimen, the treatment is potentially adapted and re-adapted over time based on changes in the cluster that could be impacted by prior intervention, including aggregate measures of the individuals or patients that compose it. Cluster-randomized sequential multiple assignment randomized trials can be used to answer multiple open questions preventing scientists from developing high-quality cluster-level dynamic treatment regimens. In a cluster-randomized sequential multiple assignment randomized trial, sequential randomizations occur at the cluster level and outcomes are observed at the individual level. This manuscript makes two contributions to the design and analysis of cluster-randomized sequential multiple assignment randomized trials. First, a weighted least squares regression approach is proposed for comparing the mean of a patient-level outcome between the cluster-level dynamic treatment regimens embedded in a sequential multiple assignment randomized trial. The regression approach facilitates the use of baseline covariates which is often critical in the analysis of cluster-level trials. Second, sample size calculators are derived for two common cluster-randomized sequential multiple assignment randomized trial designs for use when the primary aim is a between-dynamic treatment regimen comparison of the mean of a continuous patient-level outcome. The methods are motivated by the Adaptive Implementation of Effective Programs Trial which is, to our knowledge, the first-ever cluster-randomized sequential multiple assignment randomized trial in psychiatry.
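
The core of the proposed regression approach, weighted least squares, can be sketched as below. In a clustered SMART the weights would be the known design weights reflecting each cluster's probability of following an embedded regimen; here they are simply inputs, and the closed-form normal-equations solver is a generic textbook sketch rather than the authors' estimator:

```python
import numpy as np

def wls(X, y, w):
    """Weighted least squares: argmin_b sum_i w_i * (y_i - x_i @ b)**2,
    solved via the normal equations (X' W X) b = X' W y."""
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
```

Baseline covariates enter simply as extra columns of `X`, which is what makes the regression formulation convenient for cluster-level trials.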

  1. Gray bootstrap method for estimating frequency-varying random vibration signals with small samples

    Directory of Open Access Journals (Sweden)

    Wang Yanqing

    2014-04-01

    During environmental testing, the estimation of random vibration signals (RVS) is an important technique for airborne platform safety and reliability. However, the available methods, including the extreme value envelope method (EVEM), the statistical tolerance method (STM) and the improved statistical tolerance method (ISTM), require large samples and a typical probability distribution. Moreover, the frequency-varying characteristic of RVS is usually not taken into account. The gray bootstrap method (GBM) is proposed to solve the problem of estimating frequency-varying RVS with small samples. First, the estimation indexes are obtained, including the estimated interval, the estimated uncertainty, the estimated value, the estimated error and the estimated reliability. In addition, GBM is applied to estimation for a single flight test of a certain aircraft. Finally, in order to evaluate the estimation performance, GBM is compared with the bootstrap method (BM) and the gray method (GM) in testing analysis. The results show that GBM is superior for estimating dynamic signals with small samples, and the estimated reliability is proved to be 100% at the given confidence level.
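
The bootstrap half of the method can be sketched as a plain percentile bootstrap, without the grey-model GM(1,1) extension the authors combine it with:

```python
import random

def bootstrap_interval(data, stat, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap estimate of `stat` over `data`, returning
    the point estimate and a (1 - alpha) confidence interval."""
    rng = random.Random(seed)
    reps = sorted(
        stat([rng.choice(data) for _ in range(len(data))])  # resample with replacement
        for _ in range(n_boot)
    )
    lo = reps[int(alpha / 2 * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return stat(data), (lo, hi)
```

Because resampling needs no distributional assumption, this is the part of the method that makes small-sample estimation feasible; the grey model then supplies the frequency-varying trend.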

  2. Probabilistic generation of random networks taking into account information on motifs occurrence.

    Science.gov (United States)

    Bois, Frederic Y; Gayraud, Ghislaine

    2015-01-01

    Because of the huge number of graphs possible even with a small number of nodes, inference on network structure is known to be a challenging problem. Generating large random directed graphs with prescribed probabilities of occurrences of some meaningful patterns (motifs) is also difficult. We show how to generate such random graphs according to a formal probabilistic representation, using fast Markov chain Monte Carlo methods to sample them. As an illustration, we generate realistic graphs with several hundred nodes mimicking a gene transcription interaction network in Escherichia coli.
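
The idea of MCMC sampling of graphs with prescribed pattern occurrences can be illustrated with a toy Metropolis sampler whose energy penalizes deviation from a target count of mutual dyads; the dyad motif and quadratic energy are stand-ins for the paper's formal probabilistic representation, not its actual model:

```python
import math
import random

def count_mutual_dyads(adj):
    """Number of node pairs connected by directed edges in both directions."""
    n = len(adj)
    return sum(adj[i][j] and adj[j][i] for i in range(n) for j in range(i + 1, n))

def sample_graph(n, target, steps=20000, beta=2.0, seed=0):
    """Metropolis chain over directed graphs on n nodes, with energy
    (motif_count - target)**2 so that graphs near the prescribed motif
    count dominate the stationary distribution."""
    rng = random.Random(seed)
    adj = [[0] * n for _ in range(n)]
    energy = (count_mutual_dyads(adj) - target) ** 2
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        if i == j:
            continue
        adj[i][j] ^= 1  # propose: toggle one directed edge
        new_energy = (count_mutual_dyads(adj) - target) ** 2
        if new_energy <= energy or rng.random() < math.exp(-beta * (new_energy - energy)):
            energy = new_energy  # accept the move
        else:
            adj[i][j] ^= 1       # reject: undo the toggle
    return adj
```

Larger `beta` concentrates the chain more tightly on graphs whose motif count matches the target, at the cost of slower mixing.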

  3. Acute stress symptoms during the second Lebanon war in a random sample of Israeli citizens.

    Science.gov (United States)

    Cohen, Miri; Yahav, Rivka

    2008-02-01

    The aims of this study were to assess prevalence of acute stress disorder (ASD) and acute stress symptoms (ASS) in Israel during the second Lebanon war. A telephone survey was conducted in July 2006 of a random sample of 235 residents of northern Israel, who were subjected to missile attacks, and of central Israel, who were not subjected to missile attacks. Results indicate that ASS scores were higher in the northern respondents; 6.8% of the northern sample and 3.9% of the central sample met ASD criteria. Appearance of each symptom ranged from 15.4% for dissociative to 88.4% for reexperiencing, with significant differences between northern and central respondents only for reexperiencing and arousal. A low ASD rate and a moderate difference between areas subjected and not subjected to attack were found.

  4. Prevalence and correlates of problematic smartphone use in a large random sample of Chinese undergraduates.

    Science.gov (United States)

    Long, Jiang; Liu, Tie-Qiao; Liao, Yan-Hui; Qi, Chang; He, Hao-Yu; Chen, Shu-Bao; Billieux, Joël

    2016-11-17

    Smartphones are becoming a daily necessity for most undergraduates in Mainland China. Because the present scenario of problematic smartphone use (PSU) is largely unexplored, in the current study we aimed to estimate the prevalence of PSU and to screen suitable predictors for PSU among Chinese undergraduates in the framework of the stress-coping theory. A sample of 1062 undergraduate smartphone users was recruited by means of the stratified cluster random sampling strategy between April and May 2015. The Problematic Cellular Phone Use Questionnaire was used to identify PSU. We evaluated five candidate risk factors for PSU by using logistic regression analysis while controlling for demographic characteristics and specific features of smartphone use. The prevalence of PSU among Chinese undergraduates was estimated to be 21.3%. The risk factors for PSU were majoring in the humanities, high monthly income from the family (≥1500 RMB), serious emotional symptoms, high perceived stress, and perfectionism-related factors (high doubts about actions, high parental expectations). PSU among undergraduates appears to be ubiquitous and thus constitutes a public health issue in Mainland China. Although further longitudinal studies are required to test whether PSU is a transient phenomenon or a chronic and progressive condition, our study successfully identified socio-demographic and psychological risk factors for PSU. These results, obtained from a random and thus representative sample of undergraduates, opens up new avenues in terms of prevention and regulation policies.

  5. Randomized Comparison of Two Vaginal Self-Sampling Methods for Human Papillomavirus Detection: Dry Swab versus FTA Cartridge

    OpenAIRE

    Catarino, Rosa; Vassilakos, Pierre; Bilancioni, Aline; Vanden Eynde, Mathieu; Meyer-Hamme, Ulrike; Menoud, Pierre-Alain; Guerry, Frédéric; Petignat, Patrick

    2015-01-01

    Background Human papillomavirus (HPV) self-sampling (self-HPV) is valuable in cervical cancer screening. HPV testing is usually performed on physician-collected cervical smears stored in liquid-based medium. Dry filters and swabs are an alternative. We evaluated the adequacy of self-HPV using two dry storage and transport devices, the FTA cartridge and swab. Methods A total of 130 women performed two consecutive self-HPV samples. Randomization determined which of the two tests was performed f...

  6. Random Sampling with Interspike-Intervals of the Exponential Integrate and Fire Neuron: A Computational Interpretation of UP-States.

    Directory of Open Access Journals (Sweden)

    Andreas Steimer

    Oscillations between high and low values of the membrane potential (UP and DOWN states, respectively) are a ubiquitous feature of cortical neurons during slow wave sleep and anesthesia. Nevertheless, only a surprisingly small number of quantitative studies have dealt with this phenomenon's implications for computation. Here we present a novel theory that explains on a detailed mathematical level the computational benefits of UP states. The theory is based on random sampling by means of interspike intervals (ISIs) of the exponential integrate and fire (EIF) model neuron, such that each spike is considered a sample whose analog value corresponds to the spike's preceding ISI. As we show, the EIF's exponential sodium current, which kicks in when a noisy membrane potential is balanced around values close to the firing threshold, leads to a particularly simple, approximative relationship between the neuron's ISI distribution and input current. Approximation quality depends on the frequency spectrum of the current and is improved upon increasing the voltage baseline towards threshold. Thus, the conceptually simpler leaky integrate and fire neuron, which is missing such an additional current boost, performs consistently worse than the EIF and does not improve when the voltage baseline is increased. For the EIF, in contrast, the presented mechanism is particularly effective in the high-conductance regime, which is a hallmark feature of UP states. Our theoretical results are confirmed by accompanying simulations, which were conducted for input currents of varying spectral composition. Moreover, we provide analytical estimations of the range of ISI distributions the EIF neuron can sample from at a given approximation level. Such samples may be considered by any algorithmic procedure that is based on random sampling, such as Markov Chain Monte Carlo or message-passing methods. Finally, we explain how spike-based random sampling relates to existing

  7. Random Sampling with Interspike-Intervals of the Exponential Integrate and Fire Neuron: A Computational Interpretation of UP-States.

    Science.gov (United States)

    Steimer, Andreas; Schindler, Kaspar

    2015-01-01

    Oscillations between high and low values of the membrane potential (UP and DOWN states, respectively) are a ubiquitous feature of cortical neurons during slow wave sleep and anesthesia. Nevertheless, only a surprisingly small number of quantitative studies have dealt with this phenomenon's implications for computation. Here we present a novel theory that explains on a detailed mathematical level the computational benefits of UP states. The theory is based on random sampling by means of interspike intervals (ISIs) of the exponential integrate and fire (EIF) model neuron, such that each spike is considered a sample whose analog value corresponds to the spike's preceding ISI. As we show, the EIF's exponential sodium current, which kicks in when a noisy membrane potential is balanced around values close to the firing threshold, leads to a particularly simple, approximative relationship between the neuron's ISI distribution and input current. Approximation quality depends on the frequency spectrum of the current and is improved upon increasing the voltage baseline towards threshold. Thus, the conceptually simpler leaky integrate and fire neuron, which is missing such an additional current boost, performs consistently worse than the EIF and does not improve when the voltage baseline is increased. For the EIF, in contrast, the presented mechanism is particularly effective in the high-conductance regime, which is a hallmark feature of UP states. Our theoretical results are confirmed by accompanying simulations, which were conducted for input currents of varying spectral composition. Moreover, we provide analytical estimations of the range of ISI distributions the EIF neuron can sample from at a given approximation level. Such samples may be considered by any algorithmic procedure that is based on random sampling, such as Markov Chain Monte Carlo or message-passing methods. Finally, we explain how spike-based random sampling relates to existing computational
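
Treating ISIs as samples can be made concrete with a small Euler simulation of an EIF neuron driven by noisy current. All parameter values below are illustrative defaults, not taken from the study:

```python
import math
import random

def eif_isis(i_mean, i_std, t_max=5.0, dt=1e-4, seed=0):
    """Forward-Euler simulation of an exponential integrate-and-fire
    neuron under noisy current drive; returns the list of interspike
    intervals (the 'samples' in the paper's reading). Units: mV and s;
    the input current is expressed in mV (already scaled by resistance)."""
    rng = random.Random(seed)
    # Illustrative parameters: leak reversal EL, soft threshold VT,
    # slope factor DT, reset Vr, numerical spike cutoff Vpeak, membrane tau.
    EL, VT, DT, Vr, Vpeak, tau = -65.0, -50.0, 2.0, -60.0, -30.0, 0.02
    v, t, last_spike, isis = EL, 0.0, None, []
    while t < t_max:
        I = i_mean + i_std * rng.gauss(0.0, 1.0) / math.sqrt(dt)  # white-noise drive
        v += dt * (-(v - EL) + DT * math.exp((v - VT) / DT) + I) / tau
        t += dt
        if v >= Vpeak:  # spike: log the ISI and reset the membrane
            if last_spike is not None:
                isis.append(t - last_spike)
            last_spike, v = t, Vr
    return isis
```

With a suprathreshold mean drive (above the EIF rheobase of `VT - EL - DT` = 13 mV for these parameters), the returned ISI list is the empirical sample set whose distribution the theory relates to the input current.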

  8. Discriminative motif discovery via simulated evolution and random under-sampling.

    Directory of Open Access Journals (Sweden)

    Tao Song

    Conserved motifs in biological sequences are closely related to their structure and functions. Recently, discriminative motif discovery methods have attracted more and more attention. However, little attention has been devoted to the data imbalance problem, which is one of the main reasons affecting the performance of the discriminative models. In this article, a simulated evolution method is applied to solve the multi-class imbalance problem at the stage of data preprocessing, and at the stage of Hidden Markov Models (HMMs) training, a random under-sampling method is introduced for the imbalance between the positive and negative datasets. It is shown that, in the task of discovering targeting motifs of nine subcellular compartments, the motifs found by our method are more conserved than those found by methods that do not consider the data imbalance problem, and recover the most known targeting motifs from Minimotif Miner and InterPro. Meanwhile, we use the found motifs to predict protein subcellular localization and achieve higher prediction precision and recall for the minority classes.

  9. Discriminative motif discovery via simulated evolution and random under-sampling.

    Science.gov (United States)

    Song, Tao; Gu, Hong

    2014-01-01

    Conserved motifs in biological sequences are closely related to their structure and functions. Recently, discriminative motif discovery methods have attracted more and more attention. However, little attention has been devoted to the data imbalance problem, which is one of the main reasons affecting the performance of the discriminative models. In this article, a simulated evolution method is applied to solve the multi-class imbalance problem at the stage of data preprocessing, and at the stage of Hidden Markov Models (HMMs) training, a random under-sampling method is introduced for the imbalance between the positive and negative datasets. It is shown that, in the task of discovering targeting motifs of nine subcellular compartments, the motifs found by our method are more conserved than those found by methods that do not consider the data imbalance problem, and recover the most known targeting motifs from Minimotif Miner and InterPro. Meanwhile, we use the found motifs to predict protein subcellular localization and achieve higher prediction precision and recall for the minority classes.
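
The random under-sampling step, balancing the positive and negative sets before HMM training, can be sketched generically as follows (plain Python with hypothetical binary labels, not the authors' pipeline):

```python
import random
from collections import Counter

def random_undersample(X, y, seed=0):
    """Balance a labeled dataset by randomly discarding examples from the
    larger classes until every class matches the minority-class size."""
    rng = random.Random(seed)
    counts = Counter(y)
    keep_n = min(counts.values())  # minority-class size
    kept = []
    for label in counts:
        idx = [i for i, lab in enumerate(y) if lab == label]
        rng.shuffle(idx)
        kept.extend(idx[:keep_n])  # keep a random subset of this class
    kept.sort()
    return [X[i] for i in kept], [y[i] for i in kept]
```

Discarding majority examples trades some information for a balanced training set, which is why the authors pair it with the simulated-evolution preprocessing rather than relying on it alone.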

  10. Random scalar fields and hyperuniformity

    Science.gov (United States)

    Ma, Zheng; Torquato, Salvatore

    2017-06-01

    Disordered many-particle hyperuniform systems are exotic amorphous states of matter that lie between crystals and liquids. Hyperuniform systems have attracted recent attention because they are endowed with novel transport and optical properties. Recently, the hyperuniformity concept has been generalized to characterize two-phase media, scalar fields, and random vector fields. In this paper, we devise methods to explicitly construct hyperuniform scalar fields. Specifically, we analyze spatial patterns generated from Gaussian random fields, which have been used to model the microwave background radiation and heterogeneous materials, the Cahn-Hilliard equation for spinodal decomposition, and Swift-Hohenberg equations that have been used to model emergent pattern formation, including Rayleigh-Bénard convection. We show that the Gaussian random scalar fields can be constructed to be hyperuniform. We also numerically study the time evolution of spinodal decomposition patterns and demonstrate that they are hyperuniform in the scaling regime. Moreover, we find that labyrinth-like patterns generated by the Swift-Hohenberg equation are effectively hyperuniform. We show that thresholding (level-cutting) a hyperuniform Gaussian random field to produce a two-phase random medium tends to destroy the hyperuniformity of the progenitor scalar field. We then propose guidelines to achieve effectively hyperuniform two-phase media derived from thresholded non-Gaussian fields. Our investigation paves the way for new research directions to characterize the large-structure spatial patterns that arise in physics, chemistry, biology, and ecology. Moreover, our theoretical results are expected to guide experimentalists to synthesize new classes of hyperuniform materials with novel physical properties via coarsening processes and using state-of-the-art techniques, such as stereolithography and 3D printing.
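The level-cut (thresholding) construction mentioned in this abstract can be illustrated with a toy Gaussian random field. The grid size, the power-law spectrum, and the zero threshold below are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 128

# White noise shaped in Fourier space with a power-law spectrum
# (the exponent -2 is an illustrative choice, not from the paper).
noise = rng.standard_normal((n, n))
kx = np.fft.fftfreq(n)
k = np.sqrt(kx[:, None] ** 2 + kx[None, :] ** 2)
k[0, 0] = 1.0  # avoid division by zero at the zero mode
field = np.fft.ifft2(np.fft.fft2(noise) * np.sqrt(k ** -2.0)).real
field = (field - field.mean()) / field.std()

# Level cut (thresholding) at zero: phase 1 where the field is positive
two_phase = field > 0.0
print(two_phase.mean())  # volume fraction of phase 1
```

Hyperuniformity itself would be assessed from the small-wavenumber behavior of the spectral density of the resulting pattern, which is beyond this sketch.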

  11. Magnetic nanoparticle imaging by random and maximum length sequences of inhomogeneous activation fields.

    Science.gov (United States)

    Baumgarten, Daniel; Eichardt, Roland; Crevecoeur, Guillaume; Supriyanto, Eko; Haueisen, Jens

    2013-01-01

    Biomedical applications of magnetic nanoparticles require a precise knowledge of their biodistribution. From multi-channel magnetorelaxometry measurements, this distribution can be determined by means of inverse methods. It was recently shown that combining sequential inhomogeneous excitation fields in these measurements is favorable for reconstruction accuracy when compared to homogeneous activation. In this paper, approaches for determining activation sequences for these measurements are investigated. To this end, consecutive activation of single coils, random activation patterns, and families of m-sequences are examined in computer simulations involving a sample measurement setup and compared with respect to the relative condition number of the system matrix. We find that the condition number decreases with a larger number of measurement samples for all approaches. Random sequences and m-sequences yield similar results, with a significant reduction in the required number of samples. We conclude that applying pseudo-random sequences for sequential activation in magnetorelaxometry imaging of magnetic nanoparticles considerably reduces the number of required sequences while preserving the relevant measurement information.
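The evaluation metric used in this record, the condition number of the system matrix, can be computed directly. The toy lead-field matrix and the ±1 activation patterns below are assumptions for illustration only, not the paper's simulation setup.

```python
import numpy as np

rng = np.random.default_rng(1)
n_coils, n_voxels = 24, 20

# Toy lead-field matrix: response of each coil to each voxel
# (random numbers here purely for illustration).
L = rng.standard_normal((n_coils, n_voxels))

def condition_number(S):
    """Condition number of the system matrix for activation scheme S,
    where each row of S mixes the coil responses for one measurement."""
    return np.linalg.cond(S @ L)

S_single = np.eye(n_coils)  # consecutive activation of single coils
S_random = rng.choice([-1.0, 1.0], size=(n_coils, n_coils))  # random patterns

print(condition_number(S_single), condition_number(S_random))
```

A lower condition number indicates a better-posed reconstruction; the paper compares how quickly it drops as measurement samples are added under each activation scheme.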

  12. Accounting for randomness in measurement and sampling in studying cancer cell population dynamics.

    Science.gov (United States)

    Ghavami, Siavash; Wolkenhauer, Olaf; Lahouti, Farshad; Ullah, Mukhtar; Linnebacher, Michael

    2014-10-01

    Knowing the expected temporal evolution of the proportion of different cell types in sample tissues gives an indication about the progression of the disease and its possible response to drugs. Such systems have been modelled using Markov processes. Here we consider an experimentally realistic scenario in which transition probabilities are estimated from noisy cell population size measurements. Using aggregated data of FACS measurements, we develop MMSE and ML estimators and formulate two problems to find the minimum number of required samples and measurements to guarantee the accuracy of predicted population sizes. Our numerical results show that the convergence behaviour of transition probabilities and steady states differs widely from the real values if one uses the standard deterministic approach for noisy measurements. This provides support for our argument that for the analysis of FACS data one should consider the observed state as a random variable. The second problem we address concerns the consequences of estimating the probability of a cell being in a particular state from measurements of a small population of cells. We show how the uncertainty arising from small sample sizes can be captured by a distribution for the state probability.
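The second problem, uncertainty in a state probability estimated from a small cell sample, can be made concrete with the standard binomial estimate. This is a generic sketch, not the estimators developed in the paper; the sample counts are made up.

```python
import math

def state_probability_estimate(k, n):
    """Estimate the probability of a cell being in a state from k of n
    sampled cells, with the binomial standard error (shrinks ~1/sqrt(n))."""
    p_hat = k / n
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat, se

print(state_probability_estimate(12, 40))     # same estimate, wide uncertainty
print(state_probability_estimate(300, 1000))  # same estimate, narrower
```

Both samples give the same point estimate (0.3), but the small sample carries roughly five times the standard error, which is the effect the paper captures with a full distribution over the state probability.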

  13. RANdom SAmple Consensus (RANSAC) algorithm for material-informatics: application to photovoltaic solar cells.

    Science.gov (United States)

    Kaspi, Omer; Yosipof, Abraham; Senderowitz, Hanoch

    2017-06-06

    An important aspect of chemoinformatics and material-informatics is the usage of machine learning algorithms to build Quantitative Structure Activity Relationship (QSAR) models. The RANdom SAmple Consensus (RANSAC) algorithm is a predictive modeling tool widely used in the image processing field for cleaning datasets from noise. RANSAC could be used as a "one stop shop" algorithm for developing and validating QSAR models, performing outlier removal, descriptors selection, model development and predictions for test set samples using applicability domain. For "future" predictions (i.e., for samples not included in the original test set) RANSAC provides a statistical estimate for the probability of obtaining reliable predictions, i.e., predictions within a pre-defined number of standard deviations from the true values. In this work we describe the first application of RANSAC in material informatics, focusing on the analysis of solar cells. We demonstrate that for three datasets representing different metal oxide (MO) based solar cell libraries RANSAC-derived models select descriptors previously shown to correlate with key photovoltaic properties and lead to good predictive statistics for these properties. These models were subsequently used to predict the properties of virtual solar cells libraries highlighting interesting dependencies of PV properties on MO compositions.
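The core RANSAC loop, fitting models to random minimal samples and keeping the one with the most inliers, can be sketched for a simple line-fitting case. The tolerance, iteration count, and toy data are assumptions, and this omits the descriptor-selection and applicability-domain machinery described in the paper.

```python
import random

def ransac_line(points, n_iters=200, tol=0.5, seed=0):
    """Fit y = a*x + b robustly: repeatedly fit to a random minimal
    sample (2 points) and keep the model with the most inliers."""
    rng = random.Random(seed)
    best, best_inliers = None, -1
    for _ in range(n_iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue  # vertical pair, cannot fit y = a*x + b
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = sum(abs(y - (a * x + b)) < tol for x, y in points)
        if inliers > best_inliers:
            best, best_inliers = (a, b), inliers
    return best, best_inliers

# 30 points on y = 2x + 1 plus 10 gross outliers on another line
pts = [(x, 2 * x + 1) for x in range(30)] + [(x, 50 - 3 * x) for x in range(10)]
(a, b), n_in = ransac_line(pts)
print(a, b, n_in)  # 2.0 1.0 30
```

With a 75% inlier fraction, 200 iterations make it overwhelmingly likely that at least one minimal sample contains only inliers, so the outliers are cleanly rejected.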

  14. Relationship between accuracy and number of samples on statistical quantity and contour map of environmental gamma-ray dose rate. Example of random sampling

    International Nuclear Information System (INIS)

    Matsuda, Hideharu; Minato, Susumu

    2002-01-01

    The accuracy of statistical quantities such as the mean value and the contour map obtained by measurement of the environmental gamma-ray dose rate was evaluated by random sampling of 5 different model distribution maps made with the mean slope, -1.3, of power spectra calculated from actually measured values. The values were derived from 58 natural gamma dose rate datasets reported worldwide, with means ranging over 10-100 nGy/h and areas of 10⁻³-10⁷ km². The accuracy of the mean value was found to be around ±7% even for 60 or 80 samplings (the most frequent number), and the standard deviation had an accuracy of less than 1/4-1/3 of the means. The correlation coefficient of the frequency distribution was found to be 0.860 or more for 200-400 samplings (the most frequent number), but that of the contour map was 0.502-0.770. (K.H.)

  15. The Effect of Omeprazole Usage on the Viability of Random Pattern Skin Flaps in Rats.

    Science.gov (United States)

    Şen, Hilmi; Oruç, Melike; Işik, Veysel Murat; Sadiç, Murat; Sayar, Hamide; Çitil, Rana; Korkmaz, Meliha; Koçer, Uğur

    2017-06-01

    Necrosis of random pattern flaps caused by inadequate blood flow, especially in the distal part of the flap, is one of the biggest challenges in reconstructive surgery. Various agents have been used to prevent flap ischemia. In this study, we used omeprazole, a potent inhibitor of gastric acidity, to increase flap viability. Thirty-five Wistar-Albino rats, divided into 5 equal groups, were used. Random-pattern dorsal skin flaps were raised in all groups on the seventh day of the study. Group 1 was accepted as the control group, and the rats in this group were only given distilled water intraperitoneally for 14 days. Groups 2 and 3 received 10 and 40 mg/kg omeprazole daily for 14 days, respectively. Groups 4 and 5 were given distilled water for the first 7 days and then, after the operations, received 10 and 40 mg/kg omeprazole daily for 7 days, respectively. Survival rates of the flaps were examined on the seventh day after elevation of the flaps by digital imaging and scintigraphy. After assessment of the amount of necrosis, the number of vascular structures was counted histopathologically. The percentage of flap necrosis was found to be less in all omeprazole-receiving groups. On digital imaging, percentages of flap necrosis in the study groups were statistically significantly lower than that of the control group (P < 0.05). In the histopathologic specimens, it was detected that the mean number of vessels in the proximal (a) and distal (c) portions of the flap in the study groups showed a significant increase when compared with the control group (P < 0.05). Usage of medications increasing gastrin during flap surgeries can therefore be thought of as a positive contributor. In this sense, this study showed that parenteral administration of omeprazole in skin flap surgery increases flap viability, possibly by increasing gastrin levels.

  16. Study of Relationship between Genetic Pattern and Susceptibility to Fluconazole in Clinical Isolated of Trichophyton rubrum

    Directory of Open Access Journals (Sweden)

    F Hadadi

    2015-06-01

    Full Text Available Background & objectives: Trichophyton rubrum is one of the most common pathogenic causes of dermatophytosis. One of the drugs prescribed for fungal infections is fluconazole, which belongs to the azole group of antifungal agents. Recently, molecular typing methods have been developed to answer epidemiological questions and disease recurrence problems. The current study was conducted on 22 isolates of Trichophyton rubrum obtained randomly from patients. Our aim was to investigate the correlation between genetic pattern and sensitivity to fluconazole in clinical isolates of Trichophyton rubrum.   Methods: First, the genus and species of the fungi isolated from patients were confirmed by macroscopic and microscopic methods. Then, the resistance and sensitivity of the isolates to the drug were determined using culture medium containing a defined amount of drug. In the next step, fungal DNA was extracted and amplified by RAPD-PCR (random amplified polymorphic DNA) with 3 random primers.   Results: Each primer produced a different amplification pattern, and differences were observed in the genetic patterns of resistant and sensitive samples using all 3 primers, but there was no band with 100% specificity.   Conclusion: The 12 sensitive isolates which didn't grow at a 50 µg/ml concentration of the drug also had limited growth at lower concentrations of the drug. The 10 resistant isolates which grew at 50 µg/ml of the drug also showed resistance to lower concentrations of the drug. There are differences in the genetic patterns of resistant and sensitive samples. RAPD analysis for molecular typing of Trichophyton rubrum seems to be completely suitable.

  17. Choosing a suitable sample size in descriptive sampling

    International Nuclear Information System (INIS)

    Lee, Yong Kyun; Choi, Dong Hoon; Cha, Kyung Joon

    2010-01-01

    Descriptive sampling (DS) is an alternative to crude Monte Carlo sampling (CMCS) in finding solutions to structural reliability problems. It is known to be an effective sampling method in approximating the distribution of a random variable because it uses the deterministic selection of sample values and their random permutation. However, because this method is difficult to apply to complex simulations, the sample size is occasionally determined without thorough consideration. Input sample variability may cause the sample size to change between runs, leading to poor simulation results. This paper proposes a numerical method for choosing a suitable sample size for use in DS. Using this method, one can estimate a more accurate probability of failure in a reliability problem while running a minimal number of simulations. The method is then applied to several examples and compared with CMCS and conventional DS to validate its usefulness and efficiency
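Descriptive sampling as characterized here, deterministic selection of sample values combined with their random permutation, can be sketched and compared with crude Monte Carlo sampling. The standard-normal input, the sample size, and the midpoint quantile rule (i + 0.5)/n are illustrative assumptions.

```python
import random
from statistics import NormalDist

def descriptive_sample(n, dist=NormalDist(), seed=0):
    """Descriptive sampling: deterministic quantile values at the
    midpoints (i + 0.5)/n, then a random permutation of their order."""
    values = [dist.inv_cdf((i + 0.5) / n) for i in range(n)]
    random.Random(seed).shuffle(values)
    return values

def crude_mc_sample(n, dist=NormalDist(), seed=0):
    """Crude Monte Carlo: fully random draws from the distribution."""
    rng = random.Random(seed)
    return [dist.inv_cdf(rng.random()) for _ in range(n)]

n = 100
ds = descriptive_sample(n)
mc = crude_mc_sample(n)
mean_ds = sum(ds) / n
mean_mc = sum(mc) / n
print(abs(mean_ds), abs(mean_mc))
```

The DS mean matches the target mean (0) essentially exactly by construction, while the crude Monte Carlo mean fluctuates from run to run; only the ordering of the DS values is random.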

  18. Sampling procedures for inventory of commercial volume tree species in Amazon Forest.

    Science.gov (United States)

    Netto, Sylvio P; Pelissari, Allan L; Cysneiros, Vinicius C; Bonazza, Marcelo; Sanquetta, Carlos R

    2017-01-01

    The spatial distribution of tropical tree species can affect the consistency of the estimators in commercial forest inventories; therefore, appropriate sampling procedures are required to survey species with different spatial patterns in the Amazon Forest. To this end, the present study aims to evaluate conventional sampling procedures and introduce adaptive cluster sampling for volumetric inventories of Amazonian tree species, considering the hypotheses that the density, the spatial distribution and the zero-plots affect the consistency of the estimators, and that adaptive cluster sampling allows more accurate volumetric estimation. We use data from a census carried out in Jamari National Forest, Brazil, where trees with diameters equal to or greater than 40 cm were measured in 1,355 plots. Species with different spatial patterns were selected and sampled with simple random sampling, systematic sampling, linear cluster sampling and adaptive cluster sampling, whereby the accuracy of the volumetric estimation and the presence of zero-plots were evaluated. The sampling procedures applied to the species were affected by the low density of trees and the large number of zero-plots, wherein the adaptive clusters allowed concentrating the sampling effort in plots with trees and, thus, aggregating more representative samples to estimate the commercial volume.
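Adaptive cluster sampling as used in this study can be sketched on a toy grid of per-plot tree counts: an initial simple random sample of plots is drawn, and any plot satisfying the condition (here, containing at least one tree) triggers sampling of its neighbours. The grid, the 4-neighbour rule, and the counts below are illustrative assumptions, not the study's data.

```python
import random

def adaptive_cluster_sample(counts, n_initial, seed=0):
    """Adaptive cluster sampling on a grid of per-plot tree counts:
    start from a simple random sample of plots; whenever a sampled
    plot is non-empty, add its 4 neighbours, and keep expanding."""
    rng = random.Random(seed)
    rows, cols = len(counts), len(counts[0])
    all_plots = [(r, c) for r in range(rows) for c in range(cols)]
    sampled, stack = set(), rng.sample(all_plots, n_initial)
    while stack:
        r, c = stack.pop()
        if (r, c) in sampled or not (0 <= r < rows and 0 <= c < cols):
            continue
        sampled.add((r, c))
        if counts[r][c] > 0:  # condition met: expand to neighbours
            stack += [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
    return sampled

# Toy grid: a single small cluster of trees in an otherwise empty forest
grid = [[0] * 6 for _ in range(6)]
for r, c in [(2, 2), (2, 3), (3, 2)]:
    grid[r][c] = 5
plots = adaptive_cluster_sample(grid, n_initial=8)
print(len(plots))
```

If any initial plot hits the cluster, the whole cluster (and its empty edge plots) ends up sampled, which is how the design concentrates effort where trees actually occur.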

  19. Maternal genealogical patterns of chicken breeds sampled in Europe.

    Science.gov (United States)

    Lyimo, C M; Weigend, A; Msoffe, P L; Hocking, P M; Simianer, H; Weigend, S

    2015-08-01

    The aim of this study was to investigate the maternal genealogical pattern of chicken breeds sampled in Europe. Sequence polymorphisms of 1256 chickens of the hypervariable region (D-loop) of mitochondrial DNA (mtDNA) were used. Median-joining networks were constructed to establish evolutionary relationships among mtDNA haplotypes of chickens, which included a wide range of breeds with different origin and history. Chicken breeds which have had their roots in Europe for more than 3000 years were categorized by their founding regions, encompassing Mediterranean type, East European type and Northwest European type. Breeds which were introduced to Europe from Asia since the mid-19th century were classified as Asian type, and breeds based on crossbreeding between Asian breeds and European breeds were classified as Intermediate type. The last group, Game birds, included fighting birds from Asia. The classification of mtDNA haplotypes was based on Liu et al.'s (2006) nomenclature. Haplogroup E was the predominant clade among the European chicken breeds. The results showed, on average, the highest number of haplotypes, highest haplotype diversity, and highest nucleotide diversity for Asian type breeds, followed by Intermediate type chickens. East European and Northwest European breeds had lower haplotype and nucleotide diversity compared to Mediterranean, Intermediate, Game and Asian type breeds. Results of our study support earlier findings that chicken breeds sampled in Europe have their roots in the Indian subcontinent and East Asia. This is consistent with historical and archaeological evidence of chicken migration routes to Europe. © 2015 Stichting International Foundation for Animal Genetics.

  20. Psychotropic medication patterns among youth in foster care.

    Science.gov (United States)

    Zito, Julie M; Safer, Daniel J; Sai, Devadatta; Gardner, James F; Thomas, Diane; Coombes, Phyllis; Dubowski, Melissa; Mendez-Lewis, Maria

    2008-01-01

    Studies have revealed that youth in foster care covered by Medicaid insurance receive psychotropic medication at a rate >3 times that of Medicaid-insured youth who qualify by low family income. Systematic data on patterns of medication treatment, particularly concomitant drugs, for youth in foster care are limited. The purpose of this work was to describe and quantify patterns of psychotropic monotherapy and concomitant therapy prescribed to a randomly selected, 1-month sample of youth in foster care who had been receiving psychotropic medication. Medicaid data were accessed for a July 2004 random sample of 472 medicated youth in foster care aged 0 through 19 years from a southwestern US state. Psychotropic medication treatment data were identified by concomitant pattern, frequency, medication class, subclass, and drug entity and were analyzed in relation to age group; gender; race or ethnicity; International Classification of Diseases, Ninth Revision, psychiatric diagnosis; and physician specialty. Of the foster children who had been dispensed psychotropic medication, 41.3% received ≥3 different classes of these drugs during July 2004, and 15.9% received ≥4 different classes. The most frequently used medications were antidepressants (56.8%), attention-deficit/hyperactivity disorder drugs (55.9%), and antipsychotic agents (53.2%). The use of specific psychotropic medication classes varied little by diagnostic grouping. Psychiatrists prescribed 93% of the psychotropic medication dispensed to youth in foster care. The use of ≥2 drugs within the same psychotropic medication class was noted in 22.2% of those who were given prescribed drugs concomitantly. Concomitant psychotropic medication treatment is frequent for youth in foster care and lacks substantive evidence as to its effectiveness and safety.

  1. Sample preparation for total reflection X-ray fluorescence analysis using resist pattern technique

    Science.gov (United States)

    Tsuji, K.; Yomogita, N.; Konyuba, Y.

    2018-06-01

    A circular resist pattern layer with a diameter of 9 mm was prepared on a glass substrate (26 mm × 76 mm; 1.5 mm thick) for total reflection X-ray fluorescence (TXRF) analysis. The parallel cross pattern was designed with a wall thickness of 10 μm, an interval of 20 μm, and a height of 1.4 or 0.8 μm. This additional resist layer did not significantly increase background intensity under the XRF peaks in TXRF spectra. Dotted residue was obtained from a standard solution (10 μL) containing Ti, Cr, Ni, Pb, and Ga, each at a final concentration of 10 ppm, on a normal glass substrate with a silicone coating layer. The height of the residue was more than 100 μm, where self-absorption in the large residue affected TXRF quantification (intensity relative standard deviation (RSD): 12-20%). In contrast, from a droplet of a small volume of solution dropped and cast on the resist pattern structure, the residue obtained was not a complete film but a film-like residue with a thickness of less than 1 μm, where self-absorption was not a serious problem. In the end, this sample preparation method was demonstrated to improve TXRF quantification (intensity RSD: 2-4%).

  2. Pattern transfer on large samples using a sub-aperture reactive ion beam

    Energy Technology Data Exchange (ETDEWEB)

    Miessler, Andre; Mill, Agnes; Gerlach, Juergen W.; Arnold, Thomas [Leibniz-Institut fuer Oberflaechenmodifizierung (IOM), Permoserstrasse 15, D-04318 Leipzig (Germany)

    2011-07-01

    In comparison to sole Ar ion beam sputtering, Reactive Ion Beam Etching (RIBE) has the main advantage of increased selectivity for different kinds of materials due to chemical contributions during material removal. RIBE is therefore an excellent candidate for pattern transfer applications. The goal of the present study is to apply a sub-aperture reactive ion beam for pattern transfer on large fused silica samples. In this context, the etching behavior in the ion beam periphery plays a decisive role. Using CF₄ as the reactive gas, XPS measurements of the modified surface expose impurities like Ni, Fe and Cr, which belong to chemically eroded material of the plasma pot, as well as an accumulation of carbon (up to 40 atomic percent) in the beam periphery. The substitution of CF₄ by NF₃ as the reactive gas reveals several benefits: more stable ion beam conditions, a reduction of the beam size down to a diameter of 5 mm, and a reduced amount of Ni, Fe and Cr contaminations. However, the formation of a silicon nitride layer hampers the chemical contribution to the etching process. These side effects influence the transfer of trench structures onto quartz by changing the selectivity due to the altered chemical reaction of the modified resist layer. With this in mind, we investigate pattern transfer on large fused silica plates using NF₃ sub-aperture RIBE.

  3. Methodology Series Module 5: Sampling Strategies.

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling - based on the researcher's choice or a population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and mention this method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
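The distinction between probability sampling methods mentioned here, e.g. a simple random sample versus a stratified random sample, can be sketched as follows. The hypothetical patient list and per-stratum sizes are assumptions for illustration.

```python
import random

def simple_random_sample(population, n, seed=0):
    """Simple random sample: every unit has an equal chance of selection."""
    return random.Random(seed).sample(population, n)

def stratified_random_sample(population, stratum_key, n_per_stratum, seed=0):
    """Stratified random sample: a simple random sample of fixed size
    drawn independently within each stratum."""
    rng = random.Random(seed)
    strata = {}
    for unit in population:
        strata.setdefault(stratum_key(unit), []).append(unit)
    return [u for group in strata.values()
            for u in rng.sample(group, n_per_stratum)]

# Hypothetical clinic population: (patient_id, sex)
population = [(i, "F" if i % 2 else "M") for i in range(200)]
srs = simple_random_sample(population, 20)
strat = stratified_random_sample(population, lambda u: u[1], 10)
print(len(srs), len(strat))  # 20 20
```

The stratified sample guarantees 10 patients of each sex, whereas the simple random sample leaves the sex composition to chance.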

  4. Methodology series module 5: Sampling strategies

    Directory of Open Access Journals (Sweden)

    Maninder Singh Setia

    2016-01-01

    Full Text Available Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling – based on the researcher's choice or a population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and mention this method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.

  5. Methodology Series Module 5: Sampling Strategies

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the ‘Sampling Method’. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling – based on the researcher's choice or a population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and mention this method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term ‘random sample’ when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the ‘generalizability’ of these results. In such a scenario, the researcher may want to use ‘purposive sampling’ for the study. PMID:27688438

  6. Location and multi-depot vehicle routing for emergency vehicles using tour coverage and random sampling

    Directory of Open Access Journals (Sweden)

    Alireza Goli

    2015-09-01

    Full Text Available Distribution and optimum allocation of emergency resources are among the most important tasks to be accomplished during a crisis. When a natural disaster such as an earthquake or flood takes place, it is necessary to deliver rescue efforts as quickly as possible. Therefore, it is important to find the optimum location and distribution of emergency relief resources. When a natural disaster occurs, it is not possible to reach some damaged areas. In this paper, location and multi-depot vehicle routing for emergency vehicles using tour coverage and random sampling is investigated. In this study, there is no need to visit all the places, and some demand points receive their needs from the nearest possible location. The proposed method is tested on randomly generated instances of different sizes. The preliminary results indicate that the proposed method was capable of reaching desirable solutions in a reasonable amount of time.

  7. Sampling Large Graphs for Anticipatory Analytics

    Science.gov (United States)

    2015-05-15

    Lauren Edwards, Luke Johnson, Maja Milosavljevic, Vijay Gadepally, Benjamin A. Miller, Lincoln... Random area sampling [8] is a "snowball" sampling method in which a set of random seed vertices are selected and areas... systems, greater human-in-the-loop involvement, or through complex algorithms. We are investigating the use of sampling to mitigate these challenges

  8. Multiple Intelligences Patterns of Students at King Saud University and Its Relationship with Mathematics' Achievement

    Science.gov (United States)

    Kandeel, Refat A. A.

    2016-01-01

    The purpose of this study was to determine the multiple intelligences patterns of students at King Saud University and its relationship with academic achievement for the courses of Mathematics. The study sample consisted of 917 students who were selected in a stratified random manner, the descriptive analysis method and Pearson correlation were used, the…

  9. Assessment of fracture risk: value of random population-based samples--the Geelong Osteoporosis Study.

    Science.gov (United States)

    Henry, M J; Pasco, J A; Seeman, E; Nicholson, G C; Sanders, K M; Kotowicz, M A

    2001-01-01

    Fracture risk is determined by bone mineral density (BMD). The T-score, a measure of fracture risk, is the position of an individual's BMD in relation to a reference range. The aim of this study was to determine the magnitude of change in the T-score when different sampling techniques were used to produce the reference range. Reference ranges were derived from three samples, drawn from the same region: (1) an age-stratified population-based random sample, (2) unselected volunteers, and (3) a selected healthy subset of the population-based sample with no diseases or drugs known to affect bone. T-scores were calculated using the three reference ranges for a cohort of women who had sustained a fracture and as a group had a low mean BMD (ages 35-72 yr; n = 484). For most comparisons, the T-scores for the fracture cohort were more negative using the population reference range. The difference in T-scores reached 1.0 SD. The proportion of the fracture cohort classified as having osteoporosis at the spine was 26, 14, and 23% when the population, volunteer, and healthy reference ranges were applied, respectively. The use of inappropriate reference ranges results in substantial changes to T-scores and may lead to inappropriate management.
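The T-score computation underlying this comparison is simply the individual's BMD expressed in standard deviations of a reference range, so shifting the reference mean shifts every T-score. The reference values below are made-up numbers for illustration, not the study's data.

```python
def t_score(bmd, ref_mean, ref_sd):
    """T-score: an individual's BMD expressed in standard deviations
    relative to a reference range."""
    return (bmd - ref_mean) / ref_sd

# Made-up spine BMD reference ranges (g/cm^2) for illustration only
population_ref = (1.18, 0.12)  # population-based random sample
volunteer_ref = (1.10, 0.12)   # unselected volunteers

bmd = 0.90
print(round(t_score(bmd, *population_ref), 2))  # -2.33
print(round(t_score(bmd, *volunteer_ref), 2))   # -1.67
```

With the conventional threshold of T ≤ -2.5 defining osteoporosis, a shift of this size in the reference mean can move individuals across the diagnostic boundary, which is the effect the study quantifies.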

  10. Detecting spatial structures in throughfall data: The effect of extent, sample size, sampling design, and variogram estimation method

    Science.gov (United States)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander

    2016-09-01

    In the last decades, an increasing number of studies analyzed spatial patterns in throughfall by means of variograms. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and a layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation method on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with large outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling) and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments (non-robust and robust estimators) and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the number recommended by studies dealing with Gaussian data by up to 100 %. 
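The method-of-moments (Matheron) variogram estimator examined in this study can be sketched directly. The random sampling locations, white-noise values, and lag bins below are illustrative assumptions; for uncorrelated data the variogram is flat at the sill, i.e. the variance.

```python
import numpy as np

def empirical_variogram(coords, values, bin_edges):
    """Method-of-moments (Matheron) estimator: gamma(h) is the mean of
    0.5 * (z_i - z_j)^2 over point pairs whose separation falls in each
    distance bin."""
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    i, j = np.triu_indices(len(values), k=1)  # each pair once
    gamma = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (dist[i, j] >= lo) & (dist[i, j] < hi)
        gamma.append(sq[i, j][mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# 150 random sampling locations on a 50 m x 50 m plot, white-noise values
rng = np.random.default_rng(0)
coords = rng.uniform(0, 50, size=(150, 2))
values = rng.standard_normal(150)
gamma = empirical_variogram(coords, values, bin_edges=[0, 5, 10, 15, 20])
print(gamma)  # each bin close to 1.0 (the variance) for uncorrelated data
```

This is the non-robust estimator; the study's point is that with heavy-tailed throughfall data and small samples its estimates become unreliable, motivating robust estimators or residual maximum likelihood.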
Given that most previous ...
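The Matheron (method-of-moments) estimator evaluated above is compact enough to sketch directly. The following is a minimal illustration on a 1-D transect with invented toy data, not the authors' code; real analyses would bin pairs into lag classes and then fit a variogram model.

```python
def empirical_variogram(coords, values, lags, tol):
    """Method-of-moments (Matheron) estimator:
    gamma(h) = sum of (z_i - z_j)**2 over pairs with |d_ij - h| <= tol,
    divided by 2 * N(h)."""
    gammas = []
    n = len(coords)
    for h in lags:
        acc, count = 0.0, 0
        for i in range(n):
            for j in range(i + 1, n):
                d = abs(coords[i] - coords[j])
                if abs(d - h) <= tol:
                    acc += (values[i] - values[j]) ** 2
                    count += 1
        gammas.append(acc / (2 * count) if count else float("nan"))
    return gammas

# Toy transect: four collectors 1 m apart with alternating throughfall depths.
gamma = empirical_variogram([0.0, 1.0, 2.0, 3.0],
                            [0.0, 1.0, 0.0, 1.0],
                            lags=[1.0, 2.0], tol=0.01)
```

Robust (e.g., Cressie-Hawkins) and residual-maximum-likelihood estimators replace the squared difference or the whole objective, which is why they react differently to the large outliers mentioned above.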

  11. Parameters, test criteria and fault assessment in random sampling of waste barrels from non-qualified processes

    International Nuclear Information System (INIS)

    Martens, B.R.

    1989-01-01

In the context of random sampling tests, parameters are checked on the waste barrels and criteria are given on which these tests are based. Also, it is shown how faulty data on the properties of the waste or faulty waste barrels should be treated. To decide the extent of testing, the properties of the waste relevant to final storage are determined based on the conditioning process used. (DG)

  12. Study of palatal rugae pattern in gender identification

    Directory of Open Access Journals (Sweden)

    Pooja Balgi

    2014-01-01

Full Text Available Aim: The aim was to determine gender differences in rugae pattern with regard to length, number, and shape. Materials and Methods: Fifty patients (25 males, 25 females) aged 30-50 years were randomly selected from the routine outpatient department at Sinhgad Dental College and Hospital. Maxillary impressions were made using alginate hydrocolloid and cast in dental stone. Palatal rugae patterns were then evaluated with respect to length, number, and shape. Results: The association of rugae pattern (length, number, shape) with gender was analyzed using the Chi-square test for qualitative variables and the t-test for quantitative variables. The average length of the rugae was greater in males than in females. The average number of rugae was the same in males and females. The straight pattern was seen more commonly in females than in males. The analysis showed a significant difference between males and females for length and shape (straight pattern). Conclusion: As the analysis showed a significant difference in the length and shape of rugae patterns between males and females, rugoscopy could thereby be an important tool for gender identification. The study will be continued with a larger sample size to expand knowledge about gender differences in palatal rugae patterns.
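The Chi-square association test used for the shape variable reduces to a few lines. A sketch with invented 2 x 2 counts, not the study's data:

```python
def chi_square_stat(table):
    """Pearson chi-square statistic for a contingency table of counts."""
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    total = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_tot[i] * col_tot[j] / total
            stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical counts: straight rugae present/absent, by gender.
stat = chi_square_stat([[15, 10],   # females: straight / other
                        [8, 17]])   # males:   straight / other
```

For a 2 x 2 table (1 degree of freedom), the statistic is compared against the 5% critical value of 3.841.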

  13. Introducing etch kernels for efficient pattern sampling and etch bias prediction

    Science.gov (United States)

    Weisbuch, François; Lutich, Andrey; Schatz, Jirka

    2018-01-01

Successful patterning requires good control of the photolithography and etch processes. While compact litho models, mainly based on rigorous physics, can predict very well the contours printed in photoresist, pure empirical etch models are less accurate and more unstable. Compact etch models are based on geometrical kernels to compute the litho-etch biases that measure the distance between litho and etch contours. The definition of the kernels, as well as the choice of calibration patterns, is critical to get a robust etch model. This work proposes to define a set of independent and anisotropic etch kernels ("internal", "external", "curvature", "Gaussian", "z_profile") designed to represent the finest details of the resist geometry and to characterize precisely the etch bias at any point along a resist contour. By evaluating the etch kernels on various structures, it is possible to map their etch signatures in a multidimensional space and analyze them to find an optimal sampling of structures. The etch kernels evaluated on these structures were combined with experimental etch bias derived from scanning electron microscope contours to train artificial neural networks to predict etch bias. The method applied to contact and line/space layers shows an improvement in prediction accuracy over a standard etch model. This work emphasizes the importance of the etch kernel definition to characterize and predict complex etch effects.

  14. Coordination of Conditional Poisson Samples

    Directory of Open Access Journals (Sweden)

    Grafström Anton

    2015-12-01

Full Text Available Sample coordination seeks to maximize or to minimize the overlap of two or more samples. The former is known as positive coordination, and the latter as negative coordination. Positive coordination is mainly used for estimation purposes and to reduce data collection costs. Negative coordination is mainly performed to diminish the response burden of the sampled units. Poisson sampling design with permanent random numbers provides an optimum coordination degree of two or more samples. The size of a Poisson sample is, however, random. Conditional Poisson (CP) sampling is a modification of the classical Poisson sampling that produces a fixed-size πps sample. We introduce two methods to coordinate Conditional Poisson samples over time or simultaneously. The first one uses permanent random numbers and the list-sequential implementation of CP sampling. The second method uses a CP sample in the first selection and provides an approximate one in the second selection because the prescribed inclusion probabilities are not respected exactly. The methods are evaluated using the size of the expected sample overlap, and are compared with their competitors using Monte Carlo simulation. The new methods provide a good coordination degree of two samples, close to the performance of Poisson sampling with permanent random numbers.
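Poisson sampling with permanent random numbers (PRNs), the baseline against which the new CP coordination methods are compared, is easy to sketch: each unit keeps one uniform random number across occasions, so samples drawn on different occasions overlap as much as the inclusion probabilities allow. Function names and data below are invented for illustration:

```python
import random

def assign_prns(units, rng):
    """Give every unit one permanent uniform random number,
    reused on every sampling occasion."""
    return {u: rng.random() for u in units}

def poisson_sample(prns, probs):
    """Poisson sampling: unit k enters the sample iff its PRN < pi_k."""
    return {k for k, u in prns.items() if u < probs[k]}

# Two occasions reuse the same PRNs, which yields positive coordination.
prns = assign_prns(range(1, 101), random.Random(0))
occasion_1 = poisson_sample(prns, {k: 0.3 for k in prns})
occasion_2 = poisson_sample(prns, {k: 0.35 for k in prns})
```

Here every unit selected on occasion 1 (PRN < 0.3) is automatically selected on occasion 2 (PRN < 0.35), so the overlap is maximal; CP sampling adds the fixed-size constraint on top of this mechanism.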

  15. Replicability of time-varying connectivity patterns in large resting state fMRI samples.

    Science.gov (United States)

    Abrol, Anees; Damaraju, Eswar; Miller, Robyn L; Stephen, Julia M; Claus, Eric D; Mayer, Andrew R; Calhoun, Vince D

    2017-12-01

    The past few years have seen an emergence of approaches that leverage temporal changes in whole-brain patterns of functional connectivity (the chronnectome). In this chronnectome study, we investigate the replicability of the human brain's inter-regional coupling dynamics during rest by evaluating two different dynamic functional network connectivity (dFNC) analysis frameworks using 7 500 functional magnetic resonance imaging (fMRI) datasets. To quantify the extent to which the emergent functional connectivity (FC) patterns are reproducible, we characterize the temporal dynamics by deriving several summary measures across multiple large, independent age-matched samples. Reproducibility was demonstrated through the existence of basic connectivity patterns (FC states) amidst an ensemble of inter-regional connections. Furthermore, application of the methods to conservatively configured (statistically stationary, linear and Gaussian) surrogate datasets revealed that some of the studied state summary measures were indeed statistically significant and also suggested that this class of null model did not explain the fMRI data fully. This extensive testing of reproducibility of similarity statistics also suggests that the estimated FC states are robust against variation in data quality, analysis, grouping, and decomposition methods. We conclude that future investigations probing the functional and neurophysiological relevance of time-varying connectivity assume critical importance. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  16. Massive parallel optical pattern recognition and retrieval via a two-stage high-capacity multichannel holographic random access memory system

    International Nuclear Information System (INIS)

    Cai, Luzhong; Liu, Hua-Kuang

    2000-01-01

    The multistage holographic optical random access memory (HORAM) system reported recently by Liu et al. provides a new degree of freedom for improving storage capacity. We further present a theoretical and practical analysis of the HORAM system with experimental results. Our discussions include the system design and geometrical requirements, its applications for multichannel pattern recognition and associative memory, the 2-D and 3-D information storage capacity, and multichannel image storage and retrieval via VanderLugt correlator (VLC) filters and joint transform holograms. A series of experiments are performed to demonstrate the feasibility of the multichannel pattern recognition and image retrieval with both the VLC and joint transform correlator (JTC) architectures. The experimental results with as many as 2025 channels show good agreement with the theoretical analysis. (c) 2000 Society of Photo-Optical Instrumentation Engineers

  17. Low-fat dietary pattern and risk of colorectal cancer: the Women's Health Initiative Randomized Controlled Dietary Modification Trial.

    Science.gov (United States)

    Beresford, Shirley A A; Johnson, Karen C; Ritenbaugh, Cheryl; Lasser, Norman L; Snetselaar, Linda G; Black, Henry R; Anderson, Garnet L; Assaf, Annlouise R; Bassford, Tamsen; Bowen, Deborah; Brunner, Robert L; Brzyski, Robert G; Caan, Bette; Chlebowski, Rowan T; Gass, Margery; Harrigan, Rosanne C; Hays, Jennifer; Heber, David; Heiss, Gerardo; Hendrix, Susan L; Howard, Barbara V; Hsia, Judith; Hubbell, F Allan; Jackson, Rebecca D; Kotchen, Jane Morley; Kuller, Lewis H; LaCroix, Andrea Z; Lane, Dorothy S; Langer, Robert D; Lewis, Cora E; Manson, JoAnn E; Margolis, Karen L; Mossavar-Rahmani, Yasmin; Ockene, Judith K; Parker, Linda M; Perri, Michael G; Phillips, Lawrence; Prentice, Ross L; Robbins, John; Rossouw, Jacques E; Sarto, Gloria E; Stefanick, Marcia L; Van Horn, Linda; Vitolins, Mara Z; Wactawski-Wende, Jean; Wallace, Robert B; Whitlock, Evelyn

    2006-02-08

Observational studies and polyp recurrence trials are not conclusive regarding the effects of a low-fat dietary pattern on risk of colorectal cancer, necessitating a primary prevention trial. To evaluate the effects of a low-fat eating pattern on risk of colorectal cancer in postmenopausal women. The Women's Health Initiative Dietary Modification Trial, a randomized controlled trial conducted in 48,835 postmenopausal women aged 50 to 79 years recruited between 1993 and 1998 from 40 clinical centers throughout the United States. Participants were randomly assigned to the dietary modification intervention (n = 19,541; 40%) or the comparison group (n = 29,294; 60%). The intensive behavioral modification program aimed to motivate and support reductions in dietary fat, to increase consumption of vegetables and fruits, and to increase grain servings by using group sessions, self-monitoring techniques, and other tailored and targeted strategies. Women in the comparison group continued their usual eating pattern. Invasive colorectal cancer incidence. A total of 480 incident cases of invasive colorectal cancer occurred during a mean follow-up of 8.1 (SD, 1.7) years. Intervention group participants significantly reduced their percentage of energy from fat by 10.7% more than did the comparison group at 1 year, and this difference between groups was mostly maintained (8.1% at year 6). Statistically significant increases in vegetable, fruit, and grain servings were also made. Despite these dietary changes, there was no evidence that the intervention reduced the risk of invasive colorectal cancer during the follow-up period. There were 201 women with invasive colorectal cancer (0.13% per year) in the intervention group and 279 (0.12% per year) in the comparison group (hazard ratio, 1.08; 95% confidence interval, 0.90-1.29). Secondary analyses suggested potential interactions with baseline aspirin use and combined estrogen-progestin use status (P = .01 for each). Colorectal ...

  18. Randomized Comparison of Two Vaginal Self-Sampling Methods for Human Papillomavirus Detection: Dry Swab versus FTA Cartridge.

    Science.gov (United States)

    Catarino, Rosa; Vassilakos, Pierre; Bilancioni, Aline; Vanden Eynde, Mathieu; Meyer-Hamme, Ulrike; Menoud, Pierre-Alain; Guerry, Frédéric; Petignat, Patrick

    2015-01-01

Human papillomavirus (HPV) self-sampling (self-HPV) is valuable in cervical cancer screening. HPV testing is usually performed on physician-collected cervical smears stored in liquid-based medium. Dry filters and swabs are an alternative. We evaluated the adequacy of self-HPV using two dry storage and transport devices, the FTA cartridge and swab. A total of 130 women performed two consecutive self-HPV samples. Randomization determined which of the two tests was performed first: self-HPV using dry swabs (s-DRY) or vaginal specimen collection using a cytobrush applied to an FTA cartridge (s-FTA). After self-HPV, a physician collected a cervical sample using liquid-based medium (Dr-WET). HPV types were identified by real-time PCR. Agreement between collection methods was measured using the kappa statistic. HPV prevalence for high-risk types was 62.3% (95%CI: 53.7-70.2) detected by s-DRY, 56.2% (95%CI: 47.6-64.4) by Dr-WET, and 54.6% (95%CI: 46.1-62.9) by s-FTA. There was overall agreement of 70.8% between s-FTA and s-DRY samples (kappa = 0.34), and of 82.3% between self-HPV and Dr-WET samples (kappa = 0.56). Detection sensitivities for low-grade squamous intraepithelial lesion or worse (LSIL+) were: 64.0% (95%CI: 44.5-79.8) for s-FTA, 84.6% (95%CI: 66.5-93.9) for s-DRY, and 76.9% (95%CI: 58.0-89.0) for Dr-WET. The preferred self-collection method among patients was s-DRY (40.8% vs. 15.4%). Regarding costs, the FTA card was five times more expensive than the swab (~5 US dollars (USD) per card vs. ~1 USD per swab). Self-HPV using dry swabs is sensitive for detecting LSIL+ and less expensive than s-FTA. International Standard Randomized Controlled Trial Number (ISRCTN): 43310942.
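For binary HPV-positive/negative calls, the kappa statistic used above to measure agreement can be computed directly. A sketch on invented toy ratings, not the trial's data:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two binary raters:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n
    pa = sum(a) / n                      # rater A's positive rate
    pb = sum(b) / n                      # rater B's positive rate
    p_chance = pa * pb + (1 - pa) * (1 - pb)
    return (p_obs - p_chance) / (1 - p_chance)

# Toy example: two collection methods on four specimens (1 = HPV positive).
kappa = cohens_kappa([1, 1, 0, 0], [1, 0, 0, 0])
```

Note that the formula is undefined when chance agreement is exactly 1 (both raters constant), a degenerate case a production implementation would need to guard against.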

  19. Investigating the Randomness of Numbers

    Science.gov (United States)

    Pendleton, Kenn L.

    2009-01-01

    The use of random numbers is pervasive in today's world. Random numbers have practical applications in such far-flung arenas as computer simulations, cryptography, gambling, the legal system, statistical sampling, and even the war on terrorism. Evaluating the randomness of extremely large samples is a complex, intricate process. However, the…
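A basic building block for evaluating randomness is a chi-square goodness-of-fit test of digit frequencies against the uniform distribution. A minimal sketch (function name invented):

```python
import random

def digit_chi_square(digits):
    """Chi-square goodness-of-fit statistic for digits 0-9 against uniform
    (9 degrees of freedom; compare with the 5% critical value 16.92)."""
    expected = len(digits) / 10
    counts = [0] * 10
    for d in digits:
        counts[d] += 1
    return sum((c - expected) ** 2 / expected for c in counts)

rng = random.Random(42)
stat = digit_chi_square([rng.randrange(10) for _ in range(10000)])
```

A single frequency test is far from sufficient for large samples; batteries of tests (runs, serial correlation, spectral tests, and so on) are applied in practice, which is part of why evaluating randomness is the "complex, intricate process" described above.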

  20. Simulation of range imaging-based estimation of respiratory lung motion. Influence of noise, signal dimensionality and sampling patterns.

    Science.gov (United States)

    Wilms, M; Werner, R; Blendowski, M; Ortmüller, J; Handels, H

    2014-01-01

    A major problem associated with the irradiation of thoracic and abdominal tumors is respiratory motion. In clinical practice, motion compensation approaches are frequently steered by low-dimensional breathing signals (e.g., spirometry) and patient-specific correspondence models, which are used to estimate the sought internal motion given a signal measurement. Recently, the use of multidimensional signals derived from range images of the moving skin surface has been proposed to better account for complex motion patterns. In this work, a simulation study is carried out to investigate the motion estimation accuracy of such multidimensional signals and the influence of noise, the signal dimensionality, and different sampling patterns (points, lines, regions). A diffeomorphic correspondence modeling framework is employed to relate multidimensional breathing signals derived from simulated range images to internal motion patterns represented by diffeomorphic non-linear transformations. Furthermore, an automatic approach for the selection of optimal signal combinations/patterns within this framework is presented. This simulation study focuses on lung motion estimation and is based on 28 4D CT data sets. The results show that the use of multidimensional signals instead of one-dimensional signals significantly improves the motion estimation accuracy, which is, however, highly affected by noise. Only small differences exist between different multidimensional sampling patterns (lines and regions). Automatically determined optimal combinations of points and lines do not lead to accuracy improvements compared to results obtained by using all points or lines. Our results show the potential of multidimensional breathing signals derived from range images for the model-based estimation of respiratory motion in radiation therapy.

  1. Detecting beer intake by unique metabolite patterns

    DEFF Research Database (Denmark)

    Gürdeniz, Gözde; Jensen, Morten Georg; Meier, Sebastian

    2016-01-01

    Evaluation of health related effects of beer intake is hampered by the lack of accurate tools for assessing intakes (biomarkers). Therefore, we identified plasma and urine metabolites associated with recent beer intake by untargeted metabolomics and established a characteristic metabolite pattern...... representing raw materials and beer production as a qualitative biomarker of beer intake. In a randomized, crossover, single-blinded meal study (MSt1) 18 participants were given one at a time four different test beverages: strong, regular and non-alcoholic beers and a soft drink. Four participants were...... assigned to have two additional beers (MSt2). In addition to plasma and urine samples, test beverages, wort and hops extract were analyzed by UPLC-QTOF. A unique metabolite pattern reflecting beer metabolome, including metabolites derived from beer raw material (i.e. N-methyl tyramine sulfate and the sum...

  2. Dietary Pattern and Its Association with the Prevalence of Obesity and Related Cardiometabolic Risk Factors among Chinese Children

    OpenAIRE

    Shang, Xianwen; Li, Yanping; Liu, Ailing; Zhang, Qian; Hu, Xiaoqi; Du, Songming; Ma, Jun; Xu, Guifa; Li, Ying; Guo, Hongwei; Du, Lin; Ma, Guansheng

    2012-01-01

BACKGROUND: The association of dietary pattern with chronic diseases has been investigated widely in western countries. However, information is quite limited among children in China. Our study aims to identify the dietary patterns of Chinese children and examine their association with obesity and related cardiometabolic risk factors. METHODS: A total of 5267 children were selected using multistage random sampling from 30 primary schools of 5 provincial capital cities in China. Dietary int...

  3. Dietary pattern and 20 year mortality in elderly men in Finland, Italy and the Netherlands: Longitudinal cohort study.

    NARCIS (Netherlands)

    Huijbregts, P.; Feskens, E.J.M.; Résénen, L.; Fidanza, F.; Nissinen, A.; Menotti, A.; Kromhout, D.

    1997-01-01

    OBJECTIVE: To investigate the association of dietary pattern and mortality in international data. DESIGN: Cohort study with 20 years' follow up of mortality. SETTING: Five cohorts in Finland, the Netherlands, and Italy. SUBJECTS: Population based random sample of 3045 men aged 50-70 years in 1970.

  4. Global patterns and predictions of seafloor biomass using random forests.

    Directory of Open Access Journals (Sweden)

    Chih-Lin Wei

Full Text Available A comprehensive seafloor biomass and abundance database has been constructed from 24 oceanographic institutions worldwide within the Census of Marine Life (CoML) field projects. The machine-learning algorithm, Random Forests, was employed to model and predict seafloor standing stocks from surface primary production, water-column integrated and export particulate organic matter (POM), seafloor relief, and bottom water properties. The predictive models explain 63% to 88% of stock variance among the major size groups. Individual and composite maps of predicted global seafloor biomass and abundance are generated for bacteria, meiofauna, macrofauna, and megafauna (invertebrates and fishes). Patterns of benthic standing stocks were positive functions of surface primary production and delivery of the particulate organic carbon (POC) flux to the seafloor. At a regional scale, the census maps illustrate that integrated biomass is highest at the poles, on continental margins associated with coastal upwelling, and in broad zones associated with equatorial divergence. Lowest values are consistently encountered on the central abyssal plains of major ocean basins. The shift of biomass dominance groups with depth is shown to be affected by the decrease in average body size rather than abundance, presumably due to a decrease in quantity and quality of food supply. This biomass census and associated maps are vital components of mechanistic deep-sea food web models and global carbon cycling, and as such provide fundamental information that can be incorporated into evidence-based management.
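Random Forests, as used here for biomass prediction, can be illustrated in miniature: bootstrap resampling plus a random choice of feature per tree, averaged at prediction time. The sketch below uses depth-1 trees (stumps) and invented data; the study's models used full trees on oceanographic predictors:

```python
import random

def fit_stump(X, y, feat):
    """Depth-1 regression tree: best threshold on one feature by squared error."""
    best = None
    for t in sorted({x[feat] for x in X}):
        left = [yi for x, yi in zip(X, y) if x[feat] <= t]
        right = [yi for x, yi in zip(X, y) if x[feat] > t]
        if not left or not right:
            continue
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        err = (sum((yi - ml) ** 2 for yi in left)
               + sum((yi - mr) ** 2 for yi in right))
        if best is None or err < best[0]:
            best = (err, t, ml, mr)
    if best is None:                          # degenerate bootstrap sample
        m = sum(y) / len(y)
        return (feat, float("inf"), m, m)
    _, t, ml, mr = best
    return (feat, t, ml, mr)

def fit_forest(X, y, n_trees, rng):
    """Bagging plus random feature choice: a toy Random Forest of stumps."""
    n, p = len(X), len(X[0])
    forest = []
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]   # bootstrap resample
        feat = rng.randrange(p)                      # random feature subset of size 1
        forest.append(fit_stump([X[i] for i in idx], [y[i] for i in idx], feat))
    return forest

def predict(forest, x):
    """Average the stump predictions."""
    preds = [ml if x[f] <= t else mr for f, t, ml, mr in forest]
    return sum(preds) / len(preds)
```

With a signal on feature 0 and noise on feature 1, the ensemble average recovers the trend even though half the stumps look at the wrong feature, which is the variance-reduction idea behind bagging.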

  5. Comparison of Contact Patterns Relevant for Transmission of Respiratory Pathogens in Thailand and the Netherlands Using Respondent-Driven Sampling

    Science.gov (United States)

    Stein, Mart L.; van Steenbergen, Jim E.; Buskens, Vincent; van der Heijden, Peter G. M.; Chanyasanha, Charnchudhi; Tipayamongkholgul, Mathuros; Thorson, Anna E.; Bengtsson, Linus; Lu, Xin; Kretzschmar, Mirjam E. E.

    2014-01-01

Understanding infection dynamics of respiratory diseases requires the identification and quantification of behavioural, social and environmental factors that permit the transmission of these infections between humans. Little empirical information is available about contact patterns within real-world social networks, let alone on differences in these contact networks between populations that differ considerably on a socio-cultural level. Here we compared contact network data that were collected in the Netherlands and Thailand using a similar online respondent-driven method. By asking participants to recruit contact persons we studied network links relevant for the transmission of respiratory infections. We studied correlations between recruiter and recruited contacts to investigate mixing patterns in the observed social network components. In both countries, mixing patterns were assortative by demographic variables and random by total numbers of contacts. However, in Thailand participants reported overall more contacts, which resulted in higher effective contact rates. Our findings provide new insights on numbers of contacts and mixing patterns in two different populations. These data could be used to improve the parameterisation of mathematical models used to design control strategies. Although the spread of infections through populations depends on many more factors, the similarities we found suggest that spread may be similar in the Netherlands and Thailand. PMID:25423343
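Assortative mixing by demographic variables, as reported here, is commonly quantified with Newman's discrete assortativity coefficient. A sketch on a toy edge list (invented, not the study's network data):

```python
def assortativity(edges, attr):
    """Newman's assortativity coefficient for a discrete node attribute:
    r = (sum_i e_ii - sum_i a_i * b_i) / (1 - sum_i a_i * b_i),
    where e is the normalized mixing matrix and a, b its marginals."""
    cats = sorted({attr[u] for edge in edges for u in edge})
    idx = {c: i for i, c in enumerate(cats)}
    k = len(cats)
    e = [[0.0] * k for _ in range(k)]
    for u, v in edges:                 # count each edge end in both directions
        e[idx[attr[u]]][idx[attr[v]]] += 1.0
        e[idx[attr[v]]][idx[attr[u]]] += 1.0
    total = sum(sum(row) for row in e)
    e = [[x / total for x in row] for row in e]
    trace = sum(e[i][i] for i in range(k))
    a = [sum(row) for row in e]
    b = [sum(col) for col in zip(*e)]
    s = sum(ai * bi for ai, bi in zip(a, b))
    return (trace - s) / (1.0 - s)
```

The coefficient is +1 when contacts occur only within a demographic group and -1 when they occur only between groups; values near 0 indicate random mixing with respect to that attribute.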

  6. Scale-free animal movement patterns: Levy walks outperform fractional Brownian motions and fractional Levy motions in random search scenarios

    International Nuclear Information System (INIS)

    Reynolds, A M

    2009-01-01

    The movement patterns of a diverse range of animals have scale-free characteristics. These characteristics provide necessary but not sufficient conditions for the presence of movement patterns that can be approximated by Levy walks. Nevertheless, it has been widely assumed that the occurrence of scale-free animal movements can indeed be attributed to the presence of Levy walks. This is, in part, because it is known that the super-diffusive properties of Levy walks can be advantageous in random search scenarios when searchers have little or no prior knowledge of target locations. However, fractional Brownian motions (fBms) and fractional Levy motions (fLms) are both scale-free and super-diffusive, and so it is possible that these motions rather than Levy walks underlie some or all occurrences of scale-free animal movement patterns. Here this possibility is examined in numerical simulations through a determination of the searching efficiencies of fBm and fLm searches. It is shown that these searches are less efficient than Levy walk searches. This finding does not rule out the possibility that some animals with scale-free movement patterns are executing fBm and fLm searches, but it does make Levy walk searches the more likely possibility.
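Levy-walk step lengths follow a power-law tail, which can be drawn by inverse-transform sampling. A minimal sketch (parameter names invented); fBm and fLm searches differ in that successive increments are correlated rather than independent:

```python
import random

def levy_steps(n, mu, l_min, rng):
    """Draw n step lengths with density p(l) ~ l**(-mu) for l >= l_min
    (1 < mu <= 3), via inverse-transform sampling of the Pareto CDF."""
    return [l_min * (1.0 - rng.random()) ** (-1.0 / (mu - 1.0))
            for _ in range(n)]

# Step lengths for a Levy walk with exponent mu = 2 and minimum step 1.
steps = levy_steps(1000, 2.0, 1.0, random.Random(7))
```

For mu = 2 the theoretical median step is 2 * l_min, but the mean diverges, which is what produces the occasional very long relocations that make Levy searches super-diffusive.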

  7. Characteristics of men with substance use disorder consequent to illicit drug use: comparison of a random sample and volunteers.

    Science.gov (United States)

    Reynolds, Maureen D; Tarter, Ralph E; Kirisci, Levent

    2004-09-06

Men qualifying for substance use disorder (SUD) consequent to consumption of an illicit drug were compared according to recruitment method. It was hypothesized that volunteers would be more self-disclosing and exhibit more severe disturbances compared to randomly recruited subjects. Personal, demographic, family, social, substance use, psychiatric, and SUD characteristics of volunteers (N = 146) were compared to randomly recruited (N = 102) subjects. Volunteers had lower socioeconomic status, were more likely to be African American, and had lower IQ than randomly recruited subjects. Volunteers also evidenced greater social and family maladjustment and more frequently had received treatment for substance abuse. In addition, lower social desirability response bias was observed in the volunteers. SUD was not more severe in the volunteers; however, they reported a higher lifetime rate of opiate, diet, depressant, and analgesic drug use. Volunteers and randomly recruited subjects qualifying for SUD consequent to illicit drug use are similar in SUD severity but differ in terms of severity of psychosocial disturbance and history of drug involvement. The factors discriminating volunteers and randomly recruited subjects are well known to impact on outcome, hence they need to be considered in research design, especially when selecting a sampling strategy in treatment research.

  8. Test Pattern Generator for Mixed Mode BIST

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hong Sik; Lee, Hang Kyu; Kang, Sung Ho [Yonsei University (Korea, Republic of)

    1998-07-01

With the increasing integration density of VLSI circuits, BIST (Built-In Self Test) has become an effective method for testing chips. Pseudo-random test pattern generation is generally used for BIST, but it requires a large number of test patterns when random-pattern-resistant faults are present. Deterministic testing is therefore an attractive BIST technique, owing to its minimal number of test patterns and high fault coverage. It is rarely applicable in practice, however, because existing deterministic test pattern generators require too much area overhead despite their efficiency. We therefore propose a mixed test scheme that applies to the circuit under test a deterministic test sequence followed by a pseudo-random one. This scheme achieves maximum fault coverage while reducing the silicon area overhead of the mixed hardware generator. The deterministic test generator is built with a finite state machine, and the pseudo-random test generator with an LFSR (linear feedback shift register). Results on the ISCAS benchmark circuits show that maximum fault coverage is achieved with a small test set and little hardware overhead. (author). 15 refs., 10 figs., 4 tabs.
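The pseudo-random half of such a mixed-mode generator is typically an LFSR. A bit-level sketch of a Fibonacci LFSR (the 4-bit tap choice below yields a maximal-length sequence; names are invented):

```python
def lfsr_patterns(seed, taps, width, count):
    """Fibonacci LFSR: shift left each cycle, feeding back the XOR of the
    tapped bits; a maximal-length LFSR cycles through all 2**width - 1
    nonzero states."""
    mask = (1 << width) - 1
    state = seed & mask
    out = []
    for _ in range(count):
        out.append(state)
        feedback = 0
        for t in taps:
            feedback ^= (state >> t) & 1
        state = ((state << 1) | feedback) & mask
    return out

# 4-bit maximal-length LFSR (taps 3 and 2, i.e. polynomial x^4 + x^3 + 1).
patterns = lfsr_patterns(seed=1, taps=(3, 2), width=4, count=16)
```

In a mixed-mode scheme of the kind described, patterns like these would be applied after the deterministic sequence, sweeping up the faults that are easy to detect randomly.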

  9. An intrinsic algorithm for parallel Poisson disk sampling on arbitrary surfaces.

    Science.gov (United States)

    Ying, Xiang; Xin, Shi-Qing; Sun, Qian; He, Ying

    2013-09-01

Poisson disk sampling has excellent spatial and spectral properties, and plays an important role in a variety of visual computing applications. Although many promising algorithms have been proposed for multidimensional sampling in Euclidean space, very few studies have been reported with regard to the problem of generating Poisson disks on surfaces due to the complicated nature of the surface. This paper presents an intrinsic algorithm for parallel Poisson disk sampling on arbitrary surfaces. In sharp contrast to the conventional parallel approaches, our method neither partitions the given surface into small patches nor uses any spatial data structure to maintain the voids in the sampling domain. Instead, our approach assigns each sample candidate a random and unique priority that is unbiased with regard to the distribution. Hence, multiple threads can process the candidates simultaneously and resolve conflicts by checking the given priority values. Our algorithm guarantees that the generated Poisson disks are uniformly and randomly distributed without bias. It is worth noting that our method is intrinsic and independent of the embedding space. This intrinsic feature allows us to generate Poisson disk patterns on arbitrary surfaces in R^n. To our knowledge, this is the first intrinsic, parallel, and accurate algorithm for surface Poisson disk sampling. Furthermore, by manipulating the spatially varying density function, we can obtain adaptive sampling easily.
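In the plane, the defining minimum-distance property of Poisson disk sampling is easiest to see with naive dart throwing; the paper's contribution is a parallel, priority-based algorithm intrinsic to surfaces, which this sketch does not attempt:

```python
import math
import random

def poisson_disk_dart(r, width, height, attempts, rng):
    """Naive dart throwing: keep a candidate only if it lies at least r
    away from every previously accepted sample (O(n^2); fine for small n)."""
    points = []
    for _ in range(attempts):
        candidate = (rng.uniform(0.0, width), rng.uniform(0.0, height))
        if all(math.dist(candidate, p) >= r for p in points):
            points.append(candidate)
    return points

samples = poisson_disk_dart(0.1, 1.0, 1.0, 1500, random.Random(0))
```

Practical planar generators accelerate the conflict check with a background grid (as in Bridson's method); the surface setting is harder precisely because such Euclidean spatial structures no longer apply directly, motivating the intrinsic approach above.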

  10. Sample Size Calculation: Inaccurate A Priori Assumptions for Nuisance Parameters Can Greatly Affect the Power of a Randomized Controlled Trial.

    Directory of Open Access Journals (Sweden)

    Elsa Tavernier

Full Text Available We aimed to examine the extent to which inaccurate assumptions for nuisance parameters used to calculate sample size can affect the power of a randomized controlled trial (RCT). In a simulation study, we separately considered an RCT with continuous, dichotomous or time-to-event outcomes, with associated nuisance parameters of standard deviation, success rate in the control group and survival rate in the control group at some time point, respectively. For each type of outcome, we calculated a required sample size N for a hypothesized treatment effect, an assumed nuisance parameter and a nominal power of 80%. We then assumed a nuisance parameter associated with a relative error at the design stage. For each type of outcome, we randomly drew 10,000 relative errors of the associated nuisance parameter (from empirical distributions derived from a previously published review). Then, retro-fitting the sample size formula, we derived, for the pre-calculated sample size N, the real power of the RCT, taking into account the relative error for the nuisance parameter. In total, 23%, 0% and 18% of RCTs with continuous, binary and time-to-event outcomes, respectively, were underpowered (i.e., the real power fell below the nominal 80%), while others were overpowered (real power above 90%). Even with proper calculation of sample size, a substantial number of trials are underpowered or overpowered because of imprecise knowledge of nuisance parameters. Such findings raise questions about how sample size for RCTs should be determined.
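The mechanism studied here, real power eroding when the assumed nuisance parameter is wrong, can be reproduced with the normal-approximation formulas for comparing two means. A sketch (standard 5% two-sided alpha and 80% nominal power; function names are invented):

```python
import math

Z_ALPHA = 1.959963984540054    # 97.5% standard normal quantile
Z_BETA = 0.8416212335729143    # 80% standard normal quantile

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def n_per_group(delta, sigma_assumed):
    """Normal-approximation sample size per group for comparing two means."""
    return math.ceil(2.0 * ((Z_ALPHA + Z_BETA) * sigma_assumed / delta) ** 2)

def real_power(n, delta, sigma_true):
    """Power actually achieved when the true SD differs from the assumed one."""
    return phi(delta * math.sqrt(n / 2.0) / sigma_true - Z_ALPHA)

# Plan for effect 5 assuming SD 10, then suppose the true SD is 20% larger.
n = n_per_group(5.0, 10.0)
achieved = real_power(n, 5.0, 12.0)   # drops well below the nominal 0.80
```

A 20% underestimate of the standard deviation is enough to turn a nominally 80%-powered trial into a clearly underpowered one, mirroring the simulation results above.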

  11. MUP, CEC-DES, STRADE. Codes for uncertainty propagation, experimental design and stratified random sampling techniques

    International Nuclear Information System (INIS)

    Amendola, A.; Astolfi, M.; Lisanti, B.

    1983-01-01

The report describes how to use the codes: MUP (Monte Carlo Uncertainty Propagation) for uncertainty analysis by Monte Carlo simulation, including correlation analysis, extreme value identification and study of selected ranges of the variable space; CEC-DES (Central Composite Design) for building experimental matrices according to the requirements of Central Composite and Factorial Experimental Designs; and STRADE (Stratified Random Design) for experimental designs based on Latin Hypercube Sampling techniques. Application fields of the codes are probabilistic risk assessment, experimental design, sensitivity analysis and system identification problems
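STRADE's underlying technique, Latin Hypercube Sampling, places exactly one point in each of n equal strata per dimension. A minimal sketch (names invented, not the report's Fortran):

```python
import random

def latin_hypercube(n, dims, rng):
    """Split [0, 1) into n equal strata per dimension, draw one uniform
    point in each stratum, and shuffle the strata independently per axis
    so each sample occupies a unique stratum in every dimension."""
    columns = []
    for _ in range(dims):
        strata = list(range(n))
        rng.shuffle(strata)
        columns.append([(s + rng.random()) / n for s in strata])
    return [tuple(col[i] for col in columns) for i in range(n)]

design = latin_hypercube(10, 2, random.Random(3))
```

Compared with plain Monte Carlo, this stratification guarantees full coverage of each input's range, which is why it is popular for the risk-assessment and sensitivity-analysis applications listed above.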

  12. [Food consumption patterns among adolescents].

    Science.gov (United States)

    Palenzuela Paniagua, S M; Pérez Milena, A; Pérula de Torres, L A; Fernández García, J A; Maldonado Alconada, J

    2014-01-01

Adolescence is a critical time for the establishment of healthy eating habits. The objective was to analyze food consumption patterns among adolescents and their relationship with family and social factors. Multicentre observational cross-sectional descriptive study using a food frequency questionnaire for the last week. It was answered anonymously. The adolescent's age/gender, parents' studies/occupation and school's location/type were included. The population sample was composed of 1,095 adolescents in sixth grade at primary schools from an Andalusian region. They were chosen by multistage random sampling that distinguished between public/private and capital/provincial schools. 1,005 surveys were analyzed. The mean age is 11.45 (SD: 0.59). Fifty-three percent were male. The intake of dairy products (only two-thirds taken daily), pasta, fruit and vegetables (daily consumption of 30%) is deficient. Sixty-four point five percent consume legumes weekly. Fish consumption is equal to meat, with a preference for poultry. More than half consume red meat daily. Olive oil is preferred. The intake of "empty calories" (fast food, candies, soft drinks) is high. Multivariate analysis demonstrated clusters of healthy and unhealthy foods related to the social status of the parents and the type of school. A healthy diet based on the nutritional pyramid is not the consumption pattern in the adolescents surveyed. There is a low consumption of dairy products, legumes, fruits and vegetables. There is a relationship between the social class of the family and consumption patterns (healthy and unhealthy). Health strategies are needed to modify such inappropriate consumption.

  13. Nursing Theory, Terminology, and Big Data: Data-Driven Discovery of Novel Patterns in Archival Randomized Clinical Trial Data.

    Science.gov (United States)

    Monsen, Karen A; Kelechi, Teresa J; McRae, Marion E; Mathiason, Michelle A; Martin, Karen S

    The growth and diversification of nursing theory, nursing terminology, and nursing data enable a convergence of theory- and data-driven discovery in the era of big data research. Existing datasets can be viewed through theoretical and terminology perspectives using visualization techniques in order to reveal new patterns and generate hypotheses. The Omaha System is a standardized terminology and metamodel that makes explicit the theoretical perspective of the nursing discipline and enables terminology-theory testing research. The purpose of this paper is to illustrate the approach by exploring a large research dataset consisting of 95 variables (demographics, temperature measures, anthropometrics, and standardized instruments measuring quality of life and self-efficacy) from a theory-based perspective using the Omaha System. Aims were to (a) examine the Omaha System dataset to understand the sample at baseline relative to Omaha System problem terms and outcome measures, (b) examine relationships within the normalized Omaha System dataset at baseline in predicting adherence, and (c) examine relationships within the normalized Omaha System dataset at baseline in predicting incident venous ulcer. Variables from a randomized clinical trial of a cryotherapy intervention for the prevention of venous ulcers were mapped onto Omaha System terms and measures to derive a theoretical framework for the terminology-theory testing study. The original dataset was recoded using the mapping to create an Omaha System dataset, which was then examined using visualization to generate hypotheses. The hypotheses were tested using standard inferential statistics. Logistic regression was used to predict adherence and incident venous ulcer. Findings revealed novel patterns in the psychosocial characteristics of the sample that were discovered to be drivers of both adherence (Mental health Behavior: OR = 1.28, 95% CI [1.02, 1.60]; AUC = .56) and incident venous ulcer (Mental health Behavior

  14. Random Pattern Vertically Oriented, Partial Thickness Buccinator Myomucosal Flap for Intraoral Reconstruction: A Report of Two Cases

    Directory of Open Access Journals (Sweden)

    Amin Rahpeyma

    2016-05-01

    Full Text Available Introduction: Reconstruction of the oral cavity with a flap design containing the buccal mucosa and buccinator muscle but excluding the facial artery and vein is the topic of these case reports. Case Reports: This article uses a random pattern, vertically oriented, partial thickness buccinator myomucosal flap for intraoral reconstruction in two cases. The first was for lining the mandibular anterior vestibule in a trauma patient. The second was for oral-side coverage of a bone graft in a special cleft patient. In both patients this flap survived, and good bone coverage with non-keratinized mucosa was obtained. Conclusion: A thin, long buccal myomucosal flap that does not include the facial artery and vein can survive.

  15. Isoflurane Preconditioning Increases Survival of Rat Skin Random-Pattern Flaps by Induction of HIF-1α Expression

    Directory of Open Access Journals (Sweden)

    Yu Sun

    2013-04-01

    Full Text Available Background: Survival of random-pattern skin flaps is important for the success of plastic and reconstructive surgeries. This study investigates isoflurane-induced protection against ischemia of skin flaps and the underlying molecular mechanism in this process. Methods: Human umbilical vein endothelial cells (HUVECs) and human skin fibroblast cells were exposed to isoflurane for 4 h. Expression of hypoxia inducible factor-1α (HIF-1α), heme oxygenase-1 (HO-1) and vascular endothelial growth factor (VEGF) was analyzed up to 24 h post isoflurane exposure using qRT-PCR and western blot, or ELISA analyses. PI3K inhibitors (LY 294002 and wortmannin), an mTOR inhibitor (rapamycin), and a GSK3β inhibitor (SB 216763) were used respectively to assess the effects of isoflurane treatment on HIF-1α expression. Furthermore, 40 rats were randomly divided into 5 groups (control, isoflurane, scrambled siRNA plus isoflurane, HIF-1α siRNA plus isoflurane, and DMOG) and subjected to random-pattern skin flap operation. Rats were prepared for evaluation of flap survival and full-field laser perfusion imaging (FLPI) (at day 7) and microvessel density evaluation (at day 10). Results: Isoflurane exposure induced expression of HIF-1α protein, and of HO-1 and VEGF mRNA and proteins, in a time-dependent manner. Both LY 294002 and wortmannin inhibited phospho-Akt, phospho-mTOR, phospho-GSK 3β and HIF-1α expression after isoflurane exposure. Both wortmannin and rapamycin inhibited isoflurane-induced phospho-4E-BP1 (Ser 65), phospho-P70s6k (Thr 389) and HIF-1α expression. SB 216763 pre-treatment further enhanced isoflurane-induced expression of phospho-GSK 3β (Ser 9) and HIF-1α protein compared to the isoflurane-alone cells. In animal experiments, the isoflurane alone, scrambled siRNA plus isoflurane, and DMOG groups had significantly upregulated vascularity and increased survival of the skin flaps compared to the controls. However, HIF-1α knockdown abrogated the protective effect of

  16. Zero Distribution of System with Unknown Random Variables Case Study: Avoiding Collision Path

    Directory of Open Access Journals (Sweden)

    Parman Setyamartana

    2014-07-01

    Full Text Available This paper presents a stochastic analysis of finding feasible trajectories for robotic arm motion in an obstacle environment. The unknown variables are the coefficients of the polynomial joint angles for which collision-free motion is achieved. ãk is the matrix consisting of these unknown feasible polynomial coefficients. The pattern of feasible polynomials in the obstacle environment appears random. This paper proposes to model this random pattern of values using a random polynomial with unknown variables as coefficients. The behavior of the system is obtained from the zero distribution, the characteristic of such a random polynomial. Results show that the pattern of the collision-avoiding random polynomial can be constructed from its zero distribution. The zero distribution acts as a building block of the system, with obstacles as the uncertainty factor. Using the scale factor k, which has a bounded range, the random coefficient pattern can be predicted.

  17. Species undersampling in tropical bat surveys: effects on emerging biodiversity patterns.

    Science.gov (United States)

    Meyer, Christoph F J; Aguiar, Ludmilla M S; Aguirre, Luis F; Baumgarten, Julio; Clarke, Frank M; Cosson, Jean-François; Estrada Villegas, Sergio; Fahr, Jakob; Faria, Deborah; Furey, Neil; Henry, Mickaël; Jenkins, Richard K B; Kunz, Thomas H; Cristina MacSwiney González, M; Moya, Isabel; Pons, Jean-Marc; Racey, Paul A; Rex, Katja; Sampaio, Erica M; Stoner, Kathryn E; Voigt, Christian C; von Staden, Dietrich; Weise, Christa D; Kalko, Elisabeth K V

    2015-01-01

    Undersampling is commonplace in biodiversity surveys of species-rich tropical assemblages in which rare taxa abound, with possible repercussions for our ability to implement surveys and monitoring programmes in a cost-effective way. We investigated the consequences of information loss due to species undersampling (missing subsets of species from the full species pool) in tropical bat surveys for the emerging patterns of species richness (SR) and compositional variation across sites. For 27 bat assemblage data sets from across the tropics, we used correlations between original data sets and subsets with different numbers of species deleted either at random, or according to their rarity in the assemblage, to assess to what extent patterns in SR and composition in data subsets are congruent with those in the initial data set. We then examined to what degree high sample representativeness (r ≥ 0·8) was influenced by biogeographic region, sampling method, sampling effort or structural assemblage characteristics. For SR, correlations between random subsets and original data sets were strong (r ≥ 0·8) with moderate (ca. 20%) species loss. Bias associated with information loss was greater for species composition; on average ca. 90% of species in random subsets had to be retained to adequately capture among-site variation. For nonrandom subsets, removing only the rarest species (on average c. 10% of the full data set) yielded strong correlations (r > 0·95) for both SR and composition. Eliminating greater proportions of rare species resulted in weaker correlations and large variation in the magnitude of observed correlations among data sets. Species subsets that comprised ca. 85% of the original set can be considered reliable surrogates, capable of adequately revealing patterns of SR and temporal or spatial turnover in many tropical bat assemblages. Our analyses thus demonstrate the potential as well as limitations for reducing survey effort and streamlining

  18. CORRELATION BETWEEN FAMILY COMMUNICATION PATTERNS AND JUVENILE DELINQUENCY IN JUNIOR HIGH SCHOOL

    Directory of Open Access Journals (Sweden)

    Zurriyatun Thoyibah

    2017-08-01

    Full Text Available Background: Adolescents, who are in a transition period, have a high risk of juvenile delinquency behavior. Effective and open communication between parents and adolescents can help adolescents avoid delinquent behavior. Objective: This study aims to examine the relationship between family communication patterns and juvenile delinquency in junior high school. Methods: This research employed a cross-sectional design with a descriptive correlational approach. 243 students were selected by simple random sampling from the 7th and 8th grade students of a junior high school. Questionnaires on juvenile delinquency and family communication patterns were used in this study. Data were analyzed using the Chi-square test. Results: The research showed that juvenile delinquency was in the low category for the majority of respondents (65%) and the communication pattern was mostly functional (73.3%). There was a significant relationship between family communication pattern and juvenile delinquency (p<0.05). Conclusion: Communication patterns within the family have a significant association with juvenile delinquency.

  19. Meal patterns and cooking practices in Southern France and Central England.

    Science.gov (United States)

    Pettinger, Clare; Holdsworth, Michelle; Gerber, Mariette

    2006-12-01

    To evaluate whether meal patterns and cooking practices in Central England and Mediterranean France conform to popular stereotypes, eating together as a household, preparation of meals, food purchasing patterns, cooking practices and eating out were investigated. Cross-sectional studies conducted simultaneously in April 2001 using self-administered postal questionnaires. England (Nottingham, East Midlands) and France (Montpellier, Languedoc-Roussillon). A stratified random sample of 1000 males and 1000 females aged 18-65 years was generated from the electoral roll in each country. The final sample comprised 826 subjects in England (58% males, 42% females; mean age 44 years) and 766 subjects in France (42% males, 58% females; mean age 42 years). Analyses were conducted on samples standardised for sociodemographic differences. The French cooked from raw ingredients most often. The popular stereotypes of French and English food cultures largely hold, as the importance of the convivial aspects of eating, as well as more traditional practices such as cooking meals from basic ingredients, structured mealtimes and less between-meal snacking, remain more prominent within the French population. This may contribute to the differences in prevalence of obesity seen between the two countries.

  20. Analysis of spatial patterns informs community assembly and sampling requirements for Collembola in forest soils

    Science.gov (United States)

    Dirilgen, Tara; Juceviča, Edite; Melecis, Viesturs; Querner, Pascal; Bolger, Thomas

    2018-01-01

    The relative importance of niche separation, non-equilibrial and neutral models of community assembly has been a theme in community ecology for many decades, with none appearing to be applicable under all circumstances. In this study, Collembola species abundances were recorded over eleven consecutive years in a spatially explicit grid and used to examine (i) whether observed beta diversity differed from that expected under conditions of neutrality, (ii) whether sampling points differed in their relative contributions to overall beta diversity, and (iii) the number of samples required to provide comparable estimates of species richness across three forest sites. Neutrality could not be rejected for 26 of the 33 forest-by-year combinations. However, there is a trend toward greater structure in the oldest forest, where beta diversity was greater than predicted by neutrality on five of the eleven sampling dates. The lack of difference in individual- and sample-based rarefaction curves also suggests randomness in the system at this particular scale of investigation. It seems that Collembola communities are not spatially aggregated and that assembly is driven primarily by neutral processes, particularly in the younger two sites. Whether this finding is due to small sample size or unaccounted-for environmental variables cannot be determined. Variability between dates and sites illustrates the potential of drawing incorrect conclusions if data are collected at a single site and a single point in time.

  1. Does diet-beverage intake affect dietary consumption patterns? Results from the Choose Healthy Options Consciously Everyday (CHOICE) randomized clinical trial.

    Science.gov (United States)

    Piernas, Carmen; Tate, Deborah F; Wang, Xiaoshan; Popkin, Barry M

    2013-03-01

    Little is understood about the effect of increased consumption of low-calorie sweeteners in diet beverages on dietary patterns and energy intake. We investigated whether energy intakes and dietary patterns were different in subjects who were randomly assigned to substitute caloric beverages with either water or diet beverages (DBs). Participants from the Choose Healthy Options Consciously Everyday randomized clinical trial (a 6-mo, 3-arm study) were included in the analysis [water group: n = 106 (94% women); DB group: n = 104 (82% women)]. For energy, macronutrient, and food and beverage intakes, we investigated the main effects of time, treatment, and the treatment-by-time interaction by using mixed models. Overall, the macronutrient composition changed in both groups without significant differences between groups over time. Both groups reduced absolute intakes of total daily energy, carbohydrates, fat, protein, saturated fat, total sugar, added sugar, and other carbohydrates. The DB group decreased energy from all beverages more than the water group did only at month 3 (significant group-by-time interaction) and showed a greater decrease in dessert intake than the water group did at month 6 (significant group-by-time interaction); that is, the DB group reduced energy from beverages and specifically reduced desserts more than the water group did. Our study does not provide evidence to suggest that a short-term consumption of DBs, compared with water, increases preferences for sweet foods and beverages. This trial was registered at clinicaltrials.gov as NCT01017783.

  2. Importance sampling of heavy-tailed iterated random functions

    NARCIS (Netherlands)

    B. Chen (Bohan); C.H. Rhee (Chang-Han); A.P. Zwart (Bert)

    2016-01-01

    We consider a stochastic recurrence equation of the form $Z_{n+1} = A_{n+1} Z_n + B_{n+1}$, where $\mathbb{E}[\log A_1]<0$, $\mathbb{E}[\log^+ B_1]<\infty$ and $\{(A_n,B_n)\}_{n\in\mathbb{N}}$ is an i.i.d. sequence of positive random vectors. The stationary distribution of this Markov
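
The recurrence is straightforward to simulate; the sketch below uses illustrative distributions satisfying the stated conditions (A lognormal with E[log A] = -1 < 0, B exponential, both positive; these are assumptions, not the paper's choices) and a crude Monte Carlo tail estimate, i.e. exactly the estimator that degrades for large thresholds and that importance sampling is designed to replace.

```python
import numpy as np

def simulate_Z(n_steps, n_paths, rng=None):
    """Iterate Z_{n+1} = A_{n+1} Z_n + B_{n+1} for i.i.d. positive (A, B).

    Illustrative choices (not from the paper): A ~ lognormal with
    E[log A] = -1 < 0, so the system contracts on average; B ~ Exp(1).
    """
    rng = np.random.default_rng(rng)
    z = np.zeros(n_paths)
    for _ in range(n_steps):
        a = rng.lognormal(mean=-1.0, sigma=1.0, size=n_paths)
        b = rng.exponential(scale=1.0, size=n_paths)
        z = a * z + b
    return z

z = simulate_Z(n_steps=200, n_paths=10_000, rng=1)
# Crude Monte Carlo estimate of the stationary tail probability P(Z > 10);
# for much larger thresholds the hit count collapses, which is where
# importance sampling becomes necessary.
p_tail = float(np.mean(z > 10.0))
```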

  3. Randomly and Non-Randomly Missing Renal Function Data in the Strong Heart Study: A Comparison of Imputation Methods.

    Directory of Open Access Journals (Sweden)

    Nawar Shara

    Full Text Available Kidney and cardiovascular disease are widespread among populations with a high prevalence of diabetes, such as American Indians participating in the Strong Heart Study (SHS). Studying these conditions simultaneously in longitudinal studies is challenging, because the morbidity and mortality associated with these diseases result in missing data, and these data are likely not missing at random. When such data are merely excluded, study findings may be compromised. In this article, a subset of 2264 participants with complete renal function data from Strong Heart Exams 1 (1989-1991), 2 (1993-1995), and 3 (1998-1999) was used to examine the performance of five methods used to impute missing data: listwise deletion, mean of serial measures, adjacent value, multiple imputation, and pattern-mixture. Three missing-at-random models and one non-missing-at-random model were used to compare the performance of the imputation techniques on randomly and non-randomly missing data. The pattern-mixture method was found to perform best for imputing renal function data that were not missing at random. Determining whether data are missing at random or not can help in choosing the imputation method that will provide the most accurate results.
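
Two of the five methods named above, mean of serial measures and adjacent value, are simple enough to sketch for a single participant's series. The function below is a generic illustration, not the SHS analysis code; multiple imputation and pattern-mixture require full statistical models and are omitted.

```python
import numpy as np

def impute_series(values):
    """Impute NaNs in one participant's serial measurements two ways:
    'mean' substitutes the mean of that participant's observed values
    (mean of serial measures); 'adjacent' carries the last observed value
    forward, back-filling any leading gap from the first observation.
    """
    v = np.asarray(values, dtype=float)
    obs = ~np.isnan(v)
    if not obs.any():
        raise ValueError("no observed values to impute from")
    mean_imputed = np.where(obs, v, v[obs].mean())
    # Index of the last observed value at or before each position.
    idx = np.maximum.accumulate(np.where(obs, np.arange(len(v)), 0))
    adjacent = v[idx]
    first = int(np.argmax(obs))          # position of the first observation
    adjacent[:first] = v[first]          # back-fill a leading gap
    return {"mean": mean_imputed, "adjacent": adjacent}
```

Listwise deletion, by contrast, simply drops any participant with a missing exam, which is the exclusion the abstract warns can compromise findings.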

  4. Mechanical properties of banana/kenaf fiber-reinforced hybrid polyester composites: Effect of woven fabric and random orientation

    International Nuclear Information System (INIS)

    Alavudeen, A.; Rajini, N.; Karthikeyan, S.; Thiruchitrambalam, M.; Venkateshwaren, N.

    2015-01-01

    Highlights: • This paper presents the fabrication of kenaf/banana fiber hybrid composites. • The effect of weaving pattern and random orientation on mechanical properties was studied. • The role of interfacial adhesion after chemical modification was analyzed with the aid of SEM. • Hybridization of kenaf and banana fibers in plain woven composites exhibits maximum mechanical strength. - Abstract: The present work deals with the effect of weaving patterns and random orientation on the mechanical properties of banana, kenaf and banana/kenaf fiber-reinforced hybrid polyester composites. Composites were prepared using the hand lay-up method with two different weaving patterns, namely plain and twill. Of the two weaving patterns, the plain type showed improved tensile properties compared to the twill type in all the fabricated composites. Furthermore, the maximum increase in mechanical strength was observed in the plain woven hybrid composites rather than in randomly oriented composites. This indicates minimum stress development at the interface of the composites due to the distribution of load transfer along the fiber direction. Moreover, alkali (NaOH) and sodium lauryl sulfate (SLS) treatments appear to provide an additional improvement in mechanical strength through enhanced interfacial bonding. Morphological studies of fractured mechanical testing samples were performed by scanning electron microscopy (SEM) to understand the de-bonding of fiber/matrix adhesion

  5. Randomized Comparison of Two Vaginal Self-Sampling Methods for Human Papillomavirus Detection: Dry Swab versus FTA Cartridge.

    Directory of Open Access Journals (Sweden)

    Rosa Catarino

    Full Text Available Human papillomavirus (HPV) self-sampling (self-HPV) is valuable in cervical cancer screening. HPV testing is usually performed on physician-collected cervical smears stored in liquid-based medium. Dry filters and swabs are an alternative. We evaluated the adequacy of self-HPV using two dry storage and transport devices, the FTA cartridge and swab. A total of 130 women performed two consecutive self-HPV samples. Randomization determined which of the two tests was performed first: self-HPV using dry swabs (s-DRY) or vaginal specimen collection using a cytobrush applied to an FTA cartridge (s-FTA). After self-HPV, a physician collected a cervical sample using liquid-based medium (Dr-WET). HPV types were identified by real-time PCR. Agreement between collection methods was measured using the kappa statistic. HPV prevalence for high-risk types was 62.3% (95% CI: 53.7-70.2) detected by s-DRY, 56.2% (95% CI: 47.6-64.4) by Dr-WET, and 54.6% (95% CI: 46.1-62.9) by s-FTA. There was overall agreement of 70.8% between s-FTA and s-DRY samples (kappa = 0.34), and of 82.3% between self-HPV and Dr-WET samples (kappa = 0.56). Detection sensitivities for low-grade squamous intraepithelial lesion or worse (LSIL+) were: 64.0% (95% CI: 44.5-79.8) for s-FTA, 84.6% (95% CI: 66.5-93.9) for s-DRY, and 76.9% (95% CI: 58.0-89.0) for Dr-WET. The preferred self-collection method among patients was s-DRY (40.8% vs. 15.4%). Regarding costs, the FTA card was five times more expensive than the swab (~5 US dollars (USD) per card vs. ~1 USD per swab). Self-HPV using dry swabs is sensitive for detecting LSIL+ and less expensive than s-FTA. International Standard Randomized Controlled Trial Number (ISRCTN): 43310942.
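
Agreement between the collection methods above is measured with the kappa statistic. A self-contained sketch of Cohen's kappa, run on made-up paired test results rather than the study's data:

```python
from collections import Counter

def cohens_kappa(x, y):
    """Cohen's kappa for two paired categorical ratings:
    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement
    and p_e the agreement expected by chance from the marginal frequencies.
    """
    if len(x) != len(y) or not x:
        raise ValueError("ratings must be paired and non-empty")
    n = len(x)
    p_o = sum(a == b for a, b in zip(x, y)) / n
    mx, my = Counter(x), Counter(y)
    p_e = sum((mx[c] / n) * (my[c] / n) for c in set(mx) | set(my))
    return (p_o - p_e) / (1.0 - p_e)

# Made-up paired results for two tests (1 = HPV positive, 0 = negative).
test_a = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
test_b = [1, 0, 0, 0, 1, 0, 1, 1, 1, 0]
kappa = cohens_kappa(test_a, test_b)   # ~0.6: agreement corrected for chance
```

Kappa discounts the raw percentage agreement by the agreement expected from the marginals alone, which is why 70.8% raw agreement can correspond to a kappa as low as 0.34.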

  6. Capsaicin on the viability of random-pattern skin flaps in rats

    Directory of Open Access Journals (Sweden)

    Gustavo Roberto de Godoy

    2010-10-01

    Full Text Available PURPOSE: To evaluate the effects of capsaicin on the viability of ischemic random-pattern skin flaps in rats. METHODS: Forty EPM1-Wistar rats were randomized into two groups of 20 animals each, the capsaicin group and the control group. A random-pattern skin flap measuring 10 x 4 cm was raised and a plastic barrier was placed between the flap and the donor site. After the surgical procedure, the control group was treated with an inert vehicle in the form of a cream applied uniformly to a rayon bandage which, in turn, was applied to the surface of the skin flap. The capsaicin group was treated in the same way, but in this case capsaicin was added to the cream. This procedure was repeated for two consecutive days. RESULTS: There was a significantly smaller amount of flap necrosis in the capsaicin group (35.07%) than in the control group (44.75%) (p=0.035). CONCLUSION: Topical administration of capsaicin improved the viability of ischemic random-pattern skin flaps in rats.

  7. Two-dimensional random arrays for real time volumetric imaging

    DEFF Research Database (Denmark)

    Davidsen, Richard E.; Jensen, Jørgen Arendt; Smith, Stephen W.

    1994-01-01

    Two-dimensional arrays are necessary for a variety of ultrasonic imaging techniques, including elevation focusing, 2-D phase aberration correction, and real time volumetric imaging. In order to reduce system cost and complexity, sparse 2-D arrays have been considered with element geometries selected ad hoc, by algorithm, or by random process. Two random sparse array geometries and a sparse array with a Mills cross receive pattern were simulated and compared to a fully sampled aperture with the same overall dimensions. The sparse arrays were designed to the constraints of the Duke University real time volumetric imaging system, which employs a wide transmit beam and receive mode parallel processing to increase image frame rate. Depth-of-field comparisons were made from simulated on-axis and off-axis beamplots at ranges from 30 to 160 mm for both coaxial and offset transmit and receive.

  8. Relative efficiency and sample size for cluster randomized trials with variable cluster sizes.

    Science.gov (United States)

    You, Zhiying; Williams, O Dale; Aban, Inmaculada; Kabagambe, Edmond Kato; Tiwari, Hemant K; Cutter, Gary

    2011-02-01

    The statistical power of cluster randomized trials depends on two sample size components, the number of clusters per group and the numbers of individuals within clusters (cluster size). Variable cluster sizes are common and this variation alone may have significant impact on study power. Previous approaches have taken this into account by either adjusting total sample size using a designated design effect or adjusting the number of clusters according to an assessment of the relative efficiency of unequal versus equal cluster sizes. This article defines a relative efficiency of unequal versus equal cluster sizes using noncentrality parameters, investigates properties of this measure, and proposes an approach for adjusting the required sample size accordingly. We focus on comparing two groups with normally distributed outcomes using t-test, and use the noncentrality parameter to define the relative efficiency of unequal versus equal cluster sizes and show that statistical power depends only on this parameter for a given number of clusters. We calculate the sample size required for an unequal cluster sizes trial to have the same power as one with equal cluster sizes. Relative efficiency based on the noncentrality parameter is straightforward to calculate and easy to interpret. It connects the required mean cluster size directly to the required sample size with equal cluster sizes. Consequently, our approach first determines the sample size requirements with equal cluster sizes for a pre-specified study power and then calculates the required mean cluster size while keeping the number of clusters unchanged. Our approach allows adjustment in mean cluster size alone or simultaneous adjustment in mean cluster size and number of clusters, and is a flexible alternative to and a useful complement to existing methods. Comparison indicated that we have defined a relative efficiency that is greater than the relative efficiency in the literature under some conditions. Our measure
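
The paper defines relative efficiency through noncentrality parameters. As a rough companion, a widely used textbook approximation (a standard design-effect formula, not the authors' exact measure) inflates the individually randomized sample size by a factor that depends on the mean cluster size, the intracluster correlation, and the coefficient of variation (CV) of cluster sizes; all numbers below are hypothetical.

```python
import math

def design_effect(mean_cluster_size, icc, cv=0.0):
    """Approximate design effect for a cluster-randomized trial:
    DEFF = 1 + ((cv**2 + 1) * m_bar - 1) * icc,
    where m_bar is the mean cluster size, icc the intracluster correlation,
    and cv the coefficient of variation of cluster sizes. With cv = 0 this
    reduces to the classic 1 + (m_bar - 1) * icc for equal clusters.
    """
    return 1.0 + ((cv ** 2 + 1.0) * mean_cluster_size - 1.0) * icc

def clusters_needed(n_individual, mean_cluster_size, icc, cv=0.0):
    """Clusters per arm: the individually randomized sample size, inflated
    by the design effect and divided by the mean cluster size."""
    deff = design_effect(mean_cluster_size, icc, cv)
    return math.ceil(n_individual * deff / mean_cluster_size)

# Hypothetical numbers: 128 subjects per arm if individually randomized,
# mean cluster size 20, ICC 0.05.
equal = clusters_needed(128, 20, 0.05)             # cv = 0   -> 13 clusters
unequal = clusters_needed(128, 20, 0.05, cv=0.65)  # variable -> 16 clusters
```

This mirrors the workflow described above, first determining requirements under equal cluster sizes and then adjusting for size variability, though the authors' noncentrality-based relative efficiency can differ from this approximation.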

  9. Chopper model of pattern recognition

    NARCIS (Netherlands)

    van Hemmen, J.L.; Enter, A.C.D. van

    A simple model is proposed that allows an efficient storage and retrieval of random patterns. Also correlated patterns can be handled. The data are stored in an Ising-spin system with ferromagnetic interactions between all the spins and the main idea is to "chop" the system along the boundaries

  10. Power Spectrum Estimation of Randomly Sampled Signals

    DEFF Research Database (Denmark)

    Velte, C. M.; Buchhave, P.; K. George, W.

    algorithms: sample-and-hold and the direct spectral estimator without residence time weighting. The computer generated signal is a Poisson process with a sample rate proportional to velocity magnitude that consists of well-defined frequency content, which makes bias easy to spot. The idea
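
A minimal sketch of a sample-and-hold spectral estimate (a generic illustration; the signal, rates and durations are assumptions, not the paper's test signal): hold each randomly timed sample until the next one arrives, resample onto a uniform grid, and take a periodogram. The hold step acts as a low-pass filter, which is one source of the bias this kind of comparison is concerned with.

```python
import numpy as np

def sample_and_hold_psd(t, x, fs, duration):
    """Periodogram of an irregularly sampled signal after zero-order-hold
    (sample-and-hold) resampling onto a uniform grid at rate fs.
    """
    grid = np.arange(0.0, duration, 1.0 / fs)
    # Index of the most recent irregular sample at or before each grid time
    # (clipped so grid times before the first sample reuse the first one).
    idx = np.searchsorted(t, grid, side="right") - 1
    held = x[np.clip(idx, 0, len(x) - 1)]
    held = held - held.mean()                      # remove the DC component
    spec = np.abs(np.fft.rfft(held)) ** 2 / len(held)
    freqs = np.fft.rfftfreq(len(held), d=1.0 / fs)
    return freqs, spec

# Illustrative test signal: a 5 Hz tone observed at ~1000 random times
# (approximately a Poisson sampling process) over 20 s.
rng = np.random.default_rng(0)
duration = 20.0
t = np.sort(rng.uniform(0.0, duration, 1000))
x = np.sin(2.0 * np.pi * 5.0 * t)
freqs, spec = sample_and_hold_psd(t, x, fs=100.0, duration=duration)
peak_freq = freqs[np.argmax(spec)]                 # expected near 5 Hz
```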

  11. Dietary Patterns and Blood Pressure in Adults: A Systematic Review and Meta-Analysis of Randomized Controlled Trials.

    Science.gov (United States)

    Ndanuko, Rhoda N; Tapsell, Linda C; Charlton, Karen E; Neale, Elizabeth P; Batterham, Marijka J

    2016-01-01

    Hypertension is a major risk factor for developing cardiovascular disease, stroke, and kidney disease. To lower blood pressure (BP), several lifestyle changes are recommended such as weight loss, exercise, and following a healthy diet. Investigating the effect of single nutrients may have positive results, but food is consumed as part of a whole diet, resulting in nutrient interactions. The aim of this systematic review and meta-analysis was to assess the effect of dietary patterns on BP in adults. Studies that were published between January 1999 and June 2014 were retrieved using Scopus, Web of Science, and the MEDLINE database. Seventeen randomized controlled trials were included in the meta-analysis. The results suggest that healthy dietary patterns such as the Dietary Approaches to Stop Hypertension diet, Nordic diet, and Mediterranean diet significantly lowered systolic BP and diastolic BP by 4.26 mm Hg and 2.38 mm Hg, respectively. These diets are rich in fruit, vegetables, whole grains, legumes, seeds, nuts, fish, and dairy and low in meat, sweets, and alcohol. Lifestyle factors such as exercise and weight loss in combination with dietary changes may also reduce BP. Further research is needed to establish the effect of dietary patterns on BP in different cultures other than those identified in this review. The review was registered on PROSPERO (International prospective register of systematic reviews) as CRD42015016272. © 2016 American Society for Nutrition.

  12. Cost-effective, transfer-free, flexible resistive random access memory using laser-scribed reduced graphene oxide patterning technology.

    Science.gov (United States)

    Tian, He; Chen, Hong-Yu; Ren, Tian-Ling; Li, Cheng; Xue, Qing-Tang; Mohammad, Mohammad Ali; Wu, Can; Yang, Yi; Wong, H-S Philip

    2014-06-11

    Laser scribing is an attractive reduced graphene oxide (rGO) growth and patterning technology because the process is low-cost, time-efficient, transfer-free, and flexible. Various laser-scribed rGO (LSG) components such as capacitors, gas sensors, and strain sensors have been demonstrated. However, obstacles remain toward practical application of the technology where all the components of a system are fabricated using laser scribing. Memory components, if developed, will substantially broaden the application space of low-cost, flexible electronic systems. For the first time, a low-cost approach to fabricate resistive random access memory (ReRAM) using laser-scribed rGO as the bottom electrode is experimentally demonstrated. The one-step laser scribing technology allows transfer-free rGO synthesis directly on flexible substrates or non-flat substrates. Using this time-efficient laser-scribing technology, the patterning of a memory-array area up to 100 cm² can be completed in 25 min. Without requiring the photoresist coating for lithography, the surface of patterned rGO remains as clean as its pristine state. Ag/HfOx/LSG ReRAM using laser-scribing technology is fabricated in this work. Comprehensive electrical characteristics are presented including forming-free behavior, stable switching, reasonable reliability performance and potential for 2-bit storage per memory cell. The results suggest that laser-scribing technology can potentially produce more cost-effective and time-effective rGO-based circuits and systems for practical applications.

  13. Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.

    Science.gov (United States)

    Blutke, Andreas; Wanke, Rüdiger

    2018-03-06

    In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often-limited number of available animals of these models, establishment of (biobank) collections of adequately processed tissue samples suited for a broad spectrum of subsequent analysis methods, including analyses not specified at the time point of sampling, represents a meaningful approach to take full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of the sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimen for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for generation of representative, multi-purpose biobank specimens from porcine tissues is presented. The methods described here include determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point-counting, determination of the extent of tissue shrinkage related to histological embedding of samples, and generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical
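
The volume-weighted systematic random sampling by point-counting mentioned above rests on the Cavalieri principle: volume is estimated as slice spacing times the area each grid point represents times the total number of grid points hitting the tissue, with a uniformly random start making the sample systematic-random rather than arbitrary. A sketch with invented numbers (not from the guidelines):

```python
import random

def cavalieri_volume(points_per_slice, slice_thickness, area_per_point):
    """Cavalieri estimator of a specimen's volume from systematic slices:
    V = slice_thickness * area_per_point * (total grid points hitting tissue).
    """
    return slice_thickness * area_per_point * sum(points_per_slice)

def systematic_start(period, rng):
    """Uniformly random start in [0, period) for systematic sampling, which
    gives every slice position the same inclusion probability."""
    return rng.random() * period

# Invented example: 8 slices cut 1.0 cm apart starting at a random offset,
# counted with a grid in which each point represents 0.25 cm^2 of section.
start = systematic_start(1.0, random.Random(42))
counts = [12, 18, 25, 30, 28, 22, 14, 6]
volume_cm3 = cavalieri_volume(counts, 1.0, 0.25)   # 0.25 * 155 = 38.75 cm^3
```

The random start is what makes the estimator unbiased: every part of the organ has the same chance of being hit, even though the slices themselves are evenly spaced.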

  14. Assessing differences in groups randomized by recruitment chain in a respondent-driven sample of Seattle-area injection drug users.

    Science.gov (United States)

    Burt, Richard D; Thiede, Hanne

    2014-11-01

    Respondent-driven sampling (RDS) is a form of peer-based study recruitment and analysis that incorporates features designed to limit and adjust for biases in traditional snowball sampling. It is being widely used in studies of hidden populations. We report an empirical evaluation of RDS's consistency and variability, comparing groups recruited contemporaneously, by identical methods and using identical survey instruments. We randomized recruitment chains from the RDS-based 2012 National HIV Behavioral Surveillance survey of injection drug users in the Seattle area into two groups and compared them in terms of sociodemographic characteristics, drug-associated risk behaviors, sexual risk behaviors, human immunodeficiency virus (HIV) status and HIV testing frequency. The two groups differed in five of the 18 variables examined (P ≤ .001): race (e.g., 60% white vs. 47%), gender (52% male vs. 67%), area of residence (32% downtown Seattle vs. 44%), an HIV test in the previous 12 months (51% vs. 38%). The difference in serologic HIV status was particularly pronounced (4% positive vs. 18%). In four further randomizations, differences in one to five variables attained this level of significance, although the specific variables involved differed. We found some material differences between the randomized groups. Although the variability of the present study was less than has been reported in serial RDS surveys, these findings indicate caution in the interpretation of RDS results. Copyright © 2014 Elsevier Inc. All rights reserved.

  15. A random cluster survey and a convenience sample give comparable estimates of immunity to vaccine preventable diseases in children of school age in Victoria, Australia.

    Science.gov (United States)

    Kelly, Heath; Riddell, Michaela A; Gidding, Heather F; Nolan, Terry; Gilbert, Gwendolyn L

    2002-08-19

    We compared estimates of the age-specific population immunity to measles, mumps, rubella, hepatitis B and varicella zoster viruses in Victorian school children obtained by a national sero-survey, using a convenience sample of residual sera from diagnostic laboratories throughout Australia, with those from a three-stage random cluster survey. When grouped according to school age (primary or secondary school) there was no significant difference in the estimates of immunity to measles, mumps, hepatitis B or varicella. Compared with the convenience sample, the random cluster survey estimated higher immunity to rubella in samples from both primary (98.7% versus 93.6%, P = 0.002) and secondary school students (98.4% versus 93.2%, P = 0.03). Despite some limitations, this study suggests that the collection of a convenience sample of sera from diagnostic laboratories is an appropriate sampling strategy to provide population immunity data that will inform Australia's current and future immunisation policies. Copyright 2002 Elsevier Science Ltd.

  16. Naturally Occurring Radionuclides and Rare Earth Elements Pattern in Weathered Japanese Soil Samples

    International Nuclear Information System (INIS)

    Sahoo, S.K.; Hosoda, M.; Takahashi, H.; Sorimachi, A.; Ishikawa, T.; Tokonami, S.; Uchida, S.

    2011-01-01

    From the viewpoint of radiation protection, determination of natural radionuclides, e.g. thorium and uranium, in soil samples is important, and accurate methods for the determination of Th and U are gaining importance. The geochemical behaviors of Th, U and rare earth elements (REEs) are relatively close to one another compared with those of other elements in the geological environment. Radioactive elements like 232Th and 238U, along with their decay products (e.g. 226Ra), are present in most environmental matrices and can be transferred to living bodies by different pathways, making them sources of human exposure. Therefore, it is necessary to monitor these natural radionuclides in weathered soil samples to assess the possible hazards. The activity concentrations of 226Ra, 228Th, and 40K in soils have been measured using a γ-ray spectroscopy system with a high-purity germanium detector. Thorium, uranium and REEs were determined from the same samples using inductively coupled plasma mass spectrometry (ICP-MS). Granitic rocks contain higher amounts of Th, U and light REEs than other igneous rocks such as basalts and andesites. Therefore, it is necessary to understand the interaction between REEs and the nature of soils, as soils are complex heterogeneous mixtures of organic and inorganic solids, water and gases. In this paper, we discuss the distribution patterns of 226Ra, 232Th and 238U, along with REEs, in soil samples of weathered acid rock (granite and rhyolite) collected from two prefectures in Japan: 1. Gifu and 2. Okinawa. (author)

  17. Pattern formation during electropolishing

    International Nuclear Information System (INIS)

    Yuzhakov, V.V.; Chang, H.; Miller, A.E.

    1997-01-01

    Using atomic force microscopy, we find that the surface morphology of a dissolving aluminum anode in a commercial electropolishing electrolyte can exhibit both highly regular and randomly packed stripe and hexagonal patterns with amplitudes of about 5 nm and wavelengths of 100 nm. The driving instability of this pattern formation phenomenon is proposed to be the preferential adsorption of polar or polarizable organic molecules on surface ridges where the contorted double layer produces a higher electric potential gradient. The enhanced relative coverage shields the anode and induces a smaller dissolution rate at the ridges. The instability is balanced by surface diffusion of the adsorbate to yield a length scale of 4π(D_s/k_d)^(1/2), where D_s is the surface diffusivity and k_d is the desorption coefficient of the adsorbate, which correlates well with the measured wavelength. A long-wavelength expansion of the double-layer field yields an interface evolution equation that reproduces all of the observed patterns. In particular, bifurcation analysis and numerical simulation yield a single voltage-dependent dimensionless parameter ξ that measures a balance between smoothing of adsorbate concentration by electric-field-dependent surface diffusion and fluctuation due to interfacial curvature and stretching. Randomly oriented stripes are favored at large ξ (low voltage), while random hills dominate at small ξ (high voltage), with perfectly periodic stripes and hexagonal hill patterns within a small window near ξ=1. These predictions are in qualitative and quantitative agreement with our measurements. copyright 1997 The American Physical Society

  18. The contribution of simple random sampling to observed variations in faecal egg counts.

    Science.gov (United States)

    Torgerson, Paul R; Paul, Michaela; Lewis, Fraser I

    2012-09-10

    It has been over 100 years since the classical paper published by Gosset in 1907, under the pseudonym "Student", demonstrated that yeast cells suspended in a fluid and measured by a haemocytometer conformed to a Poisson process. Similarly, parasite eggs in a faecal suspension also conform to a Poisson process. Despite this, there are common misconceptions about how to analyse or interpret observations from the McMaster or similar quantitative parasitic diagnostic techniques, widely used for evaluating parasite eggs in faeces. The McMaster technique can easily be shown from a theoretical perspective to give variable results that inevitably arise from the random distribution of parasite eggs in a well mixed faecal sample. The Poisson processes that lead to this variability are described, and illustrative examples are given of the potentially large confidence intervals that can arise from faecal egg counts calculated from the observations on a McMaster slide. Attempts to modify the McMaster technique, or indeed other quantitative techniques, to ensure uniform egg counts are doomed to failure and belie ignorance of Poisson processes. A simple method to immediately identify excess variation/poor sampling from replicate counts is provided. Copyright © 2012 Elsevier B.V. All rights reserved.
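    The scale of this inherent variability can be sketched under the Poisson assumption. The sketch below uses a normal approximation and a multiplication factor of 50, a typical McMaster value chosen here purely for illustration; exact Poisson intervals would be somewhat wider at small counts.

```python
import math

def epg_confidence_interval(eggs_counted, multiplication_factor=50, z=1.96):
    """Approximate 95% CI for eggs per gram from a McMaster slide count.

    The slide count k is treated as Poisson, so its standard deviation
    is sqrt(k); the estimate and interval are then scaled by the
    technique's multiplication factor. The normal approximation is
    rough for small counts.
    """
    k = eggs_counted
    half_width = z * math.sqrt(k)
    lower = multiplication_factor * max(0.0, k - half_width)
    upper = multiplication_factor * (k + half_width)
    return multiplication_factor * k, lower, upper


# Ten eggs counted gives an estimate of 500 EPG, but the interval spans
# roughly 190-810 EPG: the spread is inherent, not poor technique.
epg, low, high = epg_confidence_interval(10)
```

    This is the sense in which replicate counts from one well-mixed sample must be expected to differ substantially.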

  19. Relating neuronal firing patterns to functional differentiation of cerebral cortex.

    Directory of Open Access Journals (Sweden)

    Shigeru Shinomoto

    2009-07-01

    Full Text Available It has been empirically established that the cerebral cortical areas defined by Brodmann one hundred years ago solely on the basis of cellular organization are closely correlated to their function, such as sensation, association, and motion. Cytoarchitectonically distinct cortical areas have different densities and types of neurons. Thus, signaling patterns may also vary among cytoarchitectonically unique cortical areas. To examine how neuronal signaling patterns are related to innate cortical functions, we detected intrinsic features of cortical firing by devising a metric that efficiently isolates non-Poisson irregular characteristics, independent of spike rate fluctuations that are caused extrinsically by ever-changing behavioral conditions. Using the new metric, we analyzed spike trains from over 1,000 neurons in 15 cortical areas sampled by eight independent neurophysiological laboratories. Analysis of firing-pattern dissimilarities across cortical areas revealed a gradient of firing regularity that corresponded closely to the functional category of the cortical area; neuronal spiking patterns are regular in motor areas, random in the visual areas, and bursty in the prefrontal area. Thus, signaling patterns may play an important role in function-specific cerebral cortical computation.
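    The abstract does not spell out the metric, but a standard rate-robust irregularity measure from this literature is the local variation (Lv) of inter-spike intervals; the sketch below assumes a measure of this kind rather than reproducing the authors' exact definition.

```python
def local_variation(isis):
    """Local variation Lv of a sequence of inter-spike intervals (ISIs).

    Each interval is compared only with its successor, so slow firing-
    rate fluctuations largely cancel out: Lv ~ 0 for regular firing,
    ~ 1 for a Poisson process, and > 1 for bursty firing.
    """
    if len(isis) < 2:
        raise ValueError("need at least two inter-spike intervals")
    total = sum(((a - b) / (a + b)) ** 2 for a, b in zip(isis, isis[1:]))
    return 3.0 * total / (len(isis) - 1)


regular = local_variation([0.1] * 20)        # perfectly regular train: 0.0
bursty = local_variation([1.0, 3.0] * 5)     # alternating short/long ISIs
```

    Because only neighboring intervals are compared, the statistic separates intrinsic spiking regularity from extrinsic, behavior-driven rate changes, which is the property the study exploits.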

  20. The effect of autologous activated platelet-rich plasma injection on female pattern hair loss: A randomized placebo-controlled study.

    Science.gov (United States)

    Tawfik, Abeer Attia; Osman, Mai Abdel Raouf

    2018-02-01

    Hair is an essential part of a woman's appearance and attractiveness. This is reflected in the predominantly psychological morbidity that can be associated with female pattern hair loss. Platelet-rich plasma (PRP) has been used in numerous fields of medicine. Recently, PRP has received growing attention as a potential therapeutic tool for hair loss. To evaluate the efficacy and safety of autologous platelet-rich plasma in the treatment of female pattern hair loss. Thirty female patients with female pattern hair loss were randomly assigned to receive autologous PRP injection into a selected area, and another area was injected with normal saline as a placebo. Sessions were performed weekly for a maximum total of four sessions. Patients were followed up 6 months after the end of the last session. The outcome was assessed both subjectively and objectively. There was a statistically significant difference between the PRP and placebo areas in hair density and hair thickness as measured by a folliscope. The hair pull test became negative in PRP-injected areas in 25 patients (83%), with an average of three hairs. Global pictures showed a significant improvement in hair volume and quality together with a high overall patient satisfaction in PRP-injected sites, and these results were maintained during the 6-month follow-up. Platelet-rich plasma injections can be regarded as an alternative for the treatment of female pattern hair loss with minimal morbidity and a low cost-to-benefit ratio. © 2017 Wiley Periodicals, Inc.

  1. The Role of Binocular Disparity in Rapid Scene and Pattern Recognition

    Directory of Open Access Journals (Sweden)

    Matteo Valsecchi

    2013-04-01

    Full Text Available We investigated the contribution of binocular disparity to the rapid recognition of scenes and simpler spatial patterns using a paradigm combining backward masked stimulus presentation and short-term match-to-sample recognition. First, we showed that binocular disparity did not contribute significantly to the recognition of briefly presented natural and artificial scenes, even when the availability of monocular cues was reduced. Subsequently, using dense random dot stereograms as stimuli, we showed that observers were in principle able to extract spatial patterns defined only by disparity under brief, masked presentations. Comparing our results with the predictions from a cue-summation model, we showed that combining disparity with luminance did not per se disrupt the processing of disparity. Our results suggest that the rapid recognition of scenes is mediated mostly by a monocular comparison of the images, although we can rely on stereo in fast pattern recognition.

  2. Traditional Chinese medicine for stable angina pectoris via TCM pattern differentiation and TCM mechanism: study protocol of a randomized controlled trial.

    Science.gov (United States)

    Zhang, Zhe; Zhang, Fan; Wang, Yang; Du, Yi; Zhang, Huiyong; Kong, Dezhao; Liu, Yue; Yang, Guanlin

    2014-10-30

    Stable angina pectoris is experienced as trans-sternal or retro-sternal pressure or pain that may radiate to the left arm, neck or back. Although the available evidence relating to its effectiveness and mechanism is weak, traditional Chinese medicine is used as an alternative therapy for stable angina pectoris. We report a protocol of a randomized controlled trial using traditional Chinese medicine to investigate the effectiveness, mechanism and safety for patients with stable angina pectoris. This is a north-east Chinese, multi-center, multi-blinded, placebo-controlled superiority randomized trial. A total of 240 patients with stable angina pectoris will be randomly assigned to three groups: two treatment groups and a control group. The treatment groups will receive Chinese herbal medicine, consisting of Yi-Qi-Jian-Pi and Qu-Tan-Hua-Zhuo granule and Yi-Qi-Jian-Pi and Qu-Tan-Hua-Yu granule, respectively, plus conventional medicine. The control group will receive placebo medicine in addition to conventional medicine. All three groups will undergo a 12-week treatment and 2-week follow-up. Four visits in total will be scheduled for each subject: one each in weeks 0, 4, 12 and 14. The primary outcomes include the frequency of angina pectoris attacks, the dosage of nitroglycerin, and the body limitation dimension of the Seattle Angina Questionnaire (SAQ). The secondary outcomes include the remaining dimensions of the SAQ, a traditional Chinese medicine pattern questionnaire, and others. Therapeutic mechanism outcomes, safety outcomes and endpoint outcomes will also be assessed. The primary aim of this trial is to develop a standard protocol to utilize high-quality EBM evidence for assessing the effectiveness and safety of SAP treatment via TCM pattern differentiation, as well as exploring the efficacy mechanism and regulation with molecular biology and systems biology. ChiCTR-TRC-13003608, registered 18 June 2013.

  3. Employing online quantum random number generators for generating truly random quantum states in Mathematica

    Science.gov (United States)

    Miszczak, Jarosław Adam

    2013-01-01

    numbers generated by a quantum real number generator. Reasons for new version: Added support for the high-speed on-line quantum random number generator and improved methods for retrieving lists of random numbers. Summary of revisions: The presented version provides two significant improvements. The first one is the ability to use the on-line Quantum Random Number Generation service developed by PicoQuant GmbH and the Nano-Optics groups at the Department of Physics of Humboldt University. The on-line service supported in version 2.0 of the TRQS package provides faster access to true randomness sources constructed using the laws of quantum physics. The service is freely available at https://qrng.physik.hu-berlin.de/. The use of this service allows using the presented package without the need for a physical quantum random number generator. The second improvement introduced in this version is the ability to retrieve arrays of random data directly from the source used. This increases the speed of the random number generation, especially in the case of an on-line service, where it reduces the time necessary to establish the connection. Thanks to the speed improvement of the presented version, the package can now be used in simulations requiring larger amounts of random data. Moreover, the functions for generating random numbers provided by the current version of the package more closely follow the pattern of the functions for generating pseudo-random numbers provided in Mathematica. Additional comments: Speed comparison: The implementation of the support for the QRNG on-line service provides a noticeable improvement in the speed of random number generation. For samples of real numbers of size 10^1, 10^2, …, 10^7, the times required to generate these samples using the Quantis USB device and the QRNG service are compared in Fig. 1. The presented results show that the use of the on-line service provides faster access to random numbers. One should note, however, that the speed gain can increase or

  4. 42 CFR 431.814 - Sampling plan and procedures.

    Science.gov (United States)

    2010-10-01

    ... reliability of the reduced sample. (4) The sample selection procedure. Systematic random sampling is... sampling, and yield estimates with the same or better precision than achieved in systematic random sampling... 42 Public Health 4 2010-10-01 2010-10-01 false Sampling plan and procedures. 431.814 Section 431...

  5. Parental interaction patterns in children with attention deficit hyperactive disorder and control group

    Directory of Open Access Journals (Sweden)

    Mojgan Karahmadi

    2007-07-01

    Full Text Available

    BACKGROUND: Parental communication patterns influence children's personality. This study investigated effects of parental interaction patterns on children with attention deficit hyperactive disorder (ADHD.
    METHODS: Fifty male children, 7-12 years old, were selected in two groups. The first group included students with ADHD referred to psychiatry clinics in Isfahan, diagnosed using DSM-IV criteria (25 subjects). The second group involved healthy boys selected by multistage random cluster sampling from primary schools in five districts of Isfahan (25 subjects), from September 2005 to March 2005. The Schaffer and Edgerton parental interaction questionnaire was completed for each child.
    RESULTS: Mean scores of parental interaction patterns in healthy children were all higher than those in ADHD children except for “aggression control” and “lack of aggressive attachment”.
    CONCLUSIONS: The severity of ADHD signs has negative relationship with parental "admission" and parental "control" patterns. It also has positive relationship with “lack of aggressive/attachment” and “aggressive/control” patterns.
    KEY WORDS: Parental interaction patterns, ADHD.

  6. National-scale vegetation change across Britain; an analysis of sample-based surveillance data from the Countryside Surveys of 1990 and 1998

    NARCIS (Netherlands)

    Smart, S.M.; Clarke, R.T.; Poll, van de H.M.; Robertson, E.J.; Shield, E.R.; Bunce, R.G.H.; Maskell, L.C.

    2003-01-01

    Patterns of vegetation across Great Britain (GB) between 1990 and 1998 were quantified based on an analysis of plant species data from a total of 9596 fixed plots. Plots were established on a stratified random basis within 501 1-km sample squares located as part of the Countryside Survey of GB.

  7. Stratified random sampling plans designed to assist in the determination of radon and radon daughter concentrations in underground uranium mine atmosphere

    International Nuclear Information System (INIS)

    Makepeace, C.E.

    1981-01-01

    Sampling strategies for the monitoring of deleterious agents present in uranium mine air in underground and surface mining areas are described. These methods are designed to prevent overexposure of the lining of the respiratory system of uranium miners to ionizing radiation from radon and radon daughters, and whole-body overexposure to external gamma radiation. A detailed description is provided of stratified random sampling monitoring methodology for obtaining baseline data to be used as a reference for subsequent compliance assessment.
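    The allocation logic behind such a plan can be sketched as proportional-allocation stratified sampling; the stratum names and proportional weighting below are illustrative assumptions, not details from the report.

```python
import random

def stratified_sample(strata, total_n, seed=0):
    """Proportionally allocate total_n sampling occasions across strata
    (e.g. mine areas weighted by number of monitoring locations) and
    draw a simple random sample without replacement within each stratum.
    """
    rng = random.Random(seed)
    population = sum(len(units) for units in strata.values())
    sample = {}
    for name, units in strata.items():
        n_h = round(total_n * len(units) / population)  # proportional share
        sample[name] = rng.sample(units, n_h)
    return sample


# Hypothetical strata: 100 locations in stopes, 50 along haulage ways.
allocation = stratified_sample(
    {"stope": list(range(100)), "haulage": list(range(50))}, total_n=15)
```

    Note that rounding each stratum's share independently can make the totals differ from `total_n` by one or two in unlucky cases; a production plan would reconcile the remainders.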

  8. The quality of the reported sample size calculations in randomized controlled trials indexed in PubMed.

    Science.gov (United States)

    Lee, Paul H; Tse, Andy C Y

    2017-05-01

    There are limited data on the quality of reporting of the information essential for replicating sample size calculations, as well as on the accuracy of those calculations. We examined the current quality of reporting of the sample size calculation in randomized controlled trials (RCTs) published in PubMed, and the variation in reporting across study design, study characteristics, and journal impact factor. We also reviewed the targeted sample sizes reported in trial registries. We reviewed and analyzed all RCTs published in December 2014 in journals indexed in PubMed. The 2014 Impact Factors for the journals were used as proxies for their quality. Of the 451 analyzed papers, 58.1% reported an a priori sample size calculation. Nearly all papers provided the level of significance (97.7%) and desired power (96.6%), and most reported the minimum clinically important effect size (73.3%). The median percentage difference between the reported and recalculated sample sizes was 0.0% (inter-quartile range -4.6% to 3.0%). The accuracy of the reported sample size was better for studies published in journals that endorsed the CONSORT statement and journals with an impact factor. A total of 98 papers provided a targeted sample size in trial registries; about two-thirds of these (n=62) reported a sample size calculation, but only 25 (40.3%) had no discrepancy with the number in the trial registries. The reporting of the sample size calculation in RCTs published in PubMed-indexed journals and trial registries was poor. The CONSORT statement should be more widely endorsed. Copyright © 2016 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.
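    The three quantities the abstract highlights (significance level, power, and minimum clinically important effect) fully determine the textbook sample size for a two-arm comparison of means. A minimal sketch using only the standard library; this is the generic formula, not the recalculation procedure the study used.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(delta, sd, alpha=0.05, power=0.80):
    """Sample size per arm for a two-sided, two-sample comparison of
    means: n = 2 * (z_{1-alpha/2} + z_{power})^2 * (sd/delta)^2,
    rounded up. delta is the minimum clinically important difference.
    """
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)
    z_beta = z.inv_cdf(power)
    return ceil(2 * (z_alpha + z_beta) ** 2 * (sd / delta) ** 2)


# A standardized effect of 0.5 SD at alpha=0.05 and 80% power.
n = n_per_group(delta=0.5, sd=1.0)  # 63 subjects per arm
```

    Refinements such as using the t rather than the normal distribution, or inflating for anticipated dropout, increase n slightly; a reported calculation should state which were applied.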

  9. Randomized controlled trial of attention bias modification in a racially diverse, socially anxious, alcohol dependent sample.

    Science.gov (United States)

    Clerkin, Elise M; Magee, Joshua C; Wells, Tony T; Beard, Courtney; Barnett, Nancy P

    2016-12-01

    Attention biases may be an important treatment target for both alcohol dependence and social anxiety. This is the first ABM trial to investigate two (vs. one) targets of attention bias within a sample with co-occurring symptoms of social anxiety and alcohol dependence. Additionally, we used trial-level bias scores (TL-BS) to capture the phenomena of attention bias in a more ecologically valid, dynamic way compared to traditional attention bias scores. Adult participants (N = 86; 41% Female; 52% African American; 40% White) with elevated social anxiety symptoms and alcohol dependence were randomly assigned to an 8-session training condition in this 2 (Social Anxiety ABM vs. Social Anxiety Control) by 2 (Alcohol ABM vs. Alcohol Control) design. Symptoms of social anxiety, alcohol dependence, and attention bias were assessed across time. Multilevel models estimated the trajectories for each measure within individuals, and tested whether these trajectories differed according to the randomized training conditions. Across time, there were significant or trending decreases in all attention TL-BS parameters (but not traditional attention bias scores) and most symptom measures. However, there were not significant differences in the trajectories of change between any ABM and control conditions for any symptom measures. These findings add to previous evidence questioning the robustness of ABM and point to the need to extend the effects of ABM to samples that are racially diverse and/or have co-occurring psychopathology. The results also illustrate the potential importance of calculating trial-level attention bias scores rather than only including traditional bias scores. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Randomized Controlled Trial of Attention Bias Modification in a Racially Diverse, Socially Anxious, Alcohol Dependent Sample

    Science.gov (United States)

    Clerkin, Elise M.; Magee, Joshua C.; Wells, Tony T.; Beard, Courtney; Barnett, Nancy P.

    2016-01-01

    Objective Attention biases may be an important treatment target for both alcohol dependence and social anxiety. This is the first ABM trial to investigate two (vs. one) targets of attention bias within a sample with co-occurring symptoms of social anxiety and alcohol dependence. Additionally, we used trial-level bias scores (TL-BS) to capture the phenomena of attention bias in a more ecologically valid, dynamic way compared to traditional attention bias scores. Method Adult participants (N=86; 41% Female; 52% African American; 40% White) with elevated social anxiety symptoms and alcohol dependence were randomly assigned to an 8-session training condition in this 2 (Social Anxiety ABM vs. Social Anxiety Control) by 2 (Alcohol ABM vs. Alcohol Control) design. Symptoms of social anxiety, alcohol dependence, and attention bias were assessed across time. Results Multilevel models estimated the trajectories for each measure within individuals, and tested whether these trajectories differed according to the randomized training conditions. Across time, there were significant or trending decreases in all attention TL-BS parameters (but not traditional attention bias scores) and most symptom measures. However, there were not significant differences in the trajectories of change between any ABM and control conditions for any symptom measures. Conclusions These findings add to previous evidence questioning the robustness of ABM and point to the need to extend the effects of ABM to samples that are racially diverse and/or have co-occurring psychopathology. The results also illustrate the potential importance of calculating trial-level attention bias scores rather than only including traditional bias scores. PMID:27591918

  11. Gender differences in substance use patterns and disorders among an Iranian patient sample receiving methadone maintenance treatment.

    Science.gov (United States)

    Ghaderi, Amir; Motmaen, Maryam; Abdi, Iraj; Rasouli-Azad, Morad

    2017-09-01

    Current prevalence rates of substance abuse and dependence represent an increasing trend among women, and the results of epidemiological studies indicate that substance use patterns differ between men and women. This study aimed to determine gender differences in substance use patterns and disorders among patients undergoing methadone maintenance treatment. This cross-sectional study was conducted from September 2012 through March 2013 in Methadone Maintenance Treatment (MMT) clinics of Mashhad, Iran. In this study, 140 men and 120 women were selected from among the patients referring to MMT clinics in Mashhad through purposive sampling. The sample units were assessed using a demographic information questionnaire and the Structured Clinical Interview for the Diagnostic and Statistical Manual of Mental Disorders (SCID). The data were then analyzed by the Chi-square test, Mann-Whitney U test, and independent-samples t-test. SPSS software 16 was used to conduct statistical analyses, with P values less than 0.05 regarded as significant. The results showed that men and women differ significantly in terms of marital status (p=0.001), education (p=0.001), income (p=0.001), history of injection (p=0.002), imprisonment (p=0.001), and substance use abstention (p=0.023). It was also revealed that methamphetamine dependence (p=0.017) and simultaneous use of multiple substances (p=0.001) in the past 12 months were diagnosed more frequently in male than in female participants. In addition, lifetime diagnoses of nicotine dependence (p=0.001), cannabis abuse (p=0.001), heroin dependence (p=0.001), and substance abuse and alcohol dependence (p=0.001) were more frequent in males than in females. There are gender differences in substance use patterns and disorders that appear to be caused by the degree of access to

  12. Influences of sampling effort on detected patterns and structuring processes of a Neotropical plant-hummingbird network.

    Science.gov (United States)

    Vizentin-Bugoni, Jeferson; Maruyama, Pietro K; Debastiani, Vanderlei J; Duarte, L da S; Dalsgaard, Bo; Sazima, Marlies

    2016-01-01

    Virtually all empirical ecological interaction networks to some extent suffer from undersampling. However, how limitations imposed by sampling incompleteness affect our understanding of ecological networks is still poorly explored, which may hinder further advances in the field. Here, we use a plant-hummingbird network with unprecedented sampling effort (2716 h of focal observations) from the Atlantic Rainforest in Brazil, to investigate how sampling effort affects the description of network structure (i.e. widely used network metrics) and the relative importance of distinct processes (i.e. species abundances vs. traits) in determining the frequency of pairwise interactions. By dividing the network into time slices representing a gradient of sampling effort, we show that quantitative metrics, such as interaction evenness, specialization (H2'), weighted nestedness (wNODF) and modularity (Q; QuanBiMo algorithm) were less biased by sampling incompleteness than binary metrics. Furthermore, the significance of some network metrics changed along the sampling effort gradient. Nevertheless, the higher importance of traits in structuring the network was apparent even with small sampling effort. Our results (i) warn against using very poorly sampled networks as this may bias our understanding of networks, both their patterns and structuring processes, (ii) encourage the use of quantitative metrics little influenced by sampling when performing spatio-temporal comparisons and (iii) indicate that in networks strongly constrained by species traits, such as plant-hummingbird networks, even small sampling is sufficient to detect their relative importance for the frequencies of interactions. Finally, we argue that similar effects of sampling are expected for other highly specialized subnetworks. © 2015 The Authors. Journal of Animal Ecology © 2015 British Ecological Society.

  13. Patterns of Dysmorphic Features in Schizophrenia

    Science.gov (United States)

    Scutt, L.E.; Chow, E.W.C.; Weksberg, R.; Honer, W.G.; Bassett, Anne S.

    2011-01-01

    Congenital dysmorphic features are prevalent in schizophrenia and may reflect underlying neurodevelopmental abnormalities. A cluster analysis approach delineating patterns of dysmorphic features has been used in genetics to classify individuals into more etiologically homogeneous subgroups. In the present study, this approach was applied to schizophrenia, using a sample with a suspected genetic syndrome as a testable model. Subjects (n = 159) with schizophrenia or schizoaffective disorder were ascertained from chronic patient populations (random, n = 123) or referred with possible 22q11 deletion syndrome (referred, n = 36). All subjects were evaluated for the presence or absence of 70 reliably assessed dysmorphic features, which were used in a three-step cluster analysis. The analysis produced four major clusters with different patterns of dysmorphic features. Significant between-cluster differences were found in the rates of 37 individual dysmorphic features, in the total number of dysmorphic features (P = 0.0001), and in validating features not used in the cluster analysis: mild mental retardation (P = 0.001) and congenital heart defects (P = 0.002). Two clusters (1 and 4) appeared to represent more developmental subgroups of schizophrenia with elevated rates of dysmorphic features and validating features. Cluster 1 (n = 27) comprised mostly referred subjects. Cluster 4 (n = 18) had a different pattern of dysmorphic features; one subject had a mosaic Turner syndrome variant. Two other clusters had lower rates and patterns of features consistent with those found in previous studies of schizophrenia. Delineating patterns of dysmorphic features may help identify subgroups that could represent neurodevelopmental forms of schizophrenia with more homogeneous origins. PMID:11803519
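    The abstract does not detail the three-step procedure itself; a generic sketch of clustering subjects on binary present/absent feature profiles follows. Jaccard distance and average linkage are assumed choices here, not the authors' method.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

def cluster_by_features(presence_matrix, n_clusters=4):
    """Group subjects by their pattern of dysmorphic features.

    presence_matrix: one row per subject, one column per feature
    (present/absent). Jaccard distance ignores jointly absent features,
    which suits sparse dysmorphology checklists.
    """
    distances = pdist(presence_matrix.astype(bool), metric="jaccard")
    tree = linkage(distances, method="average")
    return fcluster(tree, t=n_clusters, criterion="maxclust")


# Toy data: two obvious subgroups with distinct feature patterns.
profiles = np.array([
    [1, 1, 1, 0, 0, 0],
    [1, 1, 0, 0, 0, 0],
    [1, 1, 1, 1, 0, 0],
    [0, 0, 0, 1, 1, 1],
    [0, 0, 1, 1, 1, 1],
    [0, 0, 0, 1, 1, 0],
])
labels = cluster_by_features(profiles, n_clusters=2)
```

    Choosing the number of clusters is the delicate step; the study's use of external validating features (mental retardation, heart defects) not fed into the clustering is one principled way to check that the resulting subgroups are meaningful.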

  14. Prescribing Patterns in Outpatient Clinics of Township Hospitals in China: A Comparative Study before and after the 2009 Health System Reform.

    Science.gov (United States)

    Ding, Ding; Pan, Qingxia; Shan, Linghan; Liu, Chaojie; Gao, Lijun; Hao, Yanhua; Song, Jian; Ning, Ning; Cui, Yu; Li, Ye; Qi, Xinye; Liang, Chao; Wu, Qunhong; Liu, Guoxiang

    2016-07-05

    China introduced a series of health reforms in 2009, including a national essential medicines policy and a medical insurance system for primary care institutions. This study aimed to determine the changing prescribing patterns associated with those reforms in township hospitals. A multi-stage stratified random cluster sampling method was adopted to identify 29 township hospitals from six counties in three provinces. A total of 2899 prescriptions were collected from the participating township hospitals using a systematic random sampling strategy. Seven prescribing indicators were calculated and compared between 2008 and 2013, assessing use of medicines (antibiotics and adrenal corticosteroids) and polypharmacy, administration route of medicines (injections), and affordability of medicines. Significant changes in prescribing patterns were found. The average number of medicines and costs per-prescription dropped by about 50%. The percentage of prescriptions requiring antibiotics declined from 54% to 38%. The percentage of prescriptions requiring adrenal corticosteroid declined from 14% to 4%. The percentage of prescriptions requiring injections declined from 54% to 25%. Despite similar changing patterns, significant regional differences were observed. Significant changes in prescribing patterns are evident in township hospitals in China. Overprescription of antibiotics, injections and adrenal corticosteroids has been reduced. However, salient regional disparities still exist. Further studies are needed to determine potential shifts in the risk of the inappropriate use of medicines from primary care settings to metropolitan hospitals.
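
    The systematic random sampling the study describes draws every k-th record after a random start within the first interval. A minimal sketch (record IDs are hypothetical):

```python
import random

def systematic_sample(records, n, seed=None):
    # Systematic random sampling: choose a random start in the first
    # interval, then take every k-th record, where k = len(records) // n.
    rng = random.Random(seed)
    k = len(records) // n
    start = rng.randrange(k)
    return [records[start + i * k] for i in range(n)]

prescriptions = list(range(1000))      # hypothetical prescription IDs
sample = systematic_sample(prescriptions, 100, seed=42)
```

    Unlike simple random sampling, the draw is spread evenly across the sampling frame, which is convenient when prescriptions are filed chronologically.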

  15. Sex difference in human fingertip recognition of micron-level randomness as unpleasant.

    Science.gov (United States)

    Nakatani, M; Kawasoe, T; Denda, M

    2011-08-01

    We investigated sex difference in evaluation, using the human fingertip, of the tactile impressions of three different micron-scale patterns laser-engraved on plastic plates. There were two ordered (periodical) patterns consisting of ripples on a scale of a few micrometres and one pseudo-random (non-periodical) pattern; these patterns were considered to mimic the surface geometry of healthy and damaged human hair, respectively. In the first experiment, 10 women and 10 men ran a fingertip over each surface and determined which of the three plates felt most unpleasant. All 10 female participants reported the random pattern, but not the ordered patterns, as unpleasant, whereas the majority of the male participants did not. In the second experiment, 9 of 10 female participants continued to report the pseudo-random pattern as unpleasant even after their fingertip had been coated with a collodion membrane. In the third experiment, participants were asked to evaluate the magnitude of the tactile impression for each pattern. The results again indicated that female participants tend to report a greater magnitude of unpleasantness than male participants. Our findings indicate that the female participants could readily detect microgeometric surface characteristics and that they evaluated the random pattern as more unpleasant. Possible physical and perceptual mechanisms involved are discussed. © 2011 The Authors. ICS © 2011 Society of Cosmetic Scientists and the Société Française de Cosmétologie.

  16. High resolution 4-D spectroscopy with sparse concentric shell sampling and FFT-CLEAN.

    Science.gov (United States)

    Coggins, Brian E; Zhou, Pei

    2008-12-01

    Recent efforts to reduce the measurement time for multidimensional NMR experiments have fostered the development of a variety of new procedures for sampling and data processing. We recently described concentric ring sampling for 3-D NMR experiments, which is superior to radial sampling as input for processing by a multidimensional discrete Fourier transform. Here, we report the extension of this approach to 4-D spectroscopy as Randomized Concentric Shell Sampling (RCSS), where sampling points for the indirect dimensions are positioned on concentric shells, and where random rotations in the angular space are used to avoid coherent artifacts. With simulations, we show that RCSS produces a very low level of artifacts, even with a very limited number of sampling points. The RCSS sampling patterns can be adapted to fine rectangular grids to permit use of the Fast Fourier Transform in data processing, without an apparent increase in the artifact level. These artifacts can be further reduced to the noise level using the iterative CLEAN algorithm developed in radioastronomy. We demonstrate these methods on the high resolution 4-D HCCH-TOCSY spectrum of protein G's B1 domain, using only 1.2% of the sampling that would be needed conventionally for this resolution. The use of a multidimensional FFT instead of the slow DFT for initial data processing and for subsequent CLEAN significantly reduces the calculation time, yielding an artifact level that is on par with the level of the true spectral noise.
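
    As a loose illustration of the shell-sampling idea (not the authors' RCSS construction), the sketch below scatters random points on concentric shells in three indirect dimensions and snaps them to a rectangular grid so a fast Fourier transform could be applied; all sizes are arbitrary.

```python
import numpy as np

def concentric_shell_points(n_shells, pts_per_shell, grid=64, seed=0):
    # Draw uniformly random directions on each shell (randomisation in
    # the angular space breaks the coherent artifacts of regular
    # spacing), then snap every point to the nearest rectangular-grid node.
    rng = np.random.default_rng(seed)
    pts = []
    for s in range(1, n_shells + 1):
        r = s / n_shells                     # shell radius in [0, 1]
        v = rng.normal(size=(pts_per_shell, 3))
        v /= np.linalg.norm(v, axis=1, keepdims=True)
        pts.append(r * v)
    pts = np.vstack(pts)
    idx = np.rint((pts + 1) / 2 * (grid - 1)).astype(int)
    return np.unique(idx, axis=0)            # grid indices in [0, grid-1]^3

samples = concentric_shell_points(n_shells=8, pts_per_shell=50)
```

    Snapping to the grid is what permits FFT processing in place of the slow DFT, at the cost of small positional rounding.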

  17. Color Charts, Esthetics, and Subjective Randomness

    Science.gov (United States)

    Sanderson, Yasmine B.

    2012-01-01

    Color charts, or grids of evenly spaced multicolored dots or squares, appear in the work of modern artists and designers. Often the artist/designer distributes the many colors in a way that could be described as "random," that is, without an obvious pattern. We conduct a statistical analysis of 125 "random-looking" art and design color charts and…

  18. Spatial Sampling of Weather Data for Regional Crop Yield Simulations

    Science.gov (United States)

    Van Bussel, Lenny G. J.; Ewert, Frank; Zhao, Gang; Hoffmann, Holger; Enders, Andreas; Wallach, Daniel; Asseng, Senthold; Baigorria, Guillermo A.; Basso, Bruno; Biernath, Christian; hide

    2016-01-01

    Field-scale crop models are increasingly applied at spatio-temporal scales that range from regions to the globe and from decades up to 100 years. Sufficiently detailed data to capture the prevailing spatio-temporal heterogeneity in weather, soil, and management conditions as needed by crop models are rarely available. Effective sampling may overcome the problem of missing data but has rarely been investigated. In this study the effect of sampling weather data has been evaluated for simulating yields of winter wheat in a region in Germany over a 30-year period (1982-2011) using 12 process-based crop models. A stratified sampling was applied to compare the effect of different sizes of spatially sampled weather data (10, 30, 50, 100, 500, 1000 and full coverage of 34,078 sampling points) on simulated wheat yields. Stratified sampling was further compared with random sampling. Possible interactions between sample size and crop model were evaluated. The results showed differences in simulated yields among crop models but all models reproduced well the pattern of the stratification. Importantly, the regional mean of simulated yields based on full coverage could already be reproduced by a small sample of 10 points. This was also true for reproducing the temporal variability in simulated yields but more sampling points (about 100) were required to accurately reproduce spatial yield variability. The number of sampling points can be smaller when a stratified sampling is applied as compared to a random sampling. However, differences between crop models were observed including some interaction between the effect of sampling on simulated yields and the model used. We concluded that stratified sampling can considerably reduce the number of required simulations. But, differences between crop models must be considered as the choice for a specific model can have larger effects on simulated yields than the sampling strategy. 
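
    The stratified-versus-random comparison above can be sketched as follows; the strata and "yields" here are synthetic stand-ins for the weather-driven yield strata of the study.

```python
import numpy as np

def stratified_sample(values, strata, n, rng):
    # Spread n draws evenly across strata so every stratum is
    # represented, instead of sampling the pooled points at random.
    out = []
    labels = np.unique(strata)
    per = max(1, n // len(labels))
    for s in labels:
        pool = values[strata == s]
        out.append(rng.choice(pool, size=min(per, len(pool)), replace=False))
    return np.concatenate(out)

rng = np.random.default_rng(1)
strata = np.repeat([0, 1, 2, 3], 2500)            # four synthetic strata
yields = rng.normal(loc=2.0 * strata, scale=0.5)  # strong stratum effect
full_mean = yields.mean()
strat_mean = stratified_sample(yields, strata, 40, rng).mean()
```

    With strong between-stratum differences, a balanced 40-point stratified sample already tracks the full-coverage mean closely, mirroring the study's finding that small stratified samples reproduce the regional mean.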

  19. Perfectly Secure Oblivious RAM without Random Oracles

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Meldgaard, Sigurd Torkel; Nielsen, Jesper Buus

    2011-01-01

    We present an algorithm for implementing a secure oblivious RAM where the access pattern is perfectly hidden in the information theoretic sense, without assuming that the CPU has access to a random oracle. In addition we prove a lower bound on the amount of randomness needed for implementing...

  20. Effects of Spatial Distribution of Trees on Density Estimation by Nearest Individual Sampling Method: Case Studies in Zagros Wild Pistachio Woodlands and Simulated Stands

    Directory of Open Access Journals (Sweden)

    Y. Erfanifard

    2014-06-01

    Distance methods and their estimators of density may give biased measurements unless the studied stand of trees has a random spatial pattern. This study aimed at assessing the effect of the spatial arrangement of wild pistachio trees on the results of density estimation by the nearest individual sampling method in the Zagros woodlands, Iran, and at applying a correction factor based on the spatial pattern of the trees. A 45 ha clumped stand of wild pistachio trees was selected in the Zagros woodlands, and two stands with similar density and area, one random and one dispersed, were simulated. Distances from the nearest individual and neighbour at 40 sample points in a 100 × 100 m grid were measured in the three stands. The results showed that the nearest individual method with the Batcheler estimator could not calculate density correctly in all stands. However, when the correction factor based on the spatial pattern of the trees was applied, density was estimated with no significant difference from the real density of the stands. This study showed that considering the spatial arrangement of trees can improve the results of the nearest individual method with the Batcheler estimator in density measurement.
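
    For a completely random (Poisson) pattern, the expected distance from a sample point to the nearest individual is 1/(2*sqrt(density)), which gives the classical closest-individual estimator sketched below. This is a generic stand-in, not the Batcheler estimator used in the study; its bias on clumped or dispersed patterns is exactly what the correction factor addresses.

```python
import numpy as np

def nearest_individual_density(sample_points, trees):
    # Distance from each sample point to its nearest tree; under
    # complete spatial randomness E[r] = 1 / (2 * sqrt(density)),
    # so density is estimated as 1 / (2 * mean r)^2.
    d = np.sqrt(((sample_points[:, None, :] - trees[None, :, :]) ** 2).sum(-1))
    r = d.min(axis=1)
    return 1.0 / (2.0 * r.mean()) ** 2

rng = np.random.default_rng(0)
trees = rng.uniform(0, 100, size=(400, 2))   # true density = 0.04 trees/unit^2
xy = np.arange(5.0, 100.0, 10.0)             # 10 x 10 grid of sample points
grid = np.stack(np.meshgrid(xy, xy), axis=-1).reshape(-1, 2)
estimate = nearest_individual_density(grid, trees)
```

    On a clumped stand the same estimator would systematically undershoot, because sample points falling between clumps inflate the mean nearest-individual distance.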

  1. Colonic stem cell data are consistent with the immortal model of stem cell division under non-random strand segregation.

    Science.gov (United States)

    Walters, K

    2009-06-01

    Colonic stem cells are thought to reside towards the base of crypts of the colon, but their numbers and proliferation mechanisms are not well characterized. A defining property of stem cells is that they are able to divide asymmetrically, but it is not known whether they always divide asymmetrically (immortal model) or whether there are occasional symmetrical divisions (stochastic model). By measuring diversity of methylation patterns in colon crypt samples, a recent study found evidence in favour of the stochastic model, assuming random segregation of stem cell DNA strands during cell division. Here, the effect of preferential segregation of the template strand is considered to be consistent with the 'immortal strand hypothesis', and explore the effect on conclusions of previously published results. For a sample of crypts, it is shown how, under the immortal model, to calculate mean and variance of the number of unique methylation patterns allowing for non-random strand segregation and compare them with those observed. The calculated mean and variance are consistent with an immortal model that incorporates non-random strand segregation for a range of stem cell numbers and levels of preferential strand segregation. Allowing for preferential strand segregation considerably alters previously published conclusions relating to stem cell numbers and turnover mechanisms. Evidence in favour of the stochastic model may not be as strong as previously thought.

  2. Revisiting sample size: are big trials the answer?

    Science.gov (United States)

    Lurati Buse, Giovanna A L; Botto, Fernando; Devereaux, P J

    2012-07-18

    The superiority of the evidence generated in randomized controlled trials over observational data is not only conditional to randomization. Randomized controlled trials require proper design and implementation to provide a reliable effect estimate. Adequate random sequence generation, allocation implementation, analyses based on the intention-to-treat principle, and sufficient power are crucial to the quality of a randomized controlled trial. Power, or the probability of the trial to detect a difference when a real difference between treatments exists, strongly depends on sample size. The quality of orthopaedic randomized controlled trials is frequently threatened by a limited sample size. This paper reviews basic concepts and pitfalls in sample-size estimation and focuses on the importance of large trials in the generation of valid evidence.
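
    For illustration, the standard normal-approximation formula for per-group sample size when comparing two proportions (the rates chosen here are hypothetical):

```python
from math import sqrt
from statistics import NormalDist

def n_per_group_two_proportions(p1, p2, alpha=0.05, power=0.80):
    # Two-sided z-test for two proportions: n per group grows with the
    # required power and shrinks with the squared effect size (p1 - p2).
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    num = (z_a * sqrt(2 * p_bar * (1 - p_bar))
           + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return num / (p1 - p2) ** 2

# Detecting a 10% vs 15% event rate at 80% power needs roughly 700
# patients per arm, which is why small trials are so often underpowered.
n = n_per_group_two_proportions(0.10, 0.15)
```

    Halving the detectable difference roughly quadruples the required sample size, the core point behind the paper's case for large trials.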

  3. Adipose tissue fatty acid patterns and changes in anthropometry

    DEFF Research Database (Denmark)

    Dahm, Christina Catherine; Gorst-Rasmussen, Anders; Jakobsen, Marianne Uhre

    2011-01-01

    Introduction Diets rich in n-3 long chain polyunsaturated fatty acids (LC-PUFA), but low in n-6 LC-PUFA and 18:1 trans-fatty acids (TFA), may lower the risk of overweight and obesity. These fatty acids have often been investigated individually. We explored associations between global patterns in adipose tissue fatty acids and changes in anthropometry. Methods 34 fatty acid species from adipose tissue biopsies were determined in a random sample of 1100 men and women from a Danish cohort study. We used sex-specific principal component analysis and multiple linear regression to investigate the associations of adipose tissue fatty acid patterns with changes in weight, waist circumference (WC), and WC controlled for changes in body mass index (WCBMI), adjusting for confounders. Results 7 principal components were extracted for each sex, explaining 77.6% and 78.3% of fatty acid variation in men...
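
    Principal component analysis of a fatty acid matrix, as used in the study, can be sketched with an SVD; the data here are synthetic (one planted latent pattern across 34 "fatty acids"), not the cohort's measurements.

```python
import numpy as np

def principal_components(X, k):
    # PCA via SVD: centre the columns, factor, and return the k leading
    # component scores plus the fraction of variance each explains.
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = s ** 2 / (s ** 2).sum()
    return Xc @ Vt[:k].T, explained[:k]

rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 1))      # one latent dietary pattern
loadings = rng.normal(size=(1, 34))     # its weights on 34 fatty acids
X = latent @ loadings + 0.1 * rng.normal(size=(100, 34))
scores, explained = principal_components(X, 3)
```

    The component scores would then enter a regression against anthropometric change, with sexes analysed separately as in the study.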

  4. Distribution of peak expiratory flow variability by age, gender and smoking habits in a random population sample aged 20-70 yrs

    NARCIS (Netherlands)

    Boezen, H M; Schouten, J. P.; Postma, D S; Rijcken, B

    1994-01-01

    Peak expiratory flow (PEF) variability can be considered as an index of bronchial lability. Population studies on PEF variability are few. The purpose of the current paper is to describe the distribution of PEF variability in a random population sample of adults with a wide age range (20-70 yrs),

  5. Long-term outcomes and recurrence patterns of standard versus extended pancreatectomy for pancreatic head cancer: a multicenter prospective randomized controlled study.

    Science.gov (United States)

    Jang, Jin-Young; Kang, Jae Seung; Han, Youngmin; Heo, Jin Seok; Choi, Seong Ho; Choi, Dong Wook; Park, Sang Jae; Han, Sung-Sik; Yoon, Dong Sup; Park, Joon Seong; Yu, Hee Chul; Kang, Koo Jeong; Kim, Sang Geol; Lee, Hongeun; Kwon, Wooil; Yoon, Yoo-Seok; Han, Ho-Seong; Kim, Sun-Whe

    2017-07-01

    Our previous randomized controlled trial revealed no difference in 2-year overall survival (OS) between extended and standard resection for pancreatic adenocarcinoma. The present study evaluated the 5-year OS and recurrence patterns according to the extent of pancreatectomy. Between 2006 and 2009, 169 consecutive patients were prospectively enrolled and randomized to standard (n = 83) or extended resection (n = 86) groups to compare 5-year OS rate, long-term recurrence patterns and factors associated with long-term survival. The surgical R0 rate was similar between the standard and extended groups (85.5 vs. 90.7%, P = 0.300). Five-year OS (18.4 vs. 14.4%, P = 0.388), 5-year disease-free survival (14.8 vs. 14.0%, P = 0.531), and overall recurrence rates (74.7 vs. 69.9%, P = 0.497) were not significantly different between the two groups, although the incidence of peritoneal seeding was higher in the extended group (25 vs. 8.1%, P = 0.014). Extended pancreatectomy does not have better short-term and long-term survival outcomes, and shows similar R0 rates and overall recurrence rates compared with standard pancreatectomy. Extended pancreatectomy does not have to be performed routinely for all cases of resectable pancreatic adenocarcinoma, especially considering its associated increased morbidity shown in our previous study. © 2017 Japanese Society of Hepato-Biliary-Pancreatic Surgery.

  6. Does diet-beverage intake affect dietary consumption patterns? Results from the Choose Healthy Options Consciously Everyday (CHOICE) randomized clinical trial123

    Science.gov (United States)

    Piernas, Carmen; Tate, Deborah F; Wang, Xiaoshan

    2013-01-01

    Background: Little is understood about the effect of increased consumption of low-calorie sweeteners in diet beverages on dietary patterns and energy intake. Objective: We investigated whether energy intakes and dietary patterns were different in subjects who were randomly assigned to substitute caloric beverages with either water or diet beverages (DBs). Design: Participants from the Choose Healthy Options Consciously Everyday randomized clinical trial (a 6-mo, 3-arm study) were included in the analysis [water group: n = 106 (94% women); DB group: n = 104 (82% women)]. For energy, macronutrient, and food and beverage intakes, we investigated the main effects of time, treatment, and the treatment-by-time interaction by using mixed models. Results: Overall, the macronutrient composition changed in both groups without significant differences between groups over time. Both groups reduced absolute intakes of total daily energy, carbohydrates, fat, protein, saturated fat, total sugar, added sugar, and other carbohydrates. The DB group decreased energy from all beverages more than the water group did only at month 3 (P-group-by-time < 0.05) and decreased dessert intake more than the water group did at month 6 (P-group-by-time < 0.05). Conclusions: Both groups reduced intakes of caloric beverages, and the DB group specifically reduced more desserts than the water group did. Our study does not provide evidence to suggest that a short-term consumption of DBs, compared with water, increases preferences for sweet foods and beverages. This trial was registered at clinicaltrials.gov as NCT01017783. PMID:23364015

  7. [Variables related to the emergence of differential patterns in work motivation].

    Science.gov (United States)

    Arrieta, Carlos; Navarro, José; Vicente, Susana

    2008-11-01

    Several longitudinal studies have shown that motivation at work acts chaotically. In very few cases, it may be linear or random. However, the factors that might explain why these different patterns emerge have not been analysed to date. In this exploratory study, we interviewed 73 employees whose motivational patterns were previously known. The results revealed that chaotic patterns were associated with high levels of motivation, self-efficacy beliefs, and perceptions of instrumentality, and also with intrinsic personal goal orientation and a perception of high work control. Linear patterns were associated with extrinsic goals and a perception of work as difficult, and random patterns were linked to high flexibility at work.

  8. The Effects of 6 Isocaloric Meals Pattern on Blood Lipid Profile, Glucose, Hemoglobin A1c, Insulin and Malondialdehyde in Type 2 Diabetic Patients: A Randomized Clinical Trial

    OpenAIRE

    Moosa Salehi; Asma Kazemi; Jafar Hasan Zadeh

    2014-01-01

    Background: The present clinical trial investigates the effect of consuming the daily energy intake in 6 isocaloric meals, in comparison with the current meal pattern (3 meals and 2 small snacks per day), on type 2 diabetes risk markers over a 3-month period. Methods: Eighty-four type 2 diabetes patients were randomly assigned to the 6 isocaloric meal diet or a balanced diet (3 meals and 2 snacks, the previous meal pattern). The planned reduced calorie diets for both groups were identi...

  9. Dynamic defense and network randomization for computer systems

    Science.gov (United States)

    Chavez, Adrian R.; Stout, William M. S.; Hamlet, Jason R.; Lee, Erik James; Martin, Mitchell Tyler

    2018-05-29

    The various technologies presented herein relate to determining a network attack is taking place, and further to adjust one or more network parameters such that the network becomes dynamically configured. A plurality of machine learning algorithms are configured to recognize an active attack pattern. Notification of the attack can be generated, and knowledge gained from the detected attack pattern can be utilized to improve the knowledge of the algorithms to detect a subsequent attack vector(s). Further, network settings and application communications can be dynamically randomized, wherein artificial diversity converts control systems into moving targets that help mitigate the early reconnaissance stages of an attack. An attack(s) based upon a known static address(es) of a critical infrastructure network device(s) can be mitigated by the dynamic randomization. Network parameters that can be randomized include IP addresses, application port numbers, paths data packets navigate through the network, application randomization, etc.
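
    One of the parameters mentioned, application port numbers, can be shuffled in a few lines. This is a toy moving-target sketch with invented service names, not the patented system's mechanism.

```python
import random

def randomize_ports(services, low=20000, high=60000, seed=None):
    # Assign each service a fresh, unique, randomly drawn port so a
    # port map gathered during earlier reconnaissance goes stale.
    rng = random.Random(seed)
    ports = rng.sample(range(low, high), len(services))
    return dict(zip(services, ports))

mapping = randomize_ports(["scada-ctrl", "historian", "hmi"], seed=7)
```

    In a real deployment the new mapping would have to be distributed to legitimate peers out of band, which is the hard part such systems address.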

  10. Dynamical implications of sample shape for avalanches in 2-dimensional random-field Ising model with saw-tooth domain wall

    Science.gov (United States)

    Tadić, Bosiljka

    2018-03-01

    We study the dynamics of a built-in domain wall (DW) in 2-dimensional disordered ferromagnets with different sample shapes using the random-field Ising model on a square lattice rotated by 45 degrees. The saw-tooth DW of length Lx is created along one side and swept through the sample by slow ramping of the external field until the complete magnetisation reversal and the wall annihilation at the open top boundary at a distance Ly. By fixing the number of spins N = Lx × Ly = 10^6 and the random-field distribution at a value above the critical disorder, we vary the ratio of the DW length to the annihilation distance in the range Lx/Ly ∈ [1/16, 16]. Periodic boundary conditions are applied in the y-direction, so that these ratios comprise different samples, i.e., surfaces of cylinders with changing perimeter Lx and height Ly. We analyse the avalanches of the DW slips between consecutive field updates, and the multifractal structure of the magnetisation fluctuation time series. Our main finding is that the domain-wall lengths materialised in different sample shapes have an impact on the dynamics at all scales. Moreover, the domain-wall motion at the beginning of the hysteresis loop (HLB) probes the disorder effects, resulting in fluctuations that are significantly different from the large avalanches in the central part of the loop (HLC), where the strong fields dominate. Specifically, the fluctuations in the HLB exhibit a wide multifractal spectrum, which shifts towards higher values of the exponents when the DW length is reduced. The distributions of the avalanches in this segment of the loop obey power-law decay with exponential cutoffs, with exponents firmly in the mean-field universality class for long DWs. In contrast, the avalanches in the HLC obey a Tsallis density distribution with power-law tails, which indicates new categories of scale-invariant behaviour for different ratios Lx/Ly. The large fluctuations in the HLC, on the other

  11. Identification of dietary patterns of adolescents attending public schools

    Directory of Open Access Journals (Sweden)

    Lucinéia de Pinho

    2014-06-01

    OBJECTIVE: to identify the dietary patterns of adolescents attending public municipal schools in Northern Minas Gerais, Brazil, and to test the association between these patterns and socioeconomic variables and nutritional status of the adolescents. METHODS: this was an analytical, cross-sectional study with a randomized sample of 474 adolescents of both genders, between 11 and 17 years of age, attending municipal public schools in the urban area of Montes Claros, MG, Brazil. The parents provided demographic and economic data. The nutritional status (body mass index, BMI) of the adolescents was determined at school, and their dietary habits were assessed through the administration of the Food Frequency Questionnaire for Adolescents (FFQA). Based on 26 categories extracted from the FFQA, dietary patterns were determined using principal component analysis (PCA) and associated with anthropometric and socioeconomic factors using multiple regression analysis. RESULTS: the three dietary patterns identified, "junk food," "healthy," and "traditional," explained 23.26%, 6.90%, and 5.24% of data variability, respectively. Adolescents with per capita family income exceeding half a minimum wage were more likely to consume the "junk food" pattern (OR = 1.66; 95% CI = 1.07-2.56), and overweight adolescents had lower chances of eating the "healthy" food pattern (OR = 0.56; 95% CI = 0.35-0.91). CONCLUSIONS: deviations from the "healthy" patterns were not associated with low income, but rather with bad eating habits in the studied population. Overweight adolescents did not adhere to the "healthy" dietary pattern, emphasizing the need for nutritional education among them.

  12. Simultaneous escaping of explicit and hidden free energy barriers: application of the orthogonal space random walk strategy in generalized ensemble based conformational sampling.

    Science.gov (United States)

    Zheng, Lianqing; Chen, Mengen; Yang, Wei

    2009-06-21

    To overcome the pseudoergodicity problem, conformational sampling can be accelerated via generalized ensemble methods, e.g., through the realization of random walks along prechosen collective variables, such as spatial order parameters, energy scaling parameters, or even system temperatures or pressures, etc. As usually observed, in generalized ensemble simulations, hidden barriers are likely to exist in the space perpendicular to the collective variable direction and these residual free energy barriers could greatly abolish the sampling efficiency. This sampling issue is particularly severe when the collective variable is defined in a low-dimension subset of the target system; then the "Hamiltonian lagging" problem, which reveals the fact that necessary structural relaxation falls behind the move of the collective variable, may be likely to occur. To overcome this problem in equilibrium conformational sampling, we adopted the orthogonal space random walk (OSRW) strategy, which was originally developed in the context of free energy simulation [L. Zheng, M. Chen, and W. Yang, Proc. Natl. Acad. Sci. U.S.A. 105, 20227 (2008)]. Thereby, generalized ensemble simulations can simultaneously escape both the explicit barriers along the collective variable direction and the hidden barriers that are strongly coupled with the collective variable move. As demonstrated in our model studies, the present OSRW based generalized ensemble treatments show improved sampling capability over the corresponding classical generalized ensemble treatments.

  13. Development of a versatile sample preparation method and its application for rare-earth pattern and Nd isotope ratio analysis in nuclear forensics

    International Nuclear Information System (INIS)

    Krajko, J.

    2015-01-01

    An improved sample preparation procedure for trace levels of lanthanides in uranium-bearing samples was developed. The method involves a simple co-precipitation using an Fe(III) carrier in ammonium carbonate medium to remove the uranium matrix. The procedure is an effective initial pre-concentration step for the subsequent extraction chromatographic separations. The applicability of the method was demonstrated by the measurement of the REE pattern and the 143Nd/144Nd isotope ratio in uranium ore concentrate samples. (author)

  14. Comparison of two Nd:YAG laser posterior capsulotomy: cruciate pattern vs circular pattern with vitreous strand cutting.

    Science.gov (United States)

    Kim, Jin-Soo; Choi, Jung Yeol; Kwon, Ji-Won; Wee, Won Ryang; Han, Young Keun

    2018-01-01

    To investigate the effects and safety of neodymium: yttrium-aluminium-garnet (Nd:YAG) laser posterior capsulotomy with vitreous strand cutting. A total of 40 eyes of 37 patients with symptomatic posterior capsular opacity (PCO) were included in this prospective randomized study and were randomly subjected to either cruciate pattern or round pattern Nd:YAG posterior capsulotomy with vitreous strand cutting (modified round pattern). The best corrected visual acuity (BCVA), intraocular pressure (IOP), refractive error, endothelial cell count (ECC), anterior segment parameters, including anterior chamber depth (ACD) and anterior chamber angle (ACA) were measured before and 1mo after the laser posterior capsulotomy. In both groups, the BCVA improved significantly ( P <0.001 for the modified round pattern group, P =0.001 for the cruciate pattern group); the IOP and ECC did not significantly change. The ACD significantly decreased ( P <0.001 for both) and the ACA significantly increased ( P =0.001 for the modified round pattern group and P =0.034 for the cruciate group). The extent of changes in these parameters was not significantly different between the groups. Modified round pattern Nd:YAG laser posterior capsulotomy is an effective and safe method for the treatment of PCO. This method significantly changes the ACD and ACA, but the change in refraction is not significant. Modified round pattern Nd:YAG laser posterior capsulotomy can be considered a good alternative procedure in patients with symptomatic PCO.

  15. Comparison of two Nd:YAG laser posterior capsulotomy: cruciate pattern vs circular pattern with vitreous strand cutting

    Directory of Open Access Journals (Sweden)

    Jin-Soo Kim

    2018-02-01

    AIM: To investigate the effects and safety of neodymium:yttrium-aluminium-garnet (Nd:YAG) laser posterior capsulotomy with vitreous strand cutting. METHODS: A total of 40 eyes of 37 patients with symptomatic posterior capsular opacity (PCO) were included in this prospective randomized study and were randomly subjected to either cruciate pattern or round pattern Nd:YAG posterior capsulotomy with vitreous strand cutting (modified round pattern). The best corrected visual acuity (BCVA), intraocular pressure (IOP), refractive error, endothelial cell count (ECC), and anterior segment parameters, including anterior chamber depth (ACD) and anterior chamber angle (ACA), were measured before and 1 month after the laser posterior capsulotomy. RESULTS: In both groups, the BCVA improved significantly (P<0.001 for the modified round pattern group, P=0.001 for the cruciate pattern group); the IOP and ECC did not significantly change. The ACD significantly decreased (P<0.001 for both) and the ACA significantly increased (P=0.001 for the modified round pattern group and P=0.034 for the cruciate group). The extent of changes in these parameters was not significantly different between the groups. CONCLUSION: Modified round pattern Nd:YAG laser posterior capsulotomy is an effective and safe method for the treatment of PCO. This method significantly changes the ACD and ACA, but the change in refraction is not significant. Modified round pattern Nd:YAG laser posterior capsulotomy can be considered a good alternative procedure in patients with symptomatic PCO.

  16. Compressive sampling of polynomial chaos expansions: Convergence analysis and sampling strategies

    International Nuclear Information System (INIS)

    Hampton, Jerrad; Doostan, Alireza

    2015-01-01

    Sampling orthogonal polynomial bases via Monte Carlo is of interest for uncertainty quantification of models with random inputs, using Polynomial Chaos (PC) expansions. It is known that bounding a probabilistic parameter, referred to as coherence, yields a bound on the number of samples necessary to identify coefficients in a sparse PC expansion via solution to an ℓ1-minimization problem. Utilizing results for orthogonal polynomials, we bound the coherence parameter for polynomials of Hermite and Legendre type under their respective natural sampling distribution. In both polynomial bases we identify an importance sampling distribution which yields a bound with weaker dependence on the order of the approximation. For more general orthonormal bases, we propose the coherence-optimal sampling: a Markov Chain Monte Carlo sampling, which directly uses the basis functions under consideration to achieve a statistical optimality among all sampling schemes with identical support. We demonstrate these different sampling strategies numerically in both high-order and high-dimensional, manufactured PC expansions. In addition, the quality of each sampling method is compared in the identification of solutions to two differential equations, one with a high-dimensional random input and the other with a high-order PC expansion. In both cases, the coherence-optimal sampling scheme leads to similar or considerably improved accuracy.
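
    The sparse-recovery step can be illustrated with a small Legendre system solved by iterative soft-thresholding (ISTA). This toy uses the natural uniform sampling rather than the coherence-optimal MCMC scheme the abstract proposes, and all sizes and coefficients are invented.

```python
import numpy as np

def ista(A, y, lam=1e-3, iters=5000):
    # Iterative soft-thresholding for min 0.5*||A c - y||^2 + lam*||c||_1,
    # a simple stand-in for the l1-minimization solver.
    L = np.linalg.norm(A, 2) ** 2     # Lipschitz constant of the gradient
    c = np.zeros(A.shape[1])
    for _ in range(iters):
        c = c - A.T @ (A @ c - y) / L
        c = np.sign(c) * np.maximum(np.abs(c) - lam / L, 0.0)
    return c

rng = np.random.default_rng(0)
n, p = 80, 20                         # 80 samples, Legendre up to degree 19
x = rng.uniform(-1, 1, n)             # natural sampling for Legendre
A = np.polynomial.legendre.legvander(x, p - 1)
c_true = np.zeros(p)
c_true[[2, 7, 15]] = [1.0, -0.5, 0.8]  # sparse "PC expansion" coefficients
y = A @ c_true
c_hat = ista(A, y)
```

    The coherence bounds discussed in the abstract govern how few rows such a system can have while recovery of the sparse coefficients still succeeds.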

  17. Compressive sampling of polynomial chaos expansions: Convergence analysis and sampling strategies

    Science.gov (United States)

    Hampton, Jerrad; Doostan, Alireza

    2015-01-01

    Sampling orthogonal polynomial bases via Monte Carlo is of interest for uncertainty quantification of models with random inputs, using Polynomial Chaos (PC) expansions. It is known that bounding a probabilistic parameter, referred to as coherence, yields a bound on the number of samples necessary to identify coefficients in a sparse PC expansion via solution to an ℓ1-minimization problem. Utilizing results for orthogonal polynomials, we bound the coherence parameter for polynomials of Hermite and Legendre type under their respective natural sampling distribution. In both polynomial bases we identify an importance sampling distribution which yields a bound with weaker dependence on the order of the approximation. For more general orthonormal bases, we propose the coherence-optimal sampling: a Markov Chain Monte Carlo sampling, which directly uses the basis functions under consideration to achieve a statistical optimality among all sampling schemes with identical support. We demonstrate these different sampling strategies numerically in both high-order and high-dimensional, manufactured PC expansions. In addition, the quality of each sampling method is compared in the identification of solutions to two differential equations, one with a high-dimensional random input and the other with a high-order PC expansion. In both cases, the coherence-optimal sampling scheme leads to similar or considerably improved accuracy.

  18. 3-Way pattern-recognition of PAHs from Galicia (NW Spain) seawater samples after the Prestige's wreck

    International Nuclear Information System (INIS)

    Grueiro-Noche, G.; Andrade, J.M.; Muniategui-Lorenzo, S.; Lopez-Mahia, P.; Prada-Rodriguez, D.

    2010-01-01

    In November 2002 the oil tanker 'Prestige' released 65 000 tons of a heavy fuel oil throughout the Galician coastline (NW Spain), causing extensive damage to marine life, natural resources and economic activities in Northern Portugal, Spain and SW France. To evaluate the impact of the oil spill on the aquatic system, 30 polycyclic aromatic hydrocarbons (PAHs), including alkylated derivatives, were analyzed in seawater on five different sampling campaigns from 2002 to 2004. Sampling was carried out along the Galician continental shelf. In each station three samples were collected at three different depths (surface, mid-depth and bottom). Four different approaches for 3-way analyses (Catenated-PCA, Matrix-Augmented Principal Components Analysis, Parallel Factor Analysis and Procrustes rotation) have been used to assess the major sources of PAHs into the seawater. They revealed two main pollution patterns: one related to oil spillages and discharge of petroleum products, and another more associated with a diffuse anthropogenic origin. - Oil- and anthropogenic-related sources of PAHs were found in a survey program in Galicia after the Prestige wreck, the latter being more relevant as time went by.

  19. Soil map disaggregation improved by soil-landscape relationships, area-proportional sampling and random forest implementation

    DEFF Research Database (Denmark)

    Møller, Anders Bjørn; Malone, Brendan P.; Odgers, Nathan

    Detailed soil information is often needed to support agricultural practices, environmental protection and policy decisions. Several digital approaches can be used to map soil properties based on field observations. When soil observations are sparse or missing, an alternative approach is to disaggregate existing conventional soil maps. At present, the DSMART algorithm represents the most sophisticated approach for disaggregating conventional soil maps (Odgers et al., 2014). The algorithm relies on classification trees trained from resampled points, which are assigned classes according… …implementation generally improved the algorithm’s ability to predict the correct soil class. The implementation of soil-landscape relationships and area-proportional sampling generally increased the calculation time, while the random forest implementation reduced the calculation time. In the most successful…

  20. Comparison of Address-based Sampling and Random-digit Dialing Methods for Recruiting Young Men as Controls in a Case-Control Study of Testicular Cancer Susceptibility

    OpenAIRE

    Clagett, Bartholt; Nathanson, Katherine L.; Ciosek, Stephanie L.; McDermoth, Monique; Vaughn, David J.; Mitra, Nandita; Weiss, Andrew; Martonik, Rachel; Kanetsky, Peter A.

    2013-01-01

    Random-digit dialing (RDD) using landline telephone numbers is the historical gold standard for control recruitment in population-based epidemiologic research. However, increasing cell-phone usage and diminishing response rates suggest that the effectiveness of RDD in recruiting a random sample of the general population, particularly for younger target populations, is decreasing. In this study, we compared landline RDD with alternative methods of control recruitment, including RDD using cell-...

  1. Detecting Beer Intake by Unique Metabolite Patterns.

    Science.gov (United States)

    Gürdeniz, Gözde; Jensen, Morten Georg; Meier, Sebastian; Bech, Lene; Lund, Erik; Dragsted, Lars Ove

    2016-12-02

    Evaluation of the health-related effects of beer intake is hampered by the lack of accurate tools for assessing intakes (biomarkers). Therefore, we identified plasma and urine metabolites associated with recent beer intake by untargeted metabolomics and established a characteristic metabolite pattern representing raw materials and beer production as a qualitative biomarker of beer intake. In a randomized, crossover, single-blinded meal study (MSt1), 18 participants were given, one at a time, four different test beverages: strong, regular, and nonalcoholic beers and a soft drink. Four participants were assigned to have two additional beers (MSt2). In addition to plasma and urine samples, test beverages, wort, and hops extract were analyzed by UPLC-QTOF. A unique metabolite pattern reflecting the beer metabolome, including metabolites derived from beer raw material (i.e., N-methyl tyramine sulfate and the sum of iso-α-acids and tricyclohumols) and the production process (i.e., pyro-glutamyl proline and 2-ethyl malate), was selected to establish a compliance biomarker model for detection of beer intake based on MSt1. The model predicted the MSt2 samples collected before and up to 12 h after beer intake correctly (AUC = 1). A biomarker model including four metabolites representing both beer raw materials and production steps provided a specific and accurate tool for measurement of beer consumption.

  2. Patterns of drug dependence in a Queensland (Australia) sample of Indigenous and non-Indigenous people who inject drugs.

    Science.gov (United States)

    Smirnov, Andrew; Kemp, Robert; Ward, James; Henderson, Suzanna; Williams, Sidney; Dev, Abhilash; Najman, Jake M

    2016-09-01

    Despite over-representation of Indigenous Australians in sentinel studies of injecting drug use, little is known about relevant patterns of drug use and dependence. This study compares drug dependence and possible contributing factors in Indigenous and non-Indigenous Australians who inject drugs. Respondent-driven sampling was used in major cities and 'peer recruitment' in regional towns of Queensland to obtain a community sample of Indigenous (n = 282) and non-Indigenous (n = 267) injectors. Data are cross sectional. Multinomial models were developed for each group to examine types of dependence on injected drugs (no dependence, methamphetamine-dependent only, opioid-dependent only, dependent on methamphetamine and opioids). Around one-fifth of Indigenous and non-Indigenous injectors were dependent on both methamphetamine and opioids in the previous 12 months. Psychological distress was associated with dual dependence on these drugs for Indigenous [adjusted relative risk (ARR) 4.86, 95% confidence interval (CI) 2.08-11.34] and non-Indigenous (ARR 4.14, 95% CI 1.59-10.78) participants. Unemployment (ARR 8.98, 95% CI 2.25-35.82) and repeated (> once) incarceration as an adult (ARR 3.78, 95% CI 1.43-9.97) were associated with dual dependence for Indigenous participants only. Indigenous participants had high rates of alcohol dependence, except for those dependent on opioids only. The drug dependence patterns of Indigenous and non-Indigenous people who inject drugs were similar, including the proportions dependent on both methamphetamine and opioids. However, for Indigenous injectors, there was a stronger association between drug dependence and contextual factors such as unemployment and incarceration. Expansion of treatment options and community-level programs may be required. [Smirnov A, Kemp R, Ward J, Henderson S, Williams S, Dev A, Najman J M. Patterns of drug dependence in a Queensland (Australia) sample of Indigenous and non-Indigenous people who

  3. Examination of Cognitive Function During Six Months of Calorie Restriction: Results of a Randomized Controlled Trial

    Science.gov (United States)

    Martin, Corby K.; Anton, Stephen D.; Han, Hongmei; York-Crowe, Emily; Redman, Leanne M.; Ravussin, Eric; Williamson, Donald A.

    2009-01-01

    Background Calorie restriction increases longevity in many organisms, and calorie restriction or its mimetic might increase longevity in humans. It is unclear if calorie restriction/dieting contributes to cognitive impairment. During this randomized controlled trial, the effect of 6 months of calorie restriction on cognitive functioning was tested. Methods Participants (n = 48) were randomized to one of four groups: (1) control (weight maintenance), (2) calorie restriction (CR; 25% restriction), (3) CR plus structured exercise (CR + EX, 12.5% restriction plus 12.5% increased energy expenditure via exercise), or (4) low-calorie diet (LCD; 890 kcal/d diet until 15% weight loss, followed by weight maintenance). Cognitive tests (verbal memory, visual memory, attention/concentration) were conducted at baseline and months 3 and 6. Mixed linear models tested if cognitive function changed significantly from baseline to months 3 and 6, and if this change differed by group. Correlation analysis was used to determine if average daily energy deficit (quantified from change in body energy stores) was associated with change in cognitive test performance for the three dieting groups combined. Results No consistent pattern of verbal memory, visual retention/memory, or attention/concentration deficits emerged during the trial. Daily energy deficit was not significantly associated with change in cognitive test performance. Conclusions This randomized controlled trial suggests that calorie restriction/dieting was not associated with a consistent pattern of cognitive impairment. These conclusions must be interpreted in the context of study limitations, namely small sample size and limited statistical power. Previous reports of cognitive impairment might reflect sampling biases or information processing biases. PMID:17518698

  4. Spatial analysis of "crazy quilts", a class of potentially random aesthetic artefacts.

    Directory of Open Access Journals (Sweden)

    Gesche Westphal-Fitch

    Full Text Available Human artefacts in general are highly structured and often display ordering principles such as translational, reflectional or rotational symmetry. In contrast, human artefacts that are intended to appear random and non symmetrical are very rare. Furthermore, many studies show that humans find it extremely difficult to recognize or reproduce truly random patterns or sequences. Here, we attempt to model two-dimensional decorative spatial patterns produced by humans that show no obvious order. "Crazy quilts" represent a historically important style of quilt making that became popular in the 1870s, and lasted about 50 years. Crazy quilts are unusual because unlike most human artefacts, they are specifically intended to appear haphazard and unstructured. We evaluate the degree to which this intention was achieved by using statistical techniques of spatial point pattern analysis to compare crazy quilts with regular quilts from the same region and era and to evaluate the fit of various random distributions to these two quilt classes. We found that the two quilt categories exhibit fundamentally different spatial characteristics: The patch areas of crazy quilts derive from a continuous random distribution, while area distributions of regular quilts consist of Gaussian mixtures. These Gaussian mixtures derive from regular pattern motifs that are repeated and we suggest that such a mixture is a distinctive signature of human-made visual patterns. In contrast, the distribution found in crazy quilts is shared with many other naturally occurring spatial patterns. Centroids of patches in the two quilt classes are spaced differently and in general, crazy quilts but not regular quilts are well-fitted by a random Strauss process. These results indicate that, within the constraints of the quilt format, Victorian quilters indeed achieved their goal of generating random structures.

  5. Spatial analysis of "crazy quilts", a class of potentially random aesthetic artefacts.

    Science.gov (United States)

    Westphal-Fitch, Gesche; Fitch, W Tecumseh

    2013-01-01

    Human artefacts in general are highly structured and often display ordering principles such as translational, reflectional or rotational symmetry. In contrast, human artefacts that are intended to appear random and non symmetrical are very rare. Furthermore, many studies show that humans find it extremely difficult to recognize or reproduce truly random patterns or sequences. Here, we attempt to model two-dimensional decorative spatial patterns produced by humans that show no obvious order. "Crazy quilts" represent a historically important style of quilt making that became popular in the 1870s, and lasted about 50 years. Crazy quilts are unusual because unlike most human artefacts, they are specifically intended to appear haphazard and unstructured. We evaluate the degree to which this intention was achieved by using statistical techniques of spatial point pattern analysis to compare crazy quilts with regular quilts from the same region and era and to evaluate the fit of various random distributions to these two quilt classes. We found that the two quilt categories exhibit fundamentally different spatial characteristics: The patch areas of crazy quilts derive from a continuous random distribution, while area distributions of regular quilts consist of Gaussian mixtures. These Gaussian mixtures derive from regular pattern motifs that are repeated and we suggest that such a mixture is a distinctive signature of human-made visual patterns. In contrast, the distribution found in crazy quilts is shared with many other naturally occurring spatial patterns. Centroids of patches in the two quilt classes are spaced differently and in general, crazy quilts but not regular quilts are well-fitted by a random Strauss process. These results indicate that, within the constraints of the quilt format, Victorian quilters indeed achieved their goal of generating random structures.
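    The kind of spatial point-pattern statistic the authors apply can be illustrated with a Clark-Evans nearest-neighbour index, a standard screening statistic for regularity versus randomness. The quilt data themselves are not reproduced here; the simulated point sets below are stand-ins:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def clark_evans(points, area=1.0):
        """Clark-Evans index: mean nearest-neighbour distance relative to its
        expectation under complete spatial randomness (~1 random, >1 regular)."""
        d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
        np.fill_diagonal(d, np.inf)          # exclude self-distances
        observed = d.min(axis=1).mean()
        expected = 0.5 / np.sqrt(len(points) / area)
        return observed / expected

    # Completely random centroids (CSR) vs. a regular grid of centroids.
    csr = rng.uniform(0.0, 1.0, size=(400, 2))
    g = (np.arange(20) + 0.5) / 20
    grid = np.array([(x, y) for x in g for y in g])

    r_csr, r_grid = clark_evans(csr), clark_evans(grid)
    ```

    A random pattern scores near 1 while the repeated motifs of a regular layout push the index toward 2, which is the same qualitative separation the study finds between crazy and regular quilts.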

  6. The Difference of Food Pattern and Physical Activity between Obese and Non Obese Teenage Group.

    Directory of Open Access Journals (Sweden)

    Kartika Suryaputra

    2012-06-01

    Full Text Available Obesity in adolescence is a syndrome that arises from fat accumulation in the body, occurring through a complex interaction between parental fatness, food pattern, and physical activity. In Indonesia, the prevalence of adolescent obesity is gradually increasing. The aim of this research was to analyze the difference in food pattern and physical activity between obese and non-obese teenage groups. This study was an analytical observational study with a cross-sectional design. The sample comprised 40 teenagers aged 15-17 from Santa Agnes senior high school, Surabaya, selected by simple random sampling and divided into 20 obese and 20 non-obese participants. The data were analyzed by the Mann-Whitney test for nutrition knowledge, pocket money, food pattern, fast food consumption, snack consumption pattern, consumption levels of energy, carbohydrate, protein, and fat, physical activity and parental fatness. The statistical tests showed significant differences in nutrition knowledge, pocket money, food pattern, fast food consumption, snack consumption pattern, energy consumption level, carbohydrate consumption level, protein consumption level, fat consumption level, physical activity and parental fatness between the obese and non-obese groups. The conclusion is that food pattern and physical activity differ significantly between obese and non-obese teenagers. It is recommended to provide information and education to teenagers about healthy food and adequate physical activity to prevent obesity.

  7. The physics of pattern formation at liquid interfaces

    International Nuclear Information System (INIS)

    Maher, J.V.

    1991-06-01

    This report discusses the following physics of liquid interfaces: pattern formation; perturbing Saffman-Taylor flow with a small gap-gradient; scaling of radial patterns in a viscoelastic solution; dynamic surface tension at an interface between miscible liquids; and random systems

  8. Patterns of proliferation related to vasculature in human head-and-neck carcinomas before and after transplantation in nude mice

    International Nuclear Information System (INIS)

    Wijffels, Karien I.E.M.; Kaanders, Johannes H.A.M.; Marres, Henri A.M.; Bussink, Johan; Peters, Hans P.W.; Rijken, Paulus F.J.W.; Hoogen, Frank J.A. van den; Wilde, Peter C.M. de; Kogel, Albert J. van der

    2001-01-01

    Purpose: The predictive potential of tumor cell kinetic parameters may be improved when they are studied in relation to other microenvironmental parameters. The purpose of this investigation was to quantitatively categorize human tumor samples according to proliferation patterns. Second, it was examined whether these characteristics are retained after xenotransplantation. Methods and Materials: Fifty tumor samples from head-and-neck cancer patients were immunohistochemically stained for Ki-67 and vessels. Also, parts of the samples were transplanted into nude mice. Tumors were categorized according to previously described patterns of proliferation. Vascular and proliferation patterns were analyzed using an image processing system. Results: The 50 tumors were categorized into four patterns of proliferation by visual assessment: marginal (6), intermediate (10), random (21), and mixed (12). One tumor could not be classified. These patterns were quantified by calculating the Ki-67 labeling index in distinct zones at increasing distance from vessels yielding good discrimination and significant differences between patterns. The probability of growth after xenotransplantation was significantly higher for tumors with a labeling index and vascular density above the median value compared to tumors with both parameters below the median (82% vs. 35%). Fifty percent of the tumors retained their proliferation patterns after xenotransplantation. Conclusion: The categorization by proliferation pattern previously described by others was reproduced quantitatively and spatially related to the vascular network using a computerized image processing system. The combination of quantitative and architectural information of multiple microenvironmental parameters adds a new dimension to the study of treatment resistance mechanisms. Tumor models representative of the various patterns can be used to further investigate the relevance of these architectural patterns

  9. Evaluating effectiveness of down-sampling for stratified designs and unbalanced prevalence in Random Forest models of tree species distributions in Nevada

    Science.gov (United States)

    Elizabeth A. Freeman; Gretchen G. Moisen; Tracy S. Frescino

    2012-01-01

    Random Forests is frequently used to model species distributions over large geographic areas. Complications arise when data used to train the models have been collected in stratified designs that involve different sampling intensity per stratum. The modeling process is further complicated if some of the target species are relatively rare on the landscape leading to an...

  10. Global patterns and predictions of seafloor biomass using random forests

    Digital Repository Service at National Institute of Oceanography (India)

    Wei, Chih-Lin; Rowe, G.T.; Escobar-Briones, E.; Boetius, A; Soltwedel, T.; Caley, M.J.; Soliman, Y.; Huettmann, F.; Qu, F.; Yu, Z.; Pitcher, C.R.; Haedrich, R.L.; Wicksten, M.K.; Rex, M.A; Baguley, J.G.; Sharma, J.; Danovaro, R.; MacDonald, I.R.; Nunnally, C.C.; Deming, J.W.; Montagna, P.; Levesque, M.; Weslawsk, J.M.; Wlodarska-Kowalczuk, M.; Ingole, B.S.; Bett, B.J.; Billett, D.S.M.; Yool, A; Bluhm, B.A; Iken, K.; Narayanaswamy, B.E.

    A comprehensive seafloor biomass and abundance database has been constructed from 24 oceanographic institutions worldwide within the Census of Marine Life (CoML) field projects. The machine-learning algorithm, Random Forests, was employed to model...

  11. Product's promotion patterns and their effects on consumers ...

    African Journals Online (AJOL)

    This research investigated the influence of promotional patterns of Coca-Cola products on purchase intentions, loyalty and attitude of consumers in Nigeria. Ninety randomly selected social science undergraduates of a major Nigerian university participated in the study. Participants were randomly assigned to three new ...

  12. Origin of convex tetrads in rare earth element patterns of hydrothermally altered siliceous igneous rocks from the Zinnwald Sn–W deposit, Germany

    OpenAIRE

    T. Monecke; Peter Dulski; U. Kempe

    2007-01-01

    The chondrite-normalized rare earth element (REE) patterns of whole rock samples from evolved granitic systems hosting rare metal deposits sometimes show a split into four consecutive curved segments, referred to as tetrads. In the present contribution, a rigorous statistical method is proposed that can be used to test whether geological significance should be attributed to tetrads that are only of limited size. The method involves a detailed evaluation of element and sample specific random a...

  13. Extraction and characterization of essential discharge patterns from multisite recordings of spiking ongoing activity.

    Directory of Open Access Journals (Sweden)

    Riccardo Storchi

    Full Text Available Neural activation patterns often proceed by schemes or motifs distributed across the involved cortical networks. As neurons are correlated, the estimation of all possible dependencies quickly goes out of control. The complex nesting of different oscillation frequencies and their high non-stationarity further hamper any quantitative evaluation of spiking network activities. The problem is exacerbated by the intrinsic variability of neural patterns. Our technique introduces two important novelties and makes it possible to isolate essential patterns on larger sets of spiking neurons and brain activity regimes. First, the sampling procedure over N units is based on a fixed spike number k in order to detect N-dimensional arrays (k-sequences), whose sum over all dimensions is k. Then, k-sequence variability is greatly reduced by a hierarchical separative clustering that assigns large numbers of distinct k-sequences to a few classes. Iterative separations are stopped when the dimension of each cluster becomes smaller than a certain threshold. As threshold tuning critically impacts the number of classes extracted, we developed an effective cost criterion to select the shortest possible description of our dataset. Finally, we describe three indices (C, S, R) to evaluate the average pattern complexity, the structure of essential classes and their stability in time. We validated this algorithm with four kinds of surrogate activity, ranging from random to very regularly patterned. Then we characterized a selection of ongoing activity recordings. By the S index we identified unstable, moderately and strongly stable patterns, while by the C and R indices we evidenced their non-random structure. Our algorithm seems able to extract interesting and non-trivial spatial dynamics from multisource neuronal recordings of ongoing and potentially stimulated activity. Combined with time-frequency analysis of LFPs it could provide a powerful multiscale approach linking population

  14. Random Subspace Aggregation for Cancer Prediction with Gene Expression Profiles

    Directory of Open Access Journals (Sweden)

    Liying Yang

    2016-01-01

    Full Text Available Background. Precisely predicting cancer is crucial for cancer treatment. Gene expression profiles make it possible to analyze patterns between genes and cancers on the genome-wide scale. Gene expression data analysis, however, is confronted with enormous challenges owing to its characteristics, such as high dimensionality, small sample size, and low signal-to-noise ratio. Results. This paper proposes a method, termed RS_SVM, to predict cancer from gene expression profiles by aggregating SVMs trained on random subspaces. After choosing gene features through statistical analysis, RS_SVM randomly selects feature subsets to yield random subspaces, trains SVM classifiers accordingly, and then aggregates the SVM classifiers to capture the advantage of ensemble learning. Experiments on eight real gene expression datasets were performed to validate the RS_SVM method. Experimental results show that RS_SVM achieved better classification accuracy and generalization performance in contrast with single SVM, K-nearest neighbor, decision tree, Bagging, AdaBoost, and the state-of-the-art methods. Experiments also explored the effect of subspace size on prediction performance. Conclusions. The proposed RS_SVM method yielded superior performance in analyzing gene expression profiles, which demonstrates that RS_SVM provides a good channel for such biological data.
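    The random-subspace aggregation idea can be sketched in a few lines of NumPy. This is not the paper's implementation: nearest-centroid base learners stand in for SVMs, the data are synthetic, and accuracy is measured on the training set for brevity:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def nearest_centroid_fit(X, y):
        """Per-class centroids (a simple stand-in base learner for an SVM)."""
        return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

    def nearest_centroid_predict(model, X):
        classes = sorted(model)
        d = np.stack([np.linalg.norm(X - model[c], axis=1) for c in classes])
        return np.array(classes)[d.argmin(axis=0)]

    def random_subspace_ensemble(X, y, n_learners=25, subspace_size=200):
        """Train one base learner per randomly drawn feature subset."""
        ensemble = []
        for _ in range(n_learners):
            feats = rng.choice(X.shape[1], subspace_size, replace=False)
            ensemble.append((feats, nearest_centroid_fit(X[:, feats], y)))
        return ensemble

    def ensemble_predict(ensemble, X):
        votes = np.stack([nearest_centroid_predict(m, X[:, f])
                          for f, m in ensemble])
        # Majority vote across base learners.
        return np.array([np.bincount(col).argmax() for col in votes.T])

    # Toy "expression" data: 2000 features, only the first 20 informative.
    n, p = 60, 2000
    y = np.repeat([0, 1], n // 2)
    X = rng.normal(size=(n, p))
    X[y == 1, :20] += 2.0              # class-1 shift on informative genes

    ens = random_subspace_ensemble(X, y)
    acc = (ensemble_predict(ens, X) == y).mean()
    ```

    Each learner sees only a random slice of the feature space, so individual learners are weak, but the majority vote recovers a strong classifier, which is the core mechanism RS_SVM exploits.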

  15. Structural patterns in nature

    DEFF Research Database (Denmark)

    Wester, Ture

    2003-01-01

    be an inspiration to structural morphologists and other dealing with the shaping of structures for buildings and other objects. Often the patterns appear in "dual" materializations, which indicate two radically different structural types of action. Randomness as a generator for optimal and basic structural action...

  16. Structural patterns in nature

    DEFF Research Database (Denmark)

    Wester, Ture

    2004-01-01

    be an inspiration to structural morphologists and other dealing with the shaping of structures for buildings and other objects. Often the patterns appear in "dual" materializations, which indicate two radically different structural types of action. Randomness as a generator for optimal and basic structural action...

  17. Study of Relationship between Genetic Pattern and Susceptibility to Terbinafine in Clinical Isolated of Trichophyton rubrum

    Directory of Open Access Journals (Sweden)

    Fatemeh Hadadi

    2014-06-01

    Full Text Available Background & objectives: Trichophyton rubrum is one of the most common pathogenic causes of dermatophytosis. One of the drugs widely prescribed for fungal infections is terbinafine, which belongs to the allylamine group of antifungal agents. Recently, molecular typing methods have been developed to answer epidemiological questions and disease recurrence problems. The current study was conducted on 22 isolates of Trichophyton rubrum obtained randomly from patients. Our aim was to investigate the correlation between genetic pattern and sensitivity to terbinafine in clinical isolates of Trichophyton rubrum.   Methods: First, the genus and species of the fungi isolated from patients were confirmed by macroscopic and microscopic methods; then the resistance or sensitivity of the isolates to the drug was determined using culture media containing defined amounts of the drug. Next, fungal DNA was extracted and typed by RAPD-PCR (random amplified polymorphic DNA) using three primers with random sequences.   Results: Each primer produced a different amplification pattern, and with all three primers differences were observed between the genetic patterns of resistant and sensitive samples, but no band showed 100% specificity.   Conclusion: The 12 sensitive isolates, which did not grow at a drug concentration of 0.1 mg/ml, also showed limited growth at lower drug concentrations. The ten resistant isolates, which grew at 0.1 mg/ml of the drug, remained resistant at lower concentrations. RAPD analysis appears to be well suited for molecular typing of Trichophyton rubrum.

  18. Blocked Randomization with Randomly Selected Block Sizes

    Directory of Open Access Journals (Sweden)

    Jimmy Efird

    2010-12-01

    Full Text Available When planning a randomized clinical trial, careful consideration must be given to how participants are selected for various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has equal likelihood of being assigned to treatment versus referent groups. However, by chance an unequal number of individuals may be assigned to each arm of the study and thus decrease the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Yet still, the allocation process may be predictable, for example, when the investigator is not blind and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes.
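    The scheme described above can be sketched as a short allocation-sequence generator. The arm labels "T"/"C", the block sizes 4 and 6, and the seed are illustrative assumptions for a two-arm trial:

    ```python
    import random

    def blocked_randomization(n_participants, block_sizes=(4, 6), seed=7):
        """Generate a two-arm allocation sequence from blocks of randomly
        chosen size; each block contains equal numbers of both arms, so the
        running imbalance never exceeds half the largest block."""
        rng = random.Random(seed)
        sequence = []
        while len(sequence) < n_participants:
            size = rng.choice(block_sizes)       # random block size
            block = ["T"] * (size // 2) + ["C"] * (size // 2)
            rng.shuffle(block)                   # balance within each block
            sequence.extend(block)
        return sequence[:n_participants]

    alloc = blocked_randomization(20)
    ```

    Because the block size varies unpredictably, an unblinded investigator cannot deduce the final assignment of a block from the preceding ones, which is exactly the predictability problem fixed-size blocking suffers from.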

  19. Finding Biomarker Signatures in Pooled Sample Designs: A Simulation Framework for Methodological Comparisons

    Directory of Open Access Journals (Sweden)

    Anna Telaar

    2010-01-01

    Full Text Available Detection of discriminating patterns in gene expression data can be accomplished by using various methods of statistical learning. It has been proposed that sample pooling in this context would have negative effects; however, pooling cannot always be avoided. We propose a simulation framework to explicitly investigate the parameters of patterns, experimental design, noise, and choice of method in order to find out which effects on classification performance are to be expected. We use a two-group classification task and simulated gene expression data with independent differentially expressed genes as well as bivariate linear patterns and the combination of both. Our results show a clear increase of prediction error with pool size. For pooled training sets powered partial least squares discriminant analysis outperforms discriminant analysis, random forests, and support vector machines with linear or radial kernel for two of three simulated scenarios. The proposed simulation approach can be implemented to systematically investigate a number of additional scenarios of practical interest.
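    A skeleton for such a pooling simulation can be written with NumPy alone. The data dimensions, effect size, and the t-score/nearest-centroid classifier are illustrative assumptions standing in for the methods compared in the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def make_data(n_per_group, p=500, n_informative=30, shift=1.0):
        """Two-group 'expression' data; a subset of genes is differential."""
        y = np.repeat([0, 1], n_per_group)
        X = rng.normal(size=(2 * n_per_group, p))
        X[y == 1, :n_informative] += shift
        return X, y

    def pool_samples(X, y, pool_size):
        """Average non-overlapping groups of pool_size samples per class."""
        Xp, yp = [], []
        for c in (0, 1):
            Xc = X[y == c]
            for i in range(len(Xc) // pool_size):
                Xp.append(Xc[i * pool_size:(i + 1) * pool_size].mean(axis=0))
                yp.append(c)
        return np.array(Xp), np.array(yp)

    def gene_select_centroid_error(Xtr, ytr, Xte, yte, n_genes=30):
        """Select top genes by a t-like score, then nearest-centroid classify."""
        m0, m1 = Xtr[ytr == 0].mean(0), Xtr[ytr == 1].mean(0)
        score = np.abs(m1 - m0) / (Xtr.std(axis=0) + 1e-9)
        top = np.argsort(-score)[:n_genes]
        d0 = np.linalg.norm(Xte[:, top] - m0[top], axis=1)
        d1 = np.linalg.norm(Xte[:, top] - m1[top], axis=1)
        return ((d1 < d0).astype(int) != yte).mean()

    Xtr, ytr = make_data(32)       # training set, to be pooled
    Xte, yte = make_data(200)      # individual (never pooled) test set
    errors = {k: gene_select_centroid_error(*pool_samples(Xtr, ytr, k), Xte, yte)
              for k in (1, 2, 4, 8)}
    ```

    Sweeping the pool size while holding the pattern, noise, and classifier fixed is the core of the proposed framework; additional scenarios drop in by changing `make_data` or the classifier.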

  20. Variable screening and ranking using sampling-based sensitivity measures

    International Nuclear Information System (INIS)

    Wu, Y-T.; Mohanty, Sitakanta

    2006-01-01

    This paper presents a methodology for screening out insignificant random variables and ranking the significant ones using sensitivity measures, including two cumulative distribution function (CDF)-based and two mean-response-based measures. The methodology features (1) using random samples to compute sensitivities and (2) using acceptance limits, derived from hypothesis testing, to classify random variables as significant or insignificant. Because no approximation is needed in either the form of the performance functions or the type of continuous distribution functions representing input variables, the sampling-based approach can handle highly nonlinear functions with non-normal variables. The main characteristics and effectiveness of the sampling-based sensitivity measures are investigated using both simple and complex examples. Because the number of samples needed does not depend on the number of variables, the methodology appears to be particularly suitable for problems with large, complex models that have many random variables but relatively few significant ones
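One simple instance of a CDF-based, sampling-only sensitivity measure is the Kolmogorov-Smirnov distance between an input's conditional distributions given that the output does or does not exceed a limit. The model, threshold, and sample sizes below are illustrative assumptions, not the paper's:

```python
import numpy as np

def cdf_sensitivity(x, y, threshold):
    """Kolmogorov-Smirnov distance between the distribution of input
    sample x in runs where output y exceeds the threshold and runs
    where it does not. Insignificant inputs give a small distance
    (the two conditional CDFs coincide up to sampling noise);
    significant inputs give a large one."""
    a = np.sort(x[y > threshold])
    b = np.sort(x[y <= threshold])
    grid = np.concatenate([a, b])
    fa = np.searchsorted(a, grid, side="right") / len(a)
    fb = np.searchsorted(b, grid, side="right") / len(b)
    return np.max(np.abs(fa - fb))

rng = np.random.default_rng(1)
x1 = rng.normal(size=5000)       # significant input
x2 = rng.normal(size=5000)       # insignificant input
y = x1**2 + 0.01 * x2            # nonlinear model dominated by x1
print(cdf_sensitivity(x1, y, 1.0), cdf_sensitivity(x2, y, 1.0))
```

Only random samples of the inputs and outputs are needed, and the cost does not grow with the number of variables, matching the screening use case described above.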

  1. Comparison of tool feed influence in CNC polishing between a novel circular-random path and other pseudo-random paths.

    Science.gov (United States)

    Takizawa, Ken; Beaucamp, Anthony

    2017-09-18

    A new category of circular pseudo-random paths is proposed in order to suppress repetitive patterns and improve surface waviness on ultra-precision polished surfaces. Random paths in prior research had many corners, therefore deceleration of the polishing tool affected the surface waviness. The new random path can suppress velocity changes of the polishing tool and thus restrict degradation of the surface waviness, making it suitable for applications with stringent mid-spatial-frequency requirements such as photomask blanks for EUV lithography.

  2. Dispersion patterns and sampling plans for Diaphorina citri (Hemiptera: Psyllidae) in citrus.

    Science.gov (United States)

    Sétamou, Mamoudou; Flores, Daniel; French, J Victor; Hall, David G

    2008-08-01

    The abundance and spatial dispersion of Diaphorina citri Kuwayama (Hemiptera: Psyllidae) were studied in 34 grapefruit (Citrus paradisi Macfad.) and six sweet orange [Citrus sinensis (L.) Osbeck] orchards from March to August 2006, when the pest is most abundant in southern Texas. Although flush shoot infestation levels did not vary with host plant species, densities of D. citri eggs, nymphs, and adults were significantly higher on sweet orange than on grapefruit. D. citri immatures also were found in significantly higher numbers in the southeastern quadrant of trees than in other parts of the canopy. The spatial distribution of D. citri nymphs and adults was analyzed using Iwao's patchiness regression and Taylor's power law. Taylor's power law fitted the data better than Iwao's model. Based on both regression models, the field dispersion patterns of D. citri nymphs and adults were aggregated among flush shoots in individual trees, as indicated by regression slopes significantly >1. For the average density of each life stage obtained during our surveys, the minimum number of flush shoots per tree needed to estimate D. citri densities varied from eight for eggs to four for adults. Projections indicated that a sampling plan consisting of 10 trees and eight flush shoots per tree would provide density estimates of the three developmental stages of D. citri precise enough for population studies and management decisions. A presence-absence sampling plan with a fixed precision level was developed and can be used to provide a quick estimate of D. citri populations in citrus orchards.
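Taylor's power law, variance = a * mean**b with b > 1 indicating aggregation, is fitted by linear regression on the log-log scale. The shoot counts below are made-up illustrative data, not the study's:

```python
import numpy as np

# Hypothetical per-tree counts of nymphs on flush shoots:
# each row is one tree (sampling unit), entries are shoot counts.
counts = np.array([
    [0, 2, 5, 0, 1, 9, 0, 3],
    [1, 0, 0, 4, 7, 0, 2, 0],
    [3, 0, 6, 1, 0, 0, 8, 2],
    [0, 0, 1, 0, 5, 2, 0, 0],
    [2, 4, 0, 0, 0, 1, 0, 6],
])

means = counts.mean(axis=1)
variances = counts.var(axis=1, ddof=1)

# Taylor's power law: variance = a * mean**b, linear on log-log scale.
b, log_a = np.polyfit(np.log(means), np.log(variances), 1)
print(f"a = {np.exp(log_a):.2f}, b = {b:.2f}")  # slope b > 1: aggregated
```

The slope b from this regression also feeds directly into fixed-precision sample-size formulas of the kind used to derive the 10-tree, eight-shoot plan above.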

  3. Not accounting for interindividual variability can mask habitat selection patterns: a case study on black bears.

    Science.gov (United States)

    Lesmerises, Rémi; St-Laurent, Martin-Hugues

    2017-11-01

    Habitat selection studies conducted at the population scale commonly aim to describe general patterns that could improve our understanding of the limiting factors in species-habitat relationships. Researchers often consider interindividual variation in selection patterns to control for its effects and avoid pseudoreplication by using mixed-effect models that include individuals as random factors. Here, we highlight common pitfalls and possible misinterpretations of this strategy by describing habitat selection of 21 black bears Ursus americanus. We used Bayesian mixed-effect models and compared results obtained when using random intercept (i.e., population level) versus calculating individual coefficients for each independent variable (i.e., individual level). We then related interindividual variability to individual characteristics (i.e., age, sex, reproductive status, body condition) in a multivariate analysis. The assumption of comparable behavior among individuals was verified only in 40% of the cases in our seasonal best models. Indeed, we found strong and opposite responses among sampled bears and individual coefficients were linked to individual characteristics. For some covariates, contrasted responses canceled each other out at the population level. In other cases, interindividual variability was concealed by the composition of our sample, with the majority of the bears (e.g., old individuals and bears in good physical condition) driving the population response (e.g., selection of young forest cuts). Our results stress the need to consider interindividual variability to avoid misinterpretation and uninformative results, especially for a flexible and opportunistic species. This study helps to identify some ecological drivers of interindividual variability in bear habitat selection patterns.

  4. Low-fat dietary pattern and cancer incidence in the Women's Health Initiative Dietary Modification Randomized Controlled Trial.

    Science.gov (United States)

    Prentice, Ross L; Thomson, Cynthia A; Caan, Bette; Hubbell, F Allan; Anderson, Garnet L; Beresford, Shirley A A; Pettinger, Mary; Lane, Dorothy S; Lessin, Lawrence; Yasmeen, Shagufta; Singh, Baljinder; Khandekar, Janardan; Shikany, James M; Satterfield, Suzanne; Chlebowski, Rowan T

    2007-10-17

    The Women's Health Initiative Dietary Modification (DM) Randomized Controlled Trial evaluated the effects of a low-fat dietary pattern on chronic disease incidence, with breast cancer and colorectal cancer as primary outcomes. The trial protocol also listed ovarian cancer and endometrial cancer as outcomes that may be favorably affected by the intervention. A total of 48,835 postmenopausal women were randomly assigned during 1993-1998 to a DM intervention (n = 19,541) or comparison (usual diet; n = 29,294) group and followed up for an average of 8.1 years. The intervention goal was to reduce total fat intake to 20% of energy and to increase consumption of vegetables, fruits, and grains. Cancer outcomes were verified by pathology report review. We used weighted log-rank tests to compare incidence of invasive cancers of the ovary and endometrium, total invasive cancer, and invasive cancers at other sites between the groups. All statistical tests were two-sided. Ovarian cancer risk was lower in the intervention than in the comparison group (P = .03). Although the overall ovarian cancer hazard ratio (HR) was not statistically significantly less than 1.0, the hazard ratio decreased with increasing intervention duration (P(trend) = .01). For the first 4 years, the risk for ovarian cancer was similar in the intervention and control groups (0.52 cases per 1000 person-years in the intervention group versus 0.45 per 1000 person-years in the comparison group; HR = 1.16, 95% confidence interval [CI] = 0.73 to 1.84); over the next 4.1 years, the risk was lower in the intervention group (0.38 cases per 1000 person-years in the intervention group versus 0.64 per 1000 person-years in the comparison group; HR = 0.60, 95% CI = 0.38 to 0.96). Risk of cancer of the endometrium did not differ between the groups (P = .18). The estimated risk of total invasive cancer was slightly lower in the intervention group than in the control group (HR = 0.95, 95% CI = 0.89 to 1.01; P = .10). 
A low

  5. Exact distribution of a pattern in a set of random sequences generated by a Markov source: applications to biological data.

    Science.gov (United States)

    Nuel, Gregory; Regad, Leslie; Martin, Juliette; Camproux, Anne-Claude

    2010-01-26

    In bioinformatics it is common to search for a pattern of interest in a potentially large set of rather short sequences (upstream gene regions, proteins, exons, etc.). Although many methodological approaches allow practitioners to compute the distribution of a pattern count in a random sequence generated by a Markov source, no specific developments have taken into account the counting of occurrences in a set of independent sequences. We aim to address this problem by deriving efficient approaches and algorithms to perform these computations both for low and high complexity patterns in the framework of homogeneous or heterogeneous Markov models. The latest advances in the field allowed us to use a technique of optimal Markov chain embedding based on deterministic finite automata to introduce three innovative algorithms. Algorithm 1 is the only one able to deal with heterogeneous models. It also avoids any product of convolutions of the pattern distributions of individual sequences. When working with homogeneous models, Algorithm 2 yields a dramatic reduction in complexity by taking advantage of previous computations to obtain moment generating functions efficiently. In the particular case of low or moderate complexity patterns, Algorithm 3 exploits power computation and binary decomposition to further reduce the time complexity to a logarithmic scale. All these algorithms and their relative interest in comparison with existing ones were then tested and discussed on a toy example and three biological data sets: structural patterns in protein loop structures, PROSITE signatures in a bacterial proteome, and transcription factors in upstream gene regions. On these data sets, we also compared our exact approaches to the tempting approximation that consists in concatenating the sequences in the data set into a single sequence. Our algorithms prove to be effective and able to handle real data sets with multiple sequences, as well as biological patterns of
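The Markov chain embedding idea can be illustrated with a small dynamic program over (automaton state, count) pairs. This is a generic sketch of the technique for a single sequence, not the paper's Algorithms 1-3:

```python
def pattern_count_distribution(pattern, length, init, trans):
    """Exact distribution of the number of (possibly overlapping)
    occurrences of `pattern` in a length-`length` sequence emitted by
    a first-order homogeneous Markov source, via Markov chain
    embedding: the embedded state is (automaton state, count)."""
    alphabet = sorted(init)
    m = len(pattern)
    # KMP failure function gives a deterministic matching automaton.
    fail = [0] * m
    k = 0
    for i in range(1, m):
        while k and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k

    def step(state, ch):
        # Advance the automaton by one symbol; report a full match.
        while state and ch != pattern[state]:
            state = fail[state - 1]
        if ch == pattern[state]:
            state += 1
        if state == m:               # match: count it, keep the overlap
            return fail[m - 1], 1
        return state, 0

    # probs[(automaton state, previous symbol, count)] = probability
    probs = {(0, None, 0): 1.0}
    for _ in range(length):
        nxt = {}
        for (state, prev, count), p in probs.items():
            for ch in alphabet:
                q = init[ch] if prev is None else trans[prev][ch]
                if q == 0.0:
                    continue
                s2, hit = step(state, ch)
                key = (s2, ch, count + hit)
                nxt[key] = nxt.get(key, 0.0) + p * q
        probs = nxt
    dist = {}
    for (_, _, count), p in probs.items():
        dist[count] = dist.get(count, 0.0) + p
    return dist

# An i.i.d. fair coin is a Markov source with identical rows;
# count occurrences of "HH" in 3 tosses.
init = {"H": 0.5, "T": 0.5}
trans = {"H": init, "T": init}
dist = pattern_count_distribution("HH", 3, init, trans)
print(dist)
```

Extending this to a set of independent sequences is exactly where the product-of-convolutions cost arises that Algorithm 1 is designed to avoid.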

  6. Human X-chromosome inactivation pattern distributions fit a model of genetically influenced choice better than models of completely random choice

    Science.gov (United States)

    Renault, Nisa K E; Pritchett, Sonja M; Howell, Robin E; Greer, Wenda L; Sapienza, Carmen; Ørstavik, Karen Helene; Hamilton, David C

    2013-01-01

    In eutherian mammals, one X-chromosome in every XX somatic cell is transcriptionally silenced through the process of X-chromosome inactivation (XCI). Females are thus functional mosaics, where some cells express genes from the paternal X, and the others from the maternal X. The relative abundance of the two cell populations (X-inactivation pattern, XIP) can have significant medical implications for some females. In mice, the ‘choice' of which X to inactivate, maternal or paternal, in each cell of the early embryo is genetically influenced. In humans, the timing of XCI choice and whether choice occurs completely randomly or under a genetic influence is debated. Here, we explore these questions by analysing the distribution of XIPs in large populations of normal females. Models were generated to predict XIP distributions resulting from completely random or genetically influenced choice. Each model describes the discrete primary distribution at the onset of XCI, and the continuous secondary distribution accounting for changes to the XIP as a result of development and ageing. Statistical methods are used to compare models with empirical data from Danish and Utah populations. A rigorous data treatment strategy maximises information content and allows for unbiased use of unphased XIP data. The Anderson–Darling goodness-of-fit statistics and likelihood ratio tests indicate that a model of genetically influenced XCI choice better fits the empirical data than models of completely random choice. PMID:23652377

  7. Visualizing the Sample Standard Deviation

    Science.gov (United States)

    Sarkar, Jyotirmoy; Rashid, Mamunur

    2017-01-01

    The standard deviation (SD) of a random sample is defined as the square-root of the sample variance, which is the "mean" squared deviation of the sample observations from the sample mean. Here, we interpret the sample SD as the square-root of twice the mean square of all pairwise half deviations between any two sample observations. This…
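The pairwise interpretation is an algebraic identity: the sample variance equals sum_{i != j} (x_i - x_j)**2 / (2 * n * (n - 1)) over ordered pairs, i.e. twice the mean squared half-deviation. A quick numerical check:

```python
import math
import random

def sd_usual(xs):
    """Square root of the 'mean' squared deviation from the sample mean."""
    n = len(xs)
    mean = sum(xs) / n
    return math.sqrt(sum((x - mean) ** 2 for x in xs) / (n - 1))

def sd_pairwise(xs):
    """Square root of twice the mean squared half-deviation over all
    n*(n-1) ordered pairs of distinct observations."""
    n = len(xs)
    mean_sq_half_dev = sum(((xs[i] - xs[j]) / 2) ** 2
                           for i in range(n)
                           for j in range(n) if i != j) / (n * (n - 1))
    return math.sqrt(2 * mean_sq_half_dev)

random.seed(7)
xs = [random.gauss(0, 1) for _ in range(50)]
print(sd_usual(xs), sd_pairwise(xs))
```

The two definitions agree to floating-point precision, so the SD can indeed be visualized without reference to the sample mean.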

  8. Network Sampling with Memory: A proposal for more efficient sampling from social networks

    Science.gov (United States)

    Mouw, Ted; Verdery, Ashton M.

    2013-01-01

    Techniques for sampling from networks have grown into an important area of research across several fields. For sociologists, the possibility of sampling from a network is appealing for two reasons: (1) A network sample can yield substantively interesting data about network structures and social interactions, and (2) it is useful in situations where study populations are difficult or impossible to survey with traditional sampling approaches because of the lack of a sampling frame. Despite its appeal, methodological concerns about the precision and accuracy of network-based sampling methods remain. In particular, recent research has shown that sampling from a network using a random walk based approach such as Respondent Driven Sampling (RDS) can result in high design effects (DE)—the ratio of the sampling variance to the sampling variance of simple random sampling (SRS). A high design effect means that more cases must be collected to achieve the same level of precision as SRS. In this paper we propose an alternative strategy, Network Sampling with Memory (NSM), which collects network data from respondents in order to reduce design effects and, correspondingly, the number of interviews needed to achieve a given level of statistical power. NSM combines a “List” mode, where all individuals on the revealed network list are sampled with the same cumulative probability, with a “Search” mode, which gives priority to bridge nodes connecting the current sample to unexplored parts of the network. We test the relative efficiency of NSM compared to RDS and SRS on 162 school and university networks from Add Health and Facebook that range in size from 110 to 16,278 nodes. The results show that the average design effect for NSM on these 162 networks is 1.16, which is very close to the efficiency of a simple random sample (DE=1), and 98.5% lower than the average DE we observed for RDS. PMID:24159246

  9. Sampling methods to the statistical control of the production of blood components.

    Science.gov (United States)

    Pereira, Paulo; Seghatchian, Jerard; Caldeira, Beatriz; Santos, Paula; Castro, Rosa; Fernandes, Teresa; Xavier, Sandra; de Sousa, Gracinda; de Almeida E Sousa, João Paulo

    2017-12-01

    The control of blood component specifications is a requirement generalized in Europe by the European Commission Directives and in the US by the AABB standards. The use of a statistical process control methodology is recommended in the related literature, including the EDQM guideline. The reliability of the control depends on the sampling. However, a correct sampling methodology does not seem to be systematically applied. Commonly, the sampling is intended only to comply with the 1% specification for the produced blood components. Nevertheless, from a purely statistical viewpoint, this model is arguably not a consistent sampling technique. This could be a severe limitation in detecting abnormal patterns and in assuring that the production has a non-significant probability of producing nonconforming components. This article discusses what is happening in blood establishments. Three statistical methodologies are proposed: simple random sampling, sampling based on the proportion of a finite population, and sampling based on the inspection level. The empirical results demonstrate that these models are practicable in blood establishments, contributing to the robustness of sampling and of the related statistical process control decisions for their intended purpose. Copyright © 2017 Elsevier Ltd. All rights reserved.
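Sampling based on the proportion of a finite population typically uses the standard sample-size formula for a proportion with the finite population correction. The lot size, specification, and margin of error below are illustrative assumptions, not the article's numbers:

```python
import math

def sample_size_finite(N, p=0.01, e=0.005, z=1.96):
    """Sample size for estimating a proportion near p in a finite lot
    of N units, with margin of error e at the confidence level implied
    by z, applying the finite population correction."""
    n0 = z**2 * p * (1 - p) / e**2        # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / N))

# e.g. a monthly production of 3,000 units against a 1% specification
print(sample_size_finite(3000))
```

The correction matters here: for small monthly productions the required sample is far below the infinite-population figure, which is one reason a fixed ad hoc sample can be either wasteful or underpowered.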

  10. Fast integration using quasi-random numbers

    International Nuclear Information System (INIS)

    Bossert, J.; Feindt, M.; Kerzel, U.

    2006-01-01

    Quasi-random numbers are specially constructed series of numbers optimised to evenly sample a given s-dimensional volume. Using quasi-random numbers in numerical integration converges faster with a higher accuracy compared to the case of pseudo-random numbers. The basic properties of quasi-random numbers are introduced, various generators are discussed and the achieved gain is illustrated by examples

  11. Fast integration using quasi-random numbers

    Science.gov (United States)

    Bossert, J.; Feindt, M.; Kerzel, U.

    2006-04-01

    Quasi-random numbers are specially constructed series of numbers optimised to evenly sample a given s-dimensional volume. Using quasi-random numbers in numerical integration converges faster with a higher accuracy compared to the case of pseudo-random numbers. The basic properties of quasi-random numbers are introduced, various generators are discussed and the achieved gain is illustrated by examples.
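A minimal illustration uses a 2-D Halton sequence (a simpler relative of the generators such abstracts typically discuss) to estimate pi/4 as the area of a quarter disc; the point count is arbitrary:

```python
import math

def halton(index, base):
    """Radical-inverse (van der Corput) value of `index` in `base`;
    one axis of a Halton low-discrepancy sequence."""
    f, r = 1.0, 0.0
    while index > 0:
        f /= base
        r += f * (index % base)
        index //= base
    return r

def integrate_quarter_circle(n):
    """Fraction of the first n Halton points (bases 2 and 3) that fall
    inside the unit quarter disc; converges to pi/4."""
    hits = sum(1 for i in range(1, n + 1)
               if halton(i, 2) ** 2 + halton(i, 3) ** 2 <= 1.0)
    return hits / n

print(4 * integrate_quarter_circle(10000), math.pi)
```

Because the points fill the unit square evenly rather than clumping at random, the estimate typically converges faster than a pseudo-random Monte Carlo estimate of the same integral.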

  12. Spatial Analysis of “Crazy Quilts”, a Class of Potentially Random Aesthetic Artefacts

    Science.gov (United States)

    Westphal-Fitch, Gesche; Fitch, W. Tecumseh

    2013-01-01

    Human artefacts in general are highly structured and often display ordering principles such as translational, reflectional or rotational symmetry. In contrast, human artefacts that are intended to appear random and non symmetrical are very rare. Furthermore, many studies show that humans find it extremely difficult to recognize or reproduce truly random patterns or sequences. Here, we attempt to model two-dimensional decorative spatial patterns produced by humans that show no obvious order. “Crazy quilts” represent a historically important style of quilt making that became popular in the 1870s, and lasted about 50 years. Crazy quilts are unusual because unlike most human artefacts, they are specifically intended to appear haphazard and unstructured. We evaluate the degree to which this intention was achieved by using statistical techniques of spatial point pattern analysis to compare crazy quilts with regular quilts from the same region and era and to evaluate the fit of various random distributions to these two quilt classes. We found that the two quilt categories exhibit fundamentally different spatial characteristics: The patch areas of crazy quilts derive from a continuous random distribution, while area distributions of regular quilts consist of Gaussian mixtures. These Gaussian mixtures derive from regular pattern motifs that are repeated and we suggest that such a mixture is a distinctive signature of human-made visual patterns. In contrast, the distribution found in crazy quilts is shared with many other naturally occurring spatial patterns. Centroids of patches in the two quilt classes are spaced differently and in general, crazy quilts but not regular quilts are well-fitted by a random Strauss process. These results indicate that, within the constraints of the quilt format, Victorian quilters indeed achieved their goal of generating random structures. PMID:24066095

  13. An improved sampling method of complex network

    Science.gov (United States)

    Gao, Qi; Ding, Xintong; Pan, Feng; Li, Weixing

    2014-12-01

    Sampling a subnet is an important topic of complex network research, and the sampling method influences the structure and characteristics of the subnet. Random multiple snowball with Cohen (RMSC) process sampling, which combines the advantages of random sampling and snowball sampling, is proposed in this paper. It has the ability to explore global information and discover local structure at the same time. The experiments indicate that this novel sampling method keeps the sampled subnet similar to the original network in degree distribution, connectivity rate, and average shortest path. The method is applicable when prior knowledge about the degree distribution of the original network is insufficient.
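The combination of snowball expansion with random exploration can be sketched roughly as follows. This is a generic snowball-with-random-restarts sketch, not the authors' exact RMSC procedure; the graph, jump probability, and seed are assumptions:

```python
import random

def snowball_sample(adj, n_target, n_seeds=2, jump_prob=0.1, seed=0):
    """Breadth-first snowball sampling from random seed nodes, with
    occasional uniform random jumps that inject a fresh node into the
    frontier, so the sample mixes local structure (snowball) with
    global reach (random sampling). `adj` maps node -> neighbour list."""
    rng = random.Random(seed)
    nodes = list(adj)
    sampled = set(rng.sample(nodes, n_seeds))
    frontier = list(sampled)
    while len(sampled) < n_target and frontier:
        if rng.random() < jump_prob:          # random-sampling component
            node = rng.choice(nodes)
            if node not in sampled:
                sampled.add(node)
                frontier.append(node)
            continue
        node = frontier.pop(0)                # snowball component
        for nb in adj[node]:
            if nb not in sampled and len(sampled) < n_target:
                sampled.add(nb)
                frontier.append(nb)
    return sampled

# toy ring network with chords
adj = {i: [(i - 1) % 20, (i + 1) % 20, (i + 5) % 20] for i in range(20)}
print(sorted(snowball_sample(adj, 10)))
```

Pure snowball sampling over-represents the neighbourhood of the seeds; the random jumps are what let the sample reach parts of the network the snowball has not touched.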

  14. Multiple Mating, Paternity and Complex Fertilisation Patterns in the Chokka Squid Loligo reynaudii.

    Directory of Open Access Journals (Sweden)

    Marie-Jose Naud

    Full Text Available Polyandry is widespread and influences patterns of sexual selection, with implications for sexual conflict over mating. Assessing sperm precedence patterns is a first step towards understanding sperm competition within a female and elucidating the roles of male- and female-controlled factors. In this study, behavioural field data and genetic data were combined to investigate polyandry in the chokka squid Loligo reynaudii. Microsatellite DNA-based paternity analysis revealed multiple paternity to be the norm, with 79% of broods sired by at least two males. Genetic data also determined that the male who was guarding the female at the moment of sampling was a sire in 81% of the families tested, highlighting mate guarding as a successful male tactic with postcopulatory benefits linked to a sperm deposition site giving privileged access to extruded egg strings. As females lay multiple eggs in capsules (egg strings) wherein their position is not altered during maturation, it is possible to describe the spatial/temporal sequence of fertilisation and sperm precedence. There were four different patterns of fertilisation found among the tested egg strings: (1) a unique sire; (2) a dominant sire, with one or more rare sires; (3) randomly mixed paternity (two or more sires); and (4) a distinct switch in paternity occurring along the egg string. The latter pattern cannot be explained by a random use of stored sperm, and suggests postcopulatory female sperm choice. Collectively the data indicate multiple levels of male- and female-controlled influences on sperm precedence, and highlight squid as interesting models to study the interplay between sexual and natural selection.

  15. Sampling Polya-Gamma random variates: alternate and approximate techniques

    OpenAIRE

    Windle, Jesse; Polson, Nicholas G.; Scott, James G.

    2014-01-01

    Efficiently sampling from the Pólya-Gamma distribution, PG(b, z), is an essential element of Pólya-Gamma data augmentation. Polson et al. (2013) show how to efficiently sample from the PG(1, z) distribution. We build two new samplers that offer improved performance when sampling from the PG(b, z) distribution and b is not unity.
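For intuition, a PG(b, z) draw can be approximated directly from the distribution's definition as an infinite weighted sum of gammas; the efficient samplers discussed in the abstract avoid this truncation, and the truncation level and sample size here are arbitrary:

```python
import math
import random

def sample_pg(b, z, trunc=100, rng=random):
    """Approximate PG(b, z) draw from the series representation
        X = (1 / (2*pi**2)) * sum_k g_k / ((k - 1/2)**2 + z**2 / (4*pi**2)),
    with g_k ~ Gamma(b, 1) independent; exact only as trunc -> infinity."""
    total = 0.0
    for k in range(1, trunc + 1):
        g = rng.gammavariate(b, 1.0)
        total += g / ((k - 0.5) ** 2 + (z / (2 * math.pi)) ** 2)
    return total / (2 * math.pi ** 2)

random.seed(4)
draws = [sample_pg(1.0, 2.0) for _ in range(5000)]
mean = sum(draws) / len(draws)
# The exact mean is E[PG(b, z)] = b * tanh(z / 2) / (2 * z).
print(mean, math.tanh(1.0) / 4)
```

The sample mean lands close to the closed-form expectation, which is the standard sanity check for any PG sampler.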

  16. Random Photon Absorption Model Elucidates How Early Gain Control in Fly Photoreceptors Arises from Quantal Sampling

    Science.gov (United States)

    Song, Zhuoyi; Zhou, Yu; Juusola, Mikko

    2016-01-01

    Many diurnal photoreceptors encode vast real-world light changes effectively, but how this performance originates from photon sampling is unclear. A 4-module biophysically-realistic fly photoreceptor model, in which information capture is limited by the number of its sampling units (microvilli) and their photon-hit recovery time (refractoriness), can accurately simulate real recordings and their information content. However, sublinear summation in quantum bump production (quantum-gain-nonlinearity) may also cause adaptation by reducing the bump/photon gain when multiple photons hit the same microvillus simultaneously. Here, we use a Random Photon Absorption Model (RandPAM), which is the 1st module of the 4-module fly photoreceptor model, to quantify the contribution of quantum-gain-nonlinearity in light adaptation. We show how quantum-gain-nonlinearity already results from photon sampling alone. In the extreme case, when two or more simultaneous photon-hits reduce to a single sublinear value, quantum-gain-nonlinearity is preset before the phototransduction reactions adapt the quantum bump waveform. However, the contribution of quantum-gain-nonlinearity in light adaptation depends upon the likelihood of multi-photon-hits, which is strictly determined by the number of microvilli and light intensity. Specifically, its contribution to light-adaptation is marginal (≤ 1%) in fly photoreceptors with many thousands of microvilli, because the probability of simultaneous multi-photon-hits on any one microvillus is low even during daylight conditions. However, in cells with fewer sampling units, the impact of quantum-gain-nonlinearity increases with brightening light. PMID:27445779
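The scaling argument, i.e. why multi-photon hits are rare when there are many microvilli, follows from a Poisson approximation to how uniformly falling photons distribute over sampling units. The photon and microvillus numbers below are illustrative assumptions, not measured values from the paper:

```python
import math

def multi_hit_prob(photons_per_window, n_microvilli):
    """Probability that a given microvillus receives two or more photon
    hits within one integration window, assuming photons land
    independently and uniformly, so per-unit counts are approximately
    Poisson with rate lam = photons_per_window / n_microvilli."""
    lam = photons_per_window / n_microvilli
    return 1 - math.exp(-lam) * (1 + lam)    # P(count >= 2)

# 3,000 photons per window: few vs many sampling units
for n in (300, 30000):
    print(n, multi_hit_prob(3000, n))
```

With tens of thousands of microvilli the per-unit rate stays well below one and multi-hits are a sub-percent event, whereas with a few hundred units they dominate, matching the dependence on the number of sampling units described above.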

  17. Meaning in meaninglessness: The propensity to perceive meaningful patterns in coincident events and randomly arranged stimuli is linked to enhanced attention in early sensory processing.

    Science.gov (United States)

    Rominger, Christian; Schulter, Günter; Fink, Andreas; Weiss, Elisabeth M; Papousek, Ilona

    2018-05-01

    Perception of objectively independent events or stimuli as being significantly connected and the associated proneness to perceive meaningful patterns constitute part of the positive symptoms of schizophrenia, which are associated with altered attentional processes in lateralized speech perception. Since perceiving meaningful patterns is to some extent already prevalent in the general population, the aim of the study was to investigate whether the propensity to experience meaningful patterns in co-occurring events and random stimuli may be associated with similar altered attentional processes in lateralized speech perception. Self-reported and behavioral indicators of the perception of meaningful patterns were assessed in non-clinical individuals, along with EEG auditory evoked potentials during the performance of an attention related lateralized speech perception task (Dichotic Listening Test). A greater propensity to perceive meaningful patterns was associated with higher N1 amplitudes of the evoked potentials to the onset of the dichotically presented consonant-vowel syllables, indicating enhanced automatic attention in early sensory processing. The study suggests that more basic mechanisms in how people associate events may play a greater role in the cognitive biases that are manifest in personality expressions such as positive schizotypy, rather than that positive schizotypy moderates these cognitive biases directly. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Territory and nest site selection patterns by Grasshopper Sparrows in southeastern Arizona

    Science.gov (United States)

    Ruth, Janet M.; Skagen, Susan K.

    2017-01-01

    Grassland bird populations are showing some of the greatest rates of decline of any North American birds, prompting measures to protect and improve important habitat. We assessed how vegetation structure and composition, habitat features often targeted for management, affected territory and nest site selection by Grasshopper Sparrows (Ammodramus savannarum ammolegus) in southeastern Arizona. To identify features important to males establishing territories, we compared vegetation characteristics of known territories and random samples on 2 sites over 5 years. We examined habitat selection patterns of females by comparing characteristics of nest sites with territories over 3 years. Males selected territories in areas of sparser vegetation structure and more tall shrubs (>2 m) than random plots on the site with low shrub densities. Males did not select territories based on the proportion of exotic grasses. Females generally located nest sites in areas with lower small shrub (1–2 m tall) densities than territories overall when possible and preferentially selected native grasses for nest construction. Whether habitat selection was apparent depended upon the range of vegetation structure that was available. We identified an upper threshold above which grass structure seemed to be too high and dense for Grasshopper Sparrows. Our results suggest that some management that reduces vegetative structure may benefit this species in desert grasslands at the nest and territory scale. However, we did not assess initial male habitat selection at a broader landscape scale where their selection patterns may be different and could be influenced by vegetation density and structure outside the range of values sampled in this study.

  19. Health indicators: eliminating bias from convenience sampling estimators.

    Science.gov (United States)

    Hedt, Bethany L; Pagano, Marcello

    2011-02-28

    Public health practitioners are often called upon to make inference about a health indicator for a population at large when the sole available information are data gathered from a convenience sample, such as data gathered on visitors to a clinic. These data may be of the highest quality and quite extensive, but the biases inherent in a convenience sample preclude the legitimate use of powerful inferential tools that are usually associated with a random sample. In general, we know nothing about those who do not visit the clinic beyond the fact that they do not visit the clinic. An alternative is to take a random sample of the population. However, we show that this solution would be wasteful if it excluded the use of available information. Hence, we present a simple annealing methodology that combines a relatively small, and presumably far less expensive, random sample with the convenience sample. This allows us to not only take advantage of powerful inferential tools, but also provides more accurate information than that available from just using data from the random sample alone. Copyright © 2011 John Wiley & Sons, Ltd.

  20. Computed tomography of the brain, hepatotoxic drugs and high alcohol consumption in male alcoholic patients and a random sample from the general male population

    Energy Technology Data Exchange (ETDEWEB)

    Muetzell, S. (Univ. Hospital of Uppsala (Sweden). Dept. of Family Medicine)

    1992-01-01

    Computed tomography (CT) of the brain was performed in a random sample of 195 men and in 211 male alcoholic patients admitted for the first time during a period of two years, drawn from the same geographically limited area of Greater Stockholm as the sample. Laboratory tests were performed, including liver and pancreatic tests. Toxicological screening was performed and the consumption of hepatotoxic drugs was also investigated. The groups were then subdivided with respect to alcohol consumption and use of hepatotoxic drugs: group IA, men from the random sample with low or moderate alcohol consumption and no use of hepatotoxic drugs; IB, men from the random sample with low or moderate alcohol consumption with use of hepatotoxic drugs; IIA, alcoholic inpatients with use of alcohol and no drugs; and IIB, alcoholic inpatients with use of alcohol and drugs. Group IIB was found to have a higher incidence of cortical and subcortical changes than group IA. Group IB had a higher incidence of subcortical changes than group IA, and the two groups differed only in drug use. Groups IIB and IIA likewise differed only in drug use, and IIB had a higher incidence of brain damage except for anterior horn index and wide cerebellar sulci indicating vermian atrophy. Significantly higher serum levels of bilirubin, GGT, ASAT, ALAT, CK, LD, and amylase were found in IIB. The results indicate that drug use influences the incidence of cortical and subcortical aberrations, except for the anterior horn index. It is concluded that the groups with alcohol abuse who used hepatotoxic drugs showed a picture of cortical changes (wide transport sulci and clear-cut high-grade cortical changes) and also of subcortical aberrations, expressed as an increased widening of the third ventricle.

  2. The influence of psychoeducation on regulating biological rhythm in a sample of patients with bipolar II disorder: a randomized clinical trial

    Directory of Open Access Journals (Sweden)

    Faria AD

    2014-06-01

Augusto Duarte Faria,1 Luciano Dias de Mattos Souza,2 Taiane de Azevedo Cardoso,2 Karen Amaral Tavares Pinheiro,2 Ricardo Tavares Pinheiro,2 Ricardo Azevedo da Silva,2 Karen Jansen2. 1Department of Clinical and Health Psychology, Universidade Federal do Rio Grande – FURG, Rio Grande, RS, Brazil; 2Health and Behavior Postgraduate Program, Universidade Católica de Pelotas – UCPEL, Pelotas, RS, Brazil. Introduction: Changes in biological rhythm are among the various characteristics of bipolar disorder and have long been associated with the functional impairment of the disease. There are only a few viable psychosocial interventions that deal with this specific topic; one of them is psychoeducation, a model that, although used by practitioners for some time, has only recently been shown to be effective in clinical practice. Aim: To assess whether patients undergoing a psychosocial intervention in addition to pharmacological treatment regulate their biological rhythm better than those using medication alone. Method: This study is a randomized clinical trial that compares a standard medication intervention to a combined intervention of medication and psychoeducation. Biological rhythm was evaluated with the Biological Rhythm Interview of Assessment in Neuropsychiatry, an 18-item scale covering four areas (sleep, activity, social rhythm, and eating pattern). The combined intervention consisted of medication and a short-term psychoeducation model summarized in a protocol of six individual sessions of 1 hour each. Results: The sample consisted of 61 patients with bipolar II disorder; during the study there were 14 losses to follow-up, so the final sample comprised 45 individuals (26 in the standard intervention and 19 in the combined). In this sample and time period, the combined treatment of medication and psychoeducation had no statistically significant impact on the

  3. Dietary patterns, nutrition knowledge and lifestyle: associations with blood pressure in a sample of Australian adults (the Food BP study).

    Science.gov (United States)

    Khalesi, S; Sharma, S; Irwin, C; Sun, J

    2016-10-01

    This study examined the association between dietary patterns, nutrition knowledge and lifestyle with blood pressure (BP) in a sample of Australian adults. Adults with normal and high BP were included in a cross-sectional study. Dietary intake data was collected using a Food Frequency Questionnaire. Nutrition knowledge and lifestyle surveys were included in the questionnaire. Dietary patterns were extracted using factor analysis followed by cluster analysis. Associations were analysed using logistic regression. Four hundred and seven participants were included. Three dietary patterns were identified: Western; Snack and alcohol; and Balanced. Participants with high BP had a higher intake of Western and a lower intake of Balanced dietary pattern. A significant and higher frequency of discretionary foods and oils consumption, as well as lower nutrition knowledge score and activity frequency, were observed in the high BP group. Regression analysis indicated that the intake of Western and Snack and alcohol dietary patterns increases the likelihood of having high BP by 2.40 (95% confidence interval (CI): 1.28-4.49) and 2.76 (95% CI: 1.52-5.00), respectively, when nutrition knowledge and lifestyle were controlled for as moderator variables. The likelihood of high BP was not associated with nutrition knowledge, but increased with physical inactivity. This study indicates that poor dietary patterns and inactivity are associated with increases in the likelihood of high BP, and the association is not influenced by nutrition knowledge. These findings indicate the importance of developing public health strategies with an emphasis on improving the dietary patterns of individuals to prevent and control high BP in Australian adults.

  4. Application of serum SELDI proteomic patterns in diagnosis of lung cancer

    Directory of Open Access Journals (Sweden)

    Zhou Bin

    2005-07-01

Background: Currently, no satisfactory biomarkers are available to screen for lung cancer. The Surface-Enhanced Laser Desorption/Ionization Time-of-Flight Mass Spectrometry ProteinChip system (SELDI-TOF-MS) is one of the techniques currently used to identify biomarkers for cancers. The aim of this study is to explore the application of serum SELDI proteomic patterns to distinguish lung cancer patients from healthy individuals. Methods: A total of 208 serum samples, including 158 lung cancer patients and 50 healthy individuals, were randomly divided into a training set (11 sera from patients with stage I/II lung cancer, 63 from patients with stage III/IV lung cancer and 20 from healthy controls) and a blinded test set (43 sera from patients with stage I/II lung cancer, 41 from patients with stage III/IV lung cancer and 30 from healthy controls). All samples were analyzed by SELDI technology. The spectra were generated on weak cation exchange (WCX2) chips, and protein peak clustering and classification analyses were performed using Ciphergen Biomarker Wizard and Biomarker Pattern software, respectively. We additionally determined Cyfra21-1 and NSE in the 208 serum samples using an electrochemiluminescent immunoassay. Results: Five protein peaks at 11493, 6429, 8245, 5335 and 2538 Da were automatically chosen as a biomarker pattern in the training set. When the SELDI marker pattern was tested with the blinded test set, it yielded a sensitivity of 86.9%, a specificity of 80.0% and a positive predictive value of 92.4%. The sensitivities provided by Cyfra21-1 and NSE, used individually or in combination, were significantly lower than that of the SELDI marker pattern. Conclusion: These results suggest that serum SELDI protein profiling can distinguish lung cancer patients, especially NSCLC patients, from normal subjects with relatively high sensitivity and specificity, and that SELDI-TOF-MS is a potential tool

  5. Toward cost-efficient sampling methods

    Science.gov (United States)

    Luo, Peng; Li, Yongli; Wu, Chong; Zhang, Guijie

    2015-09-01

Sampling methods have received much attention in the field of complex networks in general and statistical physics in particular. This paper proposes two new sampling methods based on the idea that a small set of high-degree vertices can carry most of the structural information of a complex network. The two proposed methods sample high-degree nodes efficiently, so they remain useful even when the sampling rate is low, which makes them cost-efficient. The first new method builds on the widely used stratified random sampling (SRS) method, and the second improves the well-known snowball sampling (SBS) method. To demonstrate the validity and accuracy of the two new methods, we compare them with existing sampling methods on three commonly used simulated networks (scale-free, random, and small-world) and on two real networks. The experimental results show that the two proposed methods recover the true network structure characteristics, as reflected by the clustering coefficient, Bonacich centrality and average path length, much better than existing sampling methods, especially when the sampling rate is low.
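The core idea of the record above, that a few high-degree vertices carry most of a network's structural information, can be sketched with a toy degree-biased node sampler. This is an illustrative sketch only, not the authors' SRS- or SBS-based algorithms; the graph and parameters are invented:

```python
import random

def degree_biased_sample(adj, k, rng):
    """Draw k distinct nodes with probability proportional to degree,
    by repeated weighted draws with duplicates rejected."""
    nodes = list(adj)
    weights = [len(adj[v]) for v in nodes]
    chosen = set()
    while len(chosen) < k:
        chosen.add(rng.choices(nodes, weights=weights, k=1)[0])
    return chosen

# Toy graph: node 0 is a hub joined to every other node; nodes 1..9 form a ring.
adj = {v: set() for v in range(10)}
for v in range(1, 10):
    adj[0].add(v)
    adj[v].add(0)
for v in range(1, 10):
    u = v % 9 + 1          # ring neighbour among nodes 1..9
    adj[v].add(u)
    adj[u].add(v)

rng = random.Random(1)
counts = {v: 0 for v in adj}
for _ in range(2000):
    for v in degree_biased_sample(adj, 3, rng):
        counts[v] += 1
# The degree-9 hub appears in samples far more often than any degree-3 ring node.
```

Even at a 30% sampling rate, the hub (the structurally dominant vertex) is nearly always retained, which is the behaviour the paper's methods aim for.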

  6. STUDY ON THE WORDS AND MUSICAL PATTERNS OF THE NATIONAL ANTHEMS OF THE MIDDLE EASTERN COUNTRIES

    Directory of Open Access Journals (Sweden)

    Gulden Filiz ONAL

    2017-02-01

Anthems are poetic verses sung in melodic harmony, expressing a nation's feelings, enthusiasm, hopes, joy of coexistence and determination to survive. Through anthems, people reflect their national values, the historical background of their countries, and their way of thinking. National anthems thus have a unifying power, creating a common ground for individuals that sustains continuity and the sense of belonging to a country. This study examines the music and lyrics of the national anthems of the Middle Eastern countries, where major conflicts have taken place for years. Random sampling was used to form the sample group, which is composed of the United Arab Emirates, Israel, Lebanon, Egypt, Syria and Turkey. Some basic musical elements of these national anthems were investigated (tone, tempo, meter signature and rhythm patterns). Four of the anthems are composed in major keys and two in minor keys. All are in 4/4 meter at a walking-pace tempo. As for rhythm patterns, one anthem uses 3 different patterns, two use 4, one uses 5, and two use 6. The words of these anthems, as literary verses, focus on many topics, mainly soldiers, flags and homelands, as well as political developments and national heroes...

  7. Investigating causal associations between use of nicotine, alcohol, caffeine and cannabis: a two-sample bidirectional Mendelian randomization study.

    Science.gov (United States)

    Verweij, Karin J H; Treur, Jorien L; Vink, Jacqueline M

    2018-07-01

    Epidemiological studies consistently show co-occurrence of use of different addictive substances. Whether these associations are causal or due to overlapping underlying influences remains an important question in addiction research. Methodological advances have made it possible to use published genetic associations to infer causal relationships between phenotypes. In this exploratory study, we used Mendelian randomization (MR) to examine the causality of well-established associations between nicotine, alcohol, caffeine and cannabis use. Two-sample MR was employed to estimate bidirectional causal effects between four addictive substances: nicotine (smoking initiation and cigarettes smoked per day), caffeine (cups of coffee per day), alcohol (units per week) and cannabis (initiation). Based on existing genome-wide association results we selected genetic variants associated with the exposure measure as an instrument to estimate causal effects. Where possible we applied sensitivity analyses (MR-Egger and weighted median) more robust to horizontal pleiotropy. Most MR tests did not reveal causal associations. There was some weak evidence for a causal positive effect of genetically instrumented alcohol use on smoking initiation and of cigarettes per day on caffeine use, but these were not supported by the sensitivity analyses. There was also some suggestive evidence for a positive effect of alcohol use on caffeine use (only with MR-Egger) and smoking initiation on cannabis initiation (only with weighted median). None of the suggestive causal associations survived corrections for multiple testing. Two-sample Mendelian randomization analyses found little evidence for causal relationships between nicotine, alcohol, caffeine and cannabis use. © 2018 Society for the Study of Addiction.
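The inverse-variance weighted (IVW) estimate that underlies most two-sample MR analyses can be computed directly from published summary statistics. The numbers below are synthetic, not drawn from the study, and this minimal sketch omits the MR-Egger and weighted-median sensitivity analyses the authors also apply:

```python
def ivw_estimate(beta_exp, beta_out, se_out):
    """Inverse-variance weighted two-sample MR estimate: a weighted
    regression of SNP-outcome effects on SNP-exposure effects through
    the origin, with first-order weights 1 / se_out^2."""
    num = sum(bx * by / se ** 2 for bx, by, se in zip(beta_exp, beta_out, se_out))
    den = sum(bx ** 2 / se ** 2 for bx, se in zip(beta_exp, se_out))
    return num / den

# Synthetic summary statistics with a true causal effect of 0.5 and no noise.
beta_exp = [0.10, 0.20, 0.30]   # SNP-exposure associations
beta_out = [0.05, 0.10, 0.15]   # SNP-outcome associations
se_out = [0.01, 0.02, 0.03]     # standard errors of the outcome associations
estimate = ivw_estimate(beta_exp, beta_out, se_out)   # exactly 0.5 here
```

Each SNP's Wald ratio (beta_out / beta_exp) is 0.5, so the weighted combination recovers 0.5 regardless of the weights; with noisy real data the weights determine how much each instrument contributes.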

  8. Maximum-entropy networks pattern detection, network reconstruction and graph combinatorics

    CERN Document Server

    Squartini, Tiziano

    2017-01-01

    This book is an introduction to maximum-entropy models of random graphs with given topological properties and their applications. Its original contribution is the reformulation of many seemingly different problems in the study of both real networks and graph theory within the unified framework of maximum entropy. Particular emphasis is put on the detection of structural patterns in real networks, on the reconstruction of the properties of networks from partial information, and on the enumeration and sampling of graphs with given properties.  After a first introductory chapter explaining the motivation, focus, aim and message of the book, chapter 2 introduces the formal construction of maximum-entropy ensembles of graphs with local topological constraints. Chapter 3 focuses on the problem of pattern detection in real networks and provides a powerful way to disentangle nontrivial higher-order structural features from those that can be traced back to simpler local constraints. Chapter 4 focuses on the problem o...
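A minimal flavour of a maximum-entropy ensemble with given expected degrees is its sparse-regime (Chung-Lu) approximation, in which the exact Lagrange-multiplier solution the book derives reduces to p_ij ≈ w_i·w_j / Σw. The weights below are illustrative, and this sketch is the approximation, not the book's exact constrained ensembles:

```python
import itertools
import random

def chung_lu_probs(w):
    """Pairwise connection probabilities p_ij = w_i * w_j / sum(w),
    capped at 1: the sparse-regime approximation of the maximum-entropy
    ensemble with given expected degrees."""
    s = float(sum(w))
    return {(i, j): min(1.0, w[i] * w[j] / s)
            for i, j in itertools.combinations(range(len(w)), 2)}

def sample_graph(p, rng):
    """Draw one graph from the ensemble: each edge appears independently."""
    return {e for e, pe in p.items() if rng.random() < pe}

w = [1, 2, 2, 3]                # target expected degrees
p = chung_lu_probs(w)
s = sum(w)
# Analytic expected degree of node i: sum over j != i of p_ij
# = w_i * (s - w_i) / s, which approaches w_i when w_i^2 << s.
exp_deg = [sum(pe for e, pe in p.items() if i in e) for i in range(len(w))]
g = sample_graph(p, random.Random(0))
```

Independence of edges given the probabilities is exactly what maximum entropy buys: no structure beyond the imposed degree constraints.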

  9. A randomized trial of the effect of a plant-based dietary pattern on additional breast cancer events and survival: the Women's Healthy Eating and Living (WHEL) Study.

    Science.gov (United States)

    Pierce, John P; Faerber, Susan; Wright, Fred A; Rock, Cheryl L; Newman, Vicky; Flatt, Shirley W; Kealey, Sheila; Jones, Vicky E; Caan, Bette J; Gold, Ellen B; Haan, Mary; Hollenbach, Kathryn A; Jones, Lovell; Marshall, James R; Ritenbaugh, Cheryl; Stefanick, Marcia L; Thomson, Cynthia; Wasserman, Linda; Natarajan, Loki; Thomas, Ronald G; Gilpin, Elizabeth A

    2002-12-01

    The Women's Healthy Eating and Living (WHEL) Study is a multisite randomized controlled trial of the effectiveness of a high-vegetable, low-fat diet, aimed at markedly raising circulating carotenoid concentrations from food sources, in reducing additional breast cancer events and early death in women with early-stage invasive breast cancer (within 4 years of diagnosis). The study randomly assigned 3088 such women to an intensive diet intervention or to a comparison group between 1995 and 2000 and is expected to follow them through 2006. Two thirds of these women were under 55 years of age at randomization. This research study has a coordinating center and seven clinical sites. Randomization was stratified by age, stage of tumor and clinical site. A comprehensive intervention program that includes intensive telephone counseling, cooking classes and print materials helps shift the dietary pattern of women in the intervention. Through an innovative telephone counseling program, dietary counselors encourage women in the intervention group to meet the following daily behavioral targets: five vegetable servings, 16 ounces of vegetable juice, three fruit servings, 30 g of fiber and 15-20% energy from fat. Adherence assessments occur at baseline, 6, 12, 24 or 36, 48 and 72 months. These assessments can include dietary intake (repeated 24-hour dietary recalls and food frequency questionnaire), circulating carotenoid concentrations, physical measures and questionnaires about health symptoms, quality of life, personal habits and lifestyle patterns. Outcome assessments are completed by telephone interview every 6 months with medical record verification. We will assess evidence of effectiveness by the length of the breast cancer event-free interval, as well as by overall survival separately in all the women in the study as well as specifically in women under and over the age of 55 years.
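Stratified randomization of the kind used in WHEL (by age, tumor stage and clinical site) is often implemented with permuted blocks inside each stratum. The following is a hedged sketch with invented strata and arm labels, not the trial's actual allocation procedure:

```python
import random
from collections import Counter

def stratified_block_randomize(participants, stratum_of,
                               arms=("diet intervention", "comparison"),
                               block_size=4, rng=None):
    """Assign arms within each stratum using shuffled permuted blocks,
    keeping the treatment groups balanced inside every stratum."""
    rng = rng or random.Random()
    queues = {}        # stratum -> remaining assignments in the current block
    assignment = {}
    for pid in participants:
        s = stratum_of[pid]
        if not queues.get(s):                     # start a fresh block
            block = list(arms) * (block_size // len(arms))
            rng.shuffle(block)
            queues[s] = block
        assignment[pid] = queues[s].pop()
    return assignment

participants = list(range(16))
# Two illustrative strata (e.g. age < 55 vs >= 55 at randomization).
stratum_of = {pid: "under55" if pid % 2 == 0 else "55plus" for pid in participants}
asg = stratified_block_randomize(participants, stratum_of, rng=random.Random(7))
```

With 8 participants per stratum and blocks of 4, every stratum ends with an exact 4/4 split between arms, which is the balance stratification is meant to guarantee.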

  10. Genotyping-by-sequencing for Populus population genomics: an assessment of genome sampling patterns and filtering approaches.

    Directory of Open Access Journals (Sweden)

    Martin P Schilling

Continuing advances in nucleotide sequencing technology are inspiring a suite of genomic approaches in studies of natural populations. Researchers are faced with data management and analytical scales that are increasing by orders of magnitude. With such dramatic advances comes a need to understand biases and error rates, which can be propagated and magnified in large-scale data acquisition and processing. Here we assess genomic sampling biases and the effects of various population-level data filtering strategies in a genotyping-by-sequencing (GBS) protocol. We focus on data from two species of Populus, because this genus has a relatively small genome and is emerging as a target for population genomic studies. We estimate the proportions and patterns of genomic sampling by examining the Populus trichocarpa genome (Nisqually-1), and demonstrate a pronounced bias towards coding regions when using the methylation-sensitive ApeKI restriction enzyme in this species. Using population-level data from a closely related species (P. tremuloides), we also investigate various approaches for filtering GBS data to retain high-depth, informative SNPs that can be used for population genetic analyses. We find that a data filter that includes the designation of ambiguous alleles resulted in metrics of population structure and Hardy-Weinberg equilibrium that were most consistent with previous studies of the same populations based on other genetic markers. Analyses of the filtered data (27,910 SNPs) also resulted in patterns of heterozygosity and population structure similar to a previous study using microsatellites. Our application demonstrates that technically and analytically simple approaches can readily be developed for population genomics of natural populations.

  11. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria
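The simplest cases treated by sample size methodology, estimating a mean or a proportion within a given margin of error under a normal approximation, fit in a few lines. A sketch with illustrative numbers:

```python
import math

def n_for_mean(sigma, margin, z=1.96):
    """Smallest n so that a mean is estimated within +/- margin at ~95%
    confidence (normal approximation, population sigma assumed known)."""
    return math.ceil((z * sigma / margin) ** 2)

def n_for_proportion(p, margin, z=1.96):
    """Same idea for a proportion, using the binomial variance p * (1 - p)."""
    return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

n1 = n_for_mean(sigma=15, margin=2)         # 217
n2 = n_for_proportion(p=0.5, margin=0.05)   # 385, the familiar worst case
```

Setting p = 0.5 maximizes p(1 - p), which is why 385 (often quoted as "about 400") is the conservative default when no prior estimate of the proportion exists; the book's finite-population and Bayesian refinements shrink these numbers.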

  12. Sampling Methods in Cardiovascular Nursing Research: An Overview.

    Science.gov (United States)

    Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie

    2014-01-01

Cardiovascular nursing research covers a wide array of topics from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling: simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion in the research study, and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from the cardiovascular literature. Understanding the differences between sampling techniques may aid nurses in effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
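The three probability sampling methods that are most mechanical (simple random, systematic, stratified) can be sketched on a toy population; cluster and multi-stage sampling compose these same steps, and the non-probability methods are procedural rather than algorithmic. The population and strata here are invented:

```python
import random

population = list(range(100))    # stand-in ids for 100 eligible patients

def simple_random(pop, n, rng):
    """Every element has an equal, independent chance of selection."""
    return rng.sample(pop, n)

def systematic(pop, n, rng):
    """Random start, then every k-th element, where k = N // n."""
    k = len(pop) // n
    start = rng.randrange(k)
    return pop[start::k][:n]

def stratified(pop, stratum_of, n_per_stratum, rng):
    """A simple random sample drawn inside each stratum."""
    strata = {}
    for x in pop:
        strata.setdefault(stratum_of(x), []).append(x)
    return [x for members in strata.values()
            for x in rng.sample(members, n_per_stratum)]

rng = random.Random(3)
srs = simple_random(population, 10, rng)
sys_sample = systematic(population, 10, rng)
# Invented strata: ids 0-49 vs 50-99 (say, two clinics).
strat = stratified(population, lambda x: x // 50, 5, rng)
```

The stratified call guarantees 5 participants from each clinic, which simple random sampling only achieves on average.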

  13. Sleep Duration and Patterns in Chinese Older Adults: a Comprehensive Meta-analysis.

    Science.gov (United States)

    Lu, Li; Wang, Shi-Bin; Rao, Wen-Wang; Ungvari, Gabor S; Ng, Chee H; Chiu, Helen F K; Zhang, Juan; Kou, Changgui; Jia, Fu-Jun; Xiang, Yu-Tao

    2017-01-01

This meta-analysis examined the mean sleep duration and sleep patterns in the Chinese older adult population. A literature search was systematically conducted covering major English (PubMed, Embase and PsycINFO) and Chinese (Chinese National Knowledge Infrastructure (CNKI), WanFang and SinoMed) databases. Data from studies reporting the mean and standard deviation of sleep duration and/or the proportion of short and long sleep durations in Chinese older adults were extracted and pooled using random-effects models. Subgroup analyses were conducted according to gender, region, area, survey time and sample size. A total of 36 studies with 150,616 subjects were included in the analyses. The pooled mean sleep duration across the 21 studies with available data was 6.82 hours/day (95% CI: 6.59-7.05 hours/day). The estimated proportions of long sleep duration were … (>8 hours/day) and 17.6% (95% CI: 12.4%-22.9%) (>9 hours/day). Given the adverse effects of unhealthy sleep patterns, health professionals should pay more attention to sleep patterns in this population in China.
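Pooling study means with a random-effects model, as this meta-analysis does, is commonly done with the DerSimonian-Laird estimator of between-study variance. A sketch with invented study-level inputs, not the meta-analysis's actual data:

```python
def dersimonian_laird(means, variances):
    """Random-effects pooled mean using the DerSimonian-Laird tau^2
    estimator: fixed-effect weights give Cochran's Q, Q gives tau^2,
    and the final weights are 1 / (v_i + tau^2)."""
    k = len(means)
    w = [1.0 / v for v in variances]
    fixed = sum(wi * m for wi, m in zip(w, means)) / sum(w)
    q = sum(wi * (m - fixed) ** 2 for wi, m in zip(w, means))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * m for wi, m in zip(w_re, means)) / sum(w_re)
    return pooled, tau2

# Three hypothetical studies reporting mean sleep duration (hours/day)
# with their squared standard errors.
pooled, tau2 = dersimonian_laird([6.6, 6.8, 7.1], [0.04, 0.01, 0.09])
```

When the studies agree exactly, Q is zero, tau^2 collapses to zero and the estimator reduces to the fixed-effect pooled mean; heterogeneity inflates tau^2 and pulls the weights toward equality.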

  14. Sampling methods for pasture, soil and deposition for radioactivity emergency preparedness in the Nordic countries

    International Nuclear Information System (INIS)

    Isaksson, M.

    2002-01-01

The aim of this work was to compare sampling techniques for pasture, soil and deposition, planned for radioactivity surveillance in emergency situations in the Nordic countries. The basis of the survey was a questionnaire, sent to radiation protection authorities and laboratories. Sampling of pasture is performed with a cutting height between 1 and 5 cm above the ground from an area of about 1 m². The sampling plots are usually randomly positioned. Soil samples, 3 to 20 cores in various patterns, are generally taken by a corer of varying diameter. For deposition sampling, precipitation collectors of different sizes are used. When comparing results, the differences between laboratories should be borne in mind so that proper corrections can be made. It is, however, important to consider that, especially in an emergency situation, the use of standardised methods may worsen the results if these methods are not part of the daily work. (orig.)

  15. The impact of financial incentives on participants' food purchasing patterns in a supermarket-based randomized controlled trial.

    Science.gov (United States)

    Olstad, Dana Lee; Crawford, David A; Abbott, Gavin; McNaughton, Sarah A; Le, Ha Nd; Ni Mhurchu, Cliona; Pollard, Christina; Ball, Kylie

    2017-08-25

The impacts of supermarket-based nutrition promotion interventions might be overestimated if participants shift their proportionate food purchasing away from their usual stores. This study quantified whether participants who received price discounts on fruits and vegetables (FV) in the Supermarket Healthy Eating for Life (SHELf) randomized controlled trial (RCT) shifted their FV purchasing into study supermarkets during the intervention period. Participants were 642 females randomly assigned to a 1) skill-building (n = 160), 2) price reduction (n = 161), 3) combined skill-building and price reduction (n = 160), or 4) control (n = 161) group. Participants self-reported the proportion of FV purchased in study supermarkets at baseline, 3- and 6-months post-intervention. Fisher's exact and χ² tests assessed differences among groups in the proportion of FV purchased in study supermarkets at each time point. Multinomial logistic regression assessed differences among groups in the change in proportionate FV purchasing over time. Post-intervention, 49% of participants purchased ≥50% of their FV in study supermarkets. Compared to all other groups, the price reduction group was approximately twice as likely (RRR: 1.8-2.2) to have increased its proportionate purchasing of FV in study supermarkets from baseline to post-intervention, indicating a shift of FV purchasing into study supermarkets during the intervention period. Unless food purchasing data are available for all sources, differential changes in purchasing patterns can make it difficult to discern the true impacts of nutrition interventions. The SHELf trial is registered with Current Controlled Trials, registration ISRCTN39432901, registered 30 June 2010, retrospectively registered ( http://www.isrctn.com/ISRCTN39432901 ).

  16. Sampling designs and methods for estimating fish-impingement losses at cooling-water intakes

    International Nuclear Information System (INIS)

    Murarka, I.P.; Bodeau, D.J.

    1977-01-01

Several systems for estimating fish impingement at power plant cooling-water intakes are compared to determine the most statistically efficient sampling designs and methods. Compared to a simple random sampling scheme, the stratified systematic random sampling scheme, the systematic random sampling scheme, and the stratified random sampling scheme yield higher efficiencies and better estimators for the parameters in two models of fish impingement as a time-series process. Mathematical results and illustrative examples of the application of the sampling schemes to simulated and real data are given. Some sampling designs applicable to fish-impingement studies are presented in appendixes.
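The efficiency gain from stratification can be seen analytically: when the measured values differ sharply between strata (say, day versus night impingement counts), the stratified estimator of the mean has much smaller variance than simple random sampling. A sketch with invented counts, using the standard finite-population variance formulas rather than the report's time-series models:

```python
def var_srs_mean(values, n):
    """Variance of the sample mean under simple random sampling without
    replacement, including the finite-population correction."""
    N = len(values)
    mean = sum(values) / N
    s2 = sum((x - mean) ** 2 for x in values) / (N - 1)
    return (s2 / n) * (1 - n / N)

def var_stratified_mean(strata, n_total):
    """Variance of the stratified mean with proportional allocation;
    `strata` is a list of lists of population values."""
    N = sum(len(h) for h in strata)
    total = 0.0
    for h in strata:
        Nh = len(h)
        nh = n_total * Nh / N                 # proportional allocation
        mh = sum(h) / Nh
        s2h = sum((x - mh) ** 2 for x in h) / (Nh - 1)
        total += (Nh / N) ** 2 * (s2h / nh) * (1 - nh / Nh)
    return total

# Hypothetical hourly impingement counts: low by day, high by night.
day = [10, 12, 11, 13, 14]
night = [100, 102, 101, 103, 104]
v_srs = var_srs_mean(day + night, 4)
v_strat = var_stratified_mean([day, night], 4)
# Stratifying on time of day removes the large between-stratum variance.
```

Only the within-stratum variability contributes to the stratified estimator, which is why the schemes in the report that exploit temporal structure dominate simple random sampling.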

  17. Pattern of relative growth in cockle Anadara antiquata in Ihamahu coastal waters, Central Maluku

    Science.gov (United States)

    Siahainenia, L.; Tuhumury, S. F.; Uneputty, P. A.; Tuhumury, N. C.

    2018-03-01

Anadara antiquata is an economically important species for fisheries; it belongs to the phylum Mollusca and is also found in Maluku. However, its population density has recently begun to decrease due to heavy exploitation. The objective of the present study was to analyze the relative growth pattern of A. antiquata, including size distribution and growth pattern based on shell dimensions. The study was conducted from December 2016 to February 2017 in Ihamahu coastal waters. Data were collected using purposive random sampling. Every individual of A. antiquata found was measured for length, width and height with a vernier caliper to the nearest 0.01 mm. Data were analyzed using SPSS 20 and Microsoft Excel software. The results indicated that A. antiquata is mostly found in seagrass beds with a muddy substrate. The size distribution of shell dimensions differed between sampling occasions. Overall, length ranged from 15.87 mm to 57.5 mm, width from 15.50 mm to 48.60 mm and height from 9.36 mm to 35.9 mm. The population consisted of juvenile and mature individuals, with the mature size class (> 30 mm) dominant. A. antiquata showed an allometric relative growth pattern based on shell dimensions.

  18. Low incidence of clonality in cold water corals revealed through the novel use of a standardized protocol adapted to deep sea sampling

    Science.gov (United States)

    Becheler, Ronan; Cassone, Anne-Laure; Noël, Philippe; Mouchel, Olivier; Morrison, Cheryl L.; Arnaud-Haond, Sophie

    2017-11-01

    Sampling in the deep sea is a technical challenge, which has hindered the acquisition of robust datasets that are necessary to determine the fine-grained biological patterns and processes that may shape genetic diversity. Estimates of the extent of clonality in deep-sea species, despite the importance of clonality in shaping the local dynamics and evolutionary trajectories, have been largely obscured by such limitations. Cold-water coral reefs along European margins are formed mainly by two reef-building species, Lophelia pertusa and Madrepora oculata. Here we present a fine-grained analysis of the genotypic and genetic composition of reefs occurring in the Bay of Biscay, based on an innovative deep-sea sampling protocol. This strategy was designed to be standardized, random, and allowed the georeferencing of all sampled colonies. Clonal lineages discriminated through their Multi-Locus Genotypes (MLG) at 6-7 microsatellite markers could thus be mapped to assess the level of clonality and the spatial spread of clonal lineages. High values of clonal richness were observed for both species across all sites suggesting a limited occurrence of clonality, which likely originated through fragmentation. Additionally, spatial autocorrelation analysis underlined the possible occurrence of fine-grained genetic structure in several populations of both L. pertusa and M. oculata. The two cold-water coral species examined had contrasting patterns of connectivity among canyons, with among-canyon genetic structuring detected in M. oculata, whereas L. pertusa was panmictic at the canyon scale. This study exemplifies that a standardized, random and georeferenced sampling strategy, while challenging, can be applied in the deep sea, and associated benefits outlined here include improved estimates of fine grained patterns of clonality and dispersal that are comparable across sites and among species.

  19. Exact distribution of a pattern in a set of random sequences generated by a Markov source: applications to biological data

    Directory of Open Access Journals (Sweden)

    Regad Leslie

    2010-01-01

    Full Text Available Abstract Background In bioinformatics it is common to search for a pattern of interest in a potentially large set of rather short sequences (upstream gene regions, proteins, exons, etc.. Although many methodological approaches allow practitioners to compute the distribution of a pattern count in a random sequence generated by a Markov source, no specific developments have taken into account the counting of occurrences in a set of independent sequences. We aim to address this problem by deriving efficient approaches and algorithms to perform these computations both for low and high complexity patterns in the framework of homogeneous or heterogeneous Markov models. Results The latest advances in the field allowed us to use a technique of optimal Markov chain embedding based on deterministic finite automata to introduce three innovative algorithms. Algorithm 1 is the only one able to deal with heterogeneous models. It also permits to avoid any product of convolution of the pattern distribution in individual sequences. When working with homogeneous models, Algorithm 2 yields a dramatic reduction in the complexity by taking advantage of previous computations to obtain moment generating functions efficiently. In the particular case of low or moderate complexity patterns, Algorithm 3 exploits power computation and binary decomposition to further reduce the time complexity to a logarithmic scale. All these algorithms and their relative interest in comparison with existing ones were then tested and discussed on a toy-example and three biological data sets: structural patterns in protein loop structures, PROSITE signatures in a bacterial proteome, and transcription factors in upstream gene regions. On these data sets, we also compared our exact approaches to the tempting approximation that consists in concatenating the sequences in the data set into a single sequence. 
Conclusions Our algorithms prove to be effective and able to handle real data sets with
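The quantity these algorithms compute exactly can be illustrated with a brute-force Monte Carlo sketch (toy code of my own, not the authors' automaton-based algorithms): simulate a set of independent sequences from a homogeneous first-order Markov source and tally the total pattern count across the set. The uniform transition matrix and the pattern `AT` are arbitrary choices for illustration.

```python
import random
from collections import Counter

def markov_sequence(n, states, trans, rng):
    """Draw a length-n sequence from a homogeneous first-order Markov source
    (uniform initial distribution, an arbitrary choice for this sketch)."""
    s = rng.choice(states)
    seq = [s]
    for _ in range(n - 1):
        r = rng.random()
        acc = 0.0
        for nxt, p in trans[s].items():
            acc += p
            if r < acc:
                s = nxt
                break
        seq.append(s)
    return "".join(seq)

def count_overlapping(seq, pattern):
    """Count possibly overlapping occurrences of pattern in seq."""
    return sum(seq.startswith(pattern, i) for i in range(len(seq) - len(pattern) + 1))

rng = random.Random(0)
states = "ACGT"
trans = {s: {t: 0.25 for t in states} for s in states}  # i.i.d. uniform as a trivial Markov source

# Empirical distribution of the total count of "AT" over a set of 50
# independent length-40 sequences (the exact algorithms deliver this analytically).
totals = Counter()
for _ in range(300):
    totals[sum(count_overlapping(markov_sequence(40, states, trans, rng), "AT")
               for _ in range(50))] += 1
```

Note that the simulation only approximates the distribution; the point of the exact approaches above is precisely to avoid this sampling error, and to avoid the concatenation approximation discussed in the abstract.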

  20. Sampling Assumptions Affect Use of Indirect Negative Evidence in Language Learning.

    Directory of Open Access Journals (Sweden)

    Anne Hsu

    Full Text Available A classic debate in cognitive science revolves around understanding how children learn complex linguistic patterns, such as restrictions on verb alternations and contractions, without negative evidence. Recently, probabilistic models of language learning have been applied to this problem, framing it as a statistical inference from a random sample of sentences. These probabilistic models predict that learners should be sensitive to the way in which sentences are sampled. There are two main types of sampling assumptions that can operate in language learning: strong and weak sampling. Strong sampling, as assumed by probabilistic models, assumes the learning input is drawn from a distribution of grammatical samples from the underlying language and aims to learn this distribution. Thus, under strong sampling, the absence of a sentence construction from the input provides evidence that it has low or zero probability of grammaticality. Weak sampling does not make assumptions about the distribution from which the input is drawn, and thus the absence of a construction from the input is not used as evidence of its ungrammaticality. We demonstrate in a series of artificial language learning experiments that adults can produce behavior consistent with both sets of sampling assumptions, depending on how the learning problem is presented. These results suggest that people use information about the way in which linguistic input is sampled to guide their learning.
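The strong-vs-weak distinction reduces to a difference in likelihood functions, which a tiny Bayesian sketch makes concrete (hypothetical numbers; this is the standard "size principle" formulation, not these authors' specific model). A small language of 2 sentences competes with a large language of 4 sentences after observing n sentences consistent with both:

```python
def posterior_small(n, size_small=2, size_large=4, prior_small=0.5):
    """Strong sampling: a consistent hypothesis h assigns each observed
    sentence likelihood 1/|h|, so smaller consistent languages gain
    evidence from every observation (unseen constructions count against h)."""
    like_small = (1.0 / size_small) ** n
    like_large = (1.0 / size_large) ** n
    num = prior_small * like_small
    return num / (num + (1.0 - prior_small) * like_large)

def posterior_small_weak(n, prior_small=0.5):
    """Weak sampling: any consistent hypothesis has likelihood 1 regardless
    of n, so the absence of a construction carries no evidence."""
    return prior_small

# After 5 sentences consistent with both languages:
# posterior_small(5) == 32/33 (~0.97), while posterior_small_weak(5) stays 0.5
```

The sketch shows why learners' sampling assumptions matter: indirect negative evidence only accumulates under the strong-sampling likelihood.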

  1. Sampling Assumptions Affect Use of Indirect Negative Evidence in Language Learning

    Science.gov (United States)

    2016-01-01

    A classic debate in cognitive science revolves around understanding how children learn complex linguistic patterns, such as restrictions on verb alternations and contractions, without negative evidence. Recently, probabilistic models of language learning have been applied to this problem, framing it as a statistical inference from a random sample of sentences. These probabilistic models predict that learners should be sensitive to the way in which sentences are sampled. There are two main types of sampling assumptions that can operate in language learning: strong and weak sampling. Strong sampling, as assumed by probabilistic models, assumes the learning input is drawn from a distribution of grammatical samples from the underlying language and aims to learn this distribution. Thus, under strong sampling, the absence of a sentence construction from the input provides evidence that it has low or zero probability of grammaticality. Weak sampling does not make assumptions about the distribution from which the input is drawn, and thus the absence of a construction from the input is not used as evidence of its ungrammaticality. We demonstrate in a series of artificial language learning experiments that adults can produce behavior consistent with both sets of sampling assumptions, depending on how the learning problem is presented. These results suggest that people use information about the way in which linguistic input is sampled to guide their learning. PMID:27310576

  2. Experimental phase diagram for random laser spectra

    International Nuclear Information System (INIS)

    El-Dardiry, Ramy G S; Mooiweer, Ronald; Lagendijk, Ad

    2012-01-01

    We systematically study the presence of narrow spectral features in a wide variety of random laser samples. Less gain or stronger scattering are shown to lead to a crossover from spiky to smooth spectra. A decomposition of random laser spectra into a set of Lorentzians provides unprecedented detail in the analysis of random laser spectra. We suggest an interpretation in terms of mode competition that enables an understanding of the observed experimental trends. In this interpretation, smooth random laser spectra are a consequence of competing modes for which the loss and gain are proportional. Spectral spikes are associated with modes that are uncoupled from the mode competition in the bulk of the sample. (paper)

  3. Preference and consumption pattern of biomass fuel in some disregarded villages of Bangladesh

    Energy Technology Data Exchange (ETDEWEB)

    Jashimuddin, M.; Masum, K.M.; Salam, M.A. [Institute of Forestry and Environmental Sciences, Chittagong University, Chittagong 4331 (Bangladesh)

    2006-05-15

    Consumers' preference and consumption pattern of biomass fuel, fuel types and energy use categories have been studied in the disregarded villages of Bangladesh. The study was conducted both in inland and island areas. Biomass energy in the study area was used in domestic cooking, tea stalls, brickfields, bakeries, paddy parboiling and pottery. Assessment of consumption in domestic cooking was done by means of multistage random sampling, and that in the tea stalls, brickfields, bakeries, paddy parboiling and pottery by means of complete enumeration of the sampling unit. Based on homestead size, respondents were categorized into small, medium and large, and a total of 60 homesteads (30 from inland and another 30 from island areas), 20 from each category, were selected randomly for the study. The study revealed that natural gas was entirely absent and that stems, branches and twigs, leaves of trees, agricultural residues, shell and coir of coconut, sawdust, brush wood, rhizomes of bamboo, and cowdung were the biomass fuels used by the respondents. Nine fuelwood species were identified as the most preferred in the study area. The consumption pattern was mostly traditional. Each year, preceding the rainy season, cyclonic action damages a large quantity of biomass energy sources. Though at the initial stage of the rainy season (April-May) there remains a more or less good collection of fuelwood for users as a byproduct of cyclones and storms, the last part of the rainy season (July-August) was identified as the fuel shortage period. (author)

  4. [A comparison of convenience sampling and purposive sampling].

    Science.gov (United States)

    Suen, Lee-Jen Wu; Huang, Hui-Man; Lee, Hao-Hsien

    2014-06-01

    Convenience sampling and purposive sampling are two different sampling methods. This article first explains sampling terms such as target population, accessible population, simple random sampling, intended sample, actual sample, and statistical power analysis. These terms are then used to explain the difference between "convenience sampling" and "purposive sampling." Convenience sampling is a non-probabilistic sampling technique applicable to qualitative or quantitative studies, although it is most frequently used in quantitative studies. In convenience samples, subjects more readily accessible to the researcher are more likely to be included. Thus, in quantitative studies, the opportunity to participate is not equal for all qualified individuals in the target population and study results are not necessarily generalizable to this population. As in all quantitative studies, increasing the sample size increases the statistical power of the convenience sample. In contrast, purposive sampling is typically used in qualitative studies. Researchers who use this technique carefully select subjects based on study purpose with the expectation that each participant will provide unique and rich information of value to the study. As a result, members of the accessible population are not interchangeable and sample size is determined by data saturation, not by statistical power analysis.

  5. Dependence of B1+ and B1- Field Patterns of Surface Coils on the Electrical Properties of the Sample and the MR Operating Frequency.

    Science.gov (United States)

    Vaidya, Manushka V; Collins, Christopher M; Sodickson, Daniel K; Brown, Ryan; Wiggins, Graham C; Lattanzi, Riccardo

    2016-02-01

    In high field MRI, the spatial distribution of the radiofrequency magnetic (B1) field is usually affected by the presence of the sample. For hardware design and to aid interpretation of experimental results, it is important both to anticipate and to accurately simulate the behavior of these fields. Fields generated by a radiofrequency surface coil were simulated using dyadic Green's functions, or experimentally measured over a range of frequencies inside an object whose electrical properties were varied to illustrate a variety of transmit [Formula: see text] and receive [Formula: see text] field patterns. In this work, we examine how changes in polarization of the field and interference of propagating waves in an object can affect the B1 spatial distribution. Results are explained conceptually using Maxwell's equations and intuitive illustrations. We demonstrate that the electrical conductivity alters the spatial distribution of distinct polarized components of the field, causing "twisted" transmit and receive field patterns, and asymmetries between [Formula: see text] and [Formula: see text]. Additionally, interference patterns due to wavelength effects are observed at high field in samples with high relative permittivity and near-zero conductivity, but are not present in lossy samples due to the attenuation of propagating EM fields. This work provides a conceptual framework for understanding B1 spatial distributions for surface coils and can provide guidance for RF engineers.

  6. Scalability on LHS (Latin Hypercube Sampling) samples for use in uncertainty analysis of large numerical models

    International Nuclear Information System (INIS)

    Baron, Jorge H.; Nunez Mac Leod, J.E.

    2000-01-01

    The present paper deals with the utilization of advanced statistical sampling methods to perform uncertainty and sensitivity analysis on numerical models. Such models may represent physical phenomena, logical structures (such as boolean expressions) or other systems, and several of their intrinsic parameters and/or input variables are usually treated simultaneously as random variables. In the present paper a simple method to scale up Latin Hypercube Sampling (LHS) samples is presented, starting with a small sample and duplicating its size at each step, making it possible to reuse the numerical model results already obtained with the smaller sample. The method does not distort the statistical properties of the random variables and does not add any bias to the samples. The result is that a significant reduction in numerical model running time can be achieved (by re-using the previously run samples), while keeping all the advantages of LHS, until an acceptable representation level is achieved in the output variables. (author)
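The doubling idea can be sketched in a few lines (a plausible construction under my own assumptions, not necessarily the authors' exact algorithm): an LHS of size n occupies n distinct half-strata at resolution 2n in every dimension, so the n new points need only fill the complementary half-strata, independently permuted per dimension, and the original n model runs are reused unchanged.

```python
import random

def lhs(n_samples, n_dims, rng):
    """Plain Latin Hypercube Sample on [0,1)^d: exactly one point per
    stratum per dimension, with strata paired by a random permutation."""
    cols = []
    for _ in range(n_dims):
        perm = list(range(n_samples))
        rng.shuffle(perm)
        cols.append([(k + rng.random()) / n_samples for k in perm])
    return [list(p) for p in zip(*cols)]

def double_lhs(points, rng):
    """Scale an LHS from n to 2n points while keeping the n original runs.
    In each dimension the n half-strata (resolution 2n) not already hit by
    an existing point are filled by the new points in random order, so the
    union is again a valid LHS of size 2n."""
    n = len(points)
    d = len(points[0])
    new_cols = []
    for j in range(d):
        occupied = {int(p[j] * 2 * n) for p in points}
        free = [k for k in range(2 * n) if k not in occupied]
        rng.shuffle(free)
        new_cols.append([(k + rng.random()) / (2 * n) for k in free])
    return points + [list(p) for p in zip(*new_cols)]
```

Because each original point sits in a distinct stratum at resolution n, it also sits in a distinct half-stratum at resolution 2n, which is what guarantees the complementary set has exactly n free half-strata per dimension.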

  7. Development of Spatial Distribution Patterns by Biofilm Cells

    DEFF Research Database (Denmark)

    Haagensen, Janus Anders Juul; Hansen, Susse Kirkelund; Bak Christensen, Bjarke

    2015-01-01

    -pattern by Acinetobacter sp. C6. Ecological spatial pattern analyses revealed that the microcolonies were not entirely randomly distributed, and instead arranged in a uniform pattern. Detailed time-lapse confocal microscopy at the single cell level demonstrated that the spatial pattern was the result of an intriguing self......-organization: Small multicellular clusters moved along the surface to fuse with one another to form microcolonies. This active distribution capability was dependent on environmental factors (carbon source, oxygen) and historical contingency (formation of phenotypic variants). The findings of this study are discussed...

  8. Typology of alcohol users based on longitudinal patterns of drinking.

    Science.gov (United States)

    Harrington, Magdalena; Velicer, Wayne F; Ramsey, Susan

    2014-03-01

    Worldwide, alcohol is the most commonly used psychoactive substance. However, heterogeneity among alcohol users has been widely recognized. This paper presents a typology of alcohol users based on an implementation of idiographic methodology to examine longitudinal daily and cyclic (weekly) patterns of alcohol use at the individual level. A secondary data analysis was performed on the pre-intervention data from a large randomized controlled trial. A time series analysis was performed at the individual level, and a dynamic cluster analysis was employed to identify homogeneous longitudinal patterns of drinking behavior at the group level. The analysis employed 180 daily observations of alcohol use in a sample of 177 alcohol users. The first order autocorrelations ranged from -.76 to .72, and seventh order autocorrelations ranged from -.27 to .79. Eight distinct profiles of alcohol users were identified, each characterized by a unique configuration of first and seventh autoregressive terms and longitudinal trajectories of alcohol use. External validation of the profiles confirmed the theoretical relevance of different patterns of alcohol use. Significant differences among the eight subtypes were found on gender, marital status, frequency of drug use, lifetime alcohol dependence, family history of alcohol use and the Short Index of Problems. Our findings demonstrate that individuals can have very different temporal patterns of drinking behavior. The daily and cyclic patterns of alcohol use may be important for designing tailored interventions for problem drinkers. Copyright © 2013 Elsevier Ltd. All rights reserved.
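The first- and seventh-order autocorrelations that define these profiles can be computed directly from a daily series. The sketch below uses illustrative data of my own (not the trial's): a strict weekend pattern produces a lag-7 autocorrelation near 1 and a more moderate lag-1 value.

```python
def autocorr(series, lag):
    """Sample autocorrelation of a daily series at a given lag; lag 7
    captures the weekly (cyclic) component discussed above."""
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series)
    cov = sum((series[t] - mean) * (series[t + lag] - mean) for t in range(n - lag))
    return cov / var

# Illustrative "weekend drinker": 5 dry days then 2 drinking days, repeated
# over roughly the 180-day observation window used in the study.
weekly = ([0] * 5 + [4] * 2) * 26
```

For a perfectly periodic series the lag-7 estimate falls just short of 1 only because the last 7 days have no lag-7 partner.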

  9. Doing the Impossible: A Note on Induction and the Experience of Randomness.

    Science.gov (United States)

    Lopes, Lola L.

    1982-01-01

    The process of induction is formulated as a problem in detecting nonrandomness, or pattern, against a background of randomness, or noise. Experimental and philosophical approaches to human conceptions of randomness are contrasted. The relation between induction and the experience of randomness is discussed in terms of signal-detection theory.…

  10. Image Correlation Pattern Optimization for Micro-Scale In-Situ Strain Measurements

    Science.gov (United States)

    Bomarito, G. F.; Hochhalter, J. D.; Cannon, A. H.

    2016-01-01

    The accuracy and precision of digital image correlation (DIC) is a function of three primary ingredients: image acquisition, image analysis, and the subject of the image. Development of the first two (i.e. image acquisition techniques and image correlation algorithms) has led to widespread use of DIC; however, fewer developments have been focused on the third ingredient. Typically, subjects of DIC images are mechanical specimens with either a natural surface pattern or a pattern applied to the surface. Research in the area of DIC patterns has primarily been aimed at identifying which surface patterns are best suited for DIC, by comparing patterns to each other. Because the easiest and most widespread methods of applying patterns have a high degree of randomness associated with them (e.g., airbrush, spray paint, particle decoration, etc.), less effort has been spent on exact construction of ideal patterns. With the development of patterning techniques such as microstamping and lithography, patterns can be applied to a specimen pixel by pixel from a patterned image. In these cases, especially because the patterns are reused many times, an optimal pattern is sought such that error introduced into DIC from the pattern is minimized. DIC consists of tracking the motion of an array of nodes from a reference image to a deformed image. Every pixel in the images has an associated intensity (grayscale) value, with discretization depending on the bit depth of the image. Because individual pixel matching by intensity value yields a non-unique scale-dependent problem, subsets around each node are used for identification. A correlation criterion is used to find the best match of a particular subset of a reference image within a deformed image. The reader is referred to references for enumerations of typical correlation criteria. As illustrated by Schreier and Sutton, and by Lu and Cary, systematic errors can be introduced by representing the underlying deformation with under
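One widely used correlation criterion is zero-normalized cross-correlation (ZNCC). The sketch below (function names mine) implements it for an integer-pixel subset search, a simplification: real DIC adds sub-pixel shape functions and gray-level interpolation on top of this.

```python
import random

def zncc(a, b):
    """Zero-normalized cross-correlation between two equal-size grayscale
    subsets (lists of rows); invariant to affine intensity changes."""
    fa = [v for row in a for v in row]
    fb = [v for row in b for v in row]
    ma, mb = sum(fa) / len(fa), sum(fb) / len(fb)
    da = [v - ma for v in fa]
    db = [v - mb for v in fb]
    den = (sum(x * x for x in da) * sum(y * y for y in db)) ** 0.5
    return 0.0 if den == 0 else sum(x * y for x, y in zip(da, db)) / den

def best_match(ref_subset, image, size):
    """Exhaustive integer-pixel search: slide a size x size window over the
    deformed image and keep the location maximizing ZNCC."""
    best = None
    for i in range(len(image) - size + 1):
        for j in range(len(image[0]) - size + 1):
            window = [row[j:j + size] for row in image[i:i + size]]
            score = zncc(ref_subset, window)
            if best is None or score > best[0]:
                best = (score, i, j)
    return best

# Synthetic 8-bit "speckle" image; the reference subset is cut from (2, 4).
rng = random.Random(0)
image = [[rng.randrange(256) for _ in range(10)] for _ in range(10)]
ref = [row[4:7] for row in image[2:5]]
```

Because ZNCC normalizes out mean and contrast, it is robust to the lighting changes between reference and deformed images that plague plain sum-of-squared-differences matching.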

  11. Patterns of Fitbit Use and Activity Levels Throughout a Physical Activity Intervention: Exploratory Analysis from a Randomized Controlled Trial.

    Science.gov (United States)

    Hartman, Sheri J; Nelson, Sandahl H; Weiner, Lauren S

    2018-02-05

    There has been a rapid increase in the use of technology-based activity trackers to promote behavior change. However, little is known about how individuals use these trackers on a day-to-day basis or how tracker use relates to increasing physical activity. The aims were to use minute-level data collected from a Fitbit tracker throughout a physical activity intervention to examine patterns of Fitbit use and activity and their relationships with success in the intervention based on ActiGraph-measured moderate to vigorous physical activity (MVPA). Participants included 42 female breast cancer survivors randomized to the physical activity intervention arm of a 12-week randomized controlled trial. The Fitbit One was worn daily throughout the 12-week intervention. The ActiGraph GT3X+ accelerometer was worn for 7 days at baseline (prerandomization) and end of intervention (week 12). Self-reported frequency of looking at activity data on the Fitbit tracker and app or website was collected at week 12. Adherence to wearing the Fitbit was high and stable, with a mean of 88.13% of valid days over 12 weeks (SD 14.49%). Greater adherence to wearing the Fitbit was associated with greater increases in ActiGraph-measured MVPA (b interaction = 0.35, P<…). ©Sheri J Hartman, Sandahl H Nelson, Lauren S Weiner. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 05.02.2018.

  12. Coupling methods for multistage sampling

    OpenAIRE

    Chauvet, Guillaume

    2015-01-01

    Multistage sampling is commonly used for household surveys when there exists no sampling frame, or when the population is scattered over a wide area. Multistage sampling usually introduces a complex dependence in the selection of the final units, which makes asymptotic results quite difficult to prove. In this work, we consider multistage sampling with simple random sampling without replacement at the first stage, and with an arbitrary sampling design for further stages. We consider coupling ...
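The design described, simple random sampling without replacement (SRSWOR) of primary units at the first stage followed by an arbitrary design afterwards, can be sketched as follows. The second stage is chosen here as SRSWOR too, purely for illustration, and all names are hypothetical.

```python
import random

def two_stage_sample(clusters, n_clusters, n_units, rng):
    """First stage: SRSWOR of primary sampling units (clusters).
    Second stage: SRSWOR of final units within each selected cluster,
    standing in for the arbitrary within-cluster design the paper allows."""
    chosen = rng.sample(sorted(clusters), n_clusters)
    return {c: rng.sample(clusters[c], min(n_units, len(clusters[c])))
            for c in chosen}

# Hypothetical frame: households grouped by village (the primary units).
villages = {f"village{v}": [f"v{v}_hh{h}" for h in range(20)] for v in range(8)}
```

The dependence the abstract refers to is visible here: which households can appear in the sample depends entirely on which villages survive the first stage.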

  13. Systematic sampling of discrete and continuous populations: sample selection and the choice of estimator

    Science.gov (United States)

    Harry T. Valentine; David L. R. Affleck; Timothy G. Gregoire

    2009-01-01

    Systematic sampling is easy, efficient, and widely used, though it is not generally recognized that a systematic sample may be drawn from the population of interest with or without restrictions on randomization. The restrictions or the lack of them determine which estimators are unbiased, when using the sampling design as the basis for inference. We describe the...

  14. Preliminary analysis of the nestedness patterns of Montane forest ...

    African Journals Online (AJOL)

    Results show that the species ordering is significantly non-random. The discussion and conclusions focus on the nested subset patterns exhibited by 14 species and, to a lesser extent, 'idiosyncratic' species and islands. Factors that may have contributed to this pattern include selective extinction and colonisation; however, ...

  15. Influence of population versus convenience sampling on sample characteristics in studies of cognitive aging.

    Science.gov (United States)

    Brodaty, Henry; Mothakunnel, Annu; de Vel-Palumbo, Melissa; Ames, David; Ellis, Kathryn A; Reppermund, Simone; Kochan, Nicole A; Savage, Greg; Trollor, Julian N; Crawford, John; Sachdev, Perminder S

    2014-01-01

    We examined whether differences in findings of studies examining mild cognitive impairment (MCI) were associated with recruitment methods by comparing sample characteristics in two contemporaneous Australian studies, using population-based and convenience sampling. The Sydney Memory and Aging Study invited participants randomly from the electoral roll in defined geographic areas in Sydney. The Australian Imaging, Biomarkers and Lifestyle Study of Ageing recruited cognitively normal (CN) individuals via media appeals and MCI participants via referrals from clinicians in Melbourne and Perth. Demographic and cognitive variables were harmonized, and similar diagnostic criteria were applied to both samples retrospectively. CN participants recruited via convenience sampling were younger, better educated, more likely to be married and have a family history of dementia, and performed better cognitively than those recruited via population-based sampling. MCI participants recruited via population-based sampling had better memory performance and were less likely to carry the apolipoprotein E ε4 allele than clinically referred participants but did not differ on other demographic variables. A convenience sample of normal controls is likely to be younger and better functioning, and a convenience MCI sample is likely to perform worse, than a purportedly random sample. Sampling bias should be considered when interpreting findings. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Security patterns in practice designing secure architectures using software patterns

    CERN Document Server

    Fernandez-Buglioni, Eduardo

    2013-01-01

    Learn to combine security theory and code to produce secure systems Security is clearly a crucial issue to consider during the design and implementation of any distributed software architecture. Security patterns are increasingly being used by developers who take security into serious consideration from the creation of their work. Written by the authority on security patterns, this unique book examines the structure and purpose of security patterns, illustrating their use with the help of detailed implementation advice, numerous code samples, and descriptions in UML. Provides

  17. Application of serum SELDI proteomic patterns in diagnosis of lung cancer

    International Nuclear Information System (INIS)

    Yang, Shuan-ying; Xiao, Xue-yuan; Zhang, Wang-gang; Zhang, Li-juan; Zhang, Wei; Zhou, Bin; Chen, Guoan; He, Da-cheng

    2005-01-01

    Currently, no satisfactory biomarkers are available to screen for lung cancer. The Surface-Enhanced Laser Desorption/Ionization Time-of-Flight Mass Spectrometry ProteinChip system (SELDI-TOF-MS) is one of the currently used techniques to identify biomarkers for cancers. The aim of this study is to explore the application of serum SELDI proteomic patterns to distinguish lung cancer patients from healthy individuals. A total of 208 serum samples, including 158 lung cancer patients and 50 healthy individuals, were randomly divided into a training set (including 11 sera from patients with stages I/II lung cancer, 63 from patients with stages III/IV lung cancer and 20 from healthy controls) and a blinded test set (including 43 sera from patients with stages I/II lung cancer, 41 from patients with stages III/IV lung cancer and 30 from healthy controls). All samples were analyzed by SELDI technology. The spectra were generated on weak cation exchange (WCX2) chips, and protein peak clustering and classification analyses were made using Ciphergen Biomarker Wizard and Biomarker Pattern software, respectively. We additionally determined Cyfra21-1 and NSE in the 208 serum samples included in this study using an electrochemiluminescent immunoassay. Five protein peaks at 11493, 6429, 8245, 5335 and 2538 Da were automatically chosen as a biomarker pattern in the training set. When the SELDI marker pattern was tested with the blinded test set, it yielded a sensitivity of 86.9%, a specificity of 80.0% and a positive predictive value of 92.4%. The sensitivities provided by Cyfra21-1 and NSE used individually or in combination were significantly lower than that of the SELDI marker pattern (P < 0.005 or 0.05, respectively). Based on the results of the test set, we found that the SELDI marker pattern showed a sensitivity of 91.4% in the detection of non-small cell lung cancers (NSCLC), which was significantly higher than that in the detection of small cell lung cancers (P < 0.05); The
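The reported rates are consistent with a single 2x2 table on the blinded test set (84 cancers, 30 controls): 73 true positives, 11 false negatives, 24 true negatives and 6 false positives reproduce the quoted 86.9% sensitivity, 80.0% specificity and 92.4% PPV. These counts are back-calculated from the rounded percentages, an inference of mine rather than figures stated in the abstract.

```python
def diagnostic_stats(tp, fn, tn, fp):
    """Sensitivity, specificity and positive predictive value from a
    2x2 confusion table (diseased vs healthy, test positive vs negative)."""
    sensitivity = tp / (tp + fn)   # fraction of diseased correctly flagged
    specificity = tn / (tn + fp)   # fraction of healthy correctly cleared
    ppv = tp / (tp + fp)           # fraction of positives that are diseased
    return sensitivity, specificity, ppv

# Counts inferred from the reported rates on the blinded test set
# (84 cancers, 30 controls); the abstract gives only the percentages.
sens, spec, ppv = diagnostic_stats(tp=73, fn=11, tn=24, fp=6)
```

Note that PPV, unlike sensitivity and specificity, depends on the cancer prevalence in the test set, so the 92.4% figure would not transfer to a screening population.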

  18. Attenuation of species abundance distributions by sampling

    Science.gov (United States)

    Shimadzu, Hideyasu; Darnell, Ross

    2015-01-01

    Quantifying biodiversity aspects such as species presence/absence, richness and abundance is an important challenge for answering scientific and resource management questions. In practice, biodiversity can only be assessed from biological material taken by surveys, a difficult task given limited time and resources. A type of random sampling, often called sub-sampling, is a commonly used technique to reduce the amount of time and effort needed to investigate large quantities of biological samples. However, it is not immediately clear how (sub-)sampling affects the estimate of biodiversity aspects from a quantitative perspective. This paper specifies the effect of (sub-)sampling as attenuation of the species abundance distribution (SAD), and articulates how sampling bias is induced in the SAD by random sampling. The framework presented also reveals some confusion in previous theoretical studies. PMID:26064626
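The attenuation effect can be reproduced with the simplest sub-sampling model, binomial thinning (a standard formulation, not necessarily the paper's exact model; the abundance numbers below are made up):

```python
import random

def subsample_sad(abundances, p, rng):
    """Binomial thinning: each individual is retained independently with
    probability p. Species with no retained individuals vanish from the
    sample, so the observed species abundance distribution (SAD) is an
    attenuated version of the true one."""
    observed = []
    for n in abundances:
        k = sum(rng.random() < p for _ in range(n))
        if k > 0:
            observed.append(k)
    return observed

rng = random.Random(7)
true_sad = [100, 50, 20, 10, 5, 2, 1, 1]  # 8 species, several of them rare
observed = subsample_sad(true_sad, 0.1, rng)
```

Rare species are the most likely to be lost entirely (a singleton survives a 10% sub-sample only 10% of the time), which is exactly why richness estimated from sub-samples is biased downward.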

  19. Sexual Behavior Pattern and Related Factors in Women with Breast Cancer in Iran

    Science.gov (United States)

    Rashidi, Sanaz; Dashti, Forouzandeh

    2016-01-01

    Background: Although most treatment team efforts focus on maintaining the patient's life, sexual issues receive little attention. This study was designed to determine the sexual behavior pattern and related factors in women with breast cancer. Methods: This descriptive-correlational study was performed on 90 women diagnosed with breast cancer who were admitted to Sayed-Al-Shohada hospital of Isfahan in 2010. Sampling was by convenience (non-random sampling), and the sexual behavior pattern was assessed in 3 domains: sexual identity, sexual role and sexual function. The data collection tool was a questionnaire made by the researcher and used after determining its validity and reliability. For data analysis, descriptive-analytic statistics, frequencies, and the ANOVA and Pearson correlation tests were used in the SPSS statistical software (version 16). Results: Cases had 60% desirable sexual identity, 50% desirable sexual role and 40% desirable sexual function, and were able to display 47.61% desirable sexual behavior. Participants whose husbands had an elementary education had more desirable sexual behavior (p<…), as did those who were not working or retired (p<…). Conclusion: Determining the sexual behavior pattern, which is one of the important aspects of health, provides valuable information to nurses and the medical team and will enhance the quality of the services provided. Adopting appropriate strategies and interventions to promote sexual health in breast cancer is recommended. PMID:26925917

  20. Pattern of diseases among rice farmers exposed to pesticides in the MUDA area

    International Nuclear Information System (INIS)

    Syarif Husin Lubis; Jamal Hisham Hashim; Noor Hassim Ismail; Salmaan Hussain Inayat Hussain

    2002-01-01

    The aim of this study was to find out the pattern of diseases among rice farmers exposed to pesticides in the Muda area, Kedah, and to identify those who suffered from these diseases so that they could be referred to the nearest clinic for treatment. A cross-sectional study was conducted in August 1994 to identify the pattern of diseases among rice farmers exposed to pesticides. Cluster random sampling was employed and the sample size was 136 farmers. A guided questionnaire was used for assessing the usage of protective clothing, the symptoms of the farmers' diseases and the distribution of these symptoms. Blood obtained by finger prick was taken for cholesterol, triglyceride and blood glucose analysis using the Reflotron. A medical examination was also conducted. The pattern of diseases detected among the rice farmers shows that 57.4% suffered from pterygium, 81.6% suffered contact dermatitis and 97.1% had central nervous system (CNS) symptoms. Regarding the cholesterol level, 47.1% of rice farmers were found with total cholesterol > 5.2 mmol/L and 40.0% with triglycerides > 2.3 mmol/L. The blood glucose level measured was > 6.1 mmol/L in 55.03% of rice farmers. Usage of protective equipment among rice farmers was: gloves (68.4%), boots (52.2%), goggles (40.4%), apron (38.3%) and face mask (77.2%). (Author)

  1. The spatial and temporal patterns of odors sampled by lobsters and crabs in a turbulent plume.

    Science.gov (United States)

    Reidenbach, Matthew A; Koehl, M A R

    2011-09-15

    Odors are dispersed across aquatic habitats by turbulent water flow as filamentous, intermittent plumes. Many crustaceans sniff (take discrete samples of ambient water and the odors it carries) by flicking their olfactory antennules. We used planar laser-induced fluorescence to investigate how flicking antennules of different morphologies (long antennules of spiny lobsters, Panulirus argus; short antennules of blue crabs, Callinectes sapidus) sample fluctuating odor signals at different positions in a turbulent odor plume in a flume to determine whether the patterns of concentrations captured can provide information about an animal's position relative to the odor source. Lobster antennules intercept odors during a greater percentage of flicks and encounter higher peak concentrations than do crab antennules, but because crabs flick at higher frequency, the duration of odor-free gaps between encountered odor pulses is similar. For flicking antennules there were longer time gaps between odor encounters as the downstream distance to the odor source decreases, but shorter gaps along the plume centerline than near the edge. In contrast to the case for antennule flicking, almost all odor-free gaps were <500 ms at all positions in the plume if concentration was measured continuously at the same height as the antennules. Variance in concentration is lower and mean concentration is greater near the substratum, where leg chemosensors continuously sample the plume, than in the water where antennules sniff. Concentrations sampled by legs increase as an animal nears an odor source, but decrease for antennules. Both legs and antennules encounter higher concentrations near the centerline than at the edge of the plume.

  2. Iterative random vs. Kennard-Stone sampling for IR spectrum-based classification task using PLS2-DA

    Science.gov (United States)

    Lee, Loong Chuen; Liong, Choong-Yeun; Jemain, Abdul Aziz

    2018-04-01

    External testing (ET) is preferred over auto-prediction (AP) or k-fold cross-validation in estimating the more realistic predictive ability of a statistical model. With IR spectra, the Kennard-Stone (KS) sampling algorithm is often used to split the data into training and test sets, i.e. respectively for model construction and for model testing. On the other hand, iterative random sampling (IRS) has not been the favored choice, though it is theoretically more likely to produce reliable estimation. The aim of this preliminary work is to compare the performances of KS and IRS in sampling a representative training set from an attenuated total reflectance - Fourier transform infrared spectral dataset (of four varieties of blue gel pen inks) for PLS2-DA modeling. The 'best' performance achievable from the dataset is estimated with AP on the full dataset (APF, error). Both IRS (n = 200) and KS were used to split the dataset in the ratio of 7:3. The classic decision rule (i.e. maximum value-based) is employed for new sample prediction via partial least squares - discriminant analysis (PLS2-DA). The error rate of each model was estimated repeatedly via: (a) AP on the full data (APF, error); (b) AP on the training set (APS, error); and (c) ET on the respective test set (ETS, error). A good PLS2-DA model is expected to produce APS, error and ETS, error values similar to the APF, error. Bearing that in mind, the similarities between (a) APS, error vs. APF, error; (b) ETS, error vs. APF, error; and (c) APS, error vs. ETS, error were evaluated using correlation tests (i.e. the Pearson and Spearman's rank tests), using series of PLS2-DA models computed from the KS-set and IRS-set, respectively. Overall, models constructed from the IRS-set exhibit more similarities between the internal and external error rates than the respective KS-set, i.e. less risk of overfitting. In conclusion, IRS is more reliable than KS in sampling a representative training set.
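For reference, the KS algorithm itself is short. The sketch below is a textbook maximin implementation of my own (not the authors' code), using squared Euclidean distance and an O(n²) search:

```python
def kennard_stone(X, n_train):
    """Kennard-Stone selection: seed with the two most distant samples, then
    repeatedly add the candidate whose distance to its nearest already
    selected sample is largest (maximin), so the training set spans the
    spectral space rather than being drawn at random."""
    def d2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    n = len(X)
    # seed pair: the two mutually most distant points
    i0, j0 = max(((i, j) for i in range(n) for j in range(i + 1, n)),
                 key=lambda ij: d2(X[ij[0]], X[ij[1]]))
    selected = [i0, j0]
    remaining = [k for k in range(n) if k != i0 and k != j0]
    while len(selected) < n_train and remaining:
        k = max(remaining, key=lambda r: min(d2(X[r], X[s]) for s in selected))
        selected.append(k)
        remaining.remove(k)
    return selected, remaining  # training indices, test indices
```

This determinism is precisely the property under scrutiny above: KS always pushes the most extreme samples into the training set, which flatters internal error estimates relative to repeated random splits.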

  3. Nuptiality pattern in Saudi Arabia.

    Science.gov (United States)

    Farag, M K; al-Mazrou, Y Y; Baldo, M H; Aziz, K M; al-Shehri, S N

    1995-01-01

    The data of this work are based on the 'Saudi Maternal and Child Health Survey' conducted in 1991, a nationwide house-to-house field survey. The study included 6294 ever-married Saudi females of childbearing age, who represent the target population. They were randomly selected from both urban and rural settings of the five geographical areas of Saudi Arabia. The mean current age of the sample of ever-married women was 31 years, and the mean age at first marriage was 17 and 16 years for urban and rural females, respectively. Education was associated with an upward shift in the age at first marriage. About 20 per cent of the sampled ever-married Saudi females got married before their 15th birthday and 83 per cent before reaching 20 years of age. These percentages are even higher in rural than in urban settings. The overall percentage of women who were currently married at the time of the survey was 96 per cent. The divorced and widowed women of childbearing age constituted 2 and 3 per cent of the total, respectively. Husband's educational level had a positive impact on the frequency of divorce; the latter was highest among women married to illiterate husbands, with a dose-response pattern. Comparison was made with the situation 4 years earlier. An increase in literacy rates among Saudi ever-married women and their husbands was observed; however, a significant (P < 0.05) difference still exists between female and male literacy rates (38 and 71 per cent, respectively). Events of early marriage, before the age of 15 years, became less frequent. There is an overall decline in teenage marriages, which explains a large part of the recent changes observed in nuptiality and birth patterns in Saudi Arabia.

  4. On Angular Sampling Methods for 3-D Spatial Channel Models

    DEFF Research Database (Denmark)

    Fan, Wei; Jämsä, Tommi; Nielsen, Jesper Ødum

    2015-01-01

    This paper discusses generating three dimensional (3D) spatial channel models with emphasis on the angular sampling methods. Three angular sampling methods, i.e. modified uniform power sampling, modified uniform angular sampling, and random pairing methods, are proposed and investigated in detail. The random pairing method, which uses only twenty sinusoids in the ray-based model for generating the channels, presents good results if the spatial channel cluster has a small elevation angle spread. For spatial clusters with large elevation angle spreads, however, the random pairing method would fail, and the other two methods should be considered.

  5. Association of 1-y changes in diet pattern with cardiovascular disease risk factors and adipokines: results from the 1-y randomized Oslo Diet and Exercise Study.

    Science.gov (United States)

    Jacobs, David R; Sluik, Diewertje; Rokling-Andersen, Merethe H; Anderssen, Sigmund A; Drevon, Christian A

    2009-02-01

    We hypothesized that favorable changes in dietary patterns would lead to a reduction in body size and an improvement in metabolic status. The objective was to study changes in diet patterns relative to changes in body size, blood pressure, and circulating concentrations of lipids, glucose, insulin, adiponectin, and other cytokines in the context of a 1-y randomized intervention study. For 1 y, 187 men aged 45 +/- 2 y, approximately 50% of whom met the criteria of the metabolic syndrome, were randomly assigned to a diet protocol (n = 45), an exercise protocol (n = 48), a protocol of diet plus exercise (n = 58), or a control protocol (n = 36). A previously defined a priori diet score was created by summing tertile rankings of 35 food group variables; a higher score generally reflected recommended dietary changes in the trial (mean +/- SD at baseline: 31 +/- 6.5; range: 15-47). Over the study year, the diet score increased by approximately 2 +/- 5.5 in both diet groups, with a decrease of an equivalent amount in the exercise and control groups. The weight change was -3.5 +/- 0.6 kg/10-point change in diet score (P smoking status. Weight change was attenuated but remained significant after adjustment for intervention group and percentage body fat. Subjects with an increased diet score had more favorable changes in other body size variables, systolic blood pressure, and blood lipid, glucose, insulin, and adiponectin concentrations. Change in diet score was unrelated to resistin and several cytokines. The change toward a more favorable diet pattern was associated with improved body size and metabolic profile.

  6. Statistical sampling strategies

    International Nuclear Information System (INIS)

    Andres, T.H.

    1987-01-01

    Systems assessment codes use mathematical models to simulate natural and engineered systems. Probabilistic systems assessment codes carry out multiple simulations to reveal the uncertainty in values of output variables due to uncertainty in the values of the model parameters. In this paper, methods are described for sampling sets of parameter values to be used in a probabilistic systems assessment code. Three Monte Carlo parameter selection methods are discussed: simple random sampling, Latin hypercube sampling, and sampling using two-level orthogonal arrays. Three post-selection transformations are also described: truncation, importance transformation, and discretization. Advantages and disadvantages of each method are summarized.
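Of the three Monte Carlo selection methods listed, simple random sampling and Latin hypercube sampling are easy to sketch (a minimal NumPy illustration of the general techniques, not the assessment code's implementation):

```python
import numpy as np

def simple_random(n_samples, n_dims, rng):
    """Simple random sampling: independent uniform draws on (0, 1)."""
    return rng.uniform(size=(n_samples, n_dims))

def latin_hypercube(n_samples, n_dims, rng):
    """Latin hypercube sampling: exactly one point per equal-probability
    stratum in every dimension, with strata independently permuted."""
    u = rng.uniform(size=(n_samples, n_dims))       # jitter inside each stratum
    lhs = np.empty_like(u)
    for d in range(n_dims):
        perm = rng.permutation(n_samples)           # random stratum order
        lhs[:, d] = (perm + u[:, d]) / n_samples    # stratum k -> [k/n, (k+1)/n)
    return lhs

rng = np.random.default_rng(42)
lhs = latin_hypercube(10, 3, rng)
# marginal stratification: each tenth of (0, 1) holds exactly one point
counts = [np.histogram(lhs[:, d], bins=10, range=(0, 1))[0] for d in range(3)]
```

A post-selection transformation such as truncation or an inverse-CDF (importance) map can then be applied column-wise to these uniform samples.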

  7. Effects of changing the random number stride in Monte Carlo calculations

    International Nuclear Information System (INIS)

    Hendricks, J.S.

    1991-01-01

    This paper reports on a common practice in Monte Carlo radiation transport codes which is to start each random walk a specified number of steps up the random number sequence from the previous one. This is called the stride in the random number sequence between source particles. It is used for correlated sampling or to provide tree-structured random numbers. A new random number generator algorithm for the major Monte Carlo code MCNP has been written to allow adjustment of the random number stride. This random number generator is machine portable. The effects of varying the stride for several sample problems are examined
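The stride mechanism can be illustrated with a linear congruential generator and an O(log k) skip-ahead (a generic sketch; the 32-bit constants below are illustrative, not MCNP's generator):

```python
def lcg_next(x, a, c, m):
    """One step of a linear congruential generator."""
    return (a * x + c) % m

def lcg_skip(x, k, a, c, m):
    """Advance the LCG state by k steps in O(log k) by repeatedly squaring
    the affine map x -> a*x + c; this is how each source particle's random
    walk can start a fixed stride up the sequence from the previous one."""
    ak, ck = 1, 0                                # identity map
    while k:
        if k & 1:
            ak, ck = (a * ak) % m, (a * ck + c) % m
        a, c = (a * a) % m, (a * c + c) % m      # square the map
        k >>= 1
    return (ak * x + ck) % m

# illustrative constants and stride
a, c, m = 1664525, 1013904223, 2 ** 32
seed = 12345
history2_start = lcg_skip(seed, 152917, a, c, m)
```

Because the skip is logarithmic in k, the stride can be adjusted freely for correlated sampling experiments without the cost of stepping through the sequence one number at a time.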

  8. Covering Ground: Movement Patterns and Random Walk Behavior in Aquilonastra anomala Sea Stars.

    Science.gov (United States)

    Lohmann, Amanda C; Evangelista, Dennis; Waldrop, Lindsay D; Mah, Christopher L; Hedrick, Tyson L

    2016-10-01

    The paths animals take while moving through their environments affect their likelihood of encountering food and other resources; thus, models of foraging behavior abound. To collect movement data appropriate for comparison with these models, we used time-lapse photography to track movements of a small, hardy, and easy-to-obtain organism, Aquilonastra anomala sea stars. We recorded the sea stars in a tank over many hours, with and without a food cue. With food present, they covered less distance, as predicted by theory; this strategy would allow them to remain near food. We then compared the paths of the sea stars to three common models of animal movement: Brownian motion, Lévy walks, and correlated random walks; we found that the sea stars' movements most closely resembled a correlated random walk. Additionally, we compared the search performance of models of Brownian motion, a Lévy walk, and a correlated random walk to that of a model based on the sea stars' movements. We found that the behavior of the modeled sea star walk was similar to that of the modeled correlated random walk and the Brownian motion model, but that the sea star walk was slightly more likely than the other walks to find targets at intermediate distances. While organisms are unlikely to follow an idealized random walk in all details, our data suggest that comparing the effectiveness of an organism's paths to those from theory can give insight into the organism's actual movement strategy. Finally, automated optical tracking of invertebrates proved feasible, and A. anomala was revealed to be a tractable, 2D-movement study system.
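The three candidate movement models differ mainly in how turning angles are drawn; a correlated random walk, for instance, concentrates each new heading around the previous one (a minimal sketch of the general model, not the authors' tracking or fitting code):

```python
import numpy as np

def correlated_random_walk(n_steps, turn_sd, step_len, rng):
    """Correlated random walk: heading(t) = heading(t-1) + normal turn.
    A small turn_sd gives straight, highly correlated paths; a very large
    turn_sd approaches uncorrelated (Brownian-like) motion."""
    turns = rng.normal(0.0, turn_sd, size=n_steps)
    headings = np.cumsum(turns)                    # persistence between steps
    steps = step_len * np.column_stack([np.cos(headings), np.sin(headings)])
    return np.vstack([[0.0, 0.0], np.cumsum(steps, axis=0)])

rng = np.random.default_rng(1)
path = correlated_random_walk(500, 0.3, 1.0, rng)   # 500 unit-length steps
```

Comparing statistics such as net squared displacement of simulated paths against the observed tracks is one common way to choose among the competing models.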

  9. Response of spatial point pattern of halostachys caspica population to ground water depth

    International Nuclear Information System (INIS)

    Niu, P.; Wang, M.; Jiang, P.; Li, M.; Chu, G.

    2017-01-01

    We studied Halostachys caspica populations at three groundwater depths, shallow, middle, and deep (> 4.5 m), in sample plots at the diluvial fan of the South Junggar Basin. Both the spatial pattern and the spatial association of the population among all three groundwater depths and four growth stages were studied to investigate the impact of groundwater depth on the formation and persistence mechanisms of the spatial pattern of Halostachys caspica populations. In this study, Ripley's K function was utilized to characterize spatial patterns and intraspecific associations of H. caspica in three 1-ha plots, as well as to study their relationship with groundwater depth. Seedling recruitment decreased severely with increasing groundwater depth, and the population structure changed noticeably due to an increased number of dead standing plants. Different growth stages of the H. caspica population all had aggregated distributions at small scales in the three groundwater depth areas. With increasing scale, the aggregation intensity weakened in all growth stages. Distribution was aggregated at 50 m scales in both the shallow and middle groundwater depth areas, while the deep groundwater depth area followed a random distribution. (author)

  10. Application of random amplified polymorphic DNA (RAPD) markers ...

    African Journals Online (AJOL)

    SAM

    2014-06-11

    variety share an identical genome. In this field one of the most successful techniques is random ... To each minced sample, 350 µL of the same extraction buffer was added and the samples were ...

  11. Patterns among the ashes: Exploring the relationship between landscape pattern and the emerald ash borer

    Science.gov (United States)

    Susan J. Crocker; Dacia M. Meneguzzo; Greg C. Liknes

    2010-01-01

    Landscape metrics, including host abundance and population density, were calculated using forest inventory and land cover data to assess the relationship between landscape pattern and the presence or absence of the emerald ash borer (EAB) (Agrilus planipennis Fairmaire). The Random Forests classification algorithm in the R statistical environment was...

  12. Origin of convex tetrads in rare earth element patterns of hydrothermally altered siliceous igneous rocks from the Zinnwald Sn-W deposit, Germany

    Science.gov (United States)

    Monecke, T.; Dulski, P.; Kempe, U.

    2007-01-01

    The chondrite-normalized rare earth element (REE) patterns of whole rock samples from evolved granitic systems hosting rare metal deposits sometimes show a split into four consecutive curved segments, referred to as tetrads. In the present contribution, a rigorous statistical method is proposed that can be used to test whether geological significance should be attributed to tetrads that are only of limited size. The method involves a detailed evaluation of element and sample specific random and systematic errors that are constrained on the basis of independent repeated preparations and analyses of sample and reference materials. Application of the proposed method to samples from the granite-hosted Zinnwald Sn-W deposit, Germany, revealed that at least two tetrads in normalized whole rock REE patterns have to be analytically significant to rule out that fractional crystallization led to the unusual behavior of the REEs. Based on the analysis of altered albite granite and greisen samples from the endocontact of the Zinnwald granite massif, it is demonstrated that the lanthanide tetrad effect is responsible for the formation of the convex tetrads. Geological and petrological evidence suggests that the tetrads in the samples developed prior to greisenization and related cassiterite precipitation. In contrast to the endocontact samples, the rhyolitic wall rocks are typified by normalized REE patterns having tetrads that are variable in size and frequently close to the limit of analytical significance. The sizes of the tetrads apparently correlate with the intensity of albitization, but show no relation to subsequent alteration processes including greisenization and low-temperature argillization. This observation proves that curved segments in normalized whole rock REE patterns can be introduced during hydrothermal fluid-rock interaction.

  13. Micro-Texture Synthesis by Phase Randomization

    Directory of Open Access Journals (Sweden)

    Bruno Galerne

    2011-09-01

    This contribution is concerned with texture synthesis by example, the process of generating new texture images from a given sample. The Random Phase Noise algorithm presented here synthesizes a texture from an original image by simply randomizing its Fourier phase. It is able to reproduce textures which are characterized by their Fourier modulus, namely the random phase textures (or micro-textures).
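The core of the algorithm is compact: keep the Fourier modulus of the sample and replace its phase (a NumPy sketch of the idea; the published reference implementation handles color channels and border effects more carefully):

```python
import numpy as np

def random_phase_noise(img, rng):
    """Micro-texture synthesis: keep |FFT(img)|, randomize the phase.
    Taking the phase from the FFT of real white noise keeps the spectrum
    Hermitian-symmetric, so the synthesized texture is real-valued."""
    modulus = np.abs(np.fft.fft2(img))
    phase = np.angle(np.fft.fft2(rng.normal(size=img.shape)))
    return np.real(np.fft.ifft2(modulus * np.exp(1j * phase)))

rng = np.random.default_rng(0)
img = rng.uniform(size=(64, 64))   # stand-in for a micro-texture sample
tex = random_phase_noise(img, rng)
```

By construction the output has the same Fourier modulus as the input, which is exactly the property that characterizes random phase textures.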

  14. Sample size of the reference sample in a case-augmented study.

    Science.gov (United States)

    Ghosh, Palash; Dewanji, Anup

    2017-05-01

    The case-augmented study, in which a case sample is augmented with a reference (random) sample from the source population with only covariates information known, is becoming popular in different areas of applied science such as pharmacovigilance, ecology, and econometrics. In general, the case sample is available from some source (for example, hospital database, case registry, etc.); however, the reference sample is required to be drawn from the corresponding source population. The required minimum size of the reference sample is an important issue in this regard. In this work, we address the minimum sample size calculation and discuss related issues. Copyright © 2017 John Wiley & Sons, Ltd.

  15. Systematic analysis of stability patterns in plant primary metabolism.

    Directory of Open Access Journals (Sweden)

    Dorothee Girbig

    Metabolic networks are characterized by complex interactions and regulatory mechanisms between many individual components. These interactions determine whether a steady state is stable to perturbations. Structural kinetic modeling (SKM) is a framework to analyze the stability of metabolic steady states that allows the study of the system Jacobian without requiring detailed knowledge about individual rate equations. Stability criteria can be derived by generating a large number of structural kinetic models (SK-models) with randomly sampled parameter sets and evaluating the resulting Jacobian matrices. Until now, SKM experiments applied univariate tests to detect the network components with the largest influence on stability. In this work, we present an extended SKM approach relying on supervised machine learning to detect patterns of enzyme-metabolite interactions that act together in an orchestrated manner to ensure stability. We demonstrate its application on a detailed SK-model of the Calvin-Benson cycle and connected pathways. The identified stability patterns are highly complex reflecting that changes in dynamic properties depend on concerted interactions between several network components. In total, we find more patterns that reliably ensure stability than patterns ensuring instability. This shows that the design of this system is strongly targeted towards maintaining stability. We also investigate the effect of allosteric regulators revealing that the tendency to stability is significantly increased by including experimentally determined regulatory mechanisms that have not yet been integrated into existing kinetic models.
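The sampling loop behind such an SKM experiment reduces to: draw a parameter set, assemble the Jacobian, test its eigenvalues (a toy sketch with a hypothetical 3-metabolite Jacobian; the actual SK-models of the Calvin-Benson cycle are far larger and structured by the stoichiometry):

```python
import numpy as np

def fraction_stable(build_jacobian, n_models, rng):
    """Draw many random parameter sets, build the Jacobian for each, and
    record how often the steady state is stable, i.e. every eigenvalue
    has a negative real part."""
    stable = sum(
        bool(np.all(np.linalg.eigvals(build_jacobian(rng)).real < 0))
        for _ in range(n_models)
    )
    return stable / n_models

def toy_jacobian(rng):
    """Hypothetical stand-in for an SK-model Jacobian: random interactions
    with self-damping on the diagonal."""
    J = rng.uniform(-1, 1, size=(3, 3))
    np.fill_diagonal(J, -rng.uniform(1, 2, size=3))
    return J

rng = np.random.default_rng(11)
frac = fraction_stable(toy_jacobian, 500, rng)
```

The supervised-learning step described above then labels each sampled parameter set stable or unstable and mines those labels for recurring interaction patterns.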

  16. Lot quality assurance sampling for screening communities hyperendemic for Schistosoma mansoni.

    Science.gov (United States)

    Rabarijaona, L P; Boisier, P; Ravaoalimalala, V E; Jeanne, I; Roux, J F; Jutand, M A; Salamon, R

    2003-04-01

    Lot quality assurance sampling (LQAS) was evaluated for rapid, low-cost identification of communities where Schistosoma mansoni infection was hyperendemic in southern Madagascar. In the study area, S. mansoni infection shows a very focal and heterogeneous distribution, requiring numerous local surveys. One sampling plan was tested in the field with schoolchildren, and several others were simulated in the laboratory. Randomization and stool specimen collection were performed by volunteer teachers under direct supervision of the study staff, and no significant problems occurred. As expected from Receiver Operating Characteristic (ROC) curves, all sampling plans allowed correct identification of hyperendemic communities and of most of the hypoendemic ones. Frequent misclassifications occurred for communities with intermediate prevalence, and the cheapest plans had very low specificity. The study confirmed that LQAS would be a valuable tool for large-scale screening in a country with scarce financial and staff resources. Involving teachers appeared to be quite feasible and should not lower the reliability of surveys. We recommend that the national schistosomiasis control programme systematically use LQAS for identification of communities, provided that sample sizes are adapted to the specific epidemiological patterns of S. mansoni infection in the main regions.
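An LQAS plan is just a lot size n and a decision threshold d, and its operating characteristic follows from the binomial distribution (an illustrative sketch; the plan parameters n = 19, d = 5 are hypothetical, not those used in Madagascar):

```python
from math import comb

def p_classified_hyperendemic(n, d, prevalence):
    """Probability that a community is classified hyperendemic, i.e. that
    more than d of the n sampled children test positive, when the true
    prevalence is given (binomial sampling)."""
    return sum(comb(n, k) * prevalence ** k * (1 - prevalence) ** (n - k)
               for k in range(d + 1, n + 1))

# ROC-style check of one hypothetical plan: n = 19 specimens, threshold d = 5
p_hyper = p_classified_hyperendemic(19, 5, 0.50)  # truly hyperendemic village
p_hypo = p_classified_hyperendemic(19, 5, 0.10)   # truly hypoendemic village
```

Sweeping the prevalence from 0 to 1 traces the plan's OC curve; the misclassification zone sits at intermediate prevalences, as the study found.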

  17. Parental Interaction Patterns in Children with Adhd and Controls; a Comparative Study

    Directory of Open Access Journals (Sweden)

    M Afkhami-Aghda

    2007-04-01

    Introduction: Communication patterns of the parents can either positively or negatively influence children's personality. Parenting manner has long-term effects on behavior, function, expectations, and eventually people's future personality. This study investigates parental interaction patterns in children with attention deficit-hyperactivity disorder. Methods: In this study, 50 male children aged 7-12 years were selected in two groups: (1) 25 students with ADHD referring to psychiatry clinics in Isfahan according to the diagnostic scale of DSM-IV, and (2) 25 healthy boys selected by random multistage cluster sampling from primary schools in five districts of Isfahan, from September 2005 until March 2005. The Schaffer and Edgerton parental interaction questionnaire was filled out for them. Results: In the "communication" interaction pattern, the mean score of healthy children was 15.08, while the mean score of ADHD children was 13.42. In the "admission" interactional pattern, the mean of the first group was 14.76, while that of the second group was 11.76. In the "control" interactional pattern, the mean of group one was 13.28 and that of the second group was 11.76. In the "aggression control" interactional pattern, the mean of group one was 13 and that of the second group was 14.68. In the "lack of aggressive attachment" interactional pattern, the mean of the first group was 13.36 and that of the second group was 16.67. The mean scores of parental interactional patterns in healthy children were all higher than in ADHD children, except for the "aggression control" and "lack of aggressive attachment" interactional patterns. Conclusion: The higher the parental "admission" interactional pattern score, the lower the signs of ADHD in children. Signs are less severe in cases with more positive parental "control" interactional patterns. If the scores of the "lack of aggressive attachment" and "aggression control" interactional patterns are higher, ADHD signs are more severe.

  18. Cooperative random Levy flight searches and the flight patterns of honeybees

    International Nuclear Information System (INIS)

    Reynolds, A.M.

    2006-01-01

    The most efficient Levy flight (scale-free) searching strategy for N independent searchers to adopt when target sites are randomly and sparsely distributed is identified. For N = 1, it is well known that the optimal searching strategy is attained when μ = 2, where the exponent μ characterizes the Levy distribution, P(l) = l^(-μ), of flight-lengths. For N > 1, the optimal searching strategy is attained as μ → 1. It is suggested that the orientation flights of honeybees can be understood within the context of such an optimal cooperative random Levy flight searching strategy. Upon returning to their hive after surveying a landscape, honeybees can exchange information about the locations of target sites through the waggle dance. In accordance with observations, it is predicted that the waggle dance can be disrupted without noticeable influence on a hive's ability to maintain weight when forage is plentiful
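Flight lengths with P(l) = l^(-μ) can be drawn by inverse-transform sampling (a generic sketch, with l_min the minimum flight length introduced here as an assumption; μ = 2 is the single-searcher optimum quoted above):

```python
import numpy as np

def levy_flight_lengths(n, mu, l_min, rng):
    """Inverse-transform sampling from the power law P(l) ∝ l^(-mu),
    l >= l_min, valid for mu > 1: l = l_min * u^(-1/(mu - 1))."""
    u = rng.uniform(size=n)
    return l_min * u ** (-1.0 / (mu - 1.0))

rng = np.random.default_rng(7)
lengths = levy_flight_lengths(100_000, 2.0, 1.0, rng)  # mu = 2: N = 1 optimum
```

As μ approaches 1 the tail thickens and single flights span ever larger distances, which is what makes that limit optimal for many cooperating searchers.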

  19. Dietary Patterns Associated with Alzheimer’s Disease: Population Based Study

    Directory of Open Access Journals (Sweden)

    Katarzyna Gustaw-Rothenberg

    2009-04-01

    Recently, dietary pattern analysis has emerged as a way of examining diet-disease relations in Alzheimer's disease. In contrast with the conventional approach, which focuses on a single nutrient or a few nutrients or foods, this method considers overall eating patterns. We examined the dietary patterns defined by factor analysis using data collected with a food-frequency questionnaire in people with Alzheimer's disease (AD) as compared to healthy controls. The diet data were obtained during a population-based study of the prevalence of Alzheimer's disease in a population in Poland. Stratified sampling and random selection strategies were combined to obtain a representative population for screening (age group > 55). From the population screened three times, 71 people were diagnosed with Alzheimer's according to DSM-IV, and were recruited for further diet risk factor assessment. A group of people with Alzheimer's disease (n = 71; F/M 42/29) and the same number of healthy, age- and gender-matched controls were recruited for the study. Patients and their caregivers as well as controls were presented with a food frequency questionnaire based on the 12 food groups. Factor analysis (principal component) was used to derive food patterns. The analysis was conducted using the factor procedure. The factors were rotated by an orthogonal transformation (Varimax rotation) to achieve a simpler structure with greater interpretability. Using factor analysis, we identified major eating patterns, one for Alzheimer's patients and a different one for the control group. The AD dietary pattern, FACTOR AD, was characterized by a high intake of meat, butter, high-fat dairy products, eggs, and refined sugar, whereas the other pattern (FACTOR C) was characterized by a high intake of grains and vegetables. These data indicate the existence of dietary patterns defined by factor analysis with data from a food frequency questionnaire, characteristic for Alzheimer's disease in a

  20. Encoding plaintext by Fourier transform hologram in double random phase encoding using fingerprint keys

    Science.gov (United States)

    Takeda, Masafumi; Nakano, Kazuya; Suzuki, Hiroyuki; Yamaguchi, Masahiro

    2012-09-01

    It has been shown that biometric information can be used as a cipher key for binary data encryption by applying double random phase encoding. In such methods, binary data are encoded in a bit pattern image, and the decrypted image becomes a plain image when the key is genuine; otherwise, decrypted images become random images. In some cases, images decrypted by imposters may not be fully random, such that the blurred bit pattern can be partially observed. In this paper, we propose a novel bit coding method based on a Fourier transform hologram, which makes images decrypted by imposters more random. Computer experiments confirm that the method increases the randomness of images decrypted by imposters while keeping the false rejection rate as low as in the conventional method.
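Double random phase encoding itself is compact to state: one random phase mask in the input plane and one in the Fourier plane (a numerical sketch of classic DRPE with random keys; the paper's contribution, the Fourier-hologram bit coding with fingerprint-derived keys, is not reproduced here):

```python
import numpy as np

def drpe_encrypt(img, key1, key2):
    """Classic double random phase encoding: multiply by a random phase
    mask, go to the Fourier plane, multiply by a second mask, invert."""
    x = img * np.exp(2j * np.pi * key1)
    return np.fft.ifft2(np.fft.fft2(x) * np.exp(2j * np.pi * key2))

def drpe_decrypt(cipher, key1, key2):
    """Undo the masks in reverse order; genuine keys recover the plain
    bit-pattern image, wrong keys yield a random-looking image."""
    x = np.fft.ifft2(np.fft.fft2(cipher) * np.exp(-2j * np.pi * key2))
    return np.abs(x * np.exp(-2j * np.pi * key1))

rng = np.random.default_rng(3)
bits = rng.integers(0, 2, size=(32, 32)).astype(float)   # bit-pattern image
k1, k2 = rng.uniform(size=(32, 32)), rng.uniform(size=(32, 32))
cipher = drpe_encrypt(bits, k1, k2)
recovered = drpe_decrypt(cipher, k1, k2)                 # genuine keys
imposter = drpe_decrypt(cipher, rng.uniform(size=(32, 32)),
                        rng.uniform(size=(32, 32)))      # wrong keys
```

The partially observable blurred bit patterns mentioned in the abstract arise when the imposter's decryption is not fully random; the proposed hologram coding is designed to suppress exactly that residue.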

  1. Encoding plaintext by Fourier transform hologram in double random phase encoding using fingerprint keys

    International Nuclear Information System (INIS)

    Takeda, Masafumi; Nakano, Kazuya; Suzuki, Hiroyuki; Yamaguchi, Masahiro

    2012-01-01

    It has been shown that biometric information can be used as a cipher key for binary data encryption by applying double random phase encoding. In such methods, binary data are encoded in a bit pattern image, and the decrypted image becomes a plain image when the key is genuine; otherwise, decrypted images become random images. In some cases, images decrypted by imposters may not be fully random, such that the blurred bit pattern can be partially observed. In this paper, we propose a novel bit coding method based on a Fourier transform hologram, which makes images decrypted by imposters more random. Computer experiments confirm that the method increases the randomness of images decrypted by imposters while keeping the false rejection rate as low as in the conventional method. (paper)

  2. The characteristics and use patterns of all-terrain vehicle drivers in the United States.

    Science.gov (United States)

    Rodgers, G B

    1999-07-01

    The consent decrees between the US Consumer Product Safety Commission and the major distributors of all-terrain vehicles (ATVs), which were designed to address ATV-related injuries and deaths, expired in April 1998. While national estimates of nonfatal and fatal injuries involving ATVs declined after the consent decrees went into effect 10 years ago, the injury estimates have stabilized in recent years. To gain a better understanding of current ATV use patterns, the CPSC sponsored a national probability survey of ATV drivers in the fall of 1997. The survey was designed to collect information about the characteristics and use patterns of ATV drivers and to quantify the numbers and types of ATVs in use. It employed a single-stage list-assisted random-digit-dial sample design. This article describes the results of the survey and discusses long-term ATV usage trends.

  3. Statistical benchmark for BosonSampling

    International Nuclear Information System (INIS)

    Walschaers, Mattia; Mayer, Klaus; Buchleitner, Andreas; Kuipers, Jack; Urbina, Juan-Diego; Richter, Klaus; Tichy, Malte Christopher

    2016-01-01

    Boson samplers—set-ups that generate complex many-particle output states through the transmission of elementary many-particle input states across a multitude of mutually coupled modes—promise the efficient quantum simulation of a classically intractable computational task, and challenge the extended Church–Turing thesis, one of the fundamental dogmas of computer science. However, as in all experimental quantum simulations of truly complex systems, one crucial problem remains: how to certify that a given experimental measurement record unambiguously results from enforcing the claimed dynamics, on bosons, fermions or distinguishable particles? Here we offer a statistical solution to the certification problem, identifying an unambiguous statistical signature of many-body quantum interference upon transmission across a multimode, random scattering device. We show that statistical analysis of only partial information on the output state allows one to characterise the imparted dynamics through particle type-specific features of the emerging interference patterns. The relevant statistical quantifiers are classically computable, define a falsifiable benchmark for BosonSampling, and reveal distinctive features of many-particle quantum dynamics, which go much beyond mere bunching or anti-bunching effects. (fast track communication)

  4. It's about time: a comparison of Canadian and American time-activity patterns.

    Science.gov (United States)

    Leech, Judith A; Nelson, William C; Burnett, Richard T; Aaron, Shawn; Raizenne, Mark E

    2002-11-01

    This study compares two North American time-activity databases: the National Human Activity Pattern Survey (NHAPS) of 9386 interviewees in 1992-1994 in the continental USA with the Canadian Human Activity Pattern Survey (CHAPS) of 2381 interviewees in 1996-1997 in four major Canadian cities. Identical surveys and methodology were used to collect these data: random sample telephone selection within the identified telephone exchanges, computer-assisted telephone interviews, overselection of children and weekends in the 24-h recall diary and the same interviewers. Very similar response rates were obtained: 63% (NHAPS) and 64.5% (CHAPS). Results of comparisons by age within major activity and location groups suggest activity and location patterns are very similar (most differences being less than 1% or 14 min in a 24-h day) with the exception of seasonal differences. Canadians spend less time outdoors in winter and less time indoors in summer than their U.S. counterparts. When exposure assessments use time of year or outdoor/indoor exposure gradients, these differences may result in significant differences in exposure assessments. Otherwise, the 24-h time activity patterns of North Americans are remarkably similar and use of the combined data set for some exposure assessments may be feasible.

  5. Jenis Sample: Keuntungan dan Kerugiannya [Types of Samples: Advantages and Disadvantages]

    OpenAIRE

    Suprapto, Agus

    1994-01-01

    A sample is a part of a population that is used in a study for purposes of making estimates about the nature of the total population; it is obtained with a sampling technique. Sampling is more advantageous than a census because it can reduce cost and time, and it can gather deeper information and more accurate data. It is useful to distinguish two major types of sampling techniques: first, probability sampling, e.g. simple random sampling; second, non-probability sampling, e.g. systematic sampling...
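The two examples named in this record can be contrasted in a few lines (a minimal sketch with a hypothetical sampling frame of 100 units):

```python
import random

def simple_random_sample(frame, n, rng):
    """Probability sampling: every subset of size n is equally likely."""
    return rng.sample(frame, n)

def systematic_sample(frame, n, rng):
    """Systematic sampling: random start, then every k-th unit (k = N // n).
    Cheap to execute, but only the starting point is random."""
    k = len(frame) // n
    start = rng.randrange(k)
    return frame[start::k][:n]

rng = random.Random(0)
frame = list(range(1, 101))               # hypothetical sampling frame
srs = simple_random_sample(frame, 10, rng)
sys_smp = systematic_sample(frame, 10, rng)
```

The systematic draw illustrates the trade-off the abstract alludes to: it is simpler and cheaper than simple random sampling, but its fixed spacing can bias estimates if the frame has a periodic structure.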

  6. Toward a Principled Sampling Theory for Quasi-Orders.

    Science.gov (United States)

    Ünlü, Ali; Schrepp, Martin

    2016-01-01

    Quasi-orders, that is, reflexive and transitive binary relations, have numerous applications. In educational theories, the dependencies of mastery among the problems of a test can be modeled by quasi-orders. Methods such as item tree or Boolean analysis that mine for quasi-orders in empirical data are sensitive to the underlying quasi-order structure. These data mining techniques have to be compared based on extensive simulation studies, with unbiased samples of randomly generated quasi-orders at their basis. In this paper, we develop techniques that can provide the required quasi-order samples. We introduce a discrete doubly inductive procedure for incrementally constructing the set of all quasi-orders on a finite item set. A randomization of this deterministic procedure allows us to generate representative samples of random quasi-orders. With an outer level inductive algorithm, we consider the uniform random extensions of the trace quasi-orders to higher dimension. This is combined with an inner level inductive algorithm to correct the extensions that violate the transitivity property. The inner level correction step entails sampling biases. We propose three algorithms for bias correction and investigate them in simulation. It is evident that, on even up to 50 items, the new algorithms create close to representative quasi-order samples within acceptable computing time. Hence, the principled approach is a significant improvement to existing methods that are used to draw quasi-orders uniformly at random but cannot cope with reasonably large item sets.
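The transitivity problem the authors correct for can be seen in the naive sampler below: drawing pairs independently and then closing transitively does produce quasi-orders, but with a biased distribution (an illustrative sketch, not the paper's doubly inductive procedure):

```python
import random
from itertools import product

def naive_random_quasi_order(n, p, rng):
    """Draw each off-diagonal pair with probability p, force reflexivity,
    then repair transitivity with the Warshall closure. The closure step
    is exactly the kind of correction that biases the sample."""
    rel = [[i == j or rng.random() < p for j in range(n)] for i in range(n)]
    for k, i, j in product(range(n), repeat=3):   # Warshall: k outermost
        rel[i][j] = rel[i][j] or (rel[i][k] and rel[k][j])
    return rel

def is_quasi_order(rel):
    """Check reflexivity and transitivity of a boolean relation matrix."""
    n = len(rel)
    reflexive = all(rel[i][i] for i in range(n))
    transitive = all(rel[i][j] or not (rel[i][k] and rel[k][j])
                     for i, j, k in product(range(n), repeat=3))
    return reflexive and transitive

rng = random.Random(5)
q = naive_random_quasi_order(6, 0.2, rng)
```

Every output of this sampler is a valid quasi-order, yet quasi-orders with many implied pairs are over-represented; the paper's bias-corrected algorithms aim at the uniform distribution instead.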

  7. Toward a Principled Sampling Theory for Quasi-Orders

    Science.gov (United States)

    Ünlü, Ali; Schrepp, Martin

    2016-01-01

    Quasi-orders, that is, reflexive and transitive binary relations, have numerous applications. In educational theories, the dependencies of mastery among the problems of a test can be modeled by quasi-orders. Methods such as item tree or Boolean analysis that mine for quasi-orders in empirical data are sensitive to the underlying quasi-order structure. These data mining techniques have to be compared based on extensive simulation studies, with unbiased samples of randomly generated quasi-orders at their basis. In this paper, we develop techniques that can provide the required quasi-order samples. We introduce a discrete doubly inductive procedure for incrementally constructing the set of all quasi-orders on a finite item set. A randomization of this deterministic procedure allows us to generate representative samples of random quasi-orders. With an outer level inductive algorithm, we consider the uniform random extensions of the trace quasi-orders to higher dimension. This is combined with an inner level inductive algorithm to correct the extensions that violate the transitivity property. The inner level correction step entails sampling biases. We propose three algorithms for bias correction and investigate them in simulation. It is evident that, on even up to 50 items, the new algorithms create close to representative quasi-order samples within acceptable computing time. Hence, the principled approach is a significant improvement to existing methods that are used to draw quasi-orders uniformly at random but cannot cope with reasonably large item sets. PMID:27965601

  8. Towards Cost-efficient Sampling Methods

    OpenAIRE

    Peng, Luo; Yongli, Li; Chong, Wu

    2014-01-01

    The sampling method has been paid much attention in the field of complex network in general and statistical physics in particular. This paper presents two new sampling methods based on the perspective that a small part of vertices with high node degree can possess the most structure information of a network. The two proposed sampling methods are efficient in sampling the nodes with high degree. The first new sampling method is improved on the basis of the stratified random sampling method and...

  9. Adipose tissue fatty acid patterns and changes in anthropometry: a cohort study.

    Directory of Open Access Journals (Sweden)

    Christina Catherine Dahm

Full Text Available INTRODUCTION: Diets rich in n-3 long chain polyunsaturated fatty acids (LC-PUFA), but low in n-6 LC-PUFA and 18:1 trans-fatty acids (TFA), may lower the risk of overweight and obesity. These fatty acids have often been investigated individually. We explored associations between global patterns in adipose tissue fatty acids and changes in anthropometry. METHODS: 34 fatty acid species from adipose tissue biopsies were determined in a random sample of 1100 men and women from a Danish cohort study. We used sex-specific principal component analysis and multiple linear regression to investigate the associations of adipose tissue fatty acid patterns with changes in weight, waist circumference (WC), and WC controlled for changes in body mass index (WC(BMI)), adjusting for confounders. RESULTS: 7 principal components were extracted for each sex, explaining 77.6% and 78.3% of fatty acid variation in men and women, respectively. Fatty acid patterns with high levels of TFA tended to be positively associated with changes in weight and WC for both sexes. Patterns with high levels of n-6 LC-PUFA tended to be negatively associated with changes in weight and WC in men, and positively associated in women. Associations with patterns with high levels of n-3 LC-PUFA were dependent on the context of the rest of the fatty acid pattern. CONCLUSIONS: Adipose tissue fatty acid patterns with high levels of TFA may be linked to weight gain, but patterns with high n-3 LC-PUFA did not appear to be linked to weight loss. Associations depended on characteristics of the rest of the pattern.
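
    The pattern-extraction step can be illustrated with a principal component analysis sketch on synthetic stand-in data. The dimensions follow the abstract (1100 subjects, 34 fatty acids, 7 retained components), but the data here are random numbers, so the study's 77.6%/78.3% variance figures will not be reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in data: 1100 subjects x 34 fatty-acid fractions (random numbers,
# NOT real adipose tissue measurements).
X = rng.standard_normal((1100, 34))

# PCA via SVD of the centred data matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / (s**2).sum()

# Keep 7 components, as in the study; the per-subject scores are the
# "pattern" variables that enter the regression models.
scores = Xc @ Vt[:7].T
print(f"variance explained by 7 PCs: {explained[:7].sum():.1%}")
```

    With correlated measurements, as in the real study, the leading components capture far more variance than they do for uncorrelated noise.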

  10. Sampling in epidemiological research: issues, hazards and pitfalls

    Science.gov (United States)

    Tyrer, Stephen; Heyman, Bob

    2016-01-01

    Surveys of people's opinions are fraught with difficulties. It is easier to obtain information from those who respond to text messages or to emails than to attempt to obtain a representative sample. Samples of the population that are selected non-randomly in this way are termed convenience samples as they are easy to recruit. This introduces a sampling bias. Such non-probability samples have merit in many situations, but an epidemiological enquiry is of little value unless a random sample is obtained. If a sufficient number of those selected actually complete a survey, the results are likely to be representative of the population. This editorial describes probability and non-probability sampling methods and illustrates the difficulties and suggested solutions in performing accurate epidemiological research. PMID:27087985
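
    The contrast between a probability sample and a convenience sample can be shown with a small simulation on a hypothetical population in which younger people are more likely to respond (the age distribution and response-propensity model are assumptions for illustration only):

```python
import random

random.seed(1)
# Hypothetical population of 100,000 people; age is the trait of interest.
population = [random.gauss(45, 15) for _ in range(100_000)]
mean = lambda xs: sum(xs) / len(xs)

# Probability sample: every member has the same chance of inclusion.
srs = random.sample(population, 1000)

# Convenience sample: response propensity falls with age (e.g. a
# text-message survey), so inclusion probability depends on the trait itself.
convenience = [a for a in population if random.random() < max(0.0, (60 - a) / 60)][:1000]

print(f"population mean age: {mean(population):.1f}")
print(f"random sample mean:  {mean(srs):.1f}")
print(f"convenience mean:    {mean(convenience):.1f}")  # biased low
```

    The random sample's mean tracks the population mean; the convenience sample's does not, no matter how large it is made.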

  11. Monte Carlo method for random surfaces

    International Nuclear Information System (INIS)

    Berg, B.

    1985-01-01

    Previously two of the authors proposed a Monte Carlo method for sampling statistical ensembles of random walks and surfaces with a Boltzmann probabilistic weight. In the present paper we work out the details for several models of random surfaces, defined on d-dimensional hypercubic lattices. (orig.)

  12. Learning from correlated patterns by simple perceptrons

    Energy Technology Data Exchange (ETDEWEB)

    Shinzato, Takashi; Kabashima, Yoshiyuki [Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Yokohama 226-8502 (Japan)], E-mail: shinzato@sp.dis.titech.ac.jp, E-mail: kaba@dis.titech.ac.jp

    2009-01-09

    Learning behavior of simple perceptrons is analyzed for a teacher-student scenario in which output labels are provided by a teacher network for a set of possibly correlated input patterns, and such that the teacher and student networks are of the same type. Our main concern is the effect of statistical correlations among the input patterns on learning performance. For this purpose, we extend to the teacher-student scenario a methodology for analyzing randomly labeled patterns recently developed in Shinzato and Kabashima 2008 J. Phys. A: Math. Theor. 41 324013. This methodology is used for analyzing situations in which orthogonality of the input patterns is enhanced in order to optimize the learning performance.

  13. Learning from correlated patterns by simple perceptrons

    Science.gov (United States)

    Shinzato, Takashi; Kabashima, Yoshiyuki

    2009-01-01

    Learning behavior of simple perceptrons is analyzed for a teacher-student scenario in which output labels are provided by a teacher network for a set of possibly correlated input patterns, and such that the teacher and student networks are of the same type. Our main concern is the effect of statistical correlations among the input patterns on learning performance. For this purpose, we extend to the teacher-student scenario a methodology for analyzing randomly labeled patterns recently developed in Shinzato and Kabashima 2008 J. Phys. A: Math. Theor. 41 324013. This methodology is used for analyzing situations in which orthogonality of the input patterns is enhanced in order to optimize the learning performance.

  14. Learning from correlated patterns by simple perceptrons

    International Nuclear Information System (INIS)

    Shinzato, Takashi; Kabashima, Yoshiyuki

    2009-01-01

    Learning behavior of simple perceptrons is analyzed for a teacher-student scenario in which output labels are provided by a teacher network for a set of possibly correlated input patterns, and such that the teacher and student networks are of the same type. Our main concern is the effect of statistical correlations among the input patterns on learning performance. For this purpose, we extend to the teacher-student scenario a methodology for analyzing randomly labeled patterns recently developed in Shinzato and Kabashima 2008 J. Phys. A: Math. Theor. 41 324013. This methodology is used for analyzing situations in which orthogonality of the input patterns is enhanced in order to optimize the learning performance

  15. Comparison of sampling methods for hard-to-reach francophone populations: yield and adequacy of advertisement and respondent-driven sampling.

    Science.gov (United States)

    Ngwakongnwi, Emmanuel; King-Shier, Kathryn M; Hemmelgarn, Brenda R; Musto, Richard; Quan, Hude

    2014-01-01

Francophones who live outside the primarily French-speaking province of Quebec, Canada, risk being excluded from research by lack of a sampling frame. We examined the adequacy of random sampling, advertising, and respondent-driven sampling for recruitment of francophones for survey research. We recruited francophones residing in the city of Calgary, Alberta, through advertising and respondent-driven sampling. These 2 samples were then compared with a random subsample of Calgary francophones derived from the 2006 Canadian Community Health Survey (CCHS). We assessed the effectiveness of advertising and respondent-driven sampling in relation to the CCHS sample by comparing demographic characteristics and selected items from the CCHS (specifically self-reported general health status, perceived weight, and having a family doctor). We recruited 120 francophones through advertising and 145 through respondent-driven sampling; the random sample from the CCHS consisted of 259 records. The samples derived from advertising and respondent-driven sampling differed from the CCHS in terms of age (mean ages 41.0, 37.6, and 42.5 years, respectively), sex (proportion of males 26.1%, 40.6%, and 56.6%, respectively), education (college or higher 86.7%, 77.9%, and 59.1%, respectively), place of birth (immigrants accounting for 45.8%, 55.2%, and 3.7%, respectively), and not having a regular medical doctor (16.7%, 34.5%, and 16.6%, respectively). Differences were not tested statistically because of limitations on the analysis of CCHS data imposed by Statistics Canada. The samples generated exclusively through advertising and respondent-driven sampling were not representative of the gold standard sample from the CCHS. Use of such biased samples for research studies could generate misleading results.

  16. Association between Cheiloscopic Patterns and ABO Blood Groups among South Indian Population.

    Science.gov (United States)

    Khanapure, Sneha; Suhas, H G; Potdar, Shrudha; Sam, George; Sudeep, C B; Arjun, M R

    2017-07-01

Human beings have few characteristics that are unique to them; lip prints are one such feature. They do not change throughout life and are not influenced by injuries, diseases, or environmental changes. According to the various antigen-antibody reactions in the bloodstream, different individuals have specific blood groups. The aim was to study the distribution of lip print patterns among individuals with different ABO and Rh blood groups and to examine the relation between their characteristics and blood groups. In the present study, lip prints were collected randomly from 85 individuals, and their blood groups were determined, in order to identify the most common lip print type and to detect any association between lip print types and blood groups. Tsuchihashi's classification of lip prints was used for comparison with the ABO and Rh blood grouping systems. It was observed that in individuals with B+, A+, and O- blood groups the predominant pattern was Type IV, whereas in individuals with blood groups O+ and AB+ the common lip print pattern was Type II. This study showed a strong association between lip print patterns and ABO blood groups; as some blood groups were not included in the statistical analysis, further studies with larger samples are essential to substantiate the results. Correlating lip prints with blood groups helps in the identification of suspects. Along with lip prints, another biological record that remains unchanged throughout the lifetime of a person is the blood group. Determining the blood group of a person from samples obtained at the scene of a crime, together with lip prints recovered from the scene, can help identify a person.

  17. Concept of a data base for signal patterns

    International Nuclear Information System (INIS)

    Olma, B.

    1985-01-01

The many requirements on the storage, administration and handling of stationary random signal patterns and multi-channel transient signal patterns are discussed. For the field of loose part detection, the concept of a data base for structure-borne noise burst patterns is presented. Chapter 2 describes the measuring principle of structure-borne noise monitoring. Chapter 3 summarizes the requirements on the automation of measured data recording; chapter 4 compiles the requirements on the data base. Chapter 5 describes the concept developed for a burst pattern data base; chapter 6 shows the organization of the data base, and chapter 7 reflects the present state of application. (orig./HP)

  18. THEMATIC PROGRESSION PATTERN: A TECHNIQUE TO IMPROVE STUDENTS’ WRITING SKILL VIEWED FROM WRITING APPREHENSION

    Directory of Open Access Journals (Sweden)

    Fitri Nurdianingsih

    2017-10-01

Full Text Available The objective of conducting this research was to find out: (1) whether or not the use of the thematic progression pattern technique is more effective than direct instruction in teaching writing to second semester students at the English Education Department; (2) whether students who have low writing apprehension have better writing skill than those who have high writing apprehension; and (3) whether there is an interaction between teaching technique and writing apprehension in teaching writing skill. This research used an experimental design. The population was the second semester students at the English Education Department of IKIP PGRI Bojonegoro, and the sample was selected by cluster random sampling. The instruments of data collection were a writing test and a writing apprehension questionnaire. The findings of this study are: (1) the thematic progression pattern technique is more effective than direct instruction in teaching writing; (2) students who have low writing apprehension have better writing skill than those who have high writing apprehension; and (3) there is an interaction between teaching technique and writing apprehension in teaching writing skill. It can be concluded that the thematic progression pattern is an effective technique for teaching writing skill to second semester students of the English Education Department at IKIP PGRI Bojonegoro, and that its effectiveness is affected by writing apprehension.

  19. Detecting oscillatory patterns and time lags from proxy records with non-uniform sampling: Some pitfalls and possible solutions

    Science.gov (United States)

    Donner, Reik

    2013-04-01

    Time series analysis offers a rich toolbox for deciphering information from high-resolution geological and geomorphological archives and linking the thus obtained results to distinct climate and environmental processes. Specifically, on various time-scales from inter-annual to multi-millenial, underlying driving forces exhibit more or less periodic oscillations, the detection of which in proxy records often allows linking them to specific mechanisms by which the corresponding drivers may have affected the archive under study. A persistent problem in geomorphology is that available records do not present a clear signal of the variability of environmental conditions, but exhibit considerable uncertainties of both the measured proxy variables and the associated age model. Particularly, time-scale uncertainty as well as the heterogeneity of sampling in the time domain are source of severe conceptual problems that may lead to false conclusions about the presence or absence of oscillatory patterns and their mutual phasing in different archives. In my presentation, I will discuss how one can cope with non-uniformly sampled proxy records to detect and quantify oscillatory patterns in one or more data sets. For this purpose, correlation analysis is reformulated using kernel estimates which are found superior to classical estimators based on interpolation or Fourier transform techniques. In order to characterize non-stationary or noisy periodicities and their relative phasing between different records, an extension of continuous wavelet transform is utilized. The performance of both methods is illustrated for different case studies. An extension to explicitly considering time-scale uncertainties by means of Bayesian techniques is briefly outlined.

  20. Repairable system analysis in presence of covariates and random effects

    International Nuclear Information System (INIS)

    Giorgio, M.; Guida, M.; Pulcini, G.

    2014-01-01

    This paper aims to model the failure pattern of repairable systems in presence of explained and unexplained heterogeneity. The failure pattern of each system is described by a Power Law Process. Part of the heterogeneity among the patterns is explained through the use of a covariate, and the residual unexplained heterogeneity (random effects) is modeled via a joint probability distribution on the PLP parameters. The proposed approach is applied to a real set of failure time data of powertrain systems mounted on 33 buses employed in urban and suburban routes. Moreover, the joint probability distribution on the PLP parameters estimated from the data is used as an informative prior to make Bayesian inference on the future failure process of a generic system belonging to the same population and employed in an urban or suburban route under randomly chosen working conditions. - Highlights: • We describe the failure process of buses powertrain system subject to heterogeneity. • Heterogeneity due to different service types is explained by a covariate. • Random effect is modeled through a joint pdf on failure process parameters. • The powertrain reliability under new future operating conditions is estimated

  1. Neurofeedback Against Binge Eating: A Randomized Controlled Trial in a Female Subclinical Threshold Sample.

    Science.gov (United States)

    Schmidt, Jennifer; Martin, Alexandra

    2016-09-01

Brain-directed treatment techniques, such as neurofeedback, have recently been proposed as adjuncts in the treatment of eating disorders to improve therapeutic outcomes. In line with this recommendation, a cue exposure EEG-neurofeedback protocol was developed. The present study aimed at the evaluation of the specific efficacy of neurofeedback to reduce subjective binge eating in a female subthreshold sample. A total of 75 subjects were randomized to EEG-neurofeedback, mental imagery with a comparable treatment set-up or a waitlist group. At post-treatment, only EEG-neurofeedback led to a reduced frequency of binge eating (p = .015, g = 0.65). The effects remained stable to a 3-month follow-up. EEG-neurofeedback further showed particular beneficial effects on perceived stress and dietary self-efficacy. Differences in outcomes did not arise from divergent treatment expectations. Because EEG-neurofeedback showed a specific efficacy, it may be a promising brain-directed approach that should be tested as a treatment adjunct in clinical groups with binge eating. Copyright © 2016 John Wiley & Sons, Ltd and Eating Disorders Association.

  2. Creating ensembles of decision trees through sampling

    Science.gov (United States)

    Kamath, Chandrika; Cantu-Paz, Erick

    2005-08-30

    A system for decision tree ensembles that includes a module to read the data, a module to sort the data, a module to evaluate a potential split of the data according to some criterion using a random sample of the data, a module to split the data, and a module to combine multiple decision trees in ensembles. The decision tree method is based on statistical sampling techniques and includes the steps of reading the data; sorting the data; evaluating a potential split according to some criterion using a random sample of the data, splitting the data, and combining multiple decision trees in ensembles.
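
    A minimal sketch of the sampled split-evaluation idea: depth-1 trees (stumps) whose split thresholds are scored on a random subsample of the data, then combined by majority vote. The toy one-dimensional dataset and all parameter choices are illustrative; this demonstrates the principle, not the patented system's implementation.

```python
import random

def gini(labels):
    """Gini impurity of a list of 0/1 labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 2 * p * (1 - p)

def best_split(xs, ys, rng, sample_frac=0.3):
    """Score candidate thresholds on a random subsample of the data."""
    idx = rng.sample(range(len(xs)), max(2, int(sample_frac * len(xs))))
    sx = [xs[i] for i in idx]
    sy = [ys[i] for i in idx]
    best_t, best_score = None, float("inf")
    for t in sorted(set(sx))[1:]:          # skip the minimum so both sides are non-empty
        left = [y for x, y in zip(sx, sy) if x < t]
        right = [y for x, y in zip(sx, sy) if x >= t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(sx)
        if score < best_score:
            best_t, best_score = t, score
    return best_t

def stump_ensemble(xs, ys, n_trees=25, seed=0):
    """Combine depth-1 trees, each trained with sampled split evaluation."""
    rng = random.Random(seed)
    stumps = []
    for _ in range(n_trees):
        t = best_split(xs, ys, rng)
        left = [y for x, y in zip(xs, ys) if x < t]
        right = [y for x, y in zip(xs, ys) if x >= t]
        stumps.append((t, round(sum(left) / len(left)), round(sum(right) / len(right))))
    def predict(x):                        # majority vote over the ensemble
        votes = [l if x < t else r for t, l, r in stumps]
        return round(sum(votes) / len(votes))
    return predict

xs = [i / 100 for i in range(100)]
ys = [0 if x < 0.5 else 1 for x in xs]     # step function to learn
clf = stump_ensemble(xs, ys)
assert clf(0.1) == 0 and clf(0.9) == 1
```

    Evaluating splits on a subsample rather than the full sorted data is what makes this approach scale; the vote over many cheaply trained trees recovers the lost accuracy.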

  3. Irregular meal-pattern effects on energy expenditure, metabolism, and appetite regulation: a randomized controlled trial in healthy normal-weight women.

    Science.gov (United States)

    Alhussain, Maha H; Macdonald, Ian A; Taylor, Moira A

    2016-07-01

Obesity is increasing in parallel with greater all-day food availability. The latter may promote meal irregularity, dysregulation of the energy balance, and poor metabolic health. We investigated the effect of meal irregularity on the thermic effect of food (TEF), lipid concentrations, carbohydrate metabolism, subjective appetite, and gut hormones in healthy women. Eleven normal-weight women (18-40 y of age) were recruited in a randomized crossover trial with two 14-d isoenergetic diet periods (identical foods provided and free living) that were separated by a 14-d habitual diet washout period. In period 1, participants followed a regular meal pattern (6 meals/d) or an irregular meal pattern (3-9 meals/d), and in period 2, the alternative meal pattern was followed. Before and after each period, when participants were fasting and for 3 h after intake of a test drink, measurements were taken of energy expenditure, circulating glucose, lipids (fasting only), insulin, glucagon-like peptide 1 (GLP-1), peptide YY (PYY), and ghrelin. An ad libitum test meal was offered. Subjective appetite ratings were assessed while fasting, after the test drink, after the ad libitum meal, and during the intervention. Continuous interstitial glucose monitoring was undertaken for 3 consecutive days during each intervention, and the ambulatory activity pattern was recorded (ambulatory energy expenditure estimation). Regularity was associated with a greater TEF (P < 0.05) and lower interstitial glucose concentrations (day 3: after breakfast; day 9: after lunch and dinner). There was no difference between treatments for the test-drink gut hormone response. A time effect was noted for fasting GLP-1, fasting PYY, PYY responses, and hunger-rating responses to the test drink (P < 0.05). Lower hunger and higher fullness ratings were seen premeal and postmeal during the regular period while subjects were free living. Meal regularity appears to be associated with greater TEF and lower glucose responses, which may favor weight management and metabolic health.

  4. Sex-role patterns, paternal rearing attitudes and child development in different social classes.

    Science.gov (United States)

    Nettelbladt, P; Uddenberg, N; Englesson, I

    1981-07-01

Sex-role patterns, the father's rearing attitude and the child's intellectual and emotional development in different social classes were studied in a randomly selected sample of 58 Swedish unbroken families with a small child. Working class men and women married younger, and the women were more often housewives. Working class men had more often been reared in an "authoritarian" way and more often reared their children in the same way. Upper middle class men had taken a more active part in the care of the child. Working class children scored lower on the intelligence tests, especially the verbal ones, and were more often judged to be socially immature.

  5. Stable and efficient retrospective 4D-MRI using non-uniformly distributed quasi-random numbers

    Science.gov (United States)

    Breuer, Kathrin; Meyer, Cord B.; Breuer, Felix A.; Richter, Anne; Exner, Florian; Weng, Andreas M.; Ströhle, Serge; Polat, Bülent; Jakob, Peter M.; Sauer, Otto A.; Flentje, Michael; Weick, Stefan

    2018-04-01

The purpose of this work is the development of a robust and reliable three-dimensional (3D) Cartesian imaging technique for fast and flexible retrospective 4D abdominal MRI during free breathing. To this end, a non-uniform quasi-random (NU-QR) reordering of the phase encoding (ky–kz) lines was incorporated into 3D Cartesian acquisition. The proposed sampling scheme allocates more phase encoding points near the k-space origin while reducing the sampling density in the outer part of the k-space. Respiratory self-gating in combination with SPIRiT-reconstruction is used for the reconstruction of abdominal data sets in different respiratory phases (4D-MRI). Six volunteers and three patients were examined at 1.5 T during free breathing. Additionally, data sets with conventional two-dimensional (2D) linear and 2D quasi-random phase encoding order were acquired for the volunteers for comparison. A quantitative evaluation of image quality versus scan times (from 70 s to 626 s) for the given sampling schemes was obtained by calculating the normalized mutual information (NMI) for all volunteers. Motion estimation was accomplished by calculating the maximum derivative of a signal intensity profile of a transition (e.g. tumor or diaphragm). The 2D non-uniform quasi-random distribution of phase encoding lines in Cartesian 3D MRI yields more efficient undersampling patterns for parallel imaging compared to conventional uniform quasi-random and linear sampling. Median NMI values of NU-QR sampling are the highest for all scan times. Therefore, within the same scan time 4D imaging could be performed with improved image quality. The proposed method allows for the reconstruction of motion artifact reduced 4D data sets with isotropic spatial resolution of 2.1 × 2.1 × 2.1 mm³ in a short scan time, e.g. 10 respiratory phases in only 3 min. Cranio-caudal tumor displacements between 23 and 46 mm could be observed. NU-QR sampling enables for stable 4D
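
    The variable-density quasi-random idea can be sketched with a Halton sequence warped toward the k-space centre. The warping exponent, grid size, and point count below are assumptions for illustration, not parameters from the paper.

```python
import math

def halton(i, base):
    """i-th element of the van der Corput sequence in the given base."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def nu_qr_pattern(n_points, ny=128, nz=128, power=2.0):
    """Quasi-random (Halton) points in the ky-kz plane, warped so that the
    sampling density is highest near the k-space centre (ky = kz = 0)."""
    pts = []
    for i in range(1, n_points + 1):
        u, v = halton(i, 2), halton(i, 3)
        # map uniform [0,1) onto [-1,1) with density concentrated at 0
        ky = math.copysign(abs(2 * u - 1) ** power, 2 * u - 1)
        kz = math.copysign(abs(2 * v - 1) ** power, 2 * v - 1)
        pts.append((int((ky + 1) / 2 * (ny - 1)), int((kz + 1) / 2 * (nz - 1))))
    return pts

pts = nu_qr_pattern(2000)
central = sum(1 for y, z in pts if 32 <= y < 96 and 32 <= z < 96)
print(f"fraction of samples in the central quarter: {central / len(pts):.2f}")
```

    About half of the points fall in the central quarter of the grid, versus one quarter for uniform sampling, while the quasi-random ordering keeps the pattern evenly spread rather than clumped.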

  6. Histological Pattern Of Endometrial Samples In Postmenopausal Women With Abnormal Uterine Bleeding.

    Science.gov (United States)

    Deeba, Farhat; Shaista; Khan, Bushra

    2016-01-01

Abnormal uterine bleeding is one of the most common clinical problems in gynaecological practice and is an indicator of various underlying disorders. An endometrial biopsy should be done in all women over 35 years with AUB to rule out endometrial cancer or pre-malignant lesion and to initiate treatment. However, the wide range of histological patterns on endometrial biopsy offers a diagnostic challenge to practicing pathologists. The objective of this study was to determine histological patterns of endometrium in postmenopausal women with abnormal uterine bleeding. This cross-sectional study was conducted in the department of obstetrics and gynaecology, Benazir Bhutto Shaheed women and children teaching hospital, Abbottabad from 15/11/2014 to 14/05/2015. This study involved 110 postmenopausal women presenting with abnormal uterine bleeding. A written informed consent was obtained from every patient. The mean age of the patients was 61.60±6.17 years and the mean duration of AUB was 5.20±2.80 years. Most of the patients were para 6 (28.2%) and para 5 (28.2%) followed by para 4 (18.2%) and para 3 (17.3%) while only 8.2% were para 1. The most common histological pattern observed was complex hyperplasia without atypia (30.9%) followed by atrophic endometrium (24.5%), simple hyperplasia (23.6%), malignancy (12.7%), complex hyperplasia with atypia (4.5%) and benign endometrial polyp (3.6%). When the data were stratified, there was no significant difference in histological patterns across age groups (p=.673), duration of AUB (p=.064) or parity (p=.242). The most common histological pattern observed in postmenopausal women with AUB was complex hyperplasia without atypia (30.9%) followed by atrophic endometrium (24.5%), simple hyperplasia (23.6%), malignancy (12.7%), complex hyperplasia with atypia (4.5%) and benign endometrial polyp (3.6%).

  7. A random matrix approach to VARMA processes

    International Nuclear Information System (INIS)

    Burda, Zdzislaw; Jarosz, Andrzej; Nowak, Maciej A; Snarska, Malgorzata

    2010-01-01

We apply random matrix theory to derive the spectral density of large sample covariance matrices generated by multivariate VMA(q), VAR(q) and VARMA(q1, q2) processes. In particular, we consider a limit where the number of random variables N and the number of consecutive time measurements T are large but the ratio N/T is fixed. In this regime, the underlying random matrices are asymptotically equivalent to free random variables (FRV). We apply the FRV calculus to calculate the eigenvalue density of the sample covariance for several VARMA-type processes. We explicitly solve the VARMA(1, 1) case and demonstrate perfect agreement between the analytical result and the spectra obtained by Monte Carlo simulations. The proposed method is purely algebraic and can be easily generalized to q1 > 1 and q2 > 1.
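
    The regime described, N and T large with N/T fixed, can be explored numerically. A minimal Monte Carlo sketch for a VAR(1)-type process (coefficient and matrix sizes chosen purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 100, 400              # ratio N/T = 0.25 held fixed as both grow
a = 0.5                      # VAR(1) coefficient

# Simulate N independent VAR(1) series: x_t = a * x_{t-1} + eps_t
eps = rng.standard_normal((N, T))
x = np.zeros((N, T))
for t in range(1, T):
    x[:, t] = a * x[:, t - 1] + eps[:, t]

# Spectrum of the sample covariance matrix C = X X^T / T
C = x @ x.T / T
eigs = np.linalg.eigvalsh(C)
print(f"eigenvalues span [{eigs.min():.2f}, {eigs.max():.2f}], "
      f"mean {eigs.mean():.2f} (stationary variance 1/(1-a^2) = {1/(1-a**2):.2f})")
```

    The mean eigenvalue tracks the stationary variance 1/(1−a²), while the spread of the spectrum reflects both the process autocorrelation and the finite N/T ratio; the paper's FRV result gives this density in closed form.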

  8. Programmable disorder in random DNA tilings

    Science.gov (United States)

    Tikhomirov, Grigory; Petersen, Philip; Qian, Lulu

    2017-03-01

    Scaling up the complexity and diversity of synthetic molecular structures will require strategies that exploit the inherent stochasticity of molecular systems in a controlled fashion. Here we demonstrate a framework for programming random DNA tilings and show how to control the properties of global patterns through simple, local rules. We constructed three general forms of planar network—random loops, mazes and trees—on the surface of self-assembled DNA origami arrays on the micrometre scale with nanometre resolution. Using simple molecular building blocks and robust experimental conditions, we demonstrate control of a wide range of properties of the random networks, including the branching rules, the growth directions, the proximity between adjacent networks and the size distribution. Much as combinatorial approaches for generating random one-dimensional chains of polymers have been used to revolutionize chemical synthesis and the selection of functional nucleic acids, our strategy extends these principles to random two-dimensional networks of molecules and creates new opportunities for fabricating more complex molecular devices that are organized by DNA nanostructures.

  9. Effects of behavioral patterns and network topology structures on Parrondo’s paradox

    Science.gov (United States)

    Ye, Ye; Cheong, Kang Hao; Cen, Yu-Wan; Xie, Neng-Gang

    2016-11-01

    A multi-agent Parrondo’s model based on complex networks is used in the current study. For Parrondo’s game A, the individual interaction can be categorized into five types of behavioral patterns: the Matthew effect, harmony, cooperation, poor-competition-rich-cooperation and a random mode. The parameter space of Parrondo’s paradox pertaining to each behavioral pattern, and the gradual change of the parameter space from a two-dimensional lattice to a random network and from a random network to a scale-free network was analyzed. The simulation results suggest that the size of the region of the parameter space that elicits Parrondo’s paradox is positively correlated with the heterogeneity of the degree distribution of the network. For two distinct sets of probability parameters, the microcosmic reasons underlying the occurrence of the paradox under the scale-free network are elaborated. Common interaction mechanisms of the asymmetric structure of game B, behavioral patterns and network topology are also revealed.
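
    The textbook capital-dependent form of Parrondo's games (not the multi-agent network model of this paper) shows the underlying paradox that the parameter-space analysis builds on: two individually losing games combine, under random alternation, into a winning one.

```python
import random

def parrondo(strategy, steps=200_000, eps=0.005, seed=7):
    """Play `steps` rounds; return the mean capital gain per round."""
    rng = random.Random(seed)
    capital = 0
    for _ in range(steps):
        if strategy(rng) == "A":
            p = 0.5 - eps                       # game A: slightly losing coin
        elif capital % 3 == 0:
            p = 0.1 - eps                       # game B, capital divisible by 3
        else:
            p = 0.75 - eps                      # game B, otherwise
        capital += 1 if rng.random() < p else -1
    return capital / steps

only_a = parrondo(lambda rng: "A")
only_b = parrondo(lambda rng: "B")
mixed = parrondo(lambda rng: rng.choice("AB"))  # random alternation of A and B
assert only_a < 0 and only_b < 0 and mixed > 0
```

    Random switching changes how often game B is played from its bad (capital divisible by 3) state, tilting the average winning probability above one half; the paper asks how agent interaction rules and network topology reshape the region of (p, eps)-type parameters where this happens.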

  10. Random Walk Model for the Growth of Monolayer in Dip Pen Nanolithography

    International Nuclear Information System (INIS)

    Kim, H; Ha, S; Jang, J

    2013-01-01

    By using a simple random-walk model, we simulate the growth of a self-assembled monolayer (SAM) pattern generated in dip pen nanolithography (DPN). In this model, the SAM pattern grows mainly via the serial pushing of molecules deposited from the tip. We examine various SAM patterns, such as lines, crosses, and letters by changing the tip scan speed.
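
    The serial-pushing growth picture resembles internal diffusion-limited aggregation. In this toy sketch (a standard model used as a stand-in, not necessarily the authors' exact rules), each deposited molecule random-walks from the tip position until it finds an empty lattice site:

```python
import random

def grow_monolayer(n_molecules=400, seed=3):
    """Each molecule is deposited at the tip position (the origin) and
    random-walks on the square lattice until it reaches an unoccupied
    site, where it attaches to the growing monolayer."""
    rng = random.Random(seed)
    occupied = {(0, 0)}
    for _ in range(n_molecules):
        x, y = 0, 0
        while (x, y) in occupied:
            dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            x, y = x + dx, y + dy
        occupied.add((x, y))
    return occupied

pattern = grow_monolayer()
radius = max(max(abs(x), abs(y)) for x, y in pattern)
print(f"{len(pattern)} sites, extent {radius}")
```

    For a stationary tip this produces a compact, roughly circular SAM dot; moving the deposition point during growth yields the lines and letters mentioned in the abstract.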

  11. Patterns of prescribing and utilization of asthma medications in a ...

    African Journals Online (AJOL)

    Purpose: To assess the prescribing patterns of asthma medications in a hospital in Dubai, United Arab Emirates (UAE) with regard to the demographic pattern of the population. Methods: One hundred fifty four patients, 83 male and 71 female, were randomly selected from the outpatient respiratory diseases clinic of a tertiary ...

  12. Symmetry pattern transition in cellular automata with complex behavior

    International Nuclear Information System (INIS)

    Sanchez, Juan R.; Lopez-Ruiz, Ricardo

    2008-01-01

    A transition from asymmetric to symmetric patterns in time-dependent extended systems is described. It is shown that one dimensional cellular automata, started from fully random initial conditions, can be forced to evolve into complex symmetrical patterns by stochastically coupling a proportion p of pairs of sites located at equal distance from the center of the lattice. A nontrivial critical value of p must be surpassed in order to obtain symmetrical patterns during the evolution. This strategy is able to classify the cellular automata rules - with complex behavior - between those that support time-dependent symmetric patterns and those which do not support such kind of patterns
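
    The stochastic mirror-coupling strategy can be sketched on elementary rule 30, a rule with complex behavior. The rule choice, lattice size, and coupling probability are illustrative; the paper's critical value of p is not computed here.

```python
import random

def step_rule30(cells):
    """One synchronous update of elementary rule 30 on a ring."""
    n = len(cells)
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n]) for i in range(n)]

def evolve_with_coupling(n=101, steps=300, p=0.3, seed=5):
    """Evolve from a fully random row; after each update, synchronize
    each mirror pair (i, n-1-i) with probability p."""
    rng = random.Random(seed)
    cells = [rng.randint(0, 1) for _ in range(n)]
    for _ in range(steps):
        cells = step_rule30(cells)
        for i in range(n // 2):
            if rng.random() < p:
                cells[n - 1 - i] = cells[i]     # copy value onto the mirror site
    return cells

def asymmetry(cells):
    """Fraction of mirror pairs whose values differ (0 = fully symmetric)."""
    n = len(cells)
    return sum(cells[i] != cells[n - 1 - i] for i in range(n // 2)) / (n // 2)

final = evolve_with_coupling()
print(f"residual asymmetry at p=0.3: {asymmetry(final):.2f}")
```

    With p = 0 the evolved row stays essentially asymmetric, while strong coupling drives the residual asymmetry toward zero; the paper's point is that a nontrivial threshold in p separates the two regimes.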

  13. 32nm and below logic patterning using optimized illumination and double patterning

    Science.gov (United States)

    Smayling, Michael C.; Axelrad, Valery

    2009-03-01

    Line/space dimensions for 32nm generation logic are expected to be ~45-50nm at ~90-100nm pitch. It is likely that the node will begin at the upper end of the range, and then shrink by ~10% to a "28nm" node. For the lower end of the range, even with immersion scanners, the Rayleigh k1 factor is below 0.32. The 22nm logic node should begin with minimum pitches of approximately 70nm, requiring some form of double patterning to maintain k1 above 0.25. Logic patterning has been more difficult than NAND Flash patterning because random logic was designed with complete "freedom" compared to the very regular patterns used in memory. The logic layouts with bends and multiple pitches resulted in larger rules, un-optimized illumination, and poorly understood process windows with little control of context-dependent "hot spots."[1] The introduction of logic design styles which use strictly one-directional lines for the critical levels now gives the opportunity for illumination optimization. Gridded Design Rules (GDR) have been demonstrated to give area-competitive layouts at existing 90, 65, and 45nm logic nodes while reducing CD variability.[2] These benefits can be extended to ≤32nm logic using selective double-pass patterning.

  14. Dissipative neutrino oscillations in randomly fluctuating matter

    International Nuclear Information System (INIS)

    Benatti, F.; Floreanini, R.

    2005-01-01

    The generalized dynamics describing the propagation of neutrinos in randomly fluctuating media is analyzed: It takes into account matter-induced, decoherence phenomena that go beyond the standard Mikheyev-Smirnov-Wolfenstein (MSW) effect. A widely adopted density fluctuation pattern is found to be physically untenable: A more general model needs to be instead considered, leading to flavor changing effective neutrino-matter interactions. They induce new, dissipative effects that modify the neutrino oscillation pattern in a way amenable to a direct experimental analysis

  16. Random Walks on Directed Networks: Inference and Respondent-Driven Sampling

    Directory of Open Access Journals (Sweden)

    Malmros Jens

    2016-06-01

    Respondent-driven sampling (RDS) is often used to estimate population properties (e.g., sexual risk behavior) in hard-to-reach populations. In RDS, already sampled individuals recruit population members to the sample from their social contacts in an efficient snowball-like sampling procedure. By assuming a Markov model for the recruitment of individuals, asymptotically unbiased estimates of population characteristics can be obtained. Current RDS estimation methodology assumes that the social network is undirected, that is, all edges are reciprocal. However, empirical social networks in general also include a substantial number of nonreciprocal edges. In this article, we develop an estimation method for RDS in populations connected by social networks that include reciprocal and nonreciprocal edges. We derive estimators of the selection probabilities of individuals as a function of the number of outgoing edges of sampled individuals. The proposed estimators are evaluated on artificial and empirical networks and are shown to generally perform better than existing estimators. This is the case in particular when the fraction of directed edges in the network is large.
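
The inverse-probability idea behind such estimators can be illustrated with the classical RDS-II (Volz-Heckathorn) form, which weights each respondent by the reciprocal of their degree; plugging in the number of outgoing edges, as the article proposes for directed networks, is shown here only schematically and is not the authors' exact estimator.

```python
def rds_ii_estimate(values, degrees):
    """Hajek-type (RDS-II) estimator of a population mean: a stationary
    random walk on an undirected network visits node i with probability
    proportional to its degree, so observations are weighted by 1/degree."""
    num = sum(y / d for y, d in zip(values, degrees))
    den = sum(1.0 / d for d in degrees)
    return num / den

# toy sample of a binary trait, where trait carriers have fewer contacts:
# the naive sample mean is 0.5, but inverse-degree weighting gives 2/3
est = rds_ii_estimate([1, 0, 1, 0], [1, 2, 1, 2])
```

The correction matters exactly when the trait of interest is correlated with connectedness, which is the usual situation in hard-to-reach populations.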

  17. Sample size adjustments for varying cluster sizes in cluster randomized trials with binary outcomes analyzed with second-order PQL mixed logistic regression.

    Science.gov (United States)

    Candel, Math J J M; Van Breukelen, Gerard J P

    2010-06-30

    Adjustments of sample size formulas are given for varying cluster sizes in cluster randomized trials with a binary outcome when testing the treatment effect with mixed effects logistic regression using second-order penalized quasi-likelihood estimation (PQL). Starting from first-order marginal quasi-likelihood (MQL) estimation of the treatment effect, the asymptotic relative efficiency of unequal versus equal cluster sizes is derived. A Monte Carlo simulation study shows this asymptotic relative efficiency to be rather accurate for realistic sample sizes, when employing second-order PQL. An approximate, simpler formula is presented to estimate the efficiency loss due to varying cluster sizes when planning a trial. In many cases sampling 14 per cent more clusters is sufficient to repair the efficiency loss due to varying cluster sizes. Since current closed-form formulas for sample size calculation are based on first-order MQL, planning a trial also requires a conversion factor to obtain the variance of the second-order PQL estimator. In a second Monte Carlo study, this conversion factor turned out to be 1.25 at most. (c) 2010 John Wiley & Sons, Ltd.
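
The "14 per cent more clusters" rule of thumb can be reproduced with the approximate relative-efficiency formula from the closely related MQL work of van Breukelen, Candel and Berger (2007); the second-order PQL adjustment and conversion factor in the paper differ in detail, so treat this as an illustration only.

```python
def varying_cluster_penalty(mean_size, icc, cv):
    """Approximate relative efficiency (RE) of unequal vs. equal cluster
    sizes: RE ~= 1 - cv**2 * lam * (1 - lam), with
    lam = n*icc / (n*icc + 1 - icc).  1/RE is the factor by which the
    number of clusters must be inflated to repair the efficiency loss."""
    lam = mean_size * icc / (mean_size * icc + 1.0 - icc)
    re = 1.0 - cv ** 2 * lam * (1.0 - lam)
    return re, 1.0 / re

# e.g. mean cluster size 20, ICC 0.05, coefficient of size variation 0.7
re, inflation = varying_cluster_penalty(20, 0.05, 0.7)   # RE ~0.88, ~14% more clusters
```

Since lam * (1 - lam) <= 1/4, the efficiency loss is bounded by cv**2 / 4, which is why a modest fixed inflation of the number of clusters usually suffices.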

  18. X-ray diffraction from thin films : Size/strain analysis and whole pattern fitting

    Energy Technology Data Exchange (ETDEWEB)

    Scardi, P. [Trento Univ. (Italy), Dept. of Materials Engineering]

    1996-09-01

    Line Profile Analysis (LPA) and whole pattern fitting may be used with success for the characterization of thin films from XRD data collected with the traditional Bragg-Brentano geometry. The size/strain analysis was conducted by an integrated procedure of profile modelling-assisted Fourier analysis, in order to measure the content of lattice imperfections and crystalline domain size along the growth direction in heteroepitaxial thin films. The microstructure of these films is typical of several PVD processes for the production of highly textured and low-defect thin crystalline layers. The same analysis could be conducted on random thin films as well, and in this case it is possible to determine an average crystallite size and shape. As will be shown in the paper, structural and microstructural parameters obtained by these methods may be correlated with thin film properties of technological interest. The whole pattern analysis may be used to obtain the information contained in a wide region of the diffraction pattern. This approach, currently used for the quantitative analysis of phase mixtures in traditional powder samples, was modified to account both for the size/strain effects, according to a simplified LPA, and for the structure of thin films and multi-layer systems. In this way, a detailed analysis based on a structural model for the present phases can be performed considering the real geometry of these samples. In particular, the quantitative phase analysis could be conducted in terms of layer thickness instead of volume or weight fractions.

  20. Application of bias factor method using random sampling technique for prediction accuracy improvement of critical eigenvalue of BWR

    International Nuclear Information System (INIS)

    Ito, Motohiro; Endo, Tomohiro; Yamamoto, Akio; Kuroda, Yusuke; Yoshii, Takashi

    2017-01-01

    The bias factor method based on the random sampling technique is applied to the benchmark problem of Peach Bottom Unit 2. Validity and availability of the present method, i.e. correction of calculation results and reduction of uncertainty, are confirmed in addition to features and performance of the present method. In the present study, core characteristics in cycle 3 are corrected with the proposed method using predicted and 'measured' critical eigenvalues in cycles 1 and 2. As the source of uncertainty, variance-covariance of cross sections is considered. The calculation results indicate that bias between predicted and measured results, and uncertainty owing to cross section can be reduced. Extension to other uncertainties such as thermal hydraulics properties will be a future task. (author)

  1. Recognition of building group patterns in topographic maps based on graph partitioning and random forest

    Science.gov (United States)

    He, Xianjin; Zhang, Xinchang; Xin, Qinchuan

    2018-02-01

    Recognition of building group patterns (i.e., the arrangement and form exhibited by a collection of buildings at a given mapping scale) is important to the understanding and modeling of geographic space and is hence essential to a wide range of downstream applications such as map generalization. Most of the existing methods develop rigid rules based on the topographic relationships between building pairs to identify building group patterns and thus their applications are often limited. This study proposes a method to identify a variety of building group patterns that allow for map generalization. The method first identifies building group patterns from potential building clusters based on a machine-learning algorithm and further partitions the building clusters with no recognized patterns based on the graph partitioning method. The proposed method is applied to the datasets of three cities that are representative of the complex urban environment in Southern China. Assessment of the results based on the reference data suggests that the proposed method is able to recognize both regular (e.g., the collinear, curvilinear, and rectangular patterns) and irregular (e.g., the L-shaped, H-shaped, and high-density patterns) building group patterns well, given that the correctness values are consistently nearly 90% and the completeness values are all above 91% for three study areas. The proposed method shows promises in automated recognition of building group patterns that allows for map generalization.

  2. Filter Bank Regularized Common Spatial Pattern Ensemble for Small Sample Motor Imagery Classification.

    Science.gov (United States)

    Park, Sang-Hoon; Lee, David; Lee, Sang-Goog

    2018-02-01

    For the last few years, many feature extraction methods have been proposed based on biological signals. Among these, the brain signals have the advantage that they can be obtained, even by people with peripheral nervous system damage. Motor imagery electroencephalograms (EEG) are inexpensive to measure, offer a high temporal resolution, and are intuitive. Therefore, these have received a significant amount of attention in various fields, including signal processing, cognitive science, and medicine. The common spatial pattern (CSP) algorithm is a useful method for feature extraction from motor imagery EEG. However, performance degradation occurs in a small-sample setting (SSS), because the CSP depends on sample-based covariance. Since the active frequency range is different for each subject, it is also inconvenient to set the frequency range to be different every time. In this paper, we propose a feature extraction method based on a filter bank to solve these problems. The proposed method consists of five steps. First, the motor imagery EEG is divided using a filter bank. Second, the regularized CSP (R-CSP) is applied to the divided EEG. Third, we select the features according to mutual information based on the individual feature algorithm. Fourth, parameter sets are selected for the ensemble. Finally, we classify using an ensemble based on the features. The brain-computer interface competition III data set IVa is used to evaluate the performance of the proposed method. The proposed method improves the mean classification accuracy by 12.34%, 11.57%, 9%, 4.95%, and 4.47% compared with CSP, SR-CSP, R-CSP, filter bank CSP (FBCSP), and SR-FBCSP. Compared with the filter bank R-CSP, which is a parameter-selection version of the proposed method, the classification accuracy is improved by 3.49%. In particular, the proposed method shows a large improvement in performance in the SSS.

  3. Comparison of address-based sampling and random-digit dialing methods for recruiting young men as controls in a case-control study of testicular cancer susceptibility.

    Science.gov (United States)

    Clagett, Bartholt; Nathanson, Katherine L; Ciosek, Stephanie L; McDermoth, Monique; Vaughn, David J; Mitra, Nandita; Weiss, Andrew; Martonik, Rachel; Kanetsky, Peter A

    2013-12-01

    Random-digit dialing (RDD) using landline telephone numbers is the historical gold standard for control recruitment in population-based epidemiologic research. However, increasing cell-phone usage and diminishing response rates suggest that the effectiveness of RDD in recruiting a random sample of the general population, particularly for younger target populations, is decreasing. In this study, we compared landline RDD with alternative methods of control recruitment, including RDD using cell-phone numbers and address-based sampling (ABS), to recruit primarily white men aged 18-55 years into a study of testicular cancer susceptibility conducted in the Philadelphia, Pennsylvania, metropolitan area between 2009 and 2012. With few exceptions, eligible and enrolled controls recruited by means of RDD and ABS were similar with regard to characteristics for which data were collected on the screening survey. While we find ABS to be a comparably effective method of recruiting young males compared with landline RDD, we acknowledge the potential impact that selection bias may have had on our results because of poor overall response rates, which ranged from 11.4% for landline RDD to 1.7% for ABS.

  4. Some Tests of Random Walk Hypothesis for Bulgarian Foreign Exchange Rates

    OpenAIRE

    Nikolai Gueorguiev

    1993-01-01

    The objective of this paper is to check if the exchange rate in newly emerged, relatively thin foreign exchange markets, follows a random walk pattern. The findings of the current study cast doubts on random walk presence in Bulgarian exchange rates against major international currencies. It turns out that the series of daily returns are stationary but correlated and therefore can be modelled better by higher-order ARIMA processes than by random walk.
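
A basic check of the random-walk hypothesis is a portmanteau test for autocorrelation in daily returns. The sketch below implements the Ljung-Box Q statistic from scratch on simulated series (not the Bulgarian exchange-rate data): i.i.d. returns are consistent with a random walk, while correlated AR(1) returns of the kind the paper reports reject it.

```python
import random

def ljung_box(returns, lags=10):
    """Ljung-Box Q statistic; under the random-walk null (uncorrelated
    returns) Q is approximately chi-square with `lags` degrees of freedom."""
    n = len(returns)
    mean = sum(returns) / n
    c0 = sum((r - mean) ** 2 for r in returns) / n
    q = 0.0
    for k in range(1, lags + 1):
        ck = sum((returns[i] - mean) * (returns[i - k] - mean)
                 for i in range(k, n)) / n
        q += (ck / c0) ** 2 / (n - k)   # squared autocorrelation, weighted
    return n * (n + 2) * q

rng = random.Random(0)
q_iid = ljung_box([rng.gauss(0, 1) for _ in range(500)])  # small Q: null stands
ar = [0.0]
for _ in range(499):
    ar.append(0.6 * ar[-1] + rng.gauss(0, 1))             # correlated returns
q_ar = ljung_box(ar)                                      # large Q: null rejected
```

With 10 lags the 5% critical value is about 18.3, so the AR(1) series fails the test by a wide margin.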

  5. Missing citations due to exact reference matching: Analysis of a random sample from WoS. Are publications from peripheral countries disadvantaged?

    Energy Technology Data Exchange (ETDEWEB)

    Donner, P.

    2016-07-01

    Citation counts of scientific research contributions are fundamental data in scientometrics. Accuracy and completeness of citation links are therefore crucial data quality issues (Moed, 2005, Ch. 13). However, despite the known flaws of reference matching algorithms, usually no attempts are made to incorporate uncertainty about citation counts into indicators. This study is a step towards that goal. Particular attention is paid to the question whether publications from countries not using basic Latin script are differently affected by missed citations. The proprietary reference matching procedure of Web of Science (WoS) is based on (near) exact agreement of cited reference data (normalized during processing) to the target paper's bibliographical data. Consequently, the procedure has near-optimal precision but incomplete recall - it is known to miss some slightly inaccurate reference links (Olensky, 2015). However, there has been no attempt so far to estimate the rate of missed citations by a principled method for a random sample. For this study a simple random sample of WoS source papers was drawn and it was attempted to find all reference strings of WoS indexed documents that refer to them, in particular inexact matches. The objective is to give a statistical estimate of the proportion of missed citations and to describe the relationship of the number of found citations to the number of missed citations, i.e. the conditional error distribution. The empirical error distribution is statistically analyzed and modelled. (Author)

  6. Sampling from the normal and exponential distributions

    International Nuclear Information System (INIS)

    Chaplin, K.R.; Wills, C.A.

    1982-01-01

    Methods for generating random numbers from the normal and exponential distributions are described. These involve dividing each function into subregions, and for each of these developing a method of sampling usually based on an acceptance rejection technique. When sampling from the normal or exponential distribution, each subregion provides the required random value with probability equal to the ratio of its area to the total area. Procedures written in FORTRAN for the CYBER 175/CDC 6600 system are provided to implement the two algorithms
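
The acceptance-rejection idea can be illustrated (in Python rather than the original FORTRAN, and without the subregion decomposition of the CYBER/CDC routines) with the textbook method of sampling the normal distribution through an exponential envelope.

```python
import math
import random

def sample_exponential(rng, lam=1.0):
    """Inverse-transform sampling: x = -ln(1 - u) / lam for u ~ U(0, 1)."""
    return -math.log(1.0 - rng.random()) / lam

def sample_normal(rng):
    """Standard normal by acceptance-rejection: propose from an Exp(1)
    envelope of the half-normal, accept with probability
    exp(-(x - 1)^2 / 2), then attach a random sign."""
    while True:
        x = sample_exponential(rng)
        if rng.random() <= math.exp(-0.5 * (x - 1.0) ** 2):
            return x if rng.random() < 0.5 else -x

rng = random.Random(42)
draws = [sample_normal(rng) for _ in range(20000)]
```

The acceptance probability is about 76%, so on average roughly 1.3 exponential proposals are consumed per normal variate.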

  7. Patterns of alcohol consumption in 10 European countries participating in the EPIC project

    DEFF Research Database (Denmark)

    Sieri, S.; Agudo, A.; Kesse, E.

    2002-01-01

    The 10 participating countries are characterised by widely differing drinking habits and frequencies of alcohol-related diseases. Methods: We collected a single standardised 24-hour dietary recall per subject from a random sample of the EPIC cohort (36 900 persons initially and 35 955 after exclusion of subjects under 35 and over 74 years of age). This provided detailed information on the distribution of alcohol consumption during the day in relation to main meals, and was used to determine weekly consumption patterns. The crude and adjusted (by age, day of week and season) means of total ethanol consumption and consumption according to type of beverage were stratified by centre and sex. Results: Sex was a strong determinant of drinking patterns in all 10 countries. The highest total alcohol consumption was observed in the Spanish centres (San Sebastian, 41.4 g day−1) for men and in Danish centres (Copenhagen, 20.9 g day−1) for women. The lowest total...

  8. Methodology Series Module 5: Sampling Strategies

    OpenAIRE

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'sampling method'. There are essentially two types of sampling methods: 1) probability sampling, based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling, based on the researcher's choice and a population that is accessible and available. Some of the non-probabilit...

  9. Positive random variables with a discrete probability mass at the origin: Parameter estimation for left-censored samples with application to air quality monitoring data

    International Nuclear Information System (INIS)

    Gogolak, C.V.

    1986-11-01

    The concentration of a contaminant measured in a particular medium might be distributed as a positive random variable when it is present, but it may not always be present. If there is a level below which the concentration cannot be distinguished from zero by the analytical apparatus, a sample from such a population will be censored on the left. The presence of both zeros and positive values in the censored portion of such samples complicates the problem of estimating the parameters of the underlying positive random variable and the probability of a zero observation. Using the method of maximum likelihood, it is shown that the solution to this estimation problem reduces largely to that of estimating the parameters of the distribution truncated at the point of censorship. The maximum likelihood estimate of the proportion of zero values follows directly. The derivation of the maximum likelihood estimates for a lognormal population with zeros is given in detail, and the asymptotic properties of the estimates are examined. The estimation method was used to fit several different distributions to a set of severely censored 85 Kr monitoring data from six locations at the Savannah River Plant chemical separations facilities
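
The two-step scheme described above (fit the distribution truncated at the censoring point, then recover the zero proportion directly) can be sketched for a lognormal with a point mass at zero. The coarse grid search, the detection limit, and the simulated data are all stand-ins for illustration, not the report's actual procedure or data.

```python
import math
import random

def lognorm_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((math.log(x) - mu) / (sigma * math.sqrt(2.0))))

def fit_delta_lognormal(observed, n_censored, limit):
    """Fit (mu, sigma) by maximum likelihood to the lognormal truncated at
    the censoring limit (coarse grid search in place of a real optimizer),
    then recover the zero proportion delta from
    P(censored) = delta + (1 - delta) * F(limit)."""
    n_obs = len(observed)
    logs = [math.log(x) for x in observed]

    def nll(mu, sigma):  # truncated-lognormal negative log-likelihood
        tail = 1.0 - lognorm_cdf(limit, mu, sigma)
        return (n_obs * (math.log(sigma) + math.log(tail))
                + sum(0.5 * ((lx - mu) / sigma) ** 2 for lx in logs))

    mu_hat, sigma_hat = min(
        ((m / 100.0, s / 100.0) for m in range(-100, 101, 5)
                                for s in range(50, 201, 5)),
        key=lambda p: nll(*p))
    f_limit = lognorm_cdf(limit, mu_hat, sigma_hat)
    p_cens = n_censored / (n_censored + n_obs)
    delta_hat = max(0.0, (p_cens - f_limit) / (1.0 - f_limit))
    return mu_hat, sigma_hat, delta_hat

# simulated data: 30% true zeros, lognormal(0, 1) otherwise, detection limit 0.5
rng = random.Random(3)
observed, n_cens = [], 0
for _ in range(2000):
    if rng.random() < 0.3:
        n_cens += 1                      # true zero
    else:
        x = math.exp(rng.gauss(0.0, 1.0))
        if x <= 0.5:
            n_cens += 1                  # present but below detection limit
        else:
            observed.append(x)
mu_hat, sigma_hat, delta_hat = fit_delta_lognormal(observed, n_cens, 0.5)
```

The key point mirrors the abstract: zeros and sub-limit positives are indistinguishable individually, but once (mu, sigma) are pinned down by the uncensored tail, the zero proportion follows from the overall censored fraction.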

  10. Professional ASP.NET Design Patterns

    CERN Document Server

    Millett, Scott

    2010-01-01

    Professional ASP.NET Design Patterns will show you how to implement design patterns in real ASP.NET applications by introducing you to the basic OOP skills needed to understand and interpret design patterns. A sample application used throughout the book is an enterprise level ASP.NET website with multi-tiered, SOA design techniques that can be applied to your future ASP.NET projects. Read about each design pattern in detail, including how to interpret the UML design, how to implement it in ASP.NET, its importance for ASP.NET development, and how it's integrated into the final project.

  11. Sample size determinations for group-based randomized clinical trials with different levels of data hierarchy between experimental and control arms.

    Science.gov (United States)

    Heo, Moonseong; Litwin, Alain H; Blackstock, Oni; Kim, Namhee; Arnsten, Julia H

    2017-02-01

    We derived sample size formulae for detecting main effects in group-based randomized clinical trials with different levels of data hierarchy between experimental and control arms. Such designs are necessary when experimental interventions need to be administered to groups of subjects whereas control conditions need to be administered to individual subjects. This type of trial, often referred to as a partially nested or partially clustered design, has been implemented for management of chronic diseases such as diabetes and is beginning to emerge more commonly in wider clinical settings. Depending on the research setting, the level of hierarchy of data structure for the experimental arm can be three or two, whereas that for the control arm is two or one. Such different levels of data hierarchy assume correlation structures of outcomes that are different between arms, regardless of whether research settings require two or three level data structure for the experimental arm. Therefore, the different correlations should be taken into account for statistical modeling and for sample size determinations. To this end, we considered mixed-effects linear models with different correlation structures between experimental and control arms to theoretically derive and empirically validate the sample size formulae with simulation studies.

  12. Geostatistical analysis of allele presence patterns among American black bears in eastern North Carolina

    Science.gov (United States)

    Thompson, L.M.; Van Manen, F.T.; King, T.L.

    2005-01-01

    Highways are one of the leading causes of wildlife habitat fragmentation and may particularly affect wide-ranging species, such as American black bears (Ursus americanus). We initiated a research project in 2000 to determine potential effects of a 4-lane highway on black bear ecology in Washington County, North Carolina. The research design included a treatment area (highway construction) and a control area and a pre- and post-construction phase. We used data from the pre-construction phase to determine whether we could detect scale dependency or directionality among allele occurrence patterns using geostatistics. Detection of such patterns could provide a powerful tool to measure the effects of landscape fragmentation on gene flow. We sampled DNA from roots of black bear hair at 70 hair-sampling sites on each study area for 7 weeks during fall of 2000. We used microsatellite analysis based on 10 loci to determine unique multi-locus genotypes. We examined all alleles sampled at ≥25 sites on each study area and mapped their presence or absence at each hair-sample site. We calculated semivariograms, which measure the strength of statistical correlation as a function of distance, and adjusted them for anisotropy to determine the maximum direction of spatial continuity. We then calculated the mean direction of spatial continuity for all examined alleles. The mean direction of allele frequency variation was 118.3° (SE = 8.5) on the treatment area and 172.3° (SE = 6.0) on the control area. Rayleigh's tests showed that these directions differed from random distributions (P = 0.028 and P < 0.001, respectively), indicating consistent directional patterns for the alleles we examined in each area. Despite the small spatial scale of our study (approximately 11,000 ha for each study area), we observed distinct and consistent patterns of allele occurrence, suggesting different directions of gene flow between the study areas. These directions seemed to coincide with the

  13. A randomized trial of a DWI intervention program for first offenders: intervention outcomes and interactions with antisocial personality disorder among a primarily American-Indian sample.

    Science.gov (United States)

    Woodall, W Gill; Delaney, Harold D; Kunitz, Stephen J; Westerberg, Verner S; Zhao, Hongwei

    2007-06-01

    Randomized trial evidence on the effectiveness of incarceration and treatment of first-time driving while intoxicated (DWI) offenders who are primarily American Indian has yet to be reported in the literature on DWI prevention. Further, research has confirmed the association of antisocial personality disorder (ASPD) with problems with alcohol including DWI. A randomized clinical trial was conducted, in conjunction with 28 days of incarceration, of a treatment program incorporating motivational interviewing principles for first-time DWI offenders. The sample of 305 offenders including 52 diagnosed as ASPD by the Diagnostic Interview Schedule were assessed before assignment to conditions and at 6, 12, and 24 months after discharge. Self-reported frequency of drinking and driving as well as various measures of drinking over the preceding 90 days were available at all assessments for 244 participants. Further, DWI rearrest data for 274 participants were available for analysis. Participants randomized to receive the first offender incarceration and treatment program reported greater reductions in alcohol consumption from baseline levels when compared with participants who were only incarcerated. Antisocial personality disorder participants reported heavier and more frequent drinking but showed significantly greater declines in drinking from intake to posttreatment assessments. Further, the treatment resulted in larger effects relative to the control on ASPD than non-ASPD participants. Nonconfrontational treatment may significantly enhance outcomes for DWI offenders with ASPD when delivered in an incarcerated setting, and in the present study, such effects were found in a primarily American-Indian sample.

  14. Particle-associated flagellates: swimming patterns, colonization rates, and grazing on attached bacteria

    DEFF Research Database (Denmark)

    Kiørboe, Thomas; Grossart, H.P.; Ploug, H.

    2004-01-01

    Some pelagic flagellates colonize particles, such as marine snow, where they graze on bacteria and thus impact the dynamics of the attached microbial communities. Particle colonization is governed by motility. Swimming patterns of 2 particle-associated flagellates, Bodo designis and Spumella sp., are very different, the former swimming slowly in an erratic, random pattern, and the latter faster and along smooth helixes of variable amplitude and frequency. At spatial scales exceeding ca. 50 μm, the motility of B. designis can be described as a random walk and modeled as diffusion. Spumella sp...

  15. Pattern of disability among persons who availed half-way home-care services for psychosocial rehabilitation

    Directory of Open Access Journals (Sweden)

    M Ranganathan

    2012-01-01

    Background: There is a dearth of studies related to the pattern of disability among persons who have availed psychosocial rehabilitation services in India. We studied the pattern of disability among persons who availed half-way home-care services for psychosocial rehabilitation. Materials and Methods: Out of 130 case files of discharged patients, 50 files were randomly selected for data collection. The Indian Disability Evaluation and Assessment Schedule was used to assess the pattern of disability in the sample. Results: The study revealed that only one-third (35%) of the residents had disability in self-care, 41% in communication and understanding, and 47% in interpersonal relationship. Overall, the majority (76%) of the respondents had a moderate level of psychiatric disability at the time of discharge from the half-way home. There was no significant relationship between gender and type of psychiatric illness with the level of disability. The overall disability correlated positively with the duration of illness (rs = 0.39). Conclusion: Three-fourths of the residents who availed half-way home-care services had a moderate level of disability.

  16. In Search of Meaning: Are School Rampage Shootings Random and Senseless Violence?

    Science.gov (United States)

    Madfis, Eric

    2017-01-02

    This article discusses Joel Best's (1999) notion of random violence and applies his concepts of pointlessness, patternlessness, and deterioration to the reality about multiple-victim school shootings gleaned from empirical research about the phenomenon. Best describes how violence is rarely random, as scholarship reveals myriad observable patterns, lots of discernable motives and causes, and often far too much fear-mongering over how bad society is getting and how violent we are becoming. In contrast, it is vital that the media, scholars, and the public better understand crime patterns, criminal motivations, and the causes of fluctuating crime rates. As an effort toward such progress, this article reviews the academic literature on school rampage shootings and explores the extent to which these attacks are and are not random acts of violence.

  17. Estimating the encounter rate variance in distance sampling

    Science.gov (United States)

    Fewster, R.M.; Buckland, S.T.; Burnham, K.P.; Borchers, D.L.; Jupp, P.E.; Laake, J.L.; Thomas, L.

    2009-01-01

    The dominant source of variance in line transect sampling is usually the encounter rate variance. Systematic survey designs are often used to reduce the true variability among different realizations of the design, but estimating the variance is difficult and estimators typically approximate the variance by treating the design as a simple random sample of lines. We explore the properties of different encounter rate variance estimators under random and systematic designs. We show that a design-based variance estimator improves upon the model-based estimator of Buckland et al. (2001, Introduction to Distance Sampling. Oxford: Oxford University Press, p. 79) when transects are positioned at random. However, if populations exhibit strong spatial trends, both estimators can have substantial positive bias under systematic designs. We show that poststratification is effective in reducing this bias. © 2008, The International Biometric Society.
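The design-based approach discussed above can be sketched numerically. The snippet below implements one common form of the encounter rate variance estimator, treating the transects as a simple random sample of lines; the survey data are simulated and all names are illustrative, so this is a sketch rather than the authors' exact estimator.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical survey: 20 transects with lengths l_i and detection counts n_i.
lengths = rng.uniform(2.0, 5.0, size=20)   # transect lengths (km)
counts = rng.poisson(3.0 * lengths)        # detections per transect

def encounter_rate_var(n, l):
    """Design-based variance of the encounter rate n/L, treating the
    transects as a simple random sample of lines (illustrative form)."""
    k = len(n)
    L = l.sum()
    er = n.sum() / L                        # overall encounter rate
    # length-weighted squared deviations of per-transect rates
    var = k / (L**2 * (k - 1)) * np.sum(l**2 * (n / l - er) ** 2)
    return er, var

er, var = encounter_rate_var(counts, lengths)
print(f"encounter rate = {er:.3f} per km, variance = {var:.5f}")
```

Poststratification, as recommended for systematic designs, would apply the same formula within pairs of neighbouring transects and sum the contributions.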

  18. Random-effects linear modeling and sample size tables for two special crossover designs of average bioequivalence studies: the four-period, two-sequence, two-formulation and six-period, three-sequence, three-formulation designs.

    Science.gov (United States)

    Diaz, Francisco J; Berg, Michel J; Krebill, Ron; Welty, Timothy; Gidal, Barry E; Alloway, Rita; Privitera, Michael

    2013-12-01

    Due to concern and debate in the epilepsy medical community and to the current interest of the US Food and Drug Administration (FDA) in revising approaches to the approval of generic drugs, the FDA is currently supporting ongoing bioequivalence studies of antiepileptic drugs, the EQUIGEN studies. During the design of these crossover studies, the researchers could not find commercial or non-commercial statistical software that quickly allowed computation of sample sizes for their designs, particularly software implementing the FDA requirement of using random-effects linear models for the analyses of bioequivalence studies. This article presents tables for sample-size evaluations of average bioequivalence studies based on the two crossover designs used in the EQUIGEN studies: the four-period, two-sequence, two-formulation design, and the six-period, three-sequence, three-formulation design. Sample-size computations assume that random-effects linear models are used in bioequivalence analyses with crossover designs. Random-effects linear models have been traditionally viewed by many pharmacologists and clinical researchers as just mathematical devices to analyze repeated-measures data. In contrast, a modern view of these models attributes an important mathematical role in theoretical formulations in personalized medicine to them, because these models not only have parameters that represent average patients, but also have parameters that represent individual patients. Moreover, the notation and language of random-effects linear models have evolved over the years. Thus, another goal of this article is to provide a presentation of the statistical modeling of data from bioequivalence studies that highlights the modern view of these models, with special emphasis on power analyses and sample-size computations.

  19. Antibiotic Resistance Among Staphylococcus aureus and Escherichia coli Isolated From Traditional and Industrial Food Samples

    Directory of Open Access Journals (Sweden)

    Mojtaba Arslani

    2017-05-01

    Full Text Available Background: Foodborne diseases are one of the serious problems in the world. Every year, more than 100 million people are affected by foodborne and waterborne diseases, particularly immunocompromised individuals. Objectives: The aim of the present study was to evaluate the bacterial load and antibiotic resistance pattern of bacterial isolates from food samples of meat, dairy, and pastry products from the west of Tehran, Iran, during April 2007 to March 2008. Materials and Methods: A total of 1625 different food samples, including dairy products, meat and pastries, were collected randomly from different parts of the west of Tehran. All samples were kept at 4°C. The samples were first cultured according to standard bacteriological methods, and then Staphylococcus aureus and Escherichia coli isolates were identified using standard bacteriological tests. Antimicrobial susceptibility testing was performed by the disk diffusion method according to Clinical & Laboratory Standards Institute (CLSI) guidelines. Results: During 2007 and 2008, 2.8% and 3% of the food samples, respectively, were contaminated with S. aureus. Similarly, 3.5% and 6.4% of the food samples were contaminated with E. coli. E. coli isolates were highly resistant to amikacin and cefotaxime, and this resistance increased in 2008. Similarly, S. aureus isolates were resistant to ciprofloxacin, cefotaxime, gentamicin, and tetracycline. There was no significant difference between 2007 and 2008. Conclusion: The rate of contamination with S. aureus, isolated from the food samples, was 2.8% during 2007 and 3% during 2008. Further studies should be done to determine changes in the bacterial resistance pattern for various food samples. Thus, a baseline for comparison with future prospective studies should be established, enabling the determination of trends over time.

  20. Some regional variations in dietary patterns in a random sample of British adults.

    Science.gov (United States)

    Whichelow, M J; Erzinclioglu, S W; Cox, B D

    1991-05-01

    Comparison was made of the reported frequency of consumption or choice of 30 food items by 8860 adults in the 11 standard regions of Great Britain, with the use of log-linear analysis to allow for the age, sex, social class and smoking-habit variations between the regions. The South-East was taken as the base region against which the others were compared. The number of food items for which there were significant differences from the South-East was 23 for Scotland, 25 for the North, 20 for the North-West and Yorkshire/Humberside, 19 for Wales, 15 for the West Midlands, 10 for the East Midlands, 8 for East Anglia, 7 for the South-West and 9 for Greater London. Overall the findings confirm a North/South trend in relation to eating habits, even when demographic and smoking-habit variations are taken into account, with the frequent consumption of many fruit and vegetable products being much less common, and of several high-fat foods (chips, processed meats and fried food) more common, in Scotland, Wales and the northern part of England. In most regions there was a significantly lower frequency of consumption of fresh fruit, fruit juice, 'brown' bread, pasta/rice, poultry, skimmed/semi-skimmed milk, light desserts and nuts, and a higher consumption of red meat, fish and fried food than in the South-East.

  1. Sampling pig farms at the abattoir in a cross-sectional study - Evaluation of a sampling method.

    Science.gov (United States)

    Birkegård, Anna Camilla; Halasa, Tariq; Toft, Nils

    2017-09-15

    A cross-sectional study design is relatively inexpensive, fast and easy to conduct when compared to other study designs. Careful planning is essential to obtaining a representative sample of the population, and the recommended approach is to use simple random sampling from an exhaustive list of units in the target population. This approach is rarely feasible in practice, and other sampling procedures must often be adopted. For example, when slaughter pigs are the target population, sampling the pigs on the slaughter line may be an alternative to on-site sampling at a list of farms. However, it is difficult to sample a large number of farms from an exact predefined list, due to the logistics and workflow of an abattoir. Therefore, it is necessary to have a systematic sampling procedure and to evaluate the obtained sample with respect to the study objective. We propose a method for 1) planning, 2) conducting, and 3) evaluating the representativeness and reproducibility of a cross-sectional study when simple random sampling is not possible. We used an example of a cross-sectional study with the aim of quantifying the association of antimicrobial resistance and antimicrobial consumption in Danish slaughter pigs. It was not possible to visit farms within the designated timeframe. Therefore, it was decided to use convenience sampling at the abattoir. Our approach was carried out in three steps: 1) planning: using data from meat inspection to plan at which abattoirs and how many farms to sample; 2) conducting: sampling was carried out at five abattoirs; 3) evaluation: representativeness was evaluated by comparing sampled and non-sampled farms, and the reproducibility of the study was assessed through simulated sampling based on meat inspection data from the period where the actual data collection was carried out. In the cross-sectional study, samples were taken from 681 Danish pig farms during five weeks from February to March 2015. The evaluation showed that the sampling

  2. Prescription pattern of medication in the elderly residing in nursing homes in Tehran

    Directory of Open Access Journals (Sweden)

    Sabour Malihe

    2014-01-01

    Full Text Available Objective: This study aimed to investigate prescription patterns for older people in nursing homes in Tehran. Methods: In this cross-sectional study, data on 170 older people, selected by random cluster sampling, were gathered from medical records, questionnaires and interviews with nurses and physicians. Results: The mean age of the sample was 79.75 years, and 64.7% were female. 62.4% received more than 5 types of medicine. The mean number of medicines was 7.55, with a range of 1-19 drugs. The dosage forms most used by older people were tablets (98.2%), injections (20.6%), drops (13.5%), syrups (8.8%), sprays (6.5%), and ointments and suppositories (2.9%). There was no significant relationship between participation in a geriatric educational course and the mean number of prescribed medications (P>0.05), nor between medical specialty and the mean number of medications (P>0.05). There was a significant relationship between having health insurance and the mean number of prescribed medicines (P<0.05). Conclusion: Developing educational programs on geriatric pharmacology for general practitioners, and more supervision of residential care home practices, may affect prescription patterns.

  3. The concentration of heavy metals: zinc, cadmium, lead, copper, mercury, iron and calcium in head hair of a randomly selected sample of Kenyan people

    International Nuclear Information System (INIS)

    Wandiga, S.O.; Jumba, I.O.

    1982-01-01

    An intercomparative analysis of the concentrations of heavy metals (zinc, cadmium, lead, copper, mercury, iron and calcium) in head hair of a randomly selected sample of Kenyan people, using the techniques of atomic absorption spectrophotometry (AAS) and differential pulse anodic stripping voltammetry (DPAS), has been undertaken. The percent relative standard deviation for each sample analysed using either technique shows good sensitivity and correlation between the techniques. The DPAS was found to be slightly more sensitive than the AAS instrument used. The recalculated body burden ratios of Cd to Zn and Pb to Fe reveal no unusual health impairment symptoms and suggest a relatively clean environment in Kenya. (author)

  4. GRD: An SPSS extension command for generating random data

    Directory of Open Access Journals (Sweden)

    Bradley Harding

    2014-09-01

    Full Text Available To master statistics and data analysis tools, it is necessary to understand a number of concepts, many of which are quite abstract. For example, sampling from a theoretical distribution can help individuals explore and understand randomness. Sampling can also be used to build exercises aimed at helping students master statistics. Here, we present GRD (Generator of Random Data), an extension command for SPSS (version 17 and above). With GRD, it is possible to get random data from a given distribution. In its simplest use, GRD will return a set of simulated data from a normal distribution. With subcommands to GRD, it is possible to get data from multiple groups, over multiple repeated measures, and with desired effect sizes. Group sizes can be equal or unequal. With further subcommands, it is possible to sample from any theoretical population (not simply the normal distribution), introduce non-homogeneous variances, fix or randomize subject effects, etc. Finally, GRD's generated data are in a format ready to be analyzed.
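GRD itself is an SPSS extension command, but its simplest use, drawing simulated samples for several groups from a normal distribution with chosen group means, can be approximated in a few lines of Python. The function name and arguments below are hypothetical analogues for illustration, not part of GRD.

```python
import numpy as np

rng = np.random.default_rng(0)

def generate_groups(n_per_group, means, sd=1.0):
    """Rough Python analogue of GRD's simplest use: draw normal samples
    for several groups with specified means (group-mean differences play
    the role of GRD's 'desired effect size'). Names are illustrative."""
    return {f"group{i + 1}": rng.normal(mu, sd, size=n_per_group)
            for i, mu in enumerate(means)}

data = generate_groups(n_per_group=100, means=[0.0, 0.5, 1.0])
for name, values in data.items():
    print(name, round(values.mean(), 2))
```

Extensions such as repeated measures, unequal group sizes or non-homogeneous variances amount to extra parameters on the same sampling call.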

  5. Publication Bias in Psychology: A Diagnosis Based on the Correlation between Effect Size and Sample Size

    Science.gov (United States)

    Kühberger, Anton; Fritz, Astrid; Scherndl, Thomas

    2014-01-01

    Background The p value obtained from a significance test provides no information about the magnitude or importance of the underlying phenomenon. Therefore, additional reporting of effect size is often recommended. Effect sizes are theoretically independent from sample size. Yet this may not hold true empirically: non-independence could indicate publication bias. Methods We investigate whether effect size is independent from sample size in psychological research. We randomly sampled 1,000 psychological articles from all areas of psychological research. We extracted p values, effect sizes, and sample sizes of all empirical papers, calculated the correlation between effect size and sample size, and investigated the distribution of p values. Results We found a negative correlation of r = −.45 [95% CI: −.53; −.35] between effect size and sample size. In addition, we found an inordinately high number of p values just passing the boundary of significance. Additional data showed that neither implicit nor explicit power analysis could account for this pattern of findings. Conclusion The negative correlation between effect size and sample size, and the biased distribution of p values, indicate pervasive publication bias in the entire field of psychology. PMID:25192357
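The mechanism proposed above, publication bias inducing a negative correlation between effect size and sample size, can be illustrated with a small simulation. Assume a simple model (all numbers illustrative) in which every study estimates the same true effect and only "significant" results get published:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated literature: many studies of the same small true effect,
# with varying per-group sample sizes.
true_d = 0.2
n = rng.integers(10, 200, size=5000)     # per-group sample size
se = np.sqrt(2.0 / n)                    # s.e. of the effect estimate
d_hat = rng.normal(true_d, se)           # observed effect size

published = np.abs(d_hat / se) > 1.96    # keep only 'significant' results

r_all = np.corrcoef(d_hat, n)[0, 1]
r_pub = np.corrcoef(d_hat[published], n[published])[0, 1]
print(f"r(effect, n) all studies: {r_all:.2f}, published only: {r_pub:.2f}")
```

Among all studies the correlation is near zero, but among "published" ones it is clearly negative: small studies can only reach significance with inflated effect estimates, which is the signature the article reports.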

  6. Airflow patterns in complex workplaces

    International Nuclear Information System (INIS)

    Mishima, J.; Selby, J.M.; Lynch, T.P.; Langer, G.; Vallario, E.J.

    1987-01-01

    There are many considerations in obtaining an accurate evaluation of aerosols. One aspect that has been neglected is the study of airflow patterns within the workplace. In many nuclear facilities, the operations performed require extensive equipment (e.g., glove boxes, piping) that creates complex arrangements of physical barriers to flow. To provide samples of the airborne materials, particularly particles, knowledge of these complex airflow patterns is required for sampler placement. Recent studies have shown that materials introduced into the airflow within a workplace act as plumes embedded in major airflow streams. Portions of the plumes can recycle through the ventilated area, be lost to dead air pockets, or exhaust through unusual, unexpected outlets. Unusual flow patterns are observed even in relatively uncomplicated arrangements of equipment. This behavior must be factored into sampling/monitoring programs for evaluation of the airborne hazard to personnel within the workplace, consistent with the objective of the program. Other factors that must also be considered to provide valid samples of airborne particulate materials are the objectives of the sampling program, the characteristics of the airborne particulate materials, nonsegregatory transport of the extracted materials, and the requirements of the measurement techniques used.

  7. Dietary intake patterns and diet quality in a nationally representative sample of women with and without severe headache or migraine.

    Science.gov (United States)

    Evans, E Whitney; Lipton, Richard B; Peterlin, B Lee; Raynor, Hollie A; Thomas, J Graham; O'Leary, Kevin C; Pavlovic, Jelena; Wing, Rena R; Bond, Dale S

    2015-04-01

    The role of diet in migraine is not well understood. We sought to characterize usual dietary intake patterns and diet quality in a nationally representative sample of women with and without severe headache or migraine. We also examined whether the relationship between migraine and diet differs by weight status. Migraine or severe headache status was determined by questionnaire for 3069 women, ages 20-50 years, who participated in the National Health and Nutrition Examination Survey, 1999-2004. Women who experienced severe headaches or migraines were classified as migraine for the purposes of this analysis. Dietary intake patterns (micro- and macronutrient intake and eating frequency) and diet quality, measured by the Healthy Eating Index, 2005, were determined using one 24-hour dietary recall. Dietary intake patterns did not significantly differ between women with and without migraine. Normal weight women with migraine had significantly lower diet quality (Healthy Eating Index, 2005 total scores) than women without migraine (52.5 ± 0.9 vs. 45.9 ± 1.0), suggesting that diet quality differs by migraine status in normal weight women. Prospective analyses are needed to establish how diet relates to migraine onset, characteristics, and clinical features in individuals of varying weight status. © 2015 American Headache Society.

  8. Equilibrium Molecular Thermodynamics from Kirkwood Sampling

    OpenAIRE

    Somani, Sandeep; Okamoto, Yuko; Ballard, Andrew J.; Wales, David J.

    2015-01-01

    We present two methods for barrierless equilibrium sampling of molecular systems based on the recently proposed Kirkwood method (J. Chem. Phys. 2009, 130, 134102). Kirkwood sampling employs low-order correlations among internal coordinates of a molecule for random (or non-Markovian) sampling of the high dimensional conformational space. This is a geometrical sampling method independent of the potential energy surface. The first method is a variant of biased Monte Carlo, wher...

  9. The hot (invisible?) hand: can time sequence patterns of success/failure in sports be modeled as repeated random independent trials?

    Science.gov (United States)

    Yaari, Gur; Eisenmann, Shmuel

    2011-01-01

    The long-lasting debate initiated by Gilovich, Vallone and Tversky in 1985 is revisited: does a "hot hand" phenomenon exist in sports? Hereby we come back to one of the cases analyzed by the original study, but with a much larger data set: all free throws taken during five regular seasons ([Formula: see text]) of the National Basketball Association (NBA). Evidence supporting the existence of the "hot hand" phenomenon is provided. However, while statistical traces of this phenomenon are observed in the data, an open question still remains: are these non-random patterns a result of "success breeds success" and "failure breeds failure" mechanisms, or simply "better" and "worse" periods? Although free throw data are not adequate to answer this question in a definite way, we speculate, based on them, that the latter is the dominant cause behind the appearance of the "hot hand" phenomenon in the data.
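One standard way to test the repeated-random-independent-trials model named in the title is a runs test: under i.i.d. trials the number of runs in a success/failure sequence has a known mean and variance. The sketch below is illustrative (not the authors' analysis) and compares an independent sequence with an artificially "streaky" one.

```python
import numpy as np

rng = np.random.default_rng(7)

def runs_z(seq):
    """Wald-Wolfowitz runs test statistic: compares the observed number of
    runs in a binary sequence with the count expected under i.i.d. trials."""
    seq = np.asarray(seq)
    n1 = int(seq.sum())
    n2 = len(seq) - n1
    runs = 1 + int(np.sum(seq[1:] != seq[:-1]))
    mu = 2 * n1 * n2 / (n1 + n2) + 1
    var = (2 * n1 * n2 * (2 * n1 * n2 - n1 - n2)
           / ((n1 + n2) ** 2 * (n1 + n2 - 1)))
    return (runs - mu) / np.sqrt(var)

iid = rng.integers(0, 2, size=500)                    # independent trials
streaky = np.repeat(rng.integers(0, 2, size=50), 10)  # 'hot/cold' periods
print(f"z (iid) = {runs_z(iid):+.2f}, z (streaky) = {runs_z(streaky):+.2f}")
```

A strongly negative z (far fewer runs than expected) flags streakiness; note the test cannot by itself distinguish "success breeds success" from slowly varying "better and worse" periods, which is exactly the open question the abstract raises.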

  10. Looking for age-related growth decline in natural forests: unexpected biomass patterns from tree rings and simulated mortality

    Science.gov (United States)

    Foster, Jane R.; D'Amato, Anthony W.; Bradford, John B.

    2014-01-01

    Forest biomass growth is almost universally assumed to peak early in stand development, near canopy closure, after which it will plateau or decline. The chronosequence and plot remeasurement approaches used to establish the decline pattern suffer from limitations and coarse temporal detail. We combined annual tree ring measurements and mortality models to address two questions: first, how do assumptions about tree growth and mortality influence reconstructions of biomass growth? Second, under what circumstances does biomass production follow the model that peaks early, then declines? We integrated three stochastic mortality models with a census tree-ring data set from eight temperate forest types in Minnesota, USA, to reconstruct stand-level biomass increments. We compared growth patterns among mortality models, forest types and stands. Timing of peak biomass growth varied significantly among mortality models, peaking 20–30 years earlier when mortality was random with respect to tree growth and size than when mortality favored slow-growing individuals. Random or U-shaped mortality (highest in small or large trees) produced peak growth 25–30% higher than the surviving tree sample alone. Growth trends for even-aged, monospecific Pinus banksiana or Acer saccharum forests were similar to the early peak and decline expectation. However, we observed continually increasing biomass growth in older, low-productivity forests of Quercus rubra, Fraxinus nigra, and Thuja occidentalis. Tree-ring reconstructions estimated annual changes in live biomass growth and identified more diverse development patterns than previous methods. These detailed, long-term patterns of biomass development are crucial for detecting recent growth responses to global change and modeling future forest dynamics.

  11. Random walks and diffusion on networks

    Science.gov (United States)

    Masuda, Naoki; Porter, Mason A.; Lambiotte, Renaud

    2017-11-01

    Random walks are ubiquitous in the sciences, and they are interesting from both theoretical and practical perspectives. They are one of the most fundamental types of stochastic processes; can be used to model numerous phenomena, including diffusion, interactions, and opinions among humans and animals; and can be used to extract information about important entities or dense groups of entities in a network. Random walks have been studied for many decades on both regular lattices and (especially in the last couple of decades) on networks with a variety of structures. In the present article, we survey the theory and applications of random walks on networks, restricting ourselves to simple cases of single and non-adaptive random walkers. We distinguish three main types of random walks: discrete-time random walks, node-centric continuous-time random walks, and edge-centric continuous-time random walks. We first briefly survey random walks on a line, and then we consider random walks on various types of networks. We extensively discuss applications of random walks, including ranking of nodes (e.g., PageRank), community detection, respondent-driven sampling, and opinion models such as voter models.
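For the discrete-time case surveyed above, a minimal simulation makes a key stationarity result concrete: on an undirected network, a walker's long-run occupancy of a node is proportional to its degree. The toy graph and parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

# Small undirected graph as an adjacency list (illustrative).
graph = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2]}

def walk_occupancy(graph, steps=100_000, start=0):
    """Discrete-time random walk: at each step, hop to a uniformly random
    neighbour. Returns the fraction of time spent at each node."""
    counts = {v: 0 for v in graph}
    node = start
    for _ in range(steps):
        nbrs = graph[node]
        node = nbrs[rng.integers(len(nbrs))]
        counts[node] += 1
    return {v: c / steps for v, c in counts.items()}

occ = walk_occupancy(graph)
degrees = {v: len(nbrs) for v, nbrs in graph.items()}
total_deg = sum(degrees.values())  # equals 2|E|
for v in graph:
    # stationary probability on an undirected graph is degree / (2|E|)
    print(v, round(occ[v], 3), "vs", round(degrees[v] / total_deg, 3))
```

This degree-proportional stationary distribution is the starting point for the applications the survey discusses, from PageRank-style ranking to respondent-driven sampling corrections.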

  12. A Bayesian Method for Weighted Sampling

    OpenAIRE

    Lo, Albert Y.

    1993-01-01

    Bayesian statistical inference for sampling from weighted distribution models is studied. Small-sample Bayesian bootstrap clone (BBC) approximations to the posterior distribution are discussed. A second-order property for the BBC in unweighted i.i.d. sampling is given. A consequence is that BBC approximations to a posterior distribution of the mean and to the sampling distribution of the sample average, can be made asymptotically accurate by a proper choice of the random variables that genera...

  13. Coronary artery calcification and ECG pattern of left ventricular hypertrophy or strain identify different healthy individuals at risk

    DEFF Research Database (Denmark)

    Diederichsen, Søren Zöga; Gerke, Oke; Olsen, Michael Hecht

    2013-01-01

    PURPOSE:: To improve risk stratification for development of ischaemic heart disease, several markers have been proposed. Both the presence of coronary artery calcification (CAC) and ECG pattern of left ventricular hypertrophy/strain have been shown to provide independent prognostic information....... In this study, we investigated the association between established risk factors, ECG measurements and the presence of coronary artery calcification. METHOD:: A random sample of healthy men and women aged 50 or 60 years were invited to the screening study. Established risk factors were measured. A noncontrast...... computed tomographic (CT) scan was performed to assess the CAC score. ECG analysis included left ventricular hypertrophy (LVH) using the Sokolow-Lyon criteria and the Cornell voltage × QRS duration product, and strain pattern based on ST segment depression and T-wave abnormalities. The association between...

  14. Selection of representative calibration sample sets for near-infrared reflectance spectroscopy to predict nitrogen concentration in grasses

    DEFF Research Database (Denmark)

    Shetty, Nisha; Rinnan, Åsmund; Gislum, René

    2012-01-01

    The Puchwein and CADEX algorithms were used and compared. Both the Puchwein and CADEX methods provide a calibration set equally distributed in space, and both methods require a minimum of prior knowledge. The samples were also selected randomly using complete random, cultivar random (year fixed), year random (cultivar fixed......) and interaction (cultivar × year fixed) random procedures to see the influence of different factors on sample selection. Puchwein's method performed best with the lowest RMSEP, followed by CADEX, interaction random, year random, cultivar random and complete random. Out of 118 samples of the complete calibration set...... effectively enhance the cost-effectiveness of NIR spectral analysis by reducing the number of analyzed samples in the calibration set by more than 80%, which substantially reduces the effort of laboratory analyses with no significant loss in prediction accuracy....
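The CADEX method referred to above is usually identified with the Kennard-Stone algorithm: start from the two most distant samples, then repeatedly add the sample farthest (in a max-min sense) from those already selected, yielding a calibration set spread evenly over the feature space. A rough sketch on synthetic data follows; the feature matrix is made up, not the grass NIR spectra from the study.

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(118, 6))   # 118 hypothetical samples, 6 features

def kennard_stone(X, k):
    """CADEX / Kennard-Stone selection: seed with the farthest-apart pair,
    then repeatedly add the point whose minimum distance to the already
    selected set is largest (max-min criterion)."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    i, j = np.unravel_index(np.argmax(d), d.shape)
    selected = [int(i), int(j)]
    while len(selected) < k:
        remaining = [p for p in range(len(X)) if p not in selected]
        min_d = d[np.ix_(remaining, selected)].min(axis=1)
        selected.append(remaining[int(np.argmax(min_d))])
    return selected

cal = kennard_stone(X, 20)
print("calibration subset:", cal[:5], "... size", len(cal))
```

Because each new point is maximally far from the current set, the selected subset covers the space uniformly with little prior knowledge, which is the property the abstract credits to both Puchwein and CADEX.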

  15. Sampling Strategies for Evaluating the Rate of Adventitious Transgene Presence in Non-Genetically Modified Crop Fields.

    Science.gov (United States)

    Makowski, David; Bancal, Rémi; Bensadoun, Arnaud; Monod, Hervé; Messéan, Antoine

    2017-09-01

    According to E.U. regulations, the maximum allowable rate of adventitious transgene presence in non-genetically modified (GM) crops is 0.9%. We compared four sampling methods for the detection of transgenic material in agricultural non-GM maize fields: random sampling, stratified sampling, random sampling + ratio reweighting, random sampling + regression reweighting. Random sampling involves simply sampling maize grains from different locations selected at random from the field concerned. The stratified and reweighting sampling methods make use of an auxiliary variable corresponding to the output of a gene-flow model (a zero-inflated Poisson model) simulating cross-pollination as a function of wind speed, wind direction, and distance to the closest GM maize field. With the stratified sampling method, an auxiliary variable is used to define several strata with contrasting transgene presence rates, and grains are then sampled at random from each stratum. With the two methods involving reweighting, grains are first sampled at random from various locations within the field, and the observations are then reweighted according to the auxiliary variable. Data collected from three maize fields were used to compare the four sampling methods, and the results were used to determine the extent to which transgene presence rate estimation was improved by the use of stratified and reweighting sampling methods. We found that transgene rate estimates were more accurate and that substantially smaller samples could be used with sampling strategies based on an auxiliary variable derived from a gene-flow model. © 2017 Society for Risk Analysis.
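The benefit of the auxiliary variable can be shown with a toy simulation: when a gene-flow model flags a small high-risk stratum, sampling it more heavily and reweighting gives a lower-variance estimate of the adventitious presence rate than simple random sampling. All numbers below are illustrative, and the design is a simplified stand-in for the methods compared in the study.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy field of 10,000 maize grains. An auxiliary variable (think: output
# of a gene-flow model) flags the ~10% of grains nearest the GM field,
# where cross-pollination is far more likely.
N = 10_000
near_gm = rng.random(N) < 0.10
p_true = np.where(near_gm, 0.15, 0.001)
present = rng.random(N) < p_true
true_rate = present.mean()

def srs_estimate(n):
    """Simple random sampling of n grains."""
    return present[rng.choice(N, size=n, replace=False)].mean()

def stratified_estimate(n):
    """Equal allocation to the two auxiliary-defined strata, then a
    weighted combination of the stratum means."""
    est = 0.0
    for stratum in (near_gm, ~near_gm):
        members = np.flatnonzero(stratum)
        idx = rng.choice(members, size=n // 2, replace=False)
        est += present[idx].mean() * len(members) / N
    return est

srs_se = np.std([srs_estimate(500) for _ in range(400)])
strat_se = np.std([stratified_estimate(500) for _ in range(400)])
print(f"true rate {true_rate:.4f}; s.e.: SRS {srs_se:.4f}, "
      f"stratified {strat_se:.4f}")
```

Oversampling the small high-variance stratum is what lets the stratified design reach a given accuracy with substantially fewer grains, mirroring the paper's finding that auxiliary-variable designs allow smaller samples.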

  16. Research Article Comparing covariance matrices: random skewers method compared to the common principal components model

    Directory of Open Access Journals (Sweden)

    James M. Cheverud

    2007-03-01

    Full Text Available Comparisons of covariance patterns are becoming more common as interest in the evolution of relationships between traits and in the evolutionary phenotypic diversification of clades has grown. We present parallel analyses of covariance matrix similarity for cranial traits in 14 New World Monkey genera using the Random Skewers (RS), T-statistics, and Common Principal Components (CPC) approaches. We find that the CPC approach is very powerful in that, with adequate sample sizes, it can be used to detect significant differences in matrix structure, even between matrices that are virtually identical in their evolutionary properties, as indicated by the RS results. We suggest that in many instances the assumption that population covariance matrices are identical be rejected out of hand. The more interesting and relevant question is: how similar are two covariance matrices with respect to their predicted evolutionary responses? This issue is addressed by the random skewers method described here.
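The random skewers method described above admits a compact sketch: apply the same random unit-length "selection" vectors (skewers) to both covariance matrices and average the vector correlation (cosine) of the paired response vectors. Details such as the number of skewers and the example matrices below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def random_skewers(G1, G2, n_skewers=1000):
    """Random skewers comparison: hit both covariance matrices with the
    same random selection vectors and average the cosine similarity of
    the resulting response vectors (an illustrative sketch)."""
    p = G1.shape[0]
    beta = rng.normal(size=(n_skewers, p))
    beta /= np.linalg.norm(beta, axis=1, keepdims=True)
    r1 = beta @ G1
    r2 = beta @ G2
    cos = np.sum(r1 * r2, axis=1) / (
        np.linalg.norm(r1, axis=1) * np.linalg.norm(r2, axis=1))
    return cos.mean()

A = np.diag([3.0, 2.0, 1.0])
B = np.diag([1.0, 2.0, 3.0])
print("identical:", round(random_skewers(A, A), 3))
print("rescaled: ", round(random_skewers(A, 2 * A), 3))
print("reordered:", round(random_skewers(A, B), 3))
```

Note that a matrix and a scalar multiple of it score a perfect 1: the method measures similarity of predicted evolutionary response directions, not of the matrices element by element, which is exactly the distinction the abstract draws against CPC.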

  17. Reducing the random seed effect on segmentation by applying an edge-preserving filter

    NARCIS (Netherlands)

    Addink, E.A.

    2012-01-01

    In region-growing segmentation algorithms, random seed locations are used (reference). To ensure that repeating the segmentation will produce the same result, the seed locations follow a fixed random pattern. Empirical studies show that when the image that is subjected to the segmentation is

  18. PhD dissertation on: Effects of an expressive writing intervention in a nationwide sample of breast cancer patients

    DEFF Research Database (Denmark)

    Jensen-Johansen, Mikael Birkelund

    2010-01-01

    and cancer patients. EWI had been used in 10 of the studies (n=661) with the number of participants ranging from 30 to 234 and an average sample size of 82.6. Across all available studies, EWI did not show the hypothesized therapeutic effect on either psychological or physical outcomes. However, when including...... at baseline with respect to the prevalence of somatic symptoms, depression, distress, and sociodemographic patterns, and to compare the present sample with a large Danish cohort of 3500 Danish women treated for breast cancer. Objective 4 – To investigate the impact of EWI on the outcomes of cancer related...... is a literature review and meta-analysis of studies of expressive writing intervention (EWI) focusing on health outcomes in samples of cancer patients. A list of randomized clinical trials (RCT) of EWI with cancer patients published before December 2009 was compiled using relevant search engines and previously...

  19. Media Use and Source Trust among Muslims in Seven Countries: Results of a Large Random Sample Survey

    Directory of Open Access Journals (Sweden)

    Steven R. Corman

    2013-12-01

    Full Text Available Despite the perceived importance of media in the spread of and resistance against Islamist extremism, little is known about how Muslims use different kinds of media to get information about religious issues, and what sources they trust when doing so. This paper reports the results of a large, random sample survey among Muslims in seven countries in Southeast Asia, West Africa and Western Europe, which helps fill this gap. Results show a diverse set of profiles of media use and source trust that differ by country, with overall low trust in mediated sources of information. Based on these findings, we conclude that mass media are still the most common source of religious information for Muslims, but that trust in mediated information is low overall. This suggests that media are probably best used to persuade opinion leaders, who will then carry anti-extremist messages through more personal means.

  20. Random sequential adsorption of cubes

    Science.gov (United States)

    Cieśla, Michał; Kubala, Piotr

    2018-01-01

    Random packings built of cubes are studied numerically using a random sequential adsorption algorithm. To compare the obtained results with previous reports, three different models of cube orientation sampling were used. Also, three different cube-cube intersection algorithms were tested to find the most efficient one. The study focuses on the mean saturated packing fraction as well as kinetics of packing growth. Microstructural properties of packings were analyzed using density autocorrelation function.
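The random sequential adsorption algorithm itself is simple to sketch in the 2D, axis-aligned case (the study concerns 3D cubes with orientation sampling, so this is a simplified analogue): propose uniformly random positions and reject any square that overlaps one already placed. A long run of consecutive rejections serves here as a rough stand-in for reaching saturation.

```python
import numpy as np

rng = np.random.default_rng(9)

def rsa_squares(side=0.1, box=1.0, max_failures=5_000):
    """Random sequential adsorption of axis-aligned squares in a box:
    keep a proposed square only if it overlaps no previously placed one.
    Stops after many consecutive failed attempts (an approximation of
    the saturated, jammed state)."""
    placed = []
    failures = 0
    while failures < max_failures:
        x, y = rng.uniform(0.0, box - side, size=2)
        # axis-aligned squares overlap iff they overlap in both coordinates
        if any(abs(x - px) < side and abs(y - py) < side for px, py in placed):
            failures += 1
        else:
            placed.append((x, y))
            failures = 0
    return placed

squares = rsa_squares()
fraction = len(squares) * 0.1**2
print(f"{len(squares)} squares placed, packing fraction ~ {fraction:.3f}")
```

The 3D study additionally samples cube orientations (three different models are compared) and needs an efficient cube-cube intersection test, but the accept/reject structure of the algorithm is the same as in this sketch.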

  1. Evidence of Levy walk foraging patterns in human hunter-gatherers.

    Science.gov (United States)

    Raichlen, David A; Wood, Brian M; Gordon, Adam D; Mabulla, Audax Z P; Marlowe, Frank W; Pontzer, Herman

    2014-01-14

    When searching for food, many organisms adopt a superdiffusive, scale-free movement pattern called a Lévy walk, which is considered optimal when foraging for heterogeneously located resources with little prior knowledge of distribution patterns [Viswanathan GM, da Luz MGE, Raposo EP, Stanley HE (2011) The Physics of Foraging: An Introduction to Random Searches and Biological Encounters]. Although memory of food locations and higher cognition may limit the benefits of random walk strategies, no studies to date have fully explored search patterns in human foraging. Here, we show that human hunter-gatherers, the Hadza of northern Tanzania, perform Lévy walks in nearly one-half of all foraging bouts. Lévy walks occur when searching for a wide variety of foods from animal prey to underground tubers, suggesting that, even in the most cognitively complex forager on Earth, such patterns are essential to understanding elementary foraging mechanisms. This movement pattern may be fundamental to how humans experience and interact with the world across a wide range of ecological contexts, and it may be adaptive to food distribution patterns on the landscape, which previous studies suggested for organisms with more limited cognition. Additionally, Lévy walks may have become common early in our genus when hunting and gathering arose as a major foraging strategy, playing an important role in the evolution of human mobility.
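    A minimal sketch of the Lévy-walk model underlying such analyses: step lengths drawn by inverse-transform sampling from a power-law density p(l) proportional to l^(-mu) with 1 < mu <= 3 (mu near 2 is commonly reported as optimal for sparse targets), turned into a 2D walk with uniformly random headings. Parameter values are illustrative, not those fitted to the Hadza data.

```python
import math
import random

def levy_steps(n, mu=2.0, l_min=1.0, seed=1):
    """Draw n step lengths from p(l) = (mu-1) * l_min**(mu-1) * l**(-mu),
    l >= l_min, via the inverse CDF: l = l_min * (1-u)**(-1/(mu-1))."""
    rng = random.Random(seed)
    return [l_min * (1.0 - rng.random()) ** (-1.0 / (mu - 1.0)) for _ in range(n)]

def walk(steps, seed=2):
    """Turn step lengths into a 2D walk with uniform random headings."""
    rng = random.Random(seed)
    x = y = 0.0
    path = [(x, y)]
    for l in steps:
        theta = rng.uniform(0.0, 2.0 * math.pi)
        x += l * math.cos(theta)
        y += l * math.sin(theta)
        path.append((x, y))
    return path
```

The heavy tail of the step-length distribution is what produces the rare long relocations characteristic of superdiffusive search.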

  2. A reversed gender pattern? A meta-analysis of gender differences in the prevalence of non-suicidal self-injurious behaviour among Chinese adolescents

    Directory of Open Access Journals (Sweden)

    Xueyan Yang

    2017-07-01

    Full Text Available Abstract Background A reversed gender pattern has been observed in the suicide rate in China compared to elsewhere. Like suicidal behaviour, non-suicidal self-injurious (NSSI) behaviour is a health-risk behaviour. We examined whether a reversed gender pattern existed in the prevalence of NSSI. Methods Online literature databases were searched for English and Chinese articles on NSSI behaviours among the Chinese. A meta-analysis with a random-effects model and a subgroup analysis were used to estimate the odds ratios of gender differences in NSSI prevalence among Chinese adolescents, including college students, middle school students, and clinical samples, as well as rural, urban, and Hong Kong middle school students. Results There was a male bias in NSSI prevalence among college students (OR = 1.56, 95% CI = [1.30, 1.87], p  0.1. The NSSI prevalence among middle school students had a female bias in the rural (OR = 0.58, 95% CI = [0.47, 0.72], p  0.1 among middle school students. Conclusions Our analysis indicated the existence of specific gender and age patterns in NSSI prevalence among Chinese adolescents. The sample type, age, and the areas that have different gender norms and culture could partly explain this pattern.

  3. Establishing patterns on hysteroscopy in abnormal uterine bleeding (AUB)

    Directory of Open Access Journals (Sweden)

    Deeksha Pandey

    2017-11-01

    Full Text Available Introduction: Pattern recognition of the various phases of normal endometrium and of endometrial pathologies during hysteroscopy has many advantages. It would help to triage women with AUB, so as to be selective with biopsies and curettages. Recognition of a normal variant or benign lesion would reduce the burden on the pathologist by decreasing the number of unnecessary samplings. It also decreases the patient's anxiety, as the report/prognostication can be instant in many cases. Material and methods: This prospective, double-blind, correlation study was carried out in a teaching hospital with a sample of 70 women presenting with AUB who underwent hysteroscopy and endometrial sampling. We identified endometrial patterns that can be used to predict six endometrial pathologies, which were later correlated with the final histological diagnosis. Results: There was good correlation between hysteroscopic patterns and histopathology reports: 33% of starry-sky appearances correlated with atrophic endometrium, 87% of tongue-shaped projections correlated with endometrial polyp, 44.4% of pebble-stone appearances correlated with myomatous polyp, and 50% of polypoidal patterns correlated with endometrial hyperplasia. 100% correlation was seen for the strawberry appearance, the pattern for secretory endometrium, and the cerebroid appearance, the pattern designated for endometrial carcinoma. Conclusion: Hysteroscopic pattern recognition is a useful concept to triage women who require sampling for histopathological diagnosis. Keywords: Abnormal uterine bleeding, Histopathology, Hysteroscopy, Pattern recognition

  4. Morphological and isozymic banding pattern study of white grubs (Coleoptera: Melolonthidae) as a pest of bark crop on Mount Merapi's slope.

    Directory of Open Access Journals (Sweden)

    SUGIYARTO

    2008-07-01

    Full Text Available White grubs (Coleoptera: Melolonthidae) are a group of soil pests of many agroecosystems, especially of salak pondoh (Salacca zalacca (Gaert.) Voss.) crops. The characteristics of this specimen are crucial to study in order to find the right biocontrol. The aim of this research was to characterize white grubs (Coleoptera: Melolonthidae) based on morphology and isozyme banding patterns. The research was conducted from August to November 2007 in the Sleman and Magelang districts for the morphological work, while the isozyme analyses were conducted at the Sub Laboratory of Biology, Central Laboratory of Sebelas Maret University, Surakarta. Samples were taken by stratified random sampling at five stations. Vertical polyacrylamide gel electrophoresis (PAGE) was used for the isozyme analysis. The enzymes used to detect the isozyme banding patterns were peroxidase and esterase. The results showed no morphological variation among white grubs (Coleoptera: Melolonthidae) in the salak pondoh agroecosystem on Mount Merapi's slope. Based on this character, one species of white grub was found, i.e. Holotrichia javana. There was, however, genetic variation based on the variation in isozyme banding patterns.

  5. Random noise attenuation of non-uniformly sampled 3D seismic data along two spatial coordinates using non-equispaced curvelet transform

    Science.gov (United States)

    Zhang, Hua; Yang, Hui; Li, Hongxing; Huang, Guangnan; Ding, Zheyi

    2018-04-01

    The attenuation of random noise is important for improving the signal to noise ratio (SNR). However, the precondition for most conventional denoising methods is that the noisy data must be sampled on a uniform grid, making the conventional methods unsuitable for non-uniformly sampled data. In this paper, a denoising method capable of regularizing the noisy data from a non-uniform grid to a specified uniform grid is proposed. Firstly, the denoising method is performed for every time slice extracted from the 3D noisy data along the source and receiver directions, then the 2D non-equispaced fast Fourier transform (NFFT) is introduced in the conventional fast discrete curvelet transform (FDCT). The non-equispaced fast discrete curvelet transform (NFDCT) can be achieved based on the regularized inversion of an operator that links the uniformly sampled curvelet coefficients to the non-uniformly sampled noisy data. The uniform curvelet coefficients can be calculated by using the inversion algorithm of the spectral projected-gradient for ℓ1-norm problems. Then local threshold factors are chosen for the uniform curvelet coefficients for each decomposition scale, and effective curvelet coefficients are obtained respectively for each scale. Finally, the conventional inverse FDCT is applied to the effective curvelet coefficients. This completes the proposed 3D denoising method using the non-equispaced curvelet transform in the source-receiver domain. The examples for synthetic data and real data reveal the effectiveness of the proposed approach in applications to noise attenuation for non-uniformly sampled data compared with the conventional FDCT method and wavelet transformation.
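    The core of such transform-domain denoising (forward transform, threshold the coefficients, inverse transform) can be sketched compactly. In this sketch the 2D FFT stands in for the (non-equispaced) curvelet transform, and a single global hard threshold replaces the paper's scale-dependent local thresholds; it is a conceptual illustration on uniformly sampled data, not the NFDCT method itself.

```python
import numpy as np

def denoise_fft_threshold(noisy, keep_fraction=0.05):
    """Transform-domain denoising sketch: forward transform, keep only the
    largest-magnitude coefficients, inverse transform. Signal energy is
    concentrated in few coefficients, noise is spread over all of them."""
    coeffs = np.fft.fft2(noisy)
    mags = np.abs(coeffs).ravel()
    # threshold chosen so that roughly `keep_fraction` of coefficients survive
    thresh = np.sort(mags)[int((1.0 - keep_fraction) * mags.size)]
    coeffs[np.abs(coeffs) < thresh] = 0.0
    return np.real(np.fft.ifft2(coeffs))
```

For non-uniformly sampled data, the paper replaces the forward transform with one built on the NFFT, so that the (uniform) coefficients are obtained by regularized inversion instead of a direct transform.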

  6. Erasure and reestablishment of random allelic expression imbalance after epigenetic reprogramming.

    Science.gov (United States)

    Jeffries, Aaron Richard; Uwanogho, Dafe Aghogho; Cocks, Graham; Perfect, Leo William; Dempster, Emma; Mill, Jonathan; Price, Jack

    2016-10-01

    Clonal-level random allelic expression imbalance and random monoallelic expression provide cellular heterogeneity within tissues by modulating allelic dosage. Although such expression patterns have been observed in multiple cell types, little is known about when in development these stochastic allelic choices are made. We examine allelic expression patterns in human neural progenitor cells before and after epigenetic reprogramming to induced pluripotency, observing that loci previously characterized by random allelic expression imbalance (0.63% of expressed genes) are generally reset to a biallelic state in induced pluripotent stem cells (iPSCs). We subsequently neuralized the iPSCs and profiled isolated clonal neural stem cells, observing that significant random allelic expression imbalance is reestablished at 0.65% of expressed genes, including novel loci not found to show allelic expression imbalance in the original parental neural progenitor cells. Allelic expression imbalance was associated with altered DNA methylation across promoter regulatory regions, with clones characterized by skewed allelic expression being hypermethylated compared to their biallelic sister clones. Our results suggest that random allelic expression imbalance is established during lineage commitment and is associated with increased DNA methylation at the gene promoter. © 2016 Jeffries et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.

  7. Accelerating Markov chain Monte Carlo simulation by differential evolution with self-adaptive randomized subspace sampling

    Energy Technology Data Exchange (ETDEWEB)

    Vrugt, Jasper A [Los Alamos National Laboratory; Hyman, James M [Los Alamos National Laboratory; Robinson, Bruce A [Los Alamos National Laboratory; Higdon, Dave [Los Alamos National Laboratory; Ter Braak, Cajo J F [NETHERLANDS; Diks, Cees G H [UNIV OF AMSTERDAM

    2008-01-01

    Markov chain Monte Carlo (MCMC) methods have found widespread use in many fields of study to estimate the average properties of complex systems, and for posterior inference in a Bayesian framework. Existing theory and experiments prove convergence of well-constructed MCMC schemes to the appropriate limiting distribution under a variety of different conditions. In practice, however, this convergence is often observed to be disturbingly slow. This is frequently caused by an inappropriate selection of the proposal distribution used to generate trial moves in the Markov chain. Here we show that significant improvements to the efficiency of MCMC simulation can be made by using a self-adaptive Differential Evolution learning strategy within a population-based evolutionary framework. This scheme, entitled DiffeRential Evolution Adaptive Metropolis or DREAM, runs multiple different chains simultaneously for global exploration, and automatically tunes the scale and orientation of the proposal distribution in randomized subspaces during the search. Ergodicity of the algorithm is proved, and various examples involving nonlinearity, high-dimensionality, and multimodality show that DREAM is generally superior to other adaptive MCMC sampling approaches. The DREAM scheme significantly enhances the applicability of MCMC simulation to complex, multi-modal search problems.
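    The differential-evolution proposal at the heart of DREAM can be sketched in a few lines. The version below is deliberately simplified: it omits the randomized subspace sampling and self-adaptation that give DREAM its name, keeping only the population proposal and Metropolis acceptance, and targets a standard 2D Gaussian.

```python
import math
import random

def de_mc(log_post, n_chains=6, n_iter=2000, dim=2, seed=3):
    """Differential Evolution Markov chain sketch: chain i proposes
    x* = x_i + gamma * (x_r1 - x_r2) + eps, with r1, r2 two other
    randomly chosen chains, accepted with the Metropolis rule. The
    population of chains thus tunes the proposal scale/orientation."""
    rng = random.Random(seed)
    gamma = 2.38 / math.sqrt(2 * dim)  # standard near-optimal jump scale
    chains = [[rng.gauss(0.0, 5.0) for _ in range(dim)] for _ in range(n_chains)]
    samples = []
    for _ in range(n_iter):
        for i in range(n_chains):
            r1, r2 = rng.sample([j for j in range(n_chains) if j != i], 2)
            prop = [chains[i][d] + gamma * (chains[r1][d] - chains[r2][d])
                    + rng.gauss(0.0, 1e-3) for d in range(dim)]
            if math.log(rng.random() + 1e-300) < log_post(prop) - log_post(chains[i]):
                chains[i] = prop
            samples.append(list(chains[i]))
    return samples

# target: standard bivariate Gaussian (log-density up to a constant)
def log_std_normal(x):
    return -0.5 * sum(v * v for v in x)
```

Discarding the first half of the draws as burn-in, the remaining samples should reproduce the target's mean and variance.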

  8. Reconstructing random media

    International Nuclear Information System (INIS)

    Yeong, C.L.; Torquato, S.

    1998-01-01

    We formulate a procedure to reconstruct the structure of general random heterogeneous media from limited morphological information by extending the methodology of Rintoul and Torquato [J. Colloid Interface Sci. 186, 467 (1997)] developed for dispersions. The procedure has the advantages that it is simple to implement and generally applicable to multidimensional, multiphase, and anisotropic structures. Furthermore, an extremely useful feature is that it can incorporate any type and number of correlation functions in order to provide as much morphological information as is necessary for accurate reconstruction. We consider a variety of one- and two-dimensional reconstructions, including periodic and random arrays of rods, various distribution of disks, Debye random media, and a Fontainebleau sandstone sample. We also use our algorithm to construct heterogeneous media from specified hypothetical correlation functions, including an exponentially damped, oscillating function as well as physically unrealizable ones. copyright 1998 The American Physical Society
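    A one-dimensional, zero-temperature caricature of the reconstruction procedure: start from a random two-phase field with the correct volume fraction, then accept pixel swaps that reduce the mismatch with a target two-point probability function S2. The real method works in higher dimensions, uses simulated annealing with a cooling schedule, and can combine several correlation functions; everything below is an illustrative simplification.

```python
import random

def s2(field):
    """Two-point probability S2(r): probability that two points at
    separation r (with periodic wrap-around) both lie in phase 1."""
    n = len(field)
    return [sum(field[i] * field[(i + r) % n] for i in range(n)) / n
            for r in range(n // 2)]

def reconstruct(target_s2, n, steps=3000, seed=4):
    """Yeong-Torquato-style reconstruction in 1D: random initial field
    with the right volume fraction, then swap a 0 and a 1 pixel and keep
    the swap whenever it does not increase the S2 mismatch."""
    rng = random.Random(seed)
    ones = round(target_s2[0] * n)  # S2(0) equals the phase-1 volume fraction
    field = [1] * ones + [0] * (n - ones)
    rng.shuffle(field)
    energy = sum((a - b) ** 2 for a, b in zip(s2(field), target_s2))
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        if field[i] == field[j]:
            continue
        field[i], field[j] = field[j], field[i]
        new = sum((a - b) ** 2 for a, b in zip(s2(field), target_s2))
        if new <= energy:
            energy = new
        else:
            field[i], field[j] = field[j], field[i]  # revert uphill moves
    return field, energy
```

Given the S2 of a striped reference field, the swap dynamics drives the random field toward a structure with a matching correlation function.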

  9. Prevalence and pattern of self-medication in Karachi: A community survey

    Science.gov (United States)

    Afridi, M. Iqbal; Rasool, Ghulam; Tabassum, Rabia; Shaheen, Marriam; Siddiqullah; Shujauddin, M.

    2015-01-01

    Objective: To study the prevalence and pattern of self-medication among adult males and females in Karachi, Pakistan. Methods: This cross-sectional community-based survey was carried out in five randomly selected towns of Karachi (Defence, Gulshan-e-Iqbal, North Nazimabad, Malir, Orangi town) over a period of 3 months (October, November & December 2012). A sample of 500 adult cases (250 males & 250 females), selected by systematic random sampling from different towns of Karachi, was inducted in this study. The city was divided into 5 zones and one town from each zone was selected by systematic randomization. The first available male and female from each randomly selected house were included in the study. After consent and confidentiality assurance they were interviewed using a semi-structured proforma designed for this purpose. Results were analyzed and tabulated through SPSS v14.0. Results: The prevalence of self-medication in males and females in Karachi was found to be 84.8% (males 88.4% and females 81.2%). The most frequent symptoms for which self-medication was used were headache (32.7%) and fever (23.3%), and the medicines used were painkillers (28.8%) and fever-reducing medicines (19.8%). The most common reason (33.3%) was previous experience with a similar symptom. Conclusion: Self-medication is highly prevalent (84.8%) in Karachi. It was frequently used for headache followed by fever. Predominantly painkillers, fever reducers and cough syrups were used, in the form of tablets and syrups. The main source of medicines for males was friends and for females was relatives. PMID:26649022

  10. Differential effects of rhythmic auditory stimulation and neurodevelopmental treatment/Bobath on gait patterns in adults with cerebral palsy: a randomized controlled trial.

    Science.gov (United States)

    Kim, Soo Ji; Kwak, Eunmi E; Park, Eun Sook; Cho, Sung-Rae

    2012-10-01

    To investigate the effects of rhythmic auditory stimulation (RAS) on gait patterns in comparison with changes after neurodevelopmental treatment (NDT/Bobath) in adults with cerebral palsy. A repeated-measures analysis between the pretreatment and posttreatment tests and a comparison study between groups. Human gait analysis laboratory. Twenty-eight cerebral palsy patients with bilateral spasticity participated in this study. The subjects were randomly allocated to either neurodevelopmental treatment (n = 13) or rhythmic auditory stimulation (n = 15). Gait training with rhythmic auditory stimulation or neurodevelopmental treatment was performed three sessions per week for three weeks. Temporal and kinematic data were analysed before and after the intervention. Rhythmic auditory stimulation was provided using a combination of a metronome beat set to the individual's cadence and rhythmic cueing from a live keyboard, while neurodevelopmental treatment was implemented following the traditional method. Temporal data, kinematic parameters and gait deviation index as a measure of overall gait pathology were assessed. Temporal gait measures revealed that rhythmic auditory stimulation significantly increased cadence, walking velocity, stride length, and step length (P rhythmic auditory stimulation (P rhythmic auditory stimulation (P rhythmic auditory stimulation showed aggravated maximal internal rotation in the transverse plane (P rhythmic auditory stimulation or neurodevelopmental treatment elicited differential effects on gait patterns in adults with cerebral palsy.

  11. Unsupervised Ensemble Anomaly Detection Using Time-Periodic Packet Sampling

    Science.gov (United States)

    Uchida, Masato; Nawata, Shuichi; Gu, Yu; Tsuru, Masato; Oie, Yuji

    We propose an anomaly detection method for finding patterns in network traffic that do not conform to legitimate (i.e., normal) behavior. The proposed method trains a baseline model describing the normal behavior of network traffic without using manually labeled traffic data. The trained baseline model is used as the basis for comparison with the audit network traffic. This anomaly detection works in an unsupervised manner through the use of time-periodic packet sampling, which is used in a manner that differs from its intended purpose — the lossy nature of packet sampling is used to extract normal packets from the unlabeled original traffic data. Evaluation using actual traffic traces showed that the proposed method has false positive and false negative rates in the detection of anomalies regarding TCP SYN packets comparable to those of a conventional method that uses manually labeled traffic data to train the baseline model. Performance variation due to the probabilistic nature of sampled traffic data is mitigated by using ensemble anomaly detection that collectively exploits multiple baseline models in parallel. Alarm sensitivity is adjusted for the intended use by using maximum- and minimum-based anomaly detection that effectively take advantage of the performance variations among the multiple baseline models. Testing using actual traffic traces showed that the proposed anomaly detection method performs as well as one using manually labeled traffic data and better than one using randomly sampled (unlabeled) traffic data.
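    The ensemble idea can be sketched as follows: one baseline model per sampling offset, each summarizing per-window counts of periodically sampled packets, with max- and min-based scores taken over the ensemble. The traffic representation and the z-score baseline below are illustrative assumptions, not the paper's exact model.

```python
import random
import statistics

def sample_counts(windows, period, offset):
    """Per-window count of packets surviving time-periodic sampling:
    keep one packet every `period` positions, starting at `offset`."""
    return [sum(1 for i in range(len(win)) if i % period == offset)
            for win in windows]

def ensemble_scores(train, audit, period=10):
    """Train one baseline (mean, stdev of sampled counts) per sampling
    offset, then score each audit window by its z-score under every
    baseline. max() over the ensemble gives a sensitive detector,
    min() a conservative one."""
    baselines = []
    for off in range(period):
        counts = sample_counts(train, period, off)
        baselines.append((statistics.mean(counts),
                          statistics.stdev(counts) or 1.0))
    scores = []
    for win in audit:
        zs = [abs(sample_counts([win], period, off)[0] - mu) / sd
              for off, (mu, sd) in enumerate(baselines)]
        scores.append((max(zs), min(zs)))
    return scores
```

A window with a packet volume far outside the trained baselines scores high under every offset, so both the max- and min-based alarms fire.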

  12. Random sampling or geostatistical modelling? Choosing between design-based and model-based sampling strategies for soil (with discussion)

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    1997-01-01

    Classical sampling theory has been repeatedly identified with classical statistics which assumes that data are identically and independently distributed. This explains the switch of many soil scientists from design-based sampling strategies, based on classical sampling theory, to the model-based
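    The design-based half of this choice is easy to illustrate: under simple random sampling without replacement, the sample mean is a design-unbiased estimator of the population mean, with variance estimated as s²/n times the finite population correction (1 - n/N). A minimal sketch, with illustrative data:

```python
import random
import statistics

def srs_estimate(population, n, seed=5):
    """Design-based estimation under simple random sampling without
    replacement: the sample mean estimates the population mean; its
    variance is estimated by (s^2 / n) * (1 - n/N), where the second
    factor is the finite population correction."""
    rng = random.Random(seed)
    sample = rng.sample(population, n)
    mean = statistics.mean(sample)
    var = statistics.variance(sample) / n * (1 - n / len(population))
    return mean, var
```

The model-based alternative would instead posit a spatial model (e.g. a variogram) and predict at unsampled locations, which is the choice the paper discusses.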

  13. Comparing the performance of cluster random sampling and integrated threshold mapping for targeting trachoma control, using computer simulation.

    Directory of Open Access Journals (Sweden)

    Jennifer L Smith

    Full Text Available Implementation of trachoma control strategies requires reliable district-level estimates of trachomatous inflammation-follicular (TF), generally collected using the recommended gold-standard cluster randomized surveys (CRS). Integrated Threshold Mapping (ITM) has been proposed as an integrated and cost-effective means of rapidly surveying trachoma in order to classify districts according to treatment thresholds. ITM differs from CRS in a number of important ways, including the use of a school-based sampling platform for children aged 1-9 and a different age distribution of participants. This study uses computerised sampling simulations to compare the performance of these survey designs and evaluate the impact of varying key parameters. Realistic pseudo gold standard data for 100 districts were generated that maintained the relative risk of disease between important sub-groups and incorporated empirical estimates of disease clustering at the household, village and district level. To simulate the different sampling approaches, 20 clusters were selected from each district, with individuals sampled according to the protocol for ITM and CRS. Results showed that ITM generally under-estimated the true prevalence of TF over a range of epidemiological settings and introduced more district misclassification according to treatment thresholds than did CRS. However, the extent of underestimation and resulting misclassification was found to be dependent on three main factors: (i) the district prevalence of TF; (ii) the relative risk of TF between enrolled and non-enrolled children within clusters; and (iii) the enrollment rate in schools. Although in some contexts the two methodologies may be equivalent, ITM can introduce a bias-dependent shift as prevalence of TF increases, resulting in a greater risk of misclassification around treatment thresholds. In addition to strengthening the evidence base around choice of trachoma survey methodologies, this study illustrates
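    A toy version of this simulation logic, with hypothetical parameters: village-level random effects provide the within-district disease clustering, a cluster random survey samples villages and then children within them, and the resulting estimate is classified against a treatment threshold (10% is used here purely for illustration).

```python
import random

def simulate_district(n_villages=20, kids_per_village=50, base_prev=0.15, seed=None):
    """Toy district: each village gets its own prevalence drawn around the
    district baseline, creating the clustering the study builds into its
    pseudo gold standard."""
    rng = random.Random(seed)
    district = []
    for _ in range(n_villages):
        village_prev = min(1.0, max(0.0, rng.gauss(base_prev, 0.08)))
        district.append([1 if rng.random() < village_prev else 0
                         for _ in range(kids_per_village)])
    return district

def crs_estimate(district, n_clusters=5, n_per_cluster=30, seed=None):
    """Cluster random survey: sample villages, then children within them."""
    rng = random.Random(seed)
    chosen = rng.sample(district, n_clusters)
    kids = [k for v in chosen for k in rng.sample(v, n_per_cluster)]
    return sum(kids) / len(kids)

def needs_treatment(prevalence, threshold=0.10):
    """Classify a district against a treatment threshold."""
    return prevalence >= threshold
```

Repeating the estimate over many simulated districts, and comparing survey designs with different sampling platforms, yields the misclassification rates the study reports.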

  14. Virulence properties and random amplification of polymorphic DNA ...

    African Journals Online (AJOL)

    Genotypic and phenotypic characterization as well as studies on the virulence factors of Candida albicans isolates obtained from oral cavity of patients was carried out using random amplified polymorphic DNA (RAPD) fingerprinting and epithelial cells adherence assay, respectively. RAPD patterns revealed the presence of ...

  15. Aerobic physical activity and resistance training: an application of the theory of planned behavior among adults with type 2 diabetes in a random, national sample of Canadians

    Directory of Open Access Journals (Sweden)

    Karunamuni Nandini

    2008-12-01

    Full Text Available Abstract Background Aerobic physical activity (PA) and resistance training are paramount in the treatment and management of type 2 diabetes (T2D), but few studies have examined the determinants of both types of exercise in the same sample. Objective The primary purpose was to investigate the utility of the Theory of Planned Behavior (TPB) in explaining aerobic PA and resistance training in a population sample of T2D adults. Methods A total of 244 individuals were recruited through a random national sample which was created by generating a random list of household phone numbers. The list was proportionate to the actual number of household telephone numbers for each Canadian province (with the exception of Quebec). These individuals completed self-report TPB constructs of attitude, subjective norm, perceived behavioral control and intention, and a 3-month follow-up that assessed aerobic PA and resistance training. Results TPB explained 10% and 8% of the variance respectively for aerobic PA and resistance training; and accounted for 39% and 45% of the variance respectively for aerobic PA and resistance training intentions. Conclusion These results may guide the development of appropriate PA interventions for aerobic PA and resistance training based on the TPB.

  16. Connecting the dots : Illusory pattern perception predicts belief in conspiracies and the supernatural

    NARCIS (Netherlands)

    van Prooijen, Jan Willem; Douglas, Karen M.; De Inocencio, Clara

    A common assumption is that belief in conspiracy theories and supernatural phenomena are grounded in illusory pattern perception. In the present research we systematically tested this assumption. Study 1 revealed that such irrational beliefs are related to perceiving patterns in randomly generated

  17. Pseudo-random tool paths for CNC sub-aperture polishing and other applications.

    Science.gov (United States)

    Dunn, Christina R; Walker, David D

    2008-11-10

    In this paper we first contrast classical and CNC polishing techniques in regard to the repetitiveness of the machine motions. We then present a pseudo-random tool path for use with CNC sub-aperture polishing techniques and report polishing results from equivalent random and raster tool-paths. The random tool-path used - the unicursal random tool-path - employs a random seed to generate a pattern which never crosses itself. Because of this property, this tool-path is directly compatible with dwell time maps for corrective polishing. The tool-path can be used to polish any continuous area of any boundary shape, including surfaces with interior perforations.
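    The unicursal construction itself is not spelled out in this record, but the basic idea of a seeded, random, non-self-crossing path can be illustrated with a self-avoiding random walk on a grid. Unlike the unicursal tool-path, this sketch can dead-end before covering the full area; it only demonstrates the "random yet never crossing itself" property.

```python
import random

def self_avoiding_path(width, height, seed=6):
    """Seeded random path on a grid that never visits a cell twice, so the
    path never crosses itself. It may dead-end before covering the whole
    area; the unicursal path in the paper avoids that by construction."""
    rng = random.Random(seed)
    start = (rng.randrange(width), rng.randrange(height))
    path, visited = [start], {start}
    while True:
        x, y = path[-1]
        moves = [(x + dx, y + dy)
                 for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                 if 0 <= x + dx < width and 0 <= y + dy < height
                 and (x + dx, y + dy) not in visited]
        if not moves:
            return path  # dead end: the walk terminates
        nxt = rng.choice(moves)
        visited.add(nxt)
        path.append(nxt)
```

Because the path is generated from a seed, it is repeatable, which is what makes such random paths compatible with dwell-time maps for corrective polishing.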

  18. The hot (invisible? hand: can time sequence patterns of success/failure in sports be modeled as repeated random independent trials?

    Directory of Open Access Journals (Sweden)

    Gur Yaari

    Full Text Available The long-lasting debate initiated by Gilovich, Vallone and Tversky in [Formula: see text] is revisited: does a "hot hand" phenomenon exist in sports? Here we come back to one of the cases analyzed by the original study, but with a much larger data set: all free throws taken during five regular seasons ([Formula: see text]) of the National Basketball Association (NBA). Evidence supporting the existence of the "hot hand" phenomenon is provided. However, while statistical traces of this phenomenon are observed in the data, an open question still remains: are these non-random patterns a result of "success breeds success" and "failure breeds failure" mechanisms or simply "better" and "worse" periods? Although free throw data is not adequate to answer this question in a definite way, we speculate, based on it, that the latter is the dominant cause behind the appearance of the "hot hand" phenomenon in the data.
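    The null hypothesis in this debate is that every shot is an independent Bernoulli trial. Simulating that null shows how long success streaks become by pure chance, which is why streaks alone cannot establish a hot hand. A sketch with illustrative parameters:

```python
import random

def longest_streak(outcomes):
    """Length of the longest run of consecutive successes."""
    best = cur = 0
    for hit in outcomes:
        cur = cur + 1 if hit else 0
        best = max(best, cur)
    return best

def null_streak_distribution(n_shots=500, p=0.75, trials=2000, seed=9):
    """Null model of the hot-hand debate: every free throw is an
    independent Bernoulli(p) trial. Returns the longest streak observed
    in each simulated season of n_shots attempts."""
    rng = random.Random(seed)
    return [longest_streak([rng.random() < p for _ in range(n_shots)])
            for _ in range(trials)]
```

For a 75% shooter taking 500 free throws, runs of a dozen or more consecutive makes are routine under independence, so detecting a genuine hot hand requires comparing the observed streak statistics against this null distribution.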

  19. Ventilatory Function in Relation to Mining Experience and Smoking in a Random Sample of Miners and Non-miners in a Witwatersrand Town1

    Science.gov (United States)

    Sluis-Cremer, G. K.; Walters, L. G.; Sichel, H. S.

    1967-01-01

    The ventilatory capacity of a random sample of men over the age of 35 years in the town of Carletonville was estimated by the forced expiratory volume and the peak expiratory flow rate. Five hundred and sixty-two persons were working or had worked in gold-mines and 265 had never worked in gold-mines. No difference in ventilatory function was found between the miners and non-miners other than that due to the excess of chronic bronchitis in miners. PMID:6017134

  20. On the pertinence to Physics of random walks induced by random dynamical systems: a survey

    International Nuclear Information System (INIS)

    Petritis, Dimitri

    2016-01-01

    Let be an abstract space and a denumerable (finite or infinite) alphabet. Suppose that is a family of functions such that for all we have and a family of transformations . The pair (( S_a)_a , ( p_a)_a ) is termed an iterated function system with place dependent probabilities. Such systems can be thought as generalisations of random dynamical systems. As a matter of fact, suppose we start from a given ; we pick then randomly, with probability p_a (x) , the transformation S_a and evolve to S_a (x) . We are interested in the behaviour of the system when the iteration continues indefinitely. Random walks of the above type are omnipresent in both classical and quantum Physics. To give a small sample of occurrences we mention: random walks on the affine group, random walks on Penrose lattices, random walks on partially directed lattices, evolution of density matrices induced by repeated quantum measurements, quantum channels, quantum random walks, etc. In this article, we review some basic properties of such systems and provide with a pathfinder in the extensive bibliography (both on mathematical and physical sides) where the main results have been originally published. (paper)