WorldWideScience

Sample records for sample distribution information

  1. Adaptive Metropolis Sampling with Product Distributions

    Science.gov (United States)

    Wolpert, David H.; Lee, Chiu Fan

    2005-01-01

    The Metropolis-Hastings (MH) algorithm is a way to sample a provided target distribution pi(z). It works by repeatedly sampling a separate proposal distribution T(x,x') to generate a random walk {x(t)}. We consider a modification of the MH algorithm in which T is dynamically updated during the walk. The update at time t uses the {x(t' less than t)} to estimate the product distribution that has the least Kullback-Leibler distance to pi. That estimate is the information-theoretically optimal mean-field approximation to pi. We demonstrate through computer experiments that our algorithm produces samples that are superior to those of the conventional MH algorithm.
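
    The sketch below illustrates the general idea in Python: a Metropolis-Hastings walk whose independence proposal is a product of one-dimensional normals periodically refit to the walk history (a mean-field style approximation). The target density, refit interval and normal form of the proposal are illustrative assumptions, not the authors' exact update rule.

        import numpy as np

        def log_pi(x):
            # Illustrative correlated-Gaussian target; any log-density could be used here.
            cov = np.array([[1.0, 0.8], [0.8, 1.0]])
            return -0.5 * x @ np.linalg.solve(cov, x)

        def adaptive_product_mh(n_steps=20000, refit_every=500, seed=0):
            rng = np.random.default_rng(seed)
            x = np.zeros(2)
            mu, sigma = np.zeros(2), np.ones(2)      # parameters of the product (mean-field) proposal
            chain = []
            for t in range(n_steps):
                prop = rng.normal(mu, sigma)         # independence proposal q(x') = prod_i N(mu_i, sigma_i)
                log_q_prop = -0.5 * np.sum(((prop - mu) / sigma) ** 2)
                log_q_curr = -0.5 * np.sum(((x - mu) / sigma) ** 2)
                log_alpha = log_pi(prop) - log_pi(x) + log_q_curr - log_q_prop
                if np.log(rng.uniform()) < log_alpha:
                    x = prop
                chain.append(x.copy())
                if t > 0 and t % refit_every == 0:   # refit the product proposal to the history so far
                    hist = np.array(chain)
                    mu, sigma = hist.mean(axis=0), hist.std(axis=0) + 1e-3
            return np.array(chain)

        samples = adaptive_product_mh()
        print(samples.mean(axis=0))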

  2. Information sampling behavior with explicit sampling costs

    Science.gov (United States)

    Juni, Mordechai Z.; Gureckis, Todd M.; Maloney, Laurence T.

    2015-01-01

    The decision to gather information should take into account both the value of information and its accrual costs in time, energy and money. Here we explore how people balance the monetary costs and benefits of gathering additional information in a perceptual-motor estimation task. Participants were rewarded for touching a hidden circular target on a touch-screen display. The target’s center coincided with the mean of a circular Gaussian distribution from which participants could sample repeatedly. Each “cue” — sampled one at a time — was plotted as a dot on the display. Participants had to repeatedly decide, after sampling each cue, whether to stop sampling and attempt to touch the hidden target or continue sampling. Each additional cue increased the participants’ probability of successfully touching the hidden target but reduced their potential reward. Two experimental conditions differed in the initial reward associated with touching the hidden target and the fixed cost per cue. For each condition we computed the optimal number of cues that participants should sample, before taking action, to maximize expected gain. Contrary to recent claims that people gather less information than they objectively should before taking action, we found that participants over-sampled in one experimental condition, and did not significantly under- or over-sample in the other. Additionally, while the ideal observer model ignores the current sample dispersion, we found that participants used it to decide whether to stop sampling and take action or continue sampling, a possible consequence of imperfect learning of the underlying population dispersion across trials. PMID:27429991

  3. Use of spatially distributed time-integrated sediment sampling networks and distributed fine sediment modelling to inform catchment management.

    Science.gov (United States)

    Perks, M T; Warburton, J; Bracken, L J; Reaney, S M; Emery, S B; Hirst, S

    2017-11-01

    Under the EU Water Framework Directive, suspended sediment is omitted from environmental quality standards and compliance targets. This omission is partly explained by difficulties in assessing the complex dose-response of ecological communities. But equally, it is hindered by a lack of spatially distributed estimates of suspended sediment variability across catchments. In this paper, we demonstrate the inability of traditional, discrete sampling campaigns to assess exposure to fine sediment. Sampling frequencies based on Environmental Quality Standard protocols, whilst reflecting typical manual sampling constraints, are unable to determine the magnitude of sediment exposure with an acceptable level of precision. Deviations from actual concentrations range between -35 and +20% based on the interquartile range of simulations. As an alternative, we assess the value of low-cost, suspended sediment sampling networks for quantifying suspended sediment transfer (SST). In this study of the 362 km² upland Esk catchment we observe that spatial patterns of sediment flux are consistent over the two-year monitoring period across a network of 17 monitoring sites. This enables the key contributing sub-catchments of Butter Beck (SST: 1141 t km⁻² yr⁻¹) and Glaisdale Beck (SST: 841 t km⁻² yr⁻¹) to be identified. The time-integrated samplers offer a feasible alternative to traditional infrequent and discrete sampling approaches for assessing spatio-temporal changes in contamination. In conjunction with a spatially distributed diffuse pollution model (SCIMAP), time-integrated sediment sampling is an effective means of identifying critical sediment source areas in the catchment, which can better inform sediment management strategies for pollution prevention and control. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Sampling informative/complex a priori probability distributions using Gibbs sampling assisted by sequential simulation

    DEFF Research Database (Denmark)

    Hansen, Thomas Mejer; Mosegaard, Klaus; Cordua, Knud Skou

    2010-01-01

    Markov chain Monte Carlo methods such as the Gibbs sampler and the Metropolis algorithm can be used to sample the solutions to non-linear inverse problems. In principle these methods allow incorporation of arbitrarily complex a priori information, but current methods allow only relatively simple...... this algorithm with the Metropolis algorithm to obtain an efficient method for sampling posterior probability densities for nonlinear inverse problems....

  5. Reliability assessment based on small samples of normal distribution

    International Nuclear Information System (INIS)

    Ma Zhibo; Zhu Jianshi; Xu Naixin

    2003-01-01

    When the pertinent parameter involved in reliability definition complies with normal distribution, the conjugate prior of its distributing parameters (μ, h) is of normal-gamma distribution. With the help of maximum entropy and the moments-equivalence principles, the subjective information of the parameter and the sampling data of its independent variables are transformed to a Bayesian prior of (μ,h). The desired estimates are obtained from either the prior or the posterior which is formed by combining the prior and sampling data. Computing methods are described and examples are presented to give demonstrations
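
    As a concrete illustration of the conjugacy used here, the sketch below applies the standard normal-gamma update for normal data with unknown mean μ and precision h; the hyperparameter names are generic and the numbers are made up, not taken from the paper.

        import numpy as np

        def normal_gamma_update(data, mu0, kappa0, alpha0, beta0):
            # Posterior hyperparameters of a Normal-Gamma prior on (mu, h) after
            # observing normal data with unknown mean mu and precision h.
            n = len(data)
            xbar, s2 = np.mean(data), np.var(data)   # sample mean and (biased) variance
            kappa_n = kappa0 + n
            mu_n = (kappa0 * mu0 + n * xbar) / kappa_n
            alpha_n = alpha0 + n / 2.0
            beta_n = beta0 + 0.5 * n * s2 + 0.5 * kappa0 * n * (xbar - mu0) ** 2 / kappa_n
            return mu_n, kappa_n, alpha_n, beta_n

        # A weak prior updated with a small sample of five observations
        print(normal_gamma_update(np.array([9.8, 10.4, 10.1, 9.7, 10.2]),
                                  mu0=10.0, kappa0=1.0, alpha0=1.0, beta0=1.0))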

  6. Probability Distribution and Deviation Information Fusion Driven Support Vector Regression Model and Its Application

    Directory of Open Access Journals (Sweden)

    Changhao Fan

    2017-01-01

    In modeling, only information from the deviation between the output of the support vector regression (SVR) model and the training sample is considered, whereas the other prior information of the training sample, such as probability distribution information, is ignored. Probabilistic distribution information describes the overall distribution of sample data in a training sample that contains different degrees of noise and potential outliers, as well as helping develop a high-accuracy model. To mine and use the probability distribution information of a training sample, a new support vector regression model that incorporates probability distribution information weight SVR (PDISVR) is proposed. In the PDISVR model, the probability distribution of each sample is considered as the weight and is then introduced into the error coefficient and slack variables of SVR. Thus, the deviation and probability distribution information of the training sample are both used in the PDISVR model to eliminate the influence of noise and outliers in the training sample and to improve predictive performance. Furthermore, examples with different degrees of noise were employed to demonstrate the performance of PDISVR, which was then compared with those of three SVR-based methods. The results showed that PDISVR performs better than the three other methods.

  7. Computer Graphics Simulations of Sampling Distributions.

    Science.gov (United States)

    Gordon, Florence S.; Gordon, Sheldon P.

    1989-01-01

    Describes the use of computer graphics simulations to enhance student understanding of sampling distributions that arise in introductory statistics. Highlights include the distribution of sample proportions, the distribution of the difference of sample means, the distribution of the difference of sample proportions, and the distribution of sample…
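
    A minimal simulation in the same spirit (the population proportion and sample sizes are arbitrary): draw many samples from a Bernoulli population and watch how the sample proportions concentrate around the true proportion as the sample size grows.

        import numpy as np

        rng = np.random.default_rng(1)
        p, n_reps = 0.3, 10000
        for n in (10, 50, 200):
            # one sample proportion per simulated sample of size n
            props = rng.binomial(n, p, size=n_reps) / n
            print(f"n={n:4d}  mean={props.mean():.3f}  "
                  f"SE={props.std(ddof=1):.3f}  theory={np.sqrt(p * (1 - p) / n):.3f}")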

  8. How Sample Size Affects a Sampling Distribution

    Science.gov (United States)

    Mulekar, Madhuri S.; Siegel, Murray H.

    2009-01-01

    If students are to understand inferential statistics successfully, they must have a profound understanding of the nature of the sampling distribution. Specifically, they must comprehend the determination of the expected value and standard error of a sampling distribution as well as the meaning of the central limit theorem. Many students in a high…

  9. Investigation of elemental distribution in lung samples by X-ray fluorescence microtomography

    International Nuclear Information System (INIS)

    Pereira, Gabriela R.; Rocha, Henrique S.; Lopes, Ricardo T.

    2007-01-01

    X-Ray Fluorescence Microtomography (XRFCT) is a suitable technique to find elemental distributions in heterogeneous samples. While x-ray transmission microtomography provides information about the linear attenuation coefficient distribution, XRFCT allows one to map the most important elements in the sample. X-ray fluorescence tomography is based on the use of the X-ray fluorescence emitted from the elements contained in a sample so as to give additional information to characterize the object under study. In this work a rat lung and two human lung tissue samples have been investigated in order to verify the efficiency of the system in determining the internal distribution of detected elements in these kinds of samples and to compare the elemental distribution in the lung tissue of an old human and a fetus. The experiments were performed at the X-Ray Fluorescence beamline (XRF) of the Brazilian Synchrotron Light Source (LNLS), Campinas, Brazil. A white beam was used for the excitation of the elements and the fluorescence photons were detected by an HPGe detector. All the tomographies were reconstructed using a filtered back-projection algorithm. It was possible to visualize the distribution of high atomic number elements in both artificial and tissue samples. The quantities of Zn, Cu and Fe were compared for the human lung tissue samples, verifying that these elements have a higher concentration in the fetus tissue sample than in the adult tissue sample. (author)

  10. Proper and Paradigmatic Metonymy as a Lens for Characterizing Student Conceptions of Distributions and Sampling

    Science.gov (United States)

    Noll, Jennifer; Hancock, Stacey

    2015-01-01

    This research investigates what students' use of statistical language can tell us about their conceptions of distribution and sampling in relation to informal inference. Prior research documents students' challenges in understanding ideas of distribution and sampling as tools for making informal statistical inferences. We know that these…

  11. Statistical distribution sampling

    Science.gov (United States)

    Johnson, E. S.

    1975-01-01

    Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.

  12. Micro-organism distribution sampling for bioassays

    Science.gov (United States)

    Nelson, B. A.

    1975-01-01

    The purpose of the sampling distribution is to characterize sample-to-sample variation so that statistical tests may be applied, to estimate error due to sampling (confidence limits), and to evaluate observed differences between samples. The distribution could be used for bioassays taken in hospitals, breweries, food-processing plants, and pharmaceutical plants.

  13. Parameter estimation of multivariate multiple regression model using bayesian with non-informative Jeffreys’ prior distribution

    Science.gov (United States)

    Saputro, D. R. S.; Amalia, F.; Widyaningsih, P.; Affan, R. C.

    2018-05-01

    The Bayesian method can be used to estimate the parameters of the multivariate multiple regression model. It involves two distributions: the prior and the posterior. The posterior distribution is influenced by the choice of prior distribution. Jeffreys' prior is a kind of non-informative prior distribution, used when information about the parameters is not available. The non-informative Jeffreys' prior is combined with the sample information to produce the posterior distribution, which is then used to estimate the parameters. The purpose of this research is to estimate the parameters of the multivariate regression model using the Bayesian method with the non-informative Jeffreys' prior. Based on the results and discussion, the estimates of β and Σ are obtained from the expected values of the marginal posterior distribution functions; the marginal posteriors for β and Σ are multivariate normal and inverse Wishart, respectively. However, calculating these expected values involves integrals of functions whose values are difficult to determine. Therefore, an approach is needed that generates random samples according to the posterior distribution characteristics of each parameter using the Markov chain Monte Carlo (MCMC) Gibbs sampling algorithm.
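
    A minimal Gibbs-sampling sketch of this setup, alternating draws of β | Σ (multivariate normal) and Σ | β (inverse Wishart). The degrees-of-freedom convention for the Jeffreys-type prior varies slightly across texts, and the usage data below are synthetic, so treat the code as illustrative rather than as the paper's exact scheme.

        import numpy as np
        from scipy.stats import invwishart

        def gibbs_mv_regression(Y, X, n_iter=2000, seed=0):
            rng = np.random.default_rng(seed)
            n, p = X.shape
            q = Y.shape[1]
            XtX_inv = np.linalg.inv(X.T @ X)
            B_hat = XtX_inv @ X.T @ Y                      # OLS estimate, the conditional posterior mean
            Sigma = np.cov((Y - X @ B_hat).T)              # start from the residual covariance
            draws_B, draws_S = [], []
            for _ in range(n_iter):
                # B | Sigma, Y : vec(B) ~ N(vec(B_hat), Sigma kron (X'X)^{-1})
                vecB = rng.multivariate_normal(B_hat.ravel(order="F"), np.kron(Sigma, XtX_inv))
                B = vecB.reshape(p, q, order="F")
                # Sigma | B, Y : inverse Wishart with the residual cross-product as scale
                resid = Y - X @ B
                Sigma = invwishart.rvs(df=n, scale=resid.T @ resid, random_state=rng)
                draws_B.append(B)
                draws_S.append(Sigma)
            return np.array(draws_B), np.array(draws_S)

        # Synthetic example: 50 observations, 3 predictors (incl. intercept), 2 responses
        rng = np.random.default_rng(1)
        X = np.column_stack([np.ones(50), rng.normal(size=(50, 2))])
        B_true = np.array([[1.0, 0.5], [2.0, -1.0], [0.0, 0.3]])
        Y = X @ B_true + rng.normal(size=(50, 2))
        Bs, Ss = gibbs_mv_regression(Y, X, n_iter=500)
        print(Bs.mean(axis=0))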

  14. ExSample. A library for sampling Sudakov-type distributions

    Energy Technology Data Exchange (ETDEWEB)

    Plaetzer, Simon

    2011-08-15

    Sudakov-type distributions are at the heart of generating radiation in parton showers as well as contemporary NLO matching algorithms along the lines of the POWHEG algorithm. In this paper, the C++ library ExSample is introduced, which implements adaptive sampling of Sudakov-type distributions for splitting kernels which are in general only known numerically. Besides the evolution variable, the splitting kernels can depend on an arbitrary number of other degrees of freedom to be sampled, and any number of further parameters which are fixed on an event-by-event basis. (orig.)

  15. ExSample. A library for sampling Sudakov-type distributions

    International Nuclear Information System (INIS)

    Plaetzer, Simon

    2011-08-01

    Sudakov-type distributions are at the heart of generating radiation in parton showers as well as contemporary NLO matching algorithms along the lines of the POWHEG algorithm. In this paper, the C++ library ExSample is introduced, which implements adaptive sampling of Sudakov-type distributions for splitting kernels which are in general only known numerically. Besides the evolution variable, the splitting kernels can depend on an arbitrary number of other degrees of freedom to be sampled, and any number of further parameters which are fixed on an event-by-event basis. (orig.)

  16. Egg distribution and sampling of Diaprepes abbreviatus (Coleoptera: Curculionidae) on silver buttonwood

    International Nuclear Information System (INIS)

    Pena, J.E.; Mannion, C.; Amalin, D.; Hunsberger, A.

    2007-01-01

    Taylor's power law and Iwao's patchiness regression were used to analyze spatial distribution of eggs of the Diaprepes root weevil, Diaprepes abbreviatus (L.), on silver buttonwood trees, Conocarpus erectus, during 1997 and 1998. Taylor's power law and Iwao's patchiness regression provided similar descriptions of variance-mean relationship for egg distribution within trees. Sample size requirements were determined. Information presented in this paper should help to improve accuracy and efficiency in sampling of the weevil eggs in the future. (author)

  17. Understanding the Sampling Distribution and the Central Limit Theorem.

    Science.gov (United States)

    Lewis, Charla P.

    The sampling distribution is a common source of misuse and misunderstanding in the study of statistics. The sampling distribution, underlying distribution, and the Central Limit Theorem are all interconnected in defining and explaining the proper use of the sampling distribution of various statistics. The sampling distribution of a statistic is…

  18. Evaluation of optimized bronchoalveolar lavage sampling designs for characterization of pulmonary drug distribution.

    Science.gov (United States)

    Clewe, Oskar; Karlsson, Mats O; Simonsson, Ulrika S H

    2015-12-01

    Bronchoalveolar lavage (BAL) is a pulmonary sampling technique for characterization of drug concentrations in epithelial lining fluid and alveolar cells. Two hypothetical drugs with different pulmonary distribution rates (fast and slow) were considered. An optimized BAL sampling design was generated assuming no previous information regarding the pulmonary distribution (rate and extent) and with a maximum of two samples per subject. Simulations were performed to evaluate the impact of the number of samples per subject (1 or 2) and the sample size on the relative bias and relative root mean square error of the parameter estimates (rate and extent of pulmonary distribution). The optimized BAL sampling design depends on a characterized plasma concentration time profile, a population plasma pharmacokinetic model, the limit of quantification (LOQ) of the BAL method and involves only two BAL sample time points, one early and one late. The early sample should be taken as early as possible, where concentrations in the BAL fluid ≥ LOQ. The second sample should be taken at a time point in the declining part of the plasma curve, where the plasma concentration is equivalent to the plasma concentration in the early sample. Using a previously described general pulmonary distribution model linked to a plasma population pharmacokinetic model, simulated data using the final BAL sampling design enabled characterization of both the rate and extent of pulmonary distribution. The optimized BAL sampling design enables characterization of both the rate and extent of the pulmonary distribution for both fast and slowly equilibrating drugs.

  19. Comparing distribution models for small samples of overdispersed counts of freshwater fish

    Science.gov (United States)

    Vaudor, Lise; Lamouroux, Nicolas; Olivier, Jean-Michel

    2011-05-01

    The study of species abundance often relies on repeated abundance counts whose number is limited by logistic or financial constraints. The distribution of abundance counts is generally right-skewed (i.e. with many zeros and few high values) and needs to be modelled for statistical inference. We used an extensive dataset involving about 100,000 fish individuals of 12 freshwater fish species collected in electrofishing points (7 m²) during 350 field surveys made in 25 stream sites, in order to compare the performance and the generality of four distribution models of counts (Poisson, negative binomial and their zero-inflated counterparts). The negative binomial distribution was the best model (Bayesian Information Criterion) for 58% of the samples (species-survey combinations) and was suitable for a variety of life histories, habitat, and sample characteristics. The performance of the models was closely related to samples' statistics such as total abundance and variance. Finally, we illustrated the consequences of a distribution assumption by calculating confidence intervals around the mean abundance, either based on the most suitable distribution assumption or on an asymptotical, distribution-free (Student's) method. Student's method generally corresponded to narrower confidence intervals, especially when there were few (≤3) non-null counts in the samples.
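
    A sketch of the kind of comparison described, using synthetic counts: fit Poisson and negative binomial models by maximum likelihood and compare them by BIC (the study also considered the zero-inflated counterparts, omitted here for brevity).

        import numpy as np
        from scipy import stats
        from scipy.optimize import minimize

        def bic(loglik, n_params, n_obs):
            return -2.0 * loglik + n_params * np.log(n_obs)

        def compare_count_models(counts):
            counts = np.asarray(counts)
            n = counts.size
            # Poisson: the MLE of the rate is simply the sample mean
            ll_pois = stats.poisson.logpmf(counts, counts.mean()).sum()
            # Negative binomial: maximize the likelihood over (r, p) numerically
            def nb_negll(params):
                r, p = params
                if r <= 0 or not 0 < p < 1:
                    return np.inf
                return -stats.nbinom.logpmf(counts, r, p).sum()
            res = minimize(nb_negll, x0=[1.0, 0.5], method="Nelder-Mead")
            return {"Poisson BIC": bic(ll_pois, 1, n), "NegBinom BIC": bic(-res.fun, 2, n)}

        rng = np.random.default_rng(2)
        sample = rng.negative_binomial(1.2, 0.15, size=30)   # right-skewed, overdispersed counts
        print(compare_count_models(sample))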

  20. A Story-Based Simulation for Teaching Sampling Distributions

    Science.gov (United States)

    Turner, Stephen; Dabney, Alan R.

    2015-01-01

    Statistical inference relies heavily on the concept of sampling distributions. However, sampling distributions are difficult to teach. We present a series of short animations that are story-based, with associated assessments. We hope that our contribution can be useful as a tool to teach sampling distributions in the introductory statistics…

  1. Sample size determination for logistic regression on a logit-normal distribution.

    Science.gov (United States)

    Kim, Seongho; Heath, Elisabeth; Heilbrun, Lance

    2017-06-01

    Although the sample size for simple logistic regression can be readily determined using currently available methods, the sample size calculation for multiple logistic regression requires some additional information, such as the coefficient of determination of a covariate of interest with other covariates, which is often unavailable in practice. The response variable of logistic regression follows a logit-normal distribution which can be generated from a logistic transformation of a normal distribution. Using this property of logistic regression, we propose new methods of determining the sample size for simple and multiple logistic regressions using a normal transformation of outcome measures. Simulation studies and a motivating example show several advantages of the proposed methods over the existing methods: (i) no need for the coefficient of determination for multiple logistic regression, (ii) available interim or group-sequential designs, and (iii) much smaller required sample size.

  2. Succinct Sampling from Discrete Distributions

    DEFF Research Database (Denmark)

    Bringmann, Karl; Larsen, Kasper Green

    2013-01-01

    We revisit the classic problem of sampling from a discrete distribution: Given n non-negative w-bit integers x_1,...,x_n, the task is to build a data structure that allows sampling i with probability proportional to x_i. The classic solution is Walker's alias method that takes, when implemented...
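
    For reference, a compact sketch of the classic alias method mentioned here (Walker's construction with the usual small/large worklists); the weights are arbitrary examples.

        import random

        def build_alias(weights):
            # Build Walker's alias table for O(1) sampling proportional to `weights`.
            n = len(weights)
            total = float(sum(weights))
            prob = [w * n / total for w in weights]      # scaled so the average cell is 1
            alias = [0] * n
            small = [i for i, p in enumerate(prob) if p < 1.0]
            large = [i for i, p in enumerate(prob) if p >= 1.0]
            while small and large:
                s, l = small.pop(), large.pop()
                alias[s] = l                             # cell s borrows its overflow from l
                prob[l] -= 1.0 - prob[s]
                (small if prob[l] < 1.0 else large).append(l)
            return prob, alias

        def alias_sample(prob, alias):
            i = random.randrange(len(prob))              # pick a cell uniformly
            return i if random.random() < prob[i] else alias[i]

        prob, alias = build_alias([5, 1, 3, 1])
        counts = [0] * 4
        for _ in range(100000):
            counts[alias_sample(prob, alias)] += 1
        print(counts)                                    # roughly proportional to 5:1:3:1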

  3. Development of sample size allocation program using hypergeometric distribution

    International Nuclear Information System (INIS)

    Kim, Hyun Tae; Kwack, Eun Ho; Park, Wan Soo; Min, Kyung Soo; Park, Chan Sik

    1996-01-01

    The objective of this research is the development of a sample allocation program using the hypergeometric distribution with an object-oriented method. When the IAEA (International Atomic Energy Agency) performs inspection, it simply applies a standard binomial distribution, which describes sampling with replacement, instead of a hypergeometric distribution, which describes sampling without replacement, in sample allocation to up to three verification methods. The objective of the IAEA inspection is the timely detection of diversion of significant quantities of nuclear material; therefore game theory is applied to its sampling plan. It is necessary to use the hypergeometric distribution directly, or an approximate distribution, to secure statistical accuracy. The improved binomial approximation developed by J. L. Jaech and the correctly applied binomial approximation are closer to the hypergeometric distribution in sample size calculation than the simply applied binomial approximation of the IAEA. Object-oriented programs for 1. sample approximate-allocation with the correctly applied standard binomial approximation, 2. sample approximate-allocation with the improved binomial approximation, and 3. sample approximate-allocation with the hypergeometric distribution were developed with Visual C++, and corresponding programs were developed with EXCEL (using Visual Basic for Applications). 8 tabs., 15 refs. (Author)
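
    The difference between the two sampling models can be seen in a few lines; the population size, number of deviating items and sample size below are illustrative values, not IAEA figures.

        from scipy.stats import binom, hypergeom

        # Probability of detecting at least one deviating item in a sample of size n,
        # drawn without replacement (hypergeometric) versus the with-replacement
        # binomial approximation.
        N, D, n = 200, 10, 25
        p_hyper = 1.0 - hypergeom.pmf(0, N, D, n)    # exact, sampling without replacement
        p_binom = 1.0 - binom.pmf(0, n, D / N)       # simple binomial approximation
        print(f"hypergeometric: {p_hyper:.4f}   binomial approx.: {p_binom:.4f}")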

  4. Continuous sampling from distributed streams

    DEFF Research Database (Denmark)

    Graham, Cormode; Muthukrishnan, S.; Yi, Ke

    2012-01-01

    A fundamental problem in data management is to draw and maintain a sample of a large data set, for approximate query answering, selectivity estimation, and query planning. With large, streaming data sets, this problem becomes particularly difficult when the data is shared across multiple distributed sites. The main challenge is to ensure that a sample is drawn uniformly across the union of the data while minimizing the communication needed to run the protocol on the evolving data. At the same time, it is also necessary to make the protocol lightweight, by keeping the space and time costs low for each participant. In this article, we present communication-efficient protocols for continuously maintaining a sample (both with and without replacement) from k distributed streams. These apply to the case when we want a sample from the full streams, and to the sliding window cases of only the W most...
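
    A single-site baseline for this problem is classic reservoir sampling, sketched below; the distributed protocols in the article extend this idea across k streams while bounding communication.

        import random

        def reservoir_sample(stream, k):
            # Maintain a uniform without-replacement sample of size k from one stream.
            reservoir = []
            for t, item in enumerate(stream):
                if t < k:
                    reservoir.append(item)
                else:
                    j = random.randrange(t + 1)      # item replaces a slot with probability k/(t+1)
                    if j < k:
                        reservoir[j] = item
            return reservoir

        print(reservoir_sample(range(1_000_000), 10))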

  5. Sampling frequency of ciliated protozoan microfauna for seasonal distribution research in marine ecosystems.

    Science.gov (United States)

    Xu, Henglong; Yong, Jiang; Xu, Guangjian

    2015-12-30

    Sampling frequency is important to obtain sufficient information for temporal research of microfauna. To determine an optimal strategy for exploring the seasonal variation in ciliated protozoa, a dataset from the Yellow Sea, northern China, was studied. Samples were collected with 24 (biweekly), 12 (monthly), 8 (bimonthly per season) and 4 (seasonally) sampling events. Compared to the 24 samplings (100%), the 12-, 8- and 4-samplings recovered 94%, 94%, and 78% of the total species, respectively. To reveal the seasonal distribution, the 8-sampling regime may recover >75% of the information on seasonal variance, whereas the traditional 4-sampling explains considerably less. With higher sampling frequency, the biotic data showed stronger correlations with seasonal variables (e.g., temperature, salinity) in combination with nutrients. It is suggested that 8 sampling events per year may be an optimal sampling strategy for ciliated protozoan seasonal research in marine ecosystems. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. The Distribution of the Sample Minimum-Variance Frontier

    OpenAIRE

    Raymond Kan; Daniel R. Smith

    2008-01-01

    In this paper, we present a finite sample analysis of the sample minimum-variance frontier under the assumption that the returns are independent and multivariate normally distributed. We show that the sample minimum-variance frontier is a highly biased estimator of the population frontier, and we propose an improved estimator of the population frontier. In addition, we provide the exact distribution of the out-of-sample mean and variance of sample minimum-variance portfolios. This allows us t...

  7. Sampling Methods for Wallenius' and Fisher's Noncentral Hypergeometric Distributions

    DEFF Research Database (Denmark)

    Fog, Agner

    2008-01-01

    Several methods for generating variates with univariate and multivariate Wallenius' and Fisher's noncentral hypergeometric distributions are developed. Methods for the univariate distributions include: simulation of urn experiments, inversion by binary search, inversion by chop-down search from the mode, ratio-of-uniforms rejection method, and rejection by sampling in the tau domain. Methods for the multivariate distributions include: simulation of urn experiments, conditional method, Gibbs sampling, and Metropolis-Hastings sampling. These methods are useful for Monte Carlo simulation of models of biased sampling and models of evolution and for calculating moments and quantiles of the distributions.
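
    The simplest of the listed methods, simulation of the urn experiment, can be sketched directly for the multivariate Wallenius distribution; the colour counts and weights below are arbitrary.

        import numpy as np

        def wallenius_urn_sample(counts, weights, n_draws, rng):
            # One multivariate Wallenius' noncentral hypergeometric variate: balls are
            # removed one at a time, each remaining colour chosen with probability
            # proportional to (remaining count) x (colour weight).
            m = np.array(counts, dtype=float)
            w = np.asarray(weights, dtype=float)
            taken = np.zeros_like(m)
            for _ in range(n_draws):
                p = m * w
                p /= p.sum()
                c = rng.choice(len(m), p=p)
                m[c] -= 1
                taken[c] += 1
            return taken.astype(int)

        rng = np.random.default_rng(3)
        draws = np.array([wallenius_urn_sample([20, 30, 50], [3.0, 1.0, 0.5], 40, rng)
                          for _ in range(5000)])
        print(draws.mean(axis=0))                    # mean counts per colour under biased sampling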

  8. Binomial Distribution Sample Confidence Intervals Estimation 1. Sampling and Medical Key Parameters Calculation

    Directory of Open Access Journals (Sweden)

    Tudor DRUGAN

    2003-08-01

    The aim of the paper was to present the usefulness of the binomial distribution in the study of contingency tables and the problems of approximation to normality of the binomial distribution (its limits, advantages, and disadvantages). The classification of the medical key parameters reported in the medical literature, and expressing them in contingency table units based on their mathematical expressions, restricts the discussion of confidence intervals from 34 parameters to 9 mathematical expressions. The problem of obtaining different kinds of information starting with the computed confidence interval for a specified method (confidence interval boundaries, percentages of the experimental errors, the standard deviation of the experimental errors, and the deviation relative to the significance level) was solved through implementation of original algorithms in the PHP programming language. The cases of expressions that contain two binomial variables were treated separately. An original method of computing the confidence interval for the case of two-variable expressions was proposed and implemented. The graphical representation of expressions of two binomial variables, for which the variation domain of one variable depends on the other variable, was a real problem because most software uses interpolation in graphical representation and the surface maps were quadratic instead of triangular. Based on an original algorithm, a module was implemented in PHP to represent the triangular surface plots graphically. All the implementation described above was used in computing the confidence intervals and estimating their performance for binomial distribution sample sizes and variables.
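
    One of the standard interval constructions such a calculator can report is the exact (Clopper-Pearson) interval, sketched here via beta quantiles; the counts are illustrative.

        from scipy.stats import beta

        def clopper_pearson(k, n, conf=0.95):
            # Exact confidence interval for a binomial proportion with k successes in n trials.
            alpha = 1.0 - conf
            lo = 0.0 if k == 0 else beta.ppf(alpha / 2, k, n - k + 1)
            hi = 1.0 if k == n else beta.ppf(1 - alpha / 2, k + 1, n - k)
            return lo, hi

        print(clopper_pearson(17, 50))               # e.g. 17 positives out of 50 sampled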

  9. Distribution-Preserving Stratified Sampling for Learning Problems.

    Science.gov (United States)

    Cervellera, Cristiano; Maccio, Danilo

    2017-06-09

    The need for extracting a small sample from a large amount of real data, possibly streaming, arises routinely in learning problems, e.g., for storage, to cope with computational limitations, obtain good training/test/validation sets, and select minibatches for stochastic gradient neural network training. Unless we have reasons to select the samples in an active way dictated by the specific task and/or model at hand, it is important that the distribution of the selected points is as similar as possible to the original data. This is obvious for unsupervised learning problems, where the goal is to gain insights on the distribution of the data, but it is also relevant for supervised problems, where the theory explains how the training set distribution influences the generalization error. In this paper, we analyze the technique of stratified sampling from the point of view of distances between probabilities. This allows us to introduce an algorithm, based on recursive binary partition of the input space, aimed at obtaining samples that are distributed as much as possible as the original data. A theoretical analysis is proposed, proving the (greedy) optimality of the procedure together with explicit error bounds. An adaptive version of the algorithm is also introduced to cope with streaming data. Simulation tests on various data sets and different learning tasks are also provided.

  10. Simulating quantum correlations as a distributed sampling problem

    International Nuclear Information System (INIS)

    Degorre, Julien; Laplante, Sophie; Roland, Jeremie

    2005-01-01

    It is known that quantum correlations exhibited by a maximally entangled qubit pair can be simulated with the help of shared randomness, supplemented with additional resources, such as communication, postselection or nonlocal boxes. For instance, in the case of projective measurements, it is possible to solve this problem with protocols using one bit of communication or making one use of a nonlocal box. We show that this problem reduces to a distributed sampling problem. We give a new method to obtain samples from a biased distribution, starting with shared random variables following a uniform distribution, and use it to build distributed sampling protocols. This approach allows us to derive, in a simpler and unified way, many existing protocols for projective measurements, and extend them to positive operator valued measurements. Moreover, this approach naturally leads to a local hidden variable model for Werner states

  11. Sampling from the normal and exponential distributions

    International Nuclear Information System (INIS)

    Chaplin, K.R.; Wills, C.A.

    1982-01-01

    Methods for generating random numbers from the normal and exponential distributions are described. These involve dividing each function into subregions, and for each of these developing a method of sampling usually based on an acceptance rejection technique. When sampling from the normal or exponential distribution, each subregion provides the required random value with probability equal to the ratio of its area to the total area. Procedures written in FORTRAN for the CYBER 175/CDC 6600 system are provided to implement the two algorithms
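
    A textbook example of the acceptance-rejection idea described here (not the report's FORTRAN subregion scheme): the standard normal generated from an Exp(1) envelope for the half-normal, plus the exponential by inverse-CDF transform.

        import math
        import random

        def normal_by_rejection():
            # Standard normal: Exp(1) envelope for the half-normal, then a random sign.
            while True:
                y = -math.log(1.0 - random.random())             # Exp(1) proposal
                if random.random() <= math.exp(-0.5 * (y - 1.0) ** 2):
                    return y if random.random() < 0.5 else -y    # symmetrize

        def exponential_by_inversion(rate):
            # Exponential(rate) by the inverse-CDF transform.
            return -math.log(1.0 - random.random()) / rate

        xs = [normal_by_rejection() for _ in range(100000)]
        mean = sum(xs) / len(xs)
        var = sum((x - mean) ** 2 for x in xs) / len(xs)
        print(round(mean, 3), round(var, 3))                     # close to 0 and 1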

  12. Distribution of age at menopause in two Danish samples

    DEFF Research Database (Denmark)

    Boldsen, J L; Jeune, B

    1990-01-01

    We analyzed the distribution of reported age at natural menopause in two random samples of Danish women (n = 176 and n = 150) to determine the shape of the distribution and to disclose any possible trends in the distribution parameters. It was necessary to correct the frequencies of the reported ages for the effect of differing ages at reporting. The corrected distribution of age at menopause differs from the normal distribution in the same way in both samples. Both distributions could be described by a mixture of two normal distributions. It appears that most of the parameters of the normal distribution mixtures remain unchanged over a 50-year time lag. The position of the distribution, that is, the mean age at menopause, however, increases slightly but significantly.
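
    A sketch of fitting a two-component normal mixture by EM, the kind of model used above to describe the corrected distribution; the synthetic ages are made up for illustration.

        import numpy as np

        def em_two_normals(x, n_iter=200):
            x = np.asarray(x, dtype=float)
            mu = np.quantile(x, [0.25, 0.75])        # crude initialisation from the data quantiles
            sd = np.array([x.std(), x.std()])
            w = np.array([0.5, 0.5])
            for _ in range(n_iter):
                # E-step: responsibility of each component for each observation
                dens = np.vstack([wk * np.exp(-0.5 * ((x - mk) / sk) ** 2) / (sk * np.sqrt(2 * np.pi))
                                  for wk, mk, sk in zip(w, mu, sd)])
                resp = dens / dens.sum(axis=0)
                # M-step: weighted means, standard deviations and mixing weights
                nk = resp.sum(axis=1)
                mu = (resp * x).sum(axis=1) / nk
                sd = np.sqrt((resp * (x - mu[:, None]) ** 2).sum(axis=1) / nk)
                w = nk / x.size
            return w, mu, sd

        rng = np.random.default_rng(4)
        ages = np.concatenate([rng.normal(45, 3, 60), rng.normal(51, 2, 120)])   # synthetic ages
        print(em_two_normals(ages))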

  13. Approach-Induced Biases in Human Information Sampling.

    Directory of Open Access Journals (Sweden)

    Laurence T Hunt

    2016-11-01

    Information sampling is often biased towards seeking evidence that confirms one's prior beliefs. Despite such biases being a pervasive feature of human behavior, their underlying causes remain unclear. Many accounts of these biases appeal to limitations of human hypothesis testing and cognition, de facto evoking notions of bounded rationality, but neglect more basic aspects of behavioral control. Here, we investigated a potential role for Pavlovian approach in biasing which information humans will choose to sample. We collected a large novel dataset from 32,445 human subjects, making over 3 million decisions, who played a gambling task designed to measure the latent causes and extent of information-sampling biases. We identified three novel approach-related biases, formalized by comparing subject behavior to a dynamic programming model of optimal information gathering. These biases reflected the amount of information sampled ("positive evidence approach"), the selection of which information to sample ("sampling the favorite"), and the interaction between information sampling and subsequent choices ("rejecting unsampled options"). The prevalence of all three biases was related to a Pavlovian approach-avoid parameter quantified within an entirely independent economic decision task. Our large dataset also revealed that individual differences in the amount of information gathered are a stable trait across multiple gameplays and can be related to demographic measures, including age and educational attainment. As well as revealing limitations in cognitive processing, our findings suggest information sampling biases reflect the expression of primitive, yet potentially ecologically adaptive, behavioral repertoires. One such behavior is sampling from options that will eventually be chosen, even when other sources of information are more pertinent for guiding future action.

  14. A Note on Information-Directed Sampling and Thompson Sampling

    OpenAIRE

    Zhou, Li

    2015-01-01

    This note introduces three Bayesian-style multi-armed bandit algorithms: information-directed sampling, Thompson sampling, and generalized Thompson sampling. The goal is to give an intuitive explanation of these three algorithms and their regret bounds, and to provide some derivations that are omitted in the original papers.

  15. Formal Specification of Distributed Information Systems

    NARCIS (Netherlands)

    Vis, J.; Brinksma, Hendrik; de By, R.A.; de By, R.A.

    The design of distributed information systems tends to be complex and therefore error-prone. However, in the field of monolithic, i.e. non-distributed, information systems much has already been achieved, and by now, the principles of their design seem to be fairly well-understood. The past decade

  16. Distribution of pesticide residues in soil and uncertainty of sampling.

    Science.gov (United States)

    Suszter, Gabriela K; Ambrus, Árpád

    2017-08-03

    Pesticide residues were determined in about 120 soil cores taken randomly from the top 15 cm layer of two sunflower fields about 30 days after preemergence herbicide treatments. Samples were extracted with an acetone-ethyl acetate mixture and the residues were determined with GC-TSD. Residues of dimethenamid, pendimethalin, and prometryn ranged from 0.005 to 2.97 mg/kg. Their relative standard deviations (CV) were between 0.66 and 1.13. The relative frequency distributions of residues in soil cores were very similar to those observed in root and tuber vegetables grown in pesticide-treated soils. Based on all available information, a typical CV of 1.00 was estimated for pesticide residues in primary soil samples (soil cores). The corresponding expected relative uncertainty of sampling is 20% when composite samples of size 25 are taken. To obtain a reliable estimate of the average residues in the top 15 cm layer of soil of a field, up to 8 independent replicate random samples should be taken. The improvement in the estimate of the actual residue level of the sampled field would be marginal if a larger number of samples were taken.
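
    The 20% figure follows directly from the quoted CV of 1.00 and composites of 25 cores, since the relative uncertainty of a composite scales with one over the square root of the number of increments:

        import math

        cv_core = 1.00        # typical relative standard deviation of residues in a single soil core
        n_cores = 25          # cores pooled into one composite sample
        print(f"relative sampling uncertainty of the composite: {cv_core / math.sqrt(n_cores):.0%}")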

  17. The Distribution of Heavy Metal Pollutants in Suez Bay Using Geographic Information System (GIS)

    International Nuclear Information System (INIS)

    Hassan, H.B.; Mohamed, W.M.

    2017-01-01

    Suez city represents the southern entrance of the Suez Canal. As a result of the rapid development of industrialization and anthropogenic activities in Suez city, contaminants such as heavy metals may enter through the Suez bay boundaries. The geographical information system (Arc GIS 9.1) is used to study the spatial distribution of heavy metal concentrations (Cd, Mn, Fe, Ni, Pb, Cu and Zn) in water samples which were collected at four different sampling sites (I, II, III and IV) in Suez bay. In this study, a tabular data representation of the spatial distribution was developed using the inverse distance weighted (IDW) interpolation method. The GIS technique was applied to transfer the information into a final map illustrating the spatial distribution of heavy metals within the studied area. The GIS models showed high concentrations of heavy metals at some sites, affected by the activities of Suez city. An overall distribution map of heavy metals was obtained from the GIS spatial analysis. Site (IV) in Suez city was the most heavily polluted area in the overall distribution map

  18. Assessing protein conformational sampling methods based on bivariate lag-distributions of backbone angles

    KAUST Repository

    Maadooliat, Mehdi; Gao, Xin; Huang, Jianhua Z.

    2012-01-01

    Despite considerable progress in the past decades, protein structure prediction remains one of the major unsolved problems in computational biology. Angular-sampling-based methods have been extensively studied recently due to their ability to capture the continuous conformational space of protein structures. The literature has focused on using a variety of parametric models of the sequential dependencies between angle pairs along the protein chains. In this article, we present a thorough review of angular-sampling-based methods by assessing three main questions: What is the best distribution type to model the protein angles? What is a reasonable number of components in a mixture model that should be considered to accurately parameterize the joint distribution of the angles? and What is the order of the local sequence-structure dependency that should be considered by a prediction method? We assess the model fits for different methods using bivariate lag-distributions of the dihedral/planar angles. Moreover, the main information across the lags can be extracted using a technique called Lag singular value decomposition (LagSVD), which considers the joint distribution of the dihedral/planar angles over different lags using a nonparametric approach and monitors the behavior of the lag-distribution of the angles using singular value decomposition. As a result, we developed graphical tools and numerical measurements to compare and evaluate the performance of different model fits. Furthermore, we developed a web-tool (http://www.stat.tamu.edu/~madoliat/LagSVD) that can be used to produce informative animations. © The Author 2012. Published by Oxford University Press.

  19. Assessing protein conformational sampling methods based on bivariate lag-distributions of backbone angles

    KAUST Repository

    Maadooliat, Mehdi

    2012-08-27

    Despite considerable progress in the past decades, protein structure prediction remains one of the major unsolved problems in computational biology. Angular-sampling-based methods have been extensively studied recently due to their ability to capture the continuous conformational space of protein structures. The literature has focused on using a variety of parametric models of the sequential dependencies between angle pairs along the protein chains. In this article, we present a thorough review of angular-sampling-based methods by assessing three main questions: What is the best distribution type to model the protein angles? What is a reasonable number of components in a mixture model that should be considered to accurately parameterize the joint distribution of the angles? and What is the order of the local sequence-structure dependency that should be considered by a prediction method? We assess the model fits for different methods using bivariate lag-distributions of the dihedral/planar angles. Moreover, the main information across the lags can be extracted using a technique called Lag singular value decomposition (LagSVD), which considers the joint distribution of the dihedral/planar angles over different lags using a nonparametric approach and monitors the behavior of the lag-distribution of the angles using singular value decomposition. As a result, we developed graphical tools and numerical measurements to compare and evaluate the performance of different model fits. Furthermore, we developed a web-tool (http://www.stat.tamu.edu/~madoliat/LagSVD) that can be used to produce informative animations. © The Author 2012. Published by Oxford University Press.

  20. Aspects of Students' Reasoning about Variation in Empirical Sampling Distributions

    Science.gov (United States)

    Noll, Jennifer; Shaughnessy, J. Michael

    2012-01-01

    Sampling tasks and sampling distributions provide a fertile realm for investigating students' conceptions of variability. A project-designed teaching episode on samples and sampling distributions was team-taught in 6 research classrooms (2 middle school and 4 high school) by the investigators and regular classroom mathematics teachers. Data…

  1. Water sample-collection and distribution system

    Science.gov (United States)

    Brooks, R. R.

    1978-01-01

    Collection and distribution system samples water from six designated stations, filtered if desired, and delivers it to various analytical sensors. System may be controlled by Water Monitoring Data Acquisition System or operated manually.

  2. Enhanced conformational sampling using enveloping distribution sampling.

    Science.gov (United States)

    Lin, Zhixiong; van Gunsteren, Wilfred F

    2013-10-14

    To lessen the problem of insufficient conformational sampling in biomolecular simulations is still a major challenge in computational biochemistry. In this article, an application of the method of enveloping distribution sampling (EDS) is proposed that addresses this challenge and its sampling efficiency is demonstrated in simulations of a hexa-β-peptide whose conformational equilibrium encompasses two different helical folds, i.e., a right-handed 2.7(10∕12)-helix and a left-handed 3(14)-helix, separated by a high energy barrier. Standard MD simulations of this peptide using the GROMOS 53A6 force field did not reach convergence of the free enthalpy difference between the two helices even after 500 ns of simulation time. The use of soft-core non-bonded interactions in the centre of the peptide did enhance the number of transitions between the helices, but at the same time led to neglect of relevant helical configurations. In the simulations of a two-state EDS reference Hamiltonian that envelops both the physical peptide and the soft-core peptide, sampling of the conformational space of the physical peptide ensures that physically relevant conformations can be visited, and sampling of the conformational space of the soft-core peptide helps to enhance the transitions between the two helices. The EDS simulations sampled many more transitions between the two helices and showed much faster convergence of the relative free enthalpy of the two helices compared with the standard MD simulations with only a slightly larger computational effort to determine optimized EDS parameters. Combined with various methods to smoothen the potential energy surface, the proposed EDS application will be a powerful technique to enhance the sampling efficiency in biomolecular simulations.

  3. An Investigation of the Sampling Distribution of the Congruence Coefficient.

    Science.gov (United States)

    Broadbooks, Wendy J.; Elmore, Patricia B.

    This study developed and investigated an empirical sampling distribution of the congruence coefficient. The effects of sample size, number of variables, and population value of the congruence coefficient on the sampling distribution of the congruence coefficient were examined. Sample data were generated on the basis of the common factor model and…

  4. An Investigation of the Sampling Distributions of Equating Coefficients.

    Science.gov (United States)

    Baker, Frank B.

    1996-01-01

    Using the characteristic curve method for dichotomously scored test items, the sampling distributions of equating coefficients were examined. Simulations indicate that for the equating conditions studied, the sampling distributions of the equating coefficients appear to have acceptable characteristics, suggesting confidence in the values obtained…

  5. Correlated random sampling for multivariate normal and log-normal distributions

    International Nuclear Information System (INIS)

    Žerovnik, Gašper; Trkov, Andrej; Kodeli, Ivan A.

    2012-01-01

    A method for correlated random sampling is presented. Representative samples for multivariate normal or log-normal distribution can be produced. Furthermore, any combination of normally and log-normally distributed correlated variables may be sampled to any requested accuracy. Possible applications of the method include sampling of resonance parameters which are used for reactor calculations.
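
    A minimal sketch of the underlying idea (not the paper's full method, which also controls the accuracy of the resulting correlations): draw correlated multivariate normal variates via a Cholesky factor and exponentiate selected components to obtain log-normal marginals.

        import numpy as np

        def correlated_samples(mean, cov, size, lognormal_idx=(), rng=None):
            rng = rng or np.random.default_rng()
            L = np.linalg.cholesky(np.asarray(cov, dtype=float))
            z = rng.standard_normal((size, len(mean)))
            x = np.asarray(mean, dtype=float) + z @ L.T                      # correlated normal variates
            x[:, list(lognormal_idx)] = np.exp(x[:, list(lognormal_idx)])    # log-normal marginals
            return x

        cov = [[0.04, 0.018], [0.018, 0.09]]         # illustrative covariance of the underlying normals
        s = correlated_samples([1.0, 0.5], cov, 100000, lognormal_idx=[1])
        print(np.corrcoef(s.T))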

  6. Community structure informs species geographic distributions

    KAUST Repository

    Montesinos-Navarro, Alicia

    2018-05-23

    Understanding what determines species' geographic distributions is crucial for assessing global change threats to biodiversity. Measuring limits on distributions is usually, and necessarily, done with data at large geographic extents and coarse spatial resolution. However, survival of individuals is determined by processes that happen at small spatial scales. The relative abundance of coexisting species (i.e. 'community structure') reflects assembly processes occurring at small scales, and is often available for relatively extensive areas, so could be useful for explaining species distributions. We demonstrate that Bayesian Network Inference (BNI) can overcome several challenges to including community structure into studies of species distributions, despite having been little used to date. We hypothesized that the relative abundance of coexisting species can improve predictions of species distributions. In 1570 assemblages of 68 Mediterranean woody plant species we used BNI to incorporate community structure into Species Distribution Models (SDMs), alongside environmental information. Information on species associations improved SDM predictions of community structure and species distributions moderately, though for some habitat specialists the deviance explained increased by up to 15%. We demonstrate that most species associations (95%) were positive and occurred between species with ecologically similar traits. This suggests that SDM improvement could be because species co-occurrences are a proxy for local ecological processes. Our study shows that Bayesian Networks, when interpreted carefully, can be used to include local conditions into measurements of species' large-scale distributions, and this information can improve the predictions of species distributions.

  7. New information on parton distributions

    International Nuclear Information System (INIS)

    Martin, A.D.; Stirling, W.J.; Roberts, R.G.

    1992-04-01

    New data on structure functions from deep-inelastic scattering provide new information on parton distributions, particularly in the region 0.01 < x < 0.1. We discuss the F_2 data from the New Muon Collaboration (NMC) and its implications for other processes, and the evidence for SU(2) symmetry breaking in the light quark sea. We show that although good fits can be obtained with or without this symmetry breaking, more physically reasonable parton distributions are obtained if we allow d-bar > u-bar at small x. With the inclusion of the latest deep-inelastic data we find α_s(M_Z) = 0.111 (+0.004, -0.005). We also show how W, Z and Drell-Yan production at p-bar p colliders can give information on parton distributions. (Author)

  8. Superfund Site Information - Site Sampling Data

    Data.gov (United States)

    U.S. Environmental Protection Agency — This asset includes Superfund site-specific sampling information including location of samples, types of samples, and analytical chemistry characteristics of...

  9. Log-stable concentration distributions of trace elements in biomedical samples

    International Nuclear Information System (INIS)

    Kubala-Kukus, A.; Kuternoga, E.; Braziewicz, J.; Pajek, M.

    2004-01-01

    In the present paper, which follows our earlier observation that the asymmetric and long-tailed concentration distributions of trace elements in biomedical samples, measured by the X-ray fluorescence techniques, can be modeled by the log-stable distributions, further specific aspects of this observation are discussed. First, we demonstrate that, typically, for a quite substantial fraction (10-20%) of trace elements studied in different kinds of biomedical samples, the measured concentration distributions are described in fact by the 'symmetric' log-stable distributions, i.e. the asymmetric distributions which are described by the symmetric stable distributions. This observation is, in fact, expected for the random multiplicative process, which models the concentration distributions of trace elements in the biomedical samples. The log-stable nature of concentration distribution of trace elements results in several problems of statistical nature, which have to be addressed in XRF data analysis practice. Consequently, in the present paper, the following problems, namely (i) the estimation of parameters for stable distributions and (ii) the testing of the log-stable nature of the concentration distribution by using the Anderson-Darling (A²) test, especially for symmetric stable distributions, are discussed in detail. In particular, the maximum likelihood estimation and Monte Carlo simulation techniques were used, respectively, for estimation of stable distribution parameters and calculation of the critical values for the Anderson-Darling test. The discussed ideas are exemplified by the results of the study of trace element concentration distributions in selected biomedical samples, which were obtained by using the X-ray fluorescence (XRF, TXRF) methods

  10. Experimental determination of size distributions: analyzing proper sample sizes

    International Nuclear Information System (INIS)

    Buffo, A; Alopaeus, V

    2016-01-01

    The measurement of various particle size distributions is a crucial aspect of many applications in the process industry. Size distribution is often related to the final product quality, as in crystallization or polymerization. In other cases it is related to the correct evaluation of heat and mass transfer, as well as reaction rates, which depend on the interfacial area between the different phases, or to the assessment of yield stresses of polycrystalline metal/alloy samples. The experimental determination of such distributions often involves laborious sampling procedures and the statistical significance of the outcome is rarely investigated. In this work, we propose a novel rigorous tool, based on inferential statistics, to determine the number of samples needed to obtain reliable measurements of size distribution, according to specific requirements defined a priori. Such a methodology can be adopted regardless of the measurement technique used. (paper)

  11. Interference Imaging of Refractive Index Distribution in Thin Samples

    Directory of Open Access Journals (Sweden)

    Ivan Turek

    2004-01-01

    Three versions of interference imaging of refractive index distribution in thin samples are suggested in this contribution. These are based on imaging of the interference field created by waves reflected from the front and back sample surfaces, or imaging of the interference field of a Michelson or Mach-Zehnder interferometer with the sample placed in one of the interferometer's arms. The work discusses the advantages and disadvantages of these techniques and presents the results of imaging of the refractive index distribution in a photorefractive record of a quasi-harmonic optical field in a thin LiNbO3 crystal sample.

  12. Sample sizes and model comparison metrics for species distribution models

    Science.gov (United States)

    B.B. Hanberry; H.S. He; D.C. Dey

    2012-01-01

    Species distribution models use small samples to produce continuous distribution maps. The question of how small a sample can be to produce an accurate model generally has been answered based on comparisons to maximum sample sizes of 200 observations or fewer. In addition, model comparisons often are made with the kappa statistic, which has become controversial....

  13. Building a foundation to study distributed information behaviour

    Directory of Open Access Journals (Sweden)

    Terry L. von Thaden

    2007-01-01

    Introduction. The purpose of this research is to assess information behaviour as it pertains to operational teams in dynamic safety critical operations. Method. In this paper, I describe some of the problems faced by crews on modern flight decks and suggest a framework modelled on Information Science, Human Factors, and Activity Theory research to assess the distribution of information actions, namely information identification, gathering and use, by teams of users in a dynamic, safety critical environment. Analysis. By analysing the information behaviour of crews who have accidents and those who do not, researchers may be able to ascertain how they make use of, or fail to make use of, essential, safety critical information in their information environment. The ultimate goal of this research is to differentiate information behaviour among the distinct outcomes. Results. This research affords the possibility to discern differences in distributed information behaviour, illustrating that crews who err to the point of an accident appear to practice different distributed information behaviour than those who do not. This foundation serves to operationalise team sense-making by illustrating the social practice of information structuring within the activity of the work environment. Conclusion. The distributed information behaviour framework provides a useful structure to study the patterning and organization of information distributed over space and time to reach a common goal. This framework may allow researchers and investigators alike to identify critical information activity in the negotiation of meaning in high reliability safety critical work, eventually informing safer practice. This framework is applicable to other domains.

  14. Eccentricity samples: Implications on the potential and the velocity distribution

    Directory of Open Access Journals (Sweden)

    Cubarsi R.

    2017-01-01

    Full Text Available Planar and vertical epicycle frequencies and the local angular velocity are related to the derivatives up to the second order of the local potential and can be used to test the shape of the potential from stellar disc samples. These samples show a more complex velocity distribution than halo stars and should provide a more realistic test. We assume an axisymmetric potential allowing a mixture of independent ellipsoidal velocity distributions, of separable or Staeckel form in cylindrical or spherical coordinates. We prove that the values of the local constants are not consistent with a potential that is additively separable in cylindrical coordinates, nor with a spherically symmetric potential. The simplest potential that fits the local constants is used to show that the harmonic and non-harmonic terms of the potential are equally important. The same analysis is used to estimate the local constants. Two families of nested subsamples selected for decreasing planar and vertical eccentricities are used to bear out the relation between the mean squared planar and vertical eccentricities and the velocity dispersions of the subsamples. According to the first-order epicycle model, the radial and vertical velocity components provide accurate information on the planar and vertical epicycle frequencies. However, it is impossible to account for the asymmetric drift, which introduces a systematic bias in the estimation of the third constant. Under a more general model, when the asymmetric drift is taken into account, the rotation velocity dispersions together with their asymmetric drift provide the correct fit for the local angular velocity. The consistency of the results shows that this new method based on the distribution of eccentricities is worth using for kinematic stellar samples. [Project of the Serbian Ministry of Education, Science and Technological Development, Grant no. 176011: Dynamics and Kinematics of Celestial Bodies and Systems]

  15. Two sample Bayesian prediction intervals for order statistics based on the inverse exponential-type distributions using right censored sample

    Directory of Open Access Journals (Sweden)

    M.M. Mohie El-Din

    2011-10-01

    Full Text Available In this paper, two sample Bayesian prediction intervals for order statistics (OS) are obtained. This prediction is based on a certain class of the inverse exponential-type distributions using a right censored sample. A general class of prior density functions is used and the predictive cumulative function is obtained in the two samples case. The class of the inverse exponential-type distributions includes several important distributions such as the inverse Weibull distribution, the inverse Burr distribution, the loglogistic distribution, the inverse Pareto distribution and the inverse paralogistic distribution. Special cases of the inverse Weibull model such as the inverse exponential model and the inverse Rayleigh model are considered.

  16. A Distributed User Information System

    Science.gov (United States)

    1990-03-01

    A Distributed User Information System. Steven D. Miller, Scott Carson, and Leo Mark. Institute for Advanced Computer Studies and Department of Computer Science, University of Maryland, College Park, MD 20742. Abstract: Current user information database technology ...

  17. Geotechnical parameter spatial distribution stochastic analysis based on multi-precision information assimilation

    Science.gov (United States)

    Wang, C.; Rubin, Y.

    2014-12-01

    The spatial distribution of the compression modulus Es, an important geotechnical parameter, contributes considerably to the understanding of the underlying geological processes and to the adequate assessment of the mechanical effects of Es on the differential settlement of large continuous structure foundations. These analyses should be derived using an assimilating approach that combines in-situ static cone penetration tests (CPT) with borehole experiments. To achieve such a task, the Es distribution of a stratum of silty clay in region A of the China Expo Center (Shanghai) is studied using the Bayesian maximum entropy method. This method rigorously and efficiently integrates geotechnical investigations of different precision and their sources of uncertainty. Single CPT samplings were modeled as a rational probability density curve by maximum entropy theory. The spatial prior multivariate probability density function (PDF) and the likelihood PDF of the CPT positions were built from borehole experiments and the potential value of the prediction point; then, after numerical integration over the CPT probability density curves, the posterior probability density curve of the prediction point was calculated within the Bayesian reverse interpolation framework. The results were compared between Gaussian Sequential Stochastic Simulation and Bayesian methods. The differences were also discussed between single CPT samplings with a normal distribution and the simulated probability density curve based on maximum entropy theory. It is shown that the study of Es spatial distributions can be improved by properly incorporating CPT sampling variation into the interpolation process, and that more informative estimations are generated by considering CPT uncertainty for the estimation points. The calculation illustrates the significance of stochastic Es characterization in a stratum, and identifies limitations associated with inadequate geostatistical interpolation techniques. This characterization results will provide a multi

  18. Statistical Symbolic Execution with Informed Sampling

    Science.gov (United States)

    Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco

    2014-01-01

    Symbolic execution techniques have been proposed recently for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but have scalability issues due to high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose Informed Sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that the informed sampling obtains more precise results and converges faster than a purely statistical analysis and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can give meaningful results under the same time and memory limits.
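
    The Bayesian-estimation step of the abstract can be sketched independently of the symbolic-execution machinery: each sampled path is a Bernoulli trial for "target event reached", and a Beta posterior summarises the reaching probability. This is only an illustration of that step under a uniform prior; the informed-sampling pruning and the partial exact analysis of the paper are not reproduced.

```python
from scipy import stats

def reach_probability_posterior(hits, trials, alpha0=1.0, beta0=1.0):
    """Beta posterior over the probability of reaching the target event,
    given `hits` successes among `trials` sampled program paths."""
    posterior = stats.beta(alpha0 + hits, beta0 + trials - hits)
    lo, hi = posterior.interval(0.95)
    return posterior.mean(), (lo, hi)

mean, ci = reach_probability_posterior(hits=7, trials=1000)
print(f"P(target event) ~ {mean:.4f}, 95% credible interval ({ci[0]:.4f}, {ci[1]:.4f})")
```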

  19. Attenuation of species abundance distributions by sampling

    Science.gov (United States)

    Shimadzu, Hideyasu; Darnell, Ross

    2015-01-01

    Quantifying biodiversity aspects such as species presence/absence, richness and abundance is an important challenge to answer scientific and resource management questions. In practice, biodiversity can only be assessed from biological material taken by surveys, a difficult task given limited time and resources. A type of random sampling, often called sub-sampling, is a commonly used technique to reduce the amount of time and effort for investigating large quantities of biological samples. However, it is not immediately clear how (sub-)sampling affects the estimate of biodiversity aspects from a quantitative perspective. This paper specifies the effect of (sub-)sampling as attenuation of the species abundance distribution (SAD), and articulates how sampling bias is induced in the SAD by random sampling. The framework presented also reveals some confusion in previous theoretical studies. PMID:26064626
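
    The attenuation effect can be illustrated with simple binomial thinning: if each individual is retained independently with probability p, rare species drop out of the observed sample and the abundance distribution shifts downwards. The community below is synthetic and only meant to show the qualitative effect the abstract describes, not the paper's exact formalism.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "true" community: geometric-like abundances for 200 species.
true_abundance = rng.geometric(p=0.02, size=200)

def subsample(abundance, p):
    """Binomial thinning: every individual is kept independently with probability p."""
    return rng.binomial(abundance, p)

for p in (1.0, 0.1, 0.01):
    observed = subsample(true_abundance, p)
    print(f"sampling fraction {p:5}: observed richness {np.sum(observed > 0):3d}, "
          f"observed individuals {observed.sum()}")
```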

  20. Exact run length distribution of the double sampling x-bar chart with estimated process parameters

    Directory of Open Access Journals (Sweden)

    Teoh, W. L.

    2016-05-01

    Full Text Available Since the run length distribution is generally highly skewed, a significant concern about focusing too much on the average run length (ARL) criterion is that we may miss some crucial information about a control chart’s performance. Thus it is important to investigate the entire run length distribution of a control chart for an in-depth understanding before implementing the chart in process monitoring. In this paper, the percentiles of the run length distribution for the double sampling (DS) X-bar chart with estimated process parameters are computed. Knowledge of the percentiles of the run length distribution provides a more comprehensive understanding of the expected behaviour of the run length. This additional information includes early false alarms, the skewness of the run length distribution, and the median run length (MRL). A comparison of the run length distribution between the optimal ARL-based and MRL-based DS X-bar charts with estimated process parameters is presented in this paper. Examples of applications are given to help practitioners select the best design scheme of the DS X-bar chart with estimated process parameters, based on their specific purpose.
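
    As an illustration of why run-length percentiles carry information that the ARL alone hides, the sketch below uses the simplest possible case: a chart whose per-sample signal probability is constant and known, so the run length is geometric. The DS X-bar chart with estimated parameters has a more involved run-length distribution, which the paper computes exactly; the numbers here are only indicative.

```python
import math

def run_length_summary(signal_prob, percentiles=(0.05, 0.5, 0.95)):
    """Percentiles and ARL of a geometric run-length distribution
    (constant, known per-sample signal probability)."""
    pct = {q: math.ceil(math.log(1.0 - q) / math.log(1.0 - signal_prob))
           for q in percentiles}
    return pct, 1.0 / signal_prob

pct, arl = run_length_summary(0.0027)        # Shewhart-like in-control rate
print(pct, "ARL =", round(arl, 1))           # MRL (50th percentile) sits well below the ARL
```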

  1. Estimation of time-delayed mutual information and bias for irregularly and sparsely sampled time-series

    International Nuclear Information System (INIS)

    Albers, D.J.; Hripcsak, George

    2012-01-01

    Highlights: ► Time-delayed mutual information for irregularly sampled time-series. ► Estimation bias for the time-delayed mutual information calculation. ► Fast, simple, PDF estimator independent, time-delayed mutual information bias estimate. ► Quantification of data-set-size limits of the time-delayed mutual information calculation. - Abstract: A method to estimate the time-dependent correlation via an empirical bias estimate of the time-delayed mutual information for a time-series is proposed. In particular, the bias of the time-delayed mutual information is shown to often be equivalent to the mutual information between two distributions of points from the same system separated by infinite time. Thus intuitively, estimation of the bias is reduced to estimation of the mutual information between distributions of data points separated by large time intervals. The proposed bias estimation techniques are shown to work for Lorenz equations data and glucose time series data of three patients from the Columbia University Medical Center database.
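
    A hedged sketch of the underlying idea: estimate the time-delayed mutual information with a plug-in histogram estimator and take the value at a delay long enough for the two samples to be effectively independent as a bias proxy, as the abstract suggests. The authors' estimator for irregularly and sparsely sampled series is not reproduced; the signal and parameters below are illustrative.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Plug-in (histogram) estimate of the mutual information in nats."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def tdmi_bias_corrected(series, delay, long_delay):
    """TDMI at `delay` minus the bias proxy measured at a very long delay."""
    tdmi = mutual_information(series[:-delay], series[delay:])
    bias = mutual_information(series[:-long_delay], series[long_delay:])
    return tdmi - bias

rng = np.random.default_rng(1)
signal = np.sin(np.linspace(0, 60 * np.pi, 6000)) + 0.3 * rng.standard_normal(6000)
print(tdmi_bias_corrected(signal, delay=5, long_delay=2500))
```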

  2. Distributed Administrative Management Information System (DAMIS).

    Science.gov (United States)

    Juckiewicz, Robert; Kroculick, Joseph

    Columbia University's major program to distribute its central administrative data processing to its various schools and departments is described. The Distributed Administrative Management Information System (DAMIS) will link every department and school within the university via microcomputers, terminals, and/or minicomputers to the central…

  3. Finite-key analysis for quantum key distribution with weak coherent pulses based on Bernoulli sampling

    Science.gov (United States)

    Kawakami, Shun; Sasaki, Toshihiko; Koashi, Masato

    2017-07-01

    An essential step in quantum key distribution is the estimation of parameters related to the leaked amount of information, which is usually done by sampling of the communication data. When the data size is finite, the final key rate depends on how the estimation process handles statistical fluctuations. Many of the present security analyses are based on the method with simple random sampling, where hypergeometric distribution or its known bounds are used for the estimation. Here we propose a concise method based on Bernoulli sampling, which is related to binomial distribution. Our method is suitable for the Bennett-Brassard 1984 (BB84) protocol with weak coherent pulses [C. H. Bennett and G. Brassard, Proceedings of the IEEE Conference on Computers, Systems and Signal Processing (IEEE, New York, 1984), Vol. 175], reducing the number of estimated parameters to achieve a higher key generation rate compared to the method with simple random sampling. We also apply the method to prove the security of the differential-quadrature-phase-shift (DQPS) protocol in the finite-key regime. The result indicates that the advantage of the DQPS protocol over the phase-encoding BB84 protocol in terms of the key rate, which was previously confirmed in the asymptotic regime, persists in the finite-key regime.

  4. Extending the alias Monte Carlo sampling method to general distributions

    International Nuclear Information System (INIS)

    Edwards, A.L.; Rathkopf, J.A.; Smidt, R.K.

    1991-01-01

    The alias method is a Monte Carlo sampling technique that offers significant advantages over more traditional methods. It equals the accuracy of table lookup and the speed of equally probable bins. The original formulation of this method sampled from discrete distributions and was easily extended to histogram distributions. We have extended the method further to applications more germane to Monte Carlo particle transport codes: continuous distributions. This paper presents the alias method as originally derived and our extensions to simple continuous distributions represented by piecewise linear functions. We also present a method to interpolate accurately between distributions tabulated at points other than the point of interest. We present timing studies that demonstrate the method's increased efficiency over table lookup and show further speedup achieved through vectorization. 6 refs., 12 figs., 2 tabs
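
    For reference, a minimal sketch of the alias method for a discrete distribution (Vose's O(n) construction with O(1) draws); the histogram and piecewise-linear continuous extensions described in the abstract are not shown.

```python
import random

def build_alias_table(probs):
    """Vose's alias method: O(n) setup, O(1) per draw."""
    n = len(probs)
    scaled = [p * n for p in probs]
    prob, alias = [0.0] * n, [0] * n
    small = [i for i, p in enumerate(scaled) if p < 1.0]
    large = [i for i, p in enumerate(scaled) if p >= 1.0]
    while small and large:
        s, big = small.pop(), large.pop()
        prob[s], alias[s] = scaled[s], big
        scaled[big] -= 1.0 - scaled[s]
        (small if scaled[big] < 1.0 else large).append(big)
    for i in small + large:              # leftovers equal 1.0 up to rounding
        prob[i] = 1.0
    return prob, alias

def alias_draw(prob, alias):
    i = random.randrange(len(prob))                       # equally probable bin
    return i if random.random() < prob[i] else alias[i]   # accept the bin or take its alias

prob, alias = build_alias_table([0.1, 0.2, 0.3, 0.4])
draws = [alias_draw(prob, alias) for _ in range(100_000)]
print([round(draws.count(k) / len(draws), 3) for k in range(4)])  # ~[0.1, 0.2, 0.3, 0.4]
```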

  5. Rational learning and information sampling: on the "naivety" assumption in sampling explanations of judgment biases.

    Science.gov (United States)

    Le Mens, Gaël; Denrell, Jerker

    2011-04-01

    Recent research has argued that several well-known judgment biases may be due to biases in the available information sample rather than to biased information processing. Most of these sample-based explanations assume that decision makers are "naive": They are not aware of the biases in the available information sample and do not correct for them. Here, we show that this "naivety" assumption is not necessary. Systematically biased judgments can emerge even when decision makers process available information perfectly and are also aware of how the information sample has been generated. Specifically, we develop a rational analysis of Denrell's (2005) experience sampling model, and we prove that when information search is interested rather than disinterested, even rational information sampling and processing can give rise to systematic patterns of errors in judgments. Our results illustrate that a tendency to favor alternatives for which outcome information is more accessible can be consistent with rational behavior. The model offers a rational explanation for behaviors that had previously been attributed to cognitive and motivational biases, such as the in-group bias or the tendency to prefer popular alternatives. 2011 APA, all rights reserved

  6. Measuring Biometric Sample Quality in terms of Biometric Feature Information in Iris Images

    Directory of Open Access Journals (Sweden)

    R. Youmaran

    2012-01-01

    Full Text Available This paper develops an approach to measure the information content in a biometric feature representation of iris images. In this context, the biometric feature information is calculated using the relative entropy between the intraclass and interclass feature distributions. The collected data is regularized using a Gaussian model of the feature covariances in order to practically measure the biometric information with limited data samples. An example of this method is shown for iris templates processed using Principal Component Analysis (PCA)- and Independent Component Analysis (ICA)-based feature decomposition schemes. From this, the biometric feature information is calculated to be approximately 278 bits for PCA and 288 bits for ICA iris features using Masek's iris recognition scheme. This value approximately matches previous estimates of iris information content.
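
    The central quantity of the abstract, the relative entropy between Gaussian-modelled intraclass and interclass feature distributions, has a closed form; the sketch below evaluates it in bits. The PCA/ICA feature extraction and the covariance regularisation of the paper are not reproduced, and the feature data are synthetic.

```python
import numpy as np

def gaussian_relative_entropy_bits(mu0, cov0, mu1, cov1):
    """D(N0 || N1) in bits for multivariate Gaussian feature models."""
    k = len(mu0)
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    nats = 0.5 * (np.trace(cov1_inv @ cov0) + diff @ cov1_inv @ diff - k
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))
    return nats / np.log(2.0)

rng = np.random.default_rng(2)
intra = 0.5 * rng.standard_normal((500, 5)) + 1.0    # synthetic within-person features
inter = rng.standard_normal((500, 5))                # synthetic between-person features
print(gaussian_relative_entropy_bits(intra.mean(0), np.cov(intra.T),
                                     inter.mean(0), np.cov(inter.T)))
```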

  7. Fluid sample collection and distribution system. [qualitative analysis of aqueous samples from several points

    Science.gov (United States)

    Brooks, R. L. (Inventor)

    1979-01-01

    A multipoint fluid sample collection and distribution system is provided wherein the sample inputs are made through one or more of a number of sampling valves to a progressive cavity pump which is not susceptible to damage by large unfiltered particles. The pump output is through a filter unit that can provide a filtered multipoint sample. An unfiltered multipoint sample is also provided. An effluent sample can be taken and applied to a second progressive cavity pump for pumping to a filter unit that can provide one or more filtered effluent samples. The second pump can also provide an unfiltered effluent sample. Means are provided to periodically back flush each filter unit without shutting off the whole system.

  8. Quantifying predictability through information theory: small sample estimation in a non-Gaussian framework

    International Nuclear Information System (INIS)

    Haven, Kyle; Majda, Andrew; Abramov, Rafail

    2005-01-01

    Many situations in complex systems require quantitative estimates of the lack of information in one probability distribution relative to another. In short term climate and weather prediction, examples of these issues might involve the lack of information in the historical climate record compared with an ensemble prediction, or the lack of information in a particular Gaussian ensemble prediction strategy involving the first and second moments compared with the non-Gaussian ensemble itself. The relative entropy is a natural way to quantify the predictive utility in this information, and recently a systematic computationally feasible hierarchical framework has been developed. In practical systems with many degrees of freedom, computational overhead limits ensemble predictions to relatively small sample sizes. Here the notion of predictive utility, in a relative entropy framework, is extended to small random samples by the definition of a sample utility, a measure of the unlikeliness that a random sample was produced by a given prediction strategy. The sample utility is the minimum predictability, with a statistical level of confidence, which is implied by the data. Two practical algorithms for measuring such a sample utility are developed here. The first technique is based on the statistical method of null-hypothesis testing, while the second is based upon a central limit theorem for the relative entropy of moment-based probability densities. These techniques are tested on known probability densities with parameterized bimodality and skewness, and then applied to the Lorenz '96 model, a recently developed 'toy' climate model with chaotic dynamics mimicking the atmosphere. The results show a detection of non-Gaussian tendencies of prediction densities at small ensemble sizes with between 50 and 100 members, with a 95% confidence level

  9. The DELPHI distributed information system for exchanging LEP machine related information

    International Nuclear Information System (INIS)

    Doenszelmann, M.; Gaspar, C.

    1994-01-01

    An information management system was designed and implemented to interchange information between the DELPHI experiment at CERN and the monitoring/control system for the LEP (Large Electron Positron Collider) accelerator. This system is distributed and communicates with many different sources and destinations (LEP) using different types of communication. The system itself communicates internally via a communication system based on a publish-and-subscribe mechanism, DIM (Distributed Information Manager). The information gathered by this system is used for on-line as well as off-line data analysis. Therefore it logs the information to a database and makes it available to operators and users via DUI (DELPHI User Interface). The latter was extended to be capable of displaying "time-evolution" plots. It also handles a protocol, implemented using a finite state machine, SMI (State Management Interface), for (semi-)automatic running of the Data Acquisition System and the Slow Controls System. ((orig.))

  10. Rational Arithmetic Mathematica Functions to Evaluate the Two-Sided One Sample K-S Cumulative Sampling Distribution

    Directory of Open Access Journals (Sweden)

    J. Randall Brown

    2007-06-01

    Full Text Available One of the most widely used goodness-of-fit tests is the two-sided one sample Kolmogorov-Smirnov (K-S) test, which has been implemented by many computer statistical software packages. To calculate a two-sided p value (evaluate the cumulative sampling distribution), these packages use various methods including recursion formulae, limiting distributions, and approximations of unknown accuracy developed over thirty years ago. Based on an extensive literature search for the two-sided one sample K-S test, this paper identifies an exact formula for sample sizes up to 31, six recursion formulae, and one matrix formula that can be used to calculate a p value. To ensure accurate calculation by avoiding catastrophic cancellation and eliminating rounding error, each of these formulae is implemented in rational arithmetic. For the six recursion formulae and the matrix formula, computational experience for sample sizes up to 500 shows that computational times are increasing functions of both the sample size and the number of digits in the numerator and denominator integers of the rational number test statistic. The computational times of the seven formulae vary immensely but the Durbin recursion formula is almost always the fastest. Linear search is used to calculate the inverse of the cumulative sampling distribution (find the confidence interval half-width), and tables of calculated half-widths are presented for sample sizes up to 500. Using calculated half-widths as input, computational times for the fastest formula, the Durbin recursion formula, are given for sample sizes up to two thousand.
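
    The comparison of exact and asymptotic two-sided one-sample K-S p-values can be reproduced with standard library routines, assuming a reasonably recent SciPy that exposes the exact sampling distribution as stats.kstwo; the rational-arithmetic recursion formulae of the paper are not reimplemented here.

```python
import numpy as np
from scipy import stats

n, d = 40, 0.20                                  # sample size and observed K-S statistic

p_exact = stats.kstwo.sf(d, n)                   # exact two-sided sampling distribution
p_asympt = stats.kstwobign.sf(d * np.sqrt(n))    # limiting Kolmogorov distribution
print(f"exact p = {p_exact:.5f}, asymptotic p = {p_asympt:.5f}")

# Inverse problem from the abstract: the confidence-interval half-width at level alpha.
alpha = 0.05
print("critical value =", stats.kstwo.isf(alpha, n))
```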

  11. Informational analysis for compressive sampling in radar imaging.

    Science.gov (United States)

    Zhang, Jingxiong; Yang, Ke

    2015-03-24

    Compressive sampling or compressed sensing (CS) works on the assumption of the sparsity or compressibility of the underlying signal, relies on the trans-informational capability of the measurement matrix employed and the resultant measurements, operates with optimization-based algorithms for signal reconstruction and is thus able to complete data compression, while acquiring data, leading to sub-Nyquist sampling strategies that promote efficiency in data acquisition, while ensuring certain accuracy criteria. Information theory provides a framework complementary to classic CS theory for analyzing information mechanisms and for determining the necessary number of measurements in a CS environment, such as CS-radar, a radar sensor conceptualized or designed with CS principles and techniques. Despite increasing awareness of information-theoretic perspectives on CS-radar, reported research has been rare. This paper seeks to bridge the gap in the interdisciplinary area of CS, radar and information theory by analyzing information flows in CS-radar from sparse scenes to measurements and determining sub-Nyquist sampling rates necessary for scene reconstruction within certain distortion thresholds, given differing scene sparsity and average per-sample signal-to-noise ratios (SNRs). Simulated studies were performed to complement and validate the information-theoretic analysis. The combined strategy proposed in this paper is valuable for information-theoretic orientated CS-radar system analysis and performance evaluation.

  12. Online catalog access and distribution of remotely sensed information

    Science.gov (United States)

    Lutton, Stephen M.

    1997-09-01

    Remote sensing is providing voluminous data and value added information products. Electronic sensors, communication electronics, computer software, hardware, and network communications technology have matured to the point where a distributed infrastructure for remotely sensed information is a reality. The amount of remotely sensed data and information is making distributed infrastructure almost a necessity. This infrastructure provides data collection, archiving, cataloging, browsing, processing, and viewing for applications from scientific research to economic, legal, and national security decision making. The remote sensing field is entering a new exciting stage of commercial growth and expansion into the mainstream of government and business decision making. This paper overviews this new distributed infrastructure and then focuses on describing a software system for on-line catalog access and distribution of remotely sensed information.

  13. Rational Learning and Information Sampling: On the "Naivety" Assumption in Sampling Explanations of Judgment Biases

    Science.gov (United States)

    Le Mens, Gael; Denrell, Jerker

    2011-01-01

    Recent research has argued that several well-known judgment biases may be due to biases in the available information sample rather than to biased information processing. Most of these sample-based explanations assume that decision makers are "naive": They are not aware of the biases in the available information sample and do not correct for them.…

  14. Improved side information generation for distributed video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Forchhammer, Søren

    2008-01-01

    As a new coding paradigm, distributed video coding (DVC) deals with lossy source coding using side information to exploit the statistics at the decoder to reduce computational demands at the encoder. The performance of DVC highly depends on the quality of side information. With a better side information generation method, fewer bits will be requested from the encoder and more reliable decoded frames will be obtained. In this paper, a side information generation method is introduced to further improve the rate-distortion (RD) performance of transform domain distributed video coding. This algorithm...

  15. Development of a sampling strategy and sample size calculation to estimate the distribution of mammographic breast density in Korean women.

    Science.gov (United States)

    Jun, Jae Kwan; Kim, Mi Jin; Choi, Kui Son; Suh, Mina; Jung, Kyu-Won

    2012-01-01

    Mammographic breast density is a known risk factor for breast cancer. To conduct a survey to estimate the distribution of mammographic breast density in Korean women, appropriate sampling strategies for representative and efficient sampling design were evaluated through simulation. Using the target population from the National Cancer Screening Programme (NCSP) for breast cancer in 2009, we verified the distribution estimate by repeating the simulation 1,000 times using stratified random sampling to investigate the distribution of breast density of 1,340,362 women. According to the simulation results, using a sampling design stratifying the nation into three groups (metropolitan, urban, and rural), with a total sample size of 4,000, we estimated the distribution of breast density in Korean women at a level of 0.01% tolerance. Based on the results of our study, a nationwide survey for estimating the distribution of mammographic breast density among Korean women can be conducted efficiently.

  16. Vessel Sampling and Blood Flow Velocity Distribution With Vessel Diameter for Characterizing the Human Bulbar Conjunctival Microvasculature.

    Science.gov (United States)

    Wang, Liang; Yuan, Jin; Jiang, Hong; Yan, Wentao; Cintrón-Colón, Hector R; Perez, Victor L; DeBuc, Delia C; Feuer, William J; Wang, Jianhua

    2016-03-01

    This study determined (1) how many vessels (i.e., the vessel sampling) are needed to reliably characterize the bulbar conjunctival microvasculature and (2) if characteristic information can be obtained from the distribution histogram of the blood flow velocity and vessel diameter. A functional slit-lamp biomicroscope was used to image hundreds of venules per subject. The bulbar conjunctiva in five healthy human subjects was imaged at six different locations in the temporal bulbar conjunctiva. The histograms of the diameter and velocity were plotted to examine whether the distribution was normal. Standard errors were calculated from the standard deviation and vessel sample size. The ratio of the standard error of the mean over the population mean was used to determine the sample size cutoff. The velocity was plotted as a function of the vessel diameter to display the distribution of the diameter and velocity. The results showed that the sampling size was approximately 15 vessels, which generated a standard error equivalent to 15% of the population mean from the total vessel population. The distributions of the diameter and velocity were not only unimodal, but also somewhat positively skewed and not normal. The blood flow velocity was related to the vessel diameter (r=0.23). The study established the sampling size of the vessels and the distribution histogram of the blood flow velocity and vessel diameter, which may lead to a better understanding of the human microvascular system of the bulbar conjunctiva.
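
    The sample-size rule described in the abstract (the smallest number of vessels for which the standard error of the mean is at most a given fraction of the population mean) can be written down directly; the velocity population below is simulated and purely illustrative.

```python
import numpy as np

def vessels_needed(velocities, ratio=0.15):
    """Smallest n with SE(mean) = sd / sqrt(n) <= ratio * population mean."""
    sd, mean = np.std(velocities, ddof=1), np.mean(velocities)
    return int(np.ceil((sd / (ratio * mean)) ** 2))

rng = np.random.default_rng(3)
velocities = rng.lognormal(mean=-0.7, sigma=0.5, size=500)   # illustrative velocities, mm/s
print(vessels_needed(velocities))    # on the order of 10-15 vessels for this population
```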

  17. Scalable Distributed Architectures for Information Retrieval

    National Research Council Canada - National Science Library

    Lu, Zhihong

    1999-01-01

    .... Our distributed architectures exploit parallelism in information retrieval on a cluster of parallel IR servers using symmetric multiprocessors, and use partial collection replication and selection...

  18. Using Group Projects to Assess the Learning of Sampling Distributions

    Science.gov (United States)

    Neidigh, Robert O.; Dunkelberger, Jake

    2012-01-01

    In an introductory business statistics course, student groups used sample data to compare a set of sample means to the theoretical sampling distribution. Each group was given a production measurement with a population mean and standard deviation. The groups were also provided with an Excel spreadsheet with 40 sample measurements per week for 52 weeks…
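
    A short sketch of the comparison the student groups perform: weekly sample means set against the theoretical sampling distribution N(mu, sigma/sqrt(n)) implied by the central limit theorem. The production figures are made up for illustration.

```python
import numpy as np

mu, sigma, n_per_week, weeks = 100.0, 8.0, 40, 52      # illustrative process values

rng = np.random.default_rng(4)
weekly_means = rng.normal(mu, sigma, size=(weeks, n_per_week)).mean(axis=1)

theoretical_se = sigma / np.sqrt(n_per_week)
print(f"theoretical sampling distribution: mean {mu:.1f}, SE {theoretical_se:.2f}")
print(f"observed weekly sample means:      mean {weekly_means.mean():.1f}, "
      f"SD {weekly_means.std(ddof=1):.2f}")
```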

  19. Simple method for highlighting the temperature distribution into a liquid sample heated by microwave power field

    International Nuclear Information System (INIS)

    Surducan, V.; Surducan, E.; Dadarlat, D.

    2013-01-01

    Microwave-induced heating is widely used in medical treatments and in scientific and industrial applications. The temperature field inside a microwave-heated sample is often inhomogeneous, so multiple temperature sensors are required for an accurate result. Nowadays, non-contact methods (infrared thermography or microwave radiometry) or direct-contact temperature measurements (expensive and sophisticated fiber-optic temperature sensors transparent to microwave radiation) are mainly used. IR thermography gives only the surface temperature and cannot be used for measuring temperature distributions in cross sections of a sample. In this paper we present a very simple experimental method for highlighting the temperature distribution inside a cross section of a liquid sample heated by microwave radiation through a coaxial applicator. The proposed method offers qualitative information about the heating distribution, using a temperature-sensitive liquid crystal sheet. Inhomogeneities as small as 1°-2°C produced by symmetry irregularities of the microwave applicator can easily be detected by visual inspection or by computer-assisted color-to-temperature conversion. The microwave applicator is therefore tuned and verified with the described method until the temperature inhomogeneities are resolved.

  20. Distributed morality in an information society.

    Science.gov (United States)

    Floridi, Luciano

    2013-09-01

    The phenomenon of distributed knowledge is well-known in epistemic logic. In this paper, a similar phenomenon in ethics, somewhat neglected so far, is investigated, namely distributed morality. The article explains the nature of distributed morality, as a feature of moral agency, and explores the implications of its occurrence in advanced information societies. In the course of the analysis, the concept of infraethics is introduced, in order to refer to the ensemble of moral enablers, which, although morally neutral per se, can significantly facilitate or hinder both positive and negative moral behaviours.

  1. A methodology for more efficient tail area sampling with discrete probability distribution

    International Nuclear Information System (INIS)

    Park, Sang Ryeol; Lee, Byung Ho; Kim, Tae Woon

    1988-01-01

    The Monte Carlo method is commonly used to observe the overall distribution and to determine lower or upper bound values in a statistical approach when direct analytical calculation is unavailable. However, this method is not efficient when the tail area of a distribution is of concern. A new method entitled 'Two Step Tail Area Sampling' is developed, which uses the assumption of a discrete probability distribution and samples only the tail area without distorting the overall distribution. The method uses a two-step sampling procedure: first, sampling at points separated by large intervals is done; second, sampling at points separated by small intervals is done with check points determined in the first step. Comparison with the Monte Carlo method shows that the results obtained from the new method converge to the analytic value faster than the Monte Carlo method if the number of calculations is the same for both methods. The new method is applied to the DNBR (Departure from Nucleate Boiling Ratio) prediction problem in the design of a pressurized light water nuclear reactor

  2. The redshift distribution of cosmological samples: a forward modeling approach

    Energy Technology Data Exchange (ETDEWEB)

    Herbel, Jörg; Kacprzak, Tomasz; Amara, Adam; Refregier, Alexandre; Bruderer, Claudio; Nicola, Andrina, E-mail: joerg.herbel@phys.ethz.ch, E-mail: tomasz.kacprzak@phys.ethz.ch, E-mail: adam.amara@phys.ethz.ch, E-mail: alexandre.refregier@phys.ethz.ch, E-mail: claudio.bruderer@phys.ethz.ch, E-mail: andrina.nicola@phys.ethz.ch [Institute for Astronomy, Department of Physics, ETH Zürich, Wolfgang-Pauli-Strasse 27, 8093 Zürich (Switzerland)

    2017-08-01

    Determining the redshift distribution n(z) of galaxy samples is essential for several cosmological probes including weak lensing. For imaging surveys, this is usually done using photometric redshifts estimated on an object-by-object basis. We present a new approach for directly measuring the global n(z) of cosmological galaxy samples, including uncertainties, using forward modeling. Our method relies on image simulations produced using UFig (Ultra Fast Image Generator) and on ABC (Approximate Bayesian Computation) within the MCCL (Monte-Carlo Control Loops) framework. The galaxy population is modeled using parametric forms for the luminosity functions, spectral energy distributions, sizes and radial profiles of both blue and red galaxies. We apply exactly the same analysis to the real data and to the simulated images, which also include instrumental and observational effects. By adjusting the parameters of the simulations, we derive a set of acceptable models that are statistically consistent with the data. We then apply the same cuts to the simulations that were used to construct the target galaxy sample in the real data. The redshifts of the galaxies in the resulting simulated samples yield a set of n(z) distributions for the acceptable models. We demonstrate the method by determining n(z) for a cosmic shear like galaxy sample from the 4-band Subaru Suprime-Cam data in the COSMOS field. We also complement this imaging data with a spectroscopic calibration sample from the VVDS survey. We compare our resulting posterior n(z) distributions to the one derived from photometric redshifts estimated using 36 photometric bands in COSMOS and find good agreement. This offers good prospects for applying our approach to current and future large imaging surveys.

  3. The redshift distribution of cosmological samples: a forward modeling approach

    Science.gov (United States)

    Herbel, Jörg; Kacprzak, Tomasz; Amara, Adam; Refregier, Alexandre; Bruderer, Claudio; Nicola, Andrina

    2017-08-01

    Determining the redshift distribution n(z) of galaxy samples is essential for several cosmological probes including weak lensing. For imaging surveys, this is usually done using photometric redshifts estimated on an object-by-object basis. We present a new approach for directly measuring the global n(z) of cosmological galaxy samples, including uncertainties, using forward modeling. Our method relies on image simulations produced using UFig (Ultra Fast Image Generator) and on ABC (Approximate Bayesian Computation) within the MCCL (Monte-Carlo Control Loops) framework. The galaxy population is modeled using parametric forms for the luminosity functions, spectral energy distributions, sizes and radial profiles of both blue and red galaxies. We apply exactly the same analysis to the real data and to the simulated images, which also include instrumental and observational effects. By adjusting the parameters of the simulations, we derive a set of acceptable models that are statistically consistent with the data. We then apply the same cuts to the simulations that were used to construct the target galaxy sample in the real data. The redshifts of the galaxies in the resulting simulated samples yield a set of n(z) distributions for the acceptable models. We demonstrate the method by determining n(z) for a cosmic shear like galaxy sample from the 4-band Subaru Suprime-Cam data in the COSMOS field. We also complement this imaging data with a spectroscopic calibration sample from the VVDS survey. We compare our resulting posterior n(z) distributions to the one derived from photometric redshifts estimated using 36 photometric bands in COSMOS and find good agreement. This offers good prospects for applying our approach to current and future large imaging surveys.

  4. The redshift distribution of cosmological samples: a forward modeling approach

    International Nuclear Information System (INIS)

    Herbel, Jörg; Kacprzak, Tomasz; Amara, Adam; Refregier, Alexandre; Bruderer, Claudio; Nicola, Andrina

    2017-01-01

    Determining the redshift distribution n(z) of galaxy samples is essential for several cosmological probes including weak lensing. For imaging surveys, this is usually done using photometric redshifts estimated on an object-by-object basis. We present a new approach for directly measuring the global n(z) of cosmological galaxy samples, including uncertainties, using forward modeling. Our method relies on image simulations produced using UFig (Ultra Fast Image Generator) and on ABC (Approximate Bayesian Computation) within the MCCL (Monte-Carlo Control Loops) framework. The galaxy population is modeled using parametric forms for the luminosity functions, spectral energy distributions, sizes and radial profiles of both blue and red galaxies. We apply exactly the same analysis to the real data and to the simulated images, which also include instrumental and observational effects. By adjusting the parameters of the simulations, we derive a set of acceptable models that are statistically consistent with the data. We then apply the same cuts to the simulations that were used to construct the target galaxy sample in the real data. The redshifts of the galaxies in the resulting simulated samples yield a set of n(z) distributions for the acceptable models. We demonstrate the method by determining n(z) for a cosmic shear like galaxy sample from the 4-band Subaru Suprime-Cam data in the COSMOS field. We also complement this imaging data with a spectroscopic calibration sample from the VVDS survey. We compare our resulting posterior n(z) distributions to the one derived from photometric redshifts estimated using 36 photometric bands in COSMOS and find good agreement. This offers good prospects for applying our approach to current and future large imaging surveys.
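
    A deliberately small sketch of the ABC-rejection idea behind the forward-modeling approach: draw a population parameter from its prior, forward-simulate a selected galaxy sample, keep parameter values whose summary statistics match the observed ones, and read off n(z) from the accepted simulations. The toy simulator, prior, summary statistics and tolerance below are all assumptions standing in for the UFig image simulations and the survey-specific cuts.

```python
import numpy as np

rng = np.random.default_rng(5)

def simulate_survey(z0, n_gal=2000):
    """Toy forward model: gamma-shaped redshift distribution plus a magnitude cut."""
    z = rng.gamma(shape=3.0, scale=z0, size=n_gal)
    mag = 22.0 + 2.0 * z + rng.normal(0.0, 0.8, size=n_gal)
    return z[mag < 24.5]                                   # hypothetical selection cut

observed = simulate_survey(z0=0.30)                        # fiducial model stands in for data
obs_summary = np.percentile(observed, [25, 50, 75])

accepted = []
for _ in range(5000):
    z0 = rng.uniform(0.1, 0.6)                             # prior on the population parameter
    sim = simulate_survey(z0)
    if sim.size and np.max(np.abs(np.percentile(sim, [25, 50, 75]) - obs_summary)) < 0.05:
        accepted.append(np.histogram(sim, bins=20, range=(0, 2), density=True)[0])

print(len(accepted), "acceptable models")
if accepted:
    print("posterior mean n(z), first five bins:", np.mean(accepted, axis=0)[:5].round(3))
```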

  5. Spatial distribution sampling and Monte Carlo simulation of radioactive isotopes

    CERN Document Server

    Krainer, Alexander Michael

    2015-01-01

    This work focuses on the implementation of a program for random sampling of uniformly spatially distributed isotopes for Monte Carlo particle simulations, specifically FLUKA. With FLUKA it is possible to calculate the radionuclide production in high-energy fields. The decay of these nuclides, and therefore the resulting radiation field, however, can only be simulated in the same geometry. This work provides the tool to simulate the decay of the produced nuclides in other geometries. With that, the radiation field from an irradiated object can be simulated in arbitrary environments. The sampling of isotope mixtures was tested by simulating a 50/50 mixture of $Cs^{137}$ and $Co^{60}$. These isotopes are both well known and therefore provide a first reliable benchmark in that respect. The sampling of uniformly distributed coordinates was tested using the histogram test for various spatial distributions. The advantages and disadvantages of the program compared to standard methods are demonstrated in the real life ca...
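
    The two sampling steps the abstract describes can be sketched outside of FLUKA: choose an isotope according to the mixture fractions and draw a uniformly distributed position inside the source volume (here a simple box). The FLUKA user-routine interface and the histogram test are not reproduced; the volume and mixture below are illustrative.

```python
import random

ISOTOPES = [("Cs-137", 0.5), ("Co-60", 0.5)]           # 50/50 test mixture
BOX = ((-5.0, 5.0), (-5.0, 5.0), (0.0, 10.0))          # assumed source volume, cm

def sample_decay_source():
    """One source particle: isotope chosen by mixture fraction,
    position uniform inside the box."""
    u, acc = random.random(), 0.0
    for name, fraction in ISOTOPES:
        acc += fraction
        if u <= acc:
            break
    position = tuple(random.uniform(lo, hi) for lo, hi in BOX)
    return name, position

print(sample_decay_source())
```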

  6. A distributed name resolution system in information centric networks

    Science.gov (United States)

    Elbreiki, Walid; Arlimatti, Shivaleela; Hassan, Suhaidi; Habbal, Adib; Elshaikh, Mohamed

    2016-08-01

    Information Centric Networks (ICN) is a new paradigm that envisages shifting the Internet away from its existing point-to-point architecture to a data-centric one, where communication is based on named information rather than on the hosts that store it. Name resolution is the center of attraction for ICN, where Named Data Objects (NDOs) are used for identifying the information and guiding routing or forwarding inside ICN. Recently, several studies have used a distributed NRS to overcome the problems of interest flooding, congestion and overloading. Yet the NRSs have been placed randomly, and how to distribute the NRS is still an important and challenging problem. In this work, we address the problem of NRS distribution by proposing a new mechanism called the Distributed Name Resolution System (DNRS), which considers the time at which NDOs are published in the NRS. This mechanism partitions the network to distribute the workload among NRSs and increases storage capacity. In addition, partitioning the network increases the flexibility and scalability of the NRS. We evaluate the effectiveness of the proposed mechanism, which achieves lower end-to-end delay and higher average throughput compared to random distribution of the NRS, without disturbing the underlying routing or forwarding strategies.

  7. Acceptance sampling for attributes via hypothesis testing and the hypergeometric distribution

    Science.gov (United States)

    Samohyl, Robert Wayne

    2017-10-01

    This paper questions some aspects of attribute acceptance sampling in light of the original concepts of hypothesis testing from Neyman and Pearson (NP). Attribute acceptance sampling in industry, as developed by Dodge and Romig (DR), generally follows the international standards of ISO 2859, and similarly the Brazilian standards NBR 5425 to NBR 5427 and the United States standards ANSI/ASQC Z1.4. The paper evaluates and extends the area of acceptance sampling in two directions. First, by suggesting the use of the hypergeometric distribution to calculate the parameters of sampling plans, avoiding the unnecessary use of approximations such as the binomial or Poisson distributions. We show that, under usual conditions, discrepancies can be large. The conclusion is that the hypergeometric distribution, ubiquitously available in commonly used software, is more appropriate than other distributions for acceptance sampling. Second, and more importantly, we elaborate the theory of acceptance sampling in terms of hypothesis testing, rigorously following the original concepts of NP. By offering a common theoretical structure, hypothesis testing from NP can produce a better understanding of applications even beyond the usual areas of industry and commerce, such as public health and political polling. With the new procedures, both sample size and sample error can be reduced. What is unclear in traditional acceptance sampling is the necessity of linking the acceptable quality limit (AQL) exclusively to the producer and the lot tolerance percent defective (LTPD) exclusively to the consumer. In reality, the consumer should also be preoccupied with a value of AQL, as should the producer with LTPD. Furthermore, we can also question why type I error is always uniquely associated with the producer as producer risk, and likewise, the same question arises with consumer risk, which is necessarily associated with type II error. The resolution of these questions is new to the literature. The
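
    The exact calculation the paper advocates, using the hypergeometric distribution rather than the binomial or Poisson approximation, is straightforward for a single-sampling attribute plan; the lot size, sample size and acceptance number below are illustrative.

```python
from scipy.stats import binom, hypergeom

N, n, c = 1000, 80, 2        # lot size, sample size, acceptance number (illustrative)

def p_accept_exact(defectives_in_lot):
    """P(accept lot) = P(at most c defectives in the sample), hypergeometric."""
    return hypergeom.cdf(c, N, defectives_in_lot, n)

for d in (10, 25, 50):       # lots that are 1%, 2.5% and 5% defective
    approx = binom.cdf(c, n, d / N)
    print(f"{d / N:5.1%} defective: exact {p_accept_exact(d):.4f}, "
          f"binomial approximation {approx:.4f}")
```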

  8. Information Technologies of the Distributed Applications Design

    Directory of Open Access Journals (Sweden)

    Safwan Al SALAIMEH

    2007-01-01

    Full Text Available Questions of distributed systems development based on Java RMI, EJB and J2EE technologies and tools are considered. A comparative analysis is presented which determines the domains in which each of the considered information technologies is best applied, with respect to the requirements of concrete distributed applications.

  9. Sample Size in Qualitative Interview Studies: Guided by Information Power.

    Science.gov (United States)

    Malterud, Kirsti; Siersma, Volkert Dirk; Guassora, Ann Dorrit

    2015-11-27

    Sample sizes must be ascertained in qualitative studies, as in quantitative studies, but not by the same means. The prevailing concept for sample size in qualitative studies is "saturation." Saturation is closely tied to a specific methodology, and the term is inconsistently applied. We propose the concept "information power" to guide adequate sample size for qualitative studies. Information power indicates that the more information the sample holds, relevant for the actual study, the fewer participants are needed. We suggest that the size of a sample with sufficient information power depends on (a) the aim of the study, (b) sample specificity, (c) use of established theory, (d) quality of dialogue, and (e) analysis strategy. We present a model where these elements of information and their relevant dimensions are related to information power. Application of this model in the planning and during data collection of a qualitative study is discussed. © The Author(s) 2015.

  10. Prototyping a Distributed Information Retrieval System That Uses Statistical Ranking.

    Science.gov (United States)

    Harman, Donna; And Others

    1991-01-01

    Built using a distributed architecture, this prototype distributed information retrieval system uses statistical ranking techniques to provide better service to the end user. Distributed architecture was shown to be a feasible alternative to centralized or CD-ROM information retrieval, and user testing of the ranking methodology showed both…

  11. Test of methods for retrospective activity size distribution determination from filter samples

    International Nuclear Information System (INIS)

    Meisenberg, Oliver; Tschiersch, Jochen

    2015-01-01

    Determining the activity size distribution of radioactive aerosol particles requires sophisticated and heavy equipment, which makes measurements at a large number of sites difficult and expensive. Therefore three methods for a retrospective determination of size distributions from aerosol filter samples in the laboratory were tested for their applicability. Extraction into a carrier liquid with subsequent nebulisation showed size distributions with a slight but correctable bias towards larger diameters compared with the original size distribution. Yields on the order of 1% could be achieved. Sonication-assisted extraction into a carrier liquid caused a coagulation mode to appear in the size distribution. Sonication-assisted extraction into the air did not show acceptable results due to small yields. The method of extraction into a carrier liquid without sonication was applied to aerosol samples from Chernobyl in order to calculate inhalation dose coefficients for 137 Cs based on the individual size distribution. The effective dose coefficient is about half of that calculated with a default reference size distribution. - Highlights: • Activity size distributions can be recovered after aerosol sampling on filters. • Extraction into a carrier liquid and subsequent nebulisation is appropriate. • This facilitates the determination of activity size distributions for individuals. • Size distributions from this method can be used for individual dose coefficients. • Dose coefficients were calculated for the workers at the new Chernobyl shelter

  12. Empirical Sampling Distributions of Equating Coefficients for Graded and Nominal Response Instruments.

    Science.gov (United States)

    Baker, Frank B.

    1997-01-01

    Examined the sampling distributions of equating coefficients produced by the characteristic curve method for tests using graded and nominal response scoring using simulated data. For both models and across all three equating situations, the sampling distributions were generally bell-shaped and peaked, and occasionally had a small degree of…

  13. [Effects of sampling plot number on tree species distribution prediction under climate change].

    Science.gov (United States)

    Liang, Yu; He, Hong-Shi; Wu, Zhi-Wei; Li, Xiao-Na; Luo, Xu

    2013-05-01

    Based on the neutral landscapes under different degrees of landscape fragmentation, this paper studied the effects of sampling plot number on the prediction of tree species distribution at landscape scale under climate change. The tree species distribution was predicted by the coupled modeling approach which linked an ecosystem process model with a forest landscape model, and three contingent scenarios and one reference scenario of sampling plot numbers were assumed. The differences between the three scenarios and the reference scenario under different degrees of landscape fragmentation were tested. The results indicated that the effects of sampling plot number on the prediction of tree species distribution depended on the tree species life history attributes. For the generalist species, the prediction of their distribution at landscape scale needed more plots. Except for the extreme specialist, landscape fragmentation degree also affected the effects of sampling plot number on the prediction. With the increase of simulation period, the effects of sampling plot number on the prediction of tree species distribution at landscape scale could be changed. For generalist species, more plots are needed for the long-term simulation.

  14. The asymmetric distribution of informative face information during gender recognition.

    Science.gov (United States)

    Hu, Fengpei; Hu, Huan; Xu, Lian; Qin, Jungang

    2013-02-01

    Recognition of the gender of a face is important in social interactions. In the current study, the distribution of informative face information during gender judgment was systematically examined using two methods, the Bubbles and Focus-window techniques. Two experiments found that the most informative information was around the eyes, followed by the mouth and nose. Other parts of the face contributed to gender recognition but were less important. The left side of the face was used more during gender recognition in both experiments. These results show that mainly the areas around the eyes are used for gender judgment and demonstrate perceptual asymmetry for a normal (non-chimeric) face.

  15. A random sampling procedure for anisotropic distributions

    International Nuclear Information System (INIS)

    Nagrajan, P.S.; Sethulakshmi, P.; Raghavendran, C.P.; Bhatia, D.P.

    1975-01-01

    A procedure is described for sampling the scattering angle of neutrons as per specified angular distribution data. The cosine of the scattering angle is written as a double Legendre expansion in the incident neutron energy and a random number. The coefficients of the expansion are given for C, N, O, Si, Ca, Fe and Pb and these elements are of interest in dosimetry and shielding. (author)

  16. PREFERENCE OF PRIOR FOR BAYESIAN ANALYSIS OF THE MIXED BURR TYPE X DISTRIBUTION UNDER TYPE I CENSORED SAMPLES

    Directory of Open Access Journals (Sweden)

    Tabassum Naz Sindhu

    2014-05-01

    Full Text Available The paper is concerned with the preference of prior for the Bayesian analysis of the shape parameter of the mixture of Burr type X distribution using the censored data. We modeled the heterogeneous population using two components mixture of the Burr type X distribution. A comprehensive simulation scheme, through probabilistic mixing, has been followed to highlight the properties and behavior of the estimates in terms of sample size, corresponding risks and the proportion of the component of the mixture. The Bayes estimators of the parameters have been evaluated under the assumption of informative and non-informative priors using symmetric and asymmetric loss functions. The model selection criterion for the preference of the prior has been introduced. The hazard rate function of the mixture distribution has been discussed. The Bayes estimates under exponential prior and precautionary loss function exhibit the minimum posterior risks with some exceptions.

  17. Distributed video coding with multiple side information

    DEFF Research Database (Denmark)

    Huang, Xin; Brites, C.; Ascenso, J.

    2009-01-01

    Distributed Video Coding (DVC) is a new video coding paradigm which mainly exploits the source statistics at the decoder based on the availability of some decoder side information. The quality of the side information has a major impact on the DVC rate-distortion (RD) performance in the same way the quality of the predictions had a major impact in predictive video coding. In this paper, a DVC solution exploiting multiple side information is proposed; the multiple side information is generated by frame interpolation and frame extrapolation, aiming to improve the side information of a single...

  18. Molecular dynamics equation designed for realizing arbitrary density: Application to sampling method utilizing the Tsallis generalized distribution

    International Nuclear Information System (INIS)

    Fukuda, Ikuo; Nakamura, Haruki

    2010-01-01

    Several molecular dynamics techniques applying the Tsallis generalized distribution are presented. We have developed a deterministic dynamics to generate an arbitrary smooth density function ρ. It creates a measure-preserving flow with respect to the measure ρdω and realizes the density ρ under the assumption of ergodicity. It can thus be used to investigate physical systems that obey such a distribution density. Using this technique, the Tsallis distribution density based on a full energy function form along with the Tsallis index q ≥ 1 can be created. Because the effective support of the Tsallis distribution in phase space is broad compared with that of the conventional Boltzmann-Gibbs (BG) distribution, and because the corresponding energy-surface deformation does not change energy minimum points, the dynamics enhances physical state sampling, in particular for a rugged energy surface spanned by a complicated system. Another feature of the Tsallis distribution is that it provides a greater degree of nonlinearity than the BG distribution in the deterministic dynamics equation, which is very useful for effectively attaining ergodicity of the dynamical system constructed according to the scheme. Combining such methods with the reconstruction technique of the BG distribution, we can obtain information consistent with the BG ensemble and create the corresponding free energy surface. We demonstrate several sampling results obtained from systems typical for benchmark tests in MD and from biomolecular systems.

  19. Investigating the influence of standard staining procedures on the copper distribution and concentration in Wilson's disease liver samples by laser ablation-inductively coupled plasma-mass spectrometry.

    Science.gov (United States)

    Hachmöller, Oliver; Aichler, Michaela; Schwamborn, Kristina; Lutz, Lisa; Werner, Martin; Sperling, Michael; Walch, Axel; Karst, Uwe

    2017-12-01

    The influence of rhodanine and haematoxylin and eosin (HE) staining on the copper distribution and concentration in liver needle biopsy samples originating from patients with Wilson's disease (WD), a rare autosomal recessive inherited disorder of copper metabolism, is investigated. In contemporary diagnostics of WD, rhodanine staining is used for histopathology, since rhodanine and copper form a red to orange-red complex, which can be recognized in the liver tissue using a microscope. In this paper, a laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) method is applied for the analysis of eight different WD liver samples. Apart from spatially resolved elemental detection as qualitative information, this LA-ICP-MS method also offers quantitative information by external calibration with matrix-matched gelatine standards. The sample set of this work included an unstained and a rhodanine-stained section of each WD liver sample. While unstained sections of WD liver samples showed very distinct structures of the copper distribution with high copper concentrations, rhodanine-stained sections revealed a blurred copper distribution with significantly decreased concentrations, in a range from 20 to more than 90%. This implies copper removal from the liver tissue by complexation during the rhodanine staining. In contrast, a further HE-stained section of one WD liver sample did not show a significant decrease in the copper concentration or influence on the copper distribution in comparison to the unstained section. Therefore, HE staining can be combined with analysis by means of LA-ICP-MS in two successive steps from one thin section of a biopsy specimen. This allows further information on the elemental distribution to be gained by LA-ICP-MS in addition to the results obtained by histological staining. Copyright © 2017 Elsevier GmbH. All rights reserved.

  20. Illustrating Sampling Distribution of a Statistic: Minitab Revisited

    Science.gov (United States)

    Johnson, H. Dean; Evans, Marc A.

    2008-01-01

    Understanding the concept of the sampling distribution of a statistic is essential for the understanding of inferential procedures. Unfortunately, this topic proves to be a stumbling block for students in introductory statistics classes. In efforts to aid students in their understanding of this concept, alternatives to a lecture-based mode of…

  1. Simulation of the Sampling Distribution of the Mean Can Mislead

    Science.gov (United States)

    Watkins, Ann E.; Bargagliotti, Anna; Franklin, Christine

    2014-01-01

    Although the use of simulation to teach the sampling distribution of the mean is meant to provide students with sound conceptual understanding, it may lead them astray. We discuss a misunderstanding that can be introduced or reinforced when students who intuitively understand that "bigger samples are better" conduct a simulation to…

  2. Encryption of covert information into multiple statistical distributions

    International Nuclear Information System (INIS)

    Venkatesan, R.C.

    2007-01-01

    A novel strategy to encrypt covert information (code) via unitary projections into the null spaces of ill-conditioned eigenstructures of multiple host statistical distributions, inferred from incomplete constraints, is presented. The host pdf's are inferred using the maximum entropy principle. The projection of the covert information is dependent upon the pdf's of the host statistical distributions. The security of the encryption/decryption strategy is based on the extreme instability of the encoding process. A self-consistent procedure to derive keys for both symmetric and asymmetric cryptography is presented. The advantages of using a multiple pdf model to achieve encryption of covert information are briefly highlighted. Numerical simulations exemplify the efficacy of the model

  3. Information system for personnel work distribution in Kaunas Maironis gymnasium

    OpenAIRE

    Ivanauskaitė, Eglė

    2005-01-01

    This is a master's thesis in information technology. It investigates the task of distributing teachers' workload according to schoolchildren's individual study plans. The process of teachers' work distribution and the teachers' information needs are analyzed. The information system was designed in the MS Visio environment and implemented with MS SQL Server and VBA.

  4. Connecting Research to Teaching: Using Data to Motivate the Use of Empirical Sampling Distributions

    Science.gov (United States)

    Lee, Hollylynne S.; Starling, Tina T.; Gonzalez, Marggie D.

    2014-01-01

    Research shows that students often struggle with understanding empirical sampling distributions. Using hands-on and technology-based models and simulations of problems generated by real data helps students begin to make connections between repeated sampling, sample size, distribution, variation, and center. A task to assist teachers in implementing…

  5. Calculation of absolute protein-ligand binding free energy using distributed replica sampling.

    Science.gov (United States)

    Rodinger, Tomas; Howell, P Lynne; Pomès, Régis

    2008-10-21

    Distributed replica sampling [T. Rodinger et al., J. Chem. Theory Comput. 2, 725 (2006)] is a simple and general scheme for Boltzmann sampling of conformational space by computer simulation in which multiple replicas of the system undergo a random walk in reaction coordinate or temperature space. Individual replicas are linked through a generalized Hamiltonian containing an extra potential energy term or bias which depends on the distribution of all replicas, thus enforcing the desired sampling distribution along the coordinate or parameter of interest regardless of free energy barriers. In contrast to replica exchange methods, efficient implementation of the algorithm does not require synchronicity of the individual simulations. The algorithm is inherently suited for large-scale simulations using shared or heterogeneous computing platforms such as a distributed network. In this work, we build on our original algorithm by introducing Boltzmann-weighted jumping, which allows moves of a larger magnitude and thus enhances sampling efficiency along the reaction coordinate. The approach is demonstrated using a realistic and biologically relevant application; we calculate the standard binding free energy of benzene to the L99A mutant of T4 lysozyme. Distributed replica sampling is used in conjunction with thermodynamic integration to compute the potential of mean force for extracting the ligand from protein and solvent along a nonphysical spatial coordinate. Dynamic treatment of the reaction coordinate leads to faster statistical convergence of the potential of mean force than a conventional static coordinate, which suffers from slow transitions on a rugged potential energy surface.

  6. Current distribution between petals in PF-FSJS sample

    International Nuclear Information System (INIS)

    Zani, L.

    2003-01-01

    Six Rogowski coils have been installed on each leg of each of the 12 petals in the PF-FSJS sample (poloidal field - full-size joint sample) in order to diagnose the current distribution. The Rogowski signals appear reliable for current distribution analysis (Ampere's law is checked and reproducibility is assured), but there are some limitations for qualitative diagnostics. In this series of transparencies, results are detailed for the PU1 position, for both left and right legs, and for various unique-angle shift (Δθ) configurations, but only results for Δθ between 0 and -5 are consistent.

  7. Finding a minimally informative Dirichlet prior distribution using least squares

    International Nuclear Information System (INIS)

    Kelly, Dana; Atwood, Corwin

    2011-01-01

    In a Bayesian framework, the Dirichlet distribution is the conjugate distribution to the multinomial likelihood function, and so the analyst is required to develop a Dirichlet prior that incorporates available information. However, as it is a multiparameter distribution, choosing the Dirichlet parameters is less straightforward than choosing a prior distribution for a single parameter, such as p in the binomial distribution. In particular, one may wish to incorporate limited information into the prior, resulting in a minimally informative prior distribution that is responsive to updates with sparse data. In the case of binomial p or Poisson λ, the principle of maximum entropy can be employed to obtain a so-called constrained noninformative prior. However, even in the case of p, such a distribution cannot be written down in the form of a standard distribution (e.g., beta, gamma), and so a beta distribution is used as an approximation in the case of p. In the case of the multinomial model with parametric constraints, the approach of maximum entropy does not appear tractable. This paper presents an alternative approach, based on constrained minimization of a least-squares objective function, which leads to a minimally informative Dirichlet prior distribution. The alpha-factor model for common-cause failure, which is widely used in the United States, is the motivation for this approach, and is used to illustrate the method. In this approach to modeling common-cause failure, the alpha-factors, which are the parameters in the underlying multinomial model for common-cause failure, must be estimated from data that are often quite sparse, because common-cause failures tend to be rare, especially failures of more than two or three components, and so a prior distribution that is responsive to updates with sparse data is needed.
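
    The abstract does not reproduce the least-squares objective, so the following is only a schematic illustration of the general idea, not the authors' method: hold the Dirichlet marginal means fixed at nominal alpha-factor values and choose the concentration parameter by least squares against analyst-supplied target marginal variances. All numerical values are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical nominal alpha-factor means for a 3-component common-cause model
m = np.array([0.95, 0.04, 0.01])
# Hypothetical target marginal variances expressing "minimal informativeness"
v_target = np.array([0.020, 0.008, 0.002])

def objective(N):
    # Dirichlet(N*m) has marginal Beta(N*m_i, N*(1-m_i)) with mean m_i and
    # variance m_i*(1-m_i)/(N+1); only the concentration N is free here.
    var = m * (1.0 - m) / (N + 1.0)
    return np.sum((var - v_target) ** 2)

res = minimize_scalar(objective, bounds=(0.1, 100.0), method="bounded")
print("concentration N =", res.x)
print("Dirichlet parameters alpha =", res.x * m)
```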

  8. Comparing simulated and theoretical sampling distributions of the U3 person-fit statistic

    NARCIS (Netherlands)

    Emons, W.H.M.; Meijer, R.R.; Sijtsma, K.

    2002-01-01

    The accuracy with which the theoretical sampling distribution of van der Flier's person-fit statistic U3 approaches the empirical U3 sampling distribution is affected by the item discrimination. A simulation study showed that for tests with a moderate or a strong mean item discrimination, the Type I

  9. On distribution reduction and algorithm implementation in inconsistent ordered information systems.

    Science.gov (United States)

    Zhang, Yanqin

    2014-01-01

    As one part of our work on ordered information systems, distribution reduction is studied in inconsistent ordered information systems (OISs). Some important properties of distribution reduction are studied and discussed. The dominance matrix is restated for reduction acquisition in dominance-relation-based information systems. A matrix algorithm for distribution reduction acquisition is presented step by step, and a program is implemented based on the algorithm. The approach provides an effective tool for theoretical research on, and practical applications of, ordered information systems. For more detailed and valid illustration, cases are employed to explain and verify the algorithm and the program, which shows the effectiveness of the algorithm in complicated information systems.

  10. DIGI-vis: Distributed interactive geospatial information visualization

    KAUST Repository

    Ponto, Kevin

    2010-03-01

    Geospatial information systems provide an abundance of information for researchers and scientists. Unfortunately this type of data can usually only be analyzed a few megapixels at a time, giving researchers a very narrow view into these voluminous data sets. We propose a distributed data gathering and visualization system that allows researchers to view these data at hundreds of megapixels simultaneously. This system allows scientists to view real-time geospatial information at unprecedented levels expediting analysis, interrogation, and discovery. ©2010 IEEE.

  11. Efficient Monte Carlo Estimation of the Expected Value of Sample Information Using Moment Matching.

    Science.gov (United States)

    Heath, Anna; Manolopoulou, Ioanna; Baio, Gianluca

    2018-02-01

    The Expected Value of Sample Information (EVSI) is used to calculate the economic value of a new research strategy. Although this value would be important to both researchers and funders, there are very few practical applications of the EVSI. This is due to computational difficulties associated with calculating the EVSI in practical health economic models using nested simulations. We present an approximation method for the EVSI that is framed in a Bayesian setting and is based on estimating the distribution of the posterior mean of the incremental net benefit across all possible future samples, known as the distribution of the preposterior mean. Specifically, this distribution is estimated using moment matching coupled with simulations that are available for probabilistic sensitivity analysis, which is typically mandatory in health economic evaluations. This novel approximation method is applied to a health economic model that has previously been used to assess the performance of other EVSI estimators, and it accurately estimates the EVSI. The computational time for this method is competitive with other methods. We have developed a new calculation method for the EVSI which is computationally efficient and accurate. This novel method relies on some additional simulation and so can be expensive in models with a large computational cost.

  12. Comparing simulated and theoretical sampling distributions of the U3 person-fit statistic

    NARCIS (Netherlands)

    Emons, Wilco H.M.; Meijer, R.R.; Sijtsma, Klaas

    2002-01-01

    The accuracy with which the theoretical sampling distribution of van der Flier’s person-fit statistic U3 approaches the empirical U3 sampling distribution is affected by the item discrimination. A simulation study showed that for tests with a moderate or a strong mean item discrimination, the Type I

  13. Simulations of the Sampling Distribution of the Mean Do Not Necessarily Mislead and Can Facilitate Learning

    Science.gov (United States)

    Lane, David M.

    2015-01-01

    Recently Watkins, Bargagliotti, and Franklin (2014) discovered that simulations of the sampling distribution of the mean can mislead students into concluding that the mean of the sampling distribution of the mean depends on sample size. This potential error arises from the fact that the mean of a simulated sampling distribution will tend to be…

  14. Size Distributions and Characterization of Native and Ground Samples for Toxicology Studies

    Science.gov (United States)

    McKay, David S.; Cooper, Bonnie L.; Taylor, Larry A.

    2010-01-01

    This slide presentation shows charts and graphs that review the particle size distribution and characterization of native and ground samples for toxicology studies. There are graphs which show the volume distribution versus the number distribution for naturally occurring dust, jet mill ground dust, and ball mill ground dust.

  15. Determination of distribution pattern of the heavy metal concentrations in the potable network of Gachsaran by Geographical Information System (GIS

    Directory of Open Access Journals (Sweden)

    G Paraham

    2013-12-01

    Methods: In this descriptive, cross-sectional study, samples were taken from 11 spots of the drinking water distribution network and tested for the concentrations of 10 metals by the Inductively Coupled Plasma (ICP) method in the summer of 2010. The research data were compared with national and international water standards. A distribution map of heavy metal concentrations in the drinking water wells of the region was then prepared using Geographical Information System (GIS) software. Data were analyzed with the Kruskal-Wallis test. Results: In all samples, the average concentrations of heavy metals were: Arsenic 0.54, Cadmium 0.05, Zinc 55.9, Lead 0.18, Copper 0.82, Chromium 1.6, Barium 36.5, Selenium 0.5, Mercury 0.1, and Silver 0.05 micrograms per liter, all below the water quality standards. Conclusion: Based on the results obtained, it can be concluded that concentrations of heavy metals in Gachsaran’s drinking water distribution network are not higher than national and international standards and are therefore not harmful to people. Key words: Heavy metals, Distribution network, Gachsaran, Geographical Information System (GIS)

  16. Finding a Minimally Informative Dirichlet Prior Distribution Using Least Squares

    International Nuclear Information System (INIS)

    Kelly, Dana; Atwood, Corwin

    2011-01-01

    In a Bayesian framework, the Dirichlet distribution is the conjugate distribution to the multinomial likelihood function, and so the analyst is required to develop a Dirichlet prior that incorporates available information. However, as it is a multiparameter distribution, choosing the Dirichlet parameters is less straightforward than choosing a prior distribution for a single parameter, such as p in the binomial distribution. In particular, one may wish to incorporate limited information into the prior, resulting in a minimally informative prior distribution that is responsive to updates with sparse data. In the case of binomial p or Poisson λ, the principle of maximum entropy can be employed to obtain a so-called constrained noninformative prior. However, even in the case of p, such a distribution cannot be written down in closed form, and so an approximate beta distribution is used in the case of p. In the case of the multinomial model with parametric constraints, the approach of maximum entropy does not appear tractable. This paper presents an alternative approach, based on constrained minimization of a least-squares objective function, which leads to a minimally informative Dirichlet prior distribution. The alpha-factor model for common-cause failure, which is widely used in the United States, is the motivation for this approach, and is used to illustrate the method. In this approach to modeling common-cause failure, the alpha-factors, which are the parameters in the underlying multinomial aleatory model for common-cause failure, must be estimated from data that are often quite sparse, because common-cause failures tend to be rare, especially failures of more than two or three components, and so a prior distribution that is responsive to updates with sparse data is needed.

  17. Comparing Simulated and Theoretical Sampling Distributions of the U3 Person-Fit Statistic.

    Science.gov (United States)

    Emons, Wilco H. M.; Meijer, Rob R.; Sijtsma, Klaas

    2002-01-01

    Studied whether the theoretical sampling distribution of the U3 person-fit statistic is in agreement with the simulated sampling distribution under different item response theory models and varying item and test characteristics. Simulation results suggest that the use of standard normal deviates for the standardized version of the U3 statistic may…

  18. Intact information sampling in mesial temporal lobe epilepsy.

    Science.gov (United States)

    Zamarian, Laura; Trinka, Eugen; Kuchukhidze, Giorgi; Bodner, Thomas; Unterberger, Iris; Luef, Gerhard; Delazer, Margarete

    2015-11-01

    Previous studies have reported deficits in decision making under ambiguity for patients with mesial temporal lobe epilepsy (mTLE). It is unknown whether mTLE is also associated with alterations at a predecisional stage. This study aimed to gain insight into predecisional processing of patients with mTLE. We compared performance of patients with mTLE (n = 25) with that of healthy controls (n = 75) on the information sampling task (IST), a task assessing reflection-impulsivity and predecisional information sampling. Patients and healthy controls showed a similar performance pattern in both conditions of the IST as indicated by the amount of information gathered, the degree of uncertainty tolerated, and the number of decision errors made. They both also demonstrated a significant sensitivity to the different reward characteristics of the task. For the patient group, we found no significant effects on IST performance of epilepsy lateralization, abnormality side, structural abnormality (hippocampus vs. amygdala), or medication (monotherapy vs. polytherapy). Reflection processes and predecisional information sampling as tested by the IST are intact in mTLE. Patients collect as much information as healthy individuals and adapt their behavior according to the changing reward conditions. Our findings indicate that in well-defined risk situations, where memory demands are sufficiently minimized, patients with mTLE should be able to gather sufficient information, weigh risks and benefits, and make advantageous decisions. (c) 2015 APA, all rights reserved.

  19. Appendix B: Fisher, lynx, wolverine summary of distribution information

    Science.gov (United States)

    Mary Maj

    1994-01-01

    We present maps depicting distributions of fisher, lynx, and wolverine in the western United States since 1961. Comparison of past and current distributions of species can shed light on population persistence, periods of population isolation, meta-population structure, and important connecting landscapes. Information on the distribution of the American marten is not...

  20. Differentiating gold nanorod samples using particle size and shape distributions from transmission electron microscope images

    Science.gov (United States)

    Grulke, Eric A.; Wu, Xiaochun; Ji, Yinglu; Buhr, Egbert; Yamamoto, Kazuhiro; Song, Nam Woong; Stefaniak, Aleksandr B.; Schwegler-Berry, Diane; Burchett, Woodrow W.; Lambert, Joshua; Stromberg, Arnold J.

    2018-04-01

    Size and shape distributions of gold nanorod samples are critical to their physico-chemical properties, especially their longitudinal surface plasmon resonance. This interlaboratory comparison study developed methods for measuring and evaluating size and shape distributions for gold nanorod samples using transmission electron microscopy (TEM) images. The objective was to determine whether two different samples, which had different performance attributes in their application, were different with respect to their size and/or shape descriptor distributions. Touching particles in the captured images were identified using a ruggedness shape descriptor. Nanorods could be distinguished from nanocubes using an elongational shape descriptor. A non-parametric statistical test showed that cumulative distributions of an elongational shape descriptor, that is, the aspect ratio, were statistically different between the two samples for all laboratories. While the scale parameters of size and shape distributions were similar for both samples, the width parameters of size and shape distributions were statistically different. This protocol fulfills an important need for a standardized approach to measure gold nanorod size and shape distributions for applications in which quantitative measurements and comparisons are important. Furthermore, the validated protocol workflow can be automated, thus providing consistent and rapid measurements of nanorod size and shape distributions for researchers, regulatory agencies, and industry.

  1. Optimal updating magnitude in adaptive flat-distribution sampling.

    Science.gov (United States)

    Zhang, Cheng; Drake, Justin A; Ma, Jianpeng; Pettitt, B Montgomery

    2017-11-07

    We present a study on the optimization of the updating magnitude for a class of free energy methods based on flat-distribution sampling, including the Wang-Landau (WL) algorithm and metadynamics. These methods rely on adaptive construction of a bias potential that offsets the potential of mean force by histogram-based updates. The convergence of the bias potential can be improved by decreasing the updating magnitude with an optimal schedule. We show that while the asymptotically optimal schedule for the single-bin updating scheme (commonly used in the WL algorithm) is given by the known inverse-time formula, that for the Gaussian updating scheme (commonly used in metadynamics) is often more complex. We further show that the single-bin updating scheme is optimal for very long simulations, and it can be generalized to a class of bandpass updating schemes that are similarly optimal. These bandpass updating schemes target only a few long-range distribution modes and their optimal schedule is also given by the inverse-time formula. Constructed from orthogonal polynomials, the bandpass updating schemes generalize the WL and Langfeld-Lucini-Rago algorithms as an automatic parameter tuning scheme for umbrella sampling.
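
    As a concrete illustration of the single-bin updating scheme and the inverse-time schedule discussed above, the following sketch runs Wang-Landau sampling on a toy system (the sum of two dice, whose density of states is known exactly), halving the updating magnitude while the histogram is flat and switching to the 1/t schedule once the magnitude drops below 1/t. It does not implement the paper's Gaussian or bandpass schemes.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy system: two dice; the "energy" is the sum (2..12) with known
# density of states 1,2,3,4,5,6,5,4,3,2,1 for checking the estimate.
state = rng.integers(1, 7, size=2)
n_levels = 11
ln_g = np.zeros(n_levels)          # running estimate of ln(density of states)
hist = np.zeros(n_levels)
f = 1.0                            # updating magnitude (ln scale)

def idx(s):
    return s.sum() - 2

for t in range(1, 500_001):
    trial = state.copy()
    trial[rng.integers(2)] = rng.integers(1, 7)       # re-roll one die
    # Wang-Landau acceptance favors rarely visited energy levels
    if rng.random() < np.exp(ln_g[idx(state)] - ln_g[idx(trial)]):
        state = trial
    ln_g[idx(state)] += f
    hist[idx(state)] += 1
    if f >= 1.0 / t:
        # Standard staged WL: halve f when the histogram is roughly flat
        if hist.min() > 0.8 * hist.mean():
            f /= 2.0
            hist[:] = 0
    else:
        f = 1.0 / t                # asymptotically optimal inverse-time schedule

ln_g -= ln_g[0]                    # fix the arbitrary additive constant
print(np.round(np.exp(ln_g), 1))   # should approach 1, 2, 3, ..., 3, 2, 1
```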

  2. DIGI-vis: Distributed interactive geospatial information visualization

    KAUST Repository

    Ponto, Kevin; Kuester, Falk

    2010-01-01

    data sets. We propose a distributed data gathering and visualization system that allows researchers to view these data at hundreds of megapixels simultaneously. This system allows scientists to view real-time geospatial information at unprecedented

  3. Towards a distributed information architecture for avionics data

    Science.gov (United States)

    Mattmann, Chris; Freeborn, Dana; Crichton, Dan

    2003-01-01

    Avionics data at the National Aeronautics and Space Administration's (NASA) Jet Propulsion Laboratory (JPL) consists of distributed, unmanaged, and heterogeneous information that is hard for flight system design engineers to find and use on new NASA/JPL missions. The development of a systematic approach for capturing, accessing, and sharing avionics data critical to the support of NASA/JPL missions and projects is required. We propose a general information architecture for managing the existing distributed avionics data sources and a method for querying and retrieving avionics data using the Object Oriented Data Technology (OODT) framework. OODT uses an XML messaging infrastructure that profiles data products and their locations using the ISO-11179 data model for describing data products. Queries against a common data dictionary (which implements the ISO model) are translated to domain-dependent source data models, and distributed data products are returned asynchronously through the OODT middleware. Further work will include the ability to 'plug and play' new manufacturer data sources, which are distributed at avionics component manufacturer locations throughout the United States.

  4. Wyoming CV Pilot Traveler Information Message Sample

    Data.gov (United States)

    Department of Transportation — This dataset contains a sample of the sanitized Traveler Information Messages (TIM) being generated by the Wyoming Connected Vehicle (CV) Pilot. The full set of TIMs...

  5. Distributed Collaborative Learning Communities Enabled by Information Communication Technology

    NARCIS (Netherlands)

    H.L. Alvarez (Heidi Lee)

    2006-01-01

    How and why can Information Communication Technology (ICT) contribute to enhancing learning in distributed Collaborative Learning Communities (CLCs)? Drawing from relevant theories concerned with the phenomenon of ICT-enabled distributed collaborative learning, this book identifies gaps in

  6. Autonomous Information Fading and Provision to Achieve High Response Time in Distributed Information Systems

    Science.gov (United States)

    Lu, Xiaodong; Arfaoui, Helene; Mori, Kinji

    In a highly dynamic electronic commerce environment, the need for adaptability and rapid response time in information service systems has become increasingly important. In order to cope with the continuously changing conditions of service provision and utilization, the Faded Information Field (FIF) has been proposed. FIF is a distributed information service system architecture, sustained by push/pull mobile agents, that brings high assurance of services through a recursive, demand-oriented provision of the most popular information closer to the users, trading off the cost of information service allocation against the cost of access. In this paper, based on an analysis of the relationship among the user distribution, information provision, and access time, we propose a technology for FIF design that resolves the competing requirements of users and providers to improve users' access time. In addition, to achieve dynamic load balancing under changing user preferences, an autonomous information reallocation technology is proposed. We proved the effectiveness of the proposed technology through simulation and comparison with a conventional system.

  7. Shannon Entropy and Mutual Information for Multivariate Skew-Elliptical Distributions

    KAUST Repository

    Arellano-Valle, Reinaldo B.

    2012-02-27

    The entropy and mutual information index are important concepts developed by Shannon in the context of information theory. They have been widely studied in the case of the multivariate normal distribution. We first extend these tools to the full symmetric class of multivariate elliptical distributions and then to the more flexible families of multivariate skew-elliptical distributions. We study in detail the cases of the multivariate skew-normal and skew-t distributions. We apply our findings to the optimal design of an ozone monitoring station network in Santiago de Chile. © 2012 Board of the Foundation of the Scandinavian Journal of Statistics.

  8. Shannon Entropy and Mutual Information for Multivariate Skew-Elliptical Distributions

    KAUST Repository

    Arellano-Valle, Reinaldo B.; Contreras-Reyes, Javier E.; Genton, Marc G.

    2012-01-01

    The entropy and mutual information index are important concepts developed by Shannon in the context of information theory. They have been widely studied in the case of the multivariate normal distribution. We first extend these tools to the full symmetric class of multivariate elliptical distributions and then to the more flexible families of multivariate skew-elliptical distributions. We study in detail the cases of the multivariate skew-normal and skew-t distributions. We apply our findings to the optimal design of an ozone monitoring station network in Santiago de Chile. © 2012 Board of the Foundation of the Scandinavian Journal of Statistics.

  9. Prediction uncertainty assessment of a systems biology model requires a sample of the full probability distribution of its parameters

    Directory of Open Access Journals (Sweden)

    Simon van Mourik

    2014-06-01

    Full Text Available Multi-parameter models in systems biology are typically ‘sloppy’: some parameters or combinations of parameters may be hard to estimate from data, whereas others are not. One might expect that parameter uncertainty automatically leads to uncertain predictions, but this is not the case. We illustrate this by showing that the prediction uncertainty of each of six sloppy models varies enormously among different predictions. Statistical approximations of parameter uncertainty may lead to dramatic errors in prediction uncertainty estimation. We argue that prediction uncertainty assessment must therefore be performed on a per-prediction basis using a full computational uncertainty analysis. In practice this is feasible by providing a model with a sample or ensemble representing the distribution of its parameters. Within a Bayesian framework, such a sample may be generated by a Markov Chain Monte Carlo (MCMC) algorithm that infers the parameter distribution based on experimental data. Matlab code for generating the sample (with the Differential Evolution Markov Chain sampler) and for the subsequent uncertainty analysis using such a sample is supplied as Supplemental Information.
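
    The paper supplies Matlab code built around a Differential Evolution MCMC sampler; the sketch below illustrates the same workflow in Python with a plain Metropolis sampler and a toy exponential-decay "model": draw a posterior parameter ensemble, then push the whole ensemble through the model to get a per-prediction uncertainty band. The model, data, and priors are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy model y(t) = A * exp(-k t) with synthetic noisy data
t_obs = np.linspace(0, 5, 10)
y_obs = 2.0 * np.exp(-0.8 * t_obs) + rng.normal(0, 0.1, t_obs.size)
sigma = 0.1

def log_post(theta):
    A, k = theta
    if A <= 0 or k <= 0:                       # flat prior on the positive quadrant
        return -np.inf
    resid = y_obs - A * np.exp(-k * t_obs)
    return -0.5 * np.sum((resid / sigma) ** 2)

theta, chain = np.array([1.0, 1.0]), []
for _ in range(50_000):
    prop = theta + rng.normal(0, 0.05, 2)
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta)
chain = np.array(chain[10_000:])               # posterior parameter ensemble

# Per-prediction uncertainty: propagate the full ensemble, not a point estimate
pred = chain[:, 0] * np.exp(-chain[:, 1] * 7.0)
print("prediction at t = 7:", np.percentile(pred, [2.5, 50, 97.5]))
```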

  10. Estimates of the Sampling Distribution of Scalability Coefficient H

    Science.gov (United States)

    Van Onna, Marieke J. H.

    2004-01-01

    Coefficient "H" is used as an index of scalability in nonparametric item response theory (NIRT). It indicates the degree to which a set of items rank orders examinees. Theoretical sampling distributions, however, have only been derived asymptotically and only under restrictive conditions. Bootstrap methods offer an alternative possibility to…

  11. Information-theoretic methods for estimating of complicated probability distributions

    CERN Document Server

    Zong, Zhi

    2006-01-01

    Mixing various disciplines frequently produces something that is profound and far-reaching. Cybernetics is an often-quoted example. The mix of information theory, statistics, and computing technology has proved to be very useful, leading to the recent development of information-theory-based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is a fundamental task in quite a few fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing, neur

  12. Spatial distribution of single-nucleotide polymorphisms related to fungicide resistance and implications for sampling.

    Science.gov (United States)

    Van der Heyden, H; Dutilleul, P; Brodeur, L; Carisse, O

    2014-06-01

    Spatial distribution of single-nucleotide polymorphisms (SNPs) related to fungicide resistance was studied for Botrytis cinerea populations in vineyards and for B. squamosa populations in onion fields. Heterogeneity in this distribution was characterized by performing geostatistical analyses based on semivariograms and through the fitting of discrete probability distributions. Two SNPs known to be responsible for boscalid resistance (H272R and H272Y), both located on the B subunit of the succinate dehydrogenase gene, and one SNP known to be responsible for dicarboximide resistance (I365S) were chosen for B. cinerea in grape. For B. squamosa in onion, one SNP responsible for dicarboximide resistance (I365S homologous) was chosen. One onion field was sampled in 2009 and another one was sampled in 2010 for B. squamosa, and two vineyards were sampled in 2011 for B. cinerea, for a total of four sampled sites. Cluster sampling was carried out on a 10-by-10 grid, each of the 100 nodes being the center of a 10-by-10-m quadrat. In each quadrat, 10 samples were collected and analyzed by restriction fragment length polymorphism polymerase chain reaction (PCR) or allele-specific PCR. Mean SNP incidence varied from 16 to 68%, with an overall mean incidence of 43%. In the geostatistical analyses, omnidirectional variograms showed spatial autocorrelation characterized by ranges of 21 to 1 m. Various levels of anisotropy were detected, however, with variograms computed in four directions (at 0°, 45°, 90°, and 135° from the within-row direction used as reference), indicating that spatial autocorrelation was prevalent or characterized by a longer range in one direction. For all eight data sets, the β-binomial distribution was found to fit the data better than the binomial distribution. This indicates local aggregation of fungicide resistance among sampling units, as supported by estimates of the parameter θ of the β-binomial distribution of 0.09 to 0.23 (overall median value = 0
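
    The finding that a β-binomial fits incidence counts better than a binomial can be reproduced in miniature with scipy; the sketch below fits both models by maximum likelihood to hypothetical quadrat counts and compares them by AIC, with θ parameterized as 1/(a+b) so that it plays the role of the aggregation parameter mentioned above. The data are simulated, not the study's.

```python
import numpy as np
from scipy.stats import binom, betabinom
from scipy.optimize import minimize

rng = np.random.default_rng(3)
n = 10                                               # samples per quadrat
# Hypothetical over-dispersed counts of resistant isolates per quadrat
counts = rng.binomial(n, rng.beta(2.0, 3.0, size=100))

def nll_binom(params):
    p, = params
    return -np.sum(binom.logpmf(counts, n, p))

def nll_betabinom(params):
    p, theta = params                                # theta = 1/(a+b)
    a, b = p / theta, (1 - p) / theta
    return -np.sum(betabinom.logpmf(counts, n, a, b))

fit_b = minimize(nll_binom, x0=[0.4], bounds=[(1e-4, 1 - 1e-4)])
fit_bb = minimize(nll_betabinom, x0=[0.4, 0.1],
                  bounds=[(1e-4, 1 - 1e-4), (1e-4, 5.0)])

print("binomial      AIC:", 2 * 1 + 2 * fit_b.fun)
print("beta-binomial AIC:", 2 * 2 + 2 * fit_bb.fun)   # lower AIC -> aggregation supported
print("estimated p, theta:", fit_bb.x)
```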

  13. Group Acceptance Sampling Plan for Lifetime Data Using Generalized Pareto Distribution

    Directory of Open Access Journals (Sweden)

    Muhammad Aslam

    2010-02-01

    Full Text Available In this paper, a group acceptance sampling plan (GASP) is introduced for situations where the lifetime of the items follows the generalized Pareto distribution. The design parameters, such as the minimum group size and the acceptance number, are determined when the consumer’s risk and the test termination time are specified. The proposed sampling plan is compared with the existing sampling plan. It is concluded that the proposed sampling plan performs better than the existing plan in terms of the minimum sample size required to reach the same decision.

  14. Generalized molybdenum oxide surface chemical state XPS determination via informed amorphous sample model

    Energy Technology Data Exchange (ETDEWEB)

    Baltrusaitis, Jonas, E-mail: job314@lehigh.edu [Department of Chemical Engineering, Lehigh University, B336 Iacocca Hall, 111 Research Drive, Bethlehem, PA 18015 (United States); PhotoCatalytic Synthesis group, MESA+ Institute for Nanotechnology, Faculty of Science and Technology, University of Twente, Meander 229, P.O. Box 217, 7500 AE Enschede (Netherlands); Mendoza-Sanchez, Beatriz [CRANN, Chemistry School, Trinity College Dublin, Dublin (Ireland); Fernandez, Vincent [Institut des Matériaux Jean Rouxel, 2 rue de la Houssinière, BP 32229, F-44322 Nantes Cedex 3 (France); Veenstra, Rick [PhotoCatalytic Synthesis group, MESA+ Institute for Nanotechnology, Faculty of Science and Technology, University of Twente, Meander 229, P.O. Box 217, 7500 AE Enschede (Netherlands); Dukstiene, Nijole [Department of Physical and Inorganic Chemistry, Kaunas University of Technology, Radvilenu pl. 19, LT-50254 Kaunas (Lithuania); Roberts, Adam [Kratos Analytical Ltd, Trafford Wharf Road, Wharfside, Manchester, M17 1GP (United Kingdom); Fairley, Neal [Casa Software Ltd, Bay House, 5 Grosvenor Terrace, Teignmouth, Devon TQ14 8NE (United Kingdom)

    2015-01-30

    Highlights: • We analyzed and modeled spectral envelopes of complex molybdenum oxides. • Molybdenum oxide films of varying valence and crystallinity were synthesized. • MoO{sub 3} and MoO{sub 2} line shapes from experimental data were created. • An informed amorphous sample model (IASM) was developed. • Amorphous molybdenum oxide XPS envelopes were interpreted. - Abstract: Accurate elemental oxidation state determination for the outer surface of a complex material is of crucial importance in many science and engineering disciplines, including chemistry, fundamental and applied surface science, catalysis, semiconductors and many others. X-ray photoelectron spectroscopy (XPS) is the primary tool used for this purpose. The spectral data obtained, however, are often very complex and can be subject to incorrect interpretation. Unlike traditional XPS spectra fitting procedures using purely synthetic spectral components, here we develop and present an XPS data processing method based on vector analysis that allows XPS spectral components to be created by incorporating key information obtained experimentally. XPS spectral data obtained from a series of molybdenum oxide samples with varying oxidation states and degrees of crystallinity were processed using this method, and the corresponding oxidation states present, as well as their relative distribution, were elucidated. It was shown that monitoring the evolution of the chemistry and crystal structure of a molybdenum oxide sample due to an invasive X-ray probe could be used to infer solutions to complex spectral envelopes.

  15. Spatial Distribution and Sampling Plans for Grapevine Plant Canopy-Inhabiting Scaphoideus titanus (Hemiptera: Cicadellidae) Nymphs.

    Science.gov (United States)

    Rigamonti, Ivo E; Brambilla, Carla; Colleoni, Emanuele; Jermini, Mauro; Trivellone, Valeria; Baumgärtner, Johann

    2016-04-01

    The paper deals with the study of the spatial distribution and the design of sampling plans for estimating nymph densities of the grape leafhopper Scaphoideus titanus Ball in vine plant canopies. In a reference vineyard sampled for model parameterization, leaf samples were repeatedly taken according to a multistage, stratified, random sampling procedure, and the data were subjected to an ANOVA. There were no significant differences in density either among the strata within the vineyard or between the two strata with basal and apical leaves. The significant differences between densities on trunk and productive shoots led to the adoption of two-stage (leaves and plants) and three-stage (leaves, shoots, and plants) sampling plans for trunk shoot- and productive shoot-inhabiting individuals, respectively. The mean crowding to mean relationship used to analyze the nymphs' spatial distribution revealed aggregated distributions. In both the enumerative and the sequential enumerative sampling plans, the number of leaves of trunk shoots, and of leaves and shoots of productive shoots, was kept constant while the number of plants varied. In additional vineyards, data were collected and used to test the applicability of the distribution model and the sampling plans. The tests confirmed the applicability 1) of the mean crowding to mean regression model on the plant and leaf stages for representing trunk shoot-inhabiting distributions, and on the plant, shoot, and leaf stages for productive shoot-inhabiting nymphs, 2) of the enumerative sampling plan, and 3) of the sequential enumerative sampling plan. In general, sequential enumerative sampling was more cost-efficient than enumerative sampling.

  16. Optimum sample length for estimating anchovy size distribution and the proportion of juveniles per fishing set for the Peruvian purse-seine fleet

    Directory of Open Access Journals (Sweden)

    Rocío Joo

    2017-04-01

    Full Text Available The length distribution of catches represents a fundamental source of information for estimating growth and the spatio-temporal dynamics of cohorts. The length distribution of the catch is estimated from samples of caught individuals. This work studies the optimum number of individuals to sample at each fishing set in order to obtain a representative sample of the length distribution and of the proportion of juveniles in the fishing set. To that end, we use anchovy (Engraulis ringens) length data from different fishing sets recorded by at-sea observers from the On-board Observers Program of the Peruvian Marine Research Institute. Finally, we propose an optimum sample size for obtaining robust size and juvenile estimations. Though the application of this work corresponds to the anchovy fishery, the procedure can be applied to any fishery, for either on-board or inland biometric measurements.
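
    A resampling exercise of the kind that underlies choosing an optimum per-set sample size can be sketched as follows: given a densely sampled fishing set, subsample lengths at several candidate sizes and examine how tight the estimated proportion of juveniles becomes. The lengths, the juvenile cutoff, and the candidate sizes below are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical anchovy lengths (cm) from one densely sampled fishing set
lengths = rng.normal(13.5, 1.8, size=2000)
juvenile_cutoff = 12.0                     # illustrative legal size threshold
true_prop = np.mean(lengths < juvenile_cutoff)

for m in (25, 50, 100, 200, 400):
    est = [np.mean(rng.choice(lengths, size=m, replace=True) < juvenile_cutoff)
           for _ in range(2000)]
    half_width = 1.96 * np.std(est)        # approximate 95% half-width
    print(f"n = {m:3d}: juvenile proportion {np.mean(est):.3f} "
          f"+/- {half_width:.3f} (true {true_prop:.3f})")
```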

  17. Novel names extend for how long preschool children sample visual information.

    Science.gov (United States)

    Carvalho, Paulo F; Vales, Catarina; Fausey, Caitlin M; Smith, Linda B

    2018-04-01

    Known words can guide visual attention, affecting how information is sampled. How do novel words, those that do not provide any top-down information, affect preschoolers' visual sampling in a conceptual task? We proposed that novel names can also change visual sampling by influencing how long children look. We investigated this possibility by analyzing how children sample visual information when they hear a sentence with a novel name versus without a novel name. Children completed a match-to-sample task while their moment-to-moment eye movements were recorded using eye-tracking technology. Our analyses were designed to provide specific information on the properties of visual sampling that novel names may change. Overall, we found that novel words prolonged the duration of each sampling event but did not affect sampling allocation (which objects children looked at) or sampling organization (how children transitioned from one object to the next). These results demonstrate that novel words change one important dynamic property of gaze: Novel words can entrain the cognitive system toward longer periods of sustained attention early in development. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Simulated tempering distributed replica sampling: A practical guide to enhanced conformational sampling

    Energy Technology Data Exchange (ETDEWEB)

    Rauscher, Sarah; Pomes, Regis, E-mail: pomes@sickkids.ca

    2010-11-01

    Simulated tempering distributed replica sampling (STDR) is a generalized-ensemble method designed specifically for simulations of large molecular systems on shared and heterogeneous computing platforms [Rauscher, Neale and Pomes (2009) J. Chem. Theor. Comput. 5, 2640]. The STDR algorithm consists of an alternation of two steps: (1) a short molecular dynamics (MD) simulation; and (2) a stochastic temperature jump. Repeating these steps thousands of times results in a random walk in temperature, which allows the system to overcome energetic barriers, thereby enhancing conformational sampling. The aim of the present paper is to provide a practical guide to applying STDR to complex biomolecular systems. We discuss the details of our STDR implementation, which is a highly-parallel algorithm designed to maximize computational efficiency while simultaneously minimizing network communication and data storage requirements. Using a 35-residue disordered peptide in explicit water as a test system, we characterize the efficiency of the STDR algorithm with respect to both diffusion in temperature space and statistical convergence of structural properties. Importantly, we show that STDR provides a dramatic enhancement of conformational sampling compared to a canonical MD simulation.
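
    The alternation described above (a short MD segment, then a stochastic temperature jump) can be caricatured with a toy potential and Monte Carlo moves in place of MD. The jump between inverse temperatures beta_i and beta_j is accepted with probability min(1, exp[(beta_i - beta_j) U + (g_j - g_i)]); in STDR the weights g come from the distribution of all replicas, whereas here they are simply fixed, so this is only a sketch of the temperature-jump step, not of the full distributed algorithm.

```python
import numpy as np

rng = np.random.default_rng(5)

temps = np.array([1.0, 1.3, 1.7, 2.2, 3.0])     # illustrative temperature ladder
betas = 1.0 / temps
g = np.zeros_like(temps)                        # fixed weights (normally tuned)

U = lambda x: (x**2 - 1.0)**2                   # toy double-well "molecular system"

x, ti = 0.0, 0
visits = np.zeros(len(temps))
for _ in range(50_000):
    # Step 1: short "MD" segment, here a few Metropolis moves at the current T
    for _ in range(5):
        xn = x + rng.normal(scale=0.3)
        if rng.random() < np.exp(-betas[ti] * (U(xn) - U(x))):
            x = xn
    # Step 2: stochastic temperature jump to a neighboring rung
    tj = ti + rng.choice([-1, 1])
    if 0 <= tj < len(temps):
        log_acc = (betas[ti] - betas[tj]) * U(x) + (g[tj] - g[ti])
        if np.log(rng.random()) < log_acc:
            ti = tj
    visits[ti] += 1

print("fraction of time at each temperature:", visits / visits.sum())
```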

  19. Distributed quantum information processing via quantum dot spins

    International Nuclear Information System (INIS)

    Jun, Liu; Qiong, Wang; Le-Man, Kuang; Hao-Sheng, Zeng

    2010-01-01

    We propose a scheme to engineer a non-local two-qubit phase gate between two remote quantum-dot spins. Along with one-qubit local operations, one can in principle perform various types of distributed quantum information processing. The scheme employs a linearly polarised photon interacting one after the other with two remote quantum-dot spins in cavities. Due to the optical spin selection rule, the photon acquires a Faraday rotation after the interaction process. By measuring the polarisation of the final output photon, a non-local two-qubit phase gate between the two remote quantum-dot spins is constituted. Our scheme may have very important applications in distributed quantum information processing

  20. A Distributed Flocking Approach for Information Stream Clustering Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Xiaohui [ORNL; Potok, Thomas E [ORNL

    2006-01-01

    Intelligence analysts are currently overwhelmed with the amount of information streams generated every day. There is a lack of comprehensive tools that can analyze information streams in real time. Document clustering analysis plays an important role in improving the accuracy of information retrieval. However, most clustering technologies can only be applied to analyzing static document collections because they normally require a large amount of computational resources and a long time to obtain accurate results. It is very difficult to cluster dynamically changing text information streams on an individual computer. Our early research has resulted in a dynamic reactive flock clustering algorithm which can continually refine the clustering result and quickly react to changes in document contents. This characteristic makes the algorithm suitable for cluster analysis of dynamically changing document information, such as text information streams. Because of the decentralized character of this algorithm, a distributed approach is a very natural way to increase the clustering speed of the algorithm. In this paper, we present a distributed multi-agent flocking approach for text information stream clustering and discuss the decentralized architectures and communication schemes for load balancing and status information synchronization in this approach.

  1. An efficient method of randomly sampling the coherent angular scatter distribution

    International Nuclear Information System (INIS)

    Williamson, J.F.; Morin, R.L.

    1983-01-01

    Monte Carlo simulations of photon transport phenomena require random selection of an interaction process at each collision site along the photon track. Possible choices are usually limited to photoelectric absorption and incoherent scatter as approximated by the Klein-Nishina distribution. A technique is described for sampling the coherent angular scatter distribution, for the benefit of workers in medical physics. (U.K.)
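
    The abstract does not spell out the sampling technique, so the following is only a generic illustration of rejection sampling an angular scatter distribution: it draws mu = cos(theta) from the Thomson distribution p(mu) proportional to 1 + mu^2, omitting the squared atomic form factor that the coherent (Rayleigh) distribution would add.

```python
import numpy as np

rng = np.random.default_rng(6)

def sample_thomson_mu(n):
    """Rejection-sample mu = cos(theta) from p(mu) proportional to 1 + mu^2 on [-1, 1]."""
    out, filled = np.empty(n), 0
    while filled < n:
        mu = rng.uniform(-1.0, 1.0, n)
        keep = rng.uniform(0.0, 2.0, n) < (1.0 + mu**2)   # constant envelope of height 2
        take = mu[keep][: n - filled]
        out[filled:filled + take.size] = take
        filled += take.size
    return out

mus = sample_thomson_mu(100_000)
print("mean cos(theta) =", mus.mean())    # symmetric distribution, so close to 0
```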

  2. An alternative phase-space distribution to sample initial conditions for classical dynamics simulations

    International Nuclear Information System (INIS)

    Garcia-Vela, A.

    2002-01-01

    A new quantum-type phase-space distribution is proposed in order to sample initial conditions for classical trajectory simulations. The phase-space distribution is obtained as the modulus of a quantum phase-space state of the system, defined as the direct product of the coordinate and momentum representations of the quantum initial state. The distribution is tested by sampling initial conditions which reproduce the initial state of the Ar-HCl cluster prepared by ultraviolet excitation, and by simulating the photodissociation dynamics by classical trajectories. The results are compared with those of a wave packet calculation, and with a classical simulation using an initial phase-space distribution recently suggested. A better agreement is found between the classical and the quantum predictions with the present phase-space distribution, as compared with the previous one. This improvement is attributed to the fact that the phase-space distribution propagated classically in this work resembles more closely the shape of the wave packet propagated quantum mechanically

  3. On peculiarities of distribution of some elements in vegetation samples

    International Nuclear Information System (INIS)

    Bakiev, S.A.; Rakhmanov, J.; Khakimov, Z.M.; Turayev, S.

    2005-01-01

    This work is devoted to the neutron-activation analysis of medicines of plant origin and of some herbs, vegetables, fruits, and cereals used in oriental medicine, in order to reveal peculiarities of the distribution of the studied elements in them and possible relations between this distribution and parameters of oriental medicine. Sampling of 85 species and their preparation for analysis, as well as the complex of necessary methodological studies, were performed, and a method of sample analysis for 14 macro- and microelements (Na, Al, Cl, K, Sc, Mn, Fe, Co, Cu, Zn, Br, J, La, Au) was developed. The studies carried out have yielded data on the concentrations of these elements and revealed peculiarities of their distribution in the samples of interest. Herbs, fruits, and cereals with markedly higher concentrations (with respect to the mean values) of one element or another, which are perhaps concentrators of those elements, were identified, as well as samples with lower concentrations of elements (see table). It is indicative that in herbs only enhanced concentrations of elements are observed, but in fruits and cereals only lowered concentrations. These results can be of interest for geochemical ecology, dietology, and therapy, as well as for activities on correction of the elemental content of ecosystems, including soils, and of living organisms. It is suggested to continue the studies with an extended range of object types and analysed elements. Mathematical analysis of the obtained results compared the concentrations of a number of elements in the different objects with classifying parameters ('cold-hot' and 'dry-wet') of these objects according to oriental medicine. At the current stage of the studies, no relation between these parameters and the concentrations has been found. This does not mean that no such relations exist; they may be revealed as our studies are extended and developed.

  4. The Value of Information in Distributed Decision Networks

    Science.gov (United States)

    2016-03-04

    formulation, and then we describe the various results attained. 1 Mathematical description of Distributed Decision Network under Information...Constraints We now define a mathematical framework for networks. Let G = (V,E) be an undirected random network (graph) drawn from a known distribution pG, 1...to any linear, combinatorial problem like shortest path optimization, and, further, so long as the original combinatorial problem can be solved in

  5. Internal Stress Distribution Measurement of TIG Welded SUS304 Samples Using Neutron Diffraction Technique

    Science.gov (United States)

    Muslih, M. Refai; Sumirat, I.; Sairun; Purwanta

    2008-03-01

    The distribution of residual stress in SUS304 samples that underwent a TIG welding process with four different electric currents has been measured. The welding was done in the middle part of the samples, which had previously been grooved with a milling machine. Before they were welded, the samples were annealed at 650 degrees Celsius for one hour. The annealing process was done to eliminate the residual stress generated by the grooving process, so that the residual stress within the samples was produced solely by the welding process. The calculation of the distribution of residual stress was carried out by measuring the strains within the Fe(220) crystal planes of SUS304. The strain, Young's modulus, and Poisson ratio of Fe(220) SUS304 were measured using the DN1-M neutron diffractometer. The Young's modulus and Poisson ratio of the Fe(220) SUS304 sample were measured in situ. The results of the calculations showed that the distribution of residual stress in SUS304 in the vicinity of the welded area is influenced both by the treatments applied during sample preparation and by the electric current used during the welding process.

  6. Modelling Dynamic Forgetting in Distributed Information Systems

    NARCIS (Netherlands)

    N.F. Höning (Nicolas); M.C. Schut

    2010-01-01

    We describe and model a new aspect in the design of distributed information systems. We build upon a previously described problem on the microlevel, which asks how quickly agents should discount (forget) their experience: If they cherish their memories, they can build their reports on

  7. Threshold Estimation of Generalized Pareto Distribution Based on Akaike Information Criterion for Accurate Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Seunghoon; Lim, Woochul; Cho, Su-gil; Park, Sanghyun; Lee, Tae Hee [Hanyang University, Seoul (Korea, Republic of); Lee, Minuk; Choi, Jong-su; Hong, Sup [Korea Research Insitute of Ships and Ocean Engineering, Daejeon (Korea, Republic of)

    2015-02-15

    In order to perform estimations with high reliability, it is necessary to deal with the tail part of the cumulative distribution function (CDF) in greater detail compared to an overall CDF. The use of a generalized Pareto distribution (GPD) to model the tail part of a CDF is receiving more research attention with the goal of performing estimations with high reliability. Current studies on GPDs focus on ways to determine the appropriate number of sample points and their parameters. However, even if a proper estimation is made, it can be inaccurate as a result of an incorrect threshold value. Therefore, in this paper, a GPD based on the Akaike information criterion (AIC) is proposed to improve the accuracy of the tail model. The proposed method determines an accurate threshold value using the AIC with the overall samples before estimating the GPD over the threshold. To validate the accuracy of the method, its reliability is compared with that obtained using a general GPD model with an empirical CDF.
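
    The abstract states that the threshold is chosen with the AIC computed on the overall samples; the exact construction is not given, so the sketch below is a simplified stand-in: it scans candidate threshold quantiles, fits a GPD to the exceedances at each, and keeps the threshold with the lowest AIC of the tail fit. The data and the quantile grid are hypothetical.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(7)
data = rng.lognormal(mean=0.0, sigma=0.6, size=5000)     # hypothetical response data

best = None
for q in np.arange(0.80, 0.99, 0.01):                    # candidate threshold quantiles
    u = np.quantile(data, q)
    exc = data[data > u] - u                             # exceedances over the threshold
    c, _, scale = genpareto.fit(exc, floc=0.0)           # GPD fit, location fixed at 0
    loglik = np.sum(genpareto.logpdf(exc, c, loc=0.0, scale=scale))
    aic = 2 * 2 - 2 * loglik                             # two free parameters
    if best is None or aic < best[0]:
        best = (aic, u, c, scale)

aic, u, c, scale = best
print(f"selected threshold u = {u:.3f}, GPD shape = {c:.3f}, scale = {scale:.3f}")
```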

  8. Threshold Estimation of Generalized Pareto Distribution Based on Akaike Information Criterion for Accurate Reliability Analysis

    International Nuclear Information System (INIS)

    Kang, Seunghoon; Lim, Woochul; Cho, Su-gil; Park, Sanghyun; Lee, Tae Hee; Lee, Minuk; Choi, Jong-su; Hong, Sup

    2015-01-01

    In order to perform estimations with high reliability, it is necessary to deal with the tail part of the cumulative distribution function (CDF) in greater detail compared to an overall CDF. The use of a generalized Pareto distribution (GPD) to model the tail part of a CDF is receiving more research attention with the goal of performing estimations with high reliability. Current studies on GPDs focus on ways to determine the appropriate number of sample points and their parameters. However, even if a proper estimation is made, it can be inaccurate as a result of an incorrect threshold value. Therefore, in this paper, a GPD based on the Akaike information criterion (AIC) is proposed to improve the accuracy of the tail model. The proposed method determines an accurate threshold value using the AIC with the overall samples before estimating the GPD over the threshold. To validate the accuracy of the method, its reliability is compared with that obtained using a general GPD model with an empirical CDF

  9. Treatment of Nuclear Data Covariance Information in Sample Generation

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Adams, Brian M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wieselquist, William [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division

    2017-10-01

    This report summarizes a NEAMS (Nuclear Energy Advanced Modeling and Simulation) project focused on developing a sampling capability that can handle the challenges of generating samples from nuclear cross-section data. The covariance information between energy groups tends to be very ill-conditioned and thus poses a problem for traditional methods of generating correlated samples. This report outlines a method that addresses sample generation from cross-section matrices.

  10. Treatment of Nuclear Data Covariance Information in Sample Generation

    International Nuclear Information System (INIS)

    Swiler, Laura Painton; Adams, Brian M.; Wieselquist, William

    2017-01-01

    This report summarizes a NEAMS (Nuclear Energy Advanced Modeling and Simulation) project focused on developing a sampling capability that can handle the challenges of generating samples from nuclear cross-section data. The covariance information between energy groups tends to be very ill-conditioned and thus poses a problem for traditional methods of generating correlated samples. This report outlines a method that addresses sample generation from cross-section matrices.
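
    The report's own algorithm is not reproduced here; the sketch below shows one standard workaround for ill-conditioned covariance matrices, assuming only NumPy: floor the small or negative eigenvalues and factor the repaired matrix instead of attempting a Cholesky decomposition. The toy 3-group covariance is invented for illustration.

      import numpy as np

      def sample_correlated(mean, cov, n_samples, eig_floor=1e-10, seed=0):
          """Draw correlated samples when cov is ill-conditioned or nearly singular.

          Small or negative eigenvalues (numerical noise) are floored at eig_floor,
          and the factor L = V * sqrt(Lambda) replaces a Cholesky factor."""
          vals, vecs = np.linalg.eigh(cov)            # symmetric eigendecomposition
          vals = np.clip(vals, eig_floor, None)       # repair the ill-conditioned part
          L = vecs * np.sqrt(vals)                    # so that L @ L.T ~= repaired cov
          rng = np.random.default_rng(seed)
          z = rng.standard_normal((n_samples, len(mean)))
          return mean + z @ L.T

      # toy 3-group "cross-section" covariance with a near-degenerate eigenvalue
      cov = np.array([[1.0, 0.999, 0.5],
                      [0.999, 1.0, 0.5],
                      [0.5, 0.5, 0.25]])
      samples = sample_correlated(np.zeros(3), cov, 10_000)
      print(np.cov(samples, rowvar=False).round(3))   # covariance of the repaired model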

  11. Observed mass distribution of spontaneous fission fragments from samples of lime - an SSNTD study

    CERN Document Server

    Paul, D; Ghose, D; Sastri, R C

    1999-01-01

    SSNTD is one of the most commonly used detectors in studies involving nuclear phenomena. The ease with which it registers the presence of alpha particles and fission fragments has made it particularly suitable for studies where stable long exposures are needed to extract reliable information. Studies on the presence of alpha-emitting nuclides in the environment are important because such nuclides are carcinogenic. Lime samples from Silchar in Assam, Eastern India, have shown the presence of spontaneous fission fragments besides alphas. In the present study we look at the ratio of the average mass distribution of these fission fragments, which gives us an indication of the presence of traces of transuranic elements.

  12. Introducing a rainfall compound distribution model based on weather patterns sub-sampling

    Directory of Open Access Journals (Sweden)

    F. Garavaglia

    2010-06-01

    This paper presents a probabilistic model for daily rainfall, using sub-sampling based on meteorological circulation. We classified eight typical but contrasted synoptic situations (weather patterns) for France and surrounding areas, using a "bottom-up" approach, i.e. from the shape of the rain field to the synoptic situations described by geopotential fields. These weather patterns (WP) provide a discriminating variable that is consistent with French climatology and allows seasonal rainfall records to be split into more homogeneous sub-samples, in terms of meteorological genesis.

    First results show how the combination of seasonal and WP sub-sampling strongly influences the identification of the asymptotic behaviour of rainfall probabilistic models. Furthermore, with this level of stratification, an asymptotic exponential behaviour of each sub-sample appears to be a reasonable hypothesis. This first part is illustrated with two daily rainfall records from the SE of France.

    The distribution of the multi-exponential weather patterns (MEWP) is then defined as the composition, for a given season, of all WP sub-sample marginal distributions, weighted by the relative frequency of occurrence of each WP. This model is finally compared to Exponential and Generalized Pareto distributions, showing good features in terms of robustness and accuracy. These final statistical results are computed from a wide dataset of 478 rainfall records spread over the southern half of France, all covering the 1953–2005 period.
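
    A compressed sketch of the compositional idea behind the MEWP distribution, assuming Python/NumPy: each weather pattern contributes an exponential marginal, weighted by its frequency of occurrence within the season. The eight weights and scales below are invented for illustration; the actual model fits the exponentials to sub-sample tails rather than to whole records.

      import numpy as np

      def mewp_cdf(x, wp_weights, wp_scales):
          """CDF of a multi-exponential weather pattern (MEWP) mixture.

          wp_weights: relative frequency of each weather pattern (sums to 1)
          wp_scales:  exponential scale (mean) of each WP sub-sample, in mm/day"""
          wp_weights = np.asarray(wp_weights, dtype=float)
          wp_scales = np.asarray(wp_scales, dtype=float)
          x = np.atleast_1d(x)[:, None]
          return (wp_weights * (1.0 - np.exp(-x / wp_scales))).sum(axis=1)

      # eight illustrative weather patterns for one season
      weights = [0.30, 0.20, 0.15, 0.10, 0.10, 0.08, 0.05, 0.02]
      scales  = [5.0, 8.0, 12.0, 15.0, 20.0, 30.0, 45.0, 70.0]   # rarer patterns have heavier tails
      for daily_rain in (50.0, 100.0, 150.0):
          p = mewp_cdf(daily_rain, weights, scales)[0]
          print(f"P(daily rainfall <= {daily_rain:5.1f} mm) = {p:.5f}")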

  13. Spatial distribution and landuse planning of informal automobile ...

    African Journals Online (AJOL)

    Spatial distribution and landuse planning of informal automobile workshops in Osogbo, ... data pertaining to the activities and other related issues of their workshops. ... The study therefore recommends the establishment of a mechanic complex, ...

  14. Sensitivity of postplanning target and OAR coverage estimates to dosimetric margin distribution sampling parameters.

    Science.gov (United States)

    Xu, Huijun; Gordon, J James; Siebers, Jeffrey V

    2011-02-01

    A dosimetric margin (DM) is the margin in a specified direction between a structure and a specified isodose surface, corresponding to a prescription or tolerance dose. The dosimetric margin distribution (DMD) is the distribution of DMs over all directions. Given a geometric uncertainty model, representing inter- or intrafraction setup uncertainties or internal organ motion, the DMD can be used to calculate coverage Q, which is the probability that a realized target or organ-at-risk (OAR) dose metric D exceeds the corresponding prescription or tolerance dose. Postplanning coverage evaluation quantifies the percentage of uncertainties for which target and OAR structures meet their intended dose constraints. The goal of the present work is to evaluate coverage probabilities for 28 prostate treatment plans to determine DMD sampling parameters that ensure adequate accuracy for postplanning coverage estimates. Normally distributed interfraction setup uncertainties were applied to 28 plans for localized prostate cancer, with prescribed dose of 79.2 Gy and 10 mm clinical target volume to planning target volume (CTV-to-PTV) margins. Using angular or isotropic sampling techniques, dosimetric margins were determined for the CTV, bladder and rectum, assuming shift invariance of the dose distribution. For angular sampling, DMDs were sampled at fixed angular intervals ω (e.g., ω = 1°, 2°, 5°, 10°, 20°). Isotropic samples were uniformly distributed on the unit sphere resulting in variable angular increments, but were calculated for the same number of sampling directions as angular DMDs, and accordingly characterized by the effective angular increment ω_eff. In each direction, the DM was calculated by moving the structure in radial steps of size δ (= 0.1, 0.2, 0.5, 1 mm) until the specified isodose was crossed. Coverage estimation accuracy ΔQ was quantified as a function of the sampling parameters ω or ω_eff and δ. The

  15. Rock sampling. [method for controlling particle size distribution

    Science.gov (United States)

    Blum, P. (Inventor)

    1971-01-01

    A method for sampling rock and other brittle materials and for controlling resultant particle sizes is described. The method involves cutting grooves in the rock surface to provide a grouping of parallel ridges and subsequently machining the ridges to provide a powder specimen. The machining step may comprise milling, drilling, lathe cutting or the like; but a planing step is advantageous. Control of the particle size distribution is effected primarily by changing the height and width of these ridges. This control exceeds that obtainable by conventional grinding.

  16. Distribution and Origin of Amino Acids in Lunar Regolith Samples

    Science.gov (United States)

    Elsila, J. E.; Callahan, M. P.; Glavin, D. P.; Dworkin, J. P.; McLain, H. L.; Noble, S. K.; Gibson, E. K., Jr.

    2015-01-01

    The existence of organic compounds on the lunar surface has been a question of interest from the Apollo era to the present. Investigations of amino acids immediately after collection of lunar samples yielded inconclusive identifications, in part due to analytical limitations including insensitivity to certain compounds, an inability to separate enantiomers, and lack of compound-specific isotopic measurements. It was not possible to determine if the detected amino acids were indigenous to the lunar samples or the result of terrestrial contamination. Recently, we presented initial data from the analysis of amino acid abundances in 12 lunar regolith samples and discussed those results in the context of four potential amino acid sources [5]. Here, we expand on our previous work, focusing on amino acid abundances and distributions in seven regolith samples and presenting the first compound-specific carbon isotopic ratios measured for amino acids in a lunar sample.

  17. The Role of the Sampling Distribution in Understanding Statistical Inference

    Science.gov (United States)

    Lipson, Kay

    2003-01-01

    Many statistics educators believe that few students develop the level of conceptual understanding essential for them to apply correctly the statistical techniques at their disposal and to interpret their outcomes appropriately. It is also commonly believed that the sampling distribution plays an important role in developing this understanding.…

  18. Inference for Local Distributions at High Sampling Frequencies: A Bootstrap Approach

    DEFF Research Database (Denmark)

    Hounyo, Ulrich; Varneskov, Rasmus T.

    of "large" jumps. Our locally dependent wild bootstrap (LDWB) accommodate issues related to the stochastic scale and jumps as well as account for a special block-wise dependence structure induced by sampling errors. We show that the LDWB replicates first and second-order limit theory from the usual...... empirical process and the stochastic scale estimate, respectively, as well as an asymptotic bias. Moreover, we design the LDWB sufficiently general to establish asymptotic equivalence between it and and a nonparametric local block bootstrap, also introduced here, up to second-order distribution theory....... Finally, we introduce LDWB-aided Kolmogorov-Smirnov tests for local Gaussianity as well as local von-Mises statistics, with and without bootstrap inference, and establish their asymptotic validity using the second-order distribution theory. The finite sample performance of CLT and LDWB-aided local...

  19. Distributed retrieval practice promotes superior recall of anatomy information.

    Science.gov (United States)

    Dobson, John L; Perez, Jose; Linderholm, Tracy

    2017-07-01

    Effortful retrieval produces greater long-term recall of information when compared to studying (i.e., reading), as do learning sessions that are distributed (i.e., spaced apart) when compared to those that are massed together. Although the retrieval and distributed practice effects are well-established in the cognitive science literature, no studies have examined their additive effect with regard to learning anatomy information. The aim of this study was to determine how the benefits of retrieval practice vary with massed versus distributed learning. Participants used the following strategies to learn sets of skeletal muscle anatomy: (1) studying on three different days over a seven day period (SSSS 7,2,0), (2) studying and retrieving on three different days over a seven day period (SRSR 7,2,0), (3) studying on two different days over a two day period (SSSSSS 2,0), (4) studying and retrieving on two separate days over a two day period (SRSRSR 2,0), and (5) studying and retrieving on one day (SRx6 0). All strategies consisted of 12 learning phases and lasted exactly 24 minutes. Muscle information retention was assessed via free recall and analysed using repeated measures ANOVAs. A week after learning, the recall scores were 24.72 ± 3.12, 33.88 ± 3.48, 15.51 ± 2.48, 20.72 ± 2.94, and 12.86 ± 2.05 for the SSSS 7,2,0, SRSR 7,2,0, SSSSSS 2,0, SRSRSR 2,0, and SRx6 0 strategies, respectively. In conclusion, the distributed strategies produced significantly better recall than the massed strategies, the retrieval-based strategies produced significantly better recall than the studying strategies, and the combination of distributed and retrieval practice generated the greatest recall of anatomy information. Anat Sci Educ 10: 339-347. © 2016 American Association of Anatomists.

  20. Acceptance Sampling Plans Based on Truncated Life Tests for Sushila Distribution

    Directory of Open Access Journals (Sweden)

    Amer Ibrahim Al-Omari

    2018-03-01

    An acceptance sampling plan problem based on truncated life tests, when the lifetime follows a Sushila distribution, is considered in this paper. For various acceptance numbers, confidence levels, and values of the ratio between the fixed experiment time and the specified mean lifetime, the minimum sample sizes required to ascertain a specified mean life were found. The operating characteristic function values of the suggested sampling plans and the producer's risk are presented. Some tables are provided and the results are illustrated with an example based on a real data set.
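
    A hedged sketch of how such minimum sample sizes are typically computed for a truncated life test: find the smallest n for which the probability of seeing at most c failures, when the true mean equals the specified mean, stays below 1 - P*. Since the Sushila CDF is specific to the paper, a unit-mean exponential is used below as a placeholder lifetime model.

      from math import comb, exp

      def min_sample_size(cdf, t_over_mu, c, p_star, n_max=500):
          """Smallest n with sum_{i<=c} C(n,i) p^i (1-p)^(n-i) <= 1 - p_star,
          where p = cdf(t_over_mu) is the chance an item fails before the
          truncation time when the true mean equals the specified mean."""
          p = cdf(t_over_mu)
          for n in range(c + 1, n_max + 1):
              accept_prob = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(c + 1))
              if accept_prob <= 1 - p_star:
                  return n
          raise ValueError("no n <= n_max satisfies the confidence requirement")

      # placeholder lifetime model: unit-mean exponential instead of the Sushila CDF
      exp_cdf = lambda x: 1.0 - exp(-x)
      for c in (0, 1, 2):
          n = min_sample_size(exp_cdf, t_over_mu=0.75, c=c, p_star=0.95)
          print(f"acceptance number c={c}: minimum sample size n={n}")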

  1. Study on distributions and recoveries of tetrachlorodibenzo-p-dioxin and octachlorodibenzo-p-dioxin in a mm5 sampling train

    International Nuclear Information System (INIS)

    Finkel, J.M.; James, R.H.; Baughman, K.W.

    1990-12-01

    14C-dioxin tracers were used to evaluate whole MM5 sampling train recoveries of dioxin and to determine the distribution of dioxins spiked into a sampling train that was concurrently sampling emissions from a burn of either natural gas ('clean' burn) or kerosene ('dirty' burn). The spike tests were made with a pilot-scale furnace constructed and operated in the laboratory. Recovery of 14C-dioxin from the MM5 sampling train was determined by scintillation spectrometry. The experimental results indicate that the amount of spiked TCDD-14C recovered was approximately 85% during a natural gas test and 83% during a kerosene test. The amount of spiked OCDD-14C recovered was approximately 88% during a kerosene test. Also, the data indicate that during the kerosene tests OCDD-14C is collected primarily in the front half of the sampling train but TCDD-14C is often found in the XAD and the rear filter bell, riser and condenser of the sampling train. During the natural gas tests, TCDD-14C was primarily in the XAD. The distribution of the TCDD-14C in the kerosene tests was dependent on the rigid operation of the sampling train. The information from the study will be used to determine procedural areas that need improvements or modifications to allow the efficient collection and accurate determination of trace levels of dioxins and furans using the MM5 Method.

  2. AGIS: Evolution of Distributed Computing Information system for ATLAS

    CERN Document Server

    Anisenkov, Alexey; The ATLAS collaboration; Alandes Pradillo, Maria; Karavakis, Edward

    2015-01-01

    The variety of the ATLAS Computing Infrastructure requires a central information system to define the topology of computing resources and to store the different parameters and configuration data which are needed by the various ATLAS software components. The ATLAS Grid Information System is the system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing applications and services.

  3. Distributed Information Search and Retrieval for Astronomical Resource Discovery and Data Mining

    Science.gov (United States)

    Murtagh, Fionn; Guillaume, Damien

    Information search and retrieval has become by nature a distributed task. We look at tools and techniques which are of importance in this area. Current technological evolution can be summarized as the growing stability and cohesiveness of distributed architectures of searchable objects. The objects themselves are more often than not multimedia, including published articles or grey literature reports, yellow page services, image data, catalogs, presentation and online display materials, and "operations" information such as scheduling and publicly accessible proposal information. The evolution towards distributed architectures, protocols and formats, and the direction of our own work, are focussed on in this paper.

  4. MODELING OF TECHNICAL CHANNELS OF INFORMATION LEAKAGE AT DISTRIBUTED CONTROL OBJECTS

    Directory of Open Access Journals (Sweden)

    Aleksander Vladimirovich Karpov

    2018-05-01

    The significant increase in requirements for the functioning of distributed control objects cannot be met only by widening and strengthening security control measures. The first step in ensuring information security at such objects is the analysis of their operating conditions and the modelling of technical channels of information leakage. Developing models of such channels is essentially the only method for studying their capabilities in full, and it is aimed at obtaining quantitative assessments of the safe operation of compound objects. These assessments are needed to decide, according to the current criterion, on the degree to which information is protected from leakage. Existing models were developed for standard concentrated objects and evaluate the level of protection from leakage on each channel separately, which substantially increases the required protective resources and the time needed to assess information security for an object as a whole. The article deals with a logical-and-probabilistic method for assessing the security of structurally compound objects. A model of an information leak at distributed control objects is given as an example. It is recommended to use a software package for automated structural-logical modelling of compound systems, which allows the risk of information leakage in the acoustic channel to be evaluated. The possibility of information leakage through technical channels is evaluated, and differential characteristics of the safe operation of distributed control objects, such as the positive and negative contributions of the initiating events and conditions that cause a leak, are calculated. Purpose. The aim is a quantitative assessment of data risk, which is necessary for justifying the rational composition of organizational and technical protection measures, as well as a variant of the structure of the information security system from a

  5. Teaching the Concept of the Sampling Distribution of the Mean

    Science.gov (United States)

    Aguinis, Herman; Branstetter, Steven A.

    2007-01-01

    The authors use proven cognitive and learning principles and recent developments in the field of educational psychology to teach the concept of the sampling distribution of the mean, which is arguably one of the most central concepts in inferential statistics. The proposed pedagogical approach relies on cognitive load, contiguity, and experiential…
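
    For readers who want the concept rather than the pedagogy, a small simulation (not taken from the article) illustrates the sampling distribution of the mean: repeated samples from a skewed population yield sample means whose spread shrinks as 1/sqrt(n).

      import numpy as np

      rng = np.random.default_rng(42)
      population = rng.exponential(scale=10.0, size=100_000)   # skewed parent population

      for n in (5, 30, 100):
          # draw many samples of size n and record each sample mean
          means = population[rng.integers(0, population.size, size=(20_000, n))].mean(axis=1)
          print(f"n={n:3d}: mean of sample means={means.mean():6.2f}  "
                f"SE={means.std(ddof=1):5.2f}  (theory: {population.std()/np.sqrt(n):5.2f})")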

  6. Simulated Tempering Distributed Replica Sampling, Virtual Replica Exchange, and Other Generalized-Ensemble Methods for Conformational Sampling.

    Science.gov (United States)

    Rauscher, Sarah; Neale, Chris; Pomès, Régis

    2009-10-13

    Generalized-ensemble algorithms in temperature space have become popular tools to enhance conformational sampling in biomolecular simulations. A random walk in temperature leads to a corresponding random walk in potential energy, which can be used to cross over energetic barriers and overcome the problem of quasi-nonergodicity. In this paper, we introduce two novel methods: simulated tempering distributed replica sampling (STDR) and virtual replica exchange (VREX). These methods are designed to address the practical issues inherent in the replica exchange (RE), simulated tempering (ST), and serial replica exchange (SREM) algorithms. RE requires a large, dedicated, and homogeneous cluster of CPUs to function efficiently when applied to complex systems. ST and SREM both have the drawback of requiring extensive initial simulations, possibly adaptive, for the calculation of weight factors or potential energy distribution functions. STDR and VREX alleviate the need for lengthy initial simulations, and for synchronization and extensive communication between replicas. Both methods are therefore suitable for distributed or heterogeneous computing platforms. We perform an objective comparison of all five algorithms in terms of both implementation issues and sampling efficiency. We use disordered peptides in explicit water as test systems, for a total simulation time of over 42 μs. Efficiency is defined in terms of both structural convergence and temperature diffusion, and we show that these definitions of efficiency are in fact correlated. Importantly, we find that ST-based methods exhibit faster temperature diffusion and correspondingly faster convergence of structural properties compared to RE-based methods. Within the RE-based methods, VREX is superior to both SREM and RE. On the basis of our observations, we conclude that ST is ideal for simple systems, while STDR is well-suited for complex systems.

  7. Distributed Systems and Applications of Information Filtering and Retrieval

    CERN Document Server

    Giuliani, Alessandro; Semeraro, Giovanni; DART 2012

    2014-01-01

    This volume focuses on new challenges in distributed Information Filtering and Retrieval. It collects invited chapters and extended research contributions from the special session on Information Filtering and Retrieval: Novel Distributed Systems and Applications (DART) of the 4th International Conference on Knowledge Discovery and Information Retrieval (KDIR 2012), held in Barcelona, Spain, on 4-7 October 2012. The main focus of DART was to discuss and compare suitable novel solutions based on intelligent techniques and applied to real-world applications. The chapters of this book present a comprehensive review of related works and the state of the art. Authors, both practitioners and researchers, shared their results on several topics such as "Multi-Agent Systems", "Natural Language Processing", "Automatic Advertisement", "Customer Interaction Analytics", and "Opinion Mining". Contributions have been carefully reviewed by experts in the area, who also gave useful suggestions to improve the quality of the volume.

  8. Obtaining Samples Representative of Contaminant Distribution in an Aquifer

    International Nuclear Information System (INIS)

    Schalla, Ronald; Spane, Frank A.; Narbutovskih, Susan M.; Conley, Scott F.; Webber, William D.

    2002-01-01

    Historically, groundwater samples collected from monitoring wells have been assumed to provide average indications of contaminant concentrations within the aquifer over the well-screen interval. In-well flow circulation, heterogeneity in the surrounding aquifer, and the sampling method utilized, however, can significantly impact the representativeness of samples as contaminant indicators of actual conditions within the surrounding aquifer. This paper identifies the need and approaches essential for providing cost-effective and technically meaningful groundwater-monitoring results. Proper design of the well-screen interval is critical. An accurate understanding of ambient (non-pumping) flow conditions within the monitoring well is essential for determining the contaminant distribution within the aquifer. The ambient in-well flow velocity, flow direction and volumetric flux rate are key to this understanding. Not only do the ambient flow conditions need to be identified for preferential flow zones, but so do the probable changes that will be imposed under the dynamic conditions that occur during groundwater sampling. Once the in-well flow conditions are understood, effective sampling can be conducted to obtain representative samples for specific depth zones or zones of interest. The question of sample representativeness has become an important issue as waste minimization techniques such as low-flow purging and sampling are implemented to combat the increasing cost of well purging and sampling at many hazardous waste sites. Several technical approaches (e.g., well tracer techniques and flowmeter surveys) can be used to determine in-well flow conditions, and these are discussed with respect to both their usefulness and limitations. Proper fluid extraction methods, using minimal (low) volume and no-purge sampling to obtain representative samples of aquifer conditions, are presented.

  9. Cross-Sectional Information on Pore Structure and Element Distribution of Sediment Particles by SEM and EDS

    Directory of Open Access Journals (Sweden)

    Minghong Chen

    2017-01-01

    The interaction between pollutants and sediment particles often occurs on the particle surface, so surface properties directly affect surface reactions. The physical and chemical processes occurring on sediment particle surfaces are microscopic processes and as such need to be studied from a microscopic perspective. In this study, field emission scanning electron microscopy (SEM) and energy dispersive X-ray spectrometry (EDS) were adopted to observe and analyze the pore structure and element distribution of sediment particles. In particular, a special sample preparation method was used to obtain the corresponding cross-sectional information of sediment particles. Clear images of the particle profile and pore microstructure were obtained by high-resolution SEM, while element distribution maps of sediment particles were obtained by EDS. The results provide an intuitive understanding of the internal microenvironment and external behavior of sediment particles, in addition to revealing a significant role of pore microstructure in the adsorption and desorption of pollutants. Thus, a combination of different experimental instruments and observation methods can provide real images and information on the microscopic pore structure and element distribution of sediment particles. These results should help to improve our understanding of sediment dynamics and its environmental effects.

  10. Application of PIXE analysis to environmental samples stable element distribution in sea algae by scanning microprobe analysis

    International Nuclear Information System (INIS)

    Ishikawa, M.; Kitao, K.; Imaseki, H.; Ishii, T.; Uchida, S.

    1984-01-01

    A microprobe with a resolution of 33±3 μm, focussed with a quadrupole doublet installed at the 3 MV Van de Graaff of the National Institute of Radiological Sciences, Japan, was used for the present analysis. The brown alga Hizikia fusiforme was the sample target, bombarded with a 2 MeV proton beam collimated mechanically into a rectangular image of 100 μm × 700 μm. Scanning across the sample target, prepared as a longitudinal section from the caulis of the alga, provided the following observations. More than 12 elements, such as Al, Si, P, Cl, Ca, Mn, Fe, Cu, Zn, As, Br and Sr, were determined simultaneously, together with their distributions across the diameter. In the medullary layer, Mn and Zn showed specific accumulation, while the deposition of Fe, Cu, As and Br was high in the epithelial layer; Fe and Cu in particular were found on the surface, where it contacts the ambient sea water, whereas no significant change in pattern was indicated for elements such as Al, P and Cl. The PIXE microprobe analysis was therefore effective in detecting elements down to the level of a few ppm, providing further possibilities for collecting information from bio-medical and environmental samples on the trace characterization of elements. (author)

  11. Information system for administrating and distributing color images through internet

    Directory of Open Access Journals (Sweden)

    2007-01-01

    The information system for administrating and distributing color images through the Internet ensures the consistent replication of color images, their storage in an on-line database, and predictable distribution by means of a digitally distributed flow based on the Windows platform and POD (Print On Demand) technology. The consistent replication of color images, independently of the parameters of the processing equipment and the features of the programs composing the technological flow, is ensured by the standard color management system defined by the ICC (International Color Consortium), which is integrated by the Windows operating system and by the POD technology. The latter minimizes the noticeable differences between the colors captured, displayed or printed by various replication devices and/or edited by various graphical applications. The system's integrated web application ensures the uploading of color images into an on-line database and their administration and distribution among users via the Internet. For the preservation of the data expressed by the color images during their transfer along a digitally distributed flow, the software application includes an original tool ensuring the accurate replication of colors on computer displays or when printing them on various color printers or presses. For development and use, this application employs a hardware platform based on PC support and a competitive software platform based on the Windows operating system, the .NET development environment and the C# programming language. This information system is beneficial for creators and users of color images, since the success of printed or on-line (Internet) publications depends on the sizeable, predictable and accurate replication of the colors employed for the visual expression of information in every field of activity of modern society. The herein introduced information system enables all interested persons to access the

  12. Power distribution system reliability evaluation using dagger-sampling Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Y.; Zhao, S.; Ma, Y. [North China Electric Power Univ., Hebei (China). Dept. of Electrical Engineering

    2009-03-11

    A dagger-sampling Monte Carlo simulation method was used to evaluate power distribution system reliability. The dagger-sampling technique was used to record the failure of a component as an incident and to determine its occurrence probability by generating incident samples using random numbers. The dagger sampling technique was combined with the direct sequential Monte Carlo method to calculate average values of load point indices and system indices. Results of the 2 methods with simulation times of up to 100,000 years were then compared. The comparative evaluation showed that less computing time was required using the dagger-sampling technique due to its higher convergence speed. When simulation times were 1000 years, the dagger-sampling method required 0.05 seconds to accomplish an evaluation, while the direct method required 0.27 seconds. 12 refs., 3 tabs., 4 figs.
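
    A minimal sketch of the dagger-sampling idea for a single component, assuming NumPy; the paper's full evaluation of load-point and system indices is not reproduced. One uniform random number covers a block of floor(1/q) trials, which is what gives the technique its convergence advantage over direct sampling.

      import numpy as np

      def dagger_failure_samples(q, n_trials, rng):
          """Generate n_trials Bernoulli(q) failure indicators with dagger sampling.

          Each uniform random number covers m = floor(1/q) trials: the trial whose
          sub-interval of width q contains the number fails, the others succeed."""
          m = int(1.0 / q)
          out = np.zeros(n_trials, dtype=bool)
          for start in range(0, n_trials, m):
              block = min(m, n_trials - start)
              u = rng.random()
              k = int(u / q)                      # index of the sub-interval hit by u
              if k < block:
                  out[start + k] = True
              # if k >= block (u falls past block*q), the whole block succeeds
          return out

      rng = np.random.default_rng(1)
      q = 0.01                                     # component failure probability
      direct = rng.random(100_000) < q             # direct sequential sampling
      dagger = dagger_failure_samples(q, 100_000, rng)
      print("direct estimate:", direct.mean(), " dagger estimate:", dagger.mean())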

  13. Efficient Round-Trip Time Optimization for Replica-Exchange Enveloping Distribution Sampling (RE-EDS).

    Science.gov (United States)

    Sidler, Dominik; Cristòfol-Clough, Michael; Riniker, Sereina

    2017-06-13

    Replica-exchange enveloping distribution sampling (RE-EDS) allows the efficient estimation of free-energy differences between multiple end-states from a single molecular dynamics (MD) simulation. In EDS, a reference state is sampled, which can be tuned by two types of parameters, i.e., the smoothness parameter(s) s and the energy offsets, such that all end-states are sufficiently sampled. However, the choice of these parameters is not trivial. Replica exchange (RE) or parallel tempering is a widely applied technique to enhance sampling. By combining EDS with the RE technique, the parameter choice problem could be simplified and the challenge shifted toward an optimal distribution of the replicas in the smoothness-parameter space. The choice of a certain replica distribution can alter the sampling efficiency significantly. In this work, global round-trip time optimization (GRTO) algorithms are tested for the use in RE-EDS simulations. In addition, a local round-trip time optimization (LRTO) algorithm is proposed for systems with slowly adapting environments, where a reliable estimate for the round-trip time is challenging to obtain. The optimization algorithms were applied to RE-EDS simulations of a system of nine small-molecule inhibitors of phenylethanolamine N-methyltransferase (PNMT). The energy offsets were determined using our recently proposed parallel energy-offset (PEOE) estimation scheme. While the multistate GRTO algorithm yielded the best replica distribution for the ligands in water, the multistate LRTO algorithm was found to be the method of choice for the ligands in complex with PNMT. With this, the 36 alchemical free-energy differences between the nine ligands were calculated successfully from a single RE-EDS simulation 10 ns in length. Thus, RE-EDS presents an efficient method for the estimation of relative binding free energies.

  14. Multicounter neutron detector for examination of content and spatial distribution of fissile materials in bulk samples

    International Nuclear Information System (INIS)

    Swiderska-Kowalczyk, M.; Starosta, W.; Zoltowski, T.

    1999-01-01

    A new neutron coincidence well counter is presented. This experimental device can be applied for passive assay of fissile and, in particular, plutonium-bearing materials. It consists of a set of 3He tubes placed inside a polyethylene moderator. Outputs from the tubes, first processed by preamplifier/amplifier/discriminator circuits, are then analysed using a correlator connected to a PC, with correlation techniques implemented in software. Such a neutron counter enables determination of the 240Pu effective mass in samples of small Pu content (i.e., where multiplication effects can be neglected) and of fairly large volume (up to 0.17 m3), provided the isotopic composition is known. For determination of the neutron source distribution inside a sample, a heuristic method based on hierarchical cluster analysis was applied. As input parameters, the amplitudes and phases of the two-dimensional Fourier transform of the count-profile matrices for known point-source distributions and for the examined samples were taken. Such count-profile matrices are collected by scanning the sample with the detection head. In the clustering process, count profiles of unknown samples are fitted into dendrograms employing the 'proximity' criterion of the examined sample profile to standard sample profiles. The distribution of neutron sources in the examined sample is then evaluated on the basis of a comparison with standard source distributions. (author)

  15. Elemental distribution and sample integrity comparison of freeze-dried and frozen-hydrated biological tissue samples with nuclear microprobe

    Energy Technology Data Exchange (ETDEWEB)

    Vavpetič, P., E-mail: primoz.vavpetic@ijs.si [Jožef Stefan Institute, Jamova 39, SI-1000 Ljubljana (Slovenia); Vogel-Mikuš, K. [Biotechnical Faculty, Department of Biology, University of Ljubljana, Jamnikarjeva 101, SI-1000 Ljubljana (Slovenia); Jeromel, L. [Jožef Stefan Institute, Jamova 39, SI-1000 Ljubljana (Slovenia); Ogrinc Potočnik, N. [Jožef Stefan Institute, Jamova 39, SI-1000 Ljubljana (Slovenia); FOM-Institute AMOLF, Science Park 104, 1098 XG Amsterdam (Netherlands); Pongrac, P. [Biotechnical Faculty, Department of Biology, University of Ljubljana, Jamnikarjeva 101, SI-1000 Ljubljana (Slovenia); Department of Plant Physiology, University of Bayreuth, Universitätstr. 30, 95447 Bayreuth (Germany); Drobne, D.; Pipan Tkalec, Ž.; Novak, S.; Kos, M.; Koren, Š.; Regvar, M. [Biotechnical Faculty, Department of Biology, University of Ljubljana, Jamnikarjeva 101, SI-1000 Ljubljana (Slovenia); Pelicon, P. [Jožef Stefan Institute, Jamova 39, SI-1000 Ljubljana (Slovenia)

    2015-04-01

    The analysis of biological samples in the frozen-hydrated state with the micro-PIXE technique at the Jožef Stefan Institute (JSI) nuclear microprobe has matured to a point that enables us to measure and examine frozen tissue samples routinely as a standard research method. A cryotome-cut slice of frozen-hydrated biological sample is mounted between two thin foils and positioned on the sample holder. The temperature of the cold stage in the measuring chamber is kept below 130 K throughout the insertion of the samples and the proton beam exposure. The matrix composition of frozen-hydrated tissue consists mostly of ice. Sample deterioration during proton beam exposure is monitored during the experiment, as both Elastic Backscattering Spectrometry (EBS) and Scanning Transmission Ion Microscopy (STIM) in on-off axis geometry are recorded together with the events in two PIXE detectors and backscattered ions from the chopper in a single list-mode file. The aim of this experiment was to determine the differences and similarities between two kinds of biological sample preparation techniques for micro-PIXE analysis, namely freeze-drying and frozen-hydrated sample preparation, in order to evaluate the improvements, if any, in the elemental localisation achieved by the latter technique. In the presented work, a standard micro-PIXE configuration for tissue mapping at JSI was used, with five detection systems operating in parallel, a proton beam cross section of 1.0 × 1.0 μm² and a beam current of 100 pA. The comparison of the resulting elemental distributions measured in biological tissue prepared in the frozen-hydrated and in the freeze-dried state revealed differences in the elemental distribution of particular elements at the cellular level, due to the morphology alteration in particular tissue compartments induced either by water removal in the lyophilisation process or by unsatisfactory preparation of samples for cutting and mounting during the shock-freezing phase of sample preparation.

  16. Towards an Information Model of Consistency Maintenance in Distributed Interactive Applications

    Directory of Open Access Journals (Sweden)

    Xin Zhang

    2008-01-01

    A novel framework to model and explore predictive contract mechanisms in distributed interactive applications (DIAs) using information theory is proposed. In our model, the entity state update scheme is modelled as an information generation, encoding, and reconstruction process. Such a perspective facilitates a quantitative measurement of state fidelity loss as a result of the distribution protocol. Results from an experimental study on a first-person shooter game are used to illustrate the utility of this measurement process. We contend that our proposed model is a starting point to reframe and analyse consistency maintenance in DIAs as a problem in distributed interactive media compression.

  17. Moment and maximum likelihood estimators for Weibull distributions under length- and area-biased sampling

    Science.gov (United States)

    Jeffrey H. Gove

    2003-01-01

    Many of the most popular sampling schemes used in forestry are probability proportional to size methods. These methods are also referred to as size biased because sampling is actually from a weighted form of the underlying population distribution. Length- and area-biased sampling are special cases of size-biased sampling where the probability weighting comes from a...
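
    A numerical sketch (not Gove's moment or maximum likelihood estimators) of how the length-biased weighting enters the likelihood, assuming SciPy: under length bias the sampling density becomes x f(x)/mu, so each observation contributes an extra log(x) term and the normalising mean mu is subtracted. The simulated data, starting values, and optimizer choice are illustrative.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.special import gamma
      from scipy.stats import weibull_min

      def neg_loglik_length_biased(params, x):
          """Negative log-likelihood of Weibull(k, lam) under length-biased sampling:
          f_w(x) = x f(x) / mu, with mu = lam * Gamma(1 + 1/k)."""
          k, lam = params
          if k <= 0 or lam <= 0:
              return np.inf
          mu = lam * gamma(1.0 + 1.0 / k)
          logf = weibull_min.logpdf(x, k, scale=lam)
          return -np.sum(np.log(x) + logf - np.log(mu))

      rng = np.random.default_rng(3)
      k_true, lam_true = 2.0, 10.0
      pool = weibull_min.rvs(k_true, scale=lam_true, size=200_000, random_state=rng)
      # size-biased draw: selection probability proportional to the length x
      sample = rng.choice(pool, size=2000, replace=True, p=pool / pool.sum())

      fit = minimize(neg_loglik_length_biased, x0=[1.0, np.mean(sample)],
                     args=(sample,), method="Nelder-Mead")
      print("length-biased MLE (k, lambda):", np.round(fit.x, 3))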

  18. Neutron multicounter detector for investigation of content and spatial distribution of fission materials in large volume samples

    International Nuclear Information System (INIS)

    Swiderska-Kowalczyk, M.; Starosta, W.; Zoltowski, T.

    1998-01-01

    The experimental device is a neutron coincidence well counter. It can be applied for passive assay of fissile, especially plutonium-bearing, materials. It consists of a set of 3He tubes placed inside a polyethylene moderator; outputs from the tubes, first processed by preamplifier/amplifier/discriminator circuits, are then analysed using a neutron correlator connected to a PC, with correlation techniques implemented in software. Such a neutron counter allows determination of the plutonium mass (240Pu effective mass) in non-multiplying samples of fairly large volume (up to 0.14 m3). For determination of the neutron source distribution inside the sample, heuristic methods based on hierarchical cluster analysis are applied. As input parameters, the amplitudes and phases of the two-dimensional Fourier transform of the count-profile matrices for known point-source distributions and for the examined samples are taken. Such matrices are collected by scanning the sample with the detection head. During the clustering process, count profiles for unknown samples are fitted into dendrograms using the 'proximity' criterion of the examined sample profile to standard sample profiles. The distribution of neutron sources in an examined sample is then evaluated on the basis of a comparison with standard source distributions. (author)

  19. Discrete Ziggurat: A time-memory trade-off for sampling from a Gaussian distribution over the integers

    NARCIS (Netherlands)

    Buchmann, J.; Cabarcas, D.; Göpfert, F.; Hülsing, A.T.; Weiden, P.; Lange, T.; Lauter, K.; Lisonek, P.

    2014-01-01

    Several lattice-based cryptosystems require to sample from a discrete Gaussian distribution over the integers. Existing methods to sample from such a distribution either need large amounts of memory or they are very slow. In this paper we explore a different method that allows for a flexible
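
    The discrete Ziggurat construction itself is not sketched here; for orientation, the snippet below shows a simple baseline method, rejection sampling of a discrete Gaussian over a truncated integer support, assuming only the Python standard library. The tail cut tau and the parameters are illustrative.

      import math
      import random

      def sample_discrete_gaussian(sigma, center=0.0, tau=10):
          """Rejection-sample an integer x with probability proportional to
          exp(-(x - center)^2 / (2 sigma^2)), restricted to |x - center| <= tau*sigma."""
          lo = int(math.floor(center - tau * sigma))
          hi = int(math.ceil(center + tau * sigma))
          while True:
              x = random.randint(lo, hi)                    # uniform proposal on the support
              rho = math.exp(-((x - center) ** 2) / (2.0 * sigma * sigma))
              if random.random() < rho:                     # accept with the Gaussian weight
                  return x

      random.seed(0)
      draws = [sample_discrete_gaussian(sigma=3.2) for _ in range(10_000)]
      print("empirical mean ~ 0:", round(sum(draws) / len(draws), 3))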

  20. Practical continuous-variable quantum key distribution without finite sampling bandwidth effects.

    Science.gov (United States)

    Li, Huasheng; Wang, Chao; Huang, Peng; Huang, Duan; Wang, Tao; Zeng, Guihua

    2016-09-05

    In a practical continuous-variable quantum key distribution system, the finite sampling bandwidth of the analog-to-digital converter employed at the receiver's side may lead to inaccurate results of pulse peak sampling, and hence to errors in parameter estimation. Subsequently, the system performance decreases and security loopholes are exposed to eavesdroppers. In this paper, we propose a novel data acquisition scheme which consists of two parts, i.e., a dynamic delay adjusting module and a statistical power feedback-control algorithm. The proposed scheme may dramatically improve the data acquisition precision of pulse peak sampling and remove the finite sampling bandwidth effects. Moreover, the optimal peak sampling position of a pulse signal can be dynamically calibrated by monitoring the change of the statistical power of the sampled data in the proposed scheme. This helps to resist some practical attacks, such as the well-known local oscillator calibration attack.

  1. Media Exposure: How Models Simplify Sampling

    DEFF Research Database (Denmark)

    Mortensen, Peter Stendahl

    1998-01-01

    In media planning, the distribution of exposures to more ad spots in more media (print, TV, radio) is crucial to the evaluation of the campaign. If such information had to be sampled, it would only be possible in expensive panel studies (e.g. TV-meter panels). Alternatively, the distribution … of exposures may be modelled statistically, using the Beta distribution combined with the Binomial distribution. Examples are given.
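
    A minimal sketch of the Beta-Binomial exposure model, assuming SciPy; the parameters are invented for illustration. Individual exposure probabilities vary across the audience according to a Beta(a, b) law, and the number of exposures out of n spots then follows the Beta-Binomial distribution.

      from math import comb
      from scipy.special import beta as B

      def beta_binomial_pmf(k, n, a, b):
          """P(k exposures out of n spot insertions) when individual exposure
          probabilities follow a Beta(a, b) distribution across the audience."""
          return comb(n, k) * B(k + a, n - k + b) / B(a, b)

      n_spots = 10                      # spots in the campaign
      a, b = 1.2, 2.8                   # illustrative Beta parameters (mean reach ~0.3)
      dist = [beta_binomial_pmf(k, n_spots, a, b) for k in range(n_spots + 1)]
      for k, p in enumerate(dist):
          print(f"P(exactly {k:2d} exposures) = {p:.4f}")
      print("check, probabilities sum to:", round(sum(dist), 6))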

  2. Local Information as a Resource in Distributed Quantum Systems

    Science.gov (United States)

    Horodecki, Michał; Horodecki, Karol; Horodecki, Paweł; Horodecki, Ryszard; Oppenheim, Jonathan; Sende, Aditi; Sen, Ujjwal

    2003-03-01

    A new paradigm for distributed quantum systems where information is a valuable resource is developed. After finding a unique measure for information, we construct a scheme for its manipulation in analogy with entanglement theory. In this scheme, instead of maximally entangled states, two parties distill local states. We show that, surprisingly, the main tools of entanglement theory are general enough to work in this opposite scheme. Up to plausible assumptions, we show that the amount of information that must be lost during the protocol of concentration of local information can be expressed as the relative entropy distance from some special set of states.

  3. Global information sampling in the honey bee

    Science.gov (United States)

    Johnson, Brian R.

    2008-06-01

    Central to the question of task allocation in social insects is how workers acquire information. Patrolling is a curious behavior in which bees meander over the face of the comb inspecting cells. Several authors have suggested it allows bees to collect global information, but this has never been formally evaluated. This study explores this hypothesis by answering three questions. First, do bees gather information in a consistent manner as they patrol? Second, do they move far enough to get a sense of task demand in distant areas of the nest? And third, is patrolling a commonly performed task? Focal animal observations were used to address the first two predictions, while a scan sampling study was used to address the third. The results were affirmative for each question. While patrolling, workers collected information by performing periodic clusters of cell inspections. Patrolling bees not only traveled far enough to frequently change work zone; they often visited every part of the nest. Finally, the majority of the bees in the middle-age caste were shown to move throughout the nest over the course of a few hours in a manner suggestive of patrolling. Global information collection is contrary to much current theory, which assumes that workers respond to local information only. This study thus highlights the nonmutually exclusive nature of various information collection regimes in social insects.

  4. Learning to merge search results for efficient Distributed Information Retrieval

    NARCIS (Netherlands)

    Tjin-Kam-Jet, Kien; Hiemstra, Djoerd

    2010-01-01

    Merging search results from different servers is a major problem in Distributed Information Retrieval. We used Regression-SVM and Ranking-SVM which would learn a function that merges results based on information that is readily available: i.e. the ranks, titles, summaries and URLs contained in the

  5. Different goodness of fit tests for Rayleigh distribution in ranked set sampling

    Directory of Open Access Journals (Sweden)

    Amer Al-Omari

    2016-03-01

    In this paper, different goodness of fit tests for the Rayleigh distribution are considered, based on simple random sampling (SRS) and ranked set sampling (RSS) techniques. The performance of the suggested tests is evaluated in terms of their power using Monte Carlo simulation. It is found that the suggested RSS tests perform better than their SRS counterparts.
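
    A hedged sketch of how a balanced ranked set sample is drawn, assuming NumPy and perfect ranking; the goodness-of-fit statistics themselves are not reproduced. From each set of m units only one order statistic is retained, which is what gives RSS its efficiency edge over an SRS of the same size.

      import numpy as np

      def ranked_set_sample(population, set_size, cycles, rng):
          """Balanced ranked set sample: in each cycle, draw set_size sets of
          set_size units, rank each set (here by the true values, i.e. perfect
          ranking), and keep the i-th order statistic from the i-th set."""
          rss = []
          for _ in range(cycles):
              for i in range(set_size):
                  candidates = rng.choice(population, size=set_size, replace=False)
                  rss.append(np.sort(candidates)[i])
          return np.array(rss)

      rng = np.random.default_rng(7)
      sigma = 2.0
      population = rng.rayleigh(scale=sigma, size=100_000)    # Rayleigh parent, as in the paper
      rss = ranked_set_sample(population, set_size=4, cycles=250, rng=rng)
      srs = rng.choice(population, size=rss.size, replace=False)
      print("RSS mean:", rss.mean().round(3), " SRS mean:", srs.mean().round(3),
            " true mean:", round(sigma * np.sqrt(np.pi / 2), 3))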

  6. Smoothing the redshift distributions of random samples for the baryon acoustic oscillations: applications to the SDSS-III BOSS DR12 and QPM mock samples

    Science.gov (United States)

    Wang, Shao-Jiang; Guo, Qi; Cai, Rong-Gen

    2017-12-01

    We investigate the impact of different redshift distributions of random samples on the baryon acoustic oscillation (BAO) measurements of D_V(z) r_d^fid/r_d from the two-point correlation functions of galaxies in Data Release 12 of the Baryon Oscillation Spectroscopic Survey (BOSS). Big surveys, such as BOSS, usually assign redshifts to the random samples by randomly drawing values from the measured redshift distributions of the data, which necessarily introduces fiducial signals of fluctuations into the random samples and weakens the BAO signals if the cosmic variance cannot be ignored. We propose a smooth function of the redshift distribution that fits the data well to populate the random galaxy samples. The resulting cosmological parameters match the input parameters of the mock catalogue very well. The significance of the BAO signals is improved by 0.33σ for a low-redshift sample and by 0.03σ for a constant-stellar-mass sample, though the absolute values do not change significantly. Given the precision of current measurements of cosmological parameters, such improvements would be valuable for future measurements of galaxy clustering.

  7. Determination and optimization of spatial samples for distributed measurements.

    Energy Technology Data Exchange (ETDEWEB)

    Huo, Xiaoming (Georgia Institute of Technology, Atlanta, GA); Tran, Hy D.; Shilling, Katherine Meghan; Kim, Heeyong (Georgia Institute of Technology, Atlanta, GA)

    2010-10-01

    There are no accepted standards for determining how many measurements to take during part inspection or where to take them, or for assessing confidence in the evaluation of acceptance based on these measurements. The goal of this work was to develop a standard method for determining the number of measurements, together with the spatial distribution of measurements and the associated risks for false acceptance and false rejection. Two paths have been taken to create a standard method for selecting sampling points. A wavelet-based model has been developed to select measurement points and to determine confidence in the measurement after the points are taken. An adaptive sampling strategy has been studied to determine implementation feasibility on commercial measurement equipment. Results using both real and simulated data are presented for each of the paths.

  8. Mapping species distributions with MAXENT using a geographically biased sample of presence data: a performance assessment of methods for correcting sampling bias.

    Science.gov (United States)

    Fourcade, Yoan; Engler, Jan O; Rödder, Dennis; Secondi, Jean

    2014-01-01

    MAXENT is now a common species distribution modeling (SDM) tool used by conservation practitioners for predicting the distribution of a species from a set of records and environmental predictors. However, datasets of species occurrence used to train the model are often biased in the geographical space because of unequal sampling effort across the study area. This bias may be a source of strong inaccuracy in the resulting model and could lead to incorrect predictions. Although a number of sampling bias correction methods have been proposed, there is no consensual guideline to account for it. We compared here the performance of five methods of bias correction on three datasets of species occurrence: one "virtual" derived from a land cover map, and two actual datasets for a turtle (Chrysemys picta) and a salamander (Plethodon cylindraceus). We subjected these datasets to four types of sampling biases corresponding to potential types of empirical biases. We applied five correction methods to the biased samples and compared the outputs of distribution models to unbiased datasets to assess the overall correction performance of each method. The results revealed that the ability of methods to correct the initial sampling bias varied greatly depending on bias type, bias intensity and species. However, the simple systematic sampling of records consistently ranked among the best performing across the range of conditions tested, whereas other methods performed more poorly in most cases. The strong effect of initial conditions on correction performance highlights the need for further research to develop a step-by-step guideline to account for sampling bias. However, this method seems to be the most efficient in correcting sampling bias and should be advised in most cases.
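
    One common way to implement the 'systematic sampling of records' correction, shown here as an assumption rather than the authors' exact procedure: overlay a regular grid and keep at most one occurrence per cell, which evens out spatially clustered sampling effort. Coordinates, cell size, and the toy data are illustrative.

      import numpy as np

      def systematic_thin(lon, lat, cell_deg=0.5, rng=None):
          """Keep at most one occurrence record per cell of a regular grid,
          which evens out spatially clustered sampling effort."""
          rng = rng or np.random.default_rng(0)
          cells = np.stack([np.floor(lon / cell_deg), np.floor(lat / cell_deg)], axis=1)
          kept = []
          for cell in np.unique(cells, axis=0):
              idx = np.flatnonzero((cells == cell).all(axis=1))
              kept.append(rng.choice(idx))          # one record per occupied cell
          return np.sort(np.array(kept))

      rng = np.random.default_rng(11)
      # clustered pseudo-occurrences: dense near one locality, sparse elsewhere
      lon = np.concatenate([rng.normal(-75.0, 0.2, 400), rng.uniform(-80, -70, 100)])
      lat = np.concatenate([rng.normal(42.0, 0.2, 400), rng.uniform(38, 46, 100)])
      keep = systematic_thin(lon, lat, cell_deg=0.5, rng=rng)
      print(f"kept {keep.size} of {lon.size} records after systematic thinning")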

  9. The behavior of Metropolis-coupled Markov chains when sampling rugged phylogenetic distributions.

    Science.gov (United States)

    Brown, Jeremy M; Thomson, Robert C

    2018-02-15

    Bayesian phylogenetic inference involves sampling from posterior distributions of trees, which sometimes exhibit local optima, or peaks, separated by regions of low posterior density. Markov chain Monte Carlo (MCMC) algorithms are the most widely used numerical method for generating samples from these posterior distributions, but they are susceptible to entrapment on individual optima in rugged distributions when they are unable to easily cross through or jump across regions of low posterior density. Ruggedness of posterior distributions can result from a variety of factors, including unmodeled variation in evolutionary processes and unrecognized variation in the true topology across sites or genes. Ruggedness can also become exaggerated when constraints are placed on topologies that require the presence or absence of particular bipartitions (often referred to as positive or negative constraints, respectively). These types of constraints are frequently employed when conducting tests of topological hypotheses (Bergsten et al. 2013; Brown and Thomson 2017). Negative constraints can lead to particularly rugged distributions when the data strongly support a forbidden clade, because monophyly of the clade can be disrupted by inserting outgroup taxa in many different ways. However, topological moves between the alternative disruptions are very difficult, because they require swaps between the inserted outgroup taxa while the data constrain taxa from the forbidden clade to remain close together on the tree. While this precise form of ruggedness is particular to negative constraints, trees with high posterior density can be separated by similarly complicated topological rearrangements, even in the absence of constraints.
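
    A generic Metropolis-coupled MCMC skeleton on a toy bimodal target (not a phylogenetic sampler) illustrates the mechanism discussed above: heated chains flatten the target so they can cross low-density valleys, and occasional state swaps feed those jumps back to the cold chain. Temperatures, step size, and the target are illustrative.

      import numpy as np

      def log_target(x):
          """Toy rugged target: mixture of two well-separated normal modes."""
          return np.logaddexp(-0.5 * (x + 4.0) ** 2, -0.5 * (x - 4.0) ** 2)

      def mc3(n_iter=20_000, temps=(1.0, 0.5, 0.25, 0.1), step=1.0, seed=0):
          rng = np.random.default_rng(seed)
          x = np.zeros(len(temps))                      # one state per chain
          cold_samples = []
          for _ in range(n_iter):
              # Metropolis update within each chain at its own temperature (power beta)
              for i, beta in enumerate(temps):
                  prop = x[i] + step * rng.standard_normal()
                  if np.log(rng.random()) < beta * (log_target(prop) - log_target(x[i])):
                      x[i] = prop
              # propose swapping the states of two adjacent chains
              i = rng.integers(0, len(temps) - 1)
              log_r = (temps[i] - temps[i + 1]) * (log_target(x[i + 1]) - log_target(x[i]))
              if np.log(rng.random()) < log_r:
                  x[i], x[i + 1] = x[i + 1], x[i]
              cold_samples.append(x[0])                 # only the cold chain is kept
          return np.array(cold_samples)

      samples = mc3()
      print("fraction of cold-chain samples in each mode:",
            np.mean(samples < 0).round(2), np.mean(samples > 0).round(2))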

  10. Agent paradigm and services technology for distributed Information Sources

    Directory of Open Access Journals (Sweden)

    Hakima Mellah

    2011-10-01

    The complexity of information arises from interacting information sources (IS) and could be better exploited with respect to the relevance of information. In a distributed IS system, relevant information has content that is connected with other content in the information network and is used for a certain purpose. The key point of the proposed model is its contribution to information system agility according to a three-dimensional view involving content, use and structure. This reflects the relevance of information complexity and of effective methodologies, based on the self-organisation principle, for managing that complexity. This contribution is primarily focused on presenting the factors that lead to and trigger self-organisation in a Service Oriented Architecture (SOA), and on how a self-organisation mechanism can be integrated into it.

  11. An Object-Oriented Information Model for Policy-based Management of Distributed Applications

    NARCIS (Netherlands)

    Diaz, G.; Gay, V.C.J.; Horlait, E.; Hamza, M.H.

    2002-01-01

    This paper presents an object-oriented information model to support policy-based management for distributed multimedia applications. The information base contains application-level information about the users, the applications, and their profile. Our information model is described in detail and

  12. Analysis of stationary power/amplitude distributions for multiple channels of sampled FBGs.

    Science.gov (United States)

    Xing, Ya; Zou, Xihua; Pan, Wei; Yan, Lianshan; Luo, Bin; Shao, Liyang

    2015-08-10

    Stationary power/amplitude distributions along the grating length for multiple channels of a sampled fiber Bragg grating (SFBG) are analyzed. Unlike a uniform FBG, the SFBG has multiple channels in the reflection spectrum rather than a single channel. The stationary power/amplitude distributions for these multiple channels are therefore analyzed using two different theoretical models. In the first model, the SFBG is regarded as a set of grating sections and non-grating sections stacked alternately; a step-like distribution is obtained for the power/amplitude of each channel along the grating length. In the second model, the SFBG is decomposed into multiple uniform "ghost" gratings, and a continuous distribution is obtained for each ghost grating (i.e., each channel). After a comparison, the distributions obtained with the two models are shown to be identical, and the equivalence between the two models is demonstrated. In addition, the impacts of the duty cycle on the power/amplitude distributions of the multiple channels of the SFBG are presented.

  13. An improved sampling method of complex network

    Science.gov (United States)

    Gao, Qi; Ding, Xintong; Pan, Feng; Li, Weixing

    2014-12-01

    Sampling a subnet is an important topic of complex network research, and the sampling method influences the structure and characteristics of the subnet. Random multiple snowball with Cohen (RMSC) process sampling, which combines the advantages of random sampling and snowball sampling, is proposed in this paper. It has the ability to explore global information and discover local structure at the same time. The experiments indicate that this novel sampling method can preserve the similarity between the sampled subnet and the original network in terms of degree distribution, connectivity rate and average shortest path. The method is applicable to situations where prior knowledge about the degree distribution of the original network is not sufficient.
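
    The RMSC algorithm itself is not reproduced; the sketch below shows the plain snowball component it builds on, assuming an adjacency-dictionary representation of the network: breadth-first expansion from several randomly chosen seeds until the target subnet size is reached.

      import random
      from collections import deque

      def snowball_sample(adj, target_size, n_seeds=3, seed=0):
          """Breadth-first snowball sampling from several random seed nodes,
          stopping once target_size nodes have been collected."""
          rng = random.Random(seed)
          sampled = set()
          queue = deque(rng.sample(list(adj), n_seeds))
          while queue and len(sampled) < target_size:
              v = queue.popleft()
              if v in sampled:
                  continue
              sampled.add(v)
              queue.extend(u for u in adj[v] if u not in sampled)
          return sampled

      # toy ring-with-chords network as an adjacency dict
      N = 1000
      adj = {i: {(i - 1) % N, (i + 1) % N, (i * 7) % N} for i in range(N)}
      sub = snowball_sample(adj, target_size=200)
      print("sampled", len(sub), "of", N, "nodes")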

  14. Symmetric Blind Information Reconciliation for Quantum Key Distribution

    International Nuclear Information System (INIS)

    Kiktenko, Evgeniy O.

    2017-01-01

    Quantum key distribution (QKD) is a quantum-proof key-exchange scheme which is fast approaching the communication industry. An essential component in QKD is the information reconciliation step, which is used for correcting the quantum-channel noise errors. The recently suggested blind-reconciliation technique, based on low-density parity-check codes, offers remarkable prospects for efficient information reconciliation without an a priori quantum bit error rate estimation. We suggest an improvement of the blind-information-reconciliation protocol promoting a significant increase in the efficiency of the procedure and reducing its interactivity. The proposed technique is based on introducing symmetry into the operations of the parties and on using the results of unsuccessful belief-propagation decodings.

  15. How does observation uncertainty influence which stream water samples are most informative for model calibration?

    Science.gov (United States)

    Wang, Ling; van Meerveld, Ilja; Seibert, Jan

    2016-04-01

    Streamflow isotope samples taken during rainfall-runoff events are very useful for multi-criteria model calibration because they can help decrease parameter uncertainty and improve internal model consistency. However, the number of samples that can be collected and analysed is often restricted by practical and financial constraints. It is, therefore, important to choose an appropriate sampling strategy and to obtain samples that have the highest information content for model calibration. We used the Birkenes hydrochemical model and synthetic rainfall, streamflow and isotope data to explore which samples are most informative for model calibration. Starting with error-free observations, we investigated how many samples are needed to obtain a certain model fit. Based on different parameter sets, representing different catchments, and different rainfall events, we also determined which sampling times provide the most informative data for model calibration. Our results show that simulation performance for models calibrated with the isotopic data from two intelligently selected samples was comparable to simulations based on isotopic data for all 100 time steps. The models calibrated with the intelligently selected samples also performed better than the model calibrations with two benchmark sampling strategies (random selection and selection based on hydrologic information). Surprisingly, samples on the rising limb and at the peak were less informative than expected and, generally, samples taken at the end of the event were most informative. The timing of the most informative samples depends on the proportion of different flow components (baseflow, slow response flow, fast response flow and overflow). For events dominated by baseflow and slow response flow, samples taken at the end of the event after the fast response flow has ended were most informative; when the fast response flow was dominant, samples taken near the peak were most informative. However when overflow

  16. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung [Hanyang Univ., Seoul (Korea, Republic of); Noh, Jae Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    Uncertainty evaluation with the statistical method is performed by repeating transport calculations with directly perturbed nuclear data; a reliable uncertainty result can therefore be obtained by analyzing the results of numerous transport calculations. One known problem of the statistical approach is that sampling cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors such as negative cross-section samples. Several correction methods have been noted; however, they can distort the distribution of the sampled cross sections. In this study, a sampling method for the nuclear data based on the lognormal distribution is proposed to increase the sampling accuracy without negative sampling errors, and a stochastic cross-section sampling and writing program was developed. Criticality calculations with the sampled nuclear data are performed, and the results are compared with those obtained from the normal distribution conventionally used in previous studies. For the sensitivity and uncertainty analysis, cross sections were sampled from both the normal and lognormal distributions, and the uncertainties caused by the covariance of the (n,γ) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution efficiently solves the negative sampling problem reported in previous studies. This study is expected to contribute to increasing the accuracy of sampling-based uncertainty analysis.
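
    A minimal sketch of the lognormal sampling idea, assuming each cross section is described by a mean and standard deviation taken from the covariance data (the helper name and moment-matching approach are illustrative, not the program developed in the study):

```python
import numpy as np

def sample_cross_section_lognormal(mean, std, size, rng=None):
    """Draw strictly positive cross-section samples whose first two moments
    match the given mean and standard deviation (moment-matched lognormal)."""
    rng = np.random.default_rng() if rng is None else rng
    sigma2 = np.log(1.0 + (std / mean) ** 2)   # lognormal shape parameter
    mu = np.log(mean) - 0.5 * sigma2           # lognormal location parameter
    return rng.lognormal(mean=mu, sigma=np.sqrt(sigma2), size=size)

# A cross section of 2.0 b with a 50% relative standard deviation never
# yields negative samples, unlike direct normal sampling.
print(sample_cross_section_lognormal(2.0, 1.0, size=5))
```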

  17. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    International Nuclear Information System (INIS)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung; Noh, Jae Man

    2013-01-01

    Uncertainty evaluation with the statistical method is performed by repeating transport calculations with directly perturbed nuclear data; a reliable uncertainty result can therefore be obtained by analyzing the results of numerous transport calculations. One known problem of the statistical approach is that sampling cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors such as negative cross-section samples. Several correction methods have been noted; however, they can distort the distribution of the sampled cross sections. In this study, a sampling method for the nuclear data based on the lognormal distribution is proposed to increase the sampling accuracy without negative sampling errors, and a stochastic cross-section sampling and writing program was developed. Criticality calculations with the sampled nuclear data are performed, and the results are compared with those obtained from the normal distribution conventionally used in previous studies. For the sensitivity and uncertainty analysis, cross sections were sampled from both the normal and lognormal distributions, and the uncertainties caused by the covariance of the (n,γ) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution efficiently solves the negative sampling problem reported in previous studies. This study is expected to contribute to increasing the accuracy of sampling-based uncertainty analysis.

  18. A general algorithm for distributing information in a graph

    OpenAIRE

    Aji, Srinivas M.; McEliece, Robert J.

    1997-01-01

    We present a general “message-passing” algorithm for distributing information in a graph. This algorithm may help us to understand the approximate correctness of both the Gallager-Tanner-Wiberg algorithm, and the turbo-decoding algorithm.

  19. An Empirical Investigation of the Effects of Nonnormality upon the Sampling Distribution of the Product Moment Correlation Coefficient.

    Science.gov (United States)

    HJELM, HOWARD; NORRIS, RAYMOND C.

    The study empirically determined the effects of nonnormality upon some sampling distributions of the product moment correlation coefficient (PMCC). Sampling distributions of the PMCC were obtained by drawing numerous samples from control and experimental populations having various degrees of nonnormality and by calculating correlation coefficients…

  20. Forecasting an invasive species’ distribution with global distribution data, local data, and physiological information

    Science.gov (United States)

    Jarnevich, Catherine S.; Young, Nicholas E.; Talbert, Marian; Talbert, Colin

    2018-01-01

    Understanding invasive species distributions and potential invasions often requires broad‐scale information on the environmental tolerances of the species. Further, resource managers are often faced with knowing these broad‐scale relationships as well as nuanced environmental factors related to their landscape that influence where an invasive species occurs and potentially could occur. Using invasive buffelgrass (Cenchrus ciliaris), we developed global models and local models for Saguaro National Park, Arizona, USA, based on location records and literature on physiological tolerances to environmental factors to investigate whether environmental relationships of a species at a global scale are also important at local scales. In addition to correlative models with five commonly used algorithms, we also developed a model using a priori user‐defined relationships between occurrence and environmental characteristics based on a literature review. All correlative models at both scales performed well based on statistical evaluations. The user‐defined curves closely matched those produced by the correlative models, indicating that the correlative models may be capturing mechanisms driving the distribution of buffelgrass. Given climate projections for the region, both global and local models indicate that conditions at Saguaro National Park may become more suitable for buffelgrass. Combining global and local data with correlative models and physiological information provided a holistic approach to forecasting invasive species distributions.

  1. Micro/Nano-scale Strain Distribution Measurement from Sampling Moiré Fringes.

    Science.gov (United States)

    Wang, Qinghua; Ri, Shien; Tsuda, Hiroshi

    2017-05-23

    This work describes the measurement procedure and principles of a sampling moiré technique for full-field micro/nano-scale deformation measurements. The developed technique can be performed in two ways: using the reconstructed multiplication moiré method or the spatial phase-shifting sampling moiré method. When the specimen grid pitch is around 2 pixels, 2-pixel sampling moiré fringes are generated to reconstruct a multiplication moiré pattern for a deformation measurement. Both the displacement and strain sensitivities are twice as high as in the traditional scanning moiré method in the same wide field of view. When the specimen grid pitch is around or greater than 3 pixels, multi-pixel sampling moiré fringes are generated, and a spatial phase-shifting technique is combined for a full-field deformation measurement. The strain measurement accuracy is significantly improved, and automatic batch measurement is easily achievable. Both methods can measure the two-dimensional (2D) strain distributions from a single-shot grid image without rotating the specimen or scanning lines, as in traditional moiré techniques. As examples, the 2D displacement and strain distributions, including the shear strains of two carbon fiber-reinforced plastic specimens, were measured in three-point bending tests. The proposed technique is expected to play an important role in the non-destructive quantitative evaluations of mechanical properties, crack occurrences, and residual stresses of a variety of materials.

  2. Distribution of {sup 90}Sr activities in the environmental radiation samples of Jeju Island, Korea

    Energy Technology Data Exchange (ETDEWEB)

    Han, Chung Hun; Park, Youn Hyun; Lee, Young Gyu; Park, Jae Woo [Jeju National University, Jeju (Korea, Republic of)

    2016-12-15

    This work aimed to obtain information on {sup 90}Sr contamination of the environment using soil and moss samples from selected areas of Jeju Island, Korea. The activities of {sup 90}Sr in soil and moss samples were investigated at nine locations on Jeju Island. The soil samples were collected at four sites on Jeju Island from June to August 2013 and analyzed for the vertical distribution of {sup 90}Sr activities. The moss samples were collected at five sites on Jeju Island from November 2011 to June 2012 and analyzed for radioactive {sup 90}Sr. The vertical {sup 90}Sr concentrations in the investigated soil samples were 2.77 to 18.24 Bq·kg{sup -1} in the eastern part, 1.69 to 18.27 Bq·kg{sup -1} in the northern part, 3.76 to 13.46 Bq·kg{sup -1} in the western part and 1.09 to 8.70 Bq·kg{sup -1} in the southern part of Mt. Halla on Jeju Island, respectively. The {sup 90}Sr activities show the highest values in the surface soil and decrease with depth. The activity concentration measured in moss was in the range of 79.6 to 363 Bq·kg{sup -1} dry moss. These data are expected to serve as a baseline reference for surveys of environmental radioactivity on Jeju Island.

  3. Constructing Common Information Space across Distributed Emergency Medical Teams

    DEFF Research Database (Denmark)

    Zhang, Zhan; Sarcevic, Aleksandra; Bossen, Claus

    2017-01-01

    This paper examines coordination and real-time information sharing across four emergency medical teams in a high-risk and distributed setting as they provide care to critically injured patients within the first hour after injury. Through multiple field studies we explored how common understanding of critical patient data is established across these heterogeneous teams and what coordination mechanisms are being used to support information sharing and interpretation. To analyze the data, we drew on the concept of Common Information Spaces (CIS). Our results showed that teams faced many challenges in achieving efficient information sharing and coordination, including difficulties in locating and assembling team members, communicating and interpreting information from the field, and accommodating differences in team goals and information needs, all while having minimal technology support. We reflect...

  4. Multirobot autonomous landmine detection using distributed multisensor information aggregation

    Science.gov (United States)

    Jumadinova, Janyl; Dasgupta, Prithviraj

    2012-06-01

    We consider the problem of distributed sensor information fusion by multiple autonomous robots within the context of landmine detection. We assume that different landmines can be composed of different types of material and robots are equipped with different types of sensors, while each robot has only one type of landmine detection sensor on it. We introduce a novel technique that uses a market-based information aggregation mechanism called a prediction market. Each robot is provided with a software agent that uses sensory input of the robot and performs calculations of the prediction market technique. The result of the agent's calculations is a 'belief' representing the confidence of the agent in identifying the object as a landmine. The beliefs from different robots are aggregated by the market mechanism and passed on to a decision maker agent. The decision maker agent uses this aggregate belief information about a potential landmine and makes decisions about which other robots should be deployed to its location, so that the landmine can be confirmed rapidly and accurately. Our experimental results show that, for identical data distributions and settings, using our prediction market-based information aggregation technique increases the accuracy of object classification favorably as compared to two other commonly used techniques.

  5. Improving Statistics Education through Simulations: The Case of the Sampling Distribution.

    Science.gov (United States)

    Earley, Mark A.

    This paper presents a summary of action research investigating statistics students' understandings of the sampling distribution of the mean. With four sections of an introductory Statistics in Education course (n=98 students), a computer simulation activity (R. delMas, J. Garfield, and B. Chance, 1999) was implemented and evaluated to show…
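
    A hedged sketch of the kind of simulation such a classroom activity rests on, with an arbitrary skewed population and sample sizes chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
population = rng.exponential(scale=2.0, size=100_000)   # a skewed "population"

def sampling_distribution_of_mean(pop, n, reps=5_000):
    """Draw `reps` random samples of size `n` and return their means."""
    return np.array([rng.choice(pop, size=n, replace=True).mean()
                     for _ in range(reps)])

for n in (5, 30, 100):
    means = sampling_distribution_of_mean(population, n)
    print(f"n={n:>3}: mean of sample means = {means.mean():.3f}, "
          f"standard error = {means.std(ddof=1):.3f}")
```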

  6. Particle Sampling and Real Time Size Distribution Measurement in H2/O2/TEOS Diffusion Flame

    International Nuclear Information System (INIS)

    Ahn, K.H.; Jung, C.H.; Choi, M.; Lee, J.S.

    2001-01-01

    Growth characteristics of silica particles have been studied experimentally using an in situ particle sampling technique in an H2/O2/Tetraethylorthosilicate (TEOS) diffusion flame with a carefully devised sampling probe. Particle morphology and size are compared between particles sampled by the local thermophoretic method from inside the flame and by the electrostatic collector sampling method after the dilution sampling probe. The Transmission Electron Microscope (TEM) image-processed data from these two sampling techniques are compared with Scanning Mobility Particle Sizer (SMPS) measurements, and the TEM image analysis of the two sampling methods showed good agreement with the SMPS measurements. The effects of flame conditions and TEOS flow rates on silica particle size distributions are also investigated using the new particle dilution sampling probe. It is found that the particle size distribution characteristics and morphology are mostly governed by the coagulation and sintering processes in the flame. As the flame temperature increases, coalescence or sintering becomes an important particle growth mechanism that reduces the coagulation process. However, if the flame temperature is not high enough to sinter the aggregated particles, then coagulation is the dominant particle growth mechanism. Under certain flame conditions, secondary particle formation is observed, which results in a bimodal particle size distribution.

  7. Texture side information generation for distributed coding of video-plus-depth

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Raket, Lars Lau; Zamarin, Marco

    2013-01-01

    We consider distributed video coding in a monoview video-plus-depth scenario, aiming at coding textures jointly with their corresponding depth stream. Distributed Video Coding (DVC) is a video coding paradigm in which the complexity is shifted from the encoder to the decoder. The Side Information...... components) is strongly correlated, so the additional depth information may be used to generate more accurate SI for the texture stream, increasing the efficiency of the system. In this paper we propose various methods for accurate texture SI generation, comparing them with other state-of-the-art solutions...

  8. [Monitoring microbiological safety of small systems of water distribution. Comparison of two sampling programs in a town in central Italy].

    Science.gov (United States)

    Papini, Paolo; Faustini, Annunziata; Manganello, Rosa; Borzacchi, Giancarlo; Spera, Domenico; Perucci, Carlo A

    2005-01-01

    To determine the appropriate sampling frequency for small water distribution systems, we carried out two sampling programs to monitor the water distribution system of a town in central Italy between July and September 1992; the assumption of a Poisson distribution implied 4 water samples, while the assumption of a negative binomial distribution implied 21 samples. Coliform organisms were used as indicators of water safety. The network consisted of two pipe rings and two wells fed by the same water source. The number of summer customers varied considerably, from 3,000 to 20,000. The mean density was 2.33 coliforms/100 ml (sd = 5.29) for 21 samples and 3 coliforms/100 ml (sd = 6) for four samples. However, the hypothesis of homogeneity was rejected, and the type II error was larger with four samples (beta = 0.24) than with 21 (beta = 0.05). For this small network, determining the sample size under the heterogeneity hypothesis strengthens the statement that the water is drinkable compared with the homogeneity assumption.

  9. Reinforcing Sampling Distributions through a Randomization-Based Activity for Introducing ANOVA

    Science.gov (United States)

    Taylor, Laura; Doehler, Kirsten

    2015-01-01

    This paper examines the use of a randomization-based activity to introduce the ANOVA F-test to students. The two main goals of this activity are to successfully teach students to comprehend ANOVA F-tests and to increase student comprehension of sampling distributions. Four sections of students in an advanced introductory statistics course…

  10. The distribution and use of information and communication ...

    African Journals Online (AJOL)

    The study determined the distribution and use of Information and Communication Technology (ICT) in teaching and learning in some faculties of the University of Ghana. It specifically looks at the availability of ICT laboratories in the faculties, the purposes for which ICT is used, as well as the challenges in its use. A survey of 300 ...

  11. Community problem-solving framed as a distributed information use environment: bridging research and practice

    Directory of Open Access Journals (Sweden)

    Joan C. Durrance

    2006-01-01

    Introduction. This article results from a qualitative study of (1) information behaviour in community problem-solving framed as a distributed information use environment and (2) approaches used by a best-practice library to anticipate information needs associated with community problem-solving. Method. Several approaches to data collection were used - focus groups, interviews, observation of community and library meetings, and analysis of supporting documents. We focused first on the information behaviour of community groups. Finding that the library supported these activities, we sought to understand its approach. Analysis. Data were coded thematically for both information behaviour concepts and themes germane to problem-solving activity. A grounded theory approach was taken to capture aspects of the library staff's practice. Themes evolved from the data; supporting documentation - reports, articles and library communication - was also coded. Results. The study showed (1) how information use environment components (people, setting, problems, problem resolutions) combine in this distributed information use environment to determine specific information needs and uses; and (2) how the library contributed to the viability of this distributed information use environment. Conclusion. Community problem-solving, here explicated as a distributed IUE, is likely to be seen in multiple communities. The library model presented demonstrates that by reshaping its information practice within the framework of an information use environment, a library can anticipate community information needs as they are generated and where they are most relevant.

  12. Characterization of spatial distribution of Tetranychus urticae in peppermint in California and implication for improving sampling plan.

    Science.gov (United States)

    Rijal, Jhalendra P; Wilson, Rob; Godfrey, Larry D

    2016-02-01

    Twospotted spider mite, Tetranychus urticae Koch, is an important pest of peppermint in California, USA. Spider mite feeding on peppermint leaves causes physiological changes in the plant, which, coupled with favorable environmental conditions, can lead to increased mite infestations. Significant yield loss can occur in the absence of pest monitoring and timely management. Understanding the within-field spatial distribution of T. urticae is critical for the development of a reliable sampling plan. The study reported here aims to characterize the spatial distribution of mite infestation in four commercial peppermint fields in northern California using two spatial techniques, the variogram and Spatial Analysis by Distance IndicEs (SADIE). Variogram analysis revealed strong evidence of a spatially dependent (aggregated) mite population on 13 of 17 sampling dates, and the physical distance of the aggregation reached a maximum of 7 m in peppermint fields. Using SADIE, 11 of 17 sampling dates showed an aggregated distribution pattern of mite infestation. Combining results from the variogram and SADIE analyses, spatial aggregation of T. urticae was evident in all four fields for all 17 sampling dates evaluated. Comparing spatial association using SADIE, ca. 62% of the total sampling pairs showed a positive association of mite spatial distribution patterns between two consecutive sampling dates, which indicates strong spatial and temporal stability of mite infestation in peppermint fields. These results are discussed in relation to the behavior of spider mite distribution within fields and their implications for improving sampling guidelines that are essential for effective pest monitoring and management.

  13. Evaluating sample allocation and effort in detecting population differentiation for discrete and continuously distributed individuals

    Science.gov (United States)

    Erin L. Landguth; Michael K. Schwartz

    2014-01-01

    One of the most pressing issues in spatial genetics concerns sampling. Traditionally, substructure and gene flow are estimated for individuals sampled within discrete populations. Because many species may be continuously distributed across a landscape without discrete boundaries, understanding sampling issues becomes paramount. Given large-scale, geographically broad...

  14. Determinant Factors of Rural Income Distribution with Special Reference to Information and Communication Technology

    Directory of Open Access Journals (Sweden)

    Hamid Sepehrdoust

    2014-06-01

    The aim of this study is to evaluate the impact of information and communication technology development on the economic development and income distribution of rural communities, and to answer the question of whether the development of information and communication technologies in rural areas can improve income distribution in these communities. To this end, data on 30 provinces of the country for 2000-2009 were analyzed using a panel data method. The results support Kuznets' inverted-U theory relating economic growth and income distribution and show that information and communication technology development has improved income distribution and economic justice in the country's rural communities. The negative and significant coefficient (-0.15) for the number of computer users among rural households shows that the development of information and communication technologies in rural areas of the country acts as a factor improving income distribution in these communities. The model estimation also showed a significant and positive effect of urbanization and unemployment on the dependent variable, meaning that with rising unemployment the condition of income distribution worsened in rural communities during the period of study.

  15. Real World Awareness in Distributed Organizations: A View on Informal Processes

    Directory of Open Access Journals (Sweden)

    Eldar Sultanow

    2011-06-01

    Geographically distributed development faces the challenge of maintaining awareness to a far greater extent than locally concentrated development. Awareness denotes the state of being informed, combined with an understanding of the project-related activities, states and relationships of each individual employee within the group as a whole. In complex offices, where social interaction is necessary to distribute and locate information and experts, awareness becomes a concurrent process, which increases the need for easy ways for staff to access this information, deferred or decentralized, in a formalized and problem-oriented way. Although awareness has greatly increased in importance, there is extensive disagreement about how this transparency can be conceptually and technically implemented [1]. This paper introduces a model for visualizing and navigating this information in three tiers using semantic networks, GIS and Web3D.

  16. Nonlinear Spatial Inversion Without Monte Carlo Sampling

    Science.gov (United States)

    Curtis, A.; Nawaz, A.

    2017-12-01

    High-dimensional, nonlinear inverse or inference problems usually have non-unique solutions. The distribution of solutions are described by probability distributions, and these are usually found using Monte Carlo (MC) sampling methods. These take pseudo-random samples of models in parameter space, calculate the probability of each sample given available data and other information, and thus map out high or low probability values of model parameters. However, such methods would converge to the solution only as the number of samples tends to infinity; in practice, MC is found to be slow to converge, convergence is not guaranteed to be achieved in finite time, and detection of convergence requires the use of subjective criteria. We propose a method for Bayesian inversion of categorical variables such as geological facies or rock types in spatial problems, which requires no sampling at all. The method uses a 2-D Hidden Markov Model over a grid of cells, where observations represent localized data constraining the model in each cell. The data in our example application are seismic properties such as P- and S-wave impedances or rock density; our model parameters are the hidden states and represent the geological rock types in each cell. The observations at each location are assumed to depend on the facies at that location only - an assumption referred to as `localized likelihoods'. However, the facies at a location cannot be determined solely by the observation at that location as it also depends on prior information concerning its correlation with the spatial distribution of facies elsewhere. Such prior information is included in the inversion in the form of a training image which represents a conceptual depiction of the distribution of local geologies that might be expected, but other forms of prior information can be used in the method as desired. The method provides direct (pseudo-analytic) estimates of posterior marginal probability distributions over each variable

  17. Industry sector analysis, Mexico: Electric power production and distribution equipment. Export Trade Information

    International Nuclear Information System (INIS)

    Wood, J.S.; Miller, R.W.

    1988-09-01

    The Industry Sector Analyses (I.S.A.) for electric power production and distribution equipment contains statistical and narrative information on projected market demand, end-users, receptivity of Mexican consumers to U.S. products, the competitive situation - Mexican production, total import market, U.S. market position, foreign competition, and competitive factors, and market access - Mexican tariffs, non-tariff barriers, standards, taxes and distribution channels. The I.S.A. provides the United States industry with meaningful information regarding the Mexican market for electric power production and distribution equipment

  18. Accelerated Enveloping Distribution Sampling: Enabling Sampling of Multiple End States while Preserving Local Energy Minima.

    Science.gov (United States)

    Perthold, Jan Walther; Oostenbrink, Chris

    2018-05-17

    Enveloping distribution sampling (EDS) is an efficient approach to calculate multiple free-energy differences from a single molecular dynamics (MD) simulation. However, the construction of an appropriate reference-state Hamiltonian that samples all states efficiently is not straightforward. We propose a novel approach for the construction of the EDS reference-state Hamiltonian, related to a previously described procedure to smoothen energy landscapes. In contrast to previously suggested EDS approaches, our reference-state Hamiltonian preserves local energy minima of the combined end-states. Moreover, we propose an intuitive, robust and efficient parameter optimization scheme to tune EDS Hamiltonian parameters. We demonstrate the proposed method with established and novel test systems and conclude that our approach allows for the automated calculation of multiple free-energy differences from a single simulation. Accelerated EDS promises to be a robust and user-friendly method to compute free-energy differences based on solid statistical mechanics.
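
    For orientation, the conventional EDS reference state combines the end-state energies through a smoothed-minimum (log-sum-exp) expression; the accelerated variant proposed in the paper builds on this form but modifies it to preserve local minima, and that modification is not reproduced in the sketch below (β, the smoothness parameter s and the energy offsets are illustrative inputs):

```python
import numpy as np

def eds_reference_energy(energies, offsets, beta, s=1.0):
    """Conventional EDS reference-state energy
        V_R = -1/(beta*s) * ln( sum_i exp(-beta*s*(V_i - E_i^R)) )
    for end-state energies V_i and offsets E_i^R (log-sum-exp for stability)."""
    e = -beta * s * (np.asarray(energies, dtype=float)
                     - np.asarray(offsets, dtype=float))
    m = e.max()
    return -(m + np.log(np.exp(e - m).sum())) / (beta * s)

# Two end states at 10.0 and 12.0 kJ/mol with offsets 0.0 and 1.5 kJ/mol.
print(eds_reference_energy([10.0, 12.0], [0.0, 1.5], beta=0.4))
```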

  19. Fiscal Year 2001 Tank Characterization Technical Sampling Basis and Waste Information Requirements Document

    International Nuclear Information System (INIS)

    ADAMS, M.R.

    2000-01-01

    The Fiscal Year 2001 Tank Characterization Technical Sampling Basis and Waste Information Requirements Document (TSB-WIRD) has the following purposes: (1) To identify and integrate sampling and analysis needs for fiscal year (FY) 2001 and beyond. (2) To describe the overall drivers that require characterization information and to document their source. (3) To describe the process for identifying, prioritizing, and weighting issues that require characterization information to resolve. (4) To define the method for determining sampling priorities and to present the sampling priorities on a tank-by-tank basis. (5) To define how the characterization program is going to satisfy the drivers, close issues, and report progress. (6) To describe deliverables and acceptance criteria for characterization deliverables.

  20. Distribution of the Determinant of the Sample Correlation Matrix: Monte Carlo Type One Error Rates.

    Science.gov (United States)

    Reddon, John R.; And Others

    1985-01-01

    Computer sampling from a multivariate normal spherical population was used to evaluate the type one error rates for a test of sphericity based on the distribution of the determinant of the sample correlation matrix. (Author/LMO)
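
    A hedged sketch of this type of Monte Carlo study, with arbitrary sample size, dimension and replication count (not the original study's code):

```python
import numpy as np

def null_determinants(n, p, reps=2_000, rng=None):
    """Monte Carlo null distribution of det(R), the determinant of the sample
    correlation matrix, for samples of size n from a spherical p-variate normal."""
    rng = np.random.default_rng() if rng is None else rng
    dets = np.empty(reps)
    for i in range(reps):
        x = rng.standard_normal((n, p))
        dets[i] = np.linalg.det(np.corrcoef(x, rowvar=False))
    return dets

# Empirical 5% critical value: sphericity is rejected when det(R) falls below it.
# Applying the value to fresh simulated data estimates the type one error rate.
crit = np.quantile(null_determinants(n=50, p=4), 0.05)
print(f"empirical 5% critical value for det(R): {crit:.3f}")
```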

  1. Hierarchical species distribution models

    Science.gov (United States)

    Hefley, Trevor J.; Hooten, Mevin B.

    2016-01-01

    Determining the distribution pattern of a species is important to increase scientific knowledge, inform management decisions, and conserve biodiversity. To infer spatial and temporal patterns, species distribution models have been developed for use with many sampling designs and types of data. Recently, it has been shown that count, presence-absence, and presence-only data can be conceptualized as arising from a point process distribution. Therefore, it is important to understand properties of the point process distribution. We examine how the hierarchical species distribution modeling framework has been used to incorporate a wide array of regression and theory-based components while accounting for the data collection process and making use of auxiliary information. The hierarchical modeling framework allows us to demonstrate how several commonly used species distribution models can be derived from the point process distribution, highlight areas of potential overlap between different models, and suggest areas where further research is needed.

  2. Hanford Environmental Information System (HEIS). Volume 7: Sample and Data Tracking subject area

    International Nuclear Information System (INIS)

    1994-06-01

    The Hanford Environmental Information System (HEIS) Sample and Data Tracking subject area allows insertion of tracking information into a central repository where the data is immediately available for viewing. For example, a technical coordinator is able to view the current status of a particular sampling effort, from sample collection to data package validation dates. Four major types of data comprise the Sample and Data Tracking subject area: data about the mechanisms that group a set of samples for a particular sampling effort; data about how constituents are grouped and assigned to a sample; data about when, where, and how samples are sent to a laboratory for analysis; and data about the status of a sample's constituent analysis requirements, i.e., whether the analysis results have been returned from the laboratory.

  3. Distributed database kriging for adaptive sampling (D2KAS)

    International Nuclear Information System (INIS)

    Roehm, Dominic; Pavel, Robert S.; Barros, Kipton; Rouet-Leduc, Bertrand; McPherson, Allen L.; Germann, Timothy C.; Junghans, Christoph

    2015-01-01

    We present an adaptive sampling method supplemented by a distributed database and a prediction method for multiscale simulations using the Heterogeneous Multiscale Method. A finite-volume scheme integrates the macro-scale conservation laws for elastodynamics, which are closed by momentum and energy fluxes evaluated at the micro-scale. In the original approach, molecular dynamics (MD) simulations are launched for every macro-scale volume element. Our adaptive sampling scheme replaces a large fraction of costly micro-scale MD simulations with fast table lookup and prediction. The cloud database Redis provides the plain table lookup, and with locality aware hashing we gather input data for our prediction scheme. For the latter we use kriging, which estimates an unknown value and its uncertainty (error) at a specific location in parameter space by using weighted averages of the neighboring points. We find that our adaptive scheme significantly improves simulation performance by a factor of 2.5 to 25, while retaining high accuracy for various choices of the algorithm parameters
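
    The database and locality-aware hashing layers are not detailed in the abstract; the sketch below shows only a minimal kriging predictor, assuming zero-mean (simple) kriging with a squared-exponential covariance (all names and hyperparameters are illustrative):

```python
import numpy as np

def kriging_predict(x_known, y_known, x_query, length=1.0, sill=1.0, nugget=1e-8):
    """Simple (zero-mean) kriging with a squared-exponential covariance.
    x_known, x_query: arrays of shape (n_points, n_dims). Returns predicted
    means and prediction variances at the query points."""
    x_known = np.asarray(x_known, dtype=float)
    x_query = np.asarray(x_query, dtype=float)

    def cov(a, b):
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
        return sill * np.exp(-0.5 * (d / length) ** 2)

    K = cov(x_known, x_known) + nugget * np.eye(len(x_known))
    k = cov(x_known, x_query)                  # shape (n_known, n_query)
    w = np.linalg.solve(K, k)                  # kriging weights
    mean = w.T @ np.asarray(y_known, dtype=float)
    var = sill - np.einsum("ij,ij->j", k, w)   # prediction uncertainty
    return mean, var

# A large predicted variance signals that the surrogate is unreliable there and
# a new fine-scale (e.g. MD) evaluation should be launched instead of a lookup.
mean, var = kriging_predict([[0.0], [1.0], [2.0]], [0.0, 0.8, 0.9], [[1.5]])
print(mean, var)
```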

  4. Basic distribution free identification tests for small size samples of environmental data

    International Nuclear Information System (INIS)

    Federico, A.G.; Musmeci, F.

    1998-01-01

    Testing two or more data sets for the hypothesis that they are sampled from the same population is often required in environmental data analysis. Typically the available samples have a small number of data, and often the assumption of normal distributions is not realistic. On the other hand, the diffusion of today's powerful personal computers opens new opportunities based on a massive use of CPU resources. The paper reviews the problem and introduces the feasibility of two non-parametric approaches based on the intrinsic equiprobability properties of the data samples. The first is based on full resampling, while the second is based on a bootstrap approach. An easy-to-use program is presented. A case study is given based on the Chernobyl children contamination data.
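
    The paper's exact resampling scheme is not given in the abstract; a generic permutation (full resampling) two-sample test in the same spirit could look like the following sketch, with a difference-in-means statistic chosen purely for illustration:

```python
import numpy as np

def permutation_test(a, b, reps=10_000, rng=None):
    """Distribution-free two-sample test: p-value of the observed difference
    in means under random reassignment of the pooled observations."""
    rng = np.random.default_rng() if rng is None else rng
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    pooled = np.concatenate([a, b])
    observed = abs(a.mean() - b.mean())
    hits = 0
    for _ in range(reps):
        rng.shuffle(pooled)
        hits += abs(pooled[:len(a)].mean() - pooled[len(a):].mean()) >= observed
    return (hits + 1) / (reps + 1)

# Two small environmental samples (made-up numbers for illustration).
print(permutation_test([1.2, 0.9, 1.5, 1.1], [1.8, 2.1, 1.6, 2.4, 1.9]))
```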

  5. Estimating the spatial distribution of a plant disease epidemic from a sample

    Science.gov (United States)

    Sampling is of central importance in plant pathology. It facilitates our understanding of how epidemics develop in space and time and can also be used to inform disease management decisions. Making inferences from a sample is necessary because we rarely have the resources to conduct a complete census.

  6. Evaluation of the information servicing in a distributed learning ...

    African Journals Online (AJOL)

    The authors' main idea is to organize a distributed learning environment (DLE) based on the information and communication resources of the global network in combination with technologies for virtual reality and 3D simulation. For this reason, a conceptual model of the DLE architecture and learning processes is defined, and ...

  7. Correlated Sources in Distributed Networks--Data Transmission, Common Information Characterization and Inferencing

    Science.gov (United States)

    Liu, Wei

    2011-01-01

    Correlation is often present among observations in a distributed system. This thesis deals with various design issues when correlated data are observed at distributed terminals, including: communicating correlated sources over interference channels, characterizing the common information among dependent random variables, and testing the presence of…

  8. Extended Poisson Exponential Distribution

    Directory of Open Access Journals (Sweden)

    Anum Fatima

    2015-09-01

    A new mixture of the Modified Exponential (ME) and Poisson distributions is introduced in this paper. By taking the maximum of Modified Exponential random variables when the sample size follows a zero-truncated Poisson distribution, we derive the new distribution, named the Extended Poisson Exponential distribution. This distribution possesses increasing and decreasing failure rates. The Poisson-Exponential, Modified Exponential and Exponential distributions are special cases of this distribution. We also investigate some mathematical properties of the distribution along with its information entropies and order statistics. The estimation of parameters is obtained using the maximum likelihood estimation procedure. Finally, we illustrate a real data application of our distribution.
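
    A small simulation sketch of the compounding construction, substituting a plain exponential for the paper's Modified Exponential purely for illustration (the rates and helper name are assumptions):

```python
import numpy as np

def sample_poisson_max_exponential(lam, rate, size, rng=None):
    """Simulate X = max(Y_1, ..., Y_N) with N ~ zero-truncated Poisson(lam)
    and Y_i ~ Exponential(rate), illustrating the compounding construction."""
    rng = np.random.default_rng() if rng is None else rng
    out = np.empty(size)
    for i in range(size):
        n = 0
        while n == 0:                          # zero-truncated Poisson draw
            n = rng.poisson(lam)
        out[i] = rng.exponential(1.0 / rate, size=n).max()
    return out

x = sample_poisson_max_exponential(lam=3.0, rate=0.5, size=10_000)
print(x.mean(), x.std())
```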

  9. Information Modeling for Direct Control of Distributed Energy Resources

    DEFF Research Database (Denmark)

    Biegel, Benjamin; Andersen, Palle; Stoustrup, Jakob

    2013-01-01

    We present an architecture for an unbundled, liberalized electricity market system where a virtual power plant (VPP) is able to control a number of distributed energy resources (DERs) directly through a two-way communication link. The aggregator who operates the VPP utilizes the accumulated flexibility of the DERs to deliver a desired accumulated response. In this paper, we design such an information model based on the markets that the aggregator participates in and on the flexibility characteristics of the remote-controlled DERs. The information model is constructed in a modular manner, making the interface suitable...

  10. DAIDS: a Distributed, Agent-based Information Dissemination System

    Directory of Open Access Journals (Sweden)

    Pete Haglich

    2007-10-01

    The Distributed Agent-Based Information Dissemination System (DAIDS) concept was motivated by the need to share information among the members of a military tactical team in an atmosphere of extremely limited or intermittent bandwidth. The DAIDS approach recognizes that in many cases communication limitations will preclude the complete sharing of all tactical information between the members of the tactical team. Communications may be limited by obstructions to the line of sight between platforms, electronic warfare, environmental conditions, or simply contention from other users of that bandwidth. Since a complete information exchange may not be possible, it is important to prioritize transmissions so that the information most critical from the standpoint of the recipient is disseminated first. The challenge is to determine which elements of information are the most important to each teammate. The key innovation of the DAIDS concept is the use of software proxy agents to represent the information needs of the recipient of the information. The DAIDS approach uses these proxy agents to evaluate the content of a message in accordance with the context and information needs of the recipient platform (the agent's principal) and to prioritize the message for dissemination. In our research we implemented this approach and demonstrated that it reduces transmission times for critical tactical reports by up to a factor of 30 under severe bandwidth limitations.

  11. Optimising metadata workflows in a distributed information environment

    OpenAIRE

    Robertson, R. John; Barton, Jane

    2005-01-01

    The different purposes present within a distributed information environment create the potential for repositories to enhance their metadata by capitalising on the diversity of metadata available for any given object. This paper presents three conceptual reference models required to achieve this optimisation of metadata workflow: the ecology of repositories, the object lifecycle model, and the metadata lifecycle model. It suggests a methodology for developing the metadata lifecycle model, and ...

  12. Spatial distribution and sequential sampling plans for Tuta absoluta (Lepidoptera: Gelechiidae) in greenhouse tomato crops.

    Science.gov (United States)

    Cocco, Arturo; Serra, Giuseppe; Lentini, Andrea; Deliperi, Salvatore; Delrio, Gavino

    2015-09-01

    The within- and between-plant distribution of the tomato leafminer, Tuta absoluta (Meyrick), was investigated in order to define action thresholds based on leaf infestation and to propose enumerative and binomial sequential sampling plans for pest management applications in protected crops. The pest spatial distribution was aggregated between plants, and median leaves were the most suitable sample for evaluating pest density. Action thresholds of 36 and 48%, 43 and 56%, and 60 and 73% infested leaves, corresponding to economic thresholds of 1 and 3% damaged fruits, were defined for tomato cultivars with big, medium and small fruits, respectively. Green's method was the more suitable enumerative sampling plan as it required a lower sampling effort. Binomial sampling plans needed lower average sample sizes than enumerative plans to make a treatment decision at fixed probabilities of error. The enumerative sampling plan required 87 or 343 leaves to estimate the population density in extensive or intensive ecological studies, respectively. Binomial plans would be more practical and efficient for control purposes, needing average sample sizes of 17, 20 and 14 leaves to take a pest management decision in order to avoid fruit damage higher than 1% in cultivars with big, medium and small fruits, respectively. © 2014 Society of Chemical Industry.

  13. Using semi-variogram analysis for providing spatially distributed information on soil surface condition for land surface modeling

    Science.gov (United States)

    Croft, Holly; Anderson, Karen; Kuhn, Nikolaus J.

    2010-05-01

    The ability to quantitatively and spatially assess soil surface roughness is important in geomorphology and land degradation studies. Soils can experience rapid structural degradation in response to land cover changes, resulting in increased susceptibility to erosion and a loss of Soil Organic Matter (SOM). Changes in soil surface condition can also alter sediment detachment, transport and deposition processes, infiltration rates and surface runoff characteristics. Deriving spatially distributed quantitative information on soil surface condition for inclusion in hydrological and soil erosion models is therefore paramount. However, due to the time and resources involved in using traditional field sampling techniques, there is a lack of spatially distributed information on soil surface condition. Laser techniques can provide data for a rapid three dimensional representation of the soil surface at a fine spatial resolution. This provides the ability to capture changes at the soil surface associated with aggregate breakdown, flow routing, erosion and sediment re-distribution. Semi-variogram analysis of the laser data can be used to represent spatial dependence within the dataset; providing information about the spatial character of soil surface structure. This experiment details the ability of semi-variogram analysis to spatially describe changes in soil surface condition. Soil for three soil types (silt, silt loam and silty clay) was sieved to produce aggregates between 1 mm and 16 mm in size and placed evenly in sample trays (25 x 20 x 2 cm). Soil samples for each soil type were exposed to five different durations of artificial rainfall, to produce progressively structurally degraded soil states. A calibrated laser profiling instrument was used to measure surface roughness over a central 10 x 10 cm plot of each soil state, at 2 mm sample spacing. The laser data were analysed within a geostatistical framework, where semi-variogram analysis quantitatively represented
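
    A minimal sketch of the empirical semivariogram computation for such a regularly gridded height field (the 2 mm spacing follows the text; the function and the axis-aligned lag handling are illustrative simplifications):

```python
import numpy as np

def empirical_semivariogram(z, max_lag, spacing=2.0):
    """Isotropic empirical semivariogram of a regular grid of surface heights z
    (2-D array): gamma(h) = 0.5 * mean squared difference of pairs at lag h.
    Only axis-aligned pairs are used, which keeps the sketch short."""
    lags, gammas = [], []
    for lag in range(1, max_lag + 1):
        horiz = (z[:, lag:] - z[:, :-lag]).ravel()
        vert = (z[lag:, :] - z[:-lag, :]).ravel()
        diffs = np.concatenate([horiz, vert])
        lags.append(lag * spacing)                 # lag distance in mm
        gammas.append(0.5 * np.mean(diffs ** 2))
    return np.array(lags), np.array(gammas)

# Toy usage with a random 50 x 50 "surface" sampled at 2 mm spacing.
rng = np.random.default_rng(1)
h, g = empirical_semivariogram(rng.standard_normal((50, 50)), max_lag=10)
print(np.column_stack([h, g]))
```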

  14. Fluorescence imaging of ion distributions in an inductively coupled plasma with laser ablation sample introduction

    International Nuclear Information System (INIS)

    Moses, Lance M.; Ellis, Wade C.; Jones, Derick D.; Farnsworth, Paul B.

    2015-01-01

    High-resolution images of the spatial distributions of Sc II, Ca II, and Ba II ion densities in the 10 mm upstream from the sampling cone in a laser ablation-inductively coupled plasma-mass spectrometer (LA-ICP-MS) were obtained using planar laser induced fluorescence. Images were obtained for each analyte as a function of the carrier gas flow rate with laser ablation (LA) sample introduction and compared to images with solution nebulization (SN) over the same range of flow rates. Additionally, images were obtained using LA at varying fluences and with varying amounts of helium added to a constant flow of argon gas. Ion profiles in SN images followed a pattern consistent with previous work: increasing gas flow caused a downstream shift in the ion profiles. When compared to SN, LA led to ion profiles that were much narrower radially and reached a maximum near the sampling cone at higher flow rates. Increasing the fluence led to ions formed in the ICP over greater axial and radial distances. The addition of He to the carrier gas prior to the ablation cell led to an upstream shift in the position of ionization and lower overall fluorescence intensities. - Highlights: • We map distributions of analytes in the ICP using laser ablation sample introduction. • We compare images from laser ablation with those from a pneumatic nebulizer. • We document the effects of water added to the laser ablation aerosol. • We compare distributions from a metal to those from crystalline solids. • We document the effect of laser fluence on ion distributions

  15. Fine-scale tracking and diet information of a marine predator reveals the origin and contrasting spatial distribution of prey

    Science.gov (United States)

    Alonso, Hany; Granadeiro, José P.; Dias, Maria P.; Catry, Teresa; Catry, Paulo

    2018-03-01

    The distribution of many marine organisms is still poorly understood, particularly in oceanic regions. Seabirds, as aerial predators which cover extensive areas across the oceans, can potentially be used to enhance our knowledge on the distribution and abundance of their prey. In this study, we combined tracking data and dietary data from individual Cory's shearwaters Calonectris borealis (n = 68) breeding in Selvagens archipelago, Madeira, Portugal, during the chick-rearing periods of 2011 and 2016, in order to infer prey origin within shearwaters' main foraging areas. The digestion state of each prey item in the diet was assessed and classified; and compared to digestion states from known prey items fed to captive birds. In a novel approach, we combined tracking data with information on the prey digestion duration and data on the transit times from foraging grounds to the colony to estimate the location of prey capture. We found a consistent heterogeneity in prey distribution across four different marine domains: Selvagens, deep-sea, seamounts, and continental shelf. In oceanic areas, the chub mackerel Scomber colias, the main prey of Cory's shearwaters, was strongly associated with seamounts and insular shelves, whereas oceanic species like pilot-fish, flying-squid, flying-fish were clearly associated with deep-sea waters. Sardines Sardina pilchardus, anchovies Engraulis encrasicolus and other coastal species were associated with the African shelf. Prey origin assignment was robust across three different sets of assumptions, and was also supported by information on the digestion state of prey collected over a large independent sampling period (671 samples, collected in 2008-2010). The integration of fine-scale dietary and foraging trip data from marine predators provides a new framework to gain insights into the distribution and abundance of prey species in poorly known oceanic areas.

  16. Analysis Of Educational Services Distribution-Based Geographic Information System GIS

    Directory of Open Access Journals (Sweden)

    Waleed Lagrab

    2015-03-01

    This study analyzes the spatial distribution of kindergarten facilities in the study area based on Geographic Information Systems (GIS) in order to test the efficiency of GIS technology for redistributing existing kindergartens and choosing the best locations in the future, applying the standard criteria for selecting suitable kindergarten locations. To achieve this goal, data and information were collected via interviews and comprehensive statistics on the education facilities in the Mukalla districts in Yemen, which contributed to building a geographic database for the study area. The kindergarten spatial patterns were then analyzed in terms of proximity to each other and to other land uses in the surrounding area such as streets, highways and factories, and the concentration, dispersion, clustering and distribution direction of the kindergartens were measured. The study showed the effectiveness of GIS for spatial data analysis. One of the most important findings is that most of the kindergartens established in Mukalla city did not take into account the criteria set by the authorities. Furthermore, almost every district suffers from a shortage in the number of kindergartens, and the distribution pattern of those kindergartens is dominated by spatial dispersion.

  17. Analysis Of Educational Services Distribution-Based Geographic Information System GIS

    OpenAIRE

    Waleed Lagrab; Noura AKNIN

    2015-01-01

    This study analyzes the spatial distribution of kindergarten facilities in the study area based on Geographic Information Systems (GIS) in order to test the efficiency of GIS technology for redistributing existing kindergartens and choosing the best location in the future, applying the standard criteria for selecting suitable locations for kindergartens. To achieve this goal, the data and information are collected via interviews and comprehensive statistics on the education facil...

  18. Parent-Adolescent Cross-Informant Agreement in Clinically Referred Samples

    DEFF Research Database (Denmark)

    Rescorla, Leslie A; Ewing, Grace; Ivanova, Masha Y

    2017-01-01

    To conduct international comparisons of parent-adolescent cross-informant agreement in clinical samples, we analyzed ratings on the Child Behavior Checklist (CBCL) and Youth Self-Report (YSR) for 6,762 clinically referred adolescents ages 11-18 from 7 societies (M = 14.5 years, SD = 2.0 years; 51...

  19. Bayes allocation of the sample for estimation of the mean when each stratum has a Poisson distribution

    International Nuclear Information System (INIS)

    Wright, T.

    1983-01-01

    Consider a stratified population with L strata, so that a Poisson random variable is associated with each stratum. The parameter associated with the hth stratum is θ_h, h = 1, 2, ..., L. Let ω_h be the known proportion of the population in the hth stratum, h = 1, 2, ..., L. We want to estimate the parameter θ = Σ_{h=1}^{L} ω_h θ_h. We assume that prior information is available on θ_h and that it can be expressed in terms of a gamma distribution with parameters α_h and β_h, h = 1, 2, ..., L. We also assume that the prior distributions are independent. Using a squared error loss function, a Bayes allocation of the total sample size under a cost constraint is given. The Bayes estimate using the Bayes allocation is shown to have an adjusted mean square error which is strictly less than the adjusted mean square error of the classical estimate using the classical allocation.
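
    The Bayes-optimal allocation rule itself is derived in the paper and not reproduced in the abstract; the sketch below shows only the conjugate gamma-Poisson update together with a familiar variance/cost-proportional heuristic standing in for the optimal allocation (both functions are illustrative assumptions):

```python
import numpy as np

def heuristic_allocation(omega, alpha, beta, cost, budget):
    """Allocate sampling effort across strata in proportion to
    omega_h * prior_sd_h / sqrt(cost_h), subject to sum(n_h * cost_h) = budget.
    (A familiar heuristic standing in for the paper's Bayes-optimal rule.)"""
    omega, alpha, beta, cost = map(np.asarray, (omega, alpha, beta, cost))
    prior_sd = np.sqrt(alpha) / beta           # sd of a Gamma(alpha, rate=beta) prior
    w = omega * prior_sd / np.sqrt(cost)
    return budget * w / np.sum(w * cost)       # n_h (continuous, not rounded)

def bayes_estimate(omega, alpha, beta, totals, n):
    """Posterior mean of theta = sum_h omega_h * theta_h after observing
    totals[h] events in n[h] Poisson observations from stratum h."""
    omega, alpha, beta, totals, n = map(np.asarray, (omega, alpha, beta, totals, n))
    post_mean = (alpha + totals) / (beta + n)  # gamma posterior mean per stratum
    return float(np.sum(omega * post_mean))

n_h = heuristic_allocation([0.5, 0.3, 0.2], [2.0, 4.0, 1.0], [1.0, 2.0, 0.5],
                           cost=[1.0, 1.0, 2.0], budget=100.0)
print(n_h)
```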

  20. Word categorization from distributional information: frames confer more than the sum of their (Bigram) parts.

    Science.gov (United States)

    Mintz, Toben H; Wang, Felix Hao; Li, Jia

    2014-12-01

    Grammatical categories, such as noun and verb, are the building blocks of syntactic structure and the components that govern the grammatical patterns of language. However, in many languages words are not explicitly marked with their category information, hence a critical part of acquiring a language is categorizing the words. Computational analyses of child-directed speech have shown that distributional information-information about how words pattern with one another in sentences-could be a useful source of initial category information. Yet questions remain as to whether learners use this kind of information, and if so, what kinds of distributional patterns facilitate categorization. In this paper we investigated how adults exposed to an artificial language use distributional information to categorize words. We compared training situations in which target words occurred in frames (i.e., surrounded by two words that frequently co-occur) against situations in which target words occurred in simpler bigram contexts (where an immediately adjacent word provides the context for categorization). We found that learners categorized words together when they occurred in similar frame contexts, but not when they occurred in similar bigram contexts. These findings are particularly relevant because they accord with computational investigations showing that frame contexts provide accurate category information cross-linguistically. We discuss these findings in the context of prior research on distribution-based categorization and the broader implications for the role of distributional categorization in language acquisition. Copyright © 2014 Elsevier Inc. All rights reserved.
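
    A toy sketch of frame-based grouping (the miniature corpus and boundary markers are invented for illustration and are not the authors' experimental materials):

```python
from collections import defaultdict

def frame_members(sentences):
    """Group words by the frame (preceding word, following word) in which
    they occur; words sharing frames are candidates for the same category."""
    frames = defaultdict(set)
    for sent in sentences:
        tokens = ["<s>"] + sent.split() + ["</s>"]
        for i in range(1, len(tokens) - 1):
            frames[(tokens[i - 1], tokens[i + 1])].add(tokens[i])
    return frames

corpus = ["you want to eat it", "you have to go now", "we want to play here"]
for frame, words in frame_members(corpus).items():
    if len(words) > 1:
        print(frame, sorted(words))
# e.g. the frame ('you', 'to') groups 'want' and 'have' (verbs), and
# ('<s>', 'want') groups 'you' and 'we' (pronouns).
```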

  1. Selective information sampling

    Directory of Open Access Journals (Sweden)

    Peter A. F. Fraser-Mackenzie

    2009-06-01

    Full Text Available This study investigates the amount and valence of information selected during single item evaluation. One hundred and thirty-five participants evaluated a cell phone by reading hypothetical customer reports. Some participants were first asked to provide a preliminary rating based on a picture of the phone and some technical specifications. The participants who were given the customer reports only after they made a preliminary rating exhibited valence bias in their selection of customer reports. In contrast, the participants who did not make an initial rating sought subsequent information in a more balanced, albeit still selective, manner. The preliminary raters used the least amount of information in their final decision, resulting in faster decision times. The study appears to support the notion that selective exposure is utilized in order to develop cognitive coherence.

  2. Distributing Congestion Management System Information Using the World Wide Web

    Science.gov (United States)

    1997-01-01

    The Internet is a unique medium for the distribution of information, and it provides a tremendous opportunity to take advantage of people's innate interest in transportation issues as they relate to their own lives. In particular, the World Wide Web (...

  3. On sampling and modeling complex systems

    International Nuclear Information System (INIS)

    Marsili, Matteo; Mastromatteo, Iacopo; Roudi, Yasser

    2013-01-01

    The study of complex systems is limited by the fact that only a few variables are accessible for modeling and sampling, which are not necessarily the most relevant ones to explain the system behavior. In addition, empirical data typically undersample the space of possible states. We study a generic framework where a complex system is seen as a system of many interacting degrees of freedom, which are known only in part, that optimize a given function. We show that the underlying distribution with respect to the known variables has the Boltzmann form, with a temperature that depends on the number of unknown variables. In particular, when the influence of the unknown degrees of freedom on the known variables is not too irregular, the temperature decreases as the number of variables increases. This suggests that models can be predictable only when the number of relevant variables is less than a critical threshold. Concerning sampling, we argue that the information that a sample contains on the behavior of the system is quantified by the entropy of the frequency with which different states occur. This allows us to characterize the properties of maximally informative samples: within a simple approximation, the most informative frequency size distributions have power law behavior and Zipf’s law emerges at the crossover between the undersampled regime and the regime where the sample contains enough statistics to make inferences on the behavior of the system. These ideas are illustrated in some applications, showing that they can be used to identify relevant variables or to select the most informative representations of data, e.g. in data clustering. (paper)
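
    The central quantity mentioned, the entropy of the frequencies with which sampled states occur, is simple to compute. The Python sketch below (with an invented Zipf-like toy sample; the variable names and sizes are not from the paper) counts how often each observed state appears and evaluates the entropy of that empirical frequency distribution, which is one straightforward reading of the quantity the abstract uses to characterize maximally informative samples.

```python
import numpy as np
from collections import Counter

def state_frequency_entropy(states):
    """Entropy (in nats) of the empirical distribution of observed states."""
    counts = np.array(list(Counter(states).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

# Toy sample: draw states from a Zipf-like distribution over 1000 possible states.
rng = np.random.default_rng(0)
ranks = np.arange(1, 1001)
probs = 1.0 / ranks
probs /= probs.sum()
sample = rng.choice(ranks, size=5000, p=probs)

print("distinct states observed:", len(set(sample)))
print("entropy of state frequencies:", state_frequency_entropy(sample))
```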

  4. A new framework of statistical inferences based on the valid joint sampling distribution of the observed counts in an incomplete contingency table.

    Science.gov (United States)

    Tian, Guo-Liang; Li, Hui-Qiong

    2017-08-01

    Some existing confidence interval methods and hypothesis testing methods in the analysis of a contingency table with incomplete observations in both margins entirely depend on an underlying assumption that the sampling distribution of the observed counts is a product of independent multinomial/binomial distributions for complete and incomplete counts. However, it can be shown that this independence assumption is incorrect and can result in unreliable conclusions because of the underestimation of the uncertainty. Therefore, the first objective of this paper is to derive the valid joint sampling distribution of the observed counts in a contingency table with incomplete observations in both margins. The second objective is to provide a new framework for analyzing incomplete contingency tables based on the derived joint sampling distribution of the observed counts by developing a Fisher scoring algorithm to calculate maximum likelihood estimates of parameters of interest, the bootstrap confidence interval methods, and the bootstrap hypothesis testing methods. We compare the differences between the valid sampling distribution and the sampling distribution under the independence assumption. Simulation studies showed that average/expected confidence-interval widths of parameters based on the sampling distribution under the independence assumption are shorter than those based on the new sampling distribution, yielding unrealistic results. A real data set is analyzed to illustrate the application of the new sampling distribution for incomplete contingency tables and the analysis results again confirm the conclusions obtained from the simulation studies.

  5. Theory of choice in bandit, information sampling and foraging tasks.

    Science.gov (United States)

    Averbeck, Bruno B

    2015-03-01

    Decision making has been studied with a wide array of tasks. Here we examine the theoretical structure of bandit, information sampling and foraging tasks. These tasks move beyond tasks where the choice in the current trial does not affect future expected rewards. We have modeled these tasks using Markov decision processes (MDPs). MDPs provide a general framework for modeling tasks in which decisions affect the information on which future choices will be made. Under the assumption that agents are maximizing expected rewards, MDPs provide normative solutions. We find that all three classes of tasks pose choices among actions which trade off immediate and future expected rewards. The tasks drive these trade-offs in unique ways, however. For bandit and information sampling tasks, increasing uncertainty or the time horizon shifts value to actions that pay off in the future. Correspondingly, decreasing uncertainty increases the relative value of actions that pay off immediately. For foraging tasks the time horizon plays the dominant role, as choices do not affect future uncertainty in these tasks.
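
    As a concrete illustration of how a longer horizon shifts value toward information-gathering actions, the following Python sketch (a toy two-armed Bernoulli bandit, not a model taken from the paper) solves the finite-horizon MDP by dynamic programming over the Beta posterior of an unknown arm, comparing it against a known arm with a fixed payoff probability. The prior, the known arm's probability and the horizons are invented for illustration.

```python
from functools import lru_cache

P_KNOWN = 0.5  # payoff probability of the known reference arm (assumed)

@lru_cache(maxsize=None)
def value(alpha, beta, horizon):
    """Optimal expected total reward with `horizon` pulls remaining,
    when the unknown arm has a Beta(alpha, beta) posterior."""
    if horizon == 0:
        return 0.0
    p = alpha / (alpha + beta)                             # posterior mean of unknown arm
    v_known = P_KNOWN + value(alpha, beta, horizon - 1)    # no learning occurs
    v_unknown = (p * (1.0 + value(alpha + 1, beta, horizon - 1))
                 + (1.0 - p) * value(alpha, beta + 1, horizon - 1))
    return max(v_known, v_unknown)

def prefers_unknown(alpha, beta, horizon):
    """True if pulling the uncertain arm first is (weakly) optimal."""
    p = alpha / (alpha + beta)
    v_known = P_KNOWN + value(alpha, beta, horizon - 1)
    v_unknown = (p * (1.0 + value(alpha + 1, beta, horizon - 1))
                 + (1.0 - p) * value(alpha, beta + 1, horizon - 1))
    return v_unknown >= v_known

# A slightly pessimistic posterior (mean 0.4 < 0.5): exploring the uncertain arm
# only pays when enough trials remain to exploit what is learned.
for h in (1, 5, 20, 100):
    print("horizon", h, "-> pull unknown arm first:", prefers_unknown(alpha=2, beta=3, horizon=h))
```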

  6. 17 CFR 242.603 - Distribution, consolidation, and display of information with respect to quotations for and...

    Science.gov (United States)

    2010-04-01

    ..., and display of information with respect to quotations for and transactions in NMS stocks. 242.603... with respect to quotations for and transactions in NMS stocks. (a) Distribution of information. (1) Any... source, that distributes information with respect to quotations for or transactions in an NMS stock to a...

  7. Distributed Information and Control system reliability enhancement by fog-computing concept application

    Science.gov (United States)

    Melnik, E. V.; Klimenko, A. B.; Ivanov, D. Ya

    2018-03-01

    The paper focuses on the reliability of information and control systems. The authors propose a new complex approach to information and control system reliability enhancement based on elements of the fog-computing concept. The proposed approach consists of a set of optimization problems to be solved: estimating the computational complexity that can be shifted to the edge of the network and the fog layer, distributing computations among the data processing elements, and distributing computations among the sensors. These problems, together with some simulation results and discussion, are formulated and presented in this paper.

  8. Two-dimensional T2 distribution mapping in rock core plugs with optimal k-space sampling.

    Science.gov (United States)

    Xiao, Dan; Balcom, Bruce J

    2012-07-01

    Spin-echo single point imaging has been employed for 1D T(2) distribution mapping, but a simple extension to 2D is challenging since the time increase is n-fold, where n is the number of pixels in the second dimension. Nevertheless, 2D T(2) mapping in fluid saturated rock core plugs is highly desirable because the bedding plane structure in rocks often results in different pore properties within the sample. The acquisition time can be reduced by undersampling k-space. The cylindrical shape of rock core plugs yields well defined intensity distributions in k-space that may be efficiently determined by new k-space sampling patterns that are developed in this work. These patterns acquire 22.2% and 11.7% of the k-space data points. Companion density images may be employed, in a keyhole imaging sense, to improve image quality. T(2) weighted images are fit to extract T(2) distributions, pixel by pixel, employing an inverse Laplace transform. Images reconstructed with compressed sensing, with similar acceleration factors, are also presented. The results show that restricted k-space sampling, in this application, provides high quality results. Copyright © 2012 Elsevier Inc. All rights reserved.
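
    The pixel-by-pixel step of turning a T2-weighted decay into a T2 distribution is an inverse Laplace transform, commonly posed as a regularized non-negative least-squares problem. A minimal Python sketch of that step for a single pixel is given below; the synthetic decay data, the assumed grid of T2 values and the Tikhonov regularization are illustrative choices, not details taken from the paper.

```python
import numpy as np
from scipy.optimize import nnls

def t2_distribution(echo_times, signal, t2_grid, reg=0.01):
    """Fit signal(t) = sum_j f_j * exp(-t / T2_j), with f_j >= 0,
    via Tikhonov-regularized non-negative least squares."""
    A = np.exp(-echo_times[:, None] / t2_grid[None, :])
    A_reg = np.vstack([A, reg * np.eye(len(t2_grid))])      # damp the solution
    b_reg = np.concatenate([signal, np.zeros(len(t2_grid))])
    f, _ = nnls(A_reg, b_reg)
    return f

# Synthetic single-pixel decay: two water populations with T2 of 10 ms and 100 ms.
echo_times = np.arange(1, 201, dtype=float)            # ms
true = 0.7 * np.exp(-echo_times / 10.0) + 0.3 * np.exp(-echo_times / 100.0)
rng = np.random.default_rng(1)
signal = true + rng.normal(scale=0.005, size=true.size)

t2_grid = np.logspace(0, 3, 64)                         # 1 ms to 1000 ms
f = t2_distribution(echo_times, signal, t2_grid)
print("T2 values (ms) with appreciable weight:", t2_grid[f > 0.05])
```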

  9. Performance of local information-based link prediction: a sampling perspective

    Science.gov (United States)

    Zhao, Jichang; Feng, Xu; Dong, Li; Liang, Xiao; Xu, Ke

    2012-08-01

    Link prediction is pervasively employed to uncover the missing links in the snapshots of real-world networks, which are usually obtained through different kinds of sampling methods. In the previous literature, in order to evaluate the performance of the prediction, known edges in the sampled snapshot are divided into the training set and the probe set randomly, without considering the underlying sampling approaches. However, different sampling methods might lead to different missing links, especially for the biased ways. For this reason, random partition-based evaluation of performance is no longer convincing if we take the sampling method into account. In this paper, we try to re-evaluate the performance of local information-based link predictions through sampling-method-governed division of the training set and the probe set. Interestingly, we find that each prediction approach performs unevenly across different sampling methods. Moreover, most of these predictions perform weakly when the sampling method is biased, which indicates that the performance of these methods might have been overestimated in prior works.
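
    To make the evaluation setup concrete, the sketch below (plain Python on a toy graph; the biased sampling rule and the common-neighbours score are illustrative choices, not the paper's exact protocols) compares a probe set formed by random edge removal against one formed by a degree-biased "sampler" that tends to miss edges incident on low-degree nodes, using the standard AUC statistic.

```python
import random
from itertools import combinations

def common_neighbours(adj, u, v):
    return len(adj[u] & adj[v])

def auc(adj, probe, non_edges, n_trials=2000):
    """Probability that a probe (missing) edge scores higher than a random non-edge."""
    hits = 0.0
    for _ in range(n_trials):
        u, v = random.choice(probe)
        x, y = random.choice(non_edges)
        s_probe, s_non = common_neighbours(adj, u, v), common_neighbours(adj, x, y)
        hits += 1.0 if s_probe > s_non else 0.5 if s_probe == s_non else 0.0
    return hits / n_trials

# Toy graph: a ring of 30 nodes plus random chords.
random.seed(0)
nodes = range(30)
edges = {(i, (i + 1) % 30) for i in nodes} | {tuple(sorted(random.sample(nodes, 2))) for _ in range(40)}
edges = {tuple(sorted(e)) for e in edges if e[0] != e[1]}

# Probe set 1: uniformly random removal (the usual random partition).
probe_random = random.sample(sorted(edges), 15)
# Probe set 2: edges a biased sampler misses, here those touching low-degree nodes.
degree = {n: sum(1 for e in edges if n in e) for n in nodes}
probe_biased = sorted(edges, key=lambda e: degree[e[0]] + degree[e[1]])[:15]

def adjacency(es):
    adj = {n: set() for n in nodes}
    for u, v in es:
        adj[u].add(v); adj[v].add(u)
    return adj

non_edges = [p for p in combinations(nodes, 2) if p not in edges]
print("AUC, random probe :", auc(adjacency(edges - set(probe_random)), probe_random, non_edges))
print("AUC, biased probe :", auc(adjacency(edges - set(probe_biased)), probe_biased, non_edges))
```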

  10. Distributed and dynamic intracellular organization of extracellular information.

    Science.gov (United States)

    Granados, Alejandro A; Pietsch, Julian M J; Cepeda-Humerez, Sarah A; Farquhar, Iseabail L; Tkačik, Gašper; Swain, Peter S

    2018-06-05

    Although cells respond specifically to environments, how environmental identity is encoded intracellularly is not understood. Here, we study this organization of information in budding yeast by estimating the mutual information between environmental transitions and the dynamics of nuclear translocation for 10 transcription factors. Our method of estimation is general, scalable, and based on decoding from single cells. The dynamics of the transcription factors are necessary to encode the highest amounts of extracellular information, and we show that information is transduced through two channels: Generalists (Msn2/4, Tod6 and Dot6, Maf1, and Sfp1) can encode the nature of multiple stresses, but only if stress is high; specialists (Hog1, Yap1, and Mig1/2) encode one particular stress, but do so more quickly and for a wider range of magnitudes. In particular, Dot6 encodes almost as much information as Msn2, the master regulator of the environmental stress response. Each transcription factor reports differently, and it is only their collective behavior that distinguishes between multiple environmental states. Changes in the dynamics of the localization of transcription factors thus constitute a precise, distributed internal representation of extracellular change. We predict that such multidimensional representations are common in cellular decision-making.

  11. A method for ion distribution function evaluation using escaping neutral atom kinetic energy samples

    International Nuclear Information System (INIS)

    Goncharov, P.R.; Ozaki, T.; Veshchev, E.A.; Sudo, S.

    2008-01-01

    A reliable method to evaluate the probability density function for escaping atom kinetic energies is required for the analysis of neutral particle diagnostic data used to study the fast ion distribution function in fusion plasmas. Digital processing of solid state detector signals is proposed in this paper as an improvement of the simple histogram approach. The probability density function for kinetic energies of neutral particles escaping from the plasma has been derived in a general form taking into account the plasma ion energy distribution, electron capture and loss rates, superposition along the diagnostic sight line and the magnetic surface geometry. A pseudorandom number generator has been realized that enables a sample of escaping neutral particle energies to be simulated for given plasma parameters and experimental conditions. An empirical probability density estimation code has been developed and tested to reconstruct the probability density function from simulated samples, assuming Maxwellian and classical slowing down plasma ion energy distribution shapes for different temperatures and different slowing down times. The application of the developed probability density estimation code to the analysis of experimental data obtained by the novel Angular-Resolved Multi-Sightline Neutral Particle Analyzer has been studied to obtain the suprathermal particle distributions. The optimum bandwidth parameter selection algorithm has also been realized. (author)
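
    The final step described, estimating a smooth probability density from a finite sample of escaping-particle energies with a data-driven bandwidth, can be sketched in a few lines of Python. The Maxwellian test sample and the use of Gaussian kernels with Scott's rule are illustrative assumptions; the paper's own bandwidth-selection algorithm is not reproduced here.

```python
import numpy as np
from scipy.stats import gaussian_kde, maxwell

# Simulated sample of escaping-particle energies (arbitrary units):
# speeds drawn from a Maxwellian, converted to kinetic energies E ~ v^2 / 2.
rng = np.random.default_rng(2)
speeds = maxwell.rvs(scale=1.0, size=2000, random_state=rng)
energies = 0.5 * speeds**2

# Kernel density estimate with an automatic (Scott's rule) bandwidth.
kde = gaussian_kde(energies)          # bw_method defaults to Scott's rule
grid = np.linspace(0.0, energies.max(), 200)
density = kde(grid)

print("estimated mode of the energy distribution:", grid[np.argmax(density)])
```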

  12. Multi-UAV Doppler Information Fusion for Target Tracking Based on Distributed High Degrees Information Filters

    Directory of Open Access Journals (Sweden)

    Hamza Benzerrouk

    2018-03-01

    Full Text Available Multi-Unmanned Aerial Vehicle (UAV) Doppler-based target tracking has not been widely investigated, specifically when using modern nonlinear information filters. A high-degree Gauss–Hermite information filter, as well as a seventh-degree cubature information filter (CIF), is developed to improve the fifth-degree and third-degree CIFs proposed in the most recent related literature. These algorithms are applied to maneuvering target tracking based on Radar Doppler range/range rate signals. To achieve this purpose, different measurement models such as range-only, range rate, and bearing-only tracking are used in the simulations. In this paper, the mobile sensor target tracking problem is addressed and solved by a higher-degree class of quadrature information filters (HQIFs). A centralized fusion architecture based on distributed information filtering is proposed and yields excellent results. Three high dynamic UAVs are simulated with synchronized Doppler measurements broadcast in parallel channels to the control center for global information fusion. Interesting results are obtained, demonstrating the superiority of certain classes of higher-degree quadrature information filters.
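
    The fusion step that makes the information form attractive for multi-sensor problems is simply additive: each platform contributes an information matrix H^T R^{-1} H and an information vector H^T R^{-1} z. The Python sketch below shows that centralized additive fusion for a static linear-Gaussian toy case; the geometry, noise levels and three-sensor setup are invented for illustration, and the nonlinear high-degree quadrature filters of the paper are not implemented here.

```python
import numpy as np

def information_contribution(H, R, z):
    """Per-sensor contribution to the information matrix and vector."""
    Rinv = np.linalg.inv(R)
    return H.T @ Rinv @ H, H.T @ Rinv @ z

# Prior on a 2-D target position, expressed in information form.
x_prior = np.array([0.0, 0.0])
P_prior = np.diag([10.0, 10.0])
Y = np.linalg.inv(P_prior)          # prior information matrix
y = Y @ x_prior                     # prior information vector

# Three "UAVs", each measuring one linear combination of the position.
sensors = [
    (np.array([[1.0, 0.0]]), np.array([[0.5]]), np.array([2.1])),
    (np.array([[0.0, 1.0]]), np.array([[0.5]]), np.array([-0.9])),
    (np.array([[1.0, 1.0]]), np.array([[1.0]]), np.array([1.3])),
]

for H, R, z in sensors:             # centralized fusion is a plain sum
    dY, dy = information_contribution(H, R, z)
    Y, y = Y + dY, y + dy

P_post = np.linalg.inv(Y)
x_post = P_post @ y
print("fused position estimate:", x_post)
```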

  13. A Two-Level Cache for Distributed Information Retrieval in Search Engines

    Directory of Open Access Journals (Sweden)

    Weizhe Zhang

    2013-01-01

    Full Text Available To improve the performance of distributed information retrieval in search engines, we propose a two-level cache structure based on queries from the users' logs. We extract the highest-ranked queries of users for the static cache, in which the queries are the most popular. We adopt the dynamic cache as an auxiliary to optimize the distribution of the cache data, and we propose a distribution strategy for the cache data. The experiments show that the two-level cache has advantages in hit rate, efficiency, and time consumption compared with other cache structures.

  14. A two-level cache for distributed information retrieval in search engines.

    Science.gov (United States)

    Zhang, Weizhe; He, Hui; Ye, Jianwei

    2013-01-01

    To improve the performance of distributed information retrieval in search engines, we propose a two-level cache structure based on queries from the users' logs. We extract the highest-ranked queries of users for the static cache, in which the queries are the most popular. We adopt the dynamic cache as an auxiliary to optimize the distribution of the cache data, and we propose a distribution strategy for the cache data. The experiments show that the two-level cache has advantages in hit rate, efficiency, and time consumption compared with other cache structures.

  15. 77 FR 38323 - Proposed Extension of Existing Information Collection; Respirable Coal Mine Dust Sampling

    Science.gov (United States)

    2012-06-27

    ... Information Collection; Respirable Coal Mine Dust Sampling AGENCY: Mine Safety and Health Administration... Sampling'' to more accurately reflect the type of information that is collected. Chronic exposure to... dust levels since 1970 and, consequently, the prevalence rate of black lung among coal miners, severe...

  16. A Monte Carlo Metropolis-Hastings Algorithm for Sampling from Distributions with Intractable Normalizing Constants

    KAUST Repository

    Liang, Faming; Jin, Ick-Hoon

    2013-01-01

    Simulating from distributions with intractable normalizing constants has been a long-standing problem in machine learning. In this letter, we propose a new algorithm, the Monte Carlo Metropolis-Hastings (MCMH) algorithm, for tackling this problem. The MCMH algorithm is a Monte Carlo version of the Metropolis-Hastings algorithm. It replaces the unknown normalizing constant ratio by a Monte Carlo estimate in simulations, while still converging, as shown in the letter, to the desired target distribution under mild conditions. The MCMH algorithm is illustrated with spatial autologistic models and exponential random graph models. Unlike other auxiliary variable Markov chain Monte Carlo (MCMC) algorithms, such as the Møller and exchange algorithms, the MCMH algorithm avoids the requirement for perfect sampling, and thus can be applied to many statistical models for which perfect sampling is not available or very expensive. The MCMH algorithm can also be applied to Bayesian inference for random effect models and missing data problems that involve simulations from a distribution with intractable integrals. © 2013 Massachusetts Institute of Technology.
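
    The key move in MCMH, replacing the unknown normalizing-constant ratio in the Metropolis-Hastings acceptance probability with a Monte Carlo estimate built from draws at the current parameter value, can be illustrated on a toy exponential-family model where exact sampling happens to be easy. The model, flat prior, data sizes and tuning constants below are invented for illustration; this is not the autologistic or ERGM example of the letter.

```python
import numpy as np

rng = np.random.default_rng(3)
K = 10                                   # support {0, ..., K} of the toy model
xs = np.arange(K + 1)

def sample_model(theta, m):
    """Exact draws from p(x | theta) proportional to exp(theta * x); stands in for
    the auxiliary simulations MCMH requires from the unnormalized model."""
    w = np.exp(theta * xs - theta * xs.max())   # stabilized weights
    w /= w.sum()
    return rng.choice(xs, size=m, p=w)

# Synthetic data generated at theta = 0.4, with a flat prior on theta.
data = sample_model(0.4, 200)
sx, n = data.sum(), data.size

def mcmh(n_iter=5000, m_aux=200, step=0.2):
    theta, chain = 0.0, []
    for _ in range(n_iter):
        prop = theta + rng.normal(scale=step)
        aux = sample_model(theta, m_aux)                     # draws at the CURRENT theta
        ratio_Z = np.mean(np.exp((prop - theta) * aux))      # MC estimate of Z(prop)/Z(theta)
        log_alpha = (prop - theta) * sx - n * np.log(ratio_Z)
        if np.log(rng.uniform()) < log_alpha:
            theta = prop
        chain.append(theta)
    return np.array(chain)

chain = mcmh()
print("posterior mean of theta:", chain[1000:].mean())
```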

  17. A Monte Carlo Metropolis-Hastings Algorithm for Sampling from Distributions with Intractable Normalizing Constants

    KAUST Repository

    Liang, Faming

    2013-08-01

    Simulating from distributions with intractable normalizing constants has been a long-standing problem in machine learning. In this letter, we propose a new algorithm, the Monte Carlo Metropolis-Hastings (MCMH) algorithm, for tackling this problem. The MCMH algorithm is a Monte Carlo version of the Metropolis-Hastings algorithm. It replaces the unknown normalizing constant ratio by a Monte Carlo estimate in simulations, while still converging, as shown in the letter, to the desired target distribution under mild conditions. The MCMH algorithm is illustrated with spatial autologistic models and exponential random graph models. Unlike other auxiliary variable Markov chain Monte Carlo (MCMC) algorithms, such as the Møller and exchange algorithms, the MCMH algorithm avoids the requirement for perfect sampling, and thus can be applied to many statistical models for which perfect sampling is not available or very expensive. The MCMH algorithm can also be applied to Bayesian inference for random effect models and missing data problems that involve simulations from a distribution with intractable integrals. © 2013 Massachusetts Institute of Technology.

  18. Development and deployment of a low-cost, mobile-ready, air quality sensor system: progress toward distributed networks and autonomous aerial sampling

    Science.gov (United States)

    Hersey, S. P.; DiVerdi, R.; Gadtaula, P.; Sheneman, T.; Flores, K.; Chen, Y. H.; Jayne, J. T.; Cross, E. S.

    2017-12-01

    Throughout the 2016-2017 academic year, a new partnership between Olin College of Engineering and Aerodyne Research, Inc. developed an affordable, self-contained air quality monitoring instrument called Modulair. The Modulair instrument is based on the same operating principles as Aerodyne's newly-developed ARISense integrated sensor system, employing electrochemical sensors for gas-phase measurements of CO, NO, NO2, and O3 and an off-the-shelf optical particle counter for particle concentration, number, and size distribution information (0.4 backend with a mobile, cloud-based data management system for real-time data posting and analysis. Open source tools and software were utilized in the development of the instrument. All initial work was completed by a team of undergraduate students as part of the Senior Capstone Program in Engineering (SCOPE) at Olin College. Deployment strategies for Modulair include distributed, mobile measurements and drone-based aerial sampling. Design goals for the drone integration include maximizing airborne sampling time and laying the foundation for software integration with the drone's autopilot system to allow for autonomous plume sampling across concentration gradients. Modulair and its flexible deployments enable real-time mapping of air quality data at exposure-relevant spatial scales, as well as regular, autonomous characterization of sources and dispersion of atmospheric pollutants. We will present an overview of the Modulair instrument and results from benchtop and field validation, including mobile and drone-based plume sampling in the Boston area.

  19. A QUANTITATIVE EVALUATION OF THE WATER DISTRIBUTION IN A SOIL SAMPLE USING NEUTRON IMAGING

    Directory of Open Access Journals (Sweden)

    Jan Šácha

    2016-10-01

    Full Text Available This paper presents an empirical method, recently proposed by Kang et al., for correcting two-dimensional neutron radiography for water quantification in soil. The method was tested on data from neutron imaging of water infiltration in a soil sample. The raw data were affected by neutron scattering and by beam hardening artefacts. Two strategies for identifying the correction parameters are proposed in this paper. The method has been further developed for the case of three-dimensional neutron tomography. In a related experiment, neutron imaging is used to record ponded-infiltration experiments in two artificial soil samples. Radiograms, i.e., two-dimensional projections of the sample, were acquired during infiltration. The amount of water and its distribution within the radiograms were calculated in the form of two-dimensional water thickness maps. Tomograms were reconstructed from the corrected and uncorrected water thickness maps to obtain the 3D spatial distribution of the water content within the sample. Without the correction, the beam hardening and scattering effects overestimated the water content values close to the perimeter of the sample, and at the same time underestimated the values close to the centre of the sample. The total water content of the entire sample was the same in both cases. The empirical correction method presented in this study is a relatively accurate, rapid and simple way to obtain quantitatively determined water content from two-dimensional and three-dimensional neutron images. However, an independent method for measuring the total water volume in the sample is needed in order to identify the correction parameters.

  20. Bayesian Estimation of Two-Parameter Weibull Distribution Using Extension of Jeffreys' Prior Information with Three Loss Functions

    Directory of Open Access Journals (Sweden)

    Chris Bambey Guure

    2012-01-01

    Full Text Available The Weibull distribution has been observed to be one of the most useful distributions for modelling and analysing lifetime data in engineering, biology, and other fields. Studies have been carried out vigorously in the literature to determine the best method for estimating its parameters. Recently, much attention has been given to the Bayesian estimation approach, which is in contention with other estimation methods. In this paper, we examine the performance of the maximum likelihood estimator and the Bayesian estimator using an extension of Jeffreys' prior information with three loss functions, namely the linear exponential loss, the general entropy loss, and the squared error loss function, for estimating the two-parameter Weibull failure time distribution. These methods are compared using the mean square error through a simulation study with varying sample sizes. The results show that the Bayesian estimator using the extension of Jeffreys' prior under the linear exponential loss function in most cases gives the smallest mean square error and absolute bias for both the scale parameter α and the shape parameter β, for the given values of the extension of Jeffreys' prior.
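
    To make the role of the loss function concrete, the short Python sketch below draws a posterior sample for the Weibull parameters by random-walk Metropolis under a simple vague prior (an assumption; the extension of Jeffreys' prior used in the paper is not reproduced) and then reads off point estimates under squared error, linear exponential (LINEX) and general entropy losses, alongside the maximum likelihood estimate.

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(4)
data = weibull_min.rvs(c=1.5, scale=2.0, size=50, random_state=rng)   # shape beta=1.5, scale alpha=2

def log_post(beta, alpha):
    """Weibull log-likelihood plus a vague log-prior proportional to 1/(alpha*beta) (assumed)."""
    if beta <= 0 or alpha <= 0:
        return -np.inf
    return np.sum(weibull_min.logpdf(data, c=beta, scale=alpha)) - np.log(alpha) - np.log(beta)

# Random-walk Metropolis on (beta, alpha).
theta, lp, chain = np.array([1.0, 1.0]), log_post(1.0, 1.0), []
for _ in range(20000):
    prop = theta + rng.normal(scale=0.1, size=2)
    lp_prop = log_post(*prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)
beta_s, alpha_s = np.array(chain)[5000:].T      # discard burn-in

def linex(draws, a=0.5):
    return -np.log(np.mean(np.exp(-a * draws))) / a

def gen_entropy(draws, c=0.5):
    return np.mean(draws ** (-c)) ** (-1.0 / c)

beta_mle, _, alpha_mle = weibull_min.fit(data, floc=0)
print("MLE:             beta=%.3f alpha=%.3f" % (beta_mle, alpha_mle))
print("posterior mean:  beta=%.3f alpha=%.3f" % (beta_s.mean(), alpha_s.mean()))
print("LINEX:           beta=%.3f alpha=%.3f" % (linex(beta_s), linex(alpha_s)))
print("general entropy: beta=%.3f alpha=%.3f" % (gen_entropy(beta_s), gen_entropy(alpha_s)))
```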

  1. Wealth of information derivable from Evaporation Residue (ER) angular momentum distributions

    International Nuclear Information System (INIS)

    Madhavan, N.

    2016-01-01

    Understanding fusion-fission dynamics is possible by studying the fission process, or, alternatively, by studying the complementary fusion-evaporation process. Though the latter method is difficult to implement, requiring sophisticated recoil separators/spectrometers for selecting the ERs in the direction of the primary beam, it provides more clarity with better accuracy and is indispensable for probing the pre-saddle region in heavy nuclei. Super Heavy Element (SHE) search crucially depends on understanding the fusion-fission process, the choice of entrance channel and excitation energy of the Compound Nucleus (CN), the ER cross-section and, more importantly, the angular momenta populated in ERs which survive fission. The measurement of ER angular momentum distributions, through a coincidence technique involving a large gamma multiplicity detector array and a recoil separator, yields a wealth of information, such as nuclear viscosity effects, limits of stability of ERs, shape changes at high spins, a snapshot of a frozen set of barriers from a single-shot experiment, and indirect information about the onset of quasi-fission processes. There is a paucity of experimental data on angular momentum distributions in heavy nuclei due to experimental constraints. In this talk, the variety of information which can be derived from experimental ER angular momentum distributions will be elaborated with examples from work carried out at IUAC using advanced experimental facilities. (author)

  2. Wireless Technology Recognition Based on RSSI Distribution at Sub-Nyquist Sampling Rate for Constrained Devices.

    Science.gov (United States)

    Liu, Wei; Kulin, Merima; Kazaz, Tarik; Shahid, Adnan; Moerman, Ingrid; De Poorter, Eli

    2017-09-12

    Driven by the fast growth of wireless communication, the trend of sharing spectrum among heterogeneous technologies becomes increasingly dominant. Identifying concurrent technologies is an important step towards efficient spectrum sharing. However, due to the complexity of recognition algorithms and the strict condition of sampling speed, communication systems capable of recognizing signals other than their own type are extremely rare. This work proves that the multi-modal distribution of the received signal strength indicator (RSSI) is related to the signals' modulation schemes and medium access mechanisms, and RSSI from different technologies may exhibit highly distinctive features. A distinction is made between technologies with a streaming or a non-streaming property, and appropriate feature spaces can be established either by deriving parameters such as packet duration from RSSI or directly using RSSI's probability distribution. An experimental study shows that even RSSI acquired at a sub-Nyquist sampling rate is able to provide sufficient features to differentiate technologies such as Wi-Fi, Long Term Evolution (LTE), Digital Video Broadcasting-Terrestrial (DVB-T) and Bluetooth. The usage of the RSSI distribution-based feature space is illustrated via a sample algorithm. Experimental evaluation indicates that more than 92% accuracy is achieved with the appropriate configuration. As the analysis of RSSI distribution is straightforward and less demanding in terms of system requirements, we believe it is highly valuable for recognition of wideband technologies on constrained devices in the context of dynamic spectrum access.
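
    A minimal version of the distribution-based idea, building a feature vector from the RSSI histogram of a trace and assigning new traces to the closest known technology profile, is sketched below in Python. The synthetic bursty/streaming RSSI traces and the nearest-centroid rule are purely illustrative assumptions, not the classifiers or measurement data used in the paper.

```python
import numpy as np

rng = np.random.default_rng(5)
BINS = np.linspace(-100, -20, 33)          # dBm histogram bins (assumed range)

def rssi_histogram(trace):
    """Normalized RSSI histogram used as the feature vector."""
    h, _ = np.histogram(trace, bins=BINS)
    return h / h.sum()

def make_trace(kind, n=2000):
    """Synthetic traces: a bursty (bimodal) technology vs. a streaming (unimodal) one."""
    if kind == "bursty":
        on = rng.random(n) < 0.3             # packets present only part of the time
        return np.where(on, rng.normal(-55, 3, n), rng.normal(-92, 2, n))
    return rng.normal(-70, 4, n)             # continuously transmitting technology

# Per-technology reference profiles: centroids of training histograms.
profiles = {k: np.mean([rssi_histogram(make_trace(k)) for _ in range(20)], axis=0)
            for k in ("bursty", "streaming")}

def classify(trace):
    f = rssi_histogram(trace)
    return min(profiles, key=lambda k: np.linalg.norm(f - profiles[k]))

tests = [("bursty", make_trace("bursty")) for _ in range(10)] + \
        [("streaming", make_trace("streaming")) for _ in range(10)]
accuracy = np.mean([classify(t) == label for label, t in tests])
print("classification accuracy on synthetic traces:", accuracy)
```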

  3. Assessing Understanding of Sampling Distributions and Differences in Learning amongst Different Learning Styles

    Science.gov (United States)

    Beeman, Jennifer Leigh Sloan

    2013-01-01

    Research has found that students successfully complete an introductory course in statistics without fully comprehending the underlying theory or being able to exhibit statistical reasoning. This is particularly true for understanding the sampling distribution of the mean, a crucial concept for statistical inference. This study…

  4. Sampling theorem for geometric moment determination and its application to a laser beam position detector.

    Science.gov (United States)

    Loce, R P; Jodoin, R E

    1990-09-10

    Using the tools of Fourier analysis, a sampling requirement is derived that assures that sufficient information is contained within the samples of a distribution to accurately calculate geometric moments of that distribution. The derivation follows the standard textbook derivation of the Whittaker-Shannon sampling theorem, which is used for reconstruction, but further insight leads to a coarser minimum sampling interval for moment determination. The need for fewer samples to determine moments agrees with intuition, since less information should be required to determine a characteristic of a distribution than to reconstruct the distribution itself. A formula for calculation of the moments from these samples is also derived. A numerical analysis is performed to quantify the accuracy of the calculated first moment for practical nonideal sampling conditions. The theory is applied to a high speed laser beam position detector, which uses the normalized first moment to measure raster line positional accuracy in a laser printer. The effects of the laser irradiance profile, sampling aperture, number of samples acquired, quantization, and noise are taken into account.
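
    The normalized first moment itself is a one-line computation once the beam profile has been sampled. The sketch below (a synthetic Gaussian irradiance profile with assumed centre, width and sampling intervals; not the detector's actual optics or electronics) estimates the beam position from progressively coarser samples and compares it with the true centroid.

```python
import numpy as np

def first_moment(positions, samples):
    """Normalized first moment (centroid) of a sampled 1-D irradiance profile."""
    return np.sum(positions * samples) / np.sum(samples)

# Synthetic Gaussian beam profile centred at x0 = 1.37 with 1/e^2 radius w = 2.
x0, w = 1.37, 2.0
profile = lambda x: np.exp(-2.0 * (x - x0) ** 2 / w**2)

for dx in (0.1, 0.5, 1.0, 2.0):                     # progressively coarser sampling
    x = np.arange(-10.0, 10.0 + dx, dx)
    print(f"sampling interval {dx:4.1f}: centroid estimate {first_moment(x, profile(x)):.4f}")
print("true centroid:", x0)
```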

  5. Spatial distribution of metals in soil samples from Zona da Mata, Pernambuco, Brazil using XRF technique

    International Nuclear Information System (INIS)

    Fernandez, Zahily Herrero; Santos Junior, Jose Araujo dos; Amaral, Romilton dos Santos; Menezes, Romulo Simoes Cezar; Santos, Josineide Marques do Nascimento; Bezerra, Jairo Dias; Damascena, Kennedy Francys Rodrigues; Silva, Edvane Borges da; Silva, Alberto Antonio da

    2015-01-01

    Soil contamination is today one of the most important environmental issues for society. In the past, soil pollution was not considered as important as air and water contamination because it was more difficult to control; it has since become an important topic in studies of environmental protection worldwide. Based on this, this paper provides information on the determination of metals in soil samples collected in Zona da Mata, Pernambuco, Brazil, where pesticides, insecticides and other agricultural additives are commonly applied in a disorderly manner and without control. A total of 24 sampling points were monitored. The analyses of Mn, Fe, Ni, Zn, Br, Rb, Sr, Pb, Ti, La, Al, Si and P were performed using Energy Dispersive X-Ray Fluorescence. In order to assess the analytical method, inorganic Certified Reference Materials (IAEA-SOIL-7 and SRM 2709) were analyzed. At each sampling site, the geoaccumulation index was calculated to estimate the level of metal contamination in the soil, taking into account Resolution 460 of the National Environmental Council (CONAMA in Portuguese). The elemental distribution patterns obtained for each metal were associated with different pollution sources. This assessment provides an initial description of the pollution levels presented by metals in soils from several areas of Zona da Mata, providing quantitative evidence and demonstrating the need to improve the regulation of agricultural and industrial activities. (author)

  6. Spatial distribution of metals in soil samples from Zona da Mata, Pernambuco, Brazil using XRF technique

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez, Zahily Herrero; Santos Junior, Jose Araujo dos; Amaral, Romilton dos Santos; Menezes, Romulo Simoes Cezar; Santos, Josineide Marques do Nascimento; Bezerra, Jairo Dias; Damascena, Kennedy Francys Rodrigues, E-mail: zahily1985@gmail.com, E-mail: jaraujo@ufpe.br, E-mail: romilton@ufpe.br, E-mail: rmenezes@ufpe.br, E-mail: neideden@hotmail.com, E-mail: jairo.dias@ufpe.br, E-mail: kennedy.eng.ambiental@gmail.com [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Centro de Tecnologia e Geociencias. Departamento de Energia Nuclear; Alvarez, Juan Reinaldo Estevez, E-mail: jestevez@ceaden.cu [Centro de Aplicaciones Tecnologicas y Desarrollo Nuclear (CEADEN), Havana (Cuba); Silva, Edvane Borges da, E-mail: edvane.borges@pq.cnpq.br [Universidade Federal de Pernambuco (UFPE), Vitoria de Santo Antao, PE (Brazil). Nucleo de Biologia; Franca, Elvis Joacir de; Farias, Emerson Emiliano Gualberto de, E-mail: ejfranca@cnen.gov.br, E-mail: emersonemiliano@yahoo.com.br [Centro Regional de Ciencias Nucleares do Nordeste (CRCN-NE/CNEN-PE), Recife, PE (Brazil); Silva, Alberto Antonio da, E-mail: alberto.silva@barreiros.ifpe.edu.br [Instituto Federal de Educacao, Ciencia e Tecnologia de Pernambuco (IFPE), Barreiros, PE (Brazil)

    2015-07-01

    Soil contamination is today one of the most important environmental issues for society. In the past, soil pollution was not considered as important as air and water contamination because it was more difficult to control; it has since become an important topic in studies of environmental protection worldwide. Based on this, this paper provides information on the determination of metals in soil samples collected in Zona da Mata, Pernambuco, Brazil, where pesticides, insecticides and other agricultural additives are commonly applied in a disorderly manner and without control. A total of 24 sampling points were monitored. The analyses of Mn, Fe, Ni, Zn, Br, Rb, Sr, Pb, Ti, La, Al, Si and P were performed using Energy Dispersive X-Ray Fluorescence. In order to assess the analytical method, inorganic Certified Reference Materials (IAEA-SOIL-7 and SRM 2709) were analyzed. At each sampling site, the geoaccumulation index was calculated to estimate the level of metal contamination in the soil, taking into account Resolution 460 of the National Environmental Council (CONAMA in Portuguese). The elemental distribution patterns obtained for each metal were associated with different pollution sources. This assessment provides an initial description of the pollution levels presented by metals in soils from several areas of Zona da Mata, providing quantitative evidence and demonstrating the need to improve the regulation of agricultural and industrial activities. (author)

  7. Seasonal phenology, spatial distribution, and sampling plan for the invasive mealybug Phenacoccus peruvianus (Hemiptera: Pseudococcidae).

    Science.gov (United States)

    Beltrá, A; Garcia-Marí, F; Soto, A

    2013-06-01

    Phenacoccus peruvianus Granara de Willink (Hemiptera: Pseudococcidae) is an invasive mealybug of Neotropical origin. In recent years it has invaded the Mediterranean Basin, causing significant damage to bougainvillea and other ornamental plants. This article examines its phenology, location on the plant and spatial distribution, and presents a sampling plan to determine P. peruvianus population density for the management of this mealybug in southern Europe. Six urban green spaces with bougainvillea plants were periodically surveyed between March 2008 and September 2010 in eastern Spain, sampling bracts, leaves, and twigs. Our results show that P. peruvianus abundance was high in spring and summer, declining to almost undetectable levels in autumn and winter. The mealybugs showed a preference for settling on bracts and there were no significant migrations between plant organs. P. peruvianus showed a highly aggregated distribution on bracts, leaves, and twigs. We recommend a binomial sampling of 200 leaves and an action threshold of 55% infested leaves for integrated pest management purposes on urban landscapes, and enumerative sampling for ornamental nursery management and additional biological studies.

  8. Actual distribution of Cronobacter spp. in industrial batches of powdered infant formula and consequences for performance of sampling strategies.

    Science.gov (United States)

    Jongenburger, I; Reij, M W; Boer, E P J; Gorris, L G M; Zwietering, M H

    2011-11-15

    The actual spatial distribution of microorganisms within a batch of food influences the results of sampling for microbiological testing when this distribution is non-homogeneous. In the case of pathogens being non-homogeneously distributed, it markedly influences public health risk. This study investigated the spatial distribution of Cronobacter spp. in powdered infant formula (PIF) on an industrial batch scale for both a recalled batch as well as a reference batch. Additionally, the local spatial occurrence of clusters of Cronobacter cells was assessed, as well as the performance of typical sampling strategies to determine the presence of the microorganisms. The concentration of Cronobacter spp. was assessed in the course of the filling time of each batch, by taking samples of 333 g using the most probable number (MPN) enrichment technique. The occurrence of clusters of Cronobacter spp. cells was investigated by plate counting. From the recalled batch, 415 MPN samples were drawn. The expected heterogeneous distribution of Cronobacter spp. could be quantified from these samples, which showed no detectable level (detection limit of -2.52 log CFU/g) in 58% of samples, whilst in the remainder concentrations were found to be between -2.52 and 2.75 log CFU/g. The estimated average concentration in the recalled batch was -2.78 log CFU/g, with a standard deviation of 1.10 log CFU/g. The estimated average concentration in the reference batch was -4.41 log CFU/g, with 99% of the 93 samples being below the detection limit. In the recalled batch, clusters of cells occurred sporadically in 8 out of 2290 samples of 1 g taken. The two largest clusters contained 123 (2.09 log CFU/g) and 560 (2.75 log CFU/g) cells. Various sampling strategies were evaluated for the recalled batch. Taking more and smaller samples while keeping the total sampling weight constant considerably improved the performance of the sampling plans to detect such a type of contaminated batch. Compared to random sampling
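
    The reported effect of taking more, smaller samples at constant total weight can be reproduced in outline with a simple simulation. The Python sketch below uses an invented lognormal between-sample distribution of concentrations loosely matching the reported mean and standard deviation, with Poisson counts within samples; it estimates the probability that at least one sample in a plan tests positive, and the exact figures are illustrative rather than those of the study.

```python
import numpy as np

rng = np.random.default_rng(6)

# Heterogeneous batch: log10(concentration in CFU/g) varies between locations.
MEAN_LOG10, SD_LOG10 = -2.78, 1.10      # roughly the values reported for the recalled batch

def detection_probability(n_samples, grams_per_sample, n_batches=20000):
    """P(at least one sample contains >= 1 CFU); total weight = n_samples * grams_per_sample."""
    log10_conc = rng.normal(MEAN_LOG10, SD_LOG10, size=(n_batches, n_samples))
    expected_cfu = (10.0 ** log10_conc) * grams_per_sample
    counts = rng.poisson(expected_cfu)
    return np.mean(counts.max(axis=1) >= 1)

total = 300.0                            # grams of product examined per plan (assumed)
for n in (1, 10, 30, 100):
    print(f"{n:3d} samples of {total/n:5.1f} g -> detection probability "
          f"{detection_probability(n, total / n):.3f}")
```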

  9. Coordinating Information and Decisions of Hierarchical Distributed Decision Units in Crises

    National Research Council Canada - National Science Library

    Rose, Gerald

    1997-01-01

    A program of research is described. The research addressed decision making by distributed decision makers using either consensus or leader structures and confronted by both routine tasks and different kinds of information system crisis...

  10. Atmospheric aerosol sampling campaign in Budapest and K-puszta. Part 1. Elemental concentrations and size distributions

    International Nuclear Information System (INIS)

    Dobos, E.; Borbely-Kiss, I.; Kertesz, Zs.; Szabo, Gy.; Salma, I.

    2004-01-01

    Complete text of publication follows. Atmospheric aerosol samples were collected in a sampling campaign from 24 July to 1 August 2003 in Hungary. The sampling was performed at two places simultaneously: in Budapest (urban site) and K-puszta (remote area). Two PIXE International 7-stage cascade impactors were used for aerosol sampling with a 24-hour duration. These impactors separate the aerosol into 7 size ranges. The elemental concentrations of the samples were obtained by proton-induced X-ray emission (PIXE) analysis. Size distributions of the elements S, Si, Ca, W, Zn, Pb and Fe were investigated in K-puszta and in Budapest. Average fractions (shown in Table 1) of the elemental concentrations were calculated for each stage (in %) from the obtained distributions. The elements can be grouped into two parts on the basis of these data. The majority of the particles containing Fe, Si, Ca and (Ti) are in the 2-8 μm size range (first group). These soil-origin elements were usually found in higher concentrations in Budapest than in K-puszta (Fig. 1). The second group consists of S, Pb and (W). The majority of these elements was found in the 0.25-1 μm size range, and their concentrations were much higher in Budapest than in K-puszta. W was measured only in samples collected in Budapest. Zn has a uniform distribution in Budapest and does not belong to the above-mentioned groups. This work was supported by the National Research and Development Program (NRDP 3/005/2001). (author)

  11. Mechanical properties and filler distribution as a function of filler content in silica filled PDMS samples

    International Nuclear Information System (INIS)

    Hawley, Marilyn E.; Wrobleski, Debra A.; Orler, E. Bruce; Houlton, Robert J.; Chitanvis, Kiran E.; Brown, Geoffrey W.; Hanson, David E.

    2004-01-01

    Atomic force microscopy (AFM) phase imaging and tensile stress-strain measurements are used to study a series of model compression molded fumed silica filled polydimethylsiloxane (PDMS) samples with filler content of zero, 20, 35, and 50 parts per hundred (phr) to determine the relationship between filler content and stress-strain properties. AFM phase imaging was used to determine filler size, degree of aggregation, and distribution within the soft PDMS matrix. A small tensile stage was used to measure mechanical properties. Samples were not pulled to break in order to study Mullins and aging effects. Several identical 35 phr samples were subjected to an initial stress, and then one each was reevaluated over intervals up to 26 weeks to determine the degree to which these samples recovered their initial stress-strain behavior as a function of time. One sample was tested before and after heat treatment to determine if heating accelerated recovery of the stress-strain behavior. The effect of filler surface treatment on mechanical properties was examined for two samples containing 35 phr filler treated or untreated with hexamethyldisilazane (HMDZ), respectively. Fiduciary marks were used on several samples to determine permanent set. 35 phr filler samples were found to give the optimum mechanical properties. A clear Mullins effect was seen. Within experimental error, no change was seen in mechanical behavior as a function of time or heat-treatment. The mechanical properties of the sample containing the HMDZ-treated silica were adversely affected. AFM phase images revealed aggregation and nonuniform distribution of the filler for all samples. Finally, a permanent set of about 3 to 6 percent was observed for the 35 phr samples.

  12. Web-based Distributed Medical Information System for Chronic Viral Hepatitis

    Science.gov (United States)

    Yang, Ying; Qin, Tuan-fa; Jiang, Jian-ning; Lu, Hui; Ma, Zong-e.; Meng, Hong-chang

    2008-11-01

    To enable long-term dynamic monitoring of the chronically ill, especially patients with HBV A, we built a distributed Medical Information System for Chronic Viral Hepatitis (MISCHV). The Web-based system architecture and its functions are described, and its extensive application and important role are also presented.

  13. Modified FlowCAM procedure for quantifying size distribution of zooplankton with sample recycling capacity.

    Directory of Open Access Journals (Sweden)

    Esther Wong

    Full Text Available We have developed a modified FlowCAM procedure for efficiently quantifying the size distribution of zooplankton. The modified method offers the following new features: (1) it prevents animals from settling and clogging by constant bubbling in the sample container; (2) it prevents damage to sample animals and facilitates recycling by replacing the built-in peristaltic pump with an external syringe pump that generates negative pressure (i.e. acts as a vacuum pump) by drawing air from the receiving conical flask, creating a steady flow that transfers plankton from the sample container toward the main flowcell of the imaging system and finally into the receiving flask; (3) it aligns samples in advance of imaging and prevents clogging with an additional flowcell placed ahead of the main flowcell. These modifications were designed to overcome the difficulties of applying the standard FlowCAM procedure to studies where the number of individuals per sample is small, since the FlowCAM can only image a subset of a sample. Our effective recycling procedure allows users to pass the same sample through the FlowCAM many times (i.e. bootstrapping the sample) in order to generate a good size distribution. Although more advanced FlowCAM models are equipped with a syringe pump and Field of View (FOV) flowcells which can image all particles passing through the flow field, we note that these advanced setups are very expensive, offer limited syringe and flowcell sizes, and do not guarantee recycling. In contrast, our modifications are inexpensive and flexible. Finally, we compared the biovolumes estimated by automated FlowCAM image analysis versus conventional manual measurements, and found that the size of an individual zooplankter can be estimated by the FlowCAM image system after ground truthing.

  14. A Distributed Multi-Agent System for Collaborative Information Management and Learning

    Science.gov (United States)

    Chen, James R.; Wolfe, Shawn R.; Wragg, Stephen D.; Koga, Dennis (Technical Monitor)

    2000-01-01

    In this paper, we present DIAMS, a system of distributed, collaborative agents to help users access, manage, share and exchange information. A DIAMS personal agent helps its owner find information most relevant to current needs. It provides tools and utilities for users to manage their information repositories with dynamic organization and virtual views. Flexible hierarchical display is integrated with indexed query search to support effective information access. Automatic indexing methods are employed to support user queries and communication between agents. Contents of a repository are kept in object-oriented storage to facilitate information sharing. Collaboration between users is aided by easy sharing utilities as well as automated information exchange. Matchmaker agents are designed to establish connections between users with similar interests and expertise. DIAMS agents provide needed services for users to share and learn information from one another on the World Wide Web.

  15. An Empirical Consideration of the Use of R in Actively Constructing Sampling Distributions

    Science.gov (United States)

    Vaughn, Brandon K.

    2009-01-01

    In this paper, an interactive teaching approach to introduce the concept of sampling distributions using the statistical software program, R, is shown. One advantage of this approach is that the program R is freely available via the internet. Instructors can easily demonstrate concepts in class, outfit entire computer labs, and/or assign the…

  16. MCNPX calculations of dose rate distribution inside samples treated in the research gamma irradiating facility at CTEx

    Energy Technology Data Exchange (ETDEWEB)

    Rusin, Tiago; Rebello, Wilson F.; Vellozo, Sergio O.; Gomes, Renato G., E-mail: tiagorusin@ime.eb.b, E-mail: rebello@ime.eb.b, E-mail: vellozo@cbpf.b, E-mail: renatoguedes@ime.eb.b [Instituto Militar de Engenharia (IME), Rio de Janeiro, RJ (Brazil). Dept. de Engenharia Nuclear; Vital, Helio C., E-mail: vital@ctex.eb.b [Centro Tecnologico do Exercito (CTEx), Rio de Janeiro, RJ (Brazil); Silva, Ademir X., E-mail: ademir@con.ufrj.b [Universidade Federal do Rio de Janeiro (PEN/COPPE/UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-Graduacao de Engenharia. Programa de Engenharia Nuclear

    2011-07-01

    A cavity-type cesium-137 research irradiating facility at CTEx has been modeled using the Monte Carlo code MCNPX. The irradiator has been used daily in experiments to optimize the use of ionizing radiation for the conservation of many kinds of food and to improve material properties. In order to correlate the effects of the treatment, average doses have been calculated for each irradiated sample, accounting for the measured dose rate distribution in the irradiating chambers. However, that approach is only approximate, being subject to significant systematic errors due to the heterogeneous internal structure of most samples, which can lead to large anisotropy in attenuation and Compton scattering properties across the media. Thus this work is aimed at further investigating such uncertainties by calculating the dose rate distribution inside the items treated, such that a more accurate and representative estimate of the total absorbed dose can be determined for later use in the effects-versus-dose correlation curves. Samples of different simplified geometries and densities (spheres, cylinders, and parallelepipeds) have been modeled to evaluate internal dose rate distributions within the volume of the samples and the overall effect on the average dose. (author)

  17. MCNPX calculations of dose rate distribution inside samples treated in the research gamma irradiating facility at CTEx

    International Nuclear Information System (INIS)

    Rusin, Tiago; Rebello, Wilson F.; Vellozo, Sergio O.; Gomes, Renato G.; Silva, Ademir X.

    2011-01-01

    A cavity-type cesium-137 research irradiating facility at CTEx has been modeled using the Monte Carlo code MCNPX. The irradiator has been used daily in experiments to optimize the use of ionizing radiation for the conservation of many kinds of food and to improve material properties. In order to correlate the effects of the treatment, average doses have been calculated for each irradiated sample, accounting for the measured dose rate distribution in the irradiating chambers. However, that approach is only approximate, being subject to significant systematic errors due to the heterogeneous internal structure of most samples, which can lead to large anisotropy in attenuation and Compton scattering properties across the media. Thus this work is aimed at further investigating such uncertainties by calculating the dose rate distribution inside the items treated, such that a more accurate and representative estimate of the total absorbed dose can be determined for later use in the effects-versus-dose correlation curves. Samples of different simplified geometries and densities (spheres, cylinders, and parallelepipeds) have been modeled to evaluate internal dose rate distributions within the volume of the samples and the overall effect on the average dose. (author)

  18. Mechanical Properties Distribution within Polypropylene Injection Molded Samples: Effect of Mold Temperature under Uneven Thermal Conditions

    Directory of Open Access Journals (Sweden)

    Sara Liparoti

    2017-11-01

    Full Text Available The quality of polymer parts produced by injection molding is strongly affected by the processing conditions. Uncontrolled deviations from the proper process parameters could significantly affect both the internal structure and the final material properties. In this work, to mimic an uneven temperature field, strongly asymmetric heating is applied during the production of injection-molded polypropylene samples. The morphology of the samples is characterized by optical and atomic force microscopy (AFM), whereas the distribution of mechanical modulus at different scales is obtained by indentation and HarmoniX AFM tests. Results clearly show that the temperature differences between the two mold surfaces significantly affect the morphology distributions of the molded parts. This is due both to the uneven temperature field evolution and to the asymmetric flow field. The final mechanical property distributions are determined by competition between the local molecular stretch and the local structuring achieved during solidification.

  19. Information system architecture to support transparent access to distributed, heterogeneous data sources

    International Nuclear Information System (INIS)

    Brown, J.C.

    1994-08-01

    Quality situation assessment and decision making require access to multiple sources of data and information. Insufficient accessibility to data exists for many large corporations and Government agencies. By utilizing current advances in computer technology, today's situation analysts have a wealth of information at their disposal. There are many potential solutions to the information accessibility problem using today's technology. The United States Department of Energy (US-DOE) faced this problem when dealing with one class of problem in the US. The result of their efforts has been the creation of the Tank Waste Information Network System -- TWINS. The TWINS solution combines many technologies to address problems in several areas such as User Interfaces, Transparent Access to Multiple Data Sources, and Integrated Data Access. Data related to the complex is currently distributed throughout several US-DOE installations. Over time, each installation has adopted its own set of standards for information management. Heterogeneous hardware and software platforms exist both across the complex and within a single installation. Standards for information management vary between US-DOE mission areas within installations. These factors contribute to the complexity of accessing information in a manner that enhances the performance and decision making process of the analysts. This paper presents one approach taken by the DOE to resolve the problem of distributed, heterogeneous, multi-media information management for the HLW Tank complex. The information system architecture developed for the DOE by the TWINS effort is one that is adaptable to other problem domains and uses

  20. A sampling device for counting insect egg clusters and measuring vertical distribution of vegetation

    Science.gov (United States)

    Robert L. Talerico; Robert W., Jr. Wilson

    1978-01-01

    The use of a vertical sampling pole that delineates known volumes and position is illustrated and demonstrated for counting egg clusters of N. sertifer. The pole can also be used to estimate vertical and horizontal coverage, distribution or damage of vegetation or foliage.

  1. Random Sampling of Correlated Parameters – a Consistent Solution for Unfavourable Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Žerovnik, G., E-mail: gasper.zerovnik@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Trkov, A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Kodeli, I.A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Capote, R. [International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Smith, D.L. [Argonne National Laboratory, 1710 Avenida del Mundo, Coronado, CA 92118-3073 (United States)

    2015-01-15

    Two methods for random sampling according to a multivariate lognormal distribution – the correlated sampling method and the method of transformation of correlation coefficients – are briefly presented. The methods are mathematically exact and enable consistent sampling of correlated inherently positive parameters with given information on the first two distribution moments. Furthermore, a weighted sampling method to accelerate the convergence of parameters with extremely large relative uncertainties is described. However, the method is efficient only for a limited number of correlated parameters.
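
    One of the two methods mentioned, transforming the correlation coefficients so that correlated normal deviates can be exponentiated into lognormal parameters with the required means, variances and correlations, can be sketched compactly in Python. The three-parameter example values are invented; only the standard lognormal moment and correlation transformations are used, and this is not a reproduction of the authors' codes.

```python
import numpy as np

def sample_correlated_lognormal(mean, std, corr, size, rng=None):
    """Draw samples of inherently positive, correlated parameters whose first two
    moments (mean, std, correlation matrix) are prescribed, by transforming those
    moments to the parameters of the underlying multivariate normal."""
    rng = rng or np.random.default_rng()
    mean, std = np.asarray(mean, float), np.asarray(std, float)
    cv2 = (std / mean) ** 2                         # squared coefficients of variation
    sigma2 = np.log1p(cv2)                          # variances of the underlying normals
    mu = np.log(mean) - 0.5 * sigma2
    # Transform the correlation coefficients to the normal (Gaussian) space.
    # For strongly correlated or very uncertain parameters the result should be
    # checked for positive definiteness before use.
    outer = np.sqrt(np.outer(cv2, cv2))
    corr_g = np.log1p(np.asarray(corr, float) * outer) / np.sqrt(np.outer(sigma2, sigma2))
    cov_g = corr_g * np.sqrt(np.outer(sigma2, sigma2))
    z = rng.multivariate_normal(mu, cov_g, size=size)
    return np.exp(z)

# Hypothetical example: three positive parameters with 10-50% relative uncertainty.
mean = [1.0, 2.0, 0.5]
std = [0.1, 1.0, 0.15]
corr = np.array([[1.0, 0.6, -0.3],
                 [0.6, 1.0,  0.2],
                 [-0.3, 0.2, 1.0]])

x = sample_correlated_lognormal(mean, std, corr, size=100000, rng=np.random.default_rng(7))
print("sample means        :", x.mean(axis=0))
print("sample correlations :\n", np.corrcoef(x, rowvar=False))
```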

  2. Studies on cellular distribution of elements in human hepatocellular carcinoma samples by molecular activation analysis

    International Nuclear Information System (INIS)

    Deng Guilong; Chen Chunying; Zhang Peiqun; Zhao Jiujiang; Chai Zhifang

    2005-01-01

    The distribution patterns of 17 elements in the subcellular fractions of nuclei, mitochondria, lysosome, microsome and cytosol of human hepatocellular carcinoma (HCC) and normal liver samples were investigated using molecular activation analysis (MAA) and differential centrifugation. Significant differences were checked with Student's t-test. The elements exhibit inhomogeneous distributions across the subcellular fractions. Some elements show no significant difference between hepatocellular carcinoma and normal liver samples. However, the concentrations of Br, Ca, Cd and Cs are significantly higher in every fraction of the hepatocarcinoma samples than in normal liver. The content of Fe in the microsome fraction of HCC is significantly lower, almost half that of normal liver samples, but higher in the other subcellular fractions than in those of normal tissues. The rare earth elements La and Ce show patterns similar to that of Fe. The concentrations of Sb and Zn in nuclei of HCC are markedly lower (P<0.05, P<0.05). The contents of K and Na are higher in the cytosol of HCC (P<0.05). The distributions of Ba and Rb show no significant difference between the two groups. The relationships of Fe, Cd and K with HCC are also discussed. The levels of some elements in the subcellular fractions of the tumor were quite different from those of normal liver, suggesting that trace elements might play important roles in the occurrence and development of hepatocellular carcinoma. (authors)

  3. Studies on cellular distribution of elements in human hepatocellular carcinoma samples by molecular activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Guilong, Deng [Chinese Academy of Sciences, Beijing (China). Inst. of High Energy Physics, Key Laboratory of Nuclear Analytical Techniques; Department of General Surgery, the Second Affiliated Hospital, School of Medicine, Zhejiang Univ., Hangzhou (China); Chunying, Chen; Peiqun, Zhang; Jiujiang, Zhao; Zhifang, Chai [Chinese Academy of Sciences, Beijing (China). Inst. of High Energy Physics, Key Laboratory of Nuclear Analytical Techniques; Yingbin, Liu; Jianwei, Wang; Bin, Xu; Shuyou, Peng [Department of General Surgery, the Second Affiliated Hospital, School of Medicine, Zhejiang Univ., Hangzhou (China)

    2005-07-15

    The distribution patterns of 17 elements in the subcellular fractions of nuclei, mitochondria, lysosome, microsome and cytosol of human hepatocellular carcinoma (HCC) and normal liver samples were investigated using molecular activation analysis (MAA) and differential centrifugation. Significant differences were checked with Student's t-test. The elements exhibit inhomogeneous distributions across the subcellular fractions. Some elements show no significant difference between hepatocellular carcinoma and normal liver samples. However, the concentrations of Br, Ca, Cd and Cs are significantly higher in every fraction of the hepatocarcinoma samples than in normal liver. The content of Fe in the microsome fraction of HCC is significantly lower, almost half that of normal liver samples, but higher in the other subcellular fractions than in those of normal tissues. The rare earth elements La and Ce show patterns similar to that of Fe. The concentrations of Sb and Zn in nuclei of HCC are markedly lower (P<0.05, P<0.05). The contents of K and Na are higher in the cytosol of HCC (P<0.05). The distributions of Ba and Rb show no significant difference between the two groups. The relationships of Fe, Cd and K with HCC are also discussed. The levels of some elements in the subcellular fractions of the tumor were quite different from those of normal liver, suggesting that trace elements might play important roles in the occurrence and development of hepatocellular carcinoma. (authors)

  4. Designing Better Graphs by Including Distributional Information and Integrating Words, Numbers, and Images

    Science.gov (United States)

    Lane, David M.; Sandor, Aniko

    2009-01-01

    Statistical graphs are commonly used in scientific publications. Unfortunately, graphs in psychology journals rarely portray distributional information beyond central tendency, and few graphs portray inferential statistics. Moreover, those that do portray inferential information generally do not portray it in a way that is useful for interpreting…

  5. Analysis of Urban Households' Preference for Informal Access to ...

    African Journals Online (AJOL)

    2016-10-02

    Oct 2, 2016 ... system, demand and supply, information systems as well as social ... the price system to dictate solely the allocation and distribution of land in the .... 400 questionnaires were distributed to the respondents through a random sampling ..... Urban land and informality: An evaluation of institutional response.

  6. Uncertainty assessment of integrated distributed hydrological models using GLUE with Markov chain Monte Carlo sampling

    DEFF Research Database (Denmark)

    Blasone, Roberta-Serena; Madsen, Henrik; Rosbjerg, Dan

    2008-01-01

    In recent years, there has been an increase in the application of distributed, physically-based and integrated hydrological models. Many questions remain regarding how to properly calibrate and validate distributed models and assess the uncertainty of the estimated parameters and the spatially distributed outputs; multi-site validation must complement the usual time validation. In this study, we develop, through an application, a comprehensive framework for multi-criteria calibration and uncertainty assessment of distributed, physically-based, integrated hydrological models. A revised version of the generalized likelihood uncertainty estimation (GLUE) procedure based on Markov chain Monte Carlo sampling is applied in order to improve the performance of the methodology in estimating parameters and posterior output distributions. The description of the spatial variations of the hydrological processes is accounted for by defining...

  7. Actual distribution of Cronobacter spp. in industrial batches of powdered infant formula and consequences for performance of sampling strategies

    NARCIS (Netherlands)

    Jongenburger, I.; Reij, M.W.; Boer, E.P.J.; Gorris, L.G.M.; Zwietering, M.H.

    2011-01-01

    The actual spatial distribution of microorganisms within a batch of food influences the results of sampling for microbiological testing when this distribution is non-homogeneous. In the case of pathogens being non-homogeneously distributed, it markedly influences public health risk. This study

  8. Implementation of Web-based Information Systems in Distributed Organizations

    DEFF Research Database (Denmark)

    Bødker, Keld; Pors, Jens Kaaber; Simonsen, Jesper

    2004-01-01

    This article presents results elicited from studies conducted in relation to implementing a web-based information system throughout a large distributed organization. We demonstrate the kind of expectations and conditions for change that management face in relation to open-ended, configurable, and context specific web-based information systems like Lotus QuickPlace. Our synthesis from the empirical findings is related to two recent models, the improvisational change management model suggested by Orlikowski and Hofman (1997), and Gallivan's (2001) model for organizational adoption and assimilation. In line with comparable approaches from the knowledge management area (Dixon 2000; Markus 2001), we relate to, refine, and operationalize the models from an overall organizational view by identifying and characterizing four different and general implementation contexts...

  9. DISCLOSING THE RADIO LOUDNESS DISTRIBUTION DICHOTOMY IN QUASARS: AN UNBIASED MONTE CARLO APPROACH APPLIED TO THE SDSS-FIRST QUASAR SAMPLE

    Energy Technology Data Exchange (ETDEWEB)

    Balokovic, M. [Department of Astronomy, California Institute of Technology, 1200 East California Boulevard, Pasadena, CA 91125 (United States); Smolcic, V. [Argelander-Institut fuer Astronomie, Auf dem Hugel 71, D-53121 Bonn (Germany); Ivezic, Z. [Department of Astronomy, University of Washington, Box 351580, Seattle, WA 98195 (United States); Zamorani, G. [INAF-Osservatorio Astronomico di Bologna, via Ranzani 1, I-40127 Bologna (Italy); Schinnerer, E. [Max-Planck-Institut fuer Astronomie, Koenigstuhl 17, D-69117 Heidelberg (Germany); Kelly, B. C. [Department of Physics, Broida Hall, University of California, Santa Barbara, CA 93106 (United States)

    2012-11-01

    We investigate the dichotomy in the radio loudness distribution of quasars by modeling their radio emission and various selection effects using a Monte Carlo approach. The existence of two physically distinct quasar populations, the radio-loud and radio-quiet quasars, is controversial and over the last decade a bimodal distribution of radio loudness of quasars has been both affirmed and disputed. We model the quasar radio luminosity distribution with simple unimodal and bimodal distribution functions. The resulting simulated samples are compared to a fiducial sample of 8300 quasars drawn from the SDSS DR7 Quasar Catalog and combined with radio observations from the FIRST survey. Our results indicate that the SDSS-FIRST sample is best described by a radio loudness distribution which consists of two components, with (12 {+-} 1)% of sources in the radio-loud component. On the other hand, the evidence for a local minimum in the loudness distribution (bimodality) is not strong and we find that previous claims for its existence were probably affected by the incompleteness of the FIRST survey close to its faint limit. We also investigate the redshift and luminosity dependence of the radio loudness distribution and find tentative evidence that at high redshift radio-loud quasars were rarer, on average louder, and exhibited a smaller range in radio loudness. In agreement with other recent work, we conclude that the SDSS-FIRST sample strongly suggests that the radio loudness distribution of quasars is not a universal function, and that more complex models than presented here are needed to fully explain available observations.

  10. DISCLOSING THE RADIO LOUDNESS DISTRIBUTION DICHOTOMY IN QUASARS: AN UNBIASED MONTE CARLO APPROACH APPLIED TO THE SDSS-FIRST QUASAR SAMPLE

    International Nuclear Information System (INIS)

    Baloković, M.; Smolčić, V.; Ivezić, Ž.; Zamorani, G.; Schinnerer, E.; Kelly, B. C.

    2012-01-01

    We investigate the dichotomy in the radio loudness distribution of quasars by modeling their radio emission and various selection effects using a Monte Carlo approach. The existence of two physically distinct quasar populations, the radio-loud and radio-quiet quasars, is controversial and over the last decade a bimodal distribution of radio loudness of quasars has been both affirmed and disputed. We model the quasar radio luminosity distribution with simple unimodal and bimodal distribution functions. The resulting simulated samples are compared to a fiducial sample of 8300 quasars drawn from the SDSS DR7 Quasar Catalog and combined with radio observations from the FIRST survey. Our results indicate that the SDSS-FIRST sample is best described by a radio loudness distribution which consists of two components, with (12 ± 1)% of sources in the radio-loud component. On the other hand, the evidence for a local minimum in the loudness distribution (bimodality) is not strong and we find that previous claims for its existence were probably affected by the incompleteness of the FIRST survey close to its faint limit. We also investigate the redshift and luminosity dependence of the radio loudness distribution and find tentative evidence that at high redshift radio-loud quasars were rarer, on average louder, and exhibited a smaller range in radio loudness. In agreement with other recent work, we conclude that the SDSS-FIRST sample strongly suggests that the radio loudness distribution of quasars is not a universal function, and that more complex models than presented here are needed to fully explain available observations.

  11. 78 FR 2992 - Agency Information Collection Activities; Proposed Collection; Comment Request; Distribution of...

    Science.gov (United States)

    2013-01-15

    ... consequence analyses (OCA) as well as other elements of the risk management program. On August 5, 1999, the...). The Act required the President to promulgate regulations on the distribution of OCA information (CAA... responsibility to promulgate regulations to govern the dissemination of OCA information to the public. The final...

  12. A system for on-line monitoring of light element concentration distributions in thin samples

    NARCIS (Netherlands)

    Brands, P.J.M.; Mutsaers, P.H.A.; Voigt, de M.J.A.

    1999-01-01

    At the Cyclotron Laboratory, a scanning proton microprobe is used to determine concentration distributions in biomedical samples. The data acquired in these measurements used to be analysed in a time-consuming off-line analysis. To avoid the loss of valuable measurement and analysis time, DYANA was

  13. Distributed Input and State Estimation Using Local Information in Heterogeneous Sensor Networks

    Directory of Open Access Journals (Sweden)

    Dzung Tran

    2017-07-01

    Full Text Available A new distributed input and state estimation architecture is introduced and analyzed for heterogeneous sensor networks. Specifically, nodes of a given sensor network are allowed to have heterogeneous information roles in the sense that a subset of nodes can be active (that is, subject to observations of a process of interest) and the rest can be passive (that is, subject to no observation). Both fixed and varying active and passive roles of sensor nodes in the network are investigated. In addition, these nodes are allowed to have non-identical sensor modalities under the common underlying assumption that they have complementary properties distributed over the sensor network to achieve collective observability. The key feature of our framework is that it utilizes local information not only during the execution of the proposed distributed input and state estimation architecture but also in its design, in that global uniform ultimate boundedness of the error dynamics is guaranteed once each node satisfies given local stability conditions independent of the graph topology and neighboring information of these nodes. As a special case (e.g., when all nodes are active and a positive real condition is satisfied), asymptotic stability can be achieved with our algorithm. Several illustrative numerical examples are further provided to demonstrate the efficacy of the proposed architecture.

  14. A new technique for testing distribution of knowledge and to estimate sampling sufficiency in ethnobiology studies.

    Science.gov (United States)

    Araújo, Thiago Antonio Sousa; Almeida, Alyson Luiz Santos; Melo, Joabe Gomes; Medeiros, Maria Franco Trindade; Ramos, Marcelo Alves; Silva, Rafael Ricardo Vasconcelos; Almeida, Cecília Fátima Castelo Branco Rangel; Albuquerque, Ulysses Paulino

    2012-03-15

    We propose a new quantitative measure that enables the researcher to make decisions and test hypotheses about the distribution of knowledge in a community and estimate the richness and sharing of information among informants. In our study, this measure has two levels of analysis: intracultural and intrafamily. Using data collected in northeastern Brazil, we evaluated how these new estimators of richness and sharing behave for different categories of use. We observed trends in the distribution of the characteristics of informants. We were also able to evaluate how outliers interfere with these analyses and how other analyses may be conducted using these indices, such as determining the distance between the knowledge of a community and that of experts, as well as exhibiting the importance of these individuals' communal information of biological resources. One of the primary applications of these indices is to supply the researcher with an objective tool to evaluate the scope and behavior of the collected data.

  15. Attitudes of the Japanese public and doctors towards use of archived information and samples without informed consent: Preliminary findings based on focus group interviews

    Directory of Open Access Journals (Sweden)

    Fukuhara Shunichi

    2002-01-01

    Full Text Available Abstract Background The purpose of this study is to explore laypersons' attitudes toward the use of archived (existing) materials such as medical records and biological samples and to compare them with the attitudes of physicians who are involved in medical research. Methods Three focus group interviews were conducted, in which seven Japanese male members of the general public, seven female members of the general public and seven physicians participated. Results It was revealed that the lay public expressed diverse attitudes towards the use of archived information and samples without informed consent. Protecting a subject's privacy, maintaining confidentiality, and communicating the outcomes of studies to research subjects were regarded as essential preconditions if researchers were to have access to archived information and samples used for research without the specific informed consent of the subjects who provided the material. Although participating physicians thought that some kind of prior permission from subjects was desirable, they pointed out the difficulties involved in obtaining individual informed consent in each case. Conclusions The present preliminary study indicates that the lay public and medical professionals may have different attitudes towards the use of archived information and samples without specific informed consent. This hypothesis, however, is derived from our focus group interviews, and requires validation through research using a larger sample.

  16. Spatial Distribution of Stony Desertification and Key Influencing Factors on Different Sampling Scales in Small Karst Watersheds

    Science.gov (United States)

    Zhang, Zhenming; Zhou, Yunchao; Wang, Shijie

    2018-01-01

    Karst areas are typical ecologically fragile areas, and stony desertification has become one of the most serious ecological and economic problems in these areas worldwide as well as a source of disasters and poverty. A reasonable sampling scale is of great importance for research on soil science in karst areas. In this paper, the spatial distribution of stony desertification characteristics and its influencing factors in karst areas are studied at different sampling scales using a grid sampling method based on geographic information system (GIS) technology and geo-statistics. The rock exposure obtained through sampling over a 150 m × 150 m grid in the Houzhai River Basin was utilized as the original data, and five grid scales (300 m × 300 m, 450 m × 450 m, 600 m × 600 m, 750 m × 750 m, and 900 m × 900 m) were used as the subsample sets. The results show that the rock exposure does not vary substantially from one sampling scale to another, while the average values of the five subsamples all fluctuate around the average value of the entire set. As the sampling scale increases, the maximum value and the average value of the rock exposure gradually decrease, and there is a gradual increase in the coefficient of variability. At the scale of 150 m × 150 m, the areas of minor stony desertification, medium stony desertification, and major stony desertification in the Houzhai River Basin are 7.81 km2, 4.50 km2, and 1.87 km2, respectively. The spatial variability of stony desertification at small scales is influenced by many factors, and the variability at medium scales is jointly influenced by gradient, rock content, and rock exposure. At large scales, the spatial variability of stony desertification is mainly influenced by soil thickness and rock content. PMID:29652811

  17. Spatial Distribution of Stony Desertification and Key Influencing Factors on Different Sampling Scales in Small Karst Watersheds

    Directory of Open Access Journals (Sweden)

    Zhenming Zhang

    2018-04-01

    Full Text Available Karst areas are typical ecologically fragile areas, and stony desertification has become one of the most serious ecological and economic problems in these areas worldwide as well as a source of disasters and poverty. A reasonable sampling scale is of great importance for research on soil science in karst areas. In this paper, the spatial distribution of stony desertification characteristics and its influencing factors in karst areas are studied at different sampling scales using a grid sampling method based on geographic information system (GIS) technology and geo-statistics. The rock exposure obtained through sampling over a 150 m × 150 m grid in the Houzhai River Basin was utilized as the original data, and five grid scales (300 m × 300 m, 450 m × 450 m, 600 m × 600 m, 750 m × 750 m, and 900 m × 900 m) were used as the subsample sets. The results show that the rock exposure does not vary substantially from one sampling scale to another, while the average values of the five subsamples all fluctuate around the average value of the entire set. As the sampling scale increases, the maximum value and the average value of the rock exposure gradually decrease, and there is a gradual increase in the coefficient of variability. At the scale of 150 m × 150 m, the areas of minor stony desertification, medium stony desertification, and major stony desertification in the Houzhai River Basin are 7.81 km2, 4.50 km2, and 1.87 km2, respectively. The spatial variability of stony desertification at small scales is influenced by many factors, and the variability at medium scales is jointly influenced by gradient, rock content, and rock exposure. At large scales, the spatial variability of stony desertification is mainly influenced by soil thickness and rock content.

  18. Spatial Distribution of Stony Desertification and Key Influencing Factors on Different Sampling Scales in Small Karst Watersheds.

    Science.gov (United States)

    Zhang, Zhenming; Zhou, Yunchao; Wang, Shijie; Huang, Xianfei

    2018-04-13

    Karst areas are typical ecologically fragile areas, and stony desertification has become one of the most serious ecological and economic problems in these areas worldwide as well as a source of disasters and poverty. A reasonable sampling scale is of great importance for research on soil science in karst areas. In this paper, the spatial distribution of stony desertification characteristics and its influencing factors in karst areas are studied at different sampling scales using a grid sampling method based on geographic information system (GIS) technology and geo-statistics. The rock exposure obtained through sampling over a 150 m × 150 m grid in the Houzhai River Basin was utilized as the original data, and five grid scales (300 m × 300 m, 450 m × 450 m, 600 m × 600 m, 750 m × 750 m, and 900 m × 900 m) were used as the subsample sets. The results show that the rock exposure does not vary substantially from one sampling scale to another, while the average values of the five subsamples all fluctuate around the average value of the entire set. As the sampling scale increases, the maximum value and the average value of the rock exposure gradually decrease, and there is a gradual increase in the coefficient of variability. At the scale of 150 m × 150 m, the areas of minor stony desertification, medium stony desertification, and major stony desertification in the Houzhai River Basin are 7.81 km², 4.50 km², and 1.87 km², respectively. The spatial variability of stony desertification at small scales is influenced by many factors, and the variability at medium scales is jointly influenced by gradient, rock content, and rock exposure. At large scales, the spatial variability of stony desertification is mainly influenced by soil thickness and rock content.

  19. 78 FR 79009 - Proposed Information Collection; Radiation Sampling and Exposure Records (Pertains to Underground...

    Science.gov (United States)

    2013-12-27

    ... soliciting comments concerning the proposed information collection for updating Radiation Sampling and... exposed with no adverse effects have been established and are expressed as working levels (WL). The... mandatory samplings. Records must include the sample date, location, and results, and must be retained at...

  20. Using Dedal to share and reuse distributed engineering design information

    Science.gov (United States)

    Baya, Vinod; Baudin, Catherine; Mabogunje, Ade; Das, Aseem; Cannon, David M.; Leifer, Larry J.

    1994-01-01

    The overall goal of the project is to facilitate the reuse of previous design experience for the maintenance, repair and redesign of artifacts in the electromechanical engineering domain. An engineering team creates information in the form of meeting summaries, project memos, progress reports, engineering notes, spreadsheet calculations and CAD drawings. Design information captured in these media is difficult to reuse because the way design concepts are referred to evolves over the life of a project and because decisions, requirements and structure are interrelated but rarely explicitly linked. Based on protocol analysis of the information-seeking behavior of designers, we defined a language to describe the content and the form of design records and implemented this language in Dedal, a tool for indexing, modeling and retrieving design information. We first describe the approach to indexing and retrieval in Dedal. Next we describe ongoing work in extending Dedal's capabilities to a distributed environment by integrating it with the World Wide Web. This will enable members of a design team who are not co-located to share and reuse information.

  1. Methods and apparatuses for information analysis on shared and distributed computing systems

    Science.gov (United States)

    Bohn, Shawn J [Richland, WA; Krishnan, Manoj Kumar [Richland, WA; Cowley, Wendy E [Richland, WA; Nieplocha, Jarek [Richland, WA

    2011-02-22

    Apparatuses and computer-implemented methods for analyzing, on shared and distributed computing systems, information comprising one or more documents are disclosed according to some aspects. In one embodiment, information analysis can comprise distributing one or more distinct sets of documents among each of a plurality of processes, wherein each process performs operations on a distinct set of documents substantially in parallel with other processes. Operations by each process can further comprise computing term statistics for terms contained in each distinct set of documents, thereby generating a local set of term statistics for each distinct set of documents. Still further, operations by each process can comprise contributing the local sets of term statistics to a global set of term statistics, and participating in generating a major term set from an assigned portion of a global vocabulary.
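
    The local-then-global pattern described above can be illustrated with a small Python sketch: each worker process computes term statistics for its own distinct set of documents, and the local counters are then merged into a global set from which a major term list is drawn. This is only an illustration of the general pattern using plain term counts and the standard multiprocessing module, not the patented implementation; all names are invented.

```python
from collections import Counter
from multiprocessing import Pool

def local_term_stats(documents):
    """Compute term statistics (here, simple term counts) for one distinct set of documents."""
    counts = Counter()
    for doc in documents:
        counts.update(doc.lower().split())
    return counts

def analyze(document_sets, n_top=10):
    """Process each document set in parallel, merge the local statistics
    into a global set, and extract a major term set from it."""
    with Pool(processes=len(document_sets)) as pool:
        local_stats = pool.map(local_term_stats, document_sets)
    global_stats = Counter()
    for stats in local_stats:
        global_stats.update(stats)           # contribute local stats to the global set
    return [term for term, _ in global_stats.most_common(n_top)]

if __name__ == "__main__":
    sets = [["the quick brown fox", "a brown fox jumps"],
            ["the lazy dog", "a dog sleeps"]]
    print(analyze(sets, n_top=3))
```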

  2. Statistical distributions applications and parameter estimates

    CERN Document Server

    Thomopoulos, Nick T

    2017-01-01

    This book gives a description of the group of statistical distributions that have ample application to studies in statistics and probability.  Understanding statistical distributions is fundamental for researchers in almost all disciplines.  The informed researcher will select the statistical distribution that best fits the data in the study at hand.  Some of the distributions are well known to the general researcher and are in use in a wide variety of ways.  Other useful distributions are less understood and are not in common use.  The book describes when and how to apply each of the distributions in research studies, with a goal to identify the distribution that best applies to the study.  The distributions are for continuous, discrete, and bivariate random variables.  In most studies, the parameter values are not known a priori, and sample data is needed to estimate parameter values.  In other scenarios, no sample data is available, and the researcher seeks some insight that allows the estimate of ...

  3. AGIS: Evolution of Distributed Computing information system for ATLAS

    Science.gov (United States)

    Anisenkov, A.; Di Girolamo, A.; Alandes, M.; Karavakis, E.

    2015-12-01

    ATLAS, a particle physics experiment at the Large Hadron Collider at CERN, produces petabytes of data annually through simulation production and tens of petabytes of data per year from the detector itself. The ATLAS computing model embraces the Grid paradigm and a high degree of decentralization of computing resources in order to meet the ATLAS requirements of petabyte-scale data operations. It has evolved after the first period of LHC data taking (Run-1) in order to cope with the new challenges of the upcoming Run-2. In this paper we describe the evolution and recent developments of the ATLAS Grid Information System (AGIS), developed in order to integrate configuration and status information about resources, services and topology of the computing infrastructure used by the ATLAS Distributed Computing applications and services.

  4. [Application of simulated annealing method and neural network on optimizing soil sampling schemes based on road distribution].

    Science.gov (United States)

    Han, Zong-wei; Huang, Wei; Luo, Yun; Zhang, Chun-di; Qi, Da-cheng

    2015-03-01

    Taking the soil organic matter in eastern Zhongxiang County, Hubei Province, as the research object, thirteen sample sets from different regions were arranged around the road network, and their spatial configuration was optimized by the simulated annealing approach. The topographic factors of these thirteen sample sets, including slope, plane curvature, profile curvature, topographic wetness index, stream power index and sediment transport index, were extracted by terrain analysis. Based on the results of the optimization, a multiple linear regression model with the topographic factors as independent variables was built. At the same time, a multilayer perceptron model based on the neural network approach was implemented, and the two models were compared. The results revealed that the proposed approach is practicable for optimizing soil sampling schemes. The optimal configuration was capable of capturing soil-landscape knowledge accurately, and its accuracy was better than that of the original samples. This study designed a sampling configuration for studying the soil attribute distribution by referring to the spatial layout of the road network, historical samples, and digital elevation data, which provides an effective means as well as a theoretical basis for determining a sampling configuration and mapping the spatial distribution of soil organic matter at low cost and high efficiency.
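
    A minimal sketch of how simulated annealing can optimize the spatial configuration of a sampling scheme is given below: it selects k sites from a set of candidate (for example, road-accessible) locations so as to minimize a simple spatial-coverage criterion. The criterion, the cooling schedule, and all parameter values are illustrative assumptions, not those used in the study.

```python
import numpy as np

def coverage_criterion(points):
    """Mean distance from a fine evaluation grid to its nearest sample point
    (smaller values mean better spatial coverage of the unit square)."""
    gx, gy = np.meshgrid(np.linspace(0, 1, 30), np.linspace(0, 1, 30))
    grid = np.column_stack([gx.ravel(), gy.ravel()])
    d = np.linalg.norm(grid[:, None, :] - points[None, :, :], axis=2)
    return d.min(axis=1).mean()

def anneal_sampling_scheme(candidates, k, n_iter=2000, t0=0.1, seed=0):
    """Choose k sampling sites from candidate locations by simulated annealing."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(candidates), size=k, replace=False)
    current = best = coverage_criterion(candidates[idx])
    best_idx = idx.copy()
    for step in range(n_iter):
        t = t0 * (1.0 - step / n_iter)                    # linear cooling schedule
        new_idx = idx.copy()
        new_idx[rng.integers(k)] = rng.choice(np.setdiff1d(np.arange(len(candidates)), idx))
        value = coverage_criterion(candidates[new_idx])
        # Accept improvements always, worse configurations with Metropolis probability
        if value < current or rng.random() < np.exp(-(value - current) / max(t, 1e-9)):
            idx, current = new_idx, value
            if value < best:
                best, best_idx = value, new_idx.copy()
    return candidates[best_idx]

candidates = np.random.default_rng(1).random((200, 2))    # stand-in for road-accessible sites
scheme = anneal_sampling_scheme(candidates, k=13)
```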

  5. A quick method based on SIMPLISMA-KPLS for simultaneously selecting outlier samples and informative samples for model standardization in near infrared spectroscopy

    Science.gov (United States)

    Li, Li-Na; Ma, Chang-Ming; Chang, Ming; Zhang, Ren-Cheng

    2017-12-01

    A novel method based on SIMPLe-to-use Interactive Self-modeling Mixture Analysis (SIMPLISMA) and Kernel Partial Least Squares (KPLS), named SIMPLISMA-KPLS, is proposed in this paper for the simultaneous selection of outlier samples and informative samples. It is a quick algorithm used for model standardization (also called model transfer) in near infrared (NIR) spectroscopy. NIR data of corn samples analysed for protein content are used to evaluate the proposed method. Piecewise direct standardization (PDS) is employed for model transfer, and SIMPLISMA-PDS-KPLS is compared with KS-PDS-KPLS in terms of the prediction accuracy of the protein content and the calculation speed of each algorithm. The conclusions include that SIMPLISMA-KPLS can be utilized as an alternative sample selection method for model transfer. Although it has accuracy similar to Kennard-Stone (KS), it differs from KS in that it employs concentration information in the selection procedure. This ensures that analyte information is involved in the analysis and that the spectra (X) of the selected samples are interrelated with the concentration (y). It can also be used simultaneously for outlier sample elimination through validation of the calibration. According to the statistics on running time, the sample selection process is clearly more rapid when using KPLS. The quick SIMPLISMA-KPLS algorithm is beneficial for improving the speed of online measurement using NIR spectroscopy.
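
    For context, the Kennard-Stone (KS) selection that SIMPLISMA-KPLS is compared against works purely on the spectra: it starts from the two most distant samples and repeatedly adds the sample whose minimum distance to the already selected set is largest. The sketch below is a generic KS implementation for illustration only, not code from the paper; the mock spectra are invented.

```python
import numpy as np

def kennard_stone(X, n_select):
    """Return indices of n_select rows of X chosen by the Kennard-Stone algorithm."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)   # pairwise distances
    selected = list(np.unravel_index(np.argmax(d), d.shape))    # two most distant samples
    while len(selected) < n_select:
        remaining = [i for i in range(len(X)) if i not in selected]
        min_d = d[np.ix_(remaining, selected)].min(axis=1)      # distance to the selected set
        selected.append(remaining[int(np.argmax(min_d))])       # add the farthest sample
    return selected

spectra = np.random.default_rng(0).random((80, 700))            # mock NIR spectra
calibration_subset = kennard_stone(spectra, n_select=20)
```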

  6. Enhanced Sampling in Free Energy Calculations: Combining SGLD with the Bennett's Acceptance Ratio and Enveloping Distribution Sampling Methods.

    Science.gov (United States)

    König, Gerhard; Miller, Benjamin T; Boresch, Stefan; Wu, Xiongwu; Brooks, Bernard R

    2012-10-09

    One of the key requirements for the accurate calculation of free energy differences is proper sampling of conformational space. Especially in biological applications, molecular dynamics simulations are often confronted with rugged energy surfaces and high energy barriers, leading to insufficient sampling and, in turn, poor convergence of the free energy results. In this work, we address this problem by employing enhanced sampling methods. We explore the possibility of using self-guided Langevin dynamics (SGLD) to speed up the exploration process in free energy simulations. To obtain improved free energy differences from such simulations, it is necessary to account for the effects of the bias due to the guiding forces. We demonstrate how this can be accomplished for the Bennett's acceptance ratio (BAR) and the enveloping distribution sampling (EDS) methods. While BAR is considered among the most efficient methods available for free energy calculations, the EDS method developed by Christ and van Gunsteren is a promising development that reduces the computational costs of free energy calculations by simulating a single reference state. To evaluate the accuracy of both approaches in connection with enhanced sampling, EDS was implemented in CHARMM. For testing, we employ benchmark systems with analytical reference results and the mutation of alanine to serine. We find that SGLD with reweighting can provide accurate results for BAR and EDS where conventional molecular dynamics simulations fail. In addition, we compare the performance of EDS with other free energy methods. We briefly discuss the implications of our results and provide practical guidelines for conducting free energy simulations with SGLD.
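
    To make the BAR step concrete, the sketch below solves Bennett's self-consistency condition for the free energy difference (in units of kT) from forward and reverse reduced work values. The synthetic Gaussian work distributions in the example are constructed to satisfy the Crooks relation, so the analytic answer is known. This is a minimal textbook-style estimator written for illustration; it is not the CHARMM implementation and it ignores the SGLD reweighting discussed in the abstract.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.special import expit

def bar_delta_f(w_F, w_R):
    """Bennett acceptance ratio estimate of Delta F = F_1 - F_0 (in kT).

    w_F : reduced work u_1(x) - u_0(x) evaluated on samples from state 0
    w_R : reduced work u_0(x) - u_1(x) evaluated on samples from state 1
    """
    w_F, w_R = np.asarray(w_F, float), np.asarray(w_R, float)
    M = np.log(len(w_F) / len(w_R))

    def self_consistency(df):
        # Sum of Fermi functions of the forward works minus that of the
        # reverse works; the root of this function is the BAR estimate.
        return np.sum(expit(df - M - w_F)) - np.sum(expit(M - w_R - df))

    return brentq(self_consistency, -100.0, 100.0)

# Gaussian work distributions consistent with Crooks' theorem:
# exact Delta F = mu_F - sigma^2 / 2 = 2.0 - 1.0 = 1.0 kT.
rng = np.random.default_rng(0)
w_F = rng.normal(2.0, np.sqrt(2.0), 20000)
w_R = rng.normal(0.0, np.sqrt(2.0), 20000)
print(bar_delta_f(w_F, w_R))   # close to 1.0
```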

  7. A Sample Calculation of Tritium Production and Distribution at VHTR by using TRITGO Code

    International Nuclear Information System (INIS)

    Park, Ik Kyu; Kim, D. H.; Lee, W. J.

    2007-03-01

    The TRITGO code was developed for estimating the tritium production and distribution of high temperature gas cooled reactors (HTGR), especially the GTMHR350 by General Atomics. In this study, the tritium production and distribution of NHDD were analyzed using the TRITGO code. The TRITGO code was improved with a simple method to calculate the tritium amount in the IS loop. The improved TRITGO input for the sample calculation was prepared based on GTMHR600 because the NHDD has been designed with reference to GTMHR600. The GTMHR350 input related to the tritium distribution was used directly. The calculated tritium activity in the hydrogen produced in the IS loop is 0.56 Bq/g-H2. This is a very satisfactory result considering that the tritium activity limit in the Japanese regulatory guide is 5.6 Bq/g-H2. The basic system for analyzing tritium production and distribution with TRITGO was successfully constructed. However, some uncertainties remain in the tritium distribution models and in the suggested method for the IS loop, and the current input was prepared for GTMHR600 rather than for NHDD. A qualitative analysis of the distribution and IS-loop models and a quantitative analysis of the input should be done in the future.

  8. A Sample Calculation of Tritium Production and Distribution at VHTR by using TRITGO Code

    Energy Technology Data Exchange (ETDEWEB)

    Park, Ik Kyu; Kim, D. H.; Lee, W. J

    2007-03-15

    The TRITGO code was developed for estimating the tritium production and distribution of high temperature gas cooled reactors (HTGR), especially the GTMHR350 by General Atomics. In this study, the tritium production and distribution of NHDD were analyzed using the TRITGO code. The TRITGO code was improved with a simple method to calculate the tritium amount in the IS loop. The improved TRITGO input for the sample calculation was prepared based on GTMHR600 because the NHDD has been designed with reference to GTMHR600. The GTMHR350 input related to the tritium distribution was used directly. The calculated tritium activity in the hydrogen produced in the IS loop is 0.56 Bq/g-H2. This is a very satisfactory result considering that the tritium activity limit in the Japanese regulatory guide is 5.6 Bq/g-H2. The basic system for analyzing tritium production and distribution with TRITGO was successfully constructed. However, some uncertainties remain in the tritium distribution models and in the suggested method for the IS loop, and the current input was prepared for GTMHR600 rather than for NHDD. A qualitative analysis of the distribution and IS-loop models and a quantitative analysis of the input should be done in the future.

  9. Hanford Environmental Information System (HEIS)

    International Nuclear Information System (INIS)

    1994-01-01

    The purpose of the Biota subject area of the Hanford Environmental Information System (HEIS) is to manage the data collected from samples of plants and animals. This includes both samples taken from the plant or animal and samples related to the plant or animal. Related samples include animal feces and animal habitat. Data stored in the Biota subject area include data about the biota samples taken, analysis results, counts from population studies, and species distribution maps.

  10. Photon event distribution sampling: an image formation technique for scanning microscopes that permits tracking of sub-diffraction particles with high spatial and temporal resolutions.

    Science.gov (United States)

    Larkin, J D; Publicover, N G; Sutko, J L

    2011-01-01

    In photon event distribution sampling, an image formation technique for scanning microscopes, the maximum likelihood position of origin of each detected photon is acquired as a data set rather than binning photons in pixels. Subsequently, an intensity-related probability density function describing the uncertainty associated with the photon position measurement is applied to each position and individual photon intensity distributions are summed to form an image. Compared to pixel-based images, photon event distribution sampling images exhibit increased signal-to-noise and comparable spatial resolution. Photon event distribution sampling is superior to pixel-based image formation in recognizing the presence of structured (non-random) photon distributions at low photon counts and permits use of non-raster scanning patterns. A photon event distribution sampling based method for localizing single particles derived from a multi-variate normal distribution is more precise than statistical (Gaussian) fitting to pixel-based images. Using the multi-variate normal distribution method, non-raster scanning and a typical confocal microscope, localizations with 8 nm precision were achieved at 10 ms sampling rates with acquisition of ~200 photons per frame. Single nanometre precision was obtained with a greater number of photons per frame. In summary, photon event distribution sampling provides an efficient way to form images when low numbers of photons are involved and permits particle tracking with confocal point-scanning microscopes with nanometre precision deep within specimens. © 2010 The Authors Journal of Microscopy © 2010 The Royal Microscopical Society.
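
    The image-formation step can be sketched directly: each detected photon contributes a small probability density centred on its estimated position of origin, and the image is the sum of these per-photon densities rather than a histogram of pixel counts. In the minimal Python sketch below the per-photon uncertainty is a fixed isotropic Gaussian of width sigma, a simplification of the intensity-related probability density function described in the abstract; all names and values are illustrative.

```python
import numpy as np

def peds_image(photon_xy, sigma, extent=(0.0, 1.0), n_grid=256):
    """Sum one Gaussian localization density per detected photon instead of
    binning photon counts into pixels."""
    xs = np.linspace(extent[0], extent[1], n_grid)
    X, Y = np.meshgrid(xs, xs)
    image = np.zeros((n_grid, n_grid))
    for x0, y0 in photon_xy:
        image += np.exp(-((X - x0) ** 2 + (Y - y0) ** 2) / (2.0 * sigma ** 2))
    return image / (2.0 * np.pi * sigma ** 2)

# ~200 photon positions from a sub-diffraction particle near (0.5, 0.5)
rng = np.random.default_rng(0)
photons = rng.normal(0.5, 0.05, size=(200, 2))
img = peds_image(photons, sigma=0.02)
particle_estimate = photons.mean(axis=0)   # centre estimate from the photon positions themselves
```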

  11. The Mann-Whitney U: A Test for Assessing Whether Two Independent Samples Come from the Same Distribution

    Directory of Open Access Journals (Sweden)

    Nadim Nachar

    2008-03-01

    Full Text Available It is often difficult, particularly when conducting research in psychology, to have access to large normally distributed samples. Fortunately, there are statistical tests to compare two independent groups that do not require large normally distributed samples. The Mann-Whitney U is one of these tests. In the following work, a summary of this test is presented. The logic underlying this test and its application are explained. Moreover, the strengths and weaknesses of the Mann-Whitney U are mentioned. One major limitation of the Mann-Whitney U is that the type I error, or alpha (α), is amplified in a situation of heteroscedasticity.
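
    As a usage note (not part of the article), the test is available in SciPy; the example below compares two small, non-normal samples drawn from exponential distributions with different scales.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
group_a = rng.exponential(scale=1.0, size=25)   # small, clearly non-normal samples
group_b = rng.exponential(scale=1.6, size=30)

u_stat, p_value = mannwhitneyu(group_a, group_b, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.4f}")
```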

  12. 75 FR 36615 - Pipeline Safety: Information Collection Gas Distribution Annual Report Form

    Science.gov (United States)

    2010-06-28

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part 192 [Docket No. PHMSA-RSPA-2004-19854] Pipeline Safety: Information Collection Gas Distribution Annual Report Form AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION: Request...

  13. Implications of the Cressie-Read Family of Additive Divergences for Information Recovery

    Directory of Open Access Journals (Sweden)

    George G. Judge

    2012-12-01

    Full Text Available To address the unknown nature of probability-sampling models, in this paper we use information-theoretic concepts and the Cressie-Read (CR) family of information divergence measures to produce a flexible family of probability distributions, likelihood functions, estimators, and inference procedures. The usual case in statistical modeling is that the noisy indirect data are observed and known and the sampling model-error distribution-probability space, consistent with the data, is unknown. To address the unknown sampling process underlying the data, we consider a convex combination of two or more estimators derived from members of the flexible CR family of divergence measures and optimize that combination to select an estimator that minimizes expected quadratic loss. Sampling experiments are used to illustrate the finite sample properties of the resulting estimator and the nature of the recovered sampling distribution.
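
    For reference, one common parameterization of the Cressie-Read power divergence between discrete distributions p and q is I(p, q; gamma) = (1 / (gamma * (gamma + 1))) * sum_i p_i * ((p_i / q_i)^gamma - 1), with the Kullback-Leibler divergence and its reverse recovered in the limits gamma -> 0 and gamma -> -1. The short sketch below evaluates this family under that parameterization; it is a generic illustration, not code from the paper, and it assumes strictly positive probabilities.

```python
import numpy as np

def cressie_read(p, q, gamma):
    """Cressie-Read power divergence I(p, q; gamma) for strictly positive
    discrete distributions. gamma -> 0 gives Kullback-Leibler divergence,
    gamma = 1 gives half of Pearson's chi-square, gamma -> -1 gives reverse KL."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    if np.isclose(gamma, 0.0):
        return np.sum(p * np.log(p / q))
    if np.isclose(gamma, -1.0):
        return np.sum(q * np.log(q / p))
    return np.sum(p * ((p / q) ** gamma - 1.0)) / (gamma * (gamma + 1.0))

p = np.array([0.2, 0.5, 0.3])
q = np.array([0.25, 0.25, 0.5])
for gamma in (-1.0, 0.0, 0.5, 1.0):
    print(gamma, cressie_read(p, q, gamma))
```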

  14. Goodness-of-fit tests for the Gompertz distribution

    DEFF Research Database (Denmark)

    Lenart, Adam; Missov, Trifon

    The Gompertz distribution is often fitted to lifespan data, however testing whether the fit satisfies theoretical criteria was neglected. Here five goodness-of-fit measures, the Anderson-Darling statistic, the Kullback-Leibler discrimination information, the correlation coefficient test, testing for the mean of the sample hazard, and a nested test against the generalized extreme value distributions are discussed. Along with an application to laboratory rat data, critical values calculated by the empirical distribution of the test statistics are also presented.

  15. Mutual trust method for forwarding information in wireless sensor networks using random secret pre-distribution

    Directory of Open Access Journals (Sweden)

    Chih-Hsueh Lin

    2016-04-01

    Full Text Available In wireless sensor networks, sensing information must be transmitted from sensor nodes to the base station by multiple hopping. Every sensor node is both a sender and a relay node that forwards the sensing information sent by other nodes. Under an attack, the sensing information may be intercepted, modified, interrupted, or fabricated during transmission. Accordingly, developing mutual trust so that a secure path can be established for forwarding information is an important issue. Random key pre-distribution has been proposed to establish mutual trust among sensor nodes. This article modifies the random key pre-distribution into a random secret pre-distribution and incorporates identity-based cryptography to obtain an effective method of establishing mutual trust in a wireless sensor network. In the proposed method, the base station assigns an identity and embeds n secrets into the private secret keys of every sensor node. Based on the identity and private secret keys, the mutual trust method is utilized to explore the types of trust among neighboring sensor nodes. The novel method can resist malicious attacks and satisfy the requirements of wireless sensor networks, namely resistance to compromise attacks, masquerading attacks, forgery attacks, and replay attacks, authentication of forwarded messages, and security of the sensing information.
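
    The random key pre-distribution idea that the article starts from can be sketched very simply: every node is pre-loaded with a random key ring drawn from a common pool, and two neighbouring nodes can establish trust if their rings intersect. The Python sketch below shows only that baseline scheme; the article's actual contribution (random secret pre-distribution combined with identity-based cryptography) is not reproduced here, and all parameter values are illustrative.

```python
import random

def predistribute(pool_size, ring_size, n_nodes, seed=0):
    """Load every node with a random key ring drawn from a common key pool
    (Eschenauer-Gligor style random key pre-distribution)."""
    rng = random.Random(seed)
    pool = list(range(pool_size))
    return {node: set(rng.sample(pool, ring_size)) for node in range(n_nodes)}

def shared_keys(rings, a, b):
    """Neighbouring nodes a and b can establish mutual trust if their key rings intersect."""
    return rings[a] & rings[b]

rings = predistribute(pool_size=1000, ring_size=50, n_nodes=20)
print(shared_keys(rings, 0, 1))
```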

  16. Calculation of the effective D-d neutron energy distribution incident on a cylindrical shell sample

    International Nuclear Information System (INIS)

    Gotoh, Hiroshi

    1977-07-01

    A method is proposed to calculate the effective energy distribution of neutrons incident on a cylindrical shell sample placed perpendicularly to the direction of the deuteron beam bombarding a deuterium metal target. The Monte Carlo method is used, and the Fortran program is included. (auth.)

  17. Failure-censored accelerated life test sampling plans for Weibull distribution under expected test time constraint

    International Nuclear Information System (INIS)

    Bai, D.S.; Chun, Y.R.; Kim, J.G.

    1995-01-01

    This paper considers the design of life-test sampling plans based on failure-censored accelerated life tests. The lifetime distribution of products is assumed to be Weibull with a scale parameter that is a log linear function of a (possibly transformed) stress. Two levels of stress higher than the use condition stress, high and low, are used. Sampling plans with equal expected test times at high and low test stresses which satisfy the producer's and consumer's risk requirements and minimize the asymptotic variance of the test statistic used to decide lot acceptability are obtained. The properties of the proposed life-test sampling plans are investigated

  18. MaxEnt queries and sequential sampling

    International Nuclear Information System (INIS)

    Riegler, Peter; Caticha, Nestor

    2001-01-01

    In this paper we pose the question: After gathering N data points, at what value of the control parameter should the next measurement be done? We propose an on-line algorithm which samples optimally by maximizing the gain in information on the parameters to be measured. We show analytically that the information gain is maximum for those potential measurements whose outcome is most unpredictable, i.e. for which the predictive distribution has maximum entropy. The resulting algorithm is applied to exponential analysis
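
    The selection rule can be made concrete as follows: with the posterior over model parameters represented by samples, compute the posterior-predictive distribution of the outcome at each candidate control value and measure next where its entropy is largest. The Python sketch below does this for an assumed binary-outcome logistic model; the model, the posterior samples, and all names are illustrative assumptions rather than the setting analyzed in the paper.

```python
import numpy as np

def predictive_entropy(x, theta_samples):
    """Entropy of the predictive distribution of a binary outcome at control
    value x for a logistic model p(y=1 | x, a, b) = sigmoid(a * (x - b)),
    with the posterior over (a, b) represented by samples."""
    a, b = theta_samples[:, 0], theta_samples[:, 1]
    p1 = np.mean(1.0 / (1.0 + np.exp(-a * (x - b))))      # posterior-predictive P(y=1 | x)
    p1 = np.clip(p1, 1e-12, 1.0 - 1e-12)
    return -(p1 * np.log(p1) + (1.0 - p1) * np.log(1.0 - p1))

def next_measurement(candidates, theta_samples):
    """MaxEnt query: measure where the outcome is most unpredictable."""
    entropies = [predictive_entropy(x, theta_samples) for x in candidates]
    return candidates[int(np.argmax(entropies))]

rng = np.random.default_rng(0)
posterior = np.column_stack([rng.normal(2.0, 0.3, 1000),   # slope samples
                             rng.normal(0.5, 0.2, 1000)])  # threshold samples
print(next_measurement(np.linspace(-1.0, 2.0, 61), posterior))
```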

  19. Design for Distributed Moroccan Hospital Pharmacy Information Environment with Service Oriented Architecture

    OpenAIRE

    Omrana, Hajar; Nassiri, Safae; Belouadha, Fatima-Zahra; Roudiés, Ounsa

    2012-01-01

    In the last five years, the Moroccan e-health system has focused on improving the quality of patient care services by making use of advanced Information and Communications Technology (ICT) solutions. In fact, achieving efficient runtime information sharing across large-scale distributed environments such as an e-health system is not a trivial task. It presents many issues due to the heterogeneity and complex nature of data resources. This concerns, in particular, Moroccan Hos...

  20. Ambient Learning Displays - Distributed Mixed Reality Information Mash-ups to support Ubiquitous Learning

    NARCIS (Netherlands)

    Börner, Dirk

    2010-01-01

    Börner, D. (2010, 19-21 March). Ambient Learning Displays Distributed Mixed Reality Information Mash-ups to support Ubiquitous Learning. Presented at the IADIS International Conference Mobile Learning 2010, Porto, Portugal.

  1. A self-scaling, distributed information architecture for public health, research, and clinical care.

    Science.gov (United States)

    McMurry, Andrew J; Gilbert, Clint A; Reis, Ben Y; Chueh, Henry C; Kohane, Isaac S; Mandl, Kenneth D

    2007-01-01

    This study sought to define a scalable architecture to support the National Health Information Network (NHIN). This architecture must concurrently support a wide range of public health, research, and clinical care activities. The architecture fulfils five desiderata: (1) adopt a distributed approach to data storage to protect privacy, (2) enable strong institutional autonomy to engender participation, (3) provide oversight and transparency to ensure patient trust, (4) allow variable levels of access according to investigator needs and institutional policies, (5) define a self-scaling architecture that encourages voluntary regional collaborations that coalesce to form a nationwide network. Our model has been validated by a large-scale, multi-institution study involving seven medical centers for cancer research. It is the basis of one of four open architectures developed under funding from the Office of the National Coordinator of Health Information Technology, fulfilling the biosurveillance use case defined by the American Health Information Community. The model supports broad applicability for regional and national clinical information exchanges. This model shows the feasibility of an architecture wherein the requirements of care providers, investigators, and public health authorities are served by a distributed model that grants autonomy, protects privacy, and promotes participation.

  2. Evaluation of Circle Diameter by Distributed Tactile Information in Active Tracing

    Directory of Open Access Journals (Sweden)

    Hiroyuki Nakamoto

    2013-01-01

    Full Text Available Active touch, with voluntary movement on the surface of an object, is important for humans to obtain local and detailed features of the object. In addition, active touch is considered to enhance human spatial resolution. In order to improve the dexterity performance of multi-finger robotic hands, it is necessary to study an active touch method for robotic hands. In this paper, we first define four requirements of a tactile sensor for active touch and design a distributed tactile sensor model, which can measure a distribution of compressive deformation. Second, we suggest a measurement process with the sensor model and a method for synthesizing the distributed deformations. In the experiments, a five-finger robotic hand with tactile sensors traces the surface of cylindrical objects and evaluates their diameters. We confirm that the hand can obtain more information about the diameters by tracing with its finger.

  3. Environmental DNA method for estimating salamander distribution in headwater streams, and a comparison of water sampling methods.

    Science.gov (United States)

    Katano, Izumi; Harada, Ken; Doi, Hideyuki; Souma, Rio; Minamoto, Toshifumi

    2017-01-01

    Environmental DNA (eDNA) has recently been used for detecting the distribution of macroorganisms in various aquatic habitats. In this study, we applied an eDNA method to estimate the distribution of the Japanese clawed salamander, Onychodactylus japonicus, in headwater streams. Additionally, we compared the detection of eDNA and hand-capturing methods used for determining the distribution of O. japonicus. For eDNA detection, we designed a qPCR primer/probe set for O. japonicus using the 12S rRNA region. We detected the eDNA of O. japonicus at all sites (with the exception of one), where we also observed them by hand-capturing. Additionally, we detected eDNA at two sites where we were unable to observe individuals using the hand-capturing method. Moreover, we found that eDNA concentrations and detection rates of the two water sampling areas (stream surface and under stones) were not significantly different, although the eDNA concentration in the water under stones was more varied than that on the surface. We, therefore, conclude that eDNA methods could be used to determine the distribution of macroorganisms inhabiting headwater systems by using samples collected from the surface of the water.

  4. A Bayesian Justification for Random Sampling in Sample Survey

    Directory of Open Access Journals (Sweden)

    Glen Meeden

    2012-07-01

    Full Text Available In the usual Bayesian approach to survey sampling, the sampling design plays a minimal role, at best. Although a close relationship between exchangeable prior distributions and simple random sampling has been noted, how to formally integrate simple random sampling into the Bayesian paradigm is not clear. Recently it has been argued that the sampling design can be thought of as part of a Bayesian's prior distribution. We show here that under this scenario simple random sampling can be given a Bayesian justification in survey sampling.

  5. Integration of Geographical Information Systems and Geophysical Applications with Distributed Computing Technologies.

    Science.gov (United States)

    Pierce, M. E.; Aktas, M. S.; Aydin, G.; Fox, G. C.; Gadgil, H.; Sayar, A.

    2005-12-01

    We examine the application of Web Service Architectures and Grid-based distributed computing technologies to geophysics and geo-informatics. We are particularly interested in the integration of Geographical Information System (GIS) services with distributed data mining applications. GIS services provide the general purpose framework for building archival data services, real time streaming data services, and map-based visualization services that may be integrated with data mining and other applications through the use of distributed messaging systems and Web Service orchestration tools. Building upon on our previous work in these areas, we present our current research efforts. These include fundamental investigations into increasing XML-based Web service performance, supporting real time data streams, and integrating GIS mapping tools with audio/video collaboration systems for shared display and annotation.

  6. Proposal for logistics information management system using distributed architecture; Bunsangata butsuryu joho system no teian to kensho

    Energy Technology Data Exchange (ETDEWEB)

    Kataoka, N.; Koizumi, H.; Shimizu, H. [Mitsubishi Electric Power Corp., Tokyo (Japan)

    1998-03-01

    Conventional host-based, central-processing logistics information systems collect all information about stocked products (sales results, inventory, out-of-stock items) on a single host computer and, based on this information, perform ordering, shipping, receiving, and other processing. In a client/server architecture, the system is not simply downsized: in order to ensure more effective use of logistics information and closer coordination with manufacturing information systems, the logistics information system must be configured as a distributed system specific to a given factory and its various products. In such distributed systems, each function acts independently, but at the same time the overall system of which they are part must operate in harmony to perform cost optimization, adjust the allocation of resources among different factories and business locations, and present a single monolithic interface to retailers and sales agents. In this paper, we propose a logistics information system with a distributed architecture, together with agents whose role is to coordinate the operation of the overall system, as one means of realizing this combination of component autonomy and overall system harmony. The methodology proposed here was applied to a proving system, and its effectiveness was verified. 9 refs., 12 figs.

  7. CAN'T MISS--conquer any number task by making important statistics simple. Part 2. Probability, populations, samples, and normal distributions.

    Science.gov (United States)

    Hansen, John P

    2003-01-01

    Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.
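
    As a small worked illustration of the normal-distribution material (not taken from the article), the probability that an observation falls within one standard deviation of the mean can be computed directly from the cumulative distribution function:

```python
from scipy.stats import norm

# P(85 <= X <= 115) for X ~ Normal(mean=100, sd=15), i.e. within one standard deviation
p = norm.cdf(115, loc=100, scale=15) - norm.cdf(85, loc=100, scale=15)
print(round(p, 3))   # ~0.683, the familiar "68% rule"
```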

  8. Elemental distribution in human femoral head

    Energy Technology Data Exchange (ETDEWEB)

    Santos, C., E-mail: catia.santos@itn.pt [Dep. Física, Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, 2829-516 Caparica (Portugal); Centro de Física Nuclear da Universidade de Lisboa, 1649-003 Lisboa (Portugal); Campus Tecnológico e Nuclear, IST/CTN, Universidade Técnica de Lisboa E.N. 10, 2686-953 Sacavém (Portugal); Fonseca, M. [Dep. Física, Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, 2829-516 Caparica (Portugal); Centro de Física Nuclear da Universidade de Lisboa, 1649-003 Lisboa (Portugal); Universidade Europeia|Laureate International Universities, 1500-210 Lisboa (Portugal); Corregidor, V. [Campus Tecnológico e Nuclear, IST/CTN, Universidade Técnica de Lisboa E.N. 10, 2686-953 Sacavém (Portugal); Silva, H. [Dep. Física, Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, 2829-516 Caparica (Portugal); Centro de Física Nuclear da Universidade de Lisboa, 1649-003 Lisboa (Portugal); Campus Tecnológico e Nuclear, IST/CTN, Universidade Técnica de Lisboa E.N. 10, 2686-953 Sacavém (Portugal); Luís, H.; Jesus, A.P. [Dep. Física, Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, 2829-516 Caparica (Portugal); Centro de Física Nuclear da Universidade de Lisboa, 1649-003 Lisboa (Portugal); and others

    2014-07-15

    Osteoporosis is the most common bone disease; it has severe symptoms and harmful effects on the patient's quality of life. Because abnormal distributions and concentrations of major and trace elements may help to characterize the disease, ion beam analysis was applied to the study of bone samples. Proton Induced X-ray Emission and Elastic Backscattering Spectrometry were used for qualitative and quantitative analysis of an osteoporotic bone sample, for determination of the Ca/P ratio and analysis of the distribution of major and trace elements. The analysis was made in both trabecular and cortical bone, and the results are in agreement with the information found in the literature.

  9. Decision Criteria for Distributed Versus Non-Distributed Information Systems in the Health Care Environment

    Science.gov (United States)

    McGinnis, John W.

    1980-01-01

    The very same technological advances that support distributed systems have also dramatically increased the efficiency and capabilities of centralized systems, making it more complex for health care managers to select the “right” system architecture to meet their particular needs. How this selection can be made with a reasonable degree of managerial comfort is the focus of this paper. The approach advocated is based on experience in developing the Tri-Service Medical Information System (TRIMIS) program. Along with this, technical standards and configuration management procedures were developed that provided the necessary guidance to implement the selected architecture and to allow it to change in a controlled way over its life cycle.

  10. A stochastic optimisation method to estimate the spatial distribution of a pathogen from a sample.

    Science.gov (United States)

    Sampling is of central importance in plant pathology. It facilitates our understanding of how epidemics develop in space and time and can also be used to inform disease management decisions. Making inferences from a sample is necessary because we rarely have the resources to conduct a complete censu...

  11. Rapid sampling of molecular motions with prior information constraints.

    Science.gov (United States)

    Raveh, Barak; Enosh, Angela; Schueler-Furman, Ora; Halperin, Dan

    2009-02-01

    Proteins are active, flexible machines that perform a range of different functions. Innovative experimental approaches may now provide limited partial information about conformational changes along motion pathways of proteins. There is therefore a need for computational approaches that can efficiently incorporate prior information into motion prediction schemes. In this paper, we present PathRover, a general setup designed for the integration of prior information into the motion planning algorithm of rapidly exploring random trees (RRT). Each suggested motion pathway comprises a sequence of low-energy clash-free conformations that satisfy an arbitrary number of prior information constraints. These constraints can be derived from experimental data or from expert intuition about the motion. The incorporation of prior information is very straightforward and significantly narrows down the vast search in the typically high-dimensional conformational space, leading to dramatic reduction in running time. To allow the use of state-of-the-art energy functions and conformational sampling, we have integrated this framework into Rosetta, an accurate protocol for diverse types of structural modeling. The suggested framework can serve as an effective complementary tool for molecular dynamics, Normal Mode Analysis, and other prevalent techniques for predicting motion in proteins. We applied our framework to three different model systems. We show that a limited set of experimentally motivated constraints may effectively bias the simulations toward diverse predicates in an outright fashion, from distance constraints to enforcement of loop closure. In particular, our analysis sheds light on mechanisms of protein domain swapping and on the role of different residues in the motion.

  12. Rapid sampling of molecular motions with prior information constraints.

    Directory of Open Access Journals (Sweden)

    Barak Raveh

    2009-02-01

    Full Text Available Proteins are active, flexible machines that perform a range of different functions. Innovative experimental approaches may now provide limited partial information about conformational changes along motion pathways of proteins. There is therefore a need for computational approaches that can efficiently incorporate prior information into motion prediction schemes. In this paper, we present PathRover, a general setup designed for the integration of prior information into the motion planning algorithm of rapidly exploring random trees (RRT). Each suggested motion pathway comprises a sequence of low-energy clash-free conformations that satisfy an arbitrary number of prior information constraints. These constraints can be derived from experimental data or from expert intuition about the motion. The incorporation of prior information is very straightforward and significantly narrows down the vast search in the typically high-dimensional conformational space, leading to dramatic reduction in running time. To allow the use of state-of-the-art energy functions and conformational sampling, we have integrated this framework into Rosetta, an accurate protocol for diverse types of structural modeling. The suggested framework can serve as an effective complementary tool for molecular dynamics, Normal Mode Analysis, and other prevalent techniques for predicting motion in proteins. We applied our framework to three different model systems. We show that a limited set of experimentally motivated constraints may effectively bias the simulations toward diverse predicates in an outright fashion, from distance constraints to enforcement of loop closure. In particular, our analysis sheds light on mechanisms of protein domain swapping and on the role of different residues in the motion.
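
    The record above describes constraint-filtered tree growth in a high-dimensional conformational space. The toy sketch below is not PathRover or Rosetta code; it only illustrates the general RRT-with-constraints idea in two dimensions, with an invented clash region and an invented prior-information rule.

```python
# A toy 2-D analogue of constraint-filtered RRT growth, not PathRover/Rosetta code:
# a new node is kept only if it is clash-free (outside an invented obstacle) and
# satisfies a hypothetical prior-information constraint (every accepted step must
# move the node closer to a target point).
import math
import random

random.seed(1)
start, goal = (0.0, 0.0), (9.0, 9.0)
obstacle_center, obstacle_radius = (5.0, 5.0), 2.0
step = 0.5

def clash_free(p):
    return math.dist(p, obstacle_center) > obstacle_radius

def satisfies_prior(p, parent):
    return math.dist(p, goal) < math.dist(parent, goal)

tree = {start: None}                                   # node -> parent
for _ in range(20000):
    q_rand = (random.uniform(0, 10), random.uniform(0, 10))
    q_near = min(tree, key=lambda q: math.dist(q, q_rand))
    d = math.dist(q_near, q_rand)
    if d == 0:
        continue
    s = min(step, d)
    q_new = tuple(a + s * (b - a) / d for a, b in zip(q_near, q_rand))
    if clash_free(q_new) and satisfies_prior(q_new, q_near):
        tree[q_new] = q_near
        if math.dist(q_new, goal) < step:              # close enough: extract path
            path = [q_new]
            while tree[path[-1]] is not None:
                path.append(tree[path[-1]])
            print(f"pathway with {len(path)} accepted conformations found")
            break
```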

  13. Expected net present value of sample information: from burden to investment.

    Science.gov (United States)

    Hall, Peter S; Edlin, Richard; Kharroubi, Samer; Gregory, Walter; McCabe, Christopher

    2012-01-01

    The Expected Value of Information Framework has been proposed as a method for identifying when health care technologies should be immediately reimbursed and when any reimbursement should be withheld while awaiting more evidence. This framework assesses the value of obtaining additional evidence to inform a current reimbursement decision. This represents the burden of not having the additional evidence at the time of the decision. However, when deciding whether to reimburse now or await more evidence, decision makers need to know the value of investing in more research to inform a future decision. Assessing this value requires consideration of research costs, research time, and what happens to patients while the research is undertaken and after completion. The investigators describe a development of the calculation of the expected value of sample information that assesses the value of investing in further research, including an only-in-research strategy and an only-with-research strategy.
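
    A stripped-down Monte Carlo sketch of the expected value of sample information for a two-option adoption decision is given below. All numbers (prior, sampling variance, trial size, population, research cost) are invented, and the paper's full framework additionally accounts for research time and for the patients treated while the evidence is collected.

```python
# A stripped-down Monte Carlo sketch of the expected value of sample information
# (EVSI) for a two-option adoption decision. All numbers are invented; the paper's
# expected *net present value* of sample information additionally nets off research
# cost and time and the fate of patients treated while the trial runs.
import numpy as np

rng = np.random.default_rng(0)
mu0, sd0 = 500.0, 1000.0        # prior on incremental net benefit per patient
sigma = 4000.0                  # per-patient sampling standard deviation
n_trial = 200                   # proposed trial size (assumed design)
pop, research_cost = 10_000, 1_500_000.0

theta = rng.normal(mu0, sd0, size=100_000)    # draws from the prior
value_now = max(0.0, mu0)                     # decide today using the prior mean

# Simulate a trial outcome for each prior draw, update to the posterior mean,
# and take the better option given that (future) information.
xbar = rng.normal(theta, sigma / np.sqrt(n_trial))
post_var = 1.0 / (1.0 / sd0**2 + n_trial / sigma**2)
post_mean = post_var * (mu0 / sd0**2 + xbar * n_trial / sigma**2)
value_with_info = np.maximum(0.0, post_mean).mean()

evsi_per_patient = value_with_info - value_now
net_value_of_research = pop * evsi_per_patient - research_cost
print(f"EVSI per patient = {evsi_per_patient:.1f}")
print(f"crude net value of the proposed study = {net_value_of_research:,.0f}")
```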

  14. 21 CFR 809.40 - Restrictions on the sale, distribution, and use of OTC test sample collection systems for drugs...

    Science.gov (United States)

    2010-04-01

    ... OTC test sample collection systems for drugs of abuse testing. 809.40 Section 809.40 Food and Drugs... Restrictions on the sale, distribution, and use of OTC test sample collection systems for drugs of abuse testing. (a) Over-the-counter (OTC) test sample collection systems for drugs of abuse testing (§ 864.3260...

  15. Benchmarking distributed data warehouse solutions for storing genomic variant information

    Science.gov (United States)

    Wiewiórka, Marek S.; Wysakowicz, Dawid P.; Okoniewski, Michał J.

    2017-01-01

    the storage and analysis of variants from thousands of samples can benefit from the scalability and performance of distributed data warehouse solutions. Database URL: https://github.com/ZSI-Bio/variantsdwh PMID:29220442

  16. A framework for implementing a Distributed Intrusion Detection System (DIDS) with interoperability and information analysis

    OpenAIRE

    Davicino, Pablo; Echaiz, Javier; Ardenghi, Jorge Raúl

    2011-01-01

    Computer Intrusion Detection Systems (IDS) are primarily designed to protect the availability, confidentiality and integrity of critical information infrastructures. A Distributed IDS (DIDS) consists of several IDS over a large network(s), all of which communicate with each other, with a central server or with a cluster of servers that facilitates advanced network monitoring. In a distributed environment, DIDS are implemented using cooperative intelligent sensors distributed across the network(s). ...

  17. Spatial distribution of grape root borer (Lepidoptera: Sesiidae) infestations in Virginia vineyards and implications for sampling.

    Science.gov (United States)

    Rijal, J P; Brewster, C C; Bergh, J C

    2014-06-01

    Grape root borer, Vitacea polistiformis (Harris) (Lepidoptera: Sesiidae) is a potentially destructive pest of grape vines, Vitis spp. in the eastern United States. After feeding on grape roots for ≈2 yr in Virginia, larvae pupate beneath the soil surface around the vine base. Adults emerge during July and August, leaving empty pupal exuviae on or protruding from the soil. Weekly collections of pupal exuviae from an ≈1-m-diameter weed-free zone around the base of a grid of sample vines in Virginia vineyards were conducted in July and August, 2008-2012, and their distribution was characterized using both nonspatial (dispersion) and spatial techniques. Taylor's power law showed a significant aggregation of pupal exuviae, based on data from 19 vineyard blocks. Combined use of geostatistical and Spatial Analysis by Distance IndicEs methods indicated evidence of an aggregated pupal exuviae distribution pattern in seven of the nine blocks used for those analyses. Grape root borer pupal exuviae exhibited spatial dependency within a mean distance of 8.8 m, based on the range values of best-fitted variograms. Interpolated and clustering index-based infestation distribution maps were developed to show the spatial pattern of the insect within the vineyard blocks. The temporal distribution of pupal exuviae showed that the majority of moths emerged during the 3-wk period spanning the third week of July and the first week of August. The spatial distribution of grape root borer pupal exuviae was used in combination with temporal moth emergence patterns to develop a quantitative and efficient sampling scheme to assess infestations.

  18. Spatial Distribution and Sampling Plans With Fixed Level of Precision for Citrus Aphids (Hom., Aphididae) on Two Orange Species.

    Science.gov (United States)

    Kafeshani, Farzaneh Alizadeh; Rajabpour, Ali; Aghajanzadeh, Sirous; Gholamian, Esmaeil; Farkhari, Mohammad

    2018-04-02

    Aphis spiraecola Patch, Aphis gossypii Glover, and Toxoptera aurantii Boyer de Fonscolombe are three important aphid pests of citrus orchards. In this study, spatial distributions of the aphids on two orange species, Satsuma mandarin and Thomson navel, were evaluated using Taylor's power law and Iwao's patchiness. In addition, a fixed-precision sequential sampling plan was developed for each species on each host plant using Green's model at precision levels of 0.25 and 0.1. The results revealed that the spatial distribution parameters, and therefore the sampling plan, differed significantly according to aphid and host plant species. Taylor's power law provided a better fit to the data than Iwao's patchiness regression. Except for T. aurantii on Thomson navel orange, the spatial distribution patterns of the aphids were aggregated on both citrus species; T. aurantii had a regular dispersion pattern on Thomson navel orange. The optimum sample size of the aphids varied from 30-2061 and 1-1622 shoots on Satsuma mandarin and Thomson navel orange, depending on aphid species and desired precision level. Calculated stop lines for the aphid species on Satsuma mandarin and Thomson navel orange ranged from 0.48 to 19 and 0.19 to 80.4 aphids per 24 shoots, according to aphid species and desired precision level. The performance of the sampling plan was validated by resampling analysis using the Resampling for Validation of Sampling Plans (RVSP) software. This sampling program is useful for IPM programs targeting aphids in citrus orchards.
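
    Both this record and the previous one rely on Taylor's power law, and this one additionally uses Green's fixed-precision model. The sketch below, with invented count data, shows the two standard steps: fitting log variance against log mean to obtain the coefficients a and b, and converting the fit into an optimum sample size and a stop line. It is not the authors' analysis.

```python
# Invented count data; not the authors' analysis. Step 1 fits Taylor's power law
# s^2 = a * m^b across sampling units; step 2 converts the fit into an optimum
# fixed sample size and Green's fixed-precision stop line.
import numpy as np

m  = np.array([0.4, 1.1, 2.3, 4.8, 9.5, 15.2])     # mean aphids per shoot per block
s2 = np.array([0.9, 3.0, 8.1, 22.0, 60.5, 120.0])  # corresponding variances

b, log_a = np.polyfit(np.log(m), np.log(s2), 1)    # log s2 = log a + b * log m
a = np.exp(log_a)
print(f"Taylor's power law: a = {a:.2f}, b = {b:.2f} (b > 1 suggests aggregation)")

D = 0.25                                  # desired precision (SE / mean)
mean_density = 2.0                        # expected aphids per shoot
n_opt = a * mean_density ** (b - 2) / D**2
print(f"optimum fixed sample size at m = {mean_density}: {np.ceil(n_opt):.0f} shoots")

# Green's stop line: cumulative count at which sampling can stop after n shoots
n = np.arange(10, 101, 10)
T_stop = (D**2 / a) ** (1.0 / (b - 2)) * n ** ((b - 1.0) / (b - 2.0))
for ni, ti in zip(n, T_stop):
    print(f"n = {ni:3d}  stop if cumulative count >= {ti:7.1f}")
```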

  19. Impaired information sampling in mild dementia of Alzheimer's type but not in healthy aging.

    Science.gov (United States)

    Zamarian, Laura; Benke, Thomas; Brand, Matthias; Djamshidian, Atbin; Delazer, Margarete

    2015-05-01

    It is unknown whether aging affects predecisional processing, that is, gathering information and evaluating options before making a decision. Here, we investigated information sampling in mild Dementia of Alzheimer's type (DAT) and healthy aging by using the Information Sampling Task (IST). In a first investigation, we compared patients with mild DAT (n = 20) with healthy controls (n = 20) on the IST and several neuropsychological background tests. In a second investigation, healthy older adults (n = 30) were compared with younger adults (n = 30) on the IST and executive-function tasks. Results of the first investigation demonstrated that, in the IST, patients gathered significantly less information, made riskier and less accurate decisions, and showed less reward sensitivity relative to controls. We found a significant correlation between performance on the IST and performance on tests of verbal fluency, working memory, and recognition in patients but not in controls. Results of the second investigation indicated a largely similar performance pattern between healthy older adults and younger adults. There were no significant correlations between the IST and executive-function tasks for either group. There are no relevant changes with healthy aging in predecisional processing. In contrast, mild DAT significantly affects predecisional information sampling. Thus, the problems shown in patients with mild DAT in decision making might be related to the patients' difficulties in predecisional processing. Decision-making performance in mild DAT might be improved by helping the patients at a predecisional stage to gather sufficient information and evaluate options more accurately. (c) 2015 APA, all rights reserved.

  20. In Situ Sampling of Relative Dust Devil Particle Loads and Their Vertical Grain Size Distributions.

    Science.gov (United States)

    Raack, Jan; Reiss, Dennis; Balme, Matthew R; Taj-Eddine, Kamal; Ori, Gian Gabriele

    2017-04-19

    During a field campaign in the Sahara Desert in southern Morocco, spring 2012, we sampled the vertical grain size distribution of two active dust devils that exhibited different dimensions and intensities. With these in situ samples of grains in the vortices, it was possible to derive detailed vertical grain size distributions and measurements of the lifted relative particle load. Measurements of the two dust devils show that the majority of all lifted particles were only lifted within the first meter (∼46.5% and ∼61% of all particles; ∼76.5 wt % and ∼89 wt % of the relative particle load). Furthermore, ∼69% and ∼82% of all lifted sand grains occurred in the first meter of the dust devils, indicating the occurrence of "sand skirts." Both sampled dust devils were relatively small (∼15 m and ∼4-5 m in diameter) compared to dust devils in surrounding regions; nevertheless, measurements show that ∼58.5% to 73.5% of all lifted particles were small enough to go into suspension (grain size classification). This relatively high amount represents only ∼0.05 to 0.15 wt % of the lifted particle load. Larger dust devils probably entrain larger amounts of fine-grained material into the atmosphere, which can have an influence on the climate. Furthermore, our results indicate that the composition of the surface on which the dust devils evolved also had an influence on the particle load composition of the dust devil vortices. The internal particle load structure of both sampled dust devils was comparable with respect to their vertical grain size distribution and relative particle load, although the two dust devils differed in their dimensions and intensities. A general trend of decreasing grain sizes with height was also detected. Key Words: Mars-Dust devils-Planetary science-Desert soils-Atmosphere-Grain sizes. Astrobiology 17, xxx-xxx.

  1. Vertical distribution of Pu radionuclides, 241Am and 90Sr in soil samples from Romania

    International Nuclear Information System (INIS)

    Breban, D.; Mocanu, N.; Moreno-Bermudez, J.

    2002-01-01

    The investigated area is a natural alpine pasture located in the southern chain of the Carpathian Mountains, which was found to be one of the most contaminated areas in Romania after the Chernobyl accident. Radioactive concentrations of Pu isotopes, 241Am and 90Sr were determined in successive layers from 4 soil sections down to a depth of 6-8 cm, providing for the first time information on the fallout accumulation of these radionuclides in soil samples from Romania. Pu and Am were separated using a combined sequential procedure based on anion exchange and extraction chromatography. The measurements of 241Pu were performed by Liquid Scintillation Counting. 90Sr was determined by chemical separation of Sr using the classical precipitation method and Cerenkov counting of 90Y. For Quality Control, IAEA reference materials were analyzed along with the samples. In a soil section of 8 cm depth, radioactive inventories were approximately 500 Bq/m2 for 241Pu, 115 Bq/m2 for 239+240Pu, 8 Bq/m2 for 238Pu, 50 Bq/m2 for 241Am and 2500 Bq/m2 for 90Sr. The vertical distribution profiles of the individual radioisotopes are compared with one another and give an indication of their migration. On the basis of activity isotopic ratios in the soil depth profile, the origin of the contamination (Chernobyl accident and nuclear weapon test fallout) is discussed. The results are also compared to previous data on Cs-137

  2. Hospital distribution in a metropolitan city: assessment by a geographical information system grid modelling approach

    Directory of Open Access Journals (Sweden)

    Kwang-Soo Lee

    2014-05-01

    Full Text Available Grid models were used to assess urban hospital distribution in Seoul, the capital of South Korea. A geographical information system (GIS) based analytical model was developed and applied to assess the situation in a metropolitan area with a population exceeding 10 million. Secondary data for this analysis were obtained from multiple sources: the Korean Statistical Information Service, the Korean Hospital Association and the Statistical Geographical Information System. A grid of cells measuring 1 × 1 km was superimposed on the city map and a set of variables related to population, economy, mobility and housing were identified and measured for each cell. Socio-demographic variables were included to reflect the characteristics of each area. Analytical models were then developed using GIS software with the number of hospitals as the dependent variable. Applying multiple linear regression and geographically weighted regression models, three factors (highway and major arterial road areas; number of subway entrances; and row house areas) were statistically significant in explaining the variance of hospital distribution for each cell. The overall results show that GIS is a useful tool for analysing and understanding location strategies. This approach appears to be a useful source of information for decision-makers concerned with the distribution of hospitals and other health care centres in a city.

  3. Positional information generated by spatially distributed signaling cascades.

    Directory of Open Access Journals (Sweden)

    Javier Muñoz-García

    2009-03-01

    Full Text Available The temporal and stationary behavior of protein modification cascades has been extensively studied, yet little is known about the spatial aspects of signal propagation. We have previously shown that the spatial separation of opposing enzymes, such as a kinase and a phosphatase, creates signaling activity gradients. Here we show under what conditions signals stall in space or robustly propagate through spatially distributed signaling cascades. Robust signal propagation results in activity gradients with long plateaus, which abruptly decay at successive spatial locations. We derive an approximate analytical solution that relates the maximal amplitude and propagation length of each activation profile with the cascade level, protein diffusivity, and the ratio of the opposing enzyme activities. The control of the spatial signal propagation appears to be very different from the control of transient temporal responses for spatially homogeneous cascades. For spatially distributed cascades where activating and deactivating enzymes operate far from saturation, the ratio of the opposing enzyme activities is shown to be a key parameter controlling signal propagation. The signaling gradients characteristic for robust signal propagation exemplify a pattern formation mechanism that generates precise spatial guidance for multiple cellular processes and conveys information about the cell size to the nucleus.
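
    The simplest ingredient underlying such gradients is the single-level limit in which a protein activated at the membrane and deactivated uniformly in the cytoplasm relaxes to an exponential profile with decay length sqrt(D/k). The sketch below evaluates that textbook case with assumed parameter values; the plateau-and-drop behaviour described in the abstract emerges only in the full multi-level cascade model.

```python
# The textbook single-level limit that the cascade model builds on: a protein
# activated at the membrane (x = 0) and deactivated uniformly in the cytoplasm
# relaxes to an exponential gradient with decay length sqrt(D / k). Parameter
# values are assumed for illustration only.
import numpy as np

D = 10.0      # diffusivity, um^2/s (assumed)
k = 1.0       # deactivation (phosphatase) rate, 1/s (assumed)
c0 = 1.0      # active fraction at the membrane
lam = np.sqrt(D / k)

x = np.linspace(0.0, 20.0, 9)         # distance from the membrane, um
profile = c0 * np.exp(-x / lam)       # steady state of D * c'' = k * c with c(0) = c0
for xi, ci in zip(x, profile):
    print(f"x = {xi:5.1f} um   active fraction = {ci:.3f}")
print(f"decay length = {lam:.2f} um")
```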

  4. SERVICES OF FULL-TEXT SEARCHING IN A DISTRIBUTED INFORMATION ENVIRONMENT (PROJECT HUMANITARIANA

    Directory of Open Access Journals (Sweden)

    S. K. Lyapin

    2015-01-01

    Full Text Available Problem statement. We justify the applicability of full-text search services in both universal and specialized (in terms of resource base) digital libraries for the extraction and analysis of contextual knowledge in the humanities. The architecture and services of the virtual information and resource center for extracting knowledge from humanitarian texts created by the «Humanitariana» project are described. The functional integration of resources and services for full-text search in a distributed decentralized environment, organized in an Internet/Intranet architecture under the control of the client (user) browser accessing a variety of independent servers, is presented. An algorithm for executing a distributed full-text query is described. Methods. A method combining frequency-ranked and paragraph-oriented full-text queries is used: the former are used for preliminary analysis of the subject area or a collection of works (explication of the "vertical" context, or macro context), the latter for explication of the "horizontal" context, or micro context, within an author's paragraph. The results of the frequency-ranked queries are used to compile paragraph-oriented queries. Results. The results of textual research are shown for the topics "The question of fact in Russian philosophy" and "The question of loneliness in Russian philosophy and culture". About 50 pieces of contextual knowledge, from a total resource base of about 2,500 full-text resources, have been explicated and briefly described for further expert investigation. Practical significance. The proposed technology (advanced full-text search services in a distributed information environment) can be used for the information support of humanities research and education, for the functional integration of resources and services of various organizations, and for carrying out interdisciplinary research.

  5. Sampling intraspecific variability in leaf functional traits: Practical suggestions to maximize collected information.

    Science.gov (United States)

    Petruzzellis, Francesco; Palandrani, Chiara; Savi, Tadeja; Alberti, Roberto; Nardini, Andrea; Bacaro, Giovanni

    2017-12-01

    The choice of the best sampling strategy to capture mean values of functional traits for a species/population, while maintaining information about traits' variability and minimizing the sampling size and effort, is an open issue in functional trait ecology. Intraspecific variability (ITV) of functional traits strongly influences sampling size and effort. However, while adequate information is available about intraspecific variability between individuals (ITV BI) and among populations (ITV POP), relatively few studies have analyzed intraspecific variability within individuals (ITV WI). Here, we provide an analysis of ITV WI of two foliar traits, namely specific leaf area (SLA) and osmotic potential (π), in a population of Quercus ilex L. We assessed the baseline ITV WI level of variation between the two traits and provided the minimum and optimal sampling size in order to take into account ITV WI, comparing sampling optimization outputs with those previously proposed in the literature. Different factors accounted for different amounts of variance of the two traits. SLA variance was mostly spread within individuals (43.4% of the total variance), while π variance was mainly spread between individuals (43.2%). Strategies that did not account for all the canopy strata produced mean values not representative of the sampled population. The minimum size to adequately capture the studied functional traits corresponded to 5 leaves taken randomly from 5 individuals, while the most accurate and feasible sampling size was 4 leaves taken randomly from 10 individuals. We demonstrate that the spatial structure of the canopy could significantly affect trait variability. Moreover, different strategies for different traits could be implemented during sampling surveys. We partially confirm sampling sizes previously proposed in the recent literature and encourage future analysis involving different traits.

  6. Protection of safety-relevant information in distributed energy information systems; Schutz sicherheitsrelevanter Informationen in verteilten Energieinformationssystemen

    Energy Technology Data Exchange (ETDEWEB)

    Beenken, Petra

    2010-07-01

    In recent years there has been an ongoing change in the energy domain. The German energy industry act (EnWG) requires a liberalization that leads to a strict separation of domains such as transportation, supply and conversion of energy. Furthermore, climate and environmental protection as well as cost transparency and energy saving, in combination with resource efficiency, lead to new challenges for the energy industry. The so-called smart grid vision and the resulting design of an ICT-based information structure for the energy domain will help to reach these goals by integrating renewable energy resources, saving fuel and achieving higher energy efficiency. Reaching these goals requires information about current energy generation, energy storage and energy demand. Efficient networking and fast information exchange by means of an energy information network allow energy to be used more efficiently. The federated networking of such an energy information network can, however, become a weak point for cyber security in the energy domain. The growing number of people involved and of data exchanges creates more potential points of attack than before. Therefore, suitable protection of an energy information network is necessary. Under paragraph 9 of the EnWG, the protection goal of confidentiality is particularly important. But the implementation of confidentiality must not lead to a violation of availability requirements, which are very important in some parts of the energy domain. In addition to the identification of such crucial side effects, the implementation of confidentiality for distributed, decentralized systems is a challenge for the domain. The ENERTRUST security model includes the construction of a knowledge base, which allows the identification of such side effects or conflicts in the energy domain by applying reasoning techniques. Moreover, it allows the realization of confidentiality across distributed locations through a use and combination of

  7. Mutual Information Based Analysis for the Distribution of Financial Contagion in Stock Markets

    Directory of Open Access Journals (Sweden)

    Xudong Wang

    2017-01-01

    Full Text Available This paper applies mutual information to research the distribution of financial contagion in global stock markets during the US subprime crisis. First, we symbolize the daily logarithmic stock returns based on their quantiles. Then, the mutual information of the stock indices is calculated and the block bootstrap approach is adopted to test the financial contagion. We analyze not only the contagion distribution during the entire crisis period but also its evolution over different stages by using the sliding window method. The empirical results prove the widespread existence of financial contagion and show that markets impacted by contagion tend to cluster geographically. The distribution of the contagion strength is positively skewed and leptokurtic. The average contagion strength is low at the beginning and then witnesses an uptrend. It has larger values in the middle stage and declines in the late phase of the crisis. Meanwhile, the cross-regional contagion between Europe and America is stronger than that between either America and Asia or Europe and Asia. Europe is found to be the region most deeply impacted by the contagion, whereas Asia is the least affected.
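
    The core computation named in the abstract, symbolizing daily log returns by their quantiles and measuring the mutual information of the resulting symbol pairs, can be sketched as follows; the return series are synthetic and the paper's block-bootstrap significance test is omitted.

```python
# A hedged sketch of the core computation: symbolize two daily log-return series by
# their quantiles and estimate the mutual information of the symbol pairs. The
# return series are synthetic, and the block-bootstrap significance test is omitted.
import numpy as np

def symbolize(returns, n_bins=4):
    """Map each return to the index of the quantile bin it falls into."""
    edges = np.quantile(returns, np.linspace(0, 1, n_bins + 1)[1:-1])
    return np.digitize(returns, edges)

def mutual_information(x, y, n_bins=4):
    joint = np.zeros((n_bins, n_bins))
    for xi, yi in zip(x, y):
        joint[xi, yi] += 1
    joint /= joint.sum()
    px, py = joint.sum(axis=1), joint.sum(axis=0)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / np.outer(px, py)[nz])))

rng = np.random.default_rng(42)
common = rng.normal(size=1000)                        # shared "crisis" factor
us = 0.8 * common + rng.normal(scale=0.6, size=1000)  # synthetic index returns
eu = 0.7 * common + rng.normal(scale=0.7, size=1000)

mi = mutual_information(symbolize(us), symbolize(eu))
print(f"mutual information between symbolized return series: {mi:.3f} bits")
```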

  8. Sparsity-weighted outlier FLOODing (OFLOOD) method: Efficient rare event sampling method using sparsity of distribution.

    Science.gov (United States)

    Harada, Ryuhei; Nakamura, Tomotake; Shigeta, Yasuteru

    2016-03-30

    As an extension of the Outlier FLOODing (OFLOOD) method [Harada et al., J. Comput. Chem. 2015, 36, 763], the sparsity of the outliers defined by a hierarchical clustering algorithm, FlexDice, was considered to achieve an efficient conformational search as sparsity-weighted "OFLOOD." In OFLOOD, FlexDice detects areas of sparse distribution as outliers. The outliers are regarded as candidates that have high potential to promote conformational transitions and are employed as initial structures for conformational resampling by restarting molecular dynamics simulations. When detecting outliers, FlexDice assigns each outlier a rank in the hierarchy, which relates to sparsity in the distribution. In this study, we define lower-rank (first-ranked), medium-rank (second-ranked), and highest-rank (third-ranked) outliers. For instance, the first-ranked outliers are located in regions of conformational space away from the clusters (highly sparse distribution), whereas the third-ranked outliers are near the clusters (a moderately sparse distribution). To achieve the conformational search efficiently, resampling from the outliers of a given rank is performed. As demonstrations, this method was applied to several model systems: alanine dipeptide, Met-enkephalin, Trp-cage, T4 lysozyme, and glutamine binding protein. In each demonstration, the present method successfully reproduced transitions among metastable states. In particular, the first-ranked OFLOOD strongly accelerated the exploration of conformational space by expanding the edges. In contrast, the third-ranked OFLOOD reproduced local transitions among neighboring metastable states intensively. For quantitative evaluation of sampled snapshots, free energy calculations were performed in combination with umbrella sampling, providing rigorous landscapes of the biomolecules. © 2015 Wiley Periodicals, Inc.

  9. Ambient Learning Displays - Distributed Mixed Reality Information Mash-ups to support Ubiquitous Learning

    NARCIS (Netherlands)

    Börner, Dirk

    2012-01-01

    Börner, D. (2012). Ambient Learning Displays - Distributed Mixed Reality Information Mash-ups to support Ubiquitous Learning. 2012 IEEE Seventh International Conference on Wireless, Mobile and Ubiquitous Technology in Education (pp. 337-338). March, 27-30, 2012, Takamatsu, Japan: IEEE Computer

  10. Current Distributional Information on Freshwater Mussels (family Unionidae) in Mississippi National Forests

    Science.gov (United States)

    Wendell R. Haag; Melvin L. Warren

    1995-01-01

    Little is known about the distribution of freshwater mussels in Mississippi national forests. Review of the scant available information revealed that the national forests harbor a diverse mussel fauna of possibly 46 or more species (including confirmed, probable, and potential occurrences). Occurrence of 33 species is confirmed. Because of the geographic, physiographic...

  11. Eggshells as an index of aedine mosquito production. 1: Distribution, movement and sampling of Aedes taeniorhynchus eggshells.

    Science.gov (United States)

    Ritchie, S A; Addison, D S; van Essen, F

    1992-03-01

    The distribution of Aedes taeniorhynchus eggshells in Florida mangrove basin forests was determined and used to design a sampling plan. Eggshells were found in 10/11 sites (91%), with a mean +/- SE density of 1.45 +/- 0.75/cc; density did not change significantly year to year. Highest densities were located on the sloping banks of hummocks, ponds and potholes. Eggshells were less clumped in distribution than eggs and larvae and thus required a smaller sample size for a given precision level. While eggshells were flushed from compact soil that was subject to runoff during heavy rain, mangrove peat, the dominant soil of eggshell-bearing sites, was less dense and had little runoff or eggshell flushing. We suggest that eggshell surveys could be used to identify Ae. taeniorhynchus oviposition sites and oviposition patterns.

  12. Environmental gamma-ray measurements using in situ and core sampling techniques

    International Nuclear Information System (INIS)

    Dickson, H.W.; Kerr, G.D.; Perdue, P.T.; Abdullah, S.A.

    1976-01-01

    Dose rates from natural radionuclides and 137Cs in soils of the Oak Ridge area have been determined from in situ and core sample measurements. In situ γ-ray measurements were made with a transportable spectrometer. A tape of spectral data and a soil core sample from each site were returned to ORNL for further analysis. Information on soil composition, density and moisture content and on the distribution of cesium in the soil was obtained from the core samples. In situ spectra were analyzed by a computer program which identified and assigned energies to peaks, integrated the areas under the peaks, and calculated radionuclide concentrations based on a uniform distribution in the soil. The assumption of a uniform distribution was adequate only for natural radionuclides, but simple corrections can be made to the computer calculations for man-made radionuclides distributed on the surface or exponentially in the soil. For 137Cs a correction was used based on an exponential function fitted to the distribution measured in core samples. At typical sites in Oak Ridge, the dose rate determined from these measurements was about 5 μrad/hr. (author)

  13. Spatial distribution, sampling precision and survey design optimisation with non-normal variables: The case of anchovy (Engraulis encrasicolus) recruitment in Spanish Mediterranean waters

    Science.gov (United States)

    Tugores, M. Pilar; Iglesias, Magdalena; Oñate, Dolores; Miquel, Joan

    2016-02-01

    In the Mediterranean Sea, the European anchovy (Engraulis encrasicolus) plays a key role in both ecological and economic terms. Ensuring stock sustainability requires crucial information, such as species spatial distribution or unbiased abundance and precision estimates, so that management strategies can be defined (e.g. fishing quotas, temporal closure areas or marine protected areas, MPAs). Furthermore, the estimation of the precision of global abundance at different sampling intensities can be used for survey design optimisation. Geostatistics provides a priori unbiased estimates of the spatial structure, global abundance and precision for autocorrelated data. However, its application to non-Gaussian data introduces difficulties into the analysis and can compromise robustness and unbiasedness. The present study applied intrinsic geostatistics in two dimensions in order to (i) analyse the spatial distribution of anchovy in Spanish Western Mediterranean waters during the species' recruitment season, (ii) produce distribution maps, (iii) estimate global abundance and its precision, (iv) analyse the effect of changing the sampling intensity on the precision of global abundance estimates and (v) evaluate the effects of several methodological options on the robustness of all the analysed parameters. The results suggested that while the spatial structure was usually not robust to the tested methodological options when working with the original dataset, it became more robust for the transformed datasets (especially for the log-backtransformed dataset). The global abundance was always highly robust, and the global precision was highly or moderately robust to most of the methodological options, except for data transformation.
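
    The geostatistical machinery referred to above typically starts from an empirical variogram. The following sketch computes one for a synthetic one-dimensional transect; the study itself works in two dimensions, compares several data transformations, and goes on to abundance and precision estimation.

```python
# An empirical variogram of the kind that typically underlies the geostatistical
# estimates above, computed for a synthetic one-dimensional transect; the study
# itself works in two dimensions and compares several data transformations.
import numpy as np

rng = np.random.default_rng(7)
x = np.sort(rng.uniform(0, 100, 150))                           # positions (km)
z = 5 + np.sin(x / 10.0) + rng.normal(scale=0.4, size=x.size)   # e.g. log density

d = np.abs(x[:, None] - x[None, :])      # all pairwise separations
lags, tol = np.arange(2, 30, 2), 1.0
for h in lags:
    mask = (np.abs(d - h) < tol) & (d > 0)
    i, j = np.where(mask)
    gamma = 0.5 * np.mean((z[i] - z[j]) ** 2)   # semivariance at lag ~ h
    print(f"lag = {h:4.1f} km   semivariance = {gamma:.3f}")
```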

  14. Accelerating distributed average consensus by exploring the information of second-order neighbors

    Energy Technology Data Exchange (ETDEWEB)

    Yuan Deming [School of Automation, Nanjing University of Science and Technology, Nanjing 210094, Jiangsu (China); Xu Shengyuan, E-mail: syxu02@yahoo.com.c [School of Automation, Nanjing University of Science and Technology, Nanjing 210094, Jiangsu (China); Zhao Huanyu [School of Automation, Nanjing University of Science and Technology, Nanjing 210094, Jiangsu (China); Chu Yuming [Department of Mathematics, Huzhou Teacher' s College, Huzhou 313000, Zhejiang (China)

    2010-05-17

    The problem of accelerating distributed average consensus by using the information of second-order neighbors, in both the discrete- and continuous-time cases, is addressed in this Letter. In both cases, when the information of second-order neighbors is used in each iteration, the network converges faster than with the algorithm that uses only the information of first-order neighbors. Moreover, the problem of using partial information of second-order neighbors is considered, and the edges are not chosen randomly from the second-order neighbors. In the continuous-time case, the edges are chosen by solving a convex optimization problem formed using the convex relaxation method. In the discrete-time case, for small networks the edges are chosen optimally via the brute-force method. Finally, simulation examples are provided to demonstrate the effectiveness of the proposed algorithm.
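
    A toy discrete-time comparison conveys the idea: run average consensus on a ring graph with first-order neighbors only, then again after adding edges to second-order neighbors, and count the iterations to convergence. This illustrates the general principle, not the Letter's algorithm or its optimized edge selection.

```python
# A toy discrete-time comparison, not the Letter's algorithm or its optimized edge
# selection: average consensus on a ring with first-order neighbors only, then with
# extra edges to second-order neighbors, counting iterations to convergence.
import numpy as np

def consensus_steps(adj, eps, x0, tol=1e-6, max_iter=10_000):
    L = np.diag(adj.sum(axis=1)) - adj          # graph Laplacian
    W = np.eye(len(x0)) - eps * L               # consensus weight matrix
    x, target = x0.copy(), x0.mean()
    for t in range(max_iter):
        if np.max(np.abs(x - target)) < tol:
            return t
        x = W @ x
    return max_iter

n = 20
ring = np.zeros((n, n))
for i in range(n):                               # first-order ring neighbors
    ring[i, (i + 1) % n] = ring[i, (i - 1) % n] = 1
ring2 = ring.copy()
for i in range(n):                               # add second-order neighbors
    ring2[i, (i + 2) % n] = ring2[i, (i - 2) % n] = 1

x0 = np.random.default_rng(3).normal(size=n)
print("first-order only :", consensus_steps(ring, eps=0.25, x0=x0), "iterations")
print("with second-order:", consensus_steps(ring2, eps=0.20, x0=x0), "iterations")
```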

  15. Distribution of Heavy Metal Content Hg and Cr of Environmental Samples at Surabaya Area

    International Nuclear Information System (INIS)

    Agus Taftazani

    2007-01-01

    Determination of Hg and Cr content in Surabaya river and coastal environmental samples using Instrumental Neutron Activation Analysis (INAA) has been carried out. The environmental samples were water, sediment, Eichhornia crassipes (Mart.) Solms, Rhizophora stylosa, Johnius (Johnieops) borneensis fish, and Moolgarda delicate fish from 12 selected locations in the Surabaya area. Dry powder of the sediment and biotic samples and concentrated water samples were irradiated in a neutron flux of 1.05 x 10^11 n.cm^-2.s^-1 for 12 hours. The analytical results showed that the heavy metal concentrations in river water are lower than the limits of Perda Surabaya City No. 02/2004 for 4th-level water, namely Hg (0.005 ppm) and Cr (1.000 ppm). At all locations, the coastal water samples have Hg and Cr concentrations higher than the limits of Kepmen LH No. 51/2004, Hg (0.001 ppm) and Cr (0.005 ppm). The Hg concentrations of the fish samples exceed the threshold according to Kep. Dirjen POM No. 03725/B/SK/VII/89 on the maximum concentration of metal pollution in food. The concentrations of heavy metals in sediment, Eichhornia crassipes (Mart.) Solms and Rhizophora stylosa are not regulated, so heavy metal pollution in these media cannot be assessed against a standard. The Hg and Cr concentrations of the water samples are smaller than those of the biotic and sediment samples. The distribution factor (Fd) is larger than the bioaccumulation factor (Fb). (author)

  16. Web services for distributed and interoperable hydro-information systems

    Science.gov (United States)

    Horak, J.; Orlik, A.; Stromsky, J.

    2008-03-01

    Web services support the integration and interoperability of Web-based applications and enable machine-to-machine interaction. The concepts of web services and open distributed architecture were applied to the development of T-DSS, the prototype customised for web based hydro-information systems. T-DSS provides mapping services, database related services and access to remote components, with special emphasis placed on the output flexibility (e.g. multilingualism), where SOAP web services are mainly used for communication. The remote components are represented above all by remote data and mapping services (e.g. meteorological predictions), modelling and analytical systems (currently HEC-HMS, MODFLOW and additional utilities), which support decision making in water management.

  17. Blockchain in government : Benefits and implications of distributed ledger technology for information sharing

    NARCIS (Netherlands)

    Ølnes, Svein; Ubacht, J.; Janssen, M.F.W.H.A.

    2017-01-01

    Blockchain refers to a range of general purpose technologies to exchange information and transact digital assets in distributed networks. The core question addressed in this paper is whether blockchain technology will lead to innovation and transformation of governmental processes. To address

  18. Linking diversity and distribution to understand biodiversity gradients and inform conservation assessments

    Directory of Open Access Journals (Sweden)

    Fabricio Villalobos

    2014-03-01

    Full Text Available Broad-scale patterns of species richness result from differential coexistence among species in distinct regions of the globe, determined by the species’ ranges and their properties such as size, shape and location. Thus, species richness and ranges are inherently linked. These two biodiversity features also yield primary information for conservation assessments. However, species richness and range size have been usually studied separately and no formal analytical link has been established. In my PhD thesis, I applied and extended a recently developed conceptual and methodological framework to study geographical association among species and similarity among sites. This range–diversity framework, along with stochastic simulation modelling, allowed me to jointly evaluate the relationship between diversity and distribution, to infer potential processes underlying composite patterns of phyllostomid bats, and to use this approach to inform conservation assessments for the Mexican avifauna. I highlight the need to explore composite patterns for understanding biodiversity patterns and show how combining diversity and distributional data can help describe complex biogeographical patterns, providing a transparent and explicit application for initial conservation assessments.

  19. Onco-STS: a web-based laboratory information management system for sample and analysis tracking in oncogenomic experiments.

    Science.gov (United States)

    Gavrielides, Mike; Furney, Simon J; Yates, Tim; Miller, Crispin J; Marais, Richard

    2014-01-01

    Whole genomes, whole exomes and transcriptomes of tumour samples are sequenced routinely to identify the drivers of cancer. The systematic sequencing and analysis of tumour samples, as well other oncogenomic experiments, necessitates the tracking of relevant sample information throughout the investigative process. These meta-data of the sequencing and analysis procedures include information about the samples and projects as well as the sequencing centres, platforms, data locations, results locations, alignments, analysis specifications and further information relevant to the experiments. The current work presents a sample tracking system for oncogenomic studies (Onco-STS) to store these data and make them easily accessible to the researchers who work with the samples. The system is a web application, which includes a database and a front-end web page that allows the remote access, submission and updating of the sample data in the database. The web application development programming framework Grails was used for the development and implementation of the system. The resulting Onco-STS solution is efficient, secure and easy to use and is intended to replace the manual data handling of text records. Onco-STS allows simultaneous remote access to the system making collaboration among researchers more effective. The system stores both information on the samples in oncogenomic studies and details of the analyses conducted on the resulting data. Onco-STS is based on open-source software, is easy to develop and can be modified according to a research group's needs. Hence it is suitable for laboratories that do not require a commercial system.

  20. Background Information for the Nevada National Security Site Integrated Sampling Plan, Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    Farnham, Irene; Marutzky, Sam

    2014-12-01

    This document describes the process followed to develop the Nevada National Security Site (NNSS) Integrated Sampling Plan (referred to herein as the Plan). It provides the Plan’s purpose and objectives, and briefly describes the Underground Test Area (UGTA) Activity, including the conceptual model and regulatory requirements as they pertain to groundwater sampling. Background information on other NNSS groundwater monitoring programs—the Routine Radiological Environmental Monitoring Plan (RREMP) and Community Environmental Monitoring Program (CEMP)—and their integration with the Plan are presented. Descriptions of the evaluations, comments, and responses of two Sampling Plan topical committees are also included.

  1. Improved Sampling Algorithms in the Risk-Informed Safety Margin Characterization Toolkit

    International Nuclear Information System (INIS)

    Mandelli, Diego; Smith, Curtis Lee; Alfonsi, Andrea; Rabiti, Cristian; Cogliati, Joshua Joseph

    2015-01-01

    The RISMC approach is developing an advanced set of methodologies and algorithms in order to perform Probabilistic Risk Analyses (PRAs). In contrast to classical PRA methods, which are based on Event-Tree and Fault-Tree methods, the RISMC approach largely employs system simulator codes applied to stochastic analysis tools. The basic idea is to randomly perturb (by employing sampling algorithms) timing and sequencing of events and internal parameters of the system codes (i.e., uncertain parameters) in order to estimate stochastic parameters such as core damage probability. This approach, applied to complex systems such as nuclear power plants, requires performing a series of computationally expensive simulation runs given a large set of uncertain parameters. These types of analysis are affected by two issues. Firstly, the space of the possible solutions (a.k.a., the issue space or the response surface) can be sampled only very sparsely, and this precludes the ability to fully analyze the impact of uncertainties on the system dynamics. Secondly, large amounts of data are generated and tools to generate knowledge from such data sets are not yet available. This report focuses on the first issue and in particular employs novel methods that optimize the information generated by the sampling process by sampling unexplored and risk-significant regions of the issue space: adaptive (smart) sampling algorithms. They infer system response from surrogate models constructed from existing samples and predict the most relevant location of the next sample. It is therefore possible to understand features of the issue space with a small number of carefully selected samples. In this report, we will present how it is possible to perform adaptive sampling using the RISMC toolkit and highlight the advantages compared to more classical sampling approaches such as Monte Carlo. We will employ RAVEN to perform such statistical analyses using both analytical cases and another RISMC code: RELAP-7.

  2. Improved Sampling Algorithms in the Risk-Informed Safety Margin Characterization Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States); Alfonsi, Andrea [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Cogliati, Joshua Joseph [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    The RISMC approach is developing an advanced set of methodologies and algorithms in order to perform Probabilistic Risk Analyses (PRAs). In contrast to classical PRA methods, which are based on Event-Tree and Fault-Tree methods, the RISMC approach largely employs system simulator codes applied to stochastic analysis tools. The basic idea is to randomly perturb (by employing sampling algorithms) timing and sequencing of events and internal parameters of the system codes (i.e., uncertain parameters) in order to estimate stochastic parameters such as core damage probability. This approach, applied to complex systems such as nuclear power plants, requires performing a series of computationally expensive simulation runs given a large set of uncertain parameters. These types of analysis are affected by two issues. Firstly, the space of the possible solutions (a.k.a., the issue space or the response surface) can be sampled only very sparsely, and this precludes the ability to fully analyze the impact of uncertainties on the system dynamics. Secondly, large amounts of data are generated and tools to generate knowledge from such data sets are not yet available. This report focuses on the first issue and in particular employs novel methods that optimize the information generated by the sampling process by sampling unexplored and risk-significant regions of the issue space: adaptive (smart) sampling algorithms. They infer system response from surrogate models constructed from existing samples and predict the most relevant location of the next sample. It is therefore possible to understand features of the issue space with a small number of carefully selected samples. In this report, we will present how it is possible to perform adaptive sampling using the RISMC toolkit and highlight the advantages compared to more classical sampling approaches such as Monte Carlo. We will employ RAVEN to perform such statistical analyses using both analytical cases and another RISMC code: RELAP-7.
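
    A deliberately simple one-dimensional caricature of the adaptive ("smart") sampling idea is sketched below: a cheap surrogate is rebuilt from the runs completed so far, and the next expensive run is placed where the surrogate is both poorly constrained and close to a limit that matters for risk. The toy simulator, the limit value and the scoring rule are all invented; this is not the RISMC/RAVEN implementation.

```python
# A one-dimensional caricature of adaptive ("smart") sampling: rebuild a cheap
# surrogate from the runs done so far and place the next expensive run where the
# surrogate is poorly constrained and close to a limit that matters for risk.
# The toy "simulator", limit value and scoring rule are invented.
import numpy as np

def simulator(x):                 # stand-in for an expensive system-code run
    return np.sin(3 * x) + 0.3 * x

limit = 0.8                       # hypothetical threshold on the response
xs = list(np.linspace(0.0, 3.0, 4))          # a few initial space-filling runs
ys = [simulator(x) for x in xs]

candidates = np.linspace(0.0, 3.0, 301)
for _ in range(8):                # adaptive refinement loop
    order = np.argsort(xs)
    surrogate = np.interp(candidates, np.array(xs)[order], np.array(ys)[order])
    dist = np.min(np.abs(candidates[:, None] - np.array(xs)[None, :]), axis=1)
    # score favors unexplored regions (large dist) near the risk-significant limit
    score = dist / (np.abs(surrogate - limit) + 0.05)
    x_next = float(candidates[np.argmax(score)])
    xs.append(x_next)
    ys.append(simulator(x_next))

near_limit = sorted(x for x, y in zip(xs, ys) if abs(y - limit) < 0.05)
print(f"{len(xs)} runs; samples placed near the limit: {np.round(near_limit, 2)}")
```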

  3. Bayesian estimation of Weibull distribution parameters

    International Nuclear Information System (INIS)

    Bacha, M.; Celeux, G.; Idee, E.; Lannoy, A.; Vasseur, D.

    1994-11-01

    In this paper, we present the SEM (Stochastic Expectation Maximization) and WLB-SIR (Weighted Likelihood Bootstrap - Sampling Importance Re-sampling) methods, which are used to estimate Weibull distribution parameters when data are heavily censored. The second method is based on Bayesian inference and allows available prior information on the parameters to be taken into account. An application of this method to real data from nuclear power plant operating feedback analysis has been carried out. (authors). 8 refs., 2 figs., 2 tabs
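
    As a hedged illustration of the Bayesian censored-Weibull setup (not the SEM or WLB-SIR algorithms themselves), the sketch below runs a plain random-walk Metropolis sampler on the shape and scale parameters, with invented failure and censoring times and assumed lognormal priors standing in for operating-feedback knowledge.

```python
# A plain random-walk Metropolis sketch for Weibull shape/scale with heavily
# right-censored data and assumed lognormal priors; it only illustrates the
# Bayesian censored-Weibull setup, not the SEM or WLB-SIR algorithms.
import numpy as np

rng = np.random.default_rng(5)
times    = np.array([1200., 3400., 5600., 8000., 8000., 8000., 8000., 8000.])  # hours
observed = np.array([True,  True,  True,  False, False, False, False, False])  # failure?

def log_post(log_k, log_lam):
    k, lam = np.exp(log_k), np.exp(log_lam)
    z = (times / lam) ** k
    loglik = np.sum(observed * (np.log(k / lam) + (k - 1) * np.log(times / lam))) - z.sum()
    # weak lognormal priors standing in for operating-feedback knowledge
    logprior = -0.5 * log_k ** 2 - 0.5 * (log_lam - np.log(10_000)) ** 2
    return loglik + logprior

state = np.array([0.0, np.log(10_000.0)])      # start at the prior means
lp, samples = log_post(*state), []
for i in range(20_000):
    prop = state + rng.normal(scale=0.15, size=2)
    lp_prop = log_post(*prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
        state, lp = prop, lp_prop
    if i >= 5_000:                             # keep draws after burn-in
        samples.append(np.exp(state))

shape, scale = np.mean(samples, axis=0)
print(f"posterior means: shape = {shape:.2f}, scale = {scale:.0f} h")
```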

  4. Empirically simulated study to compare and validate sampling methods used in aerial surveys of wildlife populations

    NARCIS (Netherlands)

    Khaemba, W.M.; Stein, A.; Rasch, D.; Leeuw, de J.; Georgiadis, N.

    2001-01-01

    This paper compares the distribution, sampling and estimation of abundance for two animal species in an African ecosystem by means of an intensive simulation of the sampling process under a geographical information system (GIS) environment. It focuses on systematic and random sampling designs,

  5. Dimension-Independent Likelihood-Informed MCMC

    KAUST Repository

    Cui, Tiangang; Law, Kody; Marzouk, Youssef

    2015-01-01

    Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters, which in principle can be described as functions. By exploiting low-dimensional structure in the change from prior to posterior [distributions], we introduce a suite of MCMC samplers that can adapt to the complex structure of the posterior distribution, yet are well-defined on function space. Posterior sampling in nonlinear inverse problems arising from various partial differential equations and also a stochastic differential equation are used to demonstrate the efficiency of these dimension-independent likelihood-informed samplers.

  6. Dimension-Independent Likelihood-Informed MCMC

    KAUST Repository

    Cui, Tiangang

    2015-01-07

    Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters, which in principle can be described as functions. By exploiting low-dimensional structure in the change from prior to posterior [distributions], we introduce a suite of MCMC samplers that can adapt to the complex structure of the posterior distribution, yet are well-defined on function space. Posterior sampling in nonlinear inverse problems arising from various partial differential equations and also a stochastic differential equation are used to demonstrate the efficiency of these dimension-independent likelihood-informed samplers.
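
    The simplest dimension-robust building block in this family is the preconditioned Crank-Nicolson (pCN) proposal, whose acceptance probability involves only the likelihood and therefore does not degrade as the discretization of the unknown function is refined. The sketch below illustrates that baseline on a synthetic linear inverse problem; the likelihood-informed samplers of the abstract add data-informed low-dimensional subspaces on top of it.

```python
# Not the authors' likelihood-informed construction: this shows only the
# dimension-robust baseline it builds on, a preconditioned Crank-Nicolson (pCN)
# proposal, whose acceptance ratio involves the likelihood alone. The forward map
# and data are synthetic.
import numpy as np

rng = np.random.default_rng(11)
d = 200                                    # discretization of the unknown function
A = rng.normal(size=(10, d)) / np.sqrt(d)  # toy linear forward operator
u_true = rng.normal(size=d)                # a draw from the N(0, I) prior
y = A @ u_true + 0.1 * rng.normal(size=10)

def log_like(u):
    r = y - A @ u
    return -0.5 * np.sum(r ** 2) / 0.1 ** 2

beta = 0.2                                 # pCN step size
u, ll, accepted = np.zeros(d), log_like(np.zeros(d)), 0
for _ in range(20_000):
    v = np.sqrt(1 - beta ** 2) * u + beta * rng.normal(size=d)   # prior-preserving move
    ll_v = log_like(v)
    if np.log(rng.uniform()) < ll_v - ll:                        # likelihood-only ratio
        u, ll = v, ll_v
        accepted += 1
print(f"pCN acceptance rate = {accepted / 20_000:.2f}")
```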

  7. Efficient sampling to determine distribution of fruit quality and yield in a commercial apple orchard

    DEFF Research Database (Denmark)

    Martinez Vega, Mabel Virginia; Wulfsohn, D.; Zamora, I.

    2012-01-01

    In situ assessment of fruit quality and yield can provide critical data for marketing and for logistical planning of the harvest, as well as for site-specific management. Our objective was to develop and validate efficient field sampling procedures for this purpose. We used the previously reported...... ‘fractionator’ tree sampling procedure and supporting handheld software (Gardi et al., 2007; Wulfsohn et al., 2012) to obtain representative samples of fruit from a 7.6-ha apple orchard (Malus ×domestica ‘Fuji Raku Raku’) in central Chile. The resulting sample consisted of 70 fruit on 56 branch segments...... of yield. Estimated marketable yield was 295.8±50.2 t. Field and packinghouse records indicated that of 348.2 t sent to packing (52.4 t or 15% higher than our estimate), 263.0 t was packed for export (32.8 t less or -12% error compared to our estimate). The estimated distribution of caliber compared very...

  8. Effect of current distribution on the voltage-temperature characteristics: study of the NbTi PF-FSJS sample for ITER

    International Nuclear Information System (INIS)

    Zani, L.; Ciazynski, D.; Gislon, P.; Stepanov, B.; Huber, S.

    2004-01-01

    Various tests, on either full-size joint samples or model coils, have confirmed that current distribution may play a crucial role in the electrical behaviour of CICCs under operating conditions. In order to evaluate its influence, CEA developed a code (ENSIC) whose main feature is a CICC electrical model comprising a discrete resistive network associated with superconducting lengths. Longitudinal and transverse resistances are also modeled, representing either the joint or the conductor. In this paper we present the comparison of experimental results with ENSIC calculations for one International Thermonuclear Experimental Reactor (ITER) prototype sample relevant to the poloidal field (PF) coils: the PF full-size joint sample (PF-FSJS). For this purpose, the current distribution was measured with a segmented Rogowski coil system. Current distribution effects on the basic characteristics (TCS, n-value, etc.) of the cable compared to the single strand are discussed. This study aims at shedding light on the global strand state in a conductor and is also useful for evaluating intrinsic parameters that are hard to measure directly (for example, the effective inter-petal transverse contact resistance), allowing further application to coils

  9. A trade-off between local and distributed information processing associated with remote episodic versus semantic memory.

    Science.gov (United States)

    Heisz, Jennifer J; Vakorin, Vasily; Ross, Bernhard; Levine, Brian; McIntosh, Anthony R

    2014-01-01

    Episodic memory and semantic memory produce very different subjective experiences yet rely on overlapping networks of brain regions for processing. Traditional approaches for characterizing functional brain networks emphasize static states of function and thus are blind to the dynamic information processing within and across brain regions. This study used information theoretic measures of entropy to quantify changes in the complexity of the brain's response as measured by magnetoencephalography while participants listened to audio recordings describing past personal episodic and general semantic events. Personal episodic recordings evoked richer subjective mnemonic experiences and more complex brain responses than general semantic recordings. Critically, we observed a trade-off between the relative contribution of local versus distributed entropy, such that personal episodic recordings produced relatively more local entropy whereas general semantic recordings produced relatively more distributed entropy. Changes in the relative contributions of local and distributed entropy to the total complexity of the system provide a potential mechanism that allows the same network of brain regions to represent cognitive information as either specific episodes or more general semantic knowledge.

  10. Predicting cyclohexane/water distribution coefficients for the SAMPL5 challenge using MOSCED and the SMD solvation model

    Science.gov (United States)

    Diaz-Rodriguez, Sebastian; Bozada, Samantha M.; Phifer, Jeremy R.; Paluch, Andrew S.

    2016-11-01

    We present blind predictions using the solubility parameter based method MOSCED submitted for the SAMPL5 challenge on calculating cyclohexane/water distribution coefficients at 298 K. Reference data to parameterize MOSCED was generated with knowledge only of chemical structure by performing solvation free energy calculations using electronic structure calculations in the SMD continuum solvent. To maintain simplicity and use only a single method, we approximate the distribution coefficient with the partition coefficient of the neutral species. Over the final SAMPL5 set of 53 compounds, we achieved an average unsigned error of 2.2 ± 0.2 log units (ranking 15 out of 62 entries), the correlation coefficient (R) was 0.6 ± 0.1 (ranking 35), and 72 ± 6% of the predictions had the correct sign (ranking 30). While used here to predict cyclohexane/water distribution coefficients at 298 K, MOSCED is broadly applicable, allowing one to predict temperature dependent infinite dilution activity coefficients in any solvent for which parameters exist, and provides a means by which an excess Gibbs free energy model may be parameterized to predict composition dependent phase-equilibrium.

  11. Entropy and chemical change. 1: Characterization of product (and reactant) energy distributions in reactive molecular collisions: Information and entropy deficiency

    Science.gov (United States)

    Bernstein, R. B.; Levine, R. D.

    1972-01-01

    Optimal means of characterizing the distribution of product energy states resulting from reactive collisions of molecules with restricted distributions of initial states are considered, along with those for characterizing the particular reactant state distribution which yields a given set of product states at a specified total energy. It is suggested to represent the energy-dependence of global-type results in the form of square-faced bar plots, and of data for specific-type experiments as triangular-faced prismatic plots. The essential parameters defining the internal state distribution are isolated, and the information content of such a distribution is put on a quantitative basis. The relationship between the information content, the surprisal, and the entropy of the continuous distribution is established. The concept of an entropy deficiency, which characterizes the specificity of product state formation, is suggested as a useful measure of the deviance from statistical behavior. The degradation of information by experimental averaging is considered, leading to bounds on the entropy deficiency.
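
    For reference, the standard definitions used in this surprisal-analysis framework can be written as follows, where P(f) is the observed product-state distribution and P^0(f) is the prior ("statistical") distribution expected at the same total energy; these are the textbook forms, stated here only as a reminder.

    ```latex
    % Surprisal of product state f relative to the statistical prior
    I(f) = -\ln\!\left[\frac{P(f)}{P^{0}(f)}\right]
    % Entropy deficiency: the mean surprisal (a relative entropy), zero for purely
    % statistical behaviour and positive when product states are formed specifically
    \Delta S = \sum_{f} P(f)\,\ln\!\left[\frac{P(f)}{P^{0}(f)}\right] \;\ge\; 0
    ```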

  12. A design-based approximation to the Bayes Information Criterion in finite population sampling

    Directory of Open Access Journals (Sweden)

    Enrico Fabrizi

    2014-05-01

    Full Text Available In this article, various issues related to the implementation of the usual Bayesian Information Criterion (BIC are critically examined in the context of modelling a finite population. A suitable design-based approximation to the BIC is proposed in order to avoid the derivation of the exact likelihood of the sample which is often very complex in a finite population sampling. The approximation is justified using a theoretical argument and a Monte Carlo simulation study.
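
    As context, the criterion being approximated is the usual BIC, with k the number of model parameters, n the sample size and \hat{L} the maximized likelihood; the contribution of the record above is to replace the exact sample likelihood, which is often intractable in finite population sampling, with a design-based surrogate.

    ```latex
    \mathrm{BIC} = k \ln n - 2 \ln \hat{L}
    ```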

  13. Information management in smart grids: Who should govern information management to balance between coordination and competition on the distribution grid level?

    OpenAIRE

    Buchmann, Marius

    2016-01-01

    Smart grids should increase coordination on the distribution grid level and facilitate new market opportunities (i.e. competition on a level playing field). Information management is becoming a new task in the electricity supply chain. It is an enabler for the development of smart grids. Therefore, the governance of information management should likewise balance efficiently between coordination and competition. Within this paper we analyse which role from the energy sector could govern the inf...

  14. Basic distribution free identification tests for small size samples of environmental data

    Energy Technology Data Exchange (ETDEWEB)

    Federico, A.G.; Musmeci, F. [ENEA, Centro Ricerche Casaccia, Rome (Italy). Dipt. Ambiente

    1998-01-01

    Testing two or more data sets for the hypothesis that they are sampled from the same population is often required in environmental data analysis. Typically the available samples contain only a small number of data points, and the assumption of normal distributions is often not realistic. On the other hand, the widespread availability of today's powerful personal computers opens new opportunities based on intensive use of CPU resources. The paper reviews the problem and introduces two non-parametric approaches based on the intrinsic equiprobability properties of the data samples. The first is based on full resampling, while the second is based on a bootstrap approach. An easy-to-use program is presented, together with a case study based on the Chernobyl children contamination data.

  15. BWIP-RANDOM-SAMPLING, Random Sample Generation for Nuclear Waste Disposal

    International Nuclear Information System (INIS)

    Sagar, B.

    1989-01-01

    1 - Description of program or function: Random samples for different distribution types are generated. Distribution types as required for performance assessment modeling of geologic nuclear waste disposal are provided. These are: - Uniform, - Log-uniform (base 10 or natural), - Normal, - Lognormal (base 10 or natural), - Exponential, - Bernoulli, - User defined continuous distribution. 2 - Method of solution: A linear congruential generator is used for uniform random numbers. A set of functions is used to transform the uniform distribution to the other distributions. Stratified, rather than random, sampling can be chosen. Truncated limits can be specified on many distributions, whose usual definition has an infinite support. 3 - Restrictions on the complexity of the problem: Generation of correlated random variables is not included
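
    A minimal sketch of the transformation-based generation the record describes, using inverse-CDF transforms of uniform draws; the distribution names and the stratified option mirror the list above, but the code is illustrative and is not the BWIP-RANDOM-SAMPLING program itself.

    ```python
    import numpy as np

    def sample(dist, n, rng=None, **p):
        """Draw n samples by transforming uniform variates (illustrative only)."""
        rng = np.random.default_rng() if rng is None else rng
        u = rng.uniform(size=n)                      # a linear congruential generator would sit here
        if dist == "uniform":
            return p["lo"] + (p["hi"] - p["lo"]) * u
        if dist == "loguniform":                     # base-10 log-uniform between lo and hi
            lo, hi = np.log10(p["lo"]), np.log10(p["hi"])
            return 10 ** (lo + (hi - lo) * u)
        if dist == "exponential":
            return -np.log(1.0 - u) / p["rate"]      # inverse CDF of Exp(rate)
        if dist == "bernoulli":
            return (u < p["prob"]).astype(int)
        raise ValueError(f"unknown distribution: {dist}")

    def stratified_uniform(n, rng=None):
        """Stratified uniform draws: one point per equal-probability stratum."""
        rng = np.random.default_rng() if rng is None else rng
        return (np.arange(n) + rng.uniform(size=n)) / n

    x = sample("loguniform", 1000, lo=1e-3, hi=1e2)
    ```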

  16. Medication errors in residential aged care facilities: a distributed cognition analysis of the information exchange process.

    Science.gov (United States)

    Tariq, Amina; Georgiou, Andrew; Westbrook, Johanna

    2013-05-01

    Medication safety is a pressing concern for residential aged care facilities (RACFs). Retrospective studies in RACF settings identify inadequate communication between RACFs, doctors, hospitals and community pharmacies as the major cause of medication errors. Existing literature offers limited insight about the gaps in the existing information exchange process that may lead to medication errors. The aim of this research was to explicate the cognitive distribution that underlies RACF medication ordering and delivery, in order to identify gaps in medication-related information exchange which lead to medication errors in RACFs. The study was undertaken in three RACFs in Sydney, Australia. Data were generated through ethnographic field work over a period of five months (May-September 2011). Triangulated analysis of data primarily focused on examining the transformation and exchange of information between different media across the process. The findings of this study highlight the extensive scope and intense nature of information exchange in RACF medication ordering and delivery. Rather than attributing error to individual care providers, the explication of distributed cognition processes enabled the identification of gaps in three information exchange dimensions which potentially contribute to the occurrence of medication errors, namely: (1) the design of medication charts, which complicates order processing and record keeping; (2) the lack of coordination mechanisms between participants, which results in misalignment of local practices; and (3) reliance on restricted-bandwidth communication channels, mainly telephone and fax, which complicates the information processing requirements. The study demonstrates how the identification of these gaps enhances understanding of medication errors in RACFs. Application of the theoretical lens of distributed cognition can assist in enhancing our understanding of medication errors in RACFs through identification of gaps in information exchange. Understanding

  17. Data-driven importance distributions for articulated tracking

    DEFF Research Database (Denmark)

    Hauberg, Søren; Pedersen, Kim Steenstrup

    2011-01-01

    We present two data-driven importance distributions for particle filterbased articulated tracking; one based on background subtraction, another on depth information. In order to keep the algorithms efficient, we represent human poses in terms of spatial joint positions. To ensure constant bone le...... filter, where they improve both accuracy and efficiency of the tracker. In fact, they triple the effective number of samples compared to the most commonly used importance distribution at little extra computational cost....

  18. Systematic underestimation of the age of samples with saturating exponential behaviour and inhomogeneous dose distribution

    International Nuclear Information System (INIS)

    Brennan, B.J.

    2000-01-01

    In luminescence and ESR studies, a systematic underestimate of the (average) equivalent dose, and thus also the age, of a sample can occur when there is significant variation of the natural dose within the sample and some regions approach saturation. This is demonstrated explicitly for a material that exhibits a single-saturating-exponential growth of signal with dose. The result is valid for any geometry (e.g. a plain layer, spherical grain, etc.) and some illustrative cases are modelled, with the age bias exceeding 10% in extreme cases. If the dose distribution within the sample can be modelled accurately, it is possible to correct for the bias in the estimates of equivalent dose estimate and age. While quantifying the effect would be more difficult, similar systematic biases in dose and age estimates are likely in other situations more complex than the one modelled
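
    A small numerical illustration of the bias described above; the growth parameters and the two-region dose split are invented for illustration. Because the single-saturating-exponential growth curve is concave, averaging the signal over regions with different natural doses and then inverting the curve yields a dose, and hence an age, below the true mean dose.

    ```python
    import numpy as np

    D0, S_sat = 100.0, 1.0                            # characteristic dose and saturation level (illustrative)
    signal = lambda D: S_sat * (1.0 - np.exp(-D / D0))
    inv_signal = lambda S: -D0 * np.log(1.0 - S / S_sat)

    doses = np.array([50.0, 250.0])                   # two regions of the sample with unequal natural dose
    true_mean_dose = doses.mean()
    apparent_dose = inv_signal(signal(doses).mean())  # average the signal first, then invert

    bias = 1.0 - apparent_dose / true_mean_dose
    print(f"true mean dose {true_mean_dose:.0f}, apparent {apparent_dose:.0f}, underestimate {bias:.0%}")
    ```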

  19. Wrong, but useful: regional species distribution models may not be improved by range-wide data under biased sampling.

    Science.gov (United States)

    El-Gabbas, Ahmed; Dormann, Carsten F

    2018-02-01

    Species distribution modeling (SDM) is an essential method in ecology and conservation. SDMs are often calibrated within one country's borders, typically along a limited environmental gradient with biased and incomplete data, making the quality of these models questionable. In this study, we evaluated how adequate are national presence-only data for calibrating regional SDMs. We trained SDMs for Egyptian bat species at two different scales: only within Egypt and at a species-specific global extent. We used two modeling algorithms: Maxent and elastic net, both under the point-process modeling framework. For each modeling algorithm, we measured the congruence of the predictions of global and regional models for Egypt, assuming that the lower the congruence, the lower the appropriateness of the Egyptian dataset to describe the species' niche. We inspected the effect of incorporating predictions from global models as additional predictor ("prior") to regional models, and quantified the improvement in terms of AUC and the congruence between regional models run with and without priors. Moreover, we analyzed predictive performance improvements after correction for sampling bias at both scales. On average, predictions from global and regional models in Egypt only weakly concur. Collectively, the use of priors did not lead to much improvement: similar AUC and high congruence between regional models calibrated with and without priors. Correction for sampling bias led to higher model performance, whatever prior used, making the use of priors less pronounced. Under biased and incomplete sampling, the use of global bats data did not improve regional model performance. Without enough bias-free regional data, we cannot objectively identify the actual improvement of regional models after incorporating information from the global niche. However, we still believe in great potential for global model predictions to guide future surveys and improve regional sampling in data

  20. 78 FR 34703 - Pipeline Safety: Information Collection Activities, Revision to Gas Distribution Annual Report

    Science.gov (United States)

    2013-06-10

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2013-0004] Pipeline Safety: Information Collection Activities, Revision to Gas Distribution Annual Report AGENCY: Pipeline and Hazardous Materials Safety Administration, DOT. ACTION: Notice and request...

  1. Combination of digital autoradiography and alpha track analysis to reveal the distribution of definite alpha- and beta-emitting nuclides in contaminated samples

    Energy Technology Data Exchange (ETDEWEB)

    Vlasova, I. [Lomonosov MSU (Russian Federation); Kuzmenkova, N. [Vernadsky GEOKHI RAS (Russian Federation); Shiryaev, A. [Frumkin IPCE RAS (Russian Federation); Pryakhin, E. [Urals Research Center for Radiation Medicine (Russian Federation); Kalmykov, S.; Ivanov, I. [PA Mayak (Russian Federation)

    2014-07-01

    Digital autoradiography using Imaging Plates is commonly employed for locating 'hot' particles in contaminated soil, sediment and aerosol probes. However, digital radiography images combined with alpha track radiography data can provide much more information about the micro-distribution of different alpha- and beta-emitting nuclides. A discrimination method to estimate the distribution of the radionuclides that are the main contributors to the total radioactivity (90Sr/90Y, 137Cs, 241Am) has been developed for the case of the artificial reservoir V-17 (PA 'Mayak'). The bottom sediment and hydrobiont probes collected from V-17, along with standards of 137Cs, 90Sr/90Y and 241Am, were exposed for a short time (15 min) using a stack of 3 Imaging Plates (Cyclone Plus Storage Phosphor System, Perkin Elmer). The attenuation of photostimulated luminescence (PSL) intensity from layer to layer of the Imaging Plates depends on the type and energy of the radiation. An integrated approach using PSL attenuation in the samples and standards (digital radiography), along with alpha track radiography and gamma-spectroscopy of the preparation, was used to estimate the contribution of the main nuclides in specific parts of the contaminated samples. The observation of the 90Sr/90Y and 137Cs activity maxima could help to find the phases responsible for preferential sorption of the nuclides. Document available in abstract form only. (authors)

  2. UV TO FAR-IR CATALOG OF A GALAXY SAMPLE IN NEARBY CLUSTERS: SPECTRAL ENERGY DISTRIBUTIONS AND ENVIRONMENTAL TRENDS

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez-Fernandez, Jonathan D.; Iglesias-Paramo, J.; Vilchez, J. M., E-mail: jonatan@iaa.es [Instituto de Astrofisica de Andalucia, Glorieta de la Astronomia s/n, 18008 Granada (Spain)

    2012-03-01

    In this paper, we present a sample of cluster galaxies devoted to studying the environmental influence on star formation activity. This sample of galaxies inhabits clusters showing a rich variety in their characteristics and has been observed by the SDSS-DR6 down to M_B ≈ -18, and by the Galaxy Evolution Explorer AIS throughout sky regions corresponding to several megaparsecs. We assign the broadband and emission-line fluxes from ultraviolet to far-infrared to each galaxy, performing an accurate spectral energy distribution for spectral fitting analysis. The clusters follow the general X-ray luminosity versus velocity dispersion trend of L_X ∝ σ_c^4.4. The analysis of the distributions of galaxy density, counting up to the 5th nearest neighbor (Σ_5), shows: (1) the virial regions and the cluster outskirts share a common range in the high-density part of the distribution. This can be attributed to the presence of massive galaxy structures in the surroundings of virial regions. (2) The virial regions of massive clusters (σ_c > 550 km s^-1) present a Σ_5 distribution statistically distinguishable (≈96%) from the corresponding distribution of low-mass clusters (σ_c < 550 km s^-1). Both massive and low-mass clusters follow a similar density-radius trend, but the low-mass clusters avoid the high-density extreme. We illustrate, with ABELL 1185, the environmental trends of galaxy populations. Maps of sky-projected galaxy density show how low-luminosity star-forming galaxies appear distributed along more spread-out structures than their giant counterparts, whereas low-luminosity passive galaxies avoid the low-density environment. Giant passive and star-forming galaxies share rather similar sky regions, with passive galaxies exhibiting more concentrated distributions.

  3. Distribution of blood types in a sample of 245 New Zealand non-purebred cats.

    Science.gov (United States)

    Cattin, R P

    2016-05-01

    To determine the distribution of feline blood types in a sample of non-pedigree, domestic cats in New Zealand, whether a difference exists in this distribution between domestic short haired and domestic long haired cats, and between the North and South Islands of New Zealand; and to calculate the risk of a random blood transfusion causing a severe transfusion reaction, and the risk of a random mating producing kittens susceptible to neonatal isoerythrolysis. The results of 245 blood typing tests in non-pedigree cats performed at the New Zealand Veterinary Pathology (NZVP) and Gribbles Veterinary Pathology laboratories between the beginning of 2009 and the end of 2014 were retrospectively collated and analysed. Cats that were identified as domestic short or long haired were included. For the cats tested at Gribbles Veterinary Pathology 62 were from the North Island, and 27 from the South Island. The blood type distribution differed between samples from the two laboratories (p=0.029), but not between domestic short and long haired cats (p=0.50), or between the North and South Islands (p=0.76). Of the 89 cats tested at Gribbles Veterinary Pathology, 70 (79%) were type A, 18 (20%) type B, and 1 (1%) type AB; for NZVP 139/156 (89.1%) cats were type A, 16 (10.3%) type B, and 1 (0.6%) type AB. It was estimated that 18.3-31.9% of random blood transfusions would be at risk of a transfusion reaction, and neonatal isoerythrolysis would be a risk in 9.2-16.1% of random matings between non-pedigree cats. The results from this study suggest that there is a high risk of complications for a random blood transfusion between non-purebred cats in New Zealand. Neonatal isoerythrolysis should be considered an important differential diagnosis in illness or mortality in kittens during the first days of life.

  4. Metadata Schema Used in OCLC Sampled Web Pages

    Directory of Open Access Journals (Sweden)

    Fei Yu

    2005-12-01

    Full Text Available The tremendous growth of Web resources has made information organization and retrieval more and more difficult. As one approach to this problem, metadata schemas have been developed to characterize Web resources. However, many questions have been raised about the use of metadata schemas, such as: which metadata schemas have been used on the Web? How did they describe Web-accessible information? What is the distribution of these metadata schemas among Web pages? Do certain schemas dominate the others? To address these issues, this study analyzed 16,383 Web pages with meta tags extracted from 200,000 OCLC sampled Web pages in 2000. It found that only 8.19% of Web pages used meta tags; description tags, keyword tags, and Dublin Core tags were the only three schemas used in the Web pages. This article revealed the use of meta tags in terms of their function distribution, syntax characteristics, granularity of the Web pages, and the length distribution and word number distribution of both description and keywords tags.

  5. A weighted sampling algorithm for the design of RNA sequences with targeted secondary structure and nucleotide distribution.

    Science.gov (United States)

    Reinharz, Vladimir; Ponty, Yann; Waldispühl, Jérôme

    2013-07-01

    The design of RNA sequences folding into predefined secondary structures is a milestone for many synthetic biology and gene therapy studies. Most of the current software uses similar local search strategies (i.e. a random seed is progressively adapted to acquire the desired folding properties) and more importantly do not allow the user to control explicitly the nucleotide distribution such as the GC-content in their sequences. However, the latter is an important criterion for large-scale applications as it could presumably be used to design sequences with better transcription rates and/or structural plasticity. In this article, we introduce IncaRNAtion, a novel algorithm to design RNA sequences folding into target secondary structures with a predefined nucleotide distribution. IncaRNAtion uses a global sampling approach and weighted sampling techniques. We show that our approach is fast (i.e. running time comparable or better than local search methods), seedless (we remove the bias of the seed in local search heuristics) and successfully generates high-quality sequences (i.e. thermodynamically stable) for any GC-content. To complete this study, we develop a hybrid method combining our global sampling approach with local search strategies. Remarkably, our glocal methodology overcomes both local and global approaches for sampling sequences with a specific GC-content and target structure. IncaRNAtion is available at csb.cs.mcgill.ca/incarnation/. Supplementary data are available at Bioinformatics online.
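
    A toy illustration of the weighted-sampling idea, not the IncaRNAtion algorithm itself (which samples sequences compatible with a target secondary structure from a Boltzmann-weighted ensemble); here the weights simply bias independent nucleotide draws so that the expected GC-content matches a target.

    ```python
    import numpy as np

    def sample_sequence(length, target_gc, rng=None):
        """Draw an RNA sequence whose expected GC-content equals target_gc."""
        rng = np.random.default_rng() if rng is None else rng
        p_gc, p_at = target_gc / 2.0, (1.0 - target_gc) / 2.0   # split the weight within each pair
        probs = {"G": p_gc, "C": p_gc, "A": p_at, "U": p_at}
        return "".join(rng.choice(list(probs), size=length, p=list(probs.values())))

    seqs = [sample_sequence(120, target_gc=0.6) for _ in range(10)]
    gc = np.mean([(s.count("G") + s.count("C")) / len(s) for s in seqs])
    print(f"average GC-content of the sample: {gc:.2f}")        # close to 0.6
    ```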

  6. Knowing when to trust a teacher: The contribution of category status and sample composition to young children's judgments of informant trustworthiness.

    Science.gov (United States)

    Lawson, Chris A

    2018-09-01

    Two experiments examined the extent to which category status influences children's attention to the composition of evidence samples provided by different informants. Children were told about two informants, each of whom presented different samples of evidence, and then were asked to judge which informant they would trust to help them learn something new. The composition of evidence samples was manipulated such that one sample included either a large number (n = 5) or a diverse range of exemplars relative to the other sample, which included either a small number (n = 2) or a homogeneous range of exemplars. Experiment 1 revealed that participants (N = 37; M age = 4.76 years) preferred to place their trust in the informant who presented the large or diverse sample when each informant was labeled "teacher" but exhibited no preference when each informant was labeled "child." Experiment 2 revealed developmental differences in responses when labels and sample composition were pitted against each other. Younger children (n = 32; M age = 3.42 years) consistently trusted the "teacher" regardless of the composition of the sample the informant was said to have provided, whereas older children (n = 30; M age = 5.54 years) consistently trusted the informant who provided the large or diverse sample regardless of whether it was provided by a "teacher" or a "child." These results have important implications for understanding the interplay between children's category knowledge and their evaluation of evidence. Copyright © 2018 Elsevier Inc. All rights reserved.

  7. Statistical inferences with jointly type-II censored samples from two Pareto distributions

    Science.gov (United States)

    Abu-Zinadah, Hanaa H.

    2017-08-01

    In several industrial settings the product comes from more than one production line, which calls for comparative life tests. This requires sampling from the different production lines, so a joint censoring scheme arises. In this article we consider the Pareto lifetime distribution under a joint type-II censoring scheme. The maximum likelihood estimators (MLE) and the corresponding approximate confidence intervals, as well as the bootstrap confidence intervals of the model parameters, are obtained. Bayesian point estimates and credible intervals of the model parameters are also presented. A lifetime data set is analyzed for illustrative purposes. Monte Carlo results from simulation studies are presented to assess the performance of our proposed method.

  8. Distribution of polybrominated diphenyl ethers in Japanese autopsy tissue and body fluid samples.

    Science.gov (United States)

    Hirai, Tetsuya; Fujimine, Yoshinori; Watanabe, Shaw; Nakano, Takeshi

    2012-09-01

    Brominated flame retardants are components of many plastics and are used in products such as cars, textiles, televisions, and personal computers. Human exposure to polybrominated diphenyl ether (PBDE) flame retardants has increased exponentially during the last three decades. Our objective was to measure the body burden and distribution of PBDEs and to determine the concentrations of the predominant PBDE congeners in samples of liver, bile, adipose tissue, and blood obtained from Japanese autopsy cases. Tissues and body fluids obtained from 20 autopsy cases were analyzed. The levels of 25 PBDE congeners, ranging from tri- to hexa-BDEs, were assessed. The geometric means of the sum of the concentrations of PBDE congeners having detection frequencies >50 % (ΣPBDE) in the blood, liver, bile, and adipose tissue were 2.4, 2.6, 1.4, and 4.3 ng/g lipid, respectively. The most abundant congeners were BDE-47 and BDE-153, followed by BDE-100, BDE-99, and BDE-28+33. These concentrations of PBDE congeners were similar to other reports of human exposure in Japan but were notably lower than concentrations than those reported in the USA. Significant positive correlations were observed between the concentrations of predominant congeners and ΣPBDE among the samples analyzed. The ΣPBDE concentration was highest in the adipose tissue, but PBDEs were distributed widely among the tissues and body fluids analyzed. The PBDE levels observed in the present study are similar to those reported in previous studies in Japan and significantly lower than those reported in the USA.

  9. The effects of sampling bias and model complexity on the predictive performance of MaxEnt species distribution models.

    Science.gov (United States)

    Syfert, Mindy M; Smith, Matthew J; Coomes, David A

    2013-01-01

    Species distribution models (SDMs) trained on presence-only data are frequently used in ecological research and conservation planning. However, users of SDM software are faced with a variety of options, and it is not always obvious how selecting one option over another will affect model performance. Working with MaxEnt software and with tree fern presence data from New Zealand, we assessed whether (a) choosing to correct for geographical sampling bias and (b) using complex environmental response curves have strong effects on goodness of fit. SDMs were trained on tree fern data, obtained from an online biodiversity data portal, with two sources that differed in size and geographical sampling bias: a small, widely-distributed set of herbarium specimens and a large, spatially clustered set of ecological survey records. We attempted to correct for geographical sampling bias by incorporating sampling bias grids in the SDMs, created from all georeferenced vascular plants in the datasets, and explored model complexity issues by fitting a wide variety of environmental response curves (known as "feature types" in MaxEnt). In each case, goodness of fit was assessed by comparing predicted range maps with tree fern presences and absences using an independent national dataset to validate the SDMs. We found that correcting for geographical sampling bias led to major improvements in goodness of fit, but did not entirely resolve the problem: predictions made with clustered ecological data were inferior to those made with the herbarium dataset, even after sampling bias correction. We also found that the choice of feature type had negligible effects on predictive performance, indicating that simple feature types may be sufficient once sampling bias is accounted for. Our study emphasizes the importance of reducing geographical sampling bias, where possible, in datasets used to train SDMs, and the effectiveness and essentialness of sampling bias correction within MaxEnt.
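
    A minimal sketch of the bias-grid construction described above: a relative sampling-effort surface is estimated from the occurrence records of a broad target group (here, all georeferenced vascular plants) and later supplied to the SDM so that background points reflect survey effort. The 2D-histogram-with-smoothing recipe, the grid resolution and the hypothetical loader are assumptions; passing the grid to MaxEnt as its bias file happens outside this snippet.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def sampling_bias_grid(lons, lats, extent, cell=0.05, sigma=2.0):
        """Relative sampling effort on a lon/lat grid from target-group occurrence points."""
        lon_min, lon_max, lat_min, lat_max = extent
        lon_edges = np.arange(lon_min, lon_max + cell, cell)
        lat_edges = np.arange(lat_min, lat_max + cell, cell)
        counts, _, _ = np.histogram2d(lons, lats, bins=[lon_edges, lat_edges])
        effort = gaussian_filter(counts, sigma=sigma)   # smooth point counts into a surface
        effort = effort / effort.max()                  # relative effort in (0, 1]
        return np.clip(effort, 1e-6, None)              # keep values strictly positive

    # Hypothetical usage with all georeferenced vascular-plant records for New Zealand:
    # lons, lats = load_target_group_records("nz_vascular_plants.csv")  # hypothetical helper
    # bias = sampling_bias_grid(lons, lats, extent=(166.0, 179.0, -47.5, -34.0))
    ```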

  10. The information-motivation-behavioral skills model of ART adherence in a Deep South HIV+ clinic sample.

    Science.gov (United States)

    Amico, K Rivet; Barta, William; Konkle-Parker, Deborah J; Fisher, Jeffrey D; Cornman, Deborah H; Shuper, Paul A; Fisher, William A

    2009-02-01

    High levels of adherence to antiretroviral therapy (ART) are critical to the management of HIV, yet many people living with HIV do not achieve these levels. There is a substantial body of literature regarding correlates of adherence to ART, and theory-based multivariate models of ART adherence are emerging. The current study assessed the determinants of adherence behavior postulated by the Information-Motivation-Behavioral Skills model of ART adherence in a sample of 149 HIV-positive patients in Mississippi. Structural equation modeling indicated that ART-related information correlated with personal and social motivation, and the two sub-areas of motivation were not intercorrelated. In this Deep South sample, being better informed, socially supported, and perceiving fewer negative consequences of adherence were independently related to stronger behavioral skills for taking medications, which in turn associated with self-reported adherence. The IMB model of ART adherence appeared to well characterize the complexities of adherence for this sample.

  11. A network-based distributed, media-rich computing and information environment

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, R.L.

    1995-12-31

    Sunrise is a Los Alamos National Laboratory (LANL) project started in October 1993. It is intended to be a prototype National Information Infrastructure development project. A main focus of Sunrise is to tie together enabling technologies (networking, object-oriented distributed computing, graphical interfaces, security, multi-media technologies, and data-mining technologies) with several specific applications. A diverse set of application areas was chosen to ensure that the solutions developed in the project are as generic as possible. Some of the application areas are materials modeling, medical records and image analysis, transportation simulations, and K-12 education. This paper provides a description of Sunrise and a view of the architecture and objectives of this evolving project. The primary objectives of Sunrise are three-fold: (1) To develop common information-enabling tools for advanced scientific research and its applications to industry; (2) To enhance the capabilities of important research programs at the Laboratory; (3) To define a new way of collaboration between computer science and industrially-relevant research.

  12. Capture and exploration of sample quality data to inform and improve the management of a screening collection.

    Science.gov (United States)

    Charles, Isabel; Sinclair, Ian; Addison, Daniel H

    2014-04-01

    A new approach to the storage, processing, and interrogation of the quality data for screening samples has improved analytical throughput and confidence and enhanced the opportunities for learning from the accumulating records. The approach has entailed the design, development, and implementation of a database-oriented system, capturing information from the liquid chromatography-mass spectrometry capabilities used for assessing the integrity of samples in AstraZeneca's screening collection. A Web application has been developed to enable the visualization and interactive annotation of the analytical data, monitor the current sample queue, and report the throughput rate. Sample purity and identity are certified automatically on the chromatographic peaks of interest if predetermined thresholds are reached on key parameters. Using information extracted in parallel from the compound registration and container inventory databases, the chromatographic and spectroscopic profiles for each vessel are linked to the sample structures and storage histories. A search engine facilitates the direct comparison of results for multiple vessels of the same or similar compounds, for single vessels analyzed at different time points, or for vessels related by their origin or process flow. Access to this network of information has provided a deeper understanding of the multiple factors contributing to sample quality assurance.

  13. Optimal sampling theory and population modelling - Application to determination of the influence of the microgravity environment on drug distribution and elimination

    Science.gov (United States)

    Drusano, George L.

    1991-01-01

    The optimal sampling theory is evaluated in applications to studies related to the distribution and elimination of several drugs (including ceftazidime, piperacillin, and ciprofloxacin), using the SAMPLE module of the ADAPT II package of programs developed by D'Argenio and Schumitzky (1979, 1988) and comparing the pharmacokinetic parameter values with results obtained by traditional ten-sample design. The impact of the use of optimal sampling was demonstrated in conjunction with NONMEM (Sheiner et al., 1977) approach, in which the population is taken as the unit of analysis, allowing even fragmentary patient data sets to contribute to population parameter estimates. It is shown that this technique is applicable in both the single-dose and the multiple-dose environments. The ability to study real patients made it possible to show that there was a bimodal distribution in ciprofloxacin nonrenal clearance.

  14. A Bayesian Method for Weighted Sampling

    OpenAIRE

    Lo, Albert Y.

    1993-01-01

    Bayesian statistical inference for sampling from weighted distribution models is studied. Small-sample Bayesian bootstrap clone (BBC) approximations to the posterior distribution are discussed. A second-order property for the BBC in unweighted i.i.d. sampling is given. A consequence is that BBC approximations to a posterior distribution of the mean and to the sampling distribution of the sample average, can be made asymptotically accurate by a proper choice of the random variables that genera...
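
    For orientation, a minimal sketch of the plain Bayesian bootstrap (Rubin, 1981), the device that the "Bayesian bootstrap clone" approximations in the record build on; the weighted-distribution refinements of the paper itself are not reproduced here.

    ```python
    import numpy as np

    def bayesian_bootstrap_mean(x, n_draws=5000, rng=None):
        """Approximate posterior draws of the population mean via the Bayesian bootstrap.

        Each replicate reweights the observed data with Dirichlet(1, ..., 1) weights.
        """
        rng = np.random.default_rng() if rng is None else rng
        x = np.asarray(x, dtype=float)
        w = rng.dirichlet(np.ones(len(x)), size=n_draws)   # one weight vector per replicate
        return w @ x                                        # weighted means, shape (n_draws,)

    x = np.random.default_rng(1).normal(10.0, 2.0, size=40)
    post = bayesian_bootstrap_mean(x)
    print(np.percentile(post, [2.5, 50, 97.5]))             # approximate 95% interval for the mean
    ```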

  15. Radiographic information theory: correction for x-ray spectral distribution

    International Nuclear Information System (INIS)

    Brodie, I.; Gutcheck, R.A.

    1983-01-01

    A more complete computational method is developed to account for the effect of the spectral distribution of the incident x-ray fluence on the minimum exposure required to record a specified information set in a diagnostic radiograph. It is shown that an earlier, less rigorous, but simpler computational technique does not introduce serious errors provided that both a good estimate of the mean energy per photon can be made and the detector does not contain an absorption edge in the spectral range. Also shown is that to a first approximation, it is immaterial whether the detecting surface counts the number of photons incident from each pixel or measures the energy incident on each pixel. A previous result is confirmed that, for mammography, the present methods of processing data from the detector utilize only a few percent of the incident information, suggesting that techniques can be developed for obtaining mammograms at substantially lower doses than those presently used. When used with film-screen combinations, x-ray tubes with tungsten anodes should require substantially lower exposures than devices using molybdenum anodes, when both are operated at their optimal voltage

  16. Genetic patterns in forest antelope populations in the Udzungwa Mountains, Tanzania, as inferred from non-invasive sampling

    DEFF Research Database (Denmark)

    Bowkett, Andrew E.; Jones, Trevor; Rovero, Francesco

    2015-01-01

    As for many tropical regions, the evolutionary and demographic status of antelope populations in the Udzungwa Mountains, Tanzania, is poorly resolved. We employed genetic information from 618 faecal samples to assess the status of forest antelope species in terms of their distribution, intraspecific diversity and population subdivision within the Udzungwa landscape. Most species were detected in the majority of forest fragments, except for Philantomba monticola. Phylogenetic analyses were consistent with traditional taxonomy with the exception of Cephalophus harveyi, which was paraphyletic [...] except the endangered C. spadix. Overall, our results demonstrate the value of non-invasive genetic sampling in studying the distribution and evolution of rarely observed species.

  17. Sampling Assumptions Affect Use of Indirect Negative Evidence in Language Learning.

    Directory of Open Access Journals (Sweden)

    Anne Hsu

    Full Text Available A classic debate in cognitive science revolves around understanding how children learn complex linguistic patterns, such as restrictions on verb alternations and contractions, without negative evidence. Recently, probabilistic models of language learning have been applied to this problem, framing it as a statistical inference from a random sample of sentences. These probabilistic models predict that learners should be sensitive to the way in which sentences are sampled. There are two main types of sampling assumptions that can operate in language learning: strong and weak sampling. Strong sampling, as assumed by probabilistic models, assumes the learning input is drawn from a distribution of grammatical samples from the underlying language and aims to learn this distribution. Thus, under strong sampling, the absence of a sentence construction from the input provides evidence that it has low or zero probability of grammaticality. Weak sampling does not make assumptions about the distribution from which the input is drawn, and thus the absence of a construction from the input as not used as evidence of its ungrammaticality. We demonstrate in a series of artificial language learning experiments that adults can produce behavior consistent with both sets of sampling assumptions, depending on how the learning problem is presented. These results suggest that people use information about the way in which linguistic input is sampled to guide their learning.
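
    A small worked illustration of why the two assumptions diverge; the two-hypothesis "grammar" is invented for illustration. Under strong sampling each observed sentence is drawn from the hypothesis's own distribution, so a more restrictive grammar gains likelihood with every consistent example (the size principle), whereas under weak sampling the likelihood is flat and the absence of a construction carries no evidence.

    ```python
    # Two candidate grammars: H_small licenses 10 constructions, H_large licenses 20.
    # All n observed sentences happen to be consistent with both grammars.
    n = 10
    strong_small, strong_large = (1 / 10) ** n, (1 / 20) ** n   # strong-sampling likelihoods
    weak_small = weak_large = 1.0                               # weak sampling: flat likelihood

    # Posterior odds with equal priors
    print("strong sampling odds (small:large):", strong_small / strong_large)   # 2**10 = 1024
    print("weak sampling odds (small:large):  ", weak_small / weak_large)       # 1
    ```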

  18. Sampling Assumptions Affect Use of Indirect Negative Evidence in Language Learning

    Science.gov (United States)

    2016-01-01

    A classic debate in cognitive science revolves around understanding how children learn complex linguistic patterns, such as restrictions on verb alternations and contractions, without negative evidence. Recently, probabilistic models of language learning have been applied to this problem, framing it as a statistical inference from a random sample of sentences. These probabilistic models predict that learners should be sensitive to the way in which sentences are sampled. There are two main types of sampling assumptions that can operate in language learning: strong and weak sampling. Strong sampling, as assumed by probabilistic models, assumes the learning input is drawn from a distribution of grammatical samples from the underlying language and aims to learn this distribution. Thus, under strong sampling, the absence of a sentence construction from the input provides evidence that it has low or zero probability of grammaticality. Weak sampling does not make assumptions about the distribution from which the input is drawn, and thus the absence of a construction from the input as not used as evidence of its ungrammaticality. We demonstrate in a series of artificial language learning experiments that adults can produce behavior consistent with both sets of sampling assumptions, depending on how the learning problem is presented. These results suggest that people use information about the way in which linguistic input is sampled to guide their learning. PMID:27310576

  19. NASA's Information Power Grid: Large Scale Distributed Computing and Data Management

    Science.gov (United States)

    Johnston, William E.; Vaziri, Arsi; Hinke, Tom; Tanner, Leigh Ann; Feiereisen, William J.; Thigpen, William; Tang, Harry (Technical Monitor)

    2001-01-01

    Large-scale science and engineering are done through the interaction of people, heterogeneous computing resources, information systems, and instruments, all of which are geographically and organizationally dispersed. The overall motivation for Grids is to facilitate the routine interactions of these resources in order to support large-scale science and engineering. Multi-disciplinary simulations provide a good example of a class of applications that are very likely to require aggregation of widely distributed computing, data, and intellectual resources. Such simulations - e.g. whole system aircraft simulation and whole system living cell simulation - require integrating applications and data that are developed by different teams of researchers, frequently in different locations. The research teams are the only ones that have the expertise to maintain and improve the simulation code and/or the body of experimental data that drives the simulations. This results in an inherently distributed computing and data management environment.

  20. Distribution of local critical current along sample length and its relation to overall current in a long Bi2223/Ag superconducting composite tape

    International Nuclear Information System (INIS)

    Ochiai, S; Doko, D; Okuda, H; Oh, S S; Ha, D W

    2006-01-01

    The distribution of the local critical current and the n-value along the sample length, and their relation to the overall critical current, were studied experimentally and analytically for a bent multifilamentary Bi2223/Ag/Ag-Mg alloy superconducting composite tape. Based on the results, the dependence of the critical current on the sample length was then simulated on a computer. The main results are summarized as follows. The experimentally observed relation of the distributed local critical current and n-value to the overall critical current was described comprehensively with a simple voltage summation model, in which the sample is regarded as a one-dimensional series circuit. The sample length dependence of the critical current was reproduced in a Monte Carlo simulation incorporating the voltage summation model, the regression analysis results for the local critical current distribution, and the relation of the n-value to the critical current.
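
    A minimal sketch of the voltage (electric-field) summation idea described above. The power-law E-J relation and the 1 µV/cm criterion are standard conventions, and the local Ic and n-value distributions below are illustrative, not those of the measured tape: the tape is treated as a one-dimensional series circuit, the segment fields are summed, and the overall critical current is the current at which the length-averaged field reaches the criterion.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    Ec = 1e-6 * 100.0   # electric-field criterion: 1 uV/cm expressed in V/m

    def overall_ic(local_ic, local_n, seg_len):
        """Overall critical current of a series circuit of segments with local Ic and n-values."""
        total_len = np.sum(seg_len)

        def mean_field_excess(current):
            e = Ec * (current / local_ic) ** local_n        # power-law field in each segment
            return np.sum(e * seg_len) / total_len - Ec     # zero when the average field hits Ec

        return brentq(mean_field_excess, 1e-3, 2.0 * local_ic.max())

    rng = np.random.default_rng(2)
    local_ic = rng.normal(100.0, 8.0, size=20)   # A, illustrative scatter along a bent tape
    local_n = rng.normal(18.0, 2.0, size=20)
    seg_len = np.full(20, 0.05)                  # 5 cm segments
    print(f"overall Ic = {overall_ic(local_ic, local_n, seg_len):.1f} A")
    ```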

  1. It is time to improve the quality of medical information distributed to students across social media.

    Science.gov (United States)

    Zucker, Benjamin E; Kontovounisios, Christos

    2018-01-01

    The ubiquitous nature of social media has meant that its effects on fields outside of social communication have begun to be felt. Those currently undergoing medical education belong to the generation referred to as "digital natives", and as such routinely incorporate social media into their education. Social media's incorporation into medical education includes its use as a platform to distribute information to the public ("distributive education") and as a platform to provide information to a specific audience ("push education"). These functions have proved beneficial in many regards, such as enabling constant access to the subject matter, other learners, and educators. However, the usefulness of social media as part of medical education is limited by the vast quantities of poor-quality information and the time required to find information of sufficient quality and relevance, a problem compounded by many students' preoccupation with "efficient" learning. In this Perspective, the authors discuss whether social media has proved useful as a tool for medical education. The current growth in the use of social media as a tool for medical education seems to be principally supported by students' desire for efficient learning rather than by the efficacy of social media as a resource for medical education. Therefore, improvements in the quality of information are required to maximize the impact of social media as a tool for medical education. Suggested improvements include an increase in the amount of educational content distributed on social media produced by academic institutions, such as universities and journals.

  2. Sample-size effects in fast-neutron gamma-ray production measurements: solid-cylinder samples

    International Nuclear Information System (INIS)

    Smith, D.L.

    1975-09-01

    The effects of geometry, absorption and multiple scattering in (n,Xγ) reaction measurements with solid-cylinder samples are investigated. Both analytical and Monte-Carlo methods are employed in the analysis. Geometric effects are shown to be relatively insignificant except in definition of the scattering angles. However, absorption and multiple-scattering effects are quite important; accurate microscopic differential cross sections can be extracted from experimental data only after a careful determination of corrections for these processes. The results of measurements performed using several natural iron samples (covering a wide range of sizes) confirm validity of the correction procedures described herein. It is concluded that these procedures are reliable whenever sufficiently accurate neutron and photon cross section and angular distribution information is available for the analysis. (13 figures, 5 tables) (auth)

  3. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria

  4. Eco-distribution Mapping of Invasive Weed Limnocharis flava (L. Buchenau Using Geographical Information System: Implications for Containment and Integrated Weed Management for Ecosystem Conservation

    Directory of Open Access Journals (Sweden)

    P. C. Abhilash

    2008-03-01

    Full Text Available Exotic weed invasion has been identified as one of the most serious environmental problems impacting the structure, composition and function of biological diversity. Exotic weeds are aggressive colonizers with flexible habitat requirements and the ability to outcompete native species. The present paper describes the distribution and autecology of the exotic weed Limnocharis flava (L.) Buchenau (an emergent aquatic weed of the 'Limnocharitaceae') in Kumarakom Grama Panchayat, one of the well-known tourist spots of South India, famous for its vast stretches of paddy fields, wetlands and backwaters. The mapping of L. flava in the entire study area was done using a Geographical Information System (Arc-Info 8.3). The growth and distribution pattern of L. flava were studied quantitatively. Data on distribution, abundance, biomass, ecological associations and root-zone nutrient quality of water and sediment samples were collected from different sampling points in Kumarakom. The study showed that nutrients, water depth and land-use patterns were the major factors responsible for the growth and proliferation of this exotic weed. Strategies for controlling L. flava invasion are discussed in detail. If early steps are not taken to eradicate this weed, it will become as problematic as other noxious aquatic weeds such as Salvinia molesta D. Mitch and Eichhornia crassipes (C. Martius) Solms-Laub.

  5. Efficient sampling to determine the distribution of fruit quality and yield in a commercial apple orchard

    DEFF Research Database (Denmark)

    Martinez, M.; Wulfsohn, Dvora-Laio; Zamora, I.

    2012-01-01

    In situ assessment of fruit quality and yield can provide critical data for marketing and for logistical planning of the harvest, as well as for site-specific management. Our objective was to develop and validate efficient field sampling procedures for this purpose. We used the previously reported...... 'fractionator' tree sampling procedure and supporting handheld software (Gardi et al., 2007; Wulfsohn et al., 2012) to obtain representative samples of fruit from a 7.6-ha apple orchard (Malus ×domestica 'Fuji Raku Raku') in central Chile. The resulting sample consisted of 70 fruit on 56 branch segments...... of yield. Estimated marketable yield was 295.8±50.2 t. Field and packinghouse records indicated that of 348.2 t sent to packing (52.4 t or 15% higher than our estimate), 263.0 t was packed for export (32.8 t less or -12% error compared to our estimate). The estimated distribution of caliber compared very...

  6. Reconstructing missing information on precipitation datasets: impact of tails on adopted statistical distributions.

    Science.gov (United States)

    Pedretti, Daniele; Beckie, Roger Daniel

    2014-05-01

    Missing data in hydrological time-series databases are ubiquitous in practical applications, yet it is of fundamental importance to make educated decisions in problems requiring exhaustive time-series knowledge. This includes precipitation datasets, since recording or human failures can produce gaps in these time series. For some applications, directly involving the ratio between precipitation and some other quantity, the lack of complete information can result in poor understanding of basic physical and chemical dynamics involving precipitated water. For instance, the ratio between precipitation (recharge) and outflow rates at a discharge point of an aquifer (e.g. rivers, pumping wells, lysimeters) can be used to obtain aquifer parameters and thus to constrain model-based predictions. We tested a suite of methodologies to reconstruct missing information in rainfall datasets. The goal was to obtain a suitable and versatile method to reduce the errors caused by the lack of data in specific time windows. Our analyses included both a classical chronological-pairing approach between rainfall stations and a probability-based approach, which accounted for the probability of exceedance of rain depths measured at two or multiple stations. Our analyses showed that it is not clear a priori which method performs best. Rather, this selection should be based on the specific statistical properties of the rainfall dataset. In this presentation, our emphasis is on discussing the effects of a few typical parametric distributions used to model the behavior of rainfall. Specifically, we analyzed the role of distributional "tails", which have an important control on the occurrence of extreme rainfall events. The latter strongly affect several hydrological applications, including recharge-discharge relationships. The heavy-tailed distributions we considered were the parametric Log-Normal, Generalized Pareto, Generalized Extreme Value and Gamma distributions. The methods were
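
    A short sketch of the kind of tail comparison the record points to: fit two candidate parametric distributions to the same wet-day rainfall depths and compare the probability mass each assigns to an extreme exceedance. The synthetic data and the 100 mm threshold are illustrative assumptions.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    rain = stats.lognorm.rvs(s=1.0, scale=8.0, size=2000, random_state=rng)  # synthetic wet-day depths (mm)

    # Fit a lighter-tailed (gamma) and a heavier-tailed (lognormal) model to the same record
    gamma_fit = stats.gamma.fit(rain, floc=0.0)
    lognorm_fit = stats.lognorm.fit(rain, floc=0.0)

    threshold = 100.0   # mm: an extreme daily depth
    print("P(depth > 100 mm):")
    print("  gamma    ", stats.gamma.sf(threshold, *gamma_fit))
    print("  lognormal", stats.lognorm.sf(threshold, *lognorm_fit))
    ```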

  7. Sample Size Determination for One- and Two-Sample Trimmed Mean Tests

    Science.gov (United States)

    Luh, Wei-Ming; Olejnik, Stephen; Guo, Jiin-Huarng

    2008-01-01

    Formulas to determine the necessary sample sizes for parametric tests of group comparisons are available from several sources and appropriate when population distributions are normal. However, in the context of nonnormal population distributions, researchers recommend Yuen's trimmed mean test, but formulas to determine sample sizes have not been…
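
    For context, a compact sketch of Yuen's two-sample trimmed-mean test referred to above, with the common 20% trimming; recent SciPy releases also expose an equivalent through scipy.stats.ttest_ind(..., trim=0.2), which may be worth cross-checking.

    ```python
    import numpy as np
    from scipy import stats

    def yuen_test(x, y, trim=0.2):
        """Yuen's t-test comparing trimmed means using winsorized variances."""
        x, y = np.sort(np.asarray(x, float)), np.sort(np.asarray(y, float))

        def pieces(a):
            n = len(a)
            g = int(np.floor(trim * n))              # points trimmed from each tail
            h = n - 2 * g                            # effective (trimmed) sample size
            tmean = a[g:n - g].mean()
            wins = np.concatenate(([a[g]] * g, a[g:n - g], [a[n - g - 1]] * g))
            d = (n - 1) * wins.var(ddof=1) / (h * (h - 1))
            return tmean, d, h

        m1, d1, h1 = pieces(x)
        m2, d2, h2 = pieces(y)
        t = (m1 - m2) / np.sqrt(d1 + d2)
        df = (d1 + d2) ** 2 / (d1 ** 2 / (h1 - 1) + d2 ** 2 / (h2 - 1))
        p = 2 * stats.t.sf(abs(t), df)
        return t, df, p

    rng = np.random.default_rng(4)
    a, b = rng.standard_t(3, 40), rng.standard_t(3, 40) + 0.8   # heavy-tailed samples
    print(yuen_test(a, b))
    ```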

  8. METHODS OF MANAGING TRAFFIC DISTRIBUTION IN INFORMATION AND COMMUNICATION NETWORKS OF CRITICAL INFRASTRUCTURE SYSTEMS

    OpenAIRE

    Kosenko, Viktor; Persiyanova, Elena; Belotskyy, Oleksiy; Malyeyeva, Olga

    2017-01-01

    The subject matter of the article is information and communication networks (ICN) of critical infrastructure systems (CIS). The goal of the work is to create methods for managing the data flows and resources of the ICN of CIS to improve the efficiency of information processing. The following tasks were solved in the article: the data flow model of multi-level ICN structure was developed, the method of adaptive distribution of data flows was developed, the method of network resource assignment...

  9. Gathering Opinions on Depression Information Needs and Preferences: Samples and Opinions in Clinic Versus Web-Based Surveys.

    Science.gov (United States)

    Bernstein, Matthew T; Walker, John R; Sexton, Kathryn A; Katz, Alan; Beatie, Brooke E

    2017-04-24

    There has been limited research on the information needs and preferences of the public concerning treatment for depression. Very little research is available comparing samples and opinions when recruitment for surveys is done over the Web as opposed to a personal invitation to complete a paper survey. This study aimed (1) to explore information needs and preferences among members of the public and (2) to compare Clinic and Web samples on sample characteristics and survey findings. Web survey participants were recruited with a notice on three self-help association websites (N=280). Clinic survey participants were recruited by a research assistant in the waiting rooms of a family medicine clinic and a walk-in medical clinic (N=238) and completed a paper version of the survey. The Clinic and Web samples were similar in age (39.0 years, SD 13.9 vs 40.2 years, SD 12.5, respectively), education, and proportion in full-time employment. The Clinic sample was more diverse in demographic characteristics and closer to the demographic characteristics of the region (Winnipeg, Canada) with a higher proportion of males (102/238 [42.9%] vs 45/280 [16.1%]) and nonwhites (Aboriginal, Asian, and black) (69/238 [29.0%] vs 39/280 [13.9%]). The Web sample reported a higher level of emotional distress and had more previous psychological (224/280 [80.0%] vs 83/238 [34.9%]) and pharmacological (202/280 [72.1%] vs 57/238 [23.9%]) treatment. In terms of opinions, most respondents in both settings saw information on a wide range of topics around depression treatment as very important, including information about treatment choices, effectiveness of treatment, how long it takes treatment to work, how long treatment continues, what happens when treatment stops, advantages and disadvantages of treatments, and potential side effects. Females, respondents with a white background, and those who had received or felt they would have benefited from therapy in the past saw more information topics as very

  10. A development of two-dimensional birefringence distribution measurement system with a sampling rate of 1.3 MHz

    Science.gov (United States)

    Onuma, Takashi; Otani, Yukitoshi

    2014-03-01

    A two-dimensional birefringence distribution measurement system with a sampling rate of 1.3 MHz is proposed. A polarization image sensor is developed as the core device of the system. It is composed of a pixelated polarizer array made from photonic crystal and a parallel read-out circuit with a multi-channel analog-to-digital converter specialized for two-dimensional polarization detection. By applying a phase-shifting algorithm with circularly-polarized incident light, the birefringence phase difference and azimuthal angle can be measured. The performance of the system is demonstrated experimentally by measuring an actual birefringence distribution and a polarization device such as a Babinet-Soleil compensator.

  11. preparation and distribution of microfiche in International Nuclear Information System (INIS)

    International Nuclear Information System (INIS)

    Kajiro, Tadashi; Habara, Tadashi

    1981-01-01

    INIS started its activity in 1970, aiming at the unified treatment of atomic energy literature in the world. At present, 66 countries and 13 international organizations participate in it, and the INIS Section in IAEA supervises the system. The participants process the atomic energy literature published in their respective countries and send it to the INIS Section in the form of magnetic tapes and other media. This information is returned to the respective countries through the publication of ''INIS Atomindex'' and the distribution of the magnetic tapes for mechanized retrieval. Recently, the number of abstracted papers reached about 80000/year. One of the features of this system is the use of microfiche as the medium of literature distribution, and the literature which cannot be bought through bookstores is available in the form of microfiche from the INIS Clearing House. The microfiche before the establishment of INIS, the troubles concerning the equipment and the problems of the originals used to make microfiche, the change of duplicating films and the conversion of frame numbers, the adoption of the standard for microfiche production and quality inspection, the second change of duplicating films, the recent improvement of microfiche, and the present state of the production and distribution of microfiche in the INIS Section are described. (Kako, I.)

  12. Mnemonic transmission, social contagion, and emergence of collective memory: Influence of emotional valence, group structure, and information distribution.

    Science.gov (United States)

    Choi, Hae-Yoon; Kensinger, Elizabeth A; Rajaram, Suparna

    2017-09-01

    Social transmission of memory and its consequence on collective memory have generated enduring interdisciplinary interest because of their widespread significance in interpersonal, sociocultural, and political arenas. We tested the influence of 3 key factors-emotional salience of information, group structure, and information distribution-on mnemonic transmission, social contagion, and collective memory. Participants individually studied emotionally salient (negative or positive) and nonemotional (neutral) picture-word pairs that were completely shared, partially shared, or unshared within participant triads, and then completed 3 consecutive recalls in 1 of 3 conditions: individual-individual-individual (control), collaborative-collaborative (identical group; insular structure)-individual, and collaborative-collaborative (reconfigured group; diverse structure)-individual. Collaboration enhanced negative memories especially in insular group structure and especially for shared information, and promoted collective forgetting of positive memories. Diverse group structure reduced this negativity effect. Unequally distributed information led to social contagion that creates false memories; diverse structure propagated a greater variety of false memories whereas insular structure promoted confidence in false recognition and false collective memory. A simultaneous assessment of network structure, information distribution, and emotional valence breaks new ground to specify how network structure shapes the spread of negative memories and false memories, and the emergence of collective memory. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  13. Life prediction for white OLED based on LSM under lognormal distribution

    Science.gov (United States)

    Zhang, Jianping; Liu, Fang; Liu, Yu; Wu, Helen; Zhu, Wenqing; Wu, Wenli; Wu, Liang

    2012-09-01

    In order to acquire reliability information on white Organic Light Emitting Display (OLED) devices, three groups of OLED constant stress accelerated life tests (CSALTs) were carried out to obtain failure data of samples. A lognormal distribution function was applied to describe the OLED life distribution, and the accelerated life equation was determined by the least squares method (LSM). The Kolmogorov-Smirnov test was performed to verify whether the white OLED life follows a lognormal distribution or not. Author-developed software was employed to predict the average life and the median life. The numerical results indicate that the white OLED life follows a lognormal distribution, and that the accelerated life equation conforms to the inverse power law. The estimated life information of the white OLED provides manufacturers and customers with important guidelines.
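
    A compact sketch of the workflow the record describes (lognormal fits at each stress level, a Kolmogorov-Smirnov check, and a least-squares fit of the inverse power law) is shown below; the failure times and stress levels are synthetic placeholders, not data from the OLED tests.

```python
# Sketch: lognormal life fits per stress level, KS goodness-of-fit check,
# and an inverse power law fitted to the median lives by least squares.
import numpy as np
from scipy import stats

stresses = np.array([40.0, 60.0, 80.0])      # e.g. drive current in mA (hypothetical)
failure_times = {
    40.0: np.array([5200., 6100., 6900., 7400., 8300.]),
    60.0: np.array([2100., 2500., 2900., 3300., 3600.]),
    80.0: np.array([ 900., 1100., 1250., 1400., 1600.]),
}

medians = []
for s in stresses:
    t = failure_times[s]
    shape, loc, scale = stats.lognorm.fit(t, floc=0)       # 2-parameter lognormal
    ks = stats.kstest(t, stats.lognorm.cdf, args=(shape, loc, scale))
    medians.append(scale)                                   # lognormal median = scale when loc=0
    print(f"stress {s}: KS p-value {ks.pvalue:.2f}")

# Inverse power law: t_median = A * stress**(-m)  =>  log t = log A - m * log s
slope, log_A = np.polyfit(np.log(stresses), np.log(medians), 1)
print("power-law exponent m:", -slope)
```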

  14. Adaptive Kalman Filter Based on Adjustable Sampling Interval in Burst Detection for Water Distribution System

    Directory of Open Access Journals (Sweden)

    Doo Yong Choi

    2016-04-01

    Full Text Available Rapid detection of bursts and leaks in water distribution systems (WDSs can reduce the social and economic costs incurred through direct loss of water into the ground, additional energy demand for water supply, and service interruptions. Many real-time burst detection models have been developed in accordance with the use of supervisory control and data acquisition (SCADA systems and the establishment of district meter areas (DMAs. Nonetheless, no consideration has been given to how frequently a flow meter measures and transmits data for predicting breaks and leaks in pipes. This paper analyzes the effect of sampling interval when an adaptive Kalman filter is used for detecting bursts in a WDS. A new sampling algorithm is presented that adjusts the sampling interval depending on the normalized residuals of flow after filtering. The proposed algorithm is applied to a virtual sinusoidal flow curve and real DMA flow data obtained from Jeongeup city in South Korea. The simulation results prove that the self-adjusting algorithm for determining the sampling interval is efficient and maintains reasonable accuracy in burst detection. The proposed sampling method has a significant potential for water utilities to build and operate real-time DMA monitoring systems combined with smart customer metering systems.
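
    The idea of coupling a Kalman filter to an adjustable sampling interval can be sketched as follows; the scalar random-walk flow model, noise values, residual thresholds and synthetic burst are illustrative assumptions rather than the parameters used for the Jeongeup DMA data.

```python
# Sketch: scalar Kalman filtering of DMA flow with a sampling interval that is
# shortened when the normalized residual grows and relaxed when it stays small.
import numpy as np

rng = np.random.default_rng(7)
t, dt = 0.0, 5.0                      # minutes; current sampling interval
x, P = 50.0, 4.0                      # filtered flow estimate and its variance
Q, R = 0.05, 1.0                      # process and measurement noise

def true_flow(t):
    base = 50.0 + 10.0 * np.sin(2 * np.pi * t / 1440.0)   # daily sinusoid
    return base + (15.0 if t > 600.0 else 0.0)            # synthetic burst at t = 600

for _ in range(200):
    t += dt
    z = true_flow(t) + rng.normal(0.0, np.sqrt(R))        # flow meter reading
    # Kalman predict/update for a random-walk flow model
    P_pred = P + Q * dt
    S = P_pred + R
    K = P_pred / S
    resid = z - x
    x = x + K * resid
    P = (1.0 - K) * P_pred
    # Adjust the sampling interval from the normalized residual
    nres = abs(resid) / np.sqrt(S)
    if nres > 3.0:
        print(f"possible burst at t={t:.0f} min (normalized residual {nres:.1f})")
    if nres > 2.0:
        dt = max(1.0, dt / 2.0)       # suspicious change: sample more often
    elif nres < 0.5:
        dt = min(15.0, dt * 2.0)      # quiet period: relax the interval
```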

  15. Identification of systems with distributed parameters

    International Nuclear Information System (INIS)

    Moret, J.M.

    1990-10-01

    The problem of finding a model for the dynamical response of a system with distributed parameters based on measured data is addressed. First, a mathematical formalism is developed in order to obtain the specific properties of such a system. Then a linear iterative identification algorithm is proposed that includes these properties and produces better results than the usual nonlinear minimisation techniques. This algorithm is further improved by an original data decimation that allows the sampling period to be artificially increased without losing between-sample information. These algorithms are tested with real laboratory data

  16. Limitations to the Use of Species-Distribution Models for Environmental-Impact Assessments in the Amazon.

    Directory of Open Access Journals (Sweden)

    Lorena Ribeiro de A Carneiro

    Full Text Available Species-distribution models (SDM) are tools with potential to inform environmental-impact studies (EIA). However, they are not always appropriate and may result in improper and expensive mitigation and compensation if their limitations are not understood by decision makers. Here, we examine the use of SDM for frogs that were used in impact assessment using data obtained from the EIA of a hydroelectric project located in the Amazon Basin in Brazil. The results show that lack of knowledge of species distributions limits the appropriate use of SDM in the Amazon region for most target species. Because most of these targets are newly described and their distributions poorly known, data about their distributions are insufficient to be effectively used in SDM. Surveys that are mandatory for the EIA are often conducted only near the area under assessment, and so models must extrapolate well beyond the sampled area to inform decisions made at much larger spatial scales, such as defining areas to be used to offset the negative effects of the projects. Using distributions of better-known species in simulations, we show that geographical extrapolations based on limited information of species ranges often lead to spurious results. We conclude that the use of SDM as evidence to support project-licensing decisions in the Amazon requires much greater area sampling for impact studies, or, alternatively, integrated and comparative survey strategies, to improve biodiversity sampling. When more detailed distribution information is unavailable, SDM will produce results that generate uncertain and untestable decisions regarding impact assessment. In many cases, SDM is unlikely to be better than the use of expert opinion.

  17. Probability distributions of bed load particle velocities, accelerations, hop distances, and travel times informed by Jaynes's principle of maximum entropy

    Science.gov (United States)

    Furbish, David; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan

    2016-01-01

    We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.
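
    The distribution families named in the record can be checked against particle-tracking data in a few lines; the synthetic samples and parameter values below are illustrative only.

```python
# Sketch: the maximum-entropy forms named in the record (exponential velocities,
# Laplace accelerations, Weibull hop distances), fitted and checked on synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
velocities    = rng.exponential(scale=12.0, size=5000)        # mean-constrained -> exponential
accelerations = rng.laplace(loc=0.0, scale=40.0, size=5000)   # zero mean, |a| constrained -> Laplace
hops          = stats.weibull_min.rvs(1.5, loc=0, scale=30.0, size=5000, random_state=rng)

for name, data, dist, kwargs in [
    ("velocity", velocities, stats.expon, dict(floc=0)),
    ("acceleration", accelerations, stats.laplace, {}),
    ("hop distance", hops, stats.weibull_min, dict(floc=0)),
]:
    params = dist.fit(data, **kwargs)
    ks = stats.kstest(data, dist.cdf, args=params)
    print(f"{name}: fitted params {np.round(params, 2)}, KS p = {ks.pvalue:.2f}")
```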

  18. The Applicability of the Distribution Coefficient, KD, Based on Non-Aggregated Particulate Samples from Lakes with Low Suspended Solids Concentrations.

    Directory of Open Access Journals (Sweden)

    Aine Marie Gormley-Gallagher

    Full Text Available Separate phases of metal partitioning behaviour in freshwater lakes that receive varying degrees of atmospheric contamination and have low concentrations of suspended solids were investigated to determine the applicability of the distribution coefficient, KD. Concentrations of Pb, Ni, Co, Cu, Cd, Cr, Hg and Mn were determined using a combination of filtration methods, bulk sample collection and digestion, and Inductively Coupled Plasma-Mass Spectrometry (ICP-MS). Phytoplankton biomass, suspended solids concentrations and the organic content of the sediment were also analysed. By distinguishing between the phytoplankton and (inorganic) lake sediment, transient variations in KD were observed. Suspended solids concentrations over the 6-month sampling campaign showed no correlation with the KD (n = 15 for each metal, p > 0.05) for Mn (r2 = 0.0063), Cu (r2 = 0.0002), Cr (r2 = 0.021), Ni (r2 = 0.0023), Cd (r2 = 0.00001), Co (r2 = 0.096), Hg (r2 = 0.116) or Pb (r2 = 0.164). The results implied that colloidal matter had less opportunity to increase the dissolved (filter-passing) fraction, which inhibited the spurious lowering of KD. The findings conform to the increasingly documented theory that the use of KD in modelling may mask true information on metal partitioning behaviour. The root mean square errors of prediction between the directly measured total metal concentrations and those modelled based on the separate phase fractions were ± 3.40, 0.06, 0.02, 0.03, 0.44, 484.31, 80.97 and 0.1 μg/L for Pb, Cd, Mn, Cu, Hg, Ni, Cr and Co respectively. The magnitude of error suggests that the separate phase models for Mn and Cu can be used in distribution or partitioning models for these metals in lake water.
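
    As a rough illustration of the quantities involved, the sketch below computes KD as the particulate-bound concentration (per kg of suspended solids) divided by the dissolved (filter-passing) concentration, and a root mean square error of a separate-phase reconstruction; all numbers are placeholders, not measurements from the lakes studied.

```python
# Sketch: per-sample KD and the RMSE of a single-coefficient separate-phase model.
import numpy as np

dissolved   = np.array([0.80, 0.65, 0.90, 0.70])   # ug/L, filter-passing fraction
particulate = np.array([0.30, 0.25, 0.40, 0.28])   # ug/L bound to suspended solids
ss          = np.array([2.0e-3, 1.5e-3, 2.5e-3, 1.8e-3])   # kg/L suspended solids

kd = (particulate / ss) / dissolved                # L/kg, per sample
kd_model = kd.mean()                               # one separate-phase coefficient

total_measured = dissolved + particulate
total_modelled = dissolved * (1.0 + kd_model * ss)

rmse = np.sqrt(np.mean((total_measured - total_modelled) ** 2))
print("KD (L/kg):", np.round(kd, 0), " RMSE of prediction (ug/L):", round(rmse, 4))
```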

  19. Score distributions in information retrieval

    NARCIS (Netherlands)

    Arampatzis, A.; Robertson, S.; Kamps, J.

    2009-01-01

    We review the history of modeling score distributions, focusing on the mixture of normal-exponential by investigating the theoretical as well as the empirical evidence supporting its use. We discuss previously suggested conditions which valid binary mixture models should satisfy, such as the

  20. Isotope dilution and sampling factors of the quality assurance and TQM of environmental analysis

    International Nuclear Information System (INIS)

    Macasek, F.

    1999-01-01

    Sampling and preparatory treatment of environmental objects are discussed from the viewpoint of their information content, functional speciation of the pollutant, statistical distribution treatment and uncertainty assessment. During homogenization of large samples, substantial information may be lost and the validity of environmental information becomes vague. Isotope dilution analysis is discussed as the most valuable tool for both the validity of the analysis and the evaluation of sample variance. Data collection for a non-parametric statistical treatment of series of 'non-representative' sub-samples, and physico-chemical speciation of the analyte, may actually better fulfill criteria of similarity and representativeness. Large samples are often required due to the detection limits of the analysis, but the representativeness of environmental samples should be understood not only in terms of the mean analyte concentration, but also its spatial and temporal variance. Hence, heuristic analytical scenarios and interpretation of results must be designed through cooperation of environmentalists and analytical chemists. (author)

  1. Assessment of crystalline disorder in cryo-milled samples of indomethacin using atomic pair-wise distribution functions

    DEFF Research Database (Denmark)

    Bøtker, Johan P; Karmwar, Pranav; Strachan, Clare J

    2011-01-01

    The aim of this study was to investigate the usefulness of the atomic pair-wise distribution function (PDF) to detect the extension of disorder/amorphousness induced into a crystalline drug using a cryo-milling technique, and to determine the optimal milling times to achieve amorphisation. The PDF was used to analyse the cryo-milled samples. The high similarity between the γ-indomethacin cryogenic ball milled samples and the crude γ-indomethacin indicated that milled samples retained residual order of the γ-form. The PDF analysis encompassed the capability of achieving a correlation with the physical properties determined from DSC, ss-NMR and stability experiments. Multivariate data analysis (MVDA) was used to visualize the differences in the PDF and XRPD data. The MVDA approach revealed that PDF is more efficient in assessing the introduced degree of disorder in γ-indomethacin after cryo-milling than XRPD.

  2. Uplift characterization in an inland area based on information of terrace distribution. Case study in the Mid Niigata region

    International Nuclear Information System (INIS)

    Hataya, Ryuta; Hamada, Takaomi

    2009-01-01

    We have investigated the correlation and chronology of fluvial terraces, and the characteristics of terrace distribution around active structures, in the Mid Niigata region. Making full use of geomorphologic and geologic information, such as the morphology and continuity of terrace plains and the weathering of terrace gravels, allows tephra data to be interpreted appropriately. The characteristics of terrace distribution differ markedly on the two sides of an active structure, which indicates that terrace distribution is useful information for evaluating fault activity. Furthermore, an uplift estimate for the last 100,000 years is given by terrace distribution information and the relative heights of fluvial terraces, and can be applied to evaluating fault activity. For example, using the relative height between terraces as an uplift index, we estimated that the uplift difference across the Muikamachi fault during the late Quaternary was more than 40 m. This suggests that the slip rate of the Muikamachi fault is more than 0.4 m/10³ years. The viewpoint and method shown in this paper were incorporated into the upgrade of air-photo interpretation for active fault surveys. (author)

  3. Temperature distribution study in flash-annealed amorphous ribbons

    International Nuclear Information System (INIS)

    Moron, C.; Garcia, A.; Carracedo, M.T.

    2003-01-01

    Negative magnetostrictive amorphous ribbons have been locally current annealed with currents from 1 to 8 A and annealing times from 14 ms to 200 s. In order to obtain information about the sample temperature during flash or current annealing, a study of the temperature dispersion during annealing in amorphous ribbons was made. The local temperature variation was obtained by measuring the local intensity of the infrared emission of the sample with a CCD liquid-nitrogen-cooled camera. A distribution of local temperature has been found in spite of the small dimensions of the sample

  4. Advanced model for expansion of natural gas distribution networks based on geographic information systems

    Energy Technology Data Exchange (ETDEWEB)

    Ramirez-Rosado, I.J.; Fernandez-Jimenez, L.A.; Garcia-Garrido, E.; Zorzano-Santamaria, P.; Zorzano-Alba, E. [La Rioja Univ., La Rioja (Spain). Dept. of Electrical Engineering; Miranda, V.; Montneiro, C. [Porto Univ., Porto (Portugal). Faculty of Engineering]|[Inst. de Engenharia de Sistemas e Computadores do Porto, Porto (Portugal)

    2005-07-01

    An advanced geographic information system (GIS) model of natural gas distribution networks was presented. The raster-based model was developed to evaluate costs associated with the expansion of electrical networks due to increased demand in the La Rioja region of Spain. The model was also used to evaluate costs associated with maintenance and amortization of the already existing distribution network. Expansion costs of the distribution network were modelled in various demand scenarios. The model also considered a variety of technical factors associated with pipeline length and topography. Soil and slope data from previous pipeline projects were used to estimate real costs per unit length of pipeline. It was concluded that results obtained by the model will be used by planners to select zones where expansion is economically feasible. 4 refs., 5 figs.

  5. Correlated Raman micro-spectroscopy and scanning electron microscopy analyses of flame retardants in environmental samples: a micro-analytical tool for probing chemical composition, origin and spatial distribution.

    Science.gov (United States)

    Ghosal, Sutapa; Wagner, Jeff

    2013-07-07

    We present correlated application of two micro-analytical techniques: scanning electron microscopy/energy dispersive X-ray spectroscopy (SEM/EDS) and Raman micro-spectroscopy (RMS) for the non-invasive characterization and molecular identification of flame retardants (FRs) in environmental dusts and consumer products. The SEM/EDS-RMS technique offers correlated, morphological, molecular, spatial distribution and semi-quantitative elemental concentration information at the individual particle level with micrometer spatial resolution and minimal sample preparation. The presented methodology uses SEM/EDS analyses for rapid detection of particles containing FR specific elements as potential indicators of FR presence in a sample followed by correlated RMS analyses of the same particles for characterization of the FR sub-regions and surrounding matrices. The spatially resolved characterization enabled by this approach provides insights into the distributional heterogeneity as well as potential transfer and exposure mechanisms for FRs in the environment that is typically not available through traditional FR analysis. We have used this methodology to reveal a heterogeneous distribution of highly concentrated deca-BDE particles in environmental dust, sometimes in association with identifiable consumer materials. The observed coexistence of deca-BDE with consumer material in dust is strongly indicative of its release into the environment via weathering/abrasion of consumer products. Ingestion of such enriched FR particles in dust represents a potential for instantaneous exposure to high FR concentrations. Therefore, correlated SEM/RMS analysis offers a novel investigative tool for addressing an area of important environmental concern.

  6. Binomial probability distribution model-based protein identification algorithm for tandem mass spectrometry utilizing peak intensity information.

    Science.gov (United States)

    Xiao, Chuan-Le; Chen, Xiao-Zhou; Du, Yang-Li; Sun, Xuesong; Zhang, Gong; He, Qing-Yu

    2013-01-04

    Mass spectrometry has become one of the most important technologies in proteomic analysis. Tandem mass spectrometry (LC-MS/MS) is a major tool for the analysis of peptide mixtures from protein samples. The key step of MS data processing is the identification of peptides from experimental spectra by searching public sequence databases. Although a number of algorithms to identify peptides from MS/MS data have been already proposed, e.g. Sequest, OMSSA, X!Tandem, Mascot, etc., they are mainly based on statistical models considering only peak-matches between experimental and theoretical spectra, but not peak intensity information. Moreover, different algorithms gave different results from the same MS data, implying their probable incompleteness and questionable reproducibility. We developed a novel peptide identification algorithm, ProVerB, based on a binomial probability distribution model of protein tandem mass spectrometry combined with a new scoring function, making full use of peak intensity information and, thus, enhancing the ability of identification. Compared with Mascot, Sequest, and SQID, ProVerB identified significantly more peptides from LC-MS/MS data sets than the current algorithms at 1% False Discovery Rate (FDR) and provided more confident peptide identifications. ProVerB is also compatible with various platforms and experimental data sets, showing its robustness and versatility. The open-source program ProVerB is available at http://bioinformatics.jnu.edu.cn/software/proverb/ .
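
    A binomial peak-match score of the general kind described can be sketched as follows; the mass tolerance, intensity weighting and example spectra are illustrative assumptions and do not reproduce ProVerB's actual scoring function.

```python
# Sketch: chance probability of matching at least k of n theoretical fragment peaks
# under a binomial null, weighted by the intensity of the matched peaks.
import numpy as np
from scipy import stats

def binomial_match_score(theoretical_mz, observed_mz, observed_intensity,
                         tol=0.5, p_random=0.05):
    matched = 0
    matched_intensity = 0.0
    for mz in theoretical_mz:
        hits = np.abs(observed_mz - mz) <= tol
        if hits.any():
            matched += 1
            matched_intensity += observed_intensity[hits].max()
    n = len(theoretical_mz)
    # P(X >= matched) for a binomial null with per-peak match probability p_random
    p_value = stats.binom.sf(matched - 1, n, p_random)
    score = -np.log10(p_value) * (matched_intensity / observed_intensity.sum())
    return matched, p_value, score

obs_mz  = np.array([175.1, 262.2, 363.3, 476.4, 589.5, 702.6])
obs_int = np.array([120., 300., 80., 450., 60., 220.])
theo_mz = np.array([175.12, 262.15, 363.30, 476.38, 600.00])
print(binomial_match_score(theo_mz, obs_mz, obs_int))
```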

  7. Concentration distribution of trace elements: from normal distribution to Levy flights

    International Nuclear Information System (INIS)

    Kubala-Kukus, A.; Banas, D.; Braziewicz, J.; Majewska, U.; Pajek, M.

    2003-01-01

    The paper discusses the nature of concentration distributions of trace elements in biomedical samples, which were measured by using X-ray fluorescence techniques (XRF, TXRF). Our earlier observation that the lognormal distribution describes the measured concentration distributions well is explained here on more general grounds. In particular, the role of the random multiplicative process, which models the concentration distributions of trace elements in biomedical samples, is discussed in detail. It is demonstrated that the lognormal distribution, appearing when the multiplicative process is driven by a normal distribution, can be generalized to the so-called log-stable distribution. Such a distribution describes the random multiplicative process which is driven, instead of by the normal distribution, by the more general stable distribution, known as Levy flights. The presented ideas are exemplified by the results of the study of trace element concentration distributions in selected biomedical samples, obtained by using the conventional (XRF) and (TXRF) X-ray fluorescence methods. In particular, the first observation of a log-stable concentration distribution of trace elements is reported and discussed here in detail
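
    The multiplicative-process argument can be illustrated numerically: when the log-increments are normal the product is lognormal, and when they are drawn from a heavy-tailed stable law (Levy flights) the product is log-stable with a much heavier upper tail. The step counts and stable parameters below are illustrative.

```python
# Sketch: a random multiplicative process driven by normal vs. stable increments.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
n_samples, n_steps = 2000, 30

# Normal-driven multiplicative process -> lognormal concentrations
log_normal_driven = np.exp(rng.normal(0.0, 0.2, (n_samples, n_steps)).sum(axis=1))

# Stable-driven multiplicative process (alpha < 2) -> log-stable concentrations
stable_steps = stats.levy_stable.rvs(alpha=1.7, beta=0.0, scale=0.2,
                                     size=(n_samples, n_steps), random_state=rng)
log_stable_driven = np.exp(stable_steps.sum(axis=1))

for name, c in [("lognormal", log_normal_driven), ("log-stable", log_stable_driven)]:
    q = np.quantile(c, [0.5, 0.99, 0.999])
    print(f"{name}: median {q[0]:.2f}, 99% {q[1]:.2f}, 99.9% {q[2]:.2f}")
```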

  8. Distribution of Total Depressive Symptoms Scores and Each Depressive Symptom Item in a Sample of Japanese Employees.

    Science.gov (United States)

    Tomitaka, Shinichiro; Kawasaki, Yohei; Ide, Kazuki; Yamada, Hiroshi; Miyake, Hirotsugu; Furukawa, Toshiaki A

    2016-01-01

    In a previous study, we reported that the distribution of total depressive symptoms scores according to the Center for Epidemiologic Studies Depression Scale (CES-D) in a general population is stable throughout middle adulthood and follows an exponential pattern except for at the lowest end of the symptom score. Furthermore, the individual distributions of 16 negative symptom items of the CES-D exhibit a common mathematical pattern. To confirm the reproducibility of these findings, we investigated the distribution of total depressive symptoms scores and 16 negative symptom items in a sample of Japanese employees. We analyzed 7624 employees aged 20-59 years who had participated in the Northern Japan Occupational Health Promotion Centers Collaboration Study for Mental Health. Depressive symptoms were assessed using the CES-D. The CES-D contains 20 items, each of which is scored in four grades: "rarely," "some," "much," and "most of the time." The descriptive statistics and frequency curves of the distributions were then compared according to age group. The distribution of total depressive symptoms scores appeared to be stable from 30-59 years. The right tail of the distribution for ages 30-59 years exhibited a linear pattern with a log-normal scale. The distributions of the 16 individual negative symptom items of the CES-D exhibited a common mathematical pattern which displayed different distributions with a boundary at "some." The distributions of the 16 negative symptom items from "some" to "most" followed a linear pattern with a log-normal scale. The distributions of the total depressive symptoms scores and individual negative symptom items in a Japanese occupational setting show the same patterns as those observed in a general population. These results show that the specific mathematical patterns of the distributions of total depressive symptoms scores and individual negative symptom items can be reproduced in an occupational population.

  9. Confluence of calculational and experimental information for determination of power distribution and burnup

    International Nuclear Information System (INIS)

    Serov, I.V.; Hoogenboom, J.E.

    1996-01-01

    A technique for the statistical confluence of any number of possibly correlated informational sources employed in reactor analysis can be used to improve the estimates of physical quantities given by the sources taken separately. The formulas of the presented technique, being based on multivariate Bayesian conditioning, are general and can be employed in different applications. Insight into the nature of the informational source allows different types of data associated with the source to be improved. Estimation of biases, variances and correlation coefficients for the systematic and statistical errors associated with the informational sources is required for reliable confluence, but pays off by providing optimal estimates. The technique of calculational and experimental information confluence is applied to the determination of the power distribution and burnup for the research reactor HOR of the Delft University of Technology. The code system CONHOR carries out all the stages of the calculation for the HOR reactor, using an existing code for static core calculations and burnup calculations. (author)
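
    For the Gaussian case, multivariate Bayesian conditioning of two independent information sources reduces to covariance-weighted averaging, as in the sketch below; the power-distribution vectors and covariance matrices are small illustrative placeholders, not HOR reactor data.

```python
# Sketch: combining a calculated and a measured estimate of the same quantities
# by covariance-weighted (inverse-variance) averaging, the Gaussian special case
# of multivariate Bayesian conditioning.
import numpy as np

calc = np.array([1.05, 0.98, 1.10, 0.87])          # calculated relative power per node
meas = np.array([1.00, 1.02, 1.15, 0.83])          # measured relative power per node
S_calc = np.diag([0.04, 0.04, 0.05, 0.05]) ** 2    # covariance of calculation errors
S_meas = np.diag([0.02, 0.03, 0.06, 0.02]) ** 2    # covariance of measurement errors

W_calc = np.linalg.inv(S_calc)
W_meas = np.linalg.inv(S_meas)
S_post = np.linalg.inv(W_calc + W_meas)            # posterior covariance
best = S_post @ (W_calc @ calc + W_meas @ meas)    # posterior (combined) estimate

print("combined power distribution:", np.round(best, 3))
print("posterior std devs:", np.round(np.sqrt(np.diag(S_post)), 3))
```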

  10. Confluence of calculational and experimental information for determination of power distribution and burnup

    Energy Technology Data Exchange (ETDEWEB)

    Serov, I.V.; Hoogenboom, J.E. [Interuniversitair Reactor Inst., Delft (Netherlands)

    1996-05-01

    A technique for the statistical confluence of any number of possibly correlated informational sources employed in reactor analysis can be used to improve the estimates of physical quantities given by the sources taken separately. The formulas of the presented technique, being based on multivariate Bayesian conditioning, are general and can be employed in different applications. Insight into the nature of the informational source allows different types of data associated with the source to be improved. Estimation of biases, variances and correlation coefficients for the systematic and statistical errors associated with the informational sources is required for reliable confluence, but pays off by providing optimal estimates. The technique of calculational and experimental information confluence is applied to the determination of the power distribution and burnup for the research reactor HOR of the Delft University of Technology. The code system CONHOR carries out all the stages of the calculation for the HOR reactor, using an existing code for static core calculations and burnup calculations. (author).

  11. A service-oriented distributed semantic mediator: integrating multiscale biomedical information.

    Science.gov (United States)

    Mora, Oscar; Engelbrecht, Gerhard; Bisbal, Jesus

    2012-11-01

    Biomedical research continuously generates large amounts of heterogeneous and multimodal data spread over multiple data sources. These data, if appropriately shared and exploited, could dramatically improve the research practice itself, and ultimately the quality of health care delivered. This paper presents DISMED (DIstributed Semantic MEDiator), an open source semantic mediator that provides a unified view of a federated environment of multiscale biomedical data sources. DISMED is a Web-based software application to query and retrieve information distributed over a set of registered data sources, using semantic technologies. It also offers a user-friendly interface specifically designed to simplify the usage of these technologies by non-expert users. Although the architecture of the software mediator is generic and domain independent, in the context of this paper, DISMED has been evaluated for managing biomedical environments and facilitating research with respect to the handling of scientific data distributed in multiple heterogeneous data sources. As part of this contribution, a quantitative evaluation framework has been developed. It consists of a benchmarking scenario and the definition of five realistic use-cases. This framework, created entirely with public datasets, has been used to compare the performance of DISMED against other available mediators. It is also available to the scientific community in order to evaluate progress in the domain of semantic mediation, in a systematic and comparable manner. The results show an average improvement in the execution time by DISMED of 55% compared to the second best alternative in four out of the five use-cases of the experimental evaluation.

  12. Emergence of distributed coordination in the Kolkata Paise Restaurant problem with finite information

    Science.gov (United States)

    Ghosh, Diptesh; Chakrabarti, Anindya S.

    2017-10-01

    In this paper, we study a large-scale distributed coordination problem and propose efficient adaptive strategies to solve the problem. The basic problem is to allocate finite number of resources to individual agents in the absence of a central planner such that there is as little congestion as possible and the fraction of unutilized resources is reduced as far as possible. In the absence of a central planner and global information, agents can employ adaptive strategies that uses only a finite knowledge about the competitors. In this paper, we show that a combination of finite information sets and reinforcement learning can increase the utilization fraction of resources substantially.
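
    A toy Kolkata Paise Restaurant simulation illustrates how a simple finite-information rule can raise the utilization fraction above the zero-information random benchmark of about 1 - 1/e; the agent count and the particular avoid-if-crowded rule are illustrative, not the strategies analysed in the paper.

```python
# Sketch: utilization fraction in a Kolkata Paise Restaurant game for a random
# strategy vs. a finite-information rule (jump only if your last pick was crowded).
import numpy as np

rng = np.random.default_rng(5)
N, T = 500, 200   # N agents, N resources, T rounds

def utilization(choose):
    last = rng.integers(0, N, N)
    used = []
    for _ in range(T):
        choice = choose(last)
        counts = np.bincount(choice, minlength=N)
        used.append(np.count_nonzero(counts) / N)   # fraction of resources utilized
        last = choice
    return np.mean(used[T // 2:])                    # average over the late rounds

def random_choice(last):
    return rng.integers(0, N, N)

def avoid_crowded(last):
    counts = np.bincount(last, minlength=N)
    crowded = counts[last] > 1                       # agents whose last pick was contested
    new = last.copy()
    new[crowded] = rng.integers(0, N, crowded.sum()) # crowded agents jump at random
    return new

print("random strategy:", round(utilization(random_choice), 3))      # ~1 - 1/e
print("finite-information strategy:", round(utilization(avoid_crowded), 3))
```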

  13. Distribution of 137Cs in samples of ocean bottom sediments of the Baltic Sea in 1982-1983

    International Nuclear Information System (INIS)

    Gedenov, L.I.; Flegontov, V.M.; Ivanova, L.M.; Kostandov, K.A.

    1986-01-01

    The concentration of Cs-137 in samples of ocean bottom sediments picked up in 1979 in the Gulf of Finland with a geological nozzle pipe varied within a wide interval of values. The results could indicate nonuniformity of the Cs-137 distribution in ocean bottom sediments as well as the penetration of significant amounts of Cs-137 to large depths. The main error resulted from the sampling technique employed because the upper part of the sediment could be lost. In 1982, a special ground-sampling device, with which the upper layer of sediments in the water layer close to the ocean bottom could be sampled, was tested in the Gulf of Finland and the Northeastern part of the Baltic Sea. The results of a layerwise determination of the Cs-137 concentration in samples of ocean bottom sediments of the Gulf of Finland and of the Baltic Sea are listed. The new soil-sampling device for picking samples of ocean sediments of undisturbed stratification will allow a correct determination of the radionuclide accumulation in the upper layers of ocean bottom sediments in the Baltic Sea

  14. Iterative Multiview Side Information for Enhanced Reconstruction in Distributed Video Coding

    Directory of Open Access Journals (Sweden)

    2009-03-01

    Full Text Available Distributed video coding (DVC) is a new paradigm for video compression based on the information theoretical results of Slepian and Wolf (SW) and Wyner and Ziv (WZ). DVC entails low-complexity encoders as well as separate encoding of correlated video sources. This is particularly attractive for multiview camera systems in video surveillance and camera sensor network applications, where low complexity is required at the encoder. In addition, the separate encoding of the sources implies no communication between the cameras in a practical scenario. This is an advantage since communication is time and power consuming and requires complex networking. In this work, different intercamera estimation techniques for side information (SI) generation are explored and compared in terms of estimation quality, complexity, and rate-distortion (RD) performance. Further, a technique called iterative multiview side information (IMSI) is introduced, where the final SI is used in an iterative reconstruction process. The simulation results show that IMSI significantly improves the RD performance for video with significant motion and activity. Furthermore, DVC outperforms AVC/H.264 Intra for video with average and low motion but it is still inferior to the Inter No Motion and Inter Motion modes.

  15. The trends of modeling the ways of formation, distribution and exploitation of megapolis lands using geo-information systems

    Directory of Open Access Journals (Sweden)

    Kostyantyn Mamonov

    2017-10-01

    Full Text Available The areas in which ways of modeling the formation, distribution and use of megapolis lands using GIS are needed are identified. The aim of the article is to define the directions for modeling the formation, distribution and use of megapolis lands using GIS. The following objectives are set in the study: to develop an algorithm for creating the database (data system) for the pecuniary valuation of settlement lands with the use of GIS; to propose a process model that takes into account the influence of single-factor modules using geographic information systems; to identify the components of geo-information support for the expert monetary valuation of megapolis lands; to describe the general procedure for the expert monetary assessment of land and property using geographic information system software; and to develop an algorithm of methods for the expert evaluation of land. The identified tools and the algorithms built are used for modeling the ways of formation, distribution and use of megapolis lands using GIS. Directions for modeling the formation, distribution and use of megapolis lands using GIS are outlined.

  16. Corrigendum: Information Search in Decisions From Experience: Do Our Patterns of Sampling Foreshadow Our Decisions?

    Science.gov (United States)

    2017-09-01

    Original article: Hills, T. T., & Hertwig, R. (2010). Information search in decisions from experience: Do our patterns of sampling foreshadow our decisions? Psychological Science, 21, 1787-1792. doi:10.1177/0956797610387443.

  17. A new method for determining the uranium and thorium distribution in volcanic rock samples using solid state nuclear track detectors

    International Nuclear Information System (INIS)

    Misdaq, M.A.; Bakhchi, A.; Ktata, A.; Koutit, A.; Lamine, J.; Ait nouh, F.; Oufni, L.

    2000-01-01

    A method based on using solid state nuclear track detectors (SSNTD) of the CR-39 and LR-115 type II kind, and on calculating the probabilities for the alpha particles emitted by the uranium and thorium series to reach and be registered on these films, was utilized for determining the uranium and thorium contents in various geological samples. The distribution of uranium and thorium in different volcanic rocks has been investigated using the fission track method. In this work, the uranium and thorium contents have been determined in different volcanic rock samples by using CR-39 and LR-115 type II solid state nuclear track detectors (SSNTD). The mean critical angles of etching of the solid state nuclear track detectors utilized have been calculated. A petrographical study of the volcanic rock thin layers has been conducted. The uranium and thorium distribution inside different rock thin layers has been studied. The mechanism of inclusion of the uranium and thorium nuclei inside the volcanic rock samples studied has been investigated. (author)

  18. Repurposing environmental DNA samples: Detecting the western pearlshell (Margaritifera falcata) as a proof of concept

    Science.gov (United States)

    Joseph C. Dysthe; Torrey Rodgers; Thomas W. Franklin; Kellie J. Carim; Michael K. Young; Kevin S. McKelvey; Karen E. Mock; Michael K. Schwartz

    2018-01-01

    Information on the distribution of multiple species in a common landscape is fundamental to effective conservation and management. However, distribution data are expensive to obtain and often limited to high-profile species in a system. A recently developed technique, environmental DNA (eDNA) sampling, has been shown to be more sensitive than traditional detection...

  19. Geographic information system-coupling sediment delivery distributed modeling based on observed data.

    Science.gov (United States)

    Lee, S E; Kang, S H

    2014-01-01

    Spatially distributed sediment delivery (SEDD) models are of great interest in estimating the expected effect of changes on soil erosion and sediment yield. However, they can only be applied if the model can be calibrated using observed data. This paper presents a geographic information system (GIS)-based method to calculate the sediment discharge from basins to coastal areas. For this, an SEDD model, with a sediment rating curve method based on observed data, is proposed and validated. The model proposed here has been developed using the combined application of the revised universal soil loss equation (RUSLE) and a spatially distributed sediment delivery ratio, within Model Builder of ArcGIS's software. The model focuses on spatial variability and is useful for estimating the spatial patterns of soil loss and sediment discharge. The model consists of two modules, a soil erosion prediction component and a sediment delivery model. The integrated approach allows for relatively practical and cost-effective estimation of spatially distributed soil erosion and sediment delivery, for gauged or ungauged basins. This paper provides the first attempt at estimating sediment delivery ratio based on observed data in the monsoon region of Korea.
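
    The two modules the record combines can be sketched per raster cell as RUSLE soil loss multiplied by a spatially distributed sediment delivery ratio; the factor grids, the exponential SDR form and the calibration coefficient below are illustrative assumptions, not the calibrated Korean model.

```python
# Sketch: per-cell RUSLE soil loss times a distributed sediment delivery ratio,
# summed to give the sediment yield delivered to the basin outlet.
import numpy as np

rng = np.random.default_rng(2)
shape = (50, 50)                               # raster grid

R  = np.full(shape, 4500.0)                    # rainfall erosivity
K  = rng.uniform(0.02, 0.05, shape)            # soil erodibility
LS = rng.uniform(0.5, 6.0, shape)              # slope length/steepness
C  = rng.uniform(0.01, 0.3, shape)             # cover management
P  = np.ones(shape)                            # support practice

soil_loss = R * K * LS * C * P                 # RUSLE: A = R K LS C P (t/ha/yr per cell)

flow_length = rng.uniform(100.0, 5000.0, shape)   # m to the outlet (hypothetical)
beta = 5e-4                                       # calibration coefficient from observed loads
sdr = np.exp(-beta * flow_length)                 # distributed sediment delivery ratio

cell_area_ha = 0.09                               # 30 m x 30 m cells
sediment_yield = (soil_loss * sdr * cell_area_ha).sum()
print(f"basin sediment yield: {sediment_yield:.0f} t/yr")
```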

  20. Identifying Qualitative Factors Affecting the Production and Distribution of Information and Knowledge in Science and Technology Parks of Iran

    Directory of Open Access Journals (Sweden)

    Ali Haji Shamsaei

    2017-06-01

    Full Text Available This study was conducted in order to identify the qualitative factors affecting the production and distribution of information and knowledge in the science and technology parks of Iran. The research was applied research carried out with a qualitative method. The study population consisted of 10 managers of knowledge-based companies. The data were collected from the population using semi-structured and in-depth interviews. Content analysis was used for data analysis. The analysis of the qualitative factors affecting the production and distribution of information and knowledge in the science and technology parks of Iran led to the extraction of 39 components, which were classified into four categories: (I) foreign and domestic policy, (II) financial and economic support, (III) infrastructure barriers and (IV) cultural barriers. The results showed that overcoming the political, financial and economic, infrastructural and cultural barriers has an undeniable impact on the production and distribution of information and knowledge.

  1. Testing the mutual information expansion of entropy with multivariate Gaussian distributions.

    Science.gov (United States)

    Goethe, Martin; Fita, Ignacio; Rubi, J Miguel

    2017-12-14

    The mutual information expansion (MIE) represents an approximation of the configurational entropy in terms of low-dimensional integrals. It is frequently employed to compute entropies from simulation data of large systems, such as macromolecules, for which brute-force evaluation of the full configurational integral is intractable. Here, we test the validity of MIE for systems consisting of more than m = 100 degrees of freedom (dofs). The dofs are distributed according to multivariate Gaussian distributions which were generated from protein structures using a variant of the anisotropic network model. For the Gaussian distributions, we have semi-analytical access to the configurational entropy as well as to all contributions of MIE. This allows us to accurately assess the validity of MIE for different situations. We find that MIE diverges for systems containing long-range correlations which means that the error of consecutive MIE approximations grows with the truncation order n for all tractable n ≪ m. This fact implies severe limitations on the applicability of MIE, which are discussed in the article. For systems with correlations that decay exponentially with distance, MIE represents an asymptotic expansion of entropy, where the first successive MIE approximations approach the exact entropy, while MIE also diverges for larger orders. In this case, MIE serves as a useful entropy expansion when truncated up to a specific truncation order which depends on the correlation length of the system.
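
    For multivariate Gaussian distributions both the exact entropy and the MIE terms are available in closed form, so the truncation error can be checked directly; the covariance matrix below is a small synthetic example, not an anisotropic network model of a protein.

```python
# Sketch: exact entropy of a multivariate Gaussian vs. the first- and second-order
# mutual information expansion (sum of marginals minus pairwise mutual informations).
import numpy as np

rng = np.random.default_rng(9)
m = 8
A = rng.normal(size=(m, m))
cov = A @ A.T + m * np.eye(m)          # random positive-definite covariance

def gaussian_entropy(sigma):
    k = sigma.shape[0]
    return 0.5 * (k * np.log(2 * np.pi * np.e) + np.linalg.slogdet(sigma)[1])

H_exact = gaussian_entropy(cov)
H1 = sum(gaussian_entropy(cov[[i]][:, [i]]) for i in range(m))   # sum of marginal entropies

H2 = H1
for i in range(m):
    for j in range(i + 1, m):
        sub = cov[np.ix_([i, j], [i, j])]
        I_ij = (gaussian_entropy(cov[[i]][:, [i]]) +
                gaussian_entropy(cov[[j]][:, [j]]) - gaussian_entropy(sub))
        H2 -= I_ij                      # subtract each pairwise mutual information

print(f"exact: {H_exact:.3f}  first-order MIE: {H1:.3f}  second-order MIE: {H2:.3f}")
```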

  2. A Novel Analysis Method for Paired-Sample Microbial Ecology Experiments.

    Science.gov (United States)

    Olesen, Scott W; Vora, Suhani; Techtmann, Stephen M; Fortney, Julian L; Bastidas-Oyanedel, Juan R; Rodríguez, Jorge; Hazen, Terry C; Alm, Eric J

    2016-01-01

    Many microbial ecology experiments use sequencing data to measure a community's response to an experimental treatment. In a common experimental design, two units, one control and one experimental, are sampled before and after the treatment is applied to the experimental unit. The four resulting samples contain information about the dynamics of organisms that respond to the treatment, but there are no analytical methods designed to extract exactly this type of information from this configuration of samples. Here we present an analytical method specifically designed to visualize and generate hypotheses about microbial community dynamics in experiments that have paired samples and few or no replicates. The method is based on the Poisson lognormal distribution, long studied in macroecology, which we found accurately models the abundance distribution of taxa counts from 16S rRNA surveys. To demonstrate the method's validity and potential, we analyzed an experiment that measured the effect of crude oil on ocean microbial communities in microcosm. Our method identified known oil degraders as well as two clades, Maricurvus and Rhodobacteraceae, that responded to amendment with oil but do not include known oil degraders. Our approach is sensitive to organisms that increased in abundance only in the experimental unit but less sensitive to organisms that increased in both control and experimental units, thus mitigating the role of "bottle effects".
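
    A minimal sketch of the Poisson-lognormal abundance model underlying the method, applied to the paired before/after design, is given below; the taxon counts, fold changes and responder set are synthetic illustrations.

```python
# Sketch: Poisson-lognormal taxon counts for a paired control/treatment design,
# flagging taxa whose change is large in the treated pair but not in the control pair.
import numpy as np

rng = np.random.default_rng(4)
n_taxa = 300
mu, sigma = 2.0, 1.5

rates = rng.lognormal(mu, sigma, n_taxa)                   # per-taxon lognormal rates
counts_control_before = rng.poisson(rates)
counts_control_after  = rng.poisson(rates)                 # no treatment effect

fold_change = np.ones(n_taxa)
fold_change[:10] = 20.0                                    # 10 taxa respond to treatment
counts_treated_before = rng.poisson(rates)
counts_treated_after  = rng.poisson(rates * fold_change)

delta_treated = np.log1p(counts_treated_after) - np.log1p(counts_treated_before)
delta_control = np.log1p(counts_control_after) - np.log1p(counts_control_before)
print("top candidate responders:", np.argsort(delta_treated - delta_control)[-10:])
```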

  3. Comparison of SHOX and associated elements duplications distribution between patients (Lėri-Weill dyschondrosteosis/idiopathic short stature) and population sample.

    Science.gov (United States)

    Hirschfeldova, Katerina; Solc, Roman

    2017-09-05

    The effect of heterozygous duplications of SHOX and associated elements on Lėri-Weill dyschondrosteosis (LWD) and idiopathic short stature (ISS) development is less distinct than that of the reciprocal deletions. The aim of our study was to compare the frequency and distribution of duplications within SHOX and associated elements between a population sample and LWD (ISS) patients. A preliminary analysis of a Czech population sample of 250 individuals, compared with our previously reported sample of 352 ISS/LWD Czech patients, indicated that the difference lies not so much in the frequency of duplications as in their distribution. In particular, there was an increased frequency of duplications residing at the CNE-9 enhancer in our LWD/ISS sample. To see whether the obtained data are consistent across published studies, we carried out a literature survey to collect published cases with SHOX or associated element duplications and formed the merged LWD, the merged ISS, and the merged population samples. The relative frequency of duplication of each particular region in each of those merged samples was calculated. There was a significant difference in the relative frequency of CNE-9 enhancer duplications (11 vs. 3) and complete SHOX (exon1-6b) duplications (4 vs. 24) (p-value 0.0139 and p-value 0.000014, respectively) between the merged LWD sample and the merged population sample. We thus propose that partial SHOX duplications and small duplications encompassing the CNE-9 enhancer could be highly penetrant alleles associated with ISS and LWD development. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. The depth distribution functions of the natural abundances of carbon isotopes in Alfisols thoroughly sampled by thin-layer sampling, and their relation to the dynamics of organic matter in these soils

    International Nuclear Information System (INIS)

    Becker-Heidmann, P.

    1989-01-01

    The aim of this study was to gain fundamental insight into the relationship between the depth distributions of the natural abundances of the 13C and 14C isotopes and the dynamics of the organic matter in Alfisols. For this purpose, six Alfisols were investigated: four forest soils from Northern Germany, two of them developed in Loess and two in glacial loam, one West German Loess soil used for fruit-growing, and one agricultural granite-gneiss soil from the semiarid part of India. The soil was sampled as successive horizontal layers of 2 cm depth from an area of 0.5 to 1 m2 in size, starting from the organic horizon down to the C horizon or the lower part of the Bt. This kind of complete thin-layer-wise sampling was applied here for the first time. The carbon content and the natural abundances of the 13C and 14C isotopes of each sample were determined. The δ13C value was measured by mass spectrometry. A vacuum preparation line with an electronically controlled cooling unit was constructed for this purpose. For the determination of the 14C content, the sample carbon was transferred into benzene, and its activity was measured by liquid scintillation spectrometry. From the combination of the depth distribution functions of the 14C activity and the δ13C value, and with the aid of additional analyses such as the C/N ratio and the particle size distribution, a conclusive interpretation as to the dynamics of the organic matter in the investigated Alfisols is given. (orig./BBR)

  5. Distributed artificial intelligence, diversity and information literacy

    Directory of Open Access Journals (Sweden)

    Peter Kåhre

    2010-09-01

    Full Text Available My proposal is based on my doctoral dissertation On the Shoulders of AI-technology: Sociology of Knowledge and Strong Artificial Intelligence, which I successfully defended on May 29th 2009. E-published http://www.lu.se/o.o.i.s?id=12588&postid=1389611 The dissertation is concerned with Sociology's stance in the debate on Strong Artificial Intelligence, i.e. AI-systems that are able to shape knowledge on their own. There is a need for sociologists to realize the difference between two approaches to constructing AI systems: Symbolic AI (or Classic AI) and Connectionistic AI in a distributed model – DAI. Sociological literature shows a largely critical attitude towards Symbolic AI, an attitude that is justified. The main theme of the dissertation is that DAI is not only compatible with Sociology's approach to what is social, but also constitutes an apt model of how a social system functions. This is consolidated with help from the German sociologist Niklas Luhmann's social systems theory. A lot of sociologists criticize AI because they think that diversity is important and can only be comprehended in informal circumstances that only humans interacting together can handle. They mean that social intelligence is needed to make something out of diversity and informalism. Luhmann's systems theory gives the opposite perspective. It tells us that it is social systems that communicate and produce new knowledge structures out of contingency. Psychological systems, i.e. humans, can only think within the circumstances the social system offers. In that way human thoughts are bound by formalism. Diversity is constructed when the social systems interact with complexity in their environments. They reduce the complexity and try to present it as meaningful diversity. Today, when most academic literature is electronically stored and accessible through the Internet from all over the world, DAI can help social systems to observe and reduce complexity in this

  6. Elaboration of austenitic stainless steel samples with bimodal grain size distributions and investigation of their mechanical behavior

    Science.gov (United States)

    Flipon, B.; de la Cruz, L. Garcia; Hug, E.; Keller, C.; Barbe, F.

    2017-10-01

    Samples of 316L austenitic stainless steel with bimodal grain size distributions are elaborated using two distinct routes. The first one is based on powder metallurgy, using spark plasma sintering of two powders with different particle sizes. The second route applies the reverse-annealing method: it consists in inducing martensitic phase transformation by plastic strain and further annealing in order to obtain two austenitic grain populations with different sizes. Microstructural analyses reveal that both methods are suitable to generate significant grain size contrast and to control this contrast according to the elaboration conditions. Mechanical properties under tension are then characterized for different grain size distributions. Crystal plasticity finite element modelling is further applied in a configuration of bimodal distribution to analyse the role played by coarse grains within a matrix of fine grains, considering not only their volume fraction but also their spatial arrangement.

  7. Mapping molecular orientational distributions for biological sample in 3D (Conference Presentation)

    Science.gov (United States)

    HE, Wei; Ferrand, Patrick; Richter, Benjamin; Bastmeyer, Martin; Brasselet, Sophie

    2016-04-01

    Measuring molecular orientation properties is very appealing for scientists in molecular and cell biology, as well as biomedical research. Orientational organization at the molecular scale is indeed an important building block of cell and tissue morphology, mechanics, function and pathology. Recent work has shown that polarized fluorescence imaging, based on excitation polarization tuning in the sample plane, is able to probe molecular orientational order in biological samples; however, this applies only to information in 2D, projected onto the sample plane. To surpass this limitation, we extended this approach to excitation polarization tuning in 3D. The principle is based on the decomposition of any arbitrary 3D linear excitation into a polarization along the longitudinal z-axis and a polarization in the transverse xy-sample plane. We designed an interferometer with one arm generating radially polarized light (thus producing longitudinal polarization under high numerical aperture focusing), the other arm controlling a linear polarization in the transverse plane. The amplitude ratio between the two arms can vary so as to obtain any linearly polarized excitation in 3D at the focus of a high-NA objective. This technique has been characterized by polarimetry imaging at the back focal plane of the focusing objective and modeled theoretically. 3D polarized fluorescence microscopy is demonstrated on actin stress fibers in non-flat cells suspended on synthetic polymer structures forming supporting pillars, for which heterogeneous actin orientational order could be identified. This technique shows great potential for structural investigations in 3D biological systems, such as cell spheroids and tissues.

  8. The implicit assumption of symmetry and the species abundance distribution

    NARCIS (Netherlands)

    Alonso, D.; Ostling, A.; Etienne, R.S.

    2008-01-01

    Species abundance distributions (SADs) have played a historical role in the development of community ecology. They summarize information about the number and the relative abundance of the species encountered in a sample from a given community. For years ecologists have developed theory to

  9. The implicit assumption of symmetry and the species abundance distribution

    NARCIS (Netherlands)

    Alonso, David; Ostling, Annette; Etienne, Rampal S.

    Species abundance distributions (SADs) have played a historical role in the development of community ecology. They summarize information about the number and the relative abundance of the species encountered in a sample from a given community. For years ecologists have developed theory to

  10. Alpha Matting with KL-Divergence Based Sparse Sampling.

    Science.gov (United States)

    Karacan, Levent; Erdem, Aykut; Erdem, Erkut

    2017-06-22

    In this paper, we present a new sampling-based alpha matting approach for the accurate estimation of the foreground and background layers of an image. Previous sampling-based methods typically rely on certain heuristics in collecting representative samples from known regions, and thus their performance deteriorates if the underlying assumptions are not satisfied. To alleviate this, we take an entirely new approach and formulate sampling as a sparse subset selection problem where we propose to pick a small set of candidate samples that best explains the unknown pixels. Moreover, we describe a new dissimilarity measure for comparing two samples which is based on the KL-divergence between the distributions of features extracted in the vicinity of the samples. The proposed framework is general and could be easily extended to video matting by additionally taking temporal information into account in the sampling process. Evaluation on standard benchmark datasets for image and video matting demonstrates that our approach provides more accurate results compared to the state-of-the-art methods.
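
    One way such a dissimilarity could be computed is a symmetrised KL divergence between Gaussians fitted to the features around each sample, as sketched below; the closed-form Gaussian assumption and the synthetic RGB patches are illustrative and not necessarily the estimator used in the paper.

```python
# Sketch: symmetrised KL divergence between Gaussians fitted to the feature
# distributions collected in the vicinity of two samples.
import numpy as np

def gaussian_kl(mu0, cov0, mu1, cov1):
    k = mu0.size
    inv1 = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(inv1 @ cov0) + diff @ inv1 @ diff - k +
                  np.linalg.slogdet(cov1)[1] - np.linalg.slogdet(cov0)[1])

def sample_dissimilarity(patch_a, patch_b, eps=1e-6):
    mu_a, mu_b = patch_a.mean(0), patch_b.mean(0)
    cov_a = np.cov(patch_a, rowvar=False) + eps * np.eye(patch_a.shape[1])
    cov_b = np.cov(patch_b, rowvar=False) + eps * np.eye(patch_b.shape[1])
    return 0.5 * (gaussian_kl(mu_a, cov_a, mu_b, cov_b) +
                  gaussian_kl(mu_b, cov_b, mu_a, cov_a))

rng = np.random.default_rng(6)
fg_patch = rng.normal([0.8, 0.2, 0.2], 0.05, (50, 3))   # RGB features near a foreground sample
bg_patch = rng.normal([0.1, 0.3, 0.9], 0.05, (50, 3))   # features near a background sample
print("symmetrised KL:", round(sample_dissimilarity(fg_patch, bg_patch), 2))
```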

  11. Estimating cyclopoid copepod species richness and geographical distribution (Crustacea) across a large hydrographical basin: comparing between samples from water column (plankton) and macrophyte stands

    Directory of Open Access Journals (Sweden)

    Gilmar Perbiche-Neves

    2014-06-01

    Full Text Available Species richness and geographical distribution of Cyclopoida freshwater copepods were analyzed along the "La Plata" River basin. Ninety-six samples were taken from 24 sampling sites, twelve sites for zooplankton in open waters and twelve sites for zooplankton within macrophyte stands, including reservoirs and lotic stretches. There were, on average, three species per sample in the plankton compared to five per sample in macrophytes. Six species were exclusive to the plankton, 10 to macrophyte stands, and 17 were common to both. Only one species was found in similar proportions in plankton and macrophytes, while five species were widely found in plankton, and thirteen in macrophytes. The distinction between species from open water zooplankton and macrophytes was supported by nonmetric multidimensional analysis. There was no distinct pattern of endemicity within the basin, and double sampling contributes to this result. This lack of sub-regional faunal differentiation is in accordance with other studies that have shown that cyclopoids generally have wide geographical distribution in the Neotropics and that some species there are cosmopolitan. This contrasts with other freshwater copepods such as Calanoida and some Harpacticoida. We conclude that sampling plankton and macrophytes together provided a more accurate estimate of the richness and geographical distribution of these organisms than sampling in either one of those zones alone.

  12. Adaptive sampling based on the cumulative distribution function of order statistics to delineate heavy-metal contaminated soils using kriging

    International Nuclear Information System (INIS)

    Juang, K.-W.; Lee, D.-Y.; Teng, Y.-L.

    2005-01-01

    Correctly classifying 'contaminated' areas in soils, based on the threshold for a contaminated site, is important for determining effective clean-up actions. Pollutant mapping by means of kriging is increasingly being used for the delineation of contaminated soils. However, areas where the kriged pollutant concentrations are close to the threshold have a high possibility of being misclassified. In order to reduce the misclassification due to over- or under-estimation from kriging, an adaptive sampling approach using the cumulative distribution function of order statistics (CDFOS) was developed to draw additional samples for delineating contaminated soils while kriging. A heavy-metal contaminated site in Hsinchu, Taiwan was used to illustrate this approach. The results showed that, compared with random sampling, adaptive sampling using CDFOS reduced the kriging estimation errors and misclassification rates, and thus would appear to be a better choice than random sampling when additional sampling is required for delineating the 'contaminated' areas. - A sampling approach was derived for drawing additional samples while kriging
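
    The underlying intuition, that locations whose kriged estimates lie close to the threshold are the most likely to be misclassified, can be illustrated by ranking candidate locations by misclassification probability under a Gaussian kriging error. This sketch is not the CDFOS procedure itself; the kriged means, standard deviations and threshold are made-up values.

    import numpy as np
    from scipy.stats import norm

    def misclassification_probability(kriged_mean, kriged_sd, threshold):
        """P(true value falls on the other side of the threshold), assuming a
        Gaussian kriging error around the kriged estimate."""
        p_above = 1.0 - norm.cdf(threshold, loc=kriged_mean, scale=kriged_sd)
        classified_contaminated = kriged_mean >= threshold
        return np.where(classified_contaminated, 1.0 - p_above, p_above)

    # Candidate locations with kriged means/standard deviations (illustrative values)
    means = np.array([120.0, 195.0, 210.0, 310.0])   # mg/kg
    sds = np.array([15.0, 20.0, 25.0, 18.0])
    threshold = 200.0                                 # regulatory limit

    p_wrong = misclassification_probability(means, sds, threshold)
    priority = np.argsort(-p_wrong)                   # sample the riskiest locations first
    print(list(zip(priority, p_wrong[priority])))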

  13. Study on the contents of trace rare earth elements and their distribution in wheat and rice samples by RNAA

    International Nuclear Information System (INIS)

    Sun Jingxin; Zhao Hang; Wang Yuqi

    1994-01-01

    The concentrations of 8 REE (La, Ce, Nd, Sm, Eu, Tb, Yb and Lu) in wheat and rice samples have been determined by RNAA. The contents and distributions of REE in each part of the plants (i.e. root, leaf, stem, husk and seed) and their host soils were studied; the samples included plants treated with rare earth elements in farming as well as control samples. The effects of applying rare earths on the uptake of REE by the plants and the implications of REE accumulation in the grains for human health were also discussed. (author) 9 refs.; 4 figs.; 4 tabs

  14. Elemental analysis of size-fractionated particulate matter sampled in Goeteborg, Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, Annemarie [Department of Chemistry, Atmospheric Science, Goeteborg University, SE-412 96 Goeteborg (Sweden)], E-mail: wagnera@chalmers.se; Boman, Johan [Department of Chemistry, Atmospheric Science, Goeteborg University, SE-412 96 Goeteborg (Sweden); Gatari, Michael J. [Institute of Nuclear Science and Technology, University of Nairobi, P.O. Box 30197-00100, Nairobi (Kenya)

    2008-12-15

    The aim of the study was to investigate the mass distribution of trace elements in aerosol samples collected in the urban area of Goeteborg, Sweden, with special focus on the impact of different air masses and anthropogenic activities. Three measurement campaigns were conducted during December 2006 and January 2007. A PIXE cascade impactor was used to collect particulate matter in 9 size fractions ranging from 16 to 0.06 μm aerodynamic diameter. Polished quartz carriers were chosen as collection substrates for the subsequent direct analysis by TXRF. To investigate the sources of the analyzed air masses, backward trajectories were calculated. Our results showed that diurnal sampling was sufficient to investigate the mass distribution for Br, Ca, Cl, Cu, Fe, K, Sr and Zn, whereas a 5-day sampling period resulted in additional information on mass distribution for Cr and S. Unimodal mass distributions were found in the study area for the elements Ca, Cl, Fe and Zn, whereas the distributions for Br, Cu, Cr, K, Ni and S were bimodal, indicating high temperature processes as source of the submicron particle components. The measurement period including the New Year firework activities showed both an extensive increase in concentrations as well as a shift to the submicron range for K and Sr, elements that are typically found in fireworks. Further research is required to validate the quantification of trace elements directly collected on sample carriers.

  15. Elemental analysis of size-fractionated particulate matter sampled in Goeteborg, Sweden

    International Nuclear Information System (INIS)

    Wagner, Annemarie; Boman, Johan; Gatari, Michael J.

    2008-01-01

    The aim of the study was to investigate the mass distribution of trace elements in aerosol samples collected in the urban area of Goeteborg, Sweden, with special focus on the impact of different air masses and anthropogenic activities. Three measurement campaigns were conducted during December 2006 and January 2007. A PIXE cascade impactor was used to collect particulate matter in 9 size fractions ranging from 16 to 0.06 μm aerodynamic diameter. Polished quartz carriers were chosen as collection substrates for the subsequent direct analysis by TXRF. To investigate the sources of the analyzed air masses, backward trajectories were calculated. Our results showed that diurnal sampling was sufficient to investigate the mass distribution for Br, Ca, Cl, Cu, Fe, K, Sr and Zn, whereas a 5-day sampling period resulted in additional information on mass distribution for Cr and S. Unimodal mass distributions were found in the study area for the elements Ca, Cl, Fe and Zn, whereas the distributions for Br, Cu, Cr, K, Ni and S were bimodal, indicating high temperature processes as source of the submicron particle components. The measurement period including the New Year firework activities showed both an extensive increase in concentrations as well as a shift to the submicron range for K and Sr, elements that are typically found in fireworks. Further research is required to validate the quantification of trace elements directly collected on sample carriers

  16. Calculation of life distributions, in particular Weibull distributions, from operational observations

    International Nuclear Information System (INIS)

    Rauhut, J.

    1982-01-01

    Established methods are presented by which life distributions of machine elements can be determined on the basis of laboratory experiments and operational observations. Practical observations are given special attention, as the results estimated on the basis of conventional methods have not been accurate enough. As an introduction, the stochastic life concept, the general method of determining life distributions, various sampling methods, and the Weibull distribution are explained. Further, possible life testing schedules and maximum-likelihood estimates are discussed for the complete sample case and for censored sampling without replacement in laboratory experiments. Finally, censored sampling with replacement in laboratory experiments is discussed; it is shown how suitable parameter estimates can be obtained for given life distributions by means of the maximum-likelihood method. (orig./RW) [de

  17. Myocardium tracking via matching distributions.

    Science.gov (United States)

    Ben Ayed, Ismail; Li, Shuo; Ross, Ian; Islam, Ali

    2009-01-01

    The goal of this study is to investigate automatic myocardium tracking in cardiac Magnetic Resonance (MR) sequences using global distribution matching via level-set curve evolution. Rather than relying on the pixelwise information as in existing approaches, distribution matching compares intensity distributions, and consequently, is well-suited to the myocardium tracking problem. Starting from a manual segmentation of the first frame, two curves are evolved in order to recover the endocardium (inner myocardium boundary) and the epicardium (outer myocardium boundary) in all the frames. For each curve, the evolution equation is sought following the maximization of a functional containing two terms: (1) a distribution matching term measuring the similarity between the non-parametric intensity distributions sampled from inside and outside the curve to the model distributions of the corresponding regions estimated from the previous frame; (2) a gradient term for smoothing the curve and biasing it toward high gradient of intensity. The Bhattacharyya coefficient is used as a similarity measure between distributions. The functional maximization is obtained by the Euler-Lagrange ascent equation of curve evolution, and efficiently implemented via level-set. The performance of the proposed distribution matching was quantitatively evaluated by comparisons with independent manual segmentations approved by an experienced cardiologist. The method was applied to ten 2D mid-cavity MR sequences corresponding to ten different subjects. Although neither shape prior knowledge nor curve coupling were used, quantitative evaluation demonstrated that the results were consistent with manual segmentations. The proposed method compares well with existing methods. The algorithm also yields a satisfying reproducibility. Distribution matching leads to a myocardium tracking which is more flexible and applicable than existing methods because the algorithm uses only the current data, i.e., does not
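
    The similarity measure named above, the Bhattacharyya coefficient, is simply the sum over bins of the square roots of the products of two normalized histograms, equal to 1 for identical distributions and 0 for disjoint ones. A minimal sketch, with bin count and toy intensities chosen for illustration:

    import numpy as np

    def bhattacharyya_coefficient(sample_a, sample_b, bins=64, value_range=(0, 255)):
        """Bhattacharyya coefficient between the normalized intensity histograms
        of two pixel samples; 1.0 means identical distributions, 0.0 disjoint."""
        p, _ = np.histogram(sample_a, bins=bins, range=value_range)
        q, _ = np.histogram(sample_b, bins=bins, range=value_range)
        p = p / p.sum()
        q = q / q.sum()
        return float(np.sum(np.sqrt(p * q)))

    # Toy usage: compare pixels sampled inside/outside an evolving curve
    rng = np.random.default_rng(1)
    inside = rng.normal(100, 10, size=5000)    # myocardium-like intensities
    outside = rng.normal(160, 20, size=5000)   # background-like intensities
    print(bhattacharyya_coefficient(inside, inside[::2]))  # high (same distribution)
    print(bhattacharyya_coefficient(inside, outside))      # low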

  18. Tourism and information technologies distribution channels: a panorama of the brazilian reality

    Directory of Open Access Journals (Sweden)

    Elisete Santos da Silva Zagheni

    2011-05-01

    Full Text Available Technological evolution has made it possible that the same service may be delivered by means of multiple channels, including in the tourism sector.  The present study proposes to present a bibliographic summary of the distribution channels in tourism and the impact of information technologies (IT in these channels.  Based on exploratory research, 24 scientific papers were analyzed with the intention of identifying a research structure concerning the channels of Brazilian tourism and IT.  We observed that in Brazil, there is a lack of work concerning distribution channels in tourism, highlighting the management of these channels under the view of supply chains.  Beyond this, these papers concentrate on two elements of the channels: means of lodging and hospitality, and travel agencies.  These elements have used direct channels based on IT in order to support service activities and the commercialization of the tourist product.

  19. Mapping the Distribution and Flora of the Weeds in Canola Fields of Gorgan Township by Geographic Information System (GIS

    Directory of Open Access Journals (Sweden)

    sahar jannati ataie

    2018-02-01

    Full Text Available Introduction: Oilseeds are the world's second most important food supply after cereals. These crops are grown primarily for the oil contained in the seeds. The major world sources of edible seed oils are soybeans, sunflowers, canola, cotton and peanuts. Canola, which belongs to the genus Brassica, the botanical family that also includes cauliflower and cabbages, is one of the most important oilseed crops in the world. Weeds are one of the major problems in canola production, reducing both yield and quality. In general, one of the most important inputs for developing management plans is information about the weed flora and its geographic distribution. Knowledge of the weed flora enables one to use the required herbicides and formulate other suitable management strategies. It is also useful in exploiting abundant weeds as a cover crop or pasture and for other economic uses. Geographic information systems are well suited to weed science and to the management and analysis of agricultural information. In this study, the distribution and flora of the weeds in canola fields of Gorgan Township were investigated using a Geographic Information System. Material and Methods: Sampling was conducted during May and June 2014 in 58 canola fields in Gorgan Township (Golestan province), and the weed species were sampled and identified along a W-shaped pattern; the density, frequency, uniformity and abundance of each weed species were calculated with specific formulas. The geographic coordinates of the fields (latitude, longitude and elevation) were determined using a Garmin Map 60 GPS receiver. After collecting the data, and in order to create a database of weed distribution, the data were transferred from the GPS to ArcGIS 9.3.1 software. A georeferenced database was created from all the information obtained, and after separating the data based on the presence or absence of weeds in the fields, distribution maps were produced. Results and Discussion: The results showed that there are 35 weed

  20. 78 FR 33146 - Notice of Proposal Policy for Distribution of FAA Data and Information; Extension of Comment Period

    Science.gov (United States)

    2013-06-03

    ... DEPARTMENT OF TRANSPORTATION Federal Aviation Administration [Docket No. FAA-2013-0392; Notice No.] RIN 2120-AJ61 Notice of Proposal Policy for Distribution of FAA Data and Information; Extension of Comment Period AGENCY: Federal Aviation Administration (FAA), DOT. ACTION: Notice for Data and Information...

  1. A normative inference approach for optimal sample sizes in decisions from experience

    Science.gov (United States)

    Ostwald, Dirk; Starke, Ludger; Hertwig, Ralph

    2015-01-01

    “Decisions from experience” (DFE) refers to a body of work that emerged in research on behavioral decision making over the last decade. One of the major experimental paradigms employed to study experience-based choice is the “sampling paradigm,” which serves as a model of decision making under limited knowledge about the statistical structure of the world. In this paradigm respondents are presented with two payoff distributions, which, in contrast to standard approaches in behavioral economics, are specified not in terms of explicit outcome-probability information, but by the opportunity to sample outcomes from each distribution without economic consequences. Participants are encouraged to explore the distributions until they feel confident enough to decide which one they would prefer to draw from in a final trial involving real monetary payoffs. One commonly employed measure to characterize the behavior of participants in the sampling paradigm is the sample size, that is, the number of outcome draws which participants choose to obtain from each distribution prior to terminating sampling. A natural question that arises in this context concerns the “optimal” sample size, which could be used as a normative benchmark to evaluate human sampling behavior in DFE. In this theoretical study, we relate the DFE sampling paradigm to the classical statistical decision theoretic literature and, under a probabilistic inference assumption, evaluate optimal sample sizes for DFE. In our treatment we go beyond analytically established results by showing how the classical statistical decision theoretic framework can be used to derive optimal sample sizes under arbitrary, but numerically evaluable, constraints. Finally, we critically evaluate the value of deriving optimal sample sizes under this framework as testable predictions for the experimental study of sampling behavior in DFE. PMID:26441720
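
    The notion of an "optimal" sample size can be made concrete with a brute-force expected-value calculation: for each candidate n, simulate drawing n free outcomes from each of two options, choosing the option with the higher sample mean, and collecting its true expected payoff, then subtract an assumed cost per draw. This Monte Carlo sketch (with hypothetical Gaussian payoffs and sampling cost) only illustrates the trade-off and is not the decision-theoretic derivation used in the study.

    import numpy as np

    rng = np.random.default_rng(2)

    def expected_net_gain(n, mu_a=1.0, mu_b=1.2, sigma=2.0, cost_per_draw=0.005,
                          n_sim=20000):
        """Expected payoff of 'sample n from each option, then choose the larger
        sample mean', minus the total sampling cost (all values are assumptions)."""
        means_a = rng.normal(mu_a, sigma / np.sqrt(n), size=n_sim)  # sample means via CLT
        means_b = rng.normal(mu_b, sigma / np.sqrt(n), size=n_sim)
        payoff = np.where(means_a > means_b, mu_a, mu_b)            # true mean of chosen option
        return payoff.mean() - 2 * n * cost_per_draw

    candidate_sizes = range(1, 41)
    gains = [expected_net_gain(n) for n in candidate_sizes]
    best_n = candidate_sizes[int(np.argmax(gains))]
    print("optimal sample size per option:", best_n)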

  2. Ensemble Sampling

    OpenAIRE

    Lu, Xiuyuan; Van Roy, Benjamin

    2017-01-01

    Thompson sampling has emerged as an effective heuristic for a broad range of online decision problems. In its basic form, the algorithm requires computing and sampling from a posterior distribution over models, which is tractable only for simple special cases. This paper develops ensemble sampling, which aims to approximate Thompson sampling while maintaining tractability even in the face of complex models such as neural networks. Ensemble sampling dramatically expands on the range of applica...
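
    A toy sketch of the idea for a Gaussian bandit: a small ensemble of value estimates is maintained, each updated with an independently perturbed copy of the observed reward so that the spread across the ensemble stands in for posterior uncertainty; at each step one member is drawn uniformly and its greedy action is played. The arm means, perturbation scheme and constants are illustrative assumptions, not the paper's neural-network setting.

    import numpy as np

    rng = np.random.default_rng(3)
    true_means = np.array([0.2, 0.5, 0.35])       # unknown arm means (assumed for the demo)
    n_arms, n_models, horizon = len(true_means), 10, 2000
    noise_sd, prior_sd = 1.0, 1.0

    # Each ensemble member keeps its own perturbed estimate per arm.
    estimates = rng.normal(0.0, prior_sd, size=(n_models, n_arms))
    counts = np.ones((n_models, n_arms))          # pseudo-count from the prior

    for t in range(horizon):
        m = rng.integers(n_models)                # pick one ensemble member uniformly
        arm = int(np.argmax(estimates[m]))        # act greedily w.r.t. that member
        reward = rng.normal(true_means[arm], noise_sd)
        # Update every member with an independently perturbed copy of the reward,
        # which keeps the ensemble spread approximating posterior uncertainty.
        perturbed = reward + rng.normal(0.0, noise_sd, size=n_models)
        counts[:, arm] += 1
        estimates[:, arm] += (perturbed - estimates[:, arm]) / counts[:, arm]

    print("estimated arm means (ensemble average):", estimates.mean(axis=0).round(3))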

  3. Mapserver – Information Flow Management Software for The Border Guard Distributed Data Exchange System

    Directory of Open Access Journals (Sweden)

    Blok Marek

    2016-09-01

    Full Text Available In this paper, the architecture of software designed for managing position and identification data of floating and flying objects in maritime areas controlled by the Polish Border Guard is presented. The software was designed for managing information stored in a distributed system, with two variants: one for a mobile device installed on a vessel, an airplane or a car, and a second for a central server. The details of the implementation of all functionalities of the MapServer, in both the mobile and central versions, are briefly presented on the basis of information flow diagrams.

  4. Development of spatial scaling technique of forest health sample point information

    Science.gov (United States)

    Lee, J.; Ryu, J.; Choi, Y. Y.; Chung, H. I.; Kim, S. H.; Jeon, S. W.

    2017-12-01

    Most forest health assessments are limited to monitoring at fixed sampling sites. Forest health monitoring in Britain has been carried out mainly on five species (Norway spruce, Sitka spruce, Scots pine, oak, beech), with the data, including density, compiled in an Oracle database. The Forest Health Assessment in Great Bay in the United States was conducted to identify the characteristics of the ecosystem populations of each area, based on evaluations of forest health by tree species, diameter at breast height, water pipe and density in the summer and fall of 200. In the case of Korea, in the first evaluation report on forest health vitality, 1000 sample points were placed in forests on a systematic 4 km × 4 km grid, and 29 items in four categories (tree health, vegetation, soil, and atmosphere) were surveyed. As mentioned above, existing research has relied on monitoring at the survey sample points, which makes it difficult to collect information that supports customized policies for regional sites. In the case of special forests such as urban forests and major forests, policy and management appropriate to the forest characteristics are needed. Therefore, it is necessary to extend the survey coverage for customized diagnosis and evaluation of forest health. For this reason, we constructed a spatial scaling method based on spatial interpolation according to the characteristics of each of the 29 indices in the sample point table of the first forest health vitality report. PCA and correlation analysis are conducted to identify significant indicators, weights are then assigned to each index, and forest health is evaluated through statistical grading.
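
    The spatial scaling step rests on interpolating each index from the monitored sample points to unsampled locations. A minimal inverse-distance-weighting sketch, with the interpolation method, power parameter and index values chosen purely for illustration (the study combines interpolation with PCA-based weighting of indices):

    import numpy as np

    def idw_interpolate(sample_xy, sample_values, query_xy, power=2.0, eps=1e-12):
        """Inverse-distance-weighted interpolation of a forest-health index
        from monitored sample points to arbitrary query locations."""
        d = np.linalg.norm(query_xy[:, None, :] - sample_xy[None, :, :], axis=2)
        w = 1.0 / (d ** power + eps)
        return (w * sample_values[None, :]).sum(axis=1) / w.sum(axis=1)

    # Toy usage: 5 sample points on a 4 km grid with an index value, queried at 2 new sites
    pts = np.array([[0, 0], [4, 0], [0, 4], [4, 4], [2, 2]], dtype=float)   # km
    vals = np.array([0.8, 0.6, 0.7, 0.5, 0.9])                              # health index
    queries = np.array([[1.0, 1.0], [3.0, 3.5]])
    print(idw_interpolate(pts, vals, queries).round(3))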

  5. Sample path analysis and distributions of boundary crossing times

    CERN Document Server

    Zacks, Shelemyahu

    2017-01-01

    This monograph is focused on the derivations of exact distributions of first boundary crossing times of Poisson processes, compound Poisson processes, and more general renewal processes. The content is limited to the distributions of first boundary crossing times and their applications to various stochastic models. This book provides the theory and techniques for exact computations of distributions and moments of level crossing times. In addition, these techniques could replace simulations in many cases, thus providing more insight about the phenomena studied. This book takes a general approach for studying telegraph processes and is based on nearly thirty published papers by the author and collaborators over the past twenty-five years. No prior knowledge of advanced probability is required, making the book widely accessible to students and researchers in applied probability, operations research, applied physics, and applied mathematics.

  6. Sampled-data consensus in switching networks of integrators based on edge events

    Science.gov (United States)

    Xiao, Feng; Meng, Xiangyu; Chen, Tongwen

    2015-02-01

    This paper investigates the event-driven sampled-data consensus in switching networks of multiple integrators and studies both the bidirectional interaction and leader-following passive reaction topologies in a unified framework. In these topologies, each information link is modelled by an edge of the information graph and assigned a sequence of edge events, which activate the mutual data sampling and controller updates of the two linked agents. Two kinds of edge-event-detecting rules are proposed for the general asynchronous data-sampling case and the synchronous periodic event-detecting case. They are implemented in a distributed fashion, and their effectiveness in reducing communication costs and solving consensus problems under a jointly connected topology condition is shown by both theoretical analysis and simulation examples.

  7. Dynamics of Biofilm Regrowth in Drinking Water Distribution Systems.

    Science.gov (United States)

    Douterelo, I; Husband, S; Loza, V; Boxall, J

    2016-07-15

    The majority of biomass within water distribution systems is in the form of attached biofilm. This is known to be central to drinking water quality degradation following treatment, yet little understanding of the dynamics of these highly heterogeneous communities exists. This paper presents original information on such dynamics, with findings demonstrating patterns of material accumulation, seasonality, and influential factors. Rigorous flushing operations repeated over a 1-year period on an operational chlorinated system in the United Kingdom are presented here. Intensive monitoring and sampling were undertaken, including time-series turbidity and detailed microbial analysis using 16S rRNA Illumina MiSeq sequencing. The results show that bacterial dynamics were influenced by differences in the supplied water and by the material remaining attached to the pipe wall following flushing. Turbidity, metals, and phosphate were the main factors correlated with the distribution of bacteria in the samples. Coupled with the lack of inhibition of biofilm development due to residual chlorine, this suggests that limiting inorganic nutrients, rather than organic carbon, might be a viable component in treatment strategies to manage biofilms. The research also showed that repeat flushing exerted beneficial selective pressure, giving another reason for flushing being a viable advantageous biofilm management option. This work advances our understanding of microbiological processes in drinking water distribution systems and helps inform strategies to optimize asset performance. This research provides novel information regarding the dynamics of biofilm formation in real drinking water distribution systems made of different materials. This new knowledge on microbiological process in water supply systems can be used to optimize the performance of the distribution network and to guarantee safe and good-quality drinking water to consumers. Copyright © 2016 Douterelo et al.

  8. The evaluation of supply chain performance in the Oil Products Distribution Company, using information technology indicators and fuzzy TOPSIS technique

    Directory of Open Access Journals (Sweden)

    Daryosh Mohamadi Janaki

    2018-08-01

    Full Text Available Information Technology (IT) plays an essential role in the development of effective supply chain planning, and it can improve supply chain performance either directly or indirectly. As a national industry, the National Iranian Oil Products Distribution Company involves a large number of organizations within its supply chain. This descriptive survey therefore uses information-sharing indicators and the fuzzy TOPSIS technique, based on the opinions of managers and experts, to evaluate and rank several oil products distribution companies. The data are analyzed, and the results show that the Oil Products Distribution Company of Chaharmahal and Bakhtiari received the highest rank and Farsan the lowest rank compared with the other regional companies.
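
    The ranking step rests on the TOPSIS idea: alternatives are scored by their relative closeness to an ideal solution and distance from an anti-ideal one. The sketch below shows the crisp version of the method for clarity (the study uses fuzzy numbers derived from expert opinions); the company scores and weights are purely illustrative.

    import numpy as np

    def topsis(decision_matrix, weights, benefit=None):
        """Rank alternatives (rows) on criteria (columns) by relative closeness
        to the ideal solution; 'benefit' marks criteria where larger is better."""
        X = np.asarray(decision_matrix, dtype=float)
        if benefit is None:
            benefit = [True] * X.shape[1]
        R = X / np.linalg.norm(X, axis=0)            # vector-normalize each criterion
        V = R * np.asarray(weights)
        ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
        anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
        d_plus = np.linalg.norm(V - ideal, axis=1)
        d_minus = np.linalg.norm(V - anti, axis=1)
        return d_minus / (d_plus + d_minus)          # higher = better

    scores = topsis(
        [[7, 8, 6], [9, 6, 8], [5, 7, 7]],           # 3 regional companies x 3 IT indicators
        weights=[0.5, 0.3, 0.2],
    )
    print(np.argsort(-scores))                        # ranking, best first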

  9. Can anchovy age structure be estimated from length distribution ...

    African Journals Online (AJOL)

    The analysis provides a new time-series of proportions-at-age 1, together with associated standard errors, for input into assessments of the resource. The results also caution against the danger of scientists reading more information into data than is really there. Keywords: anchovy, effective sample size, length distribution, ...

  10. The complete information for phenomenal distributed parameter control of multicomponent chemical processes in gas, fluid and solid phase

    International Nuclear Information System (INIS)

    Niemiec, W.

    1985-01-01

    A constitutive mathematical model of distributed parameters of multicomponent chemical processes in gas, fluid and solid phase is utilized for the realization of phenomenal distributed parameter control of these processes. Original systems of partial differential constitutive state equations, in the derivative forms /I/, /II/ and /III/, are solved in this paper from the point of view of information for phenomenal distributed parameter control of the considered processes. Obtained in this way for multicomponent chemical processes in gas, fluid and solid phase are: dynamical working space-time characteristics (analytical solutions in the working space-time of chemical reactors), dynamical phenomenal Green functions as working space-time transfer functions, statical working space characteristics (analytical solutions in the working space of chemical reactors), and statical phenomenal Green functions as working space transfer functions. These are applied as information for the realization of constitutive distributed parameter control of the mass, energy and momentum aspects of the above processes. Two cases are considered, distinguished by the existence of: A° - initial conditions, and B° - initial and boundary conditions, for multicomponent chemical processes in gas, fluid and solid phase.

  11. Monitoring fish distributions along electrofishing segments

    Science.gov (United States)

    Miranda, Leandro E.

    2014-01-01

    Electrofishing is widely used to monitor fish species composition and relative abundance in streams and lakes. According to standard protocols, multiple segments are selected in a body of water to monitor population relative abundance as the ratio of total catch to total sampling effort. The standard protocol provides an assessment of fish distribution at a macrohabitat scale among segments, but not within segments. An ancillary protocol was developed for assessing fish distribution at a finer scale within electrofishing segments. The ancillary protocol was used to estimate spacing, dispersion, and association of two species along shore segments in two local reservoirs. The added information provided by the ancillary protocol may be useful for assessing fish distribution relative to fish of the same species, to fish of different species, and to environmental or habitat characteristics.

  12. Interactive microbial distribution analysis using BioAtlas

    DEFF Research Database (Denmark)

    Lund, Jesper; List, Markus; Baumbach, Jan

    2017-01-01

    ... to analyze microbial distribution in a location-specific context. BioAtlas is an interactive web application that closes this gap between sequence databases, taxonomy profiling and geo/body-location information. It enables users to browse taxonomically annotated sequences across (i) the world map, (ii) human body maps and (iii) user-defined maps. It further allows for (iv) uploading of own sample data, which can be placed on existing maps to (v) browse the distribution of the associated taxonomies. Finally, BioAtlas enables users to (vi) contribute custom maps (e.g. for plants or animals) and to map ...

  13. The Evaluative Advantage of Novel Alternatives: An Information-Sampling Account.

    Science.gov (United States)

    Le Mens, Gaël; Kareev, Yaakov; Avrahami, Judith

    2016-02-01

    New products, services, and ideas are often evaluated more favorably than similar but older ones. Although several explanations of this phenomenon have been proposed, we identify an overlooked asymmetry in information about new and old items that emerges when people seek positive experiences and learn about the qualities of (noisy) alternatives by experiencing them. The reason for the asymmetry is that people avoid rechoosing alternatives that previously led to poor outcomes; hence, additional feedback on their qualities is precluded. Negative quality estimates, even when caused by noise, thus tend to persist. This negative bias takes time to develop, and affects old alternatives more strongly than similar but newer alternatives. We analyze a simple learning model and demonstrate the process by which people would tend to evaluate a new alternative more positively than an older alternative with the same payoff distribution. The results from two experimental studies (Ns = 769 and 805) support the predictions of our model. © The Author(s) 2015.
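
    The mechanism can be reproduced in a short simulation: agents learn option values from sampled payoffs, stop choosing any option whose estimate falls below an assumed safe alternative, and therefore never correct negative errors; an option introduced later has had fewer opportunities to be trapped this way and ends up rated higher on average. The payoff distributions, safe option and learning rule below are illustrative assumptions, not the authors' exact model.

    import numpy as np

    rng = np.random.default_rng(4)
    n_agents, n_trials, intro_trial = 5000, 100, 50
    noise = 1.0   # both risky options pay N(0, 1); an assumed safe option always pays 0

    old_est = np.zeros(n_agents)   # running estimate of the old risky option
    new_est = np.zeros(n_agents)   # running estimate of the new risky option
    old_n = np.zeros(n_agents)
    new_n = np.zeros(n_agents)

    for t in range(n_trials):
        # The new option only exists from trial `intro_trial` onwards.
        prefer_old = (old_est >= new_est) | (t < intro_trial)
        best_risky = np.where(prefer_old, old_est, new_est)
        # Agents only try a risky option if its estimate is at least as good as the
        # safe option (0); otherwise they abstain and the estimate is never corrected.
        explore = best_risky >= 0.0
        payoff = rng.normal(0.0, noise, size=n_agents)

        upd_old = explore & prefer_old
        upd_new = explore & ~prefer_old
        old_n += upd_old
        new_n += upd_new
        old_est += np.where(upd_old, (payoff - old_est) / np.maximum(old_n, 1), 0.0)
        new_est += np.where(upd_new, (payoff - new_est) / np.maximum(new_n, 1), 0.0)

    print("mean estimate of the old option:", round(float(old_est.mean()), 3))   # more negative
    print("mean estimate of the new option:", round(float(new_est.mean()), 3))   # closer to 0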

  14. Baltimore PM2.5 Supersite: highly time-resolved organic compounds--sampling duration and phase distribution--implications for health effects studies.

    Science.gov (United States)

    Rogge, Wolfgang F; Ondov, John M; Bernardo-Bricker, Anna; Sevimoglu, Orhan

    2011-12-01

    As part of the Baltimore PM2.5 Supersite study, intensive three-hourly continuous PM2.5 sampling was conducted for nearly 4 weeks in the summer of 2002 as well as in the winter of 2002/2003. Close to 120 individual organic compounds have been quantified separately in filter and polyurethane foam (PUF) plug pairs for 17 days in each sampling period. Here, the focus is on (1) briefly describing the new sampling system, (2) discussing filter/PUF plug breakthrough experiments for semi-volatile compounds, (3) providing insight into the phase distribution of semi-volatile organic species, and (4) discussing the impact of air pollution sampling time on human exposure, with information on maximum 3- and 24-h averaged ambient concentrations of organic pollutants with potentially adverse health effects. The newly developed sampling system consisted of five electronically controlled parallel sampling channels operated in a sequential mode. Semi-volatile breakthrough experiments were conducted in three separate experiments over 3, 4, and 5 h each, using one filter and three PUF plugs. Valuable insight was obtained about the transfer of semi-volatile organic compounds through the sequence of PUF plugs, and a cut-off could be defined for complete sampling of semi-volatile compounds on only one filter/PUF plug pair, i.e., the setup finally used during the seasonal PM2.5 sampling campaign. Accordingly, n-nonadecane (C19), with a vapor pressure (vp) of 3.25 × 10⁻⁴ Torr, is collected with > 95% efficiency on the filter/PUF pair. Phenanthrene, the most abundant PAH sampled (vp 6.2 × 10⁻⁵ Torr), was collected completely in wintertime and correlates very well with three-hourly PM2.5 ambient concentrations. Valuable data on the fractional partitioning of semi-volatile organics as a function of season are provided here and can be used to differentiate the human uptake of an organic pollutant of interest via gas- and particle-phase exposure. Health effects studies

  15. Workflow-enabled distributed component-based information architecture for digital medical imaging enterprises.

    Science.gov (United States)

    Wong, Stephen T C; Tjandra, Donny; Wang, Huili; Shen, Weimin

    2003-09-01

    Few information systems today offer a flexible means to define and manage the automated part of radiology processes, which provide clinical imaging services for the entire healthcare organization. Even fewer of them provide a coherent architecture that can easily cope with heterogeneity and inevitable local adaptation of applications and can integrate clinical and administrative information to aid better clinical, operational, and business decisions. We describe an innovative enterprise architecture of image information management systems to fill the needs. Such a system is based on the interplay of production workflow management, distributed object computing, Java and Web techniques, and in-depth domain knowledge in radiology operations. Our design adapts the approach of "4+1" architectural view. In this new architecture, PACS and RIS become one while the user interaction can be automated by customized workflow process. Clinical service applications are implemented as active components. They can be reasonably substituted by applications of local adaptations and can be multiplied for fault tolerance and load balancing. Furthermore, the workflow-enabled digital radiology system would provide powerful query and statistical functions for managing resources and improving productivity. This paper will potentially lead to a new direction of image information management. We illustrate the innovative design with examples taken from an implemented system.

  16. Samples and Sampling Protocols for Scientific Investigations | Joel ...

    African Journals Online (AJOL)

    ... from sampling, through sample preparation, calibration to final measurement and reporting. This paper, therefore offers useful information on practical guidance on sampling protocols in line with best practice and international standards. Keywords: Sampling, sampling protocols, chain of custody, analysis, documentation ...

  17. Optimal cross-sectional sampling for river modelling with bridges: An information theory-based method

    Energy Technology Data Exchange (ETDEWEB)

    Ridolfi, E.; Napolitano, F., E-mail: francesco.napolitano@uniroma1.it [Sapienza Università di Roma, Dipartimento di Ingegneria Civile, Edile e Ambientale (Italy); Alfonso, L. [Hydroinformatics Chair Group, UNESCO-IHE, Delft (Netherlands); Di Baldassarre, G. [Department of Earth Sciences, Program for Air, Water and Landscape Sciences, Uppsala University (Sweden)

    2016-06-08

    The description of river topography has a crucial role in accurate one-dimensional (1D) hydraulic modelling. Specifically, cross-sectional data define the riverbed elevation, the flood-prone area, and thus, the hydraulic behavior of the river. Here, the problem of the optimal cross-sectional spacing is solved through an information theory-based concept. The optimal subset of locations is the one with the maximum information content and the minimum amount of redundancy. The original contribution is the introduction of a methodology to sample river cross sections in the presence of bridges. The approach is tested on the Grosseto River (IT) and is compared to existing guidelines. The results show that the information theory-based approach can support traditional methods to estimate rivers’ cross-sectional spacing.
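
    The selection criterion, maximum joint information content with minimum redundancy, can be illustrated with a greedy entropy-based picker: water-level series at candidate cross sections are discretized, and sections are added one at a time so that each addition maximizes the joint entropy of the selected set. This is only a schematic of the information-theoretic idea, with made-up data; it is not the exact functional or the bridge-handling rules from the paper.

    import numpy as np

    def quantize(series, n_bins=8):
        """Discretize a continuous series into equally spaced bins."""
        edges = np.linspace(series.min(), series.max(), n_bins + 1)
        return np.digitize(series, edges[1:-1])

    def joint_entropy(columns):
        """Shannon entropy (bits) of the joint distribution of discretized columns."""
        keys = list(zip(*columns))
        _, counts = np.unique(keys, axis=0, return_counts=True)
        p = counts / counts.sum()
        return float(-(p * np.log2(p)).sum())

    def greedy_select(levels, k):
        """Greedily pick k cross sections maximizing the joint entropy of their levels."""
        n_sites = levels.shape[1]
        cols = [quantize(levels[:, j]) for j in range(n_sites)]
        selected = []
        for _ in range(k):
            gains = [(joint_entropy([cols[i] for i in selected + [j]]), j)
                     for j in range(n_sites) if j not in selected]
            selected.append(max(gains)[1])
        return selected

    # Toy usage: simulated water levels at 10 candidate sections, 500 time steps
    rng = np.random.default_rng(5)
    base = rng.normal(size=(500, 1))
    levels = base + 0.3 * rng.normal(size=(500, 10))   # correlated candidate sections
    print("selected sections:", greedy_select(levels, k=3))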

  18. Optimal cross-sectional sampling for river modelling with bridges: An information theory-based method

    International Nuclear Information System (INIS)

    Ridolfi, E.; Napolitano, F.; Alfonso, L.; Di Baldassarre, G.

    2016-01-01

    The description of river topography has a crucial role in accurate one-dimensional (1D) hydraulic modelling. Specifically, cross-sectional data define the riverbed elevation, the flood-prone area, and thus, the hydraulic behavior of the river. Here, the problem of the optimal cross-sectional spacing is solved through an information theory-based concept. The optimal subset of locations is the one with the maximum information content and the minimum amount of redundancy. The original contribution is the introduction of a methodology to sample river cross sections in the presence of bridges. The approach is tested on the Grosseto River (IT) and is compared to existing guidelines. The results show that the information theory-based approach can support traditional methods to estimate rivers’ cross-sectional spacing.

  19. Ancestry informative markers: inference of ancestry in aged bone samples using an autosomal AIM-Indel multiplex.

    Science.gov (United States)

    Romanini, Carola; Romero, Magdalena; Salado Puerto, Mercedes; Catelli, Laura; Phillips, Christopher; Pereira, Rui; Gusmão, Leonor; Vullo, Carlos

    2015-05-01

    Ancestry informative markers (AIMs) can be useful to infer ancestry proportions of the donors of forensic evidence. The probability of successfully typing degraded samples, such as human skeletal remains, is strongly influenced by the DNA fragment lengths that can be amplified and the presence of PCR inhibitors. Several AIM panels are available amongst the many forensic marker sets developed for genotyping degraded DNA. Using a 46 AIM Insertion Deletion (Indel) multiplex, we analyzed human skeletal remains with post mortem times ranging from 35 to 60 years from four different continents (Sub-Saharan Africa, South and Central America, East Asia and Europe) to ascertain the genetic ancestry components. Samples belonging to non-admixed individuals could be assigned to their corresponding continental group. For the remaining samples with admixed ancestry, it was possible to estimate the proportion of co-ancestry components from the four reference population groups. The 46 AIM Indel set was informative enough to efficiently estimate the proportion of ancestry even in samples yielding partial profiles, a frequent occurrence when analyzing inhibited and/or degraded DNA extracts. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  20. A general technique for confluence of calculational and experimental information with application to power distribution determination

    International Nuclear Information System (INIS)

    Serov, I.V.; Hoogenboom, J.E.

    1994-01-01

    Physical quantities can be obtained by utilizing different informational sources. The available information is usually associated with systematic and statistical errors. If the informational sources are utilized simultaneously, then it is possible to obtain posterior estimates of the quantities with better statistical properties than exhibited by any prior estimates. A general technique for the confluence of any number of possibly dependent informational sources can be developed. Insight into the nature of the informational source allows different types of data associated with the source to be improved. The formulas of the technique are presented and applied to the power distribution determination for the research reactor HOR of the Delft University of Technology, employing calculational and experimental data. (authors). 5 refs., 1 tab., 5 figs
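
    For the simplest case of two independent unbiased estimates of the same quantity, say a calculated and a measured relative power at one position, the confluence reduces to the inverse-variance weighted mean, whose variance is never larger than that of either input. A minimal sketch with assumed uncertainties (the full technique also treats correlated sources and systematic errors):

    import numpy as np

    def combine(estimates, variances):
        """Inverse-variance weighted combination of independent unbiased estimates."""
        estimates = np.asarray(estimates, dtype=float)
        weights = 1.0 / np.asarray(variances, dtype=float)
        posterior_var = 1.0 / weights.sum()
        posterior_mean = posterior_var * (weights * estimates).sum()
        return posterior_mean, posterior_var

    # Toy usage: calculated vs. measured relative power at one fuel position
    calc, calc_var = 1.05, 0.04**2     # calculation with an assumed 4% uncertainty
    meas, meas_var = 1.10, 0.03**2     # measurement with an assumed 3% uncertainty
    mean, var = combine([calc, meas], [calc_var, meas_var])
    print(f"posterior estimate: {mean:.4f} +/- {np.sqrt(var):.4f}")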

  1. Side Information and Noise Learning for Distributed Video Coding using Optical Flow and Clustering

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Rakêt, Lars Lau; Huang, Xin

    2012-01-01

    Distributed video coding (DVC) is a coding paradigm which exploits the source statistics at the decoder side to reduce the complexity at the encoder. The coding efficiency of DVC critically depends on the quality of side information generation and the accuracy of noise modeling. This paper considers...... Transform Domain Wyner-Ziv (TDWZ) coding and proposes using optical flow to improve side information generation and clustering to improve noise modeling. The optical flow technique is exploited at the decoder side to compensate for weaknesses of block-based methods, when using motion compensation to generate...... side information frames. Clustering is introduced to capture cross-band correlation and increase local adaptivity in the noise modeling. This paper also proposes techniques to learn from previously decoded (WZ) frames. Different techniques are combined by calculating a number of candidate soft side

  2. Distribution of quantum information between an atom and two photons

    International Nuclear Information System (INIS)

    Weber, Bernhard

    2008-01-01

    The construction of networks consisting of optically interconnected processing units is a promising way to scale up quantum information processing systems. To store quantum information, single trapped atoms are among the most proven candidates. By placing them in high finesse optical resonators, a bidirectional information exchange between the atoms and photons becomes possible with, in principle, unit efficiency. Such an interface between stationary and flying qubits constitutes a possible node of a future quantum network. The results presented in this thesis demonstrate the prospects of a quantum interface consisting of a single atom trapped within the mode of a high-finesse optical cavity. In a two-step process, we distribute entanglement between the stored atom and two subsequently emitted single photons. The long atom trapping times achieved in the system together with the high photon collection efficiency of the cavity make the applied protocol in principle deterministic, allowing for the creation of an entangled state at the push of a button. Running the protocol on this quasi-stationary quantum interface, the internal state of the atom is entangled with the polarization state of a single emitted photon. The entanglement is generated by driving a vacuum-stimulated Raman adiabatic passage between states of the coupled atom-cavity system. In a second process, the atomic part of the entangled state is mapped onto a second emitted photon using a similar technique, resulting in a polarization-entangled two-photon state. To verify and characterize the photon-photon entanglement, we measured a violation of a Bell inequality and performed a full quantum state tomography. The results prove the prior atom-photon entanglement and demonstrate a quantum information transfer between the atom and the two emitted photons. This reflects the advantages of a high-finesse cavity as a quantum interface in future quantum networks. (orig.)

  3. Distribution of quantum information between an atom and two photons

    Energy Technology Data Exchange (ETDEWEB)

    Weber, Bernhard

    2008-11-03

    The construction of networks consisting of optically interconnected processing units is a promising way to scale up quantum information processing systems. To store quantum information, single trapped atoms are among the most proven candidates. By placing them in high finesse optical resonators, a bidirectional information exchange between the atoms and photons becomes possible with, in principle, unit efficiency. Such an interface between stationary and flying qubits constitutes a possible node of a future quantum network. The results presented in this thesis demonstrate the prospects of a quantum interface consisting of a single atom trapped within the mode of a high-finesse optical cavity. In a two-step process, we distribute entanglement between the stored atom and two subsequently emitted single photons. The long atom trapping times achieved in the system together with the high photon collection efficiency of the cavity make the applied protocol in principle deterministic, allowing for the creation of an entangled state at the push of a button. Running the protocol on this quasi-stationary quantum interface, the internal state of the atom is entangled with the polarization state of a single emitted photon. The entanglement is generated by driving a vacuum-stimulated Raman adiabatic passage between states of the coupled atom-cavity system. In a second process, the atomic part of the entangled state is mapped onto a second emitted photon using a similar technique, resulting in a polarization-entangled two-photon state. To verify and characterize the photon-photon entanglement, we measured a violation of a Bell inequality and performed a full quantum state tomography. The results prove the prior atom-photon entanglement and demonstrate a quantum information transfer between the atom and the two emitted photons. This reflects the advantages of a high-finesse cavity as a quantum interface in future quantum networks. (orig.)

  4. Combination of panoramic and fluorescence endoscopic images to obtain tumor spatial distribution information useful for bladder cancer detection

    Science.gov (United States)

    Olijnyk, S.; Hernández Mier, Y.; Blondel, W. C. P. M.; Daul, C.; Wolf, D.; Bourg-Heckly, G.

    2007-07-01

    Bladder cancer is widespread. Moreover, carcinoma in situ can be difficult to diagnose, as it may be hard to see, and it becomes invasive in 50% of cases. Non-invasive diagnostic methods such as photodynamic or autofluorescence endoscopy can enhance sensitivity and specificity. Besides, bladder tumors can be multifocal. Multifocality increases the probability of recurrence and infiltration into the bladder muscle. Analysis of the spatial distribution of tumors could be used to improve diagnosis. We explore the feasibility of combining fluorescence and spatial information on phantoms. We developed a system allowing the acquisition of consecutive images under white light or UV excitation, alternately and automatically, along the video sequence. We also developed an automatic image processing algorithm to build a partial panoramic image from a cystoscopic sequence of images. Fluorescence information is extracted from wavelength bandpass filtered images and superimposed over the cartography. Then, spatial distribution measures of fluorescent spots can be computed. This cartography can be positioned on a 3D generic shape of the bladder by selecting some reference points. Our first results on phantoms show that it is possible to obtain a cartography with fluorescent spots and extract quantitative information on their spatial distribution on a "wide" field-of-view basis.

  5. The role of digital sample information within the digital geoscience infrastructure: a pragmatic approach

    Science.gov (United States)

    Howe, Michael

    2014-05-01

    Much of the digital geological information on the composition, properties and dynamics of the subsurface is based ultimately on physical samples, many of which are archived to provide a basis for the information. Online metadata catalogues of these collections have now been available for many years. Many of these are institutional and tightly focussed, with UK examples including the British Geological Survey's (BGS) palaeontological samples database, PalaeoSaurus (http://www.bgs.ac.uk/palaeosaurus/), and mineralogical and petrological sample database, Britrocks (http://www.bgs.ac.uk/data/britrocks.html) . There are now a growing number of international sample metadata databases, including The Palaeobiology Database (http://paleobiodb.org/) and SESAR, the IGSN (International Geo Sample Number) database (http://www.geosamples.org/catalogsearch/ ). More recently the emphasis has moved beyond metadata (locality, identification, age, citations, etc) to digital imagery, with the intention of providing the user with at least enough information to determine whether viewing the sample would be worthwhile. Recent BGS examples include high resolution (e.g. 7216 x 5412 pixel) hydrocarbon well core images (http://www.bgs.ac.uk/data/offshoreWells/wells.cfc?method=searchWells) , high resolution rock thin section images (e.g. http://www.largeimages.bgs.ac.uk/iip/britrocks.html?id=290000/291739 ) and building stone images (http://geoscenic.bgs.ac.uk/asset-bank/action/browseItems?categoryId=1547&categoryTypeId=1) . This has been developed further with high resolution stereo images. The Jisc funded GB3D type fossils online project delivers these as red-cyan anaglyphs (http://www.3d-fossils.ac.uk/). More innovatively, the GB3D type fossils project has laser scanned several thousand type fossils and the resulting 3d-digital models are now being delivered through the online portal. Importantly, this project also represents collaboration between the BGS, Oxford and Cambridge Universities

  6. Information 2.0 new models of information production, distribution and consumption

    CERN Document Server

    Saulles, Martin De

    2015-01-01

    This textbook provides an overview of the digital information landscape and explains the implications of the technological changes for the information industry, from publishers and broadcasters to the information professionals who manage information in all its forms.

  7. Information-theoretic security proof for quantum-key-distribution protocols

    International Nuclear Information System (INIS)

    Renner, Renato; Gisin, Nicolas; Kraus, Barbara

    2005-01-01

    We present a technique for proving the security of quantum-key-distribution (QKD) protocols. It is based on direct information-theoretic arguments and thus also applies if no equivalent entanglement purification scheme can be found. Using this technique, we investigate a general class of QKD protocols with one-way classical post-processing. We show that, in order to analyze the full security of these protocols, it suffices to consider collective attacks. Indeed, we give new lower and upper bounds on the secret-key rate which only involve entropies of two-qubit density operators and which are thus easy to compute. As an illustration of our results, we analyze the Bennett-Brassard 1984, the six-state, and the Bennett 1992 protocols with one-way error correction and privacy amplification. Surprisingly, the performance of these protocols is increased if one of the parties adds noise to the measurement data before the error correction. In particular, this additional noise makes the protocols more robust against noise in the quantum channel
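
    For orientation, the kind of bound such results yield can be written down for BB84 with one-way post-processing: with quantum bit error rate Q, the asymptotic secret-key rate satisfies r >= 1 - 2h(Q), where h is the binary entropy, giving the familiar threshold near 11%. The snippet evaluates that standard textbook expression; it is not the refined noisy-preprocessing bound derived in the paper.

    import numpy as np

    def binary_entropy(q):
        q = np.clip(q, 1e-12, 1 - 1e-12)
        return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

    def bb84_one_way_rate(q):
        """Standard lower bound on the BB84 secret-key rate with one-way
        error correction and privacy amplification: r >= 1 - 2 h(Q)."""
        return 1.0 - 2.0 * binary_entropy(q)

    for q in (0.01, 0.05, 0.11, 0.12):
        print(f"QBER = {q:.2f}: key rate bound = {bb84_one_way_rate(q):+.3f}")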

  8. Information-theoretic security proof for quantum-key-distribution protocols

    Science.gov (United States)

    Renner, Renato; Gisin, Nicolas; Kraus, Barbara

    2005-07-01

    We present a technique for proving the security of quantum-key-distribution (QKD) protocols. It is based on direct information-theoretic arguments and thus also applies if no equivalent entanglement purification scheme can be found. Using this technique, we investigate a general class of QKD protocols with one-way classical post-processing. We show that, in order to analyze the full security of these protocols, it suffices to consider collective attacks. Indeed, we give new lower and upper bounds on the secret-key rate which only involve entropies of two-qubit density operators and which are thus easy to compute. As an illustration of our results, we analyze the Bennett-Brassard 1984, the six-state, and the Bennett 1992 protocols with one-way error correction and privacy amplification. Surprisingly, the performance of these protocols is increased if one of the parties adds noise to the measurement data before the error correction. In particular, this additional noise makes the protocols more robust against noise in the quantum channel.

  9. Fast Bayesian experimental design: Laplace-based importance sampling for the expected information gain

    KAUST Repository

    Beck, Joakim

    2018-02-19

    In calculating expected information gain in optimal Bayesian experimental design, the computation of the inner loop in the classical double-loop Monte Carlo requires a large number of samples and suffers from underflow if the number of samples is small. These drawbacks can be avoided by using an importance sampling approach. We present a computationally efficient method for optimal Bayesian experimental design that introduces importance sampling based on the Laplace method to the inner loop. We derive the optimal values for the method parameters in which the average computational cost is minimized for a specified error tolerance. We use three numerical examples to demonstrate the computational efficiency of our method compared with the classical double-loop Monte Carlo, and a single-loop Monte Carlo method that uses the Laplace approximation of the return value of the inner loop. The first demonstration example is a scalar problem that is linear in the uncertain parameter. The second example is a nonlinear scalar problem. The third example deals with the optimal sensor placement for an electrical impedance tomography experiment to recover the fiber orientation in laminate composites.
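
    The quantity at stake, the expected information gain, is EIG = E_y[log p(y|theta) - log p(y)] with p(y) = E_theta[p(y|theta)]; the classical double-loop Monte Carlo estimates the inner expectation p(y) with a fresh batch of prior samples for every outer draw, which is exactly the step the Laplace-based importance sampling accelerates. A small double-loop estimator for an assumed linear-Gaussian toy model:

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(6)

    def eig_double_loop(design, n_outer=1000, n_inner=1000,
                        prior_mu=0.0, prior_sd=1.0, noise_sd=0.5):
        """Double-loop Monte Carlo estimate of the expected information gain for
        the toy model y = design * theta + noise, with theta ~ N(prior_mu, prior_sd^2)."""
        theta = rng.normal(prior_mu, prior_sd, size=n_outer)
        y = design * theta + rng.normal(0.0, noise_sd, size=n_outer)
        log_like = norm.logpdf(y, loc=design * theta, scale=noise_sd)
        # Inner loop: the evidence p(y) is estimated with fresh prior samples per outer draw.
        theta_inner = rng.normal(prior_mu, prior_sd, size=(n_outer, n_inner))
        like_inner = norm.pdf(y[:, None], loc=design * theta_inner, scale=noise_sd)
        log_evidence = np.log(like_inner.mean(axis=1))
        return float(np.mean(log_like - log_evidence))

    # Larger designs are more informative here: EIG = 0.5 * ln(1 + (design*prior_sd/noise_sd)^2)
    for d in (0.5, 1.0, 2.0):
        print(f"design = {d}: estimated EIG = {eig_double_loop(d):.3f}")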

  10. Fast Bayesian experimental design: Laplace-based importance sampling for the expected information gain

    Science.gov (United States)

    Beck, Joakim; Dia, Ben Mansour; Espath, Luis F. R.; Long, Quan; Tempone, Raúl

    2018-06-01

    In calculating expected information gain in optimal Bayesian experimental design, the computation of the inner loop in the classical double-loop Monte Carlo requires a large number of samples and suffers from underflow if the number of samples is small. These drawbacks can be avoided by using an importance sampling approach. We present a computationally efficient method for optimal Bayesian experimental design that introduces importance sampling based on the Laplace method to the inner loop. We derive the optimal values for the method parameters in which the average computational cost is minimized according to the desired error tolerance. We use three numerical examples to demonstrate the computational efficiency of our method compared with the classical double-loop Monte Carlo, and a more recent single-loop Monte Carlo method that uses the Laplace method as an approximation of the return value of the inner loop. The first example is a scalar problem that is linear in the uncertain parameter. The second example is a nonlinear scalar problem. The third example deals with the optimal sensor placement for an electrical impedance tomography experiment to recover the fiber orientation in laminate composites.

  11. An Ethnomathematics Exercise for Analyzing a Khipu Sample from Pachacamac (Perú)

    Directory of Open Access Journals (Sweden)

    Alberto Saez-Rodríguez

    2012-02-01

    Full Text Available A khipu sample studied by Gary Urton embodies an unusual division into quarters. Urton's research findings allow us to visualize the information in the pairing quadrants, which are determined by the distribution of S- and Z-knots, to provide overall information that is helpful for identifying the celestial coordinates of the brightest stars in the Pleiades cluster. In the present study, the linear regression attempts to model the relationship between two variables (which are determined by the distribution of the S- and Z-knots). The scatter plot illustrates the results of our simple linear regression, suggesting a map of the Pleiades represented by seven points on the Cartesian coordinate plane.
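
    The fitting step itself is ordinary simple linear regression of one knot-derived variable on the other. A worked toy example of the mechanics, with made-up knot counts rather than the khipu data:

    import numpy as np

    # Hypothetical knot counts per quadrant pair (x: S-knot derived, y: Z-knot derived)
    x = np.array([3, 5, 7, 9, 11, 13, 15])
    y = np.array([4, 6, 9, 10, 13, 14, 17])

    slope, intercept = np.polyfit(x, y, deg=1)
    y_hat = slope * x + intercept
    r_squared = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
    print(f"y = {slope:.2f} x + {intercept:.2f},  R^2 = {r_squared:.3f}")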

  12. Care episode retrieval: distributional semantic models for information retrieval in the clinical domain.

    Science.gov (United States)

    Moen, Hans; Ginter, Filip; Marsi, Erwin; Peltonen, Laura-Maria; Salakoski, Tapio; Salanterä, Sanna

    2015-01-01

    Patients' health related information is stored in electronic health records (EHRs) by health service providers. These records include sequential documentation of care episodes in the form of clinical notes. EHRs are used throughout the health care sector by professionals, administrators and patients, primarily for clinical purposes, but also for secondary purposes such as decision support and research. The vast amounts of information in EHR systems complicate information management and increase the risk of information overload. Therefore, clinicians and researchers need new tools to manage the information stored in the EHRs. A common use case is, given a (possibly unfinished) care episode, to retrieve the most similar care episodes among the records. This paper presents several methods for information retrieval, focusing on care episode retrieval, based on textual similarity, where similarity is measured through domain-specific modelling of the distributional semantics of words. Models include variants of random indexing and the semantic neural network model word2vec. Two novel methods are introduced that utilize the ICD-10 codes attached to care episodes to better induce domain-specificity in the semantic model. We report on experimental evaluation of care episode retrieval that circumvents the lack of human judgements regarding episode relevance. Results suggest that several of the methods proposed outperform a state-of-the-art search engine (Lucene) on the retrieval task.
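
    The retrieval principle, representing each care episode by a vector built from the distributional semantics of its words and ranking stored episodes by cosine similarity to the query, fits in a few lines. The sketch below uses randomly initialized word vectors purely as a stand-in for trained word2vec or random-indexing vectors; the episode texts are invented.

    import numpy as np

    rng = np.random.default_rng(7)
    DIM = 100
    vocab_vectors = {}   # stand-in for trained word2vec / random-indexing vectors

    def word_vector(token):
        if token not in vocab_vectors:
            vocab_vectors[token] = rng.normal(size=DIM)
        return vocab_vectors[token]

    def episode_vector(notes):
        """Average the word vectors of all tokens in an episode's clinical notes."""
        tokens = " ".join(notes).lower().split()
        return np.mean([word_vector(t) for t in tokens], axis=0)

    def retrieve(query_notes, archive, top_k=3):
        """Rank archived episodes by cosine similarity to the query episode."""
        q = episode_vector(query_notes)
        scores = []
        for episode_id, notes in archive.items():
            v = episode_vector(notes)
            cos = q @ v / (np.linalg.norm(q) * np.linalg.norm(v))
            scores.append((cos, episode_id))
        return sorted(scores, reverse=True)[:top_k]

    archive = {
        "ep1": ["patient reports chest pain and shortness of breath"],
        "ep2": ["routine wound dressing change no complications"],
        "ep3": ["chest pain relieved after nitroglycerin administration"],
    }
    print(retrieve(["acute chest pain on admission"], archive))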

  13. Global Distribution of Human-Associated Fecal Genetic Markers in Reference Samples from Six Continents.

    Science.gov (United States)

    Mayer, René E; Reischer, Georg H; Ixenmaier, Simone K; Derx, Julia; Blaschke, Alfred Paul; Ebdon, James E; Linke, Rita; Egle, Lukas; Ahmed, Warish; Blanch, Anicet R; Byamukama, Denis; Savill, Marion; Mushi, Douglas; Cristóbal, Héctor A; Edge, Thomas A; Schade, Margit A; Aslan, Asli; Brooks, Yolanda M; Sommer, Regina; Masago, Yoshifumi; Sato, Maria I; Taylor, Huw D; Rose, Joan B; Wuertz, Stefan; Shanks, Orin C; Piringer, Harald; Mach, Robert L; Savio, Domenico; Zessner, Matthias; Farnleitner, Andreas H

    2018-05-01

    Numerous bacterial genetic markers are available for the molecular detection of human sources of fecal pollution in environmental waters. However, widespread application is hindered by a lack of knowledge regarding geographical stability, limiting implementation to a small number of well-characterized regions. This study investigates the geographic distribution of five human-associated genetic markers (HF183/BFDrev, HF183/BacR287, BacHum-UCD, BacH, and Lachno2) in municipal wastewaters (raw and treated) from 29 urban and rural wastewater treatment plants (750-4 400 000 population equivalents) from 13 countries spanning six continents. In addition, genetic markers were tested against 280 human and nonhuman fecal samples from domesticated, agricultural and wild animal sources. Findings revealed that all genetic markers are present in consistently high concentrations in raw (median log10 7.2-8.0 marker equivalents (ME) per 100 mL) and biologically treated wastewater samples (median log10 4.6-6.0 ME per 100 mL) regardless of location and population. The false positive rates of the various markers in nonhuman fecal samples ranged from 5% to 47%. Results suggest that several genetic markers have considerable potential for measuring human-associated contamination in polluted environmental waters. This will be helpful in water quality monitoring, pollution modeling and health risk assessment (as demonstrated by QMRAcatch) to guide target-oriented water safety management across the globe.

  14. CO2 Data Distribution and Support from the Goddard Earth Science Data and Information Services Center (GES-DISC)

    Science.gov (United States)

    Hearty, Thomas; Savtchenko, Andrey; Vollmer, Bruce; Albayrak, Arif; Theobald, Mike; Esfandiari, Ed; Wei, Jennifer

    2015-01-01

    This talk will describe the support and distribution of CO2 data products from OCO-2, AIRS, and ACOS that are archived and distributed by the Goddard Earth Sciences Data and Information Services Center. We will provide a brief summary of the current online archive and distribution metrics for the OCO-2 Level 1 products and plans for the Level 2 products. We will also describe collaborative data sets and services (e.g., matchups with other sensors) and solicit feedback for potential future services.

  15. Estimates of microbial quality and concentration of copper in distributed drinking water are highly dependent on sampling strategy.

    Science.gov (United States)

    Lehtola, Markku J; Miettinen, Ilkka T; Hirvonen, Arja; Vartiainen, Terttu; Martikainen, Pertti J

    2007-12-01

    The numbers of bacteria generally increase in distributed water. Often household pipelines or water fittings (e.g., taps) represent the most critical location for microbial growth in water distribution systems. According to the European Union drinking water directive, there should not be abnormal changes in the colony counts in water. We used a pilot distribution system to study the effects of water stagnation on drinking water microbial quality, the concentration of copper and the formation of biofilms with two pipeline materials commonly used in households: copper and plastic (polyethylene, PE). Water stagnation for more than 4 h significantly increased both the copper concentration and the number of bacteria in water. Heterotrophic plate counts were six times higher in PE pipes and ten times higher in copper pipes after 16 h of stagnation than after only 40 min of stagnation. The increase in the heterotrophic plate counts was linear with time in both copper and plastic pipelines. In the distribution system, bacteria originated mainly from biofilms, because in laboratory tests with water there was only minor growth of bacteria after 16 h of stagnation. Our study indicates that water stagnation in the distribution system clearly affects microbial numbers and the concentration of copper in water, and should be considered when planning the sampling strategy for drinking water quality control in distribution systems.

  16. Learning maximum entropy models from finite-size data sets: A fast data-driven algorithm allows sampling from the posterior distribution.

    Science.gov (United States)

    Ferrari, Ulisse

    2016-08-01

    Maximum entropy models provide the least constrained probability distributions that reproduce statistical properties of experimental datasets. In this work we characterize the learning dynamics that maximizes the log-likelihood in the case of large but finite datasets. We first show how the steepest descent dynamics is not optimal as it is slowed down by the inhomogeneous curvature of the model parameters' space. We then provide a way for rectifying this space which relies only on dataset properties and does not require large computational efforts. We conclude by solving the long-time limit of the parameters' dynamics including the randomness generated by the systematic use of Gibbs sampling. In this stochastic framework, rather than converging to a fixed point, the dynamics reaches a stationary distribution, which for the rectified dynamics reproduces the posterior distribution of the parameters. We sum up all these insights in a "rectified" data-driven algorithm that is fast and, by sampling from the parameters' posterior, avoids both under- and overfitting along all directions of the parameters' space. Through the learning of pairwise Ising models from recordings of a large population of retinal neurons, we show how our algorithm outperforms the steepest descent method.
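
    As background for the algorithm discussed above, the sketch below shows the plain steepest-descent baseline it improves on: gradient ascent on the log-likelihood of a pairwise Ising model, with the model expectations estimated by Gibbs sampling. The rectification of the parameter space is not implemented here, and the data are random stand-ins for binarized spike trains.

```python
# A minimal sketch of the steepest-descent baseline for learning a pairwise
# Ising maximum entropy model; model moments are estimated by Gibbs sampling.
import numpy as np

rng = np.random.default_rng(1)
N = 5                                         # number of toy "neurons"
data = rng.choice([-1, 1], size=(200, N))     # stand-in for binarized spike data

def gibbs_sample(h, J, n_sweeps=50):
    """One sample from P(s) proportional to exp(h.s + s.J.s / 2), diag(J) = 0."""
    s = rng.choice([-1, 1], size=N)
    for _ in range(n_sweeps):
        for i in range(N):
            field = h[i] + J[i] @ s           # local field (diagonal of J is zero)
            p_up = 1.0 / (1.0 + np.exp(-2.0 * field))
            s[i] = 1 if rng.random() < p_up else -1
    return s

def moments(samples):
    return samples.mean(axis=0), samples.T @ samples / len(samples)

m_data, C_data = moments(data)
h, J, lr = np.zeros(N), np.zeros((N, N)), 0.1
for step in range(30):                        # plain (unrectified) gradient ascent
    model = np.array([gibbs_sample(h, J) for _ in range(50)])
    m_model, C_model = moments(model)
    h += lr * (m_data - m_model)              # dL/dh : <s>_data  minus <s>_model
    J += lr * (C_data - C_model)              # dL/dJ : <ss'>_data minus <ss'>_model
    np.fill_diagonal(J, 0.0)                  # keep the diagonal fixed at zero

print(h, J)
```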

  17. Fission-track studies of uranium distribution in geological samples

    International Nuclear Information System (INIS)

    Brynard, H.J.

    1983-01-01

    The standard method of studying uranium distribution in geological material by registration of fission tracks from the thermal neutron-induced fission of ²³⁵U has been adapted for use in the SAFARI-1 reactor at Pelindaba. The theory of fission-track registration as well as practical problems are discussed. The method has been applied to study uranium distribution in a variety of rock types, and the results are discussed in this paper. The method is very sensitive, and uranium present in quantities far below the detection limit of the microprobe has been detected.

  18. The beta Burr type X distribution properties with application.

    Science.gov (United States)

    Merovci, Faton; Khaleel, Mundher Abdullah; Ibrahim, Noor Akma; Shitan, Mahendran

    2016-01-01

    We develop a new continuous distribution called the beta Burr type X distribution that extends the Burr type X distribution. We provide a comprehensive mathematical treatment of this distribution. Furthermore, various structural properties of the new distribution are derived, including the moment generating function and the rth moment, thus generalizing some results in the literature. We also obtain expressions for the density, moment generating function and rth moment of the order statistics. We use maximum likelihood estimation to estimate the parameters. Additionally, the asymptotic confidence intervals for the parameters are derived from the Fisher information matrix. Finally, a simulation study is carried out under varying sample sizes to assess the performance of this model. An illustration with a real dataset indicates that this new distribution can serve as a good alternative for modelling positive real data in many areas.
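
    A sketch of the maximum likelihood step mentioned above is given below, assuming the common Burr type X baseline CDF G(x) = (1 - exp(-(lam*x)^2))^theta and the standard beta-generated construction f(x) = g(x) G(x)^(a-1) (1 - G(x))^(b-1) / B(a, b); the data and starting values are placeholders, and the exact parameterization used in the paper may differ.

```python
# A minimal sketch (an assumption, not the paper's code) of maximum likelihood
# estimation for a beta-generated distribution with a Burr type X baseline.
import numpy as np
from scipy.optimize import minimize
from scipy.special import betaln

def logpdf(x, a, b, theta, lam):
    z = 1.0 - np.exp(-(lam * x) ** 2)                  # inner term of the Burr X CDF
    logG = theta * np.log(z)                           # log G(x)
    logg = (np.log(2.0 * theta * lam ** 2 * x) - (lam * x) ** 2
            + (theta - 1.0) * np.log(z))               # log of the Burr X density g(x)
    return logg + (a - 1.0) * logG + (b - 1.0) * np.log1p(-np.exp(logG)) - betaln(a, b)

def negloglik(params, x):
    if np.any(np.asarray(params) <= 0.0):              # all four parameters are positive
        return np.inf
    val = -np.sum(logpdf(x, *params))
    return val if np.isfinite(val) else np.inf

x = np.random.default_rng(2).gamma(shape=2.0, scale=0.25, size=200)   # placeholder data
res = minimize(negloglik, x0=[1.0, 1.0, 1.0, 1.0], args=(x,), method="Nelder-Mead")
print("MLE of (a, b, theta, lam):", res.x)
```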

  19. Probabilistic Decision Making with Spikes: From ISI Distributions to Behaviour via Information Gain.

    Directory of Open Access Journals (Sweden)

    Javier A Caballero

    Full Text Available Computational theories of decision making in the brain usually assume that sensory 'evidence' is accumulated supporting a number of hypotheses, and that the first accumulator to reach threshold triggers a decision in favour of its associated hypothesis. However, the evidence is often assumed to occur as a continuous process whose origins are somewhat abstract, with no direct link to the neural signals - action potentials or 'spikes' - that must ultimately form the substrate for decision making in the brain. Here we introduce a new variant of the well-known multi-hypothesis sequential probability ratio test (MSPRT) for decision making whose evidence observations consist of the basic unit of neural signalling - the inter-spike interval (ISI) - and which is based on a new form of the likelihood function. We dub this mechanism s-MSPRT and show its precise form for a range of realistic ISI distributions with positive support. In this way we show that, at the level of spikes, the refractory period may actually facilitate shorter decision times, and that the mechanism is robust against poor choice of the hypothesized data distribution. We show that s-MSPRT performance is related to the Kullback-Leibler divergence (KLD) or information gain between ISI distributions, through which we are able to link neural signalling to psychophysical observation at the behavioural level. Thus, we find the mean information needed for a decision is constant, thereby offering an account of Hick's law (relating decision time to the number of choices). Further, the mean decision time of s-MSPRT shows a power law dependence on the KLD offering an account of Piéron's law (relating reaction time to stimulus intensity). These results show the foundations for a research programme in which spike train analysis can be made the basis for predictions about behavior in multi-alternative choice tasks.

  20. Probabilistic Decision Making with Spikes: From ISI Distributions to Behaviour via Information Gain.

    Science.gov (United States)

    Caballero, Javier A; Lepora, Nathan F; Gurney, Kevin N

    2015-01-01

    Computational theories of decision making in the brain usually assume that sensory 'evidence' is accumulated supporting a number of hypotheses, and that the first accumulator to reach threshold triggers a decision in favour of its associated hypothesis. However, the evidence is often assumed to occur as a continuous process whose origins are somewhat abstract, with no direct link to the neural signals - action potentials or 'spikes' - that must ultimately form the substrate for decision making in the brain. Here we introduce a new variant of the well-known multi-hypothesis sequential probability ratio test (MSPRT) for decision making whose evidence observations consist of the basic unit of neural signalling - the inter-spike interval (ISI) - and which is based on a new form of the likelihood function. We dub this mechanism s-MSPRT and show its precise form for a range of realistic ISI distributions with positive support. In this way we show that, at the level of spikes, the refractory period may actually facilitate shorter decision times, and that the mechanism is robust against poor choice of the hypothesized data distribution. We show that s-MSPRT performance is related to the Kullback-Leibler divergence (KLD) or information gain between ISI distributions, through which we are able to link neural signalling to psychophysical observation at the behavioural level. Thus, we find the mean information needed for a decision is constant, thereby offering an account of Hick's law (relating decision time to the number of choices). Further, the mean decision time of s-MSPRT shows a power law dependence on the KLD offering an account of Piéron's law (relating reaction time to stimulus intensity). These results show the foundations for a research programme in which spike train analysis can be made the basis for predictions about behavior in multi-alternative choice tasks.
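
    To make the decision rule concrete, the sketch below implements a generic MSPRT-style accumulation over ISIs: each hypothesis's log-likelihood is updated after every observed interval and a choice is made once one posterior crosses a threshold. The gamma ISI distributions, rates and threshold are illustrative assumptions; the paper's s-MSPRT uses its own likelihood form.

```python
# A minimal sketch of an MSPRT-style decision rule on inter-spike intervals (ISIs).
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(3)

rates = np.array([20.0, 40.0, 80.0])        # candidate mean firing rates (Hz)
k = 2.0                                     # gamma shape: fairly regular spiking
true_rate = 40.0                            # rate actually generating the ISIs
threshold = 0.99                            # decide when one posterior exceeds this

log_like = np.zeros(len(rates))             # accumulated log-likelihood per hypothesis
n_isi = 0
while True:
    isi = rng.gamma(k, 1.0 / (k * true_rate))                  # observe one ISI (mean 1/rate)
    log_like += gamma.logpdf(isi, k, scale=1.0 / (k * rates))  # update every hypothesis
    n_isi += 1
    posterior = np.exp(log_like - log_like.max())
    posterior /= posterior.sum()            # normalized posterior under a uniform prior
    if posterior.max() >= threshold:
        break

print(f"chose rate {rates[posterior.argmax()]:.0f} Hz after {n_isi} ISIs")
```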