WorldWideScience

Sample records for sample processing prior

  1. Generalized species sampling priors with latent Beta reinforcements

    Science.gov (United States)

    Airoldi, Edoardo M.; Costa, Thiago; Bassetti, Federico; Leisen, Fabrizio; Guindani, Michele

    2014-01-01

    Many popular Bayesian nonparametric priors can be characterized in terms of exchangeable species sampling sequences. However, in some applications, exchangeability may not be appropriate. We introduce a novel and probabilistically coherent family of non-exchangeable species sampling sequences characterized by a tractable predictive probability function with weights driven by a sequence of independent Beta random variables. We compare their theoretical clustering properties with those of the Dirichlet process and the two-parameter Poisson-Dirichlet process. Unlike existing work, the proposed construction provides a complete characterization of the joint process. We then propose the use of such a process as a prior distribution in a hierarchical Bayes modeling framework, and we describe a Markov chain Monte Carlo sampler for posterior inference. We evaluate the performance of the prior and the robustness of the resulting inference in a simulation study, providing a comparison with popular Dirichlet process mixtures and hidden Markov models. Finally, we develop an application to the detection of chromosomal aberrations in breast cancer by leveraging array CGH data. PMID:25870462
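
    The paper's exact predictive rule is not reproduced in this record, so the following is only a minimal sketch of the general idea: a species sampling sequence whose probability of opening a new species at each step is driven by an independent Beta draw, which breaks exchangeability. The weight scheme and all names here are illustrative assumptions, not the authors' construction.

        import numpy as np

        rng = np.random.default_rng(0)

        def species_sampling_sequence(n, a=1.0, b=4.0):
            """Illustrative non-exchangeable species sampling sequence: a latent
            w ~ Beta(a, b) at each step gives the probability of creating a new
            species; otherwise an existing species is drawn with probability
            proportional to its current frequency."""
            labels, counts = [0], {0: 1}
            for _ in range(1, n):
                w = rng.beta(a, b)                    # independent Beta reinforcement
                if rng.random() < w:
                    new = max(counts) + 1             # open a new species
                    counts[new] = 1
                    labels.append(new)
                else:
                    species = list(counts)
                    freq = np.array([counts[s] for s in species], dtype=float)
                    s = rng.choice(species, p=freq / freq.sum())
                    counts[s] += 1
                    labels.append(s)
            return labels

        print(species_sampling_sequence(20))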

  2. Reference Priors For Non-Normal Two-Sample Problems

    NARCIS (Netherlands)

    Fernández, C.; Steel, M.F.J.

    1997-01-01

    The reference prior algorithm (Berger and Bernardo, 1992) is applied to location-scale models with any regular sampling density. A number of two-sample problems are analyzed in this general context, extending the difference, ratio and product of Normal means problems outside Normality, while explicitly

  3. Penalised Complexity Priors for Stationary Autoregressive Processes

    KAUST Repository

    Sørbye, Sigrunn Holbek; Rue, Haavard

    2017-01-01

    The autoregressive (AR) process of order p (AR(p)) is a central model in time series analysis. A Bayesian approach requires the user to define a prior distribution for the coefficients of the AR(p) model. Although it is easy to write down some prior, it is not at all obvious how to understand and interpret the prior distribution and to ensure that it behaves according to the user's prior knowledge. In this article, we approach this problem using the recently developed ideas of penalised complexity (PC) priors. These priors have important properties such as robustness and invariance to reparameterisation, as well as a clear interpretation. A PC prior is computed based on specific principles, where model component complexity is penalised in terms of deviation from simple base model formulations. In the AR(1) case, we discuss two natural base model choices, corresponding to either independence in time or no change in time. The latter case is illustrated in a survival model with possible time-dependent frailty. For higher-order processes, we propose a sequential approach, in which the base model for AR(p) is the corresponding AR(p-1) model expressed using the partial autocorrelations. The properties of the new prior distribution are compared with the reference prior in a simulation study.
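
    The PC-prior recipe itself is compact enough to sketch: measure the deviation of a flexible model from its base by d(theta) = sqrt(2 KLD(flexible || base)), place an exponential prior on d, and transform back to theta by a change of variables. A hedged sketch with a user-supplied KLD follows; the toy KLD shown is the exact KLD for a unit-variance Gaussian mean shift, chosen only because it makes the result easy to verify, and is not the paper's AR(p) derivation.

        import numpy as np

        def pc_prior_density(theta, kld, lam=1.0, eps=1e-6):
            """Generic penalised complexity (PC) prior density: exponential(lam)
            on the distance d(theta) = sqrt(2 * KLD(theta)), transformed back
            to theta (derivative of d taken numerically)."""
            d = np.sqrt(2.0 * kld(theta))
            dd = (np.sqrt(2.0 * kld(theta + eps))
                  - np.sqrt(2.0 * kld(theta - eps))) / (2.0 * eps)
            return lam * np.exp(-lam * d) * np.abs(dd)

        # Toy check: KLD(N(theta,1) || N(0,1)) = theta^2 / 2 gives d = |theta|,
        # so on theta > 0 the PC prior is lam * exp(-lam * theta).
        kld = lambda t: t ** 2 / 2.0
        print(pc_prior_density(1.3, kld, lam=2.0))   # ~ 2 * exp(-2.6) = 0.148...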

  4. Penalised Complexity Priors for Stationary Autoregressive Processes

    KAUST Repository

    Sørbye, Sigrunn Holbek

    2017-05-25

    The autoregressive (AR) process of order p (AR(p)) is a central model in time series analysis. A Bayesian approach requires the user to define a prior distribution for the coefficients of the AR(p) model. Although it is easy to write down some prior, it is not at all obvious how to understand and interpret the prior distribution and to ensure that it behaves according to the user's prior knowledge. In this article, we approach this problem using the recently developed ideas of penalised complexity (PC) priors. These priors have important properties such as robustness and invariance to reparameterisation, as well as a clear interpretation. A PC prior is computed based on specific principles, where model component complexity is penalised in terms of deviation from simple base model formulations. In the AR(1) case, we discuss two natural base model choices, corresponding to either independence in time or no change in time. The latter case is illustrated in a survival model with possible time-dependent frailty. For higher-order processes, we propose a sequential approach, in which the base model for AR(p) is the corresponding AR(p-1) model expressed using the partial autocorrelations. The properties of the new prior distribution are compared with the reference prior in a simulation study.

  5. Inverse problems with non-trivial priors: efficient solution through sequential Gibbs sampling

    DEFF Research Database (Denmark)

    Hansen, Thomas Mejer; Cordua, Knud Skou; Mosegaard, Klaus

    2012-01-01

    Markov chain Monte Carlo methods such as the Gibbs sampler and the Metropolis algorithm can be used to sample solutions to non-linear inverse problems. In principle, these methods allow incorporation of prior information of arbitrary complexity. If an analytical closed form description of the prior is available, which is the case when the prior can be described by a multidimensional Gaussian distribution, such prior information can easily be considered. In reality, prior information is often more complex than can be described by the Gaussian model, and no closed form expression of the prior can be given. We propose an algorithm, called sequential Gibbs sampling, allowing the Metropolis algorithm to efficiently incorporate complex priors into the solution of an inverse problem, also for the case where no closed form description of the prior exists. First, we lay out the theoretical background...
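
    The core mechanic of the proposal can be sketched compactly: resimulate a random subset of the model from the prior conditional on the rest, so that the prior ratio cancels and the Metropolis acceptance involves only the likelihood. In the sketch below a zero-mean multivariate Gaussian prior stands in for the arbitrary (e.g. training-image based) priors targeted by the paper, purely because its conditionals are tractable in closed form; all names are illustrative.

        import numpy as np

        rng = np.random.default_rng(1)

        def sequential_gibbs_step(m, C, loglike, frac=0.2):
            """One extended-Metropolis step: resimulate a random subset of the
            (zero-mean, covariance C) Gaussian prior conditional on the rest,
            then accept/reject on the likelihood ratio alone."""
            n = len(m)
            idx = rng.choice(n, size=max(1, int(frac * n)), replace=False)
            rest = np.setdiff1d(np.arange(n), idx)
            Crr_inv = np.linalg.inv(C[np.ix_(rest, rest)])
            Cir = C[np.ix_(idx, rest)]
            mu_c = Cir @ Crr_inv @ m[rest]                      # conditional mean
            C_c = C[np.ix_(idx, idx)] - Cir @ Crr_inv @ Cir.T   # conditional cov
            prop = m.copy()
            prop[idx] = rng.multivariate_normal(mu_c, C_c)
            # The proposal samples the prior conditionally, so only the
            # likelihood ratio enters the acceptance probability.
            if np.log(rng.random()) < loglike(prop) - loglike(m):
                return prop
            return m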

  6. Implementation of antimicrobial peptides for sample preparation prior to nucleic acid amplification in point-of-care settings.

    Science.gov (United States)

    Krõlov, Katrin; Uusna, Julia; Grellier, Tiia; Andresen, Liis; Jevtuševskaja, Jekaterina; Tulp, Indrek; Langel, Ülo

    2017-12-01

    A variety of sample preparation techniques are used prior to nucleic acid amplification. However, their efficiency is not always sufficient, and nucleic acid purification remains the preferred method for template preparation. Purification is difficult and costly to apply in point-of-care (POC) settings, and there is a strong need for more robust, rapid, and efficient biological sample preparation techniques in molecular diagnostics. Here, the authors applied antimicrobial peptides (AMPs) for urine sample preparation prior to isothermal loop-mediated amplification (LAMP). AMPs bind to many microorganisms, such as bacteria, fungi, protozoa and viruses, disrupting their membrane integrity and facilitating nucleic acid release. The authors show that incubation of E. coli with the antimicrobial peptide cecropin P1 for 5 min had a significant effect on the availability of template DNA compared with untreated or even heat-treated samples, resulting in up to a six-fold increase in amplification efficiency. These results show that AMP treatment is a very efficient sample preparation technique that is suitable for application prior to nucleic acid amplification directly within biological samples. Furthermore, the entire process of AMP treatment was performed at room temperature for 5 min, making it a good candidate for use in POC applications.

  7. Form of prior for constrained thermodynamic processes with uncertainty

    Science.gov (United States)

    Aneja, Preety; Johal, Ramandeep S.

    2015-05-01

    We consider quasi-static thermodynamic processes with constraints, but with additional uncertainty about the control parameters. Motivated by inductive reasoning, we assign a prior distribution that provides a rational guess about likely values of the uncertain parameters. The priors are derived explicitly for both the entropy-conserving and the energy-conserving processes. The proposed form is useful when the constraint equation cannot be treated analytically. The inference is performed using spin-1/2 systems as models for heat reservoirs. Analytical results are derived in the high-temperature limit. An agreement beyond linear response is found between the estimates of thermal quantities and their optimal values obtained from extremum principles. We also seek an intuitive interpretation of the prior and the estimated value of temperature obtained from it. We find that the prior over temperature becomes uniform over the quantity kept conserved in the process.
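
    The final statement has a simple change-of-variables reading, rendered here in LaTeX with assumed notation (C for the conserved quantity; this is a paraphrase of the abstract, not the paper's derivation):

        % A prior uniform over the conserved quantity C induces, for the
        % uncertain temperature T,
        \[
          \pi(T)\,\mathrm{d}T \propto \mathrm{d}C
          \quad\Longrightarrow\quad
          \pi(T) \propto \left|\frac{\partial C(T)}{\partial T}\right|,
        \]
        % with C = S for the entropy-conserving process and C = U for the
        % energy-conserving process.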

  8. Prior processes and their applications nonparametric Bayesian estimation

    CERN Document Server

    Phadia, Eswar G

    2016-01-01

    This book presents a systematic and comprehensive treatment of various prior processes that have been developed over the past four decades for dealing with the Bayesian approach to solving selected nonparametric inference problems. This revised edition has been substantially expanded to reflect the current interest in this area. After an overview of different prior processes, it examines the now pre-eminent Dirichlet process and its variants, including hierarchical processes, then addresses new processes such as dependent Dirichlet, local Dirichlet, time-varying and spatial processes, all of which exploit the countable mixture representation of the Dirichlet process. It subsequently discusses various neutral-to-right type processes, including gamma and extended gamma, beta and beta-Stacy processes, and then describes the Chinese Restaurant, Indian Buffet and infinite gamma-Poisson processes, which prove to be very useful in areas such as machine learning, information retrieval and featural modeling. Tailfree and P...

  9. Effects of model complexity and priors on estimation using sequential importance sampling/resampling for species conservation

    Science.gov (United States)

    Dunham, Kylee; Grand, James B.

    2016-01-01

    We examined the effects of complexity and priors on the accuracy of models used to estimate ecological and observational processes, and to make predictions regarding population size and structure. State-space models are useful for estimating complex, unobservable population processes and making predictions about future populations based on limited data. To better understand the utility of state-space models in evaluating population dynamics, we used them in a Bayesian framework and compared the accuracy of models with differing complexity, with and without informative priors, using sequential importance sampling/resampling (SISR). Count data were simulated for 25 years using known parameters and an observation process for each model. We used kernel smoothing to reduce the effect of particle depletion, which is common when estimating both states and parameters with SISR. Models using informative priors estimated parameter values and population size with greater accuracy than their non-informative counterparts. While the estimates of population size and trend did not suffer greatly in models using non-informative priors, the algorithm was unable to accurately estimate demographic parameters. This model framework provides reasonable estimates of population size when little to no information is available; however, when information on some vital rates is available, SISR can be used to obtain more precise estimates of population size and process. Incorporating model complexity, such as that required by structured populations with stage-specific vital rates, affects precision and accuracy when estimating latent population variables and predicting population dynamics. These results are important to consider when designing monitoring programs and conservation efforts requiring management of specific population segments.
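
    As a concrete reference point, here is a minimal SISR particle filter for a scalar population model with a static growth parameter carried in each particle; the Liu-West-style shrink-and-jitter step stands in for the kernel smoothing mentioned in the abstract. Both the model and the smoothing scheme are illustrative assumptions, not the authors' exact setup.

        import numpy as np

        rng = np.random.default_rng(2)

        def sisr(counts, n_part=5000, a=0.98):
            """SISR for N_t = r * N_{t-1} + noise observed as Poisson counts,
            estimating the state N and static parameter r jointly."""
            N = rng.uniform(50, 150, n_part)              # state particles
            r = rng.uniform(0.8, 1.2, n_part)             # parameter particles
            for y in counts:
                N = np.clip(r * N + rng.normal(0, 5, n_part), 1e-3, None)
                logw = y * np.log(N) - N                  # Poisson log-lik (unnorm.)
                w = np.exp(logw - logw.max()); w /= w.sum()
                k = rng.choice(n_part, n_part, p=w)       # resample
                N, r = N[k], r[k]
                # Kernel smoothing of r (shrink to the mean, then jitter)
                # counters particle depletion of the static parameter.
                r = a * r + (1 - a) * r.mean() \
                    + rng.normal(0, np.sqrt(1 - a ** 2) * r.std(), n_part)
            return N, r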

  10. Rapid sampling of molecular motions with prior information constraints.

    Science.gov (United States)

    Raveh, Barak; Enosh, Angela; Schueler-Furman, Ora; Halperin, Dan

    2009-02-01

    Proteins are active, flexible machines that perform a range of different functions. Innovative experimental approaches may now provide limited partial information about conformational changes along motion pathways of proteins. There is therefore a need for computational approaches that can efficiently incorporate prior information into motion prediction schemes. In this paper, we present PathRover, a general setup designed for the integration of prior information into the motion planning algorithm of rapidly exploring random trees (RRT). Each suggested motion pathway comprises a sequence of low-energy clash-free conformations that satisfy an arbitrary number of prior information constraints. These constraints can be derived from experimental data or from expert intuition about the motion. The incorporation of prior information is very straightforward and significantly narrows down the vast search in the typically high-dimensional conformational space, leading to dramatic reduction in running time. To allow the use of state-of-the-art energy functions and conformational sampling, we have integrated this framework into Rosetta, an accurate protocol for diverse types of structural modeling. The suggested framework can serve as an effective complementary tool for molecular dynamics, Normal Mode Analysis, and other prevalent techniques for predicting motion in proteins. We applied our framework to three different model systems. We show that a limited set of experimentally motivated constraints may effectively bias the simulations toward diverse predicates in an outright fashion, from distance constraints to enforcement of loop closure. In particular, our analysis sheds light on mechanisms of protein domain swapping and on the role of different residues in the motion.
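
    PathRover itself is implemented inside Rosetta; the standalone sketch below only shows the core idea the abstract describes, an RRT whose growth is filtered by arbitrary prior-information predicates. Everything here (the geometry, the predicate interface, the step rule) is an illustrative assumption.

        import numpy as np

        def rrt_with_constraints(start, goal_pred, sample_fn, constraints,
                                 step=0.1, iters=5000):
            """Grow a rapidly exploring random tree, keeping only nodes that
            satisfy every prior-information predicate."""
            nodes, parent = [np.asarray(start, float)], {0: None}
            for _ in range(iters):
                q = sample_fn()                              # random conformation
                near = min(range(len(nodes)),
                           key=lambda i: np.linalg.norm(nodes[i] - q))
                d = q - nodes[near]
                new = nodes[near] + step * d / (np.linalg.norm(d) + 1e-12)
                if all(c(new) for c in constraints):         # prior-info filter
                    parent[len(nodes)] = near
                    nodes.append(new)
                    if goal_pred(new):                       # reconstruct path
                        path, i = [], len(nodes) - 1
                        while i is not None:
                            path.append(nodes[i]); i = parent[i]
                        return path[::-1]
            return None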

  11. Rapid sampling of molecular motions with prior information constraints.

    Directory of Open Access Journals (Sweden)

    Barak Raveh

    2009-02-01

    Full Text Available Proteins are active, flexible machines that perform a range of different functions. Innovative experimental approaches may now provide limited partial information about conformational changes along motion pathways of proteins. There is therefore a need for computational approaches that can efficiently incorporate prior information into motion prediction schemes. In this paper, we present PathRover, a general setup designed for the integration of prior information into the motion planning algorithm of rapidly exploring random trees (RRT). Each suggested motion pathway comprises a sequence of low-energy clash-free conformations that satisfy an arbitrary number of prior information constraints. These constraints can be derived from experimental data or from expert intuition about the motion. The incorporation of prior information is very straightforward and significantly narrows down the vast search in the typically high-dimensional conformational space, leading to dramatic reduction in running time. To allow the use of state-of-the-art energy functions and conformational sampling, we have integrated this framework into Rosetta, an accurate protocol for diverse types of structural modeling. The suggested framework can serve as an effective complementary tool for molecular dynamics, Normal Mode Analysis, and other prevalent techniques for predicting motion in proteins. We applied our framework to three different model systems. We show that a limited set of experimentally motivated constraints may effectively bias the simulations toward diverse predicates in an outright fashion, from distance constraints to enforcement of loop closure. In particular, our analysis sheds light on mechanisms of protein domain swapping and on the role of different residues in the motion.

  12. Does attention speed up processing? Decreases and increases of processing rates in visual prior entry.

    Science.gov (United States)

    Tünnermann, Jan; Petersen, Anders; Scharlau, Ingrid

    2015-03-02

    Selective visual attention improves performance in many tasks. Among other effects, it leads to "prior entry": earlier perception of an attended stimulus compared to an unattended one. Whether this phenomenon is purely based on an increase of the processing rate of the attended stimulus, or whether a decrease in the processing rate of the unattended stimulus also contributes to the effect, has so far remained unanswered. Here we describe a novel approach to this question based on Bundesen's Theory of Visual Attention, which we use to overcome the limitations of earlier prior-entry assessment with temporal order judgments (TOJs), which only allow relative statements regarding the processing speed of attended and unattended stimuli. Prevalent models of prior entry in TOJs either indirectly predict a pure acceleration or cannot model the difference between acceleration and deceleration. In a paradigm that combines a letter-identification task with TOJs, we show that acceleration of the attended and deceleration of the unattended stimuli indeed conjointly cause prior entry. © 2015 ARVO.
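
    Why TOJs alone cannot separate acceleration from deceleration is easy to see in a toy exponential race with TVA-style processing rates: the psychometric function depends on the attended and unattended rates jointly. The sketch below is exactly such a textbook race model, given as an assumed illustration rather than the paper's full TVA fit (which additionally uses the letter-identification task to pin the rates down).

        import numpy as np

        def p_attended_first(soa, va, vu):
            """P(attended stimulus finishes processing first) in an exponential
            race with rates va, vu; soa = onset of unattended minus onset of
            attended, in seconds."""
            soa = np.asarray(soa, float)
            lead = 1 - np.exp(-va * soa) * vu / (va + vu)   # soa >= 0
            lag = np.exp(vu * soa) * va / (va + vu)         # soa < 0
            return np.where(soa >= 0, lead, lag)

        # Raising va (attention) or lowering vu shifts the curve the same way,
        # so TOJ data alone cannot tell the two mechanisms apart.
        print(p_attended_first([-0.05, 0.0, 0.05], va=60.0, vu=30.0))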

  13. Perfluoroalkyl Acid Concentrations in Blood Samples Subjected to Transportation and Processing Delay

    DEFF Research Database (Denmark)

    Bach, Cathrine Carlsen; Henriksen, Tine Brink; Bossi, Rossana

    2015-01-01

    ...blood samples subjected to delay and transportation prior to processing and samples with immediate processing and freezing. METHODS: Pregnant women recruited at Aarhus University Hospital, Denmark (n = 88), provided paired blood samples. For each pair of samples, one was immediately processed and plasma was frozen, and the other was delayed and transported as whole blood before processing and freezing of plasma (similar to the Danish National Birth Cohort). We measured 12 perfluoroalkyl acids and present results for compounds with more than 50% of samples above the lower limit of quantification. RESULTS: For samples taken in the winter, relative differences between the paired samples ranged between -77 and +38% for individual perfluoroalkyl acids. In most cases concentrations were lower in the delayed and transported samples, e.g. the relative difference was -29% (95% confidence interval -30; -27) for perfluorooctane sulfonate. For perfluorooctanoate...

  14. Prior knowledge processing for initial state of Kalman filter

    Czech Academy of Sciences Publication Activity Database

    Suzdaleva, Evgenia

    2010-01-01

    Roč. 24, č. 3 (2010), s. 188-202 ISSN 0890-6327 R&D Projects: GA ČR(CZ) GP201/06/P434 Institutional research plan: CEZ:AV0Z10750506 Keywords : Kalman filtering * prior knowledge * state-space model * initial state distribution Subject RIV: BC - Control Systems Theory Impact factor: 0.729, year: 2010 http://library.utia.cas.cz/separaty/2009/AS/suzdaleva-prior knowledge processing for initial state of kalman filter.pdf

  15. On a Robust MaxEnt Process Regression Model with Sample-Selection

    Directory of Open Access Journals (Sweden)

    Hea-Jung Kim

    2018-04-01

    Full Text Available In a regression analysis, a sample-selection bias arises when a dependent variable is partially observed as a result of the sample selection. This study introduces a Maximum Entropy (MaxEnt) process regression model that assumes a MaxEnt prior distribution for its nonparametric regression function and finds that the MaxEnt process regression model includes the well-known Gaussian process regression (GPR) model as a special case. Then, this special MaxEnt process regression model, i.e., the GPR model, is generalized to obtain a robust sample-selection Gaussian process regression (RSGPR) model that deals with non-normal data in the sample selection. Various properties of the RSGPR model are established, including the stochastic representation, distributional hierarchy, and magnitude of the sample-selection bias. These properties are used in the paper to develop a hierarchical Bayesian methodology to estimate the model. This involves a simple and computationally feasible Markov chain Monte Carlo algorithm that avoids analytical or numerical derivatives of the log-likelihood function of the model. The performance of the RSGPR model, in terms of the sample-selection bias correction, robustness to non-normality, and prediction, is demonstrated through simulation results that attest to its good finite-sample performance.

  16. Standardised Resting Time Prior to Blood Sampling and Diurnal Variation Associated with Risk of Patient Misclassification

    DEFF Research Database (Denmark)

    Bøgh Andersen, Ida; Brasen, Claus L.; Christensen, Henry

    2015-01-01

    BACKGROUND: According to current recommendations, blood samples should be taken in the morning after 15 minutes' resting time. Some components exhibit diurnal variation and, in response to pressures to expand opening hours and reduce waiting time, the aims of this study were to investigate the impact of resting time prior to blood sampling and diurnal variation on biochemical components, including albumin, thyrotropin (TSH), total calcium and sodium in plasma. METHODS: All patients referred to an outpatient clinic for blood sampling were included in the period Nov 2011 until June 2014 (opening... RESULTS: ...9×10-7) and sodium (p = 8.7×10-16). Only TSH and albumin were clinically significantly influenced by diurnal variation. Resting time had no clinically significant effect. CONCLUSIONS: We found no need for resting 15 minutes prior to blood sampling. However, diurnal variation was found to have a significant...

  17. Is Mars Sample Return Required Prior to Sending Humans to Mars?

    Science.gov (United States)

    Carr, Michael; Abell, Paul; Allwood, Abigail; Baker, John; Barnes, Jeff; Bass, Deborah; Beaty, David; Boston, Penny; Brinkerhoff, Will; Budney, Charles

    2012-01-01

    Prior to potentially sending humans to the surface of Mars, it is fundamentally important to return samples from Mars. Analysis in Earth's extensive scientific laboratories would significantly reduce the risk of human Mars exploration and would also support the science and engineering decisions relating to the Mars human flight architecture. The importance of measurements of any returned Mars samples ranges from critical to desirable, and in all cases these samples would enhance our understanding of the Martian environment before potentially sending humans to that alien locale. For example, Mars sample return (MSR) could yield information that would enable human exploration related to 1) enabling forward and back planetary protection, 2) characterizing properties of Martian materials relevant for in situ resource utilization (ISRU), 3) assessing any toxicity of Martian materials with respect to human health and performance, and 4) identifying information related to engineering surface hazards such as the corrosive effect of the Martian environment. In addition, MSR would be an engineering 'proof of concept' for a potential round-trip human mission to the planet, and a potential model for international Mars exploration.

  18. Green approaches in sample preparation of bioanalytical samples prior to chromatographic analysis.

    Science.gov (United States)

    Filippou, Olga; Bitas, Dimitrios; Samanidou, Victoria

    2017-02-01

    Sample preparation is considered to be the most challenging step of the analytical procedure, since it has an effect on the whole analytical methodology and therefore contributes significantly to the greenness, or lack of it, of the entire process. The basis for greening sample preparation and analytical methods is the elimination of sample treatment steps while at the same time reducing the amount of sample, strongly reducing the consumption of hazardous reagents and energy, maximizing safety for operators and the environment, and avoiding the use of large amounts of organic solvents. In the last decade, the development and utilization of greener and sustainable microextraction techniques has offered an alternative to classical sample preparation procedures. In this review, the main green microextraction techniques (solid phase microextraction, stir bar sorptive extraction, hollow-fiber liquid phase microextraction, dispersive liquid-liquid microextraction, etc.) will be presented, with special attention to bioanalytical applications of these environment-friendly sample preparation techniques, which comply with the green analytical chemistry principles. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Perfluoroalkyl Acid Concentrations in Blood Samples Subjected to Transportation and Processing Delay.

    Science.gov (United States)

    Bach, Cathrine Carlsen; Henriksen, Tine Brink; Bossi, Rossana; Bech, Bodil Hammer; Fuglsang, Jens; Olsen, Jørn; Nohr, Ellen Aagaard

    2015-01-01

    In studies of perfluoroalkyl acids, the validity and comparability of measured concentrations may be affected by differences in the handling of biospecimens. We aimed to investigate whether measured plasma levels of perfluoroalkyl acids differed between blood samples subjected to delay and transportation prior to processing and samples with immediate processing and freezing. Pregnant women recruited at Aarhus University Hospital, Denmark, (n = 88) provided paired blood samples. For each pair of samples, one was immediately processed and plasma was frozen, and the other was delayed and transported as whole blood before processing and freezing of plasma (similar to the Danish National Birth Cohort). We measured 12 perfluoroalkyl acids and present results for compounds with more than 50% of samples above the lower limit of quantification. For samples taken in the winter, relative differences between the paired samples ranged between -77 and +38% for individual perfluoroalkyl acids. In most cases concentrations were lower in the delayed and transported samples, e.g. the relative difference was -29% (95% confidence interval -30; -27) for perfluorooctane sulfonate. For perfluorooctanoate there was no difference between the two setups [corresponding estimate 1% (0, 3)]. Differences were negligible in the summer for all compounds. Transport of blood samples and processing delay, similar to conditions applied in some large, population-based studies, may affect measured perfluoroalkyl acid concentrations, mainly when outdoor temperatures are low. Attention to processing conditions is needed in studies of perfluoroalkyl acid exposure in humans.

  20. Sets of priors reflecting prior-data conflict and agreement

    NARCIS (Netherlands)

    Walter, G.M.; Coolen, F.P.A.; Carvalho, J.P.; Lesot, M.-J.; Kaymak, U.; Vieira, S.; Bouchon-Meunier, B.; Yager, R.R.

    2016-01-01

    Bayesian inference enables combination of observations with prior knowledge in the reasoning process. The choice of a particular prior distribution to represent the available prior knowledge is, however, often debatable, especially when prior knowledge is limited or data are scarce, as then

  1. On selecting a prior for the precision parameter of Dirichlet process mixture models

    Science.gov (United States)

    Dorazio, R.M.

    2009-01-01

    In hierarchical mixture models the Dirichlet process is used to specify latent patterns of heterogeneity, particularly when the distribution of latent parameters is thought to be clustered (multimodal). The parameters of a Dirichlet process include a precision parameter α and a base probability measure G0. In problems where α is unknown and must be estimated, inferences about the level of clustering can be sensitive to the choice of prior assumed for α. In this paper an approach is developed for computing a prior for the precision parameter α that can be used in the presence or absence of prior information about the level of clustering. This approach is illustrated in an analysis of counts of stream fishes. The results of this fully Bayesian analysis are compared with an empirical Bayes analysis of the same data and with a Bayesian analysis based on an alternative commonly used prior.
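
    A practical way to see the sensitivity the abstract describes is to look at the prior on the number of clusters induced by a candidate prior on the precision parameter, which is easy to simulate through the Chinese restaurant process. The Gamma prior in the demo is an arbitrary assumption, not the prior developed in the paper.

        import numpy as np

        rng = np.random.default_rng(4)

        def n_clusters_crp(alpha, n):
            """Cluster count among n draws from a Chinese restaurant process:
            customer i+1 opens a new table with probability alpha/(alpha+i)."""
            k = 1
            for i in range(1, n):
                if rng.random() < alpha / (alpha + i):
                    k += 1
            return k

        def induced_cluster_prior(alpha_sampler, n, reps=10000):
            """Monte Carlo estimate of the prior on the number of clusters
            induced by a prior on alpha."""
            ks = [n_clusters_crp(alpha_sampler(), n) for _ in range(reps)]
            return np.bincount(ks, minlength=n + 1)[1:] / reps

        probs = induced_cluster_prior(lambda: rng.gamma(2.0, 1.0), n=50)
        print(probs[:10])   # P(K = 1), ..., P(K = 10) under this prior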

  2. Sample preparation prior to the LC-MS-based metabolomics/metabonomics of blood-derived samples.

    Science.gov (United States)

    Gika, Helen; Theodoridis, Georgios

    2011-07-01

    Blood represents a very important biological fluid and has been the target of continuous and extensive research for diagnostic, health and drug monitoring purposes. Recently, metabonomics/metabolomics have emerged as a new and promising 'omics' platform that shows potential in biomarker discovery, especially in areas such as disease diagnosis and the assessment of drug efficacy or toxicity. Blood is collected in various establishments under conditions that are not standardized. Next, the samples are prepared and analyzed using different methodologies or tools. When targeted analysis of key molecules (e.g., a drug or its metabolite[s]) is the aim, enforcement of certain measures or additional analyses may correct and harmonize these discrepancies. In omics fields, where holistic analytical approaches are employed, no such rules or tools are available. As a result, comparison or correlation of results or data fusion becomes impractical. It is evident, however, that such obstacles should be overcome in the near future to allow for large-scale studies that involve the assaying of samples from hundreds of individuals; otherwise, months of expert work and expensive instrument time may be wasted on the effects of sample handling and preparation. The present review aims to cover the different methodologies applied to the pretreatment of blood prior to LC-MS metabolomic/metabonomic studies. The article tries to critically compare the methods and highlight issues that need to be addressed.

  3. Variable Selection for Nonparametric Gaussian Process Priors: Models and Computational Strategies.

    Science.gov (United States)

    Savitsky, Terrance; Vannucci, Marina; Sha, Naijun

    2011-02-01

    This paper presents a unified treatment of Gaussian process models that extends to data from the exponential dispersion family and to survival data. Our specific interest is in the analysis of data sets with predictors that have an a priori unknown form of possibly nonlinear associations to the response. The modeling approach we describe incorporates Gaussian processes in a generalized linear model framework to obtain a class of nonparametric regression models where the covariance matrix depends on the predictors. We consider, in particular, continuous, categorical and count responses. We also look into models that account for survival outcomes. We explore alternative covariance formulations for the Gaussian process prior and demonstrate the flexibility of the construction. Next, we focus on the important problem of selecting variables from the set of possible predictors and describe a general framework that employs mixture priors. We compare alternative MCMC strategies for posterior inference and achieve a computationally efficient and practical approach. We demonstrate performances on simulated and benchmark data sets.

  4. Sampling and sample processing in pesticide residue analysis.

    Science.gov (United States)

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.

  5. Flexible link functions in nonparametric binary regression with Gaussian process priors.

    Science.gov (United States)

    Li, Dan; Wang, Xia; Lin, Lizhen; Dey, Dipak K

    2016-09-01

    In many scientific fields, it is a common practice to collect a sequence of 0-1 binary responses from a subject across time, space, or a collection of covariates. Researchers are interested in finding out how the expected binary outcome is related to covariates, and aim at better prediction in the future 0-1 outcomes. Gaussian processes have been widely used to model nonlinear systems; in particular to model the latent structure in a binary regression model allowing nonlinear functional relationship between covariates and the expectation of binary outcomes. A critical issue in modeling binary response data is the appropriate choice of link functions. Commonly adopted link functions such as probit or logit links have fixed skewness and lack the flexibility to allow the data to determine the degree of the skewness. To address this limitation, we propose a flexible binary regression model which combines a generalized extreme value link function with a Gaussian process prior on the latent structure. Bayesian computation is employed in model estimation. Posterior consistency of the resulting posterior distribution is demonstrated. The flexibility and gains of the proposed model are illustrated through detailed simulation studies and two real data examples. Empirical results show that the proposed model outperforms a set of alternative models, which only have either a Gaussian process prior on the latent regression function or a Dirichlet prior on the link function. © 2015, The International Biometric Society.
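
    For readers who want the shape of the link: under one common parameterisation (assumed here, not quoted from the paper), the GEV link is p(eta) = 1 - G(-eta) with G the standard GEV distribution function, whose shape parameter xi controls the skewness and recovers the complementary log-log link as xi -> 0. A small sketch:

        import numpy as np

        def gev_link_prob(eta, xi):
            """Success probability under an assumed GEV link p = 1 - G(-eta);
            xi -> 0 gives the complementary log-log link."""
            eta = np.asarray(eta, float)
            if abs(xi) < 1e-8:
                return 1.0 - np.exp(-np.exp(eta))      # cloglog limit
            t = np.maximum(1.0 - xi * eta, 1e-12)      # GEV support constraint
            return 1.0 - np.exp(-t ** (-1.0 / xi))

        # Unlike probit or logit, the degree of skewness is learned from the
        # data through xi.
        print(gev_link_prob([-1.0, 0.0, 1.0], xi=-0.3))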

  6. Online persuasion process: a critical literature review of prior research

    OpenAIRE

    Poorrezaei, M

    2013-01-01

    In this paper, some of the limitations of prior research on the online persuasion process are highlighted. To do this, two main approaches that have been taken to study the online persuasion process in the context of social media are identified. Then, this study discusses the limitations and gaps of each approach. This paper is part of the author's PhD dissertation, which is being conducted to examine how different online behaviours are persuaded in online brand communities. The r...

  7. False recognition depends on depth of prior word processing: a magnetoencephalographic (MEG) study.

    Science.gov (United States)

    Walla, P; Hufnagl, B; Lindinger, G; Deecke, L; Imhof, H; Lang, W

    2001-04-01

    Brain activity was measured with a whole head magnetoencephalograph (MEG) during the test phases of word recognition experiments. Healthy young subjects had to discriminate between previously presented and new words. During prior study phases two different levels of word processing were provided according to two different kinds of instructions (shallow and deep encoding). Event-related fields (ERFs) associated with falsely recognized words (false alarms) were found to depend on the depth of processing during the prior study phase. False alarms elicited higher brain activity (as reflected by dipole strength) in case of prior deep encoding as compared to shallow encoding between 300 and 500 ms after stimulus onset at temporal brain areas. Between 500 and 700 ms we found evidence for differences in the involvement of neural structures related to both conditions of false alarms. Furthermore, the number of false alarms was found to depend on depth of processing. Shallow encoding led to a higher number of false alarms than deep encoding. All data are discussed as strong support for the ideas that a certain level of word processing is performed by a distinct set of neural systems and that the same neural systems which encode information are reactivated during the retrieval.

  8. Fast Markov chain Monte Carlo sampling for sparse Bayesian inference in high-dimensional inverse problems using L1-type priors

    International Nuclear Information System (INIS)

    Lucka, Felix

    2012-01-01

    Sparsity has become a key concept for solving high-dimensional inverse problems using variational regularization techniques. Recently, using similar sparsity constraints in the Bayesian framework for inverse problems, by encoding them in the prior distribution, has attracted attention. Important questions about the relation between regularization theory and Bayesian inference still need to be addressed when using sparsity-promoting inversion. A practical obstacle for these examinations is the lack of fast posterior sampling algorithms for sparse, high-dimensional Bayesian inversion. Accessing the full range of Bayesian inference methods requires being able to draw samples from the posterior probability distribution in a fast and efficient way. This is usually done using Markov chain Monte Carlo (MCMC) sampling algorithms. In this paper, we develop and examine a new implementation of a single-component Gibbs MCMC sampler for sparse priors relying on L1-norms. We demonstrate that the efficiency of our Gibbs sampler increases when the level of sparsity or the dimension of the unknowns is increased. This property is contrary to the properties of the most commonly applied Metropolis-Hastings (MH) sampling schemes: we demonstrate that the efficiency of MH schemes for L1-type priors dramatically decreases when the level of sparsity or the dimension of the unknowns is increased. Practically, Bayesian inversion for L1-type priors using MH samplers is not feasible at all. As this is commonly believed to be an intrinsic feature of MCMC sampling, the performance of our Gibbs sampler also challenges common beliefs about the applicability of sample-based Bayesian inference. (paper)
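
    The single-component update that makes such a Gibbs sampler possible is worth spelling out: with a Gaussian likelihood and prior exp(-lam * ||x||_1), the full conditional of each component is a two-sided mixture of truncated Gaussians and can be sampled exactly. The sketch below implements this textbook update for the plain L1 case; the paper's setting (e.g. priors composed with operators, imaging-scale problems) is more general, and all names are illustrative.

        import numpy as np
        from scipy.stats import norm, truncnorm

        rng = np.random.default_rng(5)

        def gibbs_l1(A, y, sigma, lam, n_iter=1000):
            """Single-component Gibbs sampling of
            p(x) ~ exp(-||A x - y||^2 / (2 sigma^2) - lam * ||x||_1)."""
            n = A.shape[1]
            x = np.zeros(n)
            col2 = (A ** 2).sum(axis=0)
            for _ in range(n_iter):
                for i in range(n):
                    r = y - A @ x + A[:, i] * x[i]       # residual without x_i
                    s2 = sigma ** 2 / col2[i]
                    s = np.sqrt(s2)
                    m = A[:, i] @ r / col2[i]
                    mu_p, mu_n = m - lam * s2, m + lam * s2
                    # log-weights of the positive/negative truncated pieces
                    lw = np.array([-lam * m + norm.logcdf(mu_p / s),
                                   lam * m + norm.logcdf(-mu_n / s)])
                    w = np.exp(lw - lw.max()); w /= w.sum()
                    if rng.random() < w[0]:              # positive branch
                        x[i] = truncnorm.rvs(-mu_p / s, np.inf, loc=mu_p,
                                             scale=s, random_state=rng)
                    else:                                # negative branch
                        x[i] = truncnorm.rvs(-np.inf, -mu_n / s, loc=mu_n,
                                             scale=s, random_state=rng)
            return x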

  9. Relations among conceptual knowledge, procedural knowledge, and procedural flexibility in two samples differing in prior knowledge.

    Science.gov (United States)

    Schneider, Michael; Rittle-Johnson, Bethany; Star, Jon R

    2011-11-01

    Competence in many domains rests on children developing conceptual and procedural knowledge, as well as procedural flexibility. However, research on the developmental relations between these different types of knowledge has yielded unclear results, in part because little attention has been paid to the validity of the measures or to the effects of prior knowledge on the relations. To overcome these problems, we modeled the three constructs in the domain of equation solving as latent factors and tested (a) whether the predictive relations between conceptual and procedural knowledge were bidirectional, (b) whether these interrelations were moderated by prior knowledge, and (c) how both constructs contributed to procedural flexibility. We analyzed data from 2 measurement points each from two samples (Ns = 228 and 304) of middle school students who differed in prior knowledge. Conceptual and procedural knowledge had stable bidirectional relations that were not moderated by prior knowledge. Both kinds of knowledge contributed independently to procedural flexibility. The results demonstrate how changes in complex knowledge structures contribute to competence development.

  10. Gaussian process based intelligent sampling for measuring nano-structure surfaces

    Science.gov (United States)

    Sun, L. J.; Ren, M. J.; Yin, Y. H.

    2016-09-01

    Nanotechnology is the science and engineering of manipulating matter at the nano scale, which can be used to create many new materials and devices with a vast range of applications. As nanotech products increasingly enter the commercial marketplace, nanometrology becomes a stringent and enabling technology for the manipulation and quality control of nanotechnology. However, many measuring instruments, for instance scanning probe microscopes, are limited to relatively small areas of hundreds of micrometers with very low efficiency. Therefore intelligent sampling strategies are required to improve the scanning efficiency for measuring large areas. This paper presents a Gaussian process based intelligent sampling method to address this problem. The method makes use of Gaussian process based Bayesian regression as a mathematical foundation to represent the surface geometry, and the posterior estimation of the Gaussian process is computed by combining the prior probability distribution with the maximum likelihood function. Each sampling point is then adaptively selected by determining the position which is the most likely to lie outside the required tolerance zone among the candidates, and is inserted to update the model iteratively. Simulations on both the nominal surface and the manufactured surface have been conducted on nano-structure surfaces to verify the validity of the proposed method. The results imply that the proposed method significantly improves the measurement efficiency in measuring large-area structured surfaces.
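
    A minimal version of the selection loop's two ingredients, a GP posterior and the "most likely outside tolerance" criterion, can be sketched as follows; the RBF kernel, its parameters and the 1-D geometry are assumptions for illustration, not the paper's settings.

        import numpy as np
        from scipy.stats import norm

        def gp_posterior(Xs, X, y, ell=1.0, sf=1.0, noise=1e-4):
            """GP regression posterior mean/sd (RBF kernel) at 1-D points Xs."""
            k = lambda a, b: sf**2 * np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)
            K = k(X, X) + noise * np.eye(len(X))
            L = np.linalg.cholesky(K)
            alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
            Ks = k(Xs, X)
            mu = Ks @ alpha
            v = np.linalg.solve(L, Ks.T)
            var = sf**2 - (v ** 2).sum(axis=0)
            return mu, np.sqrt(np.maximum(var, 1e-12))

        def next_sample_index(mu, sd, tol):
            """Candidate with the highest posterior probability that the surface
            deviation lies outside the tolerance zone [-tol, +tol]."""
            p_out = norm.cdf((-tol - mu) / sd) + norm.sf((tol - mu) / sd)
            return int(np.argmax(p_out))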

  11. PREFERENCE OF PRIOR FOR BAYESIAN ANALYSIS OF THE MIXED BURR TYPE X DISTRIBUTION UNDER TYPE I CENSORED SAMPLES

    Directory of Open Access Journals (Sweden)

    Tabassum Naz Sindhu

    2014-05-01

    Full Text Available The paper is concerned with the preference of prior for the Bayesian analysis of the shape parameter of a mixture of Burr type X distributions using censored data. We modeled the heterogeneous population using a two-component mixture of the Burr type X distribution. A comprehensive simulation scheme, through probabilistic mixing, has been followed to highlight the properties and behavior of the estimates in terms of sample size, corresponding risks and the proportion of the components of the mixture. The Bayes estimators of the parameters have been evaluated under the assumption of informative and non-informative priors using symmetric and asymmetric loss functions. A model selection criterion for the preference of the prior has been introduced. The hazard rate function of the mixture distribution has been discussed. The Bayes estimates under the exponential prior and the precautionary loss function exhibit the minimum posterior risks, with some exceptions.

  12. The Role of Prior Entrepreneurial Exposure in the Entrepreneurial Process: A Review and Future Research Implications

    NARCIS (Netherlands)

    Zapkau, F.B.; Schwens, Christian; Kabst, Rüdiger

    2017-01-01

    Despite considerable research, the current state regarding how and in which context prior entrepreneurial exposure impacts the entrepreneurial process is unclear. The present paper's goal is to systemize and discuss extant quantitative-empirical research on the role of prior entrepreneurial exposure

  13. Pierre Gy's sampling theory and sampling practice heterogeneity, sampling correctness, and statistical process control

    CERN Document Server

    Pitard, Francis F

    1993-01-01

    Pierre Gy's Sampling Theory and Sampling Practice, Second Edition is a concise, step-by-step guide for process variability management and methods. Updated and expanded, this new edition provides a comprehensive study of heterogeneity, covering the basic principles of sampling theory and its various applications. It presents many practical examples to allow readers to select appropriate sampling protocols and assess the validity of sampling protocols from others. The variability of dynamic process streams using variography is discussed to help bridge sampling theory with statistical process control. Many descriptions of good sampling devices, as well as descriptions of poor ones, are featured to educate readers on what to look for when purchasing sampling systems. The book uses its accessible, tutorial style to focus on professional selection and use of methods. The book will be a valuable guide for mineral processing engineers; metallurgists; geologists; miners; chemists; environmental scientists; and practit...
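
    The quantitative heart of Gy's theory is the fundamental sampling error; as commonly stated in the sampling literature (a standard rendering from that literature, not a quotation from this edition), the relative variance for a particulate material is

        \[
          \sigma^{2}_{\mathrm{FSE}} \;=\; c\, f\, g\, \ell\, d^{3}
          \left( \frac{1}{M_S} - \frac{1}{M_L} \right),
        \]
        % c: mineralogical composition factor, f: particle shape factor,
        % g: granulometric factor, l: liberation factor, d: nominal top
        % particle size, M_S / M_L: sample / lot mass.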

  14. A Habermasian Analysis of a Process of Recognition of Prior Learning for Health Care Assistants

    Science.gov (United States)

    Sandberg, Fredrik

    2012-01-01

    This article discusses a process of recognition of prior learning for accreditation of prior experiential learning to qualify for course credits used in an adult in-service education program for health care assistants at the upper-secondary level in Sweden. The data are based on interviews and observations drawn from a field study, and Habermas's…

  15. Representative process sampling - in practice

    DEFF Research Database (Denmark)

    Esbensen, Kim; Friis-Pedersen, Hans Henrik; Julius, Lars Petersen

    2007-01-01

    Didactic data sets representing a range of real-world processes are used to illustrate "how to do" representative process sampling and process characterisation. The selected process data lead to diverse variogram expressions with different systematics (no range vs. important ranges; trends and/or periodicity; different nugget effects and process variations ranging from less than one lag to full variogram lag). Variogram data analysis leads to a fundamental decomposition into 0-D sampling vs. 1-D process variances, based on the three principal variogram parameters: range, sill and nugget effect...
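
    The estimator behind this decomposition is the empirical process variogram; a minimal sketch (variable names assumed) for a 1-D process series:

        import numpy as np

        def variogram(z, max_lag):
            """Empirical variogram gamma(h) = mean((z[i+h] - z[i])^2) / 2
            for lags h = 1..max_lag of a 1-D process series z."""
            return np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2)
                             for h in range(1, max_lag + 1)])

        # The nugget (gamma as h -> 0) estimates the 0-D sampling variance;
        # the rise towards the sill over the range reflects the 1-D process
        # variation, which is the decomposition described above.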

  16. Sample and data management process description

    International Nuclear Information System (INIS)

    Kessner, J.H.

    2000-01-01

    The sample and data management process was initiated in 1994 as a result of a process improvement workshop. The purpose of the workshop was to develop a sample and data management process that would reduce cycle time and costs, simplify systems and procedures, and improve customer satisfaction for sampling, analytical services, and data management activities

  17. Radar Doppler Processing with Nonuniform Sampling.

    Energy Technology Data Exchange (ETDEWEB)

    Doerry, Armin W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-07-01

    Conventional signal processing to estimate radar Doppler frequency often assumes uniform pulse/sample spacing. This is for the convenience of the processing. More recent performance enhancements in processor capability allow optimal processing of nonuniform pulse/sample spacing, thereby overcoming some of the baggage that attends uniform sampling, such as Doppler ambiguity and SNR losses due to sidelobe control measures.
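
    One standard way to estimate a Doppler line from nonuniformly spaced pulses is a least-squares spectral fit such as the Lomb-Scargle periodogram; the report's radar-specific processing may differ, so treat this as a generic stand-in with made-up numbers.

        import numpy as np
        from scipy.signal import lombscargle

        rng = np.random.default_rng(6)

        t = np.sort(rng.uniform(0.0, 1.0, 200))      # nonuniform sample times (s)
        fd = 37.0                                    # true Doppler frequency (Hz)
        y = np.cos(2 * np.pi * fd * t) + 0.1 * rng.normal(size=t.size)

        f = np.linspace(1.0, 100.0, 2000)            # candidate frequencies (Hz)
        pgram = lombscargle(t, y, 2 * np.pi * f)     # expects angular frequencies
        print("estimated Doppler:", f[np.argmax(pgram)], "Hz")

    Nonuniform spacing also removes the aliasing that ties a uniform PRF to a fixed unambiguous Doppler interval, which is part of the "baggage" the abstract refers to.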

  18. Storage of intact heads prior to processing limits the shelf-life of fresh-cut Lactuca sativa L.

    NARCIS (Netherlands)

    Witkowska, I.M.; Woltering, E.J.

    2014-01-01

    Harvested lettuce heads are usually transported and stored for some period of time under a variety of conditions prior to processing. During storage, especially under suboptimal conditions, nutritional composition of the harvested produce continues to change. The possible impact of prior storage of

  19. Optimization of Sample Preparation Processes of Bone Material for Raman Spectroscopy.

    Science.gov (United States)

    Chikhani, Madelen; Wuhrer, Richard; Green, Hayley

    2018-03-30

    Raman spectroscopy has recently been investigated for use in the calculation of postmortem interval from skeletal material. The fluorescence generated by samples, which affects the interpretation of Raman data, is a major limitation. This study compares the effectiveness of two sample preparation techniques, chemical bleaching and scraping, in the reduction of fluorescence from bone samples during testing with Raman spectroscopy. Visual assessment of Raman spectra obtained at 1064 nm excitation following the preparation protocols indicates an overall reduction in fluorescence. Results demonstrate that scraping is more effective at resolving fluorescence than chemical bleaching. The scraping of skeletonized remains prior to Raman analysis is a less destructive method and allows for the preservation of a bone sample in a state closest to its original form, which is beneficial in forensic investigations. It is recommended that bone scraping supersedes chemical bleaching as the preferred method for sample preparation prior to Raman spectroscopy. © 2018 American Academy of Forensic Sciences.

  20. Influence of feed provisioning prior to digesta sampling on precaecal amino acid digestibility in broiler chickens.

    Science.gov (United States)

    Siegert, Wolfgang; Ganzer, Christian; Kluth, Holger; Rodehutscord, Markus

    2018-06-01

    A regression approach was applied to determine the influence of feed provisioning prior to digesta sampling on precaecal (pc) amino acid (AA) digestibility in broiler chickens. Soybean meal was used as an example test ingredient. Five feed-provisioning protocols were investigated, four with restricted provision and one with ad libitum provision. When provision was restricted, feed was provided for 30 min after a withdrawal period of 12 h. Digesta were sampled 1, 2, 4 and 6 h after feeding commenced. A diet containing 300 g maize starch/kg was prepared. Half or all the maize starch was replaced with soybean meal in two other diets. Average pc digestibility of all determined AA in the soybean meal was 86% for the 4 and 6-h protocols and 66% and 60% for the 2 and 1-h protocols, respectively. Average pc AA digestibility of soybean meal was 76% for ad libitum feed provision. Feed provisioning also influenced the determined variance. Variance in digestibility ranked in magnitude 1 h > ad libitum > 2 h > 6 h > 4 h for all AA. Owing to the considerable influence of feed-provisioning protocols found in this study, comparisons of pc AA digestibility between studies applying different protocols prior to digesta sampling must be treated with caution. Digestibility experiments aimed at providing estimates for practical feed formulation should use feed-provisioning procedures similar to those used in practice.

  1. New and conventional evaporative systems in concentrating nitrogen samples prior to isotope-ratio analysis

    International Nuclear Information System (INIS)

    Lober, R.W.; Reeder, J.D.; Porter, L.K.

    1987-01-01

    Studies were conducted to quantify and compare the efficiencies of various evaporative systems used in evaporating 15N samples prior to mass spectrometric analysis. Two new forced-air systems were designed and compared with a conventional forced-air system and with an open-air dry bath technique for effectiveness in preventing atmospheric contamination of evaporating samples. The forced-air evaporative systems significantly reduced the time needed to evaporate samples as compared to the open-air dry bath technique; samples were evaporated to dryness in 2.5 h with the forced-air systems as compared to 8 to 10 h on the open-air dry bath. The effectiveness of a given forced-air system in preventing atmospheric contamination of evaporating samples was significantly affected by the flow rate of the air stream flowing over the samples. The average atmospheric contaminant N found in samples evaporated on the open-air dry bath was 0.3 μg N, indicating very low concentrations of atmospheric NH3 during this study. However, in previous studies the authors have experienced significant contamination of 15N samples evaporated on an open-air dry bath, because the level of contaminant N in the laboratory atmosphere varied and could not be adequately controlled. Average cross-contamination levels of 0.28, 0.20, and 1.01 μg of N were measured between samples evaporated on the open-air dry bath, the newly-designed forced-air system, and the conventional forced-air system, respectively. The cross-contamination level is significantly higher on the conventional forced-air system than on the other two systems, and could significantly alter the atom % 15N of high-enriched, low [N] evaporating samples.

  2. Sample preparation

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    Sample preparation prior to HPLC analysis is certainly one of the most important steps to consider in trace or ultratrace analysis. For many years scientists have tried to simplify the sample preparation process. It is rarely possible to inject a neat liquid sample, or a sample whose preparation is no more complex than dissolution in a given solvent. The latter process alone can remove insoluble materials, which is especially helpful with samples in complex matrices if other interactions do not affect extraction. Here, it is very likely that a large number of components will not dissolve and are therefore eliminated by a simple filtration process. In most cases, the process of sample preparation is not as simple as dissolution of the component of interest. At times, enrichment is necessary; that is, the component of interest is present in a very large volume or mass of material and needs to be concentrated in some manner so that a small volume of the concentrated or enriched sample can be injected into the HPLC system. 88 refs

  3. Sampling, feasibility, and priors in Bayesian estimation

    OpenAIRE

    Chorin, Alexandre J.; Lu, Fei; Miller, Robert N.; Morzfeld, Matthias; Tu, Xuemin

    2015-01-01

    Importance sampling algorithms are discussed in detail, with an emphasis on implicit sampling, and applied to data assimilation via particle filters. Implicit sampling makes it possible to use the data to find high-probability samples at relatively low cost, making the assimilation more efficient. A new analysis of the feasibility of data assimilation is presented, showing in detail why feasibility depends on the Frobenius norm of the covariance matrix of the noise and not on the number of va...

  4. Extreme robustness of scaling in sample space reducing processes explains Zipf’s law in diffusion on directed networks

    International Nuclear Information System (INIS)

    Corominas-Murtra, Bernat; Hanel, Rudolf; Thurner, Stefan

    2016-01-01

    It has been shown recently that a specific class of path-dependent stochastic processes, which reduce their sample space as they unfold, leads to exact scaling laws in frequency and rank distributions. Such sample space reducing processes (SSRPs) offer an alternative new mechanism for understanding the emergence of scaling in countless processes. The corresponding power law exponents were shown to be related to noise levels in the process. Here we show that the emergence of scaling is not limited to the simplest SSRPs, but holds for a huge domain of stochastic processes that are characterised by non-uniform prior distributions. We demonstrate mathematically that in the absence of noise the scaling exponents converge to −1 (Zipf’s law) for almost all prior distributions. As a consequence it becomes possible to fully understand targeted diffusion on weighted directed networks and its associated scaling laws in node visit distributions. The presence of cycles can be properly interpreted as playing the same role as noise in SSRPs and, accordingly, determines the scaling exponents. The result that Zipf’s law emerges as a generic feature of diffusion on networks, regardless of its details, and that the exponent of visiting times is related to the amount of cycles in a network could be relevant for a series of applications in traffic-, transport- and supply chain management. (paper)
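
    The noiseless SSRP is simple enough to simulate directly, which makes the Zipf exponent easy to check numerically (a minimal sketch; the paper's general setting with non-uniform priors and noise is richer):

        import numpy as np

        rng = np.random.default_rng(7)

        def ssrp_visits(N=1000, runs=20000):
            """Visit distribution of a noiseless sample space reducing process:
            from state x jump uniformly to {1, ..., x-1} until reaching 1."""
            visits = np.zeros(N + 1)
            for _ in range(runs):
                x = N                            # start at the largest state
                while x > 1:
                    x = rng.integers(1, x)       # sample space shrinks each step
                    visits[x] += 1
            return visits[1:] / visits[1:].sum()

        p = ssrp_visits()
        # Slope of log p(x) vs log x should be close to -1 (Zipf's law).
        print(np.polyfit(np.log(np.arange(1, 51)), np.log(p[:50]), 1)[0])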

  5. Freezing fecal samples prior to DNA extraction affects the Firmicutes to Bacteroidetes ratio determined by downstream quantitative PCR analysis

    DEFF Research Database (Denmark)

    Bahl, Martin Iain; Bergström, Anders; Licht, Tine Rask

    2012-01-01

    Freezing stool samples prior to DNA extraction and downstream analysis is widely used in metagenomic studies of the human microbiota but may affect the inferred community composition. In this study, DNA was extracted either directly or following freeze storage of three homogenized human fecal...

  7. Data Validation Package May 2016 Groundwater Sampling at the Lakeview, Oregon, Processing Site August 2016

    Energy Technology Data Exchange (ETDEWEB)

    Linard, Joshua [USDOE Office of Legacy Management, Washington, DC (United States); Hall, Steve [Navarro Research and Engineering, Inc., Oak Ridge, TN (United States)

    2016-08-01

    This biennial event includes sampling five groundwater locations (four monitoring wells and one domestic well) at the Lakeview, Oregon, Processing Site. For this event, the domestic well (location 0543) could not be sampled because no one was in residence during the sampling event (Note: notification was provided to the resident prior to the event). Per Appendix A of the Groundwater Compliance Action Plan, sampling is conducted to monitor groundwater quality on a voluntary basis. Sampling and analyses were conducted as specified in the Sampling and Analysis Plan for U.S. Department of Energy Office of Legacy Management Sites (LMS/PRO/S04351, continually updated). One duplicate sample was collected from location 0505. Water levels were measured at each sampled monitoring well. The constituents monitored at the Lakeview site are manganese and sulfate. Monitoring locations that exceeded the U.S. Environmental Protection Agency (EPA) Secondary Maximum Contaminant Levels for these constituents are listed in Table 1. Review of the time-concentration graphs included in this report indicates that manganese and sulfate concentrations are consistent with historical measurements.

  8. Separation and enrichment of gold(III) from environmental samples prior to its flame atomic absorption spectrometric determination

    International Nuclear Information System (INIS)

    Senturk, Hasan Basri; Gundogdu, Ali; Bulut, Volkan Numan; Duran, Celal; Soylak, Mustafa; Elci, Latif; Tufekci, Mehmet

    2007-01-01

    A simple and accurate method was developed for the separation and enrichment of trace levels of gold in environmental samples. The method is based on the adsorption of the Au(III)-diethyldithiocarbamate complex on Amberlite XAD-2000 resin prior to the analysis of gold by flame atomic absorption spectrometry after elution with 1 mol L⁻¹ HNO₃ in acetone. Parameters including nitric acid concentration, eluent type, matrix ions, sample volume, sample flow rate and adsorption capacity were investigated with respect to the recovery of gold(III). The recovery of gold(III) was greater than 95% and the detection limit was 16.6 μg L⁻¹. The preconcentration factor was 200. The relative standard deviation of the method was <6%, and the adsorption capacity of the resin was 12.3 mg g⁻¹. The validity of the presented procedure was checked by the analysis of the CRM-SA-C Sandy Soil certified reference material. The procedure was applied to the determination of gold in some environmental samples.

  9. Cost-constrained optimal sampling for system identification in pharmacokinetics applications with population priors and nuisance parameters.

    Science.gov (United States)

    Sorzano, Carlos Oscar S.; Pérez-De-La-Cruz Moreno, Maria Angeles; Burguet-Castell, Jordi; Montejo, Consuelo; Ros, Antonio Aguilar

    2015-06-01

    Pharmacokinetics (PK) applications can be seen as a special case of nonlinear, causal systems with memory. There are cases in which prior knowledge exists about the distribution of the system parameters in a population. However, for a specific patient in a clinical setting, we need to determine her system parameters so that the therapy can be personalized. This system identification is typically performed by measuring drug concentrations in plasma. The objective of this work is to provide an irregular sampling strategy that minimizes the uncertainty about the system parameters with a fixed number of samples (cost constrained). We use Monte Carlo simulations to estimate the average Fisher information matrix associated with the PK problem, and then estimate the sampling points that minimize the maximum uncertainty associated with the system parameters (a minimax criterion). The minimization is performed with a genetic algorithm. We show that such a sampling scheme can be designed in a way that is adapted to a particular patient, that it can accommodate any dosing regimen, and that it allows flexible therapeutic strategies. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
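    The design loop sketched in the abstract (Monte Carlo averaging of the Fisher information over the population prior, then a minimax search over sampling times) can be prototyped compactly. The sketch below is only an illustration under assumed ingredients: a one-compartment oral-absorption model, lognormal population priors, additive measurement noise, and a plain random search standing in for the paper's genetic algorithm; none of these specifics come from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def conc(t, ka, ke, V, dose=100.0):
    """One-compartment oral-absorption model (illustrative choice)."""
    return dose * ka / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

def avg_fim(times, n_mc=100, sigma=0.5):
    """Monte Carlo average of the Fisher information matrix over the
    population prior, using finite-difference sensitivities."""
    F = np.zeros((3, 3))
    for _ in range(n_mc):
        theta = np.exp(rng.normal(np.log([1.0, 0.2, 10.0]), 0.2))  # ka, ke, V
        J = np.empty((len(times), 3))
        for k in range(3):
            h = 1e-4 * theta[k]
            tp, tm = theta.copy(), theta.copy()
            tp[k] += h
            tm[k] -= h
            J[:, k] = (conc(times, *tp) - conc(times, *tm)) / (2 * h)
        F += J.T @ J / sigma**2
    return F / n_mc

def worst_uncertainty(times):
    # minimax criterion: largest parameter variance from the inverse FIM
    return np.diag(np.linalg.inv(avg_fim(times))).max()

# plain random search stands in for the paper's genetic algorithm
best_score, best_times = np.inf, None
for _ in range(100):
    cand = np.sort(rng.uniform(0.1, 24.0, size=4))  # cost constraint: 4 samples
    score = worst_uncertainty(cand)
    if score < best_score:
        best_score, best_times = score, cand
print("suggested sampling times (h):", np.round(best_times, 2))
```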

  10. Sample processing device and method

    DEFF Research Database (Denmark)

    2011-01-01

    A sample processing device is disclosed, which sample processing device comprises a first substrate and a second substrate, where the first substrate has a first surface comprising two area types, a first area type with a first contact angle with water and a second area type with a second contact angle with water, the first contact angle being smaller than the second contact angle. The first substrate defines an inlet system and a preparation system in areas of the first type, which two areas are separated by a barrier system in an area of the second type. The inlet system is adapted to receive...

  11. Separation and enrichment of gold(III) from environmental samples prior to its flame atomic absorption spectrometric determination

    Energy Technology Data Exchange (ETDEWEB)

    Senturk, Hasan Basri; Gundogdu, Ali [Department of Chemistry, Faculty of Arts and Sciences, Karadeniz Technical University, 61080 Trabzon (Turkey); Bulut, Volkan Numan [Department of Chemistry, Faculty of Arts and Sciences, Karadeniz Technical University, 28049 Giresun (Turkey); Duran, Celal [Department of Chemistry, Faculty of Arts and Sciences, Karadeniz Technical University, 61080 Trabzon (Turkey); Soylak, Mustafa [Department of Chemistry, Faculty of Arts and Sciences, Erciyes University, 38039 Kayseri (Turkey)], E-mail: soylak@erciyes.edu.tr; Elci, Latif [Department of Chemistry, Faculty of Arts and Sciences, Pamukkale University, 20020 Denizli (Turkey); Tufekci, Mehmet [Department of Chemistry, Faculty of Arts and Sciences, Karadeniz Technical University, 61080 Trabzon (Turkey)

    2007-10-22

    A simple and accurate method was developed for separation and enrichment of trace levels of gold in environmental samples. The method is based on the adsorption of Au(III)-diethyldithiocarbamate complex on Amberlite XAD-2000 resin prior to the analysis of gold by flame atomic absorption spectrometry after elution with 1 mol L⁻¹ HNO₃ in acetone. Some parameters including nitric acid concentration, eluent type, matrix ions, sample volume, sample flow rate and adsorption capacity were investigated on the recovery of gold(III). The recovery values for gold(III) and detection limit of gold were greater than 95% and 16.6 μg L⁻¹, respectively. The preconcentration factor was 200. The relative standard deviation of the method was <6%. The adsorption capacity of the resin was 12.3 mg g⁻¹. The validation of the presented procedure was checked by the analysis of CRM-SA-C Sandy Soil certified reference material. The presented procedure was applied to the determination of gold in some environmental samples.

  12. The Influence of Prior Knowledge on the Retrieval-Directed Function of Note Taking in Prior Knowledge Activation

    Science.gov (United States)

    Wetzels, Sandra A. J.; Kester, Liesbeth; van Merrienboer, Jeroen J. G.; Broers, Nick J.

    2011-01-01

    Background: Prior knowledge activation facilitates learning. Note taking during prior knowledge activation (i.e., note taking directed at retrieving information from memory) might facilitate the activation process by enabling learners to build an external representation of their prior knowledge. However, taking notes might be less effective in…

  13. Robotic system for process sampling

    International Nuclear Information System (INIS)

    Dyches, G.M.

    1985-01-01

    A three-axis Cartesian geometry robot for process sampling was developed at the Savannah River Laboratory (SRL) and implemented in one of the site radioisotope separations facilities. Use of the robot reduces personnel radiation exposure and contamination potential by routinely handling sample containers under operator control in a low-level radiation area. This robot represents the initial phase of a longer-term development program to use robotics for further sample automation. Preliminary design of a second-generation robot with additional capabilities is also described. 8 figs

  14. Remote sampling of process fluids in radiochemical plants

    International Nuclear Information System (INIS)

    Sengar, P.B.; Bhattacharya, R.; Ozarde, P. D.; Rana, D.S.

    1990-01-01

    Sampling of process fluids, continuous or periodic, is an essential requirement in any chemical process plant so as to keep control of process variables. In a radiochemical plant, the task of taking and conveying samples is a tricky affair: vessels and equipment containing radioactive effluents cannot be approached for manual sampling, nor can sampled fluids be handled directly. The problems become more acute with higher levels of radioactivity. As such, innovative systems have to be devised to obtain and handle radioactive samples employing remote operations. The remote sampling system developed in this Division has some unique features, such as taking only the requisite amount of sample in the microlitre range, a practically maintenance-free design, and avoidance of excess radioactive fluid coming out of process systems. The paper describes in detail the design of the remote sampling system and compares it with existing systems. The design efforts are towards simplicity of operation, obtaining homogenised representative samples, and economy in man-rem expenditure. The performance of a prototype system has also been evaluated. (author). 3 refs

  15. Chromosomal differences between acute nonlymphocytic leukemia in patients with prior solid tumors and prior hematologic malignancies. A study of 14 cases with prior breast cancer

    International Nuclear Information System (INIS)

    Mamuris, Z.; Dumont, J.; Dutrillaux, B.; Aurias, A.

    1989-01-01

    A cytogenetic study of 14 patients with secondary acute nonlymphocytic leukemia (S-ANLL) after prior treatment for breast cancer is reported. The chromosomes recurrently involved in numerical or structural anomalies are chromosomes 7, 5, 17, and 11, in decreasing order of frequency. The distribution of the anomalies detected in this sample of patients is similar to that observed in published cases with prior breast or other solid tumors (although anomalies of chromosome 11 had not been pointed out), but it differs significantly from that of S-ANLL with prior hematologic malignancies. This difference is principally due to a higher involvement of chromosome 7 in patients with prior hematologic malignancies and of chromosomes 11 and 17 in patients with prior solid tumors. A genetic determinism involving abnormal recessive alleles located on chromosomes 5, 7, 11, and 17, uncovered by deletions of the normal homologs, may be a cause of S-ANLL. The difference between patients with prior hematologic malignancies or solid tumors may be explained by different constitutional mutations of recessive genes in the two groups of patients

  16. Sample registration software for process automation in the Neutron Activation Analysis (NAA) Facility in Malaysia nuclear agency

    Energy Technology Data Exchange (ETDEWEB)

    Rahman, Nur Aira Abd, E-mail: nur-aira@nuclearmalaysia.gov.my; Yussup, Nolida; Ibrahim, Maslina Bt. Mohd; Mokhtar, Mukhlis B.; Soh Shaari, Syirrazie Bin Che; Azman, Azraf B. [Technical Support Division, Malaysian Nuclear Agency, 43000, Kajang, Selangor (Malaysia); Salim, Nazaratul Ashifa Bt. Abdullah [Division of Waste and Environmental Technology, Malaysian Nuclear Agency, 43000, Kajang, Selangor (Malaysia); Ismail, Nadiah Binti [Fakulti Kejuruteraan Elektrik, UiTM Pulau Pinang, 13500 Permatang Pauh, Pulau Pinang (Malaysia)

    2015-04-29

    Neutron Activation Analysis (NAA) has been established in Nuclear Malaysia since the 1980s. Most of the procedures established were carried out manually, including sample registration. The samples were recorded manually in a logbook and given an ID number. All samples, standards, SRMs and blanks were then recorded on the irradiation vial and on several forms prior to irradiation. These manual procedures carried out by the NAA laboratory personnel were time-consuming and inefficient. Sample registration software was developed as part of the IAEA/CRP project ‘Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)’. The objective of the project is to create PC-based data entry software for the sample preparation stage. This is an effective way to replace the redundant manual data entries that need to be completed by laboratory personnel. The software automatically generates a sample code for each sample in a batch, creates printable registration forms for administration purposes, and stores selected parameters that are passed to the sample analysis program. The software is developed using National Instruments LabVIEW 8.6.

  18. Disc valve for sampling erosive process streams

    Science.gov (United States)

    Mrochek, J.E.; Dinsmore, S.R.; Chandler, E.W.

    1986-01-07

    A four-port disc valve is described for sampling erosive, high-temperature process streams. A rotatable disc defining opposed first and second sampling cavities rotates between fixed faceplates defining flow passageways positioned to be alternately in axial alignment with the first and second cavities. Silicon carbide inserts and liners composed of α-silicon carbide are provided in the faceplates and in the sampling cavities to limit erosion while providing lubricity for smooth and precise operation under harsh process conditions. 1 fig.

  19. Automated force volume image processing for biological samples.

    Directory of Open Access Journals (Sweden)

    Pavel Polyakov

    2011-04-01

    Atomic force microscopy (AFM) has now become a powerful technique for investigating, on a molecular level, surface forces, nanomechanical properties of deformable particles, biomolecular interactions, kinetics, and dynamic processes. This paper specifically focuses on the analysis of AFM force curves collected on biological systems, in particular, bacteria. The goal is to provide fully automated tools to achieve theoretical interpretation of force curves on the basis of adequate, available physical models. In this respect, we propose two algorithms, one for the processing of approach force curves and another for the quantitative analysis of retraction force curves. In the former, electrostatic interactions prior to contact between the AFM probe and the bacterium are accounted for, and mechanical interactions operating after contact are described in terms of the Hertz-Hooke formalism. Retraction force curves are analyzed on the basis of the Freely Jointed Chain model. For both algorithms, the quantitative reconstruction of force curves is based on the robust detection of critical points (jumps, changes of slope or changes of curvature) which mark the transitions between the various relevant interactions taking place between the AFM tip and the studied sample during approach and retraction. Once the key regions of separation distance and indentation are detected, the physical parameters describing the relevant interactions operating in these regions are extracted by fitting experiments to theory with a regression procedure. The flexibility, accuracy and strength of the algorithms are illustrated with the processing of two force-volume images, which collect a large set of approach and retraction curves measured on a single biological surface. For each force-volume image, several maps are generated, representing the spatial distribution of the searched physical parameters as estimated for each pixel of the force-volume image.

  20. Determination of some organophosphorus pesticides in water and watermelon samples by microextraction prior to high-performance liquid chromatography.

    Science.gov (United States)

    Wang, Chun; Wu, Qiuhua; Wu, Chunxia; Wang, Zhi

    2011-11-01

    A novel method based on simultaneous liquid-liquid microextraction and carbon-nanotube-reinforced hollow fiber microporous membrane solid-liquid phase microextraction has been developed for the determination of six organophosphorus pesticides, i.e. isocarbophos, phosmet, parathion-methyl, triazophos, fonofos and phoxim, in water and watermelon samples prior to high-performance liquid chromatography (HPLC). Under the optimum conditions, the method shows good linearity within a range of 1-200 ng/mL for water samples and 5-200 ng/g for watermelon samples, with correlation coefficients (r) varying from 0.9990 to 0.9997 and 0.9986 to 0.9995, respectively. The limits of detection (LODs) were in the range of 0.1-0.3 ng/mL for water samples and 1.0-1.5 ng/g for watermelon samples. The recoveries of the method at spiking levels of 5.0 and 50.0 ng/mL for water samples were between 85.4 and 100.8%, and at spiking levels of 5.0 and 50.0 ng/g for watermelon samples between 82.6 and 92.4%, with relative standard deviations (RSDs) of 4.5-6.9% and 5.2-7.4%, respectively. The results suggest that the developed method represents a simple, low-cost procedure with high analyte preconcentration and excellent sample cleanup for the determination of organophosphorus pesticides in water and watermelon samples. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Molecularly imprinted polymers for extraction of malachite green from fish samples prior to its determination by HPLC

    International Nuclear Information System (INIS)

    Li, Lu; Chen, Xiao-mei; Zhang, Hong-yuan; Lin, Yi-dong; Lin, Zheng-zhong; Huang, Zhi-yong; Lai, Zhu-zhi

    2015-01-01

    Molecularly imprinted polymer (MIP) particles for malachite green (MG) were prepared by emulsion polymerization using methacrylic acid as the functional monomer, ethylene glycol dimethacrylate as the cross-linker, and a combination of Span-80 and Tween-80 as the emulsifier. The MIP particles were characterized by SEM micrographs and FT-IR spectra. Their binding capacity for MG was evaluated in kinetic and isothermal adsorption experiments and compared to non-imprinted polymer particles. Analytical figures of merit include an adsorption equilibrium time of 15 min, an adsorption capacity of 1.9 mg·g⁻¹ in acetonitrile-water (20:80), and an imprinting factor of 1.85. The MIP particles were successfully applied to the extraction of MG from fish samples spiked with MG and other interfering substances prior to the determination of MG by HPLC. Spiked samples gave recoveries of MG that ranged from 86 to 104%, much higher than those of the other interfering substances. (author)

  2. A Bayesian Justification for Random Sampling in Sample Survey

    Directory of Open Access Journals (Sweden)

    Glen Meeden

    2012-07-01

    In the usual Bayesian approach to survey sampling, the sampling design plays a minimal role, at best. Although a close relationship between exchangeable prior distributions and simple random sampling has been noted, how to formally integrate simple random sampling into the Bayesian paradigm is not clear. Recently it has been argued that the sampling design can be thought of as part of a Bayesian's prior distribution. We show here that under this scenario simple random sampling can be given a Bayesian justification in survey sampling.

  3. Non-Contact Conductivity Measurement for Automated Sample Processing Systems

    Science.gov (United States)

    Beegle, Luther W.; Kirby, James P.

    2012-01-01

    A new method has been developed for monitoring and control of automated sample processing and preparation, especially focusing on desalting of samples before analytical analysis (described in more detail in Automated Desalting Apparatus (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid phase sample preparation media, allows monitoring of the process, and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of sample processing protocols, and greatly minimizes use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement defines three key conditions for the sample preparation process: first, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement will not only facilitate automation of the sample preparation and processing, but will also act as a way to optimize the operational time and use of consumables.
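    The trigger logic described above is essentially a small state machine: each preparation step waits until the outlet probe confirms the conductivity state that step is defined by. The Python sketch below illustrates only that idea; the thresholds, step names, and simulated probe readings are invented for the example and do not come from the NASA system.

```python
LOW, HIGH = 50.0, 500.0   # hypothetical trigger thresholds (uS/cm)

STEPS = [
    ("neutralize (flush with de-ionized water)", lambda c: c < LOW),
    ("acidify (strong acid wash)",               lambda c: c > HIGH),
    ("basify (high-pH wash)",                    lambda c: c > HIGH),
]

def run_protocol(read_conductivity):
    """Advance the preparation sequence only when the outlet probe
    reports the conductivity state the current step is waiting for."""
    for name, reached in STEPS:
        reading = read_conductivity()
        while not reached(reading):
            reading = read_conductivity()  # keep flushing
        print(f"{name}: triggered at {reading:.0f} uS/cm")

# simulated probe readings standing in for real hardware
readings = iter([800.0, 300.0, 40.0, 120.0, 900.0, 200.0, 700.0])
run_protocol(lambda: next(readings))
```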

  4. On incomplete sampling under birth-death models and connections to the sampling-based coalescent.

    Science.gov (United States)

    Stadler, Tanja

    2009-11-07

    The constant rate birth-death process is used as a stochastic model for many biological systems, for example phylogenies or disease transmission. As the biological data are usually not fully available, it is crucial to understand the effect of incomplete sampling. In this paper, we analyze the constant rate birth-death process with incomplete sampling. We derive the density of the bifurcation events for trees on n leaves which evolved under this birth-death-sampling process. This density is used for calculating prior distributions in Bayesian inference programs and for efficiently simulating trees. We show that the birth-death-sampling process can be interpreted as a birth-death process with reduced rates and complete sampling. This shows that joint inference of birth rate, death rate and sampling probability is not possible. The birth-death-sampling process is compared to the sampling-based population genetics model, the coalescent. It is shown that despite many similarities between these two models, the distribution of bifurcation times remains different even in the case of very large population sizes. We illustrate these findings on a Hepatitis C virus dataset from Egypt. We show that the transmission time estimates are significantly different; the widely used Gamma statistic even changes its sign from negative to positive when switching from the coalescent to the birth-death process.
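    A quick way to build intuition for the birth-death-sampling process is to simulate it forward and compare against the probability p0(t) that a lineage leaves no sampled extant descendants. The sketch below does this for assumed rates; the closed form used for p0 is the standard one quoted in the general birth-death-sampling literature, not copied from this paper's notation, so treat the comparison as illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

def no_sampled_descendants(lam, mu, rho, t):
    """Simulate one birth-death lineage forward for time t, Bernoulli-
    sample the extant descendants with probability rho, and return True
    if none of them are sampled."""
    alive = [t]        # remaining time for each live lineage
    extant = 0
    while alive:
        rem = alive.pop()
        w = rng.exponential(1.0 / (lam + mu))  # waiting time to next event
        if w >= rem:                           # survives to the present
            extant += 1
        elif rng.random() < lam / (lam + mu):  # birth: two daughters
            alive += [rem - w, rem - w]
        # else: death, lineage ends
    return all(rng.random() > rho for _ in range(extant))

lam, mu, rho, t = 1.0, 0.5, 0.3, 2.0
mc = np.mean([no_sampled_descendants(lam, mu, rho, t) for _ in range(20000)])
# standard closed form for p0(t) under birth-death with extant sampling
c = lam - mu
p0 = 1 - rho * c / (rho * lam + (lam * (1 - rho) - mu) * np.exp(-c * t))
print(f"Monte Carlo: {mc:.3f}   analytic p0: {p0:.3f}")
```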

  5. Memory-related functional connectivity in visual processing regions varies by prior emotional context.

    Science.gov (United States)

    Bowen, Holly J; Kensinger, Elizabeth A

    2017-09-06

    Memory retrieval involves the reactivation of processes that were engaged at encoding. Using a Generalized Linear Model to test for effects of valence, our prior study suggested that memory for information previously encoded in a negative context reengages sensory processing regions at retrieval to a greater extent than information encoded in a positive context. Here, we used partial least squares analyses of the same dataset to determine whether this valence-specific processing was one of the dominant patterns in the retrieval data. Trials previously paired with a face revealed a strong pattern of emotion that did not vary by valence, but for trials previously paired with a scene, an extensive network of regions was active during recollection of trials paired with negative content. These same regions were negatively correlated with recollection of trials paired with positive content. These results confirm that, despite no emotional content being present at the time of retrieval, strong patterns of emotional study context are present in the data. Moreover, at least for trials paired with scenes at encoding, valence-specific networks are linked to episodic memory recollection, providing further support for recapitulation of sensory processing during recollection of negative emotional information.

  6. Appropriate Handling, Processing and Analysis of Blood Samples Is Essential to Avoid Oxidation of Vitamin C to Dehydroascorbic Acid.

    Science.gov (United States)

    Pullar, Juliet M; Bayer, Simone; Carr, Anitra C

    2018-02-11

    Vitamin C (ascorbate) is the major water-soluble antioxidant in plasma and its oxidation to dehydroascorbic acid (DHA) has been proposed as a marker of oxidative stress in vivo. However, controversy exists in the literature around the amount of DHA detected in blood samples collected from various patient cohorts. In this study, we report on DHA concentrations in a selection of different clinical cohorts (diabetes, pneumonia, cancer, and critically ill). All clinical samples were collected into EDTA anticoagulant tubes and processed at 4 °C prior to storage at -80 °C for subsequent analysis by HPLC with electrochemical detection. We also investigated the effects of different handling and processing conditions on short-term and long-term ascorbate and DHA stability in vitro and in whole blood and plasma samples. These conditions included metal chelation, anticoagulants (EDTA and heparin), and processing temperatures (ice, 4 °C and room temperature). Analysis of our clinical cohorts indicated very low to negligible DHA concentrations. Samples exhibiting haemolysis contained significantly higher concentrations of DHA. Metal chelation inhibited oxidation of vitamin C in vitro, confirming the involvement of contaminating metal ions. Although EDTA is an effective metal chelator, its complexes with transition metal ions are still redox active; thus, its use as an anticoagulant can facilitate metal ion-dependent oxidation of vitamin C in whole blood and plasma. Handling and processing blood samples on ice (or at 4 °C) delayed oxidation of vitamin C by a number of hours. A review of the literature regarding DHA concentrations in clinical cohorts highlighted the fact that studies using colourimetric or fluorometric assays reported significantly higher concentrations of DHA compared to those using HPLC with electrochemical detection. In conclusion, careful handling and processing of samples, combined with appropriate analysis, is crucial for accurate determination of ascorbate

  7. Quality evaluation of processed clay soil samples.

    Science.gov (United States)

    Steiner-Asiedu, Matilda; Harrison, Obed Akwaa; Vuvor, Frederick; Tano-Debrah, Kwaku

    2016-01-01

    This study assessed the microbial quality of clay samples sold on two of the major Ghanaian markets. It was a cross-sectional study evaluating processed clay and the effects it has on the nutrition of consumers in the capital of Ghana. The items examined were processed clay soil samples. Staphylococcus spp. and fecal coliforms, including Klebsiella, Escherichia, Shigella and Enterobacter spp., were isolated from the clay samples. Samples from the Kaneshie market in Accra recorded the highest total viable count (6.5 log cfu/g) and staphylococcal count (5.8 log cfu/g). For fecal coliforms, Madina market samples had the highest count (6.5 log cfu/g) and also recorded the highest levels of yeast and mould. For Koforidua, the total viable count was highest in samples from the Zongo market (6.3 log cfu/g). Central market samples had the highest counts of fecal coliforms (4.6 log cfu/g) and of yeasts and moulds (6.5 log cfu/g). The "Small" market recorded the highest staphylococcal count (6.2 log cfu/g). The water activity of the clay samples was low, ranging between 0.65±0.01 and 0.66±0.00 for samples collected from Koforidua and Accra, respectively. The clay samples were found to contain Klebsiella spp., Escherichia, Enterobacter, Shigella spp., Staphylococcus spp., yeast and mould. These findings have health implications when the clay is consumed.

  8. Example-driven manifold priors for image deconvolution.

    Science.gov (United States)

    Ni, Jie; Turaga, Pavan; Patel, Vishal M; Chellappa, Rama

    2011-11-01

    Image restoration methods that exploit prior information about images to be estimated have been extensively studied, typically using the Bayesian framework. In this paper, we consider the role of prior knowledge of the object class in the form of a patch manifold to address the deconvolution problem. Specifically, we incorporate unlabeled image data of the object class, say natural images, in the form of a patch-manifold prior for the object class. The manifold prior is implicitly estimated from the given unlabeled data. We show how the patch-manifold prior effectively exploits the available sample class data for regularizing the deblurring problem. Furthermore, we derive a generalized cross-validation (GCV) function to automatically determine the regularization parameter at each iteration without explicitly knowing the noise variance. Extensive experiments show that this method performs better than many competitive image deconvolution methods.
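    To make the GCV idea concrete, the sketch below applies it to a plain Tikhonov-regularized deconvolution; the paper's manifold prior is replaced by a simple quadratic penalty purely for illustration. Via the SVD of the blur matrix, GCV(lambda) = n * ||(I - A_lambda) y||^2 / tr(I - A_lambda)^2 can be evaluated cheaply from the filter factors and minimized over a grid, with no knowledge of the noise variance.

```python
import numpy as np

rng = np.random.default_rng(3)

# forward model: circular convolution with a Gaussian blur kernel
n = 128
x = np.zeros(n); x[30:50] = 1.0; x[70] = 2.0            # toy signal
k = np.exp(-0.5 * ((np.arange(n) - n // 2) / 3.0) ** 2)
A = np.stack([np.roll(k / k.sum(), i - n // 2) for i in range(n)])
y = A @ x + 0.01 * rng.standard_normal(n)               # blurred + noise

U, s, Vt = np.linalg.svd(A)
b = U.T @ y

def gcv(lam):
    """GCV score n*||(I - A_lam) y||^2 / tr(I - A_lam)^2, computed via
    the SVD filter factors f_i = s_i^2 / (s_i^2 + lam)."""
    f = s**2 / (s**2 + lam)
    resid = np.sum(((1 - f) * b) ** 2)
    return n * resid / (n - f.sum()) ** 2

lams = np.logspace(-8, 0, 60)
lam_best = lams[np.argmin([gcv(l) for l in lams])]
x_hat = Vt.T @ (s / (s**2 + lam_best) * b)              # Tikhonov solution
print("GCV-selected lambda:", lam_best)
print("relative error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```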

  9. Robust nonhomogeneous training samples detection method for space-time adaptive processing radar using sparse-recovery with knowledge-aided

    Science.gov (United States)

    Li, Zhihui; Liu, Hanwei; Zhang, Yongshun; Guo, Yiduo

    2017-10-01

    The performance of space-time adaptive processing (STAP) may degrade significantly when some of the training samples are contaminated by signal-like components (outliers) in nonhomogeneous clutter environments. To remove training samples contaminated by outliers in such environments, a robust nonhomogeneous training sample detection method using sparse recovery (SR) with knowledge aiding (KA) is proposed. First, a reduced-dimension (RD) overcomplete spatial-temporal steering dictionary is designed with prior knowledge of the system parameters and the possible target region. Second, the clutter covariance matrix (CCM) of the cell under test is efficiently estimated using a modified focal underdetermined system solver (FOCUSS) algorithm, in which the RD overcomplete spatial-temporal steering dictionary is applied. Third, the proposed statistics are formed by combining the estimated CCM with the generalized inner products (GIP) method, and the contaminated training samples can be detected and removed. Finally, several simulation results validate the effectiveness of the proposed KA-SR-GIP method.

  10. Application of acetone acetals as water scavengers and derivatization agents prior to the gas chromatographic analysis of polar residual solvents in aqueous samples.

    Science.gov (United States)

    van Boxtel, Niels; Wolfs, Kris; Van Schepdael, Ann; Adams, Erwin

    2015-12-18

    The sensitivity of gas chromatography (GC) combined with the full evaporation technique (FET) for the analysis of aqueous samples is limited by the maximum tolerable sample volume in a headspace vial. Using an acetone acetal as a water scavenger prior to FET-GC analysis proved to be a useful and versatile tool for the analysis of high-boiling analytes in aqueous samples. 2,2-Dimethoxypropane (DMP) was used in this case, yielding methanol and acetone as the reaction products with water. These solvents are relatively volatile and were easily removed by evaporation, enabling sample enrichment and a 10-fold improvement in sensitivity compared to the standard 10 μL FET sample volumes for a selection of typical high-boiling polar residual solvents in water. This could be improved even further if more sample were used. The method was applied to the determination of residual NMP in an aqueous solution of a cefotaxime analogue and proved considerably better than conventional static headspace (sHS) and the standard FET approach. The methodology was also applied to determine trace amounts of ethylene glycol (EG) in aqueous samples such as contact lens fluids, where scavenging the water avoids laborious extraction prior to derivatization. During this experiment it was revealed that DMP reacts quantitatively with EG to form 2,2-dimethyl-1,3-dioxolane (2,2-DD) under the proposed reaction conditions. The relatively high volatility (bp 93 °C) of 2,2-DD makes it possible to analyze EG using the sHS methodology, making additional derivatization reactions superfluous. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Reproducibility of Serum Potassium Values in Serum From Blood Samples Stored for Increasing Times Prior to Centrifugation and Analysis.

    Science.gov (United States)

    Harper, Aaron; Lu, Chuanyong; Sun, Yi; Garcia, Rafael; Rets, Anton; Alexis, Herol; Saad, Heba; Eid, Ikram; Harris, Loretta; Marshall, Barbara; Tafani, Edlira; Pincus, Matthew R

    2016-05-01

    The goal of this work was to determine if immediate versus postponed centrifugation of samples affects the levels of serum potassium. Twenty participants donated normal venous blood that was collected in four serum separator tubes per donor, each of which was analyzed at 0, 1, 2, or 4 hr on the Siemens Advia 1800 autoanalyzer. Coefficients of variation (CVs) for potassium levels ranged from 0% to 7.6% with a mean of 3 ± 2%. ANOVA testing of the means for all 20 samples showed a P-value of 0.72 (>0.05) indicating that there was no statistically significant difference between the means of the samples at the four time points. Sixteen samples were found to have CVs that were ≤5%. Two samples showed increases of potassium from the reference range to levels higher than the upper reference limit, one of which had a 4-hr value that was within the reference or normal range (3.5-5 mEq/l). Overall, most samples were found to have reproducible levels of serum potassium. Serum potassium levels from stored whole blood collected in serum separator tubes are, for the most part, stable at room temperature for at least 4 hr prior to analysis. However, some samples can exhibit significant fluctuations of values. © 2015 Wiley Periodicals, Inc.
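    The two statistics the study relies on, per-donor coefficients of variation across time points and a one-way ANOVA across the four storage times, are straightforward to reproduce. The sketch below uses invented potassium values, not the study data.

```python
import numpy as np
from scipy import stats

# potassium (mEq/L) for a few donors measured at 0, 1, 2 and 4 h
# (illustrative numbers only)
k = np.array([[4.1, 4.2, 4.1, 4.3],
              [3.9, 3.9, 4.0, 4.1],
              [4.5, 4.4, 4.6, 4.5]])

# per-donor coefficient of variation across the four time points
cv = k.std(axis=1, ddof=1) / k.mean(axis=1) * 100
print("per-donor CV (%):", np.round(cv, 1))

# one-way ANOVA comparing the time-point means (columns)
f, p = stats.f_oneway(*k.T)
print(f"ANOVA: F = {f:.2f}, p = {p:.2f}  (p > 0.05 -> no time effect)")
```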

  12. Reproducing kernel Hilbert spaces of Gaussian priors

    NARCIS (Netherlands)

    Vaart, van der A.W.; Zanten, van J.H.; Clarke, B.; Ghosal, S.

    2008-01-01

    We review definitions and properties of reproducing kernel Hilbert spaces attached to Gaussian variables and processes, with a view to applications in nonparametric Bayesian statistics using Gaussian priors. The rate of contraction of posterior distributions based on Gaussian priors can be described

  13. Representative process sampling for reliable data analysis

    DEFF Research Database (Denmark)

    Julius, Lars Petersen; Esbensen, Kim

    2005-01-01

    (sampling variances) can be reduced greatly however, and sampling biases can be eliminated completely, by respecting a simple set of rules and guidelines provided by TOS. A systematic approach for description of process heterogeneity furnishes in-depth knowledge about the specific variability of any 1-D lot...

  14. ISOLOK VALVE ACCEPTANCE TESTING FOR DWPF SME SAMPLING PROCESS

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, T.; Hera, K.; Coleman, C.; Jones, M.; Wiedenman, B.

    2011-12-05

    Evaluation of the Defense Waste Processing Facility (DWPF) Chemical Process Cell (CPC) cycle time identified several opportunities to improve the CPC processing time. Of the opportunities, a focus area related to optimizing the equipment and efficiency of the sample turnaround time for DWPF Analytical Laboratory was identified. The Mechanical Systems & Custom Equipment Development (MS&CED) Section of the Savannah River National Laboratory (SRNL) evaluated the possibility of using an Isolok® sampling valve as an alternative to the Hydragard® valve for taking process samples. Previous viability testing was conducted with favorable results using the Isolok sampler and reported in SRNL-STI-2010-00749 (1). This task has the potential to improve operability, reduce maintenance time and decrease CPC cycle time. This report summarizes the results from acceptance testing which was requested in Task Technical Request (TTR) HLW-DWPF-TTR-2010-0036 (2) and which was conducted as outlined in Task Technical and Quality Assurance Plan (TTQAP) SRNL-RP-2011-00145 (3). The Isolok to be tested is the same model which was tested, qualified, and installed in the Sludge Receipt Adjustment Tank (SRAT) sample system. RW-0333P QA requirements apply to this task. This task was to qualify the Isolok sampler for use in the DWPF Slurry Mix Evaporator (SME) sampling process. The Hydragard, which is the current baseline sampling method, was used for comparison to the Isolok sampling data. The Isolok sampler is an air powered grab sampler used to 'pull' a sample volume from a process line. The operation of the sampler is shown in Figure 1. The image on the left shows the Isolok's spool extended into the process line and the image on the right shows the sampler retracted and then dispensing the liquid into the sampling container. To determine tank homogeneity, a Coliwasa sampler was used to grab samples at a high and low location within the mixing tank. Data from

  15. Superflux chlorophyll-a analysis: An assessment of variability in results introduced prior to fluorometric analysis. [chesapeake bay and shelf regions

    Science.gov (United States)

    Cibik, S. J.; Rutledge, C. K.; Robertson, C. N.

    1981-01-01

    Several experiments were undertaken to identify variability in results arising from procedural differences in the processing of chlorophyll samples prior to fluorometric analysis. T-tests on group means indicated that significant differences (alpha = 0.05) in phaeopigment a concentrations did result for samples not initially screened, but not in the chlorophyll a concentrations. Highly significant differences (alpha = 0.001) in group means were found for samples held in acetone after filtering as compared to unfiltered seawater samples held for the same period. No difference in results was found between samples given a 24-hour extraction and samples processed immediately.

  16. Sample processing procedures and radiocarbon dating

    International Nuclear Information System (INIS)

    Svetlik, Ivo; Tomaskova, Lenka; Dreslerova, Dagmar

    2010-01-01

    The article outlines radiocarbon dating routines and highlights the potential and limitations of this method. The authors' institutions have been jointly running a conventional radiocarbon dating laboratory using the international CRL code. A procedure based on the synthesis of benzene is used. Small samples are sent abroad for dating because no AMS instrumentation is available in the Czech Republic so far. Our laboratory plans to introduce routines for the processing of milligram samples and the preparation of graphitized targets for AMS

  17. Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.

    Science.gov (United States)

    Blutke, Andreas; Wanke, Rüdiger

    2018-03-06

    In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often limited number of available animals of these models, establishment of (biobank) collections of adequately processed tissue samples suited for a broad spectrum of subsequent analysis methods, including analyses not specified at the time point of sampling, represents a meaningful approach to take full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of the sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimens for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for generation of representative, multi-purpose biobank specimens from porcine tissues is presented. The methods described here include determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point-counting, determination of the extent of tissue shrinkage related to histological embedding of samples, and generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical
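    One quantitative step mentioned above, volume estimation by systematic point counting, reduces to the Cavalieri principle: overlay a systematically spaced point grid with a uniform random offset on each section and accumulate V ≈ t · (a/p) · ΣP, where t is the section spacing, a/p the area per grid point, and ΣP the total point count. A minimal sketch on a synthetic binary organ mask (all dimensions invented):

```python
import numpy as np

rng = np.random.default_rng(7)

def cavalieri_volume(mask, dz, dx, dy, grid=5):
    """Point-counting volume estimate: a systematic grid with a uniform
    random offset is overlaid on each section; each grid point hitting
    tissue contributes the area a/p = (grid*dx)*(grid*dy)."""
    ox, oy = rng.integers(0, grid, size=2)   # random grid offset
    hits = mask[:, oy::grid, ox::grid].sum()
    area_per_point = (grid * dx) * (grid * dy)
    return hits * area_per_point * dz        # V = t * (a/p) * sum(P)

# toy "organ": a sphere of radius 20 voxels in a 50^3 stack
z, y, x = np.ogrid[-25:25, -25:25, -25:25]
organ = x**2 + y**2 + z**2 <= 20**2

est = cavalieri_volume(organ, dz=1.0, dx=1.0, dy=1.0)
true = 4 / 3 * np.pi * 20**3
print(f"point-count estimate: {est:.0f}   true volume: {true:.0f}")
```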

  18. Fractional Gaussian noise: Prior specification and model comparison

    KAUST Repository

    Sørbye, Sigrunn Holbek

    2017-07-07

    Fractional Gaussian noise (fGn) is a stationary stochastic process used to model antipersistent or persistent dependency structures in observed time series. Properties of the autocovariance function of fGn are characterised by the Hurst exponent (H), which, in Bayesian contexts, typically has been assigned a uniform prior on the unit interval. This paper argues why a uniform prior is unreasonable and introduces the use of a penalised complexity (PC) prior for H. The PC prior is computed to penalise divergence from the special case of white noise and is invariant to reparameterisations. An immediate advantage is that the exact same prior can be used for the autocorrelation coefficient ϕ of a first-order autoregressive process AR(1), as this model also reflects a flexible version of white noise. Within the general setting of latent Gaussian models, this allows us to compare an fGn model component with AR(1) using Bayes factors, avoiding the confounding effects of prior choices for the two hyperparameters H and ϕ. Among others, this is useful in climate regression models where inference for underlying linear or smooth trends depends heavily on the assumed noise model.
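    For readers wanting to experiment, an fGn sample path can be generated directly from its autocovariance function gamma(k) = 0.5(|k+1|^{2H} - 2|k|^{2H} + |k-1|^{2H}) by a Cholesky factorization; this standard construction is independent of the paper's PC-prior derivation. The lag-one autocorrelation 2^{2H-1} - 1 gives a quick correctness check.

```python
import numpy as np

rng = np.random.default_rng(11)

def fgn_cov(n, H):
    """Covariance matrix of unit-variance fGn with Hurst exponent H."""
    k = np.arange(n)
    gamma = 0.5 * (np.abs(k + 1)**(2*H) - 2*np.abs(k)**(2*H)
                   + np.abs(k - 1)**(2*H))
    i = np.arange(n)
    return gamma[np.abs(i[:, None] - i[None, :])]   # Toeplitz structure

n, H = 512, 0.8                        # H > 0.5: persistent (long memory)
L = np.linalg.cholesky(fgn_cov(n, H))
x = L @ rng.standard_normal(n)         # one fGn sample path

print("lag-1 sample autocorrelation:", np.corrcoef(x[:-1], x[1:])[0, 1])
print("theoretical value 2^(2H-1)-1:", 2**(2*H - 1) - 1)
```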

  20. The Role of Prior Knowledge in International Franchise Partner Recruitment

    OpenAIRE

    Wang, Catherine; Altinay, Levent

    2006-01-01

    Purpose: To investigate the role of prior knowledge in the international franchise partner recruitment process and to evaluate how cultural distance influences the role of prior knowledge in this process. Design/methodology/approach: A single embedded case study of an international hotel firm was the focus of the enquiry. Interviews, observations and document analysis were used as the data collection techniques. Findings: Findings reveal that prior knowledge of the franchisor enab...

  1. Performance of Identifiler Direct and PowerPlex 16 HS on the Applied Biosystems 3730 DNA Analyzer for processing biological samples archived on FTA cards.

    Science.gov (United States)

    Laurin, Nancy; DeMoors, Anick; Frégeau, Chantal

    2012-09-01

    Direct amplification of STR loci from biological samples collected on FTA cards without prior DNA purification was evaluated using Identifiler Direct and PowerPlex 16 HS in conjunction with the use of a high throughput Applied Biosystems 3730 DNA Analyzer. In order to reduce the overall sample processing cost, reduced PCR volumes combined with various FTA disk sizes were tested. Optimized STR profiles were obtained using a 0.53 mm disk size in 10 μL PCR volume for both STR systems. These protocols proved effective in generating high quality profiles on the 3730 DNA Analyzer from both blood and buccal FTA samples. Reproducibility, concordance, robustness, sample stability and profile quality were assessed using a collection of blood and buccal samples on FTA cards from volunteer donors as well as from convicted offenders. The new developed protocols offer enhanced throughput capability and cost effectiveness without compromising the robustness and quality of the STR profiles obtained. These results support the use of these protocols for processing convicted offender samples submitted to the National DNA Data Bank of Canada. Similar protocols could be applied to the processing of casework reference samples or in paternity or family relationship testing. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  2. Sample Handling and Processing on Mars for Future Astrobiology Missions

    Science.gov (United States)

    Beegle, Luther; Kirby, James P.; Fisher, Anita; Hodyss, Robert; Saltzman, Alison; Soto, Juancarlos; Lasnik, James; Roark, Shane

    2011-01-01

    In most analytical investigations, there is a need to process complex field samples for the unique detection of analytes, especially when detecting low-concentration organic molecules that may identify extraterrestrial life. Sample processing for analytical instruments is time-, resource- and manpower-consuming in terrestrial laboratories. Every step in this laborious process will have to be automated for in situ life detection. We have developed, and are currently demonstrating, an automated wet chemistry preparation system that can operate autonomously on Earth and is designed to operate under Martian ambient conditions. This will enable a complete wet chemistry laboratory as part of future missions. Our system, the Automated Sample Processing System (ASPS), receives fines, extracts organics through solvent extraction, processes the extract by removing non-organic soluble species and delivers the sample to multiple instruments for analysis (including for non-organic soluble species).

  3. Reference Priors for the General Location-Scale Model

    NARCIS (Netherlands)

    Fernández, C.; Steel, M.F.J.

    1997-01-01

    The reference prior algorithm (Berger and Bernardo 1992) is applied to multivariate location-scale models with any regular sampling density, where we establish the irrelevance of the usual assumption of Normal sampling if our interest is in either the location or the scale. This result immediately

  4. Solving probabilistic inverse problems rapidly with prior samples

    NARCIS (Netherlands)

    Käufl, Paul; Valentine, Andrew P.; de Wit, Ralph W.; Trampert, Jeannot

    2016-01-01

    Owing to the increasing availability of computational resources, in recent years the probabilistic solution of non-linear, geophysical inverse problems by means of sampling methods has become increasingly feasible. Nevertheless, we still face situations in which a Monte Carlo approach is not

  5. Prior authorisation schemes: trade barriers in need of scientific justification

    NARCIS (Netherlands)

    Meulen, van der B.M.J.

    2010-01-01

    Case C-333/08 Commission v. French Republic ‘processing aids’ [2010] ECR-0000. The French prior authorisation scheme for processing aids in food production infringes upon Article 34 TFEU. 1. A prior authorisation scheme not complying with the principle of proportionality infringes upon Article 34 TFEU.

  6. An improved sampling method of complex network

    Science.gov (United States)

    Gao, Qi; Ding, Xintong; Pan, Feng; Li, Weixing

    2014-12-01

    Sampling subnets is an important topic in complex network research. Sampling methods influence the structure and characteristics of the subnet. Random multiple snowball with Cohen (RMSC) process sampling, which combines the advantages of random sampling and snowball sampling, is proposed in this paper. It has the ability to explore global information and discover local structure at the same time. The experiments indicate that this novel sampling method can preserve the similarity between the sampled subnet and the original network with respect to degree distribution, connectivity rate and average shortest path. The method is applicable to situations where prior knowledge about the degree distribution of the original network is insufficient.
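    The abstract does not spell out the RMSC algorithm itself, so the sketch below shows only the ingredient it builds on: a multiple-seed random snowball walk (the Cohen-style step of RMSC is omitted). The toy network, seed count, and expansion parameters are all invented for illustration.

```python
import random

rng = random.Random(0)

def snowball_sample(adj, n_seeds=3, k=2, rounds=3):
    """Simplified random snowball: start from several random seeds and,
    for a few rounds, expand each frontier node through at most k
    randomly chosen neighbours. adj: dict node -> list of neighbours."""
    sampled = set(rng.sample(sorted(adj), n_seeds))
    frontier = set(sampled)
    for _ in range(rounds):
        nxt = set()
        for v in frontier:
            nbrs = rng.sample(adj[v], min(k, len(adj[v])))
            nxt.update(n for n in nbrs if n not in sampled)
        sampled |= nxt
        frontier = nxt
    return sampled

# toy network: a ring of 100 nodes plus a few random chords
adj = {i: [(i - 1) % 100, (i + 1) % 100] for i in range(100)}
for _ in range(30):
    a, b = rng.randrange(100), rng.randrange(100)
    if a != b and b not in adj[a]:
        adj[a].append(b)
        adj[b].append(a)

print("sampled subnet size:", len(snowball_sample(adj)))
```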

  7. On-line sample processing methods in flow analysis

    DEFF Research Database (Denmark)

    Miró, Manuel; Hansen, Elo Harald

    2008-01-01

    In this chapter, the state of the art of flow injection and related approaches for automation and miniaturization of sample processing, regardless of the aggregate state of the sample medium, is overviewed. The potential of the various generations of flow injection for implementation of in...

  8. Random template placement and prior information

    International Nuclear Information System (INIS)

    Roever, Christian

    2010-01-01

    In signal detection problems, one is usually faced with the task of searching a parameter space for peaks in the likelihood function which indicate the presence of a signal. Random searches have proven to be very efficient as well as easy to implement, compared e.g. to searches along regular grids in parameter space. Knowledge of the parameterised shape of the signal searched for adds structure to the parameter space, i.e., there are usually regions requiring to be densely searched while in other regions a coarser search is sufficient. On the other hand, prior information identifies the regions in which a search will actually be promising or may likely be in vain. Defining specific figures of merit allows one to combine both template metric and prior distribution and devise optimal sampling schemes over the parameter space. We show an example related to the gravitational wave signal from a binary inspiral event. Here the template metric and prior information are particularly contradictory, since signals from low-mass systems tolerate the least mismatch in parameter space while high-mass systems are far more likely, as they imply a greater signal-to-noise ratio (SNR) and hence are detectable to greater distances. The derived sampling strategy is implemented in a Markov chain Monte Carlo (MCMC) algorithm where it improves convergence.

  9. Prior Elicitation, Assessment and Inference with a Dirichlet Prior

    Directory of Open Access Journals (Sweden)

    Michael Evans

    2017-10-01

    Methods are developed for eliciting a Dirichlet prior based upon stating bounds on the individual probabilities that hold with high prior probability. This approach to selecting a prior is applied to a contingency table problem where it is demonstrated how to assess the prior with respect to the bias it induces as well as how to check for prior-data conflict. It is shown that the assessment of a hypothesis via relative belief can easily take into account what it means for the falsity of the hypothesis to correspond to a difference of practical importance and provide evidence in favor of a hypothesis.
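    One simple way to operationalize the elicitation idea, choosing a Dirichlet so that stated bounds on the individual probabilities hold with high prior probability, is to fix the mean vector and scan the concentration parameter, checking the joint bound coverage by Monte Carlo. The sketch below illustrates that idea only; it is not the paper's exact procedure, and all bounds and target probabilities are invented.

```python
import numpy as np
from scipy import stats

# stated prior bounds on three cell probabilities,
# intended to hold jointly with prior probability ~0.9
mean = np.array([0.5, 0.3, 0.2])
lo = np.array([0.35, 0.15, 0.05])
hi = np.array([0.65, 0.45, 0.35])

def coverage(c):
    """P(lo_i <= p_i <= hi_i for all i) under Dirichlet(c * mean),
    estimated by Monte Carlo."""
    p = stats.dirichlet.rvs(c * mean, size=20000, random_state=1)
    return np.mean(np.all((p >= lo) & (p <= hi), axis=1))

# increase the concentration until the bounds hold with high probability
for c in [10, 20, 40, 80, 160]:
    print(f"concentration {c:4d}: joint coverage {coverage(c):.3f}")
```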

  10. Testability evaluation using prior information of multiple sources

    Institute of Scientific and Technical Information of China (English)

    Wang Chao; Qiu Jing; Liu Guanjun; Zhang Yong

    2014-01-01

    Testability plays an important role in improving the readiness and decreasing the life-cycle cost of equipment. Testability demonstration and evaluation is of significance in measuring such testability indexes as the fault detection rate (FDR) and the fault isolation rate (FIR), which is useful to the producer in mastering the testability level and improving the testability design, and helpful to the consumer in making purchase decisions. Aiming at the problems of a small sample of testability demonstration test data (TDTD), such as low evaluation confidence and inaccurate results, a testability evaluation method is proposed based on prior information from multiple sources and Bayes theory. Firstly, the types of prior information are analyzed. The maximum entropy method is applied to prior information given as a mean or interval estimate of the testability index to obtain the parameters of the prior probability density function (PDF), and the empirical Bayes method is used to obtain the parameters for prior information given in success-fail form. Then, a parametric data consistency check is used to test the compatibility between each source of prior information and the TDTD. For prior information that passes the check, a prior credibility is calculated. A mixed prior distribution is formed from the prior PDFs and the corresponding credibilities. The Bayesian posterior distribution model is obtained from the mixed prior distribution and the TDTD, from which point and interval estimates are calculated. Finally, examples from a flight control system are used to verify the proposed method. The results show that the proposed method is feasible and effective.
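    A toy version of the mixed-prior update can be written with conjugate Beta components: each prior source is a Beta density carrying a credibility weight from the consistency check, the demonstration data are binomial (faults detected out of faults injected), the component posteriors update conjugately, and the mixture weights update via the Beta-binomial marginal likelihood. All numbers below are invented, and the paper's maximum-entropy construction of the component priors is assumed to have been carried out already.

```python
import numpy as np
from scipy import special, stats

# two prior sources on the FDR, already encoded as Beta(a, b) densities,
# with credibilities from the data-consistency check (illustrative values)
prior_ab = [(18.0, 2.0), (12.0, 3.0)]
cred = np.array([0.6, 0.4])

s, n = 27, 30   # demonstration test: 27 of 30 injected faults detected

def log_marginal(a, b):
    """Log Beta-binomial marginal likelihood of (s, n) under Beta(a, b);
    the binomial coefficient cancels in the weight normalization."""
    return special.betaln(a + s, b + n - s) - special.betaln(a, b)

# posterior mixture weights and conjugately updated components
w = cred * np.exp([log_marginal(a, b) for a, b in prior_ab])
w /= w.sum()
post = [stats.beta(a + s, b + n - s) for a, b in prior_ab]

fdr_point = sum(wi * p.mean() for wi, p in zip(w, post))
print("posterior weights:", np.round(w, 3))
print("posterior point estimate of FDR:", round(fdr_point, 3))
```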

  11. Insights into the processes behind the contamination of degraded human teeth and bone samples with exogenous sources of DNA

    DEFF Research Database (Denmark)

    Gilbert, M. T. P.; Hansen, Anders J.; Willerslev, E.

    2006-01-01

    A principal problem facing human DNA studies that use old and degraded remains is contamination from other sources of human DNA. In this study we have attempted to deliberately contaminate bones and teeth sampled from a medieval collection excavated in Trondheim, Norway, in order to investigate......, prior to assaying for the residual presence of the handler's DNA. Surprisingly, although our results suggest that a large proportion of the teeth were contaminated with multiple sources of human DNA prior to our investigation, we were unable to contaminate the samples with further human DNA. One...

  12. Drunkorexia: Calorie Restriction Prior to Alcohol Consumption among College Freshman

    Science.gov (United States)

    Burke, Sloane C.; Cremeens, Jennifer; Vail-Smith, Karen; Woolsey, Conrad

    2010-01-01

    Using a sample of 692 freshmen at a southeastern university, this study examined caloric restriction among students prior to planned alcohol consumption. Participants were surveyed for self-reported alcohol consumption, binge drinking, and caloric intake habits prior to drinking episodes. Results indicated that 99 of 695 (14%) of first year…

  13. Automated processing of whole blood samples for the determination of immunosuppressants by liquid chromatography tandem-mass spectrometry.

    Science.gov (United States)

    Vogeser, Michael; Spöhrer, Ute

    2006-01-01

    Liquid chromatography tandem-mass spectrometry (LC-MS/MS) is an efficient technology for routine determination of immunosuppressants in whole blood; however, time-consuming manual sample preparation remains a significant limitation of this technique. Using a commercially available robotic pipetting system (Tecan Freedom EVO), we developed an automated sample-preparation protocol for quantification of tacrolimus in whole blood by LC-MS/MS. Barcode reading, sample resuspension, transfer of whole blood aliquots into a deep-well plate, addition of internal standard solution, mixing, and protein precipitation by addition of an organic solvent is performed by the robotic system. After centrifugation of the plate, the deproteinized supernatants are submitted to on-line solid phase extraction, using column switching prior to LC-MS/MS analysis. The only manual actions within the entire process are decapping of the tubes, and transfer of the deep-well plate from the robotic system to a centrifuge and finally to the HPLC autosampler. Whole blood pools were used to assess the reproducibility of the entire analytical system for measuring tacrolimus concentrations. A total coefficient of variation of 1.7% was found for the entire automated analytical process (n=40; mean tacrolimus concentration, 5.3 microg/L). Close agreement between tacrolimus results obtained after manual and automated sample preparation was observed. The analytical system described here, comprising automated protein precipitation, on-line solid phase extraction and LC-MS/MS analysis, is convenient and precise, and minimizes hands-on time and the risk of mistakes in the quantification of whole blood immunosuppressant concentrations compared to conventional methods.

  14. Prior Sensitivity Analysis in Default Bayesian Structural Equation Modeling.

    Science.gov (United States)

    van Erp, Sara; Mulder, Joris; Oberski, Daniel L

    2017-11-27

    Bayesian structural equation modeling (BSEM) has recently gained popularity because it enables researchers to fit complex models and solve some of the issues often encountered in classical maximum likelihood estimation, such as nonconvergence and inadmissible solutions. An important component of any Bayesian analysis is the prior distribution of the unknown model parameters. Often, researchers rely on default priors, which are constructed in an automatic fashion without requiring substantive prior information. However, the prior can have a serious influence on the estimation of the model parameters, which affects the mean squared error, bias, coverage rates, and quantiles of the estimates. In this article, we investigate the performance of three different default priors: noninformative improper priors, vague proper priors, and empirical Bayes priors, with the latter being novel in the BSEM literature. Based on a simulation study, we find that these three default BSEM methods may perform very differently, especially with small samples. A careful prior sensitivity analysis is therefore needed when performing a default BSEM analysis. For this purpose, we provide a practical step-by-step guide for practitioners to conducting a prior sensitivity analysis in default BSEM. Our recommendations are illustrated using a well-known case study from the structural equation modeling literature, and all code for conducting the prior sensitivity analysis is available in the online supplemental materials. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  15. Sample preparation strategies for food and biological samples prior to nanoparticle detection and imaging

    DEFF Research Database (Denmark)

    Larsen, Erik Huusfeldt; Löschner, Katrin

    2014-01-01

    microscopy (TEM) proved to be necessary for troubleshooting of results obtained from AFFF-LS-ICP-MS. Aqueous and enzymatic extraction strategies were tested for thorough sample preparation, aiming to degrade the sample matrix and liberate the AgNPs from chicken meat into liquid suspension. The resulting...... AFFF-ICP-MS fractograms, which corresponded to the enzymatic digests, showed a major nano-peak (about 80 % recovery of AgNPs spiked to the meat) plus new smaller peaks that eluted close to the void volume of the fractograms. Small, but significant shifts in retention time of AFFF peaks were observed...... for the meat sample extracts and the corresponding neat AgNP suspension, and rendered sizing by way of calibration with AgNPs as sizing standards inaccurate. In order to gain further insight into the sizes of the separated AgNPs, or their possible dissolved state, fractions of the AFFF eluate were collected...

  16. Speciation of Tl(III) and Tl(I) in hair samples by dispersive liquid–liquid microextraction based on solidification of floating organic droplet prior to flame atomic absorption spectrometry determination

    Directory of Open Access Journals (Sweden)

    S.Z. Mohammadi

    2016-11-01

    Full Text Available Dispersive liquid–liquid microextraction based on solidification of floating organic droplet was successfully used as a sample preparation method prior to flame atomic absorption determination of trace amounts of Tl(III) and Tl(I) in hair samples. In the proposed method, 1-(2-pyridylazo)-2-naphthol, 1-dodecanol and ethanol were used as chelating agent, extraction and dispersive solvent, respectively. Several factors that may affect the extraction process, such as type and volume of extraction and disperser solvents, pH, salting out effect, ionic strength and extraction time, were studied. Under the optimal conditions, linearity was maintained between 6.0 and 900.0 ng mL−1 for Tl(III). The relative standard deviation for seven replicate determinations of 0.2 μg mL−1 Tl(III) was 2.5%. The detection limit based on 3Sb for Tl(III) in the original solution was 2.1 ng mL−1. The proposed method has been applied for the determination of trace amounts of thallium in hair samples and satisfactory results were obtained.

  17. Challenging genosensors in food samples: The case of gluten determination in highly processed samples.

    Science.gov (United States)

    Martín-Fernández, Begoña; de-los-Santos-Álvarez, Noemí; Martín-Clemente, Juan Pedro; Lobo-Castañón, María Jesús; López-Ruiz, Beatriz

    2016-01-01

    Electrochemical genosensors have undergone an enormous development in the last decades, but only very few have achieved quantification of target content in highly processed food samples. The detection of allergens, and particularly gluten, is challenging because legislation establishes a threshold of 20 ppm for labeling as gluten-free, but most genosensors express the results in DNA concentration or DNA copies. This paper describes the first attempt to correlate the genosensor response and the wheat content in real samples, even in the case of highly processed food samples. A sandwich-based format, comprising a capture probe immobilized onto the screen-printed gold electrode and a signaling probe functionalized with fluorescein isothiocyanate (FITC), both hybridizing with the target, was used. The hybridization event was electrochemically monitored by adding an anti-FITC peroxidase (antiFITC-HRP) and its substrate, tetramethylbenzidine. Binary model mixtures, as a reference material, and real samples have been analyzed. DNA from food was extracted and a fragment encoding the immunodominant peptide of α2-gliadin amplified by a tailored PCR. The sensor was able to selectively distinguish cereals toxic for celiac patients, such as different varieties of wheat, barley, rye and oats, from non-toxic plants. As low as 0.001% (10 mg/kg) of wheat flour in an inert matrix was reliably detected, which directly competes with the current method of choice for DNA detection, real-time PCR. A good correlation with the official immunoassay was found in highly processed food samples. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Bayesian inference from count data using discrete uniform priors.

    Directory of Open Access Journals (Sweden)

    Federico Comoglio

    Full Text Available We consider a set of sample counts obtained by sampling arbitrary fractions of a finite volume containing a homogeneously dispersed population of identical objects. We report a Bayesian derivation of the posterior probability distribution of the population size using a binomial likelihood and non-conjugate, discrete uniform priors under sampling with or without replacement. Our derivation yields a computationally feasible formula that can prove useful in a variety of statistical problems involving absolute quantification under uncertainty. We implemented our algorithm in the R package dupiR and compared it with a previously proposed Bayesian method based on a Gamma prior. As a showcase, we demonstrate that our inference framework can be used to estimate bacterial survival curves from measurements characterized by extremely low or zero counts and rather high sampling fractions. All in all, we provide a versatile, general purpose algorithm to infer population sizes from count data, which can find application in a broad spectrum of biological and physical problems.
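    The core computation is simple enough to sketch directly (a reconstruction of the stated model, not the dupiR code): with a binomial likelihood for the observed count and a discrete uniform prior on the population size, the posterior follows by normalising the likelihood over the prior support.

        import numpy as np
        from scipy.stats import binom

        # Observed count k after sampling a fraction f of the volume,
        # with a discrete uniform prior on the population size N.
        k, f = 7, 0.05
        N = np.arange(k, 2001)          # assumed prior support {k, ..., 2000}

        post = binom.pmf(k, N, f)       # binomial likelihood x flat prior
        post /= post.sum()

        mean_N = np.sum(N * post)
        cdf = np.cumsum(post)
        lo, hi = N[np.searchsorted(cdf, 0.025)], N[np.searchsorted(cdf, 0.975)]
        print(f"posterior mean N = {mean_N:.0f}, 95% interval [{lo}, {hi}]")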

  19. Testability evaluation using prior information of multiple sources

    Directory of Open Access Journals (Sweden)

    Wang Chao

    2014-08-01

    Full Text Available Testability plays an important role in improving the readiness and decreasing the life-cycle cost of equipment. Testability demonstration and evaluation is of significance in measuring such testability indexes as fault detection rate (FDR) and fault isolation rate (FIR), which is useful to the producer in mastering the testability level and improving the testability design, and helpful to the consumer in making purchase decisions. Aiming at the problems with a small sample of testability demonstration test data (TDTD) such as low evaluation confidence and inaccurate result, a testability evaluation method is proposed based on the prior information of multiple sources and Bayes theory. Firstly, the types of prior information are analyzed. The maximum entropy method is applied to the prior information with the mean and interval estimate forms on the testability index to obtain the parameters of prior probability density function (PDF), and the empirical Bayesian method is used to get the parameters for the prior information with a success-fail form. Then, a parametrical data consistency check method is used to check the compatibility between all the sources of prior information and TDTD. For the prior information to pass the check, the prior credibility is calculated. A mixed prior distribution is formed based on the prior PDFs and the corresponding credibility. The Bayesian posterior distribution model is acquired with the mixed prior distribution and TDTD, based on which the point and interval estimates are calculated. Finally, examples of a flying control system are used to verify the proposed method. The results show that the proposed method is feasible and effective.

  20. Training shortest-path tractography: Automatic learning of spatial priors

    DEFF Research Database (Denmark)

    Kasenburg, Niklas; Liptrot, Matthew George; Reislev, Nina Linde

    2016-01-01

    Tractography is the standard tool for automatic delineation of white matter tracts from diffusion weighted images. However, the output of tractography often requires post-processing to remove false positives and ensure a robust delineation of the studied tract, and this demands expert prior...... knowledge. Here we demonstrate how such prior knowledge, or indeed any prior spatial information, can be automatically incorporated into a shortest-path tractography approach to produce more robust results. We describe how such a prior can be automatically generated (learned) from a population, and we...

  1. Surface studies of plasma processed Nb samples

    International Nuclear Information System (INIS)

    Tyagi, Puneet V.; Doleans, Marc; Hannah, Brian S.; Afanador, Ralph; Stewart, Stephen; Mammosser, John; Howell, Matthew P; Saunders, Jeffrey W; Degraff, Brian D; Kim, Sang-Ho

    2015-01-01

    Contaminants present at the top surface of superconducting radio frequency (SRF) cavities can act as field emitters and restrict the cavity accelerating gradient. A room-temperature in-situ plasma processing technology for SRF cavities, aiming to clean hydrocarbons from the inner surface of cavities, has recently been developed at the Spallation Neutron Source (SNS). Surface studies of the plasma-processed Nb samples by secondary ion mass spectrometry (SIMS) and scanning Kelvin probe (SKP) showed that the NeO_2 plasma processing is very effective in removing carbonaceous contaminants from the top surface and improves the surface work function by 0.5 to 1.0 eV.

  2. Recommended practice for process sampling for partial pressure analysis

    International Nuclear Information System (INIS)

    Blessing, James E.; Ellefson, Robert E.; Raby, Bruce A.; Brucker, Gerardo A.; Waits, Robert K.

    2007-01-01

    This Recommended Practice describes and recommends various procedures and types of apparatus for obtaining representative samples of process gases from >10⁻² Pa (10⁻⁴ Torr) for partial pressure analysis using a mass spectrometer. The document was prepared by a subcommittee of the Recommended Practices Committee of the American Vacuum Society. The subcommittee was composed of vacuum users and manufacturers of mass spectrometer partial pressure analyzers who have practical experience in the sampling of process gas atmospheres.

  3. Savings for visuomotor adaptation require prior history of error, not prior repetition of successful actions.

    Science.gov (United States)

    Leow, Li-Ann; de Rugy, Aymar; Marinovic, Welber; Riek, Stephan; Carroll, Timothy J

    2016-10-01

    When we move, perturbations to our body or the environment can elicit discrepancies between predicted and actual outcomes. We readily adapt movements to compensate for such discrepancies, and the retention of this learning is evident as savings, or faster readaptation to a previously encountered perturbation. The mechanistic processes contributing to savings, or even the necessary conditions for savings, are not fully understood. One theory suggests that savings requires increased sensitivity to previously experienced errors: when perturbations evoke a sequence of correlated errors, we increase our sensitivity to the errors experienced, which subsequently improves error correction (Herzfeld et al. 2014). An alternative theory suggests that a memory of actions is necessary for savings: when an action becomes associated with successful target acquisition through repetition, that action is more rapidly retrieved at subsequent learning (Huang et al. 2011). In the present study, to better understand the necessary conditions for savings, we tested how savings is affected by prior experience of similar errors and prior repetition of the action required to eliminate errors using a factorial design. Prior experience of errors induced by a visuomotor rotation in the savings block was either prevented at initial learning by gradually removing an oppositely signed perturbation or enforced by abruptly removing the perturbation. Prior repetition of the action required to eliminate errors in the savings block was either deprived or enforced by manipulating target location in preceding trials. The data suggest that prior experience of errors is both necessary and sufficient for savings, whereas prior repetition of a successful action is neither necessary nor sufficient for savings. Copyright © 2016 the American Physiological Society.

  4. Moderation of the Alliance-Outcome Association by Prior Depressive Episodes: Differential Effects in Cognitive-Behavioral Therapy and Short-Term Psychodynamic Supportive Psychotherapy.

    Science.gov (United States)

    Lorenzo-Luaces, Lorenzo; Driessen, Ellen; DeRubeis, Robert J; Van, Henricus L; Keefe, John R; Hendriksen, Mariëlle; Dekker, Jack

    2017-09-01

    Prior studies have suggested that the association between the alliance and depression improvement varies as a function of prior history of depression. We sought to replicate these findings and extend them to short-term psychodynamic supportive psychotherapy (SPSP) in a sample of patients who were randomized to one of these treatments and were administered the Helping Alliance Questionnaire (N=282) at Week 5 of treatment. Overall, the alliance was a predictor of symptom change (d=0.33). In SPSP, the alliance was a modest but robust predictor of change, irrespective of prior episodes (d=0.25-0.33). By contrast, in CBT, the effects of the alliance on symptom change were large for patients with 0 prior episodes (d=0.86), moderate for those with 1 prior episode (d=0.49), and small for those with 2+ prior episodes (d=0.12). These findings suggest a complex interaction between patient features and common vs. specific therapy processes. In CBT, the alliance relates to change for patients with less recurrent depression whereas other CBT-specific processes may account for change for patients with more recurrent depression. Copyright © 2016. Published by Elsevier Ltd.

  5. Real-time recursive hyperspectral sample and band processing algorithm architecture and implementation

    CERN Document Server

    Chang, Chein-I

    2017-01-01

    This book explores recursive architectures in designing progressive hyperspectral imaging algorithms. In particular, it makes progressive imaging algorithms recursive by introducing the concept of Kalman filtering in algorithm design, so that hyperspectral imagery can be processed not only progressively, sample by sample or band by band, but also recursively via recursive equations. This book can be considered a companion to the author's book Real-Time Progressive Hyperspectral Image Processing, published by Springer in 2016. It explores recursive structures in algorithm architecture, implements recursive architectures in conjunction with progressive sample and band processing, derives Recursive Hyperspectral Sample Processing (RHSP) techniques according to the Band-Interleaved Sample/Pixel (BIS/BIP) acquisition format, and develops Recursive Hyperspectral Band Processing (RHBP) techniques according to the Band SeQuential (BSQ) acquisition format for hyperspectral data.

  6. The relation between prior knowledge and students' collaborative discovery learning processes.

    NARCIS (Netherlands)

    Gijlers, Aaltje H.; de Jong, Anthonius J.M.

    2005-01-01

    In this study we investigate how prior knowledge influences knowledge development during collaborative discovery learning. Fifteen dyads of students (pre-university education, 15-16 years old) worked on a discovery learning task in the physics field of kinematics. The (face-to-face) communication

  7. Parameter estimation of multivariate multiple regression model using bayesian with non-informative Jeffreys’ prior distribution

    Science.gov (United States)

    Saputro, D. R. S.; Amalia, F.; Widyaningsih, P.; Affan, R. C.

    2018-05-01

    The Bayesian method can be used to estimate the parameters of a multivariate multiple regression model. It involves two distributions: the prior and the posterior. The posterior distribution is influenced by the choice of prior distribution. Jeffreys' prior distribution is a kind of non-informative prior distribution, used when information about the parameters is not available. The non-informative Jeffreys' prior distribution is combined with the sample information, resulting in the posterior distribution, which is then used to estimate the parameters. The purpose of this research is to estimate the parameters of a multivariate regression model using the Bayesian method with the non-informative Jeffreys' prior distribution. Based on the results and discussion, the estimates of β and Σ are obtained as the expected values of the marginal posterior distributions, which are multivariate normal and inverse Wishart, respectively. However, calculating these expected values involves integrals of functions whose values are difficult to determine analytically. Therefore, an approach is needed that generates random samples according to the posterior distribution of each parameter, using the Markov chain Monte Carlo (MCMC) Gibbs sampling algorithm.
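    A minimal sketch of such a Gibbs sampler, assuming the standard conjugate conditionals under the Jeffreys prior p(B, Σ) ∝ |Σ|^−(d+1)/2 for the p×d coefficient matrix B (synthetic data; not the authors' code):

        import numpy as np
        from scipy.stats import invwishart

        rng = np.random.default_rng(1)

        # Synthetic data: Y = X B + E, rows of E ~ N(0, Sigma)
        n, p, d = 200, 2, 3
        X = np.column_stack([np.ones(n), rng.normal(size=n)])
        B_true = np.array([[1.0, -0.5, 2.0], [0.3, 1.2, -1.0]])
        Sigma_true = np.array([[1.0, 0.3, 0.0], [0.3, 1.0, 0.2], [0.0, 0.2, 1.0]])
        Y = X @ B_true + rng.multivariate_normal(np.zeros(d), Sigma_true, size=n)

        XtX_inv = np.linalg.inv(X.T @ X)
        B_hat = XtX_inv @ X.T @ Y        # OLS estimate, centre of the conditional for B

        # Gibbs sampler: alternate the two full conditionals.
        B, draws = B_hat.copy(), []
        for it in range(2000):
            resid = Y - X @ B
            # Sigma | B, Y ~ Inverse-Wishart(df = n, scale = residual SSQ matrix)
            Sigma = invwishart.rvs(df=n, scale=resid.T @ resid, random_state=rng)
            # vec(B) | Sigma, Y ~ N(vec(B_hat), Sigma kron (X'X)^-1), column-stacked vec
            cov = np.kron(Sigma, XtX_inv)
            vecB = rng.multivariate_normal(B_hat.flatten(order="F"), cov)
            B = vecB.reshape(p, d, order="F")
            if it >= 500:                # discard burn-in draws
                draws.append(B)
        print("posterior mean of B:\n", np.mean(draws, axis=0).round(2))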

  8. MANU. Handling of bentonite prior buffer block manufacturing

    International Nuclear Information System (INIS)

    Laaksonen, R.

    2010-01-01

    The aim of this study is to describe the entire bentonite handling process, starting from freight from the harbour to the storage facility and ending with the filling of the bentonite block moulds. This work describes the bentonite handling prior to the process in which bentonite blocks are manufactured in great quantities. The work included a study of relevant, well-documented Nordic and international cases of storage, processing and techniques involving bentonite material. Information about storage and handling processes was collected from producers and re-sellers of bentonite, keeping in mind the requirements set by Posiva. A limited humidification experiment was also performed on different material types. This work includes a detailed description of the methods and equipment needed for bentonite storage and processing. Posiva Oy used Jauhetekniikka Oy as a consultant to prepare handling process flow charts for bentonite; Jauhetekniikka Oy also evaluated the content of this report. The handling of bentonite was based on the assumption that the bentonite processing work is done in one factory over 11 months of work time, with a weekly volume of around 41-45 tons. The storage space needed in this case is about 300 tons of bentonite, which equals about seven weeks of raw material consumption. The work concluded that several things must be carefully considered: sampling at various phases of the process, the air quality (humidity and temperature) at the production/storage facilities, the level of automation/process control of the manufacturing process, and the means of producing/saving data from different phases of the process. (orig.)

  9. Automated Blood Sample Preparation Unit (ABSPU) for Portable Microfluidic Flow Cytometry.

    Science.gov (United States)

    Chaturvedi, Akhil; Gorthi, Sai Siva

    2017-02-01

    Portable microfluidic diagnostic devices, including flow cytometers, are being developed for point-of-care settings, especially in conjunction with inexpensive imaging devices such as mobile phone cameras. However, two pervasive drawbacks of these have been the lack of automated sample preparation processes and cells settling out of sample suspensions, leading to inaccurate results. We report an automated blood sample preparation unit (ABSPU) to prevent blood samples from settling in a reservoir during loading of samples in flow cytometers. This apparatus automates the preanalytical steps of dilution and staining of blood cells prior to microfluidic loading. It employs an assembly with a miniature vibration motor to drive turbulence in a sample reservoir. To validate performance of this system, we present experimental evidence demonstrating prevention of blood cell settling, cell integrity, and staining of cells prior to flow cytometric analysis. This setup is further integrated with a microfluidic imaging flow cytometer to investigate cell count variability. With no need for prior sample preparation, a drop of whole blood can be directly introduced to the setup without premixing with buffers manually. Our results show that integration of this assembly with microfluidic analysis provides a competent automation tool for low-cost point-of-care blood-based diagnostics.

  10. A Noninformative Prior on a Space of Distribution Functions

    Directory of Open Access Journals (Sweden)

    Alexander Terenin

    2017-07-01

    Full Text Available In a given problem, the Bayesian statistical paradigm requires the specification of a prior distribution that quantifies relevant information about the unknowns of main interest external to the data. In cases where little such information is available, the problem under study may possess an invariance under a transformation group that encodes a lack of information, leading to a unique prior; this idea was explored at length by E.T. Jaynes. Previous successful examples have included location-scale invariance under linear transformation, multiplicative invariance of the rate at which events in a counting process are observed, and the derivation of the Haldane prior for a Bernoulli success probability. In this paper we show that this method can be extended, by generalizing Jaynes, in two ways: (1) to yield families of approximately invariant priors; and (2) to the infinite-dimensional setting, yielding families of priors on spaces of distribution functions. Our results can be used to describe conditions under which a particular Dirichlet Process posterior arises from an optimal Bayesian analysis, in the sense that invariances in the prior and likelihood lead to one and only one posterior distribution.
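    As a small concrete instance of the invariance argument mentioned above: the Haldane prior for a Bernoulli success probability is the improper limit Beta(0, 0), so after s successes in n trials the posterior is Beta(s, n − s), which is proper whenever 0 < s < n. A two-line check of that update:

        from scipy.stats import beta

        # Haldane prior ~ Beta(0, 0): with s successes in n trials the
        # posterior is Beta(s, n - s), proper only if 0 < s < n.
        s, n = 7, 20
        posterior = beta(s, n - s)
        print("posterior mean:", posterior.mean())   # s/n = 0.35
        print("95% interval:", posterior.interval(0.95))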

  11. 40 CFR 141.703 - Sampling locations.

    Science.gov (United States)

    2010-07-01

    ... samples prior to the point of filter backwash water addition. (d) Bank filtration. (1) Systems that... applicable, must collect source water samples in the surface water prior to bank filtration. (2) Systems that use bank filtration as pretreatment to a filtration plant must collect source water samples from the...

  12. An Automated Sample Processing System for Planetary Exploration

    Science.gov (United States)

    Soto, Juancarlos; Lasnik, James; Roark, Shane; Beegle, Luther

    2012-01-01

    An Automated Sample Processing System (ASPS) for wet chemistry processing of organic materials on the surface of Mars has been jointly developed by Ball Aerospace and the Jet Propulsion Laboratory. The mechanism has been built and tested to demonstrate TRL 4. This paper describes the function of the system, the mechanism design, lessons learned, and several challenges that were overcome.

  13. Application of Bayesian Decision Theory Based on Prior Information in the Multi-Objective Optimization Problem

    Directory of Open Access Journals (Sweden)

    Xia Lei

    2010-12-01

    Full Text Available It is hard for general multi-objective optimization methods to obtain prior information, and how to utilize prior information has been a challenge. This paper analyzes the characteristics of Bayesian decision-making based on the maximum entropy principle and prior information, in particular how to effectively improve decision-making reliability when reference samples are deficient. The paper demonstrates the effectiveness of the proposed method in a real application, multi-frequency offset estimation in a distributed multiple-input multiple-output system. The simulation results demonstrate that Bayesian decision-making based on prior information has better global searching capability when sampling data are deficient.

  14. On the prior probabilities for two-stage Bayesian estimates

    International Nuclear Information System (INIS)

    Kohut, P.

    1992-01-01

    The method of Bayesian inference is reexamined for its applicability and for the required underlying assumptions in obtaining and using prior probability estimates. Two different approaches are suggested to determine the first-stage priors in the two-stage Bayesian analysis which avoid certain assumptions required for other techniques. In the first scheme, the prior is obtained through a true frequency-based distribution generated at selected intervals utilizing actual sampling of the failure rate distributions. The population variability distribution is generated as the weighted average of the frequency distributions. The second method is based on a non-parametric Bayesian approach using the Maximum Entropy Principle. Specific features such as integral properties or selected parameters of prior distributions may be obtained with minimal assumptions. It is indicated how various quantiles may also be generated with a least squares technique.

  15. Sample processing method for the determination of perchlorate in milk

    International Nuclear Information System (INIS)

    Dyke, Jason V.; Kirk, Andrea B.; Kalyani Martinelango, P.; Dasgupta, Purnendu K.

    2006-01-01

    In recent years, many different water sources and foods have been reported to contain perchlorate. Studies indicate that significant levels of perchlorate are present in both human and dairy milk. The determination of perchlorate in milk is particularly important due to its potential health impact on infants and children. As for many other biological samples, sample preparation is more time consuming than the analysis itself. The concurrent presence of large amounts of fats, proteins, carbohydrates, etc., demands some initial cleanup; otherwise the separation column lifetime and the limit of detection are both greatly compromised. Reported milk processing methods require the addition of chemicals such as ethanol, acetic acid or acetonitrile. Reagent addition is undesirable in trace analysis. We report here an essentially reagent-free sample preparation method for the determination of perchlorate in milk. Milk samples are spiked with isotopically labeled perchlorate and centrifuged to remove lipids. The resulting liquid is placed in a disposable centrifugal ultrafilter device with a molecular weight cutoff of 10 kDa, and centrifuged. Approximately 5-10 ml of clear liquid, ready for analysis, is obtained from a 20 ml milk sample. Both bovine and human milk samples have been successfully processed and analyzed by ion chromatography-mass spectrometry (IC-MS). Standard addition experiments show good recoveries. The repeatability of the analytical result for the same sample in multiple sample cleanup runs ranged from 3 to 6% R.S.D. This processing technique has also been successfully applied for the determination of iodide and thiocyanate in milk

  16. Bayesian linear regression : different conjugate models and their (in)sensitivity to prior-data conflict

    NARCIS (Netherlands)

    Walter, G.M.; Augustin, Th.; Kneib, Thomas; Tutz, Gerhard

    2010-01-01

    The paper is concerned with Bayesian analysis under prior-data conflict, i.e. the situation when observed data are rather unexpected under the prior (and the sample size is not large enough to eliminate the influence of the prior). Two approaches for Bayesian linear regression modeling based on

  17. Acceleration to failure in geophysical signals prior to laboratory rock failure and volcanic eruptions (Invited)

    Science.gov (United States)

    Main, I. G.; Bell, A. F.; Greenhough, J.; Heap, M. J.; Meredith, P. G.

    2010-12-01

    The nucleation processes that ultimately lead to earthquakes, volcanic eruptions, rock bursts in mines, and landslides from cliff slopes are likely to be controlled at some scale by brittle failure of the Earth’s crust. In laboratory brittle deformation experiments geophysical signals commonly exhibit an accelerating trend prior to dynamic failure. Similar signals have been observed prior to volcanic eruptions, including volcano-tectonic earthquake event and moment release rates. Despite a large amount of effort in the search, no such statistically robust systematic trend is found prior to natural earthquakes. Here we describe the results of a suite of laboratory tests on Mount Etna Basalt and other rocks to examine the nature of the non-linear scaling from laboratory to field conditions, notably using laboratory ‘creep’ tests to reduce the boundary strain rate to conditions more similar to those in the field. Seismic event rate, seismic moment release rate and rate of porosity change show a classic ‘bathtub’ graph that can be derived from a simple damage model based on separate transient and accelerating sub-critical crack growth mechanisms, resulting from separate processes of negative and positive feedback in the population dynamics. The signals exhibit clear precursors based on formal statistical model tests using maximum likelihood techniques with Poisson errors. After correcting for the finite loading time of the signal, the results show a transient creep rate that decays as a classic Omori law for earthquake aftershocks, and remarkably with an exponent near unity, as commonly observed for natural earthquake sequences. The accelerating trend follows an inverse power law when fitted in retrospect, i.e. with prior knowledge of the failure time. In contrast the strain measured on the sample boundary shows a less obvious but still accelerating signal that is often absent altogether in natural strain data prior to volcanic eruptions. To test the

  18. Prior Authorization of PMDs Demonstration - Status Update

    Data.gov (United States)

    U.S. Department of Health & Human Services — CMS implemented a Prior Authorization process for scooters and power wheelchairs for people with Fee-For-Service Medicare who reside in seven states with high...

  19. Testing results of Monte Carlo sampling processes in MCSAD

    International Nuclear Information System (INIS)

    Pinnera, I.; Cruz, C.; Abreu, Y.; Leyva, A.; Correa, C.; Demydenko, C.

    2009-01-01

    The Monte Carlo Simulation of Atom Displacements (MCSAD) is a code implemented by the authors to simulate the complete process of atom displacement (AD) formation. This code makes use of the Monte Carlo (MC) method to sample all the processes involved in the transport of gamma and electronic radiation through matter. The kernel of the calculations relies on a model based on an algorithm developed by the authors, which first separates multiple electron elastic scattering events from single events at higher scattering angles and then, from the latter, samples those leading to AD at high transferred atomic recoil energies. Tests have been developed to check the sampling algorithms against the corresponding theoretical distribution functions. Satisfactory results have been obtained, which indicate the strength of the methods and subroutines used in the code. (Author)
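    The kind of check described here can be illustrated generically (this is not the MCSAD code): draw samples from one of the sampled processes by inverse-transform sampling and compare the empirical distribution against the theoretical one with a Kolmogorov-Smirnov test. The exponential free-path form and its rate below are invented stand-ins.

        import numpy as np
        from scipy.stats import kstest

        rng = np.random.default_rng(42)

        # Inverse-transform sampler for an exponential "free path" distribution,
        # standing in for one of the sampled processes; mu is a made-up rate.
        mu = 0.7
        samples = -np.log(rng.uniform(size=100_000)) / mu

        # Compare the empirical sample with the theoretical CDF.
        stat, p_value = kstest(samples, "expon", args=(0, 1 / mu))
        print(f"KS statistic {stat:.4f}, p-value {p_value:.3f}")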

  20. External Prior Guided Internal Prior Learning for Real-World Noisy Image Denoising

    Science.gov (United States)

    Xu, Jun; Zhang, Lei; Zhang, David

    2018-06-01

    Most existing image denoising methods learn image priors either from external data or from the noisy image itself. However, priors learned from external data may not be adaptive to the image to be denoised, while priors learned from the given noisy image may not be accurate due to the interference of the corrupting noise. Meanwhile, the noise in real-world noisy images is very complex and hard to describe with simple distributions such as the Gaussian, making real noisy image denoising a very challenging problem. We propose to exploit the information in both external data and the given noisy image, and develop an external prior guided internal prior learning method for real noisy image denoising. We first learn external priors from an independent set of clean natural images. With the aid of the learned external priors, we then learn internal priors from the given noisy image to refine the prior model. The external and internal priors are formulated as a set of orthogonal dictionaries to efficiently reconstruct the desired image. Extensive experiments are performed on several real noisy image datasets. The proposed method demonstrates highly competitive denoising performance, outperforming state-of-the-art denoising methods including those designed for real noisy images.

  1. Incorporation of stochastic engineering models as prior information in Bayesian medical device trials.

    Science.gov (United States)

    Haddad, Tarek; Himes, Adam; Thompson, Laura; Irony, Telba; Nair, Rajesh

    2017-01-01

    Evaluation of medical devices via clinical trial is often a necessary step in the process of bringing a new product to market. In recent years, device manufacturers are increasingly using stochastic engineering models during the product development process. These models have the capability to simulate virtual patient outcomes. This article presents a novel method based on the power prior for augmenting a clinical trial using virtual patient data. To properly inform clinical evaluation, the virtual patient model must simulate the clinical outcome of interest, incorporating patient variability, as well as the uncertainty in the engineering model and in its input parameters. The number of virtual patients is controlled by a discount function which uses the similarity between modeled and observed data. This method is illustrated by a case study of cardiac lead fracture. Different discount functions are used to cover a wide range of scenarios in which the type I error rates and power vary for the same number of enrolled patients. Incorporation of engineering models as prior knowledge in a Bayesian clinical trial design can provide benefits of decreased sample size and trial length while still controlling type I error rate and power.
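    In the conjugate Beta-binomial case, a power prior reduces to scaling the virtual-patient counts by the discount weight a0, which makes the mechanics easy to sketch. The discount function below is an illustrative choice, not the one in the article, and all counts are hypothetical.

        import numpy as np
        from scipy.stats import beta

        # Observed trial data and virtual-patient (model-generated) data:
        # event counts out of totals (all numbers hypothetical).
        k_obs, n_obs = 12, 100
        k_vir, n_vir = 90, 1000

        # Discount: down-weight virtual data by the discrepancy between the
        # observed and modeled event rates (an illustrative choice of function).
        a0 = float(np.exp(-20 * abs(k_obs / n_obs - k_vir / n_vir)))

        # Power prior with a conjugate Beta(1, 1) baseline: raising the virtual
        # likelihood to the power a0 simply scales the virtual counts.
        post = beta(1 + k_obs + a0 * k_vir,
                    1 + (n_obs - k_obs) + a0 * (n_vir - k_vir))
        print(f"a0 = {a0:.2f}, effective borrowed patients = {a0 * n_vir:.0f}")
        print("posterior mean event rate:", round(post.mean(), 4))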

  2. Capillary absorption spectrometer and process for isotopic analysis of small samples

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, M. Lizabeth; Kelly, James F.; Sams, Robert L.; Moran, James J.; Newburn, Matthew K.; Blake, Thomas A.

    2018-04-24

    A capillary absorption spectrometer and process are described that provide highly sensitive and accurate stable absorption measurements of analytes in a sample gas that may include isotopologues of carbon and oxygen obtained from gas and biological samples. It further provides isotopic images of microbial communities that allow tracking of nutrients at the single-cell level. It further targets naturally occurring variations in carbon and oxygen isotopes, avoiding the need for expensive isotopically labeled mixtures and allowing the study of samples taken from the field without modification. The process also permits sampling in vivo, enabling real-time ambient studies of microbial communities.

  3. Stationary and related stochastic processes sample function properties and their applications

    CERN Document Server

    Cramér, Harald

    2004-01-01

    This graduate-level text offers a comprehensive account of the general theory of stationary processes, with special emphasis on the properties of sample functions. Assuming a familiarity with the basic features of modern probability theory, the text develops the foundations of the general theory of stochastic processes, examines processes with a continuous-time parameter, and applies the general theory to procedures key to the study of stationary processes. Additional topics include analytic properties of the sample functions and the problem of time distribution of the intersections between a

  4. GENERAL ASPECTS REGARDING THE PRIOR DISCIPLINARY RESEARCH

    Directory of Open Access Journals (Sweden)

    ANDRA PURAN (DASCĂLU)

    2012-05-01

    Full Text Available Disciplinary research is the first phase of the disciplinary action. According to art. 251 paragraph 1 of the Labour Code, no disciplinary sanction may be ordered before performing the prior disciplinary research. These regulations provide one exception: the sanction of written warning. The current regulations in question, kept from the old regulation, provide protection for employees against abuses made by employers, since sanctions affect the salary or the position held, or even the continuation of the individual employment contract. Thus, prior investigation of the act that constitutes misconduct, before a disciplinary sanction is applied, is an essential condition for the validity of the measure ordered. Through this study we try to highlight some general issues concerning the characteristics, processes and effects of prior disciplinary research.

  5. Biopolymers for Sample Collection, Protection, and Preservation

    Science.gov (United States)

    2015-05-19

    knowledge of sample collection from various matrices is crucial. Recovery and preservation of microorganisms prior to analysis are important...Another method for encapsulating bacteria for use in biodegradation of gasoline involves a complex process using gellan gum (Moslemy et al. 2002). Many...use of acacia gum in preserving microorganisms for extended periods of time without refrigeration (Krumnow et al. 2009; Sorokulova et al. 2008, 2012

  6. Adaptive multiple importance sampling for Gaussian processes

    Czech Academy of Sciences Publication Activity Database

    Xiong, X.; Šmídl, Václav; Filippone, M.

    2017-01-01

    Roč. 87, č. 8 (2017), s. 1644-1665 ISSN 0094-9655 R&D Projects: GA MŠk(CZ) 7F14287 Institutional support: RVO:67985556 Keywords : Gaussian Process * Bayesian estimation * Adaptive importance sampling Subject RIV: BB - Applied Statistics, Operational Research OBOR OECD: Statistics and probability Impact factor: 0.757, year: 2016 http://library.utia.cas.cz/separaty/2017/AS/smidl-0469804.pdf

  7. Prior Visual Experience Modulates Learning of Sound Localization Among Blind Individuals.

    Science.gov (United States)

    Tao, Qian; Chan, Chetwyn C H; Luo, Yue-Jia; Li, Jian-Jun; Ting, Kin-Hung; Lu, Zhong-Lin; Whitfield-Gabrieli, Susan; Wang, Jun; Lee, Tatia M C

    2017-05-01

    Cross-modal learning requires the use of information from different sensory modalities. This study investigated how the prior visual experience of late blind individuals could modulate neural processes associated with learning of sound localization. Learning was realized by standardized training on sound localization processing, and experience was investigated by comparing brain activations elicited by a sound localization task in individuals with (late blind, LB) and without (early blind, EB) prior visual experience. After the training, EB showed decreased activation in the precuneus, which was functionally connected to a limbic-multisensory network. In contrast, LB showed increased activation of the precuneus. A subgroup of LB participants who demonstrated higher visuospatial working memory capabilities (LB-HVM) exhibited an enhanced precuneus-lingual gyrus network. This differential connectivity suggests that the visuospatial working memory developed through prior visual experience enhanced learning of sound localization in LB-HVM. Active visuospatial navigation processes could have occurred in LB-HVM, compared to the retrieval of previously bound information from long-term memory in EB. The precuneus appears to play a crucial role in learning of sound localization, regardless of prior visual experience. Prior visual experience, however, could enhance cross-modal learning by extending binding to the integration of unprocessed information, mediated by the cognitive functions that such experience develops.

  8. ACTINIDE REMOVAL PROCESS SAMPLE ANALYSIS, CHEMICAL MODELING, AND FILTRATION EVALUATION

    Energy Technology Data Exchange (ETDEWEB)

    Martino, C.; Herman, D.; Pike, J.; Peters, T.

    2014-06-05

    Filtration within the Actinide Removal Process (ARP) currently limits the throughput in interim salt processing at the Savannah River Site. In this process, batches of salt solution with Monosodium Titanate (MST) sorbent are concentrated by crossflow filtration. The filtrate is subsequently processed to remove cesium in the Modular Caustic Side Solvent Extraction Unit (MCU) followed by disposal in saltstone grout. The concentrated MST slurry is washed and sent to the Defense Waste Processing Facility (DWPF) for vitrification. During recent ARP processing, there has been a degradation of filter performance manifested as the inability to maintain high filtrate flux throughout a multi-batch cycle. The objectives of this effort were to characterize the feed streams, to determine if solids (in addition to MST) are precipitating and causing the degraded performance of the filters, and to assess the particle size and rheological data to address potential filtration impacts. Equilibrium modelling with OLI Analyzer™ and OLI ESP™ was performed to determine chemical components at risk of precipitation and to simulate the ARP process. The performance of ARP filtration was evaluated to review potential causes of the observed filter behavior. Task activities for this study included extensive physical and chemical analysis of samples from the Late Wash Pump Tank (LWPT) and the Late Wash Hold Tank (LWHT) within ARP as well as samples of the tank farm feed from Tank 49H. The samples from the LWPT and LWHT were obtained from several stages of processing of Salt Batch 6D, Cycle 6, Batch 16.

  9. Design aspects of automation system for initial processing of fecal samples

    International Nuclear Information System (INIS)

    Sawant, Pramilla D.; Prabhu, Supreetha P.; Suja, A.; Wankhede, Sonal; Chaudhary, Seema; Rao, D.D.; Pradeepkumar, K.S.; Das, A.P.; Badodkar, B.D.

    2014-01-01

    The procedure for initial handling of fecal samples at the Bioassay Lab., Trombay is as follows: overnight fecal samples are collected from the worker in a kit consisting of a polythene bag placed in a wide-mouth polythene container closed with an inner lid and a screw cap. The occupational worker collects the sample in the polythene bag. On receiving the sample, the polythene container along with the sample is weighed; the polythene bag containing the fecal sample is lifted out of the container using a pair of tongs, placed inside a crucible, and ashed inside a muffle furnace at 450℃. After complete ashing, the crucible containing white ash is taken up for further radiochemical processing. This paper describes the various steps in developing a prototype automated system for the initial handling of fecal samples, intended to automate the above procedure. Once developed, the system will help eliminate manual intervention up to the ashing stage and reduce the biological hazard involved in handling such samples.

  10. A Green Preconcentration Method for Determination of Cobalt and Lead in Fresh Surface and Waste Water Samples Prior to Flame Atomic Absorption Spectrometry

    Directory of Open Access Journals (Sweden)

    Naeemullah

    2012-01-01

    Full Text Available Cloud point extraction (CPE) has been used for the preconcentration and simultaneous determination of cobalt (Co) and lead (Pb) in fresh and wastewater samples. The extraction of analytes from aqueous samples was performed in the presence of 8-hydroxyquinoline (oxine) as a chelating agent and Triton X-114 as a nonionic surfactant. Experiments were conducted to assess the effect of different chemical variables such as pH, amounts of reagents (oxine and Triton X-114), temperature, incubation time, and sample volume. After phase separation based on the cloud point, the surfactant-rich phase was diluted with acidic ethanol prior to its analysis by flame atomic absorption spectrometry (FAAS). Enhancement factors of 70 and 50, with detection limits of 0.26 μg L−1 and 0.44 μg L−1, were obtained for Co and Pb, respectively. In order to validate the developed method, a certified reference material (SRM 1643e) was analyzed, and the determined values were in good agreement with the certified values. The proposed method was applied successfully to the determination of Co and Pb in fresh surface and waste water samples.

  11. Prior and present evidence: how prior experience interacts with present information in a perceptual decision making task.

    Directory of Open Access Journals (Sweden)

    Muhsin Karim

    Full Text Available Vibrotactile discrimination tasks have been used to examine decision making processes in the presence of perceptual uncertainty, induced by barely discernible frequency differences between paired stimuli or by the presence of embedded noise. One lesser known property of such tasks is that decisions made on a single trial may be biased by information from prior trials. An example is the time-order effect whereby the presentation order of paired stimuli may introduce differences in accuracy. Subjects perform better when the first stimulus lies between the second stimulus and the global mean of all stimuli on the judged dimension ("preferred" time-orders) compared to the alternative presentation order ("nonpreferred" time-orders). This has been conceptualised as a "drift" of the first stimulus representation towards the global mean of the stimulus-set (an internal standard). We describe the influence of prior information in relation to the more traditionally studied factors of interest in a classic discrimination task. Sixty subjects performed a vibrotactile discrimination task with different levels of uncertainty parametrically induced by increasing task difficulty, aperiodic stimulus noise, and changing the task instructions whilst maintaining identical stimulus properties (the "context"). The time-order effect had a greater influence on task performance than two of the explicit factors (task difficulty and noise) but not context. The influence of prior information increased with the distance of the first stimulus from the global mean, suggesting that the "drift" velocity of the first stimulus towards the global mean representation was greater for these trials. Awareness of the time-order effect and prior information in general is essential when studying perceptual decision making tasks. Implicit mechanisms may have a greater influence than the explicit factors under study. It also affords valuable insights into basic mechanisms of information

  12. Fission products in National Atmospheric Deposition Program—Wet deposition samples prior to and following the Fukushima Dai-Ichi Nuclear Power Plant incident, March 8–April 5, 2011

    Science.gov (United States)

    Wetherbee, Gregory A.; Debey, Timothy M.; Nilles, Mark A.; Lehmann, Christopher M.B.; Gay, David A.

    2012-01-01

    Radioactive isotopes I-131, Cs-134, or Cs-137, products of uranium fission, were measured at approximately 20 percent of 167 sampled National Atmospheric Deposition Program monitoring sites in North America (primarily in the contiguous United States and Alaska) after the Fukushima Dai-Ichi Nuclear Power Plant incident on March 12, 2011. Samples from the National Atmospheric Deposition Program were analyzed for the period of March 8-April 5, 2011. Calculated 1- or 2-week radionuclide deposition fluxes at 35 sites from Alaska to Vermont ranged from 0.47 to 5,100 Becquerels per square meter during the sampling period of March 15-April 5, 2011. No fission-product isotopes were measured in National Atmospheric Deposition Program samples obtained during March 8-15, 2011, prior to the arrival of contaminated air in North America.

  13. The Prior Can Often Only Be Understood in the Context of the Likelihood

    Directory of Open Access Journals (Sweden)

    Andrew Gelman

    2017-10-01

    Full Text Available A key sticking point of Bayesian analysis is the choice of prior distribution, and there is a vast literature on potential defaults including uniform priors, Jeffreys’ priors, reference priors, maximum entropy priors, and weakly informative priors. These methods, however, often manifest a key conceptual tension in prior modeling: a model encoding true prior information should be chosen without reference to the model of the measurement process, but almost all common prior modeling techniques are implicitly motivated by a reference likelihood. In this paper we resolve this apparent paradox by placing the choice of prior into the context of the entire Bayesian analysis, from inference to prediction to model evaluation.

  14. Sampling and examination methods used for TMI-2 samples

    International Nuclear Information System (INIS)

    Marley, A.W.; Akers, D.W.; McIsaac, C.V.

    1988-01-01

    The purpose of this paper is to summarize the sampling and examination techniques that were used in the collection and analysis of TMI-2 samples. Samples ranging from auxiliary building air to core debris were collected and analyzed. Handling of the larger samples and many of the smaller samples had to be done remotely, and many standard laboratory analytical techniques were modified to accommodate the extremely high radiation fields associated with these samples. The TMI-2 samples presented unique problems with the sampling and laboratory analysis of previously molten fuel debris. 14 refs., 8 figs

  15. Influence of volume of sample processed on detection of Chlamydia trachomatis in urogenital samples by PCR

    NARCIS (Netherlands)

    Goessens, W H; Kluytmans, J A; den Toom, N; van Rijsoort-Vos, T H; Niesters, B G; Stolz, E; Verbrugh, H A; Quint, W G

    In the present study, it was demonstrated that the sensitivity of the PCR for the detection of Chlamydia trachomatis is influenced by the volume of the clinical sample which is processed in the PCR. An adequate sensitivity for PCR was established by processing at least 4%, i.e., 80 microliters, of

  16. Gravimetric dust sampling for control purposes and occupational dust sampling.

    CSIR Research Space (South Africa)

    Unsted, AD

    1997-02-01

    Full Text Available Prior to the introduction of gravimetric dust sampling, konimeters had been used for dust sampling, which was largely for control purposes. Whether or not absolute results were achievable was not an issue since relative results were used to evaluate...

  17. Variational segmentation problems using prior knowledge in imaging and vision

    DEFF Research Database (Denmark)

    Fundana, Ketut

    This dissertation addresses variational formulation of segmentation problems using prior knowledge. Variational models are among the most successful approaches for solving many Computer Vision and Image Processing problems. The models aim at finding the solution to a given energy functional defined......, prior knowledge is needed to obtain the desired solution. The introduction of shape priors in particular, has proven to be an effective way to segment objects of interests. Firstly, we propose a prior-based variational segmentation model to segment objects of interest in image sequences, that can deal....... Many objects have high variability in shape and orientation. This often leads to unsatisfactory results, when using a segmentation model with single shape template. One way to solve this is by using more sophisticated shape models. We propose to incorporate shape priors from a shape sub...

  18. Sample Size for Tablet Compression and Capsule Filling Events During Process Validation.

    Science.gov (United States)

    Charoo, Naseem Ahmad; Durivage, Mark; Rahman, Ziyaur; Ayad, Mohamad Haitham

    2017-12-01

    During solid dosage form manufacturing, the uniformity of dosage units (UDU) is ensured by testing samples at 2 stages, that is, blend stage and tablet compression or capsule/powder filling stage. The aim of this work is to propose a sample size selection approach based on quality risk management principles for process performance qualification (PPQ) and continued process verification (CPV) stages by linking UDU to potential formulation and process risk factors. Bayes success run theorem appeared to be the most appropriate approach among various methods considered in this work for computing sample size for PPQ. The sample sizes for high-risk (reliability level of 99%), medium-risk (reliability level of 95%), and low-risk factors (reliability level of 90%) were estimated to be 299, 59, and 29, respectively. Risk-based assignment of reliability levels was supported by the fact that at low defect rate, the confidence to detect out-of-specification units would decrease which must be supplemented with an increase in sample size to enhance the confidence in estimation. Based on level of knowledge acquired during PPQ and the level of knowledge further required to comprehend process, sample size for CPV was calculated using Bayesian statistics to accomplish reduced sampling design for CPV. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
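    The quoted sample sizes follow from the success-run relation 1 − C = Rⁿ, apparently evaluated at a 95% confidence level C, since that choice reproduces 299, 59 and 29; a short sketch of the computation:

        import math

        def success_run_n(reliability, confidence=0.95):
            """Zero-failure (success-run) sample size: smallest n with
            1 - reliability**n >= confidence."""
            return math.ceil(math.log(1 - confidence) / math.log(reliability))

        for r in (0.99, 0.95, 0.90):
            print(f"reliability {r:.0%}: n = {success_run_n(r)}")
        # -> 299, 59, 29, matching the high/medium/low-risk sample sizes above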

  19. Superposing pure quantum states with partial prior information

    Science.gov (United States)

    Dogra, Shruti; Thomas, George; Ghosh, Sibasish; Suter, Dieter

    2018-05-01

    The principle of superposition is an intriguing feature of quantum mechanics, which is regularly exploited in many different circumstances. A recent work [M. Oszmaniec et al., Phys. Rev. Lett. 116, 110403 (2016), 10.1103/PhysRevLett.116.110403] shows that the fundamentals of quantum mechanics restrict the process of superimposing two unknown pure states, even though it is possible to superimpose two quantum states with partial prior knowledge. The prior knowledge imposes geometrical constraints on the choice of input states. We discuss an experimentally feasible protocol to superimpose multiple pure states of a d-dimensional quantum system and carry out an explicit experimental realization for two single-qubit pure states with partial prior information on a two-qubit NMR quantum information processor.

  20. Constrained noninformative priors

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1994-10-01

    The Jeffreys noninformative prior distribution for a single unknown parameter is the distribution corresponding to a uniform distribution in the transformed model where the unknown parameter is approximately a location parameter. To obtain a prior distribution with a specified mean but with diffusion reflecting great uncertainty, a natural generalization of the noninformative prior is the distribution corresponding to the constrained maximum entropy distribution in the transformed model. Examples are given
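
    A schematic statement of this construction, assuming a one-to-one transformation φ(θ) to an approximately location parameter and a single prior-mean constraint (the report's exact derivation may differ):

    ```latex
    % Jeffreys: uniform in the location-like parameterization \phi(\theta).
    % Maximum entropy relative to that uniform measure, subject to a
    % specified prior mean m, yields an exponential tilt:
    \pi(\phi) \propto \exp(\lambda \phi)
    \quad\Longrightarrow\quad
    \pi(\theta) \propto \exp\{\lambda\,\phi(\theta)\}\,\left|\phi'(\theta)\right|,
    \qquad \int \theta\,\pi(\theta)\,\mathrm{d}\theta = m.
    ```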

  1. Comparative analysis of whole mount processing and systematic sampling of radical prostatectomy specimens: pathological outcomes and risk of biochemical recurrence.

    Science.gov (United States)

    Salem, Shady; Chang, Sam S; Clark, Peter E; Davis, Rodney; Herrell, S Duke; Kordan, Yakup; Wills, Marcia L; Shappell, Scott B; Baumgartner, Roxelyn; Phillips, Sharon; Smith, Joseph A; Cookson, Michael S; Barocas, Daniel A

    2010-10-01

    Whole mount processing is more resource intensive than routine systematic sampling of radical retropubic prostatectomy specimens. We compared whole mount and systematic sampling for detecting pathological outcomes, and compared the prognostic value of pathological findings across pathological methods. We included men (608 whole mount and 525 systematic sampling samples) with no prior treatment who underwent radical retropubic prostatectomy at Vanderbilt University Medical Center between January 2000 and June 2008. We used univariate and multivariate analysis to compare the pathological outcome detection rate between pathological methods. Kaplan-Meier curves and the log rank test were used to compare the prognostic value of pathological findings across pathological methods. There were no significant differences between the whole mount and the systematic sampling groups in detecting extraprostatic extension (25% vs 30%), positive surgical margins (31% vs 31%), pathological Gleason score less than 7 (49% vs 43%), 7 (39% vs 43%) or greater than 7 (12% vs 13%), seminal vesicle invasion (8% vs 10%) or lymph node involvement (3% vs 5%). Tumor volume was higher in the systematic sampling group and whole mount detected more multiple surgical margins (each p <0.05). Whole mount and systematic sampling yield similar pathological information. Each method stratifies patients into comparable risk groups for biochemical recurrence. Thus, while whole mount is more resource intensive, it does not appear to result in improved detection of clinically important pathological outcomes or prognostication. Copyright © 2010 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  2. Evaluating the optimum rest period prior to blood collection for fractionated plasma free metanephrines analysis

    LENUS (Irish Health Repository)

    Griffin, T.P.

    2016-05-01

    The high diagnostic accuracy of plasma metanephrines (PMets) in the diagnosis of Phaeochromocytoma/Paraganglioma (PPGL) is well established. Considerable controversy exists regarding optimum sampling conditions for PMets. The use of reference intervals that do not compromise diagnostic sensitivity is recommended. However, the optimum rest period prior to sampling has yet to be clearly established. The aim of this study was to evaluate PMets concentrations in paired blood samples collected following 30 and 40 min of seated rest prior to sampling, in patients in whom it was clinically reasonable to suspect that PPGL may be present.

  3. pH adjustment of human blood plasma prior to bioanalytical sample preparation

    NARCIS (Netherlands)

    Hendriks, G.; Uges, D. R. A.; Franke, J. P.

    2008-01-01

    pH adjustment in bioanalytical sample preparation concerning ionisable compounds is one of the most common sample treatments. This is often done by mixing an aliquot of the sample with a proper buffer adjusted to the proposed pH. The pH of the resulting mixture however, does not necessarily have to

  4. Sample preservation, transport and processing strategies for honeybee RNA extraction: Influence on RNA yield, quality, target quantification and data normalization.

    Science.gov (United States)

    Forsgren, Eva; Locke, Barbara; Semberg, Emilia; Laugen, Ane T; Miranda, Joachim R de

    2017-08-01

    Viral infections in managed honey bees are numerous, and most of them are caused by viruses with an RNA genome. Since RNA degrades rapidly, appropriate sample management and RNA extraction methods are imperative to get high quality RNA for downstream assays. This study evaluated the effect of various sampling-transport scenarios (combinations of temperature, RNA stabilizers, and duration of transport) on six RNA quality parameters: yield, purity, integrity, cDNA synthesis efficiency, target detection and quantification. The use of water and extraction buffer were also compared for a primary bee tissue homogenate prior to RNA extraction. The strategy least affected by time was preservation of samples at -80°C. All other regimens turned out to be poor alternatives unless the samples were frozen or processed within 24 h. Chemical stabilizers have the greatest impact on RNA quality, and adding an extra homogenization step (a QIAshredder™ homogenizer) to the extraction protocol significantly improves the RNA yield and chemical purity. This study confirms that RIN values (RNA Integrity Number) should be used cautiously with bee RNA. Using water for the primary homogenate has no negative effect on RNA quality as long as this step is no longer than 15 min. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Nonuniform sampling and non-Fourier signal processing methods in multidimensional NMR.

    Science.gov (United States)

    Mobli, Mehdi; Hoch, Jeffrey C

    2014-11-01

    Beginning with the introduction of Fourier Transform NMR by Ernst and Anderson in 1966, time domain measurement of the impulse response (the free induction decay, FID) consisted of sampling the signal at a series of discrete intervals. For compatibility with the discrete Fourier transform (DFT), the intervals are kept uniform, and the Nyquist theorem dictates the largest value of the interval sufficient to avoid aliasing. With the proposal by Jeener of parametric sampling along an indirect time dimension, extension to multidimensional experiments employed the same sampling techniques used in one dimension, similarly subject to the Nyquist condition and suitable for processing via the discrete Fourier transform. The challenges of obtaining high-resolution spectral estimates from short data records using the DFT were already well understood, however. Despite techniques such as linear prediction extrapolation, the achievable resolution in the indirect dimensions is limited by practical constraints on measuring time. The advent of non-Fourier methods of spectrum analysis capable of processing nonuniformly sampled data has led to an explosion in the development of novel sampling strategies that avoid the limits on resolution and measurement time imposed by uniform sampling. The first part of this review discusses the many approaches to data sampling in multidimensional NMR, the second part highlights commonly used methods for signal processing of such data, and the review concludes with a discussion of other approaches to speeding up data acquisition in NMR. Copyright © 2014 Elsevier B.V. All rights reserved.
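
    To make the schedule idea concrete, here is a minimal sketch of a randomized nonuniform sampling schedule drawn from the Nyquist grid of an indirect dimension (all names and parameter choices are illustrative, not taken from the review; practical schedules often weight points by the expected signal envelope):

    ```python
    import numpy as np

    def nus_schedule(sw_hz: float, n_grid: int, fraction: float, seed: int = 0):
        """Pick a random subset of Nyquist-grid evolution times (a simple NUS scheme).

        sw_hz    : spectral width; the Nyquist theorem fixes dt = 1 / sw_hz
        n_grid   : size of the full uniform grid in the indirect dimension
        fraction : fraction of grid points actually sampled
        """
        dt = 1.0 / sw_hz                  # largest interval that avoids aliasing
        rng = np.random.default_rng(seed)
        idx = np.sort(rng.choice(n_grid, size=int(fraction * n_grid), replace=False))
        return idx * dt                   # nonuniform evolution times

    times = nus_schedule(sw_hz=2000.0, n_grid=256, fraction=0.25)
    ```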

  6. Comparing nonparametric Bayesian tree priors for clonal reconstruction of tumors.

    Science.gov (United States)

    Deshwar, Amit G; Vembu, Shankar; Morris, Quaid

    2015-01-01

    Statistical machine learning methods, especially nonparametric Bayesian methods, have become increasingly popular to infer clonal population structure of tumors. Here we describe the treeCRP, an extension of the Chinese restaurant process (CRP), a popular construction used in nonparametric mixture models, to infer the phylogeny and genotype of major subclonal lineages represented in the population of cancer cells. We also propose new split-merge updates tailored to the subclonal reconstruction problem that improve the mixing time of Markov chains. In comparisons with the tree-structured stick breaking prior used in PhyloSub, we demonstrate superior mixing and running time using the treeCRP with our new split-merge procedures. We also show that given the same number of samples, TSSB and treeCRP have similar ability to recover the subclonal structure of a tumor…
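
    For readers unfamiliar with the base construction, the following minimal sketch simulates a plain Chinese restaurant process, the building block that the treeCRP extends (the tree-structured extension and the split-merge updates are not shown; names are illustrative):

    ```python
    import random

    def sample_crp(n_customers: int, alpha: float, seed: int = 0):
        """Simulate table assignments under a Chinese restaurant process.

        Customer i joins an existing table k with probability n_k / (i + alpha)
        and opens a new table with probability alpha / (i + alpha).
        """
        rng = random.Random(seed)
        tables = []                      # tables[k] = number of customers at table k
        assignments = []
        for _ in range(n_customers):
            weights = tables + [alpha]   # existing-table counts, then the new table
            k = rng.choices(range(len(weights)), weights=weights)[0]
            if k == len(tables):
                tables.append(1)
            else:
                tables[k] += 1
            assignments.append(k)
        return assignments

    print(sample_crp(20, alpha=1.0))
    ```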

  7. Real-time single image dehazing based on dark channel prior theory and guided filtering

    Science.gov (United States)

    Zhang, Zan

    2017-10-01

    Images and videos taken outdoors on foggy days are seriously degraded. In order to restore images degraded by fog and to overcome the traditional dark channel prior algorithm's problem of residual fog at edges, we propose a new dehazing method. We first find the fog area in the dark channel map using a quadtree search to obtain the estimated value of the transmittance. Then we regard the gray-scale image after guided filtering as the atmospheric light map and remove haze based on it. Box filtering and image down-sampling are also used to improve the processing speed. Finally, the atmospheric light scattering model is used to restore the image. Extensive experiments show that the algorithm is effective, efficient and has a wide range of application.
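
    The dark channel computation that this method builds on can be sketched as follows, using the generic formulation of He et al.; the quadtree fog-area search and the guided-filtering atmospheric light map described in the abstract are not reproduced, and the parameter values are illustrative:

    ```python
    import numpy as np
    from scipy.ndimage import minimum_filter

    def dark_channel(img: np.ndarray, patch: int = 15) -> np.ndarray:
        """Dark channel of an RGB image in [0, 1]: per-pixel channel minimum,
        followed by a local minimum filter over a square patch."""
        return minimum_filter(img.min(axis=2), size=patch)

    def estimate_transmission(img: np.ndarray, A: np.ndarray,
                              omega: float = 0.95, patch: int = 15) -> np.ndarray:
        """He-style transmission estimate t = 1 - omega * dark(I / A),
        where A is the (3,) atmospheric light vector."""
        return 1.0 - omega * dark_channel(img / A, patch)
    ```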

  8. Evaluation of standard methods for collecting and processing fuel moisture samples

    Science.gov (United States)

    Sally M. Haase; José Sánchez; David R. Weise

    2016-01-01

    A variety of techniques for collecting and processing samples to determine moisture content of wildland fuels in support of fire management activities were evaluated. The effects of using a chainsaw or handsaw to collect samples of largediameter wood, containers for storing and transporting collected samples, and quick-response ovens for estimating moisture content...

  9. A review of blood sample handling and pre-processing for metabolomics studies.

    Science.gov (United States)

    Hernandes, Vinicius Veri; Barbas, Coral; Dudzik, Danuta

    2017-09-01

    Metabolomics has been found to be applicable to a wide range of clinical studies, bringing a new era for improving clinical diagnostics, early disease detection, therapy prediction and treatment efficiency monitoring. A major challenge in metabolomics, particularly untargeted studies, is the extremely diverse and complex nature of biological specimens. Despite great advances in the field there still exist fundamental needs for considering pre-analytical variability that can introduce bias to the subsequent analytical process and decrease the reliability of the results and moreover confound final research outcomes. Many researchers are mainly focused on the instrumental aspects of the biomarker discovery process, and sample related variables sometimes seem to be overlooked. To bridge the gap, critical information and standardized protocols regarding experimental design and sample handling and pre-processing are highly desired. Characterization of the range of variation among sample collection methods is necessary to prevent misinterpretation of results and to ensure that observed differences are not due to an experimental bias caused by inconsistencies in sample processing. Herein, a systematic discussion of pre-analytical variables affecting metabolomics studies based on blood derived samples is performed. Furthermore, we provide a set of recommendations concerning experimental design, collection, pre-processing procedures and storage conditions as a practical review that can guide and serve for the standardization of protocols and reduction of undesirable variation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. High peak power processing up to 100 MV/M on various metallic samples

    International Nuclear Information System (INIS)

    Luong, M.; Bonin, B.; Safa, H.

    1996-01-01

    The high peak power processing (HPPP) is a well established way to reduce electronic field emission from radiofrequency (RF) metallic surfaces. The processing occurs because some kind of instability destroys the emitter, but the basic physical mechanism at work has not yet been clearly identified. RF processing experiments on samples of restricted area are described, with well-localized artificial emitting sites (protrusions from scratches on the sample surface). In order to disentangle the roles of thermal and mechanical effects in the processing, the samples were made from metals with different melting temperatures and tensile strengths. (author)

  11. High peak power processing up to 100 MV/m on various metallic samples

    International Nuclear Information System (INIS)

    Luong, M.; Bonin, B.; Safa, H.; Le Goff, A.

    1996-01-01

    The high peak power processing (HPPP) is a well established way to reduce electronic field emission from radiofrequency (RF) metallic surfaces. The processing occurs because some kind of instability destroys the emitter, but the basic physical mechanism at work has not yet been clearly identified. The present study describes RF processing experiments on samples of restricted area, with well-localized artificial emitting sites (protrusions from scratches on the sample surface). In order to disentangle the roles of thermal and mechanical effects in the processing, the samples were made from metals with different melting temperatures and tensile strengths. (author)

  12. Rare behavior of growth processes via umbrella sampling of trajectories

    Science.gov (United States)

    Klymko, Katherine; Geissler, Phillip L.; Garrahan, Juan P.; Whitelam, Stephen

    2018-03-01

    We compute probability distributions of trajectory observables for reversible and irreversible growth processes. These results reveal a correspondence between reversible and irreversible processes, at particular points in parameter space, in terms of their typical and atypical trajectories. Thus key features of growth processes can be insensitive to the precise form of the rate constants used to generate them, recalling the insensitivity to microscopic details of certain equilibrium behavior. We obtained these results using a sampling method, inspired by the "s -ensemble" large-deviation formalism, that amounts to umbrella sampling in trajectory space. The method is a simple variant of existing approaches, and applies to ensembles of trajectories controlled by the total number of events. It can be used to determine large-deviation rate functions for trajectory observables in or out of equilibrium.

  13. Rapid determination of benzene derivatives in water samples by trace volume solvent DLLME prior to GC-FID

    Energy Technology Data Exchange (ETDEWEB)

    Diao, Chun Peng; Wei, Chao Hai; Feng, Chun Hua [South China Univ. of Technology, Guangzhou Higher Education Mega Center (China). College of Environmental Science and Engineering; Guangdong Regular Higher Education Institutions, Guangzhou (China). Key Lab. of Environmental Protection and Eco-Remediation

    2012-05-15

    An inexpensive, simple and environmentally friendly method based on dispersive liquid-liquid microextraction (DLLME) for rapid determination of benzene derivatives in water samples was proposed. A significant improvement of the DLLME procedure was achieved. A trace volume of ethyl acetate (60 µL) was used as the dispersion solvent instead of common ones such as methanol and acetone, whose required volume exceeds 0.5 mL, so the organic solvent required in DLLME was reduced to a great extent. Only 83 µL of organic solvent was consumed in the whole analytical process and the preconcentration procedure took less than 10 min. The approach, coupled with a gas chromatograph-flame ionization detector, was proposed for the rapid determination of benzene, toluene, ethylbenzene and xylene isomers in water samples. Results showed that the proposed approach is an efficient method for rapid determination of benzene derivatives in aqueous samples. (orig.)

  14. Influence of sample processing on the analysis of carotenoids in maize.

    Science.gov (United States)

    Rivera, Sol; Canela, Ramon

    2012-09-21

    We performed a number of tests with the aim of developing an effective extraction method for the analysis of carotenoid content in maize seed. Mixtures of methanol-ethyl acetate (6:4, v/v) and methanol-tetrahydrofuran (1:1, v/v) were the most effective solvent systems for carotenoid extraction from maize endosperm under the conditions assayed. In addition, we also addressed sample preparation prior to the analysis of carotenoids by liquid chromatography (LC). The LC response of extracted carotenoids and standards in several solvents was evaluated and the results were related to the degree of solubility of these pigments. Three key factors were found to be important when selecting a suitable injection solvent: compatibility between the mobile phase and the injection solvent, carotenoid polarity and content in the matrix.

  15. Discovering biological progression underlying microarray samples.

    Directory of Open Access Journals (Sweden)

    Peng Qiu

    2011-04-01

    Full Text Available In biological systems that undergo processes such as differentiation, a clear concept of progression exists. We present a novel computational approach, called Sample Progression Discovery (SPD), to discover patterns of biological progression underlying microarray gene expression data. SPD assumes that individual samples of a microarray dataset are related by an unknown biological process (i.e., differentiation, development, cell cycle, disease progression), and that each sample represents one unknown point along the progression of that process. SPD aims to organize the samples in a manner that reveals the underlying progression and to simultaneously identify subsets of genes that are responsible for that progression. We demonstrate the performance of SPD on a variety of microarray datasets that were generated by sampling a biological process at different points along its progression, without providing SPD any information of the underlying process. When applied to a cell cycle time series microarray dataset, SPD was not provided any prior knowledge of samples' time order or of which genes are cell-cycle regulated, yet SPD recovered the correct time order and identified many genes that have been associated with the cell cycle. When applied to B-cell differentiation data, SPD recovered the correct order of stages of normal B-cell differentiation and the linkage between preB-ALL tumor cells and their cell of origin, preB. When applied to mouse embryonic stem cell differentiation data, SPD uncovered a landscape of ESC differentiation into various lineages and genes that represent both generic and lineage-specific processes. When applied to a prostate cancer microarray dataset, SPD identified gene modules that reflect a progression consistent with disease stages. SPD may be best viewed as a novel tool for synthesizing biological hypotheses because it provides a likely biological progression underlying a microarray dataset and, perhaps more importantly, the

  16. Static sampling of dynamic processes - a paradox?

    Science.gov (United States)

    Mälicke, Mirko; Neuper, Malte; Jackisch, Conrad; Hassler, Sibylle; Zehe, Erwin

    2017-04-01

    Environmental systems monitoring aims at its core at the detection of spatio-temporal patterns of processes and system states, which is a pre-requisite for understanding and explaining their baffling heterogeneity. Most observation networks rely on distributed point sampling of states and fluxes of interest, which is combined with proxy-variables from either remote sensing or near surface geophysics. The cardinal question on the appropriate experimental design of such a monitoring network has up to now been answered in many different ways. Suggested approaches range from sampling in a dense regular grid (as used for the so-called green machine), transects along typical catenas, clustering of several observation sensors in presumed functional units or HRUs, and arrangements of those clusters along presumed lateral flow paths, to, last but not least, a nested, randomized stratified arrangement of sensors or samples. Common to all these approaches is that they provide a rather static spatial sampling, while state variables and their spatial covariance structure dynamically change in time. It is hence of key interest how much of our still incomplete understanding stems from inappropriate sampling and how much needs to be attributed to an inappropriate analysis of spatial data sets. We suggest that it is much more promising to analyze the spatial variability of processes, for instance changes in soil moisture values, than to investigate the spatial variability of soil moisture states themselves. This is because wetting of the soil, reflected in a soil moisture increase, is caused by a totally different meteorological driver - rainfall - than drying of the soil. We hence propose that the rising and falling limbs of soil moisture time series belong essentially to different ensembles, as they are influenced by different drivers. Positive and negative temporal changes in soil moisture hence need to be analyzed separately. We test this idea using the CAOS data set as a benchmark

  17. Active Prior Tactile Knowledge Transfer for Learning Tactual Properties of New Objects

    Directory of Open Access Journals (Sweden)

    Di Feng

    2018-02-01

    Full Text Available Reusing the tactile knowledge of some previously-explored objects (prior objects) helps us to easily recognize the tactual properties of new objects. In this paper, we enable a robotic arm equipped with multi-modal artificial skin, like humans, to actively transfer prior tactile exploratory action experiences when it learns the detailed physical properties of new objects. These experiences, or prior tactile knowledge, are built from the feature observations that the robot perceives from multiple sensory modalities when it applies pressing, sliding, and static contact movements on objects with different action parameters. We call our method Active Prior Tactile Knowledge Transfer (APTKT), and systematically evaluated its performance in several experiments. Results show that the robot improved the discrimination accuracy by around 10% when it used only one training sample with the feature observations of prior objects. By further incorporating the predictions from the observation models of prior objects as auxiliary features, our method improved the discrimination accuracy by over 20%. The results also show that the proposed method is robust against transferring irrelevant prior tactile knowledge (negative knowledge transfer).

  18. Advancement of Solidification Processing Technology Through Real Time X-Ray Transmission Microscopy: Sample Preparation

    Science.gov (United States)

    Stefanescu, D. M.; Curreri, P. A.

    1996-01-01

    Two types of samples were prepared for the real time X-ray transmission microscopy (XTM) characterization. In the first series, directional solidification experiments were carried out to evaluate the critical velocity of engulfment of zirconia particles in the Al and Al-Ni eutectic matrix under ground (1-g) conditions. The particle distribution in the samples was recorded on video before and after the samples were directionally solidified. In the second series, samples of the above two types of composites were prepared for directional solidification runs to be carried out on the Advanced Gradient Heating Facility (AGHF) aboard the space shuttle during the LMS mission in June 1996. X-ray microscopy proved to be an invaluable tool for characterizing the particle distribution in the metal matrix samples. This kind of analysis helped in accurately determining the critical velocity of engulfment of ceramic particles by the melt interface in the opaque metal matrix composites. The quality of the cast samples with respect to porosity and instrumented thermocouple sheath breakage or shift could be easily viewed, which helped in selecting samples for the space shuttle experiments. Summarizing the merits of this technique, it can be stated that it enabled the use of cast metal matrix composite samples, since the particle location was known prior to the experiment.

  19. 32 CFR 806.27 - Samples of Air Force FOIA processing documents.

    Science.gov (United States)

    2010-07-01

    ... ADMINISTRATION AIR FORCE FREEDOM OF INFORMATION ACT PROGRAM § 806.27 Samples of Air Force FOIA processing... determination within 20 workdays, we have instituted multitrack processing of requests. Based on the information... source; responsive records were part of the Air Force's decision-making process, and the prerelease...

  20. Processing scarce biological samples for light and transmission electron microscopy

    Directory of Open Access Journals (Sweden)

    P Taupin

    2008-06-01

    Full Text Available Light microscopy (LM) and transmission electron microscopy (TEM) aim at understanding the structure-function relationship. With advances in biology, isolation and purification of scarce populations of cells or subcellular structures may not yield enough biological material for processing for LM and TEM. A protocol for preparation of scarce biological samples is presented. It is based on pre-embedding the biological samples, suspensions or pellets, in bovine serum albumin (BSA) and bis-acrylamide (BA), cross-linked and polymerized. This preparation provides a simple and reproducible technique for processing biological materials that are present in limited quantities and cannot be amplified, for light and transmission electron microscopy.

  1. Finite element simulation of the T-shaped ECAP processing of round samples

    Science.gov (United States)

    Shaban Ghazani, Mehdi; Fardi-Ilkhchy, Ali; Binesh, Behzad

    2018-05-01

    Grain refinement is the only mechanism that increases the yield strength and toughness of materials simultaneously. Severe plastic deformation is one of the promising methods to refine the microstructure of materials. Among different severe plastic deformation processes, T-shaped equal channel angular pressing (T-ECAP) is a relatively new technique. In the present study, finite element analysis was conducted to evaluate the deformation behavior of metals during the T-ECAP process. The study focused mainly on flow characteristics, plastic strain distribution and its homogeneity, damage development, and pressing force, which are among the most important factors governing sound and successful processing of nanostructured materials by severe plastic deformation techniques. The results showed that plastic strain is localized on the bottom side of the sample and uniform deformation is not possible using T-ECAP processing. The friction coefficient between the sample and the die channel wall has little effect on strain distributions in the mirror plane and transverse plane of the deformed sample. Damage analysis showed that superficial cracks may initiate from the bottom side of the sample and their propagation will be limited due to the compressive state of stress. It was demonstrated that a V-shaped deformation zone exists in the T-ECAP process and that the pressing load needed for execution of the deformation process increases with friction.

  2. Using Priors to Compensate Geometrical Problems in Head-Mounted Eye Trackers

    DEFF Research Database (Denmark)

    Batista Narcizo, Fabricio; Ahmed, Zaheer; Hansen, Dan Witzner

    The use of additional information (a.k.a. priors) to help the eye tracking process is presented as an alternative to compensate for classical geometrical problems in head-mounted eye trackers. Priors can be obtained from several distinct sources, such as sensors to collect information related… estimation, especially for uncalibrated head-mounted setups…

  3. Data Set for the manuscript entitled, "Sample Processing Approach for Detection of Ricin in Surface Samples."

    Data.gov (United States)

    U.S. Environmental Protection Agency — Figure. This dataset is associated with the following publication: Shah, S., S. Kane, A.M. Erler, and T. Alfaro. Sample Processing Approach for Detection of Ricin in...

  4. PET reconstruction via nonlocal means induced prior.

    Science.gov (United States)

    Hou, Qingfeng; Huang, Jing; Bian, Zhaoying; Chen, Wufan; Ma, Jianhua

    2015-01-01

    The traditional Bayesian priors for maximum a posteriori (MAP) reconstruction methods usually incorporate local neighborhood interactions that penalize large deviations in parameter estimates for adjacent pixels; therefore, only local pixel differences are utilized. This limits their ability to penalize image roughness. To achieve high-quality PET image reconstruction, this study investigates a MAP reconstruction strategy incorporating a nonlocal means induced (NLMi) prior (NLMi-MAP), which enables utilizing global similarity information of the image. The present NLMi prior approximates the derivative of the Gibbs energy function by an NLM filtering process. Specifically, the NLMi prior is obtained by subtracting the current image estimate from its NLM-filtered version and feeding the residual error back to the reconstruction filter to yield the new image estimate. We tested the present NLMi-MAP method with simulated and real PET datasets. Comparison studies with conventional filtered backprojection (FBP) and a few iterative reconstruction methods clearly demonstrate that the present NLMi-MAP method performs better in lowering noise and preserving image edges, and achieves a higher signal-to-noise ratio (SNR). Extensive experimental results show that the NLMi-MAP method outperforms the existing methods in terms of cross profile, noise reduction, SNR, root mean square error (RMSE) and correlation coefficient (CORR).
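
    The residual-feedback idea can be sketched in a few lines as a prior-only relaxation step; the data-fidelity part of the MAP iteration is omitted, and the filter settings and names are illustrative rather than the paper's:

    ```python
    import numpy as np
    from skimage.restoration import denoise_nl_means

    def nlmi_prior_step(x: np.ndarray, beta: float = 0.5, h: float = 0.05) -> np.ndarray:
        """One NLM-induced prior update: filter the current estimate, take the
        residual (filtered - current), and feed it back to the estimate."""
        filtered = denoise_nl_means(x, patch_size=5, patch_distance=6, h=h)
        residual = filtered - x       # approximates the negative Gibbs-energy gradient
        return x + beta * residual    # prior-only relaxation step
    ```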

  5. Prior Expectations Bias Sensory Representations in Visual Cortex

    NARCIS (Netherlands)

    Kok, P.; Brouwer, G.J.; Gerven, M.A.J. van; Lange, F.P. de

    2013-01-01

    Perception is strongly influenced by expectations. Accordingly, perception has sometimes been cast as a process of inference, whereby sensory inputs are combined with prior knowledge. However, despite a wealth of behavioral literature supporting an account of perception as probabilistic inference,

  6. Logarithmic Laplacian Prior Based Bayesian Inverse Synthetic Aperture Radar Imaging.

    Science.gov (United States)

    Zhang, Shuanghui; Liu, Yongxiang; Li, Xiang; Bi, Guoan

    2016-04-28

    This paper presents a novel inverse synthetic aperture radar (ISAR) imaging algorithm based on a new sparse prior, known as the logarithmic Laplacian prior. The newly proposed logarithmic Laplacian prior has a narrower main lobe with higher tail values than the Laplacian prior, which helps to achieve performance improvement in sparse representation. The logarithmic Laplacian prior is used for ISAR imaging within the Bayesian framework to achieve a better focused radar image. In the proposed method of ISAR imaging, the phase errors are jointly estimated based on the minimum entropy criterion to accomplish autofocusing. Maximum a posteriori (MAP) estimation and maximum likelihood estimation (MLE) are utilized to estimate the model parameters to avoid a manual tuning process. Additionally, the fast Fourier transform (FFT) and Hadamard product are used to reduce the required computational cost. Experimental results based on both simulated and measured data validate that the proposed algorithm outperforms the traditional sparse ISAR imaging algorithms in terms of resolution improvement and noise suppression.
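
    One plausible way to visualize the claimed shape difference, assuming the illustrative parameterization p(x) ∝ (|x| + ε)^(−λ) for the logarithmic Laplacian density (the paper's exact form may differ):

    ```python
    import numpy as np

    x = np.linspace(-5.0, 5.0, 1001)
    b, lam, eps = 1.0, 1.0, 0.1
    laplace = np.exp(-np.abs(x) / b)           # Laplacian density shape, unit peak
    log_laplace = (np.abs(x) + eps) ** (-lam)  # assumed log-Laplacian shape
    log_laplace /= log_laplace.max()           # normalize to unit peak

    # Narrower main lobe: the assumed form decays faster near the origin ...
    print(laplace[550], log_laplace[550])      # values at x = 0.5
    # ... and heavier tails: it decays more slowly far from the origin.
    print(laplace[-1], log_laplace[-1])        # values at x = 5
    ```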

  7. Logarithmic Laplacian Prior Based Bayesian Inverse Synthetic Aperture Radar Imaging

    Directory of Open Access Journals (Sweden)

    Shuanghui Zhang

    2016-04-01

    Full Text Available This paper presents a novel Inverse Synthetic Aperture Radar Imaging (ISAR algorithm based on a new sparse prior, known as the logarithmic Laplacian prior. The newly proposed logarithmic Laplacian prior has a narrower main lobe with higher tail values than the Laplacian prior, which helps to achieve performance improvement on sparse representation. The logarithmic Laplacian prior is used for ISAR imaging within the Bayesian framework to achieve better focused radar image. In the proposed method of ISAR imaging, the phase errors are jointly estimated based on the minimum entropy criterion to accomplish autofocusing. The maximum a posterior (MAP estimation and the maximum likelihood estimation (MLE are utilized to estimate the model parameters to avoid manually tuning process. Additionally, the fast Fourier Transform (FFT and Hadamard product are used to minimize the required computational efficiency. Experimental results based on both simulated and measured data validate that the proposed algorithm outperforms the traditional sparse ISAR imaging algorithms in terms of resolution improvement and noise suppression.

  8. Highly oriented Bi-system bulk sample prepared by a decomposition-crystallization process

    International Nuclear Information System (INIS)

    Xi Zhengping; Zhou Lian; Ji Chunlin

    1992-01-01

    A decomposition-crystallization method for preparing highly oriented Bi-system bulk samples is reported. The effects of the processing parameters, decomposition temperature, cooling rate and post-treatment conditions on texture and superconductivity are investigated. The method has successfully prepared highly textured Bi-system bulk samples. High temperature annealing does not destroy the growing texture, but the cooling rate has some effect on texture and superconductivity. Annealing in an N2/O2 atmosphere can improve the superconductivity of the textured sample. The study on the superconductivity of the Bi(Pb)-Sr-Ca-Cu-O bulk material has been reported in numerous papers. Research on Jc concentrates on tape containing the 2223 phase, with very few studies on the Jc of bulk samples. The reason for the lack of studies is that the change of superconducting phases at high temperatures has not been known. The authors have reported that the 2212 phase incongruently melted at about 875 °C and proceeded to orient the c-axis perpendicular to the surface in the process of crystallization of the 2212 phase. Based on that result, a decomposition-crystallization method was proposed to prepare highly oriented Bi-system bulk samples. In this paper, the process is described in detail and the effects of processing parameters on texture and superconductivity are reported.

  9. Recognising Health Care Assistants' Prior Learning through a Caring Ideology

    Science.gov (United States)

    Sandberg, Fredrik

    2010-01-01

    This article critically appraises a process of recognising prior learning (RPL) using analytical tools from Habermas' theory of communicative action. The RPL process is part of an in-service training program for health care assistants where the goal is to become a licensed practical nurse. Data about the RPL process were collected using interviews…

  10. Relationships between processing delay and microbial load of broiler neck skin samples.

    Science.gov (United States)

    Lucianez, A; Holmes, M A; Tucker, A W

    2010-01-01

    The measurable microbial load on poultry carcasses during processing is determined by a number of factors including farm of origin, processing hygiene, and external temperature. This study investigated associations between carcass microbial load and progressive delays to processing. A total of 30 carcasses were delayed immediately after defeathering and before evisceration in a commercial abattoir in groups of five, and were held at ambient temperature for 1, 2, 3, 4, 6, and 8 h. Delayed carcasses were reintroduced to the processing line, and quantitative assessment of total viable count, coliforms, Staphylococcus aureus, and Pseudomonas spp. was undertaken on neck skin flap samples collected after carcass chilling and then pooled for each group. Sampling was repeated on 5 separate days, and the data were combined. Significant increases in total viable count (P = 0.001) and coliforms (P = 0.004), but not in S. aureus or Pseudomonas loads, were observed across the 8-h period of delay. In line with previous studies, there was significant variation in microbiological data according to sampling day. In conclusion, there is a significant and measurable decline in the microbiological status of uneviscerated but defeathered poultry carcasses after an 8-h delay, but the variability of sampling results, reflecting the wide range of factors that impact microbial load, means that it is not possible to determine maximum or minimum acceptable periods of processing delay based on this criterion alone.

  11. Determination of uranium in samples containing bulk aluminium

    International Nuclear Information System (INIS)

    Das, S.K.; Kannan, R.; Dhami, P.S.; Tripathi, S.C.; Gandhi, P.M.

    2015-01-01

    The determination of uranium is of great importance in the PUREX process, and uranium needs to be analyzed at different concentration ranges depending on the stage of reprocessing. Various techniques like volumetry, spectrophotometry, ICP-OES, fluorimetry, mass spectrometry etc. are used for the measurement of uranium in these samples. Fast and sensitive methods suitable for low level detection of uranium are desirable to cater to process needs. Microgram quantities of uranium are analyzed by a spectrophotometric method using 2-(5-bromo-2-pyridylazo)-5-diethylaminophenol (Br-PADAP) as the complexing agent. However, the presence of some metal ions, viz. Al, Pu, Zr etc., interferes in the analysis. Therefore, separation of uranium from such interfering metal ions is required prior to its analysis. This paper describes the analysis of uranium in samples containing aluminium as the major matrix

  12. Recognition of Prior Learning: The Participants' Perspective

    Science.gov (United States)

    Miguel, Marta C.; Ornelas, José H.; Maroco, João P.

    2016-01-01

    The current narrative on lifelong learning goes beyond formal education and training, including learning at work, in the family and in the community. Recognition of prior learning is a process of evaluation of those skills and knowledge acquired through life experience, allowing them to be formally recognized by the qualification systems. It is a…

  13. Co-production of electricity and ethanol, process economics of value prior combustion

    International Nuclear Information System (INIS)

    Treasure, T.; Gonzalez, R.; Venditti, R.; Pu, Y.; Jameel, H.; Kelley, S.; Prestemon, Jeffrey

    2012-01-01

    Highlights: ► Economics of producing cellulosic ethanol and bio-power in the same facility using an autohydrolysis process. ► Feedstock considerably affects the economics of the biorefinery facility. ► Lower moisture content improves the financial performance of the bio-power business. - Abstract: A process economic analysis of co-producing bioethanol and electricity (value prior to combustion) from mixed southern hardwood and southern yellow pine is presented. Bioethanol is produced by extracting carbohydrates from wood via autohydrolysis, membrane separation of byproducts, enzymatic hydrolysis of extracted oligomers and fermentation to ethanol. The residual solids after autohydrolysis are pressed and burned in a power boiler to generate steam and electricity. A base case scenario of biomass combustion to produce electricity is presented as a reference to understand the basics of bio-power generation economics. For the base case, a minimum electricity revenue of $70–$96/MWh must be realized to achieve a 6–12% internal rate of return. In the alternative co-production cases, the ethanol facility is treated as a separate business entity that purchases power and steam from the biomass power plant. The minimum ethanol revenue required to achieve a 12% internal rate of return was estimated to be $0.84–$1.05/l for hardwood and $0.74–$0.85/l for softwood. Based on current market conditions and an assumed future ethanol selling price of $0.65/l, the co-production of cellulosic bioethanol and power does not produce financeable returns. A risk analysis indicates a 26.6% probability of achieving an internal rate of return equal to or higher than 12%. It is suggested that focus be placed on improving yield and reducing CAPEX before this technology can be applied commercially. This modeling approach is a robust method to evaluate the economic feasibility of integrated production of bio-power and other products based on extracted hemicellulose.

  14. Sampling phased array a new technique for signal processing and ultrasonic imaging

    OpenAIRE

    Bulavinov, A.; Joneit, D.; Kröning, M.; Bernus, L.; Dalichow, M.H.; Reddy, K.M.

    2006-01-01

    Different signal processing and image reconstruction techniques are applied in ultrasonic non-destructive material evaluation. In recent years, rapid development in the fields of microelectronics and computer engineering led to wide application of phased array systems. A new phased array technique, called "Sampling Phased Array", has been developed at the Fraunhofer Institute for Non-Destructive Testing. It realizes a unique approach to the measurement and processing of ultrasonic signals. The sampling...

  15. Sampling and Timing: A Task for the Environmetal Process

    NARCIS (Netherlands)

    Hilderink, G.H.; Broenink, Johannes F.

    2003-01-01

    Sampling and timing is considered a responsibility of the environment of controller software. In this paper we will illustrate a concept whereby an environmental process and multi-way events play an important role in applying timing for untimed CSP software architectures. We use this timing concept

  16. Coordinate transformation and Polynomial Chaos for the Bayesian inference of a Gaussian process with parametrized prior covariance function

    KAUST Repository

    Sraj, Ihab

    2015-10-22

    This paper addresses model dimensionality reduction for Bayesian inference based on prior Gaussian fields with uncertainty in the covariance function hyper-parameters. The dimensionality reduction is traditionally achieved using the Karhunen-Loève expansion of a prior Gaussian process assuming covariance function with fixed hyper-parameters, despite the fact that these are uncertain in nature. The posterior distribution of the Karhunen-Loève coordinates is then inferred using available observations. The resulting inferred field is therefore dependent on the assumed hyper-parameters. Here, we seek to efficiently estimate both the field and covariance hyper-parameters using Bayesian inference. To this end, a generalized Karhunen-Loève expansion is derived using a coordinate transformation to account for the dependence with respect to the covariance hyper-parameters. Polynomial Chaos expansions are employed for the acceleration of the Bayesian inference using similar coordinate transformations, enabling us to avoid expanding explicitly the solution dependence on the uncertain hyper-parameters. We demonstrate the feasibility of the proposed method on a transient diffusion equation by inferring spatially-varying log-diffusivity fields from noisy data. The inferred profiles were found closer to the true profiles when including the hyper-parameters’ uncertainty in the inference formulation.
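
    For reference, the standard fixed-hyper-parameter Karhunen-Loève baseline that the paper generalizes can be sketched as follows; the squared-exponential covariance and all parameter values are illustrative, and the coordinate transformation and Polynomial Chaos acceleration are not shown:

    ```python
    import numpy as np

    def kl_expansion(x: np.ndarray, ell: float, sigma: float, n_terms: int, seed: int = 0):
        """Truncated Karhunen-Loeve sample of a zero-mean Gaussian field on grid x,
        for a squared-exponential covariance with fixed hyper-parameters
        (length scale ell, standard deviation sigma)."""
        C = sigma**2 * np.exp(-0.5 * (x[:, None] - x[None, :])**2 / ell**2)
        vals, vecs = np.linalg.eigh(C)                            # ascending eigenvalues
        vals, vecs = vals[::-1][:n_terms], vecs[:, ::-1][:, :n_terms]
        xi = np.random.default_rng(seed).standard_normal(n_terms)  # KL coordinates
        return vecs @ (np.sqrt(np.clip(vals, 0.0, None)) * xi)

    field = kl_expansion(np.linspace(0.0, 1.0, 200), ell=0.2, sigma=1.0, n_terms=20)
    ```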

  17. Lyophilization: a useful approach to the automation of analytical processes?

    OpenAIRE

    de Castro, M. D. Luque; Izquierdo, A.

    1990-01-01

    An overview of the state-of-the-art in the use of lyophilization for the pretreatment of samples and standards prior to their storage and/or preconcentration is presented. The different analytical applications of this process are dealt with according to the type of material (reagent, standard, samples) and matrix involved.

  18. Evaluation of lyophilization for the preconcentration of natural water samples prior to neutron activation analysis

    International Nuclear Information System (INIS)

    Harrison, S.H.; LaFleur, P.D.; Zoller, W.H.

    1975-01-01

    Water is preconcentrated by freeze drying using a method which virtually eliminates sample contamination and trace element losses. To test the possibility of losses of volatile elements during the drying process, known quantities of radioactive tracers for 21 elements were added to water, the solutions freeze dried, and the tracer residues counted. The results confirm that at least 95 percent of all but the most volatile elements studied (Hg and I) were retained in the residue. The problem of transferring quantitatively the dry residue from the freeze drying container to an irradiation container was eliminated by designing a freeze drying container that would also serve as an irradiation and counting container. (U.S.)

  19. Down sampled signal processing for a B Factory bunch-by-bunch feedback system

    International Nuclear Information System (INIS)

    Hindi, H.; Hosseini, W.; Briggs, D.; Fox, J.; Hutton, A.

    1992-03-01

    A bunch-by-bunch feedback scheme is studied for damping coupled bunch synchrotron oscillations in the proposed PEP II B Factory. The quasi-linear feedback system design incorporates a phase detector to provide a quantized measure of bunch phase, digital signal processing to compute an error correction signal, and a kicker system to correct the energy of the bunches. A farm of digital processors, operating in parallel, is proposed to compute correction signals for the 1658 bunches of the B Factory. This paper studies the use of down sampled processing to reduce the computational complexity of the feedback system. We present simulation results showing the effect of down sampling on beam dynamics. Results show that down sampled processing can reduce the scale of the processing task by a factor of 10

  20. Ergonomic analysis of radiopharmaceuticals samples preparation process

    International Nuclear Information System (INIS)

    Gomes, Luciene Betzler C.; Santos, Isaac Luquetti dos; Fonseca, Antonio Carlos C. da; Pellini, Marcos Pinto; Rebelo, Ana Maria

    2005-01-01

    The doses of radioisotopes to be administered to patients for diagnostic or therapeutic purposes are prepared in the radiopharmacy sector. The preparation process adopts techniques aimed at reducing the exposure time of the professionals and the absorption of excessive doses by patients. An ergonomic analysis of this process contributes to the prevention of occupational illnesses and of accident risks during routine work, providing well-being and safety to the users involved and conferring an adequate working standard on the process. In this context, studies that analyze the relevant factors, point toward solutions and establish proposals to minimize risks in these activities are clearly relevant. Through a methodology based on the concepts of ergonomics, improvements in effectiveness and quality are sought, together with a reduction of the difficulties experienced by workers. The prescribed work, established through codified norms and procedures, is compared with the work actually carried out, the real work, in order to allow a correct appreciation focused on the activities. The objective of this work is to discuss an ergonomic analysis of the radioisotope sample preparation process in the Setor de Radiofarmacia do Hospital Universitario Clementino Fraga Filho da Universidade Federal do Rio de Janeiro (UFRJ). (author)

  1. Effect of Prior Health-Related Employment on the Registered Nurse Workforce Supply.

    Science.gov (United States)

    Yoo, Byung-kwan; Lin, Tzu-chun; Kim, Minchul; Sasaki, Tomoko; Spetz, Joanne

    2016-01-01

    Registered nurses (RNs) who held prior health-related employment in occupations other than licensed practical or vocational nursing (LPN/LVN) are reported to have increased rapidly in the past decades. Researchers examined whether prior health-related employment affects RN workforce supply. A cross-sectional bivariate probit model using the 2008 National Sample Survey of Registered Nurses was estimated. Prior health-related employment in relatively lower-wage occupations, such as allied health, clerk, or nursing aide, was positively associated with working as an RN. Prior health-related employment in relatively higher-wage categories, such as health care manager or LPN/LVN, was positively associated with working full-time as an RN. Policy implications are to promote an expanded career ladder program and a nursing school admission policy that targets non-RN health care workers with an interest in becoming RNs.

  2. Tank 12H Acidic Chemical Cleaning Sample Analysis And Material Balance

    International Nuclear Information System (INIS)

    Martino, C. J.; Reboul, S. H.; Wiersma, B. J.; Coleman, C. J.

    2013-01-01

    A process of Bulk Oxalic Acid (BOA) chemical cleaning was performed for Tank 12H during June and July of 2013 to remove all or a portion of the approximately 4400 gallon sludge heel. Three strikes of oxalic acid (nominally 4 wt % or 2 wt %) were used at 55 °C and tank volumes of 96- to 140-thousand gallons. This report details the sample analysis of a scrape sample taken prior to BOA cleaning and dip samples taken during BOA cleaning. It also documents a rudimentary material balance for the Tank 12H cleaning results

  3. Reliability assessment based on small samples of normal distribution

    International Nuclear Information System (INIS)

    Ma Zhibo; Zhu Jianshi; Xu Naixin

    2003-01-01

    When the pertinent parameter involved in the reliability definition complies with a normal distribution, the conjugate prior of its distribution parameters (μ, h) is a normal-gamma distribution. With the help of the maximum entropy and moments-equivalence principles, the subjective information about the parameter and the sampling data of its independent variables are transformed into a Bayesian prior of (μ, h). The desired estimates are obtained from either the prior or the posterior, which is formed by combining the prior and sampling data. Computing methods are described and examples are presented to give demonstrations
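
    Once a normal-gamma prior (μ0, κ0, α0, β0) is in hand, the standard conjugate update with normal sampling data is straightforward; a minimal sketch with illustrative names (the report's maximum-entropy construction of the prior itself is not reproduced):

    ```python
    import numpy as np

    def normal_gamma_update(data, mu0, kappa0, alpha0, beta0):
        """Textbook conjugate update for (mu, h) ~ NormalGamma(mu0, kappa0, alpha0, beta0)
        given i.i.d. normal observations with mean mu and precision h."""
        x = np.asarray(data, dtype=float)
        n, xbar = x.size, x.mean()
        kappa_n = kappa0 + n
        mu_n = (kappa0 * mu0 + n * xbar) / kappa_n
        alpha_n = alpha0 + 0.5 * n
        beta_n = (beta0 + 0.5 * ((x - xbar)**2).sum()
                  + 0.5 * kappa0 * n * (xbar - mu0)**2 / kappa_n)
        return mu_n, kappa_n, alpha_n, beta_n
    ```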

  4. Influence of prior information on pain involves biased perceptual decision-making.

    Science.gov (United States)

    Wiech, Katja; Vandekerckhove, Joachim; Zaman, Jonas; Tuerlinckx, Francis; Vlaeyen, Johan W S; Tracey, Irene

    2014-08-04

    Prior information about features of a stimulus is a strong modulator of perception. For instance, the prospect of more intense pain leads to an increased perception of pain, whereas the expectation of analgesia reduces pain, as shown in placebo analgesia and expectancy modulations during drug administration. This influence is commonly assumed to be rooted in altered sensory processing, and expectancy-related modulations in the spinal cord are often taken as evidence for this notion. Contemporary models of perception, however, suggest that prior information can also modulate perception by biasing perceptual decision-making - the inferential process underlying perception in which prior information is used to interpret sensory information. In this type of bias, the information is already present in the system before the stimulus is observed. Computational models can distinguish between changes in sensory processing and altered decision-making, as they result in different response times for incorrect choices in a perceptual decision-making task (Figure S1A,B). Using a drift-diffusion model, we investigated the influence of both processes in two independent experiments. The results of both experiments strongly suggest that these changes in pain perception are predominantly based on altered perceptual decision-making. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
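
    A minimal drift-diffusion simulation illustrates the distinction; parameter values are illustrative and the fitted model in the paper is more elaborate:

    ```python
    import numpy as np

    def ddm_trial(drift, start_bias=0.0, bound=1.0, dt=0.001, noise=1.0, seed=None):
        """Simulate one drift-diffusion trial. Prior information can enter either
        as a shifted starting point (biased decision-making) or as a changed
        drift rate (altered sensory processing); the two predict different
        response-time patterns for incorrect choices."""
        rng = np.random.default_rng(seed)
        x, t = start_bias * bound, 0.0
        while abs(x) < bound:
            x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
            t += dt
        return (1 if x > 0 else -1), t  # choice and response time

    # e.g. the expectation of pain modeled as a starting-point bias toward the "pain" bound:
    choice, rt = ddm_trial(drift=0.8, start_bias=0.3, seed=1)
    ```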

  5. Improved compressed sensing-based cone-beam CT reconstruction using adaptive prior image constraints

    Science.gov (United States)

    Lee, Ho; Xing, Lei; Davidi, Ran; Li, Ruijiang; Qian, Jianguo; Lee, Rena

    2012-04-01

    Volumetric cone-beam CT (CBCT) images are acquired repeatedly during a course of radiation therapy and a natural question to ask is whether CBCT images obtained earlier in the process can be utilized as prior knowledge to reduce patient imaging dose in subsequent scans. The purpose of this work is to develop an adaptive prior image constrained compressed sensing (APICCS) method to solve this problem. Reconstructed images using full projections are taken on the first day of radiation therapy treatment and are used as prior images. The subsequent scans are acquired using a protocol of sparse projections. In the proposed APICCS algorithm, the prior images are utilized as an initial guess and are incorporated into the objective function in the compressed sensing (CS)-based iterative reconstruction process. Furthermore, the prior information is employed to detect any possible mismatched regions between the prior and current images for improved reconstruction. For this purpose, the prior images and the reconstructed images are classified into three anatomical regions: air, soft tissue and bone. Mismatched regions are identified by local differences of the corresponding groups in the two classified sets of images. A distance transformation is then introduced to convert the information into an adaptive voxel-dependent relaxation map. In constructing the relaxation map, the matched regions (unchanged anatomy) between the prior and current images are assigned with smaller weight values, which are translated into less influence on the CS iterative reconstruction process. On the other hand, the mismatched regions (changed anatomy) are associated with larger values and the regions are updated more by the new projection data, thus avoiding any possible adverse effects of prior images. The APICCS approach was systematically assessed by using patient data acquired under standard and low-dose protocols for qualitative and quantitative comparisons. The APICCS method provides an
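
    The relaxation-map construction described above can be sketched as follows; the HU thresholds and the exponential distance-to-weight mapping are assumptions for illustration, and the paper's exact map may differ:

    ```python
    import numpy as np
    from scipy.ndimage import distance_transform_edt

    def relaxation_map(prior_img, current_img, thresholds=(-500.0, 300.0), scale=10.0):
        """Sketch of the APICCS relaxation map: classify both images into
        air / soft tissue / bone (illustrative HU thresholds), flag voxels whose
        class differs, and convert distance-to-mismatch into a voxel-wise weight
        (large near changed anatomy, small in matched regions)."""
        classify = lambda im: np.digitize(im, thresholds)  # 0 air, 1 soft tissue, 2 bone
        mismatch = classify(prior_img) != classify(current_img)
        dist = distance_transform_edt(~mismatch)           # zero inside mismatched regions
        return np.exp(-dist / scale)                       # weight in (0, 1]
    ```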

  6. Incorporating prior knowledge induced from stochastic differential equations in the classification of stochastic observations.

    Science.gov (United States)

    Zollanvari, Amin; Dougherty, Edward R

    2016-12-01

    In classification, prior knowledge is incorporated in a Bayesian framework by assuming that the feature-label distribution belongs to an uncertainty class of feature-label distributions governed by a prior distribution. A posterior distribution is then derived from the prior and the sample data. An optimal Bayesian classifier (OBC) minimizes the expected misclassification error relative to the posterior distribution. From an application perspective, prior construction is critical. The prior distribution is formed by mapping a set of mathematical relations among the features and labels, the prior knowledge, into a distribution governing the probability mass across the uncertainty class. In this paper, we consider prior knowledge in the form of stochastic differential equations (SDEs). We consider a vector SDE in integral form involving a drift vector and dispersion matrix. Having constructed the prior, we develop the optimal Bayesian classifier between two models and examine, via synthetic experiments, the effects of uncertainty in the drift vector and dispersion matrix. We apply the theory to a set of SDEs for the purpose of differentiating the evolutionary history between two species.
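
    A vector SDE of the kind used as prior knowledge can be simulated with a basic Euler-Maruyama scheme; the drift and dispersion choices below are illustrative:

    ```python
    import numpy as np

    def euler_maruyama(drift, dispersion, x0, t_end, n_steps, seed=0):
        """Simulate a vector SDE dX = drift(X) dt + dispersion(X) dW by
        Euler-Maruyama; such trajectories can serve as synthetic data when an
        SDE encodes the prior knowledge."""
        rng = np.random.default_rng(seed)
        dt = t_end / n_steps
        x = np.array(x0, dtype=float)
        path = [x.copy()]
        for _ in range(n_steps):
            dw = rng.standard_normal(x.size) * np.sqrt(dt)
            x = x + drift(x) * dt + dispersion(x) @ dw
            path.append(x.copy())
        return np.array(path)

    # e.g. a 2-D Ornstein-Uhlenbeck-type model (illustrative choices):
    path = euler_maruyama(lambda x: -0.5 * x, lambda x: 0.2 * np.eye(2),
                          x0=[1.0, -1.0], t_end=10.0, n_steps=1000)
    ```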

  7. Imaging performance of a hybrid x-ray computed tomography-fluorescence molecular tomography system using priors.

    Science.gov (United States)

    Ale, Angelique; Schulz, Ralf B; Sarantopoulos, Athanasios; Ntziachristos, Vasilis

    2010-05-01

    We study the performance of two newly introduced and previously suggested methods that incorporate priors into inversion schemes associated with data from a recently developed hybrid x-ray computed tomography and fluorescence molecular tomography system, the latter based on CCD camera photon detection. The unique data set studied contains accurately registered, highly spatially sampled photon fields propagating through tissue over 360 degrees of projections. Structural prior information was included in the inverse problem by adding a penalty term to the minimization function used for image reconstruction. The methods were compared, using simulated and experimental data from a lung inflammation animal model, both with each other and against inversions achieved without priors. The importance of using priors over stand-alone inversions is also showcased with highly spatially sampled simulated and experimental data, and the approach with optimal performance in resolving fluorescent biodistribution in small animals is discussed. Inclusion of prior information from x-ray CT data in the reconstruction of the fluorescence biodistribution leads to improved agreement between the reconstruction and validation images for both simulated and experimental data.
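
    A minimal sketch of adding structural prior information as a penalty term, assuming a linear forward model, a 1-D neighbor structure, and segment labels taken from the CT image; the specific penalty used in the paper may differ.

```python
import numpy as np

def invert_with_structural_prior(A, b, labels, lam=1.0):
    """Penalized least squares: min ||Ax - b||^2 + lam * ||L x||^2.

    L penalizes differences between neighboring unknowns that share a
    structural (e.g., x-ray CT) segment label -- one common way to encode
    an anatomical prior as a penalty term. 1-D neighbors for brevity.
    """
    n = A.shape[1]
    rows = []
    for i in range(n - 1):
        if labels[i] == labels[i + 1]:      # same segment -> smoothness
            r = np.zeros(n)
            r[i], r[i + 1] = 1.0, -1.0
            rows.append(r)
    L = np.array(rows) if rows else np.zeros((1, n))
    # Normal equations of the penalized problem.
    return np.linalg.solve(A.T @ A + lam * L.T @ L, A.T @ b)
```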

  8. Understanding sleep disturbance in athletes prior to important competitions.

    Science.gov (United States)

    Juliff, Laura E; Halson, Shona L; Peiffer, Jeremiah J

    2015-01-01

    Anecdotally, many athletes report worse sleep in the nights prior to important competitions. Despite sleep being acknowledged as an important factor for optimal athletic performance and overall health, little is understood about athlete sleep around competition. The aims of this study were to identify sleep complaints of athletes prior to competitions and determine whether complaints were confined to competition periods. Cross-sectional study. A sample of 283 elite Australian athletes (129 male, 157 female, age 24±5 y) completed two questionnaires: the Competitive Sport and Sleep Questionnaire and the Pittsburgh Sleep Quality Index. 64.0% of athletes indicated worse sleep on at least one occasion in the nights prior to an important competition over the past 12 months. The main sleep problem specified by athletes was difficulty falling asleep (82.1%), with the main reasons for poor sleep indicated as thoughts about the competition (83.5%) and nervousness (43.8%). Overall, 59.1% of team sport athletes reported having no strategy to overcome poor sleep, compared with individual sport athletes (32.7%, p=0.002) who utilised relaxation and reading as strategies. Individual sport athletes had an increased likelihood of poor sleep as they aged. The poor sleep reported by athletes prior to competition was situational rather than a global sleep problem. Poor sleep is common prior to major competitions in Australian athletes, yet most athletes are unaware of strategies to overcome the poor sleep experienced. It is essential that coaches and scientists monitor and educate both individual and team sport athletes to facilitate sleep prior to important competitions. Copyright © 2014 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  9. Quantitating morphological changes in biological samples during scanning electron microscopy sample preparation with correlative super-resolution microscopy.

    Science.gov (United States)

    Zhang, Ying; Huang, Tao; Jorgens, Danielle M; Nickerson, Andrew; Lin, Li-Jung; Pelz, Joshua; Gray, Joe W; López, Claudia S; Nan, Xiaolin

    2017-01-01

    Sample preparation is critical to biological electron microscopy (EM), and there have been continuous efforts to optimize the procedures to best preserve structures of interest in the sample. However, a quantitative characterization of the morphological changes associated with each step in EM sample preparation is currently lacking. Using correlative EM and super-resolution microscopy (SRM), we have examined the effects of different drying methods as well as osmium tetroxide (OsO4) post-fixation on cell morphology during scanning electron microscopy (SEM) sample preparation. Here, SRM images of the sample acquired under hydrated conditions were used as a baseline for evaluating morphological changes as the sample went through SEM sample processing. We found that both chemical drying and critical point drying lead to a mild cellular boundary retraction of ~60 nm. Post-fixation by OsO4 causes at least 40 nm of additional boundary retraction. We also found that coating coverslips with adhesion molecules such as fibronectin prior to cell plating helps reduce cell distortion from OsO4 post-fixation. These quantitative measurements offer useful information for identifying causes of cell distortion in SEM sample preparation and improving current procedures.

  10. Double Shell Tank (DST) Process Waste Sampling Subsystem Definition Report

    International Nuclear Information System (INIS)

    RASMUSSEN, J.H.

    2000-01-01

    This report defines the Double-Shell Tank (DST) Process Waste Sampling Subsystem (PWSS). This subsystem definition report fully describes and identifies the system boundaries of the PWSS. This definition provides a basis for developing functional, performance, and test requirements (i.e., subsystem specification), as necessary, for the PWSS. The resultant PWSS specification will include the sampling requirements to support the transfer of waste from the DSTs to the Privatization Contractor during Phase 1 of Waste Feed Delivery

  11. Capillary absorption spectrometer and process for isotopic analysis of small samples

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, M. Lizabeth; Kelly, James F.; Sams, Robert L.; Moran, James J.; Newburn, Matthew K.; Blake, Thomas A.

    2016-03-29

    A capillary absorption spectrometer and process are described that provide highly sensitive and accurate absorption measurements of analytes in a sample gas, including isotopologues of carbon and oxygen obtained from gas and biological samples. The system further provides isotopic images of microbial communities that allow tracking of nutrients at the single-cell level. It targets naturally occurring variations in carbon and oxygen isotopes, avoiding the need for expensive isotopically labeled mixtures and thus allowing study of samples taken from the field without modification. The method also permits in vivo sampling, enabling real-time ambient studies of microbial communities.

  12. How Prior Knowledge and Colour Contrast Interfere Visual Search Processes in Novice Learners: An Eye Tracking Study

    Science.gov (United States)

    Sonmez, Duygu; Altun, Arif; Mazman, Sacide Guzin

    2012-01-01

    This study investigates how prior content knowledge and prior exposure to microscope slides on the phases of mitosis affect students' visual search strategies and their ability to differentiate cells that are going through any phase of mitosis. Two different sets of microscope slide views were used for this purpose, with high and low colour…

  13. Quality evaluation of processed clay soil samples | Steiner-Asiedu ...

    African Journals Online (AJOL)

    Introduction: This study assessed the microbial quality of clay samples sold on two of the major Ghanaian markets. Methods: The study was a cross-sectional evaluation of processed clay and the effects it has on the nutrition of consumers in the capital city of Ghana. The items for the examination were …

  14. DSMC multicomponent aerosol dynamics: Sampling algorithms and aerosol processes

    Science.gov (United States)

    Palaniswaamy, Geethpriya

    The post-accident nuclear reactor primary and containment environments can be characterized by high temperatures and pressures, and by fission products and nuclear aerosols. These aerosols evolve via natural transport processes as well as under the influence of engineered safety features. These aerosols can be hazardous and may pose a risk to the public if released into the environment. Computations of their evolution, movement, and distribution involve the study of various processes such as coagulation, deposition, and condensation, and are influenced by factors such as particle shape, charge, radioactivity, and spatial inhomogeneity. These many factors make the numerical study of nuclear aerosol evolution computationally very complicated. The focus of this research is on the use of the Direct Simulation Monte Carlo (DSMC) technique to elucidate the role of various phenomena that influence nuclear aerosol evolution. In this research, several aerosol processes such as coagulation, deposition, condensation, and source reinforcement are explored for a multi-component aerosol dynamics problem in a spatially homogeneous medium. Among the various sampling algorithms explored, the Metropolis sampling algorithm was found to be effective and fast. Several test problems and test cases are simulated using the DSMC technique. The DSMC results obtained are verified against analytical and sectional results for appropriate test problems. Results show that the assumption of a single mean density is not appropriate due to the complicated effect of component densities on the aerosol processes. The methods developed and the insights gained will also be helpful in future research on the challenges associated with the description of fission product and aerosol releases.
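
    As a toy illustration of Metropolis sampling in this setting (not the dissertation's code), the following sketch selects coagulation pairs with probability proportional to an assumed kernel, which avoids normalizing the kernel over all particle pairs:

```python
import numpy as np

def metropolis_pair(volumes, kernel, n_burn=100, rng=None):
    """Select a coagulation pair (i, j) with probability proportional to
    kernel(v_i, v_j) via a Metropolis walk over pairs, avoiding the cost
    of normalizing the kernel over all N^2 pairs."""
    rng = np.random.default_rng(rng)
    n = len(volumes)
    i, j = rng.choice(n, size=2, replace=False)
    for _ in range(n_burn):
        ip, jp = rng.choice(n, size=2, replace=False)
        ratio = kernel(volumes[ip], volumes[jp]) / kernel(volumes[i], volumes[j])
        if rng.random() < ratio:          # Metropolis accept/reject
            i, j = ip, jp
    return i, j

# Toy free-molecular-like kernel (illustrative only, not a reactor model).
K = lambda v1, v2: (v1**(1/3) + v2**(1/3))**2 * np.sqrt(1/v1 + 1/v2)
pair = metropolis_pair(np.random.default_rng(0).uniform(1, 10, 500), K)
```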

  15. Evaluation of Sample Handling Effects on Serum Vitamin E and Cholesterol Concentrations in Alpacas

    Directory of Open Access Journals (Sweden)

    Andrea S. Lear

    2014-01-01

    Full Text Available Clinical cases of vitamin E deficiencies have been diagnosed in camelids and may indicate that these species are more sensitive to inadequate vitamin E in hay-based diets compared to other ruminant and equine species. In bovine, cholesterol has been reported to affect vitamin E concentrations. In order to evaluate vitamin E deficiencies in camelids, the effects of collection and storage of the blood samples prior to processing were necessary. Reports vary as to factors affecting vitamin E and cholesterol in blood samples, and diagnostic laboratories vary in instructions regarding sample handling. Blood was collected from healthy alpacas and processed under conditions including exposure to fluorescent light, serum and red blood cell contact, tube stopper contact, temperature, and hemolysis. Serum vitamin E and cholesterol concentrations were then measured. Statistical analyses found that the vitamin E concentrations decreased with prolonged contact with the tube stopper and with increasing hemolysis. Vitamin E concentration variations were seen with other factors but were not significant. Time prior to serum separation and individual animal variation was found to alter cholesterol concentrations within the sample, yet this finding was clinically unremarkable. No correlation was seen between vitamin E and cholesterol concentration, possibly due to lack of variation of cholesterol.

  16. A Monte Carlo Sampling Technique for Multi-phonon Processes

    Energy Technology Data Exchange (ETDEWEB)

    Hoegberg, Thure

    1961-12-15

    A sampling technique for selecting scattering angle and energy gain in Monte Carlo calculations of neutron thermalization is described. It is supposed that the scattering is separated into processes involving different numbers of phonons. The number of phonons involved is first determined. Scattering angle and energy gain are then chosen by using special properties of the multi-phonon term.
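
    A schematic of the two-stage idea, first drawing the number of phonons and then the scattering angle and energy gain conditional on that number, rendered in modern form; the conditional sampler below is a placeholder, not the report's actual distributions:

```python
import numpy as np

def sample_multiphonon(phonon_weights, sample_given_n, rng=None):
    """Two-stage sampling: draw the number of phonons n with probability
    proportional to phonon_weights[n], then draw (scattering angle cosine,
    energy gain) from the conditional sampler for the n-phonon process."""
    rng = np.random.default_rng(rng)
    w = np.asarray(phonon_weights, dtype=float)
    n = rng.choice(len(w), p=w / w.sum())
    mu, dE = sample_given_n(n, rng)   # user-supplied conditional sampler
    return n, mu, dE

# Illustrative conditional sampler: isotropic angle, Gamma-distributed gain
# (placeholder physics, chosen only to make the sketch runnable).
toy = lambda n, rng: (rng.uniform(-1, 1), rng.gamma(shape=max(n, 1), scale=0.01))
print(sample_multiphonon([0.6, 0.3, 0.1], toy, rng=1))
```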

  17. Study on auto-plating process time versus recovery for polonium, Po-210 in environmental sample

    International Nuclear Information System (INIS)

    Jalal Sharib; Zaharudin Ahmad; Abdul Kadir Ishak; Norfaizal Mohamed; Ahmad Sanadi Abu Bakar; Yii Mei Wo; Kamarozaman Ishak; Siti Aminah Yusoff

    2008-08-01

    This study was carried out to evaluate the time effectiveness and recovery for 16 samples from 4 Kuala Muda stations during the auto-plating procedure used to determine polonium (Po-210) activity concentration in environmental samples. The study was performed using Kuala Muda sediment as the sample, following the same methodology throughout. The auto-plating process was run for 4, 12, 24 and 30 hours on a silver disc for 4 samples per station, and the discs were then counted for one (1) day using an alpha spectrometry counting system. The objective of this study is to establish the auto-plating duration needed to achieve a good chemical yield of the Po-209 tracer. The results showed that recovery increases with time and levels off at 24 hours of auto-plating. This means that 24 hours is the optimum auto-plating time for determination of polonium (Po-210) activity concentration in environmental samples. (Author)

  18. Special study for the manual transfer of process samples from CPP [Chemical Processing Plant] 601 to RAL [Remote Analytical Laboratory]

    International Nuclear Information System (INIS)

    Marts, D.J.

    1987-05-01

    A study of alternate methods to manually transport radioactive samples from their glove boxes to the Remote Analytical Laboratory (RAL) was conducted at the Idaho National Engineering Laboratory. The study was performed to mitigate the effects of a potential loss of sampling capability that would occur if a malfunction in the Pneumatic Transfer System (PTS) took place. Samples are required to be taken from the cell glove boxes and analyzed at the RAL regardless of the operational status of the PTS. This paper documents the conclusions of the study and how a decision was reached on the best handling scenarios for manually transporting 15 mL vials of liquid process samples from the K, W, U, WG, or WH cell glove boxes in the Chemical Processing Plant (CPP) 601 to the RAL. This study of methods to manually remove the samples from the glove boxes, package them for safe shipment, transport them by the safest route, receive them at the RAL, and safely unload them was conducted by EG&G Idaho, Inc., for Westinghouse Idaho Nuclear Company as part of the Glove Box Sampling and Transfer System Project for the Fuel Processing Facilities Upgrade, Task 10, Subtask 2. The study focused on the safest and most reliable scenarios that could be implemented using existing equipment. Hardware modifications and new hardware proposals were identified, and their impact on the handling scenario has been evaluated. A conclusion was reached that by utilizing the existing facility hardware, these samples can be safely transported manually from the sample stations in CPP 601 to the RAL, and that additional hardware could facilitate the transportation process even further.

  19. A Simulation of Pell Grant Awards and Costs Using Prior-Prior Year Financial Data

    Science.gov (United States)

    Kelchen, Robert; Jones, Gigi

    2015-01-01

    We examine the likely implications of switching from a prior year (PY) financial aid system, the current practice in which students file the Free Application for Federal Student Aid (FAFSA) using income data from the previous tax year, to prior-prior year (PPY), in which data from two years before enrollment is used. While PPY allows students to…

  20. The Effects of Prior Outcomes on Risky Choice: Evidence from the Stock Market

    Directory of Open Access Journals (Sweden)

    Fenghua Wen

    2014-01-01

    Full Text Available How do prior outcomes affect risky choice? Research on this can help people understand investors' dynamic decisions in financial markets. This paper puts forward a new value function. By analyzing the new value function, we find that prior gains and losses have an impact on the form of the value function and on current investors' risk attitude. The paper then takes the behavior of the whole stock market as the research object, adopts the aggregate index numbers of 14 representative stock markets around the world as samples, and establishes a TVRA-GARCH-M model to investigate the influence of prior gains and losses on current risk attitude. The empirical study indicates that, at the whole-market level, prior gains increase people's current willingness to take on risky assets; that is to say, the house money effect exists in the market, while people are more risk averse following prior losses.

  1. Novel bayes factors that capture expert uncertainty in prior density specification in genetic association studies.

    Science.gov (United States)

    Spencer, Amy V; Cox, Angela; Lin, Wei-Yu; Easton, Douglas F; Michailidou, Kyriaki; Walters, Kevin

    2015-05-01

    Bayes factors (BFs) are becoming increasingly important tools in genetic association studies, partly because they provide a natural framework for including prior information. The Wakefield BF (WBF) approximation is easy to calculate and assumes a normal prior on the log odds ratio (logOR) with a mean of zero. However, the prior variance (W) must be specified. Because of the potentially high sensitivity of the WBF to the choice of W, we propose several new BF approximations with logOR ∼N(0,W), but allow W to take a probability distribution rather than a fixed value. We provide several prior distributions for W which lead to BFs that can be calculated easily in freely available software packages. These priors allow a wide range of densities for W and provide considerable flexibility. We examine some properties of the priors and BFs and show how to determine the most appropriate prior based on elicited quantiles of the prior odds ratio (OR). We show by simulation that our novel BFs have superior true-positive rates at low false-positive rates compared to those from both P-value and WBF analyses across a range of sample sizes and ORs. We give an example of utilizing our BFs to fine-map the CASP8 region using genotype data on approximately 46,000 breast cancer case and 43,000 healthy control samples from the Collaborative Oncological Gene-environment Study (COGS) Consortium, and compare the single-nucleotide polymorphism ranks to those obtained using WBFs and P-values from univariate logistic regression. © 2015 The Authors. Genetic Epidemiology published by Wiley Periodicals, Inc.
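
    For orientation, the WBF itself has a closed form: with an estimated logOR of variance V and z = logOR_hat/sqrt(V), the BF in favor of association is sqrt(V/(V+W)) * exp(z^2 W / (2(V+W))). The paper's extension averages this over a distribution on W; a minimal numerical version of that averaging, with an illustrative log-normal prior on W rather than the authors' specific prior families:

```python
import numpy as np
from scipy import integrate, stats

def wakefield_bf(z, V, W):
    """Approximate Bayes factor in favor of association for an estimated
    logOR with variance V, z = logOR_hat / sqrt(V), prior logOR ~ N(0, W)."""
    return np.sqrt(V / (V + W)) * np.exp(0.5 * z**2 * W / (V + W))

def averaged_bf(z, V, w_prior_pdf, upper=10.0):
    """Average the WBF over a prior density on W by numerical quadrature."""
    integrand = lambda W: wakefield_bf(z, V, W) * w_prior_pdf(W)
    val, _ = integrate.quad(integrand, 0.0, upper)
    return val

# Illustrative log-normal prior on W (not one of the paper's priors).
w_pdf = stats.lognorm(s=1.0, scale=0.04).pdf
print(averaged_bf(z=3.0, V=0.01, w_prior_pdf=w_pdf))
```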

  2. Graphene for separation and preconcentration of trace amounts of cobalt in water samples prior to flame atomic absorption spectrometry

    Directory of Open Access Journals (Sweden)

    Yukun Wang

    2016-09-01

    Full Text Available A new sensitive and simple method was developed for the preconcentration of trace amounts of cobalt (Co using 1-(2-pyridylazo)-2-naphthol (PAN) as the chelating reagent prior to its determination by flame atomic absorption spectrometry. The proposed method is based on the utilization of a column packed with graphene as sorbent. Several parameters affecting the extraction and complex formation were selected and optimized. Under optimum conditions, the calibration graph was linear in the concentration range of 5.0–240.0 μg L−1 with a detection limit of 0.36 μg L−1. The relative standard deviations for ten replicate measurements of 20.0 and 100.0 μg L−1 of Co were 3.45 and 3.18%, respectively. Comparative studies showed that graphene is superior to other adsorbents including C18 silica, graphitic carbon, and single- and multi-walled carbon nanotubes for the extraction of Co. The proposed method was successfully applied in the analysis of four real environmental water samples. Good spiked recoveries over the range of 95.8–102.6% were obtained.
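
    Figures of merit such as the detection limit and the relative standard deviation follow mechanically from calibration and replicate data. A generic computation using the common 3-sigma criterion, with made-up numbers rather than the paper's raw data:

```python
import numpy as np

# Hypothetical calibration data (concentration in ug/L vs. absorbance).
conc = np.array([5, 20, 60, 120, 240], dtype=float)
absb = np.array([0.011, 0.044, 0.131, 0.262, 0.520])
slope, intercept = np.polyfit(conc, absb, 1)   # linear calibration fit

# Hypothetical blank readings; LOD = 3 * sd(blank) / slope (IUPAC 3-sigma).
blank = np.array([0.0010, 0.0013, 0.0008, 0.0011, 0.0009,
                  0.0012, 0.0010, 0.0011, 0.0009, 0.0010])
lod = 3 * blank.std(ddof=1) / slope

# Hypothetical replicate readings at one level; RSD in percent.
reps = np.array([0.0440, 0.0455, 0.0431, 0.0449, 0.0438])
rsd = 100 * reps.std(ddof=1) / reps.mean()
print(f"LOD = {lod:.2f} ug/L, RSD = {rsd:.2f} %")
```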

  3. Rapid DNA analysis for automated processing and interpretation of low DNA content samples.

    Science.gov (United States)

    Turingan, Rosemary S; Vasantgadkar, Sameer; Palombo, Luke; Hogan, Catherine; Jiang, Hua; Tan, Eugene; Selden, Richard F

    2016-01-01

    Short tandem repeat (STR) analysis is applied to casework samples with low DNA content, including those resulting from the transfer of epithelial cells from the skin to an object (e.g., cells on a water bottle or the brim of a cap), blood spatter stains, and small bone and tissue fragments. Low DNA content (LDC) samples are important in a wide range of settings, including disaster response teams to assist in victim identification and family reunification, military operations to identify friend or foe, criminal forensics to identify suspects and exonerate the innocent, and medical examiner and coroner offices to identify missing persons. Processing LDC samples requires experienced laboratory personnel, isolated workstations, sophisticated equipment, and complex procedures, and it involves transport time. We present a rapid DNA analysis system designed specifically to generate STR profiles from LDC samples in field-forward settings by non-technical operators. By performing STR analysis in the field, close to the site of collection, rapid DNA analysis has the potential to increase throughput and to provide actionable information in real time. A Low DNA Content BioChipSet (LDC BCS) was developed and manufactured by injection molding. It was designed to function in the fully integrated Accelerated Nuclear DNA Equipment (ANDE) instrument previously designed for analysis of buccal swab and other high DNA content samples (Investigative Genet. 4(1):1-15, 2013). The LDC BCS performs efficient DNA purification followed by microfluidic ultrafiltration of the purified DNA, maximizing the quantity of DNA available for subsequent amplification and electrophoretic separation and detection of amplified fragments. The system demonstrates accuracy, precision, resolution, signal strength, and peak height ratios appropriate for casework analysis. The LDC rapid DNA analysis system is effective for the generation of STR profiles from a wide range of sample types. The technology broadens the range of sample…

  4. In-process weld sampling during hot end welds of type W overpacks

    International Nuclear Information System (INIS)

    Barnes, G.A.

    1998-01-01

    Establish the criteria and process controls to be used in obtaining, testing, and evaluating in-process weld samples during the hot end welding of Type W Overpack capsules used to overpack CsCl capsules for storage at WESF.

  5. Polynomial Chaos Acceleration for the Bayesian Inference of Random Fields with Gaussian Priors and Uncertain Covariance Hyper-Parameters

    KAUST Repository

    Le Maitre, Olivier

    2015-01-07

    We address model dimensionality reduction in the Bayesian inference of Gaussian fields, considering a prior covariance function with unknown hyper-parameters. The Karhunen-Loeve (KL) expansion of a prior Gaussian process is traditionally derived assuming a fixed covariance function with pre-assigned hyper-parameter values. Thus, the mode strengths of the Karhunen-Loeve expansion inferred using available observations, as well as the resulting inferred process, depend on the pre-assigned values for the covariance hyper-parameters. Here, we seek to infer the process and the covariance hyper-parameters in a single Bayesian inference. To this end, the uncertainty in the hyper-parameters is treated by means of a coordinate transformation, leading to a KL-type expansion on a fixed reference basis of spatial modes, but with random coordinates conditioned on the hyper-parameters. A Polynomial Chaos (PC) expansion of the model prediction is also introduced to accelerate the Bayesian inference and the sampling of the posterior distribution with the MCMC method. The PC expansion of the model prediction also relies on a coordinate transformation, enabling us to avoid expanding the dependence of the prediction on the covariance hyper-parameters. We demonstrate the efficiency of the proposed method on a transient diffusion equation by inferring spatially-varying log-diffusivity fields from noisy data.
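
    As a baseline for what the paper generalizes, a minimal sketch of the standard KL construction with fixed hyper-parameters, using an assumed squared-exponential covariance on a 1-D grid:

```python
import numpy as np

def kl_modes(x, variance=1.0, length=0.2, n_modes=10):
    """Karhunen-Loeve modes of a Gaussian prior with squared-exponential
    covariance on grid x, for *fixed* hyper-parameters (variance, length).
    The paper's method instead treats these hyper-parameters as uncertain
    and conditions the mode coordinates on them."""
    C = variance * np.exp(-0.5 * (x[:, None] - x[None, :])**2 / length**2)
    vals, vecs = np.linalg.eigh(C)
    idx = np.argsort(vals)[::-1][:n_modes]   # keep the largest eigenvalues
    return vals[idx], vecs[:, idx]

x = np.linspace(0.0, 1.0, 200)
lam, phi = kl_modes(x)
xi = np.random.default_rng(0).normal(size=len(lam))   # mode strengths
field = phi @ (np.sqrt(lam) * xi)                     # one prior sample
```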

  6. Satellite Infrared Radiation Measurements Prior to the Major Earthquakes

    Science.gov (United States)

    Ouzounov, Dimitar; Pulintes, S.; Bryant, N.; Taylor, Patrick; Freund, F.

    2005-01-01

    This work describes our search for a relationship between tectonic stresses and increases in mid-infrared (IR) flux as part of a possible ensemble of electromagnetic (EM) phenomena that may be related to earthquake activity. We present and discuss observed variations in thermal transients and radiation fields prior to the earthquakes of Jan 22, 2003 Colima (M6.7) Mexico, Sept. 28, 2004 near Parkfield (M6.0) in California, and Northern Sumatra (M8.5) Dec. 26, 2004. Previous analysis of earthquake events has indicated the presence of an IR anomaly, where temperatures increased or did not return to their usual nighttime values. Our procedures analyze nighttime satellite data that record the general condition of the ground after sunset. We have found from the MODIS instrument data that five days before the Colima earthquake the IR land surface nighttime temperature rose up to +4 degrees C in a 100 km radius around the epicenter. The IR transient field recorded by MODIS in the vicinity of Parkfield, also in a cloud-free environment, was around +1 degree C and is significantly smaller than the IR anomaly around the Colima epicenter. Ground surface temperatures near the Parkfield epicenter four days prior to the earthquake show a steady increase. However, on the night preceding the quake, a significant drop in relative humidity was indicated, a process similar to that registered prior to the Colima event. Recent analyses of continuous outgoing long-wavelength Earth radiation (OLR) indicate significant and anomalous variability prior to some earthquakes. The cause of these anomalies is not well understood but could be the result of triggering by an interaction between the lithosphere-hydrosphere and atmosphere related to changes in the near-surface electrical field and/or gas composition prior to the earthquake. The OLR anomaly usually covers large areas surrounding the main epicenter. We have found strong anomalous signals (two sigma) in the epicentral area on Dec 21…

  7. Microfabricated Devices for Sample Extraction, Concentrations, and Related Sample Processing Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Gang; Lin, Yuehe

    2006-12-01

    This is an invited book chapter. As with other analytical techniques, sample pretreatment, sample extraction, sample introduction, and related techniques are of extreme importance for micro-electro-mechanical systems (MEMS). Bio-MEMS devices and systems start with a sampling step. The biological sample then usually undergoes some kind of sample preparation before the actual analysis. These steps may involve extracting the target sample from its matrix, removing interferences from the sample, derivatizing the sample to detectable species, or performing a sample preconcentration step. The integration of components for sample pretreatment into microfluidic devices represents one of the remaining bottlenecks toward achieving true miniaturized total analysis systems (μTAS). This chapter provides a thorough state-of-the-art review of the developments in this field to date.

  8. Compressive Online Robust Principal Component Analysis with Multiple Prior Information

    DEFF Research Database (Denmark)

    Van Luong, Huynh; Deligiannis, Nikos; Seiler, Jürgen

    …low-rank components. Unlike conventional batch RPCA, which processes all the data directly, our method considers a small set of measurements taken per data vector (frame). Moreover, our method incorporates multiple prior information signals, namely previously reconstructed frames, to improve the separation and, thereafter, updates the prior information for the next frame. Using experiments on synthetic data, we evaluate the separation performance of the proposed algorithm. In addition, we apply the proposed algorithm to online video foreground and background separation from compressive measurements. The results show…
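
    A highly simplified per-frame sketch of the sparse plus low-rank split (not the authors' algorithm, which works from compressive measurements and multiple prior signals): given a basis U for the background subspace learned from previous frames, alternate a least-squares background fit with soft-thresholding of the residual.

```python
import numpy as np

soft = lambda x, t: np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def separate_frame(y, U, lam=0.1, n_iter=50):
    """Per-frame sparse/low-rank split: model y ~ U @ a + s, with U a basis
    for the background (low-rank) subspace from prior frames and s the
    sparse foreground. Alternate least squares for the background
    coefficients a with soft-thresholding for s (a simplified proximal
    scheme, assumed for illustration)."""
    s = np.zeros_like(y)
    for _ in range(n_iter):
        a = np.linalg.lstsq(U, y - s, rcond=None)[0]  # background coeffs
        s = soft(y - U @ a, lam)                      # sparse foreground
    return U @ a, s
```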

  9. Bayesian road safety analysis: incorporation of past evidence and effect of hyper-prior choice.

    Science.gov (United States)

    Miranda-Moreno, Luis F; Heydari, Shahram; Lord, Dominique; Fu, Liping

    2013-09-01

    This paper aims to address two related issues when applying hierarchical Bayesian models for road safety analysis, namely: (a) how to incorporate available information from previous studies or past experiences in the (hyper) prior distributions for model parameters and (b) what are the potential benefits of incorporating past evidence on the results of a road safety analysis when working with scarce accident data (i.e., when calibrating models with crash datasets characterized by a very low average number of accidents and a small number of sites). A simulation framework was developed to evaluate the performance of alternative hyper-priors including informative and non-informative Gamma, Pareto, as well as Uniform distributions. Based on this simulation framework, different data scenarios (i.e., number of observations and years of data) were defined and tested using crash data collected at 3-legged rural intersections in California and crash data collected for rural 4-lane highway segments in Texas. This study shows how the accuracy of model parameter estimates (inverse dispersion parameter) is considerably improved when incorporating past evidence, in particular when working with a small number of observations and crash data with a low mean. The results also illustrate that when the sample size (more than 100 sites) and the number of years of crash data are relatively large, neither the incorporation of past experience nor the choice of the hyper-prior distribution may affect the final results of a traffic safety analysis. As a potential solution to the problem of low sample mean and small sample size, this paper offers some practical guidance on how to incorporate past evidence into informative hyper-priors. By combining evidence from past studies and the data available, the model parameter estimates can be significantly improved. The effect of prior choice seems to be less important for hotspot identification. The results show the benefits of incorporating prior…
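
    To make the simulation framework concrete, a minimal sketch of a grid posterior for the inverse dispersion parameter of a negative binomial crash-frequency model under a chosen hyper-prior; the site mean, sample size, and hyper-prior below are illustrative, not the paper's scenarios:

```python
import numpy as np
from scipy import stats

def posterior_phi(counts, mu, phi_grid, hyper_prior_pdf):
    """Grid posterior for the inverse dispersion parameter phi of a
    negative binomial crash model with known site mean mu, under a chosen
    hyper-prior -- the quantity whose estimation accuracy the simulation
    study compares across hyper-prior choices."""
    log_post = np.array([
        np.log(hyper_prior_pdf(phi))
        + stats.nbinom.logpmf(counts, n=phi, p=phi / (phi + mu)).sum()
        for phi in phi_grid])
    post = np.exp(log_post - log_post.max())
    return post / np.trapz(post, phi_grid)   # normalize on the grid

# Small sample with low mean (the scenario where the hyper-prior matters).
rng = np.random.default_rng(0)
mu, phi_true = 0.8, 1.5
counts = stats.nbinom.rvs(n=phi_true, p=phi_true / (phi_true + mu),
                          size=50, random_state=rng)
grid = np.linspace(0.05, 10.0, 400)
post_gamma = posterior_phi(counts, mu, grid, stats.gamma(a=1, scale=10).pdf)
```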

  10. 9 CFR 590.516 - Sanitizing and drying of shell eggs prior to breaking.

    Science.gov (United States)

    2010-01-01

    9 CFR Part 590 (Egg Products Inspection Act), Sanitary, Processing, and Facility Requirements, § 590.516 Sanitizing and drying of shell eggs prior to breaking (2010): "(a) Immediately prior to breaking, all shell eggs shall be spray rinsed…"

  11. Automatic sample changer control software for automation of neutron activation analysis process in Malaysian Nuclear Agency

    Science.gov (United States)

    Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.

    2018-01-01

    Most of the procedures in the neutron activation analysis (NAA) process that have been established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s are performed manually. These manual procedures, carried out by the NAA laboratory personnel, are time consuming and inefficient, especially for the sample counting and measurement process: the sample needs to be changed and the measurement software needs to be set up for every one-hour counting period, and both of these procedures are performed manually for every sample. Hence, an automatic sample changer system (ASC) consisting of hardware and software has been developed to automate the sample counting process for up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and to call the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.

  12. Can an inadequate cervical cytology sample in ThinPrep be converted to a satisfactory sample by processing it with a SurePath preparation?

    Science.gov (United States)

    Sørbye, Sveinung Wergeland; Pedersen, Mette Kristin; Ekeberg, Bente; Williams, Merete E Johansen; Sauer, Torill; Chen, Ying

    2017-01-01

    The Norwegian Cervical Cancer Screening Program recommends screening every 3 years for women between 25 and 69 years of age. There is a large difference in the percentage of unsatisfactory samples between laboratories that use different brands of liquid-based cytology. We wished to examine whether inadequate ThinPrep samples could be made satisfactory by processing them with the SurePath protocol. A total of 187 inadequate ThinPrep specimens from the Department of Clinical Pathology at University Hospital of North Norway were sent to Akershus University Hospital for conversion to SurePath medium. Ninety-one (48.7%) were processed through the automated "gynecologic" application for cervical cytology samples, and 96 (51.3%) were processed with the "nongynecological" automatic program. Out of 187 samples that had been unsatisfactory by ThinPrep, 93 (49.7%) were satisfactory after being converted to SurePath. The rate of satisfactory cytology was 36.6% and 62.5% for samples run through the "gynecology" program and "nongynecology" program, respectively. Of the 93 samples that became satisfactory after conversion from ThinPrep to SurePath, 80 (86.0%) were screened as normal while 13 samples (14.0%) were given an abnormal diagnosis, which included 5 atypical squamous cells of undetermined significance, 5 low-grade squamous intraepithelial lesion, 2 atypical glandular cells not otherwise specified, and 1 atypical squamous cells cannot exclude high-grade squamous intraepithelial lesion. A total of 2.1% (4/187) of the women received a diagnosis of cervical intraepithelial neoplasia 2 or higher at later follow-up. Converting cytology samples from ThinPrep to SurePath processing can reduce the number of unsatisfactory samples. The samples should be run through the "nongynecology" program to ensure an adequate number of cells.

  13. A two-sample Bayesian t-test for microarray data

    Directory of Open Access Journals (Sweden)

    Dimmic Matthew W

    2006-03-01

    Full Text Available Abstract Background Determining whether a gene is differentially expressed in two different samples remains an important statistical problem. Prior work in this area has featured the use of t-tests with pooled estimates of the sample variance based on similarly expressed genes. These methods do not display consistent behavior across the entire range of pooling and can be biased when the prior hyperparameters are specified heuristically. Results A two-sample Bayesian t-test is proposed for use in determining whether a gene is differentially expressed in two different samples. The test method is an extension of earlier work that made use of point estimates for the variance. The method proposed here explicitly calculates in analytic form the marginal distribution for the difference in the mean expression of two samples, obviating the need for point estimates of the variance without recourse to posterior simulation. The prior distribution involves a single hyperparameter that can be calculated in a statistically rigorous manner, making clear the connection between the prior degrees of freedom and prior variance. Conclusion The test is easy to understand and implement and application to both real and simulated data shows that the method has equal or greater power compared to the previous method and demonstrates consistent Type I error rates. The test is generally applicable outside the microarray field to any situation where prior information about the variance is available and is not limited to cases where estimates of the variance are based on many similar observations.
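
    The flavor of such an analytic marginal can be seen in the noninformative limit, where the posterior for the difference in means under a common-variance normal model is a Student-t. The paper's test instead places an informative one-hyperparameter prior on the variance, so the sketch below is a simplified relative, not the proposed test itself:

```python
import numpy as np
from scipy import stats

def posterior_diff(x, y):
    """Posterior of delta = mean(x) - mean(y) under a common-variance
    normal model with the standard noninformative prior p(mu1, mu2, s2)
    ~ 1/s2: a Student-t with n1 + n2 - 2 degrees of freedom."""
    n1, n2 = len(x), len(y)
    nu = n1 + n2 - 2
    sp2 = ((n1 - 1) * np.var(x, ddof=1) + (n2 - 1) * np.var(y, ddof=1)) / nu
    scale = np.sqrt(sp2 * (1.0 / n1 + 1.0 / n2))
    return stats.t(df=nu, loc=x.mean() - y.mean(), scale=scale)

# Toy expression data for one gene in two samples.
rng = np.random.default_rng(1)
d = posterior_diff(rng.normal(1.0, 1.0, 8), rng.normal(0.0, 1.0, 8))
print("P(delta > 0 | data) =", d.sf(0.0))
```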

  14. Order–disorder–reorder process in thermally treated dolomite samples

    DEFF Research Database (Denmark)

    Zucchini, Azzurra; Comodi, Paola; Katerinopoulou, Anna

    2012-01-01

    A combined powder and single-crystal X-ray diffraction analysis of dolomite [CaMg(CO3)2] heated to 1,200°C at 3 GPa was made to study the order–disorder–reorder process. The order/disorder transition is inferred to start below 1,100°C, and complete disorder is attained at approximately 1,200°C. Twinned crystals characterized by high internal order were found in samples annealed above 1,100°C, and their fraction was found to increase with temperature. Evidence of twinning domains, combined with probable remaining disordered portions of the structure, implies that reordering processes occur during…

  15. Marine sediment sample pre-processing for macroinvertebrates metabarcoding: mechanical enrichment and homogenization

    Directory of Open Access Journals (Sweden)

    Eva Aylagas

    2016-10-01

    Full Text Available Metabarcoding is an accurate and cost-effective technique that allows for simultaneous taxonomic identification of multiple environmental samples. Application of this technique to marine benthic macroinvertebrate biodiversity assessment for biomonitoring purposes requires standardization of laboratory and data analysis procedures. In this context, protocols for the creation and sequencing of amplicon libraries and their related bioinformatics analysis have been recently published. However, a standardized protocol describing all previous steps (i.e., processing and manipulation of environmental samples) for macroinvertebrate community characterization is lacking. Here, we provide detailed procedures for benthic environmental sample collection, processing, enrichment for macroinvertebrates, homogenization, and subsequent DNA extraction for metabarcoding analysis. Since this is the first protocol of this kind, it should be of use to any researcher in this field and has the potential for further improvement.

  16. Socializing processes in relation to the recognition of unskilled adults’ prior learning

    DEFF Research Database (Denmark)

    Aarkrog, Vibe

    The ordinary Danish VET programs are organized as dual programs in which the students alternate between school-based education and training and workplace-based training. The adult students in the course "From unskilled worker to skilled worker in record time" are automatically credited for the workplace-based training. However, the study can contribute to the discussion of the value of practical experiences: are practical experiences creditable in educational programs? The study shows that the recognition and assessment of prior learning requires that the students can verbalize and preferably also…

  17. The use of low-calorie sweeteners is associated with self-reported prior intent to lose weight in a representative sample of US adults.

    Science.gov (United States)

    Drewnowski, A; Rehm, C D

    2016-03-07

    Low-calorie sweeteners (LCSs) are said to be a risk factor for obesity and diabetes. Reverse causality may be an alternative explanation. Data on LCS use, from a single 24-h dietary recall, for a representative sample of 22 231 adults were obtained from 5 cycles of the National Health and Nutrition Examination Survey (1999-2008 NHANES). Retrospective data on intent to lose or maintain weight during the prior 12 months and on 10-year weight history were obtained from the weight history questionnaire. Objectively measured heights and weights were obtained from the examination. Primary analyses evaluated the association between intent to lose/maintain weight and use of LCSs and specific LCS product types using survey-weighted generalized linear models. We further evaluated whether body mass index (BMI) may mediate the association between weight loss intent and use of LCSs. The association between 10-year weight history and current LCS use was evaluated using restricted cubic splines. In cross-sectional analyses, LCS use was associated with a higher prevalence of obesity and diabetes. Adults who tried to lose weight during the previous 12 months were more likely to consume LCS beverages (prevalence ratio=1.64, 95% confidence interval (CI) 1.54-1.75), tabletop LCS (prevalence ratio=1.68, 95% CI 1.47-1.91) and LCS foods (prevalence ratio=1.93, 95% CI 1.60-2.33) than those who did not. In mediation analyses, BMI only partially mediated the association between weight control history and the use of LCS beverages and tabletop LCS, but not LCS foods. Current LCS use was further associated with a history of prior weight change (for example, weight loss and gain). LCS use was associated with self-reported intent to lose weight during the previous 12 months. This association was only partially mediated by differences in BMI. Any inference of causality between attempts at weight control and LCS use is tempered by the cross-sectional nature of these data and the retrospective…

  18. Quantitative utilization of prior biological knowledge in the Bayesian network modeling of gene expression data

    Directory of Open Access Journals (Sweden)

    Gao Shouguo

    2011-08-01

    Full Text Available Abstract Background Bayesian Network (BN is a powerful approach to reconstructing genetic regulatory networks from gene expression data. However, expression data by itself suffers from high noise and lack of power. Incorporating prior biological knowledge can improve the performance. As each type of prior knowledge on its own may be incomplete or limited by quality issues, integrating multiple sources of prior knowledge to utilize their consensus is desirable. Results We introduce a new method to incorporate the quantitative information from multiple sources of prior knowledge. It first uses the Naïve Bayesian classifier to assess the likelihood of functional linkage between gene pairs based on prior knowledge. In this study we included cocitation in PubMed and semantic similarity in Gene Ontology annotation. A candidate network edge reservoir is then created in which the copy number of each edge is proportional to the estimated likelihood of linkage between the two corresponding genes. In network simulation, the Markov Chain Monte Carlo sampling algorithm is adopted, sampling from this reservoir at each iteration to generate new candidate networks. We evaluated the new algorithm using both simulated and real gene expression data, including that from a yeast cell cycle and a mouse pancreas development/growth study. Incorporating prior knowledge led to a ~2-fold increase in the number of known transcription regulations recovered, without significant change in the false positive rate. In contrast, without prior knowledge, BN modeling is not always better than random selection, demonstrating the necessity in network modeling to supplement the gene expression data with additional information. Conclusion Our new development provides a statistical means to utilize the quantitative information in prior biological knowledge in the BN modeling of gene expression data, which significantly improves the performance.
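
    A minimal sketch of the reservoir idea: each candidate edge appears with copy number proportional to its prior-knowledge likelihood of linkage, so uniform draws from the reservoir propose well-supported edges more often. The edges and likelihood values below are placeholders:

```python
import numpy as np

def build_edge_reservoir(edges, linkage_probs, size=10000, rng=None):
    """Candidate-edge reservoir: copy numbers proportional to the
    prior-knowledge likelihood of functional linkage, so that uniform
    draws from the reservoir propose well-supported edges more often
    during MCMC network search."""
    rng = np.random.default_rng(rng)
    p = np.asarray(linkage_probs, dtype=float)
    copies = np.maximum(1, np.round(size * p / p.sum()).astype(int))
    reservoir = np.repeat(np.arange(len(edges)), copies)
    # Draw a handful of edge proposals for one MCMC iteration.
    return [edges[i] for i in rng.choice(reservoir, size=5)]

edges = [("A", "B"), ("A", "C"), ("B", "C")]   # hypothetical gene pairs
print(build_edge_reservoir(edges, [0.7, 0.2, 0.1], rng=0))
```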

  19. Phase transitions in restricted Boltzmann machines with generic priors

    Science.gov (United States)

    Barra, Adriano; Genovese, Giuseppe; Sollich, Peter; Tantari, Daniele

    2017-10-01

    We study generalized restricted Boltzmann machines with generic priors for units and weights, interpolating between Boolean and Gaussian variables. We present a complete analysis of the replica symmetric phase diagram of these systems, which can be regarded as generalized Hopfield models. We underline the role of the retrieval phase for both inference and learning processes and we show that retrieval is robust for a large class of weight and unit priors, beyond the standard Hopfield scenario. Furthermore, we show how the paramagnetic phase boundary is directly related to the optimal size of the training set necessary for good generalization in a teacher-student scenario of unsupervised learning.

  20. Recent results of the investigation of a micro-fluidic sampling chip and sampling system for hot cell aqueous processing streams

    International Nuclear Information System (INIS)

    Tripp, J.; Smith, T.; Law, J.

    2013-01-01

    A Fuel Cycle Research and Development project has investigated an innovative sampling method that could evolve into the next generation sampling and analysis system for metallic elements present in aqueous processing streams. Initially sampling technologies were evaluated and micro-fluidic sampling chip technology was selected and tested. A conceptual design for a fully automated microcapillary-based system was completed and a robotic automated sampling system was fabricated. The mechanical and sampling operation of the completed sampling system was investigated. Different sampling volumes have been tested. It appears that the 10 μl volume has produced data that had much smaller relative standard deviations than the 2 μl volume. In addition, the production of a less expensive, mass produced sampling chip was investigated to avoid chip reuse thus increasing sampling reproducibility/accuracy. The micro-fluidic-based robotic sampling system's mechanical elements were tested to ensure analytical reproducibility and the optimum robotic handling of micro-fluidic sampling chips. (authors)

  1. Fs-laser processing of polydimethylsiloxane

    Energy Technology Data Exchange (ETDEWEB)

    Atanasov, Petar A., E-mail: paatanas@ie.bas.bg; Nedyalkov, Nikolay N. [Institute of Electronics, Bulgarian Academy of Sciences, 72 Tsarigradsko Shose, Sofia 1784 (Bulgaria); Valova, Eugenia I.; Georgieva, Zhenya S.; Armyanov, Stefan A.; Kolev, Konstantin N. [Rostislaw Kaischew Institute of Physical Chemistry, Bulgarian Academy of Sciences, Acad. G. Bonchev Str., Block 11, Sofia 1113 (Bulgaria); Amoruso, Salvatore; Wang, Xuan; Bruzzese, Ricardo [CNR-SPIN, Dipartimento di Scienze Fisiche, Universita degli Studi di Napoli Federico II, Complesso Universitario di Monte S. Angelo, Via Cintia, I-80126 Napoli (Italy); Sawczak, Miroslaw; Śliwiński, Gerard [Photophysics Department, The Szewalski Institute, Polish Academy of Sciences, 14 Fiszera St, 80-231 Gdańsk (Poland)

    2014-07-14

    We present an experimental analysis of surface structuring of polydimethylsiloxane films with UV (263 nm) femtosecond laser pulses in air. Laser-processed areas are analyzed by optical microscopy, SEM, and μ-Raman spectroscopy. The laser-treated sample shows the formation of a randomly nanostructured surface morphology. μ-Raman spectra, acquired at both 514 and 785 nm excitation wavelengths before and after laser treatment, evidence the changes in the sample structure. The influence of the laser fluence on the surface morphology is studied. Finally, successful electroless metallization of the laser-processed sample is achieved even several months after the laser treatment, contrary to previous observations with nanosecond pulses. Our findings address the effectiveness of fs-laser treatment and chemical metallization of polydimethylsiloxane films, with prospective technological interest in micro-fabrication of devices for MEMS and nano-electromechanical systems.

  2. Sampling Transition Pathways in Highly Correlated Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Chandler, David

    2004-10-20

    This research grant supported my group's efforts to apply and extend the method of transition path sampling that we invented during the late 1990s. This methodology is based upon a statistical mechanics of trajectory space. Traditional statistical mechanics focuses on state space, and with it, one can use Monte Carlo methods to facilitate importance sampling of states. With our formulation of a statistical mechanics of trajectory space, we have succeeded in creating algorithms by which importance sampling can be done for dynamical processes. In particular, we are able to study rare but important events without prior knowledge of transition states or mechanisms. In perhaps the most impressive application of transition path sampling, my group combined forces with Michele Parrinello and his coworkers to unravel the dynamics of autoionization of water [5]. This dynamics is the fundamental kinetic step of pH. Other applications concern the nature of dynamics far from equilibrium [1, 7], nucleation processes [2], cluster isomerization, melting and dissociation [3, 6], and molecular motors [10]. Research groups throughout the world are adopting transition path sampling. In part this has been the result of our efforts to provide pedagogical presentations of the technique [4, 8, 9], as well as new procedures for interpreting trajectories of complex systems [11].
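
    The core move of transition path sampling is the "shooting" move. Below is a compact sketch for deterministic dynamics in one dimension (unit mass, velocity Verlet), with basin indicator functions supplied by the caller; this is a didactic reduction, not the group's production algorithm:

```python
import numpy as np

def shooting_move(path, force, in_A, in_B, dt=0.01, dp=0.1, rng=None):
    """One TPS shooting move on a path of (x, p) pairs: perturb the momentum
    at a random slice, re-integrate forward and backward with velocity
    Verlet, and accept the new path only if it still connects basin A to
    basin B (the acceptance rule for deterministic dynamics)."""
    rng = np.random.default_rng(rng)
    k = int(rng.integers(1, len(path) - 1))
    x, p = path[k]
    p += rng.normal(scale=dp)                 # momentum perturbation

    def run(x, p, n, sign):
        # sign=+1 integrates forward in time, sign=-1 exactly inverts it.
        seg = []
        for _ in range(n):
            p += 0.5 * sign * dt * force(x)
            x += sign * dt * p
            p += 0.5 * sign * dt * force(x)
            seg.append((x, p))
        return seg

    back = run(x, p, k, -1)[::-1]             # states 0 .. k-1
    fwd = run(x, p, len(path) - 1 - k, +1)    # states k+1 .. end
    new = back + [(x, p)] + fwd
    if in_A(new[0][0]) and in_B(new[-1][0]):
        return new, True                      # accepted reactive path
    return path, False                        # rejected, keep old path
```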

  3. SAMPLE RESULTS FROM THE INTEGRATED SALT DISPOSITION PROGRAM MACROBATCH 4 TANK 21H QUALIFICATION SAMPLES

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T.; Fink, S.

    2011-06-22

    Savannah River National Laboratory (SRNL) analyzed samples from Tank 21H to qualify them for use in the Integrated Salt Disposition Program (ISDP) Batch 4 processing. All sample results agree with expectations based on prior analyses, where available. No issues with the projected Salt Batch 4 strategy are identified. This revision includes additional data points that were not available in the original issue of the document, such as additional plutonium results, the results of the monosodium titanate (MST) sorption test, and the extraction, scrub, strip (ESS) test. This report covers the revision to the Tank 21H qualification sample results for Macrobatch (Salt Batch) 4 of the Integrated Salt Disposition Program (ISDP). A previous document covers initial characterization, which includes results for a number of non-radiological analytes. These results were used to perform aluminum solubility modeling to determine the hydroxide needs for Salt Batch 4 to prevent the precipitation of solids. Sodium hydroxide was then added to Tank 21 and additional samples were pulled for the analyses discussed in this report. This work was specified by Task Technical Request and by Task Technical and Quality Assurance Plan (TTQAP).

  4. Should warfarin or aspirin be stopped prior to prostate biopsy? An analysis of bleeding complications related to increasing sample number regimes

    International Nuclear Information System (INIS)

    Chowdhury, R.; Abbas, A.; Idriz, S.; Hoy, A.; Rutherford, E.E.; Smart, J.M.

    2012-01-01

    Aim: To determine whether patients undergoing transrectal ultrasound (TRUS)-guided prostate biopsy with increased sampling numbers are more likely to experience bleeding complications and whether warfarin or low-dose aspirin are independent risk factors. Materials and methods: 930 consecutive patients with suspected prostatic cancer were followed up after biopsy. Warfarin/low-dose aspirin was not stopped prior to the procedure. A TRUS-guided prostate biopsy with an eight- to ten-core sampling regime was performed, and patients were offered a questionnaire to complete 10 days after the procedure, to determine any immediate or delayed bleeding complications. Results: 902 patients returned completed questionnaires. 579 (64.2%) underwent eight core biopsies, 47 (5.2%) underwent nine, and 276 (30.6%) underwent 10. 68 were taking warfarin [mean international normalized ratio (INR) = 2.5], 216 were taking low-dose aspirin, one was taking both, and 617 were taking neither. 27.9% of those on warfarin and 33.8% of those on aspirin experienced haematuria, compared with 37% of those on no blood-thinning medication. 13.2% of those on warfarin and 14.4% of those on aspirin experienced rectal bleeding, compared with 11.5% of those on no blood-thinning medication. 7.4% of those on warfarin and 12% of those on aspirin experienced haematospermia, compared with 13.8% of those on neither. Regression analysis showed a significant association between increasing sampling number and the occurrence of all types of bleeding complication. There was no significant association between minor bleeding complications and warfarin use; however, there was a significant association between minor bleeding complications and low-dose aspirin use. There were no severe bleeding complications. Conclusion: There is an increased risk of bleeding complications following TRUS-guided prostate biopsy with increased sampling numbers, but these are minor. There is also an increased risk with low…

  5. Program for TI programmable 59 calculator for calculation of 3H concentration of water samples

    International Nuclear Information System (INIS)

    Hussain, S.D.; Asghar, G.

    1982-09-01

    A program has been developed for the TI Programmable 59 calculator of Texas Instruments Inc. to calculate the 3H (tritium) concentration of water samples, processed with or without prior electrolytic enrichment, from observed parameters such as count rate. The procedure for using the program is described in detail. A brief description of the laboratory treatment of samples and of the mathematical equations used in the calculations is given. (orig./A.B.)

  6. Preparing learners with partly incorrect intuitive prior knowledge for learning

    Directory of Open Access Journals (Sweden)

    Andrea eOhst

    2014-07-01

    Full Text Available Learners sometimes have incoherent and fragmented intuitive prior knowledge that is (partly) ‘incompatible’ with the to-be-learned contents. Such knowledge in pieces can cause conceptual disorientation and cognitive overload while learning. We hypothesized that a pre-training intervention providing a generalized schema as a structuring framework for such knowledge in pieces would support (re)organizing processes of prior knowledge and thus reduce unnecessary cognitive load during subsequent learning. Fifty-six student teachers participated in the experiment. A framework group underwent a pre-training intervention providing a generalized, categorical schema for categorizing primary learning strategies and related but different strategies as a cognitive framework for (re)organizing their prior knowledge. Our control group received comparable factual information but no framework. Afterwards, all participants learned about primary learning strategies. The framework group claimed to possess higher levels of interest and self-efficacy, achieved higher learning outcomes, and learned more efficiently. Hence, providing a categorical framework can help overcome the barrier of incorrect prior knowledge in pieces.

  7. Preparing learners with partly incorrect intuitive prior knowledge for learning

    Science.gov (United States)

    Ohst, Andrea; Fondu, Béatrice M. E.; Glogger, Inga; Nückles, Matthias; Renkl, Alexander

    2014-01-01

    Learners sometimes have incoherent and fragmented intuitive prior knowledge that is (partly) “incompatible” with the to-be-learned contents. Such knowledge in pieces can cause conceptual disorientation and cognitive overload while learning. We hypothesized that a pre-training intervention providing a generalized schema as a structuring framework for such knowledge in pieces would support (re)organizing-processes of prior knowledge and thus reduce unnecessary cognitive load during subsequent learning. Fifty-six student teachers participated in the experiment. A framework group underwent a pre-training intervention providing a generalized, categorical schema for categorizing primary learning strategies and related but different strategies as a cognitive framework for (re-)organizing their prior knowledge. Our control group received comparable factual information but no framework. Afterwards, all participants learned about primary learning strategies. The framework group claimed to possess higher levels of interest and self-efficacy, achieved higher learning outcomes, and learned more efficiently. Hence, providing a categorical framework can help overcome the barrier of incorrect prior knowledge in pieces. PMID:25071638

  8. Evaluation of a value prior to pulping-thermomechanical pulp business concept. Part 2.

    Science.gov (United States)

    Ted Bilek; Carl Houtman; Peter Ince

    2011-01-01

    Value Prior to Pulping (VPP) is a novel biorefining concept for pulp mills that includes hydrolysis extraction of hemicellulose wood sugars and acetic acid from pulpwood prior to pulping. The concept involves conversion of wood sugars via fermentation to fuel ethanol or other chemicals and the use of remaining solid wood material in the pulping process. This paper...

  9. Prior implicit knowledge shapes human threshold for orientation noise

    DEFF Research Database (Denmark)

    Christensen, Jeppe H; Bex, Peter J; Fiser, József

    2015-01-01

    ...resulting in an image-class-specific threshold that changes the shape and position of the dipper function according to image class. These findings do not fit a filter-based feed-forward view of orientation coding, but can be explained by a process that utilizes an experience-based perceptual prior...

  10. The difference in pediatric blood pressure between middle childhood and late childhood prior to dental treatment

    Directory of Open Access Journals (Sweden)

    Fitri Anissa Syaimima bt. Syaiful Azim

    2018-01-01

    Full Text Available Every child goes through several stages in his or her life, and children differ from one another as their cognition, physique, emotions, and personality develop. For many children, a visit to the dentist can raise their anxiety, and this anxiousness leads to stress that influences cardiovascular function. The purpose of this research was to determine the difference in pediatric blood pressure between middle childhood and late childhood prior to dental treatment. This research was a pure experimental clinical study. The sample consisted of 30 children in the range of 4-12 years old, divided into two age groups: middle childhood (4-7 years old) and late childhood (8-12 years old). Blood pressures were measured before any dental treatment began and the values were recorded. The data were then analyzed using a one-sample t-test. The blood pressures in middle childhood and late childhood were compared to the average mean values for each age group. There was a significant difference in systolic pressure, which was higher in the middle childhood group than in the late childhood group. From the results it can be concluded that there is a difference in pediatric blood pressure between middle childhood and late childhood prior to dental treatment.

  11. Random sampling of evolution time space and Fourier transform processing

    International Nuclear Information System (INIS)

    Kazimierczuk, Krzysztof; Zawadzka, Anna; Kozminski, Wiktor; Zhukov, Igor

    2006-01-01

    Application of the Fourier transform for processing 3D NMR spectra with random sampling of the evolution time space is presented. The 2D FT is calculated for pairs of frequencies, instead of the conventional sequence of one-dimensional transforms. Signal-to-noise ratios and linewidths for different random distributions were investigated by simulations and experiments. The experimental examples include 3D HNCA, HNCACB and ¹⁵N-edited NOESY-HSQC spectra of a ¹³C,¹⁵N-labeled ubiquitin sample. The obtained results revealed the general applicability of the proposed method and a significant improvement of resolution in comparison with conventional spectra recorded in the same time
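
    The core operation can be sketched directly: with randomly sampled evolution-time pairs (t1, t2), the spectrum is evaluated as an explicit Fourier sum over the sampled points for each frequency pair. A toy nonuniform 2D DFT in Python (an illustration of the principle, not the authors' implementation; sampling ranges and frequencies are invented):

    ```python
    import numpy as np

    def nudft2(signal, t1, t2, f1, f2):
        """Explicit 2D Fourier transform of nonuniformly sampled data.

        signal : complex values at random evolution-time pairs (t1[k], t2[k])
        f1, f2 : 1D arrays of output frequencies (Hz)
        Returns the spectrum evaluated on the (f1, f2) grid.
        """
        spec = np.zeros((len(f1), len(f2)), dtype=complex)
        for i, fa in enumerate(f1):
            for j, fb in enumerate(f2):
                spec[i, j] = np.sum(signal * np.exp(-2j * np.pi * (fa * t1 + fb * t2)))
        return spec

    rng = np.random.default_rng(0)
    t1 = rng.uniform(0, 0.05, 200)                      # random evolution times (s)
    t2 = rng.uniform(0, 0.05, 200)
    sig = np.exp(2j * np.pi * (150 * t1 + 80 * t2))     # one synthetic cross-peak
    f = np.linspace(0, 250, 64)
    spec = nudft2(sig, t1, t2, f, f)
    print(np.unravel_index(np.abs(spec).argmax(), spec.shape))  # peak near (150, 80) Hz
    ```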

  12. A Pipeline for Large Data Processing Using Regular Sampling for Unstructured Grids

    Energy Technology Data Exchange (ETDEWEB)

    Berres, Anne Sabine [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Adhinarayanan, Vignesh [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Turton, Terece [Univ. of Texas, Austin, TX (United States); Feng, Wu [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Rogers, David Honegger [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-05-12

    Large simulation data requires substantial time and computational resources to compute, store, analyze, visualize, and evaluate in user studies. Today, the largest cost of a supercomputer is not hardware but maintenance, in particular energy consumption. Our goal is to balance energy consumption and the cognitive value of visualizations of the resulting data. This requires us to go through the entire processing pipeline, from simulation to user studies. To reduce the amount of resources, data can be sampled or compressed. While this adds more computation time, the computational overhead is negligible compared to the simulation time. We built a processing pipeline using regular sampling as an example. The reasons for this choice are two-fold: a simple example reduces unnecessary complexity, as we know what to expect from the results, and it provides a good baseline for future, more elaborate sampling methods. We measured time and energy for each test we ran, and we conducted user studies on Amazon Mechanical Turk (AMT) for a range of different results we produced through sampling.

  13. Accommodating Uncertainty in Prior Distributions

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-19

    A fundamental premise of Bayesian methodology is that a priori information is accurately summarized by a single, precisely defined prior distribution. In many cases, especially involving informative priors, this premise is false, and the (mis)application of Bayes methods produces posterior quantities whose apparent precisions are highly misleading. We examine the implications of uncertainty in prior distributions, and present graphical methods for dealing with them.

  14. Performance of informative priors skeptical of large treatment effects in clinical trials: A simulation study.

    Science.gov (United States)

    Pedroza, Claudia; Han, Weilu; Thanh Truong, Van Thi; Green, Charles; Tyson, Jon E

    2018-01-01

    One of the main advantages of Bayesian analyses of clinical trials is their ability to formally incorporate skepticism about large treatment effects through the use of informative priors. We conducted a simulation study to assess the performance of informative normal, Student-t, and beta distributions in estimating relative risk (RR) or odds ratio (OR) for binary outcomes. Simulation scenarios varied the prior standard deviation (SD; level of skepticism of large treatment effects), outcome rate in the control group, true treatment effect, and sample size. We compared the priors with regard to bias, mean squared error (MSE), and coverage of 95% credible intervals. Simulation results show that the prior SD influenced the posterior to a greater degree than the particular distributional form of the prior. For RR, priors with a 95% interval of 0.50-2.0 performed well in terms of bias, MSE, and coverage under most scenarios. For OR, priors with a wider 95% interval of 0.23-4.35 had good performance. We recommend the use of informative priors that exclude implausibly large treatment effects in analyses of clinical trials, particularly for major outcomes such as mortality.
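
    To make the idea concrete, here is a minimal sketch (not the authors' simulation code) of combining a skeptical normal prior on the log relative risk with the standard normal approximation to the likelihood from a 2x2 trial table; the default prior SD of log(2)/1.96 encodes the 95% interval 0.50-2.0 mentioned above:

    ```python
    import math

    def skeptical_posterior_rr(events_t, n_t, events_c, n_c,
                               prior_mean=0.0, prior_sd=math.log(2) / 1.96):
        """Approximate posterior for RR under a skeptical normal prior on log(RR).

        Uses the usual normal approximation to the log relative risk and its
        variance; conjugate normal-normal updating then gives the posterior.
        """
        log_rr = math.log((events_t / n_t) / (events_c / n_c))
        se2 = 1/events_t - 1/n_t + 1/events_c - 1/n_c   # Var of observed log(RR)
        post_var = 1.0 / (1.0 / prior_sd**2 + 1.0 / se2)
        post_mean = post_var * (prior_mean / prior_sd**2 + log_rr / se2)
        half = 1.96 * math.sqrt(post_var)
        return math.exp(post_mean), (math.exp(post_mean - half), math.exp(post_mean + half))

    # 30/100 events on treatment vs 45/100 on control:
    print(skeptical_posterior_rr(30, 100, 45, 100))
    ```

    The skeptical prior pulls the observed effect toward no effect (RR = 1), which is exactly the shrinkage behavior the simulation study evaluates.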

  15. What good are actions? Accelerating learning using learned action priors

    CSIR Research Space (South Africa)

    Rosman, Benjamin S

    2012-11-01

    Full Text Available The computational complexity of learning in sequential decision problems grows exponentially with the number of actions available to the agent at each state. We present a method for accelerating this process by learning action priors that express...

  16. The UK Biobank sample handling and storage validation studies.

    Science.gov (United States)

    Peakman, Tim C; Elliott, Paul

    2008-04-01

    Background and aims: UK Biobank is a large prospective study in the United Kingdom to investigate the role of genetic factors, environmental exposures and lifestyle in the causes of major diseases of late and middle age. It involves the collection of blood and urine from 500 000 individuals aged between 40 and 69 years. How the samples are collected, processed and stored will have a major impact on the future scientific usefulness of the UK Biobank resource. A series of validation studies was recommended to test the robustness of the draft sample handling and storage protocol. Samples of blood and urine were collected from 40 healthy volunteers and either processed immediately according to the protocol or maintained at specified temperatures (4 °C for all tubes, with the exception of vacutainers containing acid citrate dextrose, which were maintained at 18 °C) for 12, 24 or 36 h prior to processing. A further sample was maintained for 24 h at 4 °C, processed, and the aliquots frozen at -80 °C for 20 days and then thawed under controlled conditions. The stability of the samples was compared for the different times in a wide variety of assays. The samples maintained at 4 °C were stable for at least 24 h after collection for a wide range of assays. Small but significant changes were observed in metabonomic studies in samples maintained at 4 °C for 36 h. There was no degradation of the samples for a range of biochemical assays after short-term freezing and thawing under controlled conditions. Whole blood maintained at 18 °C for 24 h in vacutainers containing acid citrate dextrose is suitable for viral immortalization techniques. The validation studies reported in this supplement provide justification for the sample handling and storage procedures adopted in the UK Biobank project.

  17. Study on infrasonic characteristics of coal samples in failure process under uniaxial loading

    Directory of Open Access Journals (Sweden)

    Bing Jia

    Full Text Available To study the precursory infrasonic characteristics of coal sample failure, a coal rock stress loading system and an infrasonic wave acquisition system were used to perform infrasonic tests on coal samples from the studied area during uniaxial loading. Wavelet filtering, fast Fourier transform, and relative infrasonic energy methods were used to analyze the characteristics of the infrasonic waves during loading, including time-domain characteristics and relative energy. The analysis demonstrated that the frequencies of the infrasonic signals generated during loading mainly lie within 5–10 Hz, which is significantly different from noise signals. The changes of the infrasonic signals show clear periodic characteristics in the time domain. Meanwhile, the relative energy changes of the infrasonic wave also show periodic characteristics, which are divided into two stages by the yield limit of the coal samples and are clear and easy to recognize, so that they can be used as precursory characteristics for recognizing coal sample failures. Moreover, the infrasonic waves generated by coal samples have low frequency and low attenuation, and can be collected without coupling and transmitted over long distances. This study provides important support for further in-situ prediction of coal rock failures. Keywords: Infrasound, Relative energy, Time-frequency analysis, Failure prediction, Identification feature

  18. Using machine learning to accelerate sampling-based inversion

    Science.gov (United States)

    Valentine, A. P.; Sambridge, M.

    2017-12-01

    In most cases, a complete solution to a geophysical inverse problem (including robust understanding of the uncertainties associated with the result) requires a sampling-based approach. However, the computational burden is high, and proves intractable for many problems of interest. There is therefore considerable value in developing techniques that can accelerate sampling procedures. The main computational cost lies in evaluation of the forward operator (e.g. calculation of synthetic seismograms) for each candidate model. Modern machine learning techniques, such as Gaussian Processes, offer a route for constructing a computationally cheap approximation to this calculation, which can replace the accurate solution during sampling. Importantly, the accuracy of the approximation can be refined as inversion proceeds, to ensure high-quality results. In this presentation, we describe and demonstrate this approach, which can be seen as an extension of popular current methods, such as the Neighbourhood Algorithm, and bridges the gap between prior- and posterior-sampling frameworks.
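
    A minimal illustration of the idea (a toy setup under assumed choices, not the authors' implementation): train a Gaussian Process on a handful of expensive forward evaluations, use the cheap GP prediction inside the sampler, and fall back to (and learn from) the true operator whenever the GP is too uncertain.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def forward(m):
        """Stand-in for an expensive forward operator (e.g. synthetic seismograms)."""
        return np.sin(3 * m) + 0.5 * m**2

    rng = np.random.default_rng(0)
    X = rng.uniform(-2, 2, size=(8, 1))        # initial expensive evaluations
    y = forward(X).ravel()
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0)).fit(X, y)

    def cheap_forward(m, tol=0.05):
        """GP surrogate that refines itself where its uncertainty exceeds tol."""
        global X, y, gp
        mean, std = gp.predict([[m]], return_std=True)
        if std[0] > tol:
            true = forward(m)                  # expensive call, reused as training data
            X = np.vstack([X, [[m]]])
            y = np.append(y, true)
            gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0)).fit(X, y)
            return true
        return mean[0]

    print(cheap_forward(0.7), forward(0.7))    # surrogate vs. exact evaluation
    ```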

  19. Neutrino masses and their ordering: global data, priors and models

    Science.gov (United States)

    Gariazzo, S.; Archidiacono, M.; de Salas, P. F.; Mena, O.; Ternes, C. A.; Tórtola, M.

    2018-03-01

    We present a full Bayesian analysis of the combination of current neutrino oscillation, neutrinoless double beta decay and Cosmic Microwave Background observations. Our major goal is to carefully investigate the possibility to single out one neutrino mass ordering, namely Normal Ordering or Inverted Ordering, with current data. Two possible parametrizations (three neutrino masses versus the lightest neutrino mass plus the two oscillation mass splittings) and priors (linear versus logarithmic) are exhaustively examined. We find that the preference for NO is only driven by neutrino oscillation data. Moreover, the values of the Bayes factor indicate that the evidence for NO is strong only when the scan is performed over the three neutrino masses with logarithmic priors; for every other combination of parametrization and prior, the preference for NO is only weak. As a by-product of our Bayesian analyses, we are able to (a) compare the Bayesian bounds on the neutrino mixing parameters to those obtained by means of frequentist approaches, finding a very good agreement; (b) determine that the lightest neutrino mass plus the two mass splittings parametrization, motivated by the physical observables, is strongly preferred over the three neutrino mass eigenstates scan and (c) find that logarithmic priors guarantee a weakly-to-moderately more efficient sampling of the parameter space. These results establish the optimal strategy to successfully explore the neutrino parameter space, based on the use of the oscillation mass splittings and a logarithmic prior on the lightest neutrino mass, when combining neutrino oscillation data with cosmology and neutrinoless double beta decay. We also show that the limits on the total neutrino mass ∑ mν can change dramatically when moving from one prior to the other. These results have profound implications for future studies on the neutrino mass ordering, as they crucially state the need for self-consistent analyses which explore the...

  20. Prior Knowledge and the Learning of Science. A Review of Ausubel's Theory of This Process

    Science.gov (United States)

    West, L. H. T.; Fensham, P. J.

    1974-01-01

    Examines Ausubel's theory of learning as a model of the role of prior knowledge in how learning occurs. Research evidence for Ausubel's theory is presented and discussed. Implications of Ausubel's theory for teaching are summarized. (PEB)

  1. Neutron-activation analysis of routine mineral-processing samples

    International Nuclear Information System (INIS)

    Watterson, J.; Eddy, B.; Pearton, D.

    1974-01-01

    Instrumental neutron-activation analysis was applied to a suite of typical mineral-processing samples to establish which elements can be rapidly determined in them by this technique. A total of 35 elements can be determined with precisions (from the counting statistics) ranging from better than 1 per cent to approximately 20 per cent. The elements that can be determined have been tabulated together with the experimental conditions, the precision from the counting statistics, and the estimated number of analyses possible per day. With an automated system, this number can be as high as 150 in the most favourable cases.

  2. Pippi — Painless parsing, post-processing and plotting of posterior and likelihood samples

    Science.gov (United States)

    Scott, Pat

    2012-11-01

    Interpreting samples from likelihood or posterior probability density functions is rarely as straightforward as it seems it should be. Producing publication-quality graphics of these distributions is often similarly painful. In this short note I describe pippi, a simple, publicly available package for parsing and post-processing such samples, as well as generating high-quality PDF graphics of the results. Pippi is easily and extensively configurable and customisable, both in its options for parsing and post-processing samples, and in the visual aspects of the figures it produces. I illustrate some of these using an existing supersymmetric global fit, performed in the context of a gamma-ray search for dark matter. Pippi can be downloaded and followed at http://github.com/patscott/pippi.

  3. A novel heterogeneous training sample selection method on space-time adaptive processing

    Science.gov (United States)

    Wang, Qiang; Zhang, Yongshun; Guo, Yiduo

    2018-04-01

    The ground target detection performance of space-time adaptive processing (STAP) decreases when the clutter power becomes non-homogeneous because training samples are contaminated by target-like signals. In order to solve this problem, a novel non-homogeneous training sample selection method based on sample similarity is proposed, which converts the training sample selection into a convex optimization problem. Firstly, the existing deficiencies of sample selection using the generalized inner product (GIP) are analyzed. Secondly, the similarities of different training samples are obtained by calculating the mean-Hausdorff distance, so as to reject the contaminated training samples. Thirdly, the cell under test (CUT) and the residual training samples are projected into the orthogonal subspace of the target in the CUT, and mean-Hausdorff distances between the projected CUT and training samples are calculated. Fourthly, the distances are sorted by value, and the training samples with the larger values are preferentially selected to realize the dimension reduction. Finally, simulation results with Mountain-Top data verify the effectiveness of the proposed method.
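
    As a rough illustration of the similarity step (a sketch under assumed definitions; the paper's exact formulation may differ), the mean Hausdorff distance between two sample point sets averages nearest-neighbour distances in both directions, and samples that are far from all others can be flagged as contaminated:

    ```python
    import numpy as np

    def mean_hausdorff(a, b):
        """Mean (average) Hausdorff distance between point sets a (n, d) and b (m, d)."""
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)  # pairwise distances
        return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())

    # Similarity screening: the sample least similar to the rest is rejected first.
    rng = np.random.default_rng(1)
    samples = [rng.normal(0, 1, (16, 2)) for _ in range(9)]
    samples.append(rng.normal(4, 1, (16, 2)))       # a contaminated outlier
    scores = [np.mean([mean_hausdorff(s, t) for t in samples if t is not s])
              for s in samples]
    print(np.argmax(scores))                        # index 9: the outlier is flagged
    ```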

  4. Iterated random walks with shape prior

    DEFF Research Database (Denmark)

    Pujadas, Esmeralda Ruiz; Kjer, Hans Martin; Piella, Gemma

    2016-01-01

    We propose a new framework for image segmentation using random walks, where a distance shape prior is combined with a region term. The shape prior is weighted by a confidence map to reduce the influence of the prior in high-gradient areas, and the region term is computed with k-means to estimate the parametric probability density function. Random walks segmentation is then performed iteratively, aligning the prior with the current segmentation in every iteration. We tested the proposed approach with natural and medical images and compared it with the latest techniques using random walks and shape priors. The experiments suggest that this method gives promising results for medical and natural images.

  5. Probabilistic Round Trip Contamination Analysis of a Mars Sample Acquisition and Handling Process Using Markovian Decompositions

    Science.gov (United States)

    Hudson, Nicolas; Lin, Ying; Barengoltz, Jack

    2010-01-01

    A method for evaluating the probability of a Viable Earth Microorganism (VEM) contaminating a sample during the sample acquisition and handling (SAH) process of a potential future Mars Sample Return mission is developed. A scenario where multiple core samples would be acquired using a rotary percussive coring tool, deployed from an arm on a MER-class rover, is analyzed. The analysis is conducted in a structured way by decomposing the sample acquisition and handling process into a series of discrete time steps, and breaking the physical system into a set of relevant components. At each discrete time step, two key functions are defined: the probability of a VEM being released from each component, and the transport matrix, which represents the probability of VEM transport from one component to another. By defining the expected number of VEMs on each component at the start of the sampling process, these decompositions allow the expected number of VEMs on each component at each sampling step to be represented as a Markov chain. This formalism provides a rigorous mathematical framework in which to analyze the probability of a VEM entering the sample chain, as well as making the analysis tractable by breaking the process down into small analyzable steps.
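
    The propagation itself is simple linear algebra. A toy sketch (the component set, release probabilities, and transport matrix below are invented for illustration, not mission values) propagating expected VEM counts step by step:

    ```python
    import numpy as np

    # Components: 0 = coring bit, 1 = sample, 2 = sample tube exterior
    x = np.array([100.0, 0.0, 0.0])     # expected VEMs on each component at start

    # T[i, j]: probability that a released VEM moves from component i to j per step
    T = np.array([[0.90, 0.08, 0.02],
                  [0.01, 0.98, 0.01],
                  [0.02, 0.03, 0.95]])

    release = np.array([0.05, 0.01, 0.02])   # per-step release probability

    for step in range(10):                   # ten discrete SAH time steps
        moved = (x * release) @ T            # released VEMs redistributed by T
        x = x * (1 - release) + moved        # retained plus newly arrived
    print(f"expected VEMs on the sample after 10 steps: {x[1]:.3f}")
    ```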

  6. Separation and Enrichment of Gold in Water, Geological and Environmental Samples by Solid Phase Extraction on Multiwalled Carbon Nanotubes Prior to its Determination by Flame Atomic Absorption Spectrometry.

    Science.gov (United States)

    Duran, Ali; Tuzen, Mustafa; Soylak, Mustafa

    2015-01-01

    This study proposes the application of multi-walled carbon nanotubes as a solid sorbent for the preconcentration of gold prior to its determination by flame atomic absorption spectrometry. Extraction was achieved using a glass column (15.0 cm in length and 1.0 cm in diameter). Quantitative recoveries were obtained in the pH range of 2.5-4.0; the elution step was carried out with 5.0 ml of 1.0 mol/L HNO3 in acetone. In the ligand-free study, variables such as pH, eluent type, sample volume, flow rates, and matrix effects were examined for the optimum recovery of gold ions. The gold ions could be preconcentrated by a factor of 150, and the LOD was determined to be 1.71 μg/L. In order to evaluate the accuracy of the developed method, addition-recovery tests were applied to tap water, mineral water, and sea water samples. Gold recovery studies were implemented using a wet digestion technique for mine and soil samples taken from various media, and the method was also applied to anodic slime samples taken from factories located in the Kayseri Industrial Zone of Turkey.

  7. The Prior-project

    DEFF Research Database (Denmark)

    Engerer, Volkmar Paul; Roued-Cunliffe, Henriette; Albretsen, Jørgen

    In this paper, we present a DH research infrastructure which relies heavily on a combination of domain knowledge with information technology. The general goal is to develop tools to aid scholars in their interpretations and understanding of temporal logic. This in turn is based on an extensive digitisation of Arthur Prior’s Nachlass kept in the Bodleian Library, Oxford. The DH infrastructure in question is the Prior Virtual Lab (PVL). PVL was established in 2011 in order to provide researchers in the field of temporal logic easy access to the papers of Arthur Norman Prior (1914-1969), and officially...

  8. Dry sample storage system for an analytical laboratory supporting plutonium processing

    International Nuclear Information System (INIS)

    Treibs, H.A.; Hartenstein, S.D.; Griebenow, B.L.; Wade, M.A.

    1990-01-01

    The Special Isotope Separation (SIS) plant is designed to provide removal of undesirable isotopes in fuel grade plutonium by the atomic vapor laser isotope separation (AVLIS) process. The AVLIS process involves evaporation of plutonium metal, and passage of an intense beam of light from a laser through the plutonium vapor. The laser beam consists of several discrete wavelengths, tuned to the precise wavelength required to ionize the undesired isotopes. These ions are attracted to charged plates, leaving the bulk of the plutonium vapor enriched in the desired isotopes to be collected on a cold plate. Major portions of the process consist of pyrochemical processes, including direct reduction of the plutonium oxide feed material with calcium metal, and aqueous processes for purification of plutonium in residues. The analytical laboratory for the plant is called the Material and Process Control Laboratory (MPCL), and provides for the analysis of solid and liquid process samples

  9. Stability of arsenic compounds in seafood samples during processing and storage by freezing

    DEFF Research Database (Denmark)

    Dahl, Lisbeth; Molin, Marianne; Amlund, Heidi

    2010-01-01

    ...was observed after processing or after storage by freezing. The content of tetramethylarsonium ion was generally low in all sample types, but increased significantly in all fried samples of both fresh and frozen seafood. Upon storage by freezing, the arsenobetaine content was reduced significantly, but only...

  10. Wrong, but useful: regional species distribution models may not be improved by range-wide data under biased sampling.

    Science.gov (United States)

    El-Gabbas, Ahmed; Dormann, Carsten F

    2018-02-01

    Species distribution modeling (SDM) is an essential method in ecology and conservation. SDMs are often calibrated within one country's borders, typically along a limited environmental gradient with biased and incomplete data, making the quality of these models questionable. In this study, we evaluated how adequate national presence-only data are for calibrating regional SDMs. We trained SDMs for Egyptian bat species at two different scales: only within Egypt and at a species-specific global extent. We used two modeling algorithms: Maxent and elastic net, both under the point-process modeling framework. For each modeling algorithm, we measured the congruence of the predictions of global and regional models for Egypt, assuming that the lower the congruence, the lower the appropriateness of the Egyptian dataset to describe the species' niche. We inspected the effect of incorporating predictions from global models as an additional predictor ("prior") in regional models, and quantified the improvement in terms of AUC and the congruence between regional models run with and without priors. Moreover, we analyzed predictive performance improvements after correction for sampling bias at both scales. On average, predictions from global and regional models in Egypt only weakly concur. Collectively, the use of priors did not lead to much improvement: similar AUC and high congruence between regional models calibrated with and without priors. Correction for sampling bias led to higher model performance, whatever the prior used, making the use of priors less pronounced. Under biased and incomplete sampling, the use of global bat data did not improve regional model performance. Without enough bias-free regional data, we cannot objectively identify the actual improvement of regional models after incorporating information from the global niche. However, we still believe in great potential for global model predictions to guide future surveys and improve regional sampling in data...

  11. X-Ray Computed Tomography: The First Step in Mars Sample Return Processing

    Science.gov (United States)

    Welzenbach, L. C.; Fries, M. D.; Grady, M. M.; Greenwood, R. C.; McCubbin, F. M.; Zeigler, R. A.; Smith, C. L.; Steele, A.

    2017-01-01

    The Mars 2020 rover mission will collect and cache samples from the martian surface for possible retrieval and subsequent return to Earth. If the samples are returned, that mission would likely present an opportunity to analyze returned Mars samples within a geologic context on Mars. In addition, it may provide definitive information about the existence of past or present life on Mars. Mars sample return presents unique challenges for the collection, containment, transport, curation and processing of samples [1]. Foremost in the processing of returned samples are the closely paired considerations of life detection and Planetary Protection. In order to achieve Mars Sample Return (MSR) science goals, reliable analyses will depend on overcoming some challenging signal/noise-related issues, where sparse martian organic compounds must be reliably analyzed against the contamination background. While reliable analyses will depend on initial clean acquisition and robust documentation of all aspects of developing and managing the cache [2], there also needs to be a reliable sample handling and analysis procedure that accounts for a variety of materials which may or may not contain evidence of past or present martian life. A recent report [3] suggests that a defined set of measurements should be made to effectively inform both science and Planetary Protection, when applied in the context of the two competing null hypotheses: 1) that there is no detectable life in the samples; or 2) that there is martian life in the samples. The defined measurements would include a phased approach that would be accepted by the community to preserve the bulk of the material, but provide unambiguous science data that can be used and interpreted by various disciplines. Foremost is the concern that the initial steps would ensure the pristine nature of the samples. Preliminary, non-invasive techniques such as computed X-ray tomography (XCT) have been suggested as the first method to interrogate and...

  12. The search for Infrared radiation prior to major earthquakes

    Science.gov (United States)

    Ouzounov, D.; Taylor, P.; Pulinets, S.

    2004-12-01

    This work describes our search for a relationship between tectonic stresses and electro-chemical and thermodynamic processes in the Earth, and increases in mid-IR flux, as part of a possible ensemble of electromagnetic (EM) phenomena that may be related to earthquake activity. Recent analysis of continuous ongoing outgoing long-wavelength Earth radiation (OLR) indicates significant and anomalous variability prior to some earthquakes. The cause of these anomalies is not well understood but could be the result of triggering by an interaction between the lithosphere, hydrosphere and atmosphere related to changes in the near-surface electrical field and gas composition prior to the earthquake. The OLR anomaly covers large areas surrounding the main epicenter. We have used the NOAA IR data to differentiate between the global and seasonal variability and these transient local anomalies. Indeed, on the basis of a temporal and spatial distribution analysis, an anomaly pattern is found to occur several days prior to some major earthquakes. The significance of these observations was explored using data sets of some recent worldwide events.

  13. Recolonization of group B Streptococcus (GBS) in women with prior GBS genital colonization in pregnancy.

    Science.gov (United States)

    Tam, Teresa; Bilinski, Ewa; Lombard, Emily

    2012-10-01

    The purpose of the study was to evaluate the incidence of women with prior GBS genital colonization who have recolonization in subsequent pregnancies. This is a retrospective, cohort study of patients with a prior GBS genital colonization in pregnancy and a subsequent pregnancy with a recorded GBS culture result, from January 2000 through June 2007. Documentation of GBS status was through GBS culture performed between 35 and 37 weeks gestation. Exclusion criteria included pregnancies with unknown GBS status, patients with GBS bacteriuria, women with a previous neonate with GBS disease, and GBS findings prior to 35 weeks. Data were analyzed using SPSS 15.0. The sample proportion of subjects with GBS genital colonization and its confidence interval were computed to estimate the incidence rate. Logistic regression was performed to assess potential determinants of GBS colonization. Regression coefficients, odds ratios and associated confidence intervals, and p-values were reported. There were 371 pregnancies that met the test criteria. There were 151 subsequent pregnancies with GBS genital colonization and 220 without GBS recolonization. The incidence of GBS recolonization in patients with prior GBS genital colonization was 40.7% (95% confidence interval 35.7-45.69%). The incidence rate for the sample was significantly larger than 30%, indicating a substantial risk of GBS recolonization in subsequent pregnancies.

  14. Impact of implementing ISO 9001:2008 standard on the Spanish Renal Research Network biobank sample transfer process.

    Science.gov (United States)

    Cortés, M Alicia; Irrazábal, Emanuel; García-Jerez, Andrea; Bohórquez-Magro, Lourdes; Luengo, Alicia; Ortiz-Arduán, Alberto; Calleros, Laura; Rodríguez-Puyol, Manuel

    2014-01-01

    ISO 9001:2008 biobank certification aims to improve the management of the processes performed, with two objectives: customer satisfaction and continuous improvement. This paper presents the impact of ISO 9001:2008 certification on the sample transfer process in a Spanish biobank specialising in kidney patient samples. The biobank experienced a large increase in the number of samples between 2009 (12,582 vials) and 2010 (37,042 vials). The biobank of the Spanish Renal Research Network (REDinREN), located at the University of Alcalá, has implemented ISO standard 9001:2008 for the effective management of human material given to research centres. Using surveys, we analysed two periods in the “sample transfer” process. During the first period, between 1-10-12 and 26-11-12 (8 weeks), minimal changes were made to correct isolated errors. In the second period, between 7-01-13 and 18-02-13 (6 weeks), we carried out general corrective actions. The identification of problems and implementation of corrective actions for certification allowed: a 70% reduction in process execution time, a significant increase (200%) in the number of samples processed, and a 25% improvement in the process. The increase in the number of samples processed was directly related to process improvement. The ISO 9001:2008 certification, obtained in July 2013, allowed an improvement of the REDinREN biobank processes to be achieved, which increased quality and customer satisfaction.

  15. 40 CFR 61.54 - Sludge sampling.

    Science.gov (United States)

    2010-07-01

    40 CFR § 61.54 (Protection of Environment, revised as of 2010-07-01): Sludge sampling. (a) As an alternative means for demonstrating compliance with § 61.52(b), an owner or operator... days prior to a sludge sampling test, so that he may at his option observe the test. (c) Sludge shall...

  16. A Frequency Matching Method: Solving Inverse Problems by Use of Geologically Realistic Prior Information

    DEFF Research Database (Denmark)

    Lange, Katrine; Frydendall, Jan; Cordua, Knud Skou

    2012-01-01

    The frequency matching method defines a closed form expression for a complex prior that quantifies the higher order statistics of a proposed solution model to an inverse problem. While existing solution methods to inverse problems are capable of sampling the solution space while taking into account arbitrarily complex a priori information defined by sampling algorithms, it is not possible to directly compute the maximum a posteriori model, as the prior probability of a solution model cannot be expressed. We demonstrate how the frequency matching method enables us to compute the maximum a posteriori solution model to an inverse problem by using a priori information based on multiple point statistics learned from training images. We demonstrate the applicability of the suggested method on a synthetic tomographic crosshole inverse problem.

  17. Evolution of Industry Knowledge in the Public Domain: Prior Art Searching for Software Patents

    Directory of Open Access Journals (Sweden)

    Jinseok Park

    2005-03-01

    Full Text Available Searching prior art is a key part of the patent application and examination processes. A comprehensive prior art search gives the inventor ideas as to how he can improve or circumvent existing technology by providing up to date knowledge on the state of the art. It also enables the patent applicant to minimise the likelihood of an objection from the patent office. This article explores the characteristics of prior art associated with software patents, dealing with difficulties in searching prior art due to the lack of resources, and considers public contribution to the formation of prior art databases. It addresses the evolution of electronic prior art in line with technological development, and discusses laws and practices in the EPO, USPTO, and the JPO in relation to the validity of prior art resources on the Internet. This article also investigates the main features of searching sources and tools in the three patent offices as well as non-patent literature databases. Based on the analysis of various searching databases, it provides some strategies of efficient prior art searching that should be considered for software-related inventions.

  18. White HDPE bottles as source of serious contamination of water samples with Ba and Zn.

    Science.gov (United States)

    Reimann, Clemens; Grimstvedt, Andreas; Frengstad, Bjørn; Finne, Tor Erik

    2007-03-15

    During a recent study of surface water quality, factory-new white high-density polyethylene (HDPE) bottles were used for collecting the water samples. According to the established field protocol of the Geological Survey of Norway, the bottles were twice carefully rinsed with water in the field prior to sampling. Several blank samples using Milli-Q (ELGA) water (>18.2 MΩ) were also prepared. On checking the analytical results, the blanks returned values of Ag, Ba, Sr, V, Zn and Zr. For Ba and Zn the values (c. 300 µg/L and 95 µg/L) were about 10 times above the concentrations that can be expected in natural waters. A laboratory test of the bottles demonstrated that the bottles contaminate the samples with significant amounts of Ba and Zn and some Sr. Simple acid washing of the bottles prior to use did not solve the contamination problem for Ba and Zn. The results suggest that there may exist "clean" and "dirty" HDPE bottles depending on the manufacturer/production process. When collecting water samples it is mandatory to check bottles regularly as a possible source of contamination.

  19. Solubility of airborne uranium samples from uranium processing plant

    International Nuclear Information System (INIS)

    Kravchik, T.; Oved, S.; Sarah, R.; Gonen, R.; Paz-Tal, O.; Pelled, O.; German, U.; Tshuva, A.

    2005-01-01

    Full text: During the production and machining processes of uranium metal, aerosols might be released to the air. Inhalation of these aerosols is the main route of internal exposure of workers. To assess the radiation dose from the intake of these uranium compounds it is necessary to know their absorption type, based on their dissolution rate in the extracellular aqueous environment of lung fluid. The International Commission on Radiological Protection (ICRP) has assigned UF4 and UO3 to absorption type M (blood absorption which contains a 10% fraction with an absorption half-time of 10 minutes and a 90% fraction with an absorption half-time of 140 days) and UO2 and U3O8 to absorption type S (blood absorption with a half-time of 7000 days) in the ICRP-66 model. The solubility classification of uranium compounds defined by the ICRP can serve as general guidance. At specific workplaces, differences can be encountered because of differences in compound production processes and the presence of additional compounds with different solubility characteristics. According to ICRP recommendations, material-specific rates of absorption should be preferred to default parameters whenever specific experimental data exist. Solubility profiles of uranium aerosols were determined by performing in vitro chemical solubility tests on air samples taken from uranium production and machining facilities. The dissolution rate was determined over 100 days in a simulant solution of the extracellular airway lining fluid. The filter sample was immersed in a test vial holding 60 ml of simulant fluid, which was maintained at 37 °C inside a thermostatic bath and at a physiological pH of 7.2-7.6. The test vials with the solution were shaken to simulate the conditions inside the extracellular aqueous environment of the lung as much as possible. The tests indicated that the uranium aerosol samples taken from the metal production and machining facilities at the Nuclear Research Center Negev (NRCN

  20. Total-Evidence Dating under the Fossilized Birth-Death Process.

    Science.gov (United States)

    Zhang, Chi; Stadler, Tanja; Klopfstein, Seraina; Heath, Tracy A; Ronquist, Fredrik

    2016-03-01

    Bayesian total-evidence dating involves the simultaneous analysis of morphological data from the fossil record and morphological and sequence data from recent organisms, and it accommodates the uncertainty in the placement of fossils while dating the phylogenetic tree. Due to the flexibility of the Bayesian approach, total-evidence dating can also incorporate additional sources of information. Here, we take advantage of this and expand the analysis to include information about fossilization and sampling processes. Our work is based on the recently described fossilized birth-death (FBD) process, which has been used to model speciation, extinction, and fossilization rates that can vary over time in a piecewise manner. So far, sampling of extant and fossil taxa has been assumed to be either complete or uniformly at random, an assumption which is only valid for a minority of data sets. We therefore extend the FBD process to accommodate diversified sampling of extant taxa, which is standard practice in studies of higher-level taxa. We verify the implementation using simulations and apply it to the early radiation of Hymenoptera (wasps, ants, and bees). Previous total-evidence dating analyses of this data set were based on a simple uniform tree prior and dated the initial radiation of extant Hymenoptera to the late Carboniferous (309 Ma). The analyses using the FBD prior under diversified sampling, however, date the radiation to the Triassic and Permian (252 Ma), slightly older than the age of the oldest hymenopteran fossils. By exploring a variety of FBD model assumptions, we show that it is mainly the accommodation of diversified sampling that causes the push toward more recent divergence times. Accounting for diversified sampling thus has the potential to close the long-discussed gap between rocks and clocks. We conclude that the explicit modeling of fossilization and sampling processes can improve divergence time estimates, but only if all important model aspects

  1. REMOTE IN-CELL SAMPLING IMPROVEMENTS PROGRAM AT THE SAVANNAH RIVER SITE (SRS) DEFENSE WASTE PROCESSING FACILITY (DWPF)

    International Nuclear Information System (INIS)

    Marzolf, A

    2007-01-01

    Remote Systems Engineering (RSE) of the Savannah River National Lab (SRNL), in combination with Defense Waste Processing Facility (DWPF) Engineering and Operations, has evaluated the existing equipment and processes used in the facility sample cells for 'pulling' samples from the radioactive waste stream and performing in-cell equipment repairs/replacements. RSE has designed and tested equipment for improving remote in-cell sampling evolutions and reducing the time required for in-cell maintenance of existing equipment. The equipment within the present process tank sampling system has been in constant use since facility start-up over 17 years ago. At present, the method for taking samples within the sample cells produces excessive maintenance and downtime due to frequent failures of the sampling station equipment and manipulator. The location and orientation of many sampling stations within the sample cells are not conducive to manipulator operation, and the overextension of manipulators required to perform many in-cell operations is a major cause of manipulator failures. To improve sampling operations and reduce downtime due to equipment maintenance, a Portable Sampling Station (PSS), wireless in-cell cameras, and new commercially available sampling technology have been designed, developed and/or adapted and tested. The uniqueness of the design(s), the results of the scoping tests, and the benefits relative to in-cell operation and reduction of waste are presented

  2. Data Validation Package - April and July 2015 Groundwater and Surface Water Sampling at the Gunnison, Colorado, Processing Site

    Energy Technology Data Exchange (ETDEWEB)

    Linard, Joshua [Dept. of Energy (DOE), Washington, DC (United States). Office of Legacy Management; Campbell, Sam [Navarro Research and Engineering, Inc., Oak Ridge, TN (United States)

    2016-02-01

    This event included annual sampling of groundwater and surface water locations at the Gunnison, Colorado, Processing Site. Sampling and analyses were conducted as specified in the Sampling and Analysis Plan for U.S. Department of Energy Office of Legacy Management Sites. Samples were collected from 28 monitoring wells, three domestic wells, and six surface locations in April at the processing site as specified in the 2010 Ground Water Compliance Action Plan for the Gunnison, Colorado, Processing Site. Domestic wells 0476 and 0477 were sampled in July because the homes were unoccupied in April, and the wells were not in use. Duplicate samples were collected from locations 0113, 0248, and 0477. One equipment blank was collected during this sampling event. Water levels were measured at all monitoring wells that were sampled. No issues were identified during the data validation process that require additional action or follow-up.

  3. Cloud point extraction for determination of lead in blood samples of children, using different ligands prior to analysis by flame atomic absorption spectrometry: A multivariate study

    Energy Technology Data Exchange (ETDEWEB)

    Shah, Faheem, E-mail: shah_ceac@yahoo.com [National Center of Excellence in Analytical Chemistry, University of Sindh, Jamshoro 76080 (Pakistan); Kazi, Tasneem Gul, E-mail: tgkazi@yahoo.com [National Center of Excellence in Analytical Chemistry, University of Sindh, Jamshoro 76080 (Pakistan); Afridi, Hassan Imran, E-mail: hassanimranafridi@yahoo.com [National Center of Excellence in Analytical Chemistry, University of Sindh, Jamshoro 76080 (Pakistan); Naeemullah, E-mail: khannaeemullah@ymail.com [National Center of Excellence in Analytical Chemistry, University of Sindh, Jamshoro 76080 (Pakistan); Arain, Muhammad Balal, E-mail: bilal_ku2004@yahoo.com [Department of Chemistry, University of Science and Technology, Bannu, KPK (Pakistan); Baig, Jameel Ahmed, E-mail: jab_mughal@yahoo.com [National Center of Excellence in Analytical Chemistry, University of Sindh, Jamshoro 76080 (Pakistan)

    2011-09-15

    Highlights: → Trace levels of lead in blood samples of healthy children and children with different kidney disorders. → Preconcentration of Pb²⁺ in acid-digested blood samples after chelating with two complexing reagents. → A multivariate technique was used for screening of significant factors that influence the CPE of Pb²⁺. → The level of Pb²⁺ in diseased children was significantly higher than in referents of the same age group. - Abstract: The phase-separation phenomenon of non-ionic surfactants occurring in aqueous solution was used for the extraction of lead (Pb²⁺) from digested blood samples after simultaneous complexation with ammonium pyrrolidinedithiocarbamate (APDC) and diethyldithiocarbamate (DDTC) separately. The complexed analyte was quantitatively extracted with octylphenoxypolyethoxyethanol (Triton X-114). A multivariate strategy was applied to estimate the optimum values of experimental factors. Acidic ethanol was added to the surfactant-rich phase prior to its analysis by flame atomic absorption spectrometry (FAAS). The detection limit of Pb²⁺ for the preconcentration of 10 mL of acid-digested blood sample was 1.14 µg L⁻¹. The accuracy of the proposed methods was assessed by analyzing certified reference material (whole blood). Under the optimized conditions of both CPE methods, 10 mL of Pb²⁺ standards (10 µg L⁻¹) complexed with APDC and DDTC permitted enhancement factors of 56 and 42, respectively. The proposed method was used for determination of Pb²⁺ in blood samples of children with kidney disorders and healthy controls.

  4. A Bayesian sampling strategy for hazardous waste site characterization

    International Nuclear Information System (INIS)

    Skalski, J.R.

    1987-12-01

    Prior knowledge based on historical records or physical evidence often suggests the existence of a hazardous waste site. Initial surveys may provide additional or even conflicting evidence of site contamination. This article presents a Bayes sampling strategy that allocates sampling at a site using this prior knowledge. This sampling strategy minimizes the environmental risks of missing chemical or radionuclide hot spots at a waste site. The environmental risk is shown to be proportional to the size of the undetected hot spot or inversely proportional to the probability of hot spot detection. 12 refs., 2 figs
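
    The inverse relationship between risk and detection probability is easy to illustrate numerically. A toy Monte Carlo sketch (the square-grid geometry and circular hot-spot model are assumptions for illustration, not the article's method) estimating the chance that at least one node of a sampling grid hits a hot spot of a given radius:

    ```python
    import numpy as np

    def detection_probability(radius, spacing, trials=100_000, seed=0):
        """P(a circular hot spot is hit by at least one node of a square grid).

        By symmetry, only the hot-spot centre's position within one grid cell
        matters, so the centre is randomized over a single spacing x spacing cell;
        the nearest grid node is then one of the four cell corners.
        """
        rng = np.random.default_rng(seed)
        centers = rng.uniform(0, spacing, size=(trials, 2))
        corners = np.array([[0, 0], [0, spacing], [spacing, 0], [spacing, spacing]])
        d = np.linalg.norm(centers[:, None, :] - corners[None, :, :], axis=-1)
        return np.mean(d.min(axis=1) <= radius)

    for r in (5.0, 10.0, 20.0):
        print(f"radius {r:5.1f} m, 30 m grid: P(detect) = {detection_probability(r, 30.0):.3f}")
    ```

    Smaller hot spots yield lower detection probabilities, hence (in the article's terms) higher environmental risk for a fixed sampling allocation.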

  5. Equipment and techniques for remote sampling of stored radioactive waste

    International Nuclear Information System (INIS)

    Nance, T.A.

    1996-01-01

    Several tools have been developed at the Savannah River Site (SRS) to remotely sample stored radioactive waste. These sampling tools have been developed to determine the chemical characteristics of the waste prior to processing. The processing of waste material varies according to the chemical characteristics of the waste, which change due to additions, settling, mixing, and chemical reactions during storage. Once the waste has been sampled to identify its characteristics, the chemical composition of the waste can then be altered if needed to prepare for processing. Various types of waste material in several types of containment must be sampled at SRS. Stored waste materials consist of liquids, floating organics, sludge, salt and solids. Waste is stored in four basic types of tanks with different means of access and interior obstructions. The waste tanks can only be accessed by small openings: access ports, risers and downcomers. Requirements for sampling depend on the type of tank being accessed, the waste within the tank, and the particular location in the tank desired for taking the sample. Sampling devices have been developed to sample all of the waste material forms found in the SRS tank farms. The fluid type samplers are capable of sampling surface liquid, subsurface liquid at varying depth, surface sludge, subsurface sludge, and floating organics. The solid type samplers are capable of sampling salt, sampling a solid layer on the bottom of the tank, and capturing a small solid mass on the tank bottom. The sampling devices are all designed to access the tanks through small access ports. The samplers are reusable and are designed to allow quick transfer of the samples to shielded packaging for transport, reducing the amount of radiation exposure to sampling personnel. The samplers weigh less than 100 lb. and are designed in sections to allow easy disassembly for storage and transport by personnel. (Abstract Truncated)

  6. Progressive sample processing of band selection for hyperspectral imagery

    Science.gov (United States)

    Liu, Keng-Hao; Chien, Hung-Chang; Chen, Shih-Yu

    2017-10-01

    Band selection (BS) is one of the most important topics in hyperspectral image (HSI) processing. The objective of BS is to find a set of representative bands that can represent the whole image with low inter-band redundancy. Many types of BS algorithms have been proposed in the past. However, most of them can only be carried out in an off-line manner, meaning they can only be implemented on pre-collected data. Such off-line methods are sometimes useless for time-critical applications, particularly in disaster prevention and target detection. To tackle this issue, a new concept, called progressive sample processing (PSP), was proposed recently. The PSP is an "on-line" framework where a specific type of algorithm can process the currently collected data during the data transmission under the band-interleaved-by-sample/pixel (BIS/BIP) protocol. This paper proposes an online BS method that integrates a sparsity-based BS into the PSP framework, called PSP-BS. In PSP-BS, the BS can be carried out by updating the BS result recursively pixel by pixel, in the same way that a Kalman filter updates data information in a recursive fashion. The sparse regression is solved by the orthogonal matching pursuit (OMP) algorithm, and the recursive equations of PSP-BS are derived using matrix decomposition. The experiments conducted on a real hyperspectral image show that PSP-BS can progressively output the BS status with very low computing time. The convergence of BS results during the transmission can be quickly achieved by using a rearranged pixel transmission sequence. This significant advantage allows BS to be implemented in a real-time manner as the HSI data are transmitted pixel by pixel.
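
    To give a feel for the sparsity idea behind such methods, here is a generic OMP-based band selection sketch (a self-representation scheme assumed for illustration; it is not the PSP-BS recursion, whose pixel-by-pixel update equations the paper derives via matrix decomposition): each band is regressed on the others with OMP, and the bands most often chosen as regressors are kept as representative.

    ```python
    import numpy as np
    from sklearn.linear_model import OrthogonalMatchingPursuit

    def select_bands(pixels, n_bands):
        """Greedy sparse self-representation band selection.

        pixels : (n_pixels, n_total_bands) matrix of spectra.
        Each band is approximated from a few other bands via OMP; bands used
        most often as regressors are deemed the most representative.
        """
        n_total = pixels.shape[1]
        votes = np.zeros(n_total)
        for b in range(n_total):
            others = np.delete(np.arange(n_total), b)
            omp = OrthogonalMatchingPursuit(n_nonzero_coefs=3)
            omp.fit(pixels[:, others], pixels[:, b])
            votes[others[np.abs(omp.coef_) > 0]] += 1
        return np.argsort(votes)[::-1][:n_bands]

    rng = np.random.default_rng(0)
    cube = rng.normal(size=(200, 30))      # toy "image": 200 pixels, 30 bands
    print(select_bands(cube, n_bands=5))
    ```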

  7. Wavelet data processing of micro-Raman spectra of biological samples

    Science.gov (United States)

    Camerlingo, C.; Zenone, F.; Gaeta, G. M.; Riccio, R.; Lepore, M.

    2006-02-01

    A wavelet multi-component decomposition algorithm is proposed for processing data from micro-Raman spectroscopy (μ-RS) of biological tissue. μ-RS has recently been recognized as a promising tool for biopsy tests and in vivo diagnosis of degenerative human tissue pathologies, due to the high chemical and structural information content of this spectroscopic technique. However, measurements of biological tissues are usually hampered by typically low-level signals and by the presence of noise and background components caused by light diffusion or fluorescence processes. In order to overcome these problems, a numerical method based on the discrete wavelet transform is used for the analysis of data from μ-RS measurements performed in vitro on animal (pig and chicken) tissue samples and, in a preliminary form, on human skin and oral tissue biopsies from normal subjects. Visible-light μ-RS was performed using a He-Ne laser and a monochromator with a liquid-nitrogen-cooled charge coupled device equipped with a grating of 1800 grooves mm⁻¹. The validity of the proposed data procedure has been tested on the well-characterized Raman spectra of reference acetylsalicylic acid samples.
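
    A minimal sketch of this kind of wavelet clean-up using PyWavelets (the wavelet, decomposition level, and thresholding choices are illustrative assumptions, not the authors' algorithm): the coarsest approximation captures the slowly varying fluorescence background and the finest details capture noise, so both are suppressed before reconstruction.

    ```python
    import numpy as np
    import pywt

    def wavelet_clean(spectrum, wavelet="sym8", level=6):
        """Suppress fluorescence background and noise in a Raman spectrum.

        Zero the coarsest approximation (slowly varying background) and
        soft-threshold the finest detail coefficients (noise), then reconstruct.
        """
        coeffs = pywt.wavedec(spectrum, wavelet, level=level)
        coeffs[0] = np.zeros_like(coeffs[0])                # drop the background
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # robust noise estimate
        coeffs[-1] = pywt.threshold(coeffs[-1], 3 * sigma)  # soft-threshold noise
        return pywt.waverec(coeffs, wavelet)[: len(spectrum)]

    x = np.linspace(0, 1, 1024)
    raw = (np.exp(-((x - 0.4) / 0.01) ** 2)                 # a Raman-like peak
           + 2 * x                                          # fluorescence-like slope
           + 0.05 * np.random.default_rng(0).normal(size=x.size))
    clean = wavelet_clean(raw)                              # peak survives, slope removed
    ```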

  8. Bayesian Analysis of two Censored Shifted Gompertz Mixture Distributions using Informative and Noninformative Priors

    Directory of Open Access Journals (Sweden)

    Tabassum Naz Sindhu

    2017-03-01

    Full Text Available This study deals with Bayesian analysis of a shifted Gompertz mixture model under type-I censored samples, assuming both informative and noninformative priors. We discuss the Bayesian estimation of the parameters of the shifted Gompertz mixture model under uniform and gamma priors, assuming three loss functions. Further, some properties of the model are discussed, with some graphs of the mixture density. These properties include Bayes estimators, posterior risks and the reliability function under a simulation scheme. Bayes estimates are obtained considering two cases: (a) when the shape parameter is known and (b) when all parameters are unknown. We analyzed some simulated data sets in order to investigate the effect of prior belief and loss functions, and the performance of the proposed set of estimators of the mixture model parameters.
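
    For reference, the component density being mixed, in its standard two-parameter form as quoted from the general literature (Bemmaor's shifted Gompertz; this paper may use a different parameterization), is

        f(x \mid b, \eta) = b\, e^{-bx}\, e^{-\eta e^{-bx}} \left[ 1 + \eta \left( 1 - e^{-bx} \right) \right], \qquad x \ge 0,\; b > 0,\; \eta > 0,

    with scale parameter b and shape parameter η; a two-component mixture then takes the form p f(x | b₁, η₁) + (1 - p) f(x | b₂, η₂).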

  9. Varying prior information in Bayesian inversion

    International Nuclear Information System (INIS)

    Walker, Matthew; Curtis, Andrew

    2014-01-01

    Bayes' rule is used to combine likelihood and prior probability distributions. The former represents knowledge derived from new data, the latter represents pre-existing knowledge; the Bayesian combination is the so-called posterior distribution, representing the resultant new state of knowledge. While varying the likelihood due to differing data observations is common, there are also situations where the prior distribution must be changed or replaced repeatedly. For example, in mixture density neural network (MDN) inversion, using current methods the neural network employed for inversion needs to be retrained every time prior information changes. We develop a method of prior replacement to vary the prior without re-training the network. Thus the efficiency of MDN inversions can be increased, typically by orders of magnitude when applied to geophysical problems. We demonstrate this for the inversion of seismic attributes in a synthetic subsurface geological reservoir model. We also present results which suggest that prior replacement can be used to control the statistical properties (such as variance) of the final estimate of the posterior in more general (e.g., Monte Carlo based) inverse problem solutions. (paper)
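
    The algebra behind prior replacement is compact. Writing Bayes' rule once with the old prior and once with the new, and assuming both priors are nonzero wherever the posterior has support (a sketch of the general identity, not the paper's exact derivation):

        p_{\mathrm{new}}(m \mid d) \;\propto\; p(d \mid m)\, p_{\mathrm{new}}(m) \;\propto\; p_{\mathrm{old}}(m \mid d)\, \frac{p_{\mathrm{new}}(m)}{p_{\mathrm{old}}(m)},

    since the likelihood can be recovered, up to a constant, as the old posterior divided by the old prior. A network trained to output the old posterior can therefore be reweighted by the prior ratio without retraining.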

  10. Modeling and validating Bayesian accrual models on clinical data and simulations using adaptive priors.

    Science.gov (United States)

    Jiang, Yu; Simon, Steve; Mayo, Matthew S; Gajewski, Byron J

    2015-02-20

    Slow recruitment in clinical trials leads to increased costs and resource utilization, which includes both clinic staff and patient volunteers. Careful planning and monitoring of the accrual process can prevent the unnecessary loss of these resources. We propose two hierarchical extensions to the existing Bayesian constant accrual model: the accelerated prior and the hedging prior. The newly proposed priors adaptively utilize the researcher's previous experience and the current accrual data to produce an estimate of trial completion time. The performance of these models, including prediction precision, coverage probability, and correct decision-making ability, is evaluated using actual studies from our cancer center and simulations. The results showed that a constant accrual model with strongly informative priors is very accurate when accrual is on target or slightly off, producing smaller mean squared error, a high percentage of coverage, and a high number of correct decisions as to whether or not to continue the trial, but it is strongly biased when off target. Flat or weakly informative priors provide protection against an off-target prior but are less efficient when the accrual is on target. The accelerated prior performs similarly to a strong prior. The hedging prior performs much like the weak priors when the accrual is extremely off target, but closer to the strong priors when the accrual is on target or only slightly off target. We suggest improvements in these models and propose new models for future research. Copyright © 2014 John Wiley & Sons, Ltd.
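
    The baseline these priors extend can be made concrete in a few lines: with constant (Poisson) accrual at rate λ and a conjugate Gamma(a, b) prior, observing n subjects in time t gives a Gamma(a + n, b + t) posterior, from which completion time is simulated directly. The sketch below shows only this constant accrual core; the numbers are fabricated, and the accelerated and hedging priors are hierarchical extensions not reproduced here.

```python
import numpy as np

def predict_completion(n_obs, t_obs, n_target, a, b, n_sim=10_000, rng=None):
    """Simulate total trial durations under the Bayesian constant
    accrual model with a conjugate Gamma(a, b) prior on the rate."""
    rng = rng or np.random.default_rng()
    lam = rng.gamma(a + n_obs, 1.0 / (b + t_obs), size=n_sim)  # posterior rates
    remaining = rng.gamma(n_target - n_obs, 1.0 / lam)  # time for the rest
    return t_obs + remaining

# e.g. 40 of 120 patients after 12 months; prior worth 30 patients in 10 months
times = predict_completion(40, 12.0, 120, a=30.0, b=10.0)
print(np.percentile(times, [2.5, 50, 97.5]))
```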

  11. 21 CFR 211.110 - Sampling and testing of in-process materials and drug products.

    Science.gov (United States)

    2010-04-01

    ... PHARMACEUTICALS Production and Process Controls § 211.110 Sampling and testing of in-process materials and drug... capsule weight variation; (2) Disintegration time; (3) Adequacy of mixing to assure uniformity and... production process, e.g., at commencement or completion of significant phases or after storage for long...

  12. Perilymph sampling from the cochlear apex: a reliable method to obtain higher purity perilymph samples from scala tympani.

    Science.gov (United States)

    Salt, Alec N; Hale, Shane A; Plontke, Stefan K R

    2006-05-15

    Measurements of drug levels in the fluids of the inner ear are required to establish kinetic parameters and to determine the influence of specific local delivery protocols. For most substances, this requires cochlear fluid samples to be obtained for analysis. When auditory function is of primary interest, the drug level in the perilymph of scala tympani (ST) is most relevant, since drug in this scala has ready access to the auditory sensory cells. In many prior studies, ST perilymph samples have been obtained from the basal turn, either by aspiration through the round window membrane (RWM) or through an opening in the bony wall. A number of studies have demonstrated that such samples are likely to be contaminated with cerebrospinal fluid (CSF). CSF enters the basal turn of ST through the cochlear aqueduct when the bony capsule is perforated or when fluid is aspirated. The degree of sample contamination has, however, not been widely appreciated. Recent studies have shown that perilymph samples taken through the round window membrane are highly contaminated with CSF, with samples greater than 2 μL in volume containing more CSF than perilymph. In spite of this knowledge, many groups continue to sample from the base of the cochlea, as it is a well-established method. We have developed an alternative, technically simple method to increase the proportion of ST perilymph in a fluid sample. The sample is taken from the apex of the cochlea, a site that is distant from the cochlear aqueduct. A previous problem with sampling through a perforation in the bone was that the native perilymph, driven by CSF pressure, rapidly leaked out and was lost to the middle ear space. We therefore developed a procedure to collect all the fluid that emerged from the apex after perforation. We evaluated the method using the marker ion trimethylphenylammonium (TMPA). TMPA was applied to the perilymph of guinea pigs either by RW irrigation or by microinjection into the apical turn.

  13. XRF analysis of mineralised samples

    International Nuclear Information System (INIS)

    Ahmedali, T.

    2002-01-01

    Full text: Software now supplied by instrument manufacturers has made it practical and convenient for users to analyse unusual samples routinely. Semiquantitative scanning software can be used for rapid preliminary screening of elements ranging from carbon to uranium, prior to assigning mineralised samples to an appropriate quantitative analysis routine. The general quality and precision of analytical results obtained from modern XRF spectrometers can be significantly enhanced by several means: a. Modifications in preliminary sample preparation can result in less contamination from crushing and grinding equipment, and optimised techniques of actual sample preparation can significantly increase the precision of results. b. Employment of automatic data-recording balances and the use of catch weights during sample preparation reduce technician time as well as weighing errors. c. Consistency of results can be improved significantly by the use of appropriate stable drift monitors with a statistically significant content of the analyte. d. A judicious selection of kV/mA combinations, analysing crystals, primary beam filters, collimators, peak positions, accurate background correction and peak overlap corrections, followed by the use of appropriate matrix correction procedures. e. Preventative maintenance procedures for XRF spectrometers and ancillary equipment, which can also contribute significantly to reducing instrument down times, are described. Examples of various facets of sample processing routines are given from the XRF spectrometer component of a multi-instrument analytical university facility, which provides XRF data to 17 Canadian universities. Copyright (2002) Australian X-ray Analytical Association Inc

  14. The Prior Internet Resources 2017

    DEFF Research Database (Denmark)

    Engerer, Volkmar Paul; Albretsen, Jørgen

    2017-01-01

    The Prior Internet Resources (PIR) are presented. Prior's unpublished scientific manuscripts and his vast letter correspondence with fellow researchers of the time, his Nachlass, are now subject to transcription by Prior researchers worldwide, and form an integral part of PIR. It is demonstrated...

  15. Processing a Complex Architectural Sampling with Meshlab: the Case of Piazza della Signoria

    Science.gov (United States)

    Callieri, M.; Cignoni, P.; Dellepiane, M.; Ranzuglia, G.; Scopigno, R.

    2011-09-01

    The paper presents a recent 3D scanning project performed with long range scanning technology, showing how a complex sampled dataset can be processed with the features available in MeshLab, an open source mesh processing system. MeshLab is a portable and extensible system aimed at helping with the processing of the typical not-so-small unstructured models that arise in 3D scanning, providing a set of tools for editing, cleaning, processing, inspecting, rendering and converting meshes. The MeshLab system started in late 2005 as part of a university course and has considerably evolved since then, thanks to the effort of the Visual Computing Lab and the support of several funded EC projects. MeshLab has so far gained excellent visibility and distribution, with several thousand downloads every month, and continues to evolve. The aim of this scanning campaign was to sample the façades of the buildings located in Piazza della Signoria (Florence, Italy). This digital 3D model was required, in the framework of a Regional Project, as a basic background model to present a complex set of images using a virtual navigation metaphor (following the PhotoSynth approach). Processing of complex datasets, such as the ones produced by long range scanners, often requires specialized, difficult-to-use and costly software packages. We show in the paper how it is possible to process this kind of data inside an open source tool, thanks to the many new features recently introduced in MeshLab for the management of large sets of sampled points.

  16. A mixed signal ECG processing platform with an adaptive sampling ADC for portable monitoring applications.

    Science.gov (United States)

    Kim, Hyejung; Van Hoof, Chris; Yazicioglu, Refet Firat

    2011-01-01

    This paper describes a mixed-signal ECG processing platform with a 12-bit ADC architecture that can adapt its sampling rate according to the input signal's rate of change. This enables the sampling of ECG signals at a significantly reduced data rate without loss of information. The presented adaptive sampling scheme reduces the ADC power consumption, enables the processing of ECG signals with lower power consumption, and reduces the power consumption of the radio while streaming the ECG signals. The test results show that running a CWT-based R-peak detection algorithm on the adaptively sampled ECG signals consumes only 45.6 μW and leads to 36% less overall system power consumption.
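
    A software analogue of the scheme may clarify the idea (the on-chip version adapts the ADC itself; the slope threshold and decimation factor below are arbitrary illustration values): samples are kept densely where the signal changes quickly, e.g. around the QRS complex, and sparsely elsewhere.

```python
import numpy as np

def adaptive_sample(signal, fs, slope_thresh, decimate=8):
    """Keep every sample where the local rate of change is high;
    elsewhere keep only every `decimate`-th sample."""
    slope = np.abs(np.gradient(signal)) * fs   # units per second
    keep = slope > slope_thresh
    keep[::decimate] = True                    # coarse baseline rate
    idx = np.flatnonzero(keep)
    return idx, signal[idx]                    # indices and kept samples
```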

  17. Differences in metabolite profiles caused by pre-analytical blood processing procedures.

    Science.gov (United States)

    Nishiumi, Shin; Suzuki, Makoto; Kobayashi, Takashi; Yoshida, Masaru

    2018-05-01

    Recently, the use of metabolomic analysis of human serum and plasma for biomarker discovery and disease diagnosis in clinical studies has been increasing. The feasibility of using a metabolite biomarker for disease diagnosis is strongly dependent on the metabolite's stability during pre-analytical blood processing procedures, such as serum or plasma sampling and sample storage prior to centrifugation. However, the influence of blood processing procedures on the stability of metabolites has not been fully characterized. In the present study, we compared the levels of metabolites in matched human serum and plasma samples using gas chromatography coupled with mass spectrometry and liquid chromatography coupled with mass spectrometry. In addition, we evaluated the changes in plasma metabolite levels induced by storage at room temperature or at a cold temperature prior to centrifugation. As a result, it was found that 76 metabolites exhibited significant differences between their serum and plasma levels. Furthermore, the pre-centrifugation storage conditions significantly affected the plasma levels of 45 metabolites. These results highlight the importance of blood processing procedures during metabolome analysis, which should be considered during biomarker discovery and the subsequent use of biomarkers for disease diagnosis. Copyright © 2017 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  18. A novel approach to process carbonate samples for radiocarbon measurements with helium carrier gas

    Energy Technology Data Exchange (ETDEWEB)

    Wacker, L., E-mail: wacker@phys.ethz.ch [Laboratory of Ion Beam Physics, ETH Zurich, 8093 Zurich (Switzerland); Fueloep, R.-H. [Institute of Geology and Mineralogy, University of Cologne, 50674 Cologne (Germany); Hajdas, I. [Laboratory of Ion Beam Physics, ETH Zurich, 8093 Zurich (Switzerland); Molnar, M. [Laboratory of Ion Beam Physics, ETH Zurich, 8093 Zurich (Switzerland); Institute of Nuclear Research, Hungarian Academy of Sciences, 4026 Debrecen (Hungary); Rethemeyer, J. [Institute of Geology and Mineralogy, University of Cologne, 50674 Cologne (Germany)

    2013-01-15

    Most laboratories prepare carbonate samples for radiocarbon analysis by acid decomposition in evacuated glass tubes and subsequent reduction of the evolved CO₂ to graphite in self-made reduction manifolds. This process is time consuming and labor intensive. In this work, we have tested a new approach for the preparation of carbonate samples in which any high-vacuum system is avoided and helium is used as a carrier gas. The liberation of CO₂ from carbonates with phosphoric acid is performed much as in stable isotope ratio mass spectrometry, where CO₂ is released with acid in a septum-sealed tube under a helium atmosphere. The formed CO₂ is then flushed, in a helium flow, through a double-walled needle from the tubes to the zeolite trap of the automated graphitization equipment (AGE). This essentially replaces the elemental analyzer normally used for the combustion of organic samples. The process can be fully automated, from sampling the released CO₂ in the septum-sealed tubes with a commercially available auto-sampler to graphitization with the AGE. The new method yields low sample blanks of about 50,000 years. Results for processed reference materials (IAEA-C2, FIRI-C) are in agreement with their consensus values.

  19. Automated processing of forensic casework samples using robotic workstations equipped with nondisposable tips: contamination prevention.

    Science.gov (United States)

    Frégeau, Chantal J; Lett, C Marc; Elliott, Jim; Yensen, Craig; Fourney, Ron M

    2008-05-01

    An automated process has been developed for the analysis of forensic casework samples using TECAN Genesis RSP 150/8 or Freedom EVO liquid handling workstations equipped exclusively with nondisposable tips. Robot tip cleaning routines have been incorporated strategically within the DNA extraction process as well as at the end of each session. Alternative options were examined for cleaning the tips and different strategies were employed to verify cross-contamination. A 2% sodium hypochlorite wash (1/5th dilution of the 10.8% commercial bleach stock) proved to be the best overall approach for preventing cross-contamination of samples processed using our automated protocol. The bleach wash steps do not adversely impact the short tandem repeat (STR) profiles developed from DNA extracted robotically and allow for major cost savings through the implementation of fixed tips. We have demonstrated that robotic workstations equipped with fixed pipette tips can be used with confidence with properly designed tip washing routines to process casework samples using an adapted magnetic bead extraction protocol.

  20. Estimating Ambiguity Preferences and Perceptions in Multiple Prior Models: Evidence from the Field

    NARCIS (Netherlands)

    S.G. Dimmock (Stephen); R.R.P. Kouwenberg (Roy); O.S. Mitchell (Olivia); K. Peijnenburg (Kim)

    2015-01-01

    We develop a tractable method to estimate multiple prior models of decision-making under ambiguity. In a representative sample of the U.S. population, we measure ambiguity attitudes in the gain and loss domains. We find that ambiguity aversion is common for uncertain events of

  1. Generalized Bayesian inference with sets of conjugate priors for dealing with prior-data conflict : course at Lund University

    NARCIS (Netherlands)

    Walter, G.

    2015-01-01

    In the Bayesian approach to statistical inference, possibly subjective knowledge on model parameters can be expressed by so-called prior distributions. A prior distribution is updated, via Bayes’ Rule, to the so-called posterior distribution, which combines prior information and information from

  2. The Importance of Prior Knowledge.

    Science.gov (United States)

    Cleary, Linda Miller

    1989-01-01

    Recounts a college English teacher's experience of reading and rereading Noam Chomsky, building up a greater store of prior knowledge. Argues that Frank Smith provides a theory for the importance of prior knowledge and Chomsky's work provided a personal example with which to interpret and integrate that theory. (RS)

  3. Large sample hydrology in NZ: Spatial organisation in process diagnostics

    Science.gov (United States)

    McMillan, H. K.; Woods, R. A.; Clark, M. P.

    2013-12-01

    A key question in hydrology is how to predict the dominant runoff generation processes in any given catchment. This knowledge is vital for a range of applications in forecasting hydrological response and related processes such as nutrient and sediment transport. A step towards this goal is to map dominant processes in locations where data are available. In this presentation, we use data from 900 flow gauging stations and 680 rain gauges in New Zealand to assess hydrological processes. These catchments range in character from rolling pasture, to alluvial plains, to temperate rainforest, to volcanic areas. By taking advantage of so many flow regimes, we harness the benefits of large-sample and comparative hydrology to study patterns and spatial organisation in runoff processes, and their relationship to physical catchment characteristics. The approach we use to assess hydrological processes is based on the concept of diagnostic signatures. Diagnostic signatures in hydrology are targeted analyses of measured data which allow us to investigate specific aspects of catchment response. We apply signatures which target the water balance, the flood response and the recession behaviour. We explore the organisation, similarity and diversity in hydrological processes across the New Zealand landscape, and how these patterns change with scale. We discuss our findings in the context of the strong hydro-climatic gradients in New Zealand, and consider the implications for hydrological model building on a national scale.

  4. Sampling in ecology and evolution - bridging the gap between theory and practice

    Science.gov (United States)

    Albert, C.H.; Yoccoz, N.G.; Edwards, T.C.; Graham, C.H.; Zimmermann, N.E.; Thuiller, W.

    2010-01-01

    Sampling is a key issue for answering most ecological and evolutionary questions. The importance of developing a rigorous sampling design tailored to specific questions has already been discussed in the ecological and sampling literature and has provided useful tools and recommendations for sampling and analysing ecological data. However, sampling issues are often difficult to overcome in ecological studies due to apparent inconsistencies between theory and practice, often leading to the implementation of simplified sampling designs that suffer from unknown biases. Moreover, we believe that classical sampling principles, which are based on the estimation of means and variances, are insufficient to fully address many ecological questions that rely on estimating relationships between a response and a set of predictor variables over time and space. Our objective is thus to highlight the importance of selecting an appropriate sampling space and an appropriate sampling design. We also emphasize the importance of using prior knowledge of the study system to estimate models or complex parameters and thus better understand ecological patterns and processes generating these patterns. Using a semi-virtual simulation study as an illustration, we reveal how the selection of the space (e.g. geographic, climatic) in which the sampling is designed influences the patterns that can ultimately be detected. We also demonstrate the inefficiency of common sampling designs in revealing response curves between ecological variables and climatic gradients. Further, we show that response-surface methodology, which has rarely been used in ecology, is much more efficient than more traditional methods. Finally, we discuss the use of prior knowledge, simulation studies and model-based designs in defining appropriate sampling designs. We conclude with a call for the development of methods to unbiasedly estimate nonlinear ecologically relevant parameters, in order to make inferences while fulfilling requirements of

  5. Separation/preconcentration of silver(I) and lead(II) in environmental samples on cellulose nitrate membrane filter prior to their flame atomic absorption spectrometric determinations

    International Nuclear Information System (INIS)

    Soylak, Mustafa; Cay, Rukiye Sungur

    2007-01-01

    An enrichment method for trace amounts of Ag(I) and Pb(II) has been established prior to their flame atomic absorption spectrometric determination. The preconcentration/separation procedure is based on chelate formation of Ag(I) and Pb(II) with ammonium pyrrolidine dithiocarbamate (APDC) and on retention of the chelates on a cellulose nitrate membrane filter. The influence of analytical parameters, including pH and reagent amounts, on the recoveries of the analytes was investigated. The effects of interfering ions on the quantitative recoveries of the analytes were also examined. The detection limits (k = 3, N = 11) were 4.6 μg L⁻¹ for silver(I) and 15.3 μg L⁻¹ for lead(II). The relative standard deviations (R.S.D.) of the determinations for the analyte ions were below 3%. The method was applied to environmental samples for the determination of the analyte ions with satisfactory results (recoveries >95%)

  6. Event Processing and Variable Part of Sample Period Determining in Combined Systems Using GA

    Science.gov (United States)

    Strémy, Maximilián; Závacký, Pavol; Jedlička, Martin

    2011-01-01

    This article deals with combined dynamic systems and modern techniques for handling them, focusing in particular on sampling period design, cyclic processing tasks, and related processing algorithms in combined event management systems using genetic algorithms.

  7. Sampling phased array - a new technique for ultrasonic signal processing and imaging

    OpenAIRE

    Verkooijen, J.; Boulavinov, A.

    2008-01-01

    Over the past 10 years, improvements in the field of microelectronics and computer engineering have led to significant advances in ultrasonic signal processing and image construction techniques that are currently being applied to non-destructive material evaluation. A new phased array technique, called 'Sampling Phased Array', has been developed at the Fraunhofer Institute for Non-Destructive Testing [1]. It realises a unique approach to the measurement and processing of ultrasonic signals.

  8. Disparities in Diagnoses Received Prior to a Diagnosis of Autism Spectrum Disorder

    Science.gov (United States)

    Mandell, David S.; Ittenbach, Richard F.; Levy, Susan E.; Pinto-Martin, Jennifer A.

    2007-01-01

    This study estimated differences by ethnicity in the diagnoses assigned prior to the diagnosis of autism. In this sample of 406 Medicaid-eligible children, African-Americans were 2.6 times less likely than white children to receive an autism diagnosis on their first specialty care visit. Among children who did not receive an autism diagnosis on…

  9. 7 CFR 52.22 - Report of inspection results prior to issuance of formal report.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Report of inspection results prior to issuance of... MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE REGULATIONS AND STANDARDS UNDER THE AGRICULTURAL MARKETING ACT OF 1946 PROCESSED FRUITS AND VEGETABLES, PROCESSED PRODUCTS...

  10. Tank 12H residuals sample analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Oji, L. N. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Shine, E. P. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Diprete, D. P. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Coleman, C. J. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Hay, M. S. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-06-11

    The Savannah River National Laboratory (SRNL) was requested by Savannah River Remediation (SRR) to provide sample preparation and analysis of the Tank 12H final characterization samples to determine the residual tank inventory prior to grouting. Eleven Tank 12H floor and mound residual material samples and three cooling coil scrape samples were collected and delivered to SRNL between May and August of 2014.

  11. Contamination risk of stable isotope samples during milling.

    Science.gov (United States)

    Isaac-Renton, M; Schneider, L; Treydte, K

    2016-07-15

    Isotope analysis of wood is an important tool in dendrochronology and ecophysiology. Prior to mass spectrometry analysis, wood must be homogenized, and a convenient method involves a ball mill capable of milling samples directly in sample tubes. However, sample-tube plastic can contaminate wood during milling, which could lead to biological misinterpretations. We tested possible contamination of whole wood and cellulose samples during ball-mill homogenization for carbon and oxygen isotope measurements. We used a multi-factorial design with two/three steel milling balls, two sample amounts (10 mg, 40 mg), and two milling times (5 min, 10 min). We further analyzed abrasion by milling empty tubes, and measured the isotope ratios of pure contaminants. A strong risk exists for carbon isotope bias through plastic contamination: the δ¹³C value of polypropylene deviated from the control by -6.77‰. Small fibers from PTFE filter bags used during cellulose extraction also present a risk, as the δ¹³C value of this plastic deviated by -5.02‰. Low sample amounts (10 mg) showed the highest contamination due to increased abrasion during milling (-1.34‰), which is further concentrated by cellulose extraction (-3.38‰). Oxygen isotope measurements were unaffected. A ball mill can be used to homogenize samples within test tubes prior to oxygen isotope analysis, but not prior to carbon or radiocarbon isotope analysis. There is still a need for a fast, simple and contamination-free sample preparation procedure. Copyright © 2016 John Wiley & Sons, Ltd.

  12. Qualification of Daiichi Units 1, 2, and 3 Data for Severe Accident Evaluations - Process and Illustrative Examples from Prior TMI-2 Evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Rempe, Joy Lynn [Idaho National Lab. (INL), Idaho Falls, ID (United States); Knudson, Darrell Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-09-01

    The accidents at the Three Mile Island Unit 2 (TMI-2) Pressurized Water Reactor (PWR) and the Daiichi Units 1, 2, and 3 Boiling Water Reactors (BWRs) provide unique opportunities to evaluate instrumentation exposed to severe accident conditions. Conditions associated with the release of coolant and the hydrogen burn that occurred during the TMI-2 accident exposed instrumentation to harsh conditions, including direct radiation, radioactive contamination, and high humidity with elevated temperatures and pressures. As part of a program initiated in 2012 by the Department of Energy Office of Nuclear Energy (DOE-NE), a review was completed to gain insights from prior TMI-2 sensor survivability and data qualification efforts. This initial review focused on the set of sensors deemed most important by post-TMI-2 instrumentation evaluation programs. These programs focused on data required by TMI-2 operators to assess the condition of the reactor and containment and the effect of mitigating actions taken by these operators. In addition, prior efforts focused on sensors providing data required for subsequent forensic evaluations and accident simulations. To encourage similar activities for qualifying data from Daiichi Units 1, 2, and 3, this report provides additional details related to the formal process used to develop a qualified TMI-2 database and presents data qualification details for three parameters: primary system pressure, containment building temperature, and containment pressure. As described within this report, sensor evaluations and data qualification required implementation of various processes, including comparisons with data from other sensors, analytical calculations, laboratory testing, and comparisons with sensors subjected to similar conditions in large-scale integral tests and with sensors that were similar in design to instruments easily removed from the TMI-2 plant for evaluations.

  13. Robowell: An automated process for monitoring ground water quality using established sampling protocols

    Science.gov (United States)

    Granato, G.E.; Smith, K.P.

    1999-01-01

    Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. Robowell was developed and tested to provide a cost-effective monitoring system that meets protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method were equivalent to data obtained by manual sampling methods using the same sampling protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S. Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality, which proved the method and characterized the screened portion of the plume in detail through time. The methods and the case study described are presented to document the process for future use.
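
    As a hedged illustration of the purge logic such a system automates (the criterion and tolerances below are assumptions, not the published Robowell/USGS values): purging typically continues until successive readings of each monitored property agree within a tolerance.

```python
def stabilized(readings, tol):
    """True when the three most recent readings agree within `tol`,
    a common stabilization criterion during well purging."""
    last = readings[-3:]
    return len(last) == 3 and max(last) - min(last) <= tol

# e.g. specific conductance in uS/cm and pH, with illustrative tolerances
purge_ok = (stabilized([412.0, 410.5, 411.2], tol=3.0)
            and stabilized([6.91, 6.93, 6.92], tol=0.1))
```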

  14. Divergent Priors and well Behaved Bayes Factors

    NARCIS (Netherlands)

    R.W. Strachan (Rodney); H.K. van Dijk (Herman)

    2011-01-01

    Divergent priors are improper when defined on unbounded supports. Bartlett's paradox has been taken to imply that using improper priors results in ill-defined Bayes factors, preventing model comparison by posterior probabilities. However, many improper priors have attractive properties

  15. Apples and oranges: avoiding different priors in Bayesian DNA sequence analysis

    Directory of Open Access Journals (Sweden)

    Posch Stefan

    2010-03-01

    Full Text Available Abstract Background: One of the challenges of bioinformatics remains the recognition of short signal sequences in genomic DNA such as donor or acceptor splice sites, splicing enhancers or silencers, translation initiation sites, transcription start sites, transcription factor binding sites, nucleosome binding sites, miRNA binding sites, or insulator binding sites. During the last decade, a wealth of algorithms for the recognition of such DNA sequences has been developed and compared, with the goal of improving their performance and deepening our understanding of the underlying cellular processes. Most of these algorithms are based on statistical models belonging to the family of Markov random fields such as position weight matrix models, weight array matrix models, Markov models of higher order, or moral Bayesian networks. While in many comparative studies different learning principles or different statistical models have been compared, the influence of choosing different prior distributions for the model parameters when using different learning principles has been overlooked, possibly leading to questionable conclusions. Results: With the goal of allowing direct comparisons of different learning principles for models from the family of Markov random fields based on the same a-priori information, we derive a generalization of the commonly-used product-Dirichlet prior. We find that the derived prior behaves like a Gaussian prior close to the maximum and like a Laplace prior in the far tails. In two case studies, we illustrate the utility of the derived prior for a direct comparison of different learning principles with different models for the recognition of binding sites of the transcription factor Sp1 and human donor splice sites. Conclusions: We find that comparisons of different learning principles using the same a-priori information can lead to conclusions different from those of previous studies in which the effect resulting from different
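
    To make the role of such priors concrete, the sketch below shows the usual way a product-Dirichlet prior enters position weight matrix estimation: the Dirichlet hyperparameters act as pseudocounts added to the observed nucleotide counts. The toy counts and the symmetric hyperparameter are assumptions; the paper's generalized prior is not reproduced here.

```python
import numpy as np

# Toy count matrix: rows = A, C, G, T; columns = motif positions.
counts = np.array([[8., 1., 0.],
                   [1., 7., 2.],
                   [0., 1., 9.],
                   [1., 1., 2.]])

alpha = 0.5  # symmetric product-Dirichlet hyperparameter (assumed value)

# Posterior-mean PWM: hyperparameters act as per-position pseudocounts.
pwm = (counts + alpha) / (counts + alpha).sum(axis=0, keepdims=True)
print(pwm.round(3))
```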

  16. Stochastic, goal-oriented rapid impact modeling of uncertainty and environmental impacts in poorly-sampled sites using ex-situ priors

    Science.gov (United States)

    Li, Xiaojun; Li, Yandong; Chang, Ching-Fu; Tan, Benjamin; Chen, Ziyang; Sege, Jon; Wang, Changhong; Rubin, Yoram

    2018-01-01

    Modeling of uncertainty associated with subsurface dynamics has long been a major research topic, and its significance is widely recognized for real-life applications. Despite the huge effort invested in the area, major obstacles still remain on the way from theory to applications. Particularly problematic here is the confusion between modeling uncertainty and modeling spatial variability, which translates into a (mis)conception, in fact an inconsistency, in that it suggests that modeling of uncertainty and modeling of spatial variability are equivalent and, as such, both require a lot of data. This paper investigates this challenge against the backdrop of a 7 km long, deep underground tunnel in China, where environmental impacts are of major concern. We approach the data challenge by pursuing a new concept for Rapid Impact Modeling (RIM), which bypasses altogether the need to estimate posterior distributions of model parameters, focusing instead on detailed stochastic modeling of impacts, conditional on all available information, including prior ex-situ information as well as in-situ measurements. A foundational element of RIM is the construction of informative priors for target parameters using ex-situ data, relying on ensembles of well-documented sites, pre-screened for geological and hydrological similarity to the target site. The ensembles are built around two sets of similarity criteria: a physically-based set of criteria and an additional set covering epistemic criteria. In another variation on common Bayesian practice, we update the priors to obtain conditional distributions of the target (environmental impact) dependent variables rather than the hydrological variables. This recognizes that goal-oriented site characterization is in many cases more useful in applications than parameter-oriented characterization.

  17. Minimizing technical variation during sample preparation prior to label-free quantitative mass spectrometry.

    Science.gov (United States)

    Scheerlinck, E; Dhaenens, M; Van Soom, A; Peelman, L; De Sutter, P; Van Steendam, K; Deforce, D

    2015-12-01

    Sample preparation is the crucial starting point for obtaining high-quality mass spectrometry data and can be divided into two main steps in a bottom-up proteomics approach: cell/tissue lysis with or without detergents, and a(n) (in-solution) digest comprising denaturation, reduction, alkylation, and digestion of the proteins. Important considerations include that the reagents used for sample preparation can inhibit the digestion enzyme (e.g., 0.1% sodium dodecyl sulfate [SDS] and 0.5 M guanidine HCl), give rise to ion suppression (e.g., polyethylene glycol [PEG]), be incompatible with liquid chromatography-tandem mass spectrometry (LC-MS/MS) (e.g., SDS), and induce additional modifications (e.g., urea). Taken together, these irreproducible effects become a problem when label-free quantitation of the samples is envisioned, such as in the increasingly popular high-definition mass spectrometry (HDMS(E)) and sequential window acquisition of all theoretical fragment ion spectra (SWATH) data-independent acquisition strategies. Here, we describe the detailed validation of a reproducible method with sufficient protein yield for sample preparation without any known LC-MS/MS interfering substances, using 1% sodium deoxycholate (SDC) during both cell lysis and the in-solution digest. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  18. New methods for sampling sparse populations

    Science.gov (United States)

    Anna Ringvall

    2007-01-01

    To improve surveys of sparse objects, methods that use auxiliary information have been suggested. Guided transect sampling uses prior information, e.g., from aerial photographs, for the layout of survey strips. Instead of being laid out straight, the strips will wind between potentially more interesting areas. 3P sampling (probability proportional to prediction) uses...

  19. PERIOD ESTIMATION FOR SPARSELY SAMPLED QUASI-PERIODIC LIGHT CURVES APPLIED TO MIRAS

    Energy Technology Data Exchange (ETDEWEB)

    He, Shiyuan; Huang, Jianhua Z.; Long, James [Department of Statistics, Texas A and M University, College Station, TX (United States); Yuan, Wenlong; Macri, Lucas M., E-mail: lmacri@tamu.edu [George P. and Cynthia W. Mitchell Institute for Fundamental Physics and Astronomy, Department of Physics and Astronomy, Texas A and M University, College Station, TX (United States)

    2016-12-01

    We develop a nonlinear semi-parametric Gaussian process model to estimate periods of Miras with sparsely sampled light curves. The model uses a sinusoidal basis for the periodic variation and a Gaussian process for the stochastic changes. We use maximum likelihood to estimate the period and the parameters of the Gaussian process, while integrating out the effects of other nuisance parameters in the model with respect to a suitable prior distribution obtained from earlier studies. Since the likelihood is highly multimodal in the period, we implement a hybrid method that applies the quasi-Newton algorithm to the Gaussian process parameters and searches the period/frequency parameter space over a dense grid. A large-scale, high-fidelity simulation is conducted to mimic the sampling quality of Mira light curves obtained by the M33 Synoptic Stellar Survey. The simulated data set is publicly available and can serve as a testbed for future evaluation of different period estimation methods. The semi-parametric model outperforms an existing algorithm on this simulated test data set, as measured by period recovery rate and the quality of the resulting period–luminosity relations.
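
    Stripped of the Gaussian process component, the grid-search core of such a method reduces to profiling a sinusoid-plus-constant model over a dense frequency grid, as in the minimal sketch below (the frequency range and the simple inverse-variance weighting are assumptions; the GP term for stochastic deviations and the nuisance-parameter integration are omitted).

```python
import numpy as np

def estimate_period(t, y, yerr, freqs):
    """Return the period whose sinusoid-plus-constant fit minimizes
    the weighted residual sum of squares over a frequency grid."""
    w = 1.0 / yerr
    best_rss, best_f = np.inf, freqs[0]
    for f in freqs:
        X = np.column_stack([np.ones_like(t),
                             np.sin(2 * np.pi * f * t),
                             np.cos(2 * np.pi * f * t)])
        beta, *_ = np.linalg.lstsq(X * w[:, None], y * w, rcond=None)
        rss = np.sum(((y - X @ beta) * w) ** 2)
        if rss < best_rss:
            best_rss, best_f = rss, f
    return 1.0 / best_f

# e.g. freqs = np.linspace(1 / 1000.0, 1 / 100.0, 20_000)  # Mira-like periods, days
```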

  20. Emulsification based dispersive liquid microextraction prior to flame atomic absorption spectrometry for the sensitive determination of Cd(II) in water samples

    International Nuclear Information System (INIS)

    Rahimi-Nasrabadi, Mehdi; Banan, Alireza; Zahedi, Mir Mahdi; Pourmortazavi, Seied Mahdi; Nazari, Zakieh; Asghari, Alireza

    2013-01-01

    We report on the application of emulsification-based dispersive liquid microextraction (EB-DLME) to the preconcentration of Cd(II). This procedure not only possesses all the advantages of routine DLLME, but also results in a more stable cloudy state, which is particularly useful when coupling it to FAAS. In EB-DLME, appropriate amounts of the extraction solvent (a solution of dithizone in chloroform) and an aqueous solution of sodium dodecyl sulfate (SDS; acting as a disperser) are injected into the samples. A stable cloudy microemulsion is formed and Cd(II) ion is extracted by chelation. After phase separation, the sedimented phase is subjected to FAAS. Under optimized conditions, the calibration curve for Cd(II) is linear in the range from 0.1 to 25 μg L⁻¹, the limit of detection (at S/N = 3) is 30 pg L⁻¹, the relative standard deviation for seven replicate analyses (at 0.56 μg L⁻¹ of Cd(II)) is 4.6%, and the enrichment factor is 151. EB-DLME is, in our opinion, a simple, efficient and rapid method for the preconcentration of Cd(II) (and most likely of many other ions) prior to FAAS determination. (author)

  1. The variance quadtree algorithm: use for spatial sampling design

    NARCIS (Netherlands)

    Minasny, B.; McBratney, A.B.; Walvoort, D.J.J.

    2007-01-01

    Spatial sampling schemes are mainly developed to determine sampling locations that can cover the variation of environmental properties in the area of interest. Here we propose the variance quadtree algorithm for sampling in an area with prior information represented as ancillary or secondary

  2. Improvements to sample processing and measurement to enable more widespread environmental application of tritium

    Energy Technology Data Exchange (ETDEWEB)

    Moran, James; Alexander, Thomas; Aalseth, Craig; Back, Henning; Mace, Emily; Overman, Cory; Seifert, Allen; Freeburg, Wilcox

    2017-08-01

    Previous measurements have demonstrated the wealth of information that tritium (T) can provide on environmentally relevant processes. We present modifications to sample preparation approaches that enable T measurement by proportional counting on small sample sizes equivalent to 120 mg of water and demonstrate the accuracy of these methods on a suite of standardized water samples. This enhanced method should provide the analytical flexibility needed to address persistent knowledge gaps in our understanding of T behavior in the environment.

  3. Control Charts for Processes with an Inherent Between-Sample Variation

    Directory of Open Access Journals (Sweden)

    Eva Jarošová

    2018-06-01

    Full Text Available A number of processes to which statistical control is applied are subject to various effects that cause random changes in the mean value. The removal of these fluctuations is either technologically impossible or economically disadvantageous under current conditions. The frequent occurrence of signals in the Shewhart chart due to these fluctuations is then undesirable and therefore the conventional control limits need to be extended. Several approaches to the design of the control charts with extended limits are presented in the paper and applied on the data from a real production process. The methods assume samples of size greater than 1. The performance of the charts is examined using the operating characteristic and average run length. The study reveals that in many cases, reducing the risk of false alarms is insufficient.
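
    A minimal sketch of the extended-limits idea, under the usual random-effects model x_ij = mu + b_i + e_ij: the variance of a subgroup mean is then sigma_b^2 + sigma_w^2/n rather than sigma_w^2/n, and the 3-sigma limits widen accordingly. The moment estimator below is one standard way to split the two components; it is an illustration, not the specific design studied in the paper.

```python
import numpy as np

def xbar_limits(subgroups, extended=True):
    """3-sigma limits for subgroup means with an inherent
    between-sample variance component: Var(mean_i) = s_b^2 + s_w^2 / n."""
    X = np.asarray(subgroups, dtype=float)   # shape (k subgroups, n each)
    k, n = X.shape
    means = X.mean(axis=1)
    s_w2 = X.var(axis=1, ddof=1).mean()      # within-sample variance
    s_m2 = means.var(ddof=1)                 # variance of subgroup means
    s_b2 = max(s_m2 - s_w2 / n, 0.0)         # between-sample component
    var = (s_b2 + s_w2 / n) if extended else (s_w2 / n)
    center = means.mean()
    half = 3.0 * np.sqrt(var)
    return center - half, center + half
```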

  4. A Two-Stage Maximum Entropy Prior of Location Parameter with a Stochastic Multivariate Interval Constraint and Its Properties

    Directory of Open Access Journals (Sweden)

    Hea-Jung Kim

    2016-05-01

    Full Text Available This paper proposes a two-stage maximum entropy prior to elicit uncertainty regarding a multivariate interval constraint of the location parameter of a scale mixture of normal model. Using Shannon’s entropy, this study demonstrates how the prior, obtained by using two stages of a prior hierarchy, appropriately accounts for the information regarding the stochastic constraint and suggests an objective measure of the degree of belief in the stochastic constraint. The study also verifies that the proposed prior plays the role of bridging the gap between the canonical maximum entropy prior of the parameter with no interval constraint and that with a certain multivariate interval constraint. It is shown that the two-stage maximum entropy prior belongs to the family of rectangle screened normal distributions that is conjugate for samples from a normal distribution. Some properties of the prior density, useful for developing a Bayesian inference of the parameter with the stochastic constraint, are provided. We also propose a hierarchical constrained scale mixture of normal model (HCSMN), which uses the prior density to estimate the constrained location parameter of a scale mixture of normal model and demonstrates the scope of its applicability.
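
    Although the paper's two-stage hierarchy is not reproduced in this record, the maximum-entropy mechanism it builds on can be stated in one line (a generic derivation, not the paper's exact construction): maximizing entropy relative to a base density subject to a constraint that assigns probability γ to the constrained region C yields an exponentially tilted, i.e. "screened", density.

```latex
\pi^{*}(\theta) \;\propto\; \pi_{0}(\theta)\,
\exp\!\left\{\lambda\,\mathbf{1}_{C}(\theta)\right\},
\qquad
\lambda \ \text{chosen so that} \ \int_{C}\pi^{*}(\theta)\,\mathrm{d}\theta=\gamma .
```

    The base density is thus up- or down-weighted inside C, which is consistent with the rectangle screened normal family described in the abstract.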

  5. Synchrotron micro-diffraction analysis of the microstructure of cryogenically treated high performance tool steels prior to and after tempering

    Energy Technology Data Exchange (ETDEWEB)

    Xu, N.; Cavallaro, G.P. [Applied Centre for Structural and Synchrotron Studies, Mawson Lakes Blvd, University of South Australia, Mawson Lakes, South Australia 5095 (Australia); Gerson, A.R., E-mail: Andrea.Gerson@unisa.edu.au [Applied Centre for Structural and Synchrotron Studies, Mawson Lakes Blvd, University of South Australia, Mawson Lakes, South Australia 5095 (Australia)

    2010-10-15

    The phase transformation and strain changes within cryogenically (−196 °C) treated high performance tool steels (AISI H13) before and after tempering have been examined using both laboratory XRD and synchrotron micro-diffraction. The martensitic unit cell was found to have very low tetragonality, as expected for low-carbon steel. Tempering resulted in the diffusion of excess carbon out of the martensite phase and consequent unit cell shrinkage. In addition, on tempering the martensite became more homogeneous compared to the same samples prior to tempering. For cryogenically treated samples, the effect was most pronounced for the rapidly cooled sample, which was the least homogeneous sample prior to tempering but the most homogeneous sample after tempering. This suggests that the considerable degree of disorder resulting from rapid cryogenic cooling results in the beneficial release of micro-stresses on tempering, thus possibly explaining the improved wear resistance and durability observed for cryogenically treated tool steels.

  6. Synchrotron micro-diffraction analysis of the microstructure of cryogenically treated high performance tool steels prior to and after tempering

    International Nuclear Information System (INIS)

    Xu, N.; Cavallaro, G.P.; Gerson, A.R.

    2010-01-01

    The phase transformation and strain changes within cryogenically (−196 °C) treated high performance tool steels (AISI H13) before and after tempering have been examined using both laboratory XRD and synchrotron micro-diffraction. The martensitic unit cell was found to have very low tetragonality, as expected for low-carbon steel. Tempering resulted in the diffusion of excess carbon out of the martensite phase and consequent unit cell shrinkage. In addition, on tempering the martensite became more homogeneous compared to the same samples prior to tempering. For cryogenically treated samples, the effect was most pronounced for the rapidly cooled sample, which was the least homogeneous sample prior to tempering but the most homogeneous sample after tempering. This suggests that the considerable degree of disorder resulting from rapid cryogenic cooling results in the beneficial release of micro-stresses on tempering, thus possibly explaining the improved wear resistance and durability observed for cryogenically treated tool steels.

  7. Note: A simple image processing based fiducial auto-alignment method for sample registration.

    Science.gov (United States)

    Robertson, Wesley D; Porto, Lucas R; Ip, Candice J X; Nantel, Megan K T; Tellkamp, Friedjof; Lu, Yinfei; Miller, R J Dwayne

    2015-08-01

    A simple method for the location and auto-alignment of sample fiducials for sample registration using widely available MATLAB/LabVIEW software is demonstrated. The method is robust, easily implemented, and applicable to a wide variety of experiment types for improved reproducibility and increased setup speed. The software uses image processing to locate and measure the diameter and center point of circular fiducials for distance self-calibration and iterative alignment and can be used with most imaging systems. The method is demonstrated to be fast and reliable in locating and aligning sample fiducials, provided here by a nanofabricated array, with accuracy within the optical resolution of the imaging system. The software was further demonstrated to register, load, and sample the dynamically wetted array.
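
    As a rough open-source analogue of the circle-fiducial step (the authors used MATLAB/LabVIEW; the file name, fiducial diameter, and Hough parameters below are placeholders, not values from the note):

```python
import cv2

KNOWN_DIAM_UM = 100.0  # assumed fiducial diameter on the nanofabricated array

img = cv2.imread("fiducials.png", cv2.IMREAD_GRAYSCALE)  # hypothetical image
img = cv2.medianBlur(img, 5)
circles = cv2.HoughCircles(img, cv2.HOUGH_GRADIENT, dp=1, minDist=50,
                           param1=100, param2=30, minRadius=10, maxRadius=80)
if circles is not None:
    x, y, r = circles[0][0]                    # centre and radius in pixels
    um_per_px = KNOWN_DIAM_UM / (2.0 * r)      # distance self-calibration
    print(f"centre = ({x:.1f}, {y:.1f}) px, scale = {um_per_px:.3f} um/px")
```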

  8. Prior knowledge in recalling arguments in bioethical dilemmas

    Directory of Open Access Journals (Sweden)

    Hiemke Katharina Schmidt

    2015-09-01

    Full Text Available Prior knowledge is known to facilitate learning new information. Normally, in studies confirming this outcome, the relationship between prior knowledge and the topic to be learned is obvious: the information to be acquired is part of the domain or topic to which the prior knowledge belongs. This raises the question as to whether prior knowledge of various domains facilitates recalling information. In this study, 79 eleventh-grade students completed a questionnaire on their prior knowledge of seven different domains related to the bioethical dilemma of prenatal diagnostics. The students read a text containing arguments for and arguments against prenatal diagnostics. After one week, and again 12 weeks later, they were asked to write down all the arguments they remembered. Prior knowledge helped them recall the arguments one week (r = .350) and 12 weeks (r = .316) later. Prior knowledge of three of the seven domains significantly helped them recall the arguments one week later (correlations between r = .194 and r = .394). Partial correlations with interest as a control item revealed that interest did not explain the relationship between prior knowledge and recall. Prior knowledge of different domains jointly supports the recall of arguments related to bioethical topics.

  9. Crystalline Silicon Solar Cells with Thin Silicon Passivation Film Deposited prior to Phosphorous Diffusion

    Directory of Open Access Journals (Sweden)

    Ching-Tao Li

    2014-01-01

    Full Text Available We demonstrate the performance improvement of p-type single-crystalline silicon (sc-Si) solar cells resulting from front surface passivation by a thin amorphous silicon (a-Si) film deposited prior to phosphorus diffusion. The conversion efficiency was improved for the sample with an a-Si film of ~5 nm thickness deposited on the front surface prior to high-temperature phosphorus diffusion, with respect to samples with an a-Si film deposited on the front surface after phosphorus diffusion. The improvement in conversion efficiency is 0.4% absolute with respect to a-Si-film-passivated cells, that is, cells with an a-Si film deposited on the front surface after phosphorus diffusion. The new technique provided a 0.5% improvement in conversion efficiency compared to cells without a-Si passivation. Such performance improvements result from reduced surface recombination as well as lowered contact resistance, the latter of which induces a high fill factor in the solar cell.

  10. Statin Eligibility and Outpatient Care Prior to ST-Segment Elevation Myocardial Infarction.

    Science.gov (United States)

    Miedema, Michael D; Garberich, Ross F; Schnaidt, Lucas J; Peterson, Erin; Strauss, Craig; Sharkey, Scott; Knickelbine, Thomas; Newell, Marc C; Henry, Timothy D

    2017-04-12

    The impact of the 2013 American College of Cardiology/American Heart Association cholesterol guidelines on statin eligibility in individuals otherwise destined to experience cardiovascular disease (CVD) events is unclear. We analyzed a prospective cohort of consecutive ST-segment elevation myocardial infarction (STEMI) patients from a regional STEMI system with data on patient demographics, low-density lipoprotein cholesterol levels, CVD risk factors, medication use, and outpatient visits over the 2 years prior to STEMI. We determined pre-STEMI eligibility according to the American College of Cardiology/American Heart Association guidelines and the prior Third Report of the Adult Treatment Panel (ATP III) guidelines. Our sample included 1062 patients with a mean age of 63.7 (13.0) years (72.5% male), and 761 (71.7%) did not have known CVD prior to STEMI. Only 62.5% and 19.3% of individuals with and without prior CVD were taking a statin before STEMI, respectively. In individuals not taking a statin, median (interquartile range) low-density lipoprotein cholesterol levels in those with and without known CVD were low (108 [83, 138] mg/dL and 110 [87, 133] mg/dL). For individuals not taking a statin, only 38.7% were statin eligible by ATP III guidelines. Conversely, 79.0% would have been statin eligible according to the American College of Cardiology/American Heart Association guidelines. Less than half of individuals with (49.2%) and without (41.1%) prior CVD had seen a primary care provider during the 2 years prior to STEMI. In a large cohort of STEMI patients, application of the American College of Cardiology/American Heart Association guidelines more than doubled pre-STEMI statin eligibility compared with the Third Report of the Adult Treatment Panel guidelines. However, access to and utilization of health care, a necessity for guideline implementation, was suboptimal prior to STEMI. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.

  11. The UK Biobank sample handling and storage protocol for the collection, processing and archiving of human blood and urine.

    Science.gov (United States)

    Elliott, Paul; Peakman, Tim C

    2008-04-01

    UK Biobank is a large prospective study in the UK to investigate the role of genetic factors, environmental exposures and lifestyle in the causes of major diseases of late and middle age. Extensive data and biological samples are being collected from 500,000 participants aged between 40 and 69 years. The biological samples that are collected, and how they are processed and stored, will have a major impact on the future scientific usefulness of the UK Biobank resource. The aim of the UK Biobank sample handling and storage protocol is to specify methods for the collection and storage of participant samples that give maximum scientific return within the available budget. Processing or storage methods that, as far as can be predicted, would preclude current or future assays have been avoided. The protocol was developed through a review of the literature on sample handling and processing, wide consultation within the academic community and peer review. Protocol development addressed which samples should be collected, how and when they should be processed, and how the processed samples should be stored to ensure their long-term integrity. The recommended protocol was extensively tested in a series of validation studies. UK Biobank collects about 45 ml of blood and 9 ml of urine with minimal local processing from each participant using the vacutainer system. A variety of preservatives, anti-coagulants and clot accelerators is used, appropriate to the expected end use of the samples. Collection of other material (hair, nails, saliva and faeces) was also considered but rejected for the full cohort. Blood and urine samples from participants are transported overnight by commercial courier to a central laboratory, where they are processed and aliquots of urine, plasma, serum, white cells and red cells stored in ultra-low temperature archives. Aliquots of whole blood are also stored for potential future production of immortalized cell lines.

  12. Characterization of the Three Mile Island Unit-2 reactor building atmosphere prior to the reactor building purge

    International Nuclear Information System (INIS)

    Hartwell, J.K.; Mandler, J.W.; Duce, S.W.; Motes, B.G.

    1981-05-01

    The Three Mile Island Unit-2 reactor building atmosphere was sampled prior to the reactor building purge. Samples of the containment atmosphere were obtained using specialized sampling equipment installed through penetration R-626 at the 358-foot (109-meter) level of the TMI-2 reactor building. The samples were subsequently analyzed for radionuclide concentration and for gaseous molecular components (O₂, N₂, etc.) by two independent laboratories at the Idaho National Engineering Laboratory (INEL). The sampling procedures, analysis methods, and results are summarized.

  13. Analysis of the 2H-evaporator scale samples (HTF-17-56, -57)

    Energy Technology Data Exchange (ETDEWEB)

    Hay, M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Coleman, C. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Diprete, D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-09-13

Savannah River National Laboratory analyzed scale samples from both the wall and cone sections of the 242-16H Evaporator prior to chemical cleaning. The samples were analyzed for the uranium and plutonium isotopes required for a Nuclear Criticality Safety Assessment of the scale removal process. The analysis found the material to contain crystalline nitrated cancrinite and clarkeite. Samples from both the wall and cone contain depleted uranium. Uranium concentrations of 16.8 wt% and 4.76 wt% were measured in the wall and cone samples, respectively. The ratio of plutonium isotopes in both samples is ~85% Pu-239 and ~15% Pu-238 by mass, and the plutonium shows approximately the same 3.5-times-higher concentration in the wall sample versus the cone sample as observed for uranium. The mercury concentrations measured in the scale samples were higher than previously reported values: the wall sample contains 19.4 wt% mercury and the cone scale sample 11.4 wt%. The results from the current scale samples show reasonable agreement with previous 242-16H Evaporator scale sample analyses; however, the uranium concentration in the current wall sample is substantially higher than previous measurements.

  14. Spectral BRDF measurements of metallic samples for laser processing applications

    International Nuclear Information System (INIS)

    Vitali, L; Fustinoni, D; Gramazio, P; Niro, A

    2015-01-01

The spectral bidirectional reflectance distribution function (BRDF) of metals plays an important role in industrial processing involving laser-surface interaction. In particular, in laser metal machining, absorbance is strongly dependent on the radiation incidence angle as well as on the finishing and contamination grade of the surface, and in turn it can considerably affect processing results. More recently, laser radiation has also been used to structure metallic surfaces in order to produce particular optical effects, ranging from high-level polishing to angular color shifting. Full knowledge of the spectral BRDF of these structured layers makes it possible to infer reflectance or color for any irradiation and viewing angles. In this paper, we present Vis-NIR spectral BRDF measurements of laser-polished metallic, opaque, flat samples commonly employed in such applications. The resulting optical properties appear to depend on the atmospheric composition during the polishing process in addition to the roughness. The measurements are carried out with a Perkin Elmer Lambda 950 double-beam spectrophotometer, equipped with the Absolute Reflectance/Transmittance Analyzer (ARTA) motorized goniometer. (paper)

  15. Hybrid nested sampling algorithm for Bayesian model selection applied to inverse subsurface flow problems

    International Nuclear Information System (INIS)

    Elsheikh, Ahmed H.; Wheeler, Mary F.; Hoteit, Ibrahim

    2014-01-01

A Hybrid Nested Sampling (HNS) algorithm is proposed for efficient Bayesian model calibration and prior model selection. The proposed algorithm combines the Nested Sampling (NS) algorithm, Hybrid Monte Carlo (HMC) sampling, and gradient estimation using the Stochastic Ensemble Method (SEM). NS is an efficient sampling algorithm that can be used for Bayesian calibration and for estimating the Bayesian evidence for prior model selection, and it has the advantage of computational feasibility. Within the nested sampling algorithm, a constrained sampling step is performed; for this step, we utilize HMC to reduce the correlation between successive sampled states. HMC relies on the gradient of the logarithm of the posterior distribution, which we estimate using a stochastic ensemble method based on an ensemble of directional derivatives. SEM requires only forward model runs, so the simulator is used as a black box and no adjoint code is needed. The developed HNS algorithm is successfully applied for Bayesian calibration and prior model selection of several nonlinear subsurface flow problems.

  16. Hybrid nested sampling algorithm for Bayesian model selection applied to inverse subsurface flow problems

    Energy Technology Data Exchange (ETDEWEB)

    Elsheikh, Ahmed H., E-mail: aelsheikh@ices.utexas.edu [Institute for Computational Engineering and Sciences (ICES), University of Texas at Austin, TX (United States); Institute of Petroleum Engineering, Heriot-Watt University, Edinburgh EH14 4AS (United Kingdom); Wheeler, Mary F. [Institute for Computational Engineering and Sciences (ICES), University of Texas at Austin, TX (United States); Hoteit, Ibrahim [Department of Earth Sciences and Engineering, King Abdullah University of Science and Technology (KAUST), Thuwal (Saudi Arabia)

    2014-02-01

A Hybrid Nested Sampling (HNS) algorithm is proposed for efficient Bayesian model calibration and prior model selection. The proposed algorithm combines the Nested Sampling (NS) algorithm, Hybrid Monte Carlo (HMC) sampling, and gradient estimation using the Stochastic Ensemble Method (SEM). NS is an efficient sampling algorithm that can be used for Bayesian calibration and for estimating the Bayesian evidence for prior model selection, and it has the advantage of computational feasibility. Within the nested sampling algorithm, a constrained sampling step is performed; for this step, we utilize HMC to reduce the correlation between successive sampled states. HMC relies on the gradient of the logarithm of the posterior distribution, which we estimate using a stochastic ensemble method based on an ensemble of directional derivatives. SEM requires only forward model runs, so the simulator is used as a black box and no adjoint code is needed. The developed HNS algorithm is successfully applied for Bayesian calibration and prior model selection of several nonlinear subsurface flow problems.

  17. Hybrid nested sampling algorithm for Bayesian model selection applied to inverse subsurface flow problems

    KAUST Repository

    Elsheikh, Ahmed H.

    2014-02-01

A Hybrid Nested Sampling (HNS) algorithm is proposed for efficient Bayesian model calibration and prior model selection. The proposed algorithm combines the Nested Sampling (NS) algorithm, Hybrid Monte Carlo (HMC) sampling, and gradient estimation using the Stochastic Ensemble Method (SEM). NS is an efficient sampling algorithm that can be used for Bayesian calibration and for estimating the Bayesian evidence for prior model selection, and it has the advantage of computational feasibility. Within the nested sampling algorithm, a constrained sampling step is performed; for this step, we utilize HMC to reduce the correlation between successive sampled states. HMC relies on the gradient of the logarithm of the posterior distribution, which we estimate using a stochastic ensemble method based on an ensemble of directional derivatives. SEM requires only forward model runs, so the simulator is used as a black box and no adjoint code is needed. The developed HNS algorithm is successfully applied for Bayesian calibration and prior model selection of several nonlinear subsurface flow problems. © 2013 Elsevier Inc.
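
    To make the evidence-estimation loop concrete, the following is a minimal Python sketch of nested sampling with a likelihood-constrained move. A simple random walk stands in for the HMC step with SEM gradients described above, and the toy Gaussian likelihood, uniform box prior, and all names are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def log_likelihood(theta):
    # Toy unimodal likelihood: Gaussian peak at 0.5 (placeholder assumption).
    return -0.5 * np.sum((theta - 0.5) ** 2) / 0.01

def constrained_move(theta, logl_min, rng, steps=20, scale=0.05):
    # Random-walk stand-in for the HMC constrained step: accept only moves
    # that stay above the current likelihood contour logl_min.
    for _ in range(steps):
        prop = np.clip(theta + scale * rng.standard_normal(theta.shape), 0.0, 1.0)
        if log_likelihood(prop) > logl_min:
            theta = prop
    return theta

def nested_sampling(n_live=100, dim=2, n_iter=1000, seed=0):
    rng = np.random.default_rng(seed)
    live = rng.uniform(0.0, 1.0, size=(n_live, dim))   # uniform box prior
    logl = np.apply_along_axis(log_likelihood, 1, live)
    log_z = -np.inf
    log_shell = np.log(1.0 - np.exp(-1.0 / n_live))    # prior-mass shell width
    for i in range(n_iter):
        worst = int(np.argmin(logl))
        # Accumulate evidence: Z += (prior-mass shell) * L_worst.
        log_z = np.logaddexp(log_z, log_shell - i / n_live + logl[worst])
        # Replace the worst point with a clone of a survivor, then move it.
        clone = live[rng.integers(n_live)].copy()
        live[worst] = constrained_move(clone, logl[worst], rng)
        logl[worst] = log_likelihood(live[worst])
    return log_z                                       # log-evidence estimate

print("log Z ~", nested_sampling())
```

    Comparing such log-evidence estimates across candidate prior models is exactly the model-selection use described in the abstract.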

  18. Sample vial inserts: A better approach for sampling heterogeneous slurry samples in the SRS Defense Waste Processing Facility

    International Nuclear Information System (INIS)

    Coleman, C.J.; Goode, S.R.

    1996-01-01

A convenient and effective new approach for analyzing DWPF samples involves the use of inserts with volumes of 1.5-3 ml placed in the neck of 14 ml sample vials. The inserts have rims that conform to the rim of the vials so that they sit straight and stable in the vial. The DWPF tank sampling system fills the pre-weighed insert rather than the entire vial, so the vial functions only as the insert holder. The shielded cell operator then removes the vial cap and decants the insert containing the sample into a plastic bottle, crucible, etc., for analysis. Inert materials such as Teflon, plastic, and zirconium are used for the insert, so it is unnecessary to separate the insert from the sample for most analyses. The key technical advantage of taking DWPF samples in inserts, versus filling sample vials, is that it provides a convenient and almost foolproof way of obtaining and handling small volumes of slurry samples in a shielded cell without corrupting the sample. Since the insert allows the entire sample to be analyzed, this approach eliminates the errors inherent in subsampling the heterogeneous slurries that comprise DWPF samples. Slurry samples can then be analyzed with confidence. Analysis times are dramatically reduced by eliminating the drying and vitrification steps normally used to produce a homogeneous solid sample. Direct dissolution and elemental analysis of slurry samples are achieved in 8 hours or less, compared with 40 hours for analysis of vitrified slurry samples. Comparison of samples taken in inserts versus full vials indicates that the insert does not significantly affect sample composition.

  19. Hydrogen determination using secondary processes of recoil proton interaction with sample material

    International Nuclear Information System (INIS)

    Muminov, V.A.; Khajdarov, R.A.; Navalikhin, L.V.; Pardaev, Eh.

    1980-01-01

The possibility of determining the hydrogen content of different materials via secondary processes of recoil proton interaction (under irradiation in a field of fast neutrons) with the sample material, which result in the emission of characteristic X-rays, is studied. The excited radiation is recorded with a detector placed in a protective screen and located at a certain distance from the analyzed object and the neutron source. The method is tested on the analysis of bromine-containing samples (30% Br, 0.5% H) and tungsten dioxide. The detection limit for hydrogen is 0.05% at a confidence coefficient of 0.9, with a neutron flux of 10³ neutrons/(cm²·s), a measurement time of 15-20 minutes, and a sample-to-detector distance of 12-15 cm.

  20. Prior image constrained image reconstruction in emerging computed tomography applications

    Science.gov (United States)

    Brunner, Stephen T.

    Advances have been made in computed tomography (CT), especially in the past five years, by incorporating prior images into the image reconstruction process. In this dissertation, we investigate prior image constrained image reconstruction in three emerging CT applications: dual-energy CT, multi-energy photon-counting CT, and cone-beam CT in image-guided radiation therapy. First, we investigate the application of Prior Image Constrained Compressed Sensing (PICCS) in dual-energy CT, which has been called "one of the hottest research areas in CT." Phantom and animal studies are conducted using a state-of-the-art 64-slice GE Discovery 750 HD CT scanner to investigate the extent to which PICCS can enable radiation dose reduction in material density and virtual monochromatic imaging. Second, we extend the application of PICCS from dual-energy CT to multi-energy photon-counting CT, which has been called "one of the 12 topics in CT to be critical in the next decade." Numerical simulations are conducted to generate multiple energy bin images for a photon-counting CT acquisition and to investigate the extent to which PICCS can enable radiation dose efficiency improvement. Third, we investigate the performance of a newly proposed prior image constrained scatter correction technique to correct scatter-induced shading artifacts in cone-beam CT, which, when used in image-guided radiation therapy procedures, can assist in patient localization, and potentially, dose verification and adaptive radiation therapy. Phantom studies are conducted using a Varian 2100 EX system with an on-board imager to investigate the extent to which the prior image constrained scatter correction technique can mitigate scatter-induced shading artifacts in cone-beam CT. Results show that these prior image constrained image reconstruction techniques can reduce radiation dose in dual-energy CT by 50% in phantom and animal studies in material density and virtual monochromatic imaging, can lead to radiation
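
    For reference, the PICCS reconstruction underlying these studies is commonly written as the following constrained optimization (after Chen, Tang and Leng, 2008); the notation here is a standard rendering, not a quotation from the dissertation:

```latex
\min_{x}\;\; \alpha \left\| \Psi_1 \,(x - x_{\mathrm{P}}) \right\|_1
\;+\; (1-\alpha) \left\| \Psi_2\, x \right\|_1
\qquad \text{subject to} \qquad A x = y
```

    Here x_P is the prior image, Ψ₁ and Ψ₂ are sparsifying transforms (typically spatial gradients), A is the system matrix, y the measured projections, and α ∈ [0,1] weights fidelity to the prior image.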

  1. The effect of sample grinding procedures after processing on gas production profiles and end-product formation in expander processed barley and peas

    NARCIS (Netherlands)

    Azarfar, A.; Poel, van der A.F.B.; Tamminga, S.

    2007-01-01

    Grinding is a technological process widely applied in the feed manufacturing industry and is a prerequisite for obtaining representative samples for laboratory procedures (e.g. gas production analysis). When feeds are subjected to technological processes other than grinding (e.g. expander

  2. Can natural selection encode Bayesian priors?

    Science.gov (United States)

    Ramírez, Juan Camilo; Marshall, James A R

    2017-08-07

    The evolutionary success of many organisms depends on their ability to make decisions based on estimates of the state of their environment (e.g., predation risk) from uncertain information. These decision problems have optimal solutions and individuals in nature are expected to evolve the behavioural mechanisms to make decisions as if using the optimal solutions. Bayesian inference is the optimal method to produce estimates from uncertain data, thus natural selection is expected to favour individuals with the behavioural mechanisms to make decisions as if they were computing Bayesian estimates in typically-experienced environments, although this does not necessarily imply that favoured decision-makers do perform Bayesian computations exactly. Each individual should evolve to behave as if updating a prior estimate of the unknown environment variable to a posterior estimate as it collects evidence. The prior estimate represents the decision-maker's default belief regarding the environment variable, i.e., the individual's default 'worldview' of the environment. This default belief has been hypothesised to be shaped by natural selection and represent the environment experienced by the individual's ancestors. We present an evolutionary model to explore how accurately Bayesian prior estimates can be encoded genetically and shaped by natural selection when decision-makers learn from uncertain information. The model simulates the evolution of a population of individuals that are required to estimate the probability of an event. Every individual has a prior estimate of this probability and collects noisy cues from the environment in order to update its prior belief to a Bayesian posterior estimate with the evidence gained. The prior is inherited and passed on to offspring. Fitness increases with the accuracy of the posterior estimates produced. Simulations show that prior estimates become accurate over evolutionary time. In addition to these 'Bayesian' individuals, we also
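
    The simulation described above can be caricatured in a few lines. In this hedged sketch (the population size, mutation scale, cue count, and the pseudo-count update strength are all invented settings, not the authors' model), each individual inherits a prior estimate, updates it with noisy cues, and reproduces in proportion to posterior accuracy; the inherited priors drift toward the true environmental probability.

```python
import numpy as np

rng = np.random.default_rng(1)
TRUE_P, POP, GENS, CUES, K = 0.2, 200, 300, 5, 10.0

# Inherited prior estimates of the event probability, initially random.
priors = rng.uniform(0.0, 1.0, POP)

for _ in range(GENS):
    cues = rng.binomial(1, TRUE_P, size=(POP, CUES))      # noisy evidence
    # Beta-Bernoulli style update: the prior acts as K pseudo-observations.
    post = (K * priors + cues.sum(axis=1)) / (K + CUES)
    fitness = 1.0 - np.abs(post - TRUE_P)                 # accurate posteriors win
    parents = rng.choice(POP, size=POP, p=fitness / fitness.sum())
    # Offspring inherit the parent's prior, with a small mutation.
    priors = np.clip(priors[parents] + 0.02 * rng.standard_normal(POP), 0.0, 1.0)

print(f"mean evolved prior: {priors.mean():.3f} (true probability: {TRUE_P})")
```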

  3. Product/Process (P/P) Models For The Defense Waste Processing Facility (DWPF): Model Ranges And Validation Ranges For Future Processing

    Energy Technology Data Exchange (ETDEWEB)

    Jantzen, C. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Edwards, T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-09-25

Radioactive high level waste (HLW) at the Savannah River Site (SRS) has been successfully vitrified into borosilicate glass in the Defense Waste Processing Facility (DWPF) since 1996. Vitrification requires stringent product/process (P/P) constraints, since the glass cannot be reworked once it is poured into ten-foot-tall by two-foot-diameter canisters. A unique “feed forward” statistical process control (SPC) approach was developed for this purpose, rather than statistical quality control (SQC): in SPC, the feed composition to the DWPF melter is controlled prior to vitrification, whereas in SQC the glass product would be sampled after it is vitrified. Individual glass property-composition models form the basis for the “feed forward” SPC. The models transform constraints on the melt and glass properties into constraints on the feed composition going to the melter, in order to guarantee, at the 95% confidence level, that the feed will be processable and that the durability of the resulting waste form will be acceptable to a geologic repository.

  4. Application of cotton as a solid phase extraction sorbent for on-line preconcentration of copper in water samples prior to inductively coupled plasma optical emission spectrometry determination.

    Science.gov (United States)

    Faraji, Mohammad; Yamini, Yadollah; Shariati, Shahab

    2009-07-30

Copper, as a heavy metal, is toxic to many biological systems; thus, the determination of trace amounts of copper in environmental samples is of great importance. In the present work, a new method was developed for the determination of trace amounts of copper in water samples. The method is based on the formation of the ternary Cu(II)-CAS-CTAB ion-pair and its adsorption onto a mini-column packed with cotton prior to determination by inductively coupled plasma optical emission spectrometry (ICP-OES). The experimental parameters that affect the extraction efficiency of the method, such as pH, flow rate and volume of the sample solution, concentrations of chromazurol S (CAS) and cetyltrimethylammonium bromide (CTAB), and the type and concentration of eluent, were investigated and optimized. Under the optimum conditions, the ion-pair (Cu(II)-CAS-CTAB) was quantitatively retained on the cotton, then eluted completely using a solution of 25% (v/v) 1-propanol in 0.5 mol L⁻¹ HNO₃ and directly introduced into the nebulizer of the ICP-OES. The detection limit (DL) of the method for copper was 40 ng L⁻¹ (V_sample = 100 mL), and the relative standard deviation (R.S.D.) for the determination of copper at the 10 µg L⁻¹ level was found to be 1.3%. The method was successfully applied to determine trace amounts of copper in tap water, deep well water, seawater and two different mineral waters, and suitable recoveries (92-106%) were obtained.

  5. The Effects of Prior Knowledge Activation on Study Time Allocation and Free Recall: Investigating the Discrepancy Reduction Model

    NARCIS (Netherlands)

    P.P.J.L. Verkoeijen (Peter); R.M.J.P. Rikers (Remy); H.G. Schmidt (Henk)

    2005-01-01

In this study, the authors examined the influence of prior knowledge activation on information processing by means of a prior knowledge activation procedure adopted from the read–generate paradigm. On the basis of cue-target pairs, participants in the experimental groups generated two

  6. Application of bar codes to the automation of analytical sample data collection

    International Nuclear Information System (INIS)

    Jurgensen, H.A.

    1986-01-01

The Health Protection Department at the Savannah River Plant collects 500 urine samples per day for tritium analyses. Prior to automation, all sample information was compiled manually. Bar code technology was chosen for automating this program because it provides a more accurate, efficient, and inexpensive method for data entry. The system has three major functions. Sample labeling is accomplished at remote bar code label stations composed of an Intermec 8220 (Intermec Corp.) interfaced to an IBM-PC. Data collection is done on a central VAX 11/730 (Digital Equipment Corp.): bar code readers are used to log in samples to be analyzed on liquid scintillation counters, and the VAX 11/730 processes the data and generates reports. Data storage is on the VAX 11/730, backed up on the plant's central computer. A brief description of several other bar code applications at the Savannah River Plant is also presented.

  7. A nested sampling particle filter for nonlinear data assimilation

    KAUST Repository

    Elsheikh, Ahmed H.; Hoteit, Ibrahim; Wheeler, Mary Fanett

    2014-01-01

The proposed nested sampling particle filter (NSPF) iteratively builds the posterior distribution by applying a constrained sampling from the prior distribution to obtain particles in high-likelihood regions of the search space, resulting in a reduction

  8. Bayesian analysis of spatial point processes in the neighbourhood of Voronoi networks

    DEFF Research Database (Denmark)

    Skare, Øivind; Møller, Jesper; Jensen, Eva Bjørn Vedel

    2007-01-01

    A model for an inhomogeneous Poisson process with high intensity near the edges of a Voronoi tessellation in 2D or 3D is proposed. The model is analysed in a Bayesian setting with priors on nuclei of the Voronoi tessellation and other model parameters. An MCMC algorithm is constructed to sample...

  9. Bayesian analysis of spatial point processes in the neighbourhood of Voronoi networks

    DEFF Research Database (Denmark)

    Skare, Øivind; Møller, Jesper; Vedel Jensen, Eva B.

    A model for an inhomogeneous Poisson process with high intensity near the edges of a Voronoi tessellation in 2D or 3D is proposed. The model is analysed in a Bayesian setting with priors on nuclei of the Voronoi tessellation and other model parameters. An MCMC algorithm is constructed to sample...

  10. Attentional and Contextual Priors in Sound Perception.

    Science.gov (United States)

    Wolmetz, Michael; Elhilali, Mounya

    2016-01-01

    Behavioral and neural studies of selective attention have consistently demonstrated that explicit attentional cues to particular perceptual features profoundly alter perception and performance. The statistics of the sensory environment can also provide cues about what perceptual features to expect, but the extent to which these more implicit contextual cues impact perception and performance, as well as their relationship to explicit attentional cues, is not well understood. In this study, the explicit cues, or attentional prior probabilities, and the implicit cues, or contextual prior probabilities, associated with different acoustic frequencies in a detection task were simultaneously manipulated. Both attentional and contextual priors had similarly large but independent impacts on sound detectability, with evidence that listeners tracked and used contextual priors for a variety of sound classes (pure tones, harmonic complexes, and vowels). Further analyses showed that listeners updated their contextual priors rapidly and optimally, given the changing acoustic frequency statistics inherent in the paradigm. A Bayesian Observer model accounted for both attentional and contextual adaptations found with listeners. These results bolster the interpretation of perception as Bayesian inference, and suggest that some effects attributed to selective attention may be a special case of contextual prior integration along a feature axis.
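
    A Bayesian observer of the kind modeled above can be sketched with equal-variance signal detection theory: the prior probability of target presence sets the optimal likelihood-ratio criterion, shifting hit and false-alarm rates along the evidence axis. This is a generic illustration (the parameter values and function names are assumptions), not the paper's fitted model.

```python
import numpy as np
from scipy.stats import norm

def predicted_rates(d_prime, p_present):
    # With noise ~ N(0, 1) and signal ~ N(d', 1), the likelihood ratio at
    # evidence x equals beta when x* = d'/2 + ln(beta)/d'.
    beta = (1.0 - p_present) / p_present      # optimal likelihood-ratio criterion
    x_star = d_prime / 2.0 + np.log(beta) / d_prime
    hit = norm.sf(x_star - d_prime)           # P(x > x* | signal present)
    fa = norm.sf(x_star)                      # P(x > x* | noise only)
    return hit, fa

for p in (0.25, 0.5, 0.75):                   # contextual prior manipulation
    h, f = predicted_rates(1.5, p)
    print(f"p(target)={p:.2f}: hit={h:.2f}, false alarm={f:.2f}")
```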

  11. Total Variability Modeling using Source-specific Priors

    DEFF Research Database (Denmark)

    Shepstone, Sven Ewan; Lee, Kong Aik; Li, Haizhou

    2016-01-01

sequence of an utterance. In both cases the prior for the latent variable is assumed to be non-informative, since for homogeneous datasets there is no gain in generality in using an informative prior. This work shows, in the heterogeneous case, that using informative priors for computing the posterior can lead to favorable results. We focus on modeling the priors using a minimum divergence criterion or factor analysis techniques. Tests on the NIST 2008 and 2010 Speaker Recognition Evaluation (SRE) datasets show that our proposed method beats four baselines: for i-vector extraction using an already trained matrix, five out of eight female and four out of eight male common conditions were improved on the short2-short3 task in SRE'08, and four out of nine female and six out of nine male common conditions were improved on the core-extended task in SRE'10. When incorporating prior information

  12. An Interdisciplinary Method for the Visualization of Novel High-Resolution Precision Photography and Micro-XCT Data Sets of NASA's Apollo Lunar Samples and Antarctic Meteorite Samples to Create Combined Research-Grade 3D Virtual Samples for the Benefit of Astromaterials Collections Conservation, Curation, Scientific Research and Education

    Science.gov (United States)

    Blumenfeld, E. H.; Evans, C. A.; Oshel, E. R.; Liddle, D. A.; Beaulieu, K.; Zeigler, R. A.; Hanna, R. D.; Ketcham, R. A.

    2016-01-01

New technologies make possible the advancement of documentation and visualization practices that can enhance conservation and curation protocols for NASA's Astromaterials Collections. With increasing demands for accessibility to updated comprehensive data, and with new sample return missions on the horizon, it is of primary importance to develop new standards for contemporary documentation and visualization methodologies. Our interdisciplinary team has expertise in the fields of heritage conservation practices, professional photography, photogrammetry, imaging science, application engineering, data curation, geoscience, and astromaterials curation. Our objective is to create virtual 3D reconstructions of Apollo Lunar and Antarctic Meteorite samples that are a fusion of two state-of-the-art data sets: the interior view of the sample by collecting Micro-XCT data and the exterior view of the sample by collecting high-resolution precision photography data. These new data provide researchers with an information-rich visualization of both compositional and textural information prior to any physical sub-sampling. Since January 2013 we have developed a process that resulted in the successful creation of the first image-based 3D reconstruction of an Apollo Lunar Sample correlated to a 3D reconstruction of the same sample's Micro-XCT data, illustrating that this technique is both operationally possible and functionally beneficial. In May of 2016 we began a 3-year research period during which we aim to produce Virtual Astromaterials Samples for 60 high-priority Apollo Lunar and Antarctic Meteorite samples and serve them on NASA's Astromaterials Acquisition and Curation website. Our research demonstrates that research-grade Virtual Astromaterials Samples are beneficial in preserving for posterity a precise 3D reconstruction of the sample prior to sub-sampling, which greatly improves documentation practices, provides unique and novel visualization of the sample's interior and

  13. Quantitative Evidence Synthesis with Power Priors

    NARCIS (Netherlands)

    Rietbergen, C.|info:eu-repo/dai/nl/322847796

    2016-01-01

The aim of this thesis is to provide the applied researcher with a practical approach for quantitative evidence synthesis using the conditional power prior, which allows for subjective input and thereby provides an alternative way to deal with the difficulties associated with the joint power prior

  14. Materials processing issues for non-destructive laser gas sampling (NDLGS)

    Energy Technology Data Exchange (ETDEWEB)

    Lienert, Thomas J [Los Alamos National Laboratory

    2010-12-09

The Non-Destructive Laser Gas Sampling (NDLGS) process essentially involves three steps: (1) laser drilling through the top of a crimped tube made of 304L stainless steel (Hammar and Svensson Cr_eq/Ni_eq = 1.55, produced in 1985); (2) gas sampling; and (3) laser re-welding of the crimp. All three steps are performed in a sealed chamber with a fused silica window under controlled vacuum conditions. Quality requirements for successful processing call for a hermetic re-weld with no cracks or other defects in the fusion zone or HAZ. It has been well established that austenitic stainless steels (γ-SS), such as 304L, can suffer from solidification cracking if their Cr_eq/Ni_eq is below a critical value that causes solidification to occur as austenite (fcc structure) and their combined impurity level (%P + %S) is above ~0.02%. Conversely, for Cr_eq/Ni_eq values above the critical level, solidification occurs as ferrite (bcc structure), and cracking propensity is greatly reduced at all combined impurity levels. The consensus of results from studies by several researchers starting in the late 1970s indicates that the critical Cr_eq/Ni_eq value is ~1.5 for arc welds. However, more recent studies by the author and others show that the critical Cr_eq/Ni_eq value increases to ~1.6 for weld processes with very rapid thermal cycles, such as the pulsed Nd:YAG laser beam welding (LBW) process used here. Initial attempts at NDLGS using pulsed LBW resulted in considerable solidification cracking, consistent with the results of the work discussed above. After a brief introduction to the welding metallurgy of γ-SS, this presentation will review the results of a study aimed at developing a production-ready process that eliminates cracking. The solution to the cracking issue, developed at LANL, involved locally augmenting the Cr content by applying either Cr or a Cr-rich stainless steel (ER 312) to the top

  15. RAPID PROCESSING OF ARCHIVAL TISSUE SAMPLES FOR PROTEOMIC ANALYSIS USING PRESSURE-CYCLING TECHNOLOGY

    Directory of Open Access Journals (Sweden)

Vinuth N. Puttamallesh

    2017-06-01

The advent of mass spectrometry based proteomics has revolutionized our ability to study proteins from biological specimens in a high-throughput manner. Unlike cell line based studies, biomedical research involving tissue specimens is often challenging due to limited sample availability. In addition, investigation of clinically relevant research questions often requires an enormous amount of time for prospective sample collection. Formalin fixed paraffin embedded (FFPE) archived tissue samples are a rich source of tissue specimens for biomedical research. However, there are several challenges associated with analysing FFPE samples; protein cross-linking and degradation of proteins particularly affect proteomic analysis. We demonstrate that a barocycler, which uses pressure-cycling technology, enables efficient protein extraction and processing of small amounts of FFPE tissue samples for proteomic analysis. We identified 3,525 proteins from six 10 µm esophageal squamous cell carcinoma (ESCC) tissue sections. The barocycler allows efficient protein extraction and proteolytic digestion of proteins from FFPE tissue sections, on par with conventional methods.

  16. Recruiting for Prior Service Market

    Science.gov (United States)

    2008-06-01

The study examines perceptions, expectations, and issues for re-enlistment, and develops potential marketing and advertising tactics and strategies targeted to the defined prior service market (U.S. Army Recruiting Command, June 2008).

  17. The outlier sample effects on multivariate statistical data processing geochemical stream sediment survey (Moghangegh region, North West of Iran)

    International Nuclear Information System (INIS)

    Ghanbari, Y.; Habibnia, A.; Memar, A.

    2009-01-01

In a geochemical stream sediment survey of the Moghangegh Region in north-west Iran (1:50,000 sheet), 152 samples were collected; analysis and processing of the data revealed that the Yb, Sc, Ni, Li, Eu, Cd, Co, and As contents of one sample are far higher than those of the other samples. After identifying this sample as an outlier, the effect of this sample on multivariate statistical data processing was investigated to assess the destructive effects of outlier samples in geochemical exploration. Pearson and Spearman correlation coefficient methods and cluster analysis were used for the multivariate studies, and scatter plots of some elements, together with the regression profiles, are given for the cases of 152 and 151 samples and the results are compared. This investigation showed that the presence of an outlier sample may produce the following relations between elements: a true relation between two elements, neither of which has an outlier value in the outlier sample; a false relation between two elements, one of which has an outlier value in the outlier sample; and a completely false relation between two elements, both of which have outlier values in the outlier sample.
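
    The destructive effect described here is easy to reproduce with synthetic data: a single extreme sample can manufacture a strong Pearson correlation between two otherwise unrelated elements, while the rank-based Spearman coefficient is far less affected. The numbers below are simulated, chosen only to echo the 151-versus-152-sample comparison.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(0)
a = rng.normal(10, 2, 151)        # element 1, 151 ordinary samples
b = rng.normal(30, 5, 151)        # element 2, uncorrelated with element 1

a_out = np.append(a, 200.0)       # add a single extreme outlier sample
b_out = np.append(b, 600.0)

print("with outlier:    Pearson r=%.2f  Spearman rho=%.2f"
      % (pearsonr(a_out, b_out)[0], spearmanr(a_out, b_out)[0]))
print("without outlier: Pearson r=%.2f  Spearman rho=%.2f"
      % (pearsonr(a, b)[0], spearmanr(a, b)[0]))
```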

  18. Automatic drawing and CAD actualization in processing data of radiant sampling in physics prospect

    International Nuclear Information System (INIS)

    Liu Jinsheng

    2010-01-01

This paper discusses a method of processing radiant sampling data with a computer. The method can plot the curves of radiant sampling data, combine mineral masses, analyse and calculate them, and then record the results in Notebook. The method has many merits: it is easy to learn, simple to use and highly efficient, and it adapts to all sorts of mines. (authors)

19. Automatic drawing and CAD actualization in processing data of radiant sampling in physics prospect

    International Nuclear Information System (INIS)

    Liu Jinsheng

    2010-01-01

This paper discusses a method of processing radiant sampling data with a computer. The method can plot the curves of radiant sampling data, combine mineral masses, analyse and calculate them, and then record the results in Notebook. The method has many merits: it is easy to learn, simple to use and highly efficient, and it adapts to all sorts of mines. (authors)

  20. Segmentation of kidney using C-V model and anatomy priors

    Science.gov (United States)

    Lu, Jinghua; Chen, Jie; Zhang, Juan; Yang, Wenjia

    2007-12-01

This paper presents an approach to kidney segmentation on abdominal CT images as the first step of a virtual reality surgery system. Segmentation of medical images is often challenging because of the objects' complicated anatomical structures, various gray levels, and unclear edges. A coarse-to-fine approach has been applied in the kidney segmentation, using the Chan-Vese model (C-V model) and anatomical prior knowledge. In the pre-processing stage, candidate kidney regions are located. Then the C-V model, formulated by the level set method, is applied in these smaller ROIs, which reduces the computational complexity to a certain extent. Finally, after some mathematical morphology procedures, the specified kidney structures are extracted interactively using prior knowledge. Satisfying results on abdominal CT series show that the proposed approach keeps all the advantages of the C-V model and overcomes its disadvantages.
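
    The coarse-to-fine idea, locating a candidate region first and then running the C-V model only inside it, can be sketched with the morphological Chan-Vese implementation in scikit-image; the synthetic image, ROI bounds, and parameter choices below are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from skimage.segmentation import morphological_chan_vese

img = np.zeros((128, 128), dtype=float)
img[40:80, 50:90] = 1.0                      # bright "kidney-like" blob (toy data)
img += 0.3 * np.random.default_rng(0).standard_normal(img.shape)

roi = img[30:90, 40:100]                     # coarse stage: candidate region only
mask = morphological_chan_vese(roi, 50,      # fine stage: C-V inside the ROI
                               init_level_set="checkerboard", smoothing=2)
print("segmented pixels in ROI:", int(mask.sum()))
```

    Restricting the level-set evolution to the ROI is what keeps the per-iteration cost low, exactly the complexity argument made in the abstract.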

  1. Preview-based sampling for controlling gaseous simulations

    KAUST Repository

    Huang, Ruoguan

    2011-01-01

    In this work, we describe an automated method for directing the control of a high resolution gaseous fluid simulation based on the results of a lower resolution preview simulation. Small variations in accuracy between low and high resolution grids can lead to divergent simulations, which is problematic for those wanting to achieve a desired behavior. Our goal is to provide a simple method for ensuring that the high resolution simulation matches key properties from the lower resolution simulation. We first let a user specify a fast, coarse simulation that will be used for guidance. Our automated method samples the data to be matched at various positions and scales in the simulation, or allows the user to identify key portions of the simulation to maintain. During the high resolution simulation, a matching process ensures that the properties sampled from the low resolution simulation are maintained. This matching process keeps the different resolution simulations aligned even for complex systems, and can ensure consistency of not only the velocity field, but also advected scalar values. Because the final simulation is naturally similar to the preview simulation, only minor controlling adjustments are needed, allowing a simpler control method than that used in prior keyframing approaches. Copyright © 2011 by the Association for Computing Machinery, Inc.

  2. ECAE-processed Cu-Nb and Cu-Ag nanocomposite wires for pulse magnet applications

    International Nuclear Information System (INIS)

    Edgecumbe Summers, T.S.; Walsh, R.P.; Pernambuco-Wise, P.

    1997-01-01

Cu-Nb and Cu-Ag nanocomposites have recently become of interest to pulse magnet designers because of their unusual combination of high strength with reasonable conductivity. In the as-cast condition, these conductors consist of two phases, one of almost pure Nb (or Ag) and the other almost pure Cu. When these castings are cold worked, as in a wire-drawing operation for example, the two phases are drawn into very fine filaments which produce considerable strengthening without an unacceptable decrease in conductivity. Unfortunately, any increase in strength with operations such as wire drawing is accompanied by a reduction in the cross section of the billet, and thus far, no wires with strengths on the order of 1.5 GPa or more have been produced with cross sections large enough to be useful in magnet applications. Equal Channel Angular Extrusion (ECAE) is an innovative technique which allows for the refinement of the as-cast ingot structure without a reduction in the cross sectional dimensions. Samples processed by the ECAE technique prior to wire drawing should be stronger at a given wire diameter than those processed by wire drawing alone. The tensile properties of wire-drawn Cu-18%Nb and Cu-25%Ag, both with and without prior ECAE processing, were tested and compared at both room temperature and 77 K. All samples were found to have resistivities consistent with their strengths, and the strengths of the ECAE-processed wires were significantly higher than their as-cast and drawn counterparts. Therefore, with ECAE processing prior to wire drawing, it appears to be possible to make high-strength conductors with adequately large cross sections for pulse magnets.

  3. Emerging halogenated flame retardants and hexabromocyclododecanes in food samples from an e-waste processing area in Vietnam.

    Science.gov (United States)

    Tao, Fang; Matsukami, Hidenori; Suzuki, Go; Tue, Nguyen Minh; Viet, Pham Hung; Takigami, Hidetaka; Harrad, Stuart

    2016-03-01

This study reports concentrations of selected emerging halogenated flame retardants (HFRs) and hexabromocyclododecanes (HBCDs) in foodstuffs sourced from an e-waste processing area in Vietnam and two reference sites in Vietnam and Japan. Concentrations of all target HFRs in e-waste-impacted samples in this study significantly (p < 0.05) exceed those in samples from the reference sites, suggesting that e-waste processing activities exert a substantial impact on local environmental contamination and human dietary exposure. Significant linear positive correlations in concentrations of syn-Dechlorane Plus (DP) and anti-DP were found between soils and co-located chicken samples (p < 0.05), in line with observations from e-waste processing sites and non-e-waste processing areas elsewhere.

  4. A Bayesian optimal design for degradation tests based on the inverse Gaussian process

    Energy Technology Data Exchange (ETDEWEB)

    Peng, Weiwen; Liu, Yu; Li, Yan Feng; Zhu, Shun Peng; Huang, Hong Zhong [University of Electronic Science and Technology of China, Chengdu (China)

    2014-10-15

The inverse Gaussian process was recently introduced as an attractive and flexible stochastic process for degradation modeling. This process has been demonstrated to be a valuable complement to models developed on the basis of the Wiener and gamma processes. We investigate the optimal design of degradation tests on the basis of the inverse Gaussian process. In addition to an optimal design with pre-estimated planning values of the model parameters, we also address the issue of uncertainty in the planning values by using the Bayesian method. An average pre-posterior variance of reliability is used as the optimization criterion. A trade-off between sample size and number of degradation observations is investigated in the degradation test planning. The effects of priors on the optimal designs and on the value of prior information are also investigated and quantified. The degradation test planning of a GaAs laser device is performed to demonstrate the proposed method.
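
    As a concrete reference point for this model class, degradation paths of an inverse Gaussian process with a linear mean function Λ(t) = t can be simulated directly, since increments over an interval Δt follow IG(μΔt, λΔt²). The parameter values, measurement plan, and failure threshold below are illustrative assumptions, not the paper's planning values.

```python
import numpy as np

rng = np.random.default_rng(42)
mu, lam = 1.0, 4.0                 # IG process parameters (assumed)
dt, n_obs, n_units = 0.5, 40, 8    # measurement interval, counts (assumed)
threshold = 12.0                   # failure level D (assumed)

# Independent IG increments: numpy's wald(mean, scale) is the IG distribution.
increments = rng.wald(mu * dt, lam * dt**2, size=(n_units, n_obs))
paths = increments.cumsum(axis=1)

# Empirical pseudo failure times: first observation crossing the threshold.
cross = ((paths >= threshold).argmax(axis=1) + 1) * dt
print("degradation at last observation:", np.round(paths[:, -1], 2))
print("approx. failure times:", cross)
```

    Repeating such simulations over candidate (sample size, observation count) plans and priors is the basic ingredient of the pre-posterior evaluation the abstract describes.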

  5. In Situ Visualization of the Phase Behavior of Oil Samples Under Refinery Process Conditions.

    Science.gov (United States)

    Laborde-Boutet, Cedric; McCaffrey, William C

    2017-02-21

    To help address production issues in refineries caused by the fouling of process units and lines, we have developed a setup as well as a method to visualize the behavior of petroleum samples under process conditions. The experimental setup relies on a custom-built micro-reactor fitted with a sapphire window at the bottom, which is placed over the objective of an inverted microscope equipped with a cross-polarizer module. Using reflection microscopy enables the visualization of opaque samples, such as petroleum vacuum residues, or asphaltenes. The combination of the sapphire window from the micro-reactor with the cross-polarizer module of the microscope on the light path allows high-contrast imaging of isotropic and anisotropic media. While observations are carried out, the micro-reactor can be heated to the temperature range of cracking reactions (up to 450 °C), can be subjected to H2 pressure relevant to hydroconversion reactions (up to 16 MPa), and can stir the sample by magnetic coupling. Observations are typically carried out by taking snapshots of the sample under cross-polarized light at regular time intervals. Image analyses may not only provide information on the temperature, pressure, and reactive conditions yielding phase separation, but may also give an estimate of the evolution of the chemical (absorption/reflection spectra) and physical (refractive index) properties of the sample before the onset of phase separation.

  6. PET image reconstruction using multi-parametric anato-functional priors

    Science.gov (United States)

    Mehranian, Abolfazl; Belzunce, Martin A.; Niccolini, Flavia; Politis, Marios; Prieto, Claudia; Turkheimer, Federico; Hammers, Alexander; Reader, Andrew J.

    2017-08-01

    In this study, we investigate the application of multi-parametric anato-functional (MR-PET) priors for the maximum a posteriori (MAP) reconstruction of brain PET data in order to address the limitations of the conventional anatomical priors in the presence of PET-MR mismatches. In addition to partial volume correction benefits, the suitability of these priors for reconstruction of low-count PET data is also introduced and demonstrated, comparing to standard maximum-likelihood (ML) reconstruction of high-count data. The conventional local Tikhonov and total variation (TV) priors and current state-of-the-art anatomical priors including the Kaipio, non-local Tikhonov prior with Bowsher and Gaussian similarity kernels are investigated and presented in a unified framework. The Gaussian kernels are calculated using both voxel- and patch-based feature vectors. To cope with PET and MR mismatches, the Bowsher and Gaussian priors are extended to multi-parametric priors. In addition, we propose a modified joint Burg entropy prior that by definition exploits all parametric information in the MAP reconstruction of PET data. The performance of the priors was extensively evaluated using 3D simulations and two clinical brain datasets of [18F]florbetaben and [18F]FDG radiotracers. For simulations, several anato-functional mismatches were intentionally introduced between the PET and MR images, and furthermore, for the FDG clinical dataset, two PET-unique active tumours were embedded in the PET data. Our simulation results showed that the joint Burg entropy prior far outperformed the conventional anatomical priors in terms of preserving PET unique lesions, while still reconstructing functional boundaries with corresponding MR boundaries. In addition, the multi-parametric extension of the Gaussian and Bowsher priors led to enhanced preservation of edge and PET unique features and also an improved bias-variance performance. In agreement with the simulation results, the clinical results
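
    For orientation, MAP objectives with penalty priors of this kind are often optimized with the one-step-late EM update of Green (1990), shown below in generic form; this is a textbook reference point, not necessarily the exact update used by the authors:

```latex
x_j^{(k+1)} \;=\;
\frac{x_j^{(k)}}{\sum_i a_{ij} \;+\; \beta\, \dfrac{\partial U}{\partial x_j}\!\left(x^{(k)}\right)}
\;\sum_i a_{ij}\, \frac{y_i}{\sum_{j'} a_{ij'}\, x_{j'}^{(k)}}
```

    Here a_ij is the system matrix, y_i the measured counts, U the prior energy (for example a Bowsher-weighted quadratic, or the joint Burg entropy discussed above), and β its weight.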

  7. Assessment of Processes of Change for Weight Management in a UK Sample

    Science.gov (United States)

    Andrés, Ana; Saldaña, Carmina; Beeken, Rebecca J.

    2015-01-01

    Objective The present study aimed to validate the English version of the Processes of Change questionnaire in weight management (P-Weight). Methods Participants were 1,087 UK adults, including people enrolled in a behavioural weight management programme, university students and an opportunistic sample. The mean age of the sample was 34.80 (SD = 13.56) years, and 83% were women. BMI ranged from 18.51 to 55.36 (mean = 25.92, SD = 6.26) kg/m2. Participants completed both the stages and processes questionnaires in weight management (S-Weight and P-Weight), and subscales from the EDI-2 and EAT-40. A refined version of the P-Weight consisting of 32 items was obtained based on the item analysis. Results The internal structure of the scale fitted a four-factor model, and statistically significant correlations with external measures supported the convergent validity of the scale. Conclusion The adequate psychometric properties of the P-Weight English version suggest that it could be a useful tool to tailor weight management interventions. PMID:25765163

  8. Improvements to sample processing and measurement to enable more widespread environmental application of tritium.

    Science.gov (United States)

    Moran, James; Alexander, Thomas; Aalseth, Craig; Back, Henning; Mace, Emily; Overman, Cory; Seifert, Allen; Freeburg, Wilcox

    2017-08-01

Previous measurements have demonstrated the wealth of information that tritium (T) can provide on environmentally relevant processes. We present modifications to sample preparation approaches that enable T measurement by proportional counting on small sample sizes, equivalent to 120 mg of water, and demonstrate the accuracy of these methods on a suite of standardized water samples. We identify a current quantification limit of 92.2 TU which, combined with our small sample sizes, corresponds to as little as 0.00133 Bq of total T activity. This enhanced method should provide the analytical flexibility needed to address persistent knowledge gaps in our understanding of both natural and artificial T behavior in the environment. Copyright © 2017. Published by Elsevier Ltd.
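
    The quoted figures are internally consistent: using the common approximation that 1 TU corresponds to about 0.118 Bq per kilogram of water (an assumed conversion, not stated in the abstract), the quantification limit on a 120 mg water-equivalent sample works out to roughly 0.0013 Bq.

```python
# Back-of-envelope check of the abstract's numbers.
TU = 92.2                 # quantification limit, tritium units
BQ_PER_KG_PER_TU = 0.118  # approximate conversion for water (assumption)
mass_kg = 120e-6          # 120 mg expressed in kilograms

activity = TU * BQ_PER_KG_PER_TU * mass_kg
print(f"total activity ~ {activity:.2e} Bq")  # ~1.3e-3 Bq, matching ~0.00133 Bq
```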

  9. Image-guided percutaneous disc sampling: impact of antecedent antibiotics on yield

    International Nuclear Information System (INIS)

    Agarwal, V.; Wo, S.; Lagemann, G.M.; Tsay, J.; Delfyett, W.T.

    2016-01-01

    Aim: To evaluate the effect of antecedent antimicrobial therapy on diagnostic yield from percutaneous image-guided disc-space sampling. Materials and methods: A retrospective review of the electronic health records of all patients who underwent image-guided percutaneous sampling procedures for suspected discitis/osteomyelitis over a 5-year period was performed. One hundred and twenty-four patients were identified. Demographics, medical history, and culture results were recorded as well as duration of presenting symptoms and whether antecedent antibiotic therapy had been administered. Results: Of the 124 patients identified who underwent image-guided percutaneous disc-space sampling, 73 had received antecedent antibiotic treatment compared with 51 who had not. The overall positive culture rate for the present study population was 24% (n=30). The positive culture rate from patients previously on antibiotics was 21% (n=15) compared with 29% (n=15) for patients who had not received prior antibiotic treatment, which is not statistically significant (p=0.26). Eighty-six percent (n=63) of patients who had antecedent antibiotics received treatment for 4 or more days prior to their procedure, whereas 14% (n=10) received treatment for 1–3 days prior to their procedure. The difference in culture positivity rate between these two groups was not statistically significant (p=0.43). Culture results necessitated a change in antibiotic therapy in a third of the patients who had received antecedent antibiotic therapy. Conclusion: Antecedent antibiotic therapy, regardless of duration, did not result in significantly diminished diagnostic yield from percutaneous sampling for suspected discitis/osteomyelitis. The present results suggest that percutaneous biopsy may nonetheless yield positive diagnostic information despite prior antimicrobial therapy. If the diagnostic information may impact choice of therapeutic regimen, percutaneous biopsy should still be considered in cases where
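
    The headline comparison can be reproduced from the counts in the abstract (15 of 73 positive cultures with antecedent antibiotics versus 15 of 51 without): an uncorrected chi-square test on the 2x2 table gives p ≈ 0.26, matching the reported value. Choosing the uncorrected test is our assumption; the paper does not state which test was used.

```python
from scipy.stats import chi2_contingency

# Rows: antecedent antibiotics yes/no; columns: culture positive/negative.
table = [[15, 73 - 15],
         [15, 51 - 15]]
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.2f}, p = {p:.2f}")   # p ~ 0.26
```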

  10. The influence of prior knowledge on the retrieval-directed function of note taking in prior knowledge activation

    NARCIS (Netherlands)

    Wetzels, Sandra; Kester, Liesbeth; Van Merriënboer, Jeroen; Broers, Nick

    2010-01-01

    Wetzels, S. A. J., Kester, L., Van Merriënboer, J. J. G., & Broers, N. J. (2011). The influence of prior knowledge on the retrieval-directed function of note taking in prior knowledge activation. British Journal of Educational Psychology, 81(2), 274-291. doi: 10.1348/000709910X517425

  11. Precursors prior to type IIn supernova explosions are common: Precursor rates, properties, and correlations

    Energy Technology Data Exchange (ETDEWEB)

    Ofek, Eran O.; Steinbok, Aviram; Arcavi, Iair; Gal-Yam, Avishay; Tal, David; Ben-Ami, Sagi; Yaron, Ofer [Benoziyo Center for Astrophysics, Weizmann Institute of Science, 76100 Rehovot (Israel); Sullivan, Mark [School of Physics and Astronomy, University of Southampton, Southampton SO17 1BJ (United Kingdom); Shaviv, Nir J. [Racah Institute of Physics, The Hebrew University, 91904 Jerusalem (Israel); Kulkarni, Shrinivas R. [Cahill Center for Astronomy and Astrophysics, California Institute of Technology, Pasadena, CA 91125 (United States); Nugent, Peter E. [Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720 (United States); Kasliwal, Mansi M. [Observatories of the Carnegie Institution for Science, 813 Santa Barbara Street, Pasadena, CA 91101 (United States); Cenko, S. Bradley [Astrophysics Science Division, NASA/Goddard Space Flight Center, Mail Code 661, Greenbelt, MD 20771 (United States); Laher, Russ; Surace, Jason [Spitzer Science Center, California Institute of Technology, M/S 314-6, Pasadena, CA 91125 (United States); Bloom, Joshua S.; Filippenko, Alexei V. [Department of Astronomy, University of California, Berkeley, CA 94720-3411 (United States); Silverman, Jeffrey M. [Department of Astronomy, University of Texas, Austin, TX 78712 (United States)

    2014-07-10

There is a growing number of Type IIn supernovae (SNe) which present an outburst prior to their presumably final explosion. These precursors may affect the SN display, and are likely related to poorly charted phenomena in the final stages of stellar evolution. By coadding Palomar Transient Factory (PTF) images taken prior to the explosion, here we present a search for precursors in a sample of 16 Type IIn SNe. We find five SNe IIn that likely have at least one possible precursor event (PTF 10bjb, SN 2010mc, PTF 10weh, SN 2011ht, and PTF 12cxj), three of which are reported here for the first time. For each SN we calculate the control time. We find that precursor events among SNe IIn are common: at the one-sided 99% confidence level, >50% of SNe IIn have at least one pre-explosion outburst that is brighter than 3 × 10⁷ L☉ taking place up to 1/3 yr prior to the SN explosion. The average rate of such precursor events during the year prior to the SN explosion is likely ≳1 yr⁻¹, and fainter precursors are possibly even more common. Ignoring the two weakest precursors in our sample, the precursor rate we find is still on the order of one per year. We also find possible correlations between the integrated luminosity of the precursor and the SN total radiated energy, peak luminosity, and rise time. These correlations are expected if the precursors are mass-ejection events, and the early-time light curve of these SNe is powered by interaction of the SN shock and ejecta with optically thick circumstellar material.

  12. Adaptive local thresholding for robust nucleus segmentation utilizing shape priors

    Science.gov (United States)

    Wang, Xiuzhong; Srinivas, Chukka

    2016-03-01

    This paper describes a novel local thresholding method for foreground detection. First, a Canny edge detection method is used for initial edge detection. Then, tensor voting is applied on the initial edge pixels, using a nonsymmetric tensor field tailored to encode prior information about nucleus size, shape, and intensity spatial distribution. Tensor analysis is then performed to generate the saliency image and, based on that, the refined edge. Next, the image domain is divided into blocks. In each block, at least one foreground and one background pixel are sampled for each refined edge pixel. The saliency weighted foreground histogram and background histogram are then created. These two histograms are used to calculate a threshold by minimizing the background and foreground pixel classification error. The block-wise thresholds are then used to generate the threshold for each pixel via interpolation. Finally, the foreground is obtained by comparing the original image with the threshold image. The effective use of prior information, combined with robust techniques, results in far more reliable foreground detection, which leads to robust nucleus segmentation.
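
    The block-to-pixel interpolation step generalizes well beyond this pipeline. The sketch below computes one threshold per block (a simple block mean standing in for the saliency-weighted histogram optimization described above) and bilinearly upsamples the block thresholds to a per-pixel threshold image; the toy image and block size are assumptions.

```python
import numpy as np
from scipy.ndimage import zoom

rng = np.random.default_rng(0)
img = rng.normal(0.3, 0.05, (128, 128))
img[32:96, 32:96] += 0.4                    # brighter "nuclei" region (toy data)

B = 32                                      # block size (assumption)
nb = img.shape[0] // B
blocks = img.reshape(nb, B, nb, B).swapaxes(1, 2)
t_block = blocks.mean(axis=(2, 3))          # stand-in for the per-block
                                            # histogram-derived threshold

t_pixel = zoom(t_block, B, order=1)         # bilinear upsampling to a per-pixel
foreground = img > t_pixel                  # threshold image, then compare
print("foreground fraction:", round(float(foreground.mean()), 3))
```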

  13. Prior expectations facilitate metacognition for perceptual decision.

    Science.gov (United States)

    Sherman, M T; Seth, A K; Barrett, A B; Kanai, R

    2015-09-01

    The influential framework of 'predictive processing' suggests that prior probabilistic expectations influence, or even constitute, perceptual contents. This notion is evidenced by the facilitation of low-level perceptual processing by expectations. However, whether expectations can facilitate high-level components of perception remains unclear. We addressed this question by considering the influence of expectations on perceptual metacognition. To isolate the effects of expectation from those of attention we used a novel factorial design: expectation was manipulated by changing the probability that a Gabor target would be presented; attention was manipulated by instructing participants to perform or ignore a concurrent visual search task. We found that, independently of attention, metacognition improved when yes/no responses were congruent with expectations of target presence/absence. Results were modeled under a novel Bayesian signal detection theoretic framework which integrates bottom-up signal propagation with top-down influences, to provide a unified description of the mechanisms underlying perceptual decision and metacognition. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Sample preparation optimization in fecal metabolic profiling.

    Science.gov (United States)

    Deda, Olga; Chatziioannou, Anastasia Chrysovalantou; Fasoula, Stella; Palachanis, Dimitris; Raikos, Νicolaos; Theodoridis, Georgios A; Gika, Helen G

    2017-03-15

Metabolomic analysis of feces can provide useful insight into the metabolic status and the health/disease state of a human or animal and its symbiosis with the gut microbiome. As a result, there has recently been increased interest in the application of holistic analysis of feces for biomarker discovery. For metabolomics applications, the sample preparation process used prior to the analysis of fecal samples is of high importance, as it greatly affects the obtained metabolic profile, especially since feces, as a matrix, vary in their physicochemical characteristics and molecular content. However, there is still little information in the literature and a lack of a universal approach to sample treatment for fecal metabolic profiling. The scope of the present work was to study the conditions for sample preparation of rat feces, with the ultimate goal of acquiring comprehensive metabolic profiles, either untargeted by NMR spectroscopy and GC-MS or targeted by HILIC-MS/MS. A fecal sample pooled from male and female Wistar rats was extracted under various conditions by modifying the pH value, the nature of the organic solvent, and the sample weight to solvent volume ratio. It was found that a 1/2 (w_f/v_s) ratio provided the highest number of metabolites under neutral and basic conditions in both untargeted profiling techniques. Concerning the LC-MS profiles, neutral acetonitrile and propanol provided higher signals and wide metabolite coverage, though the extraction efficiency is metabolite dependent. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. UNLABELED SELECTED SAMPLES IN FEATURE EXTRACTION FOR CLASSIFICATION OF HYPERSPECTRAL IMAGES WITH LIMITED TRAINING SAMPLES

    Directory of Open Access Journals (Sweden)

    A. Kianisarkaleh

    2015-12-01

    Full Text Available Feature extraction plays a key role in the classification of hyperspectral images. Unsupervised and semisupervised feature extraction methods, which can use unlabeled samples that are often available in unlimited numbers, show better performance when only a limited number of training samples exists. This paper illustrates the importance of selecting appropriate unlabeled samples for use in feature extraction methods, and proposes a new method for unlabeled sample selection using spectral and spatial information. The proposed method has four parts: PCA, prior classification, posterior classification, and sample selection. As a hyperspectral image passes through these parts, the selected unlabeled samples can be used in arbitrary feature extraction methods. The effectiveness of the proposed unlabeled sample selection in unsupervised and semisupervised feature extraction is demonstrated using two real hyperspectral datasets. Results show that, by selecting appropriate unlabeled samples, the proposed method can improve the performance of feature extraction methods and increase classification accuracy.
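
    The abstract does not spell out the four parts in detail, so the sketch below shows only one plausible selection rule in that spirit: keep unlabeled pixels whose predicted label is stable between an initial ("prior") classification on PCA features and a spatially smoothed ("posterior") classification. All names and design choices are assumptions for illustration:

        import numpy as np
        from scipy.ndimage import median_filter
        from sklearn.decomposition import PCA
        from sklearn.neighbors import KNeighborsClassifier

        def select_unlabeled(cube, X_train, y_train, n_components=10):
            # cube: (rows, cols, bands) hyperspectral image;
            # X_train, y_train: the few available labeled samples.
            rows, cols, bands = cube.shape
            pca = PCA(n_components=n_components).fit(cube.reshape(-1, bands))
            X_all = pca.transform(cube.reshape(-1, bands))
            clf = KNeighborsClassifier(n_neighbors=3).fit(
                pca.transform(X_train), y_train)
            prior = clf.predict(X_all).reshape(rows, cols)  # spectral labels
            posterior = median_filter(prior, size=3)        # spatially smoothed labels
            stable = (prior == posterior).ravel()           # spectrally/spatially consistent
            return X_all[stable]                            # candidate unlabeled samples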

  16. SINGLE TREE DETECTION FROM AIRBORNE LASER SCANNING DATA USING A MARKED POINT PROCESS BASED METHOD

    Directory of Open Access Journals (Sweden)

    J. Zhang

    2013-05-01

    Full Text Available Tree detection and reconstruction is of great interest in large-scale city modelling. In this paper, we present a marked point process model to detect single trees from airborne laser scanning (ALS) data. We consider single trees in an ALS-recovered canopy height model (CHM) as a realization of a point process of circles. Unlike a traditional marked point process, we sample the model in a constrained configuration space by making use of image processing techniques. A Gibbs energy is defined on the model, containing a data term which judges the fitness of the model with respect to the data, and a prior term which incorporates prior knowledge of object layouts. We search for the optimal configuration through a steepest gradient descent algorithm. The presented hybrid framework was tested on three forest plots, and experiments show the effectiveness of the proposed method.
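
    A toy version of such a Gibbs energy, written as a data term plus a pairwise prior term over a configuration of circles (the exact terms used by the authors are not given in the abstract, so the forms below are illustrative):

        import numpy as np

        def gibbs_energy(circles, chm, gamma=1.0):
            # circles: list of (x, y, r) marks; chm: canopy height model array.
            yy, xx = np.ogrid[:chm.shape[0], :chm.shape[1]]
            data = 0.0
            for (x, y, r) in circles:
                inside = (xx - x) ** 2 + (yy - y) ** 2 <= r ** 2
                data -= chm[inside].mean()   # good fit to the CHM -> lower energy
            prior = 0.0
            for i, (x1, y1, r1) in enumerate(circles):
                for (x2, y2, r2) in circles[i + 1:]:
                    # Penalize overlapping crowns (a simple layout prior).
                    prior += max(0.0, r1 + r2 - np.hypot(x2 - x1, y2 - y1))
            return data + gamma * prior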

  17. Systematic sampling with errors in sample locations

    DEFF Research Database (Denmark)

    Ziegel, Johanna; Baddeley, Adrian; Dorph-Petersen, Karl-Anton

    2010-01-01

    Systematic sampling of points in continuous space is widely used in microscopy and spatial surveys. Classical theory provides asymptotic expressions for the variance of estimators based on systematic sampling as the grid spacing decreases. However, the classical theory assumes that the sample grid is exactly periodic; real physical sampling procedures may introduce errors in the placement of the sample points. This paper studies the effect of errors in sample positioning on the variance of estimators in the case of one-dimensional systematic sampling. First we sketch a general approach to variance analysis using point process methods. We then analyze three different models for the error process, calculate exact expressions for the variances, and derive asymptotic variances. Errors in the placement of sample points can lead to substantial inflation of the variance, dampening of zitterbewegung...
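
    The variance inflation can be reproduced in a few lines of simulation (a generic one-dimensional illustration with Gaussian placement errors, not one of the paper's three specific error models):

        import numpy as np

        rng = np.random.default_rng(0)
        f = lambda x: np.sin(2 * np.pi * x) + 1.5   # integrand on [0, 1]
        n, reps = 20, 10_000                        # grid points, replications

        def systematic_estimates(jitter_sd):
            u = rng.uniform(0, 1 / n, size=(reps, 1))          # random grid start
            grid = u + np.arange(n) / n                        # periodic sample grid
            pts = grid + rng.normal(0, jitter_sd, grid.shape)  # placement errors
            return f(pts % 1.0).mean(axis=1)                   # estimator of the mean

        print("variance, exact grid   :", systematic_estimates(0.0).var())
        print("variance, jittered grid:", systematic_estimates(0.01).var())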

  18. Elapsed decision time affects the weighting of prior probability in a perceptual decision task

    Science.gov (United States)

    Hanks, Timothy D.; Mazurek, Mark E.; Kiani, Roozbeh; Hopp, Elizabeth; Shadlen, Michael N.

    2012-01-01

    Decisions are often based on a combination of new evidence with prior knowledge of the probable best choice. Optimal combination requires knowledge about the reliability of evidence, but in many realistic situations, this is unknown. Here we propose and test a novel theory: the brain exploits elapsed time during decision formation to combine sensory evidence with prior probability. Elapsed time is useful because (i) decisions that linger tend to arise from less reliable evidence, and (ii) the expected accuracy at a given decision time depends on the reliability of the evidence gathered up to that point. These regularities allow the brain to combine prior information with sensory evidence by weighting the latter in accordance with reliability. To test this theory, we manipulated the prior probability of the rewarded choice while subjects performed a reaction-time discrimination of motion direction using a range of stimulus reliabilities that varied from trial to trial. The theory explains the effect of prior probability on choice and reaction time over a wide range of stimulus strengths. We found that prior probability was incorporated into the decision process as a dynamic bias signal that increases as a function of decision time. This bias signal depends on the speed-accuracy setting of human subjects, and it is reflected in the firing rates of neurons in the lateral intraparietal cortex (LIP) of rhesus monkeys performing this task. PMID:21525274
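
    A simplified simulation of the proposed mechanism, with prior information entering as a bias that ramps up with elapsed decision time (parameters are arbitrary; this is not the authors' fitted model):

        import numpy as np

        rng = np.random.default_rng(1)

        def decide(drift, prior_log_odds, bound=1.0, dt=1e-3, ramp=1.0):
            # Drift-diffusion to a bound, with a dynamic bias signal that
            # grows with elapsed time, weighting the prior more heavily
            # when evidence is weak and the decision lingers.
            x, t = 0.0, 0.0
            while True:
                t += dt
                x += drift * dt + rng.normal(0.0, np.sqrt(dt))
                total = x + prior_log_odds * min(t / ramp, 1.0)
                if abs(total) >= bound:
                    return (1 if total > 0 else -1), t

        # Weaker evidence (smaller drift): the prior dominates choices more often.
        for drift in (0.0, 0.5, 2.0):
            out = [decide(drift, prior_log_odds=0.5) for _ in range(500)]
            print(drift, np.mean([c for c, _ in out]), np.mean([t for _, t in out]))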

  19. Field Sample Preparation Method Development for Isotope Ratio Mass Spectrometry

    International Nuclear Information System (INIS)

    Leibman, C.; Weisbrod, K.; Yoshida, T.

    2015-01-01

    Non-proliferation and International Security (NA-241) established a working group of researchers from Los Alamos National Laboratory (LANL), Pacific Northwest National Laboratory (PNNL) and Savannah River National Laboratory (SRNL) to evaluate the utilization of in-field mass spectrometry for safeguards applications. The survey of commercial off-the-shelf (COTS) mass spectrometers (MS) revealed that no existing instrumentation was capable of meeting all the potential safeguards requirements for performance, portability, and ease of use. Additionally, fieldable instruments are unlikely to meet the International Target Values (ITVs) for accuracy and precision for isotope ratio measurements achieved with laboratory methods. The major gaps identified for in-field actinide isotope ratio analysis were in the areas of: 1. sample preparation and/or sample introduction, 2. size reduction of mass analyzers and ionization sources, 3. system automation, and 4. decreased system cost. Development work in areas 2 through 4, enumerated above, continues in the private and public sectors. LANL is focusing on developing sample preparation/sample introduction methods for use with the different sample types anticipated for safeguards applications. Addressing sample handling and sample preparation methods for MS analysis will enable use of new MS instrumentation as it becomes commercially available. As one example, we have developed a rapid sample preparation method for dissolution of uranium and plutonium oxides using ammonium bifluoride (ABF). ABF is a significantly safer and faster alternative to digestion with boiling combinations of highly concentrated mineral acids. Actinides digested with ABF yield fluorides, which can then be analyzed directly or chemically converted and separated using established column chromatography techniques as needed prior to isotope analysis. The reagent volumes and the sample processing steps associated with ABF sample digestion lend themselves to automation and field

  20. Influence of DC plasma modification on the selected properties and the geometrical surface structure of polylactide prior to autocatalytic metallization

    Energy Technology Data Exchange (ETDEWEB)

    Moraczewski, Krzysztof, E-mail: kmm@ukw.edu.pl [Kazimierz Wielki University, Chodkiewicza 30, 85-064 Bydgoszcz (Poland); Rytlewski, Piotr [Kazimierz Wielki University, Chodkiewicza 30, 85-064 Bydgoszcz (Poland); Malinowski, Rafał [Institute for Engineering of Polymer Materials and Dyes, Marii Skłodowskiej-Curie 55, 87-100 Toruń (Poland); Tracz, Adam [Centre for Molecular and Macromolecular Studies of the Polish Academy of Sciences, Sienkiewicza 112, 90-363 Łódź (Poland); Żenkiewicz, Marian [Institute for Engineering of Polymer Materials and Dyes, Marii Skłodowskiej-Curie 55, 87-100 Toruń (Poland)

    2015-03-01

    The paper presents the results of studies to determine the applicability of plasma modification in the process of polylactide (PLA) surface preparation prior to autocatalytic metallization. The polylactide plasma modification was carried out in an oxygen or nitrogen atmosphere. The samples were examined with the following methods: scanning electron microscopy (SEM), atomic force microscopy (AFM), goniometry, and X-ray photoelectron spectroscopy (XPS). Scanning electron microscopy and atomic force microscopy images are presented, along with the results of surface free energy calculations performed on the basis of the contact angle measurements. The results of the qualitative (degree of oxidation or nitridation) and quantitative analysis of the chemical composition of the polylactide surface layer are also described. The results of the studies show that DC plasma modification performed under the proposed conditions is suitable as a method of surface preparation for polylactide metallization. - Highlights: • We modified the polylactide surface layer with plasma generated in oxygen or nitrogen. • We tested selected properties and the surface structure of the modified samples. • DC plasma modification can be used to prepare the PLA surface for metallization. • For better results, metallization should be preceded by a sonication process.

  1. Virtual sampling in variational processing of Monte Carlo simulation in a deep neutron penetration problem

    International Nuclear Information System (INIS)

    Allagi, Mabruk O.; Lewins, Jeffery D.

    1999-01-01

    In a further study of virtually processed Monte Carlo estimates in neutron transport, a shielding problem has been studied. The use of virtual sampling to estimate the importance function at a certain point in phase space depends on the presence of neutrons from the real source at that point, but in deep penetration problems, few neutrons will reach regions far away from the source. To overcome this problem, two suggestions are considered: (1) virtual sampling is used as far as the real neutrons reach, and fictitious sampling is then introduced for the remaining regions, distributed over all of them; or (2) a single fictitious source is placed where the real neutrons almost terminate, and virtual sampling is then used in the same way as for the real source. Variational processing is again found to improve the Monte Carlo estimates, performing best when using one fictitious source in the far regions with virtual sampling (option 2). When fictitious sources are used to estimate the importances in regions far away from the source, the proportion of fictitious to real sources has to be optimized, weighed against accuracy and computational cost. It has been found in this study that the optimum number of cells to be treated by fictitious sampling is problem dependent, but as a rule of thumb, fictitious sampling should be employed in regions where the number of neutrons from the real source falls below a specified limit for good statistics

  2. Unambiguous range-Doppler LADAR processing using 2 giga-sample-per-second noise waveforms

    International Nuclear Information System (INIS)

    Cole, Z.; Roos, P.A.; Berg, T.; Kaylor, B.; Merkel, K.D.; Babbitt, W.R.; Reibel, R.R.

    2007-01-01

    We demonstrate sub-nanosecond range and unambiguous sub-50-Hz Doppler resolved laser radar (LADAR) measurements using spectral holographic processing in rare-earth ion doped crystals. The demonstration utilizes pseudo-random-noise 2 giga-sample-per-second baseband waveforms modulated onto an optical carrier

  3. Unambiguous range-Doppler LADAR processing using 2 giga-sample-per-second noise waveforms

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Z. [S2 Corporation, 2310 University Way 4-1, Bozeman, MT 59715 (United States)]. E-mail: cole@s2corporation.com; Roos, P.A. [Spectrum Lab, Montana State University, P.O. Box 173510, Bozeman, MT 59717 (United States); Berg, T. [S2 Corporation, 2310 University Way 4-1, Bozeman, MT 59715 (United States); Kaylor, B. [S2 Corporation, 2310 University Way 4-1, Bozeman, MT 59715 (United States); Merkel, K.D. [S2 Corporation, 2310 University Way 4-1, Bozeman, MT 59715 (United States); Babbitt, W.R. [Spectrum Lab, Montana State University, P.O. Box 173510, Bozeman, MT 59717 (United States); Reibel, R.R. [S2 Corporation, 2310 University Way 4-1, Bozeman, MT 59715 (United States)

    2007-11-15

    We demonstrate sub-nanosecond range and unambiguous sub-50-Hz Doppler resolved laser radar (LADAR) measurements using spectral holographic processing in rare-earth ion doped crystals. The demonstration utilizes pseudo-random-noise 2 giga-sample-per-second baseband waveforms modulated onto an optical carrier.

  4. Electrodeposition as an alternate method for preparation of environmental samples for iodide by AMS

    Energy Technology Data Exchange (ETDEWEB)

    Adamic, M.L., E-mail: Mary.Adamic@inl.gov [Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83402 (United States); Lister, T.E.; Dufek, E.J.; Jenson, D.D.; Olson, J.E. [Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83402 (United States); Vockenhuber, C. [Laboratory of Ion Beam Physics, ETH Zurich, Otto-Stern-Weg 5, 8093 Zurich (Switzerland); Watrous, M.G. [Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83402 (United States)

    2015-10-15

    This paper presents an evaluation of an alternate method for preparing environmental samples for {sup 129}I analysis by accelerator mass spectrometry (AMS) at Idaho National Laboratory. The optimal sample preparation method is characterized by ease of preparation, capability of processing very small quantities of iodide, and ease of loading into a cathode. Electrodeposition of iodide on a silver wire was evaluated using these criteria. This study indicates that the electrochemically-formed silver iodide deposits produce ion currents similar to those from precipitated silver iodide for the same sample mass. Precipitated silver iodide samples are usually mixed with niobium or silver powder prior to loading in a cathode. Using electrodeposition, the silver is already mixed with the sample and can simply be picked up with tweezers, placed in the sample die, and pressed into a cathode. The major advantage of this method is that the silver wire/electrodeposited silver iodide is much easier to load into a cathode.

  5. Electrodeposition as an alternate method for preparation of environmental samples for iodide by AMS

    International Nuclear Information System (INIS)

    Adamic, M.L.; Lister, T.E.; Dufek, E.J.; Jenson, D.D.; Olson, J.E.; Vockenhuber, C.; Watrous, M.G.

    2015-01-01

    This paper presents an evaluation of an alternate method for preparing environmental samples for ¹²⁹I analysis by accelerator mass spectrometry (AMS) at Idaho National Laboratory. The optimal sample preparation method is characterized by ease of preparation, capability of processing very small quantities of iodide, and ease of loading into a cathode. Electrodeposition of iodide on a silver wire was evaluated using these criteria. This study indicates that the electrochemically-formed silver iodide deposits produce ion currents similar to those from precipitated silver iodide for the same sample mass. Precipitated silver iodide samples are usually mixed with niobium or silver powder prior to loading in a cathode. Using electrodeposition, the silver is already mixed with the sample and can simply be picked up with tweezers, placed in the sample die, and pressed into a cathode. The major advantage of this method is that the silver wire/electrodeposited silver iodide is much easier to load into a cathode.

  6. Output Information Based Fault-Tolerant Iterative Learning Control for Dual-Rate Sampling Process with Disturbances and Output Delay

    Directory of Open Access Journals (Sweden)

    Hongfeng Tao

    2018-01-01

    Full Text Available For a class of single-input single-output (SISO) dual-rate sampling processes with disturbances and output delay, this paper presents a robust fault-tolerant iterative learning control algorithm based on output information. Firstly, the dual-rate sampling process with output delay is transformed, using lifting technology, into a discrete state-space model at the slow sampling rate without time delay; then an output-information-based fault-tolerant iterative learning control scheme is designed and the control process is turned into an equivalent two-dimensional (2D) repetitive process. Moreover, based on repetitive process stability theory, sufficient conditions for the stability of the system and a design method for the robust controller are given in terms of the linear matrix inequality (LMI) technique. Finally, flow control simulations of two flow tanks in series demonstrate the feasibility and effectiveness of the proposed method.
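
    The lifting step can be illustrated for a fast-rate state-space model (A, B, C) whose input updates r times per slow output period. The generic construction below (not necessarily the paper's exact formulation) stacks the r fast inputs into one slow-rate input:

        import numpy as np

        def lift_dual_rate(A, B, C, r):
            # Slow-rate model: x[k+1] = A_r x[k] + B_r u_k, y[k] = C x[k],
            # where u_k stacks the r fast input samples in one slow period.
            A_r = np.linalg.matrix_power(A, r)
            B_r = np.hstack([np.linalg.matrix_power(A, r - 1 - i) @ B
                             for i in range(r)])
            return A_r, B_r, C

        A = np.array([[0.9, 0.1], [0.0, 0.8]])
        B = np.array([[0.0], [1.0]])
        C = np.array([[1.0, 0.0]])
        A2, B2, C2 = lift_dual_rate(A, B, C, r=2)   # output sampled half as often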

  7. Evaluating a Collaborative Approach to Improve Prior Authorization Efficiency in the Treatment of Hepatitis C Virus.

    Science.gov (United States)

    Dunn, Emily E; Vranek, Kathryn; Hynicka, Lauren M; Gripshover, Janet; Potosky, Darryn; Mattingly, T Joseph

    A team-based approach to obtaining prior authorization approval was implemented utilizing a specialty pharmacy, a clinic-based pharmacy technician specialist, and a registered nurse to work with providers to obtain approval for medications for hepatitis C virus (HCV) infection. The objective of this study was to evaluate the time to approval for prescribed treatment of HCV infection. A retrospective observational study was conducted including patients treated for HCV infection by clinic providers who received at least 1 oral direct-acting antiviral HCV medication. Patients were divided into 2 groups, based on whether they were treated before or after the implementation of the team-based approach. Student t tests were used to compare average wait times before and after the intervention. The sample included 180 patients, 68 treated before the intervention and 112 patients who initiated therapy after. All patients sampled required prior authorization approval by a third-party payer to begin therapy. There was a statistically significant reduction (P = .02) in average wait time in the postintervention group (15.6 ± 12.1 days) once adjusted using dates of approval. Pharmacy collaboration may provide increases in efficiency in provider prior authorization practices and reduced wait time for patients to begin treatment.

  8. A NOVEL TECHNIQUE TO IMPROVE PHOTOMETRY IN CONFUSED IMAGES USING GRAPHS AND BAYESIAN PRIORS

    International Nuclear Information System (INIS)

    Safarzadeh, Mohammadtaher; Ferguson, Henry C.; Lu, Yu; Inami, Hanae; Somerville, Rachel S.

    2015-01-01

    We present a new technique for overcoming confusion noise in deep far-infrared Herschel space telescope images by making use of prior information from shorter-wavelength (λ < 2 μm) data. For the deepest images obtained by Herschel, the flux limit due to source confusion is about a factor of three brighter than the flux limit due to instrumental noise and (smooth) sky background. We have investigated the possibility of de-confusing simulated Herschel PACS 160 μm images by using strong Bayesian priors on the positions and weak priors on the fluxes of sources. We identify blended sources, group them together, and simultaneously fit their fluxes. We derive the posterior probability distribution function of the fluxes subject to these priors through Markov Chain Monte Carlo (MCMC) sampling by fitting the image. Assuming we can predict the FIR flux of sources based on the ultraviolet-optical part of their SEDs to within an order of magnitude, the simulations show that we can obtain reliable fluxes and uncertainties at least a factor of three fainter than the confusion noise limit of 3σ_c = 2.7 mJy in our simulated PACS-160 image. This technique could in principle be used to mitigate the effects of source confusion in any situation where one has prior information of positions and plausible fluxes of blended sources. For Herschel, application of this technique will improve our ability to constrain the dust content in normal galaxies at high redshift
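
    The core of the approach — holding positions fixed by strong priors and sampling fluxes with MCMC — can be sketched in one dimension with a random-walk Metropolis sampler (the beam shape, noise level, and priors below are placeholders, not the PACS values):

        import numpy as np

        rng = np.random.default_rng(2)
        x = np.arange(64.0)
        psf = lambda mu: np.exp(-0.5 * ((x - mu) / 3.0) ** 2)  # known beam profile
        data = 4.0 * psf(30) + 2.5 * psf(36) + rng.normal(0, 0.3, x.size)

        pos = (30.0, 36.0)          # positions fixed by the strong prior
        sigma, f_sd = 0.3, 3.0      # noise level; weak Gaussian flux prior

        def log_post(f):
            if np.any(f < 0):
                return -np.inf      # non-negative fluxes
            model = f[0] * psf(pos[0]) + f[1] * psf(pos[1])
            return (-0.5 * np.sum((data - model) ** 2) / sigma ** 2
                    - 0.5 * np.sum(f ** 2) / f_sd ** 2)

        f, lp, chain = np.array([1.0, 1.0]), -np.inf, []
        for _ in range(20_000):
            prop = f + rng.normal(0, 0.1, size=2)   # random-walk proposal
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:
                f, lp = prop, lp_prop
            chain.append(f)
        print(np.mean(chain[5_000:], axis=0))       # posterior-mean fluxes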

  9. An approach for sampling solid heterogeneous waste at the Hanford Site waste receiving and processing and solid waste projects

    International Nuclear Information System (INIS)

    Sexton, R.A.

    1993-03-01

    This paper addresses the problem of obtaining meaningful data from samples of solid heterogeneous waste while maintaining sample rates as low as practical. The Waste Receiving and Processing Facility, Module 1, at the Hanford Site in south-central Washington State will process mostly heterogeneous solid wastes. The presence of hazardous materials is documented for some packages and unknown for others. Waste characterization is needed to segregate the waste, meet waste acceptance and shipping requirements, and meet facility permitting requirements. Sampling and analysis are expensive, and no amount of sampling will produce absolute certainty of waste contents. A sampling strategy is proposed that provides acceptable confidence with achievable sampling rates

  10. Recognition of prior learning candidates’ experiences in a nurse training programme

    Directory of Open Access Journals (Sweden)

    Nomathemba B. Mothokoa

    2018-06-01

    Full Text Available Recognition of prior learning (RPL in South Africa is critical to the development of an equitable education and training system. Historically, nursing has been known as one of the professions that provides access to the training and education of marginalised groups who have minimal access to formal education. The advent of implementing RPL in nursing has, however, not been without challenges. The purpose of this study was to explore and describe the experiences of RPL nursing candidates related to a 4-year comprehensive nursing training programme at a nursing education institution in Gauteng. An exploratory, descriptive and contextual qualitative research design was undertaken. The research sample comprised 13 purposefully selected participants. Face-to-face individual interviews, using open-ended questions, were used to collect data, which were analysed using Tesch’s approach. Recognition of prior learning candidates experienced a number of realities as adult learners. On a positive note, their prior knowledge and experience supported them in their learning endeavours. Participants, however, experienced a number of challenges on personal, interpersonal and socialisation, and educational levels. It is important that opportunities are created to support and assist RPL candidates to complete their nursing training. This support structure, among others, should include the provision of RPL-related information, giving appropriate advice, coaching and mentoring, effective administration services, integrated curriculum design, and a variety of formative and summative assessment practices.

  11. Elicitation of expert prior opinion: application to the MYPAN trial in childhood polyarteritis nodosa.

    Directory of Open Access Journals (Sweden)

    Lisa V Hampson

    Full Text Available Definitive sample sizes for clinical trials in rare diseases are usually infeasible. Bayesian methodology can be used to maximise what is learnt from clinical trials in these circumstances. We elicited expert prior opinion for a future Bayesian randomised controlled trial for a rare inflammatory paediatric disease, polyarteritis nodosa (MYPAN, Mycophenolate mofetil for polyarteritis nodosa). A Bayesian prior elicitation meeting was convened. Opinion was sought on the probability that a patient in the MYPAN trial treated with cyclophosphamide would achieve disease remission within 6 months, and on the relative efficacies of mycophenolate mofetil and cyclophosphamide. Expert opinion was combined with previously unseen data from a recently completed randomised controlled trial in ANCA-associated vasculitis. A pan-European group of fifteen experts participated in the elicitation meeting. Consensus expert prior opinion was that the most likely rates of disease remission within 6 months on cyclophosphamide or mycophenolate mofetil were 74% and 71%, respectively. This prior opinion will now be taken forward and will be modified to formulate a Bayesian posterior opinion once the MYPAN trial data from 40 patients randomised 1:1 to either CYC or MMF become available. We suggest that the methodological template we propose could be applied to trial design for other rare diseases.
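
    One common way to encode such a consensus judgement is as a Beta prior whose mode matches the elicited value; the prior "effective sample size" below is an assumed illustration, not a figure from the study:

        from scipy.stats import beta

        mode, n_eff = 0.74, 20               # elicited mode; assumed prior weight
        a = mode * (n_eff - 2) + 1           # Beta(a, b) has mode (a-1)/(a+b-2)
        b = n_eff - a
        print(beta(a, b).interval(0.9))      # 90% prior credible interval

        # Bayesian update with hypothetical trial data: 13 remissions
        # observed among 20 patients on one arm.
        post = beta(a + 13, b + 7)
        print(post.mean(), post.interval(0.9))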

  12. Y-STR analysis on DNA mixture samples--results of a collaborative project of the ENFSI DNA Working Group

    DEFF Research Database (Denmark)

    Parson, Walther; Niederstätter, Harald; Lindinger, Alexandra

    2008-01-01

    The ENFSI (European Network of Forensic Science Institutes) DNA Working Group undertook a collaborative project on Y-STR typing of DNA mixture samples that were centrally prepared and thoroughly tested prior to the shipment. Four commercial Y-STR typing kits (Y-Filer, Applied Biosystems, Foster C...) ... a laboratory-specific optimization process is indicated to reach a comparable sensitivity for the analysis of minute amounts of DNA.

  13. Browsing while reading: effects of instructional design and learners' prior knowledge

    Directory of Open Access Journals (Sweden)

    Theimo Müller-Kalthoff

    2006-12-01

    Full Text Available One of the key reasons that multimedia, and particularly hypertext systems, are gaining in importance is that they inspire hopes of optimizing learners' processes of knowledge construction. The present study is concerned with the respective influence of individual learner variables (i.e. particularly domain-specific prior knowledge) on the use of different design attributes. Thirty-six university students worked through a hierarchically structured two-part hypertext about the psychology of memory under two experimental browsing conditions (reduced versus free browsing). Results show that deeper-level comprehension (i.e. structural knowledge) was predicted by the interaction of experimental condition and prior knowledge, but that simply retaining facts was not. Participants with low prior knowledge performed better on the comprehension test if they had worked on the version with reduced access. Moreover, the version with reduced access helped to reduce feelings of disorientation. The measure of disorientation also appeared to be closely linked with the individual's computer experience, self-concept of computer ability and subject-related interest. The main implications for educational practice relate to the design of an adaptive multimedia and hypertext learning system and successful learning with it.

  14. Automated sample-processing and titration system for determining uranium in nuclear materials

    International Nuclear Information System (INIS)

    Harrar, J.E.; Boyle, W.G.; Breshears, J.D.; Pomernacki, C.L.; Brand, H.R.; Kray, A.M.; Sherry, R.J.; Pastrone, J.A.

    1977-01-01

    The system is designed for accurate, precise, and selective determination of from 10 to 180 mg of uranium in 2 to 12 cm³ of solution. Samples, standards, and their solutions are handled on a weight basis. These weights, together with their appropriate identification numbers, are stored in computer memory and are used automatically in the assay calculations after each titration. The measurement technique (controlled-current coulometry) is based on the Davies-Gray and New Brunswick Laboratory method, in which U(VI) is reduced to U(IV) in strong H₃PO₄, followed by titration of the U(IV) with electrogenerated V(V). Solution pretreatment and titration are automatic. The analyzer is able to process 44 samples per loading of the sample changer, at a rate of 4 to 9 samples per hour. The system includes a comprehensive fault-monitoring system that detects analytical errors, guards against abnormal conditions which might cause errors, and prevents unsafe operation. A detailed description of the system, information on the reliability of the component subsystems, and a summary of its evaluation by the New Brunswick Laboratory are presented

  15. Advanced liquid waste processing technologies: Theoretical versus actual application

    International Nuclear Information System (INIS)

    Barker, Tracy A.

    1992-01-01

    This paper provides an overview of Chem-Nuclear Systems, Inc. (CNSI) experience with turn-key chromate removal at the Maine Yankee Nuclear Plant. Theoretical and actual experiences are addressed on topics such as processing duration, laboratory testing, equipment requirements, chromate removal, waste generation, and waste processing. Chromate salts are used in industrial recirculating cooling water systems as a corrosion inhibitor. However, chromates are toxic at the concentrations necessary for surface inhibition. As a result, Chem-Nuclear was contracted to perform turn-key chromate removal and waste disposal by demineralization. This project was unique in that, prior to on-site mobilization, a composite sample of chromated waste was shipped to CNSI laboratories for treatment through a laboratory-scale system. Removal efficiency, process media requirements, and waste processing methodology were determined from this laboratory testing. Samples of the waste resulting from this testing were processed by dewatering and solidification. TCLP tests were performed on the actual processed waste, and based on the TCLP results, pre-approval for media waste disposal was obtained. (author)

  16. Offending prior to first psychiatric contact

    DEFF Research Database (Denmark)

    Stevens, H; Agerbo, E; Dean, K

    2012-01-01

    There is a well-established association between psychotic disorders and subsequent offending, but the extent to which those who develop psychosis might have a prior history of offending is less clear. Little is known about whether the association between illness and offending exists in non-psychotic disorders. The aim of this study was to determine whether the association between mental disorder and offending is present prior to illness onset in both psychotic and non-psychotic disorders.

  17. Approach-Induced Biases in Human Information Sampling.

    Directory of Open Access Journals (Sweden)

    Laurence T Hunt

    2016-11-01

    Full Text Available Information sampling is often biased towards seeking evidence that confirms one's prior beliefs. Despite such biases being a pervasive feature of human behavior, their underlying causes remain unclear. Many accounts of these biases appeal to limitations of human hypothesis testing and cognition, de facto evoking notions of bounded rationality, but neglect more basic aspects of behavioral control. Here, we investigated a potential role for Pavlovian approach in biasing which information humans will choose to sample. We collected a large novel dataset from 32,445 human subjects, making over 3 million decisions, who played a gambling task designed to measure the latent causes and extent of information-sampling biases. We identified three novel approach-related biases, formalized by comparing subject behavior to a dynamic programming model of optimal information gathering. These biases reflected the amount of information sampled ("positive evidence approach"), the selection of which information to sample ("sampling the favorite"), and the interaction between information sampling and subsequent choices ("rejecting unsampled options"). The prevalence of all three biases was related to a Pavlovian approach-avoid parameter quantified within an entirely independent economic decision task. Our large dataset also revealed that individual differences in the amount of information gathered are a stable trait across multiple gameplays and can be related to demographic measures, including age and educational attainment. As well as revealing limitations in cognitive processing, our findings suggest information sampling biases reflect the expression of primitive, yet potentially ecologically adaptive, behavioral repertoires. One such behavior is sampling from options that will eventually be chosen, even when other sources of information are more pertinent for guiding future action.

  18. Terminology for pregnancy loss prior to viability

    DEFF Research Database (Denmark)

    Kolte, A M; Bernardi, L A; Christiansen, O B

    2015-01-01

    Pregnancy loss prior to viability is common and research in the field is extensive. Unfortunately, terminology in the literature is inconsistent. The lack of consensus regarding nomenclature and classification of pregnancy loss prior to viability makes it difficult to compare study results from different centres. In our opinion, terminology and definitions should be based on clinical findings, and when possible, transvaginal ultrasound. With this Early Pregnancy Consensus Statement, it is our goal to provide clear and consistent terminology for pregnancy loss prior to viability.

  19. Sampling and Hydrogeology of the Vadose Zone Beneath the 300 Area Process Ponds

    International Nuclear Information System (INIS)

    Bjornstad, Bruce N.

    2004-01-01

    Four open pits were dug with a backhoe into the vadose zone beneath the former 300 Area Process Ponds in April 2003. Samples were collected about every 2 feet for physical, chemical, and/or microbiological characterization. This report presents a stratigraphic and geohydrologic summary of the four excavations

  20. A method for disaggregating clay concretions and eliminating formalin smell in the processing of sediment samples

    DEFF Research Database (Denmark)

    Cedhagen, Tomas

    1989-01-01

    A complete handling procedure for processing sediment samples is described. It includes some improvements of conventional methods. The fixed sediment sample is mixed with a solution of the alkaline detergent AJAX® (Colgate-Palmolive). It is kept at 80-90 °C for 20-40 min. This treatment facilitates...

  1. Guidelines for the processing and quality assurance of benthic invertebrate samples collected as part of the National Water-Quality Assessment Program

    Science.gov (United States)

    Cuffney, T.F.; Gurtz, M.E.; Meador, M.R.

    1993-01-01

    Benthic invertebrate samples are collected as part of the U.S. Geological Survey's National Water-Quality Assessment Program. This is a perennial, multidisciplinary program that integrates biological, physical, and chemical indicators of water quality to evaluate status and trends and to develop an understanding of the factors controlling observed water quality. The Program examines water quality in 60 study units (coupled ground- and surface-water systems) that encompass most of the conterminous United States and parts of Alaska and Hawaii. Study-unit teams collect and process qualitative and semi-quantitative invertebrate samples according to standardized procedures. These samples are processed (elutriated and subsampled) in the field to produce as many as four sample components: large-rare, main-body, elutriate, and split. Each sample component is preserved in 10-percent formalin, and two components, large-rare and main-body, are sent to contract laboratories for further processing. The large-rare component is composed of large invertebrates that are removed from the sample matrix during field processing and placed in one or more containers. The main-body sample component consists of the remaining sample materials (sediment, detritus, and invertebrates) and is subsampled in the field to achieve a volume of 750 milliliters or less. The remaining two sample components, elutriate and split, are used for quality-assurance and quality-control purposes. Contract laboratories are used to identify and quantify invertebrates from the large-rare and main-body sample components according to the procedures and guidelines specified within this document. These guidelines allow the use of subsampling techniques to reduce the volume of sample material processed and to facilitate identifications. These processing procedures and techniques may be modified if the modifications provide equal or greater levels of accuracy and precision. The intent of sample processing is to

  2. Lessons Learned from Preparing OSIRIS-REx Spectral Analog Samples for Bennu

    Science.gov (United States)

    Schrader, D. L.; McCoy, T. J.; Cody, G. D.; King, A. J.; Schofield, P. F.; Russell, S. S.; Connolly, H. C., Jr.; Keller, L. P.; Donaldson Hanna, K.; Bowles, N.; et al.

    2017-01-01

    NASA's OSIRIS-REx sample return mission launched on September 8th, 2016 to rendezvous with B-type asteroid (101955) Bennu in 2018. Type C and B asteroids have been linked to carbonaceous chondrites because of their similar visible-to-near-infrared (VIS-NIR) spectral properties [e.g., 1,2]. The OSIRIS-REx Visible and Infrared Spectrometer (OVIRS) and the Thermal Emission Spectrometer (OTES) will make spectroscopic observations of Bennu during the encounter. Constraining the presence or absence of hydrous minerals (e.g., Ca-carbonate, phyllosilicates) and organic molecules will be key to characterizing Bennu [3] prior to sample site selection. The goal of this study was to develop a suite of analog and meteorite samples and obtain their spectral properties over the wavelength ranges of OVIRS (0.4-4.3 micrometers) and OTES (5.0-50 micrometers). These spectral data were used to validate the mission science-data processing system. We discuss the reasoning behind the study and share lessons learned.

  3. Using Linked Survey Paradata to Improve Sampling Strategies in the Medical Expenditure Panel Survey

    Directory of Open Access Journals (Sweden)

    Mirel Lisa B.

    2017-06-01

    Full Text Available Using paradata from a prior survey that is linked to a new survey can help a survey organization develop more effective sampling strategies. One example of this type of linkage or subsampling is between the National Health Interview Survey (NHIS) and the Medical Expenditure Panel Survey (MEPS). MEPS is a nationally representative sample of the U.S. civilian, noninstitutionalized population based on a complex multi-stage sample design. Each year a new sample is drawn as a subsample of households from the prior year’s NHIS. The main objective of this article is to examine how paradata from a prior survey can be used in developing a sampling scheme in a subsequent survey. A framework for optimal allocation of the sample in substrata formed for this purpose is presented and evaluated for the relative effectiveness of alternative substratification schemes. The framework is applied, using real MEPS data, to illustrate how utilizing paradata from the linked survey offers the possibility of making improvements to the sampling scheme for the subsequent survey. The improvements aim to reduce the data collection costs while maintaining or increasing effective responding sample sizes and response rates for a harder to reach population.
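
    For instance, once paradata define substrata with known sizes and estimated outcome variability, the classical Neyman allocation gives the optimal split of a fixed sample size; the stratum figures below are invented for illustration:

        import numpy as np

        def neyman_allocation(N_h, S_h, n):
            # Allocate n units across strata proportionally to N_h * S_h,
            # which minimizes the variance of the stratified mean for a
            # fixed total sample size (ignoring cost differences).
            w = np.asarray(N_h, float) * np.asarray(S_h, float)
            return np.round(n * w / w.sum()).astype(int)

        # Hypothetical substrata built from linked NHIS paradata,
        # e.g. grouped by predicted response propensity:
        print(neyman_allocation(N_h=[4000, 2500, 1500], S_h=[1.0, 1.4, 2.2], n=900))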

  4. Obtaining value prior to pulping with diethyl oxalate and oxalic acid

    Science.gov (United States)

    W.R. Kenealy; E. Horn; C.J. Houtman; J. Laplaza; T.W. Jeffries

    2007-01-01

    Pulp and paper are converted to paper products with yields of paper dependent on the wood and the process used. Even with high yield pulps there are conversion losses and with chemical pulps the yields approach 50%. The portions of the wood that do not provide product are either combusted to generate power and steam or incur a cost in waste water treatment. Value prior...

  5. Sample processing, protocol, and statistical analysis of the time-of-flight secondary ion mass spectrometry (ToF-SIMS) of protein, cell, and tissue samples.

    Science.gov (United States)

    Barreto, Goncalo; Soininen, Antti; Sillat, Tarvo; Konttinen, Yrjö T; Kaivosoja, Emilia

    2014-01-01

    Time-of-flight secondary ion mass spectrometry (ToF-SIMS) is increasingly being used in the analysis of biological samples. For example, it has been applied to distinguish healthy and osteoarthritic human cartilage. This chapter discusses the ToF-SIMS principle and instrumentation, including the three modes of analysis in ToF-SIMS. ToF-SIMS sets certain requirements for the samples to be analyzed; for example, the samples have to be vacuum compatible. Accordingly, sample processing steps for different biological samples (proteins, cells, frozen and paraffin-embedded tissues, and extracellular matrix) prior to ToF-SIMS are presented. Multivariate analysis of the ToF-SIMS data and the necessary data preprocessing steps (peak selection, data normalization, mean-centering, and scaling and transformation) are also discussed in this chapter.
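
    A typical preprocessing chain of that kind might look as follows in Python (the specific normalization and scaling choices vary by study; these are common defaults, not the chapter's prescribed settings):

        import numpy as np

        def preprocess(spectra):
            # spectra: (n_samples, n_peaks) peak-intensity matrix.
            X = spectra / spectra.sum(axis=1, keepdims=True)  # total-ion-count normalization
            X = np.sqrt(X)                                    # variance-stabilizing transform
            X = X - X.mean(axis=0)                            # mean-centering
            return X / X.std(axis=0, ddof=1)                  # unit-variance scaling

        # Multivariate analysis, e.g. PCA via SVD on the preprocessed matrix:
        X = preprocess(np.random.default_rng(3).gamma(2.0, size=(30, 200)))
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        scores = U * s                                        # sample scores on the PCs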

  6. Influences of sampling effort on detected patterns and structuring processes of a Neotropical plant-hummingbird network.

    Science.gov (United States)

    Vizentin-Bugoni, Jeferson; Maruyama, Pietro K; Debastiani, Vanderlei J; Duarte, L da S; Dalsgaard, Bo; Sazima, Marlies

    2016-01-01

    Virtually all empirical ecological interaction networks to some extent suffer from undersampling. However, how limitations imposed by sampling incompleteness affect our understanding of ecological networks is still poorly explored, which may hinder further advances in the field. Here, we use a plant-hummingbird network with unprecedented sampling effort (2716 h of focal observations) from the Atlantic Rainforest in Brazil, to investigate how sampling effort affects the description of network structure (i.e. widely used network metrics) and the relative importance of distinct processes (i.e. species abundances vs. traits) in determining the frequency of pairwise interactions. By dividing the network into time slices representing a gradient of sampling effort, we show that quantitative metrics, such as interaction evenness, specialization (H2′), weighted nestedness (wNODF) and modularity (Q; QuanBiMo algorithm) were less biased by sampling incompleteness than binary metrics. Furthermore, the significance of some network metrics changed along the sampling effort gradient. Nevertheless, the higher importance of traits in structuring the network was apparent even with small sampling effort. Our results (i) warn against using very poorly sampled networks as this may bias our understanding of networks, both their patterns and structuring processes, (ii) encourage the use of quantitative metrics little influenced by sampling when performing spatio-temporal comparisons and (iii) indicate that in networks strongly constrained by species traits, such as plant-hummingbird networks, even small sampling is sufficient to detect their relative importance for the frequencies of interactions. Finally, we argue that similar effects of sampling are expected for other highly specialized subnetworks. © 2015 The Authors. Journal of Animal Ecology © 2015 British Ecological Society.

  7. Is the Near-Earth Current Sheet Prior to Reconnection Unstable to Tearing Mode?

    International Nuclear Information System (INIS)

    Xin-Hua, Wei; Jin-Bin, Cao; Guo-Cheng, Zhou; Hui-Shan, Fu

    2010-01-01

    The tearing mode instability plays a key role in the triggering process of reconnection. The triggering of collisionless tearing mode instability has been analyzed theoretically and numerically by many researchers. However, due to the difficulty of obtaining the observational wave number, it is still unknown whether the tearing mode instability can be excited in an actual plasma sheet prior to reconnection onset. Using data from the four Cluster satellites prior to a magnetospheric reconnection event on 13 September 2002, we utilized the wave telescope technique to obtain the wave number corresponding to the peak of the power spectral density. The wavelength is about 18 R_E (Earth radii) and is consistent with previous theoretical and numerical results. After substituting the wave vector and other necessary parameters of the observed current sheet into the triggering condition of the tearing mode instability, we find that the near-Earth current sheet prior to reconnection is unstable to the tearing mode. (geophysics, astronomy, and astrophysics)

  8. A Bayesian approach to PET reconstruction using image-modeling Gibbs priors: Implementation and comparison

    International Nuclear Information System (INIS)

    Chan, M.T.; Herman, G.T.; Levitan, E.

    1996-01-01

    We demonstrate that (i) classical methods of image reconstruction from projections can be improved upon by considering the output of such a method as a distorted version of the original image and applying a Bayesian approach to estimate from it the original image (based on a model of distortion and on a Gibbs distribution as the prior) and (ii) by selecting an "image-modeling" prior distribution (i.e., one which is such that a random sample from it is likely to share important characteristics of the images of the application area) one can improve over another Gibbs prior formulated using only pairwise interactions. We illustrate our approach using simulated Positron Emission Tomography (PET) data from realistic brain phantoms. Since algorithm performance ultimately depends on the diagnostic task being performed, we examine a number of different medically relevant figures of merit to give a fair comparison. Based on a training-and-testing evaluation strategy, we demonstrate that statistically significant improvements can be obtained using the proposed approach
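
    The baseline being improved upon can be written compactly: a maximum a posteriori estimate that balances a Poisson log-likelihood against a pairwise-interaction Gibbs prior. A minimal sketch of that objective (the image-modeling prior itself is more elaborate and is not reproduced here):

        import numpy as np

        def neg_log_posterior(image, counts, system, beta=0.1):
            # counts: measured projection data; system: system matrix mapping
            # the image to projections; beta: Gibbs prior weight.
            proj = system @ image.ravel()
            log_like = np.sum(counts * np.log(proj + 1e-12) - proj)
            # Pairwise prior: penalize squared differences between neighbors.
            pairwise = ((np.diff(image, axis=0) ** 2).sum()
                        + (np.diff(image, axis=1) ** 2).sum())
            return -(log_like - beta * pairwise)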

  9. TECHNIQUES WITH POTENTIAL FOR HANDLING ENVIRONMENTAL SAMPLES IN CAPILLARY ELECTROPHORESIS

    Science.gov (United States)

    An assessment of the methods for handling environmental samples prior to capillary electrophoresis (CE) is presented for both aqueous and solid matrices. Sample handling in environmental analyses is the subject of ongoing research at the Environmental Protection Agency's National...

  10. Waste retrieval sluicing system vapor sampling and analysis plan for evaluation of organic emissions, process test phase III

    International Nuclear Information System (INIS)

    SASAKI, L.M.

    1999-01-01

    This sampling and analysis plan identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for vapor samples obtained to address vapor issues related to the sluicing of tank 241-C-106. Sampling will be performed in accordance with Waste Retrieval Sluicing System Emissions Collection Phase III (Jones 1999) and Process Test Plan Phase III, Waste Retrieval Sluicing System Emissions Collection (Powers 1999). Analytical requirements include those specified in Request for Ecology Concurrence on Draft Strategy/Path Forward to Address Concerns Regarding Organic Emissions from C-106 Sluicing Activities (Peterson 1998). The Waste Retrieval Sluicing System was installed to retrieve and transfer high-heat sludge from tank 241-C-106 to tank 241-AY-102, which is designed for high-heat waste storage. During initial sluicing of tank 241-C-106 in November 1998, operations were halted due to the detection of unexpectedly high levels of volatile organic compounds in emissions, which exceeded regulatory permit limits. Several workers also reported smelling sharp odors and throat irritation. Vapor grab samples from the 296-C-006 ventilation system were taken as soon as possible after detection; the analyses indicated that volatile and semi-volatile organic compounds were present. In December 1998, a process test (phase I) was conducted in which the pumps in tanks 241-C-106 and 241-AY-102 were operated and vapor samples obtained to determine constituents that may be present during active sluicing of tank 241-C-106. The process test was suspended when a jumper leak was detected. On March 7, 1999, phase II of the process test was performed; the sluicing system was operated for approximately 7 hours and was ended using the controlled shutdown method when the allowable amount of solids had been transferred to 241-AY-102. The phase II test was successful; however, further testing is required to obtain vapor samples at higher emission levels

  11. Employment Interventions for Individuals with ASD: The Relative Efficacy of Supported Employment With or Without Prior Project SEARCH Training.

    Science.gov (United States)

    Schall, Carol M; Wehman, Paul; Brooke, Valerie; Graham, Carolyn; McDonough, Jennifer; Brooke, Alissa; Ham, Whitney; Rounds, Rachael; Lau, Stephanie; Allen, Jaclyn

    2015-12-01

    This paper presents findings from a retrospective observational records review study that compares the outcomes associated with implementation of supported employment (SE) with and without prior Project SEARCH with ASD Supports (PS-ASD) on wages earned, time spent in intervention, and job retention. Results suggest that SE resulted in competitive employment for 45 adults with ASD. Twenty-five individuals received prior intervention through PS-ASD while the other 20 individuals received SE only. Individuals in this sample who received PS-ASD required fewer hours of intervention. Additionally, individuals in the PS-ASD group achieved a mean higher wage and had higher retention rates than their peers who received SE only. Further research with a larger sample is needed to confirm these findings.

  12. Prior Knowledge Assessment Guide

    Science.gov (United States)

    2014-12-01

    ...assessment in a reasonable amount of time. Hands-on assessments can be extremely diverse in makeup and administration depending on the subject matter...

  13. Tank vapor sampling and analysis data package for tank 241-C-106 waste retrieval sluicing system process test phase III, sampled March 28, 1999

    International Nuclear Information System (INIS)

    LOCKREM, L.L.

    1999-01-01

    This data package presents sampling data and analytical results from the March 28, 1999, vapor sampling of Hanford Site single-shell tank 241-C-106 during active sluicing. Samples were obtained from the 296-C-006 ventilation system stack and ambient air at several locations. Characterization Project Operations (CPO) was responsible for the collection of all SUMMA™ canister samples. The Special Analytical Support (SAS) vapor team was responsible for the collection of all triple sorbent trap (TST), sorbent tube train (STT), polyurethane foam (PUF), and particulate filter samples collected at the 296-C-006 stack. The SAS vapor team used the non-electrical vapor sampling (NEVS) system to collect samples of the air, gases, and vapors from the 296-C-006 stack. The SAS vapor team collected and analyzed these samples for Lockheed Martin Hanford Corporation (LMHC) and Tank Waste Remediation System (TWRS) in accordance with the sampling and analytical requirements specified in the Waste Retrieval Sluicing System Vapor Sampling and Analysis Plan (SAP) for Evaluation of Organic Emissions, Process Test Phase III, HNF-4212, Rev. 0-A, (LMHC, 1999). All samples were stored in a secured Radioactive Materials Area (RMA) until the samples were radiologically released and received by SAS for analysis. The Waste Sampling and Characterization Facility (WSCF) performed the radiological analyses. The samples were received on April 5, 1999

  14. On the development of automatic sample preparation devices

    International Nuclear Information System (INIS)

    Oesselmann, J.

    1987-01-01

    Modern mass spectrometers for stable isotope analysis offer accurate isotope ratio results from gaseous samples (CO₂, N₂, H₂, SO₂) in a completely automated fashion. However, most samples of interest either are associated with contaminant gases or the gas has to be liberated by a chemical procedure prior to measurement. In most laboratories this sample preparation step is performed manually. As a consequence, sample throughput is rather low and - despite skilful operation - the preparation procedure varies slightly from one sample to the next, affecting mainly the reproducibility of the data. (author)

  15. GNS Castor V/21 Headspace Gas Sampling 2014

    Energy Technology Data Exchange (ETDEWEB)

    Winston, Philip Lon [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-01-01

    Prior to performing an internal visual inspection, samples of the headspace gas of the GNS Castor V/21 cask were taken on June 12, 2014. These samples were taken in support of the CRIEPI/Japanese nuclear industry effort to validate fuel integrity without visual inspection by measuring the ⁸⁵Kr content of the cask headspace.

  16. Microstructure and Macrosegregation Study of Directionally Solidified Al-7Si Samples Processed Terrestrially and Aboard the International Space Station

    Science.gov (United States)

    Angart, Samuel; Erdman, R. G.; Poirier, David R.; Tewari, S.N.; Grugel, R. N.

    2014-01-01

    This talk reports research that has been carried out under the aegis of NASA as part of a collaboration between ESA and NASA for solidification experiments on the International Space Station (ISS). The focus has been on the effect of convection on microstructural evolution and macrosegregation in hypoeutectic Al-Si alloys during directional solidification (DS). The DS experiments have been carried out under 1 g at Cleveland State University (CSU) and under low g on the ISS. The thermal processing history of the experiments is well defined for both the terrestrially processed samples and the ISS-processed samples. We have observed that the primary dendrite arm spacings of two samples grown in the low-g environment of the ISS show good agreement with a dendrite-growth model based on diffusion-controlled growth. The gravity-driven convection (i.e., thermosolutal convection) in terrestrially grown samples has the effect of decreasing the primary dendrite arm spacings and causes macrosegregation. In order to process DS samples aboard the ISS, dendritic seed crystals have to be partially remelted in a stationary thermal gradient before the DS is carried out. Microstructural changes and macrosegregation effects during this period are described.

  17. Defining And Characterizing Sample Representativeness For DWPF Melter Feed Samples

    Energy Technology Data Exchange (ETDEWEB)

    Shine, E. P.; Poirier, M. R.

    2013-10-29

    Representative sampling is important throughout the Defense Waste Processing Facility (DWPF) process, and the demonstrated success of the DWPF process to achieve glass product quality over the past two decades is a direct result of the quality of information obtained from the process. The objective of this report was to present sampling methods that the Savannah River Site (SRS) used to qualify waste being dispositioned at the DWPF. The goal was to emphasize the methodology, not a list of outcomes from those studies. This methodology includes proven methods for taking representative samples, the use of controlled analytical methods, and data interpretation and reporting that considers the uncertainty of all error sources. Numerous sampling studies were conducted during the development of the DWPF process and still continue to be performed in order to evaluate options for process improvement. Study designs were based on use of statistical tools applicable to the determination of uncertainties associated with the data needs. Successful designs are apt to be repeated, so this report chose only to include prototypic case studies that typify the characteristics of frequently used designs. Case studies have been presented for studying in-tank homogeneity, evaluating the suitability of sampler systems, determining factors that affect mixing and sampling, comparing the final waste glass product chemical composition and durability to that of the glass pour stream sample and other samples from process vessels, and assessing the uniformity of the chemical composition in the waste glass product. Many of these studies efficiently addressed more than one of these areas of concern associated with demonstrating sample representativeness and provide examples of statistical tools in use for DWPF. The time when many of these designs were implemented was in an age when the sampling ideas of Pierre Gy were not as widespread as they are today. Nonetheless, the engineers and

  18. Defining And Characterizing Sample Representativeness For DWPF Melter Feed Samples

    International Nuclear Information System (INIS)

    Shine, E. P.; Poirier, M. R.

    2013-01-01

    Representative sampling is important throughout the Defense Waste Processing Facility (DWPF) process, and the demonstrated success of the DWPF process in achieving glass product quality over the past two decades is a direct result of the quality of information obtained from the process. The objective of this report was to present sampling methods that the Savannah River Site (SRS) used to qualify waste being dispositioned at the DWPF. The goal was to emphasize the methodology, not a list of outcomes from those studies. This methodology includes proven methods for taking representative samples, the use of controlled analytical methods, and data interpretation and reporting that considers the uncertainty of all error sources. Numerous sampling studies were conducted during the development of the DWPF process and continue to be performed in order to evaluate options for process improvement. Study designs were based on the use of statistical tools applicable to the determination of uncertainties associated with the data needs. Successful designs are apt to be repeated, so this report includes only prototypic case studies that typify the characteristics of frequently used designs. Case studies are presented for studying in-tank homogeneity, evaluating the suitability of sampler systems, determining factors that affect mixing and sampling, comparing the final waste glass product chemical composition and durability to that of the glass pour stream sample and other samples from process vessels, and assessing the uniformity of the chemical composition in the waste glass product. Many of these studies efficiently addressed more than one of these areas of concern associated with demonstrating sample representativeness and provide examples of statistical tools in use for DWPF. Many of these designs were implemented at a time when the sampling ideas of Pierre Gy were not yet as widespread as they are today. Nonetheless, the engineers and

  19. Learning priors for Bayesian computations in the nervous system.

    Directory of Open Access Journals (Sweden)

    Max Berniker

    Full Text Available Our nervous system continuously combines new information from our senses with information it has acquired throughout life. Numerous studies have found that human subjects manage this by integrating their observations with their previous experience (priors in a way that is close to the statistical optimum. However, little is known about the way the nervous system acquires or learns priors. Here we present results from experiments where the underlying distribution of target locations in an estimation task was switched, manipulating the prior subjects should use. Our experimental design allowed us to measure a subject's evolving prior while they learned. We confirm that through extensive practice subjects learn the correct prior for the task. We found that subjects can rapidly learn the mean of a new prior while the variance is learned more slowly and with a variable learning rate. In addition, we found that a Bayesian inference model could predict the time course of the observed learning while offering an intuitive explanation for the findings. The evidence suggests the nervous system continuously updates its priors to enable efficient behavior.
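
    The mean-fast, variance-slow pattern reported above can be illustrated with a toy conjugate-Bayesian learner. The Python sketch below is not the authors' model; the normal-inverse-gamma hyperparameters and the target distribution are illustrative assumptions:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # True (switched) prior over target locations that the learner must acquire.
    true_mean, true_sd = 2.0, 0.5
    targets = rng.normal(true_mean, true_sd, size=500)

    # Normal-inverse-gamma belief over the prior's mean and variance (illustrative).
    mu, kappa, alpha, beta = 0.0, 1.0, 2.0, 2.0

    for i, x in enumerate(targets, start=1):
        # Conjugate update after observing one target location
        # (beta must be updated before mu and kappa change).
        beta += 0.5 * kappa * (x - mu) ** 2 / (kappa + 1.0)
        mu = (kappa * mu + x) / (kappa + 1.0)
        kappa += 1.0
        alpha += 0.5
        if i in (10, 100, 500):
            sd_hat = (beta / (alpha - 1.0)) ** 0.5  # from posterior mean of variance
            print(f"after {i:3d} trials: mean ~ {mu:.3f}, sd ~ {sd_hat:.3f}")
    ```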

  20. Automatic sample changers maintenance manual

    International Nuclear Information System (INIS)

    Myers, T.A.

    1978-10-01

    This manual describes, and provides trouble-shooting aids for, the Automatic Sample Changer electronics on the automatic beta counting system developed by the Los Alamos Scientific Laboratory Group CNC-11. The output of a gas detector is shaped by a preamplifier and then coupled to an amplifier. The amplifier output is discriminated and serves as the input to a scaler. An identification number is associated with each sample. At a predetermined count length, the identification number, scaler data, and other information are punched out on a data card. The next sample to be counted is then automatically selected. The beta counter uses the same electronics as the prior count, the only differences being the sample identification number and the sample itself. This manual is intended as a step-by-step aid in trouble-shooting the electronics associated with positioning the sample, counting the sample, and getting the needed data punched on an 80-column data card.

  1. Valid MR imaging predictors of prior knee arthroscopy

    International Nuclear Information System (INIS)

    Discepola, Federico; Le, Huy B.Q.; Park, John S.; Clopton, Paul; Knoll, Andrew N.; Austin, Matthew J.; Resnick, Donald L.

    2012-01-01

    To determine whether fibrosis of the medial patellar reticulum (MPR), lateral patellar reticulum (LPR), deep medial aspect of Hoffa's fat pad (MDH), or deep lateral aspect of Hoffa's fat pad (LDH) is a valid predictor of prior knee arthroscopy. Institutional review board approval and waiver of informed consent were obtained for this HIPAA-compliant study. Initially, fibrosis of the MPR, LPR, MDH, or LDH in MR imaging studies of 50 patients with prior knee arthroscopy and 100 patients without was recorded. Subsequently, two additional radiologists, blinded to clinical data, retrospectively and independently recorded the presence of fibrosis of the MPR in 50 patients with prior knee arthroscopy and 50 without. Sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and accuracy for detecting the presence of fibrosis in the MPR were calculated. κ statistics were used to analyze inter-observer agreement. Fibrosis of each of the regions examined during the first portion of the study showed a significant association with prior knee arthroscopy (p < 0.005 for each). A patient with fibrosis of the MPR, LDH, or LPR was 45.5, 9, or 3.7 times more likely, respectively, to have had a prior knee arthroscopy. Logistic regression analysis indicated that fibrosis of the MPR supplanted the diagnostic utility of identifying fibrosis of the LPR, LDH, or MDH, or combinations of these (p ≥ 0.09 for all combinations). In the second portion of the study, fibrosis of the MPR demonstrated a mean sensitivity of 82%, specificity of 72%, PPV of 75%, NPV of 81%, and accuracy of 77% for predicting prior knee arthroscopy. Analysis of MR images can be used to determine if a patient has had prior knee arthroscopy by identifying fibrosis of the MPR, LPR, MDH, or LDH. Fibrosis of the MPR was the strongest predictor of prior knee arthroscopy. (orig.)
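
    For reference, the reported statistics all follow from a 2x2 confusion table. A minimal sketch in Python; the counts are illustrative, chosen only to roughly reproduce the reported values, and are not taken from the paper:

    ```python
    # Hypothetical counts: fibrosis of the MPR vs. prior arthroscopy (50/50 split).
    tp, fp, fn, tn = 41, 14, 9, 36

    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)             # positive predictive value
    npv = tn / (tn + fn)             # negative predictive value
    accuracy = (tp + tn) / (tp + fp + fn + tn)

    for name, value in [("sensitivity", sensitivity), ("specificity", specificity),
                        ("PPV", ppv), ("NPV", npv), ("accuracy", accuracy)]:
        print(f"{name}: {value:.0%}")
    ```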

  2. Valid MR imaging predictors of prior knee arthroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Discepola, Federico; Le, Huy B.Q. [McGill University Health Center, Jewish General Hospital, Division of Musculoskeletal Radiology, Montreal, Quebec (Canada); Park, John S. [Annapolis Radiology Associates, Division of Musculoskeletal Radiology, Annapolis, MD (United States); Clopton, Paul; Knoll, Andrew N.; Austin, Matthew J.; Resnick, Donald L. [University of California San Diego (UCSD), Division of Musculoskeletal Radiology, San Diego, CA (United States)

    2012-01-15

    To determine whether fibrosis of the medial patellar reticulum (MPR), lateral patellar reticulum (LPR), deep medial aspect of Hoffa's fat pad (MDH), or deep lateral aspect of Hoffa's fat pad (LDH) is a valid predictor of prior knee arthroscopy. Institutional review board approval and waiver of informed consent were obtained for this HIPAA-compliant study. Initially, fibrosis of the MPR, LPR, MDH, or LDH in MR imaging studies of 50 patients with prior knee arthroscopy and 100 patients without was recorded. Subsequently, two additional radiologists, blinded to clinical data, retrospectively and independently recorded the presence of fibrosis of the MPR in 50 patients with prior knee arthroscopy and 50 without. Sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and accuracy for detecting the presence of fibrosis in the MPR were calculated. κ statistics were used to analyze inter-observer agreement. Fibrosis of each of the regions examined during the first portion of the study showed a significant association with prior knee arthroscopy (p < 0.005 for each). A patient with fibrosis of the MPR, LDH, or LPR was 45.5, 9, or 3.7 times more likely, respectively, to have had a prior knee arthroscopy. Logistic regression analysis indicated that fibrosis of the MPR supplanted the diagnostic utility of identifying fibrosis of the LPR, LDH, or MDH, or combinations of these (p ≥ 0.09 for all combinations). In the second portion of the study, fibrosis of the MPR demonstrated a mean sensitivity of 82%, specificity of 72%, PPV of 75%, NPV of 81%, and accuracy of 77% for predicting prior knee arthroscopy. Analysis of MR images can be used to determine if a patient has had prior knee arthroscopy by identifying fibrosis of the MPR, LPR, MDH, or LDH. Fibrosis of the MPR was the strongest predictor of prior knee arthroscopy. (orig.)

  3. Effects of GPS sampling intensity on home range analyses

    Science.gov (United States)

    Jeffrey J. Kolodzinski; Lawrence V. Tannenbaum; David A. Osborn; Mark C. Conner; W. Mark Ford; Karl V. Miller

    2010-01-01

    The two most common methods for determining home ranges, minimum convex polygon (MCP) and kernel analyses, can be affected by sampling intensity. Despite prior research, it remains unclear how high-intensity sampling regimes affect home range estimations. We used datasets from 14 GPS-collared, white-tailed deer (Odocoileus virginianus) to describe...

  4. Extended Linear Models with Gaussian Priors

    DEFF Research Database (Denmark)

    Quinonero, Joaquin

    2002-01-01

    In extended linear models the input space is projected onto a feature space by means of an arbitrary non-linear transformation. A linear model is then applied to the feature space to construct the model output. The dimension of the feature space can be very large, or even infinite, giving the model a very high flexibility. Support Vector Machines (SVMs) and Gaussian processes are two examples of such models. In this technical report I present a model in which the dimension of the feature space remains finite, and where a Bayesian approach is used to train the model with Gaussian priors on the parameters. The Relevance Vector Machine, introduced by Tipping, is a particular case of such a model. I give the detailed derivations of the expectation-maximisation (EM) algorithm used in the training. These derivations are not found in the literature, and might be helpful for newcomers.

  5. Rotary mode core sampling approved checklist: 241-TX-113

    International Nuclear Information System (INIS)

    Fowler, K.D.

    1998-01-01

    The safety assessment for rotary mode core sampling was developed using certain bounding assumptions; however, those assumptions were not verified for each of the existing or potential flammable gas tanks. Therefore, a Flammable Gas/Rotary Mode Core Sampling Approved Checklist has been completed for tank 241-TX-113 prior to sampling operations. This transmittal documents the dispositions of the checklist items from the safety assessment.

  6. Rotary mode core sampling approved checklist: 241-TX-116

    International Nuclear Information System (INIS)

    FOWLER, K.D.

    1999-01-01

    The safety assessment for rotary mode core sampling was developed using certain bounding assumptions; however, those assumptions were not verified for each of the existing or potential flammable gas tanks. Therefore, a Flammable Gas/Rotary Mode Core Sampling Approved Checklist has been completed for tank 241-TX-116 prior to sampling operations. This transmittal documents the dispositions of the checklist items from the safety assessment.

  7. Immunoaffinity chromatography for the sample pretreatment of Taxus plant and cell extracts prior to analysis of taxanes by high-performance liquid chromatography

    NARCIS (Netherlands)

    Theodoridis, G; Haasnoot, W; Schilt, R; Jaziri, M; Diallo, B; Papadoyannis, IN; de Jong, GJ; Cazemier, G.

    2002-01-01

    The application of immunoaffinity chromatography for the purification of Taxus plant and cell extracts prior to HPLC analysis is described. Polyclonal antibodies raised against 10-deacetylbaccatin III (10-DAB III), paclitaxel's main precursor in the plant, were characterised by enzyme-linked

  8. Data Validation Package September 2016 Groundwater and Surface Water Sampling at the Slick Rock, Colorado, Processing Sites January 2017

    Energy Technology Data Exchange (ETDEWEB)

    Traub, David [Navarro Research and Engineering, Inc., Oak Ridge, TN (United States); Nguyen, Jason [US Department of Energy, Washington, DC (United States)

    2017-01-04

    The Slick Rock, Colorado, Processing Sites are referred to as the Slick Rock West Processing Site (SRK05) and the Slick Rock East Processing Site (SRK06). This annual event involved sampling both sites for a total of 16 monitoring wells and 6 surface water locations as required by the 2006 Draft Final Ground Water Compliance Action Plan for the Slick Rock, Colorado, Processing Sites (GCAP). A domestic well was also sampled at a property adjacent to the Slick Rock East site at the request of the landowner.

  9. Generalized multiple kernel learning with data-dependent priors.

    Science.gov (United States)

    Mao, Qi; Tsang, Ivor W; Gao, Shenghua; Wang, Li

    2015-06-01

    Multiple kernel learning (MKL) and classifier ensemble are two mainstream methods for solving learning problems in which some sets of features/views are more informative than others, or the features/views within a given set are inconsistent. In this paper, we first present a novel probabilistic interpretation of MKL such that maximum entropy discrimination with a noninformative prior over multiple views is equivalent to the formulation of MKL. Instead of using the noninformative prior, we introduce a novel data-dependent prior based on an ensemble of kernel predictors, which enhances the prediction performance of MKL by leveraging the merits of the classifier ensemble. With the proposed probabilistic framework of MKL, we propose a hierarchical Bayesian model to learn the proposed data-dependent prior and classification model simultaneously. The resultant problem is convex and other information (e.g., instances with either missing views or missing labels) can be seamlessly incorporated into the data-dependent priors. Furthermore, a variety of existing MKL models can be recovered under the proposed MKL framework and can be readily extended to incorporate these priors. Extensive experiments demonstrate the benefits of our proposed framework in supervised and semisupervised settings, as well as in tasks with partial correspondence among multiple views.

  10. 7 CFR 52.38c - Statistical sampling procedures for lot inspection of processed fruits and vegetables by attributes.

    Science.gov (United States)

    2010-01-01

    Section 52.38c, Agriculture Regulations (7 CFR Part 52, Fruits and Vegetables, Processed Products Thereof, and Certain Other Processed Food Products): statistical sampling procedures for lot inspection of processed fruits and vegetables by attributes. (a) General. Single sampling plans shall be used...

  11. Calibrated birth-death phylogenetic time-tree priors for bayesian inference.

    Science.gov (United States)

    Heled, Joseph; Drummond, Alexei J

    2015-05-01

    Here we introduce a general class of multiple calibration birth-death tree priors for use in Bayesian phylogenetic inference. All tree priors in this class separate ancestral node heights into a set of "calibrated nodes" and "uncalibrated nodes" such that the marginal distribution of the calibrated nodes is user-specified whereas the density ratio of the birth-death prior is retained for trees with equal values for the calibrated nodes. We describe two formulations, one in which the calibration information informs the prior on ranked tree topologies, through the (conditional) prior, and the other which factorizes the prior on divergence times and ranked topologies, thus allowing uniform, or any arbitrary prior distribution on ranked topologies. Although the first of these formulations has some attractive properties, the algorithm we present for computing its prior density is computationally intensive. However, the second formulation is always faster and computationally efficient for up to six calibrations. We demonstrate the utility of the new class of multiple-calibration tree priors using both small simulations and a real-world analysis and compare the results to existing schemes. The two new calibrated tree priors described in this article offer greater flexibility and control of prior specification in calibrated time-tree inference and divergence time dating, and will remove the need for indirect approaches to the assessment of the combined effect of calibration densities and tree priors in Bayesian phylogenetic inference. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.

  12. Automated processing of whole blood samples into microliter aliquots of plasma.

    Science.gov (United States)

    Burtis, C A; Johnson, W W; Walker, W A

    1988-01-01

    A rotor that accepts and automatically processes a bulk aliquot of a single blood sample into multiple aliquots of plasma has been designed and built. The rotor consists of a central processing unit, which includes a disk containing eight precision-bore capillaries. By varying the internal diameters of the capillaries, aliquot volumes ranging from 1 to 10 μl can be prepared. In practice, an unmeasured volume of blood is placed in a centre well and, as the rotor begins to spin, is moved radially into a central annular ring where it is distributed into a series of processing chambers. The rotor is then spun at 3000 rpm for 10 min. When the centrifugal field is removed by slowly decreasing the rotor speed, an aliquot of plasma is withdrawn by capillary action into each of the capillary tubes. The disk containing the eight measured aliquots of plasma is subsequently removed and placed in a modified rotor for conventional centrifugal analysis. Initial evaluation of the new rotor indicates that it is capable of producing discrete, microliter volumes of plasma with a degree of accuracy and precision approaching that of mechanical pipettes.

  13. On Sparse Multi-Task Gaussian Process Priors for Music Preference Learning

    DEFF Research Database (Denmark)

    Nielsen, Jens Brehm; Jensen, Bjørn Sand; Larsen, Jan

    In this paper we study pairwise preference learning in a music setting with multitask Gaussian processes and examine the effect of sparsity in the input space as well as in the actual judgments. To introduce sparsity in the inputs, we extend a classic pairwise likelihood model to support sparse … simulation shows the performance on a real-world music preference dataset, which motivates and demonstrates the potential of the sparse Gaussian process formulation for pairwise likelihoods.

  14. Protocol for Cohesionless Sample Preparation for Physical Experimentation

    Science.gov (United States)

    2016-05-01

    Standard test method for consolidated drained triaxial compression test for soils. In Annual Book of ASTM Standards. West Conshohocken, PA: ASTM. … A protocol is derived wherein uncertainties and laboratory scatter associated with soil fabric-behavior variance during sample preparation are mitigated. Samples of … wherein comparable analysis between different laboratory tests' results can be made by ensuring a comparable soil fabric prior to laboratory testing.

  15. Statistical literacy and sample survey results

    Science.gov (United States)

    McAlevey, Lynn; Sullivan, Charles

    2010-10-01

    Sample surveys are widely used in the social sciences and business. The news media almost daily quote from them, yet they are widely misused. Using students with prior managerial experience embarking on an MBA course, we show that common sample survey results are misunderstood even by those managers who have previously done a statistics course. In general, they fare no better than managers who have never studied statistics. There are implications for teaching, especially in business schools, as well as for consulting.

  16. Electrons in feldspar II: A consideration of the influence of conduction band-tail states on luminescence processes

    DEFF Research Database (Denmark)

    Poolton, H.R.J.; Ozanyan, K.B.; Wallinga, J.

    2002-01-01

    … consider what influence the band tails have on the luminescence properties of feldspar, where electrons travel through the sample prior to recombination. The work highlights the dominant role that 0.04-0.05-eV phonons play in both the luminescence excitation and emission processes of these materials…

  17. Applicability of cloud point extraction for the separation trace amount of lead ion in environmental and biological samples prior to determination by flame atomic absorption spectrometry

    Directory of Open Access Journals (Sweden)

    Sayed Zia Mohammadi

    2016-09-01

    Full Text Available A sensitive cloud point extraction (CPE) procedure for the preconcentration of trace lead prior to its determination by flame atomic absorption spectrometry (FAAS) has been developed. The CPE method is based on the complex of Pb(II) ion with 1-(2-pyridylazo)-2-naphthol (PAN), which is then entrapped in the non-ionic surfactant Triton X-114. The main factors affecting CPE efficiency, such as pH of the sample solution, concentration of PAN and Triton X-114, and equilibration temperature and time, were investigated in detail. A preconcentration factor of 30 was obtained for the preconcentration of Pb(II) ion from a 15.0 mL solution. Under the optimal conditions, the calibration curve was linear in the range of 7.5 ng mL−1–3.5 μg mL−1 of lead with R2 = 0.9998 (n = 10). The detection limit based on three times the standard deviation of the blank (3Sb) was 5.27 ng mL−1. Eight replicate determinations of 1.0 μg mL−1 lead gave a mean absorbance of 0.275 with a relative standard deviation of 1.6%. The high efficiency of cloud point extraction for the determination of analytes in complex matrices was demonstrated. The proposed method has been applied to the determination of trace amounts of lead in biological and water samples with satisfactory results.
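
    The figures of merit quoted above (calibration slope and linearity, LOD as 3Sb/slope, RSD of replicates) are straightforward to compute. A minimal numpy sketch with made-up numbers, not the paper's measurements:

    ```python
    import numpy as np

    # Illustrative calibration data: concentration (ng/mL) vs. absorbance.
    conc = np.array([7.5, 50.0, 250.0, 1000.0, 2000.0, 3500.0])
    absb = np.array([0.002, 0.014, 0.069, 0.275, 0.552, 0.962])

    slope, intercept = np.polyfit(conc, absb, 1)
    r2 = np.corrcoef(conc, absb)[0, 1] ** 2

    s_blank = 0.0005                  # assumed std. dev. of the blank signal
    lod = 3.0 * s_blank / slope       # 3Sb criterion, as in the abstract

    # Assumed eight replicate absorbances of a 1.0 ug/mL standard.
    reps = np.array([0.271, 0.279, 0.273, 0.277, 0.276, 0.272, 0.278, 0.274])
    rsd = reps.std(ddof=1) / reps.mean() * 100.0

    print(f"slope = {slope:.3e}, R2 = {r2:.4f}, LOD = {lod:.2f} ng/mL, RSD = {rsd:.1f}%")
    ```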

  18. LC-MS analysis of the plasma metabolome–a novel sample preparation strategy

    DEFF Research Database (Denmark)

    Skov, Kasper; Hadrup, Niels; Smedsgaard, Jørn

    2015-01-01

    Blood plasma is a well-known body fluid often analyzed in studies on the effects of toxic compounds, as physiologically or chemically induced changes in the mammalian body are reflected in the plasma metabolome. Sample preparation prior to LC-MS based analysis of the plasma metabolome is a challenge, as plasma contains compounds with very different properties. Besides proteins, which usually are precipitated with organic solvent, phospholipids are known to cause ion suppression in electrospray mass spectrometry. We have compared two different sample preparation techniques prior to LC-qTOF analysis of plasma samples: the first is protein precipitation; the second is protein precipitation followed by solid-phase extraction with sub-fractionation into three sub-samples: a phospholipid, a lipid, and a polar sub-fraction. Molecular feature extraction of the data files from LC-qTOF analysis of the samples…

  19. Image Denoising via Bayesian Estimation of Statistical Parameter Using Generalized Gamma Density Prior in Gaussian Noise Model

    Science.gov (United States)

    Kittisuwan, Pichid

    2015-03-01

    The application of image processing in industry has shown remarkable success over the last decade, for example, in security and telecommunication systems. The denoising of natural image corrupted by Gaussian noise is a classical problem in image processing. So, image denoising is an indispensable step during image processing. This paper is concerned with dual-tree complex wavelet-based image denoising using Bayesian techniques. One of the cruxes of the Bayesian image denoising algorithms is to estimate the statistical parameter of the image. Here, we employ maximum a posteriori (MAP) estimation to calculate local observed variance with generalized Gamma density prior for local observed variance and Laplacian or Gaussian distribution for noisy wavelet coefficients. Evidently, our selection of prior distribution is motivated by efficient and flexible properties of generalized Gamma density. The experimental results show that the proposed method yields good denoising results.
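
    As a rough illustration of the MAP shrinkage idea, the sketch below uses a plain Gaussian signal prior with a locally estimated variance; the paper instead places a generalized Gamma prior on that local variance and works in the dual-tree complex wavelet domain, so this is a simplification, not the authors' algorithm:

    ```python
    import numpy as np

    def map_shrink(coeffs, noise_var, win=5):
        """Shrink noisy subband coefficients y = w + n with a MAP/Wiener-type rule."""
        pad = win // 2
        padded = np.pad(coeffs, pad, mode="reflect")
        out = np.empty_like(coeffs)
        for i in range(coeffs.shape[0]):
            for j in range(coeffs.shape[1]):
                block = padded[i:i + win, j:j + win]
                # Local signal variance: E[y^2] minus noise variance, floored at 0.
                sig_var = max(block.var() - noise_var, 0.0)
                gain = sig_var / (sig_var + noise_var)  # MAP rule for a Gaussian prior
                out[i, j] = gain * coeffs[i, j]
        return out
    ```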

  20. Smartphone and Mobile Application Utilization Prior to and Following Treatment Among Individuals Enrolled in Residential Substance Use Treatment

    Science.gov (United States)

    Dahne, Jennifer; Lejuez, C. W.

    2015-01-01

    Background Following completion of substance use treatment, it is crucial for patients to continue to utilize skills learned in treatment for optimal treatment outcomes. Mobile applications (apps) on smartphones offer a unique platform to promote utilization of evidence-based skills following completion of substance use treatment. Despite the promise of mobile apps and smartphones for treatment delivery, it remains unknown whether patients in substance use treatment in the United States have access to smartphones and utilize mobile apps on smartphones. The present study sought to determine smartphone utilization among individuals enrolled in one residential substance use treatment center in the U.S. catering specifically to low-income adults. Methods Participants included 251 individuals at a residential substance use treatment center in Washington DC admitted to the center between March 2014 and January 2015. During the intake process, participants completed interviewer-administered demographics and psychiatric questionnaires as well as a self-report of technology utilization. Results Results indicated that the majority of patients in this residential substance use treatment center owned mobile phones prior to treatment entry (86.9%) and expected to own mobile phones after leaving treatment (92.6%). Moreover, the majority of these phones were (68.5%), or were expected to be (72.4%), smartphones on which patients reported utilizing mobile applications (prior to treatment: 61.3%; post treatment: 64.3%) and accessing the internet (prior to treatment: 61.3%; post treatment: 65.9%). Conclusions Mobile phone and smartphone ownership among this sample were comparable to ownership among U.S. adults broadly. Findings suggest that smartphones and mobile apps may hold clinical utility for fostering continued use of treatment skills following substance use treatment completion. PMID:26231698

  1. A modified FASP protocol for high-throughput preparation of protein samples for mass spectrometry.

    Directory of Open Access Journals (Sweden)

    Jeremy Potriquet

    Full Text Available To facilitate high-throughput proteomic analyses we have developed a modified FASP protocol which improves the rate at which protein samples can be processed prior to mass spectrometry. Adapting the original FASP protocol to a 96-well format necessitates extended spin times for buffer exchange due to the low centrifugation speeds tolerated by these devices. However, by using 96-well plates with a more robust polyethersulfone molecular weight cutoff membrane, instead of the cellulose membranes typically used in these devices, we could use isopropanol as a wetting agent, decreasing spin times required for buffer exchange from an hour to 30 minutes. In a typical work flow used in our laboratory this equates to a reduction of 3 hours per plate, providing processing times similar to FASP for the processing of up to 96 samples per plate. To test whether our modified protocol produced similar results to FASP and other FASP-like protocols we compared the performance of our modified protocol to the original FASP and the more recently described eFASP and MStern-blot. We show that all FASP-like methods, including our modified protocol, display similar performance in terms of proteins identified and reproducibility. Our results show that our modified FASP protocol is an efficient method for the high-throughput processing of protein samples for mass spectral analysis.

  2. Abdominal multi-organ segmentation from CT images using conditional shape-location and unsupervised intensity priors.

    Science.gov (United States)

    Okada, Toshiyuki; Linguraru, Marius George; Hori, Masatoshi; Summers, Ronald M; Tomiyama, Noriyuki; Sato, Yoshinobu

    2015-12-01

    This paper addresses the automated segmentation of multiple organs in upper abdominal computed tomography (CT) data. The aim of our study is to develop methods to effectively construct the conditional priors and use their prediction power for more accurate segmentation as well as easy adaptation to various imaging conditions in CT images, as observed in clinical practice. We propose a general framework of multi-organ segmentation which effectively incorporates interrelations among multiple organs and easily adapts to various imaging conditions without the need for supervised intensity information. The features of the framework are as follows: (1) A method for modeling conditional shape and location (shape-location) priors, which we call prediction-based priors, is developed to derive accurate priors specific to each subject, which enables the estimation of intensity priors without the need for supervised intensity information. (2) Organ correlation graph is introduced, which defines how the conditional priors are constructed and segmentation processes of multiple organs are executed. In our framework, predictor organs, whose segmentation is sufficiently accurate by using conventional single-organ segmentation methods, are pre-segmented, and the remaining organs are hierarchically segmented using conditional shape-location priors. The proposed framework was evaluated through the segmentation of eight abdominal organs (liver, spleen, left and right kidneys, pancreas, gallbladder, aorta, and inferior vena cava) from 134 CT data from 86 patients obtained under six imaging conditions at two hospitals. The experimental results show the effectiveness of the proposed prediction-based priors and the applicability to various imaging conditions without the need for supervised intensity information. Average Dice coefficients for the liver, spleen, and kidneys were more than 92%, and were around 73% and 67% for the pancreas and gallbladder, respectively. Copyright © 2015
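
    The Dice coefficients used for evaluation are easy to reproduce from binary masks; a short Python sketch:

    ```python
    import numpy as np

    def dice(seg, ref):
        """Dice coefficient between a binary segmentation and a reference mask."""
        seg, ref = seg.astype(bool), ref.astype(bool)
        return 2.0 * np.logical_and(seg, ref).sum() / (seg.sum() + ref.sum())

    # Tiny illustration: two overlapping rectangles.
    a = np.zeros((4, 4), bool); a[1:3, 1:3] = True
    b = np.zeros((4, 4), bool); b[1:4, 1:3] = True
    print(f"Dice = {dice(a, b):.2f}")   # 2*4/(4+6) = 0.80
    ```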

  3. Degradation of hydrocarbons in soil samples analyzed within accepted analytical holding times

    International Nuclear Information System (INIS)

    Jackson, J.; Thomey, N.; Dietlein, L.F.

    1992-01-01

    Samples which are collected in conjunction with subsurface investigations at leaking petroleum storage tank sites and petroleum refineries are routinely analyzed for benzene, toluene, ethylbenzene, xylenes (BTEX), and total petroleum hydrocarbons (TPH). Water samples are preserved by the addition of hydrochloric acid and maintained at four degrees centigrade prior to analysis. This is done to prevent bacterial degradation of hydrocarbons. Chemical preservation is not presently performed on soil samples. Instead, the samples are cooled and maintained at four degrees centigrade. This study was done to measure the degree of degradation of hydrocarbons in soil samples which are analyzed within accepted holding times. Soil samples were collected and representative subsamples were prepared from the initial sample. Subsamples were analyzed in triplicate for BTEX and TPH throughout the length of the approved holding times to measure the extent of sample constituent degradation prior to analysis. Findings imply that for sandy soils, BTEX and TPH concentrations can be highly dependent upon the length of time which elapses between sample collection and analysis

  4. The Effect of Prior Caffeine Consumption on Neuropsychological Test Performance: A Placebo-Controlled Study.

    Science.gov (United States)

    Walters, Elizabeth R; Lesk, Valerie E

    2016-01-01

    The aim of this study was to investigate whether the prior consumption of 200 mg of pure caffeine affected neuropsychological test scores in a group of elderly participants aged over 60 years. Using a double-blind placebo versus caffeine design, participants were randomly assigned to receive 200 mg of caffeine or placebo. A neuropsychological assessment testing the domains of general cognitive function, processing speed, semantic memory, episodic memory, executive function, working memory and short-term memory was carried out. Significant interaction effects between age, caffeine and scores of executive function and processing speed were found; participants who had received caffeine showed a decline in performance with increasing age. This effect was not seen for participants who received placebo. The results highlight the need to consider and control prior caffeine consumption when scoring neuropsychological assessments in the elderly, which is important for accuracy of diagnosis and corresponding normative data. © 2016 S. Karger AG, Basel.

  5. The relation between cognitive and metacognitive strategic processing during a science simulation.

    Science.gov (United States)

    Dinsmore, Daniel L; Zoellner, Brian P

    2018-03-01

    This investigation was designed to uncover the relations between students' cognitive and metacognitive strategies used during a complex climate simulation. While cognitive strategy use during science inquiry has been studied, the factors related to this strategy use, such as concurrent metacognition, prior knowledge, and prior interest, have not been investigated in a multidimensional fashion. This study addressed current issues in strategy research by examining not only how metacognitive, surface-level, and deep-level strategies influence performance, but also how these strategies related to each other during a contextually relevant science simulation. The sample for this study consisted of 70 undergraduates from a mid-sized Southeastern university in the United States. These participants were recruited from both physical and life science (e.g., biology) and education majors to obtain a sample with variance in terms of their prior knowledge, interest, and strategy use. Participants completed measures of prior knowledge and interest about global climate change. Then, they were asked to engage in an online climate simulator for up to 30 min while thinking aloud. Finally, participants were asked to answer three outcome questions about global climate change. Results indicated a poor fit for the statistical model of the frequency and level of processing predicting performance. However, a statistical model that independently examined the influence of metacognitive monitoring and control of cognitive strategies showed a very strong relation between the metacognitive and cognitive strategies. Finally, smallest space analysis results provided evidence that strategy use may be better captured in a multidimensional fashion, particularly with attention paid towards the combination of strategies employed. Conclusions drawn from the evidence point to the need for more dynamic, multidimensional models of strategic processing that account for the patterns of optimal and non

  6. Endophytic bacterial community of grapevine leaves influenced by sampling date and phytoplasma infection process.

    Science.gov (United States)

    Bulgari, Daniela; Casati, Paola; Quaglino, Fabio; Bianco, Piero A

    2014-07-21

    Endophytic bacteria benefit the host plant directly or indirectly, e.g., by biocontrol of pathogens. Up to now, their interactions with the host and with other microorganisms are poorly understood. Consequently, a crucial step for improving the knowledge of those relationships is to determine if pathogens or the plant growing season influence endophytic bacterial diversity and dynamics. Four healthy, four phytoplasma-diseased, and four recovered (symptomatic plants that spontaneously regain a healthy condition) grapevine plants were sampled monthly from June to October 2010 in a vineyard in north-western Italy. Metagenomic DNA was extracted from sterilized leaves and the endophytic bacterial community dynamics and diversity were analyzed by taxon-specific real-time PCR, Length-Heterogeneity PCR, and genus-specific PCR. These analyses revealed that both sampling date and phytoplasma infection influenced the endophytic bacterial composition. Interestingly, in June, when the plants are symptomless and the pathogen is undetectable, (i) the endophytic bacterial community associated with diseased grapevines was different from those at the other sampling dates, and (ii) the microbial community associated with recovered plants differed from that living inside healthy and diseased plants. Interestingly, the LH-PCR database identified bacteria previously reported as biocontrol agents in the examined grapevines. Of these, the dynamics of Burkholderia, Methylobacterium, and Pantoea were influenced by the phytoplasma infection process and seasonality. Results indicated that endophytic bacterial community composition in grapevine is correlated with both phytoplasma infection and sampling date. For the first time, data underlined that, in diseased plants, the pathogen infection process can decrease the impact of seasonality on community dynamics. Moreover, based on experimental evidence, it was reasonable to hypothesize that after recovery the restructured

  7. Improving Open Access through Prior Learning Assessment

    Science.gov (United States)

    Yin, Shuangxu; Kawachi, Paul

    2013-01-01

    This paper explores and presents new data on how to improve open access in distance education through using prior learning assessments. Broadly there are three types of prior learning assessment (PLAR): Type-1 for prospective students to be allowed to register for a course; Type-2 for current students to avoid duplicating work-load to gain…

  8. The Effect of Sterilization on Size and Shape of Fat Globules in Model Processed Cheese Samples

    Directory of Open Access Journals (Sweden)

    B. Tremlová

    2006-01-01

    Full Text Available Model cheese samples from 4 independent productions were heat sterilized (117 °C, 20 minutes) after the melting process and packing, with the aim of prolonging their durability. The objective of the study was to assess changes in the size and shape of fat globules due to heat sterilization by using image analysis methods. The study included a selection of suitable methods of preparing mounts, taking microphotographs and making overlays for automatic processing of photographs by image analyser, ascertaining parameters to determine the size and shape of fat globules, and statistical analysis of the results obtained. The results of the experiment suggest that changes in the shape of fat globules due to heat sterilization are not unequivocal. We found that the size of fat globules was significantly increased (p < 0.01) due to heat sterilization (117 °C, 20 min), and the shares of small fat globules (up to 500 μm2, or 100 μm2) in the samples of heat-sterilized processed cheese were decreased. The results imply that the image analysis method is very useful when assessing the effect of a technological process on the quality of processed cheese.

  9. Iterative importance sampling algorithms for parameter estimation

    OpenAIRE

    Morzfeld, Matthias; Day, Marcus S.; Grout, Ray W.; Pau, George Shu Heng; Finsterle, Stefan A.; Bell, John B.

    2016-01-01

    In parameter estimation problems one computes a posterior distribution over uncertain parameters defined jointly by a prior distribution, a model, and noisy data. Markov Chain Monte Carlo (MCMC) is often used for the numerical solution of such problems. An alternative to MCMC is importance sampling, which can exhibit near perfect scaling with the number of cores on high performance computing systems because samples are drawn independently. However, finding a suitable proposal distribution is ...
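
    A minimal self-normalized importance sampler for a toy 1-D parameter is sketched below; all names and numbers are illustrative, and the iterative proposal refinement that is the paper's subject is omitted:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    data = np.array([1.8, 2.2, 1.9, 2.4])   # assumed noisy observations
    noise_sd = 0.5

    def log_prior(theta):                    # N(0, 2^2) prior, up to a constant
        return -0.5 * (theta / 2.0) ** 2

    def log_lik(theta):                      # Gaussian likelihood, up to a constant
        return -0.5 * np.sum((data[:, None] - theta) ** 2, axis=0) / noise_sd ** 2

    # Broad Gaussian proposal; weights correct for the prior/likelihood mismatch.
    theta = rng.normal(0.0, 3.0, size=50_000)
    log_q = -0.5 * (theta / 3.0) ** 2
    log_w = log_prior(theta) + log_lik(theta) - log_q
    w = np.exp(log_w - log_w.max())
    w /= w.sum()                             # constants cancel after normalization

    post_mean = np.sum(w * theta)
    ess = 1.0 / np.sum(w ** 2)               # effective sample size
    print(f"posterior mean ~ {post_mean:.3f}, ESS ~ {ess:.0f}")
    ```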

  10. Acquisition of multiple prior distributions in tactile temporal order judgment

    Directory of Open Access Journals (Sweden)

    Yasuhito Nagai

    2012-08-01

    Full Text Available The Bayesian estimation theory proposes that the brain acquires the prior distribution of a task and integrates it with sensory signals to minimize the effect of sensory noise. Psychophysical studies have demonstrated that our brain actually implements Bayesian estimation in a variety of sensory-motor tasks. However, these studies only imposed one prior distribution on participants within a task period. In this study, we investigated the conditions that enable the acquisition of multiple prior distributions in temporal order judgment (TOJ) of two tactile stimuli across the hands. In Experiment 1, stimulation intervals were randomly selected from one of two prior distributions (biased to right hand earlier and biased to left hand earlier) in association with color cues (green and red, respectively). Although the acquisition of the two priors was not enabled by the color cues alone, it was significant when participants shifted their gaze (above or below) in response to the color cues. However, the acquisition of multiple priors was not significant when participants moved their mouths (opened or closed). In Experiment 2, the spatial cues (above and below) were used to identify which eye position or retinal cue position was crucial for the eye-movement-dependent acquisition of multiple priors in Experiment 1. The acquisition of the two priors was significant when participants moved their gaze to the cues (i.e., the cue positions on the retina were constant across the priors), as well as when participants did not shift their gazes (i.e., the cue positions on the retina changed according to the priors). Thus, both eye and retinal cue positions were effective in acquiring multiple priors. Based on previous neurophysiological reports, we discuss possible neural correlates that contribute to the acquisition of multiple priors.

  11. Field sampling for monitoring migration and defining the areal extent of chemical contamination

    International Nuclear Information System (INIS)

    Thomas, J.M.; Skalski, J.R.; Eberhardt, L.L.; Simmons, M.A.

    1984-11-01

    Initial research on compositing, field designs, and site mapping oriented toward detecting spills and migration at commercial low-level radioactive or chemical waste sites is summarized. Results indicate that the significance test developed to detect samples containing high levels of contamination when they are mixed with several other samples below detectable limits (composites) will be highly effective with large sample sizes when contaminant levels frequently or greatly exceed a maximum acceptable level. These conditions of frequent and high contaminant levels are most likely to occur in regions of a commercial waste site where the priors (previous knowledge) about a spill or migration are highest. Conversely, initial investigations of Bayes sampling strategies suggest that field sampling efforts should be inversely proportional to the priors (expressed as probabilities) for the occurrence of contamination.
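
    The claim that composite testing works best when contamination greatly exceeds the acceptance limit can be checked with a small simulation. Everything below (dilution model, noise level, limit) is an illustrative assumption, not the report's procedure:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def detection_rate(hot_level, k=8, limit=1.0, sd=0.1, n_trials=10_000):
        """Chance that a composite of k samples is flagged when exactly one
        'hot' sample at hot_level is mixed with k-1 clean (near-zero) samples
        and the measured composite mean must exceed the acceptable level."""
        composite = hot_level / k + rng.normal(0.0, sd, size=n_trials)
        return float(np.mean(composite > limit))

    for hot in (2.0, 5.0, 10.0, 50.0):
        print(f"hot sample at {hot:4.1f}: detection rate = {detection_rate(hot):.2f}")
    ```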

  12. A sampling-based approach to probabilistic pursuit evasion

    KAUST Repository

    Mahadevan, Aditya; Amato, Nancy M.

    2012-01-01

    Probabilistic roadmaps (PRMs) are a sampling-based approach to motion-planning that encodes feasible paths through the environment using a graph created from a subset of valid positions. Prior research has shown that PRMs can be augmented

  13. Sleep Spindle Density Predicts the Effect of Prior Knowledge on Memory Consolidation

    Science.gov (United States)

    Lambon Ralph, Matthew A.; Kempkes, Marleen; Cousins, James N.; Lewis, Penelope A.

    2016-01-01

    Information that relates to a prior knowledge schema is remembered better and consolidates more rapidly than information that does not. Another factor that influences memory consolidation is sleep, and growing evidence suggests that sleep-related processing is important for integration with existing knowledge. Here, we examine how sleep-related mechanisms interact with the schema-dependent memory advantage. Participants first established a schema over 2 weeks. Next, they encoded new facts, which were either related to the schema or completely unrelated. After a 24 h retention interval, including a night of sleep, which we monitored with polysomnography, participants encoded a second set of facts. Finally, memory for all facts was tested in a functional magnetic resonance imaging scanner. Behaviorally, sleep spindle density predicted an increase in the schema benefit to memory across the retention interval. Higher spindle densities were associated with reduced decay of schema-related memories. Functionally, spindle density predicted increased disengagement of the hippocampus across 24 h for schema-related memories only. Together, these results suggest that sleep spindle activity is associated with the effect of prior knowledge on memory consolidation. SIGNIFICANCE STATEMENT Episodic memories are gradually assimilated into long-term memory and this process is strongly influenced by sleep. The consolidation of new information is also influenced by its relationship to existing knowledge structures, or schemas, but the role of sleep in such schema-related consolidation is unknown. We show that sleep spindle density predicts the extent to which schemas influence the consolidation of related facts. This is the first evidence that sleep is associated with the interaction between prior knowledge and long-term memory formation. PMID:27030764

  14. Hybrid nested sampling algorithm for Bayesian model selection applied to inverse subsurface flow problems

    KAUST Repository

    Elsheikh, Ahmed H.; Wheeler, Mary Fanett; Hoteit, Ibrahim

    2014-01-01

    A Hybrid Nested Sampling (HNS) algorithm is proposed for efficient Bayesian model calibration and prior model selection. The proposed algorithm combines, Nested Sampling (NS) algorithm, Hybrid Monte Carlo (HMC) sampling and gradient estimation using
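
    For orientation, a plain (non-hybrid) nested sampling loop on a toy 1-D problem is sketched below; the paper's contribution is to replace the brute-force rejection step with Hybrid Monte Carlo moves and gradient information. All settings are illustrative:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def log_lik(theta):
        # Sharply peaked Gaussian likelihood under a uniform [0, 1] prior.
        return -0.5 * ((theta - 0.5) / 0.05) ** 2

    n_live, n_iter = 100, 600
    live = rng.uniform(0.0, 1.0, n_live)
    live_ll = log_lik(live)
    log_z_terms = []

    for i in range(n_iter):
        worst = int(np.argmin(live_ll))
        ll_star = live_ll[worst]
        log_x = -(i + 1) / n_live            # prior volume shrinks ~ exp(-1/n_live)
        log_z_terms.append(ll_star + log_x - np.log(n_live))
        # Replace the worst live point with a prior draw above the threshold
        # (brute-force rejection; the hybrid scheme uses HMC moves here).
        while True:
            cand = rng.uniform(0.0, 1.0)
            if log_lik(cand) > ll_star:
                live[worst], live_ll[worst] = cand, log_lik(cand)
                break

    # Final live-point correction omitted; the analytic value here is
    # log(0.05 * sqrt(2*pi)) ~ -2.08.
    print(f"log evidence ~ {np.logaddexp.reduce(log_z_terms):.2f}")
    ```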

  15. Abbreviated sampling and analysis plan for planning decontamination and decommissioning at Test Reactor Area (TRA) facilities

    International Nuclear Information System (INIS)

    1994-10-01

    The objective is to sample and analyze for the presence of gamma-emitting isotopes and hazardous constituents within certain areas of the Test Reactor Area (TRA), prior to D and D activities. The TRA is composed of three major reactor facilities and three smaller reactors built in support of programs studying the performance of reactor materials and components under high neutron flux conditions. The Materials Testing Reactor (MTR) and Engineering Test Reactor (ETR) facilities are currently pending D and D. Work consists of pre-D and D sampling of designated TRA (primarily ETR) process areas. This report addresses only a limited subset of the samples which will eventually be required to characterize MTR and ETR and plan their D and D. Sampling which is addressed in this document is intended to support planned D and D work which is funded at the present time. Biased samples, based on process knowledge and plant configuration, are to be performed. The multiple process areas which may potentially be sampled will be initially characterized by obtaining data for upstream source areas which, based on facility configuration, would affect downstream, as yet unsampled, process areas. Sampling and analysis will be conducted to determine the level of gamma-emitting isotopes and hazardous constituents present in designated areas within buildings TRA-612, 642, 643, 644, 645, 647, 648, 663, and in the soils surrounding Facility TRA-611. These data will be used to plan the D and D and help determine disposition of material by D and D personnel. Both MTR and ETR facilities will eventually be decommissioned by total dismantlement so that the area can be restored to its original condition

  16. Modified Activated Carbon Prepared from Acorn Shells as a New Solid-Phase Extraction Sorbent for the Preconcentration and Determination of Trace Amounts of Nickel in Food Samples Prior to Flame Atomic Absorption Spectrometry.

    Science.gov (United States)

    Ebrahimi, Bahram

    2017-03-01

    A new solid-phase extraction (SPE) sorbent was introduced based on acidic-modified (AM) activated carbon (AC) prepared from acorn shells of native oak trees in Kurdistan. Hydrochloric acid (15%, w/w) and nitric acid (32.5%, w/w) were used to condition and modify AC. The IR spectra of AC and AM-AC showed that AM led to the formation of increasing numbers of acidic functional groups on AM-AC. AM-AC was used in the SPE method for the extraction and preconcentration of Ni2+ prior to flame atomic absorption spectrometric determination at ng/mL levels in model and real food samples. Effective parameters of the SPE procedure, such as the pH of the solutions, sorbent dosage, extraction time, sample volume, type of eluent, and matrix ions, were considered and optimized. An enrichment factor of 140 was obtained. The calibration curve was linear with an R2 of 0.997 in the concentration range of 1-220 ng/mL. The RSD was 5.67% (for n = 7), the LOD was 0.352 ng/mL, and relative recoveries in vegetable samples ranged from 96.7 to 103.7%.

  17. Atmospheric vs. anaerobic processing of metabolome samples for the metabolite profiling of a strict anaerobic bacterium, Clostridium acetobutylicum.

    Science.gov (United States)

    Lee, Sang-Hyun; Kim, Sooah; Kwon, Min-A; Jung, Young Hoon; Shin, Yong-An; Kim, Kyoung Heon

    2014-12-01

    Well-established metabolome sample preparation is a prerequisite for reliable metabolomic data. For metabolome sampling of a Gram-positive strict anaerobe, Clostridium acetobutylicum, fast filtration and metabolite extraction with acetonitrile/methanol/water (2:2:1, v/v/v) at -20°C under anaerobic conditions has been commonly used. This anaerobic metabolite processing method is laborious and time-consuming since it is conducted in an anaerobic chamber. Also, there have not been any systematic method evaluation and development of metabolome sample preparation for strict anaerobes and Gram-positive bacteria. In this study, metabolome sampling and extraction methods were rigorously evaluated and optimized for C. acetobutylicum by using gas chromatography/time-of-flight mass spectrometry-based metabolomics, in which a total of 116 metabolites were identified. When comparing the atmospheric (i.e., in air) and anaerobic (i.e., in an anaerobic chamber) processing of metabolome sample preparation, there was no significant difference in the quality and quantity of the metabolomic data. For metabolite extraction, pure methanol at -20°C was a better solvent than acetonitrile/methanol/water (2:2:1, v/v/v) at -20°C that is frequently used for C. acetobutylicum, and metabolite profiles were significantly different depending on extraction solvents. This is the first evaluation of metabolite sample preparation under aerobic processing conditions for an anaerobe. This method could be applied conveniently, efficiently, and reliably to metabolome analysis for strict anaerobes in air. © 2014 Wiley Periodicals, Inc.

  18. Prior-based artifact correction (PBAC) in computed tomography

    International Nuclear Information System (INIS)

    Heußer, Thorsten; Brehm, Marcus; Ritschl, Ludwig; Sawall, Stefan; Kachelrieß, Marc

    2014-01-01

    Purpose: Image quality in computed tomography (CT) often suffers from artifacts which may reduce the diagnostic value of the image. In many cases, these artifacts result from missing or corrupt regions in the projection data, e.g., in the case of metal, truncation, and limited angle artifacts. The authors propose a generalized correction method for different kinds of artifacts resulting from missing or corrupt data by making use of available prior knowledge to perform data completion. Methods: The proposed prior-based artifact correction (PBAC) method requires prior knowledge in form of a planning CT of the same patient or in form of a CT scan of a different patient showing the same body region. In both cases, the prior image is registered to the patient image using a deformable transformation. The registered prior is forward projected and data completion of the patient projections is performed using smooth sinogram inpainting. The obtained projection data are used to reconstruct the corrected image. Results: The authors investigate metal and truncation artifacts in patient data sets acquired with a clinical CT and limited angle artifacts in an anthropomorphic head phantom data set acquired with a gantry-based flat detector CT device. In all cases, the corrected images obtained by PBAC are nearly artifact-free. Compared to conventional correction methods, PBAC achieves better artifact suppression while preserving the patient-specific anatomy at the same time. Further, the authors show that prominent anatomical details in the prior image seem to have only minor impact on the correction result. Conclusions: The results show that PBAC has the potential to effectively correct for metal, truncation, and limited angle artifacts if adequate prior data are available. Since the proposed method makes use of a generalized algorithm, PBAC may also be applicable to other artifacts resulting from missing or corrupt data
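
    A highly simplified sketch of the data-completion step is given below, assuming the prior slice has already been deformably registered, and using hard replacement of the flagged bins instead of the paper's smooth sinogram inpainting. scikit-image's radon/iradon stand in for the scanner geometry (filter_name requires a reasonably recent scikit-image):

    ```python
    import numpy as np
    from skimage.transform import radon, iradon

    def pbac_sketch(patient_sino, prior_image, bad_mask, theta):
        """patient_sino : measured sinogram; bad_mask flags missing/corrupt bins.
        prior_image  : registered planning/prior CT slice (registration assumed done).
        theta        : projection angles in degrees."""
        prior_sino = radon(prior_image, theta=theta)      # forward project the prior
        completed = patient_sino.copy()
        completed[bad_mask] = prior_sino[bad_mask]        # crude hard replacement
        return iradon(completed, theta=theta, filter_name="ramp")
    ```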

  19. A combined method for DNA analysis and radiocarbon dating from a single sample.

    Science.gov (United States)

    Korlević, Petra; Talamo, Sahra; Meyer, Matthias

    2018-03-07

    Current protocols for ancient DNA and radiocarbon analysis of ancient bones and teeth call for multiple destructive samplings of a given specimen, thereby increasing the extent of undesirable damage to precious archaeological material. Here we present a method that makes it possible to obtain both ancient DNA sequences and radiocarbon dates from the same sample material. This is achieved by releasing DNA from the bone matrix through incubation with either EDTA or phosphate buffer prior to complete demineralization and collagen extraction utilizing the acid-base-acid-gelatinization and ultrafiltration procedure established in most radiocarbon dating laboratories. Using a set of 12 bones of different ages and preservation conditions we demonstrate that on average 89% of the DNA can be released from sample powder with minimal, or 38% without any, detectable collagen loss. We also detect no skews in radiocarbon dates compared to untreated samples. Given the different material demands for radiocarbon dating (500 mg of bone/dentine) and DNA analysis (10-100 mg), combined DNA and collagen extraction not only streamlines the sampling process but also drastically increases the amount of DNA that can be recovered from limited sample material.

  20. Estimating Functions with Prior Knowledge, (EFPK) for diffusions

    DEFF Research Database (Denmark)

    Nolsøe, Kim; Kessler, Mathieu; Madsen, Henrik

    2003-01-01

    In this paper a method is formulated in an estimating function setting for parameter estimation, which allows the use of prior information. The main idea is to use prior knowledge of the parameters, either specified as moment restrictions or as a distribution, and use it in the construction of an estimating function. It may be useful when the full Bayesian analysis is difficult to carry out for computational reasons. This is almost always the case for diffusions, which is the focus of this paper, though the method applies in other settings.

  1. Nonlinear Spatial Inversion Without Monte Carlo Sampling

    Science.gov (United States)

    Curtis, A.; Nawaz, A.

    2017-12-01

    High-dimensional, nonlinear inverse or inference problems usually have non-unique solutions. The distribution of solutions is described by probability distributions, and these are usually found using Monte Carlo (MC) sampling methods. These take pseudo-random samples of models in parameter space, calculate the probability of each sample given available data and other information, and thus map out high or low probability values of model parameters. However, such methods converge to the solution only as the number of samples tends to infinity; in practice, MC is found to be slow to converge, convergence is not guaranteed to be achieved in finite time, and detection of convergence requires the use of subjective criteria. We propose a method for Bayesian inversion of categorical variables such as geological facies or rock types in spatial problems, which requires no sampling at all. The method uses a 2-D Hidden Markov Model over a grid of cells, where observations represent localized data constraining the model in each cell. The data in our example application are seismic properties such as P- and S-wave impedances or rock density; our model parameters are the hidden states and represent the geological rock types in each cell. The observations at each location are assumed to depend on the facies at that location only - an assumption referred to as 'localized likelihoods'. However, the facies at a location cannot be determined solely by the observation at that location, as it also depends on prior information concerning its correlation with the spatial distribution of facies elsewhere. Such prior information is included in the inversion in the form of a training image which represents a conceptual depiction of the distribution of local geologies that might be expected, but other forms of prior information can be used in the method as desired. The method provides direct (pseudo-analytic) estimates of posterior marginal probability distributions over each variable.
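
    As a concrete illustration of inference with localized likelihoods, the following Python sketch computes posterior marginal facies probabilities on a 1-D chain with the forward-backward algorithm; the paper's 2-D grid formulation generalizes this idea. The transition matrix, prior and likelihood values are illustrative assumptions, not quantities from the study.

        import numpy as np

        n_facies = 2                      # e.g. shale vs. sand
        T = np.array([[0.9, 0.1],         # prior spatial correlation between
                      [0.2, 0.8]])        # neighbouring cells (transition matrix)
        pi0 = np.array([0.5, 0.5])        # prior for the first cell

        # p(observation | facies) per cell: rows = cells, cols = facies.
        lik = np.array([[0.8, 0.2],
                        [0.6, 0.4],
                        [0.1, 0.9],
                        [0.3, 0.7]])

        n = lik.shape[0]
        alpha = np.zeros((n, n_facies))   # forward messages
        beta = np.ones((n, n_facies))     # backward messages

        alpha[0] = pi0 * lik[0]
        alpha[0] /= alpha[0].sum()
        for t in range(1, n):
            alpha[t] = lik[t] * (alpha[t - 1] @ T)
            alpha[t] /= alpha[t].sum()
        for t in range(n - 2, -1, -1):
            beta[t] = T @ (lik[t + 1] * beta[t + 1])
            beta[t] /= beta[t].sum()

        posterior = alpha * beta
        posterior /= posterior.sum(axis=1, keepdims=True)
        print(posterior)                  # marginal facies probabilities per cell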

  2. A new formulation of the linear sampling method: spatial resolution and post-processing

    International Nuclear Information System (INIS)

    Piana, M; Aramini, R; Brignone, M; Coyle, J

    2008-01-01

    A new formulation of the linear sampling method is described, which requires the regularized solution of a single functional equation set in a direct sum of L^2 spaces. This new approach presents the following notable advantages: it is computationally more effective than the traditional implementation, since time-consuming samplings of the Tikhonov minimum problem and of the generalized discrepancy equation are avoided; it allows a quantitative estimate of the spatial resolution achievable by the method; it facilitates a post-processing procedure for the optimal selection of the scatterer profile by means of edge detection techniques. The formulation is described in a two-dimensional framework and in the case of obstacle scattering, although generalizations to three dimensions and penetrable inhomogeneities are straightforward.
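
    At the core of such methods is a Tikhonov-regularized solve for each sampling point. The Python sketch below solves the discretized normal equations (F^H F + alpha I) g = F^H b for a toy operator F; the operator, right-hand side and fixed alpha are all assumed stand-ins, and the single-equation structure emphasized in the abstract is not reproduced here.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 40
        # Toy discretized operator (e.g. a far-field operator) and
        # right-hand side for one sampling point; both are random stand-ins.
        F = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
        b = rng.standard_normal(n) + 1j * rng.standard_normal(n)
        alpha = 1e-2   # regularization parameter (assumed fixed here)

        # Tikhonov solution: g = argmin ||F g - b||^2 + alpha ||g||^2
        g = np.linalg.solve(F.conj().T @ F + alpha * np.eye(n), F.conj().T @ b)
        indicator = 1.0 / np.linalg.norm(g)   # large inside the scatterer support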

  3. Scientific guidelines for preservation of samples collected from Mars

    International Nuclear Information System (INIS)

    Gooding, J.L.

    1990-04-01

    The maximum scientific value of Martian geologic and atmospheric samples is retained when the samples are preserved in the conditions that applied prior to their collection. Any sample degradation equates to loss of information. Based on a detailed review of pertinent scientific literature, and advice from experts in planetary sample analysis, numerical values are recommended for key parameters in the environmental control of collected samples with respect to material contamination, temperature, head-space gas pressure, ionizing radiation, magnetic fields, and acceleration/shock. Parametric values recommended for the most sensitive geologic samples should also be adequate to preserve any biogenic compounds or exobiological relics.

  4. Use of a holder-vacuum tube device to save on-site hands in preparing urine samples for head-space gas-chromatography, and its application to determine the time allowance for sample sealing.

    Science.gov (United States)

    Kawai, Toshio; Sumino, Kimiaki; Ohashi, Fumiko; Ikeda, Masayuki

    2011-01-01

    To facilitate urine sample preparation prior to head-space gas-chromatographic (HS-GC) analysis, urine samples containing one of five solvents (acetone, methanol, methyl ethyl ketone, methyl isobutyl ketone and toluene) at the levels of biological exposure limits were aspirated into a vacuum tube via a holder, a device commercially available for venous blood collection (the vacuum tube method). The urine sample, 5 ml, was quantitatively transferred to a 20-ml head-space vial prior to HS-GC analysis. The loaded tubes were stored at +4 °C in the dark for up to 3 d. The vacuum tube method facilitated on-site procedures of urine sample preparation for HS-GC with no significant loss of solvents in the sample and no need for skilled hands, while on-site sample preparation time was significantly reduced. Furthermore, no loss of solvents was detected during the 3-d storage, irrespective of hydrophilic (acetone) or lipophilic (toluene) solvent. In a pilot application, the high performance of the vacuum tube method in sealing a sample in an air-tight space confirmed that no solvent is lost when sealing is completed within 5 min after urine voiding, and that the allowance time is as long as 30 min in the case of toluene in urine. The use of the holder-vacuum tube device not only saves hands for transfer of the sample to an air-tight space, but also facilitates sample storage prior to HS-GC analysis.

  5. Rapid quantitation of atorvastatin in process pharmaceutical powder sample using Raman spectroscopy and evaluation of parameters related to accuracy of analysis

    Science.gov (United States)

    Lim, Young-Il; Han, Janghee; Woo, Young-Ah; Kim, Jaejin; Kang, Myung Joo

    2018-07-01

    The purpose of this study was to determine the atorvastatin (ATV) content in in-process pharmaceutical powder samples using Raman spectroscopy. To establish the analysis method, the influence of the type of Raman measurement (back-scattering or transmission mode), preparation of calibration samples (simple admixing or granulation), sample pre-treatment (pelletization), and spectral pretreatment on the Raman spectra was investigated. The characteristic peak of the active compound was more distinctively detected in transmission Raman mode with a laser spot size of 4 mm than in the back-scattering method. Preparation of calibration samples by wet granulation, identical to the actual manufacturing process, provided unchanged spectral patterns for the in-process sample, with no changes and/or shifts in the spectrum. Pelletization before Raman analysis remarkably improved spectral reproducibility by decreasing the difference in density between the samples. Probabilistic quotient normalization led to accurate and consistent quantification of the ATV content in the calibration samples (standard error of cross validation: 1.21%). Moreover, the drug content in the granules obtained from five commercial batches was reliably quantified, with no statistical difference (p = 0.09) from that obtained by HPLC assay. From these findings, we suggest that transmission Raman analysis may be a fast and non-invasive method for the quantification of ATV in actual manufacturing processes.
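
    Probabilistic quotient normalization, the spectral pretreatment credited above, is simple to state in code. The Python sketch below is a generic implementation over the rows of a spectra matrix (assumed non-negative intensities), not the authors' exact processing pipeline.

        import numpy as np

        def pqn(spectra, reference=None):
            """Probabilistic quotient normalization of the rows of `spectra`.

            Each spectrum is divided by the median of its point-wise
            quotients against a reference spectrum (here: the median
            spectrum), which corrects for overall intensity and density
            differences between samples.
            """
            spectra = np.asarray(spectra, dtype=float)
            if reference is None:
                reference = np.median(spectra, axis=0)
            quotients = spectra / reference
            factors = np.median(quotients, axis=1)
            return spectra / factors[:, None]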

  6. Fast multi-dimensional NMR by minimal sampling

    Science.gov (United States)

    Kupče, Ēriks; Freeman, Ray

    2008-03-01

    A new scheme is proposed for very fast acquisition of three-dimensional NMR spectra based on minimal sampling, instead of the customary step-wise exploration of all of evolution space. The method relies on prior experiments to determine accurate values for the evolving frequencies and intensities from the two-dimensional 'first planes' recorded by setting t1 = 0 or t2 = 0. With this prior knowledge, the entire three-dimensional spectrum can be reconstructed by an additional measurement of the response at a single location (t1∗,t2∗) where t1∗ and t2∗ are fixed values of the evolution times. A key feature is the ability to resolve problems of overlap in the acquisition dimension. Applied to a small protein, agitoxin, the three-dimensional HNCO spectrum is obtained 35 times faster than systematic Cartesian sampling of the evolution domain. The extension to multi-dimensional spectroscopy is outlined.

  7. Arthur Prior and 'Now'

    DEFF Research Database (Denmark)

    Blackburn, Patrick Rowan; Jørgensen, Klaus Frovin

    2016-01-01

    Prior's search led him through the work of Castañeda, and back to his own work on hybrid logic: the first made temporal reference philosophically respectable, the second made it technically feasible in a modal framework. With the aid of hybrid logic, Prior built a bridge from a two-dimensional UT calculus...

  8. Self-prior strategy for organ reconstruction in fluorescence molecular tomography.

    Science.gov (United States)

    Zhou, Yuan; Chen, Maomao; Su, Han; Luo, Jianwen

    2017-10-01

    The purpose of this study is to propose a strategy for organ reconstruction in fluorescence molecular tomography (FMT) without prior information from other imaging modalities, and to overcome the high cost and ionizing radiation caused by the traditional structural prior strategy. The proposed strategy is designed as an iterative architecture to solve the inverse problem of FMT. In each iteration, a short-time Fourier transform (STFT) based algorithm is used to extract the self-prior information in the space-frequency energy spectrum with the assumption that the regions with higher fluorescence concentration have larger energy intensity; then the cost function of the inverse problem is modified by the self-prior information, and lastly an iterative Laplacian regularization algorithm is applied to solve the updated inverse problem and obtain the reconstruction results. Simulations and in vivo experiments on liver reconstruction are carried out to test the performance of the self-prior strategy on organ reconstruction. The organ reconstruction results obtained by the proposed self-prior strategy are closer to the ground truth than those obtained by the iterative Tikhonov regularization (ITKR) method (traditional non-prior strategy). Significant improvements are shown in the evaluation indexes of relative locational error (RLE), relative error (RE) and contrast-to-noise ratio (CNR). The self-prior strategy improves the organ reconstruction results compared with the non-prior strategy and also overcomes the shortcomings of the traditional structural prior strategy. Various applications such as metabolic imaging and pharmacokinetic studies can be aided by this strategy.

  9. Delimbing hybrid poplar prior to processing with a flail/chipper

    Science.gov (United States)

    Bruce Hartsough; Raffaele Spinelli; Steve Pottle

    2000-01-01

    We compared the performance of a flail/chipper for processing a) whole poplar trees and b) poplar trees that had been roughly delimbed with a pull-through delimber. Production rate was about 10% higher for the delimbed trees. The reduced cost of flail/chipping would not cover the additional cost of delimbing with the machine mix tested, but changes to equipment might...

  10. Heuristics as Bayesian inference under extreme priors.

    Science.gov (United States)

    Parpart, Paula; Jones, Matt; Love, Bradley C

    2018-05-01

    Simple heuristics are often regarded as tractable decision strategies because they ignore a great deal of information in the input data. One puzzle is why heuristics can outperform full-information models, such as linear regression, which make full use of the available information. These "less-is-more" effects, in which a relatively simpler model outperforms a more complex model, are prevalent throughout cognitive science, and are frequently argued to demonstrate an inherent advantage of simplifying computation or ignoring information. In contrast, we show at the computational level (where algorithmic restrictions are set aside) that it is never optimal to discard information. Through a formal Bayesian analysis, we prove that popular heuristics, such as tallying and take-the-best, are formally equivalent to Bayesian inference under the limit of infinitely strong priors. Varying the strength of the prior yields a continuum of Bayesian models with the heuristics at one end and ordinary regression at the other. Critically, intermediate models perform better across all our simulations, suggesting that down-weighting information with the appropriate prior is preferable to entirely ignoring it. Rather than because of their simplicity, our analyses suggest heuristics perform well because they implement strong priors that approximate the actual structure of the environment. We end by considering how new heuristics could be derived by infinitely strengthening the priors of other Bayesian models. These formal results have implications for work in psychology, machine learning and economics. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
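
    The limit result can be checked numerically. In the Python sketch below (toy data, not from the paper), ridge estimates are computed for increasing prior strength lambda; as lambda grows, the direction of the coefficient vector approaches that of X'y, i.e., each cue weighted by its correlation with the criterion, which is the sense in which tallying-like weighting emerges in the infinite-prior limit.

        import numpy as np

        rng = np.random.default_rng(1)
        X = rng.standard_normal((100, 5))      # standardized cues (toy data)
        y = X @ np.array([1.0, 0.8, 0.5, 0.3, 0.1]) + rng.standard_normal(100)

        def ridge(X, y, lam):
            p = X.shape[1]
            return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

        for lam in [0.0, 10.0, 1e3, 1e6]:
            b = ridge(X, y, lam)
            print(lam, b / np.linalg.norm(b))   # direction of the estimate

        # As lam -> infinity, (X'X + lam I)^{-1} X'y -> X'y / lam, so the
        # *direction* of the coefficients approaches that of X'y: every cue
        # weighted by its correlation with the criterion. With cues coded to
        # point in their valid directions, equalizing these weights' magnitudes
        # recovers the tallying heuristic.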

  11. Limit theory for the sample autocorrelations and extremes of a GARCH (1,1) process

    NARCIS (Netherlands)

    Mikosch, T; Starica, C

    2000-01-01

    The asymptotic theory for the sample autocorrelations and extremes of a GARCH(1,1) process is provided. Special attention is given to the case when the sum of the ARCH and GARCH parameters is close to 1, that is, when one is close to an infinite variance marginal distribution. This situation has
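
    For readers who want to reproduce the flavor of this setting, the Python sketch below simulates a GARCH(1,1) process whose ARCH and GARCH parameters sum to 0.95 and computes sample autocorrelations of the squared series; all parameter values are illustrative, not taken from the paper.

        import numpy as np

        rng = np.random.default_rng(5)
        omega, a1, b1 = 0.1, 0.4, 0.55     # a1 + b1 close to 1: heavy tails
        n = 10000
        x = np.zeros(n)
        sig2 = omega / (1 - a1 - b1)       # start at the stationary variance
        for t in range(1, n):
            sig2 = omega + a1 * x[t - 1] ** 2 + b1 * sig2
            x[t] = np.sqrt(sig2) * rng.standard_normal()

        def acf(v, lag):
            v = v - v.mean()
            return np.dot(v[:-lag], v[lag:]) / np.dot(v, v)

        # Sample autocorrelations of the squared process at lags 1..5.
        print([round(acf(x ** 2, k), 3) for k in range(1, 6)])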

  12. Benchmarking in Prior Learning Assessment and Recognition (PLAR – Tool for training reform

    Directory of Open Access Journals (Sweden)

    Rodica MOLDOVEANU

    2009-12-01

    Adults invariably need to change their job from sector to sector and place to place as their life and training needs change. In this respect, the importance of recognizing people's knowledge, skills and competencies as a basis for further learning and development cannot be overstated; this is the aim of the Prior Learning Assessment and Recognition (PLAR) process. Accepted benchmarks for PLAR support the assessment process by providing a list of functions, skills and knowledge that can be used as a complementary set of generic standards of best practice in PLAR.

  13. Direct PCR amplification of forensic touch and other challenging DNA samples: A review.

    Science.gov (United States)

    Cavanaugh, Sarah E; Bathrick, Abigail S

    2018-01-01

    DNA evidence sample processing typically involves DNA extraction, quantification, and STR amplification; however, DNA loss can occur at both the DNA extraction and quantification steps, which is not ideal for forensic evidence containing low levels of DNA. Direct PCR amplification of forensic unknown samples has been suggested as a means to circumvent extraction and quantification, thereby retaining the DNA typically lost during those procedures. Direct PCR amplification is a method in which a sample is added directly to an amplification reaction without being subjected to prior DNA extraction, purification, or quantification. It allows for maximum quantities of DNA to be targeted, minimizes opportunities for error and contamination, and reduces the time and monetary resources required to process samples, although data analysis may take longer as the increased DNA detection sensitivity of direct PCR may lead to more instances of complex mixtures. ISO 17025 accredited laboratories have successfully implemented direct PCR for limited purposes (e.g., high-throughput databanking analysis), and recent studies indicate that direct PCR can be an effective method for processing low-yield evidence samples. Despite its benefits, direct PCR has yet to be widely implemented across laboratories for the processing of evidentiary items. While forensic DNA laboratories are always interested in new methods that will maximize the quantity and quality of genetic information obtained from evidentiary items, there is often a lag between the advent of useful methodologies and their integration into laboratories. Delayed implementation of direct PCR of evidentiary items can be attributed to a variety of factors, including regulatory guidelines that prevent laboratories from omitting the quantification step when processing forensic unknown samples, as is the case in the United States, and, more broadly, a reluctance to validate a technique that is not widely used for evidence samples.

  14. Process for compound transformation

    KAUST Repository

    Basset, Jean-Marie

    2016-12-29

    Embodiments of the present disclosure provide for methods of using a catalytic system to chemically transform a compound (e.g., a hydrocarbon). In an embodiment, the method does not employ grafting the catalyst prior to catalysis. In particular, embodiments of the present disclosure provide for a process of hydrocarbon (e.g., C1 to C20 hydrocarbon) metathesis (e.g., alkane, olefin, or alkyne metathesis) transformation, where the process can be conducted without employing grafting prior to catalysis.

  15. A Unifying model of perfusion and motion applied to reconstruction of sparsely sampled free-breathing myocardial perfusion MRI

    DEFF Research Database (Denmark)

    Pedersen, Henrik; Ólafsdóttir, Hildur; Larsen, Rasmus

    2010-01-01

    The clinical potential of dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is currently limited by respiratory induced motion of the heart. This paper presents a unifying model of perfusion and motion in which respiratory motion becomes an integral part of myocardial perfusion quantification. Hence, the need for tedious manual motion correction prior to perfusion quantification is avoided. In addition, we demonstrate that the proposed framework facilitates the process of reconstructing DCE-MRI from sparsely sampled data in the presence of respiratory motion. The paper focuses primarily on the underlying theory of the proposed framework, but shows in vivo results of respiratory motion correction and simulation results of reconstructing sparsely sampled data.

  16. A Review of Inflammatory Processes of the Breast with a Focus on Diagnosis in Core Biopsy Samples

    Directory of Open Access Journals (Sweden)

    Timothy M. D’Alfonso

    2015-07-01

    Inflammatory and reactive lesions of the breast are relatively uncommon among benign breast lesions and can be the source of an abnormality on imaging. Such lesions can simulate a malignant process, based on both clinical and radiographic findings, and core biopsy is often performed to rule out malignancy. Furthermore, some inflammatory processes can mimic carcinoma or other malignancy microscopically, and vice versa. Diagnostic difficulty may arise due to the small and fragmented sample of a core biopsy. This review will focus on the pertinent clinical, radiographic, and histopathologic features of the more commonly encountered inflammatory lesions of the breast that can be characterized in a core biopsy sample. These include fat necrosis, mammary duct ectasia, granulomatous lobular mastitis, diabetic mastopathy, and abscess. The microscopic differential diagnoses for these lesions when seen in a core biopsy sample will be discussed.

  17. A Review of Inflammatory Processes of the Breast with a Focus on Diagnosis in Core Biopsy Samples.

    Science.gov (United States)

    D'Alfonso, Timothy M; Ginter, Paula S; Shin, Sandra J

    2015-07-01

    Inflammatory and reactive lesions of the breast are relatively uncommon among benign breast lesions and can be the source of an abnormality on imaging. Such lesions can simulate a malignant process, based on both clinical and radiographic findings, and core biopsy is often performed to rule out malignancy. Furthermore, some inflammatory processes can mimic carcinoma or other malignancy microscopically, and vice versa. Diagnostic difficulty may arise due to the small and fragmented sample of a core biopsy. This review will focus on the pertinent clinical, radiographic, and histopathologic features of the more commonly encountered inflammatory lesions of the breast that can be characterized in a core biopsy sample. These include fat necrosis, mammary duct ectasia, granulomatous lobular mastitis, diabetic mastopathy, and abscess. The microscopic differential diagnoses for these lesions when seen in a core biopsy sample will be discussed.

  18. Prospective regularization design in prior-image-based reconstruction

    International Nuclear Information System (INIS)

    Dang, Hao; Siewerdsen, Jeffrey H; Stayman, J Webster

    2015-01-01

    Prior-image-based reconstruction (PIBR) methods leveraging patient-specific anatomical information from previous imaging studies and/or sequences have demonstrated dramatic improvements in dose utilization and image quality for low-fidelity data. However, a proper balance of information from the prior images and information from the measurements is required (e.g. through careful tuning of regularization parameters). Inappropriate selection of reconstruction parameters can lead to detrimental effects including false structures and failure to improve image quality. Traditional methods based on heuristics are subject to error and sub-optimal solutions, while exhaustive searches require a large number of computationally intensive image reconstructions. In this work, we propose a novel method that prospectively estimates the optimal amount of prior image information for accurate admission of specific anatomical changes in PIBR without performing full image reconstructions. This method leverages an analytical approximation to the implicitly defined PIBR estimator, and introduces a predictive performance metric leveraging this analytical form and knowledge of a particular presumed anatomical change whose accurate reconstruction is sought. Additionally, since model-based PIBR approaches tend to be space-variant, a spatially varying prior image strength map is proposed to optimally admit changes everywhere in the image (eliminating the need to know change locations a priori). Studies were conducted in both an ellipse phantom and a realistic thorax phantom emulating a lung nodule surveillance scenario. The proposed method demonstrated accurate estimation of the optimal prior image strength while achieving a substantial computational speedup (about a factor of 20) compared to traditional exhaustive search. Moreover, the use of the proposed prior strength map in PIBR demonstrated accurate reconstruction of anatomical changes without foreknowledge of change locations in the image.
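
    A PIBR-style estimator balancing measurements against a prior image can be written as x_hat = argmin ||Ax - y||^2 + beta ||x - x_prior||^2. The Python sketch below solves this toy quadratic by gradient descent; A, y, x_prior and beta are all assumed toy quantities, and the paper's actual contribution, prospectively choosing beta as a spatially varying map, is not reproduced here.

        import numpy as np

        rng = np.random.default_rng(2)
        A = rng.standard_normal((80, 50))          # toy projection operator
        x_true = rng.standard_normal(50)
        y = A @ x_true + 0.1 * rng.standard_normal(80)
        x_prior = x_true + 0.2 * rng.standard_normal(50)
        beta = 5.0                                  # prior image strength

        x = np.zeros(50)
        # Step size 1/L for Lipschitz constant L = sigma_max(A)^2 + beta.
        step = 1.0 / (np.linalg.norm(A, 2) ** 2 + beta)
        for _ in range(500):
            grad = A.T @ (A @ x - y) + beta * (x - x_prior)
            x -= step * grad                        # converges to the minimizer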

  19. Achieving Our Potential: An Action Plan for Prior Learning Assessment and Recognition (PLAR) in Canada

    Science.gov (United States)

    Morrissey, Mary; Myers, Douglas; Belanger, Paul; Robitaille, Magali; Davison, Phil; Van Kleef, Joy; Williams, Rick

    2008-01-01

    This comprehensive publication assesses the status of prior learning assessment and recognition (PLAR) across Canada and offers insights and recommendations into the processes necessary for employers, post-secondary institutions and government to recognize and value experiential and informal learning. Acknowledging economic trends in Canada's job…

  20. Bayesian Prior Probability Distributions for Internal Dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Miller, G.; Inkret, W.C.; Little, T.T.; Martz, H.F.; Schillaci, M.E

    2001-07-01

    The problem of choosing a prior distribution for the Bayesian interpretation of measurements (specifically internal dosimetry measurements) is considered using a theoretical analysis and by examining historical tritium and plutonium urine bioassay data from Los Alamos. Two models for the prior probability distribution are proposed: (1) the log-normal distribution, when there is some additional information to determine the scale of the true result, and (2) the 'alpha' distribution (a simplified variant of the gamma distribution) when there is not. These models have been incorporated into version 3 of the Bayesian internal dosimetric code in use at Los Alamos (downloadable from our web site). Plutonium internal dosimetry at Los Alamos is now being done using prior probability distribution parameters determined self-consistently from population averages of Los Alamos data. (author)
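
    A numerical sketch of the Bayesian measurement interpretation described above: a lognormal prior on the true amount combined with a Gaussian measurement likelihood, with the posterior computed on a grid. All parameter values below are illustrative assumptions, not those used at Los Alamos.

        import numpy as np

        x = np.linspace(1e-3, 50, 5000)          # grid over the true amount
        mu, sigma_ln = np.log(5.0), 1.0          # lognormal prior: median 5
        prior = np.exp(-(np.log(x) - mu) ** 2 / (2 * sigma_ln ** 2)) / x
        meas, sigma = 3.0, 2.0                   # measured value and its s.d.
        likelihood = np.exp(-(meas - x) ** 2 / (2 * sigma ** 2))

        posterior = prior * likelihood
        posterior /= np.trapz(posterior, x)      # normalize on the grid
        posterior_mean = np.trapz(x * posterior, x)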

  1. Automated CBED processing: Sample thickness estimation based on analysis of zone-axis CBED pattern

    Energy Technology Data Exchange (ETDEWEB)

    Klinger, M., E-mail: klinger@post.cz; Němec, M.; Polívka, L.; Gärtnerová, V.; Jäger, A.

    2015-03-15

    An automated processing of convergent beam electron diffraction (CBED) patterns is presented. The proposed methods are used in an automated tool for estimating the thickness of transmission electron microscopy (TEM) samples by matching an experimental zone-axis CBED pattern with a series of patterns simulated for known thicknesses. The proposed tool detects CBED disks, localizes a pattern in detected disks and unifies the coordinate system of the experimental pattern with the simulated one. The experimental pattern is then compared disk-by-disk with a series of simulated patterns each corresponding to different known thicknesses. The thickness of the most similar simulated pattern is then taken as the thickness estimate. The tool was tested on [0 1 1] Si, [0 1 0] α-Ti and [0 1 1] α-Ti samples prepared using different techniques. Results of the presented approach were compared with thickness estimates based on analysis of CBED patterns in two beam conditions. The mean difference between these two methods was 4.1% for the FIB-prepared silicon samples, 5.2% for the electro-chemically polished titanium and 7.9% for Ar{sup +} ion-polished titanium. The proposed techniques can also be employed in other established CBED analyses. Apart from the thickness estimation, it can potentially be used to quantify lattice deformation, structure factors, symmetry, defects or extinction distance. - Highlights: • Automated TEM sample thickness estimation using zone-axis CBED is presented. • Computer vision and artificial intelligence are employed in CBED processing. • This approach reduces operator effort, analysis time and increases repeatability. • Individual parts can be employed in other analyses of CBED/diffraction pattern.
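
    The matching step reduces to scoring the experimental pattern against each simulated pattern and picking the best. The Python sketch below uses normalized cross-correlation over intensities sampled inside the registered CBED disks; disk detection and registration, the harder parts of the pipeline, are assumed already done, and the interface is hypothetical.

        import numpy as np

        def estimate_thickness(experimental, simulated, thicknesses):
            """Pick the thickness whose simulated pattern best matches.

            experimental: 1-D array of intensities sampled inside the
            detected, registered CBED disks.
            simulated: 2-D array, one row per candidate thickness.
            thicknesses: candidate thickness values, one per row.
            """
            e = (experimental - experimental.mean()) / experimental.std()
            scores = []
            for row in simulated:
                s = (row - row.mean()) / row.std()
                scores.append(np.mean(e * s))   # normalized cross-correlation
            return thicknesses[int(np.argmax(scores))]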

  2. High throughput sample processing and automated scoring

    Directory of Open Access Journals (Sweden)

    Gunnar Brunborg

    2014-10-01

    The comet assay is a sensitive and versatile method for assessing DNA damage in cells. In the traditional version of the assay, there are many manual steps involved and few samples can be treated in one experiment. High throughput modifications have been developed during recent years, and they are reviewed and discussed here. These modifications include accelerated scoring of comets; other important elements that have been studied and adapted to high throughput are cultivation and manipulation of cells or tissues before and after exposure, and freezing of treated samples until comet analysis and scoring. High throughput methods save time and money, but they are useful also for other reasons: large-scale experiments may be performed which are otherwise not practicable (e.g., analysis of many organs from exposed animals, and human biomonitoring studies), and automation gives more uniform sample treatment and less dependence on operator performance. The high throughput modifications now available vary largely in their versatility, capacity, complexity and costs. The bottleneck for further increase of throughput appears to be the scoring.

  3. Forward flux sampling calculation of homogeneous nucleation rates from aqueous NaCl solutions.

    Science.gov (United States)

    Jiang, Hao; Haji-Akbari, Amir; Debenedetti, Pablo G; Panagiotopoulos, Athanassios Z

    2018-01-28

    We used molecular dynamics simulations and the path sampling technique known as forward flux sampling to study homogeneous nucleation of NaCl crystals from supersaturated aqueous solutions at 298 K and 1 bar. Nucleation rates were obtained for a range of salt concentrations for the Joung-Cheatham NaCl force field combined with the Extended Simple Point Charge (SPC/E) water model. The calculated nucleation rates are significantly lower than the available experimental measurements. The estimates for the nucleation rates in this work do not rely on classical nucleation theory, but the pathways observed in the simulations suggest that the nucleation process is better described by classical nucleation theory than an alternative interpretation based on Ostwald's step rule, in contrast to some prior simulations of related models. In addition to the size of NaCl nucleus, we find that the crystallinity of a nascent cluster plays an important role in the nucleation process. Nuclei with high crystallinity were found to have higher growth probability and longer lifetimes, possibly because they are less exposed to hydration water.
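
    Forward flux sampling estimates the rate as the flux through the first interface times the product of conditional interface-crossing probabilities, k = Phi_0 * prod_i P(lambda_{i+1} | lambda_i). The Python fragment below shows only this final bookkeeping, with made-up counts as placeholders; the expensive part, generating the trial trajectories in molecular dynamics, is omitted.

        import numpy as np

        phi0 = 2.4e-4                            # flux through lambda_0
                                                 # (placeholder value)
        n_success = np.array([52, 31, 18, 12])   # successful crossings per
        n_trials = np.array([400, 350, 300, 250])  # interface (placeholders)

        p = n_success / n_trials                 # conditional probabilities
        rate = phi0 * np.prod(p)                 # nucleation rate estimate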

  4. Being a haematopoietic stem cell donor for a sick sibling: Adult donors' experiences prior to donation.

    Science.gov (United States)

    Kisch, Annika; Bolmsjö, Ingrid; Lenhoff, Stig; Bengtsson, Mariette

    2015-10-01

    There is a lack of knowledge about sibling stem cell donors' experiences pre-donation, and the waiting period before the donation might have been long. The donors and their corresponding sibling recipients were simultaneously included in two different interview studies. The results from the recipient study have been presented in a separate paper. The aim was to explore the experiences of being a stem cell donor for a sibling, prior to donation. Ten adult sibling donors were interviewed prior to stem cell donation. The interviews were digitally recorded, transcribed verbatim and subjected to qualitative content analysis. The main theme, Being a cog in a big wheel, describes the complex process of being a sibling donor prior to donation, covering a mixture of emotions and thoughts. The four subthemes Being available, Being anxious, Being concerned and Being obliged cover the various experiences. The sibling donors' experiences are influenced by the quality of the relationship with the sick sibling. Sibling stem cell donors go through a complex process in which they became involved by circumstance: they were asked to become a donor; it was not a voluntary choice. In caring for sibling stem cell donors, nurses should be aware of the complexity of the process the donors experience and take into consideration their personal situation and needs. Providing optimal care for both sibling donors and their corresponding recipients is a challenge, and further improvement and exploration are needed. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Spectrally Consistent Satellite Image Fusion with Improved Image Priors

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Aanæs, Henrik; Jensen, Thomas B.S.

    2006-01-01

    Here an improvement to our previous framework for satellite image fusion is presented: a framework purely based on the sensor physics and on prior assumptions on the fused image. The contributions of this paper are twofold. Firstly, a method for ensuring 100% spectral consistency is proposed, even when more sophisticated image priors are applied. Secondly, a better image prior is introduced, via data-dependent image smoothing.

  6. The impact of fecal sample processing on prevalence estimates for antibiotic-resistant Escherichia coli.

    Science.gov (United States)

    Omulo, Sylvia; Lofgren, Eric T; Mugoh, Maina; Alando, Moshe; Obiya, Joshua; Kipyegon, Korir; Kikwai, Gilbert; Gumbi, Wilson; Kariuki, Samuel; Call, Douglas R

    2017-05-01

    Investigators often rely on studies of Escherichia coli to characterize the burden of antibiotic resistance in a clinical or community setting. To determine if prevalence estimates for antibiotic resistance are sensitive to sample handling and interpretive criteria, we collected presumptive E. coli isolates (24 or 95 per stool sample) from a community in an urban informal settlement in Kenya. Isolates were tested for susceptibility to nine antibiotics using agar breakpoint assays and results were analyzed using generalized linear mixed models. We observed no significant effect of the number of isolates tested per sample on prevalence estimates (P>0.1). Prevalence estimates did not differ for five distinct E. coli colony morphologies on MacConkey agar plates (P>0.2). Successive re-plating of samples for up to five consecutive days had little to no impact on prevalence estimates. Finally, culturing E. coli under different conditions (with 5% CO2 or micro-aerobic) did not affect estimates of prevalence. For the conditions tested in these experiments, minor modifications in sample processing protocols are unlikely to bias estimates of the prevalence of antibiotic resistance for fecal E. coli. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. The effects of prior knowledge on study-time allocation and free recall: investigating the discrepancy reduction model.

    Science.gov (United States)

    Verkoeijen, Peter P J L; Rikers, Remy M J P; Schmidt, Henk G

    2005-01-01

    In this study, the authors examined the influence of prior knowledge activation on information processing by means of a prior knowledge activation procedure adopted from the read-generate paradigm. On the basis of cue-target pairs, participants in the experimental groups generated two different sets of items before studying a relevant list. Subsequently, participants were informed that they had to study the items in the list and that they should try to remember as many items as possible. The authors assessed the processing time allocated to the items in the list and free recall of those items. The results revealed that the experimental groups spent less time on items that had already been activated. In addition, the experimental groups outperformed the control group in overall free recall and in free recall of the activated items. Between-group comparisons did not demonstrate significant effects with respect to the processing time and free recall of nonactivated items. The authors interpreted these results in terms of the discrepancy reduction model of regulating the amount of processing time allocated to different parts of the list.

  8. Linking actions and objects: Context-specific learning of novel weight priors.

    Science.gov (United States)

    Trewartha, Kevin M; Flanagan, J Randall

    2017-06-01

    Distinct explicit and implicit memory processes support weight predictions used when lifting objects and making perceptual judgments about weight, respectively. The first time that an object is encountered, weight is predicted on the basis of learned associations, or priors, linking size and material to weight. A fundamental question is whether the brain maintains a single, global representation of priors, or multiple representations that can be updated in a context-specific way. A second key question is whether the updating of priors, or the ability to scale lifting forces when repeatedly lifting unusually weighted objects, requires focused attention. To investigate these questions we compared the adaptability of weight predictions used when lifting objects and judging their weights in different groups of participants who experienced size-weight inverted objects passively (with the objects placed on the hands) or actively (where participants lift the objects) under full or divided attention. To assess weight judgments we measured the size-weight illusion after every 20 trials of experience with the inverted objects, both passively and actively. The attenuation of the illusion that arises when lifting inverted objects was found to be context-specific, such that the attenuation was larger when the mode of interaction with the inverted objects matched the method of assessment of the illusion. Dividing attention during interaction with the inverted objects had no effect on attenuation of the illusion, but did slow the rate at which lifting forces were scaled to the weights of the inverted objects. These findings suggest that the brain stores multiple representations of priors that are context-specific, and that focused attention is important for scaling lifting forces, but not for updating weight predictions used when judging object weight. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Development of an autonomous treatment planning strategy for radiation therapy with effective use of population-based prior data.

    Science.gov (United States)

    Wang, Huan; Dong, Peng; Liu, Hongcheng; Xing, Lei

    2017-02-01

    Current treatment planning remains a costly and labor-intensive procedure and requires multiple trial-and-error adjustments of system parameters such as the weighting factors and prescriptions. The purpose of this work is to develop an autonomous treatment planning strategy with effective use of prior knowledge, implemented in a clinically realistic treatment planning platform, to facilitate the radiation therapy workflow. Our technique consists of three major components: (i) a clinical treatment planning system (TPS); (ii) a decision function constructed using an ensemble of prior treatment plans; (iii) a plan evaluator and an outer-loop optimization, independent of the clinical TPS, to assess the TPS-generated plan and to drive the search toward a solution optimizing the decision function. Microsoft (MS) Visual Studio Coded UI is applied to record some common planner-TPS interactions as subroutines for querying and interacting with the TPS. These subroutines are called back in the outer-loop optimization program to navigate the plan selection process through the solution space iteratively. The utility of the approach is demonstrated using clinical prostate and head-and-neck cases. An autonomous treatment planning technique with effective use of an ensemble of prior treatment plans is developed to automatically maneuver the clinical treatment planning process in the platform of a commercial TPS. The process mimics the decision-making process of a human planner and provides a clinically sensible treatment plan automatically, thus reducing/eliminating the tedious manual trial and error of treatment planning. It is found that the prostate and head-and-neck treatment plans generated using the approach compare favorably with those used for the patients' actual treatments. The clinical inverse treatment planning process can be automated effectively with the guidance of an ensemble of prior treatment plans. The approach has the potential to substantially reduce the manual effort involved in clinical treatment planning.

  10. Bayesian Inference for Structured Spike and Slab Priors

    DEFF Research Database (Denmark)

    Andersen, Michael Riis; Winther, Ole; Hansen, Lars Kai

    2014-01-01

    Sparse signal recovery addresses the problem of solving underdetermined linear inverse problems subject to a sparsity constraint. We propose a novel prior formulation, the structured spike and slab prior, which allows one to incorporate a priori knowledge of the sparsity pattern by imposing a spatial...
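
    A draw from a prior of this flavor can be sketched in a few lines: couple the support indicators through a latent Gaussian process so that active coefficients cluster spatially, then draw slab values where the support is active. The kernel, threshold and sizes below are illustrative assumptions, not the paper's exact construction.

        import numpy as np

        rng = np.random.default_rng(4)
        n = 100
        t = np.arange(n)
        # Squared-exponential covariance on a 1-D grid couples the support.
        K = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / 5.0 ** 2)
        g = rng.multivariate_normal(np.full(n, -1.0), K + 1e-8 * np.eye(n))
        z = g > 0                         # structured support (spike vs. slab)
        x = np.where(z, rng.standard_normal(n), 0.0)   # slab draws where active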

  11. Women's experiences of self-reporting health online prior to their first midwifery visit

    DEFF Research Database (Denmark)

    Johnsen, Helle; Clausen, Jette Aaroe; Hvidtjørn, Dorte

    2018-01-01

    BACKGROUND: Information and communication technologies are increasingly used in health care to meet demands of efficiency, safety and patient-centered care. At a large Danish regional hospital, women report their physical and mental health and personal needs prior to their first antenatal visit. Little is known about the process of self-reporting health, and how this information is managed during the client-professional meeting. AIM: To explore women's experiences of self-reporting their health status and personal needs online prior to the first midwifery visit, and how this information may be managed during the visit. Three themes were identified: '... personal health', 'Reducing and generating risk', and 'Bridges and gaps'. Compared to reporting physical health information, more advanced levels of health literacy might be needed to self-assess mental health and personal needs. Self-reporting health can induce feelings of being normal but also increase...

  12. Some results of processing NURE geochemical sampling in the northern Rocky Mountain area

    International Nuclear Information System (INIS)

    Thayer, P.A.; Cook, J.R.; Price, V. Jr.

    1980-01-01

    The National Uranium Resource Evaluation (NURE) program was begun in the spring of 1973 to evaluate domestic uranium resources in the continental United States and to identify areas favorable for uranium exploration. The significance of the distribution of uranium in natural waters and sediments will be assessed as an indicator of favorable areas for the discovery of uranium deposits. This paper is oriented primarily to the discussion of stream sediments. Data for the Challis 1° x 2° NTMS quadrangle will be used for specific examples of NURE data processing. A high-capacity neutron activation analysis facility at SRL is used to determine uranium and about 19 other elements in hydrogeochemical samples. Evaluation of the areal distributions of uranium ratios demonstrates that most of the high U/Hf, U/Th and U/(Th + Hf) ratios occur scattered throughout the western two-thirds of the quadrangle. Most of the higher ratio values are found in samples taken at sites underlain by granitic rocks of the Idaho batholith or Tertiary-age plutons.

  13. Microbial profiling of cpn60 universal target sequences in artificial mixtures of vaginal bacteria sampled by nylon swabs or self-sampling devices under different storage conditions.

    Science.gov (United States)

    Schellenberg, John J; Oh, Angela Yena; Hill, Janet E

    2017-05-01

    The vaginal microbiome is increasingly characterized by deep sequencing of universal genes. However, there are relatively few studies of how different specimen collection and sample storage and processing methods influence these molecular profiles. Here, we evaluate molecular microbial community profiles of samples collected using the HerSwab™ self-sampling device, compared to nylon swabs and under different storage conditions. In order to minimize technical variation, mixtures of 11 common vaginal bacteria in simulated vaginal fluid medium were sampled and DNA extracts prepared for massively parallel sequencing of the cpn60 universal target (UT). Three artificial mixtures imitating commonly observed vaginal microbiome profiles were easily distinguished, and the proportion of sequence reads correlated with the estimated proportion of the organism added to the artificial mixtures. Our results indicate that cpn60 UT amplicon sequencing quantifies the proportional abundance of member organisms in these artificial communities regardless of swab type or storage conditions, although some significant differences were observed between samples that were stored frozen and thawed prior to DNA extraction, compared to extractions from samples stored at room temperature for up to 7 days. Our results indicate that an on-the-market device developed for infectious disease diagnostics may be appropriate for vaginal microbiome profiling, an approach that is increasingly facilitated by rapidly dropping deep sequencing costs. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Boat sampling

    International Nuclear Information System (INIS)

    Citanovic, M.; Bezlaj, H.

    1994-01-01

    This presentation describes essential boat sampling activities: on-site boat sampling process optimization and qualification; boat sampling of base material (beltline region); boat sampling of weld material (weld No. 4); and problems associated with weld crown variations, RPV shell inner radius tolerance, local corrosion pitting and water clarity. The equipment used for boat sampling is described too. 7 pictures

  15. High-throughput immunoturbidimetric assays for in-process determination of polyclonal antibody concentration and functionality in crude samples

    DEFF Research Database (Denmark)

    Bak, Hanne; Kyhse-Andersen, J.; Thomas, O.R.T.

    2007-01-01

    We present fast, simple immunoturbidimetric assays suitable for direct determination of antibody 'concentration' and 'functionality' in crude samples, such as in-process samples taken at various stages during antibody purification. Both assays display excellent linearity and analytical recovery. The assays require only basic laboratory equipment, are robust, fast, cheap, easy to perform, and readily adapted to automation.

  16. Research of Plasma Spraying Process on Aluminum-Magnesium Alloy

    Directory of Open Access Journals (Sweden)

    Patricija Kavaliauskaitė

    2016-04-01

    The article examines plasma-sprayed 95Ni-5Al coatings on an aluminum-magnesium (Mg ≈ 2.6‒3.6%) alloy substrate. Prior to spraying, the aluminum-magnesium samples were prepared by mechanical treatment (blasting with Al2O3). The 95Ni-5Al coatings were sprayed with different process parameters, and the coatings' thickness, porosity, microhardness and microstructure were evaluated. Numerical simulations of the electric and magnetic phenomena of plasma spraying were also carried out.

  17. The Effects of Prior Knowledge Activation on Study Time Allocation and Free Recall: Investigating the Discrepancy Reduction Model

    OpenAIRE

    Verkoeijen, Peter; Rikers, Remy; Schmidt, Henk

    2005-01-01

    textabstractIn this study, the authors examined the influence of prior knowledge activation on information processing by means of a prior knowledge activation procedure adopted from the read–generate paradigm. On the basis of cue-target pairs, participants in the experimental groups generated two different sets of items before studying a relevant list. Subsequently, participants were informed that they had to study the items in the list and that they should try to remember as many items as po...

  18. Rapid quantitation of atorvastatin in process pharmaceutical powder sample using Raman spectroscopy and evaluation of parameters related to accuracy of analysis.

    Science.gov (United States)

    Lim, Young-Il; Han, Janghee; Woo, Young-Ah; Kim, Jaejin; Kang, Myung Joo

    2018-07-05

    The purpose of this study was to determine the atorvastatin (ATV) content in in-process pharmaceutical powder samples using Raman spectroscopy. To establish the analysis method, the influence of the type of Raman measurement (back-scattering or transmission mode), preparation of calibration samples (simple admixing or granulation), sample pre-treatment (pelletization), and spectral pretreatment on the Raman spectra was investigated. The characteristic peak of the active compound was more distinctively detected in transmission Raman mode with a laser spot size of 4 mm than in the back-scattering method. Preparation of calibration samples by wet granulation, identical to the actual manufacturing process, provided unchanged spectral patterns for the in-process sample, with no changes and/or shifts in the spectrum. Pelletization before Raman analysis remarkably improved spectral reproducibility by decreasing the difference in density between the samples. Probabilistic quotient normalization led to accurate and consistent quantification of the ATV content in the calibration samples (standard error of cross validation: 1.21%). Moreover, the drug content in the granules obtained from five commercial batches was reliably quantified, with no statistical difference (p = 0.09) from that obtained by HPLC assay. From these findings, we suggest that transmission Raman analysis may be a fast and non-invasive method for the quantification of ATV in actual manufacturing processes. Copyright © 2018 Elsevier B.V. All rights reserved.

  19. SIPPI: A Matlab toolbox for sampling the solution to inverse problems with complex prior information

    DEFF Research Database (Denmark)

    Hansen, Thomas Mejer; Cordua, Knud Skou; Looms, Majken Caroline

    2013-01-01

    We present an application of the SIPPI Matlab toolbox, to obtain a sample from the a posteriori probability density function for the classical tomographic inversion problem. We consider a number of different forward models, linear and non-linear, such as ray based forward models that rely...

  20. Evaluation of various techniques for the pretreatment of sewage sludges prior to trace metal analysis by atomic absorption spectrophotometry

    International Nuclear Information System (INIS)

    Smith, R.

    1983-01-01

    Six techniques were evaluated for their suitability for the pretreatment of dried sewage sludge prior to trace metal analysis by atomic absorption spectrophotometry. The evaluation comprised analysis of two prepared samples of dried sludge for aluminium, cadmium, chromium, copper, iron, lead, manganese, nickel and zinc, after the following pretreatments: dry ashing at 500 degrees Celsius followed by extraction with dilute hydrochloric acid; dry ashing at 500 degrees Celsius followed by extraction with aqua regia; nitric acid digestion followed by extraction with hydrochloric acid; extraction with aqua regia; ashing with magnesium nitrate solution at 550 degrees Celsius followed by digestion with hydrochloric acid and extraction with nitric acid; and extraction with nitric acid. Procedures involving the use of perchloric acid, hydrofluoric acid and hydrogen peroxide were not considered for reasons of safety. Except in the case of aluminium, the direct mineral acid digestion and/or extraction methods generally gave higher recoveries than the procedures incorporating an ashing step. Direct extraction of the sample with aqua regia was recommended as a rapid and simple general method of sample pretreatment prior to analysis for all the metals investigated except aluminium. For this metal, more drastic sample pretreatment will be required, for example fusion or hydrofluoric acid digestion.

  1. Low-sampling-rate ultra-wideband channel estimation using equivalent-time sampling

    KAUST Repository

    Ballal, Tarig

    2014-09-01

    In this paper, a low-sampling-rate scheme for ultra-wideband channel estimation is proposed. The scheme exploits multiple observations generated by transmitting multiple pulses. In the proposed scheme, P pulses are transmitted to produce channel impulse response estimates at a desired sampling rate, while the ADC samples at a rate that is P times slower. To avoid loss of fidelity, the number of sampling periods (based on the desired rate) in the inter-pulse interval is restricted to be co-prime with P. This condition is affected when clock drift is present and the transmitted pulse locations change. To handle this case, and to achieve an overall good channel estimation performance, without using prior information, we derive an improved estimator based on the bounded data uncertainty (BDU) model. It is shown that this estimator is related to the Bayesian linear minimum mean squared error (LMMSE) estimator. Channel estimation performance of the proposed sub-sampling scheme combined with the new estimator is assessed in simulation. The results show that high reduction in sampling rate can be achieved. The proposed estimator outperforms the least squares estimator in almost all cases, while in the high SNR regime it also outperforms the LMMSE estimator. In addition to channel estimation, a synchronization method is also proposed that utilizes the same pulse sequence used for channel estimation. © 2014 IEEE.
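
    The co-primality condition is easy to verify numerically. In the Python sketch below, pulse k contributes samples at phase offsets (k * N_ipi) mod P relative to the desired-rate grid, where N_ipi is the inter-pulse interval in desired-rate sampling periods (the values here are illustrative); all P phases are covered exactly when gcd(N_ipi, P) = 1.

        from math import gcd

        P, N_ipi = 8, 5          # illustrative values: P pulses, P-times-slower ADC
        # Phase offset of the ADC grid relative to each pulse.
        phases = sorted((k * N_ipi) % P for k in range(P))
        # With gcd(N_ipi, P) == 1, every residue 0..P-1 occurs exactly once,
        # so the P slow acquisitions interleave into one full-rate estimate.
        assert gcd(N_ipi, P) == 1 and phases == list(range(P))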

  2. Professional judgment in the Data Quality Objectives process: A Bayesian approach to screening assessment

    International Nuclear Information System (INIS)

    Black, P.K.; Neptune, M.D.; Ryti, R.T.; Hickmott, D.D.

    1994-01-01

    The Data Quality Objectives (DQO) process provides a logical planning structure for specifying the optimal sample allocation for defensible decision making, depending on acceptable levels of decision uncertainty and anticipated sampling and measurement errors. These planning inputs must be established prior to designing the data collection activity. Application of the DQO process has traditionally been performed under the framework of Classical statistical theory; elicited decision errors have been interpreted as Classical Type I and Type II errors; mean and variance constraints have been incorporated based on historical information; and, Classical statistical testing methods have been used to determine optimal sample sizes. However, decision errors are usually stated, for dichotomous hypotheses, in terms of the probability of making a false positive or false negative decision; these probabilities, at best, relate loosely to probabilities of Classical Type I and Type II errors. Statements of Classical error types are couched in the language of the probability of rejection of hypotheses as opposed to the probability that a hypothesis is correct. Also, historical or archival data are often insufficient to adequately support prior judgments of means and variances. In many circumstances, however, expert knowledge and opinion is not only available, but is substantial. Finally, a paradigm that provides solutions for other than dichotomous decision problems offers greater diversity for solving real world problems

  3. HIV Risk Behaviors in the U.S. Transgender Population: Prevalence and Predictors in a Large Internet Sample

    Science.gov (United States)

    Feldman, Jamie; Romine, Rebecca Swinburne; Bockting, Walter O.

    2014-01-01

    To study the influence of gender on HIV risk, a sample of the U.S. transgender population (N = 1,229) was recruited via the Internet. HIV risk and prevalence were lower than reported in prior studies of localized, urban samples, but higher than the overall U.S. population. Findings suggest that gender nonconformity alone does not itself result in markedly higher HIV risk. Sex with nontransgender men emerged as the strongest independent predictor of unsafe sex for both male-to-female (MtF) and female-to-male (FtM) participants. These sexual relationships constitute a process that may either affirm or problematize gender identity and sexual orientation, with different emphases for MtFs and FtMs, respectively. PMID:25022491

  4. Use of respondent driven sampling (RDS) generates a very diverse sample of men who have sex with men (MSM) in Buenos Aires, Argentina.

    Directory of Open Access Journals (Sweden)

    Alex Carballo-Diéguez

    Prior research focusing on men who have sex with men (MSM) conducted in Buenos Aires, Argentina, used convenience samples that included mainly gay-identified men. To increase MSM sample representativeness, we used Respondent Driven Sampling (RDS) for the first time in Argentina. Using RDS, under certain specified conditions, the observed estimates for the percentage of the population with a specific trait are asymptotically unbiased. We describe the diversity of the recruited sample from the point of view of sexual orientation, and contrast the different subgroups in terms of their HIV sexual risk behavior. 500 MSM were recruited using RDS. Behavioral data were collected through face-to-face interviews and Web-based CASI. In contrast with prior studies, RDS generated a very diverse sample of MSM from a sexual identity perspective. Only 24.5% of participants identified as gay; 36.2% identified as bisexual, 21.9% as heterosexual, and 17.4% were grouped as "other." Gay and non-gay identified MSM differed significantly in their sexual behavior, the former having higher numbers of partners, more frequent sexual contacts and less frequent condom use. One third of the men (gay, 3%; bisexual, 34%; heterosexual, 51%; other, 49%) reported having had sex with men, women and transvestites in the two months prior to the interview. This population requires further study and, potentially, HIV prevention strategies tailored to such diversity of partnerships. Our results highlight the potential effectiveness of using RDS to reach non-gay identified MSM. They also present lessons learned in the implementation of RDS to recruit MSM, concerning both the importance and limitations of formative work, the need to tailor incentives to circumstances of the less affluent potential participants, the need to prevent masking, and the challenge of assessing network size.

  5. Bayesian Image Restoration Using a Large-Scale Total Patch Variation Prior

    Directory of Open Access Journals (Sweden)

    Yang Chen

    2011-01-01

    Full Text Available Edge-preserving Bayesian restorations using nonquadratic priors are often inefficient in restoring continuous variations and tend to produce block artifacts around edges in ill-posed inverse image restorations. To overcome this, we previously proposed a spatially adaptive (SA) prior with improved performance. However, restoration with this SA prior suffers from high computational cost and a lack of guaranteed convergence. To address these issues, this paper proposes a Large-scale Total Patch Variation (LS-TPV) prior model for Bayesian image restoration. In this model, the prior for each pixel is defined as a singleton conditional probability, in the form of a mixture of a patch similarity prior and a weight entropy prior. A joint MAP estimation is thus built to ensure the monotonicity of the iteration. The intensive calculation of patch distances is greatly alleviated by parallelization with the Compute Unified Device Architecture (CUDA). Experiments with both simulated and real data validate the good performance of the proposed restoration.
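
    The computational bottleneck the authors move to CUDA is the patch-distance calculation. The sketch below shows a generic nonlocal patch-similarity weight of the kind such priors are built from; the Gaussian kernel is an assumption here, not the paper's exact LS-TPV formulation:

      import numpy as np

      # Generic nonlocal patch-similarity weights; a Gaussian kernel is
      # assumed, and this is not the paper's exact LS-TPV formulation.
      # Assumes (y, x) lies at least search//2 + patch//2 pixels from the
      # image border.
      def patch_weights(image, y, x, patch=3, search=7, h=0.1):
          r, s = patch // 2, search // 2
          ref = image[y - r:y + r + 1, x - r:x + r + 1]  # reference patch
          weights = {}
          for dy in range(-s, s + 1):
              for dx in range(-s, s + 1):
                  ny, nx = y + dy, x + dx
                  cand = image[ny - r:ny + r + 1, nx - r:nx + r + 1]
                  d2 = np.mean((ref - cand) ** 2)           # patch distance
                  weights[(ny, nx)] = np.exp(-d2 / h ** 2)  # similarity weight
          return weights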

  6. Neutrino mass priors for cosmology from random matrices

    Science.gov (United States)

    Long, Andrew J.; Raveri, Marco; Hu, Wayne; Dodelson, Scott

    2018-02-01

    Cosmological measurements of structure are placing increasingly strong constraints on the sum of the neutrino masses, Σ mν, through Bayesian inference. Because these constraints depend on the choice for the prior probability π (Σ mν), we argue that this prior should be motivated by fundamental physical principles rather than the ad hoc choices that are common in the literature. The first step in this direction is to specify the prior directly at the level of the neutrino mass matrix Mν, since this is the parameter appearing in the Lagrangian of the particle physics theory. Thus by specifying a probability distribution over Mν, and by including the known squared mass splittings, we predict a theoretical probability distribution over Σ mν that we interpret as a Bayesian prior probability π (Σ mν). Assuming a basis-invariant probability distribution on Mν, also known as the anarchy hypothesis, we find that π (Σ mν) peaks close to the smallest Σ mν allowed by the measured mass splittings, roughly 0.06 eV (0.1 eV) for normal (inverted) ordering, due to the phenomenon of eigenvalue repulsion in random matrices. We consider three models for neutrino mass generation: Dirac, Majorana, and Majorana via the seesaw mechanism; differences in the predicted priors π (Σ mν) allow for the possibility of having indications about the physical origin of neutrino masses once sufficient experimental sensitivity is achieved. We present fitting functions for π (Σ mν), which provide a simple means for applying these priors to cosmological constraints on the neutrino masses or marginalizing over their impact on other cosmological parameters.
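
    The construction can be mimicked numerically: draw mass matrices from a basis-invariant Gaussian ensemble, extract the physical masses, and histogram their sum. A minimal sketch for the Dirac case, where the masses are the singular values of an arbitrary complex matrix; the mass scale is illustrative and the conditioning on the measured splittings is omitted:

      import numpy as np

      # Anarchy-hypothesis sketch: basis-invariant Gaussian ensemble for the
      # 3x3 complex mass matrix; physical masses are its singular values.
      rng = np.random.default_rng(0)

      def sample_sum_mnu(n_draws=20000, scale_ev=0.03):  # scale is illustrative
          sums = np.empty(n_draws)
          for i in range(n_draws):
              m = scale_ev * (rng.normal(size=(3, 3))
                              + 1j * rng.normal(size=(3, 3)))
              sums[i] = np.linalg.svd(m, compute_uv=False).sum()
          return sums

      # Eigenvalue repulsion in the ensemble disfavors degenerate masses,
      # which is what pushes the conditioned prior toward the smallest sum
      # allowed by the measured splittings.
      hist, edges = np.histogram(sample_sum_mnu(), bins=60, density=True)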

  7. Estimating security betas using prior information based on firm fundamentals

    NARCIS (Netherlands)

    Cosemans, M.; Frehen, R.; Schotman, P.C.; Bauer, R.

    2010-01-01

    This paper proposes a novel approach for estimating time-varying betas of individual stocks that incorporates prior information based on fundamentals. We shrink the rolling window estimate of beta towards a firm-specific prior that is motivated by asset pricing theory. The prior captures structural
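
    The abstract is truncated, but the shrinkage step it describes has a standard precision-weighted form, assumed in the sketch below; the paper's exact estimator may differ:

      # Precision-weighted shrinkage of a rolling-window beta toward a
      # firm-specific prior beta; the weighting rule is a standard Bayesian
      # choice assumed here, not necessarily the paper's estimator.
      def shrunk_beta(beta_rolling, var_rolling, beta_prior, var_prior):
          w = (1.0 / var_rolling) / (1.0 / var_rolling + 1.0 / var_prior)
          return w * beta_rolling + (1.0 - w) * beta_prior

      # A noisy estimate of 1.4 (s.e. 0.4) shrunk toward a fundamentals-based
      # prior of 1.0 (s.e. 0.2) lands much closer to the prior: ~1.08
      print(shrunk_beta(1.4, 0.4 ** 2, 1.0, 0.2 ** 2))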

  8. Solar Ion Processing of Itokawa Grains: Reconciling Model Predictions with Sample Observations

    Science.gov (United States)

    Christoffersen, Roy; Keller, L. P.

    2014-01-01

    Analytical TEM observations of Itokawa grains reported to date show complex solar wind ion processing effects in the outer 30-100 nm of pyroxene and olivine grains. The effects include loss of long-range structural order, formation of isolated internal cavities or "bubbles", and other nanoscale compositional/microstructural variations. None of the effects so far described have, however, included complete ion-induced amorphization. To link the array of observed relationships to grain surface exposure times, we have adapted our previous numerical model for progressive solar ion processing effects in lunar regolith grains to the Itokawa samples. The model uses SRIM ion collision damage and implantation calculations within the framework of a constant-deposited-energy model for amorphization. Inputs include experimentally measured amorphization fluences, a π-steradian variable ion incidence geometry required for a rotating asteroid, and a numerical flux-versus-velocity solar wind spectrum.
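
    A constant-deposited-energy amorphization criterion of the kind the model uses can be sketched as follows; the energy-deposition profile, flux, and threshold are placeholders rather than SRIM output:

      import numpy as np

      # A depth bin counts as amorphized once the cumulative energy density
      # deposited by solar-wind ions exceeds a critical threshold. All numbers
      # below are placeholders, not SRIM calculations.
      depth_nm = np.arange(0.0, 100.0, 1.0)
      e_per_ion = 0.5 * np.exp(-depth_nm / 30.0)  # eV/nm per ion (placeholder)
      flux = 3.0e8                                # ions/cm^2/s (placeholder)
      e_crit = 1.0e24                             # eV/cm^3 (placeholder)

      def amorphized_depth_nm(exposure_s):
          """Deepest bin whose cumulative deposited energy exceeds e_crit."""
          fluence = flux * exposure_s               # ions/cm^2
          e_density = e_per_ion / 1.0e-7 * fluence  # eV/cm^3 (1 nm = 1e-7 cm)
          hit = depth_nm[e_density >= e_crit]
          return hit.max() if hit.size else 0.0

      # e.g. position of the amorphization front after ~100 years of exposure
      print(amorphized_depth_nm(100 * 3.15e7))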

  9. 34 CFR 303.403 - Prior notice; native language.

    Science.gov (United States)

    2010-07-01

    ... 34 Education 2 2010-07-01 2010-07-01 false Prior notice; native language. 303.403 Section 303.403... TODDLERS WITH DISABILITIES Procedural Safeguards General § 303.403 Prior notice; native language. (a... file a complaint and the timelines under those procedures. (c) Native language. (1) The notice must be...

  10. Honest Importance Sampling with Multiple Markov Chains.

    Science.gov (United States)

    Tan, Aixin; Doss, Hani; Hobert, James P

    2015-01-01

    Importance sampling is a classical Monte Carlo technique in which a random sample from one probability density, π₁, is used to estimate an expectation with respect to another, π. The importance sampling estimator is strongly consistent and, as long as two simple moment conditions are satisfied, it obeys a central limit theorem (CLT). Moreover, there is a simple consistent estimator for the asymptotic variance in the CLT, which makes for routine computation of standard errors. Importance sampling can also be used in the Markov chain Monte Carlo (MCMC) context. Indeed, if the random sample from π₁ is replaced by a Harris ergodic Markov chain with invariant density π₁, then the resulting estimator remains strongly consistent. There is a price to be paid, however, as the computation of standard errors becomes more complicated. First, the two simple moment conditions that guarantee a CLT in the iid case are not enough in the MCMC context. Second, even when a CLT does hold, the asymptotic variance has a complex form and is difficult to estimate consistently. In this paper, we explain how to use regenerative simulation to overcome these problems. Actually, we consider a more general set up, where we assume that Markov chain samples from several probability densities, π₁, …, πₖ, are available. We construct multiple-chain importance sampling estimators for which we obtain a CLT based on regeneration. We show that if the Markov chains converge to their respective target distributions at a geometric rate, then under moment conditions similar to those required in the iid case, the MCMC-based importance sampling estimator obeys a CLT. Furthermore, because the CLT is based on a regenerative process, there is a simple consistent estimator of the asymptotic variance. We illustrate the method with two applications in Bayesian sensitivity analysis. The first concerns one-way random effects models under different priors. The second involves Bayesian variable
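
    The iid case the abstract starts from is easy to make concrete: draw from π₁, reweight by π/π₁, and attach the routine plug-in standard error that the CLT licenses. A minimal sketch with illustrative densities (estimating E[X] under N(2,1) from N(0,1) draws):

      import numpy as np

      # iid importance sampling with a plug-in standard error; the densities
      # here are illustrative choices, not from the paper.
      rng = np.random.default_rng(1)

      def norm_pdf(x, mu):
          return np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2.0 * np.pi)

      x = rng.normal(0.0, 1.0, size=100000)           # sample from pi1 = N(0,1)
      wx = (norm_pdf(x, 2.0) / norm_pdf(x, 0.0)) * x  # weighted integrand
      est = wx.mean()                                 # estimates E_pi[X] = 2
      se = wx.std(ddof=1) / np.sqrt(x.size)           # consistent std. error
      print(est, se)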

  11. Acoustic Sample Deposition MALDI-MS (ASD-MALDI-MS): A Novel Process Flow for Quality Control Screening of Compound Libraries.

    Science.gov (United States)

    Chin, Jefferson; Wood, Elizabeth; Peters, Grace S; Drexler, Dieter M

    2016-02-01

    In the early stages of drug discovery, high-throughput screening (HTS) of compound libraries against pharmaceutical targets is a common method to identify potential lead molecules. For these HTS campaigns to be efficient and successful, continuous quality control of the compound collection is necessary and crucial. However, the large number of compound samples and the limited sample amount pose unique challenges. Presented here is a proof-of-concept study for a novel process flow for the quality control screening of small-molecule compound libraries that consumes only minimal amounts of samples and affords compound-specific molecular data. This process employs an acoustic sample deposition (ASD) technique for the offline sample preparation by depositing nanoliter volumes in an array format onto microscope glass slides followed by matrix-assisted laser desorption/ionization mass spectrometric (MALDI-MS) analysis. An initial study of a 384-compound array employing the ASD-MALDI-MS workflow resulted in a 75% first-pass positive identification rate with an analysis time of <1 s per sample. © 2015 Society for Laboratory Automation and Screening.

  12. Precise simultaneous determination of zirconium and hafnium in silicate rocks, meteorites and lunar samples. [Neutron reactions

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, P A; Garg, A N; Ehmann, W D [Kentucky Univ., Lexington (USA). Dept. of Chemistry

    1977-01-01

    A precise, sensitive and rapid analytical technique has been developed for the simultaneous determination of Zr and Hf in natural silicate matrices. The technique is based on radiochemical neutron activation analysis and employs a rapid fusion dissolution of the sample and simultaneous precipitation of the Zr-Hf pair with p-hydroxybenzene arsenic acid in an acidic medium. The indicator radionuclides, ⁹⁵Zr and ¹⁸¹Hf, are counted and the ⁹⁵Zr activity is corrected for the contribution from U fission. The chemical yields of the radiochemical separation are based on Hf carrier. The yield is determined by reactivation of the processed samples and standards with a ²⁵²Cf isotopic neutron source and by counting the 18.6-s half-life ¹⁷⁹ᵐHf. The RNAA procedure for Zr and Hf has been shown to be precise and accurate for natural silicate samples, based on replicate analyses of samples containing Zr in the range of 1 μg/g to over 600 μg/g. The procedure is relatively rapid, with a total chemical processing time of approximately 3 hours. At least 4 samples are processed simultaneously. Ten additional elements (Fe, Cr, Co, Sc, Eu, La, Lu, Ce, Th and Tb) can be determined by direct Ge(Li) spectrometry (INAA) on the samples prior to dissolution for the RNAA determination of Zr and Hf. Corrections for the U fission contribution can be made on the basis of the known U content or from the INAA Th content, based on the relatively constant natural Th/U ratio.
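
    The fission correction amounts to simple arithmetic once a calibration for fission-produced ⁹⁵Zr is in hand. A sketch, with a placeholder calibration constant and the natural Th/U ratio treated as an assumption:

      # Subtract the 95Zr activity attributable to U fission, with U either
      # measured directly or inferred from the INAA Th content via an assumed
      # natural Th/U ratio. The calibration constant (fission 95Zr counts per
      # ppm U) is a placeholder, not a measured value.
      def corrected_zr95_activity(measured_counts, u_ppm=None, th_ppm=None,
                                  th_u_ratio=3.8, fission_counts_per_ppm_u=2.0):
          if u_ppm is None:
              u_ppm = th_ppm / th_u_ratio  # infer U from Th if U is unknown
          return measured_counts - fission_counts_per_ppm_u * u_ppm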

  13. Delayed matching-to-sample: A tool to assess memory and other cognitive processes in pigeons.

    Science.gov (United States)

    Zentall, Thomas R; Smith, Aaron P

    2016-02-01

    Delayed matching-to-sample is a versatile task that has been used to assess the nature of animal memory. Although once thought to be a relatively passive process, matching research has demonstrated considerable flexibility in how animals actively represent events in memory. But delayed matching can also demonstrate how animals fail to maintain representations in memory when they are cued that they will not be tested (directed forgetting) and how the outcome expected can serve as a choice cue. When pigeons have shown divergent retention functions following training without a delay, it has been taken as evidence of the use of a single-code/default coding strategy but in many cases an alternative account may be involved. Delayed matching has also been used to investigate equivalence learning (how animals represent stimuli when they learn that the same comparison response is correct following the presentation of two different samples) and to test for metamemory (the ability of pigeons to indicate that they understand what they know) by allowing animals to decline to be tested when they are uncertain that they remember a stimulus. How animals assess the passage of time has also been studied using the matching task. And there is evidence that when memory for the sample is impaired by a delay, rather than use the probability of being correct for choice of each of the comparison stimuli, pigeons tend to choose based on the overall sample frequency (base-rate neglect). Finally, matching has been used to identify natural color categories as well as dimensional categories in pigeons. Overall, matching to sample has provided an excellent methodology for assessing an assortment of cognitive processes in animals. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Including Below Detection Limit Samples in Post Decommissioning Soil Sample Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jung Hwan; Yim, Man Sung [KAIST, Daejeon (Korea, Republic of)

    2016-05-15

    To meet the required standards, the site owner has to show that the soil at the facility has been sufficiently cleaned up. To do this, one must know the contamination of the soil at the site prior to cleanup. This involves sampling that soil to identify the degree of contamination. However, there is a technical difficulty in determining how much decontamination should be done. The problem arises when measured samples are below the detection limit. Regulatory guidelines for site reuse after decommissioning are commonly challenged because the majority of the activity in the soil is at or below the limit of detection. Using additional statistical analyses of contaminated soil after decommissioning is expected to have the following advantages: a better and more reliable probabilistic exposure assessment, better economics (lower project costs) and improved communication with the public. This research will develop an approach that defines an acceptable method for demonstrating compliance of decommissioned NPP sites and validates that compliance. Soil samples from NPPs often contain censored data. Conventional methods for dealing with censored data sets are statistically biased and limited in their usefulness.
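
    One statistically defensible alternative to substituting values for non-detects is maximum likelihood that treats them as left-censored observations. A minimal sketch for left-censored lognormal data, with illustrative numbers:

      import numpy as np
      from scipy import stats, optimize

      # Left-censored lognormal MLE: detected values enter through the
      # density, non-detects through the CDF at the detection limit.
      # Data and detection limit are illustrative, not from the study.
      detects = np.array([0.8, 1.5, 2.2, 0.6, 3.1])  # measured activities
      n_censored = 7                                 # samples below the limit
      dl = 0.5                                       # detection limit

      def neg_log_lik(params):
          mu, log_sigma = params
          sigma = np.exp(log_sigma)  # keeps sigma positive during the search
          ll = stats.norm.logpdf(np.log(detects), mu, sigma).sum()
          ll += n_censored * stats.norm.logcdf(np.log(dl), mu, sigma)
          return -ll

      res = optimize.minimize(neg_log_lik, x0=[0.0, 0.0])
      mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])  # log-scale parameters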

  15. The Effect of Prior Online Purchase Experience on Trust and Online Repurchase Intention (A Survey of Zalora Indonesia Customers via the Website www.zalora.co.id)

    OpenAIRE

    Parastanti, Gadis Paramita

    2014-01-01

    This study aims to determine the effect of Prior Online Purchase Experience on Trust and Online Repurchase Intention. The exogenous variable used in this study was Prior Online Purchase Experience, the intervening variable was Trust, and the endogenous variable was Online Repurchase Intention. The type of research used is explanatory research with a quantitative approach. The study was conducted with customers of the ZALORA Indonesia website (www.zalora.co.id). Sampl...

  16. DEFENSE WASTE PROCESSING FACILITY ANALYTICAL METHOD VERIFICATION FOR THE SLUDGE BATCH 5 QUALIFICATION SAMPLE

    International Nuclear Information System (INIS)

    Click, D.; Edwards, T.; Ajo, H.

    2008-01-01

    For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) performs confirmation of the applicability of the digestion method to be used by the DWPF lab for elemental analysis of Sludge Receipt and Adjustment Tank (SRAT) receipt samples and SRAT product process control samples. DWPF SRAT samples are typically dissolved using a room temperature HF-HNO3 acid dissolution (i.e., DWPF Cold Chem Method, see Procedure SW4-15.201) and then analyzed by inductively coupled plasma - atomic emission spectroscopy (ICP-AES). This report contains the results and comparison of data generated from performing the Aqua Regia (AR), Sodium Peroxide/Hydroxide Fusion (PF) and DWPF Cold Chem (CC) method digestion of Sludge Batch 5 (SB5) SRAT Receipt and SB5 SRAT Product samples. The SB5 SRAT Receipt and SB5 SRAT Product samples were prepared in the SRNL Shielded Cells, and the SRAT Receipt material is representative of the sludge that constitutes the SB5 Batch composition. This is the sludge in Tank 51 that is to be transferred into Tank 40, which will contain the heel of Sludge Batch 4 (SB4), to form the SB5 Blend composition. The results for any one particular element should not be used in any way to identify the form or speciation of a particular element in the sludge or used to estimate ratios of compounds in the sludge. A statistical comparison of the data validates the use of the DWPF CC method for SB5 Batch composition. However, the difficulty that was encountered in using the CC method for SB4 brings into question the adequacy of CC for the SB5 Blend. Also, it should be noted that visible solids remained in the final diluted solutions of all samples digested by this method at SRNL (8 samples total), which is typical for the DWPF CC method but not seen in the other methods. Recommendations to the DWPF for application to SB5 based on studies to date: (1) A dissolution study should be performed on the WAPS

  17. Electrical discharge machining for vessel sample removal

    International Nuclear Information System (INIS)

    Litka, T.J.

    1993-01-01

    Due to aging-related problems or the need for essential metallurgical information (for plant-life extension or decommissioning) at nuclear plants, sample removal from vessels may be required as part of an examination. Vessel or cladding samples with cracks may be removed to determine the cause of cracking. Vessel weld samples may be removed to determine the weld metallurgy. In all cases, an engineering analysis must be done prior to sample removal to determine the vessel's integrity upon sample removal. Electrical discharge machining (EDM) is being used for in-vessel sampling of nuclear power plant vessels. Machining operations in reactor coolant system (RCS) components must be accomplished while collecting machining chips that could cause damage if they become part of the flow stream. The debris from EDM is a fine, talclike particulate (no chips), which can be collected by flushing and filtration.

  18. Sampling and Characterization of 618-2 Anomalous Material

    International Nuclear Information System (INIS)

    Zacharias, A.E.

    2006-01-01

    This as low as reasonably achievable (ALARA) Level II review documents the radiological engineering and administrative controls necessary for the sampling and characterization of anomalous materials discovered during the remediation of the 618-2 solid waste burial ground. The goals of these engineering and administrative controls are to keep personnel exposure ALARA, control contamination levels, and minimize the potential for airborne contamination. Excavation of the 618-2 Burial Ground has produced many items of anomalous waste. Prior to temporary packaging and/or storage, these items have been characterized in the field to identify radiological and industrial safety conditions. Further sampling and characterization of these items, as well as those remaining from an excavated combination safe, is the subject of this ALARA Level II review. An ALARA in-progress review will also be performed prior to sampling and characterization of 618-2 anomalous materials that pose risks of a differing nature. General categories of anomalies requiring further characterization include the following: (1) containers of unknown liquids and/or solids and powders (excluding transuranics); (2) drums containing unknown liquids and/or solids; (3) metal containers with unknown contents; and (4) known or suspected transuranic material.

  19. Prior elicitation and Bayesian analysis of the Steroids for Corneal Ulcers Trial.

    Science.gov (United States)

    See, Craig W; Srinivasan, Muthiah; Saravanan, Somu; Oldenburg, Catherine E; Esterberg, Elizabeth J; Ray, Kathryn J; Glaser, Tanya S; Tu, Elmer Y; Zegans, Michael E; McLeod, Stephen D; Acharya, Nisha R; Lietman, Thomas M

    2012-12-01

    To elicit expert opinion on the use of adjunctive corticosteroid therapy in bacterial corneal ulcers. To perform a Bayesian analysis of the Steroids for Corneal Ulcers Trial (SCUT), using expert opinion as a prior probability. The SCUT was a placebo-controlled trial assessing visual outcomes in patients receiving topical corticosteroids or placebo as adjunctive therapy for bacterial keratitis. Questionnaires were conducted at scientific meetings in India and North America to gauge expert consensus on the perceived benefit of corticosteroids as adjunct treatment. Bayesian analysis, using the questionnaire data as a prior probability and the primary outcome of SCUT as a likelihood, was performed. For comparison, an additional Bayesian analysis was performed using the results of the SCUT pilot study as a prior distribution. Indian respondents believed there to be a 1.21 Snellen line improvement, and North American respondents believed there to be a 1.24 line improvement, with corticosteroid therapy. The SCUT primary outcome found a non-significant 0.09 Snellen line benefit with corticosteroid treatment. The Bayesian analysis estimated a slightly greater benefit than did the SCUT primary analysis (0.19 lines versus 0.09 lines). Indian and North American experts had similar expectations about the effectiveness of corticosteroids in bacterial corneal ulcers: that corticosteroids would markedly improve visual outcomes. Bayesian analysis produced results very similar to those of the SCUT primary analysis. The similarity is likely due to the large sample size of SCUT and helps validate the SCUT results.
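
    The analysis pattern described here, an elicited prior combined with a trial likelihood, reduces in the normal-normal conjugate case to precision weighting. A minimal sketch using the point estimates quoted above with illustrative standard errors:

      # Normal-normal conjugate update by precision weighting. The point
      # estimates echo those quoted in the abstract; the standard errors are
      # illustrative assumptions, not the study's values.
      def posterior_normal(prior_mean, prior_se, data_mean, data_se):
          w_prior, w_data = 1.0 / prior_se ** 2, 1.0 / data_se ** 2
          post_mean = (w_prior * prior_mean
                       + w_data * data_mean) / (w_prior + w_data)
          post_se = (w_prior + w_data) ** -0.5
          return post_mean, post_se

      # A diffuse expert prior near 1.2 lines pulls the posterior only
      # modestly above the large trial's 0.09-line estimate, mirroring the
      # reported 0.19-versus-0.09 similarity.
      print(posterior_normal(1.2, 1.0, 0.09, 0.35))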

  20. Boat sampling and inservice inspections of the reactor pressure vessel weld No. 4 at Kozloduy NPP, Unit 1

    International Nuclear Information System (INIS)

    Cvitanovic, M.; Oreb, E.; Mudronja, V.; Zado, V.; Bezlaj, H.; Petkov, M.; Gledatchev, J.; Radomirski, S.; Ribarska, T.; Kroes, B.

    1999-01-01

    The paper deals with reactor pressure vessel (RPV) boat sampling performed at Kozloduy Nuclear Power Plant, Unit 1, from August to November 1996. Kozloduy NPP, Unit 1 has no reactor vessel material surveillance program, so changes in material fracture toughness resulting from fast neutron irradiation cannot be monitored without removal of vessel material. Therefore, the main objective of the project was to cut samples from the RPV wall in order to obtain samples of the RPV material for further structural analyses. The most critical area, i.e., weld No. 4, was determined as the location for boat sampling. A replication technique was applied to obtain a precise determination of the weld geometry, necessary for positioning the cutting tool prior to boat sampling, and to determine the divot depth left after boat sampling and grinding of the sample sites. Boat sampling was performed by electrical discharge machining (EDM). Grinding of the sample sites was implemented to minimize stress-concentration effects, to eliminate surface irregularities resulting from the EDM process, and to eliminate the recast layer on the surface of the EDM cut. Ultrasonic, liquid penetrant, magnetic particle, and visual examinations were performed after grinding to establish baseline data in the boat sampling area. Project preparation activities, apart from the EDM process, and the site organization lead were entrusted to INETEC. The activities were funded by the PHARE program of the European Commission. (orig.)