WorldWideScience

Sample records for samples obtained prior

  1. Perilymph sampling from the cochlear apex: a reliable method to obtain higher purity perilymph samples from scala tympani.

    Science.gov (United States)

Salt, Alec N; Hale, Shane A; Plontke, Stefan K R

    2006-05-15

Measurements of drug levels in the fluids of the inner ear are required to establish kinetic parameters and to determine the influence of specific local delivery protocols. For most substances, this requires cochlear fluid samples to be obtained for analysis. When auditory function is of primary interest, the drug level in the perilymph of scala tympani (ST) is most relevant, since drug in this scala has ready access to the auditory sensory cells. In many prior studies, ST perilymph samples have been obtained from the basal turn, either by aspiration through the round window membrane (RWM) or through an opening in the bony wall. A number of studies have demonstrated that such samples are likely to be contaminated with cerebrospinal fluid (CSF). CSF enters the basal turn of ST through the cochlear aqueduct when the bony capsule is perforated or when fluid is aspirated. The degree of sample contamination has, however, not been widely appreciated. Recent studies have shown that perilymph samples taken through the round window membrane are highly contaminated with CSF, with samples greater than 2 µL in volume containing more CSF than perilymph. In spite of this knowledge, many groups continue to sample from the base of the cochlea, as it is a well-established method. We have developed an alternative, technically simple method to increase the proportion of ST perilymph in a fluid sample. The sample is taken from the apex of the cochlea, a site that is distant from the cochlear aqueduct. A previous problem with sampling through a perforation in the bone was that the native perilymph rapidly leaked out, driven by CSF pressure, and was lost to the middle ear space. We therefore developed a procedure to collect all the fluid that emerged from the apex after perforation. We evaluated the method using the marker ion trimethylphenylammonium (TMPA). TMPA was applied to the perilymph of guinea pigs either by RW irrigation or by microinjection into the apical turn. The

  2. Reference Priors For Non-Normal Two-Sample Problems

    NARCIS (Netherlands)

    Fernández, C.; Steel, M.F.J.

    1997-01-01

The reference prior algorithm (Berger and Bernardo, 1992) is applied to location-scale models with any regular sampling density. A number of two-sample problems are analyzed in this general context, extending the difference, ratio and product of Normal means problems outside Normality, while explicitly

  3. Generalized species sampling priors with latent Beta reinforcements

    Science.gov (United States)

    Airoldi, Edoardo M.; Costa, Thiago; Bassetti, Federico; Leisen, Fabrizio; Guindani, Michele

    2014-01-01

Many popular Bayesian nonparametric priors can be characterized in terms of exchangeable species sampling sequences. However, in some applications, exchangeability may not be appropriate. We introduce a novel and probabilistically coherent family of non-exchangeable species sampling sequences characterized by a tractable predictive probability function with weights driven by a sequence of independent Beta random variables. We compare their theoretical clustering properties with those of the Dirichlet process and the two-parameter Poisson-Dirichlet process. The proposed construction provides a complete characterization of the joint process, differently from existing work. We then propose the use of such a process as a prior distribution in a hierarchical Bayes modeling framework, and we describe a Markov chain Monte Carlo sampler for posterior inference. We evaluate the performance of the prior and the robustness of the resulting inference in a simulation study, providing a comparison with popular Dirichlet process mixtures and hidden Markov models. Finally, we develop an application to the detection of chromosomal aberrations in breast cancer by leveraging array CGH data. PMID:25870462
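The Beta-driven predictive rule can be illustrated with a toy simulation. The sketch below (the parameters `a` and `b` and the size-proportional allocation are illustrative assumptions, not the authors' exact construction) opens a new species at each step with a probability drawn from an independent Beta variable and otherwise joins an existing cluster with probability proportional to its size:

```python
import random

def species_sampling_sequence(n, a=1.0, b=1.0, seed=0):
    """Simulate a species sampling sequence in which the probability of
    opening a new species at each step is driven by an independent
    Beta(a, b) random variable (illustrative sketch only)."""
    rng = random.Random(seed)
    labels = []          # species label of each observation
    sizes = {}           # species label -> cluster size
    for _ in range(n):
        w = rng.betavariate(a, b)   # independent Beta reinforcement
        if not sizes or rng.random() < w:
            k = len(sizes)          # open a new species
        else:                       # join an existing species, prop. to size
            r = rng.random() * sum(sizes.values())
            acc = 0.0
            for k, c in sizes.items():
                acc += c
                if r <= acc:
                    break
        labels.append(k)
        sizes[k] = sizes.get(k, 0) + 1
    return labels
```

Because the Beta weights are drawn afresh at every step rather than updated from past draws, the resulting sequence is not exchangeable, which is the qualitative point of the construction.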

  4. Inverse problems with non-trivial priors: efficient solution through sequential Gibbs sampling

    DEFF Research Database (Denmark)

    Hansen, Thomas Mejer; Cordua, Knud Skou; Mosegaard, Klaus

    2012-01-01

Markov chain Monte Carlo methods such as the Gibbs sampler and the Metropolis algorithm can be used to sample solutions to non-linear inverse problems. In principle, these methods allow incorporation of prior information of arbitrary complexity. If an analytical closed form description of the prior is available, which is the case when the prior can be described by a multidimensional Gaussian distribution, such prior information can easily be considered. In reality, prior information is often more complex than can be described by the Gaussian model, and no closed form expression of the prior can be given. We propose an algorithm, called sequential Gibbs sampling, allowing the Metropolis algorithm to efficiently incorporate complex priors into the solution of an inverse problem, also for the case where no closed form description of the prior exists. First, we lay out the theoretical background...
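The core idea — propose by resimulating a subset of the model from the conditional prior, so that the acceptance ratio involves only the likelihood — can be sketched in a few lines. The sketch below uses an i.i.d. standard-normal prior as a stand-in (real applications use complex geostatistical priors); the dimension, block size, and forward operator are illustrative assumptions:

```python
import math, random

def sequential_gibbs(d_obs, forward, n_iter=2000, block=2, sigma=0.5, seed=3):
    """Extended Metropolis sampler in which proposals are generated by
    resimulating a small block of model parameters from the prior,
    conditional on the rest. Because proposals come from the prior,
    the acceptance ratio involves only the likelihood (toy sketch with
    an i.i.d. standard-normal prior)."""
    rng = random.Random(seed)
    m = [rng.gauss(0, 1) for _ in d_obs]             # initial prior draw

    def loglike(m):
        d = forward(m)
        return -0.5 * sum((a - b) ** 2 for a, b in zip(d, d_obs)) / sigma**2

    ll = loglike(m)
    samples = []
    for _ in range(n_iter):
        idx = rng.sample(range(len(m)), block)       # block to resimulate
        prop = list(m)
        for i in idx:
            prop[i] = rng.gauss(0, 1)                # conditional prior draw
        ll_p = loglike(prop)
        if math.log(rng.random() + 1e-300) < ll_p - ll:
            m, ll = prop, ll_p
        samples.append(list(m))
    return samples
```

The block size controls the exploration/acceptance trade-off described in the paper: resimulating fewer parameters per step yields higher acceptance at the cost of smaller moves.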

  5. Rapid sampling of molecular motions with prior information constraints.

    Science.gov (United States)

    Raveh, Barak; Enosh, Angela; Schueler-Furman, Ora; Halperin, Dan

    2009-02-01

    Proteins are active, flexible machines that perform a range of different functions. Innovative experimental approaches may now provide limited partial information about conformational changes along motion pathways of proteins. There is therefore a need for computational approaches that can efficiently incorporate prior information into motion prediction schemes. In this paper, we present PathRover, a general setup designed for the integration of prior information into the motion planning algorithm of rapidly exploring random trees (RRT). Each suggested motion pathway comprises a sequence of low-energy clash-free conformations that satisfy an arbitrary number of prior information constraints. These constraints can be derived from experimental data or from expert intuition about the motion. The incorporation of prior information is very straightforward and significantly narrows down the vast search in the typically high-dimensional conformational space, leading to dramatic reduction in running time. To allow the use of state-of-the-art energy functions and conformational sampling, we have integrated this framework into Rosetta, an accurate protocol for diverse types of structural modeling. The suggested framework can serve as an effective complementary tool for molecular dynamics, Normal Mode Analysis, and other prevalent techniques for predicting motion in proteins. We applied our framework to three different model systems. We show that a limited set of experimentally motivated constraints may effectively bias the simulations toward diverse predicates in an outright fashion, from distance constraints to enforcement of loop closure. In particular, our analysis sheds light on mechanisms of protein domain swapping and on the role of different residues in the motion.

  6. Rapid sampling of molecular motions with prior information constraints.

    Directory of Open Access Journals (Sweden)

    Barak Raveh

    2009-02-01

Full Text Available Proteins are active, flexible machines that perform a range of different functions. Innovative experimental approaches may now provide limited partial information about conformational changes along motion pathways of proteins. There is therefore a need for computational approaches that can efficiently incorporate prior information into motion prediction schemes. In this paper, we present PathRover, a general setup designed for the integration of prior information into the motion planning algorithm of rapidly exploring random trees (RRT). Each suggested motion pathway comprises a sequence of low-energy clash-free conformations that satisfy an arbitrary number of prior information constraints. These constraints can be derived from experimental data or from expert intuition about the motion. The incorporation of prior information is very straightforward and significantly narrows down the vast search in the typically high-dimensional conformational space, leading to dramatic reduction in running time. To allow the use of state-of-the-art energy functions and conformational sampling, we have integrated this framework into Rosetta, an accurate protocol for diverse types of structural modeling. The suggested framework can serve as an effective complementary tool for molecular dynamics, Normal Mode Analysis, and other prevalent techniques for predicting motion in proteins. We applied our framework to three different model systems. We show that a limited set of experimentally motivated constraints may effectively bias the simulations toward diverse predicates in an outright fashion, from distance constraints to enforcement of loop closure. In particular, our analysis sheds light on mechanisms of protein domain swapping and on the role of different residues in the motion.
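The RRT-with-constraints idea can be shown in a toy setting. The sketch below grows a minimal 2-D tree in which every new node must satisfy a list of user-supplied predicate constraints, standing in for PathRover's prior-information constraints (the 2-D workspace, step size, and iteration budget are illustrative assumptions; the real method operates in high-dimensional protein conformational space with energy filtering):

```python
import math, random

def rrt(start, goal, predicates, n_iter=5000, step=0.1, seed=0):
    """Minimal 2-D rapidly exploring random tree: every new node must
    satisfy all predicate constraints, mimicking the role of prior
    information constraints in pathway search (toy sketch)."""
    rng = random.Random(seed)
    nodes = [start]
    parent = {0: None}
    for _ in range(n_iter):
        q = (rng.uniform(0, 1), rng.uniform(0, 1))      # random sample
        i = min(range(len(nodes)),                      # nearest tree node
                key=lambda j: (nodes[j][0] - q[0])**2 + (nodes[j][1] - q[1])**2)
        nx, ny = nodes[i]
        d = math.hypot(q[0] - nx, q[1] - ny) or 1e-12
        new = (nx + step * (q[0] - nx) / d, ny + step * (q[1] - ny) / d)
        if all(p(new) for p in predicates):             # constraint check
            parent[len(nodes)] = i
            nodes.append(new)
            if math.hypot(new[0] - goal[0], new[1] - goal[1]) < step:
                path = [len(nodes) - 1]                 # trace back to root
                while parent[path[-1]] is not None:
                    path.append(parent[path[-1]])
                return [nodes[j] for j in reversed(path)]
    return None
```

Rejecting constraint-violating nodes at insertion time is what prunes the search space: the tree only ever explores the region consistent with the prior information.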

  7. Evaluation of the Validity of Groundwater Samples Obtained Using the Purge Water Management System at SRS

    International Nuclear Information System (INIS)

    Beardsley, C.C.

    1999-01-01

As part of the demonstration testing of the Purge Water Management System (PWMS) technology at the Savannah River Site (SRS), four wells were equipped with PWMS units in 1997 and a series of sampling events were conducted at each during 1997-1998. Three of the wells were located in A/M Area while the fourth was located at the Old Radioactive Waste Burial Ground in the General Separations Area. The PWMS is a "closed-loop", non-contact system used to collect and return purge water to the originating aquifer after a sampling event without having significantly altered the water quality. One of the primary concerns as to its applicability at SRS, and elsewhere, is whether the PWMS might resample groundwater that is returned to the aquifer during the previous sampling event. The purpose of the present investigation was to compare groundwater chemical analysis data collected at the four test wells using the PWMS vs. historical data collected using the standard monitoring program methodology to determine if the PWMS provides representative monitoring samples. The analysis of the groundwater chemical concentrations indicates that the PWMS sampling methodology acquired representative groundwater samples at monitoring wells ABP-1A, ABP-4, ARP-3 and BGO-33C. Representative groundwater samples are achieved if the PWMS does not resample groundwater that has been purged and returned during a previous sampling event. Initial screening calculations, conducted prior to the selection of these four wells, indicated that groundwater velocities were high enough under the ambient hydraulic gradients to preclude resampling from occurring at the time intervals that were used at each well. Corroborating evidence included a tracer test that was conducted at BGO-33C, the high degree of similarity between analyte concentrations derived from the PWMS samples and those obtained from historical protocol sampling, as well as the fact that PWMS data extend all previously existing concentration

  8. Effects of model complexity and priors on estimation using sequential importance sampling/resampling for species conservation

    Science.gov (United States)

    Dunham, Kylee; Grand, James B.

    2016-01-01

    We examined the effects of complexity and priors on the accuracy of models used to estimate ecological and observational processes, and to make predictions regarding population size and structure. State-space models are useful for estimating complex, unobservable population processes and making predictions about future populations based on limited data. To better understand the utility of state space models in evaluating population dynamics, we used them in a Bayesian framework and compared the accuracy of models with differing complexity, with and without informative priors using sequential importance sampling/resampling (SISR). Count data were simulated for 25 years using known parameters and observation process for each model. We used kernel smoothing to reduce the effect of particle depletion, which is common when estimating both states and parameters with SISR. Models using informative priors estimated parameter values and population size with greater accuracy than their non-informative counterparts. While the estimates of population size and trend did not suffer greatly in models using non-informative priors, the algorithm was unable to accurately estimate demographic parameters. This model framework provides reasonable estimates of population size when little to no information is available; however, when information on some vital rates is available, SISR can be used to obtain more precise estimates of population size and process. Incorporating model complexity such as that required by structured populations with stage-specific vital rates affects precision and accuracy when estimating latent population variables and predicting population dynamics. These results are important to consider when designing monitoring programs and conservation efforts requiring management of specific population segments.
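A bootstrap SISR filter of the kind the abstract describes can be sketched for a toy scalar population model (the growth rate, noise levels, and Gaussian observation error below are illustrative assumptions; the paper fits richer stage-structured models and adds kernel smoothing for parameter estimation, which this sketch omits):

```python
import math, random

def bootstrap_sisr(obs, n_part=500, lam=1.05, obs_sd=20.0, seed=2):
    """Bootstrap sequential importance sampling/resampling filter for a
    toy population model: N_t = lam * N_{t-1} + process noise, observed
    with Gaussian error (illustrative sketch)."""
    rng = random.Random(seed)
    parts = [rng.uniform(50, 150) for _ in range(n_part)]  # prior on N_0
    means = []
    for y in obs:
        # propagate particles through the process model
        parts = [max(0.0, lam * p + rng.gauss(0, 5)) for p in parts]
        # weight by the observation likelihood (Gaussian errors)
        w = [math.exp(-0.5 * ((y - p) / obs_sd) ** 2) for p in parts]
        tot = sum(w) or 1e-300
        w = [x / tot for x in w]
        means.append(sum(p * x for p, x in zip(parts, w)))
        # multinomial resampling to combat particle depletion
        parts = rng.choices(parts, weights=w, k=n_part)
    return means
```

The resampling step is what keeps the particle cloud concentrated on plausible states; without it, a handful of particles would carry nearly all the weight after a few observations, the depletion problem the abstract mentions.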

  9. On the prior probabilities for two-stage Bayesian estimates

    International Nuclear Information System (INIS)

    Kohut, P.

    1992-01-01

    The method of Bayesian inference is reexamined for its applicability and for the required underlying assumptions in obtaining and using prior probability estimates. Two different approaches are suggested to determine the first-stage priors in the two-stage Bayesian analysis which avoid certain assumptions required for other techniques. In the first scheme, the prior is obtained through a true frequency based distribution generated at selected intervals utilizing actual sampling of the failure rate distributions. The population variability distribution is generated as the weighed average of the frequency distributions. The second method is based on a non-parametric Bayesian approach using the Maximum Entropy Principle. Specific features such as integral properties or selected parameters of prior distributions may be obtained with minimal assumptions. It is indicated how various quantiles may also be generated with a least square technique
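The first scheme — a frequency-based first-stage prior built from sampled failure-rate distributions, then updated for a plant of interest — can be sketched on a discrete grid of rates. The per-plant event counts, exposure times, and rate bins below are hypothetical, and the population-variability step is simplified to an equally weighted average of normalized per-plant Poisson likelihoods:

```python
import math

def two_stage_posterior(plant_data, new_events, new_time, bins):
    """Two-stage Bayesian estimate of a failure rate on discrete bins:
    first-stage prior = average of per-plant frequency distributions,
    second stage = Poisson update with the new plant's data (sketch)."""
    def poisson_like(rate, n, t):
        return math.exp(-rate * t) * (rate * t) ** n / math.factorial(n)

    # first stage: population-variability prior as a weighted average
    prior = [0.0] * len(bins)
    for n, t in plant_data:
        like = [poisson_like(b, n, t) for b in bins]
        tot = sum(like)
        prior = [p + l / tot / len(plant_data) for p, l in zip(prior, like)]

    # second stage: Bayesian update for the plant of interest
    post = [p * poisson_like(b, new_events, new_time)
            for p, b in zip(prior, bins)]
    tot = sum(post)
    return [p / tot for p in post]
```

The point of the scheme is that the first-stage prior is driven by the observed plant-to-plant variability rather than by a parametric family chosen in advance.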

  10. Implementation of antimicrobial peptides for sample preparation prior to nucleic acid amplification in point-of-care settings.

    Science.gov (United States)

    Krõlov, Katrin; Uusna, Julia; Grellier, Tiia; Andresen, Liis; Jevtuševskaja, Jekaterina; Tulp, Indrek; Langel, Ülo

    2017-12-01

A variety of sample preparation techniques are used prior to nucleic acid amplification. However, their efficiency is not always sufficient and nucleic acid purification remains the preferred method for template preparation. Purification is difficult and costly to apply in point-of-care (POC) settings and there is a strong need for more robust, rapid, and efficient biological sample preparation techniques in molecular diagnostics. Here, the authors applied antimicrobial peptides (AMPs) for urine sample preparation prior to isothermal loop-mediated amplification (LAMP). AMPs bind to many microorganisms, such as bacteria, fungi, protozoa and viruses, disrupting their membrane integrity and facilitating nucleic acid release. The authors show that incubation of E. coli with the antimicrobial peptide cecropin P1 for 5 min had a significant effect on the availability of template DNA compared with untreated or even heat-treated samples, resulting in up to a six-fold increase in amplification efficiency. These results show that AMP treatment is a very efficient sample preparation technique that is suitable for application prior to nucleic acid amplification directly within biological samples. Furthermore, the entire AMP treatment was performed at room temperature for 5 min, making it a good candidate for use in POC applications.

  11. Standardised Resting Time Prior to Blood Sampling and Diurnal Variation Associated with Risk of Patient Misclassification

    DEFF Research Database (Denmark)

    Bøgh Andersen, Ida; Brasen, Claus L.; Christensen, Henry

    2015-01-01

BACKGROUND: According to current recommendations, blood samples should be taken in the morning after 15 minutes' resting time. Some components exhibit diurnal variation and, in response to pressures to expand opening hours and reduce waiting time, the aims of this study were to investigate the impact of resting time prior to blood sampling and diurnal variation on biochemical components, including albumin, thyrotropin (TSH), total calcium and sodium in plasma. METHODS: All patients referred to an outpatient clinic for blood sampling were included in the period Nov 2011 until June 2014 (opening... RESULTS: ...9×10-7) and sodium (p = 8.7×10-16). Only TSH and albumin were clinically significantly influenced by diurnal variation. Resting time had no clinically significant effect. CONCLUSIONS: We found no need for resting 15 minutes prior to blood sampling. However, diurnal variation was found to have a significant...

  12. Is Mars Sample Return Required Prior to Sending Humans to Mars?

    Science.gov (United States)

Carr, Michael; Abell, Paul; Allwood, Abigail; Baker, John; Barnes, Jeff; Bass, Deborah; Beaty, David; Boston, Penny; Brinkerhoff, Will; Budney, Charles; et al.

    2012-01-01

Prior to potentially sending humans to the surface of Mars, it is fundamentally important to return samples from Mars. Analysis in Earth's extensive scientific laboratories would significantly reduce the risk of human Mars exploration and would also support the science and engineering decisions relating to the Mars human flight architecture. The importance of measurements of any returned Mars samples ranges from critical to desirable, and in all cases these samples would enhance our understanding of the Martian environment before potentially sending humans to that alien locale. For example, Mars sample return (MSR) could yield information that would enable human exploration related to 1) enabling forward and back planetary protection, 2) characterizing properties of Martian materials relevant for in situ resource utilization (ISRU), 3) assessing any toxicity of Martian materials with respect to human health and performance, and 4) identifying information related to engineering surface hazards such as the corrosive effect of the Martian environment. In addition, MSR would be an engineering 'proof of concept' for a potential round trip human mission to the planet, and a potential model for international Mars exploration.

  13. Obtaining Samples Representative of Contaminant Distribution in an Aquifer

    International Nuclear Information System (INIS)

    Schalla, Ronald; Spane, Frank A.; Narbutovskih, Susan M.; Conley, Scott F.; Webber, William D.

    2002-01-01

Historically, groundwater samples collected from monitoring wells have been assumed to provide average indications of contaminant concentrations within the aquifer over the well-screen interval. In-well flow circulation, heterogeneity in the surrounding aquifer, and the sampling method utilized, however, can significantly impact the representativeness of samples as contaminant indicators of actual conditions within the surrounding aquifer. This paper identifies the needs and approaches essential for providing cost-effective and technically meaningful groundwater-monitoring results. Proper design of the well screen interval is critical. An accurate understanding of ambient (non-pumping) flow conditions within the monitoring well is essential for determining the contaminant distribution within the aquifer. The ambient in-well flow velocity, flow direction and volumetric flux rate are key to this understanding. Not only do the ambient flow conditions need to be identified for preferential flow zones, but also the probable changes that will be imposed under the dynamic conditions that occur during groundwater sampling. Once the in-well flow conditions are understood, effective sampling can be conducted to obtain representative samples for specific depth zones or zones of interest. The question of sample representativeness has become an important issue as waste minimization techniques such as low-flow purging and sampling are implemented to combat the increasing cost of well purging and sampling at many hazardous waste sites. Several technical approaches (e.g., well tracer techniques and flowmeter surveys) can be used to determine in-well flow conditions, and these are discussed with respect to both their usefulness and limitations. Proper fluid extraction methods using minimal (low) volume and no-purge sampling to obtain representative samples of aquifer conditions are presented.

  14. Supporting patients in obtaining and oncologists in providing evidence-based health-related quality of life information prior to and after esophageal cancer surgery

    NARCIS (Netherlands)

    Jacobs, M.

    2015-01-01

    The overall aim of this thesis was to support patients in obtaining and oncologists in providing evidence-based HRQL data prior to and following esophageal cancer surgery. This thesis is divided in two parts. In Part I, we addressed the information needs of esophageal cancer patients prior to and

  15. Sample preparation prior to the LC-MS-based metabolomics/metabonomics of blood-derived samples.

    Science.gov (United States)

    Gika, Helen; Theodoridis, Georgios

    2011-07-01

Blood represents a very important biological fluid and has been the target of continuous and extensive research for diagnostic, or health and drug monitoring reasons. Recently, metabonomics/metabolomics have emerged as a new and promising 'omics' platform that shows potential in biomarker discovery, especially in areas such as disease diagnosis and assessment of drug efficacy or toxicity. Blood is collected in various establishments under conditions that are not standardized. Next, the samples are prepared and analyzed using different methodologies or tools. When targeted analysis of key molecules (e.g., a drug or its metabolite[s]) is the aim, enforcement of certain measures or additional analyses may correct and harmonize these discrepancies. In omics fields such as those performed by holistic analytical approaches, no such rules or tools are available. As a result, comparison or correlation of results or data fusion becomes impractical. However, it is evident that such obstacles should be overcome in the near future to allow for large-scale studies that involve the assaying of samples from hundreds of individuals. In such studies the effect of sample handling and preparation becomes critical, if months of expert work and expensive instrument time are not to be wasted. The present review aims to cover the different methodologies applied to the pretreatment of blood prior to LC-MS metabolomic/metabonomic studies. The article tries to critically compare the methods and highlight issues that need to be addressed.

  16. A Green Preconcentration Method for Determination of Cobalt and Lead in Fresh Surface and Waste Water Samples Prior to Flame Atomic Absorption Spectrometry

    Directory of Open Access Journals (Sweden)

    Naeemullah

    2012-01-01

Full Text Available Cloud point extraction (CPE) has been used for the preconcentration and simultaneous determination of cobalt (Co) and lead (Pb) in fresh and wastewater samples. The extraction of analytes from aqueous samples was performed in the presence of 8-hydroxyquinoline (oxine) as a chelating agent and Triton X-114 as a nonionic surfactant. Experiments were conducted to assess the effect of different chemical variables such as pH, amounts of reagents (oxine and Triton X-114), temperature, incubation time, and sample volume. After phase separation, based on the cloud point, the surfactant-rich phase was diluted with acidic ethanol prior to its analysis by flame atomic absorption spectrometry (FAAS). Enhancement factors of 70 and 50, with detection limits of 0.26 μg L−1 and 0.44 μg L−1, were obtained for Co and Pb, respectively. In order to validate the developed method, a certified reference material (SRM 1643e) was analyzed and the determined values obtained were in good agreement with the certified values. The proposed method was applied successfully to the determination of Co and Pb in fresh surface and waste water samples.
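Figures of merit like those quoted above are conventionally derived from the calibration data. The sketch below shows the common definitions — detection limit as three times the standard deviation of blank measurements divided by the calibration slope, and enhancement factor as the ratio of calibration slopes with and without the preconcentration step — with hypothetical numbers, since the abstract does not give the paper's exact protocol:

```python
from statistics import stdev

def detection_limit(blank_signals, slope):
    """Limit of detection by the common 3-sigma convention:
    3 x standard deviation of replicate blank signals divided by the
    calibration slope (illustrative; units follow the slope's units)."""
    return 3 * stdev(blank_signals) / slope

def enhancement_factor(slope_preconc, slope_direct):
    """Sensitivity enhancement from preconcentration: ratio of the
    calibration slopes with and without the CPE step."""
    return slope_preconc / slope_direct
```

With these conventions, a 70-fold enhancement factor means the CPE calibration slope is 70 times steeper than the direct-aspiration slope, which is what drives the sub-µg/L detection limits.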

  17. From global to local statistical shape priors: novel methods to obtain accurate reconstruction results with a limited amount of training shapes

    CERN Document Server

    Last, Carsten

    2017-01-01

    This book proposes a new approach to handle the problem of limited training data. Common approaches to cope with this problem are to model the shape variability independently across predefined segments or to allow artificial shape variations that cannot be explained through the training data, both of which have their drawbacks. The approach presented uses a local shape prior in each element of the underlying data domain and couples all local shape priors via smoothness constraints. The book provides a sound mathematical foundation in order to embed this new shape prior formulation into the well-known variational image segmentation framework. The new segmentation approach so obtained allows accurate reconstruction of even complex object classes with only a few training shapes at hand.

  18. Application of Bayesian Decision Theory Based on Prior Information in the Multi-Objective Optimization Problem

    Directory of Open Access Journals (Sweden)

    Xia Lei

    2010-12-01

Full Text Available General multi-objective optimization methods find it hard to obtain prior information, and how to utilize prior information has been a challenge. This paper analyzes the characteristics of Bayesian decision-making based on the maximum entropy principle and prior information, in particular how to effectively improve decision-making reliability when reference samples are deficient. The paper exhibits the effectiveness of the proposed method in a real application: multi-frequency offset estimation in a distributed multiple-input multiple-output system. The simulation results demonstrate that Bayesian decision-making based on prior information has better global searching capability when sampling data are deficient.

  19. Fast Markov chain Monte Carlo sampling for sparse Bayesian inference in high-dimensional inverse problems using L1-type priors

    International Nuclear Information System (INIS)

    Lucka, Felix

    2012-01-01

Sparsity has become a key concept for solving high-dimensional inverse problems using variational regularization techniques. Recently, the use of similar sparsity constraints in the Bayesian framework for inverse problems, encoded in the prior distribution, has attracted attention. Important questions about the relation between regularization theory and Bayesian inference still need to be addressed when using sparsity-promoting inversion. A practical obstacle for these examinations is the lack of fast posterior sampling algorithms for sparse, high-dimensional Bayesian inversion. Accessing the full range of Bayesian inference methods requires being able to draw samples from the posterior probability distribution in a fast and efficient way. This is usually done using Markov chain Monte Carlo (MCMC) sampling algorithms. In this paper, we develop and examine a new implementation of a single-component Gibbs MCMC sampler for sparse priors relying on L1-norms. We demonstrate that the efficiency of our Gibbs sampler increases when the level of sparsity or the dimension of the unknowns is increased. This property is contrary to the properties of the most commonly applied Metropolis–Hastings (MH) sampling schemes. We demonstrate that the efficiency of MH schemes for L1-type priors dramatically decreases when the level of sparsity or the dimension of the unknowns is increased. Practically, Bayesian inversion for L1-type priors using MH samplers is not feasible at all. As this is commonly believed to be an intrinsic feature of MCMC sampling, the performance of our Gibbs sampler also challenges common beliefs about the applicability of sample-based Bayesian inference. (paper)
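The building block of a single-component Gibbs sweep under an L1-type prior is the one-dimensional conditional density p(x) ∝ exp(-(x-m)²/(2σ²) - λ|x|), which splits into two truncated-Gaussian pieces and can be sampled exactly. The sketch below (my own derivation of that 1-D conditional, not the paper's implementation, which handles the full high-dimensional posterior) illustrates this:

```python
import math, random

def sample_l1_conditional(m, sigma, lam, rng):
    """Draw one exact sample from p(x) ~ exp(-(x-m)^2/(2 sigma^2) - lam*|x|).
    On x >= 0 the density is a Gaussian with shifted mean m - lam*sigma^2,
    on x < 0 one with mean m + lam*sigma^2; pick a side by its mass, then
    sample the truncated Gaussian (naive rejection for the truncation)."""
    Phi = lambda z: 0.5 * math.erfc(-z / math.sqrt(2))   # standard normal CDF
    mp, mm = m - lam * sigma**2, m + lam * sigma**2      # shifted means
    # unnormalised masses of the two half-Gaussian pieces
    wp = math.exp((mp**2 - m**2) / (2 * sigma**2)) * Phi(mp / sigma)
    wm = math.exp((mm**2 - m**2) / (2 * sigma**2)) * (1 - Phi(mm / sigma))
    if rng.random() < wp / (wp + wm):
        mu, lo, hi = mp, 0.0, math.inf                   # positive branch
    else:
        mu, lo, hi = mm, -math.inf, 0.0                  # negative branch
    while True:                                          # truncation by rejection
        x = rng.gauss(mu, sigma)
        if lo <= x <= hi:
            return x
```

A full Gibbs sweep would apply this update to each component in turn with m, σ set by the conditional posterior given the other components; because each draw is exact, no accept/reject step on the chain level is needed, which is one reason the Gibbs approach scales better than MH here.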

  20. Constrained noninformative priors

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1994-10-01

The Jeffreys noninformative prior distribution for a single unknown parameter is the distribution corresponding to a uniform distribution in the transformed model where the unknown parameter is approximately a location parameter. To obtain a prior distribution with a specified mean but with diffuseness reflecting great uncertainty, a natural generalization of the noninformative prior is the distribution corresponding to the constrained maximum entropy distribution in the transformed model. Examples are given.
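A discrete analogue makes the constrained maximum entropy idea concrete: on a finite support, the entropy-maximizing distribution with a fixed mean has the exponential-family form p_i ∝ exp(-β x_i), with β chosen to satisfy the mean constraint. The sketch below (a generic discrete illustration, not the paper's transformed-model construction) solves for β by bisection:

```python
import math

def maxent_pmf(support, target_mean, tol=1e-10):
    """Maximum-entropy pmf on a discrete support subject to a specified
    mean: p_i ~ exp(-beta * x_i), beta found by bisection (the mean is
    monotonically decreasing in beta). beta = 0 recovers the uniform
    (noninformative) case."""
    def pmf(beta):
        e = [-beta * x for x in support]
        mx = max(e)                      # shift exponents for stability
        w = [math.exp(v - mx) for v in e]
        t = sum(w)
        return [v / t for v in w]

    def mean_for(beta):
        return sum(x * p for x, p in zip(support, pmf(beta)))

    lo, hi = -50.0, 50.0                 # bracket for the multiplier
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > target_mean:  # mean too high -> increase beta
            lo = mid
        else:
            hi = mid
    return pmf(0.5 * (lo + hi))
```

Setting the target mean equal to the uniform mean returns β ≈ 0, i.e. the unconstrained noninformative prior, which is the sense in which the constrained prior generalizes it.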

  1. Sample preparation strategies for food and biological samples prior to nanoparticle detection and imaging

    DEFF Research Database (Denmark)

    Larsen, Erik Huusfeldt; Löschner, Katrin

    2014-01-01

...transmission electron microscopy (TEM) proved to be necessary for troubleshooting of results obtained from AFFF-LS-ICP-MS. Aqueous and enzymatic extraction strategies were tested for thorough sample preparation, aiming to degrade the sample matrix and liberate the AgNPs from chicken meat into liquid suspension. The resulting AFFF-ICP-MS fractograms, which corresponded to the enzymatic digests, showed a major nano-peak (about 80 % recovery of AgNPs spiked to the meat) plus new smaller peaks that eluted close to the void volume of the fractograms. Small, but significant shifts in retention time of AFFF peaks were observed for the meat sample extracts and the corresponding neat AgNP suspension, and rendered sizing by way of calibration with AgNPs as sizing standards inaccurate. In order to gain further insight into the sizes of the separated AgNPs, or their possible dissolved state, fractions of the AFFF eluate were collected...

  2. Relations among conceptual knowledge, procedural knowledge, and procedural flexibility in two samples differing in prior knowledge.

    Science.gov (United States)

    Schneider, Michael; Rittle-Johnson, Bethany; Star, Jon R

    2011-11-01

    Competence in many domains rests on children developing conceptual and procedural knowledge, as well as procedural flexibility. However, research on the developmental relations between these different types of knowledge has yielded unclear results, in part because little attention has been paid to the validity of the measures or to the effects of prior knowledge on the relations. To overcome these problems, we modeled the three constructs in the domain of equation solving as latent factors and tested (a) whether the predictive relations between conceptual and procedural knowledge were bidirectional, (b) whether these interrelations were moderated by prior knowledge, and (c) how both constructs contributed to procedural flexibility. We analyzed data from 2 measurement points each from two samples (Ns = 228 and 304) of middle school students who differed in prior knowledge. Conceptual and procedural knowledge had stable bidirectional relations that were not moderated by prior knowledge. Both kinds of knowledge contributed independently to procedural flexibility. The results demonstrate how changes in complex knowledge structures contribute to competence development.

  3. Bayesian inference from count data using discrete uniform priors.

    Directory of Open Access Journals (Sweden)

    Federico Comoglio

    Full Text Available We consider a set of sample counts obtained by sampling arbitrary fractions of a finite volume containing a homogeneously dispersed population of identical objects. We report a Bayesian derivation of the posterior probability distribution of the population size using a binomial likelihood and non-conjugate, discrete uniform priors under sampling with or without replacement. Our derivation yields a computationally feasible formula that can prove useful in a variety of statistical problems involving absolute quantification under uncertainty. We implemented our algorithm in the R package dupiR and compared it with a previously proposed Bayesian method based on a Gamma prior. As a showcase, we demonstrate that our inference framework can be used to estimate bacterial survival curves from measurements characterized by extremely low or zero counts and rather high sampling fractions. All in all, we provide a versatile, general-purpose algorithm to infer population sizes from count data, which can find application in a broad spectrum of biological and physical problems.
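    The core calculation described above, a binomial likelihood combined with a discrete uniform prior, can be sketched in a few lines of Python. This is an illustrative re-implementation of the basic idea, not the dupiR package itself; scipy is assumed:

```python
import numpy as np
from scipy.stats import binom

def posterior_population_size(k, f, n_max=5000):
    """Posterior over the population size n, given an observed count k from
    sampling a fraction f of the volume: binomial likelihood P(k | n, f)
    with a discrete uniform prior on {0, ..., n_max}. Sketch of the basic
    case; dupiR also covers sampling without replacement."""
    n = np.arange(n_max + 1)
    like = binom.pmf(k, n, f)      # zero for n < k
    post = like / like.sum()       # flat prior: normalize the likelihood
    return n, post

n, post = posterior_population_size(k=3, f=0.01)
n_map = int(n[post.argmax()])      # posterior mode, close to k / f
```

    With a count of 3 and a sampling fraction of 1%, the posterior mode lands near 300, matching the intuition that the sample scales up by the inverse of the fraction.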

  4. Bespoke Bias for Obtaining Free Energy Differences within Variationally Enhanced Sampling.

    Science.gov (United States)

    McCarty, James; Valsson, Omar; Parrinello, Michele

    2016-05-10

    Obtaining efficient sampling of multiple metastable states through molecular dynamics and hence determining free energy differences is central for understanding many important phenomena. Here we present a new biasing strategy, which employs the recent variationally enhanced sampling approach (Valsson and Parrinello Phys. Rev. Lett. 2014, 113, 090601). The bias is constructed from an intuitive model of the local free energy surface describing fluctuations around metastable minima and depends on only a few parameters which are determined variationally such that efficient sampling between states is obtained. The bias constructed in this manner largely reduces the need of finding a set of collective variables that completely spans the conformational space of interest, as they only need to be a locally valid descriptor of the system about its local minimum. We introduce the method and demonstrate its power on two representative examples.

  5. Supporting patients in obtaining and oncologists in providing evidence-based health-related quality of life information prior to and after esophageal cancer surgery

    OpenAIRE

    Jacobs, M.

    2015-01-01

    The overall aim of this thesis was to support patients in obtaining and oncologists in providing evidence-based HRQL data prior to and following esophageal cancer surgery. This thesis is divided into two parts. In Part I, we addressed the information needs of esophageal cancer patients prior to and following esophageal surgery, the barriers and facilitators patients experienced when discussing their information needs with their oncologist, and the development of a web-based question prompt shee...

  6. Obtaining value prior to pulping with diethyl oxalate and oxalic acid

    Science.gov (United States)

    W.R. Kenealy; E. Horn; C.J. Houtman; J. Laplaza; T.W. Jeffries

    2007-01-01

    Pulp and paper are converted to paper products with yields of paper dependent on the wood and the process used. Even with high yield pulps there are conversion losses and with chemical pulps the yields approach 50%. The portions of the wood that do not provide product are either combusted to generate power and steam or incur a cost in waste water treatment. Value prior...

  7. PREFERENCE OF PRIOR FOR BAYESIAN ANALYSIS OF THE MIXED BURR TYPE X DISTRIBUTION UNDER TYPE I CENSORED SAMPLES

    Directory of Open Access Journals (Sweden)

    Tabassum Naz Sindhu

    2014-05-01

    Full Text Available The paper is concerned with the preference of prior for the Bayesian analysis of the shape parameter of the mixture of Burr type X distribution using the censored data. We modeled the heterogeneous population using two components mixture of the Burr type X distribution. A comprehensive simulation scheme, through probabilistic mixing, has been followed to highlight the properties and behavior of the estimates in terms of sample size, corresponding risks and the proportion of the component of the mixture. The Bayes estimators of the parameters have been evaluated under the assumption of informative and non-informative priors using symmetric and asymmetric loss functions. The model selection criterion for the preference of the prior has been introduced. The hazard rate function of the mixture distribution has been discussed. The Bayes estimates under exponential prior and precautionary loss function exhibit the minimum posterior risks with some exceptions.

  8. Parameter estimation of multivariate multiple regression model using bayesian with non-informative Jeffreys’ prior distribution

    Science.gov (United States)

    Saputro, D. R. S.; Amalia, F.; Widyaningsih, P.; Affan, R. C.

    2018-05-01

    The Bayesian method can be used to estimate the parameters of a multivariate multiple regression model. It involves two distributions, the prior and the posterior, with the posterior depending on the choice of prior. Jeffreys' prior is a non-informative prior distribution, used when no information about the parameters is available. Combining the non-informative Jeffreys' prior with the sample information yields the posterior distribution, which is then used to estimate the parameters. The purpose of this research is to estimate the parameters of a multivariate regression model using the Bayesian method with the non-informative Jeffreys' prior. Based on the results and discussion, the estimates of β and Σ are obtained as the expected values of their marginal posterior distributions, which are multivariate normal and inverse Wishart, respectively. However, calculating these expected values involves integrals that are difficult to evaluate in closed form. Therefore, an approach is needed that generates random samples according to the posterior distribution of each parameter using the Markov chain Monte Carlo (MCMC) Gibbs sampling algorithm.
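    A minimal Gibbs sampler for this model can be sketched as follows. This is an illustrative implementation on simulated data, assuming the standard conditional distributions under Jeffreys' prior (matrix normal for B given Σ, inverse Wishart for Σ given B); numpy and scipy are used:

```python
import numpy as np
from scipy.stats import invwishart

rng = np.random.default_rng(0)

# Simulated data: n observations, p predictors, q responses
n, p, q = 200, 3, 2
X = rng.normal(size=(n, p))
B_true = np.array([[1.0, -0.5], [0.3, 0.8], [-1.2, 0.2]])
Sigma_true = np.array([[1.0, 0.3], [0.3, 0.5]])
Y = X @ B_true + rng.multivariate_normal(np.zeros(q), Sigma_true, size=n)

XtX_inv = np.linalg.inv(X.T @ X)
B_hat = XtX_inv @ X.T @ Y            # OLS estimate (posterior centre for B)

# Gibbs sampling from the conditionals under Jeffreys' prior:
#   B | Sigma, Y : vec(B) ~ N(vec(B_hat), Sigma kron (X'X)^-1)
#   Sigma | B, Y : inverse Wishart(n, R'R) with residuals R = Y - X B
Sigma = np.cov(Y - X @ B_hat, rowvar=False)
draws = []
for it in range(1500):
    cov = np.kron(Sigma, XtX_inv)    # column-stacked vec convention
    b = rng.multivariate_normal(B_hat.flatten(order='F'), cov)
    B = b.reshape((p, q), order='F')
    R = Y - X @ B
    Sigma = invwishart.rvs(df=n, scale=R.T @ R, random_state=rng)
    if it >= 500:                    # discard burn-in
        draws.append(B)

B_post = np.mean(draws, axis=0)      # posterior mean estimate of B
```

    With a flat prior and this much data, the posterior mean of B essentially coincides with the OLS estimate, which is a useful sanity check on the sampler.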

  9. Fission products in National Atmospheric Deposition Program—Wet deposition samples prior to and following the Fukushima Dai-Ichi Nuclear Power Plant incident, March 8–April 5, 2011

    Science.gov (United States)

    Wetherbee, Gregory A.; Debey, Timothy M.; Nilles, Mark A.; Lehmann, Christopher M.B.; Gay, David A.

    2012-01-01

    Radioactive isotopes I-131, Cs-134, or Cs-137, products of uranium fission, were measured at approximately 20 percent of 167 sampled National Atmospheric Deposition Program monitoring sites in North America (primarily in the contiguous United States and Alaska) after the Fukushima Dai-Ichi Nuclear Power Plant incident on March 12, 2011. Samples from the National Atmospheric Deposition Program were analyzed for the period of March 8-April 5, 2011. Calculated 1- or 2-week radionuclide deposition fluxes at 35 sites from Alaska to Vermont ranged from 0.47 to 5,100 Becquerels per square meter during the sampling period of March 15-April 5, 2011. No fission-product isotopes were measured in National Atmospheric Deposition Program samples obtained during March 8-15, 2011, prior to the arrival of contaminated air in North America.

  10. A self-sampling method to obtain large volumes of undiluted cervicovaginal secretions.

    Science.gov (United States)

    Boskey, Elizabeth R; Moench, Thomas R; Hees, Paul S; Cone, Richard A

    2003-02-01

    Studies of vaginal physiology and pathophysiology sometimes require larger volumes of undiluted cervicovaginal secretions than can be obtained by current methods. A convenient method for self-sampling these secretions outside a clinical setting can facilitate such studies of reproductive health. The goal was to develop a vaginal self-sampling method for collecting large volumes of undiluted cervicovaginal secretions. A menstrual collection device (the Instead cup) was inserted briefly into the vagina to collect secretions that were then retrieved from the cup by centrifugation in a 50-ml conical tube. All 16 women asked to perform this procedure found it feasible and acceptable. Among 27 samples, an average of 0.5 g of secretions (range, 0.1-1.5 g) was collected. This is a rapid and convenient self-sampling method for obtaining relatively large volumes of undiluted cervicovaginal secretions. It should prove suitable for a wide range of assays, including those involving sexually transmitted diseases, microbicides, vaginal physiology, immunology, and pathophysiology.

  11. Testability evaluation using prior information of multiple sources

    Institute of Scientific and Technical Information of China (English)

    Wang Chao; Qiu Jing; Liu Guanjun; Zhang Yong

    2014-01-01

    Testability plays an important role in improving the readiness and decreasing the life-cycle cost of equipment. Testability demonstration and evaluation is of significance in measuring such testability indexes as fault detection rate (FDR) and fault isolation rate (FIR), which is useful to the producer in mastering the testability level and improving the testability design, and helpful to the consumer in making purchase decisions. Aiming at the problems with a small sample of testability demonstration test data (TDTD) such as low evaluation confidence and inaccurate result, a testability evaluation method is proposed based on the prior information of multiple sources and Bayes theory. Firstly, the types of prior information are analyzed. The maximum entropy method is applied to the prior information with the mean and interval estimate forms on the testability index to obtain the parameters of prior probability density function (PDF), and the empirical Bayesian method is used to get the parameters for the prior information with a success-fail form. Then, a parametrical data consistency check method is used to check the compatibility between all the sources of prior information and TDTD. For the prior information to pass the check, the prior credibility is calculated. A mixed prior distribution is formed based on the prior PDFs and the corresponding credibility. The Bayesian posterior distribution model is acquired with the mixed prior distribution and TDTD, based on which the point and interval estimates are calculated. Finally, examples of a flying control system are used to verify the proposed method. The results show that the proposed method is feasible and effective.
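    The mixed-prior idea can be illustrated with a small sketch: a mixture of Beta priors (with weights playing the role of the prior credibilities) is updated with success-fail demonstration data, giving a point estimate and a credible interval for a testability index such as FDR. All numbers below are hypothetical, not values from the paper:

```python
import numpy as np
from scipy.stats import beta
from scipy.special import betaln

# Two hypothetical prior sources for the fault detection rate (FDR),
# encoded as Beta(a, b) densities, weighted by their prior credibilities.
priors = [(8.0, 2.0), (12.0, 3.0)]
cred = np.array([0.6, 0.4])

s, m = 18, 20                # demonstration test: s detections in m faults

# The posterior is again a Beta mixture; component weights are updated by
# the marginal likelihood of the data under each component (the binomial
# coefficient is common to all components and cancels).
log_w = np.log(cred) + np.array(
    [betaln(a + s, b + m - s) - betaln(a, b) for a, b in priors])
w_post = np.exp(log_w - log_w.max())
w_post /= w_post.sum()
post = [(a + s, b + m - s) for a, b in priors]

# Point estimate: posterior mean of the mixture
fdr_hat = sum(w * a / (a + b) for w, (a, b) in zip(w_post, post))

# 90% credible interval by inverting the mixture CDF on a grid
grid = np.linspace(0.0, 1.0, 100001)
cdf = sum(w * beta.cdf(grid, a, b) for w, (a, b) in zip(w_post, post))
lo, hi = grid[np.searchsorted(cdf, 0.05)], grid[np.searchsorted(cdf, 0.95)]
```

    Components whose priors agree better with the test data automatically receive more posterior weight, which is the role the prior-credibility check plays in the paper's method.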

  12. Reliability assessment based on small samples of normal distribution

    International Nuclear Information System (INIS)

    Ma Zhibo; Zhu Jianshi; Xu Naixin

    2003-01-01

    When the pertinent parameter involved in reliability definition complies with normal distribution, the conjugate prior of its distributing parameters (μ, h) is of normal-gamma distribution. With the help of maximum entropy and the moments-equivalence principles, the subjective information of the parameter and the sampling data of its independent variables are transformed to a Bayesian prior of (μ,h). The desired estimates are obtained from either the prior or the posterior which is formed by combining the prior and sampling data. Computing methods are described and examples are presented to give demonstrations
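    The normal-gamma conjugate update mentioned above follows standard textbook formulas; a minimal sketch (illustrative prior values, not the paper's code):

```python
import numpy as np

def normal_gamma_update(mu0, kappa0, alpha0, beta0, x):
    """Conjugate update of a normal-gamma prior NG(mu0, kappa0, alpha0, beta0)
    on (mu, h), where x_i ~ N(mu, 1/h). Standard textbook formulas; the
    prior parameters here are hypothetical."""
    x = np.asarray(x, dtype=float)
    n, xbar = len(x), x.mean()
    kappa_n = kappa0 + n
    mu_n = (kappa0 * mu0 + n * xbar) / kappa_n
    alpha_n = alpha0 + n / 2.0
    beta_n = (beta0 + 0.5 * ((x - xbar) ** 2).sum()
              + kappa0 * n * (xbar - mu0) ** 2 / (2.0 * kappa_n))
    return mu_n, kappa_n, alpha_n, beta_n

# Example: a vague prior updated with three measurements
mu_n, kappa_n, alpha_n, beta_n = normal_gamma_update(
    0.0, 1.0, 1.0, 1.0, [1.0, 2.0, 3.0])
```

    The desired reliability estimates can then be read off either the prior or this posterior, exactly as the abstract describes.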

  13. The use of low-calorie sweeteners is associated with self-reported prior intent to lose weight in a representative sample of US adults.

    Science.gov (United States)

    Drewnowski, A; Rehm, C D

    2016-03-07

    Low-calorie sweeteners (LCSs) are said to be a risk factor for obesity and diabetes. Reverse causality may be an alternative explanation. Data on LCS use, from a single 24-h dietary recall, for a representative sample of 22 231 adults were obtained from 5 cycles of the National Health and Nutrition Examination Survey (1999-2008 NHANES). Retrospective data on intent to lose or maintain weight during the prior 12-months and 10-year weight history were obtained from the weight history questionnaire. Objectively measured heights and weights were obtained from the examination. Primary analyses evaluated the association between intent to lose/maintain weight and use of LCSs and specific LCS product types using survey-weighted generalized linear models. We further evaluated whether body mass index (BMI) may mediate the association between weight loss intent and use of LCSs. The association between 10-year weight history and current LCS use was evaluated using restricted cubic splines. In cross-sectional analyses, LCS use was associated with a higher prevalence of obesity and diabetes. Adults who tried to lose weight during the previous 12 months were more likely to consume LCS beverages (prevalence ratio=1.64, 95% confidence interval (CI) 1.54-1.75), tabletop LCS (prevalence ratio=1.68, 95% CI 1.47-1.91) and LCS foods (prevalence ratio=1.93, 95% CI 1.60-2.33) as compared with those who did not. In mediation analyses, BMI only partially mediated the association between weight control history and the use of LCS beverages, tabletop LCS, but not LCS foods. Current LCS use was further associated with a history of prior weight change (for example, weight loss and gain). LCS use was associated with self-reported intent to lose weight during the previous 12 months. This association was only partially mediated by differences in BMI. Any inference of causality between attempts at weight control and LCS use is tempered by the cross-sectional nature of these data and retrospective

  14. Correlation of lithium levels between drinking water obtained from different sources and scalp hair samples of adult male subjects.

    Science.gov (United States)

    Baloch, Shahnawaz; Kazi, Tasneem Gul; Afridi, Hassan Imran; Baig, Jameel Ahmed; Talpur, Farah Naz; Arain, Muhammad Balal

    2017-10-01

    There is some evidence that natural levels of lithium (Li) in drinking water may have a protective effect on neurological health. In the present study, we evaluated the Li levels in drinking water of different origins and in bottled mineral water. To evaluate the association between lithium levels in drinking water and human health, scalp hair samples were collected from male subjects (25-45 years) from rural and urban areas of Sindh, Pakistan, who consumed drinking water obtained from ground water (GW), municipal treated water (MTW) or bottled mineral water (BMW). The water samples were pre-concentrated five- to tenfold at 60 °C using a temperature-controlled electric hot plate, while the scalp hair samples were oxidized by acid in a microwave oven, prior to determination by flame atomic absorption spectrometry. The Li content in the different types of drinking water, GW, MTW and BMW, was found in the range of 5.12-22.6, 4.2-16.7 and 0.0-16.3 µg/L, respectively. The Li concentration in the scalp hair of adult males consuming ground water was higher, ranging from 292 to 393 μg/kg, than in those drinking municipal treated water or bottled mineral water (212-268 and 145-208 μg/kg, respectively).

  15. Testability evaluation using prior information of multiple sources

    Directory of Open Access Journals (Sweden)

    Wang Chao

    2014-08-01

    Full Text Available Testability plays an important role in improving the readiness and decreasing the life-cycle cost of equipment. Testability demonstration and evaluation is of significance in measuring such testability indexes as fault detection rate (FDR) and fault isolation rate (FIR), which is useful to the producer in mastering the testability level and improving the testability design, and helpful to the consumer in making purchase decisions. Aiming at the problems with a small sample of testability demonstration test data (TDTD) such as low evaluation confidence and inaccurate result, a testability evaluation method is proposed based on the prior information of multiple sources and Bayes theory. Firstly, the types of prior information are analyzed. The maximum entropy method is applied to the prior information with the mean and interval estimate forms on the testability index to obtain the parameters of prior probability density function (PDF), and the empirical Bayesian method is used to get the parameters for the prior information with a success-fail form. Then, a parametrical data consistency check method is used to check the compatibility between all the sources of prior information and TDTD. For the prior information to pass the check, the prior credibility is calculated. A mixed prior distribution is formed based on the prior PDFs and the corresponding credibility. The Bayesian posterior distribution model is acquired with the mixed prior distribution and TDTD, based on which the point and interval estimates are calculated. Finally, examples of a flying control system are used to verify the proposed method. The results show that the proposed method is feasible and effective.

  16. Influence of feed provisioning prior to digesta sampling on precaecal amino acid digestibility in broiler chickens.

    Science.gov (United States)

    Siegert, Wolfgang; Ganzer, Christian; Kluth, Holger; Rodehutscord, Markus

    2018-06-01

    A regression approach was applied to determine the influence of feed provisioning prior to digesta sampling on precaecal (pc) amino acid (AA) digestibility in broiler chickens. Soybean meal was used as an example test ingredient. Five feed-provisioning protocols were investigated, four with restricted provision and one with ad libitum provision. When provision was restricted, feed was provided for 30 min after a withdrawal period of 12 h. Digesta were sampled 1, 2, 4 and 6 h after feeding commenced. A diet containing 300 g maize starch/kg was prepared. Half or all the maize starch was replaced with soybean meal in two other diets. Average pc digestibility of all determined AA in the soybean meal was 86% for the 4 and 6-h protocols and 66% and 60% for the 2 and 1-h protocols, respectively. Average pc AA digestibility of soybean meal was 76% for ad libitum feed provision. Feed provisioning also influenced the determined variance. Variance in digestibility ranked in magnitude 1 h > ad libitum > 2 h > 6 h > 4 h for all AA. Owing to the considerable influence of feed-provisioning protocols found in this study, comparisons of pc AA digestibility between studies applying different protocols prior to digesta sampling must be treated with caution. Digestibility experiments aimed at providing estimates for practical feed formulation should use feed-provisioning procedures similar to those used in practice.
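    The regression approach can be sketched in a few lines: precaecal digestible amino acid flow is regressed on amino acid intake from the test ingredient, and the slope estimates the ingredient's precaecal digestibility. The intake and flow values below are hypothetical, not the study's data:

```python
import numpy as np

# Three diets with increasing soybean-meal inclusion (hypothetical values)
intake = np.array([0.0, 0.9, 1.8])        # g AA intake from soybean meal
digested = np.array([0.05, 0.82, 1.60])   # g precaecal digested AA flow

# Slope of the regression line = fraction of the test-ingredient AA
# digested precaecally; the intercept absorbs basal (endogenous) flow.
slope, intercept = np.polyfit(intake, digested, 1)
digestibility_pct = 100.0 * slope         # slope expressed in %
```

    Because the intercept absorbs the basal flow, the slope isolates the contribution of the test ingredient, which is why the approach is robust to diet-independent losses.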

  17. New and conventional evaporative systems in concentrating nitrogen samples prior to isotope-ratio analysis

    International Nuclear Information System (INIS)

    Lober, R.W.; Reeder, J.D.; Porter, L.K.

    1987-01-01

    Studies were conducted to quantify and compare the efficiencies of various evaporative systems used in evaporating 15N samples prior to mass spectrometric analysis. Two new forced-air systems were designed and compared with a conventional forced-air system and with an open-air dry bath technique for effectiveness in preventing atmospheric contamination of evaporating samples. The forced-air evaporative systems significantly reduced the time needed to evaporate samples as compared to the open-air dry bath technique; samples were evaporated to dryness in 2.5 h with the forced-air systems as compared to 8 to 10 h on the open-air dry bath. The effectiveness of a given forced-air system to prevent atmospheric contamination of evaporating samples was significantly affected by the flow rate of the air stream flowing over the samples. The average atmospheric contaminant N found in samples evaporated on the open-air dry bath was 0.3 μg N, indicating very low concentrations of atmospheric NH3 during this study. However, in previous studies the authors have experienced significant contamination of 15N samples evaporated on an open-air dry bath because the level of contaminant N in the laboratory atmosphere varied and could not be adequately controlled. Average cross-contaminant levels of 0.28, 0.20, and 1.01 μg of N were measured between samples evaporated on the open-air dry bath, the newly-designed forced-air system, and the conventional forced-air system, respectively. The cross-contamination level is significantly higher on the conventional forced-air system than on the other two systems, and could significantly alter the atom % 15N of high-enriched, low [N] evaporating samples

  18. Sampling, feasibility, and priors in Bayesian estimation

    OpenAIRE

    Chorin, Alexandre J.; Lu, Fei; Miller, Robert N.; Morzfeld, Matthias; Tu, Xuemin

    2015-01-01

    Importance sampling algorithms are discussed in detail, with an emphasis on implicit sampling, and applied to data assimilation via particle filters. Implicit sampling makes it possible to use the data to find high-probability samples at relatively low cost, making the assimilation more efficient. A new analysis of the feasibility of data assimilation is presented, showing in detail why feasibility depends on the Frobenius norm of the covariance matrix of the noise and not on the number of va...

  19. Freezing fecal samples prior to DNA extraction affects the Firmicutes to Bacteroidetes ratio determined by downstream quantitative PCR analysis

    DEFF Research Database (Denmark)

    Bahl, Martin Iain; Bergström, Anders; Licht, Tine Rask

    2012-01-01

    Freezing stool samples prior to DNA extraction and downstream analysis is widely used in metagenomic studies of the human microbiota but may affect the inferred community composition. In this study, DNA was extracted either directly or following freeze storage of three homogenized human fecal...

  20. Freezing fecal samples prior to DNA extraction affects the Firmicutes to Bacteroidetes ratio determined by downstream quantitative PCR analysis

    DEFF Research Database (Denmark)

    Bahl, Martin Iain; Bergström, Anders; Licht, Tine Rask

    Freezing stool samples prior to DNA extraction and downstream analysis is widely used in metagenomic studies of the human microbiota but may affect the inferred community composition. In this study DNA was extracted either directly or following freeze storage of three homogenized human fecal...

  1. A nested sampling particle filter for nonlinear data assimilation

    KAUST Repository

    Elsheikh, Ahmed H.; Hoteit, Ibrahim; Wheeler, Mary Fanett

    2014-01-01

    ...The proposed nested sampling particle filter (NSPF) iteratively builds the posterior distribution by applying constrained sampling from the prior distribution to obtain particles in high-likelihood regions of the search space, resulting in a reduction

  2. Geochemical porosity values obtained in core samples from different clay-rocks

    International Nuclear Information System (INIS)

    Fernandez, A.M.

    2010-01-01

    Document available in extended abstract form only. Argillaceous formations of low permeability are considered in many countries as potential host rocks for the disposal of high level radioactive wastes (HLRW). In order to determine their suitability for waste disposal, evaluations of the hydro-geochemistry and transport mechanisms from such geologic formations to the biosphere must be undertaken. One of the key questions about radionuclide diffusion and retention is to know the chemical reactions and sorption processes that will occur in the rock and their effects on radionuclide mobility. In this context, knowledge of the pore water chemistry is essential for performance assessment purposes. This information makes it possible to establish a reliable model for the main water-rock interactions, which control the physico-chemical parameters and the chemistry of the major elements of the system. An important issue in modelling the pore water chemistry in clayey media is to determine the respective volume accessible to cations and anions, i.e., the amount of water actually available for chemical reactions/solute transport. This amount is usually referred to as accessible porosity or geochemical porosity. By using the anion inventories, i.e. the anion content obtained from aqueous leaching, and assuming that all Cl⁻, Br⁻ and SO₄²⁻ leached in the aqueous extracts originate from pore water, the concentration of a conservative ion can be converted into the real pore water concentration if the accessible porosity is known. In this work, the accessible porosity or geochemical porosity has been determined in core samples belonging to four different formations: Boom Clay from Hades URL (Belgium, BE), Opalinus Clay from Mont Terri (Switzerland, CH), and Callovo-Oxfordian argillite from Bure URL (France, FR). 
The geochemical or chloride porosity was defined as the ratio between the pore water volume containing Cl-bearing pore water and the total volume of a sample
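    The porosity definition above amounts to a simple mass balance; a minimal sketch with a hypothetical helper function and illustrative numbers (not values from the abstract):

```python
def chloride_porosity(cl_inventory_mg_per_kg, bulk_density_kg_per_m3,
                      cl_porewater_mg_per_L):
    """Geochemical (chloride-accessible) porosity: volume of Cl-bearing pore
    water per unit volume of rock. Illustrative mass balance: the leached Cl
    inventory per unit rock volume divided by the pore-water Cl concentration
    gives the volume of Cl-bearing water per unit rock volume."""
    cl_per_m3_rock = cl_inventory_mg_per_kg * bulk_density_kg_per_m3
    porewater_L_per_m3 = cl_per_m3_rock / cl_porewater_mg_per_L
    return porewater_L_per_m3 / 1000.0    # L -> m3, giving m3/m3

phi_cl = chloride_porosity(200.0, 2400.0, 4000.0)   # -> 0.12
```

    Conversely, if the geochemical porosity is known, the same balance converts a leachate Cl concentration back into the real pore-water concentration, which is how the abstract uses it.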

  3. Specific heat capacities of different clayey samples obtained by differential scanning calorimetry

    International Nuclear Information System (INIS)

    Fernandez, A.M.

    2012-01-01

    1600 °C. The furnace is made of a graphite tube and is cooled by water circulation. The temperature regulation is performed by an S-type Pt/Pt-Rh 10% thermocouple. The weighing module has a maximum capacity of 35 g, the balance being well suited for the analysis of samples ranging from micro-quantities to bulky and dense materials, maintaining a measuring resolution equivalent to a microgram whatever the mass analysed. The crucibles containing the samples can be of various materials and volumes: a) alumina (up to 1750 °C), b) platinum (up to 1750 °C), and c) aluminium (up to 500 °C). The specific heat capacity was determined using the stepwise method with sapphire as reference material. Prior to performing the tests, the equipment was calibrated in temperature using aluminium crucibles and a DSC plate rod transducer. Eight tests were carried out with four standard elements (In, Sn, Pb, and Zn) at two scan rates: 5 and 10 °C/min. The heat flow (HF) signal correction, or calorimetric sensitivity, was obtained for each test using sapphire as reference material. The selected clayey materials used for the Cp determination were: a) FEBEX bentonite (92 wt% di-octahedral Ca-Mg smectite) from Serrata de Nijar (Almeria, Spain); b) MX-80 bentonite (85 wt% di-octahedral Na-smectite) from Wyoming (USA); c) Ibeco RWC 16 (82% di-octahedral Ca-Mg smectite) from Milos (Greece); d) Opalinus Clay from a core sample of the borehole BHT-1 (Mont Terri, Switzerland); e) Callovo-Oxfordian clay-rock from core samples of the boreholes PAC-1002 and EPT1201 (Meuse/Haute-Marne URL, France); f) MX-80 bentonite pellets from the HE-E experiment at Mont Terri (Switzerland); and g) 65:35 sand:MX-80 mixture material from the HE-E experiment at Mont Terri (Switzerland). The tests were performed from 7 to 300 °C. After several analyses with sapphire, the step method and a scan rate of 20 °C/min were selected to carry out the experiments. Prior to performing the experiments, the samples were dried at 110 °C

  4. Separation and enrichment of gold(III) from environmental samples prior to its flame atomic absorption spectrometric determination

    International Nuclear Information System (INIS)

    Senturk, Hasan Basri; Gundogdu, Ali; Bulut, Volkan Numan; Duran, Celal; Soylak, Mustafa; Elci, Latif; Tufekci, Mehmet

    2007-01-01

    A simple and accurate method was developed for the separation and enrichment of trace levels of gold in environmental samples. The method is based on the adsorption of the Au(III)-diethyldithiocarbamate complex on Amberlite XAD-2000 resin prior to the analysis of gold by flame atomic absorption spectrometry after elution with 1 mol L⁻¹ HNO₃ in acetone. Parameters including nitric acid concentration, eluent type, matrix ions, sample volume, sample flow rate and adsorption capacity were investigated with respect to the recovery of gold(III). The recovery of gold(III) was greater than 95%, and the detection limit was 16.6 μg L⁻¹. The preconcentration factor was 200. The relative standard deviation of the method was <6%. The adsorption capacity of the resin was 12.3 mg g⁻¹. The validation of the presented procedure was checked by the analysis of CRM-SA-C Sandy Soil certified reference material. The presented procedure was applied to the determination of gold in some environmental samples.

  5. Cost-constrained optimal sampling for system identification in pharmacokinetics applications with population priors and nuisance parameters.

    Science.gov (United States)

    Sorzano, Carlos Oscars S; Pérez-De-La-Cruz Moreno, Maria Angeles; Burguet-Castell, Jordi; Montejo, Consuelo; Ros, Antonio Aguilar

    2015-06-01

    Pharmacokinetics (PK) applications can be seen as a special case of nonlinear, causal systems with memory. There are cases in which prior knowledge exists about the distribution of the system parameters in a population. However, for a specific patient in a clinical setting, we need to determine her system parameters so that the therapy can be personalized. This system identification is performed many times by measuring drug concentrations in plasma. The objective of this work is to provide an irregular sampling strategy that minimizes the uncertainty about the system parameters with a fixed amount of samples (cost constrained). We use Monte Carlo simulations to estimate the average Fisher's information matrix associated to the PK problem, and then estimate the sampling points that minimize the maximum uncertainty associated to system parameters (a minimax criterion). The minimization is performed employing a genetic algorithm. We show that such a sampling scheme can be designed in a way that is adapted to a particular patient and that it can accommodate any dosing regimen as well as it allows flexible therapeutic strategies. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
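    The design strategy described above, a Monte Carlo average of the Fisher information over the population prior, minimized under a minimax criterion, can be sketched for a hypothetical one-compartment oral-absorption model. Here scipy's differential_evolution stands in for the paper's genetic algorithm, and all model values are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(1)
D, sigma = 100.0, 0.1     # dose and additive measurement noise (assumed)

def conc(t, ka, ke, V):
    """Hypothetical one-compartment oral model, parameters theta = (ka, ke, V)."""
    return D * ka / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

def sensitivities(t, theta, eps=1e-5):
    """Numerical d(conc)/d(theta); rows = sampling times, columns = parameters."""
    S = np.empty((len(t), len(theta)))
    for j in range(len(theta)):
        up, dn = np.array(theta), np.array(theta)
        up[j] += eps
        dn[j] -= eps
        S[:, j] = (conc(t, *up) - conc(t, *dn)) / (2.0 * eps)
    return S

# Population prior: log-normal scatter around typical (ka, ke, V) values
pop = np.exp(rng.normal(np.log([1.2, 0.25, 10.0]), 0.2, size=(50, 3)))

def worst_case_variance(times):
    """Monte Carlo average Fisher information over the population, then the
    minimax criterion: the largest Cramer-Rao variance bound over parameters."""
    t = np.sort(times)
    fims = []
    for th in pop:
        S = sensitivities(t, th)
        fims.append(S.T @ S)
    fim = np.mean(fims, axis=0) / sigma**2
    if np.linalg.cond(fim) > 1e12:
        return 1e12                      # penalize degenerate designs
    return np.max(np.diag(np.linalg.inv(fim)))

# Evolutionary global search (stand-in for a genetic algorithm) for
# four sampling times within (0.1, 24) hours
res = differential_evolution(worst_case_variance, bounds=[(0.1, 24.0)] * 4,
                             seed=2, maxiter=25, popsize=10)
best_times = np.sort(res.x)
```

    The cost constraint enters through the fixed number of sampling times (four here); changing the dosing regimen only changes conc(), so the same search accommodates other therapeutic strategies.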

  6. Difficulties in obtaining representative samples for compliance with the Ballast Water Management Convention

    Digital Repository Service at National Institute of Oceanography (India)

    Carney, K.J; Basurko, O.C; Pazouki, K.; Marsham, S.; Delany, J; Desai, D.V.; Anil, A.C; Mesbahi, E.

    water, the shape, size and number of ballast tanks and the heterogeneous distribution of organisms within tanks. These factors hinder efforts to obtain samples that truly represent the total ballast water onboard a vessel. A known cell density...

  7. Effects of physical and chemical heterogeneity on water-quality samples obtained from wells

    Science.gov (United States)

    Reilly, Thomas E.; Gibs, Jacob

    1993-01-01

    Factors that affect the mass of chemical constituents entering a well include the distributions of flow rate and chemical concentrations along and near the screened or open section of the well. Assuming a layered porous medium (with each layer being characterized by a uniform hydraulic conductivity and chemical concentration), a knowledge of the flow from each layer along the screened zone and of the chemical concentrations in each layer enables the total mass entering the well to be determined. Analyses of hypothetical systems and a site at Galloway, NJ, provide insight into the temporal variation of water-quality data observed when withdrawing water from screened wells in heterogeneous ground-water systems.The analyses of hypothetical systems quantitatively indicate the cause-and-effect relations that cause temporal variability in water samples obtained from wells. Chemical constituents that have relatively uniform concentrations with depth may not show variations in concentrations in the water discharged from a well after the well is purged (evacuation of standing water in the well casing). However, chemical constituents that do not have uniform concentrations near the screened interval of the well may show variations in concentrations in the well discharge water after purging because of the physics of ground-water flow in the vicinity of the screen.Water-quality samples were obtained through time over a 30 minute period from a site at Galloway, NJ. The water samples were analyzed for aromatic hydrocarbons, and the data for benzene, toluene, and meta+para xylene were evaluated for temporal variations. Samples were taken from seven discrete zones, and the flow-weighted concentrations of benzene, toluene, and meta+para xylene all indicate an increase in concentration over time during pumping. These observed trends in time were reproduced numerically based on the estimated concentration distribution in the aquifer and the flow rates from each zone.The results of

  8. Separation and enrichment of gold(III) from environmental samples prior to its flame atomic absorption spectrometric determination

    Energy Technology Data Exchange (ETDEWEB)

    Senturk, Hasan Basri; Gundogdu, Ali [Department of Chemistry, Faculty of Arts and Sciences, Karadeniz Technical University, 61080 Trabzon (Turkey); Bulut, Volkan Numan [Department of Chemistry, Faculty of Arts and Sciences, Karadeniz Technical University, 28049 Giresun (Turkey); Duran, Celal [Department of Chemistry, Faculty of Arts and Sciences, Karadeniz Technical University, 61080 Trabzon (Turkey); Soylak, Mustafa [Department of Chemistry, Faculty of Arts and Sciences, Erciyes University, 38039 Kayseri (Turkey)], E-mail: soylak@erciyes.edu.tr; Elci, Latif [Department of Chemistry, Faculty of Arts and Sciences, Pamukkale University, 20020 Denizli (Turkey); Tufekci, Mehmet [Department of Chemistry, Faculty of Arts and Sciences, Karadeniz Technical University, 61080 Trabzon (Turkey)

    2007-10-22

    A simple and accurate method was developed for the separation and enrichment of trace levels of gold in environmental samples. The method is based on the adsorption of the Au(III)-diethyldithiocarbamate complex on Amberlite XAD-2000 resin, followed by elution with 1 mol L{sup -1} HNO{sub 3} in acetone and determination of gold by flame atomic absorption spectrometry. The effects of several parameters, including nitric acid concentration, eluent type, matrix ions, sample volume, sample flow rate and adsorption capacity, on the recovery of gold(III) were investigated. The recovery of gold(III) was greater than 95%, and the detection limit was 16.6 {mu}g L{sup -1}. The preconcentration factor was 200. The relative standard deviation of the method was <6%. The adsorption capacity of the resin was 12.3 mg g{sup -1}. The presented procedure was validated by analysis of the CRM-SA-C Sandy Soil certified reference material and was applied to the determination of gold in several environmental samples.
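    The reported preconcentration factor of 200 is, by the usual convention, the ratio of the sample volume loaded onto the column to the final eluate volume; the volumes in this sketch are hypothetical choices that reproduce that factor:

```python
# Preconcentration (enrichment) factor of a solid-phase extraction step,
# commonly taken as the ratio of loaded sample volume to final eluate
# volume. The volumes below are hypothetical but chosen to reproduce the
# factor of 200 reported above.

def preconcentration_factor(sample_volume_ml, eluate_volume_ml):
    return sample_volume_ml / eluate_volume_ml

print(preconcentration_factor(1000.0, 5.0))  # 200.0
```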

  9. No upregulation of digitalis glycoside receptor (Na,K-ATPase) concentration in human heart left ventricle samples obtained at necropsy after long term digitalisation.

    Science.gov (United States)

    Schmidt, T A; Holm-Nielsen, P; Kjeldsen, K

    1991-08-01

    The aim was to evaluate the hypothesis that digitalis glycosides increase the concentration of their specific receptor (Na,K-ATPase) in human myocardial tissue, thereby possibly reducing the inotropic effect of long term digitalis treatment. Intact samples of left ventricle were obtained at necropsy from patients who had been on long term treatment with digoxin and from patients not previously given digoxin. Digitalis glycoside receptors were quantified using vanadate facilitated 3H-ouabain binding before and after washing samples in buffer containing excess digoxin antibody fragments for 16 h at 30 degrees C. This washing procedure has previously been shown to reduce prior specific digoxin binding in human left ventricle by 95% and to allow subsequent vanadate facilitated complete quantification of 3H-ouabain binding sites. Here it was performed to reduce the occupancy of digitalis glycoside receptors by digoxin, caused by digitalisation, before 3H-ouabain binding. Eleven patients who had been on long term treatment with digoxin and eight who had not previously been given digoxin were studied. Left ventricle samples were obtained at necropsy around 15 h after death. Standard 3H-ouabain binding was 39% lower in samples from digitalised than from undigitalised subjects (p less than 0.001). Washing samples in buffer containing excess digoxin antibody fragments induced an increase in 3H-ouabain binding from 174(SEM 10) to 265(20) pmol.g-1 wet weight (n = 11, p less than 0.001) in samples from digitalised patients. After washing, the digitalis glycoside receptor concentration in left ventricle samples from patients exposed to digoxin showed a tendency toward a lower value (14%, p greater than 0.10) compared with left ventricle samples from individuals unexposed to digitalis glycoside treatment. Calculating 3H-ouabain binding relative to dry ventricular muscle weight confirmed the results obtained using wet weight as reference.
The results suggest that digoxin treatment in

  10. Green approaches in sample preparation of bioanalytical samples prior to chromatographic analysis.

    Science.gov (United States)

    Filippou, Olga; Bitas, Dimitrios; Samanidou, Victoria

    2017-02-01

    Sample preparation is considered the most challenging step of the analytical procedure, since it affects the whole analytical methodology and therefore contributes significantly to the greenness, or lack thereof, of the entire process. Eliminating sample treatment steps, reducing the amount of sample required, sharply cutting the consumption of hazardous reagents and energy, avoiding large volumes of organic solvents, and maximizing safety for operators and the environment form the basis for greening sample preparation and analytical methods. In the last decade, the development and utilization of greener and sustainable microextraction techniques has emerged as an alternative to classical sample preparation procedures. In this review, the main green microextraction techniques (solid-phase microextraction, stir bar sorptive extraction, hollow-fiber liquid-phase microextraction, dispersive liquid-liquid microextraction, etc.) are presented, with special attention to bioanalytical applications of these environment-friendly sample preparation techniques, which comply with the green analytical chemistry principles. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. Graphene for separation and preconcentration of trace amounts of cobalt in water samples prior to flame atomic absorption spectrometry

    Directory of Open Access Journals (Sweden)

    Yukun Wang

    2016-09-01

    Full Text Available A new sensitive and simple method was developed for the preconcentration of trace amounts of cobalt (Co) using 1-(2-pyridylazo)-2-naphthol (PAN) as a chelating reagent prior to its determination by flame atomic absorption spectrometry. The proposed method is based on a column packed with graphene as the sorbent. Several parameters affecting the extraction and complex formation were selected and optimized. Under optimum conditions, the calibration graph was linear in the concentration range of 5.0–240.0 μg L−1 with a detection limit of 0.36 μg L−1. The relative standard deviations for ten replicate measurements of 20.0 and 100.0 μg L−1 Co were 3.45% and 3.18%, respectively. Comparative studies showed that graphene is superior to other adsorbents, including C18 silica, graphitic carbon, and single- and multi-walled carbon nanotubes, for the extraction of Co. The proposed method was successfully applied to the analysis of four real environmental water samples. Good spiked recoveries over the range of 95.8–102.6% were obtained.
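    Calibration and detection-limit figures of this kind follow from a standard linear fit. The sketch below uses hypothetical absorbance readings and the common 3-sigma-of-blank convention for the LOD, which may differ from the cited work's exact definition:

```python
# Linear FAAS calibration and a 3-sigma detection limit, stdlib only.
# Absorbance and blank values below are hypothetical; the LOD convention
# (3 x blank standard deviation / slope) is the common one.
import statistics

def ols(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

conc = [5.0, 50.0, 100.0, 180.0, 240.0]     # ug/L Co standards
absb = [0.004, 0.040, 0.081, 0.143, 0.192]  # hypothetical absorbances
slope, intercept = ols(conc, absb)

blank_sd = statistics.stdev([0.0003, 0.0004, 0.0002, 0.0005, 0.0003])
lod = 3 * blank_sd / slope                   # ug/L in the measured solution
print(round(slope, 5), round(lod, 2))
```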

  12. Bayesian Analysis of two Censored Shifted Gompertz Mixture Distributions using Informative and Noninformative Priors

    Directory of Open Access Journals (Sweden)

    Tabassum Naz Sindhu

    2017-03-01

    Full Text Available This study deals with the Bayesian analysis of a shifted Gompertz mixture model under type-I censored samples, assuming both informative and noninformative priors. We discuss Bayesian estimation of the parameters of the shifted Gompertz mixture model under uniform and gamma priors, assuming three loss functions. Further, some properties of the model, with graphs of the mixture density, are discussed. These properties include Bayes estimators, posterior risks, and the reliability function under a simulation scheme. Bayes estimates are obtained for two cases: (a) when the shape parameter is known and (b) when all parameters are unknown. We analyzed simulated data sets in order to investigate the effects of prior belief and loss functions and the performance of the proposed estimators of the mixture model parameters.
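    The connection between loss functions and Bayes estimators can be illustrated with posterior draws: squared-error loss yields the posterior mean, and absolute-error loss the posterior median. The toy normal draws below merely stand in for the shifted Gompertz posterior, which has no closed form:

```python
# Bayes estimators from posterior draws under two common loss functions:
# squared-error loss -> posterior mean; absolute-error loss -> posterior
# median. Toy normal draws stand in for the actual mixture-model posterior.
import random
import statistics

random.seed(1)
draws = [random.gauss(2.0, 0.5) for _ in range(20000)]

bayes_sel = statistics.mean(draws)      # Bayes estimate, squared-error loss
bayes_abs = statistics.median(draws)    # Bayes estimate, absolute-error loss
risk_sel = statistics.pvariance(draws)  # posterior risk under squared-error loss
print(round(bayes_sel, 2), round(bayes_abs, 2), round(risk_sel, 2))
```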

  13. Prospects of obtaining samples of bottom sediments from subglacial lake Vostok

    Directory of Open Access Journals (Sweden)

    Н. И. Васильев

    2017-04-01

    Full Text Available The paper substantiates the timeliness of obtaining and examining bottom sediments from subglacial Lake Vostok. The predicted geological section of Lake Vostok and the information value of its bottom sediments are examined. Strict requirements on the environmental security of lake examinations and bottom-sediment sampling rule out the use of conventional drilling technologies, as they would pollute the lake with injection liquid from the borehole. To sample bottom sediments from the subglacial lake, it is proposed to use a dynamically balanced tool string, which enables rotary drilling without any external support on the borehole walls to transmit counter-torque. A theoretical analysis has been carried out to assess the operation of the tool string, a two-mass oscillatory electromechanical system of reciprocating and rotating motion (RRM) with two degrees of freedom.

  14. Characterization of the Three Mile Island Unit-2 reactor building atmosphere prior to the reactor building purge

    International Nuclear Information System (INIS)

    Hartwell, J.K.; Mandler, J.W.; Duce, S.W.; Motes, B.G.

    1981-05-01

    The Three Mile Island Unit-2 reactor building atmosphere was sampled prior to the reactor building purge. Samples of the containment atmosphere were obtained using specialized sampling equipment installed through penetration R-626 at the 358-foot (109-meter) level of the TMI-2 reactor building. The samples were subsequently analyzed for radionuclide concentrations and for gaseous molecular components (O2, N2, etc.) by two independent laboratories at the Idaho National Engineering Laboratory (INEL). The sampling procedures, analysis methods, and results are summarized.

  15. Statistical evaluation of the data obtained from the K East Basin Sandfilter Backwash Pit samples

    International Nuclear Information System (INIS)

    Welsh, T.L.

    1994-01-01

    Samples were obtained from different locations in the K East Sandfilter Backwash Pit to characterize the sludge material. These samples were analyzed chemically for elements, radionuclides, and residual compounds. The analytical results were statistically analyzed to determine the mean analyte content and the associated variability for each mean value.
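    The statistical summary described (mean analyte content plus the associated variability) can be sketched with the Python statistics module; the replicate values below are hypothetical:

```python
# Per-analyte mean and variability from replicate sludge samples.
# The analytes and replicate results below are hypothetical.
import statistics

measurements = {  # analyte -> replicate results (hypothetical units)
    "uranium":   [12.1, 11.8, 12.6, 12.3],
    "cesium137": [4.2, 4.8, 4.5, 4.1],
}

for analyte, values in sorted(measurements.items()):
    mean = statistics.mean(values)
    sd = statistics.stdev(values)  # sample standard deviation
    rsd = 100 * sd / mean          # relative standard deviation, percent
    print(f"{analyte}: mean={mean:.2f} sd={sd:.2f} rsd={rsd:.1f}%")
```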

  16. Chromosomal differences between acute nonlymphocytic leukemia in patients with prior solid tumors and prior hematologic malignancies. A study of 14 cases with prior breast cancer

    International Nuclear Information System (INIS)

    Mamuris, Z.; Dumont, J.; Dutrillaux, B.; Aurias, A.

    1989-01-01

    A cytogenetic study of 14 patients with secondary acute nonlymphocytic leukemia (S-ANLL) after prior treatment for breast cancer is reported. The chromosomes recurrently involved in numerical or structural anomalies are chromosomes 7, 5, 17, and 11, in decreasing order of frequency. The distribution of the anomalies detected in this sample of patients is similar to that observed in published cases with prior breast or other solid tumors, although anomalies of chromosome 11 had not previously been pointed out; it differs significantly from that of S-ANLL with prior hematologic malignancies. This difference is principally due to a higher involvement of chromosome 7 in patients with prior hematologic malignancies and of chromosomes 11 and 17 in patients with prior solid tumors. A genetic determinism involving abnormal recessive alleles located on chromosomes 5, 7, 11, and 17, uncovered by deletions of the normal homologs, may be a cause of S-ANLL. The difference between patients with prior hematologic malignancies and those with prior solid tumors may be explained by different constitutional mutations of recessive genes in the two groups of patients.

  17. Magnetic properties of ultrafine-grained cobalt samples obtained from consolidated nanopowders

    Energy Technology Data Exchange (ETDEWEB)

    Fellah, F.; Cherif, S.M.; Schoenstein, F.; Jouini, N.; Dirras, G. [LSPM (CNRS-UPR 3407), Universite Paris 13, 99 av. J.B. Clement, 93430 Villetaneuse (France); Bouziane, K. [Department of Physics, College of Science, Sultan Qaboos University, P.O. Box 36, Al-Khodh 123 (Oman)

    2011-08-15

    Co powders with nominal average particle sizes of 50 and 240 nm were synthesized using a polyol method and then consolidated by hot isostatic pressing (HIP) or the emerging spark plasma sintering (SPS) compaction process. Bulk polycrystalline aggregates were obtained, with average grain sizes of about 200 and 300 nm, respectively. Both the nanoparticles and the consolidated samples exhibit soft ferromagnetic behavior. The magnetization reversal likely occurs by a nucleation/propagation process, although a curling process can be involved in the reversal of the smaller particles. For the consolidated samples, dynamic measurements yield magnetic parameters corresponding to bulk cobalt with vanishing anisotropy. The contribution of the intergranular region is found to be negligible. We infer that the consolidation routes used ensure good magnetic interfacial contact between the particles. (Copyright © 2011 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  18. Data analysis of a polycrystalline nickel sample obtained with neutron diffraction

    International Nuclear Information System (INIS)

    Parente, C.B.R.; Mazzochi, V.L.; Araujo Kaschny, J.R. de; Costa, M.S. da; Rizzo, P.; Campos, W.M.S.

    1990-01-01

    A simple analysis of the nickel structure was made using neutron diffraction data obtained with polycrystalline nickel placed in a cylindrical sample holder 1.5 cm in diameter and 5 cm in height. The theoretical intensities were calculated in three ways: (1) without considering the temperature and absorption effects, (2) considering only the temperature effect, and (3) considering both the temperature and absorption effects. The disagreement factors found in these three cases were, respectively, R1 = 13.4%, R2 = 7.7%, and R3 = 7.5%. (L.C.)
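    A disagreement (reliability) factor of the kind quoted above compares observed and calculated intensities. The sketch below uses the common definition R = Σ|I_obs − I_calc| / Σ I_obs with hypothetical intensities; the cited work's exact definition may differ:

```python
# Disagreement (reliability) factor between observed and calculated
# diffraction intensities, using the common definition
#   R = sum(|I_obs - I_calc|) / sum(I_obs).
# The intensity values below are hypothetical.

def disagreement_factor(i_obs, i_calc):
    return sum(abs(o - c) for o, c in zip(i_obs, i_calc)) / sum(i_obs)

i_obs = [100.0, 42.0, 18.0, 20.0, 5.0]   # observed reflection intensities
i_calc = [93.0, 45.0, 16.0, 22.0, 4.0]   # e.g. with temperature correction
r = disagreement_factor(i_obs, i_calc)
print(f"R = {100 * r:.1f}%")             # R = 8.1%
```

    Improving the model (here, adding temperature and absorption corrections) lowers R, which is the trend the three reported values show.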

  19. Evaluating a Collaborative Approach to Improve Prior Authorization Efficiency in the Treatment of Hepatitis C Virus.

    Science.gov (United States)

    Dunn, Emily E; Vranek, Kathryn; Hynicka, Lauren M; Gripshover, Janet; Potosky, Darryn; Mattingly, T Joseph

    A team-based approach to obtaining prior authorization approval was implemented utilizing a specialty pharmacy, a clinic-based pharmacy technician specialist, and a registered nurse who worked with providers to obtain approval for medications for hepatitis C virus (HCV) infection. The objective of this study was to evaluate the time to approval for prescribed treatment of HCV infection. A retrospective observational study was conducted of patients treated for HCV infection by clinic providers who received at least 1 oral direct-acting antiviral HCV medication. Patients were divided into 2 groups, based on whether they were treated before or after the implementation of the team-based approach. Student t tests were used to compare average wait times before and after the intervention. The sample included 180 patients: 68 treated before the intervention and 112 who initiated therapy after it. All patients sampled required prior authorization approval by a third-party payer to begin therapy. There was a statistically significant reduction (P = .02) in average wait time in the postintervention group (15.6 ± 12.1 days) once adjusted using dates of approval. Pharmacy collaboration may increase the efficiency of provider prior authorization practices and reduce the wait time for patients to begin treatment.
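    A two-sample t statistic of the kind used in the study can be computed with the standard library alone. The wait-time data below are simulated (the pre-intervention mean is a hypothetical value; only the post-intervention 15.6 ± 12.1 days is reported above), and Welch's unequal-variance form is shown, whereas the study used Student t tests:

```python
# Welch's two-sample t statistic for comparing average prior-authorization
# wait times before and after an intervention. Data are simulated; sample
# sizes match the study (68 pre, 112 post), the pre mean is hypothetical.
import math
import random
import statistics

random.seed(7)
pre = [max(0.0, random.gauss(25.0, 12.0)) for _ in range(68)]
post = [max(0.0, random.gauss(15.6, 12.1)) for _ in range(112)]

def welch_t(a, b):
    """Welch's two-sample t statistic (unequal variances)."""
    se = math.sqrt(statistics.variance(a) / len(a) +
                   statistics.variance(b) / len(b))
    return (statistics.mean(a) - statistics.mean(b)) / se

t_stat = welch_t(pre, post)
print(f"t = {t_stat:.2f}")  # values well above ~2 suggest a real reduction
```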

  20. Two sample Bayesian prediction intervals for order statistics based on the inverse exponential-type distributions using right censored sample

    Directory of Open Access Journals (Sweden)

    M.M. Mohie El-Din

    2011-10-01

    Full Text Available In this paper, two-sample Bayesian prediction intervals for order statistics (OS) are obtained. The prediction is based on a certain class of inverse exponential-type distributions using a right censored sample. A general class of prior density functions is used, and the predictive cumulative distribution function is obtained in the two-sample case. The class of inverse exponential-type distributions includes several important distributions, such as the inverse Weibull distribution, the inverse Burr distribution, the loglogistic distribution, the inverse Pareto distribution, and the inverse paralogistic distribution. Special cases of the inverse Weibull model, such as the inverse exponential model and the inverse Rayleigh model, are considered.

  1. Evaluation of endoscopically obtained duodenal biopsy samples from cats and dogs in an adapter-modified Ussing chamber

    Science.gov (United States)

    DeBiasio, John V.; Suchodolski, Jan S.; Newman, Shelley; Musch, Mark W.; Steiner, Jörg M.

    2014-01-01

    This study was conducted to evaluate an adapter-modified Ussing chamber for assessment of transport physiology in endoscopically obtained duodenal biopsies from healthy cats and dogs, as well as dogs with chronic enteropathies. Seventeen duodenal biopsies from five cats and 51 duodenal biopsies from 13 dogs were obtained. Samples were transferred into an adapter-modified Ussing chamber and sequentially exposed to various absorbagogues and secretagogues. Overall, 78.6% of duodenal samples obtained from cats responded to at least one compound. In duodenal biopsies obtained from dogs, the rate of overall response ranged from 87.5% (healthy individuals; n = 8) to 63.6% (animals exhibiting clinical signs of gastrointestinal disease and a histopathologically unremarkable duodenum; n = 15) and 32.1% (animals exhibiting clinical signs of gastrointestinal disease and moderate to severe histopathological lesions; n = 28). Detailed information regarding the magnitude and duration of the responses is provided. The adapter-modified Ussing chamber enables investigation of the absorptive and secretory capacity of endoscopically obtained duodenal biopsies from cats and dogs and has the potential to become a valuable research tool. The response of samples was correlated with histopathological findings. PMID:24378587

  2. Separation and Enrichment of Gold in Water, Geological and Environmental Samples by Solid Phase Extraction on Multiwalled Carbon Nanotubes Prior to its Determination by Flame Atomic Absorption Spectrometry.

    Science.gov (United States)

    Duran, Ali; Tuzen, Mustafa; Soylak, Mustafa

    2015-01-01

    This study proposes the application of multi-walled carbon nanotubes as a solid sorbent for the preconcentration of gold prior to its determination by flame atomic absorption spectrometry. Extraction was achieved using a glass column (15.0 cm in length and 1.0 cm in diameter). Quantitative recoveries were obtained in the pH range of 2.5-4.0; the elution step was carried out with 5.0 ml of 1.0 mol/L HNO3 in acetone. In the ligand-free study, variables such as pH, eluent type, sample volume, flow rates, and matrix effects were examined for the optimum recovery of gold ions. The gold ions could be preconcentrated by a factor of 150, and the LOD was determined to be 1.71 μg/L. To evaluate the accuracy of the developed method, addition-recovery tests were applied to tap water, mineral water, and seawater samples. Gold recovery studies were implemented using a wet digestion technique for mine and soil samples taken from various media, and the method was also applied to anodic slime samples taken from factories located in the Kayseri Industrial Zone of Turkey.

  3. Self-prior strategy for organ reconstruction in fluorescence molecular tomography.

    Science.gov (United States)

    Zhou, Yuan; Chen, Maomao; Su, Han; Luo, Jianwen

    2017-10-01

    The purpose of this study is to propose a strategy for organ reconstruction in fluorescence molecular tomography (FMT) without prior information from other imaging modalities, and thereby to overcome the high cost and ionizing radiation of the traditional structural prior strategy. The proposed strategy is designed as an iterative architecture to solve the inverse problem of FMT. In each iteration, a short-time Fourier transform (STFT) based algorithm is used to extract the self-prior information in the space-frequency energy spectrum, under the assumption that regions with higher fluorescence concentration have larger energy intensity; the cost function of the inverse problem is then modified by the self-prior information, and lastly an iterative Laplacian regularization algorithm is conducted to solve the updated inverse problem and obtain the reconstruction results. Simulations and in vivo experiments on liver reconstruction were carried out to test the performance of the self-prior strategy on organ reconstruction. The organ reconstruction results obtained by the proposed self-prior strategy are closer to the ground truth than those obtained by the iterative Tikhonov regularization (ITKR) method (the traditional non-prior strategy). Significant improvements are shown in the evaluation indexes of relative locational error (RLE), relative error (RE), and contrast-to-noise ratio (CNR). The self-prior strategy improves the organ reconstruction results compared with the non-prior strategy and also overcomes the shortcomings of the traditional structural prior strategy. Various applications such as metabolic imaging and pharmacokinetic studies can be aided by this strategy.

  4. Determination of some organophosphorus pesticides in water and watermelon samples by microextraction prior to high-performance liquid chromatography.

    Science.gov (United States)

    Wang, Chun; Wu, Qiuhua; Wu, Chunxia; Wang, Zhi

    2011-11-01

    A novel method based on simultaneous liquid-liquid microextraction and carbon nanotube reinforced hollow fiber microporous membrane solid-liquid phase microextraction has been developed for the determination of six organophosphorus pesticides, i.e. isocarbophos, phosmet, parathion-methyl, triazophos, fonofos and phoxim, in water and watermelon samples prior to high-performance liquid chromatography (HPLC). Under the optimum conditions, the method shows good linearity within a range of 1-200 ng/mL for water samples and 5-200 ng/g for watermelon samples, with correlation coefficients (r) varying from 0.9990 to 0.9997 and from 0.9986 to 0.9995, respectively. The limits of detection (LODs) were in the range of 0.1-0.3 ng/mL for water samples and 1.0-1.5 ng/g for watermelon samples. The recoveries of the method at spiking levels of 5.0 and 50.0 ng/mL for water samples were between 85.4 and 100.8%, and at spiking levels of 5.0 and 50.0 ng/g for watermelon samples between 82.6 and 92.4%, with relative standard deviations (RSDs) of 4.5-6.9% and 5.2-7.4%, respectively. The results suggest that the developed method offers a simple, low-cost procedure with high analyte preconcentration and excellent sample cleanup for the determination of organophosphorus pesticides in water and watermelon samples. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
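    The recovery and RSD figures of merit reported above are straightforward to compute from replicate spiked measurements; the replicate values below are hypothetical:

```python
# Spike recovery and relative standard deviation, the two figures of merit
# reported for the microextraction method above. Replicates are hypothetical.
import statistics

def recovery_percent(spiked_results, unspiked_mean, spike_level):
    """Mean found concentration, corrected for the unspiked background,
    expressed as a percentage of the amount spiked in."""
    found = statistics.mean(spiked_results) - unspiked_mean
    return 100.0 * found / spike_level

replicates = [4.41, 4.72, 4.55, 4.18, 4.64]  # ng/mL found in spiked water
rec = recovery_percent(replicates, unspiked_mean=0.0, spike_level=5.0)
rsd = 100 * statistics.stdev(replicates) / statistics.mean(replicates)
print(f"recovery = {rec:.1f}%  RSD = {rsd:.1f}%")
```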

  5. Novel bayes factors that capture expert uncertainty in prior density specification in genetic association studies.

    Science.gov (United States)

    Spencer, Amy V; Cox, Angela; Lin, Wei-Yu; Easton, Douglas F; Michailidou, Kyriaki; Walters, Kevin

    2015-05-01

    Bayes factors (BFs) are becoming increasingly important tools in genetic association studies, partly because they provide a natural framework for including prior information. The Wakefield BF (WBF) approximation is easy to calculate and assumes a normal prior on the log odds ratio (logOR) with a mean of zero. However, the prior variance (W) must be specified. Because of the potentially high sensitivity of the WBF to the choice of W, we propose several new BF approximations with logOR ∼N(0,W), but allow W to take a probability distribution rather than a fixed value. We provide several prior distributions for W which lead to BFs that can be calculated easily in freely available software packages. These priors allow a wide range of densities for W and provide considerable flexibility. We examine some properties of the priors and BFs and show how to determine the most appropriate prior based on elicited quantiles of the prior odds ratio (OR). We show by simulation that our novel BFs have superior true-positive rates at low false-positive rates compared to those from both P-value and WBF analyses across a range of sample sizes and ORs. We give an example of utilizing our BFs to fine-map the CASP8 region using genotype data on approximately 46,000 breast cancer case and 43,000 healthy control samples from the Collaborative Oncological Gene-environment Study (COGS) Consortium, and compare the single-nucleotide polymorphism ranks to those obtained using WBFs and P-values from univariate logistic regression. © 2015 The Authors. Genetic Epidemiology published by Wiley Periodicals, Inc.
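    The WBF itself has a closed form: with a logOR estimate beta_hat of variance V and prior logOR ~ N(0, W), both marginal likelihoods are normal, giving BF = sqrt(V/(V+W)) * exp(z^2 W / (2(V+W))) in favor of association, where z = beta_hat/sqrt(V). Averaging this over a distribution on W, in the spirit of the paper, can be sketched with a discrete grid; the grid and the example estimate below are hypothetical:

```python
# Wakefield's approximate Bayes factor (association vs. null) for a logOR
# estimate beta_hat with variance v, under the prior logOR ~ N(0, w):
#   WBF = sqrt(v/(v+w)) * exp(z**2 * w / (2*(v+w))),  z = beta_hat/sqrt(v).
import math

def wakefield_bf(beta_hat, v, w):
    z2 = beta_hat ** 2 / v
    return math.sqrt(v / (v + w)) * math.exp(z2 * w / (2.0 * (v + w)))

beta_hat, v = 0.18, 0.0025  # hypothetical logOR estimate with SE 0.05 (z = 3.6)
fixed_w = wakefield_bf(beta_hat, v, 0.04)

# Treating W as uncertain: average the BF over a discrete grid of W values
# with equal weights (a hypothetical stand-in for the paper's priors on W).
grid = [0.01, 0.04, 0.09, 0.16]
averaged = sum(wakefield_bf(beta_hat, v, w) for w in grid) / len(grid)
print(round(fixed_w, 1), round(averaged, 1))
```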

  6. A NOVEL TECHNIQUE TO IMPROVE PHOTOMETRY IN CONFUSED IMAGES USING GRAPHS AND BAYESIAN PRIORS

    International Nuclear Information System (INIS)

    Safarzadeh, Mohammadtaher; Ferguson, Henry C.; Lu, Yu; Inami, Hanae; Somerville, Rachel S.

    2015-01-01

    We present a new technique for overcoming confusion noise in deep far-infrared Herschel space telescope images, making use of prior information from shorter wavelengths (λ < 2 μm). For the deepest images obtained by Herschel, the flux limit due to source confusion is about a factor of three brighter than the flux limit due to instrumental noise and (smooth) sky background. We have investigated the possibility of de-confusing simulated Herschel PACS 160 μm images by using strong Bayesian priors on the positions and weak priors on the fluxes of sources. We find the blended sources, group them together, and simultaneously fit their fluxes. We derive the posterior probability distribution function of fluxes subject to these priors through Markov chain Monte Carlo (MCMC) sampling by fitting the image. Assuming we can predict the FIR flux of sources based on the ultraviolet-optical part of their SEDs to within an order of magnitude, the simulations show that we can obtain reliable fluxes and uncertainties at least a factor of three fainter than the confusion noise limit of 3σ_c = 2.7 mJy in our simulated PACS-160 image. This technique could in principle be used to mitigate the effects of source confusion in any situation where one has prior information on the positions and plausible fluxes of blended sources. For Herschel, application of this technique will improve our ability to constrain the dust content in normal galaxies at high redshift.
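    In the linear-Gaussian limit (known positions, Gaussian flux priors, Gaussian pixel noise), simultaneously fitting the fluxes of blended sources has a closed-form MAP solution, which conveys the core of the scheme without the MCMC machinery. A one-dimensional toy sketch; the PSF, noise level, and priors are all hypothetical:

```python
# Simultaneous flux fitting of two blended sources with known positions
# (strong positional priors) and weak Gaussian flux priors: a linear-Gaussian
# toy version of the de-confusion scheme, solved via regularized normal
# equations instead of MCMC. All numbers are hypothetical.
import math

def psf(x, x0, sigma=2.0):
    """Gaussian point-spread profile (hypothetical instrument)."""
    return math.exp(-0.5 * ((x - x0) / sigma) ** 2)

xs = list(range(20))                              # pixel grid
true_flux = [10.0, 6.0]                           # fluxes to recover
centers = [8.0, 11.0]                             # blended: separation < PSF width
noise = [0.05 * math.sin(3.1 * x) for x in xs]    # stand-in for pixel noise
data = [true_flux[0] * psf(x, centers[0]) +
        true_flux[1] * psf(x, centers[1]) + n for x, n in zip(xs, noise)]

sigma_pix = 0.05                                  # pixel noise level
prior_mu, prior_var = [8.0, 8.0], 25.0            # weak Gaussian flux priors

a = [[psf(x, c) for c in centers] for x in xs]    # design matrix, one column/source

# MAP solution of (A^T A / s^2 + I/prior_var) f = A^T d / s^2 + mu/prior_var
n_pix, n_src = len(xs), len(centers)
m = [[sum(a[i][r] * a[i][c] for i in range(n_pix)) / sigma_pix ** 2 +
      (1.0 / prior_var if r == c else 0.0)
      for c in range(n_src)] for r in range(n_src)]
b = [sum(a[i][r] * data[i] for i in range(n_pix)) / sigma_pix ** 2 +
     prior_mu[r] / prior_var for r in range(n_src)]

det = m[0][0] * m[1][1] - m[0][1] * m[1][0]       # 2x2 solve by Cramer's rule
f0 = (b[0] * m[1][1] - m[0][1] * b[1]) / det
f1 = (m[0][0] * b[1] - m[1][0] * b[0]) / det
print(round(f0, 2), round(f1, 2))                 # recovers ~10 and ~6
```

    Because the data strongly constrain the fluxes here, the weak priors barely pull the solution toward their mean; with fainter sources or stronger blending the prior term dominates, which is the regime the paper targets.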

  7. Characterization of specimens obtained by different sampling methods for evaluation of periodontal bacteria.

    Science.gov (United States)

    Okada, Ayako; Sogabe, Kaoru; Takeuchi, Hiroaki; Okamoto, Masaaki; Nomura, Yoshiaki; Hanada, Nobuhiro

    2017-12-27

    Quantitative analysis of periodontal bacteria is considered useful for clinical diagnosis, evaluation and assessment of the risk of periodontal disease. The purpose of this study was to compare the effectiveness of sampling of saliva, supragingival and subgingival plaque for evaluation of periodontal bacteria. From each of 12 subjects, i) subgingival plaque was collected from the deepest pocket using a sterile paper point, ii) stimulated whole saliva was collected after chewing gum, and iii) supragingival plaque was collected using a tooth brush. These samples were sent to the medical examination laboratory for quantitative analysis of the counts of three periodontal bacterial species: Porphyromonas gingivalis, Treponema denticola, and Tannerella forsythia. The proportions of these bacteria relative to total bacteria were higher in subgingival plaque than in saliva or supragingival plaque, whereas the absolute counts were lower in subgingival plaque than in saliva or supragingival plaque. In several cases, periodontal bacteria were below the level of detection in subgingival plaque. We conclude that samples taken from subgingival plaque may be more useful than other samples for evaluating the proportion of periodontal bacteria in deep pockets. Therefore, for evaluation of periodontal bacteria, clinicians should consider the characteristics of the specimens obtained using different sampling methods.

  8. SIPPI: A Matlab toolbox for sampling the solution to inverse problems with complex prior information

    DEFF Research Database (Denmark)

    Hansen, Thomas Mejer; Cordua, Knud Skou; Looms, Majken Caroline

    2013-01-01

    We present an application of the SIPPI Matlab toolbox to obtain a sample from the a posteriori probability density function for the classical tomographic inversion problem. We consider a number of different forward models, linear and non-linear, such as ray-based forward models that rely

  9. Evaluation of Rock Powdering Methods to Obtain Fine-grained Samples for CHEMIN, a Combined XRD/XRF Instrument

    Science.gov (United States)

    Chipera, S. J.; Vaniman, D. T.; Bish, D. L.; Sarrazin, P.; Feldman, S.; Blake, D. F.; Bearman, G.; Bar-Cohen, Y.

    2004-01-01

    A miniature XRD/XRF (X-ray diffraction / X-ray fluorescence) instrument, CHEMIN, is currently being developed for definitive mineralogic analysis of soils and rocks on Mars. One of the technical issues that must be addressed to enable remote XRD analysis is how best to obtain a representative sample powder for analysis. For powder XRD analyses, it is beneficial to have a fine-grained sample to reduce preferred orientation effects and to provide a statistically significant number of crystallites to the X-ray beam. Although a two-dimensional detector as used in the CHEMIN instrument will produce good results even with poorly prepared powder, the quality of the data will improve and the time required for data collection will be reduced if the sample is fine-grained and randomly oriented. A variety of methods have been proposed for XRD sample preparation. Chipera et al. presented grain size distributions and XRD results from powders generated with an Ultrasonic/Sonic Driller/Corer (USDC) currently being developed at JPL. The USDC was shown to be an effective instrument for sampling rock to produce powder suitable for XRD. In this paper, we compare powder prepared using the USDC with powder obtained with a miniaturized rock crusher developed at JPL, with powder obtained with a rotary tungsten carbide bit, and with powder from a laboratory bench-scale Retsch mill (which provides benchmark mineralogical data). These comparisons will allow assessment of the suitability of these methods for analysis by an XRD/XRF instrument such as CHEMIN.

  10. Molecularly imprinted polymers for extraction of malachite green from fish samples prior to its determination by HPLC

    International Nuclear Information System (INIS)

    Li, Lu; Chen, Xiao-mei; Zhang, Hong-yuan; Lin, Yi-dong; Lin, Zheng-zhong; Huang, Zhi-yong; Lai, Zhu-zhi

    2015-01-01

    Molecularly imprinted polymer (MIP) particles for malachite green (MG) were prepared by emulsion polymerization using methacrylic acid as the functional monomer, ethylene glycol dimethacrylate as the cross-linker, and a combination of Span-80 and Tween-80 as the emulsifier. The MIP particles were characterized by SEM micrographs and FT-IR spectra. Their binding capacity for MG was evaluated based on kinetic and isothermal adsorption experiments and compared to non-imprinted polymer particles. Analytical figures of merit include an adsorption equilibrium time of 15 min, an adsorption capacity of 1.9 mg g -1 in acetonitrile-water (20:80), and an imprinting factor of 1.85. The MIP particles were successfully applied to the extraction of MG from fish samples spiked with MG and other interfering substances prior to the determination of MG by HPLC. Spiked samples gave recoveries of MG ranging from 86 to 104%, much higher than those of the other interfering substances. (author)

  11. A Bayesian Justification for Random Sampling in Sample Survey

    Directory of Open Access Journals (Sweden)

    Glen Meeden

    2012-07-01

    Full Text Available In the usual Bayesian approach to survey sampling, the sampling design plays a minimal role, at best. Although a close relationship between exchangeable prior distributions and simple random sampling has been noted, how to formally integrate simple random sampling into the Bayesian paradigm is not clear. Recently it has been argued that the sampling design can be thought of as part of a Bayesian's prior distribution. We show here that under this scenario simple random sampling can be given a Bayesian justification in survey sampling.
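The design discussed in this abstract can be made concrete with a short sketch (illustrative only, not from the paper): drawing a simple random sample without replacement from a finite population and forming the usual design-based mean estimate and its standard error.

```python
import numpy as np

# Illustrative sketch (not from the paper): a simple random sample
# drawn without replacement from a finite population, with the usual
# design-based mean estimate and its standard error.
rng = np.random.default_rng(42)
population = rng.normal(loc=50.0, scale=10.0, size=10_000)

n = 200
idx = rng.choice(population.size, size=n, replace=False)
sample = population[idx]

estimate = sample.mean()
fpc = 1 - n / population.size            # finite-population correction
se = np.sqrt(fpc * sample.var(ddof=1) / n)
```

Under exchangeability, the same sample mean also arises as the posterior expectation of the population mean, which is the connection the paper formalizes.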

  12. Use of alkaline or enzymatic sample pretreatment prior to characterization of gold nanoparticles in animal tissue by single-particle ICPMS

    DEFF Research Database (Denmark)

    Löschner, Katrin; Brabrand, Myung Suk Jung; Sloth, Jens Jørgen

    2014-01-01

    Not much is known about the applicability of spICPMS for determination of NPs in complex matrices such as biological tissues. In the present study, alkaline and enzymatic treatments were applied to solubilize spleen samples from rats which had been administered 60-nm gold nanoparticles (AuNPs) intravenously. The results showed that similar size distributions of AuNPs were obtained independent of the sample preparation method used. Furthermore, the quantitative results for AuNP mass concentration obtained with spICPMS following alkaline sample pretreatment coincided with results for total gold concentration obtained by conventional ICPMS analysis of acid-digested tissue. The recovery of AuNPs from enzymatically digested tissue, however, was approximately four times lower. Spiking experiments of blank spleen samples with AuNPs showed that the lower recovery was caused by an inferior transport...

  13. Detection and genotyping of human papillomavirus in self-obtained cervicovaginal samples by using the FTA cartridge: new possibilities for cervical cancer screening.

    Science.gov (United States)

    Lenselink, Charlotte H; de Bie, Roosmarie P; van Hamont, Dennis; Bakkers, Judith M J E; Quint, Wim G V; Massuger, Leon F A G; Bekkers, Ruud L M; Melchers, Willem J G

    2009-08-01

    This study assesses human papillomavirus (HPV) detection and genotyping in self-sampled genital smears applied to an indicating FTA elute cartridge (FTA cartridge). The study group consisted of 96 women, divided into two sample sets. All samples were analyzed by the HPV SPF(10)-Line Blot 25. Set 1 consisted of 45 women attending the gynecologist; all obtained a self-sampled cervicovaginal smear, which was applied to an FTA cartridge. HPV results were compared to a cervical smear (liquid based) taken by a trained physician. Set 2 consisted of 51 women who obtained a self-sampled cervicovaginal smear at home, which was applied to an FTA cartridge and to a liquid-based medium. DNA was obtained from the FTA cartridges by simple elution as well as extraction. Of all self-obtained samples of set 1, 62.2% tested HPV positive. The overall agreement between self- and physician-obtained samples was 93.3%, in favor of the self-obtained samples. In sample set 2, 25.5% tested HPV positive. The overall agreement for high-risk HPV presence between the FTA cartridge and liquid-based medium and between DNA elution and extraction was 100%. This study shows that HPV detection and genotyping in self-obtained cervicovaginal samples applied to an FTA cartridge is highly reliable. It shows a high level of overall agreement with HPV detection and genotyping in physician-obtained cervical smears and liquid-based self-samples. DNA can be obtained by simple elution and is therefore easy, cheap, and fast. Furthermore, the FTA cartridge is a convenient medium for collection and safe transport at ambient temperatures. Therefore, this method may contribute to a new way of cervical cancer screening.

  14. Form of prior for constrained thermodynamic processes with uncertainty

    Science.gov (United States)

    Aneja, Preety; Johal, Ramandeep S.

    2015-05-01

    We consider quasi-static thermodynamic processes with constraints, but with additional uncertainty about the control parameters. Motivated by inductive reasoning, we assign a prior distribution that provides a rational guess about likely values of the uncertain parameters. The priors are derived explicitly for both entropy-conserving and energy-conserving processes. The proposed form is useful when the constraint equation cannot be treated analytically. The inference is performed using spin-1/2 systems as models for heat reservoirs. Analytical results are derived in the high-temperature limit. An agreement beyond linear response is found between the estimates of thermal quantities and their optimal values obtained from extremum principles. We also seek an intuitive interpretation for the prior and the estimated value of temperature obtained therefrom. We find that the prior over temperature becomes uniform over the quantity kept conserved in the process.

  15. Example-driven manifold priors for image deconvolution.

    Science.gov (United States)

    Ni, Jie; Turaga, Pavan; Patel, Vishal M; Chellappa, Rama

    2011-11-01

    Image restoration methods that exploit prior information about images to be estimated have been extensively studied, typically using the Bayesian framework. In this paper, we consider the role of prior knowledge of the object class in the form of a patch manifold to address the deconvolution problem. Specifically, we incorporate unlabeled image data of the object class, say natural images, in the form of a patch-manifold prior for the object class. The manifold prior is implicitly estimated from the given unlabeled data. We show how the patch-manifold prior effectively exploits the available sample class data for regularizing the deblurring problem. Furthermore, we derive a generalized cross-validation (GCV) function to automatically determine the regularization parameter at each iteration without explicitly knowing the noise variance. Extensive experiments show that this method performs better than many competitive image deconvolution methods.
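The GCV criterion mentioned in the abstract can be illustrated on a toy problem. The sketch below is an assumption-laden simplification, not the paper's manifold-prior method: it selects the Tikhonov regularization parameter for a 1D Gaussian deblurring problem by minimizing the GCV function, without knowledge of the noise variance.

```python
import numpy as np

# Toy 1D deblurring with Tikhonov regularization; the regularization
# parameter is chosen by minimizing the GCV function, i.e. without
# knowing the noise variance. All sizes and values are illustrative.
rng = np.random.default_rng(0)
n = 64
grid = np.arange(n)
A = np.exp(-0.5 * ((grid[:, None] - grid[None, :]) / 2.0) ** 2)  # Gaussian blur
A /= A.sum(axis=1, keepdims=True)

signal = np.zeros(n)
signal[20:40] = 1.0                             # "true" image: a box
y = A @ signal + 0.01 * rng.standard_normal(n)  # blurred, noisy data

U, s, Vt = np.linalg.svd(A)
b = U.T @ y

def gcv(lam):
    f = s**2 / (s**2 + lam)               # Tikhonov filter factors
    resid = np.sum(((1 - f) * b) ** 2)    # ||(I - A_lam) y||^2
    return resid / (n - f.sum()) ** 2     # divided by trace(I - A_lam)^2

lams = np.logspace(-6, 0, 49)
best = lams[np.argmin([gcv(l) for l in lams])]
x_hat = Vt.T @ (s / (s**2 + best) * b)    # regularized reconstruction
```

The paper evaluates an analogous GCV function at each iteration of its deblurring algorithm; here the SVD form makes the filter factors and effective degrees of freedom explicit.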

  16. Adaptive nonparametric Bayesian inference using location-scale mixture priors

    NARCIS (Netherlands)

    Jonge, de R.; Zanten, van J.H.

    2010-01-01

    We study location-scale mixture priors for nonparametric statistical problems, including multivariate regression, density estimation and classification. We show that a rate-adaptive procedure can be obtained if the prior is properly constructed. In particular, we show that adaptation is achieved if

  17. Application of acetone acetals as water scavengers and derivatization agents prior to the gas chromatographic analysis of polar residual solvents in aqueous samples.

    Science.gov (United States)

    van Boxtel, Niels; Wolfs, Kris; Van Schepdael, Ann; Adams, Erwin

    2015-12-18

    The sensitivity of gas chromatography (GC) combined with the full evaporation technique (FET) for the analysis of aqueous samples is limited by the maximum tolerable sample volume in a headspace vial. Using an acetone acetal as water scavenger prior to FET-GC analysis proved to be a useful and versatile tool for the analysis of high-boiling analytes in aqueous samples. 2,2-Dimethoxypropane (DMP) was used in this case, yielding methanol and acetone as the reaction products with water. These solvents are relatively volatile and were easily removed by evaporation, enabling sample enrichment and leading to a 10-fold improvement in sensitivity compared to the standard 10 μL FET sample volumes for a selection of typical high-boiling polar residual solvents in water. This could be improved even further if more sample is used. The method was applied to the determination of residual NMP in an aqueous solution of a cefotaxime analogue and proved to be considerably better than conventional static headspace (sHS) and the standard FET approach. The methodology was also applied to determine trace amounts of ethylene glycol (EG) in aqueous samples such as contact lens fluids, where scavenging of the water avoids laborious extraction prior to derivatization. During this experiment it was revealed that DMP reacts quantitatively with EG to form 2,2-dimethyl-1,3-dioxolane (2,2-DD) under the proposed reaction conditions. The relatively high volatility (bp 93°C) of 2,2-DD makes it possible to analyze EG using the sHS methodology, making additional derivatization reactions superfluous. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Assessment of prior learning in vocational education and training

    DEFF Research Database (Denmark)

    Wahlgren, Bjarne; Aarkrog, Vibe

    The article deals with the results of a study of the assessment of prior learning among adult workers who want to obtain formal qualifications as skilled workers. The study contributes to developing methods for assessing prior learning, including both the teachers’ ways of eliciting the students’ knowledge, skills and competences during the students’ performances and the methods that the teachers apply in order to assess the students’ prior learning in relation to the regulations of the current VET program. In particular, the study focuses on how to assess not only the students’ explicated knowledge and skills but also their competences, i.e. the way the students use their skills and knowledge to perform in practice. Based on a description of the assessment procedures, the article discusses central issues in relation to the assessment of prior learning. The empirical data have been obtained in the VET...

  19. Reproducibility of Serum Potassium Values in Serum From Blood Samples Stored for Increasing Times Prior to Centrifugation and Analysis.

    Science.gov (United States)

    Harper, Aaron; Lu, Chuanyong; Sun, Yi; Garcia, Rafael; Rets, Anton; Alexis, Herol; Saad, Heba; Eid, Ikram; Harris, Loretta; Marshall, Barbara; Tafani, Edlira; Pincus, Matthew R

    2016-05-01

    The goal of this work was to determine whether immediate versus postponed centrifugation of samples affects the levels of serum potassium. Twenty participants donated normal venous blood that was collected in four serum separator tubes per donor, each of which was analyzed at 0, 1, 2, or 4 hr on the Siemens Advia 1800 autoanalyzer. Coefficients of variation (CVs) for potassium levels ranged from 0% to 7.6%, with a mean of 3 ± 2%. ANOVA testing of the means for all 20 samples gave a P-value of 0.72 (>0.05), indicating no statistically significant difference between the means of the samples at the four time points. Sixteen samples had CVs of ≤5%. Two samples showed increases of potassium from the reference range to levels higher than the upper reference limit, one of which had a 4-hr value that was within the reference (normal) range (3.5-5 mEq/l). Overall, most samples had reproducible levels of serum potassium. Serum potassium levels from stored whole blood collected in serum separator tubes are, for the most part, stable at room temperature for at least 4 hr prior to analysis. However, some samples can exhibit significant fluctuations in values. © 2015 Wiley Periodicals, Inc.
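The statistics reported above (per-sample CV across time points and a one-way ANOVA across time points) are straightforward to reproduce; the values below are invented for illustration, not the study's measurements.

```python
import numpy as np

# Hypothetical potassium values (mEq/L); rows are donors, columns are
# the 0, 1, 2, and 4 hr time points. Numbers are invented, not the
# study's data.
potassium = np.array([
    [4.1, 4.2, 4.0, 4.2],
    [3.9, 4.0, 4.1, 4.0],
    [4.5, 4.4, 4.6, 4.5],
])

# Per-sample coefficient of variation across time points (percent)
cv_percent = potassium.std(axis=1, ddof=1) / potassium.mean(axis=1) * 100

# One-way ANOVA by hand: groups are the four time points (columns)
groups = potassium.T
grand = potassium.mean()
ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
df_b, df_w = len(groups) - 1, potassium.size - len(groups)
f_stat = (ss_between / df_b) / (ss_within / df_w)
```

In practice the P-value would be taken from the F distribution with (df_b, df_w) degrees of freedom, e.g. `scipy.stats.f.sf(f_stat, df_b, df_w)`.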

  20. Figure-ground segmentation based on class-independent shape priors

    Science.gov (United States)

    Li, Yang; Liu, Yang; Liu, Guojun; Guo, Maozu

    2018-01-01

    We propose a method to generate figure-ground segmentation by incorporating shape priors into the graph-cuts algorithm. Given an image, we first obtain a linear representation of the image and then apply directional chamfer matching to generate class-independent, nonparametric shape priors, which provide shape clues for the graph-cuts algorithm. We then enforce the shape priors in a graph-cuts energy function to produce the object segmentation. In contrast to previous segmentation methods, the proposed method shares shape knowledge across different semantic classes and does not require class-specific model training; the approach therefore obtains high-quality segmentation for objects. We experimentally validate that the proposed method outperforms previous approaches on the challenging PASCAL VOC 2010/2012 and Berkeley (BSD300) segmentation datasets.

  1. A Two-Stage Maximum Entropy Prior of Location Parameter with a Stochastic Multivariate Interval Constraint and Its Properties

    Directory of Open Access Journals (Sweden)

    Hea-Jung Kim

    2016-05-01

    Full Text Available This paper proposes a two-stage maximum entropy prior to elicit uncertainty regarding a multivariate interval constraint of the location parameter of a scale mixture of normal model. Using Shannon’s entropy, this study demonstrates how the prior, obtained by using two stages of a prior hierarchy, appropriately accounts for the information regarding the stochastic constraint and suggests an objective measure of the degree of belief in the stochastic constraint. The study also verifies that the proposed prior plays the role of bridging the gap between the canonical maximum entropy prior of the parameter with no interval constraint and that with a certain multivariate interval constraint. It is shown that the two-stage maximum entropy prior belongs to the family of rectangle screened normal distributions that is conjugate for samples from a normal distribution. Some properties of the prior density, useful for developing a Bayesian inference of the parameter with the stochastic constraint, are provided. We also propose a hierarchical constrained scale mixture of normal model (HCSMN, which uses the prior density to estimate the constrained location parameter of a scale mixture of normal model and demonstrates the scope of its applicability.

  2. Minimizing technical variation during sample preparation prior to label-free quantitative mass spectrometry.

    Science.gov (United States)

    Scheerlinck, E; Dhaenens, M; Van Soom, A; Peelman, L; De Sutter, P; Van Steendam, K; Deforce, D

    2015-12-01

    Sample preparation is the crucial starting point for obtaining high-quality mass spectrometry data and can be divided into two main steps in a bottom-up proteomics approach: cell/tissue lysis with or without detergents and an (in-solution) digest comprising denaturation, reduction, alkylation, and digestion of the proteins. Important considerations include that the reagents used for sample preparation can inhibit the digestion enzyme (e.g., 0.1% sodium dodecyl sulfate [SDS] and 0.5 M guanidine HCl), give rise to ion suppression (e.g., polyethylene glycol [PEG]), be incompatible with liquid chromatography-tandem mass spectrometry (LC-MS/MS) (e.g., SDS), and induce additional modifications (e.g., urea). Taken together, these irreproducible effects become a problem when label-free quantitation of the samples is envisioned, such as in the increasingly popular high-definition mass spectrometry (HDMS(E)) and sequential window acquisition of all theoretical fragment ion spectra (SWATH) data-independent acquisition strategies. Here, we describe the detailed validation of a reproducible method, with sufficient protein yield and without any known LC-MS/MS-interfering substances, that uses 1% sodium deoxycholate (SDC) during both cell lysis and in-solution digest. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  3. Speciation of Tl(III) and Tl(I) in hair samples by dispersive liquid–liquid microextraction based on solidification of floating organic droplet prior to flame atomic absorption spectrometry determination

    Directory of Open Access Journals (Sweden)

    S.Z. Mohammadi

    2016-11-01

    Full Text Available Dispersive liquid–liquid microextraction based on solidification of a floating organic droplet was successfully used as a sample preparation method prior to flame atomic absorption determination of trace amounts of Tl(III) and Tl(I) in hair samples. In the proposed method, 1-(2-pyridylazo)-2-naphthol, 1-dodecanol, and ethanol were used as the chelating agent, extraction solvent, and dispersive solvent, respectively. Several factors that may affect the extraction process, such as the type and volume of the extraction and disperser solvents, pH, salting-out effect, ionic strength, and extraction time, were studied. Under the optimal conditions, linearity was maintained between 6.0 and 900.0 ng mL−1 for Tl(III). The relative standard deviation for seven replicate determinations of 0.2 μg mL−1 Tl(III) was 2.5%. The detection limit based on 3Sb for Tl(III) in the original solution was 2.1 ng mL−1. The proposed method has been applied to the determination of trace amounts of thallium in hair samples, and satisfactory results were obtained.

  4. Detection and Genotyping of Human Papillomavirus in Self-Obtained Cervicovaginal Samples by Using the FTA Cartridge: New Possibilities for Cervical Cancer Screening ▿

    Science.gov (United States)

    Lenselink, Charlotte H.; de Bie, Roosmarie P.; van Hamont, Dennis; Bakkers, Judith M. J. E.; Quint, Wim G. V.; Massuger, Leon F. A. G.; Bekkers, Ruud L. M.; Melchers, Willem J. G.

    2009-01-01

    This study assesses human papillomavirus (HPV) detection and genotyping in self-sampled genital smears applied to an indicating FTA elute cartridge (FTA cartridge). The study group consisted of 96 women, divided into two sample sets. All samples were analyzed by the HPV SPF10-Line Blot 25. Set 1 consisted of 45 women attending the gynecologist; all obtained a self-sampled cervicovaginal smear, which was applied to an FTA cartridge. HPV results were compared to a cervical smear (liquid based) taken by a trained physician. Set 2 consisted of 51 women who obtained a self-sampled cervicovaginal smear at home, which was applied to an FTA cartridge and to a liquid-based medium. DNA was obtained from the FTA cartridges by simple elution as well as extraction. Of all self-obtained samples of set 1, 62.2% tested HPV positive. The overall agreement between self- and physician-obtained samples was 93.3%, in favor of the self-obtained samples. In sample set 2, 25.5% tested HPV positive. The overall agreement for high-risk HPV presence between the FTA cartridge and liquid-based medium and between DNA elution and extraction was 100%. This study shows that HPV detection and genotyping in self-obtained cervicovaginal samples applied to an FTA cartridge is highly reliable. It shows a high level of overall agreement with HPV detection and genotyping in physician-obtained cervical smears and liquid-based self-samples. DNA can be obtained by simple elution and is therefore easy, cheap, and fast. Furthermore, the FTA cartridge is a convenient medium for collection and safe transport at ambient temperatures. Therefore, this method may contribute to a new way of cervical cancer screening. PMID:19553570

  5. Highly Simple Deep Eutectic Solvent Extraction of Manganese in Vegetable Samples Prior to Its ICP-OES Analysis.

    Science.gov (United States)

    Bağda, Esra; Altundağ, Hüseyin; Soylak, Mustafa

    2017-10-01

    In the present work, simple and sensitive extraction methods for the selective determination of manganese have been successfully developed. The methods are based on solubilization of manganese in a deep eutectic solvent medium. Three deep eutectic solvents combining choline chloride (vitamin B4) with tartaric, oxalic, or citric acid were prepared. Extraction parameters were optimized using a standard reference material (1573a tomato leaves). Quantitative recovery was obtained with a sample-to-deep-eutectic-solvent (DES) ratio of 1.25 g/L at 95 °C for 2 h. The limits of detection were 0.50, 0.34, and 1.23 μg/L for DES/tartaric, DES/oxalic, and DES/citric acid, respectively. At optimum conditions, the analytical signal was linear over the range of 10-3000 μg/L for all studied DESs, with correlation coefficients >0.99. The extraction methods were applied to different real samples such as basil herb, spinach, dill, and cucumber barks. Known amounts of manganese were spiked into the samples, and good recovery results were obtained.

  6. Use of an Ultrasonic/Sonic Driller/Corer to Obtain Sample Powder for CHEMIN, a Combined XRD/XRF Instrument

    Science.gov (United States)

    Chipera, S. J.; Bish, D. L.; Vaniman, D. T.; Sherrit, S.; Bar-Cohen, Y.; Sarrazin, P.; Blake, D. F.

    2003-01-01

    A miniature CHEMIN XRD/XRF (X-Ray Diffraction/X-Ray Fluorescence) instrument is currently being developed for definitive mineralogic analysis of soils and rocks on Mars. One of the technical issues that must be addressed in order to enable XRD analysis on an extraterrestrial body is how best to obtain a representative sample powder for analysis. For XRD powder diffraction analyses, it is beneficial to have a fine-grained sample to reduce preferred orientation effects and to provide a statistically significant number of crystallites to the X-ray beam. Although a 2-dimensional detector as used in the CHEMIN instrument will produce good results with poorly prepared powders, the quality of the data will improve if the sample is fine-grained and randomly oriented. An Ultrasonic/Sonic Driller/Corer (USDC) currently being developed at JPL is an effective mechanism for sampling rock to produce cores and powdered cuttings, and it requires only a low axial load. To assess the suitability of the resulting powders for an XRD/XRF spectrometer such as CHEMIN, powders obtained from the JPL ultrasonic drill were analyzed and the results were compared to carefully prepared powders obtained using a laboratory bench-scale Retsch mill.

  7. Neutrino masses and their ordering: global data, priors and models

    Science.gov (United States)

    Gariazzo, S.; Archidiacono, M.; de Salas, P. F.; Mena, O.; Ternes, C. A.; Tórtola, M.

    2018-03-01

    We present a full Bayesian analysis of the combination of current neutrino oscillation, neutrinoless double beta decay and Cosmic Microwave Background observations. Our major goal is to carefully investigate the possibility to single out one neutrino mass ordering, namely Normal Ordering or Inverted Ordering, with current data. Two possible parametrizations (three neutrino masses versus the lightest neutrino mass plus the two oscillation mass splittings) and priors (linear versus logarithmic) are exhaustively examined. We find that the preference for NO is only driven by neutrino oscillation data. Moreover, the values of the Bayes factor indicate that the evidence for NO is strong only when the scan is performed over the three neutrino masses with logarithmic priors; for every other combination of parameterization and prior, the preference for NO is only weak. As a by-product of our Bayesian analyses, we are able to (a) compare the Bayesian bounds on the neutrino mixing parameters to those obtained by means of frequentist approaches, finding a very good agreement; (b) determine that the lightest neutrino mass plus the two mass splittings parametrization, motivated by the physical observables, is strongly preferred over the three neutrino mass eigenstates scan and (c) find that logarithmic priors guarantee a weakly-to-moderately more efficient sampling of the parameter space. These results establish the optimal strategy to successfully explore the neutrino parameter space, based on the use of the oscillation mass splittings and a logarithmic prior on the lightest neutrino mass, when combining neutrino oscillation data with cosmology and neutrinoless double beta decay. We also show that the limits on the total neutrino mass ∑ mν can change dramatically when moving from one prior to the other. These results have profound implications for future studies on the neutrino mass ordering, as they crucially state the need for self-consistent analyses which explore the
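The prior dependence highlighted in the abstract can be sketched numerically. The example below uses representative Normal Ordering mass splittings and an assumed scan range (illustrative values, not the paper's fits) to compare the distribution of the total mass ∑ mν under linear and logarithmic priors on the lightest neutrino mass.

```python
import numpy as np

# Illustrative sketch of the prior dependence of sum(m_nu): the
# splittings below are representative Normal Ordering values (eV^2),
# and the scan range is an assumption, not taken from the paper.
DM21_SQ = 7.4e-5
DM31_SQ = 2.5e-3

def total_mass(m_lightest):
    m1 = m_lightest
    m2 = np.sqrt(m1**2 + DM21_SQ)
    m3 = np.sqrt(m1**2 + DM31_SQ)
    return m1 + m2 + m3

rng = np.random.default_rng(1)
lo, hi = 1e-5, 0.3  # eV, assumed scan range for the lightest mass
linear_prior = rng.uniform(lo, hi, 100_000)
log_prior = 10 ** rng.uniform(np.log10(lo), np.log10(hi), 100_000)

# The linear prior concentrates mass high in the range; the log prior
# concentrates it near the oscillation-imposed minimum of sum(m_nu).
med_linear = np.median(total_mass(linear_prior))
med_log = np.median(total_mass(log_prior))
```

This mirrors the paper's observation that limits on ∑ mν can change dramatically when moving from one prior to the other, even before any data are applied.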

  8. Adaptive local thresholding for robust nucleus segmentation utilizing shape priors

    Science.gov (United States)

    Wang, Xiuzhong; Srinivas, Chukka

    2016-03-01

    This paper describes a novel local thresholding method for foreground detection. First, a Canny edge detection method is used for initial edge detection. Then, tensor voting is applied on the initial edge pixels, using a nonsymmetric tensor field tailored to encode prior information about nucleus size, shape, and intensity spatial distribution. Tensor analysis is then performed to generate the saliency image and, based on that, the refined edge. Next, the image domain is divided into blocks. In each block, at least one foreground and one background pixel are sampled for each refined edge pixel. The saliency weighted foreground histogram and background histogram are then created. These two histograms are used to calculate a threshold by minimizing the background and foreground pixel classification error. The block-wise thresholds are then used to generate the threshold for each pixel via interpolation. Finally, the foreground is obtained by comparing the original image with the threshold image. The effective use of prior information, combined with robust techniques, results in far more reliable foreground detection, which leads to robust nucleus segmentation.
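A heavily simplified sketch of the block-wise thresholding idea follows; the paper's actual pipeline adds Canny edge detection, tensor voting, saliency-weighted foreground/background histograms, and interpolation of the block thresholds, all of which are omitted here. Each block simply receives a midrange threshold.

```python
import numpy as np

# Simplified block-wise local thresholding (the paper's full method
# uses saliency-weighted histograms and interpolates the block
# thresholds; here each block just gets a midrange threshold).
def blockwise_threshold(image, block=16):
    h, w = image.shape
    mask = np.zeros_like(image, dtype=bool)
    for i in range(0, h, block):
        for j in range(0, w, block):
            tile = image[i:i + block, j:j + block]
            t = 0.5 * (tile.min() + tile.max())  # midrange threshold
            mask[i:i + block, j:j + block] = tile > t
    return mask

# Synthetic test image: dim gradient background with one bright
# nucleus-like 4x4 square per 16x16 block.
img = np.fromfunction(lambda i, j: 0.1 * i, (64, 64))
for i in range(0, 64, 16):
    for j in range(0, 64, 16):
        img[i + 6:i + 10, j + 6:j + 10] += 200.0
mask = blockwise_threshold(img)
```

Because the threshold adapts per block, the gradient background is rejected everywhere even though no single global threshold would separate it from the bright squares.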

  9. Sets of priors reflecting prior-data conflict and agreement

    NARCIS (Netherlands)

    Walter, G.M.; Coolen, F.P.A.; Carvalho, J.P.; Lesot, M.-J.; Kaymak, U.; Vieira, S.; Bouchon-Meunier, B.; Yager, R.R.

    2016-01-01

    Bayesian inference enables combination of observations with prior knowledge in the reasoning process. The choice of a particular prior distribution to represent the available prior knowledge is, however, often debatable, especially when prior knowledge is limited or data are scarce, as then

  10. Prior-based artifact correction (PBAC) in computed tomography

    International Nuclear Information System (INIS)

    Heußer, Thorsten; Brehm, Marcus; Ritschl, Ludwig; Sawall, Stefan; Kachelrieß, Marc

    2014-01-01

    Purpose: Image quality in computed tomography (CT) often suffers from artifacts which may reduce the diagnostic value of the image. In many cases, these artifacts result from missing or corrupt regions in the projection data, e.g., in the case of metal, truncation, and limited angle artifacts. The authors propose a generalized correction method for different kinds of artifacts resulting from missing or corrupt data by making use of available prior knowledge to perform data completion. Methods: The proposed prior-based artifact correction (PBAC) method requires prior knowledge in the form of a planning CT of the same patient or a CT scan of a different patient showing the same body region. In both cases, the prior image is registered to the patient image using a deformable transformation. The registered prior is forward projected and data completion of the patient projections is performed using smooth sinogram inpainting. The obtained projection data are used to reconstruct the corrected image. Results: The authors investigate metal and truncation artifacts in patient data sets acquired with a clinical CT and limited angle artifacts in an anthropomorphic head phantom data set acquired with a gantry-based flat detector CT device. In all cases, the corrected images obtained by PBAC are nearly artifact-free. Compared to conventional correction methods, PBAC achieves better artifact suppression while preserving the patient-specific anatomy at the same time. Further, the authors show that prominent anatomical details in the prior image seem to have only minor impact on the correction result. Conclusions: The results show that PBAC has the potential to effectively correct for metal, truncation, and limited angle artifacts if adequate prior data are available. Since the proposed method makes use of a generalized algorithm, PBAC may also be applicable to other artifacts resulting from missing or corrupt data.

  11. Utility of the microculture method in non-invasive samples obtained from an experimental murine model with asymptomatic leishmaniasis.

    Science.gov (United States)

    Allahverdiyev, Adil M; Bagirova, Malahat; Cakir-Koc, Rabia; Elcicek, Serhat; Oztel, Olga Nehir; Canim-Ates, Sezen; Abamor, Emrah Sefik; Yesilkir-Baydar, Serap

    2012-07-01

    The sensitivity of diagnostic methods for visceral leishmaniasis (VL) decreases because of the low number of parasites and small antibody amounts in asymptomatic healthy donors, who are not suitable for invasive sample acquisition procedures. Therefore, new studies are urgently needed to improve the sensitivity and specificity of diagnostic approaches in non-invasive samples. In this study, the sensitivity of the microculture method (MCM) was compared with the polymerase chain reaction (PCR), enzyme-linked immunosorbent assay (ELISA), and immunofluorescent antibody test (IFAT) methods in an experimental murine model with asymptomatic leishmaniasis. The percentages of positive samples in the ELISA, IFAT, and peripheral blood (PB)-PCR tests were 17.64%, 8.82%, and 5.88%, respectively, whereas 100% positive results were obtained with the MCM and MCM-PCR methods. Thus, this study showed for the first time that MCM is more sensitive, specific, and economical than the other methods, and that the sensitivity of PCR performed on samples obtained from MCM was higher than that of PCR on PB samples.

  12. Reference Priors for the General Location-Scale Model

    NARCIS (Netherlands)

    Fernández, C.; Steel, M.F.J.

    1997-01-01

    The reference prior algorithm (Berger and Bernardo 1992) is applied to multivariate location-scale models with any regular sampling density, where we establish the irrelevance of the usual assumption of Normal sampling if our interest is in either the location or the scale. This result immediately

  13. Quality of determinations obtained from laboratory reference samples used in the calibration of X-ray electron probe microanalysis of silicate minerals

    International Nuclear Information System (INIS)

    Pavlova, Ludmila A.; Suvorova, Ludmila F.; Belozerova, Olga Yu.; Pavlov, Sergey M.

    2003-01-01

    Nine simple minerals and oxides, traditionally used as laboratory reference samples in the electron probe microanalysis (EPMA) of silicate minerals, have been quantitatively evaluated. Three separate series of data, comprising the average concentration, standard deviation, relative standard deviation, confidence interval, and the z-score of data quality, were calculated for 21 control samples from calibrations obtained with three sets of reference samples: (1) simple minerals; (2) oxides; and (3) certified glass reference materials. No systematic difference was observed between the concentrations obtained from these three calibration sets when analytical results were compared to certified compositions. The relative standard deviations obtained for each element were smaller than target values for all determinations. The z-score values for all elements determined fell within acceptable limits (−2 < z < 2) for concentrations ranging from 0.1 to 100%. These experiments show that the quality of data obtained from laboratory reference calibration samples is not inferior to that from certified reference glasses. The quality of the results corresponds to the 'applied geochemistry' type of analysis (category 2) as defined in the GeoPT proficiency testing program. Therefore, laboratory reference samples can be used for calibrating EPMA techniques in the analysis of silicate minerals and for controlling the quality of results
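The z-score acceptance criterion used above can be illustrated with a few invented numbers. The target-standard-deviation function below is a Horwitz-type form of the kind used in GeoPT; both the function's coefficient and the concentrations are assumptions for illustration, not the paper's data.

```python
import numpy as np

# Invented control-sample results (wt%); not the paper's data.
certified = np.array([45.2, 8.31, 0.92])
measured = np.array([45.6, 8.10, 0.95])

def sigma_target(c_percent):
    # Horwitz-type target function of the kind used in GeoPT for
    # 'applied geochemistry' (assumed here): sigma = 0.02 * c^0.8495
    return 0.02 * c_percent ** 0.8495

z = (measured - certified) / sigma_target(certified)
acceptable = np.abs(z) < 2   # the -2 < z < 2 criterion from the text
```

A determination is acceptable when its bias is small relative to the concentration-dependent target precision, which is why the same absolute error yields very different z-scores at 45 wt% and at 0.9 wt%.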

  14. Solving probabilistic inverse problems rapidly with prior samples

    NARCIS (Netherlands)

    Käufl, Paul; Valentine, Andrew P.; de Wit, Ralph W.; Trampert, Jeannot

    2016-01-01

    Owing to the increasing availability of computational resources, in recent years the probabilistic solution of non-linear, geophysical inverse problems by means of sampling methods has become increasingly feasible. Nevertheless, we still face situations in which a Monte Carlo approach is not

  15. Use of Saliva for Assessment of Stress and Its Effect on the Immune System Prior to Gross Anatomy Practical Examinations

    Science.gov (United States)

    Lester, S. Reid; Brown, Jason R.; Aycock, Jeffrey E.; Grubbs, S. Lee; Johnson, Roger B.

    2010-01-01

    The objective of this study was to determine the longitudinal effects of a series of stressful gross anatomy tests on the immune system. Thirty-six freshman occupational therapy students completed a written stress evaluation survey, and saliva samples were obtained at baseline and prior to each of three timed-practical gross anatomy tests.…

  16. Stochastic, goal-oriented rapid impact modeling of uncertainty and environmental impacts in poorly-sampled sites using ex-situ priors

    Science.gov (United States)

    Li, Xiaojun; Li, Yandong; Chang, Ching-Fu; Tan, Benjamin; Chen, Ziyang; Sege, Jon; Wang, Changhong; Rubin, Yoram

    2018-01-01

Modeling of uncertainty associated with subsurface dynamics has long been a major research topic, and its significance is widely recognized for real-life applications. Despite the huge effort invested in the area, major obstacles still remain on the way from theory to applications. Particularly problematic is the confusion between modeling uncertainty and modeling spatial variability, which translates into a (mis)conception, in fact an inconsistency, suggesting that the two are equivalent and, as such, both require a lot of data. This paper investigates this challenge against the backdrop of a 7-km-long deep underground tunnel in China, where environmental impacts are of major concern. We approach the data challenge by pursuing a new concept for Rapid Impact Modeling (RIM), which bypasses altogether the need to estimate posterior distributions of model parameters, focusing instead on detailed stochastic modeling of impacts, conditional on all available information, including prior ex-situ information as well as in-situ measurements. A foundational element of RIM is the construction of informative priors for target parameters using ex-situ data, relying on ensembles of well-documented sites pre-screened for geological and hydrological similarity to the target site. The ensembles are built around two sets of similarity criteria: a physically based set and an additional set covering epistemic criteria. In another variation on common Bayesian practice, we update the priors to obtain conditional distributions of the target (environmental impact) dependent variables rather than the hydrological variables. This recognizes that goal-oriented site characterization is in many cases more useful in applications than parameter-oriented characterization.

  17. The use of importance sampling in a trial assessment to obtain converged estimates of radiological risk

    International Nuclear Information System (INIS)

    Johnson, K.; Lucas, R.

    1986-12-01

In developing a methodology for assessing potential sites for the disposal of radioactive wastes, the Department of the Environment has conducted a series of trial assessment exercises. In order to produce converged estimates of radiological risk using the SYVAC A/C simulation system, an efficient sampling procedure is required. Previous work has demonstrated that importance sampling can substantially increase sampling efficiency. This study used importance sampling to produce converged estimates of risk for the first DoE trial assessment. Four major nuclide chains were analysed. In each case importance sampling produced converged risk estimates with between 10 and 170 times fewer runs of the SYVAC A/C model. This increase in sampling efficiency can reduce the total elapsed time required to obtain a converged estimate of risk from one nuclide chain by a factor of 20. The results of this study suggest that the use of importance sampling could reduce the elapsed time required to perform a risk assessment of a potential site by a factor of ten. (author)
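The efficiency gain described above can be illustrated with a toy rare-event estimate. The lognormal integrand, threshold, and shifted proposal below are illustrative assumptions, not the SYVAC A/C model:

```python
import math

import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a rare radiological outcome: the probability that a
# lognormal parameter exceeds a high threshold (true value ~1.4e-3).
threshold = 20.0
mu, sigma = 0.0, 1.0                  # nominal parameter distribution
n = 20_000

# Plain Monte Carlo: sample from the nominal distribution.
x_mc = rng.lognormal(mu, sigma, n)
est_mc = np.mean(x_mc > threshold)

# Importance sampling: bias the proposal toward the rare region and
# reweight each sample by the likelihood ratio nominal/proposal.
mu_is = math.log(threshold)           # proposal centred on the threshold
x_is = rng.lognormal(mu_is, sigma, n)
log_w = ((np.log(x_is) - mu_is) ** 2 - (np.log(x_is) - mu) ** 2) / (2 * sigma**2)
est_is = np.mean((x_is > threshold) * np.exp(log_w))

print(f"plain MC:            {est_mc:.2e}")
print(f"importance sampling: {est_is:.2e}")
```

With the same number of samples, the importance-sampling estimate has a far smaller relative error because nearly half the proposal draws land in the rare region, which is the same mechanism behind the 10- to 170-fold run reductions reported above.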

  18. A Bayesian approach to PET reconstruction using image-modeling Gibbs priors: Implementation and comparison

    International Nuclear Information System (INIS)

    Chan, M.T.; Herman, G.T.; Levitan, E.

    1996-01-01

We demonstrate that (i) classical methods of image reconstruction from projections can be improved upon by considering the output of such a method as a distorted version of the original image and applying a Bayesian approach to estimate from it the original image (based on a model of distortion and on a Gibbs distribution as the prior) and (ii) by selecting an "image-modeling" prior distribution (i.e., one which is such that it is likely that a random sample from it shares important characteristics of the images of the application area) one can improve over another Gibbs prior formulated using only pairwise interactions. We illustrate our approach using simulated Positron Emission Tomography (PET) data from realistic brain phantoms. Since algorithm performance ultimately depends on the diagnostic task being performed, we examine a number of different medically relevant figures of merit to give a fair comparison. Based on a training-and-testing evaluation strategy, we demonstrate that statistically significant improvements can be obtained using the proposed approach.
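Point (i) above, treating the reconstruction as a distorted image and computing a Bayesian estimate under a Gibbs prior, can be sketched in miniature. The phantom, the noise-only distortion model, and the pairwise quadratic prior below are illustrative assumptions, far simpler than the paper's image-modeling priors:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy phantom and its "distorted" observation (additive noise only, for brevity).
true_img = np.zeros((32, 32))
true_img[8:24, 8:24] = 1.0
observed = true_img + rng.normal(0.0, 0.3, true_img.shape)

# MAP estimate under a Gaussian likelihood and a pairwise quadratic Gibbs
# prior: minimise ||x - y||^2 + beta * sum over neighbour pairs (x_i - x_j)^2
# by gradient descent (periodic boundaries via np.roll).
beta, step, iters = 1.0, 0.05, 200
x = observed.copy()
for _ in range(iters):
    grad = 2.0 * (x - observed)
    for shift, axis in [(1, 0), (-1, 0), (1, 1), (-1, 1)]:
        grad += 2.0 * beta * (x - np.roll(x, shift, axis=axis))
    x -= step * grad

err_raw = np.mean((observed - true_img) ** 2)
err_map = np.mean((x - true_img) ** 2)
print(f"MSE raw: {err_raw:.4f}  MSE MAP: {err_map:.4f}")
```

The pairwise prior trades a small bias at region edges for a large reduction in noise; the paper's contribution is replacing this generic pairwise term with priors whose random samples resemble real images of the application area.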

  19. Obtaining Self-Samples to Diagnose Curable Sexually Transmitted Infections: A Systematic Review of Patients’ Experiences

    Science.gov (United States)

    Paudyal, Priyamvada; Llewellyn, Carrie; Lau, Jason; Mahmud, Mohammad; Smith, Helen

    2015-01-01

Background Routine screening is key to sexually transmitted infection (STI) prevention and control. Previous studies suggest that clinic-based screening programmes capture only a small proportion of people with STIs. Self-sampling using non- or minimally invasive techniques may be beneficial for those reluctant to actively engage with conventional sampling methods. We systematically reviewed studies of patients’ experiences of obtaining self-samples to diagnose curable STIs. Methods We conducted an electronic search of MEDLINE, EMBASE, CINAHL, PsychINFO, BNI, and the Cochrane Database of Systematic Reviews to identify relevant articles published in English between January 1980 and March 2014. Studies were included if participants self-sampled for the diagnosis of a curable STI and had specifically sought participants’ opinions of their experience, acceptability, preferences, or willingness to self-sample. Results The initial search yielded 558 references. Of these, 45 studies met the inclusion criteria. Thirty-six studies assessed patients’ acceptability and experiences of self-sampling. Pooled results from these studies show that self-sampling is a highly acceptable method, with 85% of patients reporting the method to be well received and acceptable. Twenty-eight studies reported on ease of self-sampling; the majority of patients (88%) in these studies found self-sampling an “easy” procedure. Self-sampling was favoured over clinician sampling, and home sampling was preferred to clinic-based sampling. Females and older participants were more accepting of self-sampling. Only a small minority of participants (13%) reported pain during self-sampling. Participants were willing to undergo self-sampling and to recommend it to others. Privacy and safety were the most common concerns. Conclusion Self-sampling for diagnostic testing is well accepted, with the majority having a positive experience and willingness to use again. Standardization of self-sampling procedures

  20. Random template placement and prior information

    International Nuclear Information System (INIS)

    Roever, Christian

    2010-01-01

In signal detection problems, one is usually faced with the task of searching a parameter space for peaks in the likelihood function which indicate the presence of a signal. Random searches have proven to be very efficient as well as easy to implement, compared, e.g., to searches along regular grids in parameter space. Knowledge of the parameterised shape of the signal searched for adds structure to the parameter space, i.e., there are usually regions that need to be searched densely, while in other regions a coarser search is sufficient. On the other hand, prior information identifies the regions in which a search will actually be promising or may likely be in vain. Defining specific figures of merit allows one to combine both template metric and prior distribution and devise optimal sampling schemes over the parameter space. We show an example related to the gravitational wave signal from a binary inspiral event. Here the template metric and prior information are particularly contradictory, since signals from low-mass systems tolerate the least mismatch in parameter space, while high-mass systems are far more likely, as they imply a greater signal-to-noise ratio (SNR) and hence are detectable to greater distances. The derived sampling strategy is implemented in a Markov chain Monte Carlo (MCMC) algorithm, where it improves convergence.
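The tension between metric density and prior weight can be sketched in one dimension. Both densities below are illustrative assumptions standing in for the actual template metric and astrophysical prior:

```python
import numpy as np

rng = np.random.default_rng(2)

# 1-D toy "mass" axis. The template metric calls for dense coverage at
# low mass (waveforms change fastest there), while the prior favours
# high mass (louder signals are detectable to greater distances).
m = np.linspace(1.0, 10.0, 1000)
metric_density = m ** -2.0    # assumed metric-driven template density
prior_density = m ** 1.5      # assumed detection-volume-weighted prior

# Figure of merit: place random templates according to the normalised
# product of the two densities, via inverse-CDF sampling on the grid.
target = metric_density * prior_density
cdf = np.cumsum(target)
cdf /= cdf[-1]
templates = np.interp(rng.uniform(size=5000), cdf, m)

print(f"median template mass: {np.median(templates):.2f}")
```

The resulting placement density (here proportional to m**-0.5) is a compromise: denser than the prior alone at low mass, sparser than the metric alone, which is the balancing act the figure of merit formalises.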

  1. An Optimized DNA Analysis Workflow for the Sampling, Extraction, and Concentration of DNA obtained from Archived Latent Fingerprints.

    Science.gov (United States)

    Solomon, April D; Hytinen, Madison E; McClain, Aryn M; Miller, Marilyn T; Dawson Cruz, Tracey

    2018-01-01

DNA profiles have been obtained from fingerprints, but there is limited knowledge regarding DNA analysis from archived latent fingerprints: touch DNA "sandwiched" between adhesive and paper. Thus, this study sought to comparatively analyze a variety of collection and analytical methods in an effort to seek an optimized workflow for this specific sample type. Untreated and treated archived latent fingerprints were utilized to compare different biological sampling techniques, swab diluents, DNA extraction systems, DNA concentration practices, and post-amplification purification methods. Disassembling archived latent fingerprints and sampling via direct cutting, followed by DNA extraction using the QIAamp® DNA Investigator Kit and concentration with Centri-Sep™ columns, increased the odds of obtaining an STR profile. Using the recommended DNA workflow, 9 of the 10 samples provided STR profiles, which included 7-100% of the expected STR alleles and two full profiles. Thus, with carefully selected procedures, archived latent fingerprints can be a viable DNA source for criminal investigations including cold/postconviction cases. © 2017 American Academy of Forensic Sciences.

  2. Prior Elicitation, Assessment and Inference with a Dirichlet Prior

    Directory of Open Access Journals (Sweden)

    Michael Evans

    2017-10-01

Methods are developed for eliciting a Dirichlet prior based upon stating bounds on the individual probabilities that hold with high prior probability. This approach to selecting a prior is applied to a contingency table problem where it is demonstrated how to assess the prior with respect to the bias it induces as well as how to check for prior-data conflict. It is shown that the assessment of a hypothesis via relative belief can easily take into account what it means for the falsity of the hypothesis to correspond to a difference of practical importance and provide evidence in favor of a hypothesis.
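The elicitation idea, choosing a Dirichlet so that stated bounds on each cell probability hold with high prior probability, can be sketched using the fact that the marginals of a Dirichlet(alpha) are Beta distributions. The bounds, base shape, and 0.90 coverage target below are illustrative assumptions, not the paper's worked example:

```python
import numpy as np
from scipy.stats import beta

# Elicited bounds on three cell probabilities (assumed for illustration):
# each p_i should lie in [l_i, u_i] with prior probability >= 0.90.
bounds = [(0.2, 0.6), (0.1, 0.5), (0.1, 0.4)]

def coverage(alpha):
    """Prior probability that each p_i falls inside its elicited bounds.
    The marginal of Dirichlet(alpha) for cell i is Beta(alpha_i, alpha_0 - alpha_i)."""
    a0 = sum(alpha)
    return [beta.cdf(u, a, a0 - a) - beta.cdf(l, a, a0 - a)
            for a, (l, u) in zip(alpha, bounds)]

# Crude elicitation: scale a base shape (the prior guess at the cell
# probabilities) until every bound is covered with the target probability.
base = np.array([0.4, 0.3, 0.3])
for scale in [5, 10, 20, 40, 80, 160]:
    cov = coverage(scale * base)
    if min(cov) >= 0.90:
        print("concentration", scale, "coverage", np.round(cov, 3))
        break
```

Larger concentration parameters tighten all marginals at once; a proper elicitation would also check the induced bias and prior-data conflict as the abstract describes.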

  3. Application of cotton as a solid phase extraction sorbent for on-line preconcentration of copper in water samples prior to inductively coupled plasma optical emission spectrometry determination.

    Science.gov (United States)

    Faraji, Mohammad; Yamini, Yadollah; Shariati, Shahab

    2009-07-30

Copper, as a heavy metal, is toxic for many biological systems. Thus, the determination of trace amounts of copper in environmental samples is of great importance. In the present work, a new method was developed for the determination of trace amounts of copper in water samples. The method is based on the formation of the ternary Cu(II)-CAS-CTAB ion-pair and its adsorption onto a mini-column packed with cotton prior to applying inductively coupled plasma optical emission spectrometry (ICP-OES). The experimental parameters that affected the extraction efficiency of the method, such as pH, flow rate and volume of the sample solution, concentration of chromazurol S (CAS) and cetyltrimethylammonium bromide (CTAB), as well as type and concentration of eluent, were investigated and optimized. The ion-pair (Cu(II)-CAS-CTAB) was quantitatively retained on the cotton under the optimum conditions, then eluted completely using a solution of 25% (v/v) 1-propanol in 0.5 mol L(-1) HNO(3) and directly introduced into the nebulizer of the ICP-OES. The detection limit (DL) of the method for copper was 40 ng L(-1) (V(sample) = 100 mL) and the relative standard deviation (R.S.D.) for the determination of copper at the 10 microg L(-1) level was found to be 1.3%. The method was successfully applied to determine trace amounts of copper in tap water, deep well water, seawater and two different mineral waters, and suitable recoveries were obtained (92-106%).

  4. Optimization of Sample Preparation processes of Bone Material for Raman Spectroscopy.

    Science.gov (United States)

    Chikhani, Madelen; Wuhrer, Richard; Green, Hayley

    2018-03-30

Raman spectroscopy has recently been investigated for use in the calculation of postmortem interval from skeletal material. The fluorescence generated by samples, which affects the interpretation of Raman data, is a major limitation. This study compares the effectiveness of two sample preparation techniques, chemical bleaching and scraping, in the reduction of fluorescence from bone samples during testing with Raman spectroscopy. Visual assessment of Raman spectra obtained at 1064 nm excitation following the preparation protocols indicates an overall reduction in fluorescence. Results demonstrate that scraping is more effective at resolving fluorescence than chemical bleaching. The scraping of skeletonized remains prior to Raman analysis is a less destructive method and allows for the preservation of a bone sample in a state closest to its original form, which is beneficial in forensic investigations. It is recommended that bone scraping supersede chemical bleaching as the preferred method for sample preparation prior to Raman spectroscopy. © 2018 American Academy of Forensic Sciences.

  5. Drunkorexia: Calorie Restriction Prior to Alcohol Consumption among College Freshman

    Science.gov (United States)

    Burke, Sloane C.; Cremeens, Jennifer; Vail-Smith, Karen; Woolsey, Conrad

    2010-01-01

    Using a sample of 692 freshmen at a southeastern university, this study examined caloric restriction among students prior to planned alcohol consumption. Participants were surveyed for self-reported alcohol consumption, binge drinking, and caloric intake habits prior to drinking episodes. Results indicated that 99 of 695 (14%) of first year…

  6. Prior Sensitivity Analysis in Default Bayesian Structural Equation Modeling.

    Science.gov (United States)

    van Erp, Sara; Mulder, Joris; Oberski, Daniel L

    2017-11-27

Bayesian structural equation modeling (BSEM) has recently gained popularity because it enables researchers to fit complex models and solve some of the issues often encountered in classical maximum likelihood estimation, such as nonconvergence and inadmissible solutions. An important component of any Bayesian analysis is the prior distribution of the unknown model parameters. Often, researchers rely on default priors, which are constructed in an automatic fashion without requiring substantive prior information. However, the prior can have a serious influence on the estimation of the model parameters, which affects the mean squared error, bias, coverage rates, and quantiles of the estimates. In this article, we investigate the performance of three different default priors: noninformative improper priors, vague proper priors, and empirical Bayes priors, with the latter being novel in the BSEM literature. Based on a simulation study, we find that these three default BSEM methods may perform very differently, especially with small samples. A careful prior sensitivity analysis is therefore needed when performing a default BSEM analysis. For this purpose, we provide a practical step-by-step guide for practitioners on conducting a prior sensitivity analysis in default BSEM. Our recommendations are illustrated using a well-known case study from the structural equation modeling literature, and all code for conducting the prior sensitivity analysis is available in the online supplemental materials. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
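The small-sample sensitivity the article warns about is easy to demonstrate outside the SEM setting. The sketch below uses a conjugate normal-mean model with known variance as a stand-in; the data values and the three prior variances (playing the role of "default" choices) are assumptions for illustration:

```python
import numpy as np

# Small-sample data: the situation where default priors matter most.
y = np.array([0.8, 1.6, 0.2, 1.1, 0.9])   # n = 5, assumed data
n, ybar = len(y), y.mean()

# Posterior of a normal mean under three conjugate N(m0, v0) priors,
# with known observation variance sigma2 for simplicity.
sigma2 = 1.0
priors = {
    "near-flat (v0=100)":     (0.0, 100.0),
    "vague (v0=10)":          (0.0, 10.0),
    "tight default (v0=0.1)": (0.0, 0.1),
}
for name, (m0, v0) in priors.items():
    post_var = 1.0 / (1.0 / v0 + n / sigma2)
    post_mean = post_var * (m0 / v0 + n * ybar / sigma2)
    print(f"{name}: posterior mean {post_mean:.3f}, sd {post_var ** 0.5:.3f}")
```

With n = 5 the "tight" default pulls the posterior mean far from the sample mean, while the near-flat and vague priors agree closely; rerunning an analysis under such a grid of priors is exactly the kind of sensitivity check the article's step-by-step guide formalises.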

  7. Fast multi-dimensional NMR by minimal sampling

    Science.gov (United States)

    Kupče, Ēriks; Freeman, Ray

    2008-03-01

    A new scheme is proposed for very fast acquisition of three-dimensional NMR spectra based on minimal sampling, instead of the customary step-wise exploration of all of evolution space. The method relies on prior experiments to determine accurate values for the evolving frequencies and intensities from the two-dimensional 'first planes' recorded by setting t1 = 0 or t2 = 0. With this prior knowledge, the entire three-dimensional spectrum can be reconstructed by an additional measurement of the response at a single location (t1∗,t2∗) where t1∗ and t2∗ are fixed values of the evolution times. A key feature is the ability to resolve problems of overlap in the acquisition dimension. Applied to a small protein, agitoxin, the three-dimensional HNCO spectrum is obtained 35 times faster than systematic Cartesian sampling of the evolution domain. The extension to multi-dimensional spectroscopy is outlined.
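The role of the single extra sample can be sketched with a two-peak toy model: the 2-D "first planes" reveal the frequency lists and intensities, and one measurement at a fixed (t1*, t2*) disambiguates which frequency in one dimension pairs with which in the other. All frequencies and amplitudes below are illustrative assumptions:

```python
import numpy as np
from itertools import permutations

# Two-peak toy model; the true (w1, w2) pairings and amplitudes are assumed.
true_pairs = [(1.0, 2.5), (1.7, 0.9)]
amps = [1.0, 0.6]

def signal(t1, t2, pairs, amps):
    """Sum of complex exponentials: the idealised multi-dimensional response."""
    return sum(a * np.exp(1j * (w1 * t1 + w2 * t2))
               for (w1, w2), a in zip(pairs, amps))

# The two "first planes" (t1 = 0 or t2 = 0) give the frequency lists,
# but not which w1 belongs with which w2.
w1s = [p[0] for p in true_pairs]
w2s = [p[1] for p in true_pairs]

# A single extra measurement at fixed evolution times resolves the pairing:
# pick the assignment whose predicted response matches the measurement.
t1s, t2s = 0.8, 1.3
measured = signal(t1s, t2s, true_pairs, amps)
best = min(permutations(w2s),
           key=lambda perm: abs(signal(t1s, t2s, list(zip(w1s, perm)), amps)
                                - measured))
print(list(zip(w1s, best)))
```

For K peaks there are K! candidate pairings per dimension, so a handful of fixed-time samples replaces the full Cartesian grid over the evolution times, which is the source of the 35-fold speed-up quoted above.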

  8. Variational segmentation problems using prior knowledge in imaging and vision

    DEFF Research Database (Denmark)

    Fundana, Ketut

    This dissertation addresses variational formulation of segmentation problems using prior knowledge. Variational models are among the most successful approaches for solving many Computer Vision and Image Processing problems. The models aim at finding the solution to a given energy functional defined......, prior knowledge is needed to obtain the desired solution. The introduction of shape priors in particular, has proven to be an effective way to segment objects of interests. Firstly, we propose a prior-based variational segmentation model to segment objects of interest in image sequences, that can deal....... Many objects have high variability in shape and orientation. This often leads to unsatisfactory results, when using a segmentation model with single shape template. One way to solve this is by using more sophisticated shape models. We propose to incorporate shape priors from a shape sub...

  9. The use of human samples obtained during medicolegal autopsies in research: An introduction to current conditions and initiatives in Japan.

    Science.gov (United States)

    Tsujimura-Ito, Takako; Inoue, Yusuke; Muto, Kaori; Yoshida, Ken-Ichi

    2017-04-01

    Background Leftover samples obtained during autopsies are extremely important basic materials for forensic research. However, there are no established practices for research-related use of obtained samples. Objective This study discusses good practice for the secondary use of samples collected during medicolegal autopsies. Methods A questionnaire was posted to all 76 departments of forensic medicine performing medicolegal autopsies in Japan, and 48 responses were received (response rate: 63.2%). As a secondary analysis, we surveyed information provided on department websites. Results Ethical reviews conducted when samples were to be used for research varied greatly among departments, with 21 (43.8%) departments reporting 'fundamentally, all cases are subject to review', eight (16.7%) reporting 'only some are subject to review' and 17 (39.6%) reporting 'none are subject to review'. Information made available on websites indicated that 11 departments had a statement of some type to bereaved families about the potential research use of human samples obtained during autopsies. Nine of these included a notice stating that bereaved families may revoke their consent for use. Several departments used an opt-out system. Conclusion There is no common practice in the field of legal medicine on the ethical use for medical research of leftover samples from medicolegal autopsies. The trust of not only bereaved families but also society in general is required for the scientific validity and social benefits of medical studies using leftover samples from medicolegal autopsies through the use of opt-out consenting and offline and online dissemination and public-relations activities.

  10. Utility of the microculture method for Leishmania detection in non-invasive samples obtained from a blood bank.

    Science.gov (United States)

    Ates, Sezen Canim; Bagirova, Malahat; Allahverdiyev, Adil M; Kocazeybek, Bekir; Kosan, Erdogan

    2013-10-01

In recent years, the role of donor blood has taken an important place in the epidemiology of leishmaniasis. According to the WHO, the numbers of patients considered as symptomatic are only 5-20% of individuals with asymptomatic leishmaniasis. In this study, for the detection of Leishmania infection in donor blood samples, 343 samples from the Capa Red Crescent Blood Center were obtained and primarily analyzed by microscopic and serological methods. Subsequently, the traditional culture (NNN), immunochromatographic test (ICT) and polymerase chain reaction (PCR) methods were applied to the 21 samples that were found positive by at least one method. Buffy coat (BC) samples from 343 blood donors were analyzed: 15 (4.3%) were positive by a microculture method (MCM), and 4 (1.1%) by smear. The sera of these 343 samples included 9 (2.6%) determined positive by ELISA and 7 (2%) positive by IFAT. Thus, 21 (6.1%) of the 343 subjects studied by smear, MCM, IFAT and ELISA techniques were identified as positive for leishmaniasis by at least one of the techniques, and the sensitivity was assessed. According to our data, the sensitivities of the methods are MCM (71%), smear (19%), IFAT (33%), ELISA (42%), NNN (4%), PCR (14%) and ICT (4%). Thus, with this study, for the first time the sensitivity of a MCM was examined in blood donors by comparing MCM with the methods used in the diagnosis of leishmaniasis. As a result, MCM was found to be the most sensitive method for the detection of Leishmania parasites in samples obtained from a blood bank. In addition, the presence of Leishmania parasites was detected in donor blood in Istanbul, a non-endemic region of Turkey, and these results are of vital importance for the health of blood recipients. Copyright © 2013 Elsevier B.V. All rights reserved.
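The quoted sensitivities follow directly from the counts in the abstract, taking the 21 samples positive by at least one technique as the reference set (minor rounding differences from the quoted percentages are expected):

```python
# Sensitivity of each method = positives by that method / 21 reference positives,
# using the counts quoted in the abstract.
positives_of_21 = {"MCM": 15, "smear": 4, "IFAT": 7, "ELISA": 9}
for method, tp in positives_of_21.items():
    print(f"{method}: {tp}/21 = {100 * tp / 21:.1f}%")
```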

  11. User's guide for polyethylene-based passive diffusion bag samplers to obtain volatile organic compound concentrations in wells. Part 2, Field tests

    Science.gov (United States)

    Vroblesky, Don A.

    2001-01-01

    Diffusion samplers installed in observation wells were found to be capable of yielding representative water samples for chlorinated volatile organic compounds. The samplers consisted of polyethylene bags containing deionized water and relied on diffusion of chlorinated volatile organic compounds through the polyethylene membrane. The known ability of polyethylene to transmit other volatile compounds, such as benzene and toluene, indicates that the samplers can be used for a variety of volatile organic compounds. In wells at the study area, the volatile organic compound concentrations in water samples obtained using the samplers without prior purging were similar to concentrations in water samples obtained from the respective wells using traditional purging and sampling approaches. The low cost associated with this approach makes it a viable option for monitoring large observation-well networks for volatile organic compounds.

  12. Using Priors to Compensate Geometrical Problems in Head-Mounted Eye Trackers

    DEFF Research Database (Denmark)

    Batista Narcizo, Fabricio; Ahmed, Zaheer; Hansen, Dan Witzner

    The use of additional information (a.k.a. priors) to help the eye tracking process is presented as an alternative to compensate classical geometrical problems in head-mounted eye trackers. Priors can be obtained from several distinct sources, such as: sensors to collect information related...... estimation specially for uncalibrated head-mounted setups....

  13. Short communication: Comparison of pH, volatile fatty acids, and microbiome of rumen samples from preweaned calves obtained via cannula or stomach tube.

    Science.gov (United States)

    Terré, M; Castells, L; Fàbregas, F; Bach, A

    2013-08-01

    The objective of this study was to compare rumen samples from young dairy calves obtained via a stomach tube (ST) or a ruminal cannula (RC). Five male Holstein calves (46±4.0 kg of body weight and 11±4.9 d of age) were ruminally cannulated at 15 d of age. Calves received 4 L/d of a commercial milk replacer (25% crude protein and 19.2% fat) at 12.5% dry matter, and were provided concentrate and chopped oats hay ad libitum throughout the study (56 d). In total, 29 paired rumen samples were obtained weekly throughout the study in most of the calves by each extraction method. These samples were used to determine pH and volatile fatty acids (VFA) concentration, and to quantify Prevotella ruminicola and Streptococcus bovis by quantitative PCR. Furthermore, a denaturing gradient gel electrophoresis was performed on rumen samples harvested during wk 8 of the study to determine the degree of similarity between rumen bacteria communities. Rumen pH was 0.30 units greater in ST compared with RC samples. Furthermore, total VFA concentrations were greater in RC than in ST samples. However, when analyzing the proportion of each VFA by ANOVA, no differences were found between the sampling methods. The quantification of S. bovis and P. ruminicola was similar in both extraction methods, and values obtained using different methods were highly correlated (R(2)=0.89 and 0.98 for S. bovis and P. ruminicola, respectively). Fingerprinting analysis showed similar bacteria band profiles between samples obtained from the same calves using different extraction methods. In conclusion, when comparing rumen parameters obtained using different sampling techniques, it is recommended that VFA profiles be used rather than total VFA concentrations, as total VFA concentrations are more affected by the method of collection. Furthermore, although comparisons of pH across studies should be avoided when samples are not obtained using the same sampling method, the comparison of fingerprinting of a

  14. Modified Activated Carbon Prepared from Acorn Shells as a New Solid-Phase Extraction Sorbent for the Preconcentration and Determination of Trace Amounts of Nickel in Food Samples Prior to Flame Atomic Absorption Spectrometry.

    Science.gov (United States)

    Ebrahimi, Bahram

    2017-03-01

A new solid-phase extraction (SPE) sorbent was introduced based on acidic-modified (AM) activated carbon (AC) prepared from acorn shells of native oak trees in Kurdistan. Hydrochloric acid (15%, w/w) and nitric acid (32.5%, w/w) were used to condition and modify the AC. The IR spectra of AC and AM-AC showed that the acidic modification led to the formation of an increased number of acidic functional groups on AM-AC. AM-AC was used in the SPE method for the extraction and preconcentration of Ni2+ prior to flame atomic absorption spectrometric determination at ng/mL levels in model and real food samples. Effective parameters of the SPE procedure, such as the pH of the solutions, sorbent dosage, extraction time, sample volume, type of eluent, and matrix ions, were considered and optimized. An enrichment factor of 140 was obtained. The calibration curve was linear with an R2 of 0.997 in the concentration range of 1-220 ng/mL. The RSD was 5.67% (for n = 7), the LOD was 0.352 ng/mL, and relative recoveries in vegetable samples ranged from 96.7 to 103.7%.

  15. Multiplex preamplification of specific cDNA targets prior to gene expression analysis by TaqMan Arrays

    Directory of Open Access Journals (Sweden)

    Ribal María

    2008-06-01

Background An accurate gene expression quantification using TaqMan Arrays (TA) could be limited by the low RNA quantity obtained from some clinical samples. The novel cDNA preamplification system, the TaqMan PreAmp Master Mix kit (TPAMMK), enables a multiplex preamplification of cDNA targets and could therefore provide a sufficient amount of specific amplicons for their posterior analysis on TA. Findings A multiplex preamplification of 47 genes was performed in 22 samples prior to their analysis by TA, and relative gene expression levels of non-preamplified (NPA) and preamplified (PA) samples were compared. Overall, the mean cycle threshold (CT) decrement in the PA genes was 3.85 (ranging from 2.07 to 5.01). A high correlation (r) was found between the gene expression measurements of NPA and PA samples (mean r = 0.970, ranging from 0.937 to 0.994; p …). Conclusion We demonstrate that cDNA preamplification using the TPAMMK before TA analysis is a reliable approach to simultaneously measure gene expression of multiple targets in a single sample. Moreover, this procedure was validated on genes from degraded RNA samples and on low-abundance expressed genes. This combined methodology could have wide applications in clinical research, where scarce amounts of degraded RNA are usually obtained and several genes need to be quantified in each sample.
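The reported CT decrements translate into fold-amplification via the usual 2**dCT relation (assuming ideal PCR efficiency, i.e. doubling per cycle):

```python
# Cycle-threshold (CT) decrement -> fold-amplification, assuming ideal
# (doubling-per-cycle) PCR efficiency.
for d_ct in (2.07, 3.85, 5.01):   # min, mean, max decrements from the study
    print(f"CT decrement {d_ct}: ~{2 ** d_ct:.1f}-fold preamplification")
```

So the mean decrement of 3.85 cycles corresponds to roughly a 14-fold increase in target, which is what makes low-input clinical samples analysable on the arrays.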

  16. Improving the Pattern Reproducibility of Multiple-Point-Based Prior Models Using Frequency Matching

    DEFF Research Database (Denmark)

    Cordua, Knud Skou; Hansen, Thomas Mejer; Mosegaard, Klaus

    2014-01-01

    Some multiple-point-based sampling algorithms, such as the snesim algorithm, rely on sequential simulation. The conditional probability distributions that are used for the simulation are based on statistics of multiple-point data events obtained from a training image. During the simulation, data...... events with zero probability in the training image statistics may occur. This is handled by pruning the set of conditioning data until an event with non-zero probability is found. The resulting probability distribution sampled by such algorithms is a pruned mixture model. The pruning strategy leads...... to a probability distribution that lacks some of the information provided by the multiple-point statistics from the training image, which reduces the reproducibility of the training image patterns in the outcome realizations. When pruned mixture models are used as prior models for inverse problems, local re...
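The pruning mechanism described above can be sketched in miniature: estimate the conditional probability of the centre value from training-image counts, and when the full conditioning event has zero count, drop conditioning data until a match is found. The striped training image and the conditioning pattern below are illustrative assumptions, not the snesim implementation:

```python
import numpy as np

# Binary training image: vertical stripes of width two (illustrative).
ti = np.zeros((40, 40), dtype=int)
ti[:, ::4] = 1
ti[:, 1::4] = 1

def conditional_prob(ti, conditioning):
    """P(center = 1 | neighbour values) from training-image counts,
    pruning the last-ranked conditioning datum whenever the event
    has zero count in the training image."""
    data = list(conditioning)            # [(dy, dx, value), ...], ranked
    H, W = ti.shape
    while data:
        counts = [0, 0]
        for y in range(1, H - 1):
            for x in range(1, W - 1):
                if all(ti[y + dy, x + dx] == v for dy, dx, v in data):
                    counts[ti[y, x]] += 1
        if sum(counts):
            return counts[1] / sum(counts)
        data.pop()                       # prune: drop the last-ranked datum
    return ti.mean()                     # fall back to the marginal

# left=1 and right=1 never co-occur in this stripe pattern, so the full
# event has zero probability and pruning is forced.
p = conditional_prob(ti, [(0, -1, 1), (0, 1, 1), (-1, 0, 0)])
print(p)
```

After two pruning steps only "left neighbour = 1" remains, which the stripes satisfy equally often with centre 0 and centre 1, so the returned probability is 0.5; the information carried by the discarded data is exactly what the frequency-matching approach aims to preserve.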

  17. 40 CFR 141.703 - Sampling locations.

    Science.gov (United States)

    2010-07-01

    ... samples prior to the point of filter backwash water addition. (d) Bank filtration. (1) Systems that... applicable, must collect source water samples in the surface water prior to bank filtration. (2) Systems that use bank filtration as pretreatment to a filtration plant must collect source water samples from the...

  18. Rapid Fractionation and Isolation of Whole Blood Components in Samples Obtained from a Community-based Setting.

    Science.gov (United States)

    Weckle, Amy; Aiello, Allison E; Uddin, Monica; Galea, Sandro; Coulborn, Rebecca M; Soliven, Richelo; Meier, Helen; Wildman, Derek E

    2015-11-30

    Collection and processing of whole blood samples in a non-clinical setting offers a unique opportunity to evaluate community-dwelling individuals both with and without preexisting conditions. Rapid processing of these samples is essential to avoid degradation of key cellular components. Included here are methods for simultaneous peripheral blood mononuclear cell (PBMC), DNA, RNA and serum isolation from a single blood draw performed in the homes of consenting participants across a metropolitan area, with processing initiated within 2 hr of collection. We have used these techniques to process over 1,600 blood specimens yielding consistent, high quality material, which has subsequently been used in successful DNA methylation, genotyping, gene expression and flow cytometry analyses. Some of the methods employed are standard; however, when combined in the described manner, they enable efficient processing of samples from participants of population- and/or community-based studies who would not normally be evaluated in a clinical setting. Therefore, this protocol has the potential to obtain samples (and subsequently data) that are more representative of the general population.

  19. Bayesian seismic inversion based on rock-physics prior modeling for the joint estimation of acoustic impedance, porosity and lithofacies

    Energy Technology Data Exchange (ETDEWEB)

    Passos de Figueiredo, Leandro, E-mail: leandrop.fgr@gmail.com [Physics Department, Federal University of Santa Catarina, Florianópolis (Brazil); Grana, Dario [Department of Geology and Geophysics, University of Wyoming, Laramie (United States); Santos, Marcio; Figueiredo, Wagner [Physics Department, Federal University of Santa Catarina, Florianópolis (Brazil); Roisenberg, Mauro [Informatic and Statistics Department, Federal University of Santa Catarina, Florianópolis (Brazil); Schwedersky Neto, Guenther [Petrobras Research Center, Rio de Janeiro (Brazil)

    2017-05-01

    We propose a Bayesian approach for seismic inversion to estimate acoustic impedance, porosity and lithofacies within the reservoir, conditioned on post-stack seismic and well data. The link between elastic and petrophysical properties is given by a joint prior distribution for the logarithm of impedance and porosity, based on a rock-physics model. The well conditioning is performed through a background model obtained by well log interpolation. Two different approaches are presented: in the first, the prior is defined by a single Gaussian distribution, whereas in the second it is defined by a Gaussian mixture to represent the multimodal distribution of the well data and link the Gaussian components to different geological lithofacies. The forward model is based on a linearized convolutional model. For the single Gaussian case, we obtain an analytical expression for the posterior distribution, resulting in a fast algorithm to compute the solution of the inverse problem, i.e. the posterior distribution of acoustic impedance and porosity as well as the facies probability given the observed data. For the Gaussian mixture prior, the distributions cannot be obtained analytically, hence we propose a Gibbs algorithm to perform the posterior sampling and obtain several reservoir model realizations, allowing an uncertainty analysis of the estimated properties and lithofacies. Both methodologies are applied to a real seismic dataset with three wells to obtain 3D models of acoustic impedance, porosity and lithofacies. The methodologies are validated through a blind well test and compared to a standard Bayesian inversion approach. Using the probability of the reservoir lithofacies, we also compute a 3D isosurface probability model of the main oil reservoir in the studied field.
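
    The Gibbs sampler described for the Gaussian-mixture prior alternates between sampling the facies indicator and the continuous property. A minimal one-dimensional sketch, with invented means, variances, proportions and datum (none of these numbers are from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative two-facies prior: each facies k has its own Gaussian for
# the (log-)property; the observed datum d is the property plus noise.
mu = np.array([2.0, 4.0])      # facies means (assumed)
sigma2 = np.array([0.3, 0.3])  # facies variances (assumed)
w = np.array([0.5, 0.5])       # prior facies proportions
sigma2_n = 0.2                 # noise variance
d = 3.6                        # observed datum

def gibbs(n_iter=5000):
    m, samples, facies = 3.0, [], []
    for _ in range(n_iter):
        # 1) sample the facies indicator given the current property m
        logp = -0.5 * (m - mu) ** 2 / sigma2 + np.log(w)
        p = np.exp(logp - logp.max()); p /= p.sum()
        k = rng.choice(2, p=p)
        # 2) sample the property given the facies and the datum
        #    (conjugate Gaussian update)
        var = 1.0 / (1.0 / sigma2[k] + 1.0 / sigma2_n)
        mean = var * (mu[k] / sigma2[k] + d / sigma2_n)
        m = rng.normal(mean, np.sqrt(var))
        samples.append(m); facies.append(k)
    return np.array(samples), np.array(facies)

samples, facies = gibbs()
```

    Collecting the chain gives both property realizations and facies probabilities, which is the basis for the uncertainty analysis the abstract mentions.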

  20. Modulation of Cytokine mRNA Expression in Pharyngeal Epithelial Samples obtained from Cattle Infected with Foot-and-Mouth Disease Virus

    DEFF Research Database (Denmark)

    Stenfeldt, Anna Carolina; Heegaard, Peter M. H.; Stockmarr, Anders

    2012-01-01

    A novel technique of endoscopic collection of small tissue samples was used to obtain sequential tissue samples from the dorsal soft palate (DSP) of individual cattle infected with foot-and-mouth disease virus (FMDV) at different phases of the infection. Levels of mRNA encoding interferon (IFN)...

  1. Casingless down-hole for sealing an ablation volume and obtaining a sample for analysis

    Science.gov (United States)

    Noble, Donald T.; Braymen, Steven D.; Anderson, Marvin S.

    1996-10-01

    A casing-less down-hole sampling system for acquiring a subsurface sample for analysis using an inductively coupled plasma system is disclosed. The system includes a probe which is pushed into the formation to be analyzed using a hydraulic ram system. The probe includes a detachable tip member which has a soil point and a barb, with the soil point aiding the penetration of the earth, and the barb causing the tip member to disengage from the probe and remain in the formation when the probe is pulled up. The probe is forced into the formation to be tested, and then pulled up slightly, to disengage the tip member and expose a column of the subsurface formation to be tested. An instrumentation tube mounted in the probe is then extended outward from the probe to longitudinally extend into the exposed column. A balloon seal mounted on the end of the instrumentation tube allows the bottom of the column to be sealed. A source of laser radiation is emitted from the instrumentation tube to ablate a sample from the exposed column. The instrumentation tube can be rotated in the probe to sweep the laser source across the surface of the exposed column. An aerosol transport system carries the ablated sample from the probe to the surface for testing in an inductively coupled plasma system. By testing at various levels in the down-hole as the probe is extracted from the soil, a profile of the subsurface formation may be obtained.

  2. Bayesian linear regression : different conjugate models and their (in)sensitivity to prior-data conflict

    NARCIS (Netherlands)

    Walter, G.M.; Augustin, Th.; Kneib, Thomas; Tutz, Gerhard

    2010-01-01

    The paper is concerned with Bayesian analysis under prior-data conflict, i.e. the situation when observed data are rather unexpected under the prior (and the sample size is not large enough to eliminate the influence of the prior). Two approaches for Bayesian linear regression modeling based on

  3. Applicability of cloud point extraction for the separation of trace amounts of lead ion in environmental and biological samples prior to determination by flame atomic absorption spectrometry

    Directory of Open Access Journals (Sweden)

    Sayed Zia Mohammadi

    2016-09-01

    A sensitive cloud point extraction (CPE) procedure for the preconcentration of trace lead prior to its determination by flame atomic absorption spectrometry (FAAS) has been developed. The CPE method is based on the complexation of the Pb(II) ion with 1-(2-pyridylazo)-2-naphthol (PAN); the complex is then entrapped in the non-ionic surfactant Triton X-114. The main factors affecting CPE efficiency, such as the pH of the sample solution, the concentrations of PAN and Triton X-114, and the equilibration temperature and time, were investigated in detail. A preconcentration factor of 30 was obtained for the preconcentration of the Pb(II) ion from a 15.0 mL solution. Under the optimal conditions, the calibration curve was linear in the range of 7.5 ng mL−1 to 3.5 μg mL−1 of lead with R2 = 0.9998 (n = 10). The detection limit, based on three times the standard deviation of the blank (3Sb), was 5.27 ng mL−1. Eight replicate determinations of 1.0 μg mL−1 lead gave a mean absorbance of 0.275 with a relative standard deviation of 1.6%. The high efficiency of cloud point extraction for the determination of analytes in complex matrices was demonstrated. The proposed method has been applied to the determination of trace amounts of lead in biological and water samples with satisfactory results.
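
    The 3Sb detection limit and relative standard deviation quoted above follow from standard formulas. A sketch of the arithmetic, with invented blank readings, calibration slope and replicate absorbances (none are the study's actual data):

```python
import statistics

# Hypothetical blank absorbances and calibration slope (illustrative only).
blank = [0.010, 0.012, 0.011, 0.009, 0.013, 0.011, 0.010, 0.012, 0.011, 0.012]
slope = 0.000633  # absorbance per (ng/mL), an assumed calibration slope

sb = statistics.stdev(blank)
lod = 3 * sb / slope  # detection limit in ng/mL by the 3Sb criterion

# Relative standard deviation of replicate absorbances of one standard,
# as would be reported for precision (values invented):
reps = [0.275, 0.271, 0.279, 0.273, 0.277, 0.272, 0.278, 0.276]
rsd = 100 * statistics.stdev(reps) / statistics.mean(reps)
```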

  4. Bayesian optimal experimental design for priors of compact support

    KAUST Repository

    Long, Quan

    2016-01-08

    In this study, we optimize the experimental setup computationally by optimal experimental design (OED) in a Bayesian framework. We approximate the posterior probability density functions (pdf) using truncated Gaussian distributions in order to account for the bounded domain of the uniform prior pdf of the parameters. The underlying Gaussian distribution is obtained in the spirit of the Laplace method; more precisely, the mode is chosen as the maximum a posteriori (MAP) estimate, and the covariance is chosen as the negative inverse of the Hessian of the misfit function at the MAP estimate. The model-related entities are obtained from a polynomial surrogate. The optimality, quantified by the information gain measures, can be estimated efficiently by a rejection sampling algorithm against the underlying Gaussian probability distribution, rather than against the true posterior. This approach offers a significant error reduction when the magnitudes of the invariants of the posterior covariance are comparable to the size of the bounded domain of the prior. We demonstrate the accuracy and superior computational efficiency of our method for shock-tube experiments aiming to measure the model parameters of a key reaction which is part of the complex kinetic network describing hydrocarbon oxidation. In the experiments, the initial temperature and fuel concentration are optimized with respect to the expected information gain in the estimation of the parameters of the target reaction rate. We show that the expected information gain surface can change its shape dramatically according to the level of noise introduced into the synthetic data. The information that can be extracted from the data saturates as a logarithmic function of the number of experiments, and few experiments are needed when they are conducted at the optimal experimental design conditions.
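
    The sampling step against the underlying truncated Gaussian can be illustrated on a toy one-dimensional problem; the misfit, MAP estimate, Hessian and prior bounds below are all invented for illustration and are not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1D setting: uniform prior on [0, 1] and a quadratic misfit
# 50*(x - 0.8)**2, so the MAP estimate is 0.8 and the Hessian of the
# misfit there is 100 (all numbers invented).
lo, hi = 0.0, 1.0
map_est = 0.8
var = 1.0 / 100.0  # covariance = inverse Hessian at the MAP

def sample_truncated_laplace(n):
    """Sample the Gaussian (Laplace) surrogate of the posterior,
    truncated to the prior's support by simple rejection."""
    out = np.empty(0)
    while out.size < n:
        x = rng.normal(map_est, np.sqrt(var), size=n)
        out = np.concatenate([out, x[(x >= lo) & (x <= hi)]])
    return out[:n]

samples = sample_truncated_laplace(10_000)
```

    Rejecting against the cheap Gaussian surrogate, rather than the true posterior, is what makes the information-gain estimate inexpensive.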

  5. External Prior Guided Internal Prior Learning for Real-World Noisy Image Denoising

    Science.gov (United States)

    Xu, Jun; Zhang, Lei; Zhang, David

    2018-06-01

    Most of existing image denoising methods learn image priors from either external data or the noisy image itself to remove noise. However, priors learned from external data may not be adaptive to the image to be denoised, while priors learned from the given noisy image may not be accurate due to the interference of corrupted noise. Meanwhile, the noise in real-world noisy images is very complex, which is hard to be described by simple distributions such as Gaussian distribution, making real noisy image denoising a very challenging problem. We propose to exploit the information in both external data and the given noisy image, and develop an external prior guided internal prior learning method for real noisy image denoising. We first learn external priors from an independent set of clean natural images. With the aid of learned external priors, we then learn internal priors from the given noisy image to refine the prior model. The external and internal priors are formulated as a set of orthogonal dictionaries to efficiently reconstruct the desired image. Extensive experiments are performed on several real noisy image datasets. The proposed method demonstrates highly competitive denoising performance, outperforming state-of-the-art denoising methods including those designed for real noisy images.

  6. Investigation into alternative sludge conditioning prior to dewatering

    CSIR Research Space (South Africa)

    Smollen, M

    1997-01-01

    ...have proven that the mixture of char and a small quantity of polyelectrolyte (0.5 to 1 kg per ton of dry solids), used as a conditioner prior to centrifugation and filtration tests, produced a cake solids concentration superior to that obtained by using...

  7. PET reconstruction via nonlocal means induced prior.

    Science.gov (United States)

    Hou, Qingfeng; Huang, Jing; Bian, Zhaoying; Chen, Wufan; Ma, Jianhua

    2015-01-01

    The traditional Bayesian priors for maximum a posteriori (MAP) reconstruction methods usually incorporate local neighborhood interactions that penalize large deviations in parameter estimates for adjacent pixels; therefore, only local pixel differences are utilized. This limits their ability to penalize image roughness. To achieve high-quality PET image reconstruction, this study investigates a MAP reconstruction strategy that incorporates a nonlocal means induced (NLMi) prior (NLMi-MAP), which exploits the global similarity information of the image. The present NLMi prior approximates the derivative of the Gibbs energy function by an NLM filtering process. Specifically, the NLMi prior is obtained by subtracting the current image estimate from its NLM-filtered version and feeding the residual error back to the reconstruction filter to yield the new image estimate. We tested the present NLMi-MAP method with simulated and real PET datasets. Comparison studies with conventional filtered backprojection (FBP) and a few iterative reconstruction methods clearly demonstrate that the present NLMi-MAP method performs better in lowering noise and preserving image edges, and achieves a higher signal-to-noise ratio (SNR). Extensive experimental results show that the NLMi-MAP method outperforms the existing methods in terms of cross profile, noise reduction, SNR, root mean square error (RMSE) and correlation coefficient (CORR).
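
    The core of the NLMi prior, subtracting the NLM-filtered image from the current estimate and feeding the residual back, can be sketched with a deliberately tiny, unoptimized non-local means filter; the patch size, search window and smoothing parameter `h` are arbitrary choices here, not the paper's settings:

```python
import numpy as np

def nlm_filter(img, patch=1, search=3, h=0.5):
    """A tiny non-local means filter (illustrative, not optimized): each
    pixel becomes a weighted mean of pixels whose patches look similar."""
    pad = patch + search
    p = np.pad(img, pad, mode="reflect")
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            ci, cj = i + pad, j + pad
            ref = p[ci-patch:ci+patch+1, cj-patch:cj+patch+1]
            wsum, acc = 0.0, 0.0
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    ni, nj = ci + di, cj + dj
                    cand = p[ni-patch:ni+patch+1, nj-patch:nj+patch+1]
                    w = np.exp(-np.sum((ref - cand) ** 2) / h**2)
                    wsum += w; acc += w * p[ni, nj]
            out[i, j] = acc / wsum
    return out

def nlmi_residual(img):
    """Residual fed back by the NLMi prior: current estimate minus its
    NLM-filtered version."""
    return img - nlm_filter(img)
```

    On a constant image the residual vanishes, while an outlier pixel yields a positive residual that the reconstruction update can then penalize.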

  8. Valid MR imaging predictors of prior knee arthroscopy

    International Nuclear Information System (INIS)

    Discepola, Federico; Le, Huy B.Q.; Park, John S.; Clopton, Paul; Knoll, Andrew N.; Austin, Matthew J.; Resnick, Donald L.

    2012-01-01

    To determine whether fibrosis of the medial patellar reticulum (MPR), lateral patellar reticulum (LPR), deep medial aspect of Hoffa's fat pad (MDH), or deep lateral aspect of Hoffa's fat pad (LDH) is a valid predictor of prior knee arthroscopy. Institutional review board approval and waiver of informed consent were obtained for this HIPAA-compliant study. Initially, fibrosis of the MPR, LPR, MDH, or LDH in MR imaging studies of 50 patients with prior knee arthroscopy and 100 patients without was recorded. Subsequently, two additional radiologists, blinded to clinical data, retrospectively and independently recorded the presence of fibrosis of the MPR in 50 patients with prior knee arthroscopy and 50 without. Sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and accuracy for detecting the presence of fibrosis in the MPR were calculated. κ statistics were used to analyze inter-observer agreement. Fibrosis of each of the regions examined during the first portion of the study showed a significant association with prior knee arthroscopy (p < 0.005 for each). A patient with fibrosis of the MPR, LDH, or LPR was 45.5, 9, or 3.7 times more likely, respectively, to have had a prior knee arthroscopy. Logistic regression analysis indicated that fibrosis of the MPR supplanted the diagnostic utility of identifying fibrosis of the LPR, LDH, or MDH, or combinations of these (p ≥ 0.09 for all combinations). In the second portion of the study, fibrosis of the MPR demonstrated a mean sensitivity of 82%, specificity of 72%, PPV of 75%, NPV of 81%, and accuracy of 77% for predicting prior knee arthroscopy. Analysis of MR images can be used to determine if a patient has had prior knee arthroscopy by identifying fibrosis of the MPR, LPR, MDH, or LDH. Fibrosis of the MPR was the strongest predictor of prior knee arthroscopy. (orig.)
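
    The accuracy measures reported in this record all follow from a 2×2 table. A sketch of the calculation, with illustrative counts chosen only to be consistent with the 50/50 second-phase design (the per-reader tables are not given in the abstract):

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Standard 2x2 diagnostic accuracy measures, as fractions."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fn + tn + fp),
    }

# Hypothetical counts: 50 knees with prior arthroscopy, 50 without.
m = diagnostic_metrics(tp=41, fn=9, tn=36, fp=14)
```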

  9. Valid MR imaging predictors of prior knee arthroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Discepola, Federico; Le, Huy B.Q. [McGill University Health Center, Jewsih General Hospital, Division of Musculoskeletal Radiology, Montreal, Quebec (Canada); Park, John S. [Annapolis Radiology Associates, Division of Musculoskeletal Radiology, Annapolis, MD (United States); Clopton, Paul; Knoll, Andrew N.; Austin, Matthew J.; Resnick, Donald L. [University of California San Diego (UCSD), Division of Musculoskeletal Radiology, San Diego, CA (United States)

    2012-01-15

    To determine whether fibrosis of the medial patellar reticulum (MPR), lateral patellar reticulum (LPR), deep medial aspect of Hoffa's fat pad (MDH), or deep lateral aspect of Hoffa's fat pad (LDH) is a valid predictor of prior knee arthroscopy. Institutional review board approval and waiver of informed consent were obtained for this HIPAA-compliant study. Initially, fibrosis of the MPR, LPR, MDH, or LDH in MR imaging studies of 50 patients with prior knee arthroscopy and 100 patients without was recorded. Subsequently, two additional radiologists, blinded to clinical data, retrospectively and independently recorded the presence of fibrosis of the MPR in 50 patients with prior knee arthroscopy and 50 without. Sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and accuracy for detecting the presence of fibrosis in the MPR were calculated. κ statistics were used to analyze inter-observer agreement. Fibrosis of each of the regions examined during the first portion of the study showed a significant association with prior knee arthroscopy (p < 0.005 for each). A patient with fibrosis of the MPR, LDH, or LPR was 45.5, 9, or 3.7 times more likely, respectively, to have had a prior knee arthroscopy. Logistic regression analysis indicated that fibrosis of the MPR supplanted the diagnostic utility of identifying fibrosis of the LPR, LDH, or MDH, or combinations of these (p ≥ 0.09 for all combinations). In the second portion of the study, fibrosis of the MPR demonstrated a mean sensitivity of 82%, specificity of 72%, PPV of 75%, NPV of 81%, and accuracy of 77% for predicting prior knee arthroscopy. Analysis of MR images can be used to determine if a patient has had prior knee arthroscopy by identifying fibrosis of the MPR, LPR, MDH, or LDH. Fibrosis of the MPR was the strongest predictor of prior knee arthroscopy. (orig.)

  10. Bayesian optimal experimental design for priors of compact support

    KAUST Repository

    Long, Quan

    2016-01-01

    to account for the bounded domain of the uniform prior pdf of the parameters. The underlying Gaussian distribution is obtained in the spirit of the Laplace method, more precisely, the mode is chosen as the maximum a posteriori (MAP) estimate

  11. Sampling and examination methods used for TMI-2 samples

    International Nuclear Information System (INIS)

    Marley, A.W.; Akers, D.W.; McIsaac, C.V.

    1988-01-01

    The purpose of this paper is to summarize the sampling and examination techniques that were used in the collection and analysis of TMI-2 samples. Samples ranging from auxiliary building air to core debris were collected and analyzed. Handling of the larger samples and many of the smaller samples had to be done remotely, and many standard laboratory analytical techniques were modified to accommodate the extremely high radiation fields associated with these samples. The TMI-2 samples presented unique problems with the sampling and laboratory analysis of previously molten fuel debris. 14 refs., 8 figs

  12. Gravimetric dust sampling for control purposes and occupational dust sampling.

    CSIR Research Space (South Africa)

    Unsted, AD

    1997-02-01

    Prior to the introduction of gravimetric dust sampling, konimeters had been used for dust sampling, which was largely for control purposes. Whether or not absolute results were achievable was not an issue, since relative results were used to evaluate...

  13. Threat-responsiveness and the decision to obtain free influenza vaccinations among the older adults in Taiwan.

    Science.gov (United States)

    Li, Ying-Chun; Liu, Chi-Mei

    2009-07-31

    Although older adults are encouraged by government agencies to receive influenza vaccinations, many do not obtain them. In Taiwan, where universal health care coverage has significantly reduced the barriers to access to care, the health care system has provided free influenza vaccinations for people 65 years or older since 2001. Nevertheless, the number of people who use this service is much lower than expected. The aim of this study was to explore major factors that might affect the decision to receive influenza vaccinations among older adults in Taiwan. Using nationally representative health insurance medical claims from the National Health Insurance Research Database between 2002 and 2004, we investigated the role of threat-responsiveness, represented by prior vaccinations and prior physician visits for flu-like respiratory conditions, in the decisions of older adults to obtain vaccinations in Taiwan. Among the sample of 23,023 older adults, the overall yearly vaccination rates in this study were 38.6%, 44.3% and 39.3% for 2002, 2003, and 2004, respectively. Adjusting for covariates of individual and health care facility characteristics, the multivariate logistic regression revealed that older adults who had had prior vaccinations were ten times more likely to be vaccinated during the following influenza season than those who had not (OR=10.22, 95% CI: 9.82-10.64). The greater the frequency of prior physician visits for flu-like respiratory conditions, the greater the likelihood that one would decide to be vaccinated. Visits during the prior interim (non-epidemic) season exerted a stronger positive influence on this likelihood than visits during the prior influenza season (OR=1.59, 95% CI: 1.46-1.73 vs. OR=1.11, 95% CI: 1.01-1.22, respectively). Threat-responsiveness, or perceived risk, greatly influences influenza vaccination rates among older adults in Taiwan. These findings can be used to help design public health campaigns to increase the influenza vaccination rate in this
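
    Odds ratios such as the OR = 10.22 above come from logistic regression; the unadjusted version of the calculation, with its Wald confidence interval, can be sketched on an invented 2×2 table (the counts below are hypothetical, not the study's data):

```python
import math

# Hypothetical 2x2 table: prior vaccination status (rows) versus
# vaccination in the current season (columns). Counts are illustrative.
a, b = 6000, 2000   # prior vaccinees: vaccinated / not vaccinated this season
c, d = 3000, 12000  # no prior vaccination: vaccinated / not vaccinated

or_ = (a * d) / (b * c)                      # unadjusted odds ratio
se = math.sqrt(1/a + 1/b + 1/c + 1/d)        # SE of log(OR)
ci_lo = math.exp(math.log(or_) - 1.96 * se)  # 95% Wald interval
ci_hi = math.exp(math.log(or_) + 1.96 * se)
```

    The multivariate OR in the abstract additionally adjusts for individual and facility covariates, which this sketch omits.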

  14. Radiochemical data obtained by α spectrometry on unrecrystallized fossil coral samples from the Egyptian shoreline of the north-western Red Sea

    International Nuclear Information System (INIS)

    Choukri, A.; Hakam, O.K.; Reyss, J.L.; Plaziat, J.C.

    2007-01-01

    In this work, radiochemical results obtained by α spectrometry on 80 unrecrystallized fossil coral samples from the Egyptian shoreline of the north-western Red Sea are presented and discussed. The coral samples were collected in Egypt from the emerged 5e coral reef terraces over 500 km, from the Ras Gharib-Ras Shukeir depression (28°10') in the north to Wadi Lahami (north of Ras Banas, 24°10') in the south. The statistical description of the radiochemical results (concentrations of U and Th radioisotopes, 234U/238U activity ratios and ages) obtained on a great number of coral samples showed that it is possible to establish methodological criteria that can be used to validate the measured ages before confronting them with the geological context of the sampling sites. The obtained results confirm that in the unrecrystallized corals (low in 232Th), 238U varies between 2.2 and 4.9 ppm around an average of 3.18±0.65 ppm. The 234U/238U activity ratios are between 1.08 and 1.28, with an average value of 1.164±0.016, which exceeds that of present-day sea water but is in agreement with the ratio of 1.16 measured by precise mass spectrometry in many Pleistocene coral samples. Except for three samples dated at least 100 ka, the radiochemical ages of the 5e coral samples vary between 108 and 131 ka, with an average value of 122.2 ka and a standard deviation of 4.3 ka. Except for samples from the Zeit area, the reef terrace lies between 2 and 6 m above the present sea level. This position is similar to the highest sea level of the last interglacial according to the glacio-isostatic rebound calculated for stable regions. This work proves that the large tectonic motions which affected the studied area after the Oligocene ceased since at least the last interglacial period.

  15. Evaluating the optimum rest period prior to blood collection for fractionated plasma free metanephrines analysis

    LENUS (Irish Health Repository)

    Griffin, T.P.

    2016-05-01

    The high diagnostic accuracy of plasma metanephrines (PMets) in the diagnosis of Phaeochromocytoma/Paraganglioma (PPGL) is well established. Considerable controversy exists regarding the optimum sampling conditions for PMets. The use of reference intervals that do not compromise diagnostic sensitivity is recommended. However, the optimum rest period prior to sampling has yet to be clearly established. The aim of this study was to evaluate PMets concentrations in paired blood samples collected following 30 and 40 min of seated rest prior to sampling, in patients in whom it was clinically reasonable to suspect that PPGL may be present.
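
    The paired 30- versus 40-minute design reduces to a paired-difference comparison. A sketch of the computation with invented metanephrine concentrations (the values below are illustrative only, not study data):

```python
import statistics

# Hypothetical paired metanephrine concentrations (pmol/L) measured after
# 30 and 40 minutes of seated rest in the same patients.
rest30 = [310, 295, 402, 350, 288, 330, 315, 299]
rest40 = [305, 290, 398, 355, 280, 325, 310, 301]

diffs = [a - b for a, b in zip(rest30, rest40)]
mean_diff = statistics.mean(diffs)
# Paired t statistic: mean difference over its standard error.
t = mean_diff / (statistics.stdev(diffs) / len(diffs) ** 0.5)
```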

  16. Hair of the dog: obtaining samples from coyotes and wolves noninvasively

    Science.gov (United States)

    Ausband, David E.; Young, Julie; Fannin, Barbara; Mitchell, Michael S.; Stenglein, Jennifer L.; Waits, Lisette P.; Shivik, John A.

    2011-01-01

    Canids can be difficult to detect and their populations difficult to monitor. We tested whether hair samples could be collected from coyotes (Canis latrans) in Texas, USA and gray wolves (C. lupus) in Montana, USA using lure to elicit rubbing behavior at both man-made and natural collection devices. We used mitochondrial and nuclear DNA to determine whether collected hair samples were from coyote, wolf, or nontarget species. Both coyotes and wolves rubbed on man-made barbed surfaces but coyotes in Texas seldom rubbed on hanging barbed surfaces. Wolves in Montana showed a tendency to rub at stations where natural-material collection devices (sticks and debris) were present. Time to detection was relatively short (5 nights and 4 nights for coyotes and wolves, respectively) with nontarget and unknown species comprising approximately 26% of the detections in both locations. Eliciting rubbing behavior from coyotes and wolves using lures has advantages over opportunistic genetic sampling methods (e.g., scat transects) because it elicits a behavior that deposits a hair sample at a fixed sampling location, thereby increasing the efficiency of sampling for these canids. Hair samples from rub stations could be used to provide estimates of abundance, measures of genetic diversity and health, and detection-nondetection data useful for cost-effective population monitoring.

  17. pH adjustment of human blood plasma prior to bioanalytical sample preparation

    NARCIS (Netherlands)

    Hendriks, G.; Uges, D. R. A.; Franke, J. P.

    2008-01-01

    pH adjustment in bioanalytical sample preparation concerning ionisable compounds is one of the most common sample treatments. This is often done by mixing an aliquot of the sample with a proper buffer adjusted to the proposed pH. The pH of the resulting mixture however, does not necessarily have to

  18. Influence of sintering parameters on the ferroelectric properties of strontium bismuth tantalate samples obtained by oxide mixture

    International Nuclear Information System (INIS)

    Souza, R.R. de; Pereira, A.S.; Sousa, V.C.; Egea, J.R.J.

    2012-01-01

    The family of layered perovskite-type compounds known as the Aurivillius phases presents a great alternative, not only because of the absence of lead in their composition, but also because of their polarization retention, replacing PZT in FeRAM devices. Strontium bismuth tantalate (SrBi2Ta2O9, or SBT) is a ferroelectric material that has attracted considerable interest, since it has high fatigue resistance, sustaining repeated hysteresis loops with changes in polarization. By checking polarization and depolarization currents stimulated by temperature, it is possible to obtain, for example, information about the nature of the charges and about the activation energy of the dielectric relaxation process. For analysis of the ferroelectric properties of this compound, it is essential to obtain specimens with a relative density around 95%. Thus, optimization of the sintering process is important in order to obtain a ceramic body with high densification. The influence of the sintering parameters used to obtain SrBi2Ta2O9 on the polarization properties and microstructure of the sintered samples was investigated by thermostimulated currents and electron microscopy, respectively. Results show that variation of these parameters may cause changes in the ferroelectric properties of the material. (author)

  19. Comparison between two sampling methods by results obtained using petrographic techniques, specially developed for minerals of the Itataia uranium phosphate deposit, Ceara, Brazil

    International Nuclear Information System (INIS)

    Salas, H.T.; Murta, R.L.L.

    1985-01-01

    The results of a comparison of two sampling methods applied to a gallery of the uranium-phosphate ore body of Itataia, Ceará State, Brazil, along 235 metres of mineralized zone, are presented. The results were obtained through petrographic techniques especially developed for and applied to both samplings. In the first, hand samples from systematic sampling at intervals of 2 metres were studied, after which estimated mineralogical composition studies were carried out and some petrogenetic observations were verified for the first time. The second sampling was made at intervals of 20 metres; 570 tons of ore were extracted and distributed in sections, and a sample representing each section was studied after crushing at -65 mesh. Their mineralogy was quantified and the degree of liberation of apatite calculated. Based on the mineralogical data obtained, it was possible to represent both samplings and to compare the main mineralogical groups (phosphates, carbonates and silicates). In spite of the different methods and methodology used, and the stockwork type of mineralization being quite irregular, the results were satisfactory. (Author) [pt

  20. Interpretation of secondary electron images obtained using a low vacuum SEM

    International Nuclear Information System (INIS)

    Toth, M.; Thiel, B.L.; Donald, A.M.

    2003-01-01

    Charging of insulators in a variable pressure environment was investigated in the context of secondary electron (SE) image formation. Sample charging and ionized gas molecules present in a low vacuum specimen chamber can give rise to SE image contrast. 'Charge-induced' SE contrast reflects lateral variations in the charge state of a sample caused by electron irradiation during and prior to image acquisition. This contrast corresponds to SE emission current alterations produced by sub-surface charge deposited by the electron beam. 'Ion-induced' contrast results from spatial inhomogeneities in the extent of SE signal inhibition caused by ions in the gaseous environment of a low vacuum scanning electron microscope (SEM). The inhomogeneities are caused by ion focusing onto regions of a sample that correspond to local minima in the magnitude of the surface potential (generated by sub-surface trapped charge), or topographic asperities. The two types of contrast exhibit characteristic dependencies on microscope operating parameters such as scan speed, beam current, gas pressure, detector bias and working distance. These dependencies, explained in terms of the behavior of the gaseous environment and sample charging, can serve as a basis for a correct interpretation of SE images obtained using a low vacuum SEM

  1. Active Prior Tactile Knowledge Transfer for Learning Tactual Properties of New Objects

    Directory of Open Access Journals (Sweden)

    Di Feng

    2018-02-01

    Reusing the tactile knowledge of some previously-explored objects (prior objects) helps us to easily recognize the tactual properties of new objects. In this paper, we enable a robotic arm equipped with multi-modal artificial skin, like humans, to actively transfer prior tactile exploratory action experiences when it learns the detailed physical properties of new objects. These experiences, or prior tactile knowledge, are built from the feature observations that the robot perceives from multiple sensory modalities when it applies pressing, sliding, and static contact movements to objects with different action parameters. We call our method Active Prior Tactile Knowledge Transfer (APTKT), and systematically evaluated its performance in several experiments. Results show that the robot improved the discrimination accuracy by around 10% when it used only one training sample with the feature observations of prior objects. By further incorporating the predictions from the observation models of prior objects as auxiliary features, our method improved the discrimination accuracy by over 20%. The results also show that the proposed method is robust against transferring irrelevant prior tactile knowledge (negative knowledge transfer).

  2. Study of the anticorrosive behaviour of Al 6061 samples covered with Ni-P alloys obtained by the autocatalytic method

    International Nuclear Information System (INIS)

    Castro, M. E; Barbero, J. A; Bubach, E

    2006-01-01

    There are many ways to protect an industrial material from corrosion attack. One is to cover the piece with a layer of another material whose corrosion resistance is higher than that of the element to be protected. The anticorrosion protection mechanism is achieved by the formation of a physical, pore-free barrier without any defects, which prevents the arrival of the environmental agents responsible for electrochemical attack. In this paper, the corrosion resistance of metallic coatings over nuclear-grade aluminum samples is analyzed. Our interest is aimed at nickel-phosphorus alloy coatings (Ni-P) obtained by the electroless (autocatalytic) method over Al 6061 alloy samples. A comparative study is carried out with different phosphorus contents, always under 12%. The work is completed with another nickel coating, Vitrovac 0080 (with no phosphorus content), in order to compare structures and anticorrosive properties. In addition, a comparison between the mentioned materials and bare aluminum samples is made. The study relies on surface characterization of each sample, with or without coating, through a series of complementary techniques: chemical, electrochemical (linear sweep voltammetry, cyclic voltammetry, polarization resistance determination) and physical (scanning electron microscopy, hardness determination). Finally, the variables are correlated as a function of the phosphorus content of the samples. The coating structure obtained is amorphous; it presents no pores or failures, and its hardness reaches substantial values. The electrochemical analysis confirms that the anticorrosive protection capacity of the Ni-P alloy increases with the phosphorus content of the coating, whereas Al 6061 by itself shows electrochemically poor behaviour.

  3. Characterization and obtainment of phosphate rock concentrates of Turmequé, Boyacá

    Science.gov (United States)

    Zanguña, S. Quijano; Lozano Gómez, L. F.; Pineda Triana, Y.

    2017-12-01

    The work focuses on the use and exploitation of mineral concentrates from phosphate rock (PR) coming from mines with a low percentage of phosphorus. The procedure was based on the collection of a source of phosphate rock from the department of Boyacá (municipality of Turmequé), using a randomized design with three replications. The samples were initially milled and sifted using meshes between 140 and 200 US standard, homogenizing them and improving the solubility of the phosphorus in the soil. We conducted zeta-potential tests, which show that performing a prior wash on the mineral, and maintaining defined concentrations and pH, achieves better results in terms of the buoyancy of the particles in the flotation process. The results of the microflotation tests, both direct and inverse, together with the chemical composition determined by X-Ray Fluorescence (XRF) and X-Ray Diffraction (XRD) before and after the microflotation process, confirm that the protocol used increases the total phosphorus content of the collected sample by 9%, yielding a commercial-grade phosphate rock concentrate. These phosphate rock concentrates could be used in the future for the production of simple superphosphate (SSP), with the help of sulphuric acid and ammonium thiosulphate mixtures.

  4. Analytical dual-energy microtomography: A new method for obtaining three-dimensional mineral phase images and its application to Hayabusa samples

    Science.gov (United States)

    Tsuchiyama, A.; Nakano, T.; Uesugi, K.; Uesugi, M.; Takeuchi, A.; Suzuki, Y.; Noguchi, R.; Matsumoto, T.; Matsuno, J.; Nagano, T.; Imai, Y.; Nakamura, T.; Ogami, T.; Noguchi, T.; Abe, M.; Yada, T.; Fujimura, A.

    2013-09-01

    We developed a novel technique called "analytical dual-energy microtomography" that uses the linear attenuation coefficients (LACs) of minerals at two different X-ray energies to nondestructively obtain three-dimensional (3D) images of mineral distribution in materials such as rock specimens. The two energies are above and below the absorption edge energy of an abundant element, which we call the "index element". The chemical compositions of minerals forming solid solution series can also be measured. The optimal size of a sample is of the order of the inverse of the LAC values at the X-ray energies used. We used synchrotron-based microtomography with an effective spatial resolution of >200 nm to apply this method to small particles (30-180 μm) collected from the surface of asteroid 25143 Itokawa by the Hayabusa mission of the Japan Aerospace Exploration Agency (JAXA). A 3D distribution of the minerals was successfully obtained by imaging the samples at X-ray energies of 7 and 8 keV, using Fe as the index element (the K-absorption edge of Fe is 7.11 keV). The optimal sample size in this case is of the order of 50 μm. The chemical compositions of the minerals, including the Fe/Mg ratios of ferromagnesian minerals and the Na/Ca ratios of plagioclase, were measured. This new method is potentially applicable to other small samples such as cosmic dust, lunar regolith, cometary dust (recovered by the Stardust mission of the National Aeronautics and Space Administration [NASA]), and samples from extraterrestrial bodies (those from future sample return missions such as the JAXA Hayabusa2 mission and the NASA OSIRIS-REx mission), although limitations exist for unequilibrated samples. Further, this technique is generally suited for studying materials in multicomponent systems with multiple phases across several research fields.
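    The core identification step, assigning each voxel to a mineral from its LAC pair measured above and below the index element's absorption edge, can be sketched as a nearest-neighbour lookup in (LAC at 7 keV, LAC at 8 keV) space. The mineral names are from the abstract, but the reference LAC values below are illustrative placeholders, not measured coefficients; Fe-bearing phases show a large jump across the Fe K edge.

```python
import math

# Hypothetical reference LAC pairs (cm^-1) at 7 keV and 8 keV, i.e. below and
# above the Fe K absorption edge (7.11 keV); values are placeholders.
REFERENCE = {
    "olivine":     (95.0, 260.0),
    "plagioclase": (40.0,  55.0),
    "troilite":    (310.0, 900.0),
}

def classify_voxel(lac_7kev, lac_8kev):
    """Assign a voxel to the reference mineral with the nearest LAC pair."""
    return min(REFERENCE, key=lambda m: math.hypot(
        REFERENCE[m][0] - lac_7kev, REFERENCE[m][1] - lac_8kev))
```

A solid-solution composition (e.g. an Fe/Mg ratio) would additionally be read off from the voxel's position along the LAC curve between end-member values, which this sketch omits.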

  5. Identification of subsurface structures using electromagnetic data and shape priors

    Energy Technology Data Exchange (ETDEWEB)

    Tveit, Svenn, E-mail: svenn.tveit@uni.no [Uni CIPR, Uni Research, Bergen 5020 (Norway); Department of Mathematics, University of Bergen, Bergen 5020 (Norway); Bakr, Shaaban A., E-mail: shaaban.bakr1@gmail.com [Department of Mathematics, Faculty of Science, Assiut University, Assiut 71516 (Egypt); Uni CIPR, Uni Research, Bergen 5020 (Norway); Lien, Martha, E-mail: martha.lien@octio.com [Uni CIPR, Uni Research, Bergen 5020 (Norway); Octio AS, Bøhmergaten 44, Bergen 5057 (Norway); Mannseth, Trond, E-mail: trond.mannseth@uni.no [Uni CIPR, Uni Research, Bergen 5020 (Norway); Department of Mathematics, University of Bergen, Bergen 5020 (Norway)

    2015-03-01

    We consider the inverse problem of identifying large-scale subsurface structures using the controlled source electromagnetic method. To identify structures in the subsurface where the contrast in electric conductivity can be small, regularization is needed to bias the solution towards preserving structural information. We propose to combine two approaches for regularization of the inverse problem. In the first approach we utilize a model-based, reduced, composite representation of the electric conductivity that is highly flexible, even for a moderate number of degrees of freedom. With a low number of parameters, the inverse problem is efficiently solved using a standard, second-order gradient-based optimization algorithm. Further regularization is obtained using structural prior information, available, e.g., from interpreted seismic data. The reduced conductivity representation is suitable for incorporation of structural prior information. Such prior information cannot, however, be accurately modeled with a Gaussian distribution. To alleviate this, we incorporate the structural information using shape priors. The shape prior technique requires the choice of a kernel function, which is application dependent. We argue for using the conditionally positive definite kernel, which is shown to have computational advantages over the commonly applied Gaussian kernel for our problem. Numerical experiments on various test cases show that the methodology is able to identify fairly complex subsurface electric conductivity distributions while preserving structural prior information during the inversion.

  6. Comparison of the diagnostic performance of bacterial culture of nasopharyngeal swab and bronchoalveolar lavage fluid samples obtained from calves with bovine respiratory disease.

    Science.gov (United States)

    Capik, Sarah F; White, Brad J; Lubbers, Brian V; Apley, Michael D; DeDonder, Keith D; Larson, Robert L; Harhay, Greg P; Chitko-McKown, Carol G; Harhay, Dayna M; Kalbfleisch, Ted S; Schuller, Gennie; Clawson, Michael L

    2017-03-01

    OBJECTIVE To compare predictive values, extent of agreement, and gamithromycin susceptibility between bacterial culture results of nasopharyngeal swab (NPS) and bronchoalveolar lavage fluid (BALF) samples obtained from calves with bovine respiratory disease (BRD). ANIMALS 28 beef calves with clinical BRD. PROCEDURES Pooled bilateral NPS samples and BALF samples were obtained for bacterial culture from calves immediately before and at various times during the 5 days after gamithromycin (6 mg/kg, SC, once) administration. For each culture-positive sample, up to 12 Mannheimia haemolytica, 6 Pasteurella multocida, and 6 Histophilus somni colonies underwent gamithromycin susceptibility testing. Whole-genome sequencing was performed on all M haemolytica isolates. For paired NPS and BALF samples collected 5 days after gamithromycin administration, the positive and negative predictive values for culture results of NPS samples relative to those of BALF samples and the extent of agreement between the sampling methods were determined. RESULTS Positive and negative predictive values of NPS samples were 67% and 100% for M haemolytica, 75% and 100% for P multocida, and 100% and 96% for H somni. Extent of agreement between results for NPS and BALF samples was substantial for M haemolytica (κ, 0.71) and H somni (κ, 0.78) and almost perfect for P multocida (κ, 0.81). Gamithromycin susceptibility varied within the same sample and between paired NPS and BALF samples. CONCLUSIONS AND CLINICAL RELEVANCE Results indicated culture results of NPS and BALF samples from calves with BRD should be interpreted cautiously considering disease prevalence within the population, sample collection relative to antimicrobial administration, and limitations of diagnostic testing methods.
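    The reported predictive values and κ statistics follow from a standard 2 × 2 contingency table of paired culture results. A minimal sketch follows; the counts used in the usage example are hypothetical, chosen only to be consistent with the reported 67%/100% predictive values for M haemolytica, not taken from the study's data tables.

```python
def predictive_values(tp, fp, fn, tn):
    """PPV and NPV of one test (e.g. NPS culture) relative to a reference
    (e.g. BALF culture), from 2x2 counts: tp = both positive, fp = test
    positive/reference negative, fn = test negative/reference positive,
    tn = both negative."""
    return tp / (tp + fp), tn / (tn + fn)

def cohens_kappa(tp, fp, fn, tn):
    """Cohen's kappa: chance-corrected agreement between the two methods."""
    n = tp + fp + fn + tn
    p_observed = (tp + tn) / n
    p_chance = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    return (p_observed - p_chance) / (1 - p_chance)
```

For example, hypothetical counts of 6/3/0/19 across 28 paired samples reproduce a PPV of 67% with an NPV of 100%, and a κ in the "substantial agreement" band.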

  7. Real-time single image dehazing based on dark channel prior theory and guided filtering

    Science.gov (United States)

    Zhang, Zan

    2017-10-01

    Images and videos taken outdoors on foggy days are seriously degraded. In order to restore images degraded by fog, and to overcome the traditional dark channel prior algorithm's problem of residual haze at edges, we propose a new dehazing method. We first find the fog region in the dark channel map using a quadtree search to obtain an estimate of the transmittance. Then we treat the gray-scale image after guided filtering as an atmospheric light map and remove haze based on it. Box filtering and image downsampling are also used to improve the processing speed. Finally, the atmospheric light scattering model is used to restore the image. Extensive experiments show that the algorithm is effective, efficient and has a wide range of application.
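    The classic dark channel prior pipeline that this method builds on can be sketched as follows. This is a minimal baseline: the dark channel gives a transmission estimate t(x) = 1 − ω·dark(I/A), and the scattering model is inverted as J = (I − A)/max(t, t0) + A. The paper's quadtree search, guided filtering, box processing and downsampling refinements are omitted, and the atmospheric light A is assumed known.

```python
def dark_channel(img, patch=1):
    """Dark channel of an H x W x 3 image (nested lists, values in [0, 1]):
    minimum over color channels, then a local minimum filter."""
    h, w = len(img), len(img[0])
    mins = [[min(img[i][j]) for j in range(w)] for i in range(h)]
    return [[min(mins[y][x]
                 for y in range(max(0, i - patch), min(h, i + patch + 1))
                 for x in range(max(0, j - patch), min(w, j + patch + 1)))
             for j in range(w)] for i in range(h)]

def dehaze(img, A=0.95, omega=0.95, t0=0.1):
    """Invert the atmospheric scattering model I = J*t + A*(1 - t)."""
    scaled = [[[c / A for c in px] for px in row] for row in img]
    dc = dark_channel(scaled)
    out = []
    for i, row in enumerate(img):
        out_row = []
        for j, px in enumerate(row):
            t = max(1.0 - omega * dc[i][j], t0)  # transmission, floored at t0
            out_row.append([(c - A) / t + A for c in px])
        out.append(out_row)
    return out
```

The floor t0 prevents division blow-up in dense haze; ω < 1 keeps a trace of haze for depth perception.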

  8. Racing Sampling Based Microimmune Optimization Approach Solving Constrained Expected Value Programming

    Directory of Open Access Journals (Sweden)

    Kai Yang

    2016-01-01

    Full Text Available This work investigates a bioinspired microimmune optimization algorithm to solve a general kind of single-objective nonlinear constrained expected value programming without any prior distribution. In the study of the algorithm, two lower-bound sample estimates of random variables are theoretically developed to estimate the empirical values of individuals. Two adaptive racing sampling schemes are designed to identify the competitive individuals in a given population, by which high-quality individuals can obtain a large sampling size. An immune evolutionary mechanism, along with a local search approach, is constructed to evolve the current population. Comparative experiments have shown that the proposed algorithm can effectively solve higher-dimensional benchmark problems and has potential for further applications.

  9. Finding A Minimally Informative Dirichlet Prior Using Least Squares

    International Nuclear Information System (INIS)

    Kelly, Dana

    2011-01-01

    In a Bayesian framework, the Dirichlet distribution is the conjugate distribution to the multinomial likelihood function, and so the analyst is required to develop a Dirichlet prior that incorporates available information. However, as it is a multiparameter distribution, choosing the Dirichlet parameters is less straightforward than choosing a prior distribution for a single parameter, such as p in the binomial distribution. In particular, one may wish to incorporate limited information into the prior, resulting in a minimally informative prior distribution that is responsive to updates with sparse data. In the case of binomial p or Poisson λ, the principle of maximum entropy can be employed to obtain a so-called constrained noninformative prior. However, even in the case of p, such a distribution cannot be written down in the form of a standard distribution (e.g., beta, gamma), and so a beta distribution is used as an approximation in the case of p. In the case of the multinomial model with parametric constraints, the approach of maximum entropy does not appear tractable. This paper presents an alternative approach, based on constrained minimization of a least-squares objective function, which leads to a minimally informative Dirichlet prior distribution. The alpha-factor model for common-cause failure, which is widely used in the United States, is the motivation for this approach, and is used to illustrate the method. In this approach to modeling common-cause failure, the alpha-factors, which are the parameters in the underlying multinomial model for common-cause failure, must be estimated from data that are often quite sparse, because common-cause failures tend to be rare, especially failures of more than two or three components, and so a prior distribution that is responsive to updates with sparse data is needed.
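    The conjugacy that motivates this work is easy to sketch: a Dirichlet prior updated with multinomial counts yields a Dirichlet posterior whose parameters are simply αᵢ plus the observed counts, and a low total concentration Σαᵢ (the "minimally informative" direction) makes the posterior responsive to sparse data. The prior values below are illustrative, not the paper's least-squares-fitted parameters.

```python
def dirichlet_update(alphas, counts):
    """Conjugate update: posterior Dirichlet parameters are prior + counts."""
    return [a + c for a, c in zip(alphas, counts)]

def dirichlet_mean(alphas):
    """Posterior mean of each category probability."""
    s = sum(alphas)
    return [a / s for a in alphas]

# Same prior mean (0.95, 0.04, 0.01), different total concentration: the
# diffuse prior responds strongly to a single observation, the concentrated
# prior barely moves. Values are illustrative alpha-factor-style numbers.
diffuse = [0.95, 0.04, 0.01]        # total concentration 1
concentrated = [47.5, 2.0, 0.5]     # total concentration 50
data = [0, 1, 0]                    # one rare common-cause-style event observed
```

With the diffuse prior, a single observation in the second category moves its posterior mean from 0.04 to 0.52; with the concentrated prior it moves only to about 0.06, which is why the concentration must be chosen carefully when data are sparse.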

  10. Threat-responsiveness and the decision to obtain free influenza vaccinations among the older adults in Taiwan

    Directory of Open Access Journals (Sweden)

    Liu Chi-Mei

    2009-07-01

    Full Text Available Abstract Background Although older adults are encouraged by government agencies to receive influenza vaccinations, many do not obtain them. In Taiwan, where universal health care coverage has significantly reduced barriers of access to care, the health care system has provided free influenza vaccinations for people 65 years or older since 2001. Nevertheless, the number of people who use this service is much smaller than expected. The aim of this study was to explore major factors that might affect the decision to receive influenza vaccinations among older adults in Taiwan. Methods Using nationally representative health insurance medical claims from the National Health Insurance Research Database between 2002 and 2004, we investigated the role of threat-responsiveness, represented by prior vaccinations and prior physician visits for flu-like respiratory conditions, in the decisions of older adults to obtain vaccinations in Taiwan. Results Among the sample of 23,023 older adults, the overall yearly vaccination rates in this study were 38.6%, 44.3% and 39.3% for 2002, 2003, and 2004, respectively. Adjusting for covariates of individual and health care facility characteristics, the multivariate logistic regression revealed that older adults who had had prior vaccinations were ten times more likely to be vaccinated during the following influenza season than those who had not (OR = 10.22, 95% CI: 9.82–10.64). The greater the frequency of prior physician visits for flu-like respiratory conditions, the greater the likelihood that one would decide to be vaccinated. Visits during the prior interim (non-epidemic) season exerted a stronger positive influence on this likelihood than visits during the prior influenza season (OR = 1.59, 95% CI: 1.46–1.73 vs. OR = 1.11, 95% CI: 1.01–1.22, respectively). Conclusion Threat-responsiveness, or perceived risk, greatly influences influenza vaccination rates among older adults in Taiwan. These findings can be used to help design
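    The odds ratios and Wald confidence intervals reported above relate to logistic regression coefficients by OR = exp(β) and CI = exp(β ± 1.96·SE). A minimal sketch of that conversion follows; the standard error used in the usage example is back-calculated from the reported interval purely for illustration, not a value from the paper.

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Point estimate and Wald 95% CI for an odds ratio, from a logistic
    regression coefficient `beta` and its standard error `se`."""
    return math.exp(beta), (math.exp(beta - z * se), math.exp(beta + z * se))
```

For instance, a coefficient of ln(10.22) with a standard error of about 0.02 reproduces an interval close to the reported 9.82–10.64.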

  11. Improving Bayesian credibility intervals for classifier error rates using maximum entropy empirical priors.

    Science.gov (United States)

    Gustafsson, Mats G; Wallman, Mikael; Wickenberg Bolin, Ulrika; Göransson, Hanna; Fryknäs, M; Andersson, Claes R; Isaksson, Anders

    2010-06-01

    Successful use of classifiers that learn to make decisions from a set of patient examples requires robust methods for performance estimation. Recently many promising approaches for determining an upper bound for the error rate of a single classifier have been reported, but the Bayesian credibility interval (CI) obtained from a conventional holdout test still delivers one of the tightest bounds. The conventional Bayesian CI becomes unacceptably large in real-world applications where the test set sizes are less than a few hundred. The source of this problem is the fact that the CI is determined exclusively by the result on the test examples. In other words, no information at all is provided by the uniform prior density distribution employed, which reflects complete lack of prior knowledge about the unknown error rate. Therefore, the aim of the study reported here was to study a maximum entropy (ME) based approach to improved prior knowledge and Bayesian CIs, demonstrating its relevance for biomedical research and clinical practice. It is demonstrated how a refined non-uniform prior density distribution can be obtained by means of the ME principle using empirical results from a few designs and tests using non-overlapping sets of examples. Experimental results show that ME based priors improve the CIs when applied to four quite different simulated and two real-world data sets. An empirically derived ME prior seems promising for improving the Bayesian CI for the unknown error rate of a designed classifier. Copyright 2010 Elsevier B.V. All rights reserved.
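    The baseline the paper improves on, a Bayesian credibility interval for an unknown error rate after observing k errors in n holdout tests, can be sketched with a discretized Beta posterior (the posterior under a Beta(a, b) prior is Beta(a + k, b + n − k)). The default prior is the uniform Beta(1, 1) discussed above; the informative prior in the usage note is an illustrative stand-in, not an ME-derived prior.

```python
import math

def credibility_interval(k, n, a=1.0, b=1.0, level=0.95, grid=100_000):
    """Equal-tailed credibility interval for an error rate with a Beta(a, b)
    prior after k errors in n holdout tests, computed on a discrete grid
    (avoids needing an inverse incomplete beta function)."""
    xs = [(i + 0.5) / grid for i in range(grid)]
    # log of the unnormalized Beta(a + k, b + n - k) posterior density
    logs = [(a + k - 1) * math.log(x) + (b + n - k - 1) * math.log(1 - x)
            for x in xs]
    m = max(logs)
    weights = [math.exp(l - m) for l in logs]
    total = sum(weights)
    lo_tail, hi_tail = (1 - level) / 2, 1 - (1 - level) / 2
    cum, lo, hi = 0.0, None, None
    for x, w in zip(xs, weights):
        cum += w / total
        if lo is None and cum >= lo_tail:
            lo = x
        if cum >= hi_tail:
            hi = x
            break
    return lo, hi
```

With 2 errors in 20 tests the uniform-prior interval is wide; an informative prior such as Beta(2, 38) (hypothetical, centered near a 5% error rate) tightens it considerably, which is the effect the ME-derived priors aim for in a principled way.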

  12. Is the Near-Earth Current Sheet Prior to Reconnection Unstable to Tearing Mode?

    International Nuclear Information System (INIS)

    Xin-Hua, Wei; Jin-Bin, Cao; Guo-Cheng, Zhou; Hui-Shan, Fu

    2010-01-01

    The tearing mode instability plays a key role in the triggering process of reconnection. The triggering of the collisionless tearing mode instability has been analyzed theoretically and numerically by many researchers. However, owing to the difficulty of obtaining the observational wave number, it is still unknown whether the tearing mode instability can be excited in an actual plasma sheet prior to reconnection onset. Using data from the four Cluster satellites prior to a magnetospheric reconnection event on 13 September 2002, we applied the wave telescope technique to obtain the wave number corresponding to the peak of the power spectral density. The wavelength is about 18 Earth radii (R_E) and is consistent with previous theoretical and numerical results. After substituting the wave vector and other necessary parameters of the observed current sheet into the triggering condition of the tearing mode instability, we find that the near-Earth current sheet prior to reconnection is unstable to the tearing mode. (geophysics, astronomy, and astrophysics)

  13. Radiochemical data obtained by α spectrometry on unrecrystallized fossil coral samples from the Egyptian shoreline of the north-western Red Sea

    Energy Technology Data Exchange (ETDEWEB)

    Choukri, A. [Laboratoire de Physique de la Matiere et Rayonnement, Equipe de Physique et Techniques Nucleaires, UFR ' Faibles Radioactivites, Mathematiques physiques et environnement' Universite Ibn Tofail, Faculte des Sciences, Departement de Physique, P.B. 133, 14 000 Kenitra (Morocco)]. E-mail: choukrimajid@yahoo.com; Hakam, O.K. [Laboratoire de Physique de la Matiere et Rayonnement, Equipe de Physique et Techniques Nucleaires, UFR ' Faibles Radioactivites, Mathematiques physiques et environnement' Universite Ibn Tofail, Faculte des Sciences, Departement de Physique, P.B. 133, 14 000 Kenitra (Morocco); Reyss, J.L. [Laboratoire des Sciences de Climat et de l' Environnement, Domaine du CNRS, Avenue de la Terrasse 91 1958, Gif sur Yvette (France); Plaziat, J.C. [Universite de Paris-Sud, Departement des Sciences de la Terre, URA 723, Batiment 504, F-91405 Orsay, Cedex (France)

    2007-02-15

    In this work, radiochemical results obtained by α spectrometry on 80 unrecrystallized fossil coral samples from the Egyptian shoreline of the north-western Red Sea are presented and discussed. The coral samples were collected in Egypt from the emerged 5e coral reef terraces over 500 km, from the Ras Gharib-Ras Shukeir depression (28° 10') in the north to Wadi Lahami (north of Ras Banas, 24° 10') in the south. The statistical description of the radiochemical results (concentrations of U and Th radioisotopes, 234U/238U activity ratios and ages) obtained on a large number of coral samples showed that it is possible to establish methodological criteria that could be used to validate the measured ages before comparing them with the geological context of the sampling sites. The results confirm that unrecrystallized corals (232Th < 3%) constitute a reliable means of determining the timing of Pleistocene sea-level fluctuations. A few measured younger ages could be explained as the result of rejuvenation due to later addition of "younger" uranium to the initial stock that entered just after coral death. The rejuvenation observed and confirmed is due to recent cementation of aragonitic deposit on the fossil coral. 238U varies between 2.2 and 4.9 ppm around an average of 3.18 ± 0.65 ppm. 234U/238U activity ratios lie between 1.08 and 1.28, with an average value of 1.164 ± 0.016, which exceeds that of present-day sea water but is in agreement with the ratio of 1.16 measured by precise mass spectrometry in many Pleistocene coral samples. Except for three samples dated to at least 100 ka, the radiochemical ages of the 5e coral samples vary between 108 and 131 ka, with an average value of 122.2 ka and a standard deviation of 4.3 ka. Except for samples from the Zeit area, the reef terrace lies between 2 and 6 m above the present sea level. This position is similar to
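    Ages of this kind derive from the standard closed-system 230Th/U age equation, which relates the measured 230Th/238U activity ratio and the measured δ234U to the time elapsed since the coral incorporated uranium. A hedged sketch follows: the half-life values are approximate literature figures, and the activity ratio used in the usage note is assumed for illustration, not one of this study's measurements.

```python
import math

# Approximate decay constants (per year) from half-lives of ~75.6 ka (230Th)
# and ~245 ka (234U); assumed values, not from the paper.
LAMBDA_230 = math.log(2) / 75_584.0
LAMBDA_234 = math.log(2) / 245_250.0

def th230_u238(t, delta234_measured):
    """Closed-system (230Th/238U) activity ratio at age t (years), given the
    measured delta-234U in per mil (e.g. 234U/238U = 1.164 -> 164)."""
    d = delta234_measured / 1000.0
    return (1.0 - math.exp(-LAMBDA_230 * t)
            + d * LAMBDA_230 / (LAMBDA_230 - LAMBDA_234)
                * (1.0 - math.exp(-(LAMBDA_230 - LAMBDA_234) * t)))

def age(ratio_230_238, delta234_measured, lo=0.0, hi=600_000.0):
    """Invert the age equation for t by bisection (the ratio grows
    monotonically with age over this range)."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if th230_u238(mid, delta234_measured) < ratio_230_238:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

With the average measured 234U/238U of 1.164 (δ234U = 164‰) and an assumed 230Th/238U activity ratio of 0.80, the solved age falls in the ~120 ka range reported for the 5e terrace.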

  14. User's guide for polyethylene-based passive diffusion bag samplers to obtain volatile organic compound concentrations in wells. Part I, Deployment, recovery, data interpretation, and quality control and assurance

    Science.gov (United States)

    Vroblesky, Don A.

    2001-01-01

    Diffusion samplers installed in observation wells were found to be capable of yielding representative water samples for chlorinated volatile organic compounds. The samplers consisted of polyethylene bags containing deionized water and relied on diffusion of chlorinated volatile organic compounds through the polyethylene membrane. The known ability of polyethylene to transmit other volatile compounds, such as benzene and toluene, indicates that the samplers can be used for a variety of volatile organic compounds. In wells at the study area, the volatile organic compound concentrations in water samples obtained using the samplers without prior purging were similar to concentrations in water samples obtained from the respective wells using traditional purging and sampling approaches. The low cost associated with this approach makes it a viable option for monitoring large observation-well networks for volatile organic compounds.

  15. Joint sampling programme-Verification of data obtained in environmental monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Lauria, D.C. [Instituto de Radioprotecao e Dosimetria, Comissao Nacional de Energia Nuclear, Av. Salvador Allende s/no., CEP 22780-160, Rio de Janeiro, RJ (Brazil)], E-mail: dejanira@ird.gov.br; Martins, N.S.F.; Vasconcellos, M.L.H.; Zenaro, R.; Peres, S.S.; Pires do Rio, M.A. [Instituto de Radioprotecao e Dosimetria, Comissao Nacional de Energia Nuclear, Av. Salvador Allende s/no., CEP 22780-160, Rio de Janeiro, RJ (Brazil)

    2008-11-15

    The objective of the Environmental Radiological Monitoring Control programme carried out by the Institute of Radiation Protection and Dosimetry (IRD) in Brazil is to verify the licensee's compliance with the requirements for environmental monitoring of Brazilian facilities. The Joint Sampling Programme (JSP) is just one part of the control programme. In order to verify that the data reported by the licensees is representative and legitimate, this programme verifies sampling procedures, accuracy and precision of the data and the changes in the environmental conditions. This paper discusses the main findings of this programme that allowed IRD to optimize its available resources to control the monitoring of the eight facilities in Brazil.

  16. Genotoxicity and fetal abnormality in streptozotocin-induced diabetic rats exposed to cigarette smoke prior to and during pregnancy.

    Science.gov (United States)

    Damasceno, D C; Volpato, G T; Sinzato, Y K; Lima, P H O; Souza, M S S; Iessi, I L; Kiss, A C I; Takaku, M; Rudge, M V C; Calderon, I M P

    2011-10-01

    Maternal hyperglycemia during early pregnancy is associated with increased risk of abnormalities in the offspring. Malformation rates among the offspring of diabetic mothers are 2-5-fold higher than that of the normal population, and congenital malformations are the major cause of mortality and morbidity in the offspring of diabetic mothers. Metabolic changes such as hyperglycemia, as well as the metabolites obtained from cigarettes, increase the production of reactive oxygen species (ROS) in the embryo or fetus, causing DNA damage. The aim was to evaluate maternal and fetal genotoxicity, and to assess the incidence of fetal anomalies, in diabetic female rats exposed to cigarette smoke at different stages of pregnancy. Diabetes was induced by streptozotocin administration, and cigarette smoke exposure was produced by a mechanical smoking device that generated mainstream smoke that was delivered into a chamber. Female Wistar rats were randomly assigned to: non-diabetic (ND) and diabetic (D) groups exposed to filtered air; a diabetic group exposed to cigarette smoke prior to and during pregnancy (DS) and a diabetic group only exposed to cigarette smoke prior to pregnancy (DSPP). On pregnancy day 21, blood samples were obtained for DNA damage analysis and fetuses were collected for congenital anomaly assessment. Statistical significance was set at p<0.05 for all analyses. Exposure of diabetic rats to tobacco smoke prior to pregnancy increased fetal DNA damage, but failed to induce teratogenicity. Thus, these results reinforce the importance for women to avoid exposure to cigarette smoke long before they become pregnant. © J. A. Barth Verlag in Georg Thieme Verlag KG Stuttgart · New York.

  17. Factors associated with number of duodenal samples obtained in suspected celiac disease.

    Science.gov (United States)

    Shamban, Leonid; Sorser, Serge; Naydin, Stan; Lebwohl, Benjamin; Shukr, Mousa; Wiemann, Charlotte; Yevsyukov, Daniel; Piper, Michael H; Warren, Bradley; Green, Peter H R

    2017-12-01

     Many people with celiac disease are undiagnosed, and there is evidence that insufficient duodenal samples may contribute to underdiagnosis. The aims of this study were to investigate whether more samples lead to a greater likelihood of a diagnosis of celiac disease and to elucidate factors that influence the number of samples collected. We identified patients from two community hospitals who were undergoing duodenal biopsy for indications (as identified by International Classification of Diseases code) compatible with possible celiac disease. Three cohorts were evaluated: no celiac disease (NCD, normal villi), celiac disease (villous atrophy, Marsh score 3), and possible celiac disease (PCD, Marsh score <3). Patients with celiac disease had a median of 4 specimens collected. The percentage of patients diagnosed with celiac disease with one sample was 0.3% compared with 12.8% of those with six samples (P = 0.001). Patient factors that positively correlated with the number of samples collected were endoscopic features, demographic details, and indication (P = 0.001). Endoscopist factors that positively correlated with the number of samples collected were absence of a trainee, pediatric gastroenterologist, and outpatient setting. The likelihood of diagnosing celiac disease significantly increased with six samples. Multiple factors influenced whether adequate biopsies were taken. Adherence to guidelines may increase the diagnosis rate of celiac disease.

  18. Priming in implicit memory tasks: prior study causes enhanced discriminability, not only bias.

    Science.gov (United States)

    Zeelenberg, René; Wagenmakers, Eric-Jan M; Raaijmakers, Jeroen G W

    2002-03-01

    R. Ratcliff and G. McKoon (1995, 1996, 1997; R. Ratcliff, D. Allbritton, & G. McKoon, 1997) have argued that repetition priming effects are solely due to bias. They showed that prior study of the target resulted in a benefit in a later implicit memory task. However, prior study of a stimulus similar to the target resulted in a cost. The present study, using a 2-alternative forced-choice procedure, investigated the effect of prior study in an unbiased condition: Both alternatives were studied prior to their presentation in an implicit memory task. Contrary to a pure bias interpretation of priming, consistent evidence was obtained in 3 implicit memory tasks (word fragment completion, auditory word identification, and picture identification) that performance was better when both alternatives were studied than when neither alternative was studied. These results show that prior study results in enhanced discriminability, not only bias.

  19. Rietveld analysis using powder diffraction data with anomalous scattering effect obtained by focused beam flat sample method

    International Nuclear Information System (INIS)

    Tanaka, Masahiko; Katsuya, Yoshio; Sakata, Osami

    2016-01-01

    The focused-beam flat-sample method (FFM) is a new approach to synchrotron powder diffraction that combines beam-focusing optics, a flat powder sample and area detectors. The method has advantages for X-ray diffraction experiments exploiting the anomalous scattering effect (anomalous diffraction) because of (1) absorption correction without approximation, (2) the high intensity of the focused incident beams and the high signal-to-noise ratio of the diffracted X-rays, and (3) rapid data collection with area detectors. We applied the FFM to anomalous diffraction experiments and collected synchrotron X-ray powder diffraction data of CoFe2O4 (inverse spinel structure) using X-rays near the Fe K absorption edge, which can distinguish Co and Fe through the anomalous scattering effect. We conducted Rietveld analyses with the obtained powder diffraction data and successfully determined the distribution of Co and Fe ions in the CoFe2O4 crystal structure.

  20. Standard guide for sampling radioactive tank waste

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2011-01-01

    1.1 This guide addresses techniques used to obtain grab samples from tanks containing high-level radioactive waste created during the reprocessing of spent nuclear fuels. Guidance on selecting appropriate sampling devices for waste covered by the Resource Conservation and Recovery Act (RCRA) is also provided by the United States Environmental Protection Agency (EPA) (1). Vapor sampling of the head-space is not included in this guide because it does not significantly affect slurry retrieval, pipeline transport, plugging, or mixing. 1.2 The values stated in inch-pound units are to be regarded as standard. No other units of measurement are included in this standard. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  1. Effect of Prior Health-Related Employment on the Registered Nurse Workforce Supply.

    Science.gov (United States)

    Yoo, Byung-kwan; Lin, Tzu-chun; Kim, Minchul; Sasaki, Tomoko; Spetz, Joanne

    2016-01-01

    Registered nurses (RNs) who held prior health-related employment in occupations other than licensed practical or vocational nursing (LPN/LVN) are reported to have increased rapidly in number in the past decades. Researchers examined whether prior health-related employment affects RN workforce supply. A cross-sectional bivariate probit model using the 2008 National Sample Survey of Registered Nurses was estimated. Prior health-related employment in relatively lower-wage occupations, such as allied health, clerk, or nursing aide, was positively associated with working as an RN. Prior health-related employment in relatively higher-wage categories, such as health care manager or LPN/LVN, was positively associated with working full-time as an RN. Policy implications are to promote an expanded career ladder program and a nursing school admission policy that targets non-RN health care workers with an interest in becoming RNs.

  2. Luminescence lifetimes in quartz: dependence on annealing temperature prior to beta irradiation

    International Nuclear Information System (INIS)

    Galloway, R.B.

    2002-01-01

    It is well known that the thermal history of a quartz sample influences the optically stimulated luminescence sensitivity of the quartz. It is found that the optically stimulated luminescence lifetime, determined from time resolved spectra obtained with pulsed stimulation, also depends on past thermal treatment. For samples at 20 deg. C during stimulation, the lifetime depends on beta dose and on duration of preheating at 220 deg. C prior to stimulation for quartz annealed at 600 deg. C and above, but is independent of these factors for quartz annealed at 500 deg. C and below. For stimulation at higher temperatures, the lifetime becomes shorter if the sample is held at temperatures above 125 deg. C during stimulation, in a manner consistent with thermal quenching. A single exponential decay is all that is required to fit the time resolved spectra for un-annealed quartz regardless of the temperature during stimulation (20-175 deg. C), or to fit the time resolved spectra from all samples held at 20 deg. C during stimulation, regardless of annealing temperature (20-1000 deg. C). An additional shorter lifetime is found for some combinations of annealing temperature and temperature during stimulation. The results are discussed in terms of a model previously used to explain thermal sensitisation. The luminescence lifetime data are best explained by the presence of two principal luminescence centres, their relative importance depending on the annealing temperature, with a third centre involved for limited combinations of annealing temperature and temperature during stimulation.
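    Extracting a lifetime from a time-resolved spectrum of this kind amounts to fitting a single exponential decay to the recorded counts. A minimal sketch with synthetic data (the window, lifetime, and count levels below are illustrative, not values from the study):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def single_exp(t, amplitude, tau, background):
        """Single-exponential luminescence decay with a constant background."""
        return amplitude * np.exp(-t / tau) + background

    # Synthetic time-resolved spectrum with Poisson counting noise.
    rng = np.random.default_rng(0)
    t = np.linspace(0, 200, 400)                    # time after the pulse (us)
    true_tau = 33.0                                 # assumed lifetime (us)
    counts = single_exp(t, 1000.0, true_tau, 20.0)
    counts_noisy = rng.poisson(counts).astype(float)

    # Least-squares fit; p0 is a rough initial guess.
    popt, pcov = curve_fit(single_exp, t, counts_noisy, p0=(800.0, 20.0, 10.0))
    amplitude, tau, background = popt
    print(f"fitted lifetime: {tau:.1f} us")
    ```

    A second, shorter lifetime component, as reported for some annealing/stimulation combinations, would be modelled by adding a second exponential term and refitting.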

  3. Incorporating prior knowledge induced from stochastic differential equations in the classification of stochastic observations.

    Science.gov (United States)

    Zollanvari, Amin; Dougherty, Edward R

    2016-12-01

    In classification, prior knowledge is incorporated in a Bayesian framework by assuming that the feature-label distribution belongs to an uncertainty class of feature-label distributions governed by a prior distribution. A posterior distribution is then derived from the prior and the sample data. An optimal Bayesian classifier (OBC) minimizes the expected misclassification error relative to the posterior distribution. From an application perspective, prior construction is critical. The prior distribution is formed by mapping a set of mathematical relations among the features and labels, the prior knowledge, into a distribution governing the probability mass across the uncertainty class. In this paper, we consider prior knowledge in the form of stochastic differential equations (SDEs). We consider a vector SDE in integral form involving a drift vector and dispersion matrix. Having constructed the prior, we develop the optimal Bayesian classifier between two models and examine, via synthetic experiments, the effects of uncertainty in the drift vector and dispersion matrix. We apply the theory to a set of SDEs for the purpose of differentiating the evolutionary history between two species.
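    The core construction can be sketched for a one-dimensional toy case: the prior induces an uncertainty class of class-conditional densities, and the optimal Bayesian classifier works with the *effective* density, i.e. the class density averaged over the prior. The Gaussian forms and hyperparameters below are illustrative assumptions, not the paper's SDE-derived prior:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Uncertainty class: class-conditional Gaussians whose means are uncertain.
    # The prior over each mean is itself Gaussian (hyperparameters illustrative).
    prior_mean = {0: -1.0, 1: 1.0}   # prior expectation of each class mean
    prior_sd = 0.5                   # uncertainty in the (drift-induced) mean
    noise_sd = 1.0                   # dispersion of observations about the mean

    def effective_density(x, label, n_draws=5000):
        """Monte Carlo estimate of the effective class-conditional density:
        the class density averaged over the prior on the uncertain mean."""
        means = rng.normal(prior_mean[label], prior_sd, size=n_draws)
        dens = np.exp(-0.5 * ((x - means) / noise_sd) ** 2) / (noise_sd * np.sqrt(2 * np.pi))
        return dens.mean()

    def obc_label(x):
        # Equal class priors: pick the label with the larger effective density.
        return int(effective_density(x, 1) > effective_density(x, 0))

    print(obc_label(-2.0), obc_label(2.0))
    ```

    In the paper the prior is induced from SDEs (drift vector and dispersion matrix) rather than assumed directly, but the decision rule retains this posterior-averaged form.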

  4. Imaging performance of a hybrid x-ray computed tomography-fluorescence molecular tomography system using priors.

    Science.gov (United States)

    Ale, Angelique; Schulz, Ralf B; Sarantopoulos, Athanasios; Ntziachristos, Vasilis

    2010-05-01

    We study the performance of two newly introduced and previously suggested methods for incorporating priors into inversion schemes, using data from a recently developed hybrid x-ray computed tomography and fluorescence molecular tomography system, the latter based on CCD camera photon detection. The unique data set studied provides accurately registered, highly spatially sampled photon fields propagating through tissue along 360-degree projections. Structural prior information was included in the inverse problem by adding a penalty term to the minimization function utilized for image reconstruction. Results from simulated and experimental data from a lung inflammation animal model were compared against the inversions achieved without priors. The advantage of using priors over stand-alone inversions is also showcased with highly spatially sampled simulated and experimental data, and the approach offering optimal performance in resolving fluorescence biodistribution in small animals is discussed. Inclusion of prior information from x-ray CT data in the reconstruction of the fluorescence biodistribution leads to improved agreement between the reconstruction and validation images for both simulated and experimental data.
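    The penalty-term idea can be sketched on a toy linear inversion: augment the data-fit objective with a structural penalty whose weights come from the anatomical (CT) image. Everything below (forward matrix, segment geometry, weights, λ) is an illustrative assumption, not the paper's actual reconstruction scheme:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Toy underdetermined forward model y = A x + noise for fluorescence data.
    n_meas, n_vox = 30, 40
    A = rng.normal(size=(n_meas, n_vox))
    x_true = np.zeros(n_vox)
    x_true[10:20] = 1.0                    # fluorophore confined to one segment
    y = A @ x_true + 0.05 * rng.normal(size=n_meas)

    # Structural prior from the co-registered CT image: penalize activity
    # weakly inside the anatomically plausible segment, strongly outside it.
    inside = np.zeros(n_vox, dtype=bool)
    inside[10:20] = True
    penalty_weights = np.where(inside, 0.1, 10.0)
    L = np.diag(penalty_weights)

    lam = 1.0
    # Minimize ||A x - y||^2 + lam ||L x||^2 via the normal equations.
    x_prior = np.linalg.solve(A.T @ A + lam * L.T @ L, A.T @ y)
    x_plain = np.linalg.lstsq(A, y, rcond=None)[0]   # no-prior baseline

    err_prior = np.linalg.norm(x_prior - x_true)
    err_plain = np.linalg.norm(x_plain - x_true)
    print(err_prior, err_plain)
    ```

    With the system underdetermined, the prior-penalized solution tracks the true distribution far better than the plain least-squares inversion, which mirrors the improved agreement with validation images reported above.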

  5. Super resolution reconstruction of μ-CT image of rock sample using neighbour embedding algorithm

    Science.gov (United States)

    Wang, Yuzhu; Rahman, Sheik S.; Arns, Christoph H.

    2018-03-01

    X-ray micro-computed tomography (μ-CT) is considered the most effective way to obtain the inner structure of a rock sample without destroying it. However, its limited resolution hampers its ability to probe sub-micron structures, which are critical for flow transport in rock samples. In this study, we propose an innovative methodology to improve the resolution of the μ-CT image using a neighbour embedding algorithm, where low-frequency information is provided by the μ-CT image itself while high-frequency information is supplemented by a high-resolution scanning electron microscopy (SEM) image. To obtain a prior for reconstruction, a large number of patch pairs, each containing a high- and a low-resolution image patch, are extracted from the Gaussian image pyramid generated from the SEM image. These patch pairs contain abundant information about the tomographic evolution of local porous structures across resolution spaces. Relying on the assumption of self-similarity of the porous structure, this prior information can be used to supervise the reconstruction of the high-resolution μ-CT image effectively. The experimental results show that the proposed method achieves state-of-the-art performance.
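    The patch-pair prior can be sketched in a few lines. Full neighbour embedding reconstructs each patch as a locally-linear combination of its k nearest neighbours; the simplified sketch below just averages the high-resolution counterparts of the k nearest low-resolution patches, and all sizes, scale factors, and data are illustrative stand-ins for the SEM pyramid and μ-CT input:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def downsample(img):
        """Crude 2x downsampling by block averaging (stand-in for a pyramid level)."""
        return img.reshape(img.shape[0] // 2, 2, img.shape[1] // 2, 2).mean(axis=(1, 3))

    # Training image (stand-in for the high-resolution SEM image).
    hr = rng.random((32, 32))
    lr = downsample(hr)

    # Dictionary of (low-res patch, high-res patch) pairs: each 2x2 low-res
    # patch corresponds to the 4x4 high-res patch at the same location.
    lo_patches, hi_patches = [], []
    for i in range(lr.shape[0] - 1):
        for j in range(lr.shape[1] - 1):
            lo_patches.append(lr[i:i+2, j:j+2].ravel())
            hi_patches.append(hr[2*i:2*i+4, 2*j:2*j+4].ravel())
    lo_patches = np.array(lo_patches)
    hi_patches = np.array(hi_patches)

    def embed_patch(lo_patch, k=5):
        """Estimate a high-res patch from the k nearest low-res neighbours."""
        d = np.linalg.norm(lo_patches - lo_patch.ravel(), axis=1)
        nearest = np.argsort(d)[:k]
        return hi_patches[nearest].mean(axis=0).reshape(4, 4)

    # Super-resolve one patch of a new low-resolution (mu-CT-like) image.
    query = rng.random((2, 2))
    sr_patch = embed_patch(query)
    print(sr_patch.shape)
    ```

    A full reconstruction would tile the μ-CT image with overlapping patches, apply this lookup to each, and blend the overlaps.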

  6. Understanding sleep disturbance in athletes prior to important competitions.

    Science.gov (United States)

    Juliff, Laura E; Halson, Shona L; Peiffer, Jeremiah J

    2015-01-01

    Anecdotally, many athletes report worse sleep in the nights prior to important competitions. Despite sleep being acknowledged as an important factor for optimal athletic performance and overall health, little is understood about athlete sleep around competition. The aims of this study were to identify sleep complaints of athletes prior to competitions and to determine whether complaints were confined to competition periods. Cross-sectional study. A sample of 283 elite Australian athletes (129 male, 157 female, age 24±5 y) completed two questionnaires: the Competitive Sport and Sleep questionnaire and the Pittsburgh Sleep Quality Index. 64.0% of athletes indicated worse sleep on at least one occasion in the nights prior to an important competition over the past 12 months. The main sleep problem specified by athletes was difficulty falling asleep (82.1%), with the main reasons for poor sleep being thoughts about the competition (83.5%) and nervousness (43.8%). Overall, 59.1% of team sport athletes reported having no strategy to overcome poor sleep, compared with 32.7% of individual sport athletes (p=0.002), who utilised relaxation and reading as strategies. Individual sport athletes had an increased likelihood of poor sleep as they aged. The poor sleep reported by athletes prior to competition was situational rather than a global sleep problem. Poor sleep is common prior to major competitions in Australian athletes, yet most athletes are unaware of strategies to overcome it. It is essential that coaches and scientists monitor and educate both individual and team sport athletes to facilitate sleep prior to important competitions. Copyright © 2014 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  7. Rietveld analysis using powder diffraction data with anomalous scattering effect obtained by focused beam flat sample method

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, Masahiko, E-mail: masahiko@spring8.or.jp; Katsuya, Yoshio, E-mail: katsuya@spring8.or.jp; Sakata, Osami, E-mail: SAKATA.Osami@nims.go.jp [Synchrotron X-ray Station at SPring-8, National Institute for Materials Science 1-1-1 Kouto, Sayo, Hyogo 679-5198 (Japan)

    2016-07-27

    The focused-beam flat-sample method (FFM) is a new approach to synchrotron powder diffraction that combines beam-focusing optics, a flat powder sample, and area detectors. The method is advantageous for X-ray diffraction experiments that exploit the anomalous scattering effect (anomalous diffraction) because of (1) absorption correction without approximation, (2) the high intensity of the focused incident beam and the high signal-to-noise ratio of the diffracted X-rays, and (3) rapid data collection with area detectors. We applied the FFM to anomalous diffraction experiments and collected synchrotron X-ray powder diffraction data of CoFe_2O_4 (inverse spinel structure) using X-rays near the Fe K absorption edge, which can distinguish Co from Fe through the anomalous scattering effect. Rietveld analyses of the obtained powder diffraction data successfully determined the distribution of Co and Fe ions in the CoFe_2O_4 crystal structure.

  8. Obtaining of potassium dicyan-argentate

    International Nuclear Information System (INIS)

    Sattarova, M.A.; Solojenkin, P.M.

    1997-01-01

    This work is devoted to the preparation of potassium dicyan-argentate. Potassium dicyan-argentate was synthesized by an exchange reaction between silver nitrate and potassium cyanide. The samples obtained were analyzed by titration and potentiometry.

  9. Stable isotope analysis of precipitation samples obtained via crowdsourcing reveals the spatiotemporal evolution of Superstorm Sandy.

    Directory of Open Access Journals (Sweden)

    Stephen P Good

    Full Text Available Extra-tropical cyclones, such as 2012 Superstorm Sandy, pose a significant climatic threat to the northeastern United States, yet prediction of hydrologic and thermodynamic processes within such systems is complicated by their interaction with mid-latitude water patterns as they move poleward. Fortunately, the evolution of these systems is also recorded in the stable isotope ratios of storm-associated precipitation and water vapor, and isotopic analysis provides constraints on difficult-to-observe cyclone dynamics. During Superstorm Sandy, a unique crowdsourced approach enabled 685 precipitation samples to be obtained for oxygen and hydrogen isotopic analysis, constituting the largest isotopic sampling of a synoptic-scale system to date. Isotopically, these waters span an enormous range of values (> 21‰ for δ(18)O, > 160‰ for δ(2)H) and exhibit strong spatiotemporal structure. Low isotope ratios occurred predominantly in the west and south quadrants of the storm, indicating robust isotopic distillation that tracked the intensity of the storm's warm core. Elevated values of deuterium-excess (> 25‰) were found primarily in the New England region after Sandy made landfall. Isotope mass balance calculations and Lagrangian back-trajectory analysis suggest that these samples reflect the moistening of dry continental air entrained from a mid-latitude trough. These results demonstrate the power of rapid-response isotope monitoring to elucidate the structure and dynamics of water cycling within synoptic-scale systems and improve our understanding of storm evolution, hydroclimatological impacts, and paleo-storm proxies.

  10. Stable isotope analysis of precipitation samples obtained via crowdsourcing reveals the spatiotemporal evolution of Superstorm Sandy.

    Science.gov (United States)

    Good, Stephen P; Mallia, Derek V; Lin, John C; Bowen, Gabriel J

    2014-01-01

    Extra-tropical cyclones, such as 2012 Superstorm Sandy, pose a significant climatic threat to the northeastern United States, yet prediction of hydrologic and thermodynamic processes within such systems is complicated by their interaction with mid-latitude water patterns as they move poleward. Fortunately, the evolution of these systems is also recorded in the stable isotope ratios of storm-associated precipitation and water vapor, and isotopic analysis provides constraints on difficult-to-observe cyclone dynamics. During Superstorm Sandy, a unique crowdsourced approach enabled 685 precipitation samples to be obtained for oxygen and hydrogen isotopic analysis, constituting the largest isotopic sampling of a synoptic-scale system to date. Isotopically, these waters span an enormous range of values (> 21‰ for δ(18)O, > 160‰ for δ(2)H) and exhibit strong spatiotemporal structure. Low isotope ratios occurred predominantly in the west and south quadrants of the storm, indicating robust isotopic distillation that tracked the intensity of the storm's warm core. Elevated values of deuterium-excess (> 25‰) were found primarily in the New England region after Sandy made landfall. Isotope mass balance calculations and Lagrangian back-trajectory analysis suggest that these samples reflect the moistening of dry continental air entrained from a mid-latitude trough. These results demonstrate the power of rapid-response isotope monitoring to elucidate the structure and dynamics of water cycling within synoptic-scale systems and improve our understanding of storm evolution, hydroclimatological impacts, and paleo-storm proxies.
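    The deuterium-excess values discussed above follow from the standard Dansgaard definition, d = δ²H − 8·δ¹⁸O (in ‰). A minimal helper, with hypothetical sample values rather than data from the study:

    ```python
    def deuterium_excess(delta2h, delta18o):
        """Deuterium excess (Dansgaard): d = delta2H - 8 * delta18O, in permil."""
        return delta2h - 8.0 * delta18o

    # Hypothetical precipitation sample (permil, VSMOW); not from the study's data.
    d = deuterium_excess(-60.0, -11.0)
    print(d)  # 28.0 -> well above the ~10 permil global meteoric average
    ```

    Values above ~25‰, as reported for post-landfall New England samples, indicate moisture that evaporated under unusually dry conditions, consistent with the entrained continental air interpretation.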

  11. Effects of prior aversive experience upon retrograde amnesia induced by hypothermia.

    Science.gov (United States)

    Jensen, R A; Riccio, D C; Gehres, L

    1975-08-01

    Two experiments examined the extent to which retrograde amnesia (RA) is attenuated by prior learning experiences. In Experiment 1, rats initially received either passive avoidance training in a step-through apparatus, exposure to the apparatus, or noncontingent footshock. When training on a second but different passive avoidance task was followed by hypothermia treatment, RA was obtained only in the latter two groups. In Experiment 2, one-way active avoidance training, yoked noncontingent shocks, or apparatus exposure constituted the initial experience. Subsequent step-down passive avoidance training and amnestic treatment resulted in memory loss for the prior apparatus exposure group, but not for either of the preshocked conditions. These experiments demonstrate that certain types of prior aversive experience can substantially modify the magnitude of RA, and, in conjunction with other familiarization studies, emphasize a paradox for interpretations of RA based solely upon CNS disruption. The possibility that hypothermia treatment serves as an important contextual or encoding cue necessary for memory retrieval was considered. It was suggested that prior experience may block RA by enabling rats to differentiate training and treatment conditions.

  12. A Simulation of Pell Grant Awards and Costs Using Prior-Prior Year Financial Data

    Science.gov (United States)

    Kelchen, Robert; Jones, Gigi

    2015-01-01

    We examine the likely implications of switching from a prior year (PY) financial aid system, the current practice in which students file the Free Application for Federal Student Aid (FAFSA) using income data from the previous tax year, to prior-prior year (PPY), in which data from two years before enrollment is used. While PPY allows students to…

  13. The Effects of Prior Outcomes on Risky Choice: Evidence from the Stock Market

    Directory of Open Access Journals (Sweden)

    Fenghua Wen

    2014-01-01

    Full Text Available How do prior outcomes affect risk choice? Research on this can help people understand investors' dynamic decisions in the financial market. This paper puts forward a new value function. By analyzing the new value function, we find that prior gains and losses have an impact on the form of the value function and on investors' current risk attitude. The paper then takes the behavior of the whole stock market as the research object, adopts aggregate index numbers of 14 representative stocks around the world as samples, and establishes a TVRA-GARCH-M model to investigate the influence of prior gains and losses on the current risk attitude. The empirical study indicates that, at the whole-market level, prior gains increase people's current willingness to take risk; that is to say, the house money effect exists in the market, while people are more risk averse following prior losses.
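    The paper's new value function is not reproduced in this record, but the idea it builds on can be illustrated with the classic Kahneman-Tversky value function, where a prior outcome shifts the reference point. The parameter values and the reference-shift rule below are illustrative assumptions, not the paper's specification:

    ```python
    def kt_value(x, alpha=0.88, lam=2.25):
        """Classic Kahneman-Tversky value function: concave for gains,
        convex and steeper (loss aversion lam > 1) for losses."""
        return x ** alpha if x >= 0 else -lam * ((-x) ** alpha)

    def value_after_prior_outcome(x, prior_outcome):
        """House-money-style adjustment: evaluate the current outcome after
        integrating it with the prior gain or loss at the old reference point."""
        return kt_value(x + prior_outcome) - kt_value(prior_outcome)

    # A 10-unit loss hurts less after a 50-unit prior gain (house money effect).
    print(value_after_prior_outcome(-10.0, 50.0), kt_value(-10.0))
    ```

    The first printed value is much less negative than the second: after a prior gain, the same loss is absorbed on the shallow gain branch of the curve, so willingness to take risk rises.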

  14. Performance of local information-based link prediction: a sampling perspective

    Science.gov (United States)

    Zhao, Jichang; Feng, Xu; Dong, Li; Liang, Xiao; Xu, Ke

    2012-08-01

    Link prediction is pervasively employed to uncover missing links in snapshots of real-world networks, which are usually obtained through different kinds of sampling methods. In the previous literature, in order to evaluate prediction performance, known edges in the sampled snapshot are divided into a training set and a probe set randomly, without considering the underlying sampling approach. However, different sampling methods might lead to different missing links, especially biased ones. For this reason, random partition-based evaluation of performance is no longer convincing if we take the sampling method into account. In this paper, we re-evaluate the performance of local information-based link prediction through a division of the training set and the probe set governed by the sampling method. Interestingly, we find that each prediction approach performs unevenly across different sampling methods. Moreover, most of these predictors perform weakly when the sampling method is biased, which indicates that their performance might have been overestimated in prior works.
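    The evaluation setup being criticized can be sketched as follows: split known edges into training and probe sets (here the conventional random split), then rank unobserved pairs by a local-information index such as common neighbours. The toy graph is an illustrative assumption; a sampling-aware evaluation would replace the random split with one that mimics the snapshot's sampling process:

    ```python
    import random

    random.seed(4)

    # Toy graph as a set of undirected edges (u < v).
    edges = {(0, 1), (0, 2), (1, 2), (1, 3), (2, 3), (3, 4),
             (4, 5), (3, 5), (5, 6), (4, 6), (2, 4)}
    nodes = {u for e in edges for u in e}

    def build_adjacency(edge_set):
        adj = {n: set() for n in nodes}
        for u, v in edge_set:
            adj[u].add(v)
            adj[v].add(u)
        return adj

    # Random division into training and probe sets, as in the earlier
    # literature; this ignores how the snapshot itself was sampled.
    probe = set(random.sample(sorted(edges), 3))
    train = edges - probe
    adj = build_adjacency(train)

    def common_neighbours_score(u, v):
        """Local-information index: common neighbours in the training graph."""
        return len(adj[u] & adj[v])

    # Score every node pair not connected in the training graph.
    scores = {(u, v): common_neighbours_score(u, v)
              for u in nodes for v in nodes
              if u < v and (u, v) not in train}
    ranked = sorted(scores, key=scores.get, reverse=True)
    print(ranked[:3])  # pairs most likely to be missing links
    ```

    Performance is then measured by how highly the probe edges rank among all scored pairs (e.g. AUC or precision at k).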

  15. Finding a minimally informative Dirichlet prior distribution using least squares

    International Nuclear Information System (INIS)

    Kelly, Dana; Atwood, Corwin

    2011-01-01

    In a Bayesian framework, the Dirichlet distribution is the conjugate distribution to the multinomial likelihood function, and so the analyst is required to develop a Dirichlet prior that incorporates available information. However, as it is a multiparameter distribution, choosing the Dirichlet parameters is less straightforward than choosing a prior distribution for a single parameter, such as p in the binomial distribution. In particular, one may wish to incorporate limited information into the prior, resulting in a minimally informative prior distribution that is responsive to updates with sparse data. In the case of binomial p or Poisson λ, the principle of maximum entropy can be employed to obtain a so-called constrained noninformative prior. However, even in the case of p, such a distribution cannot be written down in the form of a standard distribution (e.g., beta, gamma), and so a beta distribution is used as an approximation in the case of p. In the case of the multinomial model with parametric constraints, the approach of maximum entropy does not appear tractable. This paper presents an alternative approach, based on constrained minimization of a least-squares objective function, which leads to a minimally informative Dirichlet prior distribution. The alpha-factor model for common-cause failure, which is widely used in the United States, is the motivation for this approach, and is used to illustrate the method. In this approach to modeling common-cause failure, the alpha-factors, which are the parameters in the underlying multinomial model for common-cause failure, must be estimated from data that are often quite sparse, because common-cause failures tend to be rare, especially failures of more than two or three components, and so a prior distribution that is responsive to updates with sparse data is needed.
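    A Dirichlet with fixed marginal means is determined by a single concentration parameter α0 (mean_i = α_i/α0, Var_i = mean_i(1−mean_i)/(α0+1)), so a least-squares search over α0 can match the marginal variances to targets. The sketch below is only an illustration of that mechanic: the mean alpha-factors and the per-component target variances are made-up values, not the paper's constrained-noninformative targets:

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    # Hypothetical mean alpha-factors for a 3-component common-cause model.
    means = np.array([0.90, 0.07, 0.03])

    # Hypothetical target marginal variances, here written as beta-style
    # variances mean*(1-mean)/c_i with different illustrative constants c_i.
    target_var = means * (1 - means) / np.array([1.4, 2.0, 2.6])

    def objective(alpha0):
        """Sum of squared differences between the Dirichlet marginal variances
        (determined by the fixed means and alpha0) and the target variances."""
        dir_var = means * (1 - means) / (alpha0 + 1.0)
        return np.sum((dir_var - target_var) ** 2)

    res = minimize_scalar(objective, bounds=(0.1, 50.0), method="bounded")
    alpha0 = res.x
    alphas = alpha0 * means   # resulting minimally informative Dirichlet parameters
    print(alpha0, alphas)
    ```

    Because the fitted α0 comes out small, the resulting prior is diffuse and remains responsive to updates with sparse common-cause failure data, which is the paper's stated goal.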

  16. Finding a Minimally Informative Dirichlet Prior Distribution Using Least Squares

    International Nuclear Information System (INIS)

    Kelly, Dana; Atwood, Corwin

    2011-01-01

    In a Bayesian framework, the Dirichlet distribution is the conjugate distribution to the multinomial likelihood function, and so the analyst is required to develop a Dirichlet prior that incorporates available information. However, as it is a multiparameter distribution, choosing the Dirichlet parameters is less straight-forward than choosing a prior distribution for a single parameter, such as p in the binomial distribution. In particular, one may wish to incorporate limited information into the prior, resulting in a minimally informative prior distribution that is responsive to updates with sparse data. In the case of binomial p or Poisson, the principle of maximum entropy can be employed to obtain a so-called constrained noninformative prior. However, even in the case of p, such a distribution cannot be written down in closed form, and so an approximate beta distribution is used in the case of p. In the case of the multinomial model with parametric constraints, the approach of maximum entropy does not appear tractable. This paper presents an alternative approach, based on constrained minimization of a least-squares objective function, which leads to a minimally informative Dirichlet prior distribution. The alpha-factor model for common-cause failure, which is widely used in the United States, is the motivation for this approach, and is used to illustrate the method. In this approach to modeling common-cause failure, the alpha-factors, which are the parameters in the underlying multinomial aleatory model for common-cause failure, must be estimated from data that is often quite sparse, because common-cause failures tend to be rare, especially failures of more than two or three components, and so a prior distribution that is responsive to updates with sparse data is needed.

  17. Bayesian road safety analysis: incorporation of past evidence and effect of hyper-prior choice.

    Science.gov (United States)

    Miranda-Moreno, Luis F; Heydari, Shahram; Lord, Dominique; Fu, Liping

    2013-09-01

    This paper aims to address two related issues when applying hierarchical Bayesian models for road safety analysis, namely: (a) how to incorporate available information from previous studies or past experiences in the (hyper) prior distributions for model parameters and (b) what are the potential benefits of incorporating past evidence on the results of a road safety analysis when working with scarce accident data (i.e., when calibrating models with crash datasets characterized by a very low average number of accidents and a small number of sites). A simulation framework was developed to evaluate the performance of alternative hyper-priors, including informative and non-informative Gamma, Pareto, and Uniform distributions. Based on this simulation framework, different data scenarios (i.e., number of observations and years of data) were defined and tested using crash data collected at 3-legged rural intersections in California and crash data collected for rural 4-lane highway segments in Texas. This study shows how the accuracy of model parameter estimates (inverse dispersion parameter) is considerably improved when incorporating past evidence, in particular when working with a small number of observations and crash data with a low mean. The results also illustrate that when the sample size (more than 100 sites) and the number of years of crash data are relatively large, neither the incorporation of past experience nor the choice of the hyper-prior distribution may affect the final results of a traffic safety analysis. As a potential solution to the problem of low sample mean and small sample size, this paper suggests some practical guidance on how to incorporate past evidence into informative hyper-priors. By combining evidence from past studies with the data available, model parameter estimates can be significantly improved. The effect of prior choice seems to be less important for hotspot identification. The results show the benefits of incorporating prior knowledge into road safety analysis.

  18. XRF analysis of mineralised samples

    International Nuclear Information System (INIS)

    Ahmedali, T.

    2002-01-01

    Full text: Software now supplied by instrument manufacturers has made it practical and convenient for users to analyse unusual samples routinely. Semiquantitative scanning software can be used for rapid preliminary screening of elements ranging from carbon to uranium, prior to assigning mineralised samples to an appropriate quantitative analysis routine. The general quality and precision of analytical results obtained from modern XRF spectrometers can be significantly enhanced by several means: a. Modifications in preliminary sample preparation can result in less contamination from crushing and grinding equipment, and optimised techniques of sample preparation can significantly increase the precision of results. b. Employment of automatic data-recording balances and the use of catch weights during sample preparation reduce technician time as well as weighing errors. c. Consistency of results can be improved significantly by the use of appropriate stable drift monitors with a statistically significant content of the analyte. d. A judicious selection of kV/mA combinations, analysing crystals, primary beam filters, collimators, peak positions, accurate background correction and peak overlap corrections, followed by the use of appropriate matrix correction procedures. e. Preventative maintenance procedures for XRF spectrometers and ancillary equipment, which can also contribute significantly to reducing instrument down time, are described. Examples of various facets of sample processing routines are given from the XRF spectrometer component of a multi-instrument analytical university facility, which provides XRF data to 17 Canadian universities. Copyright (2002) Australian X-ray Analytical Association Inc

  19. A two-sample Bayesian t-test for microarray data

    Directory of Open Access Journals (Sweden)

    Dimmic Matthew W

    2006-03-01

    Full Text Available Abstract Background Determining whether a gene is differentially expressed in two different samples remains an important statistical problem. Prior work in this area has featured the use of t-tests with pooled estimates of the sample variance based on similarly expressed genes. These methods do not display consistent behavior across the entire range of pooling and can be biased when the prior hyperparameters are specified heuristically. Results A two-sample Bayesian t-test is proposed for use in determining whether a gene is differentially expressed in two different samples. The test method is an extension of earlier work that made use of point estimates for the variance. The method proposed here explicitly calculates in analytic form the marginal distribution for the difference in the mean expression of two samples, obviating the need for point estimates of the variance without recourse to posterior simulation. The prior distribution involves a single hyperparameter that can be calculated in a statistically rigorous manner, making clear the connection between the prior degrees of freedom and prior variance. Conclusion The test is easy to understand and implement and application to both real and simulated data shows that the method has equal or greater power compared to the previous method and demonstrates consistent Type I error rates. The test is generally applicable outside the microarray field to any situation where prior information about the variance is available and is not limited to cases where estimates of the variance are based on many similar observations.
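    The variance-shrinkage idea behind such tests can be sketched as a moderated two-sample t-statistic: the pooled sample variance is mixed with a prior variance carrying prior degrees of freedom, and the null reference distribution gains those extra degrees of freedom. This is only a sketch in the spirit of the method; the paper's analytic marginal distribution and its rigorous hyperparameter calculation are not reproduced here, and the data below are synthetic:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)

    def moderated_t(x, y, prior_df, prior_var):
        """Two-sample t-test with the pooled variance shrunk toward a prior
        variance; prior_df extra degrees of freedom sharpen the reference t."""
        n1, n2 = len(x), len(y)
        df = n1 + n2 - 2
        sp2 = ((n1 - 1) * np.var(x, ddof=1) + (n2 - 1) * np.var(y, ddof=1)) / df
        # Posterior (shrunken) variance: weighted mix of prior and sample variance.
        post_var = (prior_df * prior_var + df * sp2) / (prior_df + df)
        t = (np.mean(x) - np.mean(y)) / np.sqrt(post_var * (1 / n1 + 1 / n2))
        p = 2 * stats.t.sf(abs(t), prior_df + df)
        return t, p

    # Two small synthetic "expression" samples with a true mean shift.
    x = rng.normal(2.0, 1.0, size=5)
    y = rng.normal(0.0, 1.0, size=5)
    t, p = moderated_t(x, y, prior_df=4, prior_var=1.0)
    print(t, p)
    ```

    With few replicates per gene, the prior stabilizes the variance estimate and avoids the erratic behaviour of the plain t-test noted in the Background.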

  20. A Bone Sample Containing a Bone Graft Substitute Analyzed by Correlating Density Information Obtained by X-ray Micro Tomography with Compositional Information Obtained by Raman Microscopy

    Directory of Open Access Journals (Sweden)

    Johann Charwat-Pessler

    2015-06-01

    Full Text Available The ability of bone graft substitutes to promote new bone formation has been increasingly used in the medical field to repair skeletal defects or to replace missing bone in a broad range of applications in dentistry and orthopedics. A common way to assess such materials is via micro computed tomography (µ-CT), through the density information provided by the absorption of X-rays. Information on the chemical composition of a material can be obtained via Raman spectroscopy. By investigating a bone sample from miniature pigs containing the bone graft substitute Bio Oss®, we aimed to assess to what extent the density information gained by µ-CT imaging matches the chemical information provided by Raman spectroscopic imaging. Raman images and Raman correlation maps of the investigated sample were used to generate a Raman-based segmented image by means of an agglomerative, hierarchical cluster analysis. The resulting segments, showing chemically related areas, were subsequently compared with the µ-CT image by means of a one-way ANOVA. We found that, to a certain extent, typical gray-level values (and the related histograms) in the µ-CT image can be reliably related to specific segments within the image resulting from the cluster analysis.
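    The analysis pipeline, hierarchical clustering of spectra into chemical segments followed by a one-way ANOVA on the co-registered CT gray values, can be sketched on synthetic data. The two "phases", the spectra, and the gray-value distributions below are illustrative stand-ins, not measurements from the study:

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.stats import f_oneway

    rng = np.random.default_rng(6)

    # Toy "Raman spectra": 30 pixels x 50 wavenumbers drawn from two chemical
    # phases (e.g. native bone vs. graft substitute) plus noise.
    phase_a = np.sin(np.linspace(0, 6, 50))
    phase_b = np.cos(np.linspace(0, 6, 50))
    spectra = np.vstack([phase_a + 0.1 * rng.normal(size=(15, 50)),
                         phase_b + 0.1 * rng.normal(size=(15, 50))])

    # Agglomerative (hierarchical) clustering of the spectra into two segments.
    Z = linkage(spectra, method="ward")
    labels = fcluster(Z, t=2, criterion="maxclust")

    # Hypothetical gray values for the same pixels from the co-registered
    # mu-CT image; the two phases are given different mean densities.
    gray = np.concatenate([rng.normal(120, 10, 15), rng.normal(180, 10, 15)])

    # One-way ANOVA: do the Raman-derived segments differ in CT gray value?
    f_stat, p_value = f_oneway(gray[labels == 1], gray[labels == 2])
    print(f_stat, p_value)
    ```

    A small p-value here corresponds to the paper's finding that gray-level distributions can, to a certain extent, be reliably related to chemically defined segments.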

  1. Weakly supervised semantic segmentation using fore-background priors

    Science.gov (United States)

    Han, Zheng; Xiao, Zhitao; Yu, Mingjun

    2017-07-01

    Weakly supervised semantic segmentation is a challenge in the field of computer vision. Most previous works utilize the labels of the whole training set and thereby need to construct a relationship graph over image labels, resulting in expensive computation. In this study, we tackle the problem from a different perspective. We propose a novel semantic segmentation algorithm based on background priors, which avoids the construction of a huge graph over the whole training dataset. Specifically, a random forest classifier is obtained using weakly supervised training data. Then a semantic texton forest (STF) feature is extracted from image superpixels. Finally, a CRF-based optimization algorithm is proposed, whose unary potential is derived from the output probability of the random forest classifier and a robust saliency map serving as the background prior. Experiments on the MSRC21 dataset show that the new algorithm outperforms some previous influential weakly supervised segmentation algorithms. Furthermore, the use of an efficient decision forest classifier and parallel computation of the saliency map significantly accelerates the implementation.
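    How the classifier output and the saliency-based background prior combine into a unary potential can be sketched per pixel. The probabilities, the saliency map, the mixing weight, and the omission of the CRF pairwise term are all illustrative simplifications, not the paper's exact formulation:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Toy per-pixel class probabilities from a (hypothetical) random forest
    # trained with weak image-level labels: 3 classes on a 16-pixel image.
    n_pixels, n_classes = 16, 3
    rf_prob = rng.dirichlet(np.ones(n_classes), size=n_pixels)

    # Saliency map as the background prior: class 0 plays the background role,
    # and salient pixels (values near 1) are discouraged from taking it.
    saliency = rng.random(n_pixels)

    # Unary potential: negative log-probability, with the background prior
    # mixed into class 0. eps guards against log(0); weight 2.0 is arbitrary.
    eps = 1e-9
    unary = -np.log(rf_prob + eps)
    unary[:, 0] += 2.0 * saliency

    # Without pairwise terms, CRF inference reduces to a per-pixel arg-min.
    labels = unary.argmin(axis=1)
    print(labels)
    ```

    In the full method a pairwise smoothness term over neighbouring superpixels would be added before minimizing the CRF energy.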

  2. Quantitative utilization of prior biological knowledge in the Bayesian network modeling of gene expression data

    Directory of Open Access Journals (Sweden)

    Gao Shouguo

    2011-08-01

    Full Text Available Abstract Background Bayesian Network (BN) is a powerful approach to reconstructing genetic regulatory networks from gene expression data. However, expression data by itself suffers from high noise and lack of power. Incorporating prior biological knowledge can improve the performance. As each type of prior knowledge on its own may be incomplete or limited by quality issues, integrating multiple sources of prior knowledge to utilize their consensus is desirable. Results We introduce a new method to incorporate the quantitative information from multiple sources of prior knowledge. It first uses the Naïve Bayesian classifier to assess the likelihood of functional linkage between gene pairs based on prior knowledge. In this study we included co-citation in PubMed and semantic similarity in Gene Ontology annotation. A candidate network edge reservoir is then created in which the copy number of each edge is proportional to the estimated likelihood of linkage between the two corresponding genes. In network simulation, the Markov chain Monte Carlo sampling algorithm is adopted, drawing samples from this reservoir at each iteration to generate new candidate networks. We evaluated the new algorithm using both simulated and real gene expression data, including that from a yeast cell cycle and a mouse pancreas development/growth study. Incorporating prior knowledge led to a ~2-fold increase in the number of known transcription regulations recovered, without significant change in the false positive rate. In contrast, without the prior knowledge BN modeling is not always better than a random selection, demonstrating the necessity in network modeling to supplement the gene expression data with additional information. Conclusion Our new development provides a statistical means to utilize the quantitative information in prior biological knowledge in the BN modeling of gene expression data, which significantly improves the performance.
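The edge reservoir described above (copy numbers proportional to prior likelihood, with MCMC proposals drawn from the reservoir) can be sketched roughly as follows. The gene names, likelihood values, and the `max_copies` scaling are illustrative assumptions, not taken from the paper:

```python
import random
from collections import Counter

def build_edge_reservoir(edge_likelihoods, max_copies=10):
    """Create a candidate-edge reservoir in which the copy number of each
    edge is proportional to its prior likelihood of functional linkage."""
    reservoir = []
    for edge, likelihood in edge_likelihoods.items():
        copies = max(1, round(likelihood * max_copies))
        reservoir.extend([edge] * copies)
    return reservoir

# Hypothetical prior likelihoods from co-citation / GO similarity.
likelihoods = {("TF1", "G1"): 0.9, ("TF1", "G2"): 0.3, ("G1", "G2"): 0.1}
reservoir = build_edge_reservoir(likelihoods)

# During the MCMC network search, candidate edges are drawn from the
# reservoir, so well-supported edges are proposed more often.
random.seed(0)
proposals = Counter(random.choice(reservoir) for _ in range(1000))
```

The reservoir thus biases the search toward edges with strong prior support while still allowing weakly supported edges to be proposed occasionally.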

  3. A novel needle-type sampling device for flexible ultrathin bronchoscopy

    International Nuclear Information System (INIS)

    Suda, Yuji; Hayashi, Katsutoshi; Shindoh, Yuriko; Iijima, Hideya; Tanaka, Akiko

    2008-01-01

    Diagnosis of suspected cancer in the periphery of the lung is difficult. A flexible ultrathin bronchoscope has been developed for the diagnosis of peripherally located pulmonary lesions that cannot be reached with the sampling devices for standard flexible bronchoscopes. The diagnostic yield with forceps and a brush for ultrathin bronchoscopes, however, is not adequate, especially when a lesion is not exposed to the bronchial lumen. We have thus developed a novel needle-type sampling device and tested its yield in transbronchial cytology. The device consists of an elongated dental H-file (0.4 mm in diameter and 110 cm in length), a housing sheath (1.0 mm in outer diameter), and a novel handle, which enables rapid out-and-in motion of the needle. Ten consecutive patients with a peripheral pulmonary lesion who had an indication for a diagnostic procedure with a flexible ultrathin bronchoscope were enrolled. The optimal bronchial route to the lesion was analyzed with virtual bronchoscopy in a data set obtained with high-resolution computed tomography, and a novel bronchial route labeling system (prior-ridge-based relative orientation nomenclature) was employed to guide insertion of the bronchoscope. Sampling with the novel needle was performed prior to use of the forceps and brush under conventional fluoroscopy. In all cases, sampling with the needle was successful and the amount of the specimen was sufficient for cytology. Our novel sampling system with flexible ultrathin bronchoscopes may contribute to accurate and minimally invasive diagnosis of peripheral pulmonary lesions. (author)

  4. Should warfarin or aspirin be stopped prior to prostate biopsy? An analysis of bleeding complications related to increasing sample number regimes

    International Nuclear Information System (INIS)

    Chowdhury, R.; Abbas, A.; Idriz, S.; Hoy, A.; Rutherford, E.E.; Smart, J.M.

    2012-01-01

    Aim: To determine whether patients undergoing transrectal ultrasound (TRUS)-guided prostate biopsy with increased sampling numbers are more likely to experience bleeding complications and whether warfarin or low-dose aspirin are independent risk factors. Materials and methods: 930 consecutive patients with suspected prostatic cancer were followed up after biopsy. Warfarin/low-dose aspirin was not stopped prior to the procedure. An eight- to 10-sample TRUS-guided prostate biopsy regime was performed and patients were offered a questionnaire to complete 10 days after the procedure, to determine any immediate or delayed bleeding complications. Results: 902 patients returned completed questionnaires. 579 (64.2%) underwent eight core biopsies, 47 (5.2%) underwent nine, and 276 (30.6%) underwent 10. 68 were taking warfarin [mean international normalized ratio (INR) = 2.5], 216 were taking low-dose aspirin, one was taking both, and 617 were taking neither. 27.9% of those on warfarin and 33.8% of those on aspirin experienced haematuria, compared with 37% of those on no blood-thinning medication. 13.2% of those on warfarin and 14.4% of those on aspirin experienced rectal bleeding, compared with 11.5% of those on no blood-thinning medication. 7.4% of those on warfarin and 12% of those on aspirin experienced haematospermia, compared with 13.8% of those on neither. Regression analysis showed a significant association between increasing sampling number and the occurrence of all bleeding complication types. There was no significant association between minor bleeding complications and warfarin use; however, there was a significant association between minor bleeding complications and low-dose aspirin use. There were no severe bleeding complications. Conclusion: There is an increased risk of bleeding complications following TRUS-guided prostate biopsy with increased sampling numbers, but these are minor. There is also an increased risk with low-dose aspirin use.

  5. Magnetic stirrer induced dispersive ionic-liquid microextraction for the determination of vanadium in water and food samples prior to graphite furnace atomic absorption spectrometry.

    Science.gov (United States)

    Naeemullah; Kazi, Tasneem Gul; Tuzen, Mustafa

    2015-04-01

    A new dispersive liquid-liquid microextraction method, magnetic stirrer induced dispersive ionic-liquid microextraction (MS-IL-DLLME), was developed to quantify trace levels of vanadium in real water and food samples by graphite furnace atomic absorption spectrometry (GFAAS). In this extraction method a magnetic stirrer was applied to obtain a dispersive medium of 1-butyl-3-methylimidazolium hexafluorophosphate [C4MIM][PF6] in aqueous solutions (real water samples and digested food samples) to increase the phase transfer ratio, which significantly enhances the recovery of the vanadium-4-(2-pyridylazo)resorcinol (PAR) chelate. Variables playing a vital role in the microextraction were optimised to obtain the maximum recovery of the analyte. Under the optimised experimental variables, the enhancement factor (EF) and limit of detection (LOD) were found to be 125 and 18 ng L(-1), respectively. The validity and accuracy of the method were checked by analysis of certified reference materials (SLRS-4 Riverine water and NIST SRM 1515 Apple leaves). The relative standard deviation (RSD) for 10 replicate determinations at a vanadium level of 0.5 μg L(-1) was found to be <5.0%. The method was successfully applied to real water and acid-digested food samples. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Models for Validation of Prior Learning (VPL)

    DEFF Research Database (Denmark)

    Ehlers, Søren

    The national policies for the education/training of adults are in the 21st century highly influenced by proposals formulated and promoted by the European Union (EU) as well as other transnational players, and this shift in policy making has consequences. One is that ideas which in the past...... would have been categorized as utopian can become realpolitik. Validation of Prior Learning (VPL) was in Europe mainly regarded as utopian, while universities in the United States of America (USA) were developing ways to grant credit to students who came with experience from working life....

  7. Characterization and decant of Tank 42H sludge sample ESP-200

    International Nuclear Information System (INIS)

    Hay, M.S.

    2000-01-01

    DWPF Engineering requested that the Savannah River Technology Center (SRTC) provide a demonstration of the DWPF flowsheet on sludge from Tank 42H in the Shielded Cell facility. A 5 liter sample of the Tank 42H sludge (ESP-200), obtained with the tank contents fully mixed, arrived at SRTC on January 20, 1998. This report details receipt of the 5 liter sample at SRTC, the decant of the sample, and the characterization of the pre- and post-decant Tank 42H sludge. Evaluation of the measured composition of the supernate indicates Sample ESP-200 became diluted approximately 20 percent by volume prior to receipt. This dilution complicates the relationship of the characterization of Post-Decant ESP-200 to the current contents of Tank 42H. For the purposes of modeling the current tank contents of Tank 42H, this report provides an estimated composition based on analytical data of recent samples from Tank 42H

  8. High-level core sample x-ray imaging at the Hanford Site

    International Nuclear Information System (INIS)

    Weber, J.R.; Keye, J.K.

    1995-01-01

    Waste tank sampling of radioactive high-level waste is required for continued operations, waste characterization, and site safety. Hanford Site tank farms consist of 28 double-shell and 149 single-shell underground storage tanks. The single-shell tanks are out of service and no longer receive liquid waste. Core samples of salt cake and sludge waste are remotely obtained using truck-mounted core drill platforms. Samples are recovered from tanks through a 2.25 inch (in.) drill pipe in 26-in. steel tubes, 1.5 in. in diameter. Drilling parameters vary with different waste types. Because sample recovery has been marginal and inadequate at times, a system was needed to provide drill truck operators with real-time feedback about the physical condition of the sample and the percent recovery, prior to making nuclear assay measurements and characterizations at the analytical laboratory. Westinghouse Hanford Company conducted proof-of-principle radiographic testing to verify the feasibility of a proposed imaging system.

  9. Characterization and decant of Tank 42H sludge sample ESP-200

    Energy Technology Data Exchange (ETDEWEB)

    Hay, M.S.

    2000-04-25

    DWPF Engineering requested that the Savannah River Technology Center (SRTC) provide a demonstration of the DWPF flowsheet on sludge from Tank 42H in the Shielded Cell facility. A 5 liter sample of the Tank 42H sludge (ESP-200), obtained with the tank contents fully mixed, arrived at SRTC on January 20, 1998. This report details receipt of the 5 liter sample at SRTC, the decant of the sample, and the characterization of the pre- and post-decant Tank 42H sludge. Evaluation of the measured composition of the supernate indicates Sample ESP-200 became diluted approximately 20 percent by volume prior to receipt. This dilution complicates the relationship of the characterization of Post-Decant ESP-200 to the current contents of Tank 42H. For the purposes of modeling the current tank contents of Tank 42H, this report provides an estimated composition based on analytical data of recent samples from Tank 42H.

  10. Accommodating Uncertainty in Prior Distributions

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-19

    A fundamental premise of Bayesian methodology is that a priori information is accurately summarized by a single, precisely defined prior distribution. In many cases, especially involving informative priors, this premise is false, and the (mis)application of Bayes methods produces posterior quantities whose apparent precisions are highly misleading. We examine the implications of uncertainty in prior distributions, and present graphical methods for dealing with them.
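The point about uncertainty in the prior can be made concrete with a conjugate toy example: instead of committing to a single "precise" prior, sweep a family of plausible Beta priors and examine how far the posterior mean moves. The priors and data below are hypothetical, chosen only for illustration:

```python
# Beta-binomial sensitivity check: how much does the posterior mean of a
# success probability depend on which of several plausible priors we pick?
successes, trials = 7, 20
failures = trials - successes
priors = [(1, 1), (2, 8), (5, 5), (8, 2)]   # hypothetical Beta(a, b) priors

# Conjugate update: Beta(a, b) prior -> Beta(a + successes, b + failures),
# whose mean is (a + successes) / (a + b + trials).
post_means = [(a + successes) / (a + b + trials) for a, b in priors]
spread = max(post_means) - min(post_means)
# A non-trivial spread signals that reporting results under one "precise"
# prior would overstate the precision of the posterior.
```

Here the posterior mean ranges over an interval of width 0.2, even though every individual prior would have produced a tidy-looking credible interval on its own.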

  11. Results For The Third Quarter Calendar Year 2016 Tank 50H Salt Solution Sample

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, C. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-10-13

    In this memorandum, the chemical and radionuclide contaminant results from the Third Quarter Calendar Year 2016 (CY16) sample of Tank 50H salt solution are presented in tabulated form. The Third Quarter CY16 Tank 50H samples (a 200 mL sample obtained 6” below the surface (HTF-5-16-63) and a 1 L sample obtained 66” from the tank bottom (HTF-50-16-64)) were obtained on July 14, 2016 and received at Savannah River National Laboratory (SRNL) on the same day. Prior to obtaining the samples from Tank 50H, a single pump was run at least 4.4 hours, and the samples were pulled immediately after pump shut down. The information from this characterization will be used by Defense Waste Processing Facility (DWPF) & Saltstone Facility Engineering for the transfer of aqueous waste from Tank 50H to the Saltstone Production Facility, where the waste will be treated and disposed of in the Saltstone Disposal Facility. This memorandum compares results, where applicable, to Saltstone Waste Acceptance Criteria (WAC) limits and targets. Data pertaining to the regulatory limits for Resource Conservation and Recovery Act (RCRA) metals will be documented at a later time per the Task Technical and Quality Assurance Plan (TTQAP) for the Tank 50H saltstone task. The chemical and radionuclide contaminant results from the characterization of the Third Quarter CY16 sampling of Tank 50H were requested by Savannah River Remediation (SRR) personnel and details of the testing are presented in the SRNL TTQAP.

  12. Performance of informative priors skeptical of large treatment effects in clinical trials: A simulation study.

    Science.gov (United States)

    Pedroza, Claudia; Han, Weilu; Thanh Truong, Van Thi; Green, Charles; Tyson, Jon E

    2018-01-01

    One of the main advantages of Bayesian analyses of clinical trials is their ability to formally incorporate skepticism about large treatment effects through the use of informative priors. We conducted a simulation study to assess the performance of informative normal, Student-t, and beta distributions in estimating relative risk (RR) or odds ratio (OR) for binary outcomes. Simulation scenarios varied the prior standard deviation (SD; level of skepticism of large treatment effects), outcome rate in the control group, true treatment effect, and sample size. We compared the priors with regard to bias, mean squared error (MSE), and coverage of 95% credible intervals. Simulation results show that the prior SD influenced the posterior to a greater degree than the particular distributional form of the prior. For RR, priors with a 95% interval of 0.50-2.0 performed well in terms of bias, MSE, and coverage under most scenarios. For OR, priors with a wider 95% interval of 0.23-4.35 had good performance. We recommend the use of informative priors that exclude implausibly large treatment effects in analyses of clinical trials, particularly for major outcomes such as mortality.
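The skeptical priors above can be constructed by choosing the SD of a normal prior on the log(RR) or log(OR) scale so that its central 95% interval matches the quoted ratio interval. A small sketch; the intervals are those from the abstract, and the rest is standard normal-quantile arithmetic:

```python
import math

def skeptical_prior_sd(lower, upper):
    """SD of a normal prior on the log-ratio scale whose central 95%
    interval spans (lower, upper) on the ratio scale; the prior mean is
    the log midpoint (0 when the interval is symmetric about 1)."""
    z = 1.959963984540054  # two-sided 95% standard-normal quantile
    return (math.log(upper) - math.log(lower)) / (2 * z)

sd_rr = skeptical_prior_sd(0.50, 2.0)    # prior skeptical of RR outside 0.5-2.0
sd_or = skeptical_prior_sd(0.23, 4.35)   # wider prior used for odds ratios
print(round(sd_rr, 3), round(sd_or, 3))  # -> 0.354 0.75
```

A Normal(0, 0.354²) prior on log(RR) therefore places 95% of its mass between RR = 0.5 and RR = 2.0, encoding the skepticism the abstract recommends.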

  13. Statistical evaluation of fatty acid profile and cholesterol content in fish (common carp) lipids obtained by different sample preparation procedures.

    Science.gov (United States)

    Spiric, Aurelija; Trbovic, Dejana; Vranic, Danijela; Djinovic, Jasna; Petronijevic, Radivoj; Matekalo-Sverak, Vesna

    2010-07-05

    the second principal component (PC2) is recorded by C18:3 n-3, and C20:3 n-6, being present in a higher amount in the samples treated by the modified Soxhlet extraction, while C22:5 n-3, C20:3 n-3, C22:1 and C20:4, C16 and C18 negatively influence the score values of the PC2, showing significantly increased level in the samples treated by ASE method. Hotelling's paired T-square test used on the first three principal components for confirmation of differences in individual fatty acid content obtained by ASE and Soxhlet method in carp muscle showed statistically significant difference between these two data sets (T(2)=161.308, p<0.001). Copyright 2010 Elsevier B.V. All rights reserved.
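Hotelling's paired T-square test on principal-component scores, as used above, can be sketched as follows. The random data stand in for the PCA scores of the two extraction methods, and the implementation is a generic textbook version, not the authors' code:

```python
import numpy as np
from scipy import stats

def paired_hotelling_t2(X, Y):
    """Hotelling's paired T-square test for a multivariate difference
    between two measurements on the same samples (n samples, p variables)."""
    D = X - Y                          # paired differences, shape (n, p)
    n, p = D.shape
    dbar = D.mean(axis=0)
    S = np.cov(D, rowvar=False)        # covariance of the differences
    t2 = n * dbar @ np.linalg.solve(S, dbar)
    f = (n - p) / (p * (n - 1)) * t2   # T^2 -> F(p, n - p) statistic
    return t2, stats.f.sf(f, p, n - p)

rng = np.random.default_rng(0)
scores_ase = rng.normal(size=(30, 3))                        # stand-in PC scores
scores_soxhlet = scores_ase + rng.normal(0.5, 1.0, (30, 3))  # systematic shift
t2, pval = paired_hotelling_t2(scores_ase, scores_soxhlet)
```

Because the two "methods" differ by a built-in shift, the test should return a large T² and a small p-value, mirroring the significant difference (T² = 161.308, p < 0.001) reported in the abstract.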

  14. High-level core sample x-ray imaging at the Hanford Site

    International Nuclear Information System (INIS)

    Weber, J.R.; Keve, J.K.

    1995-10-01

    Waste tank sampling of radioactive high-level waste is required for continued operations, waste characterization, and site safety. Hanford Site tank farms consist of 28 double-shell and 149 single-shell underground storage tanks. The single-shell tanks are out of service and no longer receive liquid waste. Core samples of salt cake and sludge waste are remotely obtained using truck-mounted core drill platforms. Samples are recovered from tanks through a 2.25 inch (in.) drill pipe in 26-in. steel tubes, 1.5 in. in diameter. Drilling parameters vary with different waste types. Because sample recovery has been marginal and inadequate at times, a system was needed to provide drill truck operators with real-time feedback about the physical condition of the sample and the percent recovery, prior to making nuclear assay measurements and characterizations at the analytical laboratory. The Westinghouse Hanford Company conducted proof-of-principle radiographic testing to verify the feasibility of a proposed imaging system. Tests were conducted using an iridium-192 radiography source to determine the effects of high radiation on image quality. The tests concluded that samplers with a dose rate in excess of 5000 R/hr could be imaged with only a slight loss of image quality, and samples less than 1000 R/hr have virtually no effect on image quality. The Mobile Core Sample X-Ray Examination System, a portable vendor-engineered assembly, has components uniquely configured to produce a real-time radiographic system suitable for safely examining radioactive tank core segments collected at the Hanford Site. The radiographic region of interest extends from the bottom (valve) of the sampler upward 19 to 20 in. The purpose of the Mobile Core Sample X-Ray Examination System is to examine the physical contents of core samples after removal from the tank and prior to placement in an onsite transfer cask.

  15. Nanofluid of zinc oxide nanoparticles in ionic liquid for single drop liquid microextraction of fungicides in environmental waters prior to high performance liquid chromatographic analysis.

    Science.gov (United States)

    Amde, Meseret; Tan, Zhi-Qiang; Liu, Rui; Liu, Jing-Fu

    2015-05-22

    Using a nanofluid obtained by dispersing ZnO nanoparticles (ZnO NPs) in 1-hexyl-3-methylimidazolium hexafluorophosphate, a new single-drop microextraction method was developed for simultaneous extraction of three fungicides (chlorothalonil, kresoxim-methyl and famoxadone) in water samples prior to their analysis by high performance liquid chromatography (HPLC-VWD). The parameters affecting the extraction efficiency, such as the amount of ZnO NPs in the nanofluid, solvent volume, extraction time, stirring rate, and pH and ionic strength of the sample solution, were optimized. Under the optimized conditions, the limits of detection were in the range of 0.13-0.19 ng/mL, and the precision of the method was assessed with intra-day and inter-day relative standard deviations. The method was applied to water samples including lake water and river water, as well as effluent and influent of a wastewater treatment plant, with recoveries in the range of 74.94-96.11% at a 5 ng/mL spiking level. Besides being environmentally friendly, the high enrichment factor and the data quality obtained with the proposed method demonstrated its potential for application in multi-residue analysis of fungicides in actual water samples. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Comparison of geochemical data obtained using four brine sampling methods at the SECARB Phase III Anthropogenic Test CO2 injection site, Citronelle Oil Field, Alabama

    Science.gov (United States)

    Conaway, Christopher; Thordsen, James J.; Manning, Michael A.; Cook, Paul J.; Trautz, Robert C.; Thomas, Burt; Kharaka, Yousif K.

    2016-01-01

    The chemical composition of formation water and associated gases from the lower Cretaceous Paluxy Formation was determined using four different sampling methods at a characterization well in the Citronelle Oil Field, Alabama, as part of the Southeast Regional Carbon Sequestration Partnership (SECARB) Phase III Anthropogenic Test, which is an integrated carbon capture and storage project. In this study, formation water and gas samples were obtained from well D-9-8 #2 at Citronelle using gas lift, electric submersible pump, U-tube, and a downhole vacuum sampler (VS) and subjected to both field and laboratory analyses. Field chemical analyses included electrical conductivity, dissolved sulfide concentration, alkalinity, and pH; laboratory analyses included major, minor and trace elements, dissolved carbon, volatile fatty acids, free and dissolved gas species. The formation water obtained from this well is a Na–Ca–Cl-type brine with a salinity of about 200,000 mg/L total dissolved solids. Differences were evident between sampling methodologies, particularly in pH, Fe and alkalinity. There was little gas in samples, and gas composition results were strongly influenced by sampling methods. The results of the comparison demonstrate the difficulty and importance of preserving volatile analytes in samples, with the VS and U-tube system performing most favorably in this aspect.

  17. Bioanalysis of a panel of neurotransmitters and their metabolites in plasma samples obtained from pediatric patients with neuroblastoma and Wilms' tumor.

    Science.gov (United States)

    Konieczna, Lucyna; Roszkowska, Anna; Stachowicz-Stencel, Teresa; Synakiewicz, Anna; Bączek, Tomasz

    2018-02-01

    This paper details the quantitative analysis of neurotransmitters, including dopamine (DA), norepinephrine (NE), epinephrine (E), and serotonin (5-HT), along with their respective precursors and metabolites in children with solid tumors: Wilms' tumor (WT) and neuroblastoma (NB). A panel of neurotransmitters was determined with the use of dispersive liquid-liquid microextraction (DLLME) technique combined with liquid-chromatography mass spectrometry (LC-MS/MS) in plasma samples obtained from a group of pediatric subjects with solid tumors and a control group of healthy children. Next, statistical univariate analysis (t-test) and multivariate analysis (Principal Component Analysis) were performed using chromatographic data. The levels of tyrosine (Tyr) and tryptophan (Trp) (the precursors of analyzed neurotransmitters) as well as 3,4-dihydroxyphenylacetic acid (DOPAC) (a product of metabolism of DA) were significantly higher in the plasma samples obtained from pediatric patients with WT than in the samples taken from the control group. Moreover, statistically significant differences were observed between the levels of 5-HT and homovanillic acid (HVA) in the plasma samples from pediatric patients with solid tumors and the control group. However, elevated levels of these analytes did not facilitate a clear distinction between pediatric patients with WT and those with NB. Nonetheless, the application of advanced statistical tools allowed the healthy controls to be differentiated from the pediatric oncological patients. The identification and quantification of a panel of neurotransmitters as potential prognostic factors in selected childhood malignancies may provide clinically relevant information about ongoing metabolic alterations, and it could potentially serve as an adjunctive strategy in the effective diagnosis and treatment of solid tumors in children. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Iterated random walks with shape prior

    DEFF Research Database (Denmark)

    Pujadas, Esmeralda Ruiz; Kjer, Hans Martin; Piella, Gemma

    2016-01-01

    We propose a new framework for image segmentation using random walks where a distance shape prior is combined with a region term. The shape prior is weighted by a confidence map to reduce the influence of the prior in high-gradient areas, and the region term is computed with k-means to estimate the parametric probability density function. Then, random walks segmentation is performed iteratively, aligning the prior with the current segmentation in every iteration. We tested the proposed approach with natural and medical images and compared it with the latest techniques with random walks and shape priors. The experiments suggest that this method gives promising results for medical and natural images.

  19. Effective dielectric functions of samples obtained by evaporation of alkali halides

    International Nuclear Information System (INIS)

    Sturm, J.; Grosse, P.; Theiss, W.

    1991-01-01

    This paper investigates the dielectric properties of inhomogeneous samples consisting of small alkali halide particles (NaCl, KBr) on gold-coated substrates. Our reflection measurements in the far infrared can be simulated as a thin layer of the powder with an effective dielectric function on a perfectly reflecting substrate. Scanning electron micrographs provide useful information about the sample topology. Several mixing formulas (e.g. the Maxwell-Garnett, Bruggeman, and Looyenga formulas) lead to effective dielectric functions that neglect the individual arrangement of the particles. The essence of our work is that, in contrast, the general ansatz of the Bergman spectral representation has to be employed in order to take topology effects on the dielectric function into account, based on the so-called spectral density g, which is adjustable to the specific situation. (orig.)

  20. The Prior-project

    DEFF Research Database (Denmark)

    Engerer, Volkmar Paul; Roued-Cunliffe, Henriette; Albretsen, Jørgen

    In this paper, we present a DH research infrastructure which relies heavily on a combination of domain knowledge with information technology. The general goal is to develop tools to aid scholars in their interpretations and understanding of temporal logic. This in turn is based on an extensive digitisation of Arthur Prior's Nachlass kept in the Bodleian Library, Oxford. The DH infrastructure in question is the Prior Virtual Lab (PVL). PVL was established in 2011 in order to provide researchers in the field of temporal logic easy access to the papers of Arthur Norman Prior (1914-1969), and officially...

  1. Protocol for sampling and analysis of bone specimens

    International Nuclear Information System (INIS)

    Aras, N.K.

    2000-01-01

    The iliac crest of the hip bone was chosen as the most suitable sampling site for several reasons: local variation in elemental concentration along the iliac crest is minimal; iliac crest biopsies are commonly taken clinically from patients; the cortical part of the sample is small (∼2 mm) and can be separated easily from the trabecular bone; and the use of the trabecular part of the iliac crest for trace element analysis has the advantage of rapidly reflecting changes in the composition of bone due to external parameters, including medication. Biopsy studies, although in some ways more difficult than autopsy studies because of the need to obtain the informed consent of the subjects, are potentially more useful: many problems of postmortem migration of elements can be avoided, and reliable dietary and other data can be collected simultaneously. Select the subjects among patients undergoing orthopedic surgery for any reason other than osteoporosis. Follow an established protocol to obtain bone biopsies. Patients undergoing surgery should fill in the 'Osteoporosis Project Questionnaire Form', including information on lifestyle variables, dietary intakes, the reason for surgery, etc. If possible, measure the bone mineral density (BMD) prior to removal of the biopsy sample. However, it may not be possible to have BMD results on all the subjects because of the difficulty of DEXA measurement after an accident.

  2. A Bayesian hierarchical model with novel prior specifications for estimating HIV testing rates.

    Science.gov (United States)

    An, Qian; Kang, Jian; Song, Ruiguang; Hall, H Irene

    2016-04-30

    Human immunodeficiency virus (HIV) infection is a severe infectious disease actively spreading globally, and acquired immunodeficiency syndrome (AIDS) is an advanced stage of HIV infection. The HIV testing rate, that is, the probability that an AIDS-free HIV-infected person seeks a test for HIV during a particular time interval, given that no previous positive test has been obtained prior to the start of the interval, is an important parameter for public health. In this paper, we propose a Bayesian hierarchical model with two levels of hierarchy to estimate the HIV testing rate using annual AIDS and AIDS-free HIV diagnosis data. At level one, we model the latent number of HIV infections for each year using a Poisson distribution with the intensity parameter representing the HIV incidence rate. At level two, the annual numbers of AIDS and AIDS-free HIV diagnosed cases and all undiagnosed cases, stratified by the year of HIV infection, are modeled using a multinomial distribution with parameters including the HIV testing rate. We propose a new class of priors for the HIV incidence rate and HIV testing rate that takes into account the temporal dependence of these parameters to improve estimation accuracy. We develop an efficient posterior computation algorithm based on the adaptive rejection metropolis sampling technique. We demonstrate our model using simulation studies and the analysis of the national HIV surveillance data in the USA. Copyright © 2015 John Wiley & Sons, Ltd.
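The two-level structure described above can be sketched as a forward simulation. The rates below are hypothetical, and the outcome split is simplified to a single year of follow-up per cohort rather than the full year-stratified bookkeeping of the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

years = 5
incidence_rate = 1000.0   # hypothetical Poisson intensity (infections/year)
testing_rate = 0.3        # hypothetical annual HIV testing rate
aids_prog = 0.1           # hypothetical annual progression to AIDS

# Level 1: latent number of new HIV infections in each year.
infections = rng.poisson(incidence_rate, size=years)

# Level 2: each year's infections split into AIDS-free HIV diagnoses,
# AIDS diagnoses, and still-undiagnosed cases (multinomial outcome).
p = np.array([
    testing_rate * (1 - aids_prog),                  # diagnosed while AIDS-free
    aids_prog,                                       # diagnosed at AIDS stage
    1 - testing_rate * (1 - aids_prog) - aids_prog,  # still undiagnosed
])
outcomes = np.array([rng.multinomial(n, p) for n in infections])
```

Inference would run this generative logic in reverse, e.g. with the adaptive rejection Metropolis sampler mentioned in the abstract; that machinery is omitted here.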

  3. Abdominal multi-organ CT segmentation using organ correlation graph and prediction-based shape and location priors.

    Science.gov (United States)

    Okada, Toshiyuki; Linguraru, Marius George; Hori, Masatoshi; Summers, Ronald M; Tomiyama, Noriyuki; Sato, Yoshinobu

    2013-01-01

    The paper addresses the automated segmentation of multiple organs in upper abdominal CT data. We propose a framework of multi-organ segmentation which is adaptable to any imaging conditions without using intensity information in manually traced training data. The features of the framework are as follows: (1) the organ correlation graph (OCG) is introduced, which encodes the spatial correlations among organs inherent in human anatomy; (2) the patient-specific organ shape and location priors obtained using OCG enable the estimation of intensity priors from only target data and optionally a number of untraced CT data of the same imaging condition as the target data. The proposed methods were evaluated through segmentation of eight abdominal organs (liver, spleen, left and right kidney, pancreas, gallbladder, aorta, and inferior vena cava) from 86 CT data obtained by four imaging conditions at two hospitals. The performance was comparable to the state-of-the-art method using intensity priors constructed from manually traced data.

  4. Prenatal cytogenetic diagnosis in Spain: analysis and evaluation of the results obtained from amniotic fluid samples during the last decade.

    Science.gov (United States)

    Mademont-Soler, Irene; Morales, Carme; Clusellas, Núria; Soler, Anna; Sánchez, Aurora

    2011-08-01

    Chromosome abnormalities are one of the main causes of congenital defects, and establishing their frequency according to the different clinical indications for invasive procedure during pregnancy is especially important for genetic counselling. We analyzed the results of 29,883 amniotic fluid samples referred to our laboratory for cytogenetic studies from 1998 to 2009, which constitutes the largest series of cytogenetic analysis performed on amniotic fluid samples in Spain. The number of samples received tended to increase from 1998 to 2005, but after 2005 it decreased substantially. Cytogenetic results were obtained in 99.5% of the samples, and the detected incidence of chromosome abnormalities was 2.9%. Of these, 48.1% consisted of classical autosomal aneuploidies, trisomy 21 being the most frequent one. The main clinical indications for amniocentesis were positive prenatal screening and advanced maternal age, but referral reasons with highest positive predictive values were, excluding parental chromosome rearrangement, increased nuchal translucency (9.2%) and ultrasound abnormalities (6.6%). In conclusion, performing the karyotype on amniotic fluid samples is a good method for the detection of chromosome abnormalities during pregnancy. The number of cytogenetic studies on amniotic fluid has now decreased, however, due to the implementation of first trimester prenatal screening for the detection of Down syndrome, which allows karyotyping on chorionic villus samples. Our results also show that both ultrasound abnormalities and increased nuchal translucency are excellent clinical indicators for fetal chromosome abnormality. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  5. Recolonization of group B Streptococcus (GBS) in women with prior GBS genital colonization in pregnancy.

    Science.gov (United States)

    Tam, Teresa; Bilinski, Ewa; Lombard, Emily

    2012-10-01

    The purpose of this study was to evaluate the incidence of recolonization in subsequent pregnancies among women with prior GBS genital colonization. This is a retrospective cohort study of patients with prior GBS genital colonization in pregnancy and a subsequent pregnancy with a recorded GBS culture result, from January 2000 through June 2007. GBS status was documented by GBS culture performed between 35 and 37 weeks gestation. Exclusion criteria were pregnancies with unknown GBS status, patients with GBS bacteriuria, women with a previous neonate with GBS disease, and GBS findings prior to 35 weeks. Data were analyzed using SPSS 15.0. The sample proportion of subjects with GBS genital colonization and its confidence interval were computed to estimate the incidence rate. Logistic regression was performed to assess potential determinants of GBS colonization. Regression coefficients, odds ratios with associated confidence intervals, and p-values were reported, with significant results noted. There were 371 pregnancies that met the study criteria: 151 subsequent pregnancies with GBS genital colonization and 220 without GBS recolonization. The incidence of GBS recolonization among patients with prior GBS genital colonization was 40.7% (95% confidence interval 35.7-45.69%). The incidence rate for the sample was significantly larger than 30% (p recolonization in subsequent pregnancies.
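
    The quoted incidence and confidence interval can be reproduced from the reported counts (151 of 371). A minimal check using a Wald interval and a one-sample z-test against the 30% benchmark (the choice of a Wald interval is an assumption; the abstract does not name the interval method):

```python
import math

# Counts reported in the abstract: 151 of 371 subsequent pregnancies recolonized.
recolonized, n = 151, 371
p_hat = recolonized / n                          # sample proportion

# Wald 95% confidence interval for a proportion
z = 1.959964                                     # 97.5th standard-normal percentile
se = math.sqrt(p_hat * (1 - p_hat) / n)
ci_low, ci_high = p_hat - z * se, p_hat + z * se

# One-sample z-test of H0: p = 0.30
p0 = 0.30
z_stat = (p_hat - p0) / math.sqrt(p0 * (1 - p0) / n)

print(f"incidence = {p_hat:.1%}, 95% CI = ({ci_low:.1%}, {ci_high:.1%}), z = {z_stat:.2f}")
# → incidence = 40.7%, 95% CI = (35.7%, 45.7%), matching the values quoted above
```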

  6. Use of a D17Z1 oligonucleotide probe for human DNA quantitation prior to PCR analysis of polymorphic DNA markers

    Energy Technology Data Exchange (ETDEWEB)

    Walsh, S.; Alavaren, M.; Varlaro, J. [Roche Molecular Systems, Alameda, CA (United States)] [and others]

    1994-09-01

    The alpha-satellite DNA locus D17Z1 contains primate-specific sequences which are repeated several hundred times per chromosome 17. A probe that was designed to hybridize to a subset of the D17Z1 sequence can be used for very sensitive and specific quantitation of human DNA. Sample human genomic DNA is immobilized on a nylon membrane using a slot blot apparatus, and then hybridized with a biotinylated D17Z1 oligonucleotide probe. The subsequent binding of streptavidin-horseradish peroxidase to the bound probe allows for either colorimetric (TMB) or chemiluminescent (ECL) detection. Signals obtained for sample DNAs are then compared to the signals obtained for a series of human DNA standards. For either detection method, forty samples can be quantitated in less than two hours, with a sensitivity of 150 pg. As little as 20 pg of DNA can be quantitated when using chemiluminescent detection with longer film exposures. PCR analysis of several VNTR and STR markers has indicated that optimal typing results are generally obtained within a relatively narrow range of input DNA quantities. Too much input DNA can lead to PCR artifacts such as preferential amplification of smaller alleles, non-specific amplification products, and exaggeration of the DNA synthesis slippage products that are seen with STR markers. Careful quantitation of human genomic DNA prior to PCR can avoid or minimize these problems and ultimately give cleaner, unambiguous PCR results.

  7. 40 CFR 61.54 - Sludge sampling.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 8 2010-07-01 2010-07-01 false Sludge sampling. 61.54 Section 61.54... sampling. (a) As an alternative means for demonstrating compliance with § 61.52(b), an owner or operator... days prior to a sludge sampling test, so that he may at his option observe the test. (c) Sludge shall...

  8. Results for the first quarter calendar year 2017 tank 50H salt solution sample

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, C. L. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-04-12

    In this memorandum, the chemical and radionuclide contaminant results from the First Quarter Calendar Year 2017 (CY17) sample of Tank 50H salt solution are presented in tabulated form. The First Quarter CY17 Tank 50H samples [a 200 mL sample obtained 6” below the surface (HTF-50-17-7) and a 1 L sample obtained 66” from the tank bottom (HTF-50-17-8)] were obtained on January 15, 2017 and received at Savannah River National Laboratory (SRNL) on January 16, 2017. Prior to obtaining the samples from Tank 50H, a single pump was run at least 4.4 hours and the samples were pulled immediately after pump shut down. All volatile organic analysis (VOA) and semi-volatile organic analysis (SVOA) were performed on the surface sample and all other analyses were performed on the variable depth sample. The information from this characterization will be used by Savannah River Remediation (SRR) for the transfer of aqueous waste from Tank 50H to the Saltstone Production Facility, where the waste will be treated and disposed of in the Saltstone Disposal Facility. This memorandum compares results, where applicable, to Saltstone Waste Acceptance Criteria (WAC) limits and targets. The chemical and radionuclide contaminant results from the characterization of the First Quarter CY17 sampling of Tank 50H were requested by SRR personnel and details of the testing are presented in the SRNL Task Technical and Quality Assurance Plan (TTQAP). This memorandum is part of Deliverable 2 from SRR request. Data pertaining to the regulatory limits for Resource Conservation and Recovery Act (RCRA) metals will be documented at a later time per the TTQAP for the Tank 50H saltstone task.

  9. A Frequency Matching Method: Solving Inverse Problems by Use of Geologically Realistic Prior Information

    DEFF Research Database (Denmark)

    Lange, Katrine; Frydendall, Jan; Cordua, Knud Skou

    2012-01-01

    The frequency matching method defines a closed form expression for a complex prior that quantifies the higher order statistics of a proposed solution model to an inverse problem. While existing solution methods to inverse problems are capable of sampling the solution space while taking into account arbitrarily complex a priori information defined by sample algorithms, it is not possible to directly compute the maximum a posteriori model, as the prior probability of a solution model cannot be expressed. We demonstrate how the frequency matching method enables us to compute the maximum a posteriori solution model to an inverse problem by using a priori information based on multiple point statistics learned from training images. We demonstrate the applicability of the suggested method on a synthetic tomographic crosshole inverse problem.

  10. A nested sampling particle filter for nonlinear data assimilation

    KAUST Repository

    Elsheikh, Ahmed H.

    2014-04-15

    We present an efficient nonlinear data assimilation filter that combines particle filtering with the nested sampling algorithm. Particle filters (PF) utilize a set of weighted particles as a discrete representation of probability distribution functions (PDF). These particles are propagated through the system dynamics and their weights are sequentially updated based on the likelihood of the observed data. Nested sampling (NS) is an efficient sampling algorithm that iteratively builds a discrete representation of the posterior distributions by focusing a set of particles to high-likelihood regions. This would allow the representation of the posterior PDF with a smaller number of particles and reduce the effects of the curse of dimensionality. The proposed nested sampling particle filter (NSPF) iteratively builds the posterior distribution by applying a constrained sampling from the prior distribution to obtain particles in high-likelihood regions of the search space, resulting in a reduction of the number of particles required for an efficient behaviour of particle filters. Numerical experiments with the 3-dimensional Lorenz63 and the 40-dimensional Lorenz96 models show that NSPF outperforms PF in accuracy with a relatively smaller number of particles. © 2013 Royal Meteorological Society.
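
    As a toy illustration of the nested-sampling building block only (not the full NSPF, which couples nested sampling to the particle-filter update), the sketch below estimates the evidence for a uniform prior on [-5, 5] with a standard-normal likelihood, where the true log-evidence is log(1/10) ≈ -2.30. The simple rejection step stands in for the constrained prior sampling described above:

```python
import math, random

random.seed(1)

# Toy 1-D problem: uniform prior on [-5, 5], standard-normal likelihood.
def loglike(x):
    return -0.5 * x * x - 0.5 * math.log(2.0 * math.pi)

def logaddexp(a, b):
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

N = 200                                    # live particles
live = [random.uniform(-5.0, 5.0) for _ in range(N)]
logZ = -1e300                              # running log-evidence
logw = math.log(1.0 - math.exp(-1.0 / N))  # prior mass of the first shell

for _ in range(1000):
    # the worst particle sets the new hard likelihood constraint L > L*
    worst = min(range(N), key=lambda k: loglike(live[k]))
    Lstar = loglike(live[worst])
    logZ = logaddexp(logZ, logw + Lstar)
    logw -= 1.0 / N                        # prior volume shrinks by e^(-1/N)
    # constrained sampling from the prior (plain rejection is enough here)
    while True:
        x = random.uniform(-5.0, 5.0)
        if loglike(x) > Lstar:
            live[worst] = x
            break

# add the contribution of the remaining live particles
logX = -1000.0 / N
for x in live:
    logZ = logaddexp(logZ, logX - math.log(N) + loglike(x))

print(f"log-evidence estimate: {logZ:.2f}")
```

    The estimate should land close to -2.30; in the NSPF the particles concentrated in high-likelihood regions by this loop are what reduce the particle count needed by the filter.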

  11. The Influence of Prior Knowledge on the Retrieval-Directed Function of Note Taking in Prior Knowledge Activation

    Science.gov (United States)

    Wetzels, Sandra A. J.; Kester, Liesbeth; van Merrienboer, Jeroen J. G.; Broers, Nick J.

    2011-01-01

    Background: Prior knowledge activation facilitates learning. Note taking during prior knowledge activation (i.e., note taking directed at retrieving information from memory) might facilitate the activation process by enabling learners to build an external representation of their prior knowledge. However, taking notes might be less effective in…

  12. Detection of Small Numbers of Campylobacter jejuni and Campylobacter coli Cells in Environmental Water, Sewage, and Food Samples by a Seminested PCR Assay

    Science.gov (United States)

    Waage, Astrid S.; Vardund, Traute; Lund, Vidar; Kapperud, Georg

    1999-01-01

    A rapid and sensitive assay was developed for detection of small numbers of Campylobacter jejuni and Campylobacter coli cells in environmental water, sewage, and food samples. Water and sewage samples were filtered, and the filters were enriched overnight in a nonselective medium. The enrichment cultures were prepared for PCR by a rapid and simple procedure consisting of centrifugation, proteinase K treatment, and boiling. A seminested PCR based on specific amplification of the intergenic sequence between the two Campylobacter flagellin genes, flaA and flaB, was performed, and the PCR products were visualized by agarose gel electrophoresis. The assay allowed us to detect 3 to 15 CFU of C. jejuni per 100 ml in water samples containing a background flora consisting of up to 8,700 heterotrophic organisms per ml and 10,000 CFU of coliform bacteria per 100 ml. Dilution of the enriched cultures 1:10 with sterile broth prior to the PCR was sometimes necessary to obtain positive results. The assay was also conducted with food samples analyzed with or without overnight enrichment. As few as ≤3 CFU per g of food could be detected with samples subjected to overnight enrichment, while variable results were obtained for samples analyzed without prior enrichment. This rapid and sensitive nested PCR assay provides a useful tool for specific detection of C. jejuni or C. coli in drinking water, as well as environmental water, sewage, and food samples containing high levels of background organisms. PMID:10103261

  13. Cloud point extraction for determination of lead in blood samples of children, using different ligands prior to analysis by flame atomic absorption spectrometry: A multivariate study

    Energy Technology Data Exchange (ETDEWEB)

    Shah, Faheem, E-mail: shah_ceac@yahoo.com [National Center of Excellence in Analytical Chemistry, University of Sindh, Jamshoro 76080 (Pakistan); Kazi, Tasneem Gul, E-mail: tgkazi@yahoo.com [National Center of Excellence in Analytical Chemistry, University of Sindh, Jamshoro 76080 (Pakistan); Afridi, Hassan Imran, E-mail: hassanimranafridi@yahoo.com [National Center of Excellence in Analytical Chemistry, University of Sindh, Jamshoro 76080 (Pakistan); Naeemullah, E-mail: khannaeemullah@ymail.com [National Center of Excellence in Analytical Chemistry, University of Sindh, Jamshoro 76080 (Pakistan); Arain, Muhammad Balal, E-mail: bilal_ku2004@yahoo.com [Department of Chemistry, University of Science and Technology, Bannu, KPK (Pakistan); Baig, Jameel Ahmed, E-mail: jab_mughal@yahoo.com [National Center of Excellence in Analytical Chemistry, University of Sindh, Jamshoro 76080 (Pakistan)

    2011-09-15

    Highlights: • Trace levels of lead in blood samples of healthy children and children with different kidney disorders. • Pre-concentration of Pb²⁺ in acid-digested blood samples after chelating with two complexing reagents. • A multivariate technique was used for screening of the significant factors that influence the CPE of Pb²⁺. • The level of Pb²⁺ in diseased children was significantly higher than in referents of the same age group. - Abstract: The phase-separation phenomenon of non-ionic surfactants occurring in aqueous solution was used for the extraction of lead (Pb²⁺) from digested blood samples after complexation with ammonium pyrrolidinedithiocarbamate (APDC) and diethyldithiocarbamate (DDTC) separately. The complexed analyte was quantitatively extracted with octylphenoxypolyethoxyethanol (Triton X-114). A multivariate strategy was applied to estimate the optimum values of the experimental factors. Acidic ethanol was added to the surfactant-rich phase prior to its analysis by flame atomic absorption spectrometry (FAAS). The detection limit of Pb²⁺ for the preconcentration of 10 mL of acid-digested blood sample was 1.14 μg L⁻¹. The accuracy of the proposed methods was assessed by analyzing certified reference material (whole blood). Under the optimized conditions of both CPE methods, 10 mL of Pb²⁺ standards (10 μg L⁻¹) complexed with APDC and DDTC permitted enhancement factors of 56 and 42, respectively. The proposed method was used for determination of Pb²⁺ in blood samples of children with kidney disorders and healthy controls.

  14. Sample preparation

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    Sample preparation prior to HPLC analysis is certainly one of the most important steps to consider in trace or ultratrace analysis. For many years scientists have tried to simplify the sample preparation process. It is rarely possible to inject a neat liquid sample, or a sample whose preparation is no more complex than dissolution in a given solvent. Dissolution alone can remove insoluble materials, which is especially helpful with samples in complex matrices if other interactions do not affect extraction: it is very likely that a large number of components will not dissolve and are therefore eliminated by a simple filtration step. In most cases, however, sample preparation is not as simple as dissolving the component of interest. At times enrichment is necessary; that is, the component of interest is present in a very large volume or mass of material and needs to be concentrated in some manner so that a small volume of the concentrated or enriched sample can be injected into the HPLC. 88 refs

  15. Impact of cleaning before obtaining midstream urine samples from children

    DEFF Research Database (Denmark)

    Lytzen, Rebekka; Knudsen, Jenny Dahl; Ladelund, Steen

    2014-01-01

    INTRODUCTION: Microbiological documentation of one uropathogenic bacterium in significant numbers in urine from patients with typical symptoms is the gold standard for diagnosing urinary tract infection (UTI). Cleaning before collecting midstream urine (MSU) is reported not to reduce the risk of contaminating the sample and was therefore omitted at Hvidovre Hospital as from the autumn of 2006. We evaluate whether no cleaning increased the risk of contamination in the Department of Paediatrics. MATERIAL AND METHODS: A total of 1,858 patients aged 0-15 years who were suspected of UTI delivered two MSUs within

  16. A Bayesian sampling strategy for hazardous waste site characterization

    International Nuclear Information System (INIS)

    Skalski, J.R.

    1987-12-01

    Prior knowledge based on historical records or physical evidence often suggests the existence of a hazardous waste site. Initial surveys may provide additional or even conflicting evidence of site contamination. This article presents a Bayes sampling strategy that allocates sampling at a site using this prior knowledge. This sampling strategy minimizes the environmental risks of missing chemical or radionuclide hot spots at a waste site. The environmental risk is shown to be proportional to the size of the undetected hot spot or inversely proportional to the probability of hot spot detection. 12 refs., 2 figs
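
    The stated proportionality (environmental risk inversely proportional to the probability of hot-spot detection) can be illustrated numerically. The square sampling grid, the hot-spot radii, and the criterion "detection = a grid node falls inside the hot spot" below are illustrative assumptions, not details from the article:

```python
import random

random.seed(0)

# Monte Carlo sketch: a circular hot spot of radius R is "detected" if a node
# of a square sampling grid with spacing G falls inside it; the hot-spot
# centre is uniformly located within a grid cell.
def detect_prob(radius, spacing, trials=100_000):
    hits = 0
    for _ in range(trials):
        cx = random.uniform(0.0, spacing)   # hot-spot centre within one cell
        cy = random.uniform(0.0, spacing)
        dx = min(cx, spacing - cx)          # coordinate-wise distance to the
        dy = min(cy, spacing - cy)          # nearest grid node (cell corner)
        if dx * dx + dy * dy <= radius * radius:
            hits += 1
    return hits / trials

probs = {r: detect_prob(r, spacing=20.0) for r in (5.0, 10.0, 15.0)}
for r, p in probs.items():
    print(f"R={r:>4}  P(detect)={p:.3f}  relative risk ~ 1/P = {1.0/p:.1f}")
```

    For R = 10 and G = 20 the exact answer is π/4 ≈ 0.785, so the Monte Carlo estimate doubles as a sanity check; halving the radius roughly quadruples the relative risk, which is the trade-off a Bayes allocation of sampling effort works against.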

  17. Hayabusa2 Sample Catcher and Container: Metal-Seal System for Vacuum Encapsulation of Returned Samples with Volatiles and Organic Compounds Recovered from C-Type Asteroid Ryugu

    Science.gov (United States)

    Okazaki, Ryuji; Sawada, Hirotaka; Yamanouchi, Shinji; Tachibana, Shogo; Miura, Yayoi N.; Sakamoto, Kanako; Takano, Yoshinori; Abe, Masanao; Itoh, Shoichi; Yamada, Keita; Yabuta, Hikaru; Okamoto, Chisato; Yano, Hajime; Noguchi, Takaaki; Nakamura, Tomoki; Nagao, Keisuke

    2017-07-01

    The spacecraft Hayabusa2 was launched on December 3, 2014, to collect and return samples from a C-type asteroid, 162173 Ryugu (provisional designation, 1999 JU3). It is expected that the samples collected contain organic matter and water-bearing minerals and have key information to elucidate the origin and history of the Solar System and the evolution of bio-related organics prior to delivery to the early Earth. In order to obtain samples with volatile species without terrestrial contamination, based on lessons learned from the Hayabusa mission, the sample catcher and container of Hayabusa2 were refined from those used in Hayabusa. The improvements include (1) a mirror finish of the inner wall surface of the sample catcher and the container, (2) adoption of an aluminum metal sealing system, and (3) addition of a gas-sampling interface for gas collection and evacuation. The former two improvements were made to limit contamination of the samples by terrestrial atmosphere below 1 Pa after the container is sealed. The gas-sampling interface will be used to promptly collect volatile species released from the samples in the sample container after sealing of the container. These improvements maintain the value of the returned samples.

  18. Guideline for Sampling and Analysis of Tar and Particles in Biomass Producer Gases. Version 3.3

    Energy Technology Data Exchange (ETDEWEB)

    Neeft, J.P.A.; Knoef, H.A.M.; Zielke, U.; Sjoestroem, K.; Hasler, P.; Simell, P.A.; Dorrington, M.A.; Thomas, L.; Abatzoglou, N.; Deutch, S.; Greil, C.; Buffinga, G.J.; Brage, C.; Suomalainen, M.

    2002-07-01

    This Guideline provides a set of procedures for the measurement of organic contaminants and particles in producer gases from biomass gasifiers. The procedures are designed to cover different gasifier types (updraft or downdraft fixed bed or fluidised bed gasifiers), operating conditions (0-900 °C and 0.6-60 bar) and concentration ranges (1 mg/mn³ to 300 g/mn³). The Guideline describes a modular sampling train, and a set of procedures, which include: planning and preparation of the sampling, sampling and post-sampling, analysis, calculations, error analysis and reporting. The modular sampling train consists of 4 modules. Module 1 is a preconditioning module for isokinetic sampling and gas cooling. Module 2 is a particle collection module including a heated filter. Module 3 is a tar collection module with a gas quench (optionally by circulating a liquid), impinger bottles and a backup adsorber. Module 4 is a volume-sampling module consisting of a pump, a rotameter, a gas flow meter and pressure and temperature indicators. The equipment and materials that are required for procuring this modular sampling train are given in the Guideline. The sampling procedures consist of a description for isokinetic sampling, a leakage test prior to sampling, the actual sampling and its duration, how the equipment is cleaned after the sampling, and how the samples are prepared and stored. Analysis of the samples is performed via three procedures. Prior to these procedures, the sample is prepared by Soxhlet extraction of the tars on the particle filter and by collection of all tars in one bulk solution. The first procedure describes the weighing of the particle filter to obtain the concentration of particles in the biomass producer gas. The bulk tar solution is used for two purposes: for determination of gravimetric tar and for analysis of individual compounds. The second procedure describes how to determine the gravimetric tar mass from the bulk solution.
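
    As a hedged illustration of the calculation step, a concentration in grams per normal cubic metre can be obtained by normalising the metered gas volume to 0 °C and 101325 Pa. All numbers below are made-up examples, not values from the Guideline:

```python
# Sketch: gravimetric tar concentration from a sampling run, with the metered
# gas volume corrected to normal conditions (0 °C, 101325 Pa).
tar_mass_g = 0.85        # gravimetric residue from the bulk solution (example)
V_metered_m3 = 0.120     # gas volume read at the meter (example)
T_meter_K = 295.0        # gas temperature at the meter (example)
p_meter_Pa = 100200.0    # gas pressure at the meter (example)

# ideal-gas correction of the metered volume to normal conditions
V_norm = V_metered_m3 * (273.15 / T_meter_K) * (p_meter_Pa / 101325.0)
conc = tar_mass_g / V_norm   # g per normal cubic metre
print(f"tar concentration: {conc:.1f} g/mn3")
```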

  19. Does obtaining an initial magnetic resonance imaging decrease the reamputation rates in the diabetic foot?

    Directory of Open Access Journals (Sweden)

    Marlena Jbara

    2016-06-01

    Objective: Diabetes mellitus (DM), through overglycosylation of neurovascular structures and the resultant peripheral neuropathy, continues to be the major risk factor for pedal amputation. Repetitive trauma to the insensate foot results in diabetic foot ulcers, which are at high risk to develop osteomyelitis. Many patients who present with diabetic foot complications will undergo one or more pedal amputations during the course of their disease. The purpose of this study was to determine if obtaining an initial magnetic resonance imaging (MRI), prior to the first amputation, is associated with a decreased rate of reamputation in the diabetic foot. Our hypothesis was that the rate of reamputation may be associated with underutilization of an initial MRI, which is useful in presurgical planning. This study was designed to determine whether there was an association between the reamputation rate in diabetic patients and utilization of MRI in presurgical planning prior to initial forefoot amputations. Methods: Following approval by our institutional review board, our study design consisted of a retrospective cohort analysis of 413 patients at Staten Island University Hospital, a 700-bed tertiary referral center, between 2008 and 2013 who underwent an initial great toe (hallux) amputation. Of the 413 patients with a hallux amputation, there were 368 eligible patients who had a history of DM with documented hemoglobin A1c (HbA1c) within 3 months of the initial first ray (hallux and first metatarsal) amputation and available radiographic data. Statistical analysis compared the incidence rates of reamputation between patients who underwent initial MRI and those who did not obtain an initial MRI prior to their first amputation. The reamputation rate was compared after adjustment for age, gender, ethnicity, HbA1c, cardiovascular disease, hypoalbuminemia, smoking, body mass index, and prior antibiotic treatment. Results: The results of our statistical

  20. Varying prior information in Bayesian inversion

    International Nuclear Information System (INIS)

    Walker, Matthew; Curtis, Andrew

    2014-01-01

    Bayes' rule is used to combine likelihood and prior probability distributions. The former represents knowledge derived from new data, the latter represents pre-existing knowledge; the Bayesian combination is the so-called posterior distribution, representing the resultant new state of knowledge. While varying the likelihood due to differing data observations is common, there are also situations where the prior distribution must be changed or replaced repeatedly. For example, in mixture density neural network (MDN) inversion, using current methods the neural network employed for inversion needs to be retrained every time prior information changes. We develop a method of prior replacement to vary the prior without re-training the network. Thus the efficiency of MDN inversions can be increased, typically by orders of magnitude when applied to geophysical problems. We demonstrate this for the inversion of seismic attributes in a synthetic subsurface geological reservoir model. We also present results which suggest that prior replacement can be used to control the statistical properties (such as variance) of the final estimate of the posterior in more general (e.g., Monte Carlo based) inverse problem solutions. (paper)
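
    For a conjugate toy problem, prior replacement can be mimicked by importance-reweighting posterior samples with the ratio of new to old prior densities, so the likelihood never has to be recomputed. This generic scheme is a stand-in for the paper's MDN-specific method, and all the distributions below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)

def log_gauss(x, mu, sd):
    return -0.5 * ((x - mu) / sd) ** 2 - np.log(sd) - 0.5 * np.log(2.0 * np.pi)

# Toy setup: likelihood N(theta; 2, 1), old prior N(0, 1).
# The exact old posterior is then N(1, 0.5); draw samples from it directly.
samples = rng.normal(1.0, np.sqrt(0.5), size=200_000)

# Replace the prior with N(1, 0.5^2) by reweighting with pi_new / pi_old;
# the exact new posterior is N(1.2, 0.2).
log_w = log_gauss(samples, 1.0, 0.5) - log_gauss(samples, 0.0, 1.0)
w = np.exp(log_w - log_w.max())
w /= w.sum()

new_mean = np.sum(w * samples)
new_var = np.sum(w * (samples - new_mean) ** 2)
print(f"reweighted posterior: mean ~ {new_mean:.2f}, var ~ {new_var:.2f}")
```

    The reweighted mean and variance should land near the analytic values 1.2 and 0.2, illustrating why changing the prior need not trigger a full re-inference.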

  1. Use of the oral sugar test in ponies when performed with or without prior fasting.

    Science.gov (United States)

    Knowles, E J; Harris, P A; Elliott, J; Menzies-Gow, N J

    2017-07-01

    It is recommended that the oral sugar test (OST) for insulin dysregulation (ID) be performed after an overnight fast, but fasting is impractical in ponies kept solely at pasture. There are few data on OST repeatability and reliability in ponies. To report 1) whether OST results obtained in the morning after an overnight fast or without fasting in the afternoon (FASTING/FED) can be used interchangeably, 2) the time of highest insulin concentration Tmax [insulin], and the repeatability and reliability of the insulin response to the OST when FASTING or FED and 3) dichotomous agreement (ID/normal) within a small sample when FASTING or FED. Method comparison study. Oral sugar tests were performed on four occasions in 10 adult native British ponies, twice FASTING and twice FED. Insulin concentrations were measured by radioimmunoassay at 0-120 min (T0, T30, T60, T75, T90, T120). Differences between FASTING and FED results were assessed using mixed effects models. Indices of repeatability and reliability were calculated; dichotomous agreement was reported using kappa statistics. Serum [insulin] was significantly (P≤0.05) higher at T60-T90 with prior fasting (estimated differences [95% confidence intervals]): T60: 23.5 μiu/ml (8.7-38.4 μiu/ml), T75: 27.1 μiu/ml (12.3-41.8 μiu/ml), T90: 15.1 μiu/ml (0.36-29.9 μiu/ml). Most frequently, Tmax [insulin] occurred at T30. At any single time point, within-subject coefficients of variation were FASTING: 40% and FED: 31%. The 95% limits for repeatability were FASTING: 29-340%, FED: 41-240%. Test reliabilities were FASTING: 0.70 and FED: 0.67. For dichotomous interpretation, similar results (kappa = 0.7) were obtained using cut-offs of [insulin] >60 μiu/ml at T60 or T90 for FASTING and [insulin] >51 μiu/ml at T30 or T60 for FED samples. Oral sugar tests were performed on a small number of animals on one pasture during one season (spring). Clinicians should beware of interpreting changes in absolute OST results owing to poor repeatability. When

  2. The Prior Internet Resources 2017

    DEFF Research Database (Denmark)

    Engerer, Volkmar Paul; Albretsen, Jørgen

    2017-01-01

    The Prior Internet Resources (PIR) are presented. Prior's unpublished scientific manuscripts and his vast correspondence with fellow researchers of the time, his Nachlass, are now subject to transcription by Prior researchers worldwide, and form an integral part of PIR. It is demonstrated…

  3. Penalised Complexity Priors for Stationary Autoregressive Processes

    KAUST Repository

    Sørbye, Sigrunn Holbek; Rue, Haavard

    2017-01-01

    The autoregressive (AR) process of order p (AR(p)) is a central model in time series analysis. A Bayesian approach requires the user to define a prior distribution for the coefficients of the AR(p) model. Although it is easy to write down some prior, it is not at all obvious how to understand and interpret the prior distribution, to ensure that it behaves according to the users' prior knowledge. In this article, we approach this problem using the recently developed ideas of penalised complexity (PC) priors. These priors have important properties like robustness and invariance to reparameterisations, as well as a clear interpretation. A PC prior is computed based on specific principles, where model component complexity is penalised in terms of deviation from simple base model formulations. In the AR(1) case, we discuss two natural base model choices, corresponding to either independence in time or no change in time. The latter case is illustrated in a survival model with possible time-dependent frailty. For higher-order processes, we propose a sequential approach, where the base model for AR(p) is the corresponding AR(p-1) model expressed using the partial autocorrelations. The properties of the new prior distribution are compared with the reference prior in a simulation study.

  4. Penalised Complexity Priors for Stationary Autoregressive Processes

    KAUST Repository

    Sørbye, Sigrunn Holbek

    2017-05-25

    The autoregressive (AR) process of order p (AR(p)) is a central model in time series analysis. A Bayesian approach requires the user to define a prior distribution for the coefficients of the AR(p) model. Although it is easy to write down some prior, it is not at all obvious how to understand and interpret the prior distribution, to ensure that it behaves according to the users' prior knowledge. In this article, we approach this problem using the recently developed ideas of penalised complexity (PC) priors. These priors have important properties like robustness and invariance to reparameterisations, as well as a clear interpretation. A PC prior is computed based on specific principles, where model component complexity is penalised in terms of deviation from simple base model formulations. In the AR(1) case, we discuss two natural base model choices, corresponding to either independence in time or no change in time. The latter case is illustrated in a survival model with possible time-dependent frailty. For higher-order processes, we propose a sequential approach, where the base model for AR(p) is the corresponding AR(p-1) model expressed using the partial autocorrelations. The properties of the new prior distribution are compared with the reference prior in a simulation study.

  5. Effect of simulated sampling disturbance on creep behaviour of rock salt

    Science.gov (United States)

    Guessous, Z.; Gill, D. E.; Ladanyi, B.

    1987-10-01

    This article presents the results of an experimental study of the creep behaviour of a rock salt under uniaxial compression as a function of prestrain, simulating sampling disturbance. The prestrain was produced by radial compressive loading of the specimens prior to creep testing. The tests were conducted on an artificial salt to avoid excessive scattering of the results. The results obtained from several series of single-stage creep tests show that, in the short term, the creep response of salt is strongly affected by the preloading history of the samples. The nature of this effect depends upon the intensity of radial compressive preloading, and its magnitude is a function of the creep stress level. The effect, however, decreases with increasing plastic deformation, indicating that large creep strains may eventually lead to a complete loss of preloading memory.

  6. Estimating Ambiguity Preferences and Perceptions in Multiple Prior Models: Evidence from the Field

    NARCIS (Netherlands)

    S.G. Dimmock (Stephen); R.R.P. Kouwenberg (Roy); O.S. Mitchell (Olivia); K. Peijnenburg (Kim)

    2015-01-01

    We develop a tractable method to estimate multiple prior models of decision-making under ambiguity. In a representative sample of the U.S. population, we measure ambiguity attitudes in the gain and loss domains. We find that ambiguity aversion is common for uncertain events of

  7. Generalized Bayesian inference with sets of conjugate priors for dealing with prior-data conflict : course at Lund University

    NARCIS (Netherlands)

    Walter, G.

    2015-01-01

    In the Bayesian approach to statistical inference, possibly subjective knowledge on model parameters can be expressed by so-called prior distributions. A prior distribution is updated, via Bayes’ Rule, to the so-called posterior distribution, which combines prior information and information from

  8. The Importance of Prior Knowledge.

    Science.gov (United States)

    Cleary, Linda Miller

    1989-01-01

    Recounts a college English teacher's experience of reading and rereading Noam Chomsky, building up a greater store of prior knowledge. Argues that Frank Smith provides a theory for the importance of prior knowledge and Chomsky's work provided a personal example with which to interpret and integrate that theory. (RS)

  9. Determination of the content of fatty acid methyl esters (FAME) in biodiesel samples obtained by esterification using 1H-NMR spectroscopy.

    Science.gov (United States)

    Mello, Vinicius M; Oliveira, Flavia C C; Fraga, William G; do Nascimento, Claudia J; Suarez, Paulo A Z

    2008-11-01

    Three different calibration curves based on (1)H-NMR spectroscopy (300 MHz) were used for quantifying the reaction yield during biodiesel synthesis by esterification of fatty acid mixtures and methanol. For this purpose, the integrated intensities of the hydrogens of the ester methoxy group (3.67 ppm) were correlated with the areas related to the various protons of the alkyl chain (olefinic hydrogens: 5.30-5.46 ppm; aliphatic: 2.67-2.78 ppm, 2.30 ppm, 1.96-2.12 ppm, 1.56-1.68 ppm, 1.22-1.42 ppm, 0.98 ppm, and 0.84-0.92 ppm). The first curve was obtained using the peaks relating to the olefinic hydrogens, a second with the paraffinic protons, and the third curve using the integrated intensities of all the hydrogens. A total of 35 samples were examined: 25 samples to build the three different calibration curves and ten samples to serve as external validation samples. The results showed no statistical differences among the three methods, and all presented prediction errors less than 2.45% with a coefficient of variation (CV) of 4.66%. 2008 John Wiley & Sons, Ltd.
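
    A calibration curve of this kind reduces to ordinary least squares on integrated intensities. The sketch below uses entirely hypothetical yields and integral ratios (the variable names, slope, and noise level are illustrative assumptions, not the paper's data): the calibration line is fitted, then inverted to predict yield from a measured ratio.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical calibration data: known esterification yields (%) and the
# matching ratio of the methoxy-ester integral (3.67 ppm) to a reference integral.
yield_pct = np.array([10.0, 25.0, 40.0, 55.0, 70.0, 85.0, 97.0])
ratio = 0.0031 * yield_pct + 0.002 + rng.normal(0.0, 0.002, yield_pct.size)

# Ordinary least-squares calibration line: ratio = a*yield + b
a, b = np.polyfit(yield_pct, ratio, 1)

def predict_yield(r):
    """Invert the calibration line to estimate yield from a measured ratio."""
    return (r - b) / a

# Root-mean-square prediction error on the calibration points themselves
rmse = np.sqrt(np.mean((predict_yield(ratio) - yield_pct) ** 2))
```

In practice one curve would be built per integral choice (olefinic, paraffinic, all protons) and compared by prediction error, as the abstract describes.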

  10. Sample preparation optimization in fecal metabolic profiling.

    Science.gov (United States)

    Deda, Olga; Chatziioannou, Anastasia Chrysovalantou; Fasoula, Stella; Palachanis, Dimitris; Raikos, Νicolaos; Theodoridis, Georgios A; Gika, Helen G

    2017-03-15

    Metabolomic analysis of feces can provide useful insight into the metabolic status, the health/disease state of the human/animal and the symbiosis with the gut microbiome. As a result, there has recently been increased interest in the application of holistic analysis of feces for biomarker discovery. For metabolomics applications, the sample preparation process used prior to the analysis of fecal samples is of high importance, as it greatly affects the obtained metabolic profile, especially since feces, as a matrix, vary widely in their physicochemical characteristics and molecular content. However, there is still little information in the literature and a lack of a universal approach to sample treatment for fecal metabolic profiling. The scope of the present work was to study the conditions for sample preparation of rat feces with the ultimate goal of acquiring comprehensive metabolic profiles, either untargeted, by NMR spectroscopy and GC-MS, or targeted, by HILIC-MS/MS. A fecal sample pooled from male and female Wistar rats was extracted under various conditions by modifying the pH value, the nature of the organic solvent and the sample weight to solvent volume ratio. It was found that the 1/2 sample weight to solvent volume (w_f/v_s) ratio provided the highest number of metabolites under neutral and basic conditions in both untargeted profiling techniques. Concerning LC-MS profiles, neutral acetonitrile and propanol provided higher signals and wide metabolite coverage, though extraction efficiency is metabolite dependent. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. Separation/preconcentration of silver(I) and lead(II) in environmental samples on cellulose nitrate membrane filter prior to their flame atomic absorption spectrometric determinations

    International Nuclear Information System (INIS)

    Soylak, Mustafa; Cay, Rukiye Sungur

    2007-01-01

    An enrichment method for trace amounts of Ag(I) and Pb(II) has been established prior to their flame atomic absorption spectrometric determinations. The preconcentration/separation procedure is based on chelate formation of Ag(I) and Pb(II) with ammonium pyrrolidine dithiocarbamate (APDC) and on retention of the chelates on a cellulose nitrate membrane filter. The influences of some analytical parameters, including pH and amounts of reagent, on the recoveries of the analytes were investigated. The effects of interfering ions on the quantitative recoveries of the analytes were also examined. The detection limits (k = 3, N = 11) were 4.6 μg L⁻¹ for silver(I) and 15.3 μg L⁻¹ for lead(II). The relative standard deviations (R.S.D.) of the determinations for analyte ions were below 3%. The method was applied to environmental samples for the determination of analyte ions with satisfactory results (recoveries >95%).

  12. Savings for visuomotor adaptation require prior history of error, not prior repetition of successful actions.

    Science.gov (United States)

    Leow, Li-Ann; de Rugy, Aymar; Marinovic, Welber; Riek, Stephan; Carroll, Timothy J

    2016-10-01

    When we move, perturbations to our body or the environment can elicit discrepancies between predicted and actual outcomes. We readily adapt movements to compensate for such discrepancies, and the retention of this learning is evident as savings, or faster readaptation to a previously encountered perturbation. The mechanistic processes contributing to savings, or even the necessary conditions for savings, are not fully understood. One theory suggests that savings requires increased sensitivity to previously experienced errors: when perturbations evoke a sequence of correlated errors, we increase our sensitivity to the errors experienced, which subsequently improves error correction (Herzfeld et al. 2014). An alternative theory suggests that a memory of actions is necessary for savings: when an action becomes associated with successful target acquisition through repetition, that action is more rapidly retrieved at subsequent learning (Huang et al. 2011). In the present study, to better understand the necessary conditions for savings, we tested how savings is affected by prior experience of similar errors and prior repetition of the action required to eliminate errors using a factorial design. Prior experience of errors induced by a visuomotor rotation in the savings block was either prevented at initial learning by gradually removing an oppositely signed perturbation or enforced by abruptly removing the perturbation. Prior repetition of the action required to eliminate errors in the savings block was either deprived or enforced by manipulating target location in preceding trials. The data suggest that prior experience of errors is both necessary and sufficient for savings, whereas prior repetition of a successful action is neither necessary nor sufficient for savings. Copyright © 2016 the American Physiological Society.

  13. Use of non-conjugate prior distributions in compound failure models. Final technical report

    International Nuclear Information System (INIS)

    Shultis, J.K.; Johnson, D.E.; Milliken, G.A.; Eckhoff, N.D.

    1981-12-01

    Several theoretical and computational techniques are presented for compound failure models in which the failure rate or failure probability for a class of components is considered to be a random variable. Both the failure-on-demand and failure-rate situations are considered. Ten different prior families are presented for describing the variation or uncertainty of the failure parameter. Methods considered for estimating values for the prior parameters from a given set of failure data are (1) matching data moments to those of the prior distribution, (2) matching data moments to those of the compound marginal distribution, and (3) the marginal maximum likelihood method. Numerical methods for computing the parameter estimators for all ten prior families are presented, as well as methods for obtaining estimates of the variances and covariances of the parameter estimators; it is shown that various confidence, probability, and tolerance intervals can be evaluated. Finally, to test the resulting failure models against the given failure data, generalized chi-square and Kolmogorov-Smirnov goodness-of-fit tests are proposed, together with a test to eliminate outliers from the failure data. Computer codes based on the results presented here have been prepared and are presented in a companion report.
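
    Method (1), matching data moments to those of the prior, is straightforward to sketch for the failure-on-demand case with a beta prior (one of several possible prior families; the data values below are hypothetical, not from the report):

```python
import numpy as np

def beta_moment_match(p_samples):
    """Fit Beta(alpha, beta) to observed failure-on-demand probabilities by
    matching the first two sample moments (requires var < mean*(1 - mean))."""
    m = np.mean(p_samples)
    v = np.var(p_samples, ddof=1)
    common = m * (1.0 - m) / v - 1.0
    return m * common, (1.0 - m) * common

# Hypothetical demand-failure probabilities estimated for a class of components
p = np.array([0.010, 0.015, 0.008, 0.020, 0.012, 0.017])
alpha, beta = beta_moment_match(p)
```

By construction the fitted prior has the same mean and variance as the data, which is the defining property of the moment-matching estimator.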

  14. Disparities in Diagnoses Received Prior to a Diagnosis of Autism Spectrum Disorder

    Science.gov (United States)

    Mandell, David S.; Ittenbach, Richard F.; Levy, Susan E.; Pinto-Martin, Jennifer A.

    2007-01-01

    This study estimated differences by ethnicity in the diagnoses assigned prior to the diagnosis of autism. In this sample of 406 Medicaid-eligible children, African-Americans were 2.6 times less likely than white children to receive an autism diagnosis on their first specialty care visit. Among children who did not receive an autism diagnosis on…

  15. Sex differences in foreign language text comprehension : The role of interests and prior knowledge

    NARCIS (Netherlands)

    Bügel, K; Buunk, Abraham (Bram)

    1996-01-01

    The scores obtained by female students on the national foreign language examinations in the Netherlands have been slightly but consistently lower than those of male students. The present research among 2980 high school students tested the hypothesis that, owing to sex differences in prior knowledge

  16. Perspectives among a Diverse Sample of Women on the Possibility of Obtaining Oral Contraceptives Over the Counter: A Qualitative Study.

    Science.gov (United States)

    Baum, Sarah; Burns, Bridgit; Davis, Laura; Yeung, Miriam; Scott, Cherisse; Grindlay, Kate; Grossman, Daniel

    2016-01-01

    There is increasing support among stakeholders in the United States to make oral contraceptives (OCs) available over the counter (OTC). Previous research on the topic has focused on representative samples of U.S. women, Latina women, low-income women, and abortion clients. However, little is known about the perspectives of African American women, Asian American women, and young women. We conducted 14 focus group discussions with 138 women. Twenty-three percent of participants were aged 18 or younger, 61% were African American, and 26% were Asian American/Pacific Islander. Community organizations recruited participants through convenience sampling and hosted the discussions. Focus groups were transcribed and coded thematically. Women reported potential benefits of OTC access, including convenience and privacy. Many believed OTC availability of OCs would help to reduce unintended pregnancy and help to destigmatize birth control. Participants also expressed concerns about OTC access, such as worry that first-time users and young adolescents would not have enough information to use the pill safely and effectively, as well as concerns about whether women would still obtain preventive screenings. Women were also worried that the cost of OTC OCs would be higher if insurance no longer covered them. Overall, women were interested in the option of obtaining the pill OTC. Future research and advocacy efforts should explore women's concerns, including whether adolescents can effectively use OTC pills and ensuring insurance coverage for OTC contraception. Copyright © 2016 Jacobs Institute of Women's Health. Published by Elsevier Inc. All rights reserved.

  17. Tank 12H residuals sample analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Oji, L. N. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Shine, E. P. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Diprete, D. P. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Coleman, C. J. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Hay, M. S. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-06-11

    The Savannah River National Laboratory (SRNL) was requested by Savannah River Remediation (SRR) to provide sample preparation and analysis of the Tank 12H final characterization samples to determine the residual tank inventory prior to grouting. Eleven Tank 12H floor and mound residual material samples and three cooling coil scrape samples were collected and delivered to SRNL between May and August of 2014.

  18. Contamination risk of stable isotope samples during milling.

    Science.gov (United States)

    Isaac-Renton, M; Schneider, L; Treydte, K

    2016-07-15

    Isotope analysis of wood is an important tool in dendrochronology and ecophysiology. Prior to mass spectrometry analysis, wood must be homogenized, and a convenient method involves a ball mill capable of milling samples directly in sample tubes. However, sample-tube plastic can contaminate wood during milling, which could lead to biological misinterpretations. We tested possible contamination of whole wood and cellulose samples during ball-mill homogenization for carbon and oxygen isotope measurements. We used a multi-factorial design with two/three steel milling balls, two sample amounts (10 mg, 40 mg), and two milling times (5 min, 10 min). We further analyzed abrasion by milling empty tubes, and measured the isotope ratios of pure contaminants. A strong risk exists for carbon isotope bias through plastic contamination: the δ(13) C value of polypropylene deviated from the control by -6.77‰. Small fibers from PTFE filter bags used during cellulose extraction also present a risk as the δ(13) C value of this plastic deviated by -5.02‰. Low sample amounts (10 mg) showed highest contamination due to increased abrasion during milling (-1.34‰), which is further concentrated by cellulose extraction (-3.38‰). Oxygen isotope measurements were unaffected. A ball mill can be used to homogenize samples within test tubes prior to oxygen isotope analysis, but not prior to carbon or radiocarbon isotope analysis. There is still a need for a fast, simple and contamination-free sample preparation procedure. Copyright © 2016 John Wiley & Sons, Ltd.
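
    The size of such a contamination bias follows from a two-component isotope mass balance, which is linear in δ notation to a good approximation for carbon. In the sketch below the wood δ value and the 20% contaminant fraction are illustrative assumptions; combined with the reported polypropylene offset of -6.77‰ relative to the control, that fraction yields a shift of the same order as the -1.34‰ observed for small samples.

```python
def delta_mix(delta_sample, delta_contaminant, f):
    """Two-component isotope mass balance, linear in delta notation
    (a good approximation for carbon isotopes)."""
    return (1.0 - f) * delta_sample + f * delta_contaminant

# Hypothetical wood at -25 permil; polypropylene 6.77 permil lighter,
# as reported for the pure contaminant relative to the control.
shift = delta_mix(-25.0, -25.0 - 6.77, 0.2) - (-25.0)
# -> a ~20% plastic fraction shifts the measured value by about -1.35 permil
```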

  19. Lessons Learned from Preparing OSIRIS-REx Spectral Analog Samples for Bennu

    Science.gov (United States)

    Schrader, D. L.; McCoy, T. J.; Cody, G. D.; King, A. J.; Schofield, P. F.; Russell, S. S.; Connolly, H. C., Jr.; Keller, L. P.; Donaldson Hanna, K.; Bowles, N.

    2017-01-01

    NASA's OSIRIS-REx sample return mission launched on September 8th, 2016 to rendezvous with B-type asteroid (101955) Bennu in 2018. Type C and B asteroids have been linked to carbonaceous chondrites because of their similar visible - to - near infrared (VIS-NIR) spectral properties [e.g., 1,2]. The OSIRIS-REx Visible and Infrared Spectrometer (OVIRS) and the Thermal Emission Spectrometer (OTES) will make spectroscopic observations of Bennu during the encounter. Constraining the presence or absence of hydrous minerals (e.g., Ca-carbonate, phyllosilicates) and organic molecules will be key to characterizing Bennu [3] prior to sample site selection. The goal of this study was to develop a suite of analog and meteorite samples and obtain their spectral properties over the wavelength ranges of OVIRS (0.4- 4.3 micrometer) and OTES (5.0-50 micrometer). These spectral data were used to validate the mission science-data processing system. We discuss the reasoning behind the study and share lessons learned.

  20. Divergent Priors and well Behaved Bayes Factors

    NARCIS (Netherlands)

    R.W. Strachan (Rodney); H.K. van Dijk (Herman)

    2011-01-01

    Divergent priors are improper when defined on unbounded supports. Bartlett's paradox has been taken to imply that using improper priors results in ill-defined Bayes factors, preventing model comparison by posterior probabilities. However many improper priors have attractive properties

  1. Comparison of radon and radon-daughter grab samples obtained during the winter and summer

    International Nuclear Information System (INIS)

    Karp, K.E.

    1987-08-01

    The Technical Measurements Center (TMC), under the auspices of the US Department of Energy (DOE) Uranium Mill Tailings Remedial Action (UMTRA) program, is investigating short-term methods for estimating annual average indoor radon-daughter concentrations (RDC). A field study at 40 sample locations in 26 residential structures in Grand Junction, Colorado, was conducted once in the winter and once in the summer. The short-term methods investigated as part of this study include ten-minute radon and radon-daughter grab sampling and hourly RDC measurements. The results of the field study indicate that ten-minute radon grab samples from basement locations are reproducible over different seasons during controlled sampling conditions. Nonbasement radon and RDC grab samples are highly variable even when the use of the location by the occupant is controlled and the ventilation rate is restricted. The grab sampling was performed under controlled occupied conditions. These results confirm that a short-term radon or RDC measurement in a nonbasement location in a house is not a standardized measurement that can be used to infer an annual average concentration. The hourly RDC measurements were performed under three sets of conditions over a 72-hour period. The three sets of conditions were uncontrolled occupied, controlled occupied, and controlled unoccupied. These results indicate that it is not necessary to relocate the occupants during the time of grab sampling. 8 refs., 8 figs., 10 tabs

  2. Washing of waste prior to landfilling.

    Science.gov (United States)

    Cossu, Raffaello; Lai, Tiziana

    2012-05-01

    The main impact produced by landfills is represented by the release of leachate emissions. Waste washing treatment has been investigated to evaluate its efficiency in reducing the waste leaching fraction prior to landfilling. The results of laboratory-scale washing tests applied to several significant residues from integrated management of solid waste are presented in this study, specifically: non-recyclable plastics from source separation, mechanical-biological treated municipal solid waste and a special waste, automotive shredder residues. Results obtained demonstrate that washing treatment contributes towards combating the environmental impacts of raw wastes. Accordingly, a leachate production model was applied, leading to the consideration that the concentrations of chemical oxygen demand (COD) and total Kjeldahl nitrogen (TKN), parameters of fundamental importance in the characterization of landfill leachate, from a landfill containing washed wastes are comparable to those that would only be reached between 90 and 220 years later in the presence of raw wastes. The findings obtained demonstrated that washing of waste may represent an effective means of reducing the leachable fraction, resulting in a consequent decrease in landfill emissions. Further studies on pilot scale are needed to assess the potential for full-scale application of this treatment. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. New methods for sampling sparse populations

    Science.gov (United States)

    Anna Ringvall

    2007-01-01

    To improve surveys of sparse objects, methods that use auxiliary information have been suggested. Guided transect sampling uses prior information, e.g., from aerial photographs, for the layout of survey strips. Instead of being laid out straight, the strips will wind between potentially more interesting areas. 3P sampling (probability proportional to prediction) uses...

  4. PERIOD ESTIMATION FOR SPARSELY SAMPLED QUASI-PERIODIC LIGHT CURVES APPLIED TO MIRAS

    Energy Technology Data Exchange (ETDEWEB)

    He, Shiyuan; Huang, Jianhua Z.; Long, James [Department of Statistics, Texas A and M University, College Station, TX (United States); Yuan, Wenlong; Macri, Lucas M., E-mail: lmacri@tamu.edu [George P. and Cynthia W. Mitchell Institute for Fundamental Physics and Astronomy, Department of Physics and Astronomy, Texas A and M University, College Station, TX (United States)

    2016-12-01

    We develop a nonlinear semi-parametric Gaussian process model to estimate periods of Miras with sparsely sampled light curves. The model uses a sinusoidal basis for the periodic variation and a Gaussian process for the stochastic changes. We use maximum likelihood to estimate the period and the parameters of the Gaussian process, while integrating out the effects of other nuisance parameters in the model with respect to a suitable prior distribution obtained from earlier studies. Since the likelihood is highly multimodal in period, we implement a hybrid method that applies the quasi-Newton algorithm for the Gaussian process parameters and searches the period/frequency parameter space over a dense grid. A large-scale, high-fidelity simulation is conducted to mimic the sampling quality of Mira light curves obtained by the M33 Synoptic Stellar Survey. The simulated data set is publicly available and can serve as a testbed for future evaluation of different period estimation methods. The semi-parametric model outperforms an existing algorithm on this simulated test data set as measured by period recovery rate and quality of the resulting period–luminosity relations.
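
    The grid-search stage of such an approach can be illustrated without the Gaussian process component: for each trial period, fit a sinusoidal basis by linear least squares and keep the period with the smallest residual sum of squares. This is a simplified sketch with simulated data, not the authors' model; the period, amplitude, and noise level are arbitrary illustrative values.

```python
import numpy as np

def fit_period_grid(t, y, periods):
    """For each trial period, fit y ~ c0 + c1*sin(wt) + c2*cos(wt) by linear
    least squares; return the period minimizing the residual sum of squares."""
    best_p, best_rss = None, np.inf
    for p in periods:
        w = 2.0 * np.pi / p
        X = np.column_stack([np.ones_like(t), np.sin(w * t), np.cos(w * t)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ coef) ** 2)
        if rss < best_rss:
            best_p, best_rss = p, rss
    return best_p

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 1000.0, 60))          # sparse, irregular sampling
y = 2.0 * np.sin(2.0 * np.pi * t / 331.0) + rng.normal(0.0, 0.2, t.size)
p_hat = fit_period_grid(t, y, np.linspace(100.0, 600.0, 2001))
```

In the full model of the abstract, the residual term would instead be handled by the Gaussian process likelihood rather than plain least squares.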

  5. Study of the behavior to corrosion of samples of nuclear grade aluminium sheathed in nickel-phosphorous alloys obtained autocatalytically

    International Nuclear Information System (INIS)

    Castro, Maria Eugenia; Barbero, Jose Alfredo; Bubach, Ernesto

    2006-01-01

    One of the ways to protect an industrially important metallic material against corrosion is to cover the piece with an approximately 1 μm layer of a material whose resistance to corrosion is greater than that of the element being protected. The anticorrosive protection is obtained by forming a pore-free, defect-free physical barrier that impedes the arrival of the agents responsible for the electrochemical attack. Sacrificial anodes such as aluminum or zinc instead preserve the material through their own dissolution, a consequence of their less electrochemically noble behavior. This work studies the resistance to corrosion of metallic coatings on nuclear grade aluminum substrates. The focus is on nickel-phosphorous (Ni-P) alloy coatings obtained autocatalytically on aluminum 6061. A comparative study is carried out of a series of electroless nickel coatings containing different amounts of phosphorous, without surpassing the threshold of 12%. The work includes the study of another nickel coating, Vitrovac 0080 (without phosphorous content), in order to compare structures and anticorrosive properties. These materials are also compared with the Al6061 substrate without any kind of coating. The study is carried out with surface characterization of each one of the samples, with or without coating, using a series of complementary techniques: chemical and electrochemical (linear-sweep voltammetry, cyclic voltammetry, determination of the polarization resistance) and physical (SEM microscopy, determination of micro-hardness). The variables are then correlated as a function of the phosphorous content of the test samples. The structures obtained from the coatings are amorphous. They have no pores or faults and have high hardness values. The electrochemical study proves that the anticorrosive protection capacity of the Ni-P alloy increases along with the

  6. Improved compressed sensing-based cone-beam CT reconstruction using adaptive prior image constraints

    Science.gov (United States)

    Lee, Ho; Xing, Lei; Davidi, Ran; Li, Ruijiang; Qian, Jianguo; Lee, Rena

    2012-04-01

    Volumetric cone-beam CT (CBCT) images are acquired repeatedly during a course of radiation therapy and a natural question to ask is whether CBCT images obtained earlier in the process can be utilized as prior knowledge to reduce patient imaging dose in subsequent scans. The purpose of this work is to develop an adaptive prior image constrained compressed sensing (APICCS) method to solve this problem. Reconstructed images using full projections are taken on the first day of radiation therapy treatment and are used as prior images. The subsequent scans are acquired using a protocol of sparse projections. In the proposed APICCS algorithm, the prior images are utilized as an initial guess and are incorporated into the objective function in the compressed sensing (CS)-based iterative reconstruction process. Furthermore, the prior information is employed to detect any possible mismatched regions between the prior and current images for improved reconstruction. For this purpose, the prior images and the reconstructed images are classified into three anatomical regions: air, soft tissue and bone. Mismatched regions are identified by local differences of the corresponding groups in the two classified sets of images. A distance transformation is then introduced to convert the information into an adaptive voxel-dependent relaxation map. In constructing the relaxation map, the matched regions (unchanged anatomy) between the prior and current images are assigned with smaller weight values, which are translated into less influence on the CS iterative reconstruction process. On the other hand, the mismatched regions (changed anatomy) are associated with larger values and the regions are updated more by the new projection data, thus avoiding any possible adverse effects of prior images. The APICCS approach was systematically assessed by using patient data acquired under standard and low-dose protocols for qualitative and quantitative comparisons. The APICCS method provides an
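
    The relaxation-map step can be sketched as follows. This is an illustrative reading of the method, not the authors' implementation: the HU-like thresholds (t_air, t_bone) and the exponential decay scale are hypothetical choices, and the distance transform stands in for the one described in the abstract.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def relaxation_map(prior_img, current_img, t_air=-300.0, t_bone=300.0,
                   scale=5.0):
    """Voxel-wise relaxation weights: ~1 on mismatched anatomy (updated more
    by new projection data), decaying toward 0 in matched regions (kept
    close to the prior image)."""
    def classify(img):  # 0 = air, 1 = soft tissue, 2 = bone
        return np.digitize(img, [t_air, t_bone])
    mismatch = classify(prior_img) != classify(current_img)
    # Euclidean distance (in voxels) to the nearest mismatched voxel
    dist = distance_transform_edt(~mismatch)
    return np.exp(-dist / scale)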

  7. Comparison of 230Th/234U Dating Results Obtained on Fossil Mollusk Shell from Morocco and Fossil Coral Samples from Egypt. Research of Methodological Criteria to Validate the Measured Age

    International Nuclear Information System (INIS)

    Choukri, A.; Hakam, O.K.; Reyss, J.L.

    2013-01-01

    Radiochemical ages of 126 unrecrystallized coral samples from the Egyptian shoreline and 125 fossil mollusk shell samples from the Atlantic coast of the Moroccan High Atlas are discussed. For corals, the obtained ages are in good agreement with the ages reported previously on unrecrystallized corals, except in some sites where samples are affected by cementation with younger aragonite. For mollusk shells, the obtained ages are in most cases rejuvenated. This rejuvenation is likely due to post-depositional incorporation of secondary uranium, which is responsible for the wide dispersion of apparent ages of mollusk shells.
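
    The underlying age computation, in its simplest closed-system form (assuming the 234U/238U activity ratio is near secular equilibrium and there is no initial 230Th), inverts [230Th/234U] = 1 - exp(-λ₂₃₀·t). Post-depositional uptake of secondary uranium lowers the measured activity ratio and hence rejuvenates the apparent age, consistent with the mollusk-shell results above. A minimal sketch under those simplifying assumptions:

```python
import math

LAMBDA_230 = math.log(2) / 75584.0  # 230Th decay constant (half-life ~75.6 kyr)

def th_u_age(activity_ratio):
    """Closed-system age in years from a measured 230Th/234U activity ratio,
    assuming 234U/238U ~ 1 and no initial 230Th."""
    return -math.log(1.0 - activity_ratio) / LAMBDA_230
```

For example, an activity ratio of 0.5 corresponds to one 230Th half-life; adding secondary uranium after deposition drives the ratio, and thus the apparent age, downward.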

  8. Comparison of Hematologic and Biochemical Test Results in Blood Samples Obtained by Jugular Venipuncture Versus Nail Clip in Moluccan Cockatoos (Cacatua moluccensis).

    Science.gov (United States)

    Bennett, Tracy D; Lejnieks, Daniel V; Koepke, Hoyt; Grimson, Fiona; Szucs, Jennifer; Omaits, Kerri; Lane, Rosalie

    2015-12-01

    In birds, blood samples are often collected from the jugular, medial metatarsal, and basilic veins. Samples are sometimes collected by toenail clip, but concerns about this route include pain after the clip, potential differences in complete blood count (CBC) results, and potential contamination of uric acid values. To compare differences in biochemical and hematologic values in blood samples obtained by jugular venipuncture versus toenail clip, blood samples were collected from Moluccan cockatoos (Cacatua moluccensis) (N = 23) and sent to a commercial laboratory for routine CBCs and serum biochemical analysis. Results showed good agreement between venipuncture and nail clip blood samples in red blood cell count, packed cell volume, heterophil count and percentage, lymphocyte count and percentage, aspartate aminotransferase, chloride, creatine phosphokinase, glucose, lactate dehydrogenase, total protein, and uric acid values. Constant bias was found in values of bile acids, cholesterol, and hemoglobin. Proportional bias toward higher values in the jugular sample was found in total white blood cell (WBC) count and inorganic phosphorus. Serum calcium plots revealed a proportional bias toward higher values in the toenail blood when values were increased. Results suggest some differences in WBC count, bile acids, calcium, cholesterol, hemoglobin, and phosphorus values between blood samples collected by jugular venipuncture and samples collected by toenail clip, but the differences are mostly minor and, with the possible exception of inorganic phosphorus and marginally elevated or very low WBC counts, are unlikely to affect the use or interpretation of the avian blood panel.

  9. Determination of trace inorganic mercury species in water samples by cloud point extraction and UV-vis spectrophotometry.

    Science.gov (United States)

    Ulusoy, Halil Ibrahim

    2014-01-01

    A new micelle-mediated extraction method was developed for preconcentration of ultratrace Hg(II) ions prior to spectrophotometric determination. 2-(2'-Thiazolylazo)-p-cresol (TAC) and Ponpe 7.5 were used as the chelating agent and nonionic surfactant, respectively. Hg(II) ions form a hydrophobic complex with TAC in a micelle medium. The main factors affecting cloud point extraction efficiency, such as pH of the medium, concentrations of TAC and Ponpe 7.5, and equilibration temperature and time, were investigated in detail. An overall preconcentration factor of 33.3 was obtained upon preconcentration of a 50 mL sample. The LOD obtained under the optimal conditions was 0.86 μg/L, and the RSD for five replicate measurements of 100 μg/L Hg(II) was 3.12%. The method was successfully applied to the determination of Hg in environmental water samples.

  10. Emulsification based dispersive liquid microextraction prior to flame atomic absorption spectrometry for the sensitive determination of Cd(II) in water samples

    International Nuclear Information System (INIS)

    Rahimi-Nasrabadi, Mehdi; Banan, Alireza; Zahedi, Mir Mahdi; Pourmortazavi, Seied Mahdi; Nazari, Zakieh; Asghari, Alireza

    2013-01-01

    We report on the application of emulsification-based dispersive liquid microextraction (EB-DLME) to the preconcentration of Cd(II). This procedure not only possesses all the advantages of routine DLLME, but also results in a more stable cloudy state, which is particularly useful when coupling it to FAAS. In EB-DLME, appropriate amounts of the extraction solvent (a solution of dithizone in chloroform) and an aqueous solution of sodium dodecyl sulfate (SDS; acting as a disperser) are injected into the samples. A stable cloudy microemulsion is formed and Cd(II) ion is extracted by chelation. After phase separation, the sedimented phase is subjected to FAAS. Under optimized conditions, the calibration curve for Cd(II) is linear in the range from 0.1 to 25 μg L⁻¹, the limit of detection (at S/N = 3) is 30 pg L⁻¹, the relative standard deviation for seven replicate analyses (at 0.56 μg L⁻¹ of Cd(II)) is 4.6%, and the enrichment factor is 151. EB-DLME in our opinion is a simple, efficient and rapid method for the preconcentration of Cd(II) (and most likely of many other ions) prior to FAAS determination. (author)

  11. The variance quadtree algorithm: use for spatial sampling design

    NARCIS (Netherlands)

    Minasny, B.; McBratney, A.B.; Walvoort, D.J.J.

    2007-01-01

    Spatial sampling schemes are mainly developed to determine sampling locations that can cover the variation of environmental properties in the area of interest. Here we proposed the variance quadtree algorithm for sampling in an area with prior information represented as ancillary or secondary
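Although the record is truncated, the named algorithm can be illustrated: repeatedly split the cell whose ancillary variable shows the highest within-cell variance, then place one sampling location per leaf cell. A minimal sketch, not the authors' implementation; the quadrant split rule and centroid sampling are generic assumptions:

```python
import numpy as np

def variance_quadtree(grid, n_cells):
    """Iteratively split the leaf cell with the highest within-cell
    variance of the ancillary variable until n_cells leaves exist."""
    leaves = [(0, 0, grid.shape[0], grid.shape[1])]  # (r0, c0, r1, c1)

    def cell_var(cell):
        r0, c0, r1, c1 = cell
        return float(np.var(grid[r0:r1, c0:c1]))

    while len(leaves) < n_cells:
        leaves.sort(key=cell_var)
        r0, c0, r1, c1 = leaves.pop()          # highest-variance leaf
        rm, cm = (r0 + r1) // 2, (c0 + c1) // 2
        if rm == r0 or cm == c0:               # cell too small to split
            leaves.append((r0, c0, r1, c1))
            break
        leaves += [(r0, c0, rm, cm), (r0, cm, rm, c1),
                   (rm, c0, r1, cm), (rm, cm, r1, c1)]
    # one sampling location per leaf: the cell centre
    points = [((r0 + r1) / 2.0, (c0 + c1) / 2.0)
              for r0, c0, r1, c1 in leaves]
    return leaves, points

rng = np.random.default_rng(0)
ancillary = np.zeros((8, 8))
ancillary[:, 4:] = rng.normal(size=(8, 4))     # variable right half
leaves, points = variance_quadtree(ancillary, 7)
```

On a grid whose left half is constant, the extra splits land in the variable right half, so sampling effort concentrates where the ancillary variable varies most.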

  12. Methods for obtaining a uniform volume concentration of implanted ions

    International Nuclear Information System (INIS)

    Reutov, V.F.

    1995-01-01

    Three simple, practical methods of irradiation with high-energy particles that provide a uniform volume concentration of implanted ions in massive samples are described. The first method is based on two-sided irradiation of a plane sample while it rotates in the projectile flux. The second, for uniform ion alloying, uses free air as a filter of varying absorbing power: the sample is moved along an ion beam extracted to the atmosphere. The third method consists of irradiating the sample through an absorbing foil filter curved according to a parabolic law and moving along the sample surface. The first method is most effective for preparing large numbers of samples, for example for mechanical tests; the second for irradiation in different gaseous media; and the third for obtaining high concentrations of implanted ions under controlled (regulated) thermal and deformation conditions. 2 refs., 7 figs

  13. A Method to Correlate mRNA Expression Datasets Obtained from Fresh Frozen and Formalin-Fixed, Paraffin-Embedded Tissue Samples: A Matter of Thresholds.

    Directory of Open Access Journals (Sweden)

    Dana A M Mustafa

    Full Text Available Gene expression profiling of tumors is a successful tool for the discovery of new cancer biomarkers and potential targets for the development of new therapeutic strategies. Reliable profiling is preferably performed on fresh frozen (FF) tissues, in which the quality of nucleic acids is better preserved than in formalin-fixed, paraffin-embedded (FFPE) material. However, since snap-freezing of biopsy material is often not part of daily routine in pathology laboratories, one may have to rely on archival FFPE material. Procedures to retrieve RNA from FFPE material have been developed; therefore, datasets obtained from FFPE and FF material need to be made compatible to ensure reliable comparisons are possible. The aim of this study was to develop an efficient method to compare gene expression profiles obtained from FFPE and FF samples using the same platform. Twenty-six FFPE-FF sample pairs of the same tumors representing various cancer types, and two FFPE-FF sample pairs of breast cancer cell lines, were included. Total RNA was extracted and gene expression profiling was carried out using Illumina's Whole-Genome cDNA-mediated Annealing, Selection, Extension and Ligation (WG-DASL V3) arrays, enabling the simultaneous detection of 24,526 mRNA transcripts. A sample exclusion criterion was created based on the expression of 11 stably expressed reference genes. Pearson correlation at the probe level was calculated for paired FFPE-FF samples, and three cut-off values were chosen. Spearman correlation coefficients between the matched FFPE and FF samples were calculated for three probe lists with varying levels of significance and compared to the correlation based on all measured probes. Unsupervised hierarchical cluster analysis was performed to verify the performance of the included probe lists in comparing matched FFPE-FF samples. Twenty-seven FFPE-FF pairs passed the sample exclusion criterion.
From the profiles of 27 FFPE and FF matched samples, the best correlating probes were identified
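The probe-selection step described above, a per-probe Pearson correlation across matched FFPE-FF pairs thresholded at a cut-off, can be sketched as follows (synthetic toy data; the cut-off of 0.6 is an arbitrary illustration, not one of the paper's three values):

```python
import numpy as np

def pearson(a, b):
    return float(np.corrcoef(a, b)[0, 1])

def select_probes(ffpe, ff, r_cutoff=0.6):
    """Keep probes whose expression correlates across matched pairs.
    ffpe, ff: (n_pairs, n_probes) arrays, rows matched per tumor."""
    return [j for j in range(ffpe.shape[1])
            if pearson(ffpe[:, j], ff[:, j]) >= r_cutoff]

def per_pair_correlation(ffpe, ff, probes):
    """Correlation of each matched pair over the selected probes only."""
    return [pearson(ffpe[i, probes], ff[i, probes])
            for i in range(ffpe.shape[0])]

# toy data: probes 0-2 agree between preparations, probe 3 does not
rng = np.random.default_rng(1)
ffpe = rng.normal(size=(20, 4))
ff = ffpe.copy()
ff[:, 3] *= -1.0                 # probe 3 disagrees across preparations
kept = select_probes(ffpe, ff)   # retains probes 0, 1, 2
```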

  14. Synchrotron micro-diffraction analysis of the microstructure of cryogenically treated high performance tool steels prior to and after tempering

    Energy Technology Data Exchange (ETDEWEB)

    Xu, N.; Cavallaro, G.P. [Applied Centre for Structural and Synchrotron Studies, Mawson Lakes Blvd, University of South Australia, Mawson Lakes, South Australia 5095 (Australia); Gerson, A.R., E-mail: Andrea.Gerson@unisa.edu.au [Applied Centre for Structural and Synchrotron Studies, Mawson Lakes Blvd, University of South Australia, Mawson Lakes, South Australia 5095 (Australia)

    2010-10-15

    The phase transformation and strain changes within cryogenically (-196 deg. C) treated high performance tool steels (AISI H13) before and after tempering have been examined using both laboratory XRD and synchrotron micro-diffraction. The martensitic unit cell was found to have very low tetragonality, as expected for a low-carbon steel. Tempering resulted in the diffusion of excess carbon out of the martensite phase and consequent unit cell shrinkage. In addition, on tempering the martensite became more homogeneous than in the same samples prior to tempering. Among the cryogenically treated samples, the effect was most pronounced for the rapidly cooled sample, which was the least homogeneous prior to tempering but the most homogeneous after tempering. This suggests that the considerable degree of disorder resulting from rapid cryogenic cooling leads to a beneficial release of micro-stresses on tempering, possibly explaining the improved wear resistance and durability observed for cryogenically treated tool steels.

  15. Synchrotron micro-diffraction analysis of the microstructure of cryogenically treated high performance tool steels prior to and after tempering

    International Nuclear Information System (INIS)

    Xu, N.; Cavallaro, G.P.; Gerson, A.R.

    2010-01-01

    The phase transformation and strain changes within cryogenically (-196 deg. C) treated high performance tool steels (AISI H13) before and after tempering have been examined using both laboratory XRD and synchrotron micro-diffraction. The martensitic unit cell was found to have very low tetragonality, as expected for a low-carbon steel. Tempering resulted in the diffusion of excess carbon out of the martensite phase and consequent unit cell shrinkage. In addition, on tempering the martensite became more homogeneous than in the same samples prior to tempering. Among the cryogenically treated samples, the effect was most pronounced for the rapidly cooled sample, which was the least homogeneous prior to tempering but the most homogeneous after tempering. This suggests that the considerable degree of disorder resulting from rapid cryogenic cooling leads to a beneficial release of micro-stresses on tempering, possibly explaining the improved wear resistance and durability observed for cryogenically treated tool steels.

  16. Prior knowledge in recalling arguments in bioethical dilemmas

    Directory of Open Access Journals (Sweden)

    Hiemke Katharina Schmidt

    2015-09-01

    Full Text Available Prior knowledge is known to facilitate learning new information. Normally, in studies confirming this outcome, the relationship between prior knowledge and the topic to be learned is obvious: the information to be acquired is part of the domain or topic to which the prior knowledge belongs. This raises the question as to whether prior knowledge of various domains facilitates recalling information. In this study, 79 eleventh-grade students completed a questionnaire on their prior knowledge of seven different domains related to the bioethical dilemma of prenatal diagnostics. The students read a text containing arguments for and against prenatal diagnostics. After one week, and again 12 weeks later, they were asked to write down all the arguments they remembered. Prior knowledge helped them recall the arguments one week (r = .350) and 12 weeks (r = .316) later. Prior knowledge of three of the seven domains significantly helped them recall the arguments one week later (correlations from r = .194 to r = .394). Partial correlations with interest as a control variable revealed that interest did not explain the relationship between prior knowledge and recall. Prior knowledge of different domains jointly supports the recall of arguments related to bioethical topics.
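The partial-correlation control for interest can be illustrated with a small helper that regresses the control variable out of both measures and correlates the residuals (a generic first-order partial correlation, not the authors' statistical software):

```python
import numpy as np

def partial_corr(x, y, z):
    """Pearson correlation of x and y after linearly regressing the
    control variable z out of both (first-order partial correlation)."""
    x, y, z = (np.asarray(v, dtype=float) for v in (x, y, z))

    def resid(v):
        slope, intercept = np.polyfit(z, v, 1)
        return v - (slope * z + intercept)

    return float(np.corrcoef(resid(x), resid(y))[0, 1])
```

If the x-y association survives with z partialled out, as prior knowledge and recall did with interest controlled, z does not explain the relationship.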

  17. Crystalline Silicon Solar Cells with Thin Silicon Passivation Film Deposited prior to Phosphorous Diffusion

    Directory of Open Access Journals (Sweden)

    Ching-Tao Li

    2014-01-01

    Full Text Available We demonstrate the performance improvement of p-type single-crystalline silicon (sc-Si) solar cells resulting from front surface passivation by a thin amorphous silicon (a-Si) film deposited prior to phosphorus diffusion. Conversion efficiency was improved for samples with an a-Si film of ~5 nm thickness deposited on the front surface prior to high-temperature phosphorus diffusion, relative to samples with an a-Si film deposited on the front surface after phosphorus diffusion. The improvement is 0.4% absolute with respect to such conventionally a-Si-passivated cells, and 0.5% with respect to cells without a-Si passivation. These performance improvements result from reduced surface recombination as well as lowered contact resistance, the latter of which induces a high fill factor of the solar cell.

  18. Statin Eligibility and Outpatient Care Prior to ST-Segment Elevation Myocardial Infarction.

    Science.gov (United States)

    Miedema, Michael D; Garberich, Ross F; Schnaidt, Lucas J; Peterson, Erin; Strauss, Craig; Sharkey, Scott; Knickelbine, Thomas; Newell, Marc C; Henry, Timothy D

    2017-04-12

    The impact of the 2013 American College of Cardiology/American Heart Association (ACC/AHA) cholesterol guidelines on statin eligibility in individuals otherwise destined to experience cardiovascular disease (CVD) events is unclear. We analyzed a prospective cohort of consecutive ST-segment elevation myocardial infarction (STEMI) patients from a regional STEMI system with data on patient demographics, low-density lipoprotein cholesterol levels, CVD risk factors, medication use, and outpatient visits over the 2 years prior to STEMI. We determined pre-STEMI eligibility according to the ACC/AHA guidelines and the prior Third Report of the Adult Treatment Panel (ATP III) guidelines. Our sample included 1062 patients with a mean age of 63.7 (13.0) years (72.5% male), and 761 (71.7%) did not have known CVD prior to STEMI. Only 62.5% and 19.3% of individuals with and without prior CVD were taking a statin before STEMI, respectively. In individuals not taking a statin, median (interquartile range) low-density lipoprotein cholesterol levels in those with and without known CVD were low (108 [83, 138] mg/dL and 110 [87, 133] mg/dL). For individuals not taking a statin, only 38.7% were statin eligible by ATP III guidelines. Conversely, 79.0% would have been statin eligible according to the ACC/AHA guidelines. Less than half of individuals with (49.2%) and without (41.1%) prior CVD had seen a primary care provider during the 2 years prior to STEMI. In a large cohort of STEMI patients, application of the ACC/AHA guidelines more than doubled pre-STEMI statin eligibility compared with ATP III guidelines. However, access to and utilization of health care, a necessity for guideline implementation, was suboptimal prior to STEMI. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.

  19. Fast, simple and efficient salting-out assisted liquid-liquid extraction of naringenin from fruit juice samples prior to their enantioselective determination by liquid chromatography.

    Science.gov (United States)

    Magiera, Sylwia; Kwietniowska, Ewelina

    2016-11-15

    In this study, a simple and efficient method for the determination of naringenin enantiomers in fruit juices after salting-out assisted liquid-liquid extraction (SALLE) and high-performance liquid chromatography (HPLC) with diode-array detection (DAD) was developed. The sample treatment is based on the use of water-miscible acetonitrile as the extractant and acetonitrile phase separation under high-salt conditions. After extraction, juice samples were incubated with hydrochloric acid to hydrolyze naringin to naringenin. The hydrolysis parameters were optimized using a half-fraction factorial central composite design (CCD). After sample preparation, chromatographic separation was achieved on a Chiralcel® OJ-RH column with a mobile phase consisting of 10 mM aqueous ammonium acetate:methanol:acetonitrile (50:30:20; v/v/v) and detection at 288 nm. The average recovery of the analyzed compounds ranged from 85.6 to 97.1%. The proposed method was satisfactorily applied to the determination of naringenin enantiomers in various fruit juice samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Preconcentration of copper from natural water samples using ligand-less in situ surfactant-based solid phase extraction prior to FAAS determination

    Directory of Open Access Journals (Sweden)

    Sayed Zia Mohammadi

    2014-01-01

    Full Text Available In the present work, a simple and rapid ligand-less in situ surfactant-based solid phase extraction (LL-ISS-SPE) for the preconcentration of copper in water samples was developed. In this method, a cationic surfactant (n-dodecyltrimethylammonium bromide, DTAB) was dissolved in an aqueous sample, followed by the addition of an appropriate ion-pairing agent (ClO4-). Due to the interaction between the surfactant and the ion-pairing agent, solid particles were formed and subsequently used for the adsorption of Cu(OH)2 and CuI. After centrifugation, the sediment was dissolved in 1.0 mL of 1 mol L-1 HNO3 in ethanol and aspirated directly into the flame atomic absorption spectrometer. To obtain the optimum conditions, several parameters affecting the performance of the LL-ISS-SPE, including the volumes of DTAB, KClO4, and KI solutions, pH, and potentially interfering ions, were optimized. It was found that KI and phosphate buffer solution (pH = 9) could extract more than 95% of the copper ions. The amount of copper in the water samples varied from 3.2 to 4.8 ng mL-1, with recoveries of 98.5%-103%. The determination of copper in water samples was linear over the concentration range 0.5-200.0 ng mL-1. The limit of detection (3Sb/m) was 0.1 ng mL-1, with an enrichment factor of 38.7. The accuracy of the developed method was verified by the determination of copper in two certified reference materials, with satisfactory results.

  1. On a Robust MaxEnt Process Regression Model with Sample-Selection

    Directory of Open Access Journals (Sweden)

    Hea-Jung Kim

    2018-04-01

    Full Text Available In a regression analysis, a sample-selection bias arises when a dependent variable is partially observed as a result of the sample selection. This study introduces a Maximum Entropy (MaxEnt) process regression model that assumes a MaxEnt prior distribution for its nonparametric regression function and finds that the MaxEnt process regression model includes the well-known Gaussian process regression (GPR) model as a special case. Then, this special MaxEnt process regression model, i.e., the GPR model, is generalized to obtain a robust sample-selection Gaussian process regression (RSGPR) model that deals with non-normal data in the sample selection. Various properties of the RSGPR model are established, including the stochastic representation, distributional hierarchy, and magnitude of the sample-selection bias. These properties are used in the paper to develop a hierarchical Bayesian methodology to estimate the model. This involves a simple and computationally feasible Markov chain Monte Carlo algorithm that avoids analytical or numerical derivatives of the log-likelihood function of the model. The performance of the RSGPR model, in terms of sample-selection bias correction, robustness to non-normality, and prediction, is demonstrated through simulation results that attest to its good finite-sample performance.
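The GPR model that the MaxEnt process generalizes has a closed-form posterior; a minimal zero-mean GP regression with a squared-exponential kernel (generic textbook form with assumed hyperparameters, not the paper's robust sample-selection extension):

```python
import numpy as np

def rbf(a, b, length=1.0, sigma_f=1.0):
    """Squared-exponential covariance between two 1-D input vectors."""
    d = a[:, None] - b[None, :]
    return sigma_f**2 * np.exp(-0.5 * (d / length)**2)

def gpr_posterior(x_train, y_train, x_test, noise=1e-2):
    """Posterior mean and covariance of a zero-mean GP at x_test."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_train, x_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    cov = rbf(x_test, x_test) - v.T @ v
    return mean, cov

x = np.array([-1.0, 0.0, 1.0])
y = np.array([1.0, 0.0, 1.0])      # observations of y = x**2
mu, cov = gpr_posterior(x, y, x)   # posterior at the training inputs
```

With a small noise term, the posterior mean nearly interpolates the training targets, which is the behavior the RSGPR model extends to selectively observed, non-normal data.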

  2. Hybrid nested sampling algorithm for Bayesian model selection applied to inverse subsurface flow problems

    International Nuclear Information System (INIS)

    Elsheikh, Ahmed H.; Wheeler, Mary F.; Hoteit, Ibrahim

    2014-01-01

    A Hybrid Nested Sampling (HNS) algorithm is proposed for efficient Bayesian model calibration and prior model selection. The proposed algorithm combines the Nested Sampling (NS) algorithm, Hybrid Monte Carlo (HMC) sampling, and gradient estimation using the Stochastic Ensemble Method (SEM). NS is an efficient sampling algorithm that can be used for Bayesian calibration and for estimating the Bayesian evidence for prior model selection, and it has the advantage of computational feasibility. Within the nested sampling algorithm, a constrained sampling step is performed. For this step, we utilize HMC to reduce the correlation between successive sampled states. HMC relies on the gradient of the logarithm of the posterior distribution, which we estimate using a stochastic ensemble method based on an ensemble of directional derivatives. SEM requires only forward model runs; the simulator is used as a black box and no adjoint code is needed. The developed HNS algorithm is successfully applied to Bayesian calibration and prior model selection for several nonlinear subsurface flow problems

  3. Hybrid nested sampling algorithm for Bayesian model selection applied to inverse subsurface flow problems

    Energy Technology Data Exchange (ETDEWEB)

    Elsheikh, Ahmed H., E-mail: aelsheikh@ices.utexas.edu [Institute for Computational Engineering and Sciences (ICES), University of Texas at Austin, TX (United States); Institute of Petroleum Engineering, Heriot-Watt University, Edinburgh EH14 4AS (United Kingdom); Wheeler, Mary F. [Institute for Computational Engineering and Sciences (ICES), University of Texas at Austin, TX (United States); Hoteit, Ibrahim [Department of Earth Sciences and Engineering, King Abdullah University of Science and Technology (KAUST), Thuwal (Saudi Arabia)

    2014-02-01

    A Hybrid Nested Sampling (HNS) algorithm is proposed for efficient Bayesian model calibration and prior model selection. The proposed algorithm combines the Nested Sampling (NS) algorithm, Hybrid Monte Carlo (HMC) sampling, and gradient estimation using the Stochastic Ensemble Method (SEM). NS is an efficient sampling algorithm that can be used for Bayesian calibration and for estimating the Bayesian evidence for prior model selection, and it has the advantage of computational feasibility. Within the nested sampling algorithm, a constrained sampling step is performed. For this step, we utilize HMC to reduce the correlation between successive sampled states. HMC relies on the gradient of the logarithm of the posterior distribution, which we estimate using a stochastic ensemble method based on an ensemble of directional derivatives. SEM requires only forward model runs; the simulator is used as a black box and no adjoint code is needed. The developed HNS algorithm is successfully applied to Bayesian calibration and prior model selection for several nonlinear subsurface flow problems.

  4. Hybrid nested sampling algorithm for Bayesian model selection applied to inverse subsurface flow problems

    KAUST Repository

    Elsheikh, Ahmed H.

    2014-02-01

    A Hybrid Nested Sampling (HNS) algorithm is proposed for efficient Bayesian model calibration and prior model selection. The proposed algorithm combines the Nested Sampling (NS) algorithm, Hybrid Monte Carlo (HMC) sampling, and gradient estimation using the Stochastic Ensemble Method (SEM). NS is an efficient sampling algorithm that can be used for Bayesian calibration and for estimating the Bayesian evidence for prior model selection, and it has the advantage of computational feasibility. Within the nested sampling algorithm, a constrained sampling step is performed. For this step, we utilize HMC to reduce the correlation between successive sampled states. HMC relies on the gradient of the logarithm of the posterior distribution, which we estimate using a stochastic ensemble method based on an ensemble of directional derivatives. SEM requires only forward model runs; the simulator is used as a black box and no adjoint code is needed. The developed HNS algorithm is successfully applied to Bayesian calibration and prior model selection for several nonlinear subsurface flow problems. © 2013 Elsevier Inc.
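The NS loop these records describe can be sketched on a toy problem: a Uniform(-5, 5) prior and a standard-normal likelihood, whose true evidence is about 0.1. The constrained-prior step below uses plain rejection sampling rather than the HMC/SEM machinery the papers propose, and all settings are illustrative:

```python
import math, random

def loglike(theta):
    """Standard-normal log-likelihood, standing in for a forward model."""
    return -0.5 * theta**2 - 0.5 * math.log(2.0 * math.pi)

def nested_sampling(n_live=100, n_iter=600, seed=1):
    rng = random.Random(seed)
    draw_prior = lambda: rng.uniform(-5.0, 5.0)   # Uniform(-5, 5) prior
    live = [draw_prior() for _ in range(n_live)]
    logl = [loglike(t) for t in live]
    evidence, x_prev = 0.0, 1.0
    for i in range(1, n_iter + 1):
        worst = min(range(n_live), key=lambda j: logl[j])
        x_i = math.exp(-i / n_live)     # expected prior-mass shrinkage
        evidence += math.exp(logl[worst]) * (x_prev - x_i)
        x_prev = x_i
        lmin = logl[worst]
        while True:  # rejection-sample the likelihood-constrained prior
            t = draw_prior()
            if loglike(t) > lmin:
                live[worst], logl[worst] = t, loglike(t)
                break
    # credit the remaining prior mass to the surviving live points
    evidence += x_prev * sum(math.exp(l) for l in logl) / n_live
    return evidence

Z = nested_sampling()
```

Each iteration removes the worst live point, credits it with the shell of prior mass it occupied, and replaces it with a draw from the prior constrained to higher likelihood; replacing the rejection step with a gradient-guided HMC move is exactly where the HNS algorithm differs.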

  5. Can natural selection encode Bayesian priors?

    Science.gov (United States)

    Ramírez, Juan Camilo; Marshall, James A R

    2017-08-07

    The evolutionary success of many organisms depends on their ability to make decisions based on estimates of the state of their environment (e.g., predation risk) from uncertain information. These decision problems have optimal solutions, and individuals in nature are expected to evolve the behavioural mechanisms to make decisions as if using the optimal solutions. Bayesian inference is the optimal method to produce estimates from uncertain data; thus natural selection is expected to favour individuals with the behavioural mechanisms to make decisions as if they were computing Bayesian estimates in typically experienced environments, although this does not necessarily imply that favoured decision-makers do perform Bayesian computations exactly. Each individual should evolve to behave as if updating a prior estimate of the unknown environment variable to a posterior estimate as it collects evidence. The prior estimate represents the decision-maker's default belief regarding the environment variable, i.e., the individual's default 'worldview' of the environment. This default belief has been hypothesised to be shaped by natural selection and represent the environment experienced by the individual's ancestors. We present an evolutionary model to explore how accurately Bayesian prior estimates can be encoded genetically and shaped by natural selection when decision-makers learn from uncertain information. The model simulates the evolution of a population of individuals that are required to estimate the probability of an event. Every individual has a prior estimate of this probability and collects noisy cues from the environment in order to update its prior belief to a Bayesian posterior estimate with the evidence gained. The prior is inherited and passed on to offspring. Fitness increases with the accuracy of the posterior estimates produced. Simulations show that prior estimates become accurate over evolutionary time.
In addition to these 'Bayesian' individuals, we also
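The model described, inherited priors selected on posterior accuracy, can be sketched with Beta priors and Bernoulli cues; the population size, mutation scale, and truncation selection below are illustrative assumptions, not the paper's exact settings:

```python
import random

def posterior_mean(a, b, cues):
    """Mean of a Beta(a, b) prior updated with binary cues."""
    return (a + sum(cues)) / (a + b + len(cues))

def evolve(p_true=0.8, pop=200, gens=300, n_cues=3, seed=0):
    rng = random.Random(seed)
    # genome = parameters (a, b) of an inherited Beta prior
    genomes = [(rng.uniform(0.5, 5.0), rng.uniform(0.5, 5.0))
               for _ in range(pop)]
    for _ in range(gens):
        scored = []
        for a, b in genomes:
            cues = [1 if rng.random() < p_true else 0
                    for _ in range(n_cues)]
            err = abs(posterior_mean(a, b, cues) - p_true)
            scored.append((err, a, b))
        scored.sort()                  # most accurate estimators first
        parents = scored[:pop // 2]    # truncation selection on accuracy
        genomes = [(max(0.1, a + rng.gauss(0.0, 0.1)),
                    max(0.1, b + rng.gauss(0.0, 0.1)))
                   for _, a, b in parents for _ in range(2)]
    # population-mean prior estimate a / (a + b)
    return sum(a / (a + b) for a, b in genomes) / len(genomes)

mean_prior = evolve()   # drifts toward the environmental p_true
```

Because fitness rewards accurate posteriors computed from few noisy cues, selection pushes the inherited prior mean toward the true event probability, mirroring the paper's finding that priors become accurate over evolutionary time.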

  6. Obtaining zircaloy powder through hydriding

    International Nuclear Information System (INIS)

    Dupim, Ivaldete da Silva; Moreira, Joao M.L.

    2009-01-01

    Zirconium alloys are good options for the metal matrix in dispersion fuels for power reactors due to their low thermal neutron absorption cross-section, good corrosion resistance, good mechanical strength and high thermal conductivity. A necessary step in obtaining such fuels is producing Zr alloy powder for the metal matrix composite material. This article presents results from Zircaloy-4 hydrogenation tests intended to embrittle the alloy as a first step toward comminution. Several hydrogenation tests were performed and studied through thermogravimetric analysis. They covered H2 pressures of 25 and 50 kPa and temperatures ranging from 20 to 670 deg C. X-ray diffraction analysis of the hydrogenated samples showed the predominant presence of ZrH2 and some ZrO2. Some kinetic parameters of the Zircaloy-4 hydrogenation reaction were obtained: the time required to reach the equilibrium state at the dwell temperature was about 100 minutes; the hydrogenation rate during heating from 20 to 670 deg C was about 21 mg/h; and at a constant temperature of 670 deg C, the hydriding rate was about 1.15 mg/h. The hydrogenation rate is largest during the heating process, and most of the uptake occurs during this period. After hydrogenation, the samples could easily be comminuted, indicating that this is a viable technology for obtaining Zircaloy powder. The results show that only a few minutes of hydrogenation are necessary to reach the hydride levels required for comminuting the Zircaloy. The final hydride stoichiometry was between 2.7 and 2.8 H for each Zr atom in the sample (author)
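The reported stoichiometry of 2.7-2.8 H per Zr atom can be checked from a thermogravimetric mass gain; the 30 mg gain on a 1 g sample below is a hypothetical reading chosen to land in that range, and the sample is treated as pure Zr (the ~2% alloying additions of Zircaloy-4 are ignored):

```python
# Back-of-the-envelope H/Zr ratio from a hypothetical TGA mass gain.
M_ZR = 91.224   # g/mol, zirconium
M_H = 1.008     # g/mol, atomic hydrogen

def h_per_zr(sample_mass_g, mass_gain_g):
    """H/Zr atomic ratio implied by the hydrogen mass gain,
    treating the whole sample mass as zirconium."""
    mol_zr = sample_mass_g / M_ZR
    mol_h = mass_gain_g / M_H
    return mol_h / mol_zr

ratio = h_per_zr(1.0, 0.030)   # ~2.7, inside the reported 2.7-2.8 range
```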

  7. Histology of periapical lesions obtained during apical surgery.

    Science.gov (United States)

    Schulz, Malte; von Arx, Thomas; Altermatt, Hans Jörg; Bosshardt, Dieter

    2009-05-01

    The aim of this study was to evaluate the histology of periapical lesions in teeth treated with periapical surgery. After root-end resection, the root tip was removed together with the periapical pathological tissue. Histologic sectioning was performed on calcified specimens embedded in methylmethacrylate (MMA) and on demineralized specimens embedded in LR White (Fluka, Buchs, Switzerland). The samples were evaluated with light and transmission electron microscopy (TEM). The histologic findings were classified into periapical abscesses, granulomas, or cystic lesions (true or pocket cysts). Six of 125 samples could not be used. The final material comprised 70% granulomas, 23% cysts, 5% abscesses, 1% scar tissue, and 1% keratocysts. The cystic lesions could not be subdivided into pocket or true cysts. All cysts had an epithelium-lined cavity, two of them with ciliated epithelium. These results show the high incidence of periapical granulomas among periapical lesions obtained during apical surgery; periapical abscesses were rare. The histologic findings from samples obtained during apical surgery may differ from findings obtained from extracted teeth. A distinction between pocket and true apical cysts is hardly possible when collecting samples by apical surgery.

  8. Use of high-intensity sonication for pre-treatment of biological tissues prior to multielemental analysis by total reflection X-ray fluorescence spectrometry

    International Nuclear Information System (INIS)

    De La Calle, Inmaculada; Costas, Marta; Cabaleiro, Noelia; Lavilla, Isela; Bendicho, Carlos

    2012-01-01

    In this work, two ultrasound-based procedures are developed for sample preparation prior to the determination of P, K, Ca, Cr, Mn, Fe, Ni, Cu, Zn, As, Se and Sr in biological tissues by total reflection X-ray fluorescence spectrometry. Ultrasound-assisted extraction by means of a cup-horn sonoreactor and ultrasonic-probe slurry sampling were compared with a well-established procedure, magnetic agitation slurry sampling. For that purpose, seven certified reference materials and different real samples of animal tissue were used. Similar accuracy and precision were obtained with the three sample preparation approaches. Limits of detection depended on both the sample matrix and the pre-treatment used, with the best values achieved by ultrasound-assisted extraction. Advantages of ultrasound-assisted extraction include reduced sample handling, decreased contamination risk (neither addition of surfactants nor use of foreign objects inside the extraction vial), a simpler background (no solid particles on the sample carrier) and improved recovery for some elements such as P. A mixture of 10% v/v HNO3 + 20–40% v/v HCl was suitable for extraction from biological tissues. - Highlights: ► We implement high-intensity sonication for pre-treatment of biological tissues. ► Multielemental analysis is performed by total reflection X-ray spectrometry. ► Ultrasound-based procedures are developed and compared to conventional slurry preparation. ► Features such as background, recovery and sample handling are favored by using ultrasonic extraction.

  9. Attentional and Contextual Priors in Sound Perception.

    Science.gov (United States)

    Wolmetz, Michael; Elhilali, Mounya

    2016-01-01

    Behavioral and neural studies of selective attention have consistently demonstrated that explicit attentional cues to particular perceptual features profoundly alter perception and performance. The statistics of the sensory environment can also provide cues about what perceptual features to expect, but the extent to which these more implicit contextual cues impact perception and performance, as well as their relationship to explicit attentional cues, is not well understood. In this study, the explicit cues, or attentional prior probabilities, and the implicit cues, or contextual prior probabilities, associated with different acoustic frequencies in a detection task were simultaneously manipulated. Both attentional and contextual priors had similarly large but independent impacts on sound detectability, with evidence that listeners tracked and used contextual priors for a variety of sound classes (pure tones, harmonic complexes, and vowels). Further analyses showed that listeners updated their contextual priors rapidly and optimally, given the changing acoustic frequency statistics inherent in the paradigm. A Bayesian Observer model accounted for both attentional and contextual adaptations found with listeners. These results bolster the interpretation of perception as Bayesian inference, and suggest that some effects attributed to selective attention may be a special case of contextual prior integration along a feature axis.

  10. Automated Blood Sample Preparation Unit (ABSPU) for Portable Microfluidic Flow Cytometry.

    Science.gov (United States)

    Chaturvedi, Akhil; Gorthi, Sai Siva

    2017-02-01

    Portable microfluidic diagnostic devices, including flow cytometers, are being developed for point-of-care settings, especially in conjunction with inexpensive imaging devices such as mobile phone cameras. However, two pervasive drawbacks of these devices have been the lack of automated sample preparation processes and cells settling out of sample suspensions, leading to inaccurate results. We report an automated blood sample preparation unit (ABSPU) to prevent blood samples from settling in a reservoir during loading of samples in flow cytometers. This apparatus automates the preanalytical steps of dilution and staining of blood cells prior to microfluidic loading. It employs an assembly with a miniature vibration motor to drive turbulence in a sample reservoir. To validate the performance of this system, we present experimental evidence demonstrating prevention of blood cell settling, cell integrity, and staining of cells prior to flow cytometric analysis. This setup is further integrated with a microfluidic imaging flow cytometer to investigate cell count variability. With no need for prior sample preparation, a drop of whole blood can be directly introduced to the setup without premixing with buffers manually. Our results show that integration of this assembly with microfluidic analysis provides a competent automation tool for low-cost point-of-care blood-based diagnostics.

  11. Total Variability Modeling using Source-specific Priors

    DEFF Research Database (Denmark)

    Shepstone, Sven Ewan; Lee, Kong Aik; Li, Haizhou

    2016-01-01

    sequence of an utterance. In both cases the prior for the latent variable is assumed to be non-informative, since for homogeneous datasets there is no gain in generality in using an informative prior. This work shows, in the heterogeneous case, that using informative priors for computing the posterior can lead to favorable results. We focus on modeling the priors using a minimum divergence criterion or factor analysis techniques. Tests on the NIST 2008 and 2010 Speaker Recognition Evaluation (SRE) datasets show that our proposed method beats four baselines: for i-vector extraction using an already trained matrix, five out of eight female and four out of eight male common conditions were improved for the short2-short3 task in SRE'08, and four out of nine female and six out of nine male common conditions were improved for the core-extended task in SRE'10. When incorporating prior information

  12. Quantitative Evidence Synthesis with Power Priors

    NARCIS (Netherlands)

    Rietbergen, C.|info:eu-repo/dai/nl/322847796

    2016-01-01

    The aim of this thesis is to provide the applied researcher with a practical approach for quantitative evidence synthesis using the conditional power prior that allows for subjective input and thereby provides an alternative to deal with the difficulties associated with the joint power prior.
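
The power prior discounts a historical likelihood by raising it to a fixed power a0 in [0, 1]. In the conjugate Beta-binomial case this reduces to pseudo-count arithmetic; the sketch below is a hypothetical minimal example of that mechanism, not code from the thesis:

```python
def power_prior_beta_posterior(alpha0, beta0, y_hist, n_hist, a0, y_cur, n_cur):
    """Beta-binomial posterior under a conditional power prior.

    Historical successes/failures (y_hist, n_hist - y_hist) are discounted by
    the fixed power a0 before the current data are added, so a0 = 0 ignores
    the historical study and a0 = 1 pools it fully.
    """
    alpha_post = alpha0 + a0 * y_hist + y_cur
    beta_post = beta0 + a0 * (n_hist - y_hist) + (n_cur - y_cur)
    return alpha_post, beta_post

# a0 = 0: historical data ignored; a0 = 1: full borrowing.
print(power_prior_beta_posterior(1, 1, 30, 100, 0.0, 10, 40))  # → (11.0, 31.0)
```

Choosing a0 (fixed here, random in the joint power prior) is exactly the difficulty the thesis addresses.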

  13. Use of prior mammograms in the classification of benign and malignant masses

    International Nuclear Information System (INIS)

    Varela, Celia; Karssemeijer, Nico; Hendriks, Jan H.C.L.; Holland, Roland

    2005-01-01

    The purpose of this study was to determine the importance of using prior mammograms for classification of benign and malignant masses. Five radiologists and one resident classified mass lesions in 198 mammograms obtained from a population-based screening program. Cases were interpreted twice, once without and once with comparison of previous mammograms, in a sequential reading order using soft copy image display. The radiologists' performances in classifying benign and malignant masses without and with previous mammograms were evaluated with receiver operating characteristic (ROC) analysis. The statistical significance of the difference in performances was calculated using analysis of variance. The use of prior mammograms improved the classification performance of all participants in the study. The mean area under the ROC curve of the readers increased from 0.763 to 0.796. This difference in performance was statistically significant (P = 0.008)
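
The area under the ROC curve reported above (0.763 rising to 0.796) equals the probability that a randomly chosen malignant case receives a higher score than a randomly chosen benign one. A minimal sketch of this rank-based (Mann-Whitney) computation, using made-up scores rather than the study's ratings:

```python
def roc_auc(scores_pos, scores_neg):
    """AUC as the fraction of positive/negative pairs ranked correctly (ties count half)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Perfect separation gives AUC = 1.0; chance-level scoring gives 0.5.
```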

  14. Recruiting for Prior Service Market

    Science.gov (United States)

    2008-06-01

    Objectives include identifying perceptions, expectations, and issues for re-enlistment, and developing potential marketing and advertising tactics and strategies targeted to the defined prior-service market. Report dated 01 JUN 2008; authors: MAJ Eric Givens / MAJ Brian ...

  15. Variable Selection for Nonparametric Gaussian Process Priors: Models and Computational Strategies.

    Science.gov (United States)

    Savitsky, Terrance; Vannucci, Marina; Sha, Naijun

    2011-02-01

    This paper presents a unified treatment of Gaussian process models that extends to data from the exponential dispersion family and to survival data. Our specific interest is in the analysis of data sets with predictors that have an a priori unknown form of possibly nonlinear associations to the response. The modeling approach we describe incorporates Gaussian processes in a generalized linear model framework to obtain a class of nonparametric regression models where the covariance matrix depends on the predictors. We consider, in particular, continuous, categorical and count responses. We also look into models that account for survival outcomes. We explore alternative covariance formulations for the Gaussian process prior and demonstrate the flexibility of the construction. Next, we focus on the important problem of selecting variables from the set of possible predictors and describe a general framework that employs mixture priors. We compare alternative MCMC strategies for posterior inference and achieve a computationally efficient and practical approach. We demonstrate performance on simulated and benchmark data sets.
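
As a toy illustration of the Gaussian process prior underlying such models, the sketch below computes the posterior mean of a GP regression with a squared-exponential covariance. This is a hypothetical minimal example; the paper's models are far more general, covering exponential-dispersion and survival responses:

```python
import numpy as np

def se_kernel(x1, x2, length=1.0, var=1.0):
    """Squared-exponential covariance between two 1-D input vectors."""
    d2 = (x1[:, None] - x2[None, :]) ** 2
    return var * np.exp(-0.5 * d2 / length ** 2)

def gp_posterior_mean(x_train, y_train, x_test, noise=1e-2):
    """Posterior mean of a zero-mean GP prior conditioned on noisy training data."""
    K = se_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_star = se_kernel(x_test, x_train)
    return K_star @ np.linalg.solve(K, y_train)
```

The covariance depends only on the inputs, which is the hook the paper exploits for variable selection: excluding a predictor from the kernel removes its influence on the fit.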

  16. Own and Others' Prior Experiences Influence Children's Imitation of Causal Acts.

    Science.gov (United States)

    Williamson, Rebecca A; Meltzoff, Andrew N

    2011-07-01

    Young children learn from others' examples, and they do so selectively. We examine whether the efficacy of prior experiences influences children's imitation. Thirty-six-month-olds had initial experience on a causal learning task either by performing the task themselves or by watching an adult perform it. The nature of the experience was manipulated such that the actor had either an easy or a difficult experience completing the task. Next, a second adult demonstrated an innovative technique for completing it. Children who had a difficult first-person experience, and those who had witnessed another person having difficulty, were significantly more likely to adopt and imitate the adult's innovation than those who had or witnessed an easy experience. Children who observed another were also more likely to imitate than were those who had the initial experience themselves. Imitation is influenced by prior experience, both when it is obtained through one's own hands-on motor manipulation and when it derives from observing the acts of others.

  17. Development and Demonstration of a Method to Evaluate Bio-Sampling Strategies Using Building Simulation and Sample Planning Software.

    Science.gov (United States)

    Dols, W Stuart; Persily, Andrew K; Morrow, Jayne B; Matzke, Brett D; Sego, Landon H; Nuffer, Lisa L; Pulsipher, Brent A

    2010-01-01

    In an effort to validate and demonstrate response and recovery sampling approaches and technologies, the U.S. Department of Homeland Security (DHS), along with several other agencies, has simulated a biothreat agent release within a facility at Idaho National Laboratory (INL) on two separate occasions, in the fall of 2007 and the fall of 2008. Because these events constitute only two realizations of many possible scenarios, increased understanding of sampling strategies can be obtained by virtually examining a wide variety of release and dispersion scenarios using computer simulations. This research effort demonstrates the use of two software tools, CONTAM, developed by the National Institute of Standards and Technology (NIST), and Visual Sample Plan (VSP), developed by Pacific Northwest National Laboratory (PNNL). The CONTAM modeling software was used to virtually contaminate a model of the INL test building under various release and dissemination scenarios as well as a range of building design and operation parameters. The results of these CONTAM simulations were then used to investigate the relevance and performance of various sampling strategies using VSP. One of the fundamental outcomes of this project was the demonstration of how CONTAM and VSP can be used together to effectively develop sampling plans to support the various stages of response to an airborne chemical, biological, radiological, or nuclear event. Following such an event (or prior to an event), incident details and the conceptual site model could be used to create an ensemble of CONTAM simulations which model contaminant dispersion within a building. These predictions could then be used to identify priority zones within the building, and sampling designs and strategies could be developed based on those zones.

  18. How often do surgeons obtain the critical view of safety during laparoscopic cholecystectomy?

    Science.gov (United States)

    Stefanidis, Dimitrios; Chintalapudi, Nikita; Anderson-Montoya, Brittany; Oommen, Bindhu; Tobben, Daniel; Pimentel, Manuel

    2017-01-01

    The reported incidence (0.16-1.5 %) of bile duct injury (BDI) during laparoscopic cholecystectomy (LC) is higher than during open cholecystectomy and has not decreased over time despite increasing experience with the procedure. The "critical view of safety" (CVS) technique may help to prevent BDI when certain criteria are met prior to division of any structures. This study aimed to evaluate the adherence of practicing surgeons to the CVS criteria during LC and the impact of a training intervention on CVS identification. LC procedures of general surgeons were video-recorded. De-identified recordings were reviewed by a blinded observer and rated on a 6-point scale using the previously published CVS criteria. A coaching program was conducted, and participating surgeons were re-assessed in the same manner. The observer assessed ten LC videos, each involving a different surgeon. The CVS was adequately achieved by two surgeons (20 %). The remaining eight surgeons (80 %) did not obtain adequate CVS prior to division of any structures, despite two surgeons dictating that they did; the mean score of this group was 1.75. After training, five participating surgeons (50 %) scored > 4, and the mean increased from 1.75 (baseline) to 3.75 (p < 0.05). The CVS criteria were not routinely used by the majority of participating surgeons. Further, one-fourth of those who claimed to obtain the CVS did so inadequately. All surgeons who participated in training showed improvement during their post-assessment. Our findings suggest that education of practicing surgeons in the application of the CVS during LC can result in increased implementation and quality of the CVS. Pending studies with larger samples, our findings may partly explain the sustained BDI incidence despite increased experience with LC. Our study also supports the value of direct observation of surgical practices and subsequent training for quality improvement.

  19. PET image reconstruction using multi-parametric anato-functional priors

    Science.gov (United States)

    Mehranian, Abolfazl; Belzunce, Martin A.; Niccolini, Flavia; Politis, Marios; Prieto, Claudia; Turkheimer, Federico; Hammers, Alexander; Reader, Andrew J.

    2017-08-01

    In this study, we investigate the application of multi-parametric anato-functional (MR-PET) priors for the maximum a posteriori (MAP) reconstruction of brain PET data in order to address the limitations of conventional anatomical priors in the presence of PET-MR mismatches. In addition to partial volume correction benefits, the suitability of these priors for reconstruction of low-count PET data is also introduced and demonstrated, compared to standard maximum-likelihood (ML) reconstruction of high-count data. The conventional local Tikhonov and total variation (TV) priors and current state-of-the-art anatomical priors, including the Kaipio and non-local Tikhonov priors with Bowsher and Gaussian similarity kernels, are investigated and presented in a unified framework. The Gaussian kernels are calculated using both voxel- and patch-based feature vectors. To cope with PET and MR mismatches, the Bowsher and Gaussian priors are extended to multi-parametric priors. In addition, we propose a modified joint Burg entropy prior that by definition exploits all parametric information in the MAP reconstruction of PET data. The performance of the priors was extensively evaluated using 3D simulations and two clinical brain datasets of [18F]florbetaben and [18F]FDG radiotracers. For the simulations, several anato-functional mismatches were intentionally introduced between the PET and MR images, and furthermore, for the FDG clinical dataset, two PET-unique active tumours were embedded in the PET data. Our simulation results showed that the joint Burg entropy prior far outperformed the conventional anatomical priors in terms of preserving PET-unique lesions, while still reconstructing functional boundaries with corresponding MR boundaries. In addition, the multi-parametric extension of the Gaussian and Bowsher priors led to enhanced preservation of edge and PET-unique features and also an improved bias-variance performance. In agreement with the simulation results, the clinical results...
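
The MAP principle behind all of these priors can be illustrated in its simplest case: a Gaussian likelihood with a local Tikhonov (quadratic) prior, for which the MAP estimate has a closed form. This toy linear model is only a sketch of the idea; the study itself uses the Poisson PET likelihood and far richer anatomical priors:

```python
import numpy as np

def map_tikhonov(A, b, lam):
    """MAP estimate x* = argmin ||Ax - b||^2 + lam * ||x||^2,
    i.e. Gaussian likelihood with a zero-mean Gaussian (Tikhonov) prior.
    Normal equations: (A^T A + lam * I) x = A^T b.
    """
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)
```

With lam = 0 this reduces to maximum likelihood; increasing lam shrinks the solution toward the prior mean, trading bias for variance much as in the bias-variance comparisons reported above.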

  20. Statistical Analysis Of Tank 19F Floor Sample Results

    International Nuclear Information System (INIS)

    Harris, S.

    2010-01-01

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 19F as per the statistical sampling plan developed by Harris and Shine. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis samples results to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current scrape sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 19F. The uncertainty is quantified in this report by an UCL95% on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).
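
The UCL95% described above is a one-sided upper confidence limit computed from the sample count, mean, and standard deviation. A sketch for the six-sample case, with the one-sided 95% t quantile for 5 degrees of freedom hard-coded (the numbers in the example are illustrative, not Tank 19F data):

```python
import math
from statistics import mean, stdev

# One-sided 95% Student's t quantiles, keyed by degrees of freedom (df = n - 1).
T_ONE_SIDED_95 = {5: 2.015}

def ucl95(samples):
    """One-sided upper 95% confidence limit: mean + t * s / sqrt(n)."""
    n = len(samples)
    return mean(samples) + T_ONE_SIDED_95[n - 1] * stdev(samples) / math.sqrt(n)
```

As the report notes, the limit tightens with more samples and loosens with larger scatter, since both n and the standard deviation enter the second term.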

  1. Antigenic fractions from Taenia crassiceps metacestodes obtained by hydrophobicity for the immunodiagnosis of active and inactive forms of neurocysticercosis in human cerebrospinal fluid samples.

    Science.gov (United States)

    da Silva, Gabriela B; Nunes, Daniela S; de Sousa, José Eduardo N; Gonçalves-Pires, Maria do R F; Levenhagen, Marcelo A; Costa-Cruz, Julia M

    2017-04-01

    This study aimed to evaluate the total extract of Taenia crassiceps metacestodes (TC) and its antigenic fractions obtained by Triton X-114 fractionation techniques, such as detergent (DC) and aqueous (AC), in the immunodiagnosis of human neurocysticercosis (NCC). Cerebrospinal fluid samples were divided into two groups: Group 1 (n=40), which was further divided into active (n=20) and inactive (n=20) NCC, and Group 2 (control group), which comprised 39 CSF samples from patients who had another neurological disorder, were suffering from other infectious diseases of the brain or had other parasitic infections. The total extracts and antigenic fractions were tested by enzyme-linked immunosorbent assay (ELISA) to detect human IgG anti-Taenia solium. T. crassiceps fractions (DC and AC) showed the same value of sensitivity (Se), 100%, for active and inactive NCC and a specificity (Sp) of 97.4%. The DS fraction obtained from T. solium showed 100% Se for active NCC, 95% Se for inactive NCC and a 92.3% Sp. The AS fraction obtained from T. solium showed 100% Se for both active and inactive NCC and a 94.9% Sp. There was a positive correlation between the total saline extract of T. crassiceps (TC) and T. solium (TS) and their fractions (DC, AC, DS and AS). Positive predictive value, negative predictive value, diagnostic efficiency and Youden index were calculated. In conclusion, these results demonstrated that detergent and aqueous fractions obtained from T. crassiceps metacestodes are important sources of specific antigens and are efficient for immunodiagnosis of active and inactive NCC. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
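
The sensitivity, specificity, predictive values, and Youden index reported above all derive from the 2x2 table of test results against disease status. A sketch, with counts reconstructed from the reported percentages for the DC/AC fractions (40 NCC cases all detected, 39 controls with one false positive):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard diagnostic accuracy measures from a 2x2 confusion table."""
    se = tp / (tp + fn)   # sensitivity
    sp = tn / (tn + fp)   # specificity
    return {
        "Se": se,
        "Sp": sp,
        "PPV": tp / (tp + fp),    # positive predictive value
        "NPV": tn / (tn + fn),    # negative predictive value
        "Youden": se + sp - 1.0,  # Youden index J
    }

m = diagnostic_metrics(tp=40, fp=1, tn=38, fn=0)
# Se = 1.0 (100%) and Sp = 38/39 ≈ 0.974, matching the reported values.
```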

  2. Abdominal multi-organ segmentation from CT images using conditional shape-location and unsupervised intensity priors.

    Science.gov (United States)

    Okada, Toshiyuki; Linguraru, Marius George; Hori, Masatoshi; Summers, Ronald M; Tomiyama, Noriyuki; Sato, Yoshinobu

    2015-12-01

    This paper addresses the automated segmentation of multiple organs in upper abdominal computed tomography (CT) data. The aim of our study is to develop methods to effectively construct the conditional priors and use their prediction power for more accurate segmentation as well as easy adaptation to various imaging conditions in CT images, as observed in clinical practice. We propose a general framework of multi-organ segmentation which effectively incorporates interrelations among multiple organs and easily adapts to various imaging conditions without the need for supervised intensity information. The features of the framework are as follows: (1) A method for modeling conditional shape and location (shape-location) priors, which we call prediction-based priors, is developed to derive accurate priors specific to each subject, which enables the estimation of intensity priors without the need for supervised intensity information. (2) Organ correlation graph is introduced, which defines how the conditional priors are constructed and segmentation processes of multiple organs are executed. In our framework, predictor organs, whose segmentation is sufficiently accurate by using conventional single-organ segmentation methods, are pre-segmented, and the remaining organs are hierarchically segmented using conditional shape-location priors. The proposed framework was evaluated through the segmentation of eight abdominal organs (liver, spleen, left and right kidneys, pancreas, gallbladder, aorta, and inferior vena cava) from 134 CT data from 86 patients obtained under six imaging conditions at two hospitals. The experimental results show the effectiveness of the proposed prediction-based priors and the applicability to various imaging conditions without the need for supervised intensity information. Average Dice coefficients for the liver, spleen, and kidneys were more than 92%, and were around 73% and 67% for the pancreas and gallbladder, respectively.
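
The Dice coefficients quoted per organ measure voxel overlap between the automated segmentation and the reference mask. A minimal sketch of the metric:

```python
import numpy as np

def dice_coefficient(seg, ref):
    """Dice overlap 2*|A∩B| / (|A| + |B|) between two binary masks."""
    seg = np.asarray(seg, dtype=bool)
    ref = np.asarray(ref, dtype=bool)
    intersection = np.logical_and(seg, ref).sum()
    return 2.0 * intersection / (seg.sum() + ref.sum())
```

A value of 1.0 means perfect agreement; the ~67% figure for the gallbladder reflects how small, variable organs overlap poorly even when mostly found.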

  3. Image-guided percutaneous disc sampling: impact of antecedent antibiotics on yield

    International Nuclear Information System (INIS)

    Agarwal, V.; Wo, S.; Lagemann, G.M.; Tsay, J.; Delfyett, W.T.

    2016-01-01

    Aim: To evaluate the effect of antecedent antimicrobial therapy on diagnostic yield from percutaneous image-guided disc-space sampling. Materials and methods: A retrospective review of the electronic health records of all patients who underwent image-guided percutaneous sampling procedures for suspected discitis/osteomyelitis over a 5-year period was performed. One hundred and twenty-four patients were identified. Demographics, medical history, and culture results were recorded as well as duration of presenting symptoms and whether antecedent antibiotic therapy had been administered. Results: Of the 124 patients identified who underwent image-guided percutaneous disc-space sampling, 73 had received antecedent antibiotic treatment compared with 51 who had not. The overall positive culture rate for the present study population was 24% (n=30). The positive culture rate from patients previously on antibiotics was 21% (n=15) compared with 29% (n=15) for patients who had not received prior antibiotic treatment, which is not statistically significant (p=0.26). Eighty-six percent (n=63) of patients who had antecedent antibiotics received treatment for 4 or more days prior to their procedure, whereas 14% (n=10) received treatment for 1–3 days prior to their procedure. The difference in culture positivity rate between these two groups was not statistically significant (p=0.43). Culture results necessitated a change in antibiotic therapy in a third of the patients who had received antecedent antibiotic therapy. Conclusion: Antecedent antibiotic therapy, regardless of duration, did not result in significantly diminished diagnostic yield from percutaneous sampling for suspected discitis/osteomyelitis. The present results suggest that percutaneous biopsy may nonetheless yield positive diagnostic information despite prior antimicrobial therapy. If the diagnostic information may impact choice of therapeutic regimen, percutaneous biopsy should still be considered in cases where
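
The reported comparison (15/73 positive cultures with antecedent antibiotics vs. 15/51 without, p = 0.26) can be approximately reproduced with a pooled two-proportion z-test. This is a sketch using the normal approximation; the exact test the authors used is not specified in the abstract:

```python
import math

def two_proportion_pvalue(x1, n1, x2, n2):
    """Two-sided p-value for H0: p1 = p2, pooled z-test (normal approximation)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return math.erfc(abs(z) / math.sqrt(2))  # equals 2 * (1 - Phi(|z|))

p = two_proportion_pvalue(15, 73, 15, 51)
# ≈ 0.26: consistent with the non-significant difference reported above.
```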

  4. The influence of prior knowledge on the retrieval-directed function of note taking in prior knowledge activation

    NARCIS (Netherlands)

    Wetzels, Sandra; Kester, Liesbeth; Van Merriënboer, Jeroen; Broers, Nick

    2010-01-01

    Wetzels, S. A. J., Kester, L., Van Merriënboer, J. J. G., & Broers, N. J. (2011). The influence of prior knowledge on the retrieval-directed function of note taking in prior knowledge activation. British Journal of Educational Psychology, 81(2), 274-291. doi: 10.1348/000709910X517425

  5. Comparative Evaluation of Three Homogenization Methods for Isolating Middle East Respiratory Syndrome Coronavirus Nucleic Acids From Sputum Samples for Real-Time Reverse Transcription PCR.

    Science.gov (United States)

    Sung, Heungsup; Yong, Dongeun; Ki, Chang Seok; Kim, Jae Seok; Seong, Moon Woo; Lee, Hyukmin; Kim, Mi Na

    2016-09-01

    Real-time reverse transcription PCR (rRT-PCR) of sputum samples is commonly used to diagnose Middle East respiratory syndrome coronavirus (MERS-CoV) infection. Owing to the difficulty of extracting RNA from sputum containing mucus, sputum homogenization is desirable prior to nucleic acid isolation. We determined optimal homogenization methods for isolating viral nucleic acids from sputum. We evaluated the following three sputum-homogenization methods: proteinase K and DNase I (PK-DNase) treatment, phosphate-buffered saline (PBS) treatment, and N-acetyl-L-cysteine and sodium citrate (NALC) treatment. Sputum samples were spiked with inactivated MERS-CoV culture isolates. RNA was extracted from pretreated, spiked samples using the easyMAG system (bioMérieux, France). Extracted RNAs were then subjected to rRT-PCR for MERS-CoV diagnosis (DiaPlex Q MERS-coronavirus, SolGent, Korea). While analyzing 15 spiked sputum samples prepared in technical duplicate, false-negative results were obtained with five (16.7%) and four samples (13.3%), respectively, by using the PBS and NALC methods. The range of threshold cycle (Ct) values observed when detecting upE in sputum samples was 31.1-35.4 with the PK-DNase method, 34.7-39.0 with the PBS method, and 33.9-38.6 with the NALC method. Compared with the controls, which were prepared by adding a one-tenth volume of 1:1,000 diluted viral culture to PBS solution, the ranges of Ct values obtained by the PBS and NALC methods differed significantly from the mean control Ct of 33.2 (both P<0.05). These results indicate that the PK-DNase method is the most suitable of the three for homogenizing sputum samples prior to RNA extraction.

  6. Precursors prior to type IIn supernova explosions are common: Precursor rates, properties, and correlations

    Energy Technology Data Exchange (ETDEWEB)

    Ofek, Eran O.; Steinbok, Aviram; Arcavi, Iair; Gal-Yam, Avishay; Tal, David; Ben-Ami, Sagi; Yaron, Ofer [Benoziyo Center for Astrophysics, Weizmann Institute of Science, 76100 Rehovot (Israel); Sullivan, Mark [School of Physics and Astronomy, University of Southampton, Southampton SO17 1BJ (United Kingdom); Shaviv, Nir J. [Racah Institute of Physics, The Hebrew University, 91904 Jerusalem (Israel); Kulkarni, Shrinivas R. [Cahill Center for Astronomy and Astrophysics, California Institute of Technology, Pasadena, CA 91125 (United States); Nugent, Peter E. [Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720 (United States); Kasliwal, Mansi M. [Observatories of the Carnegie Institution for Science, 813 Santa Barbara Street, Pasadena, CA 91101 (United States); Cenko, S. Bradley [Astrophysics Science Division, NASA/Goddard Space Flight Center, Mail Code 661, Greenbelt, MD 20771 (United States); Laher, Russ; Surace, Jason [Spitzer Science Center, California Institute of Technology, M/S 314-6, Pasadena, CA 91125 (United States); Bloom, Joshua S.; Filippenko, Alexei V. [Department of Astronomy, University of California, Berkeley, CA 94720-3411 (United States); Silverman, Jeffrey M. [Department of Astronomy, University of Texas, Austin, TX 78712 (United States)

    2014-07-10

    There is a growing number of Type IIn supernovae (SNe) which present an outburst prior to their presumably final explosion. These precursors may affect the SN display, and are likely related to poorly charted phenomena in the final stages of stellar evolution. By coadding Palomar Transient Factory (PTF) images taken prior to the explosion, here we present a search for precursors in a sample of 16 Type IIn SNe. We find five SNe IIn that likely have at least one possible precursor event (PTF 10bjb, SN 2010mc, PTF 10weh, SN 2011ht, and PTF 12cxj), three of which are reported here for the first time. For each SN we calculate the control time. We find that precursor events among SNe IIn are common: at the one-sided 99% confidence level, >50% of SNe IIn have at least one pre-explosion outburst that is brighter than 3 × 10⁷ L☉ taking place up to 1/3 yr prior to the SN explosion. The average rate of such precursor events during the year prior to the SN explosion is likely ≳1 yr⁻¹, and fainter precursors are possibly even more common. Ignoring the two weakest precursors in our sample, the precursor rate we find is still on the order of one per year. We also find possible correlations between the integrated luminosity of the precursor and the SN total radiated energy, peak luminosity, and rise time. These correlations are expected if the precursors are mass-ejection events, and the early-time light curve of these SNe is powered by interaction of the SN shock and ejecta with optically thick circumstellar material.

  7. A combined method for DNA analysis and radiocarbon dating from a single sample.

    Science.gov (United States)

    Korlević, Petra; Talamo, Sahra; Meyer, Matthias

    2018-03-07

    Current protocols for ancient DNA and radiocarbon analysis of ancient bones and teeth call for multiple destructive samplings of a given specimen, thereby increasing the extent of undesirable damage to precious archaeological material. Here we present a method that makes it possible to obtain both ancient DNA sequences and radiocarbon dates from the same sample material. This is achieved by releasing DNA from the bone matrix through incubation with either EDTA or phosphate buffer prior to complete demineralization and collagen extraction utilizing the acid-base-acid-gelatinization and ultrafiltration procedure established in most radiocarbon dating laboratories. Using a set of 12 bones of different ages and preservation conditions we demonstrate that on average 89% of the DNA can be released from sample powder with minimal, or 38% without any, detectable collagen loss. We also detect no skews in radiocarbon dates compared to untreated samples. Given the different material demands for radiocarbon dating (500 mg of bone/dentine) and DNA analysis (10-100 mg), combined DNA and collagen extraction not only streamlines the sampling process but also drastically increases the amount of DNA that can be recovered from limited sample material.

  8. UNLABELED SELECTED SAMPLES IN FEATURE EXTRACTION FOR CLASSIFICATION OF HYPERSPECTRAL IMAGES WITH LIMITED TRAINING SAMPLES

    Directory of Open Access Journals (Sweden)

    A. Kianisarkaleh

    2015-12-01

    Feature extraction plays a key role in the classification of hyperspectral images. By using unlabeled samples, which are often available in practically unlimited quantities, unsupervised and semisupervised feature extraction methods show better performance when only a limited number of training samples exists. This paper illustrates the importance of selecting appropriate unlabeled samples for use in feature extraction methods, and proposes a new method for unlabeled sample selection using spectral and spatial information. The proposed method has four parts: PCA, prior classification, posterior classification, and sample selection. As the hyperspectral image passes through these parts, the selected unlabeled samples can be used in arbitrary feature extraction methods. The effectiveness of the proposed unlabeled sample selection in unsupervised and semisupervised feature extraction is demonstrated using two real hyperspectral datasets. Results show that, through selecting appropriate unlabeled samples, the proposed method can improve the performance of feature extraction methods and increase classification accuracy.
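
The selection step in such a pipeline can be illustrated with a simple spectral criterion: once a prior classification yields class means, keep the unlabeled pixels closest to their nearest class mean. This sketch is a hypothetical simplification, not the authors' four-part method:

```python
import numpy as np

def select_unlabeled(X_unlabeled, class_means, k):
    """Indices of the k unlabeled samples nearest to any class mean (spectral criterion)."""
    # Pairwise distances, shape (n_samples, n_classes).
    d = np.linalg.norm(X_unlabeled[:, None, :] - class_means[None, :, :], axis=2)
    return np.argsort(d.min(axis=1))[:k]
```

Samples far from every class mean (likely mixed or noisy pixels) are excluded, which is the intuition behind preferring "appropriate" unlabeled samples over arbitrary ones.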

  9. Roseomonas tokyonensis sp. nov. isolated from a biofilm sample obtained from a cooling tower in Tokyo, Japan.

    Science.gov (United States)

    Furuhata, Katsunori; Ishizaki, Naoto; Edagawa, Akiko; Fukuyama, Masafumi

    2013-01-01

    Strain K-20(T), a Gram-negative, nonmotile, nonspore-forming and strictly aerobic coccobacillus, which produces a pale pink pigment (R2A agar medium, 30℃, seven days) was isolated from a sample of biofilm obtained from a cooling tower in Tokyo, Japan. A phylogenetic analysis of the 16S rRNA partial gene sequences (1,439 bp) showed that the strain (accession number: AB297501) was related to Roseomonas frigidaquae CW67(T) and Roseomonas stagni HS-69(T) with 97.4% and 96.9% sequence similarity, respectively. Strain K-20(T) formed a distinct cluster with Roseomonas frigidaquae CW67(T) in the phylogenetic tree at a high bootstrap value (93%); however, distance was recognized between the strains. In addition, the DNA-DNA hybridization level between strain K-20(T) and Roseomonas frigidaquae JCM 15073(T) was 33%. The taxonomic data indicate that K-20(T) (=JCM 14634(T) =KCTC 32152(T)) should be classified in the genus Roseomonas as the type strain of a novel species, Roseomonas tokyonensis sp. nov.

  10. PDS Archive Release of Apollo 11, Apollo 12, and Apollo 17 Lunar Rock Sample Images

    Science.gov (United States)

    Garcia, P. A.; Stefanov, W. L.; Lofgren, G. E.; Todd, N. S.; Gaddis, L. R.

    2013-01-01

    Scientists at the Johnson Space Center (JSC) Lunar Sample Laboratory, Information Resources Directorate, and Image Science & Analysis Laboratory have been working to digitize (scan) the original film negatives of Apollo Lunar Rock Sample photographs [1, 2]. The rock samples, and associated regolith and lunar core samples, were obtained during the Apollo 11, 12, 14, 15, 16 and 17 missions. The images allow scientists to view the individual rock samples in their original or subdivided state prior to requesting physical samples for their research. In cases where access to the actual physical samples is not practical, the images provide an alternate mechanism for study of the subject samples. As the negatives are being scanned, they have been formatted and documented for permanent archive in the NASA Planetary Data System (PDS). The Astromaterials Research and Exploration Science Directorate (which includes the Lunar Sample Laboratory and Image Science & Analysis Laboratory) at JSC is working collaboratively with the Imaging Node of the PDS on the archiving of these valuable data. The PDS Imaging Node is now pleased to announce the release of the image archives for Apollo missions 11, 12, and 17.

  11. Application of pulsed OSL to polymineral fine-grained samples

    International Nuclear Information System (INIS)

    Feathers, James K.; Casson, M. Aksel; Schmidt, Amanda Henck; Chithambo, Makaiko L.

    2012-01-01

    Pulsed OSL is applied to nine fine-grained sediment samples from Sichuan province, China, using stimulating pulses of 10 μs on and 240 μs off, with an infrared exposure prior to each OSL measurement. Comparison of fading rates between pulsed and non-pulsed signals, the latter also obtained with a preceding IR exposure, shows that fading is significant for mainly the non-pulsed signals. Presence of a pulsed IRSL and the magnitudes of b-value to correct for lower alpha efficiency suggest that pulsing does not fully remove a significant feldspar signal, only a fading component. Comparison with ages of quartz extracts shows that pulsed OSL ages are consistent, while CW-OSL ages are slightly older and CW-IRSL ages are much older. The older ages suggest a less well-bleached feldspar component.

  12. The influence of prior rape on the psychological and physical health functioning of older adults.

    Science.gov (United States)

    Sachs-Ericsson, Natalie; Kendall-Tackett, Kathleen A; Sheffler, Julia; Arce, Darleine; Rushing, Nicole C; Corsentino, Elizabeth

    2014-01-01

    Older adults who have experienced traumatic events earlier in life may be especially vulnerable to additional challenges associated with aging. In a cross-sectional study of older females, the present study examines whether a history of rape is associated with current psychological and health problems. This study used existing data from the female respondents (N = 1228) in the National Social Life, Health, and Aging Project (NSHAP), a national probability sample of adults between the ages of 57 and 85 interviewed in their homes. It was determined whether or not the participant experienced forced sexual contact since the age of 18. Measures of psychological health (e.g., scales of depression, anxiety, and loneliness), the presence or absence of a number of serious health problems, and a one-item measure of self-esteem were obtained. Adult rape occurred in 7% of the sample. On average, 36 years had elapsed since the rape had occurred. Using structural equation modeling (SEM), rape was associated with lower self-esteem, psychological, and physical health functioning. Self-esteem partially mediated the association between rape and psychological functioning, but not health functioning. These associations were significant even after controlling for participant characteristics and risky health behaviors. Mechanisms linking prior rape to psychological and health problems in older age are discussed, as well as treatment recommendations for symptomatic older adults.

  13. Purification and concentration of lead samples in biological monitoring of occupational exposures

    Directory of Open Access Journals (Sweden)

    A Rahimi-Froushani

    2006-04-01

    Full Text Available Background and Aims: Lead is an important environmental constituent widely used in industrial processes for the production of synthetic materials, and it can therefore be released into the environment, causing public exposure, especially around industrial residential areas. For the evaluation of human exposure to the trace toxic metal Pb(II), environmental and biological monitoring are essential processes, in which preparation of samples is one of the most time-consuming and error-prone steps prior to analysis. The use of solid-phase extraction (SPE) has grown; it is a fertile technique of sample preparation, as it provides better results than liquid-liquid extraction (LLE). The aim of this study was to investigate factors influencing sample pretreatment for trace analysis of lead in biological samples for the evaluation of occupational exposure. Methods: To evaluate factors influencing the quantitative analysis of lead, solid-phase extraction using mini-columns filled with XAD-4 resin was optimized with regard to sample pH, ligand concentration, loading flow rate, elution solvent, sample volume (up to 500 ml), elution volume, amount of resin, and sample matrix interferences. Results: Lead was retained on the solid sorbent and eluted, followed by simple determination of the analytes by flame atomic absorption spectrometry. Recoveries of the metal ion were more than 92%. The amount of analyte detected after simultaneous preconcentration was essentially in agreement with the added amounts. The optimized procedure was also validated with three different pools of spiked urine samples and showed good reproducibility over six consecutive days as well as six within-day experiments. The developed method promises to be applicable for the evaluation of other metal ions present in different environmental and occupational samples, as suitable results were obtained for relative standard deviation (less than 10%). Conclusion: This optimized method can be considered to be

  14. An improved sampling method of complex network

    Science.gov (United States)

    Gao, Qi; Ding, Xintong; Pan, Feng; Li, Weixing

    2014-12-01

    Sampling subnets is an important topic of complex network research, and the sampling method influences the structure and characteristics of the resulting subnet. Random multiple snowball with Cohen (RMSC) process sampling, which combines the advantages of random sampling and snowball sampling, is proposed in this paper. It is able to explore global information and discover local structure at the same time. Experiments indicate that this novel sampling method preserves the similarity between the sampled subnet and the original network in degree distribution, connectivity rate and average shortest path. The method is applicable when prior knowledge about the degree distribution of the original network is insufficient.
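The snowball step at the core of such methods can be sketched in a few lines. This is a minimal illustration of plain snowball sampling on an adjacency-list graph, not the full RMSC procedure from the paper (which additionally combines it with random sampling); all names and the toy graph are illustrative.

```python
import random
from collections import deque

def snowball_sample(adj, seeds, k, max_nodes):
    """Plain snowball sampling: starting from seed nodes, repeatedly add up
    to k random unsampled neighbours of each frontier node until max_nodes
    nodes have been collected."""
    sampled = set(seeds)
    frontier = deque(seeds)
    while frontier and len(sampled) < max_nodes:
        node = frontier.popleft()
        candidates = [n for n in adj[node] if n not in sampled]
        for n in random.sample(candidates, min(k, len(candidates))):
            sampled.add(n)
            frontier.append(n)
            if len(sampled) >= max_nodes:
                break
    return sampled

# Toy graph: a 6-node cycle
adj = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
random.seed(0)
print(sorted(snowball_sample(adj, seeds=[0], k=2, max_nodes=4)))
```

Comparing degree distributions of the sampled subnet and the original graph, as the experiments in the paper do, would then quantify how well the sample preserves structure.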

  15. Development and application of a micro-digestion device for biological samples

    International Nuclear Information System (INIS)

    Bohlen, A. von; Klockenkaemper, R.; Messerschmidt, J.; Alt, F.

    2000-01-01

    The analytical characterization of small amounts of sample is of increasing importance for various research projects in biology, biochemistry and medicine. Reliable determinations of minor and trace elements in microsamples can be performed by total reflection X-ray fluorescence analysis (TXRF). This microanalytical method is suitable for direct multielement analysis of a tiny amount of a liquid or solid sample. Instead of direct analysis, however, a complete digestion or mineralization of the sample material prior to analysis can be recommended: it allows a favorable presentation, preconcentration and/or homogenization of the material, and in particular an accurate quantification. Unfortunately, commercially available digestion devices are optimized for sample amounts of 50 to 400 mg. For smaller amounts, a micro-digestion device was constructed and adapted to commercially available high-pressure ashing equipment. Digestions of very different microsamples between a few μg and a few mg were carried out, followed by quantitative determination of many elements. In addition, different Standard Reference Materials (SRMs) were analyzed. The homogeneity of these materials could be investigated by comparing the results found for microsamples with those obtained for 200 mg samples, the latter after digestion in a conventional device. (author)

  16. Recognition of prior learning candidates’ experiences in a nurse training programme

    Directory of Open Access Journals (Sweden)

    Nomathemba B. Mothokoa

    2018-06-01

    Full Text Available Recognition of prior learning (RPL in South Africa is critical to the development of an equitable education and training system. Historically, nursing has been known as one of the professions that provides access to the training and education of marginalised groups who have minimal access to formal education. The advent of implementing RPL in nursing has, however, not been without challenges. The purpose of this study was to explore and describe the experiences of RPL nursing candidates related to a 4-year comprehensive nursing training programme at a nursing education institution in Gauteng. An exploratory, descriptive and contextual qualitative research design was undertaken. The research sample comprised 13 purposefully selected participants. Face-to-face individual interviews, using open-ended questions, were used to collect data, which were analysed using Tesch’s approach. Recognition of prior learning candidates experienced a number of realities as adult learners. On a positive note, their prior knowledge and experience supported them in their learning endeavours. Participants, however, experienced a number of challenges on personal, interpersonal and socialisation, and educational levels. It is important that opportunities are created to support and assist RPL candidates to complete their nursing training. This support structure, among others, should include the provision of RPL-related information, giving appropriate advice, coaching and mentoring, effective administration services, integrated curriculum design, and a variety of formative and summative assessment practices.

  17. Elicitation of expert prior opinion: application to the MYPAN trial in childhood polyarteritis nodosa.

    Directory of Open Access Journals (Sweden)

    Lisa V Hampson

    Full Text Available Definitive sample sizes for clinical trials in rare diseases are usually infeasible. Bayesian methodology can be used to maximise what is learnt from clinical trials in these circumstances. We elicited expert prior opinion for a future Bayesian randomised controlled trial for a rare inflammatory paediatric disease, polyarteritis nodosa (MYPAN, Mycophenolate mofetil for polyarteritis nodosa). A Bayesian prior elicitation meeting was convened. Opinion was sought on the probability that a patient in the MYPAN trial treated with cyclophosphamide would achieve disease remission within 6 months, and on the relative efficacies of mycophenolate mofetil and cyclophosphamide. Expert opinion was combined with previously unseen data from a recently completed randomised controlled trial in ANCA-associated vasculitis. A pan-European group of fifteen experts participated in the elicitation meeting. Consensus expert prior opinion was that the most likely rates of disease remission within 6 months on cyclophosphamide or mycophenolate mofetil were 74% and 71%, respectively. This prior opinion will now be taken forward and will be modified to formulate a Bayesian posterior opinion once the MYPAN trial data from 40 patients randomised 1:1 to either CYC or MMF become available. We suggest that the methodological template we propose could be applied to trial design for other rare diseases.
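One standard way to encode an elicited remission rate as a prior and later update it with trial data is a conjugate Beta-binomial model. The sketch below is illustrative only: the effective prior sample size and the trial counts are hypothetical, not the actual MYPAN elicitation or data.

```python
def beta_from_mean_n(mean, n_eff):
    """Beta(a, b) parameters with a given mean and effective sample size
    (number of 'pseudo-patients' the prior is worth)."""
    return mean * n_eff, (1 - mean) * n_eff

# Encode the elicited 74% remission rate on cyclophosphamide as a prior
# worth ~10 pseudo-patients (the effective size here is an assumption).
a, b = beta_from_mean_n(0.74, 10)

# Hypothetical trial outcomes for 20 patients (NOT the MYPAN results):
successes, failures = 14, 6
post_a, post_b = a + successes, b + failures
post_mean = post_a / (post_a + post_b)
print(f"posterior mean remission rate: {post_mean:.3f}")
```

The posterior mean is a compromise between the expert consensus and the observed data, weighted by the prior's effective sample size relative to the trial size.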

  18. Estimation variance bounds of importance sampling simulations in digital communication systems

    Science.gov (United States)

    Lu, D.; Yao, K.

    1991-01-01

    In practical applications of importance sampling (IS) simulation, two basic problems are encountered, that of determining the estimation variance and that of evaluating the proper IS parameters needed in the simulations. The authors derive new upper and lower bounds on the estimation variance which are applicable to IS techniques. The upper bound is simple to evaluate and may be minimized by the proper selection of the IS parameter. Thus, lower and upper bounds on the improvement ratio of various IS techniques relative to the direct Monte Carlo simulation are also available. These bounds are shown to be useful and computationally simple to obtain. Based on the proposed technique, one can readily find practical suboptimum IS parameters. Numerical results indicate that these bounding techniques are useful for IS simulations of linear and nonlinear communication systems with intersymbol interference in which bit error rate and IS estimation variances cannot be obtained readily using prior techniques.
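The variance issue at stake can be illustrated with a toy IS estimator of a Gaussian tail probability (a stand-in for a bit error rate): sampling is shifted toward the rare region and each hit is reweighted by the likelihood ratio. This is a generic sketch, not the bounding technique derived by the authors; the shift parameter plays the role of the IS parameter whose choice the bounds are meant to guide.

```python
import math
import random

def is_tail_estimate(t, shift, n, seed=0):
    """Importance-sampling estimate of P(Z > t) for Z ~ N(0, 1), drawing
    from the shifted density N(shift, 1) and reweighting each sample by
    the likelihood ratio phi(z) / phi(z - shift).  The shift controls the
    estimator variance: shift ~= t concentrates samples near the rare event."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        z = rng.gauss(shift, 1.0)
        if z > t:
            w = math.exp(-z * z / 2 + (z - shift) ** 2 / 2)
            total += w
    return total / n

est = is_tail_estimate(t=4.0, shift=4.0, n=20000)
exact = 0.5 * math.erfc(4.0 / math.sqrt(2))   # true P(Z > 4)
print(f"IS estimate: {est:.3e}, exact: {exact:.3e}")
```

A direct Monte Carlo estimate of the same probability would need on the order of millions of samples to see even a handful of hits, which is the improvement ratio the bounds in the paper quantify.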

  19. Boat sampling and inservice inspections of the reactor pressure vessel weld No. 4 at Kozloduy NPP, Unit 1

    International Nuclear Information System (INIS)

    Cvitanovic, M.; Oreb, E.; Mudronja, V.; Zado, V.; Bezlaj, H.; Petkov, M.; Gledatchev, J.; Radomirski, S.; Ribarska, T.; Kroes, B.

    1999-01-01

    The paper deals with reactor pressure vessel (RPV) boat sampling performed at Kozloduy Nuclear Power Plant, Unit 1, from August to November 1996. Kozloduy NPP, Unit 1 has no reactor vessel material surveillance program, and changes in material fracture toughness resulting from fast neutron irradiation cannot be monitored without removal of vessel material. Therefore, the main objective of the project was to cut samples from the RPV wall in order to obtain samples of the RPV material for further structural analyses. The most critical area, i.e. weld No. 4, was chosen as the location for boat sampling. A replication technique was applied to obtain a precise determination of the weld geometry, necessary for positioning the cutting tool prior to boat sampling, and of the divot depth left after boat sampling and grinding of the sample sites. Boat sampling was performed by electrical discharge machining (EDM). Grinding of the sample sites was implemented to minimize stress concentration effects, to eliminate surface irregularities resulting from the EDM process, and to remove the recast layer on the surface of the EDM cut. Ultrasonic, liquid penetrant, magnetic particle and visual examinations were performed after grinding to establish baseline data in the boat sampling area. The project preparation activities, apart from the EDM process, and the site organization lead were entrusted to INETEC. The activities were funded by the PHARE program of the European Commission. (orig.)

  20. Offending prior to first psychiatric contact

    DEFF Research Database (Denmark)

    Stevens, H; Agerbo, E; Dean, K

    2012-01-01

    There is a well-established association between psychotic disorders and subsequent offending, but the extent to which those who develop psychosis might have a prior history of offending is less clear. Little is known about whether the association between illness and offending exists in non-psychotic disorders. The aim of this study was to determine whether the association between mental disorder and offending is present prior to illness onset in psychotic and non-psychotic disorders.

  1. Optical waveguide lightmode spectroscopy technique-based immunosensor development for aflatoxin B1 determination in spice paprika samples.

    Science.gov (United States)

    Majer-Baranyi, Krisztina; Zalán, Zsolt; Mörtl, Mária; Juracsek, Judit; Szendrő, István; Székács, András; Adányi, Nóra

    2016-11-15

    The optical waveguide lightmode spectroscopy (OWLS) technique has been applied to label-free detection of aflatoxin B1 in a competitive immunoassay format, with the aim of comparing the analytical performance of the developed OWLS immunosensor with HPLC and enzyme-linked immunosorbent assay (ELISA) methods for the detection of aflatoxin in a spice paprika matrix. We also assessed the applicability of the QuEChERS method prior to ELISA measurements, and the results were compared to those obtained by traditional solvent extraction followed by immunoaffinity clean-up. The AFB1 content of sixty commercial spice paprika samples from different countries was measured with the developed and optimized OWLS immunosensor. Comparing the results from the indirect immunosensor to those obtained by HPLC or ELISA provided excellent correlation (with regression coefficients above 0.94), indicating that the competitive OWLS immunosensor has potential for quick determination of aflatoxin B1 in paprika samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Terminology for pregnancy loss prior to viability

    DEFF Research Database (Denmark)

    Kolte, A M; Bernardi, L A; Christiansen, O B

    2015-01-01

    Pregnancy loss prior to viability is common and research in the field is extensive. Unfortunately, terminology in the literature is inconsistent. The lack of consensus regarding nomenclature and classification of pregnancy loss prior to viability makes it difficult to compare study results from different centres. In our opinion, terminology and definitions should be based on clinical findings, and when possible, transvaginal ultrasound. With this Early Pregnancy Consensus Statement, it is our goal to provide clear and consistent terminology for pregnancy loss prior to viability.

  3. Detection of architectural distortion in prior screening mammograms using Gabor filters, phase portraits, fractal dimension, and texture analysis

    International Nuclear Information System (INIS)

    Rangayyan, Rangaraj M.; Prajna, Shormistha; Ayres, Fabio J.; Desautels, J.E.L.

    2008-01-01

    Mammography is a widely used screening tool for the early detection of breast cancer. One of the commonly missed signs of breast cancer is architectural distortion. The purpose of this study is to explore the application of fractal analysis and texture measures for the detection of architectural distortion in screening mammograms taken prior to the detection of breast cancer. A method based on Gabor filters and phase portrait analysis was used to detect initial candidates for sites of architectural distortion. A total of 386 regions of interest (ROIs) were automatically obtained from 14 "prior mammograms", including 21 ROIs related to architectural distortion. From the corresponding set of 14 "detection mammograms", 398 ROIs were obtained, including 18 related to breast cancer. For each ROI, the fractal dimension and Haralick's texture features were computed. The fractal dimension of the ROIs was calculated using the circular average power spectrum technique. The average fractal dimension of the normal (false-positive) ROIs was significantly higher than that of the ROIs with architectural distortion (p = 0.006). For the "prior mammograms", the best receiver operating characteristics (ROC) performance achieved, in terms of the area under the ROC curve, was 0.80 with a Bayesian classifier using four features including fractal dimension, entropy, sum entropy, and inverse difference moment. Analysis of the performance of the methods with free-response receiver operating characteristics indicated a sensitivity of 0.79 at 8.4 false positives per image in the detection of sites of architectural distortion in the "prior mammograms". Fractal dimension offers a promising way to detect the presence of architectural distortion in prior mammograms. (orig.)
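The circular (radially) averaged power spectrum estimate of fractal dimension can be sketched as follows: average the 2-D power spectrum over rings of constant spatial frequency, fit the log-log slope β, and convert via the commonly used relation FD = (8 − β)/2 for image surfaces. The synthetic test image, its 128×128 size, and the fitting range are assumptions for illustration, not the paper's pipeline.

```python
import numpy as np

def radial_power_spectrum(img):
    """Radially averaged power spectrum of a square image: bin the 2-D
    power spectrum by integer distance from the DC component."""
    F = np.fft.fftshift(np.fft.fft2(img))
    power = np.abs(F) ** 2
    n = img.shape[0]
    y, x = np.indices(img.shape)
    r = np.hypot(x - n // 2, y - n // 2).astype(int)
    counts = np.bincount(r.ravel())
    return np.bincount(r.ravel(), weights=power.ravel()) / counts

def fractal_dimension(img):
    """Estimate FD from the spectral exponent beta of the radially
    averaged power spectrum, using FD = (8 - beta) / 2 for images."""
    radial = radial_power_spectrum(img)
    freqs = np.arange(1, len(radial) // 2)   # skip DC, stay below Nyquist
    beta, _ = np.polyfit(np.log(freqs), -np.log(radial[freqs]), 1)
    return (8 - beta) / 2

# Synthesize a test image with a known 1/f^beta power spectrum
# (beta = 3, so the expected FD is about 2.5).
rng = np.random.default_rng(0)
n = 128
f = np.hypot(np.fft.fftfreq(n)[:, None], np.fft.fftfreq(n)[None, :])
f[0, 0] = 1.0                                   # avoid division by zero at DC
spectrum = rng.standard_normal((n, n)) / f ** (3 / 2)  # amplitude ~ f^{-beta/2}
img = np.real(np.fft.ifft2(spectrum))
print(f"estimated FD: {fractal_dimension(img):.2f}")
```

On a real ROI, a lower estimated FD would flag a candidate architectural-distortion site, consistent with the paper's finding that distorted ROIs have lower fractal dimension than normal ones.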

  4. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

    Praise for the Second Edition: "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math. Featuring new developments in the field combined with all aspects of obtaining, interpreting, and using sample data, Sampling provides an up-to-date treatment

  5. The influence of technological parameters on the dynamic behavior of "liquid wood" samples obtained by injection molding

    Science.gov (United States)

    Plavanescu Mazurchevici, Simona; Carausu, Constantin; Comaneci, Radu; Nedelcu, Dumitru

    2017-10-01

    Plastic products contribute to environmental pollution. Replacing plastic materials with biodegradable materials with superior properties is an absolute necessity and an important research direction for the near future. The first steps in this regard were composite materials containing natural fibers, with positive effects on the environment, which have penetrated different fields. Bioplastics and biocomposites made from natural fibers are a topical solution. The next step was toward biodegradable and recyclable materials based on cellulose and lignin and containing no carcinogens. In this category falls "liquid wood", which can be reused up to five times without affecting its mechanical properties. "Liquid wood" is a high-quality thermoplastic biocomposite: a biopolymer composite divided into three categories, ARBOFORM®, ARBOBLEND® and ARBOFILL®, which differ in composition in terms of lignin percentage and are delivered by Tecnaro as granules [1]. The paper's research focused on Arboform L V3 Nature and Arboform L V3 Nature reinforced with aramid fiber. The experimental plan took into account six parameters (Dinj - direction of injection [°]; Ttop - melting temperature [°C]; Pinj - injection pressure [MPa]; Ss - speed [m/min]; tinj - injection time [s]; and tc - cooling time [s]), each with two levels, with the research carried out using the Taguchi methodology. Processing by the Taguchi method allowed the influence of the working parameters on storage modulus and damping to be both quantified and ranked. Experimental research concerning the influence of the technological parameters on the storage modulus of samples obtained by injection from Arboform L V3 Nature yielded an average of 6055 MPa, with the parameters' influence in descending order: Trac, Ss, Pinj, Dinj and Ttop. The average for the reinforced material was 6419 MPa, with the parameters' influence in descending order: Dinj, Trac, Ttop, tinj, Ss and Pinj.

  6. Using Linked Survey Paradata to Improve Sampling Strategies in the Medical Expenditure Panel Survey

    Directory of Open Access Journals (Sweden)

    Mirel Lisa B.

    2017-06-01

    Full Text Available Using paradata from a prior survey that is linked to a new survey can help a survey organization develop more effective sampling strategies. One example of this type of linkage or subsampling is between the National Health Interview Survey (NHIS) and the Medical Expenditure Panel Survey (MEPS). MEPS is a nationally representative sample of the U.S. civilian, noninstitutionalized population based on a complex multi-stage sample design. Each year a new sample is drawn as a subsample of households from the prior year’s NHIS. The main objective of this article is to examine how paradata from a prior survey can be used in developing a sampling scheme in a subsequent survey. A framework for optimal allocation of the sample in substrata formed for this purpose is presented and evaluated for the relative effectiveness of alternative substratification schemes. The framework is applied, using real MEPS data, to illustrate how utilizing paradata from the linked survey offers the possibility of making improvements to the sampling scheme for the subsequent survey. The improvements aim to reduce the data collection costs while maintaining or increasing effective responding sample sizes and response rates for a harder-to-reach population.
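A standard starting point for this kind of optimal-allocation framework is Neyman allocation, where the prior survey's paradata supply the stratum sizes and within-stratum variability. The substrata, sizes, and standard deviations below are hypothetical, purely to show the computation; the paper's actual allocation also accounts for costs and response propensities.

```python
def neyman_allocation(strata, total_n):
    """Neyman allocation: sample size per stratum proportional to
    N_h * S_h (stratum size times within-stratum standard deviation).
    A cost term c_h could be added by weighting with N_h*S_h/sqrt(c_h)."""
    weights = {h: N * S for h, (N, S) in strata.items()}
    total = sum(weights.values())
    return {h: round(total_n * w / total) for h, w in weights.items()}

# Hypothetical substrata built from linked prior-survey paradata:
# (population size, sd of an outcome proxy observed in the prior survey)
strata = {"easy_responders": (8000, 1.0),
          "hard_to_reach":   (2000, 3.0)}
print(neyman_allocation(strata, total_n=700))  # → {'easy_responders': 400, 'hard_to_reach': 300}
```

Even though the hard-to-reach stratum is a quarter of the population, its higher variability earns it nearly half the sample, which is how such schemes maintain effective sample sizes for harder-to-reach groups.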

  7. Rapid determination of strontium-89 and strontium-90 in food and environmental samples by Cerenkov counting

    International Nuclear Information System (INIS)

    Melin, Judith; Suomela, Jorma

    1995-01-01

    The method has been developed for emergency situations. Minimum detectable concentrations of 5 Bq/liter (or 5 Bq/kilogram) of strontium-89 and strontium-90, respectively, are achievable in the presence of nuclides considered likely to be released under accidental conditions. Results for the strontium-89 and strontium-90 content of a sample can be obtained within 12 hours. One technician can easily handle 8-10 samples during a working day of eight hours. The determination of the strontium isotopes is accomplished by monitoring the Cerenkov radiation from strontium-89 and yttrium-90, the daughter product of strontium-90, in a liquid scintillation counter. Prior to Cerenkov counting, the sample is separated from interfering nuclides by oxalate precipitation, chromate precipitation and HDEHP extraction. The method has to be further improved and evaluated with respect to different soil types such as forest mineral soil layers, agricultural soils and pastures. Furthermore, the decontamination procedure should be evaluated for a sample containing freshly irradiated uranium. (author)

  8. Possibility of obtaining reliable information on component safety by means of large-scale tensile samples with Orowan-Soete flaws

    International Nuclear Information System (INIS)

    Aurich, D.; Wobst, K.; Kafka, H.

    1984-01-01

    The aim of the paper is to review present knowledge regarding the ability of wide-plate tensile specimens with saw-cut through-centre flaws to provide accurate information on component reliability, and to point out the advantages and disadvantages of this specimen geometry. The effects of temperature, specimen geometry, ligament size and notch radii are discussed in comparison with other specimen geometries. This is followed by a comparison of the results of such tests with tests on internally pressurized tanks. Conclusions: wide-plate tensile specimens are generally appropriate for assessing welded joints. However, compared with the results obtained with three-point bend specimens, they lead to a more favourable evaluation of low-toughness steels than of high-toughness and soft steels with respect to crack growth under stresses with incipient cracks. (orig.) [de

  9. TECHNIQUES WITH POTENTIAL FOR HANDLING ENVIRONMENTAL SAMPLES IN CAPILLARY ELECTROPHORESIS

    Science.gov (United States)

    An assessment of the methods for handling environmental samples prior to capillary electrophoresis (CE) is presented for both aqueous and solid matrices. Sample handling in environmental analyses is the subject of ongoing research at the Environmental Protection Agency's National...

  10. Maximum entropy reconstruction of spin densities involving non uniform prior

    International Nuclear Information System (INIS)

    Schweizer, J.; Ressouche, E.; Papoular, R.J.; Zheludev, A.I.

    1997-01-01

    Diffraction experiments give microscopic information on structures in crystals. A method which uses the concept of maximum entropy (MaxEnt) appears to be a formidable improvement in the treatment of diffraction data. This method is based on a Bayesian approach: among all the maps compatible with the experimental data, it selects the one which has the highest prior (intrinsic) probability. Considering that all the points of the map are equally probable, this probability (flat prior) is expressed via the Boltzmann entropy of the distribution. The method has been used for the reconstruction of charge densities from X-ray data, for maps of nuclear densities from unpolarized neutron data, as well as for distributions of spin density. The density maps obtained by this method, compared to those resulting from the usual inverse Fourier transformation, are tremendously improved. In particular, any substantial deviation from the background is really contained in the data, as it costs entropy compared to a map that would ignore such features. However, in most cases, before the measurements are performed, some knowledge exists about the distribution under investigation. It can range from simple information on the type of scattering electrons to an elaborate theoretical model. In these cases, the uniform prior, which considers all the different pixels as equally likely, is too weak a requirement and has to be replaced. In a rigorous Bayesian analysis, Skilling has shown that prior knowledge can be encoded into the Maximum Entropy formalism through a model m(r), via a new definition for the entropy given in this paper. In the absence of any data, the maximum of the entropy functional is reached for ρ(r) = m(r). Any substantial departure from the model, observed in the final map, is really contained in the data as, with the new definition, it costs entropy. This paper presents illustrations of model testing
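The role of the model m in this entropy definition can be illustrated numerically: maximizing the relative entropy S = −Σ ρ ln(ρ/m) under only the normalization constraint returns ρ = m, while adding a data constraint exponentially tilts the model. A minimal sketch on a three-point distribution (the model values, the observable x, and the target mean are invented for illustration; real reconstructions work on density maps with many pixels):

```python
import math

def maxent_with_prior(m, x, target_mean):
    """Maximise S = -sum rho*ln(rho/m) subject to sum rho = 1 and
    sum rho*x = target_mean.  By Lagrange duality the solution is the
    exponentially tilted model rho_i ∝ m_i * exp(lam * x_i); the
    multiplier lam is found here by bisection (the constrained mean is
    monotone increasing in lam).  With no data constraint, rho = m."""
    def tilt(lam):
        w = [mi * math.exp(lam * xi) for mi, xi in zip(m, x)]
        z = sum(w)
        return [wi / z for wi in w]

    lo, hi = -50.0, 50.0
    for _ in range(200):
        lam = 0.5 * (lo + hi)
        mean = sum(r * xi for r, xi in zip(tilt(lam), x))
        if mean < target_mean:
            lo = lam
        else:
            hi = lam
    return tilt(0.5 * (lo + hi))

m = [0.5, 0.3, 0.2]          # non-uniform prior model (mean of x is 0.7)
x = [0.0, 1.0, 2.0]
rho = maxent_with_prior(m, x, target_mean=1.0)
print([round(r, 3) for r in rho])
```

Requesting `target_mean=0.7` (the model's own mean, i.e. "no new information") returns ρ = m, which is exactly the behaviour of the entropy functional described in the abstract.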

  11. Transfer of residents to hospital prior to cardiac death: the influence of nursing home quality and ownership type.

    Science.gov (United States)

    Anic, Gabriella M; Pathak, Elizabeth Barnett; Tanner, Jean Paul; Casper, Michele L; Branch, Laurence G

    2014-01-01

    We hypothesised that among nursing home decedents, nursing home for-profit status and poor quality-of-care ratings, as well as patient characteristics, would lower the likelihood of transfer to hospital prior to heart disease death. Using death certificates from a large metropolitan area (Tampa Florida Metropolitan Statistical Area) for 1998-2002, we geocoded residential street addresses of heart disease decedents to identify 2172 persons who resided in nursing homes (n=131) at the time of death. We analysed decedent place of death as an indicator of transfer prior to death. Multilevel logistic regression modelling was used for analysis. Cause of death and decedent characteristics were obtained from death certificates. Nursing home characteristics, including state inspector ratings for multiple time points, were obtained from Florida's Agency for Healthcare Administration. Nursing home for-profit status, level of nursing care and quality-of-care ratings were not associated with the likelihood of transfer to hospital prior to heart disease death. Nursing homes >5 miles from a hospital were more likely to transfer decedents, compared with facilities located close to a hospital. Significant predictors of no transfer for nursing home residents were being white, female, older, less educated and widowed/unmarried. In this study population, contrary to our hypotheses, sociodemographic characteristics of nursing home decedents were more important predictors of no transfer prior to cardiac death than quality rankings or for-profit status of nursing homes.

  12. Reproducing kernel Hilbert spaces of Gaussian priors

    NARCIS (Netherlands)

    Vaart, van der A.W.; Zanten, van J.H.; Clarke, B.; Ghosal, S.

    2008-01-01

    We review definitions and properties of reproducing kernel Hilbert spaces attached to Gaussian variables and processes, with a view to applications in nonparametric Bayesian statistics using Gaussian priors. The rate of contraction of posterior distributions based on Gaussian priors can be described

  13. Employment Interventions for Individuals with ASD: The Relative Efficacy of Supported Employment With or Without Prior Project SEARCH Training.

    Science.gov (United States)

    Schall, Carol M; Wehman, Paul; Brooke, Valerie; Graham, Carolyn; McDonough, Jennifer; Brooke, Alissa; Ham, Whitney; Rounds, Rachael; Lau, Stephanie; Allen, Jaclyn

    2015-12-01

    This paper presents findings from a retrospective observational records review study that compares the outcomes associated with implementation of supported employment (SE) with and without prior Project SEARCH with ASD Supports (PS-ASD) on wages earned, time spent in intervention, and job retention. Results suggest that SE resulted in competitive employment for 45 adults with ASD. Twenty-five individuals received prior intervention through PS-ASD while the other 20 individuals received SE only. Individuals in this sample who received PS-ASD required fewer hours of intervention. Additionally, individuals in the PS-ASD group achieved a mean higher wage and had higher retention rates than their peers who received SE only. Further research with a larger sample is needed to confirm these findings.

  14. Prior Knowledge Assessment Guide

    Science.gov (United States)

    2014-12-01

    assessment in a reasonable amount of time. Hands-on assessments can be extremely diverse in makeup and administration depending on the subject matter.

  15. Assessing representativeness of sampling methods for reaching men who have sex with men: a direct comparison of results obtained from convenience and probability samples.

    Science.gov (United States)

    Schwarcz, Sandra; Spindler, Hilary; Scheer, Susan; Valleroy, Linda; Lansky, Amy

    2007-07-01

    Convenience samples are used to determine HIV-related behaviors among men who have sex with men (MSM) without measuring the extent to which the results are representative of the broader MSM population. We compared results from a cross-sectional survey of MSM recruited from gay bars between June and October 2001 to a random-digit-dial telephone survey conducted between June 2002 and January 2003. The men in the probability sample were older, better educated, and had higher incomes than men in the convenience sample; the convenience sample enrolled more employed men and men of color. Substance use around the time of sex was higher in the convenience sample, but other sexual behaviors were similar. HIV testing was common among men in both samples. Periodic validation, through comparison of data collected by different sampling methods, may be useful when relying on survey data for program and policy development.

  16. On the development of automatic sample preparation devices

    International Nuclear Information System (INIS)

    Oesselmann, J.

    1987-01-01

    Modern mass spectrometers for stable isotope analysis offer accurate isotope ratio results from gaseous samples (CO2, N2, H2, SO2) in a completely automated fashion. However, most samples of interest either are associated with contaminant gases or the gas has to be liberated by a chemical procedure prior to measurement. In most laboratories this sample preparation step is performed manually. As a consequence, sample throughput is rather low and - despite skilful operation - the preparation procedure varies slightly from one sample to the next affecting mainly the reproducibility of the data. (author)

  17. Fractional Gaussian noise: Prior specification and model comparison

    KAUST Repository

    Sørbye, Sigrunn Holbek

    2017-07-07

    Fractional Gaussian noise (fGn) is a stationary stochastic process used to model antipersistent or persistent dependency structures in observed time series. Properties of the autocovariance function of fGn are characterised by the Hurst exponent (H), which, in Bayesian contexts, typically has been assigned a uniform prior on the unit interval. This paper argues why a uniform prior is unreasonable and introduces the use of a penalised complexity (PC) prior for H. The PC prior is computed to penalise divergence from the special case of white noise and is invariant to reparameterisations. An immediate advantage is that the exact same prior can be used for the autocorrelation coefficient ϕ of a first-order autoregressive process AR(1), as this model also reflects a flexible version of white noise. Within the general setting of latent Gaussian models, this allows us to compare an fGn model component with AR(1) using Bayes factors, avoiding the confounding effects of prior choices for the two hyperparameters H and ϕ. Among others, this is useful in climate regression models where inference for underlying linear or smooth trends depends heavily on the assumed noise model.
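
    The role of H can be made concrete: the autocovariance of unit-variance fGn at lag k is γ(k) = ½(|k+1|^2H − 2|k|^2H + |k−1|^2H), a standard formula (the function name below is ours, not the paper's). The sketch evaluates it and shows the three regimes of H.

```python
def fgn_acvf(H, k):
    """Autocovariance of unit-variance fractional Gaussian noise at lag k:
    gamma(k) = 0.5 * (|k+1|^(2H) - 2|k|^(2H) + |k-1|^(2H))."""
    k = abs(k)
    return 0.5 * (abs(k + 1) ** (2 * H) - 2 * k ** (2 * H) + abs(k - 1) ** (2 * H))

white = fgn_acvf(0.5, 1)           # H = 0.5: zero lag-one correlation (white noise)
persistent = fgn_acvf(0.9, 1)      # H > 0.5: positive (persistent) correlation
antipersistent = fgn_acvf(0.1, 1)  # H < 0.5: negative (antipersistent) correlation
```

    At H = 0.5 every nonzero lag has zero autocovariance, which is exactly the white-noise base model that the PC prior penalises divergence from.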

  18. Fractional Gaussian noise: Prior specification and model comparison

    KAUST Repository

    Sørbye, Sigrunn Holbek; Rue, Haavard

    2017-01-01

    Fractional Gaussian noise (fGn) is a stationary stochastic process used to model antipersistent or persistent dependency structures in observed time series. Properties of the autocovariance function of fGn are characterised by the Hurst exponent (H), which, in Bayesian contexts, typically has been assigned a uniform prior on the unit interval. This paper argues why a uniform prior is unreasonable and introduces the use of a penalised complexity (PC) prior for H. The PC prior is computed to penalise divergence from the special case of white noise and is invariant to reparameterisations. An immediate advantage is that the exact same prior can be used for the autocorrelation coefficient ϕ of a first-order autoregressive process AR(1), as this model also reflects a flexible version of white noise. Within the general setting of latent Gaussian models, this allows us to compare an fGn model component with AR(1) using Bayes factors, avoiding the confounding effects of prior choices for the two hyperparameters H and ϕ. Among others, this is useful in climate regression models where inference for underlying linear or smooth trends depends heavily on the assumed noise model.

  19. GNS Castor V/21 Headspace Gas Sampling 2014

    Energy Technology Data Exchange (ETDEWEB)

    Winston, Philip Lon [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-01-01

    Prior to performing an internal visual inspection, samples of the headspace gas of the GNS Castor V/21 cask were taken on June 12, 2014. These samples were taken in support of the CREIPI/Japanese nuclear industry effort to validate fuel integrity without visual inspection by measuring the 85Kr content of the cask headspace.

  20. DNA Sampling Hook

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The DNA Sampling Hook is a significant improvement on a method of obtaining a tissue sample from a live fish in situ from an aquatic environment. A tissue sample...

  1. Fluorescence imaging of ion distributions in an inductively coupled plasma with laser ablation sample introduction

    International Nuclear Information System (INIS)

    Moses, Lance M.; Ellis, Wade C.; Jones, Derick D.; Farnsworth, Paul B.

    2015-01-01

    High-resolution images of the spatial distributions of Sc II, Ca II, and Ba II ion densities in the 10 mm upstream from the sampling cone in a laser ablation-inductively coupled plasma-mass spectrometer (LA-ICP-MS) were obtained using planar laser induced fluorescence. Images were obtained for each analyte as a function of the carrier gas flow rate with laser ablation (LA) sample introduction and compared to images with solution nebulization (SN) over the same range of flow rates. Additionally, images were obtained using LA at varying fluences and with varying amounts of helium added to a constant flow of argon gas. Ion profiles in SN images followed a pattern consistent with previous work: increasing gas flow caused a downstream shift in the ion profiles. When compared to SN, LA led to ion profiles that were much narrower radially and reached a maximum near the sampling cone at higher flow rates. Increasing the fluence led to ions formed in the ICP over greater axial and radial distances. The addition of He to the carrier gas prior to the ablation cell led to an upstream shift in the position of ionization and lower overall fluorescence intensities. - Highlights: • We map distributions of analytes in the ICP using laser ablation sample introduction. • We compare images from laser ablation with those from a pneumatic nebulizer. • We document the effects of water added to the laser ablation aerosol. • We compare distributions from a metal to those from crystalline solids. • We document the effect of laser fluence on ion distributions

  2. Optimal Bayesian experimental design for priors of compact support with application to shock-tube experiments for combustion kinetics

    KAUST Repository

    Bisetti, Fabrizio; Kim, Daesang; Knio, Omar; Long, Quan; Tempone, Raul

    2016-01-01

    to account for the bounded domain of the uniform prior pdf of the parameters. The underlying Gaussian distribution is obtained in the spirit of the Laplace method, more precisely, the mode is chosen as the maximum a posteriori (MAP) estimate

  3. Learning priors for Bayesian computations in the nervous system.

    Directory of Open Access Journals (Sweden)

    Max Berniker

    Our nervous system continuously combines new information from our senses with information it has acquired throughout life. Numerous studies have found that human subjects manage this by integrating their observations with their previous experience (priors) in a way that is close to the statistical optimum. However, little is known about the way the nervous system acquires or learns priors. Here we present results from experiments where the underlying distribution of target locations in an estimation task was switched, manipulating the prior subjects should use. Our experimental design allowed us to measure a subject's evolving prior while they learned. We confirm that through extensive practice subjects learn the correct prior for the task. We found that subjects can rapidly learn the mean of a new prior while the variance is learned more slowly and with a variable learning rate. In addition, we found that a Bayesian inference model could predict the time course of the observed learning while offering an intuitive explanation for the findings. The evidence suggests the nervous system continuously updates its priors to enable efficient behavior.
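
    An idealized version of this prior learning can be sketched with conjugate normal updates (a toy model of ours, not the authors' analysis): after the target distribution switches, each observation pulls the learner's prior mean toward the new mean while the prior variance contracts.

```python
import random

def update(mu, var, x, obs_var):
    """One conjugate normal update: combine the current prior N(mu, var)
    with a noisy observation x whose noise variance is obs_var."""
    gain = var / (var + obs_var)        # Kalman-style weighting of the new datum
    return mu + gain * (x - mu), (1 - gain) * var

random.seed(1)
mu, var = 0.0, 4.0                      # stale prior from before the switch
true_mean, obs_var = 2.0, 1.0           # new target distribution
for _ in range(50):
    x = random.gauss(true_mean, obs_var ** 0.5)
    mu, var = update(mu, var, x, obs_var)
```

    After 50 trials the estimated mean sits near the new target and the posterior variance has contracted to roughly 1/(1/4 + 50) ≈ 0.02; the abstract's finding is that real subjects track the mean this quickly but learn the variance much more slowly than this ideal observer.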

  4. Synchrotron diffraction studies of TiC/FeTi cermets obtained by SHS

    International Nuclear Information System (INIS)

    Contreras, L.; Turrillas, X.; Mas-Guindal, M.J.; Vaughan, G.B.M.; Kvick, A.; Rodriguez, M.A.

    2005-01-01

    TiC/FeTi composites have been obtained in situ by Self-propagating High-temperature Synthesis (SHS) of an intimate mixture of compacted powders of elemental carbon, titanium and iron. The reaction has been followed in real time by X-ray diffraction at the ESRF. The mechanism of the reaction is discussed in terms of the formation of a liquid phase corresponding to the eutectic of the Fe/Ti system prior to the TiC synthesis. Temperatures of reaction have been estimated by correlating thermal expansion coefficients with diffraction peaks shifts. The microstructures obtained by this method, suitable for cutting tools and wear resistant applications, are presented

  5. Automatic sample changers maintenance manual

    International Nuclear Information System (INIS)

    Myers, T.A.

    1978-10-01

    This manual describes and provides trouble-shooting aids for the Automatic Sample Changer electronics on the automatic beta counting system, developed by the Los Alamos Scientific Laboratory Group CNC-11. The output of a gas detector is shaped by a preamplifier, then is coupled to an amplifier. Amplifier output is discriminated and is the input to a scaler. An identification number is associated with each sample. At a predetermined count length, the identification number, scaler data plus other information is punched out on a data card. The next sample to be counted is automatically selected. The beta counter uses the same electronics as the prior count did, the only difference being the sample identification number and sample itself. This manual is intended as a step-by-step aid in trouble-shooting the electronics associated with positioning the sample, counting the sample, and getting the needed data punched on an 80-column data card

  6. Honest Importance Sampling with Multiple Markov Chains.

    Science.gov (United States)

    Tan, Aixin; Doss, Hani; Hobert, James P

    2015-01-01

    Importance sampling is a classical Monte Carlo technique in which a random sample from one probability density, π1, is used to estimate an expectation with respect to another, π. The importance sampling estimator is strongly consistent and, as long as two simple moment conditions are satisfied, it obeys a central limit theorem (CLT). Moreover, there is a simple consistent estimator for the asymptotic variance in the CLT, which makes for routine computation of standard errors. Importance sampling can also be used in the Markov chain Monte Carlo (MCMC) context. Indeed, if the random sample from π1 is replaced by a Harris ergodic Markov chain with invariant density π1, then the resulting estimator remains strongly consistent. There is a price to be paid however, as the computation of standard errors becomes more complicated. First, the two simple moment conditions that guarantee a CLT in the iid case are not enough in the MCMC context. Second, even when a CLT does hold, the asymptotic variance has a complex form and is difficult to estimate consistently. In this paper, we explain how to use regenerative simulation to overcome these problems. Actually, we consider a more general set up, where we assume that Markov chain samples from several probability densities, π1, …, πk, are available. We construct multiple-chain importance sampling estimators for which we obtain a CLT based on regeneration. We show that if the Markov chains converge to their respective target distributions at a geometric rate, then under moment conditions similar to those required in the iid case, the MCMC-based importance sampling estimator obeys a CLT. Furthermore, because the CLT is based on a regenerative process, there is a simple consistent estimator of the asymptotic variance. We illustrate the method with two applications in Bayesian sensitivity analysis. The first concerns one-way random effects models under different priors. The second involves Bayesian variable
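
    In the iid case the abstract starts from, both the estimator and its CLT-based standard error take a few lines; the sketch below uses Gaussian densities chosen by us purely for illustration. The paper's point is that this naive standard error is no longer valid once the proposal draws come from a Markov chain rather than an iid sample.

```python
import math, random

def normal_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

random.seed(0)
n = 100_000
# iid proposal pi_1 = N(0, 3^2); target pi = N(2, 1^2); estimand E_pi[X] = 2
xs = [random.gauss(0.0, 3.0) for _ in range(n)]
ws = [normal_pdf(x, 2.0, 1.0) / normal_pdf(x, 0.0, 3.0) for x in xs]
est = sum(w * x for w, x in zip(ws, xs)) / n
# iid-case CLT standard error -- NOT valid for Markov chain draws,
# which is what motivates the regenerative approach of the paper
m2 = sum((w * x - est) ** 2 for w, x in zip(ws, xs)) / n
se = math.sqrt(m2 / n)
```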

  7. Effects of GPS sampling intensity on home range analyses

    Science.gov (United States)

    Jeffrey J. Kolodzinski; Lawrence V. Tannenbaum; David A. Osborn; Mark C. Conner; W. Mark Ford; Karl V. Miller

    2010-01-01

    The two most common methods for determining home ranges, minimum convex polygon (MCP) and kernel analyses, can be affected by sampling intensity. Despite prior research, it remains unclear how high-intensity sampling regimes affect home range estimations. We used datasets from 14 GPS-collared, white-tailed deer (Odocoileus virginianus) to describe...
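
    For reference, the MCP estimator named above is simply the area of the convex hull of the GPS fixes. A stdlib-only sketch (helper name and coordinates are ours, for illustration): Andrew's monotone-chain hull followed by the shoelace formula.

```python
def mcp_area(fixes):
    """100% minimum convex polygon home-range area from GPS fixes given as
    (x, y) pairs in a projected coordinate system (e.g. UTM metres)."""
    pts = sorted(set(map(tuple, fixes)))
    if len(pts) < 3:
        return 0.0

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    def half_hull(points):
        chain = []
        for p in points:
            while len(chain) >= 2 and cross(chain[-2], chain[-1], p) <= 0:
                chain.pop()
            chain.append(p)
        return chain

    # lower hull + upper hull, dropping each chain's last point (duplicated)
    hull = half_hull(pts)[:-1] + half_hull(pts[::-1])[:-1]
    # shoelace formula for the polygon area
    return 0.5 * abs(sum(x1 * y2 - x2 * y1
                         for (x1, y1), (x2, y2) in zip(hull, hull[1:] + hull[:1])))

fixes = [(0, 0), (1, 0), (1, 1), (0, 1), (0.5, 0.5)]  # toy fixes; area = 1.0
area = mcp_area(fixes)
```

    Because every additional fix can only grow (never shrink) the hull, MCP area tends to inflate with sampling intensity, which is one reason intensity matters for the comparison the study describes.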

  8. Effect of the Availability of Prior Full-Field Digital Mammography and Digital Breast Tomosynthesis Images on the Interpretation of Mammograms

    Science.gov (United States)

    Catullo, Victor J.; Chough, Denise M.; Ganott, Marie A.; Kelly, Amy E.; Shinde, Dilip D.; Sumkin, Jules H.; Wallace, Luisa P.; Bandos, Andriy I.; Gur, David

    2015-01-01

    Purpose To assess the effect of and interaction between the availability of prior images and digital breast tomosynthesis (DBT) images in decisions to recall women during mammogram interpretation. Materials and Methods Verbal informed consent was obtained for this HIPAA-compliant institutional review board–approved protocol. Eight radiologists independently interpreted twice deidentified mammograms obtained in 153 women (age range, 37–83 years; mean age, 53.7 years ± 9.3 [standard deviation]) in a mode by reader by case-balanced fully crossed study. Each study consisted of current and prior full-field digital mammography (FFDM) images and DBT images that were acquired in our facility between June 2009 and January 2013. For one reading, sequential ratings were provided by using (a) current FFDM images only, (b) current FFDM and DBT images, and (c) current FFDM, DBT, and prior FFDM images. The other reading consisted of (a) current FFDM images only, (b) current and prior FFDM images, and (c) current FFDM, prior FFDM, and DBT images. Fifty verified cancer cases, 60 negative and benign cases (clinically not recalled), and 43 benign cases (clinically recalled) were included. Recall recommendations and interaction between the effect of prior FFDM and DBT images were assessed by using a generalized linear model accounting for case and reader variability. Results Average recall rates in noncancer cases were significantly reduced with the addition of prior FFDM images by 34% (145 of 421) and 32% (106 of 333) without and with DBT images, respectively (P < .001). However, this recall reduction was achieved at the cost of a corresponding 7% (23 of 345) and 4% (14 of 353) reduction in sensitivity (P = .006). In contrast, availability of DBT images resulted in a smaller reduction in recall rates (false-positive interpretations) of 19% (76 of 409) and 26% (71 of 276) without and with prior FFDM images, respectively (P = .001). Availability of DBT images resulted in 4% (15 of

  9. Rotary mode core sampling approved checklist: 241-TX-113

    International Nuclear Information System (INIS)

    Fowler, K.D.

    1998-01-01

    The safety assessment for rotary mode core sampling was developed using certain bounding assumptions; however, those assumptions were not verified for each of the existing or potential flammable gas tanks. Therefore, a Flammable Gas/Rotary Mode Core Sampling Approved Checklist has been completed for tank 241-TX-113 prior to sampling operations. This transmittal documents the dispositions of the checklist items from the safety assessment.

  10. Rotary mode core sampling approved checklist: 241-TX-116

    International Nuclear Information System (INIS)

    FOWLER, K.D.

    1999-01-01

    The safety assessment for rotary mode core sampling was developed using certain bounding assumptions; however, those assumptions were not verified for each of the existing or potential flammable gas tanks. Therefore, a Flammable Gas/Rotary Mode Core Sampling Approved Checklist has been completed for tank 241-TX-116 prior to sampling operations. This transmittal documents the dispositions of the checklist items from the safety assessment.

  11. Immunoaffinity chromatography for the sample pretreatment of Taxus plant and cell extracts prior to analysis of taxanes by high-performance liquid chromatography

    NARCIS (Netherlands)

    Theodoridis, G; Haasnoot, W; Schilt, R; Jaziri, M; Diallo, B; Papadoyannis, IN; de Jong, GJ; Cazemier, G.

    2002-01-01

    The application of immunoaffinity chromatography for the purification of Taxus plant and cell extracts prior to the HPLC analysis is described. Polyclonal antibodies raised against 10-deacetylbaccatin III (10-DAB III), paclitaxel's main precursor in plant, were characterised by enzyme-linked

  12. STATISTICAL ANALYSIS OF TANK 18F FLOOR SAMPLE RESULTS

    Energy Technology Data Exchange (ETDEWEB)

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 18F as per the statistical sampling plan developed by Shine [1]. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL [2]. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis samples results [3] to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 18F. The uncertainty is quantified in this report by an upper 95% confidence limit (UCL95%) on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).
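
    The uncertainty quantity described here is the standard one-sided upper confidence limit on a mean, UCL95% = x̄ + t·s/√n. A minimal sketch (concentrations are hypothetical; the one-sided 95% Student-t critical value for n − 1 = 5 degrees of freedom is passed in from a table rather than computed):

```python
import math
from statistics import mean, stdev

def ucl95(results, t_crit):
    """One-sided upper 95% confidence limit on a mean concentration:
    UCL95 = xbar + t * s / sqrt(n), where t is the one-sided 95%
    Student-t critical value for n - 1 degrees of freedom."""
    n = len(results)
    return mean(results) + t_crit * stdev(results) / math.sqrt(n)

# Six scrape-sample results (hypothetical concentrations, mg/kg)
samples = [10.2, 11.5, 9.8, 10.9, 10.4, 11.1]
limit = ucl95(samples, t_crit=2.015)   # t for df = 5, one-sided 95%
```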

  13. Generalized multiple kernel learning with data-dependent priors.

    Science.gov (United States)

    Mao, Qi; Tsang, Ivor W; Gao, Shenghua; Wang, Li

    2015-06-01

    Multiple kernel learning (MKL) and classifier ensemble are two mainstream methods for solving learning problems in which some sets of features/views are more informative than others, or the features/views within a given set are inconsistent. In this paper, we first present a novel probabilistic interpretation of MKL such that maximum entropy discrimination with a noninformative prior over multiple views is equivalent to the formulation of MKL. Instead of using the noninformative prior, we introduce a novel data-dependent prior based on an ensemble of kernel predictors, which enhances the prediction performance of MKL by leveraging the merits of the classifier ensemble. With the proposed probabilistic framework of MKL, we propose a hierarchical Bayesian model to learn the proposed data-dependent prior and classification model simultaneously. The resultant problem is convex and other information (e.g., instances with either missing views or missing labels) can be seamlessly incorporated into the data-dependent priors. Furthermore, a variety of existing MKL models can be recovered under the proposed MKL framework and can be readily extended to incorporate these priors. Extensive experiments demonstrate the benefits of our proposed framework in supervised and semisupervised settings, as well as in tasks with partial correspondence among multiple views.

  14. Aerodynamic sampling for landmine trace detection

    Science.gov (United States)

    Settles, Gary S.; Kester, Douglas A.

    2001-10-01

    Electronic noses and similar sensors show promise for detecting buried landmines through the explosive trace signals they emit. A key step in this detection is the sampler or sniffer, which acquires the airborne trace signal and presents it to the detector. Practicality demands no physical contact with the ground. Further, both airborne particulates and molecular traces must be sampled. Given a complicated minefield terrain and microclimate, this becomes a daunting chore. Our prior research on canine olfactory aerodynamics revealed several ways that evolution has dealt with such problems: 1) proximity of the sniffer to the scent source is important, 2) avoid exhaling back into the scent source, 3) use an aerodynamic collar on the sniffer inlet, 4) use auxiliary airjets to stir up surface particles, and 5) manage the 'impedance mismatch' between sniffer and sensor airflows carefully. Unfortunately, even basic data on aerodynamic sniffer performance as a function of inlet-tube and scent-source diameters, standoff distance, etc., have not been previously obtained. A laboratory-prototype sniffer was thus developed to provide guidance for landmine trace detectors. Initial experiments with this device are the subject of this paper. For example, a spike in the trace signal is observed upon starting the sniffer airflow, apparently due to rapid depletion of the available signal-laden air. Further, shielding the sniffer from disruptive ambient airflows arises as a key issue in sampling efficiency.

  15. Magnetic matrix solid phase dispersion assisted dispersive liquid liquid microextraction of ultra trace polychlorinated biphenyls in water prior to GC-ECD determination

    International Nuclear Information System (INIS)

    Diao, Chunpeng; Li, Cong; Yang, Xiao; Sun, Ailing; Liu, Renmin

    2016-01-01

    Magnetic matrix solid phase dispersion (MMSPD) assisted dispersive liquid–liquid microextraction (DLLME) was applied to extract ultra traces of polychlorinated biphenyls (PCBs) from water samples prior to gas chromatography with electron capture detection. PCBs in water were adsorbed by micro particles of magnetic bamboo charcoal and then transferred into the elution solvent. PCBs in the elution solvent of the MMSPD were further concentrated into a trace volume of extraction solvent in the DLLME procedure. Under optimized conditions, good linearity in the range of 0.2–100 ng L−1 was obtained with regression coefficients (r) higher than 0.9987. Based on a signal-to-noise ratio of 3, the limits of detection (LODs) range from 0.05–0.1 ng L−1. These LODs are much lower than those of MMSPD or DLLME alone. Relative standard deviations are between 4.9–8.2 %. The method was successfully applied to the determination of PCBs in lake and river water. Relative recoveries were 85.5–117.4 % for the spiked environmental water samples. (author)
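
    The figures of merit quoted here come from an ordinary least-squares calibration line plus an S/N = 3 detection-limit rule. The sketch below uses hypothetical concentrations, responses, and baseline noise chosen for illustration, not the paper's data:

```python
def fit_line(conc, signal):
    """Ordinary least-squares calibration: signal = slope * conc + intercept."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(signal) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, signal))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical calibration standards: concentration (ng/L) vs detector response
conc   = [0.2, 1.0, 5.0, 20.0, 100.0]
signal = [1.0, 5.0, 25.0, 100.0, 500.0]   # perfectly linear here, slope 5
slope, intercept = fit_line(conc, signal)

noise_sd = 0.5                 # hypothetical baseline noise (response units)
lod = 3 * noise_sd / slope     # S/N = 3 definition of the limit of detection
```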

  16. Calibrated birth-death phylogenetic time-tree priors for bayesian inference.

    Science.gov (United States)

    Heled, Joseph; Drummond, Alexei J

    2015-05-01

    Here we introduce a general class of multiple calibration birth-death tree priors for use in Bayesian phylogenetic inference. All tree priors in this class separate ancestral node heights into a set of "calibrated nodes" and "uncalibrated nodes" such that the marginal distribution of the calibrated nodes is user-specified whereas the density ratio of the birth-death prior is retained for trees with equal values for the calibrated nodes. We describe two formulations, one in which the calibration information informs the prior on ranked tree topologies, through the (conditional) prior, and the other which factorizes the prior on divergence times and ranked topologies, thus allowing uniform, or any arbitrary prior distribution on ranked topologies. Although the first of these formulations has some attractive properties, the algorithm we present for computing its prior density is computationally intensive. However, the second formulation is always faster and computationally efficient for up to six calibrations. We demonstrate the utility of the new class of multiple-calibration tree priors using both small simulations and a real-world analysis and compare the results to existing schemes. The two new calibrated tree priors described in this article offer greater flexibility and control of prior specification in calibrated time-tree inference and divergence time dating, and will remove the need for indirect approaches to the assessment of the combined effect of calibration densities and tree priors in Bayesian phylogenetic inference. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.

  17. A measure of uncertainty regarding the interval constraint of normal mean elicited by two stages of a prior hierarchy.

    Science.gov (United States)

    Kim, Hea-Jung

    2014-01-01

    This paper considers a hierarchical screened Gaussian model (HSGM) for Bayesian inference of normal models when an interval constraint in the mean parameter space needs to be incorporated in the modeling but when such a restriction is uncertain. An objective measure of the uncertainty, regarding the interval constraint, accounted for by using the HSGM is proposed for the Bayesian inference. For this purpose, we derive a maximum entropy prior of the normal mean, eliciting the uncertainty regarding the interval constraint, and then obtain the uncertainty measure by considering the relationship between the maximum entropy prior and the marginal prior of the normal mean in the HSGM. A Bayesian estimation procedure for the HSGM is developed and two numerical illustrations pertaining to the properties of the uncertainty measure are provided.

  18. Protocol for Cohesionless Sample Preparation for Physical Experimentation

    Science.gov (United States)

    2016-05-01

    Standard test method for consolidated drained triaxial compression test for soils. In Annual book of ASTM standards. West Conshohocken, PA: ASTM...derived wherein uncertainties and laboratory scatter associated with soil fabric-behavior variance during sample preparation are mitigated. Samples of...wherein comparable analysis between different laboratory tests’ results can be made by ensuring a comparable soil fabric prior to laboratory testing

  19. Statistical literacy and sample survey results

    Science.gov (United States)

    McAlevey, Lynn; Sullivan, Charles

    2010-10-01

    Sample surveys are widely used in the social sciences and business. The news media almost daily quote from them, yet they are widely misused. Using students with prior managerial experience embarking on an MBA course, we show that common sample survey results are misunderstood even by those managers who have previously done a statistics course. In general, they fare no better than managers who have never studied statistics. There are implications for teaching, especially in business schools, as well as for consulting.

  20. Experimental and Sampling Design for the INL-2 Sample Collection Operational Test

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F.; Amidan, Brett G.; Matzke, Brett D.

    2009-02-16

    This report describes the experimental and sampling design developed to assess sampling approaches and methods for detecting contamination in a building and clearing the building for use after decontamination. An Idaho National Laboratory (INL) building will be contaminated with BG (Bacillus globigii, renamed Bacillus atrophaeus), a simulant for Bacillus anthracis (BA). The contamination, sampling, decontamination, and re-sampling will occur per the experimental and sampling design. This INL-2 Sample Collection Operational Test is being planned by the Validated Sampling Plan Working Group (VSPWG). The primary objectives are: 1) Evaluate judgmental and probabilistic sampling for characterization as well as probabilistic and combined (judgment and probabilistic) sampling approaches for clearance, 2) Conduct these evaluations for gradient contamination (from low or moderate down to absent or undetectable) for different initial concentrations of the contaminant, 3) Explore judgment composite sampling approaches to reduce sample numbers, 4) Collect baseline data to serve as an indication of the actual levels of contamination in the tests. A combined judgmental and random (CJR) approach uses Bayesian methodology to combine judgmental and probabilistic samples to make clearance statements of the form "X% confidence that at least Y% of an area does not contain detectable contamination" (X%/Y% clearance statements). The INL-2 experimental design has five test events, which 1) vary the floor of the INL building on which the contaminant will be released, 2) provide for varying the amount of contaminant released to obtain desired concentration gradients, and 3) investigate overt as well as covert release of contaminants. Desirable contaminant gradients would have moderate to low concentrations of contaminant in rooms near the release point, with concentrations down to zero in other rooms. Such gradients would provide a range of contamination levels to challenge the sampling
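
    For scale, the classical purely probabilistic version of an X%/Y% clearance statement (all random samples non-detect, no judgmental information) requires n ≥ ln(1 − X)/ln(Y) samples; reducing that count is a chief motivation for the Bayesian CJR approach. A sketch of the classical bound, which is our illustration and not the report's CJR calculation:

```python
import math

def n_clean_samples(confidence, coverage):
    """Number of all-negative random samples needed to state, with the given
    confidence, that at least `coverage` of the area is free of detectable
    contamination (classical nonparametric bound: n >= ln(1-X)/ln(Y))."""
    return math.ceil(math.log(1 - confidence) / math.log(coverage))

n_95_99 = n_clean_samples(0.95, 0.99)   # "95%/99%" clearance statement
n_95_95 = n_clean_samples(0.95, 0.95)   # "95%/95%" clearance statement
```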

  1. Acceleration to failure in geophysical signals prior to laboratory rock failure and volcanic eruptions (Invited)

    Science.gov (United States)

    Main, I. G.; Bell, A. F.; Greenhough, J.; Heap, M. J.; Meredith, P. G.

    2010-12-01

    The nucleation processes that ultimately lead to earthquakes, volcanic eruptions, rock bursts in mines, and landslides from cliff slopes are likely to be controlled at some scale by brittle failure of the Earth’s crust. In laboratory brittle deformation experiments geophysical signals commonly exhibit an accelerating trend prior to dynamic failure. Similar signals have been observed prior to volcanic eruptions, including volcano-tectonic earthquake event and moment release rates. Despite a large amount of effort in the search, no such statistically robust systematic trend is found prior to natural earthquakes. Here we describe the results of a suite of laboratory tests on Mount Etna Basalt and other rocks to examine the nature of the non-linear scaling from laboratory to field conditions, notably using laboratory ‘creep’ tests to reduce the boundary strain rate to conditions more similar to those in the field. Seismic event rate, seismic moment release rate and rate of porosity change show a classic ‘bathtub’ graph that can be derived from a simple damage model based on separate transient and accelerating sub-critical crack growth mechanisms, resulting from separate processes of negative and positive feedback in the population dynamics. The signals exhibit clear precursors based on formal statistical model tests using maximum likelihood techniques with Poisson errors. After correcting for the finite loading time of the signal, the results show a transient creep rate that decays as a classic Omori law for earthquake aftershocks, and remarkably with an exponent near unity, as commonly observed for natural earthquake sequences. The accelerating trend follows an inverse power law when fitted in retrospect, i.e. with prior knowledge of the failure time. In contrast the strain measured on the sample boundary shows a less obvious but still accelerating signal that is often absent altogether in natural strain data prior to volcanic eruptions. To test the
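
    The retrospective inverse power-law fit described above underlies the classical "failure forecast method": if the rate accelerates as R(t) = k/(t_f − t) (power-law exponent 1), the inverse rate 1/R decays linearly and extrapolates to zero at the failure time t_f. A sketch on synthetic data (values chosen by us for illustration):

```python
def forecast_failure_time(times, rates):
    """Failure-forecast method for an inverse power law with exponent 1:
    fit a least-squares line to the inverse rate 1/R versus time and
    return the time where it extrapolates to zero (the forecast t_f)."""
    inv = [1.0 / r for r in rates]
    n = len(times)
    mt, mi = sum(times) / n, sum(inv) / n
    slope = (sum((t - mt) * (y - mi) for t, y in zip(times, inv))
             / sum((t - mt) ** 2 for t in times))
    intercept = mi - slope * mt
    return -intercept / slope          # zero crossing of the fitted line

# Synthetic precursory sequence with true t_f = 100 and k = 50
times = [0, 20, 40, 60, 80]
rates = [50 / (100 - t) for t in times]
t_f = forecast_failure_time(times, rates)
```

    In practice the fit is only this clean in retrospect; as the abstract notes, prospective forecasts are confounded by the transient (Omori-like) component and by noise in natural signals.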

  2. Testing the homogeneity of candidate reference materials by solid sampling - AAS and INAA

    International Nuclear Information System (INIS)

    Rossbach, M.; Grobecker, K.-H.

    2002-01-01

    The necessity to quantify a natural material's homogeneity with respect to its elemental distribution, prior to chemical analysis of a given aliquot, is emphasised. Available instruments and methods to obtain the relevant information are described. Additionally, the calculation of element-specific relative homogeneity factors, H_E, and of a minimum sample mass M_5% needed to achieve 5% precision at a 95% confidence level is given. Especially in the production and certification of Certified Reference Materials (CRMs), this characteristic information should be determined in order to provide the user with additional inherent properties of the CRM, to enable more economical use of the expensive material and to evaluate further systematic bias of the applied analytical technique. (author)
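    A sketch of how such factors are commonly computed in the solid-sampling literature. We assume the Kurfürst-type definition H_E = s_H·√m here; the paper's exact formulation, and its treatment of the 95% confidence level, may differ:

```python
import math

# Element-specific homogeneity factor, assuming the Kurfuerst-type
# definition H_E = s_H * sqrt(m), where s_H is the sampling RSD (%)
# observed at sample mass m (in mg). Illustrative, not the paper's
# exact formulation.
def homogeneity_factor(rsd_percent, mass_mg):
    return rsd_percent * math.sqrt(mass_mg)

def minimum_mass(h_e, target_rsd_percent=5.0):
    # From s(m) = H_E / sqrt(m): the mass at which sampling RSD = target.
    return (h_e / target_rsd_percent) ** 2

h = homogeneity_factor(10.0, 1.0)   # 10% RSD measured on 1 mg aliquots
print(h, minimum_mass(h))           # → 10.0 4.0 (i.e. 4 mg for 5% precision)
```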

  3. Use of standard laboratory methods to obviate routine dithiothreitol treatment of blood samples with daratumumab interference.

    Science.gov (United States)

    Lintel, Nicholas J; Brown, Debra K; Schafer, Diane T; Tsimba-Chitsva, Farai M; Koepsell, Scott A; Shunkwiler, Sara M

    2017-01-01

    Daratumumab is an antibody currently used in the treatment of patients with refractory multiple myeloma. Blood samples from patients being treated with daratumumab may show panreactivity during pre-transfusion testing. To facilitate the provision of blood components for such patients, it is recommended that a baseline phenotype or genotype be established prior to starting treatment with daratumumab. If patient red blood cells (RBCs) require phenotyping after the start of daratumumab treatment, dithiothreitol (DTT) treatment of the patient's RBCs should be performed. The medical charts of four patients treated with daratumumab were reviewed. The individual number of doses ranged from 1 to 14; patient age ranged from 55 to 78 years; two men and two women were included in the review. Type and screen data were obtained from samples collected over 33 encounters with a range of 1 to 13 encounters per patient. All samples were tested initially by automated solid-phase testing. Any reactivity with solid phase led to tube testing with either low-ionic-strength saline, polyethylene glycol, or both. If incubation failed to eliminate the reactivity, the sample was sent to a reference laboratory for DTT treatment and phenotyping. Of the 33 samples tested, 23 (69.7%) samples had reactivity in solid-phase testing. In 8 of the 10 samples that did not react in solid-phase, testing was conducted more than four half-lives after the last dose of daratumumab. Of the 23 that had reactivity in solid-phase, 16 (69.6%) samples demonstrated loss of reactivity using common laboratory methods. For the seven patients whose sample reactivity was not initially eliminated, six were provided with phenotypically matched blood based on prior molecular testing. Only one sample was sent out for DTT treatment. These results suggest that daratumumab interference with pre-transfusion testing can be addressed using common laboratory methods. 
This finding could save time and money for laboratories that do

  4. LC-MS analysis of the plasma metabolome–a novel sample preparation strategy

    DEFF Research Database (Denmark)

    Skov, Kasper; Hadrup, Niels; Smedsgaard, Jørn

    2015-01-01

    Blood plasma is a well-known body fluid often analyzed in studies on the effects of toxic compounds, as physiologically or chemically induced changes in the mammalian body are reflected in the plasma metabolome. Sample preparation prior to LC-MS based analysis of the plasma metabolome is a challenge...... as plasma contains compounds with very different properties. Besides proteins, which usually are precipitated with organic solvent, phospholipids are known to cause ion suppression in electrospray mass spectrometry. We have compared two different sample preparation techniques prior to LC-qTOF analysis...... of plasma samples: The first is protein precipitation; the second is protein precipitation followed by solid phase extraction with sub-fractionation into three sub-samples: a phospholipid, a lipid and a polar sub-fraction. Molecular feature extraction of the data files from LC-qTOF analysis of the samples...

  5. The effect of prior lumbar surgeries on the flexion relaxation phenomenon and its responsiveness to rehabilitative treatment.

    Science.gov (United States)

    Neblett, Randy; Mayer, Tom G; Brede, Emily; Gatchel, Robert J

    2014-06-01

    Abnormal pretreatment flexion-relaxation in chronic disabling occupational lumbar spinal disorder patients has been shown to improve with functional restoration rehabilitation. Little is known about the effects of prior lumbar surgeries on flexion-relaxation and its responsiveness to treatment. To quantify the effect of prior lumbar surgeries on the flexion-relaxation phenomenon and its responsiveness to rehabilitative treatment. A prospective cohort study of chronic disabling occupational lumbar spinal disorder patients, including those with and without prior lumbar spinal surgeries. A sample of 126 chronic disabling occupational lumbar spinal disorder patients with prior work-related injuries entered an interdisciplinary functional restoration program and agreed to enroll in this study. Fifty-seven patients had undergone surgical decompression or discectomy (n=32) or lumbar fusion (n=25), and the rest had no history of prior injury-related spine surgery (n=69). At post-treatment, 116 patients were reevaluated, including those with prior decompressions or discectomies (n=30), lumbar fusions (n=21), and no surgery (n=65). A comparison group of 30 pain-free control subjects was tested with an identical assessment protocol, and compared with post-rehabilitation outcomes. Mean surface electromyography (SEMG) at maximum voluntary flexion; subject achievement of flexion-relaxation (SEMG≤3.5 μV); gross lumbar, true lumbar, and pelvic flexion ROM; and a pain visual analog scale self-report during forward bending task. Identical measures were obtained at pretreatment and post-treatment. Patients entered an interdisciplinary functional restoration program, including a quantitatively directed, medically supervised exercise process and a multimodal psychosocial disability management component. The functional restoration program was accompanied by a SEMG-assisted stretching training program, designed to teach relaxation of the lumbar musculature during end-range flexion

  6. Degradation of hydrocarbons in soil samples analyzed within accepted analytical holding times

    International Nuclear Information System (INIS)

    Jackson, J.; Thomey, N.; Dietlein, L.F.

    1992-01-01

    Samples which are collected in conjunction with subsurface investigations at leaking petroleum storage tank sites and petroleum refineries are routinely analyzed for benzene, toluene, ethylbenzene, xylenes (BTEX), and total petroleum hydrocarbons (TPH). Water samples are preserved by the addition of hydrochloric acid and maintained at four degrees centigrade prior to analysis. This is done to prevent bacterial degradation of hydrocarbons. Chemical preservation is not presently performed on soil samples. Instead, the samples are cooled and maintained at four degrees centigrade. This study was done to measure the degree of degradation of hydrocarbons in soil samples which are analyzed within accepted holding times. Soil samples were collected and representative subsamples were prepared from the initial sample. Subsamples were analyzed in triplicate for BTEX and TPH throughout the length of the approved holding times to measure the extent of sample constituent degradation prior to analysis. Findings imply that for sandy soils, BTEX and TPH concentrations can be highly dependent upon the length of time which elapses between sample collection and analysis

  7. Magnetic headspace adsorptive extraction of chlorobenzenes prior to thermal desorption gas chromatography-mass spectrometry

    International Nuclear Information System (INIS)

    Vidal, Lorena; Ahmadi, Mazaher; Fernández, Elena; Madrakian, Tayyebeh; Canals, Antonio

    2017-01-01

    This study presents a new, user-friendly, cost-effective and portable headspace solid-phase extraction technique based on graphene oxide decorated with iron oxide magnetic nanoparticles as sorbent, located on one end of a small neodymium magnet. Hence, the new headspace solid-phase extraction technique has been called Magnetic Headspace Adsorptive Extraction (Mag-HSAE). In order to assess the applicability of the Mag-HSAE technique to model analytes, some chlorobenzenes were extracted from water samples prior to gas chromatography-mass spectrometry determination. A multivariate approach was employed to optimize the experimental parameters affecting Mag-HSAE. The method was evaluated under optimized extraction conditions (i.e., sample volume, 20 mL; extraction time, 30 min; sorbent amount, 10 mg; stirring speed, 1500 rpm; and ionic strength, non-significant), obtaining a linear response from 0.5 to 100 ng L⁻¹ for 1,3-DCB, 1,4-DCB, 1,2-DCB, 1,3,5-TCB, 1,2,4-TCB and 1,2,3-TCB; from 0.5 to 75 ng L⁻¹ for 1,2,4,5-TeCB and PeCB; and from 1 to 75 ng L⁻¹ for 1,2,3,4-TeCB. The repeatability of the proposed method was evaluated at 10 ng L⁻¹ and 50 ng L⁻¹ spiking levels, and coefficients of variation ranged between 1.5 and 9.5% (n = 5). Limits of detection were found between 93 and 301 pg L⁻¹. Finally, tap, mineral and effluent water were selected as real water samples to assess method applicability. Relative recoveries varied between 86 and 110%, showing negligible matrix effects. - Highlights: • A new extraction technique named Magnetic Headspace Adsorptive Extraction is presented. • Graphene oxide/iron oxide composite deposited on a neodymium magnet as sorbent. • Sorbent of low cost, rapid and simple synthesis, easy manipulation and portability. • Fast and efficient extraction and sensitive determination of chlorobenzenes in water samples.

  8. Electromembrane extraction of biogenic amines in food samples by a microfluidic-chip system followed by dabsyl derivatization prior to high performance liquid chromatography analysis.

    Science.gov (United States)

    Zarghampour, Fereshteh; Yamini, Yadollah; Baharfar, Mahroo; Faraji, Mohammad

    2018-06-29

    In the present research, an on-chip electromembrane extraction coupled with high performance liquid chromatography was developed for monitoring trace levels of biogenic amines (BAs), including histamine, tryptamine, putrescine, cadaverine and spermidine, in food samples. A porous polypropylene sheet membrane impregnated with an organic solvent was placed between the two parts of the chip device to separate the channels. Two platinum electrodes were mounted at the bottom of these channels and connected to a power supply, providing the electrical driving force for migration of ionized analytes from the sample solution through the porous sheet membrane into the acceptor phase. BAs were extracted from 2 mL aqueous sample solutions at neutral pH into 50 μL of acidified (HCl, 90 mM) acceptor solution. A supported liquid membrane (SLM) of NPOE containing 10% DEHP was used to ensure efficient extraction. A low voltage of 40 V was applied over the SLM during the extraction time. The influences of the fundamental parameters affecting the transport of BAs were optimized. Under the optimized conditions, the relative standard deviations based on four replicate measurements were less than 8.0%, and limits of detection were in the range of 3.0-8.0 μg L⁻¹. Finally, the method was successfully applied to determine BAs in food samples, and satisfactory results (recoveries > 95.6%) were obtained. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. Improving Open Access through Prior Learning Assessment

    Science.gov (United States)

    Yin, Shuangxu; Kawachi, Paul

    2013-01-01

    This paper explores and presents new data on how to improve open access in distance education through using prior learning assessments. Broadly there are three types of prior learning assessment (PLAR): Type-1 for prospective students to be allowed to register for a course; Type-2 for current students to avoid duplicating work-load to gain…

  10. Iterative importance sampling algorithms for parameter estimation

    OpenAIRE

    Morzfeld, Matthias; Day, Marcus S.; Grout, Ray W.; Pau, George Shu Heng; Finsterle, Stefan A.; Bell, John B.

    2016-01-01

    In parameter estimation problems one computes a posterior distribution over uncertain parameters defined jointly by a prior distribution, a model, and noisy data. Markov Chain Monte Carlo (MCMC) is often used for the numerical solution of such problems. An alternative to MCMC is importance sampling, which can exhibit near perfect scaling with the number of cores on high performance computing systems because samples are drawn independently. However, finding a suitable proposal distribution is ...
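    The alternative the abstract describes can be sketched in a few lines: self-normalized importance sampling of a posterior mean. The conjugate-normal toy setup below is our illustrative choice (the proposal is simply the prior, which is rarely optimal in practice):

```python
import numpy as np

# Self-normalized importance sampling for a posterior mean, avoiding MCMC.
# Toy conjugate setup (assumed for illustration): prior theta ~ N(0, 1),
# one observation y ~ N(theta, 1); the analytic posterior mean is y/2.
rng = np.random.default_rng(0)
y = 1.0

# Proposal: the prior itself. Draws are independent, which is what gives
# importance sampling its near-perfect parallel scaling.
samples = rng.normal(0.0, 1.0, size=200_000)

# Unnormalized weight = likelihood (the prior cancels with the proposal).
log_w = -0.5 * (y - samples) ** 2
w = np.exp(log_w - log_w.max())     # subtract max for numerical stability
w /= w.sum()

post_mean = np.sum(w * samples)
print(post_mean)                    # close to the analytic value 0.5
```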

  11. Phantom experiments using soft-prior regularization EIT for breast cancer imaging.

    Science.gov (United States)

    Murphy, Ethan K; Mahara, Aditya; Wu, Xiaotian; Halter, Ryan J

    2017-06-01

    A soft-prior regularization (SR) electrical impedance tomography (EIT) technique for breast cancer imaging is described, which shows an ability to accurately reconstruct tumor/inclusion conductivity values within a dense breast model investigated using a cylindrical and a breast-shaped tank. The SR-EIT method relies on knowing the spatial location of a suspicious lesion initially detected from a second imaging modality. Standard approaches (using Laplace smoothing and total variation regularization) without prior structural information are unable to accurately reconstruct or detect the tumors. The soft-prior approach represents a very significant improvement to these standard approaches, and has the potential to improve conventional imaging techniques, such as automated whole breast ultrasound (AWB-US), by providing electrical property information of suspicious lesions to improve AWB-US's ability to discriminate benign from cancerous lesions. Specifically, the best soft-regularization technique found average absolute tumor/inclusion errors of 0.015 S m -1 for the cylindrical test and 0.055 S m -1 and 0.080 S m -1 for the breast-shaped tank for 1.8 cm and 2.5 cm inclusions, respectively. The standard approaches were statistically unable to distinguish the tumor from the mammary gland tissue. An analysis of false tumors (benign suspicious lesions) provides extra insight into the potential and challenges EIT has for providing clinically relevant information. The ability to obtain accurate conductivity values of a suspicious lesion (>1.8 cm) detected from another modality (e.g. AWB-US) could significantly reduce false positives and result in a clinically important technology.
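    The core idea, relaxing the regularization penalty inside a region flagged by another modality, can be sketched on a toy linear problem. Everything below (the blurring forward map, sizes, weights) is an illustrative assumption; real EIT reconstruction uses a nonlinear FEM forward model:

```python
import numpy as np

# Sketch of "soft-prior" (spatially weighted) Tikhonov regularization on a
# toy 1-D linear inverse problem. Matrix A, profile and weights are
# hypothetical; real EIT uses a nonlinear FEM forward model.
n = 40
idx = np.arange(n)
A = np.exp(-(idx[:, None] - idx[None, :]) ** 2 / 8.0)  # blurring forward map
A /= A.sum(axis=1, keepdims=True)

x_true = np.full(n, 0.2)            # background conductivity (S/m)
x_true[15:20] = 1.0                 # inclusion ("tumor")
b = A @ x_true

def invert(weights, lam=0.05):
    # Tikhonov solve: min ||A x - b||^2 + lam * ||W x||^2,
    # where the diagonal W carries per-cell penalty weights.
    W = np.diag(weights)
    return np.linalg.solve(A.T @ A + lam * W.T @ W, A.T @ b)

w_uniform = np.ones(n)
w_soft = np.ones(n)
w_soft[15:20] = 0.05                # relax the penalty where a lesion is suspected
x_u, x_s = invert(w_uniform), invert(w_soft)
print(x_s[15:20].mean() > x_u[15:20].mean())  # soft prior keeps more inclusion contrast
```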

  12. Obtaining nanofibers from curauá and sugarcane bagasse fibers using enzymatic hydrolysis followed by sonication

    DEFF Research Database (Denmark)

    Campos, Adriana de; Correa, Ana Carolina; Cannella, David

    2013-01-01

    This paper is an initial study of the implementation of two new enzymes, an endoglucanase and a concoction of hemicellulases and pectinases to obtain cellulosic nanoparticles. In this study, curauá and sugarcane bagasse were dewaxed and bleached prior to enzymatic action for 72 h at 50 °C, and th...

  13. Sago-Type Palms Were an Important Plant Food Prior to Rice in Southern Subtropical China

    Science.gov (United States)

    Yang, Xiaoyan; Barton, Huw J.; Wan, Zhiwei; Li, Quan; Ma, Zhikun; Li, Mingqi; Zhang, Dan; Wei, Jun

    2013-01-01

    Poor preservation of plant macroremains in the acid soils of southern subtropical China has hampered understanding of prehistoric diets in the region and of the spread of domesticated rice southwards from the Yangtze River region. According to records in ancient books and archaeological discoveries from historical sites, it is presumed that roots and tubers were the staple plant foods in this region before rice agriculture was widely practiced, but no direct evidence has been provided to test this hypothesis. Here we present evidence from starch and phytolith analyses of samples obtained during systematic excavations at the site of Xincun on the southern coast of China, demonstrating that during 3,350-2,470 BC humans exploited sago palms, bananas, freshwater roots and tubers, fern roots, acorns, Job's-tears as well as wild rice. A dominance of starches and phytoliths from palms suggests that sago-type palms were an important plant food prior to rice in southern subtropical China. We also believe that, because of this reliance on a wide range of starch-rich plant foods, the transition towards labour-intensive rice agriculture was a slow process. PMID:23667584

  14. Rotary Mode Core Sample System availability improvement

    International Nuclear Information System (INIS)

    Jenkins, W.W.; Bennett, K.L.; Potter, J.D.; Cross, B.T.; Burkes, J.M.; Rogers, A.C.

    1995-01-01

    The Rotary Mode Core Sample System (RMCSS) is used to obtain stratified samples of the waste deposits in single-shell and double-shell waste tanks at the Hanford Site. The samples are used to characterize the waste in support of ongoing and future waste remediation efforts. Four sampling trucks have been developed to obtain these samples. Truck 1 was the first in operation and is currently used to obtain samples where the push mode is appropriate (i.e., no rotation of the drill). Truck 2 is similar to truck 1, except for added safety features, and is in operation to obtain samples using either push mode or rotary drill mode. Trucks 3 and 4 are now being fabricated to be essentially identical to truck 2.

  15. Training shortest-path tractography: Automatic learning of spatial priors

    DEFF Research Database (Denmark)

    Kasenburg, Niklas; Liptrot, Matthew George; Reislev, Nina Linde

    2016-01-01

    Tractography is the standard tool for automatic delineation of white matter tracts from diffusion weighted images. However, the output of tractography often requires post-processing to remove false positives and ensure a robust delineation of the studied tract, and this demands expert prior...... knowledge. Here we demonstrate how such prior knowledge, or indeed any prior spatial information, can be automatically incorporated into a shortest-path tractography approach to produce more robust results. We describe how such a prior can be automatically generated (learned) from a population, and we...

  16. Acquisition of multiple prior distributions in tactile temporal order judgment

    Directory of Open Access Journals (Sweden)

    Yasuhito eNagai

    2012-08-01

    The Bayesian estimation theory proposes that the brain acquires the prior distribution of a task and integrates it with sensory signals to minimize the effect of sensory noise. Psychophysical studies have demonstrated that our brain actually implements Bayesian estimation in a variety of sensory-motor tasks. However, these studies only imposed one prior distribution on participants within a task period. In this study, we investigated the conditions that enable the acquisition of multiple prior distributions in the temporal order judgment (TOJ) of two tactile stimuli across the hands. In Experiment 1, stimulation intervals were randomly selected from one of two prior distributions (biased to right hand earlier and biased to left hand earlier) in association with color cues (green and red, respectively). Although the acquisition of the two priors was not enabled by the color cues alone, it was significant when participants shifted their gaze (above or below) in response to the color cues. However, the acquisition of multiple priors was not significant when participants moved their mouths (opened or closed). In Experiment 2, the spatial cues (above and below) were used to identify whether eye position or retinal cue position was crucial for the eye-movement-dependent acquisition of multiple priors in Experiment 1. The acquisition of the two priors was significant when participants moved their gaze to the cues (i.e., the cue positions on the retina were constant across the priors), as well as when participants did not shift their gaze (i.e., the cue positions on the retina changed according to the priors). Thus, both eye and retinal cue positions were effective in acquiring multiple priors. Based on previous neurophysiological reports, we discuss possible neural correlates that contribute to the acquisition of multiple priors.
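    The Bayesian-estimation account underlying such experiments is precision-weighted averaging of a Gaussian prior with a noisy sensory measurement. A minimal sketch (the millisecond values are illustrative, not fitted to this study's data):

```python
def bayes_combine(mu_prior, var_prior, x_obs, var_obs):
    """Precision-weighted fusion of a Gaussian prior with a noisy
    observation -- the standard Bayesian-estimation model used for
    tasks like temporal order judgment. Illustrative, not the paper's
    fitted model."""
    w = (1 / var_prior) / (1 / var_prior + 1 / var_obs)
    mu = w * mu_prior + (1 - w) * x_obs
    var = 1 / (1 / var_prior + 1 / var_obs)
    return mu, var

# Prior biased to "right hand earlier" (+20 ms) vs a sensed interval of -10 ms:
mu, var = bayes_combine(20.0, 100.0, -10.0, 300.0)
print(round(mu, 1), round(var, 1))  # → 12.5 75.0
```

The estimate is pulled toward the prior mean, and the posterior variance is smaller than either source alone, which is the signature of Bayesian integration these studies test for.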

  17. Field sampling for monitoring migration and defining the areal extent of chemical contamination

    International Nuclear Information System (INIS)

    Thomas, J.M.; Skalski, J.R.; Eberhardt, L.L.; Simmons, M.A.

    1984-11-01

    Initial research on compositing, field designs, and site mapping oriented toward detecting spills and migration at commercial low-level radioactive or chemical waste sites is summarized. Results indicate that the significance test developed to detect samples containing high levels of contamination when they are mixed with several other samples below detectable limits (composites), will be highly effective with large sample sizes when contaminant levels frequently or greatly exceed a maximum acceptable level. These conditions of frequent and high contaminant levels are most likely to occur in regions of a commercial waste site where the priors (previous knowledge) about a spill or migration are highest. Conversely, initial investigations of Bayes sampling strategies suggest that field sampling efforts should be inversely proportional to the priors (expressed as probabilities) for the occurrence of contamination
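    The closing suggestion, that field sampling effort be inversely proportional to the prior probability of contamination, can be sketched as a simple allocation rule (the normalization scheme below is our illustrative choice, not the report's):

```python
# Allocate a fixed sampling budget across regions inversely proportional
# to each region's prior probability of contamination, as the abstract's
# Bayes sampling strategy suggests. Normalization is an illustrative choice.
def allocate_inverse(total, priors):
    inv = [1.0 / p for p in priors]
    s = sum(inv)
    return [round(total * v / s) for v in inv]

# Three regions: high, medium and low prior probability of contamination.
print(allocate_inverse(16, [0.5, 0.25, 0.1]))  # → [2, 4, 10]
```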

  18. A sampling-based approach to probabilistic pursuit evasion

    KAUST Repository

    Mahadevan, Aditya; Amato, Nancy M.

    2012-01-01

    Probabilistic roadmaps (PRMs) are a sampling-based approach to motion-planning that encodes feasible paths through the environment using a graph created from a subset of valid positions. Prior research has shown that PRMs can be augmented

  19. Hybrid nested sampling algorithm for Bayesian model selection applied to inverse subsurface flow problems

    KAUST Repository

    Elsheikh, Ahmed H.; Wheeler, Mary Fanett; Hoteit, Ibrahim

    2014-01-01

    A Hybrid Nested Sampling (HNS) algorithm is proposed for efficient Bayesian model calibration and prior model selection. The proposed algorithm combines the Nested Sampling (NS) algorithm, Hybrid Monte Carlo (HMC) sampling and gradient estimation using
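    The NS component can be sketched in a few lines on a toy problem. Our choice of prior and likelihood is purely illustrative, and where the hybrid algorithm would use HMC moves to draw from the constrained prior, the 1-D toy admits a direct draw:

```python
import math, random

# Minimal Nested Sampling sketch for the Bayesian evidence, the quantity
# NS-based model selection compares across models. Toy problem (our choice,
# not the paper's subsurface-flow setting): uniform prior on [-5, 5] and
# Gaussian likelihood exp(-x^2/2); analytic evidence = sqrt(2*pi)/10 ~ 0.2507.
random.seed(7)

def loglike(x):
    return -0.5 * x * x

N = 400                                   # live points
live = [random.uniform(-5, 5) for _ in range(N)]

Z, X_prev = 0.0, 1.0
for i in range(1, 2001):
    worst = min(range(N), key=lambda k: loglike(live[k]))
    a = abs(live[worst])
    X = math.exp(-i / N)                  # expected prior-mass shrinkage
    Z += math.exp(loglike(live[worst])) * (X_prev - X)
    X_prev = X
    # The constrained prior {L > L*} is exactly the interval (-a, a) here,
    # so we can draw the replacement directly; in high dimensions this is
    # where HMC-style moves would come in.
    live[worst] = random.uniform(-a, a)

Z += X_prev * sum(math.exp(loglike(x)) for x in live) / N   # leftover live mass
print(round(Z, 2))                        # close to the analytic 0.25
```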

  20. Sampling procedure, receipt and conservation of water samples to determine environmental radioactivity

    International Nuclear Information System (INIS)

    Herranz, M.; Navarro, E.; Payeras, J.

    2009-01-01

    The present document describes the essential goals, processes and contents that the subgroups Sampling and Sample Preparation and Conservation believe should be part of the procedure for correct sampling, receipt, conservation and preparation of samples of continental, marine and waste water prior to determining their radioactive content.

  1. A Measure of Uncertainty regarding the Interval Constraint of Normal Mean Elicited by Two Stages of a Prior Hierarchy

    Directory of Open Access Journals (Sweden)

    Hea-Jung Kim

    2014-01-01

    This paper considers a hierarchical screened Gaussian model (HSGM) for Bayesian inference of normal models when an interval constraint in the mean parameter space needs to be incorporated in the modeling but such a restriction is uncertain. An objective measure of the uncertainty regarding the interval constraint, accounted for by using the HSGM, is proposed for the Bayesian inference. For this purpose, we derive a maximum entropy prior for the normal mean that elicits the uncertainty regarding the interval constraint, and then obtain the uncertainty measure by considering the relationship between the maximum entropy prior and the marginal prior of the normal mean in the HSGM. A Bayesian estimation procedure for the HSGM is developed, and two numerical illustrations pertaining to the properties of the uncertainty measure are provided.

  2. Pumping time required to obtain tube well water samples with aquifer characteristic radon concentrations

    International Nuclear Information System (INIS)

    Ricardo, Carla Pereira; Oliveira, Arno Heeren de

    2011-01-01

    Radon is an inert noble gas which comes from the natural radioactive decay of uranium and thorium in soil, rock and water. Radon isotopes emanated from radium-bearing grains of a rock or soil are released into the pore space, where radon is partitioned between the gaseous and aqueous phases. Thus, groundwater carries a radon signature from the rock that is characteristic of the aquifer. The characteristic radon concentration of an aquifer, which is mainly related to emanation, is also influenced by the degree of subsurface degassing, especially in the vicinity of a tube well, where the radon concentration is strongly reduced. To determine the pumping time required to obtain a tube well water sample with the characteristic radon concentration of the aquifer, an experiment was conducted in an 80 m deep tube well. In this experiment, after twenty-four hours without extraction, water samples were collected periodically, at about ten-minute intervals, during two hours of pumping. The radon concentrations of the samples were determined using the RAD7 Electronic Radon Detector from Durridge Company, a solid-state alpha spectrometric detector. It was found that the time necessary to reach the maximum radon concentration, that is, the characteristic radon concentration of the aquifer, is about sixty minutes. (author)
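    The roughly sixty-minute plateau is consistent with a simple exponential well-bore flushing model; the model and the turnover time below are our assumptions, not quantities fitted in the paper:

```python
import math

# Hypothetical well-bore mixing model (not from the paper): pumping replaces
# degassed well-bore water with aquifer water, so the radon concentration
# approaches the aquifer value as C(t) = C_aq * (1 - exp(-t / tau)).
def time_to_fraction(tau_min, fraction):
    """Pumping time (minutes) needed for C(t) to reach the given
    fraction of the aquifer's characteristic concentration."""
    return -tau_min * math.log(1.0 - fraction)

tau = 20.0                                    # assumed well-bore turnover time, min
print(round(time_to_fraction(tau, 0.95), 1))  # → 59.9, i.e. about an hour
```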

  3. The influence on birthweight of maternal living conditions a decade prior to giving birth

    Directory of Open Access Journals (Sweden)

    John Singhammer

    2009-10-01

    The study’s aim was to correlate measures of mothers’ socio-economic status a decade prior to giving birth with their children’s birthweight. As part of a larger study, information on birth characteristics from 706 babies born 1970-73 was linked with census data obtained from their mothers near the time of birth as well as one decade earlier. The 706 individuals were selected at random from two national surveys in 1998 and 2000 and traced back to the time of birth in the period 1970-73. Information on birth characteristics was linked to census data obtained from the mothers in 1960 and 1970, including information on the parents’ living conditions (e.g. income, type of dwelling, indoor plumbing, telephone, number of people in the household). Information on the mother’s health during pregnancy, a decade before childbirth and near childbirth, and data on the mothers’ and infants’ health at birth were obtained from the Medical Birth Registry of Norway. In analyses that included both early and current socio-economic conditions, maternal education and rural residency at the time of giving birth were observed as statistically significant predictors of birthweight. Results were adjusted for maternal age, parity, plurality, gender and diagnoses before and during pregnancy, all factors observed to attenuate birthweight. Indicators of women’s socio-economic conditions a decade prior to giving birth were not significantly associated with birthweight. These findings do not clearly support suggestions in the literature that an infant’s vitality may be influenced by the family’s socio-economic conditions years before birth.

  4. Obtaining organoclays starting from sodium clays

    International Nuclear Information System (INIS)

    Silva, M.M. da; Mota, M.F.; Oliveira, G.C. de; Rodrigues, M.G.F.

    2012-01-01

    Clays have applications in many areas of technology; moreover, these materials can be modified with organic compounds to obtain more hydrophobic materials for application in the adsorption of organic pollutants. This study aimed to analyze the effects of modifying two sodium clays with quaternary ammonium surfactants, through an ion exchange reaction process, to obtain organoclays. The sodium clay samples and organoclays were characterized by X-ray diffraction (XRD), infrared (IR) spectroscopy, differential thermal and thermogravimetric analysis (DTA/TG) and organic adsorption tests. The results show that the process of obtaining the organoclays is efficient, and the materials have potential for future application in removing organic contaminants. (author)

  5. The Clinical Impact of Cardiology Consultation Prior to Major Vascular Surgery.

    Science.gov (United States)

    Davis, Frank M; Park, Yeo June; Grey, Scott F; Boniakowski, Anna E; Mansour, M Ashraf; Jain, Krishna M; Nypaver, Timothy; Grossman, Michael; Gurm, Hitinder; Henke, Peter K

    2018-01-01

    To understand statewide variation in preoperative cardiology consultation prior to major vascular surgery and to determine whether consultation was associated with differences in perioperative myocardial infarction (poMI). Medical consultation prior to major vascular surgery is obtained to reduce perioperative risk. Despite the perceived benefit of preoperative consultation, evidence specific to major vascular surgery on the effect of preoperative cardiac consultation is lacking. Patient and clinical data were obtained from a statewide vascular surgery registry between January 2012 and December 2014. Patients were risk stratified by revised cardiac risk index category, and poMI was compared between patients who did or did not receive a preoperative cardiology consultation. We then used logistic regression analysis to compare the rate of poMI across hospitals grouped into quartiles by rate of preoperative cardiology consultation. Our study population comprised 5191 patients undergoing open peripheral arterial bypass (n = 3037), open abdominal aortic aneurysm repair (n = 332), or endovascular aneurysm repair (n = 1822) at 29 hospitals. At the patient level, after risk stratification by revised cardiac risk index category, there was no association between cardiac consultation and poMI. At the hospital level, preoperative cardiac consultation varied substantially between hospitals (6.9%-87.5%). Hospitals with the highest rates of consultation (>66%) had a reduction in poMI (OR, 0.52; confidence interval: 0.28-0.98). Preoperative cardiology consultation for vascular surgery varies greatly between institutions and does not appear to impact poMI at the patient level. However, a reduction in poMI was noted at the hospitals with the highest rates of preoperative cardiology consultation as well as a variety of medical services, suggesting that other hospital culture effects play a role.

  6. Estimating Functions with Prior Knowledge, (EFPK) for diffusions

    DEFF Research Database (Denmark)

    Nolsøe, Kim; Kessler, Mathieu; Madsen, Henrik

    2003-01-01

    In this paper a method is formulated in an estimating function setting for parameter estimation, which allows the use of prior information. The main idea is to use prior knowledge of the parameters, either specified as moment restrictions or as a distribution, and use it in the construction of an estimating function. It may be useful when the full Bayesian analysis is difficult to carry out for computational reasons. This is almost always the case for diffusions, which is the focus of this paper, though the method applies in other settings.

  7. Nonlinear Spatial Inversion Without Monte Carlo Sampling

    Science.gov (United States)

    Curtis, A.; Nawaz, A.

    2017-12-01

    High-dimensional, nonlinear inverse or inference problems usually have non-unique solutions. The distribution of solutions are described by probability distributions, and these are usually found using Monte Carlo (MC) sampling methods. These take pseudo-random samples of models in parameter space, calculate the probability of each sample given available data and other information, and thus map out high or low probability values of model parameters. However, such methods would converge to the solution only as the number of samples tends to infinity; in practice, MC is found to be slow to converge, convergence is not guaranteed to be achieved in finite time, and detection of convergence requires the use of subjective criteria. We propose a method for Bayesian inversion of categorical variables such as geological facies or rock types in spatial problems, which requires no sampling at all. The method uses a 2-D Hidden Markov Model over a grid of cells, where observations represent localized data constraining the model in each cell. The data in our example application are seismic properties such as P- and S-wave impedances or rock density; our model parameters are the hidden states and represent the geological rock types in each cell. The observations at each location are assumed to depend on the facies at that location only - an assumption referred to as `localized likelihoods'. However, the facies at a location cannot be determined solely by the observation at that location as it also depends on prior information concerning its correlation with the spatial distribution of facies elsewhere. Such prior information is included in the inversion in the form of a training image which represents a conceptual depiction of the distribution of local geologies that might be expected, but other forms of prior information can be used in the method as desired. 
The method provides direct (pseudo-analytic) estimates of posterior marginal probability distributions over each variable.
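The sampling-free estimates described above rest on classic hidden Markov machinery. As a simplified illustration (the paper uses a 2-D hidden Markov model over a grid of cells; this sketch is a generic 1-D forward-backward pass with hypothetical facies states and observation classes, not the authors' algorithm), posterior marginals can be computed directly without any Monte Carlo sampling:

```python
import numpy as np

def forward_backward(T, E, pi, obs):
    """Posterior marginals P(state_t | all observations) for a 1-D HMM.

    T: (K,K) transition matrix, E: (K,M) emission matrix,
    pi: (K,) initial distribution, obs: sequence of observation indices.
    """
    n, K = len(obs), len(pi)
    alpha = np.zeros((n, K))          # forward messages (normalized)
    beta = np.zeros((n, K))           # backward messages (normalized)
    alpha[0] = pi * E[:, obs[0]]
    alpha[0] /= alpha[0].sum()
    for t in range(1, n):
        alpha[t] = (alpha[t - 1] @ T) * E[:, obs[t]]
        alpha[t] /= alpha[t].sum()
    beta[-1] = 1.0
    for t in range(n - 2, -1, -1):
        beta[t] = T @ (E[:, obs[t + 1]] * beta[t + 1])
        beta[t] /= beta[t].sum()
    post = alpha * beta
    return post / post.sum(axis=1, keepdims=True)

# Two facies (states), two observation classes; the sticky transition matrix
# plays the role of prior spatial-correlation information.
T = np.array([[0.9, 0.1], [0.1, 0.9]])
E = np.array([[0.8, 0.2], [0.2, 0.8]])   # localized likelihoods per cell
pi = np.array([0.5, 0.5])
marg = forward_backward(T, E, pi, [0, 0, 1, 1])
```

The forward-backward recursion is exact for a chain; the paper's contribution is extending this style of pseudo-analytic marginalization to 2-D grids with training-image priors.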

  8. Scientific guidelines for preservation of samples collected from Mars

    International Nuclear Information System (INIS)

    Gooding, J.L.

    1990-04-01

    The maximum scientific value of Martian geologic and atmospheric samples is retained when the samples are preserved in the conditions that applied prior to their collection. Any sample degradation equates to loss of information. Based on detailed review of pertinent scientific literature, and advice from experts in planetary sample analysis, number values are recommended for key parameters in the environmental control of collected samples with respect to material contamination, temperature, head-space gas pressure, ionizing radiation, magnetic fields, and acceleration/shock. Parametric values recommended for the most sensitive geologic samples should also be adequate to preserve any biogenic compounds or exobiological relics

  9. Use of a holder-vacuum tube device to save on-site hands in preparing urine samples for head-space gas-chromatography, and its application to determine the time allowance for sample sealing.

    Science.gov (United States)

    Kawai, Toshio; Sumino, Kimiaki; Ohashi, Fumiko; Ikeda, Masayuki

    2011-01-01

    To facilitate urine sample preparation prior to head-space gas-chromatographic (HS-GC) analysis, urine samples containing one of five solvents (acetone, methanol, methyl ethyl ketone, methyl isobutyl ketone and toluene) at the levels of biological exposure limits were aspirated into a vacuum tube via a holder, a device commercially available for venous blood collection (the vacuum tube method). The urine sample, 5 ml, was quantitatively transferred to a 20-ml head-space vial prior to HS-GC analysis. The loaded tubes were stored at +4 °C in the dark for up to 3 d. The vacuum tube method facilitated on-site procedures of urine sample preparation for HS-GC with no significant loss of solvents in the sample and no need for skilled hands, while on-site sample preparation time was significantly reduced. Furthermore, no loss of solvents was detected during the 3-d storage, irrespective of hydrophilic (acetone) or lipophilic (toluene) solvent. In a pilot application, the high sealing performance of the vacuum tube method confirmed that no solvent is lost when sealing is completed within 5 min after urine voiding, and that the allowance time is as long as 30 min in the case of toluene in urine. The use of the holder-vacuum tube device not only saves manual steps in transferring the sample to an air-tight space, but also facilitates sample storage prior to HS-GC analysis.

  10. COMPARISON OF GINGER (Zingiber officinale Roscoe) OLEORESIN OBTAINED WITH ETHANOL AND ISOPROPANOL WITH THAT OBTAINED WITH PRESSURIZED CO2

    Directory of Open Access Journals (Sweden)

    Lia P. NOBREGA

    1997-12-01

    Full Text Available Ginger (Zingiber officinale Roscoe) belongs to the Zingiberaceae family. It is a spice of great commercial importance. In this work, ginger oleoresin was obtained with ethanol, isopropanol and liquid carbon dioxide, and the chemical compositions of the extracts were compared with each other. All oleoresin samples contained monoterpenes and sesquiterpenes. Carboxylic acids were found in the organic-solvent extracts for an extraction time of 2 hours. The components responsible for the pungent characteristic of the oleoresin, the gingerols, were detected in samples obtained with organic solvents for extraction times of 6 hours and in samples obtained with liquid CO2 for extraction times of 2 hours.

  11. Arthur Prior and 'Now'

    DEFF Research Database (Denmark)

    Blackburn, Patrick Rowan; Jørgensen, Klaus Frovin

    2016-01-01

    Prior's search led him through the work of Castañeda, and back to his own work on hybrid logic: the first made temporal reference philosophically respectable, the second made it technically feasible in a modal framework. With the aid of hybrid logic, Prior built a bridge from a two-dimensional UT calculus...

  12. Heuristics as Bayesian inference under extreme priors.

    Science.gov (United States)

    Parpart, Paula; Jones, Matt; Love, Bradley C

    2018-05-01

    Simple heuristics are often regarded as tractable decision strategies because they ignore a great deal of information in the input data. One puzzle is why heuristics can outperform full-information models, such as linear regression, which make full use of the available information. These "less-is-more" effects, in which a relatively simpler model outperforms a more complex model, are prevalent throughout cognitive science, and are frequently argued to demonstrate an inherent advantage of simplifying computation or ignoring information. In contrast, we show at the computational level (where algorithmic restrictions are set aside) that it is never optimal to discard information. Through a formal Bayesian analysis, we prove that popular heuristics, such as tallying and take-the-best, are formally equivalent to Bayesian inference under the limit of infinitely strong priors. Varying the strength of the prior yields a continuum of Bayesian models with the heuristics at one end and ordinary regression at the other. Critically, intermediate models perform better across all our simulations, suggesting that down-weighting information with the appropriate prior is preferable to entirely ignoring it. Rather than because of their simplicity, our analyses suggest heuristics perform well because they implement strong priors that approximate the actual structure of the environment. We end by considering how new heuristics could be derived by infinitely strengthening the priors of other Bayesian models. These formal results have implications for work in psychology, machine learning and economics. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
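The continuum between ordinary regression and an infinitely strong prior can be made concrete with a small numerical sketch. This is illustrative only, not the paper's exact construction: a Gaussian prior centred on equal unit weights stands in for a tallying heuristic, and all data are synthetic.

```python
import numpy as np

def ridge_toward_prior(X, y, w0, lam):
    """MAP weights under a Gaussian prior centred on w0 with strength lam.

    Minimizes ||X w - y||^2 + lam * ||w - w0||^2 in closed form.
    """
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y + lam * w0)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
true_w = np.array([2.0, 1.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=50)

w0 = np.ones(3)                                # "tallying": equal unit weight per cue
w_ols = ridge_toward_prior(X, y, w0, 0.0)      # prior strength zero: ordinary regression
w_tally = ridge_toward_prior(X, y, w0, 1e8)    # infinitely strong prior: the heuristic
```

Sweeping `lam` between the two extremes traces out the continuum of intermediate models the abstract refers to; which point performs best out of sample depends on how well the prior mean matches the environment.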

  13. Efficiently sampling conformations and pathways using the concurrent adaptive sampling (CAS) algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Surl-Hee; Grate, Jay W.; Darve, Eric F.

    2017-08-21

    Molecular dynamics (MD) simulations are useful in obtaining thermodynamic and kinetic properties of bio-molecules but are limited by the timescale barrier, i.e., we may be unable to efficiently obtain properties because we need to run microseconds or longer simulations using femtosecond time steps. While there are several existing methods to overcome this timescale barrier and efficiently sample thermodynamic and/or kinetic properties, problems remain in regard to being able to sample unknown systems, deal with high-dimensional space of collective variables, and focus the computational effort on slow timescales. Hence, a new sampling method, called the "Concurrent Adaptive Sampling (CAS) algorithm," has been developed to tackle these three issues and efficiently obtain conformations and pathways. The method is not constrained to use only one or two collective variables, unlike most reaction coordinate-dependent methods. Instead, it can use a large number of collective variables and uses macrostates (a partition of the collective variable space) to enhance the sampling. The exploration is done by running a large number of short simulations, and a clustering technique is used to accelerate the sampling. In this paper, we introduce the new methodology and show results from two-dimensional models and bio-molecules, such as penta-alanine and triazine polymer.
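A heavily simplified sketch of the macrostate idea: one collective variable, a random walk standing in for short MD segments, and naive unweighted resampling per macrostate. The real CAS algorithm tracks statistical weights, works in many collective variables, and uses clustering; everything below is an invented toy.

```python
import numpy as np

rng = np.random.default_rng(1)
N_WALKERS, PER_BIN = 20, 4

def short_md_segment(x):
    """Stand-in for a short MD run: one step of a weakly confined random walk."""
    return x + rng.normal(loc=-0.05 * x, scale=0.3)

edges = np.linspace(-3.0, 3.0, 13)   # macrostates: a fixed partition of the CV axis
walkers = np.zeros(N_WALKERS)        # all walkers start at x = 0

for _ in range(50):
    walkers = np.array([short_md_segment(x) for x in walkers])
    # Splitting/merging: cap the number of walkers per occupied macrostate so
    # effort is spread across the explored collective-variable space instead of
    # piling up in the most probable basin.
    bins = np.digitize(walkers, edges)
    kept = []
    for b in np.unique(bins):
        members = walkers[bins == b]
        kept.extend(rng.choice(members, size=min(len(members), PER_BIN),
                               replace=False))
    walkers = np.array(kept)
    # Replenish the ensemble by duplicating random survivors.
    if len(walkers) < N_WALKERS:
        walkers = np.concatenate(
            [walkers, rng.choice(walkers, size=N_WALKERS - len(walkers))])
```

Because this sketch discards the walkers' statistical weights, it illustrates only the exploration mechanism, not an unbiased estimator of thermodynamic properties.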

  14. Real-Time PCR in faecal samples of Triatoma infestans obtained by xenodiagnosis: proposal for an exogenous internal control.

    Science.gov (United States)

    Bravo, Nicolás; Muñoz, Catalina; Nazal, Nicolás; Saavedra, Miguel; Martínez, Gabriela; Araya, Eduardo; Apt, Werner; Zulantay, Inés

    2012-03-26

    The polymerase chain reaction (PCR) has proved to be a sensitive technique to detect Trypanosoma cruzi in the chronic phase of Chagas disease, which is characterized by low and fluctuating parasitemia. Another technique proposed for parasitological diagnosis in this phase of infection combines a microscopic search for motile trypomastigote forms in faecal samples (FS) obtained by xenodiagnosis (XD) with conventional PCR (XD-PCR). In this study we evaluate the use of human blood DNA as an exogenous internal control (EIC) for real time PCR (qPCR) combined with XD (XD-qPCR) using chromosome 12 (X12) detection. None of the FS-XD evaluated by qPCR amplified for X12. Nevertheless, all the EIC-FS-XD mixtures amplified for X12. We determined that X12 is useful as an EIC for XD-qPCR because we showed that the FS-XD does not contain human DNA after 30 or more days of XD incubation. This information is relevant for research on T. cruzi by XD-qPCR since it allows ruling out inhibition and false negative results due to DNA loss during the process of extraction and purification.

  15. Prospective regularization design in prior-image-based reconstruction

    International Nuclear Information System (INIS)

    Dang, Hao; Siewerdsen, Jeffrey H; Stayman, J Webster

    2015-01-01

    Prior-image-based reconstruction (PIBR) methods leveraging patient-specific anatomical information from previous imaging studies and/or sequences have demonstrated dramatic improvements in dose utilization and image quality for low-fidelity data. However, a proper balance of information from the prior images and information from the measurements is required (e.g. through careful tuning of regularization parameters). Inappropriate selection of reconstruction parameters can lead to detrimental effects including false structures and failure to improve image quality. Traditional methods based on heuristics are subject to error and sub-optimal solutions, while exhaustive searches require a large number of computationally intensive image reconstructions. In this work, we propose a novel method that prospectively estimates the optimal amount of prior image information for accurate admission of specific anatomical changes in PIBR without performing full image reconstructions. This method leverages an analytical approximation to the implicitly defined PIBR estimator, and introduces a predictive performance metric leveraging this analytical form and knowledge of a particular presumed anatomical change whose accurate reconstruction is sought. Additionally, since model-based PIBR approaches tend to be space-variant, a spatially varying prior image strength map is proposed to optimally admit changes everywhere in the image (eliminating the need to know change locations a priori). Studies were conducted in both an ellipse phantom and a realistic thorax phantom emulating a lung nodule surveillance scenario. The proposed method demonstrated accurate estimation of the optimal prior image strength while achieving a substantial computational speedup (about a factor of 20) compared to traditional exhaustive search. Moreover, the use of the proposed prior strength map in PIBR demonstrated accurate reconstruction of anatomical changes without foreknowledge of change locations in the image.
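The balance the abstract describes can be illustrated with a toy penalized least-squares problem. This is a generic sketch, not the authors' model-based estimator: the measurement model `A`, the prior image, the simulated "anatomical change", and the strength `beta` are all hypothetical.

```python
import numpy as np

def pibr_estimate(A, y, x_prior, beta):
    """Closed-form minimizer of ||A x - y||^2 + beta * ||x - x_prior||^2."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + beta * np.eye(n), A.T @ y + beta * x_prior)

rng = np.random.default_rng(2)
n = 30
x_prior = np.zeros(n)                        # "previous study": no nodule present
x_true = x_prior.copy()
x_true[10:15] = 1.0                          # the anatomical change to be admitted
A = rng.normal(size=(40, n)) / np.sqrt(40)   # toy low-fidelity measurement model
y = A @ x_true + rng.normal(scale=0.05, size=40)

x_weak = pibr_estimate(A, y, x_prior, 1e-6)    # prior nearly ignored: noisy
x_strong = pibr_estimate(A, y, x_prior, 1e6)   # measurements ignored: change lost
x_mid = pibr_estimate(A, y, x_prior, 0.1)      # balanced regularization
```

Too strong a prior reproduces the prior image and suppresses the true change, which is the failure mode (missed anatomical change) that motivates prospective selection of the prior strength.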

  16. GENERAL ASPECTS REGARDING THE PRIOR DISCIPLINARY RESEARCH

    Directory of Open Access Journals (Sweden)

    ANDRA PURAN (DASCĂLU)

    2012-05-01

    Full Text Available Disciplinary research is the first phase of the disciplinary action. According to art. 251 paragraph 1 of the Labour Code, no disciplinary sanction may be ordered before performing the prior disciplinary research. These regulations provide one exception: the sanction of written warning. The current regulations, kept from the old regulation, provide protection for employees against abuses by employers, since sanctions affect the salary or the position held, or even the continuation of the individual employment contract. Thus, prior investigation of the act constituting misconduct, before a disciplinary sanction is applied, is an essential condition for the validity of the measure ordered. Through this study we try to highlight some general issues concerning the characteristics, processes and effects of prior disciplinary research.

  17. Bayesian Prior Probability Distributions for Internal Dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Miller, G.; Inkret, W.C.; Little, T.T.; Martz, H.F.; Schillaci, M.E.

    2001-07-01

    The problem of choosing a prior distribution for the Bayesian interpretation of measurements (specifically internal dosimetry measurements) is considered using a theoretical analysis and by examining historical tritium and plutonium urine bioassay data from Los Alamos. Two models for the prior probability distribution are proposed: (1) the log-normal distribution, when there is some additional information to determine the scale of the true result, and (2) the 'alpha' distribution (a simplified variant of the gamma distribution) when there is not. These models have been incorporated into version 3 of the Bayesian internal dosimetric code in use at Los Alamos (downloadable from our web site). Plutonium internal dosimetry at Los Alamos is now being done using prior probability distribution parameters determined self-consistently from population averages of Los Alamos data. (author)
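The first prior model above can be sketched with a simple grid computation. This is a generic illustration, not the Los Alamos code: the measurement error is taken as Gaussian, and all numbers (measurement, prior parameters, grid) are invented.

```python
import numpy as np

def lognormal_posterior(measurement, sigma, mu, s, grid):
    """Grid posterior for a true value t, with a log-normal prior and a
    Gaussian measurement model: p(t | m) ∝ N(m | t, sigma) * LogN(t | mu, s)."""
    like = np.exp(-0.5 * ((measurement - grid) / sigma) ** 2)
    prior = np.exp(-0.5 * ((np.log(grid) - mu) / s) ** 2) / grid
    post = like * prior
    return post / post.sum()              # normalize over the grid points

grid = np.linspace(1e-3, 50.0, 5000)      # candidate true results (always > 0)
post = lognormal_posterior(measurement=2.0, sigma=1.0, mu=0.0, s=1.0, grid=grid)
post_mean = (grid * post).sum()           # Bayesian point estimate of the true result
```

A log-normal prior assigns zero probability to negative true results, which is one reason it suits bioassay quantities; the prior pulls the estimate toward the population-typical scale set by `mu` and `s`.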

  18. Exploring the Relationship between Prior Knowledge on Rain Gardens and Supports for Adopting Rain Gardens Using a Structural Equation Model

    Directory of Open Access Journals (Sweden)

    Suyeon Kim

    2018-05-01

    Full Text Available The objective of this study was to determine the effect of prior knowledge and visual evaluation on support for rain garden installations. To achieve this objective, a survey of 100 visitors to three rain garden sites was conducted to measure prior knowledge of rain gardens, support ratings for rain garden implementation, and visual evaluation of rain gardens. Results of the analysis revealed that users' visual evaluation of rain gardens played a moderating role in the relationship between prior knowledge and support for rain garden installations. In other words, education and publicity of rain gardens alone cannot increase support for rain gardens. However, if rain gardens are evaluated positively on visual grounds, the effects of education and publicity can be expected. Therefore, to successfully apply a rain garden policy in the future, basic consideration should be given to aesthetics in order to meet visitors' visual expectations prior to education and publicity of rain gardens.
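Moderation of this kind is commonly captured by an interaction term. A minimal synthetic sketch (plain OLS rather than the study's structural equation model; the variable names and all coefficients are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
knowledge = rng.normal(size=n)    # prior knowledge of rain gardens (standardized)
visual = rng.normal(size=n)       # visual evaluation score (standardized)
# Simulate the moderation claim: knowledge raises support mainly when the
# visual evaluation is positive.
support = (0.2 * knowledge + 0.3 * visual + 0.5 * knowledge * visual
           + rng.normal(scale=0.5, size=n))

# OLS with an interaction term: a nonzero coefficient on knowledge*visual
# is the regression signature of a moderator.
X = np.column_stack([np.ones(n), knowledge, visual, knowledge * visual])
coef, *_ = np.linalg.lstsq(X, support, rcond=None)
interaction = coef[3]
```

In a full SEM the moderator enters a latent-variable path model, but the interpretation of the interaction coefficient is the same: the effect of knowledge on support depends on the level of visual evaluation.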

  19. Superflux chlorophyll-a analysis: An assessment of variability in results introduced prior to fluorometric analysis. [Chesapeake Bay and shelf regions]

    Science.gov (United States)

    Cibik, S. J.; Rutledge, C. K.; Robertson, C. N.

    1981-01-01

    Several experiments were undertaken to identify variability in results arising from procedural differences in the processing of chlorophyll samples prior to fluorometric analysis. T-tests on group means indicated that significant differences (alpha = 0.05) in phaeopigment a concentrations occurred in samples not initially screened, but not in chlorophyll a concentrations. Highly significant differences (alpha = 0.001) in group means were found between samples held in acetone after filtering and unfiltered seawater samples held for the same period. No difference in results was found between the 24-hour extraction and samples processed immediately.

  20. Method to obtain platelet-rich plasma from rabbits (Oryctolagus cuniculus

    Directory of Open Access Journals (Sweden)

    Josiane M. Pazzini

    2016-01-01

    Full Text Available Abstract: Platelet-rich plasma (PRP) is a product that is easy and inexpensive to obtain, and it stands out for its growth factors in tissue repair. To obtain PRP, whole blood is centrifuged for specific times at specific gravitational forces. Thus, the present work aimed to study a double centrifugation method for obtaining PRP, in order to evaluate the effective increase of platelet concentration in the final product, the preparation of PRP gel, and the optimization of the preparation time of the final sample. Fifteen female White New Zealand rabbits underwent blood sampling for the preparation of PRP. Samples were separated into two sterile tubes containing sodium citrate. Tubes were submitted to the double centrifugation protocol, with the lid closed, at 1600 revolutions per minute (rpm) for 10 minutes, resulting in the separation of red blood cells from plasma with platelets and leucocytes. Tubes were then opened and the plasma was pipetted and transferred into another sterile tube. The plasma was centrifuged again at 2000 rpm for 10 minutes; as a result it was split into two parts: on the top, platelet-poor plasma (PPP), and at the bottom, the platelet button. Part of the PPP was discarded so that only 1 ml remained in the tube along with the platelet button. This material was gently agitated to promote platelet resuspension and activated by adding 0.3 ml of calcium gluconate, resulting in PRP gel. The double centrifugation protocol was able to make the platelet concentration 3 times higher than in the initial blood sample. The volume of calcium gluconate used for platelet activation, 0.3 ml, was sufficient to coagulate the sample. Coagulation time ranged from 8 to 20 minutes, with an average of 17.6 minutes. The whole procedure, from blood centrifugation to obtaining PRP gel, took only 40 minutes. It was concluded that PRP was successfully obtained by the double centrifugation protocol, which is able to increase the platelet concentration in the sample compared with whole blood.

  1. Nanocrystalline and ultrafine grain copper obtained by mechanical attrition

    Directory of Open Access Journals (Sweden)

    Rodolfo Rodríguez Baracaldo

    2010-01-01

    Full Text Available This article presents a method for the sample preparation and characterisation of bulk copper having grain size lower than 1 μm (ultra-fine grain) and lower than 100 nm (nanocrystalline). Copper is initially manufactured by a mechanical milling/alloying method, thereby obtaining a powder having a nanocrystalline structure, which is then consolidated through a process of warm compaction at high pressure. Microstructural characterisation of the bulk copper samples showed the evolution of grain size during all stages involved in obtaining them. The results led to determining the conditions necessary for achieving a wide range of grain sizes. Mechanical characterisation indicated an increase in microhardness to values of around 3.40 GPa for unconsolidated nanocrystalline powder. Compressive strength was increased by reducing the grain size, thereby obtaining an elastic limit of 650 MPa for consolidated copper having a ~62 nm grain size.

  2. Spectrally Consistent Satellite Image Fusion with Improved Image Priors

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Aanæs, Henrik; Jensen, Thomas B.S.

    2006-01-01

    Here an improvement to our previous framework for satellite image fusion is presented: a framework purely based on the sensor physics and on prior assumptions on the fused image. The contributions of this paper are twofold. Firstly, a method for ensuring 100% spectral consistency is proposed, even when more sophisticated image priors are applied. Secondly, a better image prior is introduced, via data-dependent image smoothing.

  3. Geopolymer obtained from coal ash

    International Nuclear Information System (INIS)

    Conte, V.; Bissari, E.S.; Uggioni, E.; Bernardin, A.M.

    2011-01-01

    Geopolymers are three-dimensional aluminosilicates that can be rapidly formed at low temperature from naturally occurring aluminosilicates, with a structure similar to zeolites. In this work, coal ash (Tractebel Energy) was used as the source of aluminosilicate according to a full factorial design of eight formulations with three factors (hydroxide type, hydroxide concentration and temperature) at two levels. The ash was dried and hydroxide was added according to type and concentration. The geopolymer was poured into cylindrical molds, cured (14 days) and subjected to compression testing. Coal ash from power plants belongs to the Si-Al system and thus can easily form geopolymers. The compression tests showed that it is possible to obtain samples with strength comparable to conventional Portland cement. As a result, temperature and molarity are the main factors affecting the compressive strength of the obtained geopolymer. (author)
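The eight formulations follow directly from crossing three two-level factors. A sketch of generating such a 2^3 full factorial design (the factor levels shown are placeholders for illustration; the abstract does not state the actual values used):

```python
from itertools import product

# Three two-level factors, as in the study design; the levels themselves
# are hypothetical placeholders, not the study's actual settings.
factors = {
    "hydroxide_type": ["NaOH", "KOH"],
    "hydroxide_molarity": [5, 10],
    "cure_temperature_C": [25, 60],
}

# Crossing all levels yields the full 2^3 = 8 factorial runs.
runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
```

Running every combination is what allows main effects (here, temperature and molarity) to be separated from each other when analysing the compressive-strength results.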

  4. Performance of Identifiler Direct and PowerPlex 16 HS on the Applied Biosystems 3730 DNA Analyzer for processing biological samples archived on FTA cards.

    Science.gov (United States)

    Laurin, Nancy; DeMoors, Anick; Frégeau, Chantal

    2012-09-01

    Direct amplification of STR loci from biological samples collected on FTA cards without prior DNA purification was evaluated using Identifiler Direct and PowerPlex 16 HS in conjunction with the use of a high throughput Applied Biosystems 3730 DNA Analyzer. In order to reduce the overall sample processing cost, reduced PCR volumes combined with various FTA disk sizes were tested. Optimized STR profiles were obtained using a 0.53 mm disk size in 10 μL PCR volume for both STR systems. These protocols proved effective in generating high quality profiles on the 3730 DNA Analyzer from both blood and buccal FTA samples. Reproducibility, concordance, robustness, sample stability and profile quality were assessed using a collection of blood and buccal samples on FTA cards from volunteer donors as well as from convicted offenders. The new developed protocols offer enhanced throughput capability and cost effectiveness without compromising the robustness and quality of the STR profiles obtained. These results support the use of these protocols for processing convicted offender samples submitted to the National DNA Data Bank of Canada. Similar protocols could be applied to the processing of casework reference samples or in paternity or family relationship testing. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  5. Bayesian Inference for Structured Spike and Slab Priors

    DEFF Research Database (Denmark)

    Andersen, Michael Riis; Winther, Ole; Hansen, Lars Kai

    2014-01-01

    Sparse signal recovery addresses the problem of solving underdetermined linear inverse problems subject to a sparsity constraint. We propose a novel prior formulation, the structured spike and slab prior, which makes it possible to incorporate a priori knowledge of the sparsity pattern by imposing a spatial...
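One way to encode a spatially structured sparsity pattern can be sketched generatively: threshold a smooth latent Gaussian process to obtain the spike/slab indicators, so active coefficients appear in contiguous blocks. This is an illustrative construction in the spirit of such priors, not the authors' exact formulation; the kernel, threshold and scales are invented.

```python
import numpy as np

rng = np.random.default_rng(4)

def sample_structured_spike_slab(n, length_scale=5.0, slab_std=1.0, level=0.0):
    """Draw one signal whose support pattern is spatially correlated.

    A latent Gaussian process with an RBF kernel is thresholded to get the
    spike/slab indicators; active entries then receive Gaussian slab values.
    """
    idx = np.arange(n)
    K = np.exp(-0.5 * (idx[:, None] - idx[None, :]) ** 2 / length_scale ** 2)
    gamma = rng.multivariate_normal(np.zeros(n), K + 1e-6 * np.eye(n))
    support = gamma > level                  # structured sparsity pattern
    x = np.zeros(n)
    x[support] = rng.normal(scale=slab_std, size=support.sum())
    return x, support

x, support = sample_structured_spike_slab(100)
```

Because neighbouring indicators share the latent process, the support flips state far less often than an independent Bernoulli pattern would, which is exactly the kind of a priori knowledge a structured spike and slab prior injects into recovery.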

  6. Forensic Tools to Track and Connect Physical Samples to Related Data

    Science.gov (United States)

    Molineux, A.; Thompson, A. C.; Baumgardner, R. W.

    2016-12-01

    Identifiers, such as local sample numbers, are critical to successfully connecting physical samples and related data. However, identifiers must be globally unique. The International Geo Sample Number (IGSN), generated when registering the sample in the System for Earth Sample Registration (SESAR), provides a globally unique alphanumeric code associated with basic metadata, related samples and their current physical storage location. When registered samples are published, users can link the figured samples to the basic metadata held at SESAR. The use cases we discuss include plant specimens from a Permian core, Holocene corals and derived powders, and thin sections with SEM stubs. Much of this material is now published. The plant taxonomic study from the core is a digital pdf and samples can be directly linked from the captions to the SESAR record. The study of stable isotopes from the corals is not yet digitally available, but individual samples are accessible. Full data and media records for both studies are located in our database, where higher quality images, field notes, and section diagrams may exist. Georeferences permit mapping in current and deep time plate configurations. Several aspects emerged during this study. First, ensure adequate and consistent details are registered with SESAR. Second, educate and encourage the researcher to obtain IGSNs. Third, publish the archive numbers, assigned prior to publication, alongside the IGSN. This provides access to further data through an Integrated Publishing Toolkit (IPT), aggregators, or online repository databases, thus placing the initial sample in a much richer context for future studies. Fourth, encourage software developers to customize community software to extract data from a database and use it to register samples in bulk. This would improve workflow and provide a path for registration of large legacy collections.

  7. Supramolecular solvent-based extraction of benzimidazolic fungicides from natural waters prior to their liquid chromatographic/fluorimetric determination.

    Science.gov (United States)

    Moral, Antonia; Sicilia, María Dolores; Rubio, Soledad

    2009-05-01

    A supramolecular solvent made up of vesicles of decanoic acid in the nano- and microscale regimes dispersed in a continuous aqueous phase is proposed for the extraction/preconcentration of benzimidazolic fungicides (BFs) from river and underground water samples prior to their determination by liquid chromatography (LC)/fluorimetry. The solvent is produced from the coacervation of decanoic acid aqueous vesicles by the action of tetrabutylammonium (Bu(4)N(+)). Carbendazim (CB), thiabendazole (TB) and fuberidazole (FB) are extracted on the basis of hydrophobic and pi-cation interactions and the formation of hydrogen bonds. The extraction provides high preconcentration factors (160 for CB and 190 for TB and FB), requires a short time (the procedure takes less than 20 min and several samples can be simultaneously processed) and a low sample volume (20 mL), and avoids the use of toxic organic solvents. Because of the absence of matrix interferences and the low viscosity of the extracts, these can be directly injected into the chromatographic system without the need of cleaning-up or diluting them. Recoveries are not influenced by the presence of salt concentrations up to 1 M. The proposed method provides detection limits for the determination of CB, TB and FB in natural waters of 32, 4 and 0.1 ng L(-1), respectively, and a precision, expressed as relative standard deviation (n=11) of 5.5% for CB (100 ng L(-1)), 4.0% for TB (80 ng L(-1)) and 2.5% for FB (30 ng L(-1)). Recoveries obtained by applying this approach to the analysis of river and underground water samples fortified at the ng L(-1) level are in the intervals 75-83, 95-102 and 97-101% for CB, TB and FB, respectively.

  8. Thermal diffusivity and electron transport properties of NTC samples obtained by the photoacoustic method

    International Nuclear Information System (INIS)

    Savic, S.M.; Aleksic, O.S.; Nikolic, M.V.; Lukovic, D.T.; Pejovic, V.Z.; Nikolic, P.M.

    2006-01-01

    Thermal diffusivity and electron transport parameters of sintered NTC samples were determined by the photoacoustic (PA) technique. Powder mixtures composed of MnO, NiO, CoO and Fe2O3 were milled to nanometer particle size. NTC discs were dry powder pressed and sintered at different temperatures in the range from 900 deg. C to 1300 deg. C for 30 min. A second group of NTC discs was sintered at 1200 deg. C with the sintering time varying from 30 min to 360 min. These NTC samples were polished and exposed to a chopped laser beam in order to plot a response in the acoustic range. The thermal diffusivity of sintered NTC layers based on a metal oxide powder mixture was measured at room temperature by the photoacoustic technique. An increase of thermal diffusivity with the sintering temperature and time of sintering was observed.

  9. Thermal diffusivity and electron transport properties of NTC samples obtained by the photoacoustic method

    Energy Technology Data Exchange (ETDEWEB)

    Savic, S.M. [Institute of Technical Sciences of SASA, Knez Mihailova 35/IV, 11000 Belgrade (Serbia); Aleksic, O.S. [Center for Multidisciplinary Studies of the University of Belgrade, Kneza Viseslava 1, 11000 Belgrade (Serbia); Nikolic, M.V. [Center for Multidisciplinary Studies of the University of Belgrade, Kneza Viseslava 1, 11000 Belgrade (Serbia); Lukovic, D.T. [Institute of Technical Sciences of SASA, Knez Mihailova 35/IV, 11000 Belgrade (Serbia); Pejovic, V.Z. [Center for Multidisciplinary Studies of the University of Belgrade, Kneza Viseslava 1, 11000 Belgrade (Serbia); Nikolic, P.M. [Institute of Technical Sciences of SASA, Knez Mihailova 35/IV, 11000 Belgrade (Serbia)]. E-mail: nikolic@sanu.ac.yu

    2006-07-15

    Thermal diffusivity and electron transport parameters of sintered NTC samples were determined by the photoacoustic (PA) technique. Powder mixtures composed of MnO, NiO, CoO and Fe2O3 were milled to nanometer particle size. NTC discs were dry powder pressed and sintered at different temperatures in the range from 900 deg. C to 1300 deg. C for 30 min. A second group of NTC discs was sintered at 1200 deg. C with the sintering time varying from 30 min to 360 min. These NTC samples were polished and exposed to a chopped laser beam in order to plot a response in the acoustic range. The thermal diffusivity of sintered NTC layers based on a metal oxide powder mixture was measured at room temperature by the photoacoustic technique. An increase of thermal diffusivity with the sintering temperature and time of sintering was observed.

  10. Characterization of the Bacterial Community Naturally Present on Commercially Grown Basil Leaves: Evaluation of Sample Preparation Prior to Culture-Independent Techniques

    Directory of Open Access Journals (Sweden)

    Siele Ceuppens

    2015-08-01

    Fresh herbs such as basil constitute an important food commodity worldwide. Basil provides considerable culinary and health benefits, but has also been implicated in foodborne illnesses. The naturally occurring bacterial community on basil leaves is currently unknown, so the epiphytic bacterial community was investigated using the culture-independent techniques denaturing gradient gel electrophoresis (DGGE) and next-generation sequencing (NGS). Sample preparation had a major influence on the results from DGGE and NGS: Novosphingobium was the dominant genus for three different basil batches obtained by maceration of basil leaves, while washing of the leaves yielded lower numbers but more variable dominant bacterial genera, including Klebsiella, Pantoea, Flavobacterium, Sphingobacterium and Pseudomonas. During storage of basil, bacterial growth and shifts in the bacterial community were observed with DGGE and NGS. Spoilage was not associated with specific bacterial groups and was presumably caused by physiological tissue deterioration and visual defects, rather than by bacterial growth.

  11. Evaluation of various techniques for the pretreatment of sewage sludges prior to trace metal analysis by atomic absorption spectrophotometry

    International Nuclear Information System (INIS)

    Smith, R.

    1983-01-01

    Six techniques were evaluated for their suitability for the pretreatment of dried sewage sludge prior to trace metal analysis by atomic absorption spectrophotometry. The evaluation comprised analysis of two prepared samples of dried sludge for aluminium, cadmium, chromium, copper, iron, lead, manganese, nickel and zinc, after the following pretreatments: (1) dry ashing at 500 degrees Celsius followed by extraction with dilute hydrochloric acid; (2) dry ashing at 500 degrees Celsius followed by extraction with aqua regia; (3) nitric acid digestion followed by extraction with hydrochloric acid; (4) extraction with aqua regia; (5) ashing with magnesium nitrate solution at 550 degrees Celsius followed by digestion with hydrochloric acid and extraction with nitric acid; (6) extraction with nitric acid. Procedures involving the use of perchloric acid, hydrofluoric acid and hydrogen peroxide were not considered for reasons of safety. Except in the case of aluminium, the direct mineral acid digestion and/or extraction methods generally gave higher recoveries than the procedures incorporating an ashing step. Direct extraction of the sample with aqua regia was recommended as a rapid and simple general method of sample pretreatment prior to analysis for all the metals investigated except aluminium. For this metal, more drastic sample pretreatment will be required, for example fusion or hydrofluoric acid digestion.

  12. Melview Nursing Home, Prior Park, Clonmel, Tipperary.

    LENUS (Irish Health Repository)

    Zhou, Ping

    2011-08-03

    Background: To simplify the methodology for the isolation of Campylobacter spp. from retail broiler meat, we evaluated 108 samples (breasts and thighs) using an unpaired sample design. The enrichment broths were incubated under aerobic conditions (subsamples A) and, for comparison, under microaerobic conditions (subsamples M) as recommended by current reference protocols. Sensors were used to measure the dissolved oxygen (DO) in the broth and the percentage of oxygen (O2) in the head space of the bags used for enrichment. Campylobacter isolates were identified with multiplex PCR assays and typed using pulsed-field gel electrophoresis (PFGE). Ribosomal intergenic spacer analysis (RISA) and denaturing gradient gel electrophoresis (DGGE) were used to study the bacterial communities of subsamples M and A after 48 h of enrichment. Results: The number of Campylobacter-positive subsamples was similar for A and M when all samples were combined (P = 0.81) and when samples were analyzed by product (breast: P = 0.75; thigh: P = 1.00). Oxygen sensors showed that DO values in the broth were around 6 ppm and O2 values in the head space were 14-16% throughout incubation. PFGE demonstrated high genomic similarity of isolates in the majority of the samples in which isolates were obtained from both subsamples A and M. RISA and DGGE results showed a large variability in the bacterial populations that could be attributed to sample-to-sample variation and not to enrichment conditions (aerobic or microaerobic). These data also suggested that current sampling protocols are not optimized to determine the true number of Campylobacter-positive samples in retail broiler meat. Conclusions: Decreased DO in enrichment broths is naturally achieved. This simplified, cost-effective enrichment protocol with aerobic incubation could be incorporated into reference methods for the isolation of Campylobacter spp. from retail broiler meat.

  13. Low-sampling-rate ultra-wideband channel estimation using equivalent-time sampling

    KAUST Repository

    Ballal, Tarig

    2014-09-01

    In this paper, a low-sampling-rate scheme for ultra-wideband channel estimation is proposed. The scheme exploits multiple observations generated by transmitting multiple pulses. In the proposed scheme, P pulses are transmitted to produce channel impulse response estimates at a desired sampling rate, while the ADC samples at a rate that is P times slower. To avoid loss of fidelity, the number of sampling periods (based on the desired rate) in the inter-pulse interval is restricted to be co-prime with P. This condition can be violated when clock drift is present and the transmitted pulse locations change. To handle this case, and to achieve good overall channel estimation performance without using prior information, we derive an improved estimator based on the bounded data uncertainty (BDU) model. It is shown that this estimator is related to the Bayesian linear minimum mean squared error (LMMSE) estimator. The channel estimation performance of the proposed sub-sampling scheme combined with the new estimator is assessed in simulation. The results show that a large reduction in sampling rate can be achieved. The proposed estimator outperforms the least squares estimator in almost all cases, while in the high SNR regime it also outperforms the LMMSE estimator. In addition to channel estimation, a synchronization method is also proposed that utilizes the same pulse sequence used for channel estimation. © 2014 IEEE.
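
The co-prime restriction can be illustrated with a small sketch (hypothetical parameters, not taken from the paper): with P pulses spaced N desired-rate periods apart, a slow ADC running at 1/P of the desired rate hits offset (p * N) mod P on the fast grid for pulse p, and gcd(N, P) = 1 guarantees the P offsets are all distinct, so interleaving the slow streams fills the fast-rate grid.

```python
import math

def interleave_offsets(P, N):
    """With P pulses spaced N desired-rate periods apart and an ADC
    sampling every P-th desired-rate period, pulse p is observed at
    offset (p * N) % P on the fast grid. If gcd(N, P) == 1 the P
    offsets are all distinct, so interleaving the slow streams recovers
    samples at every fast-rate phase."""
    return sorted((p * N) % P for p in range(P))

assert math.gcd(7, 5) == 1                        # co-prime pair
assert interleave_offsets(5, 7) == [0, 1, 2, 3, 4]   # full coverage
assert interleave_offsets(4, 6) == [0, 0, 2, 2]      # gcd 2: offsets repeat
```

When clock drift shifts the pulse locations, the effective offsets deviate from this ideal pattern, which is the situation the BDU-based estimator in the paper is designed to handle.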

  14. Use of respondent driven sampling (RDS) generates a very diverse sample of men who have sex with men (MSM) in Buenos Aires, Argentina.

    Directory of Open Access Journals (Sweden)

    Alex Carballo-Diéguez

    Prior research focusing on men who have sex with men (MSM) conducted in Buenos Aires, Argentina, used convenience samples that included mainly gay-identified men. To increase MSM sample representativeness, we used Respondent Driven Sampling (RDS) for the first time in Argentina. Using RDS, under certain specified conditions, the observed estimates for the percentage of the population with a specific trait are asymptotically unbiased. We describe the diversity of the recruited sample from the point of view of sexual orientation, and contrast the different subgroups in terms of their HIV sexual risk behavior. 500 MSM were recruited using RDS. Behavioral data were collected through face-to-face interviews and Web-based CASI. In contrast with prior studies, RDS generated a very diverse sample of MSM from a sexual identity perspective. Only 24.5% of participants identified as gay; 36.2% identified as bisexual, 21.9% as heterosexual, and 17.4% were grouped as "other." Gay and non-gay identified MSM differed significantly in their sexual behavior, the former having higher numbers of partners, more frequent sexual contacts and less frequent condom use. One third of the men (gay, 3%; bisexual, 34%; heterosexual, 51%; other, 49%) reported having had sex with men, women and transvestites in the two months prior to the interview. This population requires further study and, potentially, HIV prevention strategies tailored to such diversity of partnerships. Our results highlight the potential effectiveness of using RDS to reach non-gay identified MSM. They also present lessons learned in the implementation of RDS to recruit MSM, concerning both the importance and limitations of formative work, the need to tailor incentives to the circumstances of less affluent potential participants, the need to prevent masking, and the challenge of assessing network size.
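
RDS estimates are typically computed by weighting respondents inversely to their reported network size. The RDS-II (Volz-Heckathorn) estimator sketched below is one standard choice, shown here purely for illustration (the abstract does not name the estimator used):

```python
def rds_ii_proportion(traits, degrees):
    """RDS-II (Volz-Heckathorn) proportion estimate: weight each
    respondent by 1/degree, because high-degree members of the network
    are over-recruited by the referral chains."""
    inv = [1.0 / d for d in degrees]
    num = sum(w for w, t in zip(inv, traits) if t)
    return num / sum(inv)

# Toy data: trait carriers report larger networks, so the naive sample
# proportion overstates the population proportion.
traits  = [1, 1, 0, 0, 0]
degrees = [10, 10, 2, 2, 2]
naive = sum(traits) / len(traits)          # 0.4
est = rds_ii_proportion(traits, degrees)   # 0.2 / 1.7, about 0.118
```

This down-weighting of well-connected respondents is also why accurate self-reported network size matters, which the abstract flags as a practical challenge.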

  15. Real-time image dehazing using local adaptive neighborhoods and dark-channel-prior

    Science.gov (United States)

    Valderrama, Jesus A.; Díaz-Ramírez, Víctor H.; Kober, Vitaly; Hernandez, Enrique

    2015-09-01

    A real-time algorithm for single image dehazing is presented. The algorithm is based on calculation of local neighborhoods of a hazed image inside a moving window. The local neighborhoods are constructed by computing rank-order statistics. Next the dark-channel-prior approach is applied to the local neighborhoods to estimate the transmission function of the scene. By using the suggested approach there is no need for applying a refining algorithm to the estimated transmission such as the soft matting algorithm. To achieve high-rate signal processing the proposed algorithm is implemented exploiting massive parallelism on a graphics processing unit (GPU). Computer simulation results are carried out to test the performance of the proposed algorithm in terms of dehazing efficiency and speed of processing. These tests are performed using several synthetic and real images. The obtained results are analyzed and compared with those obtained with existing dehazing algorithms.
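
For reference, the dark-channel-prior transmission estimate that the algorithm builds on can be sketched as follows (a plain NumPy version with simple loops; the paper's contribution is computing the local rank-order statistics efficiently on a GPU and avoiding the soft-matting refinement):

```python
import numpy as np

def dark_channel(img, patch=15):
    """Dark channel of an RGB image (H, W, 3) scaled to [0, 1]:
    per-pixel minimum over the color channels, followed by a local
    minimum filter over a patch x patch window."""
    mins = img.min(axis=2)
    h, w = mins.shape
    r = patch // 2
    out = np.empty_like(mins)
    for i in range(h):
        for j in range(w):
            out[i, j] = mins[max(0, i - r):i + r + 1,
                             max(0, j - r):j + r + 1].min()
    return out

def transmission(img, atmosphere, omega=0.95, patch=15):
    """Standard dark-channel transmission estimate:
    t = 1 - omega * dark_channel(I / A), for atmospheric light A."""
    return 1.0 - omega * dark_channel(img / atmosphere, patch)
```

A hazy region has a bright dark channel (haze lifts all color channels), so its estimated transmission is low, which is what drives the subsequent scene-radiance recovery.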

  16. In situ emulsification microextraction using a dicationic ionic liquid followed by magnetic assisted physisorption for determination of lead prior to micro-sampling flame atomic absorption spectrometry

    International Nuclear Information System (INIS)

    Shokri, Masood; Beiraghi, Asadollah; Seidi, Shahram

    2015-01-01

    For the first time, a simple and efficient in situ emulsification microextraction method using a dicationic ionic liquid followed by magnetic assisted physisorption was presented to determine trace amounts of lead. In this method, 400 μL of 1.0 mol L−1 lithium bis(trifluoromethylsulfonyl)imide aqueous solution, Li[NTf2], was added into the sample solution containing 100 μL of 1.0 mol L−1 1,3-(propyl-1,3-diyl)bis(3-methylimidazolium) chloride, [pbmim]Cl2, to form a water-immiscible ionic liquid, [pbmim][NTf2]2. This new in situ formed dicationic ionic liquid was applied as the acceptor phase to extract the lead-ammonium pyrrolidinedithiocarbamate (Pb-APDC) complexes from the sample solution. Subsequently, 30 mg of Fe3O4 magnetic nanoparticles (MNPs) were added into the sample solution to collect the fine droplets of [pbmim][NTf2]2 physisorptively. Finally, the MNPs were eluted with acetonitrile, separated by an external magnetic field, and the obtained eluent was subjected to micro-sampling flame atomic absorption spectrometry (FAAS) for further analysis. Compared with other microextraction methods, no special devices or centrifugation step are required. Parameters influencing the extraction efficiency such as extraction time, pH, concentration of chelating agent, amount of MNPs and coexisting interferences were studied. Under the optimized conditions, this method showed a high extraction recovery of 93% with a low LOD of 0.7 μg L−1. Good linearity was obtained in the range of 2.5–150 μg L−1 with a determination coefficient (r2) of 0.9921. The relative standard deviation (RSD%) for seven repeated measurements at a concentration of 10 μg L−1 was 4.1%. Finally, this method was successfully applied for the determination of lead in some water and plant samples. - Highlights: • A dicationic ionic liquid was used as the extraction solvent, for the first time. • A simple and efficient in situ emulsification microextraction

  17. Bayesian Image Restoration Using a Large-Scale Total Patch Variation Prior

    Directory of Open Access Journals (Sweden)

    Yang Chen

    2011-01-01

    Edge-preserving Bayesian restorations using nonquadratic priors are often inefficient in restoring continuous variations and tend to produce block artifacts around edges in ill-posed inverse image restorations. To overcome this, we have proposed a spatially adaptive (SA) prior with improved performance. However, this SA prior restoration suffers from high computational cost and an unguaranteed convergence problem. Concerning these issues, this paper proposes a Large-scale Total Patch Variation (LS-TPV) prior model for Bayesian image restoration. In this model, the prior for each pixel is defined as a singleton conditional probability, which is in a mixture prior form of one patch similarity prior and one weight entropy prior. A joint MAP estimation is thus built to ensure iteration monotonicity. The intensive calculation of patch distances is greatly alleviated by parallelization with the Compute Unified Device Architecture (CUDA). Experiments with both simulated and real data validate the good performance of the proposed restoration.

  18. A simple vibrating sample magnetometer for macroscopic samples

    Science.gov (United States)

    Lopez-Dominguez, V.; Quesada, A.; Guzmán-Mínguez, J. C.; Moreno, L.; Lere, M.; Spottorno, J.; Giacomone, F.; Fernández, J. F.; Hernando, A.; García, M. A.

    2018-03-01

    We here present a simple model of a vibrating sample magnetometer (VSM). The system allows recording magnetization curves at room temperature with a resolution of the order of 0.01 emu and is appropriate for macroscopic samples. The setup can be mounted in different configurations depending on the requirements of the sample to be measured (mass, saturation magnetization, saturation field, etc.). We also include examples of curves obtained with our setup and comparison curves measured with a standard commercial VSM, which confirm the reliability of our device.

  19. Neutrino mass priors for cosmology from random matrices

    Science.gov (United States)

    Long, Andrew J.; Raveri, Marco; Hu, Wayne; Dodelson, Scott

    2018-02-01

    Cosmological measurements of structure are placing increasingly strong constraints on the sum of the neutrino masses, Σ mν, through Bayesian inference. Because these constraints depend on the choice for the prior probability π (Σ mν), we argue that this prior should be motivated by fundamental physical principles rather than the ad hoc choices that are common in the literature. The first step in this direction is to specify the prior directly at the level of the neutrino mass matrix Mν, since this is the parameter appearing in the Lagrangian of the particle physics theory. Thus by specifying a probability distribution over Mν, and by including the known squared mass splittings, we predict a theoretical probability distribution over Σ mν that we interpret as a Bayesian prior probability π (Σ mν). Assuming a basis-invariant probability distribution on Mν, also known as the anarchy hypothesis, we find that π (Σ mν) peaks close to the smallest Σ mν allowed by the measured mass splittings, roughly 0.06 eV (0.1 eV) for normal (inverted) ordering, due to the phenomenon of eigenvalue repulsion in random matrices. We consider three models for neutrino mass generation: Dirac, Majorana, and Majorana via the seesaw mechanism; differences in the predicted priors π (Σ mν) allow for the possibility of having indications about the physical origin of neutrino masses once sufficient experimental sensitivity is achieved. We present fitting functions for π (Σ mν), which provide a simple means for applying these priors to cosmological constraints on the neutrino masses or marginalizing over their impact on other cosmological parameters.
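
The eigenvalue repulsion underlying this prior is easy to see numerically. As a stand-in for the basis-invariant ensembles considered in the paper, drawing random real symmetric (GOE-like) matrices shows that near-degenerate eigenvalue pairs are rare (an illustrative simulation only, not the paper's Dirac/Majorana/seesaw construction):

```python
import numpy as np

rng = np.random.default_rng(0)

def goe_spacings(n=3, trials=4000):
    """Nearest-neighbor eigenvalue spacings of n x n random real
    symmetric matrices. Eigenvalue repulsion suppresses tiny spacings,
    which is what pushes the induced prior on the eigenvalue sum away
    from degenerate spectra."""
    gaps = []
    for _ in range(trials):
        a = rng.standard_normal((n, n))
        m = (a + a.T) / 2.0                  # random real symmetric matrix
        ev = np.sort(np.linalg.eigvalsh(m))
        gaps.extend(np.diff(ev))
    return np.asarray(gaps)

gaps = goe_spacings()
frac_tiny = float((gaps < 0.05).mean())      # fraction of near-degeneracies
```

The fraction of spacings below 0.05 comes out very small compared to the typical spacing of order one, the level-repulsion effect invoked in the abstract.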

  20. Estimating security betas using prior information based on firm fundamentals

    NARCIS (Netherlands)

    Cosemans, M.; Frehen, R.; Schotman, P.C.; Bauer, R.

    2010-01-01

    This paper proposes a novel approach for estimating time-varying betas of individual stocks that incorporates prior information based on fundamentals. We shrink the rolling window estimate of beta towards a firm-specific prior that is motivated by asset pricing theory. The prior captures structural

  1. Wavelet bidomain sample entropy analysis to predict spontaneous termination of atrial fibrillation

    International Nuclear Information System (INIS)

    Alcaraz, Raúl; Rieta, José Joaquín

    2008-01-01

    The ability to predict whether an atrial fibrillation (AF) episode terminates spontaneously or not through non-invasive techniques is a challenging problem of great clinical interest. Such prediction could avoid useless therapeutic interventions and minimize risks for the patient. The present work introduces a robust AF prediction methodology carried out by estimating, through sample entropy (SampEn), the increase in atrial activity (AA) organization prior to AF termination from the surface electrocardiogram (ECG). This regularity variation appears as a consequence of the decrease in the number of reentries wandering throughout the atrial tissue. AA was obtained from surface ECG recordings by applying a QRST cancellation technique. Next, a robust and reliable classification process for terminating and non-terminating AF episodes was developed, making use of two different wavelet decomposition strategies. Finally, the AA organization both in the time and wavelet domains (bidomain) was estimated via SampEn. The methodology was validated using a training set consisting of 20 AF recordings with known termination properties and a test set of 30 recordings. All the training signals and 93.33% of the test set were correctly classified into terminating and sustained AF, obtaining 93.75% sensitivity and 92.86% specificity. It can be concluded that spontaneous AF termination can be reliably and noninvasively predicted by applying wavelet bidomain sample entropy.
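
Sample entropy itself has a compact definition: SampEn(m, r) = -ln(A/B), where B counts pairs of length-m templates matching within Chebyshev tolerance r, and A counts the same for length m + 1. A minimal reference implementation (not the authors' wavelet-domain pipeline):

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(A/B): B counts template pairs of length m
    within tolerance r (Chebyshev distance), A the same for m + 1.
    Lower SampEn means a more regular signal, i.e. rising AA
    organization in the application above."""
    n = len(x)

    def count(mm):
        c = 0
        for i in range(n - mm):
            for j in range(i + 1, n - mm + 1):
                if all(abs(x[i + k] - x[j + k]) <= r for k in range(mm)):
                    c += 1
        return c

    b, a = count(m), count(m + 1)
    return -math.log(a / b)

regular = sample_entropy([0, 1] * 20)   # strictly alternating: low SampEn
```

A strictly periodic series scores near zero because almost every length-m match extends to a length-(m+1) match, which is the regularity increase the method detects prior to AF termination.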

  2. Optimized preparation of urine samples for two-dimensional electrophoresis and initial application to patient samples

    DEFF Research Database (Denmark)

    Lafitte, Daniel; Dussol, Bertrand; Andersen, Søren

    2002-01-01

    OBJECTIVE: We optimized the preparation of urinary samples to obtain a comprehensive map of the urinary proteins of healthy subjects and then compared this map with those obtained from patient samples to show that the pattern was specific to their kidney disease. DESIGN AND METHODS: The urinary...

  3. Sample Size Calculations for Population Size Estimation Studies Using Multiplier Methods With Respondent-Driven Sampling Surveys.

    Science.gov (United States)

    Fearon, Elizabeth; Chabata, Sungai T; Thompson, Jennifer A; Cowan, Frances M; Hargreaves, James R

    2017-09-14

    While guidance exists for obtaining population size estimates using multiplier methods with respondent-driven sampling surveys, we lack specific guidance for making sample size decisions. Our objective was to guide the design of multiplier-method population size estimation studies using respondent-driven sampling surveys so as to reduce the random error around the estimate obtained. The population size estimate is obtained by dividing the number of individuals receiving a service or the number of unique objects distributed (M) by the proportion of individuals in a representative survey who report receipt of the service or object (P). We have developed an approach to sample size calculation, interpreting methods to estimate the variance around estimates obtained using multiplier methods in conjunction with research into design effects and respondent-driven sampling. We describe an application to estimate the number of female sex workers in Harare, Zimbabwe. There is high variance in estimates. Random error around the size estimate reflects uncertainty from M and P, particularly when the estimate of P in the respondent-driven sampling survey is low. As expected, sample size requirements are higher when the design effect of the survey is assumed to be greater. We suggest a method for investigating the effects of sample size on the precision of a population size estimate obtained using multiplier methods and respondent-driven sampling. Uncertainty in the size estimate is high, particularly when P is small, so, balancing against other potential sources of bias, we advise researchers to consider longer service attendance reference periods and to distribute more unique objects, which is likely to result in a higher estimate of P in the respondent-driven sampling survey. ©Elizabeth Fearon, Sungai T Chabata, Jennifer A Thompson, Frances M Cowan, James R Hargreaves. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 14.09.2017.
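
The estimator and its leading error terms can be sketched with a delta-method calculation (illustrative formulas and toy numbers, not the authors' exact variance expressions):

```python
import math

def multiplier_estimate(M, p_hat, n, deff=2.0, var_M=0.0):
    """Population size N = M / p with a delta-method standard error.
    var(p_hat) is the binomial variance inflated by an assumed RDS
    design effect `deff`; var_M covers uncertainty in the service
    count M. Illustrative only, not the authors' exact calculation."""
    var_p = deff * p_hat * (1.0 - p_hat) / n
    n_hat = M / p_hat
    # delta method: var(M/p) ~ (1/p)^2 var(M) + (M/p^2)^2 var(p)
    var_n = (1.0 / p_hat) ** 2 * var_M + (M / p_hat ** 2) ** 2 * var_p
    return n_hat, math.sqrt(var_n)

# 600 unique objects distributed, 30% of a 400-person survey report one
n_hat, se = multiplier_estimate(M=600, p_hat=0.3, n=400, deff=2.0)
```

Because P appears squared in the denominator of the variance term, the relative standard error grows sharply as P shrinks, which is the abstract's rationale for choices (longer reference periods, more objects) that raise P.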

  4. 34 CFR 303.403 - Prior notice; native language.

    Science.gov (United States)

    2010-07-01

    ... 34 Education 2 2010-07-01 2010-07-01 false Prior notice; native language. 303.403 Section 303.403... TODDLERS WITH DISABILITIES Procedural Safeguards General § 303.403 Prior notice; native language. (a... file a complaint and the timelines under those procedures. (c) Native language. (1) The notice must be...

  5. Including Below Detection Limit Samples in Post Decommissioning Soil Sample Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jung Hwan; Yim, Man Sung [KAIST, Daejeon (Korea, Republic of)

    2016-05-15

    To meet the required standards, the site owner has to show that the soil at the facility has been sufficiently cleaned up. To do this, one must know the contamination of the soil at the site prior to clean-up. This involves sampling that soil to identify the degree of contamination. However, there is a technical difficulty in determining how much decontamination should be done. The problem arises when measured samples are below the detection limit. Regulatory guidelines for site reuse after decommissioning are commonly challenged because the majority of the activity in the soil is at or below the limit of detection. Using additional statistical analyses of contaminated soil after decommissioning is expected to have the following advantages: a better and more reliable probabilistic exposure assessment, better economics (lower project costs) and improved communication with the public. This research will develop an approach that defines an acceptable method for demonstrating compliance of decommissioned NPP sites and validates that compliance. Soil samples from NPPs often contain censored data. Conventional methods for dealing with censored data sets are statistically biased and limited in their usefulness.
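
The bias of one conventional approach (substituting half the detection limit for each nondetect) is easy to demonstrate by simulation (toy lognormal data, not NPP measurements):

```python
import math
import random

random.seed(1)

# Toy lognormal "activity" data censored at a detection limit DL.
# The conventional fix substitutes DL/2 for each nondetect; comparing
# with the uncensored mean shows the bias such substitution introduces.
DL = 2.0
data = [random.lognormvariate(0.0, 1.0) for _ in range(20000)]
true_mean = sum(data) / len(data)                  # close to e**0.5 = 1.65

substituted = [x if x >= DL else DL / 2.0 for x in data]
sub_mean = sum(substituted) / len(substituted)

bias = sub_mean - true_mean                        # positive for this DL
```

For this skewed distribution and detection limit, DL/2 sits above the typical censored value, so the substituted mean overstates the true mean; other DL choices can bias it the other way, which is exactly why censored-data statistics (e.g. maximum likelihood or Kaplan-Meier style estimators) are preferred.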

  6. An Improved Seabed Surface Sand Sampling Device

    Science.gov (United States)

    Luo, X.

    2017-12-01

    In marine geology research it is necessary to obtain a sufficient quantity of seabed surface samples, while also ensuring that the samples are in their original state. Currently, there are a number of seabed surface sampling devices available, but we find it is very difficult to obtain sand samples using these devices, particularly when dealing with fine sand. Machine-controlled seabed surface sampling devices are also available, but are generally unable to dive into deeper regions of water. To obtain larger quantities of seabed surface sand samples in their original state, many researchers have tried to improve upon sampling devices, but these efforts have generally produced ambiguous results, in our opinion. To resolve this issue, we have designed an improved and highly effective seabed surface sand sampling device that incorporates the strengths of a variety of sampling devices. It is capable of diving into deep water to obtain fine sand samples and is also suited for use in streams, rivers, lakes and seas with varying levels of depth (up to 100 m). This device can be used for geological mapping, underwater prospecting, geological engineering and ecological and environmental studies in both marine and terrestrial waters.

  7. The Effect of Prior Online Purchase Experience on Trust and Online Repurchase Intention (A Survey of Zalora Indonesia Customers via the Website Www.zalora.co.id)

    OpenAIRE

    Parastanti, Gadis Paramita

    2014-01-01

    This study aims to determine the effect of Prior Online Purchase Experience on Trust and Online Repurchase Intention. The exogenous variable used in this study was Prior Online Purchase Experience, the intervening variable was Trust, and the endogenous variable was Online Repurchase Intention. The type of research used is explanatory research with a quantitative approach. The study was conducted among customers of the ZALORA Indonesia website (www.zalora.co.id). Sampl...

  8. Influence of chemical treatment of clay to obtain polypropylene nanocomposites

    International Nuclear Information System (INIS)

    Rosa, Jeferson L.S.; Marques, Maria F.V.

    2009-01-01

    Commercial clay was chemically treated to prepare a Ziegler-Natta catalyst containing MgCl2 and clay for the synthesis of polypropylene nanocomposites by in situ polymerization. The performance of this catalyst, and of the materials obtained in propylene polymerization, was compared with a reference catalyst (without clay) and with another whose composition contains the same clay but without prior chemical treatment. Techniques such as differential scanning calorimetry (DSC), X-ray diffractometry (XRD) and melt flow index (MFI) measurements were performed. There was a marked reduction in the catalytic activity of the clay catalysts in comparison with the reference one, and a slight reduction in the melting temperature of the polymers produced from the former. The melt flow index of the polymers obtained with treated clay was notably higher than that of those synthesized with the untreated clay, so the treated clay led to the production of PP with lower molar mass. The clays showed an increase in spacing and irregular stacking of the lamellas, especially when chemically treated. (author)

  9. Testing a groundwater sampling tool: Are the samples representative?

    International Nuclear Information System (INIS)

    Kaback, D.S.; Bergren, C.L.; Carlson, C.A.; Carlson, C.L.

    1989-01-01

    A ground water sampling tool, the HydroPunch™, was tested at the Department of Energy's Savannah River Site in South Carolina to determine if representative ground water samples could be obtained without installing monitoring wells. Chemical analyses of ground water samples collected with the HydroPunch™ from various depths within a borehole were compared with chemical analyses of ground water from nearby monitoring wells. The site selected for the test was in the vicinity of a large coal storage pile and a coal pile runoff basin that was constructed to collect the runoff from the coal storage pile. Existing monitoring wells in the area indicate the presence of a ground water contaminant plume that: (1) contains elevated concentrations of trace metals; (2) has an extremely low pH; and (3) contains elevated concentrations of major cations and anions. Ground water samples collected with the HydroPunch™ provide an excellent estimate of ground water quality at discrete depths. Ground water chemical data collected from various depths using the HydroPunch™ can be averaged to simulate what a screened zone in a monitoring well would sample. The averaged depth-discrete data compared favorably with the data obtained from the nearby monitoring wells.
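
The averaging step described above amounts to a thickness-weighted mean of the depth-discrete concentrations (a simple mixing assumption that ignores differences in hydraulic conductivity between intervals):

```python
def screened_average(concs, thicknesses):
    """Thickness-weighted average of depth-discrete concentrations,
    simulating what a monitoring well screened across those intervals
    would sample under a simple, equal-flow mixing assumption."""
    total = sum(thicknesses)
    return sum(c * t for c, t in zip(concs, thicknesses)) / total

# Hypothetical profile: three intervals (mg/L concentrations, m thicknesses)
avg = screened_average([120.0, 80.0, 20.0], [2.0, 3.0, 5.0])   # 58.0 mg/L
```
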

  10. Electrical discharge machining for vessel sample removal

    International Nuclear Information System (INIS)

    Litka, T.J.

    1993-01-01

    Due to aging-related problems, or to obtain essential metallurgical information (for plant-life extension or decommissioning), sample removal from nuclear plant vessels may be required as part of an examination. Vessel or cladding samples with cracks may be removed to determine the cause of cracking. Vessel weld samples may be removed to determine the weld metallurgy. In all cases, an engineering analysis must be done prior to sample removal to determine the vessel's integrity upon sample removal. Electrical discharge machining (EDM) is being used for in-vessel nuclear power plant sampling. Machining operations in reactor coolant system (RCS) components must be accomplished while collecting machining chips that could cause damage if they become part of the flow stream. The debris from EDM is a fine talc-like particulate (no chips), which can be collected by flushing and filtration.

  11. Fluorescent determination of graphene quantum dots in water samples

    Energy Technology Data Exchange (ETDEWEB)

    Benítez-Martínez, Sandra; Valcárcel, Miguel, E-mail: qa1meobj@uco.es

    2015-10-08

    This work presents a simple, fast and sensitive method for the preconcentration and quantification of graphene quantum dots (GQDs) in aqueous samples. GQDs are considered an object of analysis (analyte), not an analytical tool, which is the most frequent situation in Analytical Nanoscience and Nanotechnology. This approach is based on the preconcentration of graphene quantum dots on an anion exchange sorbent by solid phase extraction and their subsequent elution prior to fluorimetric analysis of the solution containing graphene quantum dots. Parameters of the extraction procedure such as sample volume, type of solvent, sample pH, sample flow rate and elution conditions were investigated in order to achieve high extraction efficiency. The limits of detection and quantification were 7.5 μg L−1 and 25 μg L−1, respectively. The precision for 200 μg L−1, expressed as %RSD, was 2.8%. Recovery percentages between 86.9 and 103.9% were obtained for two different concentration levels. Interferences from other nanoparticles were studied and no significant changes were observed at the concentration levels tested. Consequently, the optimized procedure has great potential to be applied to the determination of graphene quantum dots at trace levels in drinking and environmental waters. - Highlights: • Development of a novel and simple method for determination of graphene quantum dots. • Preconcentration of graphene quantum dots by solid phase extraction. • Fluorescence spectroscopy allows fast measurements. • High sensitivity and great reproducibility are achieved.
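
Limits of this kind are commonly derived from a calibration line as LOD = 3s/slope and LOQ = 10s/slope, with s a noise estimate. The sketch below uses hypothetical calibration data and that common criterion, since the abstract reports only the resulting limits:

```python
# Hypothetical calibration standards (ug/L) and fluorescence readings;
# illustrative numbers only, not the paper's data.
conc   = [0.0, 25.0, 50.0, 100.0, 150.0]
signal = [2.0, 52.0, 101.0, 203.0, 299.0]

n = len(conc)
mx, my = sum(conc) / n, sum(signal) / n
sxx = sum((x - mx) ** 2 for x in conc)
sxy = sum((x - mx) * (y - my) for x, y in zip(conc, signal))
slope = sxy / sxx                       # least-squares calibration slope
intercept = my - slope * mx

# residual standard deviation as the noise estimate s
resid = [y - (intercept + slope * x) for x, y in zip(conc, signal)]
s = (sum(r * r for r in resid) / (n - 2)) ** 0.5

lod = 3.0 * s / slope    # limit of detection (3s criterion, assumed)
loq = 10.0 * s / slope   # limit of quantification (10s criterion, assumed)
```

The familiar factor of 10/3 between LOQ and LOD follows directly from the two criteria; it also holds for the 25 and 7.5 μg L−1 values reported above.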

  12. Direct trace-elemental analysis of urine samples by laser ablation-inductively coupled plasma mass spectrometry after sample deposition on clinical filter papers.

    Science.gov (United States)

    Aramendía, Maite; Rello, Luis; Vanhaecke, Frank; Resano, Martín

    2012-10-16

    Collection of biological fluids on clinical filter papers shows important advantages from a logistic point of view, although analysis of these specimens is far from straightforward. Concerning urine analysis, and particularly when direct trace-elemental analysis by laser ablation-inductively coupled plasma mass spectrometry (LA-ICPMS) is intended, several problems arise, such as lack of sensitivity or uneven distribution of the analytes on the filter paper, making it difficult to obtain reliable quantitative results. In this paper, a novel approach for urine collection is proposed which circumvents many of these problems. This methodology consists of the use of precut filter paper discs on which large amounts of sample can be retained in a single deposition. This provides higher amounts of the target analytes and, thus, sufficient sensitivity, and allows addition of an adequate internal standard at the clinical lab prior to analysis, therefore making it suitable for a strategy based on unsupervised sample collection and later analysis at referral centers. On the basis of this sampling methodology, an analytical method was developed for the direct determination of several elements in urine (Be, Bi, Cd, Co, Cu, Ni, Sb, Sn, Tl, Pb, and V) at the low μg L(-1) level by means of LA-ICPMS. The method developed provides good results in terms of accuracy and LODs (≤1 μg L(-1) for most of the analytes tested), with a precision in the range of 15%, fit-for-purpose for clinical control analysis.

  13. Comparison of the radiochemical separation procedures of plutonium applied for its determination in environmental samples using alpha spectrometry

    International Nuclear Information System (INIS)

    Komosa, A.; Michalik, S.

    2006-01-01

    Alpha spectrometry of plutonium isotopes can be performed only after complete separation of plutonium from the other components of the matrix, and numerous separation procedures have therefore been developed and tested. This communication compares determinations of plutonium content in soil, bones, eggshells and reference materials obtained by alpha spectrometry combined with two different separation procedures. The samples were mineralized in concentrated HCl or HF prior to plutonium electrodeposition or coprecipitation with NdF₃. Several other procedural variants were also tested. The quality of the spectra is discussed in terms of these pre-treatment methods.

  14. Use of semiconductor detector c-Si microstrip type in obtaining the digital radiographic imaging of phantoms and biological samples of mammary glands

    International Nuclear Information System (INIS)

    Leyva, A.; Cabal, A.; Pinera, I.; Abreu, Y.; Cruz, C. M.; Montano, L. M.; Diaz, C. C.; Fontaine, M.; Ortiz, C. M.; Padilla, F.; De la Mora, R.

    2009-01-01

    The present work synthesizes the experimental results obtained in the characterization of a 64-microstrip crystalline silicon detector designed for high-energy physics experiments, with the objective of studying its possible application in advanced medical radiography, specifically digital mammography and angiography. The research includes the acquisition of two-dimensional radiographs of a mammography phantom using a scanning method, and their comparison with similar images simulated mathematically for different X-ray sources. The paper also shows experimental radiographs of two biological samples taken from breast biopsies, in which possible pathological lesions can be identified. The results of this work point toward the effective possibility of introducing these advanced detectors into medical digital imaging applications. (Author)

  15. Sampling and Characterization of 618-2 Anomalous Material

    International Nuclear Information System (INIS)

    Zacharias, A.E.

    2006-01-01

    This as low as reasonably achievable (ALARA) Level II review documents radiological engineering and administrative controls necessary for the sampling and characterization of anomalous materials discovered during the remediation of the 618-2 solid waste burial ground. The goals of these engineering and administrative controls are to keep personnel exposure ALARA, control contamination levels, and minimize potential for airborne contamination. Excavation of the 618-2 Burial Ground has produced many items of anomalous waste. Prior to temporary packaging and/or storage, these items have been characterized in the field to identify radiological and industrial safety conditions. Further sampling and characterization of these items, as well as those remaining from an excavated combination safe, is the subject of this ALARA Level II review. An ALARA in-progress review will also be performed prior to sampling and characterization of 618-2 anomalous materials offering risks of differing natures. General categories of anomalies requiring further characterization include the following: (1) Containers of unknown liquids and/or solids and powders (excluding transuranics); (2) Drums containing unknown liquids and/or solids; (3) Metal containers with unknown contents; and (4) Known or suspected transuranic material.

  16. A SELDI mass spectrometry study of experimental autoimmune encephalomyelitis: sample preparation, reproducibility, and differential protein expression patterns.

    Science.gov (United States)

    Azzam, Sausan; Broadwater, Laurie; Li, Shuo; Freeman, Ernest J; McDonough, Jennifer; Gregory, Roger B

    2013-05-01

    Experimental autoimmune encephalomyelitis (EAE) is an autoimmune, inflammatory disease of the central nervous system that is widely used as a model of multiple sclerosis (MS). Mitochondrial dysfunction appears to play a role in the development of neuropathology in MS and may also play a role in disease pathology in EAE. Here, surface enhanced laser desorption ionization mass spectrometry (SELDI-MS) has been employed to obtain protein expression profiles from mitochondrially enriched fractions derived from EAE and control mouse brain. To gain insight into experimental variation, the reproducibility of sub-cellular fractionation, anion exchange fractionation as well as spot-to-spot and chip-to-chip variation using pooled samples from brain tissue was examined. Variability of SELDI mass spectral peak intensities indicates a coefficient of variation (CV) of 15.6% and 17.6% between spots on a given chip and between different chips, respectively. Thinly slicing tissue prior to homogenization with a rotor homogenizer showed better reproducibility (CV = 17.0%) than homogenization of blocks of brain tissue with a Teflon® pestle (CV = 27.0%). Fractionation of proteins with anion exchange beads prior to SELDI-MS analysis gave overall CV values from 16.1% to 18.6%. SELDI mass spectra of mitochondrial fractions obtained from brain tissue from EAE mice and controls displayed 39 differentially expressed proteins (p≤ 0.05) out of a total of 241 protein peaks observed in anion exchange fractions. Hierarchical clustering analysis showed that protein fractions from EAE animals with severe disability clearly segregated from controls. Several components of electron transport chain complexes (cytochrome c oxidase subunit 6b1, subunit 6C, and subunit 4; NADH dehydrogenase flavoprotein 3, alpha subcomplex subunit 2, Fe-S protein 4, and Fe-S protein 6; and ATP synthase subunit e) were identified as possible differentially expressed proteins. Myelin Basic Protein isoform 8 (MBP8) (14.2 k
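The spot-to-spot and chip-to-chip reproducibility figures in this abstract are coefficients of variation. A minimal sketch of that metric, with invented peak intensities (not study data):

```python
import statistics

# Minimal sketch of the reproducibility metric used above: the coefficient
# of variation (CV) of SELDI peak intensities. Intensities are invented.

def coefficient_of_variation(values):
    """CV (%) = 100 * sample standard deviation / mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

spot_intensities = [1020.0, 980.0, 1150.0, 870.0, 1005.0]  # one peak, 5 spots
print(f"CV = {coefficient_of_variation(spot_intensities):.1f}%")
```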

  17. Prior elicitation and Bayesian analysis of the Steroids for Corneal Ulcers Trial.

    Science.gov (United States)

    See, Craig W; Srinivasan, Muthiah; Saravanan, Somu; Oldenburg, Catherine E; Esterberg, Elizabeth J; Ray, Kathryn J; Glaser, Tanya S; Tu, Elmer Y; Zegans, Michael E; McLeod, Stephen D; Acharya, Nisha R; Lietman, Thomas M

    2012-12-01

    To elicit expert opinion on the use of adjunctive corticosteroid therapy in bacterial corneal ulcers. To perform a Bayesian analysis of the Steroids for Corneal Ulcers Trial (SCUT), using expert opinion as a prior probability. The SCUT was a placebo-controlled trial assessing visual outcomes in patients receiving topical corticosteroids or placebo as adjunctive therapy for bacterial keratitis. Questionnaires were conducted at scientific meetings in India and North America to gauge expert consensus on the perceived benefit of corticosteroids as adjunct treatment. Bayesian analysis, using the questionnaire data as a prior probability and the primary outcome of SCUT as a likelihood, was performed. For comparison, an additional Bayesian analysis was performed using the results of the SCUT pilot study as a prior distribution. Indian respondents believed there to be a 1.21 Snellen line improvement, and North American respondents believed there to be a 1.24 line improvement with corticosteroid therapy. The SCUT primary outcome found a non-significant 0.09 Snellen line benefit with corticosteroid treatment. The results of the Bayesian analysis estimated a slightly greater benefit than did the SCUT primary analysis (0.19 lines versus 0.09 lines). Indian and North American experts had similar expectations of the effectiveness of corticosteroids in bacterial corneal ulcers: that corticosteroids would markedly improve visual outcomes. Bayesian analysis produced results very similar to those produced by the SCUT primary analysis. The similarity in result is likely due to the large sample size of SCUT and helps validate the results of SCUT.
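The Bayesian updating this abstract describes can be sketched in conjugate normal form: the elicited expert opinion serves as the prior and the trial estimate as the likelihood. The means below echo the abstract (a prior of roughly 1.2 lines, a trial estimate of 0.09 lines), but the standard deviations are invented, so the numerical output is illustrative only:

```python
# Hedged sketch of precision-weighted normal-normal updating. The prior and
# likelihood means echo the abstract; the spread parameters are invented.

def normal_posterior(prior_mean, prior_sd, like_mean, like_se):
    """Combine two normal densities by precision weighting."""
    w_prior = 1.0 / prior_sd**2
    w_like = 1.0 / like_se**2
    post_mean = (w_prior * prior_mean + w_like * like_mean) / (w_prior + w_like)
    post_sd = (w_prior + w_like) ** -0.5
    return post_mean, post_sd

# Expert prior ~1.2 Snellen lines benefit; SCUT estimate 0.09 lines.
mean, sd = normal_posterior(prior_mean=1.2, prior_sd=1.0,
                            like_mean=0.09, like_se=0.35)
print(f"posterior mean {mean:.2f} lines (sd {sd:.2f})")
```

With a precise likelihood (small standard error, as from a large trial), the posterior mean sits close to the trial estimate, which mirrors the abstract's conclusion.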

  18. 30 CFR 872.29 - What are prior balance replacement funds?

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 3 2010-07-01 false What are prior balance replacement funds? 872... § 872.29 What are prior balance replacement funds? “Prior balance replacement funds” are moneys we must... general funds of the United States Treasury that are otherwise unappropriated. Under section 411(h)(1) of...

  19. Modern methods of sample preparation for GC analysis

    NARCIS (Netherlands)

    de Koning, S.; Janssen, H.-G.; Brinkman, U.A.Th.

    2009-01-01

    Today, a wide variety of techniques is available for the preparation of (semi-) solid, liquid and gaseous samples, prior to their instrumental analysis by means of capillary gas chromatography (GC) or, increasingly, comprehensive two-dimensional GC (GC × GC). In the past two decades, a large number

  20. Moderation of the Alliance-Outcome Association by Prior Depressive Episodes: Differential Effects in Cognitive-Behavioral Therapy and Short-Term Psychodynamic Supportive Psychotherapy.

    Science.gov (United States)

    Lorenzo-Luaces, Lorenzo; Driessen, Ellen; DeRubeis, Robert J; Van, Henricus L; Keefe, John R; Hendriksen, Mariëlle; Dekker, Jack

    2017-09-01

    Prior studies have suggested that the association between the alliance and depression improvement varies as a function of prior history of depression. We sought to replicate these findings and extend them to short-term psychodynamic supportive psychotherapy (SPSP) in a sample of patients who were randomized to one of these treatments and were administered the Helping Alliance Questionnaire (N=282) at Week 5 of treatment. Overall, the alliance was a predictor of symptom change (d=0.33). In SPSP, the alliance was a modest but robust predictor of change, irrespective of prior episodes (d=0.25-0.33). By contrast, in CBT, the effects of the alliance on symptom change were large for patients with 0 prior episodes (d=0.86), moderate for those with 1 prior episode (d=0.49), and small for those with 2+ prior episodes (d=0.12). These findings suggest a complex interaction between patient features and common vs. specific therapy processes. In CBT, the alliance relates to change for patients with less recurrent depression whereas other CBT-specific processes may account for change for patients with more recurrent depression. Copyright © 2016. Published by Elsevier Ltd.

  1. Prior opportunities to identify abuse in children with abusive head trauma.

    Science.gov (United States)

    Letson, Megan M; Cooper, Jennifer N; Deans, Katherine J; Scribano, Philip V; Makoroff, Kathi L; Feldman, Kenneth W; Berger, Rachel P

    2016-10-01

    Infants with minor abusive injuries are at risk for more serious abusive injury, including abusive head trauma (AHT). Our study objective was to determine if children with AHT had prior opportunities to detect abuse and to describe the opportunities. All AHT cases from 7/1/2009 to 12/31/2011 at four tertiary care children's hospitals were included. A prior opportunity was defined as prior evaluation by either a medical or child protective services (CPS) professional when the symptoms and/or referral could be consistent with abuse but the diagnosis was not made and/or an alternate explanation was given and accepted. Two-hundred-thirty-two children with AHT were identified; median age (IQR) was 5.40 (3.30, 14.60) months. Ten percent (22/232) died. Of the 232 patients diagnosed with AHT, 31% (n=73) had a total of 120 prior opportunities. Fifty-nine children (25%) had at least one prior opportunity to identify abuse in a medical setting, representing 98 prior opportunities. An additional 14 (6%) children had 22 prior opportunities through previous CPS involvement. There were no differences between those with and without a prior opportunity based on age, gender, race, insurance, mortality, or institution. Children with prior opportunities in a medical setting were more likely to have chronic subdural hemorrhage (48 vs. 17%, p<0.01) and healing fractures (31 vs. 19%, p=0.05). The most common prior opportunities included vomiting 31.6% (38/120), prior CPS contact 20% (24/120), and bruising 11.7% (14/120). Improvements in earlier recognition of AHT and subsequent intervention might prevent additional injuries and reduce mortality. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Study of Ferrite During Refinement of Prior Austenite Grains in Microalloyed Steel Continuous Casting

    Science.gov (United States)

    Liu, Jiang; Wen, Guanghua; Tang, Ping

    2017-12-01

    The formation of coarse prior austenite grains is a key factor promoting transverse cracking, and susceptibility to transverse cracking can be reduced by refining the austenite grain size. In the present study, high-temperature confocal laser scanning microscopy (CLSM) was used to simulate two types of double phase-transformation technology. The distribution and morphology of ferrites under different cooling conditions were analyzed, and their effects on the double phase-transformation technologies were explored to identify the double phase-transformation technology best suited to the continuous casting process. The results indicate that, under the thermal cycle TH0 [the specimens were cooled down to 913 K (640 °C) at a cooling rate of 5.0 K/s (5.0 °C/s)], the prior austenite grain boundaries were wide and the dislocation density at the grain boundaries was high, which strongly inhibited crack propagation; under the thermal cycle TH1 [the specimens were cooled down to 1073 K (800 °C) at a cooling rate of 5.0 K/s (5.0 °C/s) and then to 913 K (640 °C) at a cooling rate of 1.0 K/s (1.0 °C/s)], the prior austenite grain boundaries were narrow and the dislocation density at the grain boundaries was low, which was conducive to crack propagation. After the first phase transformation, the film-like ferrite that developed along the austenite grain boundaries improved the nucleation conditions of new austenite grains and removed the inhibiting effect of the prior austenite grain boundaries on the austenite grain size.

  3. Obtaining antibiotics online from within the UK: a cross-sectional study

    Science.gov (United States)

    Boyd, Sara Elizabeth; Moore, Luke Stephen Prockter; Gilchrist, Mark; Costelloe, Ceire; Castro-Sánchez, Enrique; Franklin, Bryony Dean; Holmes, Alison Helen

    2017-01-01

    Background: Improved antibiotic stewardship (AS) and reduced prescribing in primary care, with a parallel increase in personal internet use, could lead citizens to obtain antibiotics from alternative sources online. Objectives: A cross-sectional analysis was performed to: (i) determine the quality and legality of online pharmacies selling antibiotics to the UK public; (ii) describe processes for obtaining antibiotics online from within the UK; and (iii) identify resulting AS and patient safety issues. Methods: Searches were conducted for ‘buy antibiotics online’ using Google and Yahoo. For each search engine, data from the first 10 web sites with unique URL addresses were reviewed. Analysis was conducted on evidence of appropriate pharmacy registration, prescription requirement, whether antibiotic choice was ‘prescriber-driven’ or ‘consumer-driven’, and whether specific information was required (allergies, comorbidities, pregnancy) or given (adverse effects) prior to purchase. Results: Twenty unique URL addresses were analysed in detail. Online pharmacies evidencing their location in the UK (n = 5; 25%) required a prescription before antibiotic purchase, and were appropriately registered. Online pharmacies unclear about the location they were operating from (n = 10; 50%) had variable prescription requirements, and no evidence of appropriate registration. Nine (45%) online pharmacies did not require a prescription prior to purchase. For 16 (80%) online pharmacies, decisions were initially consumer-driven for antibiotic choice, dose and quantity. Conclusions: Wide variation exists among online pharmacies in relation to antibiotic practices, highlighting considerable patient safety and AS issues. Improved education, legislation, regulation and new best practice stewardship guidelines are urgently needed for online antibiotic suppliers. PMID:28333179

  4. Cumulative Exposure to Prior Collective Trauma and Acute Stress Responses to the Boston Marathon Bombings

    OpenAIRE

    Garfin, DR; Holman, EA; Silver, RC

    2015-01-01

    © The Author(s) 2015. The role of repeated exposure to collective trauma in explaining response to subsequent community-wide trauma is poorly understood. We examined the relationship between acute stress response to the 2013 Boston Marathon bombings and prior direct and indirect media-based exposure to three collective traumatic events: the September 11, 2001 (9/11) terrorist attacks, Superstorm Sandy, and the Sandy Hook Elementary School shooting. Representative samples of residents of metrop...

  5. An Extended Multilocus Sequence Typing (MLST) Scheme for Rapid Direct Typing of Leptospira from Clinical Samples.

    Directory of Open Access Journals (Sweden)

    Sabrina Weiss

    2016-09-01

    Rapid typing of Leptospira is currently impaired by the need for time-consuming culture of leptospires. The objective of this study was to develop an assay that provides multilocus sequence typing (MLST) data directly from patient specimens while minimising costs for subsequent sequencing. An existing PCR-based MLST scheme was modified by designing nested primers including anchors for facilitated subsequent sequencing. The assay was applied to various specimen types from patients diagnosed with leptospirosis between 2014 and 2015 in the United Kingdom (UK) and the Lao People's Democratic Republic (Lao PDR). Of 44 clinical samples (23 serum, 6 whole blood, 3 buffy coat, 12 urine) PCR positive for pathogenic Leptospira spp., at least one allele was amplified in 22 samples (50%) and used for phylogenetic inference. Full allelic profiles were obtained from ten specimens, representing all sample types (23%). No nonspecific amplicons were observed in any of the samples. Of twelve PCR-positive urine specimens, three gave full allelic profiles (25%) and two a partial profile. Phylogenetic analysis allowed for species assignment. The predominant species detected was L. interrogans (10/14 and 7/8 from the UK and Lao PDR, respectively). All other species were detected in samples from only one country (Lao PDR: L. borgpetersenii [1/8]; UK: L. kirschneri [1/14], L. santarosai [1/14], L. weilii [2/14]). Typing information for pathogenic Leptospira spp. was thus obtained directly from a variety of clinical samples using a modified MLST assay. This assay negates the need for time-consuming culture of Leptospira prior to typing and will be of use both in surveillance, as single alleles enable species determination, and in outbreaks for the rapid identification of clusters.

  6. Children Prefer Diverse Samples for Inductive Reasoning in the Social Domain

    Science.gov (United States)

    Noyes, Alexander; Christie, Stella

    2016-01-01

    Not all samples of evidence are equally conclusive: Diverse evidence is more representative than narrow evidence. Prior research showed that children did not use sample diversity in evidence selection tasks, indiscriminately choosing diverse or narrow sets (tiger-mouse; tiger-lion) to learn about animals. This failure is not due to a general…

  7. Entropy Econometrics for combining regional economic forecasts: A Data-Weighted Prior Estimator

    Science.gov (United States)

    Fernández-Vázquez, Esteban; Moreno, Blanca

    2017-10-01

    Forecast combination has been studied in econometrics for a long time, and the literature has shown the superior performance of forecast combination over individual predictions. However, there is still controversy on which is the best procedure to specify the forecast weights. This paper explores the possibility of using a procedure based on Entropy Econometrics, which allows setting the weights for the individual forecasts as a mixture of different alternatives. In particular, we examine the ability of the Data-Weighted Prior Estimator proposed by Golan (J Econom 101(1):165-193, 2001) to combine forecasting models in a context of small sample sizes, a relative common scenario when dealing with time series for regional economies. We test the validity of the proposed approach using a simulation exercise and a real-world example that aims at predicting gross regional product growth rates for a regional economy. The forecasting performance of the Data-Weighted Prior Estimator proposed is compared with other combining methods. The simulation results indicate that in scenarios of heavily ill-conditioned datasets the approach suggested dominates other forecast combination strategies. The empirical results are consistent with the conclusions found in the numerical experiment.
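Forecast combination itself can be illustrated with a much simpler weighting rule than the Data-Weighted Prior estimator studied above. The sketch below weights each model by the inverse of its historical mean squared error, a standard combination baseline (not Golan's method), with invented forecasts and error histories:

```python
# Simple illustration of forecast combination (not the Data-Weighted Prior
# estimator itself): weight each model by the inverse of its past mean
# squared error, so historically more accurate models get larger weights.

def inverse_mse_weights(errors_by_model):
    """errors_by_model: list of lists of past forecast errors, one per model."""
    inv_mse = [1.0 / (sum(e * e for e in errs) / len(errs))
               for errs in errors_by_model]
    total = sum(inv_mse)
    return [w / total for w in inv_mse]

def combine(forecasts, weights):
    """Weighted average of the individual point forecasts."""
    return sum(f * w for f, w in zip(forecasts, weights))

past_errors = [[0.5, -0.3, 0.4], [1.2, -0.9, 1.1]]  # model A, model B (invented)
weights = inverse_mse_weights(past_errors)
print(combine([2.1, 1.4], weights))  # combined growth-rate forecast
```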

  8. Rapid determination of benzene derivatives in water samples by trace volume solvent DLLME prior to GC-FID

    Energy Technology Data Exchange (ETDEWEB)

    Diao, Chun Peng; Wei, Chao Hai; Feng, Chun Hua [South China Univ. of Technology, Guangzhou Higher Education Mega Center (China). College of Environmental Science and Engineering; Guangdong Regular Higher Education Institutions, Guangzhou (China). Key Lab. of Environmental Protection and Eco-Remediation

    2012-05-15

    An inexpensive, simple and environmentally friendly method based on dispersive liquid-liquid microextraction (DLLME) for the rapid determination of benzene derivatives in water samples is proposed. A significant improvement of the DLLME procedure was achieved: a trace volume of ethyl acetate (60 μL) was exploited as the dispersion solvent instead of common ones such as methanol and acetone, whose required volume exceeds 0.5 mL, so the organic solvent required in DLLME was greatly reduced. Only 83 μL of organic solvent was consumed in the whole analytical process, and the preconcentration procedure took less than 10 min. The approach, coupled with a gas chromatograph-flame ionization detector, was applied to the rapid determination of benzene, toluene, ethylbenzene and xylene isomers in water samples. Results showed that the proposed approach is an efficient method for the rapid determination of benzene derivatives in aqueous samples. (orig.)

  9. A normative inference approach for optimal sample sizes in decisions from experience

    Science.gov (United States)

    Ostwald, Dirk; Starke, Ludger; Hertwig, Ralph

    2015-01-01

    “Decisions from experience” (DFE) refers to a body of work that emerged in research on behavioral decision making over the last decade. One of the major experimental paradigms employed to study experience-based choice is the “sampling paradigm,” which serves as a model of decision making under limited knowledge about the statistical structure of the world. In this paradigm respondents are presented with two payoff distributions, which, in contrast to standard approaches in behavioral economics, are specified not in terms of explicit outcome-probability information, but by the opportunity to sample outcomes from each distribution without economic consequences. Participants are encouraged to explore the distributions until they feel confident enough to decide which distribution they would prefer to draw from in a final trial involving real monetary payoffs. One commonly employed measure to characterize the behavior of participants in the sampling paradigm is the sample size, that is, the number of outcome draws which participants choose to obtain from each distribution prior to terminating sampling. A natural question that arises in this context concerns the “optimal” sample size, which could be used as a normative benchmark to evaluate human sampling behavior in DFE. In this theoretical study, we relate the DFE sampling paradigm to the classical statistical decision theoretic literature and, under a probabilistic inference assumption, evaluate optimal sample sizes for DFE. In our treatment we go beyond analytically established results by showing how the classical statistical decision theoretic framework can be used to derive optimal sample sizes under arbitrary, but numerically evaluable, constraints. Finally, we critically evaluate the value of deriving optimal sample sizes under this framework as testable predictions for the experimental study of sampling behavior in DFE. PMID:26441720
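The trade-off this abstract formalises, that larger free samples raise the probability of choosing the better payoff distribution, can be illustrated by Monte Carlo simulation. This is a sketch of the sampling paradigm only, not the paper's decision-theoretic derivation; the Bernoulli payoff probabilities are invented:

```python
import random

# Illustrative sketch: estimate, by Monte Carlo, how the probability of
# choosing the better of two Bernoulli payoff distributions grows with the
# number of free samples drawn from each. Ties are broken at random.

def p_correct_choice(p_a, p_b, n_samples, trials=20000, seed=1):
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        mean_a = sum(rng.random() < p_a for _ in range(n_samples)) / n_samples
        mean_b = sum(rng.random() < p_b for _ in range(n_samples)) / n_samples
        if mean_a > mean_b or (mean_a == mean_b and rng.random() < 0.5):
            correct += 1
    return correct / trials

for n in (2, 5, 10, 25):  # sample size per distribution
    print(n, p_correct_choice(0.6, 0.4, n))
```

An actual optimal sample size would weigh this accuracy gain against the cost of continued sampling, which is the question the paper addresses formally.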

  10. Magnetic headspace adsorptive extraction of chlorobenzenes prior to thermal desorption gas chromatography-mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Vidal, Lorena, E-mail: lorena.vidal@ua.es [Department of Analytical Chemistry, Nutrition and Food Sciences and University Institute of Materials, University of Alicante, P.O. Box 99, E-03080, Alicante (Spain); Ahmadi, Mazaher [Faculty of Chemistry, Bu-Ali Sina University, Hamedan (Iran, Islamic Republic of); Fernández, Elena [Department of Analytical Chemistry, Nutrition and Food Sciences and University Institute of Materials, University of Alicante, P.O. Box 99, E-03080, Alicante (Spain); Madrakian, Tayyebeh [Faculty of Chemistry, Bu-Ali Sina University, Hamedan (Iran, Islamic Republic of); Canals, Antonio, E-mail: a.canals@ua.es [Department of Analytical Chemistry, Nutrition and Food Sciences and University Institute of Materials, University of Alicante, P.O. Box 99, E-03080, Alicante (Spain)

    2017-06-08

    This study presents a new, user-friendly, cost-effective and portable headspace solid-phase extraction technique based on graphene oxide decorated with iron oxide magnetic nanoparticles as sorbent, located on one end of a small neodymium magnet. Hence, the new headspace solid-phase extraction technique has been called Magnetic Headspace Adsorptive Extraction (Mag-HSAE). In order to assess the applicability of the Mag-HSAE technique to model analytes, some chlorobenzenes were extracted from water samples prior to gas chromatography-mass spectrometry determination. A multivariate approach was employed to optimize the experimental parameters affecting Mag-HSAE. The method was evaluated under optimized extraction conditions (i.e., sample volume, 20 mL; extraction time, 30 min; sorbent amount, 10 mg; stirring speed, 1500 rpm; and ionic strength, non-significant), obtaining a linear response from 0.5 to 100 ng L⁻¹ for 1,3-DCB, 1,4-DCB, 1,2-DCB, 1,3,5-TCB, 1,2,4-TCB and 1,2,3-TCB; from 0.5 to 75 ng L⁻¹ for 1,2,4,5-TeCB and PeCB; and from 1 to 75 ng L⁻¹ for 1,2,3,4-TeCB. The repeatability of the proposed method was evaluated at 10 ng L⁻¹ and 50 ng L⁻¹ spiking levels, and coefficients of variation ranged between 1.5 and 9.5% (n = 5). Limits of detection were found between 93 and 301 pg L⁻¹. Finally, tap, mineral and effluent water were selected as real water samples to assess method applicability. Relative recoveries varied between 86 and 110%, showing negligible matrix effects. - Highlights: • A new extraction technique named Magnetic Headspace Adsorptive Extraction is presented. • Graphene oxide/iron oxide composite deposited on a neodymium magnet as sorbent. • Sorbent of low cost, rapid and simple synthesis, easy manipulation and portability. • Fast and efficient extraction and sensitive determination of chlorobenzenes in water samples.
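Detection limits like those reported above are commonly estimated from a linear calibration as LOD = 3.3·s_blank/slope. A minimal sketch with invented calibration data (the concentrations, signals and blank standard deviation below are not from the study):

```python
# Hedged sketch of a common way to estimate a limit of detection: fit a
# linear calibration (signal vs. concentration) by ordinary least squares,
# then apply LOD = 3.3 * s_blank / slope. All numbers are invented.

def fit_slope_intercept(x, y):
    """Ordinary least-squares fit y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

conc = [0.5, 5.0, 20.0, 50.0, 100.0]           # ng/L spiking levels (invented)
signal = [12.0, 110.0, 430.0, 1060.0, 2120.0]  # detector response (invented)
intercept, slope = fit_slope_intercept(conc, signal)
s_blank = 0.7                                  # sd of blank signal (invented)
lod = 3.3 * s_blank / slope                    # in ng/L
print(f"slope {slope:.2f}, LOD {lod * 1000:.0f} pg/L")
```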

  11. Characterization of a low-level radioactive waste grout: Sampling and test results

    International Nuclear Information System (INIS)

    Martin, P.F.C.; Lokken, R.O.

    1992-12-01

    WHC manages and operates the grout treatment facility at Hanford as part of a DOE program to clean up wastes stored at federal nuclear production sites. PNL provides support to the grout disposal program through pilot-scale tests, performance assessments, and formulation verification activities. In 1988 and 1989, over one million gallons of a low-level radioactive liquid waste was processed through the facility to produce a grout waste that was then deposited in an underground vault. The liquid waste was phosphate/sulfate waste (PSW) generated in decontamination of the N Reactor. PNL sampled and tested the grout produced during the second half of the PSW campaign to support quality verification activities prior to grout vault closure. Samples of grout were obtained by inserting nested-tube samplers into the grout slurry in the vault. After the grout had cured, the inner tube of the sampler was removed and the grout samples extracted. Tests for compressive strength, sonic velocity, and leaching were used to assess grout quality; results were compared to those from pilot-scale test grouts made with a simulated PSW. The grout produced during the second half of the PSW campaign exceeded the compressive strength and leachability formulation criteria. The nested-tube samplers were effective in collecting samples of grout, although their use introduced greater variability into the compressive strength data.

  12. Prior and present evidence: how prior experience interacts with present information in a perceptual decision making task.

    Directory of Open Access Journals (Sweden)

    Muhsin Karim

    Full Text Available Vibrotactile discrimination tasks have been used to examine decision making processes in the presence of perceptual uncertainty, induced by barely discernible frequency differences between paired stimuli or by the presence of embedded noise. One lesser known property of such tasks is that decisions made on a single trial may be biased by information from prior trials. An example is the time-order effect whereby the presentation order of paired stimuli may introduce differences in accuracy. Subjects perform better when the first stimulus lies between the second stimulus and the global mean of all stimuli on the judged dimension ("preferred" time-orders compared to the alternative presentation order ("nonpreferred" time-orders. This has been conceptualised as a "drift" of the first stimulus representation towards the global mean of the stimulus-set (an internal standard. We describe the influence of prior information in relation to the more traditionally studied factors of interest in a classic discrimination task.Sixty subjects performed a vibrotactile discrimination task with different levels of uncertainty parametrically induced by increasing task difficulty, aperiodic stimulus noise, and changing the task instructions whilst maintaining identical stimulus properties (the "context".The time-order effect had a greater influence on task performance than two of the explicit factors-task difficulty and noise-but not context. The influence of prior information increased with the distance of the first stimulus from the global mean, suggesting that the "drift" velocity of the first stimulus towards the global mean representation was greater for these trials.Awareness of the time-order effect and prior information in general is essential when studying perceptual decision making tasks. Implicit mechanisms may have a greater influence than the explicit factors under study. It also affords valuable insights into basic mechanisms of information

  13. Nationwide survey of policies and practices related to capillary blood sampling in medical laboratories in Croatia.

    Science.gov (United States)

    Krleza, Jasna Lenicek

    2014-01-01

    Capillary sampling is increasingly used to obtain blood for laboratory tests in volumes as small as necessary and as non-invasively as possible. Whether capillary blood sampling is also frequent in Croatia, and whether it is performed according to international laboratory standards is unclear. All medical laboratories that participate in the Croatian National External Quality Assessment Program (N = 204) were surveyed on-line to collect information about the laboratory's parent institution, patient population, types and frequencies of laboratory tests based on capillary blood samples, choice of reference intervals, and policies and procedures specifically related to capillary sampling. Sampling practices were compared with guidelines from the Clinical and Laboratory Standards Institute (CLSI) and the World Health Organization (WHO). Of the 204 laboratories surveyed, 174 (85%) responded with complete questionnaires. Among the 174 respondents, 155 (89%) reported that they routinely perform capillary sampling, which is carried out by laboratory staff in 118 laboratories (76%). Nearly half of respondent laboratories (48%) do not have a written protocol including order of draw for multiple sampling. A single puncture site is used to provide capillary blood for up to two samples at 43% of laboratories that occasionally or regularly perform such sampling. Most respondents (88%) never perform arterialisation prior to capillary blood sampling. Capillary blood sampling is highly prevalent in Croatia across different types of clinical facilities and patient populations. Capillary sampling procedures are not standardised in the country, and the rate of laboratory compliance with CLSI and WHO guidelines is low.

  14. The Prior Can Often Only Be Understood in the Context of the Likelihood

    Directory of Open Access Journals (Sweden)

    Andrew Gelman

    2017-10-01

    Full Text Available A key sticking point of Bayesian analysis is the choice of prior distribution, and there is a vast literature on potential defaults including uniform priors, Jeffreys’ priors, reference priors, maximum entropy priors, and weakly informative priors. These methods, however, often manifest a key conceptual tension in prior modeling: a model encoding true prior information should be chosen without reference to the model of the measurement process, but almost all common prior modeling techniques are implicitly motivated by a reference likelihood. In this paper we resolve this apparent paradox by placing the choice of prior into the context of the entire Bayesian analysis, from inference to prediction to model evaluation.
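    Gelman's point can be made concrete with a prior predictive simulation: a seemingly diffuse prior becomes extremely informative once pushed through a likelihood. The prior scale and covariate value below are arbitrary choices for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    x = 2.0  # a typical covariate value (assumed)
    # A "diffuse" N(0, 10^2) prior on a logistic-regression slope...
    beta = rng.normal(0.0, 10.0, size=100_000)
    # ...implies prior predictive success probabilities piled at 0 and 1.
    p = 1.0 / (1.0 + np.exp(-np.clip(beta * x, -30.0, 30.0)))
    extreme = np.mean((p < 0.01) | (p > 0.99))
    print(f"prior mass on near-deterministic probabilities: {extreme:.2f}")
    ```

    Roughly 80% of the prior predictive probabilities fall outside [0.01, 0.99]: the prior cannot be judged "weak" without reference to the measurement model, which is the paper's central claim.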

  15. Quantum steganography using prior entanglement

    International Nuclear Information System (INIS)

    Mihara, Takashi

    2015-01-01

    Steganography is the hiding of secret information within innocent-looking information (e.g., text, audio, image, video, etc.). A quantum version of steganography is a method based on quantum physics. In this paper, we propose quantum steganography by combining quantum error-correcting codes with prior entanglement. In many steganographic techniques, embedding secret messages in error-correcting codes may cause damage to them if the embedded part is corrupted. However, our proposed steganography can separately create secret messages and the content of cover messages. The intrinsic form of the cover message does not have to be modified for embedding secret messages. - Highlights: • Our steganography combines quantum error-correcting codes with prior entanglement. • Our steganography can separately create secret messages and the content of cover messages. • Errors in cover messages do not affect the recovery of secret messages. • We embed a secret message in the Steane code as an example of our steganography.

  16. Quantum steganography using prior entanglement

    Energy Technology Data Exchange (ETDEWEB)

    Mihara, Takashi, E-mail: mihara@toyo.jp

    2015-06-05

    Steganography is the hiding of secret information within innocent-looking information (e.g., text, audio, image, video, etc.). A quantum version of steganography is a method based on quantum physics. In this paper, we propose quantum steganography by combining quantum error-correcting codes with prior entanglement. In many steganographic techniques, embedding secret messages in error-correcting codes may cause damage to them if the embedded part is corrupted. However, our proposed steganography can separately create secret messages and the content of cover messages. The intrinsic form of the cover message does not have to be modified for embedding secret messages. - Highlights: • Our steganography combines quantum error-correcting codes with prior entanglement. • Our steganography can separately create secret messages and the content of cover messages. • Errors in cover messages do not affect the recovery of secret messages. • We embed a secret message in the Steane code as an example of our steganography.

  17. Crowdsourcing prior information to improve study design and data analysis.

    Directory of Open Access Journals (Sweden)

    Jeffrey S Chrabaszcz

    Full Text Available Though Bayesian methods are being used more frequently, many still struggle with the best method for setting priors with novel measures or task environments. We propose a method for setting priors by eliciting continuous probability distributions from naive participants. This allows us to include any relevant information participants have for a given effect. Even when prior means are near-zero, this method provides a principled way to estimate dispersion and produce shrinkage, reducing the occurrence of overestimated effect sizes. We demonstrate this method with a number of published studies and compare the effect of different prior estimation and aggregation methods.
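    One simple way to aggregate elicited distributions, an equally weighted linear opinion pool over normal components, can be sketched as follows (the elicited means and spreads are hypothetical; the paper's own elicitation and aggregation schemes may differ):

    ```python
    import numpy as np

    # Elicited normal priors, one per participant (hypothetical values).
    means = np.array([0.10, -0.20, 0.05, 0.30])
    sds = np.array([0.50, 0.80, 0.40, 1.00])

    # Equally weighted mixture ("linear opinion pool").
    pool_mean = means.mean()
    # Mixture variance: E[sd^2 + mean^2] - (pooled mean)^2
    pool_var = np.mean(sds**2 + means**2) - pool_mean**2
    print(pool_mean, pool_var)
    ```

    Disagreement among participants inflates the pooled variance beyond the average of the individual variances, which is how elicitation can yield a sensibly dispersed prior even when the pooled mean is near zero.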

  18. Formulating informative, data-based priors for failure probability estimation in reliability analysis

    International Nuclear Information System (INIS)

    Guikema, Seth D.

    2007-01-01

    Priors play an important role in the use of Bayesian methods in risk analysis, and using all available information to formulate an informative prior can lead to more accurate posterior inferences. This paper examines the practical implications of using five different methods for formulating an informative prior for a failure probability based on past data. These methods are the method of moments, maximum likelihood (ML) estimation, maximum entropy estimation, starting from a non-informative 'pre-prior', and fitting a prior based on confidence/credible interval matching. The priors resulting from the use of these different methods are compared qualitatively, and the posteriors are compared quantitatively based on a number of different scenarios of observed data used to update the priors. The results show that the amount of information assumed in the prior makes a critical difference in the accuracy of the posterior inferences. For situations in which the data used to formulate the informative prior is an accurate reflection of the data that is later observed, the ML approach yields the minimum variance posterior. However, the maximum entropy approach is more robust to differences between the data used to formulate the prior and the observed data because it maximizes the uncertainty in the prior subject to the constraints imposed by the past data.
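    The first of the methods listed, the method of moments, can be sketched for a Beta prior on a failure probability together with its conjugate update (the past failure data and the new trial counts below are hypothetical):

    ```python
    import numpy as np

    # Failure probabilities estimated from past data (hypothetical).
    past = np.array([0.02, 0.05, 0.03, 0.08, 0.04])
    m, v = past.mean(), past.var(ddof=1)

    # Method of moments for Beta(a, b): match the sample mean and variance.
    common = m * (1.0 - m) / v - 1.0
    a, b = m * common, (1.0 - m) * common

    # Conjugate update after observing k failures in n new trials.
    k, n = 3, 50
    post_mean = (a + k) / (a + b + n)
    print(a, b, post_mean)
    ```

    The implied prior sample size a + b controls how strongly the past data pull the posterior, which is exactly the sensitivity the paper's scenario comparisons probe.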

  19. Analysis of the Touch-And-Go Surface Sampling Concept for Comet Sample Return Missions

    Science.gov (United States)

    Mandic, Milan; Acikmese, Behcet; Bayard, David S.; Blackmore, Lars

    2012-01-01

    This paper studies the Touch-and-Go (TAG) concept for enabling a spacecraft to take a sample from the surface of a small primitive body, such as an asteroid or comet. The idea behind the TAG concept is to let the spacecraft descend to the surface, make contact with the surface for several seconds, and then ascend to a safe location. Sampling would be accomplished by an end-effector that is active during the few seconds of surface contact. The TAG event is one of the most critical events in a primitive body sample-return mission. The purpose of this study is to evaluate the dynamic behavior of a representative spacecraft during the TAG event, i.e., immediately prior, during, and after surface contact of the sampler. The study evaluates the sample-collection performance of the proposed sampling end-effector, in this case a brushwheel sampler, while acquiring material from the surface during the contact. A main result of the study is a guidance and control (G&C) validation of the overall TAG concept, in addition to specific contributions to demonstrating the effectiveness of using nonlinear clutch mechanisms in the sampling arm joints, and increasing the length of the sampling arms to improve robustness.

  20. Determination of 2-Octanone in Biological Samples Using Liquid–Liquid Microextractions Followed by Gas Chromatography–Flame Ionization Detection

    Directory of Open Access Journals (Sweden)

    Abolghasem Jouyban, Maryam Abbaspour, Mir Ali Farajzadeh, Maryam Khoubnasabjafari

    2017-06-01

    Full Text Available Background: Analysis of chemicals in biological fluids is required in many areas of medical sciences. Rapid, highly efficient, and reliable dispersive and air-assisted liquid–liquid microextraction methods followed by gas chromatography-flame ionization detection were developed for the extraction, preconcentration, and determination of 2-octanone in human plasma and urine samples. Methods: Proteins of plasma samples are precipitated by adding methanol, and the urine sample is diluted with water prior to performing the microextraction procedure. Fine organic solvent droplets are formed by repeated suction and injection of the mixture of sample solution and extraction solvent into a test tube with a glass syringe. After extraction, phase separation is performed by centrifuging, and the enriched analyte in the sedimented organic phase is determined by the separation system. The main factors influencing the extraction efficiency, including extraction solvent type and volume, salt addition, pH, and extraction times, are investigated. Results: Under the optimized conditions, the proposed method showed good precision (relative standard deviation less than 7%). Limit of detection and lower limit of quantification for 2-octanone were obtained in the range of 0.1–0.5 µg mL−1. The linear ranges were 0.5-500 and 0.5-200 µg mL−1 in plasma and urine, respectively (r2 ≥ 0.9995). Enrichment factors were in the range of 13-37. Good recoveries (55–86%) were obtained for the spiked samples. Conclusion: Preconcentration methods coupled with GC analysis were developed and could be used to monitor 2-octanone in biological samples.
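    Figures of merit like the enrichment factor and relative recovery quoted above follow from simple definitions; a sketch with invented concentrations, not the paper's data:

    ```python
    # Enrichment factor: analyte concentration in the sedimented organic
    # phase divided by its initial concentration in the sample.
    c_initial = 10.0   # µg/mL in the (hypothetical) sample
    c_organic = 250.0  # µg/mL in the sedimented phase after microextraction
    ef = c_organic / c_initial

    # Relative recovery for a spiked sample.
    c_spiked_found, c_unspiked, c_added = 17.2, 10.0, 10.0
    rr = 100.0 * (c_spiked_found - c_unspiked) / c_added  # percent
    print(ef, rr)
    ```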

  1. Bayesian Estimation of Two-Parameter Weibull Distribution Using Extension of Jeffreys' Prior Information with Three Loss Functions

    Directory of Open Access Journals (Sweden)

    Chris Bambey Guure

    2012-01-01

    Full Text Available The Weibull distribution has been observed as one of the most useful distributions for modelling and analysing lifetime data in engineering, biology, and other fields. Extensive studies have been carried out in the literature to determine the best method for estimating its parameters. Recently, much attention has been given to the Bayesian estimation approach, which competes with other estimation methods. In this paper, we examine the performance of the maximum likelihood estimator and Bayesian estimators using an extension of Jeffreys' prior information with three loss functions, namely, the linear exponential loss, general entropy loss, and squared error loss functions, for estimating the two-parameter Weibull failure time distribution. These methods are compared using mean square error through a simulation study with varying sample sizes. The results show that the Bayesian estimator using the extension of Jeffreys' prior under the linear exponential loss function in most cases gives the smallest mean square error and absolute bias for both the scale parameter α and the shape parameter β for the given values of the extension of Jeffreys' prior.
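    The maximum-likelihood baseline used in such comparisons can be sketched directly: the Weibull shape parameter solves a one-dimensional profile score equation, and mean square error is estimated by simulation. The true parameter values, sample size, and replication count below are arbitrary choices, not the paper's settings:

    ```python
    import numpy as np

    def weibull_mle(x, iters=80):
        """ML estimates of the Weibull shape and scale, found by bisection
        on the monotone profile score equation for the shape parameter."""
        lx = np.log(x)

        def score(k):
            xk = x**k
            return (xk * lx).sum() / xk.sum() - 1.0 / k - lx.mean()

        lo, hi = 1e-3, 50.0
        for _ in range(iters):
            mid = 0.5 * (lo + hi)
            lo, hi = (lo, mid) if score(mid) > 0 else (mid, hi)
        k = 0.5 * (lo + hi)
        return k, np.mean(x**k) ** (1.0 / k)  # (shape, scale)

    rng = np.random.default_rng(42)
    true_shape, true_scale, n = 1.5, 2.0, 100
    sq_err = [(weibull_mle(true_scale * rng.weibull(true_shape, n))[0] - true_shape) ** 2
              for _ in range(200)]
    print("MSE of the ML shape estimate:", np.mean(sq_err))
    ```

    A Bayesian competitor would replace `weibull_mle` with a posterior summary under the chosen prior and loss; the MSE comparison loop stays the same.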

  2. A Noninformative Prior on a Space of Distribution Functions

    Directory of Open Access Journals (Sweden)

    Alexander Terenin

    2017-07-01

    Full Text Available In a given problem, the Bayesian statistical paradigm requires the specification of a prior distribution that quantifies relevant information about the unknowns of main interest external to the data. In cases where little such information is available, the problem under study may possess an invariance under a transformation group that encodes a lack of information, leading to a unique prior—this idea was explored at length by E.T. Jaynes. Previous successful examples have included location-scale invariance under linear transformation, multiplicative invariance of the rate at which events in a counting process are observed, and the derivation of the Haldane prior for a Bernoulli success probability. In this paper we show that this method can be extended, by generalizing Jaynes, in two ways: (1) to yield families of approximately invariant priors; and (2) to the infinite-dimensional setting, yielding families of priors on spaces of distribution functions. Our results can be used to describe conditions under which a particular Dirichlet Process posterior arises from an optimal Bayesian analysis, in the sense that invariances in the prior and likelihood lead to one and only one posterior distribution.

  3. Implicit Priors in Galaxy Cluster Mass and Scaling Relation Determinations

    Science.gov (United States)

    Mantz, A.; Allen, S. W.

    2011-01-01

    Deriving the total masses of galaxy clusters from observations of the intracluster medium (ICM) generally requires some prior information, in addition to the assumptions of hydrostatic equilibrium and spherical symmetry. Often, this information takes the form of particular parametrized functions used to describe the cluster gas density and temperature profiles. In this paper, we investigate the implicit priors on hydrostatic masses that result from this fully parametric approach, and the implications of such priors for scaling relations formed from those masses. We show that the application of such fully parametric models of the ICM naturally imposes a prior on the slopes of the derived scaling relations, favoring the self-similar model, and argue that this prior may be influential in practice. In contrast, this bias does not exist for techniques which adopt an explicit prior on the form of the mass profile but describe the ICM non-parametrically. Constraints on the slope of the cluster mass-temperature relation in the literature show a separation based on the approach employed, with the results from fully parametric ICM modeling clustering nearer the self-similar value. Given that a primary goal of scaling relation analyses is to test the self-similar model, the application of methods subject to strong, implicit priors should be avoided. Alternative methods and best practices are discussed.

  4. Superposing pure quantum states with partial prior information

    Science.gov (United States)

    Dogra, Shruti; Thomas, George; Ghosh, Sibasish; Suter, Dieter

    2018-05-01

    The principle of superposition is an intriguing feature of quantum mechanics, which is regularly exploited in many different circumstances. A recent work [M. Oszmaniec et al., Phys. Rev. Lett. 116, 110403 (2016), 10.1103/PhysRevLett.116.110403] shows that the fundamentals of quantum mechanics restrict the process of superimposing two unknown pure states, even though it is possible to superimpose two quantum states with partial prior knowledge. The prior knowledge imposes geometrical constraints on the choice of input states. We discuss an experimentally feasible protocol to superimpose multiple pure states of a d -dimensional quantum system and carry out an explicit experimental realization for two single-qubit pure states with partial prior information on a two-qubit NMR quantum information processor.

  5. Factor Structure of the Eating Disorder Examination-Questionnaire in a Clinical Sample of Adult Women With Anorexia Nervosa.

    Science.gov (United States)

    Phillips, Kathryn E; Jennings, Karen M; Gregas, Matthew

    2018-05-01

    An exploratory factor analysis on the Eating Disorder Examination-Questionnaire (EDE-Q) is presented for a clinical sample of women with anorexia nervosa. The EDE-Q was completed by 169 participants after admission to an inpatient unit for eating disorders. Results of the current study did not support the four-factor model presented by the EDE-Q. A new four-factor solution was obtained, with two factors showing similarity to the Restraint and Eating Concern subscales of the original model. The Shape and Weight Concern items primarily loaded together on one factor, along with preoccupation with food and fear of losing control over eating, two Eating Concern items. Finally, an appearance factor was obtained that supports the results of prior research. [Journal of Psychosocial Nursing and Mental Health Services, 56(5), 33-39.]. Copyright 2018, SLACK Incorporated.

  6. Self-expandable metallic stent placement for patients with inoperable esophageal carcinoma. Investigation of the influence of prior radiotherapy and chemotherapy

    International Nuclear Information System (INIS)

    Ihara, Yuko; Murayama, Sadayuki; Toita, Takafumi; Utsunomiya, Takashi; Nagata, Osamu; Akamine, Tamaki; Ogawa, Kazuhiko; Adachi, Genki; Tanigawa, Noboru

    2006-01-01

    The aim of this study was to evaluate the efficacy and complications of self-expandable metallic stent placement for patients with inoperable esophageal carcinoma after radiotherapy and/or chemotherapy. We obtained data from 19 patients with advanced or recurrent esophageal carcinoma between 1996 and 2000. In all patients, a self-expandable metallic stent was placed under fluoroscopic guidance. Dysphagia before and after stent placement was graded. Complications after stent placement were also evaluated. Data were compared between patients with and without prior radiotherapy and/or chemotherapy. The procedure was technically successful in all but one patient. The dysphagia grade improved in all patients. No life-threatening complications occurred. Other major complications included mediastinitis in two patients and pneumonia and funnel phenomenon in one patient each; these patients had a history of radiotherapy and/or chemotherapy prior to stent placement. Eight of the twelve patients with prior radiotherapy and/or chemotherapy had persistent chest pain, compared with one of seven patients without prior therapy, a statistically significant difference (P<0.05). Placement of self-expandable metallic stents was effective for patients with advanced or recurrent esophageal carcinoma. However, prior irradiation and/or chemotherapy increased the risk of persistent chest pain after stent placement. (author)

  7. Comparative evaluation of the US Environmental Protection Agency's and the Oak Ridge Institute for Science and Education's environmental survey and site assessment program field sampling procedures

    International Nuclear Information System (INIS)

    Vitkus, T.J.; Bright, T.L.; Roberts, S.A.

    1997-10-01

    At the request of the U.S. Nuclear Regulatory Commission's (NRC's) Headquarters Office, the Environmental Survey and Site Assessment Program (ESSAP) of the Oak Ridge Institute for Science and Education (ORISE) compared the documented procedures that the U.S. Environmental Protection Agency (EPA) and ESSAP use for collecting environmental samples. The project objectives were to review both organizations' procedures applicable to collecting various sample matrices, compare the procedures for similarities and differences, and then to evaluate the reason for any identified procedural differences and their potential impact on ESSAP's sample data quality. The procedures reviewed included those for sampling surface and subsurface soil, surface and groundwater, vegetation, air, and removable surface contamination. ESSAP obtained copies of relevant EPA documents and reviewed and prepared a tabulated summary of each applicable procedure. The methods for collecting and handling each type of sample were evaluated for differences, and where these were identified, the significance and effect of the differences on analytical quality were determined. The results of the comparison showed that, overall, the procedures and methods that EPA and ESSAP use for sample collection are very similar. The minor differences noted were the result of restrictions or procedures necessary to ensure sample integrity and prevent the introduction of interfering compounds when samples are to be analyzed for chemical parameters. For most radionuclide analyses, these additional procedures are not necessary. Another item noted was EPA's inclusion of steps that reduce the potential for sample cross-contamination by preparing (dressing) a location prior to collecting a sample or removing a portion of a sample prior to containerization.

  8. Generalized Likelihood Uncertainty Estimation (GLUE) Using Multi-Optimization Algorithm as Sampling Method

    Science.gov (United States)

    Wang, Z.

    2015-12-01

    For decades, distributed and lumped hydrological models have furthered our understanding of hydrological systems. The development of large-scale, high-precision hydrological simulation has refined spatial descriptions and representations of hydrological behavior. This trend, however, is accompanied by growing model complexity and numbers of parameters, which bring new challenges for uncertainty quantification. Generalized Likelihood Uncertainty Estimation (GLUE), a Monte Carlo method coupled with Bayesian estimation, has been widely used in uncertainty analysis for hydrological models. However, the stochastic sampling of prior parameters adopted by GLUE appears inefficient, especially in high-dimensional parameter spaces. Heuristic optimization algorithms utilizing iterative evolution show better convergence speed and optimality-searching performance. In light of these features, this study adopted a genetic algorithm, differential evolution, and the shuffled complex evolution algorithm to search the parameter space and obtain parameter sets of large likelihood. Based on this multi-algorithm sampling, hydrological model uncertainty analysis is conducted within the typical GLUE framework. To demonstrate the superiority of the new method, two hydrological models of different complexity are examined. The results show that the adaptive method tends to be efficient in sampling and effective in uncertainty analysis, providing an alternative path for uncertainty quantification.
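    Independent of the sampling strategy, the GLUE recipe itself (sample the prior space, score with an informal likelihood, keep "behavioural" sets, form likelihood-weighted bounds) can be sketched on a toy model. The linear stand-in "model", the uniform priors, and the thresholds below are placeholders, not a real hydrological setup:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    x = np.linspace(0.0, 1.0, 20)
    y_obs = 2.0 * x + 0.5 + rng.normal(0.0, 0.05, x.size)  # synthetic observations

    # 1) Monte Carlo sampling of the prior parameter space (the step the paper
    #    replaces with heuristic optimizers such as GA, DE, and SCE).
    a = rng.uniform(0.0, 5.0, 20_000)
    b = rng.uniform(-1.0, 2.0, 20_000)
    sim = a[:, None] * x + b[:, None]

    # 2) Informal likelihood and behavioural threshold.
    sse = ((sim - y_obs) ** 2).sum(axis=1)
    L = 1.0 / sse
    keep = L > np.quantile(L, 0.95)  # retain the best 5% as "behavioural" sets

    # 3) Likelihood-weighted 5-95% prediction bounds at x = 1 (true value 2.5).
    w = L[keep] / L[keep].sum()
    pred = sim[keep, -1]
    order = np.argsort(pred)
    cdf = np.cumsum(w[order])
    lo5 = pred[order][np.searchsorted(cdf, 0.05)]
    hi95 = pred[order][np.searchsorted(cdf, 0.95)]
    print(lo5, hi95)
    ```

    Swapping step 1 for an evolutionary sampler concentrates draws in high-likelihood regions, which is the efficiency gain the abstract reports.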

  9. Process to Obtain Quick Counts from PREP

    Directory of Open Access Journals (Sweden)

    Martínez–Cruz M.Á.

    2011-10-01

    Full Text Available Considering the Preliminary Electoral Results Program (PREP) as a database of the federal elections for president of the Mexican Republic, a methodology was developed to find representative samples of the ballot boxes installed on election day (quick counts) at different hours; because of the way it gathers information, the PREP forms a non-representative sample of data in the first hours. In particular, for the election of July 2, 2006, three hours after the PREP opened, the accuracy of this quick-count process was observed to be better than that obtained by the IFE. Among other things, this makes it possible to lower costs, to increase the confidentiality of the ballot boxes used in the sampling, and to identify the winning candidate at a precise moment, long before the PREP finishes.
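    Statistically, a quick count reduces to estimating a vote share from a probability sample of ballot boxes. A minimal sketch with a finite-population correction follows; all numbers are assumed for illustration (not the 2006 figures), and real quick-count designs are typically stratified rather than simple random samples:

    ```python
    import math

    N, n = 130_000, 7_000  # total and sampled ballot boxes (assumed)
    p_hat = 0.3562         # sample vote share for the leading candidate (assumed)

    fpc = math.sqrt((N - n) / (N - 1))             # finite-population correction
    se = math.sqrt(p_hat * (1 - p_hat) / n) * fpc  # standard error of the share
    ci = (p_hat - 1.96 * se, p_hat + 1.96 * se)    # approximate 95% interval
    print(f"95% CI: [{ci[0]:.4f}, {ci[1]:.4f}]")
    ```

    When the intervals of the top two candidates do not overlap, the winner can be called long before the full PREP stream closes, which is the practical payoff the abstract describes.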

  10. Comparing nonparametric Bayesian tree priors for clonal reconstruction of tumors.

    Science.gov (United States)

    Deshwar, Amit G; Vembu, Shankar; Morris, Quaid

    2015-01-01

    Statistical machine learning methods, especially nonparametric Bayesian methods, have become increasingly popular to infer clonal population structure of tumors. Here we describe the treeCRP, an extension of the Chinese restaurant process (CRP), a popular construction used in nonparametric mixture models, to infer the phylogeny and genotype of major subclonal lineages represented in the population of cancer cells. We also propose new split-merge updates tailored to the subclonal reconstruction problem that improve the mixing time of Markov chains. In comparisons with the tree-structured stick breaking (TSSB) prior used in PhyloSub, we demonstrate superior mixing and running time using the treeCRP with our new split-merge procedures. We also show that given the same number of samples, TSSB and treeCRP have similar ability to recover the subclonal structure of a tumor…

  11. SUSPENSION OF THE PRIOR DISCIPLINARY INVESTIGATION ACCORDING TO LABOR LAW

    Directory of Open Access Journals (Sweden)

    Nicolae, GRADINARU

    2014-11-01

    Full Text Available In order to conduct the prior disciplinary investigation, the employee shall be summoned in writing by the person authorized by the employer to carry out the investigation, with the subject, date, time, and place of the meeting specified. For this purpose the employer shall appoint a committee charged with conducting the prior disciplinary investigation. The prior disciplinary investigation cannot be carried out without the accused person having the possibility to defend himself; it would be an abuse for the employer to violate these provisions. Since the employee is entitled to formulate and sustain a defence to prove innocence or a lesser degree of guilt than imputed, a reasonable period must elapse between the moment the accusations are disclosed to the employee and the prior disciplinary investigation itself, so that the employee is able to prepare a defence. The employee's failure to appear at the convocation without an objective reason entitles the employer to impose the sanction without conducting the prior disciplinary investigation. The objective reason that prevents the employee who is subject to the prior disciplinary investigation from appearing must exist at the time of the investigation in question.

  12. Polypyrrole nanowire as an excellent solid phase microextraction fiber for bisphenol A analysis in food samples followed by ion mobility spectrometry.

    Science.gov (United States)

    Kamalabadi, Mahdie; Mohammadi, Abdorreza; Alizadeh, Naader

    2016-08-15

    A polypyrrole nanowire coated fiber was prepared and used in head-space solid phase microextraction coupled with ion mobility spectrometry (HS-SPME-IMS) for the analysis of bisphenol A (BPA) in canned food samples, for the first time. This fiber was synthesized by electrochemical oxidation of the monomer in aqueous solution. Characterization of the fiber by scanning electron microscopy (SEM) revealed that the new fiber exhibited two-dimensional structures with a nanowire morphology. The effects of important extraction parameters on the efficiency of HS-SPME were investigated and optimized. Under the optimum conditions, linearity of 10-150 ng g(-1) and a limit of detection (based on S/N=3) of 1 ng g(-1) were obtained for BPA analysis. The repeatability (n=5), expressed as the relative standard deviation (RSD%), was 5.8%. Finally, the proposed method was successfully applied to determine BPA in various canned food samples (peas, corns, beans). Relative recoveries of 93-96% were obtained. Method validation was conducted by comparing our results with those obtained through HPLC with fluorescence detection (FLD). The compatible results indicate that the proposed method can be successfully used for BPA analysis. This method is simple and cheaper than chromatographic methods, with no need for extra organic solvent consumption or derivatization prior to sample introduction. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Accuracy of total oxidant measurement as obtained by the phenolphthalin method

    Energy Technology Data Exchange (ETDEWEB)

    Louw, C W; Halliday, E C

    1963-01-01

    The phenolphthalin method of Haagen-Smit and Brunelle (1958) was chosen for a preliminary survey of total oxidant level in Pretoria air, because of its sensitivity. Difficulty, however, was encountered in obtaining reliable standard curves. Some improvement was obtained when conducting all operations except photometer measurements at the temperature of melting ice. It was also found that when the sequence of adding the reagents was changed, so as to simulate conditions during actual sampling, a standard curve approximating a straight line and differing considerably from that of McCabe (1953) was obtained. It follows that the values of total oxidant obtained by any experimenter will depend to a certain extent upon the method of standard curve preparation used, and when comparisons are made between measurements by experimenters in different towns or countries this factor should be taken into consideration. The accuracy (95% confidence) obtained by the phenolphthalin method, using the mean of three successive samples, was shown to be in the region of 30% for very low amounts of oxidant.

  14. Adaptive estimation of multivariate functions using conditionally Gaussian tensor-product spline priors

    NARCIS (Netherlands)

    Jonge, de R.; Zanten, van J.H.

    2012-01-01

    We investigate posterior contraction rates for priors on multivariate functions that are constructed using tensor-product B-spline expansions. We prove that using a hierarchical prior with an appropriate prior distribution on the partition size and Gaussian prior weights on the B-spline

  15. Analysis of Fluconazole in Human Urine Sample by High Performance Liquid Chromatography Method

    International Nuclear Information System (INIS)

    Hermawan, D; Ali, N A Md; Ibrahim, W A Wan; Sanagi, M M

    2013-01-01

    A method for the determination of fluconazole, an antifungal drug, in human urine using reversed-phase high performance liquid chromatography (RP-HPLC) with an ultraviolet (UV) detector was developed. Optimization of the HPLC conditions was carried out by varying the flow rate and the composition of the mobile phase. The optimum separation conditions, a flow rate of 0.85 mL/min with a mobile phase of methanol:water (70:30, v/v) and UV detection at a wavelength of 254 nm, allowed fluconazole to be analyzed within 3 min. Excellent linearity was obtained in the concentration range of 1 to 10 μg/mL with r2 = 0.998. The limit of detection (LOD) and limit of quantitation (LOQ) were 0.39 μg/mL and 1.28 μg/mL, respectively. A solid phase extraction (SPE) method using octadecylsilane (C18) as the sorbent was used for clean-up and pre-concentration of the urine sample prior to HPLC analysis. The average recovery of fluconazole in spiked urine samples was 72.4% with an RSD of 3.21% (n=3).
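    LOD and LOQ values like those reported are commonly derived from a calibration line as 3.3σ/S and 10σ/S (the ICH convention; the abstract does not state which approach was actually used). The calibration data below are invented for illustration:

    ```python
    import numpy as np

    conc = np.array([1.0, 2.0, 4.0, 6.0, 8.0, 10.0])      # µg/mL (hypothetical)
    area = np.array([5.1, 10.3, 19.8, 30.2, 40.1, 49.7])  # detector response

    S, c0 = np.polyfit(conc, area, 1)  # calibration slope and intercept
    resid = area - (S * conc + c0)
    sigma = resid.std(ddof=2)          # residual sd (two fitted parameters)

    lod = 3.3 * sigma / S
    loq = 10.0 * sigma / S
    print(f"LOD = {lod:.2f}, LOQ = {loq:.2f} µg/mL")
    ```

    By construction LOQ/LOD = 10/3.3 ≈ 3.0, close to the 1.28/0.39 ≈ 3.3 ratio in the abstract, which is consistent with a calibration-based derivation.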

  16. Forward flux sampling calculation of homogeneous nucleation rates from aqueous NaCl solutions.

    Science.gov (United States)

    Jiang, Hao; Haji-Akbari, Amir; Debenedetti, Pablo G; Panagiotopoulos, Athanassios Z

    2018-01-28

    We used molecular dynamics simulations and the path sampling technique known as forward flux sampling to study homogeneous nucleation of NaCl crystals from supersaturated aqueous solutions at 298 K and 1 bar. Nucleation rates were obtained for a range of salt concentrations for the Joung-Cheatham NaCl force field combined with the Extended Simple Point Charge (SPC/E) water model. The calculated nucleation rates are significantly lower than the available experimental measurements. The estimates for the nucleation rates in this work do not rely on classical nucleation theory, but the pathways observed in the simulations suggest that the nucleation process is better described by classical nucleation theory than by an alternative interpretation based on Ostwald's step rule, in contrast to some prior simulations of related models. In addition to the size of the NaCl nucleus, we find that the crystallinity of a nascent cluster plays an important role in the nucleation process. Nuclei with high crystallinity were found to have higher growth probability and longer lifetimes, possibly because they are less exposed to hydration water.
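    In forward flux sampling, the rate is assembled as the flux through the first interface times a product of conditional interface-crossing probabilities, k = Φ0 · Π P(λi+1 | λi). A toy calculation with invented stage values (not estimates from the paper's simulations):

    ```python
    import math

    # Flux of trajectories crossing the first interface, per unit time (assumed).
    flux0 = 2.4e-4
    # Conditional probabilities P(lambda_{i+1} | lambda_i) for successive
    # interfaces along the order parameter (assumed stage estimates).
    p_cross = [0.31, 0.12, 0.05, 0.02]

    rate = flux0 * math.prod(p_cross)
    print(f"nucleation rate estimate: {rate:.3e}")
    ```

    Because each stage probability is estimated from trajectories fired off configurations stored at the previous interface, rare events with overall probability far below direct-simulation reach remain tractable.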

  17. Thermoluminescence properties of zinc oxide obtained by solution combustion synthesis

    Energy Technology Data Exchange (ETDEWEB)

    Orante B, V. R.; Escobar O, F. M.; Cruz V, C. [Universidad de Sonora, Departamento de Investigacion en Polimeros y Materiales, Apdo. Postal 130, 83000 Hermosillo, Sonora (Mexico); Bernal, R., E-mail: victor.orante@polimeros.uson.mx [Universidad de Sonora, Departamento de Investigacion en Fisica, Apdo. Postal 5-088, 83190 Hermosillo, Sonora (Mexico)

    2014-08-15

    High-dose thermoluminescence dosimetry properties of novel zinc oxide obtained by solution combustion synthesis in a glycine-nitrate process, with a non-stoichiometric value of the elemental stoichiometric coefficient (Φc), are presented in this work. The ZnO powder samples obtained were afterwards annealed at 900 °C for 2 h in air. Sintered particles of sizes between ∼0.5 and ∼2 μm were obtained, according to scanning electron microscopy results. X-ray diffraction indicates the presence of the hexagonal phase of ZnO for the powder samples obtained, before and after thermal annealing, with no remaining nitrate peaks observed. Thermoluminescence glow curves of ZnO obtained after exposure to beta radiation consist of two maxima: one located at ∼149 °C and another at ∼308 °C, the latter being the dosimetric component of the curve. Dosimetric characterization of the non-stoichiometric zinc oxide provided experimental evidence such as the asymptotic behavior of the TL signal fading for times greater than 16 h between irradiation and the corresponding TL readout, as well as the linear behaviour of the dose response without saturation in the dose interval studied (from 12.5 up to 400 Gy). Such characteristics place the ZnO phosphors obtained in this work as a promising material for high-dose radiation dosimetry applications (e.g., radiotherapy and the food industry). (author)

  18. Thermoluminescence properties of zinc oxide obtained by solution combustion synthesis

    International Nuclear Information System (INIS)

    Orante B, V. R.; Escobar O, F. M.; Cruz V, C.; Bernal, R.

    2014-08-01

    High-dose thermoluminescence dosimetry properties of novel zinc oxide obtained by solution combustion synthesis in a glycine-nitrate process, with a non-stoichiometric value of the elemental stoichiometric coefficient (Φc), are presented in this work. The ZnO powder samples obtained were subsequently annealed at 900 °C for 2 h in air. Sintered particles with sizes between ∼ 0.5 and ∼ 2 μm were obtained, according to scanning electron microscopy results. X-ray diffraction indicates the presence of the hexagonal phase of ZnO in the powder samples, both before and after thermal annealing, with no remaining nitrate peaks observed. The thermoluminescence glow curves of ZnO exposed to beta radiation consist of two maxima: one located at ∼ 149 °C and another at ∼ 308 °C, the latter being the dosimetric component of the curve. Dosimetric characterization of the non-stoichiometric zinc oxide provided experimental evidence such as the asymptotic fading of the TL signal for times greater than 16 h between irradiation and the corresponding TL readout, as well as the linear behavior of the dose response without saturation over the dose interval studied (from 12.5 up to 400 Gy). Such characteristics make the ZnO phosphors obtained in this work a promising material for high-dose radiation dosimetry applications (e.g., radiotherapy and the food industry). (author)

  19. Quantification of Campylobacter spp. in chicken rinse samples by using flotation prior to real-time PCR

    DEFF Research Database (Denmark)

    Wolffs, Petra; Norling, Börje; Hoorfar, Jeffrey

    2005-01-01

    Real-time PCR is fast, sensitive, specific, and can deliver quantitative data; however, two disadvantages are that this technology is sensitive to inhibition by food and that it does not distinguish between DNA originating from viable, viable nonculturable (VNC), and dead cells. For this reason......, real-time PCR has been combined with a novel discontinuous buoyant density gradient method, called flotation, in order to allow detection of only viable and VNC cells of thermotolerant campylobacters in chicken rinse samples. Studying the buoyant densities of different Campylobacter spp. showed...... enrichment and amounts as low as 2.6 × 10³ CFU/ml could be quantified. Furthermore, subjecting viable cells and dead cells to flotation showed that viable cells were recovered after flotation treatment but that dead cells and/or their DNA was not detected. Also, when samples containing VNC cells mixed...

  20. On incomplete sampling under birth-death models and connections to the sampling-based coalescent.

    Science.gov (United States)

    Stadler, Tanja

    2009-11-07

    The constant rate birth-death process is used as a stochastic model for many biological systems, for example, phylogenies or disease transmission. As the biological data are usually not fully available, it is crucial to understand the effect of incomplete sampling. In this paper, we analyze the constant rate birth-death process with incomplete sampling. We derive the density of the bifurcation events for trees on n leaves which evolved under this birth-death-sampling process. This density is used for calculating prior distributions in Bayesian inference programs and for efficiently simulating trees. We show that the birth-death-sampling process can be interpreted as a birth-death process with reduced rates and complete sampling. This shows that joint inference of birth rate, death rate and sampling probability is not possible. The birth-death-sampling process is compared to the sampling-based population genetics model, the coalescent. It is shown that despite many similarities between these two models, the distribution of bifurcation times remains different even in the case of very large population sizes. We illustrate these findings on a Hepatitis C virus dataset from Egypt. We show that the transmission time estimates are significantly different: the widely used Gamma statistic even changes its sign from negative to positive when switching from the coalescent to the birth-death process.
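The reduced-rates equivalence mentioned in the abstract is commonly written, in the birth-death-sampling literature, as λ' = λρ and μ' = μ − λ(1 − ρ) for birth rate λ, death rate μ, and sampling probability ρ. A minimal sketch under that assumption (our notation, not a quotation of the paper):

```python
def reduced_rates(lam, mu, rho):
    """Map a birth-death process with birth rate lam, death rate mu, and
    sampling probability rho to the equivalent completely sampled process
    with reduced rates (lam', mu'). Because different (lam, mu, rho)
    triples map to the same pair, the three parameters cannot be jointly
    inferred from the reconstructed tree alone."""
    lam_reduced = lam * rho
    mu_reduced = mu - lam * (1.0 - rho)
    return lam_reduced, mu_reduced

# Complete sampling (rho = 1) leaves the process unchanged.
assert reduced_rates(1.0, 0.5, 1.0) == (1.0, 0.5)
```

Note that two distinct triples, e.g. (λ=1.0, μ=0.5, ρ=0.5) and the reduced pair with ρ=1, produce the same reconstructed-tree distribution, which is exactly the non-identifiability the abstract points out.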

  1. Personality, depressive symptoms and prior trauma exposure of new ...

    African Journals Online (AJOL)

    Background. Police officers are predisposed to trauma exposure. The development of depression and post-traumatic stress disorder (PTSD) may be influenced by personality style, prior exposure to traumatic events and prior depression. Objectives. To describe the personality profiles of new Metropolitan Police Service ...

  2. Uncertainty plus prior equals rational bias: an intuitive Bayesian probability weighting function.

    Science.gov (United States)

    Fennell, John; Baddeley, Roland

    2012-10-01

    Empirical research has shown that when making choices based on probabilistic options, people behave as if they overestimate small probabilities, underestimate large probabilities, and treat positive and negative outcomes differently. These distortions have been modeled using a nonlinear probability weighting function, which is found in several nonexpected utility theories, including rank-dependent models and prospect theory; here, we propose a Bayesian approach to the probability weighting function and, with it, a psychological rationale. In the real world, uncertainty is ubiquitous and, accordingly, the optimal strategy is to combine probability statements with prior information using Bayes' rule. First, we show that any reasonable prior on probabilities leads to two of the observed effects: overweighting of low probabilities and underweighting of high probabilities. We then investigate two plausible kinds of priors: informative priors based on previous experience and uninformative priors of ignorance. Individually, these priors potentially lead to large problems of bias and inefficiency, respectively; however, when combined using Bayesian model comparison methods, both forms of prior can be applied adaptively, gaining the efficiency of empirical priors and the robustness of ignorance priors. We illustrate this for the simple case of generic good and bad options, using Internet blogs to estimate the relevant priors of inference. Given this combined ignorant/informative prior, the Bayesian probability weighting function is not only robust and efficient but also matches all of the major characteristics of the distortions found in empirical research. PsycINFO Database Record (c) 2012 APA, all rights reserved.
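The core mechanism, combining a stated probability with a prior via Bayes' rule, can be illustrated with a toy construction of our own (not the authors' exact model): treat a stated probability p as if it summarized n Bernoulli observations and combine it with a Beta(a, b) prior. The posterior mean then overweights small p and underweights large p, the signature shape of empirical probability weighting functions.

```python
def weighted_probability(p, n=10, a=1.0, b=1.0):
    """Toy Bayesian probability weighting: interpret p as the success
    fraction of n pseudo-observations, apply a Beta(a, b) prior, and
    return the posterior mean (a + p*n) / (a + b + n)."""
    return (a + p * n) / (a + b + n)

assert weighted_probability(0.01) > 0.01  # low probabilities overweighted
assert weighted_probability(0.99) < 0.99  # high probabilities underweighted
```

With the uniform prior (a = b = 1) the weighting is symmetric around 0.5; an informative, asymmetric prior (a ≠ b) would shift the crossover point, mirroring the paper's distinction between empirical and ignorance priors.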

  3. Logarithmic Laplacian Prior Based Bayesian Inverse Synthetic Aperture Radar Imaging.

    Science.gov (United States)

    Zhang, Shuanghui; Liu, Yongxiang; Li, Xiang; Bi, Guoan

    2016-04-28

    This paper presents a novel Inverse Synthetic Aperture Radar (ISAR) imaging algorithm based on a new sparse prior, known as the logarithmic Laplacian prior. The newly proposed logarithmic Laplacian prior has a narrower main lobe with higher tail values than the Laplacian prior, which helps to achieve performance improvement on sparse representation. The logarithmic Laplacian prior is used for ISAR imaging within the Bayesian framework to achieve a better-focused radar image. In the proposed method of ISAR imaging, the phase errors are jointly estimated based on the minimum entropy criterion to accomplish autofocusing. The maximum a posteriori (MAP) estimation and the maximum likelihood estimation (MLE) are utilized to estimate the model parameters, avoiding a manual tuning process. Additionally, the fast Fourier transform (FFT) and the Hadamard product are used to reduce the required computational cost. Experimental results based on both simulated and measured data validate that the proposed algorithm outperforms the traditional sparse ISAR imaging algorithms in terms of resolution improvement and noise suppression.

  4. Logarithmic Laplacian Prior Based Bayesian Inverse Synthetic Aperture Radar Imaging

    Directory of Open Access Journals (Sweden)

    Shuanghui Zhang

    2016-04-01

    Full Text Available This paper presents a novel Inverse Synthetic Aperture Radar (ISAR) imaging algorithm based on a new sparse prior, known as the logarithmic Laplacian prior. The newly proposed logarithmic Laplacian prior has a narrower main lobe with higher tail values than the Laplacian prior, which helps to achieve performance improvement on sparse representation. The logarithmic Laplacian prior is used for ISAR imaging within the Bayesian framework to achieve a better-focused radar image. In the proposed method of ISAR imaging, the phase errors are jointly estimated based on the minimum entropy criterion to accomplish autofocusing. The maximum a posteriori (MAP) estimation and the maximum likelihood estimation (MLE) are utilized to estimate the model parameters, avoiding a manual tuning process. Additionally, the fast Fourier transform (FFT) and the Hadamard product are used to reduce the required computational cost. Experimental results based on both simulated and measured data validate that the proposed algorithm outperforms the traditional sparse ISAR imaging algorithms in terms of resolution improvement and noise suppression.

  5. 29 CFR 5.16 - Training plans approved or recognized by the Department of Labor prior to August 20, 1975.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 1 2010-07-01 2010-07-01 true Training plans approved or recognized by the Department of... STANDARDS ACT) Davis-Bacon and Related Acts Provisions and Procedures § 5.16 Training plans approved or... contractor shall be required to obtain approval of a training program which, prior to August 20, 1975, was...

  6. The Role of Prior Knowledge in International Franchise Partner Recruitment

    OpenAIRE

    Wang, Catherine; Altinay, Levent

    2006-01-01

    Purpose: To investigate the role of prior knowledge in the international franchise partner recruitment process and to evaluate how cultural distance influences the role of prior knowledge in this process. Design/Methodology/Approach: A single embedded case study of an international hotel firm was the focus of the enquiry. Interviews, observations and document analysis were used as the data collection techniques. Findings: Findings reveal that prior knowledge of the franchisor enab...

  7. Serial isoelectric focusing as an effective and economic way to obtain maximal resolution and high-throughput in 2D-based comparative proteomics of scarce samples: proof-of-principle.

    Science.gov (United States)

    Farhoud, Murtada H; Wessels, Hans J C T; Wevers, Ron A; van Engelen, Baziel G; van den Heuvel, Lambert P; Smeitink, Jan A

    2005-01-01

    In 2D-based comparative proteomics of scarce samples, such as limited patient material, established methods for prefractionation and subsequent use of different narrow range IPG strips to increase overall resolution are difficult to apply. Also, a high number of samples, a prerequisite for drawing meaningful conclusions when pathological and control samples are considered, will increase the associated amount of work almost exponentially. Here, we introduce a novel, effective, and economic method designed to obtain maximum 2D resolution while maintaining the high throughput necessary to perform large-scale comparative proteomics studies. The method is based on connecting different IPG strips serially head-to-tail so that a complete line of different IPG strips with sequential pH regions can be focused in the same experiment. We show that when 3 IPG strips (covering together the pH range of 3-11) are connected head-to-tail an optimal resolution is achieved along the whole pH range. Sample consumption, time required, and associated costs are reduced by almost 70%, and the workload is reduced significantly.

  8. Systematic sampling for suspended sediment

    Science.gov (United States)

    Robert B. Thomas

    1991-01-01

    Abstract - Because of high costs or complex logistics, scientific populations cannot be measured entirely and must be sampled. Accepted scientific practice holds that sample selection be based on statistical principles to assure objectivity when estimating totals and variances. Probability sampling--obtaining samples with known probabilities--is the only method that...
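The probability-sampling idea the abstract invokes can be shown with the systematic design itself: after a random start, every k-th unit is selected, so each unit has a known inclusion probability of n/N. A minimal sketch (our helper, not from the paper):

```python
import random

def systematic_sample(population, n, seed=0):
    """Draw a systematic sample of size n from a sequence: pick a random
    start within the first sampling interval, then take every k-th unit."""
    N = len(population)
    k = N // n                                 # sampling interval
    start = random.Random(seed).randrange(k)   # random start in [0, k)
    return [population[start + i * k] for i in range(n)]

# Each of the 100 units has inclusion probability 10/100 = 1/10.
sample = systematic_sample(list(range(100)), 10)
```

Because the start is drawn at random, estimates of totals remain unbiased; the price, as with any systematic design, is sensitivity to periodic structure in the population ordering.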

  9. Test sample handling apparatus

    International Nuclear Information System (INIS)

    1981-01-01

    A test sample handling apparatus using automatic scintillation counting for gamma detection, for use in such fields as radioimmunoassay, is described. The apparatus automatically and continuously counts large numbers of samples rapidly and efficiently by the simultaneous counting of two samples. By means of sequential ordering of non-sequential counting data, it is possible to obtain precisely ordered data while utilizing sample carrier holders having a minimum length. (U.K.)

  10. Titania nanotube powders obtained by rapid breakdown anodization in perchloric acid electrolytes

    International Nuclear Information System (INIS)

    Ali, Saima; Hannula, Simo-Pekka

    2017-01-01

    Titania nanotube (TNT) powders are prepared by rapid breakdown anodization (RBA) in a 0.1 M perchloric acid (HClO4) solution (Process 1), and in an ethylene glycol (EG) mixture with HClO4 and water (Process 2). A study of the as-prepared and calcined TNT powders obtained by both processes is carried out to evaluate and compare the morphology, crystal structure, specific surface area, and composition of the nanotubes. Longer TNTs are formed in Process 1, while comparatively larger pore diameters and wall thicknesses are obtained for the nanotubes prepared by Process 2. The TNTs obtained by Process 1 are converted to nanorods at 350 °C, while the nanotubes obtained by Process 2 preserve their tubular morphology up to 350 °C. In addition, the TNTs prepared in the aqueous electrolyte have a crystalline structure, whereas the TNTs obtained by Process 2 are amorphous. Samples calcined up to 450 °C show XRD peaks from the anatase phase, while the rutile phase appears at 550 °C for the TNTs prepared by both processes. The Raman spectra also show clear anatase peaks for all samples except the as-prepared sample obtained by Process 2, thus supporting the XRD findings. FTIR spectra reveal the presence of O-H groups in the structure of the TNTs obtained by both processes; however, the presence is less prominent for the annealed samples. Additionally, TNTs obtained by Process 2 have a carbonaceous impurity in the structure, attributed to the electrolyte used in that process. While a negligible weight loss is typical for TNTs prepared from aqueous electrolytes, a weight loss of 38.6% in the temperature range of 25–600 °C is found for the TNTs prepared in the EG electrolyte (Process 2). A large specific surface area of 179.2 m² g⁻¹ is obtained for the TNTs prepared by Process 1, whereas Process 2 produces nanotubes with a lower specific surface area. The difference appears to correspond to the dimensions of the nanotubes obtained by the two processes. - Graphical abstract: Titania nanotube

  11. The Influence of Sampling Density on Bayesian Age-Depth Models and Paleoclimatic Reconstructions - Lessons Learned from Lake Titicaca - Bolivia/Peru

    Science.gov (United States)

    Salenbien, W.; Baker, P. A.; Fritz, S. C.; Guedron, S.

    2014-12-01

    Lake Titicaca is one of the most important archives of paleoclimate in tropical South America, and prior studies have elucidated patterns of climate variation at varied temporal scales over the past 0.5 Ma. Yet, slow sediment accumulation rates in the main deeper basin of the lake have precluded analysis of the lake's most recent history at high resolution. To obtain a paleoclimate record of the last few millennia at multi-decadal resolution, we obtained five short cores, ranging from 139 to 181 cm in length, from the shallower Wiñaymarka sub-basin of Lake Titicaca, where sedimentation rates are higher than in the lake's main basin. Selected cores have been analyzed for their geochemical signature by scanning XRF, diatom stratigraphy, sedimentology, and 14C age dating. A total of 72 samples were 14C-dated using a Gas Ion Source automated high-throughput method for carbonate samples (mainly Littoridina sp. and Taphius montanus gastropod shells) at NOSAMS (Woods Hole Oceanographic Institution) with an analytical precision better than 2%. The method has lower analytical precision compared with traditional AMS radiocarbon dating, but the lower cost enables analysis of a larger number of samples, and the error associated with the lower precision is relatively small for younger samples (< ~8,000 years). A 172-cm-long core was divided into centimeter-long sections, and 47 14C dates were obtained from 1-cm intervals, averaging one date every 3-4 cm. The other cores were radiocarbon dated with a sparser sampling density that focused on visual unconformities and shell beds. The high-resolution radiocarbon analysis reveals complex sedimentation patterns in visually continuous sections, with abundant indicators of bioturbated or reworked sediments and periods of very rapid sediment accumulation. These features are not evident in the sparser sampling strategy but have significant implications for reconstructing past lake level and paleoclimatic history.

  12. The effect of prior healthcare employment on the wages of registered nurses.

    Science.gov (United States)

    Yoo, Byung-Kwang; Kim, Minchul; Lin, Tzu-Chun; Sasaki, Tomoko; Ward, Debbie; Spetz, Joanne

    2016-08-19

    The proportion of registered nurses (RNs) with employment in health-related positions before their initial RN education has increased in the past two decades. Previous research found that prior health-related employment is positively associated with RN workforce supply, potentially due to the wage differences based on different career paths. This study's objective is to test the hypotheses that prior health-related employment is associated with differences in starting wages and with different rates of wage growth for experience as an RN. We conducted a cross-sectional analysis using the 2008 National Sample Survey of Registered Nurses (NSSRN) linked with county-level variables from the Area Health Resource File. We estimated a Heckman model where the second-stage equation's outcome variable was the logarithm of the RN hourly wage, accounting for the self-selection of working or not working as an RN (i.e., the first-stage equation's outcome variable). Key covariates included interaction terms between years of experience, experience squared, and six categories of prior health-related employment (manager, LPN/LVN, allied health, nursing aide, clerk, and all other healthcare positions). Additional covariates included demographics, weekly working hours, marital status, highest nursing degree, and county-level variables (e.g., unemployment rate). We estimated the marginal effect of experience on wage for each type of prior health-related employment, conducting separate analyses for RNs whose initial education was a Bachelor of Science in Nursing (BSN) (unweighted N = 10,345/weighted N = 945,429), RNs whose initial education was an Associate degree (unweighted N = 13,791/weighted N = 1,296,809), and total population combining the former groups (unweighted N = 24,136/weighted N = 2,242,238). Prior health-related employment was associated with higher wages, with the strongest wage differences among BSN-educated RNs. Among BSN-educated RNs, previous
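The selection-corrected wage model described above can be sketched as a textbook two-step Heckman estimator on synthetic data (variable names, data, and coefficients below are illustrative inventions, not NSSRN quantities):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 20000

# Synthetic covariates: z drives selection into RN work, x drives log wage;
# correlated errors (rho = 0.5) create the selection bias to be corrected.
z = rng.normal(size=n)
x = rng.normal(size=n)
u, e = rng.multivariate_normal([0, 0], [[1.0, 0.5], [0.5, 1.0]], size=n).T

works = (0.5 + 1.0 * z + u) > 0      # first stage: who works as an RN
log_wage = 2.0 + 0.3 * x + e         # second stage: wage equation
log_wage[~works] = np.nan            # wages observed only for workers

# Step 1: probit of employment on z, via Fisher scoring.
Z = np.column_stack([np.ones(n), z])
y = works.astype(float)
beta = np.zeros(2)
for _ in range(25):
    xb = Z @ beta
    p = norm.cdf(xb).clip(1e-10, 1 - 1e-10)
    phi = norm.pdf(xb)
    score = Z.T @ ((y - p) * phi / (p * (1 - p)))
    info = Z.T @ (Z * (phi**2 / (p * (1 - p)))[:, None])
    beta += np.linalg.solve(info, score)

# Step 2: OLS of log wage on x plus the inverse Mills ratio.
xb_sel = Z[works] @ beta
imr = norm.pdf(xb_sel) / norm.cdf(xb_sel)
X2 = np.column_stack([np.ones(works.sum()), x[works], imr])
coef, *_ = np.linalg.lstsq(X2, log_wage[works], rcond=None)
# coef[1] recovers the wage effect of x; coef[2] estimates rho * sigma.
```

Omitting the inverse-Mills-ratio column would leave the wage coefficients contaminated by the correlation between who works and what they earn, which is the self-selection problem the abstract's first-stage equation addresses.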

  13. The cation inversion and magnetization in nanopowder zinc ferrite obtained by soft mechanochemical processing

    International Nuclear Information System (INIS)

    Milutinović, A.; Lazarević, Z.; Jovalekić, Č.; Kuryliszyn-Kudelska, I.; Romčević, M.; Kostić, S.; Romčević, N.

    2013-01-01

    Graphical abstract: - Highlights: • Nanopowder of ZnFe2O4 prepared by a soft mechanochemical route after 18 h of milling. • Phase formation controlled by XRD, Raman spectroscopy and magnetic measurements. • Size, strain and cation inversion degree determined by Rietveld refinement. • We were able to estimate the degree of inversion at most 0.348 and 0.4. • Obtained extremely high values of saturation magnetization at T = 4.5 K. - Abstract: Two zinc ferrite nanoparticle materials were prepared by the same method, soft mechanochemical synthesis, but starting from different powder mixtures: (1) Zn(OH)2/α-Fe2O3 and (2) Zn(OH)2/Fe(OH)3. In both cases a single-phase system was obtained after 18 h of milling. The progress of the synthesis was monitored by X-ray diffractometry (XRD), Raman spectroscopy, TEM and magnetic measurements. Analysis of the XRD patterns by Rietveld refinement allowed determination of the cation inversion degree for both of the obtained single-phase ZnFe2O4 samples. The sample obtained from mixture (1) has a cation inversion degree of 0.3482 and the sample obtained from mixture (2) of 0.400. Magnetization measurements confirmed that the degrees of inversion were well estimated. Comparison with published data shows that the method of synthesis used gives nanopowder samples with extremely high values of saturation magnetization: sample (1) 78.3 emu g⁻¹ and sample (2) 91.5 emu g⁻¹ at T = 4.5 K

  14. Quantitating morphological changes in biological samples during scanning electron microscopy sample preparation with correlative super-resolution microscopy.

    Science.gov (United States)

    Zhang, Ying; Huang, Tao; Jorgens, Danielle M; Nickerson, Andrew; Lin, Li-Jung; Pelz, Joshua; Gray, Joe W; López, Claudia S; Nan, Xiaolin

    2017-01-01

    Sample preparation is critical to biological electron microscopy (EM), and there have been continuous efforts on optimizing the procedures to best preserve structures of interest in the sample. However, a quantitative characterization of the morphological changes associated with each step in EM sample preparation is currently lacking. Using correlative EM and superresolution microscopy (SRM), we have examined the effects of different drying methods as well as osmium tetroxide (OsO4) post-fixation on cell morphology during scanning electron microscopy (SEM) sample preparation. Here, SRM images of the sample acquired under hydrated conditions were used as a baseline for evaluating morphological changes as the sample went through SEM sample processing. We found that both chemical drying and critical point drying lead to a mild cellular boundary retraction of ~60 nm. Post-fixation by OsO4 causes at least 40 nm additional boundary retraction. We also found that coating coverslips with adhesion molecules such as fibronectin prior to cell plating helps reduce cell distortion from OsO4 post-fixation. These quantitative measurements offer useful information for identifying causes of cell distortions in SEM sample preparation and improving current procedures.

  15. A high area, porous and resistant platinized stainless steel fiber coated by nanostructured polypyrrole for direct HS-SPME of nicotine in biological samples prior to GC-FID quantification.

    Science.gov (United States)

    Abdolhosseini, Sana; Ghiasvand, Alireza; Heidari, Nahid

    2017-09-01

    The surface of a stainless steel fiber was made porous, resistant and cohesive using electrophoretic deposition and was coated with nanostructured polypyrrole using an amended in-situ electropolymerization method. The coated fiber was applied to the direct extraction of nicotine in biological samples through a headspace solid-phase microextraction (HS-SPME) method followed by GC-FID determination. The effects of the important experimental variables on the efficiency of the developed HS-SPME-GC-FID method, including pH of the sample solution, extraction temperature and time, stirring rate, and ionic strength, were evaluated and optimized. Under the optimal experimental conditions, the calibration curve was linear over the range of 0.1-20 μg/mL and a detection limit of 20 ng/mL was obtained. The relative standard deviation (RSD, n=6) was calculated to be 7.6%. The results demonstrated the superiority of the proposed fiber compared with the most commonly used commercial types. The proposed HS-SPME-GC-FID method was successfully used for the analysis of nicotine in urine and human plasma samples. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Quality assessment of trace Cd and Pb contaminants in Thai herbal medicines using ultrasound-assisted digestion prior to flame atomic absorption spectrometry

    Directory of Open Access Journals (Sweden)

    Watsaka Siriangkhawut

    2017-10-01

    Full Text Available A simple, efficient, and reliable ultrasound-assisted digestion (UAD) procedure was used for sample preparation prior to the quantitative determination of trace Cd and Pb contaminants in herbal medicines using flame atomic absorption spectrometry. The parameters influencing UAD, such as the solvent system, sample mass, presonication time, sonication time, and digestion temperature, were evaluated. The efficiency of the proposed UAD procedure was evaluated by comparison with a conventional acid digestion (CAD) procedure. Under the optimum conditions, linear calibration graphs in the ranges of 2–250 μg/L for Cd and 50–1000 μg/L for Pb were obtained, with detection limits of 0.56 μg/L and 10.7 μg/L for Cd and Pb, respectively. The limits of quantification for Cd and Pb were 1.87 μg/L and 40.3 μg/L, respectively. The repeatability for the analysis of 10 μg/L Cd and 100 μg/L Pb was 2.3% and 2.6%, respectively. The accuracy of the proposed method was evaluated using rice flour certified reference materials. The proposed method was successfully applied to the analysis of trace Cd and Pb in samples of various types of medicinal plants and traditional medicines consumed in Thailand. Most herbal medicine samples were not contaminated with Cd or Pb. The contaminant levels for both metals were lower than the maximum permissible levels of elements in medicinal plant materials and finished herbal products set by the Ministry of Public Health of Thailand. The exception was the high level of Cd contamination found in two samples of processed medicinal plants.
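The calibration and detection-limit workflow behind numbers like these can be sketched as follows. The standard concentrations, responses, and blank scatter below are invented, and the 3·s(blank)/slope rule is a common convention that the paper does not necessarily state it used:

```python
def linear_fit(x, y):
    """Ordinary least-squares fit y = slope*x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

conc = [2, 10, 50, 100, 250]                    # Cd standards, ug/L (invented)
absorb = [0.004, 0.020, 0.100, 0.200, 0.500]    # instrument response (invented)
slope, intercept = linear_fit(conc, absorb)

s_blank = 0.00037          # sd of blank absorbance (invented)
lod = 3 * s_blank / slope  # 3s/slope convention; here roughly 0.56 ug/L
```

The limit of quantification follows the same pattern with a factor of 10 instead of 3, which is why LOQ values in such papers are typically a few times the reported LOD.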

  17. Sampling in epidemiological research: issues, hazards and pitfalls

    Science.gov (United States)

    Tyrer, Stephen; Heyman, Bob

    2016-01-01

    Surveys of people's opinions are fraught with difficulties. It is easier to obtain information from those who respond to text messages or to emails than to attempt to obtain a representative sample. Samples of the population that are selected non-randomly in this way are termed convenience samples as they are easy to recruit. This introduces a sampling bias. Such non-probability samples have merit in many situations, but an epidemiological enquiry is of little value unless a random sample is obtained. If a sufficient number of those selected actually complete a survey, the results are likely to be representative of the population. This editorial describes probability and non-probability sampling methods and illustrates the difficulties and suggested solutions in performing accurate epidemiological research. PMID:27087985

  18. Advanced prior modeling for 3D bright field electron tomography

    Science.gov (United States)

    Sreehari, Suhas; Venkatakrishnan, S. V.; Drummy, Lawrence F.; Simmons, Jeffrey P.; Bouman, Charles A.

    2015-03-01

    Many important imaging problems in material science involve reconstruction of images containing repetitive non-local structures. Model-based iterative reconstruction (MBIR) could in principle exploit such redundancies through the selection of a log prior probability term. However, in practice, determining such a log prior term that accounts for the similarity between distant structures in the image is quite challenging. Much progress has been made in the development of denoising algorithms like non-local means and BM3D, and these are known to successfully capture non-local redundancies in images. But the fact that these denoising operations are not explicitly formulated as cost functions makes it unclear as to how to incorporate them in the MBIR framework. In this paper, we formulate a solution to bright field electron tomography by augmenting the existing bright field MBIR method to incorporate any non-local denoising operator as a prior model. We accomplish this using a framework we call plug-and-play priors that decouples the log likelihood and the log prior probability terms in the MBIR cost function. We specifically use 3D non-local means (NLM) as the prior model in the plug-and-play framework, and showcase high quality tomographic reconstructions of a simulated aluminum spheres dataset, and two real datasets of aluminum spheres and ferritin structures. We observe that streak and smear artifacts are visibly suppressed, and that edges are preserved. Also, we report lower RMSE values compared to the conventional MBIR reconstruction using qGGMRF as the prior model.
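The plug-and-play decoupling described above can be sketched in a few lines of ADMM: the log-likelihood term is handled by a least-squares update, and the prior enters only through a black-box denoiser call. Here a toy 1D moving-average filter stands in for 3D non-local means, and the tiny identity-operator inverse problem is our illustration, not the paper's tomography setup:

```python
import numpy as np

def denoise(v):
    """Toy denoiser: 3-point moving average (stand-in for NLM/BM3D)."""
    return np.convolve(v, np.ones(3) / 3, mode="same")

def pnp_admm(y, A, sigma=1.0, rho=1.0, iters=50):
    """Plug-and-play ADMM for min_x ||y - A x||^2 / (2 sigma^2) + prior(x),
    with the prior handled implicitly by the denoiser."""
    n = A.shape[1]
    x = np.zeros(n); v = np.zeros(n); u = np.zeros(n)
    lhs = A.T @ A / sigma**2 + rho * np.eye(n)
    Aty = A.T @ y / sigma**2
    for _ in range(iters):
        x = np.linalg.solve(lhs, Aty + rho * (v - u))  # data-fit update
        v = denoise(x + u)                             # prior (denoiser) update
        u = u + x - v                                  # dual update
    return x

# Toy inverse problem: recover a smooth signal from noisy identity data.
rng = np.random.default_rng(0)
truth = np.sin(np.linspace(0, np.pi, 32))
y = truth + 0.1 * rng.normal(size=32)
x_hat = pnp_admm(y, np.eye(32))
```

The key design point is that `denoise` never has to expose a cost function: any denoiser with the right input/output shape can be plugged into the v-update, which is exactly how 3D non-local means is used as the prior model in the framework described above.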

  19. Histopathological examination of nerve samples from pure neural leprosy patients: obtaining maximum information to improve diagnostic efficiency.

    Science.gov (United States)

    Antunes, Sérgio Luiz Gomes; Chimelli, Leila; Jardim, Márcia Rodrigues; Vital, Robson Teixeira; Nery, José Augusto da Costa; Corte-Real, Suzana; Hacker, Mariana Andréa Vilas Boas; Sarno, Euzenir Nunes

    2012-03-01

    Nerve biopsy examination is an important auxiliary procedure for diagnosing pure neural leprosy (PNL). When acid-fast bacilli (AFB) are not detected in the nerve sample, the value of other nonspecific histological alterations should be considered along with pertinent clinical, electroneuromyographical and laboratory data (the detection of Mycobacterium leprae DNA with polymerase chain reaction and the detection of serum anti-phenolic glycolipid 1 antibodies) to support a possible or probable PNL diagnosis. Three hundred forty nerve samples [144 from PNL patients and 196 from patients with non-leprosy peripheral neuropathies (NLN)] were examined. Both AFB-negative and AFB-positive PNL samples had more frequent histopathological alterations (epithelioid granulomas, mononuclear infiltrates, fibrosis, perineurial and subperineurial oedema and decreased numbers of myelinated fibres) than the NLN group. Multivariate analysis revealed that independently, mononuclear infiltrate and perineurial fibrosis were more common in the PNL group and were able to correctly classify AFB-negative PNL samples. These results indicate that even in the absence of AFB, these histopathological nerve alterations may justify a PNL diagnosis when observed in conjunction with pertinent clinical, epidemiological and laboratory data.

  20. Magnetic/non-magnetic argan press cake nanocellulose for the selective extraction of Sudan dyes in food samples prior to the determination by capillary liquid chromatography.

    Science.gov (United States)

    Benmassaoud, Yassine; Villaseñor, María J; Salghi, Rachid; Jodeh, Shehdeh; Algarra, Manuel; Zougagh, Mohammed; Ríos, Ángel

    2017-05-01

    Two methods for the determination of Sudan dyes (Sudan I, Sudan II, Sudan III and Sudan IV) in food samples, by solid-phase extraction coupled to capillary liquid chromatography, are proposed. Both methods use nanocellulose (NC) extracted from bleached argan press cake (APC) as a nano-adsorbent recycled from an agricultural waste material. One of the methods involves the dispersion of NC in food sample extracts, with the waste and eluents being separated by centrifugation. In the other method, NC was modified with magnetic iron nanoparticles before using it in the extraction of Sudan dyes. The use of a magnetic component in the extraction process allows magnetic separation to replace the centrifugation step in a convenient and economical way. The two proposed methods allow the determination of Sudan dyes over the 0.25-2.00 μg/L concentration range. The limits of detection, limits of quantification and relative standard deviations achieved were lower than 0.1 μg/L, 0.20 μg/L and 3.46%, respectively, when using NC as a nano-adsorbent, and lower than 0.07 μg/L, 0.23 μg/L and 2.62%, respectively, when the magnetic nanocellulose (MNC) was used. Both methods were applied to the determination of Sudan dyes in barbeque and ketchup sauce samples, obtaining recoveries between 93.4% and 109.6%. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Nudging toward Inquiry: Awakening and Building upon Prior Knowledge

    Science.gov (United States)

    Fontichiaro, Kristin, Comp.

    2010-01-01

    "Prior knowledge" (sometimes called schema or background knowledge) is information one already knows that helps him/her make sense of new information. New learning builds on existing prior knowledge. In traditional reporting-style research projects, students bypass this crucial step and plow right into answer-finding. It's no wonder that many…

  2. 5 CFR 6201.103 - Prior approval for outside employment.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Prior approval for outside employment. 6201.103 Section 6201.103 Administrative Personnel EXPORT-IMPORT BANK OF THE UNITED STATES SUPPLEMENTAL STANDARDS OF ETHICAL CONDUCT FOR EMPLOYEES OF THE EXPORT-IMPORT BANK OF THE UNITED STATES § 6201.103 Prior...

  3. Monolith Chromatography as Sample Preparation Step in Virome Studies of Water Samples.

    Science.gov (United States)

    Gutiérrez-Aguirre, Ion; Kutnjak, Denis; Rački, Nejc; Rupar, Matevž; Ravnikar, Maja

    2018-01-01

    Viruses exist in aquatic media and many of them use these media as a transmission route. Next-generation sequencing (NGS) technologies have opened new doors in virus research, also revealing a hidden diversity of viral species in aquatic environments. Not surprisingly, many of the newly discovered viruses are found in environmental fresh and marine waters. One of the problems in virome research is the low amount of viral nucleic acids present in a sample relative to background nucleic acids (host, eukaryotic, prokaryotic, environmental). Therefore, virus enrichment prior to NGS is necessary in many cases. In water samples, an added problem is the typically low concentration of viruses. Different concentration strategies have been used to overcome these limitations. CIM monoliths are a new generation of chromatographic supports that, owing to their particular structural characteristics, are very efficient in the concentration and purification of viruses. In this chapter, we describe the use of CIM monolithic chromatography as a sample preparation step in NGS studies targeting viruses in fresh or marine water. The step-by-step protocol includes a case study in which CIM concentration was used to study the virome of a wastewater sample using NGS.

  4. On Invertible Sampling and Adaptive Security

    DEFF Research Database (Denmark)

    Ishai, Yuval; Kumarasubramanian, Abishek; Orlandi, Claudio

    2011-01-01

    functionalities was left open. We provide the first convincing evidence that the answer to this question is negative, namely that some (randomized) functionalities cannot be realized with adaptive security. We obtain this result by studying the following related invertible sampling problem: given an efficient...... sampling algorithm A, obtain another sampling algorithm B such that the output of B is computationally indistinguishable from the output of A, but B can be efficiently inverted (even if A cannot). This invertible sampling problem is independently motivated by other cryptographic applications. We show......, under strong but well studied assumptions, that there exist efficient sampling algorithms A for which invertible sampling as above is impossible. At the same time, we show that a general feasibility result for adaptively secure MPC implies that invertible sampling is possible for every A, thereby...

  5. A sampling-based approach to probabilistic pursuit evasion

    KAUST Repository

    Mahadevan, Aditya

    2012-05-01

    Probabilistic roadmaps (PRMs) are a sampling-based approach to motion-planning that encodes feasible paths through the environment using a graph created from a subset of valid positions. Prior research has shown that PRMs can be augmented with useful information to model interesting scenarios related to multi-agent interaction and coordination. © 2012 IEEE.
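
    A PRM of the kind described can be sketched in a few dozen lines: sample collision-free configurations, connect each to its k nearest neighbours with obstacle-free segments, then search the resulting graph. The map, obstacle, and parameters below are invented for illustration; real planners use spatial indexes and proper collision checkers:

```python
# Minimal probabilistic-roadmap (PRM) sketch on an invented 2D map:
# sample collision-free points, connect k nearest neighbours with
# obstacle-free straight segments, then BFS the graph for a path.
import math, random
from collections import deque

random.seed(0)
OBSTACLES = [((5.0, 5.0), 2.0)]            # (centre, radius) circles

def free(p):
    return all(math.dist(p, c) > r for c, r in OBSTACLES)

def segment_free(a, b, steps=20):
    return all(free((a[0] + (b[0] - a[0]) * t / steps,
                     a[1] + (b[1] - a[1]) * t / steps))
               for t in range(steps + 1))

def build_prm(n=150, k=6, start=(1, 1), goal=(9, 9)):
    nodes = [start, goal]
    while len(nodes) < n:                   # rejection-sample free space
        p = (random.uniform(0, 10), random.uniform(0, 10))
        if free(p):
            nodes.append(p)
    edges = {i: [] for i in range(len(nodes))}
    for i, a in enumerate(nodes):
        near = sorted(range(len(nodes)), key=lambda j: math.dist(a, nodes[j]))
        for j in near[1:k + 1]:             # skip near[0], which is i itself
            if segment_free(a, nodes[j]):
                edges[i].append(j)
                edges[j].append(i)
    return nodes, edges

def find_path(nodes, edges):               # BFS from node 0 (start) to 1 (goal)
    prev, queue = {0: None}, deque([0])
    while queue:
        u = queue.popleft()
        if u == 1:
            path = []
            while u is not None:
                path.append(nodes[u]); u = prev[u]
            return path[::-1]
        for v in edges[u]:
            if v not in prev:
                prev[v] = u; queue.append(v)
    return None

nodes, edges = build_prm()
path = find_path(nodes, edges)
```

    The augmentation the abstract alludes to amounts to attaching extra information (e.g. visibility or threat annotations) to these nodes and edges before the search.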

  6. Source Localization by Entropic Inference and Backward Renormalization Group Priors

    Directory of Open Access Journals (Sweden)

    Nestor Caticha

    2015-04-01

    A systematic method of transferring information from coarser to finer resolution based on renormalization group (RG) transformations is introduced. It permits building informative priors at finer scales from posteriors at coarser scales since, under some conditions, RG transformations in the space of hyperparameters can be inverted. These priors are updated using renormalized data into posteriors by Maximum Entropy. The resulting inference method, backward RG (BRG) priors, is tested by simulating a functional magnetic resonance imaging (fMRI) experiment. Its results are compared with a Bayesian approach working at the finest available resolution. Using BRG priors, sources can be partially identified even when signal-to-noise ratio levels are as low as ~ -25 dB, improving vastly on the single-step Bayesian approach. For low levels of noise, the BRG prior is not an improvement over the single-scale Bayesian method. Analysis of the histograms of hyperparameters can show how to distinguish whether the method is failing due to very high levels of noise, or whether identification of the sources is at least partially possible.

  7. Hysteroscopic polypectomy prior to infertility treatment: A cost analysis and systematic review.

    Science.gov (United States)

    Mouhayar, Youssef; Yin, Ophelia; Mumford, Sunni L; Segars, James H

    2017-06-01

    Fertility treatment is expensive, and interventions that reduce cost can lead to greater efficiency and fewer embryos transferred. Endometrial polyps contribute to infertility and are frequently removed prior to infertility treatment. It is unclear whether polypectomy reduces fertility treatment cost and, if so, the magnitude of cost reduction afforded by the procedure. The aim of this study was to determine whether performing office or operative hysteroscopic polypectomy prior to infertility treatment would be cost-effective. PubMed, Embase, and Cochrane libraries were used to identify publications reporting pregnancy rates after hysteroscopic polypectomy. Studies were required to have a polypectomy treatment group and a control group of patients with polyps that were not resected. The charges of infertility treatments and polypectomy were obtained through infertility organizations and a private healthcare cost reporting website. These charges were applied to a decision tree model over the range of pregnancy rates observed in the representative studies to calculate an average cost per clinical or ongoing pregnancy. A sensitivity analysis was conducted to assess cost savings of polypectomy over a range of pregnancy rates and polypectomy costs. Pre-treatment office or operative hysteroscopic polypectomy ultimately saved €6658 ($7480) and €728 ($818), respectively, from the average cost per clinical pregnancy in women treated with four cycles of intrauterine insemination. Polypectomy prior to intrauterine insemination was cost-effective for clinical pregnancy rates greater than 30.2% for office polypectomy and 52.6% for operative polypectomy, and for a polypectomy price below €4414 ($4959). Office polypectomy or operative polypectomy saved €15,854 ($17,813) and €6644 ($7465), respectively, from the average cost per ongoing pregnancy for in vitro fertilization/intracytoplasmic sperm injection treated women and was cost-effective for ongoing pregnancy rates
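
    The core arithmetic of such a decision-tree cost model is expected charges divided by the probability of achieving a pregnancy. A toy version of that comparison, with invented charges and pregnancy rates rather than the study's inputs:

```python
# Back-of-envelope version of the decision comparison (all numbers
# illustrative, not the study's): average cost per clinical pregnancy
# with and without a pre-treatment polypectomy.
def cost_per_pregnancy(cycle_cost, cycles, pregnancy_rate, upfront=0.0):
    """Expected total charge divided by the probability of pregnancy."""
    return (upfront + cycle_cost * cycles) / pregnancy_rate

without = cost_per_pregnancy(cycle_cost=1000, cycles=4, pregnancy_rate=0.18)
with_polyp = cost_per_pregnancy(cycle_cost=1000, cycles=4,
                                pregnancy_rate=0.32, upfront=500)
print(f"no polypectomy:   {without:.0f} per pregnancy")
print(f"with polypectomy: {with_polyp:.0f} per pregnancy")
```

    The intervention pays for itself whenever the pregnancy-rate gain outweighs its upfront charge, which is exactly the threshold the sensitivity analysis sweeps over.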

  8. Nationwide survey of policies and practices related to capillary blood sampling in medical laboratories in Croatia

    Science.gov (United States)

    Krleza, Jasna Lenicek

    2014-01-01

    Introduction: Capillary sampling is increasingly used to obtain blood for laboratory tests in volumes as small as necessary and as non-invasively as possible. Whether capillary blood sampling is also frequent in Croatia, and whether it is performed according to international laboratory standards is unclear. Materials and methods: All medical laboratories that participate in the Croatian National External Quality Assessment Program (N = 204) were surveyed on-line to collect information about the laboratory’s parent institution, patient population, types and frequencies of laboratory tests based on capillary blood samples, choice of reference intervals, and policies and procedures specifically related to capillary sampling. Sampling practices were compared with guidelines from the Clinical and Laboratory Standards Institute (CLSI) and the World Health Organization (WHO). Results: Of the 204 laboratories surveyed, 174 (85%) responded with complete questionnaires. Among the 174 respondents, 155 (89%) reported that they routinely perform capillary sampling, which is carried out by laboratory staff in 118 laboratories (76%). Nearly half of respondent laboratories (48%) do not have a written protocol including order of draw for multiple sampling. A single puncture site is used to provide capillary blood for up to two samples at 43% of laboratories that occasionally or regularly perform such sampling. Most respondents (88%) never perform arterialisation prior to capillary blood sampling. Conclusions: Capillary blood sampling is highly prevalent in Croatia across different types of clinical facilities and patient populations. Capillary sampling procedures are not standardised in the country, and the rate of laboratory compliance with CLSI and WHO guidelines is low. PMID:25351353

  9. Health and human rights in eastern Myanmar prior to political transition: a population-based assessment using multistaged household cluster sampling.

    Science.gov (United States)

    Parmar, Parveen K; Benjamin-Chung, Jade; Smith, Linda S; Htoo, Saw Nay; Laeng, Sai; Lwin, Aye; Mahn, Mahn; Maung, Cynthia; Reh, Daniel; Shwe Oo, Eh Kalu; Lee, Thomas; Richards, Adam K

    2014-05-05

    Myanmar/Burma has received increased development and humanitarian assistance since the election in November 2010. Monitoring the impact of foreign assistance and economic development on health and human rights requires knowledge of pre-election conditions. From October 2008-January 2009, community-based organizations conducted household surveys using three-stage cluster sampling in Shan, Kayin, Bago, Kayah, Mon and Tanintharyi areas of Myanmar. Data were collected from 5,592 heads of household on household demographics, reproductive health, diarrhea, births, deaths, malaria, and acute malnutrition of children 6-59 months and women aged 15-49 years. A human rights focused survey module evaluated human rights violations (HRVs) experienced by household members during the previous year. Estimated infant and under-five mortality rates were 77 (95% CI 56 to 98) and 139 (95% CI 107 to 171) deaths per 1,000 live births; and the crude mortality rate was 13 (95% CI 11 to 15) deaths per thousand persons. The leading respondent-reported cause of death was malaria, followed by acute respiratory infection and diarrhea, causing 21.2% (95% CI 16.5 to 25.8), 16.6% (95% CI 11.8 to 21.4), and 12.3% (95% CI 8.7 to 15.8) of deaths, respectively. Over a third of households suffered at least one human rights violation in the preceding year (36.2%; 95% CI 30.7 to 41.7). Household exposure to forced labor increased risk of death among infants (rate ratio (RR) = 2.2; 95% CI 1.1 to 4.4) and children under five (RR = 2.1; 95% CI 1.3 to 3.6). The proportion of children suffering from moderate to severe acute malnutrition was higher among households that were displaced (prevalence ratio (PR) = 3.3; 95% CI 1.9 to 5.6). Prior to the 2010 election, populations of eastern Myanmar experienced high rates of disease and death and high rates of HRVs. These population-based data provide a baseline that can be used to monitor national and international efforts to improve the health and human rights situation in the
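
    Rate ratios with 95% confidence intervals like the RR = 2.2 (1.1 to 4.4) above are typically computed with a log-normal approximation. A sketch of that calculation with invented event counts, not the survey's data:

```python
# How a mortality rate ratio and its 95% CI are commonly computed
# (log-normal approximation). The counts and person-years here are
# invented for illustration, not taken from the survey.
import math

def rate_ratio_ci(events_exposed, py_exposed, events_unexposed, py_unexposed):
    rr = (events_exposed / py_exposed) / (events_unexposed / py_unexposed)
    se_log = math.sqrt(1 / events_exposed + 1 / events_unexposed)
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lo, hi

rr, lo, hi = rate_ratio_ci(20, 1000, 10, 1000)
print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

    Surveys with multistage cluster sampling additionally inflate these intervals by a design effect, which the simple formula above omits.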

  10. Incorporation of stochastic engineering models as prior information in Bayesian medical device trials.

    Science.gov (United States)

    Haddad, Tarek; Himes, Adam; Thompson, Laura; Irony, Telba; Nair, Rajesh

    2017-01-01

    Evaluation of medical devices via clinical trial is often a necessary step in the process of bringing a new product to market. In recent years, device manufacturers are increasingly using stochastic engineering models during the product development process. These models have the capability to simulate virtual patient outcomes. This article presents a novel method based on the power prior for augmenting a clinical trial using virtual patient data. To properly inform clinical evaluation, the virtual patient model must simulate the clinical outcome of interest, incorporating patient variability, as well as the uncertainty in the engineering model and in its input parameters. The number of virtual patients is controlled by a discount function which uses the similarity between modeled and observed data. This method is illustrated by a case study of cardiac lead fracture. Different discount functions are used to cover a wide range of scenarios in which the type I error rates and power vary for the same number of enrolled patients. Incorporation of engineering models as prior knowledge in a Bayesian clinical trial design can provide benefits of decreased sample size and trial length while still controlling type I error rate and power.
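
    The power prior idea is easiest to show in conjugate form: the historical (here, virtual-patient) likelihood enters the posterior raised to a discount power a0 in [0, 1]. A Beta-Binomial sketch with invented counts, not the cardiac-lead case study's data:

```python
# Power prior with Beta-Binomial conjugacy (toy numbers, not the
# article's case study): virtual-patient successes are down-weighted
# by a discount factor a0 before being combined with trial data.
def power_prior_posterior(trial_s, trial_n, virt_s, virt_n, a0,
                          prior_a=1.0, prior_b=1.0):
    """Beta posterior parameters after discounting virtual data by a0."""
    a = prior_a + a0 * virt_s + trial_s
    b = prior_b + a0 * (virt_n - virt_s) + (trial_n - trial_s)
    return a, b

# 100 virtual patients (8 fractures) discounted to a0 = 0.3,
# combined with 50 enrolled patients (5 fractures):
a, b = power_prior_posterior(trial_s=5, trial_n=50,
                             virt_s=8, virt_n=100, a0=0.3)
posterior_mean = a / (a + b)
print(f"posterior fracture rate ≈ {posterior_mean:.3f}")
```

    In the article's dynamic version, a0 is itself a function of the agreement between modeled and observed data; here it is a fixed constant for clarity.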

  11. Human papillomavirus self-sampling for screening nonattenders

    DEFF Research Database (Denmark)

    Lam, Janni Uyen Hoa; Rebolj, Matejka; Ejegod, Ditte Møller

    2017-01-01

    was well-accepted among nonattenders. Adopting modern technology-based platforms into the current organized screening program would serve as a convenient communication method between the health authority and citizens, allowing easy access for the citizen and reducing the workload of administering self-sampling......In organized cervical screening programs, typically 25% of the invited women do not attend. The Copenhagen Self-sampling Initiative (CSi) aimed to gain experiences on participation among screening nonattenders in the Capital Region of Denmark. Here, we report on the effectiveness of different...... communication platforms used in the pilot with suggestions for strategies prior to a full-implementation. Moreover, an innovative approach using self-sampling brushes with unique radio frequency identification chips allowed for unprecedented levels of patient identification safety. Nonattenders from the capital...

  12. Global tractography with embedded anatomical priors for quantitative connectivity analysis

    Directory of Open Access Journals (Sweden)

    Alia eLemkaddem

    2014-11-01

    The main assumption of fiber-tracking algorithms is that fiber trajectories are represented by paths of highest diffusion, which is usually accomplished by following the principal diffusion directions estimated in every voxel from the measured diffusion MRI data. The state-of-the-art approaches, known as global tractography, reconstruct all the fiber tracts of the whole brain simultaneously by solving a global energy minimization problem. The tractograms obtained with these algorithms outperform any previous technique but, unfortunately, the price to pay is an increased computational cost which is not suitable in many practical settings, both in terms of time and memory requirements. Furthermore, existing global tractography algorithms suffer from an important shortcoming that is crucial in the context of brain connectivity analyses. As no anatomical priors are used during the reconstruction process, the recovered fiber tracts are not guaranteed to connect cortical regions and, as a matter of fact, most of them stop prematurely in the white matter. This does not only unnecessarily slow down the estimation procedure and potentially bias any subsequent analysis but also, most importantly, prevents the de facto quantification of brain connectivity. In this work, we propose a novel approach for global tractography that is specifically designed for connectivity analysis applications by explicitly enforcing anatomical priors of the tracts in the optimization and considering the effective contribution of each of them, i.e., their volume, to the acquired diffusion MRI image. We evaluated our approach on both a realistic diffusion MRI phantom and in-vivo data, and also compared its performance to existing tractography algorithms.

  13. Histopathological examination of nerve samples from pure neural leprosy patients: obtaining maximum information to improve diagnostic efficiency

    Directory of Open Access Journals (Sweden)

    Sérgio Luiz Gomes Antunes

    2012-03-01

    Nerve biopsy examination is an important auxiliary procedure for diagnosing pure neural leprosy (PNL). When acid-fast bacilli (AFB) are not detected in the nerve sample, the value of other nonspecific histological alterations should be considered along with pertinent clinical, electroneuromyographical and laboratory data (the detection of Mycobacterium leprae DNA with polymerase chain reaction and the detection of serum anti-phenolic glycolipid 1 antibodies) to support a possible or probable PNL diagnosis. Three hundred forty nerve samples [144 from PNL patients and 196 from patients with non-leprosy peripheral neuropathies (NLN)] were examined. Both AFB-negative and AFB-positive PNL samples had more frequent histopathological alterations (epithelioid granulomas, mononuclear infiltrates, fibrosis, perineurial and subperineurial oedema and decreased numbers of myelinated fibres) than the NLN group. Multivariate analysis revealed that, independently, mononuclear infiltrate and perineurial fibrosis were more common in the PNL group and were able to correctly classify AFB-negative PNL samples. These results indicate that even in the absence of AFB, these histopathological nerve alterations may justify a PNL diagnosis when observed in conjunction with pertinent clinical, epidemiological and laboratory data.

  14. PIRPLE: a penalized-likelihood framework for incorporation of prior images in CT reconstruction

    International Nuclear Information System (INIS)

    Stayman, J Webster; Dang, Hao; Ding, Yifu; Siewerdsen, Jeffrey H

    2013-01-01

    Over the course of diagnosis and treatment, it is common for a number of imaging studies to be acquired. Such imaging sequences can provide substantial patient-specific prior knowledge about the anatomy that can be incorporated into a prior-image-based tomographic reconstruction for improved image quality and better dose utilization. We present a general methodology using a model-based reconstruction approach including formulations of the measurement noise that also integrates prior images. This penalized-likelihood technique adopts a sparsity enforcing penalty that incorporates prior information yet allows for change between the current reconstruction and the prior image. Moreover, since prior images are generally not registered with the current image volume, we present a modified model-based approach that seeks a joint registration of the prior image in addition to the reconstruction of projection data. We demonstrate that the combined prior-image- and model-based technique outperforms methods that ignore the prior data or lack a noise model. Moreover, we demonstrate the importance of registration for prior-image-based reconstruction methods and show that the prior-image-registered penalized-likelihood estimation (PIRPLE) approach can maintain a high level of image quality in the presence of noisy and undersampled projection data. (paper)
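
    In compact form, this family of prior-image penalized-likelihood objectives has the following shape (notation mine, paraphrasing the class of methods rather than reproducing the paper's exact formulation):

```latex
\hat{\mu},\,\hat{W} \;=\; \arg\max_{\mu,\,W}\;
  L(y;\mu)
  \;-\; \beta_{\mathrm{R}}\,\bigl\|\Psi_{\mathrm{R}}\,\mu\bigr\|_{1}
  \;-\; \beta_{\mathrm{P}}\,\bigl\|\Psi_{\mathrm{P}}\,(\mu - W\mu_{\mathrm{P}})\bigr\|_{1}
```

    Here L(y; μ) is the log-likelihood of the measured projections y given the reconstructed volume μ, μ_P is the prior image, W is the registration transform estimated jointly with the reconstruction, Ψ_R and Ψ_P are sparsifying transforms, and β_R, β_P trade off data fidelity against the ordinary and prior-image penalties.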

  15. Veterinary antibiotic resistance, residues, and ecological risks in environmental samples obtained from poultry farms, Egypt.

    Science.gov (United States)

    Dahshan, Hesham; Abd-Elall, Amr Mohamed Mohamed; Megahed, Ayman Mohamed; Abd-El-Kader, Mahdy A; Nabawy, Ehab Elsayed

    2015-02-01

    In Egypt, poultry production constitutes one of the main sources of pollution of the environment with veterinary antibiotics (VAs). About 80 % of meat production in Egypt is of poultry origin, and the potential environmental risks associated with the use of VAs in these farms have not yet been properly evaluated. Thus, the main purpose of this research was to evaluate the prevalence of antibiotic-resistant enteric key bacteria and the incidence of residual antibiotics in poultry farm environmental samples, and to determine whether fertilizing soils with poultry litter from farms potentially brings ecological risks. From December 2011 to September 2012, a total of 225 litter, bird dropping, and water samples were collected from 75 randomly selected broiler poultry farms. A high prevalence of Escherichia coli (n = 179; 79.5 %) in contrast to the low prevalence of Salmonella spp. (n = 7; 3.1 %) was detected. Amongst E. coli isolates, serotypes O142:K86, O125:K70, O91:K, and O119:K69 were the most common. Meanwhile, Salmonella enterica serotypes Emek and Enteritidis were recovered. The antibiograms obtained using the disc diffusion method revealed that resistant and multi-resistant isolates were significantly more common in broiler poultry farms. Residues of tetracycline and ciprofloxacin were detected by HPLC at mean levels of 2.125 and 1.401 mg kg(-1), respectively, in environmental samples contaminated with resistant E. coli strains. The risk evaluation highlighted that tetracycline residues in poultry litter pose significant environmental risks, with a hazard quotient value above 1 (1.64). Our study implies that ineffective implementation of veterinary laws which guide and guard against incorrect VA usage may potentially bring health and environmental risks.
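
    The hazard quotient used in such screening-level risk evaluations is simply the measured (or predicted) environmental concentration divided by the predicted no-effect concentration, with HQ > 1 flagging potential risk. The PNEC below is illustrative, chosen only so the result lands near the reported 1.64:

```python
# Screening-level ecological risk: hazard quotient HQ = PEC / PNEC.
# The PEC matches the reported mean tetracycline residue; the PNEC is
# an assumed value for illustration, not the study's input.
def hazard_quotient(pec_mg_kg, pnec_mg_kg):
    return pec_mg_kg / pnec_mg_kg

hq = hazard_quotient(pec_mg_kg=2.125, pnec_mg_kg=1.296)
print(f"HQ = {hq:.2f} -> {'potential risk' if hq > 1 else 'no concern'}")
```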

  16. Delamination of Pearlitic Steel Wires: The Defining Role of Prior-Drawing Microstructure

    Science.gov (United States)

    Durgaprasad, A.; Giri, S.; Lenka, S.; Sarkar, Sudip Kumar; Biswas, Aniruddha; Kundu, S.; Mishra, S.; Chandra, S.; Doherty, R. D.; Samajdar, I.

    2018-03-01

    This article reports the occasional (alignment of the pearlite: 22 ± 5 pct vs 34 ± 4 pct in the nondelaminated wires. Although all wires had similar through-thickness texture and stress gradients, delaminated wires had stronger gradients in composition and higher hardness across the ferrite-cementite interface. Carbide dissolution and formation of supersaturated ferrite were clearly correlated with delamination, which could be effectively mitigated by controlled laboratory annealing at 673 K. Direct observations on samples subjected to simple shear revealed significant differences in shear localizations. These were controlled by pearlite morphology and interlamellar spacing. Prior-drawing microstructure of coarse misaligned pearlite thus emerged as a critical factor in the wire drawing-induced delamination of the pearlitic wires.

  17. Incorporating priors for EEG source imaging and connectivity analysis

    Directory of Open Access Journals (Sweden)

    Xu eLei

    2015-08-01

    Full Text Available Electroencephalography source imaging (ESI is a useful technique to localize the generators from a given scalp electric measurement and to investigate the temporal dynamics of the large-scale neural circuits. By introducing reasonable priors from other modalities, ESI reveals the most probable sources and communication structures at every moment in time. Here, we review the available priors from such techniques as magnetic resonance imaging (MRI, functional MRI (fMRI, and positron emission tomography (PET. The modality's specific contribution is analyzed from the perspective of source reconstruction. For spatial priors, such as EEG-correlated fMRI, temporally coherent networks and resting-state fMRI are systematically introduced in the ESI. Moreover, the fiber tracking (diffusion tensor imaging, DTI and neuro-stimulation techniques (transcranial magnetic stimulation, TMS are also introduced as the potential priors, which can help to draw inferences about the neuroelectric connectivity in the source space. We conclude that combining EEG source imaging with other complementary modalities is a promising approach towards the study of brain networks in cognitive and clinical neurosciences.

  18. Biopolymers for Sample Collection, Protection, and Preservation

    Science.gov (United States)

    2015-05-19

    knowledge of sample collection from various matrices is crucial. Recovery and preservation of microorganisms prior to analysis are important... Another method for encapsulating bacteria for use in biodegradation of gasoline involves a complex process using gellan gum (Moslemy et al. 2002). Many... use of acacia gum in preserving microorganisms for extended periods of time without refrigeration (Krumnow et al. 2009; Sorokulova et al. 2008, 2012)

  19. On Bayesian Inference under Sampling from Scale Mixtures of Normals

    NARCIS (Netherlands)

    Fernández, C.; Steel, M.F.J.

    1996-01-01

    This paper considers a Bayesian analysis of the linear regression model under independent sampling from general scale mixtures of Normals. Using a common reference prior, we investigate the validity of Bayesian inference and the existence of posterior moments of the regression and precision
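
    The scale-mixture construction itself is easy to illustrate: drawing a Gamma-distributed mixing precision and then a conditional normal with variance scaled by it yields Student-t errors, the classic member of the family. A toy sketch (my own example, not the paper's analysis):

```python
# Student-t errors as a scale mixture of normals: draw a mixing
# precision lam ~ Gamma(nu/2, scale=2/nu) (mean 1), then a normal
# with standard deviation sigma / sqrt(lam). Toy illustration only.
import random

random.seed(1)

def t_error(nu, sigma=1.0):
    lam = random.gammavariate(nu / 2, 2 / nu)   # mixing precision
    return random.gauss(0.0, sigma / lam ** 0.5)

draws = [t_error(nu=4) for _ in range(10000)]
print(f"sample mean of 10000 t(4)-distributed draws: "
      f"{sum(draws) / len(draws):.3f}")
```

    Other choices of mixing distribution give the Laplace, logistic, and further heavy-tailed error models covered by the general framework.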

  20. Wrong, but useful: regional species distribution models may not be improved by range-wide data under biased sampling.

    Science.gov (United States)

    El-Gabbas, Ahmed; Dormann, Carsten F

    2018-02-01

    Species distribution modeling (SDM) is an essential method in ecology and conservation. SDMs are often calibrated within one country's borders, typically along a limited environmental gradient with biased and incomplete data, making the quality of these models questionable. In this study, we evaluated how adequate national presence-only data are for calibrating regional SDMs. We trained SDMs for Egyptian bat species at two different scales: only within Egypt and at a species-specific global extent. We used two modeling algorithms: Maxent and elastic net, both under the point-process modeling framework. For each modeling algorithm, we measured the congruence of the predictions of global and regional models for Egypt, assuming that the lower the congruence, the lower the appropriateness of the Egyptian dataset to describe the species' niche. We inspected the effect of incorporating predictions from global models as an additional predictor ("prior") in regional models, and quantified the improvement in terms of AUC and the congruence between regional models run with and without priors. Moreover, we analyzed predictive performance improvements after correcting for sampling bias at both scales. On average, predictions from global and regional models in Egypt only weakly concur. Collectively, the use of priors did not lead to much improvement: similar AUC and high congruence between regional models calibrated with and without priors. Correction for sampling bias led to higher model performance, whichever prior was used, making the benefit of priors less pronounced. Under biased and incomplete sampling, the use of global bat data did not improve regional model performance. Without enough bias-free regional data, we cannot objectively identify the actual improvement of regional models after incorporating information from the global niche. However, we still believe in great potential for global model predictions to guide future surveys and improve regional sampling in data
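
    The AUC used to score presence-only models of this kind is the probability that a randomly chosen presence site outscores a randomly chosen background site. A minimal rank-based sketch with invented scores, not the study's predictions:

```python
# Minimal AUC computation for presence vs background scores: the
# fraction of presence/background pairs in which the presence site
# scores higher (ties count half). Scores below are invented.
def auc(presence_scores, background_scores):
    wins = sum((p > b) + 0.5 * (p == b)
               for p in presence_scores for b in background_scores)
    return wins / (len(presence_scores) * len(background_scores))

print(auc([0.9, 0.7, 0.6], [0.4, 0.65, 0.2]))
```

    Congruence between global and regional predictions, as used in the study, is a separate comparison of the two models' prediction maps rather than of predictions against observations.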