WorldWideScience

Sample records for sampling times evaluating

  1. Correction of Sample-Time Error for Time-Interleaved Sampling System Using Cubic Spline Interpolation

    Directory of Open Access Journals (Sweden)

    Qin Guo-jie

    2014-08-01

    Sample-time errors can greatly degrade the dynamic range of a time-interleaved sampling system. In this paper, a novel correction technique employing cubic spline interpolation is proposed for inter-channel sample-time error compensation. The cubic spline interpolation compensation filter is developed in the form of a finite-impulse response (FIR) filter structure. A method for deriving the interpolation compensation filter coefficients is presented. A 4 GS/s two-channel, time-interleaved ADC prototype system has been implemented to evaluate the performance of the technique. The experimental results showed that the correction technique effectively attenuates spurious spurs and improves the dynamic performance of the system.
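
    The record above describes correction in the form of an FIR compensation filter; the snippet below is only a minimal sketch of the underlying idea, re-gridding the skewed channel with a cubic spline (SciPy). The sample rate, skew value and test tone are made-up values, not taken from the paper.

      # Illustrative sketch: correcting a known inter-channel sample-time skew
      # with cubic-spline interpolation (not the paper's FIR coefficient design).
      import numpy as np
      from scipy.interpolate import CubicSpline

      fs = 4e9                                    # aggregate rate of the interleaved pair (assumed)
      Ts = 1.0 / fs
      N = 1024                                    # samples per channel
      skew = 0.15 * Ts                            # hypothetical timing error of channel B

      t_ideal = np.arange(2 * N) * Ts             # ideal interleaved sampling grid
      x = lambda t: np.sin(2 * np.pi * 297e6 * t) # test tone

      tA = t_ideal[0::2]                          # channel A: even slots, on time
      tB = t_ideal[1::2] + skew                   # channel B: odd slots, skewed
      chA, chB = x(tA), x(tB)

      # Correction: rebuild channel B on its ideal instants from its skewed samples.
      chB_corr = CubicSpline(tB, chB)(t_ideal[1::2])

      err_before = np.max(np.abs(x(t_ideal[1::2]) - chB))
      err_after = np.max(np.abs(x(t_ideal[1::2]) - chB_corr))
      print(f"max error before: {err_before:.2e}, after: {err_after:.2e}")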

  2. Sampling inspection for the evaluation of time-dependent reliability of deteriorating systems under imperfect defect detection

    International Nuclear Information System (INIS)

    Kuniewski, Sebastian P.; Weide, Johannes A.M. van der; Noortwijk, Jan M. van

    2009-01-01

    The paper presents a sampling-inspection strategy for the evaluation of time-dependent reliability of deteriorating systems, where the deterioration is assumed to initiate at random times and at random locations. After initiation, defects weaken the system's resistance. The system becomes unacceptable when at least one defect reaches a critical depth. The defects are assumed to initiate at random times modeled as event times of a non-homogeneous Poisson process (NHPP) and to develop according to a non-decreasing time-dependent gamma process. The intensity rate of the NHPP is assumed to be a combination of a known time-dependent shape function and an unknown proportionality constant. When sampling inspection (i.e. inspection of a selected subregion of the system) results in a number of detected defect initiations, Bayes' theorem can be used to update prior beliefs about the proportionality constant of the NHPP intensity rate to a posterior distribution. On the basis of a time- and space-dependent Poisson process for the defect initiation, an adaptive Bayesian model for sampling inspection is developed to determine the predictive probability distribution of the time to failure. A potential application is, for instance, the inspection of a large vessel or pipeline suffering pitting/localized corrosion in the oil industry. The possibility of imperfect defect detection is also incorporated in the model.
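
    As a minimal sketch of the Bayesian updating step described in this record (not the authors' full predictive model): with intensity rate lambda(t) = c*w(t), w known and c unknown, inspecting a fraction a of the system up to time T and finding n initiations gives a Poisson count with mean c*a*W(T), so a gamma prior on c is conjugate. The function name, shape function and numbers below are illustrative assumptions.

      # Conjugate gamma-Poisson update of the NHPP proportionality constant c.
      from scipy.integrate import quad
      from scipy.stats import gamma

      def posterior_c(n_defects, a, T, w, alpha0=1.0, beta0=1.0):
          """Return the Gamma posterior of c (shape/rate parametrization)."""
          W_T, _ = quad(w, 0.0, T)              # cumulative shape function W(T)
          alpha_post = alpha0 + n_defects
          beta_post = beta0 + a * W_T
          return gamma(a=alpha_post, scale=1.0 / beta_post)

      # Example: power-law shape function, 30% of the vessel inspected at T = 10 yr,
      # 3 pit initiations found (all numbers hypothetical).
      post = posterior_c(n_defects=3, a=0.3, T=10.0, w=lambda t: t**0.5)
      print("posterior mean of c:", post.mean(), " 95% interval:", post.interval(0.95))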

  3. Clinical evaluation of a Mucorales-specific real-time PCR assay in tissue and serum samples.

    Science.gov (United States)

    Springer, Jan; Lackner, Michaela; Ensinger, Christian; Risslegger, Brigitte; Morton, Charles Oliver; Nachbaur, David; Lass-Flörl, Cornelia; Einsele, Hermann; Heinz, Werner J; Loeffler, Juergen

    2016-12-01

    Molecular diagnostic assays can accelerate the diagnosis of fungal infections and subsequently improve patient outcomes. In particular, the detection of infections due to Mucorales is still challenging for laboratories and physicians. The aim of this study was to evaluate a probe-based Mucorales-specific real-time PCR assay (Muc18S) using tissue and serum samples from patients suffering from invasive mucormycosis (IMM). This assay can detect a broad range of clinically relevant Mucorales species and can be used to complement existing diagnostic tests or to screen high-risk patients. An advantage of the Muc18S assay is that it exclusively detects Mucorales species, allowing Mucorales DNA to be identified, without sequencing, within a few hours. In paraffin-embedded tissue samples this PCR-based method allowed rapid identification of Mucorales in comparison with standard methods and showed 91 % sensitivity in the IMM tissue samples. We also evaluated serum samples, an easily accessible material, from patients at risk of IMM. Mucorales DNA was detected in all patients with probable/proven IMM (100 %) and in 29 % of the possible cases. Detection of IMM in serum could enable an earlier diagnosis (up to 21 days earlier) than current methods, including tissue samples, which were obtained mainly post-mortem. A screening strategy for high-risk patients, which would enable targeted treatment to improve patient outcomes, is therefore possible.

  4. Time-Scale and Time-Frequency Analyses of Irregularly Sampled Astronomical Time Series

    Directory of Open Access Journals (Sweden)

    S. Roques

    2005-09-01

    We evaluate the quality of spectral restoration in the case of irregularly sampled signals in astronomy. We study in detail a time-scale method leading to a global wavelet spectrum comparable to the Fourier periodogram, and a time-frequency matching pursuit allowing us to identify the frequencies and to control the error propagation. In both cases, the signals are first resampled with a linear interpolation. Both results are compared with those obtained using Lomb's periodogram and using the weighted wavelet Z-transform developed in astronomy for unevenly sampled variable-star observations. These approaches are applied to simulations and to light variations of four variable stars. This leads to the conclusion that the matching pursuit is more efficient for recovering the spectral contents of a pulsating star, even with a preliminary resampling. In particular, the results are almost independent of the quality of the initial irregular sampling.
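
    For reference, a minimal sketch of one of the comparison methods named above, the Lomb periodogram applied directly to unevenly sampled data with no resampling step; the signal, observation times and frequency grid are invented for illustration.

      # Lomb periodogram of an irregularly sampled series (illustrative data).
      import numpy as np
      from scipy.signal import lombscargle

      rng = np.random.default_rng(0)
      t = np.sort(rng.uniform(0.0, 40.0, 300))           # irregular observation times (days)
      y = 1.3 * np.sin(2 * np.pi * t / 3.7) + 0.2 * rng.standard_normal(t.size)

      freqs = np.linspace(0.02, 1.0, 2000)               # trial frequencies (cycles/day)
      power = lombscargle(t, y - y.mean(), 2 * np.pi * freqs)   # expects angular frequencies

      print("dominant period ~", 1.0 / freqs[np.argmax(power)], "days")  # close to 3.7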

  5. Critical evaluation of sample pretreatment techniques.

    Science.gov (United States)

    Hyötyläinen, Tuulia

    2009-06-01

    Sample preparation before chromatographic separation is the most time-consuming and error-prone part of the analytical procedure. Therefore, selecting and optimizing an appropriate sample preparation scheme is a key factor in the final success of the analysis, and the judicious choice of an appropriate procedure greatly influences the reliability and accuracy of a given analysis. The main objective of this review is to critically evaluate the applicability, disadvantages, and advantages of various sample preparation techniques. Particular emphasis is placed on extraction techniques suitable for both liquid and solid samples.

  6. Evaluation of statistical methods for quantifying fractal scaling in water-quality time series with irregular sampling

    Science.gov (United States)

    Zhang, Qian; Harman, Ciaran J.; Kirchner, James W.

    2018-02-01

    River water-quality time series often exhibit fractal scaling, which here refers to autocorrelation that decays as a power law over some range of scales. Fractal scaling presents challenges to the identification of deterministic trends because (1) fractal scaling has the potential to lead to false inference about the statistical significance of trends and (2) the abundance of irregularly spaced data in water-quality monitoring networks complicates efforts to quantify fractal scaling. Traditional methods for estimating fractal scaling - in the form of spectral slope (β) or other equivalent scaling parameters (e.g., Hurst exponent) - are generally inapplicable to irregularly sampled data. Here we consider two types of estimation approaches for irregularly sampled data and evaluate their performance using synthetic time series. These time series were generated such that (1) they exhibit a wide range of prescribed fractal scaling behaviors, ranging from white noise (β = 0) to Brown noise (β = 2) and (2) their sampling gap intervals mimic the sampling irregularity (as quantified by both the skewness and mean of gap-interval lengths) in real water-quality data. The results suggest that none of the existing methods fully account for the effects of sampling irregularity on β estimation. First, the results illustrate the danger of using interpolation for gap filling when examining autocorrelation, as the interpolation methods consistently underestimate or overestimate β under a wide range of prescribed β values and gap distributions. Second, the widely used Lomb-Scargle spectral method also consistently underestimates β. A previously published modified form, using only the lowest 5 % of the frequencies for spectral slope estimation, has very poor precision, although the overall bias is small. Third, a recent wavelet-based method, coupled with an aliasing filter, generally has the smallest bias and root-mean-squared error among all methods for a wide range of
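
    A minimal sketch of the simplest estimator family discussed in this record: a Lomb-Scargle spectrum of the irregular series followed by a log-log slope fit for β. The paper shows this estimator is biased; the pseudo-Nyquist cut-off and the "lowest fraction" option below are rough stand-ins, not the authors' exact procedure.

      # Lomb-Scargle spectrum of an irregular series and a log-log slope fit for beta.
      import numpy as np
      from scipy.signal import lombscargle

      def spectral_slope(t, y, n_freq=200, lowest_fraction=1.0):
          """Estimate beta in P(f) ~ f**(-beta); optionally keep only the lowest
          fraction of the frequency range (cf. the modified method in the record)."""
          t = np.asarray(t, float)
          y = np.asarray(y, float) - np.mean(y)
          f_min = 1.0 / (t[-1] - t[0])
          f_max = 0.5 * len(t) / (t[-1] - t[0])          # rough pseudo-Nyquist for uneven data
          freqs = np.logspace(np.log10(f_min), np.log10(f_max), n_freq)
          power = lombscargle(t, y, 2 * np.pi * freqs)
          keep = freqs <= f_min + lowest_fraction * (f_max - f_min)
          slope, _ = np.polyfit(np.log10(freqs[keep]), np.log10(power[keep]), 1)
          return -slope

      # Quick check on an irregularly sampled Brownian motion (Brown noise, beta ~ 2).
      rng = np.random.default_rng(0)
      t = np.sort(rng.uniform(0.0, 1000.0, 400))
      y = np.cumsum(rng.standard_normal(t.size) * np.sqrt(np.diff(t, prepend=0.0)))
      print(spectral_slope(t, y))                        # near 2, typically biased low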

  7. Power distribution system reliability evaluation using dagger-sampling Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Y.; Zhao, S.; Ma, Y. [North China Electric Power Univ., Hebei (China). Dept. of Electrical Engineering

    2009-03-11

    A dagger-sampling Monte Carlo simulation method was used to evaluate power distribution system reliability. The dagger-sampling technique was used to record the failure of a component as an incident and to determine its occurrence probability by generating incident samples using random numbers. The dagger sampling technique was combined with the direct sequential Monte Carlo method to calculate average values of load point indices and system indices. Results of the 2 methods with simulation times of up to 100,000 years were then compared. The comparative evaluation showed that less computing time was required using the dagger-sampling technique due to its higher convergence speed. When simulation times were 1000 years, the dagger-sampling method required 0.05 seconds to accomplish an evaluation, while the direct method required 0.27 seconds. 12 refs., 3 tabs., 4 figs.
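
    A hedged sketch of the dagger-sampling idea as commonly described: one uniform random number covers a block of floor(1/q) trials, so a component fails in at most one trial per block, which yields negatively correlated samples and faster convergence than drawing one random number per component per trial. The three-component series system and its failure probabilities are invented, not the paper's distribution-system model.

      import numpy as np

      def dagger_sample_failures(q, n_trials, rng):
          """Boolean failure matrix (n_trials x n_components) via dagger sampling."""
          q = np.asarray(q, float)
          fails = np.zeros((n_trials, q.size), dtype=bool)
          for i, qi in enumerate(q):
              block = int(np.floor(1.0 / qi))            # trials covered by one random draw
              for start in range(0, n_trials, block):
                  u = rng.random()
                  j = int(u // qi)                       # index of the failing trial, if any
                  if j < block and start + j < n_trials:
                      fails[start + j, i] = True
          return fails

      rng = np.random.default_rng(1)
      q = [0.01, 0.02, 0.005]                            # component failure probabilities (assumed)
      fails = dagger_sample_failures(q, n_trials=200_000, rng=rng)
      system_down = fails.any(axis=1)                    # series system: any failure -> outage
      print("estimated unavailability:", system_down.mean(),
            " analytic:", 1 - np.prod(1 - np.asarray(q)))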

  8. Evaluation of statistical methods for quantifying fractal scaling in water-quality time series with irregular sampling

    Directory of Open Access Journals (Sweden)

    Q. Zhang

    2018-02-01

    River water-quality time series often exhibit fractal scaling, which here refers to autocorrelation that decays as a power law over some range of scales. Fractal scaling presents challenges to the identification of deterministic trends because (1) fractal scaling has the potential to lead to false inference about the statistical significance of trends and (2) the abundance of irregularly spaced data in water-quality monitoring networks complicates efforts to quantify fractal scaling. Traditional methods for estimating fractal scaling – in the form of spectral slope (β) or other equivalent scaling parameters (e.g., Hurst exponent) – are generally inapplicable to irregularly sampled data. Here we consider two types of estimation approaches for irregularly sampled data and evaluate their performance using synthetic time series. These time series were generated such that (1) they exhibit a wide range of prescribed fractal scaling behaviors, ranging from white noise (β = 0) to Brown noise (β = 2) and (2) their sampling gap intervals mimic the sampling irregularity (as quantified by both the skewness and mean of gap-interval lengths) in real water-quality data. The results suggest that none of the existing methods fully account for the effects of sampling irregularity on β estimation. First, the results illustrate the danger of using interpolation for gap filling when examining autocorrelation, as the interpolation methods consistently underestimate or overestimate β under a wide range of prescribed β values and gap distributions. Second, the widely used Lomb–Scargle spectral method also consistently underestimates β. A previously published modified form, using only the lowest 5 % of the frequencies for spectral slope estimation, has very poor precision, although the overall bias is small. Third, a recent wavelet-based method, coupled with an aliasing filter, generally has the smallest bias and root-mean-squared error among

  9. The real-time fitting of radioactive decay curves. Pt. 3. Counting during sampling

    International Nuclear Information System (INIS)

    Hartley, B.M.

    1994-01-01

    An analysis of a least-squares method for the real-time fitting of the theoretical total count function to the actual total count from radioactive decays has been given previously for the case where counting takes place after a sample is taken. The counting may be done in a number of different counting systems which distinguish between different types or energies of radiation emitted from the sample. The method would allow real-time determination of the numbers of atoms, and hence activities, of the individual isotopes present and has been designated the Time Evolved Least-Squares (TELS) method. If the radioactivity to be measured exists as an aerosol, or in a form where a sample is taken at a constant rate, it may be possible to count during sampling and by so doing reduce the total time required to determine the activity of the individual isotopes present. The TELS method is extended here to the case where counting and the evaluation of the activity take place concurrently with the sampling. The functions which need to be evaluated are derived and the calculations required to implement the method are discussed. As with the TELS method of counting after sampling, the technique of counting during sampling and the simultaneous evaluation of activity could be achieved in real time. Results of testing the method by computer simulation for two counting schemes for the descendants of radon are presented. ((orig.))
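
    The paper derives its own expressions; the sketch below only illustrates the kind of least-squares fit involved, under the assumption of collection at a constant rate r_i for an isotope with decay constant lam_i, for which the expected cumulative counts while counting during sampling are C(t) = sum_i eps_i*r_i*(t - (1 - exp(-lam_i*t))/lam_i). Isotope choices and rates are hypothetical.

      # Fit the products eps_i*r_i to cumulative counts recorded during sampling.
      import numpy as np
      from scipy.optimize import nnls

      def design_matrix(t, lams):
          t = np.asarray(t, float)[:, None]
          lams = np.asarray(lams, float)[None, :]
          return t - (1.0 - np.exp(-lams * t)) / lams    # one column per isotope

      # Hypothetical two-isotope example (decay constants in 1/s, radon-progeny-like).
      lams = [np.log(2) / 183.0, np.log(2) / 1608.0]
      true_rates = np.array([4.0, 1.5])                  # eps_i * r_i, counts/s collected
      t_obs = np.arange(10.0, 600.0, 10.0)
      rng = np.random.default_rng(2)
      counts = rng.poisson(design_matrix(t_obs, lams) @ true_rates)

      fitted, _ = nnls(design_matrix(t_obs, lams), counts.astype(float))
      print("fitted eps*r:", fitted)                     # approximately true_rates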

  10. Choice of Sample Split in Out-of-Sample Forecast Evaluation

    DEFF Research Database (Denmark)

    Hansen, Peter Reinhard; Timmermann, Allan

    Out-of-sample tests of forecast performance depend on how a given data set is split into estimation and evaluation periods, yet no guidance exists on how to choose the split point. Empirical forecast evaluation results can therefore be difficult to interpret, particularly when several values ..., while conversely the power of forecast evaluation tests is strongest with long out-of-sample periods. To deal with size distortions, we propose a test statistic that is robust to the effect of considering multiple sample split points. Empirical applications to predictability of stock returns and inflation demonstrate that out-of-sample forecast evaluation results can critically depend on how the sample split is determined.

  11. Experimental performance evaluation of two stack sampling systems in a plutonium facility

    International Nuclear Information System (INIS)

    Glissmeyer, J.A.

    1992-04-01

    The evaluation of two routine stack sampling systems at the Z-Plant plutonium facility operated by Rockwell International for USERDA is part of a larger study, sponsored by Rockwell and conducted by Battelle, Pacific Northwest Laboratories, of gaseous effluent sampling systems. The gaseous effluent sampling systems evaluated are located at the main plant ventilation stack (291-Z-1) and at a vessel vent stack (296-Z-3). A preliminary report, which was a paper study issued in April 1976, identified many deficiencies in the existing sampling systems and made recommendations for corrective action. The objectives of this experimental evaluation of those sampling systems were as follows: Characterize the radioactive aerosols in the stack effluents; Develop a tracer aerosol technique for validating particulate effluent sampling system performance; Evaluate the performance of the existing routine sampling systems and their compliance with the sponsor's criteria; and Recommend corrective action where required. The tracer aerosol approach to sampler evaluation was chosen because the low concentrations of radioactive particulates in the effluents would otherwise require much longer sampling times and thus more time to complete this evaluation. The following report describes the sampling systems that are the subject of this study and then details the experiments performed. The results are then presented and discussed. Much of the raw and finished data are included in the appendices

  12. Practical reporting times for environmental samples

    International Nuclear Information System (INIS)

    Bayne, C.K.; Schmoyer, D.D.; Jenkins, R.A.

    1993-02-01

    Preanalytical holding times for environmental samples are specified because chemical and physical characteristics may change between sampling and chemical analysis. For example, the Federal Register prescribes a preanalytical holding time of 14 days for volatile organic compounds in soil stored at 4 degrees C. The American Society for Testing Materials (ASTM) uses a more technical definition that the preanalytical holding time is the day when the analyte concentration for an environmental sample falls below the lower 99% confidence interval on the analyte concentration at day zero. This study reviews various holding time definitions and suggests a new preanalytical holding time approach using acceptable error rates for measuring an environmental analyte. This practical reporting time (PRT) approach has been applied to nineteen volatile organic compounds and four explosives in three environmental soil samples. A PRT nomograph of error rates has been developed to estimate the consequences of missing a preanalytical holding time. This nomograph can be applied to a large class of analytes with concentrations that decay linearly or exponentially with time regardless of sample matrices and storage conditions.
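
    A sketch of the ASTM-style holding-time calculation referenced above, assuming a simple exponential decay of analyte concentration and a two-sided 99% confidence limit on the day-zero value; the data are invented, and SciPy >= 1.6 is assumed for linregress.intercept_stderr.

      import numpy as np
      from scipy import stats

      days = np.array([0, 3, 7, 14, 21, 28, 42, 56], float)
      conc = np.array([100.0, 97.0, 93.5, 88.0, 84.0, 80.5, 73.0, 67.5])  # ug/kg (hypothetical)

      # Log-linear fit: ln C = ln C0 - k * t
      res = stats.linregress(days, np.log(conc))
      C0_hat, k = np.exp(res.intercept), -res.slope

      # Lower 99% confidence limit on the day-zero concentration (intercept).
      t_crit = stats.t.ppf(0.995, df=len(days) - 2)
      C0_lower = np.exp(res.intercept - t_crit * res.intercept_stderr)

      # Holding time: day when the fitted mean curve falls below that limit.
      holding_time = np.log(C0_hat / C0_lower) / k
      print(f"k = {k:.4f}/day, holding time ~ {holding_time:.1f} days")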

  13. Practical reporting times for environmental samples

    Energy Technology Data Exchange (ETDEWEB)

    Bayne, C.K.; Schmoyer, D.D.; Jenkins, R.A.

    1993-02-01

    Preanalytical holding times for environmental samples are specified because chemical and physical characteristics may change between sampling and chemical analysis. For example, the Federal Register prescribes a preanalytical holding time of 14 days for volatile organic compounds in soil stored at 4 °C. The American Society for Testing Materials (ASTM) uses a more technical definition that the preanalytical holding time is the day when the analyte concentration for an environmental sample falls below the lower 99% confidence interval on the analyte concentration at day zero. This study reviews various holding time definitions and suggests a new preanalytical holding time approach using acceptable error rates for measuring an environmental analyte. This practical reporting time (PRT) approach has been applied to nineteen volatile organic compounds and four explosives in three environmental soil samples. A PRT nomograph of error rates has been developed to estimate the consequences of missing a preanalytical holding time. This nomograph can be applied to a large class of analytes with concentrations that decay linearly or exponentially with time regardless of sample matrices and storage conditions.

  14. A confirmatory holding time study for purgeable VOCs in water samples

    International Nuclear Information System (INIS)

    West, O.R.; Bayne, C.K.; Siegrist, R.L.; Holden, W.H.; Bottrell, D.W.

    1996-01-01

    Analyte stability during pre-analytical storage is essential to the accurate quantification of contaminants in environmental samples. This is particularly true for volatile organic compounds (VOCs), which can easily volatilize and/or degrade during sample storage. Recognizing this, regulatory agencies require that water samples be collected in vials without headspace and stored at 4 degrees C, and that analyses be conducted within 14 days, even if samples are acid-preserved. Since the selection of a 14-day holding time was largely arbitrary, the appropriateness of this requirement must be re-evaluated. The goal of the study described here was to provide regulatory agencies with the necessary data to extend the maximum holding time for properly preserved VOC water samples to 28 days.

  15. Long-term strategic asset allocation: An out-of-sample evaluation

    NARCIS (Netherlands)

    Diris, B.F.; Palm, F.C.; Schotman, P.C.

    We evaluate the out-of-sample performance of a long-term investor who follows an optimized dynamic trading strategy. Although the dynamic strategy is able to benefit from predictability out-of-sample, a short-term investor using a single-period market timing strategy would have realized an almost

  16. Method for evaluation of radiative properties of glass samples

    Energy Technology Data Exchange (ETDEWEB)

    Mohelnikova, Jitka [Faculty of Civil Engineering, Brno University of Technology, Veveri 95, 602 00 Brno (Czech Republic)], E-mail: mohelnikova.j@fce.vutbr.cz

    2008-04-15

    The paper presents a simple calculation method for evaluating the radiative properties of window glasses. The method is based on a computer simulation model of the energy balance of a thermally insulated box with selected glass samples. A temperature profile of the air inside the box with a glass sample exposed to incident radiation was determined for defined boundary conditions. The spectral range of the radiation was considered in the interval between 280 and 2500 nm. This interval corresponds to the spectral range of solar radiation affecting windows in building facades. The air temperature rise within the box was determined in response to the incident radiation over the time between the beginning of the radiation exposure and the time when steady-state thermal conditions were reached. The steady-state temperature inside the insulated box serves for the evaluation of the box energy balance and the determination of the glass sample's radiative properties. These properties are represented by glass characteristics in the form of mean values of transmittance, reflectance and absorptance calculated for a defined spectral range. The data from the computer simulations were compared to experimental measurements on a real model of the insulated box. Results of both the calculations and the measurements are in good agreement. The method is recommended for preliminary evaluation of window glass radiative properties, which serve as input data for the energy evaluation of buildings.

  17. A test of alternative estimators for volume at time 1 from remeasured point samples

    Science.gov (United States)

    Francis A. Roesch; Edwin J. Green; Charles T. Scott

    1993-01-01

    Two estimators for volume at time 1 for use with permanent horizontal point samples are evaluated. One estimator, used traditionally, uses only the trees sampled at time 1, while the second estimator, originally presented by Roesch and coauthors (F.A. Roesch, Jr., E.J. Green, and C.T. Scott. 1989. For. Sci. 35(2):281-293), takes advantage of additional sample...

  18. Eigenvalue sensitivity of sampled time systems operating in closed loop

    Science.gov (United States)

    Bernal, Dionisio

    2018-05-01

    The use of feedback to create closed-loop eigenstructures with high sensitivity has received some attention in the Structural Health Monitoring field. Although practical implementation is necessarily digital, and thus in sampled time, work thus far has centered on the continuous-time framework, both in design and in checking performance. It is shown in this paper that the performance in discrete time, at typical sampling rates, can differ notably from that anticipated in the continuous-time formulation, and that discrepancies can be particularly large in the real part of the eigenvalue sensitivities; a consequence is significant error in the (linear) estimate of the level of damage at which closed-loop stability is lost. As one anticipates, explicit consideration of the sampling rate poses no special difficulties in the closed-loop eigenstructure design, and the relevant expressions are developed in the paper, including a formula for the efficient evaluation of the derivative of the matrix exponential based on the theory of complex perturbations. The paper presents an easily reproduced numerical example showing the level of error that can result when the discrete-time implementation of the controller is not considered.
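
    The record mentions a formula for the derivative of the matrix exponential based on complex perturbations; the sketch below shows the generic complex-step version of that idea (the paper's exact expression may differ), applied to a made-up single-degree-of-freedom state matrix.

      # Complex-step derivative of the sampled-time system matrix expm(A(p)*dt).
      import numpy as np
      from scipy.linalg import expm

      def d_expm_dp(A_of_p, p, dt, h=1e-30):
          """d/dp expm(A(p) * dt) via a complex step; A_of_p must accept complex p."""
          Ac = A_of_p(p + 1j * h)
          return np.imag(expm(Ac * dt)) / h

      # Hypothetical example: damping parameter c in a 1-DOF state matrix.
      A_of_p = lambda c: np.array([[0.0, 1.0], [-100.0, -c]], dtype=complex)
      dAd = d_expm_dp(A_of_p, p=0.4, dt=0.01)

      # Check against central finite differences.
      eps = 1e-6
      fd = (expm(A_of_p(0.4 + eps).real * 0.01) - expm(A_of_p(0.4 - eps).real * 0.01)) / (2 * eps)
      print(np.max(np.abs(dAd - fd)))                   # small discrepancy expected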

  19. Evaluation of Legionella real-time PCR against traditional culture for routine and public health testing of water samples.

    Science.gov (United States)

    Collins, S; Stevenson, D; Walker, J; Bennett, A

    2017-06-01

    To evaluate the usefulness of Legionella qPCR alongside traditional culture for enumeration of Legionella from water samples as part of both routine and public health investigation testing. Routine water samples (n = 2002) and samples from public health investigations (n = 215) were analysed by culture and qPCR for Legionella spp., Legionella pneumophila and L. pneumophila sg-1. A negative qPCR result was highly predictive of a negative culture result for all water systems (negative predictive values, NPV from 97·4 to 100%). Positive predictive values (PPV) were lower (0-50%). Results for qPCR were generally larger than those for culture, with average log10 differences of 1·1 for Legionella spp. and 1·2 for L. pneumophila. Alert and action levels of 1000 and 10 000 GU per litre, respectively, are proposed for Legionella qPCR for hot and cold water systems (HCWS). The use of qPCR significantly reduced the time to results for public health investigations by rapidly identifying potential sources and ruling out others, thus enabling a more rapid and efficient response. The high NPV of qPCR supports its use to rapidly screen out negative samples without culture. Alert and action levels for Legionella qPCR for HCWS are proposed. Quantitative PCR will be a valuable tool for both routine and public health testing. This study generated comparative data of >2000 water samples by qPCR and culture. Action and alert levels have been recommended that could enable duty holders to interpret qPCR results to facilitate timely Legionella control and public health protection. © 2017 Crown copyright. Journal of Applied Microbiology © 2017 The Society for Applied Microbiology.

  20. Screen Space Ambient Occlusion Based Multiple Importance Sampling for Real-Time Rendering

    Science.gov (United States)

    Zerari, Abd El Mouméne; Babahenini, Mohamed Chaouki

    2018-03-01

    We propose a new approximation technique for accelerating the Global Illumination algorithm for real-time rendering. The proposed approach is based on the Screen-Space Ambient Occlusion (SSAO) method, which approximates the global illumination for large, fully dynamic scenes at interactive frame rates. Current algorithms that are based on the SSAO method suffer from difficulties due to the large number of samples that are required. In this paper, we propose an improvement to the SSAO technique by integrating it with a Multiple Importance Sampling technique that combines a stratified sampling method with an importance sampling method, with the objective of reducing the number of samples. Experimental evaluation demonstrates that our technique can produce high-quality images in real time and is significantly faster than traditional techniques.
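
    A minimal sketch of the multiple importance sampling combination referred to above, using the balance heuristic to weight samples drawn from a stratified-uniform strategy and an importance strategy; the 1-D toy integrand stands in for the ambient-occlusion term and is not from the paper.

      # Balance-heuristic MIS combination of two sampling strategies (toy integrand).
      import numpy as np

      rng = np.random.default_rng(3)
      f = lambda x: np.sin(np.pi * x) ** 4               # integrand on [0, 1], integral = 3/8

      n1 = n2 = 256
      x1 = (np.arange(n1) + rng.random(n1)) / n1         # stratified uniform samples
      p1 = lambda x: np.ones_like(x)                     # overall density of strategy 1
      x2 = np.sqrt(rng.random(n2))                       # importance samples with pdf 2x
      p2 = lambda x: 2.0 * x

      def w(x, n_self, p_self, n_other, p_other):        # balance-heuristic weight
          return n_self * p_self(x) / (n_self * p_self(x) + n_other * p_other(x))

      est = (np.sum(w(x1, n1, p1, n2, p2) * f(x1) / p1(x1)) / n1
             + np.sum(w(x2, n2, p2, n1, p1) * f(x2) / p2(x2)) / n2)
      print("MIS estimate:", est, " exact:", 3.0 / 8.0)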

  1. Low-sampling-rate ultra-wideband digital receiver using equivalent-time sampling

    KAUST Repository

    Ballal, Tarig

    2014-09-01

    In this paper, we propose an all-digital scheme for ultra-wideband symbol detection. In the proposed scheme, the received symbols are sampled many times below the Nyquist rate. It is shown that when the number of symbol repetitions, P, is co-prime with the symbol duration given in Nyquist samples, the receiver can sample the received data P times below the Nyquist rate, without loss of fidelity. The proposed scheme is applied to perform channel estimation and binary pulse position modulation (BPPM) detection. Results are presented for two receivers operating at two different sampling rates that are 10 and 20 times below the Nyquist rate. The feasibility of the proposed scheme is demonstrated in different scenarios, with reasonable bit error rates obtained in most of the cases.
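
    A small sketch of the co-prime property this record relies on: with a symbol spanning N Nyquist samples repeated P times and gcd(P, N) = 1, sampling P times below the Nyquist rate visits every position of the symbol exactly once, so the full-rate waveform is recovered (an ideal, noise-free channel is assumed and N, P are arbitrary example values).

      import numpy as np
      from math import gcd

      N, P = 127, 10                      # symbol length in Nyquist samples; undersampling factor
      assert gcd(N, P) == 1

      rng = np.random.default_rng(4)
      symbol = rng.standard_normal(N)     # one symbol at Nyquist rate (hypothetical waveform)
      received = np.tile(symbol, P)       # P identical repetitions, ideal channel

      slow_samples = received[::P]        # sampling P times below the Nyquist rate
      positions = (np.arange(N) * P) % N  # where each slow sample falls inside the symbol

      reconstructed = np.empty(N)
      reconstructed[positions] = slow_samples
      print("exact recovery:", np.allclose(reconstructed, symbol))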

  2. Low-sampling-rate ultra-wideband digital receiver using equivalent-time sampling

    KAUST Repository

    Ballal, Tarig; Al-Naffouri, Tareq Y.

    2014-01-01

    In this paper, we propose an all-digital scheme for ultra-wideband symbol detection. In the proposed scheme, the received symbols are sampled many times below the Nyquist rate. It is shown that when the number of symbol repetitions, P, is co-prime with the symbol duration given in Nyquist samples, the receiver can sample the received data P times below the Nyquist rate, without loss of fidelity. The proposed scheme is applied to perform channel estimation and binary pulse position modulation (BPPM) detection. Results are presented for two receivers operating at two different sampling rates that are 10 and 20 times below the Nyquist rate. The feasibility of the proposed scheme is demonstrated in different scenarios, with reasonable bit error rates obtained in most of the cases.

  3. Clinical evaluation of the Abbott RealTime MTB Assay for direct detection of Mycobacterium tuberculosis-complex from respiratory and non-respiratory samples.

    Science.gov (United States)

    Hinić, Vladimira; Feuz, Kinga; Turan, Selda; Berini, Andrea; Frei, Reno; Pfeifer, Karin; Goldenberger, Daniel

    2017-05-01

    Rapid and reliable diagnosis is crucial for correct management of tuberculosis. The Abbott RealTime MTB Assay represents a novel qualitative real-time PCR assay for direct detection of M. tuberculosis-complex (MTB) DNA from respiratory samples. The test targets two highly conserved sequences, the multi-copy insertion element IS6110 and the protein antigen B (PAB) gene of MTB, allowing even the detection of IS6110-deficient strains. We evaluated this commercial diagnostic test by analyzing 200 respiratory and, for the first time, 87 non-respiratory clinical specimens from our tertiary care institution and compared its results to our IS6110-based in-house real-time PCR for MTB as well as MTB culture. Overall sensitivity for Abbott RealTime MTB was 100% (19/19) in smear positive and 87.5% (7/8) in smear negative specimens, while the specificity of the assay was 100% (260/260). For both non-respiratory smear positive and smear negative specimens Abbott RealTime MTB tests showed 100% (8/8) sensitivity and 100% (8/8) specificity. Cycle threshold (Ct) value analysis of 16 MTB positive samples showed a slightly higher Ct value of the Abbott RealTime MTB test compared to our in-house MTB assay (mean delta Ct = 2.55). In conclusion, the performance of the new Abbott RealTime MTB Assay was highly similar to culture and in-house MTB PCR. We document successful analysis of 87 non-respiratory samples with the highly automated Abbott RealTime MTB test with no inhibition observed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Detecting chaos in irregularly sampled time series.

    Science.gov (United States)

    Kulp, C W

    2013-09-01

    Recently, Wiebe and Virgin [Chaos 22, 013136 (2012)] developed an algorithm which detects chaos by analyzing a time series' power spectrum which is computed using the Discrete Fourier Transform (DFT). Their algorithm, like other time series characterization algorithms, requires that the time series be regularly sampled. Real-world data, however, are often irregularly sampled, thus, making the detection of chaotic behavior difficult or impossible with those methods. In this paper, a characterization algorithm is presented, which effectively detects chaos in irregularly sampled time series. The work presented here is a modification of Wiebe and Virgin's algorithm and uses the Lomb-Scargle Periodogram (LSP) to compute a series' power spectrum instead of the DFT. The DFT is not appropriate for irregularly sampled time series. However, the LSP is capable of computing the frequency content of irregularly sampled data. Furthermore, a new method of analyzing the power spectrum is developed, which can be useful for differentiating between chaotic and non-chaotic behavior. The new characterization algorithm is successfully applied to irregularly sampled data generated by a model as well as data consisting of observations of variable stars.

  5. Time-dependent importance sampling in semiclassical initial value representation calculations for time correlation functions. II. A simplified implementation.

    Science.gov (United States)

    Tao, Guohua; Miller, William H

    2012-09-28

    An efficient time-dependent (TD) Monte Carlo (MC) importance sampling method has recently been developed [G. Tao and W. H. Miller, J. Chem. Phys. 135, 024104 (2011)] for the evaluation of time correlation functions using the semiclassical (SC) initial value representation (IVR) methodology. In this TD-SC-IVR method, the MC sampling uses information from both time-evolved phase points as well as their initial values, and only the "important" trajectories are sampled frequently. Even though the TD-SC-IVR was shown in some benchmark examples to be much more efficient than the traditional time-independent sampling method (which uses only initial conditions), the calculation of the SC prefactor, which is computationally expensive, especially for large systems, is still required for accepted trajectories. In the present work, we present an approximate implementation of the TD-SC-IVR method that is completely prefactor-free; it gives the time correlation function as a classical-like magnitude function multiplied by a phase function. Application of this approach to flux-flux correlation functions (which yield reaction rate constants) for the benchmark H + H2 system shows very good agreement with exact quantum results. Limitations of the approximate approach are also discussed.

  6. Drawing evaluation report for sampling equipment drawings

    International Nuclear Information System (INIS)

    WILSON, G.W.

    1999-01-01

    This document presents the results of a task to evaluate Tank Waste Remediation System (TWRS) sampling equipment drawings and identifies each drawing's category as either essential, support, or general. This report completes the drawing evaluation task as outlined in Engineering Task Plan For Truck 3 and 4 Drawing Compliance and Evaluation (Wilson, 1997). The scope of this report is limited to an evaluation and identification of drawing category for drawings of certain tank waste sampling equipment for which the TWRS Characterization Project has been assigned custody, including: vapor sampling, grab sampling, auger sampling, and all core sampling equipment (see LMHC Task Order 304). This report does not address drawings for other waste tank deployed equipment systems having similar assigned custody, such as the Light Duty Utility Arm (LDUA), Cone Penetrometer system, or Long Length Contaminated Equipment (LLCE). The LDUA drawings are addressed in the Characterization Equipment Essential Drawings (HNF 1998). The Cone Penetrometer system drawings, which are vendor drawings (not H-series), are not currently turned over to operations for deployment. The LLCE equipment was only recently assigned to the Characterization Project and was not included in the original scope for this evaluation; it will be addressed in the evaluation update scheduled for fiscal year 1999

  7. Rational Arithmetic Mathematica Functions to Evaluate the Two-Sided One Sample K-S Cumulative Sampling Distribution

    Directory of Open Access Journals (Sweden)

    J. Randall Brown

    2007-06-01

    One of the most widely used goodness-of-fit tests is the two-sided one-sample Kolmogorov-Smirnov (K-S) test, which has been implemented by many computer statistical software packages. To calculate a two-sided p value (i.e., evaluate the cumulative sampling distribution), these packages use various methods including recursion formulae, limiting distributions, and approximations of unknown accuracy developed over thirty years ago. Based on an extensive literature search for the two-sided one-sample K-S test, this paper identifies an exact formula for sample sizes up to 31, six recursion formulae, and one matrix formula that can be used to calculate a p value. To ensure accurate calculation by avoiding catastrophic cancellation and eliminating rounding error, each of these formulae is implemented in rational arithmetic. For the six recursion formulae and the matrix formula, computational experience for sample sizes up to 500 shows that computational times are increasing functions of both the sample size and the number of digits in the numerator and denominator integers of the rational-number test statistic. The computational times of the seven formulae vary immensely, but the Durbin recursion formula is almost always the fastest. Linear search is used to calculate the inverse of the cumulative sampling distribution (i.e., find the confidence interval half-width), and tables of calculated half-widths are presented for sample sizes up to 500. Using calculated half-widths as input, computational times for the fastest formula, the Durbin recursion formula, are given for sample sizes up to two thousand.
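
    A hedged sketch of the matrix formula mentioned above (Durbin's matrix form as popularized by Marsaglia, Tsang and Wang), implemented with Python's fractions module so the arithmetic is exact rational, as the paper advocates; the function name and example values are illustrative.

      from fractions import Fraction
      from math import ceil, factorial

      def ks_cdf_exact(n, d):
          """P(D_n < d) for the two-sided one-sample K-S statistic (d rational)."""
          d = Fraction(d)
          t = n * d
          k = ceil(t)
          h = Fraction(k) - t
          m = 2 * k - 1

          # (2k-1) x (2k-1) matrix of the Durbin/Marsaglia formula.
          H = [[Fraction(1) if i - j + 1 >= 0 else Fraction(0) for j in range(m)]
               for i in range(m)]
          for i in range(m):
              H[i][0] -= h ** (i + 1)
              H[m - 1][i] -= h ** (m - i)
          if 2 * h - 1 > 0:
              H[m - 1][0] += (2 * h - 1) ** m
          for i in range(m):
              for j in range(m):
                  if i - j + 1 > 0:
                      H[i][j] /= factorial(i - j + 1)

          def matmul(A, B):
              return [[sum(A[i][l] * B[l][j] for l in range(m)) for j in range(m)]
                      for i in range(m)]

          # H ** n by repeated squaring, then P = n!/n^n * (H^n)[k-1][k-1].
          R = [[Fraction(int(i == j)) for j in range(m)] for i in range(m)]
          Q, e = H, n
          while e:
              if e & 1:
                  R = matmul(R, Q)
              Q, e = matmul(Q, Q), e >> 1
          return Fraction(factorial(n)) * R[k - 1][k - 1] / Fraction(n) ** n

      # Example: exact p value for n = 10 and an observed statistic D = 0.35.
      print(float(1 - ks_cdf_exact(10, Fraction(35, 100))))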

  8. Evaluation of complex gonioapparent samples using a bidirectional spectrometer.

    Science.gov (United States)

    Rogelj, Nina; Penttinen, Niko; Gunde, Marta Klanjšek

    2015-08-24

    Many applications use gonioapparent targets whose appearance depends on irradiation and viewing angles; the strongest effects are provided by light diffraction. These targets, optically variable devices (OVDs), are used in both security and authentication applications. This study introduces a bidirectional spectrometer which enables the analysis of samples with the most complex angular and spectral properties. In our work, the spectrometer is evaluated with samples having very different types of reflection in terms of spectral and angular distributions. Furthermore, an OVD containing several different grating patches is evaluated. The device uses an automatically adjusted exposure time to provide maximum signal dynamics and is capable of making steps as small as 0.01°. However, even 2° steps for the detector movement showed that this device is more than capable of characterizing even the most complex reflecting surfaces. This study presents sRGB visualizations, a discussion of bidirectional reflection, and accurate grating-period calculations for all of the grating samples used.

  9. Technical basis and evaluation criteria for an air sampling/monitoring program

    International Nuclear Information System (INIS)

    Gregory, D.C.; Bryan, W.L.; Falter, K.G.

    1993-01-01

    Air sampling and monitoring programs at DOE facilities need to be reviewed in light of revised requirements and guidance found in, for example, DOE Order 5480.6 (RadCon Manual). Accordingly, the Oak Ridge National Laboratory (ORNL) air monitoring program is being revised and placed on a sound technical basis. A draft technical basis document has been written to establish placement criteria for instruments and to guide the "retrospective sampling or real-time monitoring" decision. Facility evaluations are being used to document air sampling/monitoring needs, and instruments are being evaluated in light of these needs. The steps used to develop this program and the technical basis for instrument placement are described

  10. Drawing Evaluation Report for Sampling Equipment Drawings

    International Nuclear Information System (INIS)

    BOGER, R.M.

    1999-01-01

    This document presents the results of a task to update the evaluation of River Protection Project (RPP) sampling equipment drawings and updates each drawing's assigned category as either essential, support, or general. This report updates the drawing evaluation that was originally done per Engineering Task Plan For Truck 3 and 4 Drawing Compliance and Evaluation. The scope of this report is limited to updating the evaluation and identification of drawing category for drawings of certain tank waste sampling equipment for which the RPP Characterization Project has been assigned custody, including: vapor sampling, grab sampling, auger sampling, all core sampling equipment, and the Light Duty Utility Arm (LDUA) (see LMHC contract No. 519, release 10). This report does not address drawings for other waste tank deployed equipment systems having similar assigned custody, such as the Cone Penetrometer system or Long Length Contaminated Equipment (LLCE). The Cone Penetrometer system, which is depicted on vendor drawings (not H-series), is not currently turned over to operations for deployment. The LLCE equipment was only recently assigned to the Characterization Project and was not included in the original scope for this update; it will be addressed in the evaluation update scheduled for later in fiscal year 1999, when equipment ownership is determined

  11. Uniform Sampling Table Method and its Applications II--Evaluating the Uniform Sampling by Experiment.

    Science.gov (United States)

    Chen, Yibin; Chen, Jiaxi; Chen, Xuan; Wang, Min; Wang, Wei

    2015-01-01

    A new method of uniform sampling is evaluated in this paper. Items and indexes were adopted to evaluate the rationality of the uniform sampling. The evaluation items included convenience of operation, uniformity of sampling site distribution, and accuracy and precision of measured results. The evaluation indexes included operational complexity, occupation rate of sampling sites in a row and column, relative accuracy of pill weight, and relative deviation of pill weight. They were obtained from three kinds of drugs with different shapes and sizes by four kinds of sampling methods. Gray correlation analysis was adopted to make the comprehensive evaluation by comparison with the standard method. The experimental results showed that the convenience of the uniform sampling method was 1 (100%), the odds ratio of the occupation rate in a row and column was infinity, relative accuracy was 99.50-99.89%, reproducibility RSD was 0.45-0.89%, and the weighted incidence degree exceeded that of the standard method. Hence, the uniform sampling method was easy to operate, and the selected samples were distributed uniformly. The experimental results demonstrated that the uniform sampling method has good accuracy and reproducibility and can be put into use in drug analysis.

  12. In-well time-of-travel approach to evaluate optimal purge duration during low-flow sampling of monitoring wells

    Science.gov (United States)

    Harte, Philip T.

    2017-01-01

    A common assumption with groundwater sampling is that low pumping rates draw formation water directly to the pump intake; before that happens, however, inflow from the high hydraulic conductivity part of the screened formation must travel vertically in the well to the pump intake. Therefore, the length of time needed for adequate purging prior to sample collection (called optimal purge duration) is controlled by the in-well, vertical travel times. A preliminary, simple analytical model was used to provide information on the relation between purge duration and capture of formation water for different gross levels of heterogeneity (contrast between low and high hydraulic conductivity layers). The model was then used to compare these time–volume relations to purge data (pumping rates and drawdown) collected at several representative monitoring wells from multiple sites. Results showed that computation of time-dependent capture of formation water (as opposed to capture of preexisting screen water), which was based on vertical travel times in the well, compares favorably with the time required to achieve field parameter stabilization. If field parameter stabilization is an indicator of the arrival time of formation water, as has been postulated, then in-well vertical flow may be an important factor at wells where low-flow sampling is the sampling method of choice.

  13. Study on auto-plating process time versus recovery for polonium, Po-210 in environmental sample

    International Nuclear Information System (INIS)

    Jalal Sharib; Zaharudin Ahmad; Abdul Kadir Ishak; Norfaizal Mohamed; Ahmad Sanadi Abu Bakar; Yii Mei Wo; Kamarozaman Ishak; Siti Aminah Yusoff

    2008-08-01

    This study was carried out to evaluate the time effectiveness and recovery of 16 samples from 4 Kuala Muda stations during the auto-plating procedure for the determination of polonium (Po-210) activity concentration in environmental samples. The study was performed using Kuala Muda sediment as the sample, following the same methodology throughout. The auto-plating process was run for 4, 12, 24 and 30 hours on a silver disc for 4 samples from each station, and the discs were then counted for one (1) day using an alpha spectrometry counting system. The objective of this study is to establish the effect of auto-plating duration on the chemical yield of Po-209. The results showed that recovery increases with time and becomes constant at 24 hours of auto-plating. This means that 24 hours is the optimum time for the auto-plating process for the determination of Po-210 activity concentration in environmental samples. (Author)

  14. Evaluation of sampling strategies to estimate crown biomass

    Directory of Open Access Journals (Sweden)

    Krishna P Poudel

    2015-01-01

    Background: Depending on tree and site characteristics, crown biomass accounts for a significant portion of the total aboveground biomass in the tree. Crown biomass estimation is useful for different purposes including evaluating the economic feasibility of crown utilization for energy production or forest products, fuel load assessments and fire management strategies, and wildfire modeling. However, crown biomass is difficult to predict because of the variability within and among species and sites. Thus the allometric equations used for predicting crown biomass should be based on data collected with precise and unbiased sampling strategies. In this study, we evaluate the performance of different sampling strategies to estimate crown biomass and evaluate the effect of sample size in estimating crown biomass. Methods: Using data collected from 20 destructively sampled trees, we evaluated 11 different sampling strategies using six evaluation statistics: bias, relative bias, root mean square error (RMSE), relative RMSE, amount of biomass sampled, and relative biomass sampled. We also evaluated the performance of the selected sampling strategies when different numbers of branches (3, 6, 9, and 12) are selected from each tree. A tree-specific log-linear model with branch diameter and branch length as covariates was used to obtain individual branch biomass. Results: Compared to all other methods, stratified sampling with the probability-proportional-to-size estimation technique produced better results when three or six branches per tree were sampled. However, systematic sampling with the ratio estimation technique was the best when at least nine branches per tree were sampled. Under the stratified sampling strategy, selecting an unequal number of branches per stratum produced approximately similar results to simple random sampling, but it further decreased RMSE when information on branch diameter is used in the design and estimation phases. Conclusions: Use of
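
    A minimal sketch of the probability-proportional-to-size idea evaluated above: branches are drawn with probability proportional to an auxiliary size variable and the crown total is estimated with the Hansen-Hurwitz estimator; the size variable, allometry and all numbers are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(5)
      n_branches = 30
      diam = rng.uniform(1.0, 8.0, n_branches)           # branch diameters (cm)
      biomass = 0.12 * diam ** 2.4 * rng.lognormal(0, 0.1, n_branches)  # true kg (unknown in practice)

      size = diam ** 2                                    # auxiliary size variable
      p = size / size.sum()                               # selection probabilities

      m = 6                                               # branches sampled per tree (with replacement)
      idx = rng.choice(n_branches, size=m, replace=True, p=p)

      # Hansen-Hurwitz estimate of total crown biomass from the m measured branches.
      total_hat = np.mean(biomass[idx] / p[idx])
      print("estimate:", round(total_hat, 1), " true total:", round(biomass.sum(), 1))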

  15. Importance Sampling for Stochastic Timed Automata

    DEFF Research Database (Denmark)

    Jegourel, Cyrille; Larsen, Kim Guldstrand; Legay, Axel

    2016-01-01

    We present an importance sampling framework that combines symbolic analysis and simulation to estimate the probability of rare reachability properties in stochastic timed automata. By means of symbolic exploration, our framework first identifies states that cannot reach the goal. A state-wise cha...

  16. Performance Evaluation of a Synthetic Aperture Real-Time Ultrasound System

    DEFF Research Database (Denmark)

    Stuart, Matthias Bo; Tomov, Borislav Gueorguiev; Jensen, Jørgen Arendt

    2011-01-01

    This paper evaluates the signal-to-noise ratio, the time stability, and the phase difference of the sampling in the experimental ultrasound scanner SARUS: a synthetic aperture, real-time ultrasound system. SARUS has 1024 independent transmit and receive channels and is capable of handling 2D probes for 3D ultrasound imaging. It samples at 12 bits per sample and has a sampling rate of 70 MHz with the possibility of decimating the sampling frequency at the input. SARUS is capable of advanced real-time computations such as synthetic aperture imaging. The system is built using field-programmable gate arrays (FPGAs), making it very flexible and allowing implementation of other real-time ultrasound processing methods in the future. For conventional B-mode imaging, a penetration depth around 7 cm for a 7 MHz transducer is obtained (signal-to-noise ratio of 0 dB), which is comparable to commercial...

  17. An empirical comparison of respondent-driven sampling, time location sampling, and snowball sampling for behavioral surveillance in men who have sex with men, Fortaleza, Brazil.

    Science.gov (United States)

    Kendall, Carl; Kerr, Ligia R F S; Gondim, Rogerio C; Werneck, Guilherme L; Macena, Raimunda Hermelinda Maia; Pontes, Marta Kerr; Johnston, Lisa G; Sabin, Keith; McFarland, Willi

    2008-07-01

    Obtaining samples of populations at risk for HIV challenges surveillance, prevention planning, and evaluation. Methods used include snowball sampling, time location sampling (TLS), and respondent-driven sampling (RDS). Few studies have made side-by-side comparisons to assess their relative advantages. We compared snowball, TLS, and RDS surveys of men who have sex with men (MSM) in Fortaleza, Brazil, with a focus on the socio-economic status (SES) and risk behaviors of the samples, comparing them to each other, to known AIDS cases, and to the general population. RDS produced a sample with wider inclusion of lower SES than snowball sampling or TLS, a finding of health significance given that the majority of AIDS cases reported among MSM in the state were of low SES. RDS also achieved the sample size faster and at lower cost. For reasons of inclusion and cost-efficiency, RDS is the sampling methodology of choice for HIV surveillance of MSM in Fortaleza.

  18. Real-time PCR for type-specific identification of herpes simplex in clinical samples: evaluation of type-specific results in the context of CNS diseases.

    Science.gov (United States)

    Meylan, Sylvain; Robert, Daniel; Estrade, Christine; Grimbuehler, Valérie; Péter, Olivier; Meylan, Pascal R; Sahli, Roland

    2008-02-01

    HSV-1 and HSV-2 cause CNS infections of dissimilar clinico-pathological characteristics with prognostic and therapeutic implications. To validate a type-specific real-time PCR that uses MGB/LNA Taqman probes and to review the virologico-clinical data of 25 eligible patients with non-neonatal CNS infections. This real-time PCR was evaluated against conventional PCR (26 CSF and 20 quality controls), the LightCycler assay (51 mucocutaneous, 8 CSF and 32 quality controls) and culture/immunofluorescence (75 mucocutaneous) to assess typing with independent methods. Taqman real-time PCR detected 240 HSV genomes per ml CSF, a level appropriate for the management of patients, and provided unambiguous typing for the 104 positive (62 HSV-1 and 42 HSV-2) out of the 160 independent clinical samples tested. The HSV type diagnosed by Taqman real-time PCR predicted the final diagnosis (meningitis versus encephalitis/meningoencephalitis, p<0.001) in 24/25 patients at the time of presentation, in contrast to clinical evaluation. Our real-time PCR, as a sensitive and specific means for type-specific HSV diagnosis, provided rapid prognostic information for patient management.

  19. High Accuracy Evaluation of the Finite Fourier Transform Using Sampled Data

    Science.gov (United States)

    Morelli, Eugene A.

    1997-01-01

    Many system identification and signal processing procedures can be done advantageously in the frequency domain. A required preliminary step for this approach is the transformation of sampled time domain data into the frequency domain. The analytical tool used for this transformation is the finite Fourier transform. Inaccuracy in the transformation can degrade system identification and signal processing results. This work presents a method for evaluating the finite Fourier transform using cubic interpolation of sampled time domain data for high accuracy, and the chirp Zeta-transform for arbitrary frequency resolution. The accuracy of the technique is demonstrated in example cases where the transformation can be evaluated analytically. Arbitrary frequency resolution is shown to be important for capturing details of the data in the frequency domain. The technique is demonstrated using flight test data from a longitudinal maneuver of the F-18 High Alpha Research Vehicle.
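
    A minimal sketch of the approach described above: cubic-interpolate the sampled data and evaluate the finite Fourier transform on an arbitrary frequency grid (a plain trapezoid rule on a refined grid stands in here for the chirp z-transform step; signal and grids are illustrative).

      import numpy as np
      from scipy.interpolate import CubicSpline

      def finite_fourier(t, x, freqs, refine=8):
          """Finite Fourier transform of sampled data x(t) at arbitrary frequencies."""
          spline = CubicSpline(t, x)
          tf = np.linspace(t[0], t[-1], refine * len(t))   # refined time grid
          xf = spline(tf)
          kern = np.exp(-2j * np.pi * np.outer(freqs, tf)) # e^{-j 2 pi f t}
          return np.trapz(kern * xf, tf, axis=1)

      # Example: 2 Hz tone sampled at 50 Hz for 4 s; arbitrary 0.05 Hz resolution.
      t = np.arange(0, 4, 0.02)
      x = np.sin(2 * np.pi * 2.0 * t)
      freqs = np.arange(0.0, 10.0, 0.05)
      X = finite_fourier(t, x, freqs)
      print("peak at", freqs[np.argmax(np.abs(X))], "Hz")  # close to 2.0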

  20. Drawing evaluation report for sampling equipment drawings; TOPICAL

    International Nuclear Information System (INIS)

    WILSON, G.W.

    1999-01-01

    This document presents the results of a task to evaluate Tank Waste Remediation System (TWRS) sampling equipment drawings and identifies each drawing's category as either essential, support, or general. This report completes the drawing evaluation task as outlined in Engineering Task Plan For Truck 3 and 4 Drawing Compliance and Evaluation (Wilson, 1997). The scope of this report is limited to an evaluation and identification of drawing category for drawings of certain tank waste sampling equipment for which the TWRS Characterization Project has been assigned custody, including: vapor sampling, grab sampling, auger sampling, and all core sampling equipment (see LMHC Task Order 304). This report does not address drawings for other waste tank deployed equipment systems having similar assigned custody, such as the Light Duty Utility Arm (LDUA), Cone Penetrometer system, or Long Length Contaminated Equipment (LLCE). The LDUA drawings are addressed in the Characterization Equipment Essential Drawings (HNF 1998). The Cone Penetrometer system drawings, which are vendor drawings (not H-series), are not currently turned over to operations for deployment. The LLCE equipment was only recently assigned to the Characterization Project and was not included in the original scope for this evaluation; it will be addressed in the evaluation update scheduled for fiscal year 1999

  1. Orientation, Evaluation, and Integration of Part-Time Nursing Faculty.

    Science.gov (United States)

    Carlson, Joanne S

    2015-07-10

    This study helps to quantify and describe orientation, evaluation, and integration practices pertaining to part-time clinical nursing faculty teaching in prelicensure nursing education programs. A researcher-designed Web-based survey was used to collect information from a convenience sample of part-time clinical nursing faculty teaching in prelicensure nursing programs. Survey questions focused on the amount and type of orientation, evaluation, and integration practices. Descriptive statistics were used to analyze results. Respondents reported on average four hours of orientation, with close to half reporting no more than two hours. Evaluative feedback was received much more often from students than from full-time faculty. Most respondents reported receiving some degree of mentoring and that it was easy to get help from full-time faculty. Respondents reported being most informed about student evaluation procedures, grading, and the steps to take when students are not meeting course objectives, and less informed about changes to ongoing curriculum and policy.

  2. Effect of sample storage time on detection of hybridization signals in Checkerboard DNA-DNA hybridization.

    Science.gov (United States)

    do Nascimento, Cássio; Muller, Katia; Sato, Sandra; Albuquerque Junior, Rubens Ferreira

    2012-04-01

    Long-term sample storage can affect the intensity of the hybridization signals provided by molecular diagnostic methods that use chemiluminescent detection. The aim of this study was to evaluate the effect of different storage times on the hybridization signals of 13 bacterial species detected by the Checkerboard DNA-DNA hybridization method using whole-genomic DNA probes. Ninety-six subgingival biofilm samples were collected from 36 healthy subjects, and the intensity of hybridization signals was evaluated at 4 different time periods: (1) immediately after collection (n = 24) and (2) after storage at -20 °C for 6 months (n = 24), (3) for 12 months (n = 24), and (4) for 24 months (n = 24). The intensity of hybridization signals obtained from groups 1 and 2 was significantly higher than in the other groups (p < 0.05). The Checkerboard DNA-DNA hybridization method was suitable to detect hybridization signals from all groups evaluated, and the intensity of signals decreased significantly after long periods of sample storage.

  3. Sample-interpolation timing: an optimized technique for the digital measurement of time of flight for γ rays and neutrons at relatively low sampling rates

    International Nuclear Information System (INIS)

    Aspinall, M D; Joyce, M J; Mackin, R O; Jarrah, Z; Boston, A J; Nolan, P J; Peyton, A J; Hawkes, N P

    2009-01-01

    A unique, digital time pick-off method, known as sample-interpolation timing (SIT) is described. This method demonstrates the possibility of improved timing resolution for the digital measurement of time of flight compared with digital replica-analogue time pick-off methods for signals sampled at relatively low rates. Three analogue timing methods have been replicated in the digital domain (leading-edge, crossover and constant-fraction timing) for pulse data sampled at 8 GSa/s. Events arising from the ⁷Li(p,n)⁷Be reaction have been detected with an EJ-301 organic liquid scintillator and recorded with a fast digital sampling oscilloscope. Sample-interpolation timing was developed solely for the digital domain and thus performs more efficiently on digital signals compared with analogue time pick-off methods replicated digitally, especially for fast signals that are sampled at rates that current affordable and portable devices can achieve. Sample interpolation can be applied to any analogue timing method replicated digitally and thus also has the potential to exploit the generic capabilities of analogue techniques with the benefits of operating in the digital domain. A threshold in sampling rate with respect to the signal pulse width is observed beyond which further improvements in timing resolution are not attained. This advance is relevant to many applications in which time-of-flight measurement is essential.
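
    To illustrate the general idea behind digital time pick-off on sampled pulses (not the authors' exact SIT algorithm, whose details are in the paper), the Python sketch below applies a constant-fraction threshold to a digitized pulse and linearly interpolates between the two samples that bracket the threshold crossing to obtain a sub-sample arrival time. The pulse shape, sampling rate and fraction used here are illustrative assumptions.

        import numpy as np

        def interpolated_threshold_time(samples, dt, fraction=0.3):
            """Estimate pulse arrival time by interpolating the threshold crossing.

            samples  : 1-D array of digitized pulse amplitudes (baseline-subtracted)
            dt       : sampling period in seconds
            fraction : constant fraction of the pulse maximum used as the threshold
            """
            threshold = fraction * samples.max()
            above = np.nonzero(samples >= threshold)[0]   # first sample at/above threshold
            i = above[0]
            if i == 0:
                return 0.0
            # linear interpolation between samples i-1 and i
            y0, y1 = samples[i - 1], samples[i]
            return (i - 1 + (threshold - y0) / (y1 - y0)) * dt

        # Example: a synthetic fast pulse sampled at 500 MSa/s (dt = 2 ns)
        dt = 2e-9
        t = np.arange(0.0, 200e-9, dt)
        rise = np.clip((t - 50e-9) / 10e-9, 0.0, 1.0)              # 10 ns linear rise
        decay = np.exp(-np.clip(t - 60e-9, 0.0, None) / 30e-9)     # exponential tail
        pulse = rise * decay
        print(f"interpolated arrival time: {interpolated_threshold_time(pulse, dt) * 1e9:.2f} ns")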

  4. Sampled-data and discrete-time H2 optimal control

    NARCIS (Netherlands)

    Trentelman, Harry L.; Stoorvogel, Anton A.

    1993-01-01

    This paper deals with the sampled-data H2 optimal control problem. Given a linear time-invariant continuous-time system, the problem of minimizing the H2 performance over all sampled-data controllers with a fixed sampling period can be reduced to a pure discrete-time H2 optimal control problem. This

  5. Time-dependent importance sampling in semiclassical initial value representation calculations for time correlation functions.

    Science.gov (United States)

    Tao, Guohua; Miller, William H

    2011-07-14

    An efficient time-dependent importance sampling method is developed for the Monte Carlo calculation of time correlation functions via the initial value representation (IVR) of semiclassical (SC) theory. A prefactor-free time-dependent sampling function weights the importance of a trajectory based on the magnitude of its contribution to the time correlation function, and global trial moves are used to facilitate efficient sampling of the phase space of initial conditions. The method can be generally applied to sampling rare events efficiently while avoiding becoming trapped in a local region of the phase space. Results presented in the paper for two system-bath models demonstrate the efficiency of this new importance sampling method for full SC-IVR calculations.
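
    The abstract describes a trajectory-weighting scheme specific to SC-IVR; as a much simpler stand-in, the Python sketch below shows plain Monte Carlo importance sampling of an expectation value, where samples are drawn from a proposal density that concentrates effort where the integrand contributes most. The integrand, densities and proposal parameters are illustrative assumptions, not the sampling function from the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        # Target: E_p[f(x)] with p = standard normal and f peaked far in the tail,
        # so plain sampling from p rarely hits the region that matters.
        f = lambda x: np.exp(-0.5 * (x - 4.0) ** 2)
        log_p = lambda x: -0.5 * x ** 2 - 0.5 * np.log(2 * np.pi)

        # Proposal q: normal centred where the integrand contributes (illustrative choice)
        mu_q, sigma_q = 4.0, 1.0
        log_q = lambda x: -0.5 * ((x - mu_q) / sigma_q) ** 2 - np.log(sigma_q * np.sqrt(2 * np.pi))

        n = 100_000
        plain_estimate = f(rng.standard_normal(n)).mean()      # naive Monte Carlo

        x_is = rng.normal(mu_q, sigma_q, n)
        weights = np.exp(log_p(x_is) - log_q(x_is))            # importance weights p(x)/q(x)
        is_estimate = (weights * f(x_is)).mean()

        print(f"plain MC: {plain_estimate:.6f}   importance sampling: {is_estimate:.6f}")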

  6. Evaluating Distributed Timing Constraints

    DEFF Research Database (Denmark)

    Kristensen, C.H.; Drejer, N.

    1994-01-01

    In this paper we describe a solution to the problem of implementing time-optimal evaluation of timing constraints in distributed real-time systems.

  7. Soil Gas Sample Handling: Evaluation of Water Removal and Sample Ganging

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Brad G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Abrecht, David G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hayes, James C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mendoza, Donaldo P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-10-31

    Soil gas sampling is currently conducted in support of Nuclear Test Ban treaty verification. Soil gas samples are collected and analyzed for isotopes of interest. Some issues that can impact sampling and analysis of these samples are excess moisture and sample processing time. Here we discuss three potential improvements to the current sampling protocol: a desiccant for water removal, use of a molecular sieve to remove CO2 from the sample during collection, and a ganging manifold to allow composite analysis of multiple samples.

  8. Sampling returns for realized variance calculations: tick time or transaction time?

    NARCIS (Netherlands)

    Griffin, J.E.; Oomen, R.C.A.

    2008-01-01

    This article introduces a new model for transaction prices in the presence of market microstructure noise in order to study the properties of the price process on two different time scales, namely, transaction time where prices are sampled with every transaction and tick time where prices are

  9. Evaluation of primary immunization coverage of infants under universal immunization programme in an urban area of bangalore city using cluster sampling and lot quality assurance sampling techniques.

    Science.gov (United States)

    K, Punith; K, Lalitha; G, Suman; Bs, Pradeep; Kumar K, Jayanth

    2008-07-01

    Is the LQAS technique better than the cluster sampling technique in terms of resources to evaluate the immunization coverage in an urban area? To assess and compare the lot quality assurance sampling against cluster sampling in the evaluation of primary immunization coverage. Population-based cross-sectional study. Areas under Mathikere Urban Health Center. Children aged 12 months to 23 months. 220 in cluster sampling, 76 in lot quality assurance sampling. Percentages and Proportions, Chi square Test. (1) Using cluster sampling, the percentage of completely immunized, partially immunized and unimmunized children were 84.09%, 14.09% and 1.82%, respectively. With lot quality assurance sampling, it was 92.11%, 6.58% and 1.31%, respectively. (2) Immunization coverage levels as evaluated by cluster sampling technique were not statistically different from the coverage value as obtained by lot quality assurance sampling techniques. Considering the time and resources required, it was found that lot quality assurance sampling is a better technique in evaluating the primary immunization coverage in an urban area.

  10. Evaluation of Respondent-Driven Sampling

    Science.gov (United States)

    McCreesh, Nicky; Frost, Simon; Seeley, Janet; Katongole, Joseph; Tarsh, Matilda Ndagire; Ndunguse, Richard; Jichi, Fatima; Lunel, Natasha L; Maher, Dermot; Johnston, Lisa G; Sonnenberg, Pam; Copas, Andrew J; Hayes, Richard J; White, Richard G

    2012-01-01

    Background Respondent-driven sampling is a novel variant of link-tracing sampling for estimating the characteristics of hard-to-reach groups, such as HIV prevalence in sex-workers. Despite its use by leading health organizations, the performance of this method in realistic situations is still largely unknown. We evaluated respondent-driven sampling by comparing estimates from a respondent-driven sampling survey with total-population data. Methods Total-population data on age, tribe, religion, socioeconomic status, sexual activity and HIV status were available on a population of 2402 male household-heads from an open cohort in rural Uganda. A respondent-driven sampling (RDS) survey was carried out in this population, employing current methods of sampling (RDS sample) and statistical inference (RDS estimates). Analyses were carried out for the full RDS sample and then repeated for the first 250 recruits (small sample). Results We recruited 927 household-heads. Full and small RDS samples were largely representative of the total population, but both samples under-represented men who were younger, of higher socioeconomic status, and with unknown sexual activity and HIV status. Respondent-driven-sampling statistical-inference methods failed to reduce these biases. Only 31%-37% (depending on method and sample size) of RDS estimates were closer to the true population proportions than the RDS sample proportions. Only 50%-74% of respondent-driven-sampling bootstrap 95% confidence intervals included the population proportion. Conclusions Respondent-driven sampling produced a generally representative sample of this well-connected non-hidden population. However, current respondent-driven-sampling inference methods failed to reduce bias when it occurred. Whether the data required to remove bias and measure precision can be collected in a respondent-driven sampling survey is unresolved. Respondent-driven sampling should be regarded as a (potentially superior) form of convenience-sampling

  11. Evaluation of respondent-driven sampling.

    Science.gov (United States)

    McCreesh, Nicky; Frost, Simon D W; Seeley, Janet; Katongole, Joseph; Tarsh, Matilda N; Ndunguse, Richard; Jichi, Fatima; Lunel, Natasha L; Maher, Dermot; Johnston, Lisa G; Sonnenberg, Pam; Copas, Andrew J; Hayes, Richard J; White, Richard G

    2012-01-01

    Respondent-driven sampling is a novel variant of link-tracing sampling for estimating the characteristics of hard-to-reach groups, such as HIV prevalence in sex workers. Despite its use by leading health organizations, the performance of this method in realistic situations is still largely unknown. We evaluated respondent-driven sampling by comparing estimates from a respondent-driven sampling survey with total population data. Total population data on age, tribe, religion, socioeconomic status, sexual activity, and HIV status were available on a population of 2402 male household heads from an open cohort in rural Uganda. A respondent-driven sampling (RDS) survey was carried out in this population, using current methods of sampling (RDS sample) and statistical inference (RDS estimates). Analyses were carried out for the full RDS sample and then repeated for the first 250 recruits (small sample). We recruited 927 household heads. Full and small RDS samples were largely representative of the total population, but both samples underrepresented men who were younger, of higher socioeconomic status, and with unknown sexual activity and HIV status. Respondent-driven sampling statistical inference methods failed to reduce these biases. Only 31%-37% (depending on method and sample size) of RDS estimates were closer to the true population proportions than the RDS sample proportions. Only 50%-74% of respondent-driven sampling bootstrap 95% confidence intervals included the population proportion. Respondent-driven sampling produced a generally representative sample of this well-connected nonhidden population. However, current respondent-driven sampling inference methods failed to reduce bias when it occurred. Whether the data required to remove bias and measure precision can be collected in a respondent-driven sampling survey is unresolved. Respondent-driven sampling should be regarded as a (potentially superior) form of convenience sampling method, and caution is required

  12. Test plan for evaluating the performance of the in-tank fluidic sampling system

    International Nuclear Information System (INIS)

    BOGER, R.M.

    1999-01-01

    The PHMC will provide Low Activity Waste (LAW) tank wastes for final treatment by a privatization contractor from double-shell feed tanks 241-AP-102 and 241-AP-104. Concerns about the inability of the baseline ''grab'' sampling to provide large volume samples within time constraints have led to the development of a conceptual sampling system that would be deployed in a feed tank riser. This sampling system will provide large volume, representative samples without the environmental, radiation exposure, and sample volume impacts of the current baseline ''grab'' sampling method. This test plan identifies ''proof-of-principle'' cold tests for the conceptual sampling system using simulant materials. The need for additional testing was identified as a result of completing the tests described in the earlier revision of this test plan document. Revision 1 outlines tests that will evaluate the performance and the ability to provide samples that are representative of a tank's contents within a 95 percent confidence interval, to recover from plugging, to sample supernatant wastes with over 25 wt% solids content, and to evaluate the impact of sampling at different heights within the feed tank. The test plan also identifies operating parameters that will optimize the performance of the sampling system.

  13. Evaluation of a centrifuge with rapid turnaround time for the preparation of plasma samples for measurement of common STAT markers on the ACS: 180 system.

    Science.gov (United States)

    Foster, K; Datta, P; Orswell, M; Tasaico, K; Alpert, A; Bluestein, B

    2000-01-01

    Reported is the evaluation of a new centrifugation method, Statspin, that addresses both time and sample separation integrity. The method can successfully separate the plasma fraction from the cellular material in 2 minutes as compared to 20 minutes for the conventional centrifuge method. The Statspin, combined with the ACS:180 system, can generate test results in less than 30 minutes, exclusive of transport to the laboratory. This study demonstrated that the combined technologies offer time-saving improvements for clinical laboratories offering STAT immunoassays for cardiac markers, endocrine molecules, and therapeutic drugs.

  14. Evaluation of oil biodegradation using time warping and PCA

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, J.H. [Royal Veterinary and Agricultural Univ., Thorvaldsensvej (Denmark). Dept. of Natural Sciences; Hansen, A.B. [National Environmental Research Inst., Roskilde (Denmark). Dept. of Environmental Chemistry and Microbiology; Andersen, O. [Roskilde Univ., Roskilde (Denmark). Dept. of Life Sciences and Chemistry

    2005-07-01

    The effects of biodegradation on the composition of stranded oil after the Baltic Carrier oil spill in March 2001 were evaluated using a newly developed multivariate statistical methodology. Gas chromatography and mass spectrometry provided data on the oil compounds, and oil biodegradation was determined by applying weighted least square principal component analysis to the preprocessed chromatograms of methylphenanthrenes and methyldibenzothiophenes. One principal component explained 46 per cent of the variation in the complete data set. Samples collected immediately after the spill and 2.5 months after the spill did not exhibit changes in isomer composition. However, the isomer patterns changed in samples collected between 6.5 and 16.5 months after the spill. Samples collected after 8.5 months were the most greatly affected. An evaluation of the degradation patterns suggests that time warping and multivariate statistical methods can successfully identify links between spill samples and can determine how chemical composition will respond to biodegradation processes. 27 refs., 1 tab., 3 figs.

  15. Evaluation of oil biodegradation using time warping and PCA

    International Nuclear Information System (INIS)

    Christensen, J.H.; Hansen, A.B.; Andersen, O.

    2005-01-01

    The effects of biodegradation on the composition of stranded oil after the Baltic Carrier oil spill in March 2001 were evaluated using a newly developed multivariate statistical methodology. Gas chromatography and mass spectrometry provided data on the oil compounds, and oil biodegradation was determined by applying weighted least square principal component analysis to the preprocessed chromatograms of methylphenanthrenes and methyldibenzothiophenes. One principal component explained 46 per cent of the variation in the complete data set. Samples collected immediately after the spill and 2.5 months after the spill did not exhibit changes in isomer composition. However, the isomer patterns changed in samples collected between 6.5 and 16.5 months after the spill. Samples collected after 8.5 months were the most greatly affected. An evaluation of the degradation patterns suggests that time warping and multivariate statistical methods can successfully identify links between spill samples and can determine how chemical composition will respond to biodegradation processes. 27 refs., 1 tab., 3 figs.

  16. Sample exchange/evaluation (SEE) report - Phase III

    International Nuclear Information System (INIS)

    Winters, W.I.

    1996-01-01

    This report describes the results from Phase III of the Sample Exchange Evaluation (SEE) program. The SEE program is used to compare analytical laboratory performance on samples from the Hanford Site's high level waste tanks

  17. Accurate Sample Time Reconstruction of Inertial FIFO Data

    Directory of Open Access Journals (Sweden)

    Sebastian Stieber

    2017-12-01

    Full Text Available In the context of modern cyber-physical systems, the accuracy of underlying sensor data plays an increasingly important role in sensor data fusion and feature extraction. The raw events of multiple sensors have to be aligned in time to enable high quality sensor fusion results. However, the growing number of simultaneously connected sensor devices makes energy-saving data acquisition and processing more and more difficult. Hence, most modern sensors offer a first-in-first-out (FIFO) interface to store multiple data samples and to relax timing constraints when handling multiple sensor devices. However, using the FIFO interface increases the negative influence of individual clock drifts (introduced by fabrication inaccuracies, temperature changes and wear-out effects) on the sampling data reconstruction. Furthermore, additional timing offset errors due to communication and software latencies increase with a growing number of sensor devices. In this article, we present an approach for accurate sample time reconstruction, independent of the actual clock drift, with the help of an internal sensor timer. Such timers are already available in modern sensors manufactured in micro-electromechanical systems (MEMS) technology. The presented approach focuses on calculating accurate time stamps using the sensor FIFO interface in a forward-only processing manner as a robust and energy-saving solution. The proposed algorithm is able to lower the overall standard deviation of reconstructed sampling periods below 40 μs, while run-time savings of up to 42% are achieved, compared to single sample acquisition.
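
    As a rough, forward-only Python sketch of the kind of reconstruction the abstract describes (the actual algorithm and MEMS timer interface are in the article), assume each FIFO read-out returns a batch of samples plus the sensor's internal timer value at the moment of read-out; per-sample timestamps are then spread backwards from that timer value using an effective sampling period estimated from successive timer readings. The field names and drift-estimation rule are illustrative assumptions.

        from dataclasses import dataclass
        from typing import List

        @dataclass
        class FifoBatch:
            sensor_timer_us: float   # internal sensor timer at FIFO read-out (microseconds)
            n_samples: int           # number of samples drained from the FIFO

        def reconstruct_timestamps(batches: List[FifoBatch], nominal_period_us: float) -> List[List[float]]:
            """Assign a timestamp to every sample using the sensor's own timer.

            The effective sampling period between two read-outs is the sensor timer
            difference divided by the number of samples in the newer batch, which
            absorbs the sensor's clock drift without relying on host-side timing.
            """
            timestamps = []
            prev_timer = None
            for batch in batches:
                if prev_timer is None:
                    period = nominal_period_us                       # no drift information yet
                else:
                    period = (batch.sensor_timer_us - prev_timer) / batch.n_samples
                # the newest sample coincides with the read-out; older samples step back
                batch_ts = [batch.sensor_timer_us - (batch.n_samples - 1 - k) * period
                            for k in range(batch.n_samples)]
                timestamps.append(batch_ts)
                prev_timer = batch.sensor_timer_us
            return timestamps

        # Example: a 100 Hz sensor (nominal 10 000 us period) whose clock runs ~1 % fast
        batches = [FifoBatch(101_000.0, 10), FifoBatch(202_000.0, 10)]
        for ts in reconstruct_timestamps(batches, 10_000.0):
            print([round(t, 1) for t in ts])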

  18. Evaluation of primary immunization coverage of infants under universal immunization programme in an urban area of Bangalore city using cluster sampling and lot quality assurance sampling techniques

    Directory of Open Access Journals (Sweden)

    Punith K

    2008-01-01

    Full Text Available Research Question: Is the LQAS technique better than the cluster sampling technique in terms of resources to evaluate the immunization coverage in an urban area? Objective: To assess and compare the lot quality assurance sampling against cluster sampling in the evaluation of primary immunization coverage. Study Design: Population-based cross-sectional study. Study Setting: Areas under Mathikere Urban Health Center. Study Subjects: Children aged 12 months to 23 months. Sample Size: 220 in cluster sampling, 76 in lot quality assurance sampling. Statistical Analysis: Percentages and Proportions, Chi square Test. Results: (1) Using cluster sampling, the percentage of completely immunized, partially immunized and unimmunized children were 84.09%, 14.09% and 1.82%, respectively. With lot quality assurance sampling, it was 92.11%, 6.58% and 1.31%, respectively. (2) Immunization coverage levels as evaluated by cluster sampling technique were not statistically different from the coverage value as obtained by lot quality assurance sampling techniques. Considering the time and resources required, it was found that lot quality assurance sampling is a better technique in evaluating the primary immunization coverage in an urban area.

  19. Evaluation of personal air sampling pumps

    International Nuclear Information System (INIS)

    Ritter, P.D.; Novick, V.J.; Alvarez, J.L.; Huntsman, B.L.

    1987-01-01

    Personal air samplers are used to more conveniently obtain breathing zone samples from individuals over periods of several hours. Personal air sampling pumps must meet minimum performance levels under all working conditions to be suitable for use in radiation protection programs. In addition, the pumps should be simple to operate and as comfortable to wear as possible. Ten models of personal air sampling pumps were tested to evaluate their mechanical performance and physical characteristics. The pumps varied over a wide range in basic performance and operating features. Some of the pumps were found to have adequate performance for use in health physics air sampling applications. 3 references, 2 figures, 5 tables

  20. Development and first evaluation of a novel multiplex real-time PCR on whole blood samples for rapid pathogen identification in critically ill patients with sepsis.

    Science.gov (United States)

    van de Groep, Kirsten; Bos, Martine P; Savelkoul, Paul H M; Rubenjan, Anna; Gazenbeek, Christel; Melchers, Willem J G; van der Poll, Tom; Juffermans, Nicole P; Ong, David S Y; Bonten, Marc J M; Cremer, Olaf L

    2018-04-26

    Molecular tests may enable early adjustment of antimicrobial therapy and be complementary to blood culture (BC), which has imperfect sensitivity in critically ill patients. We evaluated a novel multiplex real-time PCR assay to diagnose bloodstream pathogens directly in whole blood samples (BSI-PCR). BSI-PCR included 11 species- and four genus-specific PCRs, a molecular Gram-stain PCR, and two antibiotic resistance markers. We collected 5 mL blood from critically ill patients simultaneously with clinically indicated BC. Microbial DNA was isolated using the Polaris method followed by automated DNA extraction. Sensitivity and specificity were calculated using BC as reference. BSI-PCR was evaluated in 347 BC-positive samples (representing up to 50 instances of each pathogen covered by the test) and 200 BC-negative samples. Bacterial species-specific PCR sensitivities ranged from 65 to 100%. Sensitivity was 26% for the Gram-positive PCR, 32% for the Gram-negative PCR, and ranged from 0 to 7% for yeast PCRs. Yeast detection was improved to 40% in a smaller set-up. There was no overall association between BSI-PCR sensitivity and time-to-positivity of BC (which was highly variable), yet Ct-values were lower for true-positive versus false-positive PCR results. False-positive results were observed in 84 (4%) of the 2200 species-specific PCRs in 200 culture-negative samples, and ranged from 0 to 6% for generic PCRs. Sensitivity of BSI-PCR was promising for individual bacterial pathogens, but still insufficient for yeasts and generic PCRs. Further development of BSI-PCR will focus on improving sensitivity by increasing input volumes and on subsequent implementation as a bedside test.
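
    Since the abstract reports sensitivity and specificity with blood culture as the reference standard, a minimal Python sketch of how such figures are derived from paired test results is given below; the counts are hypothetical, chosen for illustration, and are not the study data.

        def sensitivity_specificity(tp: int, fp: int, fn: int, tn: int):
            """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP); reference standard = blood culture."""
            return tp / (tp + fn), tn / (tn + fp)

        # Hypothetical counts for one species-specific PCR versus blood culture
        tp, fp, fn, tn = 40, 8, 10, 192
        sens, spec = sensitivity_specificity(tp, fp, fn, tn)
        print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")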

  1. Evaluation of diesel particulate matter sampling techniques

    CSIR Research Space (South Africa)

    Pretorius, CJ

    2011-09-01

    Full Text Available The study evaluated diesel particulate matter (DPM) sampling methods used in the South African mining industry. The three-piece cassette respirable, open face and stopper sampling methods were compared with the SKC DPM cassette method to find a...

  2. Optimal time points sampling in pathway modelling.

    Science.gov (United States)

    Hu, Shiyan

    2004-01-01

    Modelling cellular dynamics based on experimental data is at the heart of systems biology. Considerable progress has been made in dynamic pathway modelling as well as in the related parameter estimation. However, few studies give consideration to the issue of optimal sampling time selection for parameter estimation. Time course experiments in molecular biology rarely produce large and accurate data sets, and the experiments involved are usually time consuming and expensive. Therefore, approximating parameters for models from only a few available sampling points is of significant practical value. For signal transduction, the sampling intervals are usually not evenly distributed and are based on heuristics. In this paper, we investigate an approach to guide the process of selecting time points in an optimal way to minimize the variance of parameter estimates. In the method, we first formulate the problem as a nonlinear constrained optimization problem by maximum likelihood estimation. We then modify and apply a quantum-inspired evolutionary algorithm, which combines the advantages of both quantum computing and evolutionary computing, to solve the optimization problem. The new algorithm does not suffer from the morass of selecting good initial values or of becoming stuck in a local optimum, as usually accompanies conventional numerical optimization techniques. The simulation results indicate the soundness of the new method.
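
    The paper formulates sampling-time selection as minimizing the variance of maximum-likelihood parameter estimates and solves it with a quantum-inspired evolutionary algorithm; as a far simpler stand-in, the Python sketch below scores candidate sampling-time sets by the determinant of the Fisher information matrix (a D-optimality criterion) for a single exponential-decay model and picks the best set by exhaustive search. The model, noise level and candidate grid are illustrative assumptions, not the pathway model from the paper.

        import itertools
        import numpy as np

        def fisher_information(times, k, a, sigma):
            """Fisher information for y(t) = a*exp(-k*t) + N(0, sigma^2), parameters (a, k)."""
            J = np.column_stack([np.exp(-k * times),                  # dy/da
                                 -a * times * np.exp(-k * times)])    # dy/dk
            return J.T @ J / sigma ** 2

        def best_design(candidate_times, n_points, k=0.5, a=1.0, sigma=0.05):
            """Exhaustively pick the n_points sampling times maximising det(FIM)."""
            best, best_det = None, -np.inf
            for design in itertools.combinations(candidate_times, n_points):
                d = np.linalg.det(fisher_information(np.array(design), k, a, sigma))
                if d > best_det:
                    best, best_det = design, d
            return best, best_det

        grid = np.linspace(0.5, 10.0, 20)          # candidate sampling times
        design, det_fim = best_design(grid, n_points=3)
        print("optimal times:", [round(t, 2) for t in design], " det(FIM) =", round(det_fim, 1))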

  3. Performance evaluation soil samples utilizing encapsulation technology

    Science.gov (United States)

    Dahlgran, James R.

    1999-01-01

    Performance evaluation soil samples and method of their preparation using encapsulation technology to encapsulate analytes which are introduced into a soil matrix for analysis and evaluation by analytical laboratories. Target analytes are mixed in an appropriate solvent at predetermined concentrations. The mixture is emulsified in a solution of polymeric film forming material. The emulsified solution is polymerized to form microcapsules. The microcapsules are recovered, quantitated and introduced into a soil matrix in a predetermined ratio to form soil samples with the desired analyte concentration.

  4. Systematic Evaluation of Aggressive Air Sampling for Bacillus ...

    Science.gov (United States)

    The primary objectives of this project were to evaluate the Aggressive Air Sampling (AAS) method in comparison with currently used surface sampling methods and to determine whether AAS is a viable option for sampling Bacillus anthracis spores.

  5. Digital timing: sampling frequency, anti-aliasing filter and signal interpolation filter dependence on timing resolution

    International Nuclear Information System (INIS)

    Cho, Sanghee; Grazioso, Ron; Zhang Nan; Aykac, Mehmet; Schmand, Matthias

    2011-01-01

    The main focus of our study is to investigate how the performance of digital timing methods is affected by sampling rate, anti-aliasing and signal interpolation filters. We used the Nyquist sampling theorem to address some basic questions, such as: What will be the minimum sampling frequencies? How accurate will the signal interpolation be? How do we validate the timing measurements? The preferred sampling rate would be as low as possible, considering the high cost and power consumption of high-speed analog-to-digital converters. However, when the sampling rate is too low, due to the aliasing effect, some artifacts are produced in the timing resolution estimations; the shape of the timing profile is distorted and the FWHM values of the profile fluctuate as the source location changes. Anti-aliasing filters are required in this case to avoid the artifacts, but the timing is degraded as a result. When the sampling rate is marginally over the Nyquist rate, a proper signal interpolation is important. A sharp roll-off (higher order) filter is required to separate the baseband signal from its replicates to avoid aliasing, but in return the computational cost will be higher. We demonstrated the analysis through a digital timing study using fast LSO scintillation crystals as used in time-of-flight PET scanners. From the study, we observed that there is no significant timing resolution degradation down to 1.3 GHz sampling frequency, and the computation requirement for the signal interpolation is reasonably low. A so-called sliding test is proposed as a validation tool checking constant timing resolution behavior of a given timing pick-off method regardless of the source location change. Lastly, the performance comparison for several digital timing methods is also shown.

  6. Adaptive Sampling of Time Series During Remote Exploration

    Science.gov (United States)

    Thompson, David R.

    2012-01-01

    This work deals with the challenge of online adaptive data collection in a time series. A remote sensor or explorer agent adapts its rate of data collection in order to track anomalous events while obeying constraints on time and power. This problem is challenging because the agent has limited visibility (all its datapoints lie in the past) and limited control (it can only decide when to collect its next datapoint). This problem is treated from an information-theoretic perspective, fitting a probabilistic model to collected data and optimizing the future sampling strategy to maximize information gain. The performance characteristics of stationary and nonstationary Gaussian process models are compared. Self-throttling sensors could benefit environmental sensor networks and monitoring as well as robotic exploration. Explorer agents can improve performance by adjusting their data collection rate, preserving scarce power or bandwidth resources during uninteresting times while fully covering anomalous events of interest. For example, a remote earthquake sensor could conserve power by limiting its measurements during normal conditions and increasing its cadence during rare earthquake events. A similar capability could improve sensor platforms traversing a fixed trajectory, such as an exploration rover transect or a deep space flyby. These agents can adapt observation times to improve sample coverage during moments of rapid change. An adaptive sampling approach couples sensor autonomy, instrument interpretation, and sampling. The challenge is addressed as an active learning problem, which already has extensive theoretical treatment in the statistics and machine learning literature. A statistical Gaussian process (GP) model is employed to guide sample decisions that maximize information gain. Nonstationary (e.g., time-varying) covariance relationships permit the system to represent and track local anomalies, in contrast with current GP approaches. Most common GP models
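
    As a minimal Python sketch of the information-gain idea described above (not the nonstationary GP machinery developed in the work), the code below builds a small Gaussian-process model with an RBF kernel over the times sampled so far and schedules the next measurement at the candidate time with the largest predictive variance, a common proxy for information gain. The kernel choice, hyperparameters and candidate grid are illustrative assumptions.

        import numpy as np

        def rbf_kernel(x1, x2, length=1.0, variance=1.0):
            d = x1[:, None] - x2[None, :]
            return variance * np.exp(-0.5 * (d / length) ** 2)

        def gp_predictive_variance(x_train, x_cand, noise=1e-3):
            """Posterior predictive variance of a zero-mean GP at candidate times
            (the variance depends only on where samples were taken, not on their values)."""
            K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
            K_s = rbf_kernel(x_cand, x_train)
            K_ss = rbf_kernel(x_cand, x_cand)
            cov = K_ss - K_s @ np.linalg.solve(K, K_s.T)
            return np.diag(cov)

        def next_sample_time(x_train, x_cand):
            """Pick the candidate time with the largest predictive variance."""
            return x_cand[np.argmax(gp_predictive_variance(x_train, x_cand))]

        # Example: measurements so far at t = 0, 1, 2, 8; candidates on a fine grid
        x_obs = np.array([0.0, 1.0, 2.0, 8.0])
        grid = np.linspace(0.0, 10.0, 101)
        print("most informative next sample at t =", round(float(next_sample_time(x_obs, grid)), 2))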

  7. Infrared biospectroscopy for a fast qualitative evaluation of sample preparation in metabolomics.

    Science.gov (United States)

    Kuligowski, Julia; Pérez-Guaita, David; Escobar, Javier; Lliso, Isabel; de la Guardia, Miguel; Lendl, Bernhard; Vento, Máximo; Quintás, Guillermo

    2014-09-01

    Liquid chromatography-mass spectrometry (LC-MS) has been increasingly used in biomedicine to study the dynamic metabolomic responses of biological systems under different physiological or pathological conditions. To obtain an integrated snapshot of the system, metabolomic methods in biomedicine typically analyze biofluids (e.g. plasma) that require clean-up before being injected into LC-MS systems. However, high resolution LC-MS is costly in terms of resources required for sample and data analysis, and care must be taken to prevent chemical (e.g. ion suppression) or statistical artifacts. Because of that, the effect of sample preparation on the metabolomic profile during metabolomic method development is often overlooked. This work combines Attenuated Total Reflectance-Fourier transform infrared (ATR-FTIR) spectroscopy with multivariate exploratory data analysis for a cost-effective qualitative evaluation of major changes in sample composition during sample preparation. ATR-FTIR and LC-time of flight mass spectrometry (TOFMS) data from the analysis of a set of plasma samples precipitated using acetonitrile, methanol and acetone in parallel were used as a model example. Biochemical information obtained from the analysis of the ATR-FTIR and LC-TOFMS data was thoroughly compared to evaluate the strengths and shortcomings of FTIR biospectroscopy for assessing sample preparation in metabolomics studies. Results obtained show the feasibility of ATR-FTIR for the evaluation of major trends in plasma composition changes among different sample pretreatments, providing information in terms of e.g. amino acids, proteins, lipids and carbohydrates overall contents comparable to that found by LC-TOFMS. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Unsupervised Ensemble Anomaly Detection Using Time-Periodic Packet Sampling

    Science.gov (United States)

    Uchida, Masato; Nawata, Shuichi; Gu, Yu; Tsuru, Masato; Oie, Yuji

    We propose an anomaly detection method for finding patterns in network traffic that do not conform to legitimate (i.e., normal) behavior. The proposed method trains a baseline model describing the normal behavior of network traffic without using manually labeled traffic data. The trained baseline model is used as the basis for comparison with the audit network traffic. This anomaly detection works in an unsupervised manner through the use of time-periodic packet sampling, which is used in a manner that differs from its intended purpose — the lossy nature of packet sampling is used to extract normal packets from the unlabeled original traffic data. Evaluation using actual traffic traces showed that the proposed method has false positive and false negative rates in the detection of anomalies regarding TCP SYN packets comparable to those of a conventional method that uses manually labeled traffic data to train the baseline model. Performance variation due to the probabilistic nature of sampled traffic data is mitigated by using ensemble anomaly detection that collectively exploits multiple baseline models in parallel. Alarm sensitivity is adjusted for the intended use by using maximum- and minimum-based anomaly detection that effectively take advantage of the performance variations among the multiple baseline models. Testing using actual traffic traces showed that the proposed anomaly detection method performs as well as one using manually labeled traffic data and better than one using randomly sampled (unlabeled) traffic data.

  9. Sampling pig farms at the abattoir in a cross-sectional study - Evaluation of a sampling method.

    Science.gov (United States)

    Birkegård, Anna Camilla; Halasa, Tariq; Toft, Nils

    2017-09-15

    A cross-sectional study design is relatively inexpensive, fast and easy to conduct when compared to other study designs. Careful planning is essential to obtaining a representative sample of the population, and the recommended approach is to use simple random sampling from an exhaustive list of units in the target population. This approach is rarely feasible in practice, and other sampling procedures must often be adopted. For example, when slaughter pigs are the target population, sampling the pigs on the slaughter line may be an alternative to on-site sampling at a list of farms. However, it is difficult to sample a large number of farms from an exact predefined list, due to the logistics and workflow of an abattoir. Therefore, it is necessary to have a systematic sampling procedure and to evaluate the obtained sample with respect to the study objective. We propose a method for 1) planning, 2) conducting, and 3) evaluating the representativeness and reproducibility of a cross-sectional study when simple random sampling is not possible. We used an example of a cross-sectional study with the aim of quantifying the association between antimicrobial resistance and antimicrobial consumption in Danish slaughter pigs. It was not possible to visit farms within the designated timeframe. Therefore, it was decided to use convenience sampling at the abattoir. Our approach was carried out in three steps: 1) planning: using data from meat inspection to plan at which abattoirs and how many farms to sample; 2) conducting: sampling was carried out at five abattoirs; 3) evaluation: representativeness was evaluated by comparing sampled and non-sampled farms, and the reproducibility of the study was assessed through simulated sampling based on meat inspection data from the period where the actual data collection was carried out. In the cross-sectional study, samples were taken from 681 Danish pig farms during five weeks from February to March 2015. The evaluation showed that the sampling

  10. Evaluation of optimized bronchoalveolar lavage sampling designs for characterization of pulmonary drug distribution.

    Science.gov (United States)

    Clewe, Oskar; Karlsson, Mats O; Simonsson, Ulrika S H

    2015-12-01

    Bronchoalveolar lavage (BAL) is a pulmonary sampling technique for characterization of drug concentrations in epithelial lining fluid and alveolar cells. Two hypothetical drugs with different pulmonary distribution rates (fast and slow) were considered. An optimized BAL sampling design was generated assuming no previous information regarding the pulmonary distribution (rate and extent) and with a maximum of two samples per subject. Simulations were performed to evaluate the impact of the number of samples per subject (1 or 2) and the sample size on the relative bias and relative root mean square error of the parameter estimates (rate and extent of pulmonary distribution). The optimized BAL sampling design depends on a characterized plasma concentration time profile, a population plasma pharmacokinetic model, the limit of quantification (LOQ) of the BAL method and involves only two BAL sample time points, one early and one late. The early sample should be taken as early as possible, where concentrations in the BAL fluid ≥ LOQ. The second sample should be taken at a time point in the declining part of the plasma curve, where the plasma concentration is equivalent to the plasma concentration in the early sample. Using a previously described general pulmonary distribution model linked to a plasma population pharmacokinetic model, simulated data using the final BAL sampling design enabled characterization of both the rate and extent of pulmonary distribution. The optimized BAL sampling design enables characterization of both the rate and extent of the pulmonary distribution for both fast and slowly equilibrating drugs.

  11. Identification of continuous-time systems from samples of input ...

    Indian Academy of Sciences (India)

    This paper presents an introductory survey of the methods that have been developed for identification of continuous-time systems from samples of input-output data. The two basic approaches may be described as (i) the indirect method, where first a discrete-time model is estimated from the sampled data and then ...

  12. Optimizing headspace sampling temperature and time for analysis of volatile oxidation products in fish oil

    DEFF Research Database (Denmark)

    Rørbæk, Karen; Jensen, Benny

    1997-01-01

    Headspace-gas chromatography (HS-GC), based on adsorption to Tenax GR(R), thermal desorption and GC, has been used for analysis of volatiles in fish oil. To optimize sampling conditions, the effect of heating the fish oil at various temperatures and times was evaluated from anisidine values (AV...

  13. The Brine Sampling and Evaluation Program (BSEP) at WIPP

    International Nuclear Information System (INIS)

    Deal, D.E.; Roggenthen, W.M.

    1989-01-01

    The Permian salt beds of the WIPP facility are virtually dry. The amount of water present in the rocks exposed in the excavations that is free to migrate under pressure gradients was estimated by heating salt samples to 95 degrees C and measuring weight loss. Clear halite contains about 0.22 weight percent water and the more argillaceous units average about 0.75 percent. Measurements made since 1984 as part of the Brine Sampling and Evaluation Program (BSEP) indicate that small amounts of this brine can migrate into the excavations and do accumulate in the underground environment. Monitoring of drillholes since they were drilled shows that brine seepage decreases with time and that many holes have dried up entirely. Weeping of brine from the walls of the repository excavations also decreases after two or more years. Chemical analyses of the brines show that they are sodium-chloride saturated and magnesium-rich.

  14. Sample Exchange Evaluation (SEE) Report - Phase II

    Energy Technology Data Exchange (ETDEWEB)

    Winters, W.I.

    1994-09-28

    This report describes the results from Phase II of the Sample Exchange Evaluation (SEE) Program, a joint effort to compare analytical laboratory performance on samples from the Hanford Site's high-level waste tanks. In Phase II, the program has been expanded to include inorganic constituents in addition to radionuclides. Results from Phase II that exceeded 20% relative percent difference criteria are identified.
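
    Since results in this program are screened against a 20% relative percent difference (RPD) criterion, a minimal Python sketch of that screening calculation is shown below; the analyte names and paired values are hypothetical, not data from the SEE program.

        def relative_percent_difference(a: float, b: float) -> float:
            """RPD = |a - b| / mean(a, b) * 100, a common inter-laboratory comparison metric."""
            return abs(a - b) / ((a + b) / 2.0) * 100.0

        # Hypothetical paired results (same sample analyzed by two laboratories)
        pairs = {"Na (ug/g)": (1520.0, 1480.0), "Cr (ug/g)": (210.0, 295.0)}
        for analyte, (lab1, lab2) in pairs.items():
            rpd = relative_percent_difference(lab1, lab2)
            flag = "EXCEEDS 20% criterion" if rpd > 20.0 else "ok"
            print(f"{analyte}: RPD = {rpd:.1f}%  ({flag})")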

  15. Sample Exchange Evaluation (SEE) Report - Phase II

    International Nuclear Information System (INIS)

    Winters, W.I.

    1994-01-01

    This report describes the results from Phase II of the Sample Exchange Evaluation (SEE) Program, a joint effort to compare analytical laboratory performance on samples from the Hanford Site's high-level waste tanks. In Phase II, the program has been expanded to include inorganic constituents in addition to radionuclides. Results from Phase II that exceeded 20% relative percent difference criteria are identified

  16. Implementing Run-Time Evaluation of Distributed Timing Constraints in a Real-Time Environment

    DEFF Research Database (Denmark)

    Kristensen, C. H.; Drejer, N.

    1994-01-01

    In this paper we describe a solution to the problem of implementing run-time evaluation of timing constraints in distributed real-time environments.

  17. Algae viability over time in a ballast water sample

    Science.gov (United States)

    Gollasch, Stephan; David, Matej

    2018-03-01

    The biology of vessels' ballast water needs to be analysed for several reasons, one of these being performance tests of ballast water management systems. This analysis includes a viability assessment of phytoplankton. To overcome the logistical problems of getting algae sample processing gear on board a vessel to document algae viability, samples may be transported to land-based laboratories. Concerns were raised about how the storage conditions of the sample may impact algae viability over time and what the most appropriate storage conditions are. Here we answer these questions with a long-term algae viability study with daily sample analysis using Pulse-Amplitude Modulated (PAM) fluorometry. The sample was analysed over 79 days. We tested different storage conditions: fridge and room temperature, with and without light. It seems that during the first two weeks of the experiment the viability remains almost unchanged, with a slight downward trend. In the continuing period, before the sample was split, a slightly stronger downward viability trend was observed, which occurred at a similar rate towards the end of the experiment. After the sample was split, the strongest viability reduction was measured for the sample stored without light at room temperature. We concluded that the storage conditions, especially regarding temperature and light exposure, have a stronger impact on algae viability than the storage duration and that inappropriate storage conditions reduce algal viability. A sample storage time of up to two weeks in a dark and cool environment has little influence on organism viability. This indicates that a two-week interval between sample taking on board a vessel and the viability measurement in a land-based laboratory may not be very critical.

  18. Low-sampling-rate ultra-wideband channel estimation using equivalent-time sampling

    KAUST Repository

    Ballal, Tarig

    2014-09-01

    In this paper, a low-sampling-rate scheme for ultra-wideband channel estimation is proposed. The scheme exploits multiple observations generated by transmitting multiple pulses. In the proposed scheme, P pulses are transmitted to produce channel impulse response estimates at a desired sampling rate, while the ADC samples at a rate that is P times slower. To avoid loss of fidelity, the number of sampling periods (based on the desired rate) in the inter-pulse interval is restricted to be co-prime with P. This condition is affected when clock drift is present and the transmitted pulse locations change. To handle this case, and to achieve an overall good channel estimation performance without using prior information, we derive an improved estimator based on the bounded data uncertainty (BDU) model. It is shown that this estimator is related to the Bayesian linear minimum mean squared error (LMMSE) estimator. Channel estimation performance of the proposed sub-sampling scheme combined with the new estimator is assessed in simulation. The results show that a large reduction in sampling rate can be achieved. The proposed estimator outperforms the least squares estimator in almost all cases, while in the high SNR regime it also outperforms the LMMSE estimator. In addition to channel estimation, a synchronization method is also proposed that utilizes the same pulse sequence used for channel estimation. © 2014 IEEE.
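
    To make the sub-sampling idea concrete, the Python sketch below emulates equivalent-time sampling of a repetitive pulse: the ADC runs P times below the desired rate while P pulse repetitions go by, and because the inter-pulse interval (in desired-rate periods) is co-prime with P, the slow samples land on every fast-grid position and can be re-ordered into one full-rate record. The waveform and the numbers used are illustrative assumptions, not the BDU estimator from the paper.

        import numpy as np

        P = 4                        # slow-down factor: ADC rate is P times below the desired rate
        N = 15                       # inter-pulse interval in desired-rate periods; must be co-prime with P
        assert np.gcd(N, P) == 1, "inter-pulse interval must be co-prime with P"

        # Repetitive waveform defined on the desired (fast) grid, one pulse period long
        n = np.arange(N)
        waveform = np.exp(-0.5 * ((n - 6) / 1.5) ** 2)

        # The ADC samples every P-th fast-grid instant; N slow samples span P pulse repetitions.
        slow_sample_instants = np.arange(0, N * P, P)
        phases = slow_sample_instants % N            # position of each slow sample within its pulse
        slow_samples = waveform[phases]

        # Because gcd(N, P) = 1, the phases visit every fast-grid position exactly once,
        # so re-ordering the slow samples by phase rebuilds the full-rate record.
        reconstructed = np.empty(N)
        reconstructed[phases] = slow_samples
        print(np.allclose(reconstructed, waveform))  # True: full-rate record recovered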

  19. On the sample transport time of a pneumatic transfer system

    International Nuclear Information System (INIS)

    Kondo, Yoshihide

    1983-01-01

    The counts accumulated in the measuring system are affected by variations in the transport time of the sample in cyclic activation experiments with a mechanical sample transfer system. When using the pneumatic transfer system that has been set up, the transport time varies with factors such as the form, size and weight of the samples and the pneumatic pressure. Understanding the relationships between the transport time and these variable factors is essential for making experiments with this transfer system. (author)

  20. Material sampling for rotor evaluation

    International Nuclear Information System (INIS)

    Mercaldi, D.; Parker, J.

    1990-01-01

    Decisions regarding continued operation of aging rotating machinery must often be made without adequate knowledge of rotor material conditions. Physical specimens of the material are not generally available due to lack of an appropriate sampling technique or the high cost and inconvenience of obtaining such samples. This is despite the fact that examination of such samples may be critical to effectively assess the degradation of mechanical properties of the components in service or to permit detailed examination of microstructure and surface flaws. Such information permits a reduction in the uncertainty of remaining life estimates for turbine rotors to avoid unnecessarily premature and costly rotor retirement decisions. This paper describes the operation and use of a recently developed material sampling device which machines and recovers an undeformed specimen from the surface of rotor bores or other components for metallurgical analysis. The removal of the thin, wafer-like sample has a negligible effect on the structural integrity of these components, due to the geometry and smooth surface finish of the resulting shallow depression. Samples measuring approximately 0.03 to 0.1 inches (0.76 to 2.5 mm) thick by 0.5 to 1.0 inch (1.3 to 2.5 cm) in diameter can be removed without mechanical deformation or thermal degradation of the sample or the remaining component material. The device is operated remotely from a control console and can be used externally or internally on any surface for which there is at least a three inch (7.6 cm) working clearance. Application of the device in two case studies of turbine-generator evaluations are presented

  1. DOE methods for evaluating environmental and waste management samples

    International Nuclear Information System (INIS)

    Goheen, S.C.; McCulloch, M.; Thomas, B.L.; Riley, R.G.; Sklarew, D.S.; Mong, G.M.; Fadeff, S.K.

    1993-03-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) provides applicable methods in use by the US Department of Energy (DOE) laboratories for sampling and analyzing constituents of waste and environmental samples. The development of DOE Methods is supported by the Laboratory Management Division (LMD) of the DOE. This document contains chapters and methods that are proposed for use in evaluating components of DOE environmental and waste management samples. DOE Methods is a resource intended to support sampling and analytical activities that will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the US Environmental Protection Agency (EPA), or others

  2. DOE methods for evaluating environmental and waste management samples.

    Energy Technology Data Exchange (ETDEWEB)

    Goheen, S C; McCulloch, M; Thomas, B L; Riley, R G; Sklarew, D S; Mong, G M; Fadeff, S K [eds.; Pacific Northwest Lab., Richland, WA (United States)

    1994-04-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) provides applicable methods in use by the US Department of Energy (DOE) laboratories for sampling and analyzing constituents of waste and environmental samples. The development of DOE Methods is supported by the Laboratory Management Division (LMD) of the DOE. This document contains chapters and methods that are proposed for use in evaluating components of DOE environmental and waste management samples. DOE Methods is a resource intended to support sampling and analytical activities that will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the US Environmental Protection Agency (EPA), or others.

  3. Quality characteristics of Moroccan sweet paprika (Capsicum annuum L. at different sampling times

    Directory of Open Access Journals (Sweden)

    Naima Zaki

    2013-09-01

    Full Text Available "La Niora" is a red pepper variety cultivated in Tadla Region (Morocco) which is used for manufacturing paprika after sun drying. The paprika quality (nutritional, chemical and microbiological) was evaluated immediately after milling, from September to December. Sampling time mainly affected paprika color and the total capsaicinoid and vitamin C contents. The commercial quality was acceptable and no aflatoxins were found, but the microbial load sometimes exceeded permitted levels.

  4. Adaptive control of theophylline therapy: importance of blood sampling times.

    Science.gov (United States)

    D'Argenio, D Z; Khakmahd, K

    1983-10-01

    A two-observation protocol for estimating theophylline clearance during a constant-rate intravenous infusion is used to examine the importance of blood sampling schedules with regard to the information content of resulting concentration data. Guided by a theory for calculating maximally informative sample times, population simulations are used to assess the effect of specific sampling times on the precision of resulting clearance estimates and subsequent predictions of theophylline plasma concentrations. The simulations incorporated noise terms for intersubject variability, dosing errors, sample collection errors, and assay error. Clearance was estimated using Chiou's method, least squares, and a Bayesian estimation procedure. The results of these simulations suggest that clinically significant estimation and prediction errors may result when using the above two-point protocol for estimating theophylline clearance if the time separating the two blood samples is less than one population mean elimination half-life.
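
    For readers unfamiliar with the two-observation protocol, the Python sketch below implements a two-point clearance estimate of the form usually attributed to Chiou for a constant-rate intravenous infusion; the exact expression and the parameter values used here (infusion rate, volume of distribution, concentrations, times) are illustrative assumptions and should be checked against the original reference before any practical use.

        def chiou_clearance(rate, vd, c1, t1, c2, t2):
            """Two-point clearance estimate during a constant-rate IV infusion.

            rate   : infusion rate (mg/h)
            vd     : volume of distribution (L)
            c1, c2 : plasma concentrations (mg/L) measured at times t1, t2 (h), with t2 > t1
            Form usually attributed to Chiou:
            CL = 2*R0/(C1+C2) + 2*Vd*(C1-C2)/((C1+C2)*(t2-t1))
            """
            return 2 * rate / (c1 + c2) + 2 * vd * (c1 - c2) / ((c1 + c2) * (t2 - t1))

        # Hypothetical theophylline example: 40 mg/h infusion, Vd = 35 L,
        # concentrations of 8.0 and 9.5 mg/L drawn 8 h apart
        cl = chiou_clearance(rate=40.0, vd=35.0, c1=8.0, t1=2.0, c2=9.5, t2=10.0)
        print(f"estimated clearance: {cl:.2f} L/h")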

  5. Using forbidden ordinal patterns to detect determinism in irregularly sampled time series.

    Science.gov (United States)

    Kulp, C W; Chobot, J M; Niskala, B J; Needhammer, C J

    2016-02-01

    It is known that when symbolizing a time series into ordinal patterns using the Bandt-Pompe (BP) methodology, there will be ordinal patterns called forbidden patterns that do not occur in a deterministic series. The existence of forbidden patterns can be used to identify deterministic dynamics. In this paper, the ability to use forbidden patterns to detect determinism in irregularly sampled time series is tested on data generated from a continuous model system. The study is done in three parts. First, the effects of sampling time on the number of forbidden patterns are studied on regularly sampled time series. The next two parts focus on two types of irregular-sampling, missing data and timing jitter. It is shown that forbidden patterns can be used to detect determinism in irregularly sampled time series for low degrees of sampling irregularity (as defined in the paper). In addition, comments are made about the appropriateness of using the BP methodology to symbolize irregularly sampled time series.
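
    To make the Bandt-Pompe symbolization concrete, the Python sketch below converts a time series into ordinal patterns of order D = 3 and counts how many of the D! possible patterns never occur (the "forbidden" patterns); for a deterministic map such as the fully chaotic logistic map some patterns remain absent, whereas white noise of sufficient length eventually visits them all. The embedding order and series length are illustrative choices.

        import itertools
        from collections import Counter
        import numpy as np

        def ordinal_pattern_counts(series, order=3):
            """Count Bandt-Pompe ordinal patterns of the given order in a 1-D series."""
            counts = Counter()
            for i in range(len(series) - order + 1):
                window = series[i:i + order]
                counts[tuple(np.argsort(window))] += 1   # rank ordering of the window
            return counts

        def count_forbidden_patterns(series, order=3):
            """Number of possible ordinal patterns that never appear in the series."""
            observed = set(ordinal_pattern_counts(series, order))
            all_patterns = set(itertools.permutations(range(order)))
            return len(all_patterns - observed)

        # Deterministic series: logistic map at r = 4 (known to exhibit forbidden patterns)
        x = np.empty(5000); x[0] = 0.4
        for i in range(1, len(x)):
            x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

        noise = np.random.default_rng(1).random(5000)    # stochastic comparison series

        print("forbidden patterns, logistic map:", count_forbidden_patterns(x))
        print("forbidden patterns, white noise :", count_forbidden_patterns(noise))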

  6. Salmonella detection in poultry samples. Comparison of two commercial real-time PCR systems with culture methods for the detection of Salmonella spp. in environmental and fecal samples of poultry.

    Science.gov (United States)

    Sommer, D; Enderlein, D; Antakli, A; Schönenbrücher, H; Slaghuis, J; Redmann, T; Lierz, M

    2012-01-01

    The efficiency of two commercial PCR methods based on real-time technology, the foodproof® Salmonella detection system and the BAX® PCR Assay Salmonella system, was compared to standardized culture methods (EN ISO 6579:2002 - Annex D) for the detection of Salmonella spp. in poultry samples. Four sample matrices (feed, dust, boot swabs, feces) obtained directly from poultry flocks, as well as artificially spiked samples of the same matrices, were used. All samples were tested for Salmonella spp. using culture methods first as the gold standard. In addition, samples spiked with Salmonella Enteritidis were tested to evaluate the sensitivity of both PCR methods. Furthermore, all methods were evaluated in an annual ring-trial of the National Salmonella Reference Laboratory of Germany. Salmonella detection in the matrices feed, dust and boot swabs was comparable in both PCR systems, whereas the results from feces differed markedly. The quality, especially the freshness, of the fecal samples had an influence on the sensitivity of the real-time PCR and the results of the culture methods. In fresh fecal samples an initial spiking level of 100 cfu/25 g Salmonella Enteritidis was detected. Two-day-dried fecal samples allowed the detection of 14 cfu/25 g. Both real-time PCR protocols appear to be suitable for the detection of Salmonella spp. in all four matrices. The foodproof® system detected eight more samples as positive compared to the BAX® system, but had a potential false positive result in one case. In 7-day-dried samples, none of the methods was able to detect Salmonella, likely due to lethal cell damage. In general, the advantage of PCR analyses over the culture method is the reduction of working time from 4-5 days to only 2 days. However, especially for the analysis of fecal samples, official validation should be conducted according to the requirements of EN ISO 6579:2002 - Annex D.

  7. Field evaluation of personal sampling methods for multiple bioaerosols.

    Science.gov (United States)

    Wang, Chi-Hsun; Chen, Bean T; Han, Bor-Cheng; Liu, Andrew Chi-Yeu; Hung, Po-Chen; Chen, Chih-Yong; Chao, Hsing Jasmine

    2015-01-01

    Ambient bioaerosols are ubiquitous in the daily environment and can affect health in various ways. However, few studies have been conducted to comprehensively evaluate personal bioaerosol exposure in occupational and indoor environments because of the complex composition of bioaerosols and the lack of standardized sampling/analysis methods. We conducted a study to determine the most efficient collection/analysis method for the personal exposure assessment of multiple bioaerosols. The sampling efficiencies of three filters and four samplers were compared. According to our results, polycarbonate (PC) filters had the highest relative efficiency, particularly for bacteria. Side-by-side sampling was conducted to evaluate the three filter samplers (with PC filters) and the NIOSH Personal Bioaerosol Cyclone Sampler. According to the results, the Button Aerosol Sampler and the IOM Inhalable Dust Sampler had the highest relative efficiencies for fungi and bacteria, followed by the NIOSH sampler. Personal sampling was performed in a pig farm to assess occupational bioaerosol exposure and to evaluate the sampling/analysis methods. The Button and IOM samplers yielded a similar performance for personal bioaerosol sampling at the pig farm. However, the Button sampler is more likely to be clogged at high airborne dust concentrations because of its higher flow rate (4 L/min). Therefore, the IOM sampler is a more appropriate choice for performing personal sampling in environments with high dust levels. In summary, the Button and IOM samplers with PC filters are efficient sampling/analysis methods for the personal exposure assessment of multiple bioaerosols.

  8. Method for Hot Real-Time Sampling of Gasification Products

    Energy Technology Data Exchange (ETDEWEB)

    Pomeroy, Marc D [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-29

    The Thermochemical Process Development Unit (TCPDU) at the National Renewable Energy Laboratory (NREL) is a highly instrumented half-ton/day pilot scale plant capable of demonstrating industrially relevant thermochemical technologies for lignocellulosic biomass conversion, including gasification. Gasification creates primarily syngas (a mixture of hydrogen and carbon monoxide) that can be utilized with synthesis catalysts to form transportation fuels and other valuable chemicals. Biomass-derived gasification products are a very complex mixture of chemical components that typically contain sulfur and nitrogen species that can act as catalyst poisons for tar reforming and synthesis catalysts. Real-time hot online sampling techniques, such as Molecular Beam Mass Spectrometry (MBMS), and gas chromatographs with sulfur- and nitrogen-specific detectors can provide real-time analysis and operational indicators of performance. Sampling typically requires coated sampling lines to minimize trace sulfur interactions with steel surfaces. Other materials used inline have also shown conversion of sulfur species into new components and must be minimized. Residence time within the sampling lines must also be kept to a minimum to reduce further reaction chemistry. Solids from ash and char contribute to plugging and must be filtered at temperature. Experience at NREL has shown several key factors to consider when designing and installing an analytical sampling system for biomass gasification products. They include minimizing sampling distance, effective filtering as close to the source as possible, proper line sizing, proper line materials or coatings, even heating of all components, minimizing pressure drops, and additional filtering or traps after pressure drops.

  9. STATISTICAL EVALUATION OF SMALL SCALE MIXING DEMONSTRATION SAMPLING AND BATCH TRANSFER PERFORMANCE - 12093

    Energy Technology Data Exchange (ETDEWEB)

    GREER DA; THIEN MG

    2012-01-12

    The ability to effectively mix, sample, certify, and deliver consistent batches of High Level Waste (HLW) feed from the Hanford Double Shell Tanks (DST) to the Waste Treatment and Immobilization Plant (WTP) presents a significant mission risk with potential to impact mission length and the quantity of HLW glass produced. DOE's Tank Operations Contractor, Washington River Protection Solutions (WRPS) has previously presented the results of mixing performance in two different sizes of small scale DSTs to support scale up estimates of full scale DST mixing performance. Currently, sufficient sampling of DSTs is one of the largest programmatic risks that could prevent timely delivery of high level waste to the WTP. WRPS has performed small scale mixing and sampling demonstrations to study the ability to sufficiently sample the tanks. The statistical evaluation of the demonstration results which lead to the conclusion that the two scales of small DST are behaving similarly and that full scale performance is predictable will be presented. This work is essential to reduce the risk of requiring a new dedicated feed sampling facility and will guide future optimization work to ensure the waste feed delivery mission will be accomplished successfully. This paper will focus on the analytical data collected from mixing, sampling, and batch transfer testing from the small scale mixing demonstration tanks and how those data are being interpreted to begin to understand the relationship between samples taken prior to transfer and samples from the subsequent batches transferred. An overview of the types of data collected and examples of typical raw data will be provided. The paper will then discuss the processing and manipulation of the data which is necessary to begin evaluating sampling and batch transfer performance. This discussion will also include the evaluation of the analytical measurement capability with regard to the simulant material used in the demonstration tests. The

  10. Field evaluation of personal sampling methods for multiple bioaerosols.

    Directory of Open Access Journals (Sweden)

    Chi-Hsun Wang

    Full Text Available Ambient bioaerosols are ubiquitous in the daily environment and can affect health in various ways. However, few studies have been conducted to comprehensively evaluate personal bioaerosol exposure in occupational and indoor environments because of the complex composition of bioaerosols and the lack of standardized sampling/analysis methods. We conducted a study to determine the most efficient collection/analysis method for the personal exposure assessment of multiple bioaerosols. The sampling efficiencies of three filters and four samplers were compared. According to our results, polycarbonate (PC) filters had the highest relative efficiency, particularly for bacteria. Side-by-side sampling was conducted to evaluate the three filter samplers (with PC filters) and the NIOSH Personal Bioaerosol Cyclone Sampler. According to the results, the Button Aerosol Sampler and the IOM Inhalable Dust Sampler had the highest relative efficiencies for fungi and bacteria, followed by the NIOSH sampler. Personal sampling was performed in a pig farm to assess occupational bioaerosol exposure and to evaluate the sampling/analysis methods. The Button and IOM samplers yielded a similar performance for personal bioaerosol sampling at the pig farm. However, the Button sampler is more likely to be clogged at high airborne dust concentrations because of its higher flow rate (4 L/min). Therefore, the IOM sampler is a more appropriate choice for performing personal sampling in environments with high dust levels. In summary, the Button and IOM samplers with PC filters are efficient sampling/analysis methods for the personal exposure assessment of multiple bioaerosols.

  11. An inexpensive and portable microvolumeter for rapid evaluation of biological samples.

    Science.gov (United States)

    Douglass, John K; Wcislo, William T

    2010-08-01

    We describe an improved microvolumeter (MVM) for rapidly measuring volumes of small biological samples, including live zooplankton, embryos, and small animals and organs. Portability and low cost make this instrument suitable for widespread use, including at remote field sites. Beginning with Archimedes' principle, which states that immersing an arbitrarily shaped sample in a fluid-filled container displaces an equivalent volume, we identified procedures that maximize measurement accuracy and repeatability across a broad range of absolute volumes. Crucial steps include matching the overall configuration to the size of the sample, using reflected light to monitor fluid levels precisely, and accounting for evaporation during measurements. The resulting precision is at least 100 times higher than in previous displacement-based methods. Volumes are obtained much faster than by traditional histological or confocal methods and without shrinkage artifacts due to fixation or dehydration. Calibrations using volume standards confirmed accurate measurements of volumes as small as 0.06 microL. We validated the feasibility of evaluating soft-tissue samples by comparing volumes of freshly dissected ant brains measured with the MVM and by confocal reconstruction.

  12. Transformation-cost time-series method for analyzing irregularly sampled data.

    Science.gov (United States)

    Ozken, Ibrahim; Eroglu, Deniz; Stemler, Thomas; Marwan, Norbert; Bagci, G Baris; Kurths, Jürgen

    2015-06-01

    Irregular sampling of data sets is one of the challenges often encountered in time-series analysis, since traditional methods cannot be applied and the frequently used interpolation approach can corrupt the data and bias the subsequent analysis. Here we present the TrAnsformation-Cost Time-Series (TACTS) method, which allows us to analyze irregularly sampled data sets without degenerating the quality of the data set. Instead of using interpolation, we consider time-series segments and determine how close they are to each other by determining the cost needed to transform one segment into the following one. Using a limited set of operations (with associated costs) to transform the time series segments, we determine a new time series, that is, our transformation-cost time series. This cost time series is regularly sampled and can be analyzed using standard methods. While our main interest is the analysis of paleoclimate data, we develop our method using numerical examples like the logistic map and the Rössler oscillator. The numerical data allow us to test the stability of our method against noise and for different irregular samplings. In addition, we provide guidance on how to choose the associated costs based on the time series at hand. The usefulness of the TACTS method is demonstrated using speleothem data from the Secret Cave in Borneo, which is a good proxy for paleoclimatic variability in the monsoon activity around the maritime continent.
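
    As a rough illustration of the transformation-cost idea, the sketch below splits an irregularly sampled signal into fixed time windows and charges a cost for time shifts, amplitude changes, and unmatched points between consecutive windows. It is a deliberately simplified stand-in (order-based point pairing and arbitrary cost weights lam_t, lam_v, lam_n are assumptions), not the optimal-alignment procedure of the TACTS method itself.

```python
import numpy as np

def segment(times, values, window):
    """Split an irregularly sampled series into consecutive time windows."""
    edges = np.arange(times.min(), times.max() + window, window)
    segs = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (times >= lo) & (times < hi)
        segs.append((times[m] - lo, values[m]))   # times relative to window start
    return segs

def transformation_cost(seg_a, seg_b, lam_t=1.0, lam_v=1.0, lam_n=1.0):
    """Simplified cost of turning one segment into the next: pair points in
    order, charge for time shifts and amplitude changes, and add a fixed
    penalty for every point that has to be created or deleted."""
    ta, va = seg_a
    tb, vb = seg_b
    n = min(len(ta), len(tb))
    return (lam_t * np.abs(ta[:n] - tb[:n]).sum()
            + lam_v * np.abs(va[:n] - vb[:n]).sum()
            + lam_n * abs(len(ta) - len(tb)))

# Toy irregularly sampled signal
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 100, 400))
x = np.sin(0.3 * t) + 0.1 * rng.normal(size=t.size)

segs = segment(t, x, window=5.0)
tacts = np.array([transformation_cost(a, b) for a, b in zip(segs[:-1], segs[1:])])
print("regularly sampled cost series of length", tacts.size)
```

    The resulting cost series has one value per window boundary and is therefore regularly sampled, which is the property the method exploits.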

  13. Transformation-cost time-series method for analyzing irregularly sampled data

    Science.gov (United States)

    Ozken, Ibrahim; Eroglu, Deniz; Stemler, Thomas; Marwan, Norbert; Bagci, G. Baris; Kurths, Jürgen

    2015-06-01

    Irregular sampling of data sets is one of the challenges often encountered in time-series analysis, since traditional methods cannot be applied and the frequently used interpolation approach can corrupt the data and bias the subsequent analysis. Here we present the TrAnsformation-Cost Time-Series (TACTS) method, which allows us to analyze irregularly sampled data sets without degenerating the quality of the data set. Instead of using interpolation, we consider time-series segments and determine how close they are to each other by determining the cost needed to transform one segment into the following one. Using a limited set of operations (with associated costs) to transform the time series segments, we determine a new time series, that is, our transformation-cost time series. This cost time series is regularly sampled and can be analyzed using standard methods. While our main interest is the analysis of paleoclimate data, we develop our method using numerical examples like the logistic map and the Rössler oscillator. The numerical data allow us to test the stability of our method against noise and for different irregular samplings. In addition, we provide guidance on how to choose the associated costs based on the time series at hand. The usefulness of the TACTS method is demonstrated using speleothem data from the Secret Cave in Borneo, which is a good proxy for paleoclimatic variability in the monsoon activity around the maritime continent.

  14. Sampling and Timing: A Task for the Environmetal Process

    NARCIS (Netherlands)

    Hilderink, G.H.; Broenink, Johannes F.

    2003-01-01

    Sampling and timing is considered a responsibility of the environment of controller software. In this paper we will illustrate a concept whereby an environmental process and multi-way events play an important role in applying timing for untimed CSP software architectures. We use this timing concept

  15. 105-DR Large sodium fire facility soil sampling data evaluation report

    International Nuclear Information System (INIS)

    Adler, J.G.

    1996-01-01

    This report evaluates the soil sampling activities, soil sample analysis, and soil sample data associated with the closure activities at the 105-DR Large Sodium Fire Facility. The evaluation compares these activities to the regulatory requirements for meeting clean closure. The report concludes that there is no soil contamination from the waste treatment activities

  16. Performance Evaluation of Commercial Dengue Diagnostic Tests for Early Detection of Dengue in Clinical Samples

    Directory of Open Access Journals (Sweden)

    Tuan Nur Akmalina Mat Jusoh

    2017-01-01

    Full Text Available The shattering rise in dengue virus infections globally has created a need for an accurate and validated rapid diagnostic test for this virus. Rapid diagnostic tests (RDT) and reverse transcription-polymerase chain reaction (RT-PCR) diagnostic detection are useful tools for diagnosis of early dengue infection. We prospectively evaluated the diagnostic performance of nonstructural 1 (NS1) RDT and real-time RT-PCR diagnostic kits in 86 patient serum samples. Thirty-six samples were positive for dengue NS1 antigen while the remaining 50 were negative when tested with enzyme-linked immunosorbent assay (ELISA). Commercially available RDTs for NS1 detection, RTK ProDetect™ and SD Bioline, showed high sensitivities of 94% and 89%, respectively, compared with ELISA. The GenoAmp® Trioplex Real-Time RT-PCR and RealStar® Dengue RT-PCR tests presented a comparable kappa agreement of 0.722. The results obtained from the GenoAmp® Real-Time RT-PCR Dengue test showed that 14 samples harbored dengue virus type 1 (DENV-1), 8 samples harbored DENV-2, 2 samples harbored DENV-3, and 1 sample harbored DENV-4. One sample had a double infection with DENV-1 and DENV-2. The NS1 RDTs and real-time RT-PCR tests were found to be useful diagnostics for early and rapid diagnosis of acute dengue and an excellent surveillance tool in our battle against dengue.

  17. Comparative Evaluation of Three Homogenization Methods for Isolating Middle East Respiratory Syndrome Coronavirus Nucleic Acids From Sputum Samples for Real-Time Reverse Transcription PCR.

    Science.gov (United States)

    Sung, Heungsup; Yong, Dongeun; Ki, Chang Seok; Kim, Jae Seok; Seong, Moon Woo; Lee, Hyukmin; Kim, Mi Na

    2016-09-01

    Real-time reverse transcription PCR (rRT-PCR) of sputum samples is commonly used to diagnose Middle East respiratory syndrome coronavirus (MERS-CoV) infection. Owing to the difficulty of extracting RNA from sputum containing mucus, sputum homogenization is desirable prior to nucleic acid isolation. We determined optimal homogenization methods for isolating viral nucleic acids from sputum. We evaluated the following three sputum-homogenization methods: proteinase K and DNase I (PK-DNase) treatment, phosphate-buffered saline (PBS) treatment, and N-acetyl-L-cysteine and sodium citrate (NALC) treatment. Sputum samples were spiked with inactivated MERS-CoV culture isolates. RNA was extracted from pretreated, spiked samples using the easyMAG system (bioMérieux, France). Extracted RNAs were then subjected to rRT-PCR for MERS-CoV diagnosis (DiaPlex Q MERS-coronavirus, SolGent, Korea). While analyzing 15 spiked sputum samples prepared in technical duplicate, false-negative results were obtained with five (16.7%) and four samples (13.3%), respectively, by using the PBS and NALC methods. The range of threshold cycle (Ct) values observed when detecting upE in sputum samples was 31.1-35.4 with the PK-DNase method, 34.7-39.0 with the PBS method, and 33.9-38.6 with the NALC method. Compared with the control, which was prepared by adding a one-tenth volume of 1:1,000 diluted viral culture to PBS solution, the ranges of Ct values obtained by the PBS and NALC methods differed significantly from the mean control Ct of 33.2. These results suggest that the PK-DNase method is suitable for homogenizing sputum samples prior to RNA extraction.

  18. Target Tracking of a Linear Time Invariant System under Irregular Sampling

    Directory of Open Access Journals (Sweden)

    Jin Xue-Bo

    2012-11-01

    Full Text Available Due to event-triggered sampling in a system, or with the aim of reducing data storage, many tracking applications will encounter irregular sampling times. By calculating the matrix exponential using an inverse Laplace transform, this paper transforms the irregular-sampling tracking problem into the problem of tracking a system with time-varying parameters. Using the common Kalman filter, the developed method is applied to track a target on a simulated trajectory and in video tracking. The results of simulation experiments show that it can achieve good estimation performance even with highly irregular measurement sampling times.
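
    The core idea, turning irregular sampling intervals into a time-varying transition matrix, can be sketched as follows for a constant-velocity target. The model matrices, noise levels, and the use of scipy's expm (rather than the inverse-Laplace route described in the abstract) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.linalg import expm

# Continuous-time constant-velocity model: state x = [position, velocity]
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])      # dx/dt = A x
H = np.array([[1.0, 0.0]])      # only position is measured
Q = np.diag([1e-4, 1e-2])       # process noise level (assumed)
R = np.array([[0.5]])           # measurement noise (assumed)

def kalman_irregular(times, measurements, x0, P0):
    """Kalman filter whose transition matrix is recomputed from the elapsed
    time between consecutive, irregularly spaced samples."""
    x, P = x0.copy(), P0.copy()
    estimates, t_prev = [], times[0]
    for t, z in zip(times, measurements):
        dt = t - t_prev
        F = expm(A * dt)                     # time-varying transition matrix
        x = F @ x                            # predict
        P = F @ P @ F.T + Q * max(dt, 1e-9)  # crude dt-scaled process noise
        y = z - H @ x                        # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
        x = x + K @ y                        # update
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x.copy())
        t_prev = t
    return np.array(estimates)

# Irregularly spaced, noisy observations of a target moving at 1 m/s
rng = np.random.default_rng(0)
times = np.cumsum(rng.uniform(0.1, 2.0, size=50))
meas = 1.0 * times + rng.normal(0, 0.7, size=times.size)
est = kalman_irregular(times, meas, x0=np.array([0.0, 0.0]), P0=np.eye(2))
print("final position/velocity estimate:", est[-1])
```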

  19. Plasma phenylalanine and tyrosine responses to different nutritional conditions (fasting/postprandial) in patients with phenylketonuria: effect of sample timing.

    Science.gov (United States)

    van Spronsen, F J; van Rijn, M; van Dijk, T; Smit, G P; Reijngoud, D J; Berger, R; Heymans, H S

    1993-10-01

    To evaluate the adequacy of dietary treatment in patients with phenylketonuria, the monitoring of plasma phenylalanine and tyrosine concentrations is of great importance. The preferable time of blood sampling in relation to the nutritional condition during the day, however, is not known. It was the aim of this study to define guidelines for the timing of blood sampling with a minimal burden for the patient. Plasma concentrations of phenylalanine and tyrosine were measured in nine patients with phenylketonuria who had no clinical evidence of tyrosine deficiency. These values were measured during the day both after a prolonged overnight fast, and before and after breakfast. Phenylalanine showed a small rise during prolonged fasting, while tyrosine decreased slightly. After an individually tailored breakfast, phenylalanine remained stable, while tyrosine showed large fluctuations. It is concluded that the patient's nutritional condition (fasting/postprandial) is not important in the evaluation of the phenylalanine intake. To detect a possible tyrosine deficiency, however, a single blood sample is not sufficient and a combination of a preprandial and postprandial blood sample on the same day is advocated.

  20. Double sampling with multiple imputation to answer large sample meta-research questions: Introduction and illustration by evaluating adherence to two simple CONSORT guidelines

    Directory of Open Access Journals (Sweden)

    Patrice L. Capers

    2015-03-01

    Full Text Available BACKGROUND: Meta-research can involve manual retrieval and evaluation of research, which is resource intensive. Creation of high throughput methods (e.g., search heuristics, crowdsourcing) has improved feasibility of large meta-research questions, but possibly at the cost of accuracy. OBJECTIVE: To evaluate the use of double sampling combined with multiple imputation (DS+MI) to address meta-research questions, using as an example adherence of PubMed entries to two simple Consolidated Standards of Reporting Trials (CONSORT) guidelines for titles and abstracts. METHODS: For the DS large sample, we retrieved all PubMed entries satisfying the filters: RCT; human; abstract available; and English language (n=322,107). For the DS subsample, we randomly sampled 500 entries from the large sample. The large sample was evaluated with a lower rigor, higher throughput (RLOTHI) method using search heuristics, while the subsample was evaluated using a higher rigor, lower throughput (RHITLO) human rating method. Multiple imputation of the missing-completely-at-random RHITLO data for the large sample was informed by: RHITLO data from the subsample; RLOTHI data from the large sample; whether a study was an RCT; and country and year of publication. RESULTS: The RHITLO and RLOTHI methods in the subsample largely agreed (phi coefficients: title=1.00, abstract=0.92). Compliance with abstract and title criteria has increased over time, with non-US countries improving more rapidly. DS+MI logistic regression estimates were more precise than subsample estimates (e.g., 95% CI for change in title and abstract compliance by Year: subsample RHITLO 1.050-1.174 vs. DS+MI 1.082-1.151). As evidence of improved accuracy, DS+MI coefficient estimates were closer to RHITLO than the large sample RLOTHI. CONCLUSIONS: Our results support our hypothesis that DS+MI would result in improved precision and accuracy. This method is flexible and may provide a practical way to examine large corpora of
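
    A minimal sketch of the DS+MI idea follows: a cheap low-rigor flag exists for every entry, an expensive high-rigor rating only for a random subsample, and the missing high-rigor ratings are multiply imputed from a logistic model before pooling with Rubin's rules. The data sizes, variable names, and the simplified (single-model) imputation step are assumptions for illustration, not the authors' analysis.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(11)

# Synthetic stand-in for the corpus: every entry has a cheap low-rigor flag,
# only a random subsample receives the expensive high-rigor human rating.
N, n_sub, M = 20000, 500, 20
low_rigor = rng.binomial(1, 0.6, N)                       # screening result
true_high = (low_rigor * rng.binomial(1, 0.9, N)          # latent human rating
             + (1 - low_rigor) * rng.binomial(1, 0.2, N))
year = rng.integers(2000, 2015, N)
sub_idx = rng.choice(N, n_sub, replace=False)             # double-sampled subset

X = np.column_stack([low_rigor, year - year.mean()])
imputer = LogisticRegression().fit(X[sub_idx], true_high[sub_idx])

# Multiple imputation: draw M completed data sets and pool with Rubin's rules.
estimates, variances = [], []
for _ in range(M):
    p = imputer.predict_proba(X)[:, 1]
    completed = rng.binomial(1, p)
    completed[sub_idx] = true_high[sub_idx]               # keep observed ratings
    q = completed.mean()                                  # e.g. overall compliance
    estimates.append(q)
    variances.append(q * (1 - q) / N)
qbar = np.mean(estimates)
T = np.mean(variances) + (1 + 1 / M) * np.var(estimates, ddof=1)
print(f"pooled compliance estimate: {qbar:.3f} +/- {1.96 * np.sqrt(T):.3f}")
```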

  1. Manual versus automated streaking system in clinical microbiology laboratory: Performance evaluation of Previ Isola for blood culture and body fluid samples.

    Science.gov (United States)

    Choi, Qute; Kim, Hyun Jin; Kim, Jong Wan; Kwon, Gye Cheol; Koo, Sun Hoe

    2018-01-04

    The process of plate streaking has been automated to improve the routine workflow of clinical microbiology laboratories. Although there have been many evaluation reports on the inoculation of various body fluid samples, few evaluations have been reported for blood. In this study, we evaluated the performance of the automated inoculating system Previ Isola for various routine clinical samples including blood. Blood culture, body fluid, and urine samples were collected. All samples were inoculated on both sheep blood agar plates (BAP) and MacConkey agar plates (MCK) using Previ Isola and the manual method. We compared the two methods in terms of quality and quantity of cultures and sample processing time. To ensure objective colony counting, an enumeration reading reference was made through a preliminary experiment. A total of 377 nonduplicate samples (102 blood culture, 203 urine, 72 body fluid) were collected and inoculated. The concordance rate of quality was 100%, 97.0%, and 98.6% in blood, urine, and other body fluids, respectively. In the quantitative aspect, it was 98.0%, 97.0%, and 95.8%, respectively. The Previ Isola took a little longer to inoculate the specimens than the manual method, but the hands-on time decreased dramatically. The reduction in hands-on time using Previ Isola was about 6 minutes per 10 samples. We demonstrated that the Previ Isola showed high concordance with the manual method in the inoculation of various body fluids, especially blood culture samples. The use of Previ Isola in clinical microbiology laboratories is expected to save considerable time and human resources. © 2018 Wiley Periodicals, Inc.

  2. Assessing the precision of a time-sampling-based study among GPs: balancing sample size and measurement frequency.

    Science.gov (United States)

    van Hassel, Daniël; van der Velden, Lud; de Bakker, Dinny; van der Hoek, Lucas; Batenburg, Ronald

    2017-12-04

    Our research is based on a technique for time sampling, an innovative method for measuring the working hours of Dutch general practitioners (GPs), which was deployed in an earlier study. In this study, 1051 GPs were questioned about their activities in real time by sending them one SMS text message every 3 h during 1 week. The required sample size for this study is important for health workforce planners to know if they want to apply this method to target groups who are hard to reach or if fewer resources are available. In this time-sampling method, however, standard power analysis is not sufficient for calculating the required sample size, as it accounts only for sample fluctuation and not for the fluctuation of measurements taken from every participant. We investigated the impact of the number of participants and the frequency of measurements per participant upon the confidence intervals (CIs) for the hours worked per week. Statistical analyses of the time-use data we obtained from GPs were performed. Ninety-five percent CIs were calculated, using equations and simulation techniques, for various numbers of GPs included in the dataset and for various frequencies of measurements per participant. Our results showed that the one-tailed CI, including sample and measurement fluctuation, decreased from 21 to 3 h as the number of GPs increased from one to 50. Beyond this point, precision continued to improve, but the gain from each additional GP became smaller. Likewise, the analyses showed how the number of participants required decreased if more measurements per participant were taken. For example, one measurement per 3-h time slot during the week requires 300 GPs to achieve a CI of 1 h, while one measurement per hour requires 100 GPs to obtain the same result. The sample size needed for time-use research based on a time-sampling technique depends on the design and aim of the study. In this paper, we showed how the precision of the
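
    The trade-off between the number of participants and the number of probes per participant can be explored with a small Monte Carlo sketch like the one below. The assumed distribution of true weekly hours and the binomial response model are illustrative choices, not the equations used in the study.

```python
import numpy as np

rng = np.random.default_rng(42)

WEEK_HOURS = 7 * 24                    # hours in the survey week
TRUE_MEAN, TRUE_SD = 45.0, 10.0        # assumed spread of true weekly working hours

def ci_width(n_gps, n_probes, n_rep=2000):
    """Monte Carlo width of the 95% CI for mean weekly hours when each GP
    answers n_probes random 'are you working right now?' text messages."""
    estimates = []
    for _ in range(n_rep):
        true_hours = rng.normal(TRUE_MEAN, TRUE_SD, n_gps).clip(0, WEEK_HOURS)
        p_working = true_hours / WEEK_HOURS       # chance a probe hits work time
        share = rng.binomial(n_probes, p_working) / n_probes
        estimates.append((share * WEEK_HOURS).mean())
    lo, hi = np.percentile(estimates, [2.5, 97.5])
    return hi - lo

for n_gps in (50, 100, 300):
    for n_probes in (56, 168):         # one probe per 3-h slot vs one per hour
        print(f"{n_gps:4d} GPs, {n_probes:3d} probes/GP -> "
              f"95% CI width {ci_width(n_gps, n_probes):.1f} h")
```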

  3. Non-Cartesian MRI scan time reduction through sparse sampling

    NARCIS (Netherlands)

    Wajer, F.T.A.W.

    2001-01-01

    Non-Cartesian MRI Scan-Time Reduction through Sparse Sampling Magnetic resonance imaging (MRI) signals are measured in the Fourier domain, also called k-space. Samples of the MRI signal can not be taken at will, but lie along k-space trajectories determined by the magnetic field gradients. MRI

  4. PAVENET OS: A Compact Hard Real-Time Operating System for Precise Sampling in Wireless Sensor Networks

    Science.gov (United States)

    Saruwatari, Shunsuke; Suzuki, Makoto; Morikawa, Hiroyuki

    The paper presents a compact hard real-time operating system for wireless sensor nodes called PAVENET OS. PAVENET OS provides hybrid multithreading: preemptive multithreading and cooperative multithreading. Both forms of multithreading are optimized for the two kinds of tasks found in wireless sensor networks, namely real-time tasks and best-effort tasks. PAVENET OS can efficiently perform hard real-time tasks that cannot be performed by TinyOS. The paper demonstrates through quantitative evaluation that the hybrid multithreading achieves compactness and low overheads comparable to those of TinyOS. The evaluation results show that PAVENET OS performs 100 Hz sensor sampling with 0.01% jitter while performing wireless communication tasks, whereas optimized TinyOS has 0.62% jitter. In addition, PAVENET OS has a small footprint and low overheads (minimum RAM size: 29 bytes, minimum ROM size: 490 bytes, minimum task switch time: 23 cycles).
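
    For context, sampling jitter of a periodic task is often summarized as the deviation of the observed sampling period from its nominal value. A minimal sketch of one such convention is shown below; it is not necessarily the definition used by the PAVENET OS authors.

```python
import numpy as np

def sampling_jitter_percent(timestamps, nominal_period):
    """Jitter of a periodic sampling task, taken here as the maximum deviation
    of the observed period from the nominal period, relative to the nominal."""
    periods = np.diff(timestamps)
    return 100.0 * np.max(np.abs(periods - nominal_period)) / nominal_period

# Simulated timestamps of 100 Hz sampling with a little timing noise (seconds)
rng = np.random.default_rng(0)
ts = np.cumsum(np.full(1000, 0.01) + rng.normal(0, 2e-7, 1000))
print(f"jitter: {sampling_jitter_percent(ts, 0.01):.3f} %")
```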

  5. Influenza virus drug resistance: a time-sampled population genetics perspective.

    Directory of Open Access Journals (Sweden)

    Matthieu Foll

    2014-02-01

    Full Text Available The challenge of distinguishing genetic drift from selection remains a central focus of population genetics. Time-sampled data may provide a powerful tool for distinguishing these processes, and we here propose approximate Bayesian, maximum likelihood, and analytical methods for the inference of demography and selection from time course data. Utilizing these novel statistical and computational tools, we evaluate whole-genome datasets of an influenza A H1N1 strain in the presence and absence of oseltamivir (an inhibitor of neuraminidase) collected at thirteen time points. Results reveal a striking consistency amongst the three estimation procedures developed, showing strongly increased selection pressure in the presence of drug treatment. Importantly, these approaches re-identify the known oseltamivir resistance site, successfully validating the approaches used. Enticingly, a number of previously unknown variants have also been identified as being positively selected. Results are interpreted in the light of Fisher's Geometric Model, allowing for a quantification of the increased distance to optimum exerted by the presence of drug, and theoretical predictions regarding the distribution of beneficial fitness effects of contending mutations are empirically tested. Further, given the fit to expectations of the Geometric Model, results suggest the ability to predict certain aspects of viral evolution in response to changing host environments and novel selective pressures.
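
    A toy version of inferring selection from time-sampled allele frequencies is sketched below: Wright-Fisher trajectories with selection are simulated, and a rejection-style approximate Bayesian computation step keeps the selection coefficients that reproduce the observed trajectory. Population size, prior, and tolerance are assumptions for illustration, and the sketch leaves out the demographic inference and sequencing noise handled by the methods in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

def wright_fisher_trajectory(n_gen, pop_size, s, p0):
    """Allele-frequency trajectory under Wright-Fisher drift with selection:
    deterministic selection step followed by a binomial drift step."""
    p, traj = p0, [p0]
    for _ in range(n_gen):
        p_sel = p * (1 + s) / (p * (1 + s) + (1 - p))    # selection
        p = rng.binomial(pop_size, p_sel) / pop_size     # drift
        traj.append(p)
    return np.array(traj)

def abc_estimate_s(observed, pop_size, p0, n_sims=20000, tol=0.05):
    """Rejection ABC: keep the selection coefficients whose simulated
    trajectories stay within an RMSE of tol of the observed frequencies."""
    accepted = []
    for _ in range(n_sims):
        s = rng.uniform(-0.2, 0.5)                       # prior on s
        sim = wright_fisher_trajectory(len(observed) - 1, pop_size, s, p0)
        if np.sqrt(np.mean((sim - observed) ** 2)) < tol:
            accepted.append(s)
    return np.array(accepted)

# "Observed" trajectory generated with a known s = 0.15 to check the estimator
obs = wright_fisher_trajectory(12, pop_size=1000, s=0.15, p0=0.1)
post = abc_estimate_s(obs, pop_size=1000, p0=0.1)
print(f"accepted draws: {post.size}, posterior mean s = {post.mean():.3f}")
```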

  6. Sensitive time-resolved fluoroimmunoassay for quantitative determination of clothianidin in agricultural samples.

    Science.gov (United States)

    Li, Ming; Sheng, Enze; Yuan, Yulong; Liu, Xiaofeng; Hua, Xiude; Wang, Minghua

    2014-05-01

    Europium (Eu(3+))-labeled antibody was used as a fluorescent label to develop a highly sensitive time-resolved fluoroimmunoassay (TRFIA) for determination of clothianidin residues in agricultural samples. Toward this goal, the Eu(3+)-labeled polyclonal antibody and goat anti-rabbit antibody were prepared for developing and evaluating direct competitive TRFIA (dc-TRFIA) and indirect competitive TRFIA (ic-TRFIA). Under optimal conditions, the half-maximal inhibition concentration (IC50) and the limit of detection (LOD, IC10) of clothianidin were 9.20 and 0.0909 μg/L for the dc-TRFIA and 2.07 and 0.0220 μg/L for the ic-TRFIA, respectively. The ic-TRFIA has no obvious cross-reactivity with the analogues of clothianidin except for dinotefuran. The average recoveries of clothianidin from spiked water, soil, cabbage, and rice samples were estimated to range from 74.1 to 115.9 %, with relative standard deviations of 3.3 to 11.7 %. The results of TRFIA for the blind samples were largely consistent with gas chromatography (R (2) = 0.9902). The optimized ic-TRFIA might become a sensitive and satisfactory analytical method for the quantitative monitoring of clothianidin residues in agricultural samples.
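
    Assay figures such as the IC50 and the IC10-based limit of detection are typically read off a four-parameter logistic standard curve; a generic sketch of that calculation is shown below, using hypothetical calibration values rather than the data of this study.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, top, bottom, ic50, hill):
    """Four-parameter logistic used for competitive immunoassay standard curves."""
    return bottom + (top - bottom) / (1.0 + (x / ic50) ** hill)

# Hypothetical calibration points: standard concentration (ug/L) vs normalized signal
conc = np.array([0.01, 0.03, 0.1, 0.3, 1, 3, 10, 30, 100])
signal = np.array([0.99, 0.97, 0.92, 0.82, 0.63, 0.40, 0.22, 0.11, 0.05])

params, _ = curve_fit(four_pl, conc, signal, p0=[1.0, 0.0, 1.0, 1.0],
                      bounds=([0.5, -0.5, 1e-3, 0.1], [1.5, 0.5, 1e3, 5.0]))
top, bottom, ic50, hill = params

def concentration_at(y):
    """Concentration producing a given normalized signal on the fitted curve."""
    return ic50 * ((top - bottom) / (y - bottom) - 1.0) ** (1.0 / hill)

print(f"IC50 = {ic50:.2f} ug/L")
print(f"LOD (IC10, 10% inhibition) = {concentration_at(top - 0.1 * (top - bottom)):.3f} ug/L")
```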

  7. Multi-scale sampling to evaluate assemblage dynamics in an oceanic marine reserve.

    Science.gov (United States)

    Thompson, Andrew R; Watson, William; McClatchie, Sam; Weber, Edward D

    2012-01-01

    To resolve the capacity of Marine Protected Areas (MPA) to enhance fish productivity it is first necessary to understand how environmental conditions affect the distribution and abundance of fishes independent of potential reserve effects. Baseline fish production was examined from 2002-2004 through ichthyoplankton sampling in a large (10,878 km(2)) Southern Californian oceanic marine reserve, the Cowcod Conservation Area (CCA) that was established in 2001, and the Southern California Bight as a whole (238,000 km(2) CalCOFI sampling domain). The CCA assemblage changed through time as the importance of oceanic-pelagic species decreased between 2002 (La Niña) and 2003 (El Niño) and then increased in 2004 (El Niño), while oceanic species and rockfishes displayed the opposite pattern. By contrast, the CalCOFI assemblage was relatively stable through time. Depth, temperature, and zooplankton explained more of the variability in assemblage structure at the CalCOFI scale than they did at the CCA scale. CalCOFI sampling revealed that oceanic species impinged upon the CCA between 2002 and 2003 in association with warmer offshore waters, thus explaining the increased influence of these species in the CCA during the El Nino years. Multi-scale, spatially explicit sampling and analysis was necessary to interpret assemblage dynamics in the CCA and likely will be needed to evaluate other focal oceanic marine reserves throughout the world.

  8. In-Sample Confidence Bands and Out-of-Sample Forecast Bands for Time-Varying Parameters in Observation Driven Models

    NARCIS (Netherlands)

    Blasques, F.; Koopman, S.J.; Lasak, K.A.; Lucas, A.

    2016-01-01

    We study the performances of alternative methods for calculating in-sample confidence and out-of-sample forecast bands for time-varying parameters. The in-sample bands reflect parameter uncertainty, while the out-of-sample bands reflect not only parameter uncertainty, but also innovation

  9. Quality evaluation of processed clay soil samples.

    Science.gov (United States)

    Steiner-Asiedu, Matilda; Harrison, Obed Akwaa; Vuvor, Frederick; Tano-Debrah, Kwaku

    2016-01-01

    This study assessed the microbial quality of clay samples sold in two of the major Ghanaian markets. The study was a cross-sectional evaluation of processed clay and the effects it has on the nutrition of consumers in the capital of Ghana. The items examined were processed clay soil samples. Staphylococcus spp. and fecal coliforms, including Klebsiella, Escherichia, Shigella and Enterobacter spp., were isolated from the clay samples. Samples from the Kaneshie market in Accra recorded the highest total viable count (6.5 Log cfu/g) and staphylococcal count (5.8 Log cfu/g). For fecal coliforms, Madina market samples had the highest count (6.5 Log cfu/g) and also recorded the highest levels of yeast and mould. For Koforidua, the total viable count was highest in the samples from the Zongo market (6.3 Log cfu/g). Central market samples had the highest count of fecal coliforms (4.6 Log cfu/g) and yeasts and moulds (6.5 Log cfu/g). "Small" market recorded the highest staphylococcal count (6.2 Log cfu/g). The water activity of the clay samples was low, ranging between 0.65±0.01 and 0.66±0.00 for samples collected from Koforidua and Accra, respectively. The clay samples were found to contain Klebsiella spp., Escherichia, Enterobacter, Shigella spp., Staphylococcus spp., yeast and mould. These have health implications when consumed.

  10. Effects of brief time delays on matching-to-sample abilities in capuchin monkeys (Sapajus spp.).

    Science.gov (United States)

    Truppa, Valentina; De Simone, Diego Antonio; Piano Mortari, Eva; De Lillo, Carlo

    2014-09-01

    Traditionally, studies of delayed matching-to-sample (DMTS) tasks in nonhuman species have focused on the assessment of the limits of the retrieval of information stored in short- and long-term memory systems. However, it is still unclear if visual recognition in these tasks is affected by very brief delay intervals, which are typically used to study rapidly decaying types of visual memory. This study aimed at evaluating if tufted capuchin monkeys' ability to recognise visual stimuli in a DMTS task is affected by (i) the disappearance of the sample stimulus and (ii) the introduction of delay intervals (0.5, 1.0, 2.0 and 3.0s) between the disappearance of the sample and the presentation of the comparison stimuli. The results demonstrated that the simple disappearance of the sample and the introduction of a delay of 0.5s did not affect capuchins' performance either in terms of accuracy or response time. A delay interval of 1.0s produced a significant increase in response time but still did not affect recognition accuracy. By contrast, delays of 2.0 and 3.0s determined a significant increase in response time and a reduction in recognition accuracy. These findings indicate the existence in capuchin monkeys of processes enabling a very accurate retention of stimulus features within time frames comparable to those reported for humans' sensory memory (0.5-1.0s). The extent to which such processes can be considered analogous to the sensory memory processes observed in human visual cognition is discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Females' sampling strategy to comparatively evaluate prospective mates in the peacock blenny Salaria pavo

    Science.gov (United States)

    Locatello, Lisa; Rasotto, Maria B.

    2017-08-01

    Emerging evidence suggests the occurrence of comparative decision-making processes in mate choice, questioning the traditional idea of female choice based on rules of absolute preference. In such a scenario, females are expected to use a typical best-of- n sampling strategy, being able to recall previous sampled males based on memory of their quality and location. Accordingly, the quality of preferred mate is expected to be unrelated to both the number and the sequence of female visits. We found support for these predictions in the peacock blenny, Salaria pavo, a fish where females have the opportunity to evaluate the attractiveness of many males in a short time period and in a restricted spatial range. Indeed, even considering the variability in preference among females, most of them returned to previous sampled males for further evaluations; thus, the preferred male did not represent the last one in the sequence of visited males. Moreover, there was no relationship between the attractiveness of the preferred male and the number of further visits assigned to the other males. Our results suggest the occurrence of a best-of- n mate sampling strategy in the peacock blenny.

  12. Evaluating leaf litter beetle data sampled by Winkler extraction from Atlantic forest sites in southern Brazil

    Directory of Open Access Journals (Sweden)

    Philipp Werner Hopp

    2011-06-01

    Full Text Available Evaluating leaf litter beetle data sampled by Winkler extraction from Atlantic forest sites in southern Brazil. To evaluate the reliability of data obtained by Winkler extraction in Atlantic forest sites in southern Brazil, we studied litter beetle assemblages in secondary forests (5 to 55 years after abandonment) and old-growth forests at two seasonally different points in time. For all regeneration stages, species density and abundance were lower in April compared to August; but, assemblage composition of the corresponding forest stages was similar in both months. We suggest that sampling of small litter inhabiting beetles at different points in time using the Winkler technique reveals identical ecological patterns, which are more likely to be influenced by sample incompleteness than by differences in their assemblage composition. A strong relationship between litter quantity and beetle occurrences indicates the importance of this variable for the temporal species density pattern. Additionally, the sampled beetle material was compared with beetle data obtained with pitfall traps in one old-growth forest. Over 60% of the focal species captured with pitfall traps were also sampled by Winkler extraction in different forest stages. Few beetles with a body size too large to be sampled by Winkler extraction were only sampled with pitfall traps. This indicates that the local litter beetle fauna is dominated by small species. Hence, being aware of the exclusion of large beetles and beetle species occurring during the wet season, the Winkler method reveals a reliable picture of the local leaf litter beetle community.

  13. Weighted statistical parameters for irregularly sampled time series

    Science.gov (United States)

    Rimoldini, Lorenzo

    2014-01-01

    Unevenly spaced time series are common in astronomy because of the day-night cycle, weather conditions, dependence on the source position in the sky, allocated telescope time and corrupt measurements, for example, or inherent to the scanning law of satellites like Hipparcos and the forthcoming Gaia. Irregular sampling often causes clumps of measurements and gaps with no data which can severely disrupt the values of estimators. This paper aims at improving the accuracy of common statistical parameters when linear interpolation (in time or phase) can be considered an acceptable approximation of a deterministic signal. A pragmatic solution is formulated in terms of a simple weighting scheme, adapting to the sampling density and noise level, applicable to large data volumes at minimal computational cost. Tests on time series from the Hipparcos periodic catalogue led to significant improvements in the overall accuracy and precision of the estimators with respect to the unweighted counterparts and those weighted by inverse-squared uncertainties. Automated classification procedures employing statistical parameters weighted by the suggested scheme confirmed the benefits of the improved input attributes. The classification of eclipsing binaries, Mira, RR Lyrae, Delta Cephei and Alpha2 Canum Venaticorum stars employing exclusively weighted descriptive statistics achieved an overall accuracy of 92 per cent, about 6 per cent higher than with unweighted estimators.
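
    One simple interval-based variant of such a weighting scheme is sketched below: each observation is weighted by the span of time it 'covers' (half the gap to each neighbour), so clumps of measurements are down-weighted. The scheme in the paper also adapts to the noise level; the version here is a reduced illustration.

```python
import numpy as np

def interval_weights(times):
    """Weights proportional to the time interval covered by each observation,
    so densely clumped points contribute less than isolated ones."""
    t = np.asarray(times, dtype=float)
    gaps = np.diff(t)
    w = np.empty_like(t)
    w[0], w[-1] = gaps[0] / 2, gaps[-1] / 2
    w[1:-1] = (gaps[:-1] + gaps[1:]) / 2
    return w / w.sum()

def weighted_mean_var(values, weights):
    """Weighted mean and reliability-weighted (unbiased-style) variance."""
    x, w = np.asarray(values, float), np.asarray(weights, float)
    m = np.sum(w * x)
    var = np.sum(w * (x - m) ** 2) / (1.0 - np.sum(w ** 2))
    return m, var

# Irregular sampling with a dense clump: the unweighted mean is biased toward
# the clump, the interval-weighted mean largely corrects for it.
rng = np.random.default_rng(3)
t = np.sort(np.concatenate([rng.uniform(0, 100, 40), rng.uniform(40, 42, 60)]))
x = np.sin(2 * np.pi * t / 50) + 0.05 * rng.normal(size=t.size)

w = interval_weights(t)
print("unweighted mean:        %.3f" % x.mean())
print("interval-weighted mean: %.3f" % weighted_mean_var(x, w)[0])
```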

  14. Degradation of hydrocarbons in soil samples analyzed within accepted analytical holding times

    International Nuclear Information System (INIS)

    Jackson, J.; Thomey, N.; Dietlein, L.F.

    1992-01-01

    Samples which are collected in conjunction with subsurface investigations at leaking petroleum storage tank sites and petroleum refineries are routinely analyzed for benzene, toluene, ethylbenzene, xylenes (BTEX), and total petroleum hydrocarbons (TPH). Water samples are preserved by the addition of hydrochloric acid and maintained at four degrees centigrade prior to analysis. This is done to prevent bacterial degradation of hydrocarbons. Chemical preservation is not presently performed on soil samples. Instead, the samples are cooled and maintained at four degrees centigrade. This study was done to measure the degree of degradation of hydrocarbons in soil samples which are analyzed within accepted holding times. Soil samples were collected and representative subsamples were prepared from the initial sample. Subsamples were analyzed in triplicate for BTEX and TPH throughout the length of the approved holding times to measure the extent of sample constituent degradation prior to analysis. Findings imply that for sandy soils, BTEX and TPH concentrations can be highly dependent upon the length of time which elapses between sample collection and analysis

  15. Evaluation of sample preparation protocols for spider venom profiling by MALDI-TOF MS.

    Science.gov (United States)

    Bočánek, Ondřej; Šedo, Ondrej; Pekár, Stano; Zdráhal, Zbyněk

    2017-07-01

    Spider venoms are highly complex mixtures containing biologically active substances with potential for use in biotechnology or pharmacology. Fingerprinting of venoms by Matrix-Assisted Laser Desorption-Ionization - Time of Flight Mass Spectrometry (MALDI-TOF MS) is a thriving technology, enabling the rapid detection of peptide/protein components that can provide comparative information. In this study, we evaluated the effects of sample preparation procedures on MALDI-TOF mass spectral quality to establish a protocol providing the most reliable analytical outputs. We adopted initial sample preparation conditions from studies already published in this field. Three different MALDI matrixes, three matrix solvents, two sample deposition methods, and different acid concentrations were tested. As a model sample, venom from Brachypelma albopilosa was used. The mass spectra were evaluated on the basis of absolute and relative signal intensities, and signal resolution. By conducting three series of analyses at three weekly intervals, the reproducibility of the mass spectra were assessed as a crucial factor in the selection for optimum conditions. A sample preparation protocol based on the use of an HCCA matrix dissolved in 50% acetonitrile with 2.5% TFA deposited onto the target by the dried-droplet method was found to provide the best results in terms of information yield and repeatability. We propose that this protocol should be followed as a standard procedure, enabling the comparative assessment of MALDI-TOF MS spider venom fingerprints. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Implementering Run-time Evaluation of Distributed Timing Constraints in a Micro Kernel

    DEFF Research Database (Denmark)

    Kristensen, C.H.; Drejer, N.; Nielsen, Jens Frederik Dalsgaard

    In the present paper we describe a solution to the problem of implementing time-optimal evaluation of timing constraints in distributed real-time systems...

  17. Advancement of Solidification Processing Technology Through Real Time X-Ray Transmission Microscopy: Sample Preparation

    Science.gov (United States)

    Stefanescu, D. M.; Curreri, P. A.

    1996-01-01

    Two types of samples were prepared for the real-time X-ray transmission microscopy (XTM) characterization. In the first series, directional solidification experiments were carried out to evaluate the critical velocity of engulfment of zirconia particles in the Al and Al-Ni eutectic matrix under ground (1-g) conditions. The particle distribution in the samples was recorded on video before and after the samples were directionally solidified. In the second series, samples of the above two types of composites were prepared for directional solidification runs to be carried out on the Advanced Gradient Heating Facility (AGHF) aboard the space shuttle during the LMS mission in June 1996. X-ray microscopy proved to be an invaluable tool for characterizing the particle distribution in the metal matrix samples. This kind of analysis helped to accurately determine the critical velocity of engulfment of ceramic particles by the melt interface in the opaque metal matrix composites. The quality of the cast samples with respect to porosity and instrumented thermocouple sheath breakage or shift could be easily viewed and thus helped in selecting samples for the space shuttle experiments. Summarizing the merits of this technique, it can be stated that it enabled the use of cast metal matrix composite samples, since the particle location was known prior to the experiment.

  18. Investigation of Legionella Contamination in Bath Water Samples by Culture, Amoebic Co-Culture, and Real-Time Quantitative PCR Methods.

    Science.gov (United States)

    Edagawa, Akiko; Kimura, Akio; Kawabuchi-Kurata, Takako; Adachi, Shinichi; Furuhata, Katsunori; Miyamoto, Hiroshi

    2015-10-19

    We investigated Legionella contamination in bath water samples, collected from 68 bathing facilities in Japan, by culture, culture with amoebic co-culture, real-time quantitative PCR (qPCR), and real-time qPCR with amoebic co-culture. Using the conventional culture method, Legionella pneumophila was detected in 11 samples (11/68, 16.2%). Contrary to our expectation, the culture method with the amoebic co-culture technique did not increase the detection rate of Legionella (4/68, 5.9%). In contrast, a combination of the amoebic co-culture technique followed by qPCR successfully increased the detection rate (57/68, 83.8%) compared with real-time qPCR alone (46/68, 67.6%). Using real-time qPCR after culture with amoebic co-culture, more than 10-fold higher bacterial numbers were observed in 30 samples (30/68, 44.1%) compared with the same samples without co-culture. On the other hand, higher bacterial numbers were not observed after propagation by amoebae in 32 samples (32/68, 47.1%). Legionella was not detected in the remaining six samples (6/68, 8.8%), irrespective of the method. These results suggest that application of the amoebic co-culture technique prior to real-time qPCR may be useful for the sensitive detection of Legionella from bath water samples. Furthermore, a combination of amoebic co-culture and real-time qPCR might be useful to detect viable and virulent Legionella because their ability to invade and multiply within free-living amoebae is considered to correlate with their pathogenicity for humans. This is the first report evaluating the efficacy of the amoebic co-culture technique for detecting Legionella in bath water samples.

  19. Multi-scale sampling to evaluate assemblage dynamics in an oceanic marine reserve.

    Directory of Open Access Journals (Sweden)

    Andrew R Thompson

    Full Text Available To resolve the capacity of Marine Protected Areas (MPA) to enhance fish productivity it is first necessary to understand how environmental conditions affect the distribution and abundance of fishes independent of potential reserve effects. Baseline fish production was examined from 2002-2004 through ichthyoplankton sampling in a large (10,878 km(2)) Southern Californian oceanic marine reserve, the Cowcod Conservation Area (CCA) that was established in 2001, and the Southern California Bight as a whole (238,000 km(2) CalCOFI sampling domain). The CCA assemblage changed through time as the importance of oceanic-pelagic species decreased between 2002 (La Niña) and 2003 (El Niño) and then increased in 2004 (El Niño), while oceanic species and rockfishes displayed the opposite pattern. By contrast, the CalCOFI assemblage was relatively stable through time. Depth, temperature, and zooplankton explained more of the variability in assemblage structure at the CalCOFI scale than they did at the CCA scale. CalCOFI sampling revealed that oceanic species impinged upon the CCA between 2002 and 2003 in association with warmer offshore waters, thus explaining the increased influence of these species in the CCA during the El Niño years. Multi-scale, spatially explicit sampling and analysis was necessary to interpret assemblage dynamics in the CCA and likely will be needed to evaluate other focal oceanic marine reserves throughout the world.

  20. Evaluation of Stress Loaded Steel Samples Using Selected Electromagnetic Methods

    International Nuclear Information System (INIS)

    Chady, T.

    2004-01-01

    In this paper the magnetic leakage flux and eddy current method were used to evaluate changes of materials' properties caused by stress. Seven samples made of ferromagnetic material with different level of applied stress were prepared. First, the leakage magnetic fields were measured by scanning the surface of the specimens with GMR gradiometer. Next, the same samples were evaluated using an eddy current sensor. A comparison between results obtained from both methods was carried out. Finally, selected parameters of the measured signal were calculated and utilized to evaluate level of the applied stress. A strong coincidence between amount of the applied stress and the maximum amplitude of the derivative was confirmed

  1. Development and Validation of a Real-Time PCR Assay for Rapid Detection of Candida auris from Surveillance Samples.

    Science.gov (United States)

    Leach, L; Zhu, Y; Chaturvedi, S

    2018-02-01

    Candida auris is an emerging multidrug-resistant yeast causing invasive health care-associated infection with high mortality worldwide. Rapid identification of C. auris is of primary importance for the implementation of public health measures to control the spread of infection. To achieve these goals, we developed and validated a TaqMan-based real-time PCR assay targeting the internal transcribed spacer 2 (ITS2) region of the ribosomal gene. The assay was highly specific, reproducible, and sensitive, with the detection limit of 1 C. auris CFU/PCR. The performance of the C. auris real-time PCR assay was evaluated by using 623 surveillance samples, including 365 patient swabs and 258 environmental sponges. Real-time PCR yielded positive results from 49 swab and 58 sponge samples, with 89% and 100% clinical sensitivity with regard to their respective culture-positive results. The real-time PCR also detected C. auris DNA from 1% and 12% of swab and sponge samples with culture-negative results, indicating the presence of dead or culture-impaired C. auris. The real-time PCR yielded results within 4 h of sample processing, compared to 4 to 14 days for culture, reducing turnaround time significantly. The new real-time PCR assay allows for accurate and rapid screening of C. auris and can increase effective control and prevention of this emerging multidrug-resistant fungal pathogen in health care facilities. Copyright © 2018 Leach et al.

  2. The significance of sampling time in therapeutic drug monitoring of clozapine

    DEFF Research Database (Denmark)

    Jakobsen, M I; Larsen, J R; Svensson, C K

    2017-01-01

    OBJECTIVE: Therapeutic drug monitoring (TDM) of clozapine is standardized to 12-h postdose samplings. In clinical settings, sampling time often deviates from this time point, although the importance of the deviation is unknown. To this end, serum concentrations (s-) of clozapine and its metabolite...... N-desmethyl-clozapine (norclozapine) were measured at 12 ± 1 and 2 h postdose. METHOD: Forty-six patients with a diagnosis of schizophrenia, and on stable clozapine treatment, were enrolled for hourly, venous blood sampling at 10-14 h postdose. RESULTS: Minor changes in median percentage values were...

  3. Process evaluation of treatment times in a large radiotherapy department

    International Nuclear Information System (INIS)

    Beech, R.; Burgess, K.; Stratford, J.

    2016-01-01

    Purpose/objective: The Department of Health (DH) recognises access to appropriate and timely radiotherapy (RT) services as crucial in improving cancer patient outcomes, especially when facing a predicted increase in cancer diagnosis. There is a lack of ‘real-time’ data regarding daily demand of a linear accelerator, the impact of increasingly complex techniques on treatment times, and whether current scheduling reflects time needed for RT delivery, which would be valuable in highlighting current RT provision. Material/methods: A systematic quantitative process evaluation was undertaken in a large regional cancer centre, including a satellite centre, between January and April 2014. Data collected included treatment room-occupancy time, RT site, RT and verification technique and patient mobility status. Data was analysed descriptively; average room-occupancy times were calculated for RT techniques and compared to historical standardised treatment times within the department. Results: Room-occupancy was recorded for over 1300 fractions, over 50% of which overran their allotted treatment time. In a focused sample of 16 common techniques, 10 overran their allocated timeslots. Verification increased room-occupancy by six minutes (50%) over non-imaging. Treatments for patients requiring mobility assistance took four minutes (29%) longer. Conclusion: The majority of treatments overran their standardised timeslots. Although technique advancement has reduced RT delivery time, room-occupancy has not necessarily decreased. Verification increases room-occupancy and needs to be considered when moving towards adaptive techniques. Mobility affects room-occupancy and will become increasingly significant in an ageing population. This evaluation assesses validity of current treatment times in this department, and can be modified and repeated as necessary. - Highlights: • A process evaluation examined room-occupancy for various radiotherapy techniques. • Appointment lengths

  4. Effects of statistical quality, sampling rate and temporal filtering techniques on the extraction of functional parameters from the left ventricular time-activity curves

    Energy Technology Data Exchange (ETDEWEB)

    Guignard, P.A.; Chan, W. (Royal Melbourne Hospital, Parkville (Australia). Dept. of Nuclear Medicine)

    1984-09-01

    Several techniques for the processing of a series of curves derived from two left ventricular time-activity curves acquired at rest and during exercise with a nuclear stethoscope were evaluated. They were three- and five-point time smoothing, Fourier filtering preserving one to four harmonics (H), truncated-curve Fourier filtering, and third-degree polynomial curve fitting. Each filter's ability to recover, with fidelity, systolic and diastolic function parameters was evaluated under increasingly 'noisy' conditions and at several sampling rates. Third-degree polynomial curve fitting and truncated Fourier filters exhibited very high sensitivity to noise. Three- and five-point time smoothing had moderate sensitivity to noise, but was highly affected by sampling rate. Fourier filtering preserving 2H or 3H produced the best compromise, with high resilience to noise and independence of sampling rate as far as the recovery of these functional parameters is concerned.
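
    A harmonic-preserving Fourier filter of the kind evaluated here can be sketched in a few lines: the curve is transformed, every coefficient above the chosen number of harmonics is zeroed, and the curve is transformed back. The synthetic time-activity curve and noise level below are assumptions for illustration only.

```python
import numpy as np

def fourier_filter(curve, n_harmonics):
    """Keep only the DC term and the first n_harmonics harmonics of a periodic
    curve, e.g. one averaged cardiac cycle of a time-activity curve."""
    spectrum = np.fft.rfft(curve)
    spectrum[n_harmonics + 1:] = 0.0            # zero everything above the cut-off
    return np.fft.irfft(spectrum, n=len(curve))

# Noisy synthetic left-ventricular time-activity curve over one cycle
rng = np.random.default_rng(5)
n = 64                                          # frames per cycle
t = np.linspace(0, 1, n, endpoint=False)
clean = 0.8 + 0.2 * np.cos(2 * np.pi * t)       # crude volume-like curve (assumed)
noisy = clean + 0.05 * rng.normal(size=n)

smooth = fourier_filter(noisy, n_harmonics=3)   # the 2H-3H range favoured above
print("rms error, raw curve : %.4f" % np.sqrt(np.mean((noisy - clean) ** 2)))
print("rms error, 3H filter : %.4f" % np.sqrt(np.mean((smooth - clean) ** 2)))
```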

  5. Effects of statistical quality, sampling rate and temporal filtering techniques on the extraction of functional parameters from the left ventricular time-activity curves

    International Nuclear Information System (INIS)

    Guignard, P.A.; Chan, W.

    1984-01-01

    Several techniques for the processing of a series of curves derived from two left ventricular time-activity curves acquired at rest and during exercise with a nuclear stethoscope were evaluated. They were three- and five-point time smoothing, Fourier filtering preserving one to four harmonics (H), truncated-curve Fourier filtering, and third-degree polynomial curve fitting. Each filter's ability to recover, with fidelity, systolic and diastolic function parameters was evaluated under increasingly 'noisy' conditions and at several sampling rates. Third-degree polynomial curve fitting and truncated Fourier filters exhibited very high sensitivity to noise. Three- and five-point time smoothing had moderate sensitivity to noise, but was highly affected by sampling rate. Fourier filtering preserving 2H or 3H produced the best compromise, with high resilience to noise and independence of sampling rate as far as the recovery of these functional parameters is concerned. (author)

  6. Response of soil aggregate stability to storage time of soil samples

    International Nuclear Information System (INIS)

    Gerzabek, M.H.; Roessner, H.

    1993-04-01

    The aim of the present study was to investigate the well-known phenomenon of changing aggregate stability values as a result of soil sample storage. In order to evaluate the impact of soil microbial activity, the soil sample was split into three subsamples. Two samples were sterilized by means of chloroform fumigation and gamma irradiation, respectively. However, the aggregate stability measurements at three different dates were not correlated with the microbial activity (dehydrogenase activity). The moisture content of the aggregate samples seems to be of higher significance. Samples with lower moisture content (range: 0.4 to 1.9%) exhibited higher aggregate stabilities. Thus, air-dried aggregate samples without further treatment do not seem to be suitable for standardized stability measurements. (authors)

  7. ACS sampling system: design, implementation, and performance evaluation

    Science.gov (United States)

    Di Marcantonio, Paolo; Cirami, Roberto; Chiozzi, Gianluca

    2004-09-01

    By means of the ACS (ALMA Common Software) framework we designed and implemented a sampling system which allows sampling of every Characteristic Component Property with a specific, user-defined, sustained frequency limited only by the hardware. Collected data are sent to various clients (one or more Java plotting widgets, a dedicated GUI or a COTS application) using the ACS/CORBA Notification Channel. The data transport is optimized: samples are cached locally and sent in packets with a lower, user-defined frequency to keep the network load under control. Simultaneous sampling of the Properties of different Components is also possible. Together with the design and implementation issues, we present the performance of the sampling system evaluated on two different platforms: on a VME-based system using the VxWorks RTOS (currently adopted by ALMA) and on a PC/104+ embedded platform using the Red Hat 9 Linux operating system. The PC/104+ solution offers, as an alternative, a low-cost PC-compatible hardware environment with a free and open operating system.

  8. Investigation of Legionella Contamination in Bath Water Samples by Culture, Amoebic Co-Culture, and Real-Time Quantitative PCR Methods

    Directory of Open Access Journals (Sweden)

    Akiko Edagawa

    2015-10-01

    Full Text Available We investigated Legionella contamination in bath water samples, collected from 68 bathing facilities in Japan, by culture, culture with amoebic co-culture, real-time quantitative PCR (qPCR), and real-time qPCR with amoebic co-culture. Using the conventional culture method, Legionella pneumophila was detected in 11 samples (11/68, 16.2%). Contrary to our expectation, the culture method with the amoebic co-culture technique did not increase the detection rate of Legionella (4/68, 5.9%). In contrast, a combination of the amoebic co-culture technique followed by qPCR successfully increased the detection rate (57/68, 83.8%) compared with real-time qPCR alone (46/68, 67.6%). Using real-time qPCR after culture with amoebic co-culture, more than 10-fold higher bacterial numbers were observed in 30 samples (30/68, 44.1%) compared with the same samples without co-culture. On the other hand, higher bacterial numbers were not observed after propagation by amoebae in 32 samples (32/68, 47.1%). Legionella was not detected in the remaining six samples (6/68, 8.8%), irrespective of the method. These results suggest that application of the amoebic co-culture technique prior to real-time qPCR may be useful for the sensitive detection of Legionella from bath water samples. Furthermore, a combination of amoebic co-culture and real-time qPCR might be useful to detect viable and virulent Legionella because their ability to invade and multiply within free-living amoebae is considered to correlate with their pathogenicity for humans. This is the first report evaluating the efficacy of the amoebic co-culture technique for detecting Legionella in bath water samples.

  9. Time Optimal Run-time Evaluation of Distributed Timing Constraints in Process Control Software

    DEFF Research Database (Denmark)

    Drejer, N.; Kristensen, C.H.

    1993-01-01

    This paper considers run-time evaluation of an important class of constraints; Timing constraints. These appear extensively in process control systems. Timing constraints are considered in distributed systems, i.e. systems consisting of multiple autonomous nodes......

  10. Evaluation of sampling plans for in-service inspection of steam generator tubes

    International Nuclear Information System (INIS)

    Kurtz, R.J.; Heasler, P.G.; Baird, D.B.

    1994-02-01

    This report summarizes the results of three previous studies to evaluate and compare the effectiveness of sampling plans for steam generator tube inspections. An analytical evaluation and Monte Carlo simulation techniques were the methods used to evaluate sampling plan performance. To test the performance of candidate sampling plans under a variety of conditions, ranges of inspection system reliability were considered along with different distributions of tube degradation. Results from the eddy current reliability studies performed with the retired-from-service Surry 2A steam generator were utilized to guide the selection of appropriate probability of detection and flaw sizing models for use in the analysis. Different distributions of tube degradation were selected to span the range of conditions that might exist in operating steam generators. The principal means of evaluating sampling performance was to determine the effectiveness of the sampling plan for detecting and plugging defective tubes. A summary of key results from the eddy current reliability studies is presented. The analytical and Monte Carlo simulation analyses are discussed along with a synopsis of key results and conclusions
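
    For readers unfamiliar with this type of analysis, the sketch below shows in generic form how a random sampling plan can be scored by Monte Carlo simulation against a probability-of-detection (POD) curve; the POD parameters, defect depths, tube counts and plan size are illustrative assumptions, not values from the report.

      # Generic Monte Carlo sketch of sampling-plan effectiveness for tube
      # inspection. The POD curve, defect sizes and plan parameters below are
      # illustrative assumptions, not values from the report.
      import numpy as np

      rng = np.random.default_rng(1)

      def pod(depth_frac):
          """Logistic probability of detection vs. flaw depth (assumed parameters)."""
          return 1.0 / (1.0 + np.exp(-(depth_frac - 0.4) / 0.08))

      n_tubes, n_defective, sample_size, n_trials = 3000, 60, 600, 2000
      found_fraction = np.empty(n_trials)

      for k in range(n_trials):
          depths = np.zeros(n_tubes)
          defective = rng.choice(n_tubes, size=n_defective, replace=False)
          depths[defective] = rng.uniform(0.2, 0.9, size=n_defective)  # depth as wall fraction

          sampled = rng.choice(n_tubes, size=sample_size, replace=False)
          detected = rng.random(sample_size) < pod(depths[sampled])
          # effectiveness: detected defective tubes / all defective tubes
          found_fraction[k] = np.count_nonzero(detected & (depths[sampled] > 0)) / n_defective

      print(f"mean fraction of defective tubes found: {found_fraction.mean():.2f}")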

  11. Sampling methods for rumen microbial counts by Real-Time PCR techniques

    Directory of Open Access Journals (Sweden)

    S. Puppo

    2010-02-01

    Full Text Available Fresh rumen samples were withdrawn from 4 cannulated buffalo females fed a fibrous diet in order to quantify bacteria concentration in the rumen by Real-Time PCR techniques. To obtain DNA of a good quality from whole rumen fluid, eight (M1-M8) different pre-filtration methods (cheese cloths, glass-fibre and nylon filters) in combination with various centrifugation speeds (1000, 5000 and 14,000 rpm) were tested. Genomic DNA extraction was performed either on fresh or frozen samples (-20°C). The quantitative bacteria analysis was carried out according to the Real-Time PCR procedure for Butyrivibrio fibrisolvens reported in the literature. M5 proved to be the best sampling procedure, allowing suitable genomic DNA to be obtained. No differences were revealed between fresh and frozen samples.

  12. Evaluation of multiplex tandem real-time PCR for detection of Cryptosporidium spp., Dientamoeba fragilis, Entamoeba histolytica, and Giardia intestinalis in clinical stool samples.

    Science.gov (United States)

    Stark, D; Al-Qassab, S E; Barratt, J L N; Stanley, K; Roberts, T; Marriott, D; Harkness, J; Ellis, J T

    2011-01-01

    The aim of this study was to describe the first development and evaluation of a multiplex tandem PCR (MT-PCR) assay for the detection and identification of 4 common pathogenic protozoan parasites, Cryptosporidium spp., Dientamoeba fragilis, Entamoeba histolytica, and Giardia intestinalis, from human clinical samples. A total of 472 fecal samples submitted to the Department of Microbiology at St. Vincent's Hospital were included in the study. The MT-PCR assay was compared to four real-time PCR (RT-PCR) assays and microscopy by a traditional modified iron hematoxylin stain. The MT-PCR detected 28 G. intestinalis, 26 D. fragilis, 11 E. histolytica, and 9 Cryptosporidium sp. isolates. Detection and identification of the fecal protozoa by MT-PCR demonstrated 100% correlation with the RT-PCR results, and compared to RT-PCR, MT-PCR exhibited 100% sensitivity and specificity, while traditional microscopy of stained fixed fecal smears exhibited sensitivities and specificities of 56% and 100% for Cryptosporidium spp., 38% and 99% for D. fragilis, 47% and 97% for E. histolytica, and 50% and 100% for G. intestinalis. No cross-reactivity was detected in 100 stool samples containing various other bacterial, viral, and protozoan species. The MT-PCR assay was able to provide rapid, sensitive, and specific simultaneous detection and identification of the four most important diarrhea-causing protozoan parasites that infect humans. This study also highlights the lack of sensitivity demonstrated by microscopy, and thus, molecular methods such as MT-PCR must be considered the diagnostic methods of choice for enteric protozoan parasites.

  13. An open-chain imaginary-time path-integral sampling approach to the calculation of approximate symmetrized quantum time correlation functions

    Science.gov (United States)

    Cendagorta, Joseph R.; Bačić, Zlatko; Tuckerman, Mark E.

    2018-03-01

    We introduce a scheme for approximating quantum time correlation functions numerically within the Feynman path integral formulation. Starting with the symmetrized version of the correlation function expressed as a discretized path integral, we introduce a change of integration variables often used in the derivation of trajectory-based semiclassical methods. In particular, we transform to sum and difference variables between forward and backward complex-time propagation paths. Once the transformation is performed, the potential energy is expanded in powers of the difference variables, which allows us to perform the integrals over these variables analytically. The manner in which this procedure is carried out results in an open-chain path integral (in the remaining sum variables) with a modified potential that is evaluated using imaginary-time path-integral sampling rather than requiring the generation of a large ensemble of trajectories. Consequently, any number of path integral sampling schemes can be employed to compute the remaining path integral, including Monte Carlo, path-integral molecular dynamics, or enhanced path-integral molecular dynamics. We believe that this approach constitutes a different perspective in semiclassical-type approximations to quantum time correlation functions. Importantly, we argue that our approximation can be systematically improved within a cumulant expansion formalism. We test this approximation on a set of one-dimensional problems that are commonly used to benchmark approximate quantum dynamical schemes. We show that the method is at least as accurate as the popular ring-polymer molecular dynamics technique and linearized semiclassical initial value representation for correlation functions of linear operators in most of these examples and improves the accuracy of correlation functions of nonlinear operators.
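
    The following is a generic, textbook imaginary-time (ring-polymer) path-integral Monte Carlo estimate of <x^2> for a 1D harmonic oscillator, included only to illustrate the kind of path-integral sampling referred to above; it is not the open-chain scheme or modified potential of the paper, and all run parameters are arbitrary.

      # Textbook imaginary-time (ring-polymer) path-integral Monte Carlo for a 1D
      # harmonic oscillator (hbar = m = omega = 1), illustrating generic
      # path-integral sampling; this is NOT the open-chain scheme of the paper.
      import numpy as np

      rng = np.random.default_rng(2)
      beta, P, n_moves, step = 1.0, 16, 200_000, 0.25
      x = np.zeros(P)                           # bead positions of the ring polymer

      def bead_energy(xj, xprev, xnext):
          spring = 0.5 * P / beta * ((xj - xprev) ** 2 + (xj - xnext) ** 2)
          return spring + (beta / P) * 0.5 * xj ** 2        # V(x) = x^2 / 2

      samples = []
      for move in range(n_moves):
          j = rng.integers(P)
          xprev, xnext = x[(j - 1) % P], x[(j + 1) % P]
          trial = x[j] + rng.uniform(-step, step)
          dE = bead_energy(trial, xprev, xnext) - bead_energy(x[j], xprev, xnext)
          if dE < 0 or rng.random() < np.exp(-dE):
              x[j] = trial                      # Metropolis acceptance
          if move > 20_000 and move % 50 == 0:
              samples.append(np.mean(x ** 2))

      print("PIMC <x^2> ~", np.mean(samples))   # exact value: 0.5 / tanh(0.5) ~ 1.08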

  14. An open-chain imaginary-time path-integral sampling approach to the calculation of approximate symmetrized quantum time correlation functions.

    Science.gov (United States)

    Cendagorta, Joseph R; Bačić, Zlatko; Tuckerman, Mark E

    2018-03-14

    We introduce a scheme for approximating quantum time correlation functions numerically within the Feynman path integral formulation. Starting with the symmetrized version of the correlation function expressed as a discretized path integral, we introduce a change of integration variables often used in the derivation of trajectory-based semiclassical methods. In particular, we transform to sum and difference variables between forward and backward complex-time propagation paths. Once the transformation is performed, the potential energy is expanded in powers of the difference variables, which allows us to perform the integrals over these variables analytically. The manner in which this procedure is carried out results in an open-chain path integral (in the remaining sum variables) with a modified potential that is evaluated using imaginary-time path-integral sampling rather than requiring the generation of a large ensemble of trajectories. Consequently, any number of path integral sampling schemes can be employed to compute the remaining path integral, including Monte Carlo, path-integral molecular dynamics, or enhanced path-integral molecular dynamics. We believe that this approach constitutes a different perspective in semiclassical-type approximations to quantum time correlation functions. Importantly, we argue that our approximation can be systematically improved within a cumulant expansion formalism. We test this approximation on a set of one-dimensional problems that are commonly used to benchmark approximate quantum dynamical schemes. We show that the method is at least as accurate as the popular ring-polymer molecular dynamics technique and linearized semiclassical initial value representation for correlation functions of linear operators in most of these examples and improves the accuracy of correlation functions of nonlinear operators.

  15. Random sampling of evolution time space and Fourier transform processing

    International Nuclear Information System (INIS)

    Kazimierczuk, Krzysztof; Zawadzka, Anna; Kozminski, Wiktor; Zhukov, Igor

    2006-01-01

    Application of the Fourier Transform for processing 3D NMR spectra with random sampling of the evolution time space is presented. The 2D FT is calculated for pairs of frequencies, instead of the conventional sequence of one-dimensional transforms. Signal-to-noise ratios and linewidths for different random distributions were investigated by simulations and experiments. The experimental examples include 3D HNCA, HNCACB and 15N-edited NOESY-HSQC spectra of a 13C,15N-labeled ubiquitin sample. The obtained results revealed the general applicability of the proposed method and a significant improvement of resolution in comparison with conventional spectra recorded in the same time
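
    A 1D analogue of the idea, given only as a hedged illustration: with randomly sampled evolution times the Fourier transform can be evaluated as an explicit sum over the irregular sample points on any chosen frequency grid. The signal frequency, decay constant and sample counts below are invented.

      # Minimal 1D sketch of Fourier-transform processing of a randomly sampled
      # evolution time: the transform is evaluated as an explicit sum over the
      # irregular sample points (signal parameters are invented for illustration).
      import numpy as np

      rng = np.random.default_rng(3)
      t_max, n_points = 0.1, 128                      # 100 ms evolution, 128 random samples
      t = np.sort(rng.uniform(0.0, t_max, n_points))
      signal = np.exp(2j * np.pi * 440.0 * t) * np.exp(-t / 0.05)   # 440 Hz decaying line

      freqs = np.linspace(0.0, 1000.0, 2001)          # spectrum evaluated on a chosen grid
      spectrum = np.exp(-2j * np.pi * np.outer(freqs, t)) @ signal

      print("peak at ~", freqs[np.argmax(np.abs(spectrum))], "Hz")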

  16. Performance evaluation of continuous blood sampling system for PET study. Comparison of three detector-systems

    CERN Document Server

    Matsumoto, K; Sakamoto, S; Senda, M; Yamamoto, S; Tarutani, K; Minato, K

    2002-01-01

    To measure cerebral blood flow with 15O PET, it is necessary to measure the time course of arterial blood radioactivity. We examined the performance of three different types of continuous blood sampling system. Three kinds of continuous blood sampling system were used: a plastic scintillator-based beta detector (conventional beta detector, BETA), a bismuth germanate (BGO)-based coincidence gamma detector (Pico-count flow-through detector, COINC) and a phoswich detector (PD) composed of a combination of a plastic scintillator and a BGO scintillator. Performance of these systems was evaluated for absolute sensitivity, count rate characteristics, sensitivity to background gamma photons, and reproducibility for nylon tube geometry. The absolute sensitivity of the PD was 0.21 cps/Bq for 68Ga positrons at the center of the detector. This was approximately three times higher than BETA and two times higher than COINC. The value measured with BETA was stable, even when background radioactivity was increased...

  17. HCV-RNA quantification in liver bioptic samples and extrahepatic compartments, using the abbott RealTime HCV assay.

    Science.gov (United States)

    Antonucci, FrancescoPaolo; Cento, Valeria; Sorbo, Maria Chiara; Manuelli, Matteo Ciancio; Lenci, Ilaria; Sforza, Daniele; Di Carlo, Domenico; Milana, Martina; Manzia, Tommaso Maria; Angelico, Mario; Tisone, Giuseppe; Perno, Carlo Federico; Ceccherini-Silberstein, Francesca

    2017-08-01

    We evaluated the performance of a rapid method to quantify HCV-RNA in the hepatic and extrahepatic compartments, by using for the first time the Abbott RealTime HCV-assay. Non-tumoral (NT), tumoral (TT) liver samples, lymph nodes and ascitic fluid from patients undergoing orthotopic-liver-transplantation (N=18) or liver resection (N=4) were used for the HCV-RNA quantification; 5/22 patients were tested after or during direct acting antivirals (DAA) treatment. Total RNA and DNA quantification from tissue-biopsies allowed normalization of HCV-RNA concentrations in IU/μg of total RNA and IU/10^6 liver cells, respectively. HCV-RNA was successfully quantified with high reliability in liver biopsies, lymph nodes and ascitic fluid samples. Among the 17 untreated patients, a positive and significant HCV-RNA correlation between serum and NT liver-samples was observed (Pearson: rho=0.544, p=0.024). Three DAA-treated patients were HCV-RNA "undetectable" in serum, but still "detectable" in all tested liver-tissues. Differently, only one DAA-treated patient, tested after sustained-virological-response, showed HCV-RNA "undetectability" in liver-tissue. HCV-RNA was successfully quantified with high reliability in liver bioptic samples and extrahepatic compartments, even when HCV-RNA was "undetectable" in serum. Abbott RealTime HCV-assay is a good diagnostic tool for HCV quantification in intra- and extra-hepatic compartments, whenever a bioptic sample is available. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Safety evaluation of small samples for isotope production

    International Nuclear Information System (INIS)

    Sharma, Archana; Singh, Tej; Varde, P.V.

    2015-09-01

    Radioactive isotopes are widely used in basic and applied science and engineering, most notably as environmental and industrial tracers, and for medical imaging procedures. Production of radioisotopes constitutes an important activity of the Indian nuclear program. Since its initial criticality, the DHRUVA reactor has facilitated the regular supply of most of the radioisotopes required in the country for application in the fields of medicine, industry and agriculture. In-pile irradiation of samples requires a prior estimation of the sample reactivity load, heating rate, activity developed and the shielding thickness required for post-irradiation handling. This report is an attempt to highlight the contributions of the DHRUVA reactor, as well as to explain in detail the methodologies used in the safety evaluation of in-pile irradiation samples. (author)

  19. A whole-cell bioreporter assay for quantitative genotoxicity evaluation of environmental samples.

    Science.gov (United States)

    Jiang, Bo; Li, Guanghe; Xing, Yi; Zhang, Dayi; Jia, Jianli; Cui, Zhisong; Luan, Xiao; Tang, Hui

    2017-10-01

    Whole-cell bioreporters have emerged as promising tools for genotoxicity evaluation, due to their rapidity, cost-effectiveness, sensitivity and selectivity. In this study, a method for detecting genotoxicity in environmental samples was developed using the bioluminescent whole-cell bioreporter Escherichia coli recA::luxCDABE. To further test its performance in a real world scenario, the E. coli bioreporter was applied in two cases: i) soil samples collected from chromium(VI) contaminated sites; ii) crude oil contaminated seawater collected after the Jiaozhou Bay oil spill, which occurred in 2013. The chromium(VI) contaminated soils were pretreated by water extraction and directly exposed to the bioreporter in two phases: aqueous soil extract (water phase) and soil supernatant (solid phase). The results indicated that both extractable and soil particle-fixed chromium(VI) were bioavailable to the bioreporter, and the solid-phase contact bioreporter assay provided a more precise evaluation of soil genotoxicity. For crude oil contaminated seawater, the response of the bioreporter clearly illustrated the spatial and temporal change in genotoxicity surrounding the spill site, suggesting that the crude oil degradation process decreased the genotoxic risk to the ecosystem. In addition, the performance of the bioreporter was simulated by a modified cross-regulation gene expression model, which quantitatively described the DNA damage response of the E. coli bioreporter. Accordingly, the bioluminescent response of the bioreporter was calculated as the mitomycin C equivalent, enabling quantitative comparison of genotoxicities between different environmental samples. This bioreporter assay provides a rapid and sensitive screening tool for direct genotoxicity assessment of environmental samples. Copyright © 2017. Published by Elsevier Ltd.

  20. Evaluation of energy deposition by 153Sm in small samples

    International Nuclear Information System (INIS)

    Cury, M.I.C.; Siqueira, P.T.D.; Yoriyaz, H.; Coelho, P.R.P.; Da Silva, M.A.; Okazaki, K.

    2002-01-01

    Aim: This work presents evaluations of the dose absorbed by 'in vitro' blood cultures when mixed with 153Sm solutions of different concentrations. Although 153Sm is used as a radiopharmaceutical mainly due to its beta emission, which is short-range radiation, it also emits gamma radiation, which has a longer-range penetration. Therefore it turns out to be a difficult task to determine the absorbed dose in small samples where the infinite approximation is no longer valid. Materials and Methods: MCNP-4C (Monte Carlo N-Particle transport code) has been used to perform the evaluations. It is not a deterministic code that calculates the value of a specific quantity by solving the physical equations involved in the problem, but a virtual experiment in which the events related to the problem are simulated and the concerned quantities are tallied. MCNP also stands out for its possibilities to specify any problem geometrically. However, these features, among others, make MCNP a time-consuming code. The simulated problem consists of a cylindrical plastic tube with 1.5 cm internal diameter and 0.1 cm thickness. It also has a 2.0 cm high conic bottom end, so that the represented sample has a volume of 4.0 ml (consisting of 1 ml of blood and 3 ml of culture medium). To evaluate the energy deposition in the blood culture for each 153Sm decay, the problem was divided into 3 steps to account for the β- emissions (which have a continuous spectrum), the gammas, and the conversion and Auger electron emissions. Afterwards each emission contribution was weighted and summed to give the final value. Besides this radiation 'fragmentation', simulations were performed for many different amounts of 153Sm solution added to the sample. These amounts cover a range from 1 μl to 0.5 ml. Results: The average energy per disintegration of 153Sm is 331 keV [1]. Gammas account for 63 keV and β-, conversion and Auger electrons account for 268 keV. The simulations performed showed an average energy deposition of 260 keV

  1. Optimum time of blood sampling for determination of glomerular filtration rate by single-injection [51Cr]EDTA plasma clearance

    International Nuclear Information System (INIS)

    Broechner-Mortensen, J.; Roedbro, P.

    1976-01-01

    We have investigated the influence on reproducibility of total [51Cr]EDTA plasma clearance (E) of various times and numbers of blood samples in patients with normal (13 patients) and low (14 patients) renal function. The study aims at fixing a clinically useful procedure suitable for all levels of renal function. Six different types of E were evaluated with time periods for blood sampling between 3 and 5 h after tracer injection, and the variation from counting radioactivity, s1, was determined as part of the total variation, s2. Optimum mean time, t(E), for blood sampling was calculated as a function of E, as the mean time giving the least change in E for a given change in the 'final slope' of the plasma curve. For patients with normal E, s1 did not contribute significantly to s2, and t(E) was about 2 h. For patients with low renal function s1 contributed significantly to s2, and t(E) increased steeply with decreasing E. The relative error in s1 from fixed E types was calculated for all levels of renal function. The results indicate that blood sampling individualized according to predicted E values is not necessary. A sufficient precision of E can be achieved for all function levels from three blood samples drawn at 180, 240, and 300 min after injection. (Auth.)
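
    As a hedged illustration of the recommended three-sample procedure, the sketch below fits the 'final slope' of a one-compartment model to samples drawn at 180, 240 and 300 min and computes the clearance as injected activity times slope over intercept. All numbers are hypothetical and no correction for the missed fast component (e.g. a Brochner-Mortensen type correction) is applied.

      # Hypothetical numbers throughout; fits C(t) = C0 * exp(-b t) to the three
      # late samples by log-linear regression and computes the one-compartment
      # ('final slope') clearance E = Q * b / C0.
      import numpy as np

      injected_activity = 3.0e6             # Q, counts/min injected (hypothetical)
      t = np.array([180.0, 240.0, 300.0])   # minutes after injection
      c = np.array([60.0, 39.0, 25.5])      # plasma counts/min per ml (hypothetical)

      slope, intercept = np.polyfit(t, np.log(c), 1)
      b, c0 = -slope, np.exp(intercept)

      clearance = injected_activity * b / c0
      print(f"final-slope clearance: {clearance:.0f} ml/min")   # ~99 ml/min here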

  2. Testing of Alignment Parameters for Ancient Samples: Evaluating and Optimizing Mapping Parameters for Ancient Samples Using the TAPAS Tool

    Directory of Open Access Journals (Sweden)

    Ulrike H. Taron

    2018-03-01

    Full Text Available High-throughput sequence data retrieved from ancient or other degraded samples has led to unprecedented insights into the evolutionary history of many species, but the analysis of such sequences also poses specific computational challenges. The most commonly used approach involves mapping sequence reads to a reference genome. However, this process becomes increasingly challenging with an elevated genetic distance between target and reference or with the presence of contaminant sequences with high sequence similarity to the target species. The evaluation and testing of mapping efficiency and stringency are thus paramount for the reliable identification and analysis of ancient sequences. In this paper, we present 'TAPAS' (Testing of Alignment Parameters for Ancient Samples), a computational tool that enables the systematic testing of mapping tools for ancient data by simulating sequence data reflecting the properties of an ancient dataset and performing test runs using the mapping software and parameter settings of interest. We showcase TAPAS by using it to assess and improve the mapping strategy for a degraded sample from a banded linsang (Prionodon linsang), for which no closely related reference is currently available. This enables a 1.8-fold increase in the number of mapped reads without sacrificing mapping specificity. The increase of mapped reads effectively reduces the need for additional sequencing, thus making more economical use of time, resources, and sample material.

  3. Evaluation of sampling methods to quantify abundance of hardwoods and snags within conifer-dominated riparian zones

    Science.gov (United States)

    Theresa Marquardt; Hailemariam Temesgen; Paul D. Anderson; Bianca. Eskelson

    2012-01-01

    Six sampling alternatives were examined for their ability to quantify selected attributes of snags and hardwoods in conifer-dominated riparian areas of managed headwater forests in western Oregon. Each alternative was simulated 500 times at eight headwater forest locations based on a 0.52-ha square stem map. The alternatives were evaluated based on how well they...

  4. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Full Text Available Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, very irregular data, interpolation bias and RMSE increase strongly. We find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme, in the analysis of highly irregular time series. For the cross correlation function (CCF) the RMSE is then lower by 60 %. The application of the Lomb-Scargle technique gave results comparable to the kernel methods for the univariate, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performances of interpolation vs. the Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory) is strongly overestimated when using the standard, interpolation-based, approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques and suitable for large scale application to paleo-data.
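
    To make the kernel idea concrete, the following is a minimal sketch (not the authors' code) of a Gaussian-kernel cross-correlation estimator for two irregularly sampled series; the test signals, lag convention and bandwidth are invented for illustration.

      # Sketch of a Gaussian-kernel cross-correlation estimator for two irregularly
      # sampled series; a generic version of the kernel idea discussed above, with
      # invented test signals.
      import numpy as np

      def kernel_xcorr(tx, x, ty, y, lags, h):
          """Kernel-weighted cross-correlation of standardized series at given lags."""
          xs = (x - x.mean()) / x.std()
          ys = (y - y.mean()) / y.std()
          out = []
          for lag in lags:
              d = (ty[None, :] - tx[:, None]) - lag      # positive lag: y trails x
              w = np.exp(-0.5 * (d / h) ** 2)
              out.append(np.sum(w * xs[:, None] * ys[None, :]) / np.sum(w))
          return np.array(out)

      rng = np.random.default_rng(4)
      tx = np.sort(rng.uniform(0, 100, 200))             # irregular sampling times
      ty = np.sort(rng.uniform(0, 100, 180))
      x = np.sin(2 * np.pi * tx / 20) + 0.3 * rng.normal(size=tx.size)
      y = np.sin(2 * np.pi * (ty - 3) / 20) + 0.3 * rng.normal(size=ty.size)  # y lags x by 3

      lags = np.arange(-10, 11)
      ccf = kernel_xcorr(tx, x, ty, y, lags, h=1.0)
      print("best lag:", lags[np.argmax(ccf)])           # expect about +3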

  5. Assessing Time Management Skills as an Important Aspect of Student Learning: The Construction and Evaluation of a Time Management Scale with Spanish High School Students

    Science.gov (United States)

    Garcia-Ros, Rafael; Perez-Gonzalez, Francisco; Hinojosa, Eugenia

    2004-01-01

    The main purpose of this study is to analyse the factorial structure, psychometric properties and predictive capacity for academic achievement of a scale designed to evaluate the time management skills of Spanish high school students. An adaptation of the Time Management Questionnaire was presented to two samples of 350 Spanish high school…

  6. Evaluation of standard methods for collecting and processing fuel moisture samples

    Science.gov (United States)

    Sally M. Haase; José Sánchez; David R. Weise

    2016-01-01

    A variety of techniques for collecting and processing samples to determine the moisture content of wildland fuels in support of fire management activities were evaluated. The effects of using a chainsaw or handsaw to collect samples of large-diameter wood, containers for storing and transporting collected samples, and quick-response ovens for estimating moisture content...

  7. Method for Hot Real-Time Sampling of Pyrolysis Vapors

    Energy Technology Data Exchange (ETDEWEB)

    Pomeroy, Marc D [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-29

    Biomass pyrolysis has been an increasing topic of research, in particular as a replacement for crude oil. This process utilizes moderate temperatures to thermally deconstruct the biomass; the resulting vapors are then condensed into a mixture of liquid oxygenates to be used as fuel precursors. Pyrolysis oils contain more than 400 compounds, up to 60 percent of which do not re-volatilize for subsequent chemical analysis. Vapor chemical composition is also complicated, as additional condensation reactions occur during the condensation and collection of the product. Due to the complexity of the pyrolysis oil, and a desire to catalytically upgrade the vapor composition before condensation, online real-time analytical techniques such as Molecular Beam Mass Spectrometry (MBMS) are of great use. However, in order to properly sample hot pyrolysis vapors, many challenges must be overcome. Sampling must occur within a narrow range of temperatures to reduce product composition changes from overheating, or partial condensation or plugging of lines from condensed products. Residence times must be kept to a minimum to reduce further reaction chemistries. Pyrolysis vapors also form aerosols that are carried far downstream and can pass through filters, resulting in build-up in downstream locations. The co-produced bio-char and ash from the pyrolysis process can lead to plugging of the sample lines, and must be filtered out at temperature, even with the use of cyclonic separators. A practical approach to sampling system design considerations, as well as lessons learned, is integrated into the hot analytical sampling system of the National Renewable Energy Laboratory's (NREL) Thermochemical Process Development Unit (TCPDU) to provide industrially relevant demonstrations of thermochemical transformations of biomass feedstocks at the pilot scale.

  8. [Sample size calculation in clinical post-marketing evaluation of traditional Chinese medicine].

    Science.gov (United States)

    Fu, Yingkun; Xie, Yanming

    2011-10-01

    In recent years, as the Chinese government and public have paid more attention to post-marketing research on Chinese medicine, a number of traditional Chinese medicine products have begun, or are about to begin, post-marketing evaluation studies. In post-marketing evaluation design, sample size calculation plays a decisive role. It not only ensures the accuracy and reliability of the post-marketing evaluation, but also assures that the intended trials will have the desired power for correctly detecting a clinically meaningful difference between the medicines under study if such a difference truly exists. Up to now, there has been no systematic method of sample size calculation tailored to traditional Chinese medicine. In this paper, based on the basic methods of sample size calculation and the characteristics of traditional Chinese medicine clinical evaluation, sample size calculation methods for the efficacy and safety of Chinese medicine are discussed respectively. We hope the paper will be beneficial to medical researchers and pharmaceutical scientists who are engaged in Chinese medicine research.
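
    As a hedged illustration of the kind of calculation discussed, the sketch below implements the standard normal-approximation sample-size formula for comparing two proportions; the response rates, significance level and power are purely illustrative and not specific to any traditional Chinese medicine study.

      # Generic two-proportion sample-size calculation (normal approximation) of the
      # kind used when planning a comparative post-marketing study; the effect sizes
      # below are purely illustrative.
      from math import ceil, sqrt
      from statistics import NormalDist

      def n_per_group(p1, p2, alpha=0.05, power=0.80):
          z_a = NormalDist().inv_cdf(1 - alpha / 2)
          z_b = NormalDist().inv_cdf(power)
          p_bar = (p1 + p2) / 2
          num = (z_a * sqrt(2 * p_bar * (1 - p_bar)) +
                 z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
          return ceil(num / (p1 - p2) ** 2)

      # e.g. detect an improvement in response rate from 70% to 80%
      print(n_per_group(0.70, 0.80))   # ~294 per group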

  9. Evaluation of pump pulsation in respirable size-selective sampling: part II. Changes in sampling efficiency.

    Science.gov (United States)

    Lee, Eun Gyung; Lee, Taekhee; Kim, Seung Won; Lee, Larry; Flemmer, Michael M; Harper, Martin

    2014-01-01

    This second, and concluding, part of this study evaluated changes in sampling efficiency of respirable size-selective samplers due to air pulsations generated by the selected personal sampling pumps characterized in Part I (Lee E, Lee L, Möhlmann C et al. Evaluation of pump pulsation in respirable size-selective sampling: Part I. Pulsation measurements. Ann Occup Hyg 2013). Nine particle sizes of monodisperse ammonium fluorescein (from 1 to 9 μm mass median aerodynamic diameter) were generated individually by a vibrating orifice aerosol generator from dilute solutions of fluorescein in aqueous ammonia and then injected into an environmental chamber. To collect these particles, 10-mm nylon cyclones, also known as Dorr-Oliver (DO) cyclones, were used with five medium volumetric flow rate pumps. Those were the Apex IS, HFS513, GilAir5, Elite5, and Basic5 pumps, which were found in Part I to generate pulsations of 5% (the lowest), 25%, 30%, 56%, and 70% (the highest), respectively. GK2.69 cyclones were used with the Legacy [pump pulsation (PP) = 15%] and Elite12 (PP = 41%) pumps for collection at high flows. The DO cyclone was also used to evaluate changes in sampling efficiency due to pulse shape. The HFS513 pump, which generates a more complex pulse shape, was compared to a single sine wave fluctuation generated by a piston. The luminescent intensity of the fluorescein extracted from each sample was measured with a luminescence spectrometer. Sampling efficiencies were obtained by dividing the intensity of the fluorescein extracted from the filter placed in a cyclone with the intensity obtained from the filter used with a sharp-edged reference sampler. Then, sampling efficiency curves were generated using a sigmoid function with three parameters and each sampling efficiency curve was compared to that of the reference cyclone by constructing bias maps. In general, no change in sampling efficiency (bias under ±10%) was observed until pulsations exceeded 25% for the
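
    The sketch below shows, purely as an illustration, how a three-parameter sigmoid of the kind mentioned above can be fitted to sampling-efficiency data with SciPy; the efficiency values, functional form details and starting guesses are invented, not measurements or parameters from the study.

      # Fit a three-parameter sigmoid to sampling efficiency vs. aerodynamic
      # diameter; the data points below are invented for illustration.
      import numpy as np
      from scipy.optimize import curve_fit

      def sigmoid(d, a, d50, s):
          """Efficiency vs. diameter: plateau a, 50% cut-point d50, steepness s."""
          return a / (1.0 + np.exp((d - d50) / s))

      diam = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9], dtype=float)   # um
      eff = np.array([0.97, 0.95, 0.88, 0.72, 0.48, 0.28, 0.14, 0.07, 0.03])

      params, _ = curve_fit(sigmoid, diam, eff, p0=[1.0, 4.0, 1.0])
      print("plateau, d50, steepness:", params)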

  10. Evaluation of FTA cards as a laboratory and field sampling device for the detection of foot-and-mouth disease virus and serotyping by RT-PCR and real-time RT-PCR.

    Science.gov (United States)

    Muthukrishnan, Madhanmohan; Singanallur, Nagendrakumar B; Ralla, Kumar; Villuppanoor, Srinivasan A

    2008-08-01

    Foot-and-mouth disease virus (FMDV) samples transported to the laboratory from remote and inaccessible areas for serodiagnosis pose a major problem in a tropical country like India, where temperature fluctuations are extreme. Inadequate storage methods lead to spoilage of FMDV samples collected from clinically positive animals in the field. Such samples are declared non-typeable by the typing laboratories, with the consequent loss of valuable epidemiological data. The present study evaluated the usefulness of FTA Classic Cards for the collection, shipment, storage and identification of the FMDV genome by RT-PCR and real-time RT-PCR. The stability of the viral RNA, the absence of infectivity and the ease of processing the sample for molecular methods make the FTA cards a useful option for transporting the FMDV genome for identification and serotyping. The method can be used routinely for FMDV research as it is economical and the cards can be transported easily in envelopes by regular document transport methods. Live virus cannot be isolated from samples collected on FTA cards, which is a limitation. This property can also be viewed as an advantage as it limits the risk of transmission of live virus.

  11. Small Sample Properties of Bayesian Multivariate Autoregressive Time Series Models

    Science.gov (United States)

    Price, Larry R.

    2012-01-01

    The aim of this study was to compare the small sample (N = 1, 3, 5, 10, 15) performance of a Bayesian multivariate vector autoregressive (BVAR-SEM) time series model relative to frequentist power and parameter estimation bias. A multivariate autoregressive model was developed based on correlated autoregressive time series vectors of varying…
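
    As a hedged, frequentist point of comparison (not the Bayesian BVAR-SEM model of the study), the simulation below estimates a bivariate VAR(1) by ordinary least squares at several sample lengths to show how small samples bias the autoregressive coefficients; the coefficient matrix and replication counts are invented.

      # Small-sample bias of OLS estimates for a bivariate VAR(1); a frequentist
      # baseline sketch, not the Bayesian model of the study. Coefficients invented.
      import numpy as np

      rng = np.random.default_rng(5)
      A_true = np.array([[0.5, 0.1],
                         [0.2, 0.4]])          # assumed VAR(1) coefficient matrix

      def simulate_var1(A, n, burn=50):
          x = np.zeros(2)
          out = []
          for t in range(n + burn):
              x = A @ x + rng.normal(size=2)
              if t >= burn:
                  out.append(x.copy())
          return np.array(out)

      def ols_var1(y):
          X, Y = y[:-1], y[1:]
          # A_hat solves Y ~ X A^T (no intercept for simplicity)
          return np.linalg.lstsq(X, Y, rcond=None)[0].T

      for n in (10, 25, 100, 1000):
          est = np.mean([ols_var1(simulate_var1(A_true, n)) for _ in range(500)], axis=0)
          print(f"n={n:4d}  mean A_hat diagonal: {est[0, 0]:.2f}, {est[1, 1]:.2f}")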

  12. Evaluation of sampling schemes for in-service inspection of steam generator tubing

    International Nuclear Information System (INIS)

    Hanlen, R.C.

    1990-03-01

    This report is a follow-on of work initially sponsored by the US Nuclear Regulatory Commission (Bowen et al. 1989). The work presented here is funded by EPRI and is jointly sponsored by the Electric Power Research Institute (EPRI) and the US Nuclear Regulatory Commission (NRC). The goal of this research was to evaluate fourteen sampling schemes or plans. The main criterion used for evaluating plan performance was the effectiveness for sampling, detecting and plugging defective tubes. The performance criterion was evaluated across several choices of distributions of degraded/defective tubes, probability of detection (POD) curves and eddy-current sizing models. Conclusions from this study are dependent upon the tube defect distributions, sample size, and expansion rules considered. As degraded/defective tubes form 'clusters' (i.e., maps 6A, 8A and 13A), the smaller sample sizes provide a capability of detecting and sizing defective tubes that approaches 100% inspection. When there is little or no clustering (i.e., maps 1A, 20 and 21), sample efficiency is approximately equal to the initial sample size taken. There is an indication (though not statistically significant) that the systematic sampling plans are better than the random sampling plans for equivalent initial sample size. There was no indication of an effect due to modifying the threshold value for the second stage expansion. The lack of an indication is likely due to the specific tube flaw sizes considered for the six tube maps. 1 ref., 11 figs., 19 tabs

  13. Discrete-Time Mixing Receiver Architecture for RF-Sampling Software-Defined Radio

    NARCIS (Netherlands)

    Ru, Z.; Klumperink, Eric A.M.; Nauta, Bram

    2010-01-01

    A discrete-time (DT) mixing architecture for RF-sampling receivers is presented. This architecture makes RF sampling more suitable for software-defined radio (SDR) as it achieves wideband quadrature demodulation and wideband harmonic rejection. The paper consists of two parts. In the first

  14. Time-to-event methodology improved statistical evaluation in register-based health services research.

    Science.gov (United States)

    Bluhmki, Tobias; Bramlage, Peter; Volk, Michael; Kaltheuner, Matthias; Danne, Thomas; Rathmann, Wolfgang; Beyersmann, Jan

    2017-02-01

    Complex longitudinal sampling and the observational structure of patient registers in health services research are associated with methodological challenges regarding data management and statistical evaluation. We exemplify common pitfalls and want to stimulate discussions on the design, development, and deployment of future longitudinal patient registers and register-based studies. For illustrative purposes, we use data from the prospective, observational, German DIabetes Versorgungs-Evaluation register. One aim was to explore predictors for the initiation of a basal insulin supported therapy in patients with type 2 diabetes initially prescribed to glucose-lowering drugs alone. Major challenges are missing mortality information, time-dependent outcomes, delayed study entries, different follow-up times, and competing events. We show that time-to-event methodology is a valuable tool for improved statistical evaluation of register data and should be preferred to simple case-control approaches. Patient registers provide rich data sources for health services research. Analyses are accompanied with the trade-off between data availability, clinical plausibility, and statistical feasibility. Cox' proportional hazards model allows for the evaluation of the outcome-specific hazards, but prediction of outcome probabilities is compromised by missing mortality information. Copyright © 2016 Elsevier Inc. All rights reserved.
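
    As a minimal illustration of the time-to-event machinery referred to above, the sketch below computes a cause-specific Nelson-Aalen cumulative hazard with delayed (left-truncated) entry, treating competing events as censoring for that cause; the toy entry and exit times are invented and this is not the registry analysis itself.

      # Cause-specific Nelson-Aalen estimate with delayed entry (left truncation):
      # subjects contribute to the risk set only between their entry and exit times.
      # The toy data below are invented.
      import numpy as np

      # entry time, exit time, event indicator (1 = event of interest, 0 = censored
      # or competing event treated as censoring for this cause-specific hazard)
      entry = np.array([0.0, 0.5, 1.0, 0.0, 2.0, 0.3])
      exit_ = np.array([3.0, 2.5, 4.0, 1.5, 5.0, 2.0])
      event = np.array([1,   0,   1,   1,   0,   1])

      event_times = np.sort(np.unique(exit_[event == 1]))
      H = 0.0
      for t in event_times:
          at_risk = np.sum((entry < t) & (exit_ >= t))     # delayed entry handled here
          d = np.sum((exit_ == t) & (event == 1))
          H += d / at_risk
          print(f"t = {t:.1f}   cumulative hazard = {H:.3f}")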

  15. Detection of Mycoplasma genitalium in female cervical samples by Multitarget Real-Time PCR

    Directory of Open Access Journals (Sweden)

    Sabina Mahmutović-Vranić

    2007-05-01

    Full Text Available Mycoplasma genitalium (MG) is associated with a variety of urogenital infections such as non-gonococcal urethritis (NGU), endometritis and cervicitis. The objective of this study was to demonstrate and evaluate a research polymerase chain reaction (PCR) assay for the detection of MG in cervical samples of a tested population of women attending gynecology clinics in Bosnia and Herzegovina. The Multitarget Real-Time (MTRT) PCR, utilizing the ABI 7900HT sequence detection system, was performed for the detection of MG. Cervical samples (N=97) from females were divided into three types of patient groups: Group 1: patients who had known abnormal clinical cytology reports (N=34); Group 2: patients who reported a history of genitourinary infections (N=22); and Group 3: patients not in either group 1 or 2 (N=41). Overall, 14.43% (14/97) of those tested were positive for MG. A positive sample was defined as having a cycle threshold cross point (Ct) < 40.0 with fluorescent detection comparable to the low positive control utilized during the run. This study validated the use of MTRT PCR as a reliable method for the detection of MG in clinical specimens and should facilitate large-scale screening for this organism.

  16. Evaluation of sampling equipment for RCRA [Resource Conservation and Recovery Act] monitoring in a deep unconfined aquifer

    International Nuclear Information System (INIS)

    Kasper, R.B.; Serkowski, J.A.

    1988-02-01

    Acceptable sampling devices identified include gas-operated bladder pumps and bailers constructed of nonreactive materials. In arid portions of the western United States, depths to ground water, which are often greater than 100 feet, make the use of bailers extremely time consuming. The efficiency of gas-operated bladder pumps decreases with depth, and special design and construction is often needed to accommodate higher gas pressures. Commercially available sampling pumps were evaluated for their suitability as sampling and purging equipment for installation in monitoring wells in a deep, unconfined aquifer. The test was conducted in a well where the depth to ground water was 340 feet. The objective was to identify equipment capable of discharge rates suitable for sampling (∼0.025 gpm) and, if possible, for purging (>1 gpm). The potential for physical or chemical alteration of the water sample was evaluated from literature sources and not specifically tested. Four positive-displacement pumps were tested, consisting of two bladder pumps, a gas-driven piston pump, and a mechanically-driven pump. All pumps could be installed in a 2-inch diameter well, although this was not important for the planned application. 4 refs., 1 tab

  17. Evaluation of Vitamin D3 and D2 Stability in Fortified Flat Bread Samples During Dough Fermentation, Baking and Storage.

    Science.gov (United States)

    Tabibian, Mehrnaz; Torbati, Mohammadali; Afshar Mogaddam, Mohammad Reza; Mirlohi, Maryam; Sadeghi, Malihe; Mohtadinia, Javad

    2017-06-01

    Purpose: Vitamin D, a fat-soluble secosteroid, has a significant role in bone metabolism and helps calcium absorption in the body. Since vitamin D concentration is altered in fortified foods and dietary supplements, the actual amount of vitamin D may differ from the label value. Methods: In this study, the concentrations of vitamins D2 and D3 in a fortified bread sample were determined analytically. For this purpose, the dough or homogenized bread sample was saponified using potassium hydroxide solution (30%, w/v) at 80°C, and the saponified analytes were extracted into n-heptane by liquid-liquid extraction. The n-heptane fraction was then evaporated to dryness and the sample was reconstituted in methanol. The effect of different parameters was evaluated by a one-variable-at-a-time strategy. Results: The analyte concentrations were evaluated during the dough fermentation, baking and storage steps. The effect of temperature on dough fermentation and baking was evaluated in the ranges of 5-30 and 200-250°C, respectively. Also, the fermentation time was studied in the range of 0-120 min. The analyte concentrations were followed for 1 to 5 days after baking. The results indicated that dough fermentation temperature has no significant effect on the concentration of the analytes. On the other hand, when the dough fermentation time and baking temperature are increased, the analyte concentrations decrease. Also, storage of the spiked bread samples decreased the analyte concentrations after one day. Conclusion: Based on the obtained results, baking the dough at high temperatures leads to a decrease in vitamin levels.

  18. Evaluation of inlet sampling integrity on NSF/NCAR airborne platforms

    Science.gov (United States)

    Campos, T. L.; Stith, J. L.; Stephens, B. B.; Romashkin, P.

    2017-12-01

    An inlet test project was conducted during IDEAS-IV-GV (2013) to evaluate the sampling integrity of two inlet designs. Use of a single CO2 sensor provided a high-precision detector and a large difference between the mean cabin and external concentrations (500-700 ppmv in the cabin). The original HIAPER Modular InLet (HIMIL) comprises a tapered, flow-straightening, flow-through 'cigar' mounted on a strut. The cigar center sampling line sits 12" from the fuselage skin. An o-ring seals the feedthrough plate coupling sampling lines from the strut into the cigar. However, there is no seal to prevent air inside the strut from seeping out around the cigar body. A pressure-equalizing drain hole was placed in the strut access panel; it was positioned at an approximate distance of 4" from the fuselage to ensure that air from any source that drained out of the strut was confined to a low release point. A second, aft-facing inlet design was also evaluated, with the sampling center line moved farther from the fuselage to a height of 16". A similar approach was also applied to sampling locations on the C-130 in 2015. The results of these tests and recommendations for best practices will be presented.

  19. Timely and complete publication of economic evaluations alongside randomized controlled trials.

    Science.gov (United States)

    Thorn, Joanna C; Noble, Sian M; Hollingworth, William

    2013-01-01

    Little is known about the extent and nature of publication bias in economic evaluations. Our objective was to determine whether economic evaluations are subject to publication bias by considering whether economic data are as likely to be reported, and reported as promptly, as effectiveness data. Trials that intended to conduct an economic analysis and ended before 2008 were identified in the International Standard Randomised Controlled Trial Number (ISRCTN) register; a random sample of 100 trials was retrieved. Fifty comparator trials were randomly drawn from those not identified as intending to conduct an economic study. The trial start and end dates, estimated sample size and funder type were extracted. For trials planning economic evaluations, effectiveness and economic publications were sought; publication dates and journal impact factors were extracted. Effectiveness abstracts were assessed for whether they reached a firm conclusion that one intervention was most effective. Primary investigators were contacted about reasons for non-publication of results, or reasons for differential publication strategies for effectiveness and economic results. Trials planning an economic study were more likely to be funded by government (p = 0.01) and larger (p = 0.003) than other trials. The trials planning an economic evaluation had a mean of 6.5 (range 2.7-13.2) years since the trial end in which to publish their results. Effectiveness results were reported by 70 %, while only 43 % published economic evaluations; reported reasons for non-publication of economic results included the intervention being ineffective, and staffing issues. Funding source, time since trial end and length of study were not associated with a higher probability of publishing the economic evaluation. However, studies that were small or of unknown size were significantly less likely to publish economic evaluations than large studies. The journal impact factor was 1.6 points higher for effectiveness publications than for the

  20. Detection of Strongylus vulgaris in equine faecal samples by real-time PCR and larval culture - method comparison and occurrence assessment.

    Science.gov (United States)

    Kaspar, A; Pfister, K; Nielsen, M K; Silaghi, C; Fink, H; Scheuerle, M C

    2017-01-11

    Strongylus vulgaris has become a rare parasite in Germany during the past 50 years due to the practice of frequent prophylactic anthelmintic therapy. To date, the emerging development of resistance in Cyathostominae and Parascaris spp. to numerous equine anthelmintics has changed deworming management and the frequency of anthelmintic usage. In this regard, reliable detection of parasitic infections, especially of the highly pathogenic S. vulgaris, is essential. In the current study, two diagnostic methods for the detection of infections with S. vulgaris were compared and information on the occurrence of this parasite in German horses was gained. For this purpose, faecal samples of 501 horses were screened for S. vulgaris with real-time PCR and an additional larval culture was performed in samples of 278 horses. A subset of 26 horses underwent multiple follow-up examinations with both methods in order to evaluate both the persistence of S. vulgaris infections and the reproducibility of each diagnostic method. The real-time PCR revealed S. vulgaris DNA in ten of the 501 investigated equine samples (1.9%). The larval culture demonstrated larvae of S. vulgaris in three of the 278 samples (1.1%). A direct comparison of the two methods was possible for 321 samples, including 43 follow-up examinations, with the result of 11 S. vulgaris-positive samples by real-time PCR and 4 S. vulgaris-positive samples by larval culture. McNemar's test (p-value = 0.016) revealed a significant difference and the kappa value (0.525) showed moderate agreement between real-time PCR and larval culture. The real-time PCR detected a significantly higher proportion of S. vulgaris positives than larval culture and should thus be considered as a routine diagnostic method for the detection of S. vulgaris in equine samples.
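
    For illustration, the sketch below computes Cohen's kappa and an exact McNemar test for paired test results; the 2x2 cell counts are a reconstruction chosen to be consistent with the reported margins and summary statistics (the abstract itself gives only totals), so they should be read as hypothetical.

      # Cohen's kappa and an exact McNemar test for paired diagnostic results.
      # Cell counts are hypothetical, chosen only to match the reported margins
      # (11 PCR-positive, 4 culture-positive, n = 321) and summary statistics.
      from math import comb

      # rows: PCR +/-, columns: culture +/-
      a, b, c, d = 4, 7, 0, 310        # +/+, +/-, -/+, -/- (hypothetical counts)
      n = a + b + c + d

      po = (a + d) / n
      pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2
      kappa = (po - pe) / (1 - pe)

      # exact McNemar test on the discordant pairs (two-sided binomial, p = 0.5)
      n_disc, k = b + c, min(b, c)
      p_mcnemar = min(1.0, 2 * sum(comb(n_disc, i) for i in range(k + 1)) * 0.5**n_disc)

      print(f"kappa = {kappa:.3f}, exact McNemar p = {p_mcnemar:.3f}")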

  1. Clinical evaluation of β-tubulin real-time PCR for rapid diagnosis of dermatophytosis, a comparison with mycological methods.

    Science.gov (United States)

    Motamedi, Marjan; Mirhendi, Hossein; Zomorodian, Kamiar; Khodadadi, Hossein; Kharazi, Mahboobeh; Ghasemi, Zeinab; Shidfar, Mohammad Reza; Makimura, Koichi

    2017-10-01

    Following our previous report on the evaluation of the beta-tubulin real-time PCR for detection of dermatophytosis, this study aimed to compare the real-time PCR assay with conventional methods for the clinical assessment of its diagnostic performance. Samples from a total of 853 patients with suspected dermatophyte lesions were subjected to direct examination (all samples), culture (499 samples) and real-time PCR (all samples). Fungal DNA was extracted directly from clinical samples using a conical steel bullet, followed by purification with a commercial kit, and subjected to the TaqMan probe-based real-time PCR. The study showed that among the 499 specimens for which all three methods were used, 156 (31.2%), 128 (25.6%) and 205 (41.0%) were found to be positive by direct microscopy, culture and real-time PCR, respectively. Real-time PCR significantly increased the detection rate of dermatophytes compared with microscopy (288 vs 229), with 87% concordance between the two methods. The sensitivity, specificity, positive predictive value, and negative predictive value of the real-time PCR were 87.5%, 85%, 66.5% and 95.2%, respectively. Although real-time PCR performed better on skin than on nail samples, it should not yet fully replace conventional diagnosis. © 2017 Blackwell Verlag GmbH.
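
    As a quick, hedged cross-check of how such predictive values follow from sensitivity, specificity and prevalence via Bayes' rule, the sketch below reproduces figures close to those reported when a plausible prevalence is assumed; the prevalence value is an assumption, not a number from the paper.

      # Predictive values from sensitivity, specificity and an assumed prevalence
      # (Bayes' rule); the prevalence is illustrative, not taken from the paper.
      def predictive_values(sens, spec, prev):
          ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
          npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
          return ppv, npv

      ppv, npv = predictive_values(sens=0.875, spec=0.85, prev=0.25)
      print(f"PPV ~ {ppv:.3f}, NPV ~ {npv:.3f}")   # close to the reported 0.665 and 0.952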

  2. DOE methods for evaluating environmental and waste management samples

    International Nuclear Information System (INIS)

    Goheen, S.C.; McCulloch, M.; Thomas, B.L.; Riley, R.G.; Sklarew, D.S.; Mong, G.M.; Fadeff, S.K.

    1994-04-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) is a resource intended to support sampling and analytical activities for the evaluation of environmental and waste management samples from U.S. Department of Energy (DOE) sites. DOE Methods is the result of extensive cooperation from all DOE analytical laboratories. All of these laboratories have contributed key information and provided technical reviews as well as significant moral support leading to the success of this document. DOE Methods is designed to encompass methods for collecting representative samples and for determining the radioisotope activity and organic and inorganic composition of a sample. These determinations will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the U.S. Environmental Protection Agency, or others. The development of DOE Methods is supported by the Laboratory Management Division of the DOE. Methods are prepared for entry into DOE Methods as chapter editors, together with DOE and other participants in this program, identify analytical and sampling method needs. Unique methods or methods consolidated from similar procedures in the DOE Procedures Database are selected for potential inclusion in this document. Initial selection is based largely on DOE needs and procedure applicability and completeness. Methods appearing in this document are one of two types: "Draft" or "Verified." "Draft" methods that have been reviewed internally and show potential for eventual verification are included in this document, but they have not been reviewed externally, and their precision and bias may not be known. "Verified" methods in DOE Methods have been reviewed by volunteers from various DOE sites and private corporations

  3. Electro-optic sampling for time resolving relativistic ultrafast electron diffraction

    International Nuclear Information System (INIS)

    Scoby, C. M.; Musumeci, P.; Moody, J.; Gutierrez, M.; Tran, T.

    2009-01-01

    The Pegasus laboratory at UCLA features a state-of-the-art electron photoinjector capable of producing ultrashort (<100 fs) high-brightness electron bunches at energies of 3.75 MeV. These beams have recently been used to produce static diffraction patterns from scattering off thin metal foils, and it is foreseen to take advantage of the ultrashort nature of these bunches in future pump-probe time-resolved diffraction studies. In this paper, single-shot 2D electro-optic sampling is presented as a potential technique for time-of-arrival stamping of electron bunches used for diffraction. Effects of relatively low bunch charge (a few tens of pC) and modestly relativistic beams are discussed, and background compensation techniques to obtain a high signal-to-noise ratio are explored. From these preliminary tests, electro-optic sampling appears suitable as a reliable, nondestructive time-stamping method for relativistic ultrafast electron diffraction at the Pegasus lab.

  4. Sampling density for the quantitative evaluation of air trapping

    International Nuclear Information System (INIS)

    Goris, Michael L.; Robinson, Terry E.

    2009-01-01

    Concerns have been expressed recently about the radiation burden on patient populations, especially children, undergoing serial radiological testing. To reduce the dose one can change the CT acquisition settings or decrease the sampling density. In this study we determined the minimum desirable sampling density to ascertain the degree of air trapping in children with cystic fibrosis. Ten children with cystic fibrosis in stable condition underwent a volumetric spiral CT scan. The degree of air trapping was determined by an automated algorithm for all slices in the volume, and then for 1/2, 1/4, and so on down to 1/128 of all slices, i.e. a sampling density ranging from 100% to 1% of the total volume. The variation around the true value derived from 100% sampling was determined for all other sampling densities. The precision of the measurement remained stable down to a 10% sampling density, but decreased markedly below 3.4%. For a disease marker with the regional variability of air trapping in cystic fibrosis, regardless of observer variability, a sampling density below 10%, and even more so below 3.4%, apparently decreases the precision of the evaluation. (orig.)
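
    The subsampling experiment can be mimicked in a few lines, shown here only as a hedged sketch with simulated per-slice values rather than patient data: the whole-volume air-trapping fraction is compared with estimates from progressively sparser, evenly spaced slice subsets.

      # Compare a whole-volume air-trapping estimate with estimates from sparser,
      # evenly spaced slice subsets; per-slice values are simulated, not patient data.
      import numpy as np

      rng = np.random.default_rng(6)
      n_slices = 256
      # simulated per-slice air-trapping fraction with some regional structure
      slice_vals = np.clip(0.15 + 0.1 * np.sin(np.linspace(0, 6, n_slices))
                           + 0.05 * rng.normal(size=n_slices), 0, 1)
      true_value = slice_vals.mean()

      for step in (1, 2, 4, 8, 16, 32, 64, 128):
          # all evenly spaced subsets with this spacing (different starting offsets)
          estimates = [slice_vals[off::step].mean() for off in range(step)]
          err = np.max(np.abs(np.array(estimates) - true_value))
          print(f"sampling density {100/step:5.1f}%  worst-case error {err:.4f}")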

  5. Effect of Sample Storage Temperature and Time Delay on Blood Gases, Bicarbonate and pH in Human Arterial Blood Samples.

    Science.gov (United States)

    Mohammadhoseini, Elham; Safavi, Enayat; Seifi, Sepideh; Seifirad, Soroush; Firoozbakhsh, Shahram; Peiman, Soheil

    2015-03-01

    Results of arterial blood gas analysis can be biased by pre-analytical factors, such as the time interval before analysis, the temperature during storage and the syringe type. To investigate the effects of sample storage temperature and time delay on blood gas, bicarbonate and pH results in human arterial blood samples, 2.5 mL arterial blood samples were drawn from 45 patients via an indwelling intraarterial catheter. Each sample was divided into five equal samples and stored in multipurpose tuberculin plastic syringes. Blood gas analysis was performed on one of the five samples as soon as possible. The four other samples were divided into two groups stored at 22°C and 0°C. Blood gas analyses were repeated at 30 and 60 minutes after sampling. PaO2 of the samples stored at 0°C was increased significantly after 60 minutes (P = 0.007). The PaCO2 of the samples kept for 30 and 60 minutes at 22°C was significantly higher than the primary result (P = 0.04 and P < 0.05, respectively). In samples stored at 22°C, pH decreased significantly after 30 and 60 minutes (P = 0.017, P = 0.001). There were no significant differences in the other results of samples stored at 0°C or 22°C after 30 or 60 minutes. In samples stored in plastic syringes, overestimation of PaO2 levels should be noted if samples are cooled before analysis. In samples stored in plastic syringes, it is not necessary to store samples in iced water when analysis is delayed by up to one hour.

  6. Detection of Salmonella spp. in veterinary samples by combining selective enrichment and real-time PCR.

    Science.gov (United States)

    Goodman, Laura B; McDonough, Patrick L; Anderson, Renee R; Franklin-Guild, Rebecca J; Ryan, James R; Perkins, Gillian A; Thachil, Anil J; Glaser, Amy L; Thompson, Belinda S

    2017-11-01

    Rapid screening for enteric bacterial pathogens in clinical environments is essential for biosecurity. Salmonella found in veterinary hospitals, particularly Salmonella enterica serovar Dublin, can pose unique challenges for culture and testing because of its poor growth. Multiple Salmonella serovars including Dublin are emerging threats to public health given increasing prevalence and antimicrobial resistance. We adapted an automated food testing method to veterinary samples and evaluated the performance of the method in a variety of matrices including environmental samples (n = 81), tissues (n = 52), feces (n = 148), and feed (n = 29). A commercial kit was chosen as the basis for this approach in view of extensive performance characterizations published by multiple independent organizations. A workflow was established for efficiently and accurately testing veterinary matrices and environmental samples by use of real-time PCR after selective enrichment in Rappaport-Vassiliadis soya (RVS) medium. Using this method, the detection limit for S. Dublin improved by 100-fold over subculture on selective agars (eosin-methylene blue, brilliant green, and xylose-lysine-deoxycholate). Overall, the procedure was effective in detecting Salmonella spp. and provided next-day results.

  7. Estimation of time-delayed mutual information and bias for irregularly and sparsely sampled time-series

    International Nuclear Information System (INIS)

    Albers, D.J.; Hripcsak, George

    2012-01-01

    Highlights: ► Time-delayed mutual information for irregularly sampled time-series. ► Estimation bias for the time-delayed mutual information calculation. ► Fast, simple, PDF estimator independent, time-delayed mutual information bias estimate. ► Quantification of data-set-size limits of the time-delayed mutual information calculation. - Abstract: A method to estimate the time-dependent correlation via an empirical bias estimate of the time-delayed mutual information for a time-series is proposed. In particular, the bias of the time-delayed mutual information is shown to often be equivalent to the mutual information between two distributions of points from the same system separated by infinite time. Thus, intuitively, estimation of the bias is reduced to estimation of the mutual information between distributions of data points separated by large time intervals. The proposed bias estimation techniques are shown to work for Lorenz equations data and glucose time series data of three patients from the Columbia University Medical Center database.
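
    A hedged illustration of the bias idea, using a plug-in histogram estimator and a synthetic AR(1) series (both are assumptions; the paper's estimators and data differ): the time-delayed mutual information at a short lag is compared with the mutual information at a lag long enough that the true dependence has decayed, which serves as the bias estimate.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Plug-in (histogram) estimate of I(X;Y) in nats."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def tdmi(series, lag, bins=16):
    """Time-delayed mutual information between x(t) and x(t + lag)."""
    return mutual_information(series[:-lag], series[lag:], bins)

rng = np.random.default_rng(1)
n = 5000
x = np.zeros(n)                      # stationary AR(1) stand-in for a real series
for t in range(1, n):
    x[t] = 0.9 * x[t - 1] + rng.normal()

lag = 10
raw = tdmi(x, lag)
bias = tdmi(x, n // 2)               # dependence is essentially gone at this lag
print(f"TDMI(lag={lag}) = {raw:.3f} nats, bias estimate = {bias:.3f} nats")
print(f"bias-corrected TDMI ~ {raw - bias:.3f} nats")
```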

  8. Comparison of sampling procedures and microbiological and non-microbiological parameters to evaluate cleaning and disinfection in broiler houses.

    Science.gov (United States)

    Luyckx, K; Dewulf, J; Van Weyenberg, S; Herman, L; Zoons, J; Vervaet, E; Heyndrickx, M; De Reu, K

    2015-04-01

    Cleaning and disinfection of the broiler stable environment is an essential part of farm hygiene management. Adequate cleaning and disinfection is essential for prevention and control of animal diseases and zoonoses. The goal of this study was to shed light on the dynamics of microbiological and non-microbiological parameters during the successive steps of cleaning and disinfection and to select the most suitable sampling methods and parameters to evaluate cleaning and disinfection in broiler houses. The effectiveness of cleaning and disinfection protocols was measured in six broiler houses on two farms through visual inspection, adenosine triphosphate hygiene monitoring and microbiological analyses. Samples were taken at three time points: 1) before cleaning, 2) after cleaning, and 3) after disinfection. Before cleaning and after disinfection, air samples were taken in addition to agar contact plates and swab samples taken from various sampling points for enumeration of total aerobic flora, Enterococcus spp., and Escherichia coli and the detection of E. coli and Salmonella. After cleaning, air samples, swab samples, and adenosine triphosphate swabs were taken and a visual score was also assigned for each sampling point. The mean total aerobic flora determined by swab samples decreased from 7.7±1.4 to 5.7±1.2 log CFU/625 cm2 after cleaning and to 4.2±1.6 log CFU/625 cm2 after disinfection. Agar contact plates were used as the standard for evaluating cleaning and disinfection, but in this study they were found to be less suitable than swabs for enumeration. In addition to measuring total aerobic flora, Enterococcus spp. seemed to be a better hygiene indicator to evaluate cleaning and disinfection protocols than E. coli. All stables were Salmonella negative, but the detection of its indicator organism E. coli provided additional information for evaluating cleaning and disinfection protocols. Adenosine triphosphate analyses gave additional information about the

  9. Characterization of specimens obtained by different sampling methods for evaluation of periodontal bacteria.

    Science.gov (United States)

    Okada, Ayako; Sogabe, Kaoru; Takeuchi, Hiroaki; Okamoto, Masaaki; Nomura, Yoshiaki; Hanada, Nobuhiro

    2017-12-27

    Quantitative analysis of periodontal bacteria is considered useful for clinical diagnosis, evaluation and assessment of the risk of periodontal disease. The purpose of this study was to compare the effectiveness of sampling of saliva, supragingival plaque and subgingival plaque for evaluation of periodontal bacteria. From each of 12 subjects, i) subgingival plaque was collected from the deepest pocket using a sterile paper point, ii) stimulated whole saliva was collected after chewing gum, and iii) supragingival plaque was collected using a toothbrush. These samples were sent to the medical examination laboratory for quantitative analysis of the counts of three periodontal bacterial species: Porphyromonas gingivalis, Treponema denticola, and Tannerella forsythia. The proportions of these bacteria in subgingival plaque were higher than those in saliva or supragingival plaque, but the absolute counts were lower in subgingival plaque than in saliva or supragingival plaque. In several cases, periodontal bacteria were below the levels of detection in subgingival plaque. We concluded that samples taken from subgingival plaque may be more useful for evaluating the proportion of periodontal bacteria in deep pockets than the other samples. Therefore, for evaluation of periodontal bacteria, clinicians should consider the characteristics of the specimens obtained using different sampling methods.

  10. Interlaboratory study of DNA extraction from multiple ground samples, multiplex real-time PCR, and multiplex qualitative PCR for individual kernel detection system of genetically modified maize.

    Science.gov (United States)

    Akiyama, Hiroshi; Sakata, Kozue; Makiyma, Daiki; Nakamura, Kosuke; Teshima, Reiko; Nakashima, Akie; Ogawa, Asako; Yamagishi, Toru; Futo, Satoshi; Oguchi, Taichi; Mano, Junichi; Kitta, Kazumi

    2011-01-01

    In many countries, the labeling of grains, feed, and foodstuff is mandatory if the genetically modified (GM) organism content exceeds a certain level of approved GM varieties. We previously developed an individual kernel detection system consisting of grinding individual kernels, DNA extraction from the individually ground kernels, GM detection using multiplex real-time PCR, and GM event detection using multiplex qualitative PCR to analyze the precise commingling level and varieties of GM maize in real sample grains. We performed the interlaboratory study of the DNA extraction with multiple ground samples, multiplex real-time PCR detection, and multiplex qualitative PCR detection to evaluate its applicability, practicality, and ruggedness for the individual kernel detection system of GM maize. DNA extraction with multiple ground samples, multiplex real-time PCR, and multiplex qualitative PCR were evaluated by five laboratories in Japan, and all results from these laboratories were consistent with the expected results in terms of the commingling level and event analysis. Thus, the DNA extraction with multiple ground samples, multiplex real-time PCR, and multiplex qualitative PCR for the individual kernel detection system is applicable and practicable in a laboratory to regulate the commingling level of GM maize grain for GM samples, including stacked GM maize.

  11. Investigation of Bicycle Travel Time Estimation Using Bluetooth Sensors for Low Sampling Rates

    Directory of Open Access Journals (Sweden)

    Zhenyu Mei

    2014-10-01

    Full Text Available Filtering the data for bicycle travel time using Bluetooth sensors is crucial to the estimation of link travel times on a corridor. The current paper describes an adaptive filtering algorithm for estimating bicycle travel times using Bluetooth data, with consideration of low sampling rates. The data for bicycle travel time using Bluetooth sensors has two characteristics. First, the bicycle flow contains stable and unstable conditions. Second, the collected data have low sampling rates (less than 1%). To avoid erroneous inference, filters are introduced to “purify” multiple time series. The valid data are identified within a dynamically varying validity window with the use of a robust data-filtering procedure. The size of the validity window varies based on the number of preceding sampling intervals without a Bluetooth record. Applications of the proposed algorithm to the dataset from Genshan East Road and Moganshan Road in Hangzhou demonstrate its ability to track typical variations in bicycle travel time efficiently, while suppressing high frequency noise signals.
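
    The validity-window idea can be sketched as follows. This is a toy filter under assumed parameter names and update rules, not the paper's exact algorithm: records far from the running travel-time estimate are rejected, and the window is widened for every preceding sampling interval that produced no Bluetooth match.

```python
from typing import List, Optional

class ValidityWindowFilter:
    """Toy validity-window filter for sparse Bluetooth travel-time records."""

    def __init__(self, base_window: float = 120.0, widen_per_gap: float = 60.0):
        self.base_window = base_window      # seconds tolerated around the estimate
        self.widen_per_gap = widen_per_gap  # extra tolerance per empty interval
        self.estimate: Optional[float] = None
        self.gaps = 0

    def step(self, travel_times: List[float]) -> List[float]:
        """Process one sampling interval and return the records judged valid."""
        if not travel_times:                # no Bluetooth matches this interval
            self.gaps += 1
            return []
        window = self.base_window + self.widen_per_gap * self.gaps
        if self.estimate is None:
            valid = list(travel_times)
        else:
            valid = [t for t in travel_times if abs(t - self.estimate) <= window]
        if valid:
            med = sorted(valid)[len(valid) // 2]   # median of the valid records
            self.estimate = med if self.estimate is None else 0.7 * self.estimate + 0.3 * med
            self.gaps = 0
        return valid

f = ValidityWindowFilter()
for interval in ([310.0, 330.0], [], [], [905.0, 340.0]):   # 905 s is an outlier
    print(f.step(interval))
```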

  12. DOE methods for evaluating environmental and waste management samples

    Energy Technology Data Exchange (ETDEWEB)

    Goheen, S.C.; McCulloch, M.; Thomas, B.L.; Riley, R.G.; Sklarew, D.S.; Mong, G.M.; Fadeff, S.K. [eds.]

    1994-10-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) is a resource intended to support sampling and analytical activities for the evaluation of environmental and waste management samples from U.S. Department of Energy (DOE) sites. DOE Methods is the result of extensive cooperation from all DOE analytical laboratories. All of these laboratories have contributed key information and provided technical reviews as well as significant moral support leading to the success of this document. DOE Methods is designed to encompass methods for collecting representative samples and for determining the radioisotope activity and organic and inorganic composition of a sample. These determinations will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the U.S. Environmental Protection Agency, or others. The development of DOE Methods is supported by the Analytical Services Division of DOE. Unique methods or methods consolidated from similar procedures in the DOE Procedures Database are selected for potential inclusion in this document. Initial selection is based largely on DOE needs and procedure applicability and completeness. Methods appearing in this document are one of two types, "Draft" or "Verified". "Draft" methods that have been reviewed internally and show potential for eventual verification are included in this document, but they have not been reviewed externally, and their precision and bias may not be known. "Verified" methods in DOE Methods have been reviewed by volunteers from various DOE sites and private corporations. These methods have delineated measures of precision and accuracy.

  13. DOE methods for evaluating environmental and waste management samples

    International Nuclear Information System (INIS)

    Goheen, S.C.; McCulloch, M.; Thomas, B.L.; Riley, R.G.; Sklarew, D.S.; Mong, G.M.; Fadeff, S.K.

    1994-10-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) is a resource intended to support sampling and analytical activities for the evaluation of environmental and waste management samples from U.S. Department of Energy (DOE) sites. DOE Methods is the result of extensive cooperation from all DOE analytical laboratories. All of these laboratories have contributed key information and provided technical reviews as well as significant moral support leading to the success of this document. DOE Methods is designed to encompass methods for collecting representative samples and for determining the radioisotope activity and organic and inorganic composition of a sample. These determinations will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the U.S. Environmental Protection Agency, or others. The development of DOE Methods is supported by the Analytical Services Division of DOE. Unique methods or methods consolidated from similar procedures in the DOE Procedures Database are selected for potential inclusion in this document. Initial selection is based largely on DOE needs and procedure applicability and completeness. Methods appearing in this document are one of two types, "Draft" or "Verified". "Draft" methods that have been reviewed internally and show potential for eventual verification are included in this document, but they have not been reviewed externally, and their precision and bias may not be known. "Verified" methods in DOE Methods have been reviewed by volunteers from various DOE sites and private corporations. These methods have delineated measures of precision and accuracy.

  14. Learning Bounds of ERM Principle for Sequences of Time-Dependent Samples

    Directory of Open Access Journals (Sweden)

    Mingchen Yao

    2015-01-01

    Full Text Available Many generalization results in learning theory are established under the assumption that samples are independent and identically distributed (i.i.d.). However, numerous learning tasks in practical applications involve the time-dependent data. In this paper, we propose a theoretical framework to analyze the generalization performance of the empirical risk minimization (ERM) principle for sequences of time-dependent samples (TDS). In particular, we first present the generalization bound of ERM principle for TDS. By introducing some auxiliary quantities, we also give a further analysis of the generalization properties and the asymptotical behaviors of ERM principle for TDS.

  15. Testing an Impedance Non-destructive Method to Evaluate Steel-Fiber Concrete Samples

    Science.gov (United States)

    Komarkova, Tereza; Fiala, Pavel; Steinbauer, Miloslav; Roubal, Zdenek

    2018-02-01

    Steel-fiber reinforced concrete is a composite material characterized by outstanding tensile properties and resistance to the development of cracks. The concrete, however, exhibits such characteristics only on the condition that the steel fibers in the final, hardened composite have been distributed evenly. The current methods to evaluate the distribution and concentration of a fiber composite are either destructive or exhibit a limited capability of evaluating the concentration and orientation of the fibers. In this context, the paper discusses tests related to the evaluation of the density and orientation of fibers in a composite material. Compared to the approaches used to date, the proposed technique is based on the evaluation of the electrical impedance Z in the band close to the resonance of the sensor-sample configuration. Using analytically expressed equations, we can evaluate the monitored part of the composite and its density at various depths of the tested sample. The method employs test blocks of composites, utilizing the resonance of the measuring device and the measured sample set; the desired state occurs in the interval between f = 3 kHz and f = 400 kHz.

  16. OUTPACE long duration stations: physical variability, context of biogeochemical sampling, and evaluation of sampling strategy

    Directory of Open Access Journals (Sweden)

    A. de Verneil

    2018-04-01

    Full Text Available Research cruises to quantify biogeochemical fluxes in the ocean require taking measurements at stations lasting at least several days. A popular experimental design is the quasi-Lagrangian drifter, often mounted with in situ incubations or sediment traps that follow the flow of water over time. After initial drifter deployment, the ship tracks the drifter for continuing measurements that are supposed to represent the same water environment. An outstanding question is how to best determine whether this is true. During the Oligotrophy to UlTra-oligotrophy PACific Experiment (OUTPACE) cruise, from 18 February to 3 April 2015 in the western tropical South Pacific, three separate stations of long duration (five days) over the upper 500 m were conducted in this quasi-Lagrangian sampling scheme. Here we present physical data to provide context for these three stations and to assess whether the sampling strategy worked, i.e., that a single body of water was sampled. After analyzing tracer variability and local water circulation at each station, we identify water layers and times where the drifter risks encountering another body of water. While almost no realization of this sampling scheme will be truly Lagrangian, due to the presence of vertical shear, the depth-resolved observations during the three stations show most layers sampled sufficiently homogeneous physical environments during OUTPACE. By directly addressing the concerns raised by these quasi-Lagrangian sampling platforms, a protocol of best practices can begin to be formulated so that future research campaigns include the complementary datasets and analyses presented here to verify the appropriate use of the drifter platform.

  17. REAL-TIME PCR DETECTION OF LISTERIA MONOCYTOGENES IN FOOD SAMPLES OF ANIMAL ORIGIN

    Directory of Open Access Journals (Sweden)

    Jaroslav Pochop

    2013-02-01

    Full Text Available The aim of this study was to follow the contamination of food with Listeria monocytogenes by using Step One real-time polymerase chain reaction (PCR). We used the PrepSEQ Rapid Spin Sample Preparation Kit for isolation of DNA and the SensiFAST SYBR Hi-ROX Kit for the real-time PCR performance. Of 24 samples of food of animal origin analyzed without incubation, strains of Listeria monocytogenes were detected in 15 samples (swabs); nine samples were negative. Our results indicated that the real-time PCR assay developed in this study could sensitively detect Listeria monocytogenes in food of animal origin without incubation. This could prevent infection caused by Listeria monocytogenes, and also could benefit food manufacturing companies by extending their product’s shelf-life as well as saving the cost of warehousing their food products while awaiting pathogen testing results. The rapid real-time PCR-based method performed very well compared to the conventional method. It is a fast, simple, specific and sensitive way to detect nucleic acids, which could be used in clinical diagnostic tests in the future.

  18. Evaluation of sample preparation methods and optimization of nickel determination in vegetable tissues

    Directory of Open Access Journals (Sweden)

    Rodrigo Fernando dos Santos Salazar

    2011-02-01

    Full Text Available Nickel, although essential to plants, may be toxic to plants and animals. It is mainly assimilated by food ingestion. However, information about the average levels of elements (including Ni) in edible vegetables from different regions is still scarce in Brazil. The objectives of this study were to: (a) evaluate and optimize a method for preparation of vegetable tissue samples for Ni determination; (b) optimize the analytical procedures for determination by Flame Atomic Absorption Spectrometry (FAAS) and by Electrothermal Atomic Absorption (ETAAS) in vegetable samples; and (c) determine the Ni concentration in vegetables consumed in the cities of Lorena and Taubaté in the Vale do Paraíba, State of São Paulo, Brazil. By means of the analytical technique for determination by ETAAS or FAAS, the results were validated by the test of analyte addition and recovery. The most viable method tested for quantification of this element was HClO4-HNO3 wet digestion. All samples but carrot tissue collected in Lorena contained Ni levels above those permitted by the Brazilian Ministry of Health. The most disturbing results, requiring more detailed studies, were the Ni concentrations measured in carrot samples from Taubaté, where levels were five times higher than permitted by Brazilian regulations.

  19. Sampling strategies in antimicrobial resistance monitoring: evaluating how precision and sensitivity vary with the number of animals sampled per farm.

    Directory of Open Access Journals (Sweden)

    Takehisa Yamamoto

    Full Text Available Because antimicrobial resistance in food-producing animals is a major public health concern, many countries have implemented antimicrobial monitoring systems at a national level. When designing a sampling scheme for antimicrobial resistance monitoring, it is necessary to consider both cost effectiveness and statistical plausibility. In this study, we examined how sampling scheme precision and sensitivity can vary with the number of animals sampled from each farm, while keeping the overall sample size constant to avoid additional sampling costs. Five sampling strategies were investigated. These employed 1, 2, 3, 4 or 6 animal samples per farm, with a total of 12 animals sampled in each strategy. A total of 1,500 Escherichia coli isolates from 300 fattening pigs on 30 farms were tested for resistance against 12 antimicrobials. The performance of each sampling strategy was evaluated by bootstrap resampling from the observational data. In the bootstrapping procedure, farms, animals, and isolates were selected randomly with replacement, and a total of 10,000 replications were conducted. For each antimicrobial, we observed that the standard deviation and 2.5-97.5 percentile interval of resistance prevalence were smallest in the sampling strategy that employed 1 animal per farm. The proportion of bootstrap samples that included at least 1 isolate with resistance was also evaluated as an indicator of the sensitivity of the sampling strategy to previously unidentified antimicrobial resistance. The proportion was greatest with 1 sample per farm and decreased with larger samples per farm. We concluded that when the total number of samples is pre-specified, the most precise and sensitive sampling strategy involves collecting 1 sample per farm.
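
    The farm/animal/isolate bootstrap can be sketched as below. Synthetic data and a simplified structure are assumptions (the study resampled its observed 30-farm data set): farms, then animals within farms, then isolates within animals are drawn with replacement, and the spread of the resulting prevalence estimates is compared across strategies.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in: 30 farms x 10 animals x 5 isolates, resistance coded 0/1.
farm_prevalence = rng.uniform(0.0, 0.3, size=30)
farms = [rng.binomial(1, p, size=(10, 5)) for p in farm_prevalence]

def bootstrap_prevalence(farms, animals_per_farm, total_animals=12, reps=2000):
    """Resample farms, then animals, then isolates, all with replacement."""
    n_farms = total_animals // animals_per_farm
    out = np.empty(reps)
    for r in range(reps):
        hits = n = 0
        for fi in rng.integers(0, len(farms), size=n_farms):
            farm = farms[fi]
            for ai in rng.integers(0, farm.shape[0], size=animals_per_farm):
                isolates = rng.choice(farm[ai], size=farm.shape[1], replace=True)
                hits += int(isolates.sum())
                n += isolates.size
        out[r] = hits / n
    return out

for k in (1, 2, 3, 4, 6):
    p = bootstrap_prevalence(farms, k)
    lo, hi = np.percentile(p, [2.5, 97.5])
    print(f"{k} animal(s)/farm: sd={p.std():.3f}, 2.5-97.5% interval=({lo:.3f}, {hi:.3f})")
```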

  20. Statistical evaluations of current sampling procedures and incomplete core recovery

    International Nuclear Information System (INIS)

    Heasler, P.G.; Jensen, L.

    1994-03-01

    This document develops two formulas that describe the effects of incomplete recovery on core sampling results for the Hanford waste tanks. The formulas evaluate incomplete core recovery from a worst-case (i.e., biased) and best-case (i.e., unbiased) perspective. A core sampler is unbiased if the sample material recovered is a random sample of the material in the tank, while any sampler that preferentially recovers a particular type of waste over others is a biased sampler. There is strong evidence to indicate that the push-mode sampler presently used at the Hanford site is a biased one. The formulas presented here show the effects of incomplete core recovery on the accuracy of composition measurements, as functions of the vertical variability in the waste. These equations are evaluated using vertical variability estimates from previously sampled tanks (B110, U110, C109). Assuming that the values of vertical variability used in this study adequately describes the Hanford tank farm, one can use the formulas to compute the effect of incomplete recovery on the accuracy of an average constituent estimate. To determine acceptable recovery limits, we have assumed that the relative error of such an estimate should be no more than 20%.
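
    The report's actual formulas are not reproduced in this record; the toy simulation below only illustrates the worst-case versus best-case contrast under an assumed vertical concentration gradient: a biased sampler that recovers only the top of the core against an unbiased sampler that recovers a random subset of segments.

```python
import numpy as np

rng = np.random.default_rng(7)

def recovery_error(recovery, gradient=2.0, sigma_v=10.0, n_segments=20, reps=5000):
    """Mean relative error (%) of the average-composition estimate at a given
    core recovery fraction, for a top-only (biased) and a random (unbiased) sampler."""
    n_rec = max(1, int(round(recovery * n_segments)))
    worst, best = [], []
    for _ in range(reps):
        depth = np.arange(n_segments)
        core = 100.0 + gradient * depth + rng.normal(0.0, sigma_v, n_segments)
        truth = core.mean()
        worst.append(abs(core[:n_rec].mean() - truth) / truth * 100)            # top only
        best.append(abs(rng.choice(core, n_rec, replace=False).mean() - truth) / truth * 100)
    return np.mean(worst), np.mean(best)

for recovery in (1.0, 0.8, 0.6, 0.4):
    w, b = recovery_error(recovery)
    print(f"recovery {recovery:.0%}: top-only error {w:.1f}%, random-segment error {b:.1f}%")
```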

  1. Brine Sampling and Evaluation Program

    International Nuclear Information System (INIS)

    Deal, D.E.; Case, J.B.; Deshler, R.M.; Drez, P.E.; Myers, J.; Tyburski, J.R.

    1987-12-01

    The Brine Sampling and Evaluation Program (BSEP) Phase II Report is an interim report which updates the data released in the BSEP Phase I Report. Direct measurements and observations of the brine that seeps into the WIPP repository excavations were continued through the period between August 1986 and July 1987. That data is included in Appendix A, which extends the observation period for some locations to approximately 900 days. Brine observations at 87 locations are presented in this report. Although WIPP underground workings are considered "dry," small amounts of brine are present. Part of that brine migrates into the repository in response to pressure gradients at essentially isothermal conditions. The data presented in this report is a continuation of moisture content studies of the WIPP facility horizon that were initiated in 1982, as soon as underground drifts began to be excavated. Brine seepages are manifested by salt efflorescences, moist areas, and fluid accumulations in drillholes. 35 refs., 6 figs., 11 tabs.

  2. Estimating time to pregnancy from current durations in a cross-sectional sample

    DEFF Research Database (Denmark)

    Keiding, Niels; Kvist, Kajsa; Hartvig, Helle

    2002-01-01

    A new design for estimating the distribution of time to pregnancy is proposed and investigated. The design is based on recording current durations in a cross-sectional sample of women, leading to statistical problems similar to estimating renewal time distributions from backward recurrence times....
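
    The design rests on a standard renewal-theory identity; a hedged sketch in assumed notation (not copied from the paper): if time to pregnancy T has survivor function S and finite mean, the current duration observed in a cross-section behaves as a backward recurrence time.

```latex
% Under stationarity, the current duration Y has density proportional to the
% survivor function of the time to pregnancy T:
\[
  g(y) \;=\; \frac{S(y)}{\mu}, \qquad S(y) = \Pr(T > y), \quad
  \mu = \int_0^\infty S(t)\,\mathrm{d}t ,
\]
% so the survivor function can be recovered from the observed current
% durations (up to the normalising mean) via
\[
  S(t) \;=\; \frac{g(t)}{g(0)} .
\]
```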

  3. Evaluation and application of static headspace-multicapillary column-gas chromatography-ion mobility spectrometry for complex sample analysis.

    Science.gov (United States)

    Denawaka, Chamila J; Fowlis, Ian A; Dean, John R

    2014-04-18

    An evaluation of static headspace-multicapillary column-gas chromatography-ion mobility spectrometry (SHS-MCC-GC-IMS) has been undertaken to assess its applicability for the determination of 32 volatile compounds (VCs). The key experimental variables of sample incubation time and temperature have been evaluated alongside the MCC-GC variables of column polarity, syringe temperature, injection temperature, injection volume, column temperature and carrier gas flow rate coupled with the IMS variables of temperature and drift gas flow rate. This evaluation resulted in six sets of experimental variables being required to separate the 32 VCs. The optimum experimental variables for SHS-MCC-GC-IMS, the retention time and drift time operating parameters were determined; to normalise the operating parameters, the relative drift time and normalised reduced ion mobility for each VC were determined. In addition, a full theoretical explanation is provided on the formation of the monomer, dimer and trimer of a VC. The optimum operating condition for each VC calibration data was obtained alongside limit of detection (LOD) and limit of quantitation (LOQ) values. Typical detection limits ranged from 0.1 ng for bis(methylthio)methane, ethylbutanoate and (E)-2-nonenal to 472 ng for isovaleric acid, with correlation coefficient (R²) data ranging from 0.9793 (for the dimer of octanal) through to 0.9990 (for isobutyric acid). Finally, the developed protocols were applied to the analysis of malodour in sock samples. Initial work involved spiking an inert matrix and sock samples with appropriate concentrations of eight VCs. The average recovery from the inert matrix was 101±18% (n=8), while recoveries from the sock samples were lower, that is, 54±30% (n=8) for sock type 1 and 78±24% (n=6) for sock type 2. Finally, SHS-MCC-GC-IMS was applied to sock malodour in a field trial based on 11 volunteers (mixed gender) over a 3-week period. By applying the SHS-MCC-GC-IMS database, four VCs were
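
    One common way to turn calibration data into LOD/LOQ figures of this kind is the 3.3·σ/slope and 10·σ/slope convention; the sketch below assumes that convention and invented calibration points (the paper may have used a different criterion).

```python
import numpy as np

# Hypothetical calibration data for one volatile compound: spiked amount (ng)
# versus instrument response (arbitrary units). All values are assumptions.
amount = np.array([0.5, 1, 2, 5, 10, 20, 50], dtype=float)
response = np.array([12, 25, 47, 118, 242, 480, 1190], dtype=float)

slope, intercept = np.polyfit(amount, response, 1)
pred = slope * amount + intercept
residual_sd = np.sqrt(np.sum((response - pred) ** 2) / (len(amount) - 2))

r = np.corrcoef(amount, response)[0, 1]
lod = 3.3 * residual_sd / slope     # 3.3*sigma/slope convention
loq = 10.0 * residual_sd / slope

print(f"R^2 = {r**2:.4f}, LOD ~ {lod:.2f} ng, LOQ ~ {loq:.2f} ng")
```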

  4. Effects of holding time and measurement error on culturing Legionella in environmental water samples.

    Science.gov (United States)

    Flanders, W Dana; Kirkland, Kimberly H; Shelton, Brian G

    2014-10-01

    Outbreaks of Legionnaires' disease require environmental testing of water samples from potentially implicated building water systems to identify the source of exposure. A previous study reports a large impact on Legionella sample results due to shipping and delays in sample processing. Specifically, that study, without accounting for measurement error, reports that more than half of the shipped samples tested had Legionella levels that arbitrarily changed up or down by one or more logs, and the authors attribute this result to shipping time. Accordingly, we conducted a study to determine the effects of sample holding/shipping time on Legionella sample results while taking into account measurement error, which has previously not been addressed. We analyzed 159 samples, each split into 16 aliquots, of which one-half (8) were processed promptly after collection. The remaining half (8) were processed the following day to assess the impact of holding/shipping time. A total of 2544 samples were analyzed including replicates. After accounting for inherent measurement error, we found that the effect of holding time on observed Legionella counts was small and should have no practical impact on interpretation of results. Holding samples increased the root mean squared error by only about 3-8%. Notably, for only one of the 159 samples did the average of the 8 replicate counts change by 1 log. Thus, our findings do not support the hypothesis of frequent, significant (≥1 log10 unit) Legionella colony count changes due to holding. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  5. Evaluating the stability of DSM-5 PTSD symptom network structure in a national sample of U.S. military veterans.

    Science.gov (United States)

    von Stockert, Sophia H H; Fried, Eiko I; Armour, Cherie; Pietrzak, Robert H

    2018-03-15

    Previous studies have used network models to investigate how PTSD symptoms associate with each other. However, analyses examining the degree to which these networks are stable over time, which are critical to identifying symptoms that may contribute to the chronicity of this disorder, are scarce. In the current study, we evaluated the temporal stability of DSM-5 PTSD symptom networks over a three-year period in a nationally representative sample of trauma-exposed U.S. military veterans. Data were analyzed from 611 trauma-exposed U.S. military veterans who participated in the National Health and Resilience in Veterans Study (NHRVS). We estimated regularized partial correlation networks of DSM-5 PTSD symptoms at baseline (Time 1) and at three-year follow-up (Time 2), and examined their temporal stability. Evaluation of the network structure of PTSD symptoms at Time 1 and Time 2 using a formal network comparison indicated that the Time 1 network did not differ significantly from the Time 2 network with regard to network structure (p = 0.12) or global strength (sum of all absolute associations, i.e. connectivity; p = 0.25). Centrality estimates of both networks (r = 0.86) and adjacency matrices (r = 0.69) were highly correlated. In both networks, avoidance, intrusive, and negative cognition and mood symptoms were among the more central nodes. This study is limited by the use of a self-report instrument to assess PTSD symptoms and recruitment of a relatively homogeneous sample of predominantly older, Caucasian veterans. Results of this study demonstrate the three-year stability of DSM-5 PTSD symptom network structure in a nationally representative sample of trauma-exposed U.S. military veterans. They further suggest that trauma-related avoidance, intrusive, and dysphoric symptoms may contribute to the chronicity of PTSD symptoms in this population. Published by Elsevier B.V.
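
    A hedged sketch of a regularized partial-correlation network and the two stability summaries quoted above (centrality and adjacency correlations). The graphical-lasso estimator, the synthetic two-wave data, and all names below are assumptions; the study used a different, R-based estimator on real symptom data.

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV

def partial_correlation_network(data):
    """Regularized partial correlations from the graphical-lasso precision matrix."""
    prec = GraphicalLassoCV().fit(data).precision_
    d = np.sqrt(np.diag(prec))
    pcorr = -prec / np.outer(d, d)
    np.fill_diagonal(pcorr, 0.0)
    return pcorr

rng = np.random.default_rng(0)
# Synthetic stand-in for 20 symptom scores measured twice in 611 respondents.
latent = rng.normal(size=(611, 3))
loadings = rng.normal(size=(3, 20))
time1 = latent @ loadings + rng.normal(size=(611, 20))
time2 = 0.9 * latent @ loadings + rng.normal(size=(611, 20))

net1, net2 = partial_correlation_network(time1), partial_correlation_network(time2)

# Strength centrality and adjacency-matrix correlations, as summarized above.
strength1, strength2 = np.abs(net1).sum(axis=0), np.abs(net2).sum(axis=0)
iu = np.triu_indices_from(net1, k=1)
print("centrality correlation:", round(float(np.corrcoef(strength1, strength2)[0, 1]), 2))
print("adjacency correlation: ", round(float(np.corrcoef(net1[iu], net2[iu])[0, 1]), 2))
```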

  6. Synchronization of Hierarchical Time-Varying Neural Networks Based on Asynchronous and Intermittent Sampled-Data Control.

    Science.gov (United States)

    Xiong, Wenjun; Patel, Ragini; Cao, Jinde; Zheng, Wei Xing

    In this brief, our purpose is to apply asynchronous and intermittent sampled-data control methods to achieve the synchronization of hierarchical time-varying neural networks. The asynchronous and intermittent sampled-data controllers are proposed for two reasons: 1) the controllers may not transmit the control information simultaneously and 2) the controllers cannot always exist at any time. The synchronization is then discussed for a kind of hierarchical time-varying neural networks based on the asynchronous and intermittent sampled-data controllers. Finally, the simulation results are given to illustrate the usefulness of the developed criteria.

  7. Real-time PCR quantification of six periodontal pathogens in saliva samples from healthy young adults.

    Science.gov (United States)

    Zhou, Xiaodong; Liu, Xiaoli; Li, Jing; Aprecio, Raydolfo M; Zhang, Wu; Li, Yiming

    2015-05-01

    The use of saliva as a diagnostic fluid for the evaluation of periodontal health has gained attention recently. Most published real-time PCR assays have focused on quantification of bacteria in subgingival plaque, not in saliva. The aims of this study were to develop a real-time PCR assay for quantification of six periodontal pathogens in saliva and to establish a relationship between the amount of DNA (fg) and colony-forming units (CFU). TaqMan primers/probe sets were used for the detection of Aggregatibacter actinomycetemcomitans (Aa), Eikenella corrodens (Ec), Fusobacterium nucleatum (Fn), Porphyromonas gingivalis (Pg), Prevotella intermedia (Pi), Tannerella forsythia (Tf), and total bacteria. The six periodontal pathogens and total bacteria in saliva from 24 periodontally healthy individuals were determined. The relationship between the amount of DNA (fg) and CFU was established by measuring the concentrations of extracted bacterial DNA and the CFU per milliliter of bacteria on agar plates. Fn, Ec, and Pi were detected in all saliva samples, while Tf, Pg, and Aa were detected in 58.5, 45.8, and 33.3% of samples, respectively. Numbers of Ec and Fn in saliva were highly correlated (R² = 0.93, P < 0.05). The assay could thus quantify periodontal pathogens in saliva and estimate the number of live bacteria (CFU). This real-time PCR assay, in combination with the relationship between DNA (fg) and CFU, has the potential to be an adjunct in the evaluation of periodontal health status.
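
    The fg-to-CFU relationship can be applied as a simple calibration, sketched below with invented paired measurements (the record does not give the actual conversion factors):

```python
import numpy as np

# Hypothetical paired measurements for one species: DNA recovered from serial
# dilutions of a culture (fg) and the plate count of the same dilutions (CFU/mL).
dna_fg = np.array([2.5e4, 2.6e5, 2.4e6, 2.7e7, 2.5e8])
cfu_ml = np.array([1.0e4, 1.0e5, 1.0e6, 1.0e7, 1.0e8])

# Linear calibration in log-log space: log10(CFU/mL) as a function of log10(DNA fg).
slope, intercept = np.polyfit(np.log10(dna_fg), np.log10(cfu_ml), 1)

def dna_to_cfu(fg):
    """Estimate live-cell equivalents (CFU/mL) from a qPCR DNA quantity (fg)."""
    return 10 ** (slope * np.log10(fg) + intercept)

print(f"estimated CFU/mL for 5.0e5 fg of DNA: {dna_to_cfu(5.0e5):.3g}")
```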

  8. Molecular dynamics based enhanced sampling of collective variables with very large time steps

    Science.gov (United States)

    Chen, Pei-Yang; Tuckerman, Mark E.

    2018-01-01

    Enhanced sampling techniques that target a set of collective variables and that use molecular dynamics as the driving engine have seen widespread application in the computational molecular sciences as a means to explore the free-energy landscapes of complex systems. The use of molecular dynamics as the fundamental driver of the sampling requires the introduction of a time step whose magnitude is limited by the fastest motions in a system. While standard multiple time-stepping methods allow larger time steps to be employed for the slower and computationally more expensive forces, the maximum achievable increase in time step is limited by resonance phenomena, which inextricably couple fast and slow motions. Recently, we introduced deterministic and stochastic resonance-free multiple time step algorithms for molecular dynamics that solve this resonance problem and allow ten- to twenty-fold gains in the large time step compared to standard multiple time step algorithms [P. Minary et al., Phys. Rev. Lett. 93, 150201 (2004); B. Leimkuhler et al., Mol. Phys. 111, 3579-3594 (2013)]. These methods are based on the imposition of isokinetic constraints that couple the physical system to Nosé-Hoover chains or Nosé-Hoover Langevin schemes. In this paper, we show how to adapt these methods for collective variable-based enhanced sampling techniques, specifically adiabatic free-energy dynamics/temperature-accelerated molecular dynamics, unified free-energy dynamics, and by extension, metadynamics, thus allowing simulations employing these methods to employ similarly very large time steps. The combination of resonance-free multiple time step integrators with free-energy-based enhanced sampling significantly improves the efficiency of conformational exploration.
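
    For context, the baseline multiple-time-step splitting that the resonance problem limits can be written as a standard r-RESPA step. This is not the isokinetic, resonance-free scheme of the paper, only the conventional splitting it improves on, and the toy forces are assumptions.

```python
def respa_step(x, v, m, fast_force, slow_force, dt_slow, n_inner):
    """One reversible RESPA step: the slow force is applied with the long step,
    the fast force is integrated with n_inner velocity-Verlet substeps."""
    dt_fast = dt_slow / n_inner
    v += 0.5 * dt_slow * slow_force(x) / m          # slow half-kick
    for _ in range(n_inner):                        # fast inner loop
        v += 0.5 * dt_fast * fast_force(x) / m
        x += dt_fast * v
        v += 0.5 * dt_fast * fast_force(x) / m
    v += 0.5 * dt_slow * slow_force(x) / m          # slow half-kick
    return x, v

# Toy 1-D system: a stiff harmonic "bond" (fast) plus a weak anharmonic term (slow).
fast = lambda x: -100.0 * x
slow = lambda x: -0.1 * x ** 3
x, v, m = 1.0, 0.0, 1.0
for _ in range(1000):
    x, v = respa_step(x, v, m, fast, slow, dt_slow=0.05, n_inner=10)
print(f"x = {x:.4f}, v = {v:.4f}")
```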

  9. Evaluation of factor for one-point venous blood sampling method based on the causality model

    International Nuclear Information System (INIS)

    Matsutomo, Norikazu; Onishi, Hideo; Kobara, Kouichi; Sasaki, Fumie; Watanabe, Haruo; Nagaki, Akio; Mimura, Hiroaki

    2009-01-01

    The one-point venous blood sampling method (Mimura et al.) can evaluate the regional cerebral blood flow (rCBF) value with a high degree of accuracy. However, the technique is complex because it requires a venous blood octanol value, and its accuracy is affected by the factors used in the input function. Therefore, we evaluated the factors used in the input function in order to determine an accurate input function and to simplify the technique. Input functions were created that use the time-dependent brain counts at 5 minutes, 15 minutes, and 25 minutes after administration, as well as an input function in which the arterial octanol value is used as the objective variable so that the venous blood octanol value can be excluded. The correlation between these functions and the rCBF value obtained by the microsphere (MS) method was then evaluated. Creation of a high-accuracy input function and simplification of the technique proved possible. The rCBF value obtained from the input function whose factor is the brain count at 5 minutes after administration, with the arterial octanol value as the objective variable, correlated highly with the MS method (y = 0.899x + 4.653, r = 0.842). (author)

  10. Development of real time visual evaluation system for sodium transient thermohydraulic experiments

    International Nuclear Information System (INIS)

    Tanigawa, Shingo

    1990-01-01

    A real-time visual evaluation system, the Liquid Metal Visual Evaluation System (LIVES), has been developed for the Plant Dynamics Test Loop facility at the O-arai Engineering Center. This facility is designed to provide sodium transient thermohydraulic experimental data not only for a fuel subassembly but also for a plant-wide system simulating abnormal or accident conditions in liquid metal fast breeder reactors. Since liquid sodium is invisible, measurements to obtain experimental data are mainly conducted by numerous thermocouples installed at various locations in the test sections and the facility. The transient thermohydraulic phenomena are the result of complicated interactions among global- and local-scale three-dimensional phenomena, and short- and long-time-scale phenomena. It is, therefore, difficult to grasp the thermohydraulic behavior intuitively and to observe accurately both the temperature distribution and the flow condition solely from digital data or various types of analog data when evaluating the experimental results. To conduct sodium transient experiments effectively and to make it possible to observe thermohydraulic phenomena precisely, a real-time visualization technique for transient thermohydraulics has been developed using the latest engineering workstation. The system makes it possible to observe and compare the experimental and analytical results instantly while the experiment or analysis is in progress. The results are shown not only as time trend curves but also as graphic animations. This paper outlines the system and presents sample applications. (author)

  11. Evaluation of Legionella Air Contamination in Healthcare Facilities by Different Sampling Methods: An Italian Multicenter Study.

    Science.gov (United States)

    Montagna, Maria Teresa; De Giglio, Osvalda; Cristina, Maria Luisa; Napoli, Christian; Pacifico, Claudia; Agodi, Antonella; Baldovin, Tatjana; Casini, Beatrice; Coniglio, Maria Anna; D'Errico, Marcello Mario; Delia, Santi Antonino; Deriu, Maria Grazia; Guida, Marco; Laganà, Pasqualina; Liguori, Giorgio; Moro, Matteo; Mura, Ida; Pennino, Francesca; Privitera, Gaetano; Romano Spica, Vincenzo; Sembeni, Silvia; Spagnolo, Anna Maria; Tardivo, Stefano; Torre, Ida; Valeriani, Federica; Albertini, Roberto; Pasquarella, Cesira

    2017-06-22

    Healthcare facilities (HF) represent an at-risk environment for legionellosis transmission occurring after inhalation of contaminated aerosols. In general, the control of water is preferred to that of air because, to date, there are no standardized sampling protocols. Legionella air contamination was investigated in the bathrooms of 11 HF by active sampling (Surface Air System and Coriolis®μ) and passive sampling using settling plates. During the 8-hour sampling, hot tap water was sampled three times. All air samples were evaluated using culture-based methods, whereas liquid samples collected using the Coriolis®μ were also analyzed by real-time PCR. Legionella presence in the air and water was then compared by sequence-based typing (SBT) methods. Air contamination was found in four HF (36.4%) by at least one of the culturable methods. The culturable investigation by Coriolis®μ did not yield Legionella in any enrolled HF. However, molecular investigation using Coriolis®μ resulted in eight HF testing positive for Legionella in the air. Comparison of Legionella air and water contamination indicated that Legionella water concentration could be predictive of its presence in the air. Furthermore, a molecular study of 12 L. pneumophila strains confirmed a match between the Legionella strains from air and water samples by SBT for three out of four HF that tested positive for Legionella by at least one of the culturable methods. Overall, our study shows that Legionella air detection cannot replace water sampling because the absence of microorganisms from the air does not necessarily represent their absence from water; nevertheless, air sampling may provide useful information for risk assessment. The liquid impingement technique appears to have the greatest capacity for collecting airborne Legionella if combined with molecular investigations.

  12. Evaluation of Legionella Air Contamination in Healthcare Facilities by Different Sampling Methods: An Italian Multicenter Study

    Science.gov (United States)

    Montagna, Maria Teresa; De Giglio, Osvalda; Cristina, Maria Luisa; Napoli, Christian; Pacifico, Claudia; Agodi, Antonella; Baldovin, Tatjana; Casini, Beatrice; Coniglio, Maria Anna; D’Errico, Marcello Mario; Delia, Santi Antonino; Deriu, Maria Grazia; Guida, Marco; Laganà, Pasqualina; Liguori, Giorgio; Moro, Matteo; Mura, Ida; Pennino, Francesca; Privitera, Gaetano; Romano Spica, Vincenzo; Sembeni, Silvia; Spagnolo, Anna Maria; Tardivo, Stefano; Torre, Ida; Valeriani, Federica; Albertini, Roberto; Pasquarella, Cesira

    2017-01-01

    Healthcare facilities (HF) represent an at-risk environment for legionellosis transmission occurring after inhalation of contaminated aerosols. In general, the control of water is preferred to that of air because, to date, there are no standardized sampling protocols. Legionella air contamination was investigated in the bathrooms of 11 HF by active sampling (Surface Air System and Coriolis®μ) and passive sampling using settling plates. During the 8-hour sampling, hot tap water was sampled three times. All air samples were evaluated using culture-based methods, whereas liquid samples collected using the Coriolis®μ were also analyzed by real-time PCR. Legionella presence in the air and water was then compared by sequence-based typing (SBT) methods. Air contamination was found in four HF (36.4%) by at least one of the culturable methods. The culturable investigation by Coriolis®μ did not yield Legionella in any enrolled HF. However, molecular investigation using Coriolis®μ resulted in eight HF testing positive for Legionella in the air. Comparison of Legionella air and water contamination indicated that Legionella water concentration could be predictive of its presence in the air. Furthermore, a molecular study of 12 L. pneumophila strains confirmed a match between the Legionella strains from air and water samples by SBT for three out of four HF that tested positive for Legionella by at least one of the culturable methods. Overall, our study shows that Legionella air detection cannot replace water sampling because the absence of microorganisms from the air does not necessarily represent their absence from water; nevertheless, air sampling may provide useful information for risk assessment. The liquid impingement technique appears to have the greatest capacity for collecting airborne Legionella if combined with molecular investigations. PMID:28640202

  13. APTIMA assay on SurePath liquid-based cervical samples compared to endocervical swab samples facilitated by a real time database

    Directory of Open Access Journals (Sweden)

    Khader Samer

    2010-01-01

    Full Text Available Background: Liquid-based cytology (LBC) cervical samples are increasingly being used to test for pathogens, including HPV, Chlamydia trachomatis (CT) and Neisseria gonorrhoeae (GC), using nucleic acid amplification tests. Several reports have shown the accuracy of such testing on ThinPrep (TP) LBC samples. Fewer studies have evaluated SurePath (SP) LBC samples, which utilize a different specimen preservative. This study was undertaken to assess the performance of the Aptima Combo 2 Assay (AC2) for CT and GC on SP versus endocervical swab samples in our laboratory. Materials and Methods: The live pathology database of Montefiore Medical Center was searched for patients with AC2 endocervical swab specimens and SP Paps taken the same day. SP samples from CT- and/or GC-positive endocervical swab patients and randomly selected negative patients were studied. In each case, 1.5 ml of the residual SP vial sample, which was in SP preservative and stored at room temperature, was transferred within seven days of collection to APTIMA specimen transfer tubes without any sample or patient identifiers. Blind testing with the AC2 assay was performed on the Tigris DTS System (Gen-probe, San Diego, CA). Finalized SP results were compared with the previously reported endocervical swab results for the entire group and separately for patients 25 years and younger and patients over 25 years. Results: SP specimens from 300 patients were tested. This included 181 swab CT-positive, 12 swab GC-positive, 7 CT and GC positive and 100 randomly selected swab CT and GC negative patients. Using the endocervical swab results as the patient's infection status, AC2 assay of the SP samples showed: CT sensitivity 89.3%, CT specificity 100.0%; GC sensitivity and specificity 100.0%. CT sensitivity for patients 25 years or younger was 93.1%, versus 80.7% for patients over 25 years, a statistically significant difference (P = 0.02). Conclusions: Our results show that AC2 assay of 1.5 ml SP
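
    A small sketch of how the sensitivity figures and the age-group comparison could be computed. The per-group counts below are assumptions chosen only to be consistent with the reported percentages (the record does not give them), and the paper's statistical test may differ from this normal-approximation z-test.

```python
from math import sqrt, erfc

def sensitivity(true_pos: int, false_neg: int) -> float:
    return true_pos / (true_pos + false_neg)

def two_proportion_p(x1: int, n1: int, x2: int, n2: int) -> float:
    """Two-sided z-test for a difference between two proportions (normal approximation)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return erfc(abs(p1 - p2) / se / sqrt(2))

# Assumed counts: SP-detected / swab-positive CT cases in each age group.
young_detected, young_total = 122, 131   # patients 25 years or younger (assumption)
older_detected, older_total = 46, 57     # patients over 25 years (assumption)

print("sensitivity <=25 y:", round(sensitivity(young_detected, young_total - young_detected), 3))
print("sensitivity  >25 y:", round(sensitivity(older_detected, older_total - older_detected), 3))
print("two-sided p-value :", round(two_proportion_p(young_detected, young_total,
                                                    older_detected, older_total), 3))
```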

  14. Laboratory sample turnaround times: do they cause delays in the ED?

    Science.gov (United States)

    Gill, Dipender; Galvin, Sean; Ponsford, Mark; Bruce, David; Reicher, John; Preston, Laura; Bernard, Stephani; Lafferty, Jessica; Robertson, Andrew; Rose-Morris, Anna; Stoneham, Simon; Rieu, Romelie; Pooley, Sophie; Weetch, Alison; McCann, Lloyd

    2012-02-01

    Blood tests are requested for approximately 50% of patients attending the emergency department (ED). The time taken to obtain the results is perceived as a common reason for delay. The objective of this study was therefore to investigate the turnaround time (TAT) for blood results and whether this affects patient length of stay (LOS) and to identify potential areas for improvement. A time-in-motion study was performed at the ED of the John Radcliffe Hospital (JRH), Oxford, UK. The duration of each of the stages leading up to receipt of 101 biochemistry and haematology results was recorded, along with the corresponding patient's LOS. The findings reveal that the mean time for haematology results to become available was 1 hour 6 minutes (95% CI: 29 minutes to 2 hours 13 minutes), while biochemistry samples took 1 hour 42 minutes (95% CI: 1 hour 1 minute to 4 hours 21 minutes), with some positive correlation noted with the patient LOS, but no significant variation between different days or shifts. With the fastest 10% of samples being reported within 35 minutes (haematology) and 1 hour 5 minutes (biochemistry) of request, our study showed that delays can be attributable to laboratory TAT. Given the limited ability to further improve laboratory processes, the solutions to improving TAT need to come from a collaborative and integrated approach that includes strategies before samples reach the laboratory and downstream review of results. © 2010 Blackwell Publishing Ltd.

  15. Influence of sampling depth and post-sampling analysis time on the ...

    African Journals Online (AJOL)

    Bacteriological analysis was carried out for samples taken at water depth and at 1, 6, 12 and 24 hours post-sampling. It was observed that the total and faecal coliform bacteria were significantly higher in the 3 m water depth samples than in the surface water samples (ANOVA, F = 59.41, 26.751, 9.82 (T.C); 46.41, 26.81, ...

  16. Simple DNA extraction of urine samples: Effects of storage temperature and storage time.

    Science.gov (United States)

    Ng, Huey Hian; Ang, Hwee Chen; Hoe, See Ying; Lim, Mae-Lynn; Tai, Hua Eng; Soh, Richard Choon Hock; Syn, Christopher Kiu-Choong

    2018-06-01

    Urine samples are commonly analysed in cases with suspected illicit drug consumption. In events of alleged sample mishandling, urine sample source identification may be necessary. A simple DNA extraction procedure suitable for STR typing of urine samples was established on the Promega Maxwell® 16 paramagnetic silica bead platform. A small sample volume of 1.7 mL was used. Samples were stored at room temperature, 4°C and -20°C for 100 days to investigate the influence of storage temperature and time on extracted DNA quantity and success rate of STR typing. Samples stored at room temperature exhibited a faster decline in DNA yield with time and lower typing success rates as compared to those at 4°C and -20°C. This trend can likely be attributed to DNA degradation. In conclusion, this study presents a quick and effective DNA extraction protocol from a small urine volume stored for up to 100 days at 4°C and -20°C. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. Impact of temperature and time storage on the microbial detection of oral samples by Checkerboard DNA-DNA hybridization method.

    Science.gov (United States)

    do Nascimento, Cássio; dos Santos, Janine Navarro; Pedrazzi, Vinícius; Pita, Murillo Sucena; Monesi, Nadia; Ribeiro, Ricardo Faria; de Albuquerque, Rubens Ferreira

    2014-01-01

    Molecular diagnosis methods have been largely used in epidemiological or clinical studies to detect and quantify microbial species that may colonize the oral cavity in health or disease. The preservation of genetic material from samples remains the major challenge to ensure the feasibility of these methodologies. Long-term storage may compromise the final result. The aim of this study was to evaluate the effect of storage temperature and time on the microbial detection of oral samples by the Checkerboard DNA-DNA hybridization method. Saliva and supragingival biofilm were taken from 10 healthy subjects, aliquoted (n=364) and processed according to the proposed protocols: immediate processing and processing after 2 or 4 weeks, and 6 or 12 months of storage at 4°C, -20°C and -80°C. Both total and individual microbial counts were lower for samples processed after 12 months of storage, irrespective of the temperatures tested. Samples stored up to 6 months at cold temperatures showed counts similar to those of immediately processed samples. The microbial incidence was also significantly reduced in samples stored for 12 months at all temperatures. Temperature and time of oral sample storage have a relevant impact on the detection and quantification of bacterial and fungal species by the Checkerboard DNA-DNA hybridization method. Samples should be processed immediately after collection or within 6 months if conserved at cold temperatures, to avoid false-negative results. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Effects of Long-Term Storage Time and Original Sampling Month on Biobank Plasma Protein Concentrations

    Directory of Open Access Journals (Sweden)

    Stefan Enroth

    2016-10-01

    Full Text Available The quality of clinical biobank samples is crucial to their value for life sciences research. A number of factors related to the collection and storage of samples may affect the biomolecular composition. We have studied the effect of long-term freezer storage, chronological age at sampling, and season and month of the year on the abundance levels of 108 proteins in 380 plasma samples collected from 106 Swedish women. Storage time affected 18 proteins and explained 4.8–34.9% of the observed variance. Chronological age at sample collection, after adjustment for storage time, affected 70 proteins and explained 1.1–33.5% of the variance. Seasonal variation had an effect on 15 proteins, and month (number of sun hours) affected 36 proteins and explained up to 4.5% of the variance after adjustment for storage time and age. The results show that freezer storage time and collection date (month and season) exerted effect sizes similar to that of age on the protein abundance levels. This implies that information on the sample handling history, in particular storage time, should be regarded as an equally prominent covariate as age or gender and needs to be included in epidemiological studies involving protein levels.
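
    Adjusted variance-explained figures of this kind can be obtained with nested linear models; the sketch below uses synthetic covariates and a single synthetic protein (all values and effect sizes are assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 380
storage_years = rng.uniform(0, 20, n)
age_at_sampling = rng.uniform(30, 70, n)
sun_hours = rng.uniform(50, 300, n)          # stand-in for sampling month

# One synthetic protein level driven by all three factors plus noise.
protein = (0.15 * storage_years + 0.05 * age_at_sampling
           + 0.004 * sun_hours + rng.normal(0, 1, n))

def r_squared(y, covariates):
    """R^2 of an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y))] + covariates)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

storage_only = r_squared(protein, [storage_years])
plus_age = r_squared(protein, [storage_years, age_at_sampling])
plus_month = r_squared(protein, [storage_years, age_at_sampling, sun_hours])

print(f"explained by storage time:        {100 * storage_only:.1f}%")
print(f"added by age (storage-adjusted):  {100 * (plus_age - storage_only):.1f}%")
print(f"added by month (fully adjusted):  {100 * (plus_month - plus_age):.1f}%")
```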

  19. Scheduling sampling to maximize information about time dependence in experiments with limited resources

    DEFF Research Database (Denmark)

    Græsbøll, Kaare; Christiansen, Lasse Engbo

    2013-01-01

    Looking for periodicity in sampled data requires that periods (lags) of different length are represented in the sampling plan. We here present a method to assist in planning of temporal studies with sparse resources, which optimizes the number of observed time lags for a fixed amount of samples w...
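
    The record above does not give the optimization procedure itself; purely as a hedged illustration of the objective it describes (covering as many distinct time lags as possible with a fixed sample budget), a brute-force Python sketch might look as follows. The time grid, the sample budget and the exhaustive search are assumptions for illustration, not the authors' method.

        from itertools import combinations

        def distinct_lags(times):
            # number of distinct pairwise time lags represented by a sampling plan
            return len({abs(a - b) for a, b in combinations(times, 2)})

        def best_plan(grid, n_samples):
            # exhaustive search over all n-sample plans on a small grid; keeps the plan
            # representing the most distinct lags (only feasible for small grids)
            return max(combinations(grid, n_samples), key=distinct_lags)

        plan = best_plan(range(25), 5)   # e.g. 5 samples on a 0-24 h grid
        print(plan, distinct_lags(plan))

    For realistic study sizes an exhaustive search is infeasible, which is presumably why a dedicated planning method is needed; the sketch only makes the optimization target concrete.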

  20. Evaluating sampling strategies for larval cisco (Coregonus artedi)

    Science.gov (United States)

    Myers, J.T.; Stockwell, J.D.; Yule, D.L.; Black, J.A.

    2008-01-01

    To improve our ability to assess larval cisco (Coregonus artedi) populations in Lake Superior, we conducted a study to compare several sampling strategies. First, we compared density estimates of larval cisco concurrently captured in surface waters with a 2 x 1-m paired neuston net and a 0.5-m (diameter) conical net. Density estimates obtained from the two gear types were not significantly different, suggesting that the conical net is a reasonable alternative to the more cumbersome and costly neuston net. Next, we assessed the effect of tow pattern (sinusoidal versus straight tows) to examine if propeller wash affected larval density. We found no effect of propeller wash on the catchability of larval cisco. Given the availability of global positioning systems, we recommend sampling larval cisco using straight tows to simplify protocols and facilitate straightforward measurements of volume filtered. Finally, we investigated potential trends in larval cisco density estimates by sampling four time periods during the light period of a day at individual sites. Our results indicate no significant trends in larval density estimates during the day. We conclude estimates of larval cisco density across space are not confounded by time at a daily timescale. Well-designed, cost effective surveys of larval cisco abundance will help to further our understanding of this important Great Lakes forage species.

  1. Quality evaluation of processed clay soil samples | Steiner-Asiedu ...

    African Journals Online (AJOL)

    Introduction: This study assessed the microbial quality of clay samples sold on two of the major Ghanaian markets. Methods: The study was a cross-sectional assessment of processed clay and the effects it has on the nutrition of consumers in the political capital town of Ghana. The items for the examination were ...

  2. Double spike isotope dilution GC-ICP-MS for evaluation of mercury species transformation in real fish samples using ultrasound-assisted extraction.

    Science.gov (United States)

    Esteban-Fernández, Diego; Mirat, Manuela; de la Hinojosa, M Ignacia Martín; Alonso, J Ignacio García

    2012-08-29

    Sample preparation continues to be a key factor in obtaining fast and reliable quantification of Hg species. Assisted procedures enhance the efficiency and reduce the extraction time; however, collateral species transformations have been observed. Moreover, differential interconversions have been observed even between similar matrices, which introduces an important uncertainty for real sample analysis. In an attempt to minimize Hg species transformations, we tested a soft ultrasound-assisted extraction procedure. Species quantification and transformations were evaluated using double spike isotope dilution analysis (IDA) together with gas chromatography inductively coupled plasma mass spectrometry (GC-ICP-MS) for a CRM material (Tort-2) and for shark and swordfish muscle samples. The optimum extraction solution and sonication time led to quantitative extraction and accurate determination of MeHg and IHg in a short time, although different behaviors regarding species preservation were observed depending on the sample. Negligible species transformations were observed in the analysis of the CRM, while a small but significant demethylation factor was observed in the case of the real samples. In comparison with other extraction procedures, species transformations became smaller and fewer differences between fish species were found. Similar results were obtained for fresh and lyophilized samples of both fish species, which makes it possible to analyze the fresh sample directly and save time in the sample preparation step. The high degree of species preservation and the affordability of the extraction procedure allow accurate determinations even in routine laboratories using quantification techniques that do not correct for species transformations.

  3. A sampling approach to constructing Lyapunov functions for nonlinear continuous–time systems

    NARCIS (Netherlands)

    Bobiti, R.V.; Lazar, M.

    2016-01-01

    The problem of constructing a Lyapunov function for continuous-time nonlinear dynamical systems is tackled in this paper via a sampling-based approach. The main idea of the sampling-based method is to verify a Lyapunov-type inequality for a finite number of points (known state vectors) in the
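
    The verification summarized above is only sketched here; the Python fragment below, with an invented two-dimensional system and a quadratic Lyapunov candidate, illustrates the basic per-sample check of the decrease condition. It ignores the inter-sample guarantees that the paper's sampling-based method is designed to provide, so it is a rough illustration rather than the authors' procedure.

        import numpy as np

        def f(x):
            # example nonlinear continuous-time dynamics (assumed for illustration)
            return np.array([-x[0] + x[1]**2, -x[1]])

        def decrease_on_samples(V_grad, f, samples, margin=1e-6):
            # check grad V(x) . f(x) < -margin*|x|^2 at a finite set of sampled states
            return all(V_grad(x) @ f(x) < -margin * x @ x for x in samples)

        P = np.eye(2)                          # candidate V(x) = x' P x
        V_grad = lambda x: 2 * P @ x
        rng = np.random.default_rng(0)
        samples = rng.uniform(-1, 1, size=(1000, 2))
        print(decrease_on_samples(V_grad, f, samples))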

  4. Latent class analysis of real time qPCR and bacteriological culturing for the diagnosis of Streptococcus agalactiae in cow composite milk samples

    DEFF Research Database (Denmark)

    Holmøy, Ingrid H.; Toft, Nils; Jørgensen, Hannah J.

    2018-01-01

    Streptococcus agalactiae (S. agalactiae) has re-emerged as a mastitis pathogen among Norwegian dairy cows. The Norwegian cattle health services recommend that infected herds implement measures to eradicate S. agalactiae, this includes a screening of milk samples from all lactating cows....... The performance of the qPCR-test currently in use for this purpose has not been evaluated under field conditions. The objective of this study was to estimate the sensitivity and specificity of the real-time qPCR assay in use in Norway (Mastitis 4 qPCR, DNA Diagnostics A/S, Risskov, Denmark) and compare...... it to conventional bacteriological culturing for detection of S. agalactiae in milk samples. Because none of these tests are considered a perfect reference test, the evaluation was performed using latent class models in a Bayesian analysis. Aseptically collected cow-composite milk samples from 578 cows belonging...

  5. [DOE method for evaluating environmental and waste management samples: Revision 1, Addendum 1

    Energy Technology Data Exchange (ETDEWEB)

    Goheen, S.C.

    1995-04-01

    The US Department of Energy's (DOE's) environmental and waste management (EM) sampling and analysis activities require that large numbers of samples be analyzed for materials characterization, environmental surveillance, and site-remediation programs. The present document, DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods), is a supplemental resource for analyzing many of these samples.

  6. [DOE method for evaluating environmental and waste management samples: Revision 1, Addendum 1

    International Nuclear Information System (INIS)

    Goheen, S.C.

    1995-04-01

    The US Department of Energy's (DOE's) environmental and waste management (EM) sampling and analysis activities require that large numbers of samples be analyzed for materials characterization, environmental surveillance, and site-remediation programs. The present document, DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods), is a supplemental resource for analyzing many of these samples.

  7. Crystallite size variation of TiO2 samples depending on heat treatment time

    International Nuclear Information System (INIS)

    Galante, A.G.M.; Paula, F.R. de; Montanhera, M.A.; Pereira, E.A.; Spada, E.R.

    2016-01-01

    Titanium dioxide (TiO2) is an oxide semiconductor that may be found in mixed phase or in distinct phases: brookite, anatase and rutile. In this work, the influence of residence time at a given temperature on the physical properties of TiO2 powder was studied. After the powder synthesis, the samples were divided and heat treated at 650 °C with a heating ramp of up to 3 °C/min and a residence time ranging from 0 to 20 hours, and subsequently characterized by X-ray diffraction. Analyzing the diffraction patterns obtained, it was observed that, from a 5-hour residence time onward, two distinct phases coexisted: anatase and rutile. The average crystallite size of each sample was also calculated. The results showed an increase in average crystallite size with increasing residence time of the heat treatment. (author)
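
    The record does not state how the average crystallite size was obtained from the diffraction patterns; a common choice for XRD line broadening, assumed here only for illustration, is the Scherrer equation,

        D = \frac{K\,\lambda}{\beta\,\cos\theta},

    where D is the mean crystallite size, K is a shape factor (typically about 0.9), lambda is the X-ray wavelength, beta is the peak full width at half maximum in radians, and theta is the Bragg angle.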

  8. Evaluation of dose exposure from irradiated samples at TRIGA PUSPATI reactor (RTP)

    International Nuclear Information System (INIS)

    Muhd Husamuddin Abdul Khalil; Julia Abdul Karim; Naim Syauqi Hamzah; Mohamad Hairie Rabir; Mohd Amin Sharifuldin Salleh

    2010-01-01

    An evaluation has been made of the data on irradiated samples for the types of sample requested for activation at RTP. Sample types are grouped, with the percentage of total throughput used to work out the weight percent of each respective group. The database consists of a radionuclide inventory of short, intermediate and long half-lives, and high-activity radionuclides such as Br and Au have been identified. An evaluation of gamma exposure using MicroShield has also been made to map the trend of gamma exposure at the experimental facilities and to ensure that the radiological effect on safety and health is limited per the Radiation Protection (Basic Safety Standard) Regulation 1988. This analysis provides an important parameter for improving the accuracy of the shielding design in assuring safety, reliability and economy. (author)

  9. Sampling rare fluctuations of discrete-time Markov chains

    Science.gov (United States)

    Whitelam, Stephen

    2018-03-01

    We describe a simple method that can be used to sample the rare fluctuations of discrete-time Markov chains. We focus on the case of Markov chains with well-defined steady-state measures, and derive expressions for the large-deviation rate functions (and upper bounds on such functions) for dynamical quantities extensive in the length of the Markov chain. We illustrate the method using a series of simple examples, and use it to study the fluctuations of a lattice-based model of active matter that can undergo motility-induced phase separation.
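
    The paper's reference-process sampling scheme is not reproduced here; purely as a hedged point of comparison, the standard tilted-matrix route to the large-deviation rate function for an occupation-time observable of a small, invented two-state chain can be sketched in Python. Sampled estimates of rare-fluctuation probabilities can be checked against such a calculation.

        import numpy as np

        # Two-state Markov chain; observable a = fraction of time spent in state 1.
        P = np.array([[0.9, 0.1],
                      [0.2, 0.8]])

        def scgf(s, P):
            # scaled cumulant generating function: log of the largest eigenvalue of the
            # tilted matrix P[x, y] * exp(s * g(y)), with g(y) = 1 if y == 1 else 0
            g = np.array([0.0, 1.0])
            tilted = P * np.exp(s * g)[None, :]
            return np.log(np.max(np.abs(np.linalg.eigvals(tilted))))

        svals = np.linspace(-5, 5, 2001)
        def rate(a):
            # Legendre transform: I(a) = max_s (s*a - theta(s))
            return np.max(svals * a - np.array([scgf(s, P) for s in svals]))

        print(rate(0.5))   # cost of an atypical 50/50 occupation of the two states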

  10. Utility of gram staining for evaluation of the quality of cystic fibrosis sputum samples.

    Science.gov (United States)

    Nair, Bindu; Stapp, Jenny; Stapp, Lynn; Bugni, Linda; Van Dalfsen, Jill; Burns, Jane L

    2002-08-01

    The microscopic examination of Gram-stained sputum specimens is very helpful in the evaluation of patients with community-acquired pneumonia and has also been recommended for use in cystic fibrosis (CF) patients. This study was undertaken to evaluate that recommendation. One hundred one sputum samples from CF patients were cultured for gram-negative bacilli and examined by Gram staining for both sputum adequacy (using the quality [Q] score) and bacterial morphology. Subjective evaluation of adequacy was also performed and categorized. Based on Q score evaluation, 41% of the samples would have been rejected despite a subjective appearance of purulence. Only three of these rejected samples were culture negative for gram-negative CF pathogens. Correlation between culture results and quantitative Gram stain examination was also poor. These data suggest that subjective evaluation combined with comprehensive bacteriology is superior to Gram staining in identifying pathogens in CF sputum.

  11. Quantitative Evaluation of Fire and EMS Mobilization Times

    CERN Document Server

    Upson, Robert

    2010-01-01

    Quantitative Evaluation of Fire and EMS Mobilization Times presents comprehensive empirical data on fire emergency and EMS call processing and turnout times, and aims to improve the operational benchmarks of NFPA peer consensus standards through a close examination of real-world data. The book also identifies and analyzes the elements that can influence EMS mobilization response times. Quantitative Evaluation of Fire and EMS Mobilization Times is intended for practitioners as a tool for analyzing fire emergency response times and developing methods for improving them. Researchers working in a

  12. Time optimization of 90Sr measurements: Sequential measurement of multiple samples during ingrowth of 90Y

    International Nuclear Information System (INIS)

    Holmgren, Stina; Tovedal, Annika; Björnham, Oscar; Ramebäck, Henrik

    2016-01-01

    The aim of this paper is to contribute to a more rapid determination of a series of samples containing 90Sr by making the Cherenkov measurement of the daughter nuclide 90Y more time efficient. There are many instances when an optimization of the measurement method might be favorable, such as situations requiring rapid results in order to make urgent decisions or, on the other hand, maximizing the throughput of samples in a limited available time span. In order to minimize the total analysis time, a mathematical model was developed which calculates the time of ingrowth as well as individual measurement times for n samples in a series. This work is focused on the measurement of 90Y during ingrowth, after an initial chemical separation of strontium, in which it is assumed that no other radioactive strontium isotopes are present. By using a fixed minimum detectable activity (MDA) and iterating the measurement time for each consecutive sample, the total analysis time will be less compared to using the same measurement time for all samples. It was found that by optimization, the total analysis time for 10 samples can be decreased greatly, from 21 h to 6.5 h, when assuming an MDA of 1 Bq/L and a background count rate of approximately 0.8 cpm. - Highlights: • An approach roughly a factor of three more efficient than an un-optimized method. • The optimization gives a more efficient use of instrument time. • The efficiency increase ranges from a factor of three to 10, for 10 to 40 samples.
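
    As a rough sketch of the kind of model described above (not the authors' implementation), the Python fragment below grows in 90Y, applies a Currie-style MDA expression and scans for the shortest counting time per consecutive sample. The half-life, counting efficiency, initial ingrowth period and the fixed MDA per sample are assumptions for illustration; the background rate loosely follows the 0.8 cpm quoted in the record.

        import numpy as np

        LAMBDA_Y = np.log(2) / (64.05 * 3600)   # 90Y decay constant, s^-1 (64.05 h half-life)

        def required_count_time(mda_bq, eff, bkg_cps, ingrowth_fraction):
            # shortest counting time for which the Currie MDA, referred back to 90Sr via
            # the 90Y ingrowth fraction at the start of the measurement, drops below mda_bq
            for t in np.arange(60, 48 * 3600, 60):          # scan 1 min ... 48 h
                mda = (2.71 + 4.65 * np.sqrt(bkg_cps * t)) / (eff * ingrowth_fraction * t)
                if mda <= mda_bq:
                    return t
            return None

        # Sequential scheme: while sample i is counted, 90Y keeps growing in for the
        # remaining samples, so later samples need shorter counting times.
        t_elapsed, times = 6 * 3600, []                     # assumed 6 h initial ingrowth
        for i in range(10):
            frac = 1 - np.exp(-LAMBDA_Y * t_elapsed)
            t_count = required_count_time(mda_bq=1.0, eff=0.4,
                                          bkg_cps=0.8 / 60, ingrowth_fraction=frac)
            times.append(t_count)
            t_elapsed += t_count
        print([round(t / 3600, 2) for t in times])          # counting time per sample, hours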

  13. Analysis format and evaluation methods for effluent particle sampling systems in nuclear facilities

    International Nuclear Information System (INIS)

    Schwendiman, L.C.; Glissmeyer, J.A.

    1976-06-01

    Airborne effluent sampling systems for nuclear facilities are frequently designed, installed, and operated without a systematic approach which discloses and takes into account all the circumstances and conditions which would affect the validity and adequacy of the sample. Without a comprehensive check list or something similar, the designer of the system may not be given the important information needed to provide a good design. In like manner, an already operating system may be better appraised. Furthermore, the discipline of a more formal approach may compel the one who will use the system to make sure he knows what he wants and can thus give the designer the needed information. An important consideration is the criteria to be applied to the samples to be taken. This analysis format consists of a listing of questions and statements calling forth the necessary information required to analyze a sampling system. With this information developed, one can proceed with an evaluation, the methodology of which is also discussed in the paper. Errors in probe placement, failure to sample at the proper rate, delivery line losses, and others are evaluated using mathematical models and empirically derived relationships. Experimental methods are also described for demonstrating that quality sampling will be achieved. The experiments include using a temporary, simple, but optimal sample collection system to evaluate the more complex systems. The use of tracer particles injected in the stream is also discussed. The samples obtained with the existing system are compared with those obtained by the temporary, optimal system

  14. Assessment of soil sample quality used for density evaluations through computed tomography

    International Nuclear Information System (INIS)

    Pires, Luiz F.; Arthur, Robson C.J.; Bacchi, Osny O.S.

    2005-01-01

    There are several methods to measure soil bulk density (ρs), such as the paraffin-sealed clod (PS), the volumetric ring (VR), computed tomography (CT), and the neutron-gamma surface gauge (SG). In order to evaluate, in a non-destructive way, the possible modifications in soil structure caused by sampling for the PS and VR methods of ρs evaluation, we proposed to use the gamma-ray CT method. A first-generation tomograph was used, having a 241Am source and a 3 in x 3 in NaI(Tl) scintillation crystal detector coupled to a photomultiplier tube. The results confirm the effect of soil sampler devices on the structure of soil samples, and that the compaction caused during sampling produces significant alterations of soil bulk density. Through the use of CT it was possible to determine the level of compaction and to make a detailed analysis of the soil bulk density distribution within the soil sample. (author)

  15. A Process For Performance Evaluation Of Real-Time Systems

    Directory of Open Access Journals (Sweden)

    Andrew J. Kornecki

    2003-12-01

    Real-time developers and engineers must not only meet the system functional requirements, but also the stringent timing requirements. One of the critical decisions leading to meeting these timing requirements is the selection of an operating system under which the software will be developed and run. Although there is ample documentation on real-time systems performance and evaluation, little can be found that combines such information into an efficient process for use by developers. As the software industry moves towards clearly defined processes, the creation of appropriate guidelines describing a process for performance evaluation of real-time systems would greatly benefit real-time developers. This technology transition research focuses on developing such a process. PROPERT (PROcess for Performance Evaluation of Real Time systems), the process described in this paper, is based upon established techniques for evaluating real-time systems. It organizes already existing real-time performance criteria and assessment techniques in a manner consistent with a well-formed process, based on the Personal Software Process concepts.

  16. Roadway sampling evaluation.

    Science.gov (United States)

    2014-09-01

    The Florida Department of Transportation (FDOT) has traditionally required that all sampling and testing of asphalt mixtures be performed at the Contractor's production facility. With recent staffing cuts, as well as budget reductions, FDOT has been cons...

  17. Evaluation of analytical results on DOE Quality Assessment Program Samples

    International Nuclear Information System (INIS)

    Jaquish, R.E.; Kinnison, R.R.; Mathur, S.P.; Sastry, R.

    1985-01-01

    Criteria were developed for evaluating the participants' analytical results in the DOE Quality Assessment Program (QAP). Historical data from previous QAP studies were analyzed using descriptive statistical methods to determine the interlaboratory precision that had been attained. Performance criteria used in other similar programs were also reviewed. Using these data, precision values and control limits were recommended for each type of analysis performed in the QA program. Results of the analyses performed by the QAP participants on the November 1983 samples were statistically analyzed and evaluated. The Environmental Measurements Laboratory (EML) values were used as the known values, and 3-sigma precision values were used as control limits. Results were submitted by 26 participating laboratories for 49 different radionuclide-media combinations. The participants reported 419 results and, of these, 350 or 84% were within control limits. Special attention was given to the data from gamma spectral analysis of air filters and water samples. Both normal probability and box plots were prepared for each nuclide to help evaluate the distribution of the data. Results that were outside the expected range were identified, and suggestions were made that laboratories check calculations and procedures for these results.

  18. Evaluation of limited sampling models for prediction of oral midazolam AUC for CYP3A phenotyping and drug interaction studies.

    Science.gov (United States)

    Mueller, Silke C; Drewelow, Bernd

    2013-05-01

    The area under the concentration-time curve (AUC) after oral midazolam administration is commonly used for cytochrome P450 (CYP) 3A phenotyping studies. The aim of this investigation was to evaluate a limited sampling strategy for the prediction of AUC with oral midazolam. A total of 288 concentration-time profiles from 123 healthy volunteers, who participated in four previously performed drug interaction studies with intense sampling after a single oral dose of 7.5 mg midazolam, were available for evaluation. Of these, 45 profiles served for model building, which was performed by stepwise multiple linear regression, and the remaining 243 datasets served for validation. Mean prediction error (MPE), mean absolute error (MAE) and root mean squared error (RMSE) were calculated to determine bias and precision. The one- to four-sampling point models with the best coefficient of correlation were the one-sampling point model (8 h; r^2 = 0.84), the two-sampling point model (0.5 and 8 h; r^2 = 0.93), the three-sampling point model (0.5, 2, and 8 h; r^2 = 0.96), and the four-sampling point model (0.5, 1, 2, and 8 h; r^2 = 0.97). However, the one- and two-sampling point models were unable to predict the midazolam AUC due to unacceptable bias and precision. Only the four-sampling point model predicted the very low and very high midazolam AUCs of the validation dataset with acceptable precision and bias. The four-sampling point model was also able to predict the geometric mean ratio of the treatment phase over the baseline (with 90% confidence interval) results of three drug interaction studies in the categories of strong, moderate, and mild induction, as well as no interaction. A four-sampling point limited sampling strategy to predict the oral midazolam AUC for CYP3A phenotyping is proposed. The one-, two- and three-sampling point models were not able to predict the midazolam AUC accurately.
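
    The regression coefficients of the limited sampling models are not given in the abstract; the sketch below, in Python with synthetic stand-in data, only illustrates how a four-sampling-point model and its bias/precision metrics (MPE, MAE, RMSE, here in absolute units) would be computed.

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic stand-in data: columns are concentrations at 0.5, 1, 2 and 8 h
        # post-dose; the AUC weights below are invented, not the study's coefficients.
        conc = rng.lognormal(mean=2.0, sigma=0.4, size=(45, 4))
        auc = conc @ np.array([0.8, 1.2, 2.5, 6.0]) + rng.normal(0, 2, size=45)

        X = np.column_stack([np.ones(len(conc)), conc])     # intercept + 4 time points
        coef, *_ = np.linalg.lstsq(X, auc, rcond=None)       # four-sampling-point model
        pred = X @ coef

        mpe = np.mean(pred - auc)                  # bias
        mae = np.mean(np.abs(pred - auc))          # precision (absolute error)
        rmse = np.sqrt(np.mean((pred - auc) ** 2)) # precision (squared error)
        print(np.round([mpe, mae, rmse], 2))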

  19. Direct Quantification of Campylobacter jejuni in Chicken Fecal Samples Using Real-Time PCR: Evaluation of Six Rapid DNA Extraction Methods

    DEFF Research Database (Denmark)

    Garcia Clavero, Ana Belén; Kamara, Judy N.; Vigre, Håkan

    2013-01-01

    of this study, the Easy-DNA (Invitrogen) method generated lower Ct values, the best amplification efficiency (AE = 93.2 %) and good precision (R squared = 0.996). The method NucleoSpin® Tissue was able to detect samples spiked with the lowest Campylobacter concentration level (10 CFU/ml) but the amplification...... efficiency was not optimal (AE = 139.5 %). DNA extraction methods Easy-DNA Invitrogen, MiniMAG® and NucleoSpin® Tissue produced good real-time PCR reproducibility generating standard deviations from 0.3 to 0.8 between replicates....
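
    The amplification efficiency and R-squared values quoted above are conventionally derived from the slope of a qPCR standard curve; the short Python sketch below shows that calculation with an invented dilution series (the derivation is an assumption about standard practice, not taken from the record).

        import numpy as np

        def qpcr_standard_curve(log10_copies, ct_values):
            # fit Ct = slope*log10(copies) + intercept and derive the usual qPCR metrics:
            # amplification efficiency AE = (10**(-1/slope) - 1) * 100 and R^2
            slope, intercept = np.polyfit(log10_copies, ct_values, 1)
            residuals = ct_values - (slope * log10_copies + intercept)
            r2 = 1 - residuals.var() / ct_values.var()
            efficiency = (10 ** (-1 / slope) - 1) * 100
            return slope, efficiency, r2

        # hypothetical 10-fold dilution series (values invented for illustration)
        logs = np.array([1, 2, 3, 4, 5, 6], dtype=float)
        cts = np.array([34.1, 30.7, 27.2, 23.8, 20.4, 17.0])
        print(qpcr_standard_curve(logs, cts))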

  20. A Real-Time PCR Detection of Genus Salmonella in Meat and Milk Samples

    Directory of Open Access Journals (Sweden)

    Jaroslav Pochop

    2013-05-01

    The aim of this study was to follow the contamination of ready-to-eat milk and meat products with Salmonella spp. by using the Step One real-time PCR. Classical microbiological methods for detection of food-borne bacteria involve the use of pre-enrichment and/or specific enrichment, followed by the isolation of the bacteria on solid media and a final confirmation by biochemical and/or serological tests. We used the PrepSEQ Rapid Spin Sample Preparation Kit for isolation of DNA and the SensiFAST SYBR Hi-ROX Kit for the real-time PCR performance. In the investigated samples without incubation, we could detect a strain of Salmonella sp. in five out of twenty-three samples (swabs). This Step One real-time PCR assay is extremely useful for any laboratory in possession of a real-time PCR instrument. It is a fast, reproducible, simple, specific and sensitive way to detect nucleic acids, which could be used in clinical diagnostic tests in the future. Our results indicated that the Step One real-time PCR assay developed in this study could sensitively detect Salmonella spp. in ready-to-eat food.

  1. Evaluation of biological samples for specimen banking and biomonitoring by nuclear methods

    International Nuclear Information System (INIS)

    Stone, S.F.; Zeisler, R.

    1984-01-01

    In a pilot program for environmental specimen banking, human livers and marine mussels (Mytilus edulis) were sampled, analyzed and banked. Nuclear methods played a major role in the evaluation of the samples by providing concentration data for up to 37 major, mineral, and trace elements. Instrumental neutron activation analysis was complemented by neutron-capture prompt gamma activation analysis, radiochemical separations and, for the mussels, by instrumental X-ray fluorescence analysis. A cryogenic homogenization procedure was applied for sample preparation and evaluated. Assessment of accuracy was made by analyzing Standard Reference Materials and by intercomparing the techniques. Results are reported for 66 individual human liver specimens, collected at three locations in the United States, and for batches of 65 mussels from a collection made at Narragansett Bay, RI. 19 references, 23 figures, 4 tables

  2. Sampling Methodologies for Epidemiologic Surveillance of Men Who Have Sex with Men and Transgender Women in Latin America: An Empiric Comparison of Convenience Sampling, Time Space Sampling, and Respondent Driven Sampling

    OpenAIRE

    Clark, J. L.; Konda, K. A.; Silva-Santisteban, A.; Peinado, J.; Lama, J. R.; Kusunoki, L.; Perez-Brumer, A.; Pun, M.; Cabello, R.; Sebastian, J. L.; Suarez-Ognio, L.; Sanchez, J.

    2014-01-01

    Alternatives to convenience sampling (CS) are needed for HIV/STI surveillance of most-at-risk populations in Latin America. We compared CS, time space sampling (TSS), and respondent driven sampling (RDS) for recruitment of men who have sex with men (MSM) and transgender women (TW) in Lima, Peru. During concurrent 60-day periods from June-August, 2011, we recruited MSM/TW for epidemiologic surveillance using CS, TSS, and RDS. A total of 748 participants were recruited through CS, 233 through T...

  3. Timing of Emergency Medicine Student Evaluation Does Not Affect Scoring.

    Science.gov (United States)

    Hiller, Katherine M; Waterbrook, Anna; Waters, Kristina

    2016-02-01

    Evaluation of medical students rotating through the emergency department (ED) is an important formative and summative assessment method. Intuitively, delaying evaluation should affect the reliability of this assessment method; however, the effect of evaluation timing on scoring is unknown. A quality-improvement project evaluating the timing of end-of-shift ED evaluations at the University of Arizona was performed to determine whether a delay in evaluation affected the score. End-of-shift ED evaluations completed on behalf of fourth-year medical students from July 2012 to March 2013 were reviewed. Forty-seven students were evaluated 547 times by 46 residents and attendings. Evaluation scores were means of anchored Likert scales (1-5) for the domains of energy/interest, fund of knowledge, judgment/problem-solving ability, clinical skills, personal effectiveness, and systems-based practice. Date of shift, date of evaluation, and score were collected. Linear regression was performed to determine whether the timing of the evaluation had an effect on the score. Data were complete for 477 of 547 evaluations (87.2%). The mean evaluation score was 4.1 (range 2.3-5, standard deviation 0.62). Evaluations took a mean of 8.5 days (median 4 days, range 0-59 days, standard deviation 9.77 days) to complete. Delay in evaluation had no significant effect on score (p = 0.983). The evaluation score was not affected by the timing of the evaluation. Variance in scores was similar for both immediate and delayed evaluations. Considerable amounts of time and energy are expended tracking down delayed evaluations. This activity does not impact a student's final grade. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. Evaluation of Sample Handling Effects on Serum Vitamin E and Cholesterol Concentrations in Alpacas

    Directory of Open Access Journals (Sweden)

    Andrea S. Lear

    2014-01-01

    Clinical cases of vitamin E deficiency have been diagnosed in camelids and may indicate that these species are more sensitive to inadequate vitamin E in hay-based diets than other ruminant and equine species. In cattle, cholesterol has been reported to affect vitamin E concentrations. In order to evaluate vitamin E deficiencies in camelids, the effects of collection and storage of blood samples prior to processing needed to be characterized. Reports vary as to the factors affecting vitamin E and cholesterol in blood samples, and diagnostic laboratories vary in their instructions regarding sample handling. Blood was collected from healthy alpacas and processed under conditions including exposure to fluorescent light, serum and red blood cell contact, tube stopper contact, temperature, and hemolysis. Serum vitamin E and cholesterol concentrations were then measured. Statistical analyses found that vitamin E concentrations decreased with prolonged contact with the tube stopper and with increasing hemolysis. Variations in vitamin E concentration were seen with other factors but were not significant. Time prior to serum separation and individual animal variation were found to alter cholesterol concentrations within the sample, yet this finding was clinically unremarkable. No correlation was seen between vitamin E and cholesterol concentrations, possibly due to the lack of variation in cholesterol.

  5. Real-time ultrasonic weld evaluation system

    Science.gov (United States)

    Katragadda, Gopichand; Nair, Satish; Liu, Harry; Brown, Lawrence M.

    1996-11-01

    Ultrasonic testing techniques are currently used as an alternative to radiography for detecting, classifying, and sizing weld defects, and for evaluating weld quality. Typically, ultrasonic weld inspections are performed manually, which requires significant operator expertise and time. Thus, in recent years, the emphasis has been on developing automated methods to aid or replace operators in critical weld inspections where inspection time, reliability, and operator safety are major issues. During this period, significant advances were made in the areas of weld defect classification and sizing. Very few of these methods, however, have found their way into the market, largely due to the lack of an integrated approach enabling real-time implementation. Also, not much research effort was directed at improving weld acceptance criteria. This paper presents an integrated system utilizing state-of-the-art techniques for a complete automation of the weld inspection procedure. The modules discussed include transducer tracking, classification, sizing, and weld acceptance criteria. Transducer tracking was studied by experimentally evaluating sonic and optical position tracking techniques. Details of this evaluation are presented. Classification is obtained using a multi-layer perceptron. Results from different feature extraction schemes, including a new method based on a combination of time- and frequency-domain signal representations, are given. Algorithms developed to automate defect registration and sizing are discussed. A fuzzy-logic criterion for weld acceptance is presented, describing how this scheme provides improved robustness compared to the traditional flow-diagram standards.

  6. Optoelectronic time-domain characterization of a 100 GHz sampling oscilloscope

    International Nuclear Information System (INIS)

    Füser, H; Baaske, K; Kuhlmann, K; Judaschke, R; Pierz, K; Bieler, M; Eichstädt, S; Elster, C

    2012-01-01

    We have carried out an optoelectronic measurement of the impulse response of an ultrafast sampling oscilloscope with a nominal bandwidth of 100 GHz within a time window of approximately 100 ps. Our experimental technique also considers frequency components above the cut-off frequency of higher order modes of the 1.0 mm coaxial line, which is shown to be important for the specification of the impulse response of ultrafast sampling oscilloscopes. Additionally, we have measured the reflection coefficient of the sampling head induced by the mismatch of the sampling circuit and the coaxial connector which is larger than 0.5 for certain frequencies. The uncertainty analysis has been performed using the Monte Carlo method of Supplement 1 to the 'Guide to the Expression of Uncertainty in Measurement' and correlations in the estimated impulse response have been determined. Our measurements extend previous work which deals with the characterization of 70 GHz oscilloscopes and the measurement of 100 GHz oscilloscopes up to the cut-off frequency of higher order modes

  7. Cross-sample entropy of foreign exchange time series

    Science.gov (United States)

    Liu, Li-Zhi; Qian, Xi-Yuan; Lu, Heng-Yao

    2010-11-01

    The correlation of foreign exchange rates in currency markets is investigated based on the empirical data of DKK/USD, NOK/USD, CAD/USD, JPY/USD, KRW/USD, SGD/USD, THB/USD and TWD/USD for the period from 1995 to 2002. The cross-SampEn (cross-sample entropy) method is used to compare the returns of each pair of exchange rate time series to assess their degree of asynchrony. The calculation method for the confidence interval of SampEn is extended and applied to cross-SampEn. The cross-SampEn and its confidence interval for each pair of the exchange rate time series in the periods 1995-1998 (before the Asian currency crisis) and 1999-2002 (after the Asian currency crisis) are calculated. The results show that the cross-SampEn of each pair of these exchange rates becomes higher after the Asian currency crisis, indicating a higher asynchrony between the exchange rates. Especially for Singapore, Thailand and Taiwan, the cross-SampEn values after the Asian currency crisis are significantly higher than those before the Asian currency crisis. Comparison with the correlation coefficient shows that cross-SampEn is superior in describing the correlation between time series.
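
    For orientation, a minimal Python sketch of cross-sample entropy in the Richman-Moorman style is given below; the tolerance r, the template length m and the standardization step are conventional choices and may differ from the paper's settings.

        import numpy as np

        def cross_sampen(u, v, m=2, r=0.2):
            # cross-sample entropy -ln(A/B): B and A count template matches of length
            # m and m+1 between the two standardized series within tolerance r
            u = (u - u.mean()) / u.std()
            v = (v - v.mean()) / v.std()
            n = min(len(u), len(v))

            def match_count(mm):
                count = 0
                templates_u = np.array([u[i:i + mm] for i in range(n - m)])
                templates_v = np.array([v[j:j + mm] for j in range(n - m)])
                for tu in templates_u:
                    d = np.max(np.abs(templates_v - tu), axis=1)   # Chebyshev distance
                    count += np.sum(d <= r)
                return count

            B, A = match_count(m), match_count(m + 1)
            return -np.log(A / B) if A > 0 and B > 0 else np.inf

        rng = np.random.default_rng(0)
        x, y = rng.normal(size=500), rng.normal(size=500)
        print(cross_sampen(x, y))        # higher value = greater asynchrony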

  8. Wuchereria bancrofti in Tanzania: microfilarial periodicity and effect of blood sampling time on microfilarial intensities

    DEFF Research Database (Denmark)

    Simonsen, Poul Erik; Niemann, L.; Meyrowitsch, Dan Wolf

    1997-01-01

    The circadian periodicity of Wuchereria bancrofti microfilarial (mf) intensities in peripheral blood was analysed in a group of infected individuals from an endemic community in north-eastern Tanzania. The mf density was quantified at two-hourly intervals for 24 hours. A clear nocturnal periodic...... of blood sampling before peak time is discussed, and the importance of taking sampling time into consideration when analysing data from epidemiological studies is emphasized. A simple method is devised which can be used to adjust for the influence of time on mf intensities, in studies where accurate...... information on mf intensities is necessary, and where it is impossible to obtain all samples at peak time....

  9. A novel heterogeneous training sample selection method on space-time adaptive processing

    Science.gov (United States)

    Wang, Qiang; Zhang, Yongshun; Guo, Yiduo

    2018-04-01

    The ground-target detection performance of space-time adaptive processing (STAP) decreases when the clutter power becomes non-homogeneous because training samples are contaminated by target-like signals. In order to solve this problem, a novel non-homogeneous training sample selection method based on sample similarity is proposed, which converts the training sample selection into a convex optimization problem. Firstly, the existing deficiencies of sample selection using the generalized inner product (GIP) are analyzed. Secondly, the similarities of different training samples are obtained by calculating the mean Hausdorff distance so as to reject the contaminated training samples. Thirdly, the cell under test (CUT) and the remaining training samples are projected into the orthogonal subspace of the target in the CUT, and the mean Hausdorff distances between the projected CUT and the training samples are calculated. Fourthly, the distances are sorted in order of value, and the training samples with the larger values are preferentially selected to realize the dimension reduction. Finally, simulation results with Mountain-Top data verify the effectiveness of the proposed method.
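
    The abstract does not define the mean Hausdorff distance or the selection rule in detail; the Python sketch below is one plausible reading, treating each complex space-time snapshot as a point set in the complex plane and ranking candidates by their distance to the cell under test. The target-subspace projection step described above is omitted, and the selection direction simply follows the abstract's wording.

        import numpy as np

        def mean_hausdorff(x, y):
            # average of each entry's distance to its nearest neighbour in the other
            # snapshot, symmetrized (one plausible reading; the paper may differ)
            d = np.abs(x[:, None] - y[None, :])
            return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())

        def select_training(cut, candidates, n_keep):
            # rank candidate snapshots by mean Hausdorff distance to the CUT and keep
            # the n_keep with the larger values, per the selection rule quoted above
            scores = np.array([mean_hausdorff(cut, c) for c in candidates])
            order = np.argsort(scores)[::-1]
            return [candidates[i] for i in order[:n_keep]]

        rng = np.random.default_rng(0)
        cut = rng.normal(size=64) + 1j * rng.normal(size=64)
        cands = [rng.normal(size=64) + 1j * rng.normal(size=64) for _ in range(20)]
        print(len(select_training(cut, cands, 8)))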

  10. Evaluation of Sample Stability and Automated DNA Extraction for Fetal Sex Determination Using Cell-Free Fetal DNA in Maternal Plasma

    Directory of Open Access Journals (Sweden)

    Elena Ordoñez

    2013-01-01

    Objective. The detection of paternally inherited sequences in maternal plasma, such as the SRY gene for fetal sexing or RHD for fetal blood group genotyping, is becoming part of the daily routine in diagnostic laboratories. Due to the low percentage of fetal DNA, it is crucial to ensure sample stability and the efficiency of DNA extraction. We evaluated blood stability at 4°C for at least 24 hours and automated DNA extraction, for fetal sex determination in maternal plasma. Methods. A total of 158 blood samples were collected, using EDTA-K tubes, from women in their 1st trimester of pregnancy. Samples were kept at 4°C for at least 24 hours before processing. An automated DNA extraction was evaluated, and its efficiency was compared with a standard manual procedure. The SRY marker was used to quantify cfDNA by real-time PCR. Results. Although lower cfDNA amounts were obtained by automated DNA extraction (mean 107.35 GE/mL versus 259.43 GE/mL), the SRY sequence was successfully detected in all 108 samples from pregnancies with male fetuses. Conclusion. We successfully evaluated the suitability of standard blood tubes for the collection of maternal blood and assessed samples to be suitable for analysis at least 24 hours later. This would allow shipping to a central reference laboratory from almost anywhere in Europe.

  11. Comparisons of sampling procedures and time of sampling for the detection of Salmonella in Danish infected chicken flocks raised in floor systems

    DEFF Research Database (Denmark)

    Gradel, K.O.; Andersen, J.; Madsen, M.

    2002-01-01

    other within each flock: 1) 5 pairs of socks, analysed as 5 samples, 2) 2 pairs of socks, analysed as one sample, and 3) 60 faecal samples, analysed as one pooled sample. Agreement between sampling methods was evaluated by the following statistical tests: 'Kappa', 'The adjusted rand', McNemar"s test...... in detecting S. enterica as the 60 faecal samples. In broiler flocks, 5 pairs of socks were used both in the routine samples taken at about 3 weeks of age for the establishment of infection of the flock, and as one of the follow-up samples taken shortly before slaughter age, which means that the only notable...... for marginal symmetry, Proportion of agreement P-0, P-, P-, and Odds Ratio. The highest agreement was found between the 2 types of sock sampling, while the lowest agreement was found by comparing 60 faecal samples with 5 pairs of socks. Two pairs of socks analysed as one pool appeared to be just as effective...
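
    For orientation, the agreement statistics named above can be computed from a 2x2 cross-classification of flocks by two sampling methods; the Python sketch below shows Cohen's kappa and an exact McNemar test on invented counts (the study's actual tables are not reproduced here).

        import numpy as np
        from scipy.stats import binomtest

        def cohens_kappa(table):
            # Cohen's kappa for a 2x2 agreement table between two sampling methods
            # (rows = method A positive/negative, columns = method B positive/negative)
            table = np.asarray(table, dtype=float)
            n = table.sum()
            po = np.trace(table) / n                         # observed agreement
            pe = (table.sum(1) @ table.sum(0)) / n**2        # chance agreement
            return (po - pe) / (1 - pe)

        def mcnemar_exact(table):
            # exact McNemar test on the discordant cells (binomial test, p = 0.5)
            b, c = table[0][1], table[1][0]
            return binomtest(b, b + c, 0.5).pvalue

        flocks = [[18, 2],      # hypothetical counts: both positive, A+/B-
                  [1, 39]]      # A-/B+, both negative
        print(cohens_kappa(flocks), mcnemar_exact(flocks))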

  12. Appropriate xenon-inhalation time in xenon-enhanced CT using the end-tidal gas-sampling method

    Energy Technology Data Exchange (ETDEWEB)

    Asada, Hideo; Furuhata, Shigeru; Onozuka, Satoshi; Uchida, Koichi; Fujii, Koji; Suga, Sadao; Kawase, Takeshi; Toya, Shigeo; Shiga, Hayao

    1988-12-01

    For the end-tidal gas-sampling method of xenon-enhanced CT (Xe-CT), the respective functional images of K, lambda, and the regional cerebral blood flow (rCBF) were studied and compared using the data at 7-, 10-, 15- and 25-minute inhalations. The most appropriate inhalation time of xenon gas was evaluated in 14 clinical cases. An end-tidal xenon curve which represents the arterial xenon concentration was monitored with a xenon analyzer; the xenon concentration was gradually increased to a level of 50% by using a xenon inhalator with a closed circuit to prevent the overestimation of the xenon concentration sampled from the mask. Serial CT scans were taken over a period of 25 minutes of inhalation. The functional images of K, lambda, and rCBF were calculated for serial CT scans for 7, 10, 15 and 25 minutes using Fick's equation. Those various images and absolute values were then compared. The rCBF value of a 15-minute inhalation was approximately 15% greater than that of 25 minutes, while the values of K, lambda, rCBF from a 15-minute inhalation were significantly correlated to those from 25 minutes. The regression line made it possible to estimate 25-minute inhalation values from those of 15 minutes. In imaging, the rCBF mapping of the 15-minute inhalation was found to be more reliable than that of 25 minutes. This study suggests that the minimal time of xenon inhalation is 15 minutes for the end-tidal gas-sampling method. A longer inhalation may be necessary for the estimation of rCBF in the low-flow area, such as the white matter or the pathological region.
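
    The abstract refers to Fick's equation without spelling it out; one common Kety-Schmidt parameterization used in Xe-CT, assumed here for illustration, is

        \frac{dC_t(t)}{dt} = K\left[\lambda\, C_a(t) - C_t(t)\right]
        \quad\Rightarrow\quad
        C_t(t) = \lambda K \int_0^{t} C_a(\tau)\, e^{-K(t-\tau)}\, d\tau ,
        \qquad \mathrm{rCBF} = \lambda K ,

    where C_t is the tissue (CT-derived) xenon concentration, C_a the end-tidal (arterial) concentration, K the uptake rate constant and lambda the brain-blood partition coefficient. Fitting K and lambda to the serial scans would then yield the three functional images mentioned above; the exact form and fitting procedure used in the study may differ.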

  13. Defining an optimum pumping-time requirement for sampling ground-water wells on the Hanford site

    International Nuclear Information System (INIS)

    Scharnhorst, N.L.

    1982-04-01

    The objective was to determine the optimum time period necessary to pump water from a well before a representative sample of the ground water can be obtained. It was assumed that a representative sample has been collected if the concentration of chemical parameters is the same in a number of samples taken consecutively, so that the concentration of parameters does not vary with time of collection. Ground-water samples used in this project were obtained by pumping selected wells on the Hanford Site. At each well, samples were taken at two-minute intervals, and on each sample various chemical analyses were performed. Samples were checked for pH, sulfate, iron, specific conductivity, chloride, nitrate and alkalinity. The data showed that pH, alkalinity, sulfate and specific conductivity levels stabilized almost immediately after pumping of the well began. In many wells, the chloride and nitrate levels were unstable throughout the 38-minute sampling period. Iron levels, however, did not behave in either fashion. The concentration of iron in the samples was high when pumping began but dropped rapidly as pumping continued. The best explanation for this is that iron is flushed from the sides of the casing into the well when pumping begins. After several minutes of pumping, most of the dissolved iron is washed from the well casing and the iron concentration reaches a stable plateau representative of the iron concentration in the ground water. Since iron concentration takes longest to stabilize, the optimum pumping time for a well is based on the iron stabilization time for that well.

  14. Asymptotic theory for the sample covariance matrix of a heavy-tailed multivariate time series

    DEFF Research Database (Denmark)

    Davis, Richard A.; Mikosch, Thomas Valentin; Pfaffel, Olivier

    2016-01-01

    In this paper we give an asymptotic theory for the eigenvalues of the sample covariance matrix of a multivariate time series. The time series constitutes a linear process across time and between components. The input noise of the linear process has regularly varying tails with index α∈(0,4) in...... particular, the time series has infinite fourth moment. We derive the limiting behavior for the largest eigenvalues of the sample covariance matrix and show point process convergence of the normalized eigenvalues. The limiting process has an explicit form involving points of a Poisson process and eigenvalues...... of a non-negative definite matrix. Based on this convergence we derive limit theory for a host of other continuous functionals of the eigenvalues, including the joint convergence of the largest eigenvalues, the joint convergence of the largest eigenvalue and the trace of the sample covariance matrix...

  15. Evaluation Of ARG-1 Samples Prepared By Cesium Carbonate Dissolution During The Isolok SME Acceptability Testing

    International Nuclear Information System (INIS)

    Edwards, T.; Hera, K.; Coleman, C.

    2011-01-01

    Evaluation of Defense Waste Processing Facility (DWPF) Chemical Process Cell (CPC) cycle time identified several opportunities to improve the CPC processing time. The Mechanical Systems and Custom Equipment Development (MS and CED) Section of the Savannah River National Laboratory (SRNL) recently completed the evaluation of one of these opportunities - the possibility of using an Isolok sampling valve as an alternative to the Hydragard valve for taking DWPF process samples at the Slurry Mix Evaporator (SME). The use of an Isolok for SME sampling has the potential to improve operability, reduce maintenance time, and decrease CPC cycle time. The SME acceptability testing for the Isolok was requested in Task Technical Request (TTR) HLW-DWPF-TTR-2010-0036 and was conducted as outlined in Task Technical and Quality Assurance Plan (TTQAP) SRNLRP-2011-00145. RW-0333P QA requirements applied to the task, and the results from the investigation were documented in SRNL-STI-2011-00693. Measurement of the chemical composition of study samples was a critical component of the SME acceptability testing of the Isolok. A sampling and analytical plan supported the investigation with the analytical plan directing that the study samples be prepared by a cesium carbonate (Cs2CO3) fusion dissolution method and analyzed by Inductively Coupled Plasma - Optical Emission Spectroscopy (ICP-OES). The use of the cesium carbonate preparation method for the Isolok testing provided an opportunity for an additional assessment of this dissolution method, which is being investigated as a potential replacement for the two methods (i.e., sodium peroxide fusion and mixed acid dissolution) that have been used at the DWPF for the analysis of SME samples. Earlier testing of the Cs2CO3 method yielded promising results which led to a TTR from Savannah River Remediation, LLC (SRR) to SRNL for additional support and an associated TTQAP to direct the SRNL efforts. A technical report resulting from this

  16. An approach based on HPLC-fingerprint and chemometrics to quality consistency evaluation of Matricaria chamomilla L. commercial samples

    Directory of Open Access Journals (Sweden)

    Agnieszka Viapiana

    2016-10-01

    Chamomile has been used as an herbal medication since ancient times and is still popular because it contains various bioactive phytochemicals that could provide therapeutic effects. In this study, a simple and reliable HPLC method was developed to evaluate the quality consistency of nineteen chamomile samples through establishing a chromatographic fingerprint, quantification of phenolic compounds and determination of antioxidant activity. For fingerprint analysis, 12 peaks were selected as the common peaks to evaluate the similarities of commercial samples of chamomile obtained from different manufacturers. A similarity analysis was performed to assess the similarity/dissimilarity of the chamomile samples, with values varying from 0.868 to 0.990, indicating that samples from different manufacturers were consistent. Additionally, simultaneous quantification of five phenolic acids (gallic, caffeic, syringic, p-coumaric, ferulic) and four flavonoids (rutin, myricetin, quercetin and kaempferol) was performed to interpret the quality consistency. In quantitative analysis, the nine individual phenolic compounds showed good regression (r > 0.9975). Inter- and intra-day precisions for all analysed compounds, expressed as relative standard deviation (CV), ranged from 0.05% to 3.12%. Since flavonoids and other polyphenols are commonly recognised as natural antioxidants, the antioxidant activity of the chamomile samples was evaluated using the 1,1-diphenyl-2-picrylhydrazyl (DPPH) radical scavenging activity and ferric reducing/antioxidant power (FRAP) assays. Correlation analysis was used to assess the relationship between antioxidant activity and phenolic composition, and multivariate analyses (PCA and HCA) were applied to distinguish the chamomile samples. Results shown in the study indicate a high similarity of the chamomile samples among them, widely spread in the market and commonly used by people as infusions or teas, as well as that there were no statistically significant

  17. An Approach Based on HPLC-Fingerprint and Chemometrics to Quality Consistency Evaluation of Matricaria chamomilla L. Commercial Samples

    Science.gov (United States)

    Viapiana, Agnieszka; Struck-Lewicka, Wiktoria; Konieczynski, Pawel; Wesolowski, Marek; Kaliszan, Roman

    2016-01-01

    Chamomile has been used as an herbal medication since ancient times and is still popular because it contains various bioactive phytochemicals that could provide therapeutic effects. In this study, a simple and reliable HPLC method was developed to evaluate the quality consistency of nineteen chamomile samples through establishing a chromatographic fingerprint, quantification of phenolic compounds and determination of antioxidant activity. For fingerprint analysis, 12 peaks were selected as the common peaks to evaluate the similarities of commercial samples of chamomile obtained from different manufacturers. A similarity analysis was performed to assess the similarity/dissimilarity of the chamomile samples, with values varying from 0.868 to 0.990, indicating that samples from different manufacturers were consistent. Additionally, simultaneous quantification of five phenolic acids (gallic, caffeic, syringic, p-coumaric, ferulic) and four flavonoids (rutin, myricetin, quercetin and kaempferol) was performed to interpret the quality consistency. In quantitative analysis, the nine individual phenolic compounds showed good regression (r > 0.9975). Inter- and intra-day precisions for all analyzed compounds, expressed as relative standard deviation (CV), ranged from 0.05% to 3.12%. Since flavonoids and other polyphenols are commonly recognized as natural antioxidants, the antioxidant activity of the chamomile samples was evaluated using the 1,1-diphenyl-2-picrylhydrazyl (DPPH) radical scavenging activity and ferric reducing/antioxidant power (FRAP) assay. Correlation analysis was used to assess the relationship between antioxidant activity and phenolic composition, and multivariate analyses (PCA and HCA) were applied to distinguish the chamomile samples. Results shown in the study indicate a high similarity of the chamomile samples among them, widely spread in the market and commonly used by people as infusions or teas, as well as that there were no statistically significant differences among
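
    The similarity measure behind the 0.868-0.990 values is not specified in these records; fingerprint studies of this kind typically use a correlation or congruence (cosine) coefficient against a reference chromatogram, which the Python sketch below illustrates with invented peak areas for the 12 common peaks.

        import numpy as np

        def fingerprint_similarity(sample, reference):
            # congruence (cosine) coefficient between a sample's peak-area vector and a
            # reference fingerprint; values near 1 indicate consistent samples
            s, r = np.asarray(sample, float), np.asarray(reference, float)
            return s @ r / (np.linalg.norm(s) * np.linalg.norm(r))

        rng = np.random.default_rng(2)
        reference = rng.uniform(1, 10, 12)                            # hypothetical reference
        samples = reference * rng.normal(1.0, 0.05, size=(5, 12))     # invented batches
        print([round(fingerprint_similarity(s, reference), 3) for s in samples])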

  18. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys

    OpenAIRE

    Hund, Lauren; Bedrick, Edward J.; Pagano, Marcello

    2015-01-01

    Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we comp...
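
    As background to the designs compared above, the classic (non-clustered) LQAS decision rule is driven by two binomial misclassification risks; the Python sketch below computes them for an illustrative sample size, decision rule and pair of coverage benchmarks (all values invented, not taken from the paper).

        from scipy.stats import binom

        def lqas_rule(n, d, p_high, p_low):
            # classify a lot as 'acceptable' if more than d of n sampled subjects are
            # covered; return the risks at an upper and a lower coverage benchmark
            alpha = binom.cdf(d, n, p_high)          # P(reject | true coverage = p_high)
            beta = 1 - binom.cdf(d, n, p_low)        # P(accept | true coverage = p_low)
            return alpha, beta

        # e.g. n = 19, decision rule d = 12, benchmarks 80% vs 50% coverage (illustrative)
        print(lqas_rule(19, 12, 0.80, 0.50))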

  19. Real-time recursive hyperspectral sample and band processing algorithm architecture and implementation

    CERN Document Server

    Chang, Chein-I

    2017-01-01

    This book explores recursive architectures in designing progressive hyperspectral imaging algorithms. In particular, it makes progressive imaging algorithms recursive by introducing the concept of Kalman filtering in algorithm design so that hyperspectral imagery can be processed not only progressively sample by sample or band by band but also recursively via recursive equations. This book can be considered a companion book of author’s books, Real-Time Progressive Hyperspectral Image Processing, published by Springer in 2016. Explores recursive structures in algorithm architecture Implements algorithmic recursive architecture in conjunction with progressive sample and band processing Derives Recursive Hyperspectral Sample Processing (RHSP) techniques according to Band-Interleaved Sample/Pixel (BIS/BIP) acquisition format Develops Recursive Hyperspectral Band Processing (RHBP) techniques according to Band SeQuential (BSQ) acquisition format for hyperspectral data.

  20. The U-tube sampling methodology and real-time analysis of geofluids

    International Nuclear Information System (INIS)

    Freifeld, Barry; Perkins, Ernie; Underschultz, James; Boreham, Chris

    2009-01-01

    The U-tube geochemical sampling methodology, an extension of the porous cup technique proposed by Wood (1973), provides minimally contaminated aliquots of multiphase fluids from deep reservoirs and allows for accurate determination of dissolved gas composition. The initial deployment of the U-tube during the Frio Brine Pilot CO 2 storage experiment, Liberty County, Texas, obtained representative samples of brine and supercritical CO 2 from a depth of 1.5 km. A quadrupole mass spectrometer provided real-time analysis of dissolved gas composition. Since the initial demonstration, the U-tube has been deployed for (1) sampling of fluids down gradient of the proposed Yucca Mountain High-Level Waste Repository, Armagosa Valley, Nevada (2) acquiring fluid samples beneath permafrost in Nunuvut Territory, Canada, and (3) at a CO 2 storage demonstration project within a depleted gas reservoir, Otway Basin, Victoria, Australia. The addition of in-line high-pressure pH and EC sensors allows for continuous monitoring of fluid during sample collection. Difficulties have arisen during U-tube sampling, such as blockage of sample lines from naturally occurring waxes or from freezing conditions; however, workarounds such as solvent flushing or heating have been used to address these problems. The U-tube methodology has proven to be robust, and with careful consideration of the constraints and limitations, can provide high quality geochemical samples.

  1. Evaluation of three sampling methods to monitor outcomes of antiretroviral treatment programmes in low- and middle-income countries.

    Science.gov (United States)

    Tassie, Jean-Michel; Malateste, Karen; Pujades-Rodríguez, Mar; Poulet, Elisabeth; Bennett, Diane; Harries, Anthony; Mahy, Mary; Schechter, Mauro; Souteyrand, Yves; Dabis, François

    2010-11-10

    Retention of patients on antiretroviral therapy (ART) over time is a proxy for quality of care and an outcome indicator to monitor ART programs. Using existing databases (Antiretroviral in Lower Income Countries of the International Databases to Evaluate AIDS and Médecins Sans Frontières), we evaluated three sampling approaches to simplify the generation of outcome indicators. We used individual patient data from 27 ART sites and included 27,201 ART-naive adults (≥15 years) who initiated ART in 2005. For each site, we generated two outcome indicators at 12 months, retention on ART and proportion of patients lost to follow-up (LFU), first using all patient data and then within a smaller group of patients selected using three sampling methods (random, systematic and consecutive sampling). For each method and each site, 500 samples were generated, and the average result was compared with the unsampled value. The 95% sampling distribution (SD) was expressed as the 2.5th and 97.5th percentile values from the 500 samples. Overall, retention on ART was 76.5% (range 58.9-88.6) and the proportion of patients LFU, 13.5% (range 0.8-31.9). Estimates of retention from sampling (n = 5696) were 76.5% (SD 75.4-77.7) for random, 76.5% (75.3-77.5) for systematic and 76.0% (74.1-78.2) for the consecutive method. Estimates for the proportion of patients LFU were 13.5% (12.6-14.5), 13.5% (12.6-14.3) and 14.0% (12.5-15.5), respectively. With consecutive sampling, 50% of sites had SD within ±5% of the unsampled site value. Our results suggest that random, systematic or consecutive sampling methods are feasible for monitoring ART indicators at national level. However, sampling may not produce precise estimates in some sites.
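
    The Python sketch below, with a single hypothetical site, mimics the repeated-subsampling comparison described above (500 replicates per method, 2.5th/97.5th percentiles); the cohort size, subsample size and the way a "consecutive" block is drawn are assumptions, and an unclustered synthetic cohort will understate the differences between methods.

        import numpy as np

        rng = np.random.default_rng(3)
        cohort = rng.random(1500) < 0.765      # hypothetical site: True = retained at 12 months

        def sample_estimate(cohort, n, method):
            # estimate retention from a subsample drawn by one of three schemes
            if method == "random":
                idx = rng.choice(len(cohort), n, replace=False)
            elif method == "systematic":
                step = len(cohort) // n
                idx = np.arange(rng.integers(step), len(cohort), step)[:n]
            else:                               # consecutive: one run of n consecutive records
                start = rng.integers(len(cohort) - n + 1)
                idx = np.arange(start, start + n)
            return cohort[idx].mean()

        for method in ("random", "systematic", "consecutive"):
            reps = [sample_estimate(cohort, 200, method) for _ in range(500)]
            lo, hi = np.percentile(reps, [2.5, 97.5])
            print(method, round(float(np.mean(reps)), 3), (round(lo, 3), round(hi, 3)))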

  2. An Approach Based on HPLC-Fingerprint and Chemometrics to Quality Consistency Evaluation of Matricaria chamomilla L. Commercial Samples

    OpenAIRE

    Viapiana, Agnieszka; Struck-Lewicka, Wiktoria; Konieczynski, Pawel; Wesolowski, Marek; Kaliszan, Roman

    2016-01-01

    Chamomile has been used as an herbal medication since ancient times and is still popular because it contains various bioactive phytochemicals that could provide therapeutic effects. In this study, a simple and reliable HPLC method was developed to evaluate the quality consistency of nineteen chamomile samples through establishing a chromatographic fingerprint, quantification of phenolic compounds and determination of antioxidant activity. For fingerprint analysis, 12 peaks were selected as th...

  3. Experimental evaluation of the detection threshold of uranium in urine samples

    International Nuclear Information System (INIS)

    Ferreyra, M. D.; Suarez Mendez, Sebastian; Tossi, Mirta H.

    1999-01-01

    The routine internal dosimetric tests for nuclear installation workers include the determination of uranium in urine. The analysis is carried out, after chemical treatment, by UV fluorometry, comparing the results with urine blank samples from workers not occupationally exposed to contamination. The fluctuation of the results of the uranium content in the blank samples greatly affects the determinations. The uranium content was determined in 30 blank samples and the results were evaluated by three calculation methods: 1) the procedure recommended by IUPAC; 2) the graphical method; and 3) the error propagation method. The last one has been adopted for the calculation of the detection threshold. (authors)
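    As a point of reference, one common blank-based formulation of a detection threshold is a multiple of the blank scatter (stated here for orientation only; the three calculation methods compared in the paper are not reproduced):

        x_{\mathrm{LD}} \;=\; \bar{x}_{\mathrm{blank}} + k\, s_{\mathrm{blank}}, \qquad k \approx 3

    where \bar{x}_{\mathrm{blank}} and s_{\mathrm{blank}} are the mean and standard deviation of the apparent uranium content of the blank urine samples.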

  4. Comparison of Nested Polymerase Chain Reaction and Real-Time Polymerase Chain Reaction with Parasitological Methods for Detection of Strongyloides stercoralis in Human Fecal Samples

    Science.gov (United States)

    Sharifdini, Meysam; Mirhendi, Hossein; Ashrafi, Keyhan; Hosseini, Mostafa; Mohebali, Mehdi; Khodadadi, Hossein; Kia, Eshrat Beigom

    2015-01-01

    This study was performed to evaluate nested polymerase chain reaction (PCR) and real-time PCR methods for detection of Strongyloides stercoralis in fecal samples compared with parasitological methods. A total of 466 stool samples were examined by conventional parasitological methods (formalin ether concentration [FEC] and agar plate culture [APC]). DNA was extracted using an in-house method, and the mitochondrial cytochrome c oxidase subunit 1 and 18S ribosomal genes were amplified by nested PCR and real-time PCR, respectively. Among the 466 samples, 12.7% and 18.2% were found infected with S. stercoralis by FEC and APC, respectively. DNA of S. stercoralis was detected in 18.9% and 25.1% of samples by real-time PCR and nested PCR, respectively. Considering parasitological methods as the diagnostic gold standard, the sensitivity and specificity of nested PCR were 100% and 91.6%, respectively, and those of real-time PCR were 84.7% and 95.8%, respectively. However, considering sequence analyses of the selected nested PCR products, the specificity of nested PCR increases. In general, molecular methods were superior to parasitological methods: they were more sensitive and more reliable in detecting S. stercoralis. Between the two molecular methods, the sensitivity of nested PCR was higher than that of real-time PCR. PMID:26350449

  5. Surveying immigrants without sampling frames - evaluating the success of alternative field methods.

    Science.gov (United States)

    Reichel, David; Morales, Laura

    2017-01-01

    This paper evaluates the sampling methods of an international survey, the Immigrant Citizens Survey, which aimed at surveying immigrants from outside the European Union (EU) in 15 cities in seven EU countries. In five countries, no sample frame was available for the target population. Consequently, alternative ways to obtain a representative sample had to be found. In three countries 'location sampling' was employed, while in two countries traditional methods were used with adaptations to reach the target population. The paper assesses the main methodological challenges of carrying out a survey among a group of immigrants for whom no sampling frame exists. The samples of the survey in these five countries are compared to results of official statistics in order to assess the accuracy of the samples obtained through the different sampling methods. It can be shown that alternative sampling methods can provide meaningful results in terms of core demographic characteristics although some estimates differ to some extent from the census results.

  6. Determination of heavy metals in groundwater samples - ICP-MS analysis and evaluation

    International Nuclear Information System (INIS)

    Leiterer, M.; Muench, U.

    1994-01-01

    An analytical programme which permits the direct, simultaneous determination of Al, As, Cd, Cr, Cu, Mn, Ni, Pb and Zn in groundwater samples was developed for ICP-MS. Spectral mass interferences, attributable to great differences in groundwater matrices, precision and accuracy have been discussed. The evaluation of analytical results was demonstrated for selected sampling points of the groundwater observation network of Thuringia. (orig.)

  7. Effect of Different Sampling Schedules on Results of Bioavailability and Bioequivalence Studies: Evaluation by Means of Monte Carlo Simulations.

    Science.gov (United States)

    Kano, Eunice Kazue; Chiann, Chang; Fukuda, Kazuo; Porta, Valentina

    2017-08-01

    Bioavailability and bioequivalence studies are among the most frequently performed investigations in clinical trials. Bioequivalence testing is based on the assumption that 2 drug products will be therapeutically equivalent when they are equivalent in the rate and extent to which the active drug ingredient or therapeutic moiety is absorbed and becomes available at the site of drug action. In recent years there has been a significant growth in published papers that use in silico studies based on mathematical simulations to analyze pharmacokinetic and pharmacodynamic properties of drugs, including bioavailability and bioequivalence aspects. The goal of this study was to evaluate the usefulness of in silico studies as a tool in the planning of bioequivalence, bioavailability and other pharmacokinetic assays, e.g., to determine an appropriate sampling schedule. Monte Carlo simulations were used to define adequate blood sampling schedules for a bioequivalence assay comparing 2 different formulations of cefadroxil oral suspension. In silico bioequivalence studies comparing the formulations under various sampling schedules were performed using these models, and an in vivo study was conducted to confirm the in silico results. The results of the in silico and in vivo bioequivalence studies demonstrated that schedules with fewer sampling times are as efficient as schedules with larger numbers of sampling times in the assessment of bioequivalence, but only if Tmax is included as a sampling time. It was also concluded that in silico studies are useful tools in the planning of bioequivalence, bioavailability and other pharmacokinetic in vivo assays. © Georg Thieme Verlag KG Stuttgart · New York.
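    A small sketch of the kind of comparison such simulations support is shown below; the one-compartment parameters and the two schedules are illustrative assumptions, not cefadroxil values or the schedules used in the study:

        import numpy as np

        def conc_oral_1cpt(t, dose=500.0, f=0.9, ka=1.2, ke=0.35, vd=20.0):
            """Illustrative one-compartment model with first-order absorption and elimination."""
            t = np.asarray(t, dtype=float)
            return (f * dose * ka) / (vd * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

        def pk_metrics(times):
            """Cmax, Tmax and AUC(0-last) by the linear trapezoidal rule for a sampling schedule."""
            times = np.asarray(times, dtype=float)
            conc = conc_oral_1cpt(times)
            auc = float(np.sum(np.diff(times) * (conc[1:] + conc[:-1]) / 2.0))
            i_max = int(np.argmax(conc))
            return conc[i_max], times[i_max], auc

        dense = np.arange(0.0, 12.25, 0.25)             # 15-minute grid out to 12 h
        sparse = [0, 0.5, 1, 1.5, 2, 3, 4, 6, 8, 12]    # fewer points, but brackets Tmax

        for label, schedule in (("dense", dense), ("sparse", sparse)):
            cmax, tmax, auc = pk_metrics(schedule)
            print(f"{label:6s}  Cmax={cmax:5.2f} mg/L  Tmax={tmax:4.2f} h  AUC={auc:6.1f} mg*h/L")

    In this toy example the sparse schedule loses little on Cmax and AUC as long as it brackets Tmax, which mirrors the study's conclusion about reduced sampling schedules.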

  8. Evaluation of various conventional methods for sampling weeds in potato and spinach crops

    Directory of Open Access Journals (Sweden)

    David Jamaica

    2014-04-01

    This study aimed to evaluate, at an exploratory level, some of the different conventional sampling designs in a section of a potato crop and in a commercial crop of spinach. Weeds were sampled in a 16 x 48 m section of a potato crop with a set grid of 192 sections. The cover and density of the weeds were registered in squares of 0.25 to 64 m². The results were used to create a database that allowed for the simulation of different sampling designs: variables and square size. A second sampling was carried out with these results in a spinach crop of 1.16 ha with a set grid of 6 x 6 m cells, evaluating the cover in 4 m² squares. Another database was created with this information, which was used to simulate other sampling designs such as the distribution and quantity of sampling squares. According to the obtained results, a good method for approximating the quantity of squares for diverse samples is 10-12 squares (4 m²) per hectare for richness and 18 or more squares per hectare for abundance. This square size is optimal since it allows for sampling of a larger area without losing sight of low-profile species, with the cover variable best representing the abundance of the weeds.

  9. Deflection evaluation using time-resolved radiography

    International Nuclear Information System (INIS)

    Fry, D.A.; Lucero, J.P.

    1990-01-01

    Time-resolved radiography is the creation of an x-ray image for which both the start-exposure and stop-exposure times are known with respect to the event under study. The combination of image and timing is used to derive information about the event. The authors have applied time-resolved radiography to evaluate motions of explosive-driven events. In the particular application discussed in this paper, the authors' intent is to measure maximum deflections of the components involved. Exposures are made during the interval from just before to just after the event of interest occurs, so a smear or blur of the motion out to its furthest extent is recorded on the image. Comparison of the dynamic images with static images allows deflection measurements to be made.

  10. Correction to the count-rate detection limit and sample/blank time-allocation methods

    International Nuclear Information System (INIS)

    Alvarez, Joseph L.

    2013-01-01

    A common form of count-rate detection limits contains a propagation of uncertainty error. This error originated in methods to minimize uncertainty in the subtraction of the blank counts from the gross sample counts by allocation of blank and sample counting times. Correct uncertainty propagation showed that the time allocation equations have no solution. This publication presents the correct form of count-rate detection limits. Highlights: the paper demonstrates a proper method of propagating the uncertainty of count-rate differences; shows that the standard count-rate detection limits and the count-time allocation methods for minimum uncertainty were in error; presents the correct form of the count-rate detection limit; and discusses the confusion between count-rate uncertainty and count uncertainty.
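    For context, the textbook propagation step for the net count rate, assuming Poisson counting statistics, is shown below (a standard relation given for orientation, not the corrected detection-limit expression derived in the paper):

        R_{\text{net}} = \frac{N_g}{t_g} - \frac{N_b}{t_b},
        \qquad
        u(R_{\text{net}}) = \sqrt{\frac{N_g}{t_g^{2}} + \frac{N_b}{t_b^{2}}}

    where N_g and N_b are the gross and blank counts accumulated over counting times t_g and t_b, and u(N) = \sqrt{N} has been used for each count.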

  11. A DOE manual: DOE Methods for Evaluating Environmental and Waste Management Samples

    International Nuclear Information System (INIS)

    Goheen, S.C.; McCulloch, M.; Riley, R.G.

    1994-01-01

    Waste management inherently requires knowledge of the waste's chemical composition. The waste can often be analyzed by established methods; however, if the samples are radioactive, or are plagued by other complications, established methods may not be feasible. The US Department of Energy (DOE) has been faced with managing some waste types that are not amenable to standard or available methods, so new or modified sampling and analysis methods are required. These methods are incorporated into DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods), which is a guidance/methods document for sampling and analysis activities in support of DOE sites. It is a document generated by consensus of the DOE laboratory staff and is intended to fill the gap within existing guidance documents (e.g., the Environmental Protection Agency's (EPA's) Test Methods for Evaluating Solid Waste, SW-846), which apply to low-level or non-radioactive samples. DOE Methods fills the gap by including methods that take into account the complexities of DOE site matrices. The most recent update, distributed in October 1993, contained quality assurance (QA), quality control (QC), safety, sampling, organic analysis, inorganic analysis, and radioanalytical guidance as well as 29 methods. The next update, which will be distributed in April 1994, will contain 40 methods and will therefore have greater applicability. All new methods are either peer reviewed or labeled "draft" methods. Draft methods were added to speed the release of methods to field personnel.

  12. Air exposure and sample storage time influence on hydrogen release from tungsten

    Energy Technology Data Exchange (ETDEWEB)

    Moshkunov, K.A., E-mail: moshkunov@gmail.co [National Research Nuclear University ' MEPhI' , Kashirskoe sh. 31, 115409 Moscow (Russian Federation); Schmid, K.; Mayer, M. [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Boltzmannstrasse 2, D-85748 Garching (Germany); Kurnaev, V.A.; Gasparyan, Yu.M. [National Research Nuclear University ' MEPhI' , Kashirskoe sh. 31, 115409 Moscow (Russian Federation)

    2010-09-30

    In investigations of hydrogen retention in first wall components the influence of the conditions of the implanted target storage prior to analysis and the storage time is often neglected. Therefore we have performed a dedicated set of experiments. The release of hydrogen from samples exposed to ambient air after irradiation was compared to samples kept in vacuum. For air exposed samples significant amounts of HDO and D2O are detected during TDS. Additional experiments have shown that heavy water is formed by recombination of releasing D and H atoms with O on the W surface. This water formation can alter hydrogen retention results significantly, in particular - for low retention cases. In addition to the influence of ambient air exposure also the influence of storage time in vacuum was investigated. After implantation at 300 K the samples were stored in vacuum for up to 1 week during which the retained amount decreased significantly. The subsequently measured TDS spectra showed that D was lost from both the high and low energy peaks during storage at ambient temperature of ~300 K. An attempt to simulate this release from both peaks during room temperature storage by TMAP 7 calculations showed that this effect cannot be explained by conventional diffusion/trapping models.

  13. Air exposure and sample storage time influence on hydrogen release from tungsten

    International Nuclear Information System (INIS)

    Moshkunov, K.A.; Schmid, K.; Mayer, M.; Kurnaev, V.A.; Gasparyan, Yu.M.

    2010-01-01

    In investigations of hydrogen retention in first wall components the influence of the conditions of the implanted target storage prior to analysis and the storage time is often neglected. Therefore we have performed a dedicated set of experiments. The release of hydrogen from samples exposed to ambient air after irradiation was compared to samples kept in vacuum. For air exposed samples significant amounts of HDO and D2O are detected during TDS. Additional experiments have shown that heavy water is formed by recombination of releasing D and H atoms with O on the W surface. This water formation can alter hydrogen retention results significantly, in particular - for low retention cases. In addition to the influence of ambient air exposure also the influence of storage time in vacuum was investigated. After implantation at 300 K the samples were stored in vacuum for up to 1 week during which the retained amount decreased significantly. The subsequently measured TDS spectra showed that D was lost from both the high and low energy peaks during storage at ambient temperature of ∼300 K. An attempt to simulate this release from both peaks during room temperature storage by TMAP 7 calculations showed that this effect cannot be explained by conventional diffusion/trapping models.

  14. Air exposure and sample storage time influence on hydrogen release from tungsten

    Science.gov (United States)

    Moshkunov, K. A.; Schmid, K.; Mayer, M.; Kurnaev, V. A.; Gasparyan, Yu. M.

    2010-09-01

    In investigations of hydrogen retention in first wall components the influence of the conditions of the implanted target storage prior to analysis and the storage time is often neglected. Therefore we have performed a dedicated set of experiments. The release of hydrogen from samples exposed to ambient air after irradiation was compared to samples kept in vacuum. For air exposed samples significant amounts of HDO and D2O are detected during TDS. Additional experiments have shown that heavy water is formed by recombination of releasing D and H atoms with O on the W surface. This water formation can alter hydrogen retention results significantly, in particular - for low retention cases. In addition to the influence of ambient air exposure also the influence of storage time in vacuum was investigated. After implantation at 300 K the samples were stored in vacuum for up to 1 week during which the retained amount decreased significantly. The subsequently measured TDS spectra showed that D was lost from both the high and low energy peaks during storage at ambient temperature of ~300 K. An attempt to simulate this release from both peaks during room temperature storage by TMAP 7 calculations showed that this effect cannot be explained by conventional diffusion/trapping models.

  15. Towards the production of reliable quantitative microbiological data for risk assessment: Direct quantification of Campylobacter in naturally infected chicken fecal samples using selective culture and real-time PCR

    DEFF Research Database (Denmark)

    Garcia Clavero, Ana Belén; Vigre, Håkan; Josefsen, Mathilde Hasseldam

    2015-01-01

    ... and for the evaluation of control strategies implemented in poultry production. The aim of this study was to compare estimates of the numbers of Campylobacter spp. in naturally infected chicken fecal samples obtained using direct quantification by selective culture and by real-time PCR. Absolute quantification of Campylobacter by real-time PCR was performed using standard curves designed for two different DNA extraction methods: Easy-DNA™ Kit from Invitrogen (Easy-DNA) and NucliSENS® MiniMAG® from bioMérieux (MiniMAG). Results indicated that the estimation of the numbers of Campylobacter present in chicken fecal samples ... Although there were differences in terms of estimates of Campylobacter numbers between the methods and samples, the differences between culture and real-time PCR were not statistically significant for most of the samples used in this study.

  16. Development and evaluation of a gas chromatographic method for the determination of triazine herbicides in natural water samples

    Science.gov (United States)

    Steinheimer, T.R.; Brooks, M.G.

    1984-01-01

    A multi-residue method is described for the determination of triazine herbicides in natural water samples. The technique uses solvent extraction followed by gas chromatographic separation and detection employing nitrogen-selective devices. Seven compounds can be determined simultaneously at a nominal detection limit of 0.1 µg/L in a 1-litre sample. Three different natural water samples were used for error analysis via evaluation of recovery efficiencies and estimation of overall method precision. As an alternative to liquid-liquid partition (solvent extraction) for removal of compounds of interest from water, solid-phase extraction (SPE) techniques employing chromatographic grade silicas with chemically modified surfaces have been examined. SPE is found to provide rapid and efficient concentration with quantitative recovery of some triazine herbicides from natural water samples. Concentration factors of 500 to 1000 times are obtained readily by the SPE technique.

  17. Chromosomal radiosensitivity of human leucocytes in relation to sampling time

    International Nuclear Information System (INIS)

    Buul, P.P.W. van; Natarajan, A.T.

    1980-01-01

    Frequencies of chromosomal aberrations after irradiation with X-rays of peripheral blood lymphocytes in vitro were determined at different times after initiation of cultures. In each culture, the kinetics of cell multiplication was followed by using BrdU labelling and differential staining of chromosomes. The results indicate that the mixing up of first and second cell cycle cells at later sampling times cannot explain the observed variation in the frequencies of chromosomal aberrations but that donor-to-donor variation is a predominant factor influencing yields of aberrations. The condition of a donor seems to be most important because repeats on the same donor also showed marked variability. (orig.)

  18. Unreviewed safety question evaluation of 100 K West fuel canister gas and liquid sampling

    International Nuclear Information System (INIS)

    Alwardt, L.D.

    1995-01-01

    The purpose of this report is to provide the basis for answers to an Unreviewed Safety Question (USQ) safety evaluation for the gas and liquid sampling activities associated with the fuel characterization program at the 100 K West (KW) fuel storage basin. The scope of this safety evaluation is limited to the movement of canisters between the main storage basin, weasel pit, and south loadout pit transfer channel (also known as the decapping station); gas and liquid sampling of fuel canisters in the weasel pit; mobile laboratory preliminary sample analysis in or near the 105 KW basin building; and the placement of sample containers in an approved shipping container. It was concluded that the activities and potential accident consequences associated with the gas and liquid sampling of 100 KW fuel canisters are bounded by the current safety basis documents and do not constitute an Unreviewed Safety Question

  19. Measuring Sulfur Isotope Ratios from Solid Samples with the Sample Analysis at Mars Instrument and the Effects of Dead Time Corrections

    Science.gov (United States)

    Franz, H. B.; Mahaffy, P. R.; Kasprzak, W.; Lyness, E.; Raaen, E.

    2011-01-01

    The Sample Analysis at Mars (SAM) instrument suite comprises the largest science payload on the Mars Science Laboratory (MSL) "Curiosity" rover. SAM will perform chemical and isotopic analysis of volatile compounds from atmospheric and solid samples to address questions pertaining to habitability and geochemical processes on Mars. Sulfur is a key element of interest in this regard, as sulfur compounds have been detected on the Martian surface by both in situ and remote sensing techniques. Their chemical and isotopic composition can help constrain environmental conditions and mechanisms at the time of formation. A previous study examined the capability of the SAM quadrupole mass spectrometer (QMS) to determine sulfur isotope ratios of SO2 gas from a statistical perspective. Here we discuss the development of a method for determining sulfur isotope ratios with the QMS by sampling SO2 generated from heating of solid sulfate samples in SAM's pyrolysis oven. This analysis, which was performed with the SAM breadboard system, also required development of a novel treatment of the QMS dead time to accommodate the characteristics of an aging detector.
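    For background, the conventional non-paralyzable dead-time correction relates a measured count rate m to a corrected rate n through the dead time \tau (the standard textbook form, not the novel treatment developed in this work for the aging detector):

        n = \frac{m}{1 - m\,\tau}

    so, for example, a measured rate of 10^5 counts/s with \tau = 1 µs would be corrected upward by about 11%.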

  20. A Schistosoma haematobium-specific real-time PCR for diagnosis of urogenital schistosomiasis in serum samples of international travelers and migrants.

    Science.gov (United States)

    Cnops, Lieselotte; Soentjens, Patrick; Clerinx, Jan; Van Esbroeck, Marjan

    2013-01-01

    Diagnosis of urogenital schistosomiasis by microscopy and serological tests may be elusive in travelers due to low egg load and the absence of seroconversion upon arrival. There is a need for a more sensitive diagnostic test. Therefore, we developed a real-time PCR targeting the Schistosoma haematobium-specific Dra1 sequence. The PCR was evaluated on urine (n = 111), stool (n = 84) and serum samples (n = 135), and one biopsy from travelers and migrants with confirmed or suspected schistosomiasis. PCR revealed a positive result in 7/7 urine samples, 11/11 stool samples and 1/1 biopsy containing S. haematobium eggs as demonstrated by microscopy and in 22/23 serum samples from patients with a parasitological confirmed S. haematobium infection. S. haematobium DNA was additionally detected by PCR in 7 urine, 3 stool and 5 serum samples of patients suspected of having schistosomiasis without egg excretion in urine and feces. None of these suspected patients demonstrated other parasitic infections except one with Blastocystis hominis and Entamoeba cysts in a fecal sample. The PCR was negative in all stool samples containing S. mansoni eggs (n = 21) and in all serum samples of patients with a microscopically confirmed S. mansoni (n = 22), Ascaris lumbricoides (n = 1), Ancylostomidae (n = 1), Strongyloides stercoralis (n = 1) or Trichuris trichuria infection (n = 1). The PCR demonstrated a high specificity, reproducibility and analytical sensitivity (0.5 eggs per gram of feces). The real-time PCR targeting the Dra1 sequence for S. haematobium-specific detection in urine, feces, and particularly serum, is a promising tool to confirm the diagnosis, also during the acute phase of urogenital schistosomiasis.

  1. Discrete- vs. Continuous-Time Modeling of Unequally Spaced Experience Sampling Method Data

    Directory of Open Access Journals (Sweden)

    Silvia de Haan-Rietdijk

    2017-10-01

    The Experience Sampling Method (ESM) is a common approach in psychological research for collecting intensive longitudinal data with high ecological validity. One characteristic of ESM data is that it is often unequally spaced, because the measurement intervals within a day are deliberately varied, and measurement continues over several days. This poses a problem for discrete-time (DT) modeling approaches, which are based on the assumption that all measurements are equally spaced. Nevertheless, DT approaches such as (vector) autoregressive modeling are often used to analyze ESM data, for instance in the context of affective dynamics research. There are equivalent continuous-time (CT) models, but they are more difficult to implement. In this paper we take a pragmatic approach and evaluate the practical relevance of the violated model assumption in DT AR(1) and VAR(1) models, for the N = 1 case. We use simulated data under an ESM measurement design to investigate the bias in the parameters of interest under four different model implementations, ranging from the true CT model that accounts for all the exact measurement times, to the crudest possible DT model implementation, where even the nighttime is treated as a regular interval. An analysis of empirical affect data illustrates how the differences between DT and CT modeling can play out in practice. We find that the size and the direction of the bias in DT (V)AR models for unequally spaced ESM data depend quite strongly on the true parameter in addition to data characteristics. Our recommendation is to use CT modeling whenever possible, especially now that new software implementations have become available.
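    A compact way to see why interval length matters is the exact discretization of a zero-mean first-order continuous-time process (an Ornstein-Uhlenbeck process; a standard identity, not a result taken from the paper):

        x(t+\Delta t) = e^{-\theta\,\Delta t}\, x(t) + \varepsilon_{\Delta t},
        \qquad
        \varepsilon_{\Delta t} \sim \mathcal{N}\!\left(0,\; \tfrac{\sigma^{2}}{2\theta}\left(1 - e^{-2\theta\,\Delta t}\right)\right)

    with \theta > 0, so the implied autoregressive weight e^{-\theta\,\Delta t} shrinks as the interval grows; fitting a single AR(1) coefficient to intervals of different lengths therefore mixes different effective autoregressions.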

  2. Evaluation of BBL™ Sensi-Discs™ and FTA® cards as sampling devices for detection of rotavirus in stool samples.

    Science.gov (United States)

    Tam, Ka Ian; Esona, Mathew D; Williams, Alice; Ndze, Valantine N; Boula, Angeline; Bowen, Michael D

    2015-09-15

    Rotavirus is the most important cause of severe childhood gastroenteritis worldwide. Rotavirus vaccines are available and rotavirus surveillance is carried out to assess vaccination impact. In surveillance studies, stool samples are stored typically at 4°C or frozen to maintain sample quality. Uninterrupted cold storage is a problem in developing countries because of power interruptions. Cold-chain transportation of samples from collection sites to testing laboratories is costly. In this study, we evaluated the use of BBL™ Sensi-Discs™ and FTA® cards for storage and transportation of samples for virus isolation, EIA, and RT-PCR testing. Infectious rotavirus was recovered after 30 days of storage on Sensi-Discs™ at room temperature. We were able to genotype 98-99% of samples stored on Sensi-Discs™ and FTA® cards at temperatures ranging from -80°C to 37°C up to 180 days. A field sampling test using samples prepared and shipped from Cameroon showed that both matrices yielded 100% genotyping success compared with whole stool and Sensi-Discs™ demonstrated 95% concordance with whole stool in EIA testing. The utilization of BBL™ Sensi-Discs™ and FTA® cards for stool sample storage and shipment has the potential to have great impact on global public health by facilitating surveillance and epidemiological investigations of rotavirus strains worldwide at a reduced cost. Published by Elsevier B.V.

  3. Evaluation of BBL™ Sensi-Discs™ and FTA® cards as sampling devices for detection of rotavirus in stool samples

    Science.gov (United States)

    Tam, Ka Ian; Esona, Mathew D.; Williams, Alice; Ndze, Valentine N.; Boula, Angeline; Bowen, Michael D.

    2015-01-01

    Rotavirus is the most important cause of severe childhood gastroenteritis worldwide. Rotavirus vaccines are available and rotavirus surveillance is carried out to assess vaccination impact. In surveillance studies, stool samples are stored typically at 4°C or frozen to maintain sample quality. Uninterrupted cold storage is a problem in developing countries because of power interruptions. Cold-chain transportation of samples from collection sites to testing laboratories is costly. In this study, we evaluated the use of BBL™ Sensi-Discs™ and FTA® cards for storage and transportation of samples for virus isolation, EIA, and RT-PCR testing. Infectious rotavirus was recovered after 30 days of storage on Sensi-Discs™ at room temperature. We were able to genotype 98–99% of samples stored on Sensi-Discs™ and FTA® cards at temperatures ranging from −80°C to 37°C up to 180 days. A field sampling test using samples prepared and shipped from Cameroon, showed that both matrices yielded 100% genotyping success compared with whole stool and Sensi-Discs™ demonstrated 95% concordance with whole stool in EIA testing. The utilization of BBL™ Sensi-Discs™ and FTA® cards for stool sample storage and shipment has the potential to have great impact on global public health by facilitating surveillance and epidemiological investigations of rotavirus strains worldwide at a reduced cost. PMID:26022083

  4. Low-sensitivity H∞ filter design for linear delta operator systems with sampling time jitter

    Science.gov (United States)

    Guo, Xiang-Gui; Yang, Guang-Hong

    2012-04-01

    This article is concerned with the problem of designing H∞ filters for a class of linear discrete-time systems with low sensitivity to sampling time jitter via a delta operator approach. A delta-domain model is used to avoid the inherent numerical ill-conditioning resulting from the use of the standard shift-domain model at high sampling rates. Based on the projection lemma in combination with the descriptor system approach often used to solve problems related to delay, a novel bounded real lemma with three slack variables for delta operator systems is presented. A sensitivity approach based on this novel lemma is proposed to mitigate the effects of sampling time jitter on system performance. Then, the problem of designing a low-sensitivity filter can be reduced to a convex optimisation problem. An important consideration in the design of correlation filters is the optimal trade-off between the standard H∞ criterion and the sensitivity of the transfer function with respect to sampling time jitter. Finally, a numerical example demonstrating the validity of the proposed design method is given.
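    For reference, the delta operator underlying the delta-domain model is conventionally defined from the forward-shift operator q and the sampling period \Delta (a textbook definition, not the paper's filter construction):

        \delta x_k \;=\; \frac{(q-1)\,x_k}{\Delta} \;=\; \frac{x_{k+1} - x_k}{\Delta},
        \qquad
        \lim_{\Delta \to 0} \delta x_k = \dot{x}(t_k)

    which is why delta-domain models stay well conditioned at high sampling rates, where shift-domain pole locations cluster near the unit circle.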

  5. Analysis of time series and size of equivalent sample

    International Nuclear Information System (INIS)

    Bernal, Nestor; Molina, Alicia; Pabon, Daniel; Martinez, Jorge

    2004-01-01

    In a meteorological context, a first approach to the modeling of time series is to use models of autoregressive type. This allows one to take into account the meteorological persistence or temporal behavior, thereby identifying the memory of the analyzed process. This article seeks to present the concept of the size of an equivalent sample, which helps to identify sub-periods with a similar structure in the data series. Moreover, in this article we examine the alternative of adjusting the variance of the series, keeping in mind its temporal structure, as well as an adjustment to the covariance of two time series. This article presents two examples, the first corresponding to seven simulated series with a first-order autoregressive structure, and the second corresponding to seven meteorological series of anomalies of the air temperature at the surface in two Colombian regions.
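    One common way to make the "size of an equivalent sample" concrete for a first-order autoregressive series is the standard approximation for the variance of the sample mean (given here for orientation rather than as the article's exact definition): with lag-1 autocorrelation \rho,

        n_{\text{eff}} \;\approx\; n\,\frac{1-\rho}{1+\rho}

    so, for example, n = 100 observations with \rho = 0.5 carry roughly the information of 33 independent observations about the mean.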

  6. Diagnosing intramammary infections: evaluation of definitions based on a single milk sample.

    Science.gov (United States)

    Dohoo, I R; Smith, J; Andersen, S; Kelton, D F; Godden, S

    2011-01-01

    Criteria for diagnosing intramammary infections (IMI) have been debated for many years. Factors that may be considered in making a diagnosis include the organism of interest being found on culture, the number of colonies isolated, whether or not the organism was recovered in pure or mixed culture, and whether or not concurrent evidence of inflammation existed (often measured by somatic cell count). However, research using these criteria has been hampered by the lack of a "gold standard" test (i.e., a perfect test against which the criteria can be evaluated) and the need for very large data sets of culture results to have sufficient numbers of quarters with infections with a variety of organisms. This manuscript used 2 large data sets of culture results to evaluate several definitions (sets of criteria) for classifying a quarter as having, or not having an IMI by comparing the results from a single culture to a gold standard diagnosis based on a set of 3 milk samples. The first consisted of 38,376 milk samples from which 25,886 triplicate sets of milk samples taken 1 wk apart were extracted. The second consisted of 784 quarters that were classified as infected or not based on a set of 3 milk samples collected at 2-d intervals. From these quarters, a total of 3,136 additional samples were evaluated. A total of 12 definitions (named A to L) based on combinations of the number of colonies isolated, whether or not the organism was recovered in pure or mixed culture, and the somatic cell count were evaluated for each organism (or group of organisms) with sufficient data. The sensitivity (ability of a definition to detect IMI) and the specificity (Sp; ability of a definition to correctly classify noninfected quarters) were both computed. For all species, except Staphylococcus aureus, the sensitivity of all definitions was definition A). With the exception of "any organism" and coagulase-negative staphylococci, all Sp estimates were over 94% in the daily data and over 97

  7. Standardised Resting Time Prior to Blood Sampling and Diurnal Variation Associated with Risk of Patient Misclassification

    DEFF Research Database (Denmark)

    Bøgh Andersen, Ida; Brasen, Claus L.; Christensen, Henry

    2015-01-01

    BACKGROUND: According to current recommendations, blood samples should be taken in the morning after 15 minutes' resting time. Some components exhibit diurnal variation and, in response to pressures to expand opening hours and reduce waiting time, the aims of this study were to investigate the impact of resting time prior to blood sampling and diurnal variation on biochemical components, including albumin, thyrotropin (TSH), total calcium and sodium in plasma. METHODS: All patients referred to an outpatient clinic for blood sampling were included in the period Nov 2011 until June 2014 (opening ... .9×10⁻⁷) and sodium (p = 8.7×10⁻¹⁶). Only TSH and albumin were clinically significantly influenced by diurnal variation. Resting time had no clinically significant effect. CONCLUSIONS: We found no need for resting 15 minutes prior to blood sampling. However, diurnal variation was found to have a significant ...

  8. Impact of collection container material and holding times on sample integrity for mercury and methylmercury in water

    Energy Technology Data Exchange (ETDEWEB)

    Riscassi, Ami L. [ORNL]; Miller, Carrie L. [ORNL]; Brooks, Scott C. [ORNL]

    2014-01-01

    Mercury (Hg) and methylmercury (MeHg) concentrations in streamwater can vary on short timescales (hourly or less) during storm flow and on a diel cycle; the frequency and timing of sampling required to accurately characterize these dynamics may be difficult to accomplish manually. Automated sampling can assist in sample collection; however, use has been limited for Hg and MeHg analysis due to stability concerns of trace concentrations during extended storage times. We examined the viability of using automated samplers with disposable low-density polyethylene (LDPE) sample bags to collect industrially contaminated streamwater for unfiltered and filtered Hg and MeHg analysis. Specifically we investigated the effect of holding times ranging from hours to days on streamwater collected during baseflow and storm flow. Unfiltered and filtered Hg and MeHg concentrations decreased with increases in time prior to sample processing; holding times of 24 hours or less resulted in concentration changes (mean 11 ± 7% different) similar to variability in duplicates collected manually during analogous field conditions (mean 7 ± 10% different). Comparisons of samples collected with manual and automated techniques throughout a year for a wide range of stream conditions were also found to be similar to differences observed between duplicate grab samples. These results demonstrate that automated sampling into LDPE bags with holding times of 24 hours or less can be effectively used to collect streamwater for Hg and MeHg analysis, and encourage the testing of these materials and methods for implementation in other aqueous systems where high-frequency sampling is warranted.

  9. Evaluation of immunization coverage by lot quality assurance sampling compared with 30-cluster sampling in a primary health centre in India.

    Science.gov (United States)

    Singh, J; Jain, D C; Sharma, R S; Verghese, T

    1996-01-01

    The immunization coverage of infants, children and women residing in a primary health centre (PHC) area in Rajasthan was evaluated both by lot quality assurance sampling (LQAS) and by the 30-cluster sampling method recommended by WHO's Expanded Programme on Immunization (EPI). The LQAS survey was used to classify 27 mutually exclusive subunits of the population, defined as residents in health subcentre areas, on the basis of acceptable or unacceptable levels of immunization coverage among infants and their mothers. The LQAS results from the 27 subcentres were also combined to obtain an overall estimate of coverage for the entire population of the primary health centre, and these results were compared with the EPI cluster survey results. The LQAS survey did not identify any subcentre with a level of immunization among infants high enough to be classified as acceptable; only three subcentres were classified as having acceptable levels of tetanus toxoid (TT) coverage among women. The estimated overall coverage in the PHC population from the combined LQAS results showed that a quarter of the infants were immunized appropriately for their ages and that 46% of their mothers had been adequately immunized with TT. Although the age groups and the periods of time during which the children were immunized differed for the LQAS and EPI survey populations, the characteristics of the mothers were largely similar. About 57% (95% CI, 46-67) of them were found to be fully immunized with TT by 30-cluster sampling, compared with 46% (95% CI, 41-51) by stratified random sampling. The difference was not statistically significant. The field work to collect LQAS data took about three times longer, and cost 60% more than the EPI survey. The apparently homogeneous and low level of immunization coverage in the 27 subcentres makes this an impractical situation in which to apply LQAS, and the results obtained were therefore not particularly useful. However, if LQAS had been applied by local
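    The LQAS classification step can be summarized by its binomial operating characteristic; the sketch below uses a hypothetical lot plan (the sample size and decision value are assumptions, not the plan used in this survey):

        from math import comb

        def prob_classified_acceptable(n, d, coverage):
            """Probability that a lot (health subcentre) with the given true coverage is classified
            acceptable, i.e. that at most d of the n sampled children are found unimmunized."""
            q = 1.0 - coverage  # probability that a sampled child is unimmunized
            return sum(comb(n, k) * q**k * (1.0 - q)**(n - k) for k in range(d + 1))

        # Illustrative LQAS plan: lot sample size n = 19, decision value d = 3.
        for cov in (0.50, 0.70, 0.80, 0.90):
            print(f"true coverage {cov:.0%}: P(classified acceptable) = {prob_classified_acceptable(19, 3, cov):.2f}")

    Plans are chosen so that lots with clearly adequate coverage are very likely to be accepted and lots with clearly inadequate coverage are very likely to be rejected; when nearly all lots have low coverage, as here, the classification adds little beyond the combined coverage estimate.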

  10. Time delay estimation in a reverberant environment by low rate sampling of impulsive acoustic sources

    KAUST Repository

    Omer, Muhammad

    2012-07-01

    This paper presents a new method of time delay estimation (TDE) using low sample rates of an impulsive acoustic source in a room environment. The proposed method finds the time delay from the room impulse response (RIR) which makes it robust against room reverberations. The RIR is considered a sparse phenomenon and a recently proposed sparse signal reconstruction technique called orthogonal clustering (OC) is utilized for its estimation from the low rate sampled received signal. The arrival time of the direct path signal at a pair of microphones is identified from the estimated RIR and their difference yields the desired time delay. Low sampling rates reduce the hardware and computational complexity and decrease the communication between the microphones and the centralized location. The performance of the proposed technique is demonstrated by numerical simulations and experimental results. © 2012 IEEE.
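    A simplified sketch of the final step is given below: once impulse responses are available for a microphone pair, the delay is the difference between their direct-path arrival times. The sparse orthogonal-clustering reconstruction itself is not reproduced; a simple threshold on synthetic RIRs stands in for it, and all signal values are invented for illustration:

        import numpy as np

        def direct_path_index(rir, threshold_ratio=0.5):
            """Index of the first RIR tap exceeding a fraction of the global peak
            (a simple stand-in for the sparse-reconstruction step in the paper)."""
            rir = np.abs(np.asarray(rir, dtype=float))
            threshold = threshold_ratio * rir.max()
            return int(np.argmax(rir >= threshold))

        def time_delay(rir_mic1, rir_mic2, fs):
            """Time-difference-of-arrival between two microphones, in seconds."""
            return (direct_path_index(rir_mic2) - direct_path_index(rir_mic1)) / fs

        # Toy example: two synthetic RIRs at fs = 8 kHz with direct paths at taps 40 and 52.
        fs = 8000
        rir1, rir2 = np.zeros(256), np.zeros(256)
        rir1[40], rir1[90] = 1.0, 0.4    # direct path plus one reflection
        rir2[52], rir2[110] = 0.9, 0.35
        print(f"estimated delay: {time_delay(rir1, rir2, fs) * 1e3:.3f} ms")  # -> 1.500 ms

    Working on the impulse response rather than the raw microphone signals is what makes the estimate robust to the later reverberant taps.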

  11. Evaluation of Penicillium digitatum sterilization using non-equilibrium atmospheric pressure plasma by terahertz time-domain spectroscopy

    Science.gov (United States)

    Hiraoka, Takehiro; Ebizuka, Noboru; Takeda, Keigo; Ohta, Takayuki; Kondo, Hiroki; Ishikawa, Kenji; Kawase, Kodo; Ito, Masafumi; Sekine, Makoto; Hori, Masaru

    2011-10-01

    Recently, plasma sterilization has attracted much attention as a new sterilization technique that could take the place of spraying agricultural chemicals. Conventional methods of sterilization evaluation require culturing the samples for several days after plasma treatment. We therefore focused on terahertz time-domain spectroscopy (THz-TDS). In the THz region, vibrational modes of biological molecules and fingerprint spectra of biologically relevant molecules can be observed. In this study, our purpose was to measure the fingerprint spectrum of the Penicillium digitatum (PD) spore and to establish a sterilization evaluation method based on THz-TDS. The samples were 40 mg/ml PD spore suspensions dropped on cover glass. The atmospheric-pressure plasma was generated with an Ar gas flow of 3 slm and an applied alternating voltage of 6 kV. The samples were exposed to the plasma at a distance of 10 mm for 10 minutes. We obtained the fingerprint spectrum of the PD spore from 0.5 to 0.9 THz. This result indicates the possibility of in-situ evaluation of PD sterilization using THz-TDS.

  12. Evaluating change in bruise colorimetry and the effect of subject characteristics over time.

    Science.gov (United States)

    Scafide, Katherine R N; Sheridan, Daniel J; Campbell, Jacquelyn; Deleon, Valerie B; Hayat, Matthew J

    2013-09-01

    Forensic clinicians are routinely asked to estimate the age of cutaneous bruises. Unfortunately, existing research on noninvasive methods to date bruises has been mostly limited to relatively small, homogeneous samples or cross-sectional designs. The purpose of this prospective, foundational study was to examine change in bruise colorimetry over time and evaluate the effects of bruise size, skin color, gender, and local subcutaneous fat on that change. Bruises were created by a controlled application of a paintball pellet to 103 adult, healthy volunteers. Daily colorimetry measures were obtained for four consecutive days using the Minolta Chroma-meter®. The sample was nearly equal by gender and skin color (light, medium, dark). Analysis included general linear mixed modeling (GLMM). Change in bruise colorimetry over time was significant for all three color parameters (L*a*b*), the most notable changes being the decrease in red (a*) and increase in yellow (b*) starting at 24 h. Skin color was a significant predictor for all three colorimetry values but sex or subcutaneous fat levels were not. Bruise size was a significant predictor and moderator and may have accounted for the lack of effect of gender or subcutaneous fat. Study results demonstrated the ability to model the change in bruise colorimetry over time in a diverse sample of healthy adults. Multiple factors, including skin color and bruise size, must be considered when assessing bruise color in relation to its age. This study supports the need for further research that could build the science to allow more accurate bruise age estimations.

  13. Evaluation of Two Surface Sampling Methods for Microbiological and Chemical Analyses To Assess the Presence of Biofilms in Food Companies.

    Science.gov (United States)

    Maes, Sharon; Huu, Son Nguyen; Heyndrickx, Marc; Weyenberg, Stephanie van; Steenackers, Hans; Verplaetse, Alex; Vackier, Thijs; Sampers, Imca; Raes, Katleen; Reu, Koen De

    2017-12-01

    Biofilms are an important source of contamination in food companies, yet the composition of biofilms in practice is still mostly unknown. The chemical and microbiological characterization of surface samples taken after cleaning and disinfection is very important to distinguish free-living bacteria from the attached bacteria in biofilms. In this study, sampling methods that are potentially useful for both chemical and microbiological analyses of surface samples were evaluated. In the manufacturing facilities of eight Belgian food companies, surfaces were sampled after cleaning and disinfection using two sampling methods: the scraper-flocked swab method and the sponge stick method. Microbiological and chemical analyses were performed on these samples to evaluate the suitability of the sampling methods for the quantification of extracellular polymeric substance components and microorganisms originating from biofilms in these facilities. The scraper-flocked swab method was most suitable for chemical analyses of the samples because the material in these swabs did not interfere with determination of the chemical components. For microbiological enumerations, the sponge stick method was slightly but not significantly more effective than the scraper-flocked swab method. In all but one of the facilities, at least 20% of the sampled surfaces had more than 10² CFU/100 cm². Proteins were found in 20% of the chemically analyzed surface samples, and carbohydrates and uronic acids were found in 15 and 8% of the samples, respectively. When chemical and microbiological results were combined, 17% of the sampled surfaces were contaminated with both microorganisms and at least one of the analyzed chemical components; thus, these surfaces were characterized as carrying biofilm. Overall, microbiological contamination in the food industry is highly variable by food sector and even within a facility at various sampling points and sampling times.

  14. Sampling times influence the estimate of parameters in the Weibull dissolution model

    Czech Academy of Sciences Publication Activity Database

    Čupera, J.; Lánský, Petr; Šklubalová, Z.

    2015-01-01

    Vol. 78, Oct 12 (2015), pp. 171-176. ISSN 0928-0987. Institutional support: RVO:67985823. Keywords: dissolution * Fisher information * rate constant * optimal sampling times. Subject RIV: BA - General Mathematics. Impact factor: 3.773, year: 2015
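    For orientation, the Weibull dissolution profile referred to in the title is commonly written as follows (one common parameterization; the paper's exact form and its Fisher-information analysis of sampling times are not reproduced here):

        \frac{F(t)}{F_{\infty}} \;=\; 1 - \exp\!\left[-\left(\frac{t}{T_d}\right)^{\beta}\right]

    where F(t) is the amount dissolved at time t, T_d is a time-scale parameter and \beta the shape parameter; where the sampling times are placed determines how much information the data carry about \beta versus T_d.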

  15. Evaluation of the stability and validity of participant samples recruited over the Internet.

    Science.gov (United States)

    Lieberman, Daniel Z

    2008-12-01

    Research conducted via the Internet has the potential to reach important clinical populations of participants who would not participate in traditional studies. Concerns exist, however, about the validity of samples recruited in this manner, especially when participants are anonymous and never have contact with study staff. This study evaluated two anonymous samples that were recruited over the Internet to test an online program designed to help problem drinkers. The two studies were conducted 3 years apart, and different recruitment strategies were utilized. Despite these differences, the two samples were highly similar in demographic and clinical features. Correlations that have been found between variables in traditional non-anonymous studies were also found in both online samples, supporting the validity of the data that was collected. Appropriate skepticism is required when critically evaluating Internet studies. Nevertheless, the results of this study indicate that it is possible to obtain stable, valid data from anonymous participants over the Internet, even when there are significant differences in the way the participants are obtained.

  16. Systematic Sampling and Cluster Sampling of Packet Delays

    OpenAIRE

    Lindh, Thomas

    2006-01-01

    Based on experiences of a traffic flow performance meter, this paper suggests and evaluates cluster sampling and systematic sampling as methods to estimate average packet delays. Systematic sampling facilitates, for example, time analysis, frequency analysis and jitter measurements. Cluster sampling with repeated trains of periodically spaced sampling units separated by random starting periods, and systematic sampling, are evaluated with respect to accuracy and precision. Packet delay traces have been ...
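    A minimal sketch of the two estimators on a synthetic delay trace is shown below (the trace characteristics and plan parameters are invented for illustration):

        import random

        def systematic_sample(trace, period, rng):
            """Every 'period'-th packet, starting at a random offset."""
            start = rng.randrange(period)
            return trace[start::period]

        def cluster_sample(trace, train_len, n_trains, rng):
            """Repeated trains of consecutive packets separated by random starting points."""
            out = []
            for _ in range(n_trains):
                start = rng.randrange(len(trace) - train_len)
                out.extend(trace[start:start + train_len])
            return out

        rng = random.Random(0)
        # Hypothetical delay trace (ms): baseline jitter plus occasional congestion spikes.
        trace = [5.0 + rng.gauss(0.0, 0.5) + (20.0 if rng.random() < 0.02 else 0.0) for _ in range(100_000)]

        true_mean = sum(trace) / len(trace)
        sys = systematic_sample(trace, period=100, rng=rng)
        clu = cluster_sample(trace, train_len=50, n_trains=20, rng=rng)
        print(f"true mean {true_mean:.3f} ms, systematic {sum(sys)/len(sys):.3f} ms, cluster {sum(clu)/len(clu):.3f} ms")

    Systematic sampling keeps the sampling units spread over time, which is what makes the time, frequency and jitter analyses mentioned above possible; cluster sampling concentrates measurement effort into short trains and so tends to have higher variance when delays are bursty.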

  17. Evaluation of alternative macroinvertebrate sampling techniques for use in a new tropical freshwater bioassessment scheme

    OpenAIRE

    Isabel Eleanor Moore; Kevin Joseph Murphy

    2015-01-01

    Aim: The study aimed to determine the effectiveness of benthic macroinvertebrate dredge net sampling procedures as an alternative method to kick net sampling in tropical freshwater systems, specifically as an evaluation of sampling methods used in the Zambian Invertebrate Scoring System (ZISS) river bioassessment scheme. Tropical freshwater ecosystems are sometimes dangerous or inaccessible to sampling teams using traditional kick-sampling methods, so identifying an alternative procedure that...

  18. Sample pooling for real-time PCR detection and virulence determination of the footrot pathogen Dichelobacter nodosus.

    Science.gov (United States)

    Frosth, Sara; König, Ulrika; Nyman, Ann-Kristin; Aspán, Anna

    2017-09-01

    Dichelobacter nodosus is the principal cause of ovine footrot and strain virulence is an important factor in disease severity. Therefore, detection and virulence determination of D. nodosus is important for proper diagnosis of the disease. Today this is possible by real-time PCR analysis. Analysis of large numbers of samples is costly and laborious; therefore, pooling of individual samples is common in surveillance programs. However, pooling can reduce the sensitivity of the method. The aim of this study was to develop a pooling method for real-time PCR analysis that would allow sensitive detection and simultaneous virulence determination of D. nodosus. A total of 225 sheep from 17 flocks were sampled using ESwabs within the Swedish Footrot Control Program in 2014. Samples were first analysed individually and then in pools of five by real-time PCR assays targeting the 16S rRNA and aprV2/B2 genes of D. nodosus. Each pool consisted of four negative and one positive D. nodosus samples with varying amounts of the bacterium. In the individual analysis, 61 (27.1%) samples were positive in the 16S rRNA and the aprV2/B2 PCR assays and 164 (72.9%) samples were negative. All samples positive in the aprV2/B2 PCR assay were of the aprB2 variant. The pooled analysis showed that all 41 pools were also positive for D. nodosus 16S rRNA and the aprB2 variant. The diagnostic sensitivity for pooled and individual samples was therefore similar. Our method includes concentration of the bacteria before DNA extraction, which may account for the maintenance of diagnostic sensitivity. Diagnostic sensitivity in the real-time PCR assays of the pooled samples was comparable to the sensitivity obtained for individually analysed samples. Even sub-clinical infections could be detected in the pooled PCR samples, which is important for control of the disease. This method may therefore be implemented in footrot control programs, where it can replace analysis of individual samples.
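    When pooling is used for prevalence monitoring rather than for pools built around known positives (as they were in this study), a standard back-calculation from pool positivity to individual-level prevalence looks like the sketch below; it assumes a pool tests positive whenever it contains at least one positive swab, i.e. no sensitivity loss from pooling, which is essentially what the study set out to verify:

        def prevalence_from_pools(n_positive_pools, n_pools, pool_size):
            """Estimate individual-level prevalence from pooled PCR results, assuming a pool
            tests positive iff it contains at least one positive swab (no sensitivity loss)."""
            pool_positivity = n_positive_pools / n_pools
            return 1.0 - (1.0 - pool_positivity) ** (1.0 / pool_size)

        # Illustrative numbers only: 10 positive pools out of 41 pools of 5 swabs
        # would imply an individual-level prevalence of roughly 5.4%.
        print(f"{prevalence_from_pools(10, 41, 5):.3f}")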

  19. Evaluation of sampling methods for Bacillus spore-contaminated HVAC filters.

    Science.gov (United States)

    Calfee, M Worth; Rose, Laura J; Tufts, Jenia; Morse, Stephen; Clayton, Matt; Touati, Abderrahmane; Griffin-Gatchalian, Nicole; Slone, Christina; McSweeney, Neal

    2014-01-01

    The objective of this study was to compare an extraction-based sampling method to two vacuum-based sampling methods (vacuum sock and 37mm cassette filter) with regards to their ability to recover Bacillus atrophaeus spores (surrogate for Bacillus anthracis) from pleated heating, ventilation, and air conditioning (HVAC) filters that are typically found in commercial and residential buildings. Electrostatic and mechanical HVAC filters were tested, both without and after loading with dust to 50% of their total holding capacity. The results were analyzed by one-way ANOVA across material types, presence or absence of dust, and sampling device. The extraction method gave higher relative recoveries than the two vacuum methods evaluated (p≤0.001). On average, recoveries obtained by the vacuum methods were about 30% of those achieved by the extraction method. Relative recoveries between the two vacuum methods were not significantly different (p>0.05). Although extraction methods yielded higher recoveries than vacuum methods, either HVAC filter sampling approach may provide a rapid and inexpensive mechanism for understanding the extent of contamination following a wide-area biological release incident. Published by Elsevier B.V.

  20. Evaluation of wastewater contaminant transport in surface waters using verified Lagrangian sampling

    Science.gov (United States)

    Antweiler, Ronald C.; Writer, Jeffrey H.; Murphy, Sheila F.

    2014-01-01

    Contaminants released from wastewater treatment plants can persist in surface waters for substantial distances. Much research has gone into evaluating the fate and transport of these contaminants, but this work has often assumed constant flow from wastewater treatment plants. However, effluent discharge commonly varies widely over a 24-hour period, and this variation controls contaminant loading and can profoundly influence interpretations of environmental data. We show that methodologies relying on the normalization of downstream data to conservative elements can give spurious results, and should not be used unless it can be verified that the same parcel of water was sampled. Lagrangian sampling, which in theory samples the same water parcel as it moves downstream (the Lagrangian parcel), links hydrologic and chemical transformation processes so that the in-stream fate of wastewater contaminants can be quantitatively evaluated. However, precise Lagrangian sampling is difficult, and small deviations – such as missing the Lagrangian parcel by less than 1 h – can cause large differences in measured concentrations of all dissolved compounds at downstream sites, leading to erroneous conclusions regarding in-stream processes controlling the fate and transport of wastewater contaminants. Therefore, we have developed a method termed “verified Lagrangian” sampling, which can be used to determine if the Lagrangian parcel was actually sampled, and if it was not, a means for correcting the data to reflect the concentrations which would have been obtained had the Lagrangian parcel been sampled. To apply the method, it is necessary to have concentration data for a number of conservative constituents from the upstream, effluent, and downstream sites, along with upstream and effluent concentrations that are constant over the short-term (typically 2–4 h). These corrections can subsequently be applied to all data, including non-conservative constituents. Finally, we

  1. Evaluation of wastewater contaminant transport in surface waters using verified Lagrangian sampling.

    Science.gov (United States)

    Antweiler, Ronald C; Writer, Jeffrey H; Murphy, Sheila F

    2014-02-01

    Contaminants released from wastewater treatment plants can persist in surface waters for substantial distances. Much research has gone into evaluating the fate and transport of these contaminants, but this work has often assumed constant flow from wastewater treatment plants. However, effluent discharge commonly varies widely over a 24-hour period, and this variation controls contaminant loading and can profoundly influence interpretations of environmental data. We show that methodologies relying on the normalization of downstream data to conservative elements can give spurious results, and should not be used unless it can be verified that the same parcel of water was sampled. Lagrangian sampling, which in theory samples the same water parcel as it moves downstream (the Lagrangian parcel), links hydrologic and chemical transformation processes so that the in-stream fate of wastewater contaminants can be quantitatively evaluated. However, precise Lagrangian sampling is difficult, and small deviations - such as missing the Lagrangian parcel by less than 1h - can cause large differences in measured concentrations of all dissolved compounds at downstream sites, leading to erroneous conclusions regarding in-stream processes controlling the fate and transport of wastewater contaminants. Therefore, we have developed a method termed "verified Lagrangian" sampling, which can be used to determine if the Lagrangian parcel was actually sampled, and if it was not, a means for correcting the data to reflect the concentrations which would have been obtained had the Lagrangian parcel been sampled. To apply the method, it is necessary to have concentration data for a number of conservative constituents from the upstream, effluent, and downstream sites, along with upstream and effluent concentrations that are constant over the short-term (typically 2-4h). These corrections can subsequently be applied to all data, including non-conservative constituents. Finally, we show how data
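
    The correction step is not spelled out in these abstracts; the sketch below shows one plausible form of it under a simple two-end-member mixing assumption: conservative constituents define what the Lagrangian parcel should contain for a known effluent fraction, the observed conservative concentrations show how far the sampled parcel deviates, and the average ratio is applied as a multiplicative correction to all constituents. The effluent fraction, tracer values, and averaging scheme below are illustrative assumptions, not the authors' published algorithm:

        import numpy as np

        # Concentrations (mg/L) of conservative constituents; illustrative values only.
        upstream = np.array([12.0, 3.1, 45.0])   # e.g. chloride, boron, sodium
        effluent = np.array([85.0, 9.8, 160.0])
        observed = np.array([40.0, 5.6, 88.0])   # measured at the downstream site

        f_effluent = 0.40  # effluent fraction of flow in the Lagrangian parcel (assumed known)

        # Concentration each conservative tracer *should* have if the Lagrangian
        # parcel had actually been sampled (two-end-member mixing).
        expected = f_effluent * effluent + (1.0 - f_effluent) * upstream

        # Average ratio of expected to observed conservative concentrations gives
        # a single correction factor for this site and time.
        correction = np.mean(expected / observed)
        print(f"correction factor: {correction:.3f}")

        # Apply the same factor to a non-conservative contaminant measured in the
        # same sample (value illustrative).
        carbamazepine_obs = 0.210  # ug/L
        print(f"corrected concentration: {correction * carbamazepine_obs:.3f} ug/L")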

  2. Comparisons of Sampling Procedures and Time of Sampling for the Detection of Salmonella in Danish Infected Chicken Flocks Raised in Floor Systems

    Directory of Open Access Journals (Sweden)

    Madsen M

    2002-03-01

    Bacteriological follow-up samples were taken from 41 chicken (Gallus gallus) flocks in floor systems, where Salmonella enterica (Salmonella) had been detected either directly in bacteriological samples or indirectly by serological samples. Three types of follow-up samples were compared to each other within each flock: 1) 5 pairs of socks, analysed as 5 samples, 2) 2 pairs of socks, analysed as one sample, and 3) 60 faecal samples, analysed as one pooled sample. Agreement between sampling methods was evaluated by the following statistical tests: Cohen's kappa, the adjusted Rand index, McNemar's test for marginal symmetry, proportion of agreement (P0, P+, P-), and the odds ratio. The highest agreement was found between the 2 types of sock sampling, while the lowest agreement was found by comparing 60 faecal samples with 5 pairs of socks. Two pairs of socks analysed as one pool appeared to be just as effective in detecting S. enterica as the 60 faecal samples. In broiler flocks, 5 pairs of socks were used both in the routine samples taken at about 3 weeks of age for the establishment of infection of the flock, and as one of the follow-up samples taken shortly before slaughter age, which means that the only notable differences between the 2 sampling rounds were the age of the broilers and of their litter. S. enterica was detected more frequently in samples from broilers about 3 weeks old than in similar samples taken from broilers a few days prior to slaughter at ca. 33–40 days of age.
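
    For readers less familiar with the agreement statistics named above, the sketch below computes the proportion of agreement, Cohen's kappa, and an exact McNemar test for two sampling methods scored Salmonella-positive/negative on the same flocks; the 2×2 cell counts are invented for illustration and are not the study's data:

        from math import comb

        # 2x2 agreement table for two sampling methods on the same flocks
        # (counts are illustrative, not the study's data):
        #                 method B +   method B -
        a, b = 28, 5      # method A +
        c, d = 3, 5       # method A -
        n = a + b + c + d

        # Proportion of agreement and Cohen's kappa
        p_o = (a + d) / n
        p_e = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2
        kappa = (p_o - p_e) / (1 - p_e)

        # Exact two-sided McNemar test on the discordant pairs (b vs c)
        discordant = b + c
        tail = sum(comb(discordant, k) for k in range(min(b, c) + 1)) / 2**discordant
        p_mcnemar = min(2 * tail, 1.0)

        print(f"observed agreement P0 = {p_o:.2f}, kappa = {kappa:.2f}")
        print(f"exact McNemar p-value = {p_mcnemar:.3f}")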

  3. Evaluation of passive sampling of gaseous mercury using different sorbing materials.

    Science.gov (United States)

    Lin, Huiming; Zhang, Wei; Deng, Chunyan; Tong, Yingdong; Zhang, Qianggong; Wang, Xuejun

    2017-06-01

    Atmospheric mercury monitoring is essential because of its potential human health and ecological impacts. Current automated monitoring systems have limitations such as high cost, complicated configuration, and electricity requirements. Passive samplers require no electric power and are more appropriate for screening applications and long-term monitoring. The sampling rate is a major factor in evaluating the performance of a passive sampler. In this study, laboratory experiments were carried out using an exposure chamber to search for high-efficiency sorbents for gaseous mercury. Four types of sorbents, including sulfur-impregnated carbon (SIC), chlorine-impregnated carbon (CIC), bromine-impregnated carbon (BIC), and gold-coated sand (GCS), were evaluated under a wide range of meteorological parameters, including temperature, relative humidity, and wind speed. The results showed that all four sorbents have a high sampling rate above 0.01 m³ g⁻¹ day⁻¹, and wind speed has a positive correlation with the sampling rate. Under different temperature and relative humidity conditions, the sampling rate of SIC remains stable. The sampling rates of CIC and BIC show a negative correlation with temperature, and GCS is influenced by all three meteorological factors. Furthermore, long-term experiments were carried out to investigate the uptake capacity of GCS and SIC. Uptake curves show that the mass of sorbent in a passive sampler can influence uptake capacity. In the passive sampler, 0.9 g SIC or 0.9 g GCS can achieve stable uptake efficiency for at least 110 days with gaseous mercury concentrations at or below 2 ng/m³. For mercury concentrations at or below 21 ng/m³, 0.9 g SIC can maintain stable uptake efficiency for 70 days, and 0.9 g GCS can maintain stability for 45 days.
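
    The sampling rate reported above is what converts the mass accumulated on the sorbent into a time-averaged air concentration; the sketch below performs that standard back-calculation with illustrative numbers (the uptake mass and exact sampling rate are not values from the study):

        # Back-calculate a time-averaged gaseous Hg concentration from a passive
        # sampler; all numbers are illustrative, not results from the study.
        mass_collected_ng = 1.8    # Hg mass desorbed from the sorbent (ng)
        sampling_rate = 0.012      # m3 per g of sorbent per day (order of the >0.01 reported)
        sorbent_mass_g = 0.9
        deployment_days = 70

        effective_rate = sampling_rate * sorbent_mass_g   # m3/day
        air_volume = effective_rate * deployment_days     # m3 of air sampled
        concentration = mass_collected_ng / air_volume    # ng/m3
        print(f"time-averaged concentration ~ {concentration:.2f} ng/m3")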

  4. Evaluation and optimization of DNA extraction and purification procedures for soil and sediment samples.

    Science.gov (United States)

    Miller, D N; Bryant, J E; Madsen, E L; Ghiorse, W C

    1999-11-01

    We compared and statistically evaluated the effectiveness of nine DNA extraction procedures by using frozen and dried samples of two silt loam soils and a silt loam wetland sediment with different organic matter contents. The effects of different chemical extractants (sodium dodecyl sulfate [SDS], chloroform, phenol, Chelex 100, and guanidinium isothiocyanate), different physical disruption methods (bead mill homogenization and freeze-thaw lysis), and lysozyme digestion were evaluated based on the yield and molecular size of the recovered DNA. Pairwise comparisons of the nine extraction procedures revealed that bead mill homogenization with SDS combined with either chloroform or phenol optimized both the amount of DNA extracted and the molecular size of the DNA (maximum size, 16 to 20 kb). Neither lysozyme digestion before SDS treatment nor guanidine isothiocyanate treatment nor addition of Chelex 100 resin improved the DNA yields. Bead mill homogenization in a lysis mixture containing chloroform, SDS, NaCl, and phosphate-Tris buffer (pH 8) was found to be the best physical lysis technique when DNA yield and cell lysis efficiency were used as criteria. The bead mill homogenization conditions were also optimized for speed and duration with two different homogenizers. Recovery of high-molecular-weight DNA was greatest when we used lower speeds and shorter times (30 to 120 s). We evaluated four different DNA purification methods (silica-based DNA binding, agarose gel electrophoresis, ammonium acetate precipitation, and Sephadex G-200 gel filtration) for DNA recovery and removal of PCR inhibitors from crude extracts. Sephadex G-200 spin column purification was found to be the best method for removing PCR-inhibiting substances while minimizing DNA loss during purification. Our results indicate that for these types of samples, optimum DNA recovery requires brief, low-speed bead mill homogenization in the presence of a phosphate-buffered SDS-chloroform mixture, followed

  5. Further observations on comparison of immunization coverage by lot quality assurance sampling and 30 cluster sampling.

    Science.gov (United States)

    Singh, J; Jain, D C; Sharma, R S; Verghese, T

    1996-06-01

    Lot Quality Assurance Sampling (LQAS) and standard EPI methodology (30 cluster sampling) were used to evaluate immunization coverage in a Primary Health Center (PHC) where coverage levels were reported to be more than 85%. Of 27 sub-centers (lots) evaluated by LQAS, only 2 were accepted for child coverage, whereas none was accepted for tetanus toxoid (TT) coverage in mothers. LQAS data were combined to obtain an estimate of coverage in the entire population; 41% (95% CI 36-46) of infants were immunized appropriately for their ages, while 42% (95% CI 37-47) of their mothers had received a second/booster dose of TT. TT coverage in 149 contemporary mothers sampled in the EPI survey was also 42% (95% CI 31-52). Although results by the two sampling methods were consistent with each other, a big gap was evident between reported coverage (in children as well as mothers) and survey results. LQAS was found to be operationally feasible, but it cost 40% more and required 2.5 times more time than the EPI survey. LQAS, therefore, is not a good substitute for current EPI methodology to evaluate immunization coverage in a large administrative area. However, LQAS has potential as a method to monitor health programs on a routine basis in small population sub-units, especially in areas with high and heterogeneously distributed immunization coverage.
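
    LQAS accepts or rejects each lot (sub-center) against a pre-set decision rule, and the binomial distribution gives that rule's operating characteristics; the sketch below uses a hypothetical rule of 19 children per lot with at most 3 unimmunized allowed, which is illustrative and not necessarily the rule used in this study:

        from math import comb

        def accept_probability(n, d, coverage):
            """P(accept lot) when each sampled child is immunized with probability
            `coverage` and the rule accepts the lot if at most `d` of `n` are unimmunized."""
            p_fail = 1.0 - coverage
            return sum(comb(n, k) * p_fail**k * (1 - p_fail)**(n - k) for k in range(d + 1))

        n, d = 19, 3   # hypothetical LQAS decision rule
        for cov in (0.85, 0.70, 0.50, 0.40):
            print(f"true coverage {cov:.0%}: P(lot accepted) = {accept_probability(n, d, cov):.2f}")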

  6. Evaluation of sampling methods for toxicological testing of indoor air particulate matter.

    Science.gov (United States)

    Tirkkonen, Jenni; Täubel, Martin; Hirvonen, Maija-Riitta; Leppänen, Hanna; Lindsley, William G; Chen, Bean T; Hyvärinen, Anne; Huttunen, Kati

    2016-09-01

    There is a need for toxicity tests capable of recognizing indoor environments with compromised air quality, especially in the context of moisture damage. One of the key issues is sampling, which should both provide meaningful material for analyses and fulfill requirements imposed by practitioners using toxicity tests for health risk assessment. We aimed to evaluate different existing methods of sampling indoor particulate matter (PM) to develop a suitable sampling strategy for a toxicological assay. During three sampling campaigns in moisture-damaged and non-damaged school buildings, we evaluated one passive and three active sampling methods: the Settled Dust Box (SDB), the Button Aerosol Sampler, the Harvard Impactor and the National Institute for Occupational Safety and Health (NIOSH) Bioaerosol Cyclone Sampler. Mouse RAW264.7 macrophages were exposed to particle suspensions and cell metabolic activity (CMA), production of nitric oxide (NO) and tumor necrosis factor (TNFα) were determined after 24 h of exposure. The repeatability of the toxicological analyses was very good for all tested sampler types. Variability within the schools was found to be high especially between different classrooms in the moisture-damaged school. Passively collected settled dust and PM collected actively with the NIOSH Sampler (Stage 1) caused a clear response in exposed cells. The results suggested the higher relative immunotoxicological activity of dust from the moisture-damaged school. The NIOSH Sampler is a promising candidate for the collection of size-fractionated PM to be used in toxicity testing. The applicability of such sampling strategy in grading moisture damage severity in buildings needs to be developed further in a larger cohort of buildings.

  7. Characteristic Performance Evaluation of a new SAGe Well Detector for Small and Large Sample Geometries

    International Nuclear Information System (INIS)

    Adekola, A.S.; Colaresi, J.; Douwen, J.; Jaederstroem, H.; Mueller, W.F.; Yocum, K.M.; Carmichael, K.

    2015-01-01

    Environmental scientific research requires a detector that has sensitivity low enough to reveal the presence of any contaminant in the sample at a reasonable counting time. Canberra developed the germanium detector geometry called Small Anode Germanium (SAGe) Well detector, which is now available commercially. The SAGe Well detector is a new type of low capacitance germanium well detector manufactured using small anode technology capable of advancing many environmental scientific research applications. The performance of this detector has been evaluated for a range of sample sizes and geometries counted inside the well, and on the end cap of the detector. The detector has energy resolution performance similar to semi-planar detectors, and offers significant improvement over the existing coaxial and Well detectors. Energy resolution performance of 750 eV Full Width at Half Maximum (FWHM) at 122 keV γ-ray energy and resolution of 2.0 - 2.3 keV FWHM at 1332 keV γ-ray energy are guaranteed for detector volumes up to 425 cm³. The SAGe Well detector offers an optional 28 mm well diameter with the same energy resolution as the standard 16 mm well. Such outstanding resolution performance will benefit environmental applications in revealing the detailed radionuclide content of samples, particularly at low energy, and will enhance the detection sensitivity resulting in reduced counting time. The detector is compatible with electric coolers without any sacrifice in performance and supports the Canberra Mathematical efficiency calibration method (In situ Object Calibration Software or ISOCS, and Laboratory Source-less Calibration Software or LABSOCS). In addition, the SAGe Well detector supports true coincidence summing available in the ISOCS/LABSOCS framework. The improved resolution performance greatly enhances detection sensitivity of this new detector for a range of sample sizes and geometries counted inside the well. This results in lower minimum detectable

  8. Characteristic Performance Evaluation of a new SAGe Well Detector for Small and Large Sample Geometries

    Energy Technology Data Exchange (ETDEWEB)

    Adekola, A.S.; Colaresi, J.; Douwen, J.; Jaederstroem, H.; Mueller, W.F.; Yocum, K.M.; Carmichael, K. [Canberra Industries Inc., 800 Research Parkway, Meriden, CT 06450 (United States)

    2015-07-01

    Environmental scientific research requires a detector that has sensitivity low enough to reveal the presence of any contaminant in the sample at a reasonable counting time. Canberra developed the germanium detector geometry called Small Anode Germanium (SAGe) Well detector, which is now available commercially. The SAGe Well detector is a new type of low capacitance germanium well detector manufactured using small anode technology capable of advancing many environmental scientific research applications. The performance of this detector has been evaluated for a range of sample sizes and geometries counted inside the well, and on the end cap of the detector. The detector has energy resolution performance similar to semi-planar detectors, and offers significant improvement over the existing coaxial and Well detectors. Energy resolution performance of 750 eV Full Width at Half Maximum (FWHM) at 122 keV γ-ray energy and resolution of 2.0 - 2.3 keV FWHM at 1332 keV γ-ray energy are guaranteed for detector volumes up to 425 cm³. The SAGe Well detector offers an optional 28 mm well diameter with the same energy resolution as the standard 16 mm well. Such outstanding resolution performance will benefit environmental applications in revealing the detailed radionuclide content of samples, particularly at low energy, and will enhance the detection sensitivity resulting in reduced counting time. The detector is compatible with electric coolers without any sacrifice in performance and supports the Canberra Mathematical efficiency calibration method (In situ Object Calibration Software or ISOCS, and Laboratory Source-less Calibration Software or LABSOCS). In addition, the SAGe Well detector supports true coincidence summing available in the ISOCS/LABSOCS framework. The improved resolution performance greatly enhances detection sensitivity of this new detector for a range of sample sizes and geometries counted inside the well. This results in lower minimum detectable

  9. Assessment of real-time PCR method for detection of EGFR mutation using both supernatant and cell pellet of malignant pleural effusion samples from non-small-cell lung cancer patients.

    Science.gov (United States)

    Shin, Saeam; Kim, Juwon; Kim, Yoonjung; Cho, Sun-Mi; Lee, Kyung-A

    2017-10-26

    EGFR mutation is an emerging biomarker for treatment selection in non-small-cell lung cancer (NSCLC) patients. However, optimal mutation detection is hindered by complications associated with the biopsy procedure, tumor heterogeneity and limited sensitivity of test methodology. In this study, we evaluated the diagnostic utility of real-time PCR using malignant pleural effusion samples. A total of 77 pleural fluid samples from 77 NSCLC patients were tested using the cobas EGFR mutation test (Roche Molecular Systems). Pleural fluid was centrifuged, and separated cell pellets and supernatants were tested in parallel. Results were compared with Sanger sequencing and/or peptide nucleic acid (PNA)-mediated PCR clamping of matched tumor tissue or pleural fluid samples. All samples showed valid real-time PCR results in one or more DNA samples extracted from cell pellets and supernatants. Compared with other molecular methods, the sensitivity of real-time PCR method was 100%. Concordance rate of real-time PCR and Sanger sequencing plus PNA-mediated PCR clamping was 98.7%. We have confirmed that real-time PCR using pleural fluid had a high concordance rate compared to conventional methods, with no failed samples. Our data demonstrated that the parallel real-time PCR testing using supernatant and cell pellet could offer reliable and robust surrogate strategy when tissue is not available.

  10. Time-Efficiency of Sorting Chironomidae Surface-Floating Pupal Exuviae Samples from Urban Trout Streams in Northeast Minnesota, USA

    Directory of Open Access Journals (Sweden)

    Alyssa M Anderson

    2012-10-01

    Collections of Chironomidae surface-floating pupal exuviae (SFPE) provide an effective means of assessing water quality in streams. Although not widely used in the United States, the technique is not new and has been shown to be more cost-efficient than traditional dip-net sampling techniques in organically enriched streams in an urban landscape. The intent of this research was to document the efficiency of sorting SFPE samples relative to dip-net samples in trout streams with catchments varying in amount of urbanization and impervious surface. Samples of both SFPE and dip-net macroinvertebrates were collected from 17 sample sites located on 12 trout streams in Duluth, MN, USA. We quantified the time needed to sort subsamples of 100 macroinvertebrates from dip-net samples, and of fewer than or more than 100 chironomid exuviae from SFPE samples. For larger SFPE samples, the time required to subsample up to 300 exuviae was also recorded. The average time to sort subsamples of 100 specimens was 22.5 minutes for SFPE samples, compared to 32.7 minutes for 100 macroinvertebrates in dip-net samples. The average time to sort up to 300 exuviae was 37.7 minutes. These results indicate that sorting SFPE samples is more time-efficient than traditional dip-net techniques in trout streams with varying catchment characteristics. doi: 10.5324/fn.v31i0.1380. Published online: 17 October 2012.

  11. Multi-Factor Policy Evaluation and Selection in the One-Sample Situation

    NARCIS (Netherlands)

    C.M. Chen (Chien-Ming)

    2008-01-01

    Firms nowadays need to make decisions under rapid information obsolescence. In this paper I deal with one class of decision problems in this situation, called the “one-sample” problems: we have finitely many options and a single sample of the multiple criteria used to evaluate those options.

  12. Efficient Round-Trip Time Optimization for Replica-Exchange Enveloping Distribution Sampling (RE-EDS).

    Science.gov (United States)

    Sidler, Dominik; Cristòfol-Clough, Michael; Riniker, Sereina

    2017-06-13

    Replica-exchange enveloping distribution sampling (RE-EDS) allows the efficient estimation of free-energy differences between multiple end-states from a single molecular dynamics (MD) simulation. In EDS, a reference state is sampled, which can be tuned by two types of parameters, i.e., smoothness parameter(s) and energy offsets, such that all end-states are sufficiently sampled. However, the choice of these parameters is not trivial. Replica exchange (RE) or parallel tempering is a widely applied technique to enhance sampling. By combining EDS with the RE technique, the parameter choice problem could be simplified and the challenge shifted toward an optimal distribution of the replicas in the smoothness-parameter space. The choice of a certain replica distribution can alter the sampling efficiency significantly. In this work, global round-trip time optimization (GRTO) algorithms are tested for use in RE-EDS simulations. In addition, a local round-trip time optimization (LRTO) algorithm is proposed for systems with slowly adapting environments, where a reliable estimate for the round-trip time is challenging to obtain. The optimization algorithms were applied to RE-EDS simulations of a system of nine small-molecule inhibitors of phenylethanolamine N-methyltransferase (PNMT). The energy offsets were determined using our recently proposed parallel energy-offset (PEOE) estimation scheme. While the multistate GRTO algorithm yielded the best replica distribution for the ligands in water, the multistate LRTO algorithm was found to be the method of choice for the ligands in complex with PNMT. With this, the 36 alchemical free-energy differences between the nine ligands were calculated successfully from a single RE-EDS simulation 10 ns in length. Thus, RE-EDS presents an efficient method for the estimation of relative binding free energies.
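
    A round trip in replica exchange is one full traversal of a replica from the lowest to the highest smoothness-parameter level and back, and the round-trip time is the quantity the GRTO/LRTO schemes above aim to optimize; the sketch below merely counts round trips from a replica's level-index trajectory and is a generic illustration, not code from the RE-EDS implementation:

        def count_round_trips(index_trajectory, n_levels):
            """Count complete bottom->top->bottom traversals of one replica through
            the ladder of s-parameter levels (0 .. n_levels-1)."""
            trips = 0
            going_up = True
            for idx in index_trajectory:
                if going_up and idx == n_levels - 1:
                    going_up = False          # reached the top, now head back down
                elif not going_up and idx == 0:
                    trips += 1                # back at the bottom: one round trip done
                    going_up = True
            return trips

        # Toy trajectory of one replica over 16 exchange attempts on a 4-level ladder.
        traj = [0, 1, 2, 3, 2, 3, 2, 1, 0, 1, 2, 3, 3, 2, 1, 0]
        print(count_round_trips(traj, n_levels=4))   # -> 2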

  13. A Formal Approach to Run-Time Evaluation of Real-Time Behaviour in Distributed Process Control Systems

    DEFF Research Database (Denmark)

    Kristensen, C.H.

    This thesis advocates a formal approach to run-time evaluation of real-time behaviour in distributed process control systems, motivated by a growing interest in applying the increasingly popular formal methods in the application area of distributed process control systems. We propose to evaluate ... because the real-time aspects of distributed process control systems are considered to be among the hardest and most interesting to handle.

  14. Towards Just-In-Time Partial Evaluation of Prolog

    Science.gov (United States)

    Bolz, Carl Friedrich; Leuschel, Michael; Rigo, Armin

    We introduce a just-in-time specializer for Prolog. Just-in-time specialization attempts to unify the concepts and benefits of partial evaluation (PE) and just-in-time (JIT) compilation. It is a variant of PE that occurs purely at runtime, lazily generates residual code, and is constantly driven by runtime feedback.

  15. Evaluation of methods for cleaning low carbon uranium metal and alloy samples

    International Nuclear Information System (INIS)

    Kirchner, K.; Dixon, M.

    1979-01-01

    Several methods for cleaning uranium samples prior to carbon analysis, using a Leco Carbon Analyzer, were evaluated. Use of Oakite Aluminum NST Cleaner followed by a water and acetone rinse was found to be the best overall technique.

  16. Cast Stone Oxidation Front Evaluation: Preliminary Results For Samples Exposed To Moist Air

    International Nuclear Information System (INIS)

    Langton, C. A.; Almond, P. M.

    2013-01-01

    The Cr oxidation front (depth to which soluble Cr was detected) for the Cast Stone sample exposed for 68 days to ambient outdoor temperatures and humid air (total age of sample was 131 days) was determined to be about 35 mm below the top sample surface exposed. The Tc oxidation front, the depth at which Tc was insoluble, was not determined. Interpretation of the results indicates that the oxidation front is at least 38 mm below the exposed surface. The sample used for this measurement was exposed to ambient laboratory conditions and humid air for 50 days. The total age of the sample was 98 days. Technetium appears to be more easily oxidized than Cr in the Cast Stone matrix. The oxidized forms of Tc and Cr are soluble and therefore leachable. Longer exposure times are required for both the Cr and Tc spiked samples to better interpret the rate of oxidation. Tc spiked subsamples need to be taken further from the exposed surface to better define and interpret the leachable Tc profile. Finally, Tc(VII) reduction to Tc(IV) appears to occur relatively fast. Results demonstrated that about 95 percent of the Tc(VII) was reduced to Tc(IV) during the setting and very early stage setting for a Cast Stone sample cured 10 days. Additional testing at longer curing times is required to determine whether additional time is required to reduce 100% of the Tc(VII) in Cast Stone or whether the Tc loading exceeded the ability of the waste form to reduce 100% of the Tc(VII). Additional testing is required for samples cured for longer times. Depth discrete subsampling in a nitrogen glove box is also required to determine whether the 5 percent Tc extracted from the subsamples was the result of the sampling process which took place in air. Reduction capacity measurements (per the Angus-Glasser method) performed on depth discrete samples could not be correlated with the amount of chromium or technetium leached from the depth discrete subsamples or with the oxidation front inferred from soluble chromium and technetium (i

  17. Multidimensional scaling analysis of financial time series based on modified cross-sample entropy methods

    Science.gov (United States)

    He, Jiayi; Shang, Pengjian; Xiong, Hui

    2018-06-01

    Stocks, as a concrete manifestation of financial time series with plenty of potential information, are often used in the study of financial time series. In this paper, we utilize stock data to recognize patterns through the dissimilarity matrix based on modified cross-sample entropy, and then provide three-dimensional perceptual maps of the results through the multidimensional scaling method. Two modified multidimensional scaling methods are proposed in this paper: multidimensional scaling based on Kronecker-delta cross-sample entropy (MDS-KCSE) and multidimensional scaling based on permutation cross-sample entropy (MDS-PCSE). These two methods use Kronecker-delta-based cross-sample entropy and permutation-based cross-sample entropy to replace the distance or dissimilarity measurement in classical multidimensional scaling (MDS). Multidimensional scaling based on Chebyshev distance (MDSC) is employed to provide a reference for comparisons. Our analysis reveals clear clustering both in synthetic data and in 18 indices from diverse stock markets. It implies that time series generated by the same model are more likely to share similar irregularity than others, and that differences between stock indices, which arise from country or region and from different financial policies, are reflected in the irregularity of the data. In the synthetic data experiments, not only can time series generated by different models be distinguished, but series generated under different parameters of the same model can also be detected. In the financial data experiment, the stock indices are clearly divided into five groups. Through analysis, we find that they correspond to five regions, respectively: Europe, North America, South America, Asia-Pacific (with the exception of mainland China), and mainland China together with Russia. The results also demonstrate that MDS-KCSE and MDS-PCSE provide more effective divisions in experiments than MDSC.
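
    The pipeline described above (build a dissimilarity matrix from pairwise comparisons of the series, then embed it with multidimensional scaling) can be sketched as follows; here the Chebyshev distance, the paper's MDSC reference case, stands in for the modified cross-sample entropy measures, and the series are random toy data rather than stock indices:

        import numpy as np
        from scipy.spatial.distance import cdist
        from sklearn.manifold import MDS

        # Toy return series standing in for stock-index data (6 series, 500 points).
        rng = np.random.default_rng(0)
        series = rng.normal(size=(6, 500))

        # Dissimilarity matrix; Chebyshev distance is the paper's MDSC reference case,
        # and the modified cross-sample entropy measures would slot in here instead.
        dissimilarity = cdist(series, series, metric="chebyshev")

        # Embed the dissimilarities in three dimensions for a perceptual map.
        mds = MDS(n_components=3, dissimilarity="precomputed", random_state=0)
        coords = mds.fit_transform(dissimilarity)
        print(coords.shape)   # (6, 3)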

  18. Evaluation design of New York City's regulations on nutrition, physical activity, and screen time in early child care centers.

    Science.gov (United States)

    Breck, Andrew; Goodman, Ken; Dunn, Lillian; Stephens, Robert L; Dawkins, Nicola; Dixon, Beth; Jernigan, Jan; Kakietek, Jakub; Lesesne, Catherine; Lessard, Laura; Nonas, Cathy; O'Dell, Sarah Abood; Osuji, Thearis A; Bronson, Bernice; Xu, Ye; Kettel Khan, Laura

    2014-10-16

    This article describes the multi-method cross-sectional design used to evaluate New York City Department of Health and Mental Hygiene's regulations of nutrition, physical activity, and screen time for children aged 3 years or older in licensed group child care centers. The Center Evaluation Component collected data from a stratified random sample of 176 licensed group child care centers in New York City. Compliance with the regulations was measured through a review of center records, a facility inventory, and interviews of center directors, lead teachers, and food service staff. The Classroom Evaluation Component included an observational and biometric study of a sample of approximately 1,400 children aged 3 or 4 years attending 110 child care centers and was designed to complement the center component at the classroom and child level. The study methodology detailed in this paper may aid researchers in designing policy evaluation studies that can inform other jurisdictions considering similar policies.

  19. Modular time division multiplexer: Efficient simultaneous characterization of fast and slow transients in multiple samples

    Science.gov (United States)

    Kim, Stephan D.; Luo, Jiajun; Buchholz, D. Bruce; Chang, R. P. H.; Grayson, M.

    2016-09-01

    A modular time division multiplexer (MTDM) device is introduced to enable parallel measurement of multiple samples with both fast and slow decay transients spanning from millisecond to month-long time scales. This is achieved by dedicating a single high-speed measurement instrument for rapid data collection at the start of a transient, and by multiplexing a second low-speed measurement instrument for slow data collection of several samples in parallel for the later transients. The MTDM is a high-level design concept that can in principle measure an arbitrary number of samples, and the low cost implementation here allows up to 16 samples to be measured in parallel over several months, reducing the total ensemble measurement duration and equipment usage by as much as an order of magnitude without sacrificing fidelity. The MTDM was successfully demonstrated by simultaneously measuring the photoconductivity of three amorphous indium-gallium-zinc-oxide thin films with 20 ms data resolution for fast transients and an uninterrupted parallel run time of over 20 days. The MTDM has potential applications in many areas of research that manifest response times spanning many orders of magnitude, such as photovoltaics, rechargeable batteries, amorphous semiconductors such as silicon and amorphous indium-gallium-zinc-oxide.

  20. Large-scale machine learning and evaluation platform for real-time traffic surveillance

    Science.gov (United States)

    Eichel, Justin A.; Mishra, Akshaya; Miller, Nicholas; Jankovic, Nicholas; Thomas, Mohan A.; Abbott, Tyler; Swanson, Douglas; Keller, Joel

    2016-09-01

    In traffic engineering, vehicle detectors are trained on limited datasets, resulting in poor accuracy when deployed in real-world surveillance applications. Annotating large-scale high-quality datasets is challenging. Typically, these datasets have limited diversity; they do not reflect the real-world operating environment. There is a need for a large-scale, cloud-based positive and negative mining process and a large-scale learning and evaluation system for the application of automatic traffic measurements and classification. The proposed positive and negative mining process addresses the quality of crowd sourced ground truth data through machine learning review and human feedback mechanisms. The proposed learning and evaluation system uses a distributed cloud computing framework to handle data-scaling issues associated with large numbers of samples and a high-dimensional feature space. The system is trained using AdaBoost on 1,000,000 Haar-like features extracted from 70,000 annotated video frames. The trained real-time vehicle detector achieves an accuracy of at least 95% for 1/2 and about 78% for 19/20 of the time when tested on ~7,500,000 video frames. At the end of 2016, the dataset is expected to have over 1 billion annotated video frames.
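
    At toy scale, the training step described above (AdaBoost over a large pool of Haar-like features) looks as follows; random features stand in for the Haar responses, and the crowd-sourced mining and distributed cloud layers are outside the scope of this sketch:

        import numpy as np
        from sklearn.ensemble import AdaBoostClassifier

        # Random features standing in for Haar-like responses: rows = image windows.
        rng = np.random.default_rng(42)
        X = rng.normal(size=(2000, 200))
        y = (X[:, :5].sum(axis=1) > 0).astype(int)   # synthetic vehicle / non-vehicle labels

        # AdaBoost with its default weak learner (a depth-1 decision stump),
        # the classic pairing for Haar-feature detectors.
        clf = AdaBoostClassifier(n_estimators=100)
        clf.fit(X, y)
        print(f"training accuracy: {clf.score(X, y):.3f}")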

  1. Evaluation of skin moisturizer effects using terahertz time domain imaging

    Science.gov (United States)

    Martinez-Meza, L. H.; Rojas-Landeros, S. C.; Castro-Camus, E.; Alfaro-Gomez, M.

    2018-02-01

    We use terahertz time domain imaging for the evaluation of the effects of skin-moisturizers in vivo. We evaluate three principal substances used in commercial moisturizers: glycerin, hyaluronic acid and lanolin. We image the interaction of the forearm with each of the substances taking terahertz spectra at sequential times. With this, we are able to measure the effect of the substances on the hydration level of the skin in time, determining the feasibility of using THz imaging for the evaluation of the products and their effects on the hydration levels of the skin.

  2. Evaluating Complex Mixtures in the Zebrafish Embryo by Reconstituting Field Water Samples: A Metal Pollution Case Study

    Directory of Open Access Journals (Sweden)

    Ellen D. G. Michiels

    2017-03-01

    Accurately assessing the toxicity of complex, environmentally relevant mixtures remains an important challenge in ecotoxicology. The goal was to identify biological effects after exposure to environmental water samples and to determine whether the observed effects could be explained by the waterborne metal mixture found in the samples. Zebrafish embryos were exposed to water samples from five different sites originating from two Flemish (Mol and Olen, Belgium) metal-contaminated streams, “Scheppelijke Nete” (SN) and “Kneutersloop” (K), and a ditch (D), which is the contamination source of SN. Trace metal concentrations, and Na, K, Mg and Ca concentrations, were measured using ICP-MS and were used to reconstitute site-specific water samples. We assessed whether the effects that were observed after exposure to environmental samples could be explained by metal mixture toxicity under standardized laboratory conditions. Exposure to “D” or “reconstituted D” water caused 100% mortality. SN and reconstituted SN water caused similar effects on hatching, swim bladder inflation, growth and swimming activity. A canonical discriminant analysis confirmed a high similarity between both exposure scenarios, indicating that the observed toxicity was indeed primarily caused by metals. The applied workflow could be a valuable approach to evaluate mixture toxicity that limits time and costs while maintaining biological relevance.

  3. Iohexol plasma clearance measurement in older adults with chronic kidney disease-sampling time matters.

    Science.gov (United States)

    Ebert, Natalie; Loesment, Amina; Martus, Peter; Jakob, Olga; Gaedeke, Jens; Kuhlmann, Martin; Bartel, Jan; Schuchardt, Mirjam; Tölle, Markus; Huang, Tao; van der Giet, Markus; Schaeffner, Elke

    2015-08-01

    Accurate and precise measurement of GFR is important for patients with chronic kidney disease (CKD). The sampling time of exogenous filtration markers may have a great impact on measured GFR (mGFR) results, but there is still uncertainty about the optimal timing of plasma clearance measurement in patients with advanced CKD, for whom 24-h measurement is recommended. This satellite project of the Berlin Initiative Study evaluates whether 24-h iohexol plasma clearance reveals a clinically relevant difference compared with 5-h measurement in older adults. In 104 participants with a mean age of 79 years and diagnosed CKD, we performed standard GFR measurement over 5 h (mGFR300) using iohexol plasma concentrations at 120, 180, 240 and 300 min after injection. With an additional sample at 1440 min, we assessed 24-h GFR measurement (mGFR1440). The study design was cross-sectional. Calculation of mGFR was conducted with a one-compartment model using the Brochner-Mortensen equation to account for the fast component. mGFR values were compared with estimated GFR values (MDRD, CKD-EPI, BIS1, Revised Lund-Malmö and Cockcroft-Gault). In all 104 subjects, mGFR1440 was lower than mGFR300 (23 ± 8 versus 29 ± 9 mL/min/1.73 m(2), mean ± SD). Iohexol plasma clearance measured over only 5 h thus leads to a clinically relevant overestimation of GFR compared with 24-h measurement. In clinical care, this effect should be borne in mind, especially for patients with considerably reduced GFR levels. A new correction formula has been developed to predict mGFR1440 from mGFR300. For accurate GFR estimates in elderly CKD patients, we recommend the Revised Lund-Malmö equation. © The Author 2015. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
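
    For the one-compartment ("slope-intercept") calculation mentioned above, GFR follows from a mono-exponential fit to the late plasma samples plus a Brochner-Mortensen correction for the missed fast component; the dose and concentrations below are illustrative, and the quadratic coefficients shown are the commonly cited adult values rather than necessarily those used in this study:

        import numpy as np

        # Illustrative iohexol data, not patient data from the study.
        dose_mg = 3235.0                                   # iohexol dose (e.g. 5 mL of Omnipaque 300)
        times_min = np.array([120.0, 180.0, 240.0, 300.0])
        conc_mg_per_L = np.array([120.0, 95.0, 76.0, 61.0])

        # One-compartment ("slope-intercept") fit: ln C(t) = ln C0 - k*t
        slope, intercept = np.polyfit(times_min, np.log(conc_mg_per_L), 1)
        k = -slope                      # elimination rate constant (1/min)
        C0 = np.exp(intercept)          # back-extrapolated concentration at t = 0 (mg/L)
        auc = C0 / k                    # area under the mono-exponential curve (mg*min/L)
        gfr_ml_min = dose_mg / auc * 1000.0   # clearance = dose/AUC, in mL/min

        # Brochner-Mortensen correction for the missed fast (distribution) component;
        # 0.990778 and 0.001218 are the commonly cited adult coefficients.
        gfr_bm = 0.990778 * gfr_ml_min - 0.001218 * gfr_ml_min**2
        print(f"slope-intercept GFR ~ {gfr_ml_min:.1f} mL/min, corrected ~ {gfr_bm:.1f} mL/min")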

  4. Draft evaluation of the frequency for gas sampling for the high burnup confirmatory data project

    Energy Technology Data Exchange (ETDEWEB)

    Stockman, Christine T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Alsaed, Halim A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bryan, Charles R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-03-26

    This report fulfills the M3 milestone M3FT-15SN0802041, “Draft Evaluation of the Frequency for Gas Sampling for the High Burn-up Storage Demonstration Project” under Work Package FT-15SN080204, “ST Field Demonstration Support – SNL”. This report provides a technically based gas sampling frequency strategy for the High Burnup (HBU) Confirmatory Data Project. The evaluation of: 1) the types and magnitudes of gases that could be present in the project cask and, 2) the degradation mechanisms that could change gas compositions culminates in an adaptive gas sampling frequency strategy. This adaptive strategy is compared against the sampling frequency that has been developed based on operational considerations. Gas sampling will provide information on the presence of residual water (and byproducts associated with its reactions and decomposition) and breach of cladding, which could inform the decision of when to open the project cask.

  5. A time-sorting pitfall trap and temperature datalogger for the sampling of surface-active arthropods

    Directory of Open Access Journals (Sweden)

    Marshall S. McMunn

    2017-04-01

    Nearly all arthropods display consistent patterns of activity according to time of day. These patterns of activity often limit the extent of animal co-occurrence in space and time. Quantifying when particular species are active and how activity varies with environmental conditions is difficult without the use of automated devices due to the need for continuous monitoring. Time-sorting pitfall traps passively collect active arthropods into containers with known beginning and end sample times. The trap described here, similar to previous designs, sorts arthropods by the time they fall into the trap using a rotating circular rack of vials. This trap represents a reduction in size, cost, and time of construction, while increasing the number of time windows sampled. The addition of temperature data collection extends functionality, while the use of store-bought components and inclusion of customizable software make the trap easy to reproduce and use.

  6. Sampling Methodologies for Epidemiologic Surveillance of Men Who Have Sex with Men and Transgender Women in Latin America: An Empiric Comparison of Convenience Sampling, Time Space Sampling, and Respondent Driven Sampling

    Science.gov (United States)

    Clark, J. L.; Konda, K. A.; Silva-Santisteban, A.; Peinado, J.; Lama, J. R.; Kusunoki, L.; Perez-Brumer, A.; Pun, M.; Cabello, R.; Sebastian, J. L.; Suarez-Ognio, L.; Sanchez, J.

    2014-01-01

    Alternatives to convenience sampling (CS) are needed for HIV/STI surveillance of most-at-risk populations in Latin America. We compared CS, time space sampling (TSS), and respondent driven sampling (RDS) for recruitment of men who have sex with men (MSM) and transgender women (TW) in Lima, Peru. During concurrent 60-day periods from June–August, 2011, we recruited MSM/TW for epidemiologic surveillance using CS, TSS, and RDS. A total of 748 participants were recruited through CS, 233 through TSS, and 127 through RDS. The TSS sample included the largest proportion of TW (30.7 %) and the lowest percentage of subjects who had previously participated in HIV/STI research (14.9 %). The prevalence of newly diagnosed HIV infection, according to participants’ self-reported previous HIV diagnosis, was highest among TSS recruits (17.9 %) compared with RDS (12.6 %) and CS (10.2 %). TSS identified diverse populations of MSM/TW with higher prevalences of HIV/STIs not accessed by other methods. PMID:24362754

  7. Sampling methodologies for epidemiologic surveillance of men who have sex with men and transgender women in Latin America: an empiric comparison of convenience sampling, time space sampling, and respondent driven sampling.

    Science.gov (United States)

    Clark, J L; Konda, K A; Silva-Santisteban, A; Peinado, J; Lama, J R; Kusunoki, L; Perez-Brumer, A; Pun, M; Cabello, R; Sebastian, J L; Suarez-Ognio, L; Sanchez, J

    2014-12-01

    Alternatives to convenience sampling (CS) are needed for HIV/STI surveillance of most-at-risk populations in Latin America. We compared CS, time space sampling (TSS), and respondent driven sampling (RDS) for recruitment of men who have sex with men (MSM) and transgender women (TW) in Lima, Peru. During concurrent 60-day periods from June-August, 2011, we recruited MSM/TW for epidemiologic surveillance using CS, TSS, and RDS. A total of 748 participants were recruited through CS, 233 through TSS, and 127 through RDS. The TSS sample included the largest proportion of TW (30.7 %) and the lowest percentage of subjects who had previously participated in HIV/STI research (14.9 %). The prevalence of newly diagnosed HIV infection, according to participants' self-reported previous HIV diagnosis, was highest among TSS recruits (17.9 %) compared with RDS (12.6 %) and CS (10.2 %). TSS identified diverse populations of MSM/TW with higher prevalences of HIV/STIs not accessed by other methods.

  8. Sampling genetic diversity in the sympatrically and allopatrically speciating Midas cichlid species complex over a 16 year time series

    Directory of Open Access Journals (Sweden)

    Bunje Paul ME

    2007-02-01

    Background: Speciation often occurs in complex or uncertain temporal and spatial contexts. Processes such as reinforcement, allopatric divergence, and assortative mating can proceed at different rates and with different strengths as populations diverge. The Central American Midas cichlid fish species complex is an important case study for understanding the processes of speciation. Previous analyses have demonstrated that allopatric processes led to species formation among the lakes of Nicaragua as well as sympatric speciation that is occurring within at least one crater lake. However, since speciation is an ongoing process and sampling genetic diversity of such lineages can be biased by collection scheme or random factors, it is important to evaluate the robustness of conclusions drawn on individual time samples. Results: In order to assess the validity and reliability of inferences based on different genetic samples, we have analyzed fish from several lakes in Nicaragua sampled at three different times over 16 years. In addition, this time series allows us to analyze the population genetic changes that have occurred between lakes, where allopatric speciation has operated, as well as between different species within lakes, some of which have originated by sympatric speciation. Focusing on commonly used genetic markers, we have analyzed both DNA sequences from the complete mitochondrial control region as well as nuclear DNA variation at ten microsatellite loci from these populations, sampled thrice in a 16 year time period, to develop a robust estimate of the population genetic history of these diversifying lineages. Conclusion: The conclusions from previous work are well supported by our comprehensive analysis. In particular, we find that the genetic diversity of derived crater lake populations is lower than that of the source population regardless of when and how each population was sampled. Furthermore, changes in various estimates of

  9. Real-time PCR to supplement gold-standard culture-based detection of Legionella in environmental samples.

    Science.gov (United States)

    Collins, S; Jorgensen, F; Willis, C; Walker, J

    2015-10-01

    Culture remains the gold-standard for the enumeration of environmental Legionella. However, it has several drawbacks including long incubation and poor sensitivity, causing delays in response times to outbreaks of Legionnaires' disease. This study aimed to validate real-time PCR assays to quantify Legionella species (ssrA gene), Legionella pneumophila (mip gene) and Leg. pneumophila serogroup-1 (wzm gene) to support culture-based detection in a frontline public health laboratory. Each qPCR assay had 100% specificity, excellent sensitivity (5 GU/reaction) and reproducibility. Comparison of the assays to culture-based enumeration of Legionella from 200 environmental samples showed that they had a negative predictive value of 100%. Thirty eight samples were positive for Legionella species by culture and qPCR. One hundred samples were negative by both methods, whereas 62 samples were negative by culture but positive by qPCR. The average log10 increase between culture and qPCR for Legionella spp. and Leg. pneumophila was 0·72 (P = 0·0002) and 0·51 (P = 0·006), respectively. The qPCR assays can be conducted on the same 1 l water sample as culture thus can be used as a supplementary technique to screen out negative samples and allow more rapid indication of positive samples. The assay could prove informative in public health investigations to identify or rule out sources of Legionella as well as to specifically identify Leg. pneumophila serogroup 1 in a timely manner not possible with culture. © 2015 The Society for Applied Microbiology.

  10. Evaluation of the Frequency for Gas Sampling for the High Burnup Confirmatory Data Project

    Energy Technology Data Exchange (ETDEWEB)

    Stockman, Christine T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Alsaed, Halim A. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bryan, Charles R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Marschman, Steven C. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Scaglione, John M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-05-01

    This report provides a technically based gas sampling frequency strategy for the High Burnup (HBU) Confirmatory Data Project. The evaluation of: 1) the types and magnitudes of gases that could be present in the project cask and, 2) the degradation mechanisms that could change gas compositions culminates in an adaptive gas sampling frequency strategy. This adaptive strategy is compared against the sampling frequency that has been developed based on operational considerations.

  11. Sample size for morphological traits of pigeonpea

    Directory of Open Access Journals (Sweden)

    Giovani Facco

    2015-12-01

    The objectives of this study were to determine the sample size (i.e., number of plants) required to accurately estimate the average of morphological traits of pigeonpea (Cajanus cajan L.) and to check for variability in sample size between evaluation periods and seasons. Two uniformity trials (i.e., experiments without treatment) were conducted over two growing seasons. In the first season (2011/2012), the seeds were sown by broadcast seeding, and in the second season (2012/2013), the seeds were sown in rows spaced 0.50 m apart. The ground area in each experiment was 1,848 m², and 360 plants were marked in the central area, in a 2 m × 2 m grid. Three morphological traits (number of nodes, plant height and stem diameter) were evaluated 13 times during the first season and 22 times in the second season. Measurements for all three morphological traits were normally distributed, as confirmed through the Kolmogorov-Smirnov test. Randomness was confirmed using the run test, and descriptive statistics were calculated. For each trait, the sample size (n) was calculated for semi-amplitudes of the confidence interval (i.e., estimation errors) equal to 2, 4, 6, ..., 20% of the estimated mean with a confidence coefficient (1-α) of 95%. Subsequently, n was fixed at 360 plants, and the estimation error of the estimated percentage of the average for each trait was calculated. Variability of the sample size for the pigeonpea crop was observed between the morphological traits evaluated, among the evaluation periods and between seasons. Therefore, to achieve an accuracy of 6% of the estimated average, at least 136 plants must be evaluated throughout the pigeonpea crop cycle when determining the sample size for these traits (number of nodes, plant height and stem diameter) in the different evaluation periods and between seasons.
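
    The sample-size calculation described above (the number of plants needed so that the half-width of a 95% confidence interval equals a chosen percentage of the mean) follows the standard relation n = (t·CV/D)²; the coefficient of variation used below is illustrative, not one of the study's measured values:

        from scipy import stats

        def sample_size(cv_percent, error_percent, confidence=0.95, n_pilot=360):
            """Plants needed so the CI half-width equals `error_percent` of the mean,
            given a coefficient of variation `cv_percent` from a pilot of n_pilot plants."""
            t = stats.t.ppf(1 - (1 - confidence) / 2, df=n_pilot - 1)
            return (t * cv_percent / error_percent) ** 2

        # Illustrative CV of 35% for plant height; required n for several precisions.
        for err in (2, 6, 10, 20):
            print(f"error {err:>2}% of mean -> n ~ {sample_size(35, err):.0f} plants")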

  12. Analysis of volatile organic compounds in compost samples: A potential tool to determine appropriate composting time.

    Science.gov (United States)

    Zhu, Fengxiang; Pan, Zaifa; Hong, Chunlai; Wang, Weiping; Chen, Xiaoyang; Xue, Zhiyong; Yao, Yanlai

    2016-12-01

    Changes in volatile organic compound contents in compost samples during pig manure composting were studied using a headspace, solid-phase micro-extraction method (HS-SPME) followed by gas chromatography with mass spectrometric detection (GC/MS). Parameters affecting the SPME procedure were optimized as follows: the coating was carbon molecular sieve/polydimethylsiloxane (CAR/PDMS) fiber, the temperature was 60°C and the time was 30 min. Under these conditions, 87 compounds were identified from 17 composting samples. Most of the volatile components could only be detected before day 22. However, benzenes, alkanes and alkenes increased and eventually stabilized after day 22. Phenol and acid substances, which are important factors for compost quality, were almost undetectable on day 39 in natural compost (NC) samples and on day 13 in maggot-treated compost (MC) samples. Our results indicate that the approach can be effectively used to determine the composting times by analysis of volatile substances in compost samples. An appropriate composting time not only ensures the quality of compost and reduces the loss of composting material but also reduces the generation of hazardous substances. The appropriate composting times for MC and NC were approximately 22 days and 40 days, respectively, during the summer in Zhejiang. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Freeze core sampling to validate time-lapse resistivity monitoring of the hyporheic zone.

    Science.gov (United States)

    Toran, Laura; Hughes, Brian; Nyquist, Jonathan; Ryan, Robert

    2013-01-01

    A freeze core sampler was used to characterize hyporheic zone storage during a stream tracer test. The pore water from the frozen core showed tracer lingered in the hyporheic zone after the tracer had returned to background concentration in collocated well samples. These results confirmed evidence of lingering subsurface tracer seen in time-lapse electrical resistivity tomographs. The pore water exhibited brine exclusion (ion concentrations in ice lower than source water) in a sediment matrix, despite the fast freezing time. Although freeze core sampling provided qualitative evidence of lingering tracer, it proved difficult to quantify tracer concentration because the amount of brine exclusion during freezing could not be accurately determined. Nonetheless, the additional evidence for lingering tracer supports using time-lapse resistivity to detect regions of low fluid mobility within the hyporheic zone that can act as chemically reactive zones of importance in stream health. © 2012, The Author(s). GroundWater © 2012, National Ground Water Association.

  14. AN EVALUATION OF PRIMARY DATA-COLLECTION MODES IN AN ADDRESS-BASED SAMPLING DESIGN.

    Science.gov (United States)

    Amaya, Ashley; Leclere, Felicia; Carris, Kari; Liao, Youlian

    2015-01-01

    As address-based sampling becomes increasingly popular for multimode surveys, researchers continue to refine data-collection best practices. While much work has been conducted to improve efficiency within a given mode, additional research is needed on how multimode designs can be optimized across modes. Previous research has not evaluated the consequences of mode sequencing on multimode mail and phone surveys, nor has significant research been conducted to evaluate mode sequencing on a variety of indicators beyond response rates. We conducted an experiment within the Racial and Ethnic Approaches to Community Health across the U.S. Risk Factor Survey (REACH U.S.) to evaluate two multimode case-flow designs: (1) phone followed by mail (phone-first) and (2) mail followed by phone (mail-first). We compared response rates, cost, timeliness, and data quality to identify differences across case-flow design. Because surveys often differ on the rarity of the target population, we also examined whether changes in the eligibility rate altered the choice of optimal case flow. Our results suggested that, on most metrics, the mail-first design was superior to the phone-first design. Compared with phone-first, mail-first achieved a higher yield rate at a lower cost with equivalent data quality. While the phone-first design initially achieved more interviews compared to the mail-first design, over time the mail-first design surpassed it and obtained the greatest number of interviews.

  15. Evaluation of Multiple-Sampling Function used with a Microtek flatbed scanner for Radiation Dosimetry Calibration of EBT2 Film

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Liyun [Department of Medical Imaging and Radiological Sciences, I-Shou University, Kaohsiung 82445, Taiwan (China); Ho, Sheng-Yow [Department of Nursing, Chang Jung Christian University, Tainan 71101, Taiwan (China); Department of Radiation Oncology, Chi Mei Medical Center, Liouying, Tainan 73657, Taiwan (China); Ding, Hueisch-Jy [Department of Medical Imaging and Radiological Sciences, I-Shou University, Kaohsiung 82445, Taiwan (China); Hwang, Ing-Ming [Department of Medical Imaging and Radiology, Shu Zen College of Medicine and Management, Kaohsiung 82144, Taiwan (China); Chen, Pang-Yu, E-mail: pangyuchen@yahoo.com.tw [Department of Radiation Oncology, Sinlau Christian Hospital, Tainan 70142, Taiwan (China); Lee, Tsair-Fwu, E-mail: tflee@kuas.edu.tw [Medical Physics and Informatics Laboratory, Department of Electronics Engineering, National Kaohsiung University of Applied Sciences, Kaohsiung 80778, Taiwan (China)

    2016-10-01

    The radiochromic EBT2 film is a widely used quality assurance device for radiation therapy. This study evaluated the film calibration performance of the multiple-sampling function, a function of the ScanWizard Pro scanning software provided by the manufacturer, when used with the Microtek 9800XL plus (9800XL+) flatbed scanner. By using the PDD method, each one of the eight EBT2 films, four delivered by 290 monitor units (MU) and four by 88 MU via 6-MV photon beams, was tightly sandwiched in a 30³-cm³ water-equivalent polystyrene phantom prior to irradiation. Before and after irradiation, all films were scanned using the Microtek 9800XL+ scanner with five different modes of the multiple-sampling function, which generate the image as the averaged result of multiple sampling. The net optical densities (netOD) on the beam central axis of the film were assigned to corresponding depth doses for calibration. For each sampling mode with either delivered MU, the depth-dose uncertainty of a single film from repeated scans and that of a single scan of the four films were analyzed. Finally, the calibration error and the combined calibration uncertainty between film-determined depth doses and delivered depth doses were calculated and evaluated for each sampling mode. All standard deviations and the calibration error were demonstrated to be unrelated to the number of sampling lines. The calibration error of the 2-line and 16-line modes was within 3 cGy and better than that of the other modes. The combined uncertainty of the 2-line mode was the lowest, generally less than 6 cGy except for delivered doses around 100 cGy. The evaluation described herein revealed that EBT2 film calibrated with the 2-line mode has relatively lower error, scanning time and combined uncertainty. Therefore, it is recommended for routine EBT2 film calibration and verification of treatment plans.

  16. Evaluation of Multiple-Sampling Function used with a Microtek flatbed scanner for Radiation Dosimetry Calibration of EBT2 Film

    International Nuclear Information System (INIS)

    Chang, Liyun; Ho, Sheng-Yow; Ding, Hueisch-Jy; Hwang, Ing-Ming; Chen, Pang-Yu; Lee, Tsair-Fwu

    2016-01-01

    The radiochromic EBT2 film is a widely used quality assurance device for radiation therapy. This study evaluated the film calibration performance of the multiple-sampling function, a function of the ScanWizard Pro scanning software provided by the manufacturer, when used with the Microtek 9800XL plus (9800XL+) flatbed scanner. Using the PDD method, each of the eight EBT2 films, four irradiated with 290 monitor units (MU) and four with 88 MU via 6-MV photon beams, was tightly sandwiched in a 30³-cm³ water-equivalent polystyrene phantom prior to irradiation. Before and after irradiation, all films were scanned using the Microtek 9800XL+ scanner with five different modes of the multiple-sampling function, which generates an image from the averaged result of multiple sampling. The net optical densities (netOD) on the beam central axis of each film were assigned to the corresponding depth doses for calibration. For each sampling mode with either delivered MU, the depth-dose uncertainty of a single film from repeated scans and that of a single scan of the four films were analyzed. Finally, the calibration error and the combined calibration uncertainty between film-determined depth doses and delivered depth doses were calculated and evaluated for each sampling mode. All standard deviations and the calibration error were shown to be unrelated to the number of sampling lines. The calibration error of the 2-line and 16-line modes was within 3 cGy and better than that of the other modes. The combined uncertainty of the 2-line mode was the lowest, generally less than 6 cGy except for delivered doses around 100 cGy. The evaluation described herein revealed that EBT2 film calibrated with the 2-line mode has a relatively lower error, scanning time, and combined uncertainty. Therefore, it is recommended for routine EBT2 film calibration and verification of treatment plans.

  17. Evaluation of two real time PCR assays for the detection of bacterial DNA in amniotic fluid.

    Science.gov (United States)

    Girón de Velasco-Sada, Patricia; Falces-Romero, Iker; Quiles-Melero, Inmaculada; García-Perea, Adela; Mingorance, Jesús

    2018-01-01

    The aim of this study was to evaluate two non-commercial real-time PCR assays for the detection of microorganisms in amniotic fluid, followed by identification by pyrosequencing. We collected 126 amniotic fluid samples from 2010 to 2015 for the evaluation of the two assays for the detection of bacterial DNA (a universal 16S PCR and a Ureaplasma spp.-specific PCR). The method was developed in the Department of Microbiology of the University Hospital La Paz. Thirty-seven samples (29.3%) were positive by PCR/pyrosequencing and/or culture; 4 of them were mixed cultures with Ureaplasma urealyticum. The universal 16S real-time PCR was compared with the standard culture (81.8% sensitivity, 97.4% specificity, 75% positive predictive value, 98% negative predictive value). The Ureaplasma spp.-specific real-time PCR was compared with the Ureaplasma/Mycoplasma-specific culture (92.3% sensitivity, 89.4% specificity, 50% positive predictive value, 99% negative predictive value), with a statistically significant difference (p=0.005). The Ureaplasma spp. PCR shows a rapid response time (5 h from DNA extraction to pyrosequencing) compared with culture (48 h), so the response time of bacteriological diagnosis in suspected chorioamnionitis is reduced. Copyright © 2017 Elsevier B.V. All rights reserved.
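
    The sensitivity, specificity and predictive values quoted above follow from a 2x2 comparison of each PCR assay against its reference culture. A minimal sketch; the counts below are illustrative assumptions chosen only so that the 16S figures come out close to the quoted ones, since the underlying table is not given in the abstract.

    ```python
    def diagnostic_indices(tp, fp, fn, tn):
        """Sensitivity, specificity, PPV and NPV from a 2x2 table (index test vs. reference)."""
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        ppv = tp / (tp + fp)
        npv = tn / (tn + fn)
        return sensitivity, specificity, ppv, npv

    # Illustrative counts for a 126-sample comparison (assumed, not reported).
    sens, spec, ppv, npv = diagnostic_indices(tp=9, fp=3, fn=2, tn=112)
    print(f"sensitivity={sens:.1%} specificity={spec:.1%} PPV={ppv:.1%} NPV={npv:.1%}")
    ```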

  18. Hot sample archiving. Revision 3

    International Nuclear Information System (INIS)

    McVey, C.B.

    1995-01-01

    This engineering study revision evaluated the alternatives for providing tank waste characterization analytical samples for the time period recommended by the Tank Waste Remediation Systems Program: storing 40 ml segment samples for approximately 18 months (6 months past the approval date of the Tank Characterization Report) and then compositing the core segment material in 125 ml containers for a period of five years. The study considers storage at the 222-S facility. It was determined that the critical storage problem was in the hot cell area. The 40 ml sample container holds approximately 3 times the amount of material required for a complete laboratory re-analysis. The final result is that 222-S can meet the sample archive storage requirements. At a 100% capture rate the capacity of the hot cell area is exceeded, but quick, inexpensive options are available to meet the requirements.

  19. Evaluation of different enrichment methods for pathogenic Yersinia species detection by real time PCR

    Science.gov (United States)

    2014-01-01

    Background Yersiniosis is a zoonotic disease reported worldwide. Culture- and PCR-based protocols are the most commonly used methods for the detection of pathogenic Yersinia species in animal samples. PCR sensitivity can be increased by an initial enrichment step. This step is particularly useful in surveillance programs, where PCR is applied to samples from asymptomatic animals. The aim of this study was to evaluate the improvement in pathogenic Yersinia species detection using a suitable enrichment method prior to real-time PCR (rtPCR). Nine different enrichment protocols were evaluated, including six different broth media (CASO, ITC, PSB, PBS, PBSMSB and PBSSSB). Results The analysis of variance showed significant differences in Yersinia detection by rtPCR according to the enrichment protocol used. These differences were higher for Y. pseudotuberculosis than for Y. enterocolitica. In general, samples incubated at lower temperatures yielded the highest detection rates. The best results were obtained with PBSMSB and PBS2. Application of the PBSMSB protocol to free-ranging wild boar samples improved the detection of Y. enterocolitica by 21.2% when compared with direct rtPCR. Y. pseudotuberculosis detection was improved by 10.6% when results obtained by direct rtPCR and by PBSMSB enrichment before rtPCR were analyzed in combination. Conclusions The data obtained in the present study indicate a difference in Yersinia detection by rtPCR related to the enrichment protocol used, with PBSMSB enrichment for 15 days at 4°C and PBS for 7 days at 4°C being the most efficient. The use of direct rtPCR in combination with PBSMSB enrichment prior to rtPCR improved the detection rates of pathogenic Yersinia in wild boar and could be useful for application to other animal samples. PMID:25168886

  20. Evaluation of immunization coverage by lot quality assurance sampling compared with 30-cluster sampling in a primary health centre in India.

    OpenAIRE

    Singh, J.; Jain, D. C.; Sharma, R. S.; Verghese, T.

    1996-01-01

    The immunization coverage of infants, children and women residing in a primary health centre (PHC) area in Rajasthan was evaluated both by lot quality assurance sampling (LQAS) and by the 30-cluster sampling method recommended by WHO's Expanded Programme on Immunization (EPI). The LQAS survey was used to classify 27 mutually exclusive subunits of the population, defined as residents in health subcentre areas, on the basis of acceptable or unacceptable levels of immunization coverage among inf...

  1. A comprehensive analyzing and evaluating of the results of a wide scope comparison on the environmental level radioactive samples with γ spectrometer

    International Nuclear Information System (INIS)

    Su Qiong; Cheng Jianping; Wang Xuewu; Fan Jiajin; Chen Boxian

    2001-01-01

    A wide-scope comparison of environmental-level radioactive samples measured with γ spectrometers, carried out in 1998-1999, is introduced, and some original data from the comparison are presented. The comparison results have been comprehensively analyzed and evaluated. A new method for determining the comparison reference values, the Model Real Time Weight Average, is adopted; the method is detailed and compared with other models. Practice shows that the adopted Model Real Time Weight Average is feasible and successful.

  2. Comparison of the Abbott RealTime High Risk HPV test and the Roche cobas 4800 HPV test using urine samples.

    Science.gov (United States)

    Lim, Myong Cheol; Lee, Do-Hoon; Hwang, Sang-Hyun; Hwang, Na Rae; Lee, Bomyee; Shin, Hye Young; Jun, Jae Kwan; Yoo, Chong Woo; Lee, Dong Ock; Seo, Sang-Soo; Park, Sang-Yoon; Joo, Jungnam

    2017-05-01

    Human papillomavirus (HPV) testing based on cervical samples is important for use in cervical cancer screening. However, cervical sampling is invasive. Therefore, non-invasive methods for detecting HPV, such as urine samples, are needed. For HPV detection in urine samples, two real-time PCR (RQ-PCR) tests, Roche cobas 4800 test (Roche_HPV; Roche Molecular Diagnostics) and Abbott RealTime High Risk HPV test (Abbott_HPV; Abbott Laboratories) were compared to standard cervical samples. The performance of Roche_HPV and Abbott_HPV for HPV detection was evaluated at the National Cancer Center using 100 paired cervical and urine samples. The tests were also compared using urine samples stored at various temperatures and for a range of durations. The overall agreement between the Roche_HPV and Abbott_HPV tests using urine samples for any hrHPV type was substantial (86.0% with a kappa value of 0.7173), and that for HPV 16/18 was nearly perfect (99.0% with a kappa value of 0.9668). The relative sensitivities (based on cervical samples) for HPV 16/18 detection using Roche_HPV and Abbott_HPV with urine samples were 79.2% (95% CI; 57.9-92.9%) and 81.8% (95% CI; 59.7-94.8%), respectively. When the cut-off Ct value for Abbott_HPV was extended to 40 for urine samples, the relative sensitivity of Abbott_HPV increased to 91.7% from 81.8% for HPV16/18 detection and to 87.0% from 68.5% for other hrHPV detection. The specificity was not affected by the change in the Ct threshold. Roche_HPV and Abbott_HPV showed high concordance. However, HPV DNA detection using urine samples was inferior to HPV DNA detection using cervical samples. Interestingly, when the cut-off Ct value was set to 40, Abbott_HPV using urine samples showed high sensitivity and specificity, comparable to those obtained using cervical samples. Fully automated DNA extraction and detection systems, such as Roche_HPV and Abbott_HPV, could reduce the variability in HPV detection and accelerate the standardization of HPV

  3. Sampling a guide for internal auditors

    CERN Document Server

    Apostolou, Barbara

    2004-01-01

    While it is possible to examine 100 percent of an audit customer's data, the time and cost associated with such a study are often prohibitive. To obtain sufficient, reliable, and relevant information with a limited data set, sampling is an efficient and effective tool. It can help you evaluate the customer's assertions, as well as reach audit conclusions and provide reasonable assurance to your organization. This handbook will help you understand sampling. It also serves as a guide for auditors and students preparing for certification. Topics include: An overview of sampling. Statistical and nonstatistical sampling issues. Sampling selection methods and risks. The pros and cons of popular sampling plans.

  4. Evaluation of a Chlamydia trachomatis-specific, commercial, real-time PCR for use with ocular swabs.

    Science.gov (United States)

    Pickering, Harry; Holland, Martin J; Last, Anna R; Burton, Matthew J; Burr, Sarah E

    2018-02-20

    Trachoma, the leading infectious cause of blindness worldwide, is caused by conjunctival Chlamydia trachomatis infection. Trachoma is diagnosed clinically by observation of conjunctival inflammation and/or scarring; however, there is evidence that monitoring C. trachomatis infection may be required for elimination programmes. There are many commercial and 'in-house' nucleic acid amplification tests for the detection of C. trachomatis DNA, but the majority have not been validated for use with ocular swabs. This study evaluated a commercial assay, the Fast-Track Vaginal swab kit, using conjunctival samples from trachoma-endemic areas. An objective, biostatistical-based method for binary classification of continuous PCR data was developed, to limit potential user bias in diagnostic settings. The Fast-Track Vaginal swab assay was run on 210 ocular swab samples from Guinea-Bissau and Tanzania. Fit of individual amplification curves to exponential or sigmoid models, the derivative and second derivative of the curves, and the final fluorescence value were examined for utility in thresholding for determining positivity. The results from the Fast-Track Vaginal swab assay were evaluated against a commercial test (Amplicor CT/NG) and a non-commercial test (in-house droplet digital PCR), both of whose performance has previously been evaluated. Significant evidence of exponential amplification (R² > 0.99) and final fluorescence > 0.15 were combined for thresholding. This objective approach identified a population of positive samples; however, there was a subset of samples that amplified towards the end of the cycling protocol (at or later than 35 cycles), which were less clearly defined. The Fast-Track Vaginal swab assay showed good sensitivity against the commercial (95.71%) and non-commercial (97.18%) tests. Specificity was lower against both (90.00% and 96.55%, respectively). This study defined a simple, automated protocol for binary classification of continuous, real-time q
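
    The combined thresholding rule described above (good fit of an amplification model with R² > 0.99 plus a final fluorescence above 0.15) can be applied curve by curve. A minimal sketch assuming a sigmoid model fitted with SciPy; the function names, model choice and fitting details are assumptions, not the study's code.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def sigmoid(x, top, slope, midpoint, baseline):
        return baseline + top / (1.0 + np.exp(-slope * (x - midpoint)))

    def classify_curve(cycles, fluorescence, r2_min=0.99, final_min=0.15):
        """Call a curve positive when the sigmoid fit explains it well (R^2 > r2_min)
        and the end-point fluorescence exceeds final_min."""
        try:
            popt, _ = curve_fit(sigmoid, cycles, fluorescence,
                                p0=[fluorescence.max(), 1.0, np.median(cycles), fluorescence.min()],
                                maxfev=5000)
        except RuntimeError:
            return False
        residuals = fluorescence - sigmoid(cycles, *popt)
        r2 = 1.0 - np.sum(residuals ** 2) / np.sum((fluorescence - fluorescence.mean()) ** 2)
        return bool(r2 > r2_min and fluorescence[-1] > final_min)

    cycles = np.arange(1, 41, dtype=float)
    synthetic_positive = 0.02 + 0.5 / (1.0 + np.exp(-0.6 * (cycles - 28.0)))
    print(classify_curve(cycles, synthetic_positive))  # True for this synthetic curve
    ```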

  5. TMI-2 accident evaluation program sample acquisition and examination plan. Executive summary

    International Nuclear Information System (INIS)

    Russell, M.L.; McCardell, R.K.; Broughton, J.M.

    1985-12-01

    The purpose of the TMI-2 Accident Evaluation Program Sample Acquisition and Examination (TMI-2 AEP SA and E) program is to develop and implement a test and inspection plan that completes the current-condition characterization of (a) the TMI-2 equipment that may have been damaged by the core damage events and (b) the TMI-2 core fission product inventory. The characterization program includes both sample acquisitions and examinations and in-situ measurements. Fission product characterization involves locating the fission products as well as determining their chemical form and determining material association

  6. Evaluation of a single-tube fluorogenic RT-PCR assay for detection of bovine respiratory syncytial virus in clinical samples

    DEFF Research Database (Denmark)

    Hakhverdyan, Mikhayil; Hägglund, Sara; Larsen, Lars Erik

    2005-01-01

    understanding of the virus. In this study, a BRSV fluorogenic reverse transcription PCR (fRT-PCR) assay, based on TaqMan principle, was developed and evaluated on a large number of clinical samples, representing various cases of natural and experimental BRSV infections. By using a single-step closed-tube format......, the turn-around time was shortened drastically and results were obtained with minimal risk for cross-contamination. According to comparative analyses, the detection limit of the fRT-PCR was on the same level as that of a nested PCR and the sensitivity relatively higher than that of a conventional PCR......, antigen ELISA (Ag-ELISA) and virus isolation (VI). Interspersed negative control samples, samples from healthy animals and eight symptomatically or genetically related viruses were all negative, confirming a high specificity of the assay. Taken together, the data indicated that the fRT-PCR assay can...

  7. Measurement of glucose area under the curve using minimally invasive interstitial fluid extraction technology: evaluation of glucose monitoring concepts without blood sampling.

    Science.gov (United States)

    Sato, Toshiyuki; Okada, Seiki; Hagino, Kei; Asakura, Yoshihiro; Kikkawa, Yasuo; Kojima, Junko; Watanabe, Toshihiro; Maekawa, Yasunori; Isobe, Kazuki; Koike, Reona; Nakajima, Hiromu; Asano, Kaoru

    2011-12-01

    Monitoring postprandial hyperglycemia is crucial in treating diabetes, although its dynamics make accurate monitoring difficult. We developed a new technology for monitoring postprandial hyperglycemia using interstitial fluid (ISF) extraction technology without blood sampling. The glucose area under the curve (AUC) using this system was measured as accumulated ISF glucose (IG) with simultaneous calibration with sodium ions. The objective of this study was to evaluate this technological concept in healthy individuals. Minimally invasive ISF extraction technology (MIET) comprises two steps: pretreatment with microneedles and ISF accumulation over a specific time by contact with a solvent. The correlation between glucose and sodium ion levels using MIET was evaluated in 12 subjects with stable blood glucose (BG) levels during fasting. BG and IG time courses were evaluated in three subjects to confirm their relationship while BG was fluctuating. Furthermore, the accuracy of glucose AUC measurements by MIET was evaluated several hours after a meal in 30 subjects. A high correlation was observed between glucose and sodium ion levels when BG levels were stable (R=0.87), indicating that sodium ion is a good internal standard for calibration. The variation in IG and BG with MIET was similar, indicating that IG is an adequate substitute for BG. Finally, we showed a strong correlation (R=0.92) between IG-AUC and BG-AUC after a meal. These findings validate the adequacy of glucose AUC measurements using MIET. Monitoring glucose using MIET without blood sampling may be beneficial to patients with diabetes.
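
    A minimal sketch of the AUC comparison reported above: a reference BG-AUC is obtained from intermittently sampled blood glucose by the trapezoidal rule and then correlated, subject by subject, with the IG-AUC reported by the device. All numbers are illustrative assumptions, and the sodium-ion calibration step itself is not reproduced here.

    ```python
    import numpy as np

    def trapezoidal_auc(times_min, values):
        """Area under a glucose curve (mg/dL * min) by the trapezoidal rule."""
        return np.trapz(values, times_min)

    # One subject's post-meal reference profile (hypothetical values).
    t_bg = np.array([0, 30, 60, 90, 120, 180], dtype=float)      # minutes after the meal
    bg   = np.array([95, 150, 165, 140, 120, 100], dtype=float)  # mg/dL
    print("BG-AUC =", trapezoidal_auc(t_bg, bg))

    # Across subjects, the reported R is a Pearson correlation of per-subject
    # IG-AUC / BG-AUC pairs (again hypothetical numbers).
    ig_auc = np.array([11800, 14300, 9500, 16200, 12700], dtype=float)
    bg_auc = np.array([12100, 14800, 9900, 15600, 13200], dtype=float)
    print("R =", round(np.corrcoef(ig_auc, bg_auc)[0, 1], 3))
    ```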

  8. Radiation damage and life-time evaluation of RBMK graphite stack

    Energy Technology Data Exchange (ETDEWEB)

    Platonov, P A; Chugunov, O K; Manevsky, V N; Karpukhin, V I [Russian Research Centre Kurchatov Inst., Moscow (Russian Federation). Reactor Material Div.

    1996-08-01

    At the present time there are 11 NPP units with RBMK reactors in operation in Russia, with the oldest now in operation 22 years. Design life-time of the RBMK-1000 reactor is 30 years. This paper addresses the evaluation of RBMK graphite stack life-time. It is the practice in Russia to evaluate the reliability of the channel reactor graphite stack using at least three criteria: degradation of physical-mechanical properties of graphite, preservation of the graphite brick integrity, and degradation of the graphite stack as a structure. Stack life-time evaluation by different criteria indicates that the most realistic approach may be realized on the basis of the criteria of brick cracking and degradation of the graphite stack as a structure. The RBMK reactor graphite stack life-time depends on its temperature and for different units it may be different. (author). 2 refs, 10 figs.

  9. Accounting for Diversity in Suicide Research: Sampling and Sample Reporting Practices in the United States.

    Science.gov (United States)

    Cha, Christine B; Tezanos, Katherine M; Peros, Olivia M; Ng, Mei Yi; Ribeiro, Jessica D; Nock, Matthew K; Franklin, Joseph C

    2018-04-01

    Research on suicidal thoughts and behaviors (STB) has identified many risk factors, but whether these findings generalize to diverse populations remains unclear. We review longitudinal studies on STB risk factors over the past 50 years in the United States and evaluate the methodological practices of sampling and reporting sample characteristics. We found that articles frequently reported participant age and sex, less frequently reported participant race and ethnicity, and rarely reported participant veteran status or lesbian, gay, bisexual, and transgender status. Sample reporting practices modestly and inconsistently improved over time. Finally, articles predominantly featured White, non-Hispanic, young adult samples. © 2017 The American Association of Suicidology.

  10. Evaluating Site-Specific and Generic Spatial Models of Aboveground Forest Biomass Based on Landsat Time-Series and LiDAR Strip Samples in the Eastern USA

    Science.gov (United States)

    Ram Deo; Matthew Russell; Grant Domke; Hans-Erik Andersen; Warren Cohen; Christopher Woodall

    2017-01-01

    Large-area assessment of aboveground tree biomass (AGB) to inform regional or national forest monitoring programs can be efficiently carried out by combining remotely sensed data and field sample measurements through a generic statistical model, in contrast to site-specific models. We integrated forest inventory plot data with spatial predictors from Landsat time-...

  11. Structural evaluation for the core sampling trucks, RMCS operations, 200 Area

    International Nuclear Information System (INIS)

    Islam, M.A.

    1996-01-01

    This report evaluates the structural adequacy and integrity of the existing core sampling trucks to withstand impact should the trucks drop off the ramp, either onto soft ground or onto a non-yielding surface, due to operational error, wind, or earthquake. The report also addresses whether the allowable tank dome load would be exceeded by the addition of the impact load.

  12. SU-E-T-21: A Novel Sampling Algorithm to Reduce Intensity-Modulated Radiation Therapy (IMRT) Optimization Time

    International Nuclear Information System (INIS)

    Tiwari, P; Xie, Y; Chen, Y; Deasy, J

    2014-01-01

    Purpose: The IMRT optimization problem requires substantial computer time to find optimal dose distributions because of the large number of variables and constraints. Voxel sampling reduces the number of constraints and accelerates the optimization process, but usually deteriorates the quality of the dose distributions to the organs. We propose a novel sampling algorithm that accelerates the IMRT optimization process without significantly deteriorating the quality of the dose distribution. Methods: We included all boundary voxels, as well as a sampled fraction of interior voxels of organs, in the optimization. We selected the fraction of interior voxels using a clustering algorithm that creates clusters of voxels with similar influence-matrix signatures. A few voxels are selected from each cluster based on the preset sampling rate. Results: We ran sampling and no-sampling IMRT plans for de-identified head and neck treatment plans. Testing different sampling rates, we found that including 10% of inner voxels produced good dose distributions. For this optimal sampling rate, the algorithm accelerated IMRT optimization by a factor of 2-3, with a negligible loss of accuracy that was, on average, 0.3% for common dosimetric planning criteria. Conclusion: We demonstrated that a sampling scheme can be developed that reduces optimization time by more than a factor of 2 without significantly degrading the dose quality.
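
    A minimal sketch of the sampling idea summarized above: keep every boundary voxel and, for the interior voxels, cluster rows of the influence matrix and draw a preset fraction from each cluster. The use of k-means, the cluster count and the per-cluster selection rule are assumptions rather than the authors' implementation.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def sample_voxels(influence, is_boundary, rate=0.10, n_clusters=50, seed=0):
        """Return indices of voxels kept for optimization: all boundary voxels plus a
        'rate' fraction of interior voxels, sampled per cluster of similar influence rows."""
        rng = np.random.default_rng(seed)
        boundary_idx = np.flatnonzero(is_boundary)
        interior_idx = np.flatnonzero(~is_boundary)
        labels = KMeans(n_clusters=n_clusters, n_init=10,
                        random_state=seed).fit_predict(influence[interior_idx])
        kept = [boundary_idx]
        for c in range(n_clusters):
            members = interior_idx[labels == c]
            if members.size == 0:
                continue
            n_keep = max(1, int(round(rate * members.size)))
            kept.append(rng.choice(members, size=n_keep, replace=False))
        return np.concatenate(kept)

    rng = np.random.default_rng(0)
    influence = rng.random((2000, 30))           # hypothetical influence-matrix rows
    is_boundary = rng.random(2000) < 0.15        # hypothetical boundary flags
    selected = sample_voxels(influence, is_boundary, rate=0.10, n_clusters=20)
    print(selected.size, "of", influence.shape[0], "voxels kept")
    ```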

  13. Sparse-sampling with time-encoded (TICO) stimulated Raman scattering for fast image acquisition

    Science.gov (United States)

    Hakert, Hubertus; Eibl, Matthias; Karpf, Sebastian; Huber, Robert

    2017-07-01

    Modern biomedical imaging modalities aim to provide researchers with multimodal contrast for deeper insight into a specimen under investigation. A very promising technique is stimulated Raman scattering (SRS) microscopy, which can unveil the chemical composition of a sample with very high specificity. Although the signal intensities are enhanced manifold to achieve faster image acquisition compared to standard Raman microscopy, there is a trade-off between specificity and acquisition speed. Commonly used SRS concepts either probe only very few Raman transitions, because tuning of the applied laser sources is complicated, or record whole spectra with a spectrometer-based setup. While the first approach is fast, it reduces specificity; the spectrometer approach records whole spectra, including energy differences where no Raman information is present, which limits the acquisition speed. Therefore, we present a new approach based on the TICO-Raman concept, which we call sparse-sampling. The TICO-sparse-sampling setup is fully electronically controllable and allows probing of only the characteristic peaks of a Raman spectrum instead of always acquiring a whole spectrum. By reducing the spectral points to the relevant peaks, the acquisition time can be greatly reduced compared to a uniformly, equidistantly sampled Raman spectrum while the specificity and the signal to noise ratio (SNR) are maintained. Furthermore, all laser sources are completely fiber based. The synchronized detection enables a full resolution of the Raman signal, whereas the analogue and digital balancing allows shot-noise-limited detection. First imaging results with polystyrene (PS) and polymethylmethacrylate (PMMA) beads confirm the advantages of TICO sparse-sampling. We achieved a pixel dwell time as low as 35 μs for an image differentiating both species. The mechanical properties of the applied voice coil stage for scanning the sample currently limit even faster acquisition.

  14. Selective plane illumination microscopy (SPIM) with time-domain fluorescence lifetime imaging microscopy (FLIM) for volumetric measurement of cleared mouse brain samples

    Science.gov (United States)

    Funane, Tsukasa; Hou, Steven S.; Zoltowska, Katarzyna Marta; van Veluw, Susanne J.; Berezovska, Oksana; Kumar, Anand T. N.; Bacskai, Brian J.

    2018-05-01

    We have developed an imaging technique that combines selective plane illumination microscopy with time-domain fluorescence lifetime imaging microscopy (SPIM-FLIM) for three-dimensional volumetric imaging of cleared mouse brains with micro- to mesoscopic resolution. The main features of the microscope include a wavelength-adjustable, pulsed, near-infrared Ti:sapphire laser source, a BiBO frequency-doubling photonic crystal, a liquid chamber, an electrically focus-tunable lens, a cuvette-based sample holder, and an air (dry) objective lens. The performance of the system was evaluated with a lifetime reference dye and micro-bead phantom measurements. Intensity and lifetime maps of three-dimensional human embryonic kidney (HEK) cell culture samples and cleared mouse brain samples expressing green fluorescent protein (GFP) (donor only) and green and red fluorescent protein [positive Förster (fluorescence) resonance energy transfer] were acquired. The results show that the SPIM-FLIM system can be used for sample sizes ranging from single cells to whole mouse organs and can serve as a powerful tool for medical and biological research.

  15. Are marketed topical metronidazole creams bioequivalent? Evaluation by in vivo microdialysis sampling and tape stripping methodology

    DEFF Research Database (Denmark)

    Garcia Ortiz, Patricia Elodia; Hansen, S H; Shah, Surendra P.

    2011-01-01

    To evaluate the bioequivalence of 3 marketed topical metronidazole formulations by simultaneous dermal microdialysis and stratum corneum sampling by the tape stripping methodology, and to compare the techniques as tools for the determination of bioequivalence.

  16. Dual Source Time-of-flight Mass Spectrometer and Sample Handling System

    Science.gov (United States)

    Brinckerhoff, W.; Mahaffy, P.; Cornish, T.; Cheng, A.; Gorevan, S.; Niemann, H.; Harpold, D.; Rafeek, S.; Yucht, D.

    We present details of an instrument under development for potential NASA missions to planets and small bodies. The instrument comprises a dual ionization source (laser and electron impact) time-of-flight mass spectrometer (TOF-MS) and a carousel sample handling system for in situ analysis of solid materials acquired by, e.g., a coring drill. This DSTOF instrument could be deployed on a fixed lander or a rover, and has an open design that would accommodate measurements by additional instruments. The sample handling system (SHS) is based on a multi-well carousel, originally designed for Champollion/DS4. Solid samples, in the form of drill cores or as loose chips or fines, are inserted through an access port, sealed in vacuum, and transported around the carousel to a pyrolysis cell and/or directly to the TOF-MS inlet. Samples at the TOF-MS inlet are xy-addressable for laser or optical microprobe. Cups may be ejected from their holders for analyzing multiple samples or caching them for return. Samples are analyzed with laser desorption and evolved-gas/electron-impact sources. The dual ion source permits studies of elemental, isotopic, and molecular composition of unprepared samples with a single mass spectrometer. Pulsed laser desorption permits the measurement of abundance and isotope ratios of refractory elements, as well as the detection of high-mass organic molecules in solid samples. Evolved gas analysis permits similar measurements of the more volatile species in solids and aerosols. The TOF-MS is based on previous miniature prototypes at JHU/APL that feature high sensitivity and a wide mass range. The laser mode, in which the sample cup is directly below the TOF-MS inlet, permits both ablation and desorption measurements, to cover elemental and molecular species, respectively. In the evolved gas mode, sample cups are raised into a small pyrolysis cell and heated, producing a neutral gas that is electron ionized and pulsed into the TOF-MS. (Any imaging

  17. Recommendations for Pathologic Evaluation of Reduction Mammoplasty Specimens: A Prospective Study With Systematic Tissue Sampling.

    Science.gov (United States)

    Ambaye, Abiy B; Goodwin, Andrew J; MacLennan, Susan E; Naud, Shelly; Weaver, Donald L

    2017-11-01

    Breast reduction mammaplasty (RMP) for symptomatic macromastia or correction of asymmetry is performed in more than 100 000 patients per year in the United States. The reported incidence of significant pathologic findings (SPF), that is, carcinoma and atypical hyperplasia, ranges from 0.06% to 12.8%. No standard pathology assessment for RMP exists. To propose standard sampling for microscopic evaluation in RMP specimens, to evaluate the incidence of occult carcinoma and atypical hyperplasia, and to identify clinical risk factors for SPF in patients undergoing RMP. All RMP specimens from 2006 to 2013 at a single institution were prospectively examined. After baseline gross and microscopic evaluations, each specimen was subjected to systematic additional sampling. The incidence of SPF was tabulated, and variables such as age, specimen weight, previous history of SPF, and results of preoperative mammogram were examined. Clinical follow-up review was also subsequently undertaken. A total of 595 patients were evaluated. Significant pathologic findings were present in 9.8% (58 of 595) of patients. No cancer was identified in patients younger than 40 years; the rates of carcinoma were 2.4% (14 of 595) in all patients, 3.6% (14 of 392) in patients aged 40 years or older, and 4.3% (10 of 233) in patients aged 50 years or older. No carcinoma or atypical hyperplasia was identified on preoperative mammogram. Increased sampling was associated with a significantly greater frequency of SPF only in patients aged 40 years or older. In patients younger than 35 years, gross-only evaluation is sufficient. However, increased sampling may be necessary in patients older than 40 years.

  18. Method validation and uncertainty evaluation of organically bound tritium analysis in environmental sample.

    Science.gov (United States)

    Huang, Yan-Jun; Zeng, Fan; Zhang, Bing; Chen, Chao-Feng; Qin, Hong-Juan; Wu, Lian-Sheng; Guo, Gui-Yin; Yang, Li-Tao; Shang-Guan, Zhi-Hong

    2014-08-01

    The analytical method for organically bound tritium (OBT) was developed in our laboratory. The optimized operating conditions and parameters were established for sample drying, special combustion, distillation, and measurement on a liquid scintillation spectrometer (LSC). Selected types of OBT samples, such as rice, corn, rapeseed, fresh lettuce and pork, were analyzed to validate the reproducibility of the recovery rate and the minimum detection concentration, and the uncertainty for a typical low-level environmental sample was evaluated. The combustion water recovery rate of the different dried environmental samples was kept at about 80%, and the minimum detection concentration of OBT ranged from 0.61 to 0.89 Bq/kg (dry weight), depending on the hydrogen content. The results showed that this method is suitable for OBT analysis of environmental samples, with a stable recovery rate, and that the combustion water yield of a sample weighing about 40 g provides a sufficient quantity for measurement on the LSC. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Soil sampling strategies: Evaluation of different approaches

    Energy Technology Data Exchange (ETDEWEB)

    De Zorzi, Paolo [Agenzia per la Protezione dell' Ambiente e per i Servizi Tecnici (APAT), Servizio Metrologia Ambientale, Via di Castel Romano, 100-00128 Roma (Italy)], E-mail: paolo.dezorzi@apat.it; Barbizzi, Sabrina; Belli, Maria [Agenzia per la Protezione dell' Ambiente e per i Servizi Tecnici (APAT), Servizio Metrologia Ambientale, Via di Castel Romano, 100-00128 Roma (Italy); Mufato, Renzo; Sartori, Giuseppe; Stocchero, Giulia [Agenzia Regionale per la Prevenzione e Protezione dell' Ambiente del Veneto, ARPA Veneto, U.O. Centro Qualita Dati, Via Spalato, 14-36045 Vicenza (Italy)

    2008-11-15

    The National Environmental Protection Agency of Italy (APAT) performed a soil sampling intercomparison, inviting 14 regional agencies to test their own soil sampling strategies. The intercomparison was carried out at a reference site, previously characterised for metal mass fraction distribution. A wide range of sampling strategies, in terms of sampling patterns, type and number of samples collected, were used to assess the mean mass fraction values of some selected elements. The different strategies led in general to acceptable bias values (D) less than 2{sigma}, calculated according to ISO 13258. Sampling on arable land was relatively easy, with comparable results between different sampling strategies.

  20. Soil sampling strategies: Evaluation of different approaches

    International Nuclear Information System (INIS)

    De Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria; Mufato, Renzo; Sartori, Giuseppe; Stocchero, Giulia

    2008-01-01

    The National Environmental Protection Agency of Italy (APAT) performed a soil sampling intercomparison, inviting 14 regional agencies to test their own soil sampling strategies. The intercomparison was carried out at a reference site, previously characterised for metal mass fraction distribution. A wide range of sampling strategies, in terms of sampling patterns, type and number of samples collected, were used to assess the mean mass fraction values of some selected elements. The different strategies led in general to acceptable bias values (D) less than 2σ, calculated according to ISO 13258. Sampling on arable land was relatively easy, with comparable results between different sampling strategies

  1. Soil sampling strategies: evaluation of different approaches.

    Science.gov (United States)

    de Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria; Mufato, Renzo; Sartori, Giuseppe; Stocchero, Giulia

    2008-11-01

    The National Environmental Protection Agency of Italy (APAT) performed a soil sampling intercomparison, inviting 14 regional agencies to test their own soil sampling strategies. The intercomparison was carried out at a reference site, previously characterised for metal mass fraction distribution. A wide range of sampling strategies, in terms of sampling patterns, type and number of samples collected, were used to assess the mean mass fraction values of some selected elements. The different strategies led in general to acceptable bias values (D) less than 2sigma, calculated according to ISO 13258. Sampling on arable land was relatively easy, with comparable results between different sampling strategies.
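
    A minimal sketch of the bias check cited in these records: the deviation D of a strategy's mean from the site reference value is compared against twice a target standard deviation. The |D| < 2*sigma acceptance rule is used here as a stand-in for the ISO 13258 calculation, whose exact formulation is not reproduced, and all numbers are illustrative.

    ```python
    import numpy as np

    def bias_check(lab_values, reference_value, sigma):
        """Bias D of a sampling strategy's mean against the reference value,
        accepted when |D| < 2*sigma (stand-in for the ISO criterion)."""
        d = float(np.mean(lab_values)) - reference_value
        return d, abs(d) < 2.0 * sigma

    # Hypothetical Cu mass fractions (mg/kg) from one agency's samples.
    d, acceptable = bias_check([42.1, 39.8, 41.5, 40.7], reference_value=40.0, sigma=1.5)
    print(f"D = {d:.2f} mg/kg, acceptable: {acceptable}")
    ```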

  2. A comparison of four porewater sampling methods for metal mixtures and dissolved organic carbon and the implications for sediment toxicity evaluations.

    Science.gov (United States)

    Cleveland, Danielle; Brumbaugh, William G; MacDonald, Donald D

    2017-11-01

    Evaluations of sediment quality conditions are commonly conducted using whole-sediment chemistry analyses but can be enhanced by evaluating multiple lines of evidence, including measures of the bioavailable forms of contaminants. In particular, porewater chemistry data provide information that is directly relevant for interpreting sediment toxicity data. Various methods for sampling porewater for trace metals and dissolved organic carbon (DOC), which is an important moderator of metal bioavailability, have been employed. The present study compares the peeper, push point, centrifugation, and diffusive gradients in thin films (DGT) methods for the quantification of 6 metals and DOC. The methods were evaluated at low and high concentrations of metals in 3 sediments having different concentrations of total organic carbon and acid volatile sulfide and different particle-size distributions. At low metal concentrations, centrifugation and push point sampling resulted in up to 100 times higher concentrations of metals and DOC in porewater compared with peepers and DGTs. At elevated metal levels, the measured concentrations were in better agreement among the 4 sampling techniques. The results indicate that there can be marked differences among operationally different porewater sampling methods, and it is unclear if there is a definitive best method for sampling metals and DOC in porewater. Environ Toxicol Chem 2017;36:2906-2915. Published 2017 Wiley Periodicals Inc. on behalf of SETAC. This article is a US government work and, as such, is in the public domain in the United States of America.

  3. Autoregressive Prediction with Rolling Mechanism for Time Series Forecasting with Small Sample Size

    Directory of Open Access Journals (Sweden)

    Zhihua Wang

    2014-01-01

    Full Text Available Reasonable prediction makes significant practical sense for the analysis of stochastic and unstable time series with a small or limited sample size. Motivated by the rolling idea in grey theory and the practical relevance of very short-term forecasting or 1-step-ahead prediction, a novel autoregressive (AR) prediction approach with a rolling mechanism is proposed. In the modeling procedure, a newly developed AR equation, which can be used to model nonstationary time series, is constructed in each prediction step. Meanwhile, the data window for the next one-step-ahead forecast rolls forward by adding the most recently derived prediction result while deleting the first value of the previously used sample data set. This rolling mechanism is efficient because of its improved forecasting accuracy, its applicability to limited and unstable data situations, and its small computational effort. The general performance, the influence of sample size, the nonlinear dynamic mechanism, and the significance of the observed trends, as well as the innovation variance, are illustrated and verified with Monte Carlo simulations. The proposed methodology is then applied to several practical data sets, including multiple building settlement sequences and two economic series.
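
    A minimal sketch of the rolling mechanism described above: an AR model is refitted at every step, the one-step-ahead prediction is appended to the data window, and the oldest value is dropped. The least-squares fitting, the AR order and the example series are assumptions, not the paper's implementation.

    ```python
    import numpy as np

    def fit_ar(window, p):
        """Least-squares AR(p) coefficients (intercept first) for one data window."""
        X = np.column_stack([window[i:len(window) - p + i] for i in range(p)])
        X = np.column_stack([np.ones(len(X)), X])
        coef, *_ = np.linalg.lstsq(X, window[p:], rcond=None)
        return coef

    def rolling_ar_forecast(history, steps, p=2):
        """One-step-ahead forecasts with a rolling window: append each new
        prediction and delete the first value of the previously used window."""
        window = np.asarray(history, dtype=float).copy()
        predictions = []
        for _ in range(steps):
            coef = fit_ar(window, p)
            next_val = coef[0] + coef[1:] @ window[-p:]
            predictions.append(next_val)
            window = np.append(window[1:], next_val)   # roll the window forward
        return np.array(predictions)

    settlement_mm = [2.1, 2.5, 2.8, 3.0, 3.3, 3.5, 3.6, 3.8]   # hypothetical series
    print(rolling_ar_forecast(settlement_mm, steps=3))
    ```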

  4. Quantum supremacy in constant-time measurement-based computation: A unified architecture for sampling and verification

    Science.gov (United States)

    Miller, Jacob; Sanders, Stephen; Miyake, Akimasa

    2017-12-01

    While quantum speed-up in solving certain decision problems by a fault-tolerant universal quantum computer has been promised, a timely research interest includes how far one can reduce the resource requirement to demonstrate a provable advantage in quantum devices without demanding quantum error correction, which is crucial for prolonging the coherence time of qubits. We propose a model device made of locally interacting multiple qubits, designed such that simultaneous single-qubit measurements on it can output probability distributions whose average-case sampling is classically intractable, under similar assumptions as the sampling of noninteracting bosons and instantaneous quantum circuits. Notably, in contrast to these previous unitary-based realizations, our measurement-based implementation has two distinctive features. (i) Our implementation involves no adaptation of measurement bases, leading output probability distributions to be generated in constant time, independent of the system size. Thus, it could be implemented in principle without quantum error correction. (ii) Verifying the classical intractability of our sampling is done by changing the Pauli measurement bases only at certain output qubits. Our usage of random commuting quantum circuits in place of computationally universal circuits allows a unique unification of sampling and verification, so they require the same physical resource requirements in contrast to the more demanding verification protocols seen elsewhere in the literature.

  5. Dependability of Data Derived from Time Sampling Methods with Multiple Observation Targets

    Science.gov (United States)

    Johnson, Austin H.; Chafouleas, Sandra M.; Briesch, Amy M.

    2017-01-01

    In this study, generalizability theory was used to examine the extent to which (a) time-sampling methodology, (b) number of simultaneous behavior targets, and (c) individual raters influenced variance in ratings of academic engagement for an elementary-aged student. Ten graduate-student raters, with an average of 7.20 hr of previous training in…

  6. A simple method to adapt time sampling of the analog signal

    International Nuclear Information System (INIS)

    Kalinin, Yu.G.; Martyanov, I.S.; Sadykov, Kh.; Zastrozhnova, N.N.

    2004-01-01

    In this paper we briefly describe a time sampling method that is adapted to the speed of the signal change. In principle, the method is based on a simple idea: the combination of discrete integration with differentiation of the analog signal. This method can be used in nuclear electronics research into the characteristics of detectors and the shape of the pulse signal, and into the pulse and transient characteristics of inertial signal-processing systems.

  7. Technology evaluation for time sensitive data transport

    DEFF Research Database (Denmark)

    Wessing, Henrik; Breach, Tony; Colmenero, Alberto

    The NREN communities must provide underlying network infrastructures and transport technologies to facilitate services with such requirements to the network. In this paper we investigate and evaluate circuit and packet based transport technologies from classic best effort IP over MPLS flavours, Provider Backbone Bridging (PBB), “Transparent Interconnect of Lots of Links” (TRILL) to Optical Transport Network (OTN) and SDH. The transport technologies are evaluated theoretically, using simulations and/or experimentally. Each transport technology is evaluated based on its performances and capabilities...... overhead and restoration time. Thirdly, complexity and automation possibilities for establishment of paths for high demanding applications, and finally how the technologies are backed by research communities and major vendors like Ciena, Alcatel-Lucent, Nokia-Siemens and Huawei. The technologies

  8. Improving clinical laboratory efficiency: a time-motion evaluation of the Abbott m2000 RealTime and Roche COBAS AmpliPrep/COBAS TaqMan PCR systems for the simultaneous quantitation of HIV-1 RNA and HCV RNA.

    Science.gov (United States)

    Amendola, Alessandra; Coen, Sabrina; Belladonna, Stefano; Pulvirenti, F Renato; Clemens, John M; Capobianchi, M Rosaria

    2011-08-01

    Diagnostic laboratories need automation that facilitates efficient processing and workflow management to meet today's challenges for expanding services and reducing cost, yet maintaining the highest levels of quality. Processing efficiency of two commercially available automated systems for quantifying HIV-1 and HCV RNA, Abbott m2000 system and Roche COBAS Ampliprep/COBAS TaqMan 96 (docked) systems (CAP/CTM), was evaluated in a mid/high throughput workflow laboratory using a representative daily workload of 24 HCV and 72 HIV samples. Three test scenarios were evaluated: A) one run with four batches on the CAP/CTM system, B) two runs on the Abbott m2000 and C) one run using the Abbott m2000 maxCycle feature (maxCycle) for co-processing these assays. Cycle times for processing, throughput and hands-on time were evaluated. Overall processing cycle time was 10.3, 9.1 and 7.6 h for Scenarios A), B) and C), respectively. Total hands-on time for each scenario was, in order, 100.0 (A), 90.3 (B) and 61.4 min (C). The interface of an automated analyzer to the laboratory workflow, notably system set up for samples and reagents and clean up functions, are as important as the automation capability of the analyzer for the overall impact to processing efficiency and operator hands-on time.

  9. A method for ion distribution function evaluation using escaping neutral atom kinetic energy samples

    International Nuclear Information System (INIS)

    Goncharov, P.R.; Ozaki, T.; Veshchev, E.A.; Sudo, S.

    2008-01-01

    A reliable method to evaluate the probability density function of escaping-atom kinetic energies is required for the analysis of neutral particle diagnostic data used to study the fast ion distribution function in fusion plasmas. Digital processing of solid state detector signals is proposed in this paper as an improvement over the simple histogram approach. The probability density function for kinetic energies of neutral particles escaping from the plasma has been derived in a general form, taking into account the plasma ion energy distribution, electron capture and loss rates, superposition along the diagnostic sight line, and the magnetic surface geometry. A pseudorandom number generator has been realized that enables a sample of escaping neutral particle energies to be simulated for given plasma parameters and experimental conditions. An empirical probability density estimation code has been developed and tested to reconstruct the probability density function from simulated samples, assuming Maxwellian and classical slowing-down plasma ion energy distribution shapes for different temperatures and different slowing-down times. The application of the developed probability density estimation code to the analysis of experimental data obtained by the novel Angular-Resolved Multi-Sightline Neutral Particle Analyzer has been studied to obtain the suprathermal particle distributions. The optimum bandwidth parameter selection algorithm has also been realized. (author)
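
    A minimal sketch of the empirical density-estimation step: a kernel estimate of the energy probability density is built from a simulated sample of escaping-particle energies. Gaussian kernels with Silverman's rule stand in for the paper's estimator and its optimum-bandwidth selection algorithm, and the gamma-distributed sample is purely illustrative.

    ```python
    import numpy as np

    def kde_gaussian(samples, grid, bandwidth=None):
        """Kernel density estimate on 'grid' from a 1-D energy sample."""
        samples = np.asarray(samples, dtype=float)
        if bandwidth is None:  # Silverman's rule of thumb
            bandwidth = 1.06 * samples.std(ddof=1) * samples.size ** (-1.0 / 5.0)
        z = (grid[:, None] - samples[None, :]) / bandwidth
        return np.exp(-0.5 * z ** 2).sum(axis=1) / (samples.size * bandwidth * np.sqrt(2.0 * np.pi))

    rng = np.random.default_rng(1)
    energies_kev = rng.gamma(shape=2.0, scale=5.0, size=2000)  # stand-in for simulated NPA energies
    grid = np.linspace(0.0, 60.0, 200)
    density = kde_gaussian(energies_kev, grid)
    print(round(float(np.trapz(density, grid)), 3))   # close to 1: the estimate integrates to unity
    ```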

  10. [Design of standard voice sample text for subjective auditory perceptual evaluation of voice disorders].

    Science.gov (United States)

    Li, Jin-rang; Sun, Yan-yan; Xu, Wen

    2010-09-01

    To design a speech voice sample text containing all phonemes in Mandarin for the subjective auditory perceptual evaluation of voice disorders. The design principles were that the short text should include the 21 initials and 39 finals, so that it covers all the phonemes in Mandarin, and that it should be meaningful. A short text of 155 Chinese words was produced; it included 21 initials and 38 finals (the final ê was not included because it is rarely used in Mandarin) and also covered 17 light tones and one "Erhua". The constituent ratios of the initials and finals presented in this short text were statistically similar to those in Mandarin according to the method of similarity of the sample and population (r = 0.742, P […] text were statistically not similar to those in Mandarin (r = 0.731, P > 0.05). A speech voice sample text with all phonemes in Mandarin was produced. The constituent ratios of the initials and finals presented in this short text are similar to those in Mandarin. Its value for the subjective auditory perceptual evaluation of voice disorders needs further study.

  11. Rigid Body Sampling and Individual Time Stepping for Rigid-Fluid Coupling of Fluid Simulation

    Directory of Open Access Journals (Sweden)

    Xiaokun Wang

    2017-01-01

    Full Text Available In this paper, we propose an efficient and simple rigid-fluid coupling scheme with scientific programming algorithms for particle-based fluid simulation and three-dimensional visualization. Our approach samples the surface of rigid bodies with boundary particles that interact with fluids. It contains two procedures, surface sampling and sampling relaxation, which ensure a uniform distribution of particles with fewer iterations. Furthermore, we present a rigid-fluid coupling scheme that integrates individual time stepping into the rigid-fluid coupling, which gains an obvious speedup compared to the previous method. The experimental results demonstrate the effectiveness of our approach.

  12. Evaluation of the iPLEX(®) Sample ID Plus Panel designed for the Sequenom MassARRAY(®) system. A SNP typing assay developed for human identification and sample tracking based on the SNPforID panel

    DEFF Research Database (Denmark)

    Johansen, P; Andersen, J D; Børsting, Claus

    2013-01-01

    on the peak height and the signal to noise data exported from the TYPER 4.0 software. With the forensic analysis parameters, all inconsistencies were eliminated in reactions with ≥10ng DNA. However, the average call rate decreased to 69.9%. The iPLEX(®) Sample ID Plus Panel was tested on 10 degraded samples......Sequenom launched the first commercial SNP typing kit for human identification, named the iPLEX(®) Sample ID Plus Panel. The kit amplifies 47 of the 52 SNPs in the SNPforID panel, amelogenin and two Y-chromosome SNPs in one multiplex PCR. The SNPs were analyzed by single base extension (SBE......) and Matrix Assisted Laser Desorption/Ionization-Time of Flight Mass Spectrometry (MALDI-TOF MS). In this study, we evaluated the accuracy and sensitivity of the iPLEX(®) Sample ID Plus Panel by comparing the typing results of the iPLEX(®) Sample ID Plus Panel with those obtained with our ISO 17025 accredited...

  13. Sample Entropy-Based Approach to Evaluate the Stability of Double-Wire Pulsed MIG Welding

    Directory of Open Access Journals (Sweden)

    Ping Yao

    2014-01-01

    Full Text Available Based on sample entropy, this paper presents a quantitative method to evaluate the current stability in double-wire pulsed MIG welding. Firstly, the sample entropy of current signals with different stability but the same parameters is calculated. The results show that the more stable the current, the smaller the value and the standard deviation of the sample entropy. Secondly, four parameters, namely pulse width, peak current, base current, and frequency, are selected for a four-level, three-factor orthogonal experiment. The calculation and analysis of the desired signals indicate that sample entropy values are affected by the welding current parameters. Then, a quantitative method based on sample entropy is proposed. The experimental results show that the method can preferably quantify the welding current stability.
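
    A minimal sketch of the sample entropy calculation underlying the stability measure: count template matches of length m and m+1 within a tolerance r and take the negative logarithm of their ratio; a more stable (more regular) current signal yields a smaller value. The tolerance rule (r = 0.2 times the standard deviation) and the synthetic signals are assumptions, since the study's exact parameters are not given here.

    ```python
    import numpy as np

    def sample_entropy(signal, m=2, r_factor=0.2):
        """SampEn(m, r) of a 1-D signal with tolerance r = r_factor * std."""
        x = np.asarray(signal, dtype=float)
        r = r_factor * x.std()

        def count_matches(dim):
            templates = np.array([x[i:i + dim] for i in range(len(x) - dim)])
            count = 0
            for i in range(len(templates)):
                dist = np.max(np.abs(templates - templates[i]), axis=1)  # Chebyshev distance
                count += np.sum(dist <= r) - 1                           # exclude the self-match
            return count

        b = count_matches(m)
        a = count_matches(m + 1)
        return -np.log(a / b) if a > 0 and b > 0 else np.inf

    rng = np.random.default_rng(0)
    stable = np.sin(np.linspace(0.0, 20.0 * np.pi, 1000))
    noisy = stable + 0.5 * rng.standard_normal(1000)
    print(sample_entropy(stable), sample_entropy(noisy))  # the noisier signal scores higher
    ```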

  14. Computer graphics for quality control in the INAA of geological samples

    International Nuclear Information System (INIS)

    Grossman, J.N.; Baedecker, P.A.

    1987-01-01

    A data reduction system for the routine instrumental activation analysis of samples is described, with particular emphasis on interactive graphics capabilities for evaluating analytical quality. Graphics procedures were developed to interactively control the analysis of selected photopeaks during spectral analysis, and to evaluate detector performance during a given counting cycle. Graphics algorithms are also used to compare the data on reference samples with accepted values, to prepare quality control charts to evaluate long term precision and to search for systematic variations in data on reference samples as a function of time. (author)

  15. Evaluation of environmental samples containing heavy hydrocarbon components in environmental forensic investigations

    Energy Technology Data Exchange (ETDEWEB)

    Raia, J.C.; Blakley, C.R.; Fuex, A.N.; Villalanti, D.C.; Fahrenthold, P.D. [Triton Anal Corp, Houston, TX (United States)

    2004-03-01

    This article presents a procedure to evaluate and characterize environmental samples containing mixtures of hydrocarbons over a wide boiling range of materials that include fuels and other products used in commerce. The range of the method extends to the higher boiling and heavier molecular weight hydrocarbon products in the range of motor oil, bunker fuel, and heavier residue materials. The procedure uses the analytical laboratory technique of high-temperature simulated distillation along with mathematical regression of the analytical data to estimate the relative contribution of individual products in mixtures of hydrocarbons present in environmental samples. An analytical technique to determine hydrocarbon-type distributions by gas chromatography-mass spectrometry with nitric oxide ionization spectrometry evaluation is also presented. This type of analysis allows complex hydrocarbon mixtures to be classified by their chemical composition, or types of hydrocarbons that include paraffins, cycloparaffins, monoaromatics, and polycyclic aromatic hydrocarbons. Characteristic hydrocarbon patterns for example, in the relative distribution of polycyclic aromatic hydrocarbons are valuable for determining the potential origin of materials present in environmental samples. These methods provide quantitative data for hydrocarbon components in mixtures as a function of boiling range and 'hydrocarbon fingerprints' of the types of materials present. This information is valuable in assessing environmental impacts of hydrocarbons at contaminated sites and establishing the liabilities and cost allocations for responsible parties.
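
    A minimal sketch of the regression step described above: the sample's boiling-range (simulated distillation) profile is modeled as a non-negative combination of reference product profiles, and the fitted weights estimate each product's relative contribution. The non-negative least-squares formulation and all numbers are illustrative assumptions, not data from the article.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Fraction of material recovered in each temperature cut for three hypothetical
    # reference products (columns) and for an environmental sample.
    diesel    = np.array([0.05, 0.45, 0.40, 0.10, 0.00])
    motor_oil = np.array([0.00, 0.05, 0.25, 0.50, 0.20])
    bunker    = np.array([0.00, 0.02, 0.10, 0.38, 0.50])
    reference = np.column_stack([diesel, motor_oil, bunker])
    sample    = np.array([0.02, 0.20, 0.28, 0.30, 0.20])

    weights, residual = nnls(reference, sample)        # non-negative least squares
    weights = weights / weights.sum()                  # normalize to relative contributions
    print(dict(zip(["diesel", "motor_oil", "bunker"], np.round(weights, 2))), round(residual, 3))
    ```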

  16. Neural correlates of time versus money in product evaluation

    Directory of Open Access Journals (Sweden)

    Sebastian eLehmann

    2012-10-01

    Full Text Available The common saying "time is money" reflects the widespread belief in many people’s everyday life that time is valuable like money. Psychologically and neurophysiologically, however, these concepts seem to be quite different. This research replicates prior behavioral investigations by showing that merely mentioning time (compared to merely mentioning money) leads participants to evaluate a product more positively. Beyond this finding, the present functional magnetic resonance imaging (fMRI) experiment provides novel insight into the neurophysiological underpinnings of this behavioral effect by showing that more positive product evaluations in the time primes (compared to money primes) are preceded by increased activation in the insula. Our data, therefore, support the idea of a time mindset that is different from a money mindset. Studies on the functional neuroanatomy of the insula have implicated this brain area in distinct but related psychological phenomena such as urging, addiction, loss aversion, and love. These functions imply greater personal connection between the consumer and a target subject or object and, thus, help explain why time-primed consumers rate products more positively.

  17. Dynamic telecytologic evaluation of imprint cytology samples from CT-guided lung biopsies: A feasibility study

    Energy Technology Data Exchange (ETDEWEB)

    Prosch, Helmut [Otto Wagner Hospital, Department of Radiology, Vienna (Austria); Medical University of Vienna, Department of Radiology, Vienna (Austria); Medical University of Vienna, Vienna General Hospital, Department of Radiology, Vienna (Austria); Hoffmann, Elisabeth; Schober, Ewald; Mostbeck, Gerhard [Otto Wagner Hospital, Department of Radiology, Vienna (Austria); Bernhardt, Klaus; Schalleschak, Johann [Otto Wagner Hospital, Department of Laboratory Medicine, Vienna (Austria); Rowhani, Marcel [Otto Wagner Hospital, Department of Respiratory and Critical Care Medicine, Vienna (Austria); Weber, Michael [Medical University of Vienna, Department of Radiology, Vienna (Austria)

    2011-09-15

    This study assessed the feasibility of telecytological evaluation of samples from CT-guided lung biopsies using a dynamic telecytological system in which the microscope was operated by personnel from the radiology department at the site of the biopsy and a cytologist off-site diagnosed the biopsy sample. A total of 45 imprint samples from CT-guided biopsies of lung lesions were reviewed by two cytologists using a telecytological microscope (Olympus BX51, Tokyo, Japan). The telecytological microscope was operated by one radiologist and one radiology technician. The cytological samples were classified by a cytologist into four categories: benign, malignant, atypical cells of undetermined significance, and non-diagnostic. The results were compared with those of a previous consensus reading of two independent cytologists (gold standard). When the radiologist was operating the microscope, the diagnostic accuracy was 100%, as both cytologists came to the correct diagnosis in all samples. When the technician operated the microscope, two diagnoses of cytologist 1 differed from the gold standard. Thus, the accuracy for the technician was 95.56%. Telecytological evaluation of imprint samples from CT-guided lung biopsies is feasible, as it can be performed with high diagnostic accuracy when personnel from the radiology department operate the microscope. (orig.)

  18. Development of a real-time multiplex PCR assay for the detection of multiple Salmonella serotypes in chicken samples

    Directory of Open Access Journals (Sweden)

    Whyte Paul

    2008-09-01

    Full Text Available Abstract Background A real-time multiplex PCR assay was developed for the detection of multiple Salmonella serotypes in chicken samples. Poultry-associated serotypes detected in the assay include Enteritidis, Gallinarum, Typhimurium, Kentucky and Dublin. The traditional cultural method according to EN ISO 6579:2002 for the detection of Salmonella in food was performed in parallel. The real-time PCR-based method comprised a pre-enrichment step in Buffered Peptone Water (BPW) overnight, followed by a shortened selective enrichment in Rappaport-Vassiliadis Soya Broth (RVS) for 6 hours and subsequent DNA extraction. Results The real-time multiplex PCR assay and traditional cultural method showed 100% inclusivity and 100% exclusivity on all strains tested. The real-time multiplex PCR assay was as sensitive as the traditional cultural method in detecting Salmonella in artificially contaminated chicken samples and correctly identified the serotype. Artificially contaminated chicken samples resulted in a detection limit of between 1 and 10 CFU per 25 g sample for both methods. A total of sixty-three naturally contaminated chicken samples were investigated by both methods, and the relative accuracy, relative sensitivity and relative specificity of the real-time PCR method were determined to be 89, 94 and 87%, respectively. Thirty blind-tested cultures were correctly identified by the real-time multiplex PCR method. Conclusion Real-time PCR methodology can contribute to meeting the need for rapid identification and detection methods in food testing laboratories.
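
    As a minimal sketch of the comparison reported above (not the authors' code), the snippet below computes relative accuracy, sensitivity, and specificity of the real-time PCR result against the ISO 6579 culture result from paired outcomes on the same samples; the paired results shown are illustrative only.

```python
# Sketch: relative accuracy/sensitivity/specificity of a candidate test
# against a reference method, from paired results. Counts are invented.
def relative_performance(pcr_results, culture_results):
    """Each argument is a list of booleans (True = Salmonella detected)."""
    pairs = list(zip(pcr_results, culture_results))
    tp = sum(p and c for p, c in pairs)
    tn = sum((not p) and (not c) for p, c in pairs)
    fp = sum(p and (not c) for p, c in pairs)
    fn = sum((not p) and c for p, c in pairs)
    return {
        "relative accuracy": (tp + tn) / len(pairs),
        "relative sensitivity": tp / (tp + fn) if (tp + fn) else float("nan"),
        "relative specificity": tn / (tn + fp) if (tn + fp) else float("nan"),
    }

pcr     = [True, True, False, True, False, False, True, False]
culture = [True, True, False, False, False, False, True, True]
print(relative_performance(pcr, culture))
```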

  19. Laser-induced breakdown spectroscopy for the real-time analysis of mixed waste samples containing Sr

    International Nuclear Information System (INIS)

    Barefield, J.E. II; Koskelo, A.C.; Multari, R.A.; Cremers, D.A.; Gamble, T.K.; Han, C.Y.

    1995-01-01

    In this report, the use of laser-induced breakdown spectroscopy to analyze mixed waste samples containing Sr is discussed. The mixed waste samples investigated include vitrified waste glass and contaminated soil. Compared to traditional analysis techniques, the laser-based method is fast (i.e., analysis times on the order of minutes) and essentially waste free, since little or no sample preparation is required. Detection limits on the order of ppm Sr were determined. Detection limits obtained using a fiber optic cable to deliver laser pulses to soil samples containing Cr, Zr, Pb, Be, Cu, and Ni will also be discussed.

  20. International study to evaluate PCR methods for detection of Trypanosoma cruzi DNA in blood samples from Chagas disease patients.

    Directory of Open Access Journals (Sweden)

    Alejandro G Schijman

    Full Text Available BACKGROUND: A century after its discovery, Chagas disease still represents a major neglected tropical threat. Accurate diagnostic tools as well as surrogate markers of parasitological response to treatment are research priorities in the field. The purpose of this study was to evaluate the performance of PCR methods in detection of Trypanosoma cruzi DNA by an external quality evaluation. METHODOLOGY/FINDINGS: An international collaborative study was launched by expert PCR laboratories from 16 countries. Currently used strategies were challenged against serial dilutions of purified DNA from stocks representing T. cruzi discrete typing units (DTUs) I, IV and VI (set A), human blood spiked with parasite cells (set B) and Guanidine Hydrochloride-EDTA blood samples from 32 seropositive and 10 seronegative patients from Southern Cone countries (set C). Forty-eight PCR tests were reported for set A and 44 for sets B and C; 28 targeted minicircle DNA (kDNA), 13 satellite DNA (Sat-DNA) and the remainder low copy number sequences. In set A, commercial master mixes and Sat-DNA real-time PCR showed better specificity, but kDNA-PCR was more sensitive to detect DTU I DNA. In set B, commercial DNA extraction kits presented better specificity than solvent extraction protocols. Sat-DNA PCR tests had higher specificity, with sensitivities of 0.05-0.5 parasites/mL, whereas specific kDNA tests detected 5×10⁻³ par/mL. Sixteen specific and coherent methods had good performance in both sets A and B (10 fg/µl of DNA from all stocks, 5 par/mL spiked blood). The median values of sensitivities, specificities and accuracies obtained in testing the Set C samples with the 16 tests determined to be good performing by analyzing Sets A and B samples varied considerably. Out of them, four methods depicted the best performing parameters in all three sets of samples, detecting at least 10 fg/µl for each DNA stock, 0.5 par/mL and a sensitivity between 83.3-94.4%, specificity of 85

  1. Performance Evaluation and Market Timing: the Skill Index

    Directory of Open Access Journals (Sweden)

    Ney Roberto Otoni de Brito

    2003-01-01

    Full Text Available MERTON (1981) examines the creation of value by fund managers selecting between stocks and fixed income instruments through market timing. HENRIKSON and MERTON (1981) proceed to propose empirical tests of fund and manager performance in market timing. BRITO, BONA and TACIRO (2003) generalize the results of MERTON (1981) and HENRIKSON and MERTON (1981) for actively managed funds with a clearly defined benchmark portfolio. In the generalized context of active portfolio management, this paper proposes a new index – the Skill Index of Brito (SIB) – to measure the performance and efficiency in market timing of actively managed funds. The paper proceeds to test the performance and skill of hedge funds in Brazil using the SIB. A representative sample of 32 hedge funds with a window of 90 trading days as of October 31, 1999 was obtained. The empirical tests of performance and skill use the interbank borrowing and lending rate as the passive benchmark. The results indicate significance at the 5% level of the SIB for ten hedge funds in the sample; among them, seven funds also showed significance at the 1% level. In sum, the results indicate that the majority of hedge funds showed no significant market-timing skill in the Brazilian market over the period examined.
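
    The record cites the Henriksson and Merton framework but does not reproduce the SIB formula, so the sketch below only illustrates the classical Henriksson-Merton timing regression on simulated returns: the fund's excess return is regressed on the benchmark excess return and an option-like term max(0, -benchmark excess return), with a positive coefficient on the latter taken as evidence of timing skill. It is not the Skill Index of Brito, and all numbers are simulated.

```python
# Hedged sketch of a Henriksson-Merton market-timing regression on
# simulated data; this is NOT the SIB index defined in the paper.
import numpy as np

rng = np.random.default_rng(0)
n = 250                                        # daily observations
benchmark_excess = rng.normal(0.0, 0.01, n)    # benchmark return minus risk-free rate
true_gamma = 0.3                               # timing ability built into the simulation
fund_excess = (0.0002 + 0.8 * benchmark_excess
               + true_gamma * np.maximum(0.0, -benchmark_excess)
               + rng.normal(0.0, 0.005, n))

# Design matrix: intercept, benchmark excess return, timing option term.
X = np.column_stack([np.ones(n),
                     benchmark_excess,
                     np.maximum(0.0, -benchmark_excess)])
coef, *_ = np.linalg.lstsq(X, fund_excess, rcond=None)
alpha, beta, gamma = coef
print(f"alpha={alpha:.5f}  beta={beta:.3f}  timing gamma={gamma:.3f}")
# A significantly positive gamma is the usual evidence of market-timing skill.
```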

  2. Evaluating ethanol-based sample preservation to facilitate use of DNA barcoding in routine freshwater biomonitoring programs using benthic macroinvertebrates.

    Directory of Open Access Journals (Sweden)

    Eric D Stein

    Full Text Available Molecular methods, such as DNA barcoding, have the potential to enhance biomonitoring programs worldwide. Altering routinely used sample preservation methods to protect DNA from degradation may pose a potential impediment to the application of DNA barcoding and metagenomics for biomonitoring using benthic macroinvertebrates. Using higher volumes or concentrations of ethanol, requirements for shorter holding times, or the need to include additional filtering may increase cost and logistical constraints for existing biomonitoring programs. To address this issue, we evaluated the efficacy of various ethanol-based sample preservation methods at maintaining DNA integrity. We evaluated a series of methods that were minimally modified from typical field protocols in order to identify an approach that can be readily incorporated into existing monitoring programs. Benthic macroinvertebrates were collected from a minimally disturbed stream in southern California, USA and subjected to one of six preservation treatments. Ten individuals from five taxa were selected from each treatment and processed to produce DNA barcodes from the mitochondrial gene cytochrome c oxidase I (COI). On average, we obtained successful COI sequences (i.e., either full or partial barcodes) for between 93-99% of all specimens across all six treatments. As long as samples were initially preserved in 95% ethanol, successful sequencing of COI barcodes was not affected by a low dilution ratio of 2:1, transfer to 70% ethanol, presence of abundant organic matter, or holding times of up to six months. Barcoding success varied by taxa, with Leptohyphidae (Ephemeroptera) producing the lowest barcode success rate, most likely due to poor PCR primer efficiency. Differential barcoding success rates have the potential to introduce spurious results. However, routine preservation methods can largely be used without adverse effects on DNA integrity.

  3. Evaluation of environmental sampling methods for detection of Salmonella enterica in a large animal veterinary hospital.

    Science.gov (United States)

    Goeman, Valerie R; Tinkler, Stacy H; Hammac, G Kenitra; Ruple, Audrey

    2018-04-01

    Environmental surveillance for Salmonella enterica can be used for early detection of contamination; thus routine sampling is an integral component of infection control programs in hospital environments. At the Purdue University Veterinary Teaching Hospital (PUVTH), the technique regularly employed in the large animal hospital for sample collection uses sterile gauze sponges for environmental sampling, which has proven labor-intensive and time-consuming. Alternative sampling methods use Swiffer brand electrostatic wipes for environmental sample collection, which are reportedly effective and efficient. It was hypothesized that use of Swiffer wipes for sample collection would be more efficient and less costly than the use of gauze sponges. A head-to-head comparison between the 2 sampling methods was conducted in the PUVTH large animal hospital and relative agreement, cost-effectiveness, and sampling efficiency were compared. There was fair agreement in culture results between the 2 sampling methods, but Swiffer wipes required less time and less physical effort to collect samples and were more cost-effective.
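
    One common way to quantify the "fair agreement" reported between the two sampling methods is Cohen's kappa on the paired culture results; the sketch below shows that calculation on made-up data, not the PUVTH results.

```python
# Sketch: Cohen's kappa for agreement between two sampling methods.
# Paired culture results below are invented.
def cohens_kappa(a, b):
    """a, b: equal-length lists of booleans (True = Salmonella-positive)."""
    n = len(a)
    both_pos = sum(x and y for x, y in zip(a, b))
    both_neg = sum((not x) and (not y) for x, y in zip(a, b))
    observed = (both_pos + both_neg) / n
    p_a, p_b = sum(a) / n, sum(b) / n
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)   # chance agreement
    return (observed - expected) / (1 - expected)

gauze   = [True, False, False, True, False, False, True, False, False, False]
swiffer = [True, False, True,  True, False, False, False, False, False, False]
print(f"kappa = {cohens_kappa(gauze, swiffer):.2f}")
# Conventionally, kappa of 0.21-0.40 is described as "fair" agreement.
```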

  4. Evaluation of sampling, cookery, and shear force protocols for objective evaluation of lamb longissimus tenderness.

    Science.gov (United States)

    Shackelford, S D; Wheeler, T L; Koohmaraie, M

    2004-03-01

    Experiments were conducted to compare the effects of two cookery methods, two shear force procedures, and sampling location within non-callipyge and callipyge lamb LM on the magnitude, variance, and repeatability of LM shear force data. In Exp. 1, 15 non-callipyge and 15 callipyge carcasses were sampled, and Warner-Bratzler shear force (WBSF) was determined for both sides of each carcass at three locations along the length (anterior to posterior) of the LM, whereas slice shear force (SSF) was determined for both sides of each carcass at only one location. For approximately half the carcasses within each genotype, LM chops were cooked for a constant amount of time using a belt grill, and chops of the remaining carcasses were cooked to a constant endpoint temperature using open-hearth electric broilers. Regardless of cooking method and sampling location, repeatability estimates were at least 0.8 for LM WBSF and SSF. For WBSF, repeatability estimates were slightly higher at the anterior location (0.93 to 0.98) than the posterior location (0.88 to 0.90). The difference in repeatability between locations was probably a function of a greater level of variation in shear force at the anterior location. For callipyge LM, WBSF was higher (P lamb LM chops cooked with the belt grill using a larger number of animals (n = 87). In Exp. 2, LM chops were obtained from matching locations of both sides of 44 non-callipyge and 43 callipyge carcasses. Chops were cooked with a belt grill and SSF was measured, and repeatability was estimated to be 0.95. Repeatable estimates of lamb LM tenderness can be achieved either by cooking to a constant endpoint temperature with electric broilers or cooking for a constant amount of time with a belt grill. Likewise, repeatable estimates of lamb LM tenderness can be achieved with WBSF or SSF. However, use of belt grill cookery and the SSF technique could decrease time requirements which would decrease research costs.

  5. Development of a real-time PCR to detect Demodex canis DNA in different tissue samples.

    Science.gov (United States)

    Ravera, Ivan; Altet, Laura; Francino, Olga; Bardagí, Mar; Sánchez, Armand; Ferrer, Lluís

    2011-02-01

    The present study reports the development of a real-time polymerase chain reaction (PCR) to detect Demodex canis DNA in different tissue samples. The technique amplifies a 166 bp fragment of the D. canis chitin synthase gene (AB 080667) and has been successfully tested on hairs extracted with their roots and on formalin-fixed, paraffin-embedded skin biopsies. The real-time PCR amplified D. canis DNA from the hairs of all 14 dogs with a firm diagnosis of demodicosis and consistently failed to amplify in negative controls. Eleven of 12 skin biopsies with a morphologic diagnosis of canine demodicosis were also positive. Sampling hairs at two skin points (lateral face and interdigital skin), D. canis DNA was detected in nine of 51 healthy dogs (17.6%), a much higher percentage than previously reported in microscopic studies. Furthermore, it is foreseen that if the number of samples were increased, the percentage of positive dogs would probably also grow. Moreover, in four of the six dogs with demodicosis, the samples taken from non-lesioned skin were positive. This finding, if confirmed in further studies, suggests that demodicosis is a generalized phenomenon in canine skin, due to proliferation of local mite populations, even though macroscopic lesions only appear in certain areas. The real-time PCR technique to detect D. canis DNA described in this work is a useful tool to advance our understanding of canine demodicosis.

  6. Generation and Analysis of Constrained Random Sampling Patterns

    DEFF Research Database (Denmark)

    Pierzchlewski, Jacek; Arildsen, Thomas

    2016-01-01

    Random sampling is a technique for signal acquisition which is gaining popularity in practical signal processing systems. Nowadays, event-driven analog-to-digital converters make random sampling feasible in practical applications. A process of random sampling is defined by a sampling pattern, which...... indicates signal sampling points in time. Practical random sampling patterns are constrained by ADC characteristics and application requirements. In this paper, we introduce statistical methods which evaluate random sampling pattern generators with emphasis on practical applications. Furthermore, we propose...... algorithm generates random sampling patterns dedicated for event-driven ADCs better than existing sampling pattern generators. Finally, implementation issues of random sampling patterns are discussed....
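
    The record elides the proposed generator, so the sketch below is only a generic example of a constrained random sampling pattern: sampling points are drawn on a discrete time grid subject to a minimum-spacing constraint of the kind an event-driven ADC might impose, and simple pattern statistics are reported. The construction shown is a standard combinatorial trick, not the authors' algorithm.

```python
# Generic sketch: random sampling pattern with a minimum-gap constraint.
import random

def constrained_pattern(n_grid, n_samples, min_gap, seed=0):
    """Pick n_samples grid indices with consecutive picks >= min_gap apart."""
    rng = random.Random(seed)
    # Sample from a shrunken grid, then re-expand so that consecutive picks
    # are separated by at least min_gap grid points.
    reduced = n_grid - (n_samples - 1) * (min_gap - 1)
    base = sorted(rng.sample(range(reduced), n_samples))
    return [b + i * (min_gap - 1) for i, b in enumerate(base)]

pattern = constrained_pattern(n_grid=1000, n_samples=50, min_gap=5)
gaps = [b - a for a, b in zip(pattern, pattern[1:])]
print(f"min gap: {min(gaps)}  mean gap: {sum(gaps) / len(gaps):.1f}  first picks: {pattern[:5]}")
```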

  7. Handling time in economic evaluation studies.

    Science.gov (United States)

    Permsuwan, Unchalee; Guntawongwan, Kansinee; Buddhawongsa, Piyaluk

    2014-05-01

    The discount rates and time horizons used in a health technology assessment (HTA) can have a significant impact on the results, and thus the prioritization of technologies. Therefore, it is important that clear guidance be provided on the appropriate discount rates for costs and health effects and on appropriate time horizons. In this paper, we conduct a review of relevant case studies and guidelines and provide guidance for all researchers conducting economic evaluations of health technologies in the Thai context. A uniform discount rate of 3% is recommended for both costs and health effects in base case analyses. A sensitivity analysis should also be conducted, with a discount range of 0-6%. For technologies where the effects are likely to be sustained for at least 30 years, a rate of 4% for costs and 2% for health effects is recommended. The time horizon should be long enough to capture the full costs and effects of the programs.
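
    As a small worked example of the recommendation above, the sketch below discounts a hypothetical stream of yearly costs and health effects at the base-case 3% rate and at the 4%/2% rates suggested for long-horizon technologies; the streams themselves are invented.

```python
# Sketch: discounting costs and health effects (e.g., QALYs) to present value.
def present_value(stream, rate):
    """Discount a yearly stream (year 0 undiscounted) at the given rate."""
    return sum(x / (1.0 + rate) ** t for t, x in enumerate(stream))

yearly_costs   = [1000.0] * 30    # hypothetical cost per year
yearly_effects = [0.8] * 30       # hypothetical QALYs gained per year

# Base case: 3% for both costs and health effects.
print("base case   :", present_value(yearly_costs, 0.03), present_value(yearly_effects, 0.03))
# Long-horizon (effects sustained >= 30 years): 4% for costs, 2% for effects.
print("long horizon:", present_value(yearly_costs, 0.04), present_value(yearly_effects, 0.02))
```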

  8. Evaluation of Brazilian intercomparison program data from 1991 to 1995 of radionuclide assays in environmental samples

    International Nuclear Information System (INIS)

    Vianna, Maria Elizabeth Couto M.; Tauhata, Luiz; Oliveira, Antonio Eduardo de; Oliveira, Josue Peter de; Clain, Almir Faria; Ferreira, Ana Cristina M.

    1998-01-01

    Historical radioanalytical data from the Institute of Radiation Protection and Dosimetry (IRD) national intercomparison program from 1991 to 1995 were analyzed to evaluate the performance of sixteen Brazilian laboratories in radionuclide analyses of environmental samples. The data comprise measurements of radionuclides in 435 spiked environmental samples distributed across fifteen intercomparison runs, amounting to 955 analyses. The general and radionuclide-specific performances of the participating laboratories were evaluated relative to the reference value. The data analysis points to the need for improvements in beta-emitter measurements.

  9. Evaluation of the stages involved in cold ischemia time in renal transplants in Chile.

    Science.gov (United States)

    Elgueta, S; Fuentes, C; Arenas, A; Labraña, C; Gajardo, J G; Lopez, M; Hernandez, J; Rodriguez, H; Rodriguez, L

    2010-01-01

    Cold ischemia time (CIT) is one of the factors that determine the evolution of a renal transplant; taking measures to reduce this time requires knowledge of its stages. The objective of this study was to evaluate the times in the stages that determine CIT in renal transplants. We analyzed 108 donors and 201 kidney transplantations performed in Chile in 2008, establishing the CIT for the kidney transplanted by the center that extracted the kidneys (local kidney) and for the kidney transplanted in another center (shared kidney). Average CIT was 18.8 hours: namely, 16.9 hours for local and 20.2 hours for shared kidneys (P = .0001484). CIT for cases in which samples were sent to histocompatibility laboratory prior to nephrectomy was 7.3 hours less than for those sent postnephrectomy. The mean time between the allocation of the kidney and the transplant was 7.3 hours; 5.6 hours for local kidneys and 8.4 hours for shared kidneys (P = .000007124). We identified the stages at which intervention is possible to reduce the CIT, mainly for shared kidneys. All involved parties should make an effort to reduce this time.

  10. Malaria diagnosis from pooled blood samples: comparative analysis of real-time PCR, nested PCR and immunoassay as a platform for the molecular and serological diagnosis of malaria on a large-scale

    Directory of Open Access Journals (Sweden)

    Giselle FMC Lima

    2011-09-01

    Full Text Available Malaria diagnosis has traditionally been made using thick blood smears, but more sensitive and faster techniques are required to process large numbers of samples in clinical and epidemiological studies and in blood donor screening. Here, we evaluated molecular and serological tools to build a screening platform for pooled samples aimed at reducing both the time and the cost of these diagnoses. Positive and negative samples were analysed in individual and pooled experiments using real-time polymerase chain reaction (PCR), nested PCR and an immunochromatographic test. For the individual tests, 46/49 samples were positive by real-time PCR, 46/49 were positive by nested PCR and 32/46 were positive by the immunochromatographic test. For the assays performed using pooled samples, 13/15 samples were positive by real-time PCR and nested PCR and 11/15 were positive by the immunochromatographic test. These molecular methods demonstrated sensitivity and specificity for both the individual and pooled samples. Due to the advantages of real-time PCR, such as fast processing and the closed system, this method should be indicated as the first choice for use in large-scale diagnosis, and nested PCR should be used for species differentiation. However, additional field isolates should be tested to confirm the results achieved using cultured parasites, and the serological test should only be adopted as a complementary method for malaria diagnosis.

  11. Evaluation of scaling invariance embedded in short time series.

    Directory of Open Access Journals (Sweden)

    Xue Pan

    Full Text Available Scaling invariance of time series has been making great contributions in diverse research fields. But how to evaluate the scaling exponent from a real-world series is still an open problem. The finite length of a time series may induce unacceptable fluctuation and bias in statistical quantities and the consequent invalidation of currently used standard methods. In this paper a new concept called correlation-dependent balanced estimation of diffusion entropy is developed to evaluate scale invariance in very short time series with length ~10^2. Calculations with specified Hurst exponent values of 0.2, 0.3, ..., 0.9 show that by using the standard central moving average de-trending procedure this method can evaluate the scaling exponents for short time series with negligible bias (≤0.03) and a sharp confidence interval (standard deviation ≤0.05). Considering the stride series from ten volunteers along an approximate oval path of a specified length, we observe that though the averages and deviations of scaling exponents are close, their evolutionary behaviors display rich patterns. It has potential use in analyzing physiological signals, detecting early warning signals, and so on. As an emphasis, our core contribution is that by means of the proposed method one can precisely estimate the Shannon entropy from limited records.

  12. Evaluation of scaling invariance embedded in short time series.

    Science.gov (United States)

    Pan, Xue; Hou, Lei; Stephen, Mutua; Yang, Huijie; Zhu, Chenping

    2014-01-01

    Scaling invariance of time series has been making great contributions in diverse research fields. But how to evaluate the scaling exponent from a real-world series is still an open problem. The finite length of a time series may induce unacceptable fluctuation and bias in statistical quantities and the consequent invalidation of currently used standard methods. In this paper a new concept called correlation-dependent balanced estimation of diffusion entropy is developed to evaluate scale invariance in very short time series with length ~10^2. Calculations with specified Hurst exponent values of 0.2, 0.3, ..., 0.9 show that by using the standard central moving average de-trending procedure this method can evaluate the scaling exponents for short time series with negligible bias (≤0.03) and a sharp confidence interval (standard deviation ≤0.05). Considering the stride series from ten volunteers along an approximate oval path of a specified length, we observe that though the averages and deviations of scaling exponents are close, their evolutionary behaviors display rich patterns. It has potential use in analyzing physiological signals, detecting early warning signals, and so on. As an emphasis, our core contribution is that by means of the proposed method one can precisely estimate the Shannon entropy from limited records.

  13. Evaluation of monoclonal antibody-based sandwich direct ELISA (MSD-ELISA) for antigen detection of foot-and-mouth disease virus using clinical samples.

    Science.gov (United States)

    Morioka, Kazuki; Fukai, Katsuhiko; Sakamoto, Kenichi; Yoshida, Kazuo; Kanno, Toru

    2014-01-01

    A monoclonal antibody-based sandwich direct ELISA (MSD-ELISA) method was previously developed for foot-and-mouth disease (FMD) viral antigen detection. Here we evaluated the sensitivity and specificity of two FMD viral antigen detection MSD-ELISAs and compared them with a conventional indirect sandwich (IS)-ELISA. The MSD-ELISAs were able to detect the antigen in saliva samples of experimentally infected pigs for a longer term than the IS-ELISA. We also used 178 RT-PCR-positive field samples from cattle and pigs affected by the 2010 type-O FMD outbreak in Japan, and we found that the sensitivities of both MSD-ELISAs were about 7 times higher than that of the IS-ELISA for each sample (P<0.01). In terms of the FMD-positive farm detection rate, the sensitivities of the MSD-ELISAs were about 6 times higher than that of the IS-ELISA for each farm (P<0.01). Although further validation studies using other virus strains are necessary, the MSD-ELISAs could be appropriate as a method to replace the IS-ELISA for FMD antigen detection.

  14. Pumping time required to obtain tube well water samples with aquifer characteristic radon concentrations

    International Nuclear Information System (INIS)

    Ricardo, Carla Pereira; Oliveira, Arno Heeren de

    2011-01-01

    Radon is an inert noble gas, which comes from the natural radioactive decay of uranium and thorium in soil, rock and water. Radon isotopes emanated from radium-bearing grains of a rock or soil are released into the pore space. Radon that reaches the pore space is partitioned between the gaseous and aqueous phases. Thus, the groundwater carries a radon signature from the rock that is characteristic of the aquifer. The characteristic radon concentration of an aquifer, which is mainly related to the emanation, is also influenced by the degree of subsurface degassing, especially in the vicinity of a tube well, where the radon concentration is strongly reduced. To determine the pumping time required to obtain a tube well water sample with the characteristic radon concentration of the aquifer, an experiment was conducted in an 80 m deep tube well. In this experiment, after twenty-four hours without extraction, water samples were collected periodically, at intervals of about ten minutes, during two hours of pumping. The radon concentrations of the samples were determined using the RAD7 Electronic Radon Detector from Durridge Company, a solid-state alpha spectrometric detector. It was found that the time necessary to reach the maximum radon concentration, that is, the characteristic radon concentration of the aquifer, is about sixty minutes. (author)

  15. Evaluation of correlation between physical properties and ultrasonic pulse velocity of fired clay samples.

    Science.gov (United States)

    Özkan, İlker; Yayla, Zeliha

    2016-03-01

    The aim of this study is to establish a correlation between the physical properties and the ultrasonic pulse velocity of clay samples fired at elevated temperatures. Brick-making clay and pottery clay were studied for this purpose. The physical properties of the clay samples were assessed after firing pressed clay samples separately at temperatures of 850, 900, 950, 1000, 1050 and 1100 °C. A commercial ultrasonic testing instrument (Proceq Pundit Lab) was used to perform the ultrasonic pulse velocity measurements on each fired clay sample as a function of temperature. A relationship was observed between the physical properties and the ultrasonic pulse velocities of the samples. The results showed that, as a consequence of the increasing densification of the samples, the differences between the ultrasonic pulse velocities became larger with increasing temperature. These findings may facilitate the use of ultrasonic pulse velocity for the estimation of physical properties of fired clay samples. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. A time-sorting pitfall trap and temperature datalogger for the sampling of surface-active arthropods

    OpenAIRE

    McMunn, Marshall S.

    2017-01-01

    Nearly all arthropods display consistent patterns of activity according to time of day. These patterns of activity often limit the extent of animal co-occurrence in space and time. Quantifying when particular species are active and how activity varies with environmental conditions is difficult without the use of automated devices due to the need for continuous monitoring. Time-sorting pitfall traps passively collect active arthropods into containers with known beginning and end sample times. ...

  17. Evaluation of oxidation techniques for preparing bioassay and environmental samples for liquid scintillation counting

    International Nuclear Information System (INIS)

    Miller, H.H.

    1979-10-01

    In environmental and biological monitoring for carbon-14 and tritium, the presence of color and chemical quenching agents in the samples can degrade the efficiency of liquid scintillation counting. A series of experiments was performed to evaluate the usefulness, under routine conditions, of first oxidizing the samples to improve the counting by removing the color and quenching agents. The scintillation counter was calibrated for the effects of quenching agents on its counting efficiency. The oxidizing apparatus was tested for its ability to accurately recover the ¹⁴C and ³H in the samples. Scintillation counting efficiencies were compared for a variety of oxidized and unoxidized environmental and bioassay samples. The overall conclusion was that, for routine counting, oxidation of such samples is advantageous when they are highly quenched or in solid form.

  18. Molecular methods (digital PCR and real-time PCR) for the quantification of low copy DNA of Phytophthora nicotianae in environmental samples.

    Science.gov (United States)

    Blaya, Josefa; Lloret, Eva; Santísima-Trinidad, Ana B; Ros, Margarita; Pascual, Jose A

    2016-04-01

    Currently, real-time polymerase chain reaction (qPCR) is the technique most often used to quantify pathogen presence. Digital PCR (dPCR) is a new technique with the potential to have a substantial impact on plant pathology research owing to its reproducibility, sensitivity and low susceptibility to inhibitors. In this study, we evaluated the feasibility of using dPCR and qPCR to quantify Phytophthora nicotianae in several background matrices, including host tissues (stems and roots) and soil samples. In spite of the low dynamic range of dPCR (3 logs compared with 7 logs for qPCR), this technique proved to have very high precision, applicable at very low copy numbers. The dPCR was able to accurately detect the pathogen in all types of samples over a broad concentration range. Moreover, dPCR seems to be less susceptible to inhibitors than qPCR in plant samples. Linear regression analysis showed a high correlation between the results obtained with the two techniques in soil, stem and root samples, with R² = 0.873, 0.999 and 0.995, respectively. These results suggest that dPCR is a promising alternative for quantifying soil-borne pathogens in environmental samples, even in early stages of the disease. © 2015 Society of Chemical Industry.
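
    The agreement analysis reported above is an ordinary linear regression between the two quantification methods; as a hedged sketch with invented paired values (not the study data), it can be reproduced as follows.

```python
# Sketch: regression of dPCR against qPCR quantifications on a log scale.
import numpy as np

qpcr_copies = np.array([1e2, 5e2, 1e3, 5e3, 1e4, 5e4, 1e5])                 # invented
dpcr_copies = np.array([1.2e2, 4.6e2, 1.1e3, 4.4e3, 1.05e4, 4.8e4, 0.9e5])  # invented

x, y = np.log10(qpcr_copies), np.log10(dpcr_copies)
slope, intercept = np.polyfit(x, y, 1)
r_squared = np.corrcoef(x, y)[0, 1] ** 2
print(f"slope={slope:.2f}  intercept={intercept:.2f}  R^2={r_squared:.3f}")
```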

  19. Evaluation of a real-time travel time prediction system in a freeway construction work zone : executive summary.

    Science.gov (United States)

    2001-03-01

    A real-time travel time prediction system (TIPS) was evaluated in a construction work zone. TIPS includes changeable message signs (CMSs) displaying the travel time and distance to the end of the work zone to motorists. The travel times displayed...

  20. Identification of driving network of cellular differentiation from single sample time course gene expression data

    Science.gov (United States)

    Chen, Ye; Wolanyk, Nathaniel; Ilker, Tunc; Gao, Shouguo; Wang, Xujing

    Methods developed based on bifurcation theory have demonstrated their potential in driving network identification for complex human diseases, including the work by Chen et al. Recently, bifurcation theory has been successfully applied to model cellular differentiation. However, one often faces a technical challenge in driving network prediction: a time course cellular differentiation study often contains only one sample at each time point, while driving network prediction typically requires multiple samples at each time point to infer the variation and interaction structures of candidate genes for the driving network. In this study, we investigate several methods to identify both the critical time point and the driving network through examination of how each time point affects the autocorrelation and phase locking. We apply these methods to a high-throughput sequencing (RNA-Seq) dataset of 42 subsets of thymocytes and mature peripheral T cells at multiple time points during their differentiation (GSE48138 from GEO). We compare the predicted driving genes with known transcription regulators of cellular differentiation. We will discuss the advantages and limitations of our proposed methods, as well as potential further improvements of our methods.

  1. Preservation of water samples for arsenic(III/V) determinations: An evaluation of the literature and new analytical results

    Science.gov (United States)

    McCleskey, R. Blaine; Nordstrom, D. Kirk; Maest, A.S.

    2004-01-01

    Published literature on preservation procedures for stabilizing aqueous inorganic As(III/V) redox species contains discrepancies. This study critically evaluates published reports on As redox preservation and explains discrepancies in the literature. Synthetic laboratory preservation experiments and time stability experiments were conducted for natural water samples from several field sites. Any field collection procedure that filters out microorganisms, adds a reagent that prevents dissolved Fe and Mn oxidation and precipitation, and isolates the sample from solar radiation will preserve the As(III/V) ratio. Reagents that prevent Fe and Mn oxidation and precipitation include HCl, H₂SO₄, and EDTA, although extremely high concentrations of EDTA are necessary for some water samples high in Fe. Photo-catalyzed Fe(III) reduction causes As(III) oxidation; however, storing the sample in the dark prevents photochemical reactions. Furthermore, the presence of Fe(II) or SO₄ inhibits the oxidation of As(III) by Fe(III) because of complexation reactions and competing reactions with free radicals. Consequently, fast abiotic As(III) oxidation reactions observed in the laboratory are not observed in natural water samples for one or more of the following reasons: (1) the As redox species have already stabilized, (2) most natural waters contain very low dissolved Fe(III) concentrations, (3) the As(III) oxidation caused by Fe(III) photoreduction is inhibited by Fe(II) or SO₄.

  2. Preservation of water samples for arsenic(III/V) determinations: an evaluation of the literature and new analytical results

    International Nuclear Information System (INIS)

    McCleskey, R.Blaine; Nordstrom, D.Kirk; Maest, Ann S.

    2004-01-01

    Published literature on preservation procedures for stabilizing aqueous inorganic As(III/V) redox species contains discrepancies. This study critically evaluates published reports on As redox preservation and explains discrepancies in the literature. Synthetic laboratory preservation experiments and time stability experiments were conducted for natural water samples from several field sites. Any field collection procedure that filters out microorganisms, adds a reagent that prevents dissolved Fe and Mn oxidation and precipitation, and isolates the sample from solar radiation will preserve the As(III/V) ratio. Reagents that prevent Fe and Mn oxidation and precipitation include HCl, H₂SO₄, and EDTA, although extremely high concentrations of EDTA are necessary for some water samples high in Fe. Photo-catalyzed Fe(III) reduction causes As(III) oxidation; however, storing the sample in the dark prevents photochemical reactions. Furthermore, the presence of Fe(II) or SO₄ inhibits the oxidation of As(III) by Fe(III) because of complexation reactions and competing reactions with free radicals. Consequently, fast abiotic As(III) oxidation reactions observed in the laboratory are not observed in natural water samples for one or more of the following reasons: (1) the As redox species have already stabilized, (2) most natural waters contain very low dissolved Fe(III) concentrations, (3) the As(III) oxidation caused by Fe(III) photoreduction is inhibited by Fe(II) or SO₄.

  3. Evaluation of the International Outcome Inventory for Hearing Aids in a veteran sample.

    Science.gov (United States)

    Smith, Sherri L; Noe, Colleen M; Alexander, Genevieve C

    2009-06-01

    The International Outcome Inventory for Hearing Aids (IOI-HA) was developed as a global hearing aid outcome measure targeting seven outcome domains. The published norms were based on a private-pay sample who were fitted with analog hearing aids. The purpose of this study was to evaluate the psychometric properties of the IOI-HA and to establish normative data in a veteran sample. Survey. The participants were 131 male veterans (mean age of 74.3 years, SD = 7.4) who were issued hearing aids with digital signal processing (DSP). Hearing aids with DSP were fitted bilaterally between 2005 and 2007. Veterans were mailed two copies of the IOI-HA. The participants were instructed to complete the first copy of the questionnaire immediately and the second copy in two weeks. The completed questionnaires were mailed to the laboratory. The psychometric properties of the questionnaire were evaluated. As suggested by Cox and colleagues, the participants were divided into two categories based on their unaided subjective hearing difficulty. The two categories were (1) those with less hearing difficulty (none-to-moderate category) and (2) those who reported more hearing difficulty (moderately severe+ category). The norms from the current veteran sample were then compared to the original, published sample. For each hearing difficulty category, the critical difference values were calculated for each item and for the total score. A factor analysis showed that the IOI-HA in the veteran sample had the identical subscale structure as reported in the original sample. For the total scale, the internal consistency was good (Cronbach's alpha = 0.83), and the test-retest reliability was high (lambda = 0.94). Group and individual norms were developed for both hearing difficulty categories in the veteran sample. For each IOI-HA item, the critical difference was approximately one response unit; a change of more than one response unit between two test sessions reflects a true change in outcome for a given domain. The results of this study
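
    As a short illustration of the internal-consistency statistic cited above, the sketch below computes Cronbach's alpha for a respondents-by-items score matrix; the simulated 7-item responses stand in for the IOI-HA data, which are not reproduced here.

```python
# Sketch: Cronbach's alpha from a respondents x items matrix (simulated data).
import numpy as np

rng = np.random.default_rng(1)
latent = rng.normal(3.5, 0.8, size=100)   # latent outcome per respondent
scores = np.clip(np.round(latent[:, None] + rng.normal(0, 0.6, size=(100, 7))), 1, 5)

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = questionnaire items."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

print(f"alpha = {cronbach_alpha(scores):.2f}")
```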

  4. Evaluation of the Chemical Composition of Brazilian Commercial Cymbopogon citratus (D.C. Stapf Samples

    Directory of Open Access Journals (Sweden)

    Evandro de Castro Melo

    2008-08-01

    Full Text Available Abstract: The concentration and the chemical composition of the essential oils obtained from different samples of Cymbopogon citratus were evaluated. Among the 12 samples investigated (11 dried leaf samples and fresh plant leaves), seven presented essential oil concentrations within the threshold established by the Brazilian legislation. The moisture content was also determined, and the majority of the samples presented moisture contents near 12%. The GC and GC/MS analyses of the essential oils led to the identification of 22 compounds, with neral and geranial as the two major components. The total percentage of these two compounds varied within the investigated sample oils from 40.7% to 75.4%. In addition, a considerable variation in the chemical composition of the analyzed samples was observed. The process of grinding the leaves significantly decreased (by up to 68%) the essential oil content, as well as the percentage of myrcene in the oils.

  5. Imaging systems and algorithms to analyze biological samples in real-time using mobile phone microscopy.

    Science.gov (United States)

    Shanmugam, Akshaya; Usmani, Mohammad; Mayberry, Addison; Perkins, David L; Holcomb, Daniel E

    2018-01-01

    Miniaturized imaging devices have pushed the boundaries of point-of-care imaging, but existing mobile-phone-based imaging systems do not exploit the full potential of smart phones. This work demonstrates the use of simple imaging configurations to deliver superior image quality and the ability to handle a wide range of biological samples. Results presented in this work are from analysis of fluorescent beads under fluorescence imaging, as well as helminth eggs and freshwater mussel larvae under white light imaging. To demonstrate versatility of the systems, real time analysis and post-processing results of the sample count and sample size are presented in both still images and videos of flowing samples.

  6. Observer-based output feedback control of networked control systems with non-uniform sampling and time-varying delay

    Science.gov (United States)

    Meng, Su; Chen, Jie; Sun, Jian

    2017-10-01

    This paper investigates the problem of observer-based output feedback control for networked control systems with non-uniform sampling and time-varying transmission delay. The sampling intervals are assumed to vary within a given interval. The transmission delay belongs to a known interval. A discrete-time model is first established, which contains time-varying delay and norm-bounded uncertainties coming from non-uniform sampling intervals. It is then converted to an interconnection of two subsystems in which the forward channel is delay-free. The scaled small gain theorem is used to derive the stability condition for the closed-loop system. Moreover, the observer-based output feedback controller design method is proposed by utilising a modified cone complementary linearisation algorithm. Finally, numerical examples illustrate the validity and superiority of the proposed method.

  7. Evaluation of Damping Using Time Domain OMA Techniques

    DEFF Research Database (Denmark)

    Bajric, Anela; Brincker, Rune; Georgakis, Christos T.

    2014-01-01

    …In this paper a comparison is made of the effectiveness of three existing OMA techniques in providing accurate damping estimates for varying loadings, levels of noise, number of added measurement channels and structural damping. The evaluated techniques are derived in the time domain, namely the Ibrahim Time Domain (ITD), the Eigenvalue Realization Algorithm (ERA) and the Polyreference Time Domain (PTD). The response of a two degree-of-freedom (2DOF) system is numerically established from specified modal parameters with well separated and closely spaced modes. Two types of response are considered, free...

  8. Evaluating organochlorine pesticide residues in the aquatic environment of the Lake Naivasha River basin using passive sampling techniques.

    Science.gov (United States)

    Abbasi, Yasser; Mannaerts, Chris M

    2018-05-18

    Passive sampling techniques can improve the detection of low concentrations by continuously collecting contaminants, which usually go undetected with classic, once-off time-point grab sampling. The aim of this study was to evaluate organochlorine pesticide (OCP) residues in the aquatic environment of the Lake Naivasha river basin (Kenya) using passive sampling techniques. Silicone rubber sheet and Speedisk samplers were used to detect residues of α-HCH, β-HCH, γ-HCH, δ-HCH, heptachlor, aldrin, heptachlor epoxide, pp-DDE, endrin, dieldrin, α-endosulfan, β-endosulfan, pp-DDD, endrin aldehyde, pp-DDT, endosulfan sulfate, and methoxychlor in the Malewa River and Lake Naivasha. After solvent extraction from the sampling media, the residues were analyzed using gas chromatography with electron capture detection (GC-ECD) for the OCPs and gas chromatography-mass spectrometry (GC-MS) for the PCB reference compounds. Measuring the OCP residues using the silicone rubber samplers revealed the highest concentration of residues (∑OCPs of 81 (± 18.9 SD) μg/L) to be at the Lake site, which is the ultimate accumulation environment for surficial hydrological, chemical, and sediment transport through the river basin. The total OCP residue sums were 71.5 (± 11.3 SD) μg/L for the Middle Malewa and 59 (± 12.5 SD) μg/L for the Upper Malewa River sampling sites. The concentration sums of OCPs detected using the Speedisk samplers at the Upper Malewa, Middle Malewa, and the Lake Naivasha sites were 28.2 (± 4.2 SD), 31.3 (± 1.8 SD), and 34.2 (± 6.4 SD) μg/L, respectively. An evaluation of the different pesticide compound variations identified at the three sites revealed that endosulfan sulfate, α-HCH, methoxychlor, and endrin aldehyde residues were still found at all sampling sites. However, the statistical analysis of one-way ANOVA for testing the differences of ∑OCPs between the sampling sites for both the silicone rubber sheet and Speedisk samplers

  9. Computer graphics for quality control in the INAA of geological samples

    Science.gov (United States)

    Grossman, J.N.; Baedecker, P.A.

    1987-01-01

    A data reduction system for the routine instrumental activation analysis of samples is described, with particular emphasis on interactive graphics capabilities for evaluating analytical quality. Graphics procedures have been developed to interactively control the analysis of selected photopeaks during spectral analysis, and to evaluate detector performance during a given counting cycle. Graphics algorithms are also used to compare the data on reference samples with accepted values, to prepare quality control charts to evaluate long-term precision, and to search for systematic variations in data on reference samples as a function of time. © 1987 Akadémiai Kiadó.
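
    The control-chart logic described above can be sketched in a few lines: limits are set from in-control reference-sample results and later analyses are flagged when they fall outside them. The concentrations below are illustrative only, not data from the cited system.

```python
# Sketch: quality-control chart limits for a reference sample (invented data).
import numpy as np

baseline = np.array([11.8, 12.1, 12.3, 11.9, 12.0, 12.4, 11.7, 12.2, 12.0, 11.9])  # ppm, in control
center, sd = baseline.mean(), baseline.std(ddof=1)
upper, lower = center + 3 * sd, center - 3 * sd        # 3-sigma control limits

new_runs = np.array([12.1, 11.8, 13.4, 12.0])          # later counting cycles
for i, value in enumerate(new_runs, start=1):
    flag = "OUT OF CONTROL" if (value > upper or value < lower) else "ok"
    print(f"run {i}: {value:.1f} ppm  {flag}")
```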

  10. Sample processing, protocol, and statistical analysis of the time-of-flight secondary ion mass spectrometry (ToF-SIMS) of protein, cell, and tissue samples.

    Science.gov (United States)

    Barreto, Goncalo; Soininen, Antti; Sillat, Tarvo; Konttinen, Yrjö T; Kaivosoja, Emilia

    2014-01-01

    Time-of-flight secondary ion mass spectrometry (ToF-SIMS) is increasingly being used in the analysis of biological samples. For example, it has been applied to distinguish healthy and osteoarthritic human cartilage. This chapter discusses the ToF-SIMS principle and instrumentation, including the three modes of analysis in ToF-SIMS. ToF-SIMS sets certain requirements for the samples to be analyzed; for example, the samples have to be vacuum compatible. Accordingly, ToF-SIMS sample processing steps for different biological samples, i.e., proteins, cells, frozen and paraffin-embedded tissues, and extracellular matrix, are presented. Multivariate analysis of the ToF-SIMS data and the necessary data preprocessing steps (peak selection, data normalization, mean-centering, and scaling and transformation) are discussed in this chapter.
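
    As a minimal sketch of the preprocessing steps named above (total-ion normalization, mean-centering, and scaling) applied to a spectra-by-peaks intensity matrix before multivariate analysis, the snippet below uses a simulated matrix; autoscaling is shown as one common scaling choice, which the chapter may or may not use.

```python
# Sketch: ToF-SIMS-style preprocessing of a spectra x peaks intensity matrix.
import numpy as np

rng = np.random.default_rng(2)
intensities = rng.poisson(lam=50, size=(20, 8)).astype(float)   # 20 spectra, 8 selected peaks

# 1. Normalize each spectrum to its total selected-ion intensity.
normalized = intensities / intensities.sum(axis=1, keepdims=True)

# 2. Mean-center each peak across spectra.
centered = normalized - normalized.mean(axis=0)

# 3. Scale; unit-variance (autoscaling) shown here as one common option.
scaled = centered / centered.std(axis=0, ddof=1)

print("processed matrix shape:", scaled.shape)
print("per-peak means ~ 0:", np.allclose(scaled.mean(axis=0), 0.0))
```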

  11. Limited sampling strategy for determining metformin area under the plasma concentration-time curve

    DEFF Research Database (Denmark)

    Santoro, Ana Beatriz; Stage, Tore Bjerregaard; Struchiner, Claudio José

    2016-01-01

    AIM: The aim was to develop and validate limited sampling strategy (LSS) models to predict the area under the plasma concentration-time curve (AUC) for metformin. METHODS: Metformin plasma concentrations (n = 627) at 0-24 h after a single 500 mg dose were used for LSS development, based on all su...
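
    The record is truncated, so the sketch below only illustrates the two generic ingredients of a limited sampling strategy: a reference AUC(0-24 h) obtained from the full concentration-time profile by the trapezoidal rule, and a simple regression-type LSS equation that predicts it from a few time points. All concentrations and coefficients are invented, not the metformin model from the paper.

```python
# Sketch: trapezoidal AUC versus a hypothetical limited-sampling prediction.
import numpy as np

times = np.array([0, 0.5, 1, 2, 3, 4, 6, 8, 12, 24], dtype=float)    # h
conc  = np.array([0, 0.4, 0.9, 1.4, 1.2, 1.0, 0.7, 0.5, 0.3, 0.1])   # mg/L

# Reference AUC(0-24 h) by the linear trapezoidal rule.
full_auc = sum((times[i + 1] - times[i]) * (conc[i + 1] + conc[i]) / 2
               for i in range(len(times) - 1))
print(f"reference AUC = {full_auc:.2f} mg*h/L")

# Hypothetical LSS equation of the usual form AUC ~ a + b1*C(2 h) + b2*C(8 h).
a, b1, b2 = 0.5, 4.0, 9.0
lss_auc = a + b1 * conc[times == 2][0] + b2 * conc[times == 8][0]
print(f"LSS-predicted AUC = {lss_auc:.2f} mg*h/L")
```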

  12. The effects of disjunct sampling and averaging time on maximum mean wind speeds

    DEFF Research Database (Denmark)

    Larsén, Xiaoli Guo; Mann, J.

    2006-01-01

    Conventionally, the 50-year wind is calculated on basis of the annual maxima of consecutive 10-min averages. Very often, however, the averages are saved with a temporal spacing of several hours. We call it disjunct sampling. It may also happen that the wind speeds are averaged over a longer time...

  13. Study of adolescents exposed in utero. Methodological evaluation of the Nagasaki sample

    Energy Technology Data Exchange (ETDEWEB)

    Hrubec, Zdenek; Noble, K.B.; Burrow, G.N.

    1962-09-12

    Fetal tissues have been shown to be extremely sensitive to ionizing radiation, and therefore a group of children who were exposed in utero are of special interest. When these children entered adolescence, an intensive study was undertaken to determine whether differences not otherwise apparent would be revealed during the stress of this period of rapid growth. The purpose of this report is to describe the sample used to study these adolescent children who were exposed in utero and to provide reference information. The problems of using ex post facto methods as employed in this study have been discussed in detail elsewhere. In summary, the extent to which findings of a retrospective study may be generalized to a larger population can be determined only from a careful and extensive study of the characteristics of the sample and an evaluation of the procedures used in its selection. It is generally recognized that even an extensive methodologic exploration of this kind offers no conclusive proof that a sample is useful for a specific study. In the sample, some variables which may have a considerable effect on the medical data, such as socioeconomic status, have been taken into account only superficially. There is always the possibility that some important, completely unsuspected variables may produce spurious associations. However there is an almost infinite number of such factors which might conceivably affect the data. Vast research resources could be committed to a methodologic evaluation without fulfilling the basic purpose of the study. An approach must be devised which is judged methodologically adequate but which will not tax the research resource to the detriment of the basic objectives. It is hoped that this report will satisfy the requirements of this compromise. 30 references, 36 tables.

  14. Aerosol sampling and characterization for hazard evaluation. Progress report, July 1, 1975--September 30, 1976

    International Nuclear Information System (INIS)

    Scripsick, R.C.; Gray, D.C.; Tillery, M.I.; Stafford, R.G.; Romero, P.O.

    1977-04-01

    A draft Manual of Recommended Practice for Aerosol Sampling and Evaluation was completed and sent to the Energy Research and Development Administration (ERDA) Division of Safety, Standards, and Compliance (DSSC) for review. The results of the Survey of Sampling Techniques for Defining Respirable Concentration and/or Particle Size Characteristics of Aerosols were published as LA-6087. The need for greater standardization of ERDA aerosol sampling techniques was indicated. The Aerosol Training Course was presented in 11 sessions to 85 persons. General elements of good practice were emphasized, and recommendation of specific sampling devices or procedures was avoided. A system for estimating dissolution rates of plutonium aerosols was developed. Studies indicate that plutonium aerosols found in the field have a rapid initial dissolution phase followed by a slower secondary phase. Three methods of particle sizing air samples collected on membrane filters were investigated. The most promising was a scanning electron microscope electron microprobe (SEM-EMp) method. An operating plutonium handling facility was a model for development of techniques to evaluate aerosol surveillance systems performance. Airborne contamination records were studied. The physicochemical properties of a plutonium aerosol existing in the facility were investigated in relation to plutonium handling operations. The techniques developed have indicated some areas of the aerosol surveillance system that need improvement

  15. A method to evaluate process performance by integrating time and resources

    Science.gov (United States)

    Wang, Yu; Wei, Qingjie; Jin, Shuang

    2017-06-01

    The purpose of process mining is to improve an enterprise's existing processes, so measuring process performance is particularly important. However, current research on performance evaluation methods remains limited: most evaluations rely on time or resource statistics alone, and such basic statistics cannot characterize process performance well. In this paper, a method for evaluating process performance that integrates the time and resource dimensions is proposed. The method can be used to measure the utilization and redundancy of resources in the process. The design principles and formulas of the evaluation algorithm are introduced, followed by its design and implementation. Finally, the method is applied to an event log from a telephone maintenance process and an optimization plan is proposed.
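
    The paper's own formula is not given in the record; as one hedged example of integrating the two dimensions, the sketch below computes per-resource busy time from an event log and expresses it as utilization over the observation window. The log entries and resource names are invented.

```python
# Hedged sketch (not the paper's algorithm): resource utilization from an event log.
from datetime import datetime

log = [  # (resource, activity start, activity end) -- invented entries
    ("tech_A", datetime(2017, 6, 1, 9, 0),  datetime(2017, 6, 1, 10, 30)),
    ("tech_A", datetime(2017, 6, 1, 11, 0), datetime(2017, 6, 1, 12, 0)),
    ("tech_B", datetime(2017, 6, 1, 9, 0),  datetime(2017, 6, 1, 9, 45)),
]
window_start, window_end = datetime(2017, 6, 1, 9, 0), datetime(2017, 6, 1, 13, 0)
window_hours = (window_end - window_start).total_seconds() / 3600

busy = {}
for resource, start, end in log:
    busy[resource] = busy.get(resource, 0.0) + (end - start).total_seconds() / 3600

for resource, hours in busy.items():
    print(f"{resource}: utilization {hours / window_hours:.0%}, idle {window_hours - hours:.2f} h")
```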

  16. An Experimental Evaluation of Competing Age-Predictions of Future Time Perspective between Workplace and Retirement Domains.

    Science.gov (United States)

    Kerry, Matthew J; Embretson, Susan E

    2017-01-01

    Future time perspective (FTP) is defined as "perceptions of the future as being limited or open-ended" (Lang and Carstensen, 2002; p. 125). The construct figures prominently in both workplace and retirement domains, but the age predictions are competing: workplace research predicts decreasing FTP age-change; in contrast, retirement scholars predict increasing FTP age-change. For the first time, these competing predictions are pitted against each other in an experimental manipulation of subjective life expectancy (SLE). A sample of N = 207 older adults (ages 45-60) working full-time (>30 h/week) was randomly assigned to SLE questions framed as either 'Live-to' or 'Die-by' to evaluate the competing predictions for FTP. Results indicate general support for decreasing age-change in FTP, indicated by independent-sample t-tests showing lower FTP in the 'Die-by' framing condition. Further general linear model analyses were conducted to test for interaction effects of retirement planning with the experimental framings on FTP and intended retirement; while retirement planning buffered FTP's decrease, simple-effects analyses also revealed that retirement planning increased intentions for sooner retirement, whereas lack of planning increased intentions for later retirement. Discussion centers on the practical implications of our findings and the consequences for validity evidence in future empirical research on FTP in both workplace and retirement domains.

  17. Time perspective in hereditary cancer: psychometric properties of a short form of the Zimbardo Time Perspective Inventory in a community and clinical sample.

    Science.gov (United States)

    Wakefield, Claire E; Homewood, Judi; Taylor, Alan; Mahmut, Mehmet; Meiser, Bettina

    2010-10-01

    We aimed to assess the psychometric properties of a 25-item short form of the Zimbardo Time Perspective Inventory in a community sample (N = 276) and in individuals with a strong family history of cancer, considering genetic testing for cancer risk (N = 338). In the community sample, individuals with high past-negative or present-fatalistic scores had higher levels of distress, as measured by depression, anxiety, and aggression. Similarly, in the patient sample, past-negative time perspective was positively correlated with distress, uncertainty, and postdecision regret when making a decision about genetic testing. Past-negative-oriented individuals were also more likely to be undecided about, or against, genetic testing. Hedonism was associated with being less likely to read the educational materials they received at their clinic, and fatalism was associated with having lower knowledge levels about genetic testing. The assessment of time perspective in individuals at increased risk of cancer can provide valuable clinical insights. However, further investigation of the psychometric properties of the short form of this scale is warranted, as it did not meet the currently accepted criteria for psychometric validation studies.

  18. Development of a diagnostic real-time polymerase chain reaction assay for the detection of invasive Haemophilus influenzae in clinical samples.

    LENUS (Irish Health Repository)

    Meyler, Kenneth L

    2012-12-01

    Since the introduction of the Haemophilus influenzae serotype b vaccine, invasive H. influenzae disease has become dominated by nontypeable (NT) strains. Several widely used molecular diagnostic methods have been shown to lack sensitivity or specificity in the detection of some of these strains. Novel real-time assays targeting the fucK, licA, and ompP2 genes were developed and evaluated. The fucK assay detected all strains of H. influenzae tested (n = 116) and had an analytical sensitivity of 10 genome copies/polymerase chain reaction (PCR). This assay detected both serotype b and NT H. influenzae in 12 previously positive specimens (culture and/or bexA PCR) and also detected H. influenzae in a further 5 of 883 culture-negative blood and cerebrospinal fluid (CSF) samples. The fucK assay has excellent potential as a diagnostic test for detection of typeable and nontypeable strains of invasive H. influenzae in clinical samples of blood and CSF.

  19. Development of a diagnostic real-time polymerase chain reaction assay for the detection of invasive Haemophilus influenzae in clinical samples.

    Science.gov (United States)

    Meyler, Kenneth L; Meehan, Mary; Bennett, Desiree; Cunney, Robert; Cafferkey, Mary

    2012-12-01

    Since the introduction of the Haemophilus influenzae serotype b vaccine, invasive H. influenzae disease has become dominated by nontypeable (NT) strains. Several widely used molecular diagnostic methods have been shown to lack sensitivity or specificity in the detection of some of these strains. Novel real-time assays targeting the fucK, licA, and ompP2 genes were developed and evaluated. The fucK assay detected all strains of H. influenzae tested (n = 116) and had an analytical sensitivity of 10 genome copies/polymerase chain reaction (PCR). This assay detected both serotype b and NT H. influenzae in 12 previously positive specimens (culture and/or bexA PCR) and also detected H. influenzae in a further 5 of 883 culture-negative blood and cerebrospinal fluid (CSF) samples. The fucK assay has excellent potential as a diagnostic test for detection of typeable and nontypeable strains of invasive H. influenzae in clinical samples of blood and CSF.

  20. Hierarchical Bayesian modelling of gene expression time series across irregularly sampled replicates and clusters.

    Science.gov (United States)

    Hensman, James; Lawrence, Neil D; Rattray, Magnus

    2013-08-20

    Time course data from microarrays and high-throughput sequencing experiments require simple, computationally efficient and powerful statistical models to extract meaningful biological signal, and for tasks such as data fusion and clustering. Existing methodologies fail to capture either the temporal or replicated nature of the experiments, and often impose constraints on the data collection process, such as regularly spaced samples, or similar sampling schema across replications. We propose hierarchical Gaussian processes as a general model of gene expression time-series, with application to a variety of problems. In particular, we illustrate the method's capacity for missing data imputation, data fusion and clustering. The method can impute data which is missing both systematically and at random: in a hold-out test on real data, performance is significantly better than commonly used imputation methods. The method's ability to model inter- and intra-cluster variance leads to more biologically meaningful clusters. The approach removes the necessity for evenly spaced samples, an advantage illustrated on a developmental Drosophila dataset with irregular replications. The hierarchical Gaussian process model provides an excellent statistical basis for several gene-expression time-series tasks. It has only a few additional parameters over a regular GP, has negligible additional complexity, is easily implemented and can be integrated into several existing algorithms. Our experiments were implemented in python, and are available from the authors' website: http://staffwww.dcs.shef.ac.uk/people/J.Hensman/.
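
    As a rough illustration of why Gaussian processes suit irregularly sampled time series (this is an ordinary single-level GP regression, not the authors' hierarchical model), one can fit a GP to unevenly spaced points and predict on a dense grid. The kernel choice and the synthetic data below are assumptions.

```python
# Minimal illustration: GP regression on irregularly spaced time points (synthetic data).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 10, size=15))[:, None]        # irregular sampling times
y = np.sin(t).ravel() + 0.1 * rng.standard_normal(15)    # noisy "expression" values

kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, y)

t_new = np.linspace(0, 10, 50)[:, None]
mean, std = gp.predict(t_new, return_std=True)           # imputation on a dense grid
print(mean[:5].round(2), std[:5].round(2))
```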

  1. Evaluation and Prediction of the Presence of Radionuclides in Surface Wipe Samples in an Emergency Related to the Fukushima Nuclear Accident

    International Nuclear Information System (INIS)

    Zalina Laili; Muhamat Omar; Woo, Y.M.

    2011-01-01

    Surface wipe samples from aircraft and containers from Japan that were exposed to radioactive fallout from the Fukushima nuclear accident have been analysed using gamma spectrometry systems. The samples were analysed to determine their contamination levels. The surfaces of aircraft and containers might be exposed to short- and long-lived fission and activation products. Thus, careful evaluation, as well as reliable and reasonable judgement, is needed in order to determine the presence of fission and activation products. A work procedure has been developed to evaluate and predict the presence of fission and activation products in surface wipe samples. Good reference materials, skilled and experienced analysts, and a well-calibrated and validated detector system were the important factors in determining the presence of fission and activation products in surface wipe samples. (author)

  2. Reference Intervals for Urinary Cotinine Levels and the Influence of Sampling Time and Other Predictors on Its Excretion Among Italian Schoolchildren

    Directory of Open Access Journals (Sweden)

    Carmela Protano

    2018-04-01

    Full Text Available (1) Background: Environmental Tobacco Smoke (ETS) exposure remains a public health problem worldwide. The aims are to establish urinary (u-) cotinine reference values for healthy Italian children and to evaluate the role of the sampling time and of other factors in children's u-cotinine excretion. (2) Methods: A cross-sectional study was performed on 330 children. Information on participants was gathered by a questionnaire, and u-cotinine was determined in two samples for each child, collected during the evening and the next morning. (3) Results: Reference intervals (as the 2.5th and 97.5th percentiles of the distribution) in evening and morning samples were respectively 0.98–4.29 and 0.91–4.50 µg L−1 (ETS unexposed) and 1.39–16.34 and 1.49–20.95 µg L−1 (ETS exposed). No statistical differences were found between median values in evening and morning samples, in either ETS-unexposed or ETS-exposed children. Significant predictors of u-cotinine excretion were ponderal status according to the children's body mass index (β = 0.202, p-value = 0.041 for evening samples; β = 0.169, p-value = 0.039 for morning samples) and paternal educational level (β = −0.258, p-value = 0.010 for evening samples; β = −0.013, p-value = 0.003 for morning samples). (4) Conclusions: The results evidenced the need for further studies assessing the role of confounding factors in ETS exposure, and the necessity of educational interventions for smokers to raise their awareness of ETS.
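
    For readers unfamiliar with how such reference intervals are derived, the sketch below computes the 2.5th and 97.5th percentiles of a distribution of urinary cotinine values; the data are synthetic and do not reproduce the study's numbers.

```python
# Reference interval as the 2.5th and 97.5th percentiles of a measured distribution.
import numpy as np

rng = np.random.default_rng(4)
u_cotinine = rng.lognormal(mean=0.7, sigma=0.4, size=330)   # hypothetical µg/L values

low, high = np.percentile(u_cotinine, [2.5, 97.5])
print(f"reference interval: {low:.2f}-{high:.2f} µg/L")
```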

  3. Extracting Hydrologic Understanding from the Unique Space-time Sampling of the Surface Water and Ocean Topography (SWOT) Mission

    Science.gov (United States)

    Nickles, C.; Zhao, Y.; Beighley, E.; Durand, M. T.; David, C. H.; Lee, H.

    2017-12-01

    The Surface Water and Ocean Topography (SWOT) satellite mission is jointly developed by NASA and the French space agency (CNES), with participation from the Canadian and UK space agencies, to serve both the hydrology and oceanography communities. The SWOT mission will sample global surface water extents and elevations (lakes/reservoirs, rivers, estuaries, oceans, sea and land ice) at a finer spatial resolution than is currently possible, enabling hydrologic discovery, model advancements and new applications that are not currently possible or likely even conceivable. Although the mission will provide global coverage, analysis and interpolation of the data generated from the irregular space/time sampling represent a significant challenge. In this study, we explore the applicability of the unique space/time sampling for understanding river discharge dynamics throughout the Ohio River Basin. River network topology, SWOT sampling (i.e., orbit and identified SWOT river reaches) and spatial interpolation concepts are used to quantify the fraction of effective sampling of river reaches each day of the three-year mission. Streamflow statistics for SWOT-generated river discharge time series are compared to continuous daily river discharge series. Relationships are presented to transform SWOT-generated streamflow statistics into equivalent continuous daily discharge time series statistics intended to support hydrologic applications using low-flow and annual flow duration statistics.

  4. Evaluation of maternal serum alpha-foetoprotein assay using dry blood spot samples.

    Science.gov (United States)

    González, C; Guerrero, J M; Elorza, F L; Molinero, P; Goberna, R

    1988-02-01

    The quantification of alpha-foetoprotein in dry blood spots from pregnant women was evaluated, using a conventional radioimmunoassay (RIA) with a monospecific antibody. The stability of alpha-foetoprotein in dry blood spots on filter paper was evaluated with respect to mailing, distances travelled, and the existence of high summer temperatures in our region. The results obtained show that blood alpha-foetoprotein is stable on dry filter spots sent by mail and is stable for up to four weeks at 4, 25 and 37 degrees C. The analytical method used has a minimal detectable concentration of 10 +/- 1.9 international kilo-units/l. Both inter- and intra-assay variabilities are smaller than 10%, and this method can provide results comparable with those of conventional serum assays. Results from dry blood spots and serum samples (the latter analysed by both RIA and two-site enzyme immunoassay) exhibited a good correlation (r = 0.98 and r = 0.97, p less than 0.001). The design of the assay and the nature of the samples make this method suitable for screening programmes for the antenatal detection of open neural tube defects.

  5. Preliminary evaluation of the gaseous effluent sampling and monitoring systems at the 291-Z-1 and 296-Z-3 stacks

    International Nuclear Information System (INIS)

    Schwendiman, L.C.; Glissmeyer, J.A.

    1992-04-01

    The 291-Z-1 and 296-Z-3 stack effluent particulate sampling and monitoring systems are being evaluated for compliance with Atlantic Richfield Hanford Company's Interim Criteria for such systems. This evaluation is part of a study by Battelle-Northwest of gaseous effluent sampling systems in ARHCO facilities. This letter report presents a preliminary evaluation of the mentioned facilities and the improvements needed to meet the Interim Criteria so that conceptual design work for improved systems can be initiated. A detailed study is currently underway at the two stacks, including a series of sampling experiments, the findings of which are not included in this report. The gaseous effluent sampling systems at the 291-Z-1 and 296-Z-3 stacks are very dissimilar and will be treated in separate sections of this report. The discussion for each sampling system will include a brief description and a preliminary evaluation of the system.

  6. Critical points of DNA quantification by real-time PCR--effects of DNA extraction method and sample matrix on quantification of genetically modified organisms.

    Science.gov (United States)

    Cankar, Katarina; Stebih, Dejan; Dreo, Tanja; Zel, Jana; Gruden, Kristina

    2006-08-14

    evaluate the quality and performance on different matrixes and extraction techniques. The effect of PCR efficiency on the resulting GMO content is demonstrated. The crucial influence of the extraction technique and of sample matrix properties on the results of GMO quantification is demonstrated. Appropriate extraction techniques for each matrix need to be determined to achieve accurate DNA quantification. Nevertheless, as it is shown that in the area of food and feed testing a matrix with certain specificities is impossible to define, strict quality controls need to be introduced to monitor PCR. The results of our study are also applicable to other fields of quantitative testing by real-time PCR.

  7. A general theory on frequency and time-frequency analysis of irregularly sampled time series based on projection methods - Part 2: Extension to time-frequency analysis

    Science.gov (United States)

    Lenoir, Guillaume; Crucifix, Michel

    2018-03-01

    Geophysical time series are sometimes sampled irregularly along the time axis. The situation is particularly frequent in palaeoclimatology. Yet, there is so far no general framework for handling the continuous wavelet transform when the time sampling is irregular. Here we provide such a framework. To this end, we define the scalogram as the continuous-wavelet-transform equivalent of the extended Lomb-Scargle periodogram defined in Part 1 of this study (Lenoir and Crucifix, 2018). The signal being analysed is modelled as the sum of a locally periodic component in the time-frequency plane, a polynomial trend, and a background noise. The mother wavelet adopted here is the Morlet wavelet classically used in geophysical applications. The background noise model is a stationary Gaussian continuous autoregressive-moving-average (CARMA) process, which is more general than the traditional Gaussian white and red noise processes. The scalogram is smoothed by averaging over neighbouring times in order to reduce its variance. The Shannon-Nyquist exclusion zone is however defined as the area corrupted by local aliasing issues. The local amplitude in the time-frequency plane is then estimated with least-squares methods. We also derive an approximate formula linking the squared amplitude and the scalogram. Based on this property, we define a new analysis tool: the weighted smoothed scalogram, which we recommend for most analyses. The estimated signal amplitude also gives access to band and ridge filtering. Finally, we design a test of significance for the weighted smoothed scalogram against the stationary Gaussian CARMA background noise, and provide algorithms for computing confidence levels, either analytically or with Monte Carlo Markov chain methods. All the analysis tools presented in this article are available to the reader in the Python package WAVEPAL.
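
    As a hedged, minimal illustration of frequency analysis on irregularly sampled series (the periodogram idea that the scalogram above generalises, not the WAVEPAL package itself), a Lomb-Scargle periodogram can be computed with SciPy; the synthetic series and its 23-unit period are assumptions.

```python
# Lomb-Scargle periodogram of an irregularly sampled series (synthetic data).
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 100, size=200))          # irregular sampling times
y = np.sin(2 * np.pi * t / 23.0) + 0.5 * rng.standard_normal(t.size)
y -= y.mean()                                        # centre the data before analysis

periods = np.linspace(5, 60, 500)
angular_freqs = 2 * np.pi / periods
power = lombscargle(t, y, angular_freqs)

print(f"strongest period ~ {periods[power.argmax()]:.1f} (true period: 23.0)")
```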

  8. Verification of Representative Sampling in RI waste

    International Nuclear Information System (INIS)

    Ahn, Hong Joo; Song, Byung Cheul; Sohn, Se Cheul; Song, Kyu Seok; Jee, Kwang Yong; Choi, Kwang Seop

    2009-01-01

    For evaluating the radionuclide inventories of RI wastes, representative sampling is one of the most important parts of the radiochemical assay process. Sampling to characterize RI waste conditions has typically been based on judgment or convenience sampling of individual drums or groups. However, it is difficult to obtain a representative sample from among the numerous drums. In addition, RI waste drums may be classified as heterogeneous wastes because they contain cotton, glass, vinyl, gloves, etc. In order to obtain representative samples, the sample to be analyzed must be collected from every selected drum. Considering the expense and time of analysis, however, the number of samples has to be minimized. In this study, RI waste drums were classified by various conditions such as half-life, surface dose, acceptance date, waste form, generator, etc. A sample for radiochemical assay was obtained by mixing samples from each drum. The sample has to be prepared for radiochemical assay, and although the sample should be reasonably uniform, it is rare that a completely homogeneous material is received. Every sample is shredded to a diameter of 1-2 cm and a representative aliquot is taken for the required analysis. For verification of representative sampling, every classified group is tested to evaluate 'selection of a representative drum in a group' and 'representative sampling in a drum'.

  9. New product development projects evaluation under time uncertainty

    Directory of Open Access Journals (Sweden)

    Thiago Augusto de Oliveira Silva

    2009-12-01

    Full Text Available The development time is one of the key factors that contribute to the success of new product development. In spite of that, the impact of time uncertainty on development has not been fully explored as far as decision-support models for evaluating this kind of project are concerned. In this context, the objective of the present paper is to evaluate the development process of new technologies under time uncertainty. We introduce a model that captures this source of uncertainty and develop an algorithm to evaluate projects that incorporates Monte Carlo simulation and dynamic programming. The novelty in our approach is to thoroughly blend the stochastic time with a formal approach to the problem that preserves the Markov property. We base our model on the distinction between the decision epoch and the stochastic time. We discuss and illustrate the applicability of our model through an empirical example.
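
    A hypothetical sketch of the kind of valuation the abstract describes, combining Monte Carlo simulation of a stochastic completion time with backward dynamic programming over discrete review epochs, is given below. The payoff, cost, discount factor and completion-time distribution are invented and the recursion is a simplification, not the authors' model.

```python
# Hypothetical sketch: valuing a project whose completion time T is stochastic,
# via Monte Carlo draws of T and backward induction over discrete review epochs.
import numpy as np

rng = np.random.default_rng(0)

payoff = 100.0          # value received when development completes (assumed)
cost_per_period = 8.0   # cost of keeping the project alive for one review period
discount = 0.95         # per-period discount factor
horizon = 10            # number of review epochs
n_paths = 20_000

# Monte Carlo draws of the completion time, here lognormal by assumption.
completion = rng.lognormal(mean=1.3, sigma=0.5, size=n_paths)

# Backward induction: value[t] = value of a still-running project at review epoch t.
value = np.zeros(horizon + 1)
for t in range(horizon - 1, -1, -1):
    alive = completion > t
    finishes_now = (completion > t) & (completion <= t + 1)
    still_running = completion > t + 1
    p_finish = finishes_now.sum() / max(alive.sum(), 1)
    p_continue = still_running.sum() / max(alive.sum(), 1)
    continuation = -cost_per_period + discount * (p_finish * payoff + p_continue * value[t + 1])
    value[t] = max(0.0, continuation)   # option to abandon at each review epoch

print(f"Estimated project value at t=0: {value[0]:.2f}")
```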

  10. Brine Sampling and Evaluation Program: Phase 1 report

    International Nuclear Information System (INIS)

    Deal, D.E.; Case, J.B.

    1987-01-01

    This interim report presents preliminary data obtained in the course of the WIPP Brine Sampling and Evaluation Program. The investigations focus on the brine present in the near-field environment around the WIPP underground workings. Although the WIPP underground workings are considered dry, small amounts of brine are present. This amount of brine is not unexpected in rocks of marine sedimentary origin. Part of that brine can and does migrate into the repository in response to pressure gradients, at essentially isothermal conditions. These small volumes of brine have little effect on the day-to-day operations, but are pervasive throughout the repository and may contribute enough moisture over a period of years to affect resaturation and repressurization after sealing and closure. Gas bubbles are observed in many of the brine occurrences. Gas is also known to exsolve from solution as the brine is poured from container to container. 68 refs., 9 figs., 2 tabs

  11. Symbol synchronization and sampling frequency synchronization techniques in real-time DDO-OFDM systems

    Science.gov (United States)

    Chen, Ming; He, Jing; Cao, Zizheng; Tang, Jin; Chen, Lin; Wu, Xian

    2014-09-01

    In this paper, we propose and experimentally demonstrate symbol synchronization and sampling frequency synchronization techniques in a real-time direct-detection optical orthogonal frequency division multiplexing (DDO-OFDM) system, over 100-km standard single mode fiber (SSMF), using a cost-effective directly modulated distributed feedback (DFB) laser. The experimental results show that the proposed symbol synchronization based on a training sequence (TS) has low complexity and high accuracy even at a sampling frequency offset (SFO) of 5000 ppm. Meanwhile, the proposed pilot-assisted sampling frequency synchronization between the digital-to-analog converter (DAC) and analog-to-digital converter (ADC) is capable of estimating SFOs accurately; the technique can also compensate SFO effects within a small residual SFO caused by deviations in the SFO estimation and a low-precision or unstable clock source. The two synchronization techniques are suitable for high-speed DDO-OFDM transmission systems.

  12. How to Make Evaluation Time Stress-Free!

    Science.gov (United States)

    Peterson, Kristy

    2010-01-01

    When it's annual review time, are you frustrated or overwhelmed by the task of remembering an employee's performance throughout the year? Do you ever feel that if you took notes on each task set within the year, it would be easier for you to summarize the evaluation? There is a way to make everyone's life a little easier when giving or receiving…

  13. Commutability of food microbiology proficiency testing samples.

    Science.gov (United States)

    Abdelmassih, M; Polet, M; Goffaux, M-J; Planchon, V; Dierick, K; Mahillon, J

    2014-03-01

    Food microbiology proficiency testing (PT) is a useful tool to assess the analytical performances among laboratories. PT items should be close to routine samples to accurately evaluate the acceptability of the methods. However, most PT providers distribute exclusively artificial samples such as reference materials or irradiated foods. This raises the issue of the suitability of these samples, because the equivalence, or 'commutability', between results obtained on artificial vs. authentic food samples has not been demonstrated. In the clinical field, the use of noncommutable PT samples has led to erroneous evaluation of performances when different analytical methods were used. This study aimed to provide a first assessment of the commutability of samples distributed in food microbiology PT. REQUASUD and IPH organized 13 food microbiology PTs including 10-28 participants. Three types of PT items were used: genuine food samples, sterile food samples and reference materials. The commutability of the artificial samples (reference materials or sterile samples) was assessed by plotting the distribution of the results on natural and artificial PT samples. This comparison highlighted matrix-correlated issues when non-food matrices, such as reference materials, were used. Artificially inoculated food samples, on the other hand, raised only isolated commutability issues. In the organization of a PT scheme, authentic or artificially inoculated food samples are necessary to accurately evaluate the analytical performances. Reference materials, used as PT items because of their convenience, may present commutability issues leading to inaccurate, penalizing conclusions for methods that would have provided accurate results on food samples. For the first time, the commutability of food microbiology PT samples was investigated. The nature of the samples provided by the organizer turned out to be an important factor, because matrix effects can impact the analytical results.

  14. Evaluation of Rheological Properties and Swelling Behaviour of Sonicated Scleroglucan Samples

    Directory of Open Access Journals (Sweden)

    Siddique Akber Ansari

    2012-02-01

    Full Text Available Scleroglucan is a natural polysaccharide that has been proposed for various applications. However, there has been no investigation of how its properties vary when the molecular weight of this polymer is reduced. Scleroglucan was sonicated at two different polymer concentrations for different periods of time, and the effect of sonication was investigated with respect to molecular weight variations and rheological properties. The molar mass, estimated by viscometric measurements, was drastically reduced after sonication for only a few minutes. Sonicated samples were used for the preparation of gels in the presence of borate ions. The effect of borax on the new samples was investigated by recording the mechanical spectra and the flow curves. A comparison with the system prepared with the dialysed polymer was also carried out. The anisotropic elongation, observed with tablets of scleroglucan and borax, was remarkably reduced when the sonicated samples were used for the preparation of the gels.

  15. In-situ high resolution particle sampling by large time sequence inertial spectrometry

    International Nuclear Information System (INIS)

    Prodi, V.; Belosi, F.

    1990-09-01

    In situ sampling is always preferred, when possible, because of the artifacts that can arise when the aerosol has to flow through long sampling lines. On the other hand, the amount of possible losses can be calculated with some confidence only when the size distribution can be measured with sufficient precision and the losses are not too large. This makes it desirable to sample directly in the vicinity of the aerosol source or containment. High-temperature sampling devices with a detailed aerodynamic separation are extremely useful for this purpose. Several measurements are possible with the inertial spectrometer (INSPEC), but not with cascade impactors or cyclones. INSPEC - INertial SPECtrometer - has been conceived to measure the size distribution of aerosols by separating the particles while airborne according to their size and collecting them on a filter. It consists of a channel of rectangular cross-section with a 90 degree bend. Clean air is drawn through the channel, with a thin aerosol sheath injected close to the inner wall. Due to the bend, the particles are separated according to their size, leaving the original streamline by a distance which is a function of particle inertia and resistance, i.e. of aerodynamic diameter. The filter collects all the particles of the same aerodynamic size at the same distance from the inlet, in a continuous distribution. INSPEC particle separation at high temperature (up to 800 °C) has been tested with zirconia particles as calibration aerosols. The feasibility study was concerned with resolution and time-sequence sampling capabilities at high temperature (700 °C).

  16. Verification and Performance Evaluation of Timed Game Strategies

    DEFF Research Database (Denmark)

    David, Alexandre; Fang, Huixing; Larsen, Kim Guldstrand

    2014-01-01

    Control synthesis techniques, based on timed games, derive strategies to ensure a given control objective, e.g., time-bounded reachability. Model checking verifies correctness properties of systems. Statistical model checking can be used to analyse performance aspects of systems, e.g., energy consumption. In this work, we propose to combine these three techniques. In particular, given a strategy synthesized for a timed game and a given control objective, we want to make a deeper examination of the consequences of adopting this strategy. Firstly, we want to apply model checking to the timed game under the synthesized strategy in order to verify additional correctness properties. Secondly, we want to apply statistical model checking to evaluate various performance aspects of the synthesized strategy. For this, the underlying timed game is extended with relevant price and stochastic information...

  17. Evaluation of dyspnoea in a sample of elderly subjects recruited from general practice

    DEFF Research Database (Denmark)

    Pedersen, F; Mehlsen, J; Raymond, I

    2007-01-01

    The objectives of this study were to investigate the cause of dyspnoea in a sample of elderly individuals and to assess the diagnostic yield of a three-step examination algorithm for the evaluation of dyspnoea paired with a cost analysis. A total of 152 subjects were examined. A predefined diagno...

  18. Clinical evaluation of Statstrip(R) Lactate for use in fetal scalp blood sampling

    NARCIS (Netherlands)

    Heinis, A.M.F.; Dillen, J. van; Oosting, J.D.; Rhose, S.; Vandenbussche, F.P.; Drongelen, J. van

    2017-01-01

    INTRODUCTION: Point-of-care testing of fetal scalp blood lactate is used as an alternative to pH analysis in fetal scalp blood sampling (FBS) during labor. Lactate measurements are not standardized and values vary with each device used. The aim of this study was to evaluate StatStrip(R) Lactate

  19. An evaluation of soil sampling for 137Cs using various field-sampling volumes.

    Science.gov (United States)

    Nyhan, J W; White, G C; Schofield, T G; Trujillo, G

    1983-05-01

    The sediments from a liquid effluent receiving area at the Los Alamos National Laboratory and soils from an intensive study area in the fallout pathway of Trinity were sampled for 137Cs using 25-, 500-, 2500- and 12,500-cm3 field sampling volumes. A highly replicated sampling program was used to determine mean concentrations and inventories of 137Cs at each site, as well as estimates of spatial, aliquoting, and counting variance components of the radionuclide data. The sampling methods were also analyzed as a function of soil size fractions collected in each field sampling volume and of the total cost of the program for a given variation in the radionuclide survey results. Coefficients of variation (CV) of 137Cs inventory estimates ranged from 0.063 to 0.14 for Mortandad Canyon sediments, whereas CV values for Trinity soils were observed from 0.38 to 0.57. Spatial variance components of 137Cs concentration data were usually found to be larger than either the aliquoting or counting variance estimates and were inversely related to field sampling volume at the Trinity intensive site. Subsequent optimization studies of the sampling schemes demonstrated that each aliquot should be counted once, and that only 2-4 aliquots out of as many as 30 collected need be assayed for 137Cs. The optimization studies showed that as sample costs increased to 45 man-hours of labor per sample, the variance of the mean 137Cs concentration decreased dramatically, but decreased very little with additional labor.
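
    The conclusion that extra aliquots and repeat counts add little compared with extra field samples follows from a nested variance-component argument; the sketch below uses an illustrative formula and invented variance components, not the study's estimates.

```python
# Hedged sketch of the variance-component reasoning behind optimizing aliquot and
# count numbers in a nested sampling design. All numbers are assumptions.
def var_of_mean(n_locations, n_aliquots, n_counts,
                s2_spatial=0.30, s2_aliquot=0.05, s2_count=0.02):
    # Var(mean) ~ s_spatial^2/n + s_aliquot^2/(n*a) + s_count^2/(n*a*c)
    return (s2_spatial / n_locations
            + s2_aliquot / (n_locations * n_aliquots)
            + s2_count / (n_locations * n_aliquots * n_counts))

# Adding aliquots or repeat counts helps far less than adding field locations,
# which is consistent with recommending few aliquots and a single count per aliquot.
for a, c in [(30, 3), (4, 1), (2, 1)]:
    print(f"aliquots={a}, counts={c}: Var(mean)={var_of_mean(10, a, c):.4f}")
```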

  20. Implementing BosonSampling with time-bin encoding: Analysis of loss, mode mismatch, and time jitter

    Science.gov (United States)

    Motes, Keith R.; Dowling, Jonathan P.; Gilchrist, Alexei; Rohde, Peter P.

    2015-11-01

    It was recently shown by Motes, Gilchrist, Dowling, and Rohde [Phys. Rev. Lett. 113, 120501 (2014), 10.1103/PhysRevLett.113.120501] that a time-bin encoded fiber-loop architecture can implement an arbitrary passive linear optics transformation. This was shown in the case of an ideal scheme whereby the architecture has no sources of error. In any realistic implementation, however, physical errors are present, which corrupt the output of the transformation. We investigate the dominant sources of error in this architecture—loss and mode mismatch—and consider how they affect the BosonSampling protocol, a key application for passive linear optics. For our loss analysis we consider two major components that contribute to loss—fiber and switches—and calculate how this affects the success probability and fidelity of the device. Interestingly, we find that errors due to loss are not uniform (unique to time-bin encoding), which asymmetrically biases the implemented unitary. Thus loss necessarily limits the class of unitaries that may be implemented, and therefore future implementations must prioritize minimizing loss rates if arbitrary unitaries are to be implemented. Our formalism for mode mismatch is generalized to account for various phenomena that may cause mode mismatch, but we focus on two—errors in fiber-loop lengths and time jitter of the photon source. These results provide a guideline for how well future experimental implementations might perform in light of these error mechanisms.

  1. Phase II of a Six sigma Initiative to Study DWPF SME Analytical Turnaround Times: SRNL's Evaluation of Carbonate-Based Dissolution Methods

    International Nuclear Information System (INIS)

    Edwards, Thomas

    2005-01-01

    The Analytical Development Section (ADS) and the Statistical Consulting Section (SCS) of the Savannah River National Laboratory (SRNL) are participating in a Six Sigma initiative to improve the Defense Waste Processing Facility (DWPF) Laboratory. The Six Sigma initiative has focused on reducing the analytical turnaround time of samples from the Slurry Mix Evaporator (SME) by developing streamlined sampling and analytical methods [1]. The objective of Phase I was to evaluate the sub-sampling of a larger sample bottle and the performance of a cesium carbonate (Cs2CO3) digestion method. Successful implementation of the Cs2CO3 fusion method in the DWPF would have important time savings and convenience benefits because this single digestion would replace the dual digestion scheme now used. A single digestion scheme would result in more efficient operations in both the DWPF shielded cells and the inductively coupled plasma-atomic emission spectroscopy (ICP-AES) laboratory. By taking a small aliquot of SME slurry from a large sample bottle and dissolving the vitrified SME sample with carbonate fusion methods, an analytical turnaround time reduction from 27 hours to 9 hours could be realized in the DWPF. This analytical scheme has the potential for not only dramatically reducing turnaround times, but also streamlining operations to minimize wear and tear on critical shielded cell components that are prone to fail, including the Hydragard(trademark) sampling valves and manipulators. Favorable results from the Phase I tests [2] led to the recommendation for a Phase II effort as outlined in the DWPF Technical Task Request (TTR) [3]. There were three major tasks outlined in the TTR, and SRNL issued a Task Technical and QA Plan [4] with a corresponding set of three major task activities: (1) Compare weight percent (wt%) total solids measurements of large volume samples versus peanut vial samples. (2) Evaluate Cs2CO3 and K2CO3 fusion methods using DWPF simulated

  2. Integral ceramic superstructure evaluation using time domain optical coherence tomography

    Science.gov (United States)

    Sinescu, Cosmin; Bradu, Adrian; Topala, Florin I.; Negrutiu, Meda Lavinia; Duma, Virgil-Florin; Podoleanu, Adrian G.

    2014-02-01

    Optical Coherence Tomography (OCT) is a non-invasive low coherence interferometry technique that includes several technologies (and the corresponding devices and components), such as illumination and detection, interferometry, scanning, adaptive optics, microscopy and endoscopy. From its large area of applications, we consider in this paper a critical aspect in dentistry, to be investigated with a Time Domain (TD) OCT system. The clinical situation of an edentulous mandible is considered; it can be solved by inserting 2 to 6 implants. On these implants a mesostructure is manufactured, and on it a superstructure is needed. This superstructure can be integral ceramic; in this case material defects could be trapped inside the ceramic layers and those defects could lead to fractures of the entire superstructure. In this paper we demonstrate that a TD-OCT imaging system has the potential to properly evaluate the presence of defects inside the ceramic layers, and that those defects can be fixed before inserting the prosthesis into the oral cavity. Three integral ceramic superstructures were developed using CAD/CAM technology. After the milling, the ceramic layers were applied on the core. All three samples were evaluated by a TD-OCT system working at 1300 nm. For two of the superstructures evaluated, no defects were found in the most stressed areas. The third superstructure presented four ceramic defects in the mentioned areas. Because of those defects the superstructure may fracture. The integral ceramic prosthesis was sent back to the dental laboratory to fix the problems related to the material defects found. Thus, TD-OCT proved to be a valuable method for diagnosing ceramic defects inside integral ceramic superstructures in order to prevent fractures at this level.

  3. Exponential synchronization of chaotic Lur'e systems with time-varying delay via sampled-data control

    International Nuclear Information System (INIS)

    Rakkiyappan, R.; Sivasamy, R.; Lakshmanan, S.

    2014-01-01

    In this paper, we study the exponential synchronization of chaotic Lur'e systems with time-varying delays via sampled-data control using sector nonlinearities. In order to make full use of information about sampling intervals and interval time-varying delays, new Lyapunov-Krasovskii functionals with triple integral terms are introduced. Based on the convex combination technique, two kinds of synchronization criteria are derived in terms of linear matrix inequalities, which can be efficiently solved via standard numerical software. Finally, three numerical examples are provided to demonstrate the reduced conservatism and the effectiveness of the proposed results.

  4. Evaluation of setting time and flow properties of self-synthesize alginate impressions

    Science.gov (United States)

    Halim, Calista; Cahyanto, Arief; Sriwidodo, Harsatiningsih, Zulia

    2018-02-01

    Alginate is an elastic hydrocolloid dental impression material used to obtain negative reproductions of the oral mucosa, such as records of soft-tissue and occlusal relationships. The aim of the present study was to synthesize alginate and to determine its setting time and flow properties. Five groups of alginate were tested, comprising fifty samples of self-synthesized alginate and a commercial alginate impression product. The fifty samples were divided between two tests, twenty-five each for the setting-time test and the flow test. Setting time was recorded in seconds, while flow was recorded in mm2. The fastest setting time was found in group three (148.8 s) and the slowest in group four. The highest flow result was found in group three (69.70 mm2) and the lowest in group one (58.34 mm2). Results were analyzed statistically by one-way ANOVA (α = 0.05), which showed a statistically significant difference in setting time but no statistically significant difference in flow properties between the self-synthesized alginates and the commercial alginate impression product. In conclusion, the alginate impression material was successfully self-synthesized, and variations in composition influence setting time and flow properties. The group whose setting time most closely resembles the control group is group three, and the group whose flow most closely resembles the control group is group four.

  5. Evaluation of physical sampling efficiency for cyclone-based personal bioaerosol samplers in moving air environments.

    Science.gov (United States)

    Su, Wei-Chung; Tolchinsky, Alexander D; Chen, Bean T; Sigaev, Vladimir I; Cheng, Yung Sung

    2012-09-01

    The need to determine occupational exposure to bioaerosols has notably increased in the past decade, especially for microbiology-related workplaces and laboratories. Recently, two new cyclone-based personal bioaerosol samplers were developed by the National Institute for Occupational Safety and Health (NIOSH) in the USA and the Research Center for Toxicology and Hygienic Regulation of Biopreparations (RCT & HRB) in Russia to monitor bioaerosol exposure in the workplace. Here, a series of wind tunnel experiments were carried out to evaluate the physical sampling performance of these two samplers in moving air conditions, which could provide information for personal biological monitoring in a moving air environment. The experiments were conducted in a small wind tunnel facility using three wind speeds (0.5, 1.0 and 2.0 m s(-1)) and three sampling orientations (0°, 90°, and 180°) with respect to the wind direction. Monodispersed particles ranging from 0.5 to 10 μm were employed as the test aerosols. The evaluation of the physical sampling performance was focused on the aspiration efficiency and capture efficiency of the two samplers. The test results showed that the orientation-averaged aspiration efficiencies of the two samplers closely agreed with the American Conference of Governmental Industrial Hygienists (ACGIH) inhalable convention within the particle sizes used in the evaluation tests, and the effect of the wind speed on the aspiration efficiency was found to be negligible. The capture efficiencies of these two samplers ranged from 70% to 80%. These data offer important insight into the physical sampling characteristics of the two test samplers.

  6. [Acceptance of lot sampling: its applicability to the evaluation of the primary care services portfolio].

    Science.gov (United States)

    López-Picazo Ferrer, J

    2001-05-15

    To determine the applicability of lot quality assurance sampling (LQAS) acceptance to the primary care services portfolio, comparing its results with those given by the classic evaluation. Compliance with the minimum technical norms (MTN) of the diabetic care service was evaluated through the classic methodology (confidence 95%, accuracy 5%, representativeness of the area, sample of 376 histories) and by LQAS (confidence 95%, power 80%, representativeness of the primary care team (PCT), defining a lot by MTN and PCT, sample of 13 histories/PCT). Effort, the information obtained and its operative nature were assessed. The setting was 44 PCTs from the Murcia Primary Care Region. Classic methodology: compliance with MTN ranged between 91.1% (diagnosis; 95% CI, 84.2-94.0) and 30% (visceral repercussion; 95% CI, 25.4-34.6). Objectives were reached for three MTN (diagnosis, history and EKG). LQAS: no MTN was accepted in all the PCTs; the most widely accepted was accepted in 42 PCTs (95.6%) and the least accepted in 24 PCTs (55.6%). In 9 PCTs all were accepted (20.4%), and in 2 none were accepted (4.5%). Data were analysed through Pareto charts. The classic methodology offered accurate results but did not identify which centres were non-compliant (general focus). LQAS was preferable for evaluating MTN, and probably coverage, because: 1) it uses small samples, which foster internal quality-improvement initiatives; 2) it is easy and rapid to execute; 3) it identifies the PCTs and criteria where there is an opportunity for improvement (specific focus); and 4) it can be used operatively for monitoring.
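
    A hedged sketch of the LQAS decision-rule idea follows: for a sample of n records per lot, find the smallest acceptance threshold such that good lots are accepted with roughly 95% confidence while bad lots are rarely accepted (80% power). The compliance thresholds below are assumptions, not values from the study.

```python
# Hypothetical LQAS decision rule: a lot of 13 sampled histories is "accepted" if at
# least d of them comply with the norm. The quality thresholds are invented.
from scipy.stats import binom

def lqas_threshold(n, p_good, p_bad, alpha=0.05, beta=0.20):
    for d in range(n + 1):
        accept_good = 1 - binom.cdf(d - 1, n, p_good)   # P(accept | good lot)
        accept_bad = 1 - binom.cdf(d - 1, n, p_bad)     # P(accept | bad lot)
        if accept_good >= 1 - alpha and accept_bad <= beta:
            return d, accept_good, accept_bad
    return None

# e.g. n=13 histories per PCT, "good" compliance 90%, "bad" compliance 50%
print(lqas_threshold(n=13, p_good=0.90, p_bad=0.50))
```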

  7. Evaluation of monoclonal antibody-based sandwich direct ELISA (MSD-ELISA) for antigen detection of foot-and-mouth disease virus using clinical samples.

    Directory of Open Access Journals (Sweden)

    Kazuki Morioka

    Full Text Available A monoclonal antibody-based sandwich direct ELISA (MSD-ELISA) method was previously developed for foot-and-mouth disease (FMD) viral antigen detection. Here we evaluated the sensitivity and specificity of two FMD viral antigen detection MSD-ELISAs and compared them with the conventional indirect sandwich (IS) ELISA. The MSD-ELISAs were able to detect the antigen in saliva samples of experimentally infected pigs for a longer term compared to the IS-ELISA. We also used 178 RT-PCR-positive field samples from cattle and pigs affected by the 2010 type-O FMD outbreak in Japan, and we found that the sensitivities of both MSD-ELISAs were about 7 times higher than that of the IS-ELISA against each sample (P<0.01). In terms of the FMD-positive farm detection rate, the sensitivities of the MSD-ELISAs were about 6 times higher than that of the IS-ELISA against each farm (P<0.01). Although further validation studies using other virus strains are necessary, the MSD-ELISAs could be appropriate as a method to replace the IS-ELISA for FMD antigen detection.

  8. One Sample, One Shot - Evaluation of sample preparation protocols for the mass spectrometric proteome analysis of human bile fluid without extensive fractionation.

    Science.gov (United States)

    Megger, Dominik A; Padden, Juliet; Rosowski, Kristin; Uszkoreit, Julian; Bracht, Thilo; Eisenacher, Martin; Gerges, Christian; Neuhaus, Horst; Schumacher, Brigitte; Schlaak, Jörg F; Sitek, Barbara

    2017-02-10

    The proteome analysis of bile fluid represents a promising strategy to identify biomarker candidates for various diseases of the hepatobiliary system. However, to obtain substantive results in biomarker discovery studies, large patient cohorts need to be analyzed. Consequently, this would lead to an unmanageable number of samples if sample preparation protocols with extensive fractionation methods were applied. Hence, the performance of simple workflows allowing for "one sample, one shot" experiments has been evaluated in this study. In detail, sixteen different protocols involving modifications at the stages of desalting, delipidation, deglycosylation and tryptic digestion have been examined. Each method has been individually evaluated regarding various performance criteria, and comparative analyses have been conducted to uncover possible complementarities. Here, the best performance in terms of proteome coverage was obtained for a combination of acetone precipitation with in-gel digestion. Finally, a mapping of all obtained protein identifications with putative biomarkers for hepatocellular carcinoma (HCC) and cholangiocellular carcinoma (CCC) revealed several proteins easily detectable in bile fluid. These results can build the basis for future studies with large and well-defined patient cohorts in a more disease-related context. Human bile fluid is a proximal body fluid and is supposed to be a potential source of disease markers. However, due to its biochemical composition, the proteome analysis of bile fluid still represents a challenging task and is therefore mostly conducted using extensive fractionation procedures. This in turn leads to a high number of mass spectrometric measurements for one biological sample. Considering that a high number of biological samples needs to be analyzed in biomarker discovery studies in order to overcome the biological variability, this leads to the dilemma of an unmanageable number of

  9. Evaluation of Glucose-6-Phosphate Dehydrogenase stability in stored blood samples.

    Science.gov (United States)

    Jalil, Norunaluwar; Azma, Raja Zahratul; Mohamed, Emida; Ithnin, Azlin; Alauddin, Hafiza; Baya, Siti Noor; Othman, Ainoon

    2016-01-01

    Glucose-6-Phosphate Dehydrogenase (G6PD) deficiency is the commonest cause of neonatal jaundice in Malaysia. Recently, the OSMMR2000-D G6PD Assay Kit has been introduced to quantitate the level of G6PD activity in newborns delivered in Universiti Kebangsaan Malaysia Medical Centre (UKMMC). As the duration of sample storage prior to analysis is a matter of concern, this study was conducted to assess the stability of the G6PD enzyme during storage. A total of 188 cord blood samples from normal term newborns delivered at UKMMC were selected for this study. The cord blood samples were collected in ethylene-diamine-tetra-acetic acid (EDTA) tubes and refrigerated at 2-8 °C. In addition, 32 of the 188 cord blood samples were spotted on chromatography paper, air-dried and stored at room temperature. G6PD enzyme activities were measured daily for 7 days using the OSMMR2000-D G6PD Assay Kit on both the EDTA blood and dried blood samples. The mean G6PD activity was compared between days of analysis using the paired Student's t-test. In this study, 172 of 188 cord blood samples showed normal enzyme levels while 16 had levels corresponding to severe enzyme deficiency. The daily mean G6PD activity for EDTA blood samples of newborns with normal G6PD activity showed a significant drop by the fourth day of storage, whereas in samples with severely deficient G6PD activity a significant drop was seen by the third day of storage (p = 0.002). Analysis of dried cord blood showed a significant reduction in enzyme activity as early as the second day of storage (p = 0.001). It was also noted that mean G6PD activity for spotted blood samples was lower than that for samples in EDTA tubes on all days (p = 0.001). Thus, EDTA blood samples stored at 2-8 °C appeared to have better stability in terms of their G6PD enzyme level as compared to dried blood samples on filter paper, giving a storage time of up to 3 days.

  10. Analysis and evaluation of compounds from Cichorium intybus aromatic water trade market samples

    Directory of Open Access Journals (Sweden)

    A. Hosseini*

    2017-11-01

    Full Text Available Background and objectives: Cichorium intybus products are among the best sellers in the market because of their effects on the treatment of infection, poisoning, diabetes and allergy. This is the first study of the phytochemical compounds of Cichorium intybus market samples, and the aim of this study was to define a method to recognize the original products. Methods: The sample compounds were extracted by a liquid-liquid method, evaluated by GC-MS and compared with references such as Adams 2007. The obtained phytochemical data were analyzed with SPSS, classified by the dendrogram method and compared with the data obtained from the standard sample. Results: Forty-one compounds were detected. Carvacrol was present in all samples, from 1.14 to 39.34%. Thymol was also present in most samples, from 1.24 to 69.32%. Moreover, some compounds such as pulegone, carvone, carvacrol and piperitenone could be detected in all samples, mostly at different percentages. Some linear hydrocarbons were detected by this method, along with other unexpected compounds such as cinnamaldehyde. Conclusion: The presence of impurity compounds such as pulegone, carvone, piperitenone and cinnamaldehyde in trade samples suggests that cleaning of the containers might not have been proper. Carvacrol and thymol are common compounds that could define an acceptable standard for Cichorium intybus aromatic water.

  11. Evaluation of sample pretreatment methods for analysis of polonium isotopes in herbal medicines

    International Nuclear Information System (INIS)

    Sreejith, Sathyapriya R.; Nair, Madhu G.; Rao, D.D.

    2014-01-01

    Herbal infusions such as ayurvedic aristas are widely consumed by the Indian population for good health. With increasing awareness about radiological assessment, an effort was made to assess the radioactivity concentration of naturally occurring radionuclides in herbal medicines. 210Po is an important alpha-particle emitter contributing to the internal dose to man from ingestion. Though 210Po can be spontaneously deposited on a silver disk for alpha spectrometric measurements with few radiochemical steps, great care has to be taken during the sample pretreatment step owing to the high volatility of polonium even at low temperatures. The aim of the study was to evaluate an appropriate sample pretreatment method for the estimation of polonium in herbal medicines. 209Po was used for radiochemical yield calculation. Conventional open-vessel wet ashing, physical evaporation, freeze-drying and microwave digestion in a Teflon vessel were examined. The recovery ranged between 9 and 79%. The lowest recovery was obtained for the samples that were processed by open-vessel digestion without any volume reduction. The recoveries were comparable for samples that were freeze-dried and subjected to HNO3 + HClO4 + H2O2 + HF acid digestion and for microwave-digested samples. The 210Po concentration in the samples ranged from 11.3 to 39.6 mBq/L.

  12. A psychometric validation analysis of Eysenck’s Neuroticism and Extraversion Scales in a sample of first time depressed patients

    DEFF Research Database (Denmark)

    Møller, Stine Bjerrum; Bech, Per; Kessing, Lars Vedel

    2015-01-01

    Eysenck and Eysenck identified the two-factor structure of personality, namely neuroticism and extraversion, which has been widely used in clinical psychiatry and has generated much research on the psychometric properties of the scales. Using a classical psychometric approach, the neuroticism and extraversion scales have shown robust psychometric properties. The present study used both classical psychometric and item response theory (IRT) analyses to evaluate the neuroticism and extraversion scales and to improve the scalability of the neuroticism and extraversion instruments. A first-time depressed sample ... symptoms related to interpersonal sensitivity were identified. For the extraversion scale, a shorter and psychometrically more robust version was identified together with a short introversion scale. Clinical discriminant validity was analysed using correlations. The correlation between depression (Ham...

  13. Replicability of time-varying connectivity patterns in large resting state fMRI samples.

    Science.gov (United States)

    Abrol, Anees; Damaraju, Eswar; Miller, Robyn L; Stephen, Julia M; Claus, Eric D; Mayer, Andrew R; Calhoun, Vince D

    2017-12-01

    The past few years have seen an emergence of approaches that leverage temporal changes in whole-brain patterns of functional connectivity (the chronnectome). In this chronnectome study, we investigate the replicability of the human brain's inter-regional coupling dynamics during rest by evaluating two different dynamic functional network connectivity (dFNC) analysis frameworks using 7,500 functional magnetic resonance imaging (fMRI) datasets. To quantify the extent to which the emergent functional connectivity (FC) patterns are reproducible, we characterize the temporal dynamics by deriving several summary measures across multiple large, independent age-matched samples. Reproducibility was demonstrated through the existence of basic connectivity patterns (FC states) amidst an ensemble of inter-regional connections. Furthermore, application of the methods to conservatively configured (statistically stationary, linear and Gaussian) surrogate datasets revealed that some of the studied state summary measures were indeed statistically significant and also suggested that this class of null model did not explain the fMRI data fully. This extensive testing of reproducibility of similarity statistics also suggests that the estimated FC states are robust against variation in data quality, analysis, grouping, and decomposition methods. We conclude that future investigations probing the functional and neurophysiological relevance of time-varying connectivity assume critical importance.
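
    A common dFNC workflow consistent with the description above (though not necessarily the authors' exact pipeline) is sliding-window correlation followed by k-means clustering into FC states; the sketch below uses synthetic signals and assumed window, stride and cluster numbers.

```python
# Hedged sketch of a sliding-window dFNC analysis: windowed correlation matrices are
# vectorized and clustered into "FC states"; occupancy summarises time spent per state.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
n_timepoints, n_regions, win = 300, 10, 40
ts = rng.standard_normal((n_timepoints, n_regions))        # synthetic regional signals

windows = []
for start in range(0, n_timepoints - win, 5):               # stride of 5 time points
    c = np.corrcoef(ts[start:start + win].T)
    windows.append(c[np.triu_indices(n_regions, k=1)])      # keep upper triangle
windows = np.array(windows)

states = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(windows)
occupancy = np.bincount(states) / states.size               # fraction of time per state
print(occupancy.round(2))
```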

  14. Comparison of Time-of-flight and Multicollector ICP Mass Spectrometers for Measuring Actinides in Small Samples using single shot Laser Ablation

    International Nuclear Information System (INIS)

    R.S. Houk; D.B. Aeschliman; S.J. Bajic; D. Baldwin

    2005-01-01

    The objective of these experiments is to evaluate the performance of two types of ICP-MS device for measurement of actinide isotopes by laser ablation (LA) ICP-MS. The key advantage of ICP-MS compared to monitoring of radioactive decay is that the element need not decay during the measurement time. Hence ICP-MS is much faster for long-lived radionuclides. The LA process yields a transient signal. When spatially resolved analysis is required for small samples, the laser ablation sample pulse lasts only ∼10 seconds. It is difficult to measure signals at several isotopes with analyzers that are scanned for such a short sample transient. In this work, a time-of-flight (TOF) ICP-MS device, the GBC Optimass 8000 (Figure 1) is one instrument used. Strictly speaking, ions at different m/z values are not measured simultaneously in TOF. However, they are measured in very rapid sequence with little or no compromise between the number of m/z values monitored and the performance. Ions can be measured throughout the m/z range in single sample transients by TOF. The other ICP-MS instrument used is a magnetic sector multicollector MS, the NU Plasma 1700 (Figure 2). Up to 8 adjacent m/z values can be monitored at one setting of the magnetic field and accelerating voltage. Three of these m/z values can be measured with an electron multiplier. This device is usually used for high precision isotope ratio measurements with the Faraday cup detectors. The electron multipliers have much higher sensitivity. In our experience with the scanning magnetic sector instrument in Ames, these devices have the highest sensitivity and lowest background of any ICP-MS device. The ability to monitor several ions simultaneously, or nearly so, should make these devices valuable for the intended application: measurement of actinide isotopes at low concentrations in very small samples for nonproliferation purposes. The primary sample analyzed was an urban dust pellet reference material, NIST 1648. The

  15. A 'smart' tube holder enables real-time sample monitoring in a standard lab centrifuge.

    Science.gov (United States)

    Hoang, Tony; Moskwa, Nicholas; Halvorsen, Ken

    2018-01-01

    The centrifuge is among the oldest and most widely used pieces of laboratory equipment, with significant applications that include clinical diagnostics and biomedical research. A major limitation of laboratory centrifuges is their "black box" nature, limiting sample observation to before and after centrifugation. Thus, optimized protocols require significant trial and error, while unoptimized protocols waste time by centrifuging longer than necessary or material due to incomplete sedimentation. Here, we developed an instrumented centrifuge tube receptacle compatible with several commercial benchtop centrifuges that can provide real-time sample analysis during centrifugation. We demonstrated the system by monitoring cell separations during centrifugation for different spin speeds, concentrations, buffers, cell types, and temperatures. We show that the collected data are valuable for analytical purposes (e.g. quality control), or as feedback to the user or the instrument. For the latter, we verified an adaptation where complete sedimentation turned off the centrifuge and notified the user by a text message. Our system adds new functionality to existing laboratory centrifuges, saving users time and providing useful feedback. This add-on potentially enables new analytical applications for an instrument that has remained largely unchanged for decades.
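
    The feedback adaptation described above amounts to watching the optical signal for a plateau and acting once sedimentation is complete. A minimal sketch follows, in which read_turbidity() and notify_user() are hypothetical placeholders for the device's sensor read-out and messaging hook, and the threshold and timing values are arbitrary assumptions rather than parameters from the record.

        import time

        def wait_for_sedimentation(read_turbidity, notify_user,
                                   threshold=0.002, stable_reads=10, period_s=1.0):
            # Declare sedimentation complete once the change per reading stays
            # below `threshold` for `stable_reads` consecutive readings.
            last, stable = read_turbidity(), 0
            while stable < stable_reads:
                time.sleep(period_s)
                current = read_turbidity()
                stable = stable + 1 if abs(current - last) < threshold else 0
                last = current
            notify_user("Sedimentation complete - centrifugation can be stopped.")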

  16. Real-time photonic sampling with improved signal-to-noise and distortion ratio using polarization-dependent modulators

    Science.gov (United States)

    Liang, Dong; Zhang, Zhiyao; Liu, Yong; Li, Xiaojun; Jiang, Wei; Tan, Qinggui

    2018-04-01

    A real-time photonic sampling structure with effective nonlinearity suppression and excellent signal-to-noise ratio (SNR) performance is proposed. The key points of this scheme are the polarization-dependent modulators (P-DMZMs) and the Sagnac loop structure. Thanks to the polarization-sensitive characteristic of P-DMZMs, the differences between the transfer functions of the fundamental signal and the distortion become visible. Meanwhile, the selection of specific biases in P-DMZMs helps to achieve a preferable linearized performance with a low noise level for real-time photonic sampling. Compared with the quadrature-biased scheme, the proposed scheme is capable of effective nonlinearity suppression and is able to provide a better SNR performance even over a large frequency range. The proposed scheme is shown to be effective and easily implemented for real-time photonic applications.

  17. Usefulness of in-house real time PCR for HBV DNA quantification in serum and oral fluid samples.

    Science.gov (United States)

    Portilho, Moyra Machado; Mendonça, Ana Carolina da Fonseca; Bezerra, Cristianne Sousa; do Espirito-Santo, Márcia Paschoal; de Paula, Vanessa Salete; Nabuco, Leticia Cancella; Villela-Nogueira, Cristiane Alves; Lewis-Ximenez, Lia Laura; Lampe, Elisabeth; Villar, Livia Melo

    2018-06-01

    For quantification of hepatitis B virus DNA (HBV DNA), commercial assays are used with serum or plasma samples, but oral fluid samples could be an alternative for HBV diagnosis due to ease of collection. This study aims to develop an in-house real-time PCR assay using a synthetic standard curve for HBV DNA quantification in serum and oral fluid samples. Samples were collected from 103 individuals (55 HBsAg reactive and HBV DNA reactive by commercial assay and 48 without HBV markers) and submitted to two in-house real-time PCR assays for the HBV pre-S/S region with different standard curves: plasmid qPCR and synthetic qPCR. A total of 27 serum samples were HBV DNA positive by plasmid qPCR and 40 by synthetic qPCR (72% and 85% concordance, respectively). The synthetic qPCR presented an efficiency of 99% and a sensitivity of 2 log10 copies/mL. Among oral fluid samples, five and ten were detected using the plasmid and synthetic qPCR, respectively. This study demonstrated that the synthetic qPCR using serum samples could be used as an alternative for HBV DNA quantification due to its sensitivity. In addition, it was possible to quantify HBV DNA in oral fluid samples, suggesting the potential of this specimen for molecular diagnosis of HBV. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. Evaluation of Skin Surface as an Alternative Source of Reference DNA Samples: A Pilot Study.

    Science.gov (United States)

    Albujja, Mohammed H; Bin Dukhyil, Abdul Aziz; Chaudhary, Abdul Rauf; Kassab, Ahmed Ch; Refaat, Ahmed M; Babu, Saranya Ramesh; Okla, Mohammad K; Kumar, Sachil

    2018-01-01

    Identifying acceptable body areas for collecting reference DNA samples is part of the ongoing development of forensic DNA analysis. The aim of this study was to evaluate skin surface cells (SSC) as an alternative source of reference DNA samples. From each volunteer (n = 10), six samples from skin surface areas (forearm and fingertips) and two traditional samples (blood and buccal cells) were collected. Genomic DNA was extracted, quantified, and then genotyped using standard techniques. The highest DNA concentration among the SSC samples was obtained using the tape/forearm collection method (2.1 ng/μL). Cotton swabs moistened with ethanol yielded higher quantities of DNA than swabs moistened with salicylic acid and gave the highest percentage of full STR profiles (97%). This study supports the use of SSC as a noninvasive sampling technique and as an extremely useful source of reference DNA samples in cultures where the use of buccal swabs may be considered socially unacceptable. © 2017 American Academy of Forensic Sciences.

  19. Brine Sampling and Evaluation Program, 1990 report

    Energy Technology Data Exchange (ETDEWEB)

    Deal, D.E.; Abitz, R.J.; Myers, J.; Case, J.B.; Martin, M.L.; Roggenthen, W.M. [International Technology Corp., Albuquerque, NM (United States); Belski, D.S. [Westinghouse Electric Corp., Carlsbad, NM (United States). Waste Isolation Div.

    1991-08-01

    The data presented in this report are the result of Brine Sampling and Evaluation Program (BSEP) activities at the Waste Isolation Pilot Plant (WIPP) during 1990. When excavations began in 1982, small brine seepages (weeps) were observed on the walls. These brine occurrences were initially described as part of the Site Validation Program. Brine studies were formalized in 1985. The BSEP activities document and investigate the origins, hydraulic characteristics, extent, and composition of brine occurrences in the Permian Salado Formation and seepage of that brine into the excavations at the WIPP. The brine chemistry is important because it assists in understanding the origin of the brine and because it may affect possible chemical reactions in the buried waste after sealing the repository. The volume of brine and the hydrologic system that drives the brine seepage also need to be understood to assess the long-term performance of the repository. After more than eight years of observations (1982--1990), no credible evidence exists to indicate that enough naturally occurring brine will seep into the WIPP excavations to be of practical concern. The detailed observations and analyses summarized herein and in previous BSEP reports confirm the evidence apparent during casual visits to the underground workings -- that the excavations are remarkably dry.

  20. Brine Sampling and Evaluation Program, 1990 report

    International Nuclear Information System (INIS)

    Deal, D.E.; Abitz, R.J.; Myers, J.; Case, J.B.; Martin, M.L.; Roggenthen, W.M.; Belski, D.S.

    1991-08-01

    The data presented in this report are the result of Brine Sampling and Evaluation Program (BSEP) activities at the Waste Isolation Pilot Plant (WIPP) during 1990. When excavations began in 1982, small brine seepages (weeps) were observed on the walls. These brine occurrences were initially described as part of the Site Validation Program. Brine studies were formalized in 1985. The BSEP activities document and investigate the origins, hydraulic characteristics, extent, and composition of brine occurrences in the Permian Salado Formation and seepage of that brine into the excavations at the WIPP. The brine chemistry is important because it assists in understanding the origin of the brine and because it may affect possible chemical reactions in the buried waste after sealing the repository. The volume of brine and the hydrologic system that drives the brine seepage also need to be understood to assess the long-term performance of the repository. After more than eight years of observations (1982--1990), no credible evidence exists to indicate that enough naturally occurring brine will seep into the WIPP excavations to be of practical concern. The detailed observations and analyses summarized herein and in previous BSEP reports confirm the evidence apparent during casual visits to the underground workings -- that the excavations are remarkably dry

  1. Evaluation of real-time operating system for small-scale embedded systems

    International Nuclear Information System (INIS)

    Dayang Norhayati Abang Jawawi; Rosbi Mamat

    1999-01-01

    In this paper, the performance of several real-time operating systems for small-scale embedded systems is evaluated against a set of criteria. The evaluation is performed both qualitatively and quantitatively. Evaluation results based on a case study of an engineering application are presented. (author)

  2. Coordination of Conditional Poisson Samples

    Directory of Open Access Journals (Sweden)

    Grafström Anton

    2015-12-01

    Sample coordination seeks to maximize or to minimize the overlap of two or more samples. The former is known as positive coordination, and the latter as negative coordination. Positive coordination is mainly used for estimation purposes and to reduce data collection costs. Negative coordination is mainly performed to diminish the response burden of the sampled units. Poisson sampling design with permanent random numbers provides an optimum coordination degree of two or more samples. The size of a Poisson sample is, however, random. Conditional Poisson (CP) sampling is a modification of the classical Poisson sampling that produces a fixed-size πps sample. We introduce two methods to coordinate Conditional Poisson samples over time or simultaneously. The first one uses permanent random numbers and the list-sequential implementation of CP sampling. The second method uses a CP sample in the first selection and provides an approximate one in the second selection because the prescribed inclusion probabilities are not respected exactly. The methods are evaluated using the size of the expected sample overlap, and are compared with their competitors using Monte Carlo simulation. The new methods provide a good coordination degree of two samples, close to the performance of Poisson sampling with permanent random numbers.
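
    The positive-coordination idea can be illustrated with ordinary (non-conditional) Poisson sampling and permanent random numbers; the list-sequential CP implementation described in the record is more involved, so the following Python sketch is only a simplified illustration, with the population size, inclusion probabilities and random seed chosen arbitrarily.

        import numpy as np

        rng = np.random.default_rng(1)
        N = 1000                                  # population size (assumed)
        pi1 = rng.uniform(0.01, 0.20, N)          # inclusion probabilities, occasion 1
        pi2 = rng.uniform(0.01, 0.20, N)          # inclusion probabilities, occasion 2

        u = rng.uniform(size=N)                   # permanent random numbers, one per unit

        s1 = u < pi1                              # Poisson sample at occasion 1
        s2 = u < pi2                              # occasion 2 reuses the same PRNs -> positive coordination

        # With shared PRNs the expected overlap is sum(min(pi1, pi2)), which is the optimum.
        print(s1.sum(), s2.sum(), (s1 & s2).sum(), round(np.minimum(pi1, pi2).sum(), 1))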

  3. Identifying undiagnosed HIV in men who have sex with men (MSM) by offering HIV home sampling via online gay social media: a service evaluation.

    Science.gov (United States)

    Elliot, E; Rossi, M; McCormack, S; McOwan, A

    2016-09-01

    An estimated one in eight men who have sex with men (MSM) in London lives with HIV, of which 16% are undiagnosed. It is a public health priority to minimise time spent undiagnosed and reduce morbidity, mortality and onward HIV transmission. 'Dean Street at Home' provided an online HIV risk self-assessment and postal home HIV sampling service aimed at hard-to-reach, high-risk MSM. This 2-year service evaluation aims to determine the HIV risk behaviour of users, the uptake of offer of home sampling and the acceptability of the service. Users were invited to assess their HIV risk anonymously through messages or promotional banners on several gay social networking websites. Regardless of risk, they were offered a free postal HIV oral fluid or blood self-sampling kit. Reactive results were confirmed in clinic. A user survey was sent to first year respondents. 17 361 respondents completed the risk self-assessment. Of these, half had an 'identifiable risk' for HIV and a third was previously untested. 5696 test kits were returned. 121 individuals had a reactive sample; 82 (1.4% of returned samples) confirmed as new HIV diagnoses linked to care; 14 (0.25%) already knew their diagnosis; and 14 (0.25%) were false reactives. The median age at diagnosis was 38; median CD4 505 cells/µL and 20% were recent infections. 61/82 (78%) were confirmed on treatment at the time of writing. The post-test email survey revealed a high service acceptability rate. The service was the first of its kind in the UK. This evaluation provides evidence to inform the potential roll-out of further online strategies to enhance community HIV testing. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  4. A chromatochemometric approach for evaluating and selecting the perfume maceration time.

    Science.gov (United States)

    López-Nogueroles, Marina; Chisvert, Alberto; Salvador, Amparo

    2010-04-30

    A chemometric treatment of data obtained by gas chromatography (GC) with flame ionization detection (FID) is proposed to study the maceration time involved in perfume manufacture, with the final purpose of reducing this time while preserving the organoleptic characteristics of the perfume being elaborated. GC-FID chromatograms were used as fingerprints of perfume samples subjected to different maceration times, and the data were treated by linear discriminant analysis (LDA) against a set of samples known to be macerated or not, which were used as calibration objects. The GC-FID methodology combined with LDA has been applied successfully to seven different perfumes. The constructed LDA models exhibited excellent Wilks' lambdas (0.013-0.118, depending on the perfume), and a reduction of up to 57% was achieved with respect to the maceration time initially established. 2010 Elsevier B.V. All rights reserved.
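
    A minimal sketch of how a GC-FID fingerprint can be fed to linear discriminant analysis, assuming hypothetical file names and that peak areas have already been extracted and aligned; the PCA step is only one common way to handle collinear chromatographic variables and is not taken from the record.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.pipeline import make_pipeline

        # Hypothetical inputs: rows = perfume samples, columns = aligned GC-FID peak areas
        X_cal = np.load("calibration_peak_areas.npy")   # samples of known maceration status
        y_cal = np.load("calibration_labels.npy")       # 1 = macerated, 0 = not macerated
        X_new = np.load("test_peak_areas.npy")          # samples with unknown status

        # Reduce collinear peak variables, then fit the discriminant model and classify
        model = make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis())
        model.fit(X_cal, y_cal)
        print(model.predict(X_new))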

  5. Method Evaluation And Field Sample Measurements For The Rate Of Movement Of The Oxidation Front In Saltstone

    Energy Technology Data Exchange (ETDEWEB)

    Almond, P. M. [Savannah River Site (SRS), Aiken, SC (United States); Kaplan, D. I. [Savannah River Site (SRS), Aiken, SC (United States); Langton, C. A. [Savannah River Site (SRS), Aiken, SC (United States); Stefanko, D. B. [Savannah River Site (SRS), Aiken, SC (United States); Spencer, W. A. [Savannah River Site (SRS), Aiken, SC (United States); Hatfield, A. [Clemson University, Clemson, SC (United States); Arai, Y. [Clemson University, Clemson, SC (United States)

    2012-08-23

    The objective of this work was to develop and evaluate a series of methods and validate their capability to measure differences in oxidized versus reduced saltstone. Validated methods were then applied to samples cured under field conditions to simulate Performance Assessment (PA) needs for the Saltstone Disposal Facility (SDF). Four analytical approaches were evaluated using laboratory-cured saltstone samples. These methods were X-ray absorption spectroscopy (XAS), diffuse reflectance spectroscopy (DRS), chemical redox indicators, and thin-section leaching methods. XAS and thin-section leaching methods were validated as viable methods for studying oxidation movement in saltstone. Each method used samples that were spiked with chromium (Cr) as a tracer for oxidation of the saltstone. The two methods were subsequently applied to field-cured samples containing chromium to characterize the oxidation state of chromium as a function of distance from the exposed air/cementitious material surface.

  6. Real-Time PCR in faecal samples of Triatoma infestans obtained by xenodiagnosis: proposal for an exogenous internal control.

    Science.gov (United States)

    Bravo, Nicolás; Muñoz, Catalina; Nazal, Nicolás; Saavedra, Miguel; Martínez, Gabriela; Araya, Eduardo; Apt, Werner; Zulantay, Inés

    2012-03-26

    The polymerase chain reaction (PCR) has proved to be a sensitive technique to detect Trypanosoma cruzi in the chronic phase of Chagas disease, which is characterized by low and fluctuating parasitemia. Another technique proposed for parasitological diagnosis in this phase of infection combines a microscopic search for motile trypomastigote forms in faecal samples (FS) obtained by xenodiagnosis (XD) with conventional PCR (XD-PCR). In this study we evaluate the use of human blood DNA as an exogenous internal control (EIC) for real time PCR (qPCR) combined with XD (XD-qPCR) using chromosome 12 (X12) detection. None of the FS-XD evaluated by qPCR amplified for X12. Nevertheless, all the EIC-FS-XD mixtures amplified for X12. We determined that X12 is useful as an EIC for XD-qPCR because we showed that the FS-XD does not contain human DNA after 30 or more days of XD incubation. This information is relevant for research on T. cruzi by XD-qPCR since it allows ruling out inhibition and false negative results due to DNA loss during the process of extraction and purification.

  7. Failure-censored accelerated life test sampling plans for Weibull distribution under expected test time constraint

    International Nuclear Information System (INIS)

    Bai, D.S.; Chun, Y.R.; Kim, J.G.

    1995-01-01

    This paper considers the design of life-test sampling plans based on failure-censored accelerated life tests. The lifetime distribution of products is assumed to be Weibull with a scale parameter that is a log linear function of a (possibly transformed) stress. Two levels of stress higher than the use condition stress, high and low, are used. Sampling plans with equal expected test times at high and low test stresses which satisfy the producer's and consumer's risk requirements and minimize the asymptotic variance of the test statistic used to decide lot acceptability are obtained. The properties of the proposed life-test sampling plans are investigated

  8. Sample size for comparing negative binomial rates in noninferiority and equivalence trials with unequal follow-up times.

    Science.gov (United States)

    Tang, Yongqiang

    2017-05-25

    We derive the sample size formulae for comparing two negative binomial rates based on both the relative and absolute rate difference metrics in noninferiority and equivalence trials with unequal follow-up times, and establish an approximate relationship between the sample sizes required for the treatment comparison based on the two treatment effect metrics. The proposed method allows the dispersion parameter to vary by treatment groups. The accuracy of these methods is assessed by simulations. It is demonstrated that ignoring the between-subject variation in the follow-up time by setting the follow-up time for all individuals to be the mean follow-up time may greatly underestimate the required size, resulting in underpowered studies. Methods are provided for back-calculating the dispersion parameter based on the published summary results.
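
    For orientation, one widely used large-sample approximation for the per-group size when comparing negative binomial rates on the log rate-ratio scale is sketched below; it assumes a common mean follow-up and a rate-ratio noninferiority margin, and is not necessarily the exact formula derived in the record, which additionally accounts for between-subject variation in follow-up time.

        n \approx \frac{\left(z_{1-\alpha} + z_{1-\beta}\right)^{2}
        \left[\left(\frac{1}{\lambda_{0}\,\bar{t}} + k_{0}\right)
            + \left(\frac{1}{\lambda_{1}\,\bar{t}} + k_{1}\right)\right]}
        {\left(\ln(\lambda_{1}/\lambda_{0}) - \ln\delta_{0}\right)^{2}}

    Here λ0 and λ1 are the event rates, k0 and k1 the group-specific dispersion parameters, t̄ the mean follow-up time, and δ0 the noninferiority margin on the rate ratio; the bracketed terms approximate the variances of the estimated log rates in each group.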

  9. Bacterial communities of disease vectors sampled across time, space, and species.

    Science.gov (United States)

    Jones, Ryan T; Knight, Rob; Martin, Andrew P

    2010-02-01

    A common strategy of pathogenic bacteria is to form close associations with parasitic insects that feed on animals and to use these insects as vectors for their own transmission. Pathogens interact closely with other coexisting bacteria within the insect, and interactions between co-occurring bacteria may influence the vector competency of the parasite. Interactions between particular lineages can be explored through measures of alpha-diversity. Furthermore, general patterns of bacterial community assembly can be explored through measures of beta-diversity. Here, we use pyrosequencing (n=115,924 16S rRNA gene sequences) to describe the bacterial communities of 230 prairie dog fleas sampled across space and time. We use these community characterizations to assess interactions between dominant community members and to explore general patterns of bacterial community assembly in fleas. An analysis of co-occurrence patterns suggests non-neutral negative interactions between dominant community members, and community composition differed across space (phylotype-based: R=0.418) and time.

  10. Histopathologic evaluation of postmortem autolytic changes in bluegill (Lepomis macrochirus) and crappie (Pomoxis annularis) at varied time intervals and storage temperatures

    Directory of Open Access Journals (Sweden)

    Jami George

    2016-04-01

    Information is lacking on preserving fish carcasses to minimize postmortem autolysis artifacts when a necropsy cannot be performed immediately. The purpose of this study was to qualitatively identify and score histologic postmortem changes in two species of freshwater fish (bluegill—Lepomis macrochirus; crappie—Pomoxis annularis), at varied time intervals and storage temperatures, to assess the histologic quality of collected samples. A pooled sample of 36 mixed-sex, healthy bluegill and crappie was euthanized, stored either at room temperature, refrigerated at 4 °C, or frozen at −20 °C, and then necropsied at 0, 4, 24, and 48 h intervals. Histologic specimens were evaluated by light microscopy. Data showed that immediate harvesting of fresh samples provides the best quality and refrigeration would be the preferred method of storage if sample collection had to be delayed for up to 24 h. When sample collection must be delayed more than 24 h, the preferred method of storage to minimize autolysis artifacts is freezing if evaluation of the gastrointestinal tract is most important, or refrigeration if gill histology is most important. The gill arch and intestinal tract, followed by the liver and kidney, were the organs most sensitive to autolysis.

  11. The integrated performance evaluation program quality assurance guidance in support of EM environmental sampling and analysis activities

    International Nuclear Information System (INIS)

    1994-05-01

    The purpose of EM's (DOE's Environmental Restoration and Waste Management) Integrated Performance Evaluation Program (IPEP) is to integrate information from existing PE programs with expanded QA activities to develop information about the quality of radiological, mixed waste, and hazardous environmental sample analyses provided by all laboratories supporting EM programs. The guidance addresses the goals of identifying specific PE sample programs and contacts, identifying specific requirements for participation in DOE's internal and external (regulatory) programs, identifying key issues relating to application and interpretation of PE materials for EM headquarters and field office managers, and providing technical guidance covering PE materials for site-specific activities. Performance Evaluation (PE) materials or samples are necessary for the quality assurance/control programs covering environmental data collection.

  12. Evaluation of Pu sample oscillations in CESAR

    Energy Technology Data Exchange (ETDEWEB)

    Brunet, M.

    1974-10-15

    A set of 12 plutonium samples of various compositions was oscillated in CESAR in 1973. Comparisons were made to the oscillated reactivity effect of known U-235 and boron specimens, where the detector signals were corrected against a background signal based on comparison to the motion of a control rod in the central location of the critical assembly. An equivalent-sample method was tested first for various samples of U-235 and boron to establish a means of correcting the detector response. Inferred plutonium reaction rates in the experiments were compared to transport theory calculations using the APOLLO code. Additional effort is needed to reconcile differences between measured and calculated results, requiring both chemical analyses of the plutonium isotopes in the samples and improved cross sections for the plutonium isotopes.

  13. Centralized and decentralized global outer-synchronization of asymmetric recurrent time-varying neural network by data-sampling.

    Science.gov (United States)

    Lu, Wenlian; Zheng, Ren; Chen, Tianping

    2016-03-01

    In this paper, we discuss outer-synchronization of the asymmetrically connected recurrent time-varying neural networks. By using both centralized and decentralized discretization data sampling principles, we derive several sufficient conditions based on three vector norms to guarantee that the difference of any two trajectories starting from different initial values of the neural network converges to zero. The lower bounds of the common time intervals between data samples in centralized and decentralized principles are proved to be positive, which guarantees exclusion of Zeno behavior. A numerical example is provided to illustrate the efficiency of the theoretical results. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. RealWorld evaluation: working under budget, time, data, and political constraints

    National Research Council Canada - National Science Library

    Bamberger, Michael; Rugh, Jim; Mabry, Linda

    2012-01-01

    This book addresses the challenges of conducting program evaluations in real-world contexts where evaluators and their clients face budget and time constraints and where critical data may be missing...

  15. Evaluation of curing compound application time on concrete surface durability.

    Science.gov (United States)

    2015-03-01

    The effect of curing compound application time after concrete finishing was examined in the study. Times of 30 minutes, 2 hours and 4 hours were considered and repeatability was evaluated with comparisons to a Phase I portion of the study. Scaling re...

  16. Impact of a Mobile Phone Intervention to Reduce Sedentary Behavior in a Community Sample of Adults: A Quasi-Experimental Evaluation.

    Science.gov (United States)

    Kendzor, Darla E; Shuval, Kerem; Gabriel, Kelley Pettee; Businelle, Michael S; Ma, Ping; High, Robin R; Cuate, Erica L; Poonawalla, Insiya B; Rios, Debra M; Demark-Wahnefried, Wendy; Swartz, Michael D; Wetter, David W

    2016-01-25

    Greater time spent sedentary is linked with increased risk of breast, colorectal, ovarian, endometrial, and prostate cancers. Given steadily increasing rates of mobile phone ownership, mobile phone interventions may have the potential to broadly influence sedentary behavior across settings. The purpose of this study was to examine the short-term impact of a mobile phone intervention that targeted sedentary time in a diverse community sample. Adults participated in a quasi-experimental evaluation of a mobile phone intervention designed to reduce sedentary time through prompts to interrupt periods of sitting. Participants carried mobile phones and wore accelerometers for 7 consecutive days. Intervention participants additionally received mobile phone prompts during self-reported sitting and information about the negative health impact of prolonged sedentariness. The study was conducted from December 2012 to November 2013 in Dallas, Texas. Linear mixed model regression analyses were conducted to evaluate the influence of the intervention on daily accelerometer-determined estimates of sedentary and active time. Participants (N=215) were predominantly female (67.9%, 146/215) and nonwhite (black: 50.7%, 109/215; Latino: 12.1%, 26/215; other: 5.6%, 12/215). Analyses revealed that participants who received the mobile phone intervention had significantly fewer daily minutes of sedentary time (B=-22.09, P=.045) and more daily active minutes (B=23.01, P=.04) than control participants. A simple mobile phone intervention was associated with engaging in less sedentary time and more physical activity. Findings underscore the potential impact of mobile phone interventions to positively influence sedentary behavior and physical activity.

  17. Phytochemical analysis and biological evaluation of selected African propolis samples from Cameroon and Congo

    NARCIS (Netherlands)

    Papachroni, D.; Graikou, K.; Kosalec, I.; Damianakos, H.; Ingram, V.J.; Chinou, I.

    2015-01-01

    The objective of this study was the chemical analysis of four selected samples of African propolis (Congo and Cameroon) and their biological evaluation. Twenty-one secondary metabolites belonging to four different chemical groups were isolated from the 70% ethanolic extracts of propolis and their

  18. Evaluation of Sampling Methods for Bacillus Spore ...

    Science.gov (United States)

    Following a wide area release of biological materials, mapping the extent of contamination is essential for orderly response and decontamination operations. HVAC filters process large volumes of air and therefore collect highly representative particulate samples in buildings. HVAC filter extraction may have great utility in rapidly estimating the extent of building contamination following a large-scale incident. However, until now, no studies have been conducted comparing the two most appropriate sampling approaches for HVAC filter materials: direct extraction and vacuum-based sampling.

  19. Sample design effects in landscape genetics

    Science.gov (United States)

    Oyler-McCance, Sara J.; Fedy, Bradley C.; Landguth, Erin L.

    2012-01-01

    An important research gap in landscape genetics is the impact of different field sampling designs on the ability to detect the effects of landscape pattern on gene flow. We evaluated how five different sampling regimes (random, linear, systematic, cluster, and single study site) affected the probability of correctly identifying the generating landscape process of population structure. Sampling regimes were chosen to represent a suite of designs common in field studies. We used genetic data generated from a spatially-explicit, individual-based program and simulated gene flow in a continuous population across a landscape with gradual spatial changes in resistance to movement. Additionally, we evaluated the sampling regimes using realistic and obtainable numbers of loci (10 and 20), numbers of alleles per locus (5 and 10), numbers of individuals sampled (10-300), and generational times after the landscape was introduced (20 and 400). For a simulated continuously distributed species, we found that random, linear, and systematic sampling regimes performed well with high sample sizes (>200), levels of polymorphism (10 alleles per locus), and number of molecular markers (20). The cluster and single study site sampling regimes were not able to correctly identify the generating process under any conditions and thus are not advisable strategies for scenarios similar to our simulations. Our research emphasizes the importance of sampling data at ecologically appropriate spatial and temporal scales and suggests careful consideration for sampling near landscape components that are likely to most influence the genetic structure of the species. In addition, simulating sampling designs a priori could help guide field data collection efforts.

  20. Hanford high level waste: Sample Exchange/Evaluation (SEE) Program

    International Nuclear Information System (INIS)

    King, A.G.

    1994-08-01

    The Pacific Northwest Laboratory (PNL)/Analytical Chemistry Laboratory (ACL) and the Westinghouse Hanford Company (WHC)/Process Analytical Laboratory (PAL) provide analytical support services to various environmental restoration and waste management projects/programs at Hanford. In response to a US Department of Energy -- Richland Field Office (DOE-RL) audit, which questioned the comparability of analytical methods employed at each laboratory, the Sample Exchange/Evaluation (SEE) program was initiated. The SEE Program is a self-assessment program designed to compare analytical methods of the PAL and ACL laboratories using site-specific waste material. The SEE program is managed by a collaborative, the Quality Assurance Triad (Triad). Triad membership is made up of representatives from the WHC/PAL, PNL/ACL, and WHC Hanford Analytical Services Management (HASM) organizations. The Triad works together to design/evaluate/implement each phase of the SEE Program.

  1. Remote Sensing Time Series to Evaluate Direct Land Use Change of Recent Expanded Sugarcane Crop in Brazil

    Directory of Open Access Journals (Sweden)

    Marcio Pupin Mello

    2012-04-01

    The use of biofuels to mitigate global carbon emissions is highly dependent on direct and indirect land use changes (LUC). The direct LUC (dLUC) can be accurately evaluated using remote sensing images. In this work we evaluated the dLUC of about 4 million hectares of sugarcane expanded from 2005 to 2010 in the South-central region of Brazil. This region has a favorable climate for rain-fed sugarcane, a great potential for agriculture expansion without deforestation, and is currently responsible for almost 90% of Brazil's sugarcane production. An available thematic map of sugarcane, along with MODIS and Landsat images acquired from 2000 to 2009, was used to evaluate the land use prior to the conversion to sugarcane. A systematic sampling procedure was adopted and the land use identification prior to sugarcane, for each sample, was performed using a web tool developed to visualize both the MODIS time series and the multitemporal Landsat images. Considering 2000 as the reference year, it was observed that sugarcane expanded: 69.7% on pasture land; 25.0% on annual crops; 0.6% on forest; while 3.4% was sugarcane land under crop rotation. The results clearly show that the dLUC of recent sugarcane expansion has occurred on more than 99% of either pasture or agriculture land.

  2. Evaluating hepatocellular carcinoma cell lines for tumour samples using within-sample relative expression orderings of genes.

    Science.gov (United States)

    Ao, Lu; Guo, You; Song, Xuekun; Guan, Qingzhou; Zheng, Weicheng; Zhang, Jiahui; Huang, Haiyan; Zou, Yi; Guo, Zheng; Wang, Xianlong

    2017-11-01

    Concerns are raised about the representativeness of cell lines for tumours due to the culture environment and misidentification. Liver is a major metastatic destination of many cancers, which might further confuse the origin of hepatocellular carcinoma cell lines. Therefore, it is of crucial importance to understand how well they can represent hepatocellular carcinoma. The HCC-specific gene pairs with highly stable relative expression orderings in more than 99% of hepatocellular carcinoma but with reversed relative expression orderings in at least 99% of one of the six types of cancer, colorectal carcinoma, breast carcinoma, non-small-cell lung cancer, gastric carcinoma, pancreatic carcinoma and ovarian carcinoma, were identified. With the simple majority rule, the HCC-specific relative expression orderings from comparisons with colorectal carcinoma and breast carcinoma could exactly discriminate primary hepatocellular carcinoma samples from both primary colorectal carcinoma and breast carcinoma samples. Especially, they correctly classified more than 90% of liver metastatic samples from colorectal carcinoma and breast carcinoma to their original tumours. Finally, using these HCC-specific relative expression orderings from comparisons with six cancer types, we identified eight of 24 hepatocellular carcinoma cell lines in the Cancer Cell Line Encyclopedia (Huh-7, Huh-1, HepG2, Hep3B, JHH-5, JHH-7, C3A and Alexander cells) that are highly representative of hepatocellular carcinoma. Evaluated with a REOs-based prognostic signature for hepatocellular carcinoma, all these eight cell lines showed the same metastatic properties of the high-risk metastatic hepatocellular carcinoma tissues. Caution should be taken for using hepatocellular carcinoma cell lines. Our results should be helpful to select proper hepatocellular carcinoma cell lines for biological experiments. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
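
    The majority-rule classification described above reduces to counting, within a single sample, how many HCC-specific gene pairs keep their expected ordering. A minimal sketch follows, assuming a hypothetical list of (gene_a, gene_b) pairs oriented so that gene_a > gene_b is the HCC-specific ordering; the gene names and expression values are made up for illustration.

        # expr: mapping gene -> expression value measured in one sample
        # pairs: HCC-specific pairs oriented so that expr[a] > expr[b] holds in HCC
        def classify_by_reo(expr, pairs):
            votes_hcc = sum(expr[a] > expr[b] for a, b in pairs)
            return "HCC-like" if votes_hcc > len(pairs) / 2 else "other"

        # Hypothetical example with made-up gene names and values
        sample = {"G1": 8.2, "G2": 3.1, "G3": 5.5, "G4": 6.0}
        hcc_pairs = [("G1", "G2"), ("G3", "G4")]
        print(classify_by_reo(sample, hcc_pairs))

    Because only within-sample orderings are used, the rule is insensitive to normalization and batch effects, which is what makes it attractive for comparing cell lines with tissue samples.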

  3. Efficient Evaluation of Wireless Real-Time Control Networks

    Directory of Open Access Journals (Sweden)

    Peter Horvath

    2015-02-01

    In this paper, we present a system simulation framework for the design and performance evaluation of complex wireless cyber-physical systems. We describe the simulator architecture and the specific developments that are required to simulate cyber-physical systems relying on multi-channel, multi-hop mesh networks. We introduce realistic and efficient physical layer models and a system simulation methodology, which provides statistically significant performance evaluation results with low computational complexity. The capabilities of the proposed framework are illustrated using the example of WirelessHART, a centralized, real-time, multi-hop mesh network designed for industrial control and monitoring applications.

  4. Evaluating the suitability of different environmental samples for tracing atmospheric pollution in industrial areas.

    Science.gov (United States)

    Francová, Anna; Chrastný, Vladislav; Šillerová, Hana; Vítková, Martina; Kocourková, Jana; Komárek, Michael

    2017-01-01

    Samples of lichens, snow and particulate matter (PM10, 24 h) are used for the source identification of air pollution in the heavily industrialized region of Ostrava, Upper Silesia, Czech Republic. An integrated approach that uses different environmental samples for metal concentration and Pb isotope analyses was applied. The broad range of isotope ratios in the samples indicates a combination of different pollution sources, the strongest among them being the metallurgical industry, bituminous coal combustion and traffic. Snow samples proved to be the most relevant indicator for tracing metal(loid)s and recent local contamination in the atmosphere. Lichens can be successfully used as tracers of the long-term activity of local and remote sources of contamination. The combination of PM10 with snow can provide very useful information for evaluation of current pollution sources. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Evaluation protocol for amusia: Portuguese sample.

    Science.gov (United States)

    Peixoto, Maria Conceição; Martins, Jorge; Teixeira, Pedro; Alves, Marisa; Bastos, José; Ribeiro, Carlos

    2012-12-01

    Amusia is a disorder that affects the processing of music. Part of this processing happens in the primary auditory cortex. The study of this condition allows us to evaluate the central auditory pathways. The aim was to explore diagnostic evaluation tests for amusia. The authors propose an evaluation protocol for patients with suspected amusia (after brain injury or complaints of poor musical perception), in parallel with the assessment of central auditory processing already implemented in the department. The Montreal Battery of Evaluation of Amusia was the basis for the selection of the tests. From this comprehensive battery we selected some of the musical examples to evaluate different musical aspects, including musical memory and perception, and the ability to recognize and discriminate music. For memory, there is a test for assessing delayed memory, adapted to Portuguese culture. This is a prospective study. Although still experimental, and open to adjustments in the assessment, we believe that this assessment, combined with the study of central auditory processing, will allow us to understand some central lesions and congenital or acquired hearing perception limitations.

  6. Anthrax Sampling and Decontamination: Technology Trade-Offs

    Energy Technology Data Exchange (ETDEWEB)

    Price, Phillip N.; Hamachi, Kristina; McWilliams, Jennifer; Sohn, Michael D.

    2008-09-12

    The goal of this project was to answer the following questions concerning response to a future anthrax release (or suspected release) in a building: 1. Based on past experience, what rules of thumb can be determined concerning: (a) the amount of sampling that may be needed to determine the extent of contamination within a given building; (b) what portions of a building should be sampled; (c) the cost per square foot to decontaminate a given type of building using a given method; (d) the time required to prepare for, and perform, decontamination; (e) the effectiveness of a given decontamination method in a given type of building? 2. Based on past experience, what resources will be spent on evaluating the extent of contamination, performing decontamination, and assessing the effectiveness of the decontamination in a building of a given type and size? 3. What are the trade-offs between cost, time, and effectiveness for the various sampling plans, sampling methods, and decontamination methods that have been used in the past?

  7. Effects of computing time delay on real-time control systems

    Science.gov (United States)

    Shin, Kang G.; Cui, Xianzhong

    1988-01-01

    The reliability of a real-time digital control system depends not only on the reliability of the hardware and software used, but also on the speed in executing control algorithms. The latter is due to the negative effects of computing time delay on control system performance. For a given sampling interval, the effects of computing time delay are classified into the delay problem and the loss problem. Analysis of these two problems is presented as a means of evaluating real-time control systems. As an example, both the self-tuning predicted (STP) control and Proportional-Integral-Derivative (PID) control are applied to the problem of tracking robot trajectories, and their respective effects of computing time delay on control performance are comparatively evaluated. For this example, the STP (PID) controller is shown to outperform the PID (STP) controller in coping with the delay (loss) problem.
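
    The "delay problem" can be made concrete with a toy discrete-time simulation in which the control output computed at step k is only applied several samples later. This is a generic PI loop on an assumed first-order plant with arbitrary gains, not the robot-trajectory STP/PID comparison of the record.

        import numpy as np

        def rms_tracking_error(delay_steps, T=0.01, n=500, kp=2.0, ki=5.0):
            # Assumed plant: x' = -x + u, discretized with forward Euler and step T
            x, integ, errs = 0.0, 0.0, []
            pending = [0.0] * delay_steps          # outputs computed but not yet applied
            for _ in range(n):
                err = 1.0 - x                      # unit step reference
                integ += err * T
                pending.append(kp * err + ki * integ)
                u_applied = pending.pop(0)         # computing delay of delay_steps samples
                x += T * (-x + u_applied)
                errs.append(err)
            return float(np.sqrt(np.mean(np.square(errs))))

        print(rms_tracking_error(0), rms_tracking_error(5))   # delay degrades tracking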

  8. Estimation of river and stream temperature trends under haphazard sampling

    Science.gov (United States)

    Gray, Brian R.; Lyubchich, Vyacheslav; Gel, Yulia R.; Rogala, James T.; Robertson, Dale M.; Wei, Xiaoqiao

    2015-01-01

    Long-term temporal trends in water temperature in rivers and streams are typically estimated under the assumption of evenly-spaced space-time measurements. However, sampling times and dates associated with historical water temperature datasets and some sampling designs may be haphazard. As a result, trends in temperature may be confounded with trends in time or space of sampling which, in turn, may yield biased trend estimators and thus unreliable conclusions. We address this concern using multilevel (hierarchical) linear models, where time effects are allowed to vary randomly by day and date effects by year. We evaluate the proposed approach by Monte Carlo simulations with imbalance, sparse data and confounding by trend in time and date of sampling. Simulation results indicate unbiased trend estimators while results from a case study of temperature data from the Illinois River, USA conform to river thermal assumptions. We also propose a new nonparametric bootstrap inference on multilevel models that allows for a relatively flexible and distribution-free quantification of uncertainties. The proposed multilevel modeling approach may be elaborated to accommodate nonlinearities within days and years when sampling times or dates typically span temperature extremes.
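
    A simplified version of such a multilevel trend model can be fit with statsmodels. The sketch below assumes a hypothetical CSV with water temperature, sampling date and year columns, uses random year intercepts plus a month term as a stand-in for the day/date random effects described above, and omits the nonparametric bootstrap inference proposed in the record.

        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("river_temp.csv")                 # hypothetical columns: temp, date, year
        df["month"] = pd.to_datetime(df["date"]).dt.month
        df["year_c"] = df["year"] - df["year"].mean()      # centred year for the trend term

        # Fixed linear trend + seasonal (month) effects, random intercept per year
        model = smf.mixedlm("temp ~ year_c + C(month)", df, groups=df["year"]).fit()
        print(model.summary())                             # year_c coefficient ~ trend in deg C per year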

  9. Bovine milk sampling efficiency for pregnancy-associated glycoproteins (PAG) detection test

    Energy Technology Data Exchange (ETDEWEB)

    Silva, H. K. da; Cassoli, L.D.; Pantoja, J.F.C.; Cerqueira, P.H.R.; Coitinho, T.B.; Machado, P.F.

    2016-07-01

    Two experiments were conducted to verify whether the time of day at which a milk sample is collected and the possible carryover in the milking system may affect pregnancy-associated glycoproteins (PAG) levels and, consequently, the pregnancy test results in dairy cows. In experiment one, we evaluated the effect of time of day at which the milk sample is collected from 51 cows. In experiment two, which evaluated the possible occurrence of carryover in the milk meter milking system, milk samples from 94 cows belonging to two different farms were used. The samples were subjected to a pregnancy test using an ELISA method to measure PAG concentrations and to classify the samples as positive (pregnant), negative (nonpregnant), or suspicious (recheck). We found that the time of milking did not affect the PAG levels. As to the occurrence of carryover in the milk meter, the PAG levels of the samples collected from Farm-2 were heavily influenced by a carryover effect compared with the samples from Farm-1. Thus, milk samples submitted to a pregnancy test can be collected during the morning or the evening milking. When the sample is collected from the milk meters, attention should be paid to periodic equipment maintenance, ensuring that the milk meter drains completely between the milking of different animals and that cleaning between milkings is performed correctly, to minimize carryover and thereby avoid effects on PAG levels and, consequently, on the pregnancy test results. Therefore, a single milk sample can be used for both milk quality tests and the pregnancy test.

  10. Bovine milk sampling efficiency for pregnancy-associated glycoproteins (PAG) detection test

    International Nuclear Information System (INIS)

    Silva, H. K. da; Cassoli, L.D.; Pantoja, J.F.C.; Cerqueira, P.H.R.; Coitinho, T.B.; Machado, P.F.

    2016-01-01

    Two experiments were conducted to verify whether the time of day at which a milk sample is collected and the possible carryover in the milking system may affect pregnancy-associated glycoproteins (PAG) levels and, consequently, the pregnancy test results in dairy cows. In experiment one, we evaluated the effect of time of day at which the milk sample is collected from 51 cows. In experiment two, which evaluated the possible occurrence of carryover in the milk meter milking system, milk samples from 94 cows belonging to two different farms were used. The samples were subjected to a pregnancy test using an ELISA method to measure PAG concentrations and to classify the samples as positive (pregnant), negative (nonpregnant), or suspicious (recheck). We found that the time of milking did not affect the PAG levels. As to the occurrence of carryover in the milk meter, the PAG levels of the samples collected from Farm-2 were heavily influenced by a carryover effect compared with the samples from Farm-1. Thus, milk samples submitted to a pregnancy test can be collected during the morning or the evening milking. When the sample is collected from the milk meters, attention should be paid to periodic equipment maintenance, ensuring that the milk meter drains completely between the milking of different animals and that cleaning between milkings is performed correctly, to minimize carryover and thereby avoid effects on PAG levels and, consequently, on the pregnancy test results. Therefore, a single milk sample can be used for both milk quality tests and the pregnancy test.

  11. First detection and genotyping of Giardia intestinalis in stool samples collected from children in Ghazni Province, eastern Afghanistan and evaluation of the PCR assay in formalin-fixed specimens.

    Science.gov (United States)

    Lass, Anna; Karanis, Panagiotis; Korzeniewski, Krzysztof

    2017-08-01

    It is estimated that faecal-orally transmitted diseases are common in Afghanistan, as a consequence of poor hygienic standards of life and widespread contamination of water and food with both human and animal faeces. However, there is little information in the literature concerning infections caused by intestinal parasites in the Afghan population. In this study, we report the occurrence of Giardia intestinalis assemblages (A and B) in formalin-fixed stool samples collected from 245 Afghan schoolchildren living in Ghazni Province in eastern Afghanistan. Detection of the parasite's DNA and genotyping was performed using real-time PCR, specific to the β-giardin gene of G. intestinalis. Positive results were recorded in 52 (21.2%) samples. Genotyping was successful in 39 faecal samples and showed the predominance of assemblage B of G. intestinalis in this population (15 assemblage A and 24 assemblage B). Co-infection with both genotypes A and B was detected in four samples. Additionally, we evaluated the effect of 10% buffered formalin fixative on the detection of G. intestinalis DNA using real-time PCR and nested PCR characterised by different lengths of PCR products (74 and 479 bp, respectively). The human faeces containing the Giardia cysts were tested for 16 weeks. Amplification of G. intestinalis DNA with real-time PCR was possible up to 6 weeks of preservation of stool sample in formalin, compared to only 2 weeks with nested PCR. This suggests that real-time PCR is a more suitable tool in cases where stool samples have to be kept in formalin for longer periods of time.

  12. ACTINIDE REMOVAL PROCESS SAMPLE ANALYSIS, CHEMICAL MODELING, AND FILTRATION EVALUATION

    Energy Technology Data Exchange (ETDEWEB)

    Martino, C.; Herman, D.; Pike, J.; Peters, T.

    2014-06-05

    Filtration within the Actinide Removal Process (ARP) currently limits the throughput in interim salt processing at the Savannah River Site. In this process, batches of salt solution with Monosodium Titanate (MST) sorbent are concentrated by crossflow filtration. The filtrate is subsequently processed to remove cesium in the Modular Caustic Side Solvent Extraction Unit (MCU) followed by disposal in saltstone grout. The concentrated MST slurry is washed and sent to the Defense Waste Processing Facility (DWPF) for vitrification. During recent ARP processing, there has been a degradation of filter performance manifested as the inability to maintain high filtrate flux throughout a multi-batch cycle. The objectives of this effort were to characterize the feed streams, to determine if solids (in addition to MST) are precipitating and causing the degraded performance of the filters, and to assess the particle size and rheological data to address potential filtration impacts. Equilibrium modelling with OLI Analyzer™ and OLI ESP™ was performed to determine chemical components at risk of precipitation and to simulate the ARP process. The performance of ARP filtration was evaluated to review potential causes of the observed filter behavior. Task activities for this study included extensive physical and chemical analysis of samples from the Late Wash Pump Tank (LWPT) and the Late Wash Hold Tank (LWHT) within ARP as well as samples of the tank farm feed from Tank 49H. The samples from the LWPT and LWHT were obtained from several stages of processing of Salt Batch 6D, Cycle 6, Batch 16.

  13. Clinical evaluation of human papillomavirus detection by careHPV™ test on physician-samples and self-samples using the indicating FTA Elute® card.

    Science.gov (United States)

    Wang, Shao-Ming; Hu, Shang-Ying; Chen, Feng; Chen, Wen; Zhao, Fang-Hui; Zhang, Yu-Qing; Ma, Xin-Ming; Qiao, You-Lin

    2014-01-01

    The aim was to carry out a clinical evaluation of a solid-state human papillomavirus (HPV) sampling medium in combination with an economical HPV testing method (careHPV™) for cervical cancer screening. A total of 396 women aged 25-65 years were enrolled for cervical cancer screening, and four samples were collected from each. Two samples were collected by the women themselves, of which one was stored in DCM preservative solution (the "liquid sample") and the other was applied to the Whatman Indicating FTA Elute® card (FTA card). The other two samples were collected by a physician and stored in DCM preservative solution and on an FTA card, respectively. All the samples were tested with the careHPV™ assay. All the women underwent a colposcopy examination, and biopsies were taken for pathological confirmation if necessary. The FTA card demonstrated a sensitivity for detecting high-grade cervical intraepithelial neoplasia (CIN) comparable to that of the liquid sample carrier for both self- and physician-sampling, but showed a higher specificity than the liquid sample carrier for self-sampling (FTA vs liquid: 79.0% vs 71.6%, p=0.02). Overall, the FTA card had accuracy comparable to that of the liquid-based medium across sampling operators, with an area under the curve of 0.807 for physician and FTA, 0.781 for physician and liquid, 0.728 for self and FTA, and 0.733 for self and liquid (p>0.05). The FTA card is a promising sample carrier for cervical cancer screening. With appropriate education programmes and further optimization of the experimental workflow, FTA card-based self-collection in combination with centralized careHPV™ testing can help expand the coverage of cervical cancer screening in low-resource areas.
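
    As a reminder of how the reported figures relate to each other, sensitivity, specificity and the area under the ROC curve for a binary (positive/negative) screening test can be computed directly from pathology-confirmed outcomes; the vectors below are made up for illustration and are not data from the record.

        import numpy as np
        from sklearn.metrics import roc_auc_score

        truth = np.array([1, 1, 0, 0, 0, 1, 0, 0])   # 1 = high-grade CIN on biopsy (hypothetical)
        test = np.array([1, 0, 0, 1, 0, 1, 0, 0])    # 1 = careHPV positive (hypothetical)

        sensitivity = (test[truth == 1] == 1).mean()
        specificity = (test[truth == 0] == 0).mean()
        auc = roc_auc_score(truth, test)              # for a binary test this equals (sens + spec) / 2

        print(round(sensitivity, 2), round(specificity, 2), round(auc, 2))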

  14. Comparison of Spot and Time Weighted Averaging (TWA Sampling with SPME-GC/MS Methods for Trihalomethane (THM Analysis

    Directory of Open Access Journals (Sweden)

    Don-Roger Parkinson

    2016-02-01

    Water samples were collected and analyzed for conductivity, pH, temperature and trihalomethanes (THMs) during the fall of 2014 at two monitored municipal drinking water source ponds. Both spot (or grab) and time weighted average (TWA) sampling methods were assessed over the same two-day sampling time period. For spot sampling, replicate samples were taken at each site and analyzed within 12 h of sampling by both headspace (HS-) and direct (DI-) solid phase microextraction (SPME) sampling/extraction methods followed by Gas Chromatography/Mass Spectrometry (GC/MS). For TWA, a two-day passive on-site TWA sampling was carried out at the same sampling points in the ponds. All SPME sampling methods undertaken used a 65-µm PDMS/DVB SPME fiber, which was found optimal for THM sampling. Sampling conditions were optimized in the laboratory using calibration standards of chloroform, bromoform, bromodichloromethane, dibromochloromethane, 1,2-dibromoethane and 1,2-dichloroethane, prepared in aqueous solutions from analytical grade samples. Calibration curves for all methods with R2 values ranging from 0.985–0.998 (N = 5) over the quantitation linear range of 3–800 ppb were achieved. The different sampling methods were compared for quantification of the water samples, and results showed that DI- and TWA-sampling methods gave better data and analytical metrics. Addition of 10% wt./vol. of (NH4)2SO4 salt to the sampling vial was found to aid extraction of THMs by increasing GC peak areas by about 10%, which resulted in lower detection limits for all techniques studied. However, for on-site TWA analysis of THMs in natural waters, the ionic strength conditions of the calibration standard(s) must be carefully matched to natural water conditions to properly quantitate THM concentrations. The data obtained from the TWA method may better reflect actual natural water conditions.
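
    Quantitation against calibration curves of this kind amounts to fitting a line to standard concentrations versus peak areas and inverting it for unknowns; a minimal sketch with illustrative numbers that are not taken from the record:

        import numpy as np

        conc = np.array([3.0, 25.0, 100.0, 400.0, 800.0])        # standard concentrations, ppb
        area = np.array([1.1e4, 9.0e4, 3.6e5, 1.45e6, 2.9e6])    # corresponding peak areas (made up)

        slope, intercept = np.polyfit(conc, area, 1)              # linear model: area = slope * conc + intercept
        r2 = np.corrcoef(conc, area)[0, 1] ** 2

        unknown_area = 5.2e5
        estimated_ppb = (unknown_area - intercept) / slope        # invert the calibration line
        print(round(r2, 4), round(estimated_ppb, 1), "ppb")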

  15. Quantitative real-time polymerase chain reaction for Streptococcus mutans and Streptococcus sobrinus in dental plaque samples and its association with early childhood caries.

    Science.gov (United States)

    Choi, Eun-Jung; Lee, Sung-Hoon; Kim, Young-Jae

    2009-03-01

    Streptococcus mutans and Streptococcus sobrinus are closely associated with the development of early childhood caries (ECC). Recently, quantitative real-time polymerase chain reaction (qRT-PCR) has been used for rapid and accurate quantification of these bacterial species. This study aims to quantify the levels of S. mutans and S. sobrinus in plaque samples by qRT-PCR, and to assess their association with the prevalence of ECC in Korean preschool children. One hundred and five children (71 months old or younger) were examined and classified into three groups (caries-free, ECC, severe ECC). Dental plaque samples were collected and qRT-PCR was conducted using oligonucleotide primers specific for the glucosyltransferase genes (S. mutans-gtfB, S. sobrinus-gtfU) and a universal primer. Pearson's correlation test was conducted to evaluate the relationship between the dmfs (decayed, missing, or filled surfaces of primary teeth) scores and the microbiological findings. The levels of S. mutans and S. sobrinus in the plaque samples differed significantly among the three groups. The children with a higher ratio of S. sobrinus to S. mutans in their dental plaque showed a higher incidence of ECC.
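
    One common way to express such qRT-PCR results is the relative abundance of each target species with respect to total bacteria via the comparative Ct approach; the study may have used standard curves instead, so the following is only a generic sketch with hypothetical Ct values and an assumed amplification efficiency of 2.

        def relative_abundance(ct_target, ct_universal, efficiency=2.0):
            # Comparative Ct: fraction of the total 16S signal attributable to the target,
            # assuming equal amplification efficiencies (an assumption, not from the record)
            return efficiency ** -(ct_target - ct_universal)

        print(relative_abundance(ct_target=24.1, ct_universal=18.3))   # hypothetical values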

  16. Evaluation of ultrasound-assisted extraction as sample pre-treatment for quantitative determination of rare earth elements in marine biological tissues by inductively coupled plasma-mass spectrometry

    International Nuclear Information System (INIS)

    Costas, M.; Lavilla, I.; Gil, S.; Pena, F.; Calle, I.; Cabaleiro, N. de la; Bendicho, C.

    2010-01-01

    In this work, the determination of rare earth elements (REEs), i.e. Y, La, Ce, Pr, Nd, Sm, Eu, Gd, Tb, Dy, Ho, Er, Tm, Yb and Lu, in marine biological tissues by inductively coupled plasma-mass spectrometry (ICP-MS) after a sample preparation method based on ultrasound-assisted extraction (UAE) is described. The suitability of the extracts for ICP-MS measurements was evaluated. For that, studies were focused on the following issues: (i) use of clean-up of extracts with a C18 cartridge for non-polar solid phase extraction; (ii) use of different internal standards; (iii) signal drift caused by changes in the nebulization efficiency and salt deposition on the cones during the analysis. The signal drift produced by direct introduction of biological extracts into the instrument was evaluated using a calibration verification standard for bracketing (standard-sample bracketing, SSB) and cumulative sum (CUSUM) control charts. Parameters influencing extraction such as extractant composition, mass-to-volume ratio, particle size, sonication time and sonication amplitude were optimized. Diluted single acids (HNO3 and HCl) and mixtures (HNO3 + HCl) were evaluated for improving the extraction efficiency. Quantitative recoveries for REEs were achieved using 5 mL of 3% (v/v) HNO3 + 2% (v/v) HCl, particle size <200 μm, 3 min of sonication time and 50% of sonication amplitude. Precision, expressed as relative standard deviation from three independent extractions, ranged from 0.1 to 8%. In general, LODs were improved by a factor of 5 in comparison with those obtained after microwave-assisted digestion (MAD). The accuracy of the method was evaluated using the CRM BCR-668 (mussel tissue). Different seafood samples of common consumption were analyzed by ICP-MS after UAE and MAD.
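
    The signal-drift check described above relies on CUSUM control charts of a calibration verification standard measured between samples. A minimal sketch of a tabular CUSUM applied to hypothetical check-standard recoveries (target, sigma, allowance k and decision interval h are assumed values, not the authors' settings):

```python
import numpy as np

# Hypothetical recoveries (%) of a calibration verification standard measured repeatedly
# between sample blocks (standard-sample bracketing); drift shows up as a trend away from 100%.
recovery = np.array([100.2, 99.8, 100.5, 101.0, 101.8, 102.5, 103.1, 103.9])

target = 100.0       # expected recovery (%)
sigma = 1.0          # assumed short-term standard deviation of the check standard (%)
k = 0.5 * sigma      # allowance (slack) parameter of the tabular CUSUM
h = 4.0 * sigma      # decision interval

c_plus, c_minus = 0.0, 0.0               # one-sided CUSUM statistics (upward / downward drift)
for i, x in enumerate(recovery, start=1):
    c_plus = max(0.0, c_plus + (x - target) - k)
    c_minus = max(0.0, c_minus + (target - x) - k)
    flag = "DRIFT" if (c_plus > h or c_minus > h) else "ok"
    print(f"check {i}: recovery = {x:5.1f}%  C+ = {c_plus:5.2f}  C- = {c_minus:5.2f}  {flag}")
```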

  17. β-NMR sample optimization

    CERN Document Server

    Zakoucka, Eva

    2013-01-01

    During my summer student programme I was working on sample optimization for a new β-NMR project at the ISOLDE facility. The β-NMR technique is well established in solid-state physics, and only recently has it been introduced for applications in biochemistry and life sciences. The β-NMR collaboration will be applying for beam time to the INTC committee in September for three nuclei: Cu, Zn and Mg. Sample optimization for Mg was already performed last year during the summer student programme. Therefore sample optimization for Cu and Zn had to be completed as well for the project proposal. My part in the project was to perform thorough literature research on techniques studying Cu and Zn complexes in native conditions, search for relevant binding candidates for Cu and Zn applicable for β-NMR and eventually evaluate selected binding candidates using UV-VIS spectrometry.

  18. Reducing the Cost and Time to Perform a Human Factors Engineering Evaluation

    International Nuclear Information System (INIS)

    Geary, L.C. Dr.

    2003-01-01

    The Westinghouse Savannah River Company, a contractor to the Department of Energy, has developed a new software tool for automating the Human Factors Engineering design review, analysis, and evaluation processes. The set of design guidelines used in the tool was obtained from the United States Nuclear Regulatory Commission document NUREG-0700, Human-System Interface Design Review Guideline. This tool has been described at a previous IEEE Conference on Human Factors and Power Plants. The original NUREG-0700 software tool was used to evaluate a facility, and a separate independent evaluation was performed using the new tool for the same facility. A comparison was made between the two different tools, both in results obtained and in cost and time to complete the evaluation. The results demonstrate a five- to ten-fold reduction in time and cost to complete the evaluation using the newly developed tool while maintaining consistent evaluation results. The time to perform the review was measured in weeks using the new software tool rather than months using the existing NUREG-0700 tool. The new tool has been so successful that it was applied to two additional facilities with the same reduced time and cost savings. Plans have been made to use the new tool at other facilities in order to provide the same savings.

  19. Recombinant plasmid-based quantitative Real-Time PCR analysis of Salmonella enterica serotypes and its application to milk samples.

    Science.gov (United States)

    Gokduman, Kurtulus; Avsaroglu, M Dilek; Cakiris, Aris; Ustek, Duran; Gurakan, G Candan

    2016-03-01

    The aim of the current study was to develop a new, rapid, sensitive and quantitative Salmonella detection method using a Real-Time PCR technique based on an inexpensive, easy-to-produce, convenient and standardized recombinant plasmid positive control. To achieve this, two recombinant plasmids were constructed as reference molecules by cloning the two most commonly used Salmonella-specific target gene regions, invA and ttrRSBC. The more rapid detection enabled by the developed method (21 h) compared to the traditional culture method (90 h) allows the quantitative evaluation of Salmonella (quantification limits of 10^1 CFU/ml and 10^0 CFU/ml for the invA target and the ttrRSBC target, respectively), as illustrated using milk samples. Three advantages illustrated by the current study demonstrate the potential of the newly developed method to be used in routine analyses in the medical, veterinary, food and water/environmental sectors: I--The method provides fast analyses including the simultaneous detection and determination of correct pathogen counts; II--The method is applicable to challenging samples, such as milk; III--The method's positive controls (recombinant plasmids) are reproducible in large quantities without the need to construct new calibration curves. Copyright © 2016 Elsevier B.V. All rights reserved.
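
    Quantification with a recombinant plasmid positive control rests on a standard curve of Ct versus log10 copy number. The sketch below shows that calculation with hypothetical dilution data; the copy numbers, Ct values and unknown are illustrative only:

```python
import numpy as np

# Hypothetical standard curve from serial dilutions of the recombinant plasmid
# carrying the invA target: known copy numbers per reaction and measured Ct values.
copies = np.array([1e1, 1e2, 1e3, 1e4, 1e5, 1e6])
ct     = np.array([34.8, 31.4, 28.1, 24.7, 21.3, 18.0])

# Linear fit of Ct against log10(copy number): Ct = slope * log10(N) + intercept
slope, intercept = np.polyfit(np.log10(copies), ct, 1)

# PCR amplification efficiency implied by the slope (100% corresponds to slope = -3.32)
efficiency = 10.0 ** (-1.0 / slope) - 1.0

# Estimate the copy number in a hypothetical milk-sample extract from its Ct
ct_unknown = 26.5
estimated_copies = 10.0 ** ((ct_unknown - intercept) / slope)

print(f"slope = {slope:.2f}, efficiency = {efficiency * 100:.1f}%")
print(f"estimated target copies per reaction: {estimated_copies:.0f}")
```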

  20. Evaluation of a real-time travel time prediction system in a freeway construction work zone : final report, March 2001.

    Science.gov (United States)

    2001-03-01

    A real-time travel time prediction system (TIPS) was evaluated in a construction work zone. TIPS includes changeable message signs (CMSs) displaying the travel time and distance to the end of the work zone to motorists. The travel times displayed by ...

  1. Rapid Analysis of Carbohydrates in Bioprocess Samples: An Evaluation of the CarboPac SA10 for HPAE-PAD Analysis by Interlaboratory Comparison

    Energy Technology Data Exchange (ETDEWEB)

    Sevcik, R. S.; Hyman, D. A.; Basumallich, L.; Scarlata, C. J.; Rohrer, J.; Chambliss, C. K.

    2013-01-01

    A technique for carbohydrate analysis of bioprocess samples has been developed, providing reduced analysis time compared to current practice in the biofuels R&D community. The Thermo Fisher CarboPac SA10 anion-exchange column enables isocratic separation of monosaccharides, sucrose and cellobiose in approximately 7 minutes. Additionally, use of a low-volume (0.2 mL) injection valve in combination with a high-volume detection cell minimizes the extent of sample dilution required to bring sugar concentrations into the linear range of the pulsed amperometric detector (PAD). Three laboratories, representing academia, industry, and government, participated in an interlaboratory study which analyzed twenty-one opportunistic samples representing biomass pretreatment, enzymatic saccharification, and fermentation samples. The technique's robustness, linearity, and interlaboratory reproducibility were evaluated and showed excellent-to-acceptable characteristics. Additionally, quantitation by the CarboPac SA10/PAD was compared with the current practice method utilizing an HPX-87P/RID. While the two methods showed good agreement, a statistical comparison found a significant quantitation difference between them, highlighting the difference between selective and universal detection modes.
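
    An interlaboratory or method comparison of this kind typically tests whether paired results from the two columns differ systematically. A minimal sketch using a paired t-test on hypothetical glucose values (the concentrations are invented for illustration):

```python
import numpy as np
from scipy import stats

# Hypothetical paired glucose results (g/L) for the same bioprocess samples measured
# by the CarboPac SA10/PAD method and by the HPX-87P/RID method.
sa10_pad  = np.array([12.1, 8.4, 15.3, 6.7, 10.2, 9.8, 13.5])
hpx87_rid = np.array([12.5, 8.9, 15.8, 7.0, 10.6, 10.1, 14.0])

# Paired t-test on the per-sample differences: is there a systematic bias?
t_stat, p_value = stats.ttest_rel(sa10_pad, hpx87_rid)
mean_diff = np.mean(sa10_pad - hpx87_rid)

print(f"mean difference = {mean_diff:.2f} g/L, t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value would indicate a statistically significant quantitation difference
# even when the two methods agree well on average.
```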

  2. Evaluation of Two Lyophilized Molecular Assays to Rapidly Detect Foot-and-Mouth Disease Virus Directly from Clinical Samples in Field Settings.

    Science.gov (United States)

    Howson, E L A; Armson, B; Madi, M; Kasanga, C J; Kandusi, S; Sallu, R; Chepkwony, E; Siddle, A; Martin, P; Wood, J; Mioulet, V; King, D P; Lembo, T; Cleaveland, S; Fowler, V L

    2017-06-01

    Accurate, timely diagnosis is essential for the control, monitoring and eradication of foot-and-mouth disease (FMD). Clinical samples from suspect cases are normally tested at reference laboratories. However, transport of samples to these centralized facilities can be a lengthy process that can impose delays on critical decision making. These concerns have motivated work to evaluate simple-to-use technologies, including molecular-based diagnostic platforms, that can be deployed closer to suspect cases of FMD. In this context, FMD virus (FMDV)-specific reverse transcription loop-mediated isothermal amplification (RT-LAMP) and real-time RT-PCR (rRT-PCR) assays, compatible with simple sample preparation methods and in situ visualization, have been developed which share equivalent analytical sensitivity with laboratory-based rRT-PCR. However, the lack of robust 'ready-to-use kits' that utilize stabilized reagents limits the deployment of these tests into field settings. To address this gap, this study describes the performance of lyophilized rRT-PCR and RT-LAMP assays to detect FMDV. Both of these assays are compatible with the use of fluorescence to monitor amplification in real-time, and for the RT-LAMP assays end point detection could also be achieved using molecular lateral flow devices. Lyophilization of reagents did not adversely affect the performance of the assays. Importantly, when these assays were deployed into challenging laboratory and field settings within East Africa they proved to be reliable in their ability to detect FMDV in a range of clinical samples from acutely infected as well as convalescent cattle. These data support the use of highly sensitive molecular assays into field settings for simple and rapid detection of FMDV. © 2015 The Authors. Transboundary and Emerging Diseases Published by Blackwell Verlag GmbH.

  3. Evaluation of highly accelerated real-time cardiac cine MRI in tachycardia.

    Science.gov (United States)

    Bassett, Elwin C; Kholmovski, Eugene G; Wilson, Brent D; DiBella, Edward V R; Dosdall, Derek J; Ranjan, Ravi; McGann, Christopher J; Kim, Daniel

    2014-02-01

    Electrocardiogram (ECG)-gated breath-hold cine MRI is considered to be the gold standard test for the assessment of cardiac function. However, it may fail in patients with arrhythmia, impaired breath-hold capacity and poor ECG gating. Although ungated real-time cine MRI may mitigate these problems, commercially available real-time cine MRI pulse sequences using parallel imaging typically yield relatively poor spatiotemporal resolution because of their low image acquisition efficiency. As an extension of our previous work, the purpose of this study was to evaluate the diagnostic quality and accuracy of eight-fold-accelerated real-time cine MRI with compressed sensing (CS) for the quantification of cardiac function in tachycardia, where it is challenging for real-time cine MRI to provide sufficient spatiotemporal resolution. We evaluated the performances of eight-fold-accelerated cine MRI with CS, three-fold-accelerated real-time cine MRI with temporal generalized autocalibrating partially parallel acquisitions (TGRAPPA) and ECG-gated breath-hold cine MRI in 21 large animals with tachycardia (mean heart rate, 104 beats per minute) at 3T. For each cine MRI method, two expert readers evaluated the diagnostic quality in four categories (image quality, temporal fidelity of wall motion, artifacts and apparent noise) using a Likert scale (1-5, worst to best). One reader evaluated the left ventricular functional parameters. The diagnostic quality scores were significantly different between the three cine pulse sequences, except for the artifact level between CS and TGRAPPA real-time cine MRI. Both ECG-gated breath-hold cine MRI and eight-fold accelerated real-time cine MRI yielded all four scores of ≥ 3.0 (acceptable), whereas three-fold-accelerated real-time cine MRI yielded all scores below 3.0, except for artifact (3.0). The left ventricular ejection fraction (LVEF) measurements agreed better between ECG-gated cine MRI and eight-fold-accelerated real-time cine MRI

  4. Stochastic, real-space, imaginary-time evaluation of third-order Feynman–Goldstone diagrams

    International Nuclear Information System (INIS)

    Willow, Soohaeng Yoo; Hirata, So

    2014-01-01

    A new, alternative set of interpretation rules of Feynman–Goldstone diagrams for many-body perturbation theory is proposed, which translates diagrams into algebraic expressions suitable for direct Monte Carlo integrations. A vertex of a diagram is associated with a Coulomb interaction (rather than a two-electron integral) and an edge with the trace of a Green's function in real space and imaginary time. With these, 12 diagrams of third-order many-body perturbation (MP3) theory are converted into 20-dimensional integrals, which are then evaluated by a Monte Carlo method. It uses redundant walkers for convergence acceleration and a weight function for importance sampling in conjunction with the Metropolis algorithm. The resulting Monte Carlo MP3 method has low-rank polynomial size dependence of the operation cost, a negligible memory cost, and a naturally parallel computational kernel, while reproducing the correct correlation energies of small molecules within a few mE_h after 10^6 Monte Carlo steps.

  5. Stochastic, real-space, imaginary-time evaluation of third-order Feynman–Goldstone diagrams

    Energy Technology Data Exchange (ETDEWEB)

    Willow, Soohaeng Yoo [Department of Chemistry, University of Illinois at Urbana-Champaign, 600 South Mathews Avenue, Urbana, Illinois 61801 (United States); Center for Superfunctional Materials, Department of Chemistry, Pohang University of Science and Technology, San 31, Hyojadong, Namgu, Pohang 790-784 (Korea, Republic of); Hirata, So, E-mail: sohirata@illinois.edu [Department of Chemistry, University of Illinois at Urbana-Champaign, 600 South Mathews Avenue, Urbana, Illinois 61801 (United States); CREST, Japan Science and Technology Agency, Saitama 332-0012 (Japan)

    2014-01-14

    A new, alternative set of interpretation rules of Feynman–Goldstone diagrams for many-body perturbation theory is proposed, which translates diagrams into algebraic expressions suitable for direct Monte Carlo integrations. A vertex of a diagram is associated with a Coulomb interaction (rather than a two-electron integral) and an edge with the trace of a Green's function in real space and imaginary time. With these, 12 diagrams of third-order many-body perturbation (MP3) theory are converted into 20-dimensional integrals, which are then evaluated by a Monte Carlo method. It uses redundant walkers for convergence acceleration and a weight function for importance sampling in conjunction with the Metropolis algorithm. The resulting Monte Carlo MP3 method has low-rank polynomial size dependence of the operation cost, a negligible memory cost, and a naturally parallel computational kernel, while reproducing the correct correlation energies of small molecules within a few mE_h after 10^6 Monte Carlo steps.
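
    The core numerical idea, Metropolis importance sampling of a high-dimensional integral with a positive weight function, can be illustrated on a much smaller problem. The sketch below estimates a 3-dimensional Gaussian-weighted integral with a known exact value; it is only an illustration of the sampling technique, not of the MP3 diagrams themselves:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: I = integral of |r|^2 * exp(-|r|^2) over R^d with d = 3.
# Exact value: (d/2) * pi^(d/2), i.e. about 8.354 for d = 3.
d = 3
norm = np.pi ** (d / 2)              # normalization of the weight w(r) = exp(-|r|^2) / pi^(d/2)

def weight(r):
    return np.exp(-np.dot(r, r))

# Metropolis random walk sampling of the weight function w(r)
r = np.zeros(d)
w_r = weight(r)
samples = []
n_steps = 200_000
for _ in range(n_steps):
    trial = r + rng.uniform(-1.0, 1.0, size=d)
    w_trial = weight(trial)
    if rng.random() < w_trial / w_r:     # Metropolis acceptance test
        r, w_r = trial, w_trial
    samples.append(np.dot(r, r))         # record f(r) = |r|^2 for the current state

# I = norm * E_w[|r|^2]; discard the first 10% of steps as burn-in
estimate = norm * np.mean(samples[n_steps // 10:])
print(f"Monte Carlo estimate: {estimate:.3f}  (exact: {(d / 2) * norm:.3f})")
```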

  6. Evaluation of time integration methods for transient response analysis of nonlinear structures

    International Nuclear Information System (INIS)

    Park, K.C.

    1975-01-01

    Recent developments in the evaluation of direct time integration methods for the transient response analysis of nonlinear structures are presented. These developments, which are based on local stability considerations of an integrator, show that the interaction between temporal step size and nonlinearities of structural systems has a pronounced effect on both accuracy and stability of a given time integration method. The resulting evaluation technique is applied to a model nonlinear problem, in order to: 1) demonstrate that it eliminates the present costly process of evaluating time integrators for nonlinear structural systems via extensive numerical experiments; 2) identify the desirable characteristics of time integration methods for nonlinear structural problems; 3) develop improved stiffly-stable methods for application to nonlinear structures. Extension of the methodology for examination of the interaction between a time integrator and the approximate treatment of nonlinearities (such as due to pseudo-force or incremental solution procedures) is also discussed. (Auth.)
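
    The interaction between step size and nonlinearity can be seen on a single-degree-of-freedom hardening (Duffing-type) oscillator integrated with the explicit central-difference method: a step that is stable for the underlying linear system becomes unstable once the amplitude-dependent stiffness grows. A minimal sketch with assumed parameter values:

```python
import numpy as np

# Undamped SDOF oscillator with a hardening (Duffing-type) spring:
#   m * u'' + k * u + k3 * u^3 = 0
# The effective stiffness k + 3*k3*u^2 grows with amplitude, so a step size that is
# stable for the linear system (k alone) can become unstable for the nonlinear one.
m, k, k3 = 1.0, 1.0, 50.0
u0, v0 = 1.0, 0.0

def central_difference(dt, t_end=20.0):
    a0 = -(k * u0 + k3 * u0 ** 3) / m
    u_prev = u0 - dt * v0 + 0.5 * dt ** 2 * a0   # fictitious step at t = -dt
    u = u0
    for _ in range(int(t_end / dt)):
        acc = -(k * u + k3 * u ** 3) / m
        u_next = 2.0 * u - u_prev + dt ** 2 * acc
        u_prev, u = u, u_next
        if not np.isfinite(u) or abs(u) > 1e3:
            return None                          # solution diverged: unstable step size
    return u

for dt in (0.2, 0.05, 0.01):
    result = central_difference(dt)
    status = "unstable (diverged)" if result is None else f"u(20) = {result:.4f}"
    print(f"dt = {dt:5.2f}: {status}")
```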

  7. Multiplex Real-Time PCR Assay Using TaqMan Probes for the Identification of Trypanosoma cruzi DTUs in Biological and Clinical Samples

    Science.gov (United States)

    Cura, Carolina I.; Duffy, Tomas; Lucero, Raúl H.; Bisio, Margarita; Péneau, Julie; Jimenez-Coello, Matilde; Calabuig, Eva; Gimenez, María J.; Valencia Ayala, Edward; Kjos, Sonia A.; Santalla, José; Mahaney, Susan M.; Cayo, Nelly M.; Nagel, Claudia; Barcán, Laura; Málaga Machaca, Edith S.; Acosta Viana, Karla Y.; Brutus, Laurent; Ocampo, Susana B.; Aznar, Christine; Cuba Cuba, Cesar A.; Gürtler, Ricardo E.; Ramsey, Janine M.; Ribeiro, Isabela; VandeBerg, John L.; Yadon, Zaida E.; Osuna, Antonio; Schijman, Alejandro G.

    2015-01-01

    Background: Trypanosoma cruzi has been classified into six Discrete Typing Units (DTUs), designated as TcI–TcVI. In order to effectively use this standardized nomenclature, a reproducible genotyping strategy is imperative. Several typing schemes have been developed with variable levels of complexity, selectivity and analytical sensitivity. Most of them can only be applied to cultured stocks. In this context, we aimed to develop a multiplex Real-Time PCR method to identify the six T. cruzi DTUs using TaqMan probes (MTq-PCR). Methods/Principal Findings: The MTq-PCR has been evaluated in 39 cultured stocks and 307 biological samples from vectors, reservoirs and patients from different geographical regions and transmission cycles in comparison with a multi-locus conventional PCR algorithm. The MTq-PCR was inclusive for laboratory stocks and natural isolates and sensitive for direct typing of different biological samples from vectors, reservoirs and patients with acute, congenital infection or Chagas reactivation. The first-round SL-IR MTq-PCR detected 1 fg DNA/reaction tube of TcI, TcII and TcIII and 1 pg DNA/reaction tube of TcIV, TcV and TcVI reference strains. The MTq-PCR was able to characterize DTUs in 83% of triatomine and 96% of reservoir samples that had been typed by conventional PCR methods. Regarding clinical samples, 100% of those derived from acute infected patients, 62.5% from congenitally infected children and 50% from patients with clinical reactivation could be genotyped. Sensitivity for direct typing of blood samples from chronic Chagas disease patients (32.8% from asymptomatic and 22.2% from symptomatic patients) and mixed infections was lower than that of the conventional PCR algorithm. Conclusions/Significance: Typing is resolved after a single or a second round of Real-Time PCR, depending on the DTU. This format reduces carryover contamination and is amenable to quantification, automation and kit production. PMID:25993316

  8. Extending laboratory automation to the wards: effect of an innovative pneumatic tube system on diagnostic samples and transport time.

    Science.gov (United States)

    Suchsland, Juliane; Winter, Theresa; Greiser, Anne; Streichert, Thomas; Otto, Benjamin; Mayerle, Julia; Runge, Sören; Kallner, Anders; Nauck, Matthias; Petersmann, Astrid

    2017-02-01

    The innovative pneumatic tube system (iPTS) transports one sample at a time without the use of cartridges and allows rapid sending of samples directly into the bulk loader of a laboratory automation system (LAS). We investigated the effects of the iPTS on samples and on turn-around time (TAT). During transport, a mini data logger recorded the accelerations in three dimensions and reported them in arbitrary area under the curve (AUC) units. In addition, representative quantities of clinical chemistry, hematology and coagulation were measured and compared in 20 blood sample pairs transported by iPTS and courier. Samples transported by iPTS were brought to the laboratory (300 m) within 30 s without adverse effects on the samples. The information retrieved from the data logger showed a median AUC of 7 and 310 arbitrary units for courier and iPTS transport, respectively. This is considerably below the reported limit for noticeable hemolysis of 500 arbitrary units. The iPTS reduces TAT through reduced hands-on time and fast transport. No differences in the measurement results were found for any of the 36 investigated analytes between courier and iPTS transport. Based on these findings, the iPTS was cleared for clinical use in our hospital.
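
    The data logger reports transport stress as an area-under-the-curve of the recorded accelerations. A minimal sketch of how such an AUC metric could be accumulated from a three-axis trace (the sampling rate, noise level and integration choice are assumptions; the logger's actual algorithm is not specified in the record):

```python
import numpy as np

# Hypothetical 3-axis trace from a mini data logger sampling at 100 Hz during a 30 s transport.
fs = 100.0                                   # sampling rate (Hz)
t = np.arange(0.0, 30.0, 1.0 / fs)           # time axis (s)
rng = np.random.default_rng(1)
accel = rng.normal(0.0, 0.5, size=(t.size, 3))   # ax, ay, az in arbitrary units

# Magnitude of the acceleration vector at each time point
magnitude = np.linalg.norm(accel, axis=1)

# Area under the curve of |a| over the whole transport, by trapezoidal integration
auc = np.trapz(magnitude, t)
print(f"transport AUC: {auc:.1f} arbitrary units")
```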

  9. Evaluation of laser phosphorimetry for the analysis of uranium in biological samples from laboratory animal studies

    International Nuclear Information System (INIS)

    Gray, D.; Eidsom, A.F.

    1985-01-01

    Laser phosphorimetry has been used for uranium analyses in a variety of sample matrices, including environmental and human bioassay samples. The Scintrex UA-3 Uranium Analyzer has been used at ITRI to acquire data on the applicability of laser phosphorimetry to analyses of uranium in the highly concentrated solutions resulting from chemical processing of biological samples, and to compare results with those obtained from conventional fluorometry. These comparisons have been very favorable for many sample types. Results of these comparisons and an evaluation of the data obtained with the Scintrex unit are presented.

  10. Dual-labeled time-resolved fluoroimmunoassay for simultaneous detection of clothianidin and diniconazole in agricultural samples.

    Science.gov (United States)

    Sheng, Enze; Shi, Haiyan; Zhou, Liangliang; Hua, Xiude; Feng, Lu; Yu, Tong; Wang, Minghua

    2016-02-01

    Europium (Eu(3+)) and samarium (Sm(3+)) were used as fluorescent labels to develop a highly sensitive dual-labeled time-resolved fluoroimmunoassay (TRFIA) to detect clothianidin and diniconazole in food samples. Under the optimized assay conditions, the 50% inhibition concentration (IC50) and the limit of detection (LOD, IC10) were 5.08 and 0.021 μg/L for clothianidin, and 13.14 and 0.029 μg/L for diniconazole. The cross-reactivities (CRs) were negligible except for dinotefuran (9.4%) and uniconazole (4.28%). The recoveries of clothianidin and diniconazole ranged from 79.3% to 108.7% in food samples. The results of TRFIA for the authentic samples were validated by gas chromatography (GC) analyses, and satisfactory correlations were obtained. These results indicated that the method is an alternative tool for simultaneous detection of clothianidin and diniconazole in food samples. Copyright © 2015 Elsevier Ltd. All rights reserved.
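
    Competitive immunoassay calibration of this type is commonly fitted with a four-parameter logistic (4PL) curve, from which IC50 and IC10 (LOD) are read off. A minimal sketch with hypothetical concentration-response data (the 4PL form is a standard choice, not necessarily the exact model used by the authors):

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical competitive TRFIA calibration: analyte concentration (ug/L) versus
# normalized fluorescence (B/B0); the signal falls as the concentration rises.
conc   = np.array([0.01, 0.1, 0.5, 1.0, 5.0, 10.0, 50.0, 100.0])
signal = np.array([0.98, 0.93, 0.82, 0.70, 0.42, 0.30, 0.12, 0.07])

def four_pl(x, a, b, c, d):
    """Four-parameter logistic: a = top, d = bottom, c = midpoint (IC50), b = slope."""
    return d + (a - d) / (1.0 + (x / c) ** b)

params, _ = curve_fit(four_pl, conc, signal, p0=[1.0, 1.0, 5.0, 0.0])
a, b, c, d = params

def concentration_at(inhibition):
    """Concentration producing a given fractional inhibition (0.5 -> IC50, 0.1 -> IC10)."""
    y = 1.0 - inhibition                    # remaining fractional response
    return c * (1.0 / y - 1.0) ** (1.0 / b)

print(f"IC50 ≈ {concentration_at(0.5):.2f} ug/L")
print(f"LOD (IC10) ≈ {concentration_at(0.1):.3f} ug/L")
```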

  11. Quasi-exact evaluation of time domain MFIE MOT matrix elements

    KAUST Repository

    Shi, Yifei; Bagci, Hakan; Shanker, Balasubramaniam; Lu, Mingyu; Michielssen, Eric

    2013-01-01

    A previously proposed quasi-exact scheme for evaluating matrix elements resulting from the marching-on-in-time (MOT) discretization of the time domain electric field integral equation (EFIE) is extended to matrix entries resulting from the discretization of its magnetic field integral equation (MFIE) counterpart. Numerical results demonstrate the accuracy of the scheme as well as the late-time stability of the resulting MOT-MFIE solver. © 2013 IEEE.

  12. Quasi-exact evaluation of time domain MFIE MOT matrix elements

    KAUST Repository

    Shi, Yifei

    2013-07-01

    A previously proposed quasi-exact scheme for evaluating matrix elements resulting from the marching-on-in-time (MOT) discretization of the time domain electric field integral equation (EFIE) is extended to matrix entries resulting from the discretization of its magnetic field integral equation (MFIE) counterpart. Numerical results demonstrate the accuracy of the scheme as well as the late-time stability of the resulting MOT-MFIE solver. © 2013 IEEE.

  13. Comparative Evaluations of Four Specification Methods for Real-Time Systems

    Science.gov (United States)

    1989-12-01

    Comparative Evaluations of Four Specification Methods for Real-Time Systems, December 1989, by David P. Wood and William G. Wood (Specification and Design Methods ... Methods for Real-Time Systems). Abstract: A number of methods have been proposed in the last decade for the specification of system and software requirements ... and software specification for real-time systems. Our process for the identification of methods that meet the above criteria is described in greater ...

  14. Evaluation of the Abbott Real Time HCV genotype II assay for Hepatitis C virus genotyping.

    Science.gov (United States)

    Sariguzel, Fatma Mutlu; Berk, Elife; Gokahmetoglu, Selma; Ercal, Baris Derya; Celik, Ilhami

    2015-01-01

    The determination of HCV genotypes and subtypes is very important for the selection of antiviral therapy and for epidemiological studies. The aim of this study was to evaluate the performance of the Abbott Real Time HCV Genotype II assay in HCV genotyping of HCV-infected patients in Kayseri, Turkey. One hundred patients with chronic hepatitis C admitted to our hospital were evaluated between June 2012 and December 2012. HCV RNA levels were determined by the COBAS® AmpliPrep/COBAS® TaqMan® 48 HCV test. HCV genotyping was investigated by the Abbott Real Time HCV Genotype II assay. With the exception of genotype 1, subtypes of HCV genotypes could not be determined by the Abbott assay. Sequencing analysis was used as the reference method. Genotypes 1, 2, 3 and 4 were observed in 70, 4, 2 and 24 of the 100 patients, respectively, by both methods. The concordance between the two systems in determining HCV major genotypes was 100%. Of the 70 patients with genotype 1, 66 showed infection with subtype 1b and 4 with subtype 1a by the Abbott Real Time HCV Genotype II assay. Using sequence analysis, 61 showed infection with subtype 1b and 9 with subtype 1a. In determining HCV genotype 1 subtypes, the difference between the two methods was not statistically significant (P>0.05). HCV genotype 4 and 3 samples were found to be subtype 4d and 3a, respectively, by sequence analysis. There were four patients with genotype 2. Sequence analysis revealed that two of these patients had type 2a and the other two had type 2b. The Abbott Real Time HCV Genotype II assay yielded results consistent with sequence analysis. However, further optimization of the Abbott Real Time HCV Genotype II assay for subtype identification of HCV is required.

  15. EVALUATION OF DISPOSABLE DIAPERS FOR QUANTITATIVE MEASUREMENTS OF PESTICIDE METABOLITES AND CREATININE IN URINE SAMPLES

    Science.gov (United States)

    This project consisted of a laboratory study to evaluate an extraction and analysis method for quantifying biomarkers of pesticide exposure and creatinine in urine samples collected with commercially-available disposable diapers. For large exposure studies, such as the National ...

  16. Non-destructive evaluation of the hidden voids in integrated circuit packages using terahertz time-domain spectroscopy

    International Nuclear Information System (INIS)

    Park, Sung-Hyeon; Kim, Hak-Sung; Jang, Jin-Wook

    2015-01-01

    In this work, a terahertz time-domain spectroscopy (THz-TDS) imaging technique was used as a non-destructive inspection method for detecting voids in integrated circuit (IC) packages. Transmission and reflection modes, with an angle of incidence of 30°, were used to detect voids in IC packages. The locations of the detected voids in the IC packages could be calculated by analyzing THz waveforms. Finally, voids that are positioned at the different interfaces in the IC package samples could be successfully detected and imaged. Therefore, this THz-TDS imaging technique is expected to be a promising technique for non-destructive evaluation of IC packages. (paper)
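
    In reflection mode, the depth of a detected interface follows from the echo delay and the refractive index of the package material. A minimal sketch of that time-of-flight calculation, assuming near-normal incidence for simplicity (the study used a 30° angle) and an assumed refractive index and delay:

```python
# Time-of-flight depth estimate for a reflection-mode THz measurement.
C0 = 2.998e8          # speed of light in vacuum (m/s)
n_mold = 1.9          # assumed THz refractive index of the epoxy mold compound (hypothetical)
delta_t = 4.2e-12     # measured delay between surface echo and void echo (s), hypothetical

# The pulse travels down to the interface and back, hence the factor of 2.
depth = C0 * delta_t / (2.0 * n_mold)
print(f"estimated void depth: {depth * 1e6:.1f} um below the package surface")
```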

  17. Two General Extension Algorithms of Latin Hypercube Sampling

    Directory of Open Access Journals (Sweden)

    Zhi-zhao Liu

    2015-01-01

    Full Text Available To retain original sampling points and thereby reduce the number of simulation runs, two general extension algorithms of Latin Hypercube Sampling (LHS) are proposed. The extension algorithms start with an original LHS of size m and construct a new LHS of size m+n that contains as many of the original points as possible. In order to get a strict LHS of larger size, some original points might be deleted. The relationship of the original sampling points in the new LHS structure is shown by a simple undirected acyclic graph. The basic general extension algorithm is proposed to retain the most original points, but it is computationally expensive. Therefore, a general extension algorithm based on a greedy algorithm is proposed to reduce the extension time, although it cannot guarantee that the maximum number of original points is retained. These algorithms are illustrated by an example and applied to evaluating sample means to demonstrate their effectiveness.
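
    For reference, a basic (non-extended) LHS places exactly one point in each of m equal probability strata per dimension. The sketch below generates such a design and uses it to evaluate a sample mean, as in the paper's application; it does not implement the extension algorithms themselves:

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(m, dims):
    """Basic LHS of size m on [0, 1)^dims: one point per equal-probability stratum per dimension."""
    sample = np.empty((m, dims))
    for j in range(dims):
        points = (np.arange(m) + rng.random(m)) / m   # one random point in each of m strata
        sample[:, j] = rng.permutation(points)        # shuffle strata assignments across rows
    return sample

# Estimate the mean of a test function on [0, 1)^2 and compare LHS with plain Monte Carlo.
f = lambda x: np.sin(np.pi * x[:, 0]) * x[:, 1] ** 2   # exact mean = (2/pi) * (1/3)
m = 50
lhs_mean = f(latin_hypercube(m, 2)).mean()
mc_mean = f(rng.random((m, 2))).mean()
exact = (2.0 / np.pi) / 3.0
print(f"LHS mean: {lhs_mean:.4f}   plain MC mean: {mc_mean:.4f}   exact: {exact:.4f}")
```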

  18. Evaluation of common methods for sampling invertebrate pollinator assemblages: net sampling out-perform pan traps.

    Science.gov (United States)

    Popic, Tony J; Davila, Yvonne C; Wardle, Glenda M

    2013-01-01

    Methods for sampling ecological assemblages strive to be efficient, repeatable, and representative. Unknowingly, common methods may be limited in terms of revealing species function and so of less value for comparative studies. The global decline in pollination services has stimulated surveys of flower-visiting invertebrates, using pan traps and net sampling. We explore the relative merits of these two methods in terms of species discovery, quantifying abundance, function, and composition, and responses of species to changing floral resources. Using a spatially-nested design we sampled across a 5000 km(2) area of arid grasslands, including 432 hours of net sampling and 1296 pan trap-days, between June 2010 and July 2011. Net sampling yielded 22% more species and 30% higher abundance than pan traps, and better reflected the spatio-temporal variation of floral resources. Species composition differed significantly between methods; from 436 total species, 25% were sampled by both methods, 50% only by nets, and the remaining 25% only by pans. Apart from being less comprehensive, if pan traps do not sample flower-visitors, the link to pollination is questionable. By contrast, net sampling functionally linked species to pollination through behavioural observations of flower-visitation interaction frequency. Netted specimens are also necessary for evidence of pollen transport. Benefits of net-based sampling outweighed minor differences in overall sampling effort. As pan traps and net sampling methods are not equivalent for sampling invertebrate-flower interactions, we recommend net sampling of invertebrate pollinator assemblages, especially if datasets are intended to document declines in pollination and guide measures to retain this important ecosystem service.

  19. PHOTOACOUSTIC NON-DESTRUCTIVE EVALUATION AND IMAGING OF CARIES IN DENTAL SAMPLES

    International Nuclear Information System (INIS)

    Li, T.; Dewhurst, R. J.

    2010-01-01

    Dental caries is a disease wherein bacterial processes damage hard tooth structure. Traditional dental radiography has its limitations for detecting early-stage caries. In this study, a photoacoustic (PA) imaging system with a near-infrared light source has been applied to post-mortem dental samples to obtain 2-D and 3-D images. Imaging results showed that the PA technique can be used to image caries in human teeth. For non-destructive photoacoustic evaluation and imaging, the induced temperature and pressure rises within biotissues should not cause physical damage to the tissue. For example, temperature rises above 5 deg. C within live human teeth will cause pulpal necrosis. Therefore, several simulations based on the thermoelastic effect have been applied to predict temperature and pressure fields within samples. Predicted temperature levels are below corresponding safety limits, but care is required to avoid nonlinear absorption phenomena. Furthermore, PA imaging results from the phantom provide evidence for high sensitivity, which shows the imaging potential of the PA technique for detecting early-stage disease.

  20. Evaluation of accidental coincidences for time-differential Moessbauer-spectroscopy

    International Nuclear Information System (INIS)

    Alflen, M.; Meyer, W.

    1995-01-01

    The accidental coincidences of a measuring system based on time-to-amplitude conversion are considered in some detail for the case of low starting and high stopping rates. Two types of accidental coincidences are distinguished: those carrying time information and those without time information. Neglecting any deadtime effects of the detectors, analytical expressions for the calculation of the time distribution of the random coincidences are evaluated. The analytical expressions have been confirmed by Monte Carlo simulations. The procedure is applied to time-differential Moessbauer spectroscopy in order to extract the time spectra of true coincidences. The measured spectrum in a time channel turns out to be a superposition of the true spectrum (true coincidences), a time-integral spectrum (random coincidences), and a weighted superposition of true spectra of other time channels (random coincidences that carry time information). A measurement with a single-line 57Co/Rh source and a single-line K[Fe(CN)6]·3H2O absorber with stopping rates of 1 MBq shows agreement between the theoretical time-filtered spectra and the corrected measured spectra of true coincidences. ((orig.))
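
    The analytical treatment can be cross-checked by a small Monte Carlo model of the time-to-amplitude converter: for uncorrelated Poisson stop events, the start-to-stop waiting time is exponential in the stop rate, which fixes the time distribution of the purely random coincidences. A minimal sketch with assumed rates and TAC range:

```python
import numpy as np

rng = np.random.default_rng(0)

# Monte Carlo sketch of accidental coincidences in a time-to-amplitude converter (TAC)
# with a low start rate, a high stop rate and no true correlation between channels.
stop_rate = 1.0e6        # stop-channel rate (Hz), hypothetical
tac_range = 2.0e-6       # TAC full range (s), hypothetical
n_starts = 200_000       # number of accepted start events

# For uncorrelated Poisson stop events, the waiting time from a start to the next stop
# is exponentially distributed with the stop rate.
wait = rng.exponential(1.0 / stop_rate, size=n_starts)
converted = wait[wait < tac_range]        # only events inside the TAC window are converted

hist, edges = np.histogram(converted, bins=40, range=(0.0, tac_range))
# The random-coincidence time distribution decays as exp(-stop_rate * t); at 1 MHz over a
# 2 us window the decay is clearly visible, so a flat-background assumption is inadequate.
for count, lo in zip(hist[:5], edges[:5]):
    print(f"t = {lo * 1e9:6.1f} ns: {count} random coincidences")
```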