Sample records for assurance sampling method

  1. Investigation of cholesterol bias due to a matrix effect of external quality assurance samples: how true is your cholesterol method? (United States)

    Pretorius, Carel J; Klingberg, Sandra; Johnson, Leslie; Park, Rodney; Wilgen, Urs; Ungerer, Jacobus P J


    Comparability of cholesterol measurement is clinically required and external quality assurance (EQA) programmes are important to verify the trueness of routine methods. We developed a gas chromatography-isotope dilution mass spectrometry (GC-IDMS) total cholesterol assay to investigate the cause of a suspected matrix-related negative bias with the Beckman Coulter enzymatic method discovered in an EQA programme. The GC-IDMS method was calibrated with certified reference material and verified against a secondary reference method. Bias between the GC-IDMS and Beckman Coulter methods was estimated according to Clinical and Laboratory Standards Institute (CLSI) protocol EP9-A2 with 40 clinical samples. At clinically important decision levels, no significant bias was demonstrated on patients' samples (all results within a ±3% limit). A matrix effect confined to the EQA material that affected the Beckman Coulter total cholesterol method was confirmed. The GC-IDMS method is suitable as a higher order total cholesterol method in a routine clinical laboratory. Matrix effects defeat the objectives of EQA schemes by preventing the verification of trueness. Given the importance of obtaining a true cholesterol result without systematic error, we recommend that EQA material without matrix effects should be used.

  2. Prevalence study of yaws in the Democratic Republic of Congo using the lot quality assurance sampling method.

    Directory of Open Access Journals (Sweden)

    Sibylle Gerstl

    Full Text Available BACKGROUND: Until the 1970s the prevalence of non-venereal treponematoses, including yaws, was greatly reduced after worldwide mass treatment. In 2005, cases were again reported in the Democratic Republic of the Congo. We carried out a survey to estimate the village-level prevalence of yaws in the Equateur region in the north of the country in order to define appropriate strategies to effectively treat the affected population. METHODOLOGY/PRINCIPAL FINDINGS: We designed a community-based survey using the Lot Quality Assurance Sampling method to classify the prevalence of active yaws in 14 groups of villages (lots). The classification into high, moderate, or low yaws prevalence corresponded to World Health Organization prevalence thresholds for identifying appropriate operational treatment strategies. Active yaws cases were defined by suggestive clinical signs and positive rapid plasma reagin and Treponema pallidum hemagglutination serological tests. The overall prevalence in the study area was 4.7% (95% confidence interval: 3.4-6.0%). Two of 14 lots had high prevalence (>10%), three moderate prevalence (5-10%), and nine low prevalence (<5%). CONCLUSIONS/SIGNIFICANCE: Although yaws is no longer a World Health Organization priority disease, the presence of yaws in a region where it was supposed to be eradicated demonstrates the importance of continued surveillance and control efforts. Yaws should remain a public health priority in countries where it was previously known to be endemic. The integration of sensitive surveillance systems together with free access to effective treatment is recommended. As a consequence of our study results, more than 16,000 people received free treatment against yaws.

  3. Comparative validation study to demonstrate the equivalence of a minor modification to AOAC Official Method 2005.04 Assurance GDS E. coli O157:H7 method to the reference culture method: 375 gram sample size. (United States)

    Feldsine, Philip T; Montgomery-Fullerton, Megan; Roa, Nerie; Kaur, Mandeep; Lienau, Andrew H; Jucker, Markus; Kerr, David E


    The Assurance GDS Escherichia coli (E. coli) O157:H7, AOAC Official Method 2005.04, has been modified to include a larger sample size of 375 g. A methods comparison study was conducted to demonstrate the equivalence of this modification to the reference culture method. Ninety samples and controls, representing three foods, were analyzed. Results show no statistically detectable difference between the Assurance GDS E. coli O157:H7 assay and the reference culture methods for the detection of E. coli O157:H7, other than the low level of inoculation for leaf lettuce, for which the GDS gave noticeably higher recovery [difference in probability of detection between candidate methods (dPODc = +0.45)]. There were also suggestions of moderate differences (dPODc = +0.15 to +0.20) for ground beef and the high level of leaf lettuce, but the study size was too small to detect differences of this size. Results showed that the Assurance GDS E. coli O157:H7 method is equivalent to reference culture methods for the detection of E. coli O157:H7.

  4. Comparative validation study to demonstrate the equivalence of a minor modification to AOAC Official Method 2005.05 Assurance GDS Shiga Toxin Genes (O157) method to the reference culture method: 375 gram sample size. (United States)

    Feldsine, Philip T; Montgomery-Fullerton, Megan; Roa, Nerie; Kaur, Mandeep; Kerr, David E; Lienau, Andrew H; Jucker, Markus


    The Assurance GDS Shiga Toxin Genes (O157), AOAC Official Method 2005.05, has been modified to include a larger sample size of 375 g. A methods comparison study was conducted to demonstrate the equivalence of this modification to the reference culture method. Ninety samples and controls, representing three foods, were analyzed. Results show no statistically detectable difference between the Assurance GDS Shiga Toxin Genes (O157) assay and the reference culture methods for the detection of E. coli O157:H7, other than the low level of inoculation for leaf lettuce, for which the GDS gave noticeably higher recovery [difference in probability of detection between candidate methods (dPODc = +0.45)]. There were also suggestions of moderate differences (dPODc = +0.15 to +0.20) for ground beef and the high level of leaf lettuce, but the study size was too small to detect differences of this size. Results showed that the Assurance GDS Shiga Toxin Genes (O157) method is equivalent to the reference culture methods for the detection of Shiga toxigenic E. coli O157:H7.

  5. Estimation of measles vaccination coverage using the Lot Quality Assurance Sampling (LQAS) method--Tamilnadu, India, 2002-2003. (United States)

    Sivasankaran, Saravanan; Manickam, P; Ramakrishnan, R; Hutin, Y; Gupte, M D


    As part of the global strategic plan to reduce the number of measles deaths in India, the state of Tamilnadu aims at ≥95% measles vaccination coverage. A study was conducted to measure overall coverage levels for the Poondi Primary Health Center (PPHC), a rural health-care facility in Tiruvallur District, and to determine whether any of the PPHC's six health subcenters had vaccination coverage levels <95%. Using the LQAS method, the vaccination status of 73 children aged 12-23 months had to be assessed in each health subcenter coverage area, with a 5% level of significance and a decision value of two. If more than two children were unvaccinated, the null hypothesis (i.e., that coverage in the health subcenter was low [<95%]) was accepted; otherwise, coverage was judged acceptable (≥95%). All data were pooled in a stratified sample to estimate overall total coverage in the PPHC area. For two (33.3%) of the six health subcenters, more than two children were unvaccinated (i.e., coverage was <95%). LQAS techniques proved useful in identifying small health areas with lower vaccination coverage, which helps to target interventions. Monthly review of vaccination coverage by subcenter and village is recommended to identify pockets of unvaccinated children and to maintain uniform high coverage in the PPHC area.
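The decision rule described in this record (sample n = 73 children, decision value d = 2) can be checked with a short binomial calculation. The sketch below is a generic illustration of the LQAS acceptance probability, not the study's own software: it computes the chance that a subcenter is classified as adequately covered (at most two unvaccinated children found) for several assumed true coverage levels.

```python
from math import comb

def accept_prob(n, d, coverage):
    """P(at most d unvaccinated children in a sample of n),
    i.e. the probability the lot is classified as adequately covered."""
    p_unvax = 1.0 - coverage
    return sum(comb(n, k) * p_unvax**k * (1 - p_unvax)**(n - k)
               for k in range(d + 1))

n, d = 73, 2  # sample size and decision value from the abstract
for coverage in (0.80, 0.90, 0.95, 0.99):
    print(f"true coverage {coverage:.0%}: P(accept) = {accept_prob(n, d, coverage):.3f}")
```

Subcenters with genuinely low coverage are almost never accepted, which is the error the 5% significance level controls.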

  6. Sampling for assurance of future reliability (United States)

    Klauenberg, Katy; Elster, Clemens


    Ensuring measurement trueness, compliance with regulations and conformity with standards are key tasks in metrology which are often considered at the time of an inspection. Current practice does not always verify quality after or between inspections, calibrations, laboratory comparisons, conformity assessments, etc. Statistical models describing behavior over time may ensure reliability, i.e. they may give the probability of functioning, compliance or survival until some future point in time. It may not always be possible or economical to inspect a whole population of measuring devices or other units. Selecting a subset of the population according to statistical sampling plans and inspecting only these allows conclusions about the quality of the whole population with a certain confidence. Combining these issues of sampling and aging raises questions such as: how many devices need to be inspected, and at least how many of them must conform, so that one can be sure that more than 100p% of the population will comply until the next inspection? This research aims to raise awareness of, and to offer a simple answer to, such time- and sample-based quality statements in metrology and beyond. Reliability demonstration methods, such as the prevailing Weibull binomial model, quantify the confidence in future reliability on the basis of a sample. We adapt the binomial model to be applicable to sampling without replacement and simplify the Weibull model so that sampling plans may be determined on the basis of existing ISO standards. Provided the model is suitable, no additional information and no software are needed; and yet, the consumer is protected against future failure. We establish new sampling plans for utility meter surveillance, as required by a recent modification of German law. These sampling plans are given in tables similar to the previous ones, which demonstrates their suitability for everyday use.
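The "how many devices must be inspected, and how many must conform" question has a closed form in the simplest zero-failure binomial case. The sketch below illustrates the standard success-run formula, not the paper's adapted Weibull model: it finds the smallest sample size n such that, if all n inspected devices conform, one is at least C confident that the population reliability exceeds R.

```python
from math import ceil, log

def zero_failure_sample_size(reliability, confidence):
    """Smallest n with reliability**n <= 1 - confidence: if all n
    sampled units conform, we are `confidence` sure that the true
    proportion of conforming units exceeds `reliability`."""
    return ceil(log(1.0 - confidence) / log(reliability))

# e.g. demonstrate >90% reliability with 95% confidence
n = zero_failure_sample_size(0.90, 0.95)
print(n)  # 29 units, all of which must conform
```

Allowing a small number of nonconforming units in the sample enlarges n; that generalization is what binomial reliability demonstration tables (and the sampling plans discussed in the record) tabulate.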

  7. Chesapeake Bay coordinated split sample program annual report, 1990-1991: Analytical methods and quality assurance workgroup of the Chesapeake Bay program monitoring subcommittee

    Energy Technology Data Exchange (ETDEWEB)


    The Chesapeake Bay Program (CBP) is a federal-state partnership with a goal of restoring the Chesapeake Bay. Its ambient water quality monitoring programs, started in 1984, sample over 150 monitoring stations once or twice a month. Due to the size of the Bay watershed (64,000 square miles) and the cooperative nature of the CBP, these monitoring programs involve 10 different analytical laboratories. The Chesapeake Bay Coordinated Split Sample Program (CSSP), initiated in 1988, assesses the comparability of the water quality results from these laboratories. The report summarizes CSSP results for 1990 and 1991, its second and third full years of operation. The CSSP has two main objectives: identifying parameters with low inter-organization agreement, and estimating measurement system variability. The identification of parameters with low agreement is used as part of the overall quality assurance program. Laboratory and program personnel use the information to investigate possible causes of the differences and take action to increase agreement if possible. Later CSSP results will document any improvements in inter-organization agreement. The variability estimates are most useful to data analysts and modelers who need confidence estimates for monitoring data.

  8. Authentication Assurance Level Application to the Inventory Sampling Measurement System

    Energy Technology Data Exchange (ETDEWEB)

    Devaney, Mike M.; Kouzes, Richard T.; Hansen, Randy R.; Geelhood, Bruce D.


    This document concentrates on the identification of a standardized assessment approach for the verification of security functionality in specific equipment, the Inventory Sampling Measurement System (ISMS) being developed for MAYAK. Specifically, an Authentication Assurance Level 3 is proposed to be reached in authenticating the ISMS.

  9. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys. (United States)

    Hund, Lauren; Bedrick, Edward J; Pagano, Marcello


    Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently, several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we compare these latter cluster LQAS methodologies and provide recommendations for choosing a cluster LQAS design. We compare technical differences in the three methods and determine situations in which the choice of method results in a substantively different design. We consider two different aspects of the methods: the distributional assumptions and the clustering parameterization. Further, we provide software tools for implementing each method and clarify misconceptions about these designs in the literature. We illustrate the differences in these methods using vaccination and nutrition cluster LQAS surveys as example designs. The cluster methods are not sensitive to the distributional assumptions but can result in substantially different designs (sample sizes) depending on the clustering parameterization. However, none of the clustering parameterizations used in the existing methods appears to be consistent with the observed data, and, consequently, choice between the cluster LQAS methods is not straightforward. Further research should attempt to characterize clustering patterns in specific applications and provide suggestions for best-practice cluster LQAS designs on a setting-specific basis.

  10. Illinois' Forests, 2005: Statistics, Methods, and Quality Assurance (United States)

    Susan J. Crocker; Charles J. Barnett; Mark A. Hatfield


    The first full annual inventory of Illinois' forests was completed in 2005. This report contains 1) descriptive information on methods, statistics, and quality assurance of data collection, 2) a glossary of terms, 3) tables that summarize quality assurance, and 4) a core set of tabular estimates for a variety of forest resources. A detailed analysis of inventory...

  11. Extending cluster lot quality assurance sampling designs for surveillance programs. (United States)

    Hund, Lauren; Pagano, Marcello


    Lot quality assurance sampling (LQAS) has a long history of applications in industrial quality control. LQAS is frequently used for rapid surveillance in global health settings, with areas classified as poor or acceptable performance on the basis of the binary classification of an indicator. Historically, LQAS surveys have relied on simple random samples from the population; however, implementing two-stage cluster designs for surveillance sampling is often more cost-effective than simple random sampling. By applying survey sampling results to the binary classification procedure, we develop a simple and flexible nonparametric procedure to incorporate clustering effects into the LQAS sample design to appropriately inflate the sample size, accommodating finite numbers of clusters in the population when relevant. We then use this framework to discuss principled selection of survey design parameters in longitudinal surveillance programs. We apply this framework to design surveys to detect rises in malnutrition prevalence in nutrition surveillance programs in Kenya and South Sudan, accounting for clustering within villages. By combining historical information with data from previous surveys, we design surveys to detect spikes in the childhood malnutrition rate. Copyright © 2014 John Wiley & Sons, Ltd.
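The "inflate the sample size" step can be illustrated with the usual design-effect arithmetic from survey sampling. The sketch below is generic, not the authors' exact nonparametric procedure: it scales a simple-random-sample LQAS size by DEFF = 1 + (m - 1) * ICC for clusters of size m and an assumed intracluster correlation ICC.

```python
from math import ceil

def clustered_sample_size(n_srs, cluster_size, icc):
    """Inflate a simple-random-sample size by the design effect
    DEFF = 1 + (m - 1) * ICC to offset within-cluster correlation.
    Returns (total sample size, number of clusters needed)."""
    deff = 1.0 + (cluster_size - 1) * icc
    n = ceil(n_srs * deff)
    n_clusters = ceil(n / cluster_size)
    return n, n_clusters

# illustrative numbers: a 60-child SRS design, clusters of 10, ICC = 0.05
n, k = clustered_sample_size(n_srs=60, cluster_size=10, icc=0.05)
print(n, k)  # total children to sample and clusters to visit
```

Even a modest ICC noticeably raises the required sample, which is why the clustering parameterization drives the design differences the paper reports.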

  12. Multidrug resistance among new tuberculosis cases: detecting local variation through lot quality-assurance sampling. (United States)

    Hedt, Bethany Lynn; van Leth, Frank; Zignol, Matteo; Cobelens, Frank; van Gemert, Wayne; Nhung, Nguyen Viet; Lyepshina, Svitlana; Egwaga, Saidi; Cohen, Ted


    Current methodology for multidrug-resistant tuberculosis (MDR TB) surveys endorsed by the World Health Organization provides estimates of MDR TB prevalence among new cases at the national level. On the aggregate, local variation in the burden of MDR TB may be masked. This paper investigates the utility of applying lot quality-assurance sampling to identify geographic heterogeneity in the proportion of new cases with multidrug resistance. We simulated the performance of lot quality-assurance sampling by applying these classification-based approaches to data collected in the most recent TB drug-resistance surveys in Ukraine, Vietnam, and Tanzania. We explored three classification systems (two-way static, three-way static, and three-way truncated sequential sampling) at two sets of thresholds: low MDR TB = 2% and high MDR TB = 10%, and low MDR TB = 5% and high MDR TB = 20%. The lot quality-assurance sampling systems identified local variability in the prevalence of multidrug resistance in both high-resistance (Ukraine) and low-resistance settings (Vietnam). In Tanzania, prevalence was uniformly low, and the lot quality-assurance sampling approach did not reveal variability. The three-way classification systems provide additional information, but sample sizes may not be obtainable in some settings. New rapid drug-sensitivity testing methods may allow truncated sequential sampling designs and early stopping within static designs, producing even greater efficiency gains. Lot quality-assurance sampling study designs may offer an efficient approach for collecting critical information on local variability in the burden of multidrug-resistant TB. Before this methodology is adopted, programs must determine appropriate classification thresholds, the most useful classification system, and appropriate weighting if unbiased national estimates are also desired.
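A two-way static classification at thresholds like those above (low = 2%, high = 10%) amounts to choosing a sample size n and decision value d that keep both misclassification errors small. The sketch below is a generic design search under assumed 5% error limits, not the paper's simulation: it finds the smallest (n, d) such that a truly low-prevalence area is rarely called high and a truly high-prevalence area is rarely called low.

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def two_way_design(p_low, p_high, alpha=0.05, beta=0.05, n_max=500):
    """Smallest (n, d) for the rule 'classify as high if more than d
    resistant cases are found among n sampled new cases', requiring
    P(high | p_low) <= alpha and P(low | p_high) <= beta."""
    for n in range(1, n_max + 1):
        for d in range(n + 1):
            err_high = 1 - binom_cdf(d, n, p_low)  # false 'high' call
            err_low = binom_cdf(d, n, p_high)      # false 'low' call
            if err_high <= alpha and err_low <= beta:
                return n, d
    return None

print(two_way_design(0.02, 0.10))  # thresholds from the abstract
```

The same machinery, run with the 5%/20% thresholds, gives a much smaller sample, which is one reason threshold choice matters so much in these surveys.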

  13. Japanese Society for Laboratory Hematology flow cytometric reference method of determining the differential leukocyte count: external quality assurance using fresh blood samples. (United States)

    Kawai, Y; Nagai, Y; Ogawa, E; Kondo, H


    To provide target values for the manufacturers' survey of the Japanese Society for Laboratory Hematology (JSLH), accurate standard data from healthy volunteers were needed for the five-part differential leukocyte count. To obtain such data, JSLH required an antibody panel that achieved high specificity (particularly for mononuclear cells) using simple gating procedures. We developed a flow cytometric method for determining the differential leukocyte count (JSLH-Diff) and validated it by comparison with the flow cytometric differential leukocyte count of the International Council for Standardization in Haematology (ICSH-Diff) and the manual differential count obtained by microscopy (Manual-Diff). First, the reference laboratory performed an imprecision study of JSLH-Diff and ICSH-Diff, as well as performing comparison among JSLH-Diff, Manual-Diff, and ICSH-Diff. Then, two reference laboratories and seven participating laboratories performed imprecision and accuracy studies of JSLH-Diff, Manual-Diff, and ICSH-Diff. Simultaneously, six manufacturers' laboratories provided their own representative values by using automated hematology analyzers. The precision of both the JSLH-Diff and ICSH-Diff methods was adequate. Comparison by the reference laboratory showed that all correlation coefficients, slopes, and intercepts obtained by the JSLH-Diff, ICSH-Diff, and Manual-Diff methods conformed to the criteria. When the imprecision and accuracy of JSLH-Diff were assessed at seven laboratories, the CV% for lymphocytes, neutrophils, monocytes, eosinophils, and basophils was 0.5-0.9%, 0.3-0.7%, 1.7-2.6%, 3.0-7.9%, and 3.8-10.4%, respectively. More than 99% of CD45-positive leukocytes were identified as normal leukocytes by JSLH-Diff. When the JSLH-Diff method was validated by comparison with Manual-Diff and ICSH-Diff, it showed good performance as a reference method. © 2016 John Wiley & Sons Ltd.

  14. Sampling system and method (United States)

    Decker, David L.; Lyles, Brad F.; Purcell, Richard G.; Hershey, Ronald Lee


    The present disclosure provides an apparatus and method for coupling conduit segments together. A first pump obtains a sample and transmits it through a first conduit to a reservoir accessible by a second pump. The second pump further conducts the sample from the reservoir through a second conduit.

  15. Estimation after classification using lot quality assurance sampling: corrections for curtailed sampling with application to evaluating polio vaccination campaigns. (United States)

    Olives, Casey; Valadez, Joseph J; Pagano, Marcello


    To assess the bias incurred when curtailment of Lot Quality Assurance Sampling (LQAS) is ignored, to present unbiased estimators, to consider the impact of cluster sampling by simulation and to apply our method to published polio immunization data from Nigeria. We present estimators of coverage when using two kinds of curtailed LQAS strategies: semicurtailed and curtailed. We study the proposed estimators with independent and clustered data using three field-tested LQAS designs for assessing polio vaccination coverage, with samples of size 60 and decision rules of 9, 21 and 33, and compare them to biased maximum likelihood estimators. Lastly, we present estimates of polio vaccination coverage from previously published data in 20 local government authorities (LGAs) from five Nigerian states. Simulations illustrate substantial bias if one ignores the curtailed sampling design. Proposed estimators show no bias. Clustering does not affect the bias of these estimators. Across simulations, standard errors show signs of inflation as clustering increases. Neither sampling strategy nor LQAS design influences estimates of polio vaccination coverage in 20 Nigerian LGAs. When coverage is low, semicurtailed LQAS strategies considerably reduce the sample size required to make a decision. Curtailed LQAS designs further reduce the sample size when coverage is high. Results presented dispel the misconception that curtailed LQAS data are unsuitable for estimation. These findings augment the utility of LQAS as a tool for monitoring vaccination efforts by demonstrating that unbiased estimation using curtailed designs is not only possible but also that these designs reduce the sample size. © 2014 John Wiley & Sons Ltd.
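The bias the authors correct for is easy to reproduce by simulation. The sketch below is a generic illustration, not the authors' estimator: it runs a semicurtailed rule matching one of the designs above (n = 60, decision rule 9), stopping as soon as the number of unvaccinated subjects exceeds the decision value, and shows that the naive proportion computed from the curtailed sample overstates the true rate.

```python
import random

def curtailed_naive_estimate(p_unvax, n=60, d=9, rng=random):
    """Sample until d+1 unvaccinated subjects are seen (lot fails early)
    or n subjects are examined; return the naive unvaccinated proportion,
    which is biased under this stopping rule."""
    failures = draws = 0
    while draws < n and failures <= d:
        draws += 1
        failures += rng.random() < p_unvax
    return failures / draws

random.seed(1)
p = 0.25  # true proportion unvaccinated
estimates = [curtailed_naive_estimate(p) for _ in range(20000)]
print(sum(estimates) / len(estimates))  # mean estimate exceeds the true 0.25
```

Stopping on the failure boundary makes high observed proportions overrepresented, which is exactly why curtailment-aware estimators are needed.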

  16. Chapter 5: Quality assurance/quality control in stormwater sampling (United States)

    Sampling the quality of stormwater presents unique challenges because stormwater flow is relatively short-lived with drastic variability. Furthermore, storm events often occur with little advance warning, outside conventional work hours, and under adverse weather conditions. Therefore, most stormwat...

  17. Sampling system and method

    Energy Technology Data Exchange (ETDEWEB)

    Decker, David L.; Lyles, Brad F.; Purcell, Richard G.; Hershey, Ronald Lee


    In one embodiment, the present disclosure provides an apparatus and method for supporting a tubing bundle during installation or removal. The apparatus includes a clamp for securing the tubing bundle to an external wireline. In various examples, the clamp is external to the tubing bundle or integral with the tubing bundle. According to one method, a tubing bundle and wireline are deployed together and the tubing bundle periodically secured to the wireline using a clamp. In another embodiment, the present disclosure provides an apparatus and method for coupling conduit segments together. A first pump obtains a sample and transmits it through a first conduit to a reservoir accessible by a second pump. The second pump further conducts the sample from the reservoir through a second conduit. In a specific example, one or more clamps are used to connect the first and/or second conduits to an external wireline.

  18. Procedures for sampling and sample reduction within quality assurance systems for solid biofuels

    Energy Technology Data Exchange (ETDEWEB)



    The objective of this experimental study on sampling was to determine the size and number of samples of biofuels required (taken at two sampling points in each case) and to compare two methods of sampling. The first objective of the sample-reduction exercise was to compare the reliability of various sampling methods, and the second objective was to measure the variations introduced as a result of reducing the sample size to form suitable test portions. The materials studied were sawdust, wood chips, wood pellets and bales of straw, and these were analysed for moisture, ash, particle size and chloride. The sampling procedures are described. The study was conducted in Scandinavia. The results of the study were presented in Leipzig in October 2004. The work was carried out as part of the UK's DTI Technology Programme: New and Renewable Energy.

  19. Radioactive air sampling methods

    CERN Document Server

    Maiello, Mark L


    Although the field of radioactive air sampling has matured and evolved over decades, it has lacked a single resource that assimilates technical and background information on its many facets. Edited by experts and with contributions from top practitioners and researchers, Radioactive Air Sampling Methods provides authoritative guidance on measuring airborne radioactivity from industrial, research, and nuclear power operations, as well as naturally occurring radioactivity in the environment. Designed for industrial hygienists, air quality experts, and health physicists, the book delves into the applied research advancing and transforming practice with improvements to measurement equipment, human dose modeling of inhaled radioactivity, and radiation safety regulations. To present a wide picture of the field, it covers the international and national standards that guide the quality of air sampling measurements and equipment. It discusses emergency response issues, including radioactive fallout and the assets used ...

  20. Bayesian adaptive determination of the sample size required to assure acceptably low adverse event risk. (United States)

    Lawrence Gould, A; Zhang, Xiaohua Douglas


    An emerging concern with new therapeutic agents, especially treatments for type 2 diabetes, a prevalent condition that increases an individual's risk of heart attack or stroke, is the likelihood of adverse events, especially cardiovascular events, that the new agents may cause. These concerns have led to regulatory requirements for demonstrating that a new agent increases the risk of an adverse event relative to a control by no more than, say, 30% or 80% with high (e.g., 97.5%) confidence. We describe a Bayesian adaptive procedure for determining if the sample size for a development program needs to be increased and, if necessary, by how much, to provide the required assurance of limited risk. The decision is based on the predictive likelihood of a sufficiently high posterior probability that the relative risk is no more than a specified bound. Allowance can be made for between-center as well as within-center variability to accommodate large-scale developmental programs, and design alternatives (e.g., many small centers, few large centers) for obtaining additional data if needed can be explored. Binomial or Poisson likelihoods can be used, and center-level covariates can be accommodated. The predictive likelihoods are explored under various conditions to assess the statistical properties of the method. Copyright © 2013 John Wiley & Sons, Ltd.
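The posterior-probability criterion described in this record can be sketched with a simple Monte Carlo computation. The model below is an illustrative assumption, not the paper's hierarchical procedure: independent Jeffreys Beta(0.5, 0.5) priors on each arm's adverse event rate, with draws used to estimate the posterior probability that the relative risk stays under a 1.3 bound.

```python
import random

def prob_rr_below(bound, events_t, n_t, events_c, n_c,
                  draws=100_000, rng=random):
    """Posterior P(relative risk < bound) under independent
    Beta(0.5, 0.5) priors on the treated and control event rates."""
    hits = 0
    for _ in range(draws):
        p_t = rng.betavariate(0.5 + events_t, 0.5 + n_t - events_t)
        p_c = rng.betavariate(0.5 + events_c, 0.5 + n_c - events_c)
        hits += p_t < bound * p_c
    return hits / draws

random.seed(0)
# illustrative trial data: 40/2000 events on treatment, 35/2000 on control
print(prob_rr_below(1.3, 40, 2000, 35, 2000))
```

If this posterior probability falls short of the required level (e.g., 97.5%), the adaptive procedure in the record asks how many additional subjects or events would be needed to reach it.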

  1. 42 CFR 440.260 - Methods and standards to assure quality of services. (United States)


    ... 42 Public Health 4 2010-10-01 2010-10-01 false Methods and standards to assure quality of services. 440.260 Section 440.260 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH... and Limits Applicable to All Services § 440.260 Methods and standards to assure quality of services...

  2. Clustered lot quality assurance sampling: a pragmatic tool for timely assessment of vaccination coverage. (United States)

    Greenland, K; Rondy, M; Chevez, A; Sadozai, N; Gasasira, A; Abanida, E A; Pate, M A; Ronveaux, O; Okayasu, H; Pedalino, B; Pezzoli, L


    To evaluate oral poliovirus vaccine (OPV) coverage of the November 2009 round in five Northern Nigeria states with ongoing wild poliovirus transmission using clustered lot quality assurance sampling (CLQAS). We selected four local government areas (LGAs) in each pre-selected state and sampled six clusters of 10 children in each LGA, defined as the lot area. We used three decision thresholds to classify OPV coverage: 75-90%, 55-70% and 35-50%. A full lot was completed, but we also assessed in retrospect the potential time-saving benefits of stopping sampling when a lot had been classified. We accepted two LGAs with vaccination coverage above 75%. Of the remaining 18 rejected LGAs, 11 also failed to reach 70% coverage, of which four also failed to reach 50%. The average time taken to complete a lot was 10 h. By stopping sampling when a decision was reached, we could have classified lots in 5.3, 7.7 and 7.3 h on average at the 90%, 70% and 50% coverage targets, respectively. Clustered lot quality assurance sampling was feasible and useful to estimate OPV coverage in Northern Nigeria. The multi-threshold approach provided useful information on the variation of OPV vaccination coverage. CLQAS is a very timely tool, allowing corrective actions to be taken directly in insufficiently covered areas. © 2011 Blackwell Publishing Ltd.

  3. Woodbridge research facility remedial investigation/feasibility study. Sampling and analysis plan vol 1: Field sampling plan vol II: Quality assurance project plan. Addendum 1

    Energy Technology Data Exchange (ETDEWEB)

    Wisbeck, D.; Thompson, P.; Williams, T.; Ehlers, M.; Eliass, M.


    U.S. Army Woodbridge Research Facility (WRF) was used in the past as a major military communications center and a research and development laboratory where electromagnetic pulse energy was tested on military and other equipment. WRF is presently an inactive facility pursuant to the 1991 Base Realignment and Closure list. Past investigation activities indicate that polychlorinated biphenyl compounds (PCBs) are the primary chemicals of concern. This task calls for provision of the necessary staff and equipment to provide remedial investigation/feasibility study support for the USAEC BRAC Program investigation at WRF. This Sampling and Analysis Plan, Addendum 1, Field Sampling Plan presents the sample location and rationale for additional samples required to complete the RI/FS; and the Quality Assurance Project Plan presents any additional data quality objectives and proposed laboratory methods for chemical analysis of samples.

  4. Neonatal blood gas sampling methods

    African Journals Online (AJOL)

    Blood gas sampling is part of everyday practice in the care of babies admitted to the neonatal intensive care unit, particularly for those receiving respiratory support. There is little published guidance that systematically evaluates the different methods of neonatal blood gas sampling, where each method has its individual ...

  5. Evaluation of Primary Immunization Coverage of Infants Under Universal Immunization Programme in an Urban Area of Bangalore City Using Cluster Sampling and Lot Quality Assurance Sampling Techniques (United States)

    K, Punith; K, Lalitha; G, Suman; BS, Pradeep; Kumar K, Jayanth


    Research Question: Is the LQAS technique better than the cluster sampling technique in terms of resources to evaluate the immunization coverage in an urban area? Objective: To assess and compare lot quality assurance sampling against cluster sampling in the evaluation of primary immunization coverage. Study Design: Population-based cross-sectional study. Study Setting: Areas under Mathikere Urban Health Center. Study Subjects: Children aged 12 months to 23 months. Sample Size: 220 in cluster sampling, 76 in lot quality assurance sampling. Statistical Analysis: Percentages and proportions, Chi-square test. Results: (1) Using cluster sampling, the percentage of completely immunized, partially immunized and unimmunized children was 84.09%, 14.09% and 1.82%, respectively. With lot quality assurance sampling, it was 92.11%, 6.58% and 1.31%, respectively. (2) Immunization coverage levels as evaluated by the cluster sampling technique were not statistically different from the coverage values obtained by the lot quality assurance sampling technique. Considering the time and resources required, it was found that lot quality assurance sampling is a better technique for evaluating primary immunization coverage in an urban area. PMID:19876474

  6. Data quality assessment in the routine health information system: an application of the Lot Quality Assurance Sampling in Benin. (United States)

    Glèlè Ahanhanzo, Yolaine; Ouendo, Edgard-Marius; Kpozèhouen, Alphonse; Levêque, Alain; Makoutodé, Michel; Dramaix-Wilmet, Michèle


    Health information systems in developing countries are often faulted for the poor quality of the data generated and for the insufficient means implemented to improve system performance. This study examined data quality in the Routine Health Information System in Benin in 2012 and carried out a cross-sectional evaluation of the quality of the data using the Lot Quality Assurance Sampling method. The results confirm the insufficient quality of the data based on three criteria: completeness, reliability and accuracy. However, differences can be seen as the shortcomings are less significant for financial data and for immunization data. The method is simple, fast and can be proposed for current use at operational level as a data quality control tool during the production stage. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2014; all rights reserved.


    National Research Council Canada - National Science Library



    .... Our article aims to study audit sampling in audit of financial statements. As an audit technique largely used, in both its statistical and nonstatistical form, the method is very important for auditors...

  8. Evaluation of the Assurance GDS for Salmonella method in foods and environmental surfaces: multilaboratory collaborative study. (United States)

    Feldsine, Philip T; Jucker, Markus T; Kaur, Mandeep; Lienau, Andrew H; Kerr, David E


    A multilaboratory collaborative study was conducted to compare the detection of Salmonella by the Assurance GDS for Salmonella method and the Reference culture methods. Six foods, representing a variety of low and high microbial load foods, were analyzed. Seventeen laboratories in the United States and Canada participated in this study. No statistically significant differences (P > 0.05) were found between the Assurance GDS for Salmonella method and the Reference culture methods for any inoculation level of any food type or naturally contaminated food, except for pasta, for which the Assurance GDS method had a higher number of confirmed test portions for Salmonella compared to the Reference method.

  9. Rapid assessment of antimicrobial resistance prevalence using a Lot Quality Assurance sampling approach. (United States)

    van Leth, Frank; den Heijer, Casper; Beerepoot, Mariëlle; Stobberingh, Ellen; Geerlings, Suzanne; Schultsz, Constance


    Increasing antimicrobial resistance (AMR) requires rapid surveillance tools, such as Lot Quality Assurance Sampling (LQAS). LQAS classifies AMR as high or low based on set parameters. We compared classifications with the underlying true AMR prevalence using data on 1335 Escherichia coli isolates from surveys of community-acquired urinary tract infection in women, by assessing operating curves, sensitivity and specificity. Sensitivity and specificity of any set of LQAS parameters were above 99% and between 79% and 90%, respectively. Operating curves showed high concordance of the LQAS classification with true AMR prevalence estimates. LQAS-based AMR surveillance is a feasible approach that provides timely and locally relevant estimates, as well as the information necessary to formulate and evaluate guidelines for empirical treatment.
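
    The operating-curve logic can be sketched with a binomial model; the sample size, decision threshold, and prevalence levels below are hypothetical illustrations, not the parameters used in the study:

```python
from math import comb

def prob_classify_high(n, d, p):
    """P(X >= d) for X ~ Binomial(n, p): the chance that a lot with true
    resistance prevalence p gets classified as high-resistance."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(d, n + 1))

# Hypothetical LQAS parameters (not taken from the paper): sample 50 isolates,
# call the area high-resistance if 10 or more are resistant.
n, d = 50, 10

# Operating curve: classification probability rises with true prevalence.
curve = [prob_classify_high(n, d, p / 100) for p in range(0, 51, 5)]

sensitivity = prob_classify_high(n, d, 0.30)      # flag a truly high (30%) area
specificity = 1 - prob_classify_high(n, d, 0.10)  # clear a truly low (10%) area
```

    Plotting `curve` against prevalence gives the operating curve the abstract refers to; a steep rise between the low and high thresholds is what makes the binary classification reliable.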


    Directory of Open Access Journals (Sweden)



    Full Text Available Audit sampling involves the application of audit procedures to less than 100% of items within an account balance or class of transactions. Our article aims to study audit sampling in the audit of financial statements. As an audit technique in wide use, in both its statistical and nonstatistical forms, the method is very important for auditors. It should be applied correctly to give a fair view of financial statements and to satisfy the needs of all financial users. To be applied correctly, the method must be understood by all its users and mainly by auditors; otherwise, incorrect application risks loss of reputation and discredit, litigation and even prison. Since there is no unitary practice and methodology for applying the technique, the risk of applying it incorrectly is quite high. SWOT analysis is a technique that shows the advantages, disadvantages, threats and opportunities. We applied SWOT analysis to the sampling method from the perspective of three players: the audit company, the audited entity and the users of financial statements. The study shows that by applying the sampling method the audit company and the audited entity both save time, effort and money. The disadvantages of the method are the difficulty of applying it and of understanding its subtleties. Being widely used as an audit method and being a factor in a correct audit opinion, the sampling method's advantages, disadvantages, threats and opportunities must be understood by auditors.

  11. Quality assurance guidance for field sampling and measurement assessment plates in support of EM environmental sampling and analysis activities

    Energy Technology Data Exchange (ETDEWEB)


    This document is one of several guidance documents developed by the US Department of Energy (DOE) Office of Environmental Restoration and Waste Management (EM). These documents support the EM Analytical Services Program (ASP) and are based on applicable regulatory requirements and DOE Orders. They address requirements in DOE Orders by providing guidance that pertains specifically to environmental restoration and waste management sampling and analysis activities. DOE 5700.6C Quality Assurance (QA) defines policy and requirements to establish QA programs ensuring that risks and environmental impacts are minimized and that safety, reliability, and performance are maximized. This is accomplished through the application of effective management systems commensurate with the risks imposed by the facility and the project. Every organization supporting EM's environmental sampling and analysis activities must develop and document a QA program. Management of each organization is responsible for appropriate QA program implementation, assessment, and improvement. The collection of credible and cost-effective environmental data is critical to the long-term success of remedial and waste management actions performed at DOE facilities. Only well established and management supported assessment programs within each EM-support organization will enable DOE to demonstrate data quality. The purpose of this series of documents is to offer specific guidance for establishing an effective assessment program for EM's environmental sampling and analysis (ESA) activities.

  12. European guidelines for quality assurance in cervical cancer screening: recommendations for collecting samples for conventional and liquid-based cytology.

    NARCIS (Netherlands)

    Arbyn, M.; Herbert, A.; Schenck, U.; Nieminen, P.; Jordan, J.; Mcgoogan, E.; Patnick, J.; Bergeron, C.; Baldauf, J.J.; Klinkhamer, P.; Bulten, J.; Martin-Hirsch, P.


    The current paper presents an annex in the second edition of the European Guidelines for Quality Assurance in Cervical Cancer Screening. It provides guidance on how to make a satisfactory conventional Pap smear or a liquid-based cytology (LBC) sample. Practitioners taking samples for cytology should

  13. Distance sampling methods and applications

    CERN Document Server

    Buckland, S T; Marques, T A; Oedekoven, C S


    In this book, the authors cover the basic methods and advances within distance sampling that are most valuable to practitioners and in ecology more broadly. This is the fourth book dedicated to distance sampling. In the decade since the last book published, there have been a number of new developments. The intervening years have also shown which advances are of most use. This self-contained book covers topics from the previous publications, while also including recent developments in method, software and application. Distance sampling refers to a suite of methods, including line and point transect sampling, in which animal density or abundance is estimated from a sample of distances to detected individuals. The book illustrates these methods through case studies; data sets and computer code are supplied to readers through the book’s accompanying website.  Some of the case studies use the software Distance, while others use R code. The book is in three parts.  The first part addresses basic methods, the ...

  14. Use of Lot Quality Assurance Sampling to Ascertain Levels of Drug Resistant Tuberculosis in Western Kenya.

    Directory of Open Access Journals (Sweden)

    Julia Jezmir

    Full Text Available To classify the prevalence of multi-drug resistant tuberculosis (MDR-TB) in two different geographic settings in western Kenya using the Lot Quality Assurance Sampling (LQAS) methodology. The prevalence of drug resistance was classified among treatment-naïve smear positive TB patients in two settings, one rural and one urban. These regions were classified as having high or low prevalence of MDR-TB according to a static, two-way LQAS sampling plan selected to classify high resistance regions at greater than 5% resistance and low resistance regions at less than 1% resistance. This study classified both the urban and rural settings as having low levels of TB drug resistance. Out of the 105 patients screened in each setting, two patients were diagnosed with MDR-TB in the urban setting and one patient was diagnosed with MDR-TB in the rural setting. An additional 27 patients were diagnosed with a variety of mono- and poly-resistant strains. Further drug resistance surveillance using LQAS may help identify the levels and geographical distribution of drug resistance in Kenya and may have applications in other countries in the African Region facing similar resource constraints.

  15. Use of Lot Quality Assurance Sampling to Ascertain Levels of Drug Resistant Tuberculosis in Western Kenya. (United States)

    Jezmir, Julia; Cohen, Ted; Zignol, Matteo; Nyakan, Edwin; Hedt-Gauthier, Bethany L; Gardner, Adrian; Kamle, Lydia; Injera, Wilfred; Carter, E Jane


    To classify the prevalence of multi-drug resistant tuberculosis (MDR-TB) in two different geographic settings in western Kenya using the Lot Quality Assurance Sampling (LQAS) methodology. The prevalence of drug resistance was classified among treatment-naïve smear positive TB patients in two settings, one rural and one urban. These regions were classified as having high or low prevalence of MDR-TB according to a static, two-way LQAS sampling plan selected to classify high resistance regions at greater than 5% resistance and low resistance regions at less than 1% resistance. This study classified both the urban and rural settings as having low levels of TB drug resistance. Out of the 105 patients screened in each setting, two patients were diagnosed with MDR-TB in the urban setting and one patient was diagnosed with MDR-TB in the rural setting. An additional 27 patients were diagnosed with a variety of mono- and poly-resistant strains. Further drug resistance surveillance using LQAS may help identify the levels and geographical distribution of drug resistance in Kenya and may have applications in other countries in the African Region facing similar resource constraints.
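
    A two-way sampling plan of the kind described can be derived from binomial tail probabilities. The sketch below uses the study's sample size of 105 and its 5%/1% thresholds, but the rule for choosing the decision threshold (minimizing the larger of the two error risks) is an illustrative assumption, not the authors' procedure:

```python
from math import comb

def binom_cdf(n, k, p):
    """P(X <= k) for X ~ Binomial(n, p), via the exact stdlib sum."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

n = 105                      # isolates screened per setting, as in the study
p_high, p_low = 0.05, 0.01   # the study's high/low resistance thresholds

# Classify an area as "high resistance" when >= d MDR-TB cases are found.
# Pick the d that balances the two misclassification risks (illustrative rule).
d = min(range(1, 11),
        key=lambda d: max(binom_cdf(n, d - 1, p_high),       # miss a high area
                          1 - binom_cdf(n, d - 1, p_low)))   # flag a low area
alpha = binom_cdf(n, d - 1, p_high)      # P(classified low | prevalence 5%)
beta = 1 - binom_cdf(n, d - 1, p_low)    # P(classified high | prevalence 1%)
```

    With n = 105 this search lands on a threshold of 3 cases, holding both error risks under roughly 10%, which illustrates why a static plan with a modest sample can separate the >5% and <1% regimes.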

  16. Methods for quality-assurance review of water-quality data in New Jersey (United States)

    Brown, G. Allan; Pustay, Edward A.; Gibs, Jacob


    This report is an instructional and reference manual that describes methods developed and used by the U.S. Geological Survey (USGS), New Jersey District, to assure the accuracy and precision of the results of analyses of surface- and ground-water samples received from analyzing laboratories and, ultimately, to ensure the integrity of water-quality data in USGS databases and published reports. A statistical-analysis computer program, COMP.PPL, is used to determine whether the values reported by the laboratories are internally consistent, whether they are reasonable when compared with values for samples previously collected at the same site, and whether they exceed applicable drinking-water regulations. The program output consists of three files -- QWREVIEW, QWOUTLIERS, and QWCALC. QWREVIEW presents the results of tests of chemical logic and shows values that exceed drinking-water regulations. QWOUTLIERS identifies values that fall outside the historical range of values for the site sampled. QWCALC shows values and calculations used for reference purposes.
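
    The kinds of checks COMP.PPL applies (regulation exceedance and historical-range outlier detection) can be sketched as below; the analyte names, regulatory limits, and historical ranges are invented for illustration, not USGS values:

```python
# Illustrative limits and per-site historical ranges (hypothetical values).
DRINKING_WATER_LIMITS = {"nitrate_mg_L": 10.0, "arsenic_ug_L": 10.0}
HISTORICAL_RANGE = {"nitrate_mg_L": (0.1, 6.5), "arsenic_ug_L": (0.5, 4.0)}

def review_sample(result):
    """Return QA flags for one lab result: regulation exceedances and
    values outside the site's historical range (outliers)."""
    flags = []
    for analyte, value in result.items():
        limit = DRINKING_WATER_LIMITS.get(analyte)
        if limit is not None and value > limit:
            flags.append(f"{analyte}: {value} exceeds regulation ({limit})")
        lo, hi = HISTORICAL_RANGE.get(analyte, (float("-inf"), float("inf")))
        if not lo <= value <= hi:
            flags.append(f"{analyte}: {value} outside historical range [{lo}, {hi}]")
    return flags

flags = review_sample({"nitrate_mg_L": 12.3, "arsenic_ug_L": 2.1})
```

    In the QWREVIEW/QWOUTLIERS split described above, the first kind of flag would go to the exceedance report and the second to the outlier report.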

  17. Improved method for the quality assurance of [C-11]choline. (United States)

    Mishani, E; Ben-David, I; Rozen, Y


    [C-11]choline is a very promising radiomarker for the diagnosis of various human tumors using Positron Emission Tomography (PET). The existing quality control process for [C-11]choline is complicated and combines two HPLC methods with limited separation and sensitivity which prevent the accurate determination of the specific activity. We have developed a new efficient single HPLC method for the detection of choline chloride and dimethylaminoethanol with high resolution and sensitivity using cation-exchange chromatography.

  18. Cluster lot quality assurance sampling: effect of increasing the number of clusters on classification precision and operational feasibility. (United States)

    Okayasu, Hiromasa; Brown, Alexandra E; Nzioki, Michael M; Gasasira, Alex N; Takane, Marina; Mkanda, Pascal; Wassilak, Steven G F; Sutter, Roland W


    To assess the quality of supplementary immunization activities (SIAs), the Global Polio Eradication Initiative (GPEI) has used cluster lot quality assurance sampling (C-LQAS) methods since 2009. However, since the inception of C-LQAS, questions have been raised about the optimal balance between operational feasibility and precision of classification of lots to identify areas with low SIA quality that require corrective programmatic action. To determine if an increased precision in classification would result in differential programmatic decision making, we conducted a pilot evaluation in 4 local government areas (LGAs) in Nigeria with an expanded LQAS sample size of 16 clusters (instead of the standard 6 clusters) of 10 subjects each. The results showed greater heterogeneity between clusters than the assumed standard deviation of 10%, ranging from 12% to 23%. Comparing the distribution of 4-outcome classifications obtained from all possible combinations of 6-cluster subsamples to the observed classification of the 16-cluster sample, we obtained an exact match in classification in 56% to 85% of instances. We concluded that the 6-cluster C-LQAS provides acceptable classification precision for programmatic action. Considering the greater resources required to implement an expanded C-LQAS, the improvement in precision was deemed insufficient to warrant the effort. Published by Oxford University Press on behalf of the Infectious Diseases Society of America 2014. This work is written by (a) US Government employee(s) and is in the public domain in the US.
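
    The subsample comparison described above can be reproduced in miniature: enumerate every 6-cluster subset of a 16-cluster lot and check how often its classification matches the full-sample decision. The cluster counts below are invented, and a simple binary pass/fail decision stands in for the study's 4-outcome classification:

```python
from itertools import combinations
from math import comb

# Illustrative coverage counts for one LGA: 16 clusters, 10 children each;
# the numbers are invented for this sketch, not the Nigeria pilot data.
cluster_hits = [9, 8, 10, 7, 9, 6, 8, 9, 10, 7, 8, 9, 5, 8, 9, 10]

def classify(hits, per_cluster=10, pass_threshold=0.80):
    """Simplified binary pass/fail stand-in for the 4-outcome classification."""
    return sum(hits) / (len(hits) * per_cluster) >= pass_threshold

full = classify(cluster_hits)  # decision using all 16 clusters
matches = sum(classify(sub) == full for sub in combinations(cluster_hits, 6))
match_rate = matches / comb(16, 6)  # share of the 8008 subsamples that agree
```

    The observed 56% to 85% exact-match rates in the pilot correspond to this `match_rate` quantity computed per LGA with the real 4-outcome rule.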

  19. Solving Differential Equations by Parallel Laplace Method with Assured Accuracy


    Malaschonok, Natasha


    The paper has been presented at the 12th International Conference on Applications of Computer Algebra, Varna, Bulgaria, June 2006. We produce a parallel algorithm realizing the Laplace transform method for the symbolic solution of differential equations. In this paper we consider systems of ordinary linear differential equations with constant coefficients, nonzero initial conditions, and right-hand sides reduced to sums of exponents with polynomial coefficients.

  20. Method for Generating Pseudorandom Sequences with the Assured Period Based on R-blocks

    Directory of Open Access Journals (Sweden)

    M. A. Ivanov


    Full Text Available The article describes the characteristics of a new class of fast pseudorandom number generators based on the use of stochastic adders, or R-blocks. A new method for generating pseudorandom sequences with an assured period length is proposed.
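
    The R-block construction itself is specific to the article, but the general notion of a generator with an assured period can be illustrated with a linear congruential generator whose parameters satisfy the Hull-Dobell theorem, which guarantees a full period of m for every seed:

```python
def lcg(seed, a=5, c=3, m=64):
    """LCG with modulus m = 2**6. Hull-Dobell theorem: for m a power of two,
    the period equals m (for every seed) iff c is odd and a % 4 == 1.
    Here a=5, c=3 satisfy both conditions, so the period is assured to be 64."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

gen = lcg(seed=0)
first = next(gen)
period = 1
for value in gen:       # count steps until the sequence revisits its start
    if value == first:
        break
    period += 1
```

    The assurance here is structural: the period follows from the parameter conditions, not from empirically testing the output, which is the same spirit as the R-block method's guaranteed-period claim.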

  1. Statistical considerations for plot design, sampling procedures, analysis, and quality assurance of ozone injury studies (United States)

    Michael Arbaugh; Larry Bednar


    The sampling methods used to monitor ozone injury to ponderosa and Jeffrey pines depend on the objectives of the study, geographic and genetic composition of the forest, and the source and composition of air pollutant emissions. By using a standardized sampling methodology, it may be possible to compare conditions within local areas more accurately, and to apply the...

  2. Towards Cost-efficient Sampling Methods

    CERN Document Server

    Peng, Luo; Chong, Wu


    Sampling methods have received much attention in the field of complex networks in general and statistical physics in particular. This paper presents two new sampling methods based on the observation that a small set of high-degree vertices can capture most of the structural information of a network. Both proposed methods are efficient at sampling high-degree nodes. The first improves on stratified random sampling, selecting high-degree nodes with higher probability by classifying nodes according to their degree distribution. The second improves on the existing snowball sampling method so that it can selectively sample the targeted nodes at every sampling step. In addition, the two proposed sampling methods collect not only the nodes but also the edges directly connected to those nodes. To demonstrate the two methods' validity and accuracy, we compare them with the existing sampling methods in...
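
    The second idea (a snowball walk that targets high-degree nodes at each step) can be sketched as follows; the toy graph and the highest-degree-first selection rule are illustrative assumptions, not the authors' exact algorithm:

```python
import random

# Toy undirected graph as an adjacency dict: a hub (node 0) plus a sparse ring.
graph = {0: [1, 2, 3, 4, 5, 6, 7, 8]}
for i in range(1, 9):
    graph.setdefault(i, []).append(0)
for i in range(1, 9):          # ring edges 1-2, 2-3, ..., 8-1
    j = i % 8 + 1
    graph[i].append(j)
    graph[j].append(i)

def degree_biased_snowball(graph, start, size, rng):
    """Snowball sampling that, at each step, follows the highest-degree
    unvisited neighbour (ties broken at random)."""
    sample, frontier = [start], set(graph[start])
    while len(sample) < size and frontier:
        nxt = max(frontier, key=lambda n: (len(graph[n]), rng.random()))
        sample.append(nxt)
        frontier |= set(graph[nxt])
        frontier -= set(sample)
    return sample

rng = random.Random(42)
sample = degree_biased_snowball(graph, start=3, size=4, rng=rng)
```

    Starting from a peripheral node, the walk reaches the hub on its first step, which is the bias toward high-degree nodes that the abstract describes; note the sketch also retains the traversed adjacency (via `graph`), mirroring the papers' point that edges are collected along with nodes.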

  3. Improving data quality and supervision of antiretroviral therapy sites in Malawi: an application of Lot Quality Assurance Sampling

    Directory of Open Access Journals (Sweden)

    Hedt-Gauthier Bethany L


    Full Text Available Abstract Background: High quality program data is critical for managing, monitoring, and evaluating national HIV treatment programs. By 2009, the Malawi Ministry of Health had initiated more than 270,000 patients on HIV treatment at 377 sites. Quarterly supervision of these antiretroviral therapy (ART) sites ensures high quality care, but the time currently dedicated to exhaustive record review and data cleaning detracts from other critical components. The exhaustive record review is unlikely to be sustainable long term because of the resources required and the increasing number of patients on ART. This study quantifies the current levels of data quality and evaluates Lot Quality Assurance Sampling (LQAS) as a tool to prioritize sites with low data quality, thus lowering costs while maintaining sufficient quality for program monitoring and patient care. Methods: In January 2010, a study team joined supervision teams at 19 sites purposely selected to reflect the variety of ART sites. During the exhaustive data review, the time allocated to data cleaning and data discrepancies were documented. The team then randomly sampled 76 records from each site, recording secondary outcomes and the time required for sampling. Results: At the 19 sites, only 1.2% of records had discrepancies in patient outcomes and 0.4% in treatment regimen. However, data cleaning took 28.5 hours in total, suggesting that data cleaning for all 377 ART sites would require over 350 supervision-hours quarterly. The LQAS tool accurately identified the sites with low data quality, reduced the time for data cleaning by 70%, and allowed for reporting on secondary outcomes. Conclusions: Most sites maintained high quality records. In spite of this, data cleaning required significant amounts of time with little effect on program estimates of patient outcomes. LQAS conserves resources while maintaining sufficient data quality for program assessment and management to allow for quality patient

  4. Sample processing device and method

    DEFF Research Database (Denmark)


    A sample processing device is disclosed, which sample processing device comprises a first substrate and a second substrate, where the first substrate has a first surface comprising two area types, a first area type with a first contact angle with water and a second area type with a second contact...... a sample liquid comprising the sample and the first preparation system is adapted to receive a receiving liquid. In a particular embodiment, a magnetic sample transport component, such as a permanent magnet or an electromagnet, is arranged to move magnetic beads in between the first and second substrates....

  5. Paediatric rehabilitation treatment standards: a method for quality assurance in Germany

    Directory of Open Access Journals (Sweden)

    Jutta Ahnert


    Full Text Available Over the last few years, the German Pension Insurance has implemented a new method of quality assurance for inpatient rehabilitation of children and adolescents diagnosed with bronchial asthma, obesity, or atopic dermatitis: the so-called rehabilitation treatment standards (RTS. They aim at promoting a comprehensive and evidence-based care in rehabilitation. Furthermore, they are intended to make the therapeutic processes in medical rehabilitation as well as potential deficits more transparent. The development of RTS was composed of five phases during which current scientific evidence, expert knowledge, and patient expectations were included. Their core element is the specification of evidence-based treatment modules that describe a good rehabilitation standard for children diagnosed with bronchial asthma, obesity, or atopic dermatitis. Opportunities and limitations of the RTS as a tool for quality assurance are discussed.

  6. Quality assurance

    Energy Technology Data Exchange (ETDEWEB)

    Gillespie, B.M.; Gleckler, B.P.


    This section of the 1994 Hanford Site Environmental Report summarizes the quality assurance and quality control practices of Hanford Site environmental monitoring and surveillance programs. Samples are analyzed according to documented standard analytical procedures. This section discusses specific measures taken to ensure quality in project management, sample collection, and analytical results.

  7. Paediatric rehabilitation treatment standards: a method for quality assurance in Germany. (United States)

    Ahnert, Jutta; Löffler, Stefan; Müller, Jochen; Lukasczik, Matthias; Brüggemann, Silke; Vogel, Heiner


    Over the last few years, the German Pension Insurance has implemented a new method of quality assurance for inpatient rehabilitation of children and adolescents diagnosed with bronchial asthma, obesity, or atopic dermatitis: the so-called rehabilitation treatment standards (RTS). They aim at promoting a comprehensive and evidence-based care in rehabilitation. Furthermore, they are intended to make the therapeutic processes in medical rehabilitation as well as potential deficits more transparent. The development of RTS was composed of five phases during which current scientific evidence, expert knowledge, and patient expectations were included. Their core element is the specification of evidence-based treatment modules that describe a good rehabilitation standard for children diagnosed with bronchial asthma, obesity, or atopic dermatitis. Opportunities and limitations of the RTS as a tool for quality assurance are discussed. Significance for public health: The German Pension Insurance's rehabilitation treatment standards (RTS) for inpatient rehabilitation of children and adolescents aim at contributing to a comprehensive and evidence-based care in paediatric rehabilitation. As a core element, they comprise evidence-based treatment modules that describe a good rehabilitation standard for children diagnosed with bronchial asthma, obesity, or atopic dermatitis. Although the RTS have been developed for the specific context of the German health care system, they may be referred to as a more general starting point regarding the development of health care and quality assurance standards in child/adolescent medical rehabilitative care.

  8. A quantitative method for cost reimbursement and length of stay quality assurance in multiple trauma patients. (United States)

    Siegel, J H; Shafi, S; Goodarzi, S; Dischinger, P C


    To develop a statistically valid method for trauma reimbursement and quality assurance (QA) length-of-stay filters. This is needed because diagnosis related group (DRG)-based trauma payment systems assume a random sampling of injury severities from a normally distributed population and thus result in economic disincentives to level I trauma centers. A total of 142 trauma patients with MVC blunt multisystem injuries (MSI) (ISS ≥ 16) were studied concurrently during their hospital course. Level I regional trauma center. Outcome measures were (dependent variables) length of stay (LOS) and state-approved hospital charges (COST). Mean acute care COST was $74,310, but the distribution of COST was log normal, rather than Gaussian normal as assumed by DRGs. The LOS for MSI was more than twice the average for all trauma (22 vs. 9 days), reflecting the skewed severities of level I patients, and was related to COST (r2 = 0.802; p < 0.0001). The ISS alone was a weak determinant of COST or LOS (r2 = 0.05; p < 0.0001). The best single determinant of COST and LOS was survival (r2 = 0.15; p < 0.0001): as survival increased, so did LOS. The most costly injuries (all p < 0.0001) involved the lower extremity (LE) or hip joint (HIP), whereas sepsis and pulmonary and surgical complications constituted the most costly complications (all p < 0.0001). Regression models that accounted for the log-normal distribution of the dependent variable and based on binary variables for survival, LE and HIP injuries, and the complications of sepsis, ARDS, pulmonary failure, MOFS, plus ISS, explained nearly two thirds of the variability in COST (r2 = 0.621; p < 0.0001) or LOS (r2 = 0.687; p < 0.0001) and the residuals were normally distributed. These models provide a valid method of reimbursement for MSI trauma for level I trauma centers, since the data imply that good care associated with survival from specific complications of MSI are the major determinants of COST, rather than the specific type of injury or
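
    The practical consequence of a log-normal COST distribution can be illustrated with simulated charges (the location and scale parameters below are invented, not the study's): the arithmetic mean that a DRG-style payment targets sits well above the typical (median) case.

```python
import math
import random

rng = random.Random(1)

# Simulate acute-care charges that are log-normally distributed, as the study
# found; the parameters are illustrative only.
mu, sigma = math.log(50_000), 0.8        # ln(COST) ~ Normal(mu, sigma)
charges = [math.exp(rng.gauss(mu, sigma)) for _ in range(10_000)]

arithmetic_mean = sum(charges) / len(charges)
median = sorted(charges)[len(charges) // 2]

# For a log-normal, mean = exp(mu + sigma**2 / 2) exceeds median = exp(mu),
# so a payment model assuming a symmetric (normal) distribution around the
# typical case systematically underestimates the costly right tail.
theoretical_mean = math.exp(mu + sigma**2 / 2)
```

    This gap between mean and median is why the authors fit regressions on the log-transformed charges rather than the raw dollar amounts.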

  9. A method of language sampling

    DEFF Research Database (Denmark)

    Rijkhoff, Jan; Bakker, Dik; Hengeveld, Kees


    created with this method will reflect optimally the diversity of the languages of the world. On the basis of the internal structure of each genetic language tree a measure is computed that reflects the linguistic diversity in the language families represented by these trees. This measure is used...

  10. Methods of Computational Intelligence in the Context of Quality Assurance in Foundry Products

    Directory of Open Access Journals (Sweden)

    Rojek G.


    Full Text Available One way to ensure the required technical characteristics of castings is the strict control of production parameters affecting the quality of the finished products. If the production process is improperly configured, the resulting defects in castings lead to huge losses. Therefore, from the point of view of economics, it is advisable to use the methods of computational intelligence in the field of quality assurance and adjustment of parameters of future production. At the same time, the development of knowledge in the field of metallurgy, aimed to raise the technical level and efficiency of the manufacture of foundry products, should be followed by the development of information systems to support production processes in order to improve their effectiveness and compliance with the increasingly more stringent requirements of ergonomics, occupational safety, environmental protection and quality. This article is a presentation of artificial intelligence methods used in practical applications related to quality assurance. The problem of control of the production process involves the use of tools such as the induction of decision trees, fuzzy logic, rough set theory, artificial neural networks or case-based reasoning.

  11. Lot quality assurance sampling to monitor supplemental immunization activity quality: an essential tool for improving performance in polio endemic countries. (United States)

    Brown, Alexandra E; Okayasu, Hiromasa; Nzioki, Michael M; Wadood, Mufti Z; Chabot-Couture, Guillaume; Quddus, Arshad; Walker, George; Sutter, Roland W


    Monitoring the quality of supplementary immunization activities (SIAs) is a key tool for polio eradication. Regular monitoring data, however, are often unreliable, showing high coverage levels in virtually all areas, including those with ongoing virus circulation. To address this challenge, lot quality assurance sampling (LQAS) was introduced in 2009 as an additional tool to monitor SIA quality. Now used in 8 countries, LQAS provides a number of programmatic benefits: identifying areas of weak coverage quality with statistical reliability, differentiating areas of varying coverage with greater precision, and allowing for trend analysis of campaign quality. LQAS also accommodates changes to survey format, interpretation thresholds, evaluations of sample size, and data collection through mobile phones to improve timeliness of reporting and allow for visualization of campaign quality. LQAS becomes increasingly important to address remaining gaps in SIA quality and help focus resources on high-risk areas to prevent the continued transmission of wild poliovirus. © Crown copyright 2014.

  12. A Validated Trichinella Digestion Assay and an Associated Sampling and Quality Assurance System for Use in Testing Pork and Horse Meat

    National Research Council Canada - National Science Library

    Forbes, Lorry B; Gajadhar, Alvin A


    A revised digestion method, developed for efficiency and quality assurance, was validated for the detection of Trichinella larvae in pork and horse meat to meet requirements for food safety testing...

  13. New prior sampling methods for nested sampling - Development and testing (United States)

    Stokes, Barrie; Tuyl, Frank; Hudson, Irene


    Nested Sampling is a powerful algorithm for fitting models to data in the Bayesian setting, introduced by Skilling [1]. The nested sampling algorithm proceeds by carrying out a series of compressive steps, involving successively nested iso-likelihood boundaries, starting with the full prior distribution of the problem parameters. The "central problem" of nested sampling is to draw at each step a sample from the prior distribution whose likelihood is greater than the current likelihood threshold, i.e., a sample falling inside the current likelihood-restricted region. For both flat and informative priors this ultimately requires uniform sampling restricted to the likelihood-restricted region. We present two new methods of carrying out this sampling step, and illustrate their use with the lighthouse problem [2], a bivariate likelihood used by Gregory [3] and a trivariate Gaussian mixture likelihood. All the algorithm development and testing reported here has been done with Mathematica® [4].
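
    A minimal 1-D version of the algorithm as described (live points, iso-likelihood compression, prior-restricted replacement) might look like this in Python rather than Mathematica; the likelihood and all parameters are illustrative, and simple rejection sampling stands in for the "central problem" of drawing from the likelihood-restricted prior:

```python
import math
import random

rng = random.Random(7)

def loglike(x):
    # Illustrative likelihood: a Gaussian peak inside the unit-interval prior.
    return -0.5 * ((x - 0.5) / 0.1) ** 2

N = 100                                    # number of live points
live = [rng.random() for _ in range(N)]    # draws from the flat prior on [0, 1]
logZ_terms, X_prev = [], 1.0

for i in range(1, 800):                    # compressive steps
    worst = min(live, key=loglike)
    L_min = loglike(worst)
    X_i = math.exp(-i / N)                 # expected prior-volume shrinkage
    logZ_terms.append(L_min + math.log(X_prev - X_i))
    X_prev = X_i
    # Replace the worst point with a prior draw above the likelihood floor
    # (plain rejection sampling: fine in 1-D, too slow in high dimensions).
    while True:
        x = rng.random()
        if loglike(x) > L_min:
            live[live.index(worst)] = x
            break

# Add the final live-point contribution and combine the terms (log-sum-exp).
logZ_terms += [loglike(x) + math.log(X_prev / N) for x in live]
m = max(logZ_terms)
Z = math.exp(m) * sum(math.exp(t - m) for t in logZ_terms)
# The true evidence here is 0.1 * sqrt(2 * pi), roughly 0.25.
```

    The two new prior-sampling methods the abstract refers to replace the rejection-sampling step above, which is exactly the part whose cost explodes as the likelihood-restricted region shrinks.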

  14. Private sector delivery of health services in developing countries: a mixed-methods study on quality assurance in social franchises

    Directory of Open Access Journals (Sweden)

    Schlein Karen


    Full Text Available Abstract Background: Across the developing world, health care services are most often delivered in the private sector, and social franchising has emerged over the past decade as an increasingly popular method of private sector health care delivery. Social franchising aims to strengthen business practices through economies of scale: branding clinics and purchasing drugs in bulk at wholesale prices. While quality is one of the established goals of social franchising, there is no published documentation of how quality levels might be set in the context of franchised private providers, nor what quality assurance measures can or should exist within social franchises. The aim of this study was to better understand the quality assurance systems currently utilized in social franchises, and to determine if there are shared standards for practice or quality outcomes that exist across programs. Methods: The study included three data sources and levels of investigation: (1) self-reported program data; (2) scoping telephone interviews; and (3) in-depth field interviews and clinic visits. Results: Social franchises conceive of quality assurance not as an independent activity, but rather as a goal that is incorporated into all areas of franchise operations, including recruitment, training, monitoring of provider performance, monitoring of client experience and the provision of feedback. Conclusions: These findings are the first evidence to support the 2002 conceptual model of social franchising, which proposed that the assurance of quality was one of the three core goals of all social franchises. However, while quality is important to franchise programs, quality assurance systems overall are not reflective of the evidence to date on quality measurement or quality improvement best practices. Future research in this area is needed to better understand the details of quality assurance systems as applied in social franchise programs, the process by which quality assurance

  15. Examination of fast reactor fuels, FBR analytical quality assurance standards and methods, and analytical methods development: irradiation tests. Progress report, April 1--June 30, 1976, and FY 1976. [UO₂; PuO₂]

    Energy Technology Data Exchange (ETDEWEB)

    Baker, R.D. (comp.)


    Characterization of unirradiated and irradiated LMFBR fuels by analytical chemistry methods will continue, and additional methods will be modified and mechanized for hot cell application. Macro- and microexaminations will be made on fuel and cladding using the shielded electron microprobe, emission spectrograph, radiochemistry, gamma scanner, mass spectrometers, and other analytical facilities. New capabilities will be developed in gamma scanning, analyses to assess spatial distributions of fuel and fission products, mass spectrometric measurements of burnup and fission gas constituents and other chemical analyses. Microstructural analyses of unirradiated and irradiated materials will continue using optical and electron microscopy and autoradiographic and x-ray techniques. Analytical quality assurance standards tasks are designed to assure the quality of the chemical characterizations necessary to evaluate reactor components relative to specifications. Tasks include: (1) the preparation and distribution of calibration materials and quality control samples for use in quality assurance surveillance programs, (2) the development of and the guidance in the use of quality assurance programs for sampling and analysis, (3) the development of improved methods of analysis, and (4) the preparation of continuously updated analytical method manuals. Reliable analytical methods development for the measurement of burnup, oxygen-to-metal (O/M) ratio, and various gases in irradiated fuels is described.

  16. Statistical methods for quality assurance basics, measurement, control, capability, and improvement

    CERN Document Server

    Vardeman, Stephen B


    This undergraduate statistical quality assurance textbook clearly shows with real projects, cases and data sets how statistical quality control tools are used in practice. Among the topics covered is a practical evaluation of measurement effectiveness for both continuous and discrete data. Gauge Reproducibility and Repeatability methodology (including confidence intervals for Repeatability, Reproducibility and the Gauge Capability Ratio) is thoroughly developed. Process capability indices and corresponding confidence intervals are also explained. In addition to process monitoring techniques, experimental design and analysis for process improvement are carefully presented. Factorial and Fractional Factorial arrangements of treatments and Response Surface methods are covered. Integrated throughout the book are rich sets of examples and problems that help readers gain a better understanding of where and how to apply statistical quality control tools. These large and realistic problem sets in combination with the...

  17. Food safety assurance systems: Microbiological testing, sampling plans, and microbiological criteria

    NARCIS (Netherlands)

    Zwietering, M.H.; Ross, T.; Gorris, L.G.M.


    Microbiological criteria give information about the quality or safety of foods. A key component of a microbiological criterion is the sampling plan. Considering: (1) the generally low level of pathogens that are deemed tolerable in foods, (2) large batch sizes, and (3) potentially substantial

  18. Transuranic waste characterization sampling and analysis methods manual. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Suermann, J.F.


    This Methods Manual provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP) Transuranic (TRU) Waste Characterization Program (the Program) and the WIPP Waste Analysis Plan. This Methods Manual includes all of the testing, sampling, and analytical methodologies accepted by DOE for use in implementing the Program requirements specified in the QAPP and the WIPP Waste Analysis Plan. The procedures in this Methods Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site-specific procedures. With some analytical methods, such as Gas Chromatography/Mass Spectrometry, the Methods Manual procedures may be used directly. With other methods, such as nondestructive characterization, the Methods Manual provides guidance rather than a step-by-step procedure. Sites must meet all of the specified quality control requirements of the applicable procedure. Each DOE site must document the details of the procedures it will use and demonstrate the efficacy of such procedures to the Manager, National TRU Program Waste Characterization, during Waste Characterization and Certification audits.

  19. Data mining methods for quality assurance in an environmental monitoring network

    NARCIS (Netherlands)

    Athanasiadis, Ioannis N.; Rizzoli, Andrea Emilio; Beard, Daniel W.


    The paper presents a system architecture that employs data mining techniques for ensuring quality assurance in an environmental monitoring network. We investigate how data mining techniques can be incorporated in the quality assurance decision making process. As prior expert decisions are

  20. On the use of certified reference materials for assuring the quality of results for the determination of mercury in environmental samples. (United States)

    Bulska, Ewa; Krata, Agnieszka; Kałabun, Mateusz; Wojciechowski, Marcin


    This work focused on the development and validation of methodologies for the accurate determination of mercury in environmental samples and its further application for the preparation and certification of new reference materials (RMs). Two certified RMs, ERM-CC580 (inorganic matrix) and ERM-CE464 (organic matrix), were used for the evaluation of digestion conditions assuring the quantitative recovery of mercury. These conditions were then used for the digestion of new candidates for the environmental RMs: bottom sediment (M_2 BotSed), herring tissue (M_3 HerTis), cormorant tissue (M_4 CormTis), and codfish muscle (M_5 CodTis). Cold vapor atomic absorption spectrometry (CV AAS) and inductively coupled plasma mass spectrometry (ICP MS) were used for the measurement of mercury concentration in all RMs. In order to validate and assure the accuracy of results, isotope dilution mass spectrometry (IDMS) was applied as a primary method of measurement, assuring the traceability of obtained values to the SI units: the mole, the kilogram, and the second. Results obtained by IDMS using the n(²⁰⁰Hg)/n(²⁰²Hg) ratio, with estimated combined uncertainty, were as follows: (916 ± 41) ng g⁻¹ [4.5 %] (M_2 BotSed), (236 ± 14) ng g⁻¹ [5.9 %] (M_3 HerTis), (2252 ± 54) ng g⁻¹ [2.4 %] (M_4 CormTis), and (303 ± 15) ng g⁻¹ [4.9 %] (M_5 CodTis), respectively. Different types of detection techniques and quantification (external calibration, standard addition, isotope dilution) were applied in order to improve the quality of the analytical results. The good agreement (within less than 2.5 %) between obtained results and those derived from the Inter-laboratory Comparison, executed by the Institute of Nuclear Chemistry and Technology (Warsaw, Poland) on the same sample matrices, further validated the analytical procedures developed in this study, as well as the concentration of mercury in all four new RMs. Although the developed protocol enabling the metrological
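
The single-spike IDMS calculation that makes such results traceable can be sketched from the isotope mass balance; the ²⁰⁰Hg/²⁰²Hg abundances and amounts below are illustrative round numbers, not values from the study, and real measurements also require mass-bias and blank corrections:

```python
def idms_amount(n_spike, a_x, b_x, a_y, b_y, r_blend):
    """Amount of analyte in the sample from a single isotope-dilution blend.

    a_*, b_* are the abundances of the reference isotope (e.g. 200Hg) and
    the ratio isotope (e.g. 202Hg) in the sample (x) and the spike (y);
    r_blend is the measured n(200Hg)/n(202Hg) ratio of the blend.
    """
    return n_spike * (a_y - r_blend * b_y) / (r_blend * b_x - a_x)

# Illustrative check: a natural-abundance Hg sample (23.1 % 200Hg,
# 29.9 % 202Hg) blended with a spike enriched to 96 % 200Hg / 2 % 202Hg.
n_x, n_y = 2.0, 1.0  # true analyte and spike amounts (arbitrary units)
r_blend = (n_x * 0.231 + n_y * 0.96) / (n_x * 0.299 + n_y * 0.02)
estimate = idms_amount(n_y, 0.231, 0.299, 0.96, 0.02, r_blend)
```

Because the blend ratio was constructed from the same abundances, the estimate recovers the true analyte amount exactly, which is the sense in which IDMS acts as a primary (ratio-based) method.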

  1. Sample normalization methods in quantitative metabolomics. (United States)

    Wu, Yiman; Li, Liang


    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has sometimes been ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
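
One of the simplest normalization strategies discussed in this literature is total-intensity (sum) normalization; a minimal sketch, assuming each sample is just a list of feature intensities:

```python
def total_sum_normalize(samples):
    """Scale each sample so its feature intensities sum to 1.

    This removes between-sample differences in total metabolite amount,
    so individual features can be compared across samples.
    """
    return [[x / sum(s) for x in s] for s in samples]

# Two samples with the same composition but different total amounts
# normalize to identical relative profiles.
profiles = total_sum_normalize([[1.0, 3.0], [2.0, 6.0]])
```

Sum normalization assumes the total signal is a fair proxy for sample amount; when a few abundant metabolites dominate, more robust choices (e.g., median-based scaling) are usually preferred.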

  2. Fluidics platform and method for sample preparation (United States)

    Benner, Henry W.; Dzenitis, John M.


    Provided herein are fluidics platforms and related methods for performing integrated sample collection and solid-phase extraction of a target component of the sample all in one tube. The fluidics platform comprises a pump, particles for solid-phase extraction and a particle-holding means. The method comprises contacting the sample with one or more reagents in a pump, coupling a particle-holding means to the pump and expelling the waste out of the pump while the particle-holding means retains the particles inside the pump. The fluidics platform and methods herein described allow solid-phase extraction without pipetting and centrifugation.

  3. Lot quality assurance sampling for monitoring coverage and quality of a targeted condom social marketing programme in traditional and non-traditional outlets in India (United States)

    Piot, Bram; Navin, Deepa; Krishnan, Nattu; Bhardwaj, Ashish; Sharma, Vivek; Marjara, Pritpal


    Objectives This study reports on the results of a large-scale targeted condom social marketing campaign in and around areas where female sex workers are present. The paper also describes the method that was used for the routine monitoring of condom availability in these sites. Methods The lot quality assurance sampling (LQAS) method was used for the assessment of the geographical coverage and quality of coverage of condoms in target areas in four states and along selected national highways in India, as part of Avahan, the India AIDS initiative. Results A significant general increase in condom availability was observed in the intervention area between 2005 and 2008. High coverage rates were gradually achieved through an extensive network of pharmacies and particularly of non-traditional outlets, whereas traditional outlets were instrumental in providing large volumes of condoms. Conclusion LQAS is seen as a valuable tool for the routine monitoring of the geographical coverage and of the quality of delivery systems of condoms and of health products and services in general. With a relatively small sample size, easy data collection procedures and simple analytical methods, it was possible to inform decision-makers regularly on progress towards coverage targets. PMID:20167732

  4. Using lot quality assurance sampling to assess measurements for growth monitoring in a developing country's primary health care system. (United States)

    Valadez, J J; Brown, L D; Vargas, W V; Morley, D


    Local supervisors used lot quality assurance sampling (LQAS) during routine household visits to assess the technical quality of Costa Rican community-based health workers (CHW): measuring and recording weights of children, interpreting their growth trend and providing nutrition education to mothers. Supervisors sampled 10 households in each of 12 Health Areas (4-8 hours per area). No more than two performance errors were allowed for each CHW. This LQAS decision rule resulted in judgments with a sensitivity and specificity of about 95 percent. Three categories of results are reported: (1) CHW adequately weighed children, calculated ages, identified children requiring nutritional services, and used the growth chart. (2) They needed to improve referral, education, and documentation skills. (3) The lack of system support to regularly provide growth cards, supplementary feeding to identified malnourished children, and other essential materials may have discouraged some CHW resulting in them not applying their skills. Supervisors regularly using LQAS should, by the sixth round of supervision, identify at least 90 percent of inadequately performing CHW. This paper demonstrates the strength of LQAS, namely, to be used easily by low level local health workers to identify poorly functioning components of growth monitoring and promotion.
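
The operating characteristics of a decision rule like the one above (sample n = 10 households, allow at most d = 2 errors) follow directly from the binomial distribution; a sketch, where the 5% and 50% true error rates are illustrative choices for an "adequate" and a "poorly performing" worker:

```python
from math import comb

def binom_cdf(d, n, p):
    """P(X <= d) for X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(d + 1))

n, d = 10, 2  # sample 10 households, tolerate at most 2 performance errors

# Probability an adequate CHW (5% true error rate) is passed:
pass_ok = binom_cdf(d, n, 0.05)        # about 0.99
# Probability a poorly performing CHW (50% error rate) is flagged:
catch_bad = 1 - binom_cdf(d, n, 0.50)  # about 0.95
```

Both probabilities land near the "about 95 percent" sensitivity and specificity quoted in the abstract, which is exactly how LQAS sample sizes and decision rules are tuned.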

  5. Dynamic Method for Identifying Collected Sample Mass (United States)

    Carson, John


    G-Sample is designed for sample collection missions to identify the presence and quantity of sample material gathered by spacecraft equipped with end effectors. The software method uses a maximum-likelihood estimator to identify the collected sample's mass based on onboard force-sensor measurements, thruster firings, and a dynamics model of the spacecraft. This makes sample mass identification a computation rather than a process requiring additional hardware. Simulation examples of G-Sample are provided for spacecraft model configurations with a sample collection device mounted on the end of an extended boom. In the absence of thrust knowledge errors, the results indicate that G-Sample can identify the amount of collected sample mass to within 10 grams (with 95-percent confidence) by using a force sensor with a noise and quantization floor of 50 micrometers. These results hold even in the presence of realistic parametric uncertainty in actual spacecraft inertia, center-of-mass offset, and first flexibility modes. Thrust profile knowledge is shown to be a dominant sensitivity for G-Sample, entering in a nearly one-to-one relationship with the final mass estimation error. This means thrust profiles should be well characterized with onboard accelerometers prior to sample collection. An overall sample-mass estimation error budget has been developed to approximate the effect of model uncertainty, sensor noise, data rate, and thrust profile error on the expected estimate of collected sample mass.
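
Under a simplified one-axis model with independent Gaussian force noise, the maximum-likelihood mass estimate reduces to least squares on F = m·a; the numbers below are made up for illustration and stand in for G-Sample's full spacecraft dynamics model:

```python
def ml_mass_estimate(forces, accels):
    """Least-squares (ML under Gaussian noise) mass from F = m*a samples."""
    return sum(f * a for f, a in zip(forces, accels)) / sum(a * a for a in accels)

# Synthetic thruster-firing profile: true collected mass of 2.5 kg,
# with fixed small perturbations standing in for force-sensor noise.
accels = [0.1, 0.2, 0.3, 0.4, 0.5]             # m/s^2, assumed known from thrust profile
noise = [0.002, -0.001, 0.003, -0.002, 0.001]  # N
forces = [2.5 * a + e for a, e in zip(accels, noise)]
m_hat = ml_mass_estimate(forces, accels)       # close to 2.5 kg
```

The estimator weights each sample by its acceleration, which mirrors the abstract's point that errors in the assumed thrust (hence acceleration) profile feed nearly one-to-one into the final mass estimate.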


    NARCIS (Netherlands)


    Stable lyophilized ethylenediaminetetra-acetic acid (EDTA)-blood haemolysates were applied in an external quality assurance programme (SKZL, The Netherlands) for glycohaemoglobin assays in 101 laboratories using 12 methods. The mean intralaboratory day-to-day coefficient of variation (CV),

  7. Lot quality assurance sampling for monitoring coverage and quality of a targeted condom social marketing programme in traditional and non-traditional outlets in India. (United States)

    Piot, Bram; Mukherjee, Amajit; Navin, Deepa; Krishnan, Nattu; Bhardwaj, Ashish; Sharma, Vivek; Marjara, Pritpal


    This study reports on the results of a large-scale targeted condom social marketing campaign in and around areas where female sex workers are present. The paper also describes the method that was used for the routine monitoring of condom availability in these sites. The lot quality assurance sampling (LQAS) method was used for the assessment of the geographical coverage and quality of coverage of condoms in target areas in four states and along selected national highways in India, as part of Avahan, the India AIDS initiative. A significant general increase in condom availability was observed in the intervention area between 2005 and 2008. High coverage rates were gradually achieved through an extensive network of pharmacies and particularly of non-traditional outlets, whereas traditional outlets were instrumental in providing large volumes of condoms. LQAS is seen as a valuable tool for the routine monitoring of the geographical coverage and of the quality of delivery systems of condoms and of health products and services in general. With a relatively small sample size, easy data collection procedures and simple analytical methods, it was possible to inform decision-makers regularly on progress towards coverage targets.

  8. Method and apparatus for sampling atmospheric mercury (United States)

    Trujillo, Patricio E.; Campbell, Evan E.; Eutsler, Bernard C.


    A method of simultaneously sampling particulate mercury, organic mercurial vapors, and metallic mercury vapor in the working and occupational environment and determining the amount of mercury derived from each such source in the sampled air. A known volume of air is passed through a sampling tube containing a filter for particulate mercury collection, a first adsorber for the selective adsorption of organic mercurial vapors, and a second adsorber for the adsorption of metallic mercury vapor. Carbon black molecular sieves are particularly useful as the selective adsorber for organic mercurial vapors. The amount of mercury adsorbed or collected in each section of the sampling tube is readily quantitatively determined by flameless atomic absorption spectrophotometry.

  9. External quality assurance as a revalidation method for pathologists in pediatric histopathology: Comparison of four international programs

    Directory of Open Access Journals (Sweden)

    Mikuz Gregor


    Full Text Available Abstract Aim External quality assurance (EQA) is an extremely valuable resource for clinical pathologists to maintain high standards, improve diagnostic skills, and possibly revalidate medical license. The aim of this study was to participate in and compare four international slide survey programs (UK, IAP-Germany, USA-Canada, Australasia) in pediatric histopathology for clinical pathologists with the aim to use it as a revalidation method. Methods The following parameters were evaluated: number of circulations per year, number of slides, membership requirement, proof of significant pediatric pathology work, open to overseas participants, laboratory accreditation, issue of continuing professional development certificates and credits, slides discussion meeting, use of digital images, substandard performance letter, and anonymity of responses. Results The UK scheme, which has a sampling procedure over several time frames (2 circulations/year, 30 slides), partial confidentiality, and multiple sources of data and assessors, can be used as a model for revalidation. The US-Canadian and Australasian schemes only partially fulfill the revalidation requirements. The IAP scheme appears to be essentially an educational program and may be unsuitable for revalidation. Conclusion The purposes and programs of EQA schemes vary worldwide. In order for it to be used for revalidation, it is advisable that EQA schemes are immediately unified.

  10. Serum vitamin A and E analysis: comparison of methods between laboratories enrolled in an external quality assurance programme. (United States)

    Greaves, Ronda; Jolly, Lisa; Woollard, Gerald; Hoad, Kirsten


    To survey laboratories enrolled in the Royal College of Pathologists of Australasia (RCPA) Chemical Pathology Quality Assurance Programme (QAP) for vitamin A and E testing to determine differences between methods of analysis. A detailed questionnaire covering the major aspects of serum vitamin A and E analysis was sent to all participating laboratories in 2007. Thirteen out of the 22 laboratories completed the questionnaire. Methods between laboratories showed a great deal of commonality. All respondents performed a liquid extraction step, which included the addition of an internal standard, followed by high-performance liquid chromatography (C18 columns with predominantly methanol-based mobile phases) with spectrophotometric detection. Major inter-laboratory differences were whether the sample was protected from light, the extraction solvents and ratios used, the drying down temperature used post-liquid extraction and choice of calibrator. The questionnaire highlighted discrete methodological differences between laboratories. These findings provide direction to enable the Vitamins Working Party of the Australasian Association of Clinical Biochemists to further investigate the dispersion in results between participants of the RCPA QAP vitamin programme.

  11. Assessing Local Risk of Rifampicin-Resistant Tuberculosis in KwaZulu-Natal, South Africa Using Lot Quality Assurance Sampling.

    Directory of Open Access Journals (Sweden)

    Christine L Heidebrecht

    Full Text Available KwaZulu-Natal (KZN) has the highest burden of notified multidrug-resistant tuberculosis (MDR TB) and extensively drug-resistant TB (XDR TB) cases in South Africa. A better understanding of spatial heterogeneity in the risk of drug resistance may help to prioritize local responses. Between July 2012 and June 2013, we conducted a two-way Lot Quality Assurance Sampling (LQAS) study to classify the burden of rifampicin (RIF)-resistant TB among incident TB cases notified within the catchment areas of seven laboratories in two northern and one southern district of KZN. Decision rules for classification of areas as having either a high or low risk of RIF-resistant TB (based on the proportion of RIF resistance among all TB cases) were based on consultation with local policy makers. We classified five areas as high-risk and two as low-risk. High-risk areas were identified in both southern and northern districts, with the greatest proportion of RIF resistance observed in the northernmost area, the Manguzi community situated on the Mozambique border. Our study revealed heterogeneity in the risk of RIF-resistant disease among incident TB cases in KZN. This study demonstrates the potential for LQAS to detect geographic heterogeneity in areas where access to drug susceptibility testing is limited.

  12. A novel and independent method for time-resolved gantry angle quality assurance for VMAT. (United States)

    Fuangrod, Todsaporn; Greer, Peter B; Zwan, Benjamin J; Barnes, Michael P; Lehmann, Joerg


    Volumetric-modulated arc therapy (VMAT) treatment delivery requires three key dynamic components: gantry rotation, dose rate modulation, and multi-leaf collimator (MLC) motion, all of which are varied simultaneously during the delivery. Misalignment of the gantry angle can potentially affect clinical outcome due to the steep dose gradients and complex MLC shapes involved. It is essential to develop independent gantry angle quality assurance (QA) appropriate to VMAT that can be performed simultaneously with other key VMAT QA testing. In this work, a simple and inexpensive, fully independent gantry angle measurement methodology was developed that allows quantitation of the gantry angle accuracy as a function of time. The method is based on the analysis of video footage of a "Double dot" pattern attached to the front cover of the linear accelerator, consisting of red and green circles printed on an A4 paper sheet. A standard mobile phone is placed on the couch to record the video footage during gantry rotation. The video file is subsequently analyzed to determine the gantry angle in each video frame from the relative position of the two dots. Two types of validation tests were performed: a static mode with manual gantry rotation and a dynamic mode with three complex test plans. The accuracy was 0.26° ± 0.04° and 0.46° ± 0.31° (mean ± 1 SD) for the static and dynamic modes, respectively. This method is user friendly, cost effective, easy to set up, has high temporal resolution, and can be combined with existing time-resolved methods for QA of MLC and dose rate to form a comprehensive set of procedures for time-resolved QA of the VMAT delivery system. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
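
The geometry behind the "Double dot" readout is simply the orientation of the line joining the two dot centroids in each video frame; a sketch, assuming pixel coordinates with a mathematical (y-up) axis convention and a reference frame captured at gantry zero (a real image pipeline, with y pointing down, needs its sign convention calibrated):

```python
from math import atan2, degrees

def dot_pair_angle(red, green):
    """Orientation (degrees) of the vector from the red to the green dot centroid."""
    return degrees(atan2(green[1] - red[1], green[0] - red[0]))

def gantry_angle(red, green, reference_angle):
    """Gantry angle relative to the orientation recorded at gantry zero."""
    return (dot_pair_angle(red, green) - reference_angle) % 360.0

# Reference frame: dots horizontal at gantry zero.
ref = dot_pair_angle((100.0, 100.0), (200.0, 100.0))
# Later frame: the dot pair has rotated to vertical.
angle = gantry_angle((100.0, 100.0), (100.0, 200.0), ref)
```

Because each frame yields one angle, running this per frame at the video rate gives the time-resolved gantry trace the authors compare against the planned rotation.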

  13. Sampling of temporal networks: Methods and biases (United States)

    Rocha, Luis E. C.; Masuda, Naoki; Holme, Petter


    Temporal networks have been increasingly used to model a diversity of systems that evolve in time; for example, human contact structures over which dynamic processes such as epidemics take place. A fundamental aspect of real-life networks is that they are sampled within temporal and spatial frames. Furthermore, one might wish to subsample networks to reduce their size for better visualization or to perform computationally intensive simulations. The sampling method may affect the network structure and thus caution is necessary to generalize results based on samples. In this paper, we study four sampling strategies applied to a variety of real-life temporal networks. We quantify the biases generated by each sampling strategy on a number of relevant statistics such as link activity, temporal paths and epidemic spread. We find that some biases are common in a variety of networks and statistics, but one strategy, uniform sampling of nodes, shows improved performance in most scenarios. Given the particularities of temporal network data and the variety of network structures, we recommend that the choice of sampling methods be problem oriented to minimize the potential biases for the specific research questions on hand. Our results help researchers to better design network data collection protocols and to understand the limitations of sampled temporal network data.
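
Uniform node sampling, the strategy the authors find performs best in most scenarios, keeps a random subset of nodes and the temporal contacts induced among them; a minimal sketch over a list of (time, u, v) contact events:

```python
import random

def uniform_node_sample(events, fraction, seed=0):
    """Keep a uniformly random fraction of nodes and the events between them.

    events: iterable of (time, u, v) temporal contacts.
    Returns the temporal subnetwork induced on the sampled nodes.
    """
    nodes = sorted({n for _, u, v in events for n in (u, v)})
    rng = random.Random(seed)
    kept = set(rng.sample(nodes, int(len(nodes) * fraction)))
    return [(t, u, v) for t, u, v in events if u in kept and v in kept]

events = [(1, "a", "b"), (2, "b", "c"), (3, "a", "c"), (4, "c", "d")]
subnet = uniform_node_sample(events, 0.75)  # events among 3 of the 4 nodes
```

Note that node sampling discards every contact touching an unsampled node, so time-ordered paths through those nodes vanish, which is precisely the kind of bias in temporal paths and epidemic spread the paper quantifies.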

  14. Monitoring maternal, newborn, and child health interventions using lot quality assurance sampling in Sokoto State of northern Nigeria. (United States)

    Abegunde, Dele; Orobaton, Nosa; Shoretire, Kamil; Ibrahim, Mohammed; Mohammed, Zainab; Abdulazeez, Jumare; Gwamzhi, Ringpon; Ganiyu, Akeem


    Maternal mortality ratio and infant mortality rate are as high as 1,576 per 100,000 live births and 78 per 1,000 live births, respectively, in Nigeria's northwestern region, where Sokoto State is located. Using applicable monitoring indicators for tracking progress in the UN/WHO framework on continuum of maternal, newborn, and child health care, this study evaluated the progress of Sokoto toward achieving the Millennium Development Goals (MDGs) 4 and 5 by December 2015. The changes in outcomes in 2012-2013 associated with maternal and child health interventions were assessed. We used baseline and follow-up lot quality assurance sampling (LQAS) data obtained in 2012 and 2013, respectively. In each of the surveys, data were obtained from 437 households sampled from 19 LQAS locations in each of the 23 local government areas (LGAs). The composite state-level coverage estimates of the respective indicators were aggregated from estimated LGA coverage estimates. None of the nine indicators associated with the continuum of maternal, neonatal, and child care satisfied the recommended 90% coverage target for achieving MDGs 4 and 5. Similarly, the average state coverage estimates were lower than national coverage estimates. Marginal improvements in coverage were obtained in the demand for family planning satisfied, antenatal care visits, postnatal care for mothers, and exclusive breast-feeding. Antibiotic treatment for acute pneumonia increased significantly by 12.8 percentage points. The majority of the LGAs were classifiable as low-performing, high-priority areas for intensified program intervention. Despite the limited time left in the countdown to December 2015, Sokoto State, Nigeria, is not on track to achieving the MDG 90% coverage of indicators tied to the continuum of maternal and child care, to reduce maternal and childhood mortality by a third by 2015. Targeted health system investments at the primary care level remain a priority, for intensive program scale-up to

  15. Assessing full immunisation coverage using lot quality assurance sampling in urban and rural districts of southwest Nigeria. (United States)

    Fatiregun, Akinola Ayoola; Adebowale, Ayo Stephen; Ayoka, Rita Ogechi; Fagbamigbe, Adeniyi Francis


    This study was conducted to identify administrative wards (lots) with unacceptable levels of full child immunisation coverage, and to identify factors associated with achievement of a complete child immunisation schedule in Ibadan North East (IBNE) and Ido local government areas (LGAs) of Oyo State, Nigeria. A cross-sectional survey involving 1178 mothers, 588 from the IBNE LGA and 590 from the Ido LGA, with children 12-23 months of age was conducted. Children were considered 'fully immunised' if they received all the vaccines included in the immunisation schedule. Lot quality assurance sampling was used to determine lots with acceptable and non-acceptable coverage. Samples were weighted based on the population by lot to estimate overall coverage in the two LGAs, and a logistic regression model was used to identify factors associated with the fully immunised child. Mean age of the mothers was 28.5 ± 5.6 and 28.1 ± 6.0 years in IBNE and Ido LGAs, respectively. Eleven of 12 wards in IBNE and all the wards in Ido had unacceptable coverage. The proportion of fully immunised children was 40.2% in IBNE and 41.3% in Ido. Maternal age ≥30 years, retention of an immunisation card, completion of tertiary education or secondary education, hospital birth and first-order birth were significant predictors of complete childhood immunisation. The level of full immunisation coverage was unacceptable in almost all the wards. Educational intervention on the importance of completion of the immunisation schedule should target young, uneducated mothers, mothers who delivered their babies at home and those with a high birth order.

  16. The use of a lot quality assurance sampling methodology to assess and manage primary health interventions in conflict-affected West Darfur, Sudan. (United States)

    Pham, Kiemanh; Sharpe, Emily Chambers; Weiss, William M; Vu, Alexander


    Organizations working in conflict-affected areas have a need to monitor and evaluate their programs, however this is often difficult due to the logistical challenges of conflict areas. Lot quality assurance sampling may be a suitable method of assessing programs in these situations. We conducted a secondary data analysis of information collected during Medair's routine program management functions. Medair's service area in West Darfur, Sudan was divided into seven supervisory areas. Using the available population information, a sampling frame was developed and interviews were conducted from randomly selected caretakers of children in each supervisory area every six months over 19 months. A survey instrument with questions related to key indicators for immunizations and maternal, newborn, and child health was used for the interviews. Based on Medair's goals for each indicator, decision rules were calculated for the indicators; these decision rules determined which supervisory areas and indicators performed adequately in each assessment period. Pearson's chi-squared tests, adjusted for the survey design using STATA "svy: tab" commands, were used to detect overall differences in coverage in this analysis. The coverage of tetanus toxoid vaccination among pregnant women increased from 47.2 to 69.7 % (p value = 0.046), and births attended by a skilled health professional increased from 35.7 to 52.7 % (p value = 0.025) from the first to last assessment periods. Measles vaccinations declined from 72.0 to 54.1 % (p value = 0.046). The estimated coverage for the proportion of women receiving a postpartum dose of vitamin A (54.7 to 61.3 %, p value = 0.44); pregnant women receiving a clean delivery kit (54.6 to 47.1 %, p value = 0.49); and pentavalent vaccinations (49.7 to 42.1 %, p value = 0.28) did not significantly change. Lot quality assurance sampling was a feasible method for Medair staff to evaluate and optimize primary health programs in

  17. Vitamin B1 and B6 method harmonization: comparison of performance between laboratories enrolled in the RCPA Quality Assurance Program. (United States)

    Hoad, Kirsten E; Johnson, Lambro A; Woollard, Gerald A; Walmsley, Trevor A; Briscoe, Scott; Jolly, Lisa M; Gill, Janice P; Greaves, Ronda F


    The RCPA Quality Assurance Program (RCPA QAP) offers monthly proficiency testing for vitamins A, B1, B6, β-carotene, C and E to laboratories worldwide. A review of the results submitted for the whole blood vitamin B1/B6 sub-program revealed a wide dispersion. Here we describe the results of a methodology survey for vitamins B1 and B6. A questionnaire was sent to thirteen laboratories. Eleven laboratories were returning QAP results for vitamin B1 (thiamine diphosphate) and five were returning results for vitamin B6 (pyridoxal-5-phosphate). All nine respondents provided a clinical service for vitamins B1 and B6. HPLC with fluorescence detection was the most common method principle. For vitamin B1, six respondents used a commercial assay whilst three used in-house methods; whole blood was the matrix for all. For vitamin B6, five respondents used commercial assays and four used in-house assays. The choice of matrix for vitamin B6 varied with three respondents using whole blood and five using plasma for analysis. Sample preparation incorporated protein precipitation and derivatization steps. An internal standard was employed in sample preparation by only one survey respondent. The immediate result of this survey was the incorporation of plasma vitamin B6 into the RCPA QAP vitamin program. The absence of an internal standard in current vitamin B1 and B6 assays is a likely contributor to the wide dispersion of results seen in this program. We recommend kit manufacturers and laboratories investigate the inclusion of internal standards to correct the variability that may occur during processing. Copyright © 2013 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  18. A Quality Assurance Method that Utilizes 3D Dosimetry and Facilitates Clinical Interpretation

    Energy Technology Data Exchange (ETDEWEB)

    Oldham, Mark [Radiation Oncology, Duke University Medical Center, Durham, North Carolina (United States); Thomas, Andrew; O'Daniel, Jennifer; Juang, Titania [Radiation Oncology, Duke University Medical Center, Durham, North Carolina (United States); Ibbott, Geoffrey [University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Adamovics, John [Rider University, Lawrenceville, New Jersey (United States); Kirkpatrick, John P. [Radiation Oncology, Duke University Medical Center, Durham, North Carolina (United States)


    Purpose: To demonstrate a new three-dimensional (3D) quality assurance (QA) method that provides comprehensive dosimetry verification and facilitates evaluation of the clinical significance of QA data acquired in a phantom. Also to apply the method to investigate the dosimetric efficacy of base-of-skull (BOS) intensity-modulated radiotherapy (IMRT) treatment. Methods and Materials: Two types of IMRT QA verification plans were created for 6 patients who received BOS IMRT. The first plan enabled conventional 2D planar IMRT QA using the Varian portal dosimetry system. The second plan enabled 3D verification using an anthropomorphic head phantom. In the latter, the 3D dose distribution was measured using the DLOS/Presage dosimetry system (DLOS = Duke Large-field-of-view Optical-CT System; Presage, Heuris Pharma, Skillman, NJ), which yielded isotropic 2-mm data throughout the treated volume. In a novel step, measured 3D dose distributions were transformed back to the patient's CT to enable calculation of dose-volume histograms (DVH) and dose overlays. Measured and planned patient DVHs were compared to investigate clinical significance. Results: Close agreement between measured and calculated dose distributions was observed for all 6 cases. For gamma criteria of 3%, 2 mm, the mean passing rate for portal dosimetry was 96.8% (range, 92.0%-98.9%), compared to 94.9% (range, 90.1%-98.9%) for 3D. There was no clear correlation between 2D and 3D passing rates. Planned and measured dose distributions were evaluated on the patient's anatomy, using DVH and dose overlays. Minor deviations were detected, and their clinical significance is presented and discussed. Conclusions: Two advantages accrue to the methods presented here. First, treatment accuracy is evaluated throughout the whole treated volume, yielding comprehensive verification. Second, the clinical significance of any deviations can be assessed through the generation of DVH curves and dose overlays on
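    The 3%/2 mm passing rates quoted above come from the standard gamma-index comparison of measured against planned dose. A minimal 1-D sketch in Python (global dose normalization and a brute-force search over reference points; an illustration of the metric only, not the DLOS/Presage or Varian implementation):

```python
from math import hypot

def gamma_pass_rate(ref, meas, spacing_mm, dose_tol=0.03, dta_mm=2.0):
    """Fraction of measured points with gamma index <= 1 against a
    reference profile sampled on the same uniform grid."""
    d_max = max(ref)  # global normalization dose
    passed = 0
    for i, dm in enumerate(meas):
        gamma = min(
            hypot((j - i) * spacing_mm / dta_mm,    # distance-to-agreement term
                  (dm - dr) / (dose_tol * d_max))   # dose-difference term
            for j, dr in enumerate(ref)
        )
        passed += gamma <= 1.0
    return passed / len(meas)
```

    Clinical systems evaluate the same quantity on 2D or 3D grids with sub-voxel interpolation; the structure of the calculation is the same.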

  19. Sampling methods for terrestrial amphibians and reptiles. (United States)

    Paul Stephen Corn; R. Bruce. Bury


    Methods described for sampling amphibians and reptiles in Douglas-fir forests in the Pacific Northwest include pitfall trapping, time-constrained collecting, and surveys of coarse woody debris. The herpetofauna of this region differ in breeding and nonbreeding habitats and vagility, so that no single technique is sufficient for a community study. A combination of...

  20. Development of a quality assured calibration method for the PSI radon chamber reference atmosphere

    Energy Technology Data Exchange (ETDEWEB)

    Schuler, C.; Butterweck-Dempewolf, G.; Vezzu, G. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)


    Radon detectors and measuring instruments are calibrated at the PSI Reference Laboratory for Radon Gas Concentration Measurements by exposing them to a calibrated radon reference atmosphere in the PSI radon chamber. A sophisticated and quality assured calibration technique was developed which guarantees the traceability of this radon chamber reference atmosphere to standards of internationally acknowledged primary laboratories. (author) 2 figs., 2 refs.

  1. Sample conditioning system for quality assurance in waste combustion and waste recycling; Probenaufbereitungssystem zur Qualitaetssicherung fuer Abfaelle zur energetischen (stofflichen) Verwertung

    Energy Technology Data Exchange (ETDEWEB)

    Spuziak-Salzenberg, D.; Riemer, S.; Bayley-Blackwedel, B.; Baer, G.


    Effective quality assurance is an indispensable element of waste recycling and waste utilization in order to make a recycling-based economy efficient and sustainable. Quality criteria and target or limiting values for waste to be recycled or disposed of have already been defined in different guidelines or drafts by LAGA and corresponding regulations (for instance, waste combustion in cements works, combustion of used wood; ordinance on biocompost, RAL quality criteria for used wood, LAGA guidelines). Whereas there exist methods of analysis and conditioning for samples for mineral, or mainly mineral, wastes in analogy to those for the bulk material used in road building, no such methods exist for mainly non-mineral material and, especially, heterogeneous mixtures of waste. (orig.) [Deutsch] Im Rahmen einer effizienten und nachhaltigen Kreislaufwirtschaft ist eine effektive Qualitaetssicherung fuer die stoffliche und energetische Verwertung von Abfaellen unerlaesslich. Zur Verwertung bzw. Beseitigung der Abfaelle sind bereits in verschiedenen Richtlinien/Entwuerfen der LAGA und entsprechenden Verwertungsvorschriften Guetekriterien und Richt- bzw. Grenzwerte definiert (u.a. energetische Verwertung in Zementwerken, Altholz-Verwertung, Biokompost-VO, RAL-Guetekriterien-Gebrauchtholz, LAGA-Richtlinien). Liegen fuer mineralische bzw. ueberwiegend mineralische Abfaelle entsprechende Probenaufbereitungs- und Analysemethoden in Analogie zum Bereich der Boden-Schuettgut-Untersuchung vor, so muss dies fuer ueberwiegend nichtmineralische Stoffe und besonders fuer heterogene Abfallgemische verneint werden. (orig.)

  2. Countdown to 2015: Tracking Maternal and Child Health Intervention Targets Using Lot Quality Assurance Sampling in Bauchi State Nigeria.

    Directory of Open Access Journals (Sweden)

    Dele Abegunde

    Full Text Available Improving maternal and child health remains a top priority in Nigeria's Bauchi State in the northeastern region, where the maternal mortality ratio (MMR) and infant mortality rate (IMR) are as high as 1540 per 100,000 live births and 78 per 1,000 live births respectively. In this study, we used the framework of the continuum of maternal and child care to evaluate the impact of interventions in Bauchi State focused on improved maternal and child health, and to ascertain progress towards the achievement of Millennium Development Goals (MDGs) 4 and 5. At baseline (2012) and then at follow-up (2013), we randomly sampled 340 households from 19 random locations in each of the 20 Local Government Areas (LGA) of Bauchi State in Northern Nigeria, using the Lot Quality Assurance Sampling (LQAS) technique. Women residents in the households were interviewed about their own health and that of their children. Estimated LGA coverage of maternal and child health indicators were aggregated across the State. These values were then compared to the national figures, and the differences from 2012 to 2014 were calculated. For several of the indicators, a modest improvement from baseline was found. However, the indicators in the continuum of care neither reached the national average nor attained the 90% globally recommended coverage level. The majority of the LGA surveyed were classifiable as high priority, thus requiring intensified efforts and programmatic scale-up. Intensive scale-up of programs and interventions is needed in Bauchi State, Northern Nigeria, to accelerate, consolidate and sustain the modest but significant achievements in the continuum of care, if MDGs 4 and 5 are to be achieved by the end of 2015. The intentional focus on LGAs as the unit of intervention ought to be considered a condition precedent for future investments. Priority should be given to re-allocating resources to program areas and regions where coverage has been low. Finally, systematic

  3. Countdown to 2015: Tracking Maternal and Child Health Intervention Targets Using Lot Quality Assurance Sampling in Bauchi State Nigeria. (United States)

    Abegunde, Dele; Orobaton, Nosa; Sadauki, Habib; Bassi, Amos; Kabo, Ibrahim A; Abdulkarim, Masduq


    Improving maternal and child health remains a top priority in Nigeria's Bauchi State in the northeastern region where the maternal mortality ratio (MMR) and infant mortality rate (IMR) are as high as 1540 per 100,000 live births and 78 per 1,000 live births respectively. In this study, we used the framework of the continuum of maternal and child care to evaluate the impact of interventions in Bauchi State focused on improved maternal and child health, and to ascertain progress towards the achievement of Millennium Development Goals (MDGs) 4 and 5. At baseline (2012) and then at follow-up (2013), we randomly sampled 340 households from 19 random locations in each of the 20 Local Government Areas (LGA) of Bauchi State in Northern Nigeria, using the Lot Quality Assurance Sampling (LQAS) technique. Women residents in the households were interviewed about their own health and that of their children. Estimated LGA coverage of maternal and child health indicators were aggregated across the State. These values were then compared to the national figures, and the differences from 2012 to 2014 were calculated. For several of the indicators, a modest improvement from baseline was found. However, the indicators in the continuum of care neither reached the national average nor attained the 90% globally recommended coverage level. The majority of the LGA surveyed were classifiable as high priority, thus requiring intensified efforts and programmatic scale up. Intensive scale-up of programs and interventions is needed in Bauchi State, Northern Nigeria, to accelerate, consolidate and sustain the modest but significant achievements in the continuum of care, if MDGs 4 and 5 are to be achieved by the end of 2015. The intentional focus on LGAs as the unit of intervention ought to be considered a condition precedent for future investments. Priority should be given to re-allocating resources to program areas and regions where coverage has been low. 
Finally, systematic considerations

  4. Electronic laboratory quality assurance program: A method of enhancing the prosthodontic curriculum and addressing accreditation standards. (United States)

    Moghadam, Marjan; Jahangiri, Leila


    An electronic quality assurance (eQA) program was developed to replace a paper-based system, to address standards introduced by the Commission on Dental Accreditation (CODA), and to improve educational outcomes. This eQA program provides feedback to predoctoral dental students on prosthodontic laboratory steps at New York University College of Dentistry. The purpose of this study was to compare the eQA program of performing laboratory quality assurance with the former paper-based format. Fourth-year predoctoral dental students (n=334) who experienced both the paper-based and the electronic version of the quality assurance program were surveyed about their experiences. Additionally, data extracted from the eQA program were analyzed to identify areas of weakness in the curriculum. The study findings revealed that 73.8% of the students preferred the eQA program to the paper-based version. The average number of treatments that did not pass quality assurance standards was 119.5 per month. This indicated a 6.34% laboratory failure rate. Further analysis of these data revealed that 62.1% of the errors were related to fixed prosthodontic treatment, 27.9% to partial removable dental prostheses, and 10% to complete removable dental prostheses in the first 18 months of program implementation. The eQA program was favored by dental students who had experienced both electronic and paper-based versions of the system. Error type analysis makes it possible to create customized faculty standardization sessions and refine the didactic and clinical teaching of the predoctoral students. This program was also able to link patient care activity with the student's laboratory activities, thus addressing the latest requirements of the CODA regarding the competence of graduates in evaluating laboratory work related to their patient care. Copyright © 2015 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.


    Directory of Open Access Journals (Sweden)

    A Karinagannanavar


    Full Text Available Background: Measles is a leading cause of childhood morbidity and mortality, accounting for nearly half the global burden of vaccine-preventable deaths. In 2007, there were 197,000 measles deaths globally: nearly 540 deaths every day, or 22 deaths per hour. According to NFHS-3 (2005–06), total measles vaccination coverage in Karnataka was 72%. Objectives: 1) To find out measles vaccination coverage in Bellary District. 2) To know the reasons for non-vaccination. Material and Methods: A cross-sectional study was conducted from May 2010 to April 2011 at areas covered by PHC/PHU of Bellary district by using the Lot Quality Assurance Sampling (LQAS) method. Total sample size was 1007 (53×19). Bellary district has 47 primary health centers (PHC) and 6 primary health units (PHU), all of which were studied, with each PHC/PHU considered as a lot. The data were collected from parents of children aged 12-23 months using a pretested semi-structured questionnaire. Results: Out of 53 PHCs/PHUs, we accepted 41 (77.35%); vaccination coverage in these lots was considered to be more than 85%. Overall coverage in Bellary district was 69.41%, and 53.62% had received vitamin A supplementation. The reasons for non-vaccination were lack of awareness, ignorance, ill health of the child, fear of side effects and lack of health services. Conclusion: Measles vaccination coverage was 69.41% and the reasons for non-vaccination were lack of awareness, ignorance, ill health of the child, fear of side effects and lack of health services.

  6. Filmless methods for quality assurance of Tomotherapy using ArcCHECK. (United States)

    Yang, B; Wong, W K R; Geng, H; Lam, W W; Ho, Y W; Kwok, W M; Cheung, K Y; Yu, S K


    Tomotherapy delivers an intensity-modulated radiation therapy (IMRT) treatment by the synchronization of gantry rotation, multileaf collimator (MLC), and couch movement. This dynamic nature makes quality assurance (QA) important and challenging. The purpose of this study is to develop some methodologies using an ArcCHECK for accurate QA measurements of the gantry angle and speed, MLC synchronization and leaf open time, couch translation per gantry rotation, couch speed and uniformity, and constancy of longitudinal beam profile for a Tomotherapy unit. Four test plans recommended by AAPM Task Group 148 (TG148) and the manufacturer were chosen for this study. Helical and static star shot tests are used to check that the leaves opened at the expected gantry angles. Another helical test verifies that the couch traveled the expected distance per gantry rotation. The final test is for checking the couch speed constancy with a static gantry. ArcCHECK can record the detector signal every 50 ms as a movie file, and has a virtual inclinometer for gantry angle measurement. These features made the measurement of gantry angle and speed, MLC synchronization and leaf open time, and longitudinal beam profile possible. A shaping parameter was defined to facilitate the location of the beam center during plan delivery, which was thereafter used to calculate the couch translation per gantry rotation and couch speed. The full width at half maximum (FWHM) was calculated for each measured longitudinal beam profile and then used to evaluate the couch speed uniformity. Furthermore, a mean longitudinal profile was obtained for a constancy check of field width. The machine trajectory log data were also collected for comparison. In-house programs were developed in MATLAB to process both the ArcCHECK and machine log data. The deviation of our measurement results from the log data for gantry angle was calculated to be less than 0.4°. 
The percentage differences between measured and planned
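    The FWHM evaluation of each longitudinal profile described above amounts to locating the two half-maximum crossings. A generic sketch with linear interpolation (the authors' processing was done in MATLAB; this Python version is an illustrative equivalent, not their code):

```python
def fwhm(positions, profile):
    """Full width at half maximum of a sampled beam profile,
    interpolating linearly at the half-maximum crossings."""
    half = max(profile) / 2.0

    def crossing(i):
        # linear interpolation between samples i and i + 1
        x0, x1 = positions[i], positions[i + 1]
        y0, y1 = profile[i], profile[i + 1]
        return x0 + (half - y0) * (x1 - x0) / (y1 - y0)

    rising = next(i for i in range(len(profile) - 1)
                  if profile[i] < half <= profile[i + 1])
    falling = next(i for i in range(len(profile) - 1)
                   if profile[i] >= half > profile[i + 1])
    return crossing(falling) - crossing(rising)
```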

  7. Methods, quality assurance, and data for assessing atmospheric deposition of pesticides in the Central Valley of California (United States)

    Zamora, Celia; Majewski, Michael S.; Foreman, William T.


    The U.S. Geological Survey monitored atmospheric deposition of pesticides in the Central Valley of California during two studies in 2001 and 2002–04. The 2001 study sampled wet deposition (rain) and storm-drain runoff in the Modesto, California, area during the orchard dormant-spray season to examine the contribution of pesticide concentrations to storm runoff from rainfall. In the 2002–04 study, the number and extent of collection sites in the Central Valley were increased to determine the areal distribution of organophosphate insecticides and other pesticides, and also five more sample types were collected. These were dry deposition, bulk deposition, and three sample types collected from a soil box: aqueous phase in runoff, suspended sediment in runoff, and surficial-soil samples. This report provides concentration data and describes methods and quality assurance of sample collection and laboratory analysis for pesticide compounds in all samples collected from 16 sites. Each sample was analyzed for 41 currently used pesticides and 23 pesticide degradates, including oxygen analogs (oxons) of 9 organophosphate insecticides. Analytical results are presented by sample type and study period. The median concentrations of both chlorpyrifos and diazinon sampled at four urban (0.067 micrograms per liter [μg/L] and 0.515 μg/L, respectively) and four agricultural sites (0.079 μg/L and 0.583 μg/L, respectively) during a January 2001 storm event in and around Modesto, Calif., were nearly identical, indicating that the overall atmospheric burden in the region appeared to be fairly similar during the sampling event. Comparisons of median concentrations in the rainfall to those in the McHenry storm-drain runoff showed that, for some compounds, rainfall contributed a substantial percentage of the concentration in the runoff; for other compounds, the concentrations in rainfall were much greater than in the runoff. For example, diazinon concentrations in rainfall were about

  8. System and Method for Isolation of Samples (United States)

    Zhang, Ye (Inventor); Wu, Honglu (Inventor)


    Systems and methods for isolating samples are provided. The system comprises a first membrane and a second membrane disposed within an enclosure. First and second reservoirs can also be disposed within the enclosure and adapted to contain one or more reagents therein. A first valve can be disposed within the enclosure and in fluid communication with the first reservoir, the second reservoir, or both. The first valve can also be in fluid communication with the first or second membranes or both. The first valve can be adapted to selectively regulate the flow of the reagents from the first reservoir, through at least one of the first and second membranes, and into the second reservoir.

  9. Using lot quality assurance sampling to assess access to water, sanitation and hygiene services in a refugee camp setting in South Sudan: a feasibility study. (United States)

    Harding, Elizabeth; Beckworth, Colin; Fesselet, Jean-Francois; Lenglet, Annick; Lako, Richard; Valadez, Joseph J


    Humanitarian agencies working in refugee camp settings require rapid assessment methods to measure the needs of the populations they serve. Due to the high level of dependency of refugees, agencies need to carry out these assessments. Lot Quality Assurance Sampling (LQAS) is a method commonly used in development settings to assess populations living in a project catchment area to identify their greatest needs. LQAS could be well suited to serve the needs of refugee populations, but it has rarely been used in humanitarian settings. We adapted and implemented an LQAS survey design in Batil refugee camp, South Sudan in May 2013 to measure the added value of using it for sub-camp level assessment. Using pre-existing divisions within the camp, we divided the Batil catchment area into six contiguous segments, called 'supervision areas' (SA). Six teams of two data collectors randomly selected 19 respondents in each SA, who they interviewed to collect information on water, sanitation, hygiene, and diarrhoea prevalence. These findings were aggregated into a stratified random sample of 114 respondents, and the results were analysed to produce a coverage estimate with 95% confidence interval for the camp and to prioritize SAs within the camp. The survey provided coverage estimates on WASH indicators as well as evidence that areas of the camp closer to the main road, to clinics and to the market were better served than areas at the periphery of the camp. This assumption did not hold for all services, however, as sanitation services were uniformly high regardless of location. While it was necessary to adapt the standard LQAS protocol used in low-resource communities, the LQAS model proved to be feasible in a refugee camp setting, and program managers found the results useful at both the catchment area and SA level. 
This study, one of the few adaptations of LQAS for a camp setting, shows that it is a feasible method for regular monitoring, with the added value of enabling camp
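    Aggregating the six 19-respondent supervision-area samples into a camp-level coverage estimate with a 95% confidence interval can be sketched as follows (for illustration the strata are weighted equally and a normal approximation is used, whereas the actual analysis would weight each SA by its population):

```python
from math import sqrt

def stratified_coverage(strata):
    """Coverage estimate and approximate 95% CI from equally
    weighted strata, each given as (successes, sample_size)."""
    k = len(strata)
    props = [s / n for s, n in strata]
    p = sum(props) / k
    # variance of the mean of independent stratum proportions
    var = sum(pi * (1 - pi) / n
              for pi, (_, n) in zip(props, strata)) / k**2
    half = 1.96 * sqrt(var)
    return p, (p - half, p + half)
```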

  10. Can health workers reliably assess their own work? A test-retest study of bias among data collectors conducting a Lot Quality Assurance Sampling survey in Uganda. (United States)

    Beckworth, Colin A; Davis, Rosemary H; Faragher, Brian; Valadez, Joseph J


    Lot Quality Assurance Sampling (LQAS) is a classification method that enables local health staff to assess health programmes for which they are responsible. While LQAS has been favourably reviewed by the World Bank and World Health Organization (WHO), questions remain about whether using local health staff as data collectors can lead to biased data. In this test-retest research, Pallisa Health District in Uganda is subdivided into four administrative units called supervision areas (SA). Data collectors from each SA conducted an LQAS survey. A week later, the data collectors were swapped to a different SA, outside their area of responsibility, to repeat the LQAS survey with the same respondents. The two data sets were analysed for agreement using Cohen's kappa coefficient and disagreements were analysed. Kappa values ranged from 0.19 to 0.97. On average, there was a moderate degree of agreement for knowledge indicators and a substantial level for practice indicators. Respondents were found to be systematically more knowledgeable on retest, indicating bias favouring the retest, although no evidence of bias was found for practice indicators. In this initial study, using local health care providers to collect data did not bias data collection. The bias observed in the knowledge indicators is most likely due to the 'practice effect', whereby respondents increased their knowledge as a result of completing the first survey, as no corresponding effect was seen in the practice indicators. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2014; all rights reserved.
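    The agreement statistic used in this study can be computed directly from the paired ratings; a minimal sketch:

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two equal-length lists of categorical
    ratings: observed agreement corrected for chance agreement."""
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    counts1, counts2 = Counter(rater1), Counter(rater2)
    # chance agreement from the two raters' marginal distributions
    expected = sum(counts1[c] * counts2[c] for c in counts1) / n**2
    return (observed - expected) / (1 - expected)
```

    Values near 0 indicate chance-level agreement and 1 indicates perfect agreement, matching the 0.19 to 0.97 range reported above.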


    Directory of Open Access Journals (Sweden)



    Full Text Available Acoustic Emission (AE) signatures of various weld defects of stainless steel 316L nuclear grade weld material are investigated. The samples, fabricated by the Tungsten Inert Gas (TIG) welding method, have final dimensions of 140 mm x 15 mm x 10 mm. AE signals from weld defects such as pinhole, porosity, lack of penetration, lack of side fusion and slag are recorded under dynamic load conditions by a specially designed mechanical jig. AE features of the weld defects were obtained using the Linear Location Technique (LLT). The results of this study indicate that stress release and structural deformation between the sections in the weld area under load account for the major part of the acoustic emission activity during loading.

  12. Optimisation of small-scale hydropower using quality assurance methods - Preliminary project; Vorprojekt: Optimierung von Kleinwasserkraftwerken durch Qualitaetssicherung. Programm Kleinwasserkraftwerke

    Energy Technology Data Exchange (ETDEWEB)

    Hofer, S.; Staubli, T.


    This comprehensive final report for the Swiss Federal Office of Energy (SFOE) presents the results of a preliminary project that examined how quality assurance methods can be used in the optimisation of small-scale hydropower projects. The aim of the project, to use existing know-how, experience and synergies, is examined. Discrepancies in quality and their effects on production prices were determined in interviews. The paper describes best-practice guidelines for the quality assurance of small-scale hydro schemes. A flow chart describes the various steps that have to be taken in the project and realisation work. Information collected from planners and from interviews made with them are presented along with further information obtained from literature. The results of interviews concerning planning work, putting to tender and the construction stages of these hydro schemes are presented and commented on. Similarly, the operational phase of such power plant is also examined, including questions on operation and guarantees. The aims of the follow-up main project - the definition of a tool and guidelines for ensuring quality - are briefly reviewed.

  13. Quality assurance of metabolomics. (United States)

    Bouhifd, Mounir; Beger, Richard; Flynn, Thomas; Guo, Lining; Harris, Georgina; Hogberg, Helena; Kaddurah-Daouk, Rima; Kamp, Hennicke; Kleensang, Andre; Maertens, Alexandra; Odwin-DaCosta, Shelly; Pamies, David; Robertson, Donald; Smirnova, Lena; Sun, Jinchun; Zhao, Liang; Hartung, Thomas


    Metabolomics promises a holistic phenotypic characterization of biological responses to toxicants. This technology is based on advanced chemical analytical tools with reasonable throughput, including mass spectrometry and NMR. Quality assurance, however - from experimental design, sample preparation, metabolite identification, to bioinformatics data-mining - is urgently needed to assure both quality of metabolomics data and reproducibility of biological models. In contrast to microarray-based transcriptomics, where consensus on quality assurance and reporting standards has been fostered over the last two decades, quality assurance of metabolomics is only now emerging. Regulatory use in safety sciences, and even proper scientific use of these technologies, demand quality assurance. In an effort to promote this discussion, an expert workshop discussed the quality assurance needs of metabolomics. The goals for this workshop were 1) to consider the challenges associated with metabolomics as an emerging science, with an emphasis on its application in toxicology and 2) to identify the key issues to be addressed in order to establish and implement quality assurance procedures in metabolomics-based toxicology. Consensus has yet to be achieved regarding best practices for ensuring that sound, useful, and relevant information is derived from these new tools.

  14. Statistical sampling method, used in the audit

    Directory of Open Access Journals (Sweden)

    Gabriela-Felicia UNGUREANU


    Full Text Available The rapid increase in the size of U.S. companies from the early twentieth century created the need for audit procedures based on the selection of a part of the total population audited, in order to obtain reliable audit evidence that characterizes an entire population consisting of account balances or classes of transactions. Sampling is not used only in auditing - it is also used in sample surveys, market analysis and medical research, in which one wants to reach a conclusion about a large body of data by examining only a part of it. The difference lies in the "population" from which the sample is selected, i.e., the set of data about which a conclusion is to be drawn. Audit sampling applies only to certain types of audit procedures.
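    One concrete instance of statistical sampling in auditing is discovery sampling, where the auditor chooses the smallest sample size that gives the desired confidence of observing at least one deviation if the population deviation rate is at the tolerable level. A sketch (the rates used below are illustrative, not prescribed by any auditing standard):

```python
from math import ceil, log

def discovery_sample_size(tolerable_rate, confidence=0.95):
    """Smallest n such that a sample of n items contains at least
    one deviation with probability >= confidence, assuming the
    population deviation rate equals tolerable_rate."""
    # Solve (1 - tolerable_rate)**n <= 1 - confidence for n.
    return ceil(log(1 - confidence) / log(1 - tolerable_rate))
```

    For a 5% tolerable deviation rate at 95% confidence this gives a sample of 59 items.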

  15. Improvement of the customer satisfaction through Quality Assurance Matrix and QC-Story methods: A case study from automotive industry (United States)

    Sicoe, G. M.; Belu, N.; Rachieru, N.; Nicolae, E. V.


    Presently, in the automotive industry, the tendency is to adapt permanently to change and to incorporate market tendencies into new products, which leads to customer satisfaction. Many quality techniques have been adopted in this field for continuous improvement of product and process quality, and advantages have also been gained. The present paper focuses on the possibilities offered by the use of the Quality Assurance Matrix (QAM) and Quality Control Story (QC Story) to provide the greatest protection against nonconformities in the production process, through a case study in the automotive industry. There is a direct relationship from the QAM to a QC Story analysis. The failures identified using QAM are treated with the QC Story methodology. Using these methods will help to decrease PPM values and will increase quality performance and customer satisfaction.

  16. Establishment of a method of anonymization of DNA samples in genetic research. (United States)

    Hara, Kazuo; Ohe, Kazuhiko; Kadowaki, Takashi; Kato, Naoya; Imai, Yasushi; Tokunaga, Katsushi; Nagai, Ryozo; Omata, Masao


    As the number of genetic studies has rapidly increased in recent years, there has been growing concern that the privacy of the participants in such studies can be invaded unless effective measures are adopted to protect confidentiality. It is crucial for the scientific community to establish a method to anonymize DNA samples so that the public will trust genetic researchers. Here, we present a reliable and practical method of making DNA samples used in genetic research anonymous. It assures complete anonymity by coding samples and personal information twice. Since it does not require equipment, such as bar-code readers or a software package, its cost is nominal compared with the laboratory costs. All institutions engaged in genetic research may wish to take measures such as the one described here to ensure the privacy and confidentiality of the participants in their genetic studies.
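    The double-coding idea described above (sample to first code, first code to second code, with the two linkage tables held separately) can be sketched as follows; this is a hypothetical illustration using random hexadecimal tokens, not the authors' actual coding procedure:

```python
import secrets

def double_code(sample_ids):
    """Anonymize sample IDs through two independent coding steps.
    Re-identification requires BOTH linkage tables, which would be
    stored by different custodians."""
    first = {sid: secrets.token_hex(8) for sid in sample_ids}     # ID -> code1
    second = {c1: secrets.token_hex(8) for c1 in first.values()}  # code1 -> code2
    anonymized = [second[first[sid]] for sid in sample_ids]
    return anonymized, first, second
```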

  17. Transuranic Waste Characterization Quality Assurance Program Plan

    Energy Technology Data Exchange (ETDEWEB)



    This quality assurance plan identifies the data necessary, and techniques designed to attain the required quality, to meet the specific data quality objectives associated with the DOE Waste Isolation Pilot Plant (WIPP). This report specifies sampling, waste testing, and analytical methods for transuranic wastes.

  18. System and method for extracting a sample from a surface (United States)

    Van Berkel, Gary; Covey, Thomas


    A system and method are disclosed for extracting a sample from a sample surface. A sample is deposited on a sample surface, and a hydrophobic material is applied to the sample surface. One or more devices are configured to dispense a liquid onto the sample, the liquid dissolving the sample to form a dissolved sample material, and to extract the dissolved sample material from the sample surface.

  19. Perpendicular distance sampling: an alternative method for sampling downed coarse woody debris (United States)

    Michael S. Williams; Jeffrey H. Gove


    Coarse woody debris (CWD) plays an important role in many forest ecosystem processes. In recent years, a number of new methods have been proposed to sample CWD. These methods select individual logs into the sample using some form of unequal probability sampling. One concern with most of these methods is the difficulty in estimating the volume of each log. A new method...

  20. SU-E-T-438: Commissioning of An In-Vivo Quality Assurance Method Using the Electronic Portal Imaging Device

    Energy Technology Data Exchange (ETDEWEB)

    Morin, O; Held, M; Pouliot, J [UC San Francisco, San Francisco, CA (United States)


    Purpose: Patient-specific pre-treatment quality assurance (QA) using arrays of detectors or film has been the standard approach to assure that the correct treatment is delivered to the patient. This QA approach is expensive and labor intensive, and it does not guarantee or document that all remaining fractions were treated properly. The purpose of this abstract is to commission and evaluate the performance of a commercially available in-vivo QA software package that uses the electronic portal imaging device (EPID) to record the daily treatments. Methods: The EPIgray V2.0.2 platform (Dosisoft), whose machine model compares ratios of TMR with the EPID signal to predict dose, was commissioned for an Artiste (Siemens Oncology Care Systems) and a TrueBeam (Varian Medical Systems) linear accelerator following the given instructions. The systems were then tested on three different phantoms (a homogeneous stack of solid water, an anthropomorphic head, and a pelvis) and on a library of patient cases. Simple and complex fields were delivered at different exposures and for different gantry angles. The effects of table attenuation and EPID sagging were evaluated. Gamma analysis of the measured dose was compared to the predicted dose for complex clinical IMRT cases. Results: Commissioning of the EPIgray system for two photon energies took 8 hours. The difference between the planned dose and the dose measured with EPIgray was better than 3% for all phantom scenarios tested. Preliminary results on patients demonstrate that an accuracy of 5% is achievable in high-dose regions for both 3DCRT and IMRT. Large discrepancies (>5%) were observed near metallic structures or air cavities and in low-dose areas. Flat panel sagging was visible and accounted for in the EPIgray model. Conclusion: The accuracy achieved by EPIgray is sufficient to document the safe delivery of complex IMRT treatments. Future work will evaluate EPIgray for VMAT and high-dose-rate deliveries. This work is supported by Dosisoft, Cachan, France.
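    The gamma analysis mentioned above compares a measured dose distribution to a predicted one by combining a dose-difference criterion with a distance-to-agreement (DTA) criterion. The following is a textbook 1-D sketch of the gamma index, not EPIgray's implementation; the 3 mm / 3% criteria and the profiles are illustrative.

```python
import math

def gamma_1d(ref_pos, ref_dose, meas_pos, meas_dose, dta=3.0, dd=0.03):
    """1-D gamma index: for each measured point, the minimum combined
    distance (mm, scaled by dta) / dose-difference (scaled by dd of the
    reference maximum) over the reference profile. gamma <= 1 passes."""
    dose_norm = dd * max(ref_dose)
    gammas = []
    for xm, dm in zip(meas_pos, meas_dose):
        best = min(
            math.sqrt(((xm - xr) / dta) ** 2 + ((dm - dr) / dose_norm) ** 2)
            for xr, dr in zip(ref_pos, ref_dose)
        )
        gammas.append(best)
    return gammas

# Identical profiles pass trivially (gamma = 0 everywhere):
xs = [0.0, 1.0, 2.0, 3.0]
doses = [1.0, 2.0, 2.0, 1.0]
assert all(g <= 1.0 for g in gamma_1d(xs, doses, xs, doses))
```

A clinical implementation would work on 2-D or 3-D dose grids and interpolate the reference distribution, but the pass/fail logic is the same.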

  1. Method Description, Quality Assurance, Environmental Data, and other Information for Analysis of Pharmaceuticals in Wastewater-Treatment-Plant Effluents, Streamwater, and Reservoirs, 2004-2009 (United States)

    Phillips, Patrick J.; Smith, Steven G.; Kolpin, Dana W.; Zaugg, Steven D.; Buxton, Herbert T.; Furlong, Edward T.


    Wastewater-treatment-plant (WWTP) effluents are a demonstrated source of pharmaceuticals to the environment. During 2004-09, a study was conducted to identify pharmaceutical compounds in effluents from WWTPs (including two that receive substantial discharges from pharmaceutical formulation facilities), streamwater, and reservoirs. The methods used to determine and quantify concentrations of seven pharmaceuticals are described. In addition, the report includes information on pharmaceuticals formulated or potentially formulated at the two pharmaceutical formulation facilities that provide substantial discharge to two of the WWTPs, and potential limitations of these data are discussed. The analytical methods used to provide data on the seven pharmaceuticals (including opioids, muscle relaxants, and other pharmaceuticals) in filtered water samples also are described. Data are provided on method performance, including spike data, method detection limit results, and an estimation of precision. Quality-assurance data for sample collection and handling are included. Quantitative data are presented for the seven pharmaceuticals in water samples collected at WWTP discharge points, from streams, and at reservoirs. Occurrence data also are provided for 19 pharmaceuticals that were qualitatively identified. Flow data at selected WWTPs and streams are presented. During 2004-09, 35 to 38 effluent samples were collected from each of three WWTPs in New York and analyzed for seven pharmaceuticals. Two WWTPs (NY2 and NY3) receive substantial inflows (greater than 20 percent of plant flow) from pharmaceutical formulation facilities (PFF) and one (NY1) receives no PFF flow. Samples of effluents from 23 WWTPs across the United States were analyzed once for these pharmaceuticals as part of a national survey. Maximum pharmaceutical effluent concentrations for the national survey and NY1 effluent samples were generally less than 1 ug/L. Four pharmaceuticals (methadone, oxycodone

  2. Use of destructive and nondestructive methods of analysis for quality assurance at MOX fuel production in Russia

    Energy Technology Data Exchange (ETDEWEB)

    Bibilashvili, Y.K.; Rudenko, V.S.; Chorokhov, N.A.; Korovin, Y.I.; Petrov, A.M.; Vorobiev, A.V.; Mukhortov, N.F.; Smirnov, Y.A.; Kudryavtsev, V.N. [A.A. Bochvar All-Russia Research Institute of Inorganic Materials (Russian Federation)


    Parameters of MOX fuel with various plutonium contents are considered from the point of view of the necessity of their control for quality assurance. Destructive and nondestructive methods used for this purpose in Russia are described: controlled-potential coulometry for determination of uranium and/or plutonium contents, their ratio, and the oxygen factor; mass spectrometry for determination of uranium and plutonium isotopic composition; the chemical-spectral emission method for determination of contents of 'metal' impurities, boron, and silicon; and methods of determination of gas-forming impurities. Capabilities of nondestructive gamma-ray spectrometry techniques are considered in detail, along with the results of their use in measuring uranium and plutonium isotopic composition in initial dioxides and in determining the contents of uranium and plutonium and the uniformity of their distribution in MOX powder and pellets. The necessity of correcting the algorithm of the MGA program is shown for using the program in analyses of gamma-ray spectra of MOX with low contents of low-burnup plutonium. (authors)

  3. Quality assurance using outlier detection on an automatic segmentation method for the cerebellar peduncles (United States)

    Li, Ke; Ye, Chuyang; Yang, Zhen; Carass, Aaron; Ying, Sarah H.; Prince, Jerry L.


    Cerebellar peduncles (CPs) are white matter tracts connecting the cerebellum to other brain regions. Automatic segmentation methods of the CPs have been proposed for studying their structure and function. Usually the performance of these methods is evaluated by comparing segmentation results with manual delineations (ground truth). However, when a segmentation method is run on new data (for which no ground truth exists) it is highly desirable to efficiently detect and assess algorithm failures so that these cases can be excluded from scientific analysis. In this work, two outlier detection methods aimed to assess the performance of an automatic CP segmentation algorithm are presented. The first one is a univariate non-parametric method using a box-whisker plot. We first categorize automatic segmentation results of a dataset of diffusion tensor imaging (DTI) scans from 48 subjects as either a success or a failure. We then design three groups of features from the image data of nine categorized failures for failure detection. Results show that most of these features can efficiently detect the true failures. The second method—supervised classification—was employed on a larger DTI dataset of 249 manually categorized subjects. Four classifiers—linear discriminant analysis (LDA), logistic regression (LR), support vector machine (SVM), and random forest classification (RFC)—were trained using the designed features and evaluated using a leave-one-out cross validation. Results show that the LR performs worst among the four classifiers and the other three perform comparably, which demonstrates the feasibility of automatically detecting segmentation failures using classification methods.
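    The univariate box-whisker approach described above flags a case as a potential failure when one of its quality features falls outside the whisker fences. A minimal sketch using the conventional 1.5×IQR rule follows; the feature values (standing in for, say, a per-subject segmented volume) are made up.

```python
import statistics

def iqr_outliers(values, k=1.5):
    """Flag values outside [Q1 - k*IQR, Q3 + k*IQR] (box-whisker fences)."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < lo or v > hi]

# Hypothetical per-subject quality feature for eight segmentations;
# the anomalously small value marks a likely segmentation failure.
features = [10.2, 9.8, 10.5, 10.1, 9.9, 10.3, 3.1, 10.0]
failures = iqr_outliers(features)
```

In practice each of the designed feature groups would be screened this way, and cases flagged on any feature would be reviewed before being excluded from analysis.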

  4. U.S. Geological Survey nutrient preservation experiment; nutrient concentration data for surface-, ground-, and municipal-supply water samples and quality-assurance samples (United States)

    Patton, Charles J.; Truitt, Earl P.


    This report is a compilation of analytical results from a study conducted at the U.S. Geological Survey National Water Quality Laboratory (NWQL) in 1992 to assess the effectiveness of three field treatment protocols to stabilize nutrient concentrations in water samples stored for about 1 month at 4°C. Field treatments tested were chilling; adjusting sample pH to less than 2 with sulfuric acid and chilling; and adding 52 milligrams of mercury(II) chloride per liter of sample and chilling. Field treatments of samples collected for determination of ammonium, nitrate plus nitrite, nitrite, dissolved Kjeldahl nitrogen, orthophosphate, and dissolved phosphorus included 0.45-micrometer membrane filtration. Only total Kjeldahl nitrogen and total phosphorus were determined in unfiltered samples. Data reported here pertain to water samples collected in April and May 1992 from 15 sites within the continental United States. Also included in this report are analytical results for nutrient concentrations in synthetic reference samples that were analyzed concurrently with real samples.

  5. Quality assurance and quality control in light stable isotope laboratories: A case study of Rio Grande, Texas, water samples (United States)

    Coplen, T.B.; Qi, H.


    New isotope laboratories can achieve the goal of reporting the same isotopic composition, within analytical uncertainty, for the same material analysed decades apart by (1) writing their own acceptance testing procedures and putting them into their mass spectrometric or laser-based isotope-ratio equipment procurement contract, (2) requiring a manufacturer to demonstrate acceptable performance using all sample ports provided with the instrumentation, (3) preparing, for each medium to be analysed, two local reference materials substantially different in isotopic composition, to encompass the range in isotopic composition expected in the laboratory, and calibrating them with isotopic reference materials available from the International Atomic Energy Agency (IAEA) or the US National Institute of Standards and Technology (NIST), (4) using the optimum storage containers (for water samples, sealing in glass ampoules that are sterilised after sealing is satisfactory), (5) interspersing local laboratory isotopic reference materials among sample unknowns daily (internationally distributed isotopic reference materials can be ordered at three-year intervals and can be used for elemental analyser analyses and other analyses that consume less than 1 mg of material); this process applies to H, C, N, O, and S isotope ratios, (6) calculating isotopic compositions of unknowns by normalising isotopic data to those of local reference materials, which have been calibrated to internationally distributed isotopic reference materials, (7) reporting results on scales normalised to internationally distributed isotopic reference materials (where they are available) and providing to sample submitters the isotopic compositions that internationally distributed isotopic reference materials of the same substance would have had if analysed with the unknowns, and (8) providing an audit trail in the laboratory for analytical results; this trail commonly will be in electronic format and might include a laboratory
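    Step (6) above, normalising unknowns to two calibrated local reference materials, is commonly done with a two-point linear normalization. A minimal sketch follows; the delta values of the two water reference materials are hypothetical, not values from the case study.

```python
def normalize_delta(d_meas, ref1, ref2):
    """Two-point normalization of a measured delta value.

    ref1, ref2: (measured, accepted) delta values of two local
    reference materials bracketing the unknowns in composition.
    """
    m1, t1 = ref1
    m2, t2 = ref2
    slope = (t2 - t1) / (m2 - m1)
    return t1 + (d_meas - m1) * slope

# Hypothetical (measured, accepted) values, per mil, for an isotopically
# light and an isotopically heavy water reference material:
ref_light = (-54.9, -55.5)
ref_heavy = (0.3, 0.0)
d18O = normalize_delta(-10.2, ref_light, ref_heavy)
```

By construction the two reference materials map exactly onto their accepted values, so every unknown between them is corrected for both offset and scale compression.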

  6. Use of Lot quality assurance sampling surveys to evaluate community health worker performance in rural Zambia: a case of Luangwa district. (United States)

    Mwanza, Moses; Zulu, Japhet; Topp, Stephanie M; Musonda, Patrick; Mutale, Wilbroad; Chilengi, Roma


    The Better Health Outcomes through Mentoring and Assessment (BHOMA) project is a cluster randomized controlled trial aimed at reducing age-standardized mortality rates in three rural districts through involvement of Community Health Workers (CHWs), Traditional Birth Attendants (TBAs), and Neighborhood Health Committees (NHCs). CHWs conduct quarterly surveys on all households using a questionnaire that captures key health events occurring within their catchment population. In order to validate contact with households, we utilize the Lot Quality Assurance Sampling (LQAS) methodology. In this study, we report experiences of applying the LQAS approach to monitor performance of CHWs in Luangwa District. Between April 2011 and December 2013, seven health facilities in Luangwa district were enrolled into the BHOMA project. The health facility catchment areas were divided into 33 geographic zones. Quality assurance was performed each quarter by randomly selecting zones representing about 90% of enrolled catchment areas, from which 19 households per zone were also randomly identified. The surveys were conducted by CHW supervisors who had been trained in using the LQAS questionnaire. Information collected included household identity number (ID), whether the CHW visited the household, the duration of the most recent visit, and what health information was discussed during the CHW visit. The threshold for success was set at 75% household outreach by CHWs in each zone. There are 4,616 total households in the 33 zones. This yielded a target of 32,212 household visits by community health workers during the 7 survey rounds. Based on the set cutoff point for passing the surveys (at least 75% of households confirmed as visited), only one team of CHWs, at Luangwa high school, failed to reach the target during round 1 of the surveys; all the teams otherwise registered successful visits in all the surveys. We have employed the LQAS methodology for assurance that quarterly surveys were
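    Classical LQAS classifies each lot (here, a zone) by counting successes in a small fixed sample against a precomputed decision rule. The sketch below derives such a rule from the binomial distribution using the study's n = 19 households and 75% coverage target; the 10% misclassification risk is an illustrative assumption, not a figure from the study.

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def decision_threshold(n, p_target, alpha=0.10):
    """Largest d such that a lot truly at p_target coverage fails the
    rule (fewer than d successes observed) with probability <= alpha."""
    for d in range(n, -1, -1):
        if binom_cdf(d - 1, n, p_target) <= alpha:
            return d
    return 0

n = 19                              # households sampled per zone
d = decision_threshold(n, 0.75)     # minimum confirmed visits to pass

def zone_passes(visited_count, d=d):
    return visited_count >= d
```

With these assumptions a zone passes when at least d of the 19 sampled households confirm a CHW visit, which is how a full census of each zone is avoided.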

  7. Analytical methods, quality assurance and quality control used in the Greenland AMAP programme. (United States)

    Asmund, G; Cleemann, M


    The majority of analytical results in the Greenland AMAP (Arctic Monitoring and Assessment Programme) have been produced by laboratories that participate regularly in performance studies. This makes it possible to judge the quality of the results based on objective measurements made by independent assessors. AMAP laboratories participated while analysing the AMAP samples in the QUASIMEME laboratory performance study programme, in the 'Interlaboratory Comparison Program' organised by Le Centre de Toxicologie du Québec, in a toxaphene intercomparison study organised by The Food Research Division of Health Canada, and in an International Atomic Energy Agency Intercomparison exercise. The relative errors of the trace analyses, i.e. the relative deviation of the result obtained by the AMAP laboratory from the assigned value, are in most cases less than the 25% which is regarded as acceptable by QUASIMEME. Usually the errors, especially for trace elements, are less than 12.5%, while errors for trace organics below 1 microgram kg-1 may rise to 50% or more. This study covers the period 1993 to 1998 for trace elements and one or more years from the period 1994-1996 for trace organics.

  8. Some connections between importance sampling and enhanced sampling methods in molecular dynamics (United States)

    Lie, H. C.; Quer, J.


    In molecular dynamics, enhanced sampling methods enable the collection of better statistics of rare events from a reference or target distribution. We show that a large class of these methods is based on the idea of importance sampling from mathematical statistics. We illustrate this connection by comparing the Hartmann-Schütte method for rare event simulation (J. Stat. Mech. Theor. Exp. 2012, P11004) and the Valsson-Parrinello method of variationally enhanced sampling [Phys. Rev. Lett. 113, 090601 (2014)]. We use this connection in order to discuss how recent results from the Monte Carlo methods literature can guide the development of enhanced sampling methods.
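    The importance sampling identity the authors build on, E_p[f] = E_q[f·p/q], can be illustrated on a toy rare event. The shifted Gaussian proposal below is a standard textbook choice, not the Hartmann-Schütte or Valsson-Parrinello scheme itself.

```python
import math
import random

random.seed(0)

def normal_pdf(x, mu=0.0, sigma=1.0):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def rare_event_prob(n=100_000, shift=4.0):
    """Estimate P(X > 4) for X ~ N(0,1) by sampling from the proposal
    N(shift, 1) and reweighting each hit by the likelihood ratio p/q."""
    total = 0.0
    for _ in range(n):
        x = random.gauss(shift, 1.0)
        if x > 4.0:
            total += normal_pdf(x) / normal_pdf(x, mu=shift)
    return total / n

estimate = rare_event_prob()
# The exact value is about 3.17e-5; naive Monte Carlo with the same
# 1e5 samples would typically see only a handful of hits.
```

Enhanced sampling methods in molecular dynamics play the role of the proposal distribution q here: they make the rare event common, and the likelihood-ratio weights recover statistics under the original ensemble p.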

  9. 19 CFR 151.83 - Method of sampling. (United States)


    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Method of sampling. 151.83 Section 151.83 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Cotton § 151.83 Method of sampling. For...

  10. 7 CFR 29.110 - Method of sampling. (United States)


    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Method of sampling. 29.110 Section 29.110 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.110 Method of sampling. In sampling tobacco...

  11. An assessment of methods for sampling carabid beetles

    African Journals Online (AJOL)


    particular habitat where we sampled (rugged montane rain forest) pitfall trapping has no advantage over searching methods with respect to ease of operation, low cost or efficiency. However, despite its inefficiency, pitfall trapping cannot be left out of sampling protocols because the method sampled some species that were ...

  12. Sampling Methods in Cardiovascular Nursing Research: An Overview. (United States)

    Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie


    Cardiovascular nursing research covers a wide array of topics, from health services to psychosocial patient experiences. The selection of a specific participant sample is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling: simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion in the research study, and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences in sampling techniques may aid nurses in effective appraisal of the research literature and provide a reference point for nurses who engage in cardiovascular research.
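    Three of the probability designs named above can be sketched with the standard library. The patient list and the two-clinic strata are hypothetical, purely to make the mechanics concrete.

```python
import random

random.seed(42)
population = [f"patient-{i:03d}" for i in range(100)]

# Simple random sampling: every element has an equal, independent chance.
simple = random.sample(population, 10)

# Systematic sampling: every k-th element after a random start.
k = len(population) // 10
start = random.randrange(k)
systematic = population[start::k]

# Stratified sampling: sample within predefined strata (here, two
# hypothetical clinics) in proportion to stratum size.
strata = {"clinic-A": population[:60], "clinic-B": population[60:]}
stratified = [p for name, group in strata.items()
              for p in random.sample(group, len(group) // 10)]
```

Cluster and multi-stage designs extend the same idea: first sample whole groups (e.g. hospitals), then optionally sample elements within the selected groups.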

  13. SU-E-J-126: Respiratory Gating Quality Assurance: A Simple Method to Achieve Millisecond Temporal Resolution

    Energy Technology Data Exchange (ETDEWEB)

    McCabe, B; Wiersma, R [The University of Chicago, Chicago, IL (United States)


    Purpose: Low temporal latency between a gating on/off signal and linac beam on/off during respiratory gating is critical for patient safety. Although a measurement of temporal lag is recommended by AAPM Task Group 142 for commissioning and annual quality assurance, there currently exists no published method. Here we describe a simple, inexpensive, and reliable method to precisely measure gating lag at millisecond resolution. Methods: A Varian Real-time Position Management™ (RPM) gating simulator with rotating disk was modified with a resistive flex sensor (Spectra Symbol) attached to the gating box platform. A photon diode was placed at machine isocenter. Output signals of the flex sensor and diode were monitored with a multichannel oscilloscope (Tektronix™ DPO3014). Qualitative inspection of gating window/beam-on synchronicity was made by setting the linac to beam on/off at end-expiration and setting the oscilloscope's temporal window to 100 ms to visually examine whether the on/off timing was within the recommended 100-ms tolerance. Quantitative measurements were made by saving the signal traces and analyzing them in MatLab™. The on and off times of the beam signal were located and compared to the expected gating window (e.g., 40% to 60%). Four gating cycles were measured and compared. Results: On a Varian TrueBeam™ STx linac with RPM gating software, the average difference in synchronicity at beam on and off for four cycles was 14 ms (3 to 30 ms) and 11 ms (2 to 32 ms), respectively. For a Varian Clinac™ 21EX the average difference at beam on and off was 127 ms (122 to 133 ms) and 46 ms (42 to 49 ms), respectively. The uncertainty in the synchrony difference was estimated at ±6 ms. Conclusion: This new gating QA method is easy to implement and allows for fast qualitative inspection and quantitative measurements for commissioning and TG-142 annual QA measurements.
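    The trace-analysis step can be sketched as follows: find the time at which the beam signal turns on and compare it with the time the surrogate enters the gating window. The threshold values and the synthetic 1 kHz traces below are made up for illustration; the published analysis was done in MatLab™ on oscilloscope exports.

```python
def first_crossing(times, values, threshold):
    """Return the first time the signal rises through the threshold."""
    for t, prev, cur in zip(times[1:], values, values[1:]):
        if prev < threshold <= cur:
            return t
    return None

# Synthetic 1 kHz traces: surrogate position (0-100% of cycle) and diode signal.
dt = 0.001
times = [i * dt for i in range(200)]
surrogate = [min(100.0, float(i)) for i in range(200)]   # enters 40% at t = 0.040 s
beam = [0.0 if t < 0.052 else 1.0 for t in times]        # beam turns on at t = 0.052 s

gate_entry = first_crossing(times, surrogate, 40.0)
beam_on = first_crossing(times, beam, 0.5)
latency_ms = (beam_on - gate_entry) * 1000.0
```

Repeating this for several gating cycles, and for the beam-off edge against the window exit, gives the per-cycle latencies averaged in the Results section.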

  14. Methods for collecting algal samples as part of the National Water-Quality Assessment Program (United States)

    Porter, Stephen D.; Cuffney, Thomas F.; Gurtz, Martin E.; Meador, Michael R.


    Benthic algae (periphyton) and phytoplankton communities are characterized in the U.S. Geological Survey's National Water-Quality Assessment Program as part of an integrated physical, chemical, and biological assessment of the Nation's water quality. This multidisciplinary approach provides multiple lines of evidence for evaluating water-quality status and trends, and for refining an understanding of the factors that affect water-quality conditions locally, regionally, and nationally. Water quality can be characterized by evaluating the results of qualitative and quantitative measurements of the algal community. Qualitative periphyton samples are collected to develop a list of taxa present in the sampling reach. Quantitative periphyton samples are collected to measure algal community structure within selected habitats. These samples of benthic algal communities are collected from natural substrates, using the sampling methods that are most appropriate for the habitat conditions. Phytoplankton samples may be collected in large nonwadeable streams and rivers to meet specific program objectives. Estimates of algal biomass (chlorophyll content and ash-free dry mass) also are optional measures that may be useful for interpreting water-quality conditions. A nationally consistent approach provides guidance on site, reach, and habitat selection, as well as information on methods and equipment for qualitative and quantitative sampling. Appropriate quality-assurance and quality-control guidelines are used to maximize the ability to analyze data locally, regionally, and nationally.

  15. Systemic Assurance (United States)


    benefits. The first is direct: Cost-effective and rapid recertification is essential to support the development of systems that must adapt to changes...simulations, cyber-physical robotic systems, and extremely large commercial Java programs. An important goal is to develop incrementally composable...combinations of models, practices, and tools for obtaining the most cost- and schedule-effective combinations for the assurance of necessary system

  16. On Angular Sampling Methods for 3-D Spatial Channel Models

    DEFF Research Database (Denmark)

    Fan, Wei; Jämsä, Tommi; Nielsen, Jesper Ødum


    This paper discusses generating three-dimensional (3D) spatial channel models with emphasis on the angular sampling methods. Three angular sampling methods, i.e. modified uniform power sampling, modified uniform angular sampling, and random pairing methods, are proposed and investigated in detail. The random pairing method, which uses only twenty sinusoids in the ray-based model for generating the channels, presents good results if the spatial channel cluster has a small elevation angle spread. For spatial clusters with large elevation angle spreads, however, the random pairing method would fail...

  17. Laser based water equilibration method for d18O determination of water samples (United States)

    Mandic, Magda; Smajgl, Danijela; Stoebener, Nils


    Determination of d18O by the water equilibration method using mass spectrometers equipped with an equilibration unit or Gas Bench has been known for many years. Now, the development of laser spectrometers extends these methods and the possibilities of applying different technologies in the laboratory and also in the field. The Thermo Scientific™ Delta Ray™ Isotope Ratio Infrared Spectrometer (IRIS) analyzer with the Universal Reference Interface (URI) Connect and Teledyne Cetac ASX-7100 offers high precision and sample throughput. It employs optical spectroscopy for continuous measurement of isotope ratio values and the concentration of carbon dioxide in ambient air, and also for analysis of discrete samples from vials, syringes, bags, or other user-provided sample containers. Test measurements confirming the precision and accuracy of d18O determination in water samples were done in the Thermo Fisher application laboratory with three lab standards, namely ANST, Ocean II, and HBW. All laboratory standards were previously calibrated against the international reference materials VSMOW2 and SLAP2 to assure accuracy of the isotopic values of the water. With the method presented in this work, the achieved repeatability and accuracy are 0.16‰ and 0.71‰, respectively, which fulfills the requirements of the regulatory method for wine and must after equilibration with CO2.

  18. Remedial investigation sampling and analysis plan for J-Field, Aberdeen Proving Ground, Maryland: Volume 2, Quality Assurance Project Plan

    Energy Technology Data Exchange (ETDEWEB)

    Prasad, S.; Martino, L.; Patton, T.


    J-Field encompasses about 460 acres at the southern end of the Gunpowder Neck Peninsula in the Edgewood Area of APG (Figure 2.1). Since World War II, the Edgewood Area of APG has been used to develop, manufacture, test, and destroy chemical agents and munitions. These materials were destroyed at J-Field by open burning and open detonation (OB/OD). For the purposes of this project, J-Field has been divided into eight geographic areas or facilities that are designated as areas of concern (AOCs): the Toxic Burning Pits (TBP), the White Phosphorus Burning Pits (WPP), the Riot Control Burning Pit (RCP), the Robins Point Demolition Ground (RPDG), the Robins Point Tower Site (RPTS), the South Beach Demolition Ground (SBDG), the South Beach Trench (SBT), and the Prototype Building (PB). The scope of this project is to conduct a remedial investigation/feasibility study (RI/FS) and ecological risk assessment to evaluate the impacts of past disposal activities at the J-Field site. Sampling for the RI will be carried out in three stages (I, II, and III) as detailed in the FSP. A phased approach will be used for the J-Field ecological risk assessment (ERA).

  19. New adaptive sampling method in particle image velocimetry (United States)

    Yu, Kaikai; Xu, Jinglei; Tang, Lan; Mo, Jianwei


    This study proposes a new adaptive method to enable the number of interrogation windows and their positions in a particle image velocimetry (PIV) image interrogation algorithm to become self-adapted according to the seeding density. The proposed method can relax the constraint of uniform sampling rate and uniform window size commonly adopted in the traditional PIV algorithm. In addition, the positions of the sampling points are redistributed on the basis of the spring force generated by the sampling points. The advantages include control of the number of interrogation windows according to the local seeding density and smoother distribution of sampling points. The reliability of the adaptive sampling method is illustrated by processing synthetic and experimental images. The synthetic example attests to the advantages of the sampling method. Compared with that of the uniform interrogation technique in the experimental application, the spatial resolution is locally enhanced when using the proposed sampling method.
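    The core idea of adapting the interrogation layout to seeding density can be sketched by allocating windows to image regions in proportion to local particle counts. This is a simplified stand-in for the proposed method (it omits the spring-force redistribution step); the particle field is synthetic.

```python
import random

random.seed(1)

# Synthetic particle positions in a unit square, denser toward x = 1
# (x = sqrt(u) gives a linearly increasing density in x).
particles = [(random.random() ** 0.5, random.random()) for _ in range(400)]

def allocate_windows(particles, n_strips=4, total_windows=32):
    """Allocate interrogation windows to vertical strips in proportion
    to the local particle count, a stand-in for seeding density."""
    counts = [0] * n_strips
    for x, _ in particles:
        counts[min(int(x * n_strips), n_strips - 1)] += 1
    total = sum(counts)
    return [round(total_windows * c / total) for c in counts]

windows = allocate_windows(particles)
# Denser strips receive more interrogation windows than sparse ones,
# relaxing the uniform-sampling-rate constraint of traditional PIV.
```

The full method then relaxes the window positions under pairwise spring forces to smooth the distribution, which this sketch does not attempt.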

  20. Application of advanced data collection and quality assurance methods in open prospective study - a case study of PONS project. (United States)

    Wawrzyniak, Zbigniew M; Paczesny, Daniel; Mańczuk, Marta; Zatoński, Witold A


    Large-scale epidemiologic studies can assess health indicators that differentiate social groups, as well as important health outcomes such as the incidence of and mortality from cancer and cardiovascular disease, to establish a solid knowledge base for managing the prevention of premature morbidity and mortality. This study presents new advanced methods of data collection and data management, with ongoing data quality control and security, to ensure high-quality assessment of health indicators in the large epidemiologic PONS study (The Polish-Norwegian Study). The material for the experiment is the data management design of the large-scale population study in Poland (PONS), whose managed processes are applied to establishing high-quality, solid knowledge. The functional requirements of PONS study data collection, supported by advanced web-based IT methods, are fulfilled by the IT system, which delivers medical data of high quality together with data security, quality assessment, process control, and evolution monitoring. Data from disparate, deployed sources of information are integrated into databases via software interfaces and archived by a multi-task secure server. The implemented solution of modern database technologies and a remote software/hardware structure successfully supports the research of the big PONS study project. Development and implementation of follow-up control of the consistency and quality of data analysis and of the processes of the PONS sub-databases show excellent measurement properties, with data consistency of more than 99%. The project itself, through a tailored hardware/software application, shows the positive impact of Quality Assurance (QA) on the quality of outcome analysis results and on effective data management within a shorter time. This efficiency ensures the quality of the epidemiological data and indicators of health by eliminating common errors in research questionnaires and medical

  1. Sci—Sat AM: Stereo — 05: The Development of Quality Assurance Methods for Trajectory based Cranial SRS Treatments

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, B; Duzenli, C; Gete, E [BC Cancer Agency, Vancouver Cancer Centre (Canada); Teke, T [BC Cancer Agency, Centre for the Southern Interior (Canada)


    The goal of this work was to develop and validate non-planar linac beam trajectories defined by the dynamic motion of the gantry, couch, jaws, collimator and MLCs. This was conducted on the Varian TrueBeam linac by taking advantage of the linac's advanced control features in a non-clinical mode (termed developer's mode). In this work, we present quality assurance methods that we have developed to test the positional and temporal accuracy of the linac's moving components. The first QA method focuses on the coordination of couch and gantry. For this test, we developed a cylindrical phantom with a film insert. Using this phantom we delivered a plan with dynamic motion of the couch and gantry. We found the mean absolute deviation of the entrance position from its expected value to be 0.5 mm, with a standard deviation of 0.5 mm. This was within the tolerances set by the machine's mechanical accuracy and the setup accuracy of the phantom. We also present an altered picket fence test with added dynamic, simultaneous rotations of the couch and the collimator. While the test was shown to be sensitive enough to discern errors of 1° and greater, we were unable to identify any errors in the coordination of the linac's collimator and couch. When operating under normal conditions, the Varian TrueBeam linac passed both tests and is within tolerances acceptable for complex trajectory-based treatments.

  2. Systems and methods for self-synchronized digital sampling (United States)

    Samson, Jr., John R. (Inventor)


    Systems and methods for self-synchronized data sampling are provided. In one embodiment, a system for capturing synchronous data samples is provided. The system includes an analog to digital converter adapted to capture signals from one or more sensors and convert the signals into a stream of digital data samples at a sampling frequency determined by a sampling control signal; and a synchronizer coupled to the analog to digital converter and adapted to receive a rotational frequency signal from a rotating machine, wherein the synchronizer is further adapted to generate the sampling control signal, and wherein the sampling control signal is based on the rotational frequency signal.
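The patent abstract describes deriving the sampling clock from a rotational frequency signal so that samples stay phase-locked to the rotating machine. A similar effect can be approximated in software by resampling time-domain data onto a uniform shaft-angle grid (often called order tracking). The sketch below is illustrative only and not from the patent; the function name and the tachometer-derived `rev_times` input are assumptions.

```python
import numpy as np

def resample_to_angle_domain(t, x, rev_times, samples_per_rev=64):
    """Order-tracking sketch: given time-stamped samples x(t) and the times of
    successive shaft revolutions (from a tachometer), interpolate x onto a
    uniform angle grid so every revolution contributes samples_per_rev samples.
    Assumes t and rev_times are monotonically increasing."""
    # cumulative shaft angle (in revolutions) at each tachometer pulse
    rev_index = np.arange(len(rev_times), dtype=float)
    # shaft angle of every time sample, by interpolating the tach record
    angle_at_t = np.interp(t, rev_times, rev_index)
    # uniform angle grid spanning the observed revolutions
    angle_grid = np.arange(0.0, rev_index[-1], 1.0 / samples_per_rev)
    return angle_grid, np.interp(angle_grid, angle_at_t, x)
```

With a constant shaft speed the angle grid reduces to a uniformly resampled time grid; the benefit appears when the speed varies, since the angle grid then tracks the machine rather than the clock.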

  3. DOE methods for evaluating environmental and waste management samples.

    Energy Technology Data Exchange (ETDEWEB)

    Goheen, S C; McCulloch, M; Thomas, B L; Riley, R G; Sklarew, D S; Mong, G M; Fadeff, S K [eds.; Pacific Northwest Lab., Richland, WA (United States)


    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) provides applicable methods in use by the US Department of Energy (DOE) laboratories for sampling and analyzing constituents of waste and environmental samples. The development of DOE Methods is supported by the Laboratory Management Division (LMD) of the DOE. This document contains chapters and methods that are proposed for use in evaluating components of DOE environmental and waste management samples. DOE Methods is a resource intended to support sampling and analytical activities that will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the US Environmental Protection Agency (EPA), or others.

  4. SAMSAN

    Frisch, H. P.


    SAMSAN was developed to aid the control system analyst by providing a self consistent set of computer algorithms that support large order control system design and evaluation studies, with an emphasis placed on sampled system analysis. Control system analysts have access to a vast array of published algorithms to solve an equally large spectrum of controls related computational problems. The analyst usually spends considerable time and effort bringing these published algorithms to an integrated operational status and often finds them less general than desired. SAMSAN reduces the burden on the analyst by providing a set of algorithms that have been well tested and documented, and that can be readily integrated for solving control system problems. Algorithm selection for SAMSAN has been biased toward numerical accuracy for large order systems, with computational speed and portability being considered important but not paramount. In addition to containing relevant subroutines from EISPACK for eigen-analysis and from LINPACK for the solution of linear systems and related problems, SAMSAN contains the following not so generally available capabilities: 1) Reduction of a real non-symmetric matrix to block diagonal form via a real similarity transformation matrix which is well conditioned with respect to inversion, 2) Solution of the generalized eigenvalue problem with balancing and grading, 3) Computation of all zeros of the determinant of a matrix of polynomials, 4) Matrix exponentiation and the evaluation of integrals involving the matrix exponential, with option to first block diagonalize, 5) Root locus and frequency response for single variable transfer functions in the S, Z, and W domains, 6) Several methods of computing zeros for linear systems, and 7) The ability to generate documentation "on demand". All matrix operations in the SAMSAN algorithms assume non-symmetric matrices with real double precision elements. There is no fixed size limit on any matrix in any ...
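Capability 4 (matrix exponentiation and integrals of the matrix exponential) is the heart of sampled-system analysis: converting a continuous plant x' = Ax + Bu into its zero-order-hold discrete equivalent. A minimal sketch of that step, not taken from SAMSAN itself (SAMSAN is FORTRAN; this uses Python/SciPy purely for illustration):

```python
import numpy as np
from scipy.linalg import expm

def discretize_zoh(A, B, T):
    """Zero-order-hold discretization at sample period T:
        Ad = e^(A*T),   Bd = (integral_0^T e^(A*s) ds) * B,
    both read off at once from the exponential of an augmented matrix
    (the standard Van Loan construction)."""
    n, m = A.shape[0], B.shape[1]
    M = np.zeros((n + m, n + m))
    M[:n, :n] = A
    M[:n, n:] = B
    E = expm(M * T)
    return E[:n, :n], E[:n, n:]

# first-order lag x' = -x + u sampled at T = 1:
Ad, Bd = discretize_zoh(np.array([[-1.0]]), np.array([[1.0]]), 1.0)
# Ad = e^-1, Bd = 1 - e^-1
```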

  5. A random spatial sampling method in a rural developing nation (United States)

    Michelle C. Kondo; Kent D.W. Bream; Frances K. Barg; Charles C. Branas


    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method...

  6. Experience sampling with elderly persons: an exploration of the method. (United States)

    Hnatiuk, S H


    The daily lives of a sample of elderly widows (greater than 69 years of age) were studied using the method of experience sampling developed by Csikszentmihalyi and his colleagues. The purpose of the study was to investigate the response of elderly people to experience sampling as a means of collecting information about their activities, thoughts, and moods during the course of one week. The method proved acceptable to the majority of participants and yielded reliable, valid data about their home lives, particularly from among the younger, more physically able women. Experience sampling was, within certain limits, a useful method of obtaining information from elderly people.

  7. Ranking sampling method in marketing research

    Directory of Open Access Journals (Sweden)



    Marketing and statistical literature available to practitioners provides a wide range of sampling methods that can be implemented in the context of marketing research. The ranking sampling method is based on dividing the general population into several strata, namely several subdivisions that are relatively homogeneous with regard to a certain characteristic. The sample is then composed by selecting from each stratum a certain number of components (proportional or non-proportional to the size of the stratum) until the pre-established sample size is reached. Using ranking sampling within marketing research requires the determination of some relevant statistical indicators - average, dispersion, sampling error etc. To that end, the paper contains a case study which illustrates the actual approach used to apply the ranking sample method within marketing research made by a company which provides Internet connection services, for a particular category of customers - small and medium enterprises.
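The indicators the abstract mentions (average, dispersion, sampling error) follow from standard stratified-sampling formulas. A hedged sketch, with hypothetical function names and simple random sampling within strata assumed:

```python
import math

def proportional_allocation(strata_sizes, n):
    """Allocate a total sample of n across strata in proportion to each
    stratum's population size N_h."""
    total = sum(strata_sizes)
    return [round(n * N_h / total) for N_h in strata_sizes]

def stratified_mean_se(strata):
    """strata: list of (N_h, sample_values). Returns the stratified mean and
    its standard error under stratified simple random sampling, including the
    finite population correction (1 - n_h / N_h)."""
    N = sum(N_h for N_h, _ in strata)
    mean = sum((N_h / N) * (sum(x) / len(x)) for N_h, x in strata)
    var = 0.0
    for N_h, x in strata:
        n_h = len(x)
        xbar = sum(x) / n_h
        s2 = sum((v - xbar) ** 2 for v in x) / (n_h - 1)
        var += (N_h / N) ** 2 * (1 - n_h / N_h) * s2 / n_h
    return mean, math.sqrt(var)
```

For example, allocating 100 interviews across strata of 500, 300 and 200 enterprises proportionally yields 50, 30 and 20 interviews respectively.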

  8. Low-flow ground water purging and sampling method

    It has been over 10 years since the low-flow ground water purging and sampling method was initially reported in the literature. The method grew from the recognition that well purging was necessary to collect representative samples, bailers could not achieve well purging, and high...

  9. Neonatal blood gas sampling methods | Goenka | South African ...

    African Journals Online (AJOL)

    There is little published guidance that systematically evaluates the different methods of neonatal blood gas sampling, where each method has its individual benefits and risks. This review critically surveys the available evidence to generate a comparison between arterial and capillary blood gas sampling, focusing on their ...

  10. An efficient method for sampling the essential subspace of proteins

    NARCIS (Netherlands)

    Amadei, A; Linssen, A.B M; de Groot, B.L.; van Aalten, D.M.F.; Berendsen, H.J.C.

    A method is presented for a more efficient sampling of the configurational space of proteins as compared to conventional sampling techniques such as molecular dynamics. The method is based on the large conformational changes in proteins revealed by the "essential dynamics" analysis. A form of ...

  11. Validation of a 20-h real-time PCR method for screening of Salmonella in poultry faecal samples

    DEFF Research Database (Denmark)

    Löfström, Charlotta; Hansen, Flemming; Hoorfar, Jeffrey


    Efficient and rapid monitoring of Salmonella in the poultry production chain is necessary to assure safe food. The objective was to validate an open-formula real-time PCR method for screening of Salmonella in poultry faeces (sock samples). The method consists of incubation in buffered peptone water for 18 ± 2 h, centrifugation of a 1-ml subsample, DNA extraction on the pellet and PCR. The total analysis time is 20 h. The validation study included comparative and collaborative trials, based on the recommendations from the Nordic organization for validation of alternative microbiological methods (NordVal). The comparative trial was performed against a reference method from the Nordic Committee on Food Analysis (NMKL187, 2007) using 132 artificially and naturally contaminated samples. The limit of detection (LOD50) was found to be 24 and 33 CFU/sample for the PCR and NMKL187 methods ...

  12. A method and fortran program for quantitative sampling in paleontology (United States)

    Tipper, J.C.


    The Unit Sampling Method is a binomial sampling method applicable to the study of fauna preserved in rocks too well cemented to be disaggregated. Preliminary estimates of the probability of detecting each group in a single sampling unit can be converted to estimates of the group's volumetric abundance by means of correction curves obtained by a computer simulation technique. This paper describes the technique and gives the FORTRAN program. © 1976.
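The binomial logic behind such correction curves can be sketched numerically: if a group occupies a volumetric fraction a of the rock and a sampling unit is examined at k points, the chance of detecting the group at least once is 1 - (1 - a)^k. A small Monte Carlo, standing in for the paper's FORTRAN simulation (all names here are illustrative), reproduces the analytic curve:

```python
import random

def detect_prob_analytic(abundance, points_per_unit):
    """P(group seen at least once in one sampling unit) = 1 - (1 - a)^k."""
    return 1.0 - (1.0 - abundance) ** points_per_unit

def detect_prob_mc(abundance, points_per_unit, trials=20000, seed=1):
    """Monte Carlo estimate of the same detection probability: each trial
    examines points_per_unit independent points of the simulated unit."""
    rng = random.Random(seed)
    hits = sum(
        any(rng.random() < abundance for _ in range(points_per_unit))
        for _ in range(trials)
    )
    return hits / trials

# a group at 1% volumetric abundance, 100 points per unit:
p = detect_prob_analytic(0.01, 100)   # about 0.63
```

Inverting this relationship (estimating a from an observed detection frequency) is what the paper's correction curves do.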

  13. Method and apparatus for imaging a sample on a device (United States)

    Trulson, Mark; Stern, David; Fiekowsky, Peter; Rava, Richard; Walton, Ian; Fodor, Stephen P. A.


    A method and apparatus for imaging a sample are provided. An electromagnetic radiation source generates excitation radiation which is sized by excitation optics to a line. The line is directed at a sample resting on a support and excites a plurality of regions on the sample. Collection optics collect response radiation reflected from the sample I and image the reflected radiation. A detector senses the reflected radiation and is positioned to permit discrimination between radiation reflected from a certain focal plane in the sample and certain other planes within the sample.

  14. Evaluating Composite Sampling Methods of Bacillus spores at Low Concentrations

    Energy Technology Data Exchange (ETDEWEB)

    Hess, Becky M.; Amidan, Brett G.; Anderson, Kevin K.; Hutchison, Janine R.


    Restoring facility operations after the 2001 Amerithrax attacks took over three months to complete, highlighting the need to reduce remediation time. The most time-intensive tasks were environmental sampling and sample analyses. Composite sampling allows disparate samples to be combined, with only a single analysis needed, making it a promising method to reduce response times. We developed a statistical experimental design to test three different composite sampling methods: 1) single medium single pass composite: a single cellulose sponge samples multiple coupons in a single pass; 2) single medium multi-pass composite: a single cellulose sponge samples multiple coupons with multiple passes; and 3) multi-medium post-sample composite: a single cellulose sponge samples a single surface, and then multiple sponges are combined during sample extraction. Five spore concentrations of Bacillus atrophaeus Nakamura spores were tested; concentrations ranged from 5 to 100 CFU/coupon (0.00775 to 0.155 CFU/cm2, respectively). Study variables included four clean surface materials (stainless steel, vinyl tile, ceramic tile, and painted wallboard) and three grime-coated/dirty materials (stainless steel, vinyl tile, and ceramic tile). Analysis of variance for the clean study showed two significant factors: composite method (p-value < 0.0001) and coupon material (p-value = 0.0008). Recovery efficiency (RE) was higher overall using the post-sample composite (PSC) method compared to single medium composites from both clean and grime-coated materials. RE with the PSC method for the concentrations tested (10 to 100 CFU/coupon) was similar for ceramic tile, painted wallboard, and stainless steel for clean materials. RE was lowest for vinyl tile with both composite methods. Statistical tests for the dirty study showed RE was significantly higher for vinyl and stainless steel materials, but significantly lower for ceramic tile. These results suggest post-sample compositing can be used to reduce sample analysis time when ...

  15. Sampling Methods for Web and E-mail Surveys


    Fricker, RD


    London: SAGE Publications. Reprinted from The SAGE Handbook of Online Research Methods, N. Fielding, R.M. Lee and G. Blank, eds., chapter 11, London: SAGE Publications, 195-216. This chapter is a comprehensive overview of sampling methods for web and e-mail ('Internet-based') surveys. It reviews the various types of sampling method – both probability and nonprobability – and examines their applicability to Internet-based surveys. Issues related to Internet-based survey samp...

  16. Evaluation of common methods for sampling invertebrate pollinator assemblages: net sampling out-perform pan traps.

    Directory of Open Access Journals (Sweden)

    Tony J Popic

    Methods for sampling ecological assemblages strive to be efficient, repeatable, and representative. Unknowingly, common methods may be limited in terms of revealing species function and so of less value for comparative studies. The global decline in pollination services has stimulated surveys of flower-visiting invertebrates, using pan traps and net sampling. We explore the relative merits of these two methods in terms of species discovery, quantifying abundance, function, and composition, and responses of species to changing floral resources. Using a spatially-nested design we sampled across a 5000 km2 area of arid grasslands, including 432 hours of net sampling and 1296 pan trap-days, between June 2010 and July 2011. Net sampling yielded 22% more species and 30% higher abundance than pan traps, and better reflected the spatio-temporal variation of floral resources. Species composition differed significantly between methods; from 436 total species, 25% were sampled by both methods, 50% only by nets, and the remaining 25% only by pans. Apart from being less comprehensive, if pan traps do not sample flower-visitors, the link to pollination is questionable. By contrast, net sampling functionally linked species to pollination through behavioural observations of flower-visitation interaction frequency. Netted specimens are also necessary for evidence of pollen transport. Benefits of net-based sampling outweighed minor differences in overall sampling effort. As pan traps and net sampling methods are not equivalent for sampling invertebrate-flower interactions, we recommend net sampling of invertebrate pollinator assemblages, especially if datasets are intended to document declines in pollination and guide measures to retain this important ecosystem service.

  17. A Swiss cheese error detection method for real-time EPID-based quality assurance and error prevention. (United States)

    Passarge, Michelle; Fix, Michael K; Manser, Peter; Stampanoni, Marco F M; Siebers, Jeffrey V


    To develop a robust and efficient process that detects relevant dose errors (dose errors of ≥5%) in external beam radiation therapy and directly indicates the origin of the error. The process is illustrated in the context of electronic portal imaging device (EPID)-based angle-resolved volumetric-modulated arc therapy (VMAT) quality assurance (QA), particularly as would be implemented in a real-time monitoring program. A Swiss cheese error detection (SCED) method was created as a paradigm for a cine EPID-based during-treatment QA. For VMAT, the method compares a treatment plan-based reference set of EPID images with images acquired over each 2° gantry angle interval. The process utilizes a sequence of independent consecutively executed error detection tests: an aperture check that verifies in-field radiation delivery and ensures no out-of-field radiation; output normalization checks at two different stages; a global image alignment check to examine if rotation, scaling, and translation are within tolerances; and pixel intensity checks comprising the standard gamma evaluation (3%, 3 mm) and pixel intensity deviation checks including and excluding high dose gradient regions. Tolerances for each check were determined. To test the SCED method, 12 different types of errors were selected to modify the original plan. A series of angle-resolved predicted EPID images were artificially generated for each test case, resulting in a sequence of precalculated frames for each modified treatment plan. The SCED method was applied multiple times for each test case to assess the ability to detect introduced plan variations. To compare the performance of the SCED process with that of a standard gamma analysis, both error detection methods were applied to the generated test cases with realistic noise variations. Averaged over ten test runs, 95.1% of all plan variations that resulted in relevant patient dose errors were detected within 2° and 100% within 14° ...
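The pixel-intensity check relies on the standard gamma evaluation (3%, 3 mm). A minimal 1D global-gamma sketch is given below for illustration; the SCED pipeline itself operates on 2D EPID frames, and the function name here is an assumption:

```python
import numpy as np

def gamma_1d(ref, meas, x_mm, dose_tol=0.03, dist_tol_mm=3.0):
    """Global 1D gamma index: for each measured point, the minimum over all
    reference points of sqrt((dx / dist_tol)^2 + (dd / (dose_tol * Dmax))^2),
    where dd is the dose difference and Dmax the reference maximum.
    A point passes the 3%/3 mm criterion when gamma <= 1."""
    dmax = ref.max()
    gamma = np.empty_like(meas, dtype=float)
    for i, (xi, di) in enumerate(zip(x_mm, meas)):
        dx = (x_mm - xi) / dist_tol_mm
        dd = (ref - di) / (dose_tol * dmax)
        gamma[i] = np.sqrt(dx ** 2 + dd ** 2).min()
    return gamma
```

An identical measured profile gives gamma = 0 everywhere; a uniform 3% dose offset on a flat profile gives gamma = 1 exactly, sitting right on the pass/fail boundary.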

  18. Method for using polarization gating to measure a scattering sample (United States)

    Baba, Justin S.


    Described herein are systems, devices, and methods facilitating optical characterization of scattering samples. A polarized optical beam can be directed to pass through a sample to be tested. The optical beam exiting the sample can then be analyzed to determine its degree of polarization, from which other properties of the sample can be determined. In some cases, an apparatus can include a source of an optical beam, an input polarizer, a sample, an output polarizer, and a photodetector. In some cases, a signal from a photodetector can be processed through attenuation, variable offset, and variable gain.

  19. [Respondent-Driven Sampling: a new sampling method to study visible and hidden populations]. (United States)

    Mantecón, Alejandro; Juan, Montse; Calafat, Amador; Becoña, Elisardo; Román, Encarna


    The paper introduces a variant of chain-referral sampling: respondent-driven sampling (RDS). This sampling method shows that methods based on network analysis can be combined with the statistical validity of standard probability sampling methods. In this sense, RDS appears to be a mathematical improvement of snowball sampling oriented to the study of hidden populations. However, we try to prove its validity with populations that are not within a sampling frame but can nonetheless be contacted without difficulty. The basics of RDS are explained through our research on young people (aged 14 to 25) who go clubbing, consume alcohol and other drugs, and have sex. Fieldwork was carried out between May and July 2007 in three Spanish regions: Baleares, Galicia and Comunidad Valenciana. The presentation of the study shows the utility of this type of sampling when the population is accessible but there is a difficulty deriving from the lack of a sampling frame. However, the sample obtained is not a random representative one in statistical terms of the target population. It must be acknowledged that the final sample is representative of a 'pseudo-population' that approximates to the target population but is not identical to it.

  20. Sampling and analysis methods for geothermal fluids and gases

    Energy Technology Data Exchange (ETDEWEB)

    Watson, J.C.


    The sampling procedures for geothermal fluids and gases include: sampling hot springs, fumaroles, etc.; sampling condensed brine and entrained gases; sampling steam-lines; low pressure separator systems; high pressure separator systems; two-phase sampling; downhole samplers; and miscellaneous methods. The recommended analytical methods compiled here cover physical properties, dissolved solids, and dissolved and entrained gases. The sequences of methods listed for each parameter are: wet chemical, gravimetric, colorimetric, electrode, atomic absorption, flame emission, x-ray fluorescence, inductively coupled plasma-atomic emission spectroscopy, ion exchange chromatography, spark source mass spectrometry, neutron activation analysis, and emission spectrometry. Material on correction of brine component concentrations for steam loss during flashing is presented. (MHR)

  1. A random spatial sampling method in a rural developing nation. (United States)

    Kondo, Michelle C; Bream, Kent D W; Barg, Frances K; Branas, Charles C


    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method using geographical information system (GIS) software and global positioning system (GPS) technology for application in a health survey in a rural region of Guatemala, as well as a qualitative study of the enumeration process. This method offers an alternative sampling technique that could reduce opportunities for bias in household selection compared to cluster methods. However, its use is subject to issues surrounding survey preparation, technological limitations and in-the-field household selection. Application of this method in remote areas will raise challenges surrounding the boundary delineation process, use and translation of satellite imagery between GIS and GPS, and household selection at each survey point in varying field conditions. This method favors household selection in denser urban areas and in new residential developments. Random spatial sampling methodology can be used to survey a random sample of population in a remote region of a developing nation. Although this method should be further validated and compared with more established methods to determine its utility in social survey applications, it shows promise for use in developing nations with resource-challenged environments where detailed geographic and human census data are less available.
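A hedged sketch of the core GIS step: drawing uniform random survey points inside a study-region polygon via plain rejection sampling with a ray-casting point-in-polygon test. Function names are illustrative; a real application would use projected coordinates and GIS software, as the authors did.

```python
import random

def point_in_polygon(x, y, poly):
    """Ray-casting test; poly is a list of (x, y) vertices in order."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses the horizontal ray at y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def random_points_in_region(poly, k, seed=None):
    """Draw k uniform random survey points inside the polygon by rejection
    sampling from its bounding box."""
    rng = random.Random(seed)
    xs = [p[0] for p in poly]
    ys = [p[1] for p in poly]
    pts = []
    while len(pts) < k:
        x = rng.uniform(min(xs), max(xs))
        y = rng.uniform(min(ys), max(ys))
        if point_in_polygon(x, y, poly):
            pts.append((x, y))
    return pts
```

Each accepted point then becomes a survey location at which the nearest household is selected in the field.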

  2. [Biological Advisory Subcommittee Sampling Methods : Results, Resolutions, and Correspondences : 2002 (United States)

    US Fish and Wildlife Service, Department of the Interior — This document contains a variety of information concerning Biological Advisory Subcommittee sampling methods at the Rocky Mountain Arsenal Refuge in 2002. Multiple...

  3. A cryopreservation method for Pasteurella multocida from wetland samples (United States)

    Moore, Melody K.; Shadduck, D.J.; Goldberg, D.R.; Samuel, M.D.


    A cryopreservation method and improved isolation techniques for detection of Pasteurella multocida from wetland samples were developed. Wetland water samples were collected in the field, diluted in dimethyl sulfoxide (DMSO, final concentration 10%), and frozen at -180 C in a liquid nitrogen vapor shipper. Frozen samples were transported to the laboratory where they were subsequently thawed and processed in Pasteurella multocida selective broth (PMSB) to isolate P. multocida. This method allowed for consistent isolation of 2 to 18 organisms/ml from water seeded with known concentrations of P. multocida. The method compared favorably with the standard mouse inoculation method and allowed for preservation of the samples until they could be processed in the laboratory.

  4. Methods for collection and analysis of water samples (United States)

    Rainwater, Frank Hays; Thatcher, Leland Lincoln


    This manual contains methods used by the U.S. Geological Survey to collect, preserve, and analyze water samples. Throughout, the emphasis is on obtaining analytical results that accurately describe the chemical composition of the water in situ. Among the topics discussed are selection of sampling sites, frequency of sampling, field equipment, preservatives and fixatives, analytical techniques of water analysis, and instruments. Seventy-seven laboratory and field procedures are given for determining fifty-three water properties.

  5. Field evaluation of personal sampling methods for multiple bioaerosols.

    Directory of Open Access Journals (Sweden)

    Chi-Hsun Wang

    Ambient bioaerosols are ubiquitous in the daily environment and can affect health in various ways. However, few studies have been conducted to comprehensively evaluate personal bioaerosol exposure in occupational and indoor environments because of the complex composition of bioaerosols and the lack of standardized sampling/analysis methods. We conducted a study to determine the most efficient collection/analysis method for the personal exposure assessment of multiple bioaerosols. The sampling efficiencies of three filters and four samplers were compared. According to our results, polycarbonate (PC) filters had the highest relative efficiency, particularly for bacteria. Side-by-side sampling was conducted to evaluate the three filter samplers (with PC filters) and the NIOSH Personal Bioaerosol Cyclone Sampler. According to the results, the Button Aerosol Sampler and the IOM Inhalable Dust Sampler had the highest relative efficiencies for fungi and bacteria, followed by the NIOSH sampler. Personal sampling was performed in a pig farm to assess occupational bioaerosol exposure and to evaluate the sampling/analysis methods. The Button and IOM samplers yielded a similar performance for personal bioaerosol sampling at the pig farm. However, the Button sampler is more likely to be clogged at high airborne dust concentrations because of its higher flow rate (4 L/min). Therefore, the IOM sampler is a more appropriate choice for performing personal sampling in environments with high dust levels. In summary, the Button and IOM samplers with PC filters are efficient sampling/analysis methods for the personal exposure assessment of multiple bioaerosols.

  6. Adaptive cluster sampling: An efficient method for assessing inconspicuous species (United States)

    Andrea M. Silletti; Joan Walker


    Restorationists typically evaluate the success of a project by estimating the population sizes of species that have been planted or seeded. Because a total census is rarely feasible, they must rely on sampling methods for population estimates. However, traditional random sampling designs may be inefficient for species that, for one reason or another, are challenging to...

  7. Access to Education for Orphans and Vulnerable Children in Uganda: A Multi-District, Cross-Sectional Study Using Lot Quality Assurance Sampling from 2011 to 2013. (United States)

    Olanrewaju, Ayobami D; Jeffery, Caroline; Crossland, Nadine; Valadez, Joseph J


    This study estimates the proportion of Orphans and Vulnerable Children (OVC) attending school in 89 districts of Uganda from 2011 - 2013 and investigates the factors influencing OVC access to education among this population. This study used secondary survey data from OVCs aged 5 - 17 years, collected using Lot Quality Assurance Sampling in 87 Ugandan districts over a 3-year period (2011 - 2013). Estimates of OVC school attendance were determined for the yearly time periods. Logistic regression was used to investigate the factors influencing OVC access to education. 19,354 children aged 5 - 17 were included in the analysis. We estimated that 79.1% (95% CI: 78.5% - 79.7%) of OVCs attended school during the 3-year period. Logistic regression revealed the odds of attending school were lower among OVCs from Western (OR 0.88; 95% CI: 0.79 - 0.99) and Northern (OR 0.64; 95% CI: 0.56 - 0.73) regions compared to the Central region. Female OVCs had significantly higher odds of attending school (OR 1.09; 95% CI: 1.02 - 1.17) compared to their male counterparts. When adjusting for all variables simultaneously, we found the odds of school attendance reduced by 12% between 2011 and 2012 among all OVCs (OR 0.88; 95% CI: 0.81 - 0.97). Our findings reinforce the need to provide continuing support to OVC in Uganda, ensuring they have the opportunity to attain an education. The data indicate important regional and gender variation that needs to be considered for support strategies and in social policy. The results suggest the need for greater local empowerment to address the needs of OVCs. We recommend further research to understand why OVC access to education and attendance varies between regions, improvement of district-level mapping of OVC access to education, and further study to understand the particular factors behind the lower school attendance of male OVCs.

  8. DOE methods for evaluating environmental and waste management samples

    Energy Technology Data Exchange (ETDEWEB)

    Goheen, S.C.; McCulloch, M.; Thomas, B.L.; Riley, R.G.; Sklarew, D.S.; Mong, G.M.; Fadeff, S.K. [eds.


    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) is a resource intended to support sampling and analytical activities for the evaluation of environmental and waste management samples from U.S. Department of Energy (DOE) sites. DOE Methods is the result of extensive cooperation from all DOE analytical laboratories. All of these laboratories have contributed key information and provided technical reviews as well as significant moral support leading to the success of this document. DOE Methods is designed to encompass methods for collecting representative samples and for determining the radioisotope activity and organic and inorganic composition of a sample. These determinations will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the U.S. Environmental Protection Agency, or others. The development of DOE Methods is supported by the Analytical Services Division of DOE. Unique methods or methods consolidated from similar procedures in the DOE Procedures Database are selected for potential inclusion in this document. Initial selection is based largely on DOE needs and procedure applicability and completeness. Methods appearing in this document are one of two types, "Draft" or "Verified". "Draft" methods that have been reviewed internally and show potential for eventual verification are included in this document, but they have not been reviewed externally, and their precision and bias may not be known. "Verified" methods in DOE Methods have been reviewed by volunteers from various DOE sites and private corporations. These methods have delineated measures of precision and accuracy.

  9. Method and sample spinning apparatus for measuring the NMR spectrum of an orientationally disordered sample (United States)

    Pines, Alexander; Samoson, Ago


    An improved NMR apparatus and method are described which substantially improve the resolution of NMR measurements made on powdered or amorphous or otherwise orientationally disordered samples. The apparatus spins the sample about an axis. The angle of the axis is mechanically varied such that the time average of two or more Legendre polynomials is zero.
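    The zero-average condition on the Legendre polynomials can be made concrete: in variable-angle spinning schemes such as double rotation, the spinning axes are chosen at angles where P2(cos θ) and P4(cos θ) vanish. A minimal sketch with a simple bisection root-finder (the angles and search brackets are standard textbook values, not details of the patented mechanism):

```python
import math

def p2(x):
    # Second-order Legendre polynomial P2(x) = (3x^2 - 1) / 2
    return (3 * x**2 - 1) / 2

def p4(x):
    # Fourth-order Legendre polynomial P4(x) = (35x^4 - 30x^2 + 3) / 8
    return (35 * x**4 - 30 * x**2 + 3) / 8

def root_angle(poly, lo_deg, hi_deg, tol=1e-10):
    """Bisect for the angle theta (degrees) where poly(cos(theta)) = 0,
    assuming exactly one sign change on the bracket."""
    lo, hi = math.radians(lo_deg), math.radians(hi_deg)
    flo = poly(math.cos(lo))
    while hi - lo > tol:
        mid = (lo + hi) / 2
        fmid = poly(math.cos(mid))
        if flo * fmid <= 0:
            hi = mid
        else:
            lo, flo = mid, fmid
    return math.degrees((lo + hi) / 2)

magic = root_angle(p2, 0, 90)      # magic angle, ~54.7356 deg: P2(cos theta) = 0
dor_inner = root_angle(p4, 0, 45)  # first zero of P4(cos theta), ~30.56 deg
print(round(magic, 2), round(dor_inner, 2))
```

Spinning about the first angle averages P2 to zero; adding rotation about a second axis where P4 vanishes removes the next-order term as well.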

  10. Methods for sample size determination in cluster randomized trials. (United States)

    Rutterford, Clare; Copas, Andrew; Eldridge, Sandra


    The use of cluster randomized trials (CRTs) is increasing, along with the variety in their design and analysis. The simplest approach for their sample size calculation is to calculate the sample size assuming individual randomization and inflate this by a design effect to account for randomization by cluster. The assumptions of a simple design effect may not always be met; alternative or more complicated approaches are required. We summarise a wide range of sample size methods available for cluster randomized trials. For those familiar with sample size calculations for individually randomized trials but with less experience in the clustered case, this manuscript provides formulae for a wide range of scenarios with associated explanation and recommendations. For those with more experience, comprehensive summaries are provided that allow quick identification of methods for a given design, outcome and analysis method. We present first those methods applicable to the simplest two-arm, parallel group, completely randomized design followed by methods that incorporate deviations from this design such as: variability in cluster sizes; attrition; non-compliance; or the inclusion of baseline covariates or repeated measures. The paper concludes with methods for alternative designs. There is a large amount of methodology available for sample size calculations in CRTs. This paper gives the most comprehensive description of published methodology for sample size calculation and provides an important resource for those designing these trials. © The Author 2015. Published by Oxford University Press on behalf of the International Epidemiological Association.
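    The simplest approach described above, inflating an individually randomized sample size by a design effect, can be sketched as follows. The effect size, ICC, and cluster size are illustrative assumptions, and the normal-approximation formula is the textbook two-sample calculation, not the article's full methodology:

```python
import math

def n_individual(delta, sd, alpha=0.05, power=0.8):
    """Per-arm sample size for a two-sample comparison of means
    (normal approximation), before any clustering adjustment."""
    # Standard normal quantiles for two-sided alpha = 0.05 and power = 0.8
    z_alpha = 1.959964  # Phi^{-1}(1 - 0.05/2)
    z_beta = 0.841621   # Phi^{-1}(0.8)
    return 2 * ((z_alpha + z_beta) * sd / delta) ** 2

def design_effect(cluster_size, icc):
    """Simple design effect for equal cluster sizes: 1 + (m - 1) * ICC."""
    return 1 + (cluster_size - 1) * icc

n_ind = n_individual(delta=0.5, sd=1.0)          # ~63 per arm, unclustered
deff = design_effect(cluster_size=20, icc=0.05)  # 1.95
n_crt = math.ceil(n_ind * deff)                  # inflated per-arm size
print(n_crt)
```

As the abstract notes, this simple design effect assumes equal cluster sizes and no attrition; the alternative formulae the paper catalogues relax exactly these assumptions.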

  11. System and method for measuring fluorescence of a sample

    Energy Technology Data Exchange (ETDEWEB)

    Riot, Vincent J.


    The present disclosure provides a system and a method for measuring fluorescence of a sample. The sample may be a polymerase-chain-reaction (PCR) array, a loop-mediated-isothermal amplification array, etc. LEDs are used to excite the sample, and a photodiode is used to collect the sample's fluorescence. An electronic offset signal is used to reduce the effects of background fluorescence and noise from the measurement system. An integrator integrates the difference between the output of the photodiode and the electronic offset signal over a given period of time. The resulting integral is then converted into the digital domain for further processing and storage.
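    A discrete-time sketch of the integrate-then-digitize chain described above. The signal levels, sampling step, and ADC resolution are illustrative assumptions, not values from the disclosure:

```python
def integrate_fluorescence(photodiode, offset, dt):
    """Accumulate (signal - electronic offset) over the sampling window,
    a discrete stand-in for the analog integrator described above."""
    return sum((v - offset) * dt for v in photodiode)

def to_digital(value, full_scale, bits=12):
    """Quantize the integral to an ADC code (illustrative 12-bit converter)."""
    clipped = max(0.0, min(value, full_scale))
    return round(clipped / full_scale * (2**bits - 1))

# Simulated photodiode readings (volts): a 0.10 V electronic offset plus
# a small 0.02 V fluorescence contribution, sampled every millisecond.
samples = [0.12] * 100
integral = integrate_fluorescence(samples, offset=0.10, dt=1e-3)  # ~0.002 V*s
print(to_digital(integral, full_scale=0.01))
```

Subtracting the offset before integrating is what keeps the large background from consuming the integrator's (and ADC's) dynamic range.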

  12. Soil separator and sampler and method of sampling (United States)

    O'Brien, Barry H [Idaho Falls, ID; Ritter, Paul D [Idaho Falls, ID


    A soil sampler includes a fluidized bed for receiving a soil sample. The fluidized bed may be in communication with a vacuum for drawing air through the fluidized bed and suspending particulate matter of the soil sample in the air. In a method of sampling, the air may be drawn across a filter, separating the particulate matter. Optionally, a baffle or a cyclone may be included within the fluidized bed for disentrainment, or dedusting, so only the finest particulate matter, including asbestos, will be trapped on the filter. The filter may be removable, and may be tested to determine the content of asbestos and other hazardous particulate matter in the soil sample.

  13. System and method for measuring fluorescence of a sample (United States)

    Riot, Vincent J


    The present disclosure provides a system and a method for measuring fluorescence of a sample. The sample may be a polymerase-chain-reaction (PCR) array, a loop-mediated-isothermal amplification array, etc. LEDs are used to excite the sample, and a photodiode is used to collect the sample's fluorescence. An electronic offset signal is used to reduce the effects of background fluorescence and noise from the measurement system. An integrator integrates the difference between the output of the photodiode and the electronic offset signal over a given period of time. The resulting integral is then converted into the digital domain for further processing and storage.

  14. Using re-sampling methods in mortality studies.

    Directory of Open Access Journals (Sweden)

    Igor Itskovich

    Full Text Available Traditional methods of computing standardized mortality ratios (SMR) in mortality studies rely upon a number of conventional statistical propositions to estimate confidence intervals for obtained values. Those propositions include a common but arbitrary choice of the confidence level and the assumption that the observed number of deaths in the test sample is a purely random quantity. The latter assumption may not be fully justified for a series of periodic "overlapping" studies. We propose a new approach to evaluating the SMR, along with its confidence interval, based on a simple re-sampling technique. The proposed method is straightforward and requires neither the above assumptions nor the rigorous sample-selection techniques employed by modern re-sampling theory. Instead, we include in the re-sampling analysis all possible samples that correspond to the specified time window of the study. As a result, directly obtained confidence intervals for repeated overlapping studies may be tighter than those yielded by conventional methods. The proposed method is illustrated by evaluating mortality due to a hypothetical risk factor in a life insurance cohort. With this method, SMR values can be estimated more precisely than with the traditional approach, so the corresponding risk assessment carries smaller uncertainties.
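    The general idea of re-sampling an SMR can be illustrated with a percentile bootstrap over subgroup counts. This is a conventional random re-sampling sketch, not the authors' exhaustive windowed scheme, and the observed/expected counts are invented:

```python
import random

def smr_resampling_ci(deaths, expected, n_boot=2000, level=0.95, seed=7):
    """Percentile resampling interval for SMR = sum(observed) / sum(expected).
    `deaths` and `expected` hold per-subgroup observed and expected counts.
    Generic bootstrap sketch, not the paper's windowed all-samples scheme."""
    rng = random.Random(seed)
    n = len(deaths)
    stats = []
    for _ in range(n_boot):
        # Resample subgroups with replacement and recompute the ratio.
        sample = [rng.randrange(n) for _ in range(n)]
        obs = sum(deaths[i] for i in sample)
        exp = sum(expected[i] for i in sample)
        stats.append(obs / exp)
    stats.sort()
    lo = stats[int((1 - level) / 2 * n_boot)]
    hi = stats[int((1 + level) / 2 * n_boot) - 1]
    return sum(deaths) / sum(expected), (lo, hi)

deaths = [4, 7, 3, 9, 5, 6, 2, 8]
expected = [5.1, 6.0, 4.2, 7.5, 5.5, 5.8, 3.1, 6.9]
smr, (lo, hi) = smr_resampling_ci(deaths, expected)
print(round(smr, 3), round(lo, 3), round(hi, 3))
```

The interval follows from the empirical distribution of re-sampled ratios rather than from an assumed Poisson model, which is the spirit of the approach above.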

  15. Heat-capacity measurements on small samples: The hybrid method

    NARCIS (Netherlands)

    Klaasse, J.C.P.; Brück, E.H.


    A newly developed method is presented for measuring heat capacities on small samples, particularly where thermal isolation is not sufficient for the use of the traditional semiadiabatic heat-pulse technique. This "hybrid technique" is a modification of this heat-pulse method in case the temperature

  16. Method of determining an electrical property of a test sample

    DEFF Research Database (Denmark)


    A method of obtaining an electrical property of a test sample, comprising a non-conductive area and a conductive or semi-conductive test area, by performing multiple measurements using a multi-point probe. The method comprising the steps of providing a magnetic field having field lines passing...... the electrical property of the test area....

  17. SU-F-T-172: A Method for Log File QA On An IBA Proteus System for Patient Specific Spot Scanning Quality Assurance

    Energy Technology Data Exchange (ETDEWEB)

    Tang, S; Ho, M; Chen, C; Mah, D [ProCure NJ, Somerset, NJ (United States); Rice, I; Doan, D; Mac Rae, B [IBA, Somerset, NJ (United States)


    Purpose: The use of log files to perform patient specific quality assurance for both protons and IMRT has been established. Here, we extend that approach to a proprietary log file format and compare our results to measurements in phantom. Our goal was to generate a system that would permit gross errors to be found within 3 fractions, before direct measurements are made. This approach could eventually replace direct measurements. Methods: Spot scanning protons pass through multi-wire ionization chambers which provide information about the charge, location, and size of each delivered spot. We have generated a program that calculates the dose in phantom from these log files and compares the measurements with the plan. The program has 3 different spot shape models: single Gaussian, double Gaussian and the ASTROID model. The program was benchmarked across different treatment sites for 23 patients and 74 fields. Results: The doses calculated from the log files were compared to those generated by the treatment planning system (RayStation). While the double Gaussian model often gave better agreement, overall, the ASTROID model gave the most consistent results. Using a 5%–3 mm gamma with a 90% passing criterion and excluding doses below 20% of prescription, all patient samples passed. However, the degree of agreement of the log file approach was slightly worse than that of the chamber array measurement approach. Operationally, this implies that if the beam passes the log file model, it should pass direct measurement. Conclusion: We have established and benchmarked a model for log file QA on an IBA Proteus Plus system. The choice of optimal spot model for a given class of patients may be affected by factors such as site, field size, and range shifter and will be investigated further.
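    The 5%/3 mm gamma criterion quoted above can be sketched in one dimension. This simplified global-gamma calculation, with invented dose profiles and no sub-grid interpolation, is illustrative only and not the clinical implementation used in the study:

```python
import math

def gamma_index_1d(positions, dose_eval, dose_ref, dd=0.05, dta=3.0):
    """Simplified 1-D gamma index: for each reference point, the minimum
    combined dose-difference / distance-to-agreement metric over evaluated
    points. `dd` is the fractional dose tolerance (5% of global max) and
    `dta` the distance tolerance in mm (3 mm). A sketch, not clinical code."""
    dmax = max(dose_ref)
    gammas = []
    for xr, dr in zip(positions, dose_ref):
        best = min(
            math.sqrt(((de - dr) / (dd * dmax)) ** 2 + ((xe - xr) / dta) ** 2)
            for xe, de in zip(positions, dose_eval)
        )
        gammas.append(best)
    return gammas

x = [0.0, 1.0, 2.0, 3.0, 4.0]           # positions in mm
ref = [1.0, 2.0, 4.0, 2.0, 1.0]         # planned dose (arbitrary units)
ev = [1.0, 2.1, 3.9, 2.0, 1.1]          # log-file-reconstructed dose
g = gamma_index_1d(x, ev, ref)
passing = sum(gamma <= 1.0 for gamma in g) / len(g)
print(passing)
```

A point passes when its gamma value is at most 1; the study's 90% criterion then requires at least 90% of (above-threshold) points to pass.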

  18. The use of microbead-based spoligotyping for Mycobacterium tuberculosis complex to evaluate the quality of the conventional method: Providing guidelines for Quality Assurance when working on membranes

    Directory of Open Access Journals (Sweden)

    Garzelli Carlo


    Full Text Available Abstract Background The classical spoligotyping technique, relying on membrane reverse line-blot hybridization of the spacers of the Mycobacterium tuberculosis CRISPR locus, is used world-wide (598 references in PubMed on April 8th, 2011). However, until now no inter-laboratory quality control study had been undertaken to validate this technique. We analyzed the quality of membrane-based spoligotyping by comparing it to the recently introduced and highly robust microbead-based spoligotyping. Nine hundred and twenty-seven isolates were analyzed, totaling 39,861 data points. Samples were received from 11 international laboratories with a worldwide distribution. Methods The high-throughput microbead-based spoligotyping was performed on CTAB and thermolyzate DNA extracted from isolated Mycobacterium tuberculosis complex (MTC) strains provided by the participating genotyping centers. Information on how each center performed the classical spoligotyping method was available. Genotype discriminatory analyses were carried out by comparing the spoligotypes obtained by both methods. The non-parametric Mann-Whitney U homogeneity test and the Spearman rank correlation test were performed to validate the observed results. Results Seven of the 11 laboratories (63%) perfectly typed more than 90% of isolates, 3 scored between 80% and 90%, and a single center was under 80%, reaching only 51% concordance. However, this was mainly due to discordance in a single spacer, likely reflecting a non-functional probe on the membrane used. The centers using thermolyzate DNA performed as well as centers using the more extended CTAB extraction procedure. Few centers shared the same problematic spacers, and these problematic spacers were scattered over the whole CRISPR locus (mostly spacers 15, 14, 18, 37, 39, and 40). Conclusions We confirm that classical spoligotyping is a robust method with generally a high reliability in most centers.
The applied DNA extraction procedure (CTAB

  19. Efficiency of snake sampling methods in the Brazilian semiarid region. (United States)

    Mesquita, Paula C M D; Passos, Daniel C; Cechin, Sonia Z


    The choice of sampling methods is a crucial step in every field survey in herpetology. In countries where time and financial support are limited, the choice of methods is critical. The methods used to sample snakes often lack objective criteria, and tradition has apparently carried more weight than suitability when making the choice. Consequently, studies using non-standardized methods are frequently found in the literature. We compared four commonly used methods for sampling snake assemblages in a semiarid area in Brazil. We compared the efficacy of each method based on the cost-benefit regarding the number of individuals and species captured, time, and financial investment. We found that pitfall traps were the least effective method in all aspects evaluated, and they were not complementary to the other methods in terms of abundance of species and assemblage structure. We conclude that methods can only be considered complementary if they are standardized to the objectives of the study. The use of pitfall traps in short-term surveys of the snake fauna in areas with shrubby vegetation and stony soil is not recommended.

  20. A multi-dimensional sampling method for locating small scatterers (United States)

    Song, Rencheng; Zhong, Yu; Chen, Xudong


    A multiple signal classification (MUSIC)-like multi-dimensional sampling method (MDSM) is introduced to locate small three-dimensional scatterers using electromagnetic waves. The indicator is built with the most stable part of signal subspace of the multi-static response matrix on a set of combinatorial sampling nodes inside the domain of interest. It has two main advantages compared to the conventional MUSIC methods. First, the MDSM is more robust against noise. Second, it can work with a single incidence even for multi-scatterers. Numerical simulations are presented to show the good performance of the proposed method.
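    A minimal sketch of a MUSIC-type localization indicator for a single point scatterer, assuming an idealized co-located transmit/receive array, a 2-D scalar Green's function, and an illustrative wavenumber. This is conventional MUSIC built from the signal-subspace decomposition of the multi-static response matrix, not the paper's combinatorial multi-dimensional sampling:

```python
import numpy as np

def green(points, r, k=2 * np.pi):
    """Scalar Green's-function vector from array points to location r
    (2-D sketch; k is an illustrative wavenumber)."""
    d = np.linalg.norm(points - r, axis=1)
    return np.exp(1j * k * d) / d

# Co-located transmit/receive array along the x axis; scatterer at r0.
array = np.column_stack([np.linspace(-2, 2, 9), np.zeros(9)])
r0 = np.array([0.3, 1.5])

# Multi-static response matrix for one point scatterer (Born model): g g^T.
g0 = green(array, r0)
K = np.outer(g0, g0)

# Signal subspace = span of the leading singular vectors; rank 1 here.
U, s, _ = np.linalg.svd(K)
noise = U[:, 1:]  # orthogonal complement of the signal subspace

def indicator(r):
    """MUSIC pseudospectrum: large where g(r) is (nearly) orthogonal
    to the noise subspace, i.e. at the scatterer location."""
    g = green(array, r)
    g = g / np.linalg.norm(g)
    return 1.0 / np.linalg.norm(noise.conj().T @ g)

grid = [np.array([x, y]) for x in np.linspace(-1, 1, 21)
        for y in np.linspace(0.5, 2.5, 21)]
best = max(grid, key=indicator)
print(best)
```

The paper's indicator instead samples combinations of nodes and retains only the most stable part of the signal subspace, which is what buys the extra noise robustness.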

  1. Advanced Markov chain Monte Carlo methods learning from past samples

    CERN Document Server

    Liang, Faming; Carrol, Raymond J


    This book provides comprehensive coverage of the simulation of complex systems using Monte Carlo methods. Developing algorithms that are immune to the local-trap problem has long been considered the most important topic in MCMC research. Various advanced MCMC algorithms that address this problem have been developed, including the modified Gibbs sampler, methods based on auxiliary variables, and methods making use of past samples. The focus of this book is on the algorithms that make use of past samples. This book includes the multicanonical algorithm, dynamic weighting, dynamically weight

  2. Quality assurance and statistical control

    DEFF Research Database (Denmark)

    Heydorn, K.


    In scientific research laboratories it is rarely possible to use quality assurance schemes, developed for large-scale analysis. Instead, methods have been developed to control the quality of modest numbers of analytical results by relying on statistical control: Analysis of precision serves...... to detect analytical errors by comparing the a priori precision of the analytical results with the actual variability observed among replicates or duplicates. The method relies on the chi-square distribution to detect excess variability and is quite sensitive even for 5-10 results. Interference control...... through the origo. Calibration control is an essential link in the traceability of results. Only one or two samples of pure solid or aqueous standards with accurately known content need to be analyzed. Verification is carried out by analyzing certified reference materials from BCR, NIST, or others...
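    The "analysis of precision" described above can be sketched as a chi-square comparison of observed replicate variability against the claimed a priori standard deviation. The replicate data and the critical value below are illustrative assumptions:

```python
def precision_chi_square(replicates, sigma_a_priori):
    """Analysis-of-precision sketch: T = sum((x_i - mean)^2) / sigma^2,
    which follows a chi-square distribution with n - 1 degrees of freedom
    when the a priori precision sigma is correct. Excess variability
    (analytical error) shows up as an improbably large T."""
    n = len(replicates)
    mean = sum(replicates) / n
    t = sum((x - mean) ** 2 for x in replicates) / sigma_a_priori ** 2
    return t, n - 1

# Ten replicate results with a claimed a priori SD of 0.5 (invented data).
results = [10.1, 9.8, 10.4, 10.0, 9.7, 10.2, 10.3, 9.9, 10.0, 10.1]
t, df = precision_chi_square(results, sigma_a_priori=0.5)

# Upper 5% critical value of chi-square with 9 degrees of freedom (~16.92).
excess_variability = t > 16.92
print(round(t, 3), excess_variability)
```

Here the observed spread is well within the claimed precision, so the replicates pass; a T above the critical value would flag an analytical error, which is the sensitivity the abstract claims even for 5-10 results.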

  3. Sample size formulae for the Bayesian continual reassessment method. (United States)

    Cheung, Ying Kuen


    In the planning of a dose finding study, a primary design objective is to maintain high accuracy in terms of the probability of selecting the maximum tolerated dose. While numerous dose finding methods have been proposed in the literature, concrete guidance on sample size determination is lacking. With a motivation to provide quick and easy calculations during trial planning, we present closed form formulae for sample size determination associated with the use of the Bayesian continual reassessment method (CRM). We examine the sampling distribution of a nonparametric optimal design and exploit it as a proxy to empirically derive an accuracy index of the CRM using linear regression. We apply the formulae to determine the sample size of a phase I trial of PTEN-long in pancreatic cancer patients and demonstrate that the formulae give results very similar to simulation. The formulae are implemented by an R function 'getn' in the package 'dfcrm'. The results are developed for the Bayesian CRM and should be validated by simulation when used for other dose finding methods. The analytical formulae we propose give quick and accurate approximation of the required sample size for the CRM. The approach used to derive the formulae can be applied to obtain sample size formulae for other dose finding methods.


    Energy Technology Data Exchange (ETDEWEB)

    Maxwell, S.; Culligan, B.


    A new rapid separation method for radiostrontium in emergency milk samples was developed at the Savannah River Site (SRS) Environmental Bioassay Laboratory (Aiken, SC, USA) that will allow rapid separation and measurement of Sr-90 within 8 hours. The new method uses calcium phosphate precipitation, nitric acid dissolution of the precipitate to coagulate residual fat/proteins and a rapid strontium separation using Sr Resin (Eichrom Technologies, Darien, IL, USA) with vacuum-assisted flow rates. The method is much faster than previous methods that use calcination or cation exchange pretreatment, has excellent chemical recovery, and effectively removes beta interferences. When a 100 ml sample aliquot is used, the method has a detection limit of 0.5 Bq/L, well below generic emergency action levels.

  5. Characterizing lentic freshwater fish assemblages using multiple sampling methods. (United States)

    Fischer, Jesse R; Quist, Michael C


    Characterizing fish assemblages in lentic ecosystems is difficult, and multiple sampling methods are almost always necessary to gain reliable estimates of indices such as species richness. However, most research focused on lentic fish sampling methodology has targeted recreationally important species, and little to no information is available regarding the influence of multiple methods and timing (i.e., temporal variation) on characterizing entire fish assemblages. Therefore, six lakes and impoundments (48-1,557 ha surface area) were sampled seasonally with seven gear types to evaluate the combined influence of sampling methods and timing on the number of species and individuals sampled. Probabilities of detection for species indicated strong selectivities and seasonal trends that provide guidance on optimal seasons to use gears when targeting multiple species. The evaluation of species richness and number of individuals sampled using multiple gear combinations demonstrated no appreciable benefit over relatively few gears (e.g., two to four) used in optimal seasons. Specifically, over 90% of the species encountered with all gear types and season combinations (N = 19) from six lakes and reservoirs were sampled with nighttime boat electrofishing in the fall and benthic trawling, modified-fyke, and mini-fyke netting during the summer. Our results indicated that the characterization of lentic fish assemblages was highly influenced by the selection of sampling gears and seasons, but did not appear to be influenced by waterbody type (i.e., natural lake, impoundment). The standardization of data collected with multiple methods and seasons to account for bias is imperative to monitoring of lentic ecosystems and will provide researchers with increased reliability in their interpretations and decisions made using information on lentic fish assemblages.

  6. Self-contained cryogenic gas sampling apparatus and method (United States)

    McManus, G.J.; Motes, B.G.; Bird, S.K.; Kotter, D.K.


    Apparatus for obtaining a whole gas sample is composed of: a sample vessel having an inlet for receiving a gas sample; a controllable valve mounted for controllably opening and closing the inlet; a valve control coupled to the valve for opening and closing the valve at selected times; a portable power source connected for supplying operating power to the valve control; and a cryogenic coolant in thermal communication with the vessel for cooling the interior of the vessel to cryogenic temperatures. A method is described for obtaining an air sample using the apparatus described above, by: placing the apparatus at a location at which the sample is to be obtained; operating the valve control to open the valve at a selected time and close the valve at a selected subsequent time; and between the selected times maintaining the vessel at a cryogenic temperature by heat exchange with the coolant. 3 figs.

  7. Fluidics platform and method for sample preparation and analysis (United States)

    Benner, W. Henry; Dzenitis, John M.; Bennet, William J.; Baker, Brian R.


    Herein provided are a fluidics platform and method for sample preparation and analysis. The fluidics platform is capable of analyzing DNA from blood samples using amplification assays such as polymerase-chain-reaction assays and loop-mediated-isothermal-amplification assays. The fluidics platform can also be used for other types of assays and analyses. In some embodiments, a sample in a sealed tube can be inserted directly. The subsequent isolation, detection, and analyses can be performed without a user's intervention. The disclosed platform may also comprise a sample preparation system with a magnetic actuator, a heater, and an air-drying mechanism, and fluid manipulation processes for extraction, washing, elution, assay assembly, assay detection, and cleaning after reactions and between samples.

  8. Methods for Sampling and Measurement of Compressed Air Contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Stroem, L.


    In order to improve the technique for measuring oil and water entrained in a compressed air stream, a laboratory study has been made of some methods for sampling and measurement. For this purpose water or oil as artificial contaminants were injected in thin streams into a test loop, carrying dry compressed air. Sampling was performed in a vertical run, down-stream of the injection point. Wall attached liquid, coarse droplet flow, and fine droplet flow were sampled separately. The results were compared with two-phase flow theory and direct observation of liquid behaviour. In a study of sample transport through narrow tubes, it was observed that, below a certain liquid loading, the sample did not move, the liquid remaining stationary on the tubing wall. The basic analysis of the collected samples was made by gravimetric methods. Adsorption tubes were used with success to measure water vapour. A humidity meter with a sensor of the aluminium oxide type was found to be unreliable. Oil could be measured selectively by a flame ionization detector, the sample being pretreated in an evaporation-condensation unit.


    Energy Technology Data Exchange (ETDEWEB)

    Maxwell, S.; Culligan, B.


    The Savannah River Site Environmental Bioassay Lab participated in the 2008 NRIP Emergency Response program administered by the National Institute for Standards and Technology (NIST) in May, 2008. A new rapid column separation method was used for analysis of actinides and ⁹⁰Sr in the NRIP 2008 emergency water and urine samples. Significant method improvements were applied to reduce analytical times. As a result, much faster analysis times were achieved, less than 3 hours for determination of ⁹⁰Sr and 3-4 hours for actinides. This represents a 25%-33% improvement in analysis times from NRIP 2007 and a ~100% improvement compared to NRIP 2006 report times. Column flow rates were increased by a factor of two, with no significant adverse impact on the method performance. Larger sample aliquots, shorter count times, faster cerium fluoride microprecipitation and streamlined calcium phosphate precipitation were also employed. Based on initial feedback from NIST, the SRS Environmental Bioassay Lab had the most rapid analysis times for actinides and ⁹⁰Sr analyses for NRIP 2008 emergency urine samples. High levels of potential matrix interferences may be present in emergency samples and rugged methods are essential. Extremely high levels of ²¹⁰Po were found to have an adverse effect on the uranium results for the NRIP-08 urine samples, while uranium results for NRIP-08 water samples were not affected. This problem, which was not observed for NRIP-06 or NRIP-07 urine samples, was resolved by using an enhanced ²¹⁰Po removal step, which will be described.


    Energy Technology Data Exchange (ETDEWEB)

    Maxwell, S.; Culligan, B.


    The Savannah River Site Environmental Bioassay Lab participated in the 2007 NRIP Emergency Response program administered by the National Institute for Standards and Technology (NIST) in May, 2007. A new rapid column separation method was applied directly to the NRIP 2007 emergency urine samples, with only minimal sample preparation to reduce preparation time. Calcium phosphate precipitation, previously used to pre-concentrate actinides and Sr-90 in NRIP 2006 urine and water samples, was not used for the NRIP 2007 urine samples. Instead, the raw urine was acidified and passed directly through the stacked resin columns (TEVA+TRU+SR Resins) to separate the actinides and strontium from the NRIP urine samples more quickly. This improvement reduced sample preparation time for the NRIP 2007 emergency urine analyses significantly. This approach works well for small volume urine samples expected during an emergency response event. Based on initial feedback from NIST, the SRS Environmental Bioassay Lab had the most rapid analysis times for actinides and strontium-90 analyses for NRIP 2007 urine samples.

  11. Evaluation of Two Malaria Rapid Diagnostic Tests Quality Assurance (mRDT’s QA) Methods in Peripheral Health Facilities, Rural Tanzania.


    Masanja, Irene; Maganga, Musa; Sumari, Debora; Lucchi, Naomi; Udhayakumar, Venkatachalam; McMorrow, Meredith; Kachur, Patrick


    WHO recommends confirming suspected malaria cases before initiation of treatment. Due to the limited availability of quality microscopy services, this recommendation has been followed with increased use of antigen-detecting malaria rapid diagnostic tests (mRDTs) in many malaria endemic countries. With the increased use of mRDTs, the need for a thorough mRDT quality assurance (RDT QA) method has become more apparent. One of the WHO recommendations for RDT QA is to monitor the tests in field use...

  12. Comparison of sampling methods for radiocarbon dating of carbonyls in air samples via accelerator mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Schindler, Matthias, E-mail:; Kretschmer, Wolfgang; Scharf, Andreas; Tschekalinskij, Alexander


    Three new methods to sample and prepare various carbonyl compounds for radiocarbon measurements were developed and tested. Two of these procedures utilized the Strecker synthetic method to form amino acids from carbonyl compounds with either sodium cyanide or trimethylsilyl cyanide. The third procedure used semicarbazide to form crystalline semicarbazones with the carbonyl compounds. The resulting amino acids and semicarbazones were then separated and purified using thin layer chromatography. The separated compounds were then combusted to CO₂ and reduced to graphite to determine ¹⁴C content by accelerator mass spectrometry (AMS). All of these methods were also compared with the standard carbonyl compound sampling method wherein a compound is derivatized with 2,4-dinitrophenylhydrazine and then separated by high-performance liquid chromatography (HPLC).

  13. Global metabolite analysis of yeast: evaluation of sample preparation methods

    DEFF Research Database (Denmark)

    Villas-Bôas, Silas Granato; Højer-Pedersen, Jesper; Åkesson, Mats Fredrik


    Sample preparation is considered one of the limiting steps in microbial metabolome analysis. Eukaryotes and prokaryotes behave very differently during the several steps of classical sample preparation methods for analysis of metabolites. Even within the eukaryote kingdom there is a vast diversity...... of cell structures that make it imprudent to blindly adopt protocols that were designed for a specific group of microorganisms. We have therefore reviewed and evaluated the whole sample preparation procedures for analysis of yeast metabolites. Our focus has been on the current needs in metabolome analysis......, which is the analysis of a large number of metabolites with very diverse chemical and physical properties. This work reports the leakage of intracellular metabolites observed during quenching yeast cells with cold methanol solution, the efficacy of six different methods for the extraction...

  14. Sampling and analysis methods for geothermal fluids and gases

    Energy Technology Data Exchange (ETDEWEB)

    Shannon, D. W.


    The data obtained for the first round robin sample collected at Mesa 6-2 wellhead, East Mesa Test Site, Imperial Valley are summarized. Test results are listed by method used for cross reference to the analytic methods section. Results obtained for radioactive isotopes present in the brine sample are tabulated. The data obtained for the second round robin sample collected from the Woolsey No. 1 first stage flash unit, San Diego Gas and Electric Niland Test Facility are presented in the same manner. Lists of the participants of the two round robins are given. Data from miscellaneous analyses are included. Summaries of values derived from the round robin raw data are presented. (MHR)

  15. A Frequency Domain Design Method For Sampled-Data Compensators

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Jannerup, Ole Erik


    A new approach to the design of a sampled-data compensator in the frequency domain is investigated. The starting point is a continuous-time compensator for the continuous-time system which satisfy specific design criteria. The new design method will graphically show how the discrete...

  16. Effect of method of sample preparation on ruminal in situ ...

    African Journals Online (AJOL)

    The objective of this study was to investigate the effect of method of sample preparation on the degradation kinetics of herbage when applying the in situ technique. Ryegrass (Lolium multiflorum cv. Midmar) was harvested at three and four weeks after cutting and fertilizing with 200 kg nitrogen (N)/ha. Freshly cut herbage ...

  17. Neonatal blood gas sampling methods | Goenka | South African ...

    African Journals Online (AJOL)

    Indwelling arterial catheters are a practical, reliable and accurate method of measuring acid-base parameters, provided they are inserted and maintained with the proper care. Capillary blood gas sampling is accurate, and a good substitute for radial 'stab' arterial puncture, avoiding many of the complications of repeated ...

  18. A General Linear Method for Equating with Small Samples (United States)

    Albano, Anthony D.


    Research on equating with small samples has shown that methods with stronger assumptions and fewer statistical estimates can lead to decreased error in the estimated equating function. This article introduces a new approach to linear observed-score equating, one which provides flexible control over how form difficulty is assumed versus estimated…
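    As context for the linear family discussed above, the textbook mean-sigma linear equating transformation matches the first two moments of the two forms. The scores below are invented, and this sketch is not the article's small-sample estimator:

```python
import statistics

def linear_equate(x_scores, y_scores):
    """Classical linear observed-score equating: map a Form X score onto the
    Form Y scale by matching means and standard deviations (mean-sigma
    method). A simple member of the general linear family, shown here
    for illustration only."""
    mx, my = statistics.mean(x_scores), statistics.mean(y_scores)
    sx, sy = statistics.pstdev(x_scores), statistics.pstdev(y_scores)
    slope = sy / sx
    intercept = my - slope * mx
    return lambda x: slope * x + intercept

# Invented raw scores on two forms; Form Y happens to run 2 points higher.
form_x = [12, 15, 14, 18, 10, 16, 13, 17]
form_y = [14, 17, 16, 20, 12, 18, 15, 19]
equate = linear_equate(form_x, form_y)
print(equate(14.0))
```

With small samples the slope and intercept are poorly estimated; the stronger-assumption methods the abstract mentions constrain (or fix) these parameters to reduce that estimation error.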

  19. Protein precipitation methods for sample pretreatment of grass pea ...

    African Journals Online (AJOL)

    Protein precipitation methods for sample pretreatment of grass pea extracts. Negussie Wodajo, Ghirma Moges, Theodros Solomon. Abstract. Bull. Chem. Soc. Ethiop. 1996, 10(2), 129-134.

  20. Sample Selected Averaging Method for Analyzing the Event Related Potential (United States)

    Taguchi, Akira; Ono, Youhei; Kimura, Tomoaki

    The event related potential (ERP) is often measured through the oddball task, in which subjects are given a “rare stimulus” and a “frequent stimulus”. Measured ERPs are analyzed by the averaging technique. The amplitude of the ERP P300 becomes large when the “rare stimulus” is given. However, measured ERPs include trials that lack the original features of the ERP, so it is necessary to reject unsuitable measured ERPs before averaging. In this paper, we propose a rejection method for unsuitable measured ERPs for the averaging technique. Moreover, we combine the proposed method with Woody's adaptive filter method.
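
    The reject-then-average idea can be sketched with a simple amplitude-threshold criterion (a generic rule for illustration, not the specific rejection method proposed in the paper; all signal parameters below are invented):

```python
import numpy as np

def average_with_rejection(trials, reject_thresh):
    """Average ERP trials, rejecting any trial whose peak absolute
    amplitude exceeds reject_thresh (a simple amplitude criterion)."""
    trials = np.asarray(trials, dtype=float)
    keep = np.max(np.abs(trials), axis=1) <= reject_thresh
    return trials[keep].mean(axis=0), int(keep.sum())

rng = np.random.default_rng(0)
t = np.linspace(0.0, 0.8, 200)                    # 800 ms epoch
p300 = 5.0 * np.exp(-((t - 0.3) ** 2) / 0.005)    # idealized P300 peak
clean = p300 + rng.normal(0.0, 1.0, size=(40, 200))
# five trials corrupted by a large offset, e.g. eye-blink artifacts
artifacts = p300 + rng.normal(0.0, 1.0, size=(5, 200)) + 50.0
erp, n_kept = average_with_rejection(np.vstack([clean, artifacts]),
                                     reject_thresh=20.0)
```

The corrupted trials are excluded, so the averaged waveform retains the P300 peak instead of being swamped by artifact offsets.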

  1. Comparison of DNA preservation methods for environmental bacterial community samples (United States)

    Gray, Michael A.; Pratte, Zoe A.; Kellogg, Christina A.


    Field collections of environmental samples, for example corals, for molecular microbial analyses present distinct challenges. The lack of laboratory facilities in remote locations is common, and preservation of microbial community DNA for later study is critical. A particular challenge is keeping samples frozen in transit. Five nucleic acid preservation methods that do not require cold storage were compared for effectiveness over time and ease of use. Mixed microbial communities of known composition were created and preserved by DNAgard™, RNAlater®, DMSO–EDTA–salt (DESS), FTA® cards, and FTA Elute® cards. Automated ribosomal intergenic spacer analysis and clone libraries were used to detect specific changes in the faux communities over weeks and months of storage. A previously known bias in FTA® cards that results in lower recovery of pure cultures of Gram-positive bacteria was also detected in mixed community samples. There appears to be a uniform bias across all five preservation methods against microorganisms with high G + C DNA. Overall, the liquid-based preservatives (DNAgard™, RNAlater®, and DESS) outperformed the card-based methods. No single liquid method clearly outperformed the others, leaving method choice to be based on experimental design, field facilities, shipping constraints, and allowable cost.

  2. Software Quality Assurance Metrics (United States)

    McRae, Kalindra A.


    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards and procedures. In software development, software quality means meeting requirements as well as a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated; the set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product: the process is measured to improve it, and the product is measured to increase quality throughout the software life cycle. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce it, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of software development can be measured. If software metrics are implemented in software development, they can save time and money and allow the organization to identify the causes of defects that have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and to report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.

  3. Development of an automated data processing method for sample to sample comparison of seized methamphetamines. (United States)

    Choe, Sanggil; Lee, Jaesin; Choi, Hyeyoung; Park, Yujin; Lee, Heesang; Pyo, Jaesung; Jo, Jiyeong; Park, Yonghoon; Choi, Hwakyung; Kim, Suncheun


    Information about sources of supply, trafficking routes, distribution patterns and conspiracy links can be obtained from methamphetamine profiling. The precursor and synthetic method used in clandestine manufacture can be estimated from the analysis of minor impurities contained in methamphetamine, and the similarity between samples can be evaluated using the peaks that appear in their chromatograms. In South Korea, methamphetamine is the most popular drug, but the total amount seized throughout the country is very small; it is therefore more important to find links between samples than to pursue the other uses of methamphetamine profiling. Many Asian countries, including Japan and South Korea, have been using the method developed by the National Research Institute of Police Science of Japan. The method uses gas chromatography with flame ionization detection (GC-FID), a DB-5 column and four internal standards, and was developed to increase the amount of impurities and minimize the amount of methamphetamine. After GC-FID analysis, the raw data have to be processed, and the data processing steps are complex and require a lot of time and effort. In this study, Microsoft Visual Basic for Applications (VBA) modules were developed to handle these data processing steps. The modules collect the results into an Excel file and then correct the retention time shift and response deviation generated during sample preparation and instrumental analysis. The developed modules were tested for performance using 10 samples from 5 different cases. The processed results were analyzed with the Pearson correlation coefficient for similarity assessment, and the correlation coefficient between two samples from the same case was more than 0.99. When the modules were applied to 131 seized methamphetamine samples, four samples from two different cases were found to have a common origin, and the chromatograms of the four samples appeared visually identical.
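
    The similarity step described above (Pearson correlation between impurity profiles) can be sketched in Python rather than VBA; the peak-area values below are invented for illustration and assume both profiles are already aligned to the same retention-time grid:

```python
import numpy as np

def pearson_similarity(profile_a, profile_b):
    """Pearson correlation coefficient between two impurity
    peak-area profiles measured on the same retention-time grid."""
    a = np.asarray(profile_a, dtype=float)
    b = np.asarray(profile_b, dtype=float)
    return float(np.corrcoef(a, b)[0, 1])

batch1 = np.array([12.1, 3.4, 0.8, 44.0, 5.5, 1.2])
# same case: slight response drift, profile shape preserved
batch1b = batch1 * 1.05 + np.array([0.1, -0.05, 0.02, 0.3, -0.1, 0.0])
unrelated = np.array([2.0, 30.5, 7.7, 1.1, 19.0, 0.4])

r_same = pearson_similarity(batch1, batch1b)   # near 1: common origin
r_diff = pearson_similarity(batch1, unrelated) # low: different origin
```

With a same-case threshold of r > 0.99 as in the abstract, only the drifted replicate of batch1 would be linked to it.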

  4. Standard methods for sampling freshwater fishes: Opportunities for international collaboration (United States)

    Bonar, Scott A.; Mercado-Silva, Norman; Hubert, Wayne A.; Beard, Douglas; Dave, Göran; Kubečka, Jan; Graeb, Brian D. S.; Lester, Nigel P.; Porath, Mark T.; Winfield, Ian J.


    With publication of Standard Methods for Sampling North American Freshwater Fishes in 2009, the American Fisheries Society (AFS) recommended standard procedures for North America. To explore interest in standardizing at intercontinental scales, a symposium attended by international specialists in freshwater fish sampling was convened at the 145th Annual AFS Meeting in Portland, Oregon, in August 2015. Participants represented all continents except Australia and Antarctica and were employed by state and federal agencies, universities, nongovernmental organizations, and consulting businesses. Currently, standardization is practiced mostly in North America and Europe. Participants described how standardization has been important for management of long-term data sets, promoting fundamental scientific understanding, and assessing efficacy of large spatial scale management strategies. Academics indicated that standardization has been useful in fisheries education because time previously used to teach how sampling methods are developed is now more devoted to diagnosis and treatment of problem fish communities. Researchers reported that standardization allowed increased sample size for method validation and calibration. Group consensus was to retain continental standards where they currently exist but to further explore international and intercontinental standardization, specifically identifying where synergies and bridges exist, and identify means to collaborate with scientists where standardization is limited but interest and need occur.

  5. Soybean yield modeling using bootstrap methods for small samples

    Energy Technology Data Exchange (ETDEWEB)

    Dalposso, G.A.; Uribe-Opazo, M.A.; Johann, J.A.


    One of the problems that occurs when working with regression models concerns sample size: since the statistical methods used in inferential analyses are asymptotic, if the sample is small the analysis may be compromised because the estimates will be biased. An alternative is the bootstrap methodology, which in its non-parametric version does not require knowing or guessing the probability distribution that generated the original sample. In this work we used a small set of soybean yield data together with physical and chemical soil properties to determine a multiple linear regression model. Bootstrap methods were used for variable selection, identification of influential points and determination of confidence intervals for the model parameters. The results showed that the bootstrap methods enabled us to select the physical and chemical soil properties that were significant in the construction of the soybean yield regression model, to construct the confidence intervals of the parameters, and to identify the points that had great influence on the estimated parameters. (Author)
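
    A minimal sketch of the case-resampling (non-parametric) bootstrap for regression coefficient confidence intervals, using hypothetical soil-property predictors rather than the authors' data or exact procedure:

```python
import numpy as np

def bootstrap_coef_ci(X, y, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap CIs for multiple linear regression
    coefficients, resampling whole cases with replacement."""
    rng = np.random.default_rng(seed)
    X1 = np.column_stack([np.ones(len(y)), X])   # add intercept column
    n = len(y)
    boots = np.empty((n_boot, X1.shape[1]))
    for b in range(n_boot):
        idx = rng.integers(0, n, n)              # resample rows
        beta, *_ = np.linalg.lstsq(X1[idx], y[idx], rcond=None)
        boots[b] = beta
    lo = np.percentile(boots, 100 * alpha / 2, axis=0)
    hi = np.percentile(boots, 100 * (1 - alpha / 2), axis=0)
    return lo, hi

# small illustrative sample: yield vs. two hypothetical soil properties
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))
y = 3.0 + 2.0 * X[:, 0] + rng.normal(0.0, 0.5, 20)
lo, hi = bootstrap_coef_ci(X, y)
```

The percentile interval needs no distributional assumption, which is the point made in the abstract for small samples.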

  6. A Novel Fast Method for Point-sampled Model Simplification

    Directory of Open Access Journals (Sweden)

    Cao Zhi


    Full Text Available A novel fast simplification method for point-sampled statue models is proposed. Simplification for 3D model reconstruction is a hot topic in the field of 3D surface construction, but it is difficult because the point clouds of many 3D models are very large, making running times very long. In this paper, a two-stage simplification method is proposed. First, a feature-preserving non-uniform simplification method for cloud points is presented, which simplifies the data set to remove redundancy while preserving the features of the model. Second, an affinity-propagation clustering method is used to classify each point as a sharp point or a simple point. The advantages of affinity propagation clustering are its message passing among data points and its fast processing speed. Together with re-sampling, it can dramatically reduce the duration of the process while keeping a low memory cost. Both theoretical analysis and experimental results show that after simplification the proposed method is efficient and the details of the surface are preserved well.

  7. The Sensitivity of Respondent-driven Sampling Method

    CERN Document Server

    Lu, Xin; Britton, Tom; Camitz, Martin; Kim, Beom Jun; Thorson, Anna; Liljeros, Fredrik


    Researchers in many scientific fields make inferences from individuals to larger groups. For many groups however, there is no list of members from which to take a random sample. Respondent-driven sampling (RDS) is a relatively new sampling methodology that circumvents this difficulty by using the social networks of the groups under study. The RDS method has been shown to provide unbiased estimates of population proportions given certain conditions. The method is now widely used in the study of HIV-related high-risk populations globally. In this paper, we test the RDS methodology by simulating RDS studies on the social networks of a large LGBT web community. The robustness of the RDS method is tested by violating, one by one, the conditions under which the method provides unbiased estimates. Results reveal that the risk of bias is large if networks are directed, or respondents choose to invite persons based on characteristics that are correlated with the study outcomes. If these two problems are absent, the RD...
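
    RDS estimators correct for the fact that well-connected individuals are oversampled through their social networks. A common estimator (the RDS-II or Volz-Heckathorn estimator) weights each respondent by the inverse of their reported network degree; the toy data below are invented and do not reproduce the paper's simulation framework:

```python
import numpy as np

def rds_proportion(trait, degree):
    """RDS-II (Volz-Heckathorn) estimate of a population proportion:
    each respondent is weighted by 1/degree to correct for the
    oversampling of well-connected individuals."""
    trait = np.asarray(trait, dtype=float)
    w = 1.0 / np.asarray(degree, dtype=float)
    return float(np.sum(w * trait) / np.sum(w))

# toy sample: the trait is concentrated in high-degree respondents,
# who are oversampled by the referral process
trait  = np.array([1, 1, 1, 0, 0, 0, 0, 0, 0, 0])
degree = np.array([20, 20, 20, 2, 2, 2, 2, 2, 2, 2])

naive = trait.mean()                   # unweighted, inflated estimate
adjusted = rds_proportion(trait, degree)
```

The degree-weighted estimate is far below the naive sample proportion, illustrating the network bias the method is designed to remove.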


    Energy Technology Data Exchange (ETDEWEB)

    Maxwell, S.; Culligan, B.; Noyes, G.


    A new rapid method for the determination of actinides in soil and sediment samples has been developed at the Savannah River Site Environmental Lab (Aiken, SC, USA) that can be used for samples up to 2 grams in emergency response situations. The actinides-in-soil method utilizes a rapid sodium hydroxide fusion, a lanthanum fluoride soil matrix removal step, and a streamlined column separation process with stacked TEVA, TRU and DGA Resin cartridges. Lanthanum was separated rapidly and effectively from Am and Cm on DGA Resin. Vacuum box technology and rapid flow rates are used to reduce analytical time. Alpha sources are prepared using cerium fluoride microprecipitation for counting by alpha spectrometry. The method showed high chemical recoveries and effective removal of interferences. This new procedure was applied to emergency soil samples received in the NRIP Emergency Response exercise administered by the National Institute of Standards and Technology (NIST) in April 2009. The actinides-in-soil results were reported within 4-5 hours with excellent quality.

  9. A direct sampling method to an inverse medium scattering problem

    KAUST Repository

    Ito, Kazufumi


    In this work we present a novel sampling method for time harmonic inverse medium scattering problems. It provides a simple tool to directly estimate the shape of the unknown scatterers (inhomogeneous media), and it is applicable even when the measured data are only available for one or two incident directions. A mathematical derivation is provided for its validation. Two- and three-dimensional numerical simulations are presented, which show that the method is accurate even with a few sets of scattered field data, computationally efficient, and very robust with respect to noises in the data. © 2012 IOP Publishing Ltd.

  10. Microextraction Methods for Preconcentration of Aluminium in Urine Samples

    Directory of Open Access Journals (Sweden)

    Farzad Farajbakhsh, Mohammad Amjadi, Jamshid Manzoori, Mohammad R. Ardalan, Abolghasem Jouyban


    Full Text Available Background: Analysis of aluminium (Al) in urine samples is required in the management of a number of diseases, including renal failure. This work presents dispersive liquid-liquid microextraction (DLLME) and ultrasound-assisted emulsification microextraction (USAEME) methods for the preconcentration of ultra-trace amounts of aluminium in human urine prior to its determination by graphite furnace atomic absorption spectrometry (GFAAS). Methods: The microextraction methods were based on complex formation of Al3+ with 8-hydroxyquinoline. The effects of various experimental parameters on the efficiencies of the methods and their optimum values were studied. Results: Under the optimal conditions, the limits of detection for USAEME-GFAAS and DLLME-GFAAS were 0.19 and 0.30 ng mL−1, respectively, and the corresponding relative standard deviations (RSD, n=5) for the determination of 40 ng mL−1 Al3+ were 5.9% and 4.9%. Conclusion: Both methods can be successfully used for the analysis of ultra-trace concentrations of Al in urine samples of dialysis patients.

  11. Testing K. Patrick Method of Psychopathy Diagnosis in Russian Sample

    Directory of Open Access Journals (Sweden)

    Atadzhykova Y.A.,


    Full Text Available The article is devoted to the development of a method for diagnosing psychopathy, or antisocial (dissocial) personality disorder. Modern researchers mostly use experiments, expert assessment, clinical interviews, or combinations of these for personality disorders, including psychopathy. However, there is a growing need for a psychopathy diagnosis method that is less labour-intensive, less expensive and more objective. One of the recently developed models of psychopathy is the triarchic conceptualization by C. Patrick, which offers a new way to operationalize and diagnose psychopathy. The authors tested this method in the Russian population, on both a common sample and a criminal offender sample consisting of individuals who have been suspected, accused or convicted of violent crimes. The subject of the current research is psychopathic traits measured by the tested method. We carried out statistical and content analyses of the data. Our study allows us to conclude that the tested Russian version of the Triarchic Psychopathy Measure is effective enough to be used for research purposes. However, further research is required in order to render this measure valid for practical use.

  12. Angoff's delta method revisited: improving DIF detection under small samples. (United States)

    Magis, David; Facon, Bruno


    Most methods for detecting differential item functioning (DIF) are suitable when the sample sizes are sufficiently large to validate the null statistical distributions. There is no guarantee, however, that they will still perform adequately when there are few respondents in the focal group or in both the reference and the focal group. Angoff's delta plot is a potentially useful alternative for small-sample DIF investigation, but it suffers from an improper DIF flagging criterion. The purpose of this paper is to improve this classification rule under mild statistical assumptions. This improvement yields a modified delta plot with an adjusted DIF flagging criterion for small samples. A simulation study was conducted to compare the modified delta plot with both the classical delta plot approach and the Mantel-Haenszel method. It is concluded that the modified delta plot is consistently less conservative and more powerful than the usual delta plot, and is also less conservative and more powerful than the Mantel-Haenszel method as long as at least one group of respondents is small. ©2011 The British Psychological Society.
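
    The delta plot rests on Angoff's delta transform, Δ = 13 + 4·Φ⁻¹(1 − p), applied to each item's proportion correct p in the reference and focal groups; items far from the major axis of the resulting scatter are flagged as potential DIF items. A sketch with invented proportions (the modified flagging criterion of the paper is not reproduced here):

```python
from statistics import NormalDist

def delta_values(p_correct):
    """Angoff delta transform: proportion correct p is mapped to
    Delta = 13 + 4 * z(1 - p), so harder items get larger deltas."""
    z = NormalDist().inv_cdf
    return [13.0 + 4.0 * z(1.0 - p) for p in p_correct]

# invented proportions correct for the same items in two groups
p_ref   = [0.90, 0.75, 0.60, 0.40]
p_focal = [0.85, 0.70, 0.52, 0.20]

d_ref, d_focal = delta_values(p_ref), delta_values(p_focal)
# Plotting (d_ref[i], d_focal[i]) gives the delta plot; points far
# from the major axis of the scatter are candidate DIF items.
```

Here the fourth item is noticeably harder for the focal group, so it would sit furthest from the major axis.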

  13. Radiochemistry methods in DOE methods for evaluating environmental and waste management samples

    Energy Technology Data Exchange (ETDEWEB)

    Fadeff, S.K.; Goheen, S.C.


    Current standard sources of radiochemistry methods are often inappropriate for use in evaluating US Department of Energy environmental and waste management (DOE/EM) samples. Examples of current sources include EPA, ASTM, Standard Methods for the Examination of Water and Wastewater, and HASL-300. Applicability of these methods is limited to specific matrices (usually water), radiation levels (usually environmental levels), and a limited number of analytes. Radiochemistry methods in DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) attempt to fill the applicability gap that exists between standard methods and those needed for DOE/EM activities. The Radiochemistry chapter in DOE Methods includes an "analysis and reporting" guidance section as well as radiochemistry methods. A basis for identifying the DOE/EM radiochemistry needs is discussed. Within this needs framework, the applicability of standard methods and targeted new methods is identified. Sources of new methods (consolidated methods from DOE laboratories and submissions from individuals) and the methods review process are discussed, and the processes involved in generating consolidated methods and editing individually submitted methods are compared. DOE Methods is a living document and continues to expand by adding various kinds of methods; radiochemistry methods are highlighted in this paper. DOE Methods is intended to be a resource for methods applicable to DOE/EM problems. Although it is intended to support DOE, the guidance and methods are not necessarily exclusive to DOE. The document is available at no cost through the Laboratory Management Division of DOE, Office of Technology Development.

  14. Sediment sampling and processing methods in Hungary, and possible improvements (United States)

    Tamas, Eniko Anna; Koch, Daniel; Varga, Gyorgy


    The importance of monitoring sediment processes is unquestionable: the sediment balance of regulated rivers suffered substantial alterations in the past century, affecting navigation, energy production, fish habitats and floodplain ecosystems alike; infiltration times to our drinking water wells have shortened, exposing them to eventual pollution events and making them vulnerable; and sediment-attached contaminants accumulate in floodplains and reservoirs, threatening our healthy environment. The changes in flood characteristics and rating curves of our rivers are regularly researched and described, involving state-of-the-art measurement methods, modeling tools and traditional statistics. Sediment processes, however, are much less known. Unlike the investigation of flow processes, sediment-related research is scarce, partly due to the outdated methodology and poor database background in this specific field. Sediment-related data, information and analyses form an important and integral part of civil engineering in relation to rivers all over the world. In relation to the second largest river of Europe, the Danube, it is widely known in the expert community, and has long been discussed at different expert forums, that the sediment balance of the river has changed drastically over the past century. Sediment monitoring on the Danube started as early as the end of the 19th century, with scattered measurements carried out. Regular sediment sampling was developed in the first half of the 20th century all along the river, with different station densities and monitoring frequencies in different countries. After the first few decades of regular sampling, the concept of (mainly industrial) development changed along the river and data needs changed as well; furthermore, the complicated and inexact methods of sampling bed load on the alluvial reach of the river were not developed further. Frequency of suspended sediment sampling is very low along the river

  15. Method for Sampling Alpha-Helical Protein Backbones

    Energy Technology Data Exchange (ETDEWEB)

    Fain, Boris; Levitt, Michael


    We present a novel technique for sampling the configurations of helical proteins. Assuming knowledge of native secondary structure, we employ assembly rules gathered from a database of existing structures to enumerate the geometrically possible 3-D arrangements of the constituent helices. We produce a library of possible folds for 25 helical protein cores. In each case the method finds significant numbers of conformations close to the native structure. In addition we assign coordinates to all atoms for 4 of the 25 proteins. In the context of database-driven exhaustive enumeration our method performs extremely well, yielding significant percentages of structures (0.02%--82%) within 6 Å of the native structure. The method's speed and efficiency make it a valuable contribution towards the goal of predicting protein structure.

  16. Comparison between powder and slices diffraction methods in teeth samples

    Energy Technology Data Exchange (ETDEWEB)

    Colaco, Marcos V.; Barroso, Regina C. [Universidade do Estado do Rio de Janeiro (IF/UERJ), RJ (Brazil). Inst. de Fisica. Dept. de Fisica Aplicada; Porto, Isabel M. [Universidade Estadual de Campinas (FOP/UNICAMP), Piracicaba, SP (Brazil). Fac. de Odontologia. Dept. de Morfologia; Gerlach, Raquel F. [Universidade de Sao Paulo (FORP/USP), Ribeirao Preto, SP (Brazil). Fac. de Odontologia. Dept. de Morfologia, Estomatologia e Fisiologia; Costa, Fanny N. [Coordenacao dos Programas de Pos-Graduacao de Engenharia (LIN/COPPE/UFRJ), RJ (Brazil). Lab. de Instrumentacao Nuclear


    Proposing different methods to obtain crystallographic information about biological materials is important, since the powder method is nondestructive while slices are an approximation of what an in vivo analysis would be. Effects of sample preparation cause differences in scattering profiles compared with the powder method. The main inorganic component of bones and teeth is a calcium phosphate mineral whose structure closely resembles hydroxyapatite (HAp). The hexagonal symmetry, however, seems to work well with the powder diffraction data, and the crystal structure of HAp is usually described in space group P63/m. Ten third molar teeth were analyzed: five teeth were separated into enamel, dentin and circumpulpal dentin powder, and five into slices. All the scattering profile measurements were carried out at the X-ray diffraction beamline (XRD1) at the National Synchrotron Light Laboratory (LNLS), Campinas, Brazil. The LNLS synchrotron light source is composed of a 1.37 GeV electron storage ring, delivering approximately 4×10^10 photons/s at 8 keV. A double-crystal Si(111) pre-monochromator, upstream of the beamline, was used to select a small energy bandwidth at 11 keV. Scattering signatures were obtained at intervals of 0.04 deg for angles from 24 deg to 52 deg. The human enamel experimental crystallite sizes obtained in this work were 30(3) nm (112 reflection) and 30(3) nm (300 reflection); these values were obtained from measurements of powdered enamel. When comparing the diffraction patterns of the enamel slices, 58(8) nm (112 reflection) and 37(7) nm (300 reflection), with those generated by the powder specimens, a few differences emerge. This work shows differences between the powder and slice methods, separating the characteristics of the sample from the influence of the method. (author)

  17. Compliance Assurance Monitoring (United States)

    Compliance assurance monitoring is intended to provide a reasonable assurance of compliance with applicable requirements under the Clean Air Act for large emission units that rely on pollution control device equipment to achieve compliance.

  18. Hand held sample tube manipulator, system and method (United States)

    Kenny, Donald V [Liberty Township, OH; Smith, Deborah L [Liberty Township, OH; Severance, Richard A [late of Columbus, OH


    A manipulator apparatus, system and method for measuring analytes present in sample tubes. The manipulator apparatus includes a housing having a central bore with an inlet end and outlet end; a plunger mechanism with at least a portion thereof slideably disposed for reciprocal movement within the central bore, the plunger mechanism having a tubular gas channel with an inlet end and an outlet end, the gas channel inlet end disposed in the same direction as said inlet end of the central bore, wherein the inlet end of said plunger mechanism is adapted for movement so as to expel a sample tube inserted in the bore at the outlet end of the housing, the inlet end of the plunger mechanism is adapted for connection to gas supply; a first seal is disposed in the housing for sealing between the central bore and the plunger mechanism; a second seal is disposed at the outlet end of the housing for sealing between the central bore and a sample tube; a holder mounted on the housing for holding the sample tube; and a biasing mechanism for returning the plunger mechanism to a starting position.

  19. Sample Size for Assessing Agreement between Two Methods of Measurement by Bland-Altman Method. (United States)

    Lu, Meng-Jie; Zhong, Wei-Hua; Liu, Yu-Xiu; Miao, Hua-Zhang; Li, Yong-Chang; Ji, Mu-Huo


    The Bland-Altman method has been widely used for assessing agreement between two methods of measurement. However, sample size estimation for it has remained an unsolved problem. We propose a new method of sample size estimation for Bland-Altman agreement assessment. In the Bland-Altman method, the conclusion on agreement is made based on the width of the confidence interval for the LOAs (limits of agreement) in comparison to a predefined clinical agreement limit. Under the theory of statistical inference, formulae for sample size estimation are derived, which depend on the pre-determined levels of α and β, the mean and the standard deviation of differences between the two measurements, and the predefined limits. With this new method, sample sizes are calculated under different parameter settings which occur frequently in method comparison studies, and Monte-Carlo simulation is used to obtain the corresponding powers. The results of the Monte-Carlo simulation showed that the achieved powers coincide with the pre-determined levels, validating the correctness of the method. This method of sample size estimation can be applied in the Bland-Altman method to assess agreement between two methods of measurement.
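
    The logic of the approach can be illustrated with a Monte-Carlo power sketch (not the authors' closed-form formulae): agreement is concluded when the 95% CIs of both LOAs lie inside the clinical limit, using the classic Bland-Altman approximation SE(LOA) ≈ sqrt(3·s²/n):

```python
import numpy as np

def ba_power(n, mu, sd, limit, n_sim=5000, seed=0):
    """Monte-Carlo power of a Bland-Altman agreement assessment:
    simulate n paired differences ~ N(mu, sd), form LOAs
    mean +/- 1.96*s, extend each LOA by 1.96*SE(LOA) with
    SE(LOA) ~= sqrt(3*s^2/n), and count the fraction of simulations
    where both CIs fall inside the clinical limit +/- `limit`."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sim):
        d = rng.normal(mu, sd, n)
        m, s = d.mean(), d.std(ddof=1)
        se = np.sqrt(3.0 * s * s / n)
        upper = m + 1.96 * s + 1.96 * se   # upper bound of upper LOA's CI
        lower = m - 1.96 * s - 1.96 * se   # lower bound of lower LOA's CI
        if upper < limit and lower > -limit:
            hits += 1
    return hits / n_sim

p50 = ba_power(50, 0.0, 1.0, 2.5)
p200 = ba_power(200, 0.0, 1.0, 2.5)   # larger n, narrower CIs, more power
```

Sweeping n until the simulated power reaches the target (e.g. 0.8) mimics the sample-size determination the abstract describes.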

  20. On the sensitivity of common gamma-index evaluation methods to MLC misalignments in Rapidarc quality assurance. (United States)

    Heilemann, G; Poppe, B; Laub, W


    In this study the effects of small systematic MLC misalignments and gravitational errors on the quality of RapidArc treatment plan delivery are investigated with respect to verification measurements with two detector arrays, and the clinical significance of the error-induced deviations is evaluated. Five prostate and six head and neck plans were modified by means of three error types: (1) both MLC banks are opened in opposing directions, resulting in larger fields; (2) both MLC banks are closed, resulting in smaller fields; and (3) both MLC banks are shifted in the same direction at lateral gantry angles to simulate the effect of gravity on the leaves. Measurements were evaluated with respect to gamma-index criteria of 3%/3 mm and 2%/2 mm. Dose in the modified plans was recalculated, and the resulting dose volume histograms for target and critical structures were compared to those of the unaltered plans. The smallest introduced leaf position deviations which fail the >90% passing criterion for a gamma-index of 2%/2 mm are, for the three error types respectively: (1) 1 mm; (2) 0.5 mm for prostate and 1.0 mm for head and neck cases; and (3) 3 mm. These errors would lead to significant changes in mean PTV dose and would not be detected with the more commonly used 3%/3 mm gamma-index criterion. A stricter gamma-index (2%/2 mm) is necessary in order to detect positional errors of the MLC. Nevertheless, the quality assurance procedure for RapidArc treatment plans must include a thorough examination of where dose discrepancies occur, and professional judgment is needed when interpreting the gamma-index analysis, since even a >90% passing rate using the 2%/2 mm gamma-index criterion does not guarantee the absence of clinically significant dose deviations.
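
    The gamma-index referred to above combines a dose-difference criterion with a distance-to-agreement criterion. A minimal 1-D, globally normalized sketch (illustrative Gaussian profiles, not the clinical array data; a brute-force search rather than an optimized implementation):

```python
import numpy as np

def gamma_1d(ref, ref_x, eval_dose, eval_x, dose_crit=0.03, dist_crit=3.0):
    """Global 1-D gamma index (e.g. 3%/3 mm): for each reference point,
    take the minimum over evaluated points of
    sqrt((dDose/dose_crit)^2 + (dx/dist_crit)^2), with dose differences
    normalized to the reference maximum (global normalization)."""
    ref = np.asarray(ref, dtype=float)
    ev = np.asarray(eval_dose, dtype=float)
    ex = np.asarray(eval_x, dtype=float)
    dmax = ref.max()
    gammas = []
    for r, x in zip(ref, ref_x):
        dd = (ev - r) / (dose_crit * dmax)
        dx = (ex - x) / dist_crit
        gammas.append(np.sqrt(dd ** 2 + dx ** 2).min())
    return np.array(gammas)

x = np.arange(0, 100, 1.0)                        # positions in mm
ref = np.exp(-((x - 50) ** 2) / (2 * 15 ** 2))    # reference profile
shifted1 = np.exp(-((x - 51) ** 2) / (2 * 15 ** 2))  # 1 mm shift
shifted5 = np.exp(-((x - 55) ** 2) / (2 * 15 ** 2))  # 5 mm shift

g1 = gamma_1d(ref, x, shifted1, x)
pass_rate = float((g1 <= 1.0).mean())   # 1 mm shift passes 3%/3 mm
g5 = gamma_1d(ref, x, shifted5, x)      # 5 mm shift fails in gradients
```

As in the study, a small shift slips through the looser criterion, while a large shift fails in the steep-gradient region.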

  1. A direct sampling method for inverse electromagnetic medium scattering

    KAUST Repository

    Ito, Kazufumi


    In this paper, we study the inverse electromagnetic medium scattering problem of estimating the support and shape of medium scatterers from scattered electric/magnetic near-field data. We shall develop a novel direct sampling method based on an analysis of electromagnetic scattering and the behavior of the fundamental solution. It is applicable to a few incident fields and needs only to compute inner products of the measured scattered field with the fundamental solutions located at sampling points. Hence, it is strictly direct, computationally very efficient and highly robust to the presence of data noise. Two- and three-dimensional numerical experiments indicate that it can provide reliable support estimates for multiple scatterers in the case of both exact and highly noisy data. © 2013 IOP Publishing Ltd.

  2. Method optimization for fecal sample collection and fecal DNA extraction. (United States)

    Mathay, Conny; Hamot, Gael; Henry, Estelle; Georges, Laura; Bellora, Camille; Lebrun, Laura; de Witt, Brian; Ammerlaan, Wim; Buschart, Anna; Wilmes, Paul; Betsou, Fay


    This is the third in a series of publications presenting formal method validation for biospecimen processing in the context of accreditation in laboratories and biobanks. We report here the optimization of a stool processing protocol validated as fit-for-purpose for downstream DNA-based analyses. Stool collection was initially optimized in terms of sample input quantity and supernatant volume using canine stool. Three DNA extraction methods (PerkinElmer MSM I®, Norgen Biotek All-In-One®, MoBio PowerMag®) and six collection container types were evaluated with human stool in terms of DNA quantity and quality: DNA yield and its reproducibility (by spectrophotometry, spectrofluorometry, and quantitative PCR), DNA purity (SPUD assay), and 16S rRNA gene sequence-based taxonomic signatures. The optimal MSM I protocol involves a 0.2 g stool sample and 1000 μL supernatant. The MSM I extraction was superior in terms of DNA quantity and quality when compared to the other two methods tested. Optimal results were obtained with plain Sarstedt tubes (without stabilizer, requiring immediate freezing and storage at -20°C or -80°C) and Genotek tubes (with stabilizer and RT storage) in terms of DNA yields (total, human, bacterial, and double-stranded) according to spectrophotometry and spectrofluorometry, with low yield variability and good DNA purity. No inhibitors were identified at 25 ng/μL. The protocol was reproducible in terms of DNA yield among different stool aliquots. We validated a stool collection method suitable for downstream DNA metagenomic analysis. DNA extraction with the MSM I method using Genotek tubes was considered optimal, offering simple logistics for collection and shipment and the possibility of automation. Laboratories and biobanks should ensure protocol conditions are systematically recorded in the scope of accreditation.

  3. Empirical comparison of neutron activation sample analysis methods (United States)

    Gillenwalters, Elizabeth

    The U.S. Geological Survey (USGS) operates a research reactor used mainly for neutron activation of samples, which are then shipped to industrial customers. Accurate nuclide identification and activity determination are crucial to remain in compliance with Code of Federal Regulations guidelines. This facility utilized a Canberra high-purity germanium detector (HPGe) coupled with Canberra Genie(TM) 2000 (G2K) software for gamma spectroscopy. This study analyzed the current method of nuclide identification and activity determination of neutron-activated materials used by the USGS reactor staff and made recommendations to improve it. Additionally, the analysis of attenuators, the effect of detector dead time on nuclide identification, and the validity of activity determination assumptions were investigated. The current method of activity determination used the G2K software to obtain the ratio of activity per identified nuclide. This determination was performed without geometrically appropriate efficiency calibration curves. The ratio of activity per nuclide was used in conjunction with an overall exposure rate in mR/h obtained via a Fluke Biomedical hand-held ion chamber. The overall exposure rate was divided into individual nuclide amounts based on the G2K nuclide ratios. A gamma energy of 1 MeV and a gamma yield of 100% were assumed for all samples. Using the gamma assumption and nuclide ratios, a calculation was performed to determine total sample activity in µCi (microcuries). An alternative method was proposed that would eliminate the use of exposure rate and rely solely on the G2K software capabilities. The G2K software was energy and efficiency calibrated, with efficiency curves developed for multiple geometries. The USGS reactor staff were trained to load appropriate calibration data into the G2K software prior to sample analysis. Comparison of the current and proposed methods demonstrated that the activity value calculated with the 1 MeV
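    The ratio-apportionment step described above (one overall exposure-rate reading divided among nuclides according to their G2K activity fractions) reduces to simple arithmetic, sketched below. The nuclide list, fractions, and especially the mR/h-to-µCi conversion factor are hypothetical placeholders, not values from the study:

```python
# Sketch of the ratio-apportionment step described above: G2K reports
# relative activity fractions per identified nuclide, and a single
# hand-held exposure-rate reading is divided among them. The 1 MeV /
# 100%-yield assumption enters through the conversion factor, whose value
# here is an illustrative placeholder, not one from the study.

EXPOSURE_RATE_MR_H = 12.0          # overall ion-chamber reading, mR/h (hypothetical)
G2K_ACTIVITY_FRACTIONS = {         # relative activities reported by G2K (hypothetical)
    "Na-24": 0.60,
    "K-42": 0.25,
    "Mn-56": 0.15,
}
UCI_PER_MR_H = 3.5                 # assumed uCi per mR/h for a 1 MeV, 100%-yield
                                   # gamma at the measurement geometry (illustrative)

def apportion(exposure_mr_h, fractions, uci_per_mr_h):
    """Split one exposure-rate reading into per-nuclide activities (uCi)."""
    total_fraction = sum(fractions.values())
    total_uci = exposure_mr_h * uci_per_mr_h
    return {n: total_uci * f / total_fraction for n, f in fractions.items()}

activities = apportion(EXPOSURE_RATE_MR_H, G2K_ACTIVITY_FRACTIONS, UCI_PER_MR_H)
total_activity_uci = sum(activities.values())
```

The proposed alternative would drop `UCI_PER_MR_H` entirely and take per-nuclide activities directly from an efficiency-calibrated G2K analysis.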

  4. Generalized Jones matrix method for homogeneous biaxial samples. (United States)

    Ortega-Quijano, Noé; Fade, Julien; Alouini, Mehdi


    The generalized Jones matrix (GJM) is a recently introduced tool to describe linear transformations of three-dimensional light fields. Based on this framework, a specific method for obtaining the GJM of uniaxial anisotropic media was recently presented. However, the GJM of biaxial media had not been tackled so far, as the previous method made use of a simplified rotation matrix that lacks a degree of freedom in the three-dimensional rotation and is thus not suitable for calculating the GJM of biaxial media. In this work we propose a general method to derive the GJM of arbitrarily oriented homogeneous biaxial media. It is based on the differential generalized Jones matrix (dGJM), which is the three-dimensional counterpart of the conventional differential Jones matrix. We show that the dGJM provides a simple and elegant way to describe uniaxial and biaxial media, with the capacity to model multiple simultaneous optical effects. The practical usefulness of this method is illustrated by the GJM modeling of the polarimetric properties of a negative uniaxial KDP crystal and a biaxial KTP crystal for any three-dimensional sample orientation. The results show that this method constitutes an advantageous and straightforward way to model biaxial media, which are increasingly relevant to many applications.
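    In the differential picture the abstract describes, the GJM of a homogeneous medium follows from its differential counterpart by a matrix exponential, in direct analogy with conventional differential Jones calculus in two dimensions. The notation below is illustrative, not taken from the paper:

```latex
% For a homogeneous medium traversed along z, the 3x3 generalized Jones
% matrix G(z) is generated by the (z-independent) differential GJM g:
\frac{d\mathbf{G}(z)}{dz} = \mathbf{g}\,\mathbf{G}(z)
\quad\Longrightarrow\quad
\mathbf{G}(z) = \exp(\mathbf{g}\,z), \qquad \mathbf{G}(0) = \mathbf{I},
% so simultaneous optical effects combine additively in g even though the
% corresponding macroscopic matrices G do not commute in general.
```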

  5. Methods of Human Body Odor Sampling: The Effect of Freezing

    National Research Council Canada - National Science Library

    Lenochova, Pavlina; Roberts, S. Craig; Havlicek, Jan

    Body odor sampling is an essential tool in human chemical ecology research. However, methodologies of individual studies vary widely in terms of sampling material, length of sampling, and sample processing...

  6. Verification of spectrophotometric method for nitrate analysis in water samples (United States)

    Kurniawati, Puji; Gusrianti, Reny; Dwisiwi, Bledug Bernanti; Purbaningtias, Tri Esti; Wiyantoko, Bayu


    The aim of this research was to verify a spectrophotometric method for analyzing nitrate in water samples, APHA 2012 Section 4500-NO3- B. The verification parameters were linearity, method detection limit, limit of quantitation, limit of linearity, accuracy, and precision. Linearity was assessed using 0 to 50 mg/L nitrate standard solutions; the correlation coefficient of the calibration linear regression was 0.9981. The method detection limit (MDL) was 0.1294 mg/L and the limit of quantitation (LOQ) was 0.4117 mg/L. The limit of linearity (LOL) was 50 mg/L, and nitrate concentrations from 10 to 50 mg/L were linear at a 99% confidence level. Accuracy, determined from the recovery value, was 109.1907%. Precision, expressed as the percent relative standard deviation (%RSD) of repeatability, was 1.0886%. The tested performance criteria showed that the method was verified under the laboratory conditions.
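    The verification statistics named above (calibration linearity, recovery, and %RSD of repeatability) are standard calculations. The sketch below uses invented calibration and replicate data, not the study's measurements:

```python
# Standard method-verification statistics, computed with ordinary least
# squares on a synthetic nitrate calibration series. All numbers are
# invented for illustration; they are not the study's data.
import statistics

def linear_fit(x, y):
    """Least-squares slope, intercept, and correlation coefficient r."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    slope = sxy / sxx
    return slope, my - slope * mx, sxy / (sxx * syy) ** 0.5

def recovery_percent(measured, spiked):
    """Accuracy expressed as percent recovery of a known spiked amount."""
    return 100.0 * measured / spiked

def rsd_percent(replicates):
    """Precision as percent relative standard deviation (repeatability)."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Hypothetical calibration: 0-50 mg/L nitrate standards vs. absorbance.
conc = [0, 10, 20, 30, 40, 50]
absorb = [0.002, 0.101, 0.198, 0.304, 0.399, 0.501]
slope, intercept, r = linear_fit(conc, absorb)

rec = recovery_percent(measured=10.9, spiked=10.0)   # recovery of a 10 mg/L spike
rsd = rsd_percent([10.1, 10.2, 10.0, 10.1, 10.2])    # repeatability series, mg/L
```

MDL and LOQ conventions vary (e.g., a Student's-t multiple of the replicate standard deviation for MDL); the abstract does not state which convention was used, so none is sketched here.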

  7. Methods to maximise recovery of environmental DNA from water samples.

    Directory of Open Access Journals (Sweden)

    Rheyda Hinlo

    Full Text Available The environmental DNA (eDNA) method is a detection technique that is rapidly gaining credibility as a sensitive tool useful in the surveillance and monitoring of invasive and threatened species. Because eDNA analysis often deals with small quantities of short and degraded DNA fragments, methods that maximize eDNA recovery are required to increase detectability. In this study, we performed experiments at different stages of the eDNA analysis to show which combinations of methods give the best recovery rate for eDNA. Using Oriental weatherloach (Misgurnus anguillicaudatus) as a study species, we show that various combinations of DNA capture, preservation and extraction methods can significantly affect DNA yield. Filtration using cellulose nitrate filter paper preserved in ethanol or stored in a -20°C freezer and extracted with the Qiagen DNeasy kit outperformed other combinations in terms of cost and efficiency of DNA recovery. Our results support the recommendation to filter water samples within 24 hours, but if this is not possible, our results suggest that refrigeration may be a better option than freezing for short-term storage (i.e., 3-5 days). This information is useful in designing eDNA detection of low-density invasive or threatened species, where small variations in DNA recovery can signify the difference between detection success or failure.

  8. Methods to maximise recovery of environmental DNA from water samples. (United States)

    Hinlo, Rheyda; Gleeson, Dianne; Lintermans, Mark; Furlan, Elise


    The environmental DNA (eDNA) method is a detection technique that is rapidly gaining credibility as a sensitive tool useful in the surveillance and monitoring of invasive and threatened species. Because eDNA analysis often deals with small quantities of short and degraded DNA fragments, methods that maximize eDNA recovery are required to increase detectability. In this study, we performed experiments at different stages of the eDNA analysis to show which combinations of methods give the best recovery rate for eDNA. Using Oriental weatherloach (Misgurnus anguillicaudatus) as a study species, we show that various combinations of DNA capture, preservation and extraction methods can significantly affect DNA yield. Filtration using cellulose nitrate filter paper preserved in ethanol or stored in a -20°C freezer and extracted with the Qiagen DNeasy kit outperformed other combinations in terms of cost and efficiency of DNA recovery. Our results support the recommendation to filter water samples within 24 hours, but if this is not possible, our results suggest that refrigeration may be a better option than freezing for short-term storage (i.e., 3-5 days). This information is useful in designing eDNA detection of low-density invasive or threatened species, where small variations in DNA recovery can signify the difference between detection success or failure.

  9. The curvHDR method for gating flow cytometry samples

    Directory of Open Access Journals (Sweden)

    Wand Matthew P


    Full Text Available Abstract Background High-throughput flow cytometry experiments produce hundreds of large multivariate samples of cellular characteristics. These samples require specialized processing to obtain clinically meaningful measurements. A major component of this processing is a form of cell subsetting known as gating. Manual gating is time-consuming and subjective. Good automatic and semi-automatic gating algorithms are very beneficial to high-throughput flow cytometry. Results We develop a statistical procedure, named curvHDR, for automatic and semi-automatic gating. The method combines the notions of significant high negative curvature regions and highest density regions and has the ability to adapt well to human-perceived gates. The underlying principles apply in arbitrary dimension, although we focus on dimensions up to three. Accompanying software, compatible with contemporary flow cytometry informatics, is developed. Conclusion The method is seen to adapt well to nuances in the data and, to a reasonable extent, match human perception of useful gates. It offers large savings in human labour when processing high-throughput flow cytometry data whilst retaining a good degree of efficacy.

  10. Field evaluation of broiler gait score using different sampling methods

    Directory of Open Access Journals (Sweden)

    AFS Cordeiro


    Full Text Available Brazil is today the world's largest broiler meat exporter; however, in order to keep this position, it must comply with welfare regulations while maintaining low production costs. Locomotion problems restrain bird movements, limiting their access to drinking and feeding equipment, and therefore their survival and productivity. The objective of this study was to evaluate locomotion deficiency in broiler chickens reared under stressful temperature conditions using three different sampling methods on birds of different ages. The experiment consisted in determining the gait score of 28-, 35-, 42- and 49-day-old broilers using three known gait-scoring methods: in M1, birds were randomly selected, enclosed in a circle, and then stimulated to walk out of the circle; in M2, ten birds were randomly selected and gait-scored; and in M3, birds were randomly selected, enclosed in a circle, and then observed while walking away from the circle without any stimulus to walk. Environmental temperature, relative humidity, and light intensity inside the poultry houses were recorded. No evidence of an interaction between scoring method and age was found; however, both method and age influenced gait score. Gait score was lower at 28 days of age. Evaluation of ten randomly selected birds within the house was the method that produced the least reliable results. Gait scores obtained when birds were stimulated to walk were lower than when they were not stimulated, independently of age. The gait scores obtained with the three tested methods and ages were higher than those considered acceptable; the highest frequency of normal gait score (0) represented 50% of the flock. These results may be related to heat stress during rearing. Average gait score increased with average ambient temperature, relative humidity, and light intensity. The evaluation of gait score to detect locomotion problems in broilers under rearing conditions seems subjective and

  11. Microbiological Test Data - Assuring Data Integrity. (United States)

    Tidswell, Edward Charles; Sandle, Tim


    Marketed drugs and devices possess specifications, including critical microbiological quality attributes, intended to assure efficacy and patient safety. These attributes are legislated requirements intended to protect the recipient patient. Sampling, microbiological testing, and interpretation of data for final products, raw materials and intermediates all contribute to a cohesive assessment in the assurance of finished product quality. Traditional culture-based microbiological methods possess inherent and unavoidable variability, recognized by the compendia, which might lead to erroneous conclusions pertaining to product quality. Such variability has been associated, and intrinsically linked, with data integrity issues; manufacturers have subsequently been encouraged by regulatory authorities to introduce multiple microbiologists or checks to prevent such issues. Understanding microbiological variability is essential so that genuine data integrity issues can be identified. Furthermore, a range of meaningful preventative strategies is feasible beyond increasing the capacity of the quality control microbiology laboratory. This short review describes the legislative requirements, inherent microbiological variability, and realistic actions and activities that genuinely assure patient safety. Copyright © 2017, Parenteral Drug Association.

  12. Clustering Methods with Qualitative Data: A Mixed Methods Approach for Prevention Research with Small Samples (United States)

    Henry, David; Dymnicki, Allison B.; Mohatt, Nathaniel; Allen, James; Kelly, James G.


    Qualitative methods potentially add depth to prevention research, but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data, but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-Means clustering, and latent class analysis produced similar levels of accuracy with binary data, and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a “real-world” example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities. PMID:25946969

  13. Clustering Methods with Qualitative Data: a Mixed-Methods Approach for Prevention Research with Small Samples. (United States)

    Henry, David; Dymnicki, Allison B; Mohatt, Nathaniel; Allen, James; Kelly, James G


    Qualitative methods potentially add depth to prevention research but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed-methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed-methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-means clustering, and latent class analysis produced similar levels of accuracy with binary data and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a "real-world" example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities.
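    One of the clustering approaches compared above, K-means applied to binary code profiles (1 = code present in an interview, 0 = absent), can be sketched in a few lines. The profiles, the Hamming-distance metric, and the majority-vote centroid update are illustrative choices for handling binary data, not the studies' own data or exact settings:

```python
# Minimal K-means-style clustering of binary coded-interview profiles,
# using Hamming distance and majority-vote centroid updates. The profiles
# are invented: two obvious groups of codes plus some noise bits.

def hamming(a, b):
    """Number of positions at which two binary vectors differ."""
    return sum(x != y for x, y in zip(a, b))

def kmeans_binary(profiles, k, iters=20):
    """Cluster binary vectors; centroids are per-code majority votes."""
    centroids = [list(p) for p in profiles[:k]]   # deterministic init
    assign = [0] * len(profiles)
    for _ in range(iters):
        assign = [min(range(k), key=lambda c: hamming(p, centroids[c]))
                  for p in profiles]
        for c in range(k):
            members = [p for p, a in zip(profiles, assign) if a == c]
            if members:
                centroids[c] = [int(sum(col) * 2 >= len(members))
                                for col in zip(*members)]
    return assign

profiles = [
    [1, 1, 1, 0, 0, 0],   # participants mostly using codes 1-3
    [1, 1, 0, 0, 0, 0],
    [1, 1, 1, 1, 0, 0],
    [0, 0, 0, 1, 1, 1],   # participants mostly using codes 4-6
    [0, 0, 1, 1, 1, 1],
    [0, 0, 0, 0, 1, 1],
]
labels = kmeans_binary(profiles, k=2)
```

With such data the assignment converges in a couple of iterations to the two intended groups; hierarchical clustering or latent class analysis would be substituted at the `kmeans_binary` step.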

  14. Quality assurance in biomarker measurement. (United States)

    Aitio, A; Apostoli, P


    Quality assurance (QA) concerns the validity of all the analytical processes, from collection of the samples to interpretation of the results. It is not an abstract concept but must be adapted to different situations, such as different exposure levels, different analytical methods, and the context of use (risk assessment procedures, research, routine determinations). The main requirements in QA programmes concern the control of all known sources of preanalytical and analytical variation, while the instruments with which adequate QA can be implemented are certified materials and quality control programmes (quality manual, internal and external quality controls). Another important concept in QA is that measurements must be placed at different metrological levels: at the highest level are the definitive and reference methods used for assessing the accuracy of routine methods. QA programmes should enable a grading of biomarkers (from experimental-only to fully evaluated) and of laboratories, in order to identify the significance of the test and to assess the level at which a laboratory can operate.

  15. Method of remotely characterizing thermal properties of a sample (United States)

    Heyman, Joseph S. (Inventor); Heath, D. Michele (Inventor); Welch, Christopher (Inventor); Winfree, William P. (Inventor); Miller, William E. (Inventor)


    A sample in a wind tunnel is radiated from a thermal energy source outside of the wind tunnel. A thermal imager system, also located outside of the wind tunnel, reads surface radiations from the sample as a function of time. The produced thermal images are characteristic of the heat transferred from the sample to the flow across the sample. In turn, the measured rates of heat loss of the sample are characteristic of the flow and the sample.

  16. Passive sampling methods for contaminated sediments: risk assessment and management. (United States)

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F


    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and improving weight-of-evidence based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitor remedy effectiveness, and evaluate risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed.

  17. Passive sampling methods for contaminated sediments: Risk assessment and management (United States)

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F


    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and improving weight-of-evidence based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitor remedy effectiveness, and evaluate risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed.

  18. A comparison of methods for representing sparsely sampled random quantities.

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente Jose; Swiler, Laura Painton; Urbina, Angel; Mullins, Joshua


    This report discusses the treatment of uncertainties stemming from relatively few samples of random quantities. The importance of this topic extends beyond experimental data uncertainty to situations involving uncertainty in model calibration, validation, and prediction. With very sparse data samples it is not practical to have a goal of accurately estimating the underlying probability density function (PDF). Rather, a pragmatic goal is that the uncertainty representation should be conservative, so as to bound a specified percentile range of the actual PDF, say the range between the 0.025 and 0.975 percentiles, with reasonable reliability. A second, opposing objective is that the representation not be overly conservative, i.e., that it only minimally over-estimate the desired percentile range of the actual PDF. The presence of these two opposing objectives makes the sparse-data uncertainty representation problem interesting and difficult. In this report, five uncertainty representation techniques are characterized for their performance on twenty-one test problems (thousands of trials for each problem) according to these two opposing objectives and other performance measures. Two of the methods, statistical tolerance intervals and a kernel density approach specifically developed for handling sparse data, exhibit significantly better overall performance than the others.
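    The bounding goal described above has a classical distribution-free expression, Wilks' formula for the coverage of the sample range. The sketch below illustrates the tolerance-interval idea the report evaluates, not the report's own implementation:

```python
# Distribution-free (Wilks) two-sided tolerance interval: the probability
# that [min, max] of n i.i.d. samples covers at least a proportion beta of
# the underlying distribution is 1 - n*beta**(n-1) + (n-1)*beta**n,
# regardless of the (continuous) PDF. This sketches the tolerance-interval
# idea evaluated in the report; it is not the report's implementation.

def coverage_confidence(n, beta):
    """Confidence that the sample range bounds at least `beta` of the PDF."""
    return 1.0 - n * beta ** (n - 1) + (n - 1) * beta ** n

def samples_needed(beta, confidence):
    """Smallest n whose [min, max] attains the requested confidence."""
    n = 2
    while coverage_confidence(n, beta) < confidence:
        n += 1
    return n

# With very sparse data the guarantee is weak: ten samples bound the
# central 95% of the PDF (the 0.025-0.975 range) with only ~8.6%
# confidence, while 93 samples are needed for 95% confidence.
conf_10 = coverage_confidence(10, 0.95)
n_95_95 = samples_needed(0.95, 0.95)
```

This quantifies why the report treats sparse-sample bounding as a reliability question rather than a PDF-estimation question.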

  19. An adaptive sampling and windowing interrogation method in PIV (United States)

    Theunissen, R.; Scarano, F.; Riethmuller, M. L.


    This study proposes a cross-correlation based PIV image interrogation algorithm that adapts the number of interrogation windows and their size to the image properties and to the flow conditions. The proposed methodology releases the constraint of uniform sampling rate (Cartesian mesh) and spatial resolution (uniform window size) commonly adopted in PIV interrogation. Especially in non-optimal experimental conditions where the flow seeding is inhomogeneous, this constraint leads either to loss of robustness (too few particles per window) or to loss of measurement precision (too large or too coarsely spaced interrogation windows). Two criteria are investigated, namely adaptation to the local signal content in the image and adaptation to local flow conditions. The implementation of the adaptive criteria within a recursive interrogation method is described. The location and size of the interrogation windows are locally adapted to the image signal (i.e., seeding density). The local window spacing (commonly set by the overlap factor) is also related to the spatial variation of the velocity field. The viability of the method is illustrated in two experimental cases where the limitation of a uniform interrogation approach appears clearly: a shock-wave/boundary-layer interaction and an aircraft vortex wake. The examples show that the spatial sampling rate can be adapted to the actual flow features and that the interrogation window size can be arranged so as to follow the spatial distribution of seeding particle images and flow velocity fluctuations. In comparison with the uniform interrogation technique, the spatial resolution is locally enhanced, while in poorly seeded regions the robustness of the analysis (signal-to-noise ratio) is kept almost constant.
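    The cross-correlation step at the heart of any PIV interrogation, adaptive or not, can be illustrated with a toy direct (non-FFT) implementation: the displacement of particle images between two windows is the shift that maximizes their cross-correlation. The window size, particle positions, and imposed shift below are invented for the example:

```python
# Toy direct cross-correlation of two PIV interrogation windows. The
# adaptive method above varies window size and spacing; this example uses
# a single fixed 8x8 window and a synthetic 2-pixel / 1-pixel shift.

def cross_correlate(w1, w2, max_shift=3):
    """Return (dy, dx) maximizing the direct cross-correlation of w1, w2."""
    n = len(w1)
    best, best_shift = None, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            s = 0.0
            for i in range(n):
                for j in range(n):
                    ii, jj = i + dy, j + dx
                    if 0 <= ii < n and 0 <= jj < n:
                        s += w1[i][j] * w2[ii][jj]
            if best is None or s > best:
                best, best_shift = s, (dy, dx)
    return best_shift

n = 8
frame1 = [[0.0] * n for _ in range(n)]
for (i, j) in [(2, 2), (3, 5), (5, 3)]:
    frame1[i][j] = 1.0                 # three "particle images"

frame2 = [[0.0] * n for _ in range(n)]
for (i, j) in [(2, 2), (3, 5), (5, 3)]:
    frame2[i + 1][j + 2] = 1.0         # same particles displaced by (1, 2)

shift = cross_correlate(frame1, frame2)
```

A production interrogator would do this via FFTs with sub-pixel peak fitting; the adaptive scheme additionally chooses `n` and the window spacing per location.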

  20. Data quality assurance in monitoring of wastewater quality: Univariate on-line and off-line methods

    DEFF Research Database (Denmark)

    Alferes, J.; Poirier, P.; Lamaire-Chad, C.

    A procedure is presented that combines univariate off-line and on-line methods to assess water quality sensors and to detect and replace doubtful data. While the off-line concept uses control charts for quality control, the on-line methods aim at outlier and fault detection by using autoregressive models...

  1. Geothermal water and gas: collected methods for sampling and analysis. Comment issue. [Compilation of methods

    Energy Technology Data Exchange (ETDEWEB)

    Douglas, J.G.; Serne, R.J.; Shannon, D.W.; Woodruff, E.M.


    A collection of methods for sampling and analysis of geothermal fluids and gases is presented. Compilations of analytic options for constituents in water and gases are given. Also, a survey of published methods of laboratory water analysis is included. It is stated that no recommendation of the applicability of the methods to geothermal brines should be assumed since the intent of the table is to encourage and solicit comments and discussion leading to recommended analytical procedures for geothermal waters and research. (WHK)

  2. Passive sampling methods for contaminated sediments: Scientific rationale supporting use of freely dissolved concentrations (United States)

    Mayer, Philipp; Parkerton, Thomas F; Adams, Rachel G; Cargill, John G; Gan, Jay; Gouin, Todd; Gschwend, Philip M; Hawthorne, Steven B; Helm, Paul; Witt, Gesine; You, Jing; Escher, Beate I


    Passive sampling methods (PSMs) allow the quantification of the freely dissolved concentration (Cfree) of an organic contaminant even in complex matrices such as sediments. Cfree is directly related to a contaminant's chemical activity, which drives spontaneous processes including diffusive uptake into benthic organisms and exchange with the overlying water column. Consequently, Cfree provides a more relevant dose metric than total sediment concentration. Recent developments in PSMs have significantly improved our ability to reliably measure even very low levels of Cfree. Application of PSMs in sediments is preferably conducted in the equilibrium regime, where freely dissolved concentrations in the sediment are well-linked to the measured concentration in the sampler via analyte-specific partition ratios. The equilibrium condition can then be assured by measuring a time series or a single time point using passive samplers with different surface to volume ratios. Sampling in the kinetic regime is also possible and generally involves the application of performance reference compounds for the calibration. Based on previous research on hydrophobic organic contaminants, it is concluded that Cfree allows a direct assessment of 1) contaminant exchange and equilibrium status between sediment and overlying water, 2) benthic bioaccumulation, and 3) potential toxicity to benthic organisms. Thus, the use of PSMs to measure Cfree provides an improved basis for the mechanistic understanding of fate and transport processes in sediments and has the potential to significantly improve risk assessment and management of contaminated sediments. Integr Environ Assess Manag 2014;10:197–209. © 2014 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of SETAC. PMID:24288295
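    The equilibrium-regime relation described above, where the sampler concentration is linked to Cfree via an analyte-specific partition ratio, reduces to a one-line calculation. The analyte, concentrations, and partition ratio below are hypothetical placeholders, not values from the paper:

```python
# Equilibrium-regime passive sampling: once the sampler has equilibrated
# with sediment porewater, Cfree follows from the measured sampler
# concentration and a sampler-water partition ratio Ksw. All numbers are
# illustrative placeholders, not values from the paper.

def cfree_equilibrium(c_sampler_ng_g, k_sw_ml_g):
    """Cfree (ng/mL) = sampler concentration / sampler-water partition ratio."""
    return c_sampler_ng_g / k_sw_ml_g

# Hypothetical hydrophobic analyte: 5000 ng/g in the polymer, log Ksw = 5.
c_free = cfree_equilibrium(c_sampler_ng_g=5000.0, k_sw_ml_g=10 ** 5)
```

In the kinetic regime the same quotient is divided by a fractional-equilibration factor calibrated from performance reference compound dissipation, as the abstract notes.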

  3. Employer-Led Quality Assurance (United States)

    Tyszko, Jason A.


    Recent criticism of higher education accreditation has prompted calls for reform and sparked interest in piloting alternative quality assurance methods that better address student learning and employment outcomes. Although this debate has brought much needed attention to improving the outcomes of graduates and safeguarding federal investment in…

  4. A simple capacitive method to evaluate ethanol fuel samples (United States)

    Vello, Tatiana P.; de Oliveira, Rafael F.; Silva, Gustavo O.; de Camargo, Davi H. S.; Bufon, Carlos C. B.


    Ethanol is a biofuel used worldwide. However, the presence of excessive water, introduced either during the distillation process or by fraudulent adulteration, is a major concern in the use of ethanol fuel. High water levels may cause engine malfunction, in addition to being considered illegal. Here, we describe the development of a simple, fast and accurate platform based on nanostructured sensors to evaluate ethanol samples. The device fabrication is facile, based on standard microfabrication and thin-film deposition methods. The sensor operation relies on capacitance measurements employing a parallel-plate capacitor containing a conformal aluminum oxide (Al2O3) thin layer (15 nm). The sensor operates over the full range of water concentrations, i.e., from approximately 0% to 100% vol. of water in ethanol, with water traces detectable down to 0.5% vol. These characteristics make the proposed device unique with respect to other platforms. Finally, the good agreement between the sensor response and analyses performed by gas chromatography of ethanol biofuel endorses the accuracy of the proposed method. Due to its full operation range, the reported sensor has the technological potential for use as a point-of-care analytical tool at gas stations or in the chemical, pharmaceutical, and beverage industries, to mention a few.
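    A toy model, not the paper's Al2O3 device physics, shows why capacitance tracks water content: water's relative permittivity (~80) far exceeds ethanol's (~25), so the effective permittivity of the liquid filling a parallel-plate cell, and hence its capacitance, rises with water fraction. The linear mixing rule and the cell geometry here are illustrative assumptions:

```python
# Toy parallel-plate model of a capacitive water-in-ethanol sensor:
# C = eps0 * eps_r * A / d, with a simple linear permittivity mixing rule.
# Geometry and mixing rule are illustrative simplifications, not the
# paper's nanostructured Al2O3 device model.

EPS0 = 8.854e-12        # vacuum permittivity, F/m
EPS_ETHANOL = 25.0      # approximate relative permittivity of ethanol
EPS_WATER = 80.0        # approximate relative permittivity of water

def capacitance(water_frac, area_m2=1e-4, gap_m=1e-4):
    """Capacitance (F) of a plate cell filled with an ethanol-water mix."""
    eps_r = (1 - water_frac) * EPS_ETHANOL + water_frac * EPS_WATER
    return EPS0 * eps_r * area_m2 / gap_m

c_pure_ethanol = capacitance(0.0)
c_pure_water = capacitance(1.0)
```

Even this crude model gives a monotonic, roughly threefold capacitance swing across the 0-100% vol. range, which is why a capacitance readout can cover the sensor's full operating range.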

  5. Martian Radiative Transfer Modeling Using the Optimal Spectral Sampling Method (United States)

    Eluszkiewicz, J.; Cady-Pereira, K.; Uymin, G.; Moncet, J.-L.


    The large volume of existing and planned infrared observations of Mars has prompted the development of a new martian radiative transfer model that could be used in retrievals of atmospheric and surface properties. The model is based on the Optimal Spectral Sampling (OSS) method [1]. The method is a fast and accurate monochromatic technique applicable to a wide range of remote sensing platforms (from microwave to UV) and was originally developed for the real-time processing of infrared and microwave data acquired by instruments aboard the satellites forming part of the next-generation global weather satellite system NPOESS (National Polar-orbiting Operational Environmental Satellite System) [2]. As part of our ongoing research on the radiative properties of the martian polar caps, we have begun development of a martian OSS model with the goal of using it to perform the self-consistent atmospheric corrections necessary to retrieve cap emissivity from Thermal Emission Spectrometer (TES) spectra. While the caps will provide the initial focus area for applying the new model, it is hoped that the model will be of interest to the wider Mars remote sensing community.

  6. Methods and quality assurance in environmental medicine. Formation of a RKI-Commission; Methoden und Qualitaetssicherung in der Umweltmedizin. Einrichtung einer Umweltmedizin-Kommission am RKI

    Energy Technology Data Exchange (ETDEWEB)

    Eis, D. [Bundesgesundheitsamt, Berlin (Germany). Robert-Koch-Institut


    An almost bewildering number of widely differing methods and techniques, often not validated, are applied in the field of environmental medicine, frequently with questionable indication, to answer questions regarding exposure assessment, diagnosis, treatment, counselling and prevention. Quality control within the field of environmental medicine is therefore quite problematic. A primary goal of the newly formed RKI commission 'Methods and Quality Assurance in Environmental Medicine' is to form a panel of experts in the field who evaluate the situation and generate consensus documents containing the respective recommendations. In this way the commission will contribute to standardization of, and agreement on, appropriate methods and procedures and their correct application in the practice of environmental medicine. It is hoped that the commission will also achieve a stronger, more consistent use of evidence-based medicine and improve the quality of the structures, processes and results of research and practice in this field. The committee will initially deal with clinical environmental medicine, where the largest quality-assurance problems are seen. In this context the commission will look at the problem areas of environmental-medical outpatient units and environmental clinics. The work of the commission will be supported by the newly formed Documentation and Evaluation Center for Methods in Environmental Medicine (Zentrale Erfassungs- und Bewertungsstelle fuer umweltmedizinische Methoden, ZEBUM) at the Robert Koch Institute. (orig.)

  7. Sampling trace organic compounds in water: a comparison of a continuous active sampler to continuous passive and discrete sampling methods (United States)

    Coes, Alissa L.; Paretti, Nicholas V.; Foreman, William T.; Iverson, Jana L.; Alvarez, David A.


    A continuous active sampling method was compared to continuous passive and discrete sampling methods for the sampling of trace organic compounds (TOCs) in water. Results from each method are compared and contrasted in order to provide information for future investigators to use while selecting appropriate sampling methods for their research. The continuous low-level aquatic monitoring (CLAM) sampler (C.I.Agent® Storm-Water Solutions) is a submersible, low flow-rate sampler, that continuously draws water through solid-phase extraction media. CLAM samplers were deployed at two wastewater-dominated stream field sites in conjunction with the deployment of polar organic chemical integrative samplers (POCIS) and the collection of discrete (grab) water samples. All samples were analyzed for a suite of 69 TOCs. The CLAM and POCIS samples represent time-integrated samples that accumulate the TOCs present in the water over the deployment period (19–23 h for CLAM and 29 days for POCIS); the discrete samples represent only the TOCs present in the water at the time and place of sampling. Non-metric multi-dimensional scaling and cluster analysis were used to examine patterns in both TOC detections and relative concentrations between the three sampling methods. A greater number of TOCs were detected in the CLAM samples than in corresponding discrete and POCIS samples, but TOC concentrations in the CLAM samples were significantly lower than in the discrete and (or) POCIS samples. Thirteen TOCs of varying polarity were detected by all of the three methods. TOC detections and concentrations obtained by the three sampling methods, however, are dependent on multiple factors. This study found that stream discharge, constituent loading, and compound type all affected TOC concentrations detected by each method. In addition, TOC detections and concentrations were affected by the reporting limits, bias, recovery, and performance of each method.
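
For the time-integrated samplers compared above, the analyte mass accumulated over a deployment is commonly converted to a time-weighted average (TWA) water concentration using a compound-specific sampling rate Rs. A sketch with illustrative numbers (Rs values must come from calibration studies for each compound and sampler):

```python
def twa_concentration_ng_per_L(analyte_mass_ng, sampling_rate_L_per_day, days):
    """Time-weighted average water concentration from a kinetic-regime
    passive sampler (e.g., POCIS): C = M / (Rs * t).
    The numbers used below are illustrative, not data from the study."""
    return analyte_mass_ng / (sampling_rate_L_per_day * days)

# 58 ng accumulated over a 29-day deployment at a hypothetical Rs of 0.2 L/day
c_twa = twa_concentration_ng_per_L(58.0, 0.2, 29)
```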

  8. Probing methane hydrate nucleation through the forward flux sampling method. (United States)

    Bi, Yuanfei; Li, Tianshu


    Understanding the nucleation of hydrate is the key to developing effective strategies for controlling methane hydrate formation. Here we present a computational study of methane hydrate nucleation, combining the forward flux sampling (FFS) method and the coarse-grained water model mW. To facilitate the application of FFS to the formation of methane hydrate, we developed an effective order parameter λ on the basis of a topological analysis of the tetrahedral network. The order parameter capitalizes on the signature of the hydrate structure, i.e., polyhedral cages, and is capable of efficiently distinguishing hydrate from ice and liquid water while allowing the formation of different hydrate phases, i.e., sI, sII, and amorphous. Integration of the order parameter λ with FFS allows hydrate nucleation rates to be computed explicitly and yields an ensemble of nucleation trajectories under conditions where spontaneous hydrate nucleation is too slow to occur in direct simulation. The convergence of the obtained hydrate nucleation rate was found to depend crucially on the convergence of the spatial distribution of the spontaneously formed hydrate seeds obtained from the initial sampling of FFS. The validity of the approach is also verified by the agreement between the calculated nucleation rate and that inferred from direct simulation. Analyzing the large ensemble of hydrate nucleation trajectories, we show that hydrate formation at 220 K and 500 bar is initiated by nucleation events occurring in the vicinity of the water-methane interface and facilitated by a gradual transition from amorphous to crystalline structure. The latter provides direct support for the proposed two-step nucleation mechanism of methane hydrate.
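
The rate computation at the heart of FFS is a simple product: the flux of trajectories through the first interface λ0, multiplied by the conditional probability of reaching each successive interface before returning to the initial basin. A sketch with hypothetical numbers:

```python
def ffs_rate(flux_through_lambda0, crossing_probs):
    """Forward flux sampling rate estimate:
        k = Phi_0 * product of P(lambda_{i+1} | lambda_i).
    Inputs below are hypothetical, for illustration only."""
    rate = flux_through_lambda0
    for p in crossing_probs:
        rate *= p
    return rate

# Hypothetical flux (events per volume per time) and interface probabilities
k = ffs_rate(1e25, [0.1, 0.05, 0.2, 0.5])
```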

  9. Comparison of two methods of tear sampling for protein quantification by Bradford method

    Directory of Open Access Journals (Sweden)

    Eliana Farias


    The aim of this study was to compare two methods of tear sampling for protein quantification. Tear samples were collected from 29 healthy dogs (58 eyes) using Schirmer tear test (STT) strips and microcapillary tubes. The samples were frozen at -80ºC and analyzed by the Bradford method. Results were analyzed by Student's t test. The average protein concentration and standard deviation from tears collected with microcapillary tubes were 4.45 mg/mL ±0.35 and 4.52 mg/mL ±0.29 for right and left eyes, respectively. The average protein concentration and standard deviation from tears collected with STT strips were 54.5 mg/mL ±0.63 and 54.15 mg/mL ±0.65 for right and left eyes, respectively. A statistically significant difference (p<0.001) was found between the methods. Under the conditions of this study, the average protein concentration obtained with the Bradford test from tear samples collected by STT strip was higher than that obtained with microcapillary tubes. Reference values for tear protein concentration should therefore be interpreted according to the method used to collect the samples.
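
The statistical comparison reported above can be reproduced in outline with a two-sample t statistic. The data below are synthetic stand-ins, not the study's measurements, and the Welch form shown here (which does not assume equal variances) is a common variant of the Student's t test the authors used:

```python
from statistics import mean, stdev

def welch_t(sample_a, sample_b):
    """Welch's two-sample t statistic for comparing mean protein
    concentrations from two collection methods. Data are synthetic."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = stdev(sample_a) ** 2, stdev(sample_b) ** 2
    return (mean(sample_a) - mean(sample_b)) / (va / na + vb / nb) ** 0.5

# Synthetic mg/mL values mimicking the two methods' reported ranges
t = welch_t([4.1, 4.4, 4.6, 4.5], [54.0, 54.6, 54.3, 54.9])
```

A strongly negative t here simply reflects that the first (microcapillary-like) group has a much lower mean, consistent with the direction of the study's finding.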

  10. Using Multidimensional Methods to Understand the Development, Interpretation and Enactment of Quality Assurance Policy within the Educational Development Community (United States)

    Smith, Karen


    Policy texts are representations of practice that both reflect and shape the world around them. There is, however, little higher education research that critically analyses the impact of higher education policy on educational developers and educational development practice. Extending methods from critical discourse analysis by combining textual…

  11. Quality assurance and marketing. (United States)

    Demby, N A


    Although considerable efforts have been directed toward the development and utilization of marketing strategies for dental practices, little if any information exists in the specific area of the role quality assurance may play in marketing dental services. This article describes and analyzes the current relationship between quality assurance and marketing, given the complex array of factors on the horizon that may affect how dentistry is organized and delivered. It must become the role of the profession to see that the alliance between marketing and quality assurance continues and is utilized to assure the quality of care provided and accountability to the public.

  12. Method to resolve microphone and sample location errors in the two-microphone duct measurement method (United States)



    Utilizing the two-microphone impedance tube method, the normal-incidence acoustic absorption and acoustic impedance can be measured for a given sample. This method relies on the measured transfer function between two microphones and on precise knowledge of their locations relative to each other and to the sample material. In this article, a method is proposed to accurately determine these locations. A third sensor is added at the end of the tube to simplify the measurement. First, a justification and investigation of the method is presented. Second, reference terminations are measured to evaluate the accuracy of the apparatus. Finally, comparisons are made between the new method and current methods for determining these distances, and the variations are discussed. From this, conclusions are drawn with regard to the applicability of and need for the new method, and the circumstances under which it is applicable. Results show that the method provides a reliable determination of both microphone locations, which is not possible using current techniques. Errors due to inaccurate determination of these parameters between methods were on the order of 3% for R and 12% for Re Z.
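
For context, the transfer-function relation that makes the microphone locations so critical can be sketched as follows. This is the standard two-microphone relation (as in ISO 10534-2), with illustrative distances and frequency; x1 is the distance from the sample face to the farther microphone and s the microphone spacing:

```python
import cmath

def reflection_coefficient(H12, freq_hz, mic_spacing_m, x1_m, c=343.0):
    """Normal-incidence complex reflection coefficient from the measured
    transfer function H12 = p2/p1 between the two microphones.
    Distances below are illustrative, not from the article."""
    k = 2 * cmath.pi * freq_hz / c           # wavenumber
    e_minus = cmath.exp(-1j * k * mic_spacing_m)
    e_plus = cmath.exp(1j * k * mic_spacing_m)
    return (H12 - e_minus) / (e_plus - H12) * cmath.exp(2j * k * x1_m)

def absorption(R):
    """Absorption coefficient from the reflection coefficient."""
    return 1.0 - abs(R) ** 2

# Sanity check: a perfectly rigid termination (R = 1) gives a standing wave
# p(x) ~ cos(kx), so H12 = cos(k*x2)/cos(k*x1) and absorption should vanish.
k = 2 * cmath.pi * 500 / 343.0
H12_rigid = cmath.cos(k * 0.05) / cmath.cos(k * 0.1)
R_rigid = reflection_coefficient(H12_rigid, 500, 0.05, 0.1)
```

Small errors in `mic_spacing_m` or `x1_m` propagate directly into the phase terms, which is why the article's location-determination method matters.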

  13. an assessment of methods for sampling carabid beetles

    African Journals Online (AJOL)


    Moist leaf litter was scooped onto a white cloth (1 square metre beating sheet) and carabid beetles were caught using a "pootah" (aspirator) or a pair of forceps. Resting beetles were sampled by manual searching under logs, stones and tree bark. Sampling effort was measured by time, each "sample" containing carabid.

  14. Interatomic force microscope and sample observing method therefor


    YAMANAKA, K; Kolosov, Oleg; Ogiso, H; Sato, H.; Koda, T


    PURPOSE: To provide a measurement technique for an atomic force microscope in which sample irregularity can be well separated from frictional force. SOLUTION: An oscillating force is applied laterally between the sample 8 and the probe 4. The sample 8 is tilted laterally to excite a bending orthogonal oscillation. The phase and amplitude of the cantilever's oscillation are detected.

  15. 19 CFR 151.70 - Method of sampling by Customs. (United States)


    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such a... clean yield if such a test is requested in accordance with the provisions of § 151.71(c), or if a second...

  16. Validity and reliability of the Experience-Sampling Method. (United States)

    Csikszentmihalyi, M; Larson, R


    To understand the dynamics of mental health, it is essential to develop measures for the frequency and the patterning of mental processes in everyday life situations. The Experience-Sampling Method (ESM) is an attempt to provide a valid instrument to describe variations in self-reports of mental processes. It can be used to obtain empirical data on the following types of variables: a) frequency and patterning of daily activity, social interaction, and changes in location; b) frequency, intensity, and patterning of psychological states, i.e., emotional, cognitive, and conative dimensions of experience; c) frequency and patterning of thoughts, including quality and intensity of thought disturbance. The article reviews practical and methodological issues of the ESM and presents evidence for its short- and long-term reliability when used as an instrument for assessing the variables outlined above. It also presents evidence for validity by showing correlations between ESM measures on the one hand and physiological measures, one-time psychological tests, and behavioral indices on the other. A number of studies with normal and clinical populations that have used the ESM are reviewed to demonstrate the range of issues to which the technique can be usefully applied.

  17. A proficiency test system to improve performance of milk analysis methods and produce reference values for component calibration samples for infrared milk analysis. (United States)

    Wojciechowski, Karen L; Melilli, Caterina; Barbano, David M


    Our goal was to determine the feasibility of combining proficiency testing, analytical method quality-assurance system, and production of reference samples for calibration of infrared milk analyzers to achieve a more efficient use of resources and reduce costs while maximizing analytical accuracy within and among milk payment-testing laboratories. To achieve this, we developed and demonstrated a multilaboratory combined proficiency testing and analytical method quality-assurance system as an approach to evaluate and improve the analytical performance of methods. A set of modified milks was developed and optimized to serve multiple purposes (i.e., proficiency testing, quality-assurance and method improvement, and to provide reference materials for calibration of secondary testing methods). Over a period of years, the approach has enabled the group of laboratories to document improved analytical performance (i.e., reduced within- and between-laboratory variation) of chemical reference methods used as the primary reference for calibration of high-speed electronic milk-testing equipment. An annual meeting of the laboratory technicians allows for review of results and discussion of each method and provides a forum for communication of experience and techniques that are of value to new analysts in the group. The monthly proficiency testing sample exchanges have the added benefit of producing all-laboratory mean reference values for a set of 14 milks that can be used for calibration, evaluation, and troubleshooting of calibration adjustment issues on infrared milk analyzers. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
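
The all-laboratory mean reference values described above also support simple performance scoring: a common proficiency-testing device is to z-score each laboratory against the group. A generic sketch with synthetic milk-fat data (not this program's actual grading rules):

```python
from statistics import mean, stdev

def proficiency_z_scores(lab_results):
    """Z-score each laboratory's result against the all-laboratory mean
    and standard deviation, as is common in proficiency-testing schemes.
    The data below are synthetic percent-fat results."""
    m, s = mean(lab_results), stdev(lab_results)
    return [(x - m) / s for x in lab_results]

z = proficiency_z_scores([3.95, 4.02, 3.98, 4.05, 4.00])
```

Labs with |z| beyond a threshold (often 2 or 3) would be flagged for method troubleshooting.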

  18. SU-E-T-570: New Quality Assurance Method Using Motion Tracking for 6D Robotic Couches

    Energy Technology Data Exchange (ETDEWEB)

    Cheon, W; Cho, J [SungKyunKwan University, Seoul (Korea, Republic of); Ahn, S [Samsung Medical Center, Seoul (Korea, Republic of); Han, Y; Choi, D [Samsung Medical Center, Sungkyunkwan University School of Medicine, Seoul (Korea, Republic of)


    Purpose: To accommodate geometrically accurate patient positioning, robotic couches capable of 6 degrees of freedom have been introduced. However, conventional couch QA methods are not sufficient to enable the necessary accuracy of tests. Therefore, we have developed a camera-based motion detection and geometry calibration system for couch QA. Methods: Employing a Visual Tracking System (VTS, Bonita B10, Vicon, UK) which tracks infrared-reflective (IR) markers, camera calibration was conducted using a 5.7 × 5.7 × 5.7 cm³ cube with IR markers attached at each corner. After positioning the robotic couch at the origin with the cube on the table top, the 3D coordinates of the cube's eight corners were acquired by the VTS in the VTS coordinate system. Next, positions in the reference (room) coordinate system were assigned using the known relation between the points. Finally, camera calibration was completed by finding a transformation matrix between the VTS and reference coordinate systems using a pseudo-inverse matrix method. After calibration, the accuracy of linear and rotational motions, as well as couch sagging, could be measured by analyzing continuously acquired data of the cube while the couch moved to a designated position. The accuracy of the developed software was verified by comparison with laser tracker (FARO, Lake Mary, USA) measurements for a robotic couch installed for proton therapy. Results: The VTS could track couch motion accurately and measure position in room coordinates. The VTS measurements and laser tracker data agreed to within 1% for linear and rotational motions. Because the program analyzes motion in three dimensions, it can also compute couch sagging. Conclusion: The developed QA system provides sub-millimeter and sub-degree accuracy, fulfilling the requirements of high-end couch QA. This work was supported by the National Research Foundation of Korea funded by Ministry of Science, ICT & Future Planning. (2013M2A2A
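
The pseudo-inverse calibration step can be illustrated with a generic least-squares affine fit between corresponding points in the two coordinate systems. This is a sketch of the idea only; the authors' exact parameterization is not given in the abstract:

```python
import numpy as np

def fit_affine(src_pts, dst_pts):
    """Least-squares affine map from one coordinate system to another via
    a pseudo-inverse. src_pts, dst_pts: (N, 3) corresponding points, N >= 4."""
    src = np.asarray(src_pts, float)
    dst = np.asarray(dst_pts, float)
    A = np.hstack([src, np.ones((len(src), 1))])  # homogeneous coordinates
    M = np.linalg.pinv(A) @ dst                   # solve A @ M = dst
    return M

def apply_affine(M, pts):
    pts = np.asarray(pts, float)
    return np.hstack([pts, np.ones((len(pts), 1))]) @ M

# Recover a known translation from the 8 corners of a unit cube
corners = np.array([[x, y, z] for x in (0, 1) for y in (0, 1) for z in (0, 1)],
                   float)
M = fit_affine(corners, corners + [10.0, -5.0, 2.0])
mapped = apply_affine(M, corners)
```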

  19. 40 CFR 761.243 - Standard wipe sample method and size. (United States)


    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Standard wipe sample method and size... Natural Gas Pipeline: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.243 Standard wipe sample method and size. (a) Collect a surface sample from a natural gas...

  20. High assurance services computing

    CERN Document Server


    Covers service-oriented technologies in different domains, including high assurance systems. Assists software engineers from industry and government laboratories who develop mission-critical software, and simultaneously provides academia with a practitioner's outlook on the problems of high-assurance software development.

  1. SU-F-T-450: The Investigation of Radiotherapy Quality Assurance and Automatic Treatment Planning Based On the Kernel Density Estimation Method

    Energy Technology Data Exchange (ETDEWEB)

    Fan, J; Fan, J; Hu, W; Wang, J [Fudan University Shanghai Cancer Center, Shanghai, Shanghai (China)


    Purpose: To develop a fast automatic algorithm based on two-dimensional kernel density estimation (2D KDE) to predict the dose-volume histogram (DVH), which can be employed for radiotherapy quality assurance and automatic treatment planning. Methods: We propose a machine learning method that uses previous treatment plans to predict the DVH. The key to the approach is the framing of the DVH in a probabilistic setting. Training consists of estimating, from the patients in the training set, the joint probability distribution of the dose and the predictive features. The joint distribution provides an estimate of the conditional probability of the dose given the values of the predictive features. For a new patient, prediction consists of estimating the distribution of the predictive features and marginalizing the conditional probability from the training over this distribution. Integrating the resulting probability distribution for the dose yields an estimate of the DVH. The 2D KDE is implemented to predict the joint probability distribution of the training set and the distribution of the predictive features for the new patient. Two variables, the signed minimal distance from each OAR (organ at risk) voxel to the target boundary and its opening angle with respect to the origin of the voxel coordinate system, are used as predictive features to represent the OAR-target spatial relationship. The feasibility of our method has been demonstrated on rectum, breast, and head-and-neck cancer cases by comparing predicted DVHs with planned ones. Results: Consistent results were found between the two DVHs for each cancer type, with an average relative point-wise difference of about 5%, within a clinically acceptable extent. Conclusion: Our method can be used to predict clinically acceptable DVHs and to evaluate the quality and consistency of treatment planning.
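
The prediction pipeline can be sketched with a one-feature simplification, using scipy's `gaussian_kde` in place of the authors' 2D KDE. The feature, the dose model, and all data below are synthetic, purely to show the marginalization-then-integration structure:

```python
import numpy as np
from scipy.stats import gaussian_kde

def predict_dose_pdf(train_features, train_doses, new_features, dose_grid):
    """Estimate p(dose) for a new patient by marginalizing the conditional
    p(dose | feature), learned by KDE from training voxels, over the new
    patient's feature values. One-feature simplification of the paper's idea."""
    joint = gaussian_kde(np.vstack([train_features, train_doses]))
    feat_marginal = gaussian_kde(train_features)
    dx = dose_grid[1] - dose_grid[0]
    pdf = np.zeros_like(dose_grid)
    for f in new_features:
        cond = joint(np.vstack([np.full_like(dose_grid, f), dose_grid]))
        pdf += cond / feat_marginal(f)[0]      # p(dose | feature = f)
    pdf /= pdf.sum() * dx                       # normalize over the dose grid
    return pdf

def dvh_from_pdf(pdf, dose_grid):
    """Cumulative DVH: fraction of volume receiving at least each dose level."""
    dx = dose_grid[1] - dose_grid[0]
    cdf = np.cumsum(pdf) * dx
    return 1.0 - cdf / cdf[-1]

rng = np.random.default_rng(0)
feats = rng.uniform(0.1, 5.0, 400)                       # distance to target (cm)
doses = 60.0 * np.exp(-feats / 2.0) + rng.normal(0.0, 2.0, 400)  # synthetic falloff
grid = np.linspace(0.0, 70.0, 141)
pdf = predict_dose_pdf(feats, doses, rng.uniform(0.1, 5.0, 50), grid)
dvh = dvh_from_pdf(pdf, grid)
```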

  2. Investigation of Presage 3D Dosimetry as a Method of Clinically Intuitive Quality Assurance and Comparison to a Semi-3D Delta4 System (United States)

    Crockett, Ethan Van

    The need for clinically intuitive metrics for patient-specific quality assurance in radiation therapy has been well documented (Zhen, Nelms et al. 2011). A novel transform method has been shown to be effective at converting full-density 3D dose measurements made in a phantom to dose values in the patient geometry, enabling comparisons using clinically intuitive metrics such as dose-volume histograms (Oldham et al. 2011). This work investigates the transform method and compares its calculated dose-volume histograms (DVHs) to DVH values calculated by a Delta4 QA device (ScandiDos), marking the first comparison of a true 3D system to a semi-3D device using clinical metrics. Measurements were made using Presage 3D dosimeters, which were read out by an in-house optical-CT scanner. Three patient cases were chosen for the study: one head-and-neck VMAT treatment and two spine IMRT treatments. The transform method showed good agreement with the planned dose values for all three cases. Furthermore, the transformed DVHs adhered to the planned dose more accurately than the Delta4 DVHs. The similarity between the Delta4 DVHs and the transformed DVHs, however, was greater for one of the spine cases than for the head-and-neck case, implying that the accuracy of the Delta4 Anatomy software may vary from one treatment site to another. Overall, the transform method, which incorporates data from full-density 3D dose measurements, provides clinically intuitive results that are more accurate and consistent than the corresponding results from a semi-3D Delta4 system.

  3. Assuring NASA's Safety and Mission Critical Software (United States)

    Deadrick, Wesley


    What is IV&V? Independent Verification and Validation (IV&V) is an objective examination of safety and mission critical software processes and products. Independence: 3 Key parameters: Technical Independence; Managerial Independence; Financial Independence. NASA IV&V perspectives: Will the system's software: Do what it is supposed to do?; Not do what it is not supposed to do?; Respond as expected under adverse conditions?. Systems Engineering: Determines if the right system has been built and that it has been built correctly. IV&V Technical Approaches: Aligned with IEEE 1012; Captured in a Catalog of Methods; Spans the full project lifecycle. IV&V Assurance Strategy: The IV&V Project's strategy for providing mission assurance; Assurance Strategy is driven by the specific needs of an individual project; Implemented via an Assurance Design; Communicated via Assurance Statements.

  4. Establishing a novel automated magnetic bead-based method for the extraction of DNA from a variety of forensic samples. (United States)

    Witt, Sebastian; Neumann, Jan; Zierdt, Holger; Gébel, Gabriella; Röscheisen, Christiane


    Automated systems have been increasingly utilized for DNA extraction by many forensic laboratories to handle growing numbers of forensic casework samples while minimizing the risk of human error and assuring high reproducibility. The step towards automation, however, is not easy: the automated extraction method has to be very versatile to reliably prepare high yields of pure genomic DNA from a broad variety of sample types on different carrier materials. To prevent possible cross-contamination of samples or the loss of DNA, the components of the kit have to be designed in a way that allows for the automated handling of the samples with no manual intervention necessary. DNA extraction using paramagnetic particles coated with a DNA-binding surface is predestined for an automated approach. For this study, we tested different DNA extraction kits using DNA-binding paramagnetic particles with regard to DNA yield and handling by a Freedom EVO(®) 150 extraction robot (Tecan) equipped with a Te-MagS magnetic separator. Among others, the extraction kits tested were the ChargeSwitch(®) Forensic DNA Purification Kit (Invitrogen), the PrepFiler™ Automated Forensic DNA Extraction Kit (Applied Biosystems) and NucleoMag™ 96 Trace (Macherey-Nagel). After an extensive test phase, we established a novel magnetic bead extraction method based upon the NucleoMag™ extraction kit (Macherey-Nagel). The new method is readily automatable and produces high yields of DNA from different sample types (blood, saliva, sperm, contact stains) on various substrates (filter paper, swabs, cigarette butts) with no evidence of a loss of magnetic beads or sample cross-contamination. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  5. A cleaning method to minimize contaminant luminescence signal of empty sample carriers using off-the-shelf chemical agents. (United States)

    Kazakis, Nikolaos A; Kitis, George; Tsirliganis, Nestor C


    Signals acquired during thermoluminescence or optically stimulated luminescence measurements must be completely free of any spurious and/or contamination signals to assure the credibility of the results, especially during exploratory research investigating the luminescence behavior of new materials. Experiments indicate that such unwanted signals may also stem from new (unused) and used empty sample carriers, namely cups and discs, which are widely used for such measurements, probably due to contamination from a fluorite and/or silica-related source. Fluorite and/or silicone oil appear to be the most likely sources of contamination, thus, their removal, along with any other possible source that exhibits undesirable luminescence behavior, is necessary. Conventional cleaning methods fail to eliminate such contaminants from empty cups and discs. In this work a new cleaning method is proposed incorporating off-the-shelf chemical agents. Results of thermoluminescence measurements highlight the efficiency of the new cleaning process, since it can completely remove any observed contaminants from both new and used sample carriers, of various shapes and/or materials. Consequently their signal is minimized even at relatively high beta-doses, where it is prominent, resulting in a clean and only sample-attributed signal. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Method for Hot Real-Time Sampling of Gasification Products

    Energy Technology Data Exchange (ETDEWEB)

    Pomeroy, Marc D [National Renewable Energy Laboratory (NREL), Golden, CO (United States)


    The Thermochemical Process Development Unit (TCPDU) at the National Renewable Energy Laboratory (NREL) is a highly instrumented half-ton/day pilot-scale plant capable of demonstrating industrially relevant thermochemical technologies for lignocellulosic biomass conversion, including gasification. Gasification creates primarily syngas (a mixture of hydrogen and carbon monoxide) that can be utilized with synthesis catalysts to form transportation fuels and other valuable chemicals. Biomass-derived gasification products are a very complex mixture of chemical components that typically contain sulfur and nitrogen species that can act as poisons for tar reforming and synthesis catalysts. Real-time hot online sampling techniques, such as molecular beam mass spectrometry (MBMS) and gas chromatographs with sulfur- and nitrogen-specific detectors, can provide real-time analysis and operational indicators of performance. Sampling typically requires coated sampling lines to minimize trace sulfur interactions with steel surfaces. Other materials used inline have also shown conversion of sulfur species into new components and must be minimized. Residence time within the sampling lines must also be kept to a minimum to limit further reaction chemistry. Solids from ash and char contribute to plugging and must be filtered at temperature. Experience at NREL has revealed several key factors to consider when designing and installing an analytical sampling system for biomass gasification products: minimizing sampling distance, filtering effectively as close to the source as possible, proper line sizing, proper line materials or coatings, even heating of all components, minimizing pressure drops, and additional filtering or traps after pressure drops.

  7. Sampling Methods for Wallenius' and Fisher's Noncentral Hypergeometric Distributions

    DEFF Research Database (Denmark)

    Fog, Agner


    Several methods for generating variates with univariate and multivariate Wallenius' and Fisher's noncentral hypergeometric distributions are developed. Methods for the univariate distributions include: simulation of urn experiments, inversion by binary search, inversion by chop-down search from t...
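
The first of the listed methods, simulation of the urn experiment, is also the simplest to state: balls are removed one at a time, with each ball's odds of being taken proportional to its color's weight. A naive O(n²) sketch of that defining experiment (the paper's other methods are far faster):

```python
import random

def wallenius_urn_draw(counts_by_color, weights, n_draws, rng):
    """Simulate a variate from Wallenius' noncentral hypergeometric
    distribution: draw n_draws balls without replacement, each ball's
    selection probability proportional to its color's weight.
    Slow but faithful to the defining urn experiment."""
    urn = [c for c, n in enumerate(counts_by_color) for _ in range(n)]
    drawn = [0] * len(counts_by_color)
    for _ in range(n_draws):
        total = sum(weights[b] for b in urn)
        r = rng.random() * total
        acc = 0.0
        for i, ball in enumerate(urn):
            acc += weights[ball]
            if acc >= r:
                drawn[urn.pop(i)] += 1
                break
    return drawn

# Two colors, 200 balls each; color 0 is twice as heavy
drawn = wallenius_urn_draw([200, 200], [2.0, 1.0], 240, random.Random(7))
```

Fisher's noncentral variant differs in its conditioning (independent weighted draws conditioned on the total), which is why the two distributions need separate samplers.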

  8. Reliability of a method of sampling stream invertebrates

    CSIR Research Space (South Africa)

    Chutter, FM


    In field ecological studies, inferences must often be drawn from dissimilarities in the numbers and species of organisms found in biological samples collected at different times and under various conditions....

  9. Summary Report for Evaluation of Compost Sample Drying Methods

    National Research Council Canada - National Science Library

    Frye, Russell


    Previous work in support of these efforts developed a compost sample preparation scheme, consisting of air drying followed by milling, to reduce analytical variability in the heterogeneous compost matrix...

  10. A Review of Biological Agent Sampling Methods and ... (United States)

    This study was conducted to evaluate current sampling and analytical capabilities, from a time and resource perspective, for a large-scale biological contamination incident. The analysis will be useful for strategically directing future research investment.

  11. Method for preconcentrating a sample for subsequent analysis (United States)

    Zaromb, Solomon


    A system for analysis of trace concentration of contaminants in air includes a portable liquid chromatograph and a preconcentrator for the contaminants to be analyzed. The preconcentrator includes a sample bag having an inlet valve and an outlet valve for collecting an air sample. When the sample is collected the sample bag is connected in series with a sorbing apparatus in a recirculation loop. The sorbing apparatus has an inner gas-permeable container containing a sorbent material and an outer gas-impermeable container. The sample is circulated through the outer container and around the inner container for trapping and preconcentrating the contaminants in the sorbent material. The sorbent material may be a liquid having the same composition as the mobile phase of the chromatograph for direct injection thereinto. Alternatively, the sorbent material may be a porous, solid body, to which mobile phase liquid is added after preconcentration of the contaminants for dissolving the contaminants, the liquid solution then being withdrawn for injection into the chromatograph.

  12. Improvements in Sample Selection Methods for Image Classification

    Directory of Open Access Journals (Sweden)

    Thales Sehn Körting


    Full Text Available Traditional image classification algorithms are mainly divided into unsupervised and supervised paradigms. In the first paradigm, algorithms are designed to automatically estimate the classes’ distributions in the feature space. The second paradigm depends on the knowledge of a domain expert to identify representative examples from the image to be used for estimating the classification model. Recent improvements in human-computer interaction (HCI enable the construction of more intuitive graphic user interfaces (GUIs to help users obtain desired results. In remote sensing image classification, GUIs still need advancements. In this work, we describe our efforts to develop an improved GUI for selecting the representative samples needed to estimate the classification model. The idea is to identify changes in the common strategies for sample selection to create a user-driven sample selection, which focuses on different views of each sample, and to help domain experts identify explicit classification rules, which is a well-established technique in geographic object-based image analysis (GEOBIA. We also propose the use of the well-known nearest neighbor algorithm to identify similar samples and accelerate the classification.
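
    The nearest-neighbor acceleration mentioned at the end of the abstract can be sketched as below; the two-dimensional feature vectors and the Euclidean metric are illustrative assumptions, not details taken from the paper.

    ```python
    import numpy as np

    def suggest_similar(labeled_features, unlabeled_features, k=3):
        """For each labeled training sample, return the indices of the k
        unlabeled image segments closest to it in feature space (Euclidean
        distance), as candidates for fast user confirmation."""
        suggestions = {}
        for i, ref in enumerate(labeled_features):
            d = np.linalg.norm(unlabeled_features - ref, axis=1)
            suggestions[i] = np.argsort(d)[:k].tolist()
        return suggestions

    # Hypothetical spectral features (e.g. mean band values) per segment.
    labeled = np.array([[0.1, 0.8], [0.9, 0.2]])
    unlabeled = np.array([[0.12, 0.79], [0.88, 0.25], [0.5, 0.5], [0.11, 0.82]])
    print(suggest_similar(labeled, unlabeled, k=2))
    ```

    In an interactive GUI, the suggested segments would be shown to the domain expert for confirmation rather than labeled automatically.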

  13. Analytical results, database management and quality assurance for analysis of soil and groundwater samples collected by cone penetrometer from the F and H Area seepage basins

    Energy Technology Data Exchange (ETDEWEB)

    Boltz, D.R.; Johnson, W.H.; Serkiz, S.M.


    The Quantification of Soil Source Terms and Determination of the Geochemistry Controlling Distribution Coefficients (Kd values) of Contaminants at the F- and H-Area Seepage Basins (FHSB) study was designed to generate site-specific contaminant transport factors for contaminated groundwater downgradient of the Basins. The experimental approach employed in this study was to collect soil and its associated porewater from contaminated areas downgradient of the FHSB. Samples were collected over a wide range of geochemical conditions (e.g., pH, conductivity, and contaminant concentration) and were used to describe the partitioning of contaminants between the aqueous phase and soil surfaces at the site. The partitioning behavior may be used to develop site-specific transport factors. This report summarizes the analytical procedures and results for both soil and porewater samples collected as part of this study and the database management of these data.
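
    The distribution coefficient Kd quantifies the partitioning described above as the ratio of the contaminant concentration sorbed to the soil to the concentration in the associated porewater; a minimal sketch with hypothetical values:

    ```python
    def distribution_coefficient(c_soil, c_water):
        """Distribution coefficient Kd (L/kg): ratio of the contaminant
        concentration sorbed to the soil (mg/kg) to the concentration in
        the associated porewater (mg/L)."""
        return c_soil / c_water

    # Hypothetical paired soil/porewater measurements for one contaminant.
    print(distribution_coefficient(12.0, 0.5))  # -> 24.0 L/kg
    ```

    Repeating this ratio across the sampled range of pH and conductivity is what yields the site-specific transport factors the study refers to.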

  14. Methods of sampling airborne fungi in working environments of waste treatment facilities


    Kristýna Černá; Zdeňka Wittlingerová; Magdaléna Zimová; Zdeněk Janovský


    Objectives: The objective of the present study was to evaluate and compare the efficiency of a filter-based sampling method and a high-volume sampling method for sampling airborne culturable fungi present in waste sorting facilities. Material and Methods: The membrane filter method was compared with the surface air system method. The selected sampling methods were modified and tested in 2 plastic waste sorting facilities. Results: The total number of colony-forming units (CFU)/m3 of airborne fungi w...
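
    Both methods report results as CFU per cubic metre of air, i.e. the plate count divided by the volume of air drawn through the sampler; a minimal sketch (the flow rate and colony count below are hypothetical):

    ```python
    def cfu_per_m3(colonies, flow_l_per_min, minutes):
        """Convert a plate count to airborne concentration in CFU/m^3,
        given the sampler flow rate (L/min) and sampling time (min)."""
        air_volume_m3 = flow_l_per_min * minutes / 1000.0  # 1 m^3 = 1000 L
        return colonies / air_volume_m3

    # Hypothetical surface-air-system run: 250 L/min for 8 min, 90 colonies.
    print(cfu_per_m3(90, 250, 8))  # -> 45.0 CFU/m^3
    ```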

  15. Statistical Methods and Tools for Hanford Staged Feed Tank Sampling

    Energy Technology Data Exchange (ETDEWEB)

    Fountain, Matthew S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Brigantic, Robert T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Peterson, Reid A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)


    This report summarizes work conducted by Pacific Northwest National Laboratory to technically evaluate the current approach to staged feed sampling of high-level waste (HLW) sludge to meet waste acceptance criteria (WAC) for transfer from tank farms to the Hanford Waste Treatment and Immobilization Plant (WTP). The current sampling and analysis approach is detailed in the document titled Initial Data Quality Objectives for WTP Feed Acceptance Criteria, 24590-WTP-RPT-MGT-11-014, Revision 0 (Arakali et al. 2011). The goal of this current work is to evaluate and provide recommendations to support a defensible, technical and statistical basis for the staged feed sampling approach that meets WAC data quality objectives (DQOs).

  16. Processes and procedures for a worldwide biological samples distribution; product assurance and logistic activities to support the mice drawer system tissue sharing event (United States)

    Benassai, Mario; Cotronei, Vittorio

    The Mice Drawer System (MDS) is a scientific payload developed by the Italian Space Agency (ASI); it hosted 6 mice on the International Space Station (ISS) and re-entered on ground on November 28, 2009 with STS-129 at KSC. Linked to the MDS experiment, a Tissue Sharing Program (TSP) was developed in order to make the biological samples coming from the mice available to 16 Payload Investigators (PIs) located in the USA, Canada, the EU (Italy, Belgium and Germany) and Japan. ALTEC SpA (a PPP owned by ASI, TAS-I and local institutions) was responsible for supporting the logistics of the MDS samples for the first MDS mission, in the frame of the ASI OSMA program (OSteoporosis and Muscle Atrophy). The TSP resulted in a complex scenario, as ASI progressively extended the original OSMA team to researchers from other ASI programs and from other agencies (ESA, NASA, JAXA). The science coordination was performed by the University of Genova (UNIGE). ALTEC managed the whole logistics process, with the support of a specialized freight forwarder agent during all shipping operation phases, and formalized all the steps from the handover of samples by the dissection team to the packaging and shipping process in a dedicated procedure. ALTEC approached the work in a structured way, performing: a study of the aspects connected to international shipments of biological samples; cooperative work with UNIGE, ASI and the PIs to identify the needs of the various researchers and their compatibility; a complete revision and integration of shipment requirements (addresses, temperatures, samples, materials and so on); a complete definition of the final shipment scenario in terms of boxes, content, refrigerant and requirements; a formal approach to the identification and selection of the most suited and specialized freight forwarder; and a clear identification of all the processes from sample dissection by the PI team, sample processing, freezing, tube preparation

  17. Development and validation of a multi-locus DNA metabarcoding method to identify endangered species in complex samples. (United States)

    Arulandhu, Alfred J; Staats, Martijn; Hagelaar, Rico; Voorhuijzen, Marleen M; Prins, Theo W; Scholtens, Ingrid; Costessi, Adalberto; Duijsings, Danny; Rechenmann, François; Gaspar, Frédéric B; Barreto Crespo, Maria Teresa; Holst-Jensen, Arne; Birck, Matthew; Burns, Malcolm; Haynes, Edward; Hochegger, Rupert; Klingl, Alexander; Lundberg, Lisa; Natale, Chiara; Niekamp, Hauke; Perri, Elena; Barbante, Alessandra; Rosec, Jean-Philippe; Seyfarth, Ralf; Sovová, Tereza; Van Moorleghem, Christoff; van Ruth, Saskia; Peelen, Tamara; Kok, Esther


    DNA metabarcoding provides great potential for species identification in complex samples such as food supplements and traditional medicines. Such a method would aid Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES) enforcement officers to combat wildlife crime by preventing illegal trade of endangered plant and animal species. The objective of this research was to develop a multi-locus DNA metabarcoding method for forensic wildlife species identification and to evaluate the applicability and reproducibility of this approach across different laboratories. A DNA metabarcoding method was developed that makes use of 12 DNA barcode markers that have demonstrated universal applicability across a wide range of plant and animal taxa and that facilitate the identification of species in samples containing degraded DNA. The DNA metabarcoding method was developed based on Illumina MiSeq amplicon sequencing of well-defined experimental mixtures, for which a bioinformatics pipeline with user-friendly web-interface was developed. The performance of the DNA metabarcoding method was assessed in an international validation trial by 16 laboratories, in which the method was found to be highly reproducible and sensitive enough to identify species present in a mixture at 1% dry weight content. The advanced multi-locus DNA metabarcoding method assessed in this study provides reliable and detailed data on the composition of complex food products, including information on the presence of CITES-listed species. The method can provide improved resolution for species identification, while verifying species with multiple DNA barcodes contributes to an enhanced quality assurance. © The Authors 2017. Published by Oxford University Press.

  18. Software quality assurance handbook

    Energy Technology Data Exchange (ETDEWEB)


    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  19. Comparison of indoor air sampling and dust collection methods for fungal exposure assessment using quantitative PCR (United States)

    Evaluating fungal contamination indoors is complicated because of the many different sampling methods utilized. In this study, fungal contamination was evaluated using five sampling methods and four matrices for results. The five sampling methods were a 48 hour indoor air sample ...

  20. 7 CFR 28.46 - Method of submitting samples and types. (United States)


    § 28.46 Method of submitting samples and types. The method of submitting samples and types for comparison shall be the same as that prescribed in this subpart for...

  1. Three sampling methods for visibility measures of landscape perception

    NARCIS (Netherlands)

    Weitkamp, S.G.; Bregt, A.K.; Lammeren, van R.J.A.; Berg, van den A.E.


    The character of a landscape can be seen as the outcome of people's perception of their physical environment, which is important for spatial planning and decision making. Three modes of landscape perception are proposed: view from a viewpoint, view from a road, and view of an area. Three sampling

  2. Microbial diversity in fecal samples depends on DNA extraction method

    DEFF Research Database (Denmark)

    Mirsepasi, Hengameh; Persson, Søren; Struve, Carsten


    BACKGROUND: There are challenges, when extracting bacterial DNA from specimens for molecular diagnostics, since fecal samples also contain DNA from human cells and many different substances derived from food, cell residues and medication that can inhibit downstream PCR. The purpose of the study w...

  3. Modern methods of sample preparation for GC analysis

    NARCIS (Netherlands)

    de Koning, S.; Janssen, H.-G.; Brinkman, U.A.Th.


    Today, a wide variety of techniques is available for the preparation of (semi-) solid, liquid and gaseous samples, prior to their instrumental analysis by means of capillary gas chromatography (GC) or, increasingly, comprehensive two-dimensional GC (GC × GC). In the past two decades, a large number

  4. An adaptive household sampling method for rural African communities

    African Journals Online (AJOL)

    Utilizing Google Earth images and a Graphical Information System (GIS) map of Berekuso, sampling units were defined as 15-degree wedge-shaped sectors ... of Berekuso, and produced generalizable results for median household size, median age of residents, sources of potable water and toilet types, among others.

  5. Method for Hot Real-Time Sampling of Pyrolysis Vapors

    Energy Technology Data Exchange (ETDEWEB)

    Pomeroy, Marc D [National Renewable Energy Laboratory (NREL), Golden, CO (United States)


    Biomass pyrolysis has been an increasing topic of research, in particular as a replacement for crude oil. This process uses moderate temperatures to thermally deconstruct the biomass into vapors, which are then condensed into a mixture of liquid oxygenates to be used as fuel precursors. Pyrolysis oils contain more than 400 compounds, up to 60 percent of which do not re-volatilize for subsequent chemical analysis. Vapor chemical composition is further complicated by additional condensation reactions that occur during the condensation and collection of the product. Due to the complexity of the pyrolysis oil, and a desire to catalytically upgrade the vapor composition before condensation, online real-time analytical techniques such as Molecular Beam Mass Spectrometry (MBMS) are of great use. However, many challenges must be overcome in order to properly sample hot pyrolysis vapors. Sampling must occur within a narrow range of temperatures to avoid changes in product composition from overheating, or partial condensation and plugging of lines from condensed products. Residence times must be kept to a minimum to limit further reaction chemistry. Pyrolysis vapors also form aerosols that are carried far downstream and can pass through filters, resulting in build-up in downstream locations. The co-produced bio-char and ash from the pyrolysis process can plug the sample lines and must be filtered out at temperature, even with the use of cyclonic separators. A practical approach to sampling-system design considerations, as well as lessons learned, is integrated into the hot analytical sampling system of the National Renewable Energy Laboratory's (NREL) Thermochemical Process Development Unit (TCPDU) to provide industrially relevant demonstrations of thermochemical transformations of biomass feedstocks at the pilot scale.

  6. Sampling methods to the statistical control of the production of blood components. (United States)

    Pereira, Paulo; Seghatchian, Jerard; Caldeira, Beatriz; Santos, Paula; Castro, Rosa; Fernandes, Teresa; Xavier, Sandra; de Sousa, Gracinda; de Almeida E Sousa, João Paulo


    The control of blood component specifications is a requirement generalized in Europe by the European Commission Directives and in the US by the AABB standards. The use of a statistical process control methodology is recommended in the related literature, including the EDQM guideline. The reliability of the control depends on the sampling. However, a correct sampling methodology does not seem to be systematically applied. Commonly, the sampling is intended to comply only with the 1% specification for the produced blood components. Nevertheless, from a purely statistical viewpoint, this model is arguably not related to a consistent sampling technique. This could be a severe limitation to detecting abnormal patterns and to assuring that the production has a non-significant probability of producing nonconforming components. This article discusses what is happening in blood establishments. Three statistical methodologies are proposed: simple random sampling, sampling based on the proportion of a finite population, and sampling based on the inspection level. The empirical results demonstrate that these models are practicable in blood establishments, contributing to the robustness of sampling and related statistical process control decisions for the purpose they are suggested for. Copyright © 2017 Elsevier Ltd. All rights reserved.
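
    Sampling based on the proportion of a finite population can be sketched with the standard finite-population-corrected sample size formula; the lot size and parameters below are hypothetical, not values from the article:

    ```python
    import math

    def sample_size_finite(N, p=0.5, e=0.05, z=1.96):
        """Sample size for estimating a proportion p within margin e at the
        confidence level implied by z, corrected for a finite lot of N units."""
        n0 = z**2 * p * (1 - p) / e**2      # infinite-population sample size
        n = n0 / (1 + (n0 - 1) / N)         # finite-population correction
        return math.ceil(n)

    # Hypothetical monthly lot of 2000 blood components, 1% expected
    # nonconforming, estimated to within 1 percentage point at 95% confidence.
    print(sample_size_finite(2000, p=0.01, e=0.01))  # -> 320
    ```

    The point of the correction term is visible here: the required sample is noticeably smaller than the infinite-population value when the lot itself is small.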

  7. Sample Selection in Randomized Experiments: A New Method Using Propensity Score Stratified Sampling (United States)

    Tipton, Elizabeth; Hedges, Larry; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Caverly, Sarah


    Randomized experiments are often seen as the "gold standard" for causal research. Despite the fact that experiments use random assignment to treatment conditions, units are seldom selected into the experiment using probability sampling. Very little research on experimental design has focused on how to make generalizations to well-defined…

  8. Sampling pig farms at the abattoir in a cross-sectional study - Evaluation of a sampling method. (United States)

    Birkegård, Anna Camilla; Halasa, Tariq; Toft, Nils


    A cross-sectional study design is relatively inexpensive, fast and easy to conduct when compared to other study designs. Careful planning is essential to obtaining a representative sample of the population, and the recommended approach is to use simple random sampling from an exhaustive list of units in the target population. This approach is rarely feasible in practice, and other sampling procedures must often be adopted. For example, when slaughter pigs are the target population, sampling the pigs on the slaughter line may be an alternative to on-site sampling at a list of farms. However, it is difficult to sample a large number of farms from an exact predefined list, due to the logistics and workflow of an abattoir. Therefore, it is necessary to have a systematic sampling procedure and to evaluate the obtained sample with respect to the study objective. We propose a method for 1) planning, 2) conducting, and 3) evaluating the representativeness and reproducibility of a cross-sectional study when simple random sampling is not possible. We used an example of a cross-sectional study with the aim of quantifying the association of antimicrobial resistance and antimicrobial consumption in Danish slaughter pigs. It was not possible to visit farms within the designated timeframe. Therefore, it was decided to use convenience sampling at the abattoir. Our approach was carried out in three steps: 1) planning: using data from meat inspection to plan at which abattoirs and how many farms to sample; 2) conducting: sampling was carried out at five abattoirs; 3) evaluation: representativeness was evaluated by comparing sampled and non-sampled farms, and the reproducibility of the study was assessed through simulated sampling based on meat inspection data from the period where the actual data collection was carried out. In the cross-sectional study samples were taken from 681 Danish pig farms, during five weeks from February to March 2015. 
The evaluation showed that the sampling

  9. Designing waveforms for temporal encoding using a frequency sampling method

    DEFF Research Database (Denmark)

    Gran, Fredrik; Jensen, Jørgen Arendt


    In this paper a method for designing waveforms for temporal encoding in medical ultrasound imaging is described. The method is based on least squares optimization and is used to design nonlinear frequency modulated signals for synthetic transmit aperture imaging. By using the proposed design method, the amplitude spectrum of the transmitted waveform can be optimized, such that most of the energy is transmitted where the transducer has large amplification. To test the design method, a waveform was designed for a BK8804 linear array transducer. The resulting nonlinear frequency modulated waveform, on the other hand, was designed so that only frequencies where the transducer had a large amplification were excited. Hereby, unnecessary heating of the transducer could be avoided and the signal-to-noise ratio could be increased. The experimental ultrasound scanner RASMUS was used to evaluate...
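
    The paper's least-squares optimization is not reproduced here, but the underlying frequency-sampling idea, specifying the desired amplitude spectrum at discrete frequencies and transforming back to the time domain, can be sketched as follows (the band edges and signal length are hypothetical):

    ```python
    import numpy as np

    def frequency_sampling_waveform(desired_amplitude, n_samples):
        """Build a real waveform whose amplitude spectrum matches
        `desired_amplitude`, given at n_samples//2 + 1 uniformly spaced
        frequencies from DC to Nyquist (classic frequency-sampling design;
        the paper itself uses a least-squares optimisation instead)."""
        spectrum = np.asarray(desired_amplitude, dtype=float)
        waveform = np.fft.irfft(spectrum, n=n_samples)
        return np.fft.fftshift(waveform)  # centre the impulse response

    # Hypothetical band-pass target matched to a transducer passband:
    n = 64
    freqs = np.linspace(0.0, 1.0, n // 2 + 1)      # normalised to Nyquist
    target = np.where((freqs > 0.2) & (freqs < 0.6), 1.0, 0.0)
    w = frequency_sampling_waveform(target, n)
    print(w.shape)
    ```

    Concentrating the spectrum inside the transducer passband is what avoids the wasted energy and heating the abstract describes.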

  10. On-line sample processing methods in flow analysis

    DEFF Research Database (Denmark)

    Miró, Manuel; Hansen, Elo Harald


    ...-line dilution, derivatization, separation and preconcentration methods encompassing solid reactors, solvent extraction, sorbent extraction, precipitation/coprecipitation, hydride/vapor generation and digestion/leaching protocols as hyphenated to a plethora of detection devices is discussed in detail...

  11. Rapid method for plutonium-241 determination in soil samples


    Piekarz, M.; Komosa, A.


    A simple and rapid procedure for the determination of plutonium isotopes in the environment is presented. The procedure combines alpha spectrometry, solvent extraction and liquid scintillation measurements to ensure that both alpha- and beta-emitting isotopes are determined. Of five tested extractants, bis-(2-ethylhexyl) phosphoric acid was found to be the best choice. The procedure was applied to soil samples contaminated with Chernobyl fallout.

  12. Technical Evaluation of Sample-Processing, Collection, and Preservation Methods (United States)


    enhanced situational awareness of biological threats to the environment, human health, agriculture, and food supplies. Specifically mentioned is the...preparing for the possibility of biologically based attacks on military, civilian, or agricultural targets. To be fully prepared for this...from the various collected samples was extracted using an identical process, the Blood and Tissue Midi Preparation Kit (Qiagen, Inc.; Valencia, CA), and

  13. EMC Compliance Assurance Monitoring (United States)

    The Compliance Assurance Monitoring, or CAM, rule is designed to satisfy the requirements for monitoring and compliance certification in the Part 70 operating permits program and Title VII of the 1990 Clean Air Act Amendments.



    A Karinagannanavar; W Khan; Raghavendra, B; ARB Sameena; TG Goud


    Background: Measles is a leading cause of childhood morbidity and mortality, accounting for nearly half the global burden of vaccine-preventable deaths. In 2007, there were 197,000 measles deaths globally: nearly 540 deaths every day, or 22 deaths per hour. According to NFHS-3 (2005-06), total measles vaccination coverage in Karnataka was 72%. Objectives: 1) To find out measles vaccination coverage in Bellary District. 2) To know the reasons for non-vaccination. Material and Methods: A cross-sec...

  15. Sample Size Calculations for Population Size Estimation Studies Using Multiplier Methods With Respondent-Driven Sampling Surveys. (United States)

    Fearon, Elizabeth; Chabata, Sungai T; Thompson, Jennifer A; Cowan, Frances M; Hargreaves, James R


    While guidance exists for obtaining population size estimates using multiplier methods with respondent-driven sampling surveys, we lack specific guidance for making sample size decisions. Our aim was to guide the design of multiplier method population size estimation studies using respondent-driven sampling surveys so as to reduce the random error around the estimate obtained. The population size estimate is obtained by dividing the number of individuals receiving a service or the number of unique objects distributed (M) by the proportion of individuals in a representative survey who report receipt of the service or object (P). We have developed an approach to sample size calculation, interpreting methods to estimate the variance around estimates obtained using multiplier methods in conjunction with research into design effects and respondent-driven sampling. We describe an application to estimate the number of female sex workers in Harare, Zimbabwe. There is high variance in estimates. Random error around the size estimate reflects uncertainty from M and P, particularly when the estimate of P in the respondent-driven sampling survey is low. As expected, sample size requirements are higher when the design effect of the survey is assumed to be greater. We suggest a method for investigating the effects of sample size on the precision of a population size estimate obtained using multiplier methods and respondent-driven sampling. Uncertainty in the size estimate is high, particularly when P is small, so, balancing against other potential sources of bias, we advise researchers to consider longer service attendance reference periods and to distribute more unique objects, which is likely to result in a higher estimate of P in the respondent-driven sampling survey.
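
    The estimator described (N = M / P) and the influence of the design effect can be sketched as follows; the interval construction and all numbers are illustrative assumptions, not the authors' exact procedure:

    ```python
    import math

    def multiplier_estimate(M, p_hat, n, design_effect=2.0, z=1.96):
        """Multiplier-method population size estimate N = M / P, with an
        approximate interval reflecting the sampling error in P from a
        survey of size n (M is treated as known exactly)."""
        N_hat = M / p_hat
        se_p = math.sqrt(design_effect * p_hat * (1.0 - p_hat) / n)
        lower = M / (p_hat + z * se_p)   # a larger P implies a smaller N
        upper = M / (p_hat - z * se_p)
        return N_hat, lower, upper

    # Hypothetical: 600 unique objects distributed; 20% of a 400-person
    # respondent-driven sampling survey report having received one.
    print(multiplier_estimate(600, 0.20, 400))
    ```

    Because P appears in the denominator, small values of P stretch the upper bound sharply, mirroring the abstract's warning that uncertainty is high when P is small.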

  16. RAVEN Quality Assurance Activities

    Energy Technology Data Exchange (ETDEWEB)

    Cogliati, Joshua Joseph [Idaho National Lab. (INL), Idaho Falls, ID (United States)


    This report discusses the quality assurance activities needed to raise the Quality Level of Risk Analysis in a Virtual Environment (RAVEN) from Quality Level 3 to Quality Level 2. This report also describes the general RAVEN quality assurance activities. For improving the quality, reviews of code changes have been instituted, more parts of testing have been automated, and improved packaging has been created. For upgrading the quality level, requirements have been created and the workflow has been improved.

  17. Benchmarking Software Assurance Implementation (United States)


    Benchmarking Software Assurance Implementation. Michele Moss, SSTC Conference, May 18, 2011. Process Focused Assessment: Management Systems (ISO 9001, ISO 27001, ISO 20000); Capability Maturity Models (CMMI...)

  18. [Quality assurance in cardiology: Germany]. (United States)

    Silber, S


    Quality assurance is a touchy subject: difficult to implement, time-demanding and expensive. The goal of quality assurance is to assist both patients and physicians. In addition to legal requirements, quality assurance is necessary for medical as well as economic reasons. It makes sense that the license to practice medicine does not automatically entail the right to perform all medical procedures; the development of new methods and the insights won from important scientific studies necessitate constant training. Furthermore, the decreasing allocation of funds for medical care, combined with increased demand caused by new treatment methods and longer life expectancy, forces the development of instruments for specific and reasonable budgeting of medical expenditures. With respect to economics, the primary goal of quality management must be the avoidance of unnecessary hospital admissions. But the patient must retain the right to choose the physician he prefers. The organization of the supervising structures in Germany is inconsistent: in 1995, a new Zentralstelle der Deutschen Ärzteschaft zur Qualitätssicherung in der Medizin (German Physicians' Headquarters for Quality Assurance in Medicine) was founded; it is proportionally staffed by representatives of the Bundesärztekammer (BÄK, Federal Board of Physicians) and the Kassenärztliche Bundesvereinigung (KBV, Federal Commission of Panel Physicians). Furthermore, there is the Arbeitsgemeinschaft zur Förderung der Qualitätssicherung in der Medizin (Working Group for the Advancement of Quality Assurance in Medicine), in which the Bundesministerium für Gesundheit (Federal Ministry of Health) and the Kassenärztliche Vereinigung (KV, Public Health Insurance Providers) are represented. The KV already sees to it that stricter regulations govern physicians in private practice than those governing hospital physicians. There are three data banks existing on a voluntary basis for invasive diagnostic

  19. Liquid Chromatographic Method for Determination of Nisoldipine from Pharmaceutical Samples

    Directory of Open Access Journals (Sweden)

    Amit Gupta


    Full Text Available A simple and specific high-performance thin-layer chromatographic method was developed and validated for the determination of nisoldipine from a tablet dosage form. The method was carried out at 320 nm after extraction of the drug in methanol. The method uses aluminum plates pre-coated with silica gel 60F-254 as the stationary phase and cyclohexane-ethyl acetate-toluene (3:3:4, v/v/v) as the mobile phase. Linearity was established over a range of 400-2400 ng per zone. Both peak area ratio and peak height ratio showed an acceptable correlation coefficient, i.e. more than 0.99. However, we used peak area for validation purposes. Intra-day and inter-day precision was determined and found to have less than 6.0% RSD.
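
    The linearity check reported above (correlation coefficient above 0.99 over 400-2400 ng per zone) amounts to a least-squares fit of peak area against amount; a sketch with hypothetical calibration data:

    ```python
    import numpy as np

    # Hypothetical calibration data: amount per zone (ng) vs. peak area.
    amount = np.array([400, 800, 1200, 1600, 2000, 2400], dtype=float)
    area = np.array([1020, 2050, 2980, 4100, 5050, 6120], dtype=float)

    slope, intercept = np.polyfit(amount, area, 1)  # least-squares line
    r = np.corrcoef(amount, area)[0, 1]             # correlation coefficient
    print(f"area = {slope:.3f} * amount + {intercept:.1f}, r = {r:.4f}")
    assert r > 0.99  # acceptance criterion used in the abstract
    ```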

  20. Method for fractional solid-waste sampling and chemical analysis

    DEFF Research Database (Denmark)

    Riber, Christian; Rodushkin, I.; Spliid, Henrik


    Chemical characterization of solid waste is a demanding task due to the heterogeneity of the waste. This article describes how 45 material fractions hand-sorted from Danish household waste were subsampled and prepared for chemical analysis of 61 substances. All material fractions were subject...... of variance (20-85% of the overall variation). Only by increasing the sample size significantly can this variance be reduced. The accuracy and short-term reproducibility of the chemical characterization were good, as determined by the analysis of several relevant certified reference materials. Typically, six...

  1. Method for spiking soil samples with organic compounds

    DEFF Research Database (Denmark)

    Brinch, Ulla C; Ekelund, Flemming; Jacobsen, Carsten S


    We examined the harmful side effects on indigenous soil microorganisms of two organic solvents, acetone and dichloromethane, that are normally used for spiking of soil with polycyclic aromatic hydrocarbons for experimental purposes. The solvents were applied in two contamination protocols to either...... higher than in control soil, probably due mainly to release of predation from indigenous protozoa. In order to minimize solvent effects on indigenous soil microorganisms when spiking native soil samples with compounds having a low water solubility, we propose a common protocol in which the contaminant...

  2. [DOE method for evaluating environmental and waste management samples: Revision 1, Addendum 1

    Energy Technology Data Exchange (ETDEWEB)

    Goheen, S.C.


    The US Department of Energy's (DOE's) environmental and waste management (EM) sampling and analysis activities require that large numbers of samples be analyzed for materials characterization, environmental surveillance, and site-remediation programs. The present document, DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods), is a supplemental resource for analyzing many of these samples.

  3. 7 CFR 32.402 - Samples of mohair top grades; method of obtaining. (United States)


    § 32.402 Samples of mohair top grades; method of obtaining. Samples certified as representative of the official standards of the...

  4. 7 CFR 32.400 - Samples of grease mohair grades; method of obtaining. (United States)


    § 32.400 Samples of grease mohair grades; method of obtaining. Samples certified as representative of the official standards of the...

  5. 7 CFR 31.400 - Samples for wool and wool top grades; method of obtaining. (United States)


    § 31.400 Samples for wool and wool top grades; method of obtaining. Samples certified as representative of the official...

  6. Vegetation Sampling for Wetland Delineation: A Review and Synthesis of Methods and Sampling Issues (United States)


    ...element in several forest inventory programs. Data on seedlings, saplings, and overstory vegetation are routinely collected in many forest inventory methods (Schreuder et al. 1993; McRoberts and...), although these alternative metrics to cover or frequency are not regularly used in the majority of programs. The ...Service collects vegetation data using strata from long-term monitoring plots as part of its Forest Inventory and Analysis (FIA) and National Forest

  7. Detecting spatial structures in throughfall data: The effect of extent, sample size, sampling design, and variogram estimation method (United States)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander


    In the last decades, an increasing number of studies analyzed spatial patterns in throughfall by means of variograms. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and a layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation method on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with large outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling) and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments (non-robust and robust estimators) and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the number recommended by studies dealing with Gaussian data by up to 100 %. 
Given that most previous
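The method-of-moments variogram estimation discussed above can be sketched in a few lines. This is a generic illustration of the Matheron estimator, not the authors' code; the toy coordinates, lognormal values (skewed, like throughfall), and bin edges are invented for the example.

```python
import numpy as np

def empirical_variogram(coords, values, bin_edges):
    """Matheron (method-of-moments) estimator:
    gamma(h) = 0.5 * mean((z_i - z_j)^2) over pairs whose separation falls in bin h."""
    n = len(values)
    dists, sqdiffs = [], []
    for i in range(n):
        for j in range(i + 1, n):
            dists.append(np.linalg.norm(coords[i] - coords[j]))
            sqdiffs.append((values[i] - values[j]) ** 2)
    dists, sqdiffs = np.asarray(dists), np.asarray(sqdiffs)
    gamma = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (dists >= lo) & (dists < hi)
        gamma.append(0.5 * sqdiffs[mask].mean() if mask.any() else np.nan)
    return np.asarray(gamma)

# toy example: 50 random sampling locations on a 25 m x 25 m plot
rng = np.random.default_rng(0)
coords = rng.uniform(0, 25, size=(50, 2))
values = rng.lognormal(mean=0.0, sigma=1.0, size=50)  # heavy-tailed, non-Gaussian
gamma = empirical_variogram(coords, values, bin_edges=np.linspace(0, 12.5, 6))
```

Because the estimator averages squared differences, a single large outlier inflates every bin it touches, which is one reason the study compares robust and likelihood-based alternatives.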

  8. Tissue sampling methods and standards for vertebrate genomics

    Directory of Open Access Journals (Sweden)

    Wong Pamela BY


    Full Text Available Abstract The recent rise in the speed and efficiency of new sequencing technologies has facilitated high-throughput sequencing, assembly and analyses of genomes, advancing ongoing efforts to analyze genetic sequences across major vertebrate groups. Standardized procedures for acquiring high quality DNA and RNA and establishing cell lines from target species will facilitate these initiatives. We provide a legal and methodological guide according to four standards of acquiring and storing tissue for the Genome 10K Project and similar initiatives as follows: four-star (banked tissue/cell cultures, RNA from multiple types of tissue for transcriptomes, and sufficient flash-frozen tissue for 1 mg of DNA, all from a single individual); three-star (RNA as above and frozen tissue for 1 mg of DNA); two-star (frozen tissue for at least 700 μg of DNA); and one-star (ethanol-preserved tissue for 700 μg of DNA or less of mixed quality). At a minimum, all tissues collected for the Genome 10K and other genomic projects should consider each species’ natural history and follow institutional and legal requirements. Associated documentation should detail as much information as possible about provenance to ensure representative sampling and subsequent sequencing. Hopefully, the procedures outlined here will not only encourage success in the Genome 10K Project but also inspire the adoption of standards by other genomic projects, including those involving other biota.

  9. The Marker State Space (MSS) method for classifying clinical samples.

    Directory of Open Access Journals (Sweden)

    Brian P Fallon

    Full Text Available The development of accurate clinical biomarkers has been challenging in part due to the diversity between patients and diseases. One approach to account for the diversity is to use multiple markers to classify patients, based on the concept that each individual marker contributes information from its respective subclass of patients. Here we present a new strategy for developing biomarker panels that accounts for completely distinct patient subclasses. Marker State Space (MSS) defines "marker states" based on all possible patterns of high and low values among a panel of markers. Each marker state is defined as either a case state or a control state, and a sample is classified as case or control based on the state it occupies. MSS was used to define multi-marker panels that were robust in cross validation and training-set/test-set analyses and that yielded similar classification accuracy to several other classification algorithms. A three-marker panel for discriminating pancreatic cancer patients from control subjects revealed subclasses of patients based on distinct marker states. MSS provides a straightforward approach for modeling highly divergent subclasses of patients, which may be adaptable for diverse applications.
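As a rough illustration of the marker-state idea (this is not the authors' implementation; the cutoffs, toy training values, and majority-vote state labeling are invented for this sketch), a panel of markers can be binarized and each observed state labeled case or control:

```python
import numpy as np

def marker_states(X, cutoffs):
    """Binarize each marker at its cutoff -> one high/low (1/0) tuple per sample."""
    return [tuple(int(x > c) for x, c in zip(row, cutoffs)) for row in X]

def fit_state_labels(X, y, cutoffs):
    """Label each observed state by the majority class of training samples in it."""
    votes = {}
    for state, cls in zip(marker_states(X, cutoffs), y):
        votes.setdefault(state, []).append(cls)
    return {s: int(sum(v) * 2 >= len(v)) for s, v in votes.items()}

def predict(X, cutoffs, state_labels, default=0):
    """Classify by the state a sample occupies; unseen states fall back to control."""
    return [state_labels.get(s, default) for s in marker_states(X, cutoffs)]

# toy three-marker panel: cases tend to be high on markers 1 and 2
X_train = np.array([[2.0, 3.1, 0.4], [1.8, 2.9, 0.5],   # cases
                    [0.3, 0.4, 0.6], [0.2, 0.5, 0.7]])  # controls
y_train = [1, 1, 0, 0]
cutoffs = [1.0, 1.0, 1.0]
labels = fit_state_labels(X_train, y_train, cutoffs)
pred = predict(X_train, cutoffs, labels)
```

A panel of k markers yields up to 2^k states, so each state can capture a distinct patient subclass rather than forcing a single decision boundary across all patients.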

  10. 7 CFR 51.308 - Methods of sampling and calculation of percentages. (United States)


    ..., CERTIFICATION, AND STANDARDS) United States Standards for Grades of Apples Methods of Sampling and Calculation of Percentages § 51.308 Methods of sampling and calculation of percentages. (a) When the numerical... 7 Agriculture 2 2010-01-01 2010-01-01 false Methods of sampling and calculation of percentages. 51...

  11. Determination of methylmercury in marine biota samples: method validation. (United States)

    Carrasco, Luis; Vassileva, Emilia


    Regulatory authorities are expected to measure concentrations of contaminants in foodstuffs, but determining the total amount alone is not sufficient to fully judge the impact on human health. In particular, the methylation of metals generally increases their toxicity; therefore, validated analytical methods producing reliable results for the assessment of methylated species are highly needed. Nowadays, there is no legal limit for methylmercury (MeHg) in food matrices. Hence, no standardized method for the determination of MeHg exists within the international jurisdiction. Contemplating the possibility of a future legislative limit, a method for low-level determination of MeHg in marine biota matrices, based on aqueous-phase ethylation followed by purge and trap and gas chromatography (GC) coupled to pyrolysis-atomic fluorescence spectrometry (Py-AFS) detection, has been developed and validated. Five different extraction procedures, namely acid and alkaline leaching assisted by microwave and conventional oven heating, as well as enzymatic digestion, were evaluated in terms of their efficiency to extract MeHg from scallop soft tissue IAEA-452 Certified Reference Material. Alkaline extraction with 25% (w/w) KOH in methanol, microwave-assisted extraction (MAE) with 5M HCl and enzymatic digestion with protease XIV yielded the highest extraction recoveries. Standard addition or the introduction of a dilution step were successfully applied to overcome the matrix effects observed when microwave-assisted extraction using 25% (w/w) KOH in methanol or 25% (w/v) aqueous TMAH was used. ISO 17025 and Eurachem guidelines were followed to perform the validation of the methodology. 
Accordingly, blanks, selectivity, calibration curve, linearity (0.9995), working range (1-800pg), recovery (97%), precision, traceability, limit of detection (0.45pg), limit of quantification (0.85pg) and expanded uncertainty (15.86%, k=2) were assessed with Fish protein Dorm-3 Certified

  12. Sampling and measurement methods for diesel exhaust aerosol

    Energy Technology Data Exchange (ETDEWEB)

    Ristimaeki, J.


    Awareness of adverse health effects of urban aerosols has increased general interest in aerosol sources. As diesel engines are one significant urban anthropogenic particle source, diesel aerosols have been under intense research during the last decades. This thesis discusses the measurement issues related to diesel exhaust particles, focusing on effective density measurement with the ELPI-SMPS and TDA-ELPI methods, and presents some additional performance issues not discussed in the papers. As the emergence of volatile nanoparticles in the diesel exhaust is sensitive to prevailing circumstances, the dilution parameters must be properly controlled in laboratory measurements in order to obtain repeatable and reproducible results. In addition to the dilution parameters, the effect of ambient temperature on light-duty vehicle exhaust particulate emissions was studied. It was found that turbocharged diesel engines were relatively insensitive to changes in ambient temperature, whereas particle emissions from naturally aspirated gasoline vehicles were significantly increased at low temperatures. The measurement of the effective density and mass of aerosol particles with a DMA and an impactor was studied and applied to the characterisation of diesel exhaust particles. The TDA-ELPI method was used for determination of the volatile mass of diesel exhaust particles as a function of particle size. Based on the measurement results, condensation was suggested to be the main phenomenon driving volatile mass transfer to the exhaust particles. Identification of the process and the separation of volatile and solid mass may become important, as some health effect studies suggest the volatile fraction to be a key component causing the biological effects of diesel exhaust particles. (orig.)

  13. Effect of sample preparation methods on photometric determination of the tellurium and cobalt content in the samples of copper concentrates

    Directory of Open Access Journals (Sweden)

    Viktoriya Butenko


    Full Text Available Methods for the determination of cobalt and nickel in copper concentrates currently used in factory laboratories are very labor intensive and time consuming. The limiting stage of the analysis is the preliminary chemical sample preparation. Carrying out the decomposition of industrial samples with concentrated mineral acids in open systems does not allow the metrological characteristics of the methods to be improved; for this reason, improving the methods of sample preparation is quite relevant and of practical interest. The work was dedicated to determining the optimal conditions for preliminary chemical preparation of copper concentrate samples for the subsequent determination of cobalt and tellurium in the obtained solution using the tellurium-spectrophotometric method. Decomposition of the samples was carried out by acid dissolution in individual mineral acids and their mixtures with heating in an open system, as well as by using ultrasonication and microwave radiation in a closed system. In order to select the optimal conditions for the decomposition of the samples in a closed system, the phase contact time and the ultrasonic generator's power were varied. Intensification of the decomposition of copper concentrates with nitric acid (1:1), ultrasound and microwave radiation allowed cobalt and tellurium to be transferred quantitatively into solution in 20 and 30 min, respectively. This reduced the amount of reactants used and improved the accuracy of determination by running the process under strictly identical conditions.

  14. Estimating the Expected Value of Sample Information Using the Probabilistic Sensitivity Analysis Sample: A Fast, Nonparametric Regression-Based Method. (United States)

    Strong, Mark; Oakley, Jeremy E; Brennan, Alan; Breeze, Penny


    Health economic decision-analytic models are used to estimate the expected net benefits of competing decision options. The true values of the input parameters of such models are rarely known with certainty, and it is often useful to quantify the value to the decision maker of reducing uncertainty through collecting new data. In the context of a particular decision problem, the value of a proposed research design can be quantified by its expected value of sample information (EVSI). EVSI is commonly estimated via a 2-level Monte Carlo procedure in which plausible data sets are generated in an outer loop, and then, conditional on these, the parameters of the decision model are updated via Bayes rule and sampled in an inner loop. At each iteration of the inner loop, the decision model is evaluated. This is computationally demanding and may be difficult if the posterior distribution of the model parameters conditional on sampled data is hard to sample from. We describe a fast nonparametric regression-based method for estimating per-patient EVSI that requires only the probabilistic sensitivity analysis sample (i.e., the set of samples drawn from the joint distribution of the parameters and the corresponding net benefits). The method avoids the need to sample from the posterior distributions of the parameters and avoids the need to rerun the model. The only requirement is that sample data sets can be generated. The method is applicable with a model of any complexity and with any specification of model parameter distribution. We demonstrate in a case study the superior efficiency of the regression method over the 2-level Monte Carlo method. © The Author(s) 2015.
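The flavor of this regression-based approach can be sketched on a toy two-option model. Everything below is invented for illustration (the decision model, the proposed study of n = 50, and the cubic-polynomial smoother standing in for the nonparametric regression of the published method); only the overall recipe (regress net benefit from the PSA sample on a summary of simulated data, then take expected maxima) follows the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)
K = 5000  # PSA iterations

# toy model: incremental net benefit (option B minus A) driven by one parameter
theta = rng.normal(0.1, 0.5, K)   # uncertain treatment-effect parameter
inb = 1000 * theta                # incremental net benefit per PSA draw

# proposed study: n = 50 patients; its summary statistic is a noisy sample mean
n = 50
tbar = theta + rng.normal(0, 1.0 / np.sqrt(n), K)

# regress INB on the simulated-data summary; the fitted values approximate
# E[INB | data], so no inner-loop Bayesian updating or model reruns are needed
B = np.vander(tbar, 4)                       # cubic basis (includes intercept)
coef, *_ = np.linalg.lstsq(B, inb, rcond=None)
fitted = B @ coef

# value of information: expected gain from deciding after seeing the data
evpi = np.mean(np.maximum(inb, 0)) - max(np.mean(inb), 0)      # perfect info
evsi = np.mean(np.maximum(fitted, 0)) - max(np.mean(fitted), 0)  # sample info
```

The key saving is that both expectations are computed from the existing PSA draws: the regression replaces the inner Monte Carlo loop, and EVSI is bounded above by EVPI because the fitted conditional expectation is a smoothed version of the raw net benefits.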

  15. Sampling

    CERN Document Server

    Thompson, Steven K


    Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data Sampling provides an up-to-date treat

  16. [Comparative Analysis of Spectrophotometric Methods of the Protein Measurement in the Pectic Polysaccharide Samples]. (United States)

    Ponomareva, S A; Golovchenko, V V; Patova, O A; Vanchikova, E V; Ovodov, Y S


    To assess the reliability of protein content determination in pectic polysaccharide samples by absorbance in the ultraviolet and visible regions of the spectrum, eleven techniques were compared: the Flores, Lowry, Bradford, Sedmak and Ruhemann (ninhydrin reaction) methods, ultraviolet spectrophotometry, the Benedict's reagent method, the Nessler's reagent method, the amido black method, the bicinchoninic acid reagent and the biuret method. The data obtained show that the insufficient sensitivity of seven of the listed techniques does not allow their use for determination of the protein content in pectic polysaccharide samples. The Lowry, Bradford and Sedmak methods and the Nessler's reagent method, however, may be used for determination of the protein content in pectic polysaccharide samples; the Bradford method is advisable for determining protein contaminants in pectic polysaccharide samples when the protein content is less than 15%, and the Lowry method when it is more than 15%.

  17. Method optimization and quality assurance in speciation analysis using high performance liquid chromatography with detection by inductively coupled plasma mass spectrometry

    DEFF Research Database (Denmark)

    Larsen, Erik Huusfeldt


    by a factor of four by continuously introducing carbon as methanol via the mobile phase into the ICP. Sources of error in the HPLC system (column overload), in the sample introduction system (memory by organic solvents) and in the ICP-MS (spectroscopic interferences) and their prevention are also discussed...... speciation in the shrimp sample. With this analytical technique the HPLC retention time in combination with mass analysis of the molecular ions and their collision-induced fragments provide almost conclusive evidence of the identity of the analyte species. The speciation methods are validated by establishing...

  18. Field Methods and Quality-Assurance Plan for Quality-of-Water Activities, U.S. Geological Survey, Idaho National Laboratory, Idaho (United States)

    Knobel, LeRoy L.; Tucker, Betty J.; Rousseau, Joseph P.


    Water-quality activities conducted by the staff of the U.S. Geological Survey (USGS) Idaho National Laboratory (INL) Project Office coincide with the USGS mission of appraising the quantity and quality of the Nation's water resources. The activities are conducted in cooperation with the U.S. Department of Energy's (DOE) Idaho Operations Office. Results of the water-quality investigations are presented in various USGS publications or in refereed scientific journals. The results of the studies are highly regarded, and they are used with confidence by researchers, regulatory and managerial agencies, and interested civic groups. In its broadest sense, quality assurance refers to doing the job right the first time. It includes the functions of planning for products, review and acceptance of the products, and an audit designed to evaluate the system that produces the products. Quality control and quality assurance differ in that quality control ensures that things are done correctly given the 'state-of-the-art' technology, and quality assurance ensures that quality control is maintained within specified limits.

  19. Method and apparatus for processing a test sample to concentrate an analyte in the sample from a solvent in the sample (United States)

    Turner, Terry D.; Beller, Laurence S.; Clark, Michael L.; Klingler, Kerry M.


    A method of processing a test sample to concentrate an analyte in the sample from a solvent in the sample includes: a) boiling the test sample containing the analyte and solvent in a boiling chamber to a temperature greater than or equal to the solvent boiling temperature and less than the analyte boiling temperature to form a rising sample vapor mixture; b) passing the sample vapor mixture from the boiling chamber to an elongated primary separation tube, the separation tube having internal sidewalls and a longitudinal axis, the longitudinal axis being angled between vertical and horizontal and thus having an upper region and a lower region; c) collecting the physically transported liquid analyte on the internal sidewalls of the separation tube; and d) flowing the collected analyte along the angled internal sidewalls of the separation tube to and past the separation tube lower region. The invention also includes passing a turbulence inducing wave through a vapor mixture to separate physically transported liquid second material from vaporized first material. Apparatus are also disclosed for effecting separations. Further disclosed is a fluidically powered liquid test sample withdrawal apparatus for withdrawing a liquid test sample from a test sample container and for cleaning the test sample container.

  20. Comparisons of polybrominated diphenyl ether and hexabromocyclododecane concentrations in dust collected with two sampling methods and matched breast milk samples. (United States)

    Björklund, J A; Sellström, U; de Wit, C A; Aune, M; Lignell, S; Darnerud, P O


    Household dust from 19 Swedish homes was collected using two different sampling methods: from the occupant's own home vacuum cleaner after insertion of a new bag, and using a researcher-collected method in which settled house dust was collected from surfaces above floor level. The samples were analyzed for 16 polybrominated diphenyl ether (PBDE) congeners and total hexabromocyclododecane (HBCD). Significant correlations (r = 0.60-0.65; Spearman r = 0.47-0.54) were found between samples collected with the two sampling methods for ∑OctaBDE and ∑DecaBDE, but not for ∑PentaBDE or HBCD. Statistically significantly higher concentrations of all PBDE congeners were found in the researcher-collected dust than in the home vacuum cleaner bag dust (VCBD). For HBCD, however, the concentrations were significantly higher in the home VCBD samples. Analysis of the bags themselves indicated no or very low levels of PBDEs and HBCD. This indicates that there may be specific HBCD sources to the floor and/or that HBCD may be present in the vacuum cleaners themselves. The BDE-47 concentrations in matched pairs of VCBD and breast milk samples were significantly correlated (r = 0.514, P = 0.029), indicating that one possible exposure route for this congener may be via dust ingestion. The statistically significant correlations found between the two dust sampling methods for several individual PBDE congeners, ∑OctaBDE and ∑DecaBDE indicate that the same indoor sources contaminate both types of dust or that common processes govern the distribution of these compounds in the indoor environment. Therefore, either method is adequate for screening ∑OctaBDE and ∑DecaBDE in dust. The high variability seen between dust samples confirms results seen in other studies. For hexabromocyclododecane (HBCD), divergent results in the two dust types indicate differences in contamination sources to the floor compared with above-floor surfaces. Thus, it is still unclear which dust

  1. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality (United States)

    Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne


    Introduction Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. Materials and methods The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. Results The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. Conclusions The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in

  2. A method for the measurement of shielding effectiveness of planar samples requiring no sample edge preparation or contact


    Marvin, Andrew C.; Dawson, Linda; Flintoft, Ian D.; Dawson, John F.


    A method is presented for the measurement of shielding effectiveness of planar materials with nonconducting surfaces such as carbon fiber composites. The method overcomes edge termination problems with such materials by absorbing edge-diffracted energy. A dynamic range of up to 100 dB has been demonstrated over a frequency range of 1-8.5 GHz, depending on the size of the sample under test. Comparison with ASTM D4935 and nested reverberation measurements of shielding effectiveness shows good a...

  3. Improving malaria treatment and prevention in India by aiding district managers to manage their programmes with local information: a trial assessing the impact of Lot Quality Assurance Sampling on programme outcomes. (United States)

    Valadez, Joseph J; Devkota, Baburam; Pradhan, Madan Mohan; Meherda, Pramod; Sonal, G S; Dhariwal, Akshay; Davis, Rosemary


    This paper reports the first trial of Lot Quality Assurance Sampling (LQAS) assessing associations between access to LQAS data and subsequent improvements in district programming. This trial concerns India's approach to addressing an increase in malaria-attributable deaths by training community health workers to diagnose, treat and prevent malaria, while using LQAS to monitor sub-district performance and make programme improvements. The Ministry of Health introduced LQAS into four matched high malaria burden districts (Annual Parasite Incidence >5) (N > 5 million). In each sub-district, we sampled four populations in three 6-monthly surveys: households, children <5 years, people with fever in the last 2 weeks and community health workers. In three districts, trained local staff collected, analysed and used data for programme management; in one control district, non-local staff collected data and did not disseminate results. For eight indicators, we calculated the change in proportion from survey one to three and used a Difference-in-Differences test to compare the relative change between intervention and control districts. Coverage increased from survey one to three for 24 of 32 comparisons. Difference-in-Differences tests revealed that intervention districts exhibited significantly greater change in four of six vertical strategies (insecticide treated bed-nets and indoor residual spraying), one of six treatment-seeking behaviours and four of 12 health worker capacity indicators. The control district displayed greater improvement than two intervention districts for one health worker capacity indicator. One district with poor management did not improve. In this study, LQAS results appeared to support district managers to increase coverage in underperforming areas, especially for vertical strategies in the presence of diligent managers. © 2014 The Authors. Tropical Medicine & International Health published by John Wiley & Sons Ltd.
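For context, the core of an LQAS design is a binomial decision rule: sample n individuals per lot and classify the lot as acceptable when at least d "successes" are observed, with d chosen to bound both misclassification risks. The sketch below is a generic illustration of that rule, not the trial's actual design parameters; the 19-sample, 80%-versus-50% design is a commonly cited textbook configuration.

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def lqas_rule(n, p_upper, p_lower, alpha=0.10, beta=0.10):
    """Smallest decision rule d such that a lot truly at coverage p_upper is
    accepted (>= d successes) with probability >= 1 - alpha, while a lot at
    p_lower is wrongly accepted with probability <= beta."""
    for d in range(n + 1):
        accept_good = 1 - binom_cdf(d - 1, n, p_upper)  # P(accept | good lot)
        accept_bad = 1 - binom_cdf(d - 1, n, p_lower)   # P(accept | bad lot)
        if accept_good >= 1 - alpha and accept_bad <= beta:
            return d
    return None  # no rule meets both error bounds for this n

# classic design: n = 19 per lot, 80% vs 50% coverage thresholds
d = lqas_rule(19, 0.80, 0.50)
```

Because classification needs only a count against a threshold, field staff can apply the rule per sub-district without statistical training, which is what makes LQAS attractive for routine district-level programme monitoring.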

  4. Quality Assurance Program Description

    Energy Technology Data Exchange (ETDEWEB)

    Halford, Vaughn Edward [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ryder, Ann Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)


    Effective May 1, 2017, led by a new executive leadership team, Sandia began operating within a new organizational structure. National Technology and Engineering Solutions of Sandia (Sandia’s) Quality Assurance Program (QAP) was established to assign responsibilities and authorities, define workflow policies and requirements, and provide for the performance and assessment of work.

  5. Medicine in Ancient Assur

    DEFF Research Database (Denmark)

    Arbøll, Troels Pank

    This dissertation is a microhistorical study of a single individual named Kiṣir-Aššur who practiced medicine in the ancient city of Assur (modern northern Iraq) in the 7th century BCE. The study provides the first detailed analysis of one healer’s education and practice in ancient Mesopotamia...

  6. Read Code Quality Assurance (United States)

    Schulz, Erich; Barrett, James W.; Price, Colin


    As controlled clinical vocabularies assume an increasing role in modern clinical information systems, so the issue of their quality demands greater attention. In order to meet the resulting stringent criteria for completeness and correctness, a quality assurance system comprising a database of more than 500 rules is being developed and applied to the Read Thesaurus. The authors discuss the requirement to apply quality assurance processes to their dynamic editing database in order to ensure the quality of exported products. Sources of errors include human, hardware, and software factors as well as new rules and transactions. The overall quality strategy includes prevention, detection, and correction of errors. The quality assurance process encompasses simple data specification, internal consistency, inspection procedures and, eventually, field testing. The quality assurance system is driven by a small number of tables and UNIX scripts, with “business rules” declared explicitly as Structured Query Language (SQL) statements. Concurrent authorship, client-server technology, and an initial failure to implement robust transaction control have all provided valuable lessons. The feedback loop for error management needs to be short. PMID:9670131

  7. Quality assurance in Zambia. (United States)

    Reinke, J; Tembo, J; Limbambala, M F; Chikuta, S; Zaenger, D


    Primary health care reforms in Zambia have focused on the themes of effective leadership, community involvement, and improved service quality. To achieve these goals, the Ministry of Health's structure has been decentralized and a Health Reforms Implementation Team (including a Quality Assurance Unit) has been established. This unit collaborates with government and private sector organizations and professional groups in areas such as strategic planning, problem solving, facility assessment, standards setting, and indicator development. Each province has two linkage facilitators who provide district-level training and support to quality assurance coaches. As part of this process, staff at Nanga Rural Health Center in Mazabuka District selected patient privacy as a priority quality assurance issue and established an enclosed area for patient interviews. This measure facilitated increased patient disclosure about and comfort with discussing sensitive medical issues such as family planning and sexually transmitted diseases. Next, the health center staff examined the problem of pharmaceutical shortages, and user fees were identified as a means of purchasing commonly unavailable drugs. At the Magoye Rural Health Center, quality assurance assessment led to the consolidation of services such as infant weighing and immunization at the same location, thereby significantly increasing service utilization.

  8. Providing Continuous Assurance

    NARCIS (Netherlands)

    Kocken, Jonne; Hulstijn, Joris


    It has been claimed that continuous assurance can be attained by combining continuous monitoring by management, with continuous auditing of data streams and the effectiveness of internal controls by an external auditor. However, we find that in existing literature the final step to continuous

  9. Microwave preservation method for DMSP, DMSO, and acrylate in unfiltered seawater and phytoplankton culture samples

    National Research Council Canada - National Science Library

    Kinsey, Joanna D; Kieber, David J


    ...T), dimethylsulfoxide (DMSOT), and acrylate (acrylateT) concentrations in unfiltered samples to alleviate problems associated with the acidification method when applied to samples containing Phaeocystis. Microwave- and acid...

  10. Detection and monitoring of invasive exotic plants: a comparison of four sampling methods (United States)

    Cynthia D. Huebner


    The ability to detect and monitor exotic invasive plants is likely to vary depending on the sampling method employed. Methods with strong qualitative thoroughness for species detection often lack the intensity necessary to monitor vegetation change. Four sampling methods (systematic plot, stratified-random plot, modified Whittaker, and timed meander) in hemlock and red...

  11. Creating Quality Assurance and International Transparency for Quality Assurance Agencies

    DEFF Research Database (Denmark)

    Kristoffersen, Dorte; Lindeberg, Tobias


    The paper presents the experiences gained in the pilot project on mutual recognition conducted by the quality assurance agencies in the Nordic countries and the future perspective for international quality assurance of national quality assurance agencies. The background of the project was the nee...

  12. Understanding What It Means for Assurance Cases to "Work" (United States)

    Rinehart, David J.; Knight, John C.; Rowanhill, Jonathan


    This report is the result of our year-long investigation into assurance case practices and effectiveness. Assurance cases are a method for working toward acceptable critical system performance. They represent a significant thread of applied assurance methods extending back many decades and being employed in a range of industries and applications. Our research presented in this report includes a literature survey of over 50 sources and interviews with nearly a dozen practitioners in the field. We have organized our results into seven major claimed assurance case benefits and their supporting mechanisms, evidence, counter-evidence, and caveats.

  13. Evaluation of micro-colorimetric lipid determination method with samples prepared using sonication and accelerated solvent extraction methods. (United States)

    Billa, Nanditha; Hubin-Barrows, Dylan; Lahren, Tylor; Burkhard, Lawrence P


    Two common laboratory extraction techniques were evaluated for routine use with the micro-colorimetric lipid determination method developed by Van Handel (1985) [2] and recently validated for small samples by Inouye and Lotufo (2006) [1]. With the accelerated solvent extraction method using chloroform:methanol solvent and the colorimetric lipid determination method, 28 of 30 samples had significant proportional bias (α=1%, determined using standard additions) and 1 of 30 samples had significant constant bias (α=1%, determined using Youden blank measurements). With sonic extraction, 0 of 6 samples had significant proportional bias (α=1%) and 1 of 6 samples had significant constant bias (α=1%). These results demonstrate that the accelerated solvent extraction method with the chloroform:methanol solvent system creates an interference with the colorimetric assay method; without accounting for this bias in the analysis, inaccurate measurements would be obtained. Published by Elsevier B.V.
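
    A recovery-slope check of the kind used in this evaluation can be sketched briefly. The numbers below are synthetic, not the study's data: known amounts are spiked into aliquots (standard additions), the response is regressed on the amount added, and a recovery slope well below 1 signals proportional bias (the study's α=1% significance test on the slope is omitted here).

```python
# Illustrative standard-additions check for proportional bias.
# Spiked amounts and responses are synthetic, chosen only to show the idea.
def fit_line(x, y):
    """Ordinary least squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    return slope, my - slope * mx

added    = [0.0, 25.0, 50.0, 100.0]   # ug lipid spiked into each aliquot
measured = [40.0, 55.0, 70.0, 100.0]  # ug recovered (synthetic responses)

slope, intercept = fit_line(added, measured)
# Recovery slope 0.6 (well below 1) indicates proportional bias;
# the intercept reflects the endogenous lipid in the unspiked sample.
print(round(slope, 2), round(intercept, 1))  # prints: 0.6 40.0
```

A real assay would replicate each spike level and test whether the slope differs significantly from 1 before declaring bias.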

  14. BYU Food Quality Assurance Laboratory (United States)

    Federal Laboratory Consortium — The Quality Assurance Lab is located in the Eyring Science Center in the department of Nutrition, Dietetics, and Food Science. The Quality Assurance Lab has about 10...

  15. A single-blood-sample method using inulin for estimating feline glomerular filtration rate. (United States)

    Katayama, M; Saito, J; Katayama, R; Yamagishi, N; Murayama, I; Miyano, A; Furuhama, K


    Application of a multisample method using inulin to estimate glomerular filtration rate (GFR) in cats is cumbersome. To establish a simplified procedure to estimate GFR in cats, a single-blood-sample method using inulin was compared with a conventional 3-sample method in a retrospective study of nine cats: 6 clinically healthy and 3 with spontaneous chronic kidney disease. Inulin was administered as an intravenous bolus at 50 mg/kg, and blood was collected 60, 90, and 120 minutes later for the 3-sample method. Serum inulin concentrations were colorimetrically determined by an autoanalyzer method. The GFR in the single-blood-sample method was calculated from the dose injected, serum concentration, sampling time, and estimated volume of distribution on the basis of the data of the 3-sample method. An excellent correlation was observed (r = 0.99, P = .0001) between GFR values estimated by the single-blood-sample and 3-sample methods. The single-blood-sample method using inulin provides a practicable and ethical alternative for estimating glomerular filtration rate in cats. Copyright © 2012 by the American College of Veterinary Internal Medicine.
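
    The single-sample calculation the abstract describes can be illustrated with a generic one-compartment model. The paper's exact formula and data are not reproduced here, so the relation, the parameter values, and the externally estimated volume of distribution below are all assumptions for illustration only.

```python
# Hedged sketch of a one-compartment, single-sample clearance estimate.
# From C(t) = (Dose / Vd) * exp(-CL * t / Vd) it follows that
#   CL = (Vd / t) * ln(Dose / (Vd * C)).
# Vd must be estimated separately (in the study, from the 3-sample data).
from math import log

def gfr_single_sample(dose_mg: float, conc_mg_per_ml: float,
                      t_min: float, vd_ml: float) -> float:
    """Clearance (mL/min) from one post-bolus sample, one-compartment model."""
    return (vd_ml / t_min) * log(dose_mg / (vd_ml * conc_mg_per_ml))

# Hypothetical cat: 50 mg/kg x 4 kg = 200 mg inulin bolus, assumed
# Vd of 1000 mL, serum concentration 0.05 mg/mL at 90 min post-injection.
gfr = gfr_single_sample(dose_mg=200.0, conc_mg_per_ml=0.05,
                        t_min=90.0, vd_ml=1000.0)
print(round(gfr, 1), "mL/min")  # prints: 15.4 mL/min
```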

  16. Evaluation of sampling methods for the detection of Salmonella in broiler flocks

    DEFF Research Database (Denmark)

    Skov, Marianne N.; Carstensen, B.; Tornoe, N.


    The present study compares four different sampling methods potentially applicable to detection of Salmonella in broiler flocks, based on collection of faecal samples: (i) by hand (300 fresh faecal samples); (ii) absorbed on five sheets of paper; (iii) absorbed on five pairs of socks (elastic cotton...... horizontal or vertical) were found in the investigation. The results showed that the sock method (five pairs of socks) had a sensitivity comparable with the hand collection method (60 pools of five faecal samples); the paper collection method was inferior, as was the use of only one pair of socks. Estimation...

  17. Intervene before leaving: clustered lot quality assurance sampling to monitor vaccination coverage at health district level before the end of a yellow fever and measles vaccination campaign in Sierra Leone in 2009. (United States)

    Pezzoli, Lorenzo; Conteh, Ishata; Kamara, Wogba; Gacic-Dobo, Marta; Ronveaux, Olivier; Perea, William A; Lewis, Rosamund F


    In November 2009, Sierra Leone conducted a preventive yellow fever (YF) vaccination campaign targeting individuals aged nine months and older in six health districts. The campaign was integrated with a measles follow-up campaign throughout the country targeting children aged 9-59 months. For both campaigns, the operational objective was to reach 95% of the target population. During the campaign, we used clustered lot quality assurance sampling (C-LQAS) to identify areas of low coverage to recommend timely mop-up actions. We divided the country into 20 non-overlapping lots. Twelve lots were targeted by both vaccines, while eight by measles only. In each lot, five clusters of ten eligible individuals were selected for each vaccine. The upper threshold (UT) was set at 90% and the lower threshold (LT) at 75%. A lot was rejected for low vaccination coverage if more than 7 unvaccinated individuals (not presenting a vaccination card) were found. After the campaign, we plotted the C-LQAS results against the post-campaign coverage estimations to assess whether early interventions were successful enough to increase coverage in the lots that were at the level of rejection before the end of the campaign. During the last two days of the campaign, based on card-confirmed vaccination status, five lots out of 20 (25.0%) failed for low measles vaccination coverage and three lots out of 12 (25.0%) for low YF coverage. In one district, estimated post-campaign vaccination coverage for both vaccines was still not significantly above the minimum acceptable level (LT = 75%) even after vaccination mop-up activities. C-LQAS during the vaccination campaign was informative in identifying areas requiring mop-up activities to reach the coverage target prior to leaving the region. The only district where mop-up activities seemed to be unsuccessful might have had logistical difficulties that should be further investigated and resolved.
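
    The lot decision rule used here (5 clusters × 10 individuals = 50 per lot; reject if more than 7 unvaccinated) can be sketched as follows. The binomial operating-characteristic calculation below ignores the clustering for simplicity, so it is only an approximation of the C-LQAS design, not the survey's actual analysis.

```python
# Sketch of the LQAS decision rule described in the abstract:
# 50 individuals per lot, reject the lot if more than 7 are unvaccinated.
from math import comb

SAMPLE_SIZE = 50        # 5 clusters x 10 individuals per lot
DECISION_VALUE = 7      # reject if more unvaccinated than this

def classify_lot(n_unvaccinated: int) -> str:
    """Apply the decision rule to an observed count."""
    return "reject" if n_unvaccinated > DECISION_VALUE else "accept"

def prob_reject(coverage: float) -> float:
    """Chance of rejecting a lot with the given true coverage,
    under a binomial model that ignores the cluster design."""
    p = 1.0 - coverage  # probability an individual is unvaccinated
    return sum(
        comb(SAMPLE_SIZE, k) * p**k * (1.0 - p)**(SAMPLE_SIZE - k)
        for k in range(DECISION_VALUE + 1, SAMPLE_SIZE + 1)
    )

print(classify_lot(9))  # prints: reject
# A lot at the lower threshold (75% coverage) is usually rejected,
# while one at the upper threshold (90%) rarely is:
print(round(prob_reject(0.75), 2), round(prob_reject(0.90), 2))
```

Thresholds like UT = 90% and LT = 75% are chosen so that these two error probabilities stay within the agreed α and β risks.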

  18. Methods of sampling airborne fungi in working environments of waste treatment facilities

    Directory of Open Access Journals (Sweden)

    Kristýna Černá


    Objectives: The objective of the present study was to evaluate and compare the efficiency of a filter-based sampling method and a high-volume sampling method for sampling airborne culturable fungi present in waste sorting facilities. Material and Methods: The membrane filters method was compared with the surface air system method. The selected sampling methods were modified and tested in 2 plastic waste sorting facilities. Results: The total number of colony-forming units (CFU)/m³ of airborne fungi was dependent on the type of sampling device, on the time of sampling, which was carried out every hour from the beginning of the work shift, and on the type of cultivation medium (p < 0.001). Detected concentrations of airborne fungi ranged from 2×10² to 1.7×10⁶ CFU/m³ when using the membrane filters (MF) method, and from 3×10² to 6.4×10⁴ CFU/m³ when using the surface air system (SAS) method. Conclusions: Both methods showed comparable sensitivity to the fluctuations of the concentrations of airborne fungi during the work shifts. The SAS method is adequate for a fast indicative determination of the concentration of airborne fungi. The MF method is suitable for thorough assessment of working environment contamination by airborne fungi. Therefore we recommend the MF method for the implementation of a uniform standard methodology of airborne fungi sampling in working environments of waste treatment facilities.

  19. Trends in newborn umbilical cord care practices in Sokoto and Bauchi States of Nigeria: the where, who, how, what and the ubiquitous role of traditional birth attendants: a lot quality assurance sampling survey. (United States)

    Abegunde, Dele; Orobaton, Nosa; Beal, Katherine; Bassi, Amos; Bamidele, Moyosola; Akomolafe, Toyin; Ohanyido, Francis; Umar-Farouk, Olayinka; Danladi, Saba'atu


    Neonatal infections caused by unsafe umbilical cord practices account for the majority of neonatal deaths in Nigeria. We examined the trends in umbilical cord care practices between 2012 and 2015 that coincided with the introduction of chlorhexidine digluconate 7.1% gel in Bauchi and Sokoto States. We obtained data from three rounds of lot quality assurance sampling (LQAS) surveys conducted in 2012, 2013 and 2015. Households were randomly sampled in each round, totaling 1140 and 1311 households in Bauchi and Sokoto States respectively. Mothers responded to questions on cord care practices in the last delivery. Coverage estimates of practice indicators were obtained for each survey period. Local Government Area (LGA) estimates for each indicator were obtained with α ≤ 5% and β ≤ 20% statistical errors and aggregated to State-level estimates with finite sample correction relative to the LGA population. Over 75% and 80% of deliveries in Bauchi and Sokoto States respectively took place at home. The proportion of deliveries in public facilities reported by mothers ranged from 19% in 2012 to 22.4% in 2015 in Bauchi State and from 12.9% in 2012 to 13.2% in 2015 in Sokoto State. Approximately 50% of deliveries in Bauchi and more than 80% in Sokoto States were assisted by traditional birth attendants (TBAs) or relatives and friends, with little change across the survey periods. In Bauchi and Sokoto States, over 75% and over 80% of newborn cords respectively were cut with razor blades, underscoring the pervasive role of the TBAs in the immediate postpartum period. Use of chlorhexidine digluconate 7.1% gel for cord dressing significantly increased to its highest level in 2015 in both States. Health workers who attended deliveries in health facilities switched from methylated spirit to chlorhexidine. There were no observable changes in cord care practices among the TBAs. Unsafe umbilical cord care practices remained prevalent in Bauchi and Sokoto States of Nigeria, although a recent

  20. Comparison of preprocessing methods and storage times for touch DNA samples. (United States)

    Dong, Hui; Wang, Jing; Zhang, Tao; Ge, Jian-Ye; Dong, Ying-Qiang; Sun, Qi-Fan; Liu, Chao; Li, Cai-Xia


    The aims were to select appropriate preprocessing methods for different substrates by comparing the effects of four different preprocessing methods on touch DNA samples, and to determine the effect of various storage times on the results of touch DNA sample analysis. Hand touch DNA samples were used to investigate the detection and inspection results of DNA on different substrates. Four preprocessing methods, including the direct cutting method, stubbing procedure, double swab technique, and vacuum cleaner method, were used in this study. DNA was extracted from mock samples with the four preprocessing methods. The best preprocessing protocol determined from the study was further used to compare performance after various storage times. DNA extracted from all samples was quantified and amplified using standard procedures. The amounts of DNA and the numbers of alleles detected on the porous substrates were greater than those on the non-porous substrates. The performance of the four preprocessing methods varied with different substrates. The direct cutting method displayed advantages for porous substrates, and the vacuum cleaner method was advantageous for non-porous substrates. No significant degradation trend was observed as the storage times increased. Different substrates require different preprocessing methods in order to obtain the highest DNA amount and allele number from touch DNA samples. This study provides a theoretical basis for explorations of touch DNA samples and may be used as a reference when dealing with touch DNA samples in casework.

  1. A Novel Method of Failure Sample Selection for Electrical Systems Using Ant Colony Optimization. (United States)

    Xiong, Jian; Tian, Shulin; Yang, Chenglin; Liu, Cheng


    The influence of failure propagation is ignored in failure sample selection based on the traditional testability demonstration experiment method. Traditional failure sample selection generally omits some failures during the selection, and this omission carries serious risk in use because the omitted failures can trigger serious propagation failures. This paper proposes a new failure sample selection method to solve this problem. First, the method uses a directed graph and ant colony optimization (ACO) to obtain a subsequent failure propagation set (SFPS) based on a failure propagation model; a new failure sample selection method is then proposed on the basis of the number of SFPS. Compared with the traditional sampling plan, this method improves the coverage of testing failure samples, increases diagnostic capacity, and decreases the risk of use.

  2. [Comparison of the designing effects (DE) among different designs related to complex sampling methods]. (United States)

    Wang, Jian-Sheng; Feng, Guo-Shuang; Yu, Shi-Cheng; Ma, Lin-Mao; Zhou, Mai-Geng; Liu, Shi-Yao


    To compare the designing effects (DE) among different complex sampling design programs, data from the 2002 Chinese Nutrition and Health Survey were used to generate the sampling population, and a statistical simulation method was used to estimate the DE values of six complex sampling design programs. The DE values varied among the six programs and were positively associated with sample size, with the number of sampling stages, and with coarser stratification. Reducing the number of sampling stages and refining the stratification categories decreased the DE values and thus improved design efficiency.
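
    The design effect being compared is the ratio of the sampling variance of an estimator under the complex design to its variance under simple random sampling of the same size. A minimal Monte Carlo sketch with synthetic data (not the 2002 survey) illustrates how clustering inflates the DE:

```python
# Monte Carlo illustration of a design effect (DE):
#   DE = Var(mean under cluster design) / Var(mean under SRS),
# estimated by repeated sampling from a synthetic clustered population.
import random
random.seed(1)

# Synthetic population: 200 clusters of 50, with between-cluster variation.
population = []
for _ in range(200):
    cluster_mean = random.gauss(100, 10)   # cluster-level effect
    population.append([random.gauss(cluster_mean, 5) for _ in range(50)])
flat = [x for cl in population for x in cl]

def mean(xs):
    return sum(xs) / len(xs)

def var(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

REPS, N_CLUSTERS = 2000, 10   # draw 10 whole clusters (n = 500) per rep
cluster_means, srs_means = [], []
for _ in range(REPS):
    chosen = random.sample(population, N_CLUSTERS)
    cluster_means.append(mean([x for cl in chosen for x in cl]))
    srs_means.append(mean(random.sample(flat, N_CLUSTERS * 50)))

design_effect = var(cluster_means) / var(srs_means)
print(round(design_effect, 1))  # DE well above 1: clustering inflates variance
```

With strong between-cluster variation, as here, sampling whole clusters wastes much of the nominal sample size, which is exactly what a large DE expresses.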

  3. Simulated Tempering Distributed Replica Sampling, Virtual Replica Exchange, and Other Generalized-Ensemble Methods for Conformational Sampling. (United States)

    Rauscher, Sarah; Neale, Chris; Pomès, Régis


    Generalized-ensemble algorithms in temperature space have become popular tools to enhance conformational sampling in biomolecular simulations. A random walk in temperature leads to a corresponding random walk in potential energy, which can be used to cross over energetic barriers and overcome the problem of quasi-nonergodicity. In this paper, we introduce two novel methods: simulated tempering distributed replica sampling (STDR) and virtual replica exchange (VREX). These methods are designed to address the practical issues inherent in the replica exchange (RE), simulated tempering (ST), and serial replica exchange (SREM) algorithms. RE requires a large, dedicated, and homogeneous cluster of CPUs to function efficiently when applied to complex systems. ST and SREM both have the drawback of requiring extensive initial simulations, possibly adaptive, for the calculation of weight factors or potential energy distribution functions. STDR and VREX alleviate the need for lengthy initial simulations, and for synchronization and extensive communication between replicas. Both methods are therefore suitable for distributed or heterogeneous computing platforms. We perform an objective comparison of all five algorithms in terms of both implementation issues and sampling efficiency. We use disordered peptides in explicit water as test systems, for a total simulation time of over 42 μs. Efficiency is defined in terms of both structural convergence and temperature diffusion, and we show that these definitions of efficiency are in fact correlated. Importantly, we find that ST-based methods exhibit faster temperature diffusion and correspondingly faster convergence of structural properties compared to RE-based methods. Within the RE-based methods, VREX is superior to both SREM and RE. On the basis of our observations, we conclude that ST is ideal for simple systems, while STDR is well-suited for complex systems.
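
    The temperature-exchange move shared by the RE-family algorithms discussed above can be sketched in a few lines. This is the generic Metropolis swap criterion, not the STDR or VREX bookkeeping introduced in the paper, and the temperatures and energies are arbitrary illustrative values:

```python
# Generic Metropolis temperature-swap step: replicas at inverse
# temperatures beta_i, beta_j exchange configurations with probability
#   p = min(1, exp((beta_i - beta_j) * (E_i - E_j)))
import math
import random

def swap_probability(beta_i: float, beta_j: float,
                     e_i: float, e_j: float) -> float:
    return min(1.0, math.exp((beta_i - beta_j) * (e_i - e_j)))

def attempt_swap(replica_i: dict, replica_j: dict, rng=random) -> bool:
    """Swap the configurations (represented here by energies) of two replicas."""
    p = swap_probability(replica_i["beta"], replica_j["beta"],
                         replica_i["energy"], replica_j["energy"])
    if rng.random() < p:
        replica_i["energy"], replica_j["energy"] = (
            replica_j["energy"], replica_i["energy"])
        return True
    return False

# When the hotter replica has found a lower energy than the colder one,
# the exponent is positive and the swap is always accepted:
cold = {"beta": 1.0 / 300.0, "energy": -90.0}
hot = {"beta": 1.0 / 400.0, "energy": -120.0}
print(swap_probability(cold["beta"], hot["beta"],
                       cold["energy"], hot["energy"]))  # prints: 1.0
```

The same acceptance rule drives temperature diffusion in RE, ST, SREM, STDR, and VREX; the methods differ in how partners and weights are chosen, not in this criterion.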

  4. A standardized method for sampling and extraction methods for quantifying microplastics in beach sand. (United States)

    Besley, Aiken; Vijver, Martina G; Behrens, Paul; Bosker, Thijs


    Microplastics are ubiquitous in the environment, are frequently ingested by organisms, and may potentially cause harm. A range of studies have found significant levels of microplastics in beach sand. However, there is a considerable amount of methodological variability among these studies. Methodological variation currently limits comparisons as there is no standard procedure for sampling or extraction of microplastics. We identify key sampling and extraction procedures across the literature through a detailed review. We find that sampling depth, sampling location, number of repeat extractions, and settling times are the critical parameters of variation. Next, using a case-study we determine whether and to what extent these differences impact study outcomes. By investigating the common practices identified in the literature with the case-study, we provide a standard operating procedure for sampling and extracting microplastics from beach sand. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Individual and pen-based oral fluid sampling: A welfare-friendly sampling method for group-housed gestating sows. (United States)

    Pol, Françoise; Dorenlor, Virginie; Eono, Florent; Eudier, Solveig; Eveno, Eric; Liégard-Vanhecke, Dorine; Rose, Nicolas; Fablet, Christelle


    The aims of this study were to assess the feasibility of individual and pen-based oral fluid sampling (OFS) in 35 pig herds with group-housed sows, compare these methods to blood sampling, and assess the factors influencing the success of sampling. Individual samples were collected from at least 30 sows per herd. Pen-based OFS was performed using devices placed in at least three pens for 45 min. Information related to the farm, the sows, and their living conditions was collected. Factors significantly associated with the duration of sampling and the chewing behaviour of sows were identified by logistic regression. Individual OFS took 2 min 42 s on average; the type of floor, swab size, and operator were associated with a sampling time >2 min. Pen-based OFS was obtained from 112 devices (62.2%). The type of floor, parity, pen-level activity, and type of feeding were associated with chewing behaviour. Pen activity was associated with the latency to interact with the device. The type of floor, gestation stage, parity, group size, and latency to interact with the device were associated with a chewing time >10 min. After 15, 30 and 45 min of pen-based OFS, 48%, 60% and 65% of the sows were lying down, respectively. The time elapsed after the beginning of sampling, genetic type, and time elapsed since the last meal were associated with 50% of the sows lying down at one time point. The mean time to blood sample the sows was 1 min 16 s, or 2 min 52 s if the number of operators required was considered in the sampling time estimation. The genetic type, parity, and type of floor were significantly associated with a sampling time higher than 1 min 30 s. This study shows that individual OFS is easy to perform in group-housed sows by a single operator, even though straw-bedded animals take longer to sample than animals housed on slatted floors, and suggests some guidelines to optimise pen-based OFS success. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Estimation method for mathematical expectation of continuous variable upon ordered sample


    Domchenkov, O. A.


    A method for estimating the mathematical expectation of a continuous variable based on analysis of an ordered sample is proposed. The method admits extension of the estimation class to nonlinear estimation classes.

  7. Methods of sampling airborne fungi in working environments of waste treatment facilities. (United States)

    Černá, Kristýna; Wittlingerová, Zdeňka; Zimová, Magdaléna; Janovský, Zdeněk


    The objective of the present study was to evaluate and compare the efficiency of a filter-based sampling method and a high-volume sampling method for sampling airborne culturable fungi present in waste sorting facilities. The membrane filters method was compared with the surface air system method. The selected sampling methods were modified and tested in 2 plastic waste sorting facilities. The total number of colony-forming units (CFU)/m³ of airborne fungi was dependent on the type of sampling device, on the time of sampling, which was carried out every hour from the beginning of the work shift, and on the type of cultivation medium (p < 0.001). Detected concentrations of airborne fungi ranged from 2×10² to 1.7×10⁶ CFU/m³ when using the membrane filters (MF) method, and from 3×10² to 6.4×10⁴ CFU/m³ when using the surface air system (SAS) method. Both methods showed comparable sensitivity to the fluctuations of the concentrations of airborne fungi during the work shifts. The SAS method is adequate for a fast indicative determination of the concentration of airborne fungi. The MF method is suitable for thorough assessment of working environment contamination by airborne fungi. Therefore we recommend the MF method for the implementation of a uniform standard methodology of airborne fungi sampling in working environments of waste treatment facilities.
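
    The CFU/m³ figures reported above come from a simple unit conversion: colonies counted on the plate divided by the air volume drawn by the sampler. The sketch below uses hypothetical counts and volumes; note that real impaction samplers such as the SAS usually also apply a positive-hole correction to the raw count, which is omitted here.

```python
# Unit conversion behind airborne-fungi concentrations:
# a sampler draws a known air volume in litres, and the colony count
# on the plate is scaled to one cubic metre (1 m^3 = 1000 L).
# Counts and volumes below are hypothetical.
def cfu_per_m3(colonies: int, sampled_volume_litres: float) -> float:
    """Airborne concentration in CFU/m^3."""
    return colonies * 1000.0 / sampled_volume_litres

print(cfu_per_m3(colonies=180, sampled_volume_litres=100.0))  # prints: 1800.0
```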

  8. Power transformers quality assurance

    CERN Document Server

    Dasgupta, Indrajit


    About the Book: With the view to attain higher reliability in power system operation, quality assurance in the field of distribution and power transformers has claimed growing attention. Besides new developments in material technology and the manufacturing processes of transformers, regular diagnostic testing and maintenance of any engineering product may be ascertained by ensuring: right selection of materials and components and their quality checks; application of correct manufacturing processes and systems engineering; and the user's awareness of preventive maintenance. The

  9. Quality assurance for gamma knives

    Energy Technology Data Exchange (ETDEWEB)

    Jones, E.D.; Banks, W.W.; Fischer, L.E. [Lawrence Livermore National Lab., CA (United States)


    This report describes and summarizes the results of a quality assurance (QA) study of the Gamma Knife, a nuclear medical device used for the gamma irradiation of intracranial lesions. Focus was on the physical aspects of QA and did not address issues that are essentially medical, such as patient selection or prescription of dose. A risk-based QA assessment approach was used. Sample programs for quality control and assurance are included. The use of the Gamma Knife was found to conform to existing standards and guidelines concerning radiation safety and quality control of external beam therapies (shielding, safety reviews, radiation surveys, interlock systems, exposure monitoring, good medical physics practices, etc.) and to be compliant with NRC teletherapy regulations. There are, however, current practices for the Gamma Knife not covered by existing, formalized regulations, standards, or guidelines. These practices have been adopted by Gamma Knife users and continue to be developed with further experience. Some of these have appeared in publications or presentations and are slowly finding their way into recommendations of professional organizations.

  10. Reliability of different methods used for forming of working samples in the laboratory for seed testing

    Directory of Open Access Journals (Sweden)

    Opra Branislava


    The testing of seed quality starts from the moment a sample is formed in a warehouse during processing or packaging of the seed. Seed sampling, as the process of obtaining the working sample, also encompasses each step undertaken during its testing in the laboratory. For appropriate forming of a seed sample in the laboratory, the use of a seed divider is prescribed for large-seeded species (seed the size of wheat or larger; ISTA Rules, 1999). The aim of this paper was to compare different methods used for obtaining working samples of maize and wheat seeds using conical, soil and centrifugal dividers. The number of seeds of added admixtures confirmed the reliability of working-sample formation. To each maize sample (1000 g), 10 seeds of each of the following admixtures were added: Zea mays L. (red pericarp), Hordeum vulgare L., Triticum aestivum L., and Glycine max (L.) Merr. Two methods were used for formation of the maize seed working sample. To wheat samples (1000 g), 10 seeds of each of the following species were added: Avena sativa (hulled seeds), Hordeum vulgare L., Galium tricorne Stokes, and Polygonum lapathifolium L. For formation of wheat seed working samples, four methods were used. An optimum of 9, but not fewer than 7, seeds of admixture were expected to be found in the maize seed working sample, while for wheat at least one seed of admixture was expected in the working sample. The obtained results confirmed that the formation of the maize seed working samples was most reliable when the centrifugal divider and the first method were used (average of admixture: 9.37). Of the observed admixtures, the seed of Triticum aestivum L. was the most uniformly distributed, again with the first method (6.93). The second method also gives high average values satisfying the given criterion, but it should be used with prior homogenization of the sample being tested. The forming of wheat seed working samples is the most reliable if the

  11. An improved adaptive sampling and experiment design method for aerodynamic optimization

    Directory of Open Access Journals (Sweden)

    Huang Jiangtao


    Experiment design is key to constructing a highly reliable surrogate model for numerical optimization in large-scale projects. Within the method, the experimental design criterion directly affects the accuracy of the surrogate model and the optimization efficiency. To address the shortcomings of traditional experimental design, an improved adaptive sampling method is proposed in this paper. The surrogate model is first constructed from basic sparse samples. The supplementary sampling position is then detected according to specified criteria, introducing energy-function and curvature sampling criteria based on a radial basis function (RBF) network. The sampling detection criteria consider both the uniformity of the sample distribution and the description of hypersurface curvature, so as to significantly improve the prediction accuracy of the surrogate model with far fewer samples. For a surrogate model constructed with sparse samples, sample uniformity is an important factor in interpolation accuracy in the initial stage of adaptive sampling and surrogate model training. As uniformity improves, the curvature description of the objective function surface gradually becomes more important. In consideration of these issues, a crowdness-enhance function and a root mean square error (RMSE) feedback function are introduced in the C criterion expression. Thus, a new sampling method called RMSE and crowdness enhance (RCE) adaptive sampling is established. The validity of the RCE adaptive sampling method is studied first on typical test functions and then on an airfoil/wing aerodynamic optimization design problem with a high-dimensional design space. The results show that the RCE adaptive sampling method not only reduces the required number of samples, but also effectively improves the prediction accuracy of the surrogate model, giving it broad prospects for application.

  12. Methods and devices for hyperpolarising and melting NMR samples in a cryostat

    DEFF Research Database (Denmark)

    Ardenkjaer-Larsen, Jan Henrik; Axelsson, Oskar H. E.; Golman, Klaes Koppel


    The present invention relates to devices and methods for melting a solid polarised sample while retaining a high level of polarisation. In an embodiment of the present invention, a sample is polarised in a sample-retaining cup 9 in a strong magnetic field in a polarising means 3a, 3b, 3c in a cryosta...

  13. Optical Methods for Identifying Hard Clay Core Samples During Petrophysical Studies (United States)

    Morev, A. V.; Solovyeva, A. V.; Morev, V. A.


    X-ray phase analysis of the general mineralogical composition of core samples from one of the West Siberian fields was performed. Electronic absorption spectra of the clay core samples with an added indicator were studied. The speed and availability of applying the two methods in petrophysical laboratories during sample preparation for standard and special studies were estimated.

  14. Quantifying Uncertainties from Presence Data Sampling Methods for Species Distribution Modeling: Focused on Vegetation. (United States)

    Sung, S.; Kim, H. G.; Lee, D. K.; Park, J. H.; Mo, Y.; Kil, S.; Park, C.


    The impact of climate change has been observed throughout the globe, and ecosystems experience rapid changes such as vegetation shifts and species extinction. In this context, the Species Distribution Model (SDM) is a popular method to project the impact of climate change on ecosystems. An SDM is based on the niche of a given species, which means that presence point data are essential to run it. Running an SDM for plants requires certain considerations about the characteristics of vegetation. Normally, remote sensing techniques are used to produce vegetation data over large areas; as a result, the exact locations of presence data carry high uncertainty, because presence data sets are selected from polygon and raster datasets. Thus, sampling methods for modelling vegetation presence data should be carefully selected. In this study, we used three different sampling methods for the selection of vegetation presence data: random sampling, stratified sampling and site-index-based sampling. We used the R package BIOMOD2 to assess uncertainty from modelling, and included BioCLIM variables and other environmental variables as input data. Despite differences among the 10 SDMs, the sampling methods differed in ROC values: random sampling showed the lowest ROC value, while site-index-based sampling showed the highest. As a result of this study, the uncertainties arising from presence data sampling methods and SDMs can be quantified.

  15. Universal nucleic acids sample preparation method for cells, spores and their mixture (United States)

    Bavykin, Sergei [Darien, IL]


    The present invention relates to a method for extracting nucleic acids from biological samples. More specifically, the invention relates to a universal method for extracting nucleic acids from unidentified biological samples. An advantage of the presently invented method is its ability to effectively and efficiently extract nucleic acids from a variety of different cell types, including but not limited to prokaryotic or eukaryotic cells and/or recalcitrant organisms (i.e., spores). Unlike prior art methods, which focus on extracting nucleic acids from either vegetative cells or spores, the present invention effectively extracts nucleic acids from spores, multiple cell types, or mixtures thereof using a single method. Importantly, the invented method has demonstrated an ability to extract nucleic acids from spores and vegetative bacterial cells with similar levels of effectiveness. The invented method employs a multi-step protocol that erodes the cell structure of the biological sample; isolates, labels, and fragments the nucleic acids; and purifies the labeled samples from excess dye.

  16. A Method for Microalgae Proteomics Analysis Based on Modified Filter-Aided Sample Preparation. (United States)

    Li, Song; Cao, Xupeng; Wang, Yan; Zhu, Zhen; Zhang, Haowei; Xue, Song; Tian, Jing


    With the fast development of microalgal biofuel research, proteomics studies of microalgae have increased quickly. The filter-aided sample preparation (FASP) method has been a widely used proteomics sample preparation method since 2009. Here, a method of microalgae proteomics analysis based on modified filter-aided sample preparation (mFASP) is described that accommodates the characteristics of microalgal cells and eliminates the error caused by over-alkylation. Using Chlamydomonas reinhardtii as the model, the prepared sample was tested by standard LC-MS/MS and compared with previous reports. The results showed that mFASP is suitable for most microalgae proteomics studies.

  17. Method and device for detecting a similarity in the shape of sampled signals

    NARCIS (Netherlands)

    Coenen, A.J.R.


    Method for detecting a similarity in shape between a first and a second sampled signal fragment. The method comprises the steps of: formation of a first fragment function from the first sampled signal fragment by means of inverse interpolation, definition of a domain interval, for example the time

  18. The mouthwash : A non-invasive sampling method to study cytokine gene polymorphisms

    NARCIS (Netherlands)

    Laine, ML; Farre, MA; Crusius, JBA; van Winkelhoff, AJ; Pena, AS

    Background: We describe a simple, non-invasive mouthwash sampling method for rapid DNA isolation to detect cytokine gene polymorphisms. In the present paper, interleukin-1 beta (IL-1B) and interleukin-1 receptor antagonist (IL-1RN) gene polymorphisms were studied. Methods: Two mouthwash samples and

  19. Sampling Methods and the Accredited Population in Athletic Training Education Research (United States)

    Carr, W. David; Volberding, Jennifer


    Context: We describe methods of sampling the widely-studied, yet poorly defined, population of accredited athletic training education programs (ATEPs). Objective: There are two purposes to this study; first to describe the incidence and types of sampling methods used in athletic training education research, and second to clearly define the…

  20. A simple method for determination of natural and depleted uranium in surface soil samples. (United States)

    Vukanac, I; Novković, D; Kandić, A; Djurasević, M; Milosević, Z


    A simple and efficient method for determining the uranium content of surface soil samples contaminated with depleted uranium by gamma-ray spectrometry is presented. The contents of natural uranium and depleted uranium, as well as the (235)U/(238)U activity ratio of the depleted uranium, were determined in contaminated surface soil samples by application of this method. Copyright 2009 Elsevier Ltd. All rights reserved.

  1. Melting Temperature Mapping Method: A Novel Method for Rapid Identification of Unknown Pathogenic Microorganisms within Three Hours of Sample Collection. (United States)

    Niimi, Hideki; Ueno, Tomohiro; Hayashi, Shirou; Abe, Akihito; Tsurue, Takahiro; Mori, Masashi; Tabata, Homare; Minami, Hiroshi; Goto, Michihiko; Akiyama, Makoto; Yamamoto, Yoshihiro; Saito, Shigeru; Kitajima, Isao


    Acquiring the earliest possible identification of pathogenic microorganisms is critical for selecting the appropriate antimicrobial therapy in infected patients. We herein report the novel "melting temperature (Tm) mapping method" for rapidly identifying the dominant bacteria in a clinical sample from sterile sites. Employing only seven primer sets, more than 100 bacterial species can be identified. In particular, using the Difference Value, it is possible to identify samples suitable for Tm mapping identification. Moreover, this method can be used to rapidly diagnose the absence of bacteria in clinical samples. We tested the Tm mapping method using 200 whole blood samples obtained from patients with suspected sepsis, 85% (171/200) of which matched the culture results based on the detection level. A total of 130 samples were negative according to the Tm mapping method, 98% (128/130) of which were also negative based on the culture method. Meanwhile, 70 samples were positive according to the Tm mapping method, and of the 59 suitable for identification, 100% (59/59) exhibited a "match" or "broad match" with the culture or sequencing results. These findings were obtained within three hours of whole blood collection. The Tm mapping method is therefore useful for identifying infectious diseases requiring prompt treatment.
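The matching step can be illustrated schematically: a sample's seven Tm values are compared against reference profiles, and a "Difference Value" decides whether the match is reliable. The metric below (mean absolute Tm difference) and all profiles are assumptions for illustration, not the authors' published definition:

```python
def difference_value(profile, reference):
    """Mean absolute difference between two 7-point Tm profiles (assumed metric)."""
    return sum(abs(a - b) for a, b in zip(profile, reference)) / len(profile)

def identify(sample_tm, database, threshold=1.0):
    """Return the best-matching species, or None if no profile is close enough.

    `database` maps species names to reference Tm profiles; the threshold
    (in degrees C) is a hypothetical cutoff for calling a match.
    """
    best = min(database, key=lambda name: difference_value(sample_tm, database[name]))
    return best if difference_value(sample_tm, database[best]) <= threshold else None
```

The key design point the abstract describes is that a small fixed primer panel yields a numeric fingerprint per sample, so identification reduces to nearest-profile lookup rather than culture.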

  2. Comparison of granulometric methods and sampling strategies used in marine habitat classification and Ecological Status assessment. (United States)

    Forde, James; Collins, Patrick Colman; Patterson, Adrian; Kennedy, Robert


    Sediment particle size analysis (PSA) is routinely used to support benthic macrofaunal community distribution data in habitat mapping and Ecological Status (ES) assessment. No optimal PSA method for explaining variability in multivariate macrofaunal distribution has been identified, nor have the effects of changing sampling strategy been examined. Here, we use benthic macrofaunal and PSA grabs from two embayments in the south of Ireland. Four frequently used PSA methods and two common sampling strategies are applied. A combination of laser particle sizing and wet/dry sieving without peroxide pre-treatment to remove organics was identified as the optimal method for explaining macrofaunal distributions. ES classifications and EUNIS sediment classifications were robust to changes in PSA method. Returning fauna and PSA samples from the same grab significantly decreased the macrofaunal variance explained by PSA and caused ES to be classified lower. Employing the optimal PSA method and sampling strategy will improve benthic monitoring. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. Quantitative method of determining beryllium or a compound thereof in a sample (United States)

    McCleskey, T. Mark; Ehler, Deborah S.; John, Kevin D.; Burrell, Anthony K.; Collis, Gavin E.; Minogue, Edel M.; Warner, Benjamin P.


    A method of determining beryllium or a compound thereof in a sample includes providing a sample suspected of comprising beryllium or a compound thereof; extracting beryllium or a compound thereof from the sample by dissolving it in a solution; adding a fluorescent indicator to the solution to bind any beryllium or compound thereof to the fluorescent indicator; and determining the presence or amount of beryllium or a compound thereof in the sample by measuring fluorescence.

  4. Estimation of the sugar cane cultivated area from LANDSAT images using the two phase sampling method (United States)

    Parada, N. D. J. (Principal Investigator); Cappelletti, C. A.; Mendonca, F. J.; Lee, D. C. L.; Shimabukuro, Y. E.


    A two phase sampling method and the optimal sampling segment dimensions for the estimation of sugar cane cultivated area were developed. This technique employs visual interpretations of LANDSAT images and panchromatic aerial photographs considered as the ground truth. The estimates, as a mean value of 100 simulated samples, represent 99.3% of the true value with a CV of approximately 1%; the relative efficiency of the two phase design was 157% when compared with a one phase aerial photographs sample.
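One standard form of the two-phase (double sampling) estimator this design uses is the ratio estimator: the mean of the cheap first-phase variable (LANDSAT-interpreted area) is scaled by the ratio of ground truth (aerial photos) to the cheap variable on the subsample. The abstract does not give the exact estimator, and the numbers below are hypothetical:

```python
def two_phase_ratio_estimate(x_phase1, x_sub, y_sub):
    """Two-phase ratio estimate of the mean of y.

    x_phase1: cheap measurements on the large first-phase sample
              (e.g. LANDSAT-interpreted segment areas)
    x_sub, y_sub: paired cheap and expensive (ground-truth) measurements
              on the second-phase subsample
    """
    xbar1 = sum(x_phase1) / len(x_phase1)   # first-phase mean of cheap variable
    ratio = sum(y_sub) / sum(x_sub)          # calibration ratio from the subsample
    return ratio * xbar1
```

The efficiency gain the abstract reports (157% relative to a single-phase photo sample) comes from the fact that the expensive ground truth is only needed on the small subsample, while precision is borrowed from the large cheap sample.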

  5. [Preparation of sub-standard samples and XRF analytical method of powder non-metallic minerals]. (United States)

    Kong, Qin; Chen, Lei; Wang, Ling


    To address the problem that standard samples of non-metallic minerals are unsatisfactory in practice for X-ray fluorescence (XRF) analysis with pressed powder pellets, we studied how to prepare sub-standard samples from standard samples of non-metallic minerals and how to adapt them to the analysis of mineral powder samples, taking the K-feldspar ore in Ebian-Wudu, Sichuan as an example. Based on characterization of the K-feldspar ore and the standard samples by X-ray diffraction (XRD) and chemical methods, and on the principle that sub-standard samples and unknown samples should be the same or similar, a preparation method for sub-standard samples was developed: the two kinds of samples should contain the same minerals and similar chemical components, suit the mineral processing applied, and facilitate construction of the working curve. Under the optimum experimental conditions, a method for the determination of SiO2, Al2O3, Fe2O3, TiO2, CaO, MgO, K2O and Na2O in K-feldspar ore by XRF was established. The determination results are in good agreement with classical chemical methods, which indicates that the method is accurate.

  6. Purposeful Sampling for Qualitative Data Collection and Analysis in Mixed Method Implementation Research. (United States)

    Palinkas, Lawrence A; Horwitz, Sarah M; Green, Carla A; Wisdom, Jennifer P; Duan, Naihua; Hoagwood, Kimberly


    Purposeful sampling is widely used in qualitative research for the identification and selection of information-rich cases related to the phenomenon of interest. Although there are several different purposeful sampling strategies, criterion sampling appears to be used most commonly in implementation research. However, combining sampling strategies may be more appropriate to the aims of implementation research and more consistent with recent developments in quantitative methods. This paper reviews the principles and practice of purposeful sampling in implementation research, summarizes types and categories of purposeful sampling strategies and provides a set of recommendations for use of single strategy or multistage strategy designs, particularly for state implementation research.

  7. Cavitation Erosion Tests Performed by Indirect Vibratory Method on Stainless Steel Welded Samples with Hardened Surface

    Directory of Open Access Journals (Sweden)

    Marian-Dumitru Nedeloni


    The paper presents the results of cavitation erosion tests performed on two types of samples made of materials frequently used for manufacturing and repairing hydro-turbine components subjected to cavitation. The first sample was made by welding an austenitic stainless steel onto an austenitic-ferritic base material; the second was made similarly, but with a martensitic base material. After welding, a hardening treatment by surface peening was applied to both samples. The cavitation erosion tests were performed on vibratory equipment using the indirect method with a stationary specimen. The results show good cavitation erosion resistance for both samples.

  8. Optical method and system for the characterization of laterally-patterned samples in integrated circuits (United States)

    Maris, Humphrey J.


    Disclosed is a method for characterizing a sample having a structure disposed on or within the sample, comprising the steps of applying a first pulse of light to a surface of the sample for creating a propagating strain pulse in the sample, applying a second pulse of light to the surface so that the second pulse of light interacts with the propagating strain pulse in the sample, sensing from a reflection of the second pulse a change in optical response of the sample, and relating a time of occurrence of the change in optical response to at least one dimension of the structure.

  9. Molecular cancer classification using a meta-sample-based regularized robust coding method. (United States)

    Wang, Shu-Lin; Sun, Liuchao; Fang, Jianwen


    Previous studies have demonstrated that machine-learning-based molecular cancer classification using gene expression profiling (GEP) data is promising for the clinical diagnosis and treatment of cancer. Novel classification methods with high efficiency and prediction accuracy are still needed to deal with the high dimensionality and small sample size of typical GEP data. Recently, the sparse representation (SR) method has been successfully applied to cancer classification; nevertheless, its efficiency needs to be improved when analyzing large-scale GEP data. In this paper we present meta-sample-based regularized robust coding classification (MRRCC), a novel and effective cancer classification technique that combines the idea of the meta-sample-based cluster method with the regularized robust coding (RRC) method. It assumes that the coding residual and the coding coefficient are each independent and identically distributed. Similar to meta-sample-based SR classification (MSRC), MRRCC extracts a set of meta-samples from the training samples and then encodes a testing sample as a sparse linear combination of these meta-samples. The representation fidelity is measured by the l2-norm or l1-norm of the coding residual. Extensive experiments on publicly available GEP datasets demonstrate that the proposed method is more efficient, while its prediction accuracy is equivalent to existing MSRC-based methods and better than other state-of-the-art dimension-reduction-based methods.

  10. Sample-Size Planning for More Accurate Statistical Power: A Method Adjusting Sample Effect Sizes for Publication Bias and Uncertainty. (United States)

    Anderson, Samantha F; Kelley, Ken; Maxwell, Scott E


    The sample size necessary to obtain a desired level of statistical power depends in part on the population value of the effect size, which is, by definition, unknown. A common approach to sample-size planning uses the sample effect size from a prior study as an estimate of the population value of the effect to be detected in the future study. Although this strategy is intuitively appealing, effect-size estimates, taken at face value, are typically not accurate estimates of the population effect size because of publication bias and uncertainty. We show that the use of this approach often results in underpowered studies, sometimes to an alarming degree. We present an alternative approach that adjusts sample effect sizes for bias and uncertainty, and we demonstrate its effectiveness for several experimental designs. Furthermore, we discuss an open-source R package, BUCSS, and user-friendly Web applications that we have made available to researchers so that they can easily implement our suggested methods.
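BUCSS itself is an R package implementing the bias-and-uncertainty adjustment; as a baseline illustration of why inflated effect sizes produce underpowered studies, here is a minimal normal-approximation sample-size calculation for a two-sided two-sample comparison of means (not the authors' adjustment method):

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Per-group sample size for a two-sided two-sample z-test of means,
    given a standardized effect size d (normal approximation)."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)   # critical value for the two-sided test
    z_beta = z(power)            # quantile corresponding to desired power
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)
```

With the usual alpha = 0.05 and power = 0.80, a true effect of d = 0.5 requires about 63 subjects per group, but planning from a publication-biased estimate of d = 0.8 suggests only about 25, illustrating the underpowering the paper describes.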

  11. Comparability among four invertebrate sampling methods, Fountain Creek Basin, Colorado, 2010-2012 (United States)

    Zuellig, Robert E.; Bruce, James F.; Stogner, Sr., Robert W.; Brown, Krystal D.


    The U.S. Geological Survey, in cooperation with Colorado Springs City Engineering and Colorado Springs Utilities, designed a study to determine whether sampling method and sample timing produced comparable samples and assessments of biological condition. To accomplish this task, annual invertebrate samples were collected concurrently using four sampling methods at 15 U.S. Geological Survey streamflow gages in the Fountain Creek basin from 2010 to 2012. Collectively, the four methods are used by local (U.S. Geological Survey cooperative monitoring program) and State (Colorado Department of Public Health and Environment) monitoring programs in the Fountain Creek basin to produce two distinct sample types for each program, targeting single and multiple habitats. This study found distinguishable differences between single- and multi-habitat sample types using both community similarities and multi-metric index values, while the methods from each program were comparable within sample type. This indicates that the Colorado Department of Public Health and Environment methods were compatible with the cooperative monitoring program methods within multi- and single-habitat sample types. Comparisons between September and October samples found distinguishable differences based on community similarities for both sample types, whereas differences were found only for single-habitat samples when multi-metric index values were considered. At one site, differences between September and October index values from single-habitat samples resulted in opposing assessments of biological condition. Direct application of the results to inform revision of the existing Fountain Creek basin U.S. Geological Survey cooperative monitoring program is discussed.
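The abstract does not name its community-similarity metric; Bray-Curtis dissimilarity is a common choice for invertebrate abundance data and serves here as an illustrative sketch of how two samples' communities are compared:

```python
def bray_curtis(a, b):
    """Bray-Curtis dissimilarity between two abundance vectors.

    Returns 0.0 for identical communities and 1.0 for communities
    sharing no individuals; vectors are taxon-aligned counts.
    """
    numerator = sum(abs(x - y) for x, y in zip(a, b))
    denominator = sum(x + y for x, y in zip(a, b))
    return numerator / denominator if denominator else 0.0
```

In a method-comparison study like this one, such pairwise dissimilarities between concurrently collected samples feed into ordination or permutation tests to decide whether two sampling methods yield distinguishable communities.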

  12. A new enrichment method for isolation of Bacillus thuringiensis from diverse sample types. (United States)

    Patel, Ketan D; Bhanshali, Forum C; Chaudhary, Avani V; Ingle, Sanjay S


    New or more efficient methodologies based on different principles are needed, as no single method is suitable for isolating organisms from samples of diverse types and from various environments. In the present investigation, a growth kinetics study revealed a higher germination rate, a higher growth rate, and maximum sporulation of Bacillus thuringiensis (Bt) compared to other Bacillus species. Considering these facts, a simple and efficient enrichment method was devised that allows propagation of Bt spores and vegetative cells and thereby proportionately increases the Bt cell population. The new enrichment method yielded Bt from 44 of 58 samples, whereas Bt was isolated from only 16 and 18 samples by the sodium acetate selection and dry-heat pretreatment methods, respectively. Moreover, the percentages of Bt colonies isolated by the enrichment method were comparatively higher. Vegetative whole-cell protein profile analysis indicated that a diverse population of Bt was isolated from the various samples. Bt strains isolated by the enrichment method represented novel serovars and possibly a new cry2 gene.

  13. Kinetic studies in solid state reactions by sample-controlled methods and advanced analysis procedures


    Pérez-Maqueda, Luis A.; Criado, J. M.; Sánchez-Jiménez, P.E.; Perejón, Antonio


    A comparative study of both conventional rising temperature and sample-controlled methods, like constant rate thermal analysis (CRTA), is carried out after analyzing a set of solid state reactions using both methods. It is shown that CRTA avoids the influence of heat and mass transfer phenomena for a wide range of sample sizes leading to reliable kinetic parameters. On the other hand, conventional rising temperature methods yield α–T plots dependent on experimental conditions, even when using...

  14. Large loop conformation sampling using the activation relaxation technique, ART-nouveau method. (United States)

    St-Pierre, Jean-François; Mousseau, Normand


    We present an adaptation of the ART-nouveau energy surface sampling method to the problem of loop structure prediction. This method, previously used to study protein folding pathways and peptide aggregation, is well suited to sampling the conformation space of large loops because it targets probable folding pathways instead of exhaustively sampling that space. The number of sampled conformations ART nouveau needs to find the global energy minimum for a loop was found to scale linearly with the sequence length of the loop for loops between 8 and about 20 amino acids. Given this linear dependence of the per-conformation computational cost on loop sequence length, we estimate the total computational cost of sampling larger loops to scale quadratically, compared with the exponential scaling of exhaustive search methods. Copyright © 2012 Wiley Periodicals, Inc.

  15. Electrodeposition as an alternate method for preparation of environmental samples for iodide by AMS

    Energy Technology Data Exchange (ETDEWEB)

    Adamic, M.L., E-mail: [Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83402 (United States); Lister, T.E.; Dufek, E.J.; Jenson, D.D.; Olson, J.E. [Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83402 (United States); Vockenhuber, C. [Laboratory of Ion Beam Physics, ETH Zurich, Otto-Stern-Weg 5, 8093 Zurich (Switzerland); Watrous, M.G. [Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83402 (United States)


    This paper presents an evaluation of an alternate method for preparing environmental samples for {sup 129}I analysis by accelerator mass spectrometry (AMS) at Idaho National Laboratory. The optimal sample preparation method is characterized by ease of preparation, capability of processing very small quantities of iodide, and ease of loading into a cathode. Electrodeposition of iodide on a silver wire was evaluated using these criteria. This study indicates that the electrochemically-formed silver iodide deposits produce ion currents similar to those from precipitated silver iodide for the same sample mass. Precipitated silver iodide samples are usually mixed with niobium or silver powder prior to loading in a cathode. Using electrodeposition, the silver is already mixed with the sample and can simply be picked up with tweezers, placed in the sample die, and pressed into a cathode. The major advantage of this method is that the silver wire/electrodeposited silver iodide is much easier to load into a cathode.

  16. Method validation and uncertainty evaluation of organically bound tritium analysis in environmental sample. (United States)

    Huang, Yan-Jun; Zeng, Fan; Zhang, Bing; Chen, Chao-Feng; Qin, Hong-Juan; Wu, Lian-Sheng; Guo, Gui-Yin; Yang, Li-Tao; Shang-Guan, Zhi-Hong


    An analytical method for organically bound tritium (OBT) was developed in our laboratory. Optimized operating conditions and parameters were established for sample drying, special combustion, distillation, and measurement on a liquid scintillation spectrometer (LSC). Selected types of OBT samples, such as rice, corn, rapeseed, fresh lettuce and pork, were analyzed to validate the reproducibility of the recovery rate and the minimum detection concentration, and the uncertainty for a typical low-level environmental sample was evaluated. The combustion water recovery rate of the various dried environmental samples was kept at about 80%, and the minimum detection concentration of OBT ranged from 0.61 to 0.89 Bq/kg (dry weight), depending on the hydrogen content. This shows that the method is suitable for OBT analysis of environmental samples, with a stable recovery rate; the combustion water yield from a sample weighing about 40 g provides a sufficient quantity for measurement on the LSC. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Serum chromium levels sampled with steel needle versus plastic IV cannula. Does method matter?

    DEFF Research Database (Denmark)

    Penny, Jeannette Ø; Overgaard, Søren


    PURPOSE: Modern metal-on-metal (MoM) joint articulations release metal ions into the body. Research tries to establish how much this elevates metal ion levels and whether it causes adverse effects. The steel needle that samples the blood may introduce additional chromium to the sample, thereby causing bias. This study aimed to test that theory. METHODS: We compared serum chromium values for two sampling methods, steel needle and IV plastic cannula, as well as sampling sequence, in 16 healthy volunteers. RESULTS: We found statistically significant chromium contamination from the steel needle … significant. CONCLUSION: The chromium contamination from the steel needle is low, and sampling method matters little in MoM populations. If using steel needles, we suggest discarding the first sample.

  18. Fast egg collection method greatly improves randomness of egg sampling in Drosophila melanogaster

    DEFF Research Database (Denmark)

    Schou, Mads Fristrup


    When obtaining samples for population genetic studies, it is essential that the sampling is random. For Drosophila, one of the crucial steps in sampling experimental flies is the collection of eggs. Here an egg collection method is presented which randomizes the eggs in a water column … and to obtain a representative collection of genotypes, the method presented here is strongly recommended when collecting eggs from Drosophila.

  19. Comparison of chlorzoxazone one-sample methods to estimate CYP2E1 activity in humans

    DEFF Research Database (Denmark)

    Kramer, Iza; Dalhoff, Kim; Clemmesen, Jens O


    OBJECTIVE: Comparison of a one-sample with a multi-sample method (the metabolic fractional clearance) to estimate CYP2E1 activity in humans. METHODS: Healthy, male Caucasians (n=19) were included. The multi-sample fractional clearance (Cl(fe)) of chlorzoxazone was compared with one-time-point clearance estimation (Cl(est)) at 3, 4, 5 and 6 h. Furthermore, the metabolite/drug ratios (MRs) estimated from one-time-point samples at 1, 2, 3, 4, 5 and 6 h were compared with Cl(fe). RESULTS: The concordance between Cl(est) and Cl(fe) was highest at 6 h. The minimal mean prediction error (MPE) of Cl… -dose-sample estimates, Cl(est) at 3 h or 6 h, and MR at 3 h, can serve as reliable markers of CYP2E1 activity. The one-sample clearance method is an accurate, renal function-independent measure of the intrinsic activity; it is simple to use and easily applicable to humans.
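The mean prediction error (MPE) used to compare one-time-point estimates against the multi-sample fractional clearance can be sketched minimally. The percentage definition below is a common one, and the numbers are hypothetical, not the study's data:

```python
def mean_prediction_error(estimates, reference):
    """MPE (%) of one-time-point clearance estimates against the
    multi-sample fractional clearance: mean of (est - ref)/ref * 100.
    Positive values indicate systematic overestimation."""
    errors = [(e - r) / r * 100 for e, r in zip(estimates, reference)]
    return sum(errors) / len(errors)
```

Comparing the MPE across candidate sampling times (3, 4, 5, 6 h) is how one would pick the time point whose single-sample clearance best tracks the full fractional clearance.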

  20. Reliability assurance for regulation of advanced reactors

    Energy Technology Data Exchange (ETDEWEB)

    Fullwood, R.; Lofaro, R.; Samanta, P.


    The advanced nuclear power plants must achieve higher levels of safety than the first generation of plants. Showing that this is indeed true provides new challenges to reliability and risk assessment methods in the analysis of the designs employing passive and semi-passive protection. Reliability assurance of the advanced reactor systems is important for determining the safety of the design and for determining the plant operability. Safety is the primary concern, but operability is considered indicative of good and safe operation. This paper discusses several concerns for reliability assurance of the advanced design encompassing reliability determination, level of detail required in advanced reactor submittals, data for reliability assurance, systems interactions and common cause effects, passive component reliability, PRA-based configuration control system, and inspection, training, maintenance and test requirements. Suggested approaches are provided for addressing each of these topics.

  1. Method matters: Experimental evidence for shorter avian sperm in faecal compared to abdominal massage samples.

    Directory of Open Access Journals (Sweden)

    Antje Girndt

    Birds are model organisms in sperm biology. Previous work in zebra finches suggested that sperm sampled from males' faeces and ejaculates do not differ in size. Here, we tested this assumption in a captive population of house sparrows, Passer domesticus. We compared sperm length in samples obtained by three collection techniques: female dummy, faecal, and abdominal massage sampling. We found that sperm were significantly shorter in faecal than in abdominal massage samples, which was explained by shorter heads and midpieces, but not flagella. This result might indicate that faecally sampled sperm are less mature than sperm collected by abdominal massage. The female dummy method resulted in an insufficient number of experimental ejaculates because most males ignored it. In light of these results, we recommend abdominal massage as the preferred method for avian sperm sampling. Where avian sperm cannot be collected by abdominal massage alone, we advise controlling for the sperm sampling protocol statistically.

  2. Comparison of individual and pooled sampling methods for detecting bacterial pathogens of fish (United States)

    Mumford, Sonia; Patterson, Chris; Evered, J.; Brunson, Ray; Levine, J.; Winton, J.


    Examination of finfish populations for viral and bacterial pathogens is an important component of fish disease control programs worldwide. Two methods are commonly used for collecting tissue samples for bacteriological culture, the currently accepted standards for detection of bacterial fish pathogens. The method specified in the Office International des Epizooties Manual of Diagnostic Tests for Aquatic Animals permits combining renal and splenic tissues from as many as 5 fish into pooled samples. The American Fisheries Society (AFS) Blue Book/US Fish and Wildlife Service (USFWS) Inspection Manual specifies the use of a bacteriological loop for collecting samples from the kidney of individual fish. An alternative would be to more fully utilize the pooled samples taken for virology. If implemented, this approach would provide substantial savings in labor and materials. To compare the relative performance of the AFS/USFWS method and this alternative approach, cultures of Yersinia ruckeri were used to establish low-level infections in groups of rainbow trout (Oncorhynchus mykiss) that were sampled by both methods. Yersinia ruckeri was cultured from 22 of 37 groups by at least 1 method. The loop method yielded 18 positive groups, with 1 group positive in the loop samples but negative in the pooled samples. The pooled samples produced 21 positive groups, with 4 groups positive in the pooled samples but negative in the loop samples. There was statistically significant agreement (Spearman coefficient 0.80, P < 0.001) in the relative ability of the 2 sampling methods to permit detection of low-level bacterial infections of rainbow trout.
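The agreement statistic reported above (a Spearman coefficient of 0.80) can be computed from paired per-group detection results. A minimal pure-Python sketch of Spearman rank correlation (Pearson correlation of average ranks, which handles ties):

```python
def spearman(x, y):
    """Spearman rank correlation of two equal-length sequences,
    computed as the Pearson correlation of their average ranks."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(order):
            j = i
            # extend j over any run of tied values
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1  # average rank for the tied run, 1-based
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

In this study, x and y would be the paired detection outcomes (or colony counts) from the loop and pooled methods across the 37 groups; a coefficient near 1 indicates the two sampling methods rank the groups' infection levels similarly.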

  3. 42 CFR 431.53 - Assurance of transportation. (United States)


    Title 42 Public Health (2010-10-01). § 431.53 Assurance of transportation. A State plan must— (a) Specify that the Medicaid agency will ensure necessary transportation for recipients to and from providers; and (b) Describe the methods that...

  4. Product assurance policies and procedures for flight dynamics software development (United States)

    Perry, Sandra; Jordan, Leon; Decker, William; Page, Gerald; Mcgarry, Frank E.; Valett, Jon


    The product assurance policies and procedures necessary to support flight dynamics software development projects for Goddard Space Flight Center are presented. The quality assurance and configuration management methods and tools for each phase of the software development life cycles are described, from requirements analysis through acceptance testing; maintenance and operation are not addressed.

  5. Applicability of Demirjian's four methods and Willems method for age estimation in a sample of Turkish children. (United States)

    Akkaya, Nursel; Yilanci, Hümeyra Özge; Göksülük, Dinçer


    The aim of this study was to evaluate the applicability of five dental methods, including Demirjian's original, revised, four teeth, and alternate four teeth methods and the Willems method, for age estimation in a sample of Turkish children. Panoramic radiographs of 799 children (412 females, 387 males) aged between 2.20 and 15.99 years were examined by two observers. A repeated measures ANOVA was performed to compare the dental methods across gender and age groups. All five methods overestimated the chronological age on average. Among these, the Willems method was the most accurate, showing overestimations of 0.07 and 0.15 years for males and females, respectively. It was followed by Demirjian's four teeth methods and the revised and original methods. According to the results, the Willems method can be recommended for dental age estimation of Turkish children in forensic applications. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
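The per-method bias reported above (e.g., 0.07 and 0.15 years of overestimation) is simply the mean signed error between estimated dental age and chronological age. A minimal sketch on hypothetical (estimated, chronological) age pairs, not the study's data:

```python
def mean_bias(pairs):
    """Mean signed error: positive values indicate overestimation of age."""
    return sum(est - true for est, true in pairs) / len(pairs)

# Hypothetical (dental_age, chronological_age) pairs for one scoring method.
willems_like = [(7.1, 7.0), (9.3, 9.1), (12.0, 12.1), (5.6, 5.4)]
print(round(mean_bias(willems_like), 2))
```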

  6. Probabilistic finite element stiffness of a laterally loaded monopile based on an improved asymptotic sampling method

    DEFF Research Database (Denmark)

    Vahdatirad, Mohammadjavad; Bayat, Mehdi; Andersen, Lars Vabbersgaard


    The mechanical responses of an offshore monopile foundation mounted in over-consolidated clay are calculated by employing a stochastic approach where a nonlinear p–y curve is incorporated with a finite element scheme. Random field theory is applied to represent the spatial variation of the undrained shear strength of the clay. Normal and Sobol sampling are employed to provide the asymptotic sampling method to generate the probability distribution of the foundation stiffnesses. Monte Carlo simulation is used as a benchmark. Asymptotic sampling accompanied by Sobol quasi-random sampling demonstrates an efficient method for estimating the probability distribution of stiffnesses for the offshore monopile foundation.

  7. Method for Vanadium Speciation in Aqueous Samples by HPLC-ICP ...

    African Journals Online (AJOL)

    Method for Vanadium Speciation in Aqueous Samples by HPLC-ICP-OES. M Hu, PP Coetzee. Abstract. A method for vanadium speciation is proposed. The method uses a low concentration eluent, 10 mmol L–1 EDTA and 14 mmol L–1 sodium carbonate, for the ion chromatographic separation of vanadium species at a ...

  8. Optimal sampling strategies to assess inulin clearance in children by the inulin single-injection method

    NARCIS (Netherlands)

    van Rossum, Lyonne K.; Mathot, Ron A. A.; Cransberg, Karlien; Vulto, Arnold G.


    Glomerular filtration rate in patients can be determined by estimating the plasma clearance of inulin with the single-injection method. In this method, a single bolus injection of inulin is administered and several blood samples are collected. For practical and convenient application of this method

  9. Methods, compounds and systems for detecting a microorganism in a sample

    Energy Technology Data Exchange (ETDEWEB)

    Colston, Jr, Bill W.; Fitch, J. Patrick; Gardner, Shea N.; Williams, Peter L.; Wagner, Mark C.


    Methods to identify a set of probe polynucleotides suitable for detecting a set of targets, and in particular methods for identifying primers suitable for detection of target microorganisms; related polynucleotides, sets of polynucleotides, and compositions; and related methods and systems for detection and/or identification of microorganisms in a sample.

  10. Spin column extraction as a new sample preparation method in bioanalysis. (United States)

    Namera, Akira; Saito, Takashi


    Sample preparation is important in obtaining accurate data for qualification and quantification in bioanalysis. We have recently focused on monolithic silica for high-throughput analysis. The extraction steps, such as sample loading, washing, and elution, are executed by centrifugation using monolithic silica packed in a spin column. There are several further possibilities, such as on-column derivatization for the determination of amines or carboxylic acids in the sample. Spin column extraction reduces the sample preparation time required for the determination of drugs and other chemicals in biological materials and increases productivity in bioanalysis. We expect spin column extraction to become the mainstream method of sample processing in the future.

  11. Comparing two sampling methods to engage hard-to-reach communities in research priority setting

    Directory of Open Access Journals (Sweden)

    Melissa A. Valerio


    Full Text Available Abstract Background Effective community-partnered and patient-centered outcomes research needs to address community priorities. However, optimal sampling methods to engage stakeholders from hard-to-reach, vulnerable communities to generate research priorities have not been identified. Methods In two similar rural, largely Hispanic communities, a community advisory board guided recruitment of stakeholders affected by chronic pain using a different method in each community: (1) snowball sampling, a chain-referral method, or (2) purposive sampling to recruit diverse stakeholders. In both communities, three groups of stakeholders attended a series of three facilitated meetings to orient, brainstorm, and prioritize ideas (9 meetings/community). Using mixed methods analysis, we compared stakeholder recruitment and retention as well as priorities from both communities' stakeholders on mean ratings of their ideas based on importance and feasibility for implementation in their community. Results Of 65 eligible stakeholders in one community recruited by snowball sampling, 55 (85 %) consented, 52 (95 %) attended the first meeting, and 36 (65 %) attended all 3 meetings. In the second community, the purposive sampling method was supplemented by convenience sampling to increase recruitment. Of 69 stakeholders recruited by this combined strategy, 62 (90 %) consented, 36 (58 %) attended the first meeting, and 26 (42 %) attended all 3 meetings. Snowball sampling recruited more Hispanics and disabled persons (all P < 0.05). Despite differing recruitment strategies, stakeholders from the two communities identified largely similar ideas for research, focusing on non-pharmacologic interventions for management of chronic pain. Ratings on importance and feasibility for community implementation differed only on the importance of massage services (P = 0.045), which was higher for the purposive/convenience sampling group, and for city improvements

  12. Novel joint selection methods can reduce sample size for rheumatoid arthritis clinical trials with ultrasound endpoints. (United States)

    Allen, John C; Thumboo, Julian; Lye, Weng Kit; Conaghan, Philip G; Chew, Li-Ching; Tan, York Kiat


    To determine whether novel methods of selecting joints through (i) ultrasonography (individualized-ultrasound [IUS] method), or (ii) ultrasonography and clinical examination (individualized-composite-ultrasound [ICUS] method) translate into smaller rheumatoid arthritis (RA) clinical trial sample sizes when compared to existing methods utilizing predetermined joint sites for ultrasonography. Cohen's effect size (ES) was estimated and a 95% CI on the estimate (ES_L, ES_U) calculated from the mean change in 3-month total inflammatory score for each method. Corresponding 95% CIs [n_L(ES_U), n_U(ES_L)] were obtained on a post hoc sample size reflecting the uncertainty in the ES estimate. Sample size calculations were based on a one-sample t-test as the patient numbers needed to provide 80% power at α = 0.05 to reject the null hypothesis H0: ES = 0 against alternative hypotheses with ES set to the estimate and to its lower and upper confidence bounds. We aimed to provide point and interval estimates of projected sample sizes for future studies reflecting the uncertainty in our study's ES estimates. Twenty-four treated RA patients were followed up for 3 months. Utilizing the 12-joint approach and existing methods, the post hoc sample size (95% CI) was 22 (10-245). Corresponding sample sizes using ICUS and IUS were 11 (7-40) and 11 (6-38), respectively. Utilizing a seven-joint approach, the corresponding sample sizes using ICUS and IUS methods were nine (6-24) and 11 (6-35), respectively. Our pilot study suggests that sample size for RA clinical trials with ultrasound endpoints may be reduced using the novel methods, providing justification for larger studies to confirm these observations. © 2017 Asia Pacific League of Associations for Rheumatology and John Wiley & Sons Australia, Ltd.
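The sample-size logic above can be sketched with a normal-approximation power calculation for a two-sided one-sample t-test. The correction term and the effect size value below are illustrative assumptions, not the study's exact computation (an effect size near 0.63 happens to reproduce the 12-joint post hoc n of 22):

```python
from math import ceil
from statistics import NormalDist

def n_one_sample_t(es, alpha=0.05, power=0.80):
    """Approximate n for a two-sided one-sample t-test detecting effect size es.
    Uses the normal approximation plus the common z_alpha^2/2 small-sample
    correction for the t-test (an assumed, textbook-style approximation)."""
    z = NormalDist().inv_cdf
    za, zb = z(1 - alpha / 2), z(power)
    return ceil(((za + zb) / es) ** 2 + za ** 2 / 2)

# Plugging in the point estimate and CI bounds of the effect size yields a
# point estimate and interval for the projected sample size.
print(n_one_sample_t(0.63))  # → 22
```

Repeating the call with the lower and upper confidence bounds of the effect size gives the interval on n, mirroring the abstract's [n_L, n_U] construction.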

  13. Assessing impacts of DNA extraction methods on next generation sequencing of water and wastewater samples. (United States)

    Walden, Connie; Carbonero, Franck; Zhang, Wen


    Next Generation Sequencing (NGS) is increasingly affordable and easier to perform. However, standard protocols prior to the sequencing step are only available for few selected sample types. Here we investigated the impact of DNA extraction methods on the consistency of NGS results. Four commercial DNA extraction kits (QIAamp DNA Mini Kit, QIAamp DNA Stool Mini Kit, MO BIO Power Water Kit, and MO BIO Power Soil DNA Isolation Kit) were used on sample sources including lake water and wastewater, and sample types including planktonic and biofilm bacteria communities. Sampling locations included a lake water reservoir, a trickling filter, and a moving bed biofilm reactor (MBBR). Unique genera such as Gemmatimonadetes, Elusimicrobia, and Latescibacteria were found in multiple samples. The Stool Mini Kit was least efficient in terms of diversity in sampling results with freshwater lake samples, and surprisingly the Power Water Kit was the least efficient across all sample types examined. Detailed NGS beta diversity comparisons indicated that the Mini Kit and PowerSoil Kit are best suited for studies that extract DNA from a variety of water and wastewater samples. We ultimately recommend application of Mini Kit or PowerSoil Kit as an improvement to NGS protocols for these sampling environments. These results are a step toward achieving accurate comparability of complex samples from water and wastewater environments by applying a single DNA extraction method, further streamlining future investigations. Copyright © 2017 Elsevier B.V. All rights reserved.
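The "beta diversity comparisons" mentioned above are typically based on a dissimilarity measure between community abundance profiles. A minimal sketch using Bray-Curtis dissimilarity on hypothetical genus counts; the kit labels and counts are invented, and the paper's exact metric is not specified here:

```python
def bray_curtis(a, b):
    """Bray-Curtis dissimilarity between two abundance profiles
    (same taxa, same order): 0 = identical, 1 = no shared abundance."""
    num = sum(abs(x - y) for x, y in zip(a, b))
    den = sum(a) + sum(b)
    return num / den

# Hypothetical genus counts from two DNA extraction kits on the same sample.
kit_a = [120, 30, 5, 0, 45]
kit_b = [100, 40, 0, 10, 50]
print(round(bray_curtis(kit_a, kit_b), 3))  # → 0.125
```

Low pairwise dissimilarity between kits on the same source would indicate the extraction method is not distorting the apparent community structure.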


    DEFF Research Database (Denmark)


    The invention relates to a method for preparing a substrate (105a) comprising a sample reception area (110) and a sensing area (111). The method comprises the steps of: 1) applying a sample on the sample reception area; 2) rotating the substrate around a predetermined axis; 3) during rotation, at least part of the liquid travels from the sample reception area to the sensing area due to capillary forces acting between the liquid and the substrate; and 4) removing the wave of particles and liquid formed at one end of the substrate. The sensing area is closer to the predetermined axis than the sample reception area. The sample comprises a liquid part and particles suspended therein.

  15. Quality-assurance plan and field methods for quality-of-water activities, U.S. Geological Survey, Idaho National Engineering Laboratory, Idaho

    Energy Technology Data Exchange (ETDEWEB)

    Mann, L.J.


    Water-quality activities at the Idaho National Engineering Laboratory (INEL) Project Office are part of the US Geological Survey's (USGS) Water Resources Division (WRD) mission of appraising the quantity and quality of the Nation's water resources. The purpose of the Quality Assurance Plan (QAP) for water-quality activities performed by the INEL Project Office is to maintain and improve the quality of technical products, and to provide a formal standardization, documentation, and review of the activities that lead to these products. The principles of this plan are as follows: (1) water-quality programs will be planned in a competent manner and activities will be monitored for compliance with stated objectives and approaches; (2) field, laboratory, and office activities will be performed in a conscientious and professional manner in accordance with specified WRD practices and procedures by qualified and experienced employees who are well trained and supervised; if or when WRD practices and procedures are inadequate, data will be collected in a manner such that their quality is documented; (3) all water-quality activities will be reviewed for completeness, reliability, credibility, and conformance to specified standards and guidelines; (4) a record of actions will be kept to document the activity and the assigned responsibility; (5) remedial action will be taken to correct activities that are deficient.

  16. Passive sampling methods for contaminated sediments: Scientific rationale supporting use of freely dissolved concentrations

    DEFF Research Database (Denmark)

    Mayer, Philipp; Parkerton, Thomas F.; Adams, Rachel G.


    Passive sampling methods (PSMs) allow the quantification of the freely dissolved concentration (Cfree) of an organic contaminant even in complex matrices such as sediments. Cfree is directly related to a contaminant's chemical activity, which drives spontaneous processes including diffusive uptake ...

  17. Non-invasive sampling methods of inflammatory biomarkers in asthma and allergic rhinitis

    NARCIS (Netherlands)

    Boot, Johan Diderik


    In this thesis, a series of clinical studies have been described, in which we applied, evaluated or modified novel and existing non- or semi-invasive sampling methods and detection techniques for the assessment of biomarkers in allergic airway inflammation.

  18. Reference Air Method 25E: Determination of Vapor Phase Organic Concentration in Waste Samples (United States)

    This method is applicable for determining the vapor pressure of waste. The headspace vapor of the sample is analyzed for carbon content by a headspace analyzer, which uses a flame ionization detector (FID).

  19. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Empirical Testing. Volume 2 (United States)

    Johnson, Kenneth L.; White, K. Preston, Jr.


    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.
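Acceptance sampling by variables of the kind discussed above can be sketched as a one-sided k-method decision rule: accept the lot when the margin between the specification limit and the sample mean exceeds k sample standard deviations. The rule form, the spec limit, and the k value below are illustrative assumptions, not the NESC calculators themselves:

```python
from statistics import mean, stdev

def accept_lot(measurements, usl, k):
    """Variables acceptance sampling, k-method, single upper spec limit (USL):
    accept when (USL - sample mean) >= k * sample standard deviation."""
    xbar, s = mean(measurements), stdev(measurements)
    return (usl - xbar) / s >= k

# Hypothetical measurements judged against an upper spec limit of 10.0, k = 1.5.
lot = [8.1, 8.4, 7.9, 8.6, 8.2, 8.0, 8.3]
print(accept_lot(lot, usl=10.0, k=1.5))  # → True
```

The appeal over attributes sampling, as the abstract notes, is that each measurement carries more information than a pass/fail mark, so far fewer samples are needed for the same assurance.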

  20. Using a bootstrap method to choose the sample fraction in tail index estimation

    NARCIS (Netherlands)

    J. Daníelsson (Jón); L.F.M. de Haan (Laurens); L. Peng (Liang); C.G. de Vries (Casper)


    Tail index estimation depends for its accuracy on a precise choice of the sample fraction, i.e. the number of extreme order statistics on which the estimation is based. A complete solution to the sample fraction selection is given by means of a two-step subsample bootstrap method.
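The "sample fraction" here is the number k of upper order statistics fed to a tail index estimator such as Hill's. A minimal sketch with a fixed k, rather than the paper's bootstrap-selected one, on simulated Pareto-tailed data with true tail index 0.5 (all values below are illustrative):

```python
import math
import random

def hill(sample, k):
    """Hill estimator of the tail index gamma from the k largest order statistics:
    mean of log-spacings above the (k+1)-th largest observation."""
    xs = sorted(sample, reverse=True)
    logs = [math.log(x) for x in xs[: k + 1]]
    return sum(logs[i] - logs[k] for i in range(k)) / k

# U^(-1/2) has P(X > x) = x^(-2), i.e. tail index gamma = 1/2.
random.seed(1)
data = [random.random() ** -0.5 for _ in range(10_000)]
print(round(hill(data, 500), 2))
```

Too small a k inflates variance, too large a k (reaching into the non-tail bulk) inflates bias; the bootstrap procedure in the paper chooses k to balance the two.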

  1. Method and apparatus for measuring the NMR spectrum of an orientationally disordered sample (United States)

    Pines, Alexander; Samoson, Ago


    An improved NMR probe and method are described which substantially improve the resolution of NMR measurements made on powdered, amorphous, or otherwise orientationally disordered samples. The apparatus mechanically varies the orientation of the sample such that the time average of two or more sets of spherical harmonic functions is zero.
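In standard magic-angle spinning theory, which this kind of apparatus builds on, averaging the second-order spherical harmonic to zero amounts to spinning at the angle where the Legendre polynomial P2(cos θ) vanishes; double rotation additionally zeroes P4. A sketch locating the P2 root numerically (textbook physics offered for illustration, not the patent's specific claims):

```python
import math

def p2(x):
    """Second-order Legendre polynomial P2(x) = (3x^2 - 1) / 2."""
    return (3 * x * x - 1) / 2

def bisect(f, lo, hi, tol=1e-12):
    """Simple bisection root finder on [lo, hi] with a sign change."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# Spinning about an axis at the "magic angle" zeroes the time average of P2.
x = bisect(p2, 0.0, 1.0)
print(round(math.degrees(math.acos(x)), 2))  # → 54.74
```

The root is cos θ = 1/√3, i.e. θ ≈ 54.74°; because no single angle zeroes P2 and P4 simultaneously, schemes averaging "two or more sets" of harmonics vary the orientation over time instead.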

  2. Measurement quality assurance for beta particle calibrations at NIST

    Energy Technology Data Exchange (ETDEWEB)

    Soares, C.G.; Pruitt, J.S. [National Institute of Standards and Technology, Gaithersburg, MD (United States)


    Standardized beta-particle fields have been established in an international standard and have been adopted for use in several U.S. dosimeter and instrument testing standards. Calibration methods and measurement quality assurance procedures employed at the National Institute of Standards and Technology (NIST) for beta-particle calibrations in these reference fields are discussed. The calibration facility including the NIST-automated extrapolation ionization chamber is described, and some sample results of calibrations are shown. Methods for establishing and maintaining traceability to NIST of secondary laboratories are discussed. Currently, there are problems in finding a good method for routine testing of traceability to NIST. Some examples of past testing methods are given and solutions to this problem are proposed.

  3. A New Automated Method and Sample Data Flow for Analysis of Volatile Nitrosamines in Human Urine. (United States)

    Hodgson, James A; Seyler, Tiffany H; McGahee, Ernest; Arnstein, Stephen; Wang, Lanqing


    Volatile nitrosamines (VNAs) are a group of compounds classified as probable (group 2A) and possible (group 2B) carcinogens in humans. Along with certain foods and contaminated drinking water, VNAs are detected at high levels in tobacco products and in both mainstream and sidestream smoke. Our laboratory monitors six urinary VNAs, namely N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine (NMEA), N-nitrosodiethylamine (NDEA), N-nitrosopiperidine (NPIP), N-nitrosopyrrolidine (NPYR), and N-nitrosomorpholine (NMOR), using isotope dilution GC-MS/MS (QQQ) for large population studies such as the National Health and Nutrition Examination Survey (NHANES). In this paper, we report for the first time a new automated sample preparation method to more efficiently quantitate these VNAs. Automation is done using Hamilton STAR™ and Caliper Staccato™ workstations. This new automated method reduces sample preparation time from 4 hours to 2.5 hours while maintaining precision (inter-run CV), and it increases sample throughput while maintaining a low limit of detection. In addition, a sample data flow was created in parallel to the automated method, in which samples can be tracked from receiving to final LIMS output with minimal human intervention, further minimizing human error in the sample preparation process. This new automated method and the sample data flow are currently applied in bio-monitoring of VNAs in the US non-institutionalized population NHANES 2013-2014 cycle.

  4. Sample preparation method for glass welding by ultrashort laser pulses yields higher seam strength. (United States)

    Cvecek, K; Miyamoto, I; Strauss, J; Wolf, M; Frick, T; Schmidt, M


    Glass welding by ultrashort laser pulses allows joining without the need of an absorber or a preheating and postheating process. However, cracks generated during the welding process substantially impair the joining strength of the welding seams. In this paper a sample preparation method is described that prevents the formation of cracks. The measured joining strength of samples prepared by this method is substantially higher than previously reported values.

  5. A comparative study of extraction and purification methods for environmental DNA from soil and sludge samples


    Roh, Changhyun; Villatte, Francois; Kim, Byung-Gee; Schmid, Rolf D.


    An important prerequisite for a successful metagenome library construction is an efficient extraction procedure for DNA out of environmental samples. In this study we compared three indirect and four direct extraction methods, including a commercial kit, in terms of DNA yield, purity and time requirement. A special focus was set on methods which are appropriate for the extraction of environmental DNA (eDNA) from very limited sample sizes (0.1 g) to enable a highly parallel approach. Direct ex...

  6. Field Methods and Sample Collection Techniques for the Surveillance of West Nile Virus in Avian Hosts. (United States)

    Wheeler, Sarah S; Boyce, Walter M; Reisen, William K


    Avian hosts play an important role in the spread, maintenance, and amplification of West Nile virus (WNV). Avian susceptibility to WNV varies from species to species thus surveillance efforts can focus both on birds that survive infection and those that succumb. Here we describe methods for the collection and sampling of live birds for WNV antibodies or viremia, and methods for the sampling of dead birds. Target species and study design considerations are discussed.

  7. Relative efficiency of anuran sampling methods in a restinga habitat (Jurubatiba, Rio de Janeiro, Brazil)

    Directory of Open Access Journals (Sweden)

    C. F. D. Rocha

    Full Text Available Studies on anurans in restinga habitats are few and, as a result, there is little information on which methods are more efficient for sampling them in this environment. Ten methods are usually used for sampling anuran communities in tropical and sub-tropical areas. In this study we evaluate which methods are more appropriate for this purpose in the restinga environment of Parque Nacional da Restinga de Jurubatiba. We analyzed six methods among those usually used for anuran sampling. For each method, we recorded the total amount of time spent (in min.), the number of researchers involved, and the number of species captured. We calculated a capture efficiency index (time necessary for a researcher to capture an individual frog) in order to make the data obtained comparable. Of the methods analyzed, the species inventory (9.7 min/searcher/ind. [MSI]; richness = 6; abundance = 23) and the breeding site survey (9.5 MSI; richness = 4; abundance = 22) were the most efficient. The visual encounter inventory (45.0 MSI) and patch sampling (65.0 MSI) methods were of comparatively lower efficiency in the restinga, whereas the plot sampling and the pitfall traps with drift-fence methods resulted in no frog captures. We conclude that there is a considerable difference in the efficiency of the methods used in the restinga environment and that the complete species inventory method is highly efficient for sampling frogs in the restinga studied and may be so in other restinga environments. Methods that are usually efficient in forested areas seem to be of little value in open restinga habitats.
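The capture efficiency index can be sketched directly from its units, min/searcher/ind.: person-minutes of searching per captured frog, with lower values meaning higher efficiency. The exact formula and the survey numbers below are assumptions consistent with those units, not taken from the paper:

```python
def capture_efficiency(total_minutes, searchers, individuals):
    """Assumed index form: elapsed survey time x number of searchers,
    divided by the number of individuals captured (min/searcher/ind.)."""
    return total_minutes * searchers / individuals

# Hypothetical survey: 2 searchers working 112 minutes and capturing 23 frogs
# would score close to the species-inventory figure quoted above.
print(round(capture_efficiency(112, 2, 23), 1))  # → 9.7
```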

  8. A novel sample preparation method using rapid nonheated saponification method for the determination of cholesterol in emulsified foods. (United States)

    Jeong, In-Seek; Kwak, Byung-Man; Ahn, Jang-Hyuk; Leem, Donggil; Yoon, Taehyung; Yoon, Changyong; Jeong, Jayoung; Park, Jung-Min; Kim, Jin-Man


    In this study, nonheated saponification was employed as a novel, rapid, and easy sample preparation method for the determination of cholesterol in emulsified foods. Cholesterol content was analyzed using gas chromatography with a flame ionization detector (GC-FID). The cholesterol extraction method was optimized for maximum recovery from baby food and infant formula. Under these conditions, the optimum extraction solvent was 10 mL ethyl ether per 1 to 2 g sample, and the saponification solution was 0.2 mL KOH in methanol. The cholesterol content in the products was determined to be within the certified range of certified reference materials (CRMs), NIST SRM 1544 and SRM 1849. The results of the recovery test performed using spiked materials were in the range of 98.24% to 99.45%, with a relative standard deviation (RSD) between 0.83% and 1.61%. This method could be used to reduce sample pretreatment time and is expected to provide an accurate determination of cholesterol in emulsified food matrices such as infant formula and baby food. A novel, rapid, and easy sample preparation method using nonheated saponification was developed for cholesterol detection in emulsified foods. Recovery tests of CRMs were satisfactory, and the recoveries of spiked materials were accurate and precise. This method was effective and decreased the analysis time 5-fold compared to the official method. © 2012 Institute of Food Technologists®
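The recovery and RSD figures quoted above come from standard formulas: percent recovery of the spiked amount, and the standard deviation of the replicate recoveries relative to their mean. A minimal sketch on hypothetical replicate spike results; the values below are invented, not the study's:

```python
from statistics import mean, stdev

def recovery_and_rsd(measured, spiked_amount):
    """Percent recovery of a spiked analyte across replicates,
    and the relative standard deviation (RSD, %) of those recoveries."""
    rec = [100 * m / spiked_amount for m in measured]
    return mean(rec), 100 * stdev(rec) / mean(rec)

# Hypothetical replicate results (mg) for a 10.0 mg cholesterol spike.
r, rsd = recovery_and_rsd([9.85, 9.92, 9.78, 9.88, 9.81], 10.0)
print(round(r, 2), round(rsd, 2))
```

Recoveries near 100% with an RSD around 1% are the kind of accuracy and precision the abstract reports.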

  9. Hanford analytical services quality assurance requirements documents

    Energy Technology Data Exchange (ETDEWEB)

    Hyatt, J.E.


    Hanford Analytical Services Quality Assurance Requirements Document (HASQARD) is issued by the Analytical Services, Program of the Waste Management Division, US Department of Energy (US DOE), Richland Operations Office (DOE-RL). The HASQARD establishes quality requirements in response to DOE Order 5700.6C (DOE 1991b). The HASQARD is designed to meet the needs of DOE-RL for maintaining a consistent level of quality for sampling and field and laboratory analytical services provided by contractor and commercial field and laboratory analytical operations. The HASQARD serves as the quality basis for all sampling and field/laboratory analytical services provided to DOE-RL through the Analytical Services Program of the Waste Management Division in support of Hanford Site environmental cleanup efforts. This includes work performed by contractor and commercial laboratories and covers radiological and nonradiological analyses. The HASQARD applies to field sampling, field analysis, and research and development activities that support work conducted under the Hanford Federal Facility Agreement and Consent Order Tri-Party Agreement and regulatory permit applications and applicable permit requirements described in subsections of this volume. The HASQARD applies to work done to support process chemistry analysis (e.g., ongoing site waste treatment and characterization operations) and research and development projects related to Hanford Site environmental cleanup activities. This ensures a uniform quality umbrella to analytical site activities predicated on the concepts contained in the HASQARD. Using HASQARD will ensure data of known quality and technical defensibility of the methods used to obtain that data. The HASQARD is made up of four volumes: Volume 1, Administrative Requirements; Volume 2, Sampling Technical Requirements; Volume 3, Field Analytical Technical Requirements; and Volume 4, Laboratory Technical Requirements. Volume 1 describes the administrative requirements

  10. Apparatus and method for maintaining multi-component sample gas constituents in vapor phase during sample extraction and cooling (United States)

    Felix, Larry Gordon; Farthing, William Earl; Irvin, James Hodges; Snyder, Todd Robert


    A dilution apparatus for diluting a gas sample. The apparatus includes a sample gas conduit having a sample gas inlet end and a diluted sample gas outlet end, and a sample gas flow restricting orifice disposed proximate the sample gas inlet end connected with the sample gas conduit and providing fluid communication between the exterior and the interior of the sample gas conduit. A diluted sample gas conduit is provided within the sample gas conduit having a mixing end with a mixing space inlet opening disposed proximate the sample gas inlet end, thereby forming an annular space between the sample gas conduit and the diluted sample gas conduit. The mixing end of the diluted sample gas conduit is disposed at a distance from the sample gas flow restricting orifice. A dilution gas source connected with the sample gas inlet end of the sample gas conduit is provided for introducing a dilution gas into the annular space, and a filter is provided for filtering the sample gas. The apparatus is particularly suited for diluting heated sample gases containing one or more condensable components.

  11. A proteomics sample preparation method for mature, recalcitrant leaves of perennial plants. (United States)

    Gang, Deng; Xinyue, Zhong; Na, Zhang; Chengying, Lao; Bo, Wang; Dingxiang, Peng; Lijun, Liu


    Sample preparation is key to the success of proteomics studies. In the present study, two sample preparation methods were tested for their suitability on the mature, recalcitrant leaves of six representative perennial plants (grape, plum, pear, peach, orange, and ramie). An improved sample preparation method was obtained: Tris and Triton X-100 were added together instead of CHAPS to the lysis buffer, and a 20% TCA-water solution and 100% precooled acetone were added after the protein extraction for the further purification of protein. This method effectively eliminates nonprotein impurities and obtains a clear two-dimensional gel electrophoresis array. The method facilitates the separation of high-molecular-weight proteins and increases the resolution of low-abundance proteins. This method provides a widely applicable and economically feasible technology for the proteomic study of the mature, recalcitrant leaves of perennial plants.

  12. A proteomics sample preparation method for mature, recalcitrant leaves of perennial plants.

    Directory of Open Access Journals (Sweden)

    Deng Gang

    Full Text Available Sample preparation is key to the success of proteomics studies. In the present study, two sample preparation methods were tested for their suitability on the mature, recalcitrant leaves of six representative perennial plants (grape, plum, pear, peach, orange, and ramie). An improved sample preparation method was obtained: Tris and Triton X-100 were added together instead of CHAPS to the lysis buffer, and a 20% TCA-water solution and 100% precooled acetone were added after the protein extraction for the further purification of protein. This method effectively eliminates nonprotein impurities and obtains a clear two-dimensional gel electrophoresis array. The method facilitates the separation of high-molecular-weight proteins and increases the resolution of low-abundance proteins. This method provides a widely applicable and economically feasible technology for the proteomic study of the mature, recalcitrant leaves of perennial plants.

  13. The case of sustainability assurance: constructing a new assurance service

    NARCIS (Netherlands)

    O'Dwyer, B.


    This paper presents an in-depth longitudinal case study examining the processes through which practitioners in two Big 4 professional services firms have attempted to construct sustainability assurance (independent assurance on sustainability reports). Power’s (1996, 1997, 1999, 2003) theorization

  14. Efficient free energy calculations by combining two complementary tempering sampling methods (United States)

    Xie, Liangxu; Shen, Lin; Chen, Zhe-Ning; Yang, Mingjun


    Although energy barriers can be efficiently crossed in reaction coordinate (RC) guided sampling, this type of method suffers from identification of the correct RCs or requirements of high dimensionality of the defined RCs for a given system. If only approximate RCs with significant barriers are used in the simulations, hidden energy barriers of small to medium height would exist in other degrees of freedom (DOFs) relevant to the target process and consequently cause the problem of insufficient sampling. To address sampling in this so-called hidden barrier situation, here we propose an effective approach to combine temperature accelerated molecular dynamics (TAMD), an efficient RC-guided sampling method, with integrated tempering sampling (ITS), a generalized ensemble sampling method. In this combined ITS-TAMD method, the sampling along the major RCs with high energy barriers is guided by TAMD and the sampling of the rest of the DOFs, with lower but not negligible barriers, is enhanced by ITS. The performance of ITS-TAMD on three systems undergoing processes with hidden barriers has been examined. In comparison to the standalone TAMD or ITS approach, the present hybrid method shows three main improvements. (1) Sampling efficiency can be improved at least five-fold even in the presence of hidden energy barriers. (2) The canonical distribution can be more accurately recovered, from which the thermodynamic properties along other collective variables can be computed correctly. (3) The robustness of the selection of major RCs suggests that the dimensionality of necessary RCs can be reduced. Our work shows more potential applications of the ITS-TAMD method as an efficient and powerful tool for the investigation of a broad range of interesting cases.

  15. Experimental application of contour method for determination of residual stress in subsurface layers of milled sample

    Directory of Open Access Journals (Sweden)

    Karel Horák


    Full Text Available Determination of residual stress close to the sample surface is in the most cases performed by hole-drilling method, X-Ray diffraction or neutron diffraction. Each of these methods has its benefits and disadvantages. In case of diffraction methods the measurement speed is the main disadvantage. It is also very problematic to apply diffraction method in case of sample with mechanically deformed surface, for example by standard machining operations. Therefore, determined results are very often confusing and hard to interpret. On the other side, hole drilling method is less sensitive to quality of sample surface than diffraction methods, but measurement realization is quite expensive and equipment demanding (strain gage rosettes, miniature milling cutter, high speed milling machine, pc equipment,….Recently introduce contour method used for determination of residual stress inside the sample is very fast, can be performed with almost common laboratory equipment and combines traditional stance with modern numerical methods by FEM. Contour method was selected for determination of residual stress below the milled surface and the dependency of milling process quality on residual stress value is demonstrated.

  16. MDMS: Molecular dynamics meta-simulator for evaluating exchange type sampling methods (United States)

    Smith, Daniel B.; Okur, Asim; Brooks, Bernard R.


    Replica exchange methods have become popular tools to explore conformational space for small proteins. For larger biological systems, even with enhanced sampling methods, exploring the free energy landscape remains computationally challenging. This problem has led to the development of many improved replica exchange methods. Unfortunately, testing these methods remains expensive. We propose a molecular dynamics meta-simulator (MDMS) based on transition state theory to simulate a replica exchange simulation, eliminating the need to run explicit dynamics between exchange attempts. MDMS simulations allow for rapid testing of new replica exchange based methods, greatly reducing the amount of time needed for new method development.
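    The core of such a meta-simulator can be caricatured in a few lines: skip the dynamics entirely and draw each replica's potential energy from a temperature-dependent distribution, then apply the standard Metropolis exchange criterion. The Gaussian energy model below is an illustrative assumption, not the transition-state-theory model used by MDMS:

```python
import numpy as np

rng = np.random.default_rng(2024)

# Hedged sketch of a replica-exchange "meta-simulation": no dynamics are
# run between exchange attempts; each replica's potential energy is drawn
# from a temperature-dependent Gaussian (mean and width growing with T is
# an illustrative assumption), and only the exchange step is simulated.
temps = np.array([300.0, 330.0, 363.0, 400.0])
betas = 1.0 / temps                    # units chosen so that k_B = 1

def draw_energy(T):
    return rng.normal(loc=10.0 * T, scale=2.0 * np.sqrt(T))

attempts = accepted = 0
for _ in range(20000):
    E = np.array([draw_energy(T) for T in temps])
    i = rng.integers(0, len(temps) - 1)        # pick neighbouring pair (i, i+1)
    delta = (betas[i] - betas[i + 1]) * (E[i] - E[i + 1])
    attempts += 1
    if delta >= 0 or rng.random() < np.exp(delta):   # Metropolis criterion
        accepted += 1

rate = accepted / attempts
print(f"neighbour-swap acceptance rate: {rate:.2f}")
```

    Because drawing an energy is vastly cheaper than propagating dynamics, temperature ladders and exchange schemes can be screened quickly before any real MD is run.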

  17. New approaches to wipe sampling methods for antineoplastic and other hazardous drugs in healthcare settings (United States)

    Connor, Thomas H.; Smith, Jerome P.


    Purpose At the present time, the method of choice to determine surface contamination of the workplace with antineoplastic and other hazardous drugs is surface wipe sampling and subsequent sample analysis with a variety of analytical techniques. The purpose of this article is to review current methodology for determining the level of surface contamination with hazardous drugs in healthcare settings and to discuss recent advances in this area. In addition it will provide some guidance for conducting surface wipe sampling and sample analysis for these drugs in healthcare settings. Methods Published studies on the use of wipe sampling to measure hazardous drugs on surfaces in healthcare settings drugs were reviewed. These studies include the use of well-documented chromatographic techniques for sample analysis in addition to newly evolving technology that provides rapid analysis of specific antineoplastic Results Methodology for the analysis of surface wipe samples for hazardous drugs are reviewed, including the purposes, technical factors, sampling strategy, materials required, and limitations. The use of lateral flow immunoassay (LFIA) and fluorescence covalent microbead immunosorbent assay (FCMIA) for surface wipe sample evaluation is also discussed. Conclusions Current recommendations are that all healthcare settings where antineoplastic and other hazardous drugs are handled include surface wipe sampling as part of a comprehensive hazardous drug-safe handling program. Surface wipe sampling may be used as a method to characterize potential occupational dermal exposure risk and to evaluate the effectiveness of implemented controls and the overall safety program. New technology, although currently limited in scope, may make wipe sampling for hazardous drugs more routine, less costly, and provide a shorter response time than classical analytical techniques now in use. PMID:28459100

  18. A single sample method for estimating glomerular filtration rate in cats. (United States)

    Finch, N C; Heiene, R; Elliott, J; Syme, H M; Peters, A M


    Validated methods of estimating glomerular filtration rate (GFR) in cats requiring only a limited number of samples are desirable. To test a single sample method of determining GFR in cats. The validation population (group 1) consisted of 89 client-owned cats (73 nonazotemic and 16 azotemic). A separate population of 18 healthy nonazotemic cats (group 2) was used to test the methods. Glomerular filtration rate was determined in group 1 using corrected slope-intercept iohexol clearance. Single sample clearance was determined using the Jacobsson and modified Jacobsson methods and validated against slope-intercept clearance. Extracellular fluid volume (ECFV) was determined from slope-intercept clearance with correction for the one-compartment assumption and by deriving a prediction formula for ECFV (ECFVpredicted) based on body weight. The optimal single sample method was tested in group 2. A blood sample at 180 minutes and ECFVpredicted were optimal for single sample clearance. Mean ± SD GFR in group 1 determined using the Jacobsson and modified Jacobsson formulae was 1.78 ± 0.70 and 1.65 ± 0.60 mL/min/kg, respectively. When tested in group 2, the Jacobsson method overestimated multisample clearance. The modified Jacobsson method (mean ± SD 2.22 ± 0.34 mL/min/kg) was in agreement with multisample clearance (mean ± SD 2.19 ± 0.34 mL/min/kg). The modified Jacobsson method provides accurate estimation of iohexol clearance in cats from a single sample collected at 180 minutes postinjection, using a formula based on body weight to predict ECFV. Further validation of the formula in patients with very high or very low GFR is required. Copyright © 2013 by the American College of Veterinary Internal Medicine.
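    For context, a minimal sketch of the multisample slope-intercept clearance used here as the reference standard (the Jacobsson single-sample formulae themselves are not reproduced); the dose, body weight, times, and concentrations are fabricated numbers:

```python
import numpy as np

# Slope-intercept clearance sketch: fit a mono-exponential to late plasma
# iohexol concentrations, extrapolate AUC = C0/k to infinity, and take
# CL = Dose / AUC = Dose * k / C0. All input values are made up.

dose_mg = 500.0
weight_kg = 4.0
times = np.array([120.0, 180.0, 240.0])          # min post-injection
conc = np.array([180.0, 120.0, 80.0])            # mg/L

slope, intercept = np.polyfit(times, np.log(conc), 1)
k = -slope                                       # elimination rate, 1/min
c0 = np.exp(intercept)                           # back-extrapolated C(0), mg/L
cl_ml_min = dose_mg * k / c0 * 1000.0            # L/min -> mL/min

print(f"GFR ~ {cl_ml_min:.2f} mL/min = {cl_ml_min / weight_kg:.2f} mL/min/kg")
```

    As the abstract notes, the raw slope-intercept value is normally corrected for the one-compartment assumption before being reported as GFR.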

  19. Development of active and diffusive sampling methods for determination of 3-methoxybutyl acetate in workplace air. (United States)

    Takeuchi, Akito; Takigawa, Tomoko; Kawasumi, Yaeko; Yasugi, Tomojiro; Endo, Yoko; Wang, Da-Hong; Takaki, Jiro; Sakurai, Haruhiko; Ogino, Keiki


    Monitoring of the workplace concentration of 3-methoxybutyl acetate (MBA), which is used in printer's ink and thinner for screen-printing and as an organic solvent to dissolve various resins, is important for health reasons. An active and a diffusive sampling method, using a gas chromatograph equipped with a flame ionization detector, were developed for the determination of MBA in workplace air. For the active sampling method using an activated charcoal tube, the overall desorption efficiency was 101%, the overall recovery was 104%, and the recovery after 8 days of storage in a refrigerator was more than 90%. For the diffusive sampling method using the 3M 3500 organic vapor monitor, the MBA sampling rate was 19.89 cm³ min⁻¹. The linear range was from 0.01 to 96.00 µg mL⁻¹, with a correlation coefficient of 0.999, and the detection limits of the active and diffusive samplers were 0.04 and 0.07 µg per sample, respectively. The geometric means of stationary and personal sampling in a screen-printing factory were 12.61 and 16.52 ppm, respectively, indicating that both methods can be used to measure MBA in workplace air.
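    Turning a diffusive sampler result into an air concentration is a simple calculation: collected mass divided by (sampling rate × time). The uptake rate below is the reported 19.89 cm³ min⁻¹; the collected mass, shift duration, and molar mass of MBA are assumptions for illustration:

```python
# Back-of-envelope diffusive sampler calculation. Only the uptake rate is
# taken from the abstract; everything else is a made-up example value.

mass_ug = 450.0          # MBA recovered from the sampler (assumed)
rate_cm3_min = 19.89     # diffusive sampling rate for the 3M 3500 (reported)
minutes = 480.0          # 8-h shift (assumed)

volume_m3 = rate_cm3_min * minutes / 1e6      # cm3 -> m3 of air sampled
mg_per_m3 = (mass_ug / 1000.0) / volume_m3

mw = 146.19              # g/mol for 3-methoxybutyl acetate (assumed)
ppm = mg_per_m3 * 24.45 / mw                  # molar volume 24.45 L/mol at 25 C

print(f"{mg_per_m3:.1f} mg/m3  ~  {ppm:.1f} ppm")
```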

  20. "How" and "what" matters: Sampling method affects biodiversity estimates of reef fishes. (United States)

    Bosch, Néstor E; Gonçalves, Jorge M S; Erzini, Karim; Tuya, Fernando


    Understanding changes in biodiversity requires the implementation of monitoring programs encompassing different dimensions of biodiversity through varying sampling techniques. In this work, fish assemblages associated with the "outer" and "inner" sides of four marinas, two at the Canary Islands and two at southern Portugal, were investigated using three complementary sampling techniques: underwater visual censuses (UVCs), baited cameras (BCs), and fish traps (FTs). We first investigated the complementarity of these sampling methods to describe species composition. Then, we investigated differences in taxonomic (TD), phylogenetic (PD) and functional diversity (FD) between sides of the marinas according to each sampling method. Finally, we explored the applicability/reproducibility of each sampling technique to characterize fish assemblages according to these metrics of diversity. UVCs and BCs provided complementary information, in terms of the number and abundances of species, while FTs sampled a particular assemblage. Patterns of TD, PD, and FD between sides of the marinas varied depending on the sampling method. UVC was the most cost-efficient technique, in terms of personnel hours, and it is recommended for local studies. However, for large-scale studies, BCs are recommended, as they cover greater spatio-temporal scales at a lower cost. Our study highlights the need to implement complementary sampling techniques to monitor ecological change, at various dimensions of biodiversity. The results presented here will be useful for optimizing future monitoring programs.

  1. Construction quality assurance report

    Energy Technology Data Exchange (ETDEWEB)

    Roscha, V.


    This report provides a summary of the construction quality assurance (CQA) observation and test results, including: the results of the geosynthetic and soil materials conformance testing; the observation and testing results associated with the installation of the soil liners; the observation and testing results associated with the installation of the HDPE geomembrane liner systems; the observation and testing results associated with the installation of the leachate collection and removal systems; the observation and testing results associated with the installation of the working surfaces; the observation and testing results associated with the in-plant manufacturing process; a summary of submittal reviews by Golder Construction Services, Inc.; the submittal and certification of the piping material specifications; the observation and verification associated with the Acceptance Test Procedure results of the operational equipment functions; and a summary of the ECNs which are incorporated into the project.

  2. FESA Quality Assurance

    CERN Multimedia

    CERN. Geneva


    FESA is a framework used by 100+ developers at CERN to design and implement the real-time software used to control the accelerators. Each new version must be tested and qualified to ensure that no backward-compatibility issues have been introduced and that there is no major bug which might prevent accelerator operations. Our quality assurance approach is based on code review and a two-level testing process. The first level consists of unit tests (Python unittest and Google Test for C++). The second level consists of integration tests running in an isolated test environment. We also use a continuous integration service (Bamboo) to ensure the tests are executed periodically and bugs are caught early. In the presentation, we will explain the reasons why we took this approach, the results, and some thoughts on the pros and cons.
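    A first-level test in the style mentioned (Python unittest) might look like the following sketch; the function under test is a stand-in, not part of the real framework:

```python
import unittest

# Illustrative unit test: a small pure function plus a TestCase, run
# programmatically so the result can be inspected. The clamp() helper is
# a hypothetical stand-in for a framework routine.

def clamp(value, lo, hi):
    """Clamp a device setting into its allowed range."""
    return max(lo, min(hi, value))

class ClampTest(unittest.TestCase):
    def test_value_inside_range_is_unchanged(self):
        self.assertEqual(clamp(5, 0, 10), 5)

    def test_values_outside_range_are_clamped(self):
        self.assertEqual(clamp(-3, 0, 10), 0)
        self.assertEqual(clamp(42, 0, 10), 10)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(ClampTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("tests run:", result.testsRun, "successful:", result.wasSuccessful())
```

    In a continuous integration setup such as the one described, a runner executes suites like this on every commit and fails the build on any regression.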

  3. Concrete quality assurance

    Energy Technology Data Exchange (ETDEWEB)

    Holz, N. [Harza Engineering Company, Chicago, IL (United States)


    This short article reports on progress at the world's largest civil construction project, namely China's Three Gorges hydro project. Work goes on around the clock to put in place nearly 28 million m³ of concrete. At every stage of the work there is strong emphasis on quality assurance (QA), and concrete is no exception. The US company Harza Engineering has been providing QA since the mid-1980s, and concrete QA has been based on international standards. Harza personnel work in the field with supervisors, developing educational tools for supervising concrete construction and quality, as well as providing training courses in concrete technology. Some details on flood control, capacity, water quality and environmental aspects are given.

  4. New method to estimate the sample size for calculation of a proportion assuming binomial distribution. (United States)

    Vallejo, Adriana; Muniesa, Ana; Ferreira, Chelo; de Blas, Ignacio


    Nowadays the formula used to calculate the sample size for estimating a proportion (such as a prevalence) is based on the Normal distribution; however, it could instead be based on a Binomial distribution, whose confidence interval can be calculated using the Wilson score method. Comparing the two formulae (Normal and Binomial distributions), the difference in the width of the confidence intervals is relevant in the tails and at the center of the curves. To calculate the needed sample size, we simulated an iterative sampling procedure, which shows an underestimation of the sample size for prevalence values close to 0 or 1, and an overestimation for values close to 0.5. In view of these results, we propose an algorithm based on the Wilson score method that provides sample size values similar to those obtained empirically by simulation. Copyright © 2013 Elsevier Ltd. All rights reserved.
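    A hedged sketch of the kind of algorithm described (the authors' exact procedure may differ): find the smallest n whose Wilson score interval around the expected prevalence p has a half-width no larger than the desired precision d, and compare it with the classical normal-approximation formula:

```python
import math

# Wilson-score-based sample size search versus the classical normal
# approximation n = z^2 p (1-p) / d^2. z = 1.959964 is the 97.5th
# percentile of the standard normal (95% confidence).

def wilson_halfwidth(p, n, z=1.959964):
    denom = 1.0 + z * z / n
    return (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))

def sample_size_wilson(p, d, z=1.959964):
    n = 1
    while wilson_halfwidth(p, n, z) > d:
        n += 1
    return n

def sample_size_normal(p, d, z=1.959964):
    return math.ceil(z * z * p * (1 - p) / (d * d))

for p in (0.02, 0.5):
    print(p, sample_size_normal(p, 0.05), sample_size_wilson(p, 0.05))
```

    Consistent with the abstract, the normal formula returns a smaller n than the Wilson-based search near p = 0 and a larger n near p = 0.5.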

  5. Method for Operating a Sensor to Differentiate Between Analytes in a Sample (United States)

    Kunt, Tekin; Cavicchi, Richard E; Semancik, Stephen; McAvoy, Thomas J


    Disclosed is a method for operating a sensor to differentiate between first and second analytes in a sample. The method comprises the steps of determining an input (temperature) profile for the sensor that will enhance the difference in the output profiles of the sensor between the first analyte and the second analyte; determining a first analyte output profile as observed when the input profile is applied to the sensor; determining a second analyte output profile as observed when the input profile is applied to the sensor; introducing the sensor to the sample while applying the input profile to the sensor, thereby obtaining a sample output profile; and evaluating the sample output profile against the first and second analyte output profiles to determine which of the analytes is present in the sample.
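    The final evaluation step reduces to template matching. A minimal sketch, with fabricated profiles and plain Euclidean distance as the comparison metric (the patent does not specify one):

```python
import numpy as np

# Assign an unknown sensor response to the closest stored reference
# output profile. All profile values here are invented.

reference = {
    "analyte_1": np.array([0.10, 0.45, 0.80, 0.60, 0.20]),
    "analyte_2": np.array([0.10, 0.20, 0.35, 0.70, 0.90]),
}

sample = np.array([0.12, 0.40, 0.75, 0.65, 0.25])  # unknown response

best = min(reference, key=lambda k: np.linalg.norm(sample - reference[k]))
print("closest match:", best)
```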

  6. Method of evaluation of process of red blood cell sedimentation based on photometry of droplet samples. (United States)

    Aristov, Alexander; Nosova, Ekaterina


    The paper focuses on research aimed at creating and testing a new approach to evaluate the processes of aggregation and sedimentation of red blood cells for purpose of its use in clinical laboratory diagnostics. The proposed method is based on photometric analysis of blood sample formed as a sessile drop. The results of clinical approbation of this method are given in the paper. Analysis of the processes occurring in the sample in the form of sessile drop during the process of blood cells sedimentation is described. The results of experimental studies to evaluate the effect of the droplet sample focusing properties on light radiation transmittance are presented. It is shown that this method significantly reduces the sample volume and provides sufficiently high sensitivity to the studied processes.

  7. Phylogenetic representativeness: a new method for evaluating taxon sampling in evolutionary studies. (United States)

    Plazzi, Federico; Ferrucci, Ronald R; Passamonti, Marco


    Taxon sampling is a major concern in phylogenetic studies. Incomplete, biased, or improper taxon sampling can lead to misleading results in reconstructing evolutionary relationships. Several theoretical methods are available to optimize taxon choice in phylogenetic analyses. However, most involve some knowledge about the genetic relationships of the group of interest (i.e., the ingroup), or even a well-established phylogeny itself; these data are not always available in general phylogenetic applications. We propose a new method to assess taxon sampling developing Clarke and Warwick statistics. This method aims to measure the "phylogenetic representativeness" of a given sample or set of samples and it is based entirely on the pre-existing available taxonomy of the ingroup, which is commonly known to investigators. Moreover, our method also accounts for instability and discordance in taxonomies. A Python-based script suite, called PhyRe, has been developed to implement all analyses we describe in this paper. We show that this method is sensitive and allows direct discrimination between representative and unrepresentative samples. It is also informative about the addition of taxa to improve taxonomic coverage of the ingroup. Provided that the investigators' expertise is mandatory in this field, phylogenetic representativeness makes up an objective touchstone in planning phylogenetic studies.
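    The Clarke and Warwick statistics that phylogenetic representativeness builds on are, at heart, measures of average taxonomic distinctness: the mean taxonomic distance over all pairs in a sample, where distance counts how many ranks must be climbed to reach a shared taxon. A toy sketch with an invented miniature taxonomy (not the PhyRe implementation):

```python
from itertools import combinations

# Average taxonomic distinctness (Delta+) on a tiny invented taxonomy.
# Each species maps to its (genus, family, order); pair distance is the
# first rank level at which the two species agree.

taxonomy = {
    "sp1": ("g1", "f1", "o1"),
    "sp2": ("g1", "f1", "o1"),
    "sp3": ("g2", "f1", "o1"),
    "sp4": ("g3", "f2", "o1"),
}

def pair_distance(a, b):
    for level, (ra, rb) in enumerate(zip(taxonomy[a], taxonomy[b]), start=1):
        if ra == rb:
            return level
    return len(taxonomy[a]) + 1          # no shared rank at all

def avg_taxonomic_distinctness(sample):
    pairs = list(combinations(sample, 2))
    return sum(pair_distance(a, b) for a, b in pairs) / len(pairs)

print(avg_taxonomic_distinctness(["sp1", "sp2", "sp3", "sp4"]))
```

    Comparing a candidate sample's value against the distribution obtained from random subsets of the full taxonomy is what separates representative from unrepresentative samples.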

  8. Super-resolution method for arbitrary retrospective sampling in fluorescence tomography with raster scanning photodetectors (United States)

    Zhang, Xiaofeng


    Dense spatial sampling is required in high-resolution optical imaging and many other biomedical optical imaging methods, such as diffuse optical imaging. Arrayed photodetectors, in particular charge-coupled device cameras, are commonly used mainly because of their high pixel count. Nonetheless, discrete-element photodetectors, such as photomultiplier tubes, are often desirable in many performance-demanding imaging applications. However, utilization of discrete-element photodetectors typically requires raster scanning to achieve arbitrary retrospective sampling with high density. Care must be taken in using the relatively large sensitive areas of discrete-element photodetectors to densely sample the image plane. In addition, off-line data analysis and image reconstruction often require full-field sampling. Pixel-by-pixel scanning is not only slow but also unnecessary in diffusion-limited imaging. We propose a super-resolution method that can recover the finer features of an image sampled with a coarse-scale sensor. This general-purpose method is established on the spatial transfer function of the photodetector-lens system, and achieves super-resolution by inversion of this linear transfer function. Regularized optimization algorithms were used to achieve optimized deconvolution. Compared to the uncorrected blurred image, the proposed super-resolution method significantly improved image quality in terms of resolution and quantitation. Using this reconstruction method, the acquisition speed with a scanning photodetector can be dramatically improved without significantly sacrificing sampling density or flexibility.
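    A one-dimensional caricature of the recipe, with assumptions throughout (Gaussian detector aperture, periodic boundaries, noise-free data): model the coarse detector as a linear spatial transfer function and invert it with a Tikhonov-regularized filter in the Fourier domain:

```python
import numpy as np

# Blur a 1-D "image" with a wide Gaussian aperture, then restore it by
# regularized inversion of the transfer function. All parameters are
# illustrative assumptions, not the paper's values.

n = 256
x = np.arange(n)
truth = (np.exp(-0.5 * ((x - 100) / 2.0) ** 2)
         + np.exp(-0.5 * ((x - 115) / 2.0) ** 2))   # two close fine features

sigma = 6.0                                   # detector aperture width
psf = np.exp(-0.5 * ((x - n // 2) / sigma) ** 2)
psf /= psf.sum()
psf = np.roll(psf, -n // 2)                   # center the kernel at index 0

H = np.fft.fft(psf)                           # spatial transfer function
blurred = np.real(np.fft.ifft(np.fft.fft(truth) * H))

lam = 1e-3                                    # regularization strength
G = np.conj(H) / (np.abs(H) ** 2 + lam)       # Tikhonov-regularized inverse
restored = np.real(np.fft.ifft(np.fft.fft(blurred) * G))

err_blurred = np.linalg.norm(blurred - truth)
err_restored = np.linalg.norm(restored - truth)
print(f"L2 error blurred: {err_blurred:.3f}  restored: {err_restored:.3f}")
```

    The regularizer lam trades resolution against noise amplification; with measurement noise present it must be increased accordingly.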

  9. Phylogenetic representativeness: a new method for evaluating taxon sampling in evolutionary studies

    Directory of Open Access Journals (Sweden)

    Passamonti Marco


    Full Text Available Abstract Background Taxon sampling is a major concern in phylogenetic studies. Incomplete, biased, or improper taxon sampling can lead to misleading results in reconstructing evolutionary relationships. Several theoretical methods are available to optimize taxon choice in phylogenetic analyses. However, most involve some knowledge about the genetic relationships of the group of interest (i.e., the ingroup), or even a well-established phylogeny itself; these data are not always available in general phylogenetic applications. Results We propose a new method to assess taxon sampling developing Clarke and Warwick statistics. This method aims to measure the "phylogenetic representativeness" of a given sample or set of samples and it is based entirely on the pre-existing available taxonomy of the ingroup, which is commonly known to investigators. Moreover, our method also accounts for instability and discordance in taxonomies. A Python-based script suite, called PhyRe, has been developed to implement all analyses we describe in this paper. Conclusions We show that this method is sensitive and allows direct discrimination between representative and unrepresentative samples. It is also informative about the addition of taxa to improve taxonomic coverage of the ingroup. Provided that the investigators' expertise is mandatory in this field, phylogenetic representativeness makes up an objective touchstone in planning phylogenetic studies.

  10. Evaluation of Legionella Air Contamination in Healthcare Facilities by Different Sampling Methods: An Italian Multicenter Study

    Directory of Open Access Journals (Sweden)

    Maria Teresa Montagna


    Full Text Available Healthcare facilities (HF) represent an at-risk environment for legionellosis transmission occurring after inhalation of contaminated aerosols. In general, the control of water is preferred to that of air because, to date, there are no standardized sampling protocols. Legionella air contamination was investigated in the bathrooms of 11 HF by active sampling (Surface Air System and Coriolis®μ) and passive sampling using settling plates. During the 8-hour sampling, hot tap water was sampled three times. All air samples were evaluated using culture-based methods, whereas liquid samples collected using the Coriolis®μ were also analyzed by real-time PCR. Legionella presence in the air and water was then compared by sequence-based typing (SBT) methods. Air contamination was found in four HF (36.4%) by at least one of the culturable methods. The culturable investigation by Coriolis®μ did not yield Legionella in any enrolled HF. However, molecular investigation using Coriolis®μ resulted in eight HF testing positive for Legionella in the air. Comparison of Legionella air and water contamination indicated that Legionella water concentration could be predictive of its presence in the air. Furthermore, a molecular study of 12 L. pneumophila strains confirmed a match between the Legionella strains from air and water samples by SBT for three out of four HF that tested positive for Legionella by at least one of the culturable methods. Overall, our study shows that Legionella air detection cannot replace water sampling because the absence of microorganisms from the air does not necessarily represent their absence from water; nevertheless, air sampling may provide useful information for risk assessment. The liquid impingement technique appears to have the greatest capacity for collecting airborne Legionella if combined with molecular investigations.

  11. Evaluation of Legionella Air Contamination in Healthcare Facilities by Different Sampling Methods: An Italian Multicenter Study. (United States)

    Montagna, Maria Teresa; De Giglio, Osvalda; Cristina, Maria Luisa; Napoli, Christian; Pacifico, Claudia; Agodi, Antonella; Baldovin, Tatjana; Casini, Beatrice; Coniglio, Maria Anna; D'Errico, Marcello Mario; Delia, Santi Antonino; Deriu, Maria Grazia; Guida, Marco; Laganà, Pasqualina; Liguori, Giorgio; Moro, Matteo; Mura, Ida; Pennino, Francesca; Privitera, Gaetano; Romano Spica, Vincenzo; Sembeni, Silvia; Spagnolo, Anna Maria; Tardivo, Stefano; Torre, Ida; Valeriani, Federica; Albertini, Roberto; Pasquarella, Cesira


    Healthcare facilities (HF) represent an at-risk environment for legionellosis transmission occurring after inhalation of contaminated aerosols. In general, the control of water is preferred to that of air because, to date, there are no standardized sampling protocols. Legionella air contamination was investigated in the bathrooms of 11 HF by active sampling (Surface Air System and Coriolis®μ) and passive sampling using settling plates. During the 8-hour sampling, hot tap water was sampled three times. All air samples were evaluated using culture-based methods, whereas liquid samples collected using the Coriolis®μ were also analyzed by real-time PCR. Legionella presence in the air and water was then compared by sequence-based typing (SBT) methods. Air contamination was found in four HF (36.4%) by at least one of the culturable methods. The culturable investigation by Coriolis®μ did not yield Legionella in any enrolled HF. However, molecular investigation using Coriolis®μ resulted in eight HF testing positive for Legionella in the air. Comparison of Legionella air and water contamination indicated that Legionella water concentration could be predictive of its presence in the air. Furthermore, a molecular study of 12 L. pneumophila strains confirmed a match between the Legionella strains from air and water samples by SBT for three out of four HF that tested positive for Legionella by at least one of the culturable methods. Overall, our study shows that Legionella air detection cannot replace water sampling because the absence of microorganisms from the air does not necessarily represent their absence from water; nevertheless, air sampling may provide useful information for risk assessment. The liquid impingement technique appears to have the greatest capacity for collecting airborne Legionella if combined with molecular investigations.

  12. Grass thrips (Anaphothrips obscurus) (Thysanoptera: Thripidae) population dynamics and sampling method comparison in timothy. (United States)

    Reisig, Dominic D; Godfrey, Larry D; Marcum, Daniel B


    Sampling studies were conducted on grass thrips, Anaphothrips obscurus (Müller) (Thysanoptera: Thripidae), in timothy, Phleum pratense L. These studies were used to compare the occurrence of brachypterous and macropterous thrips across sampling methods, seasons, and time of day. Information about the population dynamics of this thrips was also revealed. Three absolute and two relative methods were tested at three different dates within a season and three different daily times during four harvest periods. Thrips were counted and different phenotypes were recorded from one of the absolute methods. Absolute methods were the most similar to one another over time of day and within seasonal dates. Relative methods varied in assessing thrips population dynamics over time of day and within seasonal dates. Based on thrips collected from the plant and sticky card counts, macropterous individuals increased in the spring and summer. Thrips aerially dispersed in the summer. An absolute method, the beat cup method (rapping timothy inside a plastic cup), was among the least variable sampling methods and was faster than direct observations. These findings parallel other studies, documenting the commonality of diel and diurnal effects on sampled arthropod abundance and the seasonal effects on population abundance and structure. These studies also demonstrate that estimated population abundance can be markedly affected by temporal patterns as well as shifting adult phenotypes.


    This report compares simultaneous results from three woodstove sampling methods and evaluates particulate emission rates of conventional and Oregon-certified catalytic and noncatalytic woodstoves in six Portland, OR, houses. EPA Methods 5G and 5H and the field emission sampler (A...

  14. A simple and rapid cultural method for detection of Enterobacter sakazakii in environmental samples

    NARCIS (Netherlands)

    Guillaume-Gentil, O.; Sonnard, V.; Kandhai, M.C.; Marugg, J.; Joosten, H.


    A method was developed to detect and identify Enterobacter sakazakii in environmental samples. The method is based on selective enrichment at 45 ± 0.5°C in lauryl sulfate tryptose broth supplemented with 0.5 M NaCl and 10 mg/liter vancomycin (mLST) for 22 to 24 h followed by streaking on tryptone

  15. Comparisons among Small Sample Equating Methods in a Common-Item Design (United States)

    Kim, Sooyeon; Livingston, Samuel A.


    Score equating based on small samples of examinees is often inaccurate for the examinee populations. We conducted a series of resampling studies to investigate the accuracy of five methods of equating in a common-item design. The methods were chained equipercentile equating of smoothed distributions, chained linear equating, chained mean equating,…
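    For reference, chained linear equating in a common-item (NEAT) design composes two within-group linear links: new form X to anchor V in the new-form sample, then anchor V to old form Y in the old-form sample. A minimal sketch on fabricated scores:

```python
import statistics as st

# Chained linear equating: each link maps mean to mean and scales by the
# ratio of standard deviations; the chained function is the composition.
# All score vectors below are invented toy data.

x_new = [12, 15, 18, 20, 22, 25, 28, 30]      # form X, new-form group
v_new = [5, 6, 7, 8, 9, 10, 11, 12]            # anchor V, same group
v_old = [4, 6, 7, 8, 9, 10, 12, 13]            # anchor V, old-form group
y_old = [14, 17, 19, 21, 23, 26, 29, 31]       # form Y, same group

def linear_link(from_scores, to_scores):
    a = st.pstdev(to_scores) / st.pstdev(from_scores)
    b = st.mean(to_scores) - a * st.mean(from_scores)
    return lambda s: a * s + b

x_to_v = linear_link(x_new, v_new)             # link X -> V (population 1)
v_to_y = linear_link(v_old, y_old)             # link V -> Y (population 2)

def chained(x):
    return v_to_y(x_to_v(x))

for x in (12, 21, 30):
    print(f"X = {x:2d}  ->  Y = {chained(x):.2f}")
```

    Chained mean equating is the same construction with the slope of each link fixed to 1; with small samples these restricted links are often more stable than equipercentile methods.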

  16. Estimating sample size for a small-quadrat method of botanical ...

    African Journals Online (AJOL)

    ... in eight plant communities in the Nylsvley Nature Reserve. Illustrates with a table. Keywords: Botanical surveys; Grass density; Grasslands; Mixed Bushveld; Nylsvley Nature Reserve; Quadrat size species density; Small-quadrat method; Species density; Species richness; botany; sample size; method; survey; south africa

  17. Sampling naturally contaminated broiler carcasses for Salmonella by three different methods (United States)

    Postchill neck skin (NS) maceration and whole carcass rinsing (WCR) are frequently used methods to detect salmonellae from commercially processed broilers. These are practical, nondestructive methods, but they are insensitive and may result in frequent false negatives (20 to 40%). NS samples only ...

  18. Cytotoxicity of Light-Cured Dental Materials according to Different Sample Preparation Methods

    Directory of Open Access Journals (Sweden)

    Myung-Jin Lee


    Full Text Available Dental light-cured resins can undergo different degrees of polymerization when applied in vivo. When polymerization is incomplete, toxic monomers may be released into the oral cavity. The present study assessed the cytotoxicity of different materials, using sample preparation methods that mirror clinical conditions. Composite and bonding resins were used and divided into four groups according to sample preparation method: uncured samples; directly cured samples, cured after being placed on solidified agar; post-cured samples, polymerized before being placed on agar; and "removed unreacted layer" samples, which had their oxygen-inhibition layer removed after polymerization. Cytotoxicity was evaluated using an agar diffusion test, MTT assay, and confocal microscopy. Uncured samples were the most cytotoxic, while removed unreacted layer samples were the least cytotoxic (p < 0.05). In the MTT assay, cell viability increased significantly in every group as the concentration of the extracts decreased (p < 0.05). Extracts from post-cured and removed unreacted layer samples of bonding resin were less toxic than post-cured and removed unreacted layer samples of composite resin. Removal of the oxygen-inhibition layer resulted in the lowest cytotoxicity. Clinicians should remove unreacted monomers on the resin surface immediately after restoring teeth with light-curing resin to improve the restoration's biocompatibility.
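    The MTT readout behind these comparisons reduces to a simple calculation: background-corrected absorbance of treated wells relative to untreated controls. The optical densities below are made up:

```python
import numpy as np

# Standard MTT viability calculation on fabricated optical densities:
# viability (%) = (OD_treated - OD_blank) / (OD_control - OD_blank) * 100.

od_blank = 0.05                                # medium only
od_control = np.array([0.82, 0.85, 0.80])      # untreated cells
od_treated = np.array([0.44, 0.47, 0.41])      # cells + resin extract

viability = ((od_treated.mean() - od_blank)
             / (od_control.mean() - od_blank) * 100)
print(f"viability: {viability:.1f} % of control")
```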

  19. Comparative analysis of aspartic acid racemization methods using whole-tooth and dentin samples. (United States)

    Sakuma, Ayaka; Ohtani, Susumu; Saitoh, Hisako; Iwase, Hirotaro


    One way to estimate biological age is to use the aspartic acid (Asp) racemization method. Although this method has been performed mostly using enamel and dentin, we investigated whether an entire tooth can be used for age estimation. This study used 12 pairs of canines extracted from both sides of the mandible of 12 individuals of known age. From each pair, one tooth was used as a dentin sample and the other as a whole-tooth sample. Amino acids were extracted from each sample, and the integrated peak areas of D-Asp and L-Asp were determined using a gas chromatograph/mass spectrometer. Statistical analysis was performed using the D/L-Asp ratio. Furthermore, teeth from two unidentified bodies, later identified as Japanese and Brazilian, were examined in the same manner. Results showed that the D/L ratios of whole-tooth samples were higher overall than those of dentin samples. The correlation coefficient between the D/L ratios of dentin samples and their age was r=0.98, and that of the whole-tooth samples was r=0.93. The difference between estimated age and actual chronological age was -0.116 and -6.86 years in the Japanese and Brazilian cases, respectively. The use of whole teeth makes the racemization technique easier and can standardize the sampling site. Additionally, using only a few tooth samples per analysis made it possible to reanalyze known-age samples. Although the difficulty in obtaining a proper control sample has prevented racemization from being widely used, the method described here not only ensures the availability of a control tooth, but also enables the teeth to be used for other purposes such as DNA analysis. The use of a whole tooth will increase the application of the racemization technique for age determination. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
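    The final step of the method is a calibration problem: regress known ages on measured D/L ratios, then invert the fit for an unknown case. A sketch on fabricated ratios (the paper reports r = 0.93 for whole-tooth samples):

```python
import numpy as np

# Linear calibration of age against the D/L-aspartate ratio, then age
# estimation for a new ratio. All ratios and ages below are invented.

ages = np.array([22, 30, 38, 45, 53, 61, 70, 78], dtype=float)
dl = np.array([0.045, 0.051, 0.058, 0.063, 0.070, 0.077, 0.084, 0.090])

slope, intercept = np.polyfit(dl, ages, 1)     # regress age on D/L ratio

def estimate_age(ratio):
    return slope * ratio + intercept

case_ratio = 0.066
print(f"estimated age for D/L = {case_ratio}: {estimate_age(case_ratio):.1f} y")
```

    In practice each laboratory builds its own calibration from known-age control teeth prepared with the same protocol, which is why the availability of whole-tooth controls matters.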

  20. Sample preparation method for the combined extraction of ethyl glucuronide and drugs of abuse in hair. (United States)

    Meier, Ulf; Briellmann, Thomas; Scheurer, Eva; Dussy, Franz


    Often in hair analysis only a small hair sample is available, while the analysis of a multitude of structurally diverse substances with different concentration ranges is demanded. The different substances often require different sample preparation methods, increasing the amount of hair sample required; when segmental hair analysis is necessary, the amount needed increases further. Therefore, the sample amount required for a full analysis can quickly exceed what is available. To address this problem, a method was developed for combined hair sample preparation using a single extraction procedure for the analysis of ethyl glucuronide with liquid chromatography-multistage fragmentation mass spectrometry/multiple reaction monitoring (LC-MS3/MRM) and of common drugs of abuse with LC-MRM. The combined sample preparation is achieved by separating ethyl glucuronide from the drugs of abuse into separate extracts by fractionation in the solid-phase extraction step during sample clean-up. A full validation of all substances for the parameters selectivity, linearity, limit of detection, limit of quantification, accuracy, precision, matrix effects, and recovery was successfully completed. The following drugs of abuse were included in the method: amphetamine; methamphetamine; 3,4-methylenedioxy-N-methylamphetamine (MDMA); 3,4-methylenedioxyamphetamine (MDA); 3,4-methylenedioxy-N-ethylamphetamine (MDE); morphine; 6-monoacetylmorphine; codeine; acetylcodeine; cocaine; benzoylecgonine; norcocaine; cocaethylene; methadone; 2-ethylidene-1,5-dimethyl-3,3-diphenylpyrrolidine (EDDP); and methylphenidate. In conclusion, as only one sample preparation with one aliquot of hair is needed, the presented method allows optimal analysis of both ethyl glucuronide and the drugs of abuse, even when the sample amount is a limiting factor. Copyright © 2017 John Wiley & Sons, Ltd.

  1. [Simulation on design-based and model-based methods in descriptive analysis of complex samples]. (United States)

    Li, Yichong; Yu, Shicheng; Zhao, Yinjun; Jiang, Yong; Wang, Limin; Zhang, Mei; Jiang, Wei; Bao, Heling; Zhou, Maigeng; Jiang, Bo


    To compare design-based and model-based methods in the descriptive analysis of complex samples. A total of 1 000 samples were drawn using the multistage random sampling design of the 2010 China chronic disease and risk factors surveillance. For each simulated sample, cases were randomly deleted with probability proportional to age so that the sample age structure deviated systematically from that of the target population. Mean systolic blood pressure (SBP) and the prevalence of raised blood pressure, as well as their 95% confidence intervals (95%CI), were determined using design-based and model-based methods (routine method and multi-level model). For the estimators generated by these three methods, the mean squared error (MSE) was computed to evaluate their validity. To compare the performance of statistical inference of these methods, the probability of the 95%CI covering the true parameter (the population mean SBP and raised blood pressure prevalence) was used. The MSE of the mean estimator for the routine method, design-based analysis and multilevel model was 6.41, 1.38 and 5.86, respectively; the probability of the 95%CI covering the true parameter was 24.7%, 97.5% and 84.3%, respectively. The routine method and the multi-level model probably led to an increased probability of type I error in statistical inference. The MSE of the prevalence estimator was 4.80 for the design-based method, far lower than those for the routine method (20.9) and the multilevel model (17.2). The probability of the 95%CI covering the true prevalence was only 29.4% for the routine method and 86.4% for the multilevel model, both lower than that for the design-based method (97.3%). Compared with the routine method and the multi-level model, the design-based method performed best in both point estimation and confidence interval construction. The design-based method should be the first choice for the statistical description of complex samples with a systematically biased sample structure.
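    The contrast between the routine (unweighted) and the design-based analysis can be illustrated with a toy simulation. All numbers below are invented for illustration; the sketch only shows why inverse-probability weighting repairs a systematically biased sample age structure:

```python
import random

random.seed(0)

# Toy population: 70% "young" stratum (SBP ~ 120) and 30% "old" stratum
# (SBP ~ 140).  All numbers are illustrative, not survey values.
population = [(0, random.gauss(120, 10)) for _ in range(7000)]
population += [(1, random.gauss(140, 10)) for _ in range(3000)]
true_mean = sum(v for _, v in population) / len(population)

# Draw a sample whose age structure deviates systematically from the
# population: the old stratum is sampled at three times the rate.
prob = {0: 0.05, 1: 0.15}
sample = [(s, v) for s, v in population if random.random() < prob[s]]

# Routine (unweighted) estimate ignores the design and is biased.
naive = sum(v for _, v in sample) / len(sample)

# Design-based estimate weights each unit by 1 / inclusion probability.
weighted = sum(v / prob[s] for s, v in sample) / sum(1 / prob[s] for s, _ in sample)

print(round(true_mean, 1), round(naive, 1), round(weighted, 1))
```

The unweighted mean is pulled toward the over-sampled stratum, while the inverse-probability-weighted mean lands near the population value, mirroring the MSE and coverage gap the study reports.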

  2. Comparison of three methods for concentration of rotavirus from artificially spiked shellfish samples

    Directory of Open Access Journals (Sweden)

    Vysakh Mohan


    Full Text Available Background: Shellfish are a nutritious food source whose consumption and commercial value have risen dramatically worldwide. Being filter feeders, shellfish concentrate particulate matter, including microorganisms such as pathogenic bacteria and viruses, and thus constitute a major public health concern. Because PCR inhibitors are present in shellfish, effective preliminary sample treatment steps, such as concentration of virus from the shellfish, are essential before RNA/DNA isolation to ensure PCR accuracy and reproducibility. Aim: The current study was done to compare three methods for concentration of rotavirus from shellfish samples. Materials and Methods: Shellfish samples artificially spiked with tenfold serial dilutions of a known concentration of rotavirus were subjected to three different concentration methods, namely proteinase K treatment, precipitation with polyethylene glycol (PEG) 8000, and use of lysis buffer. RNA was isolated from the concentrated samples using the phenol-chloroform method, and rotaviral RNA was detected using RT-PCR. Results: Concentration of virus using proteinase K or lysis buffer yielded better results than concentration by PEG 8000 in the samples with the lowest concentration of virus. Of these two methods, proteinase K treatment was superior, as it showed better amplification of the highest dilution (10⁷) used. Conclusion: Treatment with proteinase K was better than the other two methods, as it could detect the viral RNA in all three tenfold serial dilutions.

  3. A comparison of microscopic and spectroscopic identification methods for analysis of microplastics in environmental samples. (United States)

    Song, Young Kyoung; Hong, Sang Hee; Jang, Mi; Han, Gi Myung; Rani, Manviri; Lee, Jongmyoung; Shim, Won Joon


    The analysis of microplastics in various environmental samples requires the identification of microplastics among natural materials, but the identification technique lacks a standardized protocol. Herein, stereomicroscope and Fourier transform infrared spectroscopy (FT-IR) identification methods for microplastics were compared; the results of the two methods were significantly (p < 0.05) different. Depending on the number of samples and the microplastic size range of interest, the appropriate identification method should be determined; selecting a suitable identification method for microplastics is crucial for evaluating microplastic pollution. Copyright © 2015 Elsevier Ltd. All rights reserved.


    Directory of Open Access Journals (Sweden)

    Maja Vrkljan


    Full Text Available The purpose of this study was to determine an optimal dissolution method for silicate rock samples for further analytical purposes. A flame atomic absorption spectrometry (FAAS) method for determining cobalt, chromium, copper, nickel, lead and zinc content in a gabbro sample and in the geochemical standard AGV-1 was applied for verification. Dissolution in mixtures of various inorganic acids was tested, as well as the Na2CO3 fusion technique. The results obtained by the different methods were compared, and dissolution in the mixture of HNO3 + HF is recommended as optimal.

  5. Analysis of aroma compounds of Roselle by Dynamic Headspace Sampling using different preparation methods

    DEFF Research Database (Denmark)

    Juhari, Nurul Hanisah Binti; Varming, Camilla; Petersen, Mikael Agerlin


    The influence of different methods of sample preparation on the aroma profiles of dried Roselle (Hibiscus sabdariffa) was studied. The smallest amounts of aroma compounds were recovered by analysis of whole dry calyxes (WD), followed by ground dry (GD), blended together with water (BTW), and ground and then mixed with water (GMW). The highest number of aroma compounds was found in Roselle treated in a water bath (2 h/40°C) (GMWKB). GMW was chosen as the preparation method because it was shown to be an efficient extraction method without the possibility of excessive chemical changes to the sample.

  6. Field sampling and selecting on-site analytical methods for explosives in soil

    Energy Technology Data Exchange (ETDEWEB)

    Crockett, A.B.; Craig, H.D.; Jenkins, T.F.; Sisk, W.E.


    A large number of defense-related sites are contaminated with elevated levels of secondary explosives. Levels of contamination range from barely detectable to levels above 10% that need special handling because of the detonation potential. Characterization of explosives-contaminated sites is particularly difficult because of the very heterogeneous distribution of contamination in the environment and within samples. To improve site characterization, several options exist including collecting more samples, providing on-site analytical data to help direct the investigation, compositing samples, improving homogenization of the samples, and extracting larger samples. This publication is intended to provide guidance to Remedial Project Managers regarding field sampling and on-site analytical methods for detecting and quantifying secondary explosive compounds in soils, and is not intended to include discussions of the safety issues associated with sites contaminated with explosive residues.

  7. System and method for liquid extraction electrospray-assisted sample transfer to solution for chemical analysis (United States)

    Kertesz, Vilmos; Van Berkel, Gary J.


    A system for sampling a surface includes a surface sampling probe comprising a solvent liquid supply conduit and a distal end, and a sample collector for suspending a sample collection liquid adjacent to the distal end of the probe. A first electrode provides a first voltage to solvent liquid at the distal end of the probe. The first voltage produces a field sufficient to generate an electrospray plume at the distal end of the probe. A second electrode provides a second voltage and is positioned to produce a plume-directing field sufficient to direct the electrospray droplets and ions to the suspended sample collection liquid. The second voltage is less than the first voltage in absolute value. A voltage supply system supplies the voltages to the first electrode and the second electrode. The first electrode can apply the first voltage directly to the solvent liquid. A method for sampling a surface is also disclosed.

  8. Recent Trends in Quality Assurance (United States)

    Amaral, Alberto; Rosa, Maria Joao


    In this paper we present a brief description of the evolution of quality assurance in Europe, paying particular attention to its relationship to the rising loss of trust in higher education institutions. We finalise by analysing the role of the European Commission in the setting up of new quality assurance mechanisms that tend to promote…

  9. QANU - Quality Assurance Netherlands Universities

    DEFF Research Database (Denmark)

    Jensen, Henrik Toft; Maria E., Weber; Vyt, André

    The Quality Assurance Netherlands Universities (QANU) underwent an ENQA-coordinated external review in 2016. The review was chaired by Henrik Toft Jensen, Research fellow at Roskilde University (RUC), Denmark.

  10. Development of quality assurance manual for fabrication of DUPIC fuel

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young Gun; Lee, J. W.; Kim, S. S. and others


    A quality assurance manual for the fabrication of high-quality DUPIC fuel was developed. The quality assurance policy established by this manual is to assure that DUPIC fuel elements supplied to customers conform to the specified requirements of the customer and to applicable codes and standards. The management of KAERI is committed to the implementation and maintenance of the program described by this manual. The manual describes a quality assurance program for DUPIC fuel fabrication that complies with CAN3-Z299.2-85 to the extent needed and appropriate. It describes the methods that DUPIC Fuel Development Team (DFDT) personnel must follow to achieve and assure high product quality, as well as the quality management system applicable to the activities performed at the DFDT.

  11. Validation of the ANSR Listeria method for detection of Listeria spp. in environmental samples. (United States)

    Wendorf, Michael; Feldpausch, Emily; Pinkava, Lisa; Luplow, Karen; Hosking, Edan; Norton, Paul; Biswas, Preetha; Mozola, Mark; Rice, Jennifer


    ANSR Listeria is a new diagnostic assay for detection of Listeria spp. in sponge or swab samples taken from a variety of environmental surfaces. The method is an isothermal nucleic acid amplification assay based on the nicking enzyme amplification reaction technology. Following single-step sample enrichment for 16-24 h, the assay is completed in 40 min, requiring only simple instrumentation. In inclusivity testing, 48 of 51 Listeria strains tested positive, with only the three strains of L. grayi producing negative results. Further investigation showed that L. grayi is reactive in the ANSR assay, but its ability to grow under the selective enrichment conditions used in the method is variable. In exclusivity testing, 32 species of non-Listeria, Gram-positive bacteria all produced negative ANSR assay results. Performance of the ANSR method was compared to that of the U.S. Department of Agriculture-Food Safety and Inspection Service reference culture procedure for detection of Listeria spp. in sponge or swab samples taken from inoculated stainless steel, plastic, ceramic tile, sealed concrete, and rubber surfaces. Data were analyzed using Chi-square and probability of detection models. Only one surface, stainless steel, showed a significant difference in performance between the methods, with the ANSR method producing more positive results. Results of internal trials were supported by findings from independent laboratory testing. The ANSR Listeria method can be used as an accurate, rapid, and simple alternative to standard culture methods for detection of Listeria spp. in environmental samples.

  12. Method for producing a thin sample band in a microchannel device (United States)

    Griffiths, Stewart K [Livermore, CA; Nilson, Robert H [Cardiff, CA


    The present invention improves the performance of microchannel systems for chemical and biological synthesis and analysis by providing a method and apparatus for producing a thin band of a species sample. Thin sample bands improve the resolution of microchannel separation processes, as well as many other processes requiring precise control of sample size and volume. The new method comprises a series of steps in which a species sample is manipulated by controlled transport through a junction formed at the intersection of four or more channels. A sample is first inserted into the end of one of these channels in the vicinity of the junction. Next, this sample is thinned by transport across the junction one or more times. During these thinning steps, flow enters the junction through one of the channels and exits through those remaining, providing a divergent flow field that progressively stretches and thins the band with each traverse of the junction. The thickness of the resulting sample band may be smaller than the channel width. Moreover, the thickness of the band may be varied and controlled by altering the method alone, without modification to the channel or junction geometries. The invention is applicable to both electroosmotic and electrophoretic transport, to combined electrokinetic transport, and to some special cases in which bulk fluid transport is driven by pressure gradients. It is further applicable to channels that are open, filled with a gel or filled with a porous or granular material.

  13. A new dissolved gas sampling method from primary water of the Paks Nuclear Power Plant, Hungary

    Energy Technology Data Exchange (ETDEWEB)

    Papp, L., E-mail: [Institute for Nuclear Research, Hungarian Academy of Sciences, Debrecen (Hungary); Isotoptech Co. Ltd., Debrecen (Hungary); Palcsu, L. [Institute for Nuclear Research, Hungarian Academy of Sciences, Debrecen (Hungary); Veres, M. [Isotoptech Co. Ltd., Debrecen (Hungary); Pintér, T. [Paks Nuclear Power Plant, Paks (Hungary)


    Highlights: • We constructed and applied a lightweight portable dissolved gas sampling device. • A membrane contactor has been used to sample the dissolved gases from the water. • Gas compound and gamma spectrometric measurements were done from the samples. - Abstract: This article describes a novel sampling method for dissolved gases from radioactive waters. The major aim was to build a portable, lightweight sampling device in which the gas sample container is not in contact with the water itself. Therefore, a membrane contactor was used to take representative dissolved gas samples from the water of spent fuel pools. Quadrupole mass spectrometric and gamma spectrometric measurements were made on the samples to determine the gas composition and to detect any radioactive gas of fission origin. The paper describes (i) the construction of the sampler in general, (ii) the operation of the sampling unit and (iii) the measurement results of the first samples and the interpretation of the data. Both small and large fluctuations could be detected when freshly spent fuel rods were put into the spent fuel pool or when the head valves of the fuel rod casings were replaced. In the investigated period (2013–2014), the main gas composition did not show large fluctuations and was close to the composition of dissolved air. However, the activity concentration of ⁸⁵Kr varied over a broad range (0.001–100 kBq/l).

  14. A Comparison between Three Methods of Language Sampling: Freeplay, Narrative Speech and Conversation

    Directory of Open Access Journals (Sweden)

    Yasser Rezapour


    Full Text Available Objectives: Spontaneous language sample analysis is an important part of the language assessment protocol. Language samples give useful information about how children use language in the natural situations of daily life. The purpose of this study was to compare conversation, freeplay, and narrative speech with respect to Mean Length of Utterance (MLU), type-token ratio (TTR), and the number of utterances. Methods: By the cluster sampling method, a total of 30 Semnanian five-year-old boys with normal speech and language development were selected from the active kindergartens in Semnan city. Conversation, freeplay, and narrative speech were the three language sample elicitation methods applied to obtain 15 minutes of each child's spontaneous language. Means for MLU, TTR, and the number of utterances were analyzed by dependent ANOVA. Results: The results showed no significant difference in the number of elicited utterances among the three language sampling methods. Narrative speech elicited a longer MLU than freeplay and conversation, and conversation elicited a higher TTR than freeplay and narrative speech. Discussion: The results suggest that in the clinical assessment of Persian-speaking children, it is better to use narrative speech to elicit a longer MLU and conversation to elicit a higher TTR.
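    The two outcome measures above are simple to compute from a transcript. A minimal sketch (counting words rather than morphemes for MLU, a common simplification; the example utterances are invented) might be:

```python
def mlu(utterances):
    # Mean Length of Utterance: average words per utterance (words are a
    # simplification; clinical MLU is usually counted in morphemes).
    return sum(len(u.split()) for u in utterances) / len(utterances)

def ttr(utterances):
    # Type-token ratio: distinct word types divided by total word tokens.
    tokens = [w.lower() for u in utterances for w in u.split()]
    return len(set(tokens)) / len(tokens)

sample = ["the dog runs", "the dog eats food", "dogs run fast"]
print(round(mlu(sample), 2))  # → 3.33
print(round(ttr(sample), 2))  # → 0.8
```

Longer, more elaborated utterances raise MLU, while lexical repetition (as in casual conversation about a shared topic) lowers TTR, which is why the two measures favor different elicitation contexts.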

  15. An evaluation of sampling methods for the detection of Escherichia coli and Salmonella on Turkey carcasses. (United States)

    McEvoy, J M; Nde, C W; Sherwood, J S; Logue, C M


    The efficacy of rinse, excision, and swab methods for the microbiological analysis of prechill turkey carcasses was investigated. Aerobic plate counts from a 50-cm2 area of the breast sampled by excision and by swabbing were compared. Escherichia coli and Salmonella recoveries were determined from turkeys sampled by a carcass rinse (CR), a modified rinse with the carcass supported in a swing (MCR), a two-site swab of 50 cm2 at the back and thigh (2S), a one-site swab of 50 cm2 beneath the wing (1S), a whole-carcass swab of the inner and outer carcass surface (WS), and excision of 25 g of neck skin tissue (NE). The effect of diluent volume (25, 50, and 100 ml) on E. coli counts from swab samples was also assessed. The aerobic plate count from breast tissue sampled by excision was greater than that obtained by swabbing (P < 0.05). E. coli counts did not differ significantly with diluent volume, and E. coli recoveries by the MCR, 2S, 1S, and WS methods were similar. For swabs stomached in 50 ml of diluent, Salmonella recoveries by the WS and MCR methods were higher than those by the 2S and 1S methods. Excision was more effective than swabbing for obtaining total bacterial counts from reduced turkey carcass areas. Whole-carcass sampling by rinsing or swabbing is necessary for optimum Salmonella recovery. Sampling a reduced area of the carcass is sufficient for E. coli analysis.

  16. Automated PCR setup for forensic casework samples using the Normalization Wizard and PCR Setup robotic methods. (United States)

    Greenspoon, S A; Sykes, K L V; Ban, J D; Pollard, A; Baisden, M; Farr, M; Graham, N; Collins, B L; Green, M M; Christenson, C C


    Human genome, pharmaceutical and research laboratories have long enjoyed the application of robotics to performing repetitive laboratory tasks. However, the utilization of robotics in forensic laboratories for processing casework samples is relatively new and poses particular challenges. Since the quantity and quality (a mixture versus a single-source sample, the level of degradation, the presence of PCR inhibitors) of the DNA contained within a casework sample is unknown, particular attention must be paid to procedural susceptibility to contamination, as well as to DNA yield, especially as it pertains to samples with little biological material. The Virginia Department of Forensic Science (VDFS) has successfully automated forensic casework DNA extraction utilizing the DNA IQ™ System in conjunction with the Biomek 2000 Automation Workstation. Human DNA quantitation is also performed in a nearly complete automated fashion utilizing the AluQuant Human DNA Quantitation System and the Biomek 2000 Automation Workstation. Recently, the PCR setup for casework samples has been automated, employing the Biomek 2000 Automation Workstation and Normalization Wizard, Genetic Identity version, which utilizes the quantitation data, imported into the software, to create a customized automated method for DNA dilution, unique to that plate of DNA samples. The PCR Setup software method, used in conjunction with the Normalization Wizard method and written for the Biomek 2000, functions to mix the diluted DNA samples, transfer the PCR master mix, and transfer the diluted DNA samples to PCR amplification tubes. Once the process is complete, the DNA extracts, still on the deck of the robot in PCR amplification strip tubes, are transferred to pre-labeled 1.5 mL tubes for long-term storage using an automated method. The automation of these steps in the process of forensic DNA casework analysis has been accomplished by performing extensive optimization, validation and testing of the automated methods.

  17. Error baseline rates of five sample preparation methods used to characterize RNA virus populations. (United States)

    Kugelman, Jeffrey R; Wiley, Michael R; Nagle, Elyse R; Reyes, Daniel; Pfeffer, Brad P; Kuhn, Jens H; Sanchez-Lockhart, Mariano; Palacios, Gustavo F


    Individual RNA viruses typically occur as populations of genomes that differ slightly from each other due to mutations introduced by the error-prone viral polymerase. Understanding the variability of RNA virus genome populations is critical for understanding virus evolution because individual mutant genomes may gain evolutionary selective advantages and give rise to dominant subpopulations, possibly even leading to the emergence of viruses resistant to medical countermeasures. Reverse transcription of virus genome populations followed by next-generation sequencing is the only available method to characterize variation for RNA viruses. However, both steps may lead to the introduction of artificial mutations, thereby skewing the data. To better understand how such errors are introduced during sample preparation, we determined and compared error baseline rates of five different sample preparation methods by analyzing in vitro transcribed Ebola virus RNA from an artificial plasmid-based system. These methods included: shotgun sequencing from plasmid DNA or in vitro transcribed RNA as a basic "no amplification" method, amplicon sequencing from the plasmid DNA or in vitro transcribed RNA as a "targeted" amplification method, sequence-independent single-primer amplification (SISPA) as a "random" amplification method, rolling circle reverse transcription sequencing (CirSeq) as an advanced "no amplification" method, and Illumina TruSeq RNA Access as a "targeted" enrichment method. The measured error frequencies indicate that RNA Access offers the best tradeoff between sensitivity and sample preparation error (1.4 × 10⁻⁵) of all compared methods.
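    An error baseline rate of this kind is, in essence, the frequency of mismatches per sequenced base relative to a known reference. A toy sketch (with an invented reference and reads, ignoring alignment and indels) is:

```python
# Invented reference sequence and reads, for illustration only; real
# pipelines align reads and handle indels, which this sketch ignores.
reference = "ACGTACGTAC"
reads = ["ACGTACGTAC", "ACGTACGAAC", "ACGTTCGTAC"]

# Count mismatching positions across all reads (substitution errors only).
errors = sum(base != ref for read in reads for base, ref in zip(read, reference))
bases = sum(len(read) for read in reads)
error_rate = errors / bases
print(errors, bases, round(error_rate, 4))
```

Because the study sequences in vitro transcripts of a known plasmid, every observed mismatch can be attributed to sample preparation or sequencing rather than to true viral variation, which is what makes the baseline comparison possible.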

  18. [Estimate methods used with complex sampling designs: their application in the Cuban 2001 health survey]. (United States)

    Cañizares Pérez, Mayilée; Barroso Utra, Isabel; Alfonso León, Alina; García Roche, René; Alfonso Sagué, Karen; Chang de la Rosa, Martha; Bonet Gorbea, Mariano; León, Esther M


    To look at the individual features of three different methods used to estimate simple parameters--means, totals, and percentages, as well as their standard errors--and of logistic regression models, and to describe how such methods can be used for analyzing data obtained from complex samples. Data from Cuba's Second National Survey of Risk Factors and Non-Communicable Chronic Ailments [Segunda Encuesta Nacional de Factores de Riesgo y Afecciones Crónicas No Transmisibles], conducted in 2001, were studied. A complex, stratified, multi-stage cluster sampling design was used. Cuba's 14 provinces and the municipality of Isla de la Juventud served as the strata, while the clusters consisted of sampled geographic areas (SGA), blocks, and sectors. Samples were weighted in inverse proportion to their probability of being selected, and estimates were performed by sex and age group (15-34, 35-54, 55-74, and 75 or more years). Taylor approximations were used to estimate variances. Three statistical methods were compared: conventional analysis, which assumes all data were obtained through simple random sampling; weighted analysis, which only takes the sample weights into account when performing estimates; and adjusted analysis, which takes all aspects of the sampling design into account (namely, the disparity in the probability of being included in the sample and the effect of clustering on the data). The point estimates obtained with the three analytic methods were similar. Standard error (SE) estimates for the prevalence of overweight and of arterial hypertension obtained by conventional analysis were underestimated by 19.3% and by more than 11.5%, respectively, when compared to those obtained with the other two analytic methods. On the other hand, weighted analysis generated SE values that were much smaller than those obtained with the other two types of analyses. The same pattern was noted when odds ratios were estimated.

  19. Current Practices in Constructing and Evaluating Assurance Cases With Applications to Aviation (United States)

    Rinehart, David J.; Knight, John C.; Rowanhill, Jonathan


    This report introduces and provides an overview of assurance cases including theory, practice, and evaluation. This report includes a section that introduces the principles, terminology, and history of assurance cases. The core of the report presents twelve example uses of assurance cases from a range of domains, using a novel classification scheme. The report also reviews the state of the art in assurance case evaluation methods.

  20. Quality Assurance of Ultrasonic Diagnosis in Breast

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Soo Young; Kim, Hong Dae [Hallym University, Kangnam Sacred Heart Hospital, Seoul (Korea, Republic of)


    Sonography is a subjective diagnostic method that is highly dependent on the experience of the operator and on equipment quality, which requires real-time adjustment. Breast screening examination currently consists of clinical examination and mammography. Breast sonography, either supplementary to mammography or independently, is indicated for the dense breast, especially in younger women. Breast sonography is especially applicable to Korean women because of their denser breast parenchyma and an incidence of breast cancer occurring approximately 10 years younger than in Western women. Because of the high rate of false-positive lesions in breast parenchyma, which in this respect differs from other organs such as the liver or the kidney, a quality assurance program for breast sonography is essential to avoid unnecessary breast biopsy. Quality assurance of breast ultrasound involves quality assurance of the equipment, imaging display and acquisition of clinical images, personnel qualifications, and other aspects such as unification of the lexicon, guidelines for diagnostic examination, the reporting system (the US BI-RADS reporting system), assessment items and organization, an education program, medical audit, certification issues, and medicolegal issues. A breast sonographic quality assurance system should be established before a scheme of governmental medical insurance for breast sonography is initiated.

  1. Preparation of Samples for Leaf Architecture Studies, A Method for Mounting Cleared Leaves

    Directory of Open Access Journals (Sweden)

    Alejandra Vasco


    Full Text Available Premise of the study: Several recent waves of interest in leaf architecture have shown an expanding range of approaches and applications across a number of disciplines. Despite this increased interest, examination of existing archives of cleared and mounted leaves shows that current methods for mounting, in particular, yield unsatisfactory results and deterioration of samples over relatively short periods. Although techniques for clearing and staining leaves are numerous, published techniques for mounting leaves are scarce. Methods and Results: Here we present a complete protocol and recommendations for clearing, staining, and imaging leaves, and, most importantly, a method to permanently mount cleared leaves. Conclusions: The mounting protocol is faster than other methods, inexpensive, and straightforward; moreover, it yields clear and permanent samples that can easily be imaged, scanned, and stored. Specimens mounted with this method preserve well, with leaves that were mounted more than 35 years ago showing no signs of bubbling or discoloration.

  2. Rapid, sensitive and cost-effective method for isolation of viral DNA from faecal samples of dogs

    Directory of Open Access Journals (Sweden)



    Full Text Available A simple method for viral DNA extraction using Chelex resin was developed. The method is eco-friendly and cost-effective compared with other methods, such as the phenol-chloroform method, which uses health-hazardous organic reagents. Further, a polymerase chain reaction (PCR)-based detection of canine parvovirus (CPV) using primers from a conserved region of the VP2 gene was developed. To increase the sensitivity and specificity of the reaction, a nested PCR was designed. The PCR was optimized to amplify a 747 bp product of the VP2 gene. The assay can be completed in a few hours and does not require hazardous chemicals. Thus, sample preparation using the chelating resin along with nested PCR appears to be a sensitive, specific and practical method for the detection of CPV in diarrhoeal faecal samples. [Vet. World 2010; 3(3): 105-106]

  3. A thickness measurement method for biological samples using lensed-fiber sensors (United States)

    Kim, Do-Hyun; Ilev, Ilko K.; Han, Young-Geun


    We present a simple fiber-optic confocal method for high-precision thickness measurement of optically transparent and non-transparent objects that require noncontact measurement. The method is based on measurement of confocal backreflection responses from the opposite surfaces of objects, which imposes no essential limitations on the shape, thickness, or transparency of the test objects. A reference comparison method is adopted to eliminate errors common to confocal microscope designs. The measurement error depends strongly on the axial response of the confocal microscope and was measured to be 5.0 μm using a single-mode optical fiber, 60× objective lenses, and a 658 nm laser source. We demonstrate the method using lensed-fiber sensors, which reduce the size of the experimental setup so that it can be used for smaller samples in in vivo settings. We demonstrate a proof-of-concept measurement on biological samples.
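
The two-surface readout described above can be sketched in a few lines: thickness is taken as the separation of the two strongest backreflection peaks in an axial scan, optionally scaled by a refractive index when scanning through a transparent sample. The function names, the simple peak model, and the scaling are illustrative assumptions of ours, not the authors' implementation.

```python
def surface_peaks(z, intensity):
    """Axial positions of the two strongest local maxima in a confocal
    backreflection scan (a local maximum exceeds both neighbours)."""
    peaks = sorted(
        ((intensity[i], z[i]) for i in range(1, len(z) - 1)
         if intensity[i - 1] < intensity[i] > intensity[i + 1]),
        reverse=True)
    if len(peaks) < 2:
        raise ValueError("need two backreflection peaks")
    z_a, z_b = peaks[0][1], peaks[1][1]
    return min(z_a, z_b), max(z_a, z_b)

def thickness(z, intensity, refractive_index=1.0):
    """Thickness from the peak separation; when scanning through a
    transparent sample, the apparent separation is scaled by the
    (assumed known) refractive index."""
    z_front, z_back = surface_peaks(z, intensity)
    return (z_back - z_front) * refractive_index
```

In practice the peaks would be located on a dense, noisy axial profile; a real implementation would fit the peak shapes rather than take raw maxima.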

  4. Transformation-cost time-series method for analyzing irregularly sampled data (United States)

    Ozken, Ibrahim; Eroglu, Deniz; Stemler, Thomas; Marwan, Norbert; Bagci, G. Baris; Kurths, Jürgen


    Irregular sampling of data sets is one of the challenges often encountered in time-series analysis, since traditional methods cannot be applied and the frequently used interpolation approach can corrupt the data and bias the subsequent analysis. Here we present the TrAnsformation-Cost Time-Series (TACTS) method, which allows us to analyze irregularly sampled data sets without degrading the data. Instead of using interpolation, we consider time-series segments and determine how close they are to each other by determining the cost needed to transform one segment into the following one. Using a limited set of operations, each with an associated cost, to transform the time-series segments, we determine a new time series: the transformation-cost time series. This cost time series is regularly sampled and can be analyzed using standard methods. While our main interest is the analysis of paleoclimate data, we develop our method using numerical examples like the logistic map and the Rössler oscillator. These numerical examples allow us to test the stability of our method against noise and for different irregular samplings. In addition, we provide guidance on how to choose the associated costs based on the time series at hand. The usefulness of the TACTS method is demonstrated using speleothem data from the Secret Cave in Borneo that is a good proxy for paleoclimatic variability in the monsoon activity around the maritime continent.
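
The transformation-cost idea above can be illustrated with a toy sketch. This is not the published TACTS metric: the cost weights, the in-order point matching, and the omission of time-shift costs are all simplifying assumptions of ours.

```python
def segment_cost(seg_a, seg_b, point_penalty=1.0, amp_weight=0.5):
    """Toy transformation cost between two time-series segments.

    Points matched in order contribute a weighted amplitude difference;
    surplus points in the longer segment are treated as insertions or
    deletions and charged a fixed penalty. The published TACTS metric
    also prices shifts in the sampling times; this sketch omits that.
    """
    n = min(len(seg_a), len(seg_b))
    cost = sum(abs(a - b) * amp_weight for a, b in zip(seg_a, seg_b))
    cost += point_penalty * (max(len(seg_a), len(seg_b)) - n)
    return cost

def cost_series(series, window):
    """Regularly sampled cost series from consecutive windows."""
    segments = [series[i:i + window]
                for i in range(0, len(series) - window + 1, window)]
    return [segment_cost(a, b) for a, b in zip(segments, segments[1:])]
```

The resulting cost series is regularly spaced even when the original samples are not, which is what makes standard analysis tools applicable again.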

  5. Phylogenetic representativeness: a new method for evaluating taxon sampling in evolutionary studies


    Passamonti Marco; Ferrucci Ronald R; Plazzi Federico


    Abstract Background Taxon sampling is a major concern in phylogenetic studies. Incomplete, biased, or improper taxon sampling can lead to misleading results in reconstructing evolutionary relationships. Several theoretical methods are available to optimize taxon choice in phylogenetic analyses. However, most involve some knowledge about the genetic relationships of the group of interest (i.e., the ingroup), or even a well-established phylogeny itself; these data are not always available in ge...

  6. Development of New Methods and Software for Distance Sampling Surveys of Cetacean Populations (United States)


    …2007). Populations of animals were created using the WiSP package of routines developed by Borchers et al. (2002). This package can produce … monotonic manner). Distance sampling methods are also incorporated into the WiSP package, and take into account imperfect detectability along transects … placed within the rectangular survey region. One limitation of the distance sampling implementation of WiSP is that the transects can only be oriented …

  7. A method for disaggregating clay concretions and eliminating formalin smell in the processing of sediment samples

    DEFF Research Database (Denmark)

    Cedhagen, Tomas


    A complete handling procedure for processing sediment samples is described. It includes some improvements of conventional methods. The fixed sediment sample is mixed with a solution of the alkaline detergent AJAX® (Colgate-Palmolive). It is kept at 80-90°C for 20-40 min. This treatment facilitates...... subsequent sorting as it disaggregates clay concretions and faecal pellets but leaves even fragile organisms clean and unaffected. The ammonia in the detergent eliminates the formalin smell....

  8. A novel method of sampling gingival crevicular fluid from a mouse model of periodontitis. (United States)

    Matsuda, Shinji; Movila, Alexandru; Suzuki, Maiko; Kajiya, Mikihito; Wisitrasameewong, Wichaya; Kayal, Rayyan; Hirshfeld, Josefine; Al-Dharrab, Ayman; Savitri, Irma J; Mira, Abdulghani; Kurihara, Hidemi; Taubman, Martin A; Kawai, Toshihisa


    Using a mouse model of silk ligature-induced periodontal disease (PD), we report a novel method of sampling mouse gingival crevicular fluid (GCF) to evaluate the time-dependent secretion patterns of bone resorption-related cytokines. GCF is a serum transudate containing host-derived biomarkers which can represent cellular response in the periodontium. As such, human clinical evaluations of PD status rely on sampling this critical secretion. At the same time, a method of sampling GCF from mice is absent, hindering the translational value of mouse models of PD. Therefore, we herein report a novel method of sampling GCF from a mouse model of periodontitis, involving a series of easy steps. First, the original ligature used for induction of PD was removed, and a fresh ligature for sampling GCF was placed in the gingival crevice for 10 min. Immediately afterwards, the volume of GCF collected in the sampling ligature was measured using a high-precision weighing balance. The sampling ligature containing GCF was then immersed in a solution of PBS-Tween 20 and subjected to ELISA. This enabled us to monitor the volume of GCF and detect time-dependent changes in the expression of such cytokines as IL-1β, TNF-α, IL-6, RANKL, and OPG associated with the levels of alveolar bone loss, as reflected in GCF collected from a mouse model of PD. Therefore, this novel GCF sampling method can be used to measure various cytokines in GCF relative to the dynamic changes in periodontal bone loss induced in a mouse model of PD. Copyright © 2016. Published by Elsevier B.V.

  9. A combined method for correlative 3D imaging of biological samples from macro to nano scale (United States)

    Kellner, Manuela; Heidrich, Marko; Lorbeer, Raoul-Amadeus; Antonopoulos, Georgios C.; Knudsen, Lars; Wrede, Christoph; Izykowski, Nicole; Grothausmann, Roman; Jonigk, Danny; Ochs, Matthias; Ripken, Tammo; Kühnel, Mark P.; Meyer, Heiko


    Correlative analysis requires examination of a specimen from macro to nano scale as well as applicability of analytical methods ranging from morphological to molecular. Accomplishing this with one and the same sample is laborious at best, due to deformation and biodegradation during measurements or intermediary preparation steps. Furthermore, data alignment using differing imaging techniques turns out to be a complex task, which considerably complicates the interconnection of results. We present correlative imaging of the accessory rat lung lobe by combining a modified Scanning Laser Optical Tomography (SLOT) setup with a specially developed sample preparation method (CRISTAL). CRISTAL is a resin-based embedding method that optically clears the specimen while allowing sectioning and preventing degradation. We applied and correlated SLOT with Multi Photon Microscopy, histological and immunofluorescence analysis as well as Transmission Electron Microscopy, all in the same sample. Thus, combining CRISTAL with SLOT enables the correlative utilization of a vast variety of imaging techniques.

  10. Method of analyzing multiple sample simultaneously by detecting absorption and systems for use in such a method (United States)

    Yeung, Edward S.; Gong, Xiaoyi


    The present invention provides a method of analyzing multiple samples simultaneously by absorption detection. The method comprises: (i) providing a planar array of multiple containers, each of which contains a sample comprising at least one absorbing species, (ii) irradiating the planar array of multiple containers with a light source and (iii) detecting absorption of light with a detection means that is in line with the light source at a distance of at least about 10 times a cross-sectional distance of a container in the planar array of multiple containers. The absorption of light by a sample indicates the presence of an absorbing species in it. The method can further comprise: (iv) measuring the amount of absorption of light detected in (iii), indicating the amount of the absorbing species in the sample. Also provided by the present invention is a system for use in the above method. The system comprises: (i) a light source comprising or consisting essentially of at least one wavelength of light, the absorption of which is to be detected, (ii) a planar array of multiple containers, and (iii) a detection means that is in line with the light source and is positioned in line with and parallel to the planar array of multiple containers at a distance of at least about 10 times a cross-sectional distance of a container.
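
The quantity underlying this kind of in-line absorption detection is the Beer-Lambert absorbance. The sketch below is a generic illustration of that relationship, not the patent's apparatus; the function names and example coefficients are our own.

```python
import math

def absorbance(incident, transmitted):
    """Beer-Lambert absorbance A = log10(I0 / I) from incident and
    transmitted light intensities."""
    if incident <= 0 or transmitted <= 0:
        raise ValueError("intensities must be positive")
    return math.log10(incident / transmitted)

def concentration(incident, transmitted, molar_absorptivity, path_length_cm):
    """Concentration of the absorbing species from A = epsilon * c * l
    (epsilon and l are illustrative, user-supplied values)."""
    return absorbance(incident, transmitted) / (molar_absorptivity * path_length_cm)
```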

  11. Geostatistical Sampling Methods for Efficient Uncertainty Analysis in Flow and Transport Problems (United States)

    Liodakis, Stylianos; Kyriakidis, Phaedon; Gaganis, Petros


    In hydrogeological applications involving flow and transport in heterogeneous porous media, the spatial distribution of hydraulic conductivity is often parameterized in terms of a lognormal random field based on a histogram and variogram model inferred from data and/or synthesized from relevant knowledge. Realizations of simulated conductivity fields are then generated using geostatistical simulation involving simple random (SR) sampling and are subsequently used as inputs to physically-based simulators of flow and transport in a Monte Carlo framework for evaluating the uncertainty in the spatial distribution of solute concentration due to the uncertainty in the spatial distribution of hydraulic conductivity [1]. Realistic uncertainty analysis, however, calls for a large number of simulated concentration fields and hence can become expensive in terms of both time and computer resources. A more efficient alternative to SR sampling is Latin hypercube (LH) sampling, a special case of stratified random sampling, which yields a more representative distribution of simulated attribute values with fewer realizations [2]. Here, the term representative implies realizations spanning efficiently the range of possible conductivity values corresponding to the lognormal random field. In this work we investigate the efficiency of alternative methods to classical LH sampling within the context of simulation of flow and transport in a heterogeneous porous medium. More precisely, we consider the stratified likelihood (SL) sampling method of [3], in which attribute realizations are generated using the polar simulation method by exploring the geometrical properties of the multivariate Gaussian distribution function.
In addition, we propose a more efficient version of the above method, here termed minimum energy (ME) sampling, whereby a set of N representative conductivity realizations at M locations is constructed by: (i) generating a representative set of N points distributed on the
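
The LH-versus-SR contrast in the abstract can be sketched with a minimal one-dimensional Latin hypercube sampler for lognormal conductivity values. This is an illustration of classical LH sampling under our own default parameters, not the SL or ME schemes the study investigates.

```python
import math
import random
from statistics import NormalDist

def latin_hypercube_lognormal(n, mu=0.0, sigma=1.0, rng=None):
    """Draw n lognormal conductivity values by Latin hypercube sampling.

    The unit interval is split into n equal-probability strata, one
    uniform draw is taken inside each stratum, and each draw is mapped
    through the lognormal inverse CDF, so every stratum of the target
    distribution is represented exactly once (unlike simple random
    sampling, which may leave gaps).
    """
    rng = rng or random.Random()
    normal = NormalDist(mu, sigma)
    probs = [(k + rng.random()) / n for k in range(n)]
    rng.shuffle(probs)  # decorrelate stratum order across realizations
    return [math.exp(normal.inv_cdf(p)) for p in probs]
```

Because each stratum is hit exactly once, the sorted sample tracks the target quantiles closely even for modest n.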

  12. Surface Sampling Collection and Culture Methods for Escherichia coli in Household Environments with High Fecal Contamination (United States)

    Kosek, Margaret N.; Schwab, Kellogg J.


    Empiric quantification of environmental fecal contamination is an important step toward understanding the impact that water, sanitation, and hygiene interventions have on reducing enteric infections. There is a need to standardize the methods used for surface sampling in field studies that examine fecal contamination in low-income settings. The dry cloth method presented in this manuscript improves upon the more commonly used swabbing technique that has been shown in the literature to have a low sampling efficiency. The recovery efficiency of a dry electrostatic cloth sampling method was evaluated using Escherichia coli and then applied to household surfaces in Iquitos, Peru, where there is high fecal contamination and enteric infection. Side-by-side measurements were taken from various floor locations within a household at the same time over a three-month period to compare for consistency of quantification of E. coli bacteria. The dry cloth sampling method in the laboratory setting showed 105% (95% Confidence Interval: 98%, 113%) E. coli recovery efficiency off of the cloths. The field application demonstrated strong agreement of side-by-side results (Pearson correlation coefficient for dirt surfaces was 0.83 (p < 0.0001) and 0.91 (p < 0.0001) for cement surfaces) and moderate agreement for results between entrance and kitchen samples (Pearson (0.53, p < 0.0001) and weighted Kappa statistic (0.54, p < 0.0001)). Our findings suggest that this method can be utilized in households with high bacterial loads using either continuous (quantitative) or categorical (semi-quantitative) data. The standardization of this low-cost, dry electrostatic cloth sampling method can be used to measure differences between households in intervention and non-intervention arms of randomized trials. PMID:28829392
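
The side-by-side agreement reported above rests on the Pearson correlation coefficient; a minimal implementation is sketched below. The data passed to it here are invented, not the study's measurements.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(x)
    if n != len(y) or n < 2:
        raise ValueError("need two equal-length samples of size >= 2")
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)
```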

  13. Parasitological stool sample exam by spontaneous sedimentation method using conical tubes: effectiveness, practice, and biosafety

    Directory of Open Access Journals (Sweden)

    Steveen Rios Ribeiro


    Full Text Available INTRODUCTION: Spontaneous sedimentation is an important procedure for stool examination. A modification of this technique using conical tubes was performed and evaluated. METHODS: Fifty fecal samples were processed in sedimentation glass and in polypropylene conical tubes. Another 50 samples were used for quantitative evaluation of protozoan cysts. RESULTS: Although no significant differences occurred in the frequency of protozoa and helminths detected, significant differences in protozoan cyst counts did occur. CONCLUSIONS: The use of tube predicts a shorter path in the sedimentation of the sample, increases concentration of parasites for microscopy analysis, minimizes the risks of contamination, reduces the odor, and optimizes the workspace.

  14. Preparation method matters: Aiming at higher NPP diversity and representativeness in sediment samples

    DEFF Research Database (Denmark)

    Enevold, Renée; Odgaard, Bent Vad


    of palynology in archaeological and forensic sciences. NPPs in anthropogenic soils and archaeological samples may be numerous in types as well as in abundance. However, preparing these soil samples with methods based on acid digestion potentially biases NPP assemblages because of differential damage or even...... dissolution of microfossils. In spite of this potential bias standard preparation procedures for pollen analysis have, in most cases without modification, generally been applied to palynological samples used for NPP analysis. We review briefly the advantages of high diversity NPP-analysis and preparation...

  15. Method and apparatus for measuring the gas permeability of a solid sample (United States)

    Carstens, D.H.W.


    The disclosure is directed to an apparatus and method for measuring the permeability of a gas in a sample. The gas is allowed to reach a steady flow rate through the sample. A measurable amount of the gas is collected during a given time period and then delivered to a sensitive quadrupole. The quadrupole signal, adjusted for background, is proportional to the amount of gas collected during the time period. The quadrupole can be calibrated with a standard helium leak. The gas can be deuterium and the sample can be polyvinyl alcohol.
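
The calibration-and-flux arithmetic implied by the abstract can be sketched as follows. The symbols and the steady-state relation Q = K·A·Δp/d are a standard textbook form, assumed here; the patent itself does not give these formulas.

```python
def permeation_rate(signal, background, calibration_rate, calibration_signal):
    """Gas flow rate from a quadrupole signal, using a standard leak.

    The background-corrected signal is assumed proportional to the flow
    rate, with the proportionality fixed by the calibrated leak.
    """
    return (signal - background) * calibration_rate / calibration_signal

def permeability(flow_rate, thickness, area, pressure_difference):
    """Steady-state permeability K from Q = K * A * dP / d."""
    return flow_rate * thickness / (area * pressure_difference)
```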

  16. Comparison of microRNA expression using different preservation methods of matched psoriatic skin samples

    DEFF Research Database (Denmark)

    Løvendorf, Marianne B; Zibert, John R; Hagedorn, Peter H


    -frozen (FS) and Tissue-Tek-embedding (OCT). We found a strong correlation of the microRNA expression levels between all preservation methods of matched psoriatic skin samples (r(s) ranging from 0.91 to 0.95 (P ... contains high levels of RNases. As microRNAs are 19-23 nucleotides long and lack a poly-A tail, they may be less prone to RNA degradation than mRNAs. We investigated whether microRNAs in psoriatic (FFPE) samples reliably reflect microRNA expression in samples less prone to RNA degradation such as fresh...

  17. Flagging versus dragging as sampling methods for nymphal Ixodes scapularis (Acari: Ixodidae). (United States)

    Rulison, Eric L; Kuczaj, Isis; Pang, Genevieve; Hickling, Graham J; Tsao, Jean I; Ginsberg, Howard S


    The nymphal stage of the blacklegged tick, Ixodes scapularis (Acari: Ixodidae), is responsible for most transmission of Borrelia burgdorferi, the etiologic agent of Lyme disease, to humans in North America. From 2010 to fall of 2012, we compared two commonly used techniques, flagging and dragging, as sampling methods for nymphal I. scapularis at three sites, each with multiple sampling arrays (grids), in the eastern and central United States. Flagging and dragging collected comparable numbers of nymphs, with no consistent differences between methods. Dragging collected more nymphs than flagging in some samples, but these differences were not consistent among sites or sampling years. The ratio of nymphs collected by flagging vs dragging was not significantly related to shrub density, so habitat type did not have a strong effect on the relative efficacy of these methods. Therefore, although dragging collected more ticks in a few cases, the numbers collected by each method were so variable that neither technique had a clear advantage for sampling nymphal I. scapularis. © 2013 The Society for Vector Ecology.

  18. Determining optimal sample sizes for multi-stage randomized clinical trials using value of information methods. (United States)

    Willan, Andrew; Kowgier, Matthew


    Traditional sample size calculations for randomized clinical trials depend on somewhat arbitrarily chosen factors, such as Type I and II errors. An effectiveness trial (otherwise known as a pragmatic trial or management trial) is essentially an effort to inform decision-making, i.e., should treatment be adopted over standard? Taking a societal perspective and using Bayesian decision theory, Willan and Pinto (Stat. Med. 2005; 24:1791-1806 and Stat. Med. 2006; 25:720) show how to determine the sample size that maximizes the expected net gain, i.e., the difference between the value of the information gained from the results and the cost of doing the trial. These methods are extended to include multi-stage adaptive designs, with a solution given for a two-stage design. The methods are applied to two examples. As demonstrated by the two examples, substantial increases in the expected net gain (ENG) can be realized by using multi-stage adaptive designs based on expected value of information methods. In addition, the expected sample size and total cost may be reduced. Exact solutions have been provided for the two-stage design. Solutions for higher-order designs may prove to be prohibitively complex and approximate solutions may be required. The use of multi-stage adaptive designs for randomized clinical trials based on expected value of sample information methods leads to substantial gains in the ENG and reductions in the expected sample size and total cost.
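
The expected-net-gain idea can be illustrated with a deliberately crude single-stage toy: value of information modelled as societal value times the probability the trial detects a true effect, minus trial costs. All monetary figures, the utility model, and the one-sided 2.5% test are our invented assumptions, not the authors' Bayesian framework.

```python
import math
from statistics import NormalDist

def expected_net_gain(n, effect=0.2, sd=1.0, value_per_unit=1e6,
                      cost_fixed=5e4, cost_per_patient=500.0):
    """Toy expected net gain for a two-arm trial with n patients per arm.

    Value of information = societal value of adopting the better
    treatment times the power of a one-sided test at 2.5%; costs are a
    fixed overhead plus a per-patient cost. All numbers are made up.
    """
    se = sd * math.sqrt(2.0 / n)
    power = NormalDist().cdf(effect / se - 1.959963984540054)
    return value_per_unit * power - cost_fixed - 2 * n * cost_per_patient

def optimal_sample_size(candidates, **kwargs):
    """Per-arm n maximizing the toy ENG over a candidate grid."""
    return max(candidates, key=lambda n: expected_net_gain(n, **kwargs))
```

Because power saturates while cost grows linearly, ENG rises, peaks, and then falls; the optimal n sits at that peak rather than at a conventional power target.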

  19. Method of separate determination of high-ohmic sample resistance and contact resistance

    Directory of Open Access Journals (Sweden)

    Vadim A. Golubiatnikov


    Full Text Available A method of separate determination of two-pole sample volume resistance and contact resistance is suggested. The method is applicable to high-ohmic semiconductor samples: semi-insulating gallium arsenide, detector cadmium-zinc telluride (CZT, etc. The method is based on near-contact region illumination by monochromatic radiation of variable intensity from light emitting diodes with quantum energies exceeding the band gap of the material. It is necessary to obtain sample photo-current dependence upon light emitting diode current and to find the linear portion of this dependence. Extrapolation of this linear portion to the Y-axis gives the cut-off current. As the bias voltage is known, it is easy to calculate sample volume resistance. Then, using dark current value, one can determine the total contact resistance. The method was tested for n-type semi-insulating GaAs. The contact resistance value was shown to be approximately equal to the sample volume resistance. Thus, the influence of contacts must be taken into account when electrophysical data are analyzed.
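
The extrapolation procedure described above reduces to a line fit and two divisions; a sketch follows. The fitting routine and the synthetic numbers in the test are our illustrative assumptions, not the authors' measurement data.

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept of y against x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

def separate_resistances(bias_v, dark_current, led_currents, photo_currents):
    """Split a two-pole dark resistance into volume and contact parts.

    The linear portion of photocurrent vs. LED drive current is
    extrapolated to zero illumination; the intercept (cut-off current)
    with the known bias gives the volume resistance, the dark current
    gives the total resistance, and the contact resistance is the
    difference.
    """
    _, cutoff_current = linear_fit(led_currents, photo_currents)
    r_volume = bias_v / cutoff_current
    return r_volume, bias_v / dark_current - r_volume
```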

  20. Flagging versus dragging as sampling methods for nymphal Ixodes scapularis (Acari: Ixodidae) (United States)

    Rulison, Eric L.; Kuczaj, Isis; Pang, Genevieve; Hickling, Graham J.; Tsao, Jean I.; Ginsberg, Howard S.


    The nymphal stage of the blacklegged tick, Ixodes scapularis (Acari: Ixodidae), is responsible for most transmission of Borrelia burgdorferi, the etiologic agent of Lyme disease, to humans in North America. From 2010 to fall of 2012, we compared two commonly used techniques, flagging and dragging, as sampling methods for nymphal I. scapularis at three sites, each with multiple sampling arrays (grids), in the eastern and central United States. Flagging and dragging collected comparable numbers of nymphs, with no consistent differences between methods. Dragging collected more nymphs than flagging in some samples, but these differences were not consistent among sites or sampling years. The ratio of nymphs collected by flagging vs dragging was not significantly related to shrub density, so habitat type did not have a strong effect on the relative efficacy of these methods. Therefore, although dragging collected more ticks in a few cases, the numbers collected by each method were so variable that neither technique had a clear advantage for sampling nymphal I. scapularis.

  1. Nonuniform sampling and non-Fourier signal processing methods in multidimensional NMR. (United States)

    Mobli, Mehdi; Hoch, Jeffrey C


    Beginning with the introduction of Fourier Transform NMR by Ernst and Anderson in 1966, time domain measurement of the impulse response (the free induction decay, FID) consisted of sampling the signal at a series of discrete intervals. For compatibility with the discrete Fourier transform (DFT), the intervals are kept uniform, and the Nyquist theorem dictates the largest value of the interval sufficient to avoid aliasing. With the proposal by Jeener of parametric sampling along an indirect time dimension, extension to multidimensional experiments employed the same sampling techniques used in one dimension, similarly subject to the Nyquist condition and suitable for processing via the discrete Fourier transform. The challenges of obtaining high-resolution spectral estimates from short data records using the DFT were already well understood, however. Despite techniques such as linear prediction extrapolation, the achievable resolution in the indirect dimensions is limited by practical constraints on measuring time. The advent of non-Fourier methods of spectrum analysis capable of processing nonuniformly sampled data has led to an explosion in the development of novel sampling strategies that avoid the limits on resolution and measurement time imposed by uniform sampling. The first part of this review discusses the many approaches to data sampling in multidimensional NMR, the second part highlights commonly used methods for signal processing of such data, and the review concludes with a discussion of other approaches to speeding up data acquisition in NMR. Copyright © 2014 Elsevier B.V. All rights reserved.
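
One of the sampling strategies the review surveys can be sketched concretely: choosing a nonuniform subset of indirect-dimension increments with exponentially decaying weights that mirror signal decay. The weighting scheme, parameter names, and decay constant here are one common choice assumed for illustration, not a method endorsed by the review.

```python
import math
import random

def nus_schedule(total_increments, keep, decay=2.0, rng=None):
    """Pick `keep` of `total_increments` indirect-dimension increments.

    Early increments, where the signal is strongest before relaxation,
    are favoured by exponentially decaying selection weights; increment
    0 is always measured so the first FID is acquired. Returns a sorted
    list of increment indices.
    """
    rng = rng or random.Random()
    candidates = list(range(1, total_increments))
    weights = [math.exp(-decay * i / total_increments) for i in candidates]
    chosen = {0}
    while len(chosen) < keep:
        chosen.add(rng.choices(candidates, weights=weights)[0])
    return sorted(chosen)
```

The resulting schedule is then handed to a non-Fourier reconstruction method, since the DFT requires uniform spacing.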

  2. Sampling method and location affect recovery of coliforms and Escherichia coli from broiler carcasses. (United States)

    Smith, D P


    Two experiments were conducted, the first to determine whether numbers of recovered bacteria differed due to sampling method used or due to location on carcass sampled (breast or leg quarters) and the second to determine if numbers of bacteria differed between the front (ventral) and back (dorsal) side of the carcass. In both experiments, eviscerated broiler carcasses were obtained from a commercial processing plant just before the final inside-outside bird washer. In experiment 1, carcasses (3 in each of 4 replicate trials) were separated into leg quarters and breast quarters (n = 48) and either rinsed or ground and stomached for microbiological sampling. In experiment 2, for 3 replicate trials of 4 carcasses each, necks, wings, and legs were manually removed; the remaining trunks were cut through the sides to produce front (ventral) and back (dorsal) halves (n = 24); and then rinsed. For both experiments, coliforms and Escherichia coli were enumerated. In experiment 1, significantly higher numbers of coliforms and E. coli were recovered by rinsing than by grinding from both breast and leg quarters. Leg quarters were found to have higher bacterial numbers than breasts from grind samples, but no quarter differences were found for rinse samples. In experiment 2, higher numbers of coliforms and E. coli were recovered from the dorsal carcass half compared with the ventral half. Bacterial counts of broiler carcasses are affected by both the sampling method used and by carcass location sampled.

  3. Final Report for X-ray Diffraction Sample Preparation Method Development

    Energy Technology Data Exchange (ETDEWEB)

    Ely, T. M.; Meznarich, H. K.; Valero, T.


    WRPS-1500790, “X-ray Diffraction Saltcake Sample Preparation Method Development Plan/Procedure,” was originally prepared with the intent of improving the specimen preparation methodology used to generate saltcake specimens suitable for XRD-based solid phase characterization. At the time that this test plan document was originally developed, packed powder in cavity supports with collodion binder was the established XRD specimen preparation method. An alternate specimen preparation method, less vulnerable, if not completely invulnerable, to preferred orientation effects, was desired as a replacement for that method.

  4. Statistical methods for detecting differentially abundant features in clinical metagenomic samples.

    Directory of Open Access Journals (Sweden)

    James Robert White


    Full Text Available Numerous studies are currently underway to characterize the microbial communities inhabiting our world. These studies aim to dramatically expand our understanding of the microbial biosphere and, more importantly, hope to reveal the secrets of the complex symbiotic relationship between us and our commensal bacterial microflora. An important prerequisite for such discoveries is computational tools that are able to rapidly and accurately compare large datasets generated from complex bacterial communities to identify features that distinguish them. We present a statistical method for comparing clinical metagenomic samples from two treatment populations on the basis of count data (e.g., as obtained through sequencing) to detect differentially abundant features. Our method, Metastats, employs the false discovery rate to improve specificity in high-complexity environments, and separately handles sparsely-sampled features using Fisher's exact test. Under a variety of simulations, we show that Metastats performs well compared to previously used methods, and significantly outperforms other methods for features with sparse counts. We demonstrate the utility of our method on several datasets including a 16S rRNA survey of obese and lean human gut microbiomes, COG functional profiles of infant and mature gut microbiomes, and bacterial and viral metabolic subsystem data inferred from random sequencing of 85 metagenomes. The application of our method to the obesity dataset reveals differences between obese and lean subjects not reported in the original study. For the COG and subsystem datasets, we provide the first statistically rigorous assessment of the differences between these populations. The methods described in this paper are the first to address clinical metagenomic datasets comprising samples from multiple subjects. Our methods are robust across datasets of varied complexity and sampling level. 
While designed for metagenomic applications, our software
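
For the sparse-count case the abstract mentions, a two-sided Fisher's exact test on a 2x2 table can be written directly from the hypergeometric distribution. This is a generic textbook implementation, not code from Metastats.

```python
from math import comb

def fisher_exact(a, b, c, d):
    """Two-sided Fisher's exact test p-value for the 2x2 table
    [[a, b], [c, d]] (e.g. a feature's counts vs. remaining counts in
    two sample groups), summing all tables as or less probable than the
    observed one under the hypergeometric null.
    """
    row1, row2, col1, n = a + b, c + d, a + c, a + b + c + d

    def p_table(x):  # probability of the table with top-left cell = x
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-12))
```

The small tolerance factor guards against floating-point ties when deciding which tables are "as extreme" as the observed one.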

  5. A Bayesian adaptive blinded sample size adjustment method for risk differences. (United States)

    Hartley, Andrew Montgomery


    Adaptive sample size adjustment (SSA) for clinical trials consists of examining early subsets of on-trial data to adjust estimates of sample size requirements. Blinded SSA is often preferred over unblinded SSA because it obviates many logistical complications of the latter and generally introduces less bias. On the other hand, current blinded SSA methods for binary data offer little to no new information about the treatment effect, ignore uncertainties associated with the population treatment proportions, and/or depend on enhanced randomization schemes that risk partial unblinding. I propose an innovative blinded SSA method for use when the primary analysis is a non-inferiority or superiority test regarding a risk difference. The method incorporates evidence about the treatment effect via the likelihood function of a mixture distribution. I compare the new method with an established one and with the fixed sample size study design, in terms of maximization of an expected utility function. The new method maximizes the expected utility better than do the comparators, under a range of assumptions. I illustrate the use of the proposed method with an example that incorporates a Bayesian hierarchical model. Lastly, I suggest topics for future study regarding the proposed methods. Copyright © 2015 John Wiley & Sons, Ltd.
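
To give a feel for blinded SSA, here is the classical internal-pilot approach for a risk difference: only the pooled (blinded) event proportion is used to recompute the sample size, so the treatment effect stays masked. This is deliberately the simpler established comparator, not the Bayesian mixture method the abstract proposes.

```python
import math

def blinded_reestimate_n(blinded_events, blinded_total, delta,
                         alpha_z=1.959963984540054,   # two-sided 5%
                         power_z=0.8416212335729143): # 80% power
    """Blinded sample-size re-estimation for a risk difference.

    Assumes both arms share the pooled blinded event proportion, plugs
    it into the standard two-proportion variance, and returns the
    per-arm n needed to detect a risk difference of `delta`.
    """
    p = blinded_events / blinded_total
    variance = 2 * p * (1 - p)
    return math.ceil(variance * (alpha_z + power_z) ** 2 / delta ** 2)
```

Because only the pooled proportion enters, an interim look with this rule does not reveal which arm is doing better.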

  6. Analysis of training sample selection strategies for regression-based quantitative landslide susceptibility mapping methods (United States)

    Erener, Arzu; Sivas, A. Abdullah; Selcuk-Kestel, A. Sevtap; Düzgün, H. Sebnem


    All of the quantitative landslide susceptibility mapping (QLSM) methods require two basic data types, namely, a landslide inventory and the factors that influence landslide occurrence (landslide influencing factors, LIF). Depending on the type of landslide, the nature of the triggers, and the LIF, the accuracy of the QLSM methods differs. Moreover, how to balance the number of 0 (nonoccurrence) and 1 (occurrence) in the training set obtained from the landslide inventory and how to select which of the 1's and 0's to include in QLSM models play a critical role in the accuracy of the QLSM. Although the performance of various QLSM methods is largely investigated in the literature, the challenge of training set construction is not adequately investigated for the QLSM methods. In order to tackle this challenge, in this study three different training set selection strategies along with the original data set are used for testing the performance of three different regression methods, namely Logistic Regression (LR), Bayesian Logistic Regression (BLR) and Fuzzy Logistic Regression (FLR). The first sampling strategy is proportional random sampling (PRS), which takes into account a weighted selection of landslide occurrences in the sample set. The second method, namely non-selective nearby sampling (NNS), includes randomly selected sites and their surrounding neighboring points at certain preselected distances to include the impact of clustering. Selective nearby sampling (SNS) is the third method, which concentrates on the group of 1's and their surrounding neighborhood. A randomly selected group of landslide sites and their neighborhood are considered in the analyses similar to NNS parameters. It is found that the LR-PRS, FLR-PRS and BLR-Whole Data set-ups, in that order, yield the best fits among the other alternatives. The results indicate that in QLSM based on regression models, avoidance of spatial correlation in the data set is critical for the model's performance.
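
The PRS strategy described above amounts to stratified sampling that preserves the 0/1 class balance of the inventory. The sketch below is our reading of that idea; the weighting details in the study may differ.

```python
import random

def proportional_random_sample(labels, size, rng=None):
    """Sample `size` indices while keeping the 0/1 class proportions.

    `labels` is a list of 0/1 landslide (non)occurrence flags; the
    returned index sample contains 1's and 0's in (rounded) proportion
    to their share of the full training set.
    """
    rng = rng or random.Random()
    ones = [i for i, v in enumerate(labels) if v == 1]
    zeros = [i for i, v in enumerate(labels) if v == 0]
    n_ones = round(size * len(ones) / len(labels))
    return sorted(rng.sample(ones, n_ones) + rng.sample(zeros, size - n_ones))
```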

  7. Quality Assurance in Asian Distance Education: Diverse Approaches and Common Culture

    Directory of Open Access Journals (Sweden)

    Insung Jung


    Full Text Available With the phenomenal expansion of distance education in Asia during the past three decades, there has been growing public demand for quality and accountability in distance education. This study investigates the national quality assurance systems for distance education at the higher education level in Asia, with the aim of contributing to a better understanding of the current level of development of quality assurance in Asian distance education and offering potential directions for policy makers when developing and elaborating quality assurance systems for distance education. The analysis of the existing quality assurance frameworks in the 11 countries/territories selected reveals that the level of integration of quality assurance policy for distance education into the overall national quality assurance policy framework for higher education varies considerably. The purposes of quality assurance, and the policy frameworks, methods, and instruments in place, are generally tailored to each country’s particular circumstances. There are, however, obvious commonalities that underpin these different quality assurance efforts.

  8. Evaluation of a gas chromatography method for azelaic acid determination in selected biological samples (United States)

    Garelnabi, Mahdi; Litvinov, Dmitry; Parthasarathy, Sampath


    Background: Azelaic acid (AzA) is the best known dicarboxylic acid to have pharmaceutical benefits and clinical applications and to be associated with the pathophysiology of some diseases. Materials and Methods: We extracted and methyl-esterified AzA and determined its concentration in human plasma obtained from healthy individuals and also in mice fed an AzA-containing diet for three months. Results: AzA was detected by gas chromatography (GC) and confirmed by liquid chromatography-mass spectrometry (LC-MS) and gas chromatography-mass spectrometry (GC-MS). Our results show that AzA can be determined efficiently in selected biological samples by the GC method, with a 1 nM limit of detection (LoD); the limit of quantification (LoQ) was established at 50 nM. Analytical sensitivity, as assayed in hexane, was 0.050 nM. The method demonstrated 8-10% CV batch repeatability across the sample types and 13-18.9% CV in the within-lab precision analysis. The method showed that AzA can be efficiently recovered from various sample preparations, including liver tissue homogenate (95%) and human plasma (97%). Conclusions: Because of its simplicity and low limit of quantification, the present method provides a useful tool for determining AzA in various biological sample preparations. PMID:22558586

  9. Real-Time PCR Method for Detection of Salmonella spp. in Environmental Samples. (United States)

    Kasturi, Kuppuswamy N; Drgon, Tomas


    The methods currently used for detecting Salmonella in environmental samples require 2 days to produce results and have limited sensitivity. Here, we describe the development and validation of a real-time PCR Salmonella screening method that produces results in 18 to 24 h. Primers and probes specific to the gene invA, group D, and Salmonella enterica serovar Enteritidis organisms were designed and evaluated for inclusivity and exclusivity using a panel of 329 Salmonella isolates representing 126 serovars and 22 non-Salmonella organisms. The invA- and group D-specific sets identified all the isolates accurately. The PCR method had 100% inclusivity and detected 1 to 2 copies of Salmonella DNA per reaction. Primers specific for Salmonella-differentiating fragment 1 (Sdf-1) in conjunction with the group D set had 100% inclusivity for 32 S. Enteritidis isolates and 100% exclusivity for the 297 non-Enteritidis Salmonella isolates. Single-laboratory validation performed on 1,741 environmental samples demonstrated that the PCR method detected 55% more positives than the Vitek immunodiagnostic assay system (VIDAS) method. The PCR results correlated well with the culture results, and the method did not report any false-negative results. The receiver operating characteristic (ROC) analysis documented excellent agreement between the results from the culture and PCR methods (area under the curve, 0.90; 95% confidence interval, 0.76 to 1.0), confirming the validity of the PCR method. IMPORTANCE: This validated PCR method detects 55% more positives for Salmonella in half the time required for the reference method, VIDAS. The validated PCR method will help to strengthen public health efforts through rapid screening of Salmonella spp. in environmental samples.
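
    The ROC agreement figure quoted above (area under the curve, 0.90) can be computed for any scored data set via the Mann-Whitney formulation of the AUC. A minimal sketch with invented labels and scores (the study's actual data are not reproduced here):

```python
def roc_auc(labels, scores):
    """ROC AUC via the Mann-Whitney U statistic: the probability that a
    randomly chosen positive scores higher than a randomly chosen
    negative (ties count half)."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# toy example: culture result as ground truth, PCR signal as score
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.5, 0.2, 0.1]
print(roc_auc(labels, scores))  # 0.888...
```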

  10. Trace element analysis of humus-rich natural water samples: method development for UV-LED assisted photocatalytic sample preparation and hydride generation ICP-MS analysis


    Havia, J. (Johanna)


    Abstract Humus-rich natural water samples, containing high concentrations of dissolved organic carbon (DOC), are challenging for certain analytical methods used in trace element analysis, including hydride generation methods and electrochemical methods. In order to obtain reliable results, the samples must be pretreated to release analytes from humic acid complexes prior to the determination. In this study, methods for both the pretreatment and analysis steps were developed. Arsenic is ...

  11. Comparison between active (pumped) and passive (diffusive) sampling methods for formaldehyde in pathology and histology laboratories. (United States)

    Lee, Eun Gyung; Magrm, Rana; Kusti, Mohannad; Kashon, Michael L; Guffey, Steven; Costas, Michelle M; Boykin, Carie J; Harper, Martin


    The aim of this study was to determine occupational exposures to formaldehyde and to compare concentrations of formaldehyde obtained by active and passive sampling methods. In one pathology laboratory and one histology laboratory, exposure measurements were collected with sets of active air samplers (Supelco LpDNPH tubes) and passive badges (ChemDisk Aldehyde Monitor 571). Sixty-six sample pairs (49 personal and 17 area) were collected and analyzed by NIOSH NMAM 2016 for the active samples and OSHA Method 1007 (using the manufacturer's updated uptake rate) for the passive samples. All active and passive 8-hr time-weighted average (TWA) measurements showed compliance with the OSHA permissible exposure limit (PEL, 0.75 ppm) except for one passive measurement, whereas 78% of the active and 88% of the passive samples exceeded the NIOSH recommended exposure limit (REL, 0.016 ppm). Overall, 73% of the passive samples showed higher concentrations than the active samples, and a statistical test indicated disagreement between the two methods, both for all data and for data without outliers. The OSHA Method cautions that passive samplers should not be used for sampling situations involving formalin solutions because of low concentration estimates in the presence of reaction products of formaldehyde and methanol (a formalin additive). However, this situation was not observed, perhaps because the formalin solutions used in these laboratories included much less methanol (3%) than those tested in the OSHA Method (up to 15%). The passive samplers in general overestimated concentrations compared to the active method, which is prudent for demonstrating compliance with an occupational exposure limit, but occasional large differences may be a result of collecting aerosolized droplets or splashes on the face of the samplers. In the situations examined in this study the passive sampler generally produces higher results than the active sampler, so that a body of results from passive samplers demonstrating compliance with the
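
    The active-versus-passive comparison above rests on paired measurements. A minimal sketch of the kind of paired summary involved (mean difference and fraction of pairs where the passive result is higher), using hypothetical 8-hr TWA values rather than the study's data:

```python
def paired_summary(active, passive):
    """Summarize paired measurements: mean difference (passive - active)
    and the fraction of pairs where the passive result is higher."""
    diffs = [p - a for a, p in zip(active, passive)]
    mean_diff = sum(diffs) / len(diffs)
    frac_passive_higher = sum(d > 0 for d in diffs) / len(diffs)
    return mean_diff, frac_passive_higher

# hypothetical 8-hr TWA pairs in ppm (illustrative, not study data)
active  = [0.020, 0.035, 0.050, 0.015, 0.060]
passive = [0.025, 0.040, 0.048, 0.020, 0.070]
mean_diff, frac = paired_summary(active, passive)
print(round(mean_diff, 4), frac)  # 0.0046 0.8
```

    A formal disagreement test (e.g., a Wilcoxon signed-rank test on the differences) would be built on the same paired differences.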

  12. A sampling method for estimating the accuracy of predicted breeding values in genetic evaluation

    Directory of Open Access Journals (Sweden)

    Laloë Denis


    Full Text Available A sampling-based method for estimating the accuracy of estimated breeding values using an animal model is presented. Empirical variances of true and estimated breeding values were estimated from a simulated n-sample. The method was validated using a small data set from the Parthenaise breed, with the estimated coefficient of determination converging to the true values. It was applied to the French Salers data file used for the 2000 on-farm evaluation (IBOVAL) of muscle development score. A drawback of the method is its computational demand; consequently, convergence cannot be achieved in a reasonable time for very large data files. Two advantages of the method are that (a) it is applicable to any model (animal, sire, multivariate, maternal effects...) and (b) it supplies off-diagonal coefficients of the inverse of the mixed model equations and can therefore be the basis of connectedness studies.
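
    The sampling-based idea can be illustrated with a stripped-down Monte Carlo sketch: simulate true values, generate estimates with a known accuracy, and recover that accuracy as the empirical correlation. This toy version omits the animal model and mixed model equations entirely; the Gaussian setup and the target accuracy of 0.8 are assumptions for the example:

```python
import math
import random

def empirical_accuracy(n, h, seed=1):
    """Monte Carlo sketch: simulate true values and noisy estimates with a
    known correlation h, then recover h as the empirical correlation."""
    rng = random.Random(seed)
    true = [rng.gauss(0, 1) for _ in range(n)]
    est = [h * t + math.sqrt(1 - h * h) * rng.gauss(0, 1) for t in true]
    mt, me = sum(true) / n, sum(est) / n
    cov = sum((t - mt) * (e - me) for t, e in zip(true, est)) / n
    vt = sum((t - mt) ** 2 for t in true) / n
    ve = sum((e - me) ** 2 for e in est) / n
    return cov / math.sqrt(vt * ve)

acc = empirical_accuracy(20000, 0.8)
print(acc)  # empirical correlation, close to the simulated accuracy of 0.8
```

    The paper's method applies the same principle, but the "estimates" come from solving the mixed model equations on each simulated sample, which is where the computational demand arises.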

  13. Evaluation of various conventional methods for sampling weeds in potato and spinach crops

    Directory of Open Access Journals (Sweden)

    David Jamaica


    Full Text Available This study aimed to evaluate, at an exploratory level, some of the different conventional sampling designs in a section of a potato crop and in a commercial crop of spinach. Weeds were sampled in a 16 x 48 m section of a potato crop with a set grid of 192 sections. The cover and density of the weeds were registered in squares ranging from 0.25 to 64 m². The results were used to create a database that allowed for the simulation of different sampling designs (variables and square sizes). A second sampling was carried out with these results in a spinach crop of 1.16 ha with a set grid of 6 x 6 m cells, evaluating the cover in 4 m² squares. Another database was created with this information, which was used to simulate other sampling designs, such as the distribution and quantity of sampling squares. According to the results obtained, a good approximation is 10-12 squares (4 m²) per hectare for richness and 18 or more squares per hectare for abundance. This square size is optimal since it allows for sampling a larger area without losing sight of low-profile species, with the cover variable best representing the abundance of the weeds.

  14. An Optimized Method for Quantification of Pathogenic Leptospira in Environmental Water Samples. (United States)

    Riediger, Irina N; Hoffmaster, Alex R; Casanovas-Massana, Arnau; Biondo, Alexander W; Ko, Albert I; Stoddard, Robyn A


    Leptospirosis is a zoonotic disease usually acquired by contact with water contaminated with urine of infected animals. However, few molecular methods have been used to monitor or quantify pathogenic Leptospira in environmental water samples. Here we optimized a DNA extraction method for the quantification of leptospires using a previously described TaqMan-based qPCR method targeting lipL32, a gene unique to and highly conserved in pathogenic Leptospira. QIAamp DNA mini, MO BIO PowerWater DNA and PowerSoil DNA Isolation kits were evaluated to extract DNA from sewage, pond, river and ultrapure water samples spiked with leptospires. The performance of each kit varied with sample type. Sample processing methods were further evaluated and optimized using the PowerSoil DNA kit, owing to its performance on turbid water samples and its reproducibility. Centrifugation speeds, water volumes and the use of Escherichia coli as a carrier were compared to improve DNA recovery. All matrices showed strong linearity over a range of concentrations from 10⁶ to 10⁰ leptospires/mL, with low lower limits of detection. The optimized protocol for quantifying Leptospira in environmental waters (river, pond and sewage) consists of concentrating 40 mL samples by centrifugation at 15,000×g for 20 minutes at 4°C, followed by DNA extraction with the PowerSoil DNA Isolation kit. Although the method described herein needs to be validated in environmental studies, it potentially provides the opportunity for effective, timely and sensitive assessment of environmental leptospiral burden.
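
    Quantification by qPCR over a dilution series such as the 10⁶-10⁰ range above relies on a linear standard curve of Cq against log10 concentration. A minimal sketch with hypothetical, perfectly linear Cq values (not the study's data); the efficiency formula 10^(-1/slope) - 1 is the usual qPCR convention:

```python
def fit_standard_curve(log10_conc, cq):
    """Least-squares fit of Cq against log10(concentration); a perfectly
    efficient qPCR assay has slope ~ -3.32 (efficiency ~ 100%)."""
    n = len(cq)
    mx, my = sum(log10_conc) / n, sum(cq) / n
    sxx = sum((x - mx) ** 2 for x in log10_conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_conc, cq))
    slope = sxy / sxx
    intercept = my - slope * mx
    efficiency = 10 ** (-1 / slope) - 1
    return slope, intercept, efficiency

# hypothetical dilution series, 10^6 down to 10^0 leptospires/mL
logs = [6, 5, 4, 3, 2, 1, 0]
cqs  = [15.0, 18.32, 21.64, 24.96, 28.28, 31.60, 34.92]
slope, intercept, eff = fit_standard_curve(logs, cqs)
print(round(slope, 2))  # -3.32 for this perfectly linear toy series
```

    An unknown sample's concentration is then read off the fitted line as 10^((Cq - intercept) / slope).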

  15. Materials and Methods for Streamlined Laboratory Analysis of Environmental Samples, FY 2016 Report

    Energy Technology Data Exchange (ETDEWEB)

    Addleman, Raymond S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Naes, Benjamin E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McNamara, Bruce K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Olsen, Khris B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Chouyyok, Wilaiwan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Willingham, David G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Spigner, Angel C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)


    The International Atomic Energy Agency (IAEA) relies upon laboratory analysis of environmental samples (typically referred to as “swipes”) collected during on-site inspections of safeguarded facilities to support the detection and deterrence of undeclared activities. Unfortunately, chemical processing and assay of the samples is slow and expensive. A rapid, effective, and simple extraction process and analysis method is needed to provide certified results with improved timeliness at reduced costs (principally in the form of reduced labor), while maintaining or improving sensitivity and efficacy. To address these safeguards needs, the Pacific Northwest National Laboratory (PNNL) explored and demonstrated improved methods for environmental sample (ES) analysis. Improvements for both bulk and particle analysis were explored. To facilitate continuity and adoption, the new sampling materials and processing methods were designed to be compatible with existing IAEA protocols for ES analysis. PNNL collaborated with Oak Ridge National Laboratory (ORNL), which performed independent validation of the new bulk analysis methods and compared their performance to the traditional protocol of the IAEA’s Network of Analytical Laboratories (NWAL). ORNL efforts are reported separately. This report describes PNNL’s FY 2016 progress, which was focused on analytical applications supporting environmental monitoring of uranium enrichment plants and nuclear fuel processing. In the future the technology could be applied to other safeguards applications and analytes related to fuel manufacturing, reprocessing, etc. PNNL’s FY 2016 efforts were broken into two tasks; a summary of progress, accomplishments and highlights is provided below. Principal progress and accomplishments on Task 1, Optimize Materials and Methods for ICP-MS Environmental Sample Analysis, are listed below. • Completed initial procedure for rapid uranium extraction from ES swipes based upon carbonate-peroxide chemistry (delivered to ORNL for

  16. NIST-Traceable NMR Method to Determine Quantitative Weight Percentage Purity of Mustard (HD) Feedstock Samples (United States)


    ECBC-TR-1506. NIST-Traceable NMR Method to Determine Quantitative Weight Percentage Purity of Mustard (HD) Feedstock Samples. David J. McGarvey (Research and Technology Directorate); William R. Creasy (Leidos, Inc., Abingdon, MD 21009-1261); Theresa R. Connell (Excet, Inc.). Reporting period: Jan 2012-May 2012.

  17. Detection of PRRSV in 218 field samples using six molecular methods: What we are looking for?

    DEFF Research Database (Denmark)

    Toplak, Ivan; Štukelj, Marina; Gracieux, Patrice


    Objectives: The purpose of this study was to determine the sensitivity and specificity of six molecular methods used for the detection of porcine reproductive and respiratory syndrome virus (PRRSV). Methods: 218 field samples (serum, tissues) were collected between 2009 and 2011 from 50 PRRSV p......-time) Continuously follow the genetic evolution of especially Type I PRRSV subtype viruses and regularly update their primer sequences.

  18. Determination of rare-earth elements in Luna 16 regolith sample by chemical spectral method (United States)

    Stroganova, N. S.; Ryabukhin, V. A.; Laktinova, N. V.; Ageyeva, L. V.; Galkina, I. P.; Gatinskaya, N. G.; Yermakov, A. N.; Karyakin, A. V.


    An analysis was made of regolith from layer A of the Luna 16 sample for rare earth elements, by a chemical spectral method. Chemical and ion exchange concentration steps were used to determine the content of 12 elements and Y at the level of 0.001 to 0.0001 percent, with 10 to 15 percent reproducibility of the emission determination. Within the limits of reproducibility, the results agree with data obtained by mass spectrometric, activation, and X-ray fluorescence methods.

  19. Final LDRD report : development of sample preparation methods for ChIPMA-based imaging mass spectrometry of tissue samples.

    Energy Technology Data Exchange (ETDEWEB)

    Maharrey, Sean P.; Highley, Aaron M.; Behrens, Richard, Jr.; Wiese-Smith, Deneille


    The objective of this short-term LDRD project was to acquire the tools needed to use our chemical imaging precision mass analyzer (ChIPMA) instrument to analyze tissue samples. This effort was an outgrowth of discussions with oncologists on the need to find the cellular origin of signals in mass spectra of serum samples, which provide biomarkers for ovarian cancer. The ultimate goal would be to collect chemical images of biopsy samples, allowing the chemical images of diseased and nondiseased sections of a sample to be compared. The equipment needed to prepare tissue samples has been acquired and built. This equipment includes a cryo-ultramicrotome for preparing thin sections of samples and a coating unit. The coating unit uses an electrospray system to deposit small droplets of a UV-photoabsorbing compound on the surface of the tissue samples. Both units are operational. The tissue sample must be coated with the organic compound to enable matrix-assisted laser desorption/ionization (MALDI) and matrix-enhanced secondary ion mass spectrometry (ME-SIMS) measurements with the ChIPMA instrument. Initial plans to test the sample preparation using human tissue samples required development of administrative procedures beyond the scope of this LDRD. Hence, it was decided to make two types of measurements: (1) testing the spatial resolution of ME-SIMS by preparing a substrate coated with a mixture of an organic matrix and a bio standard and etching a defined pattern in the coating using a liquid metal ion beam, and (2) preparing and imaging C. elegans worms. Difficulties arose in sectioning the C. elegans for analysis, and the funds and time to overcome these difficulties were not available in this project. The facilities are now available for preparing biological samples for analysis with the ChIPMA instrument. Some further investment of time and resources in sample preparation should make this a useful tool for chemical imaging applications.

  20. Evaluation of sampling and analytical methods for the determination of chlorodifluoromethane in air. (United States)

    Seymour, M J; Lucas, M F


    In January 1989, the Occupational Safety and Health Administration (OSHA) published revised permissible exposure limits (PELs) for 212 compounds and established PELs for 164 additional compounds. In cases where regulated compounds did not have specific sampling and analytical methods, methods were suggested by OSHA. The National Institute for Occupational Safety and Health (NIOSH) Manual of Analytical Methods (NMAM) Method 1020, which was developed for 1,1,2-trichloro-1,2,2-trifluoroethane, was suggested by OSHA for the determination of chlorodifluoromethane in workplace air. Because this method was developed for a liquid and chlorodifluoromethane is a gas, the ability of NMAM Method 1020 to adequately sample and quantitate chlorodifluoromethane was questioned and tested by researchers at NIOSH. The evaluation of NMAM Method 1020 for chlorodifluoromethane showed that the capacity of the 100/50-mg charcoal sorbent bed was limited, the standard preparation procedure was incorrect for a gas analyte, and the analyte had low solubility in carbon disulfide. NMAM Method 1018 for dichlorodifluoromethane uses two coconut-shell charcoal tubes in series, a 400/200-mg tube followed by a 100/50-mg tube, which are desorbed with methylene chloride. This method was evaluated for chlorodifluoromethane. Test atmospheres with chlorodifluoromethane concentrations from 0.5 to 2 times the PEL were generated. Modifications of NMAM Method 1018 included changes in the standard preparation procedure, and the gas chromatograph was equipped with a capillary column. These revisions to NMAM 1018 resulted in a 96.5% recovery and a total precision for the method of 7.1% for chlorodifluoromethane. No significant bias in the method was found. Results indicate that the revised NMAM Method 1018 is suitable for the determination of chlorodifluoromethane in workplace air.

  1. An improved method for the analysis of volatile polyfluorinated alkyl substances in environmental air samples

    Energy Technology Data Exchange (ETDEWEB)

    Jahnke, Annika; Ahrens, Lutz [Institute for Coastal Research, GKSS Research Centre, Department of Environmental Chemistry, Geesthacht (Germany); University of Lueneburg, Institute for Ecology and Environmental Chemistry, Faculty of Environmental Sciences, Lueneburg (Germany); Ebinghaus, Ralf; Temme, Christian [Institute for Coastal Research, GKSS Research Centre, Department of Environmental Chemistry, Geesthacht (Germany); Berger, Urs [Norwegian Institute for Air Research (NILU), Polar Environmental Centre, Tromsoe (Norway); Stockholm University, Department of Applied Environmental Science (ITM), Stockholm (Sweden); Barber, Jonathan L. [Lancaster University, Department of Environmental Science, Faculty of Science and Technology, Lancaster (United Kingdom)


    This article describes the optimisation and validation of an analytical method for the determination of volatile polyfluorinated alkyl substances (PFAS) in environmental air samples. Airborne fluorinated telomer alcohols (FTOHs) as well as fluorinated sulfonamides and sulfonamidoethanols (FOSAs/FOSEs) were enriched on glass-fibre filters (GFFs), polyurethane foams (PUFs) and XAD-2 resin by means of high-volume air samplers. Sensitive and selective determination was performed using gas chromatography/chemical ionisation-mass spectrometry (GC/CI-MS). Five mass-labelled internal standard (IS) compounds were applied to ensure the accuracy of the analytical results. No major blank problems were encountered. Recovery experiments were performed, showing losses of the most volatile compounds during extraction and extract concentration as well as strong signal enhancement for FOSEs due to matrix effects. Breakthrough experiments revealed losses of the most volatile FTOHs during sampling, while FOSAs/FOSEs were quantitatively retained. Both analyte losses and matrix effects could be compensated for by application of adequate mass-labelled IS. Method quantification limits (MQLs) of the optimised method ranged from 0.2 to 2.5 pg/m³ for individual target compounds. As part of the method validation, an interlaboratory comparison of instrumental quantification methods was conducted. The applicability of the method was demonstrated by means of environmental air samples from an urban and a rural location in Northern Germany. (orig.)

  2. A simple sample preparation method for measuring amoxicillin in human plasma by hollow fiber centrifugal ultrafiltration. (United States)

    Dong, Wei-Chong; Hou, Zi-Li; Jiang, Xin-Hui; Jiang, Ye


    A simple sample preparation method has been developed for the determination of amoxicillin in human plasma by hollow fiber centrifugal ultrafiltration (HF-CF-UF). A 400-μL plasma sample was placed directly into the HF-CF-UF device, which consisted of a slim glass tube and a U-shaped hollow fiber. After centrifugation at 1.25 × 10³ g for 10 min, the filtrate was withdrawn from the hollow fiber and 20 µL was injected directly into the high-performance liquid chromatography (HPLC) system for analysis. The calibration curve was linear over the range of 0.1-20 µg/mL (r = 0.9996) and the limit of detection was as low as 0.025 µg/mL. The average recovery and absolute recovery were 99.9% and 84.5%, respectively. Both the intra-day and inter-day precisions (relative standard deviation) were less than 3.1% for three concentrations (0.25, 2.5 and 10 µg/mL). The sample preparation process was simplified: after a single centrifugal ultrafiltration step, the filtrate can be injected directly into the HPLC system. The present method is simple, sensitive and accurate. It could be effective for the analysis of biological samples with high protein contents, especially for the biopharmaceutical analysis of drugs that otherwise use traditional isolation techniques for sample preparation, such as the protein precipitation method.
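
    The calibration figures above (linearity over 0.1-20 µg/mL, LoD of 0.025 µg/mL) come from a fitted calibration line. A minimal sketch of a least-squares calibration with an ICH-style LOD estimate (3.3·s/slope); the response values are hypothetical, and the paper may derive its LoD differently:

```python
def calibration_with_lod(conc, resp):
    """Fit response = m*conc + b and estimate LOD as 3.3 * s_res / m,
    a common ICH-style convention (an assumption here, not necessarily
    the paper's procedure)."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(resp) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(conc, resp))
         / sum((x - mx) ** 2 for x in conc))
    b = my - m * mx
    resid = [y - (m * x + b) for x, y in zip(conc, resp)]
    s_res = (sum(r * r for r in resid) / (n - 2)) ** 0.5
    return m, b, 3.3 * s_res / m

# hypothetical peak-area responses over the reported 0.1-20 ug/mL range
conc = [0.1, 0.5, 1, 5, 10, 20]
resp = [1.1, 5.2, 10.4, 50.1, 100.8, 199.9]
m, b, lod = calibration_with_lod(conc, resp)
print(round(m, 2), round(lod, 3))
```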

  3. Sample preparation method considerations for integrated transcriptomic and proteomic analysis of tumors. (United States)

    Bhat, Anupama Rajan; Gupta, Manoj Kumar; Krithivasan, Priya; Dhas, Kunal; Nair, Jayalakshmi; Reddy, Ram Bhupal; Sudheendra, Holalugunda Vittalamurthy; Chavan, Sandip; Vardhan, Harsha; Darsi, Sujatha; Balakrishnan, Lavanya; Katragadda, Shanmukh; Kekatpure, Vikram; Suresh, Amritha; Tata, Pramila; Panda, Binay; Kuriakose, Moni A; Sirdeshmukh, Ravi


    Sample processing protocols that enable compatible recovery of differentially expressed transcripts and proteins are necessary for integration of the multiomics data applied in the analysis of tumors. In this pilot study, we compared two different isolation methods for extracting RNA and protein from laryngopharyngeal tumor tissues and the corresponding adjacent normal sections. In Method 1, RNA and protein were isolated from a single tissue section sequentially, and in Method 2, the extraction was carried out using two different sections and two independent, parallel protocols for RNA and protein. RNA and protein from both methods were subjected to RNA-seq and iTRAQ-based LC-MS/MS analysis, respectively. Analysis of the data revealed that a higher number of differentially expressed transcripts and proteins were concordant in their regulation trends in Method 1 as compared to Method 2. Cross-method comparison of concordant entities revealed that RNA and protein extraction from the same tissue section (Method 1) recovered more concordant entities that were missed by the other extraction method (Method 2), indicating heterogeneity in the distribution of these entities across tissue sections. Method 1 could thus be the method of choice for integrated analysis of transcriptome and proteome data. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
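
    "Concordant in their regulation trends" means the transcript and protein fold changes point in the same direction. A minimal sketch of that comparison on hypothetical log2 fold changes (gene names and values are invented for illustration):

```python
def concordant(rna_logfc, prot_logfc):
    """Entities whose transcript and protein fold changes agree in
    direction (both up or both down)."""
    return [g for g in rna_logfc
            if g in prot_logfc and rna_logfc[g] * prot_logfc[g] > 0]

# hypothetical log2 fold changes (tumor vs adjacent normal)
rna  = {"MMP9": 2.1, "KRT5": -1.4, "EGFR": 0.8, "TP63": -0.5}
prot = {"MMP9": 1.3, "KRT5": -0.9, "EGFR": -0.2}
print(concordant(rna, prot))  # ['MMP9', 'KRT5']
```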

  4. Metrological assurance in radiochemical production

    Directory of Open Access Journals (Sweden)

    Yuliya A. Tadevosyan


    Full Text Available Introduction: Radiochemical production is one of the richest sources of data for analytical control, the high-quality implementation of which is impossible without type approved reference materials (RMs). The demand for RMs used to be met by industry-specific institutes, many of which have since ceased this type of activity for various reasons. This paper covers problems of metrological support in radiochemical production caused by the lack of type approved RMs. Materials and methods: Technologies used for obtaining homogeneous reference materials for radiochemical production are described. Methods and measuring instruments used for certifying RMs are listed. Results: Results of developing certified reference materials at the Mayak Production Association are given. Examples of co-developing RMs of triuranium octoxide and plutonium dioxide are provided. Discussion and conclusions: An assessment of the current situation regarding the provision of type approved RMs is given. The paper provides data on the availability of raw material and the quality of the instrumental and methodological base needed to create a reference material production site at the Mayak Production Association. Results of step-by-step solutions to problems of metrological assurance in radiochemical production are presented. Research prospects for developing RMs for inductively coupled plasma mass spectrometry are outlined. An outlook is given, and practical proposals are formulated concerning the interaction between institutes and enterprises in the field on developing type approved RMs.

  5. [Quality assurance in acupuncture therapy]. (United States)

    Kubiena, G


    Quality assurance for acupuncture therapy requires good basic and ongoing training in both conventional Western medicine and the theory and practice of acupuncture, the ability to synthesize the patient's objective findings and subjective feelings, and honesty with the patient and towards oneself. Thus, based on continuous critical evaluation of the objective and subjective parameters, the question of whether acupuncture is the optimal form of therapy for a specific case is answered honestly, and one has the courage to admit failures. With regard to theory, surveys of the acupuncture literature show that a considerable improvement in quality and honesty is necessary. There is a lack of standardised experimental methods (e.g., 28 different placebos in 28 different studies!). German acupuncture journals in particular have a troubled relationship with failure. To hide or deny failures benefits neither acupuncture, nor science, nor the relationship between physician and patient, since the practitioner must be able to rely on the information in the literature. Furthermore, one should be open-minded about alternative methods, even if this means referring a patient to a colleague.

  6. Effects of Heterogeneities, Sampling Frequencies, Tools and Methods on Uncertainties in Subsurface Contaminant Concentration Measurements (United States)

    Ezzedine, S. M.; McNab, W. W.


    Long-term monitoring (LTM) is particularly important for contaminants which are mitigated by natural processes of dilution, dispersion, and degradation. At many sites, LTM can require decades of expensive sampling at tens or even hundreds of existing monitoring wells, resulting in hundreds of thousands, or millions, of dollars per year for sampling and data management. Therefore, contaminant sampling tools, methods and frequencies are chosen to minimize waste and data management costs while ensuring a reliable and informative time-history of contaminant measurement for regulatory compliance. The interplay between cause (i.e., subsurface heterogeneities, sampling techniques, measurement frequencies) and effect (unreliable data and measurement gaps) has been overlooked in many field applications, which can lead to inconsistencies in the time-histories of contaminant samples. In this study we address the relationship between cause and effect for different hydrogeological sampling settings: porous and fractured media. A numerical model has been developed using AMR-FEM to solve the physicochemical processes that take place in the aquifer and the monitoring well. In the latter, the flow is governed by the Navier-Stokes equations, while in the former the flow is governed by the diffusivity equation; both are fully coupled to mimic stressed conditions and to assess the effect of a dynamic sampling tool on the formation surrounding the monitoring well. First, different sampling tools (i.e., Easy Pump, Snapper Grab Sampler) were simulated in a monitoring well screened in different homogeneous layered aquifers to assess their effect on the sampling measurements. Second, to make the computer runs more CPU-efficient, the flow in the monitoring well was replaced by its counterpart flow in porous media with infinite permeability, and the new model was used to simulate the effect of heterogeneities, sampling depth, sampling tool and sampling frequencies on the

  7. Spiders and harvestmen on tree trunks obtained by three sampling methods

    Directory of Open Access Journals (Sweden)

    Machač, Ondřej


    Full Text Available We studied spiders and harvestmen on tree trunks using three sampling methods. In 2013, spider and harvestman research was conducted on the trunks of selected species of deciduous trees (linden, oak, maple) in the town of Přerov and a surrounding floodplain forest near the Bečva River in the Czech Republic. Three methods were used to collect arachnids: pitfall traps with a conservation fluid, sticky traps, and cardboard pocket traps. Overall, 1862 spiders and 864 harvestmen were trapped, representing 56 spider species belonging to 15 families and seven harvestman species belonging to one family. The most effective method for collecting spider specimens was a modified pitfall trap method, and in autumn (September to October) a cardboard band method. The results suggest a high number of spiders overwintering on the tree bark. The highest species diversity of spiders was found in pitfall traps, which were also the most effective method for collecting harvestmen.

  8. Sample Size Considerations of Prediction-Validation Methods in High-Dimensional Data for Survival Outcomes (United States)

    Pang, Herbert; Jung, Sin-Ho


    A variety of prediction methods are used to relate high-dimensional genome data with a clinical outcome using a prediction model. Once a prediction model is developed from a data set, it should be validated using a resampling method or an independent data set. Although the existing prediction methods have been intensively evaluated by many investigators, there has not been a comprehensive study investigating the performance of the validation methods, especially with a survival clinical outcome. Understanding the properties of the various validation methods can allow researchers to perform more powerful validations while controlling for type I error. In addition, a sample size calculation strategy based on these validation methods is lacking. We conduct extensive simulations to examine the statistical properties of these validation strategies. In both simulations and a real data example, we have found that 10-fold cross-validation with permutation gave the best power while controlling type I error close to the nominal level. Based on this, we have also developed a sample size calculation method that can be used to design a validation study with a user-chosen combination of prediction and validation methods. Microarray and genome-wide association studies data are used as illustrations. The power calculation method in this presentation can be used for the design of any biomedical studies involving high-dimensional data and survival outcomes. PMID:23471879
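    The recommended strategy, 10-fold cross-validation combined with a permutation test, can be sketched on toy data. This is a generic illustration, not the paper's procedure: a least-squares risk model and a correlation score stand in for the survival models and concordance measures the study actually evaluates.

```python
import numpy as np

rng = np.random.default_rng(0)

def cv_score(X, y, k=10):
    """Mean k-fold cross-validated score of a least-squares risk model
    (a stand-in for the survival prediction models in the paper)."""
    idx = rng.permutation(len(y))
    scores = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        beta, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
        pred = X[fold] @ beta
        scores.append(np.corrcoef(pred, y[fold])[0, 1])
    return float(np.mean(scores))

# toy data with a genuine signal in the first column
n, p = 100, 20
X = rng.standard_normal((n, p))
y = X[:, 0] + 0.5 * rng.standard_normal(n)

observed = cv_score(X, y)
# permutation null: break the X-y link and recompute the CV score
null = [cv_score(X, rng.permutation(y)) for _ in range(200)]
p_value = (1 + sum(s >= observed for s in null)) / (1 + len(null))
print(p_value < 0.05)   # the validation detects the real association
```

    The permutation step is what controls the type I error: under the null, the observed cross-validated score is just another draw from the permutation distribution.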

  9. Comparing methods for fetal fraction determination and quality control of NIPT samples. (United States)

    van Beek, Daphne M; Straver, Roy; Weiss, Marian M; Boon, Elles M J; Huijsdens-van Amsterdam, Karin; Oudejans, Cees B M; Reinders, Marcel J T; Sistermans, Erik A


    To compare available analysis methods for determining fetal fraction on single-read next-generation sequencing data. This is important as the performance of non-invasive prenatal testing (NIPT) procedures depends on the fraction of fetal DNA. We tested six different methods for the detection of fetal fraction in NIPT samples. The same clinically obtained data were used for all methods, allowing us to assess the effect of fetal fraction on the test result, and to investigate the use of fetal fraction for quality control. We show that non-NIPT methods based on body mass index (BMI) and gestational age are unreliable predictors of fetal fraction, male-pregnancy-specific methods based on read counts on the Y chromosome perform consistently, and the new fetal-sex-independent methods SeqFF and SANEFALCON are less reliable but can be used to obtain a basic indication of fetal fraction in case of a female fetus. We recommend the use of a combination of methods to prevent the issue of reports on samples with insufficient fetal DNA: SANEFALCON to check for the presence of fetal DNA, SeqFF for estimating the fetal fraction for a female pregnancy, and any Y-based method for estimating the fetal fraction for a male pregnancy. © 2017 The Authors. Prenatal Diagnosis published by John Wiley & Sons, Ltd.
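    The Y-based approach the abstract endorses for male pregnancies can be sketched as a simple linear interpolation: the chrY read fraction rises from a female background level toward the adult-male level in proportion to the fetal fraction. The function and all reference levels below are illustrative assumptions, not values from the study.

```python
def fetal_fraction_y(y_frac_sample, y_frac_female_bg, y_frac_adult_male):
    """Estimate fetal fraction from the share of reads mapping to chrY.

    Assumes a male fetus: the observed chrY read fraction is modelled as
    the female background plus the fetal fraction times the distance to
    the adult-male level. All reference levels are hypothetical.
    """
    return (y_frac_sample - y_frac_female_bg) / (y_frac_adult_male - y_frac_female_bg)

# hypothetical chrY read fractions
ff = fetal_fraction_y(y_frac_sample=0.0014,
                      y_frac_female_bg=0.0002,
                      y_frac_adult_male=0.0122)
print(round(ff, 3))  # → 0.1
```

    A quality-control rule in the spirit of the abstract would flag a sample for re-analysis when the estimate falls below a laboratory-chosen minimum fetal fraction.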

  10. Study on a pattern classification method of soil quality based on simplified learning sample dataset (United States)

    Zhang, Jiahua; Liu, S.; Hu, Y.; Tian, Y.


    Based on the massive amount of soil information involved in current soil quality grade evaluation, this paper constructed an intelligent classification approach for soil quality grade based on classical sampling techniques and a disordered multiclass Logistic regression model. As a case study, the learning sample capacity was determined under a given confidence level and estimation accuracy, and a c-means algorithm was used to automatically extract the simplified learning sample dataset from the cultivated soil quality grade evaluation database for the study area, Longchuan County in Guangdong Province; a disordered Logistic classifier model was then built, and the calculation and analysis steps of soil quality grade intelligent classification were given. The result indicated that the soil quality grade can be effectively learned and predicted from the extracted simplified dataset through this method, which changed the traditional method of soil quality grade evaluation. © 2011 IEEE.
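    The c-means extraction step of such a pipeline can be sketched as follows. This is a generic fuzzy c-means implementation on toy data standing in for the soil-attribute database; the cluster count, fuzzifier, and "top members per cluster" rule are all illustrative choices, not the paper's.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means: returns cluster centres and the membership matrix."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(X), c))
    u /= u.sum(axis=1, keepdims=True)          # memberships sum to 1 per sample
    for _ in range(iters):
        um = u ** m
        centres = (um.T @ X) / um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        u = 1.0 / (d ** (2 / (m - 1)))         # standard membership update
        u /= u.sum(axis=1, keepdims=True)
    return centres, u

# toy "soil attribute" table: two well-separated groups of 50 samples
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 3)), rng.normal(3, 0.3, (50, 3))])

centres, u = fuzzy_c_means(X, c=2)
# simplified learning set: the samples with the highest membership per cluster
simplified = {k: np.argsort(u[:, k])[-5:] for k in range(u.shape[1])}
print(sorted(len(v) for v in simplified.values()))  # → [5, 5]
```

    The reduced sample set selected this way would then feed the multiclass Logistic classifier in place of the full database.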

  11. Multicenter validation of PCR-based method for detection of Salmonella in chicken and pig samples

    DEFF Research Database (Denmark)

    Malorny, B.; Cook, N.; D'Agostino, M.


    As part of a standardization project, an interlaboratory trial including 15 laboratories from 13 European countries was conducted to evaluate the performance of a nonproprietary polymerase chain reaction (PCR)-based method for the detection of Salmonella on artificially contaminated chicken rinse and pig swab samples. Three contamination levels were used: 1-10, 10-100, and 100-1000 colony-forming units (CFU)/100 mL. Sample preparations, including inoculation and pre-enrichment in buffered peptone water (BPW), were performed centrally in a German laboratory; the pre-PCR sample preparation (by a resin-based method) and PCR assay (gel electrophoresis detection) were performed by the receiving laboratories. Aliquots of BPW enrichment cultures were sent to the participants, who analyzed them using a thermal lysis procedure followed by a validated Salmonella-specific PCR assay. The results were reported as negative...

  12. Using the Experience Sampling Method in the Context of Contingency Management for Substance Abuse Treatment (United States)

    Husky, Mathilde M.; Mazure, Carolyn M.; Carroll, Kathleen M.; Barry, Danielle; Petry, Nancy M.


    Contingency management (CM) treatments have been shown to be effective in reducing substance use. This manuscript illustrates how the experience sampling method (ESM) can depict behavior and behavior change and can be used to explore CM treatment mechanisms. ESM characterizes idiosyncratic patterns of behavior and offers the potential to determine…

  13. A self-sampling method to obtain large volumes of undiluted cervicovaginal secretions. (United States)

    Boskey, Elizabeth R; Moench, Thomas R; Hees, Paul S; Cone, Richard A


    Studies of vaginal physiology and pathophysiology sometimes require larger volumes of undiluted cervicovaginal secretions than can be obtained by current methods. A convenient method for self-sampling these secretions outside a clinical setting can facilitate such studies of reproductive health. The goal was to develop a vaginal self-sampling method for collecting large volumes of undiluted cervicovaginal secretions. A menstrual collection device (the Instead cup) was inserted briefly into the vagina to collect secretions that were then retrieved from the cup by centrifugation in a 50-ml conical tube. All 16 women asked to perform this procedure found it feasible and acceptable. Among 27 samples, an average of 0.5 g of secretions (range, 0.1-1.5 g) was collected. This is a rapid and convenient self-sampling method for obtaining relatively large volumes of undiluted cervicovaginal secretions. It should prove suitable for a wide range of assays, including those involving sexually transmitted diseases, microbicides, vaginal physiology, immunology, and pathophysiology.

  14. Highly Effective DNA Extraction Method from Fresh, Frozen, Dried and Clotted Blood Samples

    Directory of Open Access Journals (Sweden)

    Jaleh Barar


    Full Text Available Introduction: Today, with the tremendous potential of genomics and other recent advances in science, reliable DNA extraction methods are more relevant than ever before. The ideal process for genomic DNA extraction demands high quantities of pure, integral and intact genomic DNA (gDNA) from the sample with minimal co-extraction of inhibitors of downstream processes. Here, we report the development of a very rapid, less hazardous, and high-throughput protocol for extracting high-quality DNA from blood samples. Methods: Dried, clotted and ethylene diamine tetra-acetic acid (EDTA)-treated fresh and frozen blood samples were extracted using this method, in which the quality and integrity of the extracted DNA were corroborated by agarose gel electrophoresis, PCR reaction and DNA digestion using a restriction enzyme. The UV spectrophotometric and gel electrophoresis analyses resulted in a high A260/A280 ratio (>1.8) with high intactness of DNA. Results: PCR and DNA digestion experiments indicated that the final solutions of extracted DNA contained no inhibitory substances, which confirms that the isolated DNA is of good quality. Conclusion: The high quality and yield of the current method, the absence of enzymatic processing and accordingly its low cost, make it appropriate for DNA extraction not only from human but also from animal blood samples in any molecular biology lab.
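    The purity criterion the abstract relies on (an A260/A280 ratio above ~1.8 indicating protein-free DNA) reduces to a one-line calculation. The helper below is illustrative; the optional A320 background correction is a common spectrophotometric practice, not something stated in the abstract.

```python
def dna_purity(a260, a280, a320=0.0):
    """A260/A280 ratio, optionally corrected for background absorbance (A320).
    DNA that is reasonably free of protein typically shows a ratio above ~1.8."""
    ratio = (a260 - a320) / (a280 - a320)
    return ratio, ratio > 1.8

# hypothetical absorbance readings for one extract
ratio, is_pure = dna_purity(a260=0.95, a280=0.50)
print(round(ratio, 2), is_pure)  # → 1.9 True
```

    In practice this check complements, rather than replaces, the gel electrophoresis and digestion assays the authors use to confirm integrity.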

  15. Evaluation of surface sampling method performance for Bacillus Spores on clean and dirty outdoor surfaces.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, Mollye C.; Einfeld, Wayne; Boucher, Raymond M.; Brown, Gary Stephen; Tezak, Matthew Stephen


    Recovery of Bacillus atrophaeus spores from grime-treated and clean surfaces was measured in a controlled chamber study to assess sampling method performance. Outdoor surfaces investigated by wipe and vacuum sampling methods included stainless steel, glass, marble and concrete. Bacillus atrophaeus spores were used as a surrogate for Bacillus anthracis spores in this study designed to assess whether grime-coated surfaces significantly affected surface sampling method performance when compared to clean surfaces. A series of chamber tests were carried out in which known amounts of spores were allowed to gravitationally settle onto both clean and dirty surfaces. Reference coupons were co-located with test coupons in all chamber experiments to provide a quantitative measure of initial surface concentrations of spores on all surfaces, thereby allowing sampling recovery calculations. Results from these tests, carried out under both low and high humidity conditions, show that spore recovery from grime-coated surfaces is the same as or better than spore recovery from clean surfaces. Statistically significant differences between method performance for grime-coated and clean surfaces were observed in only about half of the chamber tests conducted.
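    The reference-coupon design described above supports a simple recovery calculation: the count recovered by a sampling method divided by the surface loading measured on the co-located reference coupon. The function and the plate counts below are illustrative, not data from the study.

```python
def recovery_efficiency(cfu_sampled, cfu_reference):
    """Percent recovery: spores recovered by the sampling method divided by
    the initial surface loading measured on the co-located reference coupon."""
    return 100.0 * cfu_sampled / cfu_reference

# hypothetical plate counts (CFU) for one wipe sample and its reference coupon
print(round(recovery_efficiency(cfu_sampled=420, cfu_reference=1000), 1))  # → 42.0
```

    Comparing these percentages between grime-coated and clean coupons, per method and humidity condition, is what drives the study's statistical conclusions.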

  16. Protein Profile study of clinical samples using Laser Induced Fluorescence as the detection method

    DEFF Research Database (Denmark)

    Karemore, Gopal Raghunath; Raja, Sujatha N.; Rai, Lavanya


    Protein profiles of tissue homogenates were recorded using HPLC separation and LIF detection. The samples were collected from volunteers with clinically normal or cervical cancer conditions. It is shown that the protein profile can be classified as belonging to malignant or normal state ...

  17. Impact of culture media and sampling methods on Staphylococcus aureus aerosols. (United States)

    Chang, C-W; Wang, L-J


    Staphylococcus aureus has been detected indoors and is associated with human infection. Reliable quantification of S. aureus using a sampling technique followed by culture assay helps in assessing the risks of human exposure. The efficiency of five culture media and eight sampling methods in recovering S. aureus aerosols was evaluated. Methods to extract cells from filters were also studied. Tryptic soy agar (TSA) presented greater bacterial recovery than mannitol salt agar (MSA), CHROMagar staph aureus, Chapman stone medium, and Baird-Parker agar (P < 0.05). Cell extraction methods included dissolution of gelatin filters and a 2-min vortex of polycarbonate (PC) filters. Evaluation of two filtration methods (IOM with gelatin filter and cassette with PC filter), two impaction methods (Andersen 1-STG loaded with TSA and MSA) and four impingement methods [AGI-30 and BioSampler filled with Tween mixture (TM) and phosphate-buffered saline (PBS)] revealed that the BioSampler/TM performed best over 30 and 60 min of sampling (P < 0.05), while low recovery efficiencies were associated with the IOM/gelatin, cassette/PC, and AGI-30/PBS combinations (P < 0.05). In addition to BioSampler/TM, collecting S. aureus onto TSA from the Andersen 1-STG is also recommended, as it was the second-best method at the 60-min sampling (P < 0.05). © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  18. Measurement of nasal nitric oxide : evaluation of six different sampling methods

    NARCIS (Netherlands)

    de Winter-de Groot, K. M.; van der Ent, C. K.

    Specific guidelines have been developed for the measurement of bronchial FE(NO); however, nasal nitric oxide (nNO) measurement is not yet standardised, resulting in divergent nNO values. This study compares six different sampling methods for nNO as described in the literature, to analyse their outcome and

  19. Precise confidence intervals of regression-based reference limits: Method comparisons and sample size requirements. (United States)

    Shieh, Gwowen


    Covariate-dependent reference limits have been extensively applied in biology and medicine for determining the substantial magnitude and relative importance of quantitative measurements. Confidence interval and sample size procedures are available for studying regression-based reference limits. However, the existing popular methods employ different technical simplifications and are applicable only in certain limited situations. This paper describes exact confidence intervals of regression-based reference limits and compares the exact approach with the approximate methods under a wide range of model configurations. Using the ratio between the widths of confidence interval and reference interval as the relative precision index, optimal sample size procedures are presented for precise interval estimation under expected ratio and tolerance probability considerations. Simulation results show that the approximate interval methods using normal distribution have inaccurate confidence limits. The exact confidence intervals dominate the approximate procedures in one- and two-sided coverage performance. Unlike the current simplifications, the proposed sample size procedures integrate all key factors including covariate features in the optimization process and are suitable for various regression-based reference limit studies with potentially diverse configurations. The exact interval estimation has theoretical and practical advantages over the approximate methods. The corresponding sample size procedures and computing algorithms are also presented to facilitate the data analysis and research design of regression-based reference limits. Copyright © 2017 Elsevier Ltd. All rights reserved.
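    The quantity under study, a covariate-dependent reference limit, can be sketched for the simple linear-regression case: fit the mean as a function of the covariate, then add and subtract a normal quantile times the residual spread. This is only the point estimate on toy data; the paper's contribution, exact confidence intervals around these limits and the matching sample size procedures, is considerably more involved.

```python
import numpy as np

rng = np.random.default_rng(2)

# toy covariate-dependent measurements: y = 2 + 0.5*x + noise
n = 200
x = rng.uniform(20, 60, n)               # e.g. age as the covariate (hypothetical)
y = 2 + 0.5 * x + rng.normal(0, 1.0, n)

# ordinary least-squares fit of the conditional mean
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
s = np.sqrt(resid @ resid / (n - 2))     # residual standard deviation

# point estimates of the 95% regression-based reference limits at x0
z = 1.959964                             # standard-normal 97.5% quantile
x0 = 40.0
centre = beta[0] + beta[1] * x0
lower, upper = centre - z * s, centre + z * s
print(round(upper - lower, 2))           # width of the reference interval at x0
```

    The relative precision index discussed in the abstract would then compare the width of a confidence interval for `lower` or `upper` against this reference-interval width.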

  20. Statistical methods for genetic association studies with response-selective sampling designs

    NARCIS (Netherlands)

    Balliu, Brunilda


    This dissertation describes new statistical methods designed to improve the power of genetic association studies. Of particular interest are studies with a response-selective sampling design, i.e. case-control studies of unrelated individuals and case-control studies of family members. The