WorldWideScience

Sample records for assurance sampling method

  1. QUALITY ASSESSMENT OF ANTE-NATAL CARE USING THE METHOD OF LOT QUALITY ASSURANCE SAMPLING

    Directory of Open Access Journals (Sweden)

    Sh. Salarilak

    1999-08-01

    Full Text Available To determine the coverage rate, timeliness and quality of ante-natal care in rural areas covered by Health Houses in West Azerbaijan province, 30 Health Houses (HH) were randomly selected out of 731 HH in the province. In each HH, 28 women who had recently given birth were selected using the method of Lot Quality Assurance Sampling (LQAS). Data were collected using a checklist for facilities, and questionnaires and forms completed from the files by interview. The study showed that the LQAS method is quite effective for evaluating this service at HH level. The weighted total coverage of ante-natal care was 46.2%. Quality of care was acceptable for 53.9% of mothers. The weighted average timeliness of care was 49.8%. Availability of facilities for delivering this service was 100%, showing there was no shortcoming in this respect.

  2. QUALITY ASSURANCE PROCEDURES: METHOD 5G DETERMINATION OF PARTICULATE EMISSIONS FROM WOOD HEATERS FROM A DILUTION TUNNEL SAMPLING LOCATION

    Science.gov (United States)

    Quality assurance procedures are contained in this comprehensive document intended to be used as an aid for wood heater manufacturers and testing laboratories in performing particulate matter sampling of wood heaters according to EPA protocol, Method 5G. These procedures may be u...

  3. Radioactivity in environmental samples: calibration standards measurement methods, quality assurance, and data analysis

    International Nuclear Information System (INIS)

    The numerous environmental radioactivity measurements made by and for the U.S. Environmental Protection Agency (U.S.EPA) include measurements on samples of water, urine, food, milk, and air filters. Calibration standards are listed which are available in the form of water solutions and soils for a wide range of radionuclides. Method validation procedures for U.S. EPA approval include protocol development and single-laboratory and multiple-laboratory evaluation for precision and accuracy. Inter-laboratory comparison studies are conducted for both cross-check and performance evaluation samples and involve 295 federal, state, and local laboratories. For water samples, 80% to 90% of the participating laboratories are within the control limits for most of the radionuclides measured; however, some problem areas exist, especially for radium-228 and strontium-89 and -90. For milk and food samples, more than 90% of the laboratories are within control limits for cobalt-60 and cesium-137 but some problems exist for the measurement of strontium-90, iodine-131, and potassium-40. For tritium, 91% of the laboratories are within the control limit for water samples and 87% are within the control limits for the urine samples. The laboratory performance for air filter samples shows some problems for gross beta, strontium-90 and cesium-137 measurements. (author)

  4. Authentication Assurance Level Application to the Inventory Sampling Measurement System

    International Nuclear Information System (INIS)

    This document concentrates on the identification of a standardized assessment approach for the verification of security functionality in specific equipment, the Inventory Sampling Measurement System (ISMS) being developed for MAYAK. Specifically, an Authentication Assurance Level 3 is proposed to be reached in authenticating the ISMS

  5. Sampling quality assurance guidance in support of EM environmental sampling and analysis activities

    International Nuclear Information System (INIS)

    This document introduces quality assurance guidance pertaining to the design and implementation of sampling procedures and processes for collecting environmental data for DOE's Office of EM (Environmental Restoration and Waste Management)

  6. Quality assurance strategies in hospitals: Development, implementation and impact of quality assurance methods in Iranian hospitals

    NARCIS (Netherlands)

    A. Aghaei Hashjin

    2015-01-01

    This thesis concentrates on the subject of quality assurance strategies in hospitals, exploring the development, implementation and impact of quality assurance (QA) methods in Iranian hospitals. A series of descriptive and analytical studies using qualitative and quantitative data were performed.

  7. Extending cluster lot quality assurance sampling designs for surveillance programs.

    Science.gov (United States)

    Hund, Lauren; Pagano, Marcello

    2014-07-20

    Lot quality assurance sampling (LQAS) has a long history of applications in industrial quality control. LQAS is frequently used for rapid surveillance in global health settings, with areas classified as poor or acceptable performance on the basis of the binary classification of an indicator. Historically, LQAS surveys have relied on simple random samples from the population; however, implementing two-stage cluster designs for surveillance sampling is often more cost-effective than simple random sampling. By applying survey sampling results to the binary classification procedure, we develop a simple and flexible nonparametric procedure to incorporate clustering effects into the LQAS sample design to appropriately inflate the sample size, accommodating finite numbers of clusters in the population when relevant. We use this framework to then discuss principled selection of survey design parameters in longitudinal surveillance programs. We apply this framework to design surveys to detect rises in malnutrition prevalence in nutrition surveillance programs in Kenya and South Sudan, accounting for clustering within villages. By combining historical information with data from previous surveys, we design surveys to detect spikes in the childhood malnutrition rate. PMID:24633656
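
    A minimal numerical sketch of the design idea above, under assumed parameters (the prevalence thresholds, error limits, design effect and baseline sample size below are illustrative and are not the values used for the Kenya and South Sudan surveys): the simple-random-sample LQAS size is inflated by a design effect for clustering within villages, and a binomial decision rule is then chosen for the inflated size.

        # Sketch: LQAS decision rule with a design-effect-inflated sample size.
        # All numeric values are assumptions for illustration; the paper's
        # nonparametric procedure for clustered designs is more general.
        from math import ceil, comb

        def binom_cdf(d, n, p):
            return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(d + 1))

        def decision_rule(n, p_poor, p_ok, alpha=0.10, beta=0.10):
            """Smallest threshold d ("call the area poor if more than d cases are seen")
            that keeps both misclassification risks at or below alpha and beta."""
            for d in range(n + 1):
                err_poor = binom_cdf(d, n, p_poor)    # truly poor area called acceptable
                err_ok = 1 - binom_cdf(d, n, p_ok)    # truly acceptable area called poor
                if err_poor <= alpha and err_ok <= beta:
                    return d, err_poor, err_ok
            return None                               # no rule meets both risk limits

        n_srs = 19                        # assumed LQAS size under simple random sampling
        deff = 1.5                        # assumed design effect from village clustering
        n_clustered = ceil(n_srs * deff)
        print(n_clustered, decision_rule(n_clustered, p_poor=0.30, p_ok=0.10))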

  8. A method of safety assurance for fusion experimental reactor

    International Nuclear Information System (INIS)

    This report describes a safety assurance method for a fusion experimental reactor. The ALARA (As Low As Reasonably Achievable) principle for normal conditions and the defence-in-depth principle for states deviating from normal conditions can be used as the basic principles of safety assurance for the reactor. The method includes safety design for systems, an importance categorization method to impose suitable demands on those systems, a safety evaluation method to validate the design, and application of the method. This method is considered a strong candidate for a safety assurance method. (author)

  9. Analysis of large samples by neutron activation analysis. Quality assurance aspects

    International Nuclear Information System (INIS)

    The need for quality assurance in large sample instrumental neutron activation analysis (INAA) requires the development of unconventional methods of quality control. Certified reference materials are not available at the 1-5 kg scale; moreover, inhomogeneities which might affect the accuracy of the real sample analysis would not be reflected in the analysis of a reference material or in-house control sample even when available. Model studies indicate that inhomogeneities with strong gamma-ray absorbing properties have the largest effect on the accuracy of the concentrations. The occurrence of these inhomogeneities may be derived from gamma spectrum analysis. Other opportunities for quality assurance lie in the calculated estimates of the parameters describing neutron and gamma-ray self-attenuation, and eventually in direct assessment after homogenization of the large sample, subsampling and conventional analysis. (author)

  10. Optimization of single plate-serial dilution spotting (SP-SDS) with sample anchoring as an assured method for bacterial and yeast cfu enumeration and single colony isolation from diverse samples

    Directory of Open Access Journals (Sweden)

    Pious Thomas

    2015-12-01

    Full Text Available We propose a simple technique for bacterial and yeast cfu estimations from diverse samples with no prior idea of viable counts, designated as single plate-serial dilution spotting (SP-SDS), with the prime recommendation of sample anchoring (10^0 stocks). For pure cultures, serial dilutions were prepared from a 0.1 OD (10^0) stock and 20 μl aliquots of six dilutions (10^1–10^6) were applied as 10–15 micro-drops in six sectors over agar-gelled medium in 9-cm plates. For liquid samples 10^0–10^5 dilutions, and for colloidal suspensions and solid samples (10% w/v), 10^1–10^6 dilutions were used. Following incubation, at least one dilution level yielded 6–60 cfu per sector, comparable to the standard method involving 100 μl samples. Tested on diverse bacteria, composite samples and Saccharomyces cerevisiae, SP-SDS offered wider applicability over alternative methods like drop-plating and track-dilution for cfu estimation, single colony isolation and culture purity testing, particularly suiting low-resource settings.
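
    The cfu arithmetic implied by the SP-SDS description can be sketched as follows (the colony count and dilution level are invented for illustration; the 20 μl spot volume and the 6–60 cfu countable range are taken from the abstract):

        # Sketch: back-calculating cfu/ml from one countable SP-SDS sector.
        # The colony count and dilution level below are invented; 20 microlitre
        # spots and the 6-60 cfu countable range follow the abstract.
        SPOT_VOLUME_ML = 0.020  # 20 microlitre aliquot spotted per sector

        def cfu_per_ml(colony_count, dilution_exponent, spot_volume_ml=SPOT_VOLUME_ML):
            """cfu/ml in the undiluted sample, from one sector at dilution 10^-exponent."""
            if not 6 <= colony_count <= 60:
                raise ValueError("sector outside the countable 6-60 cfu range")
            return colony_count * 10**dilution_exponent / spot_volume_ml

        # Example: 23 colonies counted in the 10^4-fold dilution sector
        print(f"{cfu_per_ml(23, 4):.2e} cfu/ml")   # ~1.15e+07 cfu/ml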

  11. Measurement assurance program for FTIR analyses of deuterium oxide samples

    International Nuclear Information System (INIS)

    Analytical chemistry measurements require an installed criterion based assessment program to identify and control sources of error. This program should also gauge the uncertainty about the data. A self- assessment was performed of long established quality control practices against the characteristics of a comprehensive measurement assurance program. Opportunities for improvement were identified. This paper discusses the efforts to transform quality control practices into a complete measurement assurance program. The resulting program heightened the laboratory's confidence in the data it generated, by providing real-time statistical information to control and determine measurement quality

  12. [Quality assurance in geriatric rehabilitation--approaches and methods].

    Science.gov (United States)

    Deckenbach, B; Borchelt, M; Steinhagen-Thiessen, E

    1997-08-01

    It did not take the provisions of the 5th Book of the Social Code for quality assurance issues to gain significance in the field of geriatric rehabilitation as well. While in the surgical specialties, experience in particular with external quality assurance has already been gathered over several years now, suitable concepts and methods for the new Geriatric Rehabilitation specialty are still in the initial stages of development. Proven methods from the industrial and service sectors, such as auditing, monitoring and quality circles, can in principle be drawn on for devising geriatric rehabilitation quality assurance schemes; these in particular need to take into account the multiple factors influencing the course and outcome of rehabilitation entailed by multimorbidity and multi-drug use; the eminent role of the social environment; therapeutic interventions by a multidisciplinary team; as well as the multi-dimensional nature of rehabilitation outcomes. Moreover, the specific conditions of geriatric rehabilitation require development not only of quality standards unique to this domain but also of quality assurance procedures specific to geriatrics. Along with a number of other methods, standardized geriatric assessment will play a crucial role in this respect. PMID:9411627

  13. QUALITY ASSURANCE PROCEDURES: METHOD 28 CERTIFICATION AND AUDITING OF WOOD HEATERS

    Science.gov (United States)

    Quality assurance procedures are contained in this comprehensive document intended to be used as an aid for wood heater manufacturers and testing laboratories in performing particulate matter sampling of wood heaters according to EPA protocol, Method 28. These procedures may be u...

  14. Radioactive air sampling methods

    CERN Document Server

    Maiello, Mark L

    2010-01-01

    Although the field of radioactive air sampling has matured and evolved over decades, it has lacked a single resource that assimilates technical and background information on its many facets. Edited by experts and with contributions from top practitioners and researchers, Radioactive Air Sampling Methods provides authoritative guidance on measuring airborne radioactivity from industrial, research, and nuclear power operations, as well as naturally occurring radioactivity in the environment. Designed for industrial hygienists, air quality experts, and health physicists, the book delves into the applied research advancing and transforming practice with improvements to measurement equipment, human dose modeling of inhaled radioactivity, and radiation safety regulations. To present a wide picture of the field, it covers the international and national standards that guide the quality of air sampling measurements and equipment. It discusses emergency response issues, including radioactive fallout and the assets used ...

  15. Evaluation of a Standardized Method of Quality Assurance in Mental Health Records: A Pilot Study

    Science.gov (United States)

    Bradshaw, Kelsey M.; Donohue, Bradley; Fayeghi, Jasmine; Lee, Tiffany; Wilks, Chelsey R.; Ross, Brendon

    2016-01-01

    The widespread adoption of research-supported treatments by mental health providers has facilitated empirical development of quality assurance (QA) methods. Research in this area has focused on QA systems aimed at assuring the integrity of research-supported treatment implementation, while examination of QA systems to assure appropriate…

  16. Sampling system and method

    Energy Technology Data Exchange (ETDEWEB)

    Decker, David L; Lyles, Brad F; Purcell, Richard G; Hershey, Ronald Lee

    2014-05-20

    An apparatus and method for supporting a tubing bundle during installation or removal. The apparatus includes a clamp for securing the tubing bundle to an external wireline. The method includes deploying the tubing bundle and wireline together. The tubing bundle is periodically secured to the wireline using a clamp.

  17. A method of language sampling

    DEFF Research Database (Denmark)

    Rijkhoff, Jan; Bakker, Dik; Hengeveld, Kees;

    1993-01-01

    In recent years more attention has been paid to the quality of language samples in typological work. Without an adequate sampling strategy, samples may suffer from various kinds of bias. In this article we propose a sampling method in which the genetic criterion is taken as the most important: samples ...

  18. Analysis of carrying out the immunization coverage rate survey using the Lot Quality Assurance Sampling (LQAS) method

    Institute of Scientific and Technical Information of China (English)

    谢群; 池益强; 马姗姗

    2014-01-01

    Objective To evaluate the advantages and disadvantages of LQAS for investigating immunization coverage rates in children of Xiamen city, by comparing surveyed immunization coverage rates with routinely reported coverage rates. Methods Forty-two eligible children were sampled with LQAS from each of three towns (Xike, Gulangyu and Jinshan); immunization coverage rates were calculated and compared with the routinely reported rates from the national routine immunization monitoring system. Results The surveyed immunization coverage rates of the three towns were all above 97.62%, while the routinely reported rates were above 97.84%. There was no significant difference between the two rates in Gulangyu and Jinshan, while the surveyed rate in Xike town was slightly lower than the reported rate. Conclusion Because it is fast and convenient, LQAS can be used for large-scale immunization coverage surveys and for evaluating the quality of reported coverage rates. It cannot be used to estimate the level of immunization coverage and should be used with caution in solving practical problems; random sampling and quality control of the investigation are essential when using it.

  19. Multiple category-lot quality assurance sampling: a new classification system with application to schistosomiasis control.

    Directory of Open Access Journals (Sweden)

    Casey Olives

    Full Text Available BACKGROUND: Originally a binary classifier, Lot Quality Assurance Sampling (LQAS) has proven to be a useful tool for classification of the prevalence of Schistosoma mansoni into multiple categories (≤10%, >10% and <50%, ≥50%), and semi-curtailed sampling has been shown to effectively reduce the number of observations needed to reach a decision. To date the statistical underpinnings for Multiple Category-LQAS (MC-LQAS) have not received full treatment. We explore the analytical properties of MC-LQAS, and validate its use for the classification of S. mansoni prevalence in multiple settings in East Africa. METHODOLOGY: We outline MC-LQAS design principles and formulae for operating characteristic curves. In addition, we derive the average sample number for MC-LQAS when utilizing semi-curtailed sampling and introduce curtailed sampling in this setting. We also assess the performance of MC-LQAS designs with maximum sample sizes of n=15 and n=25 via a weighted kappa-statistic using S. mansoni data collected in 388 schools from four studies in East Africa. PRINCIPAL FINDINGS: Overall performance of MC-LQAS classification was high (kappa-statistic of 0.87). In three of the studies, the kappa-statistic for a design with n=15 was greater than 0.75. In the fourth study, where these designs performed poorly (kappa-statistic less than 0.50), the majority of observations fell in regions where potential error is known to be high. Employment of semi-curtailed and curtailed sampling further reduced the sample size by as many as 0.5 and 3.5 observations per school, respectively, without increasing classification error. CONCLUSION/SIGNIFICANCE: This work provides the needed analytics to understand the properties of MC-LQAS for assessing the prevalence of S. mansoni and shows that in most settings a sample size of 15 children provides a reliable classification of schools.
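
    A rough sketch of the operating-characteristic calculation for a three-category LQAS design is given below; the sample size and decision thresholds are assumptions for illustration, not the designs derived in the paper.

        # Sketch: operating characteristics of a three-category LQAS design.
        # n, d1 and d2 are illustrative; counts <= d1 -> "low" (<=10%),
        # d1 < counts <= d2 -> "moderate", counts > d2 -> "high" (>=50%).
        from math import comb

        def binom_pmf(k, n, p):
            return comb(n, k) * p**k * (1 - p)**(n - k)

        def classification_probs(n, d1, d2, p):
            """P(classified low), P(moderate), P(high) when the true prevalence is p."""
            low = sum(binom_pmf(k, n, p) for k in range(d1 + 1))
            moderate = sum(binom_pmf(k, n, p) for k in range(d1 + 1, d2 + 1))
            return low, moderate, 1 - low - moderate

        n, d1, d2 = 25, 3, 12
        for p in (0.05, 0.10, 0.30, 0.50, 0.70):
            low, mod, high = classification_probs(n, d1, d2, p)
            print(f"p={p:.2f}  P(low)={low:.2f}  P(moderate)={mod:.2f}  P(high)={high:.2f}")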

  20. SWOT ANALYSIS ON SAMPLING METHOD

    Directory of Open Access Journals (Sweden)

    CHIS ANCA OANA

    2014-07-01

    Full Text Available Audit sampling involves the application of audit procedures to less than 100% of items within an account balance or class of transactions. Our article aims to study audit sampling in the audit of financial statements. As an audit technique largely used, in both its statistical and nonstatistical forms, the method is very important for auditors. It should be applied correctly for a fair view of financial statements, to satisfy the needs of all financial users. In order to be applied correctly the method must be understood by all its users, and mainly by auditors; otherwise incorrect application risks loss of reputation and discredit, litigation and even prison. Since there is no unitary practice and methodology for applying the technique, the risk of applying it incorrectly is quite high. SWOT analysis is a technique that shows the advantages, disadvantages, threats and opportunities. We applied SWOT analysis to the sampling method from the perspective of three players: the audit company, the audited entity and the users of financial statements. The study shows that by applying the sampling method the audit company and the audited entity both save time, effort and money. The disadvantages of the method are the difficulty of applying and understanding it. Being largely used as an audit method and being a factor in a correct audit opinion, the sampling method's advantages, disadvantages, threats and opportunities must be understood by auditors.

  1. Transuranic waste characterization sampling and analysis methods manual

    International Nuclear Information System (INIS)

    The Transuranic Waste Characterization Sampling and Analysis Methods Manual (Methods Manual) provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP) Transuranic (TRU) Waste Characterization Program (the Program). This Methods Manual includes all of the testing, sampling, and analytical methodologies accepted by DOE for use in implementing the Program requirements specified in the QAPP

  2. Transuranic waste characterization sampling and analysis methods manual

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-05-01

    The Transuranic Waste Characterization Sampling and Analysis Methods Manual (Methods Manual) provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP) Transuranic (TRU) Waste Characterization Program (the Program). This Methods Manual includes all of the testing, sampling, and analytical methodologies accepted by DOE for use in implementing the Program requirements specified in the QAPP.

  3. Distance sampling methods and applications

    CERN Document Server

    Buckland, S T; Marques, T A; Oedekoven, C S

    2015-01-01

    In this book, the authors cover the basic methods and advances within distance sampling that are most valuable to practitioners and in ecology more broadly. This is the fourth book dedicated to distance sampling. In the decade since the last book was published, there have been a number of new developments. The intervening years have also shown which advances are of most use. This self-contained book covers topics from the previous publications, while also including recent developments in method, software and application. Distance sampling refers to a suite of methods, including line and point transect sampling, in which animal density or abundance is estimated from a sample of distances to detected individuals. The book illustrates these methods through case studies; data sets and computer code are supplied to readers through the book's accompanying website. Some of the case studies use the software Distance, while others use R code. The book is in three parts. The first part addresses basic methods, the ...
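
    As a minimal illustration of the kind of estimator distance sampling builds on, the sketch below applies the basic half-normal line-transect formula to invented perpendicular distances; the book and its accompanying software cover truncation, covariates and variance estimation, none of which is reproduced here.

        # Sketch: half-normal line-transect estimate of density.
        # Distances and transect length are invented for illustration.
        import math

        distances_m = [2.1, 5.4, 0.8, 12.3, 7.7, 3.2, 9.9, 1.5, 6.0, 4.4]  # perpendicular distances (m)
        total_transect_m = 2000.0

        n = len(distances_m)
        sigma2_hat = sum(x * x for x in distances_m) / n     # half-normal MLE of sigma^2
        esw = math.sqrt(math.pi * sigma2_hat / 2)            # effective strip half-width (m)
        density_per_m2 = n / (2 * total_transect_m * esw)    # detections per square metre
        print(f"ESW = {esw:.1f} m, density = {density_per_m2 * 1e6:.0f} animals per km^2")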

  4. Use of Lot Quality Assurance Sampling to Ascertain Levels of Drug Resistant Tuberculosis in Western Kenya

    Science.gov (United States)

    Cohen, Ted; Zignol, Matteo; Nyakan, Edwin; Hedt-Gauthier, Bethany L.; Gardner, Adrian; Kamle, Lydia; Injera, Wilfred; Carter, E. Jane

    2016-01-01

    Objective To classify the prevalence of multi-drug resistant tuberculosis (MDR-TB) in two different geographic settings in western Kenya using the Lot Quality Assurance Sampling (LQAS) methodology. Design The prevalence of drug resistance was classified among treatment-naïve smear positive TB patients in two settings, one rural and one urban. These regions were classified as having high or low prevalence of MDR-TB according to a static, two-way LQAS sampling plan selected to classify high resistance regions at greater than 5% resistance and low resistance regions at less than 1% resistance. Results This study classified both the urban and rural settings as having low levels of TB drug resistance. Out of the 105 patients screened in each setting, two patients were diagnosed with MDR-TB in the urban setting and one patient was diagnosed with MDR-TB in the rural setting. An additional 27 patients were diagnosed with a variety of mono- and poly- resistant strains. Conclusion Further drug resistance surveillance using LQAS may help identify the levels and geographical distribution of drug resistance in Kenya and may have applications in other countries in the African Region facing similar resource constraints. PMID:27167381
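
    The two-way plan described above can be illustrated with a short sketch that tabulates, for a sample of 105 patients, the misclassification risks of candidate decision thresholds at the 1% and 5% design prevalences; the threshold values explored are illustrative and are not the study's published rule.

        # Sketch: error rates of candidate thresholds for a 105-patient two-way
        # LQAS plan with design prevalences of 1% (low) and 5% (high).
        # The thresholds explored are illustrative only.
        from math import comb

        def binom_sf(d, n, p):
            """P(X > d) for X ~ Binomial(n, p)."""
            return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(d + 1, n + 1))

        n = 105
        for d in range(6):
            call_high_when_low = binom_sf(d, n, 0.01)        # false "high" at 1% prevalence
            call_low_when_high = 1 - binom_sf(d, n, 0.05)    # false "low" at 5% prevalence
            print(f"d={d}: P(high|1%)={call_high_when_low:.3f}  P(low|5%)={call_low_when_high:.3f}")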

  5. Use of Lot Quality Assurance Sampling to Ascertain Levels of Drug Resistant Tuberculosis in Western Kenya.

    Directory of Open Access Journals (Sweden)

    Julia Jezmir

    Full Text Available To classify the prevalence of multi-drug resistant tuberculosis (MDR-TB) in two different geographic settings in western Kenya using the Lot Quality Assurance Sampling (LQAS) methodology. The prevalence of drug resistance was classified among treatment-naïve smear positive TB patients in two settings, one rural and one urban. These regions were classified as having high or low prevalence of MDR-TB according to a static, two-way LQAS sampling plan selected to classify high resistance regions at greater than 5% resistance and low resistance regions at less than 1% resistance. This study classified both the urban and rural settings as having low levels of TB drug resistance. Out of the 105 patients screened in each setting, two patients were diagnosed with MDR-TB in the urban setting and one patient was diagnosed with MDR-TB in the rural setting. An additional 27 patients were diagnosed with a variety of mono- and poly-resistant strains. Further drug resistance surveillance using LQAS may help identify the levels and geographical distribution of drug resistance in Kenya and may have applications in other countries in the African Region facing similar resource constraints.

  6. International symposium on quality assurance for analytical methods in isotope hydrology. Book of extended synopses

    International Nuclear Information System (INIS)

    A large variety of isotopic techniques is available and commonly used in water resources investigations as well as in a wide range of other scientific fields. These techniques include the stable isotope analysis of light elements (H, C, N, O, S), activity measurements of radioactive isotopes at environmental level (3H, 14C, 3H/3He, 85Kr) as well as measurements of CFCs, SF6 and other chemical and isotopic tracers. They provide valuable tools for the assessment of scientific questions and the solution of practical problems. During the last decade, new analytical tools have significantly fostered the application of isotopic techniques in many new fields and caused a steep increase in the number of laboratories applying these methods. International trends in improved analytical quality and requirements for laboratory certification and accreditation have pushed issues of quality control and quality assurance to a high level of importance for the operation of isotope laboratories worldwide. The objectives of the symposium are to promote a wide exchange of information on key issues for high quality isotopic measurements. The main focus is on the analytical techniques and on all means to ensure high quality standards for isotopic measurements. Recent advances in analytical quality assurance and laboratory quality systems will be presented and discussed together with state-of-the-art techniques. The scope of the conference is to demonstrate the use of best laboratory practices in the following fields: calibration of measurements and traceability; interlaboratory comparisons; best laboratory practices for daily analyses of samples; quality control and statistical evaluation of results; calculation of uncertainty budgets; new analytical techniques; improvements in precision and accuracy of analytical methods; laboratory information management, databases and sample handling; laboratory quality systems and international guides. The 42 papers are indexed individually

  7. A Method for Evaluating Quality Assurance Needs in Radiation Therapy

    International Nuclear Information System (INIS)

    The increasing complexity of modern radiation therapy planning and delivery techniques challenges traditional prescriptive quality control and quality assurance programs that ensure safety and reliability of treatment planning and delivery systems under all clinical scenarios. Until now quality management (QM) guidelines published by concerned organizations (e.g., American Association of Physicists in Medicine [AAPM], European Society for Therapeutic Radiology and Oncology [ESTRO], International Atomic Energy Agency [IAEA]) have focused on monitoring functional performance of radiotherapy equipment by measurable parameters, with tolerances set at strict but achievable values. In the modern environment, however, the number and sophistication of possible tests and measurements have increased dramatically. There is a need to prioritize QM activities in a way that will strike a balance between being reasonably achievable and optimally beneficial to patients. A systematic understanding of possible errors over the course of a radiation therapy treatment and the potential clinical impact of each is needed to direct limited resources in such a way to produce maximal benefit to the quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and is developing a framework for designing QM activities, and hence allocating resources, based on estimates of clinical outcome, risk assessment, and failure modes. The report will provide guidelines on risk assessment approaches with emphasis on failure mode and effect analysis (FMEA) and an achievable QM program based on risk analysis. Examples of FMEA to intensity-modulated radiation therapy and high-dose-rate brachytherapy are presented. Recommendations on how to apply this new approach to individual clinics and further research and development will also be discussed
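
    A minimal sketch of the FMEA scoring referred to above is given below; the failure modes and the occurrence, severity and detectability scores are invented for illustration and are not taken from the Task Group 100 report.

        # Sketch: ranking failure modes by risk priority number, RPN = O * S * D.
        # The failure modes and 1-10 scores below are invented examples.
        failure_modes = [
            # (description, occurrence O, severity S, lack of detectability D)
            ("wrong plan version exported to treatment machine", 2, 9, 6),
            ("MLC calibration drift", 4, 6, 3),
            ("incorrect density override in treatment planning", 3, 7, 5),
        ]

        for desc, o, s, d in sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True):
            print(f"RPN={o * s * d:4d}  {desc}")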

  8. Sampling methods for phlebotomine sandflies.

    Science.gov (United States)

    Alexander, B

    2000-06-01

    A review is presented of methods for sampling phlebotomine sandflies (Diptera: Psychodidae). Among approximately 500 species of Phlebotominae so far described, mostly in the New World genus Lutzomyia and the Old World genus Phlebotomus, about 10% are known vectors of Leishmania parasites or other pathogens. Despite being small and fragile, sandflies have a wide geographical range with species occupying a considerable diversity of ecotopes and habitats, from deserts to humid forests, so that suitable methods for collecting them are influenced by environmental conditions where they are sought. Because immature phlebotomines occupy obscure terrestrial habitats, it is difficult to find their breeding sites. Therefore, most trapping methods and sampling procedures focus on sandfly adults, whether resting or active. The diurnal resting sites of adult sandflies include tree holes, buttress roots, rock crevices, houses, animal shelters and burrows, from which they may be aspirated directly or trapped after being disturbed. Sandflies can be collected during their periods of activity by interception traps, or by using attractants such as bait animals, CO2 or light. The method of trapping used should: (a) be suited to the habitat and area to be surveyed, (b) take into account the segment of the sandfly population to be sampled (species, sex and reproduction condition) and (c) yield specimens of appropriate condition for the study objectives (e.g. identification of species present, population genetics or vector implication). Methods for preservation and transportation of sandflies to the laboratory also depend on the objectives of a particular study and are described accordingly. PMID:10872855

  9. Data Mining Methods Applied to Flight Operations Quality Assurance Data: A Comparison to Standard Statistical Methods

    Science.gov (United States)

    Stolzer, Alan J.; Halford, Carl

    2007-01-01

    In a previous study, multiple regression techniques were applied to Flight Operations Quality Assurance-derived data to develop parsimonious model(s) for fuel consumption on the Boeing 757 airplane. The present study examined several data mining algorithms, including neural networks, on the fuel consumption problem and compared them to the multiple regression results obtained earlier. Using regression methods, parsimonious models were obtained that explained approximately 85% of the variation in fuel flow. In general data mining methods were more effective in predicting fuel consumption. Classification and Regression Tree methods reported correlation coefficients of .91 to .92, and General Linear Models and Multilayer Perceptron neural networks reported correlation coefficients of about .99. These data mining models show great promise for use in further examining large FOQA databases for operational and safety improvements.
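
    The comparison described above can be sketched on synthetic data as follows; the feature set, data and model settings are assumptions for illustration, since the FOQA-derived fuel-flow records are not public.

        # Sketch: multiple regression vs. a small neural network on synthetic data
        # (the FOQA fuel-flow records themselves are not public).
        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        X = rng.normal(size=(2000, 4))           # stand-ins for flight parameters
        y = 3 * X[:, 0] - 2 * X[:, 1] + 0.5 * X[:, 2] ** 2 + rng.normal(scale=0.5, size=2000)

        models = [("multiple regression", LinearRegression()),
                  ("MLP neural network", MLPRegressor(hidden_layer_sizes=(32,),
                                                      max_iter=2000, random_state=0))]
        for name, model in models:
            model.fit(X, y)
            r = np.corrcoef(model.predict(X), y)[0, 1]
            print(f"{name}: correlation of predicted vs. observed = {r:.3f}")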

  10. Quality Assurance Program Plan for the Waste Sampling and Characterization Facility

    Energy Technology Data Exchange (ETDEWEB)

    Grabbe, R.R.

    1995-03-02

    The objective of this Quality Assurance Plan is to provide quality assurance (QA) guidance, implementation of regulatory QA requirements, and quality control (QC) specifications for analytical services. This document follows the Department of Energy (DOE)-issued Hanford Analytical Services Quality Assurance Plan (HASQAP) and additional federal [10 US Code of Federal Regulations (CFR) 830.120] QA requirements that HASQAP does not cover. This document describes how the laboratory implements QA requirements to meet federal or state requirements, provides the default QC specifications, and identifies the procedural information that governs how the laboratory operates. In addition, this document meets the objectives of the Quality Assurance Program provided in WHC-CM-4-2, Section 2.1. It also covers QA elements that are required in the Guidelines and Specifications for Preparing Quality Assurance Program Plans (QAPPs) (QAMS-004) and the Interim Guidelines and Specifications for Preparing Quality Assurance Project Plans (QAMS-005) from the Environmental Protection Agency (EPA). A QA index is provided in Appendix A.

  11. Methods for quality-assurance review of water-quality data in New Jersey

    Science.gov (United States)

    Brown, G. Allan; Pustay, Edward A.; Gibs, Jacob

    2003-01-01

    This report is an instructional and reference manual that describes methods developed and used by the U.S. Geological Survey (USGS), New Jersey District, to assure the accuracy and precision of the results of analyses of surface- and ground-water samples received from analyzing laboratories and, ultimately, to ensure the integrity of water-quality data in USGS databases and published reports. A statistical-analysis computer program, COMP.PPL, is used to determine whether the values reported by the laboratories are internally consistent, whether they are reasonable when compared with values for samples previously collected at the same site, and whether they exceed applicable drinking-water regulations. The program output consists of three files -- QWREVIEW, QWOUTLIERS, and QWCALC. QWREVIEW presents the results of tests of chemical logic and shows values that exceed drinking-water regulations. QWOUTLIERS identifies values that fall outside the historical range of values for the site sampled. QWCALC shows values and calculations used for reference purposes.
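
    The kinds of checks attributed to COMP.PPL above (internal consistency, comparison with historical values at a site, and comparison with drinking-water regulations) can be sketched generically as follows; the charge-balance formula, outlier criterion and limits shown are illustrations, not the USGS program's actual code.

        # Sketch: generic water-quality review checks of the kind COMP.PPL automates.
        # Formulas and limits are illustrative, not the USGS program's own.
        import statistics

        def charge_balance_error(cations_meq, anions_meq):
            """Percent cation/anion imbalance; large values flag inconsistent analyses."""
            return 100 * (cations_meq - anions_meq) / (cations_meq + anions_meq)

        def outside_historical_range(value, history, k=3.0):
            """Flag a result more than k standard deviations from the site's history."""
            return abs(value - statistics.mean(history)) > k * statistics.stdev(history)

        def exceeds_regulation(value, limit):
            """Compare a result with an applicable drinking-water limit."""
            return value > limit

        print(charge_balance_error(5.2, 4.8))                         # ~4% imbalance
        print(outside_historical_range(120.0, [40, 55, 48, 60, 52]))  # True
        print(exceeds_regulation(0.012, 0.010))                       # True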

  12. Quality assurance manual plutonium liquid scintillation methods and procedures

    International Nuclear Information System (INIS)

    Nose swipe analysis is an important tool for Radiation Protection personnel: it is a fast and accurate method for (1) determining whether a worker has been exposed to airborne plutonium contamination and (2) identifying the area where a plutonium release may have occurred. Liquid scintillation analysis techniques have been effectively applied to accurately determine the plutonium alpha activity on nose swipe media. Whatman-40 paper and Q-Tips are the only two media that have been evaluated and can be used for nose swipe analysis. Presently, only Q-Tips are used by Group HSE-1 Radiation Protection personnel. However, both swipe media will be discussed in this report.

  13. Improving data quality and supervision of antiretroviral therapy sites in Malawi: an application of Lot Quality Assurance Sampling

    Directory of Open Access Journals (Sweden)

    Hedt-Gauthier Bethany L

    2012-07-01

    Full Text Available Abstract Background High quality program data is critical for managing, monitoring, and evaluating national HIV treatment programs. By 2009, the Malawi Ministry of Health had initiated more than 270,000 patients on HIV treatment at 377 sites. Quarterly supervision of these antiretroviral therapy (ART sites ensures high quality care, but the time currently dedicated to exhaustive record review and data cleaning detracts from other critical components. The exhaustive record review is unlikely to be sustainable long term because of the resources required and increasing number of patients on ART. This study quantifies the current levels of data quality and evaluates Lot Quality Assurance Sampling (LQAS as a tool to prioritize sites with low data quality, thus lowering costs while maintaining sufficient quality for program monitoring and patient care. Methods In January 2010, a study team joined supervision teams at 19 sites purposely selected to reflect the variety of ART sites. During the exhaustive data review, the time allocated to data cleaning and data discrepancies were documented. The team then randomly sampled 76 records from each site, recording secondary outcomes and the time required for sampling. Results At the 19 sites, only 1.2% of records had discrepancies in patient outcomes and 0.4% in treatment regimen. However, data cleaning took 28.5 hours in total, suggesting that data cleaning for all 377 ART sites would require over 350 supervision-hours quarterly. The LQAS tool accurately identified the sites with the low data quality, reduced the time for data cleaning by 70%, and allowed for reporting on secondary outcomes. Conclusions Most sites maintained high quality records. In spite of this, data cleaning required significant amounts of time with little effect on program estimates of patient outcomes. LQAS conserves resources while maintaining sufficient data quality for program assessment and management to allow for quality patient

  14. Comparison of various monitoring methods for quality assurance purposes at Thoriated electrode production

    International Nuclear Information System (INIS)

    Raw material for thoriated electrodes is a powder mixture of pure tungsten and about one to five percent thorium dioxide. The mixture has to be pressed and sintered, and thorium could be emitted at all of these work sites. Thorium exposure measurements described in the literature show significant differences between the results of stationary and personal air sampling measurements on the one hand and excretion measurements on the other. These discrepancies were the motivation for an extensive measurement program in which all available monitoring methods were to be compared for quality assurance purposes. The investigated plant was chosen for this measurement program because the handlings described occur only twice a year for two weeks each; long-term thorium depositions, which would disturb the interpretation of excretion monitoring measurements, could therefore be neglected. Six volunteers were equipped with personal air sampling pumps. Some of these samples were taken in duplicate from the same person during the same period and analyzed by two different institutions for comparison purposes. Stationary air samples were taken in the vicinity during the same sampling periods. The filters were analyzed by neutron activation analysis. Urine and faeces excretion measurements were taken before, during and after the exposure period. These samples were divided into two parts and analyzed by two methods (ICP-MS and alpha spectrometry) at four different institutions. Additional exhalation measurements were performed. The comparison of the results from personal air monitoring showed a maximum difference of a factor of two. Other samples, taken with stationary sampling pumps during the same period, showed a maximum difference of a factor of five. But the results from stationary sampling and personal sampling vary by up to a factor of 200. The mean value of the activity concentration determined by personal sampling is 50 mBq/m3 and the mean value determined by stationary sampling

  15. Quality assurance and quality control for thermal/optical analysis of aerosol samples for organic and elemental carbon.

    Science.gov (United States)

    Chow, Judith C; Watson, John G; Robles, Jerome; Wang, Xiaoliang; Chen, L-W Antony; Trimble, Dana L; Kohl, Steven D; Tropp, Richard J; Fung, Kochy K

    2011-12-01

    Accurate, precise, and valid organic and elemental carbon (OC and EC, respectively) measurements require more effort than the routine analysis of ambient aerosol and source samples. This paper documents the quality assurance (QA) and quality control (QC) procedures that should be implemented to ensure consistency of OC and EC measurements. Prior to field sampling, the appropriate filter substrate must be selected and tested for sampling effectiveness. Unexposed filters are pre-fired to remove contaminants and acceptance tested. After sampling, filters must be stored in the laboratory in clean, labeled containers under refrigeration. For carbon analyses, periodic QC tests include calibration of the flame ionization detector with different types of carbon standards, thermogram inspection, replicate analyses, quantification of trace oxygen concentrations in the helium atmosphere, and calibration of the sample temperature sensor. These established QA/QC procedures are applicable to aerosol sampling and analysis for carbon and other chemical components. PMID:21626190

  16. Sample processing device and method

    DEFF Research Database (Denmark)

    2011-01-01

    A sample processing device is disclosed, which sample processing device comprises a first substrate and a second substrate, where the first substrate has a first surface comprising two area types, a first area type with a first contact angle with water and a second area type with a second contact...

  17. Quality assurance

    Energy Technology Data Exchange (ETDEWEB)

    Gillespie, B.M.; Gleckler, B.P.

    1995-06-01

    This section of the 1994 Hanford Site Environmental Report summarizes the quality assurance and quality control practices of Hanford Site environmental monitoring and surveillance programs. Samples are analyzed according to documented standard analytical procedures. This section discusses specific measures taken to ensure quality in project management, sample collection, and analytical results.

  18. A method for critical software event execution reliability in high assurance systems

    Energy Technology Data Exchange (ETDEWEB)

    Kidd, M.E.C.

    1997-03-01

    This paper presents a method for Critical Software Event Execution Reliability (Critical SEER). The Critical SEER method is intended for high assurance software that operates in an environment where transient upsets could occur, causing a disturbance of the critical software event execution order, which could cause safety or security hazards. The method has a finite automata based module that watches (hence SEER) and tracks the critical events and ensures they occur in the proper order or else a fail safe state is forced. This method is applied during the analysis, design and implementation phases of software engineering.
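
    A minimal sketch of the watcher idea described above: a small finite-state monitor that accepts critical events only in a required order and otherwise forces a fail-safe state. The event names and required order are hypothetical, not taken from the paper.

        # Sketch: a finite-state "watcher" that forces a fail-safe state if critical
        # events arrive out of order. Event names and order are hypothetical.
        REQUIRED_ORDER = ["arm", "verify", "authorize", "execute"]

        class CriticalEventMonitor:
            def __init__(self, required_order):
                self.required = required_order
                self.index = 0           # next event expected
                self.fail_safe = False

            def observe(self, event):
                if self.fail_safe:
                    return "fail-safe"
                if event == self.required[self.index]:
                    self.index += 1
                    return "sequence complete" if self.index == len(self.required) else "ok"
                self.fail_safe = True    # out-of-order event, e.g. after a transient upset
                return "fail-safe"

        monitor = CriticalEventMonitor(REQUIRED_ORDER)
        for event in ["arm", "authorize"]:   # "verify" was skipped
            print(event, "->", monitor.observe(event))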

  19. Paediatric rehabilitation treatment standards: a method for quality assurance in Germany

    Directory of Open Access Journals (Sweden)

    Jutta Ahnert

    2014-07-01

    Full Text Available Over the last few years, the German Pension Insurance has implemented a new method of quality assurance for inpatient rehabilitation of children and adolescents diagnosed with bronchial asthma, obesity, or atopic dermatitis: the so-called rehabilitation treatment standards (RTS). They aim at promoting comprehensive and evidence-based care in rehabilitation. Furthermore, they are intended to make the therapeutic processes in medical rehabilitation as well as potential deficits more transparent. The development of the RTS was composed of five phases during which current scientific evidence, expert knowledge, and patient expectations were included. Their core element is the specification of evidence-based treatment modules that describe a good rehabilitation standard for children diagnosed with bronchial asthma, obesity, or atopic dermatitis. Opportunities and limitations of the RTS as a tool for quality assurance are discussed.

  20. The integrated performance evaluation program quality assurance guidance in support of EM environmental sampling and analysis activities

    International Nuclear Information System (INIS)

    DOE's Office of Environmental Restoration and Waste Management (EM) Integrated Performance Evaluation Program (IPEP) has the purpose of integrating information from existing performance evaluation (PE) programs with expanded QA activities to develop information about the quality of radiological, mixed waste, and hazardous environmental sample analyses provided by all laboratories supporting EM programs. The guidance addresses the goals of identifying specific PE sample programs and contacts, identifying specific requirements for participation in DOE's internal and external (regulatory) programs, identifying key issues relating to application and interpretation of PE materials for EM headquarters and field office managers, and providing technical guidance covering PE materials for site-specific activities. PE materials or samples are necessary for the quality assurance/control programs covering environmental data collection

  1. Methods and quality assurance in environmental medicine. Formation of a RKI-Commission

    International Nuclear Information System (INIS)

    An almost bewildering number of widely differing methods and techniques, often not validated, are being applied, often inappropriately, in the field of environmental medicine to answer questions regarding exposure assessment, diagnosis, treatment, counselling and prevention. Therefore, quality control within the field of environmental medicine is quite problematic. A primary goal of the newly formed RKI-Commission 'Methods and Quality Assurance in Environmental Medicine' is to form a panel of experts in the field, who evaluate the situation and generate consensus documents containing respective recommendations. In this way the commission will contribute to standardization and agreement on appropriate methods, procedures and their correct application in the practice of environmental medicine. Hopefully it will also achieve a stronger, more consistent use of evidence-based medicine and improve the quality of the structure, processes and results of research and practice in this field. The committee will initially deal with the issue of clinical environmental medicine, because here the largest problems in quality assurance are seen. In this context the commission will look at the problem areas of environmental-medical outpatient units and environmental clinics. The work of the commission will be supported by the newly formed Documentation and Evaluation Center for Methods in Environmental Medicine (Zentrale Erfassungs- und Bewertungsstelle fuer umweltmedizinische Methoden, ZEBUM) at the Robert Koch Institute. (orig.)

  2. Direct method for second-order sensitivity analysis of modal assurance criterion

    Science.gov (United States)

    Lei, Sheng; Mao, Kuanmin; Li, Li; Xiao, Weiwei; Li, Bin

    2016-08-01

    A Lagrange direct method is proposed to calculate the second-order sensitivity of modal assurance criterion (MAC) values of undamped systems. The eigenvalue problem and the normalizations of eigenvectors, augmented by Lagrange multipliers, are used as the constraints of the Lagrange functional. Once the Lagrange multipliers are determined, the sensitivities of MAC values can be evaluated directly. The Lagrange direct method is accurate, efficient and easy to implement. A simply supported beam is utilized to check the accuracy of the proposed method. A frame is adopted to validate the predictive capacity of the first- and second-order sensitivities of MAC values. It is shown that the computational cost of the proposed method can be remarkably reduced in comparison with that of the indirect method, without loss of accuracy.
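
    For reference, the modal assurance criterion itself (for real mode shapes of undamped systems) can be evaluated as sketched below; the paper's contribution, the first- and second-order MAC sensitivities via the Lagrange direct method, is not reproduced here.

        # Sketch: evaluating MAC between two real mode shapes.
        # MAC = (phi1 . phi2)^2 / ((phi1 . phi1) * (phi2 . phi2)); 1 means identical shapes.
        import numpy as np

        def mac(phi1, phi2):
            return np.dot(phi1, phi2) ** 2 / (np.dot(phi1, phi1) * np.dot(phi2, phi2))

        phi_a = np.array([0.00, 0.31, 0.59, 0.81, 0.95, 1.00])                   # e.g. analytical shape
        phi_b = phi_a + 0.02 * np.random.default_rng(1).normal(size=phi_a.size)  # perturbed shape
        print(f"MAC = {mac(phi_a, phi_b):.4f}")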

  3. Subrandom methods for multidimensional nonuniform sampling

    Science.gov (United States)

    Worley, Bradley

    2016-08-01

    Methods of nonuniform sampling that utilize pseudorandom number sequences to select points from a weighted Nyquist grid are commonplace in biomolecular NMR studies, due to the beneficial incoherence introduced by pseudorandom sampling. However, these methods require the specification of a non-arbitrary seed number in order to initialize a pseudorandom number generator. Because the performance of pseudorandom sampling schedules can substantially vary based on seed number, this can complicate the task of routine data collection. Approaches such as jittered sampling and stochastic gap sampling are effective at reducing random seed dependence of nonuniform sampling schedules, but still require the specification of a seed number. This work formalizes the use of subrandom number sequences in nonuniform sampling as a means of seed-independent sampling, and compares the performance of three subrandom methods to their pseudorandom counterparts using commonly applied schedule performance metrics. Reconstruction results using experimental datasets are also provided to validate claims made using these performance metrics.
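
    One simple subrandom construction, the van der Corput sequence, can be used to pick points from a one-dimensional grid without any seed, as sketched below; the grid size and schedule length are arbitrary, and the paper's weighted-grid and multidimensional variants are not reproduced.

        # Sketch: a seed-free subrandom (van der Corput) schedule vs. a pseudorandom one.
        import random

        def van_der_corput(n, base=2):
            """n-th element of the base-b van der Corput sequence in [0, 1)."""
            q, denom = 0.0, 1.0
            while n:
                n, remainder = divmod(n, base)
                denom *= base
                q += remainder / denom
            return q

        def subrandom_schedule(grid_size, n_points):
            picks, i = [], 1
            while len(picks) < n_points:
                index = int(van_der_corput(i) * grid_size)
                if index not in picks:          # keep unique grid indices
                    picks.append(index)
                i += 1
            return sorted(picks)

        def pseudorandom_schedule(grid_size, n_points, seed=42):
            return sorted(random.Random(seed).sample(range(grid_size), n_points))

        print("subrandom:   ", subrandom_schedule(64, 16))
        print("pseudorandom:", pseudorandom_schedule(64, 16))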

  4. ALARA ASSESSMENT OF SETTLER SLUDGE SAMPLING METHODS

    Energy Technology Data Exchange (ETDEWEB)

    NELSEN LA

    2009-01-30

    The purpose of this assessment is to compare underwater and above water settler sludge sampling methods to determine if the added cost for underwater sampling for the sole purpose of worker dose reductions is justified. Initial planning for sludge sampling included container, settler and knock-out-pot (KOP) sampling. Due to the significantly higher dose consequence of KOP sludge, a decision was made to sample KOP underwater to achieve worker dose reductions. Additionally, initial plans were to utilize the underwater sampling apparatus for settler sludge. Since there are no longer plans to sample KOP sludge, the decision for underwater sampling for settler sludge needs to be revisited. The present sampling plan calls for spending an estimated $2,500,000 to design and construct a new underwater sampling system (per A21 C-PL-001 RevOE). This evaluation will compare and contrast the present method of above water sampling to the underwater method that is planned by the Sludge Treatment Project (STP) and determine if settler samples can be taken using the existing sampling cart (with potentially minor modifications) while maintaining doses to workers As Low As Reasonably Achievable (ALARA) and eliminate the need for costly redesigns, testing and personnel retraining.

  5. Analytical laboratory quality assurance guidance in support of EM environmental sampling and analysis activities

    International Nuclear Information System (INIS)

    This document introduces QA guidance pertaining to design and implementation of laboratory procedures and processes for collecting DOE Environmental Restoration and Waste Management (EM) ESAA (environmental sampling and analysis activities) data. It addresses several goals: identifying key laboratory issues and program elements to EM HQ and field office managers; providing non-prescriptive guidance; and introducing environmental data collection program elements for EM-263 assessment documents and programs. The guidance describes the implementation of laboratory QA elements within a functional QA program (development of the QA program and data quality objectives are not covered here)

  6. Private sector delivery of health services in developing countries: a mixed-methods study on quality assurance in social franchises

    Directory of Open Access Journals (Sweden)

    Schlein Karen

    2013-01-01

    Full Text Available Abstract Background Across the developing world health care services are most often delivered in the private sector and social franchising has emerged, over the past decade, as an increasingly popular method of private sector health care delivery. Social franchising aims to strengthen business practices through economies of scale: branding clinics and purchasing drugs in bulk at wholesale prices. While quality is one of the established goals of social franchising, there is no published documentation of how quality levels might be set in the context of franchised private providers, nor what quality assurance measures can or should exist within social franchises. The aim of this study was to better understand the quality assurance systems currently utilized in social franchises, and to determine if there are shared standards for practice or quality outcomes that exist across programs. Methods The study included three data sources and levels of investigation: (1) self-reported program data; (2) scoping telephone interviews; and (3) in-depth field interviews and clinic visits. Results Social franchises conceive of quality assurance not as an independent activity, but rather as a goal that is incorporated into all areas of franchise operations, including recruitment, training, monitoring of provider performance, monitoring of client experience and the provision of feedback. Conclusions These findings are the first evidence to support the 2002 conceptual model of social franchising which proposed that the assurance of quality was one of the three core goals of all social franchises. However, while quality is important to franchise programs, quality assurance systems overall are not reflective of the evidence to-date on quality measurement or quality improvement best practices. Future research in this area is needed to better understand the details of quality assurance systems as applied in social franchise programs, the process by which quality assurance

  7. Very accurate (definitive) methods by radiochemical NAA and their significance for quality assurance in trace analysis

    International Nuclear Information System (INIS)

    The idea of very accurate (definitive) methods by RNAA for the determination of individual trace elements in selected matrices is presented. The approach is based on the combination of neutron activation with selective and truly quantitative post-irradiation isolation of an indicator radionuclide by column chromatography, followed by high-resolution γ-ray spectrometric measurement. The method should, in principle, be a single-element method, so that all conditions can be optimized with respect to the determination of this particular element. The radiochemical separation scheme should assure separation of the analyte from practically all accompanying radionuclides, to provide interference-free γ-ray spectrometric measurement and achieve the best detection limits. The method should have some intrinsic mechanisms incorporated into the procedure to prevent any possibility of making gross errors. Several criteria were formulated which must be simultaneously fulfilled in order to acknowledge an analytical result as obtained by a definitive method. Such methods are not intended for routine measurements but rather for verifying the accuracy of other methods of analysis and for certification of candidate reference materials. The usefulness of such methods is illustrated with the example of Cd, and references are given to similar methods elaborated for the determination of several other elements (Co, Cu, Mo, Ni and U) in biological materials. (author)

  8. Transuranic waste characterization sampling and analysis methods manual. Revision 1

    International Nuclear Information System (INIS)

    This Methods Manual provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP) Transuranic (TRU) Waste Characterization Program (the Program) and the WIPP Waste Analysis Plan. This Methods Manual includes all of the testing, sampling, and analytical methodologies accepted by DOE for use in implementing the Program requirements specified in the QAPP and the WIPP Waste Analysis Plan. The procedures in this Methods Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site-specific procedures. With some analytical methods, such as Gas Chromatography/Mass Spectrometry, the Methods Manual procedures may be used directly. With other methods, such as nondestructive characterization, the Methods Manual provides guidance rather than a step-by-step procedure. Sites must meet all of the specified quality control requirements of the applicable procedure. Each DOE site must document the details of the procedures it will use and demonstrate the efficacy of such procedures to the Manager, National TRU Program Waste Characterization, during Waste Characterization and Certification audits

  9. Transuranic waste characterization sampling and analysis methods manual. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Suermann, J.F.

    1996-04-01

    This Methods Manual provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP) Transuranic (TRU) Waste Characterization Program (the Program) and the WIPP Waste Analysis Plan. This Methods Manual includes all of the testing, sampling, and analytical methodologies accepted by DOE for use in implementing the Program requirements specified in the QAPP and the WIPP Waste Analysis Plan. The procedures in this Methods Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site-specific procedures. With some analytical methods, such as Gas Chromatography/Mass Spectrometry, the Methods Manual procedures may be used directly. With other methods, such as nondestructive characterization, the Methods Manual provides guidance rather than a step-by-step procedure. Sites must meet all of the specified quality control requirements of the applicable procedure. Each DOE site must document the details of the procedures it will use and demonstrate the efficacy of such procedures to the Manager, National TRU Program Waste Characterization, during Waste Characterization and Certification audits.

  10. Statistical methods for quality assurance basics, measurement, control, capability, and improvement

    CERN Document Server

    Vardeman, Stephen B

    2016-01-01

    This undergraduate statistical quality assurance textbook clearly shows with real projects, cases and data sets how statistical quality control tools are used in practice. Among the topics covered is a practical evaluation of measurement effectiveness for both continuous and discrete data. Gauge Reproducibility and Repeatability methodology (including confidence intervals for Repeatability, Reproducibility and the Gauge Capability Ratio) is thoroughly developed. Process capability indices and corresponding confidence intervals are also explained. In addition to process monitoring techniques, experimental design and analysis for process improvement are carefully presented. Factorial and Fractional Factorial arrangements of treatments and Response Surface methods are covered. Integrated throughout the book are rich sets of examples and problems that help readers gain a better understanding of where and how to apply statistical quality control tools. These large and realistic problem sets in combination with the...

  11. Application of Sampling Methods to Geological Survey

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    There are two kinds of research methods in geological observation. One is remote-sensing observation. The other is the partial sampling method extensively used at every stage of geological work, for example, in arranging the lines and points of a geological survey and in arranging exploration engineering. Three problems may occur in the practical application of the sampling method: (1) Even though the partial sampling method is used, geological work still requires considerable labor, materials and money; is the method being used appropriate to the particular geological task? (2) How many samples or observation points are appropriate for the geological research?
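
    A common way to address the second question (how many observation points are enough) is a sample-size calculation for estimating a proportion. The sketch below uses the standard normal-approximation formula; the expected proportion and precision target are hypothetical, not taken from the record.

```python
import math

def sample_size_for_proportion(p_expected: float, margin: float, z: float = 1.96) -> int:
    """Normal-approximation sample size for estimating a proportion.

    p_expected : anticipated proportion of points showing the target feature (0-1)
    margin     : acceptable half-width of the confidence interval
    z          : z-score for the desired confidence level (1.96 ~ 95%)
    """
    n = (z ** 2) * p_expected * (1.0 - p_expected) / margin ** 2
    return math.ceil(n)

# Hypothetical survey: ~20% of observation points are expected to show the
# target feature, estimated to within +/- 5 percentage points at 95% confidence.
print(sample_size_for_proportion(0.20, 0.05))   # -> 246 observation points
```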

  12. Fluidics platform and method for sample preparation

    Energy Technology Data Exchange (ETDEWEB)

    Benner, Henry W.; Dzenitis, John M.

    2016-06-21

    Provided herein are fluidics platforms and related methods for performing integrated sample collection and solid-phase extraction of a target component of the sample all in one tube. The fluidics platform comprises a pump, particles for solid-phase extraction and a particle-holding means. The method comprises contacting the sample with one or more reagents in a pump, coupling a particle-holding means to the pump and expelling the waste out of the pump while the particle-holding means retains the particles inside the pump. The fluidics platform and methods herein described allow solid-phase extraction without pipetting and centrifugation.

  13. AOAC Official MethodSM Matrix Extension Validation Study of Assurance GDSTM for the Detection of Salmonella in Selected Spices.

    Science.gov (United States)

    Feldsine, Philip; Kaur, Mandeep; Shah, Khyati; Immerman, Amy; Jucker, Markus; Lienau, Andrew

    2015-01-01

    Assurance GDSTM for Salmonella Tq has been validated according to the AOAC INTERNATIONAL Methods Committee Guidelines for Validation of Microbiological Methods for Food and Environmental Surfaces for the detection of Salmonella in selected foods and on environmental surfaces (Official Method of AnalysisSM 2009.03, Performance Tested MethodSM No. 050602). The method also completed AFNOR validation (following the ISO 16140 standard) compared to the reference method EN ISO 6579. For AFNOR, GDS was given a scope covering all human food, animal feed stuff, and environmental surfaces (Certificate No. TRA02/12-01/09). Results showed that Assurance GDS for Salmonella (GDS) has high sensitivity and is equivalent to the reference culture methods for the detection of motile and non-motile Salmonella. As part of the aforementioned validations, inclusivity and exclusivity, stability, and ruggedness studies were also conducted. Assurance GDS has 100% inclusivity and exclusivity among the 100 Salmonella serovars and 35 non-Salmonella organisms analyzed. To add to the scope of the Assurance GDS for Salmonella method, a matrix extension study was conducted, following the AOAC guidelines, to validate the application of the method for selected spices, specifically curry powder, cumin powder, and chili powder, for the detection of Salmonella.

  14. Waveform sample method of excitable sensory neuron

    OpenAIRE

    Wang, Sheng-Jun; Xu, Xin-Jian; Wang, Ying-Hai

    2006-01-01

    We present a new interpretation of how the period of an input signal is encoded into spike trains in individual sensory neuronal systems. The spike train can be described as a waveform sample of the input signal that locks sample points to wave crests with some randomness. Based on simulations of the Hodgkin-Huxley (HH) neuron responding to periodic inputs, we demonstrate that this random sampling is a proper encoding method in the medium-frequency region, since power spectra of the reconst...
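
    As a loose illustration of the idea (not the Hodgkin-Huxley simulation of the record), the sketch below generates sample points locked to the crests of a periodic signal with random jitter and occasional missed cycles, the kind of "waveform sample" the abstract describes. All numerical values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

period = 1.0                       # s, period of the input signal (assumed)
t = np.arange(0.0, 50.0, 1e-3)     # dense time grid
signal = np.sin(2 * np.pi * t / period)

# One "spike" per cycle near the wave crest, with random jitter and ~20% of
# cycles skipped -- a crude stand-in for crest-locked random sampling.
crest_times = np.arange(period / 4, t[-1], period)        # crests of sin()
jitter = rng.normal(scale=0.05 * period, size=crest_times.size)
keep = rng.random(crest_times.size) > 0.2
spike_times = np.sort(crest_times[keep] + jitter[keep])

# The spike train "samples" the waveform at those instants.
samples = np.interp(spike_times, t, signal)
print(f"{spike_times.size} spikes, mean sampled amplitude = {samples.mean():.2f}")
```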

  15. Sample preparation method for scanning force microscopy

    CERN Document Server

    Jankov, I R; Szente, R N; Carreno, M N P; Swart, J W; Landers, R

    2001-01-01

    We present a method of sample preparation for studies of ion implantation on metal surfaces. The method, employing a mechanical mask, is specially adapted for samples analysed by Scanning Force Microscopy. It was successfully tested on polycrystalline copper substrates implanted with phosphorus ions at an acceleration voltage of 39 keV. The changes of the electrical properties of the surface were measured by Kelvin Probe Force Microscopy and the surface composition was analysed by Auger Electron Spectroscopy.

  16. Assessing Local Risk of Rifampicin-Resistant Tuberculosis in KwaZulu-Natal, South Africa Using Lot Quality Assurance Sampling.

    Directory of Open Access Journals (Sweden)

    Christine L Heidebrecht

    Full Text Available KwaZulu-Natal (KZN) has the highest burden of notified multidrug-resistant tuberculosis (MDR TB) and extensively drug-resistant (XDR) TB cases in South Africa. A better understanding of spatial heterogeneity in the risk of drug resistance may help to prioritize local responses. Between July 2012 and June 2013, we conducted a two-way Lot Quality Assurance Sampling (LQAS) study to classify the burden of rifampicin (RIF)-resistant TB among incident TB cases notified within the catchment areas of seven laboratories in two northern and one southern district of KZN. Decision rules for classification of areas as having either a high or low risk of RIF-resistant TB (based on the proportion of RIF resistance among all TB cases) were based on consultation with local policy makers. We classified five areas as high-risk and two as low-risk. High-risk areas were identified in both southern and northern districts, with the greatest proportion of RIF resistance observed in the northernmost area, the Manguzi community situated on the Mozambique border. Our study revealed heterogeneity in the risk of RIF-resistant disease among incident TB cases in KZN. This study demonstrates the potential for LQAS to detect geographic heterogeneity in areas where access to drug susceptibility testing is limited.

  17. GLYCOHEMOGLOBIN - COMPARISON OF 12 ANALYTICAL METHODS, APPLIED TO LYOPHILIZED HEMOLYSATES BY 101 LABORATORIES IN AN EXTERNAL QUALITY ASSURANCE PROGRAM

    NARCIS (Netherlands)

    WEYKAMP, CW; PENDERS, TJ; MUSKIET, FAJ; VANDERSLIK, W

    1993-01-01

    Stable lyophilized ethylenediaminetetra-acetic acid (EDTA)-blood haemolysates were applied in an external quality assurance programme (SKZL, The Netherlands) for glycohaemoglobin assays in 101 laboratories using 12 methods. The mean intralaboratory day-to-day coefficient of variation (CV), calculate

  18. Sampling Procedures for Coordinating Stratified Samples : Methods Based on Microstrata

    OpenAIRE

    Nedyalkova, Desislava; Tillé, Yves; Pea, Johan

    2016-01-01

    The aim of sampling coordination is to maximize or minimize the overlap between several samples drawn successively in a population that changes over time. Therefore, the selection of a new sample will depend on the samples previously drawn. In order to obtain a larger (or smaller) overlap of the samples than the one obtained by independent selection of samples, a dependence between the samples must be introduced. This dependence will emphasize (or limit) the number of common units in the selec...
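
    The microstrata method of the record is not reproduced here, but the dependence it introduces between successive samples can be illustrated with the simpler permanent-random-number (PRN) approach: each unit keeps one random number across waves, and selecting the units with the smallest PRNs in every wave maximises overlap. The population size, sample size and attrition rate below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

N, n = 1000, 100                       # hypothetical population and sample size
prn = rng.random(N)                    # permanent random number per unit

# Wave 1: take the n units with the smallest PRNs.
wave1 = set(np.argsort(prn)[:n])

# Wave 2: some units leave the population; survivors keep their PRN, so
# re-selecting the smallest PRNs maximises overlap (positive coordination).
# Sorting on 1 - prn instead would minimise overlap (negative coordination).
alive = rng.random(N) > 0.05           # ~5% attrition (assumed)
order = np.argsort(np.where(alive, prn, np.inf))
wave2 = set(order[:n])

print("common units across the two waves:", len(wave1 & wave2), "of", n)
```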

  19. Are patent medicine vendors effective agents in malaria control? Using lot quality assurance sampling to assess quality of practice in Jigawa, Nigeria.

    Directory of Open Access Journals (Sweden)

    Sima Berendes

    Full Text Available BACKGROUND: Patent medicine vendors (PMV) provide antimalarial treatment and care throughout Sub-Saharan Africa, and can play an important role in the fight against malaria. Their close-to-client infrastructure could enable lifesaving artemisinin-based combination therapy (ACT) to reach patients in time. However, systematic assessments of drug sellers' performance quality are crucial if their role is to be managed within the health system. Lot quality assurance sampling (LQAS) could be an efficient method to monitor and evaluate PMV practice, but has so far never been used for this purpose. METHODS: In support of the Nigeria Malaria Booster Program we assessed PMV practices in three Senatorial Districts (SDs) of Jigawa, Nigeria. A two-stage LQAS assessed whether at least 80% of PMV stores in SDs used national treatment guidelines. Acceptable sampling errors were set in consultation with government officials (alpha and beta <0.10). The hypergeometric formula determined sample sizes and cut-off values for SDs. A structured assessment tool identified high and low performing SDs for quality of care indicators. FINDINGS: Drug vendors performed poorly in all SDs of Jigawa for all indicators. For example, all SDs failed for stocking and selling first-line antimalarials. PMV sold no longer recommended antimalarials, such as Chloroquine, Sulfadoxine-Pyrimethamine and oral Artesunate monotherapy. Most PMV were ignorant of and lacked training about new treatment guidelines that had endorsed ACTs as first-line treatment for uncomplicated malaria. CONCLUSION: There is urgent need to regularly monitor and improve the availability and quality of malaria treatment provided by medicine sellers in Nigeria; the irrational use of antimalarials in the ACT era revealed in this study bears a high risk of economic loss, death and development of drug resistance. LQAS has been shown to be a suitable method for monitoring malaria-related indicators among PMV, and should be
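
    The design step described above (hypergeometric formula, 80% threshold, alpha and beta below 0.10) can be sketched as a search for the smallest sample size and decision value meeting both error constraints. The lot size of 500 shops and the 50% lower threshold below are assumptions for illustration; the record does not give them.

```python
from scipy.stats import hypergeom

def lqas_design(N, p_upper, p_lower, alpha_max=0.10, beta_max=0.10, n_max=200):
    """Smallest (n, d) such that a lot is classified as meeting the standard
    when more than d compliant shops are found among n sampled, with
      alpha = P(fail a lot whose true proportion is p_upper) < alpha_max
      beta  = P(pass a lot whose true proportion is p_lower) < beta_max
    """
    K_upper = int(round(N * p_upper))   # compliant shops in a 'good' lot
    K_lower = int(round(N * p_lower))   # compliant shops in a 'bad' lot
    for n in range(1, n_max + 1):
        for d in range(n):
            alpha = hypergeom.cdf(d, N, K_upper, n)       # good lot fails
            beta = 1.0 - hypergeom.cdf(d, N, K_lower, n)  # bad lot passes
            if alpha < alpha_max and beta < beta_max:
                return n, d, round(alpha, 3), round(beta, 3)
    return None

# Hypothetical lot of 500 patent-medicine shops, 80% vs 50% thresholds.
print(lqas_design(N=500, p_upper=0.80, p_lower=0.50))
```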

  20. Countdown to 2015: Tracking Maternal and Child Health Intervention Targets Using Lot Quality Assurance Sampling in Bauchi State Nigeria.

    Directory of Open Access Journals (Sweden)

    Dele Abegunde

    Full Text Available Improving maternal and child health remains a top priority in Nigeria's Bauchi State in the northeastern region, where the maternal mortality ratio (MMR) and infant mortality rate (IMR) are as high as 1540 per 100,000 live births and 78 per 1,000 live births respectively. In this study, we used the framework of the continuum of maternal and child care to evaluate the impact of interventions in Bauchi State focused on improved maternal and child health, and to ascertain progress towards the achievement of Millennium Development Goals (MDGs) 4 and 5. At baseline (2012) and then at follow-up (2013), we randomly sampled 340 households from 19 random locations in each of the 20 Local Government Areas (LGA) of Bauchi State in Northern Nigeria, using the Lot Quality Assurance Sampling (LQAS) technique. Women residents in the households were interviewed about their own health and that of their children. Estimated LGA coverage of maternal and child health indicators was aggregated across the State. These values were then compared to the national figures, and the differences from 2012 to 2014 were calculated. For several of the indicators, a modest improvement from baseline was found. However, the indicators in the continuum of care neither reached the national average nor attained the 90% globally recommended coverage level. The majority of the LGA surveyed were classifiable as high priority, thus requiring intensified efforts and programmatic scale up. Intensive scale-up of programs and interventions is needed in Bauchi State, Northern Nigeria, to accelerate, consolidate and sustain the modest but significant achievements in the continuum of care, if MDGs 4 and 5 are to be achieved by the end of 2015. The intentional focus of LGAs as the unit of intervention ought to be considered a condition precedent for future investments. Priority should be given to re-allocating resources to program areas and regions where coverage has been low. Finally, systematic

  1. Bayesian individualization via sampling-based methods.

    Science.gov (United States)

    Wakefield, J

    1996-02-01

    We consider the situation where we wish to adjust the dosage regimen of a patient based on (in general) sparse concentration measurements taken on-line. A Bayesian decision theory approach is taken which requires the specification of an appropriate prior distribution and loss function. A simple method for obtaining samples from the posterior distribution of the pharmacokinetic parameters of the patient is described. In general, these samples are used to obtain a Monte Carlo estimate of the expected loss which is then minimized with respect to the dosage regimen. Some special cases which yield analytic solutions are described. When the prior distribution is based on a population analysis then a method of accounting for the uncertainty in the population parameters is described. Two simulation studies showing how the methods work in practice are presented. PMID:8827585
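
    A minimal sketch of the decision step described above: draw samples of the pharmacokinetic parameters, estimate the expected loss for each candidate dosage by Monte Carlo, and choose the minimiser. The log-normal "posterior", the one-compartment IV bolus model and the quadratic loss are stand-in assumptions, not the paper's patient model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in posterior samples of clearance CL (L/h) and volume V (L); in practice
# these would come from a Bayesian update of a population prior with the
# patient's sparse concentration measurements.
n_samp = 5000
CL = rng.lognormal(mean=np.log(5.0), sigma=0.3, size=n_samp)
V = rng.lognormal(mean=np.log(40.0), sigma=0.2, size=n_samp)

t_eval = 12.0        # h, time at which the target should be reached (assumed)
c_target = 10.0      # mg/L, hypothetical target concentration

def expected_loss(dose_mg: float) -> float:
    """Monte Carlo estimate of E[(C(t) - target)^2] under the parameter samples,
    using a one-compartment IV bolus model C(t) = dose/V * exp(-CL/V * t)."""
    conc = dose_mg / V * np.exp(-CL / V * t_eval)
    return float(np.mean((conc - c_target) ** 2))

doses = np.arange(100.0, 2001.0, 25.0)          # candidate doses, mg
best = doses[int(np.argmin([expected_loss(d) for d in doses]))]
print(f"dose minimising the expected loss: {best:.0f} mg")
```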

  2. Alternate calibration method of radiochromic EBT3 film for quality assurance verification of clinical radiotherapy treatments

    Science.gov (United States)

    Park, Soah; Kang, Sei-Kwon; Cheong, Kwang-Ho; Hwang, Taejin; Yoon, Jai-Woong; Koo, Taeryool; Han, Tae Jin; Kim, Haeyoung; Lee, Me Yeon; Bae, Hoonsik; Kim, Kyoung Ju

    2016-07-01

    EBT3 film is utilized as a dosimetry quality assurance tool for the verification of clinical radiotherapy treatments. In this work, we suggest a percentage-depth-dose (PDD) calibration method that can calibrate several EBT3 film pieces together at different dose levels, because photon beams provide different dose levels at different depths along the axis of the beam. We investigated the feasibility of the film PDD calibration method based on PDD data and compared the results with those from the traditional film calibration method. Photon beams at 6 MV were delivered to EBT3 film pieces for both calibration methods. For the PDD-based calibration, the film pieces were placed on solid phantoms at the depth of maximum dose (dmax) and at depths of 3, 5, 8, 12, 17, and 22 cm, and a photon beam was delivered twice, at 100 cGy and 400 cGy, to extend the calibration dose range under the same conditions. To maintain consistency, fourteen film pieces were irradiated at doses ranging from approximately 30 to 400 cGy for both film calibrations. The film pieces were located at the center position on the scan bed of an Epson 1680 flatbed scanner in the parallel direction. Intensity-modulated radiation therapy (IMRT) plans were created, and their dose distributions were delivered to the film. The dose distributions for the traditional method and those for the PDD-based calibration method were evaluated using a Gamma analysis. The PDD dose values measured at the depths of interest using a CC13 ion chamber and those obtained using an FC65-G Farmer chamber were very similar. With an objective test criterion of 1% dose agreement at 1 mm, the passing rates for the four cases of the three IMRT plans were essentially identical. The traditional and the PDD-based calibrations provided similar plan verification results. We also describe another alternative for calibrating EBT3 films, i.e., a PDD-based calibration method that provides an easy and time-saving approach
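
    The arithmetic behind the PDD-based calibration is simply that the dose absorbed by a film piece at depth d equals the dose at dmax scaled by PDD(d)/100, so one 100 cGy and one 400 cGy delivery yield many calibration dose levels at once. The PDD values in the sketch are hypothetical placeholders, not the measured data of the study.

```python
# Hypothetical 6-MV percentage-depth-dose values (%) at the depths listed in
# the abstract; a real calibration would use PDDs measured with an ion chamber.
pdd = {1.5: 100.0, 3: 94.0, 5: 86.0, 8: 74.0, 12: 62.0, 17: 50.0, 22: 41.0}

dmax_doses_cGy = [100.0, 400.0]   # the two deliveries described in the abstract

calibration_points = sorted(
    (depth, dmax_dose * pdd_pct / 100.0)
    for depth, pdd_pct in pdd.items()
    for dmax_dose in dmax_doses_cGy
)

for depth, dose in calibration_points:
    print(f"depth {depth:>4} cm -> calibration dose {dose:6.1f} cGy")
```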

  3. Are Patent Medicine Vendors Effective Agents in Malaria Control? Using Lot Quality Assurance Sampling to Assess Quality of Practice in Jigawa, Nigeria

    OpenAIRE

    Sima Berendes; Olusegun Adeyemi; Edward Adekola Oladele; Olusola Bukola Oresanya; Festus Okoh; Joseph J Valadez

    2012-01-01

    BACKGROUND: Patent medicine vendors (PMV) provide antimalarial treatment and care throughout Sub-Saharan Africa, and can play an important role in the fight against malaria. Their close-to-client infrastructure could enable lifesaving artemisinin-based combination therapy (ACT) to reach patients in time. However, systematic assessments of drug sellers' performance quality are crucial if their role is to be managed within the health system. Lot quality assurance sampling (LQAS) could be an eff...

  4. A Quality Assurance Method that Utilizes 3D Dosimetry and Facilitates Clinical Interpretation

    Energy Technology Data Exchange (ETDEWEB)

    Oldham, Mark, E-mail: mark.oldham@duke.edu [Radiation Oncology, Duke University Medical Center, Durham, North Carolina (United States); Thomas, Andrew; O' Daniel, Jennifer; Juang, Titania [Radiation Oncology, Duke University Medical Center, Durham, North Carolina (United States); Ibbott, Geoffrey [University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Adamovics, John [Rider University, Lawrenceville, New Jersey (United States); Kirkpatrick, John P. [Radiation Oncology, Duke University Medical Center, Durham, North Carolina (United States)

    2012-10-01

    Purpose: To demonstrate a new three-dimensional (3D) quality assurance (QA) method that provides comprehensive dosimetry verification and facilitates evaluation of the clinical significance of QA data acquired in a phantom. Also to apply the method to investigate the dosimetric efficacy of base-of-skull (BOS) intensity-modulated radiotherapy (IMRT) treatment. Methods and Materials: Two types of IMRT QA verification plans were created for 6 patients who received BOS IMRT. The first plan enabled conventional 2D planar IMRT QA using the Varian portal dosimetry system. The second plan enabled 3D verification using an anthropomorphic head phantom. In the latter, the 3D dose distribution was measured using the DLOS/Presage dosimetry system (DLOS = Duke Large-field-of-view Optical-CT System, Presage Heuris Pharma, Skillman, NJ), which yielded isotropic 2-mm data throughout the treated volume. In a novel step, measured 3D dose distributions were transformed back to the patient's CT to enable calculation of dose-volume histograms (DVH) and dose overlays. Measured and planned patient DVHs were compared to investigate clinical significance. Results: Close agreement between measured and calculated dose distributions was observed for all 6 cases. For gamma criteria of 3%, 2 mm, the mean passing rate for portal dosimetry was 96.8% (range, 92.0%-98.9%), compared to 94.9% (range, 90.1%-98.9%) for 3D. There was no clear correlation between 2D and 3D passing rates. Planned and measured dose distributions were evaluated on the patient's anatomy, using DVH and dose overlays. Minor deviations were detected, and the clinical significance of these are presented and discussed. Conclusions: Two advantages accrue to the methods presented here. First, treatment accuracy is evaluated throughout the whole treated volume, yielding comprehensive verification. Second, the clinical significance of any deviations can be assessed through the generation of DVH curves and dose overlays on
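
    The 3%, 2 mm comparison reported above is a gamma analysis; a minimal one-dimensional sketch of the global gamma index and passing rate is shown below (hypothetical profiles, not the DLOS/Presage pipeline of the study).

```python
import numpy as np

def gamma_1d(ref_dose, eval_dose, positions, dose_crit=0.03, dta_mm=2.0):
    """Global 1D gamma index: for each reference point, the minimum over all
    evaluated points of sqrt((dose diff / criterion)^2 + (distance / DTA)^2),
    with the dose difference normalised to the reference maximum."""
    d_norm = dose_crit * ref_dose.max()
    gamma = np.empty_like(ref_dose)
    for i, (x_r, d_r) in enumerate(zip(positions, ref_dose)):
        dd = (eval_dose - d_r) / d_norm
        dx = (positions - x_r) / dta_mm
        gamma[i] = np.sqrt(dd ** 2 + dx ** 2).min()
    return gamma

# Hypothetical planned vs measured profiles on a 1 mm grid.
x = np.arange(-50.0, 50.0, 1.0)                        # mm
planned = 200.0 * np.exp(-(x / 25.0) ** 2)             # cGy
measured = 1.02 * 200.0 * np.exp(-((x - 0.5) / 25.0) ** 2)

g = gamma_1d(planned, measured, x)
print(f"gamma passing rate (3%/2 mm): {100.0 * np.mean(g <= 1.0):.1f}%")
```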

  5. Guidelines for the processing and quality assurance of benthic invertebrate samples collected as part of the National Water-Quality Assessment Program

    Science.gov (United States)

    Cuffney, T.F.; Gurtz, M.E.; Meador, M.R.

    1993-01-01

    Benthic invertebrate samples are collected as part of the U.S. Geological Survey's National Water-Quality Assessment Program. This is a perennial, multidisciplinary program that integrates biological, physical, and chemical indicators of water quality to evaluate status and trends and to develop an understanding of the factors controlling observed water quality. The Program examines water quality in 60 study units (coupled ground- and surface-water systems) that encompass most of the conterminous United States and parts of Alaska and Hawaii. Study-unit teams collect and process qualitative and semi-quantitative invertebrate samples according to standardized procedures. These samples are processed (elutriated and subsampled) in the field to produce as many as four sample components: large-rare, main-body, elutriate, and split. Each sample component is preserved in 10-percent formalin, and two components, large-rare and main-body, are sent to contract laboratories for further processing. The large-rare component is composed of large invertebrates that are removed from the sample matrix during field processing and placed in one or more containers. The main-body sample component consists of the remaining sample materials (sediment, detritus, and invertebrates) and is subsampled in the field to achieve a volume of 750 milliliters or less. The remaining two sample components, elutriate and split, are used for quality-assurance and quality-control purposes. Contract laboratories are used to identify and quantify invertebrates from the large-rare and main-body sample components according to the procedures and guidelines specified within this document. These guidelines allow the use of subsampling techniques to reduce the volume of sample material processed and to facilitate identifications. These processing procedures and techniques may be modified if the modifications provide equal or greater levels of accuracy and precision. The intent of sample processing is to

  6. Methods, quality assurance, and data for assessing atmospheric deposition of pesticides in the Central Valley of California

    Science.gov (United States)

    Zamora, Celia; Majewski, Michael S.; Foreman, William T.

    2013-01-01

    The U.S. Geological Survey monitored atmospheric deposition of pesticides in the Central Valley of California during two studies in 2001 and 2002–04. The 2001 study sampled wet deposition (rain) and storm-drain runoff in the Modesto, California, area during the orchard dormant-spray season to examine the contribution of pesticide concentrations to storm runoff from rainfall. In the 2002–04 study, the number and extent of collection sites in the Central Valley were increased to determine the areal distribution of organophosphate insecticides and other pesticides, and also five more sample types were collected. These were dry deposition, bulk deposition, and three sample types collected from a soil box: aqueous phase in runoff, suspended sediment in runoff, and surficial-soil samples. This report provides concentration data and describes methods and quality assurance of sample collection and laboratory analysis for pesticide compounds in all samples collected from 16 sites. Each sample was analyzed for 41 currently used pesticides and 23 pesticide degradates, including oxygen analogs (oxons) of 9 organophosphate insecticides. Analytical results are presented by sample type and study period. The median concentrations of both chlorpyrifos and diazinon sampled at four urban (0.067 micrograms per liter [μg/L] and 0.515 μg/L, respectively) and four agricultural sites (0.079 μg/L and 0.583 μg/L, respectively) during a January 2001 storm event in and around Modesto, Calif., were nearly identical, indicating that the overall atmospheric burden in the region appeared to be fairly similar during the sampling event. Comparisons of median concentrations in the rainfall to those in the McHenry storm-drain runoff showed that, for some compounds, rainfall contributed a substantial percentage of the concentration in the runoff; for other compounds, the concentrations in rainfall were much greater than in the runoff. For example, diazinon concentrations in rainfall were about

  7. Tracking the quality of care for sick children using lot quality assurance sampling: targeting improvements of health services in Jigawa, Nigeria.

    Directory of Open Access Journals (Sweden)

    Edward Adekola Oladele

    Full Text Available BACKGROUND: In Nigeria, 30% of child deaths are due to malaria. The National Malaria Control Program of Nigeria (NMCP) during 2009 initiated a program to improve the quality of paediatric malaria services delivered in health facilities (HF). This study reports a rapid approach used to assess the existing quality of services in Jigawa state at decentralised levels of the health system. METHODS: NMCP selected Lot Quality Assurance Sampling (LQAS) to identify the variation in HF service quality among Senatorial Districts (SD). LQAS was selected because it was affordable and could be used by local health workers (HW) in a population-based survey. NMCP applied a 2-stage LQAS using a structured Rapid Health Facility Assessment (R-HFA) tool to identify high and low performing SD for specified indicators. FINDINGS: LQAS identified variations in HF performance (n = 21) and enabled resources to be targeted to address priorities. All SD exhibited deficient essential services, supplies and equipment. Only 9.7% of HF had Artemisinin-based Combination Therapies and other first-line treatments for childhood illnesses. No SD and few HF exhibited adequate HW performance for the assessment, treatment or counselling of sick children. Using the IMCI algorithm, 17.5% of HW assessed the child's vaccination status, 46.8% assessed nutritional status, and 65.1% assessed children for dehydration. Only 5.1% of HW treatments were appropriate for the assessment. Exit interviews revealed that 5.1% of caregivers knew their children's illness, and only 19.9% could accurately describe how to administer the prescribed drug. CONCLUSION: This R-HFA, using LQAS principles, is a rapid, simple tool for assessing malaria services and can be used at scale. It identified technical deficiencies that could be corrected by improved continuing medical education, targeted supervision, and recurrent R-HFA assessments of the quality of services.

  8. Selected quality assurance data for water samples collected by the US Geological Survey, Idaho National Engineering Laboratory, Idaho, 1980 to 1988

    International Nuclear Information System (INIS)

    Multiple water samples from 115 wells and 3 surface-water sites were collected between 1980 and 1988 for the ongoing quality assurance program at the Idaho National Engineering Laboratory. The reported results from the six laboratories involved were analyzed for agreement using descriptive statistics. The analytical constituents and properties included: tritium, plutonium-238, plutonium-239, -240 (undivided), strontium-90, americium-241, cesium-137, total dissolved chromium, selected dissolved trace metals, sodium, chloride, nitrate, selected purgeable organic compounds, and specific conductance. Agreement could not be calculated for trace metals, some nitrates, purgeable organic compounds, and blank-sample analyses because analytical uncertainties were not consistently reported. However, differences between results for most of these data were calculated. The blank samples were not analyzed for difference. The laboratory results analyzed using descriptive statistics showed a median agreement of 95 percent between all useable data pairs. 13 refs., 4 figs., 24 tabs
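
    The report's exact agreement criterion is not spelled out in this summary; a commonly used rule compares the difference between two laboratories' results with their combined reported uncertainty. The sketch below uses that assumed rule and invented tritium values purely for illustration.

```python
import math

def results_agree(x1, u1, x2, u2, k: float = 2.0) -> bool:
    """Assumed criterion: two results agree if their difference is within
    k times the combined standard uncertainty."""
    return abs(x1 - x2) <= k * math.sqrt(u1 ** 2 + u2 ** 2)

# Hypothetical paired tritium results from two laboratories, pCi/L, 1-sigma.
pairs = [
    (12300.0, 400.0, 11900.0, 350.0),
    (560.0, 90.0, 820.0, 70.0),
]
agreements = [results_agree(*p) for p in pairs]
print(f"agreement: {100.0 * sum(agreements) / len(agreements):.0f}% of pairs")
```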

  9. CC-Case as an Integrated Method of Security Analysis and Assurance over Life-cycle Process

    Directory of Open Access Journals (Sweden)

    Tomoko Kaneko

    2015-05-01

    Full Text Available Secure system design faces many risks, such as information leakage and denial of service. We propose a method named CC-Case to describe security assurance cases based on security structures and threat analysis. CC-Case uses the Common Criteria (ISO/IEC 15408). While the scope of CC-Case mainly focuses on the requirement stage, CC-Case can handle the life-cycle process of system design, which contains the requirement, design, implementation, test and maintenance stages. It also makes it easier to take countermeasures against situations in which unexpected new threats are incessantly produced by unseen attackers.

  10. Methods, quality assurance, and data for assessing atmospheric deposition of pesticides in the Central Valley of California

    Science.gov (United States)

    Zamora, Celia; Majewski, Michael S.; Foreman, William T.

    2013-01-01

    The U.S. Geological Survey monitored atmospheric deposition of pesticides in the Central Valley of California during two studies in 2001 and 2002–04. The 2001 study sampled wet deposition (rain) and storm-drain runoff in the Modesto, California, area during the orchard dormant-spray season to examine the contribution of pesticide concentrations to storm runoff from rainfall. In the 2002–04 study, the number and extent of collection sites in the Central Valley were increased to determine the areal distribution of organophosphate insecticides and other pesticides, and also five more sample types were collected. These were dry deposition, bulk deposition, and three sample types collected from a soil box: aqueous phase in runoff, suspended sediment in runoff, and surficial-soil samples. This report provides concentration data and describes methods and quality assurance of sample collection and laboratory analysis for pesticide compounds in all samples collected from 16 sites. Each sample was analyzed for 41 currently used pesticides and 23 pesticide degradates, including oxygen analogs (oxons) of 9 organophosphate insecticides. Analytical results are presented by sample type and study period. The median concentrations of both chlorpyrifos and diazinon sampled at four urban (0.067 micrograms per liter [μg/L] and 0.515 μg/L, respectively) and four agricultural sites (0.079 μg/L and 0.583 μg/L, respectively) during a January 2001 storm event in and around Modesto, Calif., were nearly identical, indicating that the overall atmospheric burden in the region appeared to be fairly similar during the sampling event. Comparisons of median concentrations in the rainfall to those in the McHenry storm-drain runoff showed that, for some compounds, rainfall contributed a substantial percentage of the concentration in the runoff; for other compounds, the concentrations in rainfall were much greater than in the runoff. For example, diazinon concentrations in rainfall were about

  11. Clustered lot quality assurance sampling: a tool to monitor immunization coverage rapidly during a national yellow fever and polio vaccination campaign in Cameroon, May 2009.

    Science.gov (United States)

    Pezzoli, L; Tchio, R; Dzossa, A D; Ndjomo, S; Takeu, A; Anya, B; Ticha, J; Ronveaux, O; Lewis, R F

    2012-01-01

    We used the clustered lot quality assurance sampling (clustered-LQAS) technique to identify districts with low immunization coverage and guide mop-up actions during the last 4 days of a combined oral polio vaccine (OPV) and yellow fever (YF) vaccination campaign conducted in Cameroon in May 2009. We monitored 17 pre-selected districts at risk for low coverage. We designed LQAS plans to reject districts with YF vaccination coverage <90% and with OPV coverage <95%. In each lot the sample size was 50 (five clusters of 10) with decision values of 3 for assessing OPV and 7 for YF coverage. We 'rejected' 10 districts for low YF coverage and 14 for low OPV coverage. Hence we recommended a 2-day extension of the campaign. Clustered-LQAS proved to be useful in guiding the campaign vaccination strategy before the completion of the operations.
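
    Applying the decision rules quoted above comes down to counting unvaccinated children in each lot of 50 (five clusters of 10) and comparing that count with the decision value. The reading used below (a lot fails when the count of unvaccinated children exceeds the decision value) and the simulated coverages are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

DECISION_VALUES = {"OPV": 3, "YF": 7}    # from the abstract; lot size n = 50

def classify_lot(unvaccinated_count: int, antigen: str) -> str:
    """Assumed LQAS reading: the district needs mop-up for an antigen if more
    than the decision value of unvaccinated children are found among 50."""
    d = DECISION_VALUES[antigen]
    return "low coverage (mop-up)" if unvaccinated_count > d else "acceptable"

# Simulated lot: five clusters of ten children, True = vaccinated.
true_coverage = {"OPV": 0.92, "YF": 0.85}   # hypothetical district coverages
for antigen, cov in true_coverage.items():
    lot = rng.random((5, 10)) < cov
    unvaccinated = int((~lot).sum())
    print(antigen, unvaccinated, "unvaccinated ->", classify_lot(unvaccinated, antigen))
```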

  12. Creating Quality Assurance and International Transparency for Quality Assurance Agencies

    DEFF Research Database (Denmark)

    Kristoffersen, Dorte; Lindeberg, Tobias

    2004-01-01

    The paper presents the experiences gained in the pilot project on mutual recognition conducted by the quality assurance agencies in the Nordic countries and the future perspective for international quality assurance of national quality assurance agencies. The background of the project was the need, on the one hand, to advance internationalisation of quality assurance of higher education and, on the other hand, to allow for the differences in the national approaches to quality assurance. The paper will focus on two issues: first, the strength and weaknesses of the method employed and of the use of the ENQA...

  13. Creating quality assurance and international transparency for quality assurance agencies

    DEFF Research Database (Denmark)

    Lindeberg, Tobias Høygaard; Kristoffersen, Dorte

    2004-01-01

    The paper presents the experiences gained in the pilot project on mutual recognition conducted by the quality assurance agencies in the Nordic countries and the future perspective for international quality assurance of national quality assurance agencies. The background of the project was the need, on the one hand, to advance internationalisation of quality assurance of higher education and, on the other hand, to allow for the differences in the national approaches to quality assurance. The paper will focus on two issues: first, the strength and weaknesses of the method employed and of the use of the ENQA...

  14. Optimisation of small-scale hydropower using quality assurance methods - Preliminary project; Vorprojekt: Optimierung von Kleinwasserkraftwerken durch Qualitaetssicherung. Programm Kleinwasserkraftwerke

    Energy Technology Data Exchange (ETDEWEB)

    Hofer, S.; Staubli, T.

    2006-11-15

    This comprehensive final report for the Swiss Federal Office of Energy (SFOE) presents the results of a preliminary project that examined how quality assurance methods can be used in the optimisation of small-scale hydropower projects. The aim of the project, to use existing know-how, experience and synergies, is examined. Discrepancies in quality and their effects on production prices were determined in interviews. The paper describes best-practice guidelines for the quality assurance of small-scale hydro schemes. A flow chart describes the various steps that have to be taken in the project and realisation work. Information collected from planners and from interviews made with them are presented along with further information obtained from literature. The results of interviews concerning planning work, putting to tender and the construction stages of these hydro schemes are presented and commented on. Similarly, the operational phase of such power plant is also examined, including questions on operation and guarantees. The aims of the follow-up main project - the definition of a tool and guidelines for ensuring quality - are briefly reviewed.

  15. Data quality assurance in monitoring of wastewater quality: Univariate on-line and off-line methods

    DEFF Research Database (Denmark)

    Alferes, J.; Poirier, P.; Lamaire-Chad, C.;

    To make water quality monitoring networks useful for practice, the automation of data collection and data validation still represents an important challenge. Efficient monitoring depends on careful quality control and quality assessment. With a practical orientation, a data quality assurance procedure is presented that combines univariate off-line and on-line methods to assess water quality sensors and to detect and replace doubtful data. While the off-line concept uses control charts for quality control, the on-line methods aim at outlier and fault detection by using autoregressive models. The proposed tools were successfully tested with data sets collected at the inlet of a primary clarifier, where probably the toughest measurement conditions are found in wastewater treatment plants.
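
    A minimal univariate sketch in the spirit of the procedure described above: Shewhart-style control limits estimated from historical (off-line) data, plus on-line flagging of observations whose one-step-ahead AR(1) prediction residual is too large. The AR(1) choice, the 3-sigma limits and the simulated sensor data are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Historical sensor data (off-line phase): estimate control limits.
hist = 50.0 + rng.normal(0.0, 2.0, size=500)
center, sigma = hist.mean(), hist.std(ddof=1)
ucl, lcl = center + 3.0 * sigma, center - 3.0 * sigma      # Shewhart limits

# Fit a simple AR(1) model x_t = mu + phi*(x_{t-1} - mu) + e_t to the history.
x0, x1 = hist[:-1] - center, hist[1:] - center
phi = float(np.dot(x0, x1) / np.dot(x0, x0))
resid_sigma = float(np.std(x1 - phi * x0, ddof=1))

def check_online(prev: float, new: float) -> list:
    """Flag a new on-line observation against both criteria."""
    flags = []
    if not (lcl <= new <= ucl):
        flags.append("outside control limits")
    pred = center + phi * (prev - center)
    if abs(new - pred) > 3.0 * resid_sigma:
        flags.append("AR(1) residual outlier")
    return flags

print(check_online(prev=51.0, new=49.5))   # plausible value -> no flags
print(check_online(prev=51.0, new=70.0))   # doubtful value -> both flags
```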

  16. Transuranic Waste Characterization Quality Assurance Program Plan

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-04-30

    This quality assurance plan identifies the data necessary, and techniques designed to attain the required quality, to meet the specific data quality objectives associated with the DOE Waste Isolation Pilot Plant (WIPP). This report specifies sampling, waste testing, and analytical methods for transuranic wastes.

  17. Flow assurance

    Energy Technology Data Exchange (ETDEWEB)

    Mullins, O.C.; Dong, C. [Schlumberger-Doll Research Center, Cambridge, MA (United States); Elshahawi, H. [Shell Exploration and Production Company, The Hague (Netherlands)

    2008-07-01

    This study emphasized the need for considering flow assurance for producing oil and gas, particularly in high cost areas such as deepwater. Phase behaviour studies, sticking propensities, and interfacial interactions have been investigated in many laboratory studies using asphaltenes, wax, hydrates, organic and inorganic scale, and even diamondoids. However, the spatial variation of reservoir fluids has received little attention, despite the fact that it is one of the most important factors affecting flow assurance. This issue was difficult to address in a systematic way in the past because of cost constraints. Today, reservoir fluid variation and flow assurance can be considered at the outset of a project given the technological advances in downhole fluid analysis. This study described the origins of reservoir fluid compositional variations and the controversies surrounding them. It also described the indispensable chemical analytical technology. The impact of these reservoir fluid compositional variations on flow assurance considerations was also discussed. A methodology that accounts for these variations at the outset in flow assurance evaluation was also presented.

  18. Financial assurances

    International Nuclear Information System (INIS)

    US Ecology is a full service waste management company. The company operates two of the nation's three existing low-level radioactive waste (LLRW) disposal facilities and has prepared and submitted license applications for two new LLRW disposal facilities in California and Nebraska. The issue of financial assurances is an important aspect of site development and operation. Proper financial assurances help to insure that uninterrupted operation, closure and monitoring of a facility will be maintained throughout the project's life. Unfortunately, this aspect of licensing is not like others where you can gauge acceptance by examining approved computer codes, site performance standards or applying specific technical formulas. There is not a standard financial assurance plan. Each site should develop its requirements based upon the conditions of the site, type of design, existing state or federal controls, and realistic assessments of future financial needs. Financial assurances at U.S. Ecology's existing sites in Richland, Washington, and Beatty, Nevada, have been in place for several years and are accomplished in a variety of ways by the use of corporate guarantees, corporate capital funds, third party liability insurance, and post closure/long-term care funds. In addressing financial assurances, one can divide the issue into three areas: Site development/operations, third party damages, and long-term care/cleanup

  19. 19 CFR 151.70 - Method of sampling by Customs.

    Science.gov (United States)

    2010-04-01

    19 CFR 151.70 (Customs Duties; U.S. Customs and Border Protection, Department of Homeland Security; Department of...): Method of sampling by Customs. A general sample shall be taken from each sampling unit, unless it is...

  20. A method for sampling waste corn

    Science.gov (United States)

    Frederick, R.B.; Klaas, E.E.; Baldassarre, G.A.; Reinecke, K.J.

    1984-01-01

    Corn has become one of the most important wildlife foods in the United States. It is eaten by a wide variety of animals, including white-tailed deer (Odocoileus virginianus), raccoon (Procyon lotor), ring-necked pheasant (Phasianus colchicus), wild turkey (Meleagris gallopavo), and many species of aquatic birds. Damage to unharvested crops has been documented, but many birds and mammals eat waste grain after harvest and do not conflict with agriculture. A good method for measuring waste-corn availability can be essential to studies concerning food density and the food and feeding habits of field-feeding wildlife. Previous methods were developed primarily for approximating losses due to harvest machinery. In this paper, a method is described for estimating the amount of waste corn potentially available to wildlife. Detection of temporal changes in food availability and of differences caused by agricultural operations (e.g., recently harvested stubble fields vs. plowed fields) is discussed.

  1. Understanding Sampling Style Adversarial Search Methods

    CERN Document Server

    Ramanujan, Raghuram; Selman, Bart

    2012-01-01

    UCT has recently emerged as an exciting new adversarial reasoning technique based on cleverly balancing exploration and exploitation in a Monte-Carlo sampling setting. It has been particularly successful in the game of Go but the reasons for its success are not well understood and attempts to replicate its success in other domains such as Chess have failed. We provide an in-depth analysis of the potential of UCT in domain-independent settings, in cases where heuristic values are available, and the effect of enhancing random playouts to more informed playouts between two weak minimax players. To provide further insights, we develop synthetic game tree instances and discuss interesting properties of UCT, both empirically and analytically.
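
    The exploration-exploitation balance at the heart of UCT is the UCB1 rule applied at each tree node: pick the child maximising its average playout reward plus an exploration bonus that shrinks as the child is visited more. A minimal sketch follows; the exploration constant is an assumed value.

```python
import math
from dataclasses import dataclass

@dataclass
class Child:
    visits: int            # times this move has been sampled
    total_reward: float    # sum of playout rewards through this move

def uct_select(children, parent_visits: int, c: float = 1.4) -> int:
    """Index of the child maximising the UCB1 score used by UCT:
    mean reward + c * sqrt(ln(parent visits) / child visits).
    Unvisited children are tried first."""
    best_i, best_score = -1, float("-inf")
    for i, ch in enumerate(children):
        if ch.visits == 0:
            return i
        score = (ch.total_reward / ch.visits
                 + c * math.sqrt(math.log(parent_visits) / ch.visits))
        if score > best_score:
            best_i, best_score = i, score
    return best_i

children = [Child(10, 6.0), Child(3, 2.4), Child(1, 0.9)]
print(uct_select(children, parent_visits=14))   # -> 2 (least explored wins)
```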

  2. Time efficient methods for scanning a fluorescent membrane with a fluorescent microscopic imager for the quality assurance of food

    Science.gov (United States)

    Lerm, Steffen; Holder, Silvio; Schellhorn, Mathias; Brückner, Peter; Linß, Gerhard

    2013-05-01

    An important part of the quality assurance of meat is the estimation of germs in the meat exudate. The kind and number of germs in the meat affect the medical risk for the consumer of the meat. State-of-the-art analyses of meat are incubator test procedures. The main disadvantages of such incubator tests are the time consumption, the necessary equipment and the need for specially skilled employees. These factors result in high inspection costs. For this reason, a new quality assurance method is needed that combines low detection limits with low time consumption. One approach for such a method is fluorescence microscopic imaging. The germs in the meat exudate are caught in special membranes by antibody reactions. The germ-typical signature can be enhanced with fluorescent chemical markers instead of by reproduction of the germs. Each fluorescent marker either binds to a free germ or runs off the membrane. An image processing system is used to count the fluorescent particles; each fluorescent spot should be a marker that is bound to a germ. Because germs are very small objects, the image processing system needs a high optical magnification, which leads to a small field of view and a small depth of focus. For this reason, the whole area of the membrane has to be scanned in three dimensions. To minimize the time consumption, an optimal path has to be found. This optimization problem is influenced by features of the hardware and is presented in this paper.
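
    The path problem raised at the end of the abstract (covering the membrane field by field with minimal stage travel) is often approximated with a serpentine, or boustrophedon, ordering of the x-y grid, so the stage never returns to the start of a row. The membrane size and field of view in the sketch are hypothetical.

```python
def serpentine_path(n_cols: int, n_rows: int):
    """Yield (col, row) field-of-view positions in boustrophedon order:
    left-to-right on even rows, right-to-left on odd rows."""
    for row in range(n_rows):
        cols = range(n_cols) if row % 2 == 0 else range(n_cols - 1, -1, -1)
        for col in cols:
            yield col, row

# Hypothetical 24 mm x 24 mm membrane scanned with a 0.8 mm field of view.
membrane_mm, fov_mm = 24.0, 0.8
n = int(membrane_mm / fov_mm)
path = list(serpentine_path(n, n))
print(len(path), "fields of view, first five:", path[:5])
```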

  3. Quality assurance and quality control in light stable isotope laboratories: A case study of Rio Grande, Texas, water samples

    Science.gov (United States)

    Coplen, T.B.; Qi, H.

    2009-01-01

    New isotope laboratories can achieve the goal of reporting the same isotopic composition within analytical uncertainty for the same material analysed decades apart by (1) writing their own acceptance testing procedures and putting them into their mass spectrometric or laser-based isotope-ratio equipment procurement contract, (2) requiring a manufacturer to demonstrate acceptable performance using all sample ports provided with the instrumentation, (3) for each medium to be analysed, prepare two local reference materials substantially different in isotopic composition to encompass the range in isotopic composition expected in the laboratory and calibrated them with isotopic reference materials available from the International Atomic Energy Agency (IAEA) or the US National Institute of Standards and Technology (NIST), (4) using the optimum storage containers (for water samples, sealing in glass ampoules that are sterilised after sealing is satisfactory), (5) interspersing among sample unknowns local laboratory isotopic reference materials daily (internationally distributed isotopic reference materials can be ordered at three-year intervals, and can be used for elemental analyser analyses and other analyses that consume less than 1 mg of material) - this process applies to H, C, N, O, and S isotope ratios, (6) calculating isotopic compositions of unknowns by normalising isotopic data to that of local reference materials, which have been calibrated to internationally distributed isotopic reference materials, (7) reporting results on scales normalised to internationally distributed isotopic reference materials (where they are available) and providing to sample submitters the isotopic compositions of internationally distributed isotopic reference materials of the same substance had they been analysed with unknowns, (8) providing an audit trail in the laboratory for analytical results - this trail commonly will be in electronic format and might include a laboratory

  4. Quality assurance and quality control in light stable isotope laboratories: a case study of Rio Grande, Texas, water samples.

    Science.gov (United States)

    Coplen, Tyler B; Qi, Haiping

    2009-06-01

    New isotope laboratories can achieve the goal of reporting the same isotopic composition within analytical uncertainty for the same material analysed decades apart by (1) writing their own acceptance testing procedures and putting them into their mass spectrometric or laser-based isotope-ratio equipment procurement contract, (2) requiring a manufacturer to demonstrate acceptable performance using all sample ports provided with the instrumentation, (3) for each medium to be analysed, prepare two local reference materials substantially different in isotopic composition to encompass the range in isotopic composition expected in the laboratory and calibrated them with isotopic reference materials available from the International Atomic Energy Agency (IAEA) or the US National Institute of Standards and Technology (NIST), (4) using the optimum storage containers (for water samples, sealing in glass ampoules that are sterilised after sealing is satisfactory), (5) interspersing among sample unknowns local laboratory isotopic reference materials daily (internationally distributed isotopic reference materials can be ordered at three-year intervals, and can be used for elemental analyser analyses and other analyses that consume less than 1 mg of material) - this process applies to H, C, N, O, and S isotope ratios, (6) calculating isotopic compositions of unknowns by normalising isotopic data to that of local reference materials, which have been calibrated to internationally distributed isotopic reference materials, (7) reporting results on scales normalised to internationally distributed isotopic reference materials (where they are available) and providing to sample submitters the isotopic compositions of internationally distributed isotopic reference materials of the same substance had they been analysed with unknowns, (8) providing an audit trail in the laboratory for analytical results - this trail commonly will be in electronic format and might include a laboratory
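
    Step (6) above, normalising unknowns to two local reference materials that are themselves calibrated to the international scales, is a two-point linear transformation; a minimal sketch with invented delta values follows.

```python
def normalise_delta(measured_unknown: float,
                    ref1_measured: float, ref1_assigned: float,
                    ref2_measured: float, ref2_assigned: float) -> float:
    """Two-point normalisation: fit delta_true = a * delta_measured + b through
    the two local reference materials and apply it to the unknown."""
    a = (ref2_assigned - ref1_assigned) / (ref2_measured - ref1_measured)
    b = ref1_assigned - a * ref1_measured
    return a * measured_unknown + b

# Invented example for delta-18O of water (per mil) with two local references
# bracketing the unknowns on the VSMOW-SLAP scale.
print(normalise_delta(measured_unknown=-12.4,
                      ref1_measured=-0.3, ref1_assigned=0.0,
                      ref2_measured=-54.8, ref2_assigned=-55.5))
```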

  5. Evaluation of Environmental Sample Analysis Methods and Results Reporting in the National Children's Study Vanguard Study.

    Science.gov (United States)

    Heikkinen, Maire S A; Khalaf, Abdisalam; Beard, Barbara; Viet, Susan M; Dellarco, Michael

    2016-05-01

    During the initial Vanguard phase of the U.S. National Children's Study (NCS), about 2000 tap water, surface wipe, and air samples were collected and analyzed immediately. The shipping conditions, analysis methods, results, and laboratory performance were evaluated to determine the best approaches for use in the NCS Main Study. The main conclusions were (1) to employ established sample analysis methods, when possible, and alternate methodologies only after careful consideration with method validation studies; (2) lot control and prescreening sample collection materials are important quality assurance procedures; (3) packing samples correctly requires careful training and adjustment of shipping conditions to local conditions; (4) trip blanks and spiked samples should be considered for samplers with short expiration times and labile analytes; (5) two study-specific results reports should be required: laboratory electronic data deliverables (EDD) of sample results in a useable electronic format (CSV or SEDD XML/CSV) and a data package with sample results and supporting information in PDF format. These experiences and lessons learned can be applied to any long-term study.

  6. Method Description, Quality Assurance, Environmental Data, and other Information for Analysis of Pharmaceuticals in Wastewater-Treatment-Plant Effluents, Streamwater, and Reservoirs, 2004-2009

    Science.gov (United States)

    Phillips, Patrick J.; Smith, Steven G.; Kolpin, Dana W.; Zaugg, Steven D.; Buxton, Herbert T.; Furlong, Edward T.

    2010-01-01

    Abstract Wastewater-treatment-plant (WWTP) effluents are a demonstrated source of pharmaceuticals to the environment. During 2004-09, a study was conducted to identify pharmaceutical compounds in effluents from WWTPs (including two that receive substantial discharges from pharmaceutical formulation facilities), streamwater, and reservoirs. The methods used to determine and quantify concentrations of seven pharmaceuticals are described. In addition, the report includes information on pharmaceuticals formulated or potentially formulated at the two pharmaceutical formulation facilities that provide substantial discharge to two of the WWTPs, and potential limitations to these data are discussed. The analytical methods used to provide data on the seven pharmaceuticals (including opioids, muscle relaxants, and other pharmaceuticals) in filtered water samples also are described. Data are provided on method performance, including spike data, method detection limit results, and an estimation of precision. Quality-assurance data for sample collection and handling are included. Quantitative data are presented for the seven pharmaceuticals in water samples collected at WWTP discharge points, from streams, and at reservoirs. Occurrence data also are provided for 19 pharmaceuticals that were qualitatively identified. Flow data at selected WWTP and streams are presented. Between 2004-09, 35-38 effluent samples were collected from each of three WWTPs in New York and analyzed for seven pharmaceuticals. Two WWTPs (NY2 and NY3) receive substantial inflows (greater than 20 percent of plant flow) from pharmaceutical formulation facilities (PFF) and one (NY1) receives no PFF flow. Samples of effluents from 23 WWTPs across the United States were analyzed once for these pharmaceuticals as part of a national survey. Maximum pharmaceutical effluent concentrations for the national survey and NY1 effluent samples were generally less than 1 ug/L. Four pharmaceuticals (methadone, oxycodone

  7. SU-E-T-438: Commissioning of An In-Vivo Quality Assurance Method Using the Electronic Portal Imaging Device

    International Nuclear Information System (INIS)

    Purpose: Patient specific pre-treatment quality assurance (QA) using arrays of detectors or film has been the standard approach to assure that the correct treatment is delivered to the patient. This QA approach is expensive, labor intensive and does not guarantee or document that all remaining fractions were treated properly. The purpose of this abstract is to commission and evaluate the performance of a commercially available in-vivo QA software using the electronic portal imaging device (EPID) to record the daily treatments. Methods: The platform EPIgray V2.0.2 (Dosisoft), whose machine model compares ratios of TMR with the EPID signal to predict dose, was commissioned for an Artiste (Siemens Oncology Care Systems) and a Truebeam (Varian Medical Systems) linear accelerator following the given instructions. The systems were then tested on three different phantoms (homogeneous stack of solid water, anthropomorphic head and pelvis) and on a library of patient cases. Simple and complex fields were delivered at different exposures and for different gantry angles. The effects of table attenuation and EPID sagging were evaluated. Gamma analysis was used to compare the measured dose with the predicted dose for complex clinical IMRT cases. Results: Commissioning of the EPIgray system for two photon energies took 8 hours. The difference between the dose planned and the dose measured with EPIgray was better than 3% for all phantom scenarios tested. Preliminary results on patients demonstrate that an accuracy of 5% is achievable in high dose regions for both 3DCRT and IMRT. Large discrepancies (>5%) were observed due to metallic structures or air cavities and in low dose areas. Flat panel sagging was visible and accounted for in the EPIgray model. Conclusion: The accuracy achieved by EPIgray is sufficient to document the safe delivery of complex IMRT treatments. Future work will evaluate EPIgray for VMAT and high dose rate deliveries. This work is supported by Dosisoft, Cachan, France

  8. Sampling Methods in Cardiovascular Nursing Research: An Overview.

    Science.gov (United States)

    Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie

    2014-01-01

    Cardiovascular nursing research covers a wide array of topics from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling, including simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion into the research study and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences in sampling techniques may aid nurses in effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
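
    The probability designs named above can be illustrated in a few lines; the sketch below draws a simple random, a systematic and a proportionally stratified sample from a hypothetical frame of 1,000 patients.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sampling frame of 1,000 cardiac patients with a ward label.
frame = np.arange(1000)
ward = np.repeat(["ICU", "telemetry", "outpatient"], [200, 300, 500])
n = 100

# Simple random sampling: equal, independent chance for every patient.
srs = rng.choice(frame, size=n, replace=False)

# Systematic sampling: random start, then every k-th patient on the list.
k = len(frame) // n
systematic = frame[rng.integers(k)::k][:n]

# Stratified sampling: proportional allocation within each ward.
stratified = np.concatenate([
    rng.choice(frame[ward == w], size=int(n * (ward == w).mean()), replace=False)
    for w in np.unique(ward)
])

print(len(srs), len(systematic), len(stratified))   # -> 100 100 100
```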

  9. Methods for collecting algal samples as part of the National Water-Quality Assessment Program

    Science.gov (United States)

    Porter, Stephen D.; Cuffney, Thomas F.; Gurtz, Martin E.; Meador, Michael R.

    1993-01-01

    Benthic algae (periphyton) and phytoplankton communities are characterized in the U.S. Geological Survey's National Water-Quality Assessment Program as part of an integrated physical, chemical, and biological assessment of the Nation's water quality. This multidisciplinary approach provides multiple lines of evidence for evaluating water-quality status and trends, and for refining an understanding of the factors that affect water-quality conditions locally, regionally, and nationally. Water quality can be characterized by evaluating the results of qualitative and quantitative measurements of the algal community. Qualitative periphyton samples are collected to develop a list of taxa present in the sampling reach. Quantitative periphyton samples are collected to measure algal community structure within selected habitats. These samples of benthic algal communities are collected from natural substrates, using the sampling methods that are most appropriate for the habitat conditions. Phytoplankton samples may be collected in large nonwadeable streams and rivers to meet specific program objectives. Estimates of algal biomass (chlorophyll content and ash-free dry mass) also are optional measures that may be useful for interpreting water-quality conditions. A nationally consistent approach provides guidance on site, reach, and habitat selection, as well as information on methods and equipment for qualitative and quantitative sampling. Appropriate quality-assurance and quality-control guidelines are used to maximize the ability to analyze data locally, regionally, and nationally.

  10. Remedial investigation sampling and analysis plan for J-Field, Aberdeen Proving Ground, Maryland: Volume 2, Quality Assurance Project Plan

    Energy Technology Data Exchange (ETDEWEB)

    Prasad, S.; Martino, L.; Patton, T.

    1995-03-01

    J-Field encompasses about 460 acres at the southern end of the Gunpowder Neck Peninsula in the Edgewood Area of APG (Figure 2.1). Since World War II, the Edgewood Area of APG has been used to develop, manufacture, test, and destroy chemical agents and munitions. These materials were destroyed at J-Field by open burning and open detonation (OB/OD). For the purposes of this project, J-Field has been divided into eight geographic areas or facilities that are designated as areas of concern (AOCs): the Toxic Burning Pits (TBP), the White Phosphorus Burning Pits (WPP), the Riot Control Burning Pit (RCP), the Robins Point Demolition Ground (RPDG), the Robins Point Tower Site (RPTS), the South Beach Demolition Ground (SBDG), the South Beach Trench (SBT), and the Prototype Building (PB). The scope of this project is to conduct a remedial investigation/feasibility study (RI/FS) and ecological risk assessment to evaluate the impacts of past disposal activities at the J-Field site. Sampling for the RI will be carried out in three stages (I, II, and III) as detailed in the FSP. A phased approach will be used for the J-Field ecological risk assessment (ERA).

  11. Use of destructive and nondestructive methods of analysis for quality assurance at MOX fuel production in the Russia

    Energy Technology Data Exchange (ETDEWEB)

    Bibilashvili, Y.K.; Rudenko, V.S.; Chorokhov, N.A.; Korovin, Y.I.; Petrov, A.M.; Vorobiev, A.V.; Mukhortov, N.F.; Smirnov, Y.A.; Kudryavtsev, V.N. [A.A. Bochvar All-Russia Research Institute of Inorganic Materials (Russian Federation)

    2000-07-01

    Parameters of MOX fuel with various plutonium contents are considered from the point of view of the control required for quality assurance. The destructive and nondestructive methods used for this purpose in Russia are described: controlled-potential coulometry for determination of uranium and/or plutonium content, their ratio and the oxygen factor; mass spectrometry for determination of uranium and plutonium isotopic composition; a chemical emission spectral method for determination of the content of 'metal' impurities, boron and silicon; and methods for determination of gas-forming impurities. The capabilities of nondestructive gamma-ray spectrometry techniques are considered in detail, together with the results of their use for measuring the uranium and plutonium isotopic composition of the initial dioxides and for determining the uranium and plutonium contents and the uniformity of their distribution in MOX powder and pellets. The need to correct the algorithm of the MGA program when it is used to analyse gamma-ray spectra of MOX with a low content of low-burnup plutonium is shown. (authors)

  12. Use of destructive and nondestructive methods of analysis for quality assurance at MOX fuel production in the Russia

    International Nuclear Information System (INIS)

    Parameters of MOX fuel with various plutonium contents are considered from the point of view of the control required for quality assurance. The destructive and nondestructive methods used for this purpose in Russia are described: controlled-potential coulometry for determination of uranium and/or plutonium content, their ratio and the oxygen factor; mass spectrometry for determination of uranium and plutonium isotopic composition; a chemical emission spectral method for determination of the content of 'metal' impurities, boron and silicon; and methods for determination of gas-forming impurities. The capabilities of nondestructive gamma-ray spectrometry techniques are considered in detail, together with the results of their use for measuring the uranium and plutonium isotopic composition of the initial dioxides and for determining the uranium and plutonium contents and the uniformity of their distribution in MOX powder and pellets. The need to correct the algorithm of the MGA program when it is used to analyse gamma-ray spectra of MOX with a low content of low-burnup plutonium is shown. (authors)

  13. Quality assurance using outlier detection on an automatic segmentation method for the cerebellar peduncles

    Science.gov (United States)

    Li, Ke; Ye, Chuyang; Yang, Zhen; Carass, Aaron; Ying, Sarah H.; Prince, Jerry L.

    2016-03-01

    Cerebellar peduncles (CPs) are white matter tracts connecting the cerebellum to other brain regions. Automatic segmentation methods for the CPs have been proposed for studying their structure and function. Usually the performance of these methods is evaluated by comparing segmentation results with manual delineations (ground truth). However, when a segmentation method is run on new data (for which no ground truth exists), it is highly desirable to efficiently detect and assess algorithm failures so that these cases can be excluded from scientific analysis. In this work, two outlier detection methods aimed at assessing the performance of an automatic CP segmentation algorithm are presented. The first is a univariate non-parametric method using a box-whisker plot. We first categorize automatic segmentation results of a dataset of diffusion tensor imaging (DTI) scans from 48 subjects as either a success or a failure. We then design three groups of features from the image data of the nine categorized failures for failure detection. Results show that most of these features can efficiently detect the true failures. The second method, supervised classification, was employed on a larger DTI dataset of 249 manually categorized subjects. Four classifiers were trained using the designed features and evaluated with a leave-one-out cross validation: linear discriminant analysis (LDA), logistic regression (LR), support vector machine (SVM), and random forest classification (RFC). Results show that LR performs worst among the four classifiers and the other three perform comparably, which demonstrates the feasibility of automatically detecting segmentation failures using classification methods.
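
    The two detection strategies described above can be sketched as follows; the feature matrix, labels, and classifier settings are synthetic placeholders rather than the authors' DTI-derived features.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(60, 3))          # placeholder QA features, one row per subject
        y = rng.integers(0, 2, size=60)       # placeholder success(0)/failure(1) labels

        # 1) Univariate box-whisker rule: flag values outside 1.5*IQR for one feature.
        feature = X[:, 0]
        q1, q3 = np.percentile(feature, [25, 75])
        iqr = q3 - q1
        outliers = (feature < q1 - 1.5 * iqr) | (feature > q3 + 1.5 * iqr)

        # 2) Supervised detection evaluated with leave-one-out cross validation.
        scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=LeaveOneOut())
        print("flagged by IQR:", int(outliers.sum()), "LOO accuracy:", scores.mean())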

  14. 76 FR 26341 - Medicaid Program; Methods for Assuring Access to Covered Medicaid Services

    Science.gov (United States)

    2011-05-06

    ..., telemedicine and telehealth, nurse help lines, health information technology and other methods for providing... years of research and consulted extensively with key stakeholders to develop a recommendation on how to... standards and are considering future proposals to address access issues under managed care delivery...

  15. Photoacoustic sample vessel and method of elevated pressure operation

    Science.gov (United States)

    Autrey, Tom; Yonker, Clement R.

    2004-05-04

    An improved photoacoustic vessel and method of photoacoustic analysis. The photoacoustic sample vessel comprises an acoustic detector, an acoustic couplant, and an acoustic coupler having a chamber for holding the acoustic couplant and a sample. The acoustic couplant is selected from the group consisting of liquid, solid, and combinations thereof. Passing electromagnetic energy through the sample generates an acoustic signal within the sample, whereby the acoustic signal propagates through the sample to and through the acoustic couplant to the acoustic detector.

  16. Methods for collecting benthic invertebrate samples as part of the National Water-Quality Assessment Program

    Science.gov (United States)

    Cuffney, Thomas F.; Gurtz, Martin E.; Meador, Michael R.

    1993-01-01

    Benthic invertebrate communities are evaluated as part of the ecological survey component of the U.S. Geological Survey's National Water-Quality Assessment Program. These biological data are collected along with physical and chemical data to assess water-quality conditions and to develop an understanding of the factors that affect water-quality conditions locally, regionally, and nationally. The objectives of benthic invertebrate community characterizations are to (1) develop for each site a list of taxa within the associated stream reach and (2) determine the structure of benthic invertebrate communities within selected habitats of that reach. A nationally consistent approach is used to achieve these objectives. This approach provides guidance on site, reach, and habitat selection, as well as on methods and equipment for qualitative multihabitat sampling and semi-quantitative single-habitat sampling. Appropriate quality-assurance and quality-control guidelines are used to maximize the ability to analyze data within and among study units.

  17. Evaporation from weighing precipitation gauges: impacts on automated gauge measurements and quality assurance methods

    Directory of Open Access Journals (Sweden)

    R. D. Leeper

    2014-12-01

    The effects of evaporation on precipitation measurements have been understood to bias total precipitation low. For automated weighing-bucket gauges, the World Meteorological Organization (WMO) suggests the use of evaporative suppressants with frequent observations. However, the use of evaporative suppressants is not always feasible because of environmental hazards and the added cost of maintenance, transport, and disposal of the gauge additive. In addition, research has suggested that evaporation prior to precipitation may affect precipitation measurements from auto-recording gauges operating at sub-hourly frequencies. For further evaluation, a field campaign was conducted to monitor evaporation and its impacts on the quality of precipitation measurements from gauges used at US Climate Reference Network (USCRN) stations. Collocated Geonor gauges with (nonEvap) and without (evap) an evaporative suppressant were compared to evaluate evaporative losses and evaporation biases on precipitation measurements. From June to August, evaporative losses from the evap gauge exceeded accumulated precipitation, with an average loss of 0.12 mm h−1. However, the impact of evaporation on precipitation measurements was sensitive to the calculation method. In general, methods that utilized a longer time series to smooth out sensor noise were more sensitive to gauge evaporation (−4.6% bias with respect to the control) than methods computing depth change without smoothing (< +1% bias). These results indicate that while climate and gauge design affect gauge evaporation rates, computational methods can influence the magnitude of the evaporation bias on precipitation measurements. It is hoped this study will advance QA techniques that mitigate the impact of evaporation biases on precipitation measurements from other automated networks.

  18. Quality assurance and statistical control

    DEFF Research Database (Denmark)

    Heydorn, K.

    1991-01-01

    In scientific research laboratories it is rarely possible to use quality assurance schemes, developed for large-scale analysis. Instead methods have been developed to control the quality of modest numbers of analytical results by relying on statistical control: Analysis of precision serves...... through the origo. Calibration control is an essential link in the traceability of results. Only one or two samples of pure solid or aqueous standards with accurately known content need to be analyzed. Verification is carried out by analyzing certified reference materials from BCR, NIST, or others...

  19. Discussion on fleet test method for radioactive ore sample

    International Nuclear Information System (INIS)

    Testing radionuclides in building-material samples with γ-ray spectrometry has become a commonly used laboratory method. Given that the activity of radioactive ore is more than ten times that of building-material samples, fleet (batch) testing of ore samples is possible. The fleet test method for radioactive ore samples is discussed from the 'social production' point of view. The test can be completed quickly and effectively within the permitted error, so the method enhances testing efficiency and reduces cost. (authors)

  20. Using A Particular Sampling Method for Impedance Measurement

    OpenAIRE

    Lentka Grzegorz

    2014-01-01

    The paper presents an impedance measurement method using a particular sampling method which is an alternative to DFT calculation. The method uses a sine excitation signal and sampling response signals proportional to current flowing through and voltage across the measured impedance. The object impedance is calculated without using Fourier transform. The method was first evaluated in MATLAB by means of simulation. The method was then practically verified in a constructed simple impedance measu...

  1. Systems and methods for self-synchronized digital sampling

    Science.gov (United States)

    Samson, Jr., John R. (Inventor)

    2008-01-01

    Systems and methods for self-synchronized data sampling are provided. In one embodiment, a system for capturing synchronous data samples is provided. The system includes an analog to digital converter adapted to capture signals from one or more sensors and convert the signals into a stream of digital data samples at a sampling frequency determined by a sampling control signal; and a synchronizer coupled to the analog to digital converter and adapted to receive a rotational frequency signal from a rotating machine, wherein the synchronizer is further adapted to generate the sampling control signal, and wherein the sampling control signal is based on the rotational frequency signal.
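
    A software-only sketch of the idea described above, assuming a hypothetical samples-per-revolution setting: the sampling control frequency is derived from the reported rotational frequency so that a fixed number of samples is captured per revolution, with interpolation standing in for the hardware analog-to-digital converter.

        import numpy as np

        def synchronized_sample(signal, t, rotation_hz, samples_per_rev=64):
            """Resample a sensor signal at a rate locked to the machine's rotation.

            signal, t       : sensor samples and their time stamps
            rotation_hz     : rotational frequency reported by the rotating machine
            samples_per_rev : desired synchronous samples per revolution (assumed value)
            """
            fs_sync = rotation_hz * samples_per_rev          # sampling control frequency
            t_sync = np.arange(t[0], t[-1], 1.0 / fs_sync)   # synchronous sample instants
            return t_sync, np.interp(t_sync, t, signal)      # software stand-in for the ADC

        # Usage: a 25 Hz rotation monitored by a sensor sampled at 10 kHz
        t = np.linspace(0, 1, 10000)
        sig = np.sin(2 * np.pi * 25 * t)
        t_s, s_s = synchronized_sample(sig, t, rotation_hz=25.0)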

  2. γ-ray spectrometry results versus sample preparation methods

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Two biological samples, tea leaves and flour, were prepared under recommended conditions using different methods, grinding into powder and reducing to ash, and then analyzed by γ-ray spectrometry. A marked difference was found between the measured values of the tea samples prepared with these different methods. One possible reason is that reducing the sample to ash causes some nuclides to be lost. Compared with the "non-destructive" method of grinding into powder, the method of reducing to ash is much more susceptible to the loss of some nuclides. The probable reasons for the differing influence of the preparation methods on the tea-leaf and flour samples are discussed.

  3. A method for selecting training samples based on camera response

    Science.gov (United States)

    Zhang, Leihong; Li, Bei; Pan, Zilan; Liang, Dong; Kang, Yi; Zhang, Dawei; Ma, Xiuhua

    2016-09-01

    In the process of spectral reflectance reconstruction, sample selection plays an important role in the accuracy of the constructed model and in reconstruction effects. In this paper, a method for training sample selection based on camera response is proposed. It has been proved that the camera response value has a close correlation with the spectral reflectance. Consequently, in this paper we adopt the technique of drawing a sphere in camera response value space to select the training samples which have a higher correlation with the test samples. In addition, the Wiener estimation method is used to reconstruct the spectral reflectance. Finally, we find that the method of sample selection based on camera response value has the smallest color difference and root mean square error after reconstruction compared to the method using the full set of Munsell color charts, the Mohammadi training sample selection method, and the stratified sampling method. Moreover, the goodness of fit coefficient of this method is also the highest among the four sample selection methods. Taking all the factors mentioned above into consideration, the method of training sample selection based on camera response value enhances the reconstruction accuracy from both the colorimetric and spectral perspectives.
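
    A rough sketch of the selection idea described above, with a simple least-squares mapping standing in for the Wiener estimation step; the training responses, reflectance spectra, sphere radius, and fallback rule are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(1)
        train_rgb = rng.random((1269, 3))      # placeholder camera responses (e.g. a colour-chart set)
        train_refl = rng.random((1269, 31))    # placeholder reflectances, 400-700 nm in 10 nm steps

        def select_by_response(test_rgb, radius=0.15):
            """Keep training samples whose camera response lies inside a sphere around the test response."""
            dist = np.linalg.norm(train_rgb - test_rgb, axis=1)
            idx = np.where(dist <= radius)[0]
            return idx if idx.size >= 10 else np.argsort(dist)[:10]   # fall back to the 10 nearest

        def reconstruct(test_rgb):
            idx = select_by_response(test_rgb)
            # Linear least-squares mapping from camera response to reflectance,
            # fitted only on the selected high-correlation subset.
            M, *_ = np.linalg.lstsq(train_rgb[idx], train_refl[idx], rcond=None)
            return test_rgb @ M

        print(reconstruct(np.array([0.4, 0.5, 0.3])).shape)   # -> (31,)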

  4. SU-E-J-126: Respiratory Gating Quality Assurance: A Simple Method to Achieve Millisecond Temporal Resolution

    International Nuclear Information System (INIS)

    Purpose: Low temporal latency between a gating on/off signal and linac beam on/off during respiratory gating is critical for patient safety. Although a measurement of temporal lag is recommended by AAPM Task Group 142 for commissioning and annual quality assurance, no published method currently exists. Here we describe a simple, inexpensive, and reliable method to precisely measure gating lag at millisecond resolution. Methods: A Varian Real-time Position Management™ (RPM) gating simulator with a rotating disk was modified with a resistive flex sensor (Spectra Symbol) attached to the gating box platform. A photon diode was placed at machine isocenter. Output signals of the flex sensor and diode were monitored with a multichannel oscilloscope (Tektronix™ DPO3014). Qualitative inspection of gating window/beam-on synchronicity was made by setting the linac to beam on/off at end-expiration and the oscilloscope's temporal window to 100 ms, to visually examine whether the on/off timing was within the recommended 100-ms tolerance. Quantitative measurements were made by saving the signal traces and analyzing them in MatLab™. The on and off times of the beam signal were located and compared to the expected gating window (e.g. 40% to 60%). Four gating cycles were measured and compared. Results: On a Varian TrueBeam™ STx linac with RPM gating software, the average difference in synchronicity at beam on and off for four cycles was 14 ms (3 to 30 ms) and 11 ms (2 to 32 ms), respectively. For a Varian Clinac™ 21EX, the average difference at beam on and off was 127 ms (122 to 133 ms) and 46 ms (42 to 49 ms), respectively. The uncertainty in the synchrony difference was estimated at ±6 ms. Conclusion: This new gating QA method is easy to implement and allows for fast qualitative inspection and quantitative measurements for commissioning and TG-142 annual QA measurements.
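
    The quantitative analysis step could look roughly like the following sketch, which locates beam on/off edges in the diode trace and compares them with the expected 40-60% amplitude gating window derived from the flex-sensor trace; the thresholds and array names are assumptions, not the authors' MatLab code.

        import numpy as np

        def edges(t, state):
            """Rising and falling edge times of a boolean state trace."""
            m = state.astype(int)
            rising = np.where(np.diff(m) == 1)[0] + 1
            falling = np.where(np.diff(m) == -1)[0] + 1
            return t[rising], t[falling]

        def gating_lag(t, flex, diode, window=(0.40, 0.60)):
            """Beam on/off lag relative to the expected 40-60% amplitude gating window."""
            amp = (flex - flex.min()) / (flex.max() - flex.min())       # normalised respiratory trace
            gate_on, gate_off = edges(t, (amp >= window[0]) & (amp <= window[1]))
            beam_on, beam_off = edges(t, diode > 0.5 * diode.max())     # assumed half-maximum beam threshold
            n = min(map(len, (gate_on, beam_on, gate_off, beam_off)))
            return beam_on[:n] - gate_on[:n], beam_off[:n] - gate_off[:n]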

  5. Developing a novel method to analyse Gafchromic EBT2 films in intensity modulated radiation therapy quality assurance

    International Nuclear Information System (INIS)

    Recently, individual intensity modulated radiation therapy quality assurance (IMRT QA) has increasingly been performed with Gafchromic™ EBT-series films processed in the red–green–blue (R–G–B) channel, owing to their extremely high spatial resolution. However, the efficiency of this method is relatively low, as for each box of film a calibration curve must be established before the film can be used for measurement. In this study, the authors present a novel way to process the Gafchromic™ EBT series, namely to process the exposed film in the 16-bit grey-scale channel rather than the conventional 48-bit R–G–B channel, which greatly increases the efficiency and even the accuracy of the whole IMRT QA procedure. The main advantage is that, when processed in the grey-scale channel, Gafchromic™ EBT2 films exhibit a linear relationship between net pixel value and delivered dose. This linear relationship firstly reduces the error in calibration-curve fitting, and secondly removes the need to establish a calibration curve for each box of films if it is only used for relative measurements. Clinical testing of this novel method was carried out in two radiation therapy centres on a total of 743 IMRT cases, of which 740 passed the 3 mm/3% gamma analysis criteria. The cases were also tested with small ionization chambers (cc-13) and the results were convincing. Consequently, the authors recommend this novel method to improve the accuracy and efficiency of the individual IMRT QA procedure using Gafchromic EBT2 films.
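
    A minimal sketch of the linear grey-scale calibration described above; the pixel values and doses are invented, and for purely relative measurements the fitted scale cancels out, which is the source of the efficiency gain the abstract describes.

        import numpy as np

        # Hypothetical calibration points: delivered dose (cGy) vs. net pixel value in the grey channel
        dose = np.array([0.0, 50.0, 100.0, 200.0, 300.0, 400.0])
        net_pixel_grey = np.array([0, 1450, 2890, 5810, 8720, 11600])   # roughly linear response (made up)

        # Linear fit: only two free parameters, so few calibration points are needed.
        slope, intercept = np.polyfit(net_pixel_grey, dose, 1)

        def pixel_to_dose(net_pixel):
            return slope * net_pixel + intercept

        print(round(pixel_to_dose(5810), 1))   # ~200 cGy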

  6. Rapid screening methods for beta-emitters in food samples

    International Nuclear Information System (INIS)

    In the case of a nuclear emergency, many samples need to be measured in a short time period. It is therefore of vital importance to have a quick and reliable (screening) method. Most methods to determine total beta activity are time-consuming because of extensive sample preparation, such as ashing. In this article, three different rapid screening methods for beta-emitting nuclides in agriculture, livestock and fishery products were tested and compared to each other, and to an accurate but more time-consuming reference method. The goal was to find the method with the optimal trade-off between accuracy, speed and minimal detectable activity (MDA). All of the methods rely on liquid scintillation counting (LSC) or Cerenkov counting, and differ mainly in sample preparation. For matrices with little or no colour, the direct LSC method is the most accurate and fastest option, while for darker coloured samples this method is not suitable because of high colour quenching. For such samples, two additional methods using microwave digestion during sample preparation produced good results. - Highlights: • Comparison of rapid screening methods for beta-emitters. • Sample preparation and measurement done within 1.5–7.5 h instead of 56 h. • MDA less than 100 Bq/kg fresh product. • Recoveries for all rapid screening methods higher than 73%

  7. DOE methods for evaluating environmental and waste management samples.

    Energy Technology Data Exchange (ETDEWEB)

    Goheen, S C; McCulloch, M; Thomas, B L; Riley, R G; Sklarew, D S; Mong, G M; Fadeff, S K [eds.; Pacific Northwest Lab., Richland, WA (United States)

    1994-04-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) provides applicable methods in use by the US Department of Energy (DOE) laboratories for sampling and analyzing constituents of waste and environmental samples. The development of DOE Methods is supported by the Laboratory Management Division (LMD) of the DOE. This document contains chapters and methods that are proposed for use in evaluating components of DOE environmental and waste management samples. DOE Methods is a resource intended to support sampling and analytical activities that will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the US Environmental Protection Agency (EPA), or others.

  8. DOE methods for evaluating environmental and waste management samples

    International Nuclear Information System (INIS)

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) provides applicable methods in use by the US Department of Energy (DOE) laboratories for sampling and analyzing constituents of waste and environmental samples. The development of DOE Methods is supported by the Laboratory Management Division (LMD) of the DOE. This document contains chapters and methods that are proposed for use in evaluating components of DOE environmental and waste management samples. DOE Methods is a resource intended to support sampling and analytical activities that will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the US Environmental Protection Agency (EPA), or others.

  9. SAMSAN- MODERN NUMERICAL METHODS FOR CLASSICAL SAMPLED SYSTEM ANALYSIS

    Science.gov (United States)

    Frisch, H. P.

    1994-01-01

    SAMSAN was developed to aid the control system analyst by providing a self consistent set of computer algorithms that support large order control system design and evaluation studies, with an emphasis placed on sampled system analysis. Control system analysts have access to a vast array of published algorithms to solve an equally large spectrum of controls related computational problems. The analyst usually spends considerable time and effort bringing these published algorithms to an integrated operational status and often finds them less general than desired. SAMSAN reduces the burden on the analyst by providing a set of algorithms that have been well tested and documented, and that can be readily integrated for solving control system problems. Algorithm selection for SAMSAN has been biased toward numerical accuracy for large order systems with computational speed and portability being considered important but not paramount. In addition to containing relevant subroutines from EISPAK for eigen-analysis and from LINPAK for the solution of linear systems and related problems, SAMSAN contains the following not so generally available capabilities: 1) Reduction of a real non-symmetric matrix to block diagonal form via a real similarity transformation matrix which is well conditioned with respect to inversion, 2) Solution of the generalized eigenvalue problem with balancing and grading, 3) Computation of all zeros of the determinant of a matrix of polynomials, 4) Matrix exponentiation and the evaluation of integrals involving the matrix exponential, with option to first block diagonalize, 5) Root locus and frequency response for single variable transfer functions in the S, Z, and W domains, 6) Several methods of computing zeros for linear systems, and 7) The ability to generate documentation "on demand". All matrix operations in the SAMSAN algorithms assume non-symmetric matrices with real double precision elements. There is no fixed size limit on any matrix in any

  10. GROUND WATER PURGING AND SAMPLING METHODS: HISTORY VS. HYSTERIA

    Science.gov (United States)

    It has been over 10 years since the low-flow ground water purging and sampling method was initially reported in the literature. The method grew from the recognition that well purging was necessary to collect representative samples, bailers could not achieve well purging, and high...

  11. Engineering Study of 500 ML Sample Bottle Transportation Methods

    Energy Technology Data Exchange (ETDEWEB)

    BOGER, R.M.

    1999-08-25

    This engineering study reviews and evaluates all available methods for transporting 500-mL grab sample bottles, reviews and evaluates transportation requirements and schedules, and analyzes and recommends the most cost-effective method for transporting 500-mL grab sample bottles.

  12. DOE methods for evaluating environmental and waste management samples

    International Nuclear Information System (INIS)

    This publication provides applicable methods in use by the US Department of Energy (DOE) laboratories for the analysis of constituents of waste and environmental samples. This document contains chapters and methods that are proposed for use in evaluating components of DOE environmental and waste management samples. It is a resource intended to support sampling and analytical activities that will determine whether environmental restoration or waste management actions are needed, as defined by the DOE, the US Environmental Protection Agency (EPA), or others

  13. Evaluation of common methods for sampling invertebrate pollinator assemblages: net sampling out-perform pan traps.

    Directory of Open Access Journals (Sweden)

    Tony J Popic

    Methods for sampling ecological assemblages strive to be efficient, repeatable, and representative. Unknowingly, common methods may be limited in terms of revealing species function and so of less value for comparative studies. The global decline in pollination services has stimulated surveys of flower-visiting invertebrates, using pan traps and net sampling. We explore the relative merits of these two methods in terms of species discovery, quantifying abundance, function, and composition, and responses of species to changing floral resources. Using a spatially-nested design we sampled across a 5000 km² area of arid grasslands, including 432 hours of net sampling and 1296 pan trap-days, between June 2010 and July 2011. Net sampling yielded 22% more species and 30% higher abundance than pan traps, and better reflected the spatio-temporal variation of floral resources. Species composition differed significantly between methods; from 436 total species, 25% were sampled by both methods, 50% only by nets, and the remaining 25% only by pans. Apart from being less comprehensive, if pan traps do not sample flower-visitors, the link to pollination is questionable. By contrast, net sampling functionally linked species to pollination through behavioural observations of flower-visitation interaction frequency. Netted specimens are also necessary for evidence of pollen transport. Benefits of net-based sampling outweighed minor differences in overall sampling effort. As pan traps and net sampling methods are not equivalent for sampling invertebrate-flower interactions, we recommend net sampling of invertebrate pollinator assemblages, especially if datasets are intended to document declines in pollination and guide measures to retain this important ecosystem service.

  14. Evaluation of common methods for sampling invertebrate pollinator assemblages: net sampling out-perform pan traps.

    Science.gov (United States)

    Popic, Tony J; Davila, Yvonne C; Wardle, Glenda M

    2013-01-01

    Methods for sampling ecological assemblages strive to be efficient, repeatable, and representative. Unknowingly, common methods may be limited in terms of revealing species function and so of less value for comparative studies. The global decline in pollination services has stimulated surveys of flower-visiting invertebrates, using pan traps and net sampling. We explore the relative merits of these two methods in terms of species discovery, quantifying abundance, function, and composition, and responses of species to changing floral resources. Using a spatially-nested design we sampled across a 5000 km² area of arid grasslands, including 432 hours of net sampling and 1296 pan trap-days, between June 2010 and July 2011. Net sampling yielded 22% more species and 30% higher abundance than pan traps, and better reflected the spatio-temporal variation of floral resources. Species composition differed significantly between methods; from 436 total species, 25% were sampled by both methods, 50% only by nets, and the remaining 25% only by pans. Apart from being less comprehensive, if pan traps do not sample flower-visitors, the link to pollination is questionable. By contrast, net sampling functionally linked species to pollination through behavioural observations of flower-visitation interaction frequency. Netted specimens are also necessary for evidence of pollen transport. Benefits of net-based sampling outweighed minor differences in overall sampling effort. As pan traps and net sampling methods are not equivalent for sampling invertebrate-flower interactions, we recommend net sampling of invertebrate pollinator assemblages, especially if datasets are intended to document declines in pollination and guide measures to retain this important ecosystem service. PMID:23799127

  15. Validation of a 20-h real-time PCR method for screening of Salmonella in poultry faecal samples

    DEFF Research Database (Denmark)

    Löfström, Charlotta; Hansen, Flemming; Hoorfar, Jeffrey

    2010-01-01

    Efficient and rapid monitoring of Salmonella in the poultry production chain is necessary to assure safe food. The objective was to validate an open-formula real-time PCR method for screening of Salmonella in poultry faeces (sock samples). The method consists of incubation in buffered peptone water...... for 18 ± 2 h, centrifugation of a 1-ml subsample, DNA extraction on the pellet and PCR. The total analysis time is 20 h. The validation study included comparative and collaborative trials, based on the recommendations from the Nordic organization for validation of alternative microbiological methods...... (NordVal). The comparative trial was performed against a reference method from the Nordic Committee on Food Analysis (NMKL187, 2007) using 132 artificially and naturally contaminated samples. The limit of detection (LOD50) was found to be 24 and 33 CFU/sample for the PCR and NMKL187 methods...

  16. Towards improvement in quality assurance

    International Nuclear Information System (INIS)

    This first document in the series of the International Nuclear Safety Advisory Group (INSAG) Technical Notes is a general guideline for the establishment of effective quality assurance procedures at nuclear facilities. It sets out primary requirements such as quality objectives, methods for measuring the effectiveness of the quality assurance programme, priority of activities in relation to importance of safety of items, motivation of personnel

  17. Method of analysis and quality-assurance practices by the U. S. Geological Survey Organic Geochemistry Research Group; determination of four selected mosquito insecticides and a synergist in water using liquid-liquid extraction and gas chrom

    Science.gov (United States)

    Zimmerman, L.R.; Strahan, A.P.; Thurman, E.M.

    2001-01-01

    A method of analysis and quality-assurance practices were developed for the determination of four mosquito insecticides (malathion, methoprene, phenothrin, and resmethrin) and one synergist (piperonyl butoxide) in water. The analytical method uses liquid-liquid extraction (LLE) and gas chromatography/mass spectrometry (GC/MS). Good precision and accuracy were demonstrated in reagent water, urban surface water, and ground water. The mean accuracies as percentages of the true compound concentrations from water samples spiked at 10 and 50 nanograms per liter ranged from 68 to 171 percent, with standard deviations in concentrations of 27 nanograms per liter or less. The method detection limit for all compounds was 5.9 nanograms per liter or less for 247-milliliter samples. This method is valuable for acquiring information about the fate and transport of these mosquito insecticides and one synergist in water.

  18. Sci—Sat AM: Stereo — 05: The Development of Quality Assurance Methods for Trajectory based Cranial SRS Treatments

    International Nuclear Information System (INIS)

    The goal of this work was to develop and validate non-planar linac beam trajectories defined by the dynamic motion of the gantry, couch, jaws, collimator and MLCs. This was conducted on the Varian TrueBeam linac by taking advantage of the linac's advanced control features in a non-clinical mode (termed developer mode). In this work, we present quality assurance methods that we have developed to test the positional and temporal accuracy of the linac's moving components. The first QA method focuses on the coordination of couch and gantry. For this test, we developed a cylindrical phantom which has a film insert. Using this phantom we delivered a plan with dynamic motion of the couch and gantry. We found the mean absolute deviation of the entrance position from its expected value to be 0.5 mm, with a standard deviation of 0.5 mm. This was within the tolerances set by the machine's mechanical accuracy and the setup accuracy of the phantom. We also present an altered picket fence test which adds dynamic and simultaneous rotations of the couch and the collimator. While the test was shown to be sensitive enough to discern errors of 1° and greater, we were unable to identify any errors in the coordination of the linac's collimator and couch. When operating under normal conditions, the Varian TrueBeam linac was able to pass both tests and is within tolerances acceptable for complex trajectory-based treatments.

  19. Sci—Sat AM: Stereo — 05: The Development of Quality Assurance Methods for Trajectory based Cranial SRS Treatments

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, B; Duzenli, C; Gete, E [BC Cancer Agency, Vancouver Cancer Centre (Canada); Teke, T [BC Cancer Agency, Centre for the Southern Interior (Canada)

    2014-08-15

    The goal of this work was to develop and validate non-planar linac beam trajectories defined by the dynamic motion of the gantry, couch, jaws, collimator and MLCs. This was conducted on the Varian TrueBeam linac by taking advantage of the linac's advanced control features in a non-clinical mode (termed developer mode). In this work, we present quality assurance methods that we have developed to test the positional and temporal accuracy of the linac's moving components. The first QA method focuses on the coordination of couch and gantry. For this test, we developed a cylindrical phantom which has a film insert. Using this phantom we delivered a plan with dynamic motion of the couch and gantry. We found the mean absolute deviation of the entrance position from its expected value to be 0.5 mm, with a standard deviation of 0.5 mm. This was within the tolerances set by the machine's mechanical accuracy and the setup accuracy of the phantom. We also present an altered picket fence test which adds dynamic and simultaneous rotations of the couch and the collimator. While the test was shown to be sensitive enough to discern errors of 1° and greater, we were unable to identify any errors in the coordination of the linac's collimator and couch. When operating under normal conditions, the Varian TrueBeam linac was able to pass both tests and is within tolerances acceptable for complex trajectory-based treatments.

  20. Application of the SAMR method to high magnetostrictive samples

    Science.gov (United States)

    Sanchez, P.; Lopez, E.; Trujillo, M. C. Sanchez; Aroca, C.

    1988-12-01

    Magnetostriction measurements using the small angle magnetization rotation (SAMR) method have been performed on highly magnetostrictive amorphous samples. To apply the SAMR method to these samples, a theoretical model of the influence of the internal stresses and the magnetization distribution has been proposed. The dependence of the magnetostriction, λs, on temperature and applied stress was measured in as-cast and in differently annealed samples. In the as-cast samples, the existence of a stray field and a dependence of λs on the applied stress were observed.

  1. Method for using polarization gating to measure a scattering sample

    Energy Technology Data Exchange (ETDEWEB)

    Baba, Justin S.

    2015-08-04

    Described herein are systems, devices, and methods facilitating optical characterization of scattering samples. A polarized optical beam can be directed to pass through a sample to be tested. The optical beam exiting the sample can then be analyzed to determine its degree of polarization, from which other properties of the sample can be determined. In some cases, an apparatus can include a source of an optical beam, an input polarizer, a sample, an output polarizer, and a photodetector. In some cases, a signal from a photodetector can be processed through attenuation, variable offset, and variable gain.

  2. DOE methods for evaluating environmental and waste management samples

    International Nuclear Information System (INIS)

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) is a resource intended to support sampling and analytical activities for the evaluation of environmental and waste management samples from U.S. Department of Energy (DOE) sites. DOE Methods is the result of extensive cooperation from all DOE analytical laboratories. All of these laboratories have contributed key information and provided technical reviews as well as significant moral support leading to the success of this document. DOE Methods is designed to encompass methods for collecting representative samples and for determining the radioisotope activity and organic and inorganic composition of a sample. These determinations will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the U.S. Environmental Protection Agency, or others. The development of DOE Methods is supported by the Laboratory Management Division of the DOE. Methods are prepared for entry into DOE Methods as chapter editors, together with DOE and other participants in this program, identify analytical and sampling method needs. Unique methods or methods consolidated from similar procedures in the DOE Procedures Database are selected for potential inclusion in this document. Initial selection is based largely on DOE needs and procedure applicability and completeness. Methods appearing in this document are one of two types, "Draft" or "Verified". "Draft" methods that have been reviewed internally and show potential for eventual verification are included in this document, but they have not been reviewed externally, and their precision and bias may not be known. "Verified" methods in DOE Methods have been reviewed by volunteers from various DOE sites and private corporations.

  3. Sampling and analysis methods for geothermal fluids and gases

    Energy Technology Data Exchange (ETDEWEB)

    Watson, J.C.

    1978-07-01

    The sampling procedures for geothermal fluids and gases include: sampling hot springs, fumaroles, etc.; sampling condensed brine and entrained gases; sampling steam-lines; low pressure separator systems; high pressure separator systems; two-phase sampling; downhole samplers; and miscellaneous methods. The recommended analytical methods compiled here cover physical properties, dissolved solids, and dissolved and entrained gases. The sequences of methods listed for each parameter are: wet chemical, gravimetric, colorimetric, electrode, atomic absorption, flame emission, x-ray fluorescence, inductively coupled plasma-atomic emission spectroscopy, ion exchange chromatography, spark source mass spectrometry, neutron activation analysis, and emission spectrometry. Material on correction of brine component concentrations for steam loss during flashing is presented. (MHR)

  4. Marshall Island radioassay quality assurance program an overview

    Energy Technology Data Exchange (ETDEWEB)

    Conrado, C.L.; Hamilton, T.F.; Kehl, S.R.; Robison, W.L.; Stoker, A.C.

    1998-09-01

    The Lawrence Livermore National Laboratory has developed an extensive quality assurance program to provide high quality data and assessments in support of the Marshall Islands Dose Assessment and Radioecology Program. Our quality assurance objectives begin with the premise of providing integrated and cost-effective program support (to meet wide-ranging programmatic needs, support scientific peer review and litigation defense, and build public confidence) and continue through the design and implementation of large-scale field programs, sampling and sample preparation, radiometric and chemical analyses, documentation of quality assurance/quality control practices, exposure assessments, and dose/risk assessments until publication. The basic structure of our radioassay quality assurance/quality control program can be divided into four essential elements: (1) sample and data integrity control; (2) instrument validation and calibration; (3) method performance testing, validation, development and documentation; and (4) periodic peer review and on-site assessments. While our quality assurance objectives are tailored towards a single research program and the evaluation of major exposure pathways/critical radionuclides pertinent to the Marshall Islands, we have attempted to develop quality assurance practices that are consistent with proposed criteria designed for laboratory accre

  5. A microRNA isolation method from clinical samples

    Directory of Open Access Journals (Sweden)

    Sepideh Zununi Vahed

    2016-03-01

    Conclusion: The current isolation method can be applied to most clinical samples, including cells, formalin-fixed and paraffin-embedded (FFPE) tissues, and even body fluids, and has wide applicability in molecular biology investigations.

  6. Probability Sampling Method for a Hidden Population Using Respondent-Driven Sampling: Simulation for Cancer Survivors.

    Science.gov (United States)

    Jung, Minsoo

    2015-01-01

    When there is no sampling frame within a certain group, or the group is concerned that making its population public would bring social stigma, we say the population is hidden. Such populations are difficult to approach with standard survey methodology because the response rate is low and members tend not to respond honestly when probability sampling is used. The only alternative known to address the problems of earlier approaches such as snowball sampling is respondent-driven sampling (RDS), which was developed by Heckathorn and his colleagues. RDS is based on a Markov chain and uses the social network information of the respondent. This characteristic allows for probability sampling when we survey a hidden population. We verified through computer simulation whether RDS can be used on a hidden population of cancer survivors. According to the simulation results of this thesis, the dependence of RDS chain-referral sampling on the initial seeds tends to diminish as the sample gets bigger, and the sample becomes stable as the waves progress. Therefore, the final sample information can be completely independent of the initial seeds if a certain sample size is secured, even if the initial seeds were selected through convenience sampling. Thus, RDS can be considered an alternative that can improve upon both key informant sampling and ethnographic surveys, and it should be utilized for various cases domestically as well. PMID:26107223
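
    The following toy simulation illustrates RDS-style chain referral on a random contact network, with recruitment proceeding in waves from a handful of seeds; the network model, coupon count, and target size are assumptions and do not reproduce the authors' cancer-survivor simulation.

        import random

        random.seed(42)
        N, MEAN_DEGREE, COUPONS = 2000, 8, 3            # hypothetical hidden population and design

        # Build a simple random contact network
        neighbours = {i: set() for i in range(N)}
        for i in range(N):
            for j in random.sample(range(N), MEAN_DEGREE // 2):
                if i != j:
                    neighbours[i].add(j)
                    neighbours[j].add(i)

        def rds(seeds, target):
            recruited, wave = set(seeds), list(seeds)
            while wave and len(recruited) < target:
                nxt = []
                for person in wave:                     # each respondent hands out a few coupons
                    candidates = [p for p in neighbours[person] if p not in recruited]
                    for c in random.sample(candidates, min(COUPONS, len(candidates))):
                        recruited.add(c)
                        nxt.append(c)
                wave = nxt                              # next wave of the referral chain
            return recruited

        sample = rds(seeds=random.sample(range(N), 5), target=500)
        print(len(sample))

    Running this repeatedly with different seed sets is the kind of experiment that lets one check how quickly the composition of the final sample becomes independent of the seeds.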

  7. An express method of Cs-137 determination in environmental samples

    International Nuclear Information System (INIS)

    A method for Cs-137 determination is suggested that is based on the use of complex ferro-ferricyanides of transition metals. The method has been tested at the Kozloduj NPP and at the IRT-2000 reactor in Sofia. Good agreement with the classic method (caesium precipitation in the form of Cs2PtCl6) was obtained for measurements of soil, river-bed deposits, plants, milk powder and other samples. The described method is easy to apply, inexpensive, and does not require highly qualified personnel. The radiochemical part of the analysis, without preliminary sample treatment, takes 12-24 hours.

  8. System and method for measuring fluorescence of a sample

    Energy Technology Data Exchange (ETDEWEB)

    Riot, Vincent J

    2015-03-24

    The present disclosure provides a system and a method for measuring the fluorescence of a sample. The sample may be a polymerase-chain-reaction (PCR) array, a loop-mediated-isothermal-amplification array, etc. LEDs are used to excite the sample, and a photodiode is used to collect the sample's fluorescence. An electronic offset signal is used to reduce the effects of background fluorescence and of noise from the measurement system. An integrator integrates the difference between the output of the photodiode and the electronic offset signal over a given period of time. The resulting integral is then converted into the digital domain for further processing and storage.

  9. Soil separator and sampler and method of sampling

    Science.gov (United States)

    O'Brien, Barry H [Idaho Falls, ID; Ritter, Paul D [Idaho Falls, ID

    2010-02-16

    A soil sampler includes a fluidized bed for receiving a soil sample. The fluidized bed may be in communication with a vacuum for drawing air through the fluidized bed and suspending particulate matter of the soil sample in the air. In a method of sampling, the air may be drawn across a filter, separating the particulate matter. Optionally, a baffle or a cyclone may be included within the fluidized bed for disentrainment, or dedusting, so only the finest particulate matter, including asbestos, will be trapped on the filter. The filter may be removable, and may be tested to determine the content of asbestos and other hazardous particulate matter in the soil sample.

  10. DOE methods for evaluating environmental and waste management samples

    International Nuclear Information System (INIS)

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) is a resource intended to support sampling and analytical activities for the evaluation of environmental and waste management samples from U.S. Department of Energy (DOE) sites. DOE Methods is the result of extensive cooperation from all DOE analytical laboratories. All of these laboratories have contributed key information and provided technical reviews as well as significant moral support leading to the success of this document. DOE Methods is designed to encompass methods for collecting representative samples and for determining the radioisotope activity and organic and inorganic composition of a sample. These determinations will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the U.S. Environmental Protection Agency, or others. The development of DOE Methods is supported by the Analytical Services Division of DOE. Unique methods or methods consolidated from similar procedures in the DOE Procedures Database are selected for potential inclusion in this document. Initial selection is based largely on DOE needs and procedure applicability and completeness. Methods appearing in this document are one of two types, "Draft" or "Verified". "Draft" methods that have been reviewed internally and show potential for eventual verification are included in this document, but they have not been reviewed externally, and their precision and bias may not be known. "Verified" methods in DOE Methods have been reviewed by volunteers from various DOE sites and private corporations. These methods have delineated measures of precision and accuracy.

  11. DOE methods for evaluating environmental and waste management samples

    Energy Technology Data Exchange (ETDEWEB)

    Goheen, S.C.; McCulloch, M.; Thomas, B.L.; Riley, R.G.; Sklarew, D.S.; Mong, G.M.; Fadeff, S.K. [eds.

    1994-10-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) is a resource intended to support sampling and analytical activities for the evaluation of environmental and waste management samples from U.S. Department of Energy (DOE) sites. DOE Methods is the result of extensive cooperation from all DOE analytical laboratories. All of these laboratories have contributed key information and provided technical reviews as well as significant moral support leading to the success of this document. DOE Methods is designed to encompass methods for collecting representative samples and for determining the radioisotope activity and organic and inorganic composition of a sample. These determinations will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the U.S. Environmental Protection Agency, or others. The development of DOE Methods is supported by the Analytical Services Division of DOE. Unique methods or methods consolidated from similar procedures in the DOE Procedures Database are selected for potential inclusion in this document. Initial selection is based largely on DOE needs and procedure applicability and completeness. Methods appearing in this document are one of two types, "Draft" or "Verified". "Draft" methods that have been reviewed internally and show potential for eventual verification are included in this document, but they have not been reviewed externally, and their precision and bias may not be known. "Verified" methods in DOE Methods have been reviewed by volunteers from various DOE sites and private corporations. These methods have delineated measures of precision and accuracy.

  12. Reliability analysis method for slope stability based on sample weight

    Directory of Open Access Journals (Sweden)

    Zhi-gang YANG

    2009-09-01

    The single safety factor criterion for slope stability evaluation, derived from the rigid limit equilibrium method or the finite element method (FEM), may not capture some important information, especially for steep slopes with complex geological conditions. This paper presents a new reliability method that uses sample weight analysis. Based on the distribution characteristics of the random variables, the minimal sample size of every random variable is extracted according to a small-sample t-distribution under a certain expected value, and the weight coefficient of each extracted sample is considered to be its contribution to the random variable. Then, the weight coefficients of the random sample combinations are determined using the Bayes formula, and different sample combinations are taken as the input for slope stability analysis. Through the one-to-one mapping between the input sample combination and the output safety coefficient, the reliability index of slope stability can be obtained with the multiplication principle. Slope stability analysis of the left bank of the Baihetan Project is used as an example, and the results show that the present method is reasonable and practicable for the reliability analysis of steep slopes with complex geological conditions.
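
    The weighting logic can be sketched with a hypothetical two-variable example; the safety-factor function, sample points, and weights below are invented stand-ins for the FEM-based stability analysis and t-distribution sampling described above.

        import itertools
        from statistics import NormalDist

        # Hypothetical random variables: cohesion c (kPa) and friction angle phi (deg),
        # each represented by a few weighted sample points (weights sum to 1).
        c_samples = [(15.0, 0.2), (20.0, 0.6), (25.0, 0.2)]
        phi_samples = [(28.0, 0.25), (32.0, 0.5), (36.0, 0.25)]

        def safety_factor(c, phi):
            """Hypothetical stand-in for the FEM/limit-equilibrium stability analysis."""
            return 0.025 * c + 0.02 * phi

        p_failure = 0.0
        for (c, wc), (phi, wp) in itertools.product(c_samples, phi_samples):
            weight = wc * wp                       # weight of this input combination (multiplication principle)
            if safety_factor(c, phi) < 1.0:        # one-to-one mapping: combination -> safety factor
                p_failure += weight

        # Reliability index from the accumulated failure probability (standard normal inverse).
        beta = NormalDist().inv_cdf(1.0 - p_failure) if 0 < p_failure < 1 else float("inf")
        print(p_failure, beta)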

  13. Comparison of surface sampling methods for virus recovery from fomites.

    Science.gov (United States)

    Julian, Timothy R; Tamayo, Francisco J; Leckie, James O; Boehm, Alexandria B

    2011-10-01

    The role of fomites in infectious disease transmission relative to other exposure routes is difficult to discern due, in part, to the lack of information on the level and distribution of virus contamination on surfaces. Comparisons of studies intending to fill this gap are difficult because multiple different sampling methods are employed and authors rarely report their method's lower limit of detection. In the present study, we compare a subset of sampling methods identified from a literature review to demonstrate that sampling method significantly influences study outcomes. We then compare a subset of methods identified from the review to determine the most efficient methods for recovering virus from surfaces in a laboratory trial using MS2 bacteriophage as a model virus. Recoveries of infective MS2 and MS2 RNA are determined using both a plaque assay and quantitative reverse transcription-PCR, respectively. We conclude that the method that most effectively recovers virus from nonporous fomites uses polyester-tipped swabs prewetted in either one-quarter-strength Ringer's solution or saline solution. This method recovers a median fraction for infective MS2 of 0.40 and for MS2 RNA of 0.07. Use of the proposed method for virus recovery in future fomite sampling studies would provide opportunities to compare findings across multiple studies.

  14. Using re-sampling methods in mortality studies.

    Directory of Open Access Journals (Sweden)

    Igor Itskovich

    Full Text Available Traditional methods of computing standardized mortality ratios (SMR in mortality studies rely upon a number of conventional statistical propositions to estimate confidence intervals for obtained values. Those propositions include a common but arbitrary choice of the confidence level and the assumption that observed number of deaths in the test sample is a purely random quantity. The latter assumption may not be fully justified for a series of periodic "overlapping" studies. We propose a new approach to evaluating the SMR, along with its confidence interval, based on a simple re-sampling technique. The proposed method is most straightforward and requires neither the use of above assumptions nor any rigorous technique, employed by modern re-sampling theory, for selection of a sample set. Instead, we include all possible samples that correspond to the specified time window of the study in the re-sampling analysis. As a result, directly obtained confidence intervals for repeated overlapping studies may be tighter than those yielded by conventional methods. The proposed method is illustrated by evaluating mortality due to a hypothetical risk factor in a life insurance cohort. With this method used, the SMR values can be forecast more precisely than when using the traditional approach. As a result, the appropriate risk assessment would have smaller uncertainties.
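
    The windowed re-sampling idea can be sketched in a few lines (our illustration, not the authors' code): the SMR (observed/expected deaths) is computed for every overlapping window of fixed length within the study period, and the spread of those window SMRs serves as an empirical interval. The death counts and expected values below are hypothetical.

```python
# Minimal sketch of SMR estimation from all overlapping study windows.
# The yearly observed and expected death counts are hypothetical.
import numpy as np

observed = np.array([12, 9, 15, 11, 8, 14, 10, 13, 9, 12])   # deaths per year
expected = np.array([10.2, 10.5, 10.8, 11.0, 11.1, 11.3, 11.4, 11.6, 11.7, 11.9])
window = 5                                                    # years per overlapping window

smrs = []
for start in range(len(observed) - window + 1):
    obs = observed[start:start + window].sum()
    exp = expected[start:start + window].sum()
    smrs.append(obs / exp)                                    # SMR of this window

smrs = np.array(smrs)
print(f"overall SMR = {observed.sum() / expected.sum():.3f}")
print(f"window SMRs = {np.round(smrs, 3)}")
print(f"empirical 95% range: {np.percentile(smrs, 2.5):.3f} - {np.percentile(smrs, 97.5):.3f}")
```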

  15. Method of monitoring core sampling during borehole drilling

    Energy Technology Data Exchange (ETDEWEB)

    Duckworth, A.; Barnes, D.; Gennings, T.L.

    1991-04-30

    This paper describes a method of monitoring the acquisition of a core sample obtained from a coring tool on a drillstring in a borehole. It comprises: measuring the weight of a portion of the drillstring at a pre-selected borehole depth when the drillstring is off bottom to define a first measurement; drilling to acquire a core sample; measuring the weight of a portion of the drillstring at a pre-selected borehole depth when the drillstring is off-bottom to define a second measurement; determining the difference between the first and second measurements, the difference corresponding to the weight of the core sample; and comparing the measured core sample weight to a calculated weight of a full core sample to determine if the core sample has been acquired.
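
    The check described in the claim reduces to a weight difference compared against the calculated full-core weight; a toy sketch with hypothetical numbers:

```python
# Toy sketch of the core-acquisition check: compare the off-bottom drillstring
# weight difference with the calculated weight of a full core. All numbers and
# the tolerance are hypothetical.
def core_acquired(weight_before_lb, weight_after_lb, full_core_weight_lb, tolerance=0.15):
    measured = weight_after_lb - weight_before_lb          # weight of the recovered core
    fraction = measured / full_core_weight_lb              # fraction of a full core
    return fraction >= (1.0 - tolerance), measured, fraction

ok, measured, fraction = core_acquired(185_000.0, 185_420.0, 450.0)
print(f"measured core weight = {measured:.0f} lb ({fraction:.0%} of full core) -> acquired: {ok}")
```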

  16. Extending the alias Monte Carlo sampling method to general distributions

    International Nuclear Information System (INIS)

    The alias method is a Monte Carlo sampling technique that offers significant advantages over more traditional methods. It equals the accuracy of table lookup and the speed of equal probable bins. The original formulation of this method sampled from discrete distributions and was easily extended to histogram distributions. We have extended the method further to applications more germane to Monte Carlo particle transport codes: continuous distributions. This paper presents the alias method as originally derived and our extensions to simple continuous distributions represented by piecewise linear functions. We also present a method to interpolate accurately between distributions tabulated at points other than the point of interest. We present timing studies that demonstrate the method's increased efficiency over table lookup and show further speedup achieved through vectorization. 6 refs., 12 figs., 2 tabs
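
    The discrete alias method that the paper extends is easy to state in code. The sketch below (in Python, not the paper's transport-code implementation) builds a Walker alias table and then draws each sample with one equal-probability bin choice and one comparison, which is what gives the method table-lookup accuracy at equal-probable-bin speed; the continuous extension described in the paper is not reproduced here.

```python
# Walker alias method for a discrete distribution: O(n) table construction,
# O(1) work per sample.
import random

def build_alias(probs):
    n = len(probs)
    scaled = [p * n for p in probs]
    prob, alias = [0.0] * n, [0] * n
    small = [i for i, s in enumerate(scaled) if s < 1.0]
    large = [i for i, s in enumerate(scaled) if s >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        prob[s], alias[s] = scaled[s], l
        scaled[l] -= 1.0 - scaled[s]                 # move the leftover mass onto bin l
        (small if scaled[l] < 1.0 else large).append(l)
    for i in small + large:                          # remaining bins are (numerically) full
        prob[i] = 1.0
    return prob, alias

def sample_alias(prob, alias):
    i = random.randrange(len(prob))                  # pick an equal-probability bin
    return i if random.random() < prob[i] else alias[i]

prob, alias = build_alias([0.1, 0.2, 0.3, 0.4])
counts = [0, 0, 0, 0]
for _ in range(100_000):
    counts[sample_alias(prob, alias)] += 1
print([c / 100_000 for c in counts])                 # approximately [0.1, 0.2, 0.3, 0.4]
```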

  17. Methods of analysis and quality-assurance practices of the U.S. Geological Survey organic laboratory, Sacramento, California; determination of pesticides in water by solid-phase extraction and capillary-column gas chromatography/mass spectrometry

    Science.gov (United States)

    Crepeau, Kathryn L.; Domagalski, Joseph L.; Kuivila, Kathryn M.

    1994-01-01

    Analytical method and quality-assurance practices were developed for a study of the fate and transport of pesticides in the Sacramento-San Joaquin Delta and the Sacramento and San Joaquin River. Water samples were filtered to remove suspended particulate matter and pumped through C-8 solid-phase extraction cartridges to extract the pesticides. The cartridges were dried with carbon dioxide, and the pesticides were eluted with three 2-milliliter aliquots of hexane:diethyl ether (1:1). The eluants were analyzed using capillary-column gas chromatography/mass spectrometry in full-scan mode. Method detection limits for analytes determined per 1,500-milliliter samples ranged from 0.006 to 0.047 microgram per liter. Recoveries ranged from 47 to 89 percent for 12 pesticides in organic-free, Sacramento River and San Joaquin River water samples fortified at 0.05 and 0.26 microgram per liter. The method was modified to improve the pesticide recovery by reducing the sample volume to 1,000 milliliters. Internal standards were added to improve quantitative precision and accuracy. The analysis also was expanded to include a total of 21 pesticides. The method detection limits for 1,000-milliliter samples ranged from 0.022 to 0.129 microgram per liter. Recoveries ranged from 38 to 128 percent for 21 pesticides in organic-free, Sacramento River and San Joaquin River water samples fortified at 0.10 and 0.75 microgram per liter.

  18. Heat-capacity measurements on small samples: The hybrid method

    NARCIS (Netherlands)

    J.C.P. Klaasse; E.H. Brück

    2008-01-01

    A newly developed method is presented for measuring heat capacities on small samples, particularly where thermal isolation is not sufficient for the use of the traditional semiadiabatic heat-pulse technique. This "hybrid technique" is a modification of this heat-pulse method in case the temperature

  19. Sampling Methods for Varroa Mites on the Domesticated Honeybee

    OpenAIRE

    Barlow, Vonny M.; Richard D. Fell

    2009-01-01

    Varroa mites are serious pests of the apiculture industry throughout the Americas. Various methods have been used to determine if a colony is infested with varroa mites necessitating some type of control. This publication presents various varroa sampling methods and compares their relative effectiveness.

  20. Beryllium Wipe Sampling (differing methods - differing exposure potentials)

    Energy Technology Data Exchange (ETDEWEB)

    Kerr, Kent

    2005-03-09

    This research compared three wipe sampling techniques currently used to test for beryllium contamination on room and equipment surfaces in Department of Energy facilities. Efficiencies of removal of beryllium contamination from typical painted surfaces were tested by wipe sampling without a wetting agent, with water-moistened wipe materials, and by methanol-moistened wipes. Analysis indicated that methanol-moistened wipe sampling removed about twice as much beryllium/oil-film surface contamination as water-moistened wipes, which removed about twice as much residue as dry wipes. Criteria at 10 CFR 850.30 and .31 were established on unspecified wipe sampling method(s). The results of this study reveal a need to identify criteria-setting method and equivalency factors. As facilities change wipe sampling methods among the three compared in this study, these results may be useful for approximate correlations. Accurate decontamination decision-making depends on the selection of appropriate wetting agents for the types of residues and surfaces. Evidence for beryllium sensitization via skin exposure argues in favor of wipe sampling with wetting agents that provide enhanced removal efficiency such as methanol when surface contamination includes oil mist residue.

  1. SU-E-I-60: Quality Assurance Testing Methods and Customized Phantom for Magnetic Resonance Imaging and Spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Song, K-H; Lee, D-W; Choe, B-Y [Department of Biomedical Engineering, Research Institute of Biomedical Engineering, College of Medicine, The Catholic University of Korea, Seoul, Seoul (Korea, Republic of)

    2015-06-15

    Purpose: The objectives of this study are to develop a magnetic resonance imaging and spectroscopy (MRI-MRS) fused phantom along with the inserts for metabolite quantification and to conduct quantitative analysis and evaluation of the layered vials of brain-mimicking solution for quality assurance (QA) performance, according to the localization sequence. Methods: The outer cylindrical phantom body is made of acrylic materials. The section other than where the inner vials are located was filled with copper sulfate and diluted with water so as to reduce the T1 relaxation time. Sodium chloride was included to provide conductivity similar to the human body. All measurements of MRI and MRS were made using a 3.0 T scanner (Achiva Tx 3.0 T; Philips Medical Systems, Netherlands). The MRI scan parameters were as follows: (1) spin echo (SE) T1-weighted image: repetition time (TR), 500ms; echo time (TE), 20ms; matrix, 256×256; field of view (FOV), 250mm; gap, 1mm; number of signal averages (NSA), 1; (2) SE T2-weighted image: TR, 2,500 ms; TE, 80 ms; matrix, 256×256; FOV, 250mm; gap, 1mm; NSA, 1; 23 slice images were obtained with slice thickness of 5mm. The water signal of each volume of interest was suppressed by variable pulse power and optimized relaxation delays (VAPOR) applied before the scan. By applying a point-resolved spectroscopy sequence, the MRS scan parameters were as follows: voxel size, 0.8×0.8×0.8 cm³; TR, 2,000ms; TE, 35ms; NSA, 128. Results: Using the fused phantom, the results of measuring MRI factors were: geometric distortion, <2% and ±2 mm; image intensity uniformity, 83.09±1.33%; percent-signal ghosting, 0.025±0.004; low-contrast object detectability, 27.85±0.80. In addition, the signal-to-noise ratio of N-acetyl-aspartate was consistently high (42.00±5.66). Conclusion: The MRI-MRS QA factors obtained simultaneously using the phantom can facilitate evaluation of both images and spectra, and provide guidelines for obtaining MRI and MRS QA

  2. Sample preparation methods for determination of drugs of abuse in hair samples: A review.

    Science.gov (United States)

    Vogliardi, Susanna; Tucci, Marianna; Stocchero, Giulia; Ferrara, Santo Davide; Favretto, Donata

    2015-02-01

    Hair analysis has assumed increasing importance in the determination of substances of abuse, both in clinical and forensic toxicology investigations. Hair analysis offers particular advantages over other biological matrices (blood and urine), including a larger window of detection, ease of collection and sample stability. In the present work, an overview of sample preparation techniques for the determination of substances of abuse in hair is provided, specifically regarding the principal steps in hair sample treatment-decontamination, extraction and purification. For this purpose, a survey of publications found in the MEDLINE database from 2000 to date was conducted. The most widely consumed substances of abuse and psychotropic drugs were considered. Trends in simplification of hair sample preparation, washing procedures and cleanup methods are discussed. Alternative sample extraction techniques, such as head-space solid phase microextraction (HS-SPDE), supercritical fluid extraction (SFE) and molecularly imprinted polymers (MIP) are also reported. PMID:25604816

  3. Sampling and sample preparation methods for the analysis of trace elements in biological material

    International Nuclear Information System (INIS)

    The authors attempt to give as systematic a treatment as possible of the sampling and sample preparation of biological material (particularly in human medicine) for trace analysis (e.g. neutron activation analysis, atomic absorption spectrometry). Contamination and loss problems are discussed, as are the manifold problems arising from the differing consistency of solid and liquid biological materials and the stabilization of the sample material. The processes of dry and wet ashing are dealt with in particular, and new methods are also described. (RB)

  4. An algorithm to improve sampling efficiency for uncertainty propagation using sampling based method

    Energy Technology Data Exchange (ETDEWEB)

    Campolina, Daniel; Lima, Paulo Rubens I., E-mail: campolina@cdtn.br, E-mail: pauloinacio@cpejr.com.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil). Servico de Tecnologia de Reatores; Pereira, Claubia; Veloso, Maria Auxiliadora F., E-mail: claubia@nuclear.ufmg.br, E-mail: dora@nuclear.ufmg.br [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Dept. de Engenharia Nuclear

    2015-07-01

    Sample size and computational uncertainty were varied in order to investigate sample efficiency and convergence of the sampling-based method for uncertainty propagation. The transport code MCNPX was used to simulate a LWR model and allow the mapping from uncertain inputs of the benchmark experiment to uncertain outputs. Random sampling efficiency was improved through the use of an algorithm for selecting distributions. Mean range, standard deviation range and skewness were verified in order to obtain a better representation of the uncertainty figures. A standard deviation of 5 pcm in the propagated uncertainties over 10 replicates of n samples was adopted as the convergence criterion for the method. An estimate of 75 pcm uncertainty on reactor k-eff was obtained by using a sample of size 93 and a computational uncertainty of 28 pcm to propagate the 1σ uncertainty of the burnable poison radius. For a fixed computational time, in order to reduce the variance of the propagated uncertainty, it was found for the example under investigation that it is preferable to double the sample size rather than to double the number of particles followed in the MCNPX Monte Carlo process. (author)
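
    The propagation loop itself is simple; the sketch below replaces the expensive MCNPX transport calculation with a hypothetical cheap surrogate, so the numbers it prints are purely illustrative.

```python
# Minimal sketch of sampling-based uncertainty propagation: sample the
# uncertain input, run each sample through a model, and take the spread of the
# outputs as the propagated uncertainty. model() is a hypothetical placeholder
# for the expensive transport calculation.
import numpy as np

rng = np.random.default_rng(0)

def model(radius_cm):
    """Placeholder for the expensive code (e.g. an MCNPX k-eff calculation)."""
    return 1.0050 - 0.020 * (radius_cm - 0.60)          # hypothetical linear response

n_samples = 93                                          # sample size used in the paper
radius = rng.normal(loc=0.60, scale=0.01, size=n_samples)   # 1-sigma input uncertainty
keff = np.array([model(r) for r in radius])

propagated = keff.std(ddof=1)                           # propagated 1-sigma output uncertainty
print(f"propagated k-eff uncertainty = {propagated * 1e5:.0f} pcm")
```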

  5. Self-contained cryogenic gas sampling apparatus and method

    Science.gov (United States)

    McManus, G.J.; Motes, B.G.; Bird, S.K.; Kotter, D.K.

    1996-03-26

    The apparatus for obtaining a whole gas sample is composed of: a sample vessel having an inlet for receiving a gas sample; a controllable valve mounted for controllably opening and closing the inlet; a valve control coupled to the valve for opening and closing the valve at selected times; a portable power source connected for supplying operating power to the valve control; and a cryogenic coolant in thermal communication with the vessel for cooling the interior of the vessel to cryogenic temperatures. A method is described for obtaining an air sample using the apparatus described above, by: placing the apparatus at a location at which the sample is to be obtained; operating the valve control to open the valve at a selected time and close the valve at a selected subsequent time; and between the selected times maintaining the vessel at a cryogenic temperature by heat exchange with the coolant. 3 figs.

  6. Fluidics platform and method for sample preparation and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Benner, W. Henry; Dzenitis, John M.; Bennet, William J.; Baker, Brian R.

    2014-08-19

    Herein provided are a fluidics platform and method for sample preparation and analysis. The fluidics platform is capable of analyzing DNA from blood samples using amplification assays such as polymerase-chain-reaction assays and loop-mediated-isothermal-amplification assays. The fluidics platform can also be used for other types of assays and analyses. In some embodiments, a sample in a sealed tube can be inserted directly. The following isolation, detection, and analyses can be performed without a user's intervention. The disclosed platform may also comprise a sample preparation system with a magnetic actuator, a heater, and an air-drying mechanism, and fluid manipulation processes for extraction, washing, elution, assay assembly, assay detection, and cleaning after reactions and between samples.

  7. Computing Greeks with Multilevel Monte Carlo Methods using Importance Sampling

    OpenAIRE

    Euget, Thomas

    2012-01-01

    This paper presents a new efficient way to reduce the variance of an estimator of popular payoffs and Greeks encountered in financial mathematics. The idea is to apply Importance Sampling with the Multilevel Monte Carlo method recently introduced by M.B. Giles. So far, Importance Sampling has proved successful in combination with the standard Monte Carlo method. We show the efficiency of our approach on the estimation of financial derivatives prices and then on the estimation of Greeks (i.e. sensitivitie...

  8. Compressive sampling in computed tomography: Method and application

    International Nuclear Information System (INIS)

    Since Donoho and Candes et al. published their groundbreaking work on compressive sampling or compressive sensing (CS), CS theory has attracted a lot of attention and become a hot topic, especially in biomedical imaging. Specifically, some CS based methods have been developed to enable accurate reconstruction from sparse data in computed tomography (CT) imaging. In this paper, we will review the progress in CS based CT from aspects of three fundamental requirements of CS: sparse representation, incoherent sampling and reconstruction algorithm. In addition, some potential applications of compressive sampling in CT are introduced
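
    As a toy illustration of the three ingredients (our example, not taken from the reviewed work), the sketch below builds a sparse signal, measures it with an incoherent random matrix, and reconstructs it by iterative soft thresholding (ISTA). CT reconstructions typically use a sparsifying transform such as total variation rather than direct sparsity, but the structure of the iteration is the same.

```python
# Toy compressive-sampling example: sparse signal, incoherent random
# measurements, ISTA reconstruction. All sizes are arbitrary example values.
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 200, 80, 8                              # signal length, measurements, nonzeros

x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)   # sparse signal
A = rng.normal(size=(m, n)) / np.sqrt(m)                        # incoherent sampling matrix
y = A @ x_true                                                  # undersampled measurements

# ISTA: x <- soft_threshold(x + t * A^T (y - A x), t * lam)
t = 1.0 / np.linalg.norm(A, 2) ** 2               # step size from the largest singular value
lam = 0.01
x = np.zeros(n)
for _ in range(500):
    x = x + t * A.T @ (y - A @ x)                 # gradient step on the data-fit term
    x = np.sign(x) * np.maximum(np.abs(x) - t * lam, 0.0)       # soft threshold (sparsity)

print(f"relative reconstruction error: {np.linalg.norm(x - x_true) / np.linalg.norm(x_true):.3e}")
```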

  9. Compressive sampling in computed tomography: Method and application

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Zhanli [Paul Lauterbur Center for Biomedical Imaging Research, Institute of Biomedical and Health Engineering, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen (China); Beijing Center for Mathematics and Information Interdisciplinary Sciences, Beijing (China); Shenzhen Key Lab for Molecular Imaging, Shenzhen (China); Liang, Dong, E-mail: dong.liang@siat.ac.cn [Paul Lauterbur Center for Biomedical Imaging Research, Institute of Biomedical and Health Engineering, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen (China); Beijing Center for Mathematics and Information Interdisciplinary Sciences, Beijing (China); Xia, Dan [University of Pennsylvania, Philadelphia (United States); Zheng, Hairong [Paul Lauterbur Center for Biomedical Imaging Research, Institute of Biomedical and Health Engineering, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen (China); Beijing Center for Mathematics and Information Interdisciplinary Sciences, Beijing (China)

    2014-06-01

    Since Donoho and Candes et al. published their groundbreaking work on compressive sampling or compressive sensing (CS), CS theory has attracted a lot of attention and become a hot topic, especially in biomedical imaging. Specifically, some CS based methods have been developed to enable accurate reconstruction from sparse data in computed tomography (CT) imaging. In this paper, we will review the progress in CS based CT from aspects of three fundamental requirements of CS: sparse representation, incoherent sampling and reconstruction algorithm. In addition, some potential applications of compressive sampling in CT are introduced.

  10. Comparison of pigment content of paint samples using spectrometric methods

    Science.gov (United States)

    Trzcińska, Beata; Kowalski, Rafał; Zięba-Palus, Janina

    2014-09-01

    The aim of the paper was to evaluate the influence of pigment concentration and its distribution in the polymer binder on the possibility of colour identification and paint sample comparison. Two sets of paint samples, one containing a red and the other a green pigment, were prepared. Each set consisted of 13 samples differing gradually in the concentration of pigment. To obtain the sets of various colour shades, white paint was mixed with the appropriate pigment in the form of a concentrated suspension. After solvent evaporation the samples were examined using spectrometric methods. The resin and main filler were identified by the IR method. Colour and white pigments were identified on the basis of Raman spectra. The colours of the samples were compared based on Vis spectrometry according to colour theory. It was found that the samples are homogeneous (the parameter measuring colour similarity ΔE < 3). The values of ΔE between neighbouring samples in a set followed a decreasing linear function, and between the first sample and each subsequent one a logarithmic function.
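
    The colour comparison rests on the ΔE distance between samples in CIELAB space. A minimal sketch using the simple CIE76 definition, with hypothetical L*a*b* values:

```python
# CIE76 colour difference: Euclidean distance between two CIELAB coordinates.
# The Lab values below are hypothetical examples.
import math

def delta_e_cie76(lab1, lab2):
    return math.dist(lab1, lab2)

sample_a = (52.3, 41.8, 18.2)   # (L*, a*, b*) of one red paint sample
sample_b = (51.6, 40.9, 17.5)   # (L*, a*, b*) of a neighbouring sample in the set

de = delta_e_cie76(sample_a, sample_b)
print(f"dE = {de:.2f} -> {'homogeneous (dE < 3)' if de < 3 else 'distinct colours'}")
```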

  11. Methods for Sampling and Measurement of Compressed Air Contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Stroem, L.

    1976-10-15

    In order to improve the technique for measuring oil and water entrained in a compressed air stream, a laboratory study has been made of some methods for sampling and measurement. For this purpose water or oil as artificial contaminants were injected in thin streams into a test loop, carrying dry compressed air. Sampling was performed in a vertical run, down-stream of the injection point. Wall attached liquid, coarse droplet flow, and fine droplet flow were sampled separately. The results were compared with two-phase flow theory and direct observation of liquid behaviour. In a study of sample transport through narrow tubes, it was observed that, below a certain liquid loading, the sample did not move, the liquid remaining stationary on the tubing wall. The basic analysis of the collected samples was made by gravimetric methods. Adsorption tubes were used with success to measure water vapour. A humidity meter with a sensor of the aluminium oxide type was found to be unreliable. Oil could be measured selectively by a flame ionization detector, the sample being pretreated in an evaporation- condensation unit

  12. Off-axis angular spectrum method with variable sampling interval

    Science.gov (United States)

    Kim, Yong-Hae; Byun, Chun-Won; Oh, Himchan; Pi, Jae-Eun; Choi, Ji-Hun; Kim, Gi Heon; Lee, Myung-Lae; Ryu, Hojun; Hwang, Chi-Sun

    2015-08-01

    We propose a novel off-axis angular spectrum method (ASM) for simulating free-space wave propagation with a largely shifted destination plane. The off-axis simulation treats wave propagation between a source plane and a parallel destination plane that is shifted laterally with respect to the source plane. The shifted angular spectrum method was previously proposed for diffraction simulation with a shifted destination plane; it satisfies the Nyquist sampling condition by limiting the bandwidth of the propagated field to avoid aliasing errors due to undersampling. However, the effective sampling number of the shifted ASM decreases when the shift distance of the destination plane is large, which causes numerical error in the diffraction simulation. To compensate for the decrease of the effective sampling number for a largely shifted destination plane, we use a variable sampling interval in Fourier space to maintain the same effective sampling number independent of the shift distance of the destination plane. As a result, our proposed off-axis ASM with a variable sampling interval produces simulation results with high accuracy for nearly every shift distance of the destination plane when the off-axis angle is less than 75°. We compared the performance of the off-axis ASM using the chirp Z-transform and the non-uniform FFT for implementing the variable spatial frequency in Fourier space.
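
    For context, the baseline on-axis angular spectrum propagation (not the paper's shifted, variable-sampling-interval variant) can be written compactly; the grid size, pixel pitch and wavelength below are arbitrary example values.

```python
# Baseline on-axis angular spectrum propagation of a sampled complex field.
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a square complex field a distance z in free space."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)                        # spatial frequencies
    fX, fY = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * fX) ** 2 - (wavelength * fY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)                 # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Example: propagate a small circular aperture illuminated by a plane wave.
n, dx, wl = 512, 2e-6, 633e-9                           # samples, pitch (m), wavelength (m)
x = (np.arange(n) - n // 2) * dx
X, Y = np.meshgrid(x, x)
aperture = (X**2 + Y**2 < (100e-6) ** 2).astype(complex)
out = angular_spectrum_propagate(aperture, wl, dx, z=5e-3)
print(f"peak intensity after 5 mm: {np.abs(out).max()**2:.3f}")
```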

  13. Methods for Sampling and Measurement of Compressed Air Contaminants

    International Nuclear Information System (INIS)

    In order to improve the technique for measuring oil and water entrained in a compressed air stream, a laboratory study has been made of some methods for sampling and measurement. For this purpose water or oil as artificial contaminants were injected in thin streams into a test loop, carrying dry compressed air. Sampling was performed in a vertical run, down-stream of the injection point. Wall attached liquid, coarse droplet flow, and fine droplet flow were sampled separately. The results were compared with two-phase flow theory and direct observation of liquid behaviour. In a study of sample transport through narrow tubes, it was observed that, below a certain liquid loading, the sample did not move, the liquid remaining stationary on the tubing wall. The basic analysis of the collected samples was made by gravimetric methods. Adsorption tubes were used with success to measure water vapour. A humidity meter with a sensor of the aluminium oxide type was found to be unreliable. Oil could be measured selectively by a flame ionization detector, the sample being pretreated in an evaporation- condensation unit

  14. Noninvasive methods for estradiol recovery from infant fecal samples

    Directory of Open Access Journals (Sweden)

    Amanda L Thompson

    2010-11-01

    Full Text Available While the activation of the infant hypothalamic-pituitary-gonadal (HPG) axis and the existence of a postnatal gonadotrophin surge were first documented in the early 1970s, study of the longitudinal development of gonadal hormones in infancy, and the potential physiological and behavioral correlates of this development, have been hampered by reliance on infrequent serum sampling. The present study reports the validation of a noninvasive method for repeated assessment of steroid hormones in infant fecal samples. Fecal samples were collected in, and excised from, cotton diaper liners and extracted using methanol. Extracts were analyzed for estradiol using a diluted assay modification. Method validity was supported by a steroid recovery rate of at least 80%, a sensitivity of 0.35 pg/ml, and inter- and intra-assay coefficients of variation of less than 10% and 20%, respectively. Variation in estradiol concentration was assessed across (1) sample type (scraped vs. cut from diaper liner), (2) time of day (morning vs. afternoon/evening samples), (3) time interval between samples, and (4) time to assay (1 day vs. 489 days after collection). Of these characteristics, only the time interval between samples within an individual was significantly associated with estradiol concentration. This is the first report of human infant fecal estradiol levels. The results support fecal recovery as a novel and powerful noninvasive tool for longitudinal studies of human infants, expanding research opportunities for investigating the development of sex-specific behaviors in infancy and the potential effects of endocrine disruptors on development.

  15. Rapid Method For Determination Of Radiostrontium In Emergency Milk Samples

    International Nuclear Information System (INIS)

    A new rapid separation method for radiostrontium in emergency milk samples was developed at the Savannah River Site (SRS) Environmental Bioassay Laboratory (Aiken, SC, USA) that allows rapid separation and measurement of Sr-90 within 8 hours. The new method uses calcium phosphate precipitation, nitric acid dissolution of the precipitate to coagulate residual fat/proteins, and a rapid strontium separation using Sr Resin (Eichrom Technologies, Darien, IL, USA) with vacuum-assisted flow rates. The method is much faster than previous methods that use calcination or cation-exchange pretreatment, has excellent chemical recovery, and effectively removes beta interferences. When a 100 ml sample aliquot is used, the method has a detection limit of 0.5 Bq/L, well below generic emergency action levels.

  16. On the Exploitation of Sensitivity Derivatives for Improving Sampling Methods

    Science.gov (United States)

    Cao, Yanzhao; Hussaini, M. Yousuff; Zang, Thomas A.

    2003-01-01

    Many application codes, such as finite-element structural analyses and computational fluid dynamics codes, are capable of producing many sensitivity derivatives at a small fraction of the cost of the underlying analysis. This paper describes a simple variance reduction method that exploits such inexpensive sensitivity derivatives to increase the accuracy of sampling methods. Three examples, including a finite-element structural analysis of an aircraft wing, are provided that illustrate an order of magnitude improvement in accuracy for both Monte Carlo and stratified sampling schemes.
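
    One simple way to exploit such derivatives, sketched here under our own assumptions rather than as the paper's exact scheme, is to use the first-order Taylor expansion of the response as a control variate whose mean is known exactly; the response() function below is a cheap stand-in for an analysis code.

```python
# Variance reduction with sensitivity derivatives: the linearized response
# serves as a control variate with a known mean. response() and all numbers
# are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(2)

def response(x):                                   # stand-in for an expensive analysis code
    return np.sin(x[..., 0]) + 0.5 * x[..., 1] ** 2

mu = np.array([0.3, 1.0])                          # input means
sigma = np.array([0.05, 0.10])                     # input standard deviations
grad = np.array([np.cos(mu[0]), mu[1]])            # sensitivity derivatives at the mean

n = 2_000
x = mu + sigma * rng.normal(size=(n, 2))
f = response(x)
g = response(mu) + (x - mu) @ grad                 # linearized surrogate, E[g] = response(mu)

plain = f.mean()
controlled = (f - g).mean() + response(mu)         # control-variate estimator (unbiased)
print(f"plain MC mean      : {plain:.5f}  (std err {f.std(ddof=1)/np.sqrt(n):.2e})")
print(f"with sensitivities : {controlled:.5f}  (std err {(f - g).std(ddof=1)/np.sqrt(n):.2e})")
```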

  17. Optimizing the atmospheric sampling sites using fuzzy mathematic methods

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    A new approach applying fuzzy mathematic theorems, including the Primary Matrix Element Theorem and the Fisher Classification Method, was established to solve the optimization problem of atmospheric environmental sampling sites. On this basis, an application to the optimization of sampling sites in atmospheric environmental monitoring was discussed. The method was proven to be suitable and effective. The results were accepted and applied by the Environmental Protection Bureau (EPB) of many cities of China. A set of computer software implementing this approach was also compiled and used.

  18. The use of microbead-based spoligotyping for Mycobacterium tuberculosis complex to evaluate the quality of the conventional method: Providing guidelines for Quality Assurance when working on membranes

    Directory of Open Access Journals (Sweden)

    Garzelli Carlo

    2011-04-01

    Full Text Available Abstract Background The classical spoligotyping technique, relying on membrane reverse line-blot hybridization of the spacers of the Mycobacterium tuberculosis CRISPR locus, is used world-wide (598 references in PubMed on April 8th, 2011). However, until now no inter-laboratory quality control study had been undertaken to validate this technique. We analyzed the quality of membrane-based spoligotyping by comparing it to the recently introduced and highly robust microbead-based spoligotyping. Nine hundred and twenty-seven isolates were analyzed, totaling 39,861 data points. Samples were received from 11 international laboratories with a worldwide distribution. Methods The high-throughput microbead-based spoligotyping was performed on CTAB and thermolyzate DNA extracted from isolated Mycobacterium tuberculosis complex (MTC) strains coming from the participating genotyping centers. Information on how the classical spoligotyping method was performed by each center was available. Genotype discriminatory analyses were carried out by comparing the spoligotypes obtained by both methods. The non-parametric Mann-Whitney U homogeneity test and the Spearman rank correlation test were performed to validate the observed results. Results Seven out of the 11 laboratories (63%) perfectly typed more than 90% of isolates, 3 scored between 80-90%, and a single center was under 80%, reaching 51% concordance only. However, this was mainly due to discordance in a single spacer, likely caused by a non-functional probe on the membrane used. The centers using thermolyzate DNA performed as well as centers using the more extended CTAB extraction procedure. Few centers shared the same problematic spacers, and these problematic spacers were scattered over the whole CRISPR locus (mostly spacers 15, 14, 18, 37, 39 and 40). Conclusions We confirm that classical spoligotyping is a robust method with generally high reliability in most centers. The applied DNA extraction procedure (CTAB

  19. Evaluation of sample preservation methods for poultry manure.

    Science.gov (United States)

    Pan, J; Fadel, J G; Zhang, R; El-Mashad, H M; Ying, Y; Rumsey, T

    2009-08-01

    When poultry manure is collected but cannot be analyzed immediately, a method for storing the manure is needed to ensure accurate subsequent analyses. This study has 3 objectives: (1) to investigate effects of 4 poultry manure sample preservation methods (refrigeration, freezing, acidification, and freeze-drying) on the compositional characteristics of poultry manure; (2) to determine compositional differences in fresh manure with manure samples at 1, 2, and 3 d of accumulation under bird cages; and (3) to assess the influence of 14-d freezing storage on the composition of manure when later exposed to 25 degrees C for 7 d as compared with fresh manure. All manure samples were collected from a layer house. Analyses performed on the manure samples included total Kjeldahl nitrogen, uric acid nitrogen, ammonia nitrogen, and urea nitrogen. In experiment 1, the storage methods most similar to fresh manure, in order of preference, were freezing, freeze-drying, acidification, and refrigeration. Thoroughly mixing manure samples and compressing them to 2 to 3 mm is important for the freezing and freeze-dried samples. In general, refrigeration was found unacceptable for nitrogen analyses. A significant effect (P Kjeldahl nitrogen and uric acid nitrogen were significantly lower (P < 0.05) for 1, 2, and 3 d of accumulation compared with fresh manure. Manure after 1, 2, and 3 d of accumulation had similar nitrogen compositions. The results from experiment 3 show that nitrogen components from fresh manure samples and thawed samples from 14 d of freezing are similar at 7 d but high variability of nitrogen compositions during intermediate times from 0 to 7 d prevents the recommendation of freezing manure for use in subsequent experiments and warrants future experimentation. In conclusion, fresh poultry manure can be frozen for accurate subsequent nitrogen compositional analyses but this same frozen manure may not be a reliable substitute for fresh manure if a subsequent experiment

  20. Evaluation of sample preservation methods for poultry manure.

    Science.gov (United States)

    Pan, J; Fadel, J G; Zhang, R; El-Mashad, H M; Ying, Y; Rumsey, T

    2009-08-01

    When poultry manure is collected but cannot be analyzed immediately, a method for storing the manure is needed to ensure accurate subsequent analyses. This study has 3 objectives: (1) to investigate effects of 4 poultry manure sample preservation methods (refrigeration, freezing, acidification, and freeze-drying) on the compositional characteristics of poultry manure; (2) to determine compositional differences in fresh manure with manure samples at 1, 2, and 3 d of accumulation under bird cages; and (3) to assess the influence of 14-d freezing storage on the composition of manure when later exposed to 25 degrees C for 7 d as compared with fresh manure. All manure samples were collected from a layer house. Analyses performed on the manure samples included total Kjeldahl nitrogen, uric acid nitrogen, ammonia nitrogen, and urea nitrogen. In experiment 1, the storage methods most similar to fresh manure, in order of preference, were freezing, freeze-drying, acidification, and refrigeration. Thoroughly mixing manure samples and compressing them to 2 to 3 mm is important for the freezing and freeze-dried samples. In general, refrigeration was found unacceptable for nitrogen analyses. A significant effect (P manure. Manure after 1, 2, and 3 d of accumulation had similar nitrogen compositions. The results from experiment 3 show that nitrogen components from fresh manure samples and thawed samples from 14 d of freezing are similar at 7 d but high variability of nitrogen compositions during intermediate times from 0 to 7 d prevents the recommendation of freezing manure for use in subsequent experiments and warrants future experimentation. In conclusion, fresh poultry manure can be frozen for accurate subsequent nitrogen compositional analyses but this same frozen manure may not be a reliable substitute for fresh manure if a subsequent experiment is performed. PMID:19590065

  1. Comparison of sampling methods for radiocarbon dating of carbonyls in air samples via accelerator mass spectrometry

    Science.gov (United States)

    Schindler, Matthias; Kretschmer, Wolfgang; Scharf, Andreas; Tschekalinskij, Alexander

    2016-05-01

    Three new methods to sample and prepare various carbonyl compounds for radiocarbon measurements were developed and tested. Two of these procedures utilized the Strecker synthetic method to form amino acids from carbonyl compounds with either sodium cyanide or trimethylsilyl cyanide. The third procedure used semicarbazide to form crystalline carbazones with the carbonyl compounds. The resulting amino acids and semicarbazones were then separated and purified using thin layer chromatography. The separated compounds were then combusted to CO2 and reduced to graphite to determine 14C content by accelerator mass spectrometry (AMS). All of these methods were also compared with the standard carbonyl compound sampling method wherein a compound is derivatized with 2,4-dinitrophenylhydrazine and then separated by high-performance liquid chromatography (HPLC).

  2. Sampling and analysis methods for geothermal fluids and gases

    Energy Technology Data Exchange (ETDEWEB)

    Shannon, D. W.

    1978-01-01

    The data obtained for the first round robin sample collected at Mesa 6-2 wellhead, East Mesa Test Site, Imperial Valley are summarized. Test results are listed by method used for cross reference to the analytic methods section. Results obtained for radioactive isotopes present in the brine sample are tabulated. The data obtained for the second round robin sample collected from the Woolsey No. 1 first stage flash unit, San Diego Gas and Electric Niland Test Facility are presented in the same manner. Lists of the participants of the two round robins are given. Data from miscellaneous analyses are included. Summaries of values derived from the round robin raw data are presented. (MHR)

  3. Global metabolite analysis of yeast: evaluation of sample preparation methods

    DEFF Research Database (Denmark)

    Villas-Bôas, Silas Granato; Højer-Pedersen, Jesper; Åkesson, Mats Fredrik;

    2005-01-01

    Sample preparation is considered one of the limiting steps in microbial metabolome analysis. Eukaryotes and prokaryotes behave very differently during the several steps of classical sample preparation methods for analysis of metabolites. Even within the eukaryote kingdom there is a vast diversity of cell structures that make it imprudent to blindly adopt protocols that were designed for a specific group of microorganisms. We have therefore reviewed and evaluated the whole sample preparation procedures for analysis of yeast metabolites. Our focus has been on the current needs in metabolome analysis, which is the analysis of a large number of metabolites with very diverse chemical and physical properties. This work reports the leakage of intracellular metabolites observed during quenching yeast cells with cold methanol solution, the efficacy of six different methods for the extraction

  4. Development of an automated data processing method for sample to sample comparison of seized methamphetamines.

    Science.gov (United States)

    Choe, Sanggil; Lee, Jaesin; Choi, Hyeyoung; Park, Yujin; Lee, Heesang; Pyo, Jaesung; Jo, Jiyeong; Park, Yonghoon; Choi, Hwakyung; Kim, Suncheun

    2012-11-30

    Information about the sources of supply, trafficking routes, distribution patterns and conspiracy links can be obtained from methamphetamine profiling. The precursor and synthetic method used in clandestine manufacture can be estimated from the analysis of minor impurities contained in methamphetamine. Also, the similarity between samples can be evaluated using the peaks that appear in chromatograms. In South Korea, methamphetamine was the most popular drug, but the total amount of methamphetamine seized throughout the country was very small. Therefore, it would be more important to find the links between samples than to pursue the other uses of methamphetamine profiling. Many Asian countries including Japan and South Korea have been using the method developed by the National Research Institute of Police Science of Japan. The method uses a gas chromatography-flame ionization detector (GC-FID), a DB-5 column and four internal standards. It was developed to increase the amount of impurities and minimize the amount of methamphetamine. After GC-FID analysis, the raw data have to be processed. The data processing steps are very complex and require a lot of time and effort. In this study, Microsoft Visual Basic for Applications (VBA) modules were developed to handle these data processing steps. The modules collected the results into an Excel file and then corrected the retention time shift and response deviation generated from the sample preparation and instrumental analysis. The developed modules were tested for their performance using 10 samples from 5 different cases. The processed results were analyzed with the Pearson correlation coefficient for similarity assessment, and the correlation coefficient of the two samples from the same case was more than 0.99. When the modules were applied to 131 seized methamphetamine samples, four samples from two different cases were found to have a common origin, and the chromatograms of the four samples appeared visually identical
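
    The two core processing steps, retention-time correction against internal standards and Pearson-correlation scoring, can be sketched as follows. This is a schematic Python rendering of what the authors implemented in Excel/VBA; all peak tables and retention times are hypothetical.

```python
# Schematic sample-to-sample comparison: correct retention-time shift with two
# internal standards, then score similarity with a Pearson correlation of
# matched peak areas. Peak data are hypothetical.
import numpy as np

def correct_retention_times(peaks, istd_measured, istd_reference):
    """Linearly map measured retention times onto the reference time scale."""
    slope = (istd_reference[1] - istd_reference[0]) / (istd_measured[1] - istd_measured[0])
    return {slope * (rt - istd_measured[0]) + istd_reference[0]: area
            for rt, area in peaks.items()}

def similarity(peaks_a, peaks_b, tolerance=0.05):
    """Pearson correlation of peak areas matched by corrected retention time."""
    a, b = [], []
    for rt, area in peaks_a.items():
        match = [v for r, v in peaks_b.items() if abs(r - rt) < tolerance]
        if match:
            a.append(area)
            b.append(match[0])
    return np.corrcoef(a, b)[0, 1]

sample1 = correct_retention_times({5.12: 880, 7.43: 150, 9.88: 410}, (5.12, 9.88), (5.10, 9.85))
sample2 = correct_retention_times({5.15: 900, 7.46: 140, 9.91: 400}, (5.15, 9.91), (5.10, 9.85))
print(f"Pearson r = {similarity(sample1, sample2):.4f}")
```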

  5. FEASIBILITY OF DEVELOPING SOURCE SAMPLING METHODS FOR ASBESTOS EMISSIONS

    Science.gov (United States)

    The objective of this program was to determine the feasibility of developing methods for sampling asbestos in the emissions of major asbestos sources: (1) ore production and taconite production, (2) asbestos-cement production, (3) asbestos felt and paper production, and (4) the p...

  6. A Frequency Domain Design Method For Sampled-Data Compensators

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Jannerup, Ole Erik

    1990-01-01

    A new approach to the design of a sampled-data compensator in the frequency domain is investigated. The starting point is a continuous-time compensator for the continuous-time system which satisfy specific design criteria. The new design method will graphically show how the discrete...

  7. MOMENT-METHOD ESTIMATION BASED ON CENSORED SAMPLE

    Institute of Scientific and Technical Information of China (English)

    NI Zhongxin; FEI Heliang

    2005-01-01

    In reliability theory and survival analysis, the problem of point estimation based on censored samples has been discussed in many publications. However, most of them focus on the MLE, BLUE, etc.; little work has been done on moment-method estimation in the censoring case. To make the method of moment estimation systematic and unified, in this paper the moment-method estimators (MEs) and modified moment-method estimators (MMEs) of the parameters based on type I and type II censored samples are put forward, involving the mean residual lifetime. The strong consistency and other properties are proved. It is worth mentioning that, for the exponential distribution, the proposed moment-method estimators are exactly the MLEs. By a simulation study, from the viewpoint of bias and mean square error, we show that the MEs and MMEs are better than the MLEs and the "pseudo complete sample" technique introduced in Whitten et al. (1988). The superiority of the MEs is especially conspicuous when the sample is heavily censored.
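
    For the exponential case mentioned above, where the moment estimator is stated to coincide with the MLE, the type II censored estimate has a simple closed form: the number of observed failures divided by the total time on test. A short simulated illustration (our sketch):

```python
# Type II censored exponential data: estimate the failure rate from the first
# r order statistics of n units on test. Parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(3)
true_rate, n, r = 0.5, 40, 25                     # censor after the r-th failure

lifetimes = np.sort(rng.exponential(1.0 / true_rate, size=n))
observed = lifetimes[:r]                          # the r smallest order statistics
total_time_on_test = observed.sum() + (n - r) * observed[-1]
rate_hat = r / total_time_on_test                 # MLE (= moment estimator here, per the abstract)
print(f"true rate {true_rate}, estimate {rate_hat:.3f}")
```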

  8. Apparatus and method for inspecting a surface of a sample

    NARCIS (Netherlands)

    Kruit, P.

    2014-01-01

    The invention relates to an apparatus and method for inspecting a sample. The apparatus comprises a generator for generating an array of primary charged particle beams (33), and a charged particle optical system with an optical axis (38). The optical system comprises a first lens system (37, 310) fo

  9. Sample processing method for the determination of perchlorate in milk

    International Nuclear Information System (INIS)

    In recent years, many different water sources and foods have been reported to contain perchlorate. Studies indicate that significant levels of perchlorate are present in both human and dairy milk. The determination of perchlorate in milk is particularly important due to its potential health impact on infants and children. As for many other biological samples, sample preparation is more time consuming than the analysis itself. The concurrent presence of large amounts of fats, proteins, carbohydrates, etc., demands some initial cleanup; otherwise the separation column lifetime and the limit of detection are both greatly compromised. Reported milk processing methods require the addition of chemicals such as ethanol, acetic acid or acetonitrile. Reagent addition is undesirable in trace analysis. We report here an essentially reagent-free sample preparation method for the determination of perchlorate in milk. Milk samples are spiked with isotopically labeled perchlorate and centrifuged to remove lipids. The resulting liquid is placed in a disposable centrifugal ultrafilter device with a molecular weight cutoff of 10 kDa, and centrifuged. Approximately 5-10 ml of clear liquid, ready for analysis, is obtained from a 20 ml milk sample. Both bovine and human milk samples have been successfully processed and analyzed by ion chromatography-mass spectrometry (IC-MS). Standard addition experiments show good recoveries. The repeatability of the analytical result for the same sample in multiple sample cleanup runs ranged from 3 to 6% R.S.D. This processing technique has also been successfully applied for the determination of iodide and thiocyanate in milk

  10. Quality Assurance - Construction

    DEFF Research Database (Denmark)

    Gaarslev, Axel

    1996-01-01

    Contains three main chapters: 1. Quality Assurance initiated by external demands; 2. Quality Assurance initiated by internal company goals; 3. Innovation strategies.

  11. Comparison of aquatic macroinvertebrate samples collected using different field methods

    Science.gov (United States)

    Lenz, Bernard N.; Miller, Michael A.

    1996-01-01

    Government agencies, academic institutions, and volunteer monitoring groups in the State of Wisconsin collect aquatic macroinvertebrate data to assess water quality. Sampling methods differ among agencies, reflecting the differences in the sampling objectives of each agency. Lack of information about data comparability impedes data sharing among agencies, which can result in duplicated sampling efforts or the underutilization of available information. To address these concerns, comparisons were made of macroinvertebrate samples collected from wadeable streams in Wisconsin by personnel from the U.S. Geological Survey-National Water Quality Assessment Program (USGS-NAWQA), the Wisconsin Department of Natural Resources (WDNR), the U.S. Department of Agriculture-Forest Service (USDA-FS), and volunteers from the Water Action Volunteer-Water Quality Monitoring Program (WAV). This project was part of the Intergovernmental Task Force on Monitoring Water Quality (ITFM) Wisconsin Water Resources Coordination Project. The numbers, types, and environmental tolerances of the organisms collected were analyzed to determine if the four different field methods that were used by the different agencies and volunteer groups provide comparable results. Additionally, this study compared the results of samples taken from different locations and habitats within the same streams.

  12. A proficiency test system to improve performance of milk analysis methods and produce reference values for component calibration samples for infrared milk analysis.

    Science.gov (United States)

    Wojciechowski, Karen L; Melilli, Caterina; Barbano, David M

    2016-08-01

    Our goal was to determine the feasibility of combining proficiency testing, analytical method quality-assurance system, and production of reference samples for calibration of infrared milk analyzers to achieve a more efficient use of resources and reduce costs while maximizing analytical accuracy within and among milk payment-testing laboratories. To achieve this, we developed and demonstrated a multilaboratory combined proficiency testing and analytical method quality-assurance system as an approach to evaluate and improve the analytical performance of methods. A set of modified milks was developed and optimized to serve multiple purposes (i.e., proficiency testing, quality-assurance and method improvement, and to provide reference materials for calibration of secondary testing methods). Over a period of years, the approach has enabled the group of laboratories to document improved analytical performance (i.e., reduced within- and between-laboratory variation) of chemical reference methods used as the primary reference for calibration of high-speed electronic milk-testing equipment. An annual meeting of the laboratory technicians allows for review of results and discussion of each method and provides a forum for communication of experience and techniques that are of value to new analysts in the group. The monthly proficiency testing sample exchanges have the added benefit of producing all-laboratory mean reference values for a set of 14 milks that can be used for calibration, evaluation, and troubleshooting of calibration adjustment issues on infrared milk analyzers.

  13. A proficiency test system to improve performance of milk analysis methods and produce reference values for component calibration samples for infrared milk analysis.

    Science.gov (United States)

    Wojciechowski, Karen L; Melilli, Caterina; Barbano, David M

    2016-08-01

    Our goal was to determine the feasibility of combining proficiency testing, analytical method quality-assurance system, and production of reference samples for calibration of infrared milk analyzers to achieve a more efficient use of resources and reduce costs while maximizing analytical accuracy within and among milk payment-testing laboratories. To achieve this, we developed and demonstrated a multilaboratory combined proficiency testing and analytical method quality-assurance system as an approach to evaluate and improve the analytical performance of methods. A set of modified milks was developed and optimized to serve multiple purposes (i.e., proficiency testing, quality-assurance and method improvement, and to provide reference materials for calibration of secondary testing methods). Over a period of years, the approach has enabled the group of laboratories to document improved analytical performance (i.e., reduced within- and between-laboratory variation) of chemical reference methods used as the primary reference for calibration of high-speed electronic milk-testing equipment. An annual meeting of the laboratory technicians allows for review of results and discussion of each method and provides a forum for communication of experience and techniques that are of value to new analysts in the group. The monthly proficiency testing sample exchanges have the added benefit of producing all-laboratory mean reference values for a set of 14 milks that can be used for calibration, evaluation, and troubleshooting of calibration adjustment issues on infrared milk analyzers. PMID:27209129

  14. New Methods of Sample Preparation for Atom Probe Specimens

    Science.gov (United States)

    Kuhlman, Kimberly, R.; Kowalczyk, Robert S.; Ward, Jennifer R.; Wishard, James L.; Martens, Richard L.; Kelly, Thomas F.

    2003-01-01

    Magnetite is a common conductive mineral found on Earth and Mars. Disk-shaped precipitates approximately 40 nm in diameter have been shown to have manganese and aluminum concentrations. Atom-probe field-ion microscopy (APFIM) is the only technique that can potentially quantify the composition of these precipitates. APFIM will be used to characterize geological and planetary materials, analyze samples of interest for geomicrobiology, and for the metrology of nanoscale instrumentation. Previously, APFIM sample preparation was conducted by electropolishing, the method of sharp shards (MSS), or the Bosch process (deep reactive ion etching), with focused ion beam (FIB) milling as a final step. However, new methods are required for difficult samples. Many materials are not easily fabricated using electropolishing, MSS, or the Bosch process; FIB milling is slow and expensive; and wet chemistry and reactive ion etching are typically limited to Si and other semiconductors. The dicing saw, commonly used to section semiconductor wafers into individual devices after manufacture, is a time-effective method for preparing high-aspect-ratio posts of poorly conducting materials. Femtosecond laser micromachining is also suitable for the preparation of posts. The FIB time required is reduced by about a factor of 10, and multi-tip specimens can easily be fabricated using the dicing saw.

  15. Introduction to quality assurance

    International Nuclear Information System (INIS)

    In today's interpretation 'quality assurance' means 'good management'. Quality assurance has to cover all phases of a work, but all quality assurance measures must be adapted to the relevance and complexity of the actual task. Examples are given for the preparation of quality classes, the organization of quality assurance during design and manufacturing and for auditing. Finally, efficiency and limits of quality assurance systems are described. (orig.)

  16. Rational Construction of Stochastic Numerical Methods for Molecular Sampling

    CERN Document Server

    Leimkuhler, Benedict

    2012-01-01

    In this article, we focus on the sampling of the configurational Gibbs-Boltzmann distribution, that is, the calculation of averages of functions of the position coordinates of a molecular $N$-body system modelled at constant temperature. We show how a formal series expansion of the invariant measure of a Langevin dynamics numerical method can be obtained in a straightforward way using the Baker-Campbell-Hausdorff lemma. We then compare Langevin dynamics integrators in terms of their invariant distributions and demonstrate a superconvergence property (4th order accuracy where only 2nd order would be expected) of one method in the high friction limit; this method, moreover, can be reduced to a simple modification of the Euler-Maruyama method for Brownian dynamics involving a non-Markovian (coloured noise) random process. In the Brownian dynamics case, 2nd order accuracy of the invariant density is achieved. All methods considered are efficient for molecular applications (requiring one force evaluation per times...
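
    For orientation, a plain Euler-Maruyama Brownian dynamics sampler of a one-dimensional Gibbs-Boltzmann density is sketched below; the paper's improved scheme replaces the white-noise increment with a coloured-noise one and is not reproduced here.

```python
# Plain Euler-Maruyama Brownian (overdamped Langevin) dynamics sampling of
# exp(-U(x)/kT) for a double-well potential. Step size and run length are
# arbitrary example values.
import numpy as np

rng = np.random.default_rng(4)

def grad_U(x):                       # U(x) = (x^2 - 1)^2
    return 4.0 * x * (x * x - 1.0)

kT, dt, n_steps = 1.0, 1e-3, 200_000
x = 0.0
samples = np.empty(n_steps)
for i in range(n_steps):
    x += -grad_U(x) * dt + np.sqrt(2.0 * kT * dt) * rng.normal()
    samples[i] = x

# The histogram of samples should approximate exp(-U(x)/kT) up to O(dt) bias.
print(f"mean = {samples.mean():.3f}, var = {samples.var():.3f}")
```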

  17. The Sensitivity of Respondent-driven Sampling Method

    CERN Document Server

    Lu, Xin; Britton, Tom; Camitz, Martin; Kim, Beom Jun; Thorson, Anna; Liljeros, Fredrik

    2012-01-01

    Researchers in many scientific fields make inferences from individuals to larger groups. For many groups however, there is no list of members from which to take a random sample. Respondent-driven sampling (RDS) is a relatively new sampling methodology that circumvents this difficulty by using the social networks of the groups under study. The RDS method has been shown to provide unbiased estimates of population proportions given certain conditions. The method is now widely used in the study of HIV-related high-risk populations globally. In this paper, we test the RDS methodology by simulating RDS studies on the social networks of a large LGBT web community. The robustness of the RDS method is tested by violating, one by one, the conditions under which the method provides unbiased estimates. Results reveal that the risk of bias is large if networks are directed, or respondents choose to invite persons based on characteristics that are correlated with the study outcomes. If these two problems are absent, the RD...

  18. Microfluidic Sample Preparation Methods for the Analysis of Milk Contaminants

    Directory of Open Access Journals (Sweden)

    Andrea Adami

    2016-01-01

    Full Text Available In systems for food analysis, one of the major challenges is related to the quantification of specific species within the complex chemical and physical composition of foods, that is, the effect of “matrix”; the sample preparation is often the key to a successful application of biosensors to real measurements, but little attention is traditionally paid to such aspects in sensor research. In this critical review, we discuss several microfluidic concepts that can play a significant role in sample preparation, highlighting the importance of sample preparation for efficient detection of food contamination. As a case study, we focus on the challenges related to the detection of aflatoxin M1 in milk and we evaluate possible approaches based on inertial microfluidics, electrophoresis, and acoustic separation, compared with traditional laboratory and industrial methods for phase separation as a baseline of trusted and well-established techniques.

  19. Software Assurance Using Structured Assurance Case Models

    OpenAIRE

    Rhodes, Thomas; Boland, Frederick; Fong, Elizabeth; Kass, Michael

    2010-01-01

    Software assurance is an important part of the software development process to reduce risks and ensure that the software is dependable and trustworthy. Software defects and weaknesses can often lead to software errors and failures and to exploitation by malicious users. Testing, certification and accreditation have been traditionally used in the software assurance process to attempt to improve software trustworthiness. In this paper, we examine a methodology known as a structured assurance mo...

  20. Aerosol sampling methods for wide area environmental sampling (WAES). Finnish support to IAEA

    International Nuclear Information System (INIS)

    Enrichment of uranium and reprocessing of nuclear fuel are expected to produce releases of non-natural radionuclides that are carried away in air attached to airborne particles. The Wide Area Environmental Sampling (WAES) method utilises air samplers distributed over the monitored area to detect possible releases. To reduce expenses, samplers located in remote areas should be able to operate unattended for long periods of time. In this work, a high-volume (flow rate 150 m3/h) air sampler with an automatic filter changing system, Hunter MKII, was developed for WAES. The sampler can collect six one-week samples before it needs to be visited to unload the used filters and load new ones. The device sends real-time state-of-health information to headquarters so that long-term loss of sampling can be avoided in case of malfunction. The state-of-health data also include tamper indication to reveal any inconspicuous interference with the unattended sampling process. Organic filter materials are used to collect particles owing to their suitability for radiochemical analysis. Four filter materials were tested for collection efficiency and pressure drop. The material selected for current use (Petrianov FPP-15-1.5, used as two layers one upon the other) can collect more than 90% of 0.2 μm particles throughout the sampling period. If there is a large concentration of coarse particles in the air (as is typically the case in desert conditions), the filter clogging rate can be significantly decreased by preceding the filter with a low-pressure-drop pre-filter that collects the coarse particles. The filter pressure drop is low enough to easily allow a one-week sampling time under typical sampling conditions (a pre-filter may be needed in heavily dust-laden desert air). (orig.)

  1. Quality assurance within regulatory bodies

    International Nuclear Information System (INIS)

    The IAEA directed extensive efforts during the years 1991 to 1995 to the integral revision of all NUSS quality assurance publications, which were approved and issued as Safety Series No.50-C/SG-Q, Quality Assurance for Safety in Nuclear Power Plants and other Nuclear Installations (1996). When these quality assurance publications were developed, their prime focus was on requirements against which work performed by the licensees could be measured and assessed by the regulatory bodies. In this way, they only helped to facilitate the functions of regulators. No requirements or recommendations were provided on how the regulators should ensure the effective implementation of their own activities. The present publication is a first attempt to collect, integrate and offer available experience to directly support performance of regulatory activities. It presents a comprehensive compilation on the application of quality assurance principles and methods by regulatory bodies to their activities. The aim is consistent good performance of regulatory activities through a systematic approach

  2. WHO informal consultation on the application of molecular methods to assure the quality, safety and efficacy of vaccines, Geneva, Switzerland, 7-8 April 2005.

    Science.gov (United States)

    Shin, Jinho; Wood, David; Robertson, James; Minor, Philip; Peden, Keith

    2007-03-01

    In April 2005, the World Health Organization convened an informal consultation on molecular methods to assure the quality, safety and efficacy of vaccines. The consultation was attended by experts from national regulatory authorities, vaccine industry and academia. Crosscutting issues on the application of molecular methods for a number of vaccines that are currently in use or under development were presented, and specific methods for further collaborative studies were discussed and identified. The main points of recommendation from meeting participants were fourfold: (i) that molecular methods should be encouraged; (ii) that collaborative studies are needed for many methods/applications; (iii) that basic science should be promoted; and (iv) that investment for training, equipment and facilities should be encouraged.

  3. A direct method for e-cigarette aerosol sample collection.

    Science.gov (United States)

    Olmedo, Pablo; Navas-Acien, Ana; Hess, Catherine; Jarmul, Stephanie; Rule, Ana

    2016-08-01

    E-cigarette use is increasing in populations around the world. Recent evidence has shown that the aerosol produced by e-cigarettes can contain a variety of toxicants. Published studies characterizing toxicants in e-cigarette aerosol have relied on filters, impingers or sorbent tubes, which are methods that require diluting or extracting the sample in a solution during collection. We have developed a collection system that directly condenses e-cigarette aerosol samples for chemical and toxicological analyses. The collection system consists of several cut pipette tips connected with short pieces of tubing. The pipette tip-based collection system can be connected to a peristaltic pump, a vacuum pump, or directly to an e-cigarette user for the e-cigarette aerosol to flow through the system. The pipette tip-based system condenses the aerosol produced by the e-cigarette and collects a liquid sample that is ready for analysis without the need of intermediate extraction solutions. We tested a total of 20 e-cigarettes from 5 different brands commercially available in Maryland. The pipette tip-based collection system condensed between 0.23 and 0.53mL of post-vaped e-liquid after 150 puffs. The proposed method is highly adaptable, can be used during field work and in experimental settings, and allows collecting aerosol samples from a wide variety of e-cigarette devices, yielding a condensate of the likely exact substance that is being delivered to the lungs. PMID:27200479

  4. A direct method for e-cigarette aerosol sample collection.

    Science.gov (United States)

    Olmedo, Pablo; Navas-Acien, Ana; Hess, Catherine; Jarmul, Stephanie; Rule, Ana

    2016-08-01

    E-cigarette use is increasing in populations around the world. Recent evidence has shown that the aerosol produced by e-cigarettes can contain a variety of toxicants. Published studies characterizing toxicants in e-cigarette aerosol have relied on filters, impingers or sorbent tubes, which are methods that require diluting or extracting the sample in a solution during collection. We have developed a collection system that directly condenses e-cigarette aerosol samples for chemical and toxicological analyses. The collection system consists of several cut pipette tips connected with short pieces of tubing. The pipette tip-based collection system can be connected to a peristaltic pump, a vacuum pump, or directly to an e-cigarette user for the e-cigarette aerosol to flow through the system. The pipette tip-based system condenses the aerosol produced by the e-cigarette and collects a liquid sample that is ready for analysis without the need of intermediate extraction solutions. We tested a total of 20 e-cigarettes from 5 different brands commercially available in Maryland. The pipette tip-based collection system condensed between 0.23 and 0.53mL of post-vaped e-liquid after 150 puffs. The proposed method is highly adaptable, can be used during field work and in experimental settings, and allows collecting aerosol samples from a wide variety of e-cigarette devices, yielding a condensate of the likely exact substance that is being delivered to the lungs.

  5. Microbial diversity in fecal samples depends on DNA extraction method

    DEFF Research Database (Denmark)

    Mirsepasi, Hengameh; Persson, Søren; Struve, Carsten;

    2014-01-01

    BACKGROUND: There are challenges, when extracting bacterial DNA from specimens for molecular diagnostics, since fecal samples also contain DNA from human cells and many different substances derived from food, cell residues and medication that can inhibit downstream PCR. The purpose of the study...... was to evaluate two different DNA extraction methods in order to choose the most efficient method for studying intestinal bacterial diversity using Denaturing Gradient Gel Electrophoresis (DGGE). FINDINGS: In this study, a semi-automatic DNA extraction system (easyMag®, BioMérieux, Marcy I'Etoile, France......) and a manual one (QIAamp DNA Stool Mini Kit, Qiagen, Hilden, Germany) were tested on stool samples collected from 3 patients with Inflammatory Bowel disease (IBD) and 5 healthy individuals. DNA extracts obtained by the QIAamp DNA Stool Mini Kit yield a higher amount of DNA compared to DNA extracts obtained...

  6. A new method for vitrifying samples for cryoEM.

    Science.gov (United States)

    Razinkov, Ivan; Dandey, Venkata P; Wei, Hui; Zhang, Zhening; Melnekoff, David; Rice, William J; Wigge, Christoph; Potter, Clinton S; Carragher, Bridget

    2016-08-01

    Almost every aspect of cryo electron microscopy (cryoEM) has been automated over the last few decades. One of the challenges that remains to be addressed is the robust and reliable preparation of vitrified specimens of suitable ice thickness. We present results from a new device for preparing vitrified samples. The successful use of the device is coupled to a new "self-blotting" grid that we have developed to provide a method for spreading a sample to a thin film without the use of externally applied filter paper. This new approach has the advantage of using small amounts of protein material, resulting in large areas of ice of a well defined thickness containing evenly distributed single particles. We believe that these methods will in the future result in a system for vitrifying grids that is completely automated. PMID:27288865

  7. The autism inpatient collection: methods and preliminary sample description

    OpenAIRE

    Siegel, Matthew; Smith, Kahsi A.; Mazefsky, Carla; Gabriels, Robin L.; Erickson, Craig; Kaplan, Desmond; Morrow, Eric M; Wink, Logan; Santangelo, Susan L.; ,

    2015-01-01

    Background Individuals severely affected by autism spectrum disorder (ASD), including those with intellectual disability, expressive language impairment, and/or self-injurious behavior (SIB), are underrepresented in the ASD literature and extant collections of phenotypic and biological data. An understanding of ASD’s etiology and subtypes can only be as complete as the studied samples are representative. Methods The Autism Inpatient Collection (AIC) is a multi-site study enrolling children an...

  8. A direct sampling method to an inverse medium scattering problem

    KAUST Repository

    Ito, Kazufumi

    2012-01-10

    In this work we present a novel sampling method for time harmonic inverse medium scattering problems. It provides a simple tool to directly estimate the shape of the unknown scatterers (inhomogeneous media), and it is applicable even when the measured data are only available for one or two incident directions. A mathematical derivation is provided for its validation. Two- and three-dimensional numerical simulations are presented, which show that the method is accurate even with a few sets of scattered field data, computationally efficient, and very robust with respect to noises in the data. © 2012 IOP Publishing Ltd.
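
    The core of a direct sampling method of this kind is an indicator function formed from inner products of the measured scattered field with the fundamental solution placed at each sampling point. The sketch below is a rough two-dimensional Helmholtz illustration with synthetic data from a single small (Born-type) scatterer and one incident wave; the wavenumber, geometry and noise level are arbitrary assumptions, and it is not the paper's exact formulation.

        import numpy as np
        from scipy.special import hankel1

        rng = np.random.default_rng(0)
        k = 10.0                                  # wavenumber (assumed)

        def G(x, z):
            # 2D Helmholtz fundamental solution (i/4) * H0^(1)(k |x - z|)
            return 0.25j * hankel1(0, k * np.linalg.norm(x - z, axis=-1))

        # Receivers on a measurement circle of radius 5
        theta = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
        rcv = 5.0 * np.column_stack([np.cos(theta), np.sin(theta)])

        # Synthetic scattered field: one incident plane wave, one small scatterer
        # at z_true, modelled as a Born-type point source (demo data only)
        z_true = np.array([1.0, -0.5])
        d = np.array([1.0, 0.0])
        u_scat = np.exp(1j * k * z_true @ d) * G(rcv, z_true)
        noise = 0.05 * np.mean(np.abs(u_scat))
        u_scat = u_scat + noise * (rng.standard_normal(len(rcv)) + 1j * rng.standard_normal(len(rcv)))

        def indicator(z):
            # Normalized inner product of the data with G(., z) over the receivers
            g = G(rcv, z)
            return abs(np.vdot(g, u_scat)) / (np.linalg.norm(g) * np.linalg.norm(u_scat))

        xs = np.linspace(-2.0, 2.0, 81)
        vals = np.array([[indicator(np.array([x, y])) for x in xs] for y in xs])
        iy, ix = np.unravel_index(vals.argmax(), vals.shape)
        print("indicator peak near:", (xs[ix], xs[iy]), "true scatterer at:", tuple(z_true))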

  9. Microextraction Methods for Preconcentration of Aluminium in Urine Samples

    Directory of Open Access Journals (Sweden)

    Farzad Farajbakhsh, Mohammad Amjadi, Jamshid Manzoori, Mohammad R. Ardalan, Abolghasem Jouyban

    2016-07-01

    Full Text Available Background: Analysis of aluminium (Al) in urine samples is required in the management of a number of diseases, including in patients with renal failure. This work aimed to present dispersive liquid-liquid microextraction (DLLME) and ultrasound-assisted emulsification microextraction (USAEME) methods for the preconcentration of ultra-trace amounts of aluminium in human urine prior to its determination by graphite furnace atomic absorption spectrometry (GFAAS). Methods: The microextraction methods were based on the complex formation of Al3+ with 8-hydroxyquinoline. The effect of various experimental parameters on the efficiencies of the methods and their optimum values were studied. Results: Under the optimal conditions, the limits of detection for USAEME-GFAAS and DLLME-GFAAS were 0.19 and 0.30 ng mL−1, respectively, and the corresponding relative standard deviations (RSD, n=5) for the determination of 40 ng mL−1 Al3+ were 5.9% and 4.9%. Conclusion: Both methods could be successfully applied to the analysis of ultra-trace concentrations of Al in urine samples of dialysis patients.
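
    For readers unfamiliar with the figures of merit quoted above, the short sketch below shows how a limit of detection (3-sigma criterion) and a relative standard deviation are typically computed from blank replicates, a calibration slope and repeat determinations. All numbers are invented for illustration and are not the study's data.

        import numpy as np

        def limit_of_detection(blank_signals, slope):
            # Common 3-sigma criterion: LOD = 3 * sd(blank) / calibration slope
            return 3.0 * np.std(blank_signals, ddof=1) / slope

        def relative_std_dev(replicates):
            # RSD (%) of replicate determinations at a single concentration
            r = np.asarray(replicates, dtype=float)
            return 100.0 * np.std(r, ddof=1) / np.mean(r)

        # Hypothetical GFAAS data (absorbance units, concentrations in ng/mL)
        blanks = [0.0021, 0.0018, 0.0025, 0.0019, 0.0023, 0.0020, 0.0022]
        slope = 0.0031                                    # absorbance per ng/mL
        replicates_40ng = [0.121, 0.128, 0.117, 0.124, 0.131]   # n = 5 at 40 ng/mL

        print("LOD (ng/mL):", round(limit_of_detection(blanks, slope), 2))
        print("RSD at 40 ng/mL (%):", round(relative_std_dev(replicates_40ng), 1))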

  10. Grading of quality assurance requirements

    International Nuclear Information System (INIS)

    The present Manual provides guidance and illustrative examples for applying a method by which graded quality assurance requirements may be determined and adapted to the items and services of a nuclear power plant in conformance with the requirements of the IAEA Nuclear Safety Standards (NUSS) Code and Safety Guides on quality assurance. The Manual replaces the previous publication IAEA-TECDOC-303 on the same subject. Various methods of grading quality assurance are available in a number of Member States. During the development of the present Manual it was not considered practical to attempt to resolve the differences between those methods and it was preferred to identify and benefit from the good practices available in all the methods. The method presented in this Manual deals with the aspects of management, documentation, control, verification and administration which affect quality. 1 fig., 4 tabs

  11. Sediment sampling and processing methods in Hungary, and possible improvements

    Science.gov (United States)

    Tamas, Eniko Anna; Koch, Daniel; Varga, Gyorgy

    2016-04-01

    The importance of the monitoring of sediment processes is unquestionable: sediment balance of regulated rivers suffered substantial alterations in the past century, affecting navigation, energy production, fish habitats and floodplain ecosystems alike; infiltration times to our drinking water wells have shortened, exposing them to an eventual pollution event and making them vulnerable; and sediment-attached contaminants accumulate in floodplains and reservoirs, threatening our healthy environment. The changes in flood characteristics and rating curves of our rivers are regularly being researched and described, involving state-of-the-art measurement methods, modeling tools and traditional statistics. Sediment processes however, are much less known. Unlike the investigation of flow processes, sediment-related research is scarce, which is partly due to the outdated methodology and poor database background in the specific field. Sediment-related data, information and analyses form an important and integral part of Civil engineering in relation to rivers all over the world. In relation to the second largest river of Europe, the Danube, it is widely known in expert community and for long discussed at different expert forums that the sediment balance of the river Danube has changed drastically over the past century. Sediment monitoring on the river Danube started as early as the end of the 19th century, with scattered measurements carried out. Regular sediment sampling was developed in the first half of the 20th century all along the river, with different station density and monitoring frequencies in different countries. After the first few decades of regular sampling, the concept of (mainly industrial) development changed along the river and data needs changed as well, furthermore the complicated and inexact methods of sampling bed load on the alluvial reach of the river were not developed further. Frequency of suspended sediment sampling is very low along the river

  12. Sediment sampling and processing methods in Hungary, and possible improvements

    Science.gov (United States)

    Tamas, Eniko Anna; Koch, Daniel; Varga, Gyorgy

    2016-04-01

    The importance of the monitoring of sediment processes is unquestionable: sediment balance of regulated rivers suffered substantial alterations in the past century, affecting navigation, energy production, fish habitats and floodplain ecosystems alike; infiltration times to our drinking water wells have shortened, exposing them to an eventual pollution event and making them vulnerable; and sediment-attached contaminants accumulate in floodplains and reservoirs, threatening our healthy environment. The changes in flood characteristics and rating curves of our rivers are regularly being researched and described, involving state-of-the-art measurement methods, modeling tools and traditional statistics. Sediment processes however, are much less known. Unlike the investigation of flow processes, sediment-related research is scarce, which is partly due to the outdated methodology and poor database background in the specific field. Sediment-related data, information and analyses form an important and integral part of Civil engineering in relation to rivers all over the world. In relation to the second largest river of Europe, the Danube, it is widely known in expert community and for long discussed at different expert forums that the sediment balance of the river Danube has changed drastically over the past century. Sediment monitoring on the river Danube started as early as the end of the 19th century, with scattered measurements carried out. Regular sediment sampling was developed in the first half of the 20th century all along the river, with different station density and monitoring frequencies in different countries. After the first few decades of regular sampling, the concept of (mainly industrial) development changed along the river and data needs changed as well, furthermore the complicated and inexact methods of sampling bed load on the alluvial reach of the river were not developed further. Frequency of suspended sediment sampling is very low along the river

  13. Comprehensive method to analyze thick insulating samples using PIXE technique

    Energy Technology Data Exchange (ETDEWEB)

    Ismail, I.M., E-mail: cscientific3@aec.org.sy [Atomic Energy Commission, Department of Chemistry, P.O. Box 6091, Damascus (Syrian Arab Republic); Rihawy, M.S. [Atomic Energy Commission, Department of Chemistry, P.O. Box 6091, Damascus (Syrian Arab Republic)

    2013-02-01

    Highlights: ► A new comprehensive method to analyze thick insulating samples directly by the PIXE technique is presented. ► The method is based on the use of an electron flood gun and a beam profile monitor. ► Both accuracy and precision of the experimental procedure were successfully verified using reference material. -- Abstract: In this work, we present a new method to analyze thick insulating samples by the PIXE technique. The method is based on the use of both an electron flood gun to compensate for the charge build-up at the insulating surface and a beam profile monitor (BPM) to provide a precise indirect measurement of the beam current and accumulated charge. A filament extracted from an ordinary flashlight lamp was used as the electron flood gun, while a commercial BPM was adapted to carry out the charge measurements. The results have demonstrated the convenience of using a BPM for measuring the charge in PIXE measurements. The use of the electron flood gun has given very satisfactory results in terms of preventing charge build-up and reducing its contribution to the bremsstrahlung background in the PIXE spectra. The applicability and efficiency of the overall system for elemental analysis were successfully verified using the IAEA-Soil-7 reference material, where both accuracy and precision were found to be better than 10% in most cases.

  14. RAPID SEPARATION METHOD FOR ACTINIDES IN EMERGENCY AIR FILTER SAMPLES

    Energy Technology Data Exchange (ETDEWEB)

    Maxwell, S.; Noyes, G.; Culligan, B.

    2010-02-03

    A new rapid method for the determination of actinides and strontium in air filter samples has been developed at the Savannah River Site Environmental Lab (Aiken, SC, USA) that can be used in emergency response situations. The actinides and strontium in air filter method utilizes a rapid acid digestion method and a streamlined column separation process with stacked TEVA, TRU and Sr Resin cartridges. Vacuum box technology and rapid flow rates are used to reduce analytical time. Alpha emitters are prepared using cerium fluoride microprecipitation for counting by alpha spectrometry. The purified 90Sr fractions are mounted directly on planchets and counted by gas flow proportional counting. The method showed high chemical recoveries and effective removal of interferences. This new procedure was applied to emergency air filter samples received in the NRIP Emergency Response exercise administered by the National Institute of Standards and Technology (NIST) in April 2009. The actinide and 90Sr in air filter results were reported in ~4 hours with excellent quality.

  15. Progressive prediction method for failure data with small sample size

    Institute of Scientific and Technical Information of China (English)

    WANG Zhi-hua; FU Hui-min; LIU Cheng-rui

    2011-01-01

    The small-sample prediction problem that commonly arises in reliability analysis is addressed in this paper with the progressive prediction method. The modeling and estimation procedure, as well as the forecast and confidence-limit formulas of the progressive auto-regressive (PAR) method, are discussed in detail. The PAR model not only inherits the simple linear features of the auto-regressive (AR) model but is also applicable to nonlinear systems. An application to predicting future fatigue failures of tantalum electrolytic capacitors is illustrated. The forecasting results of the PAR model were compared with those of an auto-regressive moving average (ARMA) model, showing that the PAR method performs well and is promising for future applications.
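
    The abstract does not give the PAR formulation itself, so the sketch below only shows its auto-regressive building block, assuming a plain least-squares AR(p) fit and an iterated multi-step forecast on a short synthetic series; it is not the progressive scheme of the paper.

        import numpy as np

        def fit_ar(series, p):
            # Least-squares fit of an AR(p) model: x_t = c + sum_i a_i * x_{t-i}
            x = np.asarray(series, dtype=float)
            rows = [x[t - p:t][::-1] for t in range(p, len(x))]
            X = np.column_stack([np.ones(len(rows)), np.array(rows)])
            coef, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
            return coef                       # [c, a_1, ..., a_p]

        def forecast_ar(series, coef, steps):
            # Iterated multi-step forecast from the fitted AR coefficients
            p = len(coef) - 1
            hist = list(np.asarray(series, dtype=float))
            out = []
            for _ in range(steps):
                lags = hist[-p:][::-1]
                nxt = coef[0] + float(np.dot(coef[1:], lags))
                out.append(nxt)
                hist.append(nxt)
            return np.array(out)

        # Synthetic degradation-like series (trend plus noise), small sample size
        rng = np.random.default_rng(1)
        t = np.arange(30)
        series = 0.05 * t + 0.2 * np.sin(0.5 * t) + 0.05 * rng.standard_normal(30)

        coef = fit_ar(series, p=3)
        print("next 5 predicted values:", np.round(forecast_ar(series, coef, 5), 3))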

  16. Comparison between powder and slices diffraction methods in teeth samples

    Energy Technology Data Exchange (ETDEWEB)

    Colaco, Marcos V.; Barroso, Regina C. [Universidade do Estado do Rio de Janeiro (IF/UERJ), RJ (Brazil). Inst. de Fisica. Dept. de Fisica Aplicada; Porto, Isabel M. [Universidade Estadual de Campinas (FOP/UNICAMP), Piracicaba, SP (Brazil). Fac. de Odontologia. Dept. de Morfologia; Gerlach, Raquel F. [Universidade de Sao Paulo (FORP/USP), Rieirao Preto, SP (Brazil). Fac. de Odontologia. Dept. de Morfologia, Estomatologia e Fisiologia; Costa, Fanny N. [Coordenacao dos Programas de Pos-Graduacao de Engenharia (LIN/COPPE/UFRJ), RJ (Brazil). Lab. de Instrumentacao Nuclear

    2011-07-01

    Proposing different methods to obtain crystallographic information from biological materials is important, since the powder method is a nondestructive technique, while slices are an approximation of what an in vivo analysis would be. Effects of sample preparation cause differences in the scattering profiles compared with the powder method. The main inorganic component of bones and teeth is a calcium phosphate mineral whose structure closely resembles hydroxyapatite (HAp). The hexagonal symmetry, however, seems to work well with the powder diffraction data, and the crystal structure of HAp is usually described in space group P63/m. Ten third molar teeth were analyzed: five teeth were separated into enamel, dentin and circumpulpal dentin powder, and five were cut into slices. All the scattering profile measurements were carried out at the X-ray diffraction beamline (XRD1) of the National Synchrotron Light Laboratory - LNLS, Campinas, Brazil. The LNLS synchrotron light source is composed of a 1.37 GeV electron storage ring, delivering approximately 4x10^10 photons/s at 8 keV. A double-crystal Si(111) pre-monochromator, upstream of the beamline, was used to select a small energy bandwidth at 11 keV. Scattering signatures were obtained at intervals of 0.04 deg for angles from 24 deg to 52 deg. The human enamel experimental crystallite sizes obtained in this work were 30(3) nm (112 reflection) and 30(3) nm (300 reflection); these values were obtained from measurements of powdered enamel. When the slice enamel diffraction patterns, which yielded 58(8) nm (112 reflection) and 37(7) nm (300 reflection), are compared with those generated by the powder specimens, a few differences emerge. This work shows the differences between the powder and slice methods, separating the characteristics of the sample from the influence of the method. (author)
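
    Crystallite sizes of the kind quoted above (tens of nanometres from individual reflections) are conventionally estimated with the Scherrer equation D = K·lambda / (beta·cos(theta)), where beta is the peak full width at half maximum in radians. The sketch below assumes an 11 keV beam and invented peak widths; it is not the authors' data reduction.

        import numpy as np

        def scherrer_size(wavelength_nm, fwhm_deg, two_theta_deg, K=0.9):
            # Scherrer equation: D = K * lambda / (beta * cos(theta)),
            # with beta the peak FWHM converted to radians.
            beta = np.radians(fwhm_deg)
            theta = np.radians(two_theta_deg / 2.0)
            return K * wavelength_nm / (beta * np.cos(theta))

        wavelength = 0.1127   # nm, roughly an 11 keV photon energy (assumed)
        # Invented peak positions and widths for two HAp-like reflections
        for label, two_theta, fwhm in [("112", 23.4, 0.22), ("300", 23.9, 0.21)]:
            size = scherrer_size(wavelength, fwhm, two_theta)
            print(f"{label} reflection: crystallite size ~ {size:.0f} nm")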

  17. A direct sampling method for inverse electromagnetic medium scattering

    International Nuclear Information System (INIS)

    In this paper, we study the inverse electromagnetic medium scattering problem of estimating the support and shape of medium scatterers from scattered electric/magnetic near-field data. We shall develop a novel direct sampling method based on an analysis of electromagnetic scattering and the behavior of the fundamental solution. It is applicable to a few incident fields and needs only to compute inner products of the measured scattered field with the fundamental solutions located at sampling points. Hence, it is strictly direct, computationally very efficient and highly robust to the presence of data noise. Two- and three-dimensional numerical experiments indicate that it can provide reliable support estimates for multiple scatterers in the case of both exact and highly noisy data. (paper)

  18. A direct sampling method for inverse electromagnetic medium scattering

    KAUST Repository

    Ito, Kazufumi

    2013-09-01

    In this paper, we study the inverse electromagnetic medium scattering problem of estimating the support and shape of medium scatterers from scattered electric/magnetic near-field data. We shall develop a novel direct sampling method based on an analysis of electromagnetic scattering and the behavior of the fundamental solution. It is applicable to a few incident fields and needs only to compute inner products of the measured scattered field with the fundamental solutions located at sampling points. Hence, it is strictly direct, computationally very efficient and highly robust to the presence of data noise. Two- and three-dimensional numerical experiments indicate that it can provide reliable support estimates for multiple scatterers in the case of both exact and highly noisy data. © 2013 IOP Publishing Ltd.

  19. Path Sampling Methods for Enzymatic Quantum Particle Transfer Reactions.

    Science.gov (United States)

    Dzierlenga, M W; Varga, M J; Schwartz, S D

    2016-01-01

    The mechanisms of enzymatic reactions are studied via a host of computational techniques. While previous methods have been used successfully, many fail to incorporate the full dynamical properties of enzymatic systems. This can lead to misleading results in cases where enzyme motion plays a significant role in the reaction coordinate, which is especially relevant in particle transfer reactions where nuclear tunneling may occur. In this chapter, we outline previous methods, as well as discuss newly developed dynamical methods to interrogate mechanisms of enzymatic particle transfer reactions. These new methods allow for the calculation of free energy barriers and kinetic isotope effects (KIEs) with the incorporation of quantum effects through centroid molecular dynamics (CMD) and the full complement of enzyme dynamics through transition path sampling (TPS). Recent work, summarized in this chapter, applied the method for calculation of free energy barriers to reaction in lactate dehydrogenase (LDH) and yeast alcohol dehydrogenase (YADH). We found that tunneling plays an insignificant role in YADH but plays a more significant role in LDH, though not dominant over classical transfer. Additionally, we summarize the application of a TPS algorithm for the calculation of reaction rates in tandem with CMD to calculate the primary H/D KIE of YADH from first principles. We found that the computationally obtained KIE is within the margin of error of experimentally determined KIEs and corresponds to the KIE of particle transfer in the enzyme. These methods provide new ways to investigate enzyme mechanism with the inclusion of protein and quantum dynamics.

  20. Path Sampling Methods for Enzymatic Quantum Particle Transfer Reactions.

    Science.gov (United States)

    Dzierlenga, M W; Varga, M J; Schwartz, S D

    2016-01-01

    The mechanisms of enzymatic reactions are studied via a host of computational techniques. While previous methods have been used successfully, many fail to incorporate the full dynamical properties of enzymatic systems. This can lead to misleading results in cases where enzyme motion plays a significant role in the reaction coordinate, which is especially relevant in particle transfer reactions where nuclear tunneling may occur. In this chapter, we outline previous methods, as well as discuss newly developed dynamical methods to interrogate mechanisms of enzymatic particle transfer reactions. These new methods allow for the calculation of free energy barriers and kinetic isotope effects (KIEs) with the incorporation of quantum effects through centroid molecular dynamics (CMD) and the full complement of enzyme dynamics through transition path sampling (TPS). Recent work, summarized in this chapter, applied the method for calculation of free energy barriers to reaction in lactate dehydrogenase (LDH) and yeast alcohol dehydrogenase (YADH). We found that tunneling plays an insignificant role in YADH but plays a more significant role in LDH, though not dominant over classical transfer. Additionally, we summarize the application of a TPS algorithm for the calculation of reaction rates in tandem with CMD to calculate the primary H/D KIE of YADH from first principles. We found that the computationally obtained KIE is within the margin of error of experimentally determined KIEs and corresponds to the KIE of particle transfer in the enzyme. These methods provide new ways to investigate enzyme mechanism with the inclusion of protein and quantum dynamics. PMID:27497161

  1. Quality assurance program

    International Nuclear Information System (INIS)

    This topical report describes the Gibbs and Hill Quality Assurance Program and sets forth the methods to be followed in controlling quality-related activities performed by Gibbs and Hill and its contractors. The program is based on company experience in nuclear power and related work, and defines a system found effective in providing independent control of quality-related functions and documentation. The scope of the report covers activities involving nuclear safety-related structures, systems, and components covered by Gibbs and Hill's contractual obligation to the Utility Owner for each project.

  2. 30 CFR 74.9 - Quality assurance.

    Science.gov (United States)

    2010-07-01

    ... CFR part 51. Persons may obtain a copy from the International Organization for Standardization at the... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Quality assurance. 74.9 Section 74.9 Mineral... DUST SAMPLING DEVICES Requirements for Continuous Personal Dust Monitors § 74.9 Quality assurance....

  3. Vadose Zone Sampling Methods for Detection of Preferential Pesticides Transport

    Science.gov (United States)

    Peranginangin, N.; Richards, B. K.; Steenhuis, T. S.

    2003-12-01

    Leaching of agricultural applied chemicals through the vadose zone is a major cause for the occurrence of agrichemicals in groundwater. Accurate soil water sampling methods are needed to ensure meaningful monitoring results, especially for soils that have significant preferential flow paths. The purpose of this study was to assess the capability and the effectiveness of various soil water sampling methods in detecting preferential transport of pesticides in a strongly-structured silty clay loam (Hudson series) soil. Soil water sampling devices tested were wick pan and gravity pan lysimeters, tile lines, porous ceramic cups, and pipe lysimeters; all installed at 45 to105 cm depth below the ground surface. A reasonable worse-case scenario was tested by applying a simulated rain storm soon after pesticides were sprayed at agronomic rates. Herbicides atrazine (6-chloro-N2-ethyl-N4-isopropyl-1,3,5-triazine-2,4-diamine) and 2,4-D (2,4-dichloro-phenoxyacetic acid) were chosen as model compounds. Chloride (KCl) tracer was used to determine spatial and temporal distribution of non-reactive solute and water as well as a basis for determining the retardation in pesticides movement. Results show that observed pesticide mobility was much greater than would be predicted by uniform flow. Under relatively high soil moisture conditions, gravity and wick pan lysimeters had comparably good collection efficiencies, whereas the wick samplers had an advantage over gravity driven sampler when the soil moisture content was below field capacity. Pipe lysimeters had breakthrough patterns that were similar to pan samplers. At small plot scale, tile line samplers tended to underestimate solute concentration because of water dilution around the samplers. The use of porous cup samplers performed poorly because of their sensitivity to local profile characteristics: only by chance can they intercept and sample the preferential flow paths that are critical to transport. Wick sampler had the least

  4. Alum sludge: a sampling method for the radioactivity in waters

    International Nuclear Information System (INIS)

    The alum coagulation and settling steps used in water treatment plants can be the basis of a method to concentrate radionuclides present in very small amounts in soft waters. Indeed, the gamma spectrum of a dried alum sludge shows the presence of a number of naturally occurring radionuclides, as well as fission and activation products that would be unobservable unless a preconcentration step were carried out. The floc sampling technique is described. The different methods developed to relate the amount of floc to the volume of water are presented; they are based either on the chemical characterisation of the suspended matter or on the use of internal standards such as 137Cs and 7Be. The distribution of some radionuclides between the liquid and solid phases in rivers and their retention by the floc are discussed: the results clearly show the importance of the suspended matter as a carrier of the radioactivity.

  5. Multinational Quality Assurance

    Science.gov (United States)

    Kinser, Kevin

    2011-01-01

    Multinational colleges and universities pose numerous challenges to the traditional models of quality assurance that are designed to validate domestic higher education. When institutions cross international borders, at least two quality assurance protocols are involved. To guard against fraud and abuse, quality assurance in the host country is…

  6. Computer software quality assurance

    International Nuclear Information System (INIS)

    The author defines some criteria for the evaluation of software quality assurance elements for applicability to the regulation of the nuclear industry. The author then analyses a number of software quality assurance (SQA) standards. The major extracted SQA elements are then discussed, and finally specific software quality assurance recommendations are made for the nuclear industry

  7. The curvHDR method for gating flow cytometry samples

    Directory of Open Access Journals (Sweden)

    Wand Matthew P

    2010-01-01

    Full Text Available Abstract Background High-throughput flow cytometry experiments produce hundreds of large multivariate samples of cellular characteristics. These samples require specialized processing to obtain clinically meaningful measurements. A major component of this processing is a form of cell subsetting known as gating. Manual gating is time-consuming and subjective. Good automatic and semi-automatic gating algorithms are very beneficial to high-throughput flow cytometry. Results We develop a statistical procedure, named curvHDR, for automatic and semi-automatic gating. The method combines the notions of significant high negative curvature regions and highest density regions and has the ability to adapt well to human-perceived gates. The underlying principles apply to dimensions of arbitrary size, although we focus on dimensions up to three. Accompanying software, compatible with contemporary flow cytometry informatics, is developed. Conclusion The method is seen to adapt well to nuances in the data and, to a reasonable extent, match human perception of useful gates. It offers big savings in human labour when processing high-throughput flow cytometry data whilst retaining a good degree of efficacy.
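
    curvHDR itself combines high-negative-curvature regions with highest density regions and ships as the authors' own software; the sketch below only illustrates the highest-density-region ingredient on synthetic two-channel data: estimate a kernel density and keep the cells whose density exceeds the threshold defining (approximately) a 90% HDR. The population parameters are invented.

        import numpy as np
        from scipy.stats import gaussian_kde

        def hdr_gate(points, coverage=0.90):
            # Boolean gate keeping points inside the highest density region that
            # contains roughly `coverage` of the estimated probability mass.
            kde = gaussian_kde(points.T)
            dens = kde(points.T)
            thresh = np.quantile(dens, 1.0 - coverage)   # empirical HDR cut-off
            return dens >= thresh

        # Synthetic two-population "flow cytometry" sample (two fluorescence channels)
        rng = np.random.default_rng(7)
        popA = rng.normal([2.0, 3.0], 0.30, size=(4000, 2))
        popB = rng.normal([4.5, 1.5], 0.45, size=(1500, 2))
        cells = np.vstack([popA, popB])

        gate = hdr_gate(cells, coverage=0.90)
        print("cells kept by 90% HDR gate:", int(gate.sum()), "of", len(cells))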

  8. Quality assurance in the ambulatory care setting.

    Science.gov (United States)

    Tyler, R D

    1989-01-01

    One of the most utilitarian developments in the field of quality assurance in health care has been the introduction of industrial concepts of quality management. These concepts, coupled with buyer demand for accountability, are bringing new perspectives to health care quality assurance. These perspectives provide a new view of quality assurance as a major responsibility and strategic opportunity for management; a competitive and marketable commodity; and a method of improving safety, effectiveness, and satisfaction with medical care.

  9. BMAA extraction of cyanobacteria samples: which method to choose?

    Science.gov (United States)

    Lage, Sandra; Burian, Alfred; Rasmussen, Ulla; Costa, Pedro Reis; Annadotter, Heléne; Godhe, Anna; Rydberg, Sara

    2016-01-01

    β-N-Methylamino-L-alanine (BMAA), a neurotoxin reportedly produced by cyanobacteria, diatoms and dinoflagellates, is proposed to be linked to the development of neurological diseases. BMAA has been found in aquatic and terrestrial ecosystems worldwide, both in its phytoplankton producers and in several invertebrate and vertebrate organisms that bioaccumulate it. LC-MS/MS is the most frequently used analytical technique in BMAA research due to its high selectivity, though consensus is lacking as to the best extraction method to apply. This study accordingly surveys the efficiency of three extraction methods regularly used in BMAA research to extract BMAA from cyanobacteria samples. The results obtained provide insights into possible reasons for the BMAA concentration discrepancies in previous publications. In addition and according to the method validation guidelines for analysing cyanotoxins, the TCA protein precipitation method, followed by AQC derivatization and LC-MS/MS analysis, is now validated for extracting protein-bound (after protein hydrolysis) and free BMAA from cyanobacteria matrix. BMAA biological variability was also tested through the extraction of diatom and cyanobacteria species, revealing a high variance in BMAA levels (0.0080-2.5797 μg g(-1) DW).

  10. A sampling method for inverse scattering in the time domain

    International Nuclear Information System (INIS)

    We consider a near-field inverse scattering problem for the wave equation: find the shape of a Dirichlet scattering object from time domain measurements of scattered waves. For this time-domain inverse problem, we propose a linear sampling method, a well-known technique for corresponding frequency domain inverse scattering problems. The problem setting and the algorithm incorporate two basic features. First, the data for the method consist of measurements of causal waves, that is, of waves that vanish before some moment in time. Second, the inversion algorithm directly works on the time-domain data without using a Fourier transformation. The first point is related to the applications we have in mind, which include for instance ground-penetrating radar imaging. The second feature allows us to naturally incorporate multiple (in fact, a continuum of) frequencies in the inversion algorithm. Consequently, it offers the potential of improving the quality of the reconstruction compared to frequency domain methods working with a single frequency. We demonstrate this potential by several numerical examples

  11. Clustering Methods with Qualitative Data: a Mixed-Methods Approach for Prevention Research with Small Samples.

    Science.gov (United States)

    Henry, David; Dymnicki, Allison B; Mohatt, Nathaniel; Allen, James; Kelly, James G

    2015-10-01

    Qualitative methods potentially add depth to prevention research but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed-methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed-methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-means clustering, and latent class analysis produced similar levels of accuracy with binary data and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a "real-world" example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities.
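
    As a sketch of the kind of analysis described (clustering participants on binary code profiles from coded qualitative interviews), the snippet below applies average-linkage hierarchical clustering with Jaccard distances to a small synthetic code matrix generated from two underlying profiles. It assumes nothing about the authors' data or software and is not their exact procedure.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import pdist

        rng = np.random.default_rng(42)

        # Synthetic coded-interview matrix: 50 participants x 12 binary codes,
        # generated from two underlying profiles (cluster structure to recover)
        profile1 = np.array([0.8, 0.7, 0.6, 0.1, 0.1, 0.2, 0.7, 0.1, 0.6, 0.2, 0.1, 0.8])
        profile2 = np.array([0.1, 0.2, 0.1, 0.8, 0.7, 0.6, 0.1, 0.8, 0.2, 0.7, 0.6, 0.1])
        codes = np.vstack([rng.random((25, 12)) < profile1,
                           rng.random((25, 12)) < profile2]).astype(int)

        # Hierarchical clustering on Jaccard distance (suited to binary codes)
        dist = pdist(codes, metric="jaccard")
        tree = linkage(dist, method="average")
        labels = fcluster(tree, t=2, criterion="maxclust")

        for k in (1, 2):
            members = codes[labels == k]
            print(f"cluster {k}: n={len(members)}, mean code profile:",
                  np.round(members.mean(axis=0), 2))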

  12. Clustering Methods with Qualitative Data: a Mixed-Methods Approach for Prevention Research with Small Samples.

    Science.gov (United States)

    Henry, David; Dymnicki, Allison B; Mohatt, Nathaniel; Allen, James; Kelly, James G

    2015-10-01

    Qualitative methods potentially add depth to prevention research but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed-methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed-methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-means clustering, and latent class analysis produced similar levels of accuracy with binary data and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a "real-world" example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities. PMID:25946969

  13. A comparison of methods for representing sparsely sampled random quantities.

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente Jose; Swiler, Laura Painton; Urbina, Angel; Mullins, Joshua

    2013-09-01

    This report discusses the treatment of uncertainties stemming from relatively few samples of random quantities. The importance of this topic extends beyond experimental data uncertainty to situations involving uncertainty in model calibration, validation, and prediction. With very sparse data samples it is not practical to have a goal of accurately estimating the underlying probability density function (PDF). Rather, a pragmatic goal is that the uncertainty representation should be conservative so as to bound a specified percentile range of the actual PDF, say the range between the 0.025 and 0.975 percentiles, with reasonable reliability. A second, opposing objective is that the representation not be overly conservative, i.e., that it only minimally over-estimate the desired percentile range of the actual PDF. The presence of the two opposing objectives makes the sparse-data uncertainty representation problem interesting and difficult. In this report, five uncertainty representation techniques are characterized for their performance on twenty-one test problems (over thousands of trials for each problem) according to these two opposing objectives and other performance measures. Two of the methods, statistical Tolerance Intervals and a kernel density approach specifically developed for handling sparse data, exhibit significantly better overall performance than the others.
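
    One of the techniques compared in the report is statistical tolerance intervals; the snippet below shows the classical distribution-free calculation for the simplest such interval, [min, max] of n samples: the confidence that it covers at least a proportion p of the underlying distribution is 1 - n·p^(n-1) + (n-1)·p^n. This is the textbook order-statistic result, not the report's specific study.

        def coverage_confidence(n, p):
            # Confidence that [min, max] of n i.i.d. samples contains at least a
            # fraction p of the underlying distribution (distribution-free result)
            return 1.0 - n * p ** (n - 1) + (n - 1) * p ** n

        def samples_needed(p, target_conf):
            # Smallest n so the two-sided [min, max] tolerance interval reaches
            # the target confidence for content p
            n = 2
            while coverage_confidence(n, p) < target_conf:
                n += 1
            return n

        p = 0.95          # desired content (e.g. the 0.025-0.975 percentile range)
        for n in (5, 10, 20, 46, 93):
            print(f"n={n:3d}: confidence of covering {p:.0%} of the PDF "
                  f"= {coverage_confidence(n, p):.3f}")
        print("n needed for 95% confidence:", samples_needed(p, 0.95))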

  14. Cognitive anxiety: a method of content analysis for verbal samples.

    Science.gov (United States)

    Viney, L L; Westbrook, M T

    1976-04-01

    The work of such psychologists as Kelly, McReynolds, Epstein, and Lazarus suggested the need for a measure of cognitive anxiety and provided a definition of that construct. A method of content analysis of verbal samples was devised and found to have adequate interjudge reliability. Normative data for five groups of subjects were provided. The validity of the measure as representative of a reaction to being unable to anticipate and integrate experience meaningfully was demonstrated in (a) the higher scores of groups of subjects who were currently coping with new experiences compared with those who were not, (b) the significant correlation of its scores with state rather than trait anxiety measures, (c) the variability of its scores over time as observed in a generalizability study, and (d) the higher scores of subjects when they were dealing with experiences for which meaningful anticipation was relatively difficult. PMID:16367387

  15. Comprehensive method to analyze thick insulating samples using PIXE technique

    Science.gov (United States)

    Ismail, I. M.; Rihawy, M. S.

    2013-02-01

    In this work, we present a new method to analyze thick insulating samples by the PIXE technique. The method is based on the use of both an electron flood gun to compensate for the charge build-up at the insulating surface and a beam profile monitor (BPM) to provide a precise indirect measurement of the beam current and accumulated charge. A filament extracted from an ordinary flashlight lamp was used as the electron flood gun, while a commercial BPM was adapted to carry out the charge measurements. The results have demonstrated the convenience of using a BPM for measuring the charge in PIXE measurements. The use of the electron flood gun has given very satisfactory results in terms of preventing charge build-up and reducing its contribution to the bremsstrahlung background in the PIXE spectra. The applicability and efficiency of the overall system for elemental analysis were successfully verified using the IAEA-Soil-7 reference material, where both accuracy and precision were found to be better than 10% in most cases.

  16. Geothermal water and gas: collected methods for sampling and analysis. Comment issue. [Compilation of methods

    Energy Technology Data Exchange (ETDEWEB)

    Douglas, J.G.; Serne, R.J.; Shannon, D.W.; Woodruff, E.M.

    1976-08-01

    A collection of methods for sampling and analysis of geothermal fluids and gases is presented. Compilations of analytic options for constituents in water and gases are given. Also, a survey of published methods of laboratory water analysis is included. It is stated that no recommendation of the applicability of the methods to geothermal brines should be assumed since the intent of the table is to encourage and solicit comments and discussion leading to recommended analytical procedures for geothermal waters and research. (WHK)

  17. Sequential sampling: a novel method in farm animal welfare assessment.

    Science.gov (United States)

    Heath, C A E; Main, D C J; Mullan, S; Haskell, M J; Browne, W J

    2016-02-01

    Lameness in dairy cows is an important welfare issue. As part of a welfare assessment, herd level lameness prevalence can be estimated from scoring a sample of animals, where higher levels of accuracy are associated with larger sample sizes. As the financial cost is related to the number of cows sampled, smaller samples are preferred. Sequential sampling schemes have been used for informing decision making in clinical trials. Sequential sampling involves taking samples in stages, where sampling can stop early depending on the estimated lameness prevalence. When welfare assessment is used for a pass/fail decision, a similar approach could be applied to reduce the overall sample size. The sampling schemes proposed here apply the principles of sequential sampling within a diagnostic testing framework. This study develops three sequential sampling schemes of increasing complexity to classify 80 fully assessed UK dairy farms, each with known lameness prevalence. Using the Welfare Quality herd-size-based sampling scheme, the first 'basic' scheme involves two sampling events. At the first sampling event half the Welfare Quality sample size is drawn, and then depending on the outcome, sampling either stops or is continued and the same number of animals is sampled again. In the second 'cautious' scheme, an adaptation is made to ensure that correctly classifying a farm as 'bad' is done with greater certainty. The third scheme is the only scheme to go beyond lameness as a binary measure and investigates the potential for increasing accuracy by incorporating the number of severely lame cows into the decision. The three schemes are evaluated with respect to accuracy and average sample size by running 100 000 simulations for each scheme, and a comparison is made with the fixed size Welfare Quality herd-size-based sampling scheme. All three schemes performed almost as well as the fixed size scheme but with much smaller average sample sizes. For the third scheme, an overall
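
    A toy version of the 'basic' two-stage scheme described above, with invented herd parameters: half the fixed sample is scored first, the farm is classified immediately if the interim prevalence estimate is far from the pass/fail threshold, and otherwise the second half is scored. It only illustrates the idea of trading average sample size against accuracy, not the Welfare Quality sample sizes or the paper's simulations.

        import numpy as np

        rng = np.random.default_rng(3)

        def two_stage_decision(true_prev, full_n, fail_threshold, margin):
            # Toy two-stage sequential scheme: classify a farm as 'fail' if the
            # estimated lameness prevalence exceeds fail_threshold; stop after the
            # first half-sample when the interim estimate is far from the threshold.
            half = full_n // 2
            lame1 = rng.binomial(half, true_prev)
            p1 = lame1 / half
            if abs(p1 - fail_threshold) > margin:            # confident: stop early
                return ("fail" if p1 > fail_threshold else "pass"), half
            lame2 = rng.binomial(full_n - half, true_prev)   # otherwise sample the rest
            p_all = (lame1 + lame2) / full_n
            return ("fail" if p_all > fail_threshold else "pass"), full_n

        # Simulate many farms with known prevalences and compare average sample size
        results = [two_stage_decision(prev, full_n=60, fail_threshold=0.20, margin=0.08)
                   for prev in rng.uniform(0.02, 0.45, size=10_000)]
        decisions, sizes = zip(*results)
        print("mean cows scored per farm:", np.mean(sizes), "(fixed scheme would use 60)")
        print("share classified as fail :", decisions.count("fail") / len(decisions))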

  18. Sequential sampling: a novel method in farm animal welfare assessment.

    Science.gov (United States)

    Heath, C A E; Main, D C J; Mullan, S; Haskell, M J; Browne, W J

    2016-02-01

    Lameness in dairy cows is an important welfare issue. As part of a welfare assessment, herd level lameness prevalence can be estimated from scoring a sample of animals, where higher levels of accuracy are associated with larger sample sizes. As the financial cost is related to the number of cows sampled, smaller samples are preferred. Sequential sampling schemes have been used for informing decision making in clinical trials. Sequential sampling involves taking samples in stages, where sampling can stop early depending on the estimated lameness prevalence. When welfare assessment is used for a pass/fail decision, a similar approach could be applied to reduce the overall sample size. The sampling schemes proposed here apply the principles of sequential sampling within a diagnostic testing framework. This study develops three sequential sampling schemes of increasing complexity to classify 80 fully assessed UK dairy farms, each with known lameness prevalence. Using the Welfare Quality herd-size-based sampling scheme, the first 'basic' scheme involves two sampling events. At the first sampling event half the Welfare Quality sample size is drawn, and then depending on the outcome, sampling either stops or is continued and the same number of animals is sampled again. In the second 'cautious' scheme, an adaptation is made to ensure that correctly classifying a farm as 'bad' is done with greater certainty. The third scheme is the only scheme to go beyond lameness as a binary measure and investigates the potential for increasing accuracy by incorporating the number of severely lame cows into the decision. The three schemes are evaluated with respect to accuracy and average sample size by running 100 000 simulations for each scheme, and a comparison is made with the fixed size Welfare Quality herd-size-based sampling scheme. All three schemes performed almost as well as the fixed size scheme but with much smaller average sample sizes. For the third scheme, an overall

  19. Probing methane hydrate nucleation through the forward flux sampling method.

    Science.gov (United States)

    Bi, Yuanfei; Li, Tianshu

    2014-11-26

    Understanding the nucleation of hydrate is the key to developing effective strategies for controlling methane hydrate formation. Here we present a computational study of methane hydrate nucleation, by combining the forward flux sampling (FFS) method and the coarse-grained water model mW. To facilitate the application of FFS in studying the formation of methane hydrate, we developed an effective order parameter λ on the basis of the topological analysis of the tetrahedral network. The order parameter capitalizes on the signature of hydrate structure, i.e., polyhedral cages, and is capable of efficiently distinguishing hydrate from ice and liquid water while allowing the formation of different hydrate phases, i.e., sI, sII, and amorphous. Integration of the order parameter λ with FFS allows explicitly computing hydrate nucleation rates and obtaining an ensemble of nucleation trajectories under conditions where spontaneous hydrate nucleation becomes too slow to occur in direct simulation. The convergence of the obtained hydrate nucleation rate was found to depend crucially on the convergence of the spatial distribution of the spontaneously formed hydrate seeds obtained from the initial sampling of FFS. The validity of the approach is also verified by the agreement between the calculated nucleation rate and that inferred from direct simulation. Analyzing the obtained large ensemble of hydrate nucleation trajectories, we show that hydrate formation at 220 K and 500 bar is initiated by nucleation events occurring in the vicinity of the water-methane interface, and is facilitated by a gradual transition from amorphous to crystalline structure. The latter provides direct support for the proposed two-step nucleation mechanism of methane hydrate. PMID:24849698
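
    A minimal one-dimensional caricature of forward flux sampling, with the particle position in an overdamped double-well standing in for the order parameter λ: the initial flux through the first interface is estimated from a basin-A run, conditional crossing probabilities are estimated interface by interface from stored crossing configurations, and the rate is their product times the flux. Interface positions, temperature and trial counts are arbitrary; this is not the hydrate system or the mW model.

        import numpy as np

        rng = np.random.default_rng(0)
        beta, dt = 4.0, 1e-3
        grad_U = lambda x: 4.0 * x * (x * x - 1.0)    # double well, minima at x = -1, +1
        noise = np.sqrt(2.0 * dt / beta)

        def step(x):
            # One Euler-Maruyama step of overdamped Langevin dynamics
            return x - grad_U(x) * dt + noise * rng.standard_normal()

        lam_A = -0.8                                  # boundary of basin A
        interfaces = [-0.6, -0.3, 0.0, 0.3, 0.6]      # lambda_0 ... lambda_n

        # Phase 1: flux of first crossings of lambda_0 coming from basin A
        x, t, crossings, seeds, inside_A = -1.0, 0.0, 0, [], True
        while crossings < 200:
            x = step(x)
            t += dt
            if x < lam_A:
                inside_A = True
            elif inside_A and x > interfaces[0]:
                inside_A = False
                crossings += 1
                seeds.append(x)
            if x > interfaces[-1]:                    # reached basin B: restart in A
                x, inside_A = -1.0, True
        flux = crossings / t

        # Phase 2: conditional crossing probabilities P(lambda_{i+1} | lambda_i)
        rate = flux
        for i in range(len(interfaces) - 1):
            trials, successes, new_seeds = 400, 0, []
            for _ in range(trials):
                x = seeds[rng.integers(len(seeds))]
                while lam_A < x < interfaces[i + 1]:  # run until next interface or basin A
                    x = step(x)
                if x >= interfaces[i + 1]:
                    successes += 1
                    new_seeds.append(x)
            p = successes / trials
            rate *= p
            seeds = new_seeds or seeds                # keep old seeds if none succeeded
            print(f"P(lambda_{i+1} | lambda_{i}) = {p:.3f}")

        print("estimated transition rate (events per unit time):", rate)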

  20. Revitalizing quality assurance

    International Nuclear Information System (INIS)

    The image of someone inspecting or auditing often comes to mind when people hear the term quality assurance. Although partially correct, this image is not the complete picture. The person doing the inspecting or auditing is probably part of a traditional quality assurance organization, but that organization is only one aspect of a properly conceived and effectively implemented quality assurance system whose goal is improved facility safety and reliability. This paper introduces the underlying philosophies and basic concepts of the International Atomic Energy Agency's new quality assurance initiative that began in 1991 as part of a broad Agency-wide program to enhance nuclear safety. The first product of that initiative was publication in 1996 of a new Quality Assurance Code 50-C/SG-Q and fourteen related Safety Guides. This new suite of documents provide the technical and philosophical foundation upon which Member States can base their quality assurance programs. (author)

  1. Task Technical and Quality Assurance Plan for Testing Methods to Reduce 235 Uranium Enrichment in Tank 43H Supernatant Liquid

    Energy Technology Data Exchange (ETDEWEB)

    Hobbs, D.T.

    2000-10-24

    In July of 1997, the 2H-Evaporator was shut down due to the inability to lift material from the vessel. Inspections of the gravity drain line (GDL) showed a scale deposit coating the inside of the line. A sample of the material was obtained and analyses were performed. Plans are to chemically clean the evaporator pot by dissolving the solids in a 1.5 M nitric acid solution containing depleted uranium.

  2. Photoacoustic spectroscopy sample array vessels and photoacoustic spectroscopy methods for using the same

    Science.gov (United States)

    Amonette, James E.; Autrey, S. Thomas; Foster-Mills, Nancy S.

    2006-02-14

    Methods and apparatus for simultaneous or sequential, rapid analysis of multiple samples by photoacoustic spectroscopy are disclosed. Particularly, a photoacoustic spectroscopy sample array vessel including a vessel body having multiple sample cells connected thereto is disclosed. At least one acoustic detector is acoustically positioned near the sample cells. Methods for analyzing the multiple samples in the sample array vessels using photoacoustic spectroscopy are provided.

  3. Multi-Scaling Sampling: An Adaptive Sampling Method for Discovering Approximate Association Rules

    Institute of Scientific and Technical Information of China (English)

    Cai-Yan Jia; Xie-Ping Gao

    2005-01-01

    One of the obstacles to efficient association rule mining is the explosive expansion of data sets, since it is costly or impossible to scan large databases, especially multiple times. A popular solution for improving the speed and scalability of association rule mining is to run the algorithm on a random sample instead of the entire database. But how to effectively define and efficiently estimate the degree of error with respect to the outcome of the algorithm, and how to determine the sample size needed, have remained open research questions. In this paper, an effective and efficient algorithm based on PAC (Probably Approximately Correct) learning theory is given to measure and estimate sample error. Then, a new adaptive, on-line, fast sampling strategy - multi-scaling sampling - is presented, inspired by MRA (Multi-Resolution Analysis) and the Shannon sampling theorem, for quickly obtaining acceptably approximate association rules at an appropriate sample size. Both theoretical analysis and empirical study have shown that the sampling strategy can achieve a very good speed-accuracy trade-off.
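
    A rough feel for the PAC-style reasoning can be given with a Hoeffding bound: it fixes the number of sampled transactions needed to estimate a single itemset's support within a chosen error, and a simple doubling loop stops once that bound (or stabilization of the estimate) is reached. This is only a sketch of the general idea; the MRA-inspired multi-scaling strategy of the paper is not reproduced, and the error and confidence values are arbitrary.

        import math, random

        def hoeffding_sample_size(eps, delta):
            """Samples so that |estimated support - true support| <= eps with prob >= 1 - delta."""
            return math.ceil(math.log(2 / delta) / (2 * eps ** 2))

        def estimate_support(transactions, itemset, n, rng):
            sample = rng.sample(transactions, min(n, len(transactions)))
            return sum(itemset <= t for t in sample) / len(sample)

        def adaptive_support(transactions, itemset, eps=0.01, delta=0.05, start=512, rng=None):
            """Double the sample until the Hoeffding bound is met or the estimate stabilizes."""
            rng = rng or random.Random(0)
            target = hoeffding_sample_size(eps, delta)
            n, previous = start, None
            while True:
                estimate = estimate_support(transactions, itemset, n, rng)
                if n >= target or (previous is not None and abs(estimate - previous) < eps / 2):
                    return estimate, n
                previous, n = estimate, 2 * n

        # toy database: each transaction is a frozenset of item ids
        rng = random.Random(1)
        data = [frozenset(rng.sample(range(20), 5)) for _ in range(100_000)]
        print(adaptive_support(data, frozenset({1, 2})))   # (support estimate, sample size used)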

  4. Methods and quality assurance in environmental medicine. Formation of a RKI-Commission; Methoden und Qualitaetssicherung in der Umweltmedizin. Einrichtung einer Umweltmedizin-Kommission am RKI

    Energy Technology Data Exchange (ETDEWEB)

    Eis, D. [Bundesgesundheitsamt, Berlin (Germany). Robert-Koch-Institut

    2000-05-01

    An almost bewildering number of widely differing methods and techniques, often not validated, are being applied, often inappropriately, in the field of environmental medicine to answer questions regarding exposure assessment, diagnosis, treatment, counselling and prevention. Therefore, quality control within the field of environmental medicine is quite problematic. A primary goal of the newly formed RKI-Commission 'Methods and Quality Assurance in Environmental Medicine' is to form a panel of experts in the field, who evaluate the situation and generate consensus documents containing respective recommendations. In this way the commission will contribute to standardization and agreement on appropriate methods, procedures and their correct application in the practice of environmental medicine. Hopefully it will also achieve a stronger, more consistent use of evidence-based medicine and improve the quality of the structure, processes and results of research and practice in this field. The committee will initially deal with the issue of clinical environmental medicine, because here the largest problems in quality assurance are seen. In this context the commission will look at the problem areas of environmental-medical outpatient units and environmental clinics. The work of the commission will be supported by the newly formed Documentation and Evaluation Center for Methods in Environmental Medicine (Zentrale Erfassungs- und Bewertungsstelle fuer umweltmedizinische Methoden, ZEBUM) at the Robert Koch Institute. (orig.) [Translated from the German original] In the context of environmental-medical exposure assessment, diagnostics, counselling, therapy, prophylaxis and remediation, an almost unmanageable number of widely differing, partly non-validated procedures is being used, often on questionable indication. Quality assurance (QA) in environmental medicine has thus become a problem. A main concern of the newly established RKI-Commission 'Methods and Quality Assurance in

  5. A novel sampling method for the investigation of gut microbiota

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    AIM: In order to characterize, qualitatively and quantitatively, the microorganisms in different sites of the lower digestive tract (LDT) in healthy volunteers, a specific technique was developed for collecting mucus of the distal ileum, colon and rectum. METHODS: A polyethylene tube was designed to go through the colonoscope channel with a No. 8 French tube. In order to avoid internal contamination, the distal extremity was protected with a membrane of microfilm after being sterilized in ethylene oxide. To facilitate the aspiration of a precise volume, its interior was coated with silicone. A one hundred microliter (0.1 mL) sample of mucus was collected and transferred into an Eppendorf tube containing nine hundred microliters (0.9 mL) of VMGA-3 (viable medium of Goteborg). This procedure was repeated at each site of the LDT with a new sterilized catheter. RESULTS: All sites revealed the "non-pathogenic" anaerobic bacteria Veillonella sp (average 10^5 colony forming units/mL - CFU/mL), allowing the conclusion that the LDT is an environment of low oxidation-reduction (redox) potential. The presence of Klebsiella sp was also characterized, with significant statistical predominance (SSP) in the ileum. Enterobacter sp was found with SSP in the sigmoid colon, non-pigmented (npg) Bacteroides sp and E. coli with SSP in the sigmoid colon and rectum, and Enterococcus sp and Lactobacillus sp with SSP in the rectum, all at a mean concentration of 10^5 CFU/mL. CONCLUSION: This procedure is feasible and efficient and can point out a similar distribution of the aerobic and anaerobic bacteria, with the presence of biological markers of normal microbiota in the LDT.

  6. SAR imaging method based on coprime sampling and nested sparse sampling

    Institute of Scientific and Technical Information of China (English)

    Hongyin Shi; Baojing Jia

    2015-01-01

    As the signal bandwidth and the number of channels increase, the synthetic aperture radar (SAR) imaging system produces a huge amount of data according to the Shannon-Nyquist theorem, causing a heavy burden for data transmission. This paper concerns coprime sampling and nested sparse sampling, which were proposed recently but have never been applied to the real world for target detection, and proposes a novel way that utilizes these new sub-Nyquist sampling structures for SAR sampling in azimuth and reconstructs the sampled SAR data by compressive sensing (CS). Both simulated and real data are processed to test the algorithm, and the results indicate that the approach combining these new undersampling structures with CS is able to achieve SAR imaging effectively with much less data than regular approaches require. Finally, the influence of a small sampling jitter on SAR imaging is analyzed theoretically and experimentally, and it is concluded that a small sampling jitter has no effect on the image quality of SAR.
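
    The reconstruction half of such a pipeline can be illustrated generically: recover a signal that is sparse in the DFT basis from a random subset of its time samples with a small orthogonal matching pursuit solver. This is not the coprime/nested azimuth sampling scheme of the paper; the signal length, sparsity and sampling pattern below are arbitrary.

        import numpy as np

        rng = np.random.default_rng(0)
        N, M, K = 256, 64, 3                               # signal length, measurements, sparsity

        spectrum = np.zeros(N, complex)                    # K-sparse spectrum -> time-domain signal
        spectrum[rng.choice(N, K, replace=False)] = rng.standard_normal(K) + 1j * rng.standard_normal(K)
        x = np.fft.ifft(spectrum) * N

        rows = np.sort(rng.choice(N, M, replace=False))    # sub-Nyquist sample positions
        y = x[rows]                                        # the measurements
        A = (np.fft.ifft(np.eye(N), axis=0) * N)[rows]     # measurement matrix: sampled IDFT

        def omp(A, y, k):
            """Orthogonal matching pursuit for y ~ A @ s with s assumed k-sparse."""
            residual, support = y.copy(), []
            for _ in range(k):
                corr = np.abs(A.conj().T @ residual)
                corr[support] = 0                          # do not pick an index twice
                support.append(int(np.argmax(corr)))
                coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
                residual = y - A[:, support] @ coeffs
            s = np.zeros(A.shape[1], complex)
            s[support] = coeffs
            return s

        s_hat = omp(A, y, K)
        print(sorted(map(int, np.flatnonzero(s_hat))) == sorted(map(int, np.flatnonzero(spectrum))))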

  7. Acoustically levitated droplets: a contactless sampling method for fluorescence studies.

    Science.gov (United States)

    Leiterer, Jork; Grabolle, Markus; Rurack, Knut; Resch-Genger, Ute; Ziegler, Jan; Nann, Thomas; Panne, Ulrich

    2008-01-01

    Acoustic levitation is used as a new tool to study concentration-dependent processes in fluorescence spectroscopy. With this technique, small amounts of liquid and solid samples can be measured without the need for sample supports or containers, which often limits signal acquisition and can even alter sample properties due to interactions with the support material. We demonstrate that, because of the small sample volume, fluorescence measurements at high concentrations of an organic dye are possible without the limitation of inner-filter effects, which hamper such experiments in conventional, cuvette-based measurements. Furthermore, we show that acoustic levitation of liquid samples provides an experimentally simple way to study distance-dependent fluorescence modulations in semiconductor nanocrystals. The evaporation of the solvent during levitation leads to a continuous increase of solute concentration and can easily be monitored by laser-induced fluorescence. PMID:18596335

  8. Structure and process of university teaching in psychiatry: a field for methods of quality assurance and evaluation

    Directory of Open Access Journals (Sweden)

    Barkmann, Claus

    2005-08-01

    Objective: Given the exceptional workload at a university psychiatric hospital and the current emphasis on clinical medicine and science, teaching is systematically being neglected. Methods: With the help of evaluation methods involving the completion of a questionnaire, lectures and seminars held during one semester at the Department for Child and Adolescent Psychiatry and Psychotherapy of the University Hospital Hamburg-Eppendorf were assessed separately by students and lecturers in terms of form, content, lecturers, and overall assessment. Results: Despite organizational shortcomings, the lectures and seminars were rated on average as good in all four assessment areas. Using a bivariate prediction model, it was possible to explain 46% of the variance in overall assessment. A surprisingly high concordance was found between the assessments by students and lecturers. Conclusion: Continuous and systematic evaluation of lectures and seminars ensures and improves the quality of current and future teaching methods.

  9. Establishing a novel automated magnetic bead-based method for the extraction of DNA from a variety of forensic samples.

    Science.gov (United States)

    Witt, Sebastian; Neumann, Jan; Zierdt, Holger; Gébel, Gabriella; Röscheisen, Christiane

    2012-09-01

    Automated systems have been increasingly utilized for DNA extraction by many forensic laboratories to handle growing numbers of forensic casework samples while minimizing the risk of human errors and assuring high reproducibility. The step towards automation however is not easy: The automated extraction method has to be very versatile to reliably prepare high yields of pure genomic DNA from a broad variety of sample types on different carrier materials. To prevent possible cross-contamination of samples or the loss of DNA, the components of the kit have to be designed in a way that allows for the automated handling of the samples with no manual intervention necessary. DNA extraction using paramagnetic particles coated with a DNA-binding surface is predestined for an automated approach. For this study, we tested different DNA extraction kits using DNA-binding paramagnetic particles with regard to DNA yield and handling by a Freedom EVO(®)150 extraction robot (Tecan) equipped with a Te-MagS magnetic separator. Among others, the extraction kits tested were the ChargeSwitch(®)Forensic DNA Purification Kit (Invitrogen), the PrepFiler™Automated Forensic DNA Extraction Kit (Applied Biosystems) and NucleoMag™96 Trace (Macherey-Nagel). After an extensive test phase, we established a novel magnetic bead extraction method based upon the NucleoMag™ extraction kit (Macherey-Nagel). The new method is readily automatable and produces high yields of DNA from different sample types (blood, saliva, sperm, contact stains) on various substrates (filter paper, swabs, cigarette butts) with no evidence of a loss of magnetic beads or sample cross-contamination.

  10. Analytical results, database management and quality assurance for analysis of soil and groundwater samples collected by cone penetrometer from the F and H Area seepage basins

    Energy Technology Data Exchange (ETDEWEB)

    Boltz, D.R.; Johnson, W.H.; Serkiz, S.M.

    1994-10-01

    The Quantification of Soil Source Terms and Determination of the Geochemistry Controlling Distribution Coefficients (Kd values) of Contaminants at the F- and H-Area Seepage Basins (FHSB) study was designed to generate site-specific contaminant transport factors for contaminated groundwater downgradient of the Basins. The experimental approach employed in this study was to collect soil and its associated porewater from contaminated areas downgradient of the FHSB. Samples were collected over a wide range of geochemical conditions (e.g., pH, conductivity, and contaminant concentration) and were used to describe the partitioning of contaminants between the aqueous phase and soil surfaces at the site. The partitioning behavior may be used to develop site-specific transport factors. This report summarizes the analytical procedures and results for both soil and porewater samples collected as part of this study and the database management of these data.

  11. High assurance services computing

    CERN Document Server

    2009-01-01

    Covers service-oriented technologies in different domains, including high assurance systems. Assists software engineers from industry and government laboratories who develop mission-critical software, and simultaneously provides academia with a practitioner's outlook on the problems of high-assurance software development.

  12. Authentication Assurance Levels

    International Nuclear Information System (INIS)

    This Common Criteria approach has been applied to create a definition of Authentication Assurance Levels that can quantify the level of assurance reached for a system subject to a set of authentication procedures. The arms-control authentication application of the Common Criteria expands on more typical information security evaluations in that it must contend with information barriers and preclude sophisticated intentional subversion attempts.

  13. Quality assurance program

    International Nuclear Information System (INIS)

    The concept of levels of quality assurance as applied to CANDU-type nuclear power plant components, i.e. maintaining an appropriate cost/benefit ratio, is introduced. The design process itself has quality assurance features by virtue of multi-level review. (E.C.B.)

  14. Mathematical Optimum of the Audit Sample

    OpenAIRE

    Georgeta Ancuta Span; Irimie Emil Popa

    2012-01-01

    Problem statement: The primary objective of any audit mission is to obtain a high level of assurance on the fact that financial statements are prepared in accordance with a general financial reporting framework. Getting an absolute level of assurance is not possible due to the complexity and big number of transactions and operations found in practice. This research aims to identify the shortcomings of one of the sampling methods used by the Romania auditors (80/20 method) and it was proposed ...
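
    To make the assurance/sample-size trade-off concrete, one common attribute ('discovery') sampling calculation is sketched below: the smallest sample for which at least one deviation would be expected to appear with a chosen confidence, assuming zero deviations are tolerated. It is offered only as background and is not the 80/20 method examined in the paper.

        import math

        def discovery_sample_size(confidence, tolerable_rate):
            """Smallest n with 1 - (1 - tolerable_rate)**n >= confidence (zero expected deviations)."""
            return math.ceil(math.log(1 - confidence) / math.log(1 - tolerable_rate))

        for confidence in (0.90, 0.95):
            print(confidence, discovery_sample_size(confidence, tolerable_rate=0.05))
        # at a 5% tolerable deviation rate: 45 items for 90% assurance, 59 items for 95%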

  15. Processes and procedures for a worldwide biological samples distribution; product assurance and logistic activities to support the mice drawer system tissue sharing event

    Science.gov (United States)

    Benassai, Mario; Cotronei, Vittorio

    The Mice Drawer System (MDS) is a scientific payload developed by the Italian Space Agency (ASI); it hosted 6 mice on the International Space Station (ISS) and returned to ground on November 28, 2009 with STS-129 at KSC. Linked to the MDS experiment, a Tissue Sharing Program (TSP) was developed in order to make the biological samples coming from the mice available to 16 Payload Investigators (PI) located in the USA, Canada, the EU (Italy, Belgium and Germany) and Japan. ALTEC SpA (a PPP owned by ASI, TAS-I and local institutions) was responsible for supporting the logistics aspects of the MDS samples for the first MDS mission, in the frame of the Italian Space Agency (ASI) OSMA program (OSteoporosis and Muscle Atrophy). The TSP resulted in a complex scenario, as ASI progressively extended the original OSMA Team to researchers from other ASI programs and from other agencies (ESA, NASA, JAXA). The science coordination was performed by the University of Genova (UNIGE). ALTEC managed the whole logistic process with the support of a specialized freight forwarder agent during all shipping operation phases. ALTEC formalized all the steps from the handover of samples by the dissection team to the packaging and shipping process in a dedicated procedure. ALTEC approached all the work in a structured way, performing: a study of the aspects connected to international shipments of biological samples; a cooperative work with UNIGE/ASI/PIs to identify all the needs of the various researchers and their compatibility; a complete revision and integration of shipment requirements (addresses, temperatures, samples, materials and so on); a complete definition of the final shipment scenario in terms of boxes, content, refrigerant and requirements; a formal approach to identification and selection of the most suited and specialized freight forwarder; and a clear identification of all the processes from sample dissection by the PI team, sample processing, freezing, tube preparation

  16. Contaminated soil remediation and quality assurance; Pilaantuneen maan kunnostaminen ja laadunvarmistus

    Energy Technology Data Exchange (ETDEWEB)

    Sarkkila, J.; Mroueh, U.M.; Leino-Forsman, H.

    2004-07-01

    The aim of contaminated soil remediation quality assurance is to carry out remediation activities according to plans. Besides the design work, the appropriate implementation of quality assurance covers source data and investigation methods as well as the requirements for documentation. Contaminated soil characterization and the selection of the most suitable remediation method are made with the help of various sampling and analysis methods. There are different kinds of requirements for the sampling plan depending on the type of remediation project. Quality assurance is taken into account in sampling, in sample handling and analysis, as well as in the reporting of results. The most common unsaturated zone remediation methods used in Finland are introduced in this guide. These methods include excavation (as part of remediation), encapsulation, stabilization, thermal desorption, soil washing, composting, soil vapor extraction and bioventing. The methods are introduced on a general level with emphasis on their technical implementation and feasibility as well as on any material requirements. Harmful environmental impacts of the methods must be identified and prevented. In order to monitor the remediation process, various chemical and physical quality assurance measurements are performed. Additionally, the work safety issues related to remediation methods must be taken into account and proper documentation must be prepared. (orig.)

  17. Photoacoustic spectroscopy sample array vessel and photoacoustic spectroscopy method for using the same

    Science.gov (United States)

    Amonette, James E.; Autrey, S. Thomas; Foster-Mills, Nancy S.; Green, David

    2005-03-29

    Methods and apparatus for analysis of multiple samples by photoacoustic spectroscopy are disclosed. Particularly, a photoacoustic spectroscopy sample array vessel including a vessel body having multiple sample cells connected thereto is disclosed. At least one acoustic detector is acoustically coupled with the vessel body. Methods for analyzing the multiple samples in the sample array vessels using photoacoustic spectroscopy are provided.

  18. Internal Quality Assurance System and Its Implementation in Kaunas College

    Science.gov (United States)

    Misiunas, Mindaugas

    2007-01-01

    The article discusses the internal system of quality assurance and its implementation methods in Kaunas College. The issues of quality assurance are reviewed in the context of the European higher education area covering the three levels: European, national and institutional. The importance of quality assurance and its links with external…

  19. SU-E-T-570: New Quality Assurance Method Using Motion Tracking for 6D Robotic Couches

    International Nuclear Information System (INIS)

    Purpose: To accommodate geometrically accurate patient positioning, a robotic couch that is capable of 6-degrees of freedom has been introduced. However, conventional couch QA methods are not sufficient to enable the necessary accuracy of tests. Therefore, we have developed a camera based motion detection and geometry calibration system for couch QA. Methods: Employing a Visual-Tracking System (VTS, BonitaB10, Vicon, UK) which tracks infrared reflective (IR) markers, camera calibration was conducted using a 5.7 × 5.7 × 5.7 cm3 cube attached with IR markers at each corner. After positioning a robotic-couch at the origin with the cube on the table top, 3D coordinates of the cube's eight corners were acquired by VTS in the VTS coordinate system. Next, positions in reference coordinates (room coordinates) were assigned using the known relation between each point. Finally, camera calibration was completed by finding a transformation matrix between VTS and reference coordinate systems and by applying a pseudo inverse matrix method. After the calibration, the accuracy of linear and rotational motions as well as couch sagging could be measured by analyzing the continuously acquired data of the cube while the couch moves to a designated position. Accuracy of the developed software was verified through comparison with measurement data when using a Laser tracker (FARO, Lake Mary, USA) for a robotic-couch installed for proton therapy. Results: VTS system could track couch motion accurately and measured position in room-coordinates. The VTS measurements and Laser tracker data agreed within 1% of difference for linear and rotational motions. Also because the program analyzes motion in 3-Dimension, it can compute couch sagging. Conclusion: Developed QA system provides submillimeter/degree accuracy which fulfills the high-end couch QA. This work was supported by the National Research Foundation of Korea funded by Ministry of Science, ICT & Future Planning. (2013M2A2A7043507 and

  20. SU-E-T-570: New Quality Assurance Method Using Motion Tracking for 6D Robotic Couches

    Energy Technology Data Exchange (ETDEWEB)

    Cheon, W; Cho, J [SungKyunKwan University, Seoul (Korea, Republic of); Ahn, S [Samsung Medical Center, Seoul (Korea, Republic of); Han, Y; Choi, D [Samsung Medical Center, Sungkyunkwan University School of Medicine, Seoul (Korea, Republic of)

    2015-06-15

    Purpose: To accommodate geometrically accurate patient positioning, a robotic couch that is capable of 6-degrees of freedom has been introduced. However, conventional couch QA methods are not sufficient to enable the necessary accuracy of tests. Therefore, we have developed a camera based motion detection and geometry calibration system for couch QA. Methods: Employing a Visual-Tracking System (VTS, BonitaB10, Vicon, UK) which tracks infrared reflective (IR) markers, camera calibration was conducted using a 5.7 × 5.7 × 5.7 cm3 cube attached with IR markers at each corner. After positioning a robotic-couch at the origin with the cube on the table top, 3D coordinates of the cube's eight corners were acquired by VTS in the VTS coordinate system. Next, positions in reference coordinates (room coordinates) were assigned using the known relation between each point. Finally, camera calibration was completed by finding a transformation matrix between VTS and reference coordinate systems and by applying a pseudo inverse matrix method. After the calibration, the accuracy of linear and rotational motions as well as couch sagging could be measured by analyzing the continuously acquired data of the cube while the couch moves to a designated position. Accuracy of the developed software was verified through comparison with measurement data when using a Laser tracker (FARO, Lake Mary, USA) for a robotic-couch installed for proton therapy. Results: VTS system could track couch motion accurately and measured position in room-coordinates. The VTS measurements and Laser tracker data agreed within 1% of difference for linear and rotational motions. Also because the program analyzes motion in 3-Dimension, it can compute couch sagging. Conclusion: Developed QA system provides submillimeter/degree accuracy which fulfills the high-end couch QA. This work was supported by the National Research Foundation of Korea funded by Ministry of Science, ICT & Future Planning. (2013M2A2A
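
    The calibration step described above, relating tracker and room coordinates from the eight cube corners with a pseudo-inverse, amounts to a least-squares affine fit. The sketch below illustrates that step only; the corner coordinates, rotation, translation and noise level are invented for the example.

        import numpy as np

        rng = np.random.default_rng(0)

        # eight corners of a 5.7 cm cube in room coordinates (metres)
        corners_room = np.array([[x, y, z] for x in (0, 0.057) for y in (0, 0.057) for z in (0, 0.057)])

        # pretend the tracker reports them in its own rotated/translated frame, with a little noise
        angle = np.deg2rad(20)
        R = np.array([[np.cos(angle), -np.sin(angle), 0],
                      [np.sin(angle),  np.cos(angle), 0],
                      [0,              0,             1]])
        corners_vts = corners_room @ R.T + np.array([0.10, -0.05, 0.30]) + rng.normal(0, 1e-4, (8, 3))

        # least-squares affine fit via the pseudo-inverse: [x_vts, 1] @ T ~ x_room, with T of shape 4 x 3
        X = np.hstack([corners_vts, np.ones((8, 1))])
        T = np.linalg.pinv(X) @ corners_room

        def vts_to_room(p):
            """Map a tracker measurement into room coordinates with the fitted transform."""
            return np.append(p, 1.0) @ T

        print(np.abs(vts_to_room(corners_vts[0]) - corners_room[0]))   # residual on the order of 1e-4 m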

  1. Facile implementation of integrated tempering sampling method to enhance the sampling over a broad range of temperatures

    CERN Document Server

    Zhao, Peng; Gao, Yi Qin; Lu, Zhong-Yuan

    2013-01-01

    Integrated tempering sampling (ITS) method is an approach to enhance the sampling over a broad range of energies and temperatures in computer simulations. In this paper, a new version of integrated tempering sampling method is proposed. In the new approach presented here, we obtain parameters such as the set of temperatures and the corresponding weighting factors from canonical average of potential energies. These parameters can be easily obtained without estimating partition functions. We apply this new approach to study the Lennard-Jones fluid, the ALA-PRO peptide and the single polymer chain systems to validate and benchmark the method.
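
    For orientation, the sketch below writes out the bookkeeping of the standard ITS formulation: the effective potential sampled over a temperature ladder and the reweighting used to recover canonical averages at the target temperature. The paper's contribution, obtaining the temperature set and weighting factors from canonical averages of the potential energy, is not reproduced; uniform weighting factors and fake energies are used as placeholders.

        import numpy as np

        def its_effective_energy(U, betas, n, beta0):
            """U_eff = -(1/beta0) * ln sum_k n_k exp(-beta_k * U), computed stably."""
            U = np.atleast_1d(U)
            a = -np.outer(U, betas) + np.log(n)          # shape (frames, temperatures)
            m = a.max(axis=1, keepdims=True)
            lse = m[:, 0] + np.log(np.exp(a - m).sum(axis=1))
            return -lse / beta0

        def reweight_to_beta0(U, betas, n, beta0):
            """Weights that turn ITS-sampled frames into canonical averages at beta0."""
            U = np.atleast_1d(U)
            denom = np.exp(-np.outer(U, betas) + np.log(n)).sum(axis=1)
            w = np.exp(-beta0 * U) / denom
            return w / w.sum()

        betas = 1.0 / np.linspace(0.8, 1.6, 8)           # illustrative temperature ladder (beta = 1/T)
        n = np.ones_like(betas)                          # placeholder weighting factors
        U_frames = np.random.default_rng(0).normal(-50, 5, 1000)   # fake sampled potential energies
        w = reweight_to_beta0(U_frames, betas, n, beta0=betas[0])
        print("U_eff of first frame:", float(its_effective_energy(U_frames[0], betas, n, betas[0])[0]))
        print("reweighted <U> at beta0:", float(np.sum(w * U_frames)))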

  2. Methods of sampling airborne fungi in working environments of waste treatment facilities

    OpenAIRE

    Kristýna Černá; Zdeňka Wittlingerová; Magdaléna Zimová; Zdeněk Janovský

    2016-01-01

    Objectives: The objective of the present study was to evaluate and compare the efficiency of a filter based sampling method and a high volume sampling method for sampling airborne culturable fungi present in waste sorting facilities. Material and Methods: Membrane filters method was compared with surface air system method. The selected sampling methods were modified and tested in 2 plastic waste sorting facilities. Results: The total number of colony-forming units (CFU)/m3 of airborne fungi w...

  3. Statistical Methods and Tools for Hanford Staged Feed Tank Sampling

    Energy Technology Data Exchange (ETDEWEB)

    Fountain, Matthew S.; Brigantic, Robert T.; Peterson, Reid A.

    2013-10-01

    This report summarizes work conducted by Pacific Northwest National Laboratory to technically evaluate the current approach to staged feed sampling of high-level waste (HLW) sludge to meet waste acceptance criteria (WAC) for transfer from tank farms to the Hanford Waste Treatment and Immobilization Plant (WTP). The current sampling and analysis approach is detailed in the document titled Initial Data Quality Objectives for WTP Feed Acceptance Criteria, 24590-WTP-RPT-MGT-11-014, Revision 0 (Arakali et al. 2011). The goal of this current work is to evaluate and provide recommendations to support a defensible, technical and statistical basis for the staged feed sampling approach that meets WAC data quality objectives (DQOs).

  4. Method for validating radiobiological samples using a linear accelerator

    International Nuclear Information System (INIS)

    There is an immediate need for rapid triage of the population in case of a large scale exposure to ionizing radiation. Knowing the dose absorbed by the body will allow clinicians to administer medical treatment for the best chance of recovery for the victim. In addition, today's radiotherapy treatment could benefit from additional information regarding the patient's sensitivity to radiation before starting the treatment. As of today, there is no system in place to respond to this demand. This paper will describe specific procedures to mimic the effects of human exposure to ionizing radiation creating the tools for optimization of administered radiation dosimetry for radiotherapy and/or to estimate the doses of radiation received accidentally during a radiation event that could pose a danger to the public. In order to obtain irradiated biological samples to study ionizing radiation absorbed by the body, we performed ex-vivo irradiation of human blood samples using the linear accelerator (LINAC). The LINAC was implemented and calibrated for irradiating human whole blood samples. To test the calibration, a 2 Gy test run was successfully performed on a tube filled with water with an accuracy of 3% in dose distribution. To validate our technique the blood samples were ex-vivo irradiated and the results were analyzed using a gene expression assay to follow the effect of the ionizing irradiation by characterizing dose responsive biomarkers from radiobiological assays. The response of 5 genes was monitored resulting in expression increase with the dose of radiation received. The blood samples treated with the LINAC can provide effective irradiated blood samples suitable for molecular profiling to validate radiobiological measurements via the gene-expression based biodosimetry tools. (orig.)

  5. Apparatus and method for centrifugation and robotic manipulation of samples

    Science.gov (United States)

    Vellinger, John C. (Inventor); Ormsby, Rachel A. (Inventor); Kennedy, David J. (Inventor); Thomas, Nathan A. (Inventor); Shulthise, Leo A. (Inventor); Kurk, Michael A. (Inventor); Metz, George W. (Inventor)

    2007-01-01

    A device for centrifugation and robotic manipulation of specimen samples, including incubating eggs, and uses thereof are provided. The device may advantageously be used for the incubation of avian, reptilian or any type of vertebrate eggs. The apparatus comprises a mechanism for holding samples individually, rotating them individually, rotating them on a centrifuge collectively, injecting them individually with a fixative or other chemical reagent, and maintaining them at controlled temperature, relative humidity and atmospheric composition. The device is applicable to experiments involving entities other than eggs, such as invertebrate specimens, plants, microorganisms and molecular systems.

  6. Ten years of balanced sampling with the cube method: An appraisal

    OpenAIRE

    Tillé, Yves

    2016-01-01

    This paper presents a review and assessment of the use of balanced sampling by means of the cube method. After defining the notion of balanced sample and balanced sampling, a short history of the concept of balancing is presented. The theory of the cube method is briefly presented. Emphasis is placed on the practical problems posed by balanced sampling: the interest of the method with respect to other sampling methods and calibration, the field of application, the accuracy of balancing, the c...

  7. An Importance Sampling Simulation Method for Bayesian Decision Feedback Equalizers

    OpenAIRE

    Chen, S.; Hanzo, L.

    2000-01-01

    An importance sampling (IS) simulation technique is presented for evaluating the lower-bound bit error rate (BER) of the Bayesian decision feedback equalizer (DFE) under the assumption of correct decisions being fed back. A design procedure is developed, which chooses appropriate bias vectors for the simulation density to ensure asymptotic efficiency of the IS simulation.
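
    The core of the technique, drawing from a biased simulation density and correcting the counts with likelihood-ratio weights, can be shown on a generic rare-event example. The sketch below estimates a Gaussian tail probability with a mean-shifted density; it illustrates importance sampling in general, not the DFE-specific design procedure of the paper.

        import math, random

        def is_tail_probability(t, n=100_000, rng=None):
            """Estimate P(X > t) for X ~ N(0,1) by sampling from the shifted density N(t,1)."""
            rng = rng or random.Random(0)
            total = 0.0
            for _ in range(n):
                x = rng.gauss(t, 1.0)                          # biased (shifted) simulation density
                if x > t:
                    total += math.exp(-t * x + 0.5 * t * t)    # likelihood ratio N(0,1)/N(t,1)
            return total / n

        t = 4.0
        exact = 0.5 * math.erfc(t / math.sqrt(2))
        print(is_tail_probability(t), exact)   # both ~3.2e-5; naive Monte Carlo would need ~1e7 draws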

  8. Improved sample management in the cylindrical-tube microelectrophoresis method

    Science.gov (United States)

    Smolka, A. J. K.

    1980-01-01

    A modification to an analytical microelectrophoresis system is described that improves the manipulation of the sample particles and fluid. The apparatus modification and improved operational procedure should yield more accurate measurements of particle mobilities and permit less skilled operators to use the apparatus.

  9. Rapid methods for measuring radionuclides in food and environmental samples

    International Nuclear Information System (INIS)

    The application of ICP/mass spectrometry for the isotopic analysis of environmental samples, the use of drum assayers for measuring radionuclides in food and a rapid procedure for the measurement of the transuranic elements and thorium, performed at the Pacific Northwest Laboratory are discussed

  10. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

    Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data Sampling provides an up-to-date treat

  11. Estimation of uranium in columbite-tantalite samples: a method for sample solution preparation for fluorimetric estimation

    International Nuclear Information System (INIS)

    A method has been developed for obtaining a clear solution of columbite-tantalite samples in nitric acid medium before the fluorimetric estimation of uranium. Ammonium hydrogen fluoride is used to keep tantalum, niobium and titanium dissolved in the acid medium. The excess of fluoride is complexed with boric acid. The method has been successfully applied to a number of synthetic and natural columbite-tantalite samples. (author)

  12. Software quality assurance handbook

    Energy Technology Data Exchange (ETDEWEB)

    1990-09-01

    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  13. Method for spiking soil samples with organic compounds

    DEFF Research Database (Denmark)

    Brinch, Ulla C; Ekelund, Flemming; Jacobsen, Carsten S

    2002-01-01

    We examined the harmful side effects on indigenous soil microorganisms of two organic solvents, acetone and dichloromethane, that are normally used for spiking of soil with polycyclic aromatic hydrocarbons for experimental purposes. The solvents were applied in two contamination protocols to either the whole soil sample or 25% of the soil volume, which was subsequently mixed with 75% untreated soil. For dichloromethane, we included a third protocol, which involved application to 80% of the soil volume with or without phenanthrene and introduction of Pseudomonas fluorescens VKI171 SJ132 genetically tagged with luxAB::Tn5. For both solvents, application to the whole sample resulted in severe side effects on both indigenous protozoa and bacteria. Application of dichloromethane to the whole soil volume immediately reduced the number of protozoa to below the detection limit. In one of the soils...

  14. Sample preparation method for induced mutation on orchid

    International Nuclear Information System (INIS)

    Studies on the induction of mutation in Dendrobium orchid at MINT have produced a number of new orchid mutant cultivars. Tissue culture techniques on orchid seeds and meristem cloning are employed in preparing the samples for the mutation induction. Solid medium based on Murashige and Skoog (1962) and liquid medium based on Vacin and Went (1949) were found to be suitable for producing the protocorm-like bodies (PLBs) that are required for the irradiation treatment. (Author)

  15. Method for Spiking Soil Samples with Organic Compounds

    OpenAIRE

    Brinch, Ulla C.; Ekelund, Flemming; Jacobsen, Carsten S.

    2002-01-01

    We examined the harmful side effects on indigenous soil microorganisms of two organic solvents, acetone and dichloromethane, that are normally used for spiking of soil with polycyclic aromatic hydrocarbons for experimental purposes. The solvents were applied in two contamination protocols to either the whole soil sample or 25% of the soil volume, which was subsequently mixed with 75% untreated soil. For dichloromethane, we included a third protocol, which involved application to 80% of the so...

  16. Randomized Interior Point methods for Sampling and Optimization

    CERN Document Server

    Narayanan, Hariharan

    2009-01-01

    All known Markov Chains for sampling a general convex set have mixing times that depend upon the aspect ratio of the convex set, a measure of which is the ratio between the radius of the smallest sphere circumscribed around K and the radius of the largest sphere inscribed in K. Extending earlier work on polytopes, we present a Markov chain for sampling from a convex body using a self-concordant barrier, whose mixing time does not depend on its aspect ratio or diameter. The mixing time of this chain is invariant under affine transformations of the convex set, thus eliminating the need for first placing the body in an isotropic position. Whether the body is in isotropic position or not, if it is the intersection of O(n^(1-epsilon)) ellipsoids or the feasible set corresponding to semi-definite constraints of rank O(n^(1-epsilon)), the bounds on the mixing time improve upon existing bounds. In the cases where the self-concordant barrier has a tractable closed form, the Markov chain leads to efficient sampling...

  17. A new method of snowmelt sampling for water stable isotopes

    Science.gov (United States)

    Penna, D.; Ahmad, M.; Birks, S. J.; Bouchaou, L.; Brencic, M.; Butt, S.; Holko, L.; Jeelani, G.; Martinez, D. E.; Melikadze, G.; Shanley, J.B.; Sokratov, S. A.; Stadnyk, T.; Sugimoto, A.; Vreca, P.

    2014-01-01

    We modified a passive capillary sampler (PCS) to collect snowmelt water for isotopic analysis. Past applications of PCSs have been to sample soil water, but the novel aspect of this study was the placement of the PCSs at the ground-snowpack interface to collect snowmelt. We deployed arrays of PCSs at 11 sites in ten partner countries on five continents representing a range of climate and snow cover worldwide. The PCS reliably collected snowmelt at all sites and caused negligible evaporative fractionation effects in the samples. PCS is low-cost, easy to install, and collects a representative integrated snowmelt sample throughout the melt season or at the melt event scale. Unlike snow cores, the PCS collects the water that would actually infiltrate the soil; thus, its isotopic composition is appropriate to use for tracing snowmelt water through the hydrologic cycle. The purpose of this Briefing is to show the potential advantages of PCSs and recommend guidelines for constructing and installing them based on our preliminary results from two snowmelt seasons.

  18. Requirement Assurance: A Verification Process

    Science.gov (United States)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  19. On-line sample processing methods in flow analysis

    DEFF Research Database (Denmark)

    Miró, Manuel; Hansen, Elo Harald

    2008-01-01

    On-line dilution, derivatization, separation and preconcentration methods encompassing solid reactors, solvent extraction, sorbent extraction, precipitation/coprecipitation, hydride/vapor generation and digestion/leaching protocols, as hyphenated to a plethora of detection devices, are discussed in detail...

  20. Automatic Sampling with the Ratio-of-uniforms Method

    OpenAIRE

    Leydold, Josef

    1999-01-01

    Applying the ratio-of-uniforms method for generating random variates results in very efficient, fast and easy to implement algorithms. However parameters for every particular type of density must be precalculated analytically. In this paper we show, that the ratio-of-uniforms method is also useful for the design of a black-box algorithm suitable for a large class of distributions, including all with log-concave densities. Using polygonal envelopes and squeezes results in an algorithm that is ...
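
    A bare-bones, non-adaptive version of the idea is easy to write down and is sketched below for a standard normal target; the fixed bounding box must be supplied by hand, and the polygonal envelopes and squeezes that make the black-box algorithm of the paper efficient are not implemented.

        import math, random

        def rou_sample(f, u_max, v_min, v_max, rng=random):
            """Ratio-of-uniforms: accept (u, v) uniform in the box when u^2 <= f(v / u); return v / u."""
            while True:
                u = rng.uniform(0, u_max)
                v = rng.uniform(v_min, v_max)
                if u > 0 and u * u <= f(v / u):
                    return v / u

        # example: standard normal via its unnormalized density exp(-x^2 / 2)
        f = lambda x: math.exp(-0.5 * x * x)
        u_max = 1.0                                  # sup sqrt(f), attained at x = 0
        v_bound = math.sqrt(2.0 / math.e)            # sup |x| * sqrt(f(x)), attained at x = +/- sqrt(2)
        draws = [rou_sample(f, u_max, -v_bound, v_bound) for _ in range(10_000)]
        print(sum(draws) / len(draws))               # close to 0, the mean of the standard normal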

  1. What dentition assures oral function?

    DEFF Research Database (Denmark)

    Gotfredsen, Klaus; Walls, Angus W G

    2007-01-01

    OBJECTIVE: To evaluate the relationship between dentition and oral function. MATERIAL AND METHODS: A search of the English literature was undertaken using PubMed and appropriate keywords. Citations were identified and hand sorted to confirm their validity against our inclusion criteria. Four spec...... for the year 2000, namely to maintain a natural dentition of not less than 20 teeth throughout life, is substantiated by the current literature review as this proposed dentition will assure an acceptable level of oral function....

  2. Detecting spatial structures in throughfall data: The effect of extent, sample size, sampling design, and variogram estimation method

    Science.gov (United States)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander

    2016-09-01

    In the last decades, an increasing number of studies analyzed spatial patterns in throughfall by means of variograms. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and a layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation method on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with large outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling) and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments (non-robust and robust estimators) and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the number recommended by studies dealing with Gaussian data by up to 100 %. Given that most previous
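
    The estimator at the centre of such studies, the method-of-moments (Matheron) semivariogram, is sketched below on simulated skewed data. The coordinates, lognormal values and lag bins are illustrative only, and the robust and residual-maximum-likelihood variants compared in the study are not shown.

        import numpy as np

        def empirical_variogram(coords, values, bin_edges):
            """Matheron estimator: gamma(h) = half the mean squared difference, per distance bin."""
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
            sq = (values[:, None] - values[None, :]) ** 2
            iu = np.triu_indices(len(values), k=1)             # use each pair of points once
            dist, sqdiff = d[iu], sq[iu]
            centres, gamma = [], []
            for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
                mask = (dist >= lo) & (dist < hi)
                if mask.any():
                    centres.append(0.5 * (lo + hi))
                    gamma.append(0.5 * sqdiff[mask].mean())
            return np.array(centres), np.array(gamma)

        rng = np.random.default_rng(0)
        coords = rng.uniform(0, 50, (150, 2))                   # 150 collectors on a 50 m x 50 m plot
        values = rng.lognormal(mean=1.0, sigma=0.6, size=150)   # skewed, non-Gaussian 'throughfall'
        centres, gamma = empirical_variogram(coords, values, np.arange(0, 30, 5))
        print(np.round(centres, 1), np.round(gamma, 2))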

  3. Ant colony optimization as a method for strategic genotype sampling.

    Science.gov (United States)

    Spangler, M L; Robbins, K R; Bertrand, J K; Macneil, M; Rekaya, R

    2009-06-01

    A simulation study was carried out to develop an alternative method of selecting animals to be genotyped. Simulated pedigrees included 5000 animals, each assigned genotypes for a bi-allelic single nucleotide polymorphism (SNP) based on assumed allelic frequencies of 0.7/0.3 and 0.5/0.5. In addition to simulated pedigrees, two beef cattle pedigrees, one from field data and the other from a research population, were used to test selected methods using simulated genotypes. The proposed method of ant colony optimization (ACO) was evaluated based on the number of alleles correctly assigned to ungenotyped animals (AK(P)), the probability of assigning true alleles (AK(G)) and the probability of correctly assigning genotypes (APTG). The proposed animal selection method of ant colony optimization was compared to selection using the diagonal elements of the inverse of the relationship matrix (A(-1)). Comparisons of these two methods showed that ACO yielded an increase in AK(P) ranging from 4.98% to 5.16% and an increase in APTG from 1.6% to 1.8% using simulated pedigrees. Gains in field data and research pedigrees were slightly lower. These results suggest that ACO can provide a better genotyping strategy, when compared to A(-1), with different pedigree sizes and structures. PMID:19220227

  4. Healthcare Software Assurance

    OpenAIRE

    Cooper, Jason G.; Pauley, Keith A.

    2006-01-01

    Software assurance is a rigorous, lifecycle phase-independent set of activities which ensure completeness, safety, and reliability of software processes and products. This is accomplished by guaranteeing conformance to all requirements, standards, procedures, and regulations. These assurance processes are even more important when coupled with healthcare software systems, embedded software in medical instrumentation, and other healthcare-oriented life-critical systems. The current Food and Dru...

  5. RAVEN Quality Assurance Activities

    Energy Technology Data Exchange (ETDEWEB)

    Cogliati, Joshua Joseph [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    This report discusses the quality assurance activities needed to raise the Quality Level of Risk Analysis in a Virtual Environment (RAVEN) from Quality Level 3 to Quality Level 2. This report also describes the general RAVEN quality assurance activities. For improving the quality, reviews of code changes have been instituted, more parts of testing have been automated, and improved packaging has been created. For upgrading the quality level, requirements have been created and the workflow has been improved.

  6. RAVEN Quality Assurance Activities

    International Nuclear Information System (INIS)

    This report discusses the quality assurance activities needed to raise the Quality Level of Risk Analysis in a Virtual Environment (RAVEN) from Quality Level 3 to Quality Level 2. This report also describes the general RAVEN quality assurance activities. For improving the quality, reviews of code changes have been instituted, more parts of testing have been automated, and improved packaging has been created. For upgrading the quality level, requirements have been created and the workflow has been improved.

  7. [DOE method for evaluating environmental and waste management samples: Revision 1, Addendum 1

    Energy Technology Data Exchange (ETDEWEB)

    Goheen, S.C.

    1995-04-01

    The US Department of Energy's (DOE's) environmental and waste management (EM) sampling and analysis activities require that large numbers of samples be analyzed for materials characterization, environmental surveillance, and site-remediation programs. The present document, DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods), is a supplemental resource for analyzing many of these samples.

  8. [DOE method for evaluating environmental and waste management samples: Revision 1, Addendum 1

    International Nuclear Information System (INIS)

    The US Department of Energy's (DOE's) environmental and waste management (EM) sampling and analysis activities require that large numbers of samples be analyzed for materials characterization, environmental surveillance, and site-remediation programs. The present document, DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods), is a supplemental resource for analyzing many of these samples.

  9. A Study of Tapered Beard Sampling Method as Used in HVI

    Science.gov (United States)

    Beard method is used for sampling cotton fibers to generate fibrograms from which length parameters can be obtained. It is the sampling method used by HVI. HVI uses a fiber comb to sample cotton fibers and form a fiber beard for measuring fiber length parameters. A fundamental issue about this sampl...

  10. Liquid Chromatographic Method for Determination of Nisoldipine from Pharmaceutical Samples

    Directory of Open Access Journals (Sweden)

    Amit Gupta

    2010-01-01

    A simple and specific high performance thin layer chromatographic method was developed and validated for the determination of nisoldipine from a tablet dosage form. The method was carried out at 320 nm after extraction of the drug in methanol. The method uses aluminum plates pre-coated with silica gel 60F-254 as the stationary phase and cyclohexane-ethyl acetate-toluene (3:3:4, v/v/v) as the mobile phase. Linearity was established over a range of 400-2400 ng per zone. Both peak area ratio and peak height ratio showed acceptable correlation coefficients, i.e. more than 0.99; however, peak area was used for validation purposes. Intra-day and inter-day precision were determined and found to be less than 6.0% RSD.

  11. Ant colony optimization as a method for strategic genotype sampling.

    Science.gov (United States)

    A simulation study was carried out to develop an alternative method of selecting animals to be genotyped. Simulated pedigrees included 5000 animals, each assigned genotypes for a bi-allelic single nucleotide polymorphism (SNP) based on assumed allelic frequencies of 0.7/ 0.3 and 0.5/0.5. In addition...

  12. Effect of sample preparation methods on photometric determination of the tellurium and cobalt content in the samples of copper concentrates

    Directory of Open Access Journals (Sweden)

    Viktoriya Butenko

    2016-03-01

    Methods for the determination of cobalt and nickel in copper concentrates currently used in factory laboratories are very labor intensive and time consuming. The limiting stage of the analysis is the preliminary chemical sample preparation. Carrying out the decomposition of industrial samples with concentrated mineral acids in open systems does not allow the metrological characteristics of the methods to be improved, so improving the sample preparation methods is highly relevant and of practical interest. The work was dedicated to determining the optimal conditions of preliminary chemical preparation of copper concentrate samples for the subsequent spectrophotometric determination of cobalt and tellurium in the obtained solution. Decomposition of the samples was carried out by dissolution in individual mineral acids and their mixtures with heating in an open system, as well as by using ultrasonication and microwave radiation in a closed system. In order to select the optimal conditions for the decomposition of the samples in a closed system, the phase contact time and the ultrasonic generator's power were varied. Intensification of the decomposition of copper concentrates with nitric acid (1:1), ultrasound and microwave radiation allowed cobalt and tellurium to be transferred quantitatively into solution in 20 and 30 min, respectively. This reduced the amount of reactants used and improved the accuracy of determination by running the process under strictly identical conditions.

  13. Random sampling of quantum states: a survey of methods and some issues regarding the Overparametrized Method

    Energy Technology Data Exchange (ETDEWEB)

    Maziero, Jonas, E-mail: jonas.maziero@ufsm.br [Universidade Federal de Santa Maria (UFSM), Santa Maria, RS (Brazil). Dept. de Fisica

    2015-12-15

    The numerical generation of random quantum states (RQS) is an important procedure for investigations in quantum information science. Here, we review some methods that may be used for performing that task. We start by presenting a simple procedure for generating random state vectors, for which the main tool is the random sampling of unbiased discrete probability distributions (DPD). Afterwards, the creation of random density matrices is addressed. In this context, we first present the standard method, which consists in using the spectral decomposition of a quantum state for getting RQS from random DPDs and random unitary matrices. In the sequence, the Bloch vector parametrization method is described. This approach, despite being useful in several instances, is not in general convenient for RQS generation. In the last part of the article, we regard the overparametrized method (OPM) and the related Ginibre and Bures techniques. The OPM can be used to create random positive semidefinite matrices with unit trace from randomly produced general complex matrices in a simple way that is friendly for numerical implementations. We consider a physically relevant issue related to the possible domains that may be used for the real and imaginary parts of the elements of such general complex matrices. Subsequently, a too fast concentration of measure in the quantum state space that appears in this parametrization is noticed. (author)

  14. Random Sampling of Quantum States: a Survey of Methods. And Some Issues Regarding the Overparametrized Method

    Science.gov (United States)

    Maziero, Jonas

    2015-12-01

    The numerical generation of random quantum states (RQS) is an important procedure for investigations in quantum information science. Here, we review some methods that may be used for performing that task. We start by presenting a simple procedure for generating random state vectors, for which the main tool is the random sampling of unbiased discrete probability distributions (DPD). Afterwards, the creation of random density matrices is addressed. In this context, we first present the standard method, which consists in using the spectral decomposition of a quantum state for getting RQS from random DPDs and random unitary matrices. In the sequence, the Bloch vector parametrization method is described. This approach, despite being useful in several instances, is not in general convenient for RQS generation. In the last part of the article, we regard the overparametrized method (OPM) and the related Ginibre and Bures techniques. The OPM can be used to create random positive semidefinite matrices with unit trace from randomly produced general complex matrices in a simple way that is friendly for numerical implementations. We consider a physically relevant issue related to the possible domains that may be used for the real and imaginary parts of the elements of such general complex matrices. Subsequently, a too fast concentration of measure in the quantum state space that appears in this parametrization is noticed.
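
    A minimal sketch of the Ginibre-type construction that the overparametrized method builds on is shown below: an unconstrained complex matrix G is mapped to a unit-trace positive semidefinite matrix via rho = G G† / Tr(G G†). The question of which sampling domains to use for the real and imaginary parts, discussed in the record, is not addressed; standard normal entries are used as a placeholder.

        import numpy as np

        def random_density_matrix(dim, rng=None):
            """Map an unconstrained complex matrix to a valid density matrix (unit trace, PSD)."""
            rng = rng or np.random.default_rng()
            G = rng.standard_normal((dim, dim)) + 1j * rng.standard_normal((dim, dim))
            rho = G @ G.conj().T
            return rho / np.trace(rho)

        rho = random_density_matrix(4, np.random.default_rng(0))
        eigenvalues = np.linalg.eigvalsh(rho)
        print(np.isclose(np.trace(rho), 1.0), bool((eigenvalues >= -1e-12).all()))

    The analogous shortcut for random state vectors is to normalize a vector of independent complex Gaussian entries, which draws uniformly from the set of pure states.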

  15. Good manufacturing practice - quality assurance programs

    International Nuclear Information System (INIS)

    The concept of good manufacturing practice (GMP) in the medical device industry requires the use of controlled methods and equipment in performing each step in the device manufacturing process. Quality assurance programs are used to maintain compliance with GMP requirements by prescribing the operating and control procedures to be used. The specific elements of a quality assurance program for the radiation sterilization of medical devices are described. (author)

  16. Determination of methylmercury in marine biota samples: method validation.

    Science.gov (United States)

    Carrasco, Luis; Vassileva, Emilia

    2014-05-01

    Regulatory authorities are expected to measure the concentration of contaminants in foodstuffs, but determination of the total amount alone is not sufficient to fully judge the impact on human health. In particular, the methylation of metals generally increases their toxicity; therefore, validated analytical methods producing reliable results for the assessment of methylated species are highly needed. Nowadays, there is no legal limit for methylmercury (MeHg) in food matrices; hence, no standardized method for the determination of MeHg exists within the international jurisdiction. Contemplating the possibility of a future legislative limit, a method for low-level determination of MeHg in marine biota matrices, based on aqueous-phase ethylation followed by purge and trap and gas chromatography (GC) coupled to pyrolysis-atomic fluorescence spectrometry (Py-AFS) detection, has been developed and validated. Five different extraction procedures, namely acid and alkaline leaching assisted by microwave and conventional oven heating, as well as enzymatic digestion, were evaluated in terms of their efficiency to extract MeHg from the Scallop soft tissue IAEA-452 Certified Reference Material. Alkaline extraction with 25% (w/w) KOH in methanol, microwave-assisted extraction (MAE) with 5 M HCl and enzymatic digestion with protease XIV yielded the highest extraction recoveries. Standard addition or the introduction of a dilution step was successfully applied to overcome the matrix effects observed when microwave-assisted extraction using 25% (w/w) KOH in methanol or 25% (w/v) aqueous TMAH was used. ISO 17025 and Eurachem guidelines were followed to perform the validation of the methodology. Accordingly, blanks, selectivity, calibration curve, linearity (0.9995), working range (1-800 pg), recovery (97%), precision, traceability, limit of detection (0.45 pg), limit of quantification (0.85 pg) and expanded uncertainty (15.86%, k=2) were assessed with the Fish protein DORM-3 Certified Reference Material.

  17. Determination of optimal sampling times for a two blood sample clearance method using (51)Cr-EDTA in cats.

    Science.gov (United States)

    Vandermeulen, Eva; De Sadeleer, Carlos; Piepsz, Amy; Ham, Hamphrey R; Dobbeleir, André A; Vermeire, Simon T; Van Hoek, Ingrid M; Daminet, Sylvie; Slegers, Guido; Peremans, Kathelijne Y

    2010-08-01

    Estimation of the glomerular filtration rate (GFR) is a useful tool in the evaluation of kidney function in feline medicine. GFR can be determined by measuring the rate of tracer disappearance from the blood, and although these measurements are generally performed by multi-sampling techniques, simplified methods are more convenient in clinical practice. The optimal times for a simplified sampling strategy with two blood samples (2BS) for GFR measurement in cats using plasma 51Cr-EDTA (51-chromium ethylene diamine tetra-acetic acid) clearance were investigated. After intravenous administration of 51Cr-EDTA, seven blood samples were obtained in 46 cats (19 euthyroid and 27 hyperthyroid cats, none with previously diagnosed chronic kidney disease (CKD)). The plasma clearance was then calculated from the seven-point blood kinetics (7BS) and used for comparison to define the optimal sampling strategy by correlating different pairs of time points to the reference method. Mean GFR estimation for the reference method was 3.7 ± 2.5 ml/min/kg (mean ± standard deviation (SD)). Several pairs of sampling times were highly correlated with this reference method (r² ≥ 0.980), with the best results when the first sample was taken 30 min after tracer injection and the second sample between 198 and 222 min after injection; or with the first sample at 36 min and the second at 234 or 240 min (r² for both combinations = 0.984). Because of the similarity of GFR values obtained with the 2BS method in comparison to the values obtained with the 7BS reference method, the simplified method may offer an alternative for GFR estimation. Although a wide range of GFR values was found in the included group of cats, the applicability should be confirmed in cats suspected of renal disease and with confirmed CKD. Furthermore, although no indications of an age-related effect were found in this study, a possible influence of age should be included in future studies. PMID:20452793
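
    For readers unfamiliar with simplified clearance calculations, the sketch below shows a generic two-sample slope-intercept estimate assuming a mono-exponential decline of plasma tracer concentration between the two sampling times; it is not the exact calculation or correction formula used in the study, and all numerical values are hypothetical.

```python
import math

def two_sample_clearance(dose, c1, t1, c2, t2, body_weight_kg):
    """Slope-intercept plasma clearance from two blood samples.

    Assumes a mono-exponential decline of tracer concentration between the
    two sampling times (a simplification; multi-sample methods and early-
    distribution corrections are not reproduced here).
    dose: injected activity; c1, c2: plasma concentrations (activity per mL)
    t1, t2: sampling times in minutes, with t2 > t1.
    Returns clearance in mL/min/kg.
    """
    k = math.log(c1 / c2) / (t2 - t1)      # elimination rate constant (1/min)
    c0 = c1 * math.exp(k * t1)             # extrapolated concentration at t=0
    v_dist = dose / c0                     # apparent volume of distribution (mL)
    clearance = k * v_dist                 # mL/min
    return clearance / body_weight_kg

# Hypothetical values for illustration only
print(round(two_sample_clearance(dose=5e6, c1=900.0, t1=30, c2=350.0, t2=210,
                                 body_weight_kg=4.2), 2), "ml/min/kg")
```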

  18. A new resampling method for sampling designs without replacement: the doubled half bootstrap

    OpenAIRE

    Antal, Erika; Tillé, Yves

    2016-01-01

    A new and very fast method of bootstrap for sampling without replacement from a finite population is proposed. This method can be used to estimate the variance in sampling with unequal inclusion probabilities and does not require artificial populations or utilization of bootstrap weights. The bootstrap samples are directly selected from the original sample. The bootstrap procedure contains two steps: in the first step, units are selected once with Poisson sampling using the same inclusion pro...
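
    The doubled half bootstrap itself is not reproduced here; as a point of reference, the sketch below shows a naive with-replacement bootstrap of a Horvitz-Thompson total for an unequal-probability sample, which is precisely the kind of design-ignorant variance estimate that specialised bootstrap methods for sampling without replacement aim to improve on. The data and inclusion probabilities are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical unequal-probability sample: values y_i and inclusion probs pi_i
y = rng.gamma(shape=2.0, scale=50.0, size=200)
pi = rng.uniform(0.05, 0.4, size=200)

def ht_total(y, pi):
    """Horvitz-Thompson estimator of the population total."""
    return np.sum(y / pi)

# Naive with-replacement bootstrap of the HT total; it ignores the
# without-replacement design, which is the bias that design-aware bootstrap
# methods such as the one discussed above are built to correct.
B = 2000
idx = rng.integers(0, len(y), size=(B, len(y)))
boot_totals = np.array([ht_total(y[i], pi[i]) for i in idx])

print("HT total estimate:       ", round(ht_total(y, pi), 1))
print("naive bootstrap variance:", round(boot_totals.var(ddof=1), 1))
```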

  19. Flight Dynamics Mission Support and Quality Assurance Process

    Science.gov (United States)

    Oh, InHwan

    1996-01-01

    This paper summarizes the Computer Sciences Corporation Flight Dynamics Operation (FDO) quality assurance approach to supporting the National Aeronautics and Space Administration Goddard Space Flight Center Flight Dynamics Support Branch. Historically, a strong need has existed for developing systematic quality assurance using methods that account for the unique nature and environment of satellite Flight Dynamics mission support. Over the past few years FDO has developed and implemented proactive quality assurance processes applied to each of the six phases of the Flight Dynamics mission support life cycle: systems and operations concept, system requirements and specifications, software development support, operations planning and training, launch support, and on-orbit mission operations. Rather than performing quality assurance as a final step after work is completed, quality assurance has been built in as work progresses in the form of process assurance. Process assurance activities occur throughout the Flight Dynamics mission support life cycle. The FDO Product Assurance Office developed process checklists for prephase process reviews, mission team orientations, in-progress reviews, and end-of-phase audits. This paper will outline the evolving history of FDO quality assurance approaches, discuss the tailoring of Computer Sciences Corporation's process assurance cycle procedures, describe some of the quality assurance approaches that have been or are being developed, and present some of the successful results.

  20. Vega flow assurance system

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Marit; Munaweera, Sampath

    2010-07-01

    Vega is a gas condensate field located off the west coast of Norway and developed as a tie-in to the Gjoea platform. The operator is Statoil, and production startup is estimated at the end of 2010. Flow assurance challenges are high reservoir pressure and temperature, hydrate and wax control, liquid accumulation, and monitoring of the well/template production rates. The Vega Flow Assurance System (FAS) is a software system that supports monitoring and operation of the field. The FAS is based on FlowManager™, which is designed for real-time systems. This is a flexible tool with its own steady-state multiphase and flow assurance models. Due to the long flowlines and the dynamic behavior, the multiphase flow simulator OLGA is also integrated in the system. Vega FAS will be used as: - An online monitoring tool - An offline what-if simulation and validation tool - An advisory control system for well production allocation. (Author)

  1. A Typology of Mixed Methods Sampling Designs in Social Science Research

    Science.gov (United States)

    Onwuegbuzie, Anthony J.; Collins, Kathleen M. T.

    2007-01-01

    This paper provides a framework for developing sampling designs in mixed methods research. First, we present sampling schemes that have been associated with quantitative and qualitative research. Second, we discuss sample size considerations and provide sample size recommendations for each of the major research designs for quantitative and…

  2. A fast method to prepare water samples for 15N analysis

    Institute of Scientific and Technical Information of China (English)

    肖化云; 刘丛强

    2001-01-01

    An automatic element analyser is often used to prepare organic matter for 15N analysis, but it is seldom used to prepare water samples. Water samples are conventionally dealt with by the Kjeldahl-Rittenberg technique, which requires tedious and labor-intensive sample preparation. A fast and reliable method is proposed in this paper to prepare water samples for 15N analysis.

  3. Comparison of chlorzoxazone one-sample methods to estimate CYP2E1 activity in humans

    DEFF Research Database (Denmark)

    Kramer, Iza; Dalhoff, Kim; Clemmesen, Jens O;

    2003-01-01

    OBJECTIVE: Comparison of a one-sample with a multi-sample method (the metabolic fractional clearance) to estimate CYP2E1 activity in humans. METHODS: Healthy, male Caucasians ( n=19) were included. The multi-sample fractional clearance (Cl(fe)) of chlorzoxazone was compared with one-time-point cl...

  4. 222Rn in water: A comparison of two sample collection methods and two sample transport methods, and the determination of temporal variation in North Carolina ground water

    International Nuclear Information System (INIS)

    Objectives of this field experiment were: (1) determine whether there was a statistically significant difference between the radon concentrations of samples collected by EPA's standard method, using a syringe, and an alternative, slow-flow method; (2) determine whether there was a statistically significant difference between the measured radon concentrations of samples mailed vs samples not mailed; and (3) determine whether there was a temporal variation of water radon concentration over a 7-month period. The field experiment was conducted at 9 sites, 5 private wells, and 4 public wells, at various locations in North Carolina. Results showed that a syringe is not necessary for sample collection, there was generally no significant radon loss due to mailing samples, and there was statistically significant evidence of temporal variations in water radon concentrations
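
    A hedged sketch of the kind of paired significance test implied by objectives (1) and (2) is given below; the data are synthetic and the choice of a paired t-test is an assumption, since the study's actual statistical procedure is not described in the abstract.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical paired radon measurements (Bq/L) at the same wells:
# one value per well for each collection (or transport) condition.
syringe = rng.normal(300, 60, size=9)
slow_flow = syringe + rng.normal(0, 10, size=9)   # small random differences

t_stat, p_value = stats.ttest_rel(syringe, slow_flow)
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")
# A large p-value would be consistent with the study's conclusion that the
# slow-flow method performs as well as the syringe method.
```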

  5. Composite sampling a novel method to accomplish observational economy in environmental studies

    CERN Document Server

    Patil, Ganapati P; Taillie, Charles

    2010-01-01

    This monograph provides a comprehensive statistical account of composite sampling as an ingenious environmental sampling method to help accomplish observational economy in a variety of environmental and ecological studies.

  6. Impact of the sampling method and chilling on the Salmonella recovery from pig carcasses.

    Science.gov (United States)

    Vanantwerpen, Gerty; De Zutter, Lieven; Berkvens, Dirk; Houf, Kurt

    2016-09-01

    Differences in the recovery of Salmonella from pig carcasses using non-destructive and destructive sampling methods are not well understood with respect to the chilling processes applied in slaughterhouses. Therefore, in two slaughterhouses, four strains at two different concentrations were inoculated onto pork skin. Inoculated skin samples were sampled before and after chilling with two sampling methods: swabbing and destruction. Both slaughterhouses were visited three times and all tests were performed in triplicate. All samples were analysed using the ISO method and recovered isolates were confirmed by PFGE. Neither the chilling system (fast or conventional cooling) nor the sampling step (before or after chilling) significantly influenced the recovery of Salmonella. However, swabbing after chilling leads to an underestimation of the real number of contaminated carcasses. Therefore, destructive sampling is the more appropriate sampling method after chilling. PMID:27236225

  7. Practical Method for Extraction of PCR-Quality DNA from Environmental Soil Samples

    OpenAIRE

    Kelly A Fitzpatrick; Kersh, Gilbert J.; Massung, Robert F.

    2010-01-01

    Methods for the extraction of PCR-quality DNA from environmental soil samples by using pairs of commercially available kits were evaluated. Coxiella burnetii DNA was detected in spiked soil samples at

  8. Quality Assurance for All

    Science.gov (United States)

    Cheung, Peter P. T.; Tsui, Cecilia B. S.

    2010-01-01

    For higher education reform, most decision-makers aspire to achieving a higher participation rate and a respectable degree of excellence with diversity at the same time. But very few know exactly how. External quality assurance is a fair basis for differentiation but there can be doubt and resistance in some quarters. Stakeholder interests differ…

  9. Mission Operations Assurance

    Science.gov (United States)

    Faris, Grant

    2012-01-01

    Integrate the mission operations assurance function into the flight team providing: (1) value added support in identifying, mitigating, and communicating the project's risks and, (2) being an essential member of the team during the test activities, training exercises and critical flight operations.

  10. Phase 2 sampling and analysis plan, Quality Assurance Project Plan, and environmental health and safety plan for the Clinch River Remedial Investigation: An addendum to the Clinch River RCRA Facility Investigation plan

    International Nuclear Information System (INIS)

    This document contains a three-part addendum to the Clinch River Resource Conservation and Recovery Act (RCRA) Facility Investigation Plan. The Clinch River RCRA Facility Investigation began in 1989, as part of the comprehensive remediation of facilities on the US Department of Energy Oak Ridge Reservation (ORR). The ORR was added to the National Priorities List in December 1989. The regulatory agencies have encouraged the adoption of Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) terminology; therefore, the Clinch River activity is now referred to as the Clinch River Remedial Investigation (CRRI), not the Clinch River RCRA Facility Investigation. Part 1 of this document is the plan for sampling and analysis (S&A) during Phase 2 of the CRRI. Part 2 is a revision of the Quality Assurance Project Plan for the CRRI, and Part 3 is a revision of the Environmental Health and Safety Plan for the CRRI. The Clinch River RI (CRRI) is designed to address the transport, fate, and distribution of waterborne contaminants (radionuclides, metals, and organic compounds) released from the DOE Oak Ridge Reservation (ORR) and to assess potential risks to human health and the environment associated with these contaminants. Primary areas of investigation are Melton Hill Reservoir, the Clinch River from Melton Hill Dam to its confluence with the Tennessee River, Poplar Creek, and Watts Bar Reservoir. The contaminants identified in the Clinch River/Watts Bar Reservoir (CR/WBR) downstream of the ORR are those associated with the water, suspended particles, deposited sediments, aquatic organisms, and wildlife feeding on aquatic organisms. The purpose of the Phase 2 S&A Plan is to describe the proposed tasks and subtasks developed to meet the primary objectives of the CRRI.

  11. Field methods and quality-assurance plan for water-quality activities and water-level measurements, U.S. Geological Survey, Idaho National Laboratory, Idaho

    Science.gov (United States)

    Bartholomay, Roy C.; Maimer, Neil V.; Wehnke, Amy J.

    2014-01-01

    Water-quality activities and water-level measurements by the personnel of the U.S. Geological Survey (USGS) Idaho National Laboratory (INL) Project Office coincide with the USGS mission of appraising the quantity and quality of the Nation’s water resources. The activities are carried out in cooperation with the U.S. Department of Energy (DOE) Idaho Operations Office. Results of the water-quality and hydraulic head investigations are presented in various USGS publications or in refereed scientific journals and the data are stored in the National Water Information System (NWIS) database. The results of the studies are used by researchers, regulatory and managerial agencies, and interested civic groups. In the broadest sense, quality assurance refers to doing the job right the first time. It includes the functions of planning for products, review and acceptance of the products, and an audit designed to evaluate the system that produces the products. Quality control and quality assurance differ in that quality control ensures that things are done correctly given the “state-of-the-art” technology, and quality assurance ensures that quality control is maintained within specified limits.

  12. A Direct Bootstrap Method for Complex Sampling Designs From a Finite Population

    OpenAIRE

    Antal, Erika; Tillé, Yves

    2016-01-01

    In complex designs, classical bootstrap methods result in a biased variance estimator when the sampling design is not taken into account. Resampled units are usually rescaled or weighted in order to achieve unbiasedness in the linear case. In the present article, we propose novel resampling methods that may be directly applied to variance estimation. These methods consist of selecting subsamples under a completely different sampling scheme from that which generated the original sample, whic...

  13. Identification of Legionella spp. in Environmental Water Samples by ScanVIT-Legionella™ Method in Spain

    OpenAIRE

    Gruas, Cristina; Álvarez, Isidro; Lara, Carlos; García, Cristina Belén; Savva, Demetris; Arruga, M. Victoria

    2013-01-01

    Rapid and more sensitive methods for the detection and quantification of viable Legionella cells have been developed. In this paper, a comparative analysis of environmental water samples using the ScanVIT-Legionella™ method and the traditional “gold standard” culture method is carried out, indicating the usefulness of the ScanVIT method. The ScanVIT-Legionella™ method was performed on environmental water samples from different locations in the Huesca region (Spain). Legionella micro-colonies sho...

  14. Evaluation of seven aquatic sampling methods for amphibians and other aquatic fauna

    Science.gov (United States)

    Gunzburger, M.S.

    2007-01-01

    To design effective and efficient research and monitoring programs researchers must have a thorough understanding of the capabilities and limitations of their sampling methods. Few direct comparative studies exist for aquatic sampling methods for amphibians. The objective of this study was to simultaneously employ seven aquatic sampling methods in 10 wetlands to compare amphibian species richness and number of individuals detected with each method. Four sampling methods allowed counts of individuals (metal dipnet, D-frame dipnet, box trap, crayfish trap), whereas the other three methods allowed detection of species (visual encounter, aural, and froglogger). Amphibian species richness was greatest with froglogger, box trap, and aural samples. For anuran species, the sampling methods by which each life stage was detected were related to the relative length of larval and breeding periods and tadpole size. Detection probability of amphibians varied across sampling methods. Box trap sampling resulted in the most precise amphibian count, but the precision of all four count-based methods was low (coefficient of variation > 145 for all methods). The efficacy of the four count sampling methods at sampling fish and aquatic invertebrates was also analyzed because these predatory taxa are known to be important predictors of amphibian habitat distribution. Species richness and counts were similar for fish with the four methods, whereas invertebrate species richness and counts were greatest in box traps. An effective wetland amphibian monitoring program in the southeastern United States should include multiple sampling methods to obtain the most accurate assessment of species community composition at each site. The combined use of frogloggers, crayfish traps, and dipnets may be the most efficient and effective amphibian monitoring protocol. © 2007 Brill Academic Publishers.

  15. Quality assurance in digital radiography

    International Nuclear Information System (INIS)

    At present, there is no standard way of evaluating performance characteristics of digital radiography systems. Continuous measurements of performance parameters are necessary in order to obtain images of high quality. Parameters of quality assurance in digital radiography, which can be evaluated with simple, quick methods, are spatial resolution, low-contrast detectability, dynamic range and exposure dose. Spatial resolution was determined by a lead bar pattern, whereas the other parameters were measured by commercially available phantoms. Performance measurements of 10 digital subtraction angiography (DSA) units and one digital radiography system for unsubtracted digital radiography were assessed. From these results, recommendations for performance parameter levels will be discussed. (author)

  16. Methods of sampling airborne fungi in working environments of waste treatment facilities

    Directory of Open Access Journals (Sweden)

    Kristýna Černá

    2016-03-01

    Objectives: The objective of the present study was to evaluate and compare the efficiency of a filter based sampling method and a high volume sampling method for sampling airborne culturable fungi present in waste sorting facilities. Material and Methods: The membrane filters method was compared with the surface air system method. The selected sampling methods were modified and tested in 2 plastic waste sorting facilities. Results: The total number of colony-forming units (CFU/m³) of airborne fungi was dependent on the type of sampling device, on the time of sampling, which was carried out every hour from the beginning of the work shift, and on the type of cultivation medium (p < 0.001). Detected concentrations of airborne fungi ranged from 2×10² to 1.7×10⁶ CFU/m³ when using the membrane filters (MF) method, and from 3×10² to 6.4×10⁴ CFU/m³ when using the surface air system (SAS) method. Conclusions: Both methods showed comparable sensitivity to the fluctuations of the concentrations of airborne fungi during the work shifts. The SAS method is adequate for a fast indicative determination of the concentration of airborne fungi. The MF method is suitable for thorough assessment of working environment contamination by airborne fungi. Therefore we recommend the MF method for the implementation of a uniform standard methodology of airborne fungi sampling in working environments of waste treatment facilities.
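
    As a small illustration of how plate counts are converted to the CFU/m³ concentrations reported above, the sketch below implements the basic count-to-concentration calculation; the positive-hole correction normally applied to SAS-type impactor counts is omitted, and the example figures are hypothetical.

```python
def cfu_per_m3(colonies, air_volume_litres, fraction_plated=1.0):
    """Convert a colony count to a concentration in CFU/m^3.

    colonies: colony-forming units counted on the plate(s)
    air_volume_litres: volume of air drawn through the filter or sampler head
    fraction_plated: fraction of the filter extract actually plated
    (For SAS-type impactors a positive-hole correction of the raw count
    would normally be applied first; that step is omitted here.)
    """
    sampled_m3 = air_volume_litres / 1000.0
    return colonies / fraction_plated / sampled_m3

# e.g. 180 colonies from 100 L of air with half of the filter extract plated
print(round(cfu_per_m3(180, 100, fraction_plated=0.5)), "CFU/m^3")
```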

  17. Quality assurance in clinical trials.

    NARCIS (Netherlands)

    Ottevanger, P.B.; Therasse, P.; Veld, C.J.H. van de; Bernier, J.; Krieken, J.H.J.M. van; Grol, R.P.T.M.; Mulder, P.H.M. de

    2003-01-01

    From the literature that was initially searched by electronic databases using the keywords quality, quality control and quality assurance in combination with clinical trials, surgery, pathology, radiotherapy, chemotherapy and data management, a comprehensive review is given on what quality assurance

  18. An improved adaptive sampling and experiment design method for aerodynamic optimization

    Institute of Scientific and Technical Information of China (English)

    Huang Jiangtao; Gao Zhenghong; Zhou Zhu; Zhao Ke

    2015-01-01

    The experiment design method is key to constructing a highly reliable surrogate model for numerical optimization in large-scale projects. Within the method, the experimental design criterion directly affects the accuracy of the surrogate model and the optimization efficiency. Addressing the shortcomings of traditional experimental design, an improved adaptive sampling method is proposed in this paper. The surrogate model is first constructed from basic sparse samples. Then the supplementary sampling position is detected according to specified criteria, which introduce energy-function and curvature sampling criteria based on a radial basis function (RBF) network. The sampling detection criteria consider both the uniformity of the sample distribution and the description of hypersurface curvature, so as to significantly improve the prediction accuracy of the surrogate model with far fewer samples. For a surrogate model constructed with sparse samples, sample uniformity is an important factor for interpolation accuracy in the initial stage of adaptive sampling and surrogate model training. As uniformity improves, the curvature description of the objective function surface gradually becomes more important. In consideration of these issues, a crowdness enhance function and a root mean square error (RMSE) feedback function are introduced in the C criterion expression. Thus, a new sampling method called RMSE and crowdness enhance (RCE) adaptive sampling is established. The validity of the RCE adaptive sampling method is studied first on typical test functions and then on an airfoil/wing aerodynamic optimization design problem with a high-dimensional design space. The results show that the RCE adaptive sampling method not only reduces the required number of samples, but also effectively improves the prediction accuracy of the surrogate model, and thus has broad prospects for applications.
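
    The exact energy-function, curvature, and RMSE-feedback criteria of the RCE method are not reproduced here; the sketch below only illustrates the general adaptive-sampling loop described above: fit an RBF surrogate on sparse samples, score candidate points by a combination of a curvature term and a crowding (distance-to-existing-samples) term, and add the highest-scoring point. The 1-D test function, Gaussian kernel width, and weighting of the two terms are assumptions.

```python
import numpy as np

def f(x):                                   # hypothetical 1-D objective
    return np.sin(3.0 * x) + 0.3 * x**2

def rbf_fit(X, y, eps=2.0):
    """Solve for Gaussian RBF weights so the surrogate interpolates (X, y)."""
    Phi = np.exp(-(eps * (X[:, None] - X[None, :])) ** 2)
    return np.linalg.solve(Phi + 1e-10 * np.eye(len(X)), y)

def rbf_eval(x_new, X, w, eps=2.0):
    Phi = np.exp(-(eps * (x_new[:, None] - X[None, :])) ** 2)
    return Phi @ w

X = np.array([-2.0, -0.7, 0.8, 2.0])        # sparse initial design
y = f(X)
cand = np.linspace(-2.0, 2.0, 401)          # candidate pool

for _ in range(6):
    w = rbf_fit(X, y)
    pred = rbf_eval(cand, X, w)
    curvature = np.abs(np.gradient(np.gradient(pred, cand), cand))
    crowding = np.min(np.abs(cand[:, None] - X[None, :]), axis=1)
    score = (curvature / (curvature.max() + 1e-12)) * crowding  # balance both
    x_new = cand[np.argmax(score)]          # most informative new sample
    X = np.append(X, x_new)
    y = np.append(y, f(x_new))

print(np.round(np.sort(X), 3))  # the refined design concentrates new points
                                # where the surrogate is curved and sparse
```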

  19. Method for sample preparation for cryoelectron microscopy (CEM) microreactor and loading platform

    NARCIS (Netherlands)

    Zandbergen, H.W.; Ahn, C.W.

    2008-01-01

    A method for sample preparation for cryoelectron microscopy (CEM), wherein the sample is held in a microreactor, wherein the conditions in the microreactor are regulated relative to the environment, wherein the sample in the microreactor is frozen according to a quench freeze process, whereupon the

  20. Fast egg collection method greatly improves randomness of egg sampling in Drosophila melanogaster

    DEFF Research Database (Denmark)

    Schou, Mads Fristrup

    2013-01-01

    When obtaining samples for population genetic studies, it is essential that the sampling is random. For Drosophila, one of the crucial steps in sampling experimental flies is the collection of eggs. Here an egg collection method is presented, which randomizes the eggs in a water column and dimini...

  1. Improvements in pentosan polysulfate sodium quality assurance using fingerprint electropherograms.

    Science.gov (United States)

    Schirm, B; Benend, H; Wätzig, H

    2001-04-01

    Complex samples from polymer production, plant extracts or biotechnology mixtures can be characterized by fingerprints. Currently, the standard approach for sample characterization employs near-infrared (NIR) spectroscopy fingerprinting. Up to now, however, fingerprints obtained by chromatography or electrophoresis could only be visually evaluated. This type of inspection is very labor-intensive and difficult to validate. In order to transfer the use of fingerprints from spectroscopy to electrophoresis, spectra-like properties must be obtained through a complete alignment of the electropherograms. This has been achieved by interpolation and wavelet filtering of the baseline signal in the present work. The resulting data have been classified by several algorithms. The methods under survey include self-organizing maps (SOMs), artificial neural networks (ANNs), soft independent modeling of class analogy (SIMCA) and k-nearest neighbors (KNNs). In order to test the performance of this combined approach in practice, it was applied to the quality assurance of pentosan polysulfate (PPS). A recently developed capillary electrophoresis (CE) method using indirect UV detection was employed in these studies [1]. All algorithms were well capable of classifying the examined PPS test batches. Even minor variations in the PPS composition, not perceptible by visual inspection, could be automatically detected. The whole method has been validated by classifying various (n = 400) unknown PPS quality assurance samples, which have been correctly identified without exception.
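
    The wavelet filtering and the specific classifiers benchmarked in the study (SOM, ANN, SIMCA, KNN) are not reproduced here; the sketch below only illustrates the general idea of treating aligned, baseline-corrected electropherograms as fixed-length feature vectors and classifying an unknown trace by a k-nearest-neighbour vote. The synthetic traces and the correlation-distance metric are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def synth_trace(kind, n=500):
    """Synthetic stand-in for an aligned, baseline-corrected electropherogram."""
    t = np.linspace(0, 1, n)
    peaks = {"pass": [0.30, 0.55, 0.80], "fail": [0.30, 0.45, 0.80]}[kind]
    sig = sum(np.exp(-((t - p) / 0.02) ** 2) for p in peaks)
    return sig + rng.normal(0, 0.02, n)

# Reference library of known-good and known-deviant batches
X_ref = np.array([synth_trace(k) for k in ["pass"] * 10 + ["fail"] * 10])
y_ref = np.array(["pass"] * 10 + ["fail"] * 10)

def knn_classify(trace, X_ref, y_ref, k=3):
    """k-nearest-neighbour vote using correlation distance between traces."""
    def corr_dist(a, b):
        return 1.0 - np.corrcoef(a, b)[0, 1]
    d = np.array([corr_dist(trace, x) for x in X_ref])
    nearest = y_ref[np.argsort(d)[:k]]
    labels, counts = np.unique(nearest, return_counts=True)
    return labels[np.argmax(counts)]

unknown = synth_trace("fail")
print(knn_classify(unknown, X_ref, y_ref))   # expected: 'fail'
```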

  2. Basic Study of Establishment of Quality Assurance Processes to Develop an Integrated Quality Assurance System for Nuclear Power Plant Construction

    International Nuclear Information System (INIS)

    An integrated quality assurance system requires that quality assurance programs be carried out in a systematic manner, because opportunities to expand business in overseas markets have increased since the export of a nuclear power plant to the UAE in 2009. In this study, we use the PDCA method to systematically analyze the quality assurance procedures used in previous nuclear power plant construction projects. We derived a classification of quality assurance processes at each phase of nuclear power plant construction by integrating similar quality-related work such as planning, design, equipment manufacturing, construction and start-up. We also established a hierarchy of quality assurance processes to support the development of an integrated quality assurance system as a later technology goal. To capture the most up-to-date quality assurance activities, each quality assurance process is structured by integrating similar work identified in the quality assurance procedures through the PDCA cycle method. The hierarchy of quality processes and the sequence of processes for constructing a nuclear power plant are established in this study. The integrated quality assurance system is to be developed by connecting organizations as well as stakeholders such as owners, architect-engineering firms, suppliers, contractors, and sub-contractors so that assigned work is carried out efficiently.

  3. Systems and methods for separating particles and/or substances from a sample fluid

    Energy Technology Data Exchange (ETDEWEB)

    Mariella, Jr., Raymond P.; Dougherty, George M.; Dzenitis, John M.; Miles, Robin R.; Clague, David S.

    2016-11-01

    Systems and methods for separating particles and/or toxins from a sample fluid. A method according to one embodiment comprises simultaneously passing a sample fluid and a buffer fluid through a chamber such that a fluidic interface is formed between the sample fluid and the buffer fluid as the fluids pass through the chamber, the sample fluid having particles of interest therein; applying a force to the fluids for urging the particles of interest to pass through the interface into the buffer fluid; and substantially separating the buffer fluid from the sample fluid.

  4. Multiplex biotoxin surface plasmon resonance method for marine biotoxins in algal and seawater samples

    OpenAIRE

    McNamee, E; Elliott, T.; Delahaut, P.; Campbell, K

    2013-01-01

    A multiplex surface plasmon resonance (SPR) biosensor method for the detection of paralytic shellfish poisoning (PSP) toxins, okadaic acid (and analogues) and domoic acid was developed. This method was compared to enzyme-linked immunosorbent assay (ELISA) methods. Seawater samples (n = 256) from around Europe were collected by the consortia of an EU project MIcroarrays for the Detection of Toxic Algae (MIDTAL) and evaluated using each method. A simple sample preparation procedure was develope...

  5. Sorbent-based sampling methods for volatile and semi-volatile organic compounds in air Part 1: Sorbent-based air monitoring options.

    Science.gov (United States)

    Woolfenden, Elizabeth

    2010-04-16

    Sorbent tubes/traps are widely used in combination with gas chromatographic (GC) analytical methods to monitor the vapour-phase fraction of organic compounds in air. Target compounds range in volatility from acetylene and freons to phthalates and PCBs and include apolar, polar and reactive species. Airborne vapour concentrations will vary depending on the nature of the location, nearby pollution sources, weather conditions, etc. Levels can range from low percent concentrations in stack and vent emissions to low part per trillion (ppt) levels in ultra-clean outdoor locations. Hundreds, even thousands of different compounds may be present in any given atmosphere. GC is commonly used in combination with mass spectrometry (MS) detection especially for environmental monitoring or for screening uncharacterised workplace atmospheres. Given the complexity and variability of organic vapours in air, no one sampling approach suits every monitoring scenario. A variety of different sampling strategies and sorbent media have been developed to address specific applications. Key sorbent-based examples include: active (pumped) sampling onto tubes packed with one or more sorbents held at ambient temperature; diffusive (passive) sampling onto sorbent tubes/cartridges; on-line sampling of air/gas streams into cooled sorbent traps; and transfer of air samples from containers (canisters, Tedlar bags, etc.) into cooled sorbent focusing traps. Whichever sampling approach is selected, subsequent analysis almost always involves either solvent extraction or thermal desorption (TD) prior to GC(/MS) analysis. The overall performance of the air monitoring method will depend heavily on appropriate selection of key sampling and analytical parameters. This comprehensive review of air monitoring using sorbent tubes/traps is divided into 2 parts. (1) Sorbent-based air sampling options. (2) Sorbent selection and other aspects of optimizing sorbent-based air monitoring methods. The paper presents

  6. Improved Butanol-Methanol (BUME) Method by Replacing Acetic Acid for Lipid Extraction of Biological Samples.

    Science.gov (United States)

    Cruz, Mutya; Wang, Miao; Frisch-Daiello, Jessica; Han, Xianlin

    2016-07-01

    Extraction of lipids from biological samples is a critical step in lipidomics, especially for shotgun lipidomics where lipid extracts are directly infused into a mass spectrometer. The butanol-methanol (BUME) extraction method was originally developed to extract lipids from plasma samples with 1 % acetic acid. Considering some lipids are sensitive to acidic environments, we modified this protocol by replacing acetic acid with lithium chloride solution and extended the modified extraction to tissue samples. Although no significant reduction of plasmalogen levels in the acidic BUME extracts of rat heart samples was found, the modified method was established to extract various tissue samples, including rat liver, heart, and plasma. Essentially identical profiles of the majority of lipid classes were obtained from the extracts of the modified BUME and traditional Bligh-Dyer methods. However, it was found that neither the original, nor the modified BUME method was suitable for 4-hydroxyalkenal species measurement in biological samples. PMID:27245345

  7. Total nitrogen determination of various sample types: a comparison of the Hach, Kjeltec, and Kjeldahl methods.

    Science.gov (United States)

    Watkins, K L; Veum, T L; Krause, G F

    1987-01-01

    Conventional Kjeldahl analysis with modifications, Kjeltec analysis with block digestion and semiautomated distillation, and the Hach method for determining nitrogen (N) were compared using a wide range of samples. Twenty different sample types were ground and mixed. Each sample type was divided into 5 subsamples which were analyzed for N by each of the 3 methods. In each sample type, differences (P less than 0.05) were detected among the 3 N determination methods in 5 of the 20 N sources analyzed. The mean N content over all 20 samples was higher with Kjeldahl analysis (P less than 0.05) than with Kjeltec, while Hach analysis produced intermediate results. Results also indicated that the Hach procedure had the greatest ability to detect differences in N content among sample types, being more sensitive than either other method (P less than 0.05).
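
    A hedged sketch of the kind of method comparison described above is shown below, using a one-way ANOVA across the three methods for a single sample type; the numbers are illustrative only, and the original study's exact statistical model (which covered all 20 sample types) is not reproduced.

```python
import numpy as np
from scipy import stats

# Hypothetical nitrogen results (% N) for one sample type, five subsamples
# analysed by each of the three methods; the values are illustrative only.
kjeldahl = np.array([3.12, 3.10, 3.15, 3.11, 3.13])
kjeltec = np.array([3.05, 3.07, 3.04, 3.08, 3.06])
hach = np.array([3.09, 3.08, 3.11, 3.07, 3.10])

f_stat, p_value = stats.f_oneway(kjeldahl, kjeltec, hach)
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")
# p < 0.05 would indicate a difference among methods for this sample type,
# after which pairwise comparisons could locate which methods differ.
```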

  8. Quality assurance for gamma knives

    Energy Technology Data Exchange (ETDEWEB)

    Jones, E.D.; Banks, W.W.; Fischer, L.E. [Lawrence Livermore National Lab., CA (United States)]

    1995-09-01

    This report describes and summarizes the results of a quality assurance (QA) study of the Gamma Knife, a nuclear medical device used for the gamma irradiation of intracranial lesions. Focus was on the physical aspects of QA and did not address issues that are essentially medical, such as patient selection or prescription of dose. A risk-based QA assessment approach was used. Sample programs for quality control and assurance are included. The use of the Gamma Knife was found to conform to existing standards and guidelines concerning radiation safety and quality control of external beam therapies (shielding, safety reviews, radiation surveys, interlock systems, exposure monitoring, good medical physics practices, etc.) and to be compliant with NRC teletherapy regulations. There are, however, current practices for the Gamma Knife not covered by existing, formalized regulations, standards, or guidelines. These practices have been adopted by Gamma Knife users and continue to be developed with further experience. Some of these have appeared in publications or presentations and are slowly finding their way into recommendations of professional organizations.

  9. Quality assurance for gamma knives

    International Nuclear Information System (INIS)

    This report describes and summarizes the results of a quality assurance (QA) study of the Gamma Knife, a nuclear medical device used for the gamma irradiation of intracranial lesions. Focus was on the physical aspects of QA and did not address issues that are essentially medical, such as patient selection or prescription of dose. A risk-based QA assessment approach was used. Sample programs for quality control and assurance are included. The use of the Gamma Knife was found to conform to existing standards and guidelines concerning radiation safety and quality control of external beam therapies (shielding, safety reviews, radiation surveys, interlock systems, exposure monitoring, good medical physics practices, etc.) and to be compliant with NRC teletherapy regulations. There are, however, current practices for the Gamma Knife not covered by existing, formalized regulations, standards, or guidelines. These practices have been adopted by Gamma Knife users and continue to be developed with further experience. Some of these have appeared in publications or presentations and are slowly finding their way into recommendations of professional organizations

  10. Cavitation Erosion Tests Performed by Indirect Vibratory Method on Stainless Steel Welded Samples with Hardened Surface

    Directory of Open Access Journals (Sweden)

    Marian-Dumitru Nedeloni

    2012-09-01

    The paper presents the results of cavitation erosion tests performed on two types of samples. The materials of the samples are frequently used for the manufacturing and repair of hydro turbine components subjected to cavitation. The first sample was made by welding an austenitic stainless steel onto an austenitic-ferritic base material. The second sample was made similarly to the first but with a martensitic base material. After the welding processes, a hardening treatment by surface peening was applied to both samples. The cavitation erosion tests were performed on vibratory equipment using the indirect method with a stationary specimen. The results show good cavitation erosion resistance for both samples.

  11. Assurance of quality in the diagnostic X-ray department

    International Nuclear Information System (INIS)

    A handbook has been prepared on the 'Assurance of Quality in the Diagnostic X-ray Department'. Part 1 discusses applied techniques, which cover organization and methods, reject analysis, radiation protection, status tests, tolerance limits, practical tests for X-ray equipment and electrical and mechanical safety. Part 2 discusses the systematic approach to quality assurance. Part 3 discusses quality assurance for computed tomography, magnetic resonance imaging systems and digital subtraction angiography. (U.K.)

  12. Contamination Rates of Three Urine-Sampling Methods to Assess Bacteriuria in Pregnant Women

    NARCIS (Netherlands)

    Schneeberger, Caroline; van den Heuvel, Edwin R.; Erwich, Jan Jaap H. M.; Stolk, Ronald P.; Visser, Caroline E.; Geerlings, Suzanne E.

    2013-01-01

    OBJECTIVE: To estimate and compare contamination rates of three different urine-sampling methods in pregnant women to assess bacteriuria. METHODS: In this cross-sectional study, 113 pregnant women collected three different midstream urine samples consecutively: morning (first void); midstream (void

  13. Purging Musical Instrument Sample Databases Using Automatic Musical Instrument Recognition Methods

    OpenAIRE

    Livshin, Arie; Rodet, Xavier

    2009-01-01

    Compilation of musical instrument sample databases requires careful elimination of badly recorded samples and validation of sample classification into correct categories. This paper introduces algorithms for automatic removal of bad instrument samples using Automatic Musical Instrument Recognition and Outlier Detection techniques. Best evaluation results on a methodically contaminated sound database are achieved using the i...

  14. Method for sample preparation for cryoelectron microscopy (CEM) microreactor and loading platform

    OpenAIRE

    H.W. Zandbergen; Ahn, C. W.

    2008-01-01

    A method for sample preparation for cryoelectron microscopy (CEM), wherein the sample is held in a microreactor, wherein the conditions in the microreactor are regulated relative to the environment, wherein the sample in the microreactor is frozen according to a quench freeze process, whereupon the sample, in frozen condition, is placed in the electron microscope. A microreactor for use with cryoelectron microscopy (CEM), comprising a first and second membrane, which membranes, at least in a ...

  15. Estimation of the sugar cane cultivated area from LANDSAT images using the two phase sampling method

    Science.gov (United States)

    Parada, N. D. J. (Principal Investigator); Cappelletti, C. A.; Mendonca, F. J.; Lee, D. C. L.; Shimabukuro, Y. E.

    1982-01-01

    A two phase sampling method and the optimal sampling segment dimensions for the estimation of sugar cane cultivated area were developed. This technique employs visual interpretations of LANDSAT images and panchromatic aerial photographs considered as the ground truth. The estimates, as a mean value of 100 simulated samples, represent 99.3% of the true value with a CV of approximately 1%; the relative efficiency of the two phase design was 157% when compared with a one phase aerial photographs sample.
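
    The study's exact estimator is not given in the abstract; the sketch below illustrates a standard double-sampling (two-phase) regression estimator of the kind commonly used for such area estimates, with a large phase-1 sample of image-interpreted values and a phase-2 subsample carrying the photo-interpreted "ground truth". All numbers are simulated.

```python
import numpy as np

rng = np.random.default_rng(3)

# Phase 1: a large sample of segments interpreted on LANDSAT imagery only
# (x = interpreted sugar-cane fraction per segment); the values are synthetic.
x1 = np.clip(rng.normal(0.35, 0.12, size=400), 0, 1)

# Phase 2: a subsample of those segments also interpreted on aerial
# photographs, taken here as the "ground truth" y, with simulated error.
idx = rng.choice(len(x1), size=60, replace=False)
x2 = x1[idx]
y2 = np.clip(x2 + rng.normal(0.02, 0.04, size=len(idx)), 0, 1)

# Double-sampling regression estimator of the mean cultivated fraction
b = np.polyfit(x2, y2, 1)[0]                # slope of y on x in the subsample
y_reg = y2.mean() + b * (x1.mean() - x2.mean())
print(f"regression estimate of mean fraction: {y_reg:.3f}")
print(f"subsample mean alone:                 {y2.mean():.3f}")
```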

  16. [Study on the optimization methods of common-batch identification of amphetamine samples].

    Science.gov (United States)

    Zhang, Jianxin; Zhang, Daming

    2008-07-01

    This paper introduces the technology of amphetamine identification and its optimization. Impurity profiling of amphetamine was performed by GC-MS. Identification of common-batch amphetamine samples could be successfully accomplished by transformation and pre-treatment of the peak-area data. The analytical method was improved by optimizing the techniques of sample extraction, gas chromatography, sample separation and detection. PMID:18839544
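
    The paper's specific peak-area transformation and pre-treatment steps are not detailed in the abstract; the sketch below shows one common way such common-batch comparisons are made, by normalising each impurity peak-area vector and scoring pairs of samples with cosine similarity. The impurity values are hypothetical.

```python
import numpy as np

def normalise_profile(peak_areas):
    """Scale a GC-MS impurity peak-area vector to unit sum before comparison."""
    a = np.asarray(peak_areas, dtype=float)
    return a / a.sum()

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical impurity peak areas (same target impurities, same order)
seizure_1 = normalise_profile([120, 40, 310, 15, 80])
seizure_2 = normalise_profile([118, 45, 300, 14, 85])   # plausibly same batch
seizure_3 = normalise_profile([10, 200, 50, 90, 5])     # different profile

print(round(cosine_similarity(seizure_1, seizure_2), 3))  # close to 1
print(round(cosine_similarity(seizure_1, seizure_3), 3))  # clearly lower
```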

  17. Revenue assurance in utilities

    OpenAIRE

    Rihar, Miha

    2010-01-01

    In recent times, utility companies have had to orient themselves toward efficient business operations due to hard market conditions. Thus, companies want to reduce business expenses and increase revenues. Effective revenue capture is, after all, the aim of revenue assurance. In practice, revenue capture is usually neither perfect nor free of losses. A part of the revenue is always lost on the way from service delivery to payment; this is called revenue leakage and causes a financial loss. The revenue leakage is above all the ...

  18. Introduction to quality assurance

    International Nuclear Information System (INIS)

    Safety requirements set forth in regulatory requirements, codes and standards, as well as other requirements for various aspects of nuclear power plant design and operation, are strictly implemented through QA activities. The overall QA aim is to assure that the plant is soundly and correctly designed and that it is built, tested and operated in accordance with stringent quality standards and conservative engineering practices. In this way a high degree of freedom from faults and errors can be achieved. (orig.)

  19. Quality assurance in radiotherapy

    International Nuclear Information System (INIS)

    Good radiotherapy results and safety of treatment require the radiation to be optimally applied to a specified target area and the correct dose. According to international recommendations, the average uncertainty in therapeutic dose should not exceed 5%. The need for high precision in therapeutic dose requires quality assurance covering the entire radiotherapy process. Besides the physical and technical characteristics of the therapy equipment, quality assurance must include all radiotherapy equipment and procedures that are significant for the correct magnitude and precision of application of the therapeutic dose. The duties and responsibilities pertaining to various stages of treatment must also be precisely defined. These requirements may be best implemented through a quality system. The general requirements for supervision and quality assurance of medical radiation apparatus are prescribed in section 40 of the Radiation Act (592/1991, amendment 1142/1998) and in sections 18 and 32 of the Decree of the Ministry of Social Affairs and Health on the medical use of radiation (423/2000). Guide ST 2.2 imposes requirements on structural radiation shielding of radiotherapy equipment and the premises in which it is used, and on warning and safety arrangements. Guide ST 1.1 sets out the general safety principles for radiation practices and regulatory control procedure for the use of radiation. Guide ST 1.6 provides general requirements for operational measures in the use of radiation. This Guide sets out the duties of responsible parties (the party running a radiation practice) in respect of arranging and maintaining radiotherapy quality assurance. The principles set out in this Guide and Guide ST 6.3 may be applied to radionuclide therapy

  20. Power transformers quality assurance

    CERN Document Server

    Dasgupta, Indrajit

    2009-01-01

    About the Book: With a view to attaining higher reliability in power system operation, quality assurance in the field of distribution and power transformers has claimed growing attention. Besides new developments in the material technology and manufacturing processes of transformers, regular diagnostic testing and maintenance of any engineering product may be ascertained by ensuring: the right selection of materials and components and their quality checks; the application of correct manufacturing processes and systems engineering; and the user's awareness towards preventive maintenance. The

  1. Security assurance method based on non-interference

    Institute of Scientific and Technical Information of China (English)

    孙瑜; 陈亚莎; 张兴; 刘毅

    2011-01-01

    In recent years, more and more researchers have paid attention to security assurance as an important aspect of operating system security. For a high-level secure operating system, structuring at the architecture level must be achieved as a security assurance requirement; this is the essential characteristic that distinguishes it from low-level secure systems. First, the shortcomings of the traditional information flow model in solving the security assurance problem are analyzed, and the description and reflection of security assurance in the non-interference model are studied. Then, structural rules are proposed that can satisfy the reference monitor hypothesis. Second, the concept of the trusted pipeline is applied to the structural assurance of the non-interference model, and the security of the new model is proven. Finally, an implementation scheme for structured information flow control based on the trusted pipeline is proposed.

  2. Two media method for linear attenuation coefficient determination of irregular soil samples

    International Nuclear Information System (INIS)

    In several nuclear applications, such as soil physics and geology, knowledge of the gamma-ray linear attenuation coefficient of irregular samples is necessary. This work presents the validation of a methodology for the determination of the linear attenuation coefficient (μ) of irregularly shaped samples that does not require knowledge of the sample thickness. With this methodology, irregular soil samples (undeformed field samples) from the Londrina region, north of Parana, were studied. The two media method was employed for the μ determination: it consists of determining μ through the measurement of the attenuation of a gamma-ray beam by the sample sequentially immersed in two different media with known and appropriately chosen attenuation coefficients. For comparison, the theoretical value of μ was calculated as the product of the mass attenuation coefficient, obtained with the WinXcom code, and the measured sample density. This software uses the chemical composition of the samples and supplies a table of mass attenuation coefficients versus photon energy. To verify the validity of the two media method against the simple gamma-ray transmission method, regular pumice stone samples were used. With these results for the attenuation coefficients and their respective deviations, it was possible to compare the two methods. We conclude that the two media method is a good tool for the determination of the linear attenuation coefficient of irregular materials, particularly in the study of soil samples. (author)
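
    A common formulation of the two media method is sketched below, assuming the sample of unknown thickness x sits in a container of known inner path length L that is otherwise filled by each immersion medium in turn; container walls and beam hardening are neglected, and the numerical values are illustrative only, so this should be read as a generic sketch rather than the exact experimental geometry used in the work.

```python
import math

def two_media_mu(I0, I1, I2, mu1, mu2, L):
    """Linear attenuation coefficient of an irregular sample from two
    transmission measurements.

    Assumed geometry: the sample (unknown thickness x along the beam) sits in
    a container of known inner path length L; the rest of the path is filled
    by medium 1 (measurement I1) and then by medium 2 (measurement I2), whose
    attenuation coefficients mu1 and mu2 are known.

        I1 = I0 * exp(-mu_s*x - mu1*(L - x))
        I2 = I0 * exp(-mu_s*x - mu2*(L - x))
    """
    gap = math.log(I1 / I2) / (mu2 - mu1)   # this is (L - x)
    x = L - gap                             # sample thickness along the beam
    mu_s = (math.log(I0 / I1) - mu1 * gap) / x
    return mu_s, x

# Illustrative numbers only (e.g. water and a salt solution as the two media)
mu_s, x = two_media_mu(I0=10000, I1=2141, I2=2000, mu1=0.086, mu2=0.120, L=5.0)
print(f"mu_sample = {mu_s:.3f} 1/cm, thickness = {x:.2f} cm")
```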

  3. Comparison of Six Culture Methods for Salmonella Isolation from Poultry Fecal Samples

    Directory of Open Access Journals (Sweden)

    Morshed, R. (PhD

    2014-06-01

    Background and Objective: Salmonellosis is one of the most important food-borne bacterial zoonotic diseases worldwide, and poultry and its products are the major sources of salmonella transmission to humans. Isolation of Salmonella enterica from poultry requires bacteriological enrichment and selective culture of fecal samples. In this study, different culture methods for the isolation of salmonella from fecal samples were compared. Material and Methods: Forty-five positive samples from infected farms and 45 negative samples from normal farms were processed using enrichment media including tetrathionate broth, selenite cystine and Rappaport-Vassiliadis. The samples were then incubated on selective cultures, and after 24 h their results were compared with the standard method. Results: The specificity of all methods for salmonella isolation was 100%, and salmonella was not isolated from the negative samples. The highest sensitivity was obtained with the method in which the sample was enriched first in selenite cystine and then in Rappaport-Vassiliadis (100%). Enrichment in Rappaport-Vassiliadis alone isolated salmonella from 41 of the 45 positive samples (91%), while enrichment in tetrathionate yielded 6 isolates (13.3%). Conclusion: This study shows that enrichment in selenite cystine and then in Rappaport-Vassiliadis is currently the best method for isolating salmonella from fecal samples of poultry. Key words: Salmonella; Bacteriologic Culture; Diagnosis; Isolation; Enrichment; Poultry
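
    Using the figures quoted in the abstract, the sensitivity and specificity of an enrichment scheme can be reproduced with the short calculation below; the two-by-two layout for the Rappaport-Vassiliadis-only scheme (41 true positives, 4 false negatives, 45 true negatives, 0 false positives) is inferred from the reported percentages.

```python
def diagnostic_performance(tp, fn, tn, fp):
    """Sensitivity and specificity of an isolation method against known status."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Figures inferred from the abstract above: Rappaport-Vassiliadis enrichment
# recovered 41 of 45 known-positive samples, and none of the 45 negatives
# gave a false positive.
sens, spec = diagnostic_performance(tp=41, fn=4, tn=45, fp=0)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")   # 91%, 100%
```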

  4. [Preparation of sub-standard samples and XRF analytical method of powder non-metallic minerals].

    Science.gov (United States)

    Kong, Qin; Chen, Lei; Wang, Ling

    2012-05-01

    To address the problem that standard samples of non-metallic minerals are often unsatisfactory in practical X-ray fluorescence spectrometry (XRF) analysis with pressed powder pellets, a method was studied for preparing sub-standard samples from standard samples of non-metallic minerals and for determining how well they are suited to the analysis of mineral powder samples, taking the K-feldspar ore of Ebian-Wudu, Sichuan as an example. Based on characterization of the K-feldspar ore and the standard samples by X-ray diffraction (XRD) and chemical methods, and following the principle that the sub-standard samples and the unknown samples should be the same or similar, a procedure for preparing sub-standard samples was developed: the two kinds of samples should contain the same minerals and similar chemical components, be amenable to mineral processing, and facilitate construction of the working curve. Under the optimum experimental conditions, a method for the determination of SiO2, Al2O3, Fe2O3, TiO2, CaO, MgO, K2O and Na2O in K-feldspar ore by XRF was established. The determination results are in good agreement with classical chemical methods, which indicates that the method is accurate. PMID:22827101

  5. Revenue assurance methodology

    OpenAIRE

    Filipová, Michaela

    2006-01-01

    This thesis proposes a comprehensive theoretical and methodological framework for the new corporate function of revenue assurance in telecommunications companies that strive for a systematic and conceptual approach to securing and maximizing revenues. The role of the revenue assurance function in the company, its goals and its development stages are described. Threats and specific forms of revenue leakage and revenue overstatement are also described. The methodology for carrying out revenue assurance tasks and the techniques for eliminating revenue leakage and overstatement are analysed in detail. ...

  6. Measurement quality assurance

    International Nuclear Information System (INIS)

    The quality of a radiation protection program can be no better than the quality of the measurements made to support it. In many cases, that quality is unknown and is merely implied on the basis of a calibration of a measuring instrument. If that calibration is inappropriate or is performed improperly, the measurement result will be inaccurate and misleading. Assurance of measurement quality can be achieved if appropriate procedures are followed, including periodic quality control actions that demonstrate adequate performance. Several national measurement quality assurance (MQA) programs are operational or under development in specific areas. They employ secondary standards laboratories that provide a high-quality link between the National Bureau of Standards and measurements made at the field use level. The procedures followed by these secondary laboratories to achieve MQA will be described, as well as plans for similar future programs. A growing general national interest in quality assurance, combined with strong specific motivations for MQA in the area of ionizing radiation, will provide continued demand for appropriate national programs. Such programs must, however, employ procedures that are cost effective and must be developed with participation by all affected parties

  7. Quality Assurance Project Plan for Facility Effluent Monitoring Plan activities

    International Nuclear Information System (INIS)

    This Quality Assurance Project Plan addresses the quality assurance requirements for the Facility Monitoring Plans of the overall site-wide environmental monitoring plan. This plan specifically applies to the sampling and analysis activities and continuous monitoring performed for all Facility Effluent Monitoring Plan activities conducted by Westinghouse Hanford Company. It is generic in approach and will be implemented in conjunction with the specific requirements of individual Facility Effluent Monitoring Plans. This document is intended to be a basic road map to the Facility Effluent Monitoring Plan documents (i.e., the guidance document for preparing Facility Effluent Monitoring Plans, Facility Effluent Monitoring Plan determinations, management plan, and Facility Effluent Monitoring Plans). The implementing procedures, plans, and instructions are appropriate for the control of effluent monitoring plans requiring compliance with US Department of Energy, US Environmental Protection Agency, state, and local requirements. This Quality Assurance Project Plan contains a matrix of organizational responsibilities, procedural resources from facility or site manuals used in the Facility Effluent Monitoring Plans, and a list of the analytes of interest and analytical methods for each facility preparing a Facility Effluent Monitoring Plan. 44 refs., 1 figs., 2 tabs

  8. Quality assurance in diagnostic ultrasound

    Energy Technology Data Exchange (ETDEWEB)

    Sipilae, Outi, E-mail: outi.sipila@hus.fi [HUS Helsinki Medical Imaging Center, Helsinki University Central Hospital, P.O. Box 340, 00029 HUS (Finland); Mannila, Vilma, E-mail: vilma.mannila@hus.fi [HUS Helsinki Medical Imaging Center, Helsinki University Central Hospital, P.O. Box 340, 00029 HUS (Finland); Department of Physics, University of Helsinki, P.O. Box 64, 00014 Helsinki University (Finland); Vartiainen, Eija, E-mail: eija.vartiainen@hus.fi [HUS Helsinki Medical Imaging Center, Helsinki University Central Hospital, P.O. Box 750, 00029 HUS (Finland)

    2011-11-15

    Objective: To set up a practical ultrasound quality assurance protocol in a large radiological center, results from transducer tests, phantom measurements and visual checks for physical faults were compared. Materials and methods: Altogether 151 transducers from 54 ultrasound scanners, from seven different manufacturers, were tested with a Sonora FirstCall aPerio™ system (Sonora Medical Systems, Inc., Longmont, CO, USA) to detect non-functional elements. Phantom measurements using a CIRS General Purpose Phantom Model 040 (CIRS Tissue Simulation and Phantom Technology, VA, USA) were available for 135 transducers. The transducers and scanners were also checked visually for physical faults. The percentages of defective findings in these tests were computed. Results: Defective results in the FirstCall tests were found in 17% of the 151 transducers, and in 16% of the 135 transducers. Defective image quality was found for 15% of the transducers, and 25% of the transducers had a physical flaw. In 16% of the scanners, a physical fault elsewhere than in the transducer was found. Seven percent of the transducers had a concurrent defective result both in the FirstCall test and in the phantom measurements, 8% in the FirstCall test and in the visual check, 4% in the phantom measurements and in the visual check, and 2% in all three tests. Conclusion: The tested methods produced partly complementary results and all appeared to be necessary. A quality assurance protocol is therefore necessarily rather labour-intensive, and its benefits and costs must be closely monitored.

  9. [Sample preparation methods for chromatographic analysis of organic components in atmospheric particulate matter].

    Science.gov (United States)

    Hao, Liang; Wu, Dapeng; Guan, Yafeng

    2014-09-01

    The determination of the organic composition of atmospheric particulate matter (PM) is of great importance in understanding how PM affects human health, the environment, climate, and ecosystems. Organic components are also the scientific basis for emission source tracking, PM regulation and risk management. Therefore, the molecular characterization of the organic fraction of PM has become one of the priority research issues in the field of environmental analysis. Due to the extreme complexity of PM samples, chromatographic methods have been the chief choice. The common procedure for the analysis of organic components in PM includes several steps: sample collection on fiber filters, sample preparation (transforming the sample into a form suitable for chromatographic analysis), and analysis by chromatographic methods. Among these steps, the sample preparation methods largely determine the throughput and the data quality. Solvent extraction methods followed by sample pretreatment (e.g. pre-separation, derivatization, pre-concentration) have long been used for PM sample analysis, while thermal desorption methods have mainly been applied to the analysis of non-polar organic components in PM. In this paper, the sample preparation methods used prior to chromatographic analysis of organic components in PM are reviewed comprehensively, and the corresponding merits and limitations of each method are briefly discussed.

  10. Quality assurance of specialised treatment of eating disorders using large-scale Internet-based collection systems: methods, results and lessons learned from designing the Stepwise database.

    Science.gov (United States)

    Birgegård, Andreas; Björck, Caroline; Clinton, David

    2010-01-01

    Computer-based quality assurance of specialist eating disorder (ED) care is a possible way of meeting demands for evaluating the real-life effectiveness of treatment, in a large-scale, cost-effective and highly structured way. The Internet-based Stepwise system combines clinical utility for patients and practitioners, and provides research-quality naturalistic data. Stepwise was designed to capture relevant variables concerning EDs and general psychiatric status, and the database can be used for both clinical and research purposes. The system comprises semi-structured diagnostic interviews, clinical ratings and self-ratings, automated follow-up schedules, as well as administrative functions to facilitate registration compliance. As of June 2009, the system is in use at 20 treatment units and comprises 2776 patients. Diagnostic distribution (including subcategories of eating disorder not otherwise specified) and clinical characteristics are presented, as well as data on registration compliance. Obstacles and keys to successful implementation of the Stepwise system are discussed, including possible gains and on-going challenges inherent in large-scale, Internet-based quality assurance. PMID:20589767

  11. Large loop conformation sampling using the activation relaxation technique, ART-nouveau method.

    Science.gov (United States)

    St-Pierre, Jean-François; Mousseau, Normand

    2012-07-01

    We present an adaptation of the ART-nouveau energy surface sampling method to the problem of loop structure prediction. This method, previously used to study protein folding pathways and peptide aggregation, is well suited to the problem of sampling the conformation space of large loops by targeting probable folding pathways instead of exhaustively sampling that space. The number of sampled conformations needed by ART nouveau to find the global energy minimum for a loop was found to scale linearly with the sequence length of the loop for loops between 8 and about 20 amino acids. Given this linear dependence of the cost of sampling new conformations on the loop sequence length, we estimate the total computational cost of sampling larger loops to scale quadratically, compared with the exponential scaling of exhaustive search methods.

  12. Method and Implementation of High-speed Digital Sampling Technology Based on Impulse Radar Signal

    Directory of Open Access Journals (Sweden)

    Shen Shao-xiang

    2012-06-01

    A high-speed digital sampling technology suitable for periodic impulse radar signals is proposed in this paper. A one-bit high-speed quantizer is constructed from a differential comparator in an FPGA. Time-interleaved digital sampling and buffer encoding are applied to the one-bit stream using the internal multi-phase clock of the FPGA to achieve a sampling rate higher than 1 GHz. High-speed digital sampling with multi-bit resolution is realized by accumulating the one-bit sampling data acquired at different comparison levels. An 8-bit, 1.6 GHz ADC based on the proposed method is realized on a Xilinx XC2V3000 FPGA and successfully applied in ground-penetrating radar (GPR). The proposed method has the advantages of low cost and low power consumption compared with real-time sampling, and exhibits higher efficiency compared with equivalent-time sampling.
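
    As a rough software analogy of the accumulation step (this is not the authors' FPGA design; the waveform, sampling rate and bit depth below are invented parameters), the amplitude of a repetitive signal can be recovered by summing the one-bit comparator decisions obtained over a sweep of comparison levels:

      import numpy as np

      fs = 1.6e9                           # assumed effective sampling rate, Hz
      n = 256                              # samples per signal period
      t = np.arange(n) / fs
      # impulse-like test waveform in the 0..1 amplitude range
      signal = 0.6 * np.exp(-((t - 80e-9) ** 2) / (2 * (10e-9) ** 2))

      bits = 8
      levels = np.linspace(0.0, 1.0, 2 ** bits, endpoint=False)   # comparison levels

      # One period is acquired per comparison level; the 1-bit decisions are summed.
      codes = np.zeros(n, dtype=int)
      for level in levels:
          codes += (signal > level).astype(int)    # one-bit quantizer output

      reconstructed = codes / 2 ** bits            # back to the 0..1 amplitude range
      print("max reconstruction error:", np.max(np.abs(reconstructed - signal)))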

  13. The quality assurance liaison: Combined technical and quality assurance support

    International Nuclear Information System (INIS)

    This paper describes the role of the quality assurance liaison, the responsibilities of this position, and the evolutionary changes in duties over the last six years. The role of the quality assurance liaison has had a very positive impact on the Los Alamos Yucca Mountain Site Characterization (YW) quality assurance program. Having both technical and quality assurance expertise, the quality assurance liaisons are able to facilitate communications with scientists on quality assurance issues and requirements, thereby generating greater productivity in scientific investigations. The quality assurance liaisons help ensure that the scientific community knows and implements existing requirements, is aware of new or changing regulations, and is able to conduct scientific work within Project requirements. The influence of the role of the quality assurance liaison can be measured by an overall improvement in attitude of the staff regarding quality assurance requirements and improved job performance, as well as a decrease in deficiencies identified during both internal and external audits and surveillances. This has resulted in a more effective implementation of quality assurance requirements

  14. The quality assurance liaison: Combined technical and quality assurance support

    Science.gov (United States)

    Bolivar, S. L.; Day, J. L.

    1993-03-01

    The role of the quality assurance liaison, the responsibilities of this position, and the evolutionary changes in duties over the last six years are described. The role of the quality assurance liaison has had a very positive impact on the Los Alamos Yucca Mountain Site Characterization (YW) quality assurance program. Having both technical and quality assurance expertise, the quality assurance liaisons are able to facilitate communications with scientists on quality assurance issues and requirements, thereby generating greater productivity in scientific investigations. The quality assurance liaisons help ensure that the scientific community knows and implements existing requirements, is aware of new or changing regulations, and is able to conduct scientific work within Project requirements. The influence of the role of the quality assurance liaison can be measured by an overall improvement in attitude of the staff regarding quality assurance requirements and improved job performance, as well as a decrease in deficiencies identified during both internal and external audits and surveillances. This has resulted in a more effective implementation of quality assurance requirements.

  15. Multiplex biotoxin surface plasmon resonance method for marine biotoxins in algal and seawater samples.

    Science.gov (United States)

    McNamee, Sara E; Elliott, Christopher T; Delahaut, Philippe; Campbell, Katrina

    2013-10-01

    A multiplex surface plasmon resonance (SPR) biosensor method for the detection of paralytic shellfish poisoning (PSP) toxins, okadaic acid (and analogues) and domoic acid was developed. This method was compared to enzyme-linked immunosorbent assay (ELISA) methods. Seawater samples (n=256) from around Europe were collected by the consortia of an EU project MIcroarrays for the Detection of Toxic Algae (MIDTAL) and evaluated using each method. A simple sample preparation procedure was developed which involved lysing and releasing the toxins from the algal cells with glass beads followed by centrifugation and filtering the extract before testing for marine biotoxins by both multi-SPR and ELISA. Method detection limits based on IC20 values for PSP, okadaic acid and domoic acid toxins were 0.82, 0.36 and 1.66 ng/ml, respectively, for the prototype multiplex SPR biosensor. Evaluation by SPR for seawater samples has shown that 47, 59 and 61 % of total seawater samples tested positive (result greater than the IC20) for PSP, okadaic acid (and analogues) and domoic acid toxins, respectively. Toxic samples were received mainly from Spain and Ireland. This work has demonstrated the potential of multiplex analysis for marine biotoxins in algal and seawater samples with results available for 24 samples within a 7 h period for three groups of key marine biotoxins. Multiplex immunological methods could therefore be used as early warning monitoring tools for a variety of marine biotoxins in seawater samples.

  16. Multiplex biotoxin surface plasmon resonance method for marine biotoxins in algal and seawater samples.

    Science.gov (United States)

    McNamee, Sara E; Elliott, Christopher T; Delahaut, Philippe; Campbell, Katrina

    2013-10-01

    A multiplex surface plasmon resonance (SPR) biosensor method for the detection of paralytic shellfish poisoning (PSP) toxins, okadaic acid (and analogues) and domoic acid was developed. This method was compared to enzyme-linked immunosorbent assay (ELISA) methods. Seawater samples (n=256) from around Europe were collected by the consortia of an EU project MIcroarrays for the Detection of Toxic Algae (MIDTAL) and evaluated using each method. A simple sample preparation procedure was developed which involved lysing and releasing the toxins from the algal cells with glass beads followed by centrifugation and filtering the extract before testing for marine biotoxins by both multi-SPR and ELISA. Method detection limits based on IC20 values for PSP, okadaic acid and domoic acid toxins were 0.82, 0.36 and 1.66 ng/ml, respectively, for the prototype multiplex SPR biosensor. Evaluation by SPR for seawater samples has shown that 47, 59 and 61 % of total seawater samples tested positive (result greater than the IC20) for PSP, okadaic acid (and analogues) and domoic acid toxins, respectively. Toxic samples were received mainly from Spain and Ireland. This work has demonstrated the potential of multiplex analysis for marine biotoxins in algal and seawater samples with results available for 24 samples within a 7 h period for three groups of key marine biotoxins. Multiplex immunological methods could therefore be used as early warning monitoring tools for a variety of marine biotoxins in seawater samples. PMID:23250726

  17. Towards Run-time Assurance of Advanced Propulsion Algorithms

    Science.gov (United States)

    Wong, Edmond; Schierman, John D.; Schlapkohl, Thomas; Chicatelli, Amy

    2014-01-01

    This paper covers the motivation and rationale for investigating the application of run-time assurance methods as a potential means of providing safety assurance for advanced propulsion control systems. Certification is becoming increasingly infeasible for such systems using current verification practices. Run-time assurance systems hold the promise of certifying these advanced systems by continuously monitoring the state of the feedback system during operation and reverting to a simpler, certified system if anomalous behavior is detected. The discussion will also cover initial efforts underway to apply a run-time assurance framework to NASA's model-based engine control approach. Preliminary experimental results are presented and discussed.

  18. Security assurance capability assessment based on entropy weight method for cryptographic module

    Institute of Scientific and Technical Information of China (English)

    粟登银; 徐开勇; 高杨

    2012-01-01

    To address the problems that the index values of cryptographic modules are not fixed, that an index system is difficult to build, and that security assurance capability cannot be quantitatively assessed, a method for assessing the security assurance capability of cryptographic modules was proposed. Interval numbers are used to describe the security attributes of cryptographic modules, and the weight of each security attribute is determined by the entropy weight coefficient method combined with a subjective (expert) weighting method. An interval multi-attribute decision-making methodology is then applied to solve the interval information assurance (IA) capability evaluation problem of cryptographic modules. Finally, two kinds of commercial cryptographic modules were analyzed as examples, and the results show that the proposed method is feasible.
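
    A minimal sketch of the entropy weight step described above, using crisp scores instead of interval numbers and invented data (purely for illustration, not the paper's full algorithm): the more an attribute's values differ across the evaluated modules, the lower its entropy and the higher its weight.

      import numpy as np

      # Rows = candidate cryptographic modules, columns = security attributes
      # (hypothetical benefit-type scores; the paper itself uses interval numbers).
      X = np.array([[0.80, 0.60, 0.90],
                    [0.70, 0.85, 0.40],
                    [0.90, 0.55, 0.70]])

      m, n = X.shape
      P = X / X.sum(axis=0)                  # column-wise proportions
      eps = 1e-12                            # avoid log(0)
      E = -(P * np.log(P + eps)).sum(axis=0) / np.log(m)   # entropy per attribute
      d = 1.0 - E                            # degree of divergence
      w = d / d.sum()                        # entropy weights

      scores = X @ w                         # simple weighted aggregation
      print("entropy weights:", np.round(w, 3))
      print("module scores  :", np.round(scores, 3))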

  19. Quality assurance of the bentonite material

    International Nuclear Information System (INIS)

    This report describes a quality assurance chain for the acquisition of bentonite material for a nuclear waste disposal repository. Chemical, mineralogical and geotechnical methods which may be applied in the quality control of bentonite are briefly reviewed. As a case study, many of the presented control studies were performed on six different bentonite samples. Chemical analysis is a very reliable research method for controlling material homogeneity, because the accuracy and repeatability of the method are extremely good. Accurate mineralogical study of bentonite is a complicated task. X-ray diffractometry is the best method to identify smectite minerals, but quantitative analysis of the smectite content remains uncertain. To obtain a better quantitative analysis, development of techniques based on automatic image analysis of SEM images is proposed. General characteristics of bentonite can be obtained by rapid indicator tests, which can be done at the place of reception. These tests are the methylene blue test, which gives information on the cation exchange capacity, the swelling index, and the determination of water absorption. Different methods were used in the determination of the cation exchange capacity (CEC) of bentonite. The results indicated differences both between methodologies and between replicate determinations for the same material and method. Additional work should be done to improve the reliability and reproducibility of the methodology. Bentonite contains water in different modes; thus, different determination methods are used in bentonite studies, and they give somewhat dissimilar results. Clay research frequently uses the so-called consistency tests (liquid limit, plastic limit and plasticity index). However, this study method does not seem very practical for the quality control of bentonite. Therefore, only the determination of the liquid limit with the fall-cone method is recommended for quality control. (orig.)

  20. RAPID FUSION METHOD FOR DETERMINATION OF PLUTONIUM ISOTOPES IN LARGE RICE SAMPLES

    Energy Technology Data Exchange (ETDEWEB)

    Maxwell, S.

    2013-03-01

    A new rapid fusion method for the determination of plutonium in large rice samples has been developed at the Savannah River National Laboratory (Aiken, SC, USA) that can be used to determine very low levels of plutonium isotopes in rice. The recent accident at Fukushima Nuclear Power Plant in March, 2011 reinforces the need to have rapid, reliable radiochemical analyses for radionuclides in environmental and food samples. Public concern regarding foods, particularly foods such as rice in Japan, highlights the need for analytical techniques that will allow very large sample aliquots of rice to be used for analysis so that very low levels of plutonium isotopes may be detected. The new method to determine plutonium isotopes in large rice samples utilizes a furnace ashing step, a rapid sodium hydroxide fusion method, a lanthanum fluoride matrix removal step, and a column separation process with TEVA Resin cartridges. The method can be applied to rice sample aliquots as large as 5 kg. Plutonium isotopes can be determined using alpha spectrometry or inductively-coupled plasma mass spectrometry (ICP-MS). The method showed high chemical recoveries and effective removal of interferences. The rapid fusion technique is a rugged sample digestion method that ensures that any refractory plutonium particles are effectively digested. The MDA for a 5 kg rice sample using alpha spectrometry is 7E-5 mBq g⁻¹. The method can easily be adapted for use by ICP-MS to allow detection of plutonium isotopic ratios.

  1. Capability of analysis arsenic in geology sample by gamma-gamma coincidence method

    International Nuclear Information System (INIS)

    The gamma-gamma coincidence method has been successfully applied to the study of nuclear data and structure. Owing to its good background-reduction capability, it has also been widely applied in neutron activation analysis, and experimental studies on geological and environmental samples have been conducted in several laboratories around the world. This report presents the results of arsenic (As) analysis of geological samples by neutron activation analysis with the coincidence method. The results show a linear relationship between the As concentration in the sample and the coincidence peak count rate; compared with the conventional single-detector measurement, the influence of interfering isotopes was eliminated and the background was reduced. The detection limits for the analysis of As in geological samples were also found to be improved. (author)

  2. A Generalized Method for Integrating Rule-based Knowledge into Inductive Methods Through Virtual Sample Creation

    CERN Document Server

    Iqbal, Ridwan Al

    2011-01-01

    Hybrid learning methods use theoretical knowledge of a domain and a set of classified examples to develop a method for classification. Methods that use domain knowledge have been shown to perform better than inductive learners. However, there is no general method to include domain knowledge into all inductive learning algorithms as all hybrid methods are highly specialized for a particular algorithm. We present an algorithm that will take domain knowledge in the form of propositional rules, generate artificial examples from the rules and also remove instances likely to be flawed. This enriched dataset then can be used by any learning algorithm. Experimental results of different scenarios are shown that demonstrate this method to be more effective than simple inductive learning.
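
    A minimal sketch of the general idea (this is not the authors' algorithm; the rule format, attributes and example domain are invented for illustration): propositional rules over binary attributes are expanded into artificial labelled examples, which can then be appended to the training data of any inductive learner.

      from itertools import product

      # Hypothetical domain: binary attributes and rules of the form
      # ({attribute: required_value, ...}, class_label).
      ATTRS = ["fever", "cough", "rash"]
      RULES = [
          ({"fever": 1, "cough": 1}, "flu"),
          ({"rash": 1}, "measles"),
      ]

      def virtual_samples(rules, attrs):
          """Expand each rule into every attribute assignment consistent with it."""
          samples = []
          for conditions, label in rules:
              free = [a for a in attrs if a not in conditions]
              for values in product([0, 1], repeat=len(free)):
                  example = dict(conditions)
                  example.update(zip(free, values))
                  samples.append(([example[a] for a in attrs], label))
          return samples

      # The generated (features, label) pairs are merged with the real training
      # data and passed to a standard classifier of choice.
      for features, label in virtual_samples(RULES, ATTRS):
          print(features, label)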

  3. Analysis method on shoot precision of weapon in small-sample case

    Institute of Scientific and Technical Information of China (English)

    Jiang Jun; Song Baowei; Liang Qingwei

    2007-01-01

    Because of cost constraints, weapon test data are generally scarce, so obtaining scientifically sound results from weapon performance analyses in the small-sample case is an important and recurring topic. Based on an analysis of distribution function characteristics and grey mathematics, a weighted grey method for the small-sample case is presented. Analysis of test data from a weapon shows that the method handles data well in the small-sample case and is of high value for the analysis of weapon performance.

  4. Determination of thorium and uranium contents in soil samples using SSNTD's passive method

    Indian Academy of Sciences (India)

    T A Salama; U Seddik; T M Dsoky; A Ahmed Morsy; R El-Asser

    2006-08-01

    Thorium-to-uranium ratios have been determined in different soil samples using CR-39 and LR-115-II solid-state nuclear track detectors (SSNTDs). A calibration method based on determination of the SSNTD registration sensitivity ratio for α-particles of the thorium and uranium series has been developed. The thorium and uranium contents of standard soil samples have been determined and compared with their known values, and there is good agreement between the results of this method and the values of the standard samples. The method is simple, inexpensive and non-destructive, and it has a wide range of applications in the environmental, building-materials and petroleum fields.

  5. Quality assurance program plan for low-level waste at the WSCF Laboratory

    International Nuclear Information System (INIS)

    The purpose of this document is to provide guidance for the implementation of the Quality Assurance Program Plan (QAPP) for the management of low-level waste at the Waste Sampling and Characterization Facility (WSCF) Laboratory Complex as required by WHC-CM-4-2, Quality Assurance Manual, which is based on Quality Assurance Program Requirements for Nuclear Facilities, NQA-1 (ASME)

  6. INVESTIGATION OF THE TOTAL ORGANIC HALOGEN ANALYTICAL METHOD AT THE WASTE SAMPLING CHARACTERIZATION FACILITY (WSCF)

    Energy Technology Data Exchange (ETDEWEB)

    DOUGLAS JG; MEZNARICH HD, PHD; OLSEN JR; ROSS GA; STAUFFER M

    2008-09-30

    Total organic halogen (TOX) is used as a parameter to screen groundwater samples at the Hanford Site. Trending is done for each groundwater well, and changes in TOX and other screening parameters can lead to costly changes in the monitoring protocol. The Waste Sampling and Characterization Facility (WSCF) analyzes groundwater samples for TOX using the United States Environmental Protection Agency (EPA) SW-846 method 9020B (EPA 1996a). Samples from the Soil and Groundwater Remediation Project (S&GRP) are submitted to the WSCF for analysis without information regarding the source of the sample; each sample is in essence a 'blind' sample to the laboratory. Feedback from the S&GRP indicated that some of the WSCF-generated TOX data from groundwater wells had a number of outlier values based on the historical trends (Anastos 2008a). Additionally, analysts at WSCF observed inconsistent TOX results among field sample replicates. Therefore, the WSCF lab performed an investigation of the TOX analysis to determine the cause of the outlier data points. Two causes were found that contributed to generating out-of-trend TOX data: (1) The presence of inorganic chloride in the groundwater samples: at inorganic chloride concentrations greater than about 10 parts per million (ppm), apparent TOX values increase with increasing chloride concentration. A parallel observation is the increase in apparent breakthrough of TOX from the first to the second activated-carbon adsorption tubes with increasing inorganic chloride concentration. (2) During the sample preparation step, excessive purging of the adsorption tubes with oxygen pressurization gas after sample loading may cause channeling in the activated-carbon bed. This channeling leads to poor removal of inorganic chloride during the subsequent wash step with aqueous potassium nitrate. The presence of this residual inorganic chloride then produces erroneously high TOX values. Changes in sample preparation were studied to more

  7. Methods, compounds and systems for detecting a microorganism in a sample

    Energy Technology Data Exchange (ETDEWEB)

    Colston, Jr, Bill W.; Fitch, J. Patrick; Gardner, Shea N.; Williams, Peter L.; Wagner, Mark C.

    2016-09-06

    Methods to identify a set of probe polynucleotides suitable for detecting a set of targets, and in particular methods for identification of primers suitable for detection of target microorganisms; related polynucleotides, sets of polynucleotides and compositions; and related methods and systems for detection and/or identification of microorganisms in a sample.

  8. Protein Profile study of clinical samples using Laser Induced Fluorescence as the detection method

    DEFF Research Database (Denmark)

    Karemore, Gopal Raghunath; Raja, Sujatha N.; Rai, Lavanya;

    2009-01-01

    by using hard and Fuzzy clustering methods. The study was performed to test the utility of the HPLC-LIF protein profiling method for classification of tissue samples as well as to establish a complementary method for histopathology for clinical diagnosis of the tissue as normal or malignant.  ...

  9. Methods and devices for hyperpolarising and melting NMR samples in a cryostat

    DEFF Research Database (Denmark)

    Ardenkjaer-Larsen, Jan Henrik; Axelsson, Oskar H. E.; Golman, Klaes Koppel;

    2006-01-01

    The present invention relates to devices and methods for melting a solid polarised sample while retaining a high level of polarisation. In an embodiment of the present invention, a sample is polarised in a sample-retaining cup 9 in a strong magnetic field in a polarising means 3a, 3b, 3c in a cryostat 2 and then melted inside the cryostat 2 by melting means such as a laser 8 connected by an optical fibre 4 to the interior of the cryostat.

  10. Apparatus and method for maintaining multi-component sample gas constituents in vapor phase during sample extraction and cooling

    Energy Technology Data Exchange (ETDEWEB)

    Felix, Larry Gordon; Farthing, William Earl; Irvin, James Hodges; Snyder, Todd Robert

    2010-05-11

    A dilution apparatus for diluting a gas sample. The apparatus includes a sample gas conduit having a sample gas inlet end and a diluted sample gas outlet end, and a sample gas flow restricting orifice disposed proximate the sample gas inlet end connected with the sample gas conduit and providing fluid communication between the exterior and the interior of the sample gas conduit. A diluted sample gas conduit is provided within the sample gas conduit having a mixing end with a mixing space inlet opening disposed proximate the sample gas inlet end, thereby forming an annular space between the sample gas conduit and the diluted sample gas conduit. The mixing end of the diluted sample gas conduit is disposed at a distance from the sample gas flow restricting orifice. A dilution gas source connected with the sample gas inlet end of the sample gas conduit is provided for introducing a dilution gas into the annular space, and a filter is provided for filtering the sample gas. The apparatus is particularly suited for diluting heated sample gases containing one or more condensable components.

  11. Quality assurance services

    International Nuclear Information System (INIS)

    For over 20 years the quality assurance services at the Springfields Laboratories have been concerned with manufacturing both simple and complex engineering products to the highest standard. The scientists working there have considerable expertise in the practical application of quality control and the development and design of inspection and non-destructive testing equipment. The folder contains six sheets or leaflets illustrating the work and equipment. The subjects are the mechanical standards laboratory, non-destructive testing, the digitising table, the peripheral camera, automated measurement, data handling and presentation, and the computer controlled three axis co-ordinate measuring machine. (U.K.)

  12. Reactor system safety assurance

    International Nuclear Information System (INIS)

    The philosophy of reactor safety is that design should follow established and conservative engineering practices, there should be safety margins in all modes of plant operation, special systems should be provided for accidents, and safety systems should have redundant components. This philosophy provides ''defense in depth.'' Additionally, the safety of nuclear power plants relies on ''safety systems'' to assure acceptable response to design basis events. Operating experience has shown the need to study plant response to more frequent upset conditions and to account for the influence of operators and non-safety systems on overall performance. Defense in depth is being supplemented by risk and reliability assessment

  13. Sampling Methods for Detection and Monitoring of the Asian Citrus Psyllid (Hemiptera: Psyllidae).

    Science.gov (United States)

    Monzo, C; Arevalo, H A; Jones, M M; Vanaclocha, P; Croxton, S D; Qureshi, J A; Stansly, P A

    2015-06-01

    The Asian citrus psyllid (ACP), Diaphorina citri Kuwayama is a key pest of citrus due to its role as vector of citrus greening disease or "huanglongbing." ACP monitoring is considered an indispensable tool for management of vector and disease. In the present study, datasets collected between 2009 and 2013 from 245 citrus blocks were used to evaluate precision, sensitivity for detection, and efficiency of five sampling methods. The number of samples needed to reach a 0.25 standard error-mean ratio was estimated using Taylor's power law and used to compare precision among sampling methods. Comparison of detection sensitivity and time expenditure (cost) between stem-tap and other sampling methodologies conducted consecutively at the same location were also assessed. Stem-tap sampling was the most efficient sampling method when ACP densities were moderate to high and served as the basis for comparison with all other methods. Protocols that grouped trees near randomly selected locations across the block were more efficient than sampling trees at random across the block. Sweep net sampling was similar to stem-taps in number of captures per sampled unit, but less precise at any ACP density. Yellow sticky traps were 14 times more sensitive than stem-taps but much more time consuming and thus less efficient except at very low population densities. Visual sampling was efficient for detecting and monitoring ACP at low densities. Suction sampling was time consuming and taxing but the most sensitive of all methods for detection of sparse populations. This information can be used to optimize ACP monitoring efforts. PMID:26313984
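
    The precision comparison described above rests on Taylor's power law, which relates the variance of counts to the mean m as s^2 = a * m^b; for a target standard error-to-mean ratio D, the required number of sample units is n = a * m^(b-2) / D^2. A minimal sketch with invented coefficients (not the values fitted in the study):

      # Required sample size from Taylor's power law: variance = a * mean**b.
      # For a target SE/mean ratio D, n = a * mean**(b - 2) / D**2.

      def taylor_sample_size(mean_density, a, b, D=0.25):
          """Sample units needed to reach the target precision D (SE/mean)."""
          return a * mean_density ** (b - 2.0) / D ** 2

      a, b = 2.5, 1.4      # hypothetical Taylor coefficients for stem-tap counts
      for mean in (0.1, 0.5, 2.0, 10.0):
          print(f"mean ACP per tap sample = {mean:>5}: "
                f"n = {taylor_sample_size(mean, a, b):.0f}")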

  14. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Empirical Testing. Volume 2

    Science.gov (United States)

    Johnson, Kenneth L.; White, K. Preston, Jr.

    2012-01-01

    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.
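
    Acceptance sampling by variables, as referenced above, typically tests a measured characteristic against a specification limit using a plan (n, k): draw n units, compute the sample mean and standard deviation, and accept the lot if (USL - mean) / s >= k. A minimal sketch with an invented plan and simulated measurements (this is not NASA's calculators nor the six distributions examined in the report):

      import random
      import statistics

      def accept_lot(measurements, usl, k):
          """Variables acceptance sampling against an upper specification limit:
          accept if (USL - mean) / stdev >= k."""
          xbar = statistics.mean(measurements)
          s = statistics.stdev(measurements)
          return (usl - xbar) / s >= k, xbar, s

      # Hypothetical single-sampling plan and specification limit.
      n, k, usl = 20, 1.9, 10.0

      random.seed(1)
      sample = [random.gauss(7.5, 1.0) for _ in range(n)]   # simulated data
      ok, xbar, s = accept_lot(sample, usl, k)
      print(f"mean = {xbar:.2f}, s = {s:.2f}, accept lot: {ok}")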

  15. Comparison of LSO samples produced by Czochralsky and modified Musatov methods

    CERN Document Server

    Antich, P; Tsyganov, E N; Garmash, V; Zheleznykh, I

    2000-01-01

    This study is based on test results of 30 LSO samples produced by the POLUS Research Institute in Moscow, Russia, for the University of Texas Southwestern Medical Center at Dallas. Samples were produced by the Czochralsky and the modified Musatov methods. Pulse-height spectra from ²²Na positron annihilations were analyzed and conclusions are drawn. After some minor corrections to the modified Musatov technology, samples could be recommended for use in PET systems.

  16. Minimally invasive blood sampling method for genetic studies on Gopherus tortoises

    OpenAIRE

    García–Feria, L. M.; Ureña–Aranda, C. A.; Espinosa de los Monteros, A.

    2015-01-01

    Obtaining good quality tissue samples is the first hurdle in any molecular study. This is especially true for studies involving management and conservation of wild fauna. In the case of tortoises, the most common sources of DNA are blood samples. However, only a minimal amount of blood is required for PCR assays. Samples are obtained mainly from the brachial and jugular vein after restraining the animal chemically, or from conscious individuals by severe handling methods and clamping. Herein,...

  17. ESTIMATION AND SELECTION OF PLASMA IODENT CORN SAMPLES ON DROUGHT-RESISTANCE BY PHYSIOLOGICAL METHODS

    OpenAIRE

    Grabovskaya, T.

    2009-01-01

    A complex estimation of corn samples for drought resistance was conducted by physiological methods. It was shown that samples grown on osmotic solutions of sucrose with a gradual increase of pressure have a greater water-retaining ability and a greater content of bound water. In the selected samples, a lower percentage of hydrolyzed starch in the cells of the root caps was also observed.

  18. Ontario's Quality Assurance Framework: A Critical Response

    Science.gov (United States)

    Heap, James

    2013-01-01

    Ontario's Quality Assurance Framework (QAF) is reviewed and found not to meet all five criteria proposed for a strong quality assurance system focused on student learning. The QAF requires a statement of student learning outcomes and a method and means of assessing those outcomes, but it does not require that data on achievement of intended…

  19. 42 CFR 431.53 - Assurance of transportation.

    Science.gov (United States)

    2010-10-01

    § 431.53 Assurance of transportation. A State plan must— (a) Specify that the Medicaid agency will ensure necessary transportation for recipients to and from providers; and (b) Describe the methods...

  20. The quality of chemotherapy and its quality assurance.

    NARCIS (Netherlands)

    Ottevanger, P.B.; Mulder, P.H.M. de

    2005-01-01

    AIMS: Assessment of the quality of chemotherapy care and its quality assurance in clinical trials and daily practice. METHODS: Using Medline, literature was searched combining the following words: quality assurance or quality of care, combined with anti-neoplastic agents. The bibliography of each ar

  1. Quality assurance of absorbed energy in Charpy impact test

    Science.gov (United States)

    Rocha, C. L. F.; Fabricio, D. A. K.; Costa, V. M.; Reguly, A.

    2016-07-01

    To ensure quality assurance and comply with standard requirements, an intralaboratory study was performed for Charpy impact tests involving two operators. The results, based on ANOVA (analysis of variance) and normalized error statistical techniques, indicated that the execution of the tests is appropriate, since the quality assurance methods applied gave acceptable results.
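
    The normalized error statistic referred to above compares two results x1 and x2 with expanded uncertainties U1 and U2 as En = (x1 - x2) / sqrt(U1^2 + U2^2), with |En| <= 1 usually taken as acceptable. A minimal sketch with invented absorbed-energy values (not the study's data):

      import math

      def normalized_error(x1, u1, x2, u2):
          """En number for two results with expanded uncertainties (k = 2)."""
          return (x1 - x2) / math.sqrt(u1 ** 2 + u2 ** 2)

      # Hypothetical absorbed energies (J) reported by the two operators,
      # with hypothetical expanded uncertainties.
      en = normalized_error(x1=104.2, u1=3.5, x2=101.8, u2=3.2)
      print(f"En = {en:.2f} -> {'acceptable' if abs(en) <= 1 else 'not acceptable'}")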

  2. A venue-based method for sampling hard-to-reach populations.

    Science.gov (United States)

    Muhib, F B; Lin, L S; Stueve, A; Miller, R L; Ford, W L; Johnson, W D; Smith, P J

    2001-01-01

    Constructing scientifically sound samples of hard-to-reach populations, also known as hidden populations, is a challenge for many research projects. Traditional sample survey methods, such as random sampling from telephone or mailing lists, can yield low numbers of eligible respondents while non-probability sampling introduces unknown biases. The authors describe a venue-based application of time-space sampling (TSS) that addresses the challenges of accessing hard-to-reach populations. The method entails identifying days and times when the target population gathers at specific venues, constructing a sampling frame of venue, day-time units (VDTs), randomly selecting and visiting VDTs (the primary sampling units), and systematically intercepting and collecting information from consenting members of the target population. This allows researchers to construct a sample with known properties, make statistical inference to the larger population of venue visitors, and theorize about the introduction of biases that may limit generalization of results to the target population. The authors describe their use of TSS in the ongoing Community Intervention Trial for Youth (CITY) project to generate a systematic sample of young men who have sex with men. The project is an ongoing community level HIV prevention intervention trial funded by the Centers for Disease Control and Prevention. The TSS method is reproducible and can be adapted to hard-to-reach populations in other situations, environments, and cultures.
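
    A minimal sketch of the sampling-frame step of time-space sampling as described above (the venues, days and time windows are invented; this does not reproduce the CITY project's actual frame): enumerate venue-day-time units and draw a simple random sample of them as primary sampling units.

      import random

      # Hypothetical frame: venues with the day/time windows when the target
      # population is known to gather there.
      venues = {
          "club_A": [("Fri", "22-24"), ("Sat", "22-24")],
          "park_B": [("Sat", "14-16"), ("Sun", "14-16")],
          "cafe_C": [("Wed", "18-20")],
      }

      # Build the frame of venue-day-time units (VDTs).
      frame = [(venue, day, time)
               for venue, slots in venues.items()
               for day, time in slots]

      random.seed(42)
      selected = random.sample(frame, 3)   # simple random sample of VDTs
      for venue, day, time in selected:
          print(f"visit {venue} on {day} {time} and intercept eligible attendees")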

  3. Optimised Spatial Sampling Scheme for Soil Electriclal Conductivity Based on Variance Quad-Tree(VQT)Method

    Institute of Scientific and Technical Information of China (English)

    LI Yan; SHI Zhou; WU Ci-fang; LI Feng; LI Hong-yi

    2007-01-01

    The acquisition of precise soil data representative of the entire survey area is a critical issue for many treatments, such as irrigation or fertilization, in precision agriculture. The aim of this study was to investigate the spatial variability of soil bulk electrical conductivity (ECb) in a coastal saline field and to design an optimized spatial sampling scheme for ECb based on a sampling design algorithm, the variance quad-tree (VQT) method. Soil ECb data were collected from the field at 20 m intervals in a regular grid scheme. A smooth contour map of the whole field was obtained by ordinary kriging interpolation; the VQT algorithm was then used to split the smooth contour map into the desired number of strata, and sampling locations can be selected within each stratum in subsequent sampling. The results indicated that the probability of choosing representative sampling sites was increased significantly by using the VQT method, with the number of samples being greatly reduced compared to the grid sampling design while retaining the same prediction accuracy. The advantage of the VQT method is that it samples sparsely in parts of the field where the spatial variability is relatively uniform and more intensively where the variability is large. Sampling efficiency can thus be improved, facilitating an assessment methodology that can be applied in a rapid, practical and cost-effective manner.
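
    A minimal sketch of the variance quad-tree idea on a toy grid of invented values (not the study's kriged ECb map): repeatedly split the stratum with the highest internal variance into four quadrants until the desired number of strata is reached; sampling locations are then selected within each stratum.

      import numpy as np

      def variance_quad_tree(grid, n_strata):
          """Split a 2-D array into rectangular strata by repeatedly quartering
          the stratum with the largest internal variance."""
          strata = [(0, grid.shape[0], 0, grid.shape[1])]   # (r0, r1, c0, c1)
          while len(strata) < n_strata:
              variances = [grid[r0:r1, c0:c1].var() for r0, r1, c0, c1 in strata]
              r0, r1, c0, c1 = strata.pop(int(np.argmax(variances)))
              rm, cm = (r0 + r1) // 2, (c0 + c1) // 2
              quads = [(r0, rm, c0, cm), (r0, rm, cm, c1),
                       (rm, r1, c0, cm), (rm, r1, cm, c1)]
              strata.extend(q for q in quads if q[0] < q[1] and q[2] < q[3])
          return strata

      rng = np.random.default_rng(0)
      ecb = rng.normal(2.0, 0.2, size=(16, 16))            # fairly uniform field
      ecb[:8, 8:] += rng.normal(0.0, 1.5, size=(8, 8))     # one highly variable zone

      for r0, r1, c0, c1 in variance_quad_tree(ecb, n_strata=7):
          print(f"rows {r0}-{r1}, cols {c0}-{c1}: var = {ecb[r0:r1, c0:c1].var():.3f}")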

  4. Drying and storage methods affect cyfluthrin concentrations in exposed plant samples

    Science.gov (United States)

    Standard procedures exist for collection and chemical analyses of pyrethroid insecticides in environmental matrices. However, less detail is given for drying and potential storage methods of plant samples prior to analyses. Due to equipment and financial limitations, immediate sample analysis is n...

  5. A new method for preparing thin samples of pottery shards for PIXE analysis

    International Nuclear Information System (INIS)

    A novel method, based on the centrifugal precipitation technique, has been used for preparing thin (1 mg/cm2) uniform and homogeneous samples of pottery shards for PIXE analysis. The reproducibility of the results has been tested on standard Perlman-Asaro samples and ancient Greek pottery shards. The abundances of more than 12 elements can be reliably measured. (orig.)

  6. MCNP™ Software Quality Assurance plan

    Energy Technology Data Exchange (ETDEWEB)

    Abhold, H.M.; Hendricks, J.S.

    1996-04-01

    MCNP is a computer code that models the interaction of radiation with matter. MCNP is developed and maintained by the Transport Methods Group (XTM) of the Los Alamos National Laboratory (LANL). This plan describes the Software Quality Assurance (SQA) program applied to the code. The SQA program is consistent with the requirements of IEEE-730.1 and the guiding principles of ISO 9000.

  7. Passive sampling methods for contaminated sediments: Scientific rationale supporting use of freely dissolved concentrations

    DEFF Research Database (Denmark)

    Mayer, Philipp; Parkerton, Thomas F.; Adams, Rachel G.;

    2014-01-01

    Passive sampling methods (PSMs) allow the quantification of the freely dissolved concentration (Cfree ) of an organic contaminant even in complex matrices such as sediments. Cfree is directly related to a contaminant's chemical activity, which drives spontaneous processes including diffusive upta...

  8. Short communication: appropriate and alternative methods to determine viable bacterial counts in cow milk samples.

    Science.gov (United States)

    Loss, G; Apprich, S; Kneifel, W; von Mutius, E; Genuneit, J; Braun-Fahrländer, C

    2012-06-01

    Farm milk consumption is reported to be inversely related to the development of asthma and atopy in children and it has been hypothesized that microorganisms in milk might contribute to this protective effect. The GABRIEL study was designed to investigate this hypothesis in a large population of European children, calling for a rapid alternative to classical culture techniques to determine bacteriological properties of milk samples. One objective was to evaluate 2 different rapid methods to determine bacteriological properties in a large number of cow milk samples collected under field conditions. BactoScan (Foss Analytical, Hillerød, Denmark), an automated standard flow cytometric method utilized for routine testing of milk quality, and TEMPO (bioMérieux, Marcy l'Etoile, France), an automated most-probable-number method, were used to assess the total viable bacterial count in farm and commercial milk samples. Both methods were compared with standard plate count method and each other. Measurements based on the TEMPO method were in good agreement with the standard plate count method and showed reliable results, whereas BactoScan results did not correlate with standard plate count measurements and yielded higher bacteria counts in heat-treated milk samples compared with raw milk samples. Most likely, these discrepant results were due to inferences with staining reactions and detection of bacteria in heat-treated milk samples. We conclude that, in contrast to the routinely used BactoScan method, the TEMPO method is an inexpensive and rapid alternative to standard culture methods suitable to assess total bacterial counts in processed and raw milk samples.

  9. Active Search on Carcasses versus Pitfall Traps: a Comparison of Sampling Methods.

    Science.gov (United States)

    Zanetti, N I; Camina, R; Visciarelli, E C; Centeno, N D

    2016-04-01

    The study of insect succession in cadavers and the classification of arthropods have mostly been done by placing a carcass in a cage, protected from vertebrate scavengers, which is then visited periodically. An alternative is to use specific traps. Few studies on carrion ecology and forensic entomology involving the carcasses of large vertebrates have employed pitfall traps. The aims of this study were to compare both sampling methods (active search on a carcass and pitfall trapping) for each coleopteran family, and to establish whether there is a discrepancy (underestimation and/or overestimation) in the presence of each family by either method. A great discrepancy was found for almost all families, with some of them being more abundant in samples obtained through active search on carcasses and others in samples from traps, whereas two families did not show any bias towards a given sampling method. The fact that families may be underestimated or overestimated depending on the sampling technique highlights the importance of combining both methods, active search on carcasses and pitfall traps, in order to obtain more complete information on decomposition, carrion habitat and cadaveric families or species. Furthermore, a hypothesis is advanced on the reasons why either sampling method shows biases towards certain families. Information is also provided on which sampling technique would be more appropriate for detecting a particular family. PMID:26732526

  10. Comparison of alkali fusion and acid digestion methods for radiochemical separation of Uranium from dietary samples

    International Nuclear Information System (INIS)

    Several methods exist for the separation and measurement of uranium in dietary samples, such as neutron activation analysis (NAA), alpha-spectrometric determination, inductively coupled plasma mass spectrometry (ICP-MS) and fluorimetry. For qualitative determination of activity, NAA and alpha spectrometry are said to be superior for evaluating the isotopes of uranium (238U, 234U and 235U). In the case of alpha spectrometry, the samples have to undergo radiochemical analysis to separate uranium from other elements before detection. In our studies, uranium was determined in food matrices by acid digestion (AD) and alkali fusion (AF) methods, and the recovery yields were compared in order to identify the more consistent procedure. The average activity levels of 238U and 234U in food samples were calculated based on the recovery yield of the 232U tracer in the samples. The average recovery of 232U was 22 ± 8% for the AD method and 14.9 ± 1.3% for the AF method; the spread about the mean is larger for the AD method than for the AF method. The lowest recovery of 232U was found with the AF method, which is attributed to interference from other elements in the sample during electroplating. The experimental results showed that uranium separation by the AD method gives better recovery than the AF method, whereas the consistency of the 232U recovery was better for the AF method, although its recovery was lower. Overall, for both methods the recovery can be termed poor, and rigorous follow-up studies are needed to achieve consistently higher recoveries (>50%) in these types of biological samples. There are reports indicating satisfactory recoveries of around 80% with 232U as tracer in food matrices.

  11. A New Method for Noninvasive Genetic Sampling of Saliva in Ecological Research

    OpenAIRE

    Diana Lobo; Raquel Godinho; Francisco Álvares; López-Bao, José V.; Alejandro Rodríguez

    2015-01-01

    Noninvasive samples for genetic analyses have become essential to address ecological questions. Popular noninvasive samples such as faeces contain degraded DNA which may compromise genotyping success. Saliva is an excellent alternative DNA source but scarcity of suitable collection methods makes its use anecdotal in field ecological studies. We develop a noninvasive method of collection that combines baits and porous materials able to capture saliva. We report its potential in optimal conditi...

  12. Business Sample Survey Measurement on Statistical Thinking and Methods Adoption: the Case of Croatian Small Enterprises

    OpenAIRE

    Berislav Žmuk

    2015-01-01

    The objective of this research is to investigate attitudes of management in Croatian small enterprises that use statistical methods towards statistical thinking in order to gain an insight into related issues. The research was conducted in 2013 using a web survey with a random sample of 631 Croatian small enterprises, but this paper focuses only on those enterprises that use statistical methods. In order to get detailed information, a complex stratified sample survey design was used. In the a...

  13. A comparative study of extraction and purification methods for environmental DNA from soil and sludge samples

    OpenAIRE

    Roh, Changhyun; Villatte, Francois; Kim, Byung-Gee; Schmid, Rolf D.

    2006-01-01

    An important prerequisite for a successful metagenome library construction is an efficient extraction procedure for DNA out of environmental samples. In this study we compared three indirect and four direct extraction methods, including a commercial kit, in terms of DNA yield, purity and time requirement. A special focus was set on methods which are appropriate for the extraction of environmental DNA (eDNA) from very limited sample sizes (0.1 g) to enable a highly parallel approach. Direct ex...

  14. Quality assurance records system

    International Nuclear Information System (INIS)

    This Safety Guide was prepared as part of the Agency's programme, referred to as the NUSS programme, for establishing Codes of Practice and Safety Guides relating to nuclear power plants. It supplements the IAEA Code of Practice on Quality Assurance for Safety in Nuclear Power Plants (IAEA Safety Series No.50-C-QA), which requires that for each nuclear power plant a system for the generation, identification, collection, indexing, filing, storing, maintenance and disposition of quality assurance records shall be established and executed in accordance with written procedures and instructions. The purpose of this Safety Guide is to provide assistance in the establishment and operation of such a system. An orderly established and maintained records system is considered to be part of the means of providing a basis for an appropriate level of confidence that the activities which affect the quality of a nuclear power plant have been performed in accordance with the specific requirements and that the required quality has been achieved and is maintained

  15. INVESTIGATION OF THE TOTAL ORGANIC HALOGEN ANALYTICAL METHOD AT THE WASTE SAMPLING AND CHARACTERIZATION FACILITY

    International Nuclear Information System (INIS)

    Total organic halogen (TOX) is used as a parameter to screen groundwater samples at the Hanford Site. Trending is done for each groundwater well, and changes in TOX and other screening parameters can lead to costly changes in the monitoring protocol. The Waste Sampling and Characterization Facility (WSCF) analyzes groundwater samples for TOX using the United States Environmental Protection Agency (EPA) SW-846 method 9020B (EPA 1996a). Samples from the Soil and Groundwater Remediation Project (S&GRP) are submitted to the WSCF for analysis without information regarding the source of the sample; each sample is in essence a 'blind' sample to the laboratory. Feedback from the S&GRP indicated that some of the WSCF-generated TOX data from groundwater wells had a number of outlier values based on the historical trends (Anastos 2008a). Additionally, analysts at WSCF observed inconsistent TOX results among field sample replicates. Therefore, the WSCF lab performed an investigation of the TOX analysis to determine the cause of the outlier data points. Two causes were found that contributed to generating out-of-trend TOX data: (1) The presence of inorganic chloride in the groundwater samples: at inorganic chloride concentrations greater than about 10 parts per million (ppm), apparent TOX values increase with increasing chloride concentration. A parallel observation is the increase in apparent breakthrough of TOX from the first to the second activated-carbon adsorption tubes with increasing inorganic chloride concentration. (2) During the sample preparation step, excessive purging of the adsorption tubes with oxygen pressurization gas after sample loading may cause channeling in the activated-carbon bed. This channeling leads to poor removal of inorganic chloride during the subsequent wash step with aqueous potassium nitrate. The presence of this residual inorganic chloride then produces erroneously high TOX values. Changes in sample preparation were studied to more effectively

  16. Quality assurance standards for purchasing and inventory control.

    Science.gov (United States)

    Soares, D P

    1985-03-01

    A process is described for quality assurance in pharmaceutical purchasing and inventory control. A quality assurance program should ensure that quality drugs are purchased at the lowest price, drug products are available when needed, the system is managed efficiently, internal controls are provided, drug products are stored under appropriate conditions, and laws, regulations, accreditation standards, and procedures are followed. To meet these objectives, product quality, vendor performance, the department's system of internal controls, purchasing data, and storage conditions should be monitored. A checklist for evaluating purchasing and inventory practices and a sample audit form listing quality assurance criteria, standards, procedures, and recommended actions are provided. A quality assurance program for pharmaceutical purchasing and inventory control should define institution-specific criteria and standards and use these standards for continual evaluation of all aspects of the purchasing and inventory control system. Documentation of quality assurance activities should be provided for use by the purchasing department, hospital administration, and regulatory bodies. PMID:3985026

  17. Hanford analytical services quality assurance requirements documents

    Energy Technology Data Exchange (ETDEWEB)

    Hyatt, J.E.

    1997-09-25

    Hanford Analytical Services Quality Assurance Requirements Document (HASQARD) is issued by the Analytical Services, Program of the Waste Management Division, US Department of Energy (US DOE), Richland Operations Office (DOE-RL). The HASQARD establishes quality requirements in response to DOE Order 5700.6C (DOE 1991b). The HASQARD is designed to meet the needs of DOE-RL for maintaining a consistent level of quality for sampling and field and laboratory analytical services provided by contractor and commercial field and laboratory analytical operations. The HASQARD serves as the quality basis for all sampling and field/laboratory analytical services provided to DOE-RL through the Analytical Services Program of the Waste Management Division in support of Hanford Site environmental cleanup efforts. This includes work performed by contractor and commercial laboratories and covers radiological and nonradiological analyses. The HASQARD applies to field sampling, field analysis, and research and development activities that support work conducted under the Hanford Federal Facility Agreement and Consent Order Tri-Party Agreement and regulatory permit applications and applicable permit requirements described in subsections of this volume. The HASQARD applies to work done to support process chemistry analysis (e.g., ongoing site waste treatment and characterization operations) and research and development projects related to Hanford Site environmental cleanup activities. This ensures a uniform quality umbrella to analytical site activities predicated on the concepts contained in the HASQARD. Using HASQARD will ensure data of known quality and technical defensibility of the methods used to obtain that data. The HASQARD is made up of four volumes: Volume 1, Administrative Requirements; Volume 2, Sampling Technical Requirements; Volume 3, Field Analytical Technical Requirements; and Volume 4, Laboratory Technical Requirements. Volume 1 describes the administrative requirements

  18. Quality-Assurance Program Plan

    International Nuclear Information System (INIS)

    This Quality Assurance Program Plan (QAPP) is provided to describe the Quality Assurance Program which is applied to the waste management activities conducted by AESD-Nevada Operations at the E-MAD Facility located in Area 25 of the Nevada Test Site. The AESD-Nevada Operations QAPP provides the necessary systematic and administrative controls to assure that activities that affect quality, safety, reliability, and maintainability during design, procurement, fabrication, inspection, shipments, tests, and storage are conducted in accordance with established requirements

  19. Plastic bag method for active sample loading into transmission electron microscope

    OpenAIRE

    Yao, Hao; Isobe, Shigehito; Wang, Yongming; Hashimoto, Naoyuki; Ohnuki, Somei

    2011-01-01

    A plastic bag method was developed for observing the microstructure and phase distribution of air-sensitive samples without air exposure during transfer of the holder into the transmission electron microscope (TEM). As an example, a type of lithium aluminum hydride (Li3AlH6) was observed in the TEM to demonstrate the effectiveness of this method. The results show that the plastic bag method is a simple and practical TEM transfer method that reduces air contact for a range of air-sensitive materials.

  20. A Method to Reconstruct Nth-Order Periodically Nonuniform Sampled Signals

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yao

    2004-01-01

    It is well known that nonuniform sampling is often needed in specialized signal processing. In this paper, a general method to reconstruct Nth-order periodically nonuniformly sampled signals is presented and extended to the digital domain, and the designs of the digital filters and the synthesis system are given. This paper extends the studies of Kohlenberg, whose work concentrates on second-order periodically nonuniform sampling, as well as the studies of A. J. Coulson and J. L. Brown, whose work deals with Nth-order sampling and reconstruction of two-band signals.

  1. Evaluation of sampling methods for the detection of Salmonella in broiler flocks

    DEFF Research Database (Denmark)

    Skov, Marianne N.; Carstensen, B.; Tornoe, N.;

    1999-01-01

    The present study compares four different sampling methods potentially applicable to detection of Salmonella in broiler flocks, based on collection of faecal samples: (i) by hand (300 fresh faecal samples), (ii) absorbed on five sheets of paper, (iii) absorbed on five pairs of socks (elastic cotton... tubes pulled over the boots and termed 'socks'), and (iv) using only one pair of socks. Twenty-three broiler flocks were included in the investigation, and 18 of these were found to be positive by at least one method. Seven serotypes of Salmonella with different patterns of transmission (mainly...

  2. Method for rapid screening analysis of Sr-90 in edible plant samples collected near Fukushima, Japan.

    Science.gov (United States)

    Amano, Hikaru; Sakamoto, Hideaki; Shiga, Norikatsu; Suzuki, Kaori

    2016-06-01

    A screening method for measuring (90)Sr in edible plant samples by focusing on (90)Y in equilibrium with (90)Sr is reported. (90)Y was extracted from samples with acid, co-precipitated with iron hydroxide, and precipitated with oxalic acid. The dissolved oxalate precipitate was loaded on an extraction chromatography resin, and the (90)Y-enriched eluate was analyzed by Cherenkov counting with a TDCR liquid scintillation counter. (90)Sr ((90)Y) concentration was determined in plant samples collected near the damaged Fukushima Daiichi Nuclear Power Plants with this method. PMID:27043171
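
    Once the 90Y-enriched eluate has been counted, converting the net Cherenkov count rate into a 90Sr concentration follows the usual screening arithmetic. The sketch below is only a minimal illustration of that conversion, not the authors' procedure; the efficiency, chemical yield, and sample mass are hypothetical inputs.

      def sr90_concentration(net_cps, efficiency, chemical_yield, sample_kg):
          """Convert a net Cherenkov count rate (counts/s) into a 90Sr activity
          concentration (Bq/kg), assuming 90Y is in secular equilibrium with 90Sr.

          efficiency     -- Cherenkov counting efficiency for 90Y (0-1)
          chemical_yield -- fraction of 90Y recovered through the separation (0-1)
          sample_kg      -- mass of the original plant sample in kilograms
          """
          return net_cps / (efficiency * chemical_yield * sample_kg)

      # Hypothetical example: 0.05 cps net, 60% efficiency, 85% yield, 0.5 kg sample
      print(sr90_concentration(0.05, 0.60, 0.85, 0.5))   # ~0.196 Bq/kg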

  3. A proteomics sample preparation method for mature, recalcitrant leaves of perennial plants.

    Directory of Open Access Journals (Sweden)

    Deng Gang

    Sample preparation is key to the success of proteomics studies. In the present study, two sample preparation methods were tested for their suitability on the mature, recalcitrant leaves of six representative perennial plants (grape, plum, pear, peach, orange, and ramie). An improved sample preparation method was obtained: Tris and Triton X-100 were added to the lysis buffer together, instead of CHAPS, and a 20% TCA-water solution and 100% precooled acetone were added after protein extraction for further purification of the protein. This method effectively eliminates nonprotein impurities and yields a clear two-dimensional gel electrophoresis array. The method facilitates the separation of high-molecular-weight proteins and increases the resolution of low-abundance proteins. It provides a widely applicable and economically feasible technology for the proteomic study of the mature, recalcitrant leaves of perennial plants.

  4. PhyloChip™ microarray comparison of sampling methods used for coral microbial ecology.

    Science.gov (United States)

    Kellogg, Christina A; Piceno, Yvette M; Tom, Lauren M; DeSantis, Todd Z; Zawada, David G; Andersen, Gary L

    2012-01-01

    Interest in coral microbial ecology has been increasing steadily over the last decade, yet standardized methods of sample collection still have not been defined. Two methods were compared for their ability to sample coral-associated microbial communities: tissue punches and foam swabs, the latter being less invasive and preferred by reef managers. Four colonies of star coral, Montastraea annularis, were sampled in the Dry Tortugas National Park (two healthy and two with white plague disease). The PhyloChip™ G3 microarray was used to assess microbial community structure of amplified 16S rRNA gene sequences. Samples clustered based on methodology rather than coral colony. Punch samples from healthy and diseased corals were distinct. All swab samples clustered closely together with the seawater control and did not group according to the health state of the corals. Although more microbial taxa were detected by the swab method, there is a much larger overlap between the water control and swab samples than punch samples, suggesting some of the additional diversity is due to contamination from water absorbed by the swab. While swabs are useful for noninvasive studies of the coral surface mucus layer, these results show that they are not optimal for studies of coral disease. PMID:22085912

  5. Distributed sampling measurement method of network traffic in high-speed IPv6 networks

    Institute of Scientific and Technical Information of China (English)

    Pan Qiao; Pei Changxing

    2007-01-01

    With the advent of large-scale, high-speed IPv6 networks, effective multi-point traffic sampling is becoming a necessity. A distributed multi-point traffic sampling method that provides an accurate and efficient solution for measuring IPv6 traffic is proposed. The method samples IPv6 traffic based on an analysis of the bit randomness of each byte in the packet header. It offers a way to consistently select the same subset of packets at each measurement point, which satisfies the requirement of distributed multi-point measurement. Finally, using real IPv6 traffic traces, it is shown that the sampled traffic data have good uniformity, satisfy the requirement of sampling randomness, and correctly reflect the packet size distribution of the full packet trace.
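
    The key property described in the abstract is consistency: every measurement point must decide to keep or drop a packet using only invariant packet content, so that all points select the same subset. The sketch below illustrates that idea with a hash of selected header bytes standing in for the paper's bit-randomness criterion; the field choice and sampling rate are assumptions made purely for illustration.

      import hashlib

      def keep_packet(invariant_header_bytes: bytes, rate: float = 0.01) -> bool:
          """Hash-based consistent packet selection (an illustrative stand-in for
          the bit-randomness analysis in the paper). Every measurement point that
          applies the same function to the same invariant header fields (e.g.
          source/destination addresses plus the IPv6 flow label) keeps exactly the
          same subset of packets, at an expected sampling rate of `rate`."""
          digest = hashlib.sha1(invariant_header_bytes).digest()
          value = int.from_bytes(digest[:4], "big") / 2**32   # uniform in [0, 1)
          return value < rate

      # Hypothetical usage with a dummy 40-byte IPv6 header
      print(keep_packet(bytes(40), rate=0.05))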

  6. Comparison of blood chemistry values for samples collected from juvenile chinook salmon by three methods

    Science.gov (United States)

    Congleton, J.L.; LaVoie, W.J.

    2001-01-01

    Thirteen blood chemistry indices were compared for samples collected by three commonly used methods: caudal transection, heart puncture, and caudal vessel puncture. Apparent biases in blood chemistry values for samples obtained by caudal transection were consistent with dilution with tissue fluids: alanine aminotransferase (ALT), aspartate aminotransferase (AST), lactate dehydrogenase (LDH), creatine kinase (CK), triglyceride, and K+ were increased and Na+ and Cl- were decreased relative to values for samples obtained by caudal vessel puncture. Some enzyme activities (ALT, AST, LDH) and K+ concentrations were also greater in samples taken by heart puncture than in samples taken by caudal vessel puncture. Of the methods tested, caudal vessel puncture had the least effect on blood chemistry values and should be preferred for blood chemistry studies on juvenile salmonids.

  7. Rapid fusion method for the determination of refractory thorium and uranium isotopes in soil samples

    International Nuclear Information System (INIS)

    Recently, approximately 80% of participating laboratories failed to accurately determine uranium isotopes in soil samples in the U.S. Department of Energy Mixed Analyte Performance Evaluation Program Session 30, due to incomplete dissolution of refractory particles in the samples. Failing laboratories employed acid dissolution methods, including hydrofluoric acid, to recover uranium from the soil matrix. A new rapid fusion method has been developed by the Savannah River National Laboratory (SRNL) to prepare 1-2 g soil sample aliquots very quickly, with total dissolution of refractory particles. Soil samples are fused with sodium hydroxide at 600 deg C in zirconium crucibles to enable complete dissolution of the sample. Uranium and thorium are separated on stacked TEVA and TRU extraction chromatographic resin cartridges, prior to isotopic measurements by alpha spectrometry on cerium fluoride microprecipitation sources. (author)

  8. The impact of different blood sampling methods on laboratory rats under different types of anaesthesia

    DEFF Research Database (Denmark)

    Toft, Martin Fitzner; Petersen, Mikke Haxø; Dragsted, Nils;

    2006-01-01

    and that it might take an extra hour to recover from it. CO2 anaesthesia seemed unable to prevent the increase in blood pressure and the fluctuations in body temperature induced by blood sampling, and up to 10 h after sampling, the rats were still affected by CO2 anaesthesia. Rats anaesthetized with isoflurane...... to be the method from which rats most rapidly recover when compared with periorbital puncture and tail vein puncture, and that for anaesthesia, isoflurane is recommended in preference to CO2....... for rats sampled from the tail vein, which showed fluctuations in body temperature in excess of 30 h after sampling. Increases in heart rate and blood pressure within the first hours after sampling indicated that periorbital puncture was the method that had the largest acute impact on the rats...

  9. Impact of processing method on recovery of bacteria from wipes used in biological surface sampling.

    Science.gov (United States)

    Downey, Autumn S; Da Silva, Sandra M; Olson, Nathan D; Filliben, James J; Morrow, Jayne B

    2012-08-01

    Environmental sampling for microbiological contaminants is a key component of hygiene monitoring and risk characterization practices utilized across diverse fields of application. However, confidence in surface sampling results, both in the field and in controlled laboratory studies, has been undermined by large variation in sampling performance results. Sources of variation include controlled parameters, such as sampling materials and processing methods, which often differ among studies, as well as random and systematic errors; however, the relative contributions of these factors remain unclear. The objective of this study was to determine the relative impacts of sample processing methods, including extraction solution and physical dissociation method (vortexing and sonication), on recovery of Gram-positive (Bacillus cereus) and Gram-negative (Burkholderia thailandensis and Escherichia coli) bacteria from directly inoculated wipes. This work showed that target organism had the largest impact on extraction efficiency and recovery precision, as measured by traditional colony counts. The physical dissociation method (PDM) had negligible impact, while the effect of the extraction solution was organism dependent. Overall, however, extraction of organisms from wipes using phosphate-buffered saline with 0.04% Tween 80 (PBST) resulted in the highest mean recovery across all three organisms. The results from this study contribute to a better understanding of the factors that influence sampling performance, which is critical to the development of efficient and reliable sampling methodologies relevant to public health and biodefense.

  10. Solvent extraction method for rapid separation of strontium-90 in milk and food samples

    International Nuclear Information System (INIS)

    A solvent extraction method using tributyl phosphate for the rapid separation of strontium-90 in milk and other food samples is presented in this report, in view of the large number of samples received for checking of radioactive contamination after the Chernobyl accident. The nitration method previously in use for the determination of 90Sr through its daughter 90Y takes over two weeks for the analysis of a sample, whereas this extraction method takes only 4 to 5 hours; a complete estimation, including initial counting, can be done in a single day. The chemical recovery varies between 80-90%, compared with 65-80% for the nitration method. The purity of the method has been established by following the decay of the separated yttrium-90. Some of the results obtained by adopting this chemical method for food analysis are included. The method is thus found to be rapid and convenient for accurate estimation of strontium-90 in milk and food samples. (author). 2 tabs., 1 fig

  11. Satisfaction survey with DNA cards method to collect genetic samples for pharmacogenetics studies

    Directory of Open Access Journals (Sweden)

    Mas Herrero Sergio

    2006-05-01

    Background: Pharmacogenetic studies are essential in understanding the interindividual variability of drug responses. DNA sample collection for genotyping is a critical step in genetic studies. A method using dried blood samples from finger-puncture, collected on DNA-cards, has been described as an alternative to the usual venepuncture technique. The purpose of this study is to evaluate the implementation of the DNA-cards method in a multicentre clinical trial, and to assess the degree of investigators' satisfaction and the acceptance of the patients as perceived by the investigators. Methods: Blood samples were collected on DNA-cards. The quality and quantity of DNA recovered were analyzed. Investigators were questioned regarding their general interest, previous experience, safety issues, preferences and perceived patient satisfaction. Results: 151 patients' blood samples were collected. Genotyping of GST polymorphisms was achieved in all samples (100%). 28 investigators completed the survey. Investigators perceived patient satisfaction as very good (60.7%) or good (39.3%), without reluctance to finger puncture. Investigators preferred this method, which was considered safer and better than the usual methods. All investigators would recommend using it in future genetic studies. Conclusion: Within the clinical trial setting, the DNA-cards method was very well accepted by investigators and patients (as perceived by investigators), and was preferred to conventional methods due to its ease of use and safety.

  12. Estimation and Comparison of Immunization Coverage under Different Sampling Methods for Health Surveys

    Directory of Open Access Journals (Sweden)

    D. C. Nath

    2014-01-01

    Immunization currently averts an estimated 2-3 million deaths every year across all age groups. Hepatitis B is a major public health problem worldwide. In this study, estimates of hepatitis B vaccine coverage are compared among three sampling plans, namely the 30×30 and 30×7 methods under cluster sampling, and systematic random sampling. The data are taken from the survey "Comparison of Two Survey Methodologies to Estimate Total Vaccination Coverage" sponsored by the Indian Council of Medical Research, New Delhi. The estimated coverage proportions are not significantly different at the 5% level. Both 30×30 sampling and 30×7 sampling are preferred to systematic sampling for estimating hepatitis B vaccine coverage in this study population because of quicker estimation and lower cost. The 30×7 cluster sampling is the most recommended method for such immunization coverage surveys, especially in a developing country.
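
    The cluster designs compared here (30 clusters of 30 children, or 30 clusters of 7) estimate coverage as the ratio of vaccinated children to children examined, with a between-cluster variance estimate. The sketch below is a generic illustration of that ratio estimator and its standard error under equal-probability cluster sampling; it is an assumption-laden sketch, not the analysis code used in the study.

      import math

      def cluster_coverage(vaccinated, examined):
          """Ratio estimate of coverage and its design-based standard error for a
          one-stage cluster sample. vaccinated[i] and examined[i] are the counts
          in cluster i (e.g. 30 clusters of 7 children each)."""
          k = len(vaccinated)
          p = sum(vaccinated) / sum(examined)
          m_bar = sum(examined) / k
          # Between-cluster variance of the ratio estimator (no finite-population correction)
          s2 = sum((a - p * m) ** 2 for a, m in zip(vaccinated, examined)) / (k - 1)
          se = math.sqrt(s2 / k) / m_bar
          return p, se

      # Hypothetical 30x7 data: every cluster examined 7 children
      vac = [6, 7, 5, 7, 6, 7, 7, 4, 6, 7] * 3
      exa = [7] * 30
      print(cluster_coverage(vac, exa))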

  13. Evaluation of sampling methods for Bacillus spore-contaminated HVAC filters

    Science.gov (United States)

    Calfee, M. Worth; Rose, Laura J.; Tufts, Jenia; Morse, Stephen; Clayton, Matt; Touati, Abderrahmane; Griffin-Gatchalian, Nicole; Slone, Christina; McSweeney, Neal

    2016-01-01

    The objective of this study was to compare an extraction-based sampling method to two vacuum-based sampling methods (vacuum sock and 37 mm cassette filter) with regards to their ability to recover Bacillus atrophaeus spores (surrogate for Bacillus anthracis) from pleated heating, ventilation, and air conditioning (HVAC) filters that are typically found in commercial and residential buildings. Electrostatic and mechanical HVAC filters were tested, both without and after loading with dust to 50% of their total holding capacity. The results were analyzed by one-way ANOVA across material types, presence or absence of dust, and sampling device. The extraction method gave higher relative recoveries than the two vacuum methods evaluated (p ≤ 0.001). On average, recoveries obtained by the vacuum methods were about 30% of those achieved by the extraction method. Relative recoveries between the two vacuum methods were not significantly different (p > 0.05). Although extraction methods yielded higher recoveries than vacuum methods, either HVAC filter sampling approach may provide a rapid and inexpensive mechanism for understanding the extent of contamination following a wide-area biological release incident. PMID:24184312

  14. Measurement quality assurance for beta particle calibrations at NIST

    Energy Technology Data Exchange (ETDEWEB)

    Soares, C.G.; Pruitt, J.S. [National Institute of Standards and Technology, Gaithersburg, MD (United States)

    1993-12-31

    Standardized beta-particle fields have been established in an international standard and have been adopted for use in several U.S. dosimeter and instrument testing standards. Calibration methods and measurement quality assurance procedures employed at the National Institute of Standards and Technology (NIST) for beta-particle calibrations in these reference fields are discussed. The calibration facility including the NIST-automated extrapolation ionization chamber is described, and some sample results of calibrations are shown. Methods for establishing and maintaining traceability to NIST of secondary laboratories are discussed. Currently, there are problems in finding a good method for routine testing of traceability to NIST. Some examples of past testing methods are given and solutions to this problem are proposed.

  15. Shear Strength of Remoulding Clay Samples Using Different Methods of Moulding

    Science.gov (United States)

    Norhaliza, W.; Ismail, B.; Azhar, A. T. S.; Nurul, N. J.

    2016-07-01

    Shear strength of clay soil is required to determine soil stability. Clay is known as a soil with complex natural formations, and it is very difficult to obtain undisturbed samples at the site. The aim of this paper was to determine the unconfined shear strength of remoulded clay prepared by different moulding methods: proctor compaction, a hand-operated soil compacter, and a miniature mould. All the samples were remoulded with the same optimum moisture content (OMC) of 18% and density of 1880 kg/m3. The unconfined shear strength of the remoulded clay was 289.56 kPa at 4.8% strain for the proctor compaction method, 261.66 kPa at 4.4% strain for the hand-operated method, and 247.52 kPa at 3.9% strain for the miniature mould method. Relative to the proctor compaction method, the reduction in unconfined shear strength was 9.66% for the hand-operated method and 14.52% for the miniature mould method. Because there was no significant difference in the reduction of unconfined shear strength between the three methods, it can be concluded that remoulding clay by the hand-operated and miniature mould methods is acceptable, and these methods are suggested for preparing remoulded clay samples by other future researchers. However, for comparison, the hand-operated method is more suitable for forming remoulded clay samples for unconfined shear strength determination in terms of ease of use, time saving and lower energy.

  16. Microwave assisted rapid and improved radiochemical method for the estimation of uranium in leaf samples

    International Nuclear Information System (INIS)

    In this paper, we report the development of a rapid and improved radiochemical method, assisted by microwave techniques, for the determination of uranium in leafy samples for use in radiological emergency situations, where quick assessment of radioactivity is required. About 200 g of fresh leaf sample was ashed in a microwave muffle furnace, followed by digestion in a microwave digester, and radiochemically separated using UTEVA resin. Counting was performed in a passivated ion-implanted planar silicon alpha spectrometer after electroplating the separated uranium fraction. By this method, the time for sample processing and analysis was reduced from about 2 days to a few hours, and the radiochemical recovery of uranium was considerably enhanced compared with conventional methods. (author)

  17. A New Method of Measuring 81Kr and 85Kr Abundances in Environmental Samples

    CERN Document Server

    Du, X; Bailey, K; Lehmann, B E; Lorenzo, R; Lu, Z T; Müller, P; O'Connor, T P; Sturchio, N C; Young, L

    2003-01-01

    We demonstrate a new method for determining the 81Kr/Kr ratio in environmental samples based upon two measurements: the 85Kr/81Kr ratio measured by Atom Trap Trace Analysis (ATTA) and the 85Kr/Kr ratio measured by Low-Level Counting (LLC). This method can be used to determine the mean residence time of groundwater in the range of 10^5 - 10^6 a. It requires a sample of 100 micro-l STP of Kr extracted from approximately two tons of water. With modern atmospheric Kr samples, we demonstrate that the ratios measured by ATTA and LLC are directly proportional to each other within the measurement error of +/- 10%; we calibrate the 81Kr/Kr ratio of modern air measured using this method; and we show that the 81Kr/Kr ratios of samples extracted from air before and after the development of the nuclear industry are identical within the measurement error.
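
    The arithmetic behind the method is the combination of the two measured ratios and, for dating, the radioactive decay of 81Kr. The sketch below shows that combination under stated assumptions; the 81Kr half-life is an assumed round value used only for illustration, and the variable names are not from the paper.

      import math

      KR81_HALF_LIFE_YR = 2.29e5   # assumed value (~229,000 a) for illustration

      def kr81_per_kr(kr85_per_kr_llc, kr85_per_kr81_atta):
          """Combine the two measurements: 81Kr/Kr = (85Kr/Kr) / (85Kr/81Kr)."""
          return kr85_per_kr_llc / kr85_per_kr81_atta

      def mean_residence_time(ratio_sample, ratio_modern_air,
                              half_life=KR81_HALF_LIFE_YR):
          """Groundwater age from the depletion of 81Kr relative to modern air."""
          return half_life / math.log(2) * math.log(ratio_modern_air / ratio_sample)

      # Hypothetical numbers: a sample depleted to 40% of the modern 81Kr/Kr ratio
      print(mean_residence_time(0.4, 1.0))   # roughly 3e5 years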

  18. Phylogenetic representativeness: a new method for evaluating taxon sampling in evolutionary studies

    Directory of Open Access Journals (Sweden)

    Passamonti Marco

    2010-04-01

    Background: Taxon sampling is a major concern in phylogenetic studies. Incomplete, biased, or improper taxon sampling can lead to misleading results in reconstructing evolutionary relationships. Several theoretical methods are available to optimize taxon choice in phylogenetic analyses. However, most involve some knowledge about the genetic relationships of the group of interest (i.e., the ingroup), or even a well-established phylogeny itself; these data are not always available in general phylogenetic applications. Results: We propose a new method to assess taxon sampling by developing Clarke and Warwick statistics. This method aims to measure the "phylogenetic representativeness" of a given sample or set of samples and is based entirely on the pre-existing available taxonomy of the ingroup, which is commonly known to investigators. Moreover, our method also accounts for instability and discordance in taxonomies. A Python-based script suite, called PhyRe, has been developed to implement all analyses we describe in this paper. Conclusions: We show that this method is sensitive and allows direct discrimination between representative and unrepresentative samples. It is also informative about the addition of taxa to improve taxonomic coverage of the ingroup. While the investigators' expertise remains essential in this field, phylogenetic representativeness provides an objective touchstone in planning phylogenetic studies.
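
    The Clarke and Warwick statistics the method builds on summarize how widely a sample of taxa spreads across a reference taxonomy, for example through average taxonomic distinctness (the mean taxonomic path weight between all pairs of sampled taxa). The sketch below computes that quantity for a toy taxonomy; it only illustrates the underlying statistic and is not the PhyRe implementation, and the level weights are an assumption.

      from itertools import combinations

      def avg_taxonomic_distinctness(sample, taxonomy):
          """Average taxonomic distinctness (in the spirit of Clarke & Warwick's
          Delta+) for a set of species, given a taxonomy mapping
          species -> (genus, family, order). Pairwise weight: 1 = same genus,
          2 = same family, 3 = same order, 4 = different orders."""
          def weight(a, b):
              for level, w in zip(range(3), (1, 2, 3)):
                  if taxonomy[a][level] == taxonomy[b][level]:
                      return w
              return 4

          pairs = list(combinations(sample, 2))
          return sum(weight(a, b) for a, b in pairs) / len(pairs)

      # Toy taxonomy (hypothetical names): species -> (genus, family, order)
      taxonomy = {
          "sp1": ("G1", "F1", "O1"),
          "sp2": ("G1", "F1", "O1"),
          "sp3": ("G2", "F1", "O1"),
          "sp4": ("G3", "F2", "O2"),
      }
      print(avg_taxonomic_distinctness(["sp1", "sp2", "sp3", "sp4"], taxonomy))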

  19. Uncertainty Estimation of Analysis of Hg in Fish and Shrimp Sample by SS-AAS Method

    International Nuclear Information System (INIS)

    An uncertainty estimation for the analysis of Hg in fish and shrimp samples by the SS-AAS method has been carried out. Sample preparation was done by freeze drying; the samples were then homogenized and sieved to a grain size of 100 mesh. The Hg content of the samples and of SRM 1566a was analyzed using a solid sampling atomic absorption spectrometer (SS-AAS; AAS 5 EA, ZEISS). The experimental results show that the expanded uncertainty value for the fish sample is 4.3636, whereas the expanded uncertainty value for the shrimp sample is 2.638. The precision is 98.46%, the accuracy is 97.0%, and the limit of detection is 0.0154 ng/g. (author)

  20. System and method for liquid extraction electrospray-assisted sample transfer to solution for chemical analysis

    Science.gov (United States)

    Kertesz, Vilmos; Van Berkel, Gary J.

    2016-07-12

    A system for sampling a surface includes a surface sampling probe comprising a solvent liquid supply conduit and a distal end, and a sample collector for suspending a sample collection liquid adjacent to the distal end of the probe. A first electrode provides a first voltage to solvent liquid at the distal end of the probe. The first voltage produces a field sufficient to generate an electrospray plume at the distal end of the probe. A second electrode provides a second voltage and is positioned to produce a plume-directing field sufficient to direct the electrospray droplets and ions to the suspended sample collection liquid. The second voltage is less than the first voltage in absolute value. A voltage supply system supplies the voltages to the first electrode and the second electrode. The first electrode can apply the first voltage directly to the solvent liquid. A method for sampling a surface is also disclosed.

  1. Field sampling and selecting on-site analytical methods for explosives in soil

    Energy Technology Data Exchange (ETDEWEB)

    Crockett, A.B.; Craig, H.D.; Jenkins, T.F.; Sisk, W.E.

    1996-12-01

    A large number of defense-related sites are contaminated with elevated levels of secondary explosives. Levels of contamination range from barely detectable to levels above 10% that need special handling because of the detonation potential. Characterization of explosives-contaminated sites is particularly difficult because of the very heterogeneous distribution of contamination in the environment and within samples. To improve site characterization, several options exist including collecting more samples, providing on-site analytical data to help direct the investigation, compositing samples, improving homogenization of the samples, and extracting larger samples. This publication is intended to provide guidance to Remedial Project Managers regarding field sampling and on-site analytical methods for detecting and quantifying secondary explosive compounds in soils, and is not intended to include discussions of the safety issues associated with sites contaminated with explosive residues.

  2. Improved Algorithms and Coupled Neutron-Photon Transport for Auto-Importance Sampling Method

    CERN Document Server

    Wang, Xin; Qiu, Rui; Li, Chun-Yan; Liang, Man-Chun; Zhang, Hui; Li, Jun-Li

    2016-01-01

    Auto-Importance Sampling (AIS) is a Monte Carlo variance reduction technique proposed by Tsinghua University for deep penetration problems, which can improve computational efficiency significantly without pre-calculation of an importance distribution. However, the AIS method had only been validated on several basic deep penetration problems with simple geometries and could not be used for coupled neutron-photon transport. This paper first presents the latest algorithm improvements to the AIS method, including particle transport, fictitious particle creation and adjustment, fictitious surface geometry, random number allocation, and calculation of the estimated relative error, which make the AIS method applicable to complicated deep penetration problems. A coupled Neutron-Photon Auto-Importance Sampling (NP-AIS) method is then proposed to apply the AIS method with the improved algorithms to coupled neutron-photon Monte Carlo transport. Finally, the NUREG/CR-6115 PWR benchmark model was calculated with the method of geometry splitti...

  3. Simplified sampling methods for estimating levels of lactobacilli in saliva in dental clinical practice.

    Science.gov (United States)

    Gabre, P; Martinsson, T; Gahnberg, L

    1999-08-01

    The aim of the present study was to evaluate whether estimation of lactobacilli is possible with simplified saliva sampling methods. Dentocult LB (Orion Diagnostica AB, Trosa, Sweden) was used to estimate the number of lactobacilli in saliva sampled by 3 different methods from 96 individuals: (i) collecting stimulated saliva and pouring it over a Dentocult dip-slide; (ii) direct licking of the Dentocult LB dip-slide; (iii) contaminating a wooden spatula with saliva and pressing it against the Dentocult dip-slide. The first method is in accordance with the manufacturer's instructions and was selected as the 'gold standard'; the other 2 methods were compared with this result. The 2 simplified methods for estimating levels of lactobacilli in saliva showed good reliability and specificity. Sensitivity, defined as the ability to detect individuals with a high number of lactobacilli in saliva, was sufficient for the licking method (85%), but significantly reduced for the wooden spatula method (52%).
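
    Sensitivity and specificity of the simplified methods are computed against the gold-standard classification (high versus not-high lactobacilli count). The sketch below shows that standard calculation on hypothetical paired results; the data are invented for illustration and are not from the study.

      def sensitivity_specificity(gold_high, test_high):
          """Sensitivity and specificity of a simplified method against the gold
          standard. Both arguments are lists of booleans (True = 'high count'),
          one entry per individual."""
          tp = sum(g and t for g, t in zip(gold_high, test_high))
          fn = sum(g and not t for g, t in zip(gold_high, test_high))
          tn = sum((not g) and (not t) for g, t in zip(gold_high, test_high))
          fp = sum((not g) and t for g, t in zip(gold_high, test_high))
          return tp / (tp + fn), tn / (tn + fp)

      # Hypothetical paired classifications for 10 individuals
      gold = [True, True, True, True, False, False, False, False, False, False]
      lick = [True, True, True, False, False, False, False, False, True, False]
      print(sensitivity_specificity(gold, lick))   # (0.75, 0.833...)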

  4. The case of sustainability assurance: constructing a new assurance service

    NARCIS (Netherlands)

    B. O'Dwyer

    2011-01-01

    This paper presents an in-depth longitudinal case study examining the processes through which practitioners in two Big 4 professional services firms have attempted to construct sustainability assurance (independent assurance on sustainability reports). Power’s (1996, 1997, 1999, 2003) theorization o

  5. 40 CFR 80.580 - What are the sampling and testing methods for sulfur?

    Science.gov (United States)

    2010-07-01

    Title 40 CFR 80.580 (Protection of Environment, Environmental Protection Agency) specifies the sampling and testing methods for determining the sulfur content of diesel fuel, with the referenced test methods incorporated by reference as prescribed in 5 U.S.C. 552(a) and 1 CFR part 51.

  6. Methods for point-of-care detection of nucleic acid in a sample

    Science.gov (United States)

    Bearinger, Jane P.; Dugan, Lawrence C.

    2015-12-29

    Provided herein are methods and apparatus for detecting a target nucleic acid in a sample and related methods and apparatus for diagnosing a condition in an individual. The condition is associated with presence of nucleic acid produced by certain pathogens in the individual.

  7. A sampling method for the reconstruction of a periodic interface in a layered medium

    Science.gov (United States)

    Sun, Guanying; Zhang, Ruming

    2016-07-01

    In this paper, we consider the inverse problem of reconstructing periodic interfaces in a two-layered medium with TM-mode. We propose a sampling-type method to recover the top periodic interface from the near-field data measured on a straight line above the total structure. Finally, numerical experiments are illustrated to show the effectiveness of the method.

  8. Method of high-density seismic imaging exploration and application samples

    Institute of Scientific and Technical Information of China (English)

    熊章强; 张学强; 李修忠; 谢尚平; 张大洲

    2004-01-01

    The paper introduces the method of high-density seismic imaging exploration, discusses its features that differ from the conventional shallow seismic reflection wave technique, and illustrates the application of the method using three examples of engineering geological exploration on land and in water: exploration of an underground cavity, location survey of a sunken ship, and investigation of channel silt depth.

  9. Evaluation of three methods for sampling ground-dwelling Ants in the Brazilian Cerrado.

    Science.gov (United States)

    Lopes, Cauê T; Vasconcelos, Heraldo L

    2008-01-01

    Few studies have evaluated the efficiency of methods for sampling ants, especially in regions with highly variable vegetation physiognomies such as the Cerrado region of central Brazil. Here we compared three methods to collect ground-dwelling ants: pitfall traps, sardine baits, and the Winkler litter extractor. Our aim was to determine which method would be most appropriate to characterize the ant assemblages inhabiting different vegetation types. More species were collected with pitfall traps and with the Winkler extractor than with sardine baits. Pitfall traps collected more species in the cerrado (savanna) physiognomies, particularly in those with poor litter cover, whereas the Winkler extractor was more efficient in the forest physiognomies, except the one subject to periodic inundations. There was low similarity in species composition between forest and cerrado physiognomies, and this pattern was detected regardless of the method used for sampling ants. Therefore, even the use of a single, relatively selective method of collection can be enough for studies comparing highly distinct habitats and/or conditions. However, if the purpose of the sampling is to produce a more thorough inventory of the ant fauna, we suggest the use of a combination of methods, particularly pitfall traps and the Winkler extractor. Therefore, the Ants of the Leaf-Litter (ALL) Sampling Protocol appears to be an adequate protocol for sampling ants in the highly threatened Brazilian cerrado biome. PMID:18813741

  10. OPTIMAL METHOD FOR PREPARATION OF SILICATE ROCK SAMPLES FOR ANALYTICAL PURPOSES

    Directory of Open Access Journals (Sweden)

    Maja Vrkljan

    2004-12-01

    The purpose of this study was to determine an optimal dissolution method for silicate rock samples for further analytical purposes. An analytical FAAS method for determining the cobalt, chromium, copper, nickel, lead and zinc content in a gabbro sample and the geochemical standard AGV-1 was applied for verification. Dissolution in mixtures of various inorganic acids was tested, as well as the Na2CO3 fusion technique. The results obtained by the different methods were compared, and dissolution in the mixture of HNO3 + HF is recommended as optimal.

  11. Methods for detection of fused alumina powder, used as a tracer aerosol, on air sample filters

    International Nuclear Information System (INIS)

    Graded fused alumina dusts (Aloxite) of mean particle sizes of 6 μm upwards are useful for measuring the effects of particle size on aerosol sampling systems and in dust experiments. Methods are described for the determination of filter loading with Aloxite aerosol using thermoluminescence (TL) and X-ray fluorescence (XRF). The XRF method was found to be almost as precise as direct weighing with a precision microbalance, and presents a useful method of detection where other contaminants are present in the sampled airstream. (author)

  12. Comparison of acid leachate and fusion methods to determine plutonium and americium in environmental samples

    International Nuclear Information System (INIS)

    The Analytical Chemistry Laboratory at Argonne National Laboratory performs radiochemical analyses for a wide variety of sites within the Department of Energy complex. Since the chemical history of the samples may vary drastically from site to site, the effectiveness of any analytical technique may also vary. This study compares a potassium fluoride-pyrosulfate fusion technique with an acid leachate method. Both normal and high-fired soils and vegetation samples were analyzed for both americium and plutonium. Results show both methods work well, except for plutonium in high-fired soils. Here the fusion method provides higher accuracy

  13. Computational methods and modeling. 1. Sampling a Position Uniformly in a Trilinear Hexahedral Volume

    International Nuclear Information System (INIS)

    Monte Carlo particle transport plays an important role in some multi-physics simulations. These simulations, which may additionally involve deterministic calculations, typically use a hexahedral or tetrahedral mesh. Trilinear hexahedrons are attractive for physics calculations because faces between cells are uniquely defined, distance-to-boundary calculations are deterministic, and hexahedral meshes tend to require fewer cells than tetrahedral meshes. We discuss one aspect of Monte Carlo transport: sampling a position in a tri-linear hexahedron, which is made up of eight control points, or nodes, and six bilinear faces, where each face is defined by four non-coplanar nodes in three-dimensional Cartesian space. We derive, code, and verify the exact sampling method and propose an approximation to it. Our proposed approximate method uses about one-third the memory and can be twice as fast as the exact sampling method, but we find that its inaccuracy limits its use to well-behaved hexahedrons. Daunted by the expense of the exact method, we propose an alternate approximate sampling method. First, calculate beforehand an approximate volume for each corner of the hexahedron by taking one-eighth of the volume of an imaginary parallelepiped defined by the corner node and the three nodes to which it is directly connected. For the sampling, assume separability in the parameters, and sample each parameter, in turn, from a linear pdf defined by the sum of the four corner volumes at each limit (-1 and 1) of the parameter. This method ignores the quadratic portion of the pdf, but it requires less storage, has simpler sampling, and needs no extra, on-the-fly calculations. We simplify verification by designing tests that consist of one or more cells that entirely fill a unit cube. Uniformly sampling complicated cells that fill a unit cube will result in uniformly sampling the unit cube. Unit cubes are easily analyzed. The first problem has four wedges (or tents, or A frames) whose
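
    As a concrete illustration of the alternate approximate scheme described above (one-eighth parallelepiped corner volumes, then separable sampling of each parametric coordinate from a linear pdf, followed by trilinear interpolation), here is a minimal sketch. It is an illustrative reading of the abstract, not the authors' code; the node-ordering convention and array layout are assumptions.

      import numpy as np

      def corner_volumes(nodes):
          """Approximate corner volumes: 1/8 of the parallelepiped spanned by the
          three edges leaving each corner. nodes has shape (2, 2, 2, 3) and
          nodes[i, j, k] is the position of the corner whose parametric
          coordinates are (2i-1, 2j-1, 2k-1)."""
          V = np.zeros((2, 2, 2))
          for i in range(2):
              for j in range(2):
                  for k in range(2):
                      c = nodes[i, j, k]
                      edges = np.array([nodes[1 - i, j, k] - c,
                                        nodes[i, 1 - j, k] - c,
                                        nodes[i, j, 1 - k] - c])
                      V[i, j, k] = abs(np.linalg.det(edges)) / 8.0
          return V

      def sample_linear(w_minus, w_plus, rng):
          """Draw u in [-1, 1] from a pdf varying linearly from w_minus at u = -1
          to w_plus at u = +1 (up to normalization)."""
          if rng.random() < w_minus / (w_minus + w_plus):
              return 1.0 - 2.0 * np.sqrt(rng.random())   # triangular, peak at -1
          return 2.0 * np.sqrt(rng.random()) - 1.0        # triangular, peak at +1

      def sample_hex_approx(nodes, rng=None):
          """Approximate uniform sample inside a trilinear hexahedron, treating the
          three parametric coordinates as separable (ignores the quadratic terms)."""
          rng = rng or np.random.default_rng()
          V = corner_volumes(nodes)
          xi   = sample_linear(V[0].sum(), V[1].sum(), rng)
          eta  = sample_linear(V[:, 0].sum(), V[:, 1].sum(), rng)
          zeta = sample_linear(V[:, :, 0].sum(), V[:, :, 1].sum(), rng)
          # Trilinear interpolation at the sampled parametric coordinates.
          pos = np.zeros(3)
          for i, si in enumerate((-1.0, 1.0)):
              for j, sj in enumerate((-1.0, 1.0)):
                  for k, sk in enumerate((-1.0, 1.0)):
                      shape = (1 + si * xi) * (1 + sj * eta) * (1 + sk * zeta) / 8.0
                      pos += shape * nodes[i, j, k]
          return pos

      # Hypothetical usage: a unit cube (a trivially well-behaved hexahedron)
      nodes = np.zeros((2, 2, 2, 3))
      for i in range(2):
          for j in range(2):
              for k in range(2):
                  nodes[i, j, k] = (i, j, k)
      print(sample_hex_approx(nodes))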

  14. A faster sample preparation method for determination of polonium-210 in fish

    International Nuclear Information System (INIS)

    In order to facilitate Health Canada’s study on background radiation levels in country foods, an in-house radio-analytical method has been developed for determination of polonium-210 (210Po) in fish samples. The method was validated by measurement of 210Po in a certified reference material. It was also evaluated by comparing 210Po concentrations in a number of fish samples by another method. The in-house method offers faster sample dissolution using an automated digestion system compared to currently used wet-ashing on a hot plate. It also utilizes pre-packed Sr-resin® cartridges for rapid and reproducible separation of 210Po versus time-consuming manually packed Sr-resin® columns. (author)

  15. Determination of radiocaesium in agriculture-related water samples containing suspended solids using gelling method

    International Nuclear Information System (INIS)

    After the TEPCO Fukushima Dai-ichi Nuclear Power Plant accident in 2011, the radiocaesium that flowed into paddy fields via irrigation water has been widely investigated. When the concentration of radiocaesium in water samples containing suspended solids was directly measured using a high-purity germanium detector with a 2 L Marinelli beaker, the radiocaesium concentration could be overestimated due to sedimentation of the suspended solids during the measurement time. In fact, the values obtained by the direct method were higher than those obtained by the filtering method and/or the gelling method in most of the agriculture-related water samples. We concluded that the gelling method using sodium polyacrylate can be widely adopted for the analysis of total radiocaesium in agriculture-related water samples because of its many advantages, such as a simple preparation procedure, accurate analytical values, excellent long-term stability of geometry and low operating cost. (author)

  16. Subsurface quality assurance practices

    International Nuclear Information System (INIS)

    This report addresses only the concept of applying Nuclear Quality Assurance (NQA) practices to repository shaft and subsurface design and construction; how NQA will be applied; and the level of detail required in the documentation for construction of a shaft and subsurface repository in contrast to the level of detail required in the documentation for construction of a traditional mine. This study determined that NQA practices are viable, attainable, as well as required. The study identified the appropriate NQA criteria and the repository's major structures, systems, items, and activities to which the criteria are applicable. A QA plan, for design and construction, and a list of documentation, for construction, are presented. 7 refs., 1 fig., 18 tabs

  17. Construction quality assurance report

    Energy Technology Data Exchange (ETDEWEB)

    Roscha, V.

    1994-09-08

    This report provides a summary of the construction quality assurance (CQA) observation and test results, including: the results of the geosynthetic and soil materials conformance testing; the observation and testing results associated with the installation of the soil liners; the observation and testing results associated with the installation of the HDPE geomembrane liner systems; the observation and testing results associated with the installation of the leachate collection and removal systems; the observation and testing results associated with the installation of the working surfaces; the observation and testing results associated with the in-plant manufacturing process; a summary of submittal reviews by Golder Construction Services, Inc.; the submittal and certification of the piping material specifications; the observation and verification associated with the Acceptance Test Procedure results for the operational equipment functions; and a summary of the ECNs which are incorporated into the project.

  18. FESA Quality Assurance

    CERN Document Server

    CERN. Geneva

    2015-01-01

    FESA is a framework used by 100+ developers at CERN to design and implement the real-time software used to control the accelerators. Each new version must be tested and qualified to ensure that no backward compatibility issues have been introduced and that there is no major bug which might prevent accelerator operations. Our quality assurance approach is based on code review and a two-level testing process. The first level is made of unit-test (Python unittest & Google tests for C++). The second level consists of integration tests running on an isolated test environment. We also use a continuous integration service (Bamboo) to ensure the tests are executed periodically and the bugs caught early. In the presentation, we will explain the reasons why we took this approach, the results and some thoughts on the pros and cons.

  19. Concrete quality assurance

    Energy Technology Data Exchange (ETDEWEB)

    Holz, N. [Harza Engineering Company, Chicago, IL (United States)

    2000-08-01

    This short article reports on progress at the world's largest civil construction project, namely China's Three Gorges hydro project. Work goes on around the clock to put in place nearly 28 M m{sup 3} of concrete. At every stage of the work there is strong emphasis on quality assurance (QA) and concrete is no exception. The US company Harza Engineering has been providing QA since the mid-1980s and concrete QA has been based on international standards. Harza personnel work in the field with supervisors developing educational tools for supervising concrete construction and quality, as well as providing training courses in concrete technology. Some details on flood control, capacity, water quality and environmental aspects are given.

  20. Construction quality assurance report

    International Nuclear Information System (INIS)

    This report provides a summary of the construction quality assurance (CQA) observation and test results, including: the results of the geosynthetic and soil materials conformance testing; the observation and testing results associated with the installation of the soil liners; the observation and testing results associated with the installation of the HDPE geomembrane liner systems; the observation and testing results associated with the installation of the leachate collection and removal systems; the observation and testing results associated with the installation of the working surfaces; the observation and testing results associated with the in-plant manufacturing process; a summary of submittal reviews by Golder Construction Services, Inc.; the submittal and certification of the piping material specifications; the observation and verification associated with the Acceptance Test Procedure results for the operational equipment functions; and a summary of the ECNs which are incorporated into the project

  1. Quality-assurance plan and field methods for quality-of-water activities, U.S. Geological Survey, Idaho National Engineering Laboratory, Idaho

    Energy Technology Data Exchange (ETDEWEB)

    Mann, L.J.

    1996-10-01

    Water-quality activities at the Idaho National Engineering Laboratory (INEL) Project Office are part of the US Geological Survey's (USGS) Water Resources Division (WRD) mission of appraising the quantity and quality of the Nation's water resources. The purpose of the Quality Assurance Plan (QAP) for water-quality activities performed by the INEL Project Office is to maintain and improve the quality of technical products, and to provide a formal standardization, documentation, and review of the activities that lead to these products. The principles of this plan are as follows: (1) water-quality programs will be planned in a competent manner and activities will be monitored for compliance with stated objectives and approaches; (2) field, laboratory, and office activities will be performed in a conscientious and professional manner in accordance with specified WRD practices and procedures by qualified and experienced employees who are well trained and supervised; if or when WRD practices and procedures are inadequate, data will be collected in a manner such that their quality will be documented; (3) all water-quality activities will be reviewed for completeness, reliability, credibility, and conformance to specified standards and guidelines; (4) a record of actions will be kept to document the activity and the assigned responsibility; (5) remedial action will be taken to correct activities that are deficient.

  2. A New Method for Estimating Bacterial Abundances in Natural Samples using Sublimation

    Science.gov (United States)

    Glavin, Daniel P.; Cleaves, H. James; Schubert, Michael; Aubrey, Andrew; Bada, Jeffrey L.

    2004-01-01

    We have developed a new method based on the sublimation of adenine from Escherichia coli to estimate bacterial cell counts in natural samples. To demonstrate this technique, several types of natural samples, including beach sand, seawater, deep-sea sediment, and two soil samples from the Atacama Desert, were heated to a temperature of 500 C for several seconds under reduced pressure. The sublimate was collected on a cold finger, and the amount of adenine released from the samples was then determined by high-performance liquid chromatography (HPLC) with UV absorbance detection. Based on the total amount of adenine recovered from DNA and RNA in these samples, we estimated bacterial cell counts ranging from approximately 10^5 to 10^9 E. coli cell equivalents per gram. For most of these samples, the sublimation-based cell counts were in agreement with total bacterial counts obtained by traditional DAPI staining. The simplicity and robustness of the sublimation technique compared to the DAPI staining method makes this approach particularly attractive for use by spacecraft instrumentation. NASA is currently planning to send a lander to Mars in 2009 in order to assess whether or not organic compounds, especially those that might be associated with life, are present in Martian surface samples. Based on our analyses of the Atacama Desert soil samples, several million bacterial cells per gram of Martian soil should be detectable using this sublimation technique.
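
    The conversion from recovered adenine to "E. coli cell equivalents" is a simple proportionality against a calibration with known numbers of E. coli cells. The sketch below illustrates that arithmetic; the calibration constant used here is a hypothetical placeholder, not a value from the paper.

      # Hypothetical calibration: moles of adenine recovered per E. coli cell,
      # as would be determined by sublimating a counted E. coli standard.
      ADENINE_MOL_PER_CELL = 5.0e-18   # placeholder value for illustration

      def cell_equivalents_per_gram(adenine_mol, sample_grams,
                                    adenine_per_cell=ADENINE_MOL_PER_CELL):
          """Convert the adenine recovered from a sample (mol) into E. coli cell
          equivalents per gram of sample."""
          return adenine_mol / adenine_per_cell / sample_grams

      # Example: 2e-12 mol adenine recovered from a 0.1 g soil aliquot
      print(f"{cell_equivalents_per_gram(2e-12, 0.1):.2e} cells/g")   # 4.00e+06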

  3. A method to combine non-probability sample data with probability sample data in estimating spatial means of environmental variables

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    2003-01-01

    In estimating spatial means of environmental variables of a region from data collected by convenience or purposive sampling, validity of the results can be ensured by collecting additional data through probability sampling. The precision of the pi estimator that uses the probability sample can be in

  4. Sparse feature learning for instrument identification: Effects of sampling and pooling methods.

    Science.gov (United States)

    Han, Yoonchang; Lee, Subin; Nam, Juhan; Lee, Kyogu

    2016-05-01

    Feature learning for music applications has recently received considerable attention from many researchers. This paper reports on a sparse feature learning algorithm for musical instrument identification and, in particular, focuses on the effects of the frame sampling techniques for dictionary learning and the pooling methods for feature aggregation. To this end, two frame sampling techniques are examined: fixed and proportional random sampling. Furthermore, the effect of using onset frames was analyzed for both of the proposed sampling methods. Regarding summarization of the feature activations, a standard deviation pooling method is used and compared with the commonly used max- and average-pooling techniques. Using more than 47 000 recordings of 24 instruments from various performers, playing styles, and dynamics, a number of tuning parameters are examined, including the analysis frame size, the dictionary size, and the type of frequency scaling, as well as the different sampling and pooling methods. The results show that the combination of proportional sampling and standard deviation pooling achieves the best overall performance of 95.62%, while the optimal parameter set varies among the instrument classes. PMID:27250124
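
    Pooling here means collapsing the per-frame feature activations of a clip into one fixed-length vector. The sketch below shows the three pooling variants compared in the abstract, plus a simple proportional random frame sampler; it is an illustration of the ideas under assumed data shapes, not the authors' implementation.

      import numpy as np

      def sample_frames(n_frames, proportion=0.1, rng=None):
          """Proportional random sampling: pick a fixed fraction of a clip's frames."""
          rng = rng or np.random.default_rng()
          k = max(1, int(round(proportion * n_frames)))
          return rng.choice(n_frames, size=k, replace=False)

      def pool(activations, method="std"):
          """Aggregate per-frame activations (n_frames x n_features) into one vector."""
          if method == "max":
              return activations.max(axis=0)
          if method == "mean":
              return activations.mean(axis=0)
          if method == "std":
              return activations.std(axis=0)
          raise ValueError(f"unknown pooling method: {method}")

      # Hypothetical activations for a 500-frame clip with 128 learned features
      acts = np.abs(np.random.default_rng(0).normal(size=(500, 128)))
      clip_vector = pool(acts[sample_frames(500, 0.2)], method="std")
      print(clip_vector.shape)   # (128,)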

  5. A Method of Multi-channel Data Acquisition with Adjustable Sampling Rate

    Directory of Open Access Journals (Sweden)

    Su Shujing

    2013-09-01

    Sampling rates of current signal acquisition systems are typically fixed. To address this shortcoming, a method of multi-channel data acquisition (DAQ) with adjustable sampling rate is presented. The method realizes program-controlled cut-off frequencies for the anti-aliasing filters with the help of switched capacitors; by independently pulsing the sampling signals of the different ADCs, the sampling rates of 16 channels are adjustable among 50 ksps, 25 ksps, 10 ksps, 5 ksps and 1 ksps. Theoretical analysis and experimental verification of the proposed method were carried out: theoretical analysis shows that the parameters of the filter meet the design requirements, and experimental results show that the cut-off frequency of the anti-aliasing filter matches the variable sampling rate very well. Choosing an appropriate sampling rate according to the characteristics of the measured signal not only restores the measured signal well, but also prevents system resources from being wasted. This method can meet the need to test various signals with different frequencies at the same time.

  6. A representative microbial sampling method for large commercial containers of raw beef based on purge.

    Science.gov (United States)

    Dorsa, W J; Siragusa, S R

    1998-02-01

    The purge from beef combos (a boxed collection of beef trimmings) was tested as a means of representatively sampling the microbial content of this raw product. In the first experiment, purge was sampled from model beef combos that had been inoculated with bovine feces. Data from this experiment indicated a strong correlation (r = 0.94) between the total aerobic bacteria counts derived from the purge samples of a model beef combo and the total aerobic bacteria present in a rinse sample of the entire model beef combo. In a second experiment, two 500-g meat pieces were inoculated with an antibiotic-resistant Escherichia coli O157:H7 and placed at various levels within a 75-cm meat column. The marked bacteria were retrievable from the purge of the meat column after 24 h, showing that bacteria are carried downward into the purge. During the third part of the study, 90 beef combos (approximately 900 kg beef/combo) were randomly selected at the receiving dock of a commercial grinding facility and sampled using both purge and the concurrently used 11-g core samples. Purge samples from these combos recovered significantly greater numbers of mesophilic and psychrotrophic aerobic bacteria, coliforms, and E. coli than core samples from the same combos. Additionally, coliforms and E. coli were recoverable from 100% and 80%, respectively, of the purge samples taken, whereas core samples were only able to recover them from 60% and 40%, respectively, of the same combos. These findings indicate that a purge sample from a beef combo is a more efficacious sampling method for determining the general bacterial profile and identifying the presence of coliforms and E. coli than randomly taken core samples. PMID:9708274

  7. An accurate, rapid, and simple method for determination of tritium concentration in environmental water sample

    International Nuclear Information System (INIS)

    Liquid scintillation counting is now the most popular method for measuring the tritium concentration in low-level water samples such as environmental water samples. However, it takes much time and labour to distill off the impurities in the sample water before mixing the sample with the liquid scintillation cocktail. In the light of this, we investigated the possibility of an alternative method using membrane filters for purification. In Japan, measurements of tritium concentrations in terrestrial and subterranean water and rainwater are carried out in many laboratories by means of the official tritium analysing method. According to that method, distillation is essential to avoid misestimation caused by quenching, chemical luminescence and other radioactive substances. However, an alternative using membrane filters was investigated for the following four reasons: (a) the distillation method takes much time and labour for distillation and for washing the used utensils; (b) substances with boiling points lower than, or not much higher than, that of water can hardly be removed, because the water must be heated enough to vaporize it thoroughly; (c) there are generally scarcely any quenchable substances in environmental water samples; (d) scintillation cocktails known as non-quenching liquids have lately come on the market. As a result, the filtration method was proved to be usable as an alternative for tritium measurement. We also examined the filtration method when concentrating the water sample using the electrolysis enrichment apparatus, and found that it was also usable before and after the enrichment. (author)

  8. Simultaneous extraction of PCDDs/PCDFs, PCBs and PBDEs. Extension of a sample preparation method for determination of PCDDs/PCDFs

    Energy Technology Data Exchange (ETDEWEB)

    Thomsen, C.; Nicolaysen, T.; Broadwell, S.L.; Haug, L.S.; Becher, G. [Norwegian Institute of Public Health, Oslo (Norway)

    2004-09-15

    Due to emission controls and regulatory measures, the levels of polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDDs/PCDFs) and polychlorinated biphenyls (PCBs) have been steadily decreasing in the environment and in human samples over the last decades. Nevertheless, the exposure of general populations is still considered to be high, and many individuals may have a dietary intake above the established tolerable daily intake. During recent years, several brominated flame retardants (BFRs), and especially the polybrominated diphenyl ethers (PBDEs), have been shown to be potential persistent organic pollutants (POPs). In contrast to PCDDs/PCDFs and PCBs, levels of BFRs seem to be increasing in several environmental compartments. Thus it is of great importance to obtain information on levels of both PCDDs/PCDFs, PCBs and BFRs. Traditionally, PCDDs/PCDFs have been extracted together with the non-ortho PCBs, while extracts of other POPs and PCBs have been prepared separately. Recently, efficient automated methods that prepare PCDD/PCDF and PCB extracts at the same time have been described. A simultaneous sample preparation is advantageous in cases where limited amounts of sample are available, e.g. when analysing human milk or blood, and assures comparable results since the different POPs are determined in exactly the same sample aliquot. Also, due to the low concentration of PCDDs/PCDFs and non-ortho PCBs usually present, a relatively large amount of sample is applied for the extraction, which leads to the possibility of detecting other POPs that are normally not found. We present here a simple and inexpensive extension of our sample preparation method used for determination of PCDDs/PCDFs and non-ortho PCBs that leads to inclusion of both ortho PCBs and PBDEs.

  9. Validation of the ANSR Listeria method for detection of Listeria spp. in environmental samples.

    Science.gov (United States)

    Wendorf, Michael; Feldpausch, Emily; Pinkava, Lisa; Luplow, Karen; Hosking, Edan; Norton, Paul; Biswas, Preetha; Mozola, Mark; Rice, Jennifer

    2013-01-01

    ANSR Listeria is a new diagnostic assay for detection of Listeria spp. in sponge or swab samples taken from a variety of environmental surfaces. The method is an isothermal nucleic acid amplification assay based on the nicking enzyme amplification reaction technology. Following single-step sample enrichment for 16-24 h, the assay is completed in 40 min, requiring only simple instrumentation. In inclusivity testing, 48 of 51 Listeria strains tested positive, with only the three strains of L. grayi producing negative results. Further investigation showed that L. grayi is reactive in the ANSR assay, but its ability to grow under the selective enrichment conditions used in the method is variable. In exclusivity testing, 32 species of non-Listeria, Gram-positive bacteria all produced negative ANSR assay results. Performance of the ANSR method was compared to that of the U.S. Department of Agriculture-Food Safety and Inspection Service reference culture procedure for detection of Listeria spp. in sponge or swab samples taken from inoculated stainless steel, plastic, ceramic tile, sealed concrete, and rubber surfaces. Data were analyzed using Chi-square and probability of detection models. Only one surface, stainless steel, showed a significant difference in performance between the methods, with the ANSR method producing more positive results. Results of internal trials were supported by findings from independent laboratory testing. The ANSR Listeria method can be used as an accurate, rapid, and simple alternative to standard culture methods for detection of Listeria spp. in environmental samples.

  10. A method to combine non-probability sample data with probability sample data in estimating spatial means of environmental variables.

    Science.gov (United States)

    Brus, D J; de Gruijter, J J

    2003-04-01

    In estimating spatial means of environmental variables of a region from data collected by convenience or purposive sampling, validity of the results can be ensured by collecting additional data through probability sampling. The precision of the π estimator that uses only the probability sample can be increased by interpolating the values at the nonprobability sample points to the probability sample points, and using these interpolated values as an auxiliary variable in the difference or regression estimator. These estimators are (approximately) unbiased, even when the nonprobability sample is severely biased, as in preferential samples. The gain in precision compared with the π estimator under simple random sampling is controlled by the correlation between the target variable and the interpolated variable. This correlation is determined by the size (density) and spatial coverage of the nonprobability sample, and by the spatial continuity of the target variable. In a case study, the average ratio of the variances of the simple regression estimator and the π estimator was 0.68 for preferential samples of size 150 with moderate spatial clustering, and 0.80 for preferential samples of similar size with strong spatial clustering. In the latter case the simple regression estimator was substantially more precise than the simple difference estimator.
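
    The sketch below illustrates the core idea under stated assumptions: a spatially clustered convenience sample is interpolated to the points of a small probability sample, and the interpolated values serve as the auxiliary variable in a simple regression estimator of the spatial mean. Inverse-distance weighting stands in for whatever interpolator the authors used (likely a geostatistical one), and all data and names are hypothetical.

```python
import numpy as np

def idw_interpolate(xy_src, z_src, xy_dst, power=2.0):
    """Crude inverse-distance-weighted interpolation (stand-in for kriging)."""
    d = np.linalg.norm(xy_dst[:, None, :] - xy_src[None, :, :], axis=2)
    d = np.maximum(d, 1e-9)                     # avoid division by zero
    w = 1.0 / d**power
    return (w * z_src).sum(axis=1) / w.sum(axis=1)

def regression_estimator(z_prob, x_prob, x_mean_region):
    """Simple regression estimator of the spatial mean.

    z_prob: target variable at the probability-sample points
    x_prob: interpolated auxiliary variable at the same points
    x_mean_region: mean of the interpolated (auxiliary) surface over the region
    """
    b = np.cov(z_prob, x_prob, ddof=1)[0, 1] / np.var(x_prob, ddof=1)
    return z_prob.mean() + b * (x_mean_region - x_prob.mean())

# Hypothetical data: a clustered convenience sample plus a small SRS
rng = np.random.default_rng(1)
xy_conv = rng.uniform(0, 1, size=(150, 2)) * [0.4, 1.0]   # spatially clustered
z_conv = np.sin(3 * xy_conv[:, 0]) + rng.normal(0, 0.1, 150)
xy_srs = rng.uniform(0, 1, size=(25, 2))                  # simple random sample
z_srs = np.sin(3 * xy_srs[:, 0]) + rng.normal(0, 0.1, 25)

x_srs = idw_interpolate(xy_conv, z_conv, xy_srs)          # auxiliary at SRS points
grid = np.stack(np.meshgrid(np.linspace(0, 1, 50),
                            np.linspace(0, 1, 50)), -1).reshape(-1, 2)
x_mean = idw_interpolate(xy_conv, z_conv, grid).mean()    # regional mean of auxiliary

print("pi (SRS-only) estimate:", z_srs.mean())
print("regression estimate   :", regression_estimator(z_srs, x_srs, x_mean))
```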

  11. Quality assurance of qualitative analysis

    DEFF Research Database (Denmark)

    Ríos, Ángel; Barceló, Damiá; Buydens, Lutgarde;

    2003-01-01

    and quality assurance. One important part of this document deals, therefore, with aspects involved in analytical quality assurance of qualitative analysis. This article shows the main conclusions reported in the document referring to the implementation of quality principles in qualitative analysis...

  12. Recent Trends in Quality Assurance

    Science.gov (United States)

    Amaral, Alberto; Rosa, Maria Joao

    2010-01-01

    In this paper we present a brief description of the evolution of quality assurance in Europe, paying particular attention to its relationship to the rising loss of trust in higher education institutions. We finalise by analysing the role of the European Commission in the setting up of new quality assurance mechanisms that tend to promote…

  13. Evaluation Research and Quality Assurance.

    Science.gov (United States)

    Pesta, George; Respress, Trinetia; Major, Aline K.; Arazan, Christine; Coxe, Terry

    2002-01-01

    Describes the Juvenile Justice Educational Enhancement Program's implementation of an evaluation research-driven quality assurance process for Florida's juvenile justice educational programs. Reviews the literature on juvenile justice educational practices and describes the educational quality assurance standards and annual modifications that draw…

  14. Density estimation of small-mammal populations using a trapping web and distance sampling methods

    Science.gov (United States)

    Anderson, David R.; Burnham, Kenneth P.; White, Gary C.; Otis, David L.

    1983-01-01

    Distance sampling methodology is adapted to enable animal density (number per unit of area) to be estimated from capture-recapture and removal data. A trapping web design provides the link between capture data and distance sampling theory. The estimator of density is D = M_{t+1} · f(0), where M_{t+1} is the number of individuals captured and f(0) is computed from the M_{t+1} distances from the web center to the traps in which those individuals were first captured. It is possible to check qualitatively the critical assumption on which the web design and the estimator are based. This is a conceptual paper outlining a new methodology, not a definitive investigation of the best specific way to implement this method. Several alternative sampling and analysis methods are possible within the general framework of distance sampling theory; a few alternatives are discussed and an example is given.

  15. Detection of protozoa in water samples by formalin/ether concentration method.

    Science.gov (United States)

    Lora-Suarez, Fabiana; Rivera, Raul; Triviño-Valencia, Jessica; Gomez-Marin, Jorge E

    2016-09-01

    Methods to detect protozoa in water samples are expensive and laborious. We evaluated the formalin/ether concentration method to detect Giardia sp., Cryptosporidium sp. and Toxoplasma in water. In order to test the properties of the method, we spiked water samples with different amounts of each protozoan (0, 10 and 50 cysts or oocysts) in a volume of 10 L of water. An immunofluorescence assay was used for detection of Giardia and Cryptosporidium. Toxoplasma oocysts were identified by morphology. The mean percent recovery over 10 repetitions of the entire method, in 10 samples spiked with ten parasites and read by three different observers, was 71.3 ± 12 for Cryptosporidium, 63 ± 10 for Giardia and 91.6 ± 9 for Toxoplasma, and the relative standard deviation of the method was 17.5, 17.2 and 9.8, respectively. Intraobserver variation, as measured by the intraclass correlation coefficient, was fair for Toxoplasma, moderate for Cryptosporidium and almost perfect for Giardia. The method was then applied to 77 samples of raw and drinking water from three different water treatment plants. Cryptosporidium was found in 28 of 77 samples (36%) and Giardia in 31 of 77 samples (40%). These results identified significant differences in the ability of the treatment processes to reduce the presence of Giardia and Cryptosporidium. In conclusion, the formalin/ether method to concentrate protozoa in water is a new alternative for low-resource countries, where there is an urgent need to monitor the presence of these protozoa in drinking water. PMID:27219047

  17. Minimally invasive blood sampling method for genetic studies on Gopherus tortoises

    Directory of Open Access Journals (Sweden)

    García–Feria, L. M.

    2015-04-01

    Full Text Available Obtaining good quality tissue samples is the first hurdle in any molecular study. This is especially true for studies involving the management and conservation of wild fauna. In the case of tortoises, the most common source of DNA is blood samples. However, only a minimal amount of blood is required for PCR assays. Samples are obtained mainly from the brachial and jugular veins after restraining the animal chemically, or from conscious individuals by severe handling methods and clamping. Herein, we present a minimally invasive technique that has proven effective for extracting small quantities of blood suitable for genetic analyses. Furthermore, the samples obtained yielded better DNA amplification than other cell sources, such as cloacal epithelium cells. After two years of use on wild tortoises, this technique has proved to be harmless. We suggest that sampling a small amount of blood could also be useful for other types of analyses, such as physiological and medical monitoring.

  18. Study of performance characteristics of a radiochemical method to determine uranium in biological samples

    International Nuclear Information System (INIS)

    This paper describes a methodology for calculating the detection limit (Ld), the quantification level (Lq) and the minimum detectable activity (MDA) in a radiochemical method for the determination of uranium in urine samples. The concentration is measured by fluorimetry and the gross alpha activity by liquid scintillation counting (LSC). The calculation of the total propagated uncertainty for a spiked sample is presented. Furthermore, the major sources of uncertainty and their percentage contributions to both measurements are assessed. (author)
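
    As a worked illustration of the quantities the abstract names, the sketch below uses the widely cited Currie expressions for the decision level, detection limit and minimum detectable activity of a counting measurement. The exact formulation and the numerical inputs (blank counts, efficiency, counting time, chemical yield, sample quantity) are assumptions for illustration, not values taken from the paper.

```python
import math

def currie_limits(b_counts):
    """Currie's decision level and detection limit (in counts) for a paired blank."""
    lc = 2.33 * math.sqrt(b_counts)          # critical (decision) level
    ld = 2.71 + 4.65 * math.sqrt(b_counts)   # detection limit
    return lc, ld

def mda(b_counts, efficiency, count_time_s, yield_chem, sample_quantity):
    """Minimum detectable activity (Bq per unit of sample), Currie-style."""
    ld = 2.71 + 4.65 * math.sqrt(b_counts)
    return ld / (efficiency * count_time_s * yield_chem * sample_quantity)

# Hypothetical LSC measurement of a urine sample
print(currie_limits(b_counts=40))
print(mda(b_counts=40, efficiency=0.30, count_time_s=3600,
          yield_chem=0.85, sample_quantity=0.5))
```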

  19. Photothermal method using a pyroelectric sensor for thermophysical characterization of agricultural and biological samples

    Science.gov (United States)

    Frandas, A.; Dadarlat, Dorin; Chirtoc, Mihai; Jalink, Henk; Bicanic, Dane D.; Paris, D.; Antoniow, Jean S.; Egee, Michel; Ungureanu, Costica

    1998-07-01

    The photopyroelectric method in different experimental configurations was used for the thermophysical characterization of agricultural and biological samples. The study is important because of the relation of thermal parameters to the quality of foodstuffs (connected to their preservation, storage and adulteration), to migration profiles in biodegradable packages, and to the mechanism of desiccation tolerance of seeds. Results are presented on the measurement of thermal parameters and their dependence on temperature and water content for samples such as honey, starch and seeds.

  20. An analytical method for the determination of plutonium in autopsy samples

    International Nuclear Information System (INIS)

    A sensitive method for the determination of plutonium in autopsy samples is described. After a suitable chemical pretreatment of the samples, the plutonium is separated by extraction chromatography with tri-n-octylphosphine oxide (TOPO) supported on microporous polyethylene. After electrodeposition of the plutonium, the activity is counted by alpha spectroscopy. The overall yield was 75-80%. The reagent blank activity was low enough to allow the determination of a few femtocuries of plutonium

  1. A method for disaggregating clay concretions and eliminating formalin smell in the processing of sediment samples

    DEFF Research Database (Denmark)

    Cedhagen, Tomas

    1989-01-01

    A complete handling procedure for processing sediment samples is described. It includes some improvements of conventional methods. The fixed sediment sample is mixed with a solution of the alkaline detergent AJAX® (Colgate-Palmolive) and kept at 80-90 °C for 20-40 min. This treatment facilitates subsequent sorting, as it disaggregates clay concretions and faecal pellets but leaves even fragile organisms clean and unaffected. The ammonia in the detergent eliminates the formalin smell.

  2. Evaluation of micro-colorimetric lipid determination method with samples prepared using sonication and accelerated solvent extraction methods

    Science.gov (United States)

    Two common laboratory extraction techniques were evaluated for routine use with the micro-colorimetric lipid determination method developed by Van Handel (1985) [E. Van Handel, J. Am. Mosq. Control Assoc. 1(1985) 302] and recently validated for small samples by Inouye and Lotufo ...

  3. A two-step semiparametric method to accommodate sampling weights in multiple imputation.

    Science.gov (United States)

    Zhou, Hanzhi; Elliott, Michael R; Raghunathan, Trviellore E

    2016-03-01

    Multiple imputation (MI) is a well-established method to handle item nonresponse in sample surveys. Survey data obtained from complex sampling designs often involve features that include unequal probability of selection. MI requires imputation to be congenial, that is, the imputations should come from a Bayesian predictive distribution, the observed and complete data estimator should equal the posterior mean given the observed or complete data, and similarly the observed and complete variance estimator should equal the posterior variance given the observed or complete data; more colloquially, the analyst and imputer make similar modeling assumptions. Yet multiply imputed data sets from complex sample designs with unequal sampling weights are typically imputed under simple random sampling assumptions and then analyzed using methods that account for the sampling weights. This is a setting in which the analyst assumes more than the imputer, which can lead to biased estimates and anti-conservative inference. Less commonly used alternatives, such as including case weights as predictors in the imputation model, typically require interaction terms for more complex estimators such as regression coefficients, and can be vulnerable to model misspecification and difficult to implement. We develop a simple two-step MI framework that accounts for sampling weights using a weighted finite population Bayesian bootstrap method to validly impute the whole population (including item nonresponse) from the observed data. In the second step, having generated posterior predictive distributions of the entire population, we use standard IID imputation to handle the item nonresponse. Simulation results show that the proposed method has good frequentist properties and is robust to model misspecification compared to alternative approaches. We apply the proposed method to accommodate missing data in the Behavioral Risk Factor Surveillance System when estimating means and parameters of
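
    The sketch below illustrates the first step only, under explicit assumptions: a weighted Polya-urn resample is used as a simplified stand-in for the weighted finite population Bayesian bootstrap, generating one synthetic population from an unequal-probability sample before standard IID imputation would be applied. Function names, the data and the population size are hypothetical, and the authors' exact algorithm is not reproduced.

```python
import numpy as np

def weighted_fpbb_population(data, weights, N, rng):
    """Draw one synthetic population of size N via a weighted Polya-urn resample.

    data:    observed sample values (n,), may contain np.nan for item nonresponse
    weights: sampling weights (n,), roughly summing to N
    N:       finite population size
    Simplified illustration of the weighted finite population Bayesian
    bootstrap idea, not the authors' exact algorithm.
    """
    n = len(data)
    mass = np.asarray(weights, dtype=float).copy()  # urn starts with the weights
    pop_idx = list(range(n))                        # each observed unit enters once
    for _ in range(N - n):
        p = mass / mass.sum()
        i = rng.choice(n, p=p)
        pop_idx.append(i)
        mass[i] += 1.0                              # Polya reinforcement
    return np.asarray(data)[pop_idx]

# Hypothetical unequal-probability sample with two missing responses
rng = np.random.default_rng(7)
y = np.array([2.1, 3.4, np.nan, 5.0, 4.2, np.nan, 3.9, 2.8])
w = np.array([10, 40, 25, 5, 30, 20, 35, 15], dtype=float)
pop = weighted_fpbb_population(y, w, N=int(w.sum()), rng=rng)
# Step 2 (not shown): treat `pop` as an IID data set and apply standard
# multiple imputation to its np.nan entries, repeating over many populations.
print(len(pop), np.nanmean(pop))
```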

  4. Evolving Principles of Office Quality Assurance

    OpenAIRE

    Norman, Lee A.

    1988-01-01

    The application of medical quality assurance principles to ambulatory patient care using the traditional methods of medical chart audit, process review, and physician education has yielded generally disappointing results in improving patient care and physician performance. Newer methods assist physicians by providing patient and medical reference data at the time of a patient's visit. Techniques for tracking treatment outcomes and patients' test results and for providing instructions to patie...

  5. Detection of Acanthamoeba and Toxoplasma in River Water Samples by Molecular Methods in Iran.

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Mahmoudi

    2015-06-01

    Full Text Available Free-living amoebae such as Acanthamoeba species may act as carriers of Cryptosporidium and Toxoplasma oocysts and may thus play an important role in the water-borne transmission of these parasites. In the present study, a loop-mediated isothermal amplification (LAMP) method for the detection of Toxoplasma and a PCR assay for the detection of Acanthamoeba were applied to environmental water samples. A total of 34 samples were collected from surface waters in Guilan Province. Water samples were filtered through membrane filters, followed by DNA extraction. PCR and LAMP were then used to detect the protozoan parasites Acanthamoeba and Toxoplasma, respectively. In total, 30 and 2 of the 34 samples were positive for Acanthamoeba and Toxoplasma oocysts, respectively; two samples were positive for both parasites. The investigated water supplies are contaminated by Toxoplasma (oocysts) and Acanthamoeba, and Acanthamoeba may play an important role in the water-borne transmission of Toxoplasma in the study area. For the first time in Iran, the LAMP protocol was used effectively for the detection of Toxoplasma in surface water samples.

  6. Nonuniform sampling and non-Fourier signal processing methods in multidimensional NMR.

    Science.gov (United States)

    Mobli, Mehdi; Hoch, Jeffrey C

    2014-11-01

    Beginning with the introduction of Fourier Transform NMR by Ernst and Anderson in 1966, time domain measurement of the impulse response (the free induction decay, FID) consisted of sampling the signal at a series of discrete intervals. For compatibility with the discrete Fourier transform (DFT), the intervals are kept uniform, and the Nyquist theorem dictates the largest value of the interval sufficient to avoid aliasing. With the proposal by Jeener of parametric sampling along an indirect time dimension, extension to multidimensional experiments employed the same sampling techniques used in one dimension, similarly subject to the Nyquist condition and suitable for processing via the discrete Fourier transform. The challenges of obtaining high-resolution spectral estimates from short data records using the DFT were already well understood, however. Despite techniques such as linear prediction extrapolation, the achievable resolution in the indirect dimensions is limited by practical constraints on measuring time. The advent of non-Fourier methods of spectrum analysis capable of processing nonuniformly sampled data has led to an explosion in the development of novel sampling strategies that avoid the limits on resolution and measurement time imposed by uniform sampling. The first part of this review discusses the many approaches to data sampling in multidimensional NMR, the second part highlights commonly used methods for signal processing of such data, and the review concludes with a discussion of other approaches to speeding up data acquisition in NMR.
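
    As a small illustration of the nonuniform sampling idea discussed above, the sketch below generates an exponentially biased random sampling schedule on a Nyquist-spaced grid for one indirect dimension. This is one common NUS strategy offered only as an example; the specific schemes and reconstruction methods reviewed in the article are not reproduced, and the spectral width, grid size and bias parameter are hypothetical.

```python
import numpy as np

def nus_schedule(n_grid, n_samples, bias=2.0, rng=None):
    """Pick n_samples of the n_grid Nyquist-grid points in an indirect dimension.

    Early evolution times are favoured exponentially (matched roughly to the
    decaying FID envelope); grid point 0 is always kept. Illustrative only.
    """
    rng = rng or np.random.default_rng()
    idx = np.arange(1, n_grid)
    p = np.exp(-bias * idx / n_grid)
    p /= p.sum()
    chosen = rng.choice(idx, size=n_samples - 1, replace=False, p=p)
    return np.sort(np.concatenate(([0], chosen)))

# Keep 25% of a 256-point indirect dimension; dwell time set by the Nyquist condition
sw_hz = 10000.0                     # hypothetical indirect spectral width
dwell = 1.0 / sw_hz                 # Nyquist sampling interval
schedule = nus_schedule(256, 64, rng=np.random.default_rng(0))
print("evolution times (s):", schedule * dwell)
```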

  7. A two-stage method to determine optimal product sampling considering dynamic potential market.

    Science.gov (United States)

    Hu, Zhineng; Lu, Wei; Han, Bing

    2015-01-01

    This paper develops an optimization model for the diffusion effects of free samples under dynamic changes in the potential market, based on the characteristics of an independent product, and presents a two-stage method to determine the sampling level. The impact analysis of the key factors on the sampling level shows that an increase in the external coefficient or the internal coefficient has a negative influence on the sampling level. The rate of change of the potential market has no significant influence on the sampling level, whereas repeat purchase has a positive one. Using logistic analysis and regression analysis, the global sensitivity analysis examines the interaction of all parameters; this yields a two-stage procedure for assessing the impact of the relevant parameters when they are imprecisely known and for constructing a 95% confidence interval for the predicted sampling level. Finally, the paper provides the operational steps to improve the accuracy of the parameter estimation and an innovative way to estimate the sampling level.
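
    To make the ingredients concrete, the toy simulation below couples free-sample seeding to a Bass-type diffusion with external and internal coefficients, a growing potential market and repeat purchase, and shows how total sales respond to the sampling level. It is a loose illustration under stated assumptions, not the authors' optimization model; every parameter name and the update rule are hypothetical.

```python
def bass_with_samples(p, q, M0, sample_frac, growth, repeat, T=52):
    """Toy Bass-type diffusion with free samples and a changing potential market.

    p, q:        external (innovation) and internal (imitation) coefficients
    M0:          initial potential market size
    sample_frac: fraction of the initial market seeded with free samples
    growth:      weekly growth rate of the potential market
    repeat:      weekly repeat-purchase rate among adopters
    All names and the update rule are illustrative, not the authors' model.
    """
    M, N = M0, sample_frac * M0          # adopters seeded by free samples
    sales = 0.0
    for _ in range(T):
        M *= (1.0 + growth)              # dynamic potential market
        new = (p + q * N / M) * max(M - N, 0.0)
        N += new
        sales += new + repeat * N        # first-time plus repeat purchases
    return sales

# Compare total sales for a few hypothetical sampling levels
for frac in (0.0, 0.02, 0.05, 0.10):
    print(frac, round(bass_with_samples(0.01, 0.3, 1e5, frac, 0.005, 0.1)))
```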

  8. NOISE AMPLIFICATION ANALYSIS AND COMPARISON OF TWO PERIODIC NONUNIFORM SAMPLING RECONSTRUCTION METHODS USED IN DPCA SAR

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The General Sampling Expansion Reconstruction Method (GSERM) and the Digital Spectrum Reconstruction Method (DSRM), which prove effective for reconstructing the azimuth signal of a Displaced Phase Center Aperture (DPCA) Synthetic Aperture Radar (SAR) system from its Periodic Non-Uniform Sampling (PNUS) data sequences, amplify the noise and sidelobe clutter simultaneously in the reconstruction. This paper formulates the relation between the system transfer matrices of the above two methods, gives the properties, such as periodicity, symmetry, and the time-shift property, of their Noise and Sidelobe Clutter Amplification Factor (NSCAF), and shows that DSRM is more sensitive than GSERM in a white noise environment. In addition, criteria based on initial sampling point analysis are suggested for robust PRF selection. Computer simulation results support these conclusions.

  9. An Investigation of the Sequential Sampling Method for Crossdocking Simulation Output Variance Reduction

    CERN Document Server

    Adewunmi, Adrian; Byrne, Mike

    2008-01-01

    This paper investigates the reduction of variance associated with a simulation output performance measure, using the Sequential Sampling method with a minimum number of simulation replications, for a class of JIT (just-in-time) warehousing system called crossdocking. We initially used the Sequential Sampling method to attain a desired 95% confidence interval half-width of ±0.5 for our chosen performance measure (total usage cost, given a mean maximum level of 157,000 pounds and a mean minimum level of 149,000 pounds). From our results, we achieved a 95% confidence interval half-width of ±2.8 for our chosen performance measure (total usage cost, with an average mean value of 115,000 pounds). However, the Sequential Sampling method requires a huge number of simulation replications to reduce the variance of our simulation output value to the target level. Arena (version 11) simulation software was used to conduct this study.
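
    The generic sequential-sampling loop the paper applies can be sketched as follows: keep adding replications until the t-based confidence interval half-width for the output measure reaches the target. The crossdocking model itself is not reproduced; a noisy stand-in output, the target half-width and all parameter values are hypothetical.

```python
import math
import random
from statistics import mean, stdev
from scipy import stats

def sequential_replications(run_replication, target_half_width, alpha=0.05,
                            initial=10, max_reps=10000):
    """Add simulation replications until the CI half-width meets the target.

    run_replication: callable returning one observation of the output measure.
    Generic sequential-sampling loop; the crossdocking model is not reproduced.
    """
    data = [run_replication() for _ in range(initial)]
    while True:
        n = len(data)
        half_width = stats.t.ppf(1 - alpha / 2, n - 1) * stdev(data) / math.sqrt(n)
        if half_width <= target_half_width or n >= max_reps:
            return mean(data), half_width, n
        data.append(run_replication())

# Hypothetical noisy cost output standing in for the Arena crossdocking model
random.seed(3)
est, hw, n = sequential_replications(lambda: random.gauss(115000, 9000), 500.0)
print(f"mean={est:.0f}  half-width={hw:.0f}  replications={n}")
```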

  10. Preparation of Samples for Leaf Architecture Studies, A Method for Mounting Cleared Leaves

    Directory of Open Access Journals (Sweden)

    Alejandra Vasco

    2014-09-01

    Full Text Available Premise of the study: Several recent waves of interest in leaf architecture have shown an expanding range of approaches and applications across a number of disciplines. Despite this increased interest, examination of existing archives of cleared and mounted leaves shows that current methods for mounting, in particular, yield unsatisfactory results and deterioration of samples over relatively short periods. Although techniques for clearing and staining leaves are numerous, published techniques for mounting leaves are scarce. Methods and Results: Here we present a complete protocol and recommendations for clearing, staining, and imaging leaves, and, most importantly, a method to permanently mount cleared leaves. Conclusions: The mounting protocol is faster than other methods, inexpensive, and straightforward; moreover, it yields clear and permanent samples that can easily be imaged, scanned, and stored. Specimens mounted with this method preserve well, with leaves that were mounted more than 35 years ago showing no signs of bubbling or discoloration.

  11. Method of analysis and quality-assurance practices for determination of pesticides in water by solid-phase extraction and capillary-column gas chromatography/mass spectrometry at the U.S. Geological Survey California District Organic Chemistry Laboratory, 1996-99

    Science.gov (United States)

    Crepeau, Kathryn L.; Baker, Lucian M.; Kuivila, Kathryn M.

    2000-01-01

    A method of analysis and quality-assurance practices were developed to study the fate and transport of pesticides in the San Francisco Bay-Estuary by the U.S. Geological Survey. Water samples were filtered to remove suspended-particulate matter and pumped through C-8 solid-phase extraction cartridges to extract the pesticides. The cartridges were dried with carbon dioxide and the pesticides were eluted with three cartridge volumes of hexane:diethyl ether (1:1) solution. The eluants were analyzed using capillary-column gas chromatography/mass spectrometry in full-scan mode. Method detection limits for pesticides ranged from 0.002 to 0.025 microgram per liter for 1-liter samples. Recoveries ranged from 44 to 140 percent for 25 pesticides in samples of organic-free reagent water and Sacramento-San Joaquin Delta and Suisun Bay water fortified at 0.05 and 0.50 microgram per liter. The estimated holding time for pesticides after extraction on C-8 solid-phase extraction cartridges ranged from 10 to 257 days.

  12. Flagging versus dragging as sampling methods for nymphal Ixodes scapularis (Acari: Ixodidae)

    Science.gov (United States)

    Rulison, Eric L.; Kuczaj, Isis; Pang, Genevieve; Hickling, Graham J.; Tsao, Jean I.; Ginsberg, Howard S.

    2013-01-01

    The nymphal stage of the blacklegged tick, Ixodes scapularis (Acari: Ixodidae), is responsible for most transmission of Borrelia burgdorferi, the etiologic agent of Lyme disease, to humans in North America. From 2010 to fall of 2012, we compared two commonly used techniques, flagging and dragging, as sampling methods for nymphal I. scapularis at three sites, each with multiple sampling arrays (grids), in the eastern and central United States. Flagging and dragging collected comparable numbers of nymphs, with no consistent differences between methods. Dragging collected more nymphs than flagging in some samples, but these differences were not consistent among sites or sampling years. The ratio of nymphs collected by flagging vs dragging was not significantly related to shrub density, so habitat type did not have a strong effect on the relative efficacy of these methods. Therefore, although dragging collected more ticks in a few cases, the numbers collected by each method were so variable that neither technique had a clear advantage for sampling nymphal I. scapularis.

  13. Final LDRD report : development of sample preparation methods for ChIPMA-based imaging mass spectrometry of tissue samples.

    Energy Technology Data Exchange (ETDEWEB)

    Maharrey, Sean P.; Highley, Aaron M.; Behrens, Richard, Jr.; Wiese-Smith, Deneille

    2007-12-01

    The objective of this short-term LDRD project was to acquire the tools needed to use our chemical imaging precision mass analyzer (ChIPMA) instrument to analyze tissue samples. This effort was an outgrowth of discussions with oncologists on the need to find the cellular origin of signals in mass spectra of serum samples, which provide biomarkers for ovarian cancer. The ultimate goal would be to collect chemical images of biopsy samples, allowing the chemical images of diseased and nondiseased sections of a sample to be compared. The equipment needed to prepare tissue samples has been acquired and built. This equipment includes a cryo-ultramicrotome for preparing thin sections of samples and a coating unit. The coating unit uses an electrospray system to deposit small droplets of a UV-photoabsorbing compound on the surface of the tissue samples. Both units are operational. The tissue sample must be coated with the organic compound to enable matrix-assisted laser desorption/ionization (MALDI) and matrix-enhanced secondary ion mass spectrometry (ME-SIMS) measurements with the ChIPMA instrument. Initial plans to test the sample preparation using human tissue samples required development of administrative procedures beyond the scope of this LDRD. Hence, it was decided to make two types of measurements: (1) testing the spatial resolution of ME-SIMS by preparing a substrate coated with a mixture of an organic matrix and a bio standard and etching a defined pattern in the coating using a liquid metal ion beam, and (2) preparing and imaging C. elegans worms. Difficulties arose in sectioning the C. elegans for analysis, and funds and time to overcome these difficulties were not available in this project. The facilities are now available for preparing biological samples for analysis with the ChIPMA instrument. Some further investment of time and resources in sample preparation should make this a useful tool for chemical imaging applications.

  14. Analysis of bulk sample of salicylic acid by application of hydrotropic solubilization method

    Directory of Open Access Journals (Sweden)

    Maheshwari R

    2008-01-01

    Full Text Available In the present investigation, the poorly water-soluble drug, salicylic acid has been solubilized using 0.5 M ibuprofen sodium and 2.0 M sodium salicylate solution as hydrotropic agents for the titrimetric analysis precluding the use of organic solvents. Both hydrotropes are economic and pollution-free. The mean percent estimation of salicylic acid estimated in bulk sample by Indian Pharmacopoeial method is 98.78%. The mean percent estimation by ibuprofen sodium method and sodium salicylate method are 99.25% and 98.82%, respectively. The results of analysis by the proposed method are very close to the results of analysis by the standard method. This confirms the accuracy of the proposed method. The proposed method was validated statistically by low values of statistical parameters viz. standard deviation, percent coefficient of variation and standard error. The proposed method is new, accurate, simple and economic.

  15. Analytical Chemistry Laboratory (ACL) procedure compendium. Volume 2, Sample preparation methods

    Energy Technology Data Exchange (ETDEWEB)

    1993-08-01

    This volume contains the interim change notice for sample preparation methods. Covered are: acid digestion for metals analysis, fusion of Hanford tank waste solids, water leach of sludges/soils/other solids, extraction procedure toxicity (simulate leach in landfill), sample preparation for gamma spectroscopy, acid digestion for radiochemical analysis, leach preparation of solids for free cyanide analysis, aqueous leach of solids for anion analysis, microwave digestion of glasses and slurries for ICP/MS, toxicity characteristic leaching extraction for inorganics, leach/dissolution of activated metal for radiochemical analysis, extraction of single-shell tank (SST) samples for semi-VOC analysis, preparation and cleanup of hydrocarbon- containing samples for VOC and semi-VOC analysis, receiving of waste tank samples in onsite transfer cask, receipt and inspection of SST samples, receipt and extrusion of core samples at 325A shielded facility, cleaning and shipping of waste tank samplers, homogenization of solutions/slurries/sludges, and test sample preparation for bioassay quality control program.

  16. Purity calculation method for event samples with two the same particles

    CERN Document Server

    Kuzmin, Valentin

    2016-01-01

    We present a method for two-dimensional background calculation in the analysis of events containing two identical particles observed by a high energy physics detector. The usual two-dimensional integration is replaced by an approximation based on a specially constructed one-dimensional function. The number of signal events is found by subtracting the background from the number of all selected events, which allows the purity of the selected event sample to be calculated. The procedure does not require a hypothesis about the background and signal shapes. The good performance of the purity calculation method is shown on Monte Carlo examples of double J/psi samples.
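
    The final arithmetic the abstract describes is simple and is sketched below; the two-dimensional background estimation itself, which is the substance of the method, is not reproduced, and the event counts are hypothetical.

```python
def sample_purity(n_selected, n_background):
    """Purity of a selected event sample: estimated signal fraction.

    n_selected:   number of events passing the selection
    n_background: background estimated (e.g. from the 1-D approximation)
    """
    n_signal = n_selected - n_background
    return n_signal / n_selected

# Hypothetical double-J/psi selection: 1200 selected events, 340 estimated background
print(sample_purity(1200, 340.0))   # ~0.72
```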

  17. Quality Assurance of Ultrasonic Diagnosis in Breast

    International Nuclear Information System (INIS)

    Sonography is a subjective diagnostic method that depends heavily on the experience of the operator and on the quality of the equipment, and it requires real-time adjustment. Breast screening examination currently consists of clinical examination and mammography. Breast sonography, either as a supplement to mammography or on its own, is indicated for the dense breast, especially in younger women. Breast sonography is especially applicable to Korean women because their breast parenchyma is denser and breast cancer occurs approximately 10 years earlier than in western women. Because breast parenchyma, unlike other body organs such as the liver or the kidney, yields a high rate of false-positive lesions, a quality assurance program for breast sonography is essential to avoid unnecessary breast biopsy. Quality assurance of breast ultrasound involves quality assurance of the equipment, image display and acquisition of clinical images, personnel qualifications, and other aspects such as unification of the lexicon, guidelines for diagnostic examination and the reporting system (the US BI-RADS reporting system), assessment items and organization, education programs, medical audit, certification issues, and medicolegal issues. A breast sonographic quality assurance system should be established before a scheme to initiate governmental medical insurance for breast sonography.

  18. INFORMATION ON STRATEGIES FOR ACHIEVING COMPLIANCE WITH THE GUIDELINES FOR QUALITY ASSURANCE IN NIGERIAN UNIVERSITIES

    OpenAIRE

    Sali, Maryam; Akor, Philip Usman

    2015-01-01

    This paper examines the strategies adopted for achieving compliance with the guidelines for quality assurance in the Nigerian university system, using a descriptive research method. It focuses on the concepts of quality and quality assurance, the quality assurance guidelines in the Nigerian university system, and the strategies adopted by the NUC for achieving compliance with the quality assurance guidelines. The paper also highlights the challenges hindering the effective achievement of compliance with the...

  19. A multi-threshold sampling method for TOF PET signal processing

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Heejong; Kao, Chien-Min; Xie, Q.; Chen, Chin-Tu; Zhou, L.; Tang, F.; Frisch, Henry; Moses, William W.; Choong, Woon-Seng

    2009-02-02

    As an approach to realizing all-digital data acquisition for positron emission tomography (PET), we have previously proposed and studied a multi-threshold sampling method to generate samples of a PET event waveform with respect to a few user-defined amplitudes. In this sampling scheme, one can extract both the energy and timing information for an event. In this paper, we report our prototype implementation of this sampling method and the performance results obtained with this prototype. The prototype consists of two multi-threshold discriminator boards and a time-to-digital converter (TDC) board. Each of the multi-threshold discriminator boards takes one input and provides up to 8 threshold levels, which can be defined by users, for sampling the input signal. The TDC board employs the CERN HPTDC chip that determines the digitized times of the leading and falling edges of the discriminator output pulses. We connect our prototype electronics to the outputs of two Hamamatsu R9800 photomultiplier tubes (PMTs) that are individually coupled to a 6.25 × 6.25 × 25 mm³ LSO crystal. By analyzing waveform samples generated by using four thresholds, we obtain a coincidence timing resolution of about 340 ps and an ~18% energy resolution at 511 keV. We are also able to estimate the decay-time constant from the resulting samples and obtain a mean value of 44 ns with an ~9 ns FWHM. In comparison, using digitized waveforms obtained at a 20 GSps sampling rate for the same LSO/PMT modules we obtain ~300 ps coincidence timing resolution, ~14% energy resolution at 511 keV, and ~5 ns FWHM for the estimated decay-time constant. Details of the results on the timing and energy resolutions by using the multi-threshold method indicate that it is a promising approach for implementing digital PET data acquisition.
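
    One simple way to turn threshold-crossing samples into timing and energy estimates, offered as an illustration rather than the authors' algorithm, is to fit a straight line to the leading-edge crossings and extrapolate it to the baseline for the event time, and to sum time-over-threshold slabs as a pulse-area (energy) proxy. The threshold levels and crossing times below are hypothetical.

```python
import numpy as np

def leading_edge_time(thresholds_mv, crossing_times_ns):
    """Estimate the event start time from a linear fit to leading-edge samples.

    thresholds_mv:     user-defined discriminator levels (mV)
    crossing_times_ns: digitized times at which the pulse crossed each level (ns)
    Extrapolating the fit back to 0 mV gives a timing estimate; illustrative only.
    """
    slope, intercept = np.polyfit(crossing_times_ns, thresholds_mv, 1)
    return -intercept / slope          # time at which the fitted edge crosses 0 mV

def pulse_area_proxy(thresholds_mv, rise_ns, fall_ns):
    """Energy proxy: sum of horizontal slabs (time over threshold x threshold step)."""
    v = np.concatenate(([0.0], np.asarray(thresholds_mv, dtype=float)))
    widths = np.asarray(fall_ns, dtype=float) - np.asarray(rise_ns, dtype=float)
    return float(np.sum(widths * np.diff(v)))

# Hypothetical four-threshold event (levels in mV, crossing times in ns)
thr = [20.0, 60.0, 120.0, 200.0]
rise = [2.1, 2.6, 3.4, 4.9]
fall = [60.0, 48.0, 37.0, 24.0]
print("t0 ~", leading_edge_time(thr, rise), "ns")
print("area proxy ~", pulse_area_proxy(thr, rise, fall), "mV*ns")
```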

  20. Sampling a guide for internal auditors

    CERN Document Server

    Apostolou, Barbara

    2004-01-01

    While it is possible to examine 100 percent of an audit customer's data, the time and cost associated with such a study are often prohibitive. To obtain sufficient, reliable, and relevant information with a limited data set, sampling is an efficient and effective tool. It can help you evaluate the customer's assertions, as well as reach audit conclusions and provide reasonable assurance to your organization. This handbook will help you understand sampling. It also serves as a guide for auditors and students preparing for certification. Topics include: An overview of sampling. Statistical and nonstatistical sampling issues. Sampling selection methods and risks. The pros and cons of popular sampling plans.

  1. Methods of powder sample mounting and their evaluations in XPS analysis

    International Nuclear Information System (INIS)

    Two different methods of mounting CuO powder, using adhesive tape and compressed powder pellets, were compared as to how they influence XPS measurements. Under Ar+ sputtering, the methods showed differences in the extent of selective sputtering of oxygen and in the reduction rate. Over a long period of sputtering, the reduction rate was greater for the powders mounted on tape than for those compressed into pellets. During data acquisition, the amount of carbon on the sample surface increased with time to a greater extent on the tape-mounted samples than on the pellet surface, which is interpreted in terms of contamination from the tape material. In addition, the pellet samples gave nearly twice the signal intensity of the samples on tape and a Cu 2p3/2 peak FWHM smaller by approximately 0.05 eV. These results indicate that the pellet method is better than the tape method in terms of low contamination, signal intensity and energy resolution. (author)

  2. Quality Assurance Project Plan for Facility Effluent Monitoring Plan activities

    Energy Technology Data Exchange (ETDEWEB)

    Frazier, T.P.

    1994-10-20

    This Quality Assurance Project Plan addresses the quality assurance requirements for the activities associated with the Facility Effluent Monitoring Plans, which are part of the overall Hanford Site Environmental Protection Plan. This plan specifically applies to the sampling and analysis activities and continuous monitoring performed for all Facility Effluent Monitoring Plan activities conducted by Westinghouse Hanford Company. It is generic in approach and will be implemented in conjunction with the specific requirements of the individual Facility Effluent Monitoring Plans.

  3. Statistical methods for detecting differentially abundant features in clinical metagenomic samples.

    Directory of Open Access Journals (Sweden)

    James Robert White

    2009-04-01

    Full Text Available Numerous studies are currently underway to characterize the microbial communities inhabiting our world. These studies aim to dramatically expand our understanding of the microbial biosphere and, more importantly, hope to reveal the secrets of the complex symbiotic relationship between us and our commensal bacterial microflora. An important prerequisite for such discoveries is computational tools that are able to rapidly and accurately compare large datasets generated from complex bacterial communities to identify features that distinguish them. We present a statistical method for comparing clinical metagenomic samples from two treatment populations on the basis of count data (e.g. as obtained through sequencing) to detect differentially abundant features. Our method, Metastats, employs the false discovery rate to improve specificity in high-complexity environments, and separately handles sparsely sampled features using Fisher's exact test. Under a variety of simulations, we show that Metastats performs well compared to previously used methods, and significantly outperforms other methods for features with sparse counts. We demonstrate the utility of our method on several datasets, including a 16S rRNA survey of obese and lean human gut microbiomes, COG functional profiles of infant and mature gut microbiomes, and bacterial and viral metabolic subsystem data inferred from random sequencing of 85 metagenomes. The application of our method to the obesity dataset reveals differences between obese and lean subjects not reported in the original study. For the COG and subsystem datasets, we provide the first statistically rigorous assessment of the differences between these populations. The methods described in this paper are the first to address clinical metagenomic datasets comprising samples from multiple subjects. Our methods are robust across datasets of varied complexity and sampling level. While designed for metagenomic applications, our software
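
    The sparse-count branch of the approach can be sketched as below: per-feature Fisher's exact tests on counts pooled within each population, followed by a Benjamini-Hochberg false discovery rate cutoff. This is a minimal illustration under stated assumptions; Metastats' nonparametric t-test for well-sampled features and its handling of within-subject structure are not reproduced, and the count matrices are hypothetical.

```python
import numpy as np
from scipy.stats import fisher_exact

def sparse_feature_pvalues(counts_a, counts_b):
    """Fisher's exact test per feature on counts pooled within each population.

    counts_a, counts_b: (features x subjects) count matrices for the two groups.
    Minimal sketch of the sparse-count branch only.
    """
    tot_a, tot_b = counts_a.sum(), counts_b.sum()
    pvals = []
    for fa, fb in zip(counts_a.sum(axis=1), counts_b.sum(axis=1)):
        table = [[fa, tot_a - fa], [fb, tot_b - fb]]
        pvals.append(fisher_exact(table)[1])
    return np.array(pvals)

def benjamini_hochberg(pvals, q=0.05):
    """Boolean mask of features declared differentially abundant at FDR q."""
    p = np.asarray(pvals)
    order = np.argsort(p)
    m = len(p)
    thresh = q * np.arange(1, m + 1) / m
    passed = p[order] <= thresh
    k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
    mask = np.zeros(m, dtype=bool)
    mask[order[:k]] = True
    return mask

# Hypothetical 16S count matrices: 5 features x 4 subjects per group
rng = np.random.default_rng(0)
a = rng.poisson(3, size=(5, 4)); a[0] += 20      # feature 0 enriched in group A
b = rng.poisson(3, size=(5, 4))
p = sparse_feature_pvalues(a, b)
print(p, benjamini_hochberg(p, q=0.05))
```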

  4. Radiochemical methods for the determination of subnanogram amounts of cadmium in environmental samples

    Energy Technology Data Exchange (ETDEWEB)

    Shamaev, V.I.

    1986-02-01

    A radiochemical method has been developed for the determination of cadmium, based on an interpolation method with the addition of an interfering element (zinc). Using extraction with dithizone in chloroform from alkaline media, cadmium can be determined with a detection limit of about 2×10⁻¹⁰ M and quite high selectivity. Combining the method with a preliminary substoichiometric concentration step reduces the detection limit to about 2×10⁻¹¹ M and significantly increases the selectivity. The method was used to determine cadmium in environmental samples.

  5. INVESTIGATION OF THE TOTAL ORGANIC HALOGEN ANALYTICAL METHOD AT THE WASTE SAMPLING AND CHARACTERIZATION FACILITY

    Energy Technology Data Exchange (ETDEWEB)

    JG DOUGLAS; HK MEZNARICH, PHD; JR OLSEN; GA ROSS PHD; M STAUFFER

    2009-02-13

    Total organic halogen (TOX) is used as a parameter to screen groundwater samples at the Hanford Site. Trending is done for each groundwater well, and changes in TOX and other screening parameters can lead to costly changes in the monitoring protocol. The Waste Sampling and Characterization Facility (WSCF) analyzes groundwater samples for TOX using the United States Environmental Protection Agency (EPA) SW-846 method 9020B (EPA 1996a). Samples from the Soil and Groundwater Remediation Project (SGRP) are submitted to the WSCF for analysis without information regarding the source of the sample; each sample is in essence a 'blind' sample to the laboratory. Feedback from the SGRP indicated that some of the WSCF-generated TOX data from groundwater wells had a number of outlier values based on the historical trends (Anastos 2008a). Additionally, analysts at WSCF observed inconsistent TOX results among field sample replicates. Therefore, the WSCF laboratory investigated the TOX analysis to determine the cause of the outlier data points. Two causes were found that contributed to generating out-of-trend TOX data: (1) the presence of inorganic chloride in the groundwater samples: at inorganic chloride concentrations greater than about 10 parts per million (ppm), apparent TOX values increase with increasing chloride concentration; a parallel observation is the increase in apparent breakthrough of TOX from the first to the second activated-carbon adsorption tube with increasing inorganic chloride concentration. (2) During the sample preparation step, excessive purging of the adsorption tubes with oxygen pressurization gas after sample loading may cause channeling in the activated-carbon bed. This channeling leads to poor removal of inorganic chloride during the subsequent wash step with aqueous potassium nitrate. The presence of this residual inorganic chloride then produces erroneously high TOX values. Changes in sample preparation were studied

  6. A sampling method for estimating the accuracy of predicted breeding values in genetic evaluation

    Directory of Open Access Journals (Sweden)

    Laloë Denis

    2001-09-01

    Full Text Available Abstract A sampling-based method for estimating the accuracy of estimated breeding values obtained with an animal model is presented. Empirical variances of true and estimated breeding values were estimated from a simulated n-sample. The method was validated using a small data set from the Parthenaise breed, with the estimated coefficient of determination converging to the true values. It was applied to the French Salers data file used for the 2000 on-farm evaluation (IBOVAL) of muscle development score. A drawback of the method is its computational demand; consequently, convergence cannot be achieved in a reasonable time for very large data files. Two advantages of the method are that (a) it is applicable to any model (animal, sire, multivariate, maternal effects, ...) and (b) it supplies off-diagonal coefficients of the inverse of the mixed model equations and can therefore be the basis of connectedness studies.

  7. Methods for simultaneous detection of the cyanotoxins BMAA, DABA, and anatoxin-a in environmental samples.

    Science.gov (United States)

    Al-Sammak, Maitham Ahmed; Hoagland, Kyle D; Snow, Daniel D; Cassada, David

    2013-12-15

    Blue-green algae, also known as cyanobacteria, can produce several different groups of toxins in the environment, including hepatotoxins (microcystins), the neurotoxic non-protein amino acids β-methylamino-l-alanine (BMAA) and 2,4-diaminobutyric acid (DABA), as well as the bicyclic amine alkaloid anatoxin-a. Few studies have addressed the methods necessary for an accurate determination of cyanotoxins in environmental samples, and none have been published that can detect these cyanotoxins together in a single sample. Cyanotoxins occur in a wide range of environmental samples including water, fish, and aquatic plant samples. Using polymeric cation exchange solid phase extraction (SPE) coupled with liquid chromatography and fluorescence detection (HPLC/FD), and liquid chromatography ion trap tandem mass spectrometry (LC/MS/MS), these compounds can for the first time be simultaneously quantified in a variety of environmental sample types. The extraction method for biological samples can distinguish bound and free cyanotoxins. Detection limits for water ranged from 5 to 7 μg/L using HPLC/FD, while detection limits for LC/MS/MS were in the range of 0.8-3.2 μg/L. PMID:24140919

  8. A New Method for Noninvasive Genetic Sampling of Saliva in Ecological Research.

    Science.gov (United States)

    Lobo, Diana; Godinho, Raquel; Álvares, Francisco; López-Bao, José V; Rodríguez, Alejandro

    2015-01-01

    Noninvasive samples for genetic analyses have become essential to address ecological questions. Popular noninvasive samples such as faeces contain degraded DNA, which may compromise genotyping success. Saliva is an excellent alternative DNA source, but the scarcity of suitable collection methods makes its use anecdotal in field ecological studies. We developed a noninvasive collection method that combines baits and porous materials able to capture saliva. We report its potential under optimal conditions, using confined dogs and collecting saliva soon after deposition. DNA concentration in the saliva extracts was generally high (mean 14 ng μl⁻¹). We correctly identified individuals in 78% of samples conservatively using ten microsatellite loci, and in 90% of samples using only eight loci. Consensus genotypes closely matched reference genotypes obtained from hair DNA (99% of identification successes and 91% of failures). The mean genotyping effort needed for identification using ten loci was 2.2 replicates. Genotyping errors occurred at a very low frequency (allelic dropout: 2.3%; false alleles: 1.5%). Individual identification success increased with the duration of substrate handling inside the dog's mouth and with the volume of saliva collected. Low identification success was associated with baits rich in DNA-oxidant polyphenols and with low DNA concentrations. Compared with other noninvasive methods, this approach could advantageously allow detection of socially low-ranked individuals underrepresented in sources of DNA that are involved in marking behaviour (faeces or urine). Once adapted and refined, this technique holds promise for potentially high rates of individual identification in ecological field studies requiring noninvasive sampling of wild vertebrates.

  9. Ballistic quality assurance

    International Nuclear Information System (INIS)

    This review describes ballistic quality assurance for stereotactic intracranial irradiation treatments delivered with the Gamma Knife® or with dedicated or adapted medical linear accelerators. Specific and periodic controls should be performed in order to check the mechanical stability of both the irradiation and collimation systems. Although this step remains the responsibility of the medical physicist, it should be carried out in agreement with the manufacturer's technical support. At this time, there are no recent published guidelines. With technological developments, both frequency and accuracy should be assessed in each institution according to the treatment mode: single versus hypo-fractionated dose, circular collimators versus micro-multileaf collimators. In addition, 'end-to-end' techniques are mandatory to find the origin of potential discrepancies and to estimate the global ballistic accuracy of the delivered treatment, since they include frames, non-invasive immobilization devices, localizers, multimodal imaging for delineation, and in-room positioning imaging systems. The final precision that can reasonably be achieved is about ±1 mm. (authors)

  10. Quality assurance during site construction

    International Nuclear Information System (INIS)

    During the planning and construction of a nuclear power plant, the following procedure has proved effective: - the supplier of the nuclear power plant provides the documents defining the quality assurance program, i.e. it is responsible for writing the safety analysis report, the specifications for the erection of the components, the working manuals and the testing specifications (e.g. for non-destructive testing) - the manufacturing of components or systems is controlled by the manufacturer's own independent quality assurance group, provided that this group has been audited by the quality assurance group of the applicant - the TUeV, with its independent assessors, defines the quality assurance requirements in its assessment; on this basis, the applicant's specifications, working manuals and testing specifications are examined. The effectiveness of quality assurance at the manufacturer and at the applicant is checked by the TUeV specialists by reviewing specifications for modifications, repairs or tolerances. A central point of quality assurance in Germany is the dynamic adjustment of an activity to the latest state of engineering and science: if rules or guidelines change, the quality assurance requirements have to be adapted to this state as far as is technically feasible. (orig./RW)

  11. Reaction sampling and reactivity prediction using the stochastic surface walking method.

    Science.gov (United States)

    Zhang, Xiao-Jie; Liu, Zhi-Pan

    2015-01-28

    The prediction of chemical reactivity and thus the design of new reaction systems are the key challenges in chemistry. Here, we develop an unbiased general-purpose reaction sampling method, the stochastic surface walking based reaction sampling (SSW-RS) method, and show that the new method is a promising solution for reactivity prediction of complex reaction systems. The SSW-RS method is capable of sampling both the configuration space of the reactant and the reaction space of pathways, owing to the combination of two recently developed theoretical methods, namely, the stochastic surface walking (SSW) method for potential energy surface (PES) exploration and the double-ended surface walking (DESW) method for building pathways. By integrating with first principles calculations, we show that the SSW-RS method can be applied to investigate the kinetics of complex organic reactions featuring many possible reaction channels and complex hydrogen-bonding networks, as demonstrated here using two examples, epoxypropane hydrolysis in aqueous solution and β-d-glucopyranose decomposition. Our results show that simultaneous sampling of the soft hydrogen-bonding conformations and the chemical reactions involving hard bond making/breaking can be achieved in the SSW-RS simulation, and the mechanism and kinetics can be predicted without a priori information on the system. Unexpected new chemistry for these reactions is revealed and discussed. In particular, despite many possible pathways for β-d-glucopyranose decomposition, the SSW-RS shows that only β-d-glucose and levoglucosan are kinetically preferred direct products and the 5- or 7-member ring products should be secondary products derived from β-d-glucose or levoglucosan. As a general tool for reactivity prediction, the SSW-RS opens a new route for the design of rational reactions. PMID:25503262

  12. Simple and accessible analytical methods for the determination of mercury in soil and coal samples.

    Science.gov (United States)

    Park, Chul Hee; Eom, Yujin; Lee, Lauren Jong-Eun; Lee, Tai Gyu

    2013-09-01

    Simple and accessible analytical methods for the determination of mercury (Hg) in soil and coal samples are proposed as alternatives to conventional methods such as US EPA Method 7471B and ASTM D6414. The new methods consist of fewer steps: the Hg oxidation step is omitted, which in turn eliminates the step needed to reduce excess oxidant. In the proposed methods, Hg extraction is an inexpensive and accessible step that uses a disposable test tube and a heating block instead of an expensive autoclave vessel or a specially designed microwave. Also, common laboratory vacuum filtration was used for the extracts instead of centrifugation. To establish optimal conditions, the best acids for extracting Hg from soil and coal samples were first investigated using certified reference materials (CRMs). Among common laboratory acids (HCl, HNO3, H2SO4, and aqua regia), aqua regia was most effective for the soil CRM, whereas HNO3 was most effective for the coal CRM. Next, the optimal heating temperature and time for Hg extraction were evaluated. The most effective Hg extraction was obtained at 120°C for 30 min for the soil CRM and at 70°C for 90 min for the coal CRM. Further tests using selected CRMs showed that all the measured values were within the allowable certification range. Finally, actual soil and coal samples were analyzed using the new methods and US EPA Method 7473. Relative standard deviation values of 1.71-6.55% for soil and 0.97-12.11% for coal samples were obtained, showing that the proposed methods are not only simple and accessible but also accurate. PMID:23683353

  13. Applications of time series analysis in geosciences: an overview of methods and sample applications

    Directory of Open Access Journals (Sweden)

    W. Gossel

    2013-10-01

    Full Text Available Time series analysis methods are compared based on four geoscientific datasets. New methods such as wavelet analysis, STFT and period scanning bridge the gap between high resolution analysis of periodicities and non-equidistant data sets. The sample studies include not only time series but also spatial data. The application of variograms as an addition to or instead of autocorrelation opens new research possibilities for storage parameters.
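
    As a small companion to the overview, the sketch below computes an empirical semivariogram for an irregularly (non-equidistantly) sampled series, the kind of tool mentioned above as an addition to or replacement for autocorrelation. The synthetic record and lag choices are hypothetical.

```python
import numpy as np

def empirical_variogram(t, z, lags, tol):
    """Empirical semivariogram for an irregularly sampled series.

    gamma(h) = 0.5 * mean of (z_i - z_j)^2 over pairs whose separation
    |t_i - t_j| lies within `tol` of lag h.
    """
    t, z = np.asarray(t, float), np.asarray(z, float)
    dt = np.abs(t[:, None] - t[None, :])
    dz2 = (z[:, None] - z[None, :]) ** 2
    gamma = []
    for h in lags:
        mask = (np.abs(dt - h) <= tol) & (dt > 0)
        gamma.append(0.5 * dz2[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# Hypothetical non-equidistant groundwater-level record over one year
rng = np.random.default_rng(4)
t = np.sort(rng.uniform(0, 365, 200))                  # sampling days
z = np.sin(2 * np.pi * t / 365) + rng.normal(0, 0.2, 200)
print(empirical_variogram(t, z, lags=[10, 30, 90, 180], tol=5))
```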

  14. Analysis of Bulk Sample of Salicylic Acid by Application of Hydrotropic Solubilization Method

    OpenAIRE

    Maheshwari R; Chavada V; Varghese S; Shahoo K

    2008-01-01

    In the present investigation, the poorly water-soluble drug salicylic acid has been solubilized using 0.5 M ibuprofen sodium and 2.0 M sodium salicylate solutions as hydrotropic agents for titrimetric analysis, precluding the use of organic solvents. Both hydrotropes are economical and pollution-free. The mean percent estimation of salicylic acid in the bulk sample by the Indian Pharmacopoeial method is 98.78%. The mean percent estimation by the ibuprofen sodium method and the sodium salicy...

  15. Use of thermal neutron reflection method for chemical analysis of bulk samples

    Energy Technology Data Exchange (ETDEWEB)

    Papp, A., E-mail: papppa@atomki.hu [Institute of Nuclear Research of the Hungarian Academy of Sciences, (ATOMKI), 4001 Debrecen, Pf. 51 (Hungary); Csikai, J. [Institute of Nuclear Research of the Hungarian Academy of Sciences, (ATOMKI), 4001 Debrecen, Pf. 51 (Hungary); Institute of Experimental Physics, University Debrecen (IEP), 4010 Debrecen-10, Pf. 105 (Hungary)

    2014-09-11

    Microscopic, σβ, and macroscopic, Σβ, reflection cross-sections of thermal neutrons averaged over bulk samples as a function of thickness (z) are given. The σβ values are additive even for bulk samples in the z = 0.5–8 cm interval, and so the σβmol(z) function could be given for hydrogenous substances, including some illicit drugs, explosives and hiding materials of ∼1000 cm³ dimensions. The calculated excess counts agree with the measured R(z) values. For the identification of concealed objects and chemical analysis of bulky samples, different neutron methods need to be used simultaneously. - Highlights: • Check the proposed analytical expression for the description of the flux. • Determination of the reflection cross-sections averaged over bulk samples. • Data rendered to estimate the excess counts for various materials.

  16. Sampling methods for rumen microbial counts by Real-Time PCR techniques

    Directory of Open Access Journals (Sweden)

    S. Puppo

    2010-02-01

    Full Text Available Fresh rumen samples were withdrawn from 4 cannulated female buffaloes fed a fibrous diet in order to quantify bacterial concentrations in the rumen by Real-Time PCR techniques. To obtain good-quality DNA from whole rumen fluid, eight (M1-M8) different pre-filtration methods (cheese cloths, glass-fibre and nylon filters) in combination with various centrifugation speeds (1000, 5000 and 14,000 rpm) were tested. Genomic DNA extraction was performed either on fresh or frozen (-20°C) samples. Quantitative bacterial analysis was carried out according to a Real-Time PCR procedure for Butyrivibrio fibrisolvens reported in the literature. M5 proved to be the best sampling procedure, yielding suitable genomic DNA. No differences were revealed between fresh and frozen samples.

  17. Assessment of methods to recover DNA from bacteria, fungi and archaea in complex environmental samples.

    Science.gov (United States)

    Guillén-Navarro, Karina; Herrera-López, David; López-Chávez, Mariana Y; Cancino-Gómez, Máximo; Reyes-Reyes, Ana L

    2015-11-01

    DNA extraction from environmental samples is a critical step for metagenomic analysis to study microbial communities, including those considered uncultivable. Nevertheless, obtaining good quality DNA in sufficient quantities for downstream methodologies is not always possible, and it depends on the complexity and stability of each ecosystem, which could be more problematic for samples from tropical regions because those ecosystems are less stable and more complex. Three laboratory methods for the extraction of nucleic acids from samples representing unstable (decaying coffee pulp and mangrove sediments) and relatively stable (compost and soil) environments were tested. The results were compared with those obtained using two commercial DNA extraction kits. The quality of the extracted DNA was evaluated by PCR amplification to verify the recovery of bacterial, archaeal, and fungal genetic material. The laboratory method that gave the best results used a lysis procedure combining physical, chemical, and enzymatic steps.

  18. Study on a pattern classification method of soil quality based on simplified learning sample dataset

    Science.gov (United States)

    Zhang, Jiahua; Liu, S.; Hu, Y.; Tian, Y.

    2011-01-01

    Given the massive amount of soil information involved in current soil quality grade evaluation, this paper constructs an intelligent classification approach for soil quality grades based on classical sampling techniques and a disordered (nominal) multi-class logistic regression model. In a case study of Longchuan County, Guangdong Province, the learning sample capacity was determined for a given confidence level and estimation accuracy, a c-means algorithm was used to automatically extract a simplified learning sample dataset from the cultivated-soil quality grade evaluation database, a disordered logistic classifier was then built, and the calculation and analysis steps of intelligent soil quality grade classification were laid out. The results indicate that soil quality grades can be effectively learned and predicted from the extracted simplified dataset, offering an alternative to the traditional approach to soil quality grade evaluation. © 2011 IEEE.
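
    To make the two-step workflow described above easier to follow (condense a large evaluation database into a simplified learning sample set, then fit a nominal multi-class logistic classifier), the following Python sketch uses scikit-learn on synthetic data. KMeans stands in for the c-means algorithm, and the features, grade labels and dataset sizes are hypothetical, so this illustrates the general idea rather than the authors' implementation.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
# hypothetical soil-quality features (e.g. organic matter, pH, N, P, K) for a large database
X = rng.normal(size=(5000, 5))
# hypothetical grades 0-3, loosely driven by the features
y = np.digitize(X @ np.array([1.0, 0.5, -0.5, 0.2, 0.0]), bins=[-1.0, 0.0, 1.0])

# Step 1: condense the database into a simplified learning sample dataset by
# keeping the record closest to each cluster centre (KMeans as a stand-in for c-means).
km = KMeans(n_clusters=200, n_init=10, random_state=1).fit(X)
closest = np.array([np.argmin(np.linalg.norm(X - c, axis=1)) for c in km.cluster_centers_])
X_small, y_small = X[closest], y[closest]

# Step 2: fit a nominal ("disordered") multi-class logistic regression on the reduced set
clf = LogisticRegression(max_iter=1000).fit(X_small, y_small)
print("accuracy on the full database:", round(clf.score(X, y), 3))
```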

  19. A new method for automatic discontinuity traces sampling on rock mass 3D model

    Science.gov (United States)

    Umili, G.; Ferrero, A.; Einstein, H. H.

    2013-02-01

    A new automatic method for discontinuity trace mapping and sampling on a rock mass digital model is described in this work. The implemented procedure automatically identifies discontinuity traces on a Digital Surface Model: traces are detected directly as surface breaklines, by means of the maximum and minimum principal curvature values of the vertices that constitute the model surface. Color influence and user errors, which usually characterize trace mapping on images, are eliminated. Trace sampling procedures based on circular windows and circular scanlines have also been implemented: they are used to infer trace data and to calculate values of mean trace length, expected discontinuity diameter and intensity of rock discontinuities. The method is tested on a case study: results obtained by applying the automatic procedure to the DSM of a rock face are compared with those obtained by performing a manual sampling on the orthophotograph of the same rock face.
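
    The sketch below illustrates, in simplified form, the two ingredients described in this record: flagging DSM vertices as trace candidates where a principal curvature exceeds a threshold, and estimating trace intensity from a circular sampling window. It assumes per-vertex principal curvatures have already been computed by a mesh-processing tool; the threshold and the toy values are illustrative, not the authors' parameters.

```python
import numpy as np

def flag_trace_vertices(k_max, k_min, thresh):
    """Flag mesh vertices as candidate discontinuity-trace points when either
    principal curvature magnitude exceeds a threshold (convex or concave edge).
    k_max, k_min: per-vertex maximum and minimum principal curvatures."""
    k_max = np.asarray(k_max)
    k_min = np.asarray(k_min)
    return (k_max > thresh) | (k_min < -thresh)

def trace_intensity(trace_lengths_in_window, radius):
    """Circular-window estimate of trace intensity: total trace length per unit area."""
    return np.sum(trace_lengths_in_window) / (np.pi * radius ** 2)

# toy usage with made-up curvature values and trace lengths
k_max = np.array([0.02, 0.8, 0.05, 1.3])
k_min = np.array([-0.01, -0.9, -0.02, -0.1])
print(flag_trace_vertices(k_max, k_min, thresh=0.5))
print(trace_intensity(trace_lengths_in_window=[2.1, 0.8, 1.4], radius=3.0))
```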

  20. Multicenter validation of PCR-based method for detection of Salmonella in chicken and pig samples

    DEFF Research Database (Denmark)

    Malorny, B.; Cook, N.; D'Agostino, M.;

    2004-01-01

    As part of a standardization project, an interlaboratory trial including 15 laboratories from 13 European countries was conducted to evaluate the performance of a nonproprietary polymerase chain reaction (PCR)-based method for the detection of Salmonella on artificially contaminated chicken rinse... and pig swab samples. The 3 levels were 1-10, 10-100, and 100-1000 colony-forming units (CFU)/100 mL. Sample preparations, including inoculation and pre-enrichment in buffered peptone water (BPW), were performed centrally in a German laboratory; the pre-PCR sample preparation (by a resin-based method...) and PCR assay (gel electrophoresis detection) were performed by the receiving laboratories. Aliquots of BPW enrichment cultures were sent to the participants, who analyzed them using a thermal lysis procedure followed by a validated Salmonella-specific PCR assay. The results were reported as negative...

  1. Energy degeneracies from Broad Histogram Method and Wang-Landau Sampling

    CERN Document Server

    Lima, Alexandre Pereira; Girardi, Daniel

    2016-01-01

    In this work, we present a comparative study of the accuracy provided by Wang-Landau sampling and the Broad Histogram method for estimating the density of states of the two-dimensional Ising ferromagnet. The microcanonical averages used to describe the thermodynamic behaviour and required by the Broad Histogram method were obtained using single-spin-flip Wang-Landau sampling, with attention to convergence issues and accuracy improvements. We compare the results provided by both techniques with the exact ones for thermodynamic properties and critical exponents. Our results reveal that, within Wang-Landau sampling, the Broad Histogram approach provides a better description of the density of states for all cases analysed.
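
    For orientation, a minimal single-spin-flip Wang-Landau sketch for the 2D Ising ferromagnet is given below. The lattice size, flatness criterion and final modification factor are chosen for a quick demonstration rather than to reproduce the paper's settings, and the Broad Histogram bookkeeping is omitted.

```python
import numpy as np

def wang_landau_ising(L=4, ln_f_final=1e-4, flatness=0.8, seed=0):
    """Single-spin-flip Wang-Landau estimate of ln g(E) for the 2D Ising
    ferromagnet with periodic boundaries (energies E = -2N, -2N+4, ..., 2N)."""
    rng = np.random.default_rng(seed)
    N = L * L
    spins = rng.choice([-1, 1], size=(L, L))

    def site_energy(s, i, j):
        return -s[i, j] * (s[(i + 1) % L, j] + s[(i - 1) % L, j]
                           + s[i, (j + 1) % L] + s[i, (j - 1) % L])

    E = sum(site_energy(spins, i, j) for i in range(L) for j in range(L)) // 2
    idx = lambda e: (e + 2 * N) // 4          # energy -> histogram bin
    ln_g = np.zeros(N + 1)                    # ln of the density of states
    ln_f = 1.0                                # ln of the modification factor
    while ln_f > ln_f_final:
        H = np.zeros(N + 1)
        while True:                           # iterate until the histogram is "flat"
            for _ in range(10000):
                i, j = rng.integers(L), rng.integers(L)
                dE = -2 * site_energy(spins, i, j)
                delta = ln_g[idx(E)] - ln_g[idx(E + dE)]
                if delta >= 0 or rng.random() < np.exp(delta):
                    spins[i, j] *= -1
                    E += dE
                ln_g[idx(E)] += ln_f          # update g and H at the *current* energy
                H[idx(E)] += 1
            visited = H > 0
            if H[visited].min() > flatness * H[visited].mean():
                break
        ln_f /= 2.0                           # f -> sqrt(f)
    # normalise so the ground state has degeneracy 2; the two unphysical bins
    # (E = +/-(2N - 4)) are never visited and keep meaningless values
    return ln_g - ln_g[idx(-2 * N)] + np.log(2)

print(wang_landau_ising())
```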

  3. Open tube combustion method of organic samples for stable carbon isotope analysis.

    Science.gov (United States)

    Velivetskaya, Tatiana A; Ignatyev, Alexander V; Reize, Marina V; Kiyashko, Serguei I

    2007-01-01

    A simple and effective method for the conversion of organic carbon into carbon dioxide for analysis of stable carbon isotopes (δ13C) in samples of various organic substances, soils, sedimentary rocks, oils and volatile organic liquids is presented. The conversion of organic carbon of the samples is carried out in a quartz reactor connected to a vacuum line for CO2 freezing and purification. A solid organic sample mixed with CuO is placed at the reactor bottom and the reactor is subsequently filled with granular CuO. One end of the CuO column is preheated to 850°C while the other end of the column in contact with the sample is kept at ambient temperature. Heating of the sample (850°C) and the remainder of the column is then performed. The preheated part of the column provides efficient conversion of carbon into CO2. The reactor for the conversion of volatile liquid organic compounds is filled with granular CuO. The column of CuO is heated to 850°C. Samples of volatile liquids are introduced into the reactor through a septum using a microsyringe. Complete conversion takes 10 min for solid samples and 3 min for volatile liquids. The precision of the δ13C analysis for solid and volatile liquid organic substances is ±0.1‰ and ±0.04‰, respectively.
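
    For context, the reported values follow the standard delta notation; the short snippet below shows how a δ13C value is obtained from a measured 13C/12C ratio. The VPDB reference ratio is the commonly quoted value, and the sample ratio in the example is made up.

```python
R_VPDB = 0.0112372          # commonly quoted 13C/12C ratio of the VPDB standard

def delta13C(r_sample, r_standard=R_VPDB):
    """delta 13C in per mil: relative deviation of the sample 13C/12C ratio
    from the reference standard, multiplied by 1000."""
    return (r_sample / r_standard - 1.0) * 1000.0

print(round(delta13C(0.0110900), 2))   # about -13.1 per mil for this made-up ratio
```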

  4. A review of methods for sampling large airborne particles and associated radioactivity

    International Nuclear Information System (INIS)

    Radioactive particles, tens of μm or more in diameter, are unlikely to be emitted directly from nuclear facilities with exhaust gas cleansing systems, but may arise in the case of an accident or where resuspension from contaminated surfaces is significant. Such particles may dominate deposition and, according to some workers, may contribute to inhalation doses. Quantitative sampling of large airborne particles is difficult because of their inertia and large sedimentation velocities. The literature describes conditions for unbiased sampling and the magnitude of sampling errors for idealised sampling inlets in steady winds. However, few air samplers for outdoor use have been assessed for adequacy of sampling. Many size selective sampling methods are found in the literature but few are suitable at the low concentrations that are often encountered in the environment. A number of approaches for unbiased sampling of large particles have been found in the literature. Some are identified as meriting further study, for application in the measurement of airborne radioactivity. (author)

  5. Sensitivity of the DRP-4DVar Performance to Perturbation Samples Obtained by Two Different Methods

    Institute of Scientific and Technical Information of China (English)

    ZHAO Juan; WANG Bin

    2010-01-01

    The dimension-reduced projection four-dimensional variational data assimilation (DRP-4DVar) approach utilizes an ensemble of historical forecasts to estimate the background error covariance (BEC) and obtains the analysis directly in the ensemble space. As a result, the quality of the ensemble members significantly affects DRP-4DVar performance. Historical-forecast-based initial perturbation samples are flow-dependent and can describe the error-growth pattern of the atmospheric model and the balanced relationships between model variables. However, the ensemble spread is not large enough because of the short time interval between adjacent historical samples and the limited ensemble size. In this study, the BEC of the Weather Research and Forecasting Model (WRF) three-dimensional variational data assimilation (3DVar) system is employed to produce initial perturbation samples for the DRP-4DVar. The control variable perturbation method, based on the structural characteristics of the 3DVar BEC, produces initial perturbation samples with reasonable background error correlations. Moreover, the estimated BEC also shows good dynamic and physical consistency between variables after the initial perturbation samples are developed through a 6- or 12-h model forward integration. In terms of computational expense, the historical forecast results can be obtained without any additional cost at operational numerical weather forecast centers, whereas integrating the samples from the 3DVar-based control variable perturbation method is time-consuming; this difficulty can be alleviated through parallel computing. Although the assimilation run with the historical-forecast-based ensemble generates a slightly better initial analysis field, the forecasts from the assimilation experiment using the 3DVar-based ensemble perform better during the period from 12 to 30 h. Moreover, precipitation is simulated significantly better when the new ensemble is used.
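
    To make the idea of obtaining the analysis directly in the ensemble space concrete, the Python sketch below solves a single-time (3DVar-like) reduced problem in which the analysis increment is a linear combination of ensemble perturbations. The dimensions, the linear observation operator and the error statistics are synthetic assumptions, and the actual DRP-4DVar works with four-dimensional innovations over an assimilation window.

```python
import numpy as np

rng = np.random.default_rng(5)
n, m, p = 200, 30, 40          # state size, ensemble size, number of observations

X_pert = rng.standard_normal((n, m)) / np.sqrt(m - 1)   # ensemble perturbations (columns)
H = rng.standard_normal((p, n)) * 0.1                    # linearised observation operator
R_inv = np.eye(p) / 0.5 ** 2                             # inverse obs-error covariance
d = rng.standard_normal(p)                               # innovation y - H(x_b)

# Analysis increment constrained to the ensemble subspace: dx = X_pert @ w.
# Minimising J(w) = 0.5 w'w + 0.5 (H X w - d)' R^-1 (H X w - d) gives a small m x m system.
HX = H @ X_pert
A = np.eye(m) + HX.T @ R_inv @ HX
w = np.linalg.solve(A, HX.T @ R_inv @ d)
dx = X_pert @ w
print(dx.shape, round(float(np.linalg.norm(dx)), 3))
```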

  6. Highly Effective DNA Extraction Method from Fresh, Frozen, Dried and Clotted Blood Samples

    Directory of Open Access Journals (Sweden)

    Jaleh Barar

    2011-09-01

    Full Text Available Introduction: Today, with the tremendous potential of genomics and other recent advances in science, reliable DNA extraction methods are more relevant than ever before. The ideal genomic DNA extraction process yields large quantities of pure, intact genomic DNA (gDNA) from the sample with minimal co-extraction of inhibitors of downstream processes. Here, we report the development of a very rapid, less hazardous, high-throughput protocol for extracting high-quality DNA from blood samples. Methods: Dried, clotted and ethylenediaminetetraacetic acid (EDTA)-treated fresh and frozen blood samples were extracted with this method, and the quality and integrity of the extracted DNA were corroborated by agarose gel electrophoresis, PCR and restriction enzyme digestion. UV spectrophotometric and gel electrophoresis analyses showed a high A260/A280 ratio (>1.8) and highly intact DNA. Results: PCR and DNA digestion experiments indicated that the final solutions of extracted DNA contained no inhibitory substances, confirming that the isolated DNA is of good quality. Conclusion: The high quality and quantity of DNA obtained, the absence of enzymatic processing steps and the accordingly low cost make this method appropriate for DNA extraction from both human and animal blood samples in any molecular biology lab.

  7. Race and Research Methods Anxiety in an Undergraduate Sample: The Potential Effects of Self-Perception

    Science.gov (United States)

    Eckberg, Deborah A.

    2015-01-01

    This study explores race as a potential predictor of research methods anxiety among a sample of undergraduates. While differences in academic achievement based on race and ethnicity have been well documented, few studies have examined racial differences in anxiety with regard to specific subject matter in undergraduate curricula. This exploratory…

  8. A RAPID GAS-CHROMATOGRAPHIC METHOD FOR THE FINGERPRINTING OF ILLICIT COCAINE SAMPLES

    NARCIS (Netherlands)

    ENSING, JG; RACAMY, C; DEZEEUW, RA

    1992-01-01

    A gas chromatographic (GC) fingerprint method, based on the presence or absence of six congeners, was developed for illicit cocaine samples. The fingerprint utilizes the relative abundances of these congeners towards each other, disregarding cocaine as the main constituent, and can be expressed nume

  9. Comparison of three microbial screening methods for antibiotics using routine monitoring samples

    NARCIS (Netherlands)

    Pikkemaat, M.G.; Rapallini, M.; Oostra, S.; Elferink, J.W.A.

    2009-01-01

    Monitoring large numbers of slaughter animals for the presence of antimicrobial residues is preferably carried out using microbiological screening methods, because of their high cost-effectiveness. An evaluation of the Nouws antibiotic test (NAT) was performed on routine monitoring samples and the p

  10. Measurement of glomerular filtration rate in adults: accuracy of five single-sample plasma clearance methods

    DEFF Research Database (Denmark)

    Rehling, M; Rabøl, A

    1989-01-01

    After an intravenous injection of a tracer that is removed from the body solely by filtration in the kidneys, the glomerular filtration rate (GFR) can be determined from its plasma clearance. The method requires a great number of blood samples but collection of urine is not needed. In the present...

  11. An off-line breath sampling and analysis method suitable for large screening studies

    NARCIS (Netherlands)

    Steeghs, M.M.L.; Cristescu, S.M.; Munnik, P.; Zanen, P.; Harren, F.J.M.

    2007-01-01

    We present a new, off-line breath collection and analysis method, suitable for large screening studies. The breath collection system is based on the guidelines of the American Thoracic Society for the sampling of exhaled NO. Breath containing volatile gases is collected in custom-made black-layered

  12. Evaluation of surface sampling method performance for Bacillus Spores on clean and dirty outdoor surfaces.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, Mollye C.; Einfeld, Wayne; Boucher, Raymond M.; Brown, Gary Stephen; Tezak, Matthew Stephen

    2011-06-01

    Recovery of Bacillus atrophaeus spores from grime-coated and clean surfaces was measured in a controlled chamber study to assess sampling method performance. Outdoor surfaces investigated by wipe and vacuum sampling methods included stainless steel, glass, marble and concrete. Bacillus atrophaeus spores were used as a surrogate for Bacillus anthracis spores in this study, which was designed to assess whether grime-coated surfaces significantly affected surface sampling method performance when compared to clean surfaces. A series of chamber tests was carried out in which known amounts of spores were allowed to gravitationally settle onto both clean and dirty surfaces. Reference coupons were co-located with test coupons in all chamber experiments to provide a quantitative measure of the initial surface concentrations of spores on all surfaces, thereby allowing sampling recovery calculations. Results from these tests, carried out under both low and high humidity conditions, show that spore recovery from grime-coated surfaces is the same as or better than spore recovery from clean surfaces. Statistically significant differences between method performance for grime-coated and clean surfaces were observed in only about half of the chamber tests conducted.

  13. Analysis of aroma compounds of Roselle by Dynamic Headspace Sampling using different preparation methods

    DEFF Research Database (Denmark)

    Juhari, Nurul Hanisah Binti; Varming, Camilla; Petersen, Mikael Agerlin

    2015-01-01

    The influence of different methods of sample preparation on the aroma profiles of dried Roselle (Hibiscus sabdariffa) was studied. The lowest amounts of aroma compounds were recovered by analysis of whole dry calyxes (WD), followed by ground dry (GD), blended together with water (BTW), and ground and t...

  14. Methods and Techniques of Sampling, Culturing and Identifying of Subsurface Bacteria

    International Nuclear Information System (INIS)

    This report describes the sampling, culturing and identification of KURT underground bacteria, which occur as iron-, manganese-, and sulfate-reducing bacteria. The culturing and media-preparation methods differed by bacterial species, as these affect bacterial growth rates. The cultured bacteria will be available for use in various applied experiments and research in the future.

  15. IDENTIFICATION OF SALMONELLA-POSITIVE FECAL SAMPLES USING A 96-WELL MICROCULTURE PLATE TECHNIQUE (RX METHOD)

    Science.gov (United States)

    Conventional Salmonella isolation involves multiple sample transfers to culture media performed by an experienced microbiologist. The Reaction (RX) Plate method, a modification of the RX tube designed by Gailey et al. (2004), consolidates pre-enrichment (buffered peptone water or GN Hajna), enrichm...

  16. A simplified method for determination of radioactive iron in whole-blood samples

    DEFF Research Database (Denmark)

    Bukhave, Klaus; Sørensen, Anne Dorthe; Hansen, M.

    2001-01-01

    in humans. The overall recovery of radioiron from blood is more than 90%, and the coefficient of variation, as judged by the variation in the ratio Fe-55/Fe-59, is on the order of 4%. Combined with whole-body counting of Fe-59 and direct gamma-counting of Fe-59 on blood samples, this method represents...

  17. Comparing Respondent-Driven Sampling and Targeted Sampling Methods of Recruiting Injection Drug Users in San Francisco

    OpenAIRE

    Kral, Alex H.; Malekinejad, Mohsen; Vaudrey, Jason; Martinez, Alexis N.; Lorvick, Jennifer; McFarland, Willi; Raymond, H. Fisher

    2010-01-01

    The objective of this article is to compare demographic characteristics, risk behaviors, and service utilization among injection drug users (IDUs) recruited from two separate studies in San Francisco in 2005, one which used targeted sampling (TS) and the other which used respondent-driven sampling (RDS). IDUs were recruited using TS (n = 651) and RDS (n = 534) and participated in quantitative interviews that included demographic characteristics, risk behaviors, and service utilization. Preval...

  18. Rapid detection of intestinal pathogens in fecal samples by an improved reverse dot blot method

    Institute of Scientific and Technical Information of China (English)

    Jian-Ming Xing; Su Zhang; Ying Du; Dan Bi; Li-Hui Yao

    2009-01-01

    AIM: To develop a new, rapid and accurate reverse dot blot (RDB) method for the detection of intestinal pathogens in fecal samples. METHODS: The 12 intestinal pathogens tested were Salmonella spp., Brucella spp., Escherichia coli O157:H7, Clostridium botulinum, Bacillus cereus, Clostridium perfringens, Vibrio parahaemolyticus, Shigella spp., Yersinia enterocolitica, Vibrio cholerae, Listeria monocytogenes and Staphylococcus aureus. Two universal primers were designed to amplify two variable regions of the bacterial 16S and 23S rDNA genes from all 12 bacterial species tested. Five hundred and forty fecal samples from diarrhea patients were tested using the improved RDB assay. RESULTS: The method could identify the 12 intestinal pathogens specifically, and the detection limit was as low as 10³ CFU. The rate of agreement of the improved RDB assay with the traditional culture method was up to 88.75%. CONCLUSION: The hybridization results indicated that the improved RDB assay developed here is a reliable method for the detection of intestinal pathogens in fecal samples.

  19. Determination of appropriate sampling frequency and time of multiple blood sampling dual exponential method with {sup 99m}Tc-DTPA for calculating GFR

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chung Ho; O, Joo Hyun; Chung, Yong An; Yoo, Le Ryung; Sohn, Hyung Sun; Kim, Sung Hoon; Chung, Soo Kyo; Lee, Hyoung Koo [Catholic University of Korea, Seoul (Korea, Republic of)

    2006-02-15

    To determine the appropriate sampling frequency and times for the multiple-blood-sample dual-exponential method with 99mTc-DTPA for calculating the glomerular filtration rate (GFR), thirty-four patients were studied. Three mCi of 99mTc-DTPA was injected intravenously and blood samples, 5 mL each, were drawn at 9 different times. Using the serum radioactivity measured with a gamma counter, the GFR was calculated by the dual-exponential method and corrected for body surface area. Using 2 chosen data points of serum radioactivity, 15 sets of 2-sample GFR values were calculated; 10 sets of 3-sample GFR and 12 sets of 4-sample GFR values were also calculated. Using the 9-sample GFR as the reference value, the degree of agreement was analyzed with Kendall's τ correlation coefficients, mean difference and standard deviation. Although some of the 2-sample GFR values showed high correlation coefficients, over- or underestimation emerged as renal function changed. The 10-120-240 min 3-sample GFR showed a high correlation coefficient (τ = 0.93), a minimal difference (mean ± SD = -1.784 ± 3.972), and no over- or underestimation as renal function changed. The 4-sample GFR showed no better accuracy than the 3-sample GFR. Over a wide spectrum of renal function, the 10-120-240 min 3-sample GFR could be the best choice for estimating a patient's renal function.
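
    As a rough illustration of the dual-exponential plasma clearance calculation underlying these comparisons, the sketch below fits a biexponential to synthetic plasma samples drawn at nine sampling times and computes GFR as dose divided by the area under the fitted curve. The dose, count rates and body surface area are invented numbers, and the exact corrections used in the study are not reproduced.

```python
import numpy as np
from scipy.optimize import curve_fit

def biexp(t, A, a, B, b):
    return A * np.exp(-a * t) + B * np.exp(-b * t)

# hypothetical plasma activities (counts/min per mL) at nine draw times (minutes)
t = np.array([5, 10, 20, 30, 60, 90, 120, 180, 240], dtype=float)
dose = 2.7e8                               # injected activity, same counting units
rng = np.random.default_rng(3)
c = biexp(t, 9.0e4, 0.12, 2.5e4, 0.011) * rng.normal(1.0, 0.02, t.size)

(A, a, B, b), _ = curve_fit(biexp, t, c, p0=(8e4, 0.1, 2e4, 0.01))
auc = A / a + B / b                        # integral of the fitted curve, 0 to infinity
gfr = dose / auc                           # plasma clearance in mL/min
gfr_bsa = gfr * 1.73 / 1.9                 # normalise to 1.73 m2 for a 1.9 m2 patient
print(round(gfr, 1), "mL/min;", round(gfr_bsa, 1), "mL/min/1.73 m2")
```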

  20. Small Body GN and C Research Report: G-SAMPLE - An In-Flight Dynamical Method for Identifying Sample Mass [External Release Version

    Science.gov (United States)

    Carson, John M., III; Bayard, David S.

    2006-01-01

    G-SAMPLE is an in-flight dynamical method for use by sample collection missions to identify the presence and quantity of collected sample material. The G-SAMPLE method implements a maximum-likelihood estimator to identify the collected sample mass, based on onboard force sensor measurements, thruster firings, and a dynamics model of the spacecraft. With G-SAMPLE, sample mass identification becomes a computation rather than an extra hardware requirement; the added cost of cameras or other sensors for sample mass detection is avoided. Realistic simulation examples are provided for a spacecraft configuration with a sample collection device mounted on the end of an extended boom. In one representative example, a 1000 gram sample mass is estimated to within 110 grams (95% confidence) under realistic assumptions of thruster profile error, spacecraft parameter uncertainty, and sensor noise. For convenience to future mission design, an overall sample-mass estimation error budget is developed to approximate the effect of model uncertainty, sensor noise, data rate, and thrust profile error on the expected estimate of collected sample mass.
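
    The core idea of G-SAMPLE, inferring the collected mass from onboard force measurements, thruster firings and a dynamics model, can be illustrated with a one-axis toy problem: with force and acceleration related by F = (m_sc + m_sample)·a, a Gaussian maximum-likelihood estimate of the total mass reduces to least squares. All numbers below (spacecraft mass, thrust profile, noise levels) are invented, and the real estimator handles full spacecraft dynamics, thruster profile error and parameter uncertainty.

```python
import numpy as np

rng = np.random.default_rng(7)
m_sc = 850.0          # known spacecraft mass, kg (hypothetical)
m_sample = 1.0        # true collected sample mass, kg (to be estimated)

# thruster force profile (N) and resulting acceleration, both measured with noise
F_true = 20.0 + 5.0 * np.sin(np.linspace(0, 6, 500))
a_true = F_true / (m_sc + m_sample)
F_meas = F_true * (1 + 0.01 * rng.standard_normal(F_true.size))   # 1% profile error
a_meas = a_true + 1e-4 * rng.standard_normal(a_true.size)         # accelerometer noise

# Least-squares (Gaussian maximum-likelihood) estimate of total mass from F ~ m*a,
# then subtract the known spacecraft mass to obtain the sample mass.
m_total_hat = np.sum(a_meas * F_meas) / np.sum(a_meas ** 2)
print("estimated sample mass [kg]:", round(m_total_hat - m_sc, 3))
```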