WorldWideScience

Sample records for samples including qc

  1. New QC 7 tools

    International Nuclear Information System (INIS)

    1982-03-01

    This book describes the new QC 7 tools, covering TQC and the new QC 7 tools; the new QC 7 tools for better promotion; what the QC way of thinking is; what the new QC 7 tools are, such as the KJ method, the PDPC method, the arrow diagram method and the matrix diagram method; the application of the new QC 7 tools, including the fields in which to apply them and their application to policy management; the methodology of the new QC 7 tools, including the related rules of the KJ method, matrix data analysis and the PDPC method; and education in and introduction of the new QC 7 tools.

  2. Automated dried blood spots standard and QC sample preparation using a robotic liquid handler.

    Science.gov (United States)

    Yuan, Long; Zhang, Duxi; Aubry, Anne-Francoise; Arnold, Mark E

    2012-12-01

    A dried blood spot (DBS) bioanalysis assay involves many steps, such as the preparation of standard (STD) and QC samples in blood, the spotting onto DBS cards, and the cutting-out of the spots. These steps are labor intensive and time consuming if done manually, which makes automation very desirable in DBS bioanalysis. A robotic liquid handler was successfully applied to the preparation of STD and QC samples in blood and to the spotting of the blood samples onto DBS cards, using buspirone as the model compound. The automated preparation was demonstrated to be accurate and consistent, with accuracy and precision comparable to those of manual preparation. The effect of spotting volume on accuracy was evaluated, and a trend of increasing buspirone concentrations with increasing spotting volumes was observed. The automated STD and QC sample preparation process significantly improved the efficiency, robustness and safety of DBS bioanalysis.

  3. Droplet Digital™ PCR Next-Generation Sequencing Library QC Assay.

    Science.gov (United States)

    Heredia, Nicholas J

    2018-01-01

    Digital PCR is a valuable tool for quantifying next-generation sequencing (NGS) libraries precisely and accurately. Accurate quantification of NGS libraries enables accurate loading of the libraries onto the sequencer, improving sequencing performance by reducing under- and overloading errors. Accurate quantification also benefits users by enabling uniform loading of indexed/barcoded libraries, which in turn greatly improves sequencing uniformity across the indexed/barcoded samples. The advantages gained by employing the Droplet Digital PCR (ddPCR™) library QC assay include precise and accurate quantification as well as size quality assessment, enabling users to QC their sequencing libraries with confidence.
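
    The abstract does not give the underlying math, but droplet digital PCR quantification conventionally rests on a Poisson correction of the positive-droplet fraction. A minimal sketch (the 0.85 nL droplet volume and the function name are illustrative assumptions, not details from this record):

```python
import math

def ddpcr_concentration(n_positive, n_total, droplet_volume_nl=0.85):
    """Estimate target concentration (copies/uL) from droplet counts.

    Uses the standard Poisson correction: lambda = -ln(fraction negative)
    is the mean number of copies per droplet, which is then divided by
    the droplet volume. The 0.85 nL droplet volume is a typical ddPCR
    value, assumed here for illustration.
    """
    if n_positive >= n_total:
        raise ValueError("need at least one negative droplet")
    frac_negative = (n_total - n_positive) / n_total
    lam = -math.log(frac_negative)           # mean copies per droplet
    return lam / (droplet_volume_nl * 1e-3)  # copies per microliter

# e.g. 4000 positive droplets out of 15000 accepted droplets
conc = ddpcr_concentration(4000, 15000)  # ~365 copies/uL
```

    The Poisson correction matters because at high target levels many droplets contain more than one copy, so simply counting positives would undercount.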

  4. Easy QC 7 tools

    International Nuclear Information System (INIS)

    1981-04-01

    This book explains the methods of the QC 7 tools, the mindset for using them, and the effects of applying them. It gives descriptions of graphs; Pareto diagrams, including how to draw and use them; cause-and-effect (characteristic) diagrams; check sheets, covering the purpose and subject of checking, the goals and types of check sheet, and points to note when using them; histograms, including their application and use; stratification; scatter plots; control charts; and methods for promotion and improvement, with practical cases of applying the QC tools.

  5. Easy QC 7 tools

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1981-04-15

    This book explains the methods of the QC 7 tools, the mindset for using them, and the effects of applying them. It gives descriptions of graphs; Pareto diagrams, including how to draw and use them; cause-and-effect (characteristic) diagrams; check sheets, covering the purpose and subject of checking, the goals and types of check sheet, and points to note when using them; histograms, including their application and use; stratification; scatter plots; control charts; and methods for promotion and improvement, with practical cases of applying the QC tools.

  6. Construction QA/QC systems: comparative analysis

    International Nuclear Information System (INIS)

    Willenbrock, J.H.; Shepard, S.

    1980-01-01

    An analysis is presented which compares the quality assurance/quality control (QA/QC) systems adopted in the highway, nuclear power plant, and U.S. Navy construction areas with the traditional quality control approach used in building construction. Full participation and support by the owner, as well as by the contractor and the A/E firm, are required if a QA/QC system is to succeed. Process quality control, acceptance testing and quality assurance responsibilities must be clearly defined in the contract documents, and the owner must audit these responsibilities. A contractor quality control plan, indicating the tasks to be performed and establishing that QA/QC personnel are independent of project time/cost pressures, should be submitted for approval. The architect must develop realistic specifications which consider the natural variability of materials. Acceptance criteria based on random sampling techniques should be used. 27 refs
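
    The acceptance criteria based on random sampling mentioned above can be sketched as a single sampling plan by attributes; the plan parameters (n, c) and function names below are illustrative assumptions, not from the analysis itself:

```python
import random
from math import comb

def accept_lot(defect_flags, n, c, rng=random):
    """Single sampling plan by attributes: inspect n randomly chosen
    items (1 = defective, 0 = conforming) and accept the lot if at
    most c of them are defective."""
    sample = rng.sample(defect_flags, n)
    return sum(sample) <= c

def prob_accept(p, n, c):
    """Operating-characteristic curve: probability of accepting a lot
    with true defect rate p under plan (n, c), binomial model."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))
```

    The OC curve is what lets the owner and contractor negotiate producer's and consumer's risk explicitly instead of arguing over individual test results.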

  7. Results of a QC program on dental radiography in Greece

    International Nuclear Information System (INIS)

    Pappous, George; Kolitsi, Zoi; Pallikarakis, Nikolas; Arvanitakis, Gerasimos

    1998-01-01

    Quality control (QC) was performed on 99 intraoral dental X-ray units, installed in an equal number of dental offices, in the Achaia prefecture, a region of southwest Greece. The QC procedure included the collection of general information, radiation safety checks, checks of the qualitative and quantitative beam characteristics, and film-processing checks, according to internationally established protocols. The collected data are characterised by non-uniformity and in some cases indicate a poor performance level. The results of the study on a representative sample of dental X-ray units help to map the existing situation and may be useful in reviewing and optimising the applied process. (authors)

  8. Results of a QC program on dental radiography in Greece

    Energy Technology Data Exchange (ETDEWEB)

    Pappous, George; Kolitsi, Zoi; Pallikarakis, Nikolas [Medical Physics Department, Patras University, 26 500 Patras (Greece); Arvanitakis, Gerasimos [Achaia branch of Hellenic Dental Association, Pantanasis 70-72, 262 21 (Greece)

    1999-12-31

    Quality control (QC) was performed on 99 intraoral dental X-ray units, installed in an equal number of dental offices, in the Achaia prefecture, a region of southwest Greece. The QC procedure included the collection of general information, radiation safety checks, checks of the qualitative and quantitative beam characteristics, and film-processing checks, according to internationally established protocols. The collected data are characterised by non-uniformity and in some cases indicate a poor performance level. The results of the study on a representative sample of dental X-ray units help to map the existing situation and may be useful in reviewing and optimising the applied process. (authors) 10 refs., 11 figs.

  9. qcML

    DEFF Research Database (Denmark)

    Walzer, Mathias; Pernas, Lucia Espona; Nasso, Sara

    2014-01-01

    provide tools for the calculation of a wide range of quality metrics as well as a database format and interconversion tools, so that existing LIMS systems can easily add relational storage of the quality control data to their existing schema. We here describe the qcML specification, along with possible...... use cases and an illustrative example of the subsequent analysis possibilities. All information about qcML is available at http://code.google.com/p/qcml....

  10. ChronQC: a quality control monitoring system for clinical next generation sequencing.

    Science.gov (United States)

    Tawari, Nilesh R; Seow, Justine Jia Wen; Perumal, Dharuman; Ow, Jack L; Ang, Shimin; Devasia, Arun George; Ng, Pauline C

    2018-05-15

    ChronQC is a quality control (QC) tracking system for clinical implementation of next-generation sequencing (NGS). ChronQC generates time series plots for various QC metrics to allow comparison of current runs to historical runs. ChronQC has multiple features for tracking QC data including Westgard rules for clinical validity, laboratory-defined thresholds and historical observations within a specified time period. Users can record their notes and corrective actions directly onto the plots for long-term recordkeeping. ChronQC facilitates regular monitoring of clinical NGS to enable adherence to high quality clinical standards. ChronQC is freely available on GitHub (https://github.com/nilesh-tawari/ChronQC), Docker (https://hub.docker.com/r/nileshtawari/chronqc/) and the Python Package Index. ChronQC is implemented in Python and runs on all common operating systems (Windows, Linux and Mac OS X). tawari.nilesh@gmail.com or pauline.c.ng@gmail.com. Supplementary data are available at Bioinformatics online.
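
    ChronQC's exact implementation is not shown in the abstract; as a hedged illustration, two of the Westgard rules it tracks can be expressed over a QC time series like this (rule names are the standard Westgard designations; the function name is an assumption):

```python
def westgard_flags(values, mean, sd):
    """Flag common Westgard rule violations in a QC time series.

    A simplified sketch covering two widely used rules:
      1-3s: a single value beyond mean +/- 3 SD (rejection rule)
      2-2s: two consecutive values beyond 2 SD on the same side
    Returns a list of (index, rule_name) violations.
    """
    z = [(v - mean) / sd for v in values]
    flags = []
    for i, zi in enumerate(z):
        if abs(zi) > 3:
            flags.append((i, "1-3s"))
        if i > 0 and abs(zi) > 2 and abs(z[i - 1]) > 2 and zi * z[i - 1] > 0:
            flags.append((i, "2-2s"))
    return flags
```

    In a monitoring system such as this, the flags would be drawn onto the time series plots alongside laboratory-defined thresholds.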

  11. Activity know-how and doctrine of QC circle

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1976-09-15

    This books introduces activity know-how of QC circle giving descriptions of basic of QC circle activities, introduction operation and development and mind of QC circle activities, method for beginning of QC circle activity like, way order, motivation of introduction of QC circle activity, propel method of QC circle activities, such as leadership, brain storming, and rule of QC circle activity, management and propel method for improvement, development of QC circle activities. It also deals with doctrine of basic of QC circle, purpose, self improvement and group activity.

  12. Activity know-how and doctrine of QC circle

    International Nuclear Information System (INIS)

    1976-09-01

    This books introduces activity know-how of QC circle giving descriptions of basic of QC circle activities, introduction operation and development and mind of QC circle activities, method for beginning of QC circle activity like, way order, motivation of introduction of QC circle activity, propel method of QC circle activities, such as leadership, brain storming, and rule of QC circle activity, management and propel method for improvement, development of QC circle activities. It also deals with doctrine of basic of QC circle, purpose, self improvement and group activity.

  13. PGDP [Paducah Gaseous Diffusion Plant]-UF6 handling, sampling, analysis and associated QC/QA and safety related procedures

    International Nuclear Information System (INIS)

    Harris, R.L.

    1987-01-01

    This document is a compilation of Paducah Gaseous Diffusion Plant procedures on UF6 handling, sampling, and analysis, along with associated QC/QA and safety related procedures. It was assembled for transmission by the US Department of Energy to the Korean Advanced Energy Institute as a part of the US-Korea technical exchange program

  14. RNA-SeQC: RNA-seq metrics for quality control and process optimization.

    Science.gov (United States)

    DeLuca, David S; Levin, Joshua Z; Sivachenko, Andrey; Fennell, Timothy; Nazaire, Marc-Danie; Williams, Chris; Reich, Michael; Winckler, Wendy; Getz, Gad

    2012-06-01

    RNA-seq, the application of next-generation sequencing to RNA, provides transcriptome-wide characterization of cellular activity. Assessment of sequencing performance and library quality is critical to the interpretation of RNA-seq data, yet few tools exist to address this issue. We introduce RNA-SeQC, a program which provides key measures of data quality. These metrics include yield, alignment and duplication rates; GC bias, rRNA content, regions of alignment (exon, intron and intragenic), continuity of coverage, 3'/5' bias and count of detectable transcripts, among others. The software provides multi-sample evaluation of library construction protocols, input materials and other experimental parameters. The modularity of the software enables pipeline integration and the routine monitoring of key measures of data quality such as the number of alignable reads, duplication rates and rRNA contamination. RNA-SeQC allows investigators to make informed decisions about sample inclusion in downstream analysis. In summary, RNA-SeQC provides quality control measures critical to experiment design, process optimization and downstream computational analysis. See www.genepattern.org to run online, or www.broadinstitute.org/rna-seqc/ for a command line tool.
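
    As a rough illustration (not RNA-SeQC's actual code), two of the named measures, GC content (the input to GC-bias assessment) and duplication rate, can be computed from raw reads like this:

```python
from collections import Counter

def gc_content(reads):
    """Fraction of G/C bases across all reads (case-insensitive)."""
    gc = total = 0
    for r in reads:
        gc += sum(base in "GCgc" for base in r)
        total += len(r)
    return gc / total if total else 0.0

def duplication_rate(reads):
    """Fraction of reads that are exact duplicates of an earlier,
    identical read (sequence-level duplication only; real tools
    usually work from alignment coordinates instead)."""
    counts = Counter(reads)
    duplicates = sum(c - 1 for c in counts.values())
    return duplicates / len(reads) if reads else 0.0
```
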

  15. Impact and payback of a QA/QC program for steam-water chemistry

    International Nuclear Information System (INIS)

    Lerman, S.I.; Wilson, D.

    1992-01-01

    QA/QC programs for analytical laboratories and in-line instrumentation are essential if we are to have any faith in the data they produce. When the analytes are at trace levels, as they frequently are in a steam-water cycle, the importance of QA/QC increases by an order of magnitude. The cost and resources required by such a program, though worthwhile, are frequently underestimated; QA/QC is much more than running a standard several times a week. This paper discusses some of the essential elements of such a program, compares them to the cost, and points out the impact of not having such a program. RP-2712-3 showed how essential QA/QC is to understanding the limitations of instruments performing trace analysis of water. What it did not do, nor was it intended to do, is discuss how good reliability can be in your own plant. QA programs that include training of personnel, written procedures, and comprehensive maintenance and inventory programs ensure optimum performance of chemical monitors. QC samples run regularly allow plant personnel to respond to poor performance in a timely manner, appropriate to plant demands. Proper data management establishes the precision information necessary to determine how good our measurements are. Generally, the plant has the advantage of a central laboratory to perform corroborative analysis, and a comprehensive QA/QC program will integrate the plant monitoring operations with the central lab. Where trace analysis is concerned, attention to detail becomes paramount. Instrument performance may be below expected levels, and instruments are probably being run at the bottom end of their optimum range. Without QA/QC the plant manager can have no confidence in analytical results, and poor steam-water chemistry can go unnoticed, causing system deterioration. We can't afford to wait for another RP-2712-3 to tell us how good our data are.
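
    The precision information mentioned above is typically derived from repeated QC sample results; a minimal sketch of Shewhart-style control limits follows (the 2 SD warning / 3 SD action convention is a common assumption, not stated in this abstract):

```python
import statistics

def qc_control_limits(qc_results):
    """Derive Shewhart-style control limits from historical QC results.

    Returns the mean, standard deviation, and the warning (2 SD) and
    action (3 SD) limits used to judge subsequent measurements.
    """
    mean = statistics.fmean(qc_results)
    sd = statistics.stdev(qc_results)
    return {
        "mean": mean,
        "sd": sd,
        "warning": (mean - 2 * sd, mean + 2 * sd),
        "action": (mean - 3 * sd, mean + 3 * sd),
    }
```
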

  16. QC-LDPC code-based cryptography

    CERN Document Server

    Baldi, Marco

    2014-01-01

    This book describes the fundamentals of cryptographic primitives based on quasi-cyclic low-density parity-check (QC-LDPC) codes, with a special focus on the use of these codes in public-key cryptosystems derived from the McEliece and Niederreiter schemes. In the first part of the book, the main characteristics of QC-LDPC codes are reviewed, and several techniques for their design are presented, while tools for assessing the error correction performance of these codes are also described. Some families of QC-LDPC codes that are best suited for use in cryptography are also presented. The second part of the book focuses on the McEliece and Niederreiter cryptosystems, both in their original forms and in some subsequent variants. The applicability of QC-LDPC codes in these frameworks is investigated by means of theoretical analyses and numerical tools, in order to assess their benefits and drawbacks in terms of system efficiency and security. Several examples of QC-LDPC code-based public key cryptosystems are prese...
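
    A standard way to build a QC-LDPC parity-check matrix, as reviewed in texts like this one, is to expand an exponent matrix into circulant permutation blocks; the sketch below assumes the common convention that -1 denotes an all-zero block:

```python
import numpy as np

def qc_ldpc_parity_matrix(exponents, p):
    """Expand an exponent matrix into a QC-LDPC parity-check matrix.

    Each entry e of `exponents` is replaced by the p x p identity
    matrix cyclically shifted by e columns (a circulant permutation
    matrix); an entry of -1 denotes the p x p all-zero block.
    """
    I = np.eye(p, dtype=int)
    blocks = [
        [np.zeros((p, p), dtype=int) if e < 0 else np.roll(I, e, axis=1)
         for e in row]
        for row in exponents
    ]
    return np.block(blocks)
```

    The quasi-cyclic structure is what makes these codes attractive for cryptography: a circulant block is fully described by its first row, which shrinks key sizes dramatically compared with random LDPC matrices.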

  17. Monte Carlo generated spectra for QA/QC of automated NAA routine

    International Nuclear Information System (INIS)

    Jackman, K.R.; Biegalski, S.R.

    2007-01-01

    A quality check for an automated system of analyzing large sets of neutron activated samples has been developed. Activated samples are counted with an HPGe detector, in conjunction with an automated sample changer and spectral analysis tools, controlled by the Canberra GENIE 2K and REXX software. After each sample is acquired and analyzed, a Microsoft Visual Basic program imports the results into a template Microsoft Excel file where the final concentrations, uncertainties, and detection limits are determined. Standard reference materials are included in each set of 40 samples as a standard quality assurance/quality control (QA/QC) test. A select group of sample spectra are also visually reviewed to check the peak fitting routines. A reference spectrum was generated in MCNP 4c2 using an F8 (pulse-height) tally with a model of the actual detector used in counting. The detector model matches the detector resolution, energy calibration, and counting geometry. The generated spectrum also contained a radioisotope matrix similar to what was expected in the samples. This spectrum can then be put through the automated system and analyzed along with the other samples. The automated results are then compared to the expected results for QA/QC purposes. (author)
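
    The comparison of automated results for standard reference materials against expected values can be sketched as a z-score test; the 2-sigma acceptance limit is a common convention assumed here, not taken from the abstract:

```python
def srm_zscore(measured, uncertainty, certified, certified_unc):
    """z-score comparing a measured SRM concentration with its
    certified value, combining both uncertainties in quadrature."""
    return (measured - certified) / (uncertainty**2 + certified_unc**2) ** 0.5

def qc_pass(measured, uncertainty, certified, certified_unc, limit=2.0):
    """|z| <= limit is commonly taken as acceptable agreement."""
    return abs(srm_zscore(measured, uncertainty, certified, certified_unc)) <= limit
```
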

  18. Calibration of ARI QC ionisation chambers using the Australian secondary standards for activity

    International Nuclear Information System (INIS)

    Mo, L.; Van Der Gaast, H.A.; Alexiev, D.; Butcher, K.S.A.; Davies, J.

    1999-01-01

    The Secondary Standard Activity Laboratory (SSAL) in ANSTO routinely provides standardised radioactive sources, traceable activity measurements and custom source preparation services to customers. The most important activity carried out is the calibration of the ionisation chambers located in the Quality Control (QC) section of Australian Radioisotopes (ARI). This ensures that their activity measurements are traceable to the Australian primary methods of standardisation. The ARI QC ionisation chambers are calibrated for 99mTc, 67Ga, 131I, 201Tl and 153Sm. The SSAL has a TPA ionisation chamber, which has been directly calibrated against a primary standard for a variety of radionuclides. Calibration factors for this chamber were determined specifically for the actual volumes (5 ml for 99mTc and 131I, 2 ml for 67Ga and 201Tl, and 3 ml for 153Sm) and the type of vial (Wheaton) routinely used at ARI. These calibration factors can be used to accurately measure the activity of samples prepared by ARI, and the samples can subsequently be used to calibrate the QC ionisation chambers. The QC ionisation chambers are re-calibrated biannually.
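
    Applying a chamber calibration factor can be sketched as follows; the unit choices, function name and optional decay correction are illustrative assumptions, not details from this record:

```python
import math

def activity_from_current(current_pA, cal_factor_pA_per_MBq,
                          elapsed_h=0.0, half_life_h=None):
    """Convert an ionisation-chamber reading into activity (MBq).

    cal_factor_pA_per_MBq is the chamber calibration factor for the
    given nuclide, fill volume and vial type. If a half-life is given,
    the result is decay-corrected back to a reference time elapsed_h
    hours before the measurement.
    """
    activity = current_pA / cal_factor_pA_per_MBq
    if half_life_h is not None:
        activity *= math.exp(math.log(2) * elapsed_h / half_life_h)
    return activity
```

    The same calibration factor is only valid for the geometry it was measured in, which is why the abstract stresses the specific volumes and vial type.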

  19. M073: Monte Carlo generated spectra for QA/QC of automated NAA routine

    International Nuclear Information System (INIS)

    Jackman, K.R.; Biegalski, S.R.

    2004-01-01

    A quality check for an automated system of analyzing large sets of neutron activated samples has been developed. Activated samples are counted with an HPGe detector, in conjunction with an automated sample changer and spectral analysis tools, controlled by the Canberra GENIE 2K and REXX software. After each sample is acquired and analyzed, a Microsoft Visual Basic program imports the results into a template Microsoft Excel file where the final concentrations, uncertainties, and detection limits are determined. Standard reference materials are included in each set of 40 samples as a standard quality assurance/quality control (QA/QC) test. A select group of sample spectra are also visually reviewed to check the peak fitting routines. A reference spectrum was generated in MCNP 4c2 using an F8 (pulse-height) tally with a model of the actual detector used in counting. The detector model matches the detector resolution, energy calibration, and counting geometry. The generated spectrum also contained a radioisotope matrix similar to what was expected in the samples. This spectrum can then be put through the automated system and analyzed along with the other samples. The automated results are then compared to the expected results for QA/QC purposes.

  20. Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding

    Science.gov (United States)

    Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.

    2016-03-01

    In this paper, we investigate the joint design of quasi-cyclic low-density parity-check (QC-LDPC) codes for a coded cooperation system with joint iterative decoding at the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and we then describe two types of girth-4 cycles in the QC-LDPC codes employed by the source and relay. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles, of both type I and type II, are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation effectively combines the cooperation gain and the channel coding gain, and outperforms coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of the coded cooperation employing jointly designed QC-LDPC codes is better than that of random LDPC codes and of separately designed QC-LDPC codes over AWGN channels.
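
    Girth-4 cycles of the kind discussed above can be detected directly in an exponent matrix using Fossorier's well-known condition for QC-LDPC codes; this sketch assumes the usual convention that -1 denotes a zero block (the paper's own type I/II classification is not reproduced here):

```python
from itertools import combinations

def has_girth4_cycle(P, N):
    """Check Fossorier's condition for length-4 cycles in a QC-LDPC
    code with exponent matrix P and circulant size N: a 4-cycle exists
    iff P[j1][l1] - P[j1][l2] + P[j2][l2] - P[j2][l1] == 0 (mod N)
    for some row pair (j1, j2) and column pair (l1, l2).
    Entries of -1, denoting all-zero blocks, are skipped.
    """
    rows, cols = len(P), len(P[0])
    for j1, j2 in combinations(range(rows), 2):
        for l1, l2 in combinations(range(cols), 2):
            ps = (P[j1][l1], P[j1][l2], P[j2][l2], P[j2][l1])
            if -1 in ps:
                continue
            if (ps[0] - ps[1] + ps[2] - ps[3]) % N == 0:
                return True
    return False
```
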

  1. Selecting a Risk-Based SQC Procedure for a HbA1c Total QC Plan.

    Science.gov (United States)

    Westgard, Sten A; Bayat, Hassan; Westgard, James O

    2017-09-01

    Recent US practice guidelines and laboratory regulations for quality control (QC) emphasize the development of QC plans and the application of risk management principles. The US Clinical Laboratory Improvement Amendments (CLIA) now includes an option to comply with QC regulations by developing an individualized QC plan (IQCP) based on a risk assessment of the total testing process. The Clinical and Laboratory Standards Institute (CLSI) has provided new practice guidelines for application of risk management to QC plans and statistical QC (SQC). We describe an alternative approach for developing a total QC plan (TQCP) that includes a risk-based SQC procedure. CLIA compliance is maintained by analyzing at least 2 levels of controls per day. A Sigma-Metric SQC Run Size nomogram provides a graphical tool to simplify the selection of risk-based SQC procedures. Current HbA1c method performance, as demonstrated by published method validation studies, is estimated to be 4-Sigma quality at best. Optimal SQC strategies require more QC than the CLIA minimum requirement of 2 levels per day. More complex control algorithms, more control measurements, and a bracketed mode of operation are needed to assure the intended quality of results. A total QC plan with a risk-based SQC procedure provides a simpler alternative to an individualized QC plan. A Sigma-Metric SQC Run Size nomogram provides a practical tool for selecting appropriate control rules, numbers of control measurements, and run size (or frequency of SQC). Applications demonstrate the need for continued improvement of analytical performance of HbA1c laboratory methods.
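
    The sigma-metric underlying the run-size nomogram is conventionally defined as (TEa − |bias|)/CV; a minimal sketch follows (the example numbers are illustrative, chosen only to reproduce the 4-Sigma figure cited above):

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma-metric for a laboratory test: the allowable total error
    minus the absolute bias, divided by the imprecision (CV), with
    all three quantities expressed in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# e.g. TEa = 6%, bias = 1%, CV = 1.25%
sigma = sigma_metric(6.0, 1.0, 1.25)  # -> 4.0
```

    Lower sigma values demand more stringent SQC: more control rules, more control measurements per run, and smaller run sizes, which is exactly the trade-off the nomogram visualises.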

  2. Quality Certification 4 (QC4) for RE4 Performance Plots

    CERN Document Server

    CMS Collaboration

    2013-01-01

    The installation of two new wheels in the end-caps of the RPC system (RE4) is expected during the first LHC Long Shutdown (LS1). The RE4 upgrade project consists of 72 Super Modules (SM), each made of 2 RPC chambers, for a total of 144 double-gap RPC chambers. To ensure the quality of the chambers, several steps have been established for the Quality Certification (QC) of the RPC chamber production: QC1 (for components), QC2 (for gaps), QC3 (for chambers), QC4 (for chambers and super modules) and QC5 (commissioning at P5). The results of the QC4 tests, performed on the new RPCs, are presented in this note.

  3. 40 CFR 98.164 - Monitoring and QA/QC requirements.

    Science.gov (United States)

    2010-07-01

    ... Methods for Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Petroleum Products and... Determination of Carbon, Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by reference, see... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Hydrogen Production § 98.164 Monitoring and QA/QC requirements...

  4. QC in MRI: useful or superfluous?

    Energy Technology Data Exchange (ETDEWEB)

    Infantino, S; Malchair, F [Biomed Engineering, Boncelles (France)

    1995-12-01

    A European task group has developed a protocol for quality control (QC) in MRI. This protocol is essentially based on the control of image properties (uniformity of the signal, signal-to-noise ratio (SNR), resolution, distortion, ...). We applied this protocol to the Magnetom SP (Siemens) of the University Hospital in Liege (Belgium). We used the Siemens multi-purpose phantom, which does not permit a QC as complete and accurate as the test objects used in the protocol. The phantom simulates the magnetic properties of the body. The body and head coils were tested with and without a loading annulus that simulates the body's conductivity. The following results were obtained. Body coil: the signal, SNR, uniformity and artifacts were satisfactory just after a maintenance but had changed significantly and become unacceptable two weeks later. Head coil: the uniformity of the signal and the SNR were satisfactory without the annulus; with the annulus, the signal increased from the right to the left of the phantom by nearly 20%. This came from a lack of uniformity correction in the static field. The other parameters (slice width and spacing, resolution and distortion) were satisfactory. Since the head coil problem did not appear during maintenance, we suggest that the Siemens QC should include observation of intensity profiles. It is also recommended to archive the acquired images so that the evolution of the scanner's performance can be followed.

  5. QC in MRI: useful or superfluous?

    International Nuclear Information System (INIS)

    Infantino, S.; Malchair, F.

    1995-01-01

    A European task group has developed a protocol for quality control (QC) in MRI. This protocol is essentially based on the control of image properties (uniformity of the signal, signal-to-noise ratio (SNR), resolution, distortion, ...). We applied this protocol to the Magnetom SP (Siemens) of the University Hospital in Liege (Belgium). We used the Siemens multi-purpose phantom, which does not permit a QC as complete and accurate as the test objects used in the protocol. The phantom simulates the magnetic properties of the body. The body and head coils were tested with and without a loading annulus that simulates the body's conductivity. The following results were obtained. Body coil: the signal, SNR, uniformity and artifacts were satisfactory just after a maintenance but had changed significantly and become unacceptable two weeks later. Head coil: the uniformity of the signal and the SNR were satisfactory without the annulus; with the annulus, the signal increased from the right to the left of the phantom by nearly 20%. This came from a lack of uniformity correction in the static field. The other parameters (slice width and spacing, resolution and distortion) were satisfactory. Since the head coil problem did not appear during maintenance, we suggest that the Siemens QC should include observation of intensity profiles. It is also recommended to archive the acquired images so that the evolution of the scanner's performance can be followed.
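
    The SNR and uniformity figures reported above can be computed from phantom ROI statistics; the definitions below are common conventions (protocols differ on noise-correction factors), not necessarily those of the European protocol:

```python
import statistics

def snr(signal_roi, noise_roi):
    """SNR as the mean signal in a phantom ROI divided by the standard
    deviation of a background (air) ROI. One of several common
    definitions; some protocols apply a Rayleigh correction factor."""
    return statistics.fmean(signal_roi) / statistics.stdev(noise_roi)

def integral_uniformity(roi_values):
    """Percent integral uniformity: 100 * (1 - (max - min)/(max + min)).
    A ~20% left-right signal gradient would show up here directly."""
    hi, lo = max(roi_values), min(roi_values)
    return 100.0 * (1.0 - (hi - lo) / (hi + lo))
```
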

  6. 40 CFR 98.144 - Monitoring and QA/QC requirements.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98... fraction for each carbonate consumed based on sampling and chemical analysis using an industry consensus... testing method published by an industry consensus standards organization (e.g., ASTM, ASME, API, etc.). ...

  7. Quality control in urodynamics and the role of software support in the QC procedure.

    Science.gov (United States)

    Hogan, S; Jarvis, P; Gammie, A; Abrams, P

    2011-11-01

    This article aims to identify quality control (QC) best practice, to review published QC audits in order to assess how closely good practice is followed, and to carry out a market survey of the QC-support software features offered by urodynamics machines available in the UK. All UK distributors of urodynamic systems were contacted and asked to provide information on the software features relating to data quality in the products they supply. The results of the market survey show that the features offered by manufacturers differ greatly. Automated features, which can be turned off in most cases, include cough recognition, detrusor contraction detection, and high-pressure alerts. There are currently no systems that assess data quality based on published guidelines. A literature review of current QC guidelines for urodynamics was carried out; QC audits were included in the review to see how closely the guidelines were being followed. This review highlights the fact that basic QC is not being carried out effectively by urodynamicists. Based on the software features currently available and the results of the literature review, there is both the need and the capacity for a greater degree of automation of urodynamic data quality and accuracy assessment. Some progress has been made in this area, and certain manufacturers have already developed automated cough detection. Copyright © 2011 Wiley Periodicals, Inc.
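
    Automated cough recognition, as offered by some manufacturers, can be caricatured as transient-spike detection on a pressure trace; this deliberately naive sketch (the threshold and function name are assumptions) is not any vendor's actual algorithm:

```python
def detect_cough_spikes(pressure, threshold=20.0):
    """Flag candidate cough artefacts in a uniformly sampled pressure
    trace (e.g. cmH2O) as samples where the value jumps by more than
    `threshold` relative to the previous sample. Real systems would
    also require the spike to appear in both vesical and abdominal
    channels, since a true detrusor event does not."""
    return [i for i in range(1, len(pressure))
            if abs(pressure[i] - pressure[i - 1]) > threshold]
```
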

  8. Development of QC Procedures for Ocean Data Obtained by National Research Projects of Korea

    Science.gov (United States)

    Kim, S. D.; Park, H. M.

    2017-12-01

    To establish a data management system for ocean data obtained by national research projects of the Ministry of Oceans and Fisheries of Korea, KIOST conducted the standardization of data items and the development of QC procedures. After reviewing and analyzing existing international and domestic ocean-data standards and QC procedures, draft standards and QC procedures were prepared. The proposed standards and QC procedures were reviewed and revised several times by experts in the field of oceanography and by academic societies. A technical report was produced covering standards for 25 data items and 12 QC procedures for physical, chemical, biological and geological data items. The QC procedure for temperature and salinity data was set up by reference to the manuals published by GTSPP, ARGO and IOOS QARTOD. It consists of 16 QC tests applicable to vertical profile data and time series data obtained in real-time mode and delayed mode. Three regional range tests inspecting annual, seasonal and monthly variations were included in the procedure. Three programs were developed to calculate and provide the upper and lower limits of temperature and salinity at depths from 0 to 1550 m. TS data from the World Ocean Database, ARGO, GTSPP and in-house data of KIOST were analysed statistically to calculate regional limits for the Northwest Pacific area. Based on this statistical analysis, the programs calculate regional ranges using the mean and standard deviation on three grid systems (3° grid, 1° grid and 0.5° grid) and provide a recommendation. The QC procedures for the 12 data items were set up during the 1st phase of the national program for data management (2012-2015) and are being applied to national research projects in practice in the 2nd phase (2016-2019). The QC procedures will be revised by reviewing the results of QC application when the 2nd phase of the data management program is completed.
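
    The regional range tests described above can be sketched as mean ± k·SD limits derived per grid cell, combined with a QARTOD-style pass/fail flag; the k value and function names below are illustrative assumptions:

```python
import statistics

def regional_range(values, k=3.0):
    """Regional range limits as mean +/- k standard deviations,
    derived from historical temperature or salinity data falling
    in one grid cell (and, e.g., one month for a monthly test)."""
    m = statistics.fmean(values)
    s = statistics.stdev(values)
    return m - k * s, m + k * s

def range_test(value, lower, upper):
    """QARTOD-style flag: 1 = pass, 4 = fail (out of range)."""
    return 1 if lower <= value <= upper else 4
```

    Running the same test with annual, seasonal and monthly historical subsets gives the three regional range tests mentioned in the abstract.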

  9. A free-volume modification of GEM-QC to correlate VLE and LLE in polymer solutions

    International Nuclear Information System (INIS)

    Radfarnia, H.R.; Taghikhani, V.; Ghotbi, C.; Khoshkbarchi, M.K.

    2004-01-01

    The generalized quasi-chemical (GEM-QC) model proposed by Wang and Vera is modified to better correlate phase equilibria and to overcome the shortcoming of the original model in predicting the lower critical solution temperature (LCST) of binary polymer solutions. This shortcoming arises mainly because the GEM-QC model does not consider the effect of free volume, which is important in systems containing molecules with large size differences. The proposed modification replaces the combinatorial term of the GEM-QC model with a term proposed by Kontogeorgis et al. which includes the effect of free volume. The main advantage of the free-volume generalized quasi-chemical (GEM-QC-FV) model over the original GEM-QC is its ability to predict the phase behaviour of binary polymer solutions exhibiting LCST behaviour. In addition, the free-volume UNIQUAC (UNIQUAC-FV) model is used to correlate VLE and LLE experimental data for binary polymer solutions. Comparison of the results obtained from the GEM-QC-FV model and the UNIQUAC-FV model shows the superiority of the GEM-QC-FV model in correlating the VLE and LLE experimental data for binary polymer solutions.

  10. Accelerator-based boron neutron capture therapy (BNCT) - clinical QA and QC

    International Nuclear Information System (INIS)

    Suzuki, Minoru; Tanaka, Hiroki; Sakurai, Yoshinori; Yong, Liu; Kashino, Genro; Kinashi, Yuko; Masunaga, Shinichiro; Ono, Koji; Maruhashi, Akira

    2009-01-01

    The alpha particle and recoil Li atom yielded by the 10B(n,α)7Li reaction, due to their high-LET properties, efficiently and specifically kill cancer cells that have incorporated the boron. The efficacy of this boron neutron capture therapy (BNCT) has been demonstrated mainly in the treatment of recurrent head/neck and malignant brain cancers at the Kyoto University Research Reactor Institute (KUR). As a clinical trial of BNCT based on an accelerator (rather than the reactor) is to start in 2009, this paper describes a tentative outline of the standard operating procedure of BNCT for its quality assurance (QA) and quality control (QC) along the flow of its clinical practice. Personnel involved in the practice include the attending physician, multiple physicians in charge of BNCT, medical physicists, nurses and reactor staff. The flow of the actual BNCT is as follows: pre-therapeutic evaluation, mainly including informed consent and confirmation of the prescription; therapeutic planning, including setting of the therapy volume and of the irradiation axes, followed by a meeting for staff agreement, decision of the irradiation field in the irradiation room leading to the final decision of the axis, CT for the planning, decision of the final therapeutic plan according to the Japan Atomic Energy Agency Computational Dosimetry System (JCDS), and a meeting of all related personnel for the final confirmation of the therapeutic plan; and BNCT itself, including transport of the patient to KUR, drip infusion of boronophenylalanine, set-up of the patient on the machine, blood sampling for pharmacokinetics, boron level measurement for deciding the irradiation time, switching the accelerator on/off, confirmation of the patient's movement in the irradiated field after the neutron irradiation, blood sampling for confirmation of the boron level, and the patient's leave from the room. The QA/QC checks are principally to be conducted under the two-person rule. The purpose of the clinical trial is to establish the usefulness of BNCT

  11. A Computerized QC Analysis of TLD Glow Curves for Personal Dosimetry Measurements Using TagQC Program

    International Nuclear Information System (INIS)

    Primo, S.; Datz, H.; Dar, A.

    2014-01-01

    The External Dosimetry Lab (EDL) at the Radiation Safety Division at Soreq Nuclear Research Center (SNRC) is ISO 17025 certified and provides its services to approximately 13,000 users throughout the country from various sectors such as medical, industrial and academic. About 95% of the users are monitored monthly for X-ray radiation using Thermoluminescence Dosimeter (TLD) cards that contain three LiF:Mg,Ti elements; the other users, who also work with thermal neutrons, use TLD cards that contain four LiF:Mg,Ti elements. All TLD cards are measured with the Thermo 8800pc reader. A suspicious TLD glow curve (GC) can cause a wrong dose estimation, so the EDL makes great efforts to ensure that each GC undergoes a careful QC procedure. The current QC procedure is performed manually, in several steps and using different software packages and databases, in a long and complicated process: EDL staff needs to export all the results/GCs to be checked to an Excel file, then find the suspicious GCs in a different program (WinREMS); according to the GC shapes (Figure 1 illustrates suitable and suspicious GC shapes) and the ratios between the element result values, the inspecting technician corrects the data. The motivation for developing the new program is the complicated and time-consuming manual procedure applied to the large number of TLDs each month (13,000), similarly to other dosimetry services that use computerized QC GC analysis. It is important to note that only ~25% of the results are above the EDL recording level (0.10 mSv) and need to be inspected. Thus, the purpose of this paper is to describe a new program, TagQC, which allows a computerized QC GC analysis that automatically, swiftly and accurately identifies suspicious TLD GCs
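
    One of the checks mentioned above, the ratio between the element result values of a card, lends itself naturally to automation. The following sketch flags a card whose element readings disagree with each other; the threshold and function names are hypothetical, not the actual criteria used by TagQC:

```python
def element_ratio_suspicious(readings, tol=0.25):
    """Flag a TLD card whose element readings deviate from the card mean
    by more than a fractional tolerance `tol`. The 25% threshold is an
    illustrative assumption, not the EDL's real acceptance criterion."""
    m = sum(readings) / len(readings)
    if m == 0:
        return True
    return any(abs(r - m) / m > tol for r in readings)

ok_card = element_ratio_suspicious([1.02, 0.98, 1.00])   # consistent three-element card
bad_card = element_ratio_suspicious([1.00, 1.05, 2.40])  # one outlier element
```

    A card flagged this way would still be routed to a technician for visual inspection of the glow-curve shape.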

  12. Vision of new generation CRMs for QC of microanalysis

    International Nuclear Information System (INIS)

    Tian Weizhi

    2005-01-01

    Direct analysis of ever smaller solid samples has become one of the trends in modern analytical science, in response to increasing requirements from the life, materials, environmental and other frontier scientific fields. Due to the lack of natural-matrix CRMs certified at matched sample size levels, however, quantitative calibration and quality control have long been a bottleneck of microanalysis. CRMs of a new generation are therefore called for to make solid-sampling microanalysis an accurately quantitative and quality-controllable technique. In this paper, an approach is proposed that uses a combination of several nuclear analytical techniques in the certification of RMs suitable for QC of analyses at sub-ng sample size levels. The technical procedures, the major problems, the possible format of certificates of the new generation of CRMs, and the outlook for the establishment of a QC system for microanalysis are described. The CRMs of the current generation have played an important role in the quality of analysis, especially trace analysis, and in turn in the development of related scientific fields in the 20th century. It may be reasonably predicted that the new generation of CRMs will play a similar role in the quality of microanalysis, and in turn in relevant frontier scientific fields, in the 21st century. Nuclear analytical techniques have made, and will continue to make, unique contributions to both generations of CRMs.

  13. Encoding of QC-LDPC Codes of Rank Deficient Parity Matrix

    Directory of Open Access Journals (Sweden)

    Mohammed Kasim Mohammed Al-Haddad

    2016-05-01

    Full Text Available The encoding of long low-density parity-check (LDPC) codes presents a challenge compared to their decoding. Quasi-cyclic (QC) LDPC codes offer the advantage of reduced complexity for both encoding and decoding due to their QC structure. Most QC-LDPC codes have a rank-deficient parity matrix, which introduces extra complexity over codes with a full-rank parity matrix. In this paper an encoding scheme for QC-LDPC codes is presented that is suitable for codes with either a full-rank or a rank-deficient parity matrix. The extra effort required by codes with a rank-deficient parity matrix over codes with a full-rank parity matrix is investigated.
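
    The quasi-cyclic structure that makes encoding and decoding cheaper can be illustrated by expanding a small base matrix of circulant shift exponents into the full binary parity-check matrix. The base matrix and lifting factor below are a toy example, not a code from the paper:

```python
def circulant(shift, z):
    """z-by-z circulant permutation matrix: the identity cyclically shifted
    by `shift` columns; shift = -1 denotes the all-zero block (a common
    QC-LDPC convention)."""
    if shift < 0:
        return [[0] * z for _ in range(z)]
    return [[1 if (c - r) % z == shift else 0 for c in range(z)] for r in range(z)]

def expand_base(B, z):
    """Expand a base matrix of shift exponents into the binary parity-check
    matrix H, one z-by-z block at a time."""
    H = []
    for brow in B:
        blocks = [circulant(s, z) for s in brow]
        for r in range(z):
            H.append([blk[r][c] for blk in blocks for c in range(z)])
    return H

# Hypothetical 2x4 base matrix with lifting factor z = 3
B = [[0, 1, -1, 2],
     [2, -1, 0, 1]]
H = expand_base(B, z=3)   # 6 x 12 binary parity-check matrix
```

    Because H is fully described by the small exponent matrix B, encoder and decoder hardware only needs to store the shifts, which is the complexity advantage the abstract refers to.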

  14. QC-ART: A tool for real-time quality control assessment of mass spectrometry-based proteomics data.

    Science.gov (United States)

    Stanfill, Bryan A; Nakayasu, Ernesto S; Bramer, Lisa M; Thompson, Allison M; Ansong, Charles K; Clauss, Therese; Gritsenko, Marina A; Monroe, Matthew E; Moore, Ronald J; Orton, Daniel J; Piehowski, Paul D; Schepmoes, Athena A; Smith, Richard D; Webb-Robertson, Bobbie-Jo; Metz, Thomas O; TEDDY Study Group, The Environmental Determinants Of Diabetes In The Young

    2018-04-17

    Liquid chromatography-mass spectrometry (LC-MS)-based proteomics studies of large sample cohorts can easily require from months to years to complete. Acquiring consistent, high-quality data in such large-scale studies is challenging because of normal variations in instrumentation performance over time, as well as artifacts introduced by the samples themselves, such as those due to collection, storage and processing. Existing quality control methods for proteomics data primarily focus on post-hoc analysis to remove low-quality data that would degrade downstream statistics; they are not designed to evaluate the data in near real-time, which would allow for interventions as soon as deviations in data quality are detected. In addition to flagging analyses that demonstrate outlier behavior, evaluating how the data structure changes over time can aid in understanding typical instrument performance or identify issues such as a degradation in data quality due to the need for instrument cleaning and/or re-calibration. To address this gap for proteomics, we developed Quality Control Analysis in Real-Time (QC-ART), a tool for evaluating data as they are acquired in order to dynamically flag potential issues with instrument performance or sample quality. QC-ART has similar accuracy as standard post-hoc analysis methods with the additional benefit of real-time analysis. We demonstrate the utility and performance of QC-ART in identifying deviations in data quality due to both instrument and sample issues in near real-time for LC-MS-based plasma proteomics analyses of a sample subset of The Environmental Determinants of Diabetes in the Young cohort. We also present a case where QC-ART facilitated the identification of oxidative modifications, which are often underappreciated in proteomic experiments. Published under license by The American Society for Biochemistry and Molecular Biology, Inc.
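
    A deliberately simplified stand-in for this kind of near real-time monitoring is a rolling-baseline z-score check on each incoming QC metric. QC-ART itself uses a more sophisticated multivariate model, so the class below only illustrates the streaming idea, with made-up window and threshold values:

```python
from collections import deque
from statistics import mean, stdev

class StreamingQC:
    """Flag incoming QC metric values that deviate from a rolling baseline.
    A toy stand-in for near real-time QC monitoring, not QC-ART's algorithm."""

    def __init__(self, window=20, z_threshold=3.0):
        self.window = deque(maxlen=window)
        self.z_threshold = z_threshold

    def observe(self, value):
        flagged = False
        if len(self.window) >= 5:            # wait for a minimal baseline
            m, s = mean(self.window), stdev(self.window)
            if s > 0 and abs(value - m) / s > self.z_threshold:
                flagged = True
        self.window.append(value)
        return flagged

monitor = StreamingQC(window=10, z_threshold=3.0)
baseline = [100, 101, 99, 100, 102, 98, 100, 101]   # e.g. peptide identifications per run
flags = [monitor.observe(v) for v in baseline] + [monitor.observe(160)]
```

    The point of evaluating each run as it arrives, rather than post hoc, is that the last (deviant) run would trigger an intervention before weeks of degraded data accumulate.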

  15. Utility view on QA/QC of WWER-440 fuel design and manufacture

    International Nuclear Information System (INIS)

    Vesely, P.

    1999-01-01

    In this lecture the legislation implemented in the Czech Republic, the QA/QC system at CEZ, the demonstration and development program (from the purchaser's point of view), audits of the QA/QC system for fuel design and manufacturing, as well as QA/QC records, are discussed

  16. Accounting for human factor in QC and QA inspections

    International Nuclear Information System (INIS)

    Goodman, J.

    1986-01-01

    Two types of human error during QC/QA inspection have been identified. A method of accounting for the effects of human error in QC/QA inspections was developed. The evaluated proportion of discrepant items in the population is affected significantly by the human factor

  17. Quality control for measurement of soil samples containing 237Np and 241Am as radiotracer

    International Nuclear Information System (INIS)

    Sha Lianmao; Zhang Caihong; Song Hailong; Ren Xiaona; Han Yuhu; Zhang Aiming; Chu Taiwei

    2003-01-01

    This paper reports the quality control (QC) for the measurement of soil samples containing 237Np and 241Am as radiotracers in migration tests of transuranic nuclides. All of the QC was done independently by the QA members of the analytical work. It mainly included checking 5%-10% of the total analyzed samples, and preparing blank samples, blind replicate samples and spiked samples used as quality control samples to check the quality of the analytical work
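
    For the spiked samples mentioned above, a standard way to assess analytical quality is percent spike recovery. The acceptance window in this sketch is illustrative, not the limits used in this study:

```python
def spike_recovery(measured, native, spike_added):
    """Percent recovery of a spiked QC sample:
    100 * (measured - native concentration) / amount spiked."""
    return 100.0 * (measured - native) / spike_added

def within_limits(recovery, low=75.0, high=125.0):
    # Acceptance window is illustrative; real limits depend on method and matrix.
    return low <= recovery <= high

# Hypothetical values: native activity 2.0, spike 8.0, measured 9.6 (arbitrary units)
r = spike_recovery(measured=9.6, native=2.0, spike_added=8.0)
accepted = within_limits(r)
```

    Blind replicates are handled analogously, by comparing the two independent results against a precision criterion rather than a recovery window.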

  18. jqcML: an open-source java API for mass spectrometry quality control data in the qcML format.

    Science.gov (United States)

    Bittremieux, Wout; Kelchtermans, Pieter; Valkenborg, Dirk; Martens, Lennart; Laukens, Kris

    2014-07-03

    The awareness that systematic quality control is an essential factor in enabling the growth of proteomics into a mature analytical discipline has increased over the past few years. To this end, a controlled vocabulary and document structure, called qcML, have recently been proposed by Walzer et al. to store and disseminate quality-control metrics for mass-spectrometry-based proteomics experiments. To facilitate the adoption of this standardized quality control routine, we introduce jqcML, a Java application programming interface (API) for the qcML data format. First, jqcML provides a complete object model to represent qcML data. Second, jqcML provides the ability to read, write, and work in a uniform manner with qcML data from different sources, including the XML-based qcML file format and the relational database qcDB. Interaction with the XML-based file format is obtained through the Java Architecture for XML Binding (JAXB), while generic database functionality is obtained by the Java Persistence API (JPA). jqcML is released as open-source software under the permissive Apache 2.0 license and can be downloaded from https://bitbucket.org/proteinspector/jqcml .

  19. NGS QC Toolkit: a toolkit for quality control of next generation sequencing data.

    Directory of Open Access Journals (Sweden)

    Ravi K Patel

    Full Text Available Next-generation sequencing (NGS) technologies provide a high-throughput means to generate large amounts of sequence data. However, quality control (QC) of the sequence data generated by these technologies is extremely important for meaningful downstream analysis. Further, highly efficient and fast processing tools are required to handle the large volume of datasets. Here, we have developed an application, NGS QC Toolkit, for quality checking and filtering of high-quality data. This toolkit is a standalone, open-source application freely available at http://www.nipgr.res.in/ngsqctoolkit.html. All the tools in the application have been implemented in the Perl programming language. The toolkit comprises user-friendly tools for QC of sequencing data generated using the Roche 454 and Illumina platforms, additional tools to aid QC (sequence format converter and trimming tools), and analysis tools (statistics tools). A variety of options have been provided to facilitate QC at user-defined parameters. The toolkit is expected to be very useful for the QC of NGS data and to facilitate better downstream analysis.
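
    A minimal illustration of the kind of quality filtering such a toolkit performs (NGS QC Toolkit itself is written in Perl; this Python analogue is only a sketch) is a mean Phred-score cutoff per read:

```python
def mean_phred(qual_string, offset=33):
    """Mean Phred score of a read, assuming Sanger/Illumina 1.8+ (Phred+33)
    quality encoding."""
    return sum(ord(c) - offset for c in qual_string) / len(qual_string)

def passes_qc(qual_string, min_mean=20, offset=33):
    # min_mean = 20 (99% per-base accuracy) is a common, illustrative cutoff
    return mean_phred(qual_string, offset) >= min_mean

good = passes_qc("IIIIIIIIII")   # 'I' encodes Phred 40
bad = passes_qc("###!!!####")    # '#' encodes Phred 2, '!' encodes Phred 0
```

    Real toolkits add trimming of low-quality read ends and adapter removal on top of this per-read filter.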

  20. Performance Analysis for Cooperative Communication System with QC-LDPC Codes Constructed with Integer Sequences

    Directory of Open Access Journals (Sweden)

    Yan Zhang

    2015-01-01

    Full Text Available This paper presents four different integer sequences for constructing quasi-cyclic low-density parity-check (QC-LDPC) codes within a mathematical framework. The paper describes the coding principle and the encoding procedure. QC-LDPC codes constructed from the four integer sequences are compared with LDPC codes obtained using the PEG algorithm, array codes, and MacKay codes, respectively. The integer-sequence QC-LDPC codes are then used in coded cooperative communication. Simulation results show that the QC-LDPC codes constructed from integer sequences are effective, and their overall performance is better than that of the other types of LDPC codes in coded cooperative communication. The QC-LDPC code constructed from the Dayan integer sequence shows the best performance.

  1. Requirements tests for QC of microSelectron-HDR

    International Nuclear Information System (INIS)

    Gesheva-Atanasova, N.; Gogova, A.; Peycheva, S.; Constantinov, B.; Ganchev, M.

    2000-01-01

    Quality Control (QC) comprises checks and measurements with the purpose of restoring, maintaining and increasing the quality of medical procedures and equipment. QC tests for the microSelectron HDR afterloading machine with 192Ir, which allows more precise calculation and delivery of the tumour dose, have been established and are carried out regularly at the National Oncological Centre, Sofia. This paper covers machine and software performance, source positioning, application equipment and radiation safety. A list of tests, their frequency, tolerance and action levels, as well as the test procedures, has been worked out. The methods used are based on established QC protocols. The documents are archived for a certain period of time and are available at any time. Experience shows a drastic reduction of failures during medical treatment, ensuring the reliability of the equipment used and confidence that all patients are treated adequately. Where a parameter exceeds its tolerance, proper corrective measures can be taken immediately. These QA protocols give assurance that specific objectives are being successfully met

  2. Dehydroabietic Acid Derivative QC4 Induces Gastric Cancer Cell Death via Oncosis and Apoptosis

    Directory of Open Access Journals (Sweden)

    Dongjun Luo

    2016-01-01

    Full Text Available Aim. QC4 is a derivative of rosin's main component, dehydroabietic acid (DHA). We investigated the cytotoxic effect of QC4 on gastric cancer cells and revealed the mechanisms underlying the induction of cell death. Methods. The cytotoxic effect of QC4 on gastric cancer cells was evaluated by CCK-8 assay and flow cytometry. The underlying mechanisms were tested by administration of cell-death-related inhibitors and detection of apoptosis- and oncosis-related proteins. Cytomembrane integrity and organelle damage were confirmed by lactate dehydrogenase (LDH) leakage assay, mitochondrial function test, and cytosolic free Ca2+ concentration detection. Results. QC4 inhibited cell proliferation dose- and time-dependently and destroyed cell membrane integrity, activated calpain-1 autolysis, and induced apoptotic protein cleavage in gastric cancer cells. The detection of decreased ATP and mitochondrial membrane potential, ROS accumulation, and cytosolic free Ca2+ elevation confirmed organelle damage in QC4-treated gastric cancer cells. Conclusions. The DHA derivative QC4 induced damage to the cytomembrane and organelles, which finally leads to oncosis and apoptosis in gastric cancer cells. Therefore, as a derivative of the plant-derived small molecule DHA, QC4 might become a promising agent in gastric cancer therapy.

  3. From Field Notes to Data Portal - A Scalable Data QA/QC Framework for Tower Networks: Progress and Preliminary Results

    Science.gov (United States)

    Sturtevant, C.; Hackley, S.; Lee, R.; Holling, G.; Bonarrigo, S.

    2017-12-01

    Quality assurance and control (QA/QC) is one of the most important yet challenging aspects of producing research-quality data. Data quality issues are multi-faceted, including sensor malfunctions, unmet theoretical assumptions, and measurement interference from humans or the natural environment. Tower networks such as AmeriFlux, ICOS, and NEON continue to grow in size and sophistication, yet tools for robust, efficient, scalable QA/QC have lagged. Quality control remains a largely manual process heavily reliant on visual inspection of data. In addition, notes of measurement interference are often recorded on paper without an explicit pathway to data flagging. As such, an increase in network size requires a near-proportional increase in personnel devoted to QA/QC, quickly stressing the human resources available. We present a scalable QA/QC framework in development for NEON that combines the efficiency and standardization of automated checks with the power and flexibility of human review. This framework includes fast-response monitoring of sensor health, a mobile application for electronically recording maintenance activities, traditional point-based automated quality flagging, and continuous monitoring of quality outcomes and longer-term holistic evaluations. The framework maintains the traceability of quality information along the entirety of the data generation pipeline, and explicitly links field reports of measurement interference to quality flagging. Preliminary results show that data quality can be effectively monitored and managed for a multitude of sites with a small group of QA/QC staff. Several components of this framework are open-source, including an R Shiny application for efficiently monitoring, synthesizing, and investigating data quality issues.

  4. Sci-Fri AM: Quality, Safety, and Professional Issues 05: QC Program Management and the Benefits of QATrack+

    Energy Technology Data Exchange (ETDEWEB)

    Angers, Crystal Plume; Bottema, Ryan; Buckley, Lesley; Studinski, Ryan [The Ottawa Hospital Cancer Centre (Canada)

    2016-08-15

    Purpose: Treatment unit uptime statistics are typically used to monitor radiation equipment performance. The Ottawa Hospital Cancer Centre has introduced the use of quality control (QC) test success as a quality indicator for equipment performance and for the overall health of the equipment QC program. Methods: Implemented in 2012, QATrack+ is used to record and monitor over 1100 routine machine QC tests each month for 20 treatment and imaging units ( http://qatrackplus.com/ ). Using an SQL (structured query language) script, automated queries of the QATrack+ database generate program metrics such as the number of QC tests executed and the percentage of tests passing, at tolerance, or at action. These metrics are compared against machine uptime statistics already reported within the program. Results: Program metrics for 2015 show good correlation between the pass rate of QC tests and uptime for a given machine. For the nine conventional linacs, the QC test success rate was consistently greater than 97%. The corresponding uptimes for these units are better than 98%. Machines that consistently show higher failure or tolerance rates in the QC tests have lower uptimes. This points either to poor machine performance requiring corrective action or to problems with the QC program. Conclusions: QATrack+ significantly improves the organization of QC data but can also aid in overall equipment management. Complementing machine uptime statistics with QC test metrics provides a more complete picture of overall machine performance and can be used to identify areas of improvement in the machine service and QC programs.
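
    The program metrics described above amount to simple aggregation over recorded test outcomes. A sketch in Python (the record fields and status labels are illustrative assumptions, not QATrack+'s actual database schema):

```python
from collections import Counter

def qc_summary(results):
    """Summarize a month of QC test outcomes ('ok', 'tolerance', 'action')
    into percentages, analogous to an aggregate SQL query over test records."""
    counts = Counter(r["status"] for r in results)
    total = sum(counts.values())
    return {status: 100.0 * counts.get(status, 0) / total
            for status in ("ok", "tolerance", "action")}

# Hypothetical month of results for one unit: 97 passes, 2 at tolerance, 1 at action
month = ([{"unit": "linac1", "status": "ok"}] * 97
         + [{"unit": "linac1", "status": "tolerance"}] * 2
         + [{"unit": "linac1", "status": "action"}] * 1)
summary = qc_summary(month)
```

    Tracking these percentages per machine alongside uptime is what lets a consistently low pass rate flag either a failing machine or a miscalibrated QC program.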

  5. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool.

    Science.gov (United States)

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    2017-06-09

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing the results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data. FQC is implemented in Python 3 and JavaScript, and is maintained under an MIT license. Documentation and source code are available at: https://github.com/pnnl/fqc . joseph.brown@pnnl.gov. © The Author(s) 2017. Published by Oxford University Press.

  6. Fast QC-LDPC code for free space optical communication

    Science.gov (United States)

    Wang, Jin; Zhang, Qi; Udeh, Chinonso Paschal; Wu, Rangzhong

    2017-02-01

    Free-space optical (FSO) communication systems use the atmosphere as a propagation medium, so atmospheric turbulence effects lead to multiplicative noise related to the signal intensity. In order to suppress the signal fading induced by multiplicative noise, we propose a fast quasi-cyclic (QC) low-density parity-check (LDPC) code for FSO communication systems. As a linear block code based on a sparse matrix, the performance of QC-LDPC codes is extremely close to the Shannon limit. Current studies on LDPC codes in FSO communications mainly focus on the Gaussian channel and the Rayleigh channel, respectively. In this study, the LDPC code is designed over the atmospheric turbulence channel, which is neither Gaussian nor Rayleigh and is closer to the practical situation. Based on the characteristics of the atmospheric channel, which is modeled by the log-normal distribution and the K-distribution, we designed a special QC-LDPC code and deduced the log-likelihood ratio (LLR). An irregular QC-LDPC code for fast coding, with variable rates, is proposed in this paper. The proposed code achieves the excellent performance of LDPC codes and exhibits high efficiency at low rates, stability at high rates, and a small number of decoding iterations. The result of belief propagation (BP) decoding shows that the bit error rate (BER) is obviously reduced as the signal-to-noise ratio (SNR) increases. Therefore, LDPC channel coding technology can effectively improve the performance of FSO systems. At the same time, the BER after decoding decreases steadily as the SNR increases, without an error-floor phenomenon in which the error-rate reduction levels off.
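
    For BPSK signalling with a known channel gain, the LLR over a fading channel takes the standard form 2·h·y/σ². The sketch below applies it to a log-normally distributed intensity gain as a toy model of weak turbulence; the paper derives LLRs for its specific log-normal and K-distributed channel models, so this is only illustrative:

```python
import math, random

def bpsk_llr(y, h, noise_var):
    """LLR of a received BPSK symbol over a fading channel with known
    gain h and AWGN variance noise_var: LLR = 2*h*y / noise_var.
    A positive LLR favours the symbol +1."""
    return 2.0 * h * y / noise_var

random.seed(1)
noise_var = 0.5
# Log-normally distributed intensity gain as a toy weak-turbulence model
h = math.exp(random.gauss(0.0, 0.1))
tx = +1                                            # BPSK symbol for bit 0
y = h * tx + random.gauss(0.0, math.sqrt(noise_var))
llr = bpsk_llr(y, h, noise_var)
```

    These per-symbol LLRs are exactly the inputs the belief-propagation decoder consumes, which is why the channel model directly shapes the LLR derivation.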

  7. FPGA implementation of high-performance QC-LDPC decoder for optical communications

    Science.gov (United States)

    Zou, Ding; Djordjevic, Ivan B.

    2015-01-01

    Forward error correction is one of the key technologies enabling next-generation high-speed fiber-optic communications. Quasi-cyclic (QC) low-density parity-check (LDPC) codes have been considered one of the promising candidates due to their large coding gain performance and low implementation complexity. In this paper, we present our designed QC-LDPC code with girth 10 and 25% overhead based on pairwise balanced design. By FPGA-based emulation, we demonstrate that the 5-bit soft-decision LDPC decoder can achieve an 11.8 dB net coding gain with no error floor at a BER of 10^-15, avoiding the use of any outer code or post-processing method. We believe that the proposed single QC-LDPC code is a promising solution for 400 Gb/s optical communication systems and beyond.

  8. External quality control in ground-water sampling and analysis at the Hanford Site

    International Nuclear Information System (INIS)

    Hall, S.H.; Juracich, S.P.

    1991-11-01

    At the US Department of Energy's Hanford Site, external quality control (QC) for ground-water monitoring is extensive and has included routine submittal of intra- and interlaboratory duplicate samples, blind samples, and several kinds of blank samples. Examination of the resulting QC data for nine of the constituents found in ground water at the Hanford Site shows that the quality of analysis has generally been within the expectations of precision and accuracy established by the US Environmental Protection Agency (EPA). The constituents reviewed were nitrate, chromium, sodium, fluoride, carbon tetrachloride, tritium, ammonium, trichloroethylene, and cyanide. Of these, the fluoride measurements were a notable exception and were poor by EPA standards. The review has shown that interlaboratory analysis of duplicate samples yields the most useful QC data for evaluating laboratory performance in determining commonly encountered constituents. For rarely encountered constituents, interlaboratory comparisons may be augmented with blind samples (synthetic samples of known composition). Intralaboratory comparisons, blanks, and spikes should generally be restricted to studies of suspected or known sample contamination and to studies of the adequacy of sampling and analytical procedures.
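
    A common precision metric for the duplicate samples discussed above is the relative percent difference (RPD) between paired results. The acceptance limit in this sketch is illustrative, not the criterion used at Hanford:

```python
def rpd(x1, x2):
    """Relative percent difference between duplicate results:
    |x1 - x2| divided by their mean, times 100."""
    m = (x1 + x2) / 2.0
    return abs(x1 - x2) / m * 100.0 if m else 0.0

# Hypothetical interlaboratory duplicate nitrate results (mg/L)
duplicate_rpd = rpd(10.2, 9.8)
precision_ok = duplicate_rpd <= 20.0   # 20% is an illustrative acceptance limit
```

    Blind samples of known composition are evaluated differently, by comparing the reported value against the true spiked concentration (accuracy rather than precision).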

  9. A novel construction scheme of QC-LDPC codes based on the RU algorithm for optical transmission systems

    Science.gov (United States)

    Yuan, Jian-guo; Liang, Meng-qi; Wang, Yong; Lin, Jin-zhao; Pang, Yu

    2016-03-01

    A novel lower-complexity construction scheme of quasi-cyclic low-density parity-check (QC-LDPC) codes for optical transmission systems is proposed based on the structure of the parity-check matrix for the Richardson-Urbanke (RU) algorithm. Furthermore, a novel irregular QC-LDPC(4 288, 4 020) code with a high code rate of 0.937 is constructed by this scheme. The simulation analyses show that the net coding gain (NCG) of the novel irregular QC-LDPC(4 288, 4 020) code is respectively 2.08 dB, 1.25 dB and 0.29 dB more than those of the classic RS(255, 239) code, the LDPC(32 640, 30 592) code and the irregular QC-LDPC(3 843, 3 603) code at a bit error rate (BER) of 10^-6. The irregular QC-LDPC(4 288, 4 020) code has lower encoding/decoding complexity compared with the LDPC(32 640, 30 592) code and the irregular QC-LDPC(3 843, 3 603) code. The proposed novel QC-LDPC(4 288, 4 020) code is thus well suited to the increasing development requirements of high-speed optical transmission systems.

  10. DATA PROCESSING FROM THE MEASURING DEVICE BALLBAR QC20

    Directory of Open Access Journals (Sweden)

    Matúš Košinár

    2014-03-01

    Full Text Available The paper presents an innovative method of processing data from the measurement device Ballbar QC20W. A program for data transformation was created (in Visual Basic .NET) using the Fourier transform. The paper deals with the measurement of CNC machine tools using the Ballbar QC20W. There is a relationship between the qualitative parameters of machine tools and the qualitative parameters of products (tolerances, roughness, etc.). It is very important to maintain the stability of the qualitative parameters of products as a key factor of production quality. Therefore, it is also important to evaluate the accuracy of machine tools and to predict their achievable accuracy.
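
    The Fourier-transform step mentioned above can be sketched by extracting the low-order harmonic amplitudes of a circular (ballbar) error trace; in circularity analysis the second harmonic is commonly associated with ovality-type error. This is an illustrative DFT, not Renishaw's or the authors' actual processing:

```python
import cmath, math

def harmonic_amplitudes(radial_error, max_order=4):
    """Amplitudes of the low-order harmonics of a circular error trace,
    computed by a direct discrete Fourier transform over one revolution."""
    n = len(radial_error)
    amps = {}
    for k in range(1, max_order + 1):
        coeff = sum(radial_error[j] * cmath.exp(-2j * math.pi * k * j / n)
                    for j in range(n))
        amps[k] = 2.0 * abs(coeff) / n
    return amps

# Synthetic trace: 10 um mean radius deviation plus a 5 um second-harmonic
# (ovality) component, sampled at 1-degree steps
n = 360
trace = [10.0 + 5.0 * math.cos(2 * (2 * math.pi * j / n)) for j in range(n)]
amps = harmonic_amplitudes(trace)
```

    Decomposing the trace this way is what lets individual error sources of the machine tool be separated and tracked over time for accuracy prediction.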

  11. Comparison of Different Matrices as Potential Quality Control Samples for Neurochemical Dementia Diagnostics

    NARCIS (Netherlands)

    Lelental, Natalia; Brandner, Sebastian; Kofanova, Olga; Blennow, Kaj; Zetterberg, Henrik; Andreasson, Ulf; Engelborghs, Sebastiaan; Mroczko, Barbara; Gabryelewicz, Tomasz; Teunissen, Charlotte; Mollenhauer, Brit; Parnetti, Lucilla; Chiasserini, Davide; Molinuevo, Jose Luis; Perret-Liaudet, Armand; Verbeek, Marcel M.; Andreasen, Niels; Brosseron, Frederic; Bahl, Justyna M. C.; Herukka, Sanna-Kaisa; Hausner, Lucrezia; Froelich, Lutz; Labonte, Anne; Poirier, Judes; Miller, Anne-Marie; Zilka, Norbert; Kovacech, Branislav; Urbani, Andrea; Suardi, Silvia; Oliveira, Catarina; Baldeiras, Ines; Dubois, Bruno; Rot, Uros; Lehmann, Sylvain; Skinningsrud, Anders; Betsou, Fay; Wiltfang, Jens; Gkatzima, Olymbia; Winblad, Bengt; Buchfelder, Michael; Kornhuber, Johannes; Lewczuk, Piotr

    2016-01-01

    Background: Assay-vendor independent quality control (QC) samples for neurochemical dementia diagnostics (NDD) biomarkers are so far commercially unavailable. This requires that NDD laboratories prepare their own QC samples, for example by pooling leftover cerebrospinal fluid (CSF) samples.

  12. A novel construction method of QC-LDPC codes based on CRT for optical communications

    Science.gov (United States)

    Yuan, Jian-guo; Liang, Meng-qi; Wang, Yong; Lin, Jin-zhao; Pang, Yu

    2016-05-01

    A novel construction method of quasi-cyclic low-density parity-check (QC-LDPC) codes is proposed based on the Chinese remainder theorem (CRT). The method can not only increase the code length without reducing the girth, but also greatly enhance the code rate, so it is easy to construct a high-rate code. The simulation results show that at a bit error rate (BER) of 10^-7, the net coding gain (NCG) of the regular QC-LDPC(4 851, 4 546) code is respectively 2.06 dB, 1.36 dB, 0.53 dB and 0.31 dB more than those of the classic RS(255, 239) code in ITU-T G.975, the LDPC(32 640, 30 592) code in ITU-T G.975.1, the QC-LDPC(3 664, 3 436) code constructed by the improved combining construction method based on the CRT, and the irregular QC-LDPC(3 843, 3 603) code constructed by the construction method based on the Galois field (GF(q)) multiplicative group. Furthermore, all five codes have the same code rate of 0.937. Therefore, the regular QC-LDPC(4 851, 4 546) code constructed by the proposed method has excellent error-correction performance, and is well suited to optical transmission systems.

  13. Implementation of regional centres for SPECT QC/QA in Brazil

    International Nuclear Information System (INIS)

    Robilotta, C.C.; Dias-Neto, A.L.; Abe, R.; Khoury, H.J.; Silva, D.C. da; Martini, J.C.; Brunetto, S.; Ney, C.

    2002-01-01

    Aims: SPECT technology was introduced in Brazil in the early 80s and, presently, there are more than 230 systems installed in the whole country. In order to establish a quality standard for these systems, an RCP was submitted and received partial support from the IAEA for the implementation of regional centres, so that clinics in different regions could be evaluated using the same protocols. Materials and Methods: Six centres were created in 5 public (federal and state) universities and one private philanthropic medical school: USP-Sao Paulo, UNICAMP-Campinas, CNEN-Rio de Janeiro, UFBA-Salvador, UFPE-Recife and FM/Santa Casa-Porto Alegre. All sites have teaching and technical support available and there is at least one nuclear medicine physicist in charge. The basic QC/QA set included: a 57Co sheet source, an orthogonal-hole phantom, a quadrant bar phantom, calibrated sources for the dose calibrator (57Co, 133Ba and 137Cs) and a DeLuxe SPECT phantom from Data Spectrum Corp. Basic and complete/acceptance protocols were defined as the reference procedures. Measurements and evaluations were performed in 21 (<10%) centres and inter-comparisons were made amongst the groups. Results: Some information about the centres and the evaluated systems is presented. A large number of the visited clinics had never had any QC tests done except for the manufacturer's installation tests and the daily uniformity test. On average, most of the cameras needed tuning and one of them had to have its PM tubes re-coupled. The main difficulties encountered by all groups were the lack of physicists in almost all the visited clinics and the inadequate training of many local technologists, especially in the remote areas. In spite of the misunderstanding and scepticism of some of the visited MDs, the majority recognized the importance of proper QC/QA testing.
Conclusions: It was shown that regional centres are essential if one aims at quality and reliability in nuclear medicine clinics, especially in a

  14. A Computerized QC Analysis of TLD Glow Curves for Personal Dosimetry Measurements Using TagQC Program

    International Nuclear Information System (INIS)

    Primo, S.; Datz, H.; Dar, A.

    2014-01-01

    The External Dosimetry Lab (EDL) at the Radiation Safety Division of the Soreq Nuclear Research Center (SNRC) is ISO 17025 certified and provides its services to approximately 13,000 users throughout the country from various sectors such as medical, industrial and academic. About 95% of the users are monitored monthly for X-rays and radiation using thermoluminescence dosimeter (TLD) cards that contain three LiF:Mg,Ti elements; the other users, who also work with thermal neutrons, use TLD cards that contain four LiF:Mg,Ti elements. All TLD cards are measured with the Thermo 8800pc reader. A suspicious TLD glow curve (GC) can cause a wrong dose estimate, so the EDL makes great efforts to ensure that each GC undergoes a careful QC procedure. The current QC procedure is performed manually, in several steps and with different software packages and databases, making it long and complicated: EDL staff export all the results/GCs to be checked to an Excel file and then locate the suspicious GCs in a separate program (WinREMS). According to the GC shapes (Figure 1 illustrates suitable and suspicious GC shapes) and the ratios between the element readings, the inspecting technician corrects the data
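The record mentions flagging suspicious glow curves partly via the ratios between the element readings on a card. As an illustration only (the EDL's actual tolerance band and ratio definitions are not given in the record, so the thresholds below are hypothetical), a card-level ratio check might look like:

```python
def flag_suspicious_cards(cards, low=0.8, high=1.2):
    """Flag TLD cards whose individual element readings deviate from the
    card mean by more than a (hypothetical) tolerance band.
    `cards` maps card id -> list of element readings; the return value
    maps flagged card ids -> indices of the suspicious elements."""
    flagged = {}
    for card_id, readings in cards.items():
        mean = sum(readings) / len(readings)
        bad = [i for i, r in enumerate(readings)
               if not (low <= r / mean <= high)]
        if bad:
            flagged[card_id] = bad
    return flagged
```

Cards passing this check would still go through the GC shape inspection the record describes; the ratio test only narrows down which curves a technician needs to look at.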

  15. Quality control of portal imaging with PTW EPID QC PHANTOM®

    International Nuclear Information System (INIS)

    Pesznyak, Csilla; Kiraly, Reka; Polgar, Istvan; Zarand, Pal; Mayer, Arpad; Fekete, Gabor; Mozes, Arpad; Kiss, Balazs

    2009-01-01

    Purpose: quality assurance (QA) and quality control (QC) of different electronic portal imaging devices (EPID) and portal images with the PTW EPID QC PHANTOM®. Material and methods: characteristic properties of images of different file formats were measured on Siemens OptiVue500aSi®, Siemens BeamView Plus®, Elekta iView®, and Varian PortalVision™ and analyzed with the epidSoft® 2.0 program in four radiation therapy centers. The portal images were taken with Kodak X-OMAT V® and the Kodak Portal Localisation ReadyPack® films and evaluated with the same program. Results: the optimal exposition both for EPIDs and portal films of different kind was determined. For double exposition, the 2+1 MU values can be recommended in the case of Siemens OptiVue500aSi®, Elekta iView® and Kodak Portal Localisation ReadyPack® films, while for Siemens BeamView Plus®, Varian PortalVision™ and Kodak X-OMAT V® film 7+7 MU is recommended. Conclusion: the PTW EPID QC PHANTOM® can be used not only for amorphous silicon EPIDs but also for images taken with a video-based system or by using an ionization chamber matrix or for portal film. For analysis of QC tests, a standardized format (used at the acceptance test) should be applied, as the results are dependent on the file format used. (orig.)

  16. Quality assurance (QA) and quality control (QC) of image guided radiotherapy (IGRT). Osaka Rosai Hospital experience

    International Nuclear Information System (INIS)

    Tsuboi, Kazuki; Yagi, Masayuki; Fujiwara, Kanta

    2013-01-01

    The linear accelerator with image guided radiation therapy (IGRT) was introduced in May 2010. We performed verification of the IGRT system, i.e., the acceptance test and our own performance test, and confirmed its acceptability for clinical use. We also performed a daily QA/QC program before the start of treatment. One year of experience with the QA/QC program showed excellent stability of the IGRT function compared with our old machine. We further hope to establish a more useful management system and QA/QC program. (author)

  17. Construction method of QC-LDPC codes based on multiplicative group of finite field in optical communication

    Science.gov (United States)

    Huang, Sheng; Ao, Xiang; Li, Yuan-yuan; Zhang, Rui

    2016-09-01

    In order to meet the needs of high-speed development of optical communication systems, a construction method of quasi-cyclic low-density parity-check (QC-LDPC) codes based on the multiplicative group of a finite field is proposed. The Tanner graph of the parity-check matrix of a code constructed by this method has no cycle of length 4, which ensures that the obtained code has a good distance property. Simulation results show that, at a bit error rate (BER) of 10⁻⁶ and in the same simulation environment, the net coding gain (NCG) of the proposed QC-LDPC(3780, 3540) code with a code rate of 93.7% is improved by 2.18 dB and 1.6 dB, respectively, compared with those of the RS(255, 239) code in ITU-T G.975 and the LDPC(32640, 30592) code in ITU-T G.975.1. In addition, the NCG of the proposed QC-LDPC(3780, 3540) code is 0.2 dB and 0.4 dB higher, respectively, than those of the SG-QC-LDPC(3780, 3540) code based on two different subgroups of a finite field and the AS-QC-LDPC(3780, 3540) code based on two arbitrary sets of a finite field. Thus, the proposed QC-LDPC(3780, 3540) code can be well applied in optical communication systems.
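The abstract does not give the exact exponent assignments of the proposed construction, but the general mechanism it relies on — expanding an exponent matrix derived from a finite-field multiplicative structure into circulant permutation matrices (CPMs) whose Tanner graph is free of 4-cycles — can be sketched. The product-form exponent matrix E[i][j] = i·j mod p (p prime) used below is a common textbook choice, not necessarily the one in the paper:

```python
import numpy as np

def cpm(shift, p):
    """Circulant permutation matrix: the p x p identity with its
    columns cyclically shifted by `shift`."""
    return np.roll(np.eye(p, dtype=int), shift, axis=1)

def qc_ldpc_parity_matrix(rows, cols, p):
    """Expand the exponent matrix E[i][j] = i*j mod p (p prime) into a
    binary parity-check matrix of size (rows*p) x (cols*p). For this
    product form, distinct row/column indices below p guarantee the
    Tanner graph has no 4-cycles."""
    assert rows < p and cols < p
    H = np.zeros((rows * p, cols * p), dtype=int)
    for i in range(rows):
        for j in range(cols):
            H[i * p:(i + 1) * p, j * p:(j + 1) * p] = cpm((i * j) % p, p)
    return H

def has_4_cycle(H):
    """The Tanner graph of H contains a 4-cycle iff some pair of
    distinct columns shares more than one row with 1s."""
    overlap = H.T @ H
    np.fill_diagonal(overlap, 0)
    return bool((overlap > 1).any())
```

With rows=3, cols=5, p=7 this yields a (21, 35) matrix with uniform column weight 3 and row weight 5, and `has_4_cycle` confirms the girth-4-free property the abstract claims for the real construction.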

  18. A novel QC-LDPC code based on the finite field multiplicative group for optical communications

    Science.gov (United States)

    Yuan, Jian-guo; Xu, Liang; Tong, Qing-zhen

    2013-09-01

    A novel construction method of quasi-cyclic low-density parity-check (QC-LDPC) codes is proposed based on the finite field multiplicative group, which offers easier construction, more flexible adjustment of code length and code rate, and lower encoding/decoding complexity. Moreover, a regular QC-LDPC(5334,4962) code is constructed. The simulation results show that the constructed QC-LDPC(5334,4962) code achieves better error-correction performance over the additive white Gaussian noise (AWGN) channel with iterative sum-product algorithm (SPA) decoding. At a bit error rate (BER) of 10⁻⁶, the net coding gain (NCG) of the constructed QC-LDPC(5334,4962) code is 1.8 dB, 0.9 dB and 0.2 dB more than that of the classic RS(255,239) code in ITU-T G.975, the LDPC(32640,30592) code in ITU-T G.975.1 and the SCG-LDPC(3969,3720) code constructed by the random method, respectively. So it is more suitable for optical communication systems.

  19. Quality control of portal imaging with PTW EPID QC PHANTOM®

    Energy Technology Data Exchange (ETDEWEB)

    Pesznyak, Csilla; Kiraly, Reka; Polgar, Istvan; Zarand, Pal; Mayer, Arpad [Inst. of Oncoradiology, Uzsoki Hospital, Budapest (Hungary); Fekete, Gabor [Dept. of Oncotherapy, Univ. of Szeged (Hungary); Mozes, Arpad [Oncology Center, Kalman Pandy County Hospital, Gyula (Hungary); Kiss, Balazs [Dept. of Radiation Oncology, Markusovszky County Hospital, Szombathely (Hungary)

    2009-01-15

    Purpose: quality assurance (QA) and quality control (QC) of different electronic portal imaging devices (EPID) and portal images with the PTW EPID QC PHANTOM®. Material and methods: characteristic properties of images of different file formats were measured on Siemens OptiVue500aSi®, Siemens BeamView Plus®, Elekta iView®, and Varian PortalVision™ and analyzed with the epidSoft® 2.0 program in four radiation therapy centers. The portal images were taken with Kodak X-OMAT V® and the Kodak Portal Localisation ReadyPack® films and evaluated with the same program. Results: the optimal exposition both for EPIDs and portal films of different kind was determined. For double exposition, the 2+1 MU values can be recommended in the case of Siemens OptiVue500aSi®, Elekta iView® and Kodak Portal Localisation ReadyPack® films, while for Siemens BeamView Plus®, Varian PortalVision™ and Kodak X-OMAT V® film 7+7 MU is recommended. Conclusion: the PTW EPID QC PHANTOM® can be used not only for amorphous silicon EPIDs but also for images taken with a video-based system or by using an ionization chamber matrix or for portal film. For analysis of QC tests, a standardized format (used at the acceptance test) should be applied, as the results are dependent on the file format used. (orig.)

  20. 40 CFR 98.414 - Monitoring and QA/QC requirements.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98.414 Section 98.414 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Industrial Greenhouse Gases § 98.414 Monitoring...

  1. 40 CFR 98.214 - Monitoring and QA/QC requirements.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98.214 Section 98.214 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS... standard method or other enhanced industry consensus standard method published by an industry consensus...

  2. Role of NAA in characterizations of sampling behaviors of multiple elements in CRMs

    International Nuclear Information System (INIS)

    Tian Weizhi; Ni Bangfa; Wang Pingsheng; Nie Huiling

    1997-01-01

    Taking advantage of the high precision and accuracy of neutron activation analysis (NAA), sampling constants have been determined for multiple elements in several international and Chinese reference materials. The suggested technique may be used for identifying elements in existing CRMs that qualify for quality control (QC) of small-size samples (several mg or less), and for characterizing the sampling behaviour of multiple elements in new CRMs made specifically for QC of microanalysis
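The abstract does not spell out which sampling-constant formalism is used; a widely used one is Ingamells' sampling constant, K_s = m·R², where m is the subsample mass and R the percent relative standard deviation of replicate determinations, so that K_s is the mass for which a single determination carries roughly 1% relative sampling uncertainty. A minimal sketch under that assumption:

```python
import statistics

def ingamells_ks(masses_mg, results):
    """Estimate Ingamells' sampling constant K_s = m * R^2 from replicate
    determinations: m is the mean subsample mass (mg) and R the percent
    relative standard deviation of the results. K_s is then the mass (mg)
    for which one determination has ~1% relative sampling uncertainty."""
    mean = statistics.mean(results)
    rsd_pct = 100.0 * statistics.stdev(results) / mean
    m = statistics.mean(masses_mg)
    return m * rsd_pct ** 2
```

For example, 5 mg replicates scattering with a 2% relative standard deviation imply K_s = 5 × 2² = 20 mg: a CRM with that constant would only qualify for QC of subsamples of 20 mg or more at the 1% level.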

  3. QC operator’s nonneutral posture against musculoskeletal disorder’s (MSDs) risks

    Science.gov (United States)

    Kautsar, F.; Gustopo, D.; Achmadi, F.

    2018-04-01

    Musculoskeletal disorders (MSDs) refer to a gamut of inflammatory and degenerative disorders aggravated largely by the performance of work. They are a major cause of pain, disability, absenteeism and reduced productivity among workers worldwide. Although not fatal, MSDs have the potential to develop into serious injuries of the musculoskeletal system if ignored. QC operators work in nonneutral body postures. This cross-sectional study was conducted in order to investigate the correlation between the risk assessment results of QEC and the body posture calculation of Mannequin Pro. Statistical analysis was conducted using SPSS version 16.0. A validity test, a reliability test and regression analysis were carried out to compare the risk assessment output of the applied method with the nonneutral body posture simulation. All of the QEC indicators were classified as valid and reliable. The results of the simple regression analysis are back (0.326 < 4.32), wrist/hand (4.86 > 4.32) and neck (1.298 < 4.32). The results of this study show that the nonneutral body posture of QC operators during work influences the risk of musculoskeletal disorders. The potential risk of musculoskeletal disorders lies in the shoulder/arm and wrist/hand of the QC operator, whereas the back and neck are not affected.

  4. 40 CFR 98.394 - Monitoring and QA/QC requirements.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98... by a consensus-based standards organization exists, such a method shall be used. Consensus-based... (NAESB). (ii) Where no appropriate standard method developed by a consensus-based standards organization...

  5. 40 CFR 98.424 - Monitoring and QA/QC requirements.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98... published by a consensus-based standards organization if such a method exists. Consensus-based standards...). (ii) Where no appropriate standard method developed by a consensus-based standards organization exists...

  6. Design of a Clean Room for Quality Control of an Environmental Sampling in KINAC

    International Nuclear Information System (INIS)

    Yoon, Jongho; Ahn, Gil Hoon; Seo, Hana; Han, Kitek; Park, Il Jin

    2014-01-01

    The objective of environmental sampling and analysis for safeguards is to characterize the nuclear materials handled and the activities conducted at specific locations. KINAC is responsible for the conclusions drawn from the analytical results provided by the analytical laboratories. To assure itself of the continuing quality of the analytical results provided by the laboratories, KINAC will implement a quality control (QC) programme. One element of the QC programme is the preparation of QC samples. The establishment of a clean room is needed to handle QC samples because of the stringent contamination control required. KINAC designed a clean facility with a cleanliness of ISO Class 6, the Clean Room for Estimation and Assay of trace Nuclear materials (CREAN), to meet the conflicting requirements of a clean room and of the handling of nuclear materials under Korean law. The clean room is expected to acquire a radiation safety licence under these conditions this year, with continuing improvements. Construction of the CREAN facility will be completed by the middle of 2015. For the QC programme, the establishment of a clean room is essential: it will not only support the quality control system for the national environmental sampling programme, but the environmental sample analysis techniques will also be applicable to nuclear forensics

  7. Advances in Automated QA/QC for TRISO Fuel Particle Production

    International Nuclear Information System (INIS)

    Hockey, Ronald L.; Bond, Leonard J.; Batishko, Charles R.; Gray, Joseph N.; Saurwein, John J.; Lowden, Richard A.

    2004-01-01

    Fuel in most Generation IV reactor designs typically encompasses billions of TRISO particles. Present-day QA/QC methods, performed manually and in many cases destructively, cannot economically test a statistically significant fraction of the large number of individual fuel particles required. Fully automated inspection technologies are essential to economical TRISO fuel particle production. A combination of in-line nondestructive evaluation (NDE) measurements employing electromagnetic induction and digital optical imaging analysis is currently under investigation, and preliminary data indicate the potential for meeting the demands of this application. To calibrate high-speed NDE methods, surrogate fuel particle samples are being coated with layers containing a wide array of defect types known to degrade fuel performance, and these are being characterized via high-resolution CT and digital radiographic images

  8. 40 CFR 98.404 - Monitoring and QA/QC requirements.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98... published by a consensus-based standards organization exists, such a method shall be used. Consensus-based... (NAESB). (ii) Where no appropriate standard method developed by a consensus-based standards organization...

  9. qcML: an exchange format for quality control metrics from mass spectrometry experiments.

    Science.gov (United States)

    Walzer, Mathias; Pernas, Lucia Espona; Nasso, Sara; Bittremieux, Wout; Nahnsen, Sven; Kelchtermans, Pieter; Pichler, Peter; van den Toorn, Henk W P; Staes, An; Vandenbussche, Jonathan; Mazanek, Michael; Taus, Thomas; Scheltema, Richard A; Kelstrup, Christian D; Gatto, Laurent; van Breukelen, Bas; Aiche, Stephan; Valkenborg, Dirk; Laukens, Kris; Lilley, Kathryn S; Olsen, Jesper V; Heck, Albert J R; Mechtler, Karl; Aebersold, Ruedi; Gevaert, Kris; Vizcaíno, Juan Antonio; Hermjakob, Henning; Kohlbacher, Oliver; Martens, Lennart

    2014-08-01

    Quality control is increasingly recognized as a crucial aspect of mass spectrometry based proteomics. Several recent papers discuss relevant parameters for quality control and present applications to extract these from the instrumental raw data. What has been missing, however, is a standard data exchange format for reporting these performance metrics. We therefore developed the qcML format, an XML-based standard that follows the design principles of the related mzML, mzIdentML, mzQuantML, and TraML standards from the HUPO-PSI (Proteomics Standards Initiative). In addition to the XML format, we also provide tools for the calculation of a wide range of quality metrics as well as a database format and interconversion tools, so that existing LIMS systems can easily add relational storage of the quality control data to their existing schema. We here describe the qcML specification, along with possible use cases and an illustrative example of the subsequent analysis possibilities. All information about qcML is available at http://code.google.com/p/qcml. © 2014 by The American Society for Biochemistry and Molecular Biology, Inc.
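A minimal illustration of assembling such an XML quality report with Python's standard library follows. Note that the element names, attribute names and the accession string below are illustrative placeholders only; the normative qcML schema and controlled-vocabulary terms are defined by the specification at the project site:

```python
import xml.etree.ElementTree as ET

def build_qc_report(run_id, metrics):
    """Assemble a minimal qcML-style XML report. `metrics` maps a
    (hypothetical) accession string to a (name, value) pair. The tag and
    attribute names here are placeholders, not the normative schema."""
    root = ET.Element("qcML")
    run = ET.SubElement(root, "runQuality", ID=run_id)
    for accession, (name, value) in metrics.items():
        ET.SubElement(run, "qualityParameter",
                      accession=accession, name=name, value=str(value))
    return ET.tostring(root, encoding="unicode")
```

The same tree could equally be serialized into the relational database format the authors mention; the point of the standard is that both carry the identical set of per-run quality parameters.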

  10. qcML: An Exchange Format for Quality Control Metrics from Mass Spectrometry Experiments*

    Science.gov (United States)

    Walzer, Mathias; Pernas, Lucia Espona; Nasso, Sara; Bittremieux, Wout; Nahnsen, Sven; Kelchtermans, Pieter; Pichler, Peter; van den Toorn, Henk W. P.; Staes, An; Vandenbussche, Jonathan; Mazanek, Michael; Taus, Thomas; Scheltema, Richard A.; Kelstrup, Christian D.; Gatto, Laurent; van Breukelen, Bas; Aiche, Stephan; Valkenborg, Dirk; Laukens, Kris; Lilley, Kathryn S.; Olsen, Jesper V.; Heck, Albert J. R.; Mechtler, Karl; Aebersold, Ruedi; Gevaert, Kris; Vizcaíno, Juan Antonio; Hermjakob, Henning; Kohlbacher, Oliver; Martens, Lennart

    2014-01-01

    Quality control is increasingly recognized as a crucial aspect of mass spectrometry based proteomics. Several recent papers discuss relevant parameters for quality control and present applications to extract these from the instrumental raw data. What has been missing, however, is a standard data exchange format for reporting these performance metrics. We therefore developed the qcML format, an XML-based standard that follows the design principles of the related mzML, mzIdentML, mzQuantML, and TraML standards from the HUPO-PSI (Proteomics Standards Initiative). In addition to the XML format, we also provide tools for the calculation of a wide range of quality metrics as well as a database format and interconversion tools, so that existing LIMS systems can easily add relational storage of the quality control data to their existing schema. We here describe the qcML specification, along with possible use cases and an illustrative example of the subsequent analysis possibilities. All information about qcML is available at http://code.google.com/p/qcml. PMID:24760958

  11. 40 CFR 98.174 - Monitoring and QA/QC requirements.

    Science.gov (United States)

    2010-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Iron and Steel Production § 98.174 Monitoring and QA/QC... moisture content of the stack gas. (5) Determine the mass rate of process feed or process production (as... Fusion Techniques (incorporated by reference, see § 98.7) for iron and ferrous scrap. (v) ASM CS-104 UNS...

  12. The sensitivity of patient specific IMRT QC to systematic MLC leaf bank offset errors

    International Nuclear Information System (INIS)

    Rangel, Alejandra; Palte, Gesa; Dunscombe, Peter

    2010-01-01

    Purpose: Patient specific IMRT QC is performed routinely in many clinics as a safeguard against errors and inaccuracies which may be introduced during the complex planning, data transfer, and delivery phases of this type of treatment. The purpose of this work is to evaluate the feasibility of detecting systematic errors in MLC leaf bank position with patient specific checks. Methods: 9 head and neck (H and N) and 14 prostate IMRT beams were delivered using MLC files containing systematic offsets (±1 mm in two banks, ±0.5 mm in two banks, and 1 mm in one bank of leaves). The beams were measured using both MAPCHECK (Sun Nuclear Corp., Melbourne, FL) and the aS1000 electronic portal imaging device (Varian Medical Systems, Palo Alto, CA). Comparisons with calculated fields, without offsets, were made using commonly adopted criteria including absolute dose (AD) difference, relative dose difference, distance to agreement (DTA), and the gamma index. Results: The criteria most sensitive to systematic leaf bank offsets were the 3% AD, 3 mm DTA for MAPCHECK and the gamma index with 2% AD and 2 mm DTA for the EPID. The criterion based on the relative dose measurements was the least sensitive to MLC offsets. More highly modulated fields, i.e., H and N, showed greater changes in the percentage of passing points due to systematic MLC inaccuracy than prostate fields. Conclusions: None of the techniques or criteria tested is sufficiently sensitive, with the population of IMRT fields, to detect a systematic MLC offset at a clinically significant level on an individual field. Patient specific QC cannot, therefore, substitute for routine QC of the MLC itself.
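The gamma index used above combines a dose-difference criterion with a distance-to-agreement criterion (in the sense of Low et al.). As a simplified sketch only — one-dimensional, global normalization, no interpolation or search-radius optimization, unlike a real 2-D MAPCHECK/EPID comparison — the per-point computation is:

```python
import numpy as np

def gamma_1d(positions, ref_dose, meas_dose, dose_tol=0.03, dta_mm=3.0):
    """1-D gamma index, globally normalized: for each reference point,
    take the minimum over measured points of
    sqrt((dose difference / dose criterion)^2 + (distance / DTA)^2).
    A point 'passes' when its gamma value is <= 1."""
    norm = dose_tol * ref_dose.max()          # global dose criterion
    gamma = np.empty_like(ref_dose, dtype=float)
    for i, (x, d) in enumerate(zip(positions, ref_dose)):
        dd = (meas_dose - d) / norm           # scaled dose differences
        dx = (positions - x) / dta_mm         # scaled distances
        gamma[i] = np.sqrt(dd ** 2 + dx ** 2).min()
    return gamma
```

The pass rate reported by QC software is then the fraction of points with gamma ≤ 1; tightening the criteria (e.g. 2%/2 mm instead of 3%/3 mm) shrinks both denominators and raises every gamma value, which is why the tighter EPID criterion in the study was more sensitive to MLC offsets.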

  13. The sensitivity of patient specific IMRT QC to systematic MLC leaf bank offset errors

    Energy Technology Data Exchange (ETDEWEB)

    Rangel, Alejandra; Palte, Gesa; Dunscombe, Peter [Department of Medical Physics, Tom Baker Cancer Centre, 1331-29 Street NW, Calgary, Alberta T2N 4N2, Canada and Department of Physics and Astronomy, University of Calgary, 2500 University Drive North West, Calgary, Alberta T2N 1N4 (Canada); Department of Medical Physics, Tom Baker Cancer Centre, 1331-29 Street NW, Calgary, Alberta T2N 4N2 (Canada); Department of Medical Physics, Tom Baker Cancer Centre, 1331-29 Street NW, Calgary, Alberta T2N 4N2 (Canada); Department of Physics and Astronomy, University of Calgary, 2500 University Drive NW, Calgary, Alberta T2N 1N4 (Canada) and Department of Oncology, Tom Baker Cancer Centre, 1331-29 Street NW, Calgary, Alberta T2N 4N2 (Canada)

    2010-07-15

    Purpose: Patient specific IMRT QC is performed routinely in many clinics as a safeguard against errors and inaccuracies which may be introduced during the complex planning, data transfer, and delivery phases of this type of treatment. The purpose of this work is to evaluate the feasibility of detecting systematic errors in MLC leaf bank position with patient specific checks. Methods: 9 head and neck (H and N) and 14 prostate IMRT beams were delivered using MLC files containing systematic offsets (±1 mm in two banks, ±0.5 mm in two banks, and 1 mm in one bank of leaves). The beams were measured using both MAPCHECK (Sun Nuclear Corp., Melbourne, FL) and the aS1000 electronic portal imaging device (Varian Medical Systems, Palo Alto, CA). Comparisons with calculated fields, without offsets, were made using commonly adopted criteria including absolute dose (AD) difference, relative dose difference, distance to agreement (DTA), and the gamma index. Results: The criteria most sensitive to systematic leaf bank offsets were the 3% AD, 3 mm DTA for MAPCHECK and the gamma index with 2% AD and 2 mm DTA for the EPID. The criterion based on the relative dose measurements was the least sensitive to MLC offsets. More highly modulated fields, i.e., H and N, showed greater changes in the percentage of passing points due to systematic MLC inaccuracy than prostate fields. Conclusions: None of the techniques or criteria tested is sufficiently sensitive, with the population of IMRT fields, to detect a systematic MLC offset at a clinically significant level on an individual field. Patient specific QC cannot, therefore, substitute for routine QC of the MLC itself.

  14. Certification for Trace Elements and Methyl Mercury Mass Fractions in IAEA-452 Scallop (Pecten maximus) Sample

    International Nuclear Information System (INIS)

    2013-01-01

    The primary goal of the IAEA Environment Laboratories (NAEL) is to help Member States understand, monitor and protect the marine environment. The major impact exerted by large coastal cities on marine ecosystems is therefore of great concern to the IAEA, particularly to its Environment Laboratories. The marine pollution assessments needed to understand such impacts depend on accurate knowledge of contaminant concentrations in various environmental compartments. Two fundamental requirements to ensure the reliability of analytical results are quality control (QC) and quality assurance (QA). Since the early 1970s, NAEL has been assisting national laboratories and regional laboratory networks through its reference material programme for the analysis of radionuclides, trace elements and organic compounds in marine samples. Relevant activities include global interlaboratory comparison exercises and regional proficiency tests, the production of marine reference materials, and the development of reference methods for analysis of trace elements and organic pollutants in marine samples. QA, QC and associated good laboratory practice should be essential components of all marine environmental monitoring. QC procedures are commonly based on the analysis of reference materials to assess reproducibility and measurement bias. QA can be realized by participation in externally organized laboratory performance studies, also known as interlaboratory comparison exercises, which compare and evaluate the analytical performance and measurement capabilities of participating laboratories. The need for good QA/QC in the chemical analysis of marine environmental samples is widely recognized and has been tested in a number of international QA exercises. Such diligence also needs to be applied to other components of the monitoring exercise, since these may represent a greater source of error in many instances. 
Data that are not based on adequate QA/QC can be erroneous, and their misuse can lead

  15. Role of NAA in determination and characterisation of sampling behaviours of multiple elements in CRMs

    International Nuclear Information System (INIS)

    Tian Weizhi; Ni Bangfa; Wang Pingsheng; Nie Huiling

    2002-01-01

    Taking advantage of the high precision and accuracy of neutron activation analysis (NAA), sampling constants have been determined for multiple elements in several international and Chinese reference materials. The suggested technique may be used for identifying elements in existing CRMs that qualify for quality control (QC) of small-size samples (several mg or less), and for characterizing the sampling behaviour of multiple elements in new CRMs made specifically for QC of microanalysis. (author)

  16. Construction of type-II QC-LDPC codes with fast encoding based on perfect cyclic difference sets

    Science.gov (United States)

    Li, Ling-xiang; Li, Hai-bing; Li, Ji-bi; Jiang, Hua

    2017-09-01

    In view of the problems that the encoding complexity of quasi-cyclic low-density parity-check (QC-LDPC) codes is high and that the minimum distance is not large enough, which degrades the error-correction performance, new irregular type-II QC-LDPC codes based on perfect cyclic difference sets (CDSs) are constructed. The parity-check matrices of these type-II QC-LDPC codes consist of zero matrices with weight 0, circulant permutation matrices (CPMs) with weight 1, and circulant matrices with weight 2 (W2CMs). The introduction of W2CMs into the parity-check matrices makes it possible to achieve a larger minimum distance, which can improve the error-correction performance of the codes. The Tanner graphs of these codes have no 4-cycles, so they have excellent decoding convergence characteristics. In addition, because the parity-check matrices have a quasi-dual-diagonal structure, a fast encoding algorithm can reduce the encoding complexity effectively. Simulation results show that the new type-II QC-LDPC codes achieve excellent error-correction performance and exhibit no error floor phenomenon over the additive white Gaussian noise (AWGN) channel with sum-product algorithm (SPA) iterative decoding.
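The defining property of a perfect (v, k, 1) cyclic difference set — every nonzero residue mod v arises exactly once among the pairwise differences of the set's elements — can be checked directly. The classic (7, 3, 1) set {1, 2, 4} is used below as an example; the particular sets used in the paper are not listed in the abstract:

```python
from collections import Counter

def is_perfect_cds(D, v):
    """True iff D is a perfect (v, k, 1) cyclic difference set:
    each nonzero residue mod v occurs exactly once among the
    ordered pairwise differences d_i - d_j (i != j)."""
    diffs = Counter((a - b) % v for a in D for b in D if a != b)
    return (set(diffs) == set(range(1, v))
            and all(count == 1 for count in diffs.values()))
```

The "exactly once" property is what prevents 4-cycles when CPM shifts (and the paired shifts inside a W2CM) are drawn from such a set: a repeated difference would make two columns of the expanded parity-check matrix overlap in two rows.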

  17. Building a QC Database of Meteorological Data From NASA KSC and the United States Air Force's Eastern Range

    Science.gov (United States)

    Brenton, James C.; Barbre, Robert E.; Orcutt, John M.; Decker, Ryan K.

    2018-01-01

    The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) has provided atmospheric databases and analyses in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located on the United States Air Force's Eastern Range (ER) at the Cape Canaveral Air Force Station. The ER is one of the most heavily instrumented sites in the United States, measuring various atmospheric parameters on a continuous basis. An inherent challenge with the large databases that EV44 receives from the ER is ensuring that erroneous data are removed from the databases and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedures for all databases currently exist, resulting in QC databases with inconsistencies in variables, methodologies, and periods of record. The goal of this activity is to use EV44's previous efforts to develop a standardized set of QC procedures from which to build flags within the meteorological databases from KSC and the ER, while maintaining open communication with end users in the launch community on ways to improve, adapt and grow the QC database. Details of the QC checks are described. The flagged data points will be plotted in a graphical user interface (GUI) as part of a manual confirmation that the flagged data do indeed need to be removed from the archive. As the launch rate increases with additional launch vehicle programs, more emphasis is being placed on continually updating and checking weather databases for data quality before their use in launch vehicle design and certification analyses.
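The record does not enumerate the individual QC checks, so as a hedged sketch, two generic checks that meteorological QC pipelines of this kind commonly apply — a climatological range test and a step/spike test, with placeholder thresholds standing in for instrument-specific limits — can be expressed as integer flags suitable for later manual review in a GUI:

```python
import numpy as np

def qc_flags(values, valid_min, valid_max, max_step):
    """Return integer QC flags for a time series:
    0 = pass, 1 = outside the climatological range [valid_min, valid_max],
    2 = step change versus the previous sample exceeds max_step.
    All thresholds are placeholders for instrument-specific limits."""
    v = np.asarray(values, dtype=float)
    flags = np.zeros(v.size, dtype=int)
    flags[(v < valid_min) | (v > valid_max)] = 1
    step = np.abs(np.diff(v, prepend=v[0]))   # first sample: step of 0
    flags[(flags == 0) & (step > max_step)] = 2
    return flags
```

Storing flags rather than deleting points matches the workflow the record describes: the archive keeps the raw data, and the flags drive both the GUI-based manual confirmation and the exclusion of points from design analyses.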

  18. Portland cement concrete pavement review of QC/QA data 2000 through 2009.

    Science.gov (United States)

    2011-04-01

    This report analyzes the Quality Control/Quality Assurance (QC/QA) data for Portland cement concrete pavement : (PCCP) awarded in the years 2000 through 2009. Analysis of the overall performance of the projects is accomplished by : reviewing the Calc...

  19. QA/QC - Practices and procedures in WWER fuel management

    International Nuclear Information System (INIS)

    Keselica, M.

    1999-01-01

    The construction schedule and the unit-by-unit commissioning of NPP Dukovany, as well as the structure of electricity generation at CEZ in 1998, are reviewed. The history of the QA/QC system's establishment and its rules (system standards), the organization chart of NPP Dukovany, and the quality manual of the reactor physics department are presented. Standards for workers' qualification and nuclear fuel inspections are discussed. Fuel reliability indicators are presented

  20. Experiences of Radiochemical Lab of Faculty of Natural Sciences, Comenius University, Bratislava, Slovakia with implementation of QA/QC system

    International Nuclear Information System (INIS)

    Rajec, Pavol; Mackova, Jana

    2002-01-01

    This report gives an overview of the Laboratory's experience from participation in the Project. The Project helped the Laboratory to obtain accreditation with the Slovak National Accreditation Service, to attract more contracts and clients, and to implement QA/QC principles according to ISO 17025. The Laboratory's future plans include ISO 17025 compliance certification.

  1. Laboratory QA/QC improvements for small drinking water systems at Savannah River Site

    Energy Technology Data Exchange (ETDEWEB)

    Turner, R.D.

    1995-12-01

    The Savannah River Site (SRS), a 310 square mile facility located near Aiken, S.C., is operated by Westinghouse Savannah River Company for the US Department of Energy. SRS has 28 separate drinking water systems with average daily demands ranging from 0.0002 to 0.5 MGD. All systems utilize treated groundwater. Until recently, the water laboratories for each system operated independently. As a result, equipment, reagents, chemicals, procedures, personnel, and quality control practices differed from location to location. Due to this inconsistency, and a lack of extensive laboratory QA/QC practices at some locations, SRS auditors were not confident in the accuracy of daily water quality analysis results. The Site's Water Services Department addressed these concerns by developing and implementing a practical laboratory QA/QC program. Basic changes were made which can be readily adopted by most small drinking water systems. Key features of the program include: standardized and upgraded laboratory instrumentation and equipment; standardized analytical procedures based on vendor manuals and site requirements; periodic accuracy checks for all instrumentation; creation of a centralized laboratory to perform metals digestions and chlorine colorimeter accuracy checks; off-site and on-site operator training; and proper storage, inventory, and shelf-life monitoring for reagents and chemicals. This program has enhanced the credibility and accuracy of SRS drinking water system analysis results.

  2. Low Complexity Encoder of High Rate Irregular QC-LDPC Codes for Partial Response Channels

    Directory of Open Access Journals (Sweden)

    IMTAWIL, V.

    2011-11-01

    High-rate irregular QC-LDPC codes based on circulant permutation matrices, designed for efficient encoder implementation, are proposed in this article. The structure of the code is an approximate lower triangular matrix. In addition, we present two novel efficient encoding techniques for generating redundant bits. The complexity of the encoder implementation depends on the number of parity bits of the code for the one-stage encoding and on the length of the code for the two-stage encoding. The advantage of both encoding techniques is that few XOR gates are used in the encoder implementation. Simulation results on partial response channels also show that the BER performance of the proposed code has a gain over other QC-LDPC codes.
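The building block of such codes, a circulant permutation matrix, and the assembly of a block parity-check matrix from a table of shift values can be sketched as follows (the shift values here are illustrative, not taken from the paper):

```python
import numpy as np

def circulant_permutation(size, shift):
    """size x size identity matrix cyclically shifted right by `shift` columns."""
    return np.roll(np.eye(size, dtype=int), shift, axis=1)

def qc_ldpc_parity_check(shifts, size):
    """Assemble H from a matrix of shift values; -1 denotes an all-zero block."""
    blocks = [[np.zeros((size, size), dtype=int) if s < 0
               else circulant_permutation(size, s) for s in row]
              for row in shifts]
    return np.block(blocks)

# toy base matrix of shift values (hypothetical, for illustration only)
shifts = [[0, 1, 2, -1],
          [2, 0, -1, 1]]
H = qc_ldpc_parity_check(shifts, 4)
print(H.shape)  # (8, 16)
```

Because every nonzero block is a permutation matrix, the row and column weights of H follow directly from the pattern of -1 entries in the base matrix, which is what makes the quasi-cyclic structure convenient for hardware encoders.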

  3. Polarization tracking system for free-space optical communication, including quantum communication

    Science.gov (United States)

    Nordholt, Jane Elizabeth; Newell, Raymond Thorson; Peterson, Charles Glen; Hughes, Richard John

    2018-01-09

    Quantum communication transmitters include beacon lasers that transmit a beacon optical signal in a predetermined state of polarization such as one of the states of polarization of a quantum communication basis. Changes in the beacon polarization are detected at a receiver, and a retarder is adjusted so that the states of polarization in a received quantum communication optical signal are matched to basis polarizations. The beacon and QC signals can be at different wavelengths so that the beacon does not interfere with detection and decoding of the QC optical signal.

  4. Seismologic study of Los Humeros geothermal field, Puebla, Mexico. Part II: Seismic tomography by attenuation of coda waves (Qc-1) of local earthquakes; Estudio sismologico del campo geotermico de Los Humeros, Puebla, Mexico. Parte II: Tomografia sismica por atenuacion a partir de ondas de coda (Qc-1) de sismos locales

    Energy Technology Data Exchange (ETDEWEB)

    Antayhua, Yanet; Lermo, Javier [Instituto de Ingenieria, Universidad Nacional Autonoma de Mexico, D.F (Mexico); Carlos, Vargas [Departamento de Geociencias, Universidad Nacional de Colombia (Colombia)]. E-mail: jles@pumas.iingen.unam.mx

    2008-07-15

    In the Los Humeros geothermal field, Puebla, seismic attenuation tomography has been studied using coda waves (Qc^-1). Ninety-five local earthquakes (Md ≤ 3.6) with depths up to 4.0 km, recorded at the seismic network stations from December 1997 to December 2004, were used. A simple backscattering model was applied, with filtering in four frequency bands (2, 4, 6, and 8 Hz) and one 5-second window. For the 3D representation, an approximation based on first-order-scattering ellipsoids was used. The results show that the Qc values for the frequencies used depend on frequency as Qc = (24 ± 12)f^(0.86 ± 0.06), where low values of Qc were observed in the zone of higher seismic and tectonic activity and at the locations of injection and production wells, while high values are located on the periphery of the geothermal field. The 3D and 2D distributions of the Qc^-1 attenuation show that the anomalies of high seismic attenuation are located at the northern, southern, and southwestern ends of the zone presently under operation, at depths greater than 2.5 km.
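The single-backscattering estimate underlying a Qc measurement can be sketched as follows. Under that model the coda envelope is A(f, t) = S(f) t^-1 exp(-π f t / Qc), so ln(A·t) is linear in lapse time with slope -πf/Qc. The envelope below is synthetic; the 5 s window and 6 Hz band merely mirror the study's setup:

```python
import numpy as np

def coda_qc(t, amplitude, freq):
    """Estimate Qc by linear fit of ln(A * t) versus lapse time t:
    slope = -pi * freq / Qc  =>  Qc = -pi * freq / slope."""
    y = np.log(amplitude * t)
    slope, _ = np.polyfit(t, y, 1)
    return -np.pi * freq / slope

# synthetic coda envelope with known Qc = 50 at 6 Hz (illustrative values)
t = np.linspace(10, 15, 200)          # 5 s coda window
true_qc, f = 50.0, 6.0
amp = t**-1 * np.exp(-np.pi * f * t / true_qc)
print(round(coda_qc(t, amp, f), 1))   # → 50.0
```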

  5. Improvement of the customer satisfaction through Quality Assurance Matrix and QC-Story methods: A case study from automotive industry

    Science.gov (United States)

    Sicoe, G. M.; Belu, N.; Rachieru, N.; Nicolae, E. V.

    2017-10-01

    Presently, the tendency in the automotive industry is to adapt permanently to change and to incorporate market trends into new products, which leads to customer satisfaction. Many quality techniques have been adopted in this field for continuous improvement of product and process quality, and advantages have been gained. The present paper focuses on the possibilities offered by the use of the Quality Assurance Matrix (QAM) and Quality Control Story (QC Story) to provide the greatest protection against nonconformities in the production process, through a case study in the automotive industry. There is a direct relationship from the QAM to a QC Story analysis: the failures identified using the QAM are treated with the QC Story methodology. Using these methods will help to decrease PPM values and will increase quality performance and customer satisfaction.

  6. jQC-PET, an ImageJ macro to analyse the quality control of a PET/CT; jQC-PET, una macro de ImageJ para el analisis del control de calidad de un PET/CT

    Energy Technology Data Exchange (ETDEWEB)

    Cortes-Rodicio, J.; Sanchez-Merino, G.; Garcia-Fidalgo, A.

    2015-07-01

    An ImageJ macro has been developed to facilitate the analysis of three PET/CT quality control procedures included in the documents from the National Electrical Manufacturers Association (NU2-2007) and the International Atomic Energy Agency (Pub-1393): image quality, uniformity, and spatial resolution. In these procedures, the generation of the regions of interest and the analysis are automated. The results obtained with the software have been compared with those of the commercial software and the literature. The use of jQC-PET allows a standardized analysis that is independent of the commercial software. (Author)

  7. Theory of sampling: four critical success factors before analysis.

    Science.gov (United States)

    Wagner, Claas; Esbensen, Kim H

    2015-01-01

    Food and feed materials characterization, risk assessment, and safety evaluations can only be ensured if QC measures are based on valid analytical data, stemming from representative samples. The Theory of Sampling (TOS) is the only comprehensive theoretical framework that fully defines all requirements to ensure sampling correctness and representativity, and to provide the guiding principles for sampling in practice. TOS also defines the concept of material heterogeneity and its impact on the sampling process, including the effects from all potential sampling errors. TOS's primary task is to eliminate bias-generating errors and to minimize sampling variability. Quantitative measures are provided to characterize material heterogeneity, on which an optimal sampling strategy should be based. Four critical success factors preceding analysis to ensure a representative sampling process are presented here.
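One of TOS's quantitative heterogeneity measures is Gy's formula for the relative variance of the fundamental sampling error, s² = f·g·c·ℓ·d³·(1/Ms − 1/ML). A minimal sketch, with generic (not material-specific) factor values:

```python
def fse_relative_std(d_cm, sample_mass_g, lot_mass_g,
                     f=0.5, g=0.25, c=100.0, liberation=1.0):
    """Gy's formula for the fundamental sampling error (FSE):
    relative variance = f * g * c * liberation * d^3 * (1/Ms - 1/ML),
    with d the top particle size in cm and masses in grams.
    The factor defaults above are generic illustrative values."""
    var = f * g * c * liberation * d_cm**3 * (1/sample_mass_g - 1/lot_mass_g)
    return var ** 0.5

# halving the top particle size cuts the FSE by ~2.8x (d^3 scaling)
print(fse_relative_std(0.2, 500, 50_000))
print(fse_relative_std(0.1, 500, 50_000))
```

The d³ dependence is why TOS prescribes comminution (particle size reduction) before mass reduction when a representative analytical aliquot is needed.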

  8. DIAGNOSTIC OF CNC LATHE WITH QC 20 BALLBAR SYSTEM

    Directory of Open Access Journals (Sweden)

    Jerzy Józwik

    2015-11-01

    This paper presents an evaluation of the influence of feed motion speed on the value of selected geometric errors of the CNC lathe CTX 310 eco by DMG, identified with the QC 20 Ballbar system. Diagnostically evaluated were the axis squareness deviation, reversal spikes, and backlash. These errors determine the dimensional and shape accuracy achievable on a machine tool. The article discusses the CNC diagnostic test process and the diagnostic evaluation, and formulates guidelines for further CNC operation. The results of the measurements are presented in tables and diagrams.

  9. SU-E-T-473: A Patient-Specific QC Paradigm Based On Trajectory Log Files and DICOM Plan Files

    International Nuclear Information System (INIS)

    DeMarco, J; McCloskey, S; Low, D; Moran, J

    2014-01-01

    Purpose: To evaluate a remote QC tool for monitoring treatment machine parameters and treatment workflow. Methods: The Varian TrueBeam(TM) linear accelerator is a digital machine that records machine axis parameters and MLC leaf positions as a function of delivered monitor units or control points. This information is saved to a binary trajectory log file for every treatment or imaging field in the patient treatment session. A MATLAB analysis routine was developed to parse the trajectory log files for a given patient, compare the expected versus actual machine and MLC positions, and perform a cross-comparison with the DICOM-RT plan file exported from the treatment planning system. The parsing routine sorts the trajectory log files based on the time and date stamp and generates a sequential report file listing treatment parameters and providing a match relative to the DICOM-RT plan file. Results: The trajectory log parsing routine was compared against a standard record-and-verify listing for patients undergoing initial IMRT dosimetry verification and weekly and final chart QC. The complete treatment course was independently verified for 10 patients of varying treatment site, and a total of 1267 treatment fields were evaluated, including pre-treatment imaging fields where applicable. In the context of IMRT plan verification, eight prostate SBRT plans with 4 arcs per plan were evaluated based on expected versus actual machine axis parameters. The average value for the maximum RMS MLC error was 0.067 ± 0.001 mm and 0.066 ± 0.002 mm for leaf banks A and B, respectively. Conclusion: A real-time QC analysis program was tested using trajectory log files and DICOM-RT plan files. The parsing routine is efficient and able to evaluate all relevant machine axis parameters during a patient treatment course, including MLC leaf positions and table positions at the time of image acquisition and during treatment.
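The expected-versus-actual MLC comparison described above reduces to a per-leaf RMS computation once the log has been parsed; a sketch on hypothetical data (the array shapes and jitter magnitude are assumptions, not the Varian trajectory log format):

```python
import numpy as np

def mlc_rms_error(expected, actual):
    """Per-leaf RMS of (expected - actual) MLC positions across control points.
    Rows = control points, columns = leaves; units in mm.
    Returns the worst-leaf RMS, the style of figure quoted in the abstract."""
    err = np.asarray(expected) - np.asarray(actual)
    per_leaf_rms = np.sqrt((err ** 2).mean(axis=0))
    return per_leaf_rms.max()

# hypothetical log data: 100 control points, 60 leaves, ~0.05 mm jitter
rng = np.random.default_rng(0)
expected = rng.uniform(-100, 100, size=(100, 60))
actual = expected + rng.normal(0, 0.05, size=(100, 60))
print(f"max RMS leaf error: {mlc_rms_error(expected, actual):.3f} mm")
```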

  10. jQC-PET, an ImageJ macro to analyse the quality control of a PET/CT

    International Nuclear Information System (INIS)

    Cortes-Rodicio, J.; Sanchez-Merino, G.; Garcia-Fidalgo, A.

    2015-01-01

    An ImageJ macro has been developed to facilitate the analysis of three PET/CT quality control procedures included in the documents from the National Electrical Manufacturers Association (NU2-2007) and the International Atomic Energy Agency (Pub-1393): image quality, uniformity, and spatial resolution. In these procedures, the generation of the regions of interest and the analysis are automated. The results obtained with the software have been compared with those of the commercial software and the literature. The use of jQC-PET allows a standardized analysis that is independent of the commercial software. (Author)

  11. Soil Sampling Plan for the transuranic storage area soil overburden and final report: Soil overburden sampling at the RWMC transuranic storage area

    International Nuclear Information System (INIS)

    Stanisich, S.N.

    1994-12-01

    This Soil Sampling Plan (SSP) has been developed to provide detailed procedural guidance for field sampling and chemical and radionuclide analysis of selected areas of soil covering waste stored at the Transuranic Storage Area (TSA) at the Idaho National Engineering Laboratory's (INEL) Radioactive Waste Management Complex (RWMC). The format and content of this SSP represent a complementary hybrid of INEL Waste Management--Environmental Restoration Program and Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) Remedial Investigation/Feasibility Study (RI/FS) sampling guidance documentation. This sampling plan also functions as a Quality Assurance Project Plan (QAPP). The QAPP serves as a controlling mechanism during sampling to ensure that all data collected are valid, reliable, and defensible. This document outlines the organization, objectives, and quality assurance/quality control (QA/QC) activities needed to achieve the desired data quality goals. The QA/QC requirements for this project are outlined in the Data Collection Quality Assurance Plan (DCQAP) for the Buried Waste Program. The DCQAP is a program plan and does not outline the site-specific requirements for the scope of work covered by this SSP.

  12. Analisa Beban Kerja Fisik dan Mental dengan Menggunakan Work Sampling dan NASA-TLX Untuk Menentukan Jumlah Operator

    Directory of Open Access Journals (Sweden)

    Anton Maretno

    2015-04-01

    The difference in work systems between the Quality Control operators and the production operators in the Particle Board division leads to a difference in workload values. This can be seen from the difference in overtime hours between the two groups, where the Quality Control operators work more overtime. This study aims to analyze the workload of the Quality Control operators and to determine the optimal number of operators needed to complete the Quality Control work. The study uses a physical workload measurement method (work sampling) and a mental workload measurement, the NASA Task Load Index (NASA-TLX). According to the physical and mental workload calculations, the job with the highest load is Quality Control (QC) of finish board (108.1%), while the lowest is QC of product (72.3%). After adding one QC finish board operator, the physical workload for the QC finish board job falls to 71.1%. The other Quality Control jobs do not require additional operators, because the idle time of the QC product operators can be used to help with the other work.

  13. Collection and preparation of bottom sediment samples for analysis of radionuclides and trace elements

    International Nuclear Information System (INIS)

    2003-07-01

    The publication is the first in a series of TECDOCs on sampling and sample handling as part of the IAEA support to improve reliability of nuclear analytical techniques (NATs) in Member State laboratories. The purpose of the document is to provide information on the methods for collecting sediments, the equipment used, and the sample preparation techniques for radionuclide and elemental analysis. The most appropriate procedures for defining the strategies and criteria for selecting sampling locations, for sample storage and transportation are also given. Elements of QA/QC and documentation needs for sampling and sediment analysis are discussed. Collection and preparation of stream and river bottom sediments, lake bottom sediments, estuary bottom sediments, and marine (shallow) bottom sediments are covered. The document is intended to be a comprehensive manual for the collection and preparation of bottom sediments as a prerequisite to obtain representative and meaningful results using NATs. Quality assurance and quality control (QA/QC) is emphasized as an important aspect to ensure proper collection, transportation, preservation, and analysis since it forms the basis for interpretation and legislation. Although there are many approaches and methods available for sediment analyses, the scope of the report is limited to sample preparation for (1) analysis of radionuclides (including sediment dating using radionuclides such as Pb-210 and Cs-137) and (2) analysis of trace, minor and major elements using nuclear and related analytical techniques such as NAA, XRF and PIXE
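For the sediment-dating application mentioned above, the simplest Pb-210 model, Constant Initial Concentration (CIC), gives the age of a layer directly from the decay law, t = ln(C0/Cz)/λ, using unsupported (excess) Pb-210 activities. A minimal sketch with illustrative activities, not real core data:

```python
import math

PB210_HALFLIFE_YR = 22.3
LAMBDA = math.log(2) / PB210_HALFLIFE_YR  # Pb-210 decay constant (1/yr)

def cic_age(c_surface, c_depth):
    """CIC Pb-210 age of a sediment layer:
    t = ln(C0 / Cz) / lambda,
    where C0 is the excess activity at the surface and Cz at depth z."""
    return math.log(c_surface / c_depth) / LAMBDA

# illustrative excess Pb-210 activities in Bq/kg
print(round(cic_age(120.0, 60.0), 1))   # → 22.3  (one half-life)
```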

  14. Collection and preparation of bottom sediment samples for analysis of radionuclides and trace elements

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-07-01

    The publication is the first in a series of TECDOCs on sampling and sample handling as part of the IAEA support to improve reliability of nuclear analytical techniques (NATs) in Member State laboratories. The purpose of the document is to provide information on the methods for collecting sediments, the equipment used, and the sample preparation techniques for radionuclide and elemental analysis. The most appropriate procedures for defining the strategies and criteria for selecting sampling locations, for sample storage and transportation are also given. Elements of QA/QC and documentation needs for sampling and sediment analysis are discussed. Collection and preparation of stream and river bottom sediments, lake bottom sediments, estuary bottom sediments, and marine (shallow) bottom sediments are covered. The document is intended to be a comprehensive manual for the collection and preparation of bottom sediments as a prerequisite to obtain representative and meaningful results using NATs. Quality assurance and quality control (QA/QC) is emphasized as an important aspect to ensure proper collection, transportation, preservation, and analysis since it forms the basis for interpretation and legislation. Although there are many approaches and methods available for sediment analyses, the scope of the report is limited to sample preparation for (1) analysis of radionuclides (including sediment dating using radionuclides such as Pb-210 and Cs-137) and (2) analysis of trace, minor and major elements using nuclear and related analytical techniques such as NAA, XRF and PIXE.

  15. Environmental analytical laboratory setup operation and QA/QC

    International Nuclear Information System (INIS)

    Hsu, J.P.; Boyd, J.A.; DeViney, S.

    1991-01-01

    Environmental analysis requires precise and timely measurements. Precise measurement is ensured through quality control, and timeliness through an efficient operation; the efficiency of the operation also ensures cost-competitiveness. Environmental analysis plays a very important role in environmental protection programs. Because of possible involvement in litigation, most environmental analyses follow stringent criteria, such as the U.S. EPA Contract Laboratory Program procedures, with analytical results documented in an orderly manner. The documentation demonstrates that all quality control steps are followed and facilitates data evaluation to determine the quality and usefulness of the data. Furthermore, the detailed records concerning sample check-in, chain of custody, standard and surrogate preparation, daily refrigerator and oven temperature monitoring, analytical and extraction logbooks, standard operating procedures, etc., are also an important part of the laboratory documentation. Quality control for environmental analysis is becoming more stringent, the required documentation is becoming more detailed, and turnaround times are shorter, while the business is becoming more cost-competitive; it appears that this trend will continue. In this paper, we discuss how to handle this high-quality, fast-paced, and demanding environmental analysis process at a competitive cost. The key to successful environmental analysis is people: the knowledge and experience of the staff are essential to a successful environmental analysis program. The ability to develop new methods is also crucial. In addition, the laboratory information system, laboratory automation, and quality assurance/quality control (QA/QC) are major factors in laboratory success. This paper concentrates on these areas.

  16. Determination of the anionic surfactant di(ethylhexyl) sodium sulfosuccinate in water samples collected from Gulf of Mexico coastal waters before and after landfall of oil from the Deepwater Horizon oil spill, May to October, 2010

    Science.gov (United States)

    Gray, James L.; Kanagy, Leslie K.; Furlong, Edward T.; McCoy, Jeff W.; Kanagy, Chris J.

    2011-01-01

    Laboratory and field QA/QC for pre-landfall samples included laboratory reagent spike and blank samples, a total of 34 replicate analyses for the 78 environmental and field blank samples, and 11 randomly chosen laboratory matrix spike samples. Laboratory and field QA/QC for post-landfall samples included laboratory reagent spike and blank samples, a laboratory 'in-bottle' duplicate for each sample, and analysis of 24 randomly chosen laboratory matrix spike samples. An average DOSS recovery of 89 ± 9.5 percent was observed in all native (non-13C4-DOSS) spikes, with a mean relative percent difference between sample duplicates of 36 percent. The reporting limit for this analysis was 0.25 micrograms per liter due to blank limitations; DOSS was not detected above that concentration in any samples collected in October (after oil landfall at certain study sites). It was detected above 0.25 micrograms per liter in 3 samples prior to oil landfall, but none exceeded the Environmental Protection Agency aquatic life criterion of 40 micrograms per liter.
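The recovery and relative-percent-difference figures quoted above are standard QC arithmetic and can be computed as:

```python
def percent_recovery(measured, spiked_amount, background=0.0):
    """Matrix spike recovery: 100 * (measured - background) / spiked amount."""
    return 100.0 * (measured - background) / spiked_amount

def relative_percent_difference(a, b):
    """RPD between duplicate results: 100 * |a - b| / mean(a, b)."""
    return 100.0 * abs(a - b) / ((a + b) / 2.0)

# hypothetical duplicate results in micrograms per liter
print(round(percent_recovery(0.92, 1.0), 1))               # → 92.0
print(round(relative_percent_difference(0.50, 0.72), 1))   # → 36.1
```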

  17. Sampling point selection for energy estimation in the quasicontinuum method

    NARCIS (Netherlands)

    Beex, L.A.A.; Peerlings, R.H.J.; Geers, M.G.D.

    2010-01-01

    The quasicontinuum (QC) method reduces computational costs of atomistic calculations by using interpolation between a small number of so-called repatoms to represent the displacements of the complete lattice and by selecting a small number of sampling atoms to estimate the total potential energy of

  18. SprayQc: a real-time LC-MS/MS quality monitoring system to maximize uptime using off the shelf components.

    Science.gov (United States)

    Scheltema, Richard A; Mann, Matthias

    2012-06-01

    With the advent of high-throughput mass spectrometry (MS)-based proteomics, the magnitude and complexity of the performed experiments has increased dramatically. Likewise, investments in chromatographic and MS instrumentation are a large proportion of the budget of proteomics laboratories. Guarding measurement quality and maximizing uptime of the LC-MS/MS systems therefore requires constant care despite automated workflows. We describe a real-time surveillance system, called SprayQc, that continuously monitors the status of the peripheral equipment to ensure that operational parameters are within an acceptable range. SprayQc is composed of multiple plug-in software components that use computer vision to analyze electrospray conditions, monitor the chromatographic device for stable backpressure, interact with a column oven to control pressure by temperature, and ensure that the mass spectrometer is still acquiring data. Action is taken when a failure condition has been detected, such as stopping the column oven and the LC flow, as well as automatically notifying the appropriate operator. Additionally, all defined metrics can be recorded synchronized on retention time with the MS acquisition file, allowing for later inspection and providing valuable information for optimization. SprayQc has been extensively tested in our laboratory, supports third-party plug-in development, and is freely available for download from http://sourceforge.org/projects/sprayqc.
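The monitor-and-react pattern SprayQc implements can be sketched generically; the check names, thresholds, and polling scheme below are assumptions for illustration, not SprayQc's actual plug-in API:

```python
import time

# hypothetical plug-in style check; the backpressure limits are illustrative
def pressure_ok(read_pressure_bar, low=150, high=400):
    """Return True while the LC backpressure reading stays in range."""
    return low <= read_pressure_bar() <= high

def monitor(checks, on_failure, poll_s=5, max_cycles=None):
    """Poll each named check; on the first failure, notify and stop."""
    cycle = 0
    while max_cycles is None or cycle < max_cycles:
        for name, check in checks.items():
            if not check():
                on_failure(name)     # e.g. stop LC flow, page the operator
                return name
        time.sleep(poll_s)
        cycle += 1

failures = []
name = monitor({"lc_backpressure": lambda: pressure_ok(lambda: 480)},
               failures.append, poll_s=0, max_cycles=3)
print(name)  # → lc_backpressure
```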

  19. A novel construction method of QC-LDPC codes based on the subgroup of the finite field multiplicative group for optical transmission systems

    Science.gov (United States)

    Yuan, Jian-guo; Zhou, Guang-xiang; Gao, Wen-chun; Wang, Yong; Lin, Jin-zhao; Pang, Yu

    2016-01-01

    According to the requirements of the increasing development of optical transmission systems, a novel construction method for quasi-cyclic low-density parity-check (QC-LDPC) codes based on a subgroup of the finite field multiplicative group is proposed. This construction method effectively avoids girth-4 phenomena and has advantages such as simpler construction, easier implementation, lower encoding/decoding complexity, better girth properties, and more flexible adjustment of code length and code rate. The simulation results show that the error correction performance of the QC-LDPC(3780,3540) code with a code rate of 93.7% constructed by the proposed method is excellent: its net coding gain is respectively 0.3 dB, 0.55 dB, 1.4 dB, and 1.98 dB higher than those of the QC-LDPC(5334,4962) code constructed by the method based on inverse element characteristics of the finite field multiplicative group, the SCG-LDPC(3969,3720) code constructed by the systematically constructed Gallager (SCG) random construction method, the LDPC(32640,30592) code in ITU-T G.975.1, and the classic RS(255,239) code widely used in optical transmission systems in ITU-T G.975, at a bit error rate (BER) of 10^-7. Therefore, the constructed QC-LDPC(3780,3540) code is more suitable for optical transmission systems.
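The girth-4 avoidance mentioned above can be verified on any candidate parity-check matrix: a length-4 cycle exists in the Tanner graph exactly when two parity-check rows share ones in two or more columns. A generic check (not the paper's construction method):

```python
import numpy as np

def has_girth_four(H):
    """A Tanner-graph 4-cycle exists iff some pair of rows of H
    shares a 1 in two or more columns (pairwise row overlap >= 2)."""
    H = np.asarray(H)
    overlap = H @ H.T                  # overlap[i, j] = shared-one count
    np.fill_diagonal(overlap, 0)       # ignore each row against itself
    return bool((overlap >= 2).any())

good = [[1, 1, 0, 0],
        [0, 1, 1, 0],
        [1, 0, 0, 1]]
bad  = [[1, 1, 0, 0],
        [1, 1, 0, 1]]
print(has_girth_four(good), has_girth_four(bad))  # → False True
```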

  20. ATACseqQC: a Bioconductor package for post-alignment quality assessment of ATAC-seq data.

    Science.gov (United States)

    Ou, Jianhong; Liu, Haibo; Yu, Jun; Kelliher, Michelle A; Castilla, Lucio H; Lawson, Nathan D; Zhu, Lihua Julie

    2018-03-01

    ATAC-seq (Assays for Transposase-Accessible Chromatin using sequencing) is a recently developed technique for genome-wide analysis of chromatin accessibility. Compared to earlier methods for assaying chromatin accessibility, ATAC-seq is faster and easier to perform, does not require cross-linking, has a higher signal-to-noise ratio, and can be performed on small cell numbers. However, to ensure a successful ATAC-seq experiment, step-by-step quality assurance processes, including both wet lab quality control and in silico quality assessment, are essential. While several tools have been developed or adopted for assessing read quality, identifying nucleosome occupancy and accessible regions from ATAC-seq data, none of the tools provide a comprehensive set of functionalities for preprocessing and quality assessment of aligned ATAC-seq datasets. We have developed a Bioconductor package, ATACseqQC, for easily generating various diagnostic plots to help researchers quickly assess the quality of their ATAC-seq data. In addition, this package contains functions to preprocess aligned ATAC-seq data for subsequent peak calling. Here we demonstrate the utilities of our package using 25 publicly available ATAC-seq datasets from four studies. We also provide guidelines on what the diagnostic plots should look like for an ideal ATAC-seq dataset. This software package has been used successfully for preprocessing and assessing several in-house and public ATAC-seq datasets. Diagnostic plots generated by this package will facilitate the quality assessment of ATAC-seq data, and help researchers to evaluate their own ATAC-seq experiments as well as select high-quality ATAC-seq datasets from public repositories such as GEO to avoid generating hypotheses or drawing conclusions from low-quality ATAC-seq experiments. The software, source code, and documentation are freely available as a Bioconductor package at https://bioconductor.org/packages/release/bioc/html/ATACseqQC.html.

  1. NGSCheckMate: software for validating sample identity in next-generation sequencing studies within and across data types.

    Science.gov (United States)

    Lee, Sejoon; Lee, Soohyun; Ouellette, Scott; Park, Woong-Yang; Lee, Eunjung A; Park, Peter J

    2017-06-20

    In many next-generation sequencing (NGS) studies, multiple samples or data types are profiled for each individual. An important quality control (QC) step in these studies is to ensure that datasets from the same subject are properly paired. Given the heterogeneity of data types, file types and sequencing depths in a multi-dimensional study, a robust program that provides a standardized metric for genotype comparisons would be useful. Here, we describe NGSCheckMate, a user-friendly software package for verifying sample identities from FASTQ, BAM or VCF files. This tool uses a model-based method to compare allele read fractions at known single-nucleotide polymorphisms, considering depth-dependent behavior of similarity metrics for identical and unrelated samples. Our evaluation shows that NGSCheckMate is effective for a variety of data types, including exome sequencing, whole-genome sequencing, RNA-seq, ChIP-seq, targeted sequencing and single-cell whole-genome sequencing, with a minimal requirement for sequencing depth (>0.5X). An alignment-free module can be run directly on FASTQ files for a quick initial check. We recommend using this software as a QC step in NGS studies. https://github.com/parklab/NGSCheckMate. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  2. Pilot studies for the North American Soil Geochemical Landscapes Project - Site selection, sampling protocols, analytical methods, and quality control protocols

    Science.gov (United States)

    Smith, D.B.; Woodruff, L.G.; O'Leary, R. M.; Cannon, W.F.; Garrett, R.G.; Kilburn, J.E.; Goldhaber, M.B.

    2009-01-01

    In 2004, the US Geological Survey (USGS) and the Geological Survey of Canada sampled and chemically analyzed soils along two transects across Canada and the USA in preparation for a planned soil geochemical survey of North America. This effort was a pilot study to test and refine sampling protocols, analytical methods, quality control protocols, and field logistics for the continental survey. A total of 220 sample sites were selected at approximately 40-km intervals along the two transects. The ideal sampling protocol at each site called for a sample from a depth of 0-5 cm and a composite of each of the O, A, and C horizons. Samples were analyzed for Ca, Fe, K, Mg, Na, S, Ti, Ag, As, Ba, Be, Bi, Cd, Ce, Co, Cr, Cs, Cu, Ga, In, La, Li, Mn, Mo, Nb, Ni, P, Pb, Rb, Sb, Sc, Sn, Sr, Te, Th, Tl, U, V, W, Y, and Zn by inductively coupled plasma-mass spectrometry and inductively coupled plasma-atomic emission spectrometry following a near-total digestion in a mixture of HCl, HNO3, HClO4, and HF. Separate methods were used for Hg, Se, total C, and carbonate-C on this same size fraction. Only Ag, In, and Te had a large percentage of concentrations below the detection limit. Quality control (QC) of the analyses was monitored at three levels: the laboratory performing the analysis, the USGS QC officer, and the principal investigator for the study. This level of review resulted in an average of one QC sample for every 20 field samples, which proved to be minimally adequate for such a large-scale survey. Additional QC samples should be added to monitor within-batch quality, to the extent that no more than 10 samples are analyzed between QC samples. Only Cr (77%), Y (82%), and Sb (80%) fell outside the acceptable limits of accuracy (% recovery between 85 and 115%), because of likely residence in mineral phases resistant to the acid digestion. A separate sample of 0-5-cm material was collected at each site for determination of organic compounds.
A subset of 73 of these samples was analyzed for a suite of
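The within-batch recommendation above (no more than 10 field samples between QC samples) can be sketched as a simple scheduling helper; the function name and batch layout are illustrative, not part of the survey protocol:

```python
def qc_positions(n_field, max_between=10):
    """Return 1-based batch positions at which to insert a QC sample
    so that no more than `max_between` field samples run between QCs."""
    positions = []
    slot = 0          # position in the combined batch (field + QC)
    since_qc = 0      # field samples analyzed since the last QC
    for _ in range(n_field):
        if since_qc == max_between:
            slot += 1
            positions.append(slot)   # a QC sample occupies this slot
            since_qc = 0
        slot += 1
        since_qc += 1
    return positions

# For 25 field samples, a QC lands after every 10th field sample:
print(qc_positions(25))  # → [11, 22]
```

For the pilot study's observed ratio of roughly one QC sample per 20 field samples, the same helper with `max_between=20` would place half as many QC slots.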

  3. APPLICATION OF QC TOOLS FOR CONTINUOUS IMPROVEMENT IN AN EXPENSIVE SEAT HARDFACING PROCESS USING TIG WELDING

    Directory of Open Access Journals (Sweden)

    Mohammed Yunus

    2016-09-01

    Full Text Available The present study was carried out to improve the quality level by identifying the prime causes of quality-related problems in the seat hardfacing process, which involves the deposition of a cobalt-based superalloy on I.C. engine valves using the TIG welding process. During the process, defects such as stellite deposition overflow, head melt, and non-uniform stellite merging are observed; taken together, these defects made the rejection level the highest in the forge shop. Widely used QC tools of the manufacturing field are applied to monitor the complete operation and drive continuous, progressive process improvement, ensuring the capability and efficiency of a firm's quality management system. The work aims to identify the various causes of rejection through a detailed study of the operation, equipment, materials and the process parameters that are critical to obtaining defect-free products, and to evolve suitable countermeasures for reducing the rejection percentage using the seven QC tools. To further understand and validate the obtained results, future studies should address the motivations, advantages, and disadvantages of applying quality control tools.

  4. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    2017-06-09

    FQC is software that facilitates large-scale quality control of FASTQ files by carrying out a QC protocol, parsing results, and aggregating quality metrics within and across experiments into an interactive dashboard. The dashboard utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.

  5. Performance Analysis of Faulty Gallager-B Decoding of QC-LDPC Codes with Applications

    Directory of Open Access Journals (Sweden)

    O. Al Rasheed

    2014-06-01

    Full Text Available In this paper we evaluate the performance of the Gallager-B algorithm, used for decoding low-density parity-check (LDPC) codes, under unreliable message computation. Our analysis is restricted to LDPC codes constructed from circulant matrices (QC-LDPC codes). Using Monte Carlo simulation we investigate the effects of different code parameters on coding system performance, under a binary symmetric communication channel and an independent transient faults model. One possible application of the presented analysis in designing memory architectures with unreliable components is considered.

  6. A windows based automated quality control system for the ICP-AES analysis of Waste Isolation Pilot Plant (WIPP) brines

    International Nuclear Information System (INIS)

    Gerth, D.J.

    1996-01-01

    High sample volume analytical laboratories typically require automation of tasks to maximize efficiency and productivity. Typical approaches target instrument operation and data reporting (LIMS), but frequently ignore the data evaluation and run-time QC aspects. Automation of these steps can save up to 50% of the time it takes to analyze, evaluate, and report data from a typical ICP-AES run. The program developed in this project addresses this need by performing a CLP-style evaluation of the run-time QC data included in an instrument run. Written in Microsoft Visual Basic 3.0, it makes use of a Microsoft Access database to store method parameters and QC sample results for control charting. In operation, the analyst enters method background data (e.g., control sample types and acceptance criteria for each analyte), which is then stored in the method database. Once the method parameters are entered, instrument data files may be imported for review. Upon import, the run is automatically checked against the desired QC criteria, QC sample data are added to the database, and failing samples are flagged appropriately. Analytes passing all QC checks are flagged for upload to the laboratory LIMS. The analyst may then review the run sample by sample and, if desired, override the computer upload flag. An exception report may be generated detailing samples that require reanalysis.
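The run-time evaluation the program performs can be sketched as a recovery check against per-analyte acceptance criteria; the analytes, limits, and function names below are illustrative assumptions, not the program's actual interface:

```python
# Hypothetical acceptance criteria, keyed by analyte: % recovery window
# for a control sample (names and limits are illustrative only).
criteria = {"Na": (85, 115), "Mg": (85, 115), "B": (80, 120)}

def evaluate_qc(measured, certified):
    """Flag each analyte in a QC sample: 'pass' if the recovery falls
    inside the acceptance window, 'fail' otherwise (triggering
    reanalysis rather than upload to the LIMS)."""
    flags = {}
    for analyte, (lo, hi) in criteria.items():
        recovery = 100.0 * measured[analyte] / certified[analyte]
        flags[analyte] = "pass" if lo <= recovery <= hi else "fail"
    return flags

flags = evaluate_qc(
    measured={"Na": 98.0, "Mg": 118.0, "B": 41.0},
    certified={"Na": 100.0, "Mg": 100.0, "B": 50.0},
)
print(flags)  # → {'Na': 'pass', 'Mg': 'fail', 'B': 'pass'}
```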

  7. Experimental and Sampling Design for the INL-2 Sample Collection Operational Test

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F.; Amidan, Brett G.; Matzke, Brett D.

    2009-02-16

    , sample extraction, and analytical methods to be used in the INL-2 study. For each of the five test events, the specified floor of the INL building will be contaminated with BG using a point-release device located in the room specified in the experimental design. Then quality control (QC), reference material coupon (RMC), judgmental, and probabilistic samples will be collected according to the sampling plan for each test event. Judgmental samples will be selected based on professional judgment and prior information. Probabilistic samples were selected with a random aspect and in sufficient numbers to provide desired confidence for detecting contamination or clearing uncontaminated (or decontaminated) areas. Following sample collection for a given test event, the INL building will be decontaminated. For possibly contaminated areas, the numbers of probabilistic samples were chosen to provide 95% confidence of detecting contaminated areas of specified sizes. For rooms that may be uncontaminated following a contamination event, or for whole floors after decontamination, the numbers of judgmental and probabilistic samples were chosen using the CJR approach. The numbers of samples were chosen to support making X%/Y% clearance statements with X = 95% or 99% and Y = 96% or 97%. The experimental and sampling design also provides for making X%/Y% clearance statements using only probabilistic samples. For each test event, the numbers of characterization and clearance samples were selected within limits based on operational considerations while still maintaining high confidence for detection and clearance aspects. The sampling design for all five test events contains 2085 samples, with 1142 after contamination and 943 after decontamination. These numbers include QC, RMC, judgmental, and probabilistic samples. The experimental and sampling design specified in this report provides a good statistical foundation for achieving the objectives of the INL-2 study.
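The report does not give its sample-size formulas, but the standard argument for sizing probabilistic samples is this: if contamination covers a fraction p of candidate locations, the chance that n simple-random samples hit at least one contaminated location is 1 − (1 − p)^n. A minimal sketch of that calculation:

```python
import math

def n_for_detection(confidence, frac_contaminated):
    """Smallest number of simple-random samples giving the stated
    probability of hitting at least one contaminated location, assuming
    a fraction `frac_contaminated` of locations is contaminated:
    solve 1 - (1 - p)**n >= confidence for n."""
    p, c = frac_contaminated, confidence
    return math.ceil(math.log(1.0 - c) / math.log(1.0 - p))

# The classic "95% confidence, 5% contaminated" case:
print(n_for_detection(0.95, 0.05))  # → 59
```

The actual INL-2 design also layers judgmental and QC samples on top of this probabilistic floor, so the number above is a lower bound under the stated assumption, not the study's sample count.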

  8. Establishing daily quality control (QC) in screen-film mammography using leeds tor (max) phantom at the breast imaging unit of USTH-Benavides Cancer Institute

    Science.gov (United States)

    Acaba, K. J. C.; Cinco, L. D.; Melchor, J. N.

    2016-03-01

    Daily QC tests performed on screen film mammography (SFM) equipment are essential to ensure that both the SFM unit and the film processor are working in a consistent manner. The Breast Imaging Unit of USTH-Benavides Cancer Institute has been conducting QC following the test protocols in the IAEA Human Health Series No. 2 manual. However, the availability of the Leeds breast phantom (CRP E13039) in the facility made the task easier. Instead of carrying out separate tests on AEC constancy and light sensitometry, only one exposure of the phantom is done to accomplish the two tests. It was observed that measurements made on mAs output and optical densities (ODs) using the Leeds TOR (MAX) phantom are comparable with those obtained from the usual conduct of tests, taking into account the attenuation characteristic of the phantom. Image quality parameters such as low-contrast and high-contrast details were also evaluated from the phantom image. The authors recognize the usefulness of the phantom in determining technical factors that will help improve detection of the smallest pathological details on breast images. The phantom is also convenient for daily QC monitoring and economical, since fewer films are expended.

  9. Certification of Trace Element Mass Fractions in IAEA-458 Marine Sediment Sample

    International Nuclear Information System (INIS)

    2013-01-01

    The primary goal of the IAEA Environment Laboratories (NAEL) is to help Member States understand, monitor and protect the marine environment. The major impact exerted by large coastal cities on marine ecosystems is therefore of great concern to the IAEA and its Environment Laboratories. Given that marine pollution assessments of such impacts depend on accurate knowledge of contaminant concentrations in various environmental compartments, the NAEL has assisted national laboratories and regional laboratory networks through its Reference Products for Environment and Trade programme since the early 1970s. Quality assurance (QA), quality control (QC) and associated good laboratory practice are essential components of all marine environmental monitoring studies. QC procedures are commonly based on the analysis of certified reference materials and reference samples in order to validate analytical methods used in monitoring studies and to assess reliability and comparability of measurement data. QA can be realized by participation in externally organized laboratory performance studies, also known as interlaboratory comparisons, which compare and evaluate the analytical performance and measurement capabilities of participating laboratories. Data that are not based on adequate QA/QC can be erroneous, and their misuse can lead to incorrect environmental management decisions. This report describes the sample preparation methodology, material homogeneity and stability study, selection of laboratories, evaluation of results from the certification campaign and assignment of property values and their associated uncertainty. As a result, reference values for mass fractions and associated expanded uncertainty for 16 trace elements (Al, As, Cd, Cr, Co, Cu, Fe, Hg, Li, Mn, Ni, Pb, Sr, Sn, V and Zn) in marine sediment were established

  10. Quality-control activities of the Hanford Environmental Surveillance Program

    International Nuclear Information System (INIS)

    Price, K.R.; Jaquish, R.E.

    1982-01-01

    A comprehensive approach to quality control (QC) has been developed by the Pacific Northwest Laboratory for the Hanford Environmental Surveillance Program. The framework of quality control for the surveillance program has been documented in a QC implementation guide wherein QC requirements are specified and specific responsibilities and authorities are described. Subjects in the guide include the collection, analysis, and reporting of samples as well as equipment calibration and maintenance, training, audits, and record keeping. A QC file and library have been established to store pertinent documentation, records, and references for ready access

  11. Evaluation of methods to reduce background using the Python-based ELISA_QC program.

    Science.gov (United States)

    Webster, Rose P; Cohen, Cinder F; Saeed, Fatima O; Wetzel, Hanna N; Ball, William J; Kirley, Terence L; Norman, Andrew B

    2018-05-01

    Almost all immunological approaches [immunohistochemistry, enzyme-linked immunosorbent assay (ELISA), Western blot] that are used to quantitate specific proteins have had to address high backgrounds due to non-specific reactivity. We report here for the first time a quantitative comparison of methods for reduction of the background of commercial biotinylated antibodies using the Python-based ELISA_QC program. This is demonstrated using a recombinant humanized anti-cocaine monoclonal antibody. Several approaches, such as adjustment of the incubation time and the concentration of blocking agent, as well as the dilution of secondary antibodies, have been explored to address this issue. In this report, systematic comparisons of two different methods, contrasted with other more traditional methods to address this problem, are provided. Addition of heparin (HP) at 1 μg/ml to the wash buffer prior to addition of the secondary biotinylated antibody reduced the elevated background absorbance values (from a mean of 0.313 ± 0.015 to 0.137 ± 0.002). A novel immunodepletion (ID) method also reduced the background (from a mean of 0.331 ± 0.010 to 0.146 ± 0.013). Overall, the ID method generated results more similar at each concentration of the ELISA standard curve to those using the standard lot 1 than the HP method, as analyzed by the Python-based ELISA_QC program. We conclude that the ID method, while more laborious, provides the best solution to resolve the high background seen with specific lots of biotinylated secondary antibody. Copyright © 2018. Published by Elsevier B.V.

  12. Recursive construction of (J,L) QC LDPC codes with girth 6

    Directory of Open Access Journals (Sweden)

    Mohammad Gholami

    2016-06-01

    Full Text Available In this paper, a recursive algorithm is presented to generate some exponent matrices which correspond to Tanner graphs with girth at least 6. For a J×L exponent matrix E, the lower bound Q(E) is obtained explicitly such that (J,L) QC LDPC codes with girth at least 6 exist for any circulant permutation matrix (CPM) size m ≥ Q(E). The results show that the exponent matrices constructed with our recursive algorithm have a smaller lower bound than the ones proposed recently with girth 6.
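A (J,L) QC-LDPC code built from an exponent matrix has girth at least 6 exactly when no 2×2 submatrix of exponents closes a 4-cycle modulo the CPM size m. A minimal checker for that standard condition (this illustrates the girth criterion, not the paper's recursive construction) might look like:

```python
from itertools import combinations

def girth_at_least_6(E, m):
    """Standard 4-cycle condition for a QC-LDPC exponent matrix E with
    CPM size m: the Tanner graph has girth >= 6 iff no 2x2 submatrix
    satisfies E[i][j] - E[i][l] + E[k][l] - E[k][j] ≡ 0 (mod m)."""
    J, L = len(E), len(E[0])
    for i, k in combinations(range(J), 2):
        for j, l in combinations(range(L), 2):
            if (E[i][j] - E[i][l] + E[k][l] - E[k][j]) % m == 0:
                return False
    return True

E = [[0, 0, 0], [0, 1, 2]]
print(girth_at_least_6(E, 7))  # → True  (no 4-cycles at m = 7)
print(girth_at_least_6(E, 2))  # → False (m below the lower bound)
```

Scanning m upward with such a checker recovers a lower bound in the spirit of the paper's Q(E): the smallest CPM size at which the condition first holds.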

  13. QA/QC Reflected in ISO 11137; The Role of Dosimetry in the Validation Process

    International Nuclear Information System (INIS)

    Kovacs, A.

    2007-01-01

    Standardized dosimetry (ISO/ASTM standards), as a tool of QC, plays a key role in the validation of sterilization and food irradiation processes, as well as in controlling the radiation processing of polymer products. In radiation processing, validation and process control (e.g. sterilization, food irradiation) depend on the measurement of absorbed dose. These measurements shall be performed using a dosimetric system or systems having a known level of accuracy and precision (European standard EN 552:1994). In the presented lecture, different aspects of operational qualification during the radiation processing of polymer products are described.

  14. PubChemQC Project: A Large-Scale First-Principles Electronic Structure Database for Data-Driven Chemistry.

    Science.gov (United States)

    Nakata, Maho; Shimazaki, Tomomi

    2017-06-26

    Large-scale molecular databases play an essential role in the investigation of various subjects such as the development of organic materials, in silico drug design, and data-driven studies with machine learning. We have developed a large-scale quantum chemistry database based on first-principles methods. Our database currently contains the ground-state electronic structures of 3 million molecules based on density functional theory (DFT) at the B3LYP/6-31G* level, and we successively calculated 10 low-lying excited states of over 2 million molecules via time-dependent DFT with the B3LYP functional and the 6-31+G* basis set. To select the molecules calculated in our project, we referred to the PubChem Project, which provided the molecular structures as short strings in the InChI and SMILES representations. Accordingly, we have named our quantum chemistry database project "PubChemQC" ( http://pubchemqc.riken.jp/ ) and placed it in the public domain. In this paper, we show the fundamental features of the PubChemQC database and discuss the techniques used to construct the data set for large-scale quantum chemistry calculations. We also present a machine learning approach to predict the electronic structure of molecules as an example to demonstrate the suitability of the large-scale quantum chemistry database.

  15. Development of manufacturing equipment and QC equipment for DUPIC fuel

    International Nuclear Information System (INIS)

    Yang, Myung Seung; Park, J.J.; Lee, J.W.; Kim, S.S.; Yim, S.P.; Kim, J.H.; Kim, K.H.; Na, S.H.; Kim, W.K.; Shin, J.M.; Lee, D.Y.; Cho, K.H.; Lee, Y.S.; Sohn, J.S.; Kim, M.J.

    1999-05-01

    In this study, DUPIC powder and pellet fabrication equipment, a welding system, QC equipment, and a fission gas treatment system are developed to fabricate DUPIC fuel at the IMEF M6 hot cell. The systems are improved to be suitable for remote operation and maintenance with the manipulator at the hot cell. Powder and pellet fabrication equipment have recently been developed and are under performance testing to check remote operation and maintenance. A welding chamber and jigs are designed and developed to remotely weld DUPIC fuel rods with manipulators at the hot cell. Remote quality control equipment is being tested for analysis and inspection of DUPIC fuel characteristics at the hot cell. Trapping characteristics are analyzed for cesium and ruthenium released under the oxidation/reduction and sintering processes. The design criteria and process flow diagram of the fission gas treatment system are prepared incorporating the experimental results. The fission gas treatment system has been successfully manufactured. (Author). 33 refs., 14 tabs., 91 figs

  16. Statistical validation of reagent lot change in the clinical chemistry laboratory can confer insights on good clinical laboratory practice.

    Science.gov (United States)

    Cho, Min-Chul; Kim, So Young; Jeong, Tae-Dong; Lee, Woochang; Chun, Sail; Min, Won-Ki

    2014-11-01

    Verification of a new reagent lot's suitability is necessary to ensure that results for patients' samples are consistent before and after reagent lot changes. A typical procedure is to measure results for some patients' samples along with quality control (QC) materials. In this study, the results of patients' samples and QC materials across reagent lot changes were analysed, and an approach to adjusting the QC target range along with reagent lot changes is proposed. Patients' sample and QC material results from 360 reagent lot change events involving 61 analytes and eight instrument platforms were analysed. The between-lot differences for the patients' samples (ΔP) and the QC materials (ΔQC) were tested by Mann-Whitney U tests. The size of the between-lot differences in the QC data was calculated as multiples of the standard deviation (SD). The ΔP and ΔQC values differed significantly in only 7.8% of the reagent lot change events. This frequency was not affected by the assay principle or the QC material source. One SD is proposed as the cutoff for maintaining the pre-existing target range after a reagent lot change. While non-commutable QC material results were infrequent in the present study, our data confirmed that QC materials have limited usefulness when assessing new reagent lots. A 1 SD standard for establishing a new QC target range after a reagent lot change event is therefore proposed. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
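The proposed 1 SD criterion can be sketched as follows; the QC values are invented for illustration and the decision labels are not the article's wording:

```python
import statistics

def lot_change_shift_sd(old_lot_results, new_lot_mean):
    """Express the between-lot shift of QC results as a multiple of the
    old lot's SD; per the article's proposal, the pre-existing target
    range is kept when the shift is within 1 SD."""
    mean = statistics.mean(old_lot_results)
    sd = statistics.stdev(old_lot_results)   # sample SD of the old lot
    shift = abs(new_lot_mean - mean) / sd
    decision = ("keep target range" if shift <= 1.0
                else "establish new target range")
    return shift, decision

# Illustrative QC values for one analyte on the old reagent lot:
old = [100.0, 102.0, 98.0, 101.0, 99.0]
shift, decision = lot_change_shift_sd(old, new_lot_mean=100.8)
```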

  17. Certification of Trace Element Mass Fractions in IAEA-457 Marine Sediment Sample

    International Nuclear Information System (INIS)

    2013-01-01

    The primary goal of the IAEA Environment Laboratories in Monaco (NAEL) is to help Member States understand, monitor and protect the marine environment. The major impact exerted by large coastal cities on marine ecosystems is therefore of great concern to the IAEA and its Environment Laboratories. Given that marine pollution assessments of such impacts depend on accurate knowledge of contaminant concentrations in various environmental compartments, the NAEL has assisted national laboratories and regional laboratory networks through its Reference Products for Environment and Trade programme since the early 1970s. Quality assurance (QA), quality control (QC) and associated good laboratory practice are essential components of all marine environmental monitoring studies. QC procedures are commonly based on the analysis of certified reference materials and reference samples in order to validate analytical methods used in monitoring studies and to assess reliability and comparability of measurement data. QA can be realized by participation in externally organized laboratory performance studies, also known as interlaboratory comparisons, which compare and evaluate the analytical performance and measurement capabilities of participating laboratories. Data that are not based on adequate QA/QC can be erroneous, and their misuse can lead to incorrect environmental management decisions. A marine sediment sample with certified mass fractions for Ag, Al, As, Cd, Cr, Co, Cu, Fe, Hg, Li, Mn, Ni, Pb, Sn, Sr, V and Zn was recently produced by the NAEL within the framework of a project between the IAEA and the Korea Institute of Ocean Science and Technology. This report describes the sample preparation methodology, the material homogeneity and stability study, the selection of laboratories, the evaluation of results from the certification campaign and the assignment of property values and their associated expanded uncertainty.

  18. Development of the QA/QC Procedures for a Neutron Interrogation System

    Energy Technology Data Exchange (ETDEWEB)

    Obhodas, Jasmina; Sudac, Davorin; Valkovic, Vladivoj [Ruder Boskovic Institute, 10000 Zagreb (Croatia)

    2015-07-01

    In order to perform QA/QC procedures for a system dedicated to the neutron interrogation of objects for the presence of threat materials one needs to perform measurements of reference materials (RM) having the same (or similar) atomic ratios as real materials. It is well known that explosives, drugs, and various other benign materials, contain chemical elements such as hydrogen, oxygen, carbon and nitrogen in distinctly different quantities. For example, a high carbon-to-oxygen ratio (C/O) is characteristic of drugs. Explosives can be differentiated by measurement of both C/O and nitrogen-to-oxygen (N/O) ratios. The C/N ratio of the chemical warfare agents, coupled with the measurement of elements such as fluorine and phosphorus, clearly differentiate them from the conventional explosives. Correlations between theoretical values and experimental results obtained in laboratory conditions for C/O and N/C ratios of simulants of hexogen (RDX), TNT, DLM2, TATP, cocaine, heroin, yperite, tetranitromethane, peroxide methylethyl-ketone, nitromethane and ethyleneglycol dinitrate are presented. (authors)
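The theoretical elemental ratios that such reference materials are matched against follow directly from molecular formulas. A small sketch using the well-known compositions of RDX (C3H6N6O6) and TNT (C7H5N3O6); the helper names are illustrative:

```python
# Atoms per molecule for two well-known explosives; the theoretical
# C/O and N/O ratios are what neutron-interrogation results are
# compared against.
formulas = {
    "RDX": {"C": 3, "H": 6, "N": 6, "O": 6},
    "TNT": {"C": 7, "H": 5, "N": 3, "O": 6},
}

def atomic_ratio(name, numerator, denominator):
    """Theoretical atomic ratio (e.g. C/O) for a listed compound."""
    f = formulas[name]
    return f[numerator] / f[denominator]

print(atomic_ratio("RDX", "C", "O"))              # → 0.5
print(atomic_ratio("RDX", "N", "O"))              # → 1.0
print(round(atomic_ratio("TNT", "C", "O"), 3))    # → 1.167
```

Comparing such theoretical values with ratios measured on simulants in laboratory conditions is exactly the QA/QC correlation the abstract describes.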

  19. A QC approach to the determination of day-to-day reproducibility and robustness of LC-MS methods for global metabolite profiling in metabonomics/metabolomics.

    Science.gov (United States)

    Gika, Helen G; Theodoridis, Georgios A; Earll, Mark; Wilson, Ian D

    2012-09-01

    An approach to the determination of the day-to-day analytical robustness of LC-MS-based methods for global metabolic profiling using a pooled QC sample is presented for the evaluation of metabonomic/metabolomic data. A set of 60 urine samples was repeatedly analyzed on five different days and the day-to-day reproducibility of the data obtained was determined. Multivariate statistical analysis was performed with the aim of evaluating variability, and selected peaks were assessed and validated in terms of retention time stability, mass accuracy and intensity. The methodology enables the repeatability/reproducibility of extended analytical runs in large-scale studies to be determined, allowing the elimination of analytical (as opposed to biological) variability, in order to discover true patterns and correlations within the data. The day-to-day variability of the data revealed by this process suggested that, for this particular system, 3 days of continuous operation was possible without the need for maintenance and cleaning. Variation was generally based on signal intensity changes over the 7-day period of the study, and was mainly a result of source contamination.
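A common way to use such pooled-QC injections is to retain only features that are stable across the repeated QC analyses. The sketch below applies a relative-standard-deviation cutoff; the 30% default is a widely used convention in metabolomics QC, not a value stated in the article:

```python
import statistics

def qc_rsd_filter(qc_intensities, max_rsd=30.0):
    """Keep only features whose relative standard deviation (RSD, %)
    across repeated pooled-QC injections is at most `max_rsd`;
    unstable features reflect analytical rather than biological
    variability and are dropped."""
    kept = {}
    for feature, values in qc_intensities.items():
        rsd = 100.0 * statistics.stdev(values) / statistics.mean(values)
        if rsd <= max_rsd:
            kept[feature] = rsd
    return kept

# Illustrative intensities of two m/z features across four QC runs:
qc = {"m/z 180.07": [1000, 1050, 980, 1020],
      "m/z 212.00": [400, 900, 150, 620]}
print(sorted(qc_rsd_filter(qc)))  # → ['m/z 180.07']
```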

  20. Data Stewardship in the Ocean Sciences Needs to Include Physical Samples

    Science.gov (United States)

    Carter, M.; Lehnert, K.

    2016-02-01

    Across the Ocean Sciences, research involves the collection and study of samples collected above, at, and below the seafloor, including but not limited to rocks, sediments, fluids, gases, and living organisms. Many domains in the Earth Sciences have recently expressed the need for better discovery, access, and sharing of scientific samples and collections (EarthCube End-User Domain workshops, 2012 and 2013, http://earthcube.org/info/about/end-user-workshops), as has the US government (OSTP Memo, March 2014). iSamples (Internet of Samples in the Earth Sciences) is a Research Coordination Network within the EarthCube program that aims to advance the use of innovative cyberinfrastructure to support and advance the utility of physical samples and sample collections for science and ensure reproducibility of sample-based data and research results. iSamples strives to build, grow, and foster a new community of practice, in which domain scientists, curators of sample repositories and collections, computer and information scientists, software developers and technology innovators engage in and collaborate on defining, articulating, and addressing the needs and challenges of physical samples as a critical component of digital data infrastructure. A primary goal of iSamples is to deliver a community-endorsed set of best practices and standards for the registration, description, identification, and citation of physical specimens and define an actionable plan for implementation. iSamples conducted a broad community survey about sample sharing and has created 5 different working groups to address the different challenges of developing the internet of samples - from metadata schemas and unique identifiers to an architecture for a shared cyberinfrastructure to manage collections, to digitization of existing collections, to education, and ultimately to establishing the physical infrastructure that will ensure preservation and access of the physical samples. 
Repositories that curate

  1. Study of Efficiency Calibrations of HPGe Detectors for Radioactivity Measurements of Environmental Samples

    International Nuclear Information System (INIS)

    Harb, S.; Salahel Din, K.; Abbady, A.

    2009-01-01

    In this paper, we describe a method of calibrating the efficiency of HPGe gamma-ray spectrometry for bulk environmental samples (tea, crops, water, and soil), a significant part of environmental radioactivity measurements. We discuss the full energy peak efficiency (FEPE) of three HPGe detectors; because the efficiency depends on the measurement set-up, it is essential that it is determined for each set-up employed. To take full advantage of gamma-ray spectrometry, the efficiency should be known at a set of energies covering a wide range: the wider the energy range covered, the larger the number of radionuclides whose concentration can be determined. To measure the main natural gamma-ray emitters, the efficiency should be known at least from 46.54 keV (210Pb) to 1836 keV (88Y). Radioactive sources were prepared from two different standards: a first mixed standard, QCY40, containing 210Pb, 241Am, 109Cd, and 57Co, and a second, QCY48, containing 241Am, 109Cd, 57Co, 139Ce, 113Sn, 85Sr, 137Cs, 88Y, and 60Co, which is necessary in order to calculate the activity of the different radionuclides contained in a sample. In this work, we study the efficiency calibration as a function of different parameters: the gamma-ray energy, from 46.54 keV (210Pb) to 1836 keV (88Y); three different detectors, A, B, and C; the container geometry (point source, Marinelli beaker, and 1 L cylindrical bottle); the height of standard soil samples in a 250 ml bottle; and the density of standard environmental samples. These standard environmental samples must be measured before the standard solution is added, because the same environmental samples are used in order to account for self-absorption and composition, especially in the case of volume samples.
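A common way to interpolate FEPE between calibration energies is a polynomial fit of ln(efficiency) against ln(energy); the calibration points below are invented for illustration and are not the paper's measurements:

```python
import numpy as np

# Illustrative (energy, efficiency) calibration points spanning the
# 210Pb (46.54 keV) to 88Y (1836 keV) range discussed in the paper.
energy_kev = np.array([46.54, 59.5, 88.0, 122.1, 661.7, 1173.2, 1836.0])
efficiency = np.array([0.020, 0.035, 0.050, 0.055, 0.018, 0.011, 0.008])

# Cubic fit in log-log space, a standard functional form for HPGe
# efficiency curves.
coeffs = np.polyfit(np.log(energy_kev), np.log(efficiency), deg=3)

def fepe(e_kev):
    """Interpolated full-energy-peak efficiency at energy e_kev."""
    return float(np.exp(np.polyval(coeffs, np.log(e_kev))))

# Efficiency interpolated at the 137Cs line:
print(round(fepe(661.7), 4))
```

A separate curve would be fitted per detector, geometry, and sample density, which is why the paper varies those parameters one at a time.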

  2. Application of newly developed Fluoro-QC software for image quality evaluation in cardiac X-ray systems.

    Science.gov (United States)

    Oliveira, M; Lopez, G; Geambastiani, P; Ubeda, C

    2018-05-01

    A quality assurance (QA) program is a valuable tool for the continuous production of optimal quality images. The aim of this paper is to assess a newly developed automatic computer software for image quality (IQ) evaluation in fluoroscopy X-ray systems. Test object images were acquired using one fluoroscopy system, a Siemens Axiom Artis model (Siemens AG, Medical Solutions, Erlangen, Germany). The software was developed as an ImageJ plugin. Two image quality parameters were assessed: high-contrast spatial resolution (HCSR) and signal-to-noise ratio (SNR). The times required for manual and automatic image quality assessment were compared. The paired t-test was used to assess the data. p values of less than 0.05 were considered significant. The Fluoro-QC software generated faster IQ evaluation results (mean = 0.31 ± 0.08 min) than the manual procedure (mean = 4.68 ± 0.09 min). The mean difference between techniques was 4.36 min. Discrepancies were identified in the region of interest (ROI) areas drawn manually, with evidence of user dependence. The new software presented the results of two tests (HCSR = 3.06, SNR = 5.17) and also collected information from the DICOM header. Significant differences were not identified between manual and automatic measures of SNR (p value = 0.22) and HCSR (p value = 0.46). The Fluoro-QC software is a feasible, fast and free-to-use method for evaluating imaging quality parameters on fluoroscopy systems. Copyright © 2017 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.
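One plausible form of the ROI-based SNR measurement is mean signal divided by the background standard deviation; the paper does not spell out its exact formula, so the definition, ROI layout, and image below are assumptions for illustration:

```python
import numpy as np

def roi_snr(image, signal_roi, background_roi):
    """SNR from two rectangular ROIs given as (row0, row1, col0, col1):
    mean of the signal ROI divided by the standard deviation of the
    background ROI. One common definition among several in use."""
    r0, r1, c0, c1 = signal_roi
    b0, b1, d0, d1 = background_roi
    signal = image[r0:r1, c0:c1].mean()
    noise = image[b0:b1, d0:d1].std()
    return signal / noise

# Synthetic test image: noisy background with a bright insert,
# mimicking a test-object exposure.
rng = np.random.default_rng(0)
img = rng.normal(100, 5, size=(64, 64))
img[20:30, 20:30] += 50
print(round(roi_snr(img, (20, 30, 20, 30), (40, 60, 40, 60)), 1))
```

Automating this over fixed ROI coordinates is precisely what removes the user dependence the paper observed in manually drawn ROIs.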

  3. QA/QC For Radon Concentration Measurement With Charcoal Canister

    International Nuclear Information System (INIS)

    Pantelic, G.; Zivanovic, M.; Rajacic, M.; Krneta Nikolic, J.; Todorovic, D.

    2015-01-01

    The primary concern of any measurement of radon or radon progeny must be the quality of the results. A good quality assurance program, when properly designed and diligently followed, ensures that laboratory staff will be able to produce the type and quality of measurement results which is needed and expected. Active charcoal detectors are used for testing the concentration of radon in dwellings. The method of measurement is based on radon adsorption on charcoal and measurement of the gamma radiation of radon daughters. After the detectors are closed, the measurement is carried out once equilibrium between radon and its daughters is achieved (at least 3 hours), using a NaI or HPGe detector. Radon concentrations as well as measurement uncertainties were calculated according to US EPA protocol 520/5-87-005. The detectors used for the measurements were calibrated with a 226Ra standard of known activity in the same geometry. Standard and background canisters are used for QA and QC, as well as for the calibration of the measurement equipment. A standard canister is a sealed canister with the same matrix and geometry as the canisters used for measurements, but with a known activity of radon. A background canister is a regular radon measurement canister which has never been exposed. The detector background and detector efficiency are measured to ascertain whether they are within the warning and acceptance limits. (author).
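The concentration calculation has the general form used in EPA 520/5-87-005: a net count rate divided by counting efficiency, exposure time, a radioactive-decay factor for the delay before counting, and a calibration factor. The sketch below follows that general form with illustrative argument values and units; consult the protocol itself for the exact factors and their determination:

```python
import math

RN222_HALF_LIFE_H = 91.8  # 222Rn half-life in hours (3.82 days)

def radon_concentration(net_cpm, efficiency, exposure_h,
                        delay_h, cal_factor):
    """Radon concentration from a charcoal canister measurement:
    net count rate corrected for counting efficiency, exposure time,
    decay between mid-exposure and counting, and a canister
    calibration factor. Values and units are illustrative."""
    lam = math.log(2) / RN222_HALF_LIFE_H
    decay_factor = math.exp(-lam * delay_h)  # decay during the delay
    return net_cpm / (efficiency * exposure_h * decay_factor * cal_factor)

c = radon_concentration(net_cpm=120.0, efficiency=0.25,
                        exposure_h=48.0, delay_h=24.0, cal_factor=0.1)
```

The standard canister described above, with its known radon activity, is what pins down the efficiency and calibration factor in practice.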

  4. Application of Passive Sampling to Characterise the Fish Exometabolome

    Directory of Open Access Journals (Sweden)

    Mark R. Viant

    2017-02-01

    Full Text Available The endogenous metabolites excreted by organisms into their surrounding environment, termed the exometabolome, are important for many processes including chemical communication. In fish biology, such metabolites are also known to be informative markers of physiological status. While metabolomics is increasingly used to investigate the endogenous biochemistry of organisms, no non-targeted studies of the metabolic complexity of fish exometabolomes have been reported to date. In environmental chemistry, Chemcatcher® (Portsmouth, UK passive samplers have been developed to sample for micro-pollutants in water. Given the importance of the fish exometabolome, we sought to evaluate the capability of Chemcatcher® samplers to capture a broad spectrum of endogenous metabolites excreted by fish and to measure these using non-targeted direct infusion mass spectrometry metabolomics. The capabilities of C18 and styrene divinylbenzene reversed-phase sulfonated (SDB-RPS Empore™ disks for capturing non-polar and polar metabolites, respectively, were compared. Furthermore, we investigated real, complex metabolite mixtures excreted from two model fish species, rainbow trout (Oncorhynchus mykiss and three-spined stickleback (Gasterosteus aculeatus. In total, 344 biological samples and 28 QC samples were analysed, revealing 646 and 215 m/z peaks from trout and stickleback, respectively. The measured exometabolomes were principally affected by the type of Empore™ (Hemel Hempstead, UK disk and also by the sampling time. Many peaks were putatively annotated, including several bile acids (e.g., chenodeoxycholate, taurocholate, glycocholate, glycolithocholate, glycochenodeoxycholate, glycodeoxycholate. Collectively these observations show the ability of Chemcatcher® passive samplers to capture endogenous metabolites excreted from fish.

  5. [Establishment and assessment of QA/QC method for sampling and analysis of atmosphere background CO2].

    Science.gov (United States)

    Liu, Li-xin; Zhou, Ling-xi; Xia, Ling-jun; Wang, Hong-yang; Fang, Shuang-xi

    2014-12-01

To strengthen scientific management and sharing of greenhouse gas data obtained from atmospheric background stations in China, it is important to ensure the standardization of quality assurance and quality control methods for background CO2 sampling and analysis. Based on the greenhouse gas sampling and observation experience of CMA, and using portable sampling observation and the WS-CRDS analysis technique as an example, the quality assurance measures for atmospheric CO2 sampling and observation at the Waliguan station (Qinghai), the glass bottle quality assurance measures, the systematic quality control method during sample analysis, the correction method during data processing, as well as the data grading quality markers and the data fitting interpolation method are systematically introduced. Finally, using this method, the CO2 sampling and observation data at the atmospheric background stations in 3 typical regions were processed and the concentration variation characteristics were analyzed, indicating that this method can capture the influence of regional and local environmental factors on the observation results, and reflect the characteristics of natural and human activities in an objective and accurate way.

  6. Sampling and Analysis for Tank 241-AW-104 Waste in Support of Evaporator Campaign 2001-1

    International Nuclear Information System (INIS)

    MCKINNEY, S.G.

    2000-01-01

This Tank Sampling and Analysis Plan (TSAP) identifies sample collection, laboratory analysis, quality assurance/quality control (QA/QC), and reporting objectives for the characterization of tank 241-AW-104 waste. Technical bases for these objectives are specified in the 242-A Evaporator Data Quality Objectives (Bowman 2000a and Von Bargen 1998), the 242-A Evaporator Quality Assurance Project Plan (Bowman 1998 and Bowman 2000b), and Tank 241-AW-104 Sampling Requirements in Support of Evaporator Campaign 2000-1 (Le 2000). Characterization results will be used to support the evaporator campaign currently planned for early fiscal year 2001. No other needs (or issues) requiring data for this tank waste apply to this sampling event.

  7. The Development of Quality Control Genotyping Approaches: A Case Study Using Elite Maize Lines.

    Directory of Open Access Journals (Sweden)

    Jiafa Chen

Full Text Available Quality control (QC) of germplasm identity and purity is a critical component of breeding and conservation activities. SNP genotyping technologies and the increased availability of markers provide the opportunity to employ genotyping as a low-cost and robust component of this QC. In the public sector, available low-cost SNP QC genotyping methods have been developed from a very limited panel of 1,000 to 1,500 markers, without broad selection of the most informative SNPs. Selection of optimal SNPs and definition of appropriate germplasm sampling, in addition to platform selection, impact logistical and resource-use considerations for breeding and conservation applications when mainstreaming QC. In order to address these issues, we evaluated the selection and use of SNPs for QC applications from large DArTSeq data sets generated from CIMMYT maize inbred lines (CMLs). Two QC genotyping strategies were developed: the first is a "rapid QC", employing a small number of SNPs to identify potential mislabeling of seed packages or plots; the second is a "broad QC", employing a larger number of SNPs, used to identify each germplasm entry and to measure heterogeneity. The optimal marker selection strategies combined the selection of markers with high minor allele frequency, sampling of clustered SNPs in proportion to marker cluster distance, and selecting markers that maintain a uniform genomic distribution. The rapid and broad QC SNP panels selected using this approach were further validated using blind test assessments of related re-generation samples. The influence of sampling within each line was evaluated. Sampling 192 individuals would result in close to a 100% probability of detecting a 5% contamination in the entry, and approximately a 98% probability of detecting a 2% contamination of the line. These results provide a framework for the establishment of QC genotyping. A comparison of financial and time costs for use of these approaches across different
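The quoted detection probabilities follow from a simple binomial argument: if a fraction p of an entry is contaminated and n individuals are genotyped, the probability of observing at least one contaminant is 1 − (1 − p)^n. A minimal sketch of that calculation (not code from the study):

```python
def detection_probability(contamination_rate: float, n_sampled: int) -> float:
    """P(at least one contaminated individual in a random sample of n)."""
    return 1.0 - (1.0 - contamination_rate) ** n_sampled

for rate in (0.05, 0.02):
    p = detection_probability(rate, 192)
    print(f"{rate:.0%} contamination, n = 192: P(detect) = {p:.4f}")
```

With n = 192 this gives roughly 0.9999 for a 5% contamination and roughly 0.98 for a 2% contamination, consistent with the figures quoted above.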

  8. Including Below Detection Limit Samples in Post Decommissioning Soil Sample Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jung Hwan; Yim, Man Sung [KAIST, Daejeon (Korea, Republic of)

    2016-05-15

To meet the required standards, the site owner has to show that the soil at the facility has been sufficiently cleaned up. To do this, one must know the contamination of the soil at the site prior to clean-up, which involves sampling that soil to identify the degree of contamination. However, there is a technical difficulty in determining how much decontamination should be done. The problem arises when measured samples are below the detection limit. Regulatory guidelines for site reuse after decommissioning are commonly challenged because the majority of the activity in the soil is at or below the limit of detection. Using additional statistical analyses of contaminated soil after decommissioning is expected to have the following advantages: a better and more reliable probabilistic exposure assessment, better economics (lower project costs), and improved communication with the public. This research will develop an approach that defines an acceptable method for demonstrating compliance of decommissioned NPP sites and validates that compliance. Soil samples from NPPs often contain censored data. Conventional methods for dealing with censored data sets are statistically biased and limited in their usefulness.
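One common alternative to the biased substitution methods mentioned above is a maximum-likelihood fit that treats non-detects as left-censored observations: detects contribute density terms and non-detects contribute the probability mass below the detection limit. The sketch below assumes lognormally distributed soil activity and uses entirely made-up numbers; it illustrates the statistical idea, not the specific method this research will develop:

```python
import numpy as np
from scipy import optimize, stats

# Hypothetical soil activities (Bq/kg); 7 samples fell below the
# detection limit of 0.5 Bq/kg and were reported only as "< DL".
detection_limit = 0.5
detects = np.array([0.8, 1.2, 0.6, 2.1, 0.9])
n_censored = 7

def neg_log_lik(params):
    """Negative log-likelihood for a lognormal fit with left-censoring."""
    mu, sigma = params  # mean and sd of log-activity
    if sigma <= 0:
        return np.inf
    ll = stats.norm.logpdf(np.log(detects), mu, sigma).sum()
    ll += n_censored * stats.norm.logcdf(np.log(detection_limit), mu, sigma)
    return -ll

res = optimize.minimize(neg_log_lik, x0=[0.0, 1.0], method="Nelder-Mead")
mu, sigma = res.x
mean_activity = np.exp(mu + sigma**2 / 2)  # mean of the fitted lognormal
print(f"estimated mean activity: {mean_activity:.3f} Bq/kg")
```

Unlike replacing non-detects with 0, DL, or DL/2, the censored-likelihood estimate uses all twelve observations without an arbitrary imputed value.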

  9. Performance evaluation of the QC-6PLUS quality control system in terms of photons and electrons absorbed doses to water; Avaliacao do desempenho do sistema de controle da qualidade QC-6Plus em termos de dose absorvida na agua para fotons e eletrons

    Energy Technology Data Exchange (ETDEWEB)

    Teixeira, Flavia Cristina da Silva

    2004-06-15

The quality of treatment in radiotherapy depends on accurate knowledge of the dose delivered to the tumor and of several other physical and dosimetric parameters that characterize the profile of the radiation field. Motivated by concerns about the reliability of some commercial equipment intended to determine the main parameters of a radiation field in a practical way for daily checks in an institution with a radiotherapy service, this work studied the performance of the QC6-Plus quality assurance system, manufactured by PTW-Freiburg for daily checks, in order to allow its use with a greater level of reliability in the routine quality assurance of hospitals, as well as to make possible its use in the Program of Regulatory Inspections of the Radiotherapy Services of the country carried out by IRD/CNEN. The results indicate that the QC6-Plus system is well suited and practical for relative measurements in the daily and weekly control of the main parameters of clinical beams, in agreement with the reference values recommended in TECDOC 1151. However, it should not be used for absolute dose measurements: for electron beams the system does not have the characteristics necessary to perform this type of measurement in agreement with the reference protocol, TRS 398, and for 15 MV photons it presented a deviation of 7.7% relative to the conventional method of absolute dosimetry, which is well above what is expected according to TRS 398. (author)

  10. Quality assurance and quality control procedures in river water radioecological monitoring

    International Nuclear Information System (INIS)

    Nalbandyan, A.; Stepanyan, A.

    2006-01-01

For recent decades, the issue of radioactive pollution of environmental components has acquired a global character as a result of nuclear weapon testing, accidents at NPPs, the development of nuclear technologies, and so on. The study object of this research is river water, as rivers are known to be radionuclide transport and accumulation media, and radioactive elements in river water are present as radioactive salts and as mechanical and biological pollutants. Moreover, river water is widely used for various economic and commercial purposes and serves as a drinking water supply source as well. The ongoing research is performed in the frame of the NATO/OSCE project 'South Caucasus River Monitoring'. The topicality of the problem dictates the necessity of obtaining credible and comparable results. For adequate radioactive pollution assessment, the application and maintenance of standard QA/QC procedures at all stages of radioecological monitoring are decisive. In our research we apply the following ISO standard-based QA/QC procedures: sampling (emphasizing sample identification: sample collection site, date, and method), sample transportation (keeping sample conservation and storage requirements), sample treatment and preparation in the laboratory, radiometric measurements of samples with regard to the time elapsed from the sampling moment to analysis, control and calibration of analytical instruments, and control analysis of samples. The obtained data are processed through standard statistical QC methods to check measurement errors. Gamma-spectrometric measurements are made using Genie-2000 (Canberra) software, which includes a separate program for measurement QC. The final outcomes are arranged in special protocols (analysis and sampling task protocols, sampling task form, field measurement protocol, sample chain-of-custody form, sample analysis protocol) and compiled in appropriate databases.

  11. Research And Establishment Of The Analytical Procedure For/Of Sr-90 In Milk Samples

    International Nuclear Information System (INIS)

    Tran Thi Tuyet Mai; Duong Duc Thang; Nguyen Thi Linh; Bui Thi Anh Duong

    2014-01-01

Sr-90 is an indicator of the transfer of radionuclides from the environment to humans. This work was set up to build a procedure for Sr-90 determination in the main popular foodstuffs, with a focus on fresh milk. The goals were to establish the procedure for Sr-90, assess the chemical yield, and test samples of Vietnamese fresh milk; QA/QC for the procedure was also carried out using an IAEA standard sample. The procedure for the determination of Sr-90 in milk has been completed. The chemical recovery yields for Y-90 and Sr-90 were 46.76% ± 1.25% and 0.78 ± 0.086, respectively. The QA/QC program was carried out using the reference material IAEA-373, and the result agrees well with the certified value. Three reference samples were analysed with 15 measurements. The measured Sr-90 concentration, after statistical processing, was 3.69 Bq/kg with an uncertainty of 0.23 Bq/kg. The certified IAEA-154 value for Sr-90 (half-life 28.8 years) is 6.9 Bq/kg, with a 95% confidence interval of (6.0-8.0) Bq/kg as of 31 August 1987. After correcting for decay, the activity at the time of measurement is 3.67 Bq/kg, meaning that the result of this work matches the IAEA certified value very well. Five Vietnamese fresh milk samples were analysed for Sr-90; the specific activity of Sr-90 in milk was in the range from 0.032 to 0.041 Bq/l. (author)
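The decay correction used to compare the measurement with the 1987 certificate value follows A(t) = A₀ · 2^(−t / T½). A quick check of the numbers quoted above (the exact comparison date is an assumption, since the abstract gives only "at this time"):

```python
from datetime import date

def decay_correct(a0: float, half_life_years: float, t_years: float) -> float:
    """Activity remaining after t_years of decay from initial activity a0."""
    return a0 * 2.0 ** (-t_years / half_life_years)

# Certificate reference date 1987-08-31; assumed measurement date early 2014.
t = (date(2014, 1, 1) - date(1987, 8, 31)).days / 365.25
a = decay_correct(6.9, 28.8, t)
print(f"decayed activity: {a:.2f} Bq/kg")  # close to the 3.67 Bq/kg quoted
```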

  12. A DOE manual: DOE Methods for Evaluating Environmental and Waste Management Samples

    International Nuclear Information System (INIS)

    Goheen, S.C.; McCulloch, M.; Riley, R.G.

    1994-01-01

Waste management inherently requires knowledge of the waste's chemical composition. The waste can often be analyzed by established methods; however, if the samples are radioactive, or are plagued by other complications, established methods may not be feasible. The US Department of Energy (DOE) has been faced with managing some waste types that are not amenable to standard or available methods, so new or modified sampling and analysis methods are required. These methods are incorporated into DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods), which is a guidance/methods document for sampling and analysis activities in support of DOE sites. It is a document generated by consensus of DOE laboratory staff and is intended to fill the gap within existing guidance documents (e.g., the Environmental Protection Agency's (EPA's) Test Methods for Evaluating Solid Waste, SW-846), which apply to low-level or non-radioactive samples. DOE Methods fills the gap by including methods that take into account the complexities of DOE site matrices. The most recent update, distributed in October 1993, contained quality assurance (QA), quality control (QC), safety, sampling, organic analysis, inorganic analysis, and radioanalytical guidance as well as 29 methods. The next update, which will be distributed in April 1994, will contain 40 methods and will therefore have greater applicability. All new methods are either peer reviewed or labeled "draft" methods. Draft methods were added to speed the release of methods to field personnel.

  13. An analysis of lead (Pb) from human hair samples (20-40 years of age) by atomic absorption spectrophotometry

    International Nuclear Information System (INIS)

    Gelsano, Flordeliza K.; Timing, Laurie D.

    2003-01-01

This study analyzed lead from human hair samples from five different groups, namely scavengers from Payatas, Quezon City; tricycle drivers; car shop workers; paint factory workers; and students from the Polytechnic University of the Philippines. People from Nagcarlan, Laguna served as a control group, providing a baseline value. The method applied was acid digestion using HNO3 and HClO4, after which the samples were subjected to atomic absorption spectrophotometry. In terms of lead found in hair, the scavengers from Payatas, Q.C. showed the highest lead exposure among the samples tested. The concentration of lead was expressed in mg/L. (Authors)

  14. 105-N Basin sediment disposition phase-one sampling and analysis plan

    International Nuclear Information System (INIS)

    1997-01-01

The sampling and analysis plan (SAP) for Phase 1 of the 105-N Basin sediment disposition project defines the sampling and analytical activities that will be performed for the engineering assessment phase (Phase 1) of the project. A separate SAP defines the sampling and analytical activities that will be performed for the characterization phase (Phase 2) of the 105-N sediment disposition project. The Phase 1 SAP consists of the introduction (Section 1.0), the field sampling plan (FSP) (Section 2.0), and the quality assurance project plan (QAPjP) (Section 3.0). The FSP defines the sampling and analytical methodologies to be performed. The QAPjP provides information on the quality assurance/quality control (QA/QC) parameters related to the sampling and analytical methodologies. This SAP defines the strategy and the methods that will be used to sample and analyze the sediment on the floor of the 105-N Basin. The resulting data will be used to develop and evaluate engineering designs for collecting and removing sediment from the basin.

  15. Economic Design of Acceptance Sampling Plans in a Two-Stage Supply Chain

    Directory of Open Access Journals (Sweden)

    Lie-Fern Hsu

    2012-01-01

Full Text Available Supply chain management, which is concerned with material and information flows between facilities and the final customers, is now considered one of the most popular operations strategies for improving organizational competitiveness. With the advanced development of computer technology, it is getting easier to derive an acceptance sampling plan satisfying both the producer's and the consumer's quality and risk requirements. However, all the available QC tables and computer software determine the sampling plan on a non-economic basis. In this paper, we design an economic model to determine the optimal sampling plan in a two-stage supply chain, one that minimizes the producer's and the consumer's total quality cost while satisfying both the producer's and the consumer's quality and risk requirements. Numerical examples show that the optimal sampling plan is quite sensitive to the producer's product quality. The product's inspection, internal failure, and post-sale failure costs also have an effect on the optimal sampling plan.
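For a single-sampling plan (n, c), a lot is accepted when at most c defectives appear in a sample of n, so the acceptance probability at defect rate p is a binomial tail sum. The sketch below finds the smallest plan meeting illustrative producer's and consumer's risk points; the AQL/LTPD/risk values are assumptions for demonstration, not taken from the paper, which instead selects the plan by minimizing total quality cost subject to such constraints:

```python
from math import comb

def accept_prob(n: int, c: int, p: float) -> float:
    """Binomial probability of accepting a lot with defect fraction p."""
    return sum(comb(n, d) * p**d * (1 - p) ** (n - d) for d in range(c + 1))

def smallest_plan(aql, alpha, ltpd, beta, n_max=500):
    """Smallest sample size n (with some c) meeting both risk constraints."""
    for n in range(1, n_max + 1):
        for c in range(n + 1):
            if (accept_prob(n, c, aql) >= 1 - alpha
                    and accept_prob(n, c, ltpd) <= beta):
                return n, c
    return None

# Assumed risk points: AQL 1% at 5% producer's risk, LTPD 8% at 10% consumer's risk.
print(smallest_plan(aql=0.01, alpha=0.05, ltpd=0.08, beta=0.10))
```

An economic design would then search among the feasible (n, c) pairs for the one minimizing expected inspection plus failure costs rather than simply the smallest n.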

  16. Environmental data qualification system

    International Nuclear Information System (INIS)

    Hester, O.V.; Groh, M.R.

    1989-01-01

The Integrated Environmental Data Management System (IEDMS) is a PC-based system that can support environmental investigations from their design stage and throughout the duration of the study. The system integrates data originating from the sampling and analysis plan, field data, and analytical findings. The IEDMS automated features include sampling guidance forms, barcoded sample labels and tags, field and analytical form reproduction, sample tracking, analytical data qualification, completeness reports, and results and QC data reporting. The IEDMS has extensive automated capabilities that support a systematic and comprehensive process for performing quality assessment of EPA-CLP chemical analysis data. One product of this process is a unique and extremely useful tabular presentation of the data: a single table contains the complete set of results and QC data included on the CLP data forms while presenting the information in the chronological order in which the analysis was performed. 3 refs., 1 fig

  17. Performance evaluation of the QC-6PLUS quality control system in terms of photons and electrons absorbed doses to water

    International Nuclear Information System (INIS)

    Teixeira, Flavia Cristina da Silva

    2004-06-01

The quality of treatment in radiotherapy depends on accurate knowledge of the dose delivered to the tumor and of several other physical and dosimetric parameters that characterize the profile of the radiation field. Motivated by concerns about the reliability of some commercial equipment intended to determine the main parameters of a radiation field in a practical way for daily checks in an institution with a radiotherapy service, this work studied the performance of the QC6-Plus quality assurance system, manufactured by PTW-Freiburg for daily checks, in order to allow its use with a greater level of reliability in the routine quality assurance of hospitals, as well as to make possible its use in the Program of Regulatory Inspections of the Radiotherapy Services of the country carried out by IRD/CNEN. The results indicate that the QC6-Plus system is well suited and practical for relative measurements in the daily and weekly control of the main parameters of clinical beams, in agreement with the reference values recommended in TECDOC 1151. However, it should not be used for absolute dose measurements: for electron beams the system does not have the characteristics necessary to perform this type of measurement in agreement with the reference protocol, TRS 398, and for 15 MV photons it presented a deviation of 7.7% relative to the conventional method of absolute dosimetry, which is well above what is expected according to TRS 398. (author)

  18. Trends in analytical methodologies for the determination of alkylphenols and bisphenol A in water samples.

    Science.gov (United States)

    Salgueiro-González, N; Muniategui-Lorenzo, S; López-Mahía, P; Prada-Rodríguez, D

    2017-04-15

In the last decade, the impact of alkylphenols and bisphenol A in the aquatic environment has been widely evaluated because of their extensive use in industrial and household applications as well as their toxicological effects. These compounds are well-known endocrine disrupting compounds (EDCs), which can affect the hormonal systems of humans and wildlife, even at low concentrations. Because these pollutants enter the environment through water, which is the most affected compartment, analytical methods that allow the determination of these compounds in aqueous samples at low levels are mandatory. In this review, an overview of the most significant advances in the analytical methodologies for the determination of alkylphenols and bisphenol A in waters is presented (from 2002 to the present). Sample handling and instrumental detection strategies are critically discussed, including analytical parameters related to quality assurance and quality control (QA/QC). Special attention is paid to miniaturized sample preparation methodologies and approaches proposed to reduce time and reagent consumption in line with Green Chemistry principles, which have increased in the last five years. Finally, relevant applications of these methods to the analysis of water samples are examined, with wastewater and surface water being the most investigated. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Groundwater quality assessment for the Upper East Fork Poplar Creek Hydrogeologic Regime at the Y-12 Plant

    International Nuclear Information System (INIS)

    1992-02-01

This report contains groundwater quality data obtained during the 1991 calendar year at several waste management facilities and petroleum fuel underground storage tank (UST) sites associated with the Y-12 Plant. These sites are within the Upper East Fork Poplar Creek Hydrogeologic Regime (UEFPCHR), which is one of three regimes defined for the purposes of groundwater and surface-water quality monitoring and remediation. This report was prepared for informational purposes. Included are the analytical data for groundwater samples collected from selected monitoring wells during 1991 and the results for quality assurance/quality control (QA/QC) samples associated with each groundwater sample. This report also contains summaries of selected data, including ion-charge balances for each groundwater sample, a summary of analytical results for nitrate (a principal contaminant in the UEFPCHR), results of volatile organic compound (VOC) analyses validated using the associated QA/QC sample data, a summary of trace metal concentrations which exceeded drinking-water standards, and a summary of radiochemical analyses and associated counting errors.
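An ion-charge balance of the kind tabulated for each groundwater sample compares total cation and anion charge in milliequivalents per litre; a large percentage imbalance flags a questionable analysis. A sketch of the calculation with hypothetical concentrations (illustration only, not data from this report):

```python
# Equivalent weight = molar mass / |ionic charge|; concentrations in mg/L
# below are hypothetical illustration values.
CATIONS = {"Ca2+": 40.08 / 2, "Mg2+": 24.31 / 2, "Na+": 22.99, "K+": 39.10}
ANIONS = {"HCO3-": 61.02, "SO4 2-": 96.06 / 2, "Cl-": 35.45, "NO3-": 62.00}

sample = {"Ca2+": 80.0, "Mg2+": 24.0, "Na+": 46.0, "K+": 4.0,
          "HCO3-": 244.0, "SO4 2-": 96.0, "Cl-": 35.0, "NO3-": 31.0}

def meq(conc_mg_per_l: float, equiv_weight: float) -> float:
    """Convert a concentration in mg/L to milliequivalents per litre."""
    return conc_mg_per_l / equiv_weight

cat = sum(meq(sample[ion], w) for ion, w in CATIONS.items())
an = sum(meq(sample[ion], w) for ion, w in ANIONS.items())
cbe = 100.0 * (cat - an) / (cat + an)  # charge-balance error, percent
print(f"cations {cat:.2f} meq/L, anions {an:.2f} meq/L, error {cbe:+.1f}%")
```

A charge-balance error within roughly ±5% is a common acceptance criterion; larger imbalances suggest a missing major ion or an analytical problem.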

  20. Groundwater quality assessment for the Upper East Fork Poplar Creek Hydrogeologic Regime at the Y-12 Plant. 1991 groundwater quality data and calculated rate of contaminant migration

    Energy Technology Data Exchange (ETDEWEB)

    1992-02-01

This report contains groundwater quality data obtained during the 1991 calendar year at several waste management facilities and petroleum fuel underground storage tank (UST) sites associated with the Y-12 Plant. These sites are within the Upper East Fork Poplar Creek Hydrogeologic Regime (UEFPCHR), which is one of three regimes defined for the purposes of groundwater and surface-water quality monitoring and remediation. This report was prepared for informational purposes. Included are the analytical data for groundwater samples collected from selected monitoring wells during 1991 and the results for quality assurance/quality control (QA/QC) samples associated with each groundwater sample. This report also contains summaries of selected data, including ion-charge balances for each groundwater sample, a summary of analytical results for nitrate (a principal contaminant in the UEFPCHR), results of volatile organic compound (VOC) analyses validated using the associated QA/QC sample data, a summary of trace metal concentrations which exceeded drinking-water standards, and a summary of radiochemical analyses and associated counting errors.

  1. Sampling and Analysis Plan for Tank 241-AP-108 Waste in Support of Evaporator Campaign 2000-1

    International Nuclear Information System (INIS)

    RASMUSSEN, J.H.

    2000-01-01

This Tank Sampling and Analysis Plan (TSAP) identifies sample collection, laboratory analysis, quality assurance/quality control (QA/QC), and reporting objectives for the characterization of tank 241-AP-108 waste. Technical bases for these objectives are specified in the 242-A Evaporator Data Quality Objectives (Bowman 2000 and Von Bargen 1998) and 108-AP Tank Sampling Requirements in Support of Evaporator Campaign 2000-1 (Le 2000). Evaporator campaign 2000-1 will process waste from tank 241-AP-108 in addition to that from tank 241-AP-107. Characterization results will be used to support the evaporator campaign currently planned for early fiscal year 2000. No other needs (or issues) requiring data for this tank waste apply to this sampling event.

  2. Metabolomic analysis of urine samples by UHPLC-QTOF-MS: Impact of normalization strategies.

    Science.gov (United States)

    Gagnebin, Yoric; Tonoli, David; Lescuyer, Pierre; Ponte, Belen; de Seigneux, Sophie; Martin, Pierre-Yves; Schappler, Julie; Boccard, Julien; Rudaz, Serge

    2017-02-22

    Among the various biological matrices used in metabolomics, urine is a biofluid of major interest because of its non-invasive collection and its availability in large quantities. However, significant sources of variability in urine metabolomics based on UHPLC-MS are related to the analytical drift and variation of the sample concentration, thus requiring normalization. A sequential normalization strategy was developed to remove these detrimental effects, including: (i) pre-acquisition sample normalization by individual dilution factors to narrow the concentration range and to standardize the analytical conditions, (ii) post-acquisition data normalization by quality control-based robust LOESS signal correction (QC-RLSC) to correct for potential analytical drift, and (iii) post-acquisition data normalization by MS total useful signal (MSTUS) or probabilistic quotient normalization (PQN) to prevent the impact of concentration variability. This generic strategy was performed with urine samples from healthy individuals and was further implemented in the context of a clinical study to detect alterations in urine metabolomic profiles due to kidney failure. In the case of kidney failure, the relation between creatinine/osmolality and the sample concentration is modified, and relying only on these measurements for normalization could be highly detrimental. The sequential normalization strategy was demonstrated to significantly improve patient stratification by decreasing the unwanted variability and thus enhancing data quality. Copyright © 2016 Elsevier B.V. All rights reserved.
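Probabilistic quotient normalization (step iii above) estimates each sample's dilution as the median feature-wise ratio to a reference spectrum and divides the sample by that quotient. A minimal numpy sketch on synthetic data (not the study's data or code):

```python
import numpy as np

rng = np.random.default_rng(0)
profile = rng.uniform(1.0, 10.0, size=50)       # shared metabolite profile
dilutions = np.array([0.5, 1.0, 2.0, 4.0])      # unknown per-sample dilutions
# Intensity matrix: 4 samples x 50 features, with small multiplicative noise.
X = np.outer(dilutions, profile) * rng.normal(1.0, 0.01, (4, 50))

def pqn(intensities):
    """Probabilistic quotient normalization against the median spectrum."""
    reference = np.median(intensities, axis=0)   # feature-wise reference
    quotients = intensities / reference          # feature-wise ratios
    factors = np.median(quotients, axis=1)       # one dilution estimate/sample
    return intensities / factors[:, None], factors

X_norm, factors = pqn(X)
print(np.round(factors, 2))  # recovers the dilution pattern up to a constant
```

Because PQN uses the median ratio rather than total intensity, it is robust to a handful of genuinely altered metabolites, which is why it can outperform creatinine- or osmolality-based scaling when kidney function itself is perturbed.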

  3. An analysis of lead (Pb) from human hair samples (20-40 years of age) by atomic absorption spectrophotometry

    Energy Technology Data Exchange (ETDEWEB)

    Gelsano, Flordeliza K; Timing, Laurie D

    2003-02-17

    This analysis of lead from human hair samples in five different groups namely scavengers from Payatas Quezon City, tricycle drivers, car shop workers, paint factory workers, and students from Polytechnic University of the Philippines. The people from Nagcarlan, Laguna represented as a ''base-line value'' or as a control group. The method applied was acid digestion using HNO{sub 3} and HClO{sub 4} then the samples were subjected to atomic absorption spectrophotometer. In terms of lead found from hair, the scavengers from Payatas Q.C. obtained high exposure of lead among the samples that were tested. The result of the analysis of concentration of lead was expressed in mg/L. (Authors)

  4. Use of Six Sigma Worksheets for assessment of internal and external failure costs associated with candidate quality control rules for an ADVIA 120 hematology analyzer.

    Science.gov (United States)

    Cian, Francesco; Villiers, Elisabeth; Archer, Joy; Pitorri, Francesca; Freeman, Kathleen

    2014-06-01

    Quality control (QC) validation is an essential tool in total quality management of a veterinary clinical pathology laboratory. Cost-analysis can be a valuable technique to help identify an appropriate QC procedure for the laboratory, although this has never been reported in veterinary medicine. The aim of this study was to determine the applicability of the Six Sigma Quality Cost Worksheets in the evaluation of possible candidate QC rules identified by QC validation. Three months of internal QC records were analyzed. EZ Rules 3 software was used to evaluate candidate QC procedures, and the costs associated with the application of different QC rules were calculated using the Six Sigma Quality Cost Worksheets. The costs associated with the current and the candidate QC rules were compared, and the amount of cost savings was calculated. There was a significant saving when the candidate 1-2.5s, n = 3 rule was applied instead of the currently utilized 1-2s, n = 3 rule. The savings were 75% per year (£ 8232.5) based on re-evaluating all of the patient samples in addition to the controls, and 72% per year (£ 822.4) based on re-analyzing only the control materials. The savings were also shown to change accordingly with the number of samples analyzed and with the number of daily QC procedures performed. These calculations demonstrated the importance of the selection of an appropriate QC procedure, and the usefulness of the Six Sigma Costs Worksheet in determining the most cost-effective rule(s) when several candidate rules are identified by QC validation. © 2014 American Society for Veterinary Clinical Pathology and European Society for Veterinary Clinical Pathology.
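Much of the cost difference between the 1-2s and 1-2.5s rules at n = 3 comes from the false-rejection rate: an in-control run is rejected whenever any of the three controls falls outside the limit, so Pfr = 1 − (1 − p₁)³, where p₁ is the two-tailed single-control tail probability. A rough sketch of that comparison (the run count and per-rejection cost are illustrative assumptions, not figures from the paper):

```python
from scipy.stats import norm

def false_reject_prob(limit_sd: float, n_controls: int) -> float:
    """P(at least one of n in-control results exceeds +/- limit_sd)."""
    p_single = 2.0 * norm.sf(limit_sd)  # two-tailed single-control probability
    return 1.0 - (1.0 - p_single) ** n_controls

runs_per_year = 365        # assumed daily QC runs
cost_per_rejection = 50.0  # assumed rework cost per falsely rejected run (GBP)
for limit in (2.0, 2.5):
    pfr = false_reject_prob(limit, 3)
    yearly = pfr * runs_per_year * cost_per_rejection
    print(f"1-{limit}s, n=3: Pfr = {pfr:.3f}, expected cost ~ {yearly:.0f} GBP/yr")
```

With three controls, widening the limit from 2s to 2.5s cuts the false-rejection rate from about 13% to about 4% of runs, which is the mechanism behind the internal-failure savings quantified in the Six Sigma worksheets.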

  5. The Use of Signal Dimensionality for Automatic QC of Seismic Array Data

    Science.gov (United States)

    Rowe, C. A.; Stead, R. J.; Begnaud, M. L.; Draganov, D.; Maceira, M.; Gomez, M.

    2014-12-01

A significant problem in seismic array analysis is the inclusion of bad sensor channels in the beam-forming process. We are testing an approach to automated, on-the-fly quality control (QC) to aid in the identification of poorly performing sensor channels prior to beam-forming in routine event detection or location processing. The idea stems from methods used for large computer servers, where monitoring traffic at an enormous number of nodes is impractical on a node-by-node basis, so the dimensionality of the node traffic is instead monitored for anomalies that could represent malware, cyber-attacks, or other problems. The technique relies upon the use of subspace dimensionality or principal components of the overall system traffic. The subspace technique is not new to seismology, but its most common application has been limited to comparing waveforms to an a priori collection of templates for detecting highly similar events in a swarm or seismic cluster. We examine the signal dimension in a similar way to the method addressing node traffic anomalies in large computer systems. We explore the effects of malfunctioning channels on the dimension of the data and its derivatives, and how to leverage this effect to identify bad array elements. We show preliminary results applied to arrays in Kazakhstan (Makanchi) and Argentina (Malargue).
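The subspace idea can be illustrated with a toy example: coherent signal across a healthy array concentrates in the first principal component of the channel-by-sample data matrix, so a channel with almost no loading on that component is suspect. A synthetic numpy sketch (an illustration of the principle, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(1)
n_channels, n_samples = 12, 2000
t = np.linspace(0.0, 20.0, n_samples)
signal = np.sin(2 * np.pi * 1.5 * t)             # coherent wavefront proxy

# All channels record the common signal plus small independent noise...
X = np.outer(np.ones(n_channels), signal)
X += 0.1 * rng.normal(size=(n_channels, n_samples))
X[7] = rng.normal(size=n_samples)                # ...except channel 7: pure noise

# SVD of the centered channel matrix; healthy channels load strongly on PC1.
U, s, _ = np.linalg.svd(X - X.mean(axis=1, keepdims=True), full_matrices=False)
loading = np.abs(U[:, 0])                        # each channel's share of PC1
bad = int(np.argmin(loading))
print("suspect channel:", bad)
```

Real array QC must also contend with spiking, clipping, and timing errors, which perturb the subspace differently from a dead channel, but the same low-dimensional loadings can be monitored for each failure mode.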

  6. An X-band Co{sup 2+} EPR study of Zn{sub 1−x}Co{sub x}O (x=0.005–0.1) nanoparticles prepared by chemical hydrolysis methods using diethylene glycol and denaturated alcohol at 5 K

    Energy Technology Data Exchange (ETDEWEB)

    Misra, Sushil K., E-mail: skmisra@alcor.concordia.ca [Physics Department, Concordia University, Montreal, QC, Canada H3G 1M8 (Canada); Andronenko, S.I. [Physics Institute, Kazan Federal University, Kazan 420008 (Russian Federation); Srinivasa Rao, S.; Chess, Jordan; Punnoose, A. [Department of Physics, Boise State University, Boise, ID 83725-1570 (United States)

    2015-11-15

    EPR investigations on two types of dilute magnetic semiconductor (DMS) ZnO nanoparticles doped with 0.5–10% Co{sup 2+} ions, prepared by two chemical hydrolysis methods using (i) diethylene glycol ((CH{sub 2}CH{sub 2}OH){sub 2}O) (NC, rod-like samples) and (ii) denatured ethanol (CH{sub 3}CH{sub 2}OH) solutions (QC, spherical samples), were carried out at X-band (9.5 GHz) at 5 K. The analysis of EPR data for NC samples revealed the presence of several types of EPR lines: (i) two types, intense and weak, of high-spin Co{sup 2+} ions in the samples with Co concentration >0.5%; (ii) surface oxygen vacancies; and (iii) a ferromagnetic resonance (FMR) line. QC samples exhibit an intense FMR line and an EPR line due to high-spin Co{sup 2+} ions; this FMR line is more intense than the corresponding line exhibited by NC samples. These EPR spectra varied for samples with different doping concentrations. The magnetic states of these samples as revealed by EPR spectra, as well as the origin of ferromagnetism in the DMS samples, are discussed. - Highlights: • 5 K X-band Co{sup 2+} EPR investigations on QC and NC ZnO dilute magnetic semiconductor nanoparticles. • NC and QC samples exhibited high-spin Co{sup 2+} EPR lines and a ferromagnetic resonance line. • NC samples also exhibit a line due to surface oxygen vacancies. • The FMR line is more intense in QC than in NC samples. • Magnetic states and the origin of ferromagnetism are discussed.

  7. SNP Data Quality Control in a National Beef and Dairy Cattle System and Highly Accurate SNP Based Parentage Verification and Identification

    Directory of Open Access Journals (Sweden)

    Matthew C. McClure

    2018-03-01

    A major use of genetic data is parentage verification and identification, as inaccurate pedigrees negatively affect genetic gain. Since 2012 the international standard for single nucleotide polymorphism (SNP) verification in Bos taurus cattle has been the ISAG SNP panels. While these ISAG panels provide an increased level of parentage accuracy over microsatellite markers (MS), they can validate the wrong parent at ≤1% misconcordance rate levels, indicating that more SNP are needed if a more accurate pedigree is required. With rapidly increasing numbers of cattle being genotyped in Ireland, representing 61 B. taurus breeds from a wide range of farm types (beef/dairy, AI/pedigree/commercial, purebred/crossbred, and large to small herd size), the Irish Cattle Breeding Federation (ICBF) analyzed different SNP densities to determine that a minimum of ≥500 SNP are needed to consistently predict only one set of parents at a ≤1% misconcordance rate. For parentage validation and prediction, ICBF uses 800 SNP (ICBF800) selected based on SNP clustering quality, ISAG200 inclusion, call rate (CR), and minor allele frequency (MAF) in the Irish cattle population. Large datasets require sample and SNP quality control (QC). Most publications only deal with SNP QC via CR, MAF, parent-progeny conflicts, and Hardy-Weinberg deviation, but not sample QC. We report here parentage, SNP QC, and genomic sample QC pipelines to deal with the unique challenges of >1 million genotypes from a national herd, such as SNP genotype errors from mis-tagging of animals, lab errors, farm errors, and multiple other issues that can arise. We divide the pipeline into two parts: a Genotype QC and an Animal QC pipeline. The Genotype QC identifies samples with low call rate, missing or mixed genotype classes (no BB genotype or ABTG alleles present), and low genotype frequencies. The Animal QC handles situations where the genotype might not belong to the listed individual by identifying: >1 non
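
The call-rate and MAF filters mentioned in the abstract can be sketched as below. This is a minimal illustration with assumed thresholds and a 0/1/2 dosage genotype coding (missing calls coded -1); it is not the ICBF pipeline.

```python
import numpy as np

def genotype_qc(G, snp_cr_min=0.95, maf_min=0.01, sample_cr_min=0.90):
    """Basic genotype QC sketch. G is (n_samples, n_snps) with genotypes
    coded 0/1/2 (alt-allele dosage) and -1 for a missing call.
    Returns boolean keep-masks for samples and for SNPs."""
    called = G >= 0
    # Sample-level QC: drop animals with a low overall call rate.
    sample_cr = called.mean(axis=1)
    keep_samples = sample_cr >= sample_cr_min
    # SNP-level QC on the surviving samples: call rate and MAF.
    Gs = G[keep_samples]
    called_s = Gs >= 0
    snp_cr = called_s.mean(axis=0)
    with np.errstate(invalid="ignore", divide="ignore"):
        alt_freq = np.where(called_s, Gs, 0).sum(axis=0) / (2 * called_s.sum(axis=0))
    maf = np.minimum(alt_freq, 1 - alt_freq)
    keep_snps = (snp_cr >= snp_cr_min) & (maf >= maf_min)
    return keep_samples, keep_snps
```

Filtering samples before computing per-SNP statistics, as here, avoids letting one badly genotyped animal distort call rates and allele frequencies across all markers.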

  8. Bioanalytical method development and validation for the determination of glycine in human cerebrospinal fluid by ion-pair reversed-phase liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Jiang, Jian; James, Christopher A; Wong, Philip

    2016-09-05

    An LC-MS/MS method has been developed and validated for the determination of glycine in human cerebrospinal fluid (CSF). The validated method used artificial cerebrospinal fluid as a surrogate matrix for calibration standards. The calibration curve range for the assay was 100–10,000 ng/mL and (13)C2, (15)N-glycine was used as an internal standard (IS). Pre-validation experiments were performed to demonstrate parallelism with surrogate matrix and standard addition methods. The mean endogenous glycine concentration in pooled human CSF, determined on three days using artificial CSF as a surrogate matrix and using the method of standard addition, was found to be 748 ± 30.6 and 768 ± 18.1 ng/mL, respectively. A percentage difference of -2.6% indicated that artificial CSF could be used as a surrogate calibration matrix for the determination of glycine in human CSF. Quality control (QC) samples, except the lower limit of quantitation (LLOQ) QC and low QC samples, were prepared by spiking glycine into aliquots of a pooled human CSF sample. The low QC sample was prepared from a separate pooled human CSF sample containing low endogenous glycine concentrations, while the LLOQ QC sample was prepared in artificial CSF. Standard addition was used extensively to evaluate matrix effects during validation. The validated method was used to determine the endogenous glycine concentrations in human CSF samples. Incurred sample reanalysis demonstrated the reproducibility of the method. Copyright © 2016 Elsevier B.V. All rights reserved.
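
The method-of-standard-addition calculation used above can be sketched generically: spike known amounts of analyte into the matrix, fit response against added concentration, and take the magnitude of the x-intercept as the endogenous level. This is an illustration of the general technique, not the validated assay's code.

```python
import numpy as np

def standard_addition(added, response):
    """Estimate an endogenous analyte concentration by standard addition.

    added: known spiked concentrations (same units as the result);
    response: measured instrument response at each spike level.
    Assumes a linear response; the endogenous concentration is the
    magnitude of the x-intercept of the fitted line.
    """
    slope, intercept = np.polyfit(added, response, 1)
    return intercept / slope  # |x-intercept| = endogenous concentration
```

Because the fit is performed in the authentic matrix itself, the estimate is insensitive to proportional matrix effects, which is why standard addition serves as the reference against which a surrogate-matrix calibration is judged.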

  9. Quality assurance during fabrication of high-damping rubber isolation bearings

    Energy Technology Data Exchange (ETDEWEB)

    Way, D.; Greaves, W.C. [Base Isolation Consultants, Inc., San Francisco, CA (United States)

    1995-12-01

    Successful implementation of a high-damping rubber (HDR) base isolation project requires the application of Quality Assurance/Quality Control (QA/QC) methodology through all phases of the bearing fabrication process. HDR base isolation bearings must be fabricated with uniform physical characteristics while being produced in large quantities. To satisfy this requirement, manufacturing processes must be controlled. Prototype tests that include dynamic testing of small samples of rubber are necessary. Stringent full-scale bearing testing must be carried out prior to beginning production; during production, manufacturing is strictly regulated by small rubber sample and production bearing testing. All such activities should be supervised and continuously inspected by independent and experienced QA/QC personnel.

  10. Ongoing quality control in digital radiography: Report of AAPM Imaging Physics Committee Task Group 151

    International Nuclear Information System (INIS)

    Jones, A. Kyle; Geiser, William; Heintz, Philip; Goldman, Lee; Jerjian, Khachig; Martin, Melissa; Peck, Donald; Pfeiffer, Douglas; Ranger, Nicole; Yorkston, John

    2015-01-01

    Quality control (QC) in medical imaging is an ongoing process and not just a series of infrequent evaluations of medical imaging equipment. The QC process involves designing and implementing a QC program, collecting and analyzing data, investigating results that are outside the acceptance levels for the QC program, and taking corrective action to bring these results back to an acceptable level. The QC process involves key personnel in the imaging department, including the radiologist, radiologic technologist, and the qualified medical physicist (QMP). The QMP performs detailed equipment evaluations and helps with oversight of the QC program, while the radiologic technologist is responsible for the day-to-day operation of the QC program. The continued need for ongoing QC in digital radiography has been highlighted in the scientific literature. The charge of this task group was to recommend consistency tests designed to be performed by a medical physicist or a radiologic technologist under the direction of a medical physicist to identify problems with an imaging system that need further evaluation by a medical physicist, including a fault tree to define actions that need to be taken when certain fault conditions are identified. The focus of this final report is the ongoing QC process, including rejected image analysis, exposure analysis, and artifact identification. These QC tasks are vital for the optimal operation of a department performing digital radiography.

  11. Selection Component Analysis of Natural Polymorphisms using Population Samples Including Mother-Offspring Combinations, II

    DEFF Research Database (Denmark)

    Jarmer, Hanne Østergaard; Christiansen, Freddy Bugge

    1981-01-01

    Population samples including mother-offspring combinations provide information on the selection components: zygotic selection, sexual selection, gametic selection and fecundity selection, on the mating pattern, and on the deviation from linkage equilibrium among the loci studied. The theory...

  12. A Total Quality-Control Plan with Right-Sized Statistical Quality-Control.

    Science.gov (United States)

    Westgard, James O

    2017-03-01

    A new Clinical Laboratory Improvement Amendments option for risk-based quality-control (QC) plans became effective in January 2016. Called an Individualized QC Plan, this option requires the laboratory to perform a risk assessment, develop a QC plan, and implement a QC program to monitor ongoing performance of the QC plan. Difficulties in performing a risk assessment may limit the validity of an Individualized QC Plan. A better alternative is to develop a Total QC Plan including a right-sized statistical QC procedure to detect medically important errors. Westgard Sigma Rules provides a simple way to select the right control rules and the right number of control measurements. Copyright © 2016 Elsevier Inc. All rights reserved.
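
The "right-sizing" above starts from the assay's Sigma-metric, Sigma = (TEa - |bias|) / CV, with all quantities in percent. The sketch below computes that metric and pairs it with an illustrative rule selection; the specific cutoffs and rule sets are simplified assumptions in the spirit of Westgard Sigma Rules, not the published tool.

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma-metric of an assay: allowable total error minus bias,
    expressed in units of the assay's imprecision (all in percent)."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def suggest_sqc(sigma):
    """Illustrative statistical-QC design selection: higher sigma
    performance permits simpler rules and fewer control measurements.
    Cutoffs and rule sets here are simplified assumptions."""
    if sigma >= 6:
        return {"rules": ["1:3s"], "n_controls": 2}
    if sigma >= 5:
        return {"rules": ["1:3s", "2:2s", "R:4s"], "n_controls": 2}
    if sigma >= 4:
        return {"rules": ["1:3s", "2:2s", "R:4s", "4:1s"], "n_controls": 4}
    return {"rules": ["1:3s", "2:2s", "R:4s", "4:1s", "8:x"], "n_controls": 6}
```

For example, an assay with TEa = 10%, bias = 2%, and CV = 2% has Sigma = 4, so it needs a fuller multirule design than a 6-sigma assay controlled by a single 1:3s rule.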

  13. Potential contamination of shipboard air samples by diffusive emissions of PCBs and other organic pollutants: implications and solutions.

    Science.gov (United States)

    Lohmann, Rainer; Jaward, Foday M; Durham, Louise; Barber, Jonathan L; Ockenden, Wendy; Jones, Kevin C; Bruhn, Regina; Lakaschus, Soenke; Dachs, Jordi; Booij, Kees

    2004-07-15

    Air samples were taken onboard the RRS Bransfield on an Atlantic cruise from the United Kingdom to Halley, Antarctica, from October to December 1998, with the aim of establishing PCB oceanic background air concentrations and assessing their latitudinal distribution. Great care was taken to minimize pre- and post-collection contamination of the samples, which was validated through stringent QA/QC procedures. However, there is evidence that onboard contamination of the air samples occurred, following insidious, diffusive emissions on the ship. Other data (for PCBs and other persistent organic pollutants (POPs)) and examples of shipboard contamination are presented. The implications of these findings for past and future studies of global POPs distribution are discussed. Recommendations are made to help critically appraise and minimize the problems of insidious/diffusive shipboard contamination.

  14. Hydrogen storage properties for Mg–Zn–Y quasicrystal and ternary alloys

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Xuanli, E-mail: Xuanli.Luo@nottingham.ac.uk; Grant, David M., E-mail: David.Grant@nottingham.ac.uk; Walker, Gavin S., E-mail: Gavin.Walker@nottingham.ac.uk

    2015-10-05

    Highlights: • Quasicrystal (QC) and H-phase alloys were detected in the Zn–Mg–Y samples. • Hydrogen storage properties of Zn–Mg–Y samples were investigated. • Zn{sub 50}Mg{sub 42}Y{sub 8} showed a capacity of 0.9 wt.% and a decomposition temperature of 445 °C. - Abstract: Three Zn–Mg–Y alloys with nominal compositions of Zn{sub 50}Mg{sub 42}Y{sub 8} and Zn{sub 60}Mg{sub 30}Y{sub 10} were prepared by induction melting or gas atomisation. XRD and SEM analysis shows that samples ZMY-1 and ZMY-2 consisted of multiple phases including the icosahedral quasicrystal (QC) i-phase, hexagonal H-phase and Mg{sub 7}Zn{sub 3}, whilst ZMY-3 contained QC only. The hydrogen storage properties of the Zn–Mg–Y quasicrystal and ternary alloys were investigated for the first time. The quasicrystal sample ZMY-3 hydrogenated at 300 °C had a 0.3 wt.% capacity and a DSC decomposition peak temperature of 503 °C. Amongst the three samples, the highest hydrogen storage capacity (0.9 wt.%) and the lowest decomposition peak temperature (445 °C) were achieved by sample ZMY-1. The pressure–composition–isotherm (PCI) curve of the ZMY-1 sample showed a flat plateau with a plateau pressure of 3.5 bar at 300 °C, which indicates a lower dehydrogenation enthalpy than MgH{sub 2}.

  15. Achievements and advantages of participation in the IAEA project RER 002/004/1999-2001 'QA/QC of Nuclear Analytical Techniques'

    International Nuclear Information System (INIS)

    Vata, Ion; Cincu, Em.

    2002-01-01

    The National Institute for Physics and Nuclear Engineering 'Horia Hulubei' (IFIN-HH) decided in the late 1990s to start applying nuclear techniques in the economy and social life on a routine scale; reaching this goal implied achieving first-rate analytical performance and complying with the QA/QC requirements, as detailed in ISO 17025. The IAEA Project appeared in 1999 as the best opportunity and tool for our specialists to become familiar with the standard's requirements and begin to implement them in their operations, thus further enabling them to apply for accreditation according to the international criteria. This report outlines the experience gained from participation in the project. The accomplishments of the project are presented and the main difficulties are identified.

  16. Temperature-dependent viscosity analysis of SAE 10W-60 engine oil with RheolabQC rotational rheometer

    Directory of Open Access Journals (Sweden)

    Zahariea Dănuț

    2017-01-01

    The purpose of this work was to determine a viscosity–temperature relationship for SAE 10W-60 engine oil. The rheological properties of this engine oil, over a temperature range of 20–60 °C, were obtained with a RheolabQC rotational rheometer. For the first reference temperature of 40 °C, the experimental result was obtained with a relative error of 1.29%. The temperature-dependent viscosity was modelled, comparatively, with the Arrhenius and the 3rd-degree polynomial models. Comparing the graphs of the fits with prediction bounds for a 95% confidence level, as well as the goodness-of-fit statistics, the preliminary conclusion was that the 3rd-degree polynomial could be the best fit model. However, the fit model should also be usable for extrapolation to the second reference temperature of 100 °C. This requirement reverses the ranking of the fit models: the Arrhenius equation becomes the best fit model, because the polynomial model completely fails to predict the extrapolated value.
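
The Arrhenius fit and extrapolation described above can be sketched as follows: with eta = A * exp(B / T), taking logarithms gives a straight line in 1/T, so the model can be fitted by ordinary linear regression and then evaluated outside the fitted range. Values and names here are illustrative, not the paper's data.

```python
import numpy as np

def fit_arrhenius(temps_c, visc):
    """Fit the Arrhenius viscosity model eta = A * exp(B / T), T in kelvin,
    by linear regression of ln(eta) on 1/T.
    Returns (A, B, predict), where predict(t_c) extrapolates to any Celsius
    temperature."""
    T = np.asarray(temps_c, dtype=float) + 273.15
    B, lnA = np.polyfit(1.0 / T, np.log(visc), 1)  # slope B, intercept ln(A)
    def predict(t_c):
        return np.exp(lnA + B / (t_c + 273.15))
    return np.exp(lnA), B, predict
```

Because the Arrhenius form encodes the physically expected exponential decrease of viscosity with temperature, it extrapolates far more safely than a high-order polynomial, which is exactly the behaviour the abstract reports at 100 °C.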

  17. Long term high resolution rainfall runoff observations for improved water balance uncertainty and database QA-QC in the Walnut Gulch Experimental Watershed.

    Science.gov (United States)

    Bitew, M. M.; Goodrich, D. C.; Demaria, E.; Heilman, P.; Kautz, M. A.

    2017-12-01

    Walnut Gulch is a semi-arid experimental watershed and Long Term Agro-ecosystem Research (LTAR) site managed by the USDA-ARS Southwest Watershed Research Center, for which high-resolution long-term hydro-climatic data are available across its 150 km² drainage area. In this study, we present the analysis of 50 years of continuous hourly rainfall data to evaluate runoff control and generation processes, with the aim of improving the QA-QC plans of Walnut Gulch and creating a high-quality data set that is critical for reducing water balance uncertainties. Multiple linear regression models were developed to relate rainfall properties, runoff characteristics and watershed properties. The rainfall properties were summarized as event-based total depth, maximum intensity, duration, location of the storm center with respect to the outlet, and storm size normalized to watershed area. We evaluated the interaction between rainfall and runoff in terms of antecedent moisture condition (AMC), antecedent runoff condition (ARC), and runoff depth and duration for each rainfall event. We summarized watershed properties such as contributing area, slope, shape, channel length, stream density, channel flow area, and percent of the area of retention stock ponds for each of the nested catchments in Walnut Gulch. The evaluation of the model using basic and categorical statistics showed good predictive skill throughout the watersheds. The model produced correlation coefficients ranging from 0.4 to 0.94, Nash efficiency coefficients up to 0.77, and Kling-Gupta coefficients ranging from 0.4 to 0.98. The model predicted 92% of all runoff-generating events and 98% of non-runoff events across all sub-watersheds in Walnut Gulch. The regression model also showed good potential to complement the QA-QC procedures in place for the Walnut Gulch dataset publications developed over the years since the 1960s, through identification of inconsistencies in rainfall and runoff relations.
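
The multiple-linear-regression and skill-score machinery described above can be sketched generically (function names and predictors are illustrative assumptions, not the study's code): fit runoff depth on event predictors by ordinary least squares and score predictions with a Nash-Sutcliffe efficiency.

```python
import numpy as np

def fit_runoff_model(X, y):
    """Ordinary least-squares fit of runoff depth on event predictors
    (e.g. rain depth, max intensity, duration, antecedent condition).
    X: (n_events, n_predictors); y: (n_events,) runoff depth.
    Returns coefficients with the intercept first."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.
    1 is a perfect fit; 0 means no better than the observed mean."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1 - ((obs - sim) ** 2).sum() / ((obs - obs.mean()) ** 2).sum()
```

Fitted residuals that are large for a particular gauge or event are exactly the kind of rainfall-runoff inconsistency the abstract proposes to use for QA-QC screening.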

  18. The Impact of Including Below Detection Limit Samples in Post Decommissioning Soil Sample Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jung Hwan; Yim, Man-Sung [KAIST, Daejeon (Korea, Republic of)

    2016-10-15

    To meet the required standards, the site owner has to show that the soil at the facility has been sufficiently cleaned up. To do this, one must know the contamination of the soil at the site prior to clean-up, which involves sampling that soil to identify the degree of contamination. However, there is a technical difficulty in determining how much decontamination should be done. The problem arises when measured samples are below the detection limit. Regulatory guidelines for site reuse after decommissioning are commonly challenged because the majority of the activity in the soil is at or below the limit of detection. Using additional statistical analyses of contaminated soil after decommissioning is expected to have the following advantages: a better and more reliable probabilistic exposure assessment, better economics (lower project costs) and improved communication with the public. This research will develop an approach that defines an acceptable method for demonstrating compliance of decommissioned NPP sites and validates that compliance. Soil samples from NPPs often contain censored data. Conventional methods for dealing with censored data sets are statistically biased and limited in their usefulness. In this research, additional methods are performed using real data from a monazite manufacturing factory.
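
The conventional (and biased) handling of below-detection-limit values referred to above is substitution: replacing each censored result with 0, LOD/2, or LOD before computing summary statistics. A minimal sketch, assuming a simple boolean detected-mask representation (more defensible estimators such as Kaplan-Meier, regression on order statistics, or maximum likelihood exist but are not shown):

```python
import numpy as np

def mean_with_censored(values, detected, lod, method="half"):
    """Mean activity estimate when some measurements are censored
    (below the detection limit), using simple substitution.

    values: measured values (entries where detected is False are ignored);
    detected: boolean mask of detected measurements;
    lod: detection limit; method: substitute 0, LOD/2, or LOD.
    """
    v = np.asarray(values, dtype=float).copy()
    sub = {"zero": 0.0, "half": lod / 2.0, "lod": lod}[method]
    v[~np.asarray(detected)] = sub
    return v.mean()
```

Computing the mean under all three substitutions brackets the possible bias: the "zero" and "lod" variants give lower and upper bounds, and a wide spread signals that a censoring-aware estimator is needed.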

  19. Quality Control Practices for Chemistry and Immunochemistry in a Cohort of 21 Large Academic Medical Centers.

    Science.gov (United States)

    Rosenbaum, Matthew W; Flood, James G; Melanson, Stacy E F; Baumann, Nikola A; Marzinke, Mark A; Rai, Alex J; Hayden, Joshua; Wu, Alan H B; Ladror, Megan; Lifshitz, Mark S; Scott, Mitchell G; Peck-Palmer, Octavia M; Bowen, Raffick; Babic, Nikolina; Sobhani, Kimia; Giacherio, Donald; Bocsi, Gregary T; Herman, Daniel S; Wang, Ping; Toffaletti, John; Handel, Elizabeth; Kelly, Kathleen A; Albeiroti, Sami; Wang, Sihe; Zimmer, Melissa; Driver, Brandon; Yi, Xin; Wilburn, Clayton; Lewandrowski, Kent B

    2018-05-29

    In the United States, minimum standards for quality control (QC) are specified in federal law under the Clinical Laboratory Improvement Amendment and its revisions. Beyond meeting this required standard, laboratories have flexibility to determine their overall QC program. We surveyed chemistry and immunochemistry QC procedures at 21 clinical laboratories within leading academic medical centers to assess if standardized QC practices exist for chemistry and immunochemistry testing. We observed significant variation and unexpected similarities in practice across laboratories, including QC frequency, cutoffs, number of levels analyzed, and other features. This variation in practice indicates an opportunity exists to establish an evidence-based approach to QC that can be generalized across institutions.

  20. WE-AB-206-01: Diagnostic Ultrasound Imaging Quality Assurance

    Energy Technology Data Exchange (ETDEWEB)

    Zagzebski, J. [University of Wisconsin (United States)

    2016-06-15

    The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. The goal of this ultrasound hands-on workshop is to demonstrate quality control (QC) testing in diagnostic ultrasound and to provide updates on ACR ultrasound accreditation requirements. The first half of this workshop will include two presentations reviewing diagnostic ultrasound QA/QC and ACR ultrasound accreditation requirements. The second half of the workshop will include live demonstrations of basic QC tests. An array of ultrasound testing phantoms and ultrasound scanners will be available for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations and on-site instructors. The targeted attendees are medical physicists in diagnostic imaging. Learning Objectives: Gain familiarity with common elements of a QA/QC program for diagnostic ultrasound imaging; Identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools; Learn ACR ultrasound accreditation requirements. Jennifer Walter is an employee of the American College of Radiology on Ultrasound Accreditation.

  1. WE-AB-206-01: Diagnostic Ultrasound Imaging Quality Assurance

    International Nuclear Information System (INIS)

    Zagzebski, J.

    2016-01-01

    The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. The goal of this ultrasound hands-on workshop is to demonstrate quality control (QC) testing in diagnostic ultrasound and to provide updates on ACR ultrasound accreditation requirements. The first half of this workshop will include two presentations reviewing diagnostic ultrasound QA/QC and ACR ultrasound accreditation requirements. The second half of the workshop will include live demonstrations of basic QC tests. An array of ultrasound testing phantoms and ultrasound scanners will be available for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations and on-site instructors. The targeted attendees are medical physicists in diagnostic imaging. Learning Objectives: Gain familiarity with common elements of a QA/QC program for diagnostic ultrasound imaging; Identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools; Learn ACR ultrasound accreditation requirements. Jennifer Walter is an employee of the American College of Radiology on Ultrasound Accreditation.

  2. WE-AB-206-02: ACR Ultrasound Accreditation: Requirements and Pitfalls

    International Nuclear Information System (INIS)

    Walter, J.

    2016-01-01

    The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. The goal of this ultrasound hands-on workshop is to demonstrate quality control (QC) testing in diagnostic ultrasound and to provide updates on ACR ultrasound accreditation requirements. The first half of this workshop will include two presentations reviewing diagnostic ultrasound QA/QC and ACR ultrasound accreditation requirements. The second half of the workshop will include live demonstrations of basic QC tests. An array of ultrasound testing phantoms and ultrasound scanners will be available for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations and on-site instructors. The targeted attendees are medical physicists in diagnostic imaging. Learning Objectives: Gain familiarity with common elements of a QA/QC program for diagnostic ultrasound imaging; Identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools; Learn ACR ultrasound accreditation requirements. Jennifer Walter is an employee of the American College of Radiology on Ultrasound Accreditation.

  3. WE-AB-206-03: Workshop

    International Nuclear Information System (INIS)

    Lu, Z.

    2016-01-01

    The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. The goal of this ultrasound hands-on workshop is to demonstrate quality control (QC) testing in diagnostic ultrasound and to provide updates on ACR ultrasound accreditation requirements. The first half of this workshop will include two presentations reviewing diagnostic ultrasound QA/QC and ACR ultrasound accreditation requirements. The second half of the workshop will include live demonstrations of basic QC tests. An array of ultrasound testing phantoms and ultrasound scanners will be available for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations and on-site instructors. The targeted attendees are medical physicists in diagnostic imaging. Learning Objectives: Gain familiarity with common elements of a QA/QC program for diagnostic ultrasound imaging; Identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools; Learn ACR ultrasound accreditation requirements. Jennifer Walter is an employee of the American College of Radiology on Ultrasound Accreditation.

  4. Exotic open-flavor $bc\\bar{q}\\bar{q}$, $bc\\bar{s}\\bar{s}$ and $qc\\bar{q}\\bar{b}$, $sc\\bar{s}\\bar{b}$ tetraquark states

    OpenAIRE

    Chen, Wei; Steele, T. G.; Zhu, Shi-Lin

    2013-01-01

    We study the exotic $bc\\bar{q}\\bar{q}$, $bc\\bar{s}\\bar{s}$ and $qc\\bar{q}\\bar{b}$, $sc\\bar{s}\\bar{b}$ systems by constructing the corresponding tetraquark currents with $J^P=0^+$ and $1^+$. After investigating the two-point correlation functions and the spectral densities, we perform QCD sum rule analysis and extract the masses of these open-flavor tetraquark states. Our results indicate that the masses of both the scalar and axial vector tetraquark states are about $7.1-7.2$ GeV for the $bc\\...

  5. Planning Risk-Based SQC Schedules for Bracketed Operation of Continuous Production Analyzers.

    Science.gov (United States)

    Westgard, James O; Bayat, Hassan; Westgard, Sten A

    2018-02-01

    To minimize patient risk, "bracketed" statistical quality control (SQC) is recommended in the new CLSI guidelines for SQC (C24-Ed4). Bracketed SQC requires that a QC event both precedes and follows (brackets) a group of patient samples. In optimizing a QC schedule, the frequency of QC or run size becomes an important planning consideration to maintain quality and also facilitate responsive reporting of results from continuous operation of high production analytic systems. Different plans for optimizing a bracketed SQC schedule were investigated on the basis of Parvin's model for patient risk and CLSI C24-Ed4's recommendations for establishing QC schedules. A Sigma-metric run size nomogram was used to evaluate different QC schedules for processes of different sigma performance. For high Sigma performance, an effective SQC approach is to employ a multistage QC procedure utilizing a "startup" design at the beginning of production and a "monitor" design periodically throughout production. Example QC schedules are illustrated for applications with measurement procedures having 6-σ, 5-σ, and 4-σ performance. Continuous production analyzers that demonstrate high σ performance can be effectively controlled with multistage SQC designs that employ a startup QC event followed by periodic monitoring or bracketing QC events. Such designs can be optimized to minimize the risk of harm to patients. © 2017 American Association for Clinical Chemistry.

  6. Protocols for the analytical characterization of therapeutic monoclonal antibodies. II - Enzymatic and chemical sample preparation.

    Science.gov (United States)

    Bobaly, Balazs; D'Atri, Valentina; Goyon, Alexandre; Colas, Olivier; Beck, Alain; Fekete, Szabolcs; Guillarme, Davy

    2017-08-15

    The analytical characterization of therapeutic monoclonal antibodies and related proteins usually incorporates various sample preparation methodologies. Indeed, quantitative and qualitative information can be enhanced by simplifying the sample, thanks to the removal of sources of heterogeneity (e.g. N-glycans) and/or by decreasing the molecular size of the tested protein by enzymatic or chemical fragmentation. These approaches make the sample more suitable for chromatographic and mass spectrometric analysis. Structural elucidation and quality control (QC) analysis of biopharmaceuticals are usually performed at the intact, subunit and peptide levels. In this paper, general sample preparation approaches used to attain peptide-, subunit- and glycan-level analysis are overviewed. Protocols are described to perform tryptic proteolysis, IdeS and papain digestion, reduction, as well as deglycosylation by PNGase F and EndoS2 enzymes. Both historical and modern sample preparation methods were compared and evaluated using rituximab and trastuzumab, two reference therapeutic mAb products approved by the Food and Drug Administration (FDA) and the European Medicines Agency (EMA). The described protocols may help analysts to develop sample preparation methods in the field of therapeutic protein analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. IDMS analysis of blank swipe samples for uranium quantity and isotopic composition

    International Nuclear Information System (INIS)

    Ryjinski, M.; Donohue, D.

    2001-01-01

    Since 1996 the IAEA has routinely implemented environmental sampling. During the last 5 years more than 1700 swipe samples were collected and analyzed in the Network of Analytical Laboratories (NWAL). One sensitive point in analyzing environmental samples is evidence of the presence of enriched uranium. The uranium content on swipes is extremely low, and therefore there is a relatively high probability of a false positive, e.g. from small contamination or a measurement bias. In order to avoid and/or control this, the IAEA systematically sends blind blank QC samples to the laboratories. In particular, more than 50 blank samples were analyzed during the last two years. A preliminary analysis of blank swipes showed that the swipe material itself contains up to 10 ng of natural uranium (NU) per swipe. However, about 50% of the blind blank swipes analyzed show the presence of enriched uranium. The source of this bias has to be clarified and excluded. This paper presents the results of modeling of IDMS analysis for the quantity and isotopic composition of uranium, in order to identify the possible contribution of different factors to the final measurement uncertainty. This modeling was carried out based on IAEA Clean Laboratory measurement data and simulation techniques.
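
The IDMS measurement model being simulated above can be illustrated with the simplified two-isotope relation; this reduced form and the function name are assumptions for illustration (real uranium IDMS involves several isotopes, mass-bias and blank corrections).

```python
def idms_amount(n_spike, r_sample, r_spike, r_mix):
    """Simplified two-isotope isotope-dilution (IDMS) relation.

    n_spike:  amount of the reference isotope contributed by the spike;
    r_sample, r_spike, r_mix: isotope ratios R = n(tracer)/n(reference)
    of the pure sample, the pure spike, and the measured mixture.
    Returns the amount of the reference isotope in the sample.
    """
    return n_spike * (r_spike - r_mix) / (r_mix - r_sample)
```

Because the result depends on the difference terms (r_spike - r_mix) and (r_mix - r_sample), small ratio biases are amplified when the mixture ratio sits close to either end member, which is one way factor-by-factor modeling can attribute the final measurement uncertainty.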

  8. Quality control for diagnostic oral microbiology laboratories in European countries

    Directory of Open Access Journals (Sweden)

    Andrew J. Smith

    2011-11-01

    Participation in diagnostic microbiology internal and external quality control (QC) processes is good laboratory practice and an essential component of a quality management system. However, no QC scheme for diagnostic oral microbiology existed until 2009, when the Clinical Oral Microbiology (COMB) Network was created. At the European Oral Microbiology Workshop in 2008, 12 laboratories processing clinical oral microbiological samples were identified. All of these were recruited to participate in the study, and six laboratories from six European countries completed both the online survey and the first QC round. Three additional laboratories participated in the second round. Based on the survey, European oral microbiology laboratories process a significant number of diagnostic samples from the oral cavity annually (a mean of 4,135 per laboratory). A majority of the laboratories did not participate in any internal or external QC programme, and nearly half of the laboratories did not have standard operating procedures for the tests they performed. In both QC rounds, there was a large variation in the results, interpretation and reporting of antibiotic susceptibility testing among the laboratories. In conclusion, the results of this study demonstrate the need for harmonisation of laboratory processing methods and interpretation of results for oral microbiology specimens. The QC rounds highlighted the value of external QC in evaluating the efficacy and safety of processes, materials and methods used in the laboratory. The use of standardised methods is also a prerequisite for multi-centre epidemiological studies that can provide important information on emerging microbes and trends in anti-microbial susceptibility for empirical prescribing in oro-facial infections.

  9. The Ocean Observatories Initiative Data Management and QA/QC: Lessons Learned and the Path Ahead

    Science.gov (United States)

    Vardaro, M.; Belabbassi, L.; Garzio, L. M.; Knuth, F.; Smith, M. J.; Kerfoot, J.; Crowley, M. F.

    2016-02-01

    The Ocean Observatories Initiative (OOI) is a multi-decadal, NSF-funded program that will provide long-term, near real-time cabled and telemetered measurements of climate variability, ocean circulation, ecosystem dynamics, air-sea exchange, seafloor processes, and plate-scale geodynamics. The OOI platforms consist of seafloor sensors, fixed moorings, and mobile assets containing over 700 operational instruments in the Atlantic and Pacific oceans. Rutgers University operates the Cyberinfrastructure (CI) component of the OOI, which acquires, processes and distributes data to scientists, researchers, educators and the public. It will also provide observatory mission command and control, data assessment and distribution, and long-term data management. The Rutgers Data Management Team consists of a data manager and four data evaluators, who are tasked with ensuring data completeness and quality, as well as interaction with OOI users to facilitate data delivery and utility. Here we will discuss the procedures developed to guide the data team workflow, the automated QC algorithms and human-in-the-loop (HITL) annotations that are used to flag suspect data (whether due to instrument failures, biofouling, or unanticipated events), system alerts and alarms, long-term data storage and CF (Climate and Forecast) standard compliance, and the lessons learned during construction and the first several months of OOI operations.
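    The automated QC checks described above are typically simple per-sample tests applied before human review. A minimal sketch of a gross-range flag (the thresholds, flag codes and function name are illustrative assumptions, not the OOI production algorithms):

```python
# Minimal gross-range QC flag, in the spirit of the automated checks
# described above. The flag convention (1 = pass, 4 = fail) and the
# thresholds are illustrative assumptions, not the OOI implementation.

def gross_range_flag(values, sensor_min, sensor_max):
    """Return one QC flag per sample: 1 = pass, 4 = out of sensor range."""
    return [1 if sensor_min <= v <= sensor_max else 4 for v in values]

# Example: a temperature series with two physically implausible samples.
temps = [12.1, 12.3, -99.9, 12.2, 45.0]
flags = gross_range_flag(temps, sensor_min=-5.0, sensor_max=35.0)
print(flags)  # [1, 1, 4, 1, 4]
```

    In practice such automated flags are combined with the human-in-the-loop annotations mentioned above, since a threshold test alone cannot distinguish biofouling or instrument failure from a genuine unanticipated event.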

  10. Guest Foreword from Michael Thomas CMG QC

    Directory of Open Access Journals (Sweden)

    Michael Thomas

    2012-04-01

    precedents and thought in a unique legal market in which ideas drawn from Islamic law, civil law and common law can intermingle and blend. It is not surprising therefore to see that this new publication will be dedicated to the subject of international law, both public and private. Its laudable aim is to promote legal discourse around the world, and to promote a wider international understanding of contemporary legal issues for the common benefit. As an open access, bilingual journal, addressing topics concerning any jurisdiction, I hope it will reach a wide audience, and fulfil its aim of promoting understanding between different cultures. I am sure that the journal will not only benefit Qatar’s legal community by advancing academic and practice-based legal discussion. I am also confident that it will stimulate thought in the global legal community at large. May I wish it every success and a long life. Michael Thomas CMG QC

  11. Natural Products from Microalgae with Potential against Alzheimer’s Disease: Sulfolipids Are Potent Glutaminyl Cyclase Inhibitors

    Directory of Open Access Journals (Sweden)

    Stephanie Hielscher-Michael

    2016-11-01

    In recent years, many new enzymes, like glutaminyl cyclase (QC), could be associated with pathophysiological processes and represent targets for many diseases, so the enzyme-inhibiting properties of natural substances are becoming increasingly important. In different studies, the pathophysiological connection of QC to various diseases, including Alzheimer’s disease (AD), was described. Algae are known for the ability to synthesize complex and highly diverse compounds with specific enzyme-inhibition properties. Therefore, we screened different algae species for the presence of QC-inhibiting metabolites using a new “Reverse Metabolomics” technique including an Activity-correlation Analysis (AcorA), which is based on the correlation of bioactivities to mass spectral data with the aid of mathematical informatics deconvolution. Thus, three QC-inhibiting compounds from microalgae belonging to the family of sulfolipids were identified. The compounds showed a QC inhibition of 81% and 76% at concentrations of 0.25 mg/mL and 0.025 mg/mL, respectively. Thus, for the first time, sulfolipids are identified as QC-inhibiting compounds and possess substructures with the required pharmacophore qualities. They represent a new lead structure for QC inhibitors.

  12. Mitochondrial control region and GSTP1 polymorphism associated ...

    African Journals Online (AJOL)

    These two heteroplasmic mutations were found at positions 11qG3037G/A and 11qC3038C/A in patient, father, mother, brother and son, but not in the sister and wife samples in family 2. The GSTP1, 105Ile >Val is most susceptible to inherited UBC risk for these ethnic families. The samples from families 1 and 2, including ...

  13. Analytical approaches to quality assurance and quality control in rangeland monitoring data

    Science.gov (United States)

    Producing quality data to support land management decisions is the goal of every rangeland monitoring program. However, the results of quality assurance (QA) and quality control (QC) efforts to improve data quality are rarely reported. The purpose of QA and QC is to prevent and describe non-sampling...

  14. Textures and mechanical properties in rare-earth free quasicrystal reinforced Mg-Zn-Zr alloys prepared by extrusion

    International Nuclear Information System (INIS)

    Ohhashi, S.; Kato, A.; Demura, M.; Tsai, A.P.

    2011-01-01

    Highlights: → Powder-metallurgical warm extrusion produced quasicrystal-dispersed Mg alloys. → Mg extrusions containing quasicrystals showed randomized textures. → These extrusions showed enhanced mechanical properties at 150 °C. - Abstract: The microstructure and mechanical properties of quasicrystal-dispersed Mg alloys prepared by warm extrusion of mixtures of Mg and Zn-Mg-Zr quasicrystalline (Qc) powders have been studied. The strong texture oriented along a [101̄0] direction observed in pure Mg was reduced in the Qc-dispersed samples, as verified by the pole-figure method and electron backscatter diffraction. The ultimate tensile strengths at 150 °C for the Qc-dispersed extrusions were much higher than the 110 MPa for pure Mg, reaching 156 MPa for 15 wt.% Qc by preventing the motion of dislocations. Elongation was improved by the randomization of grain orientation: from 5.7% for pure Mg to 12.9% for 10 wt.% Qc at room temperature, and from 15% for pure Mg to 37.1% for 5 wt.% Qc at 150 °C.

  15. A combined QC methodology in Ebro Delta HF radar system: real time web monitoring of diagnostic parameters and offline validation of current data

    Science.gov (United States)

    Lorente, Pablo; Piedracoba, Silvia; Soto-Navarro, Javier; Ruiz, Maria Isabel; Alvarez Fanjul, Enrique

    2015-04-01

    Over recent years, special attention has been focused on the development of protocols for near real-time quality control (QC) of HF radar derived current measurements. However, no worldwide agreement on a standardized QC methodology has been achieved to date, although a number of valuable international initiatives have been launched. In this context, Puertos del Estado (PdE) aims to implement a fully operational HF radar network with four different Codar SeaSonde HF radar systems by means of: - The development of a robust best-practices protocol for data processing and QC procedures, to routinely monitor site performance under a wide variety of ocean conditions. - The execution of validation work with in-situ observations to assess the accuracy of HF radar-derived current measurements. The main goal of the present work is to show this combined methodology for the specific case of the Ebro HF radar (although it is easily expandable to the rest of the PdE radar systems), deployed to manage the Ebro River deltaic area and promote the conservation of an important aquatic ecosystem exposed to severe erosion and reshaping. To this aim, a web interface has been developed to efficiently monitor in real time the evolution of several diagnostic parameters provided by the manufacturer (CODAR) and used as indicators of HF radar system health. This web interface, updated automatically every hour, examines site performance on different time bases in terms of: - Hardware parameters: power and temperature. - Radial parameters, among others: signal-to-noise ratio (SNR), number of radial vectors provided per time step, maximum radial range and bearing. - Total uncertainty metrics provided by CODAR: zonal and meridional standard deviations and the covariance between both components. - Additionally, a widget embedded in the web interface executes queries against the PdE database, providing the chance to compare current time series observed by the Tarragona buoy (located within Ebro HF radar spatial domain) and
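    The hourly screening of diagnostic parameters against acceptable ranges reduces to a threshold comparison; the parameter names and limits in this sketch are hypothetical, not the values used by the PdE/CODAR systems:

```python
# Sketch of threshold screening for HF radar diagnostic parameters.
# Parameter names and acceptable ranges are hypothetical examples,
# not the values used by the PdE monitoring web interface.

def site_health(diagnostics, limits):
    """Compare each diagnostic against its (lo, hi) limits; return alert strings."""
    alerts = []
    for name, value in diagnostics.items():
        lo, hi = limits[name]
        if not lo <= value <= hi:
            alerts.append(f"{name}={value} outside [{lo}, {hi}]")
    return alerts

hourly = {"snr_db": 8.0, "radial_count": 320, "temperature_c": 41.0}
limits = {"snr_db": (10.0, 40.0),
          "radial_count": (100, 2000),
          "temperature_c": (0.0, 35.0)}
for alert in site_health(hourly, limits):
    print(alert)  # flags the low SNR and the high temperature
```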

  16. Comparison of confinement characters between porous silicon and silicon nanowires

    International Nuclear Information System (INIS)

    Tit, Nacir; Yamani, Zain H.; Pizzi, Giovanni; Virgilio, Michele

    2011-01-01

    Confinement character and its effects on photoluminescence (PL) properties are theoretically investigated and compared between porous silicon (p-Si) and silicon nanowires (Si-NWs). The method is based on the application of the tight-binding technique using the minimal sp³ basis set, including second-nearest-neighbor interactions. The results show that the quantum confinement (QC) is not entirely controlled by the porosity; rather, it is mainly affected by the average distance between pores (d). The p-Si is found to exhibit weaker confinement character than Si-NWs. The confinement energy of charge carriers decays against d exponentially for p-Si and via a power law for Si-NWs. This latter type of QC is much stronger and is somewhat similar to the case of a single particle in a quantum box. The excellent fit to the PL data demonstrates that the experimental samples of p-Si do exhibit strong QC character and thus reveals the possibility of silicon clustering into nanocrystals and/or nanowires. Furthermore, the results show that the passivation of the surface dangling bonds by hydrogen atoms plays an essential role in preventing the appearance of gap states and consequently enhances the optical qualities of the produced structures. The oscillator strength (OS) is found to increase exponentially with energy in Si-NWs, confirming the strong confinement character of the carriers. Our theoretical findings suggest the existence of Si nanocrystals (Si-NCs) of sizes 1-3 nm and/or Si-NWs of cross-sectional sizes in the 1-3 nm range inside the experimental p-Si samples. The experimentally observed strong photoluminescence from p-Si should be in favor of an exhibition of 3D-confinement character. The favorable comparison of our theoretical results with the experimental data consolidates our above claims. -- Highlights: → Tight-binding is used to study quantum-confinement (QC) effects in p-Si and Si-NWs. → QC is not entirely controlled by the porosity but also by the d

  17. Standardization of D2 lymphadenectomy and surgical quality control (KLASS-02-QC): a prospective, observational, multicenter study [NCT01283893

    International Nuclear Information System (INIS)

    Kim, Hyoung-Il; Hur, Hoon; Kim, Youn Nam; Lee, Hyuk-Joon; Kim, Min-Chan; Han, Sang-Uk; Hyung, Woo Jin

    2014-01-01

    Extended systemic lymphadenectomy (D2) is standard procedure for surgical treatment of advanced gastric cancer (AGC) although less extensive lymphadenectomy (D1) can be applied to early gastric cancer. Complete D2 lymphadenectomy is the mandatory procedure for studies that evaluate surgical treatment results of AGC. However, the actual extent of D2 lymphadenectomy varies among surgeons because of a lacking consensus on the anatomical definition of each lymph node station. This study is aimed to develop a consensus for D2 lymphadenectomy and also to qualify surgeons that can perform both laparoscopic and open D2 gastrectomy. This (KLASS-02-QC) is a prospective, observational, multicenter study to qualify the surgeons that will participate in the KLASS-02-RCT, which is a prospective, randomized, clinical trial comparing laparoscopic and open gastrectomy for AGC. Surgeons and reviewers participating in the study will be required to complete a questionnaire detailing their professional experience and specific gastrectomy surgical background/training, and the gastrectomy metrics of their primary hospitals. All surgeons must submit three laparoscopic and three open D2 gastrectomy videos, respectively. Each video will be allocated to five peer reviewers; thus each surgeon’s operations will be assessed by a total of 30 reviews. Based on blinded assessment of unedited videos by experts’ review, a separate review evaluation committee will decide whether or not the evaluated surgeon will participate in the KLASS-02-RCT. The primary outcome measure is each surgeon’s proficiency, as assessed by the reviewers based on evaluation criteria for completeness of D2 lymphadenectomy. We believe that our study for standardization of D2 lymphadenectomy and surgical quality control (KLASS-02-QC) will guarantee successful implementation of the subsequent KLASS-02-RCT study. After making consensus on D2 lymphadenectomy, we developed evaluation criteria for completeness of D2

  18. Reinforcing of QA/QC programs in radiotherapy departments in Croatia: Results of treatment planning system verification

    Energy Technology Data Exchange (ETDEWEB)

    Jurković, Slaven; Švabić, Manda; Diklić, Ana; Smilović Radojčić, Đeni; Dundara, Dea [Clinic for Radiotherapy and Oncology, Physics Division, University Hospital Rijeka, Rijeka (Croatia); Kasabašić, Mladen; Ivković, Ana [Department for Radiotherapy and Oncology, University Hospital Osijek, Osijek (Croatia); Faj, Dario, E-mail: dariofaj@mefos.hr [Department of Physics, School of Medicine, University of Osijek, Osijek (Croatia)

    2013-04-01

    Implementation of advanced techniques in clinical practice can greatly improve the outcome of radiation therapy, but it also makes the process much more complex, with a lot of room for errors. An important part of a quality assurance program is verification of the treatment planning system (TPS). Dosimetric verifications in an anthropomorphic phantom were performed in 4 centers where new systems were installed. A total of 14 tests for 2 photon energies and multigrid superposition algorithms were conducted using the CMS XiO TPS. Evaluation criteria as specified in the International Atomic Energy Agency Technical Reports Series (IAEA TRS) 430 were employed. Results of the measurements are grouped according to the placement of the measuring point and the beam energy. The majority of differences between calculated and measured doses in the water-equivalent part of the phantom were within tolerance. Significantly more out-of-tolerance values were observed in “nonwater-equivalent” parts of the phantom, especially for higher-energy photon beams. This survey was done as part of a continuous effort to build awareness of the importance of quality assurance/quality control (QA/QC) in the Croatian radiotherapy community. Understanding the limitations of different parts of the various systems used in radiation therapy can systematically improve quality as well.

  19. Empirical insights and considerations for the OBT inter-laboratory comparison of environmental samples.

    Science.gov (United States)

    Kim, Sang-Bog; Roche, Jennifer

    2013-08-01

    Organically bound tritium (OBT) is an important tritium species that can be measured in most environmental samples, but it has only recently been recognized as a species of tritium in these samples. Currently, OBT is not routinely measured by environmental monitoring laboratories around the world, and there are no certified reference materials (CRMs) for environmental samples. Thus, quality assurance (QA), or verification of the accuracy of OBT measurements, is not possible. Alternatively, quality control (QC), or verification of the precision of OBT measurements, can be achieved. In the past, there have been differences in OBT analysis results between environmental laboratories; a possible reason for the discrepancies may be differences in analytical methods. Therefore, inter-laboratory OBT comparisons among the environmental laboratories are important and would provide a good opportunity for adopting a reference OBT analytical procedure. Due to these analytical issues, only limited information is available on OBT measurement. Previously conducted OBT inter-laboratory exercises are reviewed and the findings described. Based on our experience, a few considerations are suggested for the international OBT inter-laboratory comparison exercise to be completed in the near future. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.

  20. Evaluation of capillary zone electrophoresis for the quality control of complex biologic samples: Application to snake venoms.

    Science.gov (United States)

    Kpaibe, André P S; Ben-Ameur, Randa; Coussot, Gaëlle; Ladner, Yoann; Montels, Jérôme; Ake, Michèle; Perrin, Catherine

    2017-08-01

    Snake venoms constitute a very promising resource for the development of new medicines. They are mainly composed of very complex peptide and protein mixtures, whose composition may vary significantly from batch to batch. This latter consideration is a challenge for routine quality control (QC) in the pharmaceutical industry. In this paper, we report the use of capillary zone electrophoresis (CZE) for the development of an analytical fingerprint methodology to assess the quality of snake venoms. The analytical fingerprint concept is widely used for the QC of herbal drugs but has rarely been applied to venom QC so far. CZE was chosen for its intrinsic efficiency in the separation of protein and peptide mixtures. The analytical fingerprint methodology was first developed and evaluated for a particular snake venom, that of Lachesis muta. Optimal analysis conditions required the use of a PDADMAC capillary coating to avoid protein and peptide adsorption. The same analytical conditions were then applied to other snake venom species. Different electrophoretic profiles were obtained for each venom. Excellent repeatability and intermediate precision were observed for each batch. Analysis of different batches of the same species revealed inherent qualitative and quantitative composition variations of the venoms between individuals. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Sci-Fri AM: Quality, Safety, and Professional Issues 01: CPQR Technical Quality Control Suite Development including Quality Control Workload Results

    Energy Technology Data Exchange (ETDEWEB)

    Malkoske, Kyle; Nielsen, Michelle; Brown, Erika; Diamond, Kevin; Frenière, Normand; Grant, John; Pomerleau-Dalcourt, Natalie; Schella, Jason; Schreiner, L. John; Tantot, Laurent; Barajas, Eduardo Villareal; Bissonnette, Jean-Pierre [Royal Victoria Hospital, Trillium Health Partners, CPQR, Juravinski Cancer Centre, CIUSSS MCQ - CHAUR, Cape Breton Health Care Complex, Centre d’oncologie Dr. Léon-Richard / Dr. Léon Richard Oncology Centre, QEII Health Sciences Centre, Cancer Centre of Southeastern Ontario, Hôpital Maisonneuve-Rosemont, Tom Baker Cancer Centre, Princess Margaret Cancer Centre (Canada)

    2016-08-15

    A close partnership between the Canadian Partnership for Quality Radiotherapy (CPQR) and the Canadian Organization of Medical Physicists’ (COMP) Quality Assurance and Radiation Safety Advisory Committee (QARSAC) has resulted in the development of a suite of Technical Quality Control (TQC) Guidelines for radiation treatment equipment, which outline specific performance objectives and criteria that equipment should meet in order to assure an acceptable level of radiation treatment quality. The framework includes consolidation of existing guidelines and/or literature by expert reviewers, structured stages of public review, external field-testing and ratification by COMP. The adopted framework for the development and maintenance of the TQCs ensures that the guidelines incorporate input from the medical physics community during development, measures the workload required to perform the QC tests outlined in each TQC, and remain relevant (i.e. “living documents”) through subsequent planned reviews and updates. This presentation will show the Multi-Leaf Linear Accelerator document as an example of how feedback and cross-national work achieved a robust guidance document. During field-testing, each technology was tested at multiple centres in a variety of clinic environments. As part of the defined feedback, workload data were captured, leading to an average time associated with the testing defined in each TQC document. As a result, for a medium-sized centre comprising 6 linear accelerators and a comprehensive brachytherapy program, we evaluate the physics workload at 1.5 full-time equivalent physicists per year to complete all QC tests listed in this suite.

  2. Sci-Fri AM: Quality, Safety, and Professional Issues 01: CPQR Technical Quality Control Suite Development including Quality Control Workload Results

    International Nuclear Information System (INIS)

    Malkoske, Kyle; Nielsen, Michelle; Brown, Erika; Diamond, Kevin; Frenière, Normand; Grant, John; Pomerleau-Dalcourt, Natalie; Schella, Jason; Schreiner, L. John; Tantot, Laurent; Barajas, Eduardo Villareal; Bissonnette, Jean-Pierre

    2016-01-01

    A close partnership between the Canadian Partnership for Quality Radiotherapy (CPQR) and the Canadian Organization of Medical Physicists’ (COMP) Quality Assurance and Radiation Safety Advisory Committee (QARSAC) has resulted in the development of a suite of Technical Quality Control (TQC) Guidelines for radiation treatment equipment, which outline specific performance objectives and criteria that equipment should meet in order to assure an acceptable level of radiation treatment quality. The framework includes consolidation of existing guidelines and/or literature by expert reviewers, structured stages of public review, external field-testing and ratification by COMP. The adopted framework for the development and maintenance of the TQCs ensures that the guidelines incorporate input from the medical physics community during development, measures the workload required to perform the QC tests outlined in each TQC, and remain relevant (i.e. “living documents”) through subsequent planned reviews and updates. This presentation will show the Multi-Leaf Linear Accelerator document as an example of how feedback and cross-national work achieved a robust guidance document. During field-testing, each technology was tested at multiple centres in a variety of clinic environments. As part of the defined feedback, workload data were captured, leading to an average time associated with the testing defined in each TQC document. As a result, for a medium-sized centre comprising 6 linear accelerators and a comprehensive brachytherapy program, we evaluate the physics workload at 1.5 full-time equivalent physicists per year to complete all QC tests listed in this suite.

  3. Communication: importance sampling including path correlation in semiclassical initial value representation calculations for time correlation functions.

    Science.gov (United States)

    Pan, Feng; Tao, Guohua

    2013-03-07

    Full semiclassical (SC) initial value representation (IVR) for time correlation functions involves a double phase space average over a set of two phase points, each of which evolves along a classical path. Conventionally, the two initial phase points are sampled independently for all degrees of freedom (DOF) in the Monte Carlo procedure. Here, we present an efficient importance sampling scheme by including the path correlation between the two initial phase points for the bath DOF, which greatly improves the performance of the SC-IVR calculations for large molecular systems. Satisfactory convergence in the study of quantum coherence in vibrational relaxation has been achieved for a benchmark system-bath model with up to 21 DOF.
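    The generic idea underlying such a scheme, drawing samples from a convenient proposal density and reweighting them toward the target, can be illustrated in one dimension. This sketch shows only plain importance sampling, not the correlated double phase-space sampling of the SC-IVR method:

```python
import math
import random

# One-dimensional importance sampling: estimate E_p[f(x)] for a target
# density p by drawing from a broader proposal q and reweighting by p/q.
# This illustrates only the generic Monte Carlo idea, not the correlated
# phase-space sampling used in the SC-IVR calculations described above.

def importance_estimate(f, p, q_sample, q_pdf, n=200_000, seed=1):
    random.seed(seed)
    total = 0.0
    for _ in range(n):
        x = q_sample()
        total += f(x) * p(x) / q_pdf(x)  # weight each sample by p/q
    return total / n

# Target: standard normal. Proposal: normal with sigma = 2.
p = lambda x: math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)
q_pdf = lambda x: math.exp(-x * x / 8.0) / math.sqrt(8.0 * math.pi)
q_sample = lambda: random.gauss(0.0, 2.0)

# Second moment of the standard normal; the estimate converges to 1.
print(importance_estimate(lambda x: x * x, p, q_sample, q_pdf))
```

    A good proposal concentrates samples where the integrand is large; the path-correlated sampling of the bath degrees of freedom plays the same variance-reduction role in the high-dimensional SC-IVR setting.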

  4. Development and implementation of tPA clot lysis activity assay using ACL TOP™ hemostasis testing system in QC laboratories

    Directory of Open Access Journals (Sweden)

    Lichun Huang

    2017-12-01

    This report describes the design, development, validation and long-term performance of a tPA clot lysis activity assay using the Advanced Chemistry Line Total Operational Performance (ACL TOP™) Hemostasis Testing System. The results of the study demonstrated robust and stable performance of the analytical method. The accuracy of the assay, expressed as percent recovery, is 98–99%. The intermediate precision and repeatability precision, expressed as relative standard deviation (RSD), were 3% and less than 2%, respectively. The validated range is from 70% to 130% of the target potency of 5.8 × 10⁵ IU/mg. The linearity over this range, expressed as a correlation coefficient, is 0.997. After the assay was transferred to a QC laboratory, it retained high accuracy and precision, with a success rate of >99%. Keywords: Potency assay, Clot lysis, Comparability, Automation
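    The validation figures quoted above (percent recovery, %RSD, and a linearity correlation coefficient) are standard QC statistics. A small sketch with invented data, mirroring only the formulas, not the report's measurements:

```python
import math
import statistics

# Standard assay-validation statistics: percent recovery, relative
# standard deviation, and Pearson correlation for a linearity series.
# The data values below are invented for illustration only.

def percent_recovery(measured, expected):
    return 100.0 * measured / expected

def rsd_percent(values):
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def pearson_r(xs, ys):
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs) *
                    sum((y - my) ** 2 for y in ys))
    return num / den

print(percent_recovery(5.72e5, 5.8e5))        # recovery vs. target potency, ~98.6%
print(rsd_percent([98.1, 99.0, 97.5, 98.8]))  # repeatability as %RSD
print(pearson_r([70, 85, 100, 115, 130],      # nominal % of target
                [69.2, 84.9, 100.4, 114.6, 130.1]))  # measured response
```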

  5. QC in RIA

    International Nuclear Information System (INIS)

    Little, J.

    1989-01-01

    A Regional Health Authority expert from the service which performs radioimmunoassays (RIA) and immunoradiometric assays (IRMA) discusses the importance of quality control in these techniques. Originally developed as an aid to endocrinology, applications are now much more widely based and include virology, microbiology, drug testing, immunology, parasitology, veterinary science and the food industry. The origins of errors in the practice of RIA are examined, within and between batches, and the limitations placed on accuracy are explained. Measurement variability between laboratories is also mentioned. (U.K.)

  6. Development and prospective evaluation of an automated software system for quality control of quantitative 99mTc-MAG3 renal studies.

    Science.gov (United States)

    Folks, Russell D; Garcia, Ernest V; Taylor, Andrew T

    2007-03-01

    Quantitative nuclear renography has numerous potential sources of error. We previously reported the initial development of a computer software module for comprehensively addressing the issue of quality control (QC) in the analysis of radionuclide renal images. The objective of this study was to prospectively test the QC software. The QC software works in conjunction with standard quantitative renal image analysis using a renal quantification program. The software saves a text file that summarizes QC findings as possible errors in user-entered values, calculated values that may be unreliable because of the patient's clinical condition, and problems relating to acquisition or processing. To test the QC software, a technologist not involved in software development processed 83 consecutive nontransplant clinical studies. The QC findings of the software were then tabulated. QC events were defined as technical (study descriptors that were out of range or were entered and then changed, unusually sized or positioned regions of interest, or missing frames in the dynamic image set) or clinical (calculated functional values judged to be erroneous or unreliable). Technical QC events were identified in 36 (43%) of 83 studies. Clinical QC events were identified in 37 (45%) of 83 studies. Specific QC events included starting the camera after the bolus had reached the kidney, dose infiltration, oversubtraction of background activity, and missing frames in the dynamic image set. QC software has been developed to automatically verify user input, monitor calculation of renal functional parameters, summarize QC findings, and flag potentially unreliable values for the nuclear medicine physician. Incorporation of automated QC features into commercial or local renal software can reduce errors and improve technologist performance and should improve the efficiency and accuracy of image interpretation.
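    Automated verification of user-entered values, one class of technical QC event described above, reduces to simple range checks. In this sketch the field names and acceptable ranges are hypothetical, not those of the published software:

```python
# Sketch of automated range-checking of user-entered study parameters,
# one class of the technical QC events described above. Field names and
# acceptable ranges are hypothetical, not the published software's.

EXPECTED_RANGES = {
    "height_cm": (100, 220),
    "weight_kg": (20, 200),
    "injected_dose_mbq": (100, 400),
}

def technical_qc_events(entries):
    """Return QC messages for missing or out-of-range user-entered values."""
    events = []
    for field, (lo, hi) in EXPECTED_RANGES.items():
        value = entries.get(field)
        if value is None:
            events.append(f"{field}: missing")
        elif not lo <= value <= hi:
            events.append(f"{field}={value} outside expected range [{lo}, {hi}]")
    return events

study = {"height_cm": 172, "weight_kg": 640, "injected_dose_mbq": 250}
print(technical_qc_events(study))  # flags the implausible weight entry
```

    As in the study above, such messages would be written to a summary file and flagged for the nuclear medicine physician rather than silently corrected.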

  7. A modified routine analysis of arsenic content in drinking-water in Bangladesh by hydride generation-atomic absorption spectrophotometry.

    Science.gov (United States)

    Wahed, M A; Chowdhury, Dulaly; Nermell, Barbro; Khan, Shafiqul Islam; Ilias, Mohammad; Rahman, Mahfuzar; Persson, Lars Ake; Vahter, Marie

    2006-03-01

    The high prevalence of elevated levels of arsenic in drinking-water in many countries, including Bangladesh, has necessitated the development of reliable and rapid methods for the determination of a wide range of arsenic concentrations in water. A simple hydride generation-atomic absorption spectrometry (HG-AAS) method for the determination of arsenic in the range of microg/L to mg/L concentrations in water is reported here. The method showed linearity over concentrations ranging from 1 to 30 microg/L, but requires dilution of samples with higher concentrations. The detection limit ranged from 0.3 to 0.5 microg/L. Evaluation of the method, using internal quality-control (QC) samples (pooled water samples) and spiked internal QC samples throughout the study, and Standard Reference Material in certain lots, showed good accuracy and precision. Analysis of duplicate water samples at another laboratory also showed good agreement. In total, 13,286 tubewell water samples from Matlab, a rural area in Bangladesh, were analyzed. Thirty-seven percent of the water samples had concentrations below 50 microg/L, 29% below the WHO guideline value of 10 microg/L, and 17% below 1 microg/L. The HG-AAS was found to be a precise, sensitive, and reasonably fast and simple method for analysis of arsenic concentrations in water samples.

  8. Suitability of selected free-gas and dissolved-gas sampling containers for carbon isotopic analysis.

    Science.gov (United States)

    Eby, P; Gibson, J J; Yi, Y

    2015-07-15

    Storage trials were conducted for 2 to 3 months using a hydrocarbon and carbon dioxide gas mixture with known carbon isotopic composition to simulate typical hold times for gas samples prior to isotopic analysis. A range of containers (both pierced and unpierced) was periodically sampled to test for δ(13)C isotopic fractionation. Seventeen containers were tested for free-gas storage (20°C, 1 atm pressure) and 7 containers were tested for dissolved-gas storage, the latter prepared by bubbling free gas through tap water until saturated (20°C, 1 atm) and then preserved to avoid biological activity by acidifying to pH 2 with phosphoric acid and stored in the dark at 5°C. Samples were extracted using valves or by piercing septa, and then introduced into an isotope ratio mass spectrometer for compound-specific δ(13)C measurements. For free gas, stainless steel canisters and crimp-top glass serum bottles with butyl septa were most effective at preventing isotopic fractionation (pierced and unpierced), whereas silicone and PTFE-butyl septa allowed significant isotopic fractionation. FlexFoil and Tedlar bags were found to be effective only for storage of up to 1 month. For dissolved gas, crimp-top glass serum bottles with butyl septa were again effective, whereas silicone and PTFE-butyl were not. FlexFoil bags were reliable for up to 2 months. Our results suggest a range of preferred containers as well as several that did not perform very well for isotopic analysis. Overall, the results help establish better QA/QC procedures to avoid isotopic fractionation when storing environmental gas samples. Recommended containers for air transportation include steel canisters and glass serum bottles with butyl septa (pierced and unpierced). Copyright © 2015 John Wiley & Sons, Ltd.

  9. Quality Assurance Program Plan for the Waste Sampling and Characterization Facility

    International Nuclear Information System (INIS)

    Grabbe, R.R.

    1995-01-01

    The objective of this Quality Assurance Plan is to provide quality assurance (QA) guidance, implementation of regulatory QA requirements, and quality control (QC) specifications for analytical services. This document follows the Department of Energy (DOE)-issued Hanford Analytical Services Quality Assurance Plan (HASQAP) and additional federal [10 US Code of Federal Regulations (CFR) 830.120] QA requirements that the HASQAP does not cover. This document describes how the laboratory implements QA requirements to meet federal or state requirements, provides the default QC specifications, and identifies the procedural information that governs how the laboratory operates. In addition, this document meets the objectives of the Quality Assurance Program provided in WHC-CM-4-2, Section 2.1. This document also covers QA elements that are required in the Guidelines and Specifications for Preparing Quality Assurance Program Plans (QAPPs) (QAMS-004) and Interim Guidelines and Specifications for Preparing Quality Assurance Product Plans (QAMS-005) from the Environmental Protection Agency (EPA). A QA Index is provided in Appendix A.

  10. Nuclear Energy Research Initiative Project No. 02 103 Innovative Low Cost Approaches to Automating QA/QC of Fuel Particle Production Using On Line Nondestructive Methods for Higher Reliability Final Project Report

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Salahuddin; Batishko, Charles R.; Flake, Matthew; Good, Morris S.; Mathews, Royce; Morra, Marino; Panetta, Paul D.; Pardini, Allan F.; Sandness, Gerald A.; Tucker, Brian J.; Weier, Dennis R.; Hockey, Ronald L.; Gray, Joseph N.; Saurwein, John J.; Bond, Leonard J.; Lowden, Richard A.; Miller, James H.

    2006-02-28

    This Nuclear Energy Research Initiative (NERI) project was tasked with exploring, adapting, developing, and demonstrating innovative nondestructive test methods to automate coated-particle nuclear fuel inspection, providing the United States (US) with the improved and economical quality assurance and quality control (QA/QC) needed for the fuels of several reactor concepts proposed for both near-term deployment [DOE NE & NERAC, 2001] and Generation IV nuclear systems. Replacing present-day QA/QC methods, performed manually and in many cases destructively, with higher-speed automated nondestructive methods will make fuel production for advanced reactors economically feasible. For successful deployment of next-generation reactors that employ particle fuels, or fuels in the form of pebbles based on particles, extremely large numbers of fuel particles will require inspection at throughput rates that do not significantly impact the proposed manufacturing processes. The focus of the project is nondestructive examination (NDE) technologies that can be automated for production speeds and make either (I) on-process measurements or (II) in-line measurements. The inspection technologies selected will enable particle "quality" qualification as a particle or group of particles passes a sensor. A multiple-attribute-dependent signature will be measured and used for qualification or process-control decisions. A primary task for achieving this objective is to establish standard signatures, using several nondestructive methods, for both good/acceptable particles and the most problematic types of defects.

  11. Sampled data CT system including analog filter and compensating digital filter

    International Nuclear Information System (INIS)

    Glover, G. H.; DallaPiazza, D. G.; Pelc, N. J.

    1985-01-01

    A CT scanner in which the amount of x-ray information acquired per unit time is substantially increased by using a continuous-on x-ray source and a sampled data system with the detector. An analog filter is used in the sampling system for band limiting the detector signal below the highest frequency of interest, but is a practically realizable filter and is therefore non-ideal. A digital filter is applied to the detector data after digitization to compensate for the characteristics of the analog filter, and to provide an overall filter characteristic more nearly like the ideal
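
The compensation scheme can be sketched numerically. As a stand-in for the non-ideal analog filter, assume a first-order recursive low-pass; the digital filter applied after digitization then inverts that recursion exactly. The coefficient and sample values below are illustrative, not from the patent.

```python
def analog_lowpass(x, a=0.3):
    """First-order recursive low-pass standing in for the non-ideal analog filter."""
    y, prev = [], 0.0
    for xn in x:
        prev = a * xn + (1.0 - a) * prev
        y.append(prev)
    return y

def compensate(y, a=0.3):
    """Compensating digital filter: inverts the low-pass recursion sample by sample."""
    x_hat, prev = [], 0.0
    for yn in y:
        x_hat.append((yn - (1.0 - a) * prev) / a)
        prev = yn
    return x_hat

detector = [0.0, 1.0, 4.0, 2.0, 1.0, 0.5]   # hypothetical detector samples
restored = compensate(analog_lowpass(detector))
```

Because this toy analog model is exactly invertible, `restored` matches the original samples; a real design only approximates the inverse over the band of interest.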

  12. Quality assurance and quality control of geochemical data—A primer for the research scientist

    Science.gov (United States)

    Geboy, Nicholas J.; Engle, Mark A.

    2011-01-01

    Geochemistry is a constantly expanding science. More and more, scientists are employing geochemical tools to help answer questions about the Earth and earth system processes. Scientists may assume that the responsibility of examining and assessing the quality of the geochemical data they generate is not theirs but rather that of the analytical laboratories to which their samples have been submitted. This assumption may be partially based on knowledge about internal and external quality assurance and quality control (QA/QC) programs in which analytical laboratories typically participate. Or there may be a perceived lack of time or resources to adequately examine data quality. Regardless of the reason, the lack of QA/QC protocols can lead to the generation and publication of erroneous data. Because the interpretations drawn from the data are primary products to U.S. Geological Survey (USGS) stakeholders, the consequences of publishing erroneous results can be significant. The principal investigator of a scientific study ultimately is responsible for the quality and interpretation of the project's findings, and thus must also play a role in the understanding, implementation, and presentation of QA/QC information about the data. Although occasionally ignored, QA/QC protocols apply not only to procedures in the laboratory but also in the initial planning of a research study and throughout the life of the project. Many of the tenets of developing a sound QA/QC program or protocols also parallel the core concepts of developing a good study: What is the main objective of the study? Will the methods selected provide data of enough resolution to answer the hypothesis? How should samples be collected? Are there known or unknown artifacts or contamination sources in the sampling and analysis methods? Assessing data quality requires communication between the scientists responsible for designing the study and those collecting samples, analyzing samples, treating data, and

  13. Subspace Dimensionality: A Tool for Automated QC in Seismic Array Processing

    Science.gov (United States)

    Rowe, C. A.; Stead, R. J.; Begnaud, M. L.

    2013-12-01

    Because of the great resolving power of seismic arrays, the application of automated processing to array data is critically important in treaty verification work. A significant problem in array analysis is the inclusion of bad sensor channels in the beamforming process. We are testing an approach to automated, on-the-fly quality control (QC) to aid in the identification of poorly performing sensor channels prior to beamforming in routine event detection or location processing. The idea stems from methods used for large computer servers, where monitoring traffic at enormous numbers of nodes is impractical on a node-by-node basis, so the dimensionality of the node traffic is instead monitored for anomalies that could represent malware, cyber-attacks, or other problems. The technique relies upon the use of subspace dimensionality or principal components of the overall system traffic. The subspace technique is not new to seismology, but its most common application has been limited to comparing waveforms to an a priori collection of templates for detecting highly similar events in a swarm or seismic cluster. In the established template application, a detector functions in a manner analogous to waveform cross-correlation, applying a statistical test to assess the similarity of the incoming data stream to known templates for events of interest. In our approach, we seek not to detect matching signals, but instead, we examine the signal subspace dimensionality in much the same way that the method addresses node traffic anomalies in large computer systems. Signal anomalies recorded on seismic arrays affect the dimensional structure of the array-wide time-series. 
We have shown previously that this observation is useful in identifying real seismic events, either by looking at the raw signal or derivatives thereof (entropy, kurtosis), but here we explore the effects of malfunctioning channels on the dimension of the data and its derivatives, and how to leverage this effect for
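
The dimensionality idea can be illustrated with a toy sketch (not the authors' detector): the participation ratio of the covariance eigenvalues of the array-wide time series stays near 1 while all channels record one coherent signal, and rises when a malfunctioning channel decorrelates from the rest. All data here are synthetic.

```python
import numpy as np

def effective_dimension(channels):
    """Participation ratio of covariance eigenvalues:
    (sum lam)^2 / sum(lam^2). Near 1 when a single coherent signal
    dominates the array; larger when channels decorrelate."""
    cov = np.cov(channels)
    lam = np.clip(np.linalg.eigvalsh(cov), 0.0, None)
    return lam.sum() ** 2 / (lam ** 2).sum()

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 500)
signal = np.sin(2.0 * np.pi * 1.5 * t)

# Eight channels of the same signal plus small sensor noise
good = np.array([signal + 0.05 * rng.standard_normal(t.size) for _ in range(8)])
bad = good.copy()
bad[3] = rng.standard_normal(t.size)   # malfunctioning channel: pure noise
```

A dead or noisy channel inflates the effective dimension, so thresholding this statistic flags candidate channels to exclude before beamforming.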

  14. Results from 15 years of quality surveillance for a National Indigenous Point-of-Care Testing Program for diabetes.

    Science.gov (United States)

    Shephard, Mark; Shephard, Anne; McAteer, Bridgit; Regnier, Tamika; Barancek, Kristina

    2017-12-01

    Diabetes is a major health problem for Australia's Aboriginal and Torres Strait Islander peoples. Point-of-care testing for haemoglobin A1c (HbA1c) has been the cornerstone of a long-standing program (QAAMS) to manage glycaemic control in Indigenous people with diabetes and, recently, to diagnose diabetes. The QAAMS quality management framework includes monthly testing of quality control (QC) and external quality assurance (EQA) samples. Key performance indicators of quality include imprecision (coefficient of variation [CV%]) and percentage acceptable results. This paper reports on the past 15 years of quality testing in QAAMS and examines the performance of HbA1c POC testing at the 6.5% cut-off recommended for diagnosis. The total number of HbA1c EQA results submitted from 2002 to 2016 was 29,093. The median imprecision for EQA testing by QAAMS device operators averaged 2.81% (SD 0.50; range 2.2 to 3.9%) from 2002 to 2016 and 2.44% (SD 0.22; range 2.2 to 2.9%) from 2009 to 2016. No significant difference was observed between the median imprecision achieved in QAAMS and by Australasian laboratories from 2002 to 2016 (p=0.05; two-tailed paired t-test) and from 2009 to 2016 (p=0.17; two-tailed paired t-test). For QC testing from 2009 to 2016, imprecision averaged 2.5% and 3.0% for the two levels of QC tested. Percentage acceptable results averaged 90% for QA testing from 2002 to 2016 and 96% for QC testing from 2009 to 2016. The DCA Vantage was able to measure a patient and an EQA sample with an HbA1c value close to 6.5% both accurately and precisely. HbA1c POC testing in QAAMS has remained analytically sound, matched the quality achieved by Australasian laboratories and met profession-derived analytical goals for 15 years. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
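
The two key performance indicators named above can be computed directly from a set of EQA results. A minimal sketch: the HbA1c values and the ±10% acceptability limit are hypothetical (QAAMS's actual acceptance criteria may differ).

```python
import statistics

def imprecision_cv(results):
    """Imprecision as coefficient of variation (CV%): 100 * SD / mean."""
    return 100.0 * statistics.stdev(results) / statistics.mean(results)

def percent_acceptable(results, target, limit_pct=10.0):
    """Share of results falling within +/- limit_pct of the target value."""
    ok = sum(1 for r in results if abs(r - target) / target * 100.0 <= limit_pct)
    return 100.0 * ok / len(results)

hba1c = [6.4, 6.5, 6.6, 6.5, 6.7, 6.4]   # hypothetical HbA1c EQA results (%)
print(round(imprecision_cv(hba1c), 2))              # -> 1.79
print(percent_acceptable(hba1c, target=6.5))        # -> 100.0
```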

  15. Empirical insights and considerations for the OBT inter-laboratory comparison of environmental samples

    International Nuclear Information System (INIS)

    Kim, Sang-Bog; Roche, Jennifer

    2013-01-01

    Organically bound tritium (OBT) is an important tritium species that can be measured in most environmental samples, but has only recently been recognized as a species of tritium in these samples. Currently, OBT is not routinely measured by environmental monitoring laboratories around the world. There are no certified reference materials (CRMs) for environmental samples. Thus, quality assurance (QA), or verification of the accuracy of the OBT measurement, is not possible. Alternatively, quality control (QC), or verification of the precision of the OBT measurement, can be achieved. In the past, there have been differences in OBT analysis results between environmental laboratories. A possible reason for the discrepancies may be differences in analytical methods. Therefore, inter-laboratory OBT comparisons among the environmental laboratories are important and would provide a good opportunity for adopting a reference OBT analytical procedure. Due to the analytical issues, only limited information is available on OBT measurement. Previously conducted OBT inter-laboratory practices are reviewed and the findings are described. Based on our experiences, a few considerations were suggested for the international OBT inter-laboratory comparison exercise to be completed in the near future. -- Highlights: ► Inter-laboratory OBT comparisons would provide a good opportunity for developing reference OBT analytical procedures. ► The measurement of environmental OBT concentrations has a higher associated uncertainty. ► Certified reference materials for OBT in environmental samples are required

  16. Method for determining the wedge angle from the daily measurements made with the QC6 measurement device; Metodo para la determinacion del angulo de cuna a partir de las medidas diarias realizadas con el dispositivo de medida QC6

    Energy Technology Data Exchange (ETDEWEB)

    Marques Fraguela, E.; Suero Rodrigo, M. A.

    2011-07-01

    The aim of this paper is to present a method for determining the virtual wedge angle of Siemens Primus electron linear accelerators (linacs) from the daily measurements made with the PTW QC6Plus measurement system; the method was found to be sufficiently sensitive to detect variations of ±1° in this parameter. In addition, the behavior of the wedge angle over a year is studied statistically.

  17. Tragacanth gum-based nanogel as a superparamagnetic molecularly imprinted polymer for quercetin recognition and controlled release.

    Science.gov (United States)

    Hemmati, Khadijeh; Masoumi, Arameh; Ghaemy, Mousa

    2016-01-20

    A highly selective magnetic molecularly imprinted polymer (MMIP) with core-shell structure has been synthesized by a sol-gel process composed of Tragacanth Gum (TG) crosslinker, Fe3O4/SiO2 nanoparticles, and N-vinyl imidazole(VI) functional monomer in the presence of template Quercetin (QC). Different techniques including scanning electron microscopy (SEM), SEM-energy dispersive spectroscopy (SEM-EDS), vibrating sample magnetometer (VSM), and transmission electron microscopy (TEM) were used to verify the successful synthesis of MIP on the surface of Fe3O4/SiO2 nanoparticles. The swelling behavior of MMIP, its recognition and selectivity for QC and structural analog, Catechin (CT), were tested and compared with magnetic non imprinted polymer (MNIP). MMIP adsorbs the template drug quickly and equilibrium could be reached in 2h. The mechanism for adsorption was found to follow the Langmuir model with the maximum capacity of 175.43 mg g(-1). The MMIP indicated excellent recognition and binding affinity toward QC, selectivity factor (ɛ) relative to CT was 2.16. Finally, the MMIP was evaluated as a drug delivery device by performing in vitro release studies in PBS. Copyright © 2015 Elsevier Ltd. All rights reserved.
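
The adsorption model reported above can be written out: the Langmuir isotherm relates equilibrium concentration to uptake, saturating at the maximum capacity of 175.43 mg/g quoted in the abstract. The affinity constant `k_l` below is illustrative, not a fitted value from the paper.

```python
def langmuir_uptake(c_eq, q_max=175.43, k_l=0.05):
    """Langmuir isotherm: adsorbed amount q (mg/g) at equilibrium
    concentration c_eq (mg/L). q_max is from the abstract; k_l is
    a hypothetical affinity constant (L/mg)."""
    return q_max * k_l * c_eq / (1.0 + k_l * c_eq)

# Uptake rises with concentration and approaches q_max asymptotically
for c in (10.0, 100.0, 1000.0):
    print(round(langmuir_uptake(c), 2))   # -> 58.48, 146.19, 171.99
```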

  18. Automated aerosol sampling and analysis for the Comprehensive Test Ban Treaty

    International Nuclear Information System (INIS)

    Miley, H.S.; Bowyer, S.M.; Hubbard, C.W.; McKinnon, A.D.; Perkins, R.W.; Thompson, R.C.; Warner, R.A.

    1998-01-01

    Detecting nuclear debris from a nuclear weapon exploded in or substantially vented to the Earth's atmosphere constitutes the most certain indication that a violation of the Comprehensive Test Ban Treaty has occurred. For this reason, a radionuclide portion of the International Monitoring System is being designed and implemented. The IMS will monitor aerosols and gaseous xenon isotopes to detect atmospheric and underground tests, respectively. An automated system, the Radionuclide Aerosol Sampler/Analyzer (RASA), has been developed at Pacific Northwest National Laboratory to meet CTBT aerosol measurement requirements. This is achieved by the use of a novel sampling apparatus, a high-resolution germanium detector, and very sophisticated software. This system draws a large volume of air (∼20,000 m³/day), performs automated gamma-ray spectral measurements meeting the minimum detectable concentration (MDC) requirement for 140Ba, and communicates this and other data to a central data facility. Automated systems offer the added benefit of rigid controls, easily implemented QA/QC procedures, and centralized depot maintenance and operation. Other types of automated communication include pull or push transmission of state-of-health data, commands, and configuration data. In addition, a graphical user interface, Telnet, and other interactive communications are supported over ordinary phone or network lines. This system has been the subject of a USAF commercialization effort to meet US CTBT monitoring commitments. It will also be available to other CTBT signatories and the monitoring community for various governmental, environmental, or commercial needs. The current status of the commercialization is discussed.

  19. Evaluation of effective energy for QA and QC: measurement of half-value layer using radiochromic film density

    International Nuclear Information System (INIS)

    Gotanda, R.; Takeda, Y.; Gotanda, T.; Oishi Hospital, Hiroshima; Tabuchi, A.; Kawasaki Hospital, Okayama; Yamamoto, K.; Osaka Cancer Prevention and Detection Centre, Osaka; Kuwano, T.; Osaka Medical Center for Cancer and Cardiovascular Diseases, Osaka; Yatake, H.; Kaizuka City Hospital, Osaka; Katsuda, T.

    2009-01-01

    The effective energy of diagnostic x-rays is important for quality assurance (QA) and quality control (QC). However, the half-value layer (HVL), which is necessary to evaluate the effective energy, is not ubiquitously monitored because ionization-chamber dosimetry is time-consuming and complicated. To verify the applicability of GAFCHROMIC XR type R (GAF-R) film for HVL measurement as an alternative to monitoring with an ionization chamber, a single-strip method for measuring the HVL has been evaluated. Calibration curves of absorbed dose versus film density were generated using this single-strip method with GAF-R film, and the coefficient of determination (r²) of the straight-line approximation was evaluated. The HVLs (effective energies) estimated using the GAF-R film and an ionization chamber were compared. The coefficient of determination (r²) of the straight-line approximation obtained with the GAF-R film was more than 0.99. The effective energies (HVLs) evaluated using the GAF-R film and the ionization chamber were 43.25 keV (5.10 mm) and 39.86 keV (4.45 mm), respectively. The difference in the effective energies determined by the two methods was thus 8.5%. These results suggest that GAF-R might be used to evaluate the effective energy from the film-density growth without the need for ionization-chamber measurements.
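
The relationship underlying any HVL evaluation can be sketched: transmission through an attenuator of thickness x follows D = D0·exp(-μx), so μ can be estimated from two dose readings and HVL = ln 2 / μ. This is a minimal sketch of the exponential-attenuation arithmetic only, not the film-density calibration procedure; the dose values are hypothetical.

```python
import math

def half_value_layer(d0, d_t, thickness_mm):
    """HVL from two dose readings: D = D0 * exp(-mu * x) gives
    mu = ln(D0 / D_t) / x, and HVL = ln(2) / mu (same units as x)."""
    mu = math.log(d0 / d_t) / thickness_mm
    return math.log(2.0) / mu

# Hypothetical readings: the dose halves across 4.45 mm of attenuator,
# so the HVL is 4.45 mm by construction.
print(round(half_value_layer(100.0, 50.0, 4.45), 2))   # -> 4.45
```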

  20. Quantum Testbeds Stakeholder Workshop (QTSW) Report meeting purpose and agenda.

    Energy Technology Data Exchange (ETDEWEB)

    Hebner, Gregory A.

    2017-04-01

    Quantum computing (QC) is a promising early-stage technology with the potential to provide scientific computing capabilities far beyond what is possible with even an Exascale computer in specific problems of relevance to the Office of Science. These include (but are not limited to) materials modeling, molecular dynamics, and quantum chromodynamics. However, commercial QC systems are not yet available and the technical maturity of current QC hardware, software, algorithms, and systems integration is woefully incomplete. Thus, there is a significant opportunity for DOE to define the technology building blocks, and solve the system integration issues to enable a revolutionary tool. Once realized, QC will have world changing impact on economic competitiveness, the scientific enterprise, and citizen well-being. Prior to this workshop, DOE / Office of Advanced Scientific Computing Research (ASCR) hosted a workshop in 2015 to explore QC scientific applications. The goal of that workshop was to assess the viability of QC technologies to meet the computational requirements in support of DOE’s science and energy mission and to identify the potential impact of these technologies.

  1. Design, fabrication, and optimization of quantum cascade laser cavities and spectroscopy of the intersubband gain

    Science.gov (United States)

    Dirisu, Afusat Olayinka

    Quantum Cascade (QC) lasers are intersubband light sources operating in the wavelength range of ∼3 to 300 µm and are used in applications such as sensing (environmental, biological, and hazardous chemical), infrared countermeasures, and free-space infrared communications. The mid-infrared range (i.e. λ ∼ 3-30 µm) is of particular importance in sensing because of the strong interaction of laser radiation with various chemical species, while in free-space communications the atmospheric windows of 3-5 µm and 8-12 µm are highly desirable for low-loss transmission. Some of the requirements of these applications include: (1) high output power for improved sensitivity; (2) high operating temperatures for compact and cost-effective systems; (3) wide tunability; (4) single-mode operation for high selectivity. In the past, available mid-infrared sources, such as the lead-salt and solid-state lasers, were bulky, expensive, or emitted low output power. In recent years, QC lasers have been explored as cost-effective and compact sources because of their potential to satisfy and exceed all the above requirements. Also, the ultrafast carrier lifetimes of intersubband transitions in QC lasers are promising for high-bandwidth free-space infrared communication. This thesis was focused on the improvement of QC lasers through the design and optimization of the laser cavity and characterization of the laser gain medium. The optimization of the laser cavity included, (1) the design and fabrication of high-reflection Bragg gratings and subwavelength antireflection gratings, by focused ion beam milling, to achieve tunable, single-mode and high-power QC lasers, and (2) modeling of slab-coupled optical waveguide QC lasers for high-brightness output beams. The characterization of the QC laser gain medium was carried out using the single-pass transmission experiment, a sensitive measurement technique, for probing the intersubband transitions and the electron distribution of QC lasers

  2. QA/QC in pesticide residue analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A [Agrochemicals Unit, Agency's Laboratories, Seibersdorf (Austria)]

    2002-07-01

    This paper outlines problems related to pesticide residue analysis in a regulatory laboratory that are related to: availability of reference materials, as over 1000 pesticide active ingredients are currently in use and over 400 crops represent a large part of a healthy diet; analysis time; availability of samples in sufficient numbers; uncertainties of the procedures.

  3. QA/QC in pesticide residue analysis

    International Nuclear Information System (INIS)

    Ambrus, A.

    2002-01-01

    This paper outlines problems related to pesticide residue analysis in a regulatory laboratory that are related to: availability of reference materials, as over 1000 pesticide active ingredients are currently in use and over 400 crops represent a large part of a healthy diet; analysis time; availability of samples in sufficient numbers; uncertainties of the procedures

  4. Spectrally high performing quantum cascade lasers

    Science.gov (United States)

    Toor, Fatima

    emits at λ = 10.8 µm for positive and λ = 8.6 µm for negative polarity current with microsecond time delay is presented. Such a system is the first demonstration of a time and wavelength multiplexed system that uses a single QC laser. Fourth, work on the design and fabrication of a single-mode distributed feedback (DFB) QC laser emitting at λ ≈ 7.7 µm to be used in a QC laser based photoacoustic sensor is presented. The DFB QC laser had a temperature tuning coefficient of 0.45 nm/K for a temperature range of 80 K to 320 K, and a side-mode suppression ratio of greater than 30 dB. Finally, a study on the lateral mode patterns of wide-ridge QC lasers is presented. The results include the observation of degenerate and non-degenerate lateral modes in wide-ridge QC lasers emitting at λ ≈ 5.0 µm. This study was conducted with the end goal of using wide-ridge QC lasers in a novel technique to spatiospectrally combine multiple transverse modes to obtain an ultra-high-power single-spot QC laser beam.

  5. Droplet digital PCR-based EGFR mutation detection with an internal quality control index to determine the quality of DNA.

    Science.gov (United States)

    Kim, Sung-Su; Choi, Hyun-Jeung; Kim, Jin Ju; Kim, M Sun; Lee, In-Seon; Byun, Bohyun; Jia, Lina; Oh, Myung Ryurl; Moon, Youngho; Park, Sarah; Choi, Joon-Seok; Chae, Seoung Wan; Nam, Byung-Ho; Kim, Jin-Soo; Kim, Jihun; Min, Byung Soh; Lee, Jae Seok; Won, Jae-Kyung; Cho, Soo Youn; Choi, Yoon-La; Shin, Young Kee

    2018-01-11

    In clinical translational research and molecular in vitro diagnostics, a major challenge in the detection of genetic mutations is overcoming artefactual results caused by the low-quality of formalin-fixed paraffin-embedded tissue (FFPET)-derived DNA (FFPET-DNA). Here, we propose the use of an 'internal quality control (iQC) index' as a criterion for judging the minimum quality of DNA for PCR-based analyses. In a pre-clinical study comparing the results from droplet digital PCR-based EGFR mutation test (ddEGFR test) and qPCR-based EGFR mutation test (cobas EGFR test), iQC index ≥ 0.5 (iQC copies ≥ 500, using 3.3 ng of FFPET-DNA [1,000 genome equivalents]) was established, indicating that more than half of the input DNA was amplifiable. Using this criterion, we conducted a retrospective comparative clinical study of the ddEGFR and cobas EGFR tests for the detection of EGFR mutations in non-small cell lung cancer (NSCLC) FFPET-DNA samples. Compared with the cobas EGFR test, the ddEGFR test exhibited superior analytical performance and equivalent or higher clinical performance. Furthermore, iQC index is a reliable indicator of the quality of FFPET-DNA and could be used to prevent incorrect diagnoses arising from low-quality samples.
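
The acceptance criterion described above can be sketched directly from the numbers in the abstract: with 3.3 ng of FFPET-DNA (1,000 genome equivalents) as input, an iQC index ≥ 0.5 (i.e. ≥ 500 amplifiable copies) qualifies the sample. The function names are illustrative, not from the paper.

```python
def iqc_index(iqc_copies, genome_equivalents=1000.0):
    """Fraction of input genome equivalents that proved amplifiable."""
    return iqc_copies / genome_equivalents

def sample_acceptable(iqc_copies, genome_equivalents=1000.0, threshold=0.5):
    """Accept an FFPET-DNA sample for PCR-based analysis only if at
    least half of the input DNA is amplifiable (iQC index >= 0.5)."""
    return iqc_index(iqc_copies, genome_equivalents) >= threshold

print(sample_acceptable(620))   # -> True  (620 of 1,000 copies amplifiable)
print(sample_acceptable(310))   # -> False (low-quality sample, reject)
```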

  6. High performance control strategy for single-phase three-level neutral-point-clamped traction four-quadrant converters

    DEFF Research Database (Denmark)

    Kejian, Song; Konstantinou, Georgios; Jing, Li

    2017-01-01

    Operational data from Chinese railways indicate a number of challenges for traction four-quadrant converter (4QC) control including low-order voltage and current harmonics and reference tracking. A control strategy for a single-phase three-level neutral-point-clamped 4QC employed in the electric...

  7. Mid-infrared studies of GaAs/AlGaAs quantum cascade structures

    International Nuclear Information System (INIS)

    Keightley, Peter Thomas

    2001-01-01

    This thesis describes an investigation of GaAs/AlGaAs Quantum Cascade (QC) structures. Mid-infrared spectroscopic techniques are employed to study several QC LED and laser structures, in order to investigate the fundamental principles underlying the operation of these state-of-the-art devices. The results presented in this thesis include the demonstration of intersubband lasing in a GaAs/AlGaAs QC laser, which closely followed the first report of QC lasing using this materials system in 1998, and form a basis upon which further research into QC lasers can be built. Initially, a spectroscopic investigation of several QC LEDs is presented, beginning with a comparison of the performance of two designs incorporating an active region based on a diagonal transition. These devices have single-quantum-well (SQW) or multi-quantum-well (MQW) bridging regions and are investigated using intersubband electroluminescence (EL) spectroscopy. It is found that although growth and design are simplified by the use of a SQW bridging region, superior performance is obtained with MQW bridging regions. Intersubband EL and photocurrent (PC) spectroscopy are employed to study the operating characteristics of a QC LED incorporating a graded superlattice active region. EL is observed at 9 and 11 μm arising from interminiband radiative transitions. Complementary intersubband and interband spectroscopic techniques have been employed to study the evolution of the electron distribution within a QC LED with increasing bias. Below the device turn-on, the transfer of electrons from the donors to the active region ground state is observed. As the bias is increased, the redistribution of electrons through the bridging region is observed, in conjunction with an alignment of energy levels within the structure close to the operating bias. Intersubband lasing has been demonstrated from a GaAs/AlGaAs QC laser at λ ∼ 9 μm. Reciprocal gain measurements have been performed to determine the

  8. 40 CFR 98.34 - Monitoring and QA/QC requirements.

    Science.gov (United States)

    2010-07-01

    ...-derived gaseous fuels, and for biogas; sampling and analysis is required at least once per calendar.... (iv) For solid fuels other than coal and MSW, weekly sampling is required to obtain composite samples... obtained at less than the minimum frequency specified in paragraph (a)(2) of this section, appropriate...

  9. Effect of different solutions on color stability of acrylic resin-based dentures

    Directory of Open Access Journals (Sweden)

    Marcelo Coelho Goiato

    2014-01-01

    The aim of this study was to evaluate the effect of thermocycling and immersion in mouthwash or beverage solutions on the color stability of four different acrylic resin-based dentures (Onda Cryl, OC; QC20, QC; Classico, CL; and Lucitone, LU). The factors evaluated were type of acrylic resin, immersion time, and solution (mouthwash or beverage). A total of 224 denture samples were fabricated. For each type of resin, eight samples were immersed in mouthwashes (Plax-Colgate, PC; Listerine, LI; and Oral-B, OB), beverages (coffee, CP; cola, C; and wine, W), and artificial saliva (AS; control). The color change (ΔE) was evaluated before (baseline) and after thermocycling (T1), and after immersion in solution for 1 h (T2), 3 h (T3), 24 h (T4), 48 h (T5), and 96 h (T6). The CIE Lab system was used to determine the color changes. The thermocycling test was performed for 5000 cycles. Data were submitted to three-way repeated-measures analysis of variance and Tukey's test (p < 0.05). When the samples were immersed in each mouthwash, all assessed factors, associated or not, significantly influenced the color change values, except that there was no association between the mouthwash and acrylic resin. Similarly, when the samples were immersed in each beverage, all studied factors influenced the color change values. In general, regardless of the solution, LU exhibited the greatest ΔE values in the period from T1 to T5, and QC presented the greatest ΔE values at T6. Thus, thermocycling and immersion in the various solutions influenced the color stability of acrylic resins, and QC showed the greatest color alteration.

  10. Effect of different solutions on color stability of acrylic resin-based dentures.

    Science.gov (United States)

    Goiato, Marcelo Coelho; Nóbrega, Adhara Smith; dos Santos, Daniela Micheline; Andreotti, Agda Marobo; Moreno, Amália

    2014-01-01

    The aim of this study was to evaluate the effect of thermocycling and immersion in mouthwash or beverage solutions on the color stability of four different acrylic resin-based dentures (Onda Cryl, OC; QC20, QC; Classico, CL; and Lucitone, LU). The factors evaluated were type of acrylic resin, immersion time, and solution (mouthwash or beverage). A total of 224 denture samples were fabricated. For each type of resin, eight samples were immersed in mouthwashes (Plax-Colgate, PC; Listerine, LI; and Oral-B, OB), beverages (coffee, CP; cola, C; and wine, W), and artificial saliva (AS; control). The color change (ΔE) was evaluated before (baseline) and after thermocycling (T1), and after immersion in solution for 1 h (T2), 3 h (T3), 24 h (T4), 48 h (T5), and 96 h (T6). The CIE Lab system was used to determine the color changes. The thermocycling test was performed for 5000 cycles. Data were submitted to three-way repeated-measures analysis of variance and Tukey's test (p<0.05). When the samples were immersed in each mouthwash, all assessed factors, associated or not, significantly influenced the color change values, except there was no association between the mouthwash and acrylic resin. Similarly, when the samples were immersed in each beverage, all studied factors influenced the color change values. In general, regardless of the solution, LU exhibited the greatest ΔE values in the period from T1 to T5; and QC presented the greatest ΔE values at T6. Thus, thermocycling and immersion in the various solutions influenced the color stability of acrylic resins and QC showed the greatest color alteration.
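
The color-difference metric used in this study (CIE76 ΔE) is the Euclidean distance between two points in CIE L*a*b* space. A minimal sketch with hypothetical color coordinates:

```python
import math

def delta_e(lab1, lab2):
    """CIE76 color difference: Euclidean distance in L*a*b* space."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

baseline = (68.0, 1.2, 14.5)   # hypothetical L*, a*, b* of a resin sample
after_t6 = (65.0, 1.2, 18.5)   # hypothetical coordinates after 96 h immersion
print(round(delta_e(baseline, after_t6), 2))   # -> 5.0
```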

  11. Development of a Climatology of Vertically Complete Wind Profiles from Doppler Radar Wind Profiler Systems

    Science.gov (United States)

    Barbre, Robert E., Jr.

    2015-01-01

    This paper describes in detail the QC and splicing methodology for KSC's 50- and 915-MHz DRWP measurements that generates an extensive archive of vertically complete profiles from 0.20-18.45 km. The concurrent POR from each archive extends from April 2000 to December 2009. MSFC NE applies separate but similar QC processes to each of the 50- and 915-MHz DRWP archives. DRWP literature and data examination provide the basis for developing and applying the automated and manual QC processes on both archives. Depending on the month, the QC'ed 50- and 915-MHz DRWP archives retain 52-65% and 16-30% of the possible data, respectively. The 50- and 915-MHz DRWP QC archives retain 84-91% and 85-95%, respectively, of all the available data provided that data exist in the non-QC'ed archives. Next, MSFC NE applies an algorithm to splice concurrent measurements from both DRWP sources. Last, MSFC NE generates a composite profile from the (up to) five available spliced profiles to effectively characterize boundary layer winds and to utilize all possible 915-MHz DRWP measurements at each timestamp. During a given month, roughly 23,000-32,000 complete profiles exist from 0.25-18.45 km from the composite profiles' archive, and approximately 5,000-27,000 complete profiles exist from an archive utilizing an individual 915-MHz DRWP. One can extract a variety of profile combinations (pairs, triplets, etc.) from this sample for a given application. The sample of vertically complete DRWP wind measurements not only gives launch vehicle customers greater confidence in loads and trajectory assessments versus using balloon output, but also provides flexibility to simulate different DOL situations across applicable altitudes. In addition to increasing sample size and providing more flexibility for DOL simulations in the vehicle design phase, the spliced DRWP database provides any upcoming launch vehicle program with the capability to utilize DRWP profiles on DOL to compute vehicle steering
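
The splicing step can be sketched in miniature: below a crossover altitude the composite profile takes the 915-MHz DRWP measurements (boundary layer), and above it the 50-MHz DRWP. The altitudes, wind speeds, and crossover value below are hypothetical, not the paper's actual splice logic.

```python
def splice_profiles(p915, p50, crossover_km=2.0):
    """Merge two altitude -> wind-speed mappings into one profile:
    915-MHz DRWP below the crossover altitude, 50-MHz DRWP at or above it."""
    spliced = {z: w for z, w in p915.items() if z < crossover_km}
    spliced.update({z: w for z, w in p50.items() if z >= crossover_km})
    return dict(sorted(spliced.items()))

p915 = {0.25: 4.1, 1.0: 6.3, 2.0: 8.0}     # hypothetical low-altitude winds (m/s)
p50 = {2.0: 7.8, 5.0: 12.4, 18.45: 31.0}   # hypothetical high-altitude winds (m/s)
print(splice_profiles(p915, p50))
```

Where both archives report the crossover altitude, the 50-MHz value wins here; the real methodology resolves such overlaps with its own precedence rules.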

  12. Report on the analysis of the quality assurance and quality control data for the petroleum refining sector

    International Nuclear Information System (INIS)

    Thorton, N.; Michajluk, S.; Powell, T.; Lee, G.

    1992-07-01

    The Ontario Municipal-Industrial Strategy for Abatement (MISA) program has the ultimate goal of virtual elimination of persistent toxic contaminants from all discharges to provincial waterways. MISA effluent monitoring regulations, first promulgated for the petroleum refining sector, require direct dischargers to monitor their effluents for a specified set of contaminants at a specified frequency over a one-year period. The refineries were also required to carry out a quality control program on all process effluent streams and for specified analytical test groups. Two types of quality assurance/quality control (QA/QC) data were required: field QA/QC, which would indicate problems with field contamination or sampling, and laboratory QA/QC, which would indicate problems with the laboratory. The objectives of QA/QC analysis are to identify the significance of biases, chronic contamination, data variability, and false results, to assess data validity, and to allow data comparability among companies and laboratories. Of the 149 parameters monitored in the petroleum refining sector, 34 qualified as candidates for setting effluent limits. The QA/QC evaluation of monitoring data for the 34 parameters confirmed the presence of 18 parameters at such levels that they could be used to set statistically valid quantitative limits. 50 tabs

  13. AmeriFlux Data Processing: Integrating automated and manual data management across software technologies and an international network to generate timely data products

    Science.gov (United States)

    Christianson, D. S.; Beekwilder, N.; Chan, S.; Cheah, Y. W.; Chu, H.; Dengel, S.; O'Brien, F.; Pastorello, G.; Sandesh, M.; Torn, M. S.; Agarwal, D.

    2017-12-01

    AmeriFlux is a network of scientists who independently collect eddy covariance and related environmental observations at over 250 locations across the Americas. As part of the AmeriFlux Management Project, the AmeriFlux Data Team manages standardization, collection, quality assurance / quality control (QA/QC), and distribution of data submitted by network members. To generate data products that are timely, QA/QC'd, and repeatable, and have traceable provenance, we developed a semi-automated data processing pipeline. The new pipeline consists of semi-automated format and data QA/QC checks. Results are communicated via on-line reports as well as an issue-tracking system. Data processing time has been reduced from 2-3 days to a few hours of manual review time, resulting in faster data availability from the time of data submission. The pipeline is scalable to the network level and has the following key features. (1) On-line results of the format QA/QC checks are available immediately for data provider review. This enables data providers to correct and resubmit data quickly. (2) The format QA/QC assessment includes an automated attempt to fix minor format errors. Data submissions that are formatted in the new AmeriFlux FP-In standard can be queued for the data QA/QC assessment, often with minimal delay. (3) Automated data QA/QC checks identify and communicate potentially erroneous data via online, graphical quick views that highlight observations with unexpected values, incorrect units, time drifts, invalid multivariate correlations, and/or radiation shadows. (4) Progress through the pipeline is integrated with an issue-tracking system that facilitates communications between data providers and the data processing team in an organized and searchable fashion. Through development of these and other features of the pipeline, we present solutions to challenges that include balancing automated and manual processing, and bridging legacy data management infrastructure with
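    The simplest of the automated data QA/QC checks this record mentions, flagging observations with unexpected values, amounts to a range test against plausible physical limits. The variable name and limits below are hypothetical, not the AmeriFlux pipeline's actual configuration:

    ```python
    def flag_unexpected_values(records, limits):
        """Flag observations outside plausible physical ranges. `records` is a
        list of {variable: value} dicts (one per timestamp); `limits` maps a
        variable name to an (expected_min, expected_max) pair. Returns
        (record_index, variable, value) tuples for each out-of-range value."""
        flags = []
        for i, rec in enumerate(records):
            for var, value in rec.items():
                lo, hi = limits.get(var, (float("-inf"), float("inf")))
                if value is not None and not lo <= value <= hi:
                    flags.append((i, var, value))
        return flags
    ```

    A reading of 310.2 for an air temperature variable limited to -50..60 degrees C would be flagged, which is also how a Kelvin-for-Celsius submission (an "incorrect units" error) surfaces in a check like this.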

  14. NASTRAN thermal analyzer: Theory and application including a guide to modeling engineering problems, volume 2. [sample problem library guide

    Science.gov (United States)

    Jackson, C. E., Jr.

    1977-01-01

    A sample problem library containing 20 problems covering most facets of NASTRAN Thermal Analyzer modeling is presented. Areas discussed include radiative interchange, arbitrary nonlinear loads, transient temperature and steady-state structural plots, temperature-dependent conductivities, simulated multi-layer insulation, and constraint techniques. The use of the major control options and important DMAP alters is demonstrated.

  15. Shear Bond Strength of Orthodontic Brackets Fixed with Remineralizing Adhesive Systems after Simulating One Year of Orthodontic Treatment

    Directory of Open Access Journals (Sweden)

    Gisele Lima Bezerra

    2015-01-01

    Full Text Available The objective of this study is to assess, in vitro, the shear bond strength of orthodontic brackets fixed with remineralizing adhesive systems submitted to thermomechanical cycling, simulating one year of orthodontic treatment. Sixty-four bovine incisor teeth were randomly divided into 4 experimental groups (n=16): XT: Transbond XT; QC: Quick Cure; OL: Ortholite Color; and SEP: Transbond Plus Self-Etching Primer. The samples were submitted to thermomechanical cycling simulating one year of orthodontic treatment. Shear bond strength tests were carried out using a universal testing machine with a load cell of 50 kgf at 0.5 mm/minute. The samples were examined with a stereomicroscope and a scanning electron microscope (SEM) in order to analyze the enamel surface and the Adhesive Remnant Index (ARI). Kruskal-Wallis and Mann-Whitney (with Bonferroni correction) tests showed a significant difference between the studied groups (p<0.05). Groups XT, QC, and SEP presented the highest values of adhesive resistance, and no statistical differences were found between them. The highest frequency of failures between enamel and adhesive was observed in groups XT, QC, and OL. The Quick Cure (QC) remineralizing adhesive system presented average adhesive resistance values similar to the conventional (XT) and self-etching (SEP) adhesives, while the remineralizing system (OL) provided the lowest values of adhesive resistance.

  16. Challenges in Development of Sperm Repositories for Biomedical Fishes: Quality Control in Small-Bodied Species.

    Science.gov (United States)

    Torres, Leticia; Liu, Yue; Guitreau, Amy; Yang, Huiping; Tiersch, Terrence R

    2017-12-01

    Quality control (QC) is essential for reproducible and efficient functioning of germplasm repositories. However, many biomedical fish models present significant QC challenges due to small body sizes (<5 cm) and minuscule sperm volumes (<5 μL). Using minimal volumes of sperm, we used zebrafish to evaluate common QC endpoints as surrogates for fertilization success along sequential steps of cryopreservation. First, concentrations of calibration bead suspensions were evaluated with a Makler® counting chamber by using different sample volumes and mixing methods. For sperm analysis, samples were initially diluted at a 1:30 ratio with Hanks' balanced salt solution (HBSS). Motility was evaluated by using different ratios of sperm and activation medium, and membrane integrity was analyzed with flow cytometry at different concentrations. Concentration and sperm motility could be confidently estimated by using volumes as small as 1 μL, whereas membrane integrity required a minimum of 2 μL (at 1 × 10⁶ cells/mL). Thus, <5 μL of sperm suspension (after dilution to 30-150 μL with HBSS) was required to evaluate sperm quality by using three endpoints. Sperm quality assessment using a combination of complementary endpoints enhances QC efforts during cryopreservation, increasing reliability and reproducibility, and reducing waste of time and resources.

  17. Examination of China's performance and thematic evolution in quantum cryptography research using quantitative and computational techniques.

    Science.gov (United States)

    Olijnyk, Nicholas V

    2018-01-01

    This study performed two phases of analysis to shed light on the performance and thematic evolution of China's quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001-2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China's QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China's performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China's performance indicators (i.e., publication frequency, citation score, H-index) are growing. China's H-index (a normalized indicator) has surpassed all other countries' over the last several years. The second phase of analysis shows how China's main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement, with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China's QC research themes, which can provide clarity into how QC technologies might take shape, and should be of particular value to QC and science and technology policy researchers.
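    Among the performance indicators this record tracks, the H-index has a precise definition that is easy to sketch: the largest h such that at least h of an entity's publications have at least h citations each.

    ```python
    def h_index(citations):
        """Compute the H-index from a list of per-publication citation counts:
        sort descending, then find the largest rank at which the citation
        count still meets or exceeds the rank."""
        h = 0
        for rank, c in enumerate(sorted(citations, reverse=True), start=1):
            if c >= rank:
                h = rank
            else:
                break
        return h
    ```

    For instance, a body of work with citation counts [10, 8, 5, 4, 3] has an H-index of 4 (four papers with at least four citations each).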

  19. Performance of a new meter designed for assisted monitoring of blood glucose and point-of-care testing.

    Science.gov (United States)

    Macrury, Sandra; Srinivasan, Aparna; Mahoney, John J

    2013-03-01

    Blood glucose (BG) meters used for assisted monitoring of blood glucose (AMBG) require different attributes compared with meters designed for home use. These include safety considerations (i.e., minimized risk of blood-borne pathogen transmission), capability for testing multiple blood sample types, and enhanced performance specifications. The OneTouch® Verio™Pro+ BG meter is designed to incorporate all of these attributes. Meter accuracy was assessed in clinical studies with arterial, venous, and capillary blood samples with a hematocrit range of 22.9-59.8%. The effect of interferents, including anticoagulants, on accuracy was evaluated. The meter disinfection protocol was validated, and instructions for use and user acceptance of the system were assessed. A total of 97% (549/566) of BG measures from all blood sample types and 95.5% (191/200) of arterial blood samples were within ±12 mg/dl or 12.5% of reference measurements. The system was unaffected by 4 anticoagulants and 57 of 59 endogenous and exogenous compounds; it was affected by 2 compounds: pralidoxime iodide and xylose. Bleach wipes were sufficient to disinfect the meter. Users felt that the meter's quality control (QC) prompts would help them to comply with regulatory requirements. The meter provided accurate measurements of different blood samples over a wide hematocrit range and was not affected by 57 physiologic and therapeutic compounds. The QC prompts and specific infection-mitigating design further help make this meter system practical for AMBG in care facilities. © 2013 Diabetes Technology Society.
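    The accuracy criterion reported in this record (within ±12 mg/dl or 12.5% of the reference measurement) can be expressed as a small check function. Treating the two tolerances as a simple either/or at every glucose level is an assumption made here for illustration; accuracy standards often switch criteria by reference range instead:

    ```python
    def within_accuracy(meter, reference, abs_tol=12.0, rel_tol=0.125):
        """Return True when a meter reading agrees with the reference value
        to within the absolute tolerance (mg/dl) or the relative tolerance
        (fraction of the reference), whichever is more permissive."""
        diff = abs(meter - reference)
        return diff <= abs_tol or diff <= rel_tol * reference
    ```

    A reading of 245 mg/dl against a 220 mg/dl reference passes (the 25 mg/dl difference is within 12.5% of 220), while a reading of 250 mg/dl fails both tolerances.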

  20. Inorganic chemical analysis of environmental materials—A lecture series

    Science.gov (United States)

    Crock, J.G.; Lamothe, P.J.

    2011-01-01

    At the request of the faculty of the Colorado School of Mines, Golden, Colorado, the authors prepared and presented a lecture series to the students of a graduate level advanced instrumental analysis class. The slides and text presented in this report are a compilation and condensation of this series of lectures. The purpose of this report is to present the slides and notes and to emphasize the thought processes that should be used by a scientist submitting samples for analyses in order to procure analytical data to answer a research question. First and foremost, the analytical data generated can be no better than the samples submitted. The questions to be answered must first be well defined and the appropriate samples collected from the population that will answer the question. The proper methods of analysis, including proper sample preparation and digestion techniques, must then be applied. Care must be taken to achieve the required limits of detection of the critical analytes to yield detectable analyte concentration (above "action" levels) for the majority of the study's samples and to address what portion of those analytes answer the research question: total or partial concentrations. To guarantee a robust analytical result that answers the research question(s), a well-defined quality assurance and quality control (QA/QC) plan must be employed. This QA/QC plan must include the collection and analysis of field and laboratory blanks, sample duplicates, and matrix-matched standard reference materials (SRMs). The proper SRMs may include in-house materials and/or a selection of widely available commercial materials. A discussion of the preparation and applicability of in-house reference materials is also presented. Only when all these analytical issues are sufficiently addressed can the research questions be answered with known certainty.

  1. Workshop on measurement quality assurance for ionizing radiation: Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Heath, J.A.; Swinth, K.L. [comps.]

    1993-12-31

    This workshop was held to review the status of secondary level calibration accreditation programs, review related measurement accreditation programs, document lessons learned, and to present changes in programs due to new national priorities involving radioactivity measurements. Contents include: fundamentals of measurement quality assurance (MQA), standards for MQA programs; perspectives and policies; complete MQA programs; future MQA programs; QA/QC programs--radioactivity; QA/QC programs--dosimetry; laboratory procedures for QA/QC; in-house control of reference dosimetry laboratories; in-house controls of radioactivity laboratories; and poster session. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  3. Operational quality control of daily precipitation using spatio-climatological consistency testing

    Science.gov (United States)

    Scherrer, S. C.; Croci-Maspoli, M.; van Geijtenbeek, D.; Naguel, C.; Appenzeller, C.

    2010-09-01

    Quality control (QC) of meteorological data is of utmost importance for climate related decisions. The search for an effective automated QC of precipitation data has proven difficult, and many weather services, including MeteoSwiss, still rely mainly on manual inspection of daily precipitation. However, manpower limitations force many weather services to move towards less labour-intensive and more automated QC, with the challenge of keeping data quality high. In the last decade, several approaches have been presented to objectify daily precipitation QC. Here we present a spatio-climatological approach that will be implemented operationally at MeteoSwiss. It combines the information from the event-based spatial distribution of each day's precipitation field with the historical information on the interpolation error for different precipitation intensity intervals. Expert judgement shows that the system detects potential outliers very well (hardly any missed errors) without creating too many false alarms that need human inspection: 50-80% of all flagged values have been classified as real errors by the data editor. This is much better than the roughly 15-20% achieved using standard spatial regression tests. Also very helpful in the QC process is the automatic redistribution of accumulated several-day sums. Manual inspection in operations can be reduced and the QC of precipitation substantially objectified.
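    The core of the spatio-climatological test described in this record can be sketched as follows: compare each station's observation against a spatially interpolated estimate from its neighbours, and flag it when the departure exceeds a multiple of the historical interpolation error for that intensity. The threshold factor k and the per-station scalar error model are illustrative assumptions, not MeteoSwiss's operational parameters:

    ```python
    def flag_precip_outliers(observations, estimates, interp_error, k=3.0):
        """Flag daily precipitation values for manual inspection. All three
        arguments map a station id to, respectively, the observed value, the
        neighbour-interpolated estimate, and the historical interpolation
        error (same units, e.g. mm). A station is flagged when its departure
        from the estimate exceeds k times the historical error."""
        return [sid for sid, obs in observations.items()
                if abs(obs - estimates[sid]) > k * interp_error[sid]]
    ```

    With observations {"A": 5.0, "B": 40.0}, estimates {"A": 4.0, "B": 6.0}, and historical errors {"A": 1.5, "B": 2.0}, only station "B" is flagged for inspection.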

  4. Establishing quality control ranges for antimicrobial susceptibility testing of Escherichia coli, Pseudomonas aeruginosa, and Staphylococcus aureus: a cornerstone to develop reference strains for Korean clinical microbiology laboratories.

    Science.gov (United States)

    Hong, Sung Kuk; Choi, Seung Jun; Shin, Saeam; Lee, Wonmok; Pinto, Naina; Shin, Nari; Lee, Kwangjun; Hong, Seong Geun; Kim, Young Ah; Lee, Hyukmin; Kim, Heejung; Song, Wonkeun; Lee, Sun Hwa; Yong, Dongeun; Lee, Kyungwon; Chong, Yunsop

    2015-11-01

    Quality control (QC) processes are performed in the majority of clinical microbiology laboratories to ensure the performance of microbial identification and antimicrobial susceptibility testing by using ATCC strains. Obtaining these ATCC strains, however, entails inconveniences, notably the purchase cost of the strains and the shipping time required. This study focused on constructing a database of reference strains for QC processes using domestic bacterial strains, concentrating primarily on antimicrobial susceptibility testing. Three strains (Escherichia coli, Pseudomonas aeruginosa, and Staphylococcus aureus) that showed legible results in preliminary testing were selected. The minimal inhibitory concentrations (MICs) and zone diameters (ZDs) of eight antimicrobials for each strain were determined according to the CLSI M23. All resulting MIC and ZD ranges included at least 95% of the data. The ZD QC ranges obtained by using the CLSI method were less than 12 mm, and the MIC QC ranges extended no more than five dilutions. This study is a preliminary attempt to construct a bank of Korean QC strains. With further studies, reductions in cost and turnaround time can be anticipated.

  5. Polymerase chain reaction system using magnetic beads for analyzing a sample that includes nucleic acid

    Science.gov (United States)

    Nasarabadi, Shanavaz [Livermore, CA

    2011-01-11

    A polymerase chain reaction system for analyzing a sample containing nucleic acid includes providing magnetic beads and providing a flow channel having a polymerase chain reaction chamber, a pre polymerase chain reaction magnet position adjacent to the chamber, and a post polymerase chain reaction magnet position adjacent to the chamber. The nucleic acid is bound to the magnetic beads. The magnetic beads with the nucleic acid flow to the pre polymerase chain reaction magnet position in the flow channel. The magnetic beads and the nucleic acid are washed with ethanol. The nucleic acid in the polymerase chain reaction chamber is amplified. The magnetic beads and the nucleic acid are separated into a waste stream containing the magnetic beads and a post polymerase chain reaction mix containing the nucleic acid. The reaction mix containing the nucleic acid flows to an analysis unit in the channel for analysis.

  6. Evaluation of intensity drift correction strategies using MetaboDrift, a normalization tool for multi-batch metabolomics data.

    Science.gov (United States)

    Thonusin, Chanisa; IglayReger, Heidi B; Soni, Tanu; Rothberg, Amy E; Burant, Charles F; Evans, Charles R

    2017-11-10

    In recent years, mass spectrometry-based metabolomics has increasingly been applied to large-scale epidemiological studies of human subjects. However, the successful use of metabolomics in this context is subject to the challenge of detecting biologically significant effects despite substantial intensity drift that often occurs when data are acquired over a long period or in multiple batches. Numerous computational strategies and software tools have been developed to aid in correcting for intensity drift in metabolomics data, but most of these techniques are implemented using command-line driven software and custom scripts which are not accessible to all end users of metabolomics data. Further, it has not yet become routine practice to assess the quantitative accuracy of drift correction against techniques which enable true absolute quantitation such as isotope dilution mass spectrometry. We developed an Excel-based tool, MetaboDrift, to visually evaluate and correct for intensity drift in a multi-batch liquid chromatography - mass spectrometry (LC-MS) metabolomics dataset. The tool enables drift correction based on either quality control (QC) samples analyzed throughout the batches or using QC-sample independent methods. We applied MetaboDrift to an original set of clinical metabolomics data from a mixed-meal tolerance test (MMTT). The performance of the method was evaluated for multiple classes of metabolites by comparison with normalization using isotope-labeled internal standards. QC sample-based intensity drift correction significantly improved correlation with IS-normalized data, and resulted in detection of additional metabolites with significant physiological response to the MMTT. The relative merits of different QC-sample curve fitting strategies are discussed in the context of batch size and drift pattern complexity. Our drift correction tool offers a practical, simplified approach to drift correction and batch combination in large metabolomics studies.
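    The QC sample-based drift correction this record describes can be sketched in a few lines: fit a trend to the QC-sample intensities as a function of injection order, then divide every sample by that trend (normalized to the mean QC intensity). MetaboDrift offers richer curve-fitting options; the straight-line fit here is an illustrative simplification, not the tool's actual method:

    ```python
    def drift_correct(intensities, qc_indices):
        """Correct intensity drift across an injection sequence. `intensities`
        is one metabolite's signal per injection (in run order); `qc_indices`
        are the positions of the repeated QC-sample injections. A least-squares
        line is fit to the QC intensities versus injection order, and every
        sample is rescaled by mean_qc / fitted_trend at its position."""
        xs = list(qc_indices)
        ys = [intensities[i] for i in xs]
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                 / sum((x - mx) ** 2 for x in xs))
        intercept = my - slope * mx
        return [val * my / (intercept + slope * i)
                for i, val in enumerate(intensities)]
    ```

    For a run that decays linearly from 100 to 80 with QC injections at positions 0, 2, and 4, the corrected intensities are flat at the mean QC level, as expected for a purely instrumental drift.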

  7. Structures of human Golgi-resident glutaminyl cyclase and its complexes with inhibitors reveal a large loop movement upon inhibitor binding.

    Science.gov (United States)

    Huang, Kai-Fa; Liaw, Su-Sen; Huang, Wei-Lin; Chia, Cho-Yun; Lo, Yan-Chung; Chen, Yi-Ling; Wang, Andrew H-J

    2011-04-08

    Aberrant pyroglutamate formation at the N terminus of certain peptides and proteins, catalyzed by glutaminyl cyclases (QCs), is linked to some pathological conditions, such as Alzheimer disease. Recently, a glutaminyl cyclase (QC) inhibitor, PBD150, was shown to be able to reduce the deposition of pyroglutamate-modified amyloid-β peptides in brain of transgenic mouse models of Alzheimer disease, leading to a significant improvement of learning and memory in those transgenic animals. Here, we report the 1.05-1.40 Å resolution structures, solved by the sulfur single-wavelength anomalous dispersion phasing method, of the Golgi-luminal catalytic domain of the recently identified Golgi-resident QC (gQC) and its complex with PBD150. We also describe the high-resolution structures of secretory QC (sQC)-PBD150 complex and two other gQC-inhibitor complexes. gQC structure has a scaffold similar to that of sQC but with a relatively wider and negatively charged active site, suggesting a distinct substrate specificity from sQC. Upon binding to PBD150, a large loop movement in gQC allows the inhibitor to be tightly held in its active site primarily by hydrophobic interactions. Further comparisons of the inhibitor-bound structures revealed distinct interactions of the inhibitors with gQC and sQC, which are consistent with the results from our inhibitor assays reported here. Because gQC and sQC may play different biological roles in vivo, the different inhibitor binding modes allow the design of specific inhibitors toward gQC and sQC.

  8. Diamond drilling for nuclear waste QC

    International Nuclear Information System (INIS)

    Jennings, Martin.

    1990-01-01

    Specialised diamond core drilling equipment could soon have a role to play in the safe disposal of intermediate level radioactive waste (ILW). Equipment to core and extract samples for quality checking from cement-filled steel waste drums by techniques compatible with eventual remote-handling operations in a 'hot-cell' is being developed. All coring tests carried out to date have been on simulant waste: 200 litre drums containing mixtures of Ordinary Portland Cement, Ground Granulated Blast Furnace Slag and Pulverised Fuel Ash. No radioactive materials have yet been used for the coring trials. The coring equipment and the diamond coring bits are described. (author)

  9. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

    Full Text Available Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, the confidence level, the expected proportion of the outcome variable (for categorical variables) or standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) of the study. The more precision required, the greater the required sample size. Sampling Techniques: The probability sampling techniques applied for health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are recommended over the nonprobability sampling techniques, because the results of the study can be generalized to the target population.
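    For the categorical-outcome case this record describes, the standard sample-size formula is n = z^2 * p * (1 - p) / d^2, where p is the expected proportion, d the required precision, and z the normal quantile for the chosen confidence level. The finite-population correction below is a common refinement, not something the article itself spells out:

    ```python
    import math

    def sample_size_proportion(p, margin, confidence_z=1.96, population=None):
        """Minimum sample size for estimating a proportion p to within
        +/- margin at the confidence level implied by z (1.96 ~ 95%),
        with an optional finite-population correction, rounded up."""
        n = (confidence_z ** 2) * p * (1 - p) / margin ** 2
        if population is not None:
            n = n / (1 + (n - 1) / population)  # finite-population correction
        return math.ceil(n)
    ```

    The familiar worst case, p = 0.5 with a 5% margin at 95% confidence, gives 385 subjects; drawing from a finite population of 1,000 reduces the requirement to 278.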

  10. Accuracy and Quality Assessment of EUS-FNA: A Single-Center Large Cohort of Biopsies

    Directory of Open Access Journals (Sweden)

    Benjamin Ephraim Bluen

    2012-01-01

    Full Text Available Introduction. A thorough quality control (QC) study with systematic monitoring and evaluation is crucial to optimizing the effectiveness of EUS-FNA. Methods. The retrospective analysis comprised investigation of consecutive patient files that underwent EUS-FNA. QC specifically focused on diagnostic accuracy, impacts on preexisting diagnoses, and case management. Results. 268 patient files were evaluated. EUS-FNA cytology helped establish accurate diagnoses in 92.54% (248/268) of patients. Sensitivity, specificity, PPV, NPV, and accuracy were 83%, 100%, 100%, 91.6%, and 94%, respectively. The most common biopsy site was the pancreas (68%). The most accurate location for EUS-FNA was the esophagus, 13/13 (100%), followed by the pancreas (89.6%). EUS-FNA was least informative for abdominal lymph nodes (70.5%). After FNA and follow-up, eight false negatives for tumors were found (3%), while 7.5% of samples still lacked a definitive diagnosis. Discussion. QC suggests that the diagnostic accuracy of EUS-FNA might be improved further by (1) taking more FNA passes from suspected lesions, (2) optimizing needle selection, (3) having an experienced echo-endoscopist available during the learning curve, and (4) having a cytologist present during the procedure. QC also identified remediable reporting errors. In conclusion, a QC study is valuable in identifying weaknesses and thereby augmenting the effectiveness of EUS-FNA.
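    The five diagnostic accuracy measures reported in this record all derive from a 2x2 confusion table of true/false positives and negatives. The cell counts in the example below are illustrative, not the study's actual counts:

    ```python
    def diagnostic_metrics(tp, fp, tn, fn):
        """Compute standard diagnostic accuracy measures from confusion-table
        counts: sensitivity = TP/(TP+FN), specificity = TN/(TN+FP),
        PPV = TP/(TP+FP), NPV = TN/(TN+FN), accuracy = (TP+TN)/total."""
        total = tp + fp + tn + fn
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn),
            "accuracy": (tp + tn) / total,
        }
    ```

    With zero false positives, specificity and PPV are both 100%, matching the pattern in the record above where all positive cytology results were confirmed.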

  11. Projector-based virtual reality dome environment for procedural pain and anxiety in young children with burn injuries: a pilot study

    Directory of Open Access Journals (Sweden)

    Khadra C

    2018-02-01

    victims], and sedation (Ramsay Sedation Scale) were collected before, during, and after the procedure. Data analyses included descriptive and non-parametric inferential statistics. Results: We recruited 15 children with a mean age of 2.2±2.1 years and a mean total body surface area of 5% (±4). The mean pain score during the procedure was low (2.9/10, ±3), as was the discomfort level (2.9/10, ±2.8). Most children were cooperative, oriented, and calm. Assessing anxiety was not feasible with our sample of participants. The prototype did not interfere with the procedure and was considered useful for procedural pain management by most health care professionals. Conclusion: The projector-based VR is a feasible and acceptable intervention for procedural pain management in young children with burn injuries. A larger trial with a control group is required to assess its efficacy. Keywords: pain, virtual reality, distraction, burns, preschool children, wound care

  12. TaSYP71, a Qc-SNARE, Contributes to Wheat Resistance against Puccinia striiformis f. sp. tritici

    Directory of Open Access Journals (Sweden)

    Minjie Liu

    2016-04-01

    Full Text Available N-ethylmaleimide-sensitive factor attachment protein receptors (SNAREs) are involved in plant resistance; however, the role of SYP71 in the regulation of plant–pathogen interactions is not well known. In this study, we characterized a plant-specific SNARE in wheat, TaSYP71, which contains a Qc-SNARE domain. Three homologues are localized on chromosomes 1AL, 1BL and 1DL. Using Agrobacterium-mediated transient expression, TaSYP71 was localized to the plasma membrane in Nicotiana benthamiana. Quantitative real-time PCR assays revealed that TaSYP71 homologues were induced by NaCl, H2O2 stress and infection by virulent and avirulent Puccinia striiformis f. sp. tritici (Pst) isolates. Heterologous expression of TaSYP71 in Schizosaccharomyces pombe elevated tolerance to H2O2. Meanwhile, the H2O2-scavenging gene TaCAT was downregulated in TaSYP71-silenced plants treated with H2O2 compared to the control, which indicated that TaSYP71 enhances tolerance to H2O2 stress possibly by influencing the expression of TaCAT to remove excessive H2O2 accumulation. When TaSYP71 homologues were all silenced in wheat by the virus-induced gene silencing system, wheat plants were more susceptible to Pst, with a larger infection area and more haustoria, yet the necrotic area of wheat mesophyll cells was larger. One possible explanation is that the minor contribution of TaSYP71 to resistance was insufficient to hinder pathogen extension once TaSYP71 was silenced, so the necrotic area enlarged as the pathogen grew; later cell death cannot, of course, be excluded. In addition, the expression of pathogenesis-related genes was down-regulated in TaSYP71-silenced wheat plants. Together, these results suggest that TaSYP71 plays a positive role in wheat defence against Pst.

  13. Quality Control of Mega Voltage Portal Imaging System

    International Nuclear Information System (INIS)

    Diklic, A.; Dundara Debeljuh, D.; Jurkovic, S.; Smilovic Radojcic, D.; Svabic Kolacio; Kasabasic, M.; Faj, D.

    2013-01-01

    The Electronic Portal Imaging Device (EPID) is a system used to verify either the correct positioning of the patient during radiotherapy treatment or the linear accelerator beam parameters. The correct position of the patient corresponds to the position at which the patient was scanned at the CT simulator and according to which the therapy plan was made and optimized. Regarding this, besides the advanced treatment planning system and optimized treatment planning techniques, the day-to-day reproduction of simulated conditions is of great importance for the treatment outcome. Therefore, to verify the patient set-up, portal imaging should be applied prior to the first treatment session and repeated according to treatment prescriptions during the treatment. In order to achieve full functionality and precision of the EPID, it must be included in the radiotherapy Quality Control (QC) programme. The QC of the Mega Voltage portal imaging system was separated into two parts. In the first part, QC of the detector parameters should be performed. For this purpose, the FC2 and QC3 phantoms should be used, along with the Portal Image Processing System (PIPSpro) package for data analysis. The second part of the QC of the linear accelerator's portal imaging system should include the QC of the CBCT. In this part, a set of predefined manufacturer's tests using two different phantoms, one for the geometry calibration and the other for the image quality evaluation, should be performed. Also, the treatment conditions should be simulated using anthropomorphic phantoms and dose distributions for particular EPID protocols should be measured. Procedures for quality control of the portal imaging system developed and implemented at University Hospital Rijeka are presented in this paper. (author)

  14. An X-band Co2+ EPR study of Zn1-xCoxO (x=0.005-0.1) nanoparticles prepared by chemical hydrolysis methods using diethylene glycol and denaturated alcohol at 5 K

    Science.gov (United States)

    Misra, Sushil K.; Andronenko, S. I.; Srinivasa Rao, S.; Chess, Jordan; Punnoose, A.

    2015-11-01

    EPR investigations on two types of dilute magnetic semiconductor (DMS) ZnO nanoparticles doped with 0.5-10% Co2+ ions, prepared by two chemical hydrolysis methods using (i) diethylene glycol ((CH2CH2OH)2O) (NC; rod-like samples) and (ii) denatured ethanol (CH3CH2OH) solutions (QC; spherical samples), were carried out at X-band (9.5 GHz) at 5 K. The analysis of EPR data for NC samples revealed the presence of several types of EPR lines: (i) two types, intense and weak, of high-spin Co2+ ions in the samples with Co concentration >0.5%; (ii) surface oxygen vacancies; and (iii) a ferromagnetic resonance (FMR) line. QC samples exhibit an intense FMR line and an EPR line due to high-spin Co2+ ions. The FMR line is more intense than the corresponding line exhibited by NC samples. These EPR spectra varied for samples with different doping concentrations. The magnetic states of these samples as revealed by EPR spectra, as well as the origin of ferromagnetism in DMS samples, are discussed.

  15. 222-S laboratory quality assurance plan

    International Nuclear Information System (INIS)

    Meznarich, H.K.

    1995-01-01

    This document provides quality assurance guidelines and quality control requirements for analytical services. This document is designed on the basis of Hanford Analytical Services Quality Assurance Plan (HASQAP) technical guidelines and is used for governing 222-S and 222-SA analytical and quality control activities. The 222-S Laboratory provides analytical services to various clients including, but not limited to, waste characterization for the Tank Waste Remediation Systems (TWRS), waste characterization for regulatory waste treatment, storage, and disposal (TSD), regulatory compliance samples, radiation screening, process samples, and TPA samples. A graded approach is applied to the level of sample custody, QC, data verification, and data reporting to meet the specific needs of the client.

  16. Single Laboratory Validated Method for Determination of Cylindrospermopsin and Anatoxin-a in Ambient Water by Liquid Chromatography/ Tandem Mass Spectrometry (LC/MS/MS)

    Science.gov (United States)

    This product is an LC/MS/MS single laboratory validated method for the determination of cylindrospermopsin and anatoxin-a in ambient waters. The product contains step-by-step instructions for sample preparation, analyses, preservation, sample holding time and QC protocols to ensu...

  17. Quercetin ameliorates imiquimod-induced psoriasis-like skin inflammation in mice via the NF-κB pathway.

    Science.gov (United States)

    Chen, Haiming; Lu, Chuanjian; Liu, Huazhen; Wang, Maojie; Zhao, Hui; Yan, Yuhong; Han, Ling

    2017-07-01

    Quercetin (QC) is a dietary flavonoid abundant in many natural plants. A series of studies has shown that it exhibits several biological properties, including anti-inflammatory, anti-oxidant, cardio-protective, vasodilatory, liver-protective and anti-cancer activities. However, so far the possible therapeutic effect of QC on psoriasis has not been reported. The present study was undertaken to evaluate the potential beneficial effect of QC in psoriasis using an imiquimod (IMQ)-induced psoriasis-like mouse model, and to further elucidate its underlying mechanisms of action. Effects of QC on PASI scores, back temperature, histopathological changes, oxidative/anti-oxidative indexes, pro-inflammatory cytokines and the NF-κB pathway in IMQ-induced mice were investigated. Our results showed that QC could significantly reduce the PASI scores, decrease the temperature of the psoriasis-like lesions, and ameliorate the deteriorating histopathology in IMQ-induced mice. Moreover, QC effectively attenuated levels of TNF-α, IL-6 and IL-17 in serum, increased activities of GSH, CAT and SOD, and decreased the accumulation of MDA in skin tissue induced by IMQ in mice. The mechanism may be associated with the down-regulation of NF-κB, IKKα, NIK and RelB expression and up-regulation of TRAF3, which were critically involved in the non-canonical NF-κB pathway. In conclusion, our present study demonstrated that QC had appreciable anti-psoriasis effects in IMQ-induced mice, and the underlying mechanism may involve the improvement of antioxidant and anti-inflammatory status and inhibition of the activation of NF-κB signaling. Hence, QC, a naturally occurring flavone with potent anti-psoriatic effects, has the potential for further development as a candidate for psoriasis treatment. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Examination of China’s performance and thematic evolution in quantum cryptography research using quantitative and computational techniques

    Science.gov (United States)

    2018-01-01

    This study performed two phases of analysis to shed light on the performance and thematic evolution of China’s quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001–2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China’s QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China’s performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China’s performance indicators (i.e., Publication Frequency, citation score, H-index) are growing. China’s H-index (a normalized indicator) has surpassed all other countries’ over the last several years. The second phase of analysis shows how China’s main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China’s QC research themes, which can provide clarity into how QC technologies might take shape. QC and science and technology
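The co-word cluster network analysis described above starts from pairwise keyword co-occurrence counts across a publication corpus. A minimal sketch of that first step, using illustrative keyword lists (not the study's actual metadata):

```python
from itertools import combinations
from collections import Counter

# Illustrative per-paper keyword lists (hypothetical, echoing themes named
# in the abstract); the real study used large-scale publication metadata.
papers = [
    ["quantum-key-distribution", "photons", "decoy-state"],
    ["quantum-key-distribution", "photons", "quantum-entanglement"],
    ["quantum-entanglement", "decoy-state", "unitary-operation"],
]

# Co-word analysis begins with a co-occurrence matrix: count how often
# each keyword pair appears together in the same paper.
cooccurrence = Counter()
for keywords in papers:
    for pair in combinations(sorted(set(keywords)), 2):
        cooccurrence[pair] += 1

# The strongest links seed the co-word clusters / thematic map.
strongest = cooccurrence.most_common(1)[0]
print(strongest)  # → (('photons', 'quantum-key-distribution'), 2)
```

In the full technique, these counts are normalized (e.g., by an equivalence index) and clustered to yield the thematic evolution maps the study reports.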

  19. General Quality Control (QC) Guidelines for SAM Methods

    Science.gov (United States)

    Learn more about quality control guidelines and recommendations for the analysis of samples using the methods listed in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM).

  20. A generic template for automated bioanalytical ligand-binding assays using modular robotic scripts in support of discovery biotherapeutic programs.

    Science.gov (United States)

    Duo, Jia; Dong, Huijin; DeSilva, Binodh; Zhang, Yan J

    2013-07-01

    Sample dilution and reagent pipetting are time-consuming steps in ligand-binding assays (LBAs). Traditional automation-assisted LBAs use assay-specific scripts that require labor-intensive script writing and user training. Five major script modules were developed on Tecan Freedom EVO liquid handling software to facilitate the automated sample preparation and LBA procedure: sample dilution, sample minimum required dilution, standard/QC minimum required dilution, standard/QC/sample addition, and reagent addition. The modular design of automation scripts allowed the users to assemble an automated assay with minimal script modification. The application of the template was demonstrated in three LBAs to support discovery biotherapeutic programs. The results demonstrated that the modular scripts provided flexibility in adapting to various LBA formats and significant time savings in script writing and scientist training. Data generated by the automated process were comparable to those from the manual process, while bioanalytical productivity was significantly improved using the modular robotic scripts.
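The modular design described above, where an assay is assembled from a small set of reusable step modules rather than written as one monolithic script, can be sketched as follows. All function and parameter names here are hypothetical; the real system drives a Tecan Freedom EVO, not a Python list:

```python
# Minimal sketch of the modular-script idea: each liquid-handling step is a
# small function, and an assay is just an ordered list of (module, params).
def sample_dilution(worklist, factor):
    worklist.append(f"dilute samples 1:{factor}")
    return worklist

def standard_qc_addition(worklist, volume_ul):
    worklist.append(f"add {volume_ul} uL standard/QC/sample")
    return worklist

def reagent_addition(worklist, reagent):
    worklist.append(f"add reagent {reagent}")
    return worklist

def run_assay(steps):
    """Assemble an automated assay from modules with minimal script changes."""
    worklist = []
    for module, kwargs in steps:
        worklist = module(worklist, **kwargs)
    return worklist

# Swapping assay formats means editing this list, not rewriting scripts.
worklist = run_assay([
    (sample_dilution, {"factor": 10}),
    (standard_qc_addition, {"volume_ul": 50}),
    (reagent_addition, {"reagent": "detection antibody"}),
])
print(worklist)
```

The design choice mirrors the paper's point: the flexibility comes from reordering and reparameterizing modules, not from writing new assay-specific code.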

  1. Circle activity of quality assurance in construction

    International Nuclear Information System (INIS)

    1982-05-01

    This book explains the purpose of QC circle activity and ten things to keep in mind when introducing it; the management and role of QC circle activity; TQC and QC circle activity in construction; company case studies, such as QC circle activity as part of TQC and QC circles for a brighter future; experiences of QC circle activity, such as decreasing concrete loss, improving sleeve sticking on the PS wooden floor, and overcoming handicaps in fields where one person works; and the main points of the QC 7 tools and the order of improvement and management.

  2. Complete genomic sequences for hepatitis C virus subtypes 4b, 4c, 4d, 4g, 4k, 4l, 4m, 4n, 4o, 4p, 4q, 4r and 4t.

    Science.gov (United States)

    Li, Chunhua; Lu, Ling; Wu, Xianghong; Wang, Chuanxi; Bennett, Phil; Lu, Teng; Murphy, Donald

    2009-08-01

    In this study, we characterized the full-length genomic sequences of 13 distinct hepatitis C virus (HCV) genotype 4 isolates/subtypes: QC264/4b, QC381/4c, QC382/4d, QC193/4g, QC383/4k, QC274/4l, QC249/4m, QC97/4n, QC93/4o, QC139/4p, QC262/4q, QC384/4r and QC155/4t. These were amplified, using RT-PCR, from the sera of patients now residing in Canada, 11 of whom were African immigrants. The resulting genomes varied between 9421 and 9475 nt in length and each contains a single ORF of 9018-9069 nt. The sequences showed nucleotide similarities of 77.3-84.3 % in comparison with subtypes 4a (GenBank accession no. Y11604) and 4f (EF589160) and 70.6-72.8 % in comparison with genotype 1 (M62321/1a, M58335/1b, D14853/1c, and 1?/AJ851228) reference sequences. These similarities were often higher than those currently defined by HCV classification criteria for subtype (75.0-80.0 %) and genotype (67.0-70.0 %) division, respectively. Further analyses of the complete and partial E1 and partial NS5B sequences confirmed these 13 'provisionally assigned subtypes'.

  3. The HAZWRAP/Navy Radon Project

    International Nuclear Information System (INIS)

    Otten, J.A.; Wilson, D.L.

    1992-01-01

    In 1987, the Naval Facilities Engineering Command (NAVFACENGCOM) was assigned the responsibility of identifying potential hazards from exposure to naturally occurring indoor radon to personnel in Naval facilities and prioritizing corrective actions. In response to this request, NAVFACENGCOM established the Navy Radon Assessment and Mitigation Program (NAVRAMP), which evaluates problems associated with radon levels above the current Environmental Protection Agency (EPA) guideline (EPA-520/1-86-04) of 4 pCi/L. NAVRAMP consists of four phases: screening, assessment, mitigation, and postmitigation. Approximately 300,000 alpha track detectors (ATDs) will be used during the screening and assessment phases. Several unique approaches were developed to verify the quality of the information collected during this large program. To guarantee chain-of-custody, a computerized data management system was developed to track each ATD from initial purchase through field implementation to final analysis; it was also used to track and identify quality control (QC) problems related to the manufacture and analysis of ATDs. The QC program addressed potential problems that could occur using ATDs. The potential errors associated with the use of ATDs included (1) lot-to-lot variation, (2) chemical process variation, (3) pouch leakage, and (4) background. The data management and QC programs ensured the quality and integrity of the samples, the accuracy and precision of the analyses, the representativeness of the results, and the completeness of the information.

  4. The Quality Control Algorithms Used in the Creation of NASA Kennedy Space Center Lightning Protection System Towers Meteorological Database

    Science.gov (United States)

    Orcutt, John M.; Brenton, James C.

    2016-01-01

    An accurate database of meteorological data is essential for designing any aerospace vehicle and for preparing launch commit criteria. Meteorological instrumentation was recently placed on the three Lightning Protection System (LPS) towers at Kennedy Space Center (KSC) launch complex 39B (LC-39B), providing a unique meteorological dataset at the launch complex over an extensive altitude range. Data records of temperature, dew point, relative humidity, wind speed, and wind direction are produced at 40, 78, 116, and 139 m at each tower. The Marshall Space Flight Center Natural Environments Branch (EV44) received an archive that consists of one-minute averaged measurements for the period of record of January 2011 - April 2015. However, before the received database could be used, EV44 needed to remove any erroneous data from within the database through a comprehensive quality control (QC) process. The QC process applied to the LPS towers' meteorological data is similar to other QC processes developed by EV44, which were used in the creation of meteorological databases for other towers at KSC. The QC process utilized in this study has been modified specifically for use with the LPS tower database. The QC process first includes a check of each individual sensor. This check includes removing any unrealistic data and checking the temporal consistency of each variable. Next, data from all three sensors at each height are checked against each other, checked against climatology, and checked for sensors that erroneously report a constant value. Then, a vertical consistency check of each variable at each tower is completed. Last, the upwind sensor at each level is selected to minimize the influence of the towers and other structures at LC-39B on the measurements. The selection process for the upwind sensor implemented a study of tower-induced turbulence.
This paper describes in detail the QC process, QC results, and the attributes of the LPS towers meteorological
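The per-sensor checks described first in the QC process (removing unrealistic values, checking temporal consistency, and flagging sensors stuck at a constant value) can be sketched for a one-minute time series. The thresholds below are illustrative only, not the values used for the LPS tower database:

```python
def qc_time_series(values, lo, hi, max_step, const_run):
    """Flag samples in a 1-min time series: out-of-range values, implausible
    minute-to-minute jumps, and runs of an erroneously constant reading.
    Thresholds are illustrative, not EV44's actual criteria."""
    flags = [False] * len(values)
    for i, v in enumerate(values):
        if not (lo <= v <= hi):                          # unrealistic value
            flags[i] = True
        if i > 0 and abs(v - values[i - 1]) > max_step:  # temporal consistency
            flags[i] = True
    run = 1
    for i in range(1, len(values)):
        run = run + 1 if values[i] == values[i - 1] else 1
        if run >= const_run:                             # stuck-sensor check
            for j in range(i - run + 1, i + 1):
                flags[j] = True
    return flags

# Temperature (deg C) with one spike and one stuck stretch:
temps = [21.0, 21.2, 35.0, 21.3, 21.3, 21.3, 21.3, 21.4]
flags = qc_time_series(temps, lo=-20, hi=45, max_step=5.0, const_run=4)
print(flags)
```

The later stages (cross-sensor comparison at each height, climatology checks, vertical consistency, upwind selection) would operate on the samples that survive these first filters.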

  5. Navy radon assessment and mitigation program: Final report

    International Nuclear Information System (INIS)

    1994-10-01

    This final report encompasses the events from the beginning of the Navy Radon Assessment and Mitigation Program to the closure of the program on October 31, 1994. Included in the report are discussions of the phases of the program, including screening, assessment, mitigation, and post-mitigation. The primary discussion involves screening and assessment. The report addresses recommendations made to the Naval Facilities Engineering Command by the Hazardous Waste Remedial Actions Program of Martin Marietta Energy Systems, Inc., and the final decisions that were made. Special emphasis is placed on quality assurance/quality control (QA/QC), since QA/QC was given top priority during the implementation of this program. Included in the discussion on QA/QC are an overview of the measurement process, positive and negative controls, replicated measurements, and the application of chamber exposures to data calibration. The report concludes with a discussion of testing considerations for naval facilities and radon mitigation considerations for the Department of the Navy.

  6. New Approaches to Quantum Computing using Nuclear Magnetic Resonance Spectroscopy

    International Nuclear Information System (INIS)

    Colvin, M; Krishnan, V V

    2003-01-01

    The power of a quantum computer (QC) relies on the fundamental concept of superposition in quantum mechanics, which allows an inherent large-scale parallelization of computation. In a QC, binary information embodied in a quantum system, such as the spin degrees of freedom of a spin-1/2 particle, forms the qubits (quantum mechanical bits), over which appropriate logical gates perform the computation. In classical computers, the basic unit of information is the bit, which can take a value of either 0 or 1. Bits are connected together by logic gates to form logic circuits to implement complex logical operations. The expansion of modern computers has been driven by the development of faster, smaller and cheaper logic gates. As the size of the logic gates shrinks toward atomic dimensions, the performance of such a system is no longer classical but is instead governed by quantum mechanics. Quantum computers offer the potentially superior prospect of solving computational problems that are intractable to classical computers, such as efficient database searches and cryptography. A variety of algorithms have been developed recently, most notably Shor's algorithm for factorizing long numbers into prime factors in polynomial time and Grover's quantum search algorithm. These algorithms were of only theoretical interest until recently, when several methods were proposed for building an experimental QC. These methods include trapped ions, cavity-QED, coupled quantum dots, Josephson junctions, spin resonance transistors, linear optics and nuclear magnetic resonance. Nuclear magnetic resonance (NMR) is uniquely capable of constructing small QCs, and several algorithms have been implemented successfully. NMR-QC differs from other implementations in one important way: it is not a single QC, but a statistical ensemble of them. Thus, quantum computing based on NMR is considered ensemble quantum computing. In NMR quantum computing, the spins with
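The superposition and parallelism described above are standard quantum-computing mathematics and can be illustrated with a tiny statevector simulation (this is a generic sketch, not specific to NMR implementations):

```python
import numpy as np

# A Hadamard gate takes a qubit from |0> to an equal superposition of |0> and |1>.
ket0 = np.array([1.0, 0.0])
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)

psi = H @ ket0                      # (|0> + |1>)/sqrt(2)
probs = np.abs(psi) ** 2            # Born rule: measurement probabilities
print(probs)                        # → [0.5 0.5]

# Two qubits: a Hadamard on each gives all four basis states equal weight --
# the state space a QC operates on in parallel doubles with every added qubit.
psi2 = np.kron(psi, psi)
print(np.abs(psi2) ** 2)            # → [0.25 0.25 0.25 0.25]
```

Algorithms such as Grover's and Shor's then apply further gates so that interference concentrates amplitude on the answer before measurement.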

  7. A survey of the practice and management of radiotherapy linear accelerator quality control in the UK.

    Science.gov (United States)

    Palmer, A; Kearton, J; Hayman, O

    2012-11-01

    The objective of this study was to determine current radiotherapy linear accelerator quality control (QC) practice in the UK, as a comparative benchmark and indicator of development needs, and to raise awareness of QC as a key performance indicator. All UK radiotherapy centres were invited to complete an online questionnaire regarding their local QC processes, and submit their QC schedules. The range of QC tests, frequency of measurements and acceptable tolerances in use across the UK were analysed, and consensus and range statistics determined. 72% of the UK's 62 radiotherapy centres completed the questionnaire and 40% provided their QC schedules. 60 separate QC tests were identified from the returned schedules. There was a large variation in the total time devoted to QC between centres: interquartile range from 13 to 26 h per linear accelerator per month. There has been a move from weekly to monthly testing of output calibration in the last decade, with reliance on daily constancy testing equipment. 33% of centres thought their schedules were in need of an update and only 30% used risk-assessment approaches to determine local QC schedule content. Less than 30% of centres regularly complete all planned QC tests each month, although 96% achieve over 80% of tests. A comprehensive "snapshot" of linear accelerator QC testing practice in the UK has been collated, which demonstrates reasonable agreement between centres in their stated QC test frequencies. However, intelligent design of QC schedules and management is necessary to ensure efficiency and appropriateness.

  8. On-chip acoustophoretic isolation of microflora including S. typhimurium from raw chicken, beef and blood samples.

    Science.gov (United States)

    Ngamsom, Bongkot; Lopez-Martinez, Maria J; Raymond, Jean-Claude; Broyer, Patrick; Patel, Pradip; Pamme, Nicole

    2016-04-01

    Pathogen analysis in food samples routinely involves lengthy growth-based pre-enrichment and selective enrichment of food matrices to increase the ratio of pathogen to background flora. Similarly, for blood culture analysis, pathogens must be isolated and enriched from a large excess of blood cells to allow further analysis. Conventional techniques of centrifugation and filtration are cumbersome, suffer from low sample throughput, are not readily amenable to automation and carry a risk of damaging biological samples. We report on-chip acoustophoresis as a pre-analytical technique for the resolution of total microbial flora from food and blood samples. The resulting 'clarified' sample is expected to increase the performance of downstream systems for the specific detection of the pathogens. A microfluidic chip with three inlets, a central separation channel and three outlets was utilized. Samples were introduced through the side inlets, and buffer solution through the central inlet. Upon ultrasound actuation, large debris particles (10-100 μm) from meat samples were continuously partitioned into the central buffer channel, leaving the 'clarified' outer sample streams, containing both the pathogenic cells and the background flora (ca. 1 μm), to be collected over a 30 min operation cycle before further analysis. The system was successfully tested with Salmonella typhimurium-spiked (ca. 10³ CFU mL⁻¹) samples of chicken and minced beef, demonstrating a high level of pathogen recovery (60-90%). When applied to S. typhimurium-contaminated blood samples (10⁷ CFU mL⁻¹), acoustophoresis resulted in a high depletion (99.8%) of the red blood cells (RBC), which partitioned into the buffer stream, whilst sufficient numbers of viable S. typhimurium remained in the outer channels for further analysis. These results indicate that the technology may provide a generic approach for pre-analytical sample preparation prior to integrated and automated downstream detection of

  9. Influence of particle size and preparation methods on the physical and chemical stability of amorphous simvastatin

    DEFF Research Database (Denmark)

    Zhang, Fang; Aaltonen, Jaakko; Tian, Fang

    2009-01-01

    This study investigated the factors influencing the stability of amorphous simvastatin. Quench-cooled amorphous simvastatin in two particle size ranges, 150-180 microm (QC-big) and ... compared to the crystalline form. The rank of solubility was found to be QC-big=QC-small>CM>crystalline. For the physical stability, the highest crystallization rate was observed for CM, and the slowest rate was detected for QC-big, with an intermediate rate occurring for QC-small. QC exhibited lower...

  10. Effects of two sources of tannins (Quercus L. and Vaccinium vitis idaea L.) on rumen microbial fermentation: an in vitro study

    Directory of Open Access Journals (Sweden)

    Adam Cieslak

    2014-04-01

    Full Text Available The aim of the experiment was to determine the effect of different sources of tannins on in vitro rumen fermentation, with a focus on methane production. In the experiment, a rumen simulation system (RUSITEC) equipped with 4 fermenters (1 L) was used in three replicated runs (6 d of adaptation and 4 d of sampling) to study the effects of Quercus cortex extract (QC), Vaccinium vitis idaea (VVI) dried leaf extract and a mixture of VVI/QC on rumen microbial fermentation. Fermenters were fed 10.9 g/d of dry matter (DM) of a 600:400 forage:concentrate diet. Treatments were control, QC (2.725 mL), VVI leaves (0.080 g) and a mixture of QC/VVI (1.362 mL + 0.040 g), and were randomly assigned to fermenters within periods. The equivalent of 2.5 g of tannins/kg dietary DM from the three sources of tannins was evaluated. All tannin sources decreased CH4 and ammonia concentrations, as well as protozoa and methanogen counts (P<0.001). Vaccinium vitis idaea and QC/VVI tended (P=0.005) to reduce the acetate to propionate ratio. There were no changes in nutrient digestion. Results suggest that these sources of tannins, especially VVI, have the potential to reduce rumen CH4 production and ammonia concentration without negative effects on in vitro DM digestibility, total volatile fatty acids and pH.

  11. Multi-factor authentication using quantum communication

    Science.gov (United States)

    Hughes, Richard John; Peterson, Charles Glen; Thrasher, James T.; Nordholt, Jane E.; Yard, Jon T.; Newell, Raymond Thorson; Somma, Rolando D.

    2018-02-06

    Multi-factor authentication using quantum communication ("QC") includes stages for enrollment and identification. For example, a user enrolls for multi-factor authentication that uses QC with a trusted authority. The trusted authority transmits device factor information associated with a user device (such as a hash function) and user factor information associated with the user (such as an encrypted version of a user password). The user device receives and stores the device factor information and user factor information. For multi-factor authentication that uses QC, the user device retrieves its stored device factor information and user factor information, then transmits the user factor information to the trusted authority, which also retrieves its stored device factor information. The user device and trusted authority use the device factor information and user factor information (more specifically, information such as a user password that is the basis of the user factor information) in multi-factor authentication that uses QC.
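The enrollment/identification data flow described in this patent abstract can be sketched as a toy example. This is emphatically not the patented protocol: the real scheme authenticates over a quantum channel, whereas here the "QC step" is reduced to comparing derived secrets, and all names (salt standing in for the hash-function device factor, a salted hash standing in for the encrypted user factor) are hypothetical:

```python
import hashlib
import hmac
import os

def enroll(user_password: bytes):
    """Trusted authority issues device-factor and user-factor information;
    the user device stores copies of both (toy stand-ins, see lead-in)."""
    device_salt = os.urandom(16)   # stand-in for the device factor (hash function)
    user_factor = hashlib.sha256(device_salt + user_password).digest()  # stand-in for the encrypted user factor
    ta_record = {"device_salt": device_salt, "user_factor": user_factor}
    device_store = {"device_salt": device_salt, "user_factor": user_factor}
    return ta_record, device_store

def identify(ta_record, device_store, claimed_password: bytes) -> bool:
    """Both sides recombine the device and user factors; authentication
    succeeds only if the freshly derived user factor matches enrollment."""
    derived = hashlib.sha256(device_store["device_salt"] + claimed_password).digest()
    return hmac.compare_digest(derived, ta_record["user_factor"])

ta, dev = enroll(b"correct horse")
print(identify(ta, dev, b"correct horse"))   # → True
print(identify(ta, dev, b"wrong password"))  # → False
```

The structural point the sketch preserves is that authentication requires combining something the device holds with something the user knows, matched against what the trusted authority recorded at enrollment.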

  12. An effective and optimal quality control approach for green energy manufacturing using design of experiments framework and evolutionary algorithm

    Science.gov (United States)

    Saavedra, Juan Alejandro

    Quality Control (QC) and Quality Assurance (QA) strategies vary significantly across industries in the manufacturing sector depending on the product being built. Such strategies range from simple statistical analysis and process controls to the decision-making process of reworking, repairing, or scrapping defective product. This study proposes an optimal QC methodology for including rework stations in the manufacturing process by identifying the number and location of these workstations. The factors considered to optimize these stations are cost, cycle time, reworkability and rework benefit. The goal is to minimize the cost and cycle time of the process while increasing reworkability and rework benefit. The specific objectives of this study are: (1) to propose a cost estimation model that includes energy consumption, and (2) to propose an optimal QC methodology to identify the quantity and location of rework workstations. The cost estimation model includes energy consumption as part of the product direct cost. The cost estimation model developed allows the user to calculate product direct cost as the quality sigma level of the process changes. This provides a benefit because a complete cost estimation calculation does not need to be performed every time the process yield changes. This cost estimation model is then used for the QC strategy optimization process. In order to propose a methodology that provides an optimal QC strategy, the possible factors that affect QC were evaluated. A screening Design of Experiments (DOE) was performed on seven initial factors and identified 3 significant factors. It showed that one response variable was not required for the optimization process. A full factorial DOE was then performed to verify the significant factors obtained previously. The QC strategy optimization is performed through a Genetic Algorithm (GA), which allows the evaluation of many candidate solutions in order to obtain feasible optimal solutions. 
The GA
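The GA-based search over rework-station placements can be sketched as a toy example: each candidate is a bitmask over workstations (1 = add a rework station there), evolved by selection, crossover, and mutation. The fitness weights below are hypothetical stand-ins for the study's cost / cycle-time / reworkability / rework-benefit trade-off:

```python
import random

random.seed(7)

N_STATIONS = 8
STATION_COST = [4, 2, 5, 1, 3, 6, 2, 4]       # hypothetical rework-station costs
STATION_BENEFIT = [5, 1, 7, 2, 6, 9, 1, 3]    # hypothetical rework benefits

def fitness(mask):
    """Net rework benefit of a placement (maximize benefit minus cost)."""
    cost = sum(c for c, bit in zip(STATION_COST, mask) if bit)
    benefit = sum(b for b, bit in zip(STATION_BENEFIT, mask) if bit)
    return benefit - cost

def evolve(pop_size=30, generations=40, mutation=0.1):
    pop = [[random.randint(0, 1) for _ in range(N_STATIONS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]               # elitist truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_STATIONS)    # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - bit if random.random() < mutation else bit
                     for bit in child]               # bit-flip mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

With this additive toy fitness the optimum is simply to place stations wherever benefit exceeds cost; the GA becomes worthwhile when, as in the study, the objectives interact (cycle time, multiple responses) and enumeration is impractical.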

  13. CHALLENGES IN SETTING UP QUALITY CONTROL IN DIAGNOSTIC RADIOLOGY FACILITIES IN NIGERIA.

    Science.gov (United States)

    Inyang, S O; Egbe, N O; Ekpo, E

    2015-01-01

    The Nigerian Nuclear Regulatory Authority (NNRA) was established to regulate and control the use of radioactive and radiation-emitting sources in Nigeria. Quality control (QC) of diagnostic radiology equipment forms part of the fundamental requirements for the authorization of diagnostic radiology facilities in the country. Some QC tests (output, exposure linearity and reproducibility) were performed on the x-ray machines in the facilities that took part in the study. A questionnaire was developed to evaluate the frequencies at which QC tests were conducted in the facilities and the challenges in setting up QC. Results show great variation in the values of the QC parameters measured. Inadequate cooperation by facility management, lack of QC equipment and insufficient staff constitute the major challenges in setting up QC in the facilities under study. The responses on the frequencies at which QC tests should be conducted did not correspond to the recommended standards, indicating that personnel were not familiar with QC implementation and may require further training on QC.

  14. International Journal of Health Research

    African Journals Online (AJOL)

    Erah

    for each clinical chemistry laboratory to establish its own ranges. Keywords: ... In clinical management of patients, ... physician have good reference information. ... Control (QC) sample (Multi Sera) using Beckman Synchron System CX5.

  15. Helios: History and Anatomy of a Successful In-House Enterprise High-Throughput Screening and Profiling Data Analysis System.

    Science.gov (United States)

    Gubler, Hanspeter; Clare, Nicholas; Galafassi, Laurent; Geissler, Uwe; Girod, Michel; Herr, Guy

    2018-06-01

    We describe the main characteristics of the Novartis Helios data analysis software system (Novartis, Basel, Switzerland) for plate-based screening and profiling assays, which was designed and built about 11 years ago. It has been in productive use for more than 10 years and is one of the important standard software applications running for a large user community at all Novartis Institutes for BioMedical Research sites globally. A high degree of automation is reached by embedding the data analysis capabilities into a software ecosystem that deals with the management of samples, plates, and result data files, including automated data loading. The application provides a series of analytical procedures, ranging from very simple to advanced, which can easily be assembled by users in very flexible ways. This also includes the automatic derivation of a large set of quality control (QC) characteristics at every step. Any of the raw, intermediate, and final results and QC-relevant quantities can be easily explored through linked visualizations. Links to global assay metadata management, data warehouses, and an electronic lab notebook system are in place. Automated transfer of relevant data to data warehouses and electronic lab notebook systems is also implemented.

  16. The molecular beam epitaxy growth and characterization of zinc cadmium selenide/zinc cadmium magnesium selenide-indium phosphide quantum cascade structures for operation in the 3-5 μm range

    Science.gov (United States)

    Charles, William O.

    material parameters are critically important in the process of modeling QC structures, it is not surprising that early success was achieved using these systems. Today, the best performing QC lasers operate in the 4-13 μm range and are produced using lattice-matched InGaAs/InAlAs-InP. In order to produce short-wavelength QC lasers, the well layer thicknesses in the active region of the device must be reduced in an effort to push the lasing energy states further apart. This reduction in well thickness moves the upper lasing state closer to the band edge, which increases the probability of the loss of lasing-state electrons to the continuum. Therefore, in order to produce high-performing short-wavelength QC lasers, a large conduction band offset (CBO) is required. The CBO of lattice-matched InGaAs/InAlAs-InP is 0.52 eV. In an attempt to produce high-performing devices below 4 μm, many researchers have resorted to the use of strain compensation [9-11]. This approach has yielded very little improvement in performance due to electron scattering to the X and L intervalleys, which has led to the exploration of wide-bandgap material systems such as the antimonides and nitrides. In this work the wide-bandgap II-VI Znx'Cd(1-x')Se/ZnxCdyMg(1-x-y)Se-InP system is explored for QC laser fabrication. To this end, QC lasers were designed for operation in the 3-5 μm range. A Matlab-based program was written to calculate the energy level spacing within the active region of these devices. This simulation program was based on Schrödinger's equation and the transfer matrix technique. Several calibration samples were grown to establish the doping levels and growth rate of the well and barrier materials. The growth rate was measured using scanning electron microscopy (SEM) and reflection high-energy electron diffraction (RHEED) oscillations during MBE growth.
X-ray diffraction measurements were performed to determine the lattice mismatch of the II-VI bulk layers, and
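    The level-spacing calculation this record describes can be illustrated with a simple quantum-well solver. The sketch below uses a finite-difference Hamiltonian rather than the transfer-matrix technique of the original Matlab program, and the well depth, width, and effective mass are assumed round numbers, not the ZnCdSe/ZnCdMgSe parameters; it only shows how an intersubband spacing of a few tenths of an eV maps to a 3-5 μm emission wavelength.

```python
import numpy as np

# Finite-difference 1D Schrodinger solver for a single quantum well.
# All material parameters below are assumptions for illustration.
HBAR2_2M0 = 0.0381          # hbar^2 / (2 m0) in eV nm^2
M_EFF = 0.1                 # effective mass in units of m0 (assumed)
V0 = 1.0                    # barrier height / CBO in eV (assumed)
WELL = 5.0                  # well width in nm (assumed)
L, N = 20.0, 400            # simulation box length (nm) and grid points

x = np.linspace(0.0, L, N)
dx = x[1] - x[0]
V = np.where(np.abs(x - L / 2) < WELL / 2, 0.0, V0)   # square-well potential

t = HBAR2_2M0 / M_EFF / dx**2                         # kinetic hopping term, eV
H = np.diag(2 * t + V) - t * np.eye(N, k=1) - t * np.eye(N, k=-1)
levels = np.linalg.eigvalsh(H)                        # ascending eigenvalues

E1, E2 = levels[0], levels[1]
print(f"E1 = {E1:.3f} eV, E2 = {E2:.3f} eV, "
      f"emission wavelength ~ {1.240 / (E2 - E1):.1f} um")
```

    With these assumed parameters the first two confined levels sit roughly 0.28 eV apart, corresponding to a wavelength near 4.4 μm, i.e. inside the 3-5 μm design window; narrowing the well pushes the levels apart, which is why a large CBO is needed at short wavelengths.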

  17. Surrogate analyte approach for quantitation of endogenous NAD(+) in human acidified blood samples using liquid chromatography coupled with electrospray ionization tandem mass spectrometry.

    Science.gov (United States)

    Liu, Liling; Cui, Zhiyi; Deng, Yuzhong; Dean, Brian; Hop, Cornelis E C A; Liang, Xiaorong

    2016-02-01

    A high-performance liquid chromatography tandem mass spectrometry (LC-MS/MS) assay for the quantitative determination of NAD(+) in human whole blood using a surrogate analyte approach was developed and validated. Human whole blood was acidified using 0.5N perchloric acid at a ratio of 1:3 (v:v, blood:perchloric acid) during sample collection. 25μL of acidified blood was extracted using a protein precipitation method and the resulting extracts were analyzed using reverse-phase chromatography and positive electrospray ionization mass spectrometry. (13)C5-NAD(+) was used as the surrogate analyte for authentic analyte, NAD(+). The standard curve ranging from 0.250 to 25.0μg/mL in acidified human blood for (13)C5-NAD(+) was fitted to a 1/x(2) weighted linear regression model. The LC-MS/MS response between surrogate analyte and authentic analyte at the same concentration was obtained before and after the batch run. This response factor was not applied when determining the NAD(+) concentration from the (13)C5-NAD(+) standard curve since the percent difference was less than 5%. The precision and accuracy of the LC-MS/MS assay based on the five analytical QC levels were well within the acceptance criteria from both FDA and EMA guidance for bioanalytical method validation. Average extraction recovery of (13)C5-NAD(+) was 94.6% across the curve range. Matrix factor was 0.99 for both high and low QC indicating minimal ion suppression or enhancement. The validated assay was used to measure the baseline level of NAD(+) in 29 male and 21 female human subjects. This assay was also used to study the circadian effect of endogenous level of NAD(+) in 10 human subjects. Copyright © 2015 Elsevier B.V. All rights reserved.
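    The 1/x² weighted linear regression used for the standard curve in this record can be sketched as follows. The standard concentrations and instrument responses below are made-up example values, not the study's data; only the weighting scheme and back-calculation step mirror the described method.

```python
# 1/x^2-weighted least-squares calibration line y = a*x + b.
def weighted_linear_fit(conc, resp):
    """Fit y = a*x + b with weights w = 1/x^2 (emphasizes low standards)."""
    w = [1.0 / (x * x) for x in conc]
    sw = sum(w)
    sx = sum(wi * x for wi, x in zip(w, conc))
    sy = sum(wi * y for wi, y in zip(w, resp))
    sxx = sum(wi * x * x for wi, x in zip(w, conc))
    sxy = sum(wi * x * y for wi, x, y in zip(w, conc, resp))
    a = (sw * sxy - sx * sy) / (sw * sxx - sx * sx)
    b = (sy - a * sx) / sw
    return a, b

# Illustrative standard curve (ug/mL vs. peak-area ratio), invented values.
standards = [0.25, 0.5, 1.0, 2.5, 5.0, 10.0, 25.0]
responses = [0.051, 0.098, 0.205, 0.49, 1.02, 1.98, 5.1]

slope, intercept = weighted_linear_fit(standards, responses)
qc_response = 0.40
qc_conc = (qc_response - intercept) / slope   # back-calculates to ~1.99 ug/mL
```

    The 1/x² weighting keeps relative (rather than absolute) error roughly constant across the curve, which is why it is the usual choice when a calibration range spans two orders of magnitude, as here (0.250 to 25.0 μg/mL).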

  18. Quality Control Quantification (QCQ): A Tool to Measure the Value of Quality Control Checks in Radiation Oncology

    Energy Technology Data Exchange (ETDEWEB)

    Ford, Eric C., E-mail: eford@uw.edu [Department of Radiation Oncology and Molecular Radiation Sciences, Johns Hopkins University, Baltimore, Maryland (United States); Terezakis, Stephanie; Souranis, Annette; Harris, Kendra [Department of Radiation Oncology and Molecular Radiation Sciences, Johns Hopkins University, Baltimore, Maryland (United States); Gay, Hiram; Mutic, Sasa [Department of Radiation Oncology, Washington University, St. Louis, Missouri (United States)

    2012-11-01

    Purpose: To quantify the error-detection effectiveness of commonly used quality control (QC) measures. Methods: We analyzed incidents from 2007-2010 logged into a voluntary in-house electronic incident learning system at 2 academic radiation oncology clinics. None of the incidents resulted in patient harm. Each incident was graded for potential severity using the French Nuclear Safety Authority scoring scale; high potential severity incidents (score >3) were considered, along with a subset of 30 randomly chosen low severity incidents. Each report was evaluated to identify which of 15 common QC checks could have detected it. The effectiveness was calculated, defined as the percentage of incidents that each QC measure could detect, both for individual QC checks and for combinations of checks. Results: In total, 4407 incidents were reported, 292 of which had high-potential severity. High- and low-severity incidents were detectable by 4.0 ± 2.3 (mean ± SD) and 2.6 ± 1.4 QC checks, respectively (P<.001). All individual checks were less than 50% sensitive with the exception of pretreatment plan review by a physicist (63%). An effectiveness of 97% was achieved with 7 checks used in combination and was not further improved with more checks. The combination of checks with the highest effectiveness includes physics plan review, physician plan review, Electronic Portal Imaging Device-based in vivo portal dosimetry, radiation therapist timeout, weekly physics chart check, the use of checklists, port films, and source-to-skin distance checks. Some commonly used QC checks such as pretreatment intensity modulated radiation therapy QA do not substantially add to the ability to detect errors in these data. Conclusions: The effectiveness of QC measures in radiation oncology depends sensitively on which checks are used and in which combinations. 
A small percentage of errors cannot be detected by any of the standard formal QC checks currently in broad use, suggesting that
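    The effectiveness calculation this record defines (the percentage of incidents each QC check, or combination of checks, could have detected) can be sketched directly. The incident-to-check mapping below is invented for illustration; only the scoring scheme follows the record.

```python
# Each incident is the set of QC checks that could have caught it.
# These example incidents are invented, not the study's data.
INCIDENTS = [
    {"physics_review", "physician_review"},
    {"physics_review", "in_vivo_dosimetry"},
    {"chart_check"},
    {"physics_review", "chart_check", "timeout"},
    {"timeout"},
    set(),  # detectable by none of the listed checks
]
ALL_CHECKS = ["physics_review", "physician_review", "in_vivo_dosimetry",
              "chart_check", "timeout"]

def effectiveness(checks):
    """Fraction of incidents detectable by at least one of the given checks."""
    hits = sum(1 for inc in INCIDENTS if inc & set(checks))
    return hits / len(INCIDENTS)

def best_combination(k):
    """Greedily add the check that raises joint coverage the most."""
    chosen = []
    for _ in range(k):
        candidates = [c for c in ALL_CHECKS if c not in chosen]
        chosen.append(max(candidates, key=lambda c: effectiveness(chosen + [c])))
    return chosen
```

    Note that the incident with an empty check set is never covered, no matter how many checks are combined; this mirrors the record's observation that a small percentage of errors escape every standard formal QC check.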

  19. Quality Control Quantification (QCQ): A Tool to Measure the Value of Quality Control Checks in Radiation Oncology

    International Nuclear Information System (INIS)

    Ford, Eric C.; Terezakis, Stephanie; Souranis, Annette; Harris, Kendra; Gay, Hiram; Mutic, Sasa

    2012-01-01

    Purpose: To quantify the error-detection effectiveness of commonly used quality control (QC) measures. Methods: We analyzed incidents from 2007-2010 logged into a voluntary in-house electronic incident learning system at 2 academic radiation oncology clinics. None of the incidents resulted in patient harm. Each incident was graded for potential severity using the French Nuclear Safety Authority scoring scale; high potential severity incidents (score >3) were considered, along with a subset of 30 randomly chosen low severity incidents. Each report was evaluated to identify which of 15 common QC checks could have detected it. The effectiveness was calculated, defined as the percentage of incidents that each QC measure could detect, both for individual QC checks and for combinations of checks. Results: In total, 4407 incidents were reported, 292 of which had high-potential severity. High- and low-severity incidents were detectable by 4.0 ± 2.3 (mean ± SD) and 2.6 ± 1.4 QC checks, respectively (P<.001). All individual checks were less than 50% sensitive with the exception of pretreatment plan review by a physicist (63%). An effectiveness of 97% was achieved with 7 checks used in combination and was not further improved with more checks. The combination of checks with the highest effectiveness includes physics plan review, physician plan review, Electronic Portal Imaging Device-based in vivo portal dosimetry, radiation therapist timeout, weekly physics chart check, the use of checklists, port films, and source-to-skin distance checks. Some commonly used QC checks such as pretreatment intensity modulated radiation therapy QA do not substantially add to the ability to detect errors in these data. Conclusions: The effectiveness of QC measures in radiation oncology depends sensitively on which checks are used and in which combinations. A small percentage of errors cannot be detected by any of the standard formal QC checks currently in broad use, suggesting that further

  20. Quality control quantification (QCQ): a tool to measure the value of quality control checks in radiation oncology.

    Science.gov (United States)

    Ford, Eric C; Terezakis, Stephanie; Souranis, Annette; Harris, Kendra; Gay, Hiram; Mutic, Sasa

    2012-11-01

    To quantify the error-detection effectiveness of commonly used quality control (QC) measures. We analyzed incidents from 2007-2010 logged into a voluntary in-house electronic incident learning system at 2 academic radiation oncology clinics. None of the incidents resulted in patient harm. Each incident was graded for potential severity using the French Nuclear Safety Authority scoring scale; high potential severity incidents (score >3) were considered, along with a subset of 30 randomly chosen low severity incidents. Each report was evaluated to identify which of 15 common QC checks could have detected it. The effectiveness was calculated, defined as the percentage of incidents that each QC measure could detect, both for individual QC checks and for combinations of checks. In total, 4407 incidents were reported, 292 of which had high-potential severity. High- and low-severity incidents were detectable by 4.0 ± 2.3 (mean ± SD) and 2.6 ± 1.4 QC checks, respectively (P<.001). All individual checks were less than 50% sensitive with the exception of pretreatment plan review by a physicist (63%). An effectiveness of 97% was achieved with 7 checks used in combination and was not further improved with more checks. The combination of checks with the highest effectiveness includes physics plan review, physician plan review, Electronic Portal Imaging Device-based in vivo portal dosimetry, radiation therapist timeout, weekly physics chart check, the use of checklists, port films, and source-to-skin distance checks. Some commonly used QC checks such as pretreatment intensity modulated radiation therapy QA do not substantially add to the ability to detect errors in these data. The effectiveness of QC measures in radiation oncology depends sensitively on which checks are used and in which combinations. A small percentage of errors cannot be detected by any of the standard formal QC checks currently in broad use, suggesting that further improvements are needed. These data

  1. Quality Control Algorithms for the Kennedy Space Center 50-Megahertz Doppler Radar Wind Profiler Winds Database

    Science.gov (United States)

    Barbre, Robert E., Jr.

    2012-01-01

    This paper presents the process used by the Marshall Space Flight Center Natural Environments Branch (EV44) to quality control (QC) data from the Kennedy Space Center's 50-MHz Doppler Radar Wind Profiler (DRWP) for use in vehicle wind loads and steering commands. The database has been built to mitigate limitations of using the currently archived databases from weather balloons. The DRWP database contains wind measurements from approximately 2.7-18.6 km altitude at roughly five-minute intervals for the August 1997 to December 2009 period of record, and the extensive QC process was designed to remove spurious data from various forms of atmospheric and non-atmospheric artifacts. The QC process is largely based on DRWP literature, but two new algorithms have been developed to remove data contaminated by convection and excessive first-guess propagations from the Median Filter First Guess Algorithm. In addition to describing the automated and manual QC process in detail, this paper describes the extent of the data retained. Roughly 58% of all possible wind observations exist in the database, with approximately 100 times as many complete profile sets existing relative to the EV44 balloon databases. This increased sample of near-continuous wind profile measurements may help increase launch availability by reducing the uncertainty of wind changes during launch countdown
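    One automated QC step of the median-filter kind mentioned in this record can be sketched as a running-median outlier screen: each range gate's wind value is compared against the median of its neighbors and flagged if it deviates by more than a threshold. This is a simplified stand-in, with an invented window size and threshold, not the operational EV44 algorithm.

```python
def median(vals):
    """Middle value of a list (average of the two middle values if even)."""
    s = sorted(vals)
    n = len(s)
    return s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])

def qc_flags(profile, half_window=2, threshold=10.0):
    """Flag gates whose wind deviates > threshold (m/s) from the median
    of up to half_window neighboring gates on each side."""
    flags = []
    for i, v in enumerate(profile):
        lo = max(0, i - half_window)
        window = profile[lo:i] + profile[i + 1:i + 1 + half_window]
        flags.append(abs(v - median(window)) > threshold)
    return flags

# One spurious gate (55.0 m/s) embedded in a smooth profile.
winds = [12.0, 12.5, 13.1, 55.0, 13.8, 14.2, 14.9]
print(qc_flags(winds))  # -> [False, False, False, True, False, False, False]
```

    A median first guess is preferred over a mean here because a single contaminated gate barely shifts the median, so the spurious value stands out against it.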

  2. Chiral liquid chromatography-mass spectrometry (LC-MS/MS) method development for the detection of salbutamol in urine samples.

    Science.gov (United States)

    Chan, Sue Hay; Lee, Warren; Asmawi, Mohd Zaini; Tan, Soo Choon

    2016-07-01

    A sequential solid-phase extraction (SPE) method was developed and validated using liquid chromatography-electrospray ionization tandem mass spectrometry (LC-ESI-MS/MS) for the detection and quantification of salbutamol enantiomers in porcine urine. Porcine urine samples were hydrolysed with β-glucuronidase/arylsulfatase from Helix pomatia and then subjected to a double SPE, first using the Abs-Elut Nexus SPE and then followed by the Bond Elut Phenylboronic Acid (PBA) SPE. The salbutamol enantiomers were separated using the Astec CHIROBIOTIC™ T HPLC column (3.0mm×100mm; 5μm) maintained at 15°C with a 15min isocratic run at a flow rate of 0.4mL/min. The mobile phase consisted of 5mM ammonium formate in methanol. Salbutamol and salbutamol-tert-butyl-d9 (internal standard, IS) were monitored and quantified with the multiple reaction monitoring (MRM) mode. The method showed good linearity for the range of 0.1-10ng/mL with limit of quantification at 0.3ng/mL. Analysis of the QC samples showed intra- and inter-assay precisions to be less than 5.04%, and recovery ranging from 83.82 to 102.33%. Copyright © 2016 Elsevier B.V. All rights reserved.
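    The precision and recovery figures reported for the QC samples in this record are computed as a coefficient of variation and a measured-to-nominal ratio, respectively. A minimal sketch, using invented replicate values rather than the study's data:

```python
def cv_percent(values):
    """Coefficient of variation (%): sample standard deviation / mean * 100."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    return (var ** 0.5) / mean * 100.0

def recovery_percent(measured_mean, nominal):
    """Recovery (%): mean measured concentration relative to nominal."""
    return measured_mean / nominal * 100.0

# Invented QC replicates (ng/mL) at a nominal 1.0 ng/mL level.
replicates = [0.96, 1.02, 0.99, 1.04, 0.98]
print(f"intra-assay CV = {cv_percent(replicates):.2f}%")          # 3.20%
print(f"recovery = {recovery_percent(sum(replicates)/5, 1.0):.1f}%")  # 99.8%
```

    Inter-assay precision is the same CV calculation applied across runs on different days rather than within one run.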

  3. Short term performance and effect of speed humps on pavement condition of Alexandria Governorate roads

    Directory of Open Access Journals (Sweden)

    Wael Bekheet

    2014-12-01

    Some performance trends were observed and found to be statistically significant, including superior short-term performance of projects with good or average construction QC records when compared to poor construction QC records. Raveling was the most widely observed distress, while load-related distresses were not common. The analysis also showed that the presence of improper speed humps significantly affected the pavement condition, reducing the PCI of the pavement sections by up to 19 PCI points.

  4. Analytic device including nanostructures

    KAUST Repository

    Di Fabrizio, Enzo M.; Fratalocchi, Andrea; Totero Gongora, Juan Sebastian; Coluccio, Maria Laura; Candeloro, Patrizio; Cuda, Gianni

    2015-01-01

    A device for detecting an analyte in a sample comprising: an array including a plurality of pixels, each pixel including a nanochain comprising: a first nanostructure, a second nanostructure, and a third nanostructure, wherein size of the first nanostructure is larger than that of the second nanostructure, and size of the second nanostructure is larger than that of the third nanostructure, and wherein the first nanostructure, the second nanostructure, and the third nanostructure are positioned on a substrate such that when the nanochain is excited by an energy, an optical field between the second nanostructure and the third nanostructure is stronger than an optical field between the first nanostructure and the second nanostructure, wherein the array is configured to receive a sample; and a detector arranged to collect spectral data from a plurality of pixels of the array.

  5. Quantum key distribution using card, base station and trusted authority

    Energy Technology Data Exchange (ETDEWEB)

    Nordholt, Jane E.; Hughes, Richard John; Newell, Raymond Thorson; Peterson, Charles Glen; Rosenberg, Danna; McCabe, Kevin Peter; Tyagi, Kush T.; Dallmann, Nicholas

    2017-06-14

    Techniques and tools for quantum key distribution ("QKD") between a quantum communication ("QC") card, base station and trusted authority are described herein. In example implementations, a QC card contains a miniaturized QC transmitter and couples with a base station. The base station provides a network connection with the trusted authority and can also provide electric power to the QC card. When coupled to the base station, after authentication by the trusted authority, the QC card acquires keys through QKD with the trusted authority. The keys can be used to set up secure communication, for authentication, for access control, or for other purposes. The QC card can be implemented as part of a smart phone or other mobile computing device, or the QC card can be used as a fillgun for distribution of the keys.

  6. Quantum key distribution using card, base station and trusted authority

    Science.gov (United States)

    Nordholt, Jane Elizabeth; Hughes, Richard John; Newell, Raymond Thorson; Peterson, Charles Glen; Rosenberg, Danna; McCabe, Kevin Peter; Tyagi, Kush T; Dallman, Nicholas

    2015-04-07

    Techniques and tools for quantum key distribution ("QKD") between a quantum communication ("QC") card, base station and trusted authority are described herein. In example implementations, a QC card contains a miniaturized QC transmitter and couples with a base station. The base station provides a network connection with the trusted authority and can also provide electric power to the QC card. When coupled to the base station, after authentication by the trusted authority, the QC card acquires keys through QKD with a trusted authority. The keys can be used to set up secure communication, for authentication, for access control, or for other purposes. The QC card can be implemented as part of a smart phone or other mobile computing device, or the QC card can be used as a fillgun for distribution of the keys.

  7. Geochemical reanalysis of historical U.S. Geological Survey sediment samples from the Tonsina area, Valdez Quadrangle, Alaska

    Science.gov (United States)

    Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.

    2015-01-01

    The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 128 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the Tonsina area in the Chugach Mountains, Valdez quadrangle, Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published in this report as a coauthored DGGS report, and will be incorporated into the statewide geochemical databases of both agencies

  8. 40 CFR 98.334 - Monitoring and QA/QC requirements.

    Science.gov (United States)

    2010-07-01

    ... Methods for Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Laboratory Samples of Coal... Standard Test Method for Determination of Carbon in Refractory and Reactive Metals and Their Alloys...

  9. Results of the Excreta Bioassay Quality Control Program for April 1, 2009 through March 31, 2010

    Energy Technology Data Exchange (ETDEWEB)

    Antonio, Cheryl L.

    2012-07-19

    A total of 58 urine samples and 10 fecal samples were submitted during the report period (April 1, 2009 through March 31, 2010) to General Engineering Laboratories (GEL), South Carolina by the Hanford Internal Dosimetry Program (IDP) to check the accuracy, precision, and detection levels of their analyses. Urine analyses for Sr, 238Pu, 239Pu, 241Am, 243Am, 235U, 238U, and elemental uranium and fecal analyses for 241Am, 238Pu and 239Pu were tested this year, as well as four tissue samples for 238Pu, 239Pu, 241Am and 241Pu. The number of QC urine samples submitted during the report period represented 1.3% of the total samples submitted. In addition to the samples provided by IDP, GEL was also required to conduct their own QC program, and submit the results of analyses to IDP. About 33% of the analyses processed by GEL during the third year of this contract were quality control samples. GEL tested the performance of 21 radioisotopes, all of which met or exceeded the specifications in the Statement of Work within statistical uncertainty (Table 4).

  10. Multi-Site Quality Assurance Project Plan for Wisconsin Public Service Corporation, Peoples Gas Light and Coke Company, and North Shore Gas

    Science.gov (United States)

    This Multi-Site QAPP presents the organization, data quality objectives (DQOs), a set of anticipated activities, sample analysis, data handling and specific Quality Assurance/Quality Control (QA/QC) procedures associated with Studies done in EPA Region 5

  11. 40 CFR 98.184 - Monitoring and QA/QC requirements.

    Science.gov (United States)

    2010-07-01

    ... Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by... Determination of Carbon in Refractory and Reactive Metals and Their Alloys (incorporated by reference, see § 98...

  12. WE-AB-209-12: Quasi Constrained Multi-Criteria Optimization for Automated Radiation Therapy Treatment Planning

    Energy Technology Data Exchange (ETDEWEB)

    Watkins, W.T.; Siebers, J.V. [University of Virginia, Charlottesville, VA (United States)

    2016-06-15

    Purpose: To introduce quasi-constrained Multi-Criteria Optimization (qcMCO) for unsupervised radiation therapy optimization which generates alternative patient-specific plans emphasizing dosimetric tradeoffs and conformance to clinical constraints for multiple delivery techniques. Methods: For N Organs At Risk (OARs) and M delivery techniques, qcMCO generates M(N+1) alternative treatment plans per patient. Objective weight variations for OARs and targets are used to generate alternative qcMCO plans. For 30 locally advanced lung cancer patients, qcMCO plans were generated for dosimetric tradeoffs to four OARs: each lung, heart, and esophagus (N=4) and 4 delivery techniques (simple 4-field arrangements, 9-field coplanar IMRT, 27-field non-coplanar IMRT, and non-coplanar Arc IMRT). Quasi-constrained objectives included target prescription isodose to 95% (PTV-D95), maximum PTV dose (PTV-Dmax)< 110% of prescription, and spinal cord Dmax<45 Gy. The algorithm’s ability to meet these constraints while simultaneously revealing dosimetric tradeoffs was investigated. Statistically significant dosimetric tradeoffs were defined such that the coefficient of determination between dosimetric indices which varied by at least 5 Gy between different plans was >0.8. Results: The qcMCO plans varied mean dose by >5 Gy to ipsilateral lung for 24/30 patients, contralateral lung for 29/30 patients, esophagus for 29/30 patients, and heart for 19/30 patients. In the 600 plans computed without human interaction, average PTV-D95=67.4±3.3 Gy, PTV-Dmax=79.2±5.3 Gy, and spinal cord Dmax was >45 Gy in 93 plans (>50 Gy in 2/600 plans). Statistically significant dosimetric tradeoffs were evident in 19/30 plans, including multiple tradeoffs of at least 5 Gy between multiple OARs in 7/30 cases. The most common statistically significant tradeoff was increasing PTV-Dmax to reduce OAR dose (15/30 patients). Conclusion: The qcMCO method can conform to quasi-constrained objectives while revealing
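    The plan-generation scheme in this record, M(N+1) plans from M delivery techniques and N organs at risk, can be sketched as one balanced baseline per technique plus one single-OAR-emphasis variant per (technique, OAR) pair. The weight values below are assumptions for illustration; the record does not state the actual objective weights used.

```python
# Techniques and OARs taken from the record; weight values are invented.
TECHNIQUES = ["4-field", "9-field IMRT", "27-field IMRT", "arc IMRT"]
OARS = ["ipsilateral lung", "contralateral lung", "heart", "esophagus"]

def qcmco_weight_sets(base=1.0, emphasis=5.0):
    """One balanced baseline per technique plus one variant per
    (technique, OAR) pair that emphasizes a single OAR: M*(N+1) plans."""
    plans = []
    for tech in TECHNIQUES:
        plans.append((tech, {oar: base for oar in OARS}))   # balanced baseline
        for focus in OARS:
            weights = {oar: base for oar in OARS}
            weights[focus] = emphasis                       # emphasize one OAR
            plans.append((tech, weights))
    return plans

plans = qcmco_weight_sets()
print(len(plans))  # M * (N + 1) = 4 * (4 + 1) = 20
```

    Each weight set would then be handed to the optimizer together with the quasi-constrained target and spinal-cord objectives, which stay fixed across all M(N+1) plans.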

  13. WE-AB-209-12: Quasi Constrained Multi-Criteria Optimization for Automated Radiation Therapy Treatment Planning

    International Nuclear Information System (INIS)

    Watkins, W.T.; Siebers, J.V.

    2016-01-01

    Purpose: To introduce quasi-constrained Multi-Criteria Optimization (qcMCO) for unsupervised radiation therapy optimization which generates alternative patient-specific plans emphasizing dosimetric tradeoffs and conformance to clinical constraints for multiple delivery techniques. Methods: For N Organs At Risk (OARs) and M delivery techniques, qcMCO generates M(N+1) alternative treatment plans per patient. Objective weight variations for OARs and targets are used to generate alternative qcMCO plans. For 30 locally advanced lung cancer patients, qcMCO plans were generated for dosimetric tradeoffs to four OARs: each lung, heart, and esophagus (N=4) and 4 delivery techniques (simple 4-field arrangements, 9-field coplanar IMRT, 27-field non-coplanar IMRT, and non-coplanar Arc IMRT). Quasi-constrained objectives included target prescription isodose to 95% (PTV-D95), maximum PTV dose (PTV-Dmax)< 110% of prescription, and spinal cord Dmax<45 Gy. The algorithm’s ability to meet these constraints while simultaneously revealing dosimetric tradeoffs was investigated. Statistically significant dosimetric tradeoffs were defined such that the coefficient of determination between dosimetric indices which varied by at least 5 Gy between different plans was >0.8. Results: The qcMCO plans varied mean dose by >5 Gy to ipsilateral lung for 24/30 patients, contralateral lung for 29/30 patients, esophagus for 29/30 patients, and heart for 19/30 patients. In the 600 plans computed without human interaction, average PTV-D95=67.4±3.3 Gy, PTV-Dmax=79.2±5.3 Gy, and spinal cord Dmax was >45 Gy in 93 plans (>50 Gy in 2/600 plans). Statistically significant dosimetric tradeoffs were evident in 19/30 plans, including multiple tradeoffs of at least 5 Gy between multiple OARs in 7/30 cases. The most common statistically significant tradeoff was increasing PTV-Dmax to reduce OAR dose (15/30 patients). Conclusion: The qcMCO method can conform to quasi-constrained objectives while revealing

  14. Impact of Case Mix Severity on Quality Improvement in a Patient-centered Medical Home (PCMH) in the Maryland Multi-Payor Program.

    Science.gov (United States)

    Khanna, Niharika; Shaya, Fadia T; Chirikov, Viktor V; Sharp, David; Steffen, Ben

    2016-01-01

    We present data on quality of care (QC) improvement in 35 of 45 National Quality Forum metrics reported annually by 52 primary care practices recognized as patient-centered medical homes (PCMHs) that participated in the Maryland Multi-Payor Program from 2011 to 2013. We assigned QC metrics to (1) chronic, (2) preventive, and (3) mental health care domains. The study used a panel data design with no control group. Using longitudinal fixed-effects regressions, we modeled QC and case mix severity in a PCMH. Overall, 35 of 45 quality metrics reported by 52 PCMHs demonstrated improvement over 3 years, and case mix severity did not affect the achievement of quality improvement. From 2011 to 2012, QC increased by 0.14 (P case mix severity did not correlate with QC. In multivariate analyses, higher QC correlated with larger practices, greater proportion of older patients, and readmission visits. Rural practices had higher proportions of Medicaid patients, lower QC, and higher QC improvement in interaction analyses with time. The gains in QC in the chronic disease domain, the preventive care domain, and, most significantly, the mental health care domain were observed over time regardless of patient case mix severity. QC improvement was generally not modified by practice characteristics, except for rurality. © Copyright 2016 by the American Board of Family Medicine.

  15. The FAA's postmortem forensic toxicology self-evaluated proficiency test program: the second seven years.

    Science.gov (United States)

    Chaturvedi, Arvind K; Craft, Kristi J; Cardona, Patrick S; Rogers, Paul B; Canfield, Dennis V

    2009-05-01

    During toxicological evaluations of samples from fatally injured pilots involved in civil aviation accidents, a high degree of quality control/quality assurance (QC/QA) is maintained. Under this philosophy, the Federal Aviation Administration (FAA) started a forensic toxicology proficiency-testing (PT) program in July 1991. In continuation of the first seven years of the PT findings reported earlier, PT findings of the next seven years are summarized herein. Twenty-eight survey samples (12 urine, 9 blood, and 7 tissue homogenate) with/without alcohols/volatiles, drugs, and/or putrefactive amine(s) were submitted to an average of 31 laboratories, of which an average of 25 participants returned their results. Analytes in survey samples were correctly identified and quantitated by a large number of participants, but some false positives of concern were reported. It is anticipated that the FAA's PT program will continue to serve the forensic toxicology community through this important part of the QC/QA for laboratory accreditations.

  16. Euthanasia and assisted suicide: a physician’s and ethicist’s perspectives

    OpenAIRE

    Boudreau, J. Donald; Somerville, Margaret

    2014-01-01

    J Donald Boudreau,1 Margaret A Somerville2 1Faculty of Medicine, Department of Medicine, McGill University, Montreal, QC, Canada; 2Faculty of Law, Faculty of Medicine, and Centre for Medicine, Ethics and Law, McGill University, Montreal, QC, Canada. Abstract: The debate on legalizing euthanasia and assisted suicide has a broad range of participants including physicians, scholars in ethics and health law, politicians, and the general public. It is conflictual, and despite its importance, particip...

  17. A field-deployable compound-specific isotope analyzer based on quantum cascade laser and hollow waveguide

    Science.gov (United States)

    Wu, Sheng; Deev, Andrei

    2013-01-01

    A field-deployable Compound Specific Isotope Analyzer (CSIA) coupled with capillary chromatography, based on Quantum Cascade (QC) lasers and a Hollow Waveguide (HWG), with precision and chemical resolution matching mature mass spectrometry, has been achieved in our laboratory. The system can realize 0.3 per mil accuracy for 12C/13C for a Gas Chromatography (GC) peak lasting as short as 5 seconds, with carbon molar concentration in the GC peak less than 0.5%. Spectroscopic advantages of the HWG when working with QC lasers, i.e. single-mode transmission, noiseless measurement and small sample volume, are compared with traditional free-space and multipass spectroscopy methods.
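
    The quoted 0.3 per mil figure refers to the standard delta notation for carbon isotope ratios. A minimal sketch of that calculation; the VPDB reference ratio used here is an assumed literature value, not taken from the abstract:

```python
# 13C/12C ratio of the VPDB reference standard (commonly quoted value;
# treat it as an assumption of this sketch).
R_VPDB = 0.0111802

def delta13C_permil(r_sample, r_standard=R_VPDB):
    """delta-13C (per mil) = (R_sample / R_standard - 1) * 1000."""
    return (r_sample / r_standard - 1.0) * 1000.0

# A 0.3 per mil difference, as quoted for the analyzer, corresponds to a
# fractional change in the isotope ratio of only 3e-4:
r = R_VPDB * (1 + 0.3e-3)
print(round(delta13C_permil(r), 3))  # → 0.3
```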

  18. Quality control of 131I treatment of graves' disease

    International Nuclear Information System (INIS)

    Liu Zeng; Liu Guoqiang

    2009-01-01

    To establish preliminary quality control (QC) criteria and apply them to the various stages of clinical 131 I treatment of Graves' disease, in order to decrease the early occurrence of hypothyroidism and enhance the one-time 131 I cure rate of Graves' disease, QC criteria were determined for the various steps of 131 I medication in stochastic outpatients, covering the indications, contraindications, method of treatment, matters needing attention, follow-up observation and curative effect appraisal, patient selection, RAIU, thyroid gland weight measurement and 131 I dose criteria. The 131 I treatment effects for Graves' disease, including the once-cure rate, the improving rate, the duplicate cure rate and the early occurrence rate of hypothyroidism, were analyzed in patients treated with and without applying the QC criteria. The results showed that the once-cure rate in patients with QC criteria applied increased from 76.6% to 90.9% (P≤0.01); the improving rate decreased from 12.2% to 7.0% (P≤0.01); the duplicate cure rate increased from 90.1% to 93.0% (P>0.05); and the early occurrence rate of hypothyroidism decreased from 11.0% to 2.1% (P≤0.01). The 131 I treatment of Graves' disease applying the QC criteria tremendously improved the once-cure rate and decreased the early occurrence of hypothyroidism. (authors)
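
    The rate comparisons reported above (e.g. once-cure 76.6% vs 90.9%, P≤0.01) are the kind of result a 2x2 chi-square test yields. A sketch with hypothetical group sizes, since the abstract reports rates but not patient counts:

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]]
    (no continuity correction)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: 91/100 once-cured with QC criteria applied vs
# 77/100 without (the abstract gives only the percentages).
stat = chi2_2x2(91, 9, 77, 23)
print(stat > 6.63)  # → True (exceeds the chi-square critical value for P = 0.01)
```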

  19. Proteomics Quality Control: Quality Control Software for MaxQuant Results.

    Science.gov (United States)

    Bielow, Chris; Mastrobuoni, Guido; Kempa, Stefan

    2016-03-04

    Mass spectrometry-based proteomics coupled to liquid chromatography has matured into an automated, high-throughput technology, producing data on the scale of multiple gigabytes per instrument per day. Consequently, automated quality control (QC) and quality analysis (QA) capable of detecting measurement bias, verifying consistency, and avoiding propagation of error are paramount for instrument operators and scientists in charge of downstream analysis. We have developed an R-based QC pipeline called Proteomics Quality Control (PTXQC) for bottom-up LC-MS data generated by the MaxQuant software pipeline. PTXQC creates a QC report containing a comprehensive and powerful set of QC metrics, augmented with automated scoring functions. The automated scores are collated to create an overview heatmap at the beginning of the report, giving valuable guidance to nonspecialists as well. Our software supports a wide range of experimental designs, including stable isotope labeling by amino acids in cell culture (SILAC), tandem mass tags (TMT), and label-free data. Furthermore, we introduce new metrics to score MaxQuant's Match-between-runs (MBR) functionality, by which peptide identifications can be transferred across Raw files based on accurate retention time and m/z. Last but not least, PTXQC is easy to install and use and represents the first QC software capable of processing MaxQuant result tables. PTXQC is freely available at https://github.com/cbielow/PTXQC.
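
    The idea of collating per-metric scores into an overview heatmap can be sketched as follows. The metric names and scoring functions here are illustrative assumptions, not PTXQC's actual implementation:

```python
import numpy as np

def score_smaller_is_better(values):
    """Map a raw per-file metric to a [0, 1] score where the smallest
    observed value scores 1 (min-max normalization, inverted)."""
    v = np.asarray(values, float)
    rng = np.ptp(v)
    return np.ones_like(v) if rng == 0 else 1.0 - (v - v.min()) / rng

# Two invented per-file metrics across three Raw files:
ms1_noise = [1.2, 3.4, 1.1]      # lower is better
id_rate   = [0.35, 0.28, 0.41]   # higher is better

# Collate scores into a heatmap matrix: rows = files, columns = metrics.
heatmap = np.column_stack([
    score_smaller_is_better(ms1_noise),
    1.0 - score_smaller_is_better(id_rate),  # invert: higher is better
])
print(heatmap.shape)  # → (3, 2)
```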

  20. EMODnet Thematic Lot n° 4 - Chemistry

    DEFF Research Database (Denmark)

    Beckers, Jean-Marie; Buga, Luminita; Debray, Noelie

    2015-01-01

    Data quality assurance and quality control (QA/QC) is an important issue in oceanographic data management, especially for the creation of multidisciplinary and comprehensive databases which include data from different and/or unknown origins covering long time periods. During the first data validation loop, each region adopted its own protocol, and the results showed many inconsistent data quality flags and the need for coordination and harmonization of practices. A dedicated workshop was organized to review the different practices and agree on a common methodology for data QA/QC and Diva products generation for EMODnet Chemistry. This report intends to be a reference manual for EMODnet Chemistry data QA/QC and the subsequent product generation, which will contribute considerably to the validation of large data collections.

  1. Results of The Excreta Bioassay Quality Control Program For April 1, 2010 Through March 31, 2011

    Energy Technology Data Exchange (ETDEWEB)

    Antonio, Cheryl L.

    2012-07-19

    A total of 76 urine samples and 10 spiked fecal samples were submitted during the report period (April 1, 2010 through March 31, 2011) to GEL Laboratories, LLC in South Carolina by the Hanford Internal Dosimetry Program (IDP) to check the accuracy, precision, and detection levels of their analyses. Urine analyses for 14C, Sr, 238Pu, 239Pu, 241Am, 243Am, 235U, 238U, and 238U-mass, and fecal analyses for 241Am, 238Pu and 239Pu were tested this year. The number of QC urine samples submitted during the report period represented 1.1% of the total samples submitted. In addition to the samples provided by IDP, GEL was also required to conduct its own QC program and submit the results of analyses to IDP. About 31% of the analyses processed by GEL during the first year of contract 112512 were quality control samples. GEL tested the performance of 23 radioisotopes, all of which met or exceeded the specifications in the Statement of Work within statistical uncertainty, except for a slightly elevated relative bias for 243,244Cm (Table 4).
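
    Checking spiked QC samples against their known values reduces to a relative-bias calculation of this general form; the activity values below are invented for illustration:

```python
def relative_bias(measured, spiked):
    """Relative bias of QC results against known spiked activities:
    mean of (measured - known) / known over all QC samples."""
    pairs = list(zip(measured, spiked))
    return sum((m - s) / s for m, s in pairs) / len(pairs)

# Hypothetical spiked-sample results (arbitrary activity units):
measured = [1.05, 0.98, 1.10, 1.02]
spiked   = [1.00, 1.00, 1.00, 1.00]
print(round(relative_bias(measured, spiked), 4))  # → 0.0375
```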

  2. A real-time automated quality control of rain gauge data based on multiple sensors

    Science.gov (United States)

    qi, Y.; Zhang, J.

    2013-12-01

    Precipitation is one of the most important meteorological and hydrological variables. Automated rain gauge networks provide direct measurements of precipitation and have been used for numerous applications such as generating regional and national precipitation maps, calibrating remote sensing data, and validating hydrological and meteorological model predictions. Automated gauge observations are prone to a variety of error sources (instrument malfunction, transmission errors, format changes) and require careful quality control (QC). Many previous gauge QC techniques were based on neighborhood checks within the gauge network itself, and their effectiveness depends on gauge density and precipitation regime. The current study takes advantage of the multi-sensor data sources in the National Mosaic and Multi-Sensor QPE (NMQ/Q2) system and develops an automated gauge QC scheme based on the consistency of radar hourly QPEs and gauge observations. Error characteristics of radar and gauge as a function of the radar sampling geometry, precipitation regimes, and the freezing-level height are considered. The new scheme was evaluated by comparing an NMQ national gauge-based precipitation product with independent manual gauge observations. Twelve heavy rainfall events from different seasons and areas of the United States were selected for the evaluation, and the results show that the new NMQ product with QC'ed gauges has a more physically realistic spatial distribution than the old product and agrees much better statistically with the independent gauges.
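
    A gauge-radar consistency check of the general kind described might look like the following sketch. The thresholds and the combined absolute/relative rule are illustrative assumptions; the actual NMQ scheme also conditions on radar sampling geometry, precipitation regime and freezing-level height:

```python
def qc_flag_gauges(gauge_mm, radar_mm, abs_tol=2.0, rel_tol=0.5):
    """Flag a gauge-hour as suspect when it disagrees with the collocated
    radar hourly QPE by more than abs_tol (mm) AND by more than rel_tol
    as a fraction of the larger of the two values."""
    flags = []
    for g, r in zip(gauge_mm, radar_mm):
        diff = abs(g - r)
        bigger = max(g, r, 1e-6)  # guard against zero/zero hours
        flags.append(diff > abs_tol and diff / bigger > rel_tol)
    return flags

gauge = [10.0, 0.0, 5.2]   # a stuck gauge reports 0.0 in hour 2
radar = [9.1, 6.5, 5.0]
print(qc_flag_gauges(gauge, radar))  # → [False, True, False]
```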

  3. Quality control in dual head γ-cameras: comparison between methods and software used for image analysis

    International Nuclear Information System (INIS)

    Nayl E, A.; Fornasier, M. R.; De Denaro, M.; Sulieman, A.; Alkhorayef, M.; Bradley, D.

    2017-10-01

    Patient radiation dose and image quality are the main issues in nuclear medicine (Nm) procedures. Currently, many protocols are used for image acquisition and analysis of quality control (Qc) tests. National Electrical Manufacturers Association (Nema) methods and protocols are a widely accepted means of providing accurate description, measurement and reporting of γ-camera performance parameters. However, no standard software is available for image analysis. The aim of this study was to compare the vendor Qc software analysis with three software packages from different developers, downloaded free from the internet: NMQC, Nm Tool kit and ImageJ-Nm Tool kit. The three packages are used for image analysis of some Qc tests for γ-cameras based on Nema protocols, including non-uniformity evaluation. Ten non-uniformity Qc images were taken from a dual head γ-camera (Siemens Symbia) installed in Trieste general hospital (Italy) and analyzed. Excel analysis was used as the baseline calculation of the non-uniformity test according to Nema procedures. The results of the non-uniformity analysis showed good agreement between the three independent packages and the Excel calculation (the average differences were 0.3%, 2.9%, 1.3% and 1.6% for UFOV integral, UFOV differential, CFOV integral and CFOV differential, respectively), while a significant difference was detected between the analysis of the company Qc software and the Excel analysis (the average differences were 14.6%, 20.7%, 25.7% and 31.9% for UFOV integral, UFOV differential, CFOV integral and CFOV differential, respectively). NMQC software was the best in comparison with the Excel calculations. The variation in the results is due to the different pixel sizes used for analysis in the three packages and the γ-camera Qc software. Therefore, it is important to perform the tests with the vendor Qc software as well as with independent analysis to understand the differences between the values. Moreover, the medical physicist should know
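
    The integral-uniformity figure these packages compute follows the NEMA definition IU = 100 * (max - min) / (max + min) over the field of view. A minimal sketch; the NU-1 smoothing and UFOV/CFOV masking steps are omitted here, and the counts are invented:

```python
import numpy as np

def nema_integral_uniformity(counts):
    """NEMA integral uniformity (%) over a field of view:
    IU = 100 * (max - min) / (max + min), computed on the (already
    smoothed, per NU-1) pixel counts within the FOV."""
    c = np.asarray(counts, float)
    return 100.0 * (c.max() - c.min()) / (c.max() + c.min())

# Toy 3x3 flood-field region with counts between 95 and 105:
fov = [[100, 98, 103], [105, 99, 97], [95, 101, 100]]
print(round(nema_integral_uniformity(fov), 2))  # → 5.0
```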

  4. Quality control in dual head γ-cameras: comparison between methods and software used for image analysis

    Energy Technology Data Exchange (ETDEWEB)

    Nayl E, A. [Sudan Atomic Energy Commission, Radiation Safety Institute, Khartoum (Sudan); Fornasier, M. R.; De Denaro, M. [Azienda Sanitaria Universitaria Integrata di Trieste, Medical Physics Department, Via Giovanni Sai 7, 34128 Trieste (Italy); Sulieman, A. [Prince Sattam bin Abdulaziz University, College of Applied Medical Sciences, Radiology and Medical Imaging Department, P. O. Box 422, 11942 Al-Kharj (Saudi Arabia); Alkhorayef, M.; Bradley, D., E-mail: abdwsh10@hotmail.com [University of Surrey, Department of Physics, GU2-7XH Guildford, Surrey (United Kingdom)

    2017-10-15

    Patient radiation dose and image quality are the main issues in nuclear medicine (Nm) procedures. Currently, many protocols are used for image acquisition and analysis of quality control (Qc) tests. National Electrical Manufacturers Association (Nema) methods and protocols are a widely accepted means of providing accurate description, measurement and reporting of γ-camera performance parameters. However, no standard software is available for image analysis. The aim of this study was to compare the vendor Qc software analysis with three software packages from different developers, downloaded free from the internet: NMQC, Nm Tool kit and ImageJ-Nm Tool kit. The three packages are used for image analysis of some Qc tests for γ-cameras based on Nema protocols, including non-uniformity evaluation. Ten non-uniformity Qc images were taken from a dual head γ-camera (Siemens Symbia) installed in Trieste general hospital (Italy) and analyzed. Excel analysis was used as the baseline calculation of the non-uniformity test according to Nema procedures. The results of the non-uniformity analysis showed good agreement between the three independent packages and the Excel calculation (the average differences were 0.3%, 2.9%, 1.3% and 1.6% for UFOV integral, UFOV differential, CFOV integral and CFOV differential, respectively), while a significant difference was detected between the analysis of the company Qc software and the Excel analysis (the average differences were 14.6%, 20.7%, 25.7% and 31.9% for UFOV integral, UFOV differential, CFOV integral and CFOV differential, respectively). NMQC software was the best in comparison with the Excel calculations. The variation in the results is due to the different pixel sizes used for analysis in the three packages and the γ-camera Qc software. Therefore, it is important to perform the tests with the vendor Qc software as well as with independent analysis to understand the differences between the values. Moreover, the medical physicist should know

  5. Studies on failure kind analysis of the radiologic medical equipment in general hospital

    International Nuclear Information System (INIS)

    Lee, Woo Cheul; Kim, Jeong Lae

    1999-01-01

    This paper presented an analysis of medical device units using maintenance recording cards, covering unit failure modes, hospital failure modes and MTBF. The results of the analysis were as follows: 1. The unit failure mode of medical devices was highest in QC/PM (A hospital 33.9%, B hospital 30.9%, C hospital 30.3%); second was electrical and electronic failure (A hospital 23.5%, B hospital 25.3%, C hospital 28%); third was mechanical failure (A hospital 19.6%, B hospital 22.5%, C hospital 25.4%). 2. The hospital failure mode was highest in mobile X-ray devices (A hospital 62.5%, B hospital 69.5%, C hospital 37.4%) and lowest in Sono devices (A hospital 16.76%, B hospital 8.4%, C hospital 7%). 3. Mean time between failures (MTBF) was highest in Sono devices and lowest in mobile X-ray devices, which had 200-400 failure hours. 4. The average failure ratio was highest in mobile X-ray devices (A hospital 31.3%, B hospital 34.8%, C hospital 18.7%) and lowest in Sono (ultrasound) devices (A hospital 8.4%, B hospital 4.2%, C hospital 3.5%). 5. Failure ratio results of medical devices according to the QC/PM part of the unit failure mode were as follows: A hospital was highest in QC/PM (50%) for the Mamo X-ray device and lowest (26.4%) for the Gastro X-ray; B hospital was highest in QC/PM (56%) for the mobile X-ray device and lowest (12%) for the Gastro X-ray; C hospital was highest in QC/PM (60%) for the R/F X-ray device and lowest (21%) for the Universal X-ray. It was found that the units responsible for most failures decreased in number under systematic management. We made the preventive maintenance schedule focusing on adjustment of operation and dust removal
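
    The MTBF figures summarized above follow the standard definition, operating time divided by failure count. A sketch with invented numbers:

```python
def mtbf(total_operating_hours, failure_count):
    """Mean time between failures: operating time divided by the number
    of failures recorded in that period."""
    if failure_count == 0:
        raise ValueError("no failures recorded; MTBF undefined")
    return total_operating_hours / failure_count

# Hypothetical: a mobile X-ray unit running 2400 h with 8 failures falls
# in the 200-400 h band reported for mobile units.
print(mtbf(2400, 8))  # → 300.0
```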

  6. Micro-PIXE characterization of reference samples intended for QA/QC of k0 NAA

    International Nuclear Information System (INIS)

    Bucar, T.; Smodis, B.; Pelicon, P.; Simcic, J.; Jacimovic, R.

    2008-01-01

    Cellulose cylinders and circular filter papers spiked with known amounts of standard element solutions were prepared for studying some aspects of assessing measurement uncertainty of NAA and the elemental distribution measured by micro-PIXE analysis. Results for the cylinders showed strongly non-homogeneous distribution of the elements, both in radial and vertical directions, dominantly caused by osmosis driven transport of added liquid solution from the centre to the edges. Results for the thin cellulose filter paper disks exhibited weaker peaking of the standard element concentrations at the edges in comparison with the thick cylinders. (author)

  7. An UHPLC-MS/MS method for simultaneous quantification of human amyloid beta peptides Aβ1-38, Aβ1-40 and Aβ1-42 in cerebrospinal fluid using micro-elution solid phase extraction.

    Science.gov (United States)

    Lin, Ping-Ping; Chen, Wei-Li; Yuan, Fei; Sheng, Lei; Wu, Yu-Jia; Zhang, Wei-Wei; Li, Guo-Qing; Xu, Hong-Rong; Li, Xue-Ning

    2017-12-01

    Amyloid beta (Aβ) peptides in cerebrospinal fluid are widely measured as diagnostic biomarkers for the identification of Alzheimer's disease (AD). Unfortunately, their pervasive application is hampered by interference from Aβ's propensity for self-aggregation and nonspecific binding to surfaces and matrix proteins, and by a lack of quantitative standardization. Here we report an alternative ultra-high performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) method for simultaneous measurement of the human amyloid beta peptides Aβ1-38, Aβ1-40 and Aβ1-42 in cerebrospinal fluid (CSF) using micro-elution solid phase extraction (SPE). Samples were pre-processed by mixed-mode micro-elution solid phase extraction, and quantification was performed in positive-ion multiple reaction monitoring (MRM) mode using electrospray ionization. The stable-isotope labeled peptides 15 N 51 - Aβ1-38, 15 N 53 - Aβ1-40 and 15 N 55 - Aβ1-42 were used as internal standards, and artificial cerebrospinal fluid (ACSF) containing 5% rat plasma was used as a surrogate matrix for the calibration curves. Quality control (QC) samples at 0.25, 2 and 15 ng/mL were prepared. A "linear" regression (1/x 2 weighting), y=ax+b, was used to fit the calibration curves over the concentration range of 0.1-20 ng/mL for all three peptides. Coefficients of variation (CV) of intra-batch and inter-batch assays were all less than 6.44% for Aβ1-38, 6.75% for Aβ1-40 and 10.74% for Aβ1-42. The precision values for all QC samples of the three analytes met the acceptance criteria. Extraction recoveries of Aβ1-38, Aβ1-40 and Aβ1-42 were all greater than 70.78%, in both low and high QC samples. The stability assessments showed that QC samples at both low and high levels were stable for at least 24h at 4°C, 4h at room temperature and through three freeze-thaw cycles without sacrificing accuracy or precision. No significant carryover effect was observed. This validated UHPLC
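
    The 1/x² weighted linear calibration described can be sketched with NumPy; note that np.polyfit takes square-root weights, so a weighted least-squares weight of 1/x² is passed as 1/x. The calibration points below are invented:

```python
import numpy as np

def weighted_linear_fit(x, y):
    """Least-squares fit of y = a*x + b with 1/x^2 weighting, as used for
    the calibration curves (WLS weights w_i = 1 / x_i^2)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    w = 1.0 / x**2
    # np.polyfit expects sqrt-weights in its `w` argument.
    a, b = np.polyfit(x, y, 1, w=np.sqrt(w))
    return a, b

# Hypothetical calibration points over the 0.1-20 ng/mL range:
conc = [0.1, 0.5, 2.0, 10.0, 20.0]       # ng/mL
resp = [0.012, 0.051, 0.198, 1.010, 1.990]  # peak-area ratios
a, b = weighted_linear_fit(conc, resp)
print(round(a, 3))  # slope close to 0.1 for this made-up data
```

    The 1/x² weighting keeps the low end of the curve (near the 0.1 ng/mL lower limit) from being swamped by the high-concentration points.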

  8. Current trends in needle-free jet injection: an update

    Directory of Open Access Journals (Sweden)

    Barolet D

    2018-05-01

    Full Text Available Daniel Barolet,1,2 Antranik Benohanian3 1RoseLab Skin Optics Research Laboratory, Laval, QC, Canada; 2MUHC Dermatology Service, Department of Medicine, McGill University, Montreal, QC, Canada; 3CHUM Service de Dermatologie, Université de Montréal, Montréal, QC, Canada Background: Jet injection can be defined as a needle-free drug delivery method in which a high-speed stream of fluid impacts the skin and delivers a drug. Despite 75 years of existence, it has never reached its full potential as a strategic tool to deliver medications through the skin. Objective: The aim of this review was to evaluate and summarize the evolution of the jet injection intradermal drug delivery method, including technological advancements and new indications for use. Methods: A review of the literature was performed with no limits placed on publication date. Results: Needleless injectors not only reduce pain during drug delivery but also distribute the drug more evenly in the dermis. Understanding the skin properties of the injection site is a key factor in obtaining optimal results, as is setting the right parameters on the jet injector. Until the advent of disposable jet injectors/cartridges, autoclaving of the injector remains the only reliable method to eliminate the risk of infection. Needle-free intradermal injection using corticosteroids and/or local anesthetics is well documented, with promising indications being developed. Limitations: Limitations of the review include low-quality evidence, small sample sizes, varying treatment parameters, and publication bias. Conclusion: New developments may help reconsider the use of jet injection technology. Future studies should focus on measurable, optimized parameters to ensure a safe and effective outcome. Keywords: needle free, injector, jet injection, xylocaine, triamcinolone, PDT

  9. Device including a contact detector

    DEFF Research Database (Denmark)

    2011-01-01

    The present invention relates to a probe for determining an electrical property of an area of a surface of a test sample; the probe is intended to be in a specific orientation relative to the test sample. The probe may comprise a supporting body defining a first surface. A plurality of cantilever arms (12) may extend from the supporting body in co-planar relationship with the first surface. The plurality of cantilever arms (12) may extend substantially parallel to each other, and each of the plurality of cantilever arms (12) may include an electrically conductive tip for contacting the area of the test sample by movement of the probe relative to the surface of the test sample into the specific orientation. The probe may further comprise a contact detector (14) extending from the supporting body, arranged so as to contact the surface of the test sample prior to any one of the plurality of cantilever arms (12).

  10. 40 CFR 98.314 - Monitoring and QA/QC requirements.

    Science.gov (United States)

    2010-07-01

    ... requirements. (a) You must measure your consumption of calcined petroleum coke using plant instruments used for.... Alternatively, facilities can measure monthly carbon contents of the petroleum coke using ASTM D3176-89... Nitrogen in Laboratory Samples of Coal (incorporated by reference, see § 98.7). (d) For quality assurance...

  11. Geochemical reanalysis of historical U.S. Geological Survey sediment samples from the Zane Hills, Hughes and Shungnak quadrangles, Alaska

    Science.gov (United States)

    Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.

    2015-01-01

    The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential.The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska.For this report, DGGS funded reanalysis of 105 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the Zane Hills area in the Hughes and Shungnak quadrangles, Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published in this report as a coauthored DGGS report, and will be incorporated into the statewide geochemical databases of both agencies.

  12. Spatial variation of contaminant elements of roadside dust samples from Budapest (Hungary) and Seoul (Republic of Korea), including Pt, Pd and Ir.

    Science.gov (United States)

    Sager, Manfred; Chon, Hyo-Taek; Marton, Laszlo

    2015-02-01

    Roadside dusts were studied to explain the spatial variation and present levels of contaminant elements, including Pt, Pd and Ir, in the urban environments of and around Budapest (Hungary) and Seoul (Republic of Korea). The samples were collected from six sites with high traffic volumes in the Seoul metropolitan area and, for comparison, from two control sites within the suburbs of Seoul. Similarly, road dust samples were obtained twice from traffic focal points in Budapest, from the large bridges across the River Danube, from Margitsziget (an island in the Danube in the northern part of Budapest, used for recreation), and from main roads (not highways) outside Budapest. The samples were analysed for contaminant elements by ICP-AES and for Pt, Pd and Ir by ICP-MS. The highest Pt, Pd and Ir levels in road dusts were found on major roads with high traffic volumes, but their correlations with other contaminant elements were low; this points to automobile catalytic converters as an important source. To summarize the multi-element results, a pollution index, a contamination index and a geo-accumulation index were calculated. Finally, the obtained data were compared with total concentrations encountered in dust samples from Madrid, Oslo, Tokyo and Muscat (Oman). Dust samples from Seoul reached top-level concentrations for Cd-Zn-As-Co-Cr-Cu-Mo-Ni-Sn; only Pb was rather low, because unleaded gasoline had been made compulsory in 1993. Concentrations in Budapest dust samples were lower than in Seoul, except for Pb and Mg. Compared with Madrid as another continental site, Budapest was higher in Co-V-Zn. Dust from Oslo, a smaller city, contained more Mn-Na-Sr than dust from the other towns, but less of the other metals.
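
    The geo-accumulation index mentioned above is conventionally defined as Igeo = log2(Cn / (1.5 * Bn)); a sketch with invented concentrations:

```python
import math

def igeo(c_measured, c_background):
    """Geo-accumulation index: Igeo = log2(Cn / (1.5 * Bn)), where Cn is
    the measured concentration and Bn the geochemical background; the
    factor 1.5 absorbs natural background fluctuation."""
    return math.log2(c_measured / (1.5 * c_background))

# Hypothetical: dust Zn at 600 mg/kg against a 100 mg/kg background.
print(round(igeo(600, 100), 2))  # → 2.0
```

    Igeo values above 0 indicate enrichment over background; common classification schemes step up one pollution class per unit of Igeo.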

  13. Quality control of brachytherapy equipment in the Netherlands and Belgium: current practice and minimum requirements

    International Nuclear Information System (INIS)

    Elfrink, Robert J.M.; Kolkman-Deurloo, Inger-Karine K.; Kleffens, Herman J. van; Rijnders, Alex; Schaeken, Bob; Aalbers, Tony H.L.; Dries, Wim J.F.; Venselaar, Jack L.M.

    2002-01-01

    Background and purpose: Brachytherapy is applied in 39 radiotherapy institutions in The Netherlands and Belgium. Each institution has its own quality control (QC) programme to ensure safe and accurate dose delivery to the patient. The main goal of this work is to gain insight into the current practice of QC of brachytherapy in The Netherlands and Belgium and to reduce possible variations in test frequencies and tolerances by formulating a set of minimum QC-requirements. Materials and methods: An extensive questionnaire about QC of brachytherapy was distributed to and completed by the 39 radiotherapy institutions. A separate smaller questionnaire was sent to nine institutions performing intracoronary brachytherapy. The questions were related to safety systems, physical irradiation parameters and total time spent on QC. The results of the questionnaires were compared with recommendations given in international brachytherapy QC reports. Results: The answers to the questionnaires showed large variations in test frequencies and test methods. Furthermore, large variations in time spent on QC exist, which is mainly due to differences in QC-philosophy and differences in the available resources. Conclusions: Based on the results of the questionnaires and the comparison with the international recommendations, a set of minimum requirements for QC of brachytherapy has been formulated. These guidelines will be implemented in the radiotherapy institutions in The Netherlands and Belgium

  14. Aminoacyl-tRNA quality control is required for efficient activation of the TOR pathway regulator Gln3p.

    Science.gov (United States)

    Mohler, Kyle; Mann, Rebecca; Kyle, Amanda; Reynolds, Noah; Ibba, Michael

    2017-09-14

    The aminoacylation status of the cellular tRNA pool regulates both the general amino acid control (GAAC) and target of rapamycin (TOR) stress response pathways in yeast. Consequently, fidelity of translation at the level of aminoacyl-tRNA synthesis plays a central role in determining the accuracy and sensitivity of stress responses. To investigate the effects of translational quality control (QC) on cell physiology under stress conditions, phenotypic microarray analyses were used to identify changes in QC-deficient cells. Nitrogen source growth assays showed that QC-deficient yeast grew differently from wild type (WT). The QC-deficient strain was more tolerant to caffeine treatment than WT through altered interactions with the TOR and GAAC pathways. The increased caffeine tolerance of the QC-deficient strain was consistent with the observation that the activity of Gln3p, a transcription factor controlled by the TOR pathway, is decreased in the QC-deficient strain compared to WT. GCN4 translation, which is typically repressed in the absence of nutritional stress, was enhanced in the QC-deficient strain through TOR inhibition. QC did not impact cell cycle regulation; however, the chronological lifespan of QC-deficient yeast strains decreased compared to wild type, likely due to translational errors and alteration of the TOR-associated regulon. These findings support the idea that changes in translational fidelity provide a mechanism of cellular adaptation by modulating TOR activity. This, in turn, supports a central role for aminoacyl-tRNA synthesis QC in the integrated stress response by maintaining the proper aa-tRNA pools necessary to coordinate the GAAC and TOR pathways.

  15. Influence of partial replacement of sodium chloride by potassium chloride in Minas fresh cheese of sheep’s milk

    Directory of Open Access Journals (Sweden)

    Dalana Cecília Hanauer

    2017-08-01

    Full Text Available Sheep's milk has higher contents of fat, protein and minerals than cow's milk and is suitable for the production of cheeses such as Minas fresh cheese. The production of this cheese includes salting, which performs important functions in this product. Salting is carried out by adding sodium chloride (NaCl); in excess, however, this salt may be harmful to consumer health. Three formulations of Minas fresh cheese from sheep's milk were therefore developed (100% NaCl – QA; 75% NaCl and 25% potassium chloride (KCl) – QB; 50% NaCl and 50% KCl – QC) and evaluated by physical-chemical, microbiological and sensorial analyses. Partial replacement of NaCl by KCl did not influence the moisture, protein and ash contents, pH or water activity of the cheeses. Furthermore, a 50% substitution of NaCl by KCl made it possible to obtain a cheese with reduced sodium content relative to the standard with 100% NaCl. The sensorial analysis showed that substitution of 50% (QC) and 25% (QB) of NaCl by KCl was not significant for the overall acceptance index; however, the use of KCl was perceived by the evaluators, since formulations QB and QC differed significantly from the standard (QA). In the multiple comparison test, though, there was no significant difference between the samples. Thus, the results indicated that a partial replacement of NaCl by KCl can be performed in Minas fresh cheese from sheep's milk.

  16. Remedial Investigation/Feasibility Study (RI/FS) Report, David Global Communications Site. Volume 2

    Science.gov (United States)

    1994-02-23

    locations were measured with a rag tape and a compass. Subsurface Conditions: Subsurface conditions were generally quite uniform in the test pit... prepared for each specific sample delivery group (SDG) and each specific parameter. The laboratory groups samples into SDGs; the samples in an SDG have... "advisory" flag). These SDG-specific detailed reports are kept in project files. All the samples have been reviewed for QC data in accordance with EPA

  17. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

    Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data Sampling provides an up-to-date treat

  18. Quality management in BNCT at a nuclear research reactor

    Energy Technology Data Exchange (ETDEWEB)

    Sauerwein, Wolfgang, E-mail: w.sauerwein@uni-due.de [NCTeam, Department of Radiation Oncology, University Hospital Essen, University Duisburg-Essen, Hufelandstrasse 55, 45122 Essen (Germany); Moss, Raymond [ESE Unit, Institute for Energy, Joint Research Centre, European Commission, Westerduinweg 3, P.O. Box 2 NL-1755ZG Petten (Netherlands); Stecher-Rasmussen, Finn [NCT Physics, Nassaulaan 12, 1815GK Alkmaar (Netherlands); Rassow, Juergen [NCTeam, Department of Radiation Oncology, University Hospital Essen, University Duisburg-Essen, Hufelandstrasse 55, 45122 Essen (Germany); Wittig, Andrea [Department of Radiotherapy and Radiation Oncology, University Hospital Marburg, Philipps-University Marburg, Baldingerstrasse, 35043 Marburg (Germany)

    2011-12-15

    Each medical intervention must be performed respecting Health Protection directives, with special attention to Quality Assurance (QA) and Quality Control (QC). This is the basis of safe and reliable treatments. BNCT must apply QA programs as required for performance and safety in (conventional) radiotherapy facilities, including regular testing of performance characteristics (QC). Furthermore, the well-established Quality Management (QM) system of the nuclear reactor used has to be followed. Organization of these complex QM procedures is offered by the international standard ISO 9001:2008.

  19. [Does implementation of benchmarking in quality circles improve the quality of care of patients with asthma and reduce drug interaction?].

    Science.gov (United States)

    Kaufmann-Kolle, Petra; Szecsenyi, Joachim; Broge, Björn; Haefeli, Walter Emil; Schneider, Antonius

    2011-01-01

    The purpose of this cluster-randomised controlled trial was to evaluate the efficacy of quality circles (QCs) working either with general data-based feedback or with an open benchmark within the field of asthma care and drug-drug interactions. Twelve QCs, involving 96 general practitioners from 85 practices, were randomised. Six QCs worked with traditional anonymous feedback and six with an open benchmark. Two QC meetings supported with feedback reports were held covering the topics "drug-drug interactions" and "asthma"; in both cases discussions were guided by a trained moderator. Outcome measures included health-related quality of life and patient satisfaction with treatment, asthma severity and number of potentially inappropriate drug combinations as well as the general practitioners' satisfaction in relation to the performance of the QC. A significant improvement in the treatment of asthma was observed in both trial arms. However, there was only a slight improvement regarding inappropriate drug combinations. There were no relevant differences between the group with open benchmark (B-QC) and traditional quality circles (T-QC). The physicians' satisfaction with the QC performance was significantly higher in the T-QCs. General practitioners seem to take a critical perspective about open benchmarking in quality circles. Caution should be used when implementing benchmarking in a quality circle as it did not improve healthcare when compared to the traditional procedure with anonymised comparisons. Copyright © 2011. Published by Elsevier GmbH.

  20. Eastward and northward components of ocean current and water temperature collected from moorings in the vicinity of Quinault Canyon in the North East Pacific Coast from 1980-09-25 to 1981-01-24 (NCEI Accession 0164076)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The University of Washington maintained 9 current meter moorings, QC801 through QC810 (QC806 was not deployed), in and around Quinault Canyon. Current meters were...

  1. Assays for Qualification and Quality Stratification of Clinical Biospecimens Used in Research: A Technical Report from the ISBER Biospecimen Science Working Group.

    Science.gov (United States)

    Betsou, Fay; Bulla, Alexandre; Cho, Sang Yun; Clements, Judith; Chuaqui, Rodrigo; Coppola, Domenico; De Souza, Yvonne; De Wilde, Annemieke; Grizzle, William; Guadagni, Fiorella; Gunter, Elaine; Heil, Stacey; Hodgkinson, Verity; Kessler, Joseph; Kiehntopf, Michael; Kim, Hee Sung; Koppandi, Iren; Shea, Katheryn; Singh, Rajeev; Sobel, Marc; Somiari, Stella; Spyropoulos, Demetri; Stone, Mars; Tybring, Gunnel; Valyi-Nagy, Klara; Van den Eynden, Gert; Wadhwa, Lalita

    2016-10-01

    This technical report presents quality control (QC) assays that can be performed in order to qualify clinical biospecimens that have been biobanked for use in research. Some QC assays are specific to a disease area. Some QC assays are specific to a particular downstream analytical platform. When such a qualification is not possible, QC assays are presented that can be performed to stratify clinical biospecimens according to their biomolecular quality.

  2. Sampling of ore

    International Nuclear Information System (INIS)

    Boehme, R.C.; Nicholas, B.L.

    1987-01-01

    This invention relates to a method of and an apparatus for ore sampling. The method includes the steps of periodically removing a sample of the output material of a sorting machine, weighing each sample so that each is of the same weight, measuring a characteristic such as the radioactivity, magnetivity or the like of each sample, subjecting at least an equal portion of each sample to chemical analysis to determine the mineral content of the sample, and comparing the characteristic measurement with the desired mineral content of the chemically analysed portion of the sample to determine the characteristic/mineral ratio of the sample. The apparatus includes an ore sample collector, a deflector for deflecting a sample of ore particles from the output of an ore sorter into the collector, and means for moving the deflector from a first position, in which it is clear of the particle path from the sorter, to a second position, in which it is in the particle path, at predetermined time intervals and for predetermined time periods to deflect the sample particles into the collector. The apparatus conveniently includes an ore crusher for comminuting the sample particles, a sample hopper, means for weighing the hopper, a detector in the hopper for measuring a characteristic such as radioactivity, magnetivity or the like of the particles in the hopper, a discharge outlet from the hopper, and means for feeding the particles from the collector to the crusher and then to the hopper.

  3. Droplet Size and Liquid Water Characteristics of the USAAEFA (CH-47) Helicopter Spray System and Natural Clouds as Sampled by a JUH-1H Helicopter.

    Science.gov (United States)

    1980-08-01

    [Excerpt not recoverable from OCR; legible fragments identify a calibration report ("CALIBRATION REPORT, Date: 1/8/80, Instrument: ASSP-100-1") followed by a size-calibration data table.]

  4. Sample summary report for KOR1 pressure tube sample

    International Nuclear Information System (INIS)

    Lee, Hee Jong; Nam, Min Woo; Choi, Young Ha

    2006-01-01

    This summary report basically includes the following:
    - The FLAW CHARACTERIZATION TABLE of the KOR1 sample and supporting documentation.
    - The CROSS REFERENCE TABLES for each investigator, which are the SAMPLE INSPECTION TABLES that cross-reference to the FLAW CHARACTERIZATION TABLE.
    - Each Sample Inspection Report, as appendices.

  5. Application of microwave assisted digestion in industrial hygiene

    International Nuclear Information System (INIS)

    Paudyn, A.M.; Smith, R.G.; Gawlowski, E.

    1990-01-01

    Microwave-assisted digestion plays an important role in speeding up the acquisition of analytical data for industrial hygiene purposes. This paper will compare hot plate and microwave-assisted digestion for the determination of elements in industrial samples (air sampling filters, dusts, ashes, paints) by the ICP-AES technique. The determination of radionuclides in environmental samples (soils, sediments, rocks) by alpha, beta and gamma spectroscopy after dissolution in a microwave oven will also be presented. Results on the determination of elements in NIST standard reference materials and radionuclides in IAEA standards will be included. QC/QA protocols used in an occupational health laboratory setting will be discussed. Sample preparation using microwave-assisted digestion proved not only to speed up the extraction of acid-soluble elements, but also to achieve better recovery of some elements (Pb in paints) and better reproducibility of determinations.

  6. Human glutaminyl cyclase and bacterial zinc aminopeptidase share a common fold and active site

    Directory of Open Access Journals (Sweden)

    Misquitta Stephanie A

    2004-02-01

    Full Text Available Abstract Background Glutaminyl cyclase (QC) forms the pyroglutamyl residue at the amino terminus of numerous secretory peptides and proteins. We previously proposed that the mammalian QC has some features in common with zinc aminopeptidases. We have now generated a structural model for human QC based on the aminopeptidase fold (PDB code 1AMP) and mutated the apparent active site residues to assess their role in QC catalysis. Results The structural model proposed here for human QC, deposited in the protein databank as 1MOI, is supported by a variety of fold prediction programs, by the circular dichroism spectrum, and by the presence of the disulfide. Mutagenesis of the six active site residues present in both 1AMP and QC reveals essential roles for the two histidines (140 and 330, QC numbering) and the two glutamates (201 and 202), while the two aspartates (159 and 248) appear to play no catalytic role. ICP-MS analysis shows less than stoichiometric zinc (0.3:1) in the purified enzyme. Conclusions We conclude that human pituitary glutaminyl cyclase and bacterial zinc aminopeptidase share a common fold and active site residues. In contrast to the aminopeptidase, however, QC does not appear to require zinc for enzymatic activity.

  7. Survey of Current Status of Quality Control of Gamma Cameras in Republic of Korea

    International Nuclear Information System (INIS)

    Choe, Jae Gol; Joh, Cheol Woo

    2008-01-01

    It is widely recognized that a good quality control (QC) program is essential for adequate imaging diagnosis using gamma cameras. The purpose of this study was to survey the current status of QC of gamma cameras in the Republic of Korea in order to implement appropriate nationwide quality control guidelines and programs. Data were collected on the personnel, equipment and appropriateness of each nuclear medicine imaging laboratory's quality control practice. The survey was conducted by collecting a formatted questionnaire by mail, e-mail or interview. We also reviewed the current recommendations concerning quality assurance by international societies. This survey revealed that the practice of quality control is irregular and not satisfactory. The irregularity of QC practice seems due partly to the lack of trained personnel, equipment, budget, time and hands-on guidelines. The implementation of a QC program may place an additional burden on hospitals, patients and nuclear medicine laboratories. However, the benefit of a good QC program is obvious: hospitals can provide good-quality nuclear medicine imaging studies to their patients. It is important to use the least cumbersome QC protocol, to educate nuclear medicine and hospital administrative personnel concerning QC, and to establish national QC guidelines to help each individual nuclear medicine laboratory.

  8. Identification of potential glutaminyl cyclase inhibitors from lead-like libraries by in silico and in vitro fragment-based screening.

    Science.gov (United States)

    Szaszkó, Mária; Hajdú, István; Flachner, Beáta; Dobi, Krisztina; Magyar, Csaba; Simon, István; Lőrincz, Zsolt; Kapui, Zoltán; Pázmány, Tamás; Cseh, Sándor; Dormán, György

    2017-02-01

    A glutaminyl cyclase (QC) fragment library was in silico selected by disconnection of the structure of known QC inhibitors and by lead-like 2D virtual screening of the same set. The resulting fragment library (204 compounds) was acquired from commercial suppliers and pre-screened by differential scanning fluorimetry followed by functional in vitro assays. In this way, 10 fragment hits were identified ([Formula: see text]5 % hit rate, best inhibitory activity: 16 [Formula: see text]). The in vitro hits were then docked to the active site of QC, and the best scoring compounds were analyzed for binding interactions. Two fragments bound to different regions in a complementary manner, and thus, linking those fragments offered a rational strategy to generate novel QC inhibitors. Based on the structure of the virtual linked fragment, a 77-membered QC target focused library was selected from vendor databases and docked to the active site of QC. A PubChem search confirmed that the best scoring analogues are novel, potential QC inhibitors.

  9. [Development of quality assurance/quality control web system in radiotherapy].

    Science.gov (United States)

    Okamoto, Hiroyuki; Mochizuki, Toshihiko; Yokoyama, Kazutoshi; Wakita, Akihisa; Nakamura, Satoshi; Ueki, Heihachi; Shiozawa, Keiko; Sasaki, Koji; Fuse, Masashi; Abe, Yoshihisa; Itami, Jun

    2013-12-01

    Our purpose is to develop a QA/QC (quality assurance/quality control) web system using HTML (HyperText Markup Language) and a server-side scripting language, PHP (Hypertext Preprocessor), which can be useful as a tool to share information about QA/QC in radiotherapy. The system proposed in this study can be easily built in one's own institute, because HTML can be easily handled. There are two desired functions in a QA/QC web system: (i) to review the results of QA/QC for a radiotherapy machine, manuals, and reports necessary for routinely performing radiotherapy through this system; by disclosing the results, transparency can be maintained; (ii) to reveal a protocol for QA/QC in one's own institute using pictures and movies relating to QA/QC for simplicity's sake, which can also be used as an educational tool for junior radiation technologists and medical physicists. By using this system, not only administrators but also all staff involved in radiotherapy can obtain information about the conditions and accuracy of treatment machines through the QA/QC web system.

  10. DTIPrep: quality control of diffusion-weighted images.

    Science.gov (United States)

    Oguz, Ipek; Farzinfar, Mahshid; Matsui, Joy; Budin, Francois; Liu, Zhexing; Gerig, Guido; Johnson, Hans J; Styner, Martin

    2014-01-01

    In the last decade, diffusion MRI (dMRI) studies of the human and animal brain have been used to investigate a multitude of pathologies and drug-related effects in neuroscience research. Study after study identifies white matter (WM) degeneration as a crucial biomarker for all these diseases. The tool of choice for studying WM is dMRI. However, dMRI has an inherently low signal-to-noise ratio and its acquisition requires a relatively long scan time; in fact, the high loads required occasionally stress scanner hardware past the point of physical failure. As a result, many types of artifacts compromise the quality of diffusion imagery. Using these complex scans containing artifacts without quality control (QC) can result in considerable error and bias in the subsequent analysis, negatively affecting the results of research studies using them. However, dMRI QC remains an under-recognized issue in the dMRI community, as there are no user-friendly tools commonly available to comprehensively address it. As a result, current dMRI studies often do a poor job of dMRI QC. Thorough QC of dMRI will reduce measurement noise and improve reproducibility and sensitivity in neuroimaging studies; this will allow researchers to more fully exploit the power of the dMRI technique and will ultimately advance neuroscience. Therefore, in this manuscript, we present our open-source software, DTIPrep, as a unified, user-friendly platform for thorough QC of dMRI data. The artifacts addressed include those caused by eddy currents, head motion, bed vibration and pulsation, venetian blind artifacts, as well as slice-wise and gradient-wise intensity inconsistencies. This paper summarizes a basic set of features of DTIPrep described earlier and focuses on newly added capabilities related to directional artifacts and bias analysis.

  11. Construction of Quasi-Cyclic LDPC Codes Based on Fundamental Theorem of Arithmetic

    Directory of Open Access Journals (Sweden)

    Hai Zhu

    2018-01-01

    Full Text Available Quasi-cyclic (QC LDPC codes play an important role in 5G communications and have been chosen as the standard codes for 5G enhanced mobile broadband (eMBB data channel. In this paper, we study the construction of QC LDPC codes based on an arbitrary given expansion factor (or lifting degree. First, we analyze the cycle structure of QC LDPC codes and give the necessary and sufficient condition for the existence of short cycles. Based on the fundamental theorem of arithmetic in number theory, we divide the integer factorization into three cases and present three classes of QC LDPC codes accordingly. Furthermore, a general construction method of QC LDPC codes with girth of at least 6 is proposed. Numerical results show that the constructed QC LDPC codes perform well over the AWGN channel when decoded with the iterative algorithms.
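
    The short-cycle analysis mentioned above can be made concrete. For QC-LDPC codes built from circulant permutation matrices, a well-known necessary and sufficient condition states that the girth is at least 6 exactly when no 2×2 submatrix of the exponent matrix P satisfies p(i1,j1) − p(i1,j2) + p(i2,j2) − p(i2,j1) ≡ 0 (mod N), where N is the lifting degree. A minimal sketch of that check (this is not the paper's construction; the example matrix is the classic P[i][j] = i·j mod N design for prime N):

```python
from itertools import combinations

def has_girth_at_least_6(P, N):
    """Check the standard 4-cycle condition for a QC-LDPC exponent
    matrix P (rows x cols of circulant shift values) with lifting
    degree N: girth >= 6 iff no 2x2 submatrix of P satisfies
    P[i1][j1] - P[i1][j2] + P[i2][j2] - P[i2][j1] == 0 (mod N)."""
    rows, cols = len(P), len(P[0])
    for i1, i2 in combinations(range(rows), 2):
        for j1, j2 in combinations(range(cols), 2):
            if (P[i1][j1] - P[i1][j2] + P[i2][j2] - P[i2][j1]) % N == 0:
                return False  # a length-4 cycle exists
    return True

# Classic girth-6 construction: P[i][j] = i*j mod N for prime N.
N = 7
P = [[(i * j) % N for j in range(3)] for i in range(3)]
print(has_girth_at_least_6(P, N))  # True
```

    For this example the condition reduces to (i1 − i2)(j1 − j2) mod 7, which is never zero for distinct rows and columns, so no 4-cycle exists.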

  12. Recent Progress toward Microfluidic Quality Control Testing of Radiopharmaceuticals

    Directory of Open Access Journals (Sweden)

    Noel S. Ha

    2017-11-01

    Full Text Available Radiopharmaceuticals labeled with short-lived positron-emitting or gamma-emitting isotopes are injected into patients just prior to performing positron emission tomography (PET or single photon emission tomography (SPECT scans, respectively. These imaging modalities are widely used in clinical care, as well as in the development and evaluation of new therapies in clinical research. Prior to injection, these radiopharmaceuticals (tracers must undergo quality control (QC testing to ensure product purity, identity, and safety for human use. Quality tests can be broadly categorized as (i pharmaceutical tests, needed to ensure molecular identity, physiological compatibility and that no microbiological, pyrogenic, chemical, or particulate contamination is present in the final preparation; and (ii radioactive tests, needed to ensure proper dosing and that there are no radiochemical and radionuclidic impurities that could interfere with the biodistribution or imaging. Performing the required QC tests is cumbersome and time-consuming, and requires an array of expensive analytical chemistry equipment and significant dedicated lab space. Calibrations, day of use tests, and documentation create an additional burden. Furthermore, in contrast to ordinary pharmaceuticals, each batch of short-lived radiopharmaceuticals must be manufactured and tested within a short period of time to avoid significant losses due to radioactive decay. To meet these challenges, several efforts are underway to develop integrated QC testing instruments that automatically perform and document all of the required tests. More recently, microfluidic quality control systems have been gaining increasing attention due to vastly reduced sample and reagent consumption, shorter analysis times, higher detection sensitivity, increased multiplexing, and reduced instrumentation size. In this review, we describe each of the required QC tests and conventional testing methods, followed by a

  13. Direct infusion mass spectrometry metabolomics dataset: a benchmark for data processing and quality control

    Science.gov (United States)

    Kirwan, Jennifer A; Weber, Ralf J M; Broadhurst, David I; Viant, Mark R

    2014-01-01

    Direct-infusion mass spectrometry (DIMS) metabolomics is an important approach for characterising molecular responses of organisms to disease, drugs and the environment. Increasingly large-scale metabolomics studies are being conducted, necessitating improvements in both bioanalytical and computational workflows to maintain data quality. This dataset represents a systematic evaluation of the reproducibility of a multi-batch DIMS metabolomics study of cardiac tissue extracts. It comprises twenty biological samples (cow vs. sheep) that were analysed repeatedly, in 8 batches across 7 days, together with a concurrent set of quality control (QC) samples. Data are presented from each step of the workflow and are available in MetaboLights. The strength of the dataset is that intra- and inter-batch variation can be corrected using QC spectra and the quality of this correction assessed independently using the repeatedly-measured biological samples. Originally designed to test the efficacy of a batch-correction algorithm, it will enable others to evaluate novel data processing algorithms. Furthermore, this dataset serves as a benchmark for DIMS metabolomics, derived using best-practice workflows and rigorous quality assessment. PMID:25977770
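
    The kind of QC-anchored batch correction this dataset was designed to evaluate can be illustrated with a deliberately simple scaling scheme: rescale every sample so that each batch's QC median matches the global QC median. The actual algorithm assessed in the study is more sophisticated, and all numbers below are invented for illustration:

```python
import statistics

def qc_batch_correct(intensities, batches, is_qc):
    """Per-batch scaling: divide each sample's intensity by the median
    QC intensity of its batch, then rescale by the global QC median.
    intensities: floats; batches: batch ids; is_qc: bools marking QC runs."""
    qc_vals = [x for x, q in zip(intensities, is_qc) if q]
    global_med = statistics.median(qc_vals)
    batch_med = {}
    for b in set(batches):
        vals = [x for x, bb, q in zip(intensities, batches, is_qc)
                if q and bb == b]
        batch_med[b] = statistics.median(vals)  # assumes each batch has QCs
    return [x * global_med / batch_med[b]
            for x, b in zip(intensities, batches)]

# Two batches where batch 2 runs ~20% hot; the QCs reveal and remove the offset.
inten = [100, 102, 98, 120, 124, 118]   # first three: batch 1, rest: batch 2
batch = [1, 1, 1, 2, 2, 2]
qc    = [True, False, False, True, False, False]
print(qc_batch_correct(inten, batch, qc))
```

    Because the repeatedly-measured biological samples in the dataset are independent of the QCs, they can then be used to verify that such a correction actually reduced inter-batch variance rather than merely fitting the QCs.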

  14. Energy transfer and visible-infrared quantum cutting photoluminescence modification in Tm-Yb codoped YPO(4) inverse opal photonic crystals.

    Science.gov (United States)

    Wang, Siqin; Qiu, Jianbei; Wang, Qi; Zhou, Dacheng; Yang, Zhengwen

    2015-08-01

    YPO4:Tm,Yb inverse opal photonic crystals were successfully synthesized by the colloidal crystal template method, and the visible-infrared quantum cutting (QC) photoluminescence properties of the YPO4:Tm,Yb inverse opal photonic crystals were investigated. Tetragonal-phase YPO4 was obtained in all samples when the samples were sintered at 950°C for 5 h. The visible emission intensity of Tm3+ decreased significantly when the photonic bandgap was located at 650 nm under 480 nm excitation. On the contrary, the QC emission intensity of Yb3+ was enhanced compared with the sample without a photonic bandgap. When the photonic bandgap was located at 480 nm, the Yb3+ and Tm3+ light-emitting intensities weakened at the same time. We demonstrated that the energy transfer between Tm3+ and Yb3+ is enhanced by suppression of the red emission of Tm3+. Additionally, the mechanisms for the influence of the photonic bandgap on the energy transfer process of the Tm3+, Yb3+ codoped YPO4 inverse opal are discussed.

  15. Implementation of quality assurance and quality control in the Nuclear Analytical Laboratory of the Estonian Radiation Protection Centre

    International Nuclear Information System (INIS)

    Koeoep, T.; Jakobson, E.

    2002-01-01

    The Analytical Laboratory of the Estonian Radiation Protection Centre is in the process of implementing a system of Quality Assurance (QA) and Quality Control (QC) in the framework of the IAEA TC Project RER/2/004 'QA/QC of Nuclear Analytical Techniques'. The draft Quality Manual with annexes has been prepared according to the ISO 17025 Guide and to documents and other printed material delivered at the seminars of the project. The laboratory has been supplied with the equipment necessary for guaranteeing quality. Proficiency testing included in the project has been performed successfully. (author)

  16. An integrated and accessible sample data library for Mars sample return science

    Science.gov (United States)

    Tuite, M. L., Jr.; Williford, K. H.

    2015-12-01

    Over the course of the next decade or more, many thousands of geological samples will be collected and analyzed in a variety of ways by researchers at the Jet Propulsion Laboratory (California Institute of Technology) in order to facilitate discovery and contextualize observations made of Mars rocks both in situ and here on Earth if samples are eventually returned. Integration of data from multiple analyses of samples including petrography, thin section and SEM imaging, isotope and organic geochemistry, XRF, XRD, and Raman spectrometry is a challenge and a potential obstacle to discoveries that require supporting lines of evidence. We report the development of a web-accessible repository, the Sample Data Library (SDL) for the sample-based data that are generated by the laboratories and instruments that comprise JPL's Center for Analysis of Returned Samples (CARS) in order to facilitate collaborative interpretation of potential biosignatures in Mars-analog geological samples. The SDL is constructed using low-cost, open-standards-based Amazon Web Services (AWS), including web-accessible storage, relational data base services, and a virtual web server. The data structure is sample-centered with a shared registry for assigning unique identifiers to all samples including International Geo-Sample Numbers. Both raw and derived data produced by instruments and post-processing workflows are automatically uploaded to online storage and linked via the unique identifiers. Through the web interface, users are able to find all the analyses associated with a single sample or search across features shared by multiple samples, sample localities, and analysis types. Planned features include more sophisticated search and analytical interfaces as well as data discoverability through NSF's EarthCube program.
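
    The sample-centred linking scheme described above can be sketched in a few lines: every analysis record hangs off a single shared unique identifier, so all analyses of one sample can be retrieved together, or samples searched across by analysis type. The class and field names below are hypothetical, since the SDL's actual schema is not given here:

```python
class SampleRegistry:
    """Toy sample-centred registry: a shared id links samples to analyses."""

    def __init__(self):
        self._samples = {}   # unique id -> sample metadata
        self._analyses = {}  # unique id -> list of analysis records

    def register_sample(self, sample_id, metadata):
        self._samples[sample_id] = metadata
        self._analyses.setdefault(sample_id, [])

    def add_analysis(self, sample_id, analysis_type, data):
        if sample_id not in self._samples:
            raise KeyError(f"unregistered sample: {sample_id}")
        self._analyses[sample_id].append({"type": analysis_type, "data": data})

    def analyses_for(self, sample_id):
        """All analyses linked to one sample."""
        return self._analyses[sample_id]

    def samples_with(self, analysis_type):
        """Search across samples by a shared analysis type."""
        return [sid for sid, recs in self._analyses.items()
                if any(r["type"] == analysis_type for r in recs)]

reg = SampleRegistry()
reg.register_sample("IGSN-001", {"locality": "Mars-analog site A"})
reg.add_analysis("IGSN-001", "XRD", {"phases": ["quartz", "calcite"]})
reg.add_analysis("IGSN-001", "Raman", {"bands_cm-1": [1086]})
print(reg.samples_with("XRD"))  # ['IGSN-001']
```

    In the real system this mapping lives in a relational database on AWS rather than in memory, but the design choice is the same: one registry of unique identifiers, with raw and derived data linked to it rather than to any one instrument.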

  17. Calendar Year 2008 Groundwater Monitoring Report, U.S. Department of Energy Y-12 National Security Complex, Oak Ridge, Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    Elvado Environmental LLC

    2009-12-01

    groundwater and surface water sampling and analysis activities implemented under the Y-12 GWPP including sampling locations and frequency; quality assurance (QA)/quality control (QC) sampling; sample collection and handling; field measurements and laboratory analytes; data management and data quality objective (DQO) evaluation; and groundwater elevation monitoring. However, this report does not include equivalent QA/QC or DQO evaluation information regarding the groundwater and surface water sampling and analysis activities associated with the monitoring programs implemented by BJC. Such details are deferred to the respective programmatic plans and reports issued by BJC (see Section 3.0).

  18. Geochemical reanalysis of historical U.S. Geological Survey sediment samples from the Kougarok area, Bendeleben and Teller quadrangles, Seward Peninsula, Alaska

    Science.gov (United States)

    Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.

    2015-01-01

    The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 302 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the Kougarok River drainage as well as smaller adjacent drainages in the Bendeleben and Teller quadrangles, Seward Peninsula, Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published in this report as a coauthored DGGS report, and will be incorporated

  19. Cloud point extraction and spectrophotometric determination of mercury species at trace levels in environmental samples.

    Science.gov (United States)

    Ulusoy, Halil İbrahim; Gürkan, Ramazan; Ulusoy, Songül

    2012-01-15

    A new micelle-mediated separation and preconcentration method was developed for ultra-trace quantities of mercury ions prior to spectrophotometric determination. The method is based on cloud point extraction (CPE) of Hg(II) ions with polyethylene glycol tert-octylphenyl ether (Triton X-114) in the presence of chelating agents such as 1-(2-pyridylazo)-2-naphthol (PAN) and 4-(2-thiazolylazo)resorcinol (TAR). Hg(II) ions react with both PAN and TAR in a surfactant solution, yielding a hydrophobic complex at pH 9.0 and 8.0, respectively. The phase separation was accomplished by centrifugation for 5 min at 3500 rpm. The calibration graphs obtained from the Hg(II)-PAN and Hg(II)-TAR complexes were linear in the concentration ranges of 10-1000 μg L(-1) and 50-2500 μg L(-1), with detection limits of 1.65 and 14.5 μg L(-1), respectively. The relative standard deviations (RSDs) were 1.85% and 2.35% in determinations of 25 and 250 μg L(-1) Hg(II), respectively. The interference effects of several ions were studied, and ions commonly present in water samples were seen to have no significant effect on the determination of Hg(II). The developed methods were successfully applied to determine mercury concentrations in environmental water samples. The accuracy and validity of the proposed methods were tested by means of five replicate analyses of certified standard materials such as QC Metal LL3 (VWR, drinking water) and IAEA W-4 (NIST, simulated fresh water). Copyright © 2011 Elsevier B.V. All rights reserved.
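
    Figures of merit like the detection limits quoted above are commonly derived from the calibration line. A sketch assuming the widespread 3σ convention (LOD = 3 × blank standard deviation / slope); the calibration points and blank noise below are invented for illustration, not taken from the paper:

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    return slope, my - slope * mx

def lod_3sigma(blank_sd, slope):
    """3-sigma detection limit: 3 x blank standard deviation / slope."""
    return 3 * blank_sd / slope

# Illustrative calibration: absorbance vs. Hg(II) concentration (ug/L).
conc = [10, 100, 250, 500, 1000]
absb = [0.012, 0.105, 0.260, 0.515, 1.020]
slope, intercept = linear_fit(conc, absb)
print(round(lod_3sigma(0.0006, slope), 2))  # prints about 1.77 (ug/L)
```

    The linear range is then the concentration interval over which the residuals of this fit stay acceptably small, which is how ranges such as 10-1000 μg L(-1) are reported.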

  20. Levey-Jennings Analysis Uncovers Unsuspected Causes of Immunohistochemistry Stain Variability.

    Science.gov (United States)

    Vani, Kodela; Sompuram, Seshi R; Naber, Stephen P; Goldsmith, Jeffrey D; Fulton, Regan; Bogen, Steven A

    Almost all clinical laboratory tests use objective, quantitative measures of quality control (QC), incorporating Levey-Jennings analysis and Westgard rules. Clinical immunohistochemistry (IHC) testing, in contrast, relies on subjective, qualitative QC review. The consequences of using Levey-Jennings analysis for QC assessment in clinical IHC testing are not known. To investigate this question, we conducted a 1- to 2-month pilot test wherein the QC for either human epidermal growth factor receptor 2 (HER-2) or progesterone receptor (PR) in 3 clinical IHC laboratories was quantified and analyzed with Levey-Jennings graphs. Moreover, conventional tissue controls were supplemented with a new QC comprised of HER-2 or PR peptide antigens coupled onto 8 μm glass beads. At institution 1, this more stringent analysis identified a decrease in the HER-2 tissue control that had escaped notice by subjective evaluation. The decrement was due to heterogeneity in the tissue control itself. At institution 2, we identified a 1-day sudden drop in the PR tissue control, also undetected by subjective evaluation, due to counterstain variability. At institution 3, a QC shift was identified, but only with 1 of 2 controls mounted on each slide. The QC shift was due to use of the instrument's selective reagent drop zones dispense feature. None of these events affected patient diagnoses. These case examples illustrate that subjective QC evaluation of tissue controls can detect gross assay failure but not subtle changes. The fact that QC issues arose from each site, and in only a pilot study, suggests that immunohistochemical stain variability may be an underappreciated problem.
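    The Levey-Jennings/Westgard approach described here amounts to converting each QC measurement into a z-score against an established mean and standard deviation and flagging rule violations. A minimal sketch with two common Westgard rules (all numbers are illustrative, not data from the study):

```python
# Levey-Jennings QC: express each run as a z-score against the
# established baseline mean/SD, then apply two common Westgard rules.
# Values are illustrative, not from the study.
import statistics

def westgard_flags(values, mean, sd):
    z = [(v - mean) / sd for v in values]
    flags = []
    for i, zi in enumerate(z):
        if abs(zi) > 3:                          # 1-3s: one point beyond 3 SD
            flags.append((i, "1-3s"))
        if i >= 1 and z[i] > 2 and z[i - 1] > 2:   # 2-2s: two consecutive beyond +2 SD
            flags.append((i, "2-2s"))
        if i >= 1 and z[i] < -2 and z[i - 1] < -2:  # 2-2s on the low side
            flags.append((i, "2-2s"))
    return flags

baseline = [98, 101, 100, 99, 102, 100, 97, 101, 100, 102]  # stain intensity, a.u.
mean, sd = statistics.mean(baseline), statistics.stdev(baseline)
runs = [100, 99, 104, 104, 91]  # new daily QC measurements
print(westgard_flags(runs, mean, sd))
```

    A shift like the counterstain drop at institution 2 would surface here as a 2-2s or 1-3s flag even when the change is too subtle to notice by eye.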

  1. New methods for optical distance indicator and gantry angle quality control tests in medical linear accelerators: image processing by using a 3D phantom

    Energy Technology Data Exchange (ETDEWEB)

    Shandiz, Mahdi Heravian; Khalilzadeh, Mohammadmahdi; Anvari, Kazem [Mashhad Branch, Islamic Azad University, Mashhad (Iran, Islamic Republic of); Layen, Ghorban Safaeian [Mashhad University of Medical Science, Mashhad (Iran, Islamic Republic of)

    2015-03-15

    To maintain acceptable performance of radiation oncology linear accelerators, a reliable quality assurance (QA) program must be applied. QA protocols published by authoritative organizations, such as the American Association of Physicists in Medicine (AAPM), determine the quality control (QC) tests to be performed on medical linear accelerators and the tolerance levels for each test. The purpose of this study is to increase the accuracy and precision of selected QC tests in order to improve the quality of treatment, and also to speed up the tests so that busy centers can sustain a reliable QA program. A new method has been developed for two of the QC tests: the optical distance indicator (ODI) QC test, performed daily, and the gantry angle QC test, performed monthly. The method uses an image-processing approach on snapshots taken by a CCD camera to measure the source-to-surface distance (SSD) and gantry angle. The new ODI QC test has an accuracy of 99.95% with a standard deviation of 0.061 cm, and the new gantry angle QC test has a precision of 0.43 degrees. The proposed automated method, used for both the ODI and gantry angle QC tests, yields accurate, precise, and objective results that are unaffected by human error. The results for both QC tests are within the acceptable range according to AAPM Task Group 142.

  2. New methods for optical distance indicator and gantry angle quality control tests in medical linear accelerators: image processing by using a 3D phantom

    International Nuclear Information System (INIS)

    Shandiz, Mahdi Heravian; Khalilzadeh, Mohammadmahdi; Anvari, Kazem; Layen, Ghorban Safaeian

    2015-01-01

    To maintain acceptable performance of radiation oncology linear accelerators, a reliable quality assurance (QA) program must be applied. QA protocols published by authoritative organizations, such as the American Association of Physicists in Medicine (AAPM), determine the quality control (QC) tests to be performed on medical linear accelerators and the tolerance levels for each test. The purpose of this study is to increase the accuracy and precision of selected QC tests in order to improve the quality of treatment, and also to speed up the tests so that busy centers can sustain a reliable QA program. A new method has been developed for two of the QC tests: the optical distance indicator (ODI) QC test, performed daily, and the gantry angle QC test, performed monthly. The method uses an image-processing approach on snapshots taken by a CCD camera to measure the source-to-surface distance (SSD) and gantry angle. The new ODI QC test has an accuracy of 99.95% with a standard deviation of 0.061 cm, and the new gantry angle QC test has a precision of 0.43 degrees. The proposed automated method, used for both the ODI and gantry angle QC tests, yields accurate, precise, and objective results that are unaffected by human error. The results for both QC tests are within the acceptable range according to AAPM Task Group 142.

  3. The preparation and characterization of a loess sediment reference material for QC/QA of the annual radiation dose determination in luminescence dating

    International Nuclear Information System (INIS)

    De Corte, F.; De Wispelaere, A.; Vandenberghe, D.; Hossain, S.M.; Van den haute, P.

    2005-01-01

    Of crucial importance for obtaining reliable results in the luminescence dating of sediments is the accurate and precise assessment of both the palaeodose and the annual radiation dose [cf. the age equation: luminescence age (ka) = palaeodose (Gy) / annual radiation dose (Gy ka⁻¹)]. Clearly, for QC/QA of the annual radiation dose determination, a sediment reference material - not readily available up to now - would be highly useful. Therefore, in the present work a loess sediment was prepared and characterized with well-defined K, Th and U contents (the radiation dose being built up mainly by ⁴⁰K, and by ²³²Th and ²³⁵,²³⁸U and their decay daughters) and - otherwise expressed - alpha, beta, gamma and total radiation dose-rates. The material, a fine-grained aeolian loess sediment deposited in the Young-Pleistocene (Weichselian), a part of the Quaternary, was collected at Volkegem, Belgium. At the sampling site, NaI(Tl) field gamma-ray spectrometry was performed, yielding - via comparison with the 'Heidelberg calibration block' - concentrations (wet loess weight) for K, Th and U. About 14 kg of material was brought to the laboratory and kept for ∼1 week at 110 °C until constant weight (water content ≅14%). Then, the dried loess was subjected to agate ball milling so as to pass through a 50 μm sieve. The ∼12 kg of powder obtained in this way was homogenized both in a turbula mixer and manually. The loess material thus prepared showed good homogeneity in its K, Th and U content, as investigated via k₀-INAA. For the final concentration and radiation dose-rate characterization, use was made of (next to NaI(Tl) field gamma-ray spectrometry and k₀-INAA): extended energy-range low-background Ge gamma-ray spectrometry (also showing that the ²³²Th and ²³⁸U decay series were in secular equilibrium), thick-source ZnS alpha-counting and GM beta-counting. For the latter, the conversion factors 'beta count-rate ⇔ radiation dose-rate' were

  4. INAA Application for Trace Element Determination in Biological Reference Material

    Science.gov (United States)

    Atmodjo, D. P. D.; Kurniawati, S.; Lestiani, D. D.; Adventini, N.

    2017-06-01

    Trace element determination in biological samples is often used in studies of health and toxicology. Because trace elements can be both essential and toxic, their determination requires an accurate method, which implies that a good quality control (QC) procedure should be performed. In this study, QC for trace element determination in biological samples was applied by analyzing the Standard Reference Material (SRM) NIST 8414 Bovine Muscle using Instrumental Neutron Activation Analysis (INAA). Three selected trace elements, Fe, Zn, and Se, were determined. Accuracy is reported as %recovery and precision as the coefficient of variation (%CV). The %recovery of Fe, Zn, and Se was in the range of 99.4-107%, 92.7-103%, and 91.9-112%, respectively, whereas the %CV values were 2.92, 3.70, and 5.37%, respectively. These results show that the INAA method is precise and accurate for trace element determination in biological matrices.
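    The figures above follow the standard definitions: %recovery is the mean measured value as a percentage of the certified value, and %CV is the standard deviation as a percentage of the mean. A small sketch (the replicate values below are hypothetical, not the study's data):

```python
# %recovery and %CV for replicate measurements of a certified
# reference material. Replicate values are illustrative only.
import statistics

def recovery_and_cv(replicates, certified):
    mean = statistics.mean(replicates)
    recovery = 100 * mean / certified          # accuracy vs. certificate value
    cv = 100 * statistics.stdev(replicates) / mean  # precision across replicates
    return recovery, cv

fe_replicates = [72.1, 70.5, 73.0, 71.4]  # mg/kg, hypothetical replicate results
fe_certified = 71.2                       # mg/kg, hypothetical certificate value

rec, cv = recovery_and_cv(fe_replicates, fe_certified)
print(f"Fe recovery = {rec:.1f}%, CV = {cv:.2f}%")
```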

  5. Eight years of quality control in Bulgaria: impact on mammography practice.

    Science.gov (United States)

    Avramova-Cholakova, S; Lilkov, G; Kaneva, M; Terziev, K; Nakov, I; Mutkurov, N; Kovacheva, D; Ivanova, M; Vasilev, D

    2015-07-01

    The requirements for quality control (QC) in diagnostic radiology were introduced into Bulgarian legislation in 2005. Hospital medical physicists and several private medical physics groups provide QC services to radiology departments. The aim of this study was to analyse data from QC tests in mammography and to investigate the impact of the introduction of QC on mammography practice in the country. The study was coordinated by the National Centre of Radiobiology and Radiation Protection. All medical physics services were requested to fill in standardised forms with information about the most important parameters routinely measured during QC. All QC service providers responded. The results demonstrate significant improvement of practice since the introduction of QC, with established deviations falling from 65% in the first year to 7% in the last year. Systems that did not meet the acceptability criteria were suspended from use. The performance of automatic exposure control and of digital detectors is not regularly tested because of the absence of requirements in the legislation. The need for updated guidance and for training of medical physicists to reflect the change in technology was demonstrated.

  6. Development and experience of quality control methods for digital breast tomosynthesis systems.

    Science.gov (United States)

    Strudley, Cecilia J; Young, Kenneth C; Looney, Padraig; Gilbert, Fiona J

    2015-01-01

    To develop tomosynthesis quality control (QC) test methods and use them alongside established two-dimensional (2D) QC tests to measure the performance of digital breast tomosynthesis (DBT) systems used in a comparative trial with 2D mammography. DBT QC protocols and associated analysis were developed, incorporating adaptations of some 2D tests as well as some novel tests. The tomosynthesis tests were: mean glandular dose to the standard breast model; contrast-to-noise ratio in reconstructed focal planes; geometric distortion; artefact spread; threshold contrast detail detection in reconstructed focal planes; alignment of the X-ray beam to the reconstructed image and missed tissue; reproducibility of the tomosynthesis exposure; and homogeneity of the reconstructed focal planes. Summaries of results from the tomosynthesis QC tests are presented together with some 2D results for comparison. The tomosynthesis QC tests and analysis methods developed were successfully applied. The lessons learnt, which are detailed in the Discussion section, may be helpful to others embarking on DBT QC programmes. DBT performance test equipment and analysis methods have been developed, and the experience gained has contributed to the subsequent drafting of DBT QC protocols in the UK and Europe.

  7. 78 FR 45173 - Agency Information Collection Activities: Proposed Collection; Comment Request-Enhancing...

    Science.gov (United States)

    2013-07-26

    ... Public: State Employees: Respondent groups identified include (1) State QC directors, if the position... Annual Responses: The estimated total annual responses is 1,040, including initial recruitment and... burden on respondents is 255.60 hours (including recruitment communications and completed and attempted...

  8. Laser sampling

    International Nuclear Information System (INIS)

    Gorbatenko, A A; Revina, E I

    2015-01-01

    The review is devoted to the major advances in laser sampling. The advantages and drawbacks of the technique are considered. Specific features of combinations of laser sampling with various instrumental analytical methods, primarily inductively coupled plasma mass spectrometry, are discussed. Examples of practical implementation of hybrid methods involving laser sampling as well as corresponding analytical characteristics are presented. The bibliography includes 78 references

  9. Comparison of quality control software tools for diffusion tensor imaging.

    Science.gov (United States)

    Liu, Bilan; Zhu, Tong; Zhong, Jianhui

    2015-04-01

    Image quality in diffusion tensor imaging (DTI) is critical for image interpretation, diagnostic accuracy and efficiency. However, DTI is susceptible to numerous detrimental artifacts that may impair the reliability and validity of the obtained data. Although many quality control (QC) software tools have been developed and are widely used, each with its own tradeoffs, there is still no general agreement on an image quality control routine for DTI, and the practical impact of these tradeoffs is not well studied. An objective comparison identifying the pros and cons of each QC tool will help users make the best choice among tools for specific DTI applications. This study aims to quantitatively compare the effectiveness of three popular QC tools: DTIStudio (Johns Hopkins University), DTIPrep (University of North Carolina at Chapel Hill, University of Iowa and University of Utah) and TORTOISE (National Institutes of Health). Both synthetic and in vivo human brain data were used to quantify the adverse effects of major DTI artifacts on tensor calculation, as well as the effectiveness of the different QC tools in identifying and correcting these artifacts. The technical basis of each tool is discussed, and the ways in which particular techniques affect the output of each tool are analyzed. The different functions and I/O formats that the three QC tools provide for building a general DTI processing pipeline and for integration with other popular image-processing tools are also discussed.

  10. Improvement of early detection of breast cancer through collaborative multi-country efforts: Medical physics component.

    Science.gov (United States)

    Mora, Patricia; Faulkner, Keith; Mahmoud, Ahmed M; Gershan, Vesna; Kausik, Aruna; Zdesar, Urban; Brandan, María-Ester; Kurt, Serap; Davidović, Jasna; Salama, Dina H; Aribal, Erkin; Odio, Clara; Chaturvedi, Arvind K; Sabih, Zahida; Vujnović, Saša; Paez, Diana; Delis, Harry

    2018-04-01

    The International Atomic Energy Agency (IAEA), through a Coordinated Research Project (CRP) on "Enhancing Capacity for Early Detection and Diagnosis of Breast Cancer through Imaging", brought together a group of mammography radiologists, medical physicists and radiographers to investigate current practices and improve procedures for the early detection of breast cancer by strengthening both the clinical and medical physics components. This paper addresses the medical physics component. The countries that participated in the CRP were Bosnia and Herzegovina, Costa Rica, Egypt, India, Kenya, the Frmr. Yug. Rep. of Macedonia, Mexico, Nigeria, Pakistan, Philippines, Slovenia, Turkey, Uganda, United Kingdom and Zambia. Ten institutions participated, applying IAEA quality control protocols to 9 digital and 3 analogue mammography units. A spreadsheet for data collection was generated and distributed. Image quality was evaluated using the TOR MAX and DMAM2 Gold phantoms. QC results for the analogue equipment were satisfactory. QC tests performed on the digital systems showed that improvements needed to be implemented, especially in thickness accuracy, achievable signal-difference-to-noise ratio (SDNR) levels, uniformity and modulation transfer function (MTF). Mean glandular dose (MGD) was below internationally recommended levels for patient radiation protection. Evaluation of image quality with phantoms also indicated the need for improvement. The common activities facilitated improvements in mammography practice: medical physicists were trained in QC programs, infrastructure was improved and strengthened, and networking among medical physicists and radiologists took place and was maintained over time. The IAEA QC protocols provided a uniform approach to QC measurements.

  11. Development of Medical Technology for Contingency Response to Marrow Toxic Agents

    Science.gov (United States)

    2015-10-02

    Recommended Screening and Preventive Practices for Long-Term Survivors after Hematopoietic Cell Transplantation. Revista Brasileira de Hematologia e...Sorting FBI Federal Bureau of Investigation FDA Food and Drug Administration FDR Fund Drive Request FGM France Greffe de Moelle FHCRC Fred... inclusion as blind QC samples. • Of 127 samples that underwent the cell transformation process, 74 (58%) exhibited negative cell growth. A total of

  12. Sample representativeness verification of the FADN CZ farm business sample

    Directory of Open Access Journals (Sweden)

    Marie Prášilová

    2011-01-01

    Full Text Available Sample representativeness verification is one of the key stages of statistical work. After joining the European Union, the Czech Republic also joined the Union's Farm Accountancy Data Network (FADN) system. This is a sample of bodies and companies doing business in agriculture. Detailed production and economic data on the results of the farming business are collected from that sample annually, and results for the entire population of the country's farms are then estimated and assessed. It is important, hence, that the sample be representative. Representativeness is to be assessed both as to the number of farms included in the survey and as to the degree of accordance of the measures and indices with the population. The paper deals with the special statistical techniques and methods of FADN CZ sample representativeness verification, including the necessary sample-size statement procedure. The Czech farm population data have been obtained from the Czech Statistical Office data bank.
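    A sample-size statement of the kind mentioned above typically rests on the classical formula for estimating a population mean within an allowable error e at a chosen confidence level: n = (z·σ/e)². A hedged sketch of that calculation (the σ and e values are illustrative, not FADN CZ figures):

```python
# Minimum sample size for estimating a population mean with
# allowable error e at a given confidence level: n = (z * sigma / e)^2.
# Inputs are illustrative, not FADN CZ figures.
import math

def sample_size(sigma, e, z=1.96):
    """Round up: a fractional farm cannot be surveyed. z=1.96 -> 95% confidence."""
    return math.ceil((z * sigma / e) ** 2)

# e.g. estimating mean farm income, sd ~ 400 (thousand CZK), error +/- 50
print(sample_size(sigma=400, e=50))
```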

  13. [A comparison of convenience sampling and purposive sampling].

    Science.gov (United States)

    Suen, Lee-Jen Wu; Huang, Hui-Man; Lee, Hao-Hsien

    2014-06-01

    Convenience sampling and purposive sampling are two different sampling methods. This article first explains sampling terms such as target population, accessible population, simple random sampling, intended sample, actual sample, and statistical power analysis. These terms are then used to explain the difference between "convenience sampling" and "purposive sampling." Convenience sampling is a non-probabilistic sampling technique applicable to qualitative or quantitative studies, although it is most frequently used in quantitative studies. In convenience samples, subjects more readily accessible to the researcher are more likely to be included. Thus, in quantitative studies, opportunity to participate is not equal for all qualified individuals in the target population and study results are not necessarily generalizable to this population. As in all quantitative studies, increasing the sample size increases the statistical power of the convenience sample. In contrast, purposive sampling is typically used in qualitative studies. Researchers who use this technique carefully select subjects based on study purpose, with the expectation that each participant will provide unique and rich information of value to the study. As a result, members of the accessible population are not interchangeable, and sample size is determined by data saturation, not by statistical power analysis.

  14. A sensitive and selective liquid chromatography/tandem mass spectrometry method for quantitative analysis of efavirenz in human plasma.

    Directory of Open Access Journals (Sweden)

    Praveen Srivastava

    Full Text Available A selective and highly sensitive method for the determination of the non-nucleoside reverse transcriptase inhibitor (NNRTI) efavirenz in human plasma has been developed and fully validated, based on high-performance liquid chromatography tandem mass spectrometry (LC-MS/MS). Sample preparation involved protein precipitation followed by one-to-one dilution with water. The analyte, efavirenz, was separated by high-performance liquid chromatography and detected by tandem mass spectrometry in negative ionization mode with multiple reaction monitoring (MRM). Efavirenz and ¹³C₆-efavirenz (internal standard) were detected via the MRM transitions m/z 314.20→243.90 and m/z 320.20→249.90, respectively. A gradient program was used to elute the analytes, with 0.1% formic acid in water and 0.1% formic acid in acetonitrile as mobile-phase solvents at a flow rate of 0.3 mL/min. The total run time was 5 min, and the retention time for both the internal standard (¹³C₆-efavirenz) and efavirenz was approximately 2.6 min. The calibration curves were linear (coefficient of regression, r>0.99) over the concentration range of 1.0-2,500 ng/mL. The intraday precision, based on the standard deviation of replicates of the lower limit of quantification (LLOQ), was 9.24%, and for quality control (QC) samples it ranged from 2.41% to 6.42%; accuracy was 112% for the LLOQ and 100-111% for QC samples. The interday precision was 12.3% for the LLOQ and 3.03-9.18% for QC samples, and the accuracy was 108% for the LLOQ and 95.2-108% for QC samples. Stability studies showed that efavirenz was stable under the expected conditions of sample preparation and storage. The lower limit of quantification for efavirenz was 1 ng/mL. The analytical method showed excellent sensitivity, precision, and accuracy. It is robust and is being successfully applied to therapeutic drug monitoring and pharmacokinetic studies in HIV-infected patients.
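    Accuracy figures like those above come from fitting a calibration line to the standards (response ratio versus nominal concentration) and back-calculating each QC response to a concentration; accuracy is the back-calculated value as a percentage of nominal. A minimal unweighted least-squares sketch (the study's actual regression model and weighting are not stated here; all numbers are illustrative):

```python
# Ordinary least-squares calibration line (peak-area ratio vs nominal
# concentration) and back-calculated accuracy for a QC sample.
# Numbers are illustrative, not assay data.
import statistics

def fit_line(x, y):
    """Return (slope, intercept) of the unweighted least-squares line."""
    mx, my = statistics.mean(x), statistics.mean(y)
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    return slope, my - slope * mx

nominal = [1, 5, 25, 100, 500, 2500]              # ng/mL calibration standards
ratio = [0.011, 0.052, 0.256, 1.010, 5.05, 25.2]  # analyte/IS peak-area ratio

slope, intercept = fit_line(nominal, ratio)
qc_ratio = 2.02                                   # measured QC response
back_calc = (qc_ratio - intercept) / slope        # back-calculated concentration
accuracy = 100 * back_calc / 200                  # nominal QC = 200 ng/mL
print(f"back-calculated = {back_calc:.1f} ng/mL, accuracy = {accuracy:.1f}%")
```

    In validated bioanalytical assays a weighted fit (e.g. 1/x²) is often preferred over the unweighted fit shown here, since it balances the influence of low and high standards.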

  15. Benefits of a Pharmacology Antimalarial Reference Standard and Proficiency Testing Program Provided by the Worldwide Antimalarial Resistance Network (WWARN)

    Science.gov (United States)

    Lourens, Chris; Lindegardh, Niklas; Barnes, Karen I.; Guerin, Philippe J.; Sibley, Carol H.; White, Nicholas J.

    2014-01-01

    Comprehensive assessment of antimalarial drug resistance should include measurements of antimalarial blood or plasma concentrations in clinical trials and in individual assessments of treatment failure so that true resistance can be differentiated from inadequate drug exposure. Pharmacometric modeling is necessary to assess pharmacokinetic-pharmacodynamic relationships in different populations to optimize dosing. To accomplish both effectively and to allow comparison of data from different laboratories, it is essential that drug concentration measurement is accurate. Proficiency testing (PT) of laboratory procedures is necessary for verification of assay results. Within the Worldwide Antimalarial Resistance Network (WWARN), the goal of the quality assurance/quality control (QA/QC) program is to facilitate and sustain high-quality antimalarial assays. The QA/QC program consists of an international PT program for pharmacology laboratories and a reference material (RM) program for the provision of antimalarial drug standards, metabolites, and internal standards for laboratory use. The RM program currently distributes accurately weighed quantities of antimalarial drug standards, metabolites, and internal standards to 44 pharmacology, in vitro, and drug quality testing laboratories. The pharmacology PT program has sent samples to eight laboratories in four rounds of testing. WWARN technical experts have provided advice for correcting identified problems to improve performance of subsequent analysis and ultimately improved the quality of data. Many participants have demonstrated substantial improvements over subsequent rounds of PT. The WWARN QA/QC program has improved the quality and value of antimalarial drug measurement in laboratories globally. It is a model that has potential to be applied to strengthening laboratories more widely and improving the therapeutics of other infectious diseases. PMID:24777099

  16. Challenges in setting up quality control in diagnostic radiology ...

    African Journals Online (AJOL)

    Vol 24, No 4 (2015). Quality control (QC) on diagnostic radiology equipment forms part of the fundamental requirements for the ... Inadequate cooperation by facilities management, lack of QC equipment and insufficient staff form the major challenges in setting up QC in the facilities under study.

  17. The development of quality assurance program for cyberknife

    International Nuclear Information System (INIS)

    Jang, Ji Sun; Lee, Dong Han; Kang, Young Nam

    2006-01-01

    A standardized quality assurance (QA) program for Cyberknife suited to circumstances in Korea has not been established. In this research, we investigated the development of a QA program for Cyberknife and evaluated its feasibility in application. Considering the system configuration and the therapeutic methodology of Cyberknife, a list of quality control (QC) items was established and divided according to the period of operation. All the developed QC items were then categorized into three groups - basic QC, delivery-specific QC, and patient-specific QC - based on the purpose of each QA activity. In order to verify the validity of the established QA program, the QC list was applied at two Cyberknife centers. The acceptable tolerances were based on the acceptance inspection list from the Cyberknife manufacturer and on the QC results of the last three years at the two Cyberknife centers in Korea. The acquired measurement results were evaluated to analyse the current QA status and to verify the propriety of the developed QA program. The current QA status of the two Cyberknife centers was evaluated from the accuracy of all measurements made in applying the established QA program. Each measurement result showed good agreement within the acceptable tolerance limits of the developed QA program. It is considered that the QA program developed in this research could establish standardized QC methods for Cyberknife and confirm the accuracy and stability of this image-guided stereotactic radiotherapy.

  18. Lean Six Sigma in Health Care: Improving Utilization and Reducing Waste.

    Science.gov (United States)

    Almorsy, Lamia; Khalifa, Mohamed

    2016-01-01

    Healthcare costs have been increasing worldwide, mainly due to overutilization of resources. The savings potentially achievable from systematic, comprehensive, and cooperative reduction in waste are far higher than those from more direct and blunter cuts in care and coverage. At King Faisal Specialist Hospital and Research Center, inappropriate use and overutilization of the glucose test strips used for whole-blood glucose determination with glucometers were observed. The hospital implemented a project to improve their utilization. Using the Six Sigma DMAIC approach (Define, Measure, Analyze, Improve and Control), an efficient practice was put in place, including updating the related internal policies and procedures and properly implementing an effective user training and competency check-off program. This resulted in decreasing unnecessary quality control (QC) runs from 13% to 4%, decreasing failed QC runs from 14% to 7%, and lowering the QC-to-patient-testing ratio from 24/76 to 19/81.

  19. QC methods and means during pellets and fuel rods manufacturing at JSC 'MSZ'

    International Nuclear Information System (INIS)

    Kouznetsov, A.I.

    2000-01-01

    The report describes the main methods and devices used in the fabrication of pellets and fuel rods to prove their conformity to the requirements of technical specifications. The basic principles, range and accuracy of the methods and devices are considered in detail, as well as the system of metrological support for measurements. The latter includes metrological certification and periodical verification of the devices, metrological qualification of measurement procedures, provision of standard samples and checking the correctness of analysis performance. An overall review of the testing methods used in different fuel-production plants shows that most methods and devices are very similar. There are still some variations in methods, which could be a subject for interesting discussion among specialists. This report contains a brief review of the testing methods and devices used at our plant, with a more detailed description given to methods that differ from those commonly used. (author)

  20. Geochemical reanalysis of historical U.S. Geological Survey sediment samples from the Haines area, Juneau and Skagway quadrangles, southeast Alaska

    Science.gov (United States)

    Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.

    2015-01-01

    The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 212 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the Chilkat, Klehini, Tsirku, and Takhin river drainages, as well as smaller drainages flowing into Chilkat and Chilkoot Inlets near Haines, Skagway Quadrangle, Southeast Alaska. Additionally, some samples were chosen from the Juneau gold belt, Juneau Quadrangle, Southeast Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical

  1. Geochemical reanalysis of historical U.S. Geological Survey sediment samples from the northeastern Alaska Range, Healy, Mount Hayes, Nabesna, and Tanacross quadrangles, Alaska

    Science.gov (United States)

    Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.

    2015-01-01

    The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 670 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the northeastern Alaska Range, in the Healy, Mount Hayes, Nabesna, and Tanacross quadrangles, Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published in this report as a coauthored DGGS report, and will be incorporated into the statewide geochemical

  2. QUANTITATIVE STRUCTURE-ACTIVITY RELATIONSHIP (QSAR) ANALYSIS OF VINCADIFFORMINE ANALOGUES AS ANTIPLASMODIAL COMPOUNDS AGAINST THE CHLOROQUINE-SENSITIVE STRAIN

    Directory of Open Access Journals (Sweden)

    Iqmal Tahir

    2010-06-01

    Quantitative Structure-Activity Relationship (QSAR) analysis of vincadifformine analogs as antimalarial drugs has been conducted using atomic net charges (q), dipole moment (μ), LUMO (Lowest Unoccupied Molecular Orbital) and HOMO (Highest Occupied Molecular Orbital) energies, molecular mass (m), and surface area (A) as predictors of their activity. Predictor data were obtained from computational chemistry using semi-empirical AM1 molecular orbital calculations. Antimalarial activities were taken as the activity of the drugs against chloroquine-sensitive Plasmodium falciparum (Nigerian Cell strain) and were presented as the value of ln(1/IC50), where IC50 is the effective concentration inhibiting 50% of the parasite growth. The best QSAR model was determined by multiple linear regression analysis, giving the QSAR equation: log(1/IC50) = 9.602 qC1 − 17.012 qC2 + 6.084 qC3 − 19.758 qC5 − 6.517 qC6 + 2.746 qC7 − 6.795 qN + 6.59 qC8 − 0.190 μ − 0.974 ELUMO + 0.515 EHOMO − 0.274 m + 0.029 A − 1.673 (n = 16; r = 0.995; SD = 0.099; F = 2.682). Keywords: QSAR analysis, antimalaria, vincadifformine.
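
A fit of this kind can be sketched in a few lines. The descriptor matrix below is synthetic random data, not the paper's compounds, and ordinary least squares via `numpy.linalg.lstsq` stands in for whatever regression package the authors actually used:

```python
import numpy as np

# Hypothetical illustration: fit activity = X·b + b0 by ordinary least
# squares, as in a QSAR multiple linear regression. All values synthetic.
rng = np.random.default_rng(0)
n_compounds, n_descriptors = 16, 4
X = rng.normal(size=(n_compounds, n_descriptors))  # e.g. charges, mu, EHOMO, A
true_coefs = np.array([9.6, -17.0, 0.5, 0.03])
y = X @ true_coefs - 1.673  # noiseless synthetic activity, ln(1/IC50)

# Augment with a column of ones for the intercept and solve.
X_aug = np.column_stack([X, np.ones(n_compounds)])
coefs, *_ = np.linalg.lstsq(X_aug, y, rcond=None)

print(np.allclose(coefs[:4], true_coefs))  # True: descriptor weights recovered
print(abs(coefs[4] + 1.673) < 1e-6)        # True: intercept recovered
```

With noiseless, full-rank synthetic data the fit is exact; on real assay data one would also inspect r, SD, and F as the abstract reports.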

  3. Intercomparison of quality control procedures in radiotherapy in the Netherlands

    International Nuclear Information System (INIS)

    Kleffens, H.J. van; Meijer, G.J.; Mijnheer, B.J.

    1997-01-01

    A grant was received from the Dutch government to accomplish the development and implementation of guidelines for quality control (QC) of radiotherapy equipment in The Netherlands. QC of electron accelerators, simulators, CT scanners, mould room equipment, dosimetry equipment and treatment planning systems will be considered in this project. The project started in September 1994 with an investigation of QC of medical electron accelerators as performed in all 21 radiotherapy institutions in The Netherlands. An extensive questionnaire on QC procedures of electron accelerators was sent to all centres with items related to safety systems, mechanical aspects, radiation leakage, beam data and dosimetry equipment (in total about 60 questions). From the answers the following conclusions can be drawn: There is a large variation in time spent on QC; This QC time strongly depends on the complexity of the linear accelerator; There is a large variation in frequency and tolerance levels of the various tests; The way QC of an item is performed differs considerably (extensive-comprehensive). From these data recommendations specific for the situation in The Netherlands are being prepared and compared with other existing national and international reports. Similar procedures are underway for CT scanners and simulators while for the other equipment minimum guidelines still have to be developed. (author)

  4. Simultaneous Determination of Flavonols and Terpene Lactones in ...

    African Journals Online (AJOL)

    Lactones in Beagle Dog Plasma by Ultra-Performance ... School of Chinese Materia Medica, Beijing University of Chinese Medicine, Beijing 100102, ... Matrix effect derived from QC samples was in the range of 85.09 – 113.14 %. ..... with the suitable weighting factor of 1/x. .... pharmacokinetic studies of G. biloba and its.

  5. Academics' perceptions of 'quality in higher education' and quality ...

    African Journals Online (AJOL)

    academics. The article discusses various perceptions of QHE as well as the concern for quality nationally and internationally and distils out some general QP, QA, QC and QM strategies. This research was a case study. The sample consisted of 28 academics from the Faculty of Science. Data were gathered mainly through

  6. Survey on quality control measurements for nuclear medicine imaging equipment in Finland in 2006

    International Nuclear Information System (INIS)

    Korpela, Helinä; Niemelä, Jarkko

    2008-01-01

    Routine quality control (QC) is an essential requirement in nuclear medicine (NM) in order to ensure optimal functioning of equipment. To harmonise the routine QC of NM imaging equipment in Finnish hospitals (planar gamma cameras, SPECT, coincidence gamma cameras, PET), the Radiation and Nuclear Safety Authority (STUK) will publish guidelines on QC in collaboration with several hospital physicists. Recommendations will be provided on routine QC measurements and on the frequency of testing. It is also planned to provide recommendations for the acceptance criteria when assessing different performance parameters for NM imaging equipment. In order to determine what performance parameters of NM equipment are currently measured in hospitals, how frequently they are measured and what acceptance criteria are used, a survey was carried out on the QC of NM equipment in Finland during 2006. (author)

  7. Robustness of quantum correlations against linear noise

    International Nuclear Information System (INIS)

    Guo, Zhihua; Cao, Huaixin; Qu, Shixian

    2016-01-01

    Relative robustness of quantum correlations (RRoQC) of a bipartite state is firstly introduced relative to a classically correlated state. Robustness of quantum correlations (RoQC) of a bipartite state is then defined as the minimum of RRoQC of the state relative to all classically correlated ones. It is proved that as a function on quantum states, RoQC is nonnegative, lower semi-continuous and neither convex nor concave; especially, it is zero if and only if the state is classically correlated. Thus, RoQC not only quantifies the endurance of quantum correlations of a state against linear noise, but also can be used to distinguish between quantum and classically correlated states. Furthermore, the effects of local quantum channels on the robustness are explored and characterized. (paper)

  8. Product quality control, irradiation and shipping procedures for mass-reared tephritid fruit flies for sterile insect release programmes

    International Nuclear Information System (INIS)

    1999-05-01

    This document represents the recommendations, reached by consensus of an international group of quality control experts, on the standard procedures for product quality control (QC) for mass-reared tephritid flies that are to be used in Sterile Insect Technique (SIT) programmes. In addition, the manual describes recommended methods of handling and packaging pupae during irradiation and shipment. Most of the procedures were designed specifically for use with Mediterranean fruit flies, Ceratitis capitata (Wied.), but they are applicable, with minor modification in some cases, to other tephritid species such as the Caribbean fruit fly Anastrepha suspensa, the Mexican fruit fly A. ludens, and various Bactrocera species. The manual is evolving and subject to periodic updates; future additions will include other fruit flies as the need is identified. If followed, the procedures described in this manual will help ensure that the quality of mass-produced flies is measured accurately in a standardised fashion, allowing comparisons of quality over time and across rearing facilities and field programmes. Problems in rearing, irradiation and handling procedures, and strain quality can be identified and, it is hoped, corrected before control programmes are affected. Tests and procedures described in this document are only part of a total quality control programme for tephritid fly production. The product QC evaluations included in this manual are, unless otherwise noted, required to be conducted during SIT programmes by the field programme staff, not the production staff. Additional product QC tests have been developed and their use is optional (see ancillary test section). Production and process QC evaluations (e.g., analysis of diet components, monitoring the rearing environment, yield of larvae, development rate, etc.) are not within the scope of this document. Quality specifications are included for minimum and mean acceptability of conventional strains of C. capitata, A. ludens, and A

  9. An overview of quality control practices in Ontario with particular reference to cholesterol analysis.

    Science.gov (United States)

    Krishnan, S; Webb, S; Henderson, A R; Cheung, C M; Nazir, D J; Richardson, H

    1999-03-01

    The Laboratory Proficiency Testing Program (LPTP) assesses the analytical performance of all licensed laboratories in Ontario. The LPTP Enzymes, Cardiac Markers, and Lipids Committee conducted a "Patterns of Practice" survey to assess the in-house quality control (QC) practices of laboratories in Ontario using cholesterol as the QC paradigm. The survey was questionnaire-based, seeking information on statistical calculations, software rules, the review process, data retention, and so on. Copies of the in-house cholesterol QC graphs were requested. A total of 120 of 210 laboratories were randomly chosen to receive the questionnaires during 1995 and 1996; 115 laboratories responded, although some did not answer all questions. The majority calculate means and standard deviations (SD) every month, using anywhere from 4 to >100 data points. 65% use a fixed mean and SD, while 17% use means calculated from the previous month. A few use a floating or cumulative mean. Some laboratories that do not use fixed means use a fixed SD. About 90% use some form of statistical quality control rules. The most common rules used to detect random error are 1(3s)/R4s, while 2(2s)/4(1s)/10x are used for systematic errors. About 20% did not assay any QC at levels >5.5 mmol/L. Quality control data are reviewed daily (technologists), weekly and monthly (supervisors/directors). Most laboratories retain their QC records for up to 3 years on paper and magnetic media. On some QC graphs the mean and SD, QC product lot number, or reference to action logs are not apparent. Quality control practices in Ontario are, therefore, disappointing. Improvement is required in the use of clinically appropriate concentrations of QC material and in documentation on QC graphs.
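
The control rules named in the survey (1(3s), R4s, 2(2s), 4(1s), 10x) are the classic Westgard multirules. Two of them can be sketched as follows; the cholesterol control values and limits below are illustrative assumptions, not the survey's data:

```python
# Minimal sketch of two common Westgard QC rules applied to a series of
# control measurements (units: mmol/L). Values below are illustrative.

def rule_1_3s(values, mean, sd):
    """Flag any single control value outside mean ± 3 SD (random error)."""
    return any(abs(v - mean) > 3 * sd for v in values)

def rule_2_2s(values, mean, sd):
    """Flag two consecutive values beyond 2 SD on the same side (systematic error)."""
    for a, b in zip(values, values[1:]):
        if (a - mean) > 2 * sd and (b - mean) > 2 * sd:
            return True
        if (mean - a) > 2 * sd and (mean - b) > 2 * sd:
            return True
    return False

mean, sd = 5.2, 0.1
in_control = [5.15, 5.25, 5.18, 5.22]
shifted = [5.15, 5.45, 5.43, 5.22]   # two consecutive values above +2 SD

print(rule_1_3s(in_control, mean, sd))  # False: no point beyond 3 SD
print(rule_2_2s(shifted, mean, sd))     # True: systematic shift detected
```

A full multirule scheme would chain these checks (1(2s) as a warning, then 1(3s)/2(2s)/R4s/4(1s)/10x for rejection), as Westgard describes.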

  10. Quality Control in Primary Schools: Progress from 2001-2006

    Science.gov (United States)

    Hofman, Roelande H.; de Boom, Jan; Hofman, W. H. Adriaan

    2010-01-01

    This article presents findings of research into the quality control (QC) of schools from 2001-2006. In 2001 several targets for QC were set and the progress of 939 primary schools is presented. Furthermore, using cluster analysis, schools are classified into four QC-types that differ in their focus on school (self) evaluation and school…

  11. Automated quality control in a file-based broadcasting workflow

    Science.gov (United States)

    Zhang, Lina

    2014-04-01

    Benefiting from the development of information and internet technologies, television broadcasting is transforming from inefficient tape-based production and distribution to integrated file-based workflows. However, no matter how many changes have taken place, successful broadcasting still depends on the ability to deliver a consistent, high-quality signal to the audience. After the transition from tape to file, traditional methods of manual quality control (QC) become inadequate, subjective, and inefficient. Based on China Central Television's full file-based workflow at its new site, this paper introduces an automated quality control test system for accurate detection of hidden faults in media content. It discusses the system framework and workflow control when automated QC is added. It puts forward a QC criterion and presents QC software that follows this criterion. It also reports experiments on QC speed using parallel processing and distributed computing. The performance of the test system shows that the adoption of automated QC can make production effective and efficient, and help the station achieve a competitive advantage in the media market.
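
The parallel-processing idea can be illustrated with a toy pipeline: independent files get independent QC checks, so they parallelize trivially. The `qc_check` below is a placeholder (a real system would decode and analyze the media essence), and the file names are hypothetical:

```python
# Hypothetical sketch: run independent QC checks over media files in
# parallel. The check itself is a stand-in (file size > 0), not a real
# audio/video analysis.
import concurrent.futures
import os
import tempfile

def qc_check(path):
    """Stand-in QC check; a real system would inspect the media essence."""
    return path, os.path.getsize(path) > 0

# Create a few dummy "media files" to scan.
tmpdir = tempfile.mkdtemp()
paths = []
for i in range(4):
    p = os.path.join(tmpdir, f"clip_{i}.mxf")
    with open(p, "wb") as f:
        f.write(b"\x00" * (i + 1))
    paths.append(p)

# Each file is checked independently, so the work parallelizes cleanly.
with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(qc_check, paths))

print(all(results.values()))  # True: every file passed the stand-in check
```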

  12. Quality control of CT systems by automated monitoring of key performance indicators: a two-year study.

    Science.gov (United States)

    Nowik, Patrik; Bujila, Robert; Poludniowski, Gavin; Fransson, Annette

    2015-07-08

    The purpose of this study was to develop a method of performing routine periodical quality controls (QC) of CT systems by automatically analyzing key performance indicators (KPIs), obtainable from images of manufacturers' quality assurance (QA) phantoms. A KPI pertains to a measurable or determinable QC parameter that is influenced by other underlying fundamental QC parameters. The established KPIs are based on relationships between existing QC parameters used in the annual testing program of CT scanners at the Karolinska University Hospital in Stockholm, Sweden. The KPIs include positioning, image noise, uniformity, homogeneity, the CT number of water, and the CT number of air. An application (MonitorCT) was developed to automatically evaluate phantom images in terms of the established KPIs. The developed methodology has been used for two years in clinical routine, where CT technologists perform daily scans of the manufacturer's QA phantom and automatically send the images to MonitorCT for KPI evaluation. In the cases where results were out of tolerance, actions could be initiated in less than 10 min. 900 QC scans from two CT scanners have been collected and analyzed over the two-year period that MonitorCT has been active. Two types of errors have been registered in this period: a ring artifact was discovered with the image noise test, and a calibration error was detected multiple times with the CT number test. In both cases, results were outside the tolerances defined for MonitorCT, as well as by the vendor. Automated monitoring of KPIs is a powerful tool that can be used to supplement established QC methodologies. Medical physicists and other professionals concerned with the performance of a CT system will, using such methods, have access to comprehensive data on the current and historical (trend) status of the system such that swift actions can be taken in order to ensure the quality of the CT examinations, patient safety, and minimal disruption of service.
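
The KPI-with-tolerances approach can be sketched as follows; the synthetic water-phantom image and the tolerance limits are illustrative assumptions, not MonitorCT's actual values:

```python
import numpy as np

# Minimal sketch of KPI-style automated checks on a water-phantom image.
# The synthetic image and tolerance limits below are assumptions for
# illustration only.
rng = np.random.default_rng(1)
phantom = rng.normal(loc=0.0, scale=5.0, size=(256, 256))  # HU in a water ROI

kpis = {
    "ct_number_water": phantom.mean(),  # expected ~0 HU for water
    "image_noise": phantom.std(),       # SD of HU in the ROI
}
tolerances = {
    "ct_number_water": (-4.0, 4.0),     # illustrative limits, HU
    "image_noise": (0.0, 7.0),
}

# Flag any KPI outside its tolerance band, as a monitoring tool would.
out_of_tolerance = [k for k, v in kpis.items()
                    if not tolerances[k][0] <= v <= tolerances[k][1]]
print(out_of_tolerance)  # []: all KPIs within limits for this scan
```

A production system would additionally log each day's KPI values to track trends, which is where calibration drifts like the one reported above become visible.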

  13. Quality control of CT systems by automated monitoring of key performance indicators: a two‐year study

    Science.gov (United States)

    Bujila, Robert; Poludniowski, Gavin; Fransson, Annette

    2015-01-01

    The purpose of this study was to develop a method of performing routine periodical quality controls (QC) of CT systems by automatically analyzing key performance indicators (KPIs), obtainable from images of manufacturers' quality assurance (QA) phantoms. A KPI pertains to a measurable or determinable QC parameter that is influenced by other underlying fundamental QC parameters. The established KPIs are based on relationships between existing QC parameters used in the annual testing program of CT scanners at the Karolinska University Hospital in Stockholm, Sweden. The KPIs include positioning, image noise, uniformity, homogeneity, the CT number of water, and the CT number of air. An application (MonitorCT) was developed to automatically evaluate phantom images in terms of the established KPIs. The developed methodology has been used for two years in clinical routine, where CT technologists perform daily scans of the manufacturer's QA phantom and automatically send the images to MonitorCT for KPI evaluation. In the cases where results were out of tolerance, actions could be initiated in less than 10 min. 900 QC scans from two CT scanners have been collected and analyzed over the two‐year period that MonitorCT has been active. Two types of errors have been registered in this period: a ring artifact was discovered with the image noise test, and a calibration error was detected multiple times with the CT number test. In both cases, results were outside the tolerances defined for MonitorCT, as well as by the vendor. Automated monitoring of KPIs is a powerful tool that can be used to supplement established QC methodologies. Medical physicists and other professionals concerned with the performance of a CT system will, using such methods, have access to comprehensive data on the current and historical (trend) status of the system such that swift actions can be taken in order to ensure the quality of the CT examinations, patient safety, and minimal disruption of service

  14. Superfund Site Information - Site Sampling Data

    Data.gov (United States)

    U.S. Environmental Protection Agency — This asset includes Superfund site-specific sampling information including location of samples, types of samples, and analytical chemistry characteristics of...

  15. Inspection and verification of waste packages for near surface disposal

    International Nuclear Information System (INIS)

    2000-01-01

    Extensive experience has been gained with various disposal options for low and intermediate level waste at or near surface disposal facilities. Near surface disposal is based on proven and well demonstrated technologies. To ensure the safety of near surface disposal facilities when available technologies are applied, it is necessary to control and assure the quality of the repository system's performance, which includes waste packages, engineered features and natural barriers, as well as siting, design, construction, operation, closure and institutional controls. Recognizing the importance of repository performance, the IAEA is producing a set of technical publications on quality assurance and quality control (QA/QC) for waste disposal to provide Member States with technical guidance and current information. These publications cover issues on the application of QA/QC programmes to waste disposal, long term record management, and specific QA/QC aspects of waste packaging, repository design and R and D. Waste package QA/QC is especially important because the package is the primary barrier to radionuclide release from a disposal facility. Waste packaging also involves interface issues between the waste generator and the disposal facility operator. Waste should be packaged by generators to meet waste acceptance requirements set for a repository or disposal system. However, it is essential that the disposal facility operator ensure that waste packages conform with disposal facility acceptance requirements. Demonstration of conformance with disposal facility acceptance requirements can be achieved through the systematic inspection and verification of waste packages at both the waste generator's site and at the disposal facility, based on a waste package QA/QC programme established by the waste generator and approved by the disposal operator. However, strategies, approaches and the scope of inspection and verification will be somewhat different from country to country

  16. Al-based metal matrix composites reinforced with Al–Cu–Fe quasicrystalline particles: Strengthening by interfacial reaction

    International Nuclear Information System (INIS)

    Ali, F.; Scudino, S.; Anwar, M.S.; Shahid, R.N.; Srivastava, V.C.; Uhlenwinkel, V.; Stoica, M.; Vaughan, G.; Eckert, J.

    2014-01-01

    Highlights: • Strength of composites is enhanced as the QC-to-ω phase transformation advances. • Yield strength increases from 195 to 400 MPa with QC-to-ω interfacial reaction. • Reducing matrix ligament size explains most of the strengthening. • Improved interfacial bonding and nano ω phase explain divergence from model. - Abstract: The interfacial reaction between the Al matrix and the Al62.5Cu25Fe12.5 quasicrystalline (QC) reinforcing particles to form the Al7Cu2Fe ω-phase has been used to further enhance the strength of the Al/QC composites. The QC-to-ω phase transformation during heating was studied by in situ X-ray diffraction using a high-energy monochromatic synchrotron beam, which makes it possible to follow the structural evolution and to correlate it with the mechanical properties of the composites. The mechanical behavior of these transformation-strengthened composites is remarkably improved as the QC-to-ω phase transformation progresses: the yield strength increases from 195 MPa for the starting material reinforced exclusively with QC particles to 400 MPa for the material where the QC-to-ω reaction is complete. The reduction of the matrix ligament size resulting from the increased volume fraction of the reinforcing phase during the transformation can account for most of the observed improvement in strength, whereas the additional strengthening can be ascribed to the possible presence of nanosized ω-phase particles as well as to the improved interfacial bonding between matrix and particles caused by the compressive stresses arising in the matrix.

  17. Al-based metal matrix composites reinforced with Al–Cu–Fe quasicrystalline particles: Strengthening by interfacial reaction

    Energy Technology Data Exchange (ETDEWEB)

    Ali, F. [IFW Dresden, Institut für Komplexe Materialien, Postfach 27 01 16, D-01171 Dresden (Germany); Materials Processing Group, DMME, Pakistan Institute of Engineering and Applied Sciences, P.O. Nilore, Islamabad (Pakistan); Scudino, S., E-mail: s.scudino@ifw-dresden.de [IFW Dresden, Institut für Komplexe Materialien, Postfach 27 01 16, D-01171 Dresden (Germany); Anwar, M.S.; Shahid, R.N. [Materials Processing Group, DMME, Pakistan Institute of Engineering and Applied Sciences, P.O. Nilore, Islamabad (Pakistan); Srivastava, V.C. [Metal Extraction and Forming Division, National Metallurgical Laboratory, Jamshedpur 831007 (India); Uhlenwinkel, V. [Institut für Werkstofftechnik, Universität Bremen, D-28359 Bremen (Germany); Stoica, M. [IFW Dresden, Institut für Komplexe Materialien, Postfach 27 01 16, D-01171 Dresden (Germany); Vaughan, G. [European Synchrotron Radiation Facilities ESRF, BP 220, 38043 Grenoble (France); Eckert, J. [IFW Dresden, Institut für Komplexe Materialien, Postfach 27 01 16, D-01171 Dresden (Germany); TU Dresden, Institut für Werkstoffwissenschaft, D-01062 Dresden (Germany)

    2014-09-01

    Highlights: • Strength of composites is enhanced as the QC-to-ω phase transformation advances. • Yield strength increases from 195 to 400 MPa with QC-to-ω interfacial reaction. • Reducing matrix ligament size explains most of the strengthening. • Improved interfacial bonding and nano ω phase explain divergence from model. - Abstract: The interfacial reaction between the Al matrix and the Al62.5Cu25Fe12.5 quasicrystalline (QC) reinforcing particles to form the Al7Cu2Fe ω-phase has been used to further enhance the strength of the Al/QC composites. The QC-to-ω phase transformation during heating was studied by in situ X-ray diffraction using a high-energy monochromatic synchrotron beam, which makes it possible to follow the structural evolution and to correlate it with the mechanical properties of the composites. The mechanical behavior of these transformation-strengthened composites is remarkably improved as the QC-to-ω phase transformation progresses: the yield strength increases from 195 MPa for the starting material reinforced exclusively with QC particles to 400 MPa for the material where the QC-to-ω reaction is complete. The reduction of the matrix ligament size resulting from the increased volume fraction of the reinforcing phase during the transformation can account for most of the observed improvement in strength, whereas the additional strengthening can be ascribed to the possible presence of nanosized ω-phase particles as well as to the improved interfacial bonding between matrix and particles caused by the compressive stresses arising in the matrix.

  18. Genetic Sample Inventory

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This database archives genetic tissue samples from marine mammals collected primarily from the U.S. east coast. The collection includes samples from field programs,...

  19. A simple vibrating sample magnetometer for macroscopic samples

    Science.gov (United States)

    Lopez-Dominguez, V.; Quesada, A.; Guzmán-Mínguez, J. C.; Moreno, L.; Lere, M.; Spottorno, J.; Giacomone, F.; Fernández, J. F.; Hernando, A.; García, M. A.

    2018-03-01

    We here present a simple model of a vibrating sample magnetometer (VSM). The system allows recording magnetization curves at room temperature with a resolution of the order of 0.01 emu and is appropriate for macroscopic samples. The setup can be mounted in different configurations depending on the requirements of the sample to be measured (mass, saturation magnetization, saturation field, etc.). We also include examples of curves obtained with our setup and comparison curves measured with a standard commercial VSM that confirm the reliability of our device.

  20. Iowa Geologic Sampling Points

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility — Point locations of geologic samples/files in the IGS repository. Types of samples include well cuttings, outcrop samples, cores, drillers logs, measured sections,...

  1. Method of extruding and packaging a thin sample of reactive material including forming the extrusion die

    International Nuclear Information System (INIS)

    Lewandowski, E.F.; Peterson, L.L.

    1985-01-01

    This invention teaches a method of cutting a narrow slot in an extrusion die with an electrical discharge machine by first drilling spaced holes at the ends of where the slot will be, whereby the oil can flow through the holes and slot to flush away the material eroded as the slot is being cut. The invention further teaches a method of extruding a very thin ribbon of solid, highly reactive material such as lithium or sodium through the die in an inert atmosphere of nitrogen, argon or the like, as in a glovebox. The invention further teaches a method of stamping out sample discs from the ribbon and of packaging each disc by sandwiching it between two aluminum sheets and cold welding the sheets together along an annular seam beyond the outer periphery of the disc. This provides a sample of high-purity reactive material that can have a long shelf life.

  2. Quantum cost optimized design of 4-bit reversible universal shift register using reduced number of logic gate

    Science.gov (United States)

    Maity, H.; Biswas, A.; Bhattacharjee, A. K.; Pal, A.

    In this paper, we have proposed the design of a quantum cost (QC) optimized 4-bit reversible universal shift register (RUSR) using a reduced number of reversible logic gates. The proposed design is very useful in quantum computing due to its low QC, small number of reversible logic gates, and low delay. The QC, number of gates, and garbage outputs (GOs) of the proposed design are 64, 8, and 16, respectively. Compared with the latest reported results, the QC is improved by 5.88% to 70.9% and the number of gates by 60% to 83.33%.
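
The quoted percentages follow the usual relative-improvement formula, (baseline − proposed) / baseline × 100; the baseline value below is a hypothetical example, not one of the actually compared designs:

```python
# Relative-improvement arithmetic behind percentage comparisons like the
# ones above. The baseline quantum cost of 68 is a hypothetical example.

def improvement_pct(baseline, proposed):
    """Relative improvement of `proposed` over `baseline`, in percent."""
    return (baseline - proposed) / baseline * 100

# A hypothetical earlier design with QC = 68 versus the proposed QC = 64:
print(round(improvement_pct(68, 64), 2))  # 5.88
```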

  3. NEMA NU-1 2007 based and independent quality control software for gamma cameras and SPECT

    International Nuclear Information System (INIS)

    Vickery, A; Joergensen, T; De Nijs, R

    2011-01-01

    Thorough quality assurance of gamma and SPECT cameras requires careful handling of the measured quality control (QC) data. Most gamma camera manufacturers provide users with camera-specific QC software. This QC software is indeed a useful tool for following the day-to-day performance of a single camera. However, when it comes to objective performance comparison of different gamma cameras and a deeper understanding of the calculated numbers, camera-specific QC software without access to the source code is best avoided: calculations and definitions might differ, and manufacturer-independent, standardized results are preferred. Based upon the NEMA Standards Publication NU 1-2007, we have developed a suite of easy-to-use data handling software for processing acquired QC data, providing the user with instructive images and text files with the results.
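
As an example of the kind of figure of merit such software computes, the integral uniformity of a flood-field image is commonly defined as (max − min) / (max + min) × 100. The sketch below applies that formula to a synthetic counts image and omits the smoothing and useful-field-of-view masking that the NEMA standard also prescribes:

```python
import numpy as np

# Minimal sketch of a NEMA NU 1-style figure of merit: integral
# uniformity of a flood-field image. The image is synthetic, and the
# standard's smoothing and UFOV masking steps are omitted here.

def integral_uniformity(image):
    """(max - min) / (max + min) * 100 over the analyzed region."""
    lo, hi = image.min(), image.max()
    return (hi - lo) / (hi + lo) * 100.0

flood = np.full((64, 64), 1000.0)  # ideal flat flood image, counts
flood[10, 10] = 1100.0             # a local hot spot

print(round(integral_uniformity(flood), 2))  # 4.76
```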

  4. Evaluation of the 1Shot Phantom dedicated to the mammography system using FCR

    International Nuclear Information System (INIS)

    Nagashima, Chieko; Uchiyama, Nachiko; Moriyama, Noriyuki; Nagata, Mio; Kobayashi, Hiroyuki; Sankoda, Katsuhiro; Saotome, Shigeru; Tagi, Masahiro; Kusunoki, Tetsurou

    2009-01-01

    Currently, daily quality control (QC) tests for mammography systems are generally evaluated by using visual analysis phantoms, which of course means subjective measurement. In our study, however, we evaluated a novel digital phantom, the 1Shot Phantom M plus (1Shot Phantom), together with automatic analysis software dedicated to mammography systems using Fuji computed radiography (FCR). The digital phantom enables objective evaluation by providing actual physical measurement rather than subjective visual assessment. We measured contrast-to-noise ratio (CNR), image receptor homogeneity, missed tissue at the chest wall side, modulation transfer function (MTF), and geometric distortion utilizing the 1Shot Phantom. We then compared the values obtained using the 1Shot Phantom with values obtained from the European guidelines and International Electrotechnical Commission (IEC) standards. In addition, we evaluated the convenience of using the digital phantom. The values utilizing the 1Shot Phantom and those from the European guidelines and IEC standards were consistent, but the QC tests by the European guidelines and IEC standards methods took about six hours, while the same QC tests using the 1Shot Phantom took 10 minutes or less, including exposure of the phantom image, measurement, and analysis. In conclusion, the digital phantom and dedicated software proved very useful and produced improved analysis for mammography systems using FCR in clinical daily QC testing because of their objectivity and substantial time-saving convenience. (author)
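
One of the listed metrics, contrast-to-noise ratio, is commonly computed from a detail ROI and a background ROI as CNR = (mean_signal − mean_background) / SD_background. A minimal sketch with synthetic ROI values (phantom software would extract these from the image):

```python
import numpy as np

# Minimal CNR sketch. The pixel values below are synthetic stand-ins for
# ROI samples that dedicated phantom software would extract from an image.

def cnr(signal_roi, background_roi):
    """Contrast-to-noise ratio between a detail ROI and the background."""
    s, b = np.asarray(signal_roi), np.asarray(background_roi)
    return (s.mean() - b.mean()) / b.std()

signal = [120.0, 122.0, 121.0, 119.0]       # pixel values inside the detail
background = [100.0, 102.0, 98.0, 100.0]    # pixel values in the background

print(round(cnr(signal, background), 1))  # 14.5
```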

  5. Quality control for the mammography screening program in Serbia: Physical and technical aspects

    International Nuclear Information System (INIS)

    Ciraj-Bjelac, O.; Bozovic, P.; Lazarevic, D.; Arandjic, D.; Kosutic, D.

    2012-01-01

    Breast cancer is the major cause of mortality among the female population in Serbia. It is presumed that the introduction of a screening programme will reduce mortality; therefore, 47 new mammography units were installed for a population-based screening programme in 2011. In parallel, quality assurance and quality control (QC) in mammography have received increasing attention as essential elements of a successful breast cancer campaign, initiated in Serbia for the first time. The purpose of this study is to investigate the need for, and the possible implementation of, a comprehensive QC programme for mammography screening in Serbia, with special focus on physical and technical aspects. In the first phase, QC protocols were developed containing lists of parameters, methodology, frequency of tests, and reference values for screen-film, computed radiography, and full-field digital mammography units. The second phase focused on the initial implementation of these protocols. The paper presents the results of tests of selected parameters in 35 mammography units, with special emphasis on patient dose and image quality descriptors. Following this initial implementation at the start of the population-based breast cancer screening campaign, it is essential to establish a system of regular, periodic QC monitoring of the equipment and to ensure high-quality mammograms with the minimum possible radiation dose to the population included in the screening. (authors)

  6. Determination of melamine in milk-based products and other food and beverage products by ion-pair liquid chromatography-tandem mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Ibanez, Maria; Sancho, Juan V. [Research Institute for Pesticides and Water, University Jaume I, E-12071, Castellon (Spain); Hernandez, Felix, E-mail: felix.hernandez@qfa.uji.es [Research Institute for Pesticides and Water, University Jaume I, E-12071, Castellon (Spain)

    2009-09-01

    This paper describes a fast method for the sensitive and selective determination of melamine in a wide range of food matrices, including several milk-based products. The method involves an extraction with aqueous 1% trichloroacetic acid before injection of the 10-fold diluted extract into the liquid chromatography-electrospray tandem mass spectrometry (LC-ESI-MS/MS) system, using labelled melamine as the internal standard. As melamine is present in aqueous media in the cationic form, chromatographic separation in reversed-phase LC requires the use of anionic ion-pair reagents, such as tridecafluoroheptanoic acid (THFA). This allows satisfactory chromatographic retention and peak shape in all the types of food samples investigated. The method has been validated in six food matrices (biscuit, dry pasta and four milk-based products) by means of recovery experiments on samples spiked at 1 and 5 mg kg⁻¹. Average recoveries (n = 5) ranged from 77% to 100%, with excellent precision (RSDs lower than 5%) and limits of detection between 0.01 and 0.1 mg kg⁻¹. In addition, the accuracy and robustness of the method were proven in different soya-based matrices by means of quality control (QC) sample analysis. QC recoveries, at 1 and 2.5 mg kg⁻¹, were satisfactory, ranging from 79% to 110%. The method developed in this work has been applied to the determination of melamine in different types of food samples. All detections were confirmed by acquiring two MS/MS transitions (127 > 85 for quantification; 127 > 68 for confirmation) and comparing their ion intensity ratio with that of reference standards. Accuracy of the method was also assessed by applying it to a milk-based product and a baking mix material as part of an EU proficiency test, in which highly satisfactory results were obtained.

  7. Determination of melamine in milk-based products and other food and beverage products by ion-pair liquid chromatography-tandem mass spectrometry

    International Nuclear Information System (INIS)

    Ibanez, Maria; Sancho, Juan V.; Hernandez, Felix

    2009-01-01

    This paper describes a fast method for the sensitive and selective determination of melamine in a wide range of food matrices, including several milk-based products. The method involves an extraction with aqueous 1% trichloroacetic acid before injection of the 10-fold diluted extract into the liquid chromatography-electrospray tandem mass spectrometry (LC-ESI-MS/MS) system, using labelled melamine as the internal standard. As melamine is present in aqueous media in the cationic form, chromatographic separation in reversed-phase LC requires the use of anionic ion-pair reagents, such as tridecafluoroheptanoic acid (THFA). This allows satisfactory chromatographic retention and peak shape in all the types of food samples investigated. The method has been validated in six food matrices (biscuit, dry pasta and four milk-based products) by means of recovery experiments on samples spiked at 1 and 5 mg kg⁻¹. Average recoveries (n = 5) ranged from 77% to 100%, with excellent precision (RSDs lower than 5%) and limits of detection between 0.01 and 0.1 mg kg⁻¹. In addition, the accuracy and robustness of the method were proven in different soya-based matrices by means of quality control (QC) sample analysis. QC recoveries, at 1 and 2.5 mg kg⁻¹, were satisfactory, ranging from 79% to 110%. The method developed in this work has been applied to the determination of melamine in different types of food samples. All detections were confirmed by acquiring two MS/MS transitions (127 > 85 for quantification; 127 > 68 for confirmation) and comparing their ion intensity ratio with that of reference standards. Accuracy of the method was also assessed by applying it to a milk-based product and a baking mix material as part of an EU proficiency test, in which highly satisfactory results were obtained.
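    The recovery and RSD figures quoted above follow from simple descriptive statistics. As an illustration only (not the authors' code, and using hypothetical replicate values), a QC level spiked at a known concentration can be summarized as:

```python
from statistics import mean, stdev

def recovery_stats(measured, nominal):
    """Mean recovery (%) and RSD (%) for replicate results of a QC sample
    spiked at the known concentration `nominal` (e.g. in mg/kg)."""
    recoveries = [100.0 * m / nominal for m in measured]  # per-replicate recovery
    mean_rec = mean(recoveries)
    rsd = 100.0 * stdev(recoveries) / mean_rec            # relative standard deviation
    return mean_rec, rsd

# Hypothetical replicate results (mg/kg) for a 1 mg/kg spike, n = 5
mean_rec, rsd = recovery_stats([0.95, 1.02, 0.98, 1.00, 0.97], nominal=1.0)
```

    A method would pass acceptance criteria like those reported here when the mean recovery falls in a range such as 77-110% and the RSD stays below 5%.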

  8. References on EPA Quality Assurance Project Plans

    Science.gov (United States)

    Provides requirements for the conduct of quality management practices, including quality assurance (QA) and quality control (QC) activities, for all environmental data collection and environmental technology programs performed by or for this Agency.

  9. The DSM-5 Dimensional Anxiety Scales in a Dutch non-clinical sample: psychometric properties including the adult separation anxiety disorder scale.

    Science.gov (United States)

    Möller, Eline L; Bögels, Susan M

    2016-09-01

    With DSM-5, the American Psychiatric Association encourages complementing categorical diagnoses with dimensional severity ratings. We therefore examined the psychometric properties of the DSM-5 Dimensional Anxiety Scales, a set of brief dimensional scales that are consistent in content and structure and assess DSM-5-based core features of anxiety disorders. Participants (285 males, 255 females) completed the DSM-5 Dimensional Anxiety Scales for social anxiety disorder, generalized anxiety disorder, specific phobia, agoraphobia, and panic disorder that were included in previous studies on the scales, and also for separation anxiety disorder, which is included in the DSM-5 chapter on anxiety disorders. Moreover, they completed the Screen for Child Anxiety Related Emotional Disorders Adult version (SCARED-A). The DSM-5 Dimensional Anxiety Scales demonstrated high internal consistency, and the scales correlated significantly and substantially with corresponding SCARED-A subscales, supporting convergent validity. Separation anxiety appeared present among adults, supporting the DSM-5 recognition of separation anxiety as an anxiety disorder across the life span. To conclude, the DSM-5 Dimensional Anxiety Scales are a valuable tool to screen for specific adult anxiety disorders, including separation anxiety. Research in more diverse and clinical samples with anxiety disorders is needed. © 2016 The Authors International Journal of Methods in Psychiatric Research Published by John Wiley & Sons Ltd.

  10. New opportunities for the enhanced NAA services through the research reactor coalitions and networks

    International Nuclear Information System (INIS)

    Danas Ridikas; Pablo Adelfang; Kevin Alldred; Marta Ferrari

    2012-01-01

    Although the number of research reactors (RRs) is steadily decreasing, more than half of the operational RRs are still heavily underutilized and, in most cases, underfunded. The decreasing and rather old fleet of RRs needs to ensure the provision of useful services to the community, in some cases with adequate revenue generation for reliable, safe and secure facility management and operations. Enhancement of the utilization of low- and medium-power research reactors is often pursued by increasing neutron activation analysis (NAA) activities. In this paper we present the strategy and concrete actions by which NAA, one of the most popular RR applications, can contribute to the above goals, in particular through (a) RR coalitions and networks, (b) implementation of automation at different stages of NAA, (c) QA/QC, including skills improvement of the personnel involved, and (d) dedicated proficiency tests performed by a number of targeted analytical laboratories. We also show that, despite the IAEA's efforts, some NAA laboratories still perform badly in proficiency tests, have no formal QA/QC procedures in place, have not implemented automation to process large numbers of samples, or lack clear marketing strategies. Some concrete actions are proposed and outlined to address these issues in the near future. (author)

  11. Simplifying sample pretreatment: application of dried blood spot (DBS) method to blood samples, including postmortem, for UHPLC-MS/MS analysis of drugs of abuse.

    Science.gov (United States)

    Odoardi, Sara; Anzillotti, Luca; Strano-Rossi, Sabina

    2014-10-01

    The complexity of biological matrices, such as blood, requires the development of suitably selective and reliable sample pretreatment procedures prior to instrumental analysis. A method has been developed for the analysis of drugs of abuse and their metabolites from different chemical classes (opiates, methadone, fentanyl and analogues, cocaine, amphetamines and amphetamine-like substances, ketamine, LSD) in human blood using dried blood spots (DBS) and subsequent UHPLC-MS/MS analysis. DBS extraction required only 100 μL of sample, to which the internal standards were added; three droplets (30 μL each) of this solution were then spotted on the card, left to dry for 1 h, punched, and extracted with methanol containing 0.1% formic acid. The supernatant was evaporated, and the residue was reconstituted in 100 μL of water with 0.1% formic acid and injected into the UHPLC-MS/MS system. The method was validated for the following parameters: LOD and LOQ, linearity, precision, accuracy, matrix effect and dilution integrity. LODs were 0.05-1 ng/mL and LOQs were 0.2-2 ng/mL. The method showed satisfactory linearity for all substances, with determination coefficients always higher than 0.99. Intra- and inter-day precision, accuracy, matrix effect and dilution integrity were acceptable for all the studied substances. The addition of internal standards before DBS extraction and the deposition of a fixed volume of blood on the filter cards ensured accurate quantification of the analytes. The validated method was then applied to authentic postmortem blood samples. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  12. 7 CFR 275.21 - Quality control review reports.

    Science.gov (United States)

    2010-01-01

    ... terminals, the State agency shall submit the results of each QC review in a format specified by FNS. Upon... in the individual case records, or legible copies of that material, as well as legible hard copies of... selection and completion on the Form FNS-248, Status of Sample Selection and Completion or other format...

  13. Quality Control Assessment of Radiology Devices in Kerman Province, Iran

    Directory of Open Access Journals (Sweden)

    Zahra Jomehzadeh

    2016-03-01

    Introduction: Application of quality control (QC) programs at diagnostic radiology departments is of great significance for the optimization of image quality and the reduction of patient dose. The main objective of this study was to perform QC tests on stationary radiographic X-ray machines installed in 14 hospitals of Kerman province, Iran. Materials and Methods: In this cross-sectional study, QC tests were performed on 28 conventional radiographic X-ray units in Kerman governmental hospitals, based on the protocols and criteria recommended by the Atomic Energy Organization of Iran (AEOI), using a calibrated Gammex QC kit. Each section of the QC kit incorporated different models. Results: Based on the findings, kVp accuracy, kVp reproducibility, timer accuracy, timer reproducibility, exposure reproducibility, mA/timer linearity, and half-value layer were not within the acceptable limits in 25%, 4%, 29%, 18%, 11%, 12%, and 7% of the evaluated units (n=28), respectively. Conclusion: As the radiographic X-ray equipment in Kerman province is relatively old and carries a high workload, it is recommended that the AEOI modify its current policies by increasing the frequency of QC testing to at least once a year.

  14. New insight into the comparative power of quality-control rules that use control observations within a single analytical run.

    Science.gov (United States)

    Parvin, C A

    1993-03-01

    The error detection characteristics of quality-control (QC) rules that use control observations within a single analytical run are investigated. Unlike the evaluation of QC rules that span multiple analytical runs, most of the fundamental results regarding the performance of QC rules applied within a single analytical run can be obtained from statistical theory, without the need for simulation studies. The case of two control observations per run is investigated for ease of graphical display, but the conclusions can be extended to more than two control observations per run. Results are summarized in a graphical format that offers many interesting insights into the relations among the various QC rules. The graphs provide heuristic support to the theoretical conclusions that no QC rule is best under all error conditions, but the multirule that combines the mean rule and a within-run standard deviation rule offers an attractive compromise.
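    The kind of within-run comparison the abstract describes can be reproduced from normal theory alone. The sketch below is an illustration, not code from the paper: it assumes Gaussian control observations expressed in SD units and contrasts a single-value rule with a mean rule whose limits are matched to the same false-rejection rate.

```python
from math import sqrt
from statistics import NormalDist

Z = NormalDist()  # in-control observations ~ N(0, 1) in SD units

def power_single(shift, n=2, k=3.0):
    """P(at least one of n control observations falls outside +/- k SD,
    given a systematic error of `shift` SD)."""
    p_in = Z.cdf(k - shift) - Z.cdf(-k - shift)  # one observation stays inside
    return 1.0 - p_in ** n

def power_mean(shift, n=2, k=3.0):
    """Power of a mean rule with limits chosen to match the
    false-rejection rate of the single-value rule above."""
    p0 = power_single(0.0, n, k)            # matched false-rejection rate
    limit = Z.inv_cdf(1.0 - p0 / 2.0)       # two-sided limit for sqrt(n)*mean
    z = sqrt(n) * shift                     # standardized mean under the shift
    return (1.0 - Z.cdf(limit - z)) + Z.cdf(-limit - z)
```

    For a 2 SD systematic shift with n = 2, the matched mean rule detects the error in roughly half of runs versus under a third for the single-value rule, consistent with the paper's conclusion that mean rules are strong against systematic error while no single rule is best under all error conditions.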

  15. Quality control in urinalysis.

    Science.gov (United States)

    Takubo, T; Tatsumi, N

    1999-01-01

    Quality control (QC) has been introduced in laboratories, and QC surveys in urinalysis have been performed by the College of American Pathologists, the Japanese Association of Medical Technologists, the Osaka Medical Association, and by manufacturers. A QC survey of urinalysis on synthetic urine using a reagent strip and an instrument made by the same manufacturer, and using an automated urine cell analyser, produced satisfactory agreement among laboratories. A QC survey of urinalysis on synthetic urine using reagent strips and instruments from various manufacturers revealed differences in the determined values among manufacturers, and between manual and automated methods, because the reagent strips and instruments each have different characteristics. A QC photo survey of urinalysis based on microscopic photos of urine sediment constituents revealed differences in the identification of cells among laboratories. These results indicate the need to standardize the reagent strip method, the manual and automated methods, and the synthetic urine.

  16. Effect of quality control implementation on image quality of radiographic films and irradiation doses to patients

    International Nuclear Information System (INIS)

    Cheng Yuxi; Zhou Qipu; Ge Lijuan; Hou Changsong; Qi Xuesong; Yue Baorong; Wang Zuoling; Wei Kedao

    1999-01-01

    Objective: To study the changes in the image quality of radiographic films and the radiation doses to patients after quality control (QC) implementation. Methods: The entrance surface doses (ESD) to patients were measured with TLD, and the image quality of radiographic films was evaluated on the basis of the CEC image quality criteria. Results: The ESD to patients were significantly reduced after QC implementation (P<0.05), and the post-QC image quality was significantly improved in chest PA, lumbar spine AP and pelvis AP views (P<0.01 or P<0.05). Conclusion: A significantly reduced radiation dose with improved image quality can be obtained by QC implementation.

  17. Mixed-mode crack tip loading and crack deflection in 1D quasicrystals

    Science.gov (United States)

    Wang, Zhibin; Scheel, Johannes; Ricoeur, Andreas

    2016-12-01

    Quasicrystals (QC) are a new class of materials besides crystals and amorphous solids, and have attracted much attention from researchers since their discovery. This paper presents a generalized fracture theory including the J-integral and crack closure integrals, relations between J1, J2 and the stress intensity factors, as well as the implementation of the near-tip stress and displacement solutions for 1D QC. Different crack deflection criteria, i.e. the J-integral and maximum circumferential stress criteria, are investigated for mixed-mode loading conditions accounting for phonon-phason coupling. One focus is the influence of phason stress intensity factors on crack deflection angles.

  18. An elementary components of variance analysis for multi-center quality control

    International Nuclear Information System (INIS)

    Munson, P.J.; Rodbard, D.

    1977-01-01

    The serious variability of RIA results from different laboratories indicates the need for multi-laboratory collaborative quality control (QC) studies. Statistical analysis methods for such studies using an 'analysis of variance with components of variance estimation' are discussed. This technique allocates the total variance into components corresponding to between-laboratory, between-assay, and residual or within-assay variability. Components of variance analysis also provides an intelligent way to combine the results of several QC samples run at different levels, from which we may decide whether any component varies systematically with dose level; if not, pooling of estimates becomes possible. We consider several possible relationships of standard deviation to the laboratory mean. Each relationship corresponds to an underlying statistical model and an appropriate analysis technique. Tests for homogeneity of variance may be used to determine whether an appropriate model has been chosen, although the exact functional relationship of standard deviation to laboratory mean may be difficult to establish. Appropriate graphical display of the data aids visual understanding. A plot of ranked standard deviation vs. ranked laboratory mean is a convenient way to summarize a QC study; it also allows determination of the rank correlation, which indicates a net relationship of variance to laboratory mean. (orig.)
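    The allocation of total variance into between-laboratory, between-assay, and within-assay components can be sketched with method-of-moments estimators for a balanced nested design. This is an illustration of the general technique, not the authors' procedure; it assumes every laboratory ran the same number of assays with the same number of replicates per assay.

```python
from statistics import mean

def variance_components(data):
    """Variance components for a balanced nested design:
    data[lab][assay][replicate] -> QC sample results.
    Returns (between_lab, between_assay_within_lab, within_assay)."""
    L, A, R = len(data), len(data[0]), len(data[0][0])
    assay_means = [[mean(assay) for assay in lab] for lab in data]
    lab_means = [mean(ms) for ms in assay_means]
    grand = mean(lab_means)

    # Mean squares from the nested ANOVA decomposition
    ms_lab = A * R * sum((m - grand) ** 2 for m in lab_means) / (L - 1)
    ms_assay = R * sum((assay_means[i][j] - lab_means[i]) ** 2
                       for i in range(L) for j in range(A)) / (L * (A - 1))
    ms_err = sum((x - assay_means[i][j]) ** 2
                 for i in range(L) for j in range(A)
                 for x in data[i][j]) / (L * A * (R - 1))

    # Method-of-moments estimates, truncated at zero
    sigma2_within = ms_err
    sigma2_assay = max(0.0, (ms_assay - ms_err) / R)
    sigma2_lab = max(0.0, (ms_lab - ms_assay) / (A * R))
    return sigma2_lab, sigma2_assay, sigma2_within
```

    With real QC data the three components can be compared across dose levels to decide, as the abstract suggests, whether any component varies systematically with dose and whether pooling is justified.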

  19. Results-driven approach to improving quality and productivity

    Science.gov (United States)

    John Dramm

    2000-01-01

    Quality control (QC) programs often do not realize their full potential. Elaborate and expensive QC programs can easily get sidetracked by the process of building a program, with promises of "Someday, this will all pay off." Training employees in QC methods is no guarantee that quality will improve. Several documented cases show that such activity-centered efforts...

  20. Quantitative determination of BAF312, a S1P-R modulator, in human urine by LC-MS/MS: prevention and recovery of lost analyte due to container surface adsorption.

    Science.gov (United States)

    Li, Wenkui; Luo, Suyi; Smith, Harold T; Tse, Francis L S

    2010-02-15

    Analyte loss due to non-specific binding, especially container surface adsorption, is not uncommon in the quantitative analysis of urine samples. In developing a sensitive LC-MS/MS method for the determination of a drug candidate, BAF312, in human urine, a simple procedure was outlined for the identification, confirmation and prevention of analyte non-specific binding to a container surface, and for recovering the 'non-specific loss' of an analyte, provided the original urine samples have not been transferred. Non-specific binding or container surface adsorption can be quickly identified by using freshly spiked urine calibration standards and pre-pooled QC samples during an LC-MS/MS feasibility run. The resulting low recovery of an analyte in urine samples can be prevented through the use of additives, such as the non-ionic surfactant Tween-80, CHAPS and others, added to the container prior to urine sample collection. If the urine samples have not been transferred from the bulk container, the 'non-specific binding' of an analyte to the container surface can be reversed by the addition of a specified amount of CHAPS, Tween-80 or bovine serum albumin, followed by appropriate mixing. Among these agents, Tween-80 is the most cost-effective. beta-Cyclodextrin may be suitable for stabilizing the analyte of interest in urine via pre-treatment of the matrix with the agent; however, post-addition of beta-cyclodextrin to untreated urine samples does not recover the analyte 'lost' to non-specific binding or container surface adsorption. In the case of BAF312, a dynamic range of 0.0200-20.0 ng/mL in human urine was validated, with overall accuracy and precision for QC sample results ranging from -3.2 to 5.1% (bias) and 3.9 to 10.2% (CV), respectively. Pre- and post-addition of 0.5% (v/v) Tween-80 to the container provided excellent overall analyte recovery and minimal MS signal suppression when liquid-liquid extraction combined with an isocratic LC separation was employed. The

  1. Main factors causing intergranular and quasi-cleavage fractures at hydrogen-induced cracking in tempered martensitic steels

    Science.gov (United States)

    Kurokawa, Ami; Doshida, Tomoki; Hagihara, Yukito; Suzuki, Hiroshi; Takai, Kenichi

    2018-05-01

    Although intergranular (IG) and quasi-cleavage (QC) fractures have been widely recognized as typical fracture modes of hydrogen-induced cracking in high-strength steels, the main causal factor has remained unclear. In the present study, the dependence of this factor on hydrogen content was examined through the fracture mode transition from QC to IG at the crack initiation site in tempered martensitic steels. Two tempered martensitic steels were prepared to vary the grain-boundary cohesive force through different precipitation states of Fe3C on the prior austenite (γ) grain boundaries. The high-Si (H-Si) steel has a small amount of Fe3C on the prior austenite grain boundaries, whereas the low-Si (L-Si) steel has a large amount of Fe3C sheets on the grain boundaries. The fracture modes and initiation sites were observed using FE-SEM (field emission scanning electron microscopy). The crack initiation sites of the H-Si steel showed QC fracture at the notch tip under various hydrogen contents, while the crack initiation of the L-Si steel changed from QC fracture at the notch tip to QC and IG fractures originating approximately 10 µm ahead of the notch tip with increasing hydrogen content. For the L-Si steel, two possibilities are considered: either the QC or the IG fracture occurred first, or the QC and IG fractures occurred simultaneously. Furthermore, the principal stress and equivalent plastic strain distributions near the notch tip were calculated with FEM (finite element method) analysis. The plastic strain was maximal at the notch tip, and the principal stress was maximal at approximately 10 µm from the notch tip. The positions of QC and IG fracture initiation observed using FE-SEM correspond to the positions of maximum strain and maximum stress obtained with FEM, respectively. These findings indicate that the main factors causing hydrogen-induced cracking differ between QC and IG fractures.

  2. Microsystem enabled photovoltaic modules and systems

    Science.gov (United States)

    Nielson, Gregory N; Sweatt, William C; Okandan, Murat

    2015-05-12

    A microsystem enabled photovoltaic (MEPV) module including: an absorber layer; a fixed optic layer coupled to the absorber layer; a translatable optic layer; a translation stage coupled between the fixed and translatable optic layers; and a motion processor electrically coupled to the translation stage to control motion of the translatable optic layer relative to the fixed optic layer. The absorber layer includes an array of photovoltaic (PV) elements. The fixed optic layer includes an array of quasi-collimating (QC) micro-optical elements designed and arranged to couple incident radiation from an intermediate image formed by the translatable optic layer into one of the PV elements such that it is quasi-collimated. The translatable optic layer includes an array of focusing micro-optical elements corresponding to the QC micro-optical element array. Each focusing micro-optical element is designed to produce a quasi-telecentric intermediate image from substantially collimated radiation incident within a predetermined field of view.

  3. Comparative performance evaluation of a new a-Si EPID that exceeds quad high-definition resolution.

    Science.gov (United States)

    McConnell, Kristen A; Alexandrian, Ara; Papanikolaou, Niko; Stathakis, Sotiri

    2018-01-01

    Electronic portal imaging devices (EPIDs) are an integral part of the radiation oncology workflow for treatment setup verification. Several commercial EPID implementations are currently available, each with varying capabilities. To standardize performance evaluation, Task Group Report 58 (TG-58) and TG-142 outline specific image quality metrics to be measured. A LinaTech Image Viewing System (IVS), with the highest commercially available pixel matrix (2688x2688 pixels), was independently evaluated and compared to an Elekta iViewGT (1024x1024 pixels) and a Varian aSi-1000 (1024x768 pixels) using a PTW EPID QC Phantom. The IVS, iViewGT, and aSi-1000 were each used to acquire 20 images of the PTW QC Phantom, which was placed on the couch and aligned at isocenter. The images were exported and analyzed using the epidSoft image quality assurance (QA) software. The reported metrics were signal linearity, isotropy of signal linearity, signal-to-noise ratio (SNR), low-contrast resolution, and high-contrast resolution. These values were compared between the three EPID solutions. The computed metrics were comparable across the EPID solutions, with the IVS outperforming the aSi-1000 and iViewGT in the low- and high-contrast resolution analysis. The performance of the three commercial EPID solutions has thus been quantified, evaluated, and compared using results from the PTW QC Phantom. The IVS outperformed the other panels in low- and high-contrast resolution, but to fully realize its benefits, the choice of monitor on which the high-resolution images are viewed is important to prevent down-sampling and loss of visible resolution.
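    Of the metrics listed, SNR is the simplest to reproduce. A minimal sketch follows, assuming a uniform region of interest supplied as a flat list of pixel values; this is an illustration of the metric, not the epidSoft implementation.

```python
from statistics import mean, pstdev

def roi_snr(pixels):
    """SNR of a nominally uniform region of interest:
    mean signal divided by the pixel-to-pixel noise (population SD)."""
    return mean(pixels) / pstdev(pixels)

# Hypothetical pixel values from a flat-field ROI
snr = roi_snr([9, 11, 9, 11])
```

    In practice the ROI is taken from a flat-field exposure of the phantom, so that the standard deviation reflects noise rather than anatomy or test-object structure.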

  4. DTIPrep: Quality Control of Diffusion-Weighted Images

    Directory of Open Access Journals (Sweden)

    Ipek eOguz

    2014-01-01

    In the last decade, diffusion MRI (dMRI) studies of the human and animal brain have been used to investigate a multitude of pathologies and drug-related effects in neuroscience research. Study after study identifies white matter (WM) degeneration as a crucial biomarker for all these diseases, and the tool of choice for studying WM is dMRI. However, dMRI has an inherently low signal-to-noise ratio, and its acquisition requires a relatively long scan time; in fact, the high loads required occasionally stress scanner hardware past the point of physical failure. As a result, many types of artifacts compromise the quality of diffusion imagery. Using these complex scans containing artifacts without quality control (QC) can result in considerable error and bias in the subsequent analysis, negatively affecting the results of research studies using them. Nevertheless, dMRI QC remains an under-recognized issue in the dMRI community, as no user-friendly tools are commonly available to address it comprehensively; consequently, current dMRI studies often do a poor job of QC. Thorough QC of diffusion MRI will reduce measurement noise and improve reproducibility and sensitivity in neuroimaging studies, allowing researchers to exploit the power of the dMRI technique more fully and ultimately advancing neuroscience. Therefore, in this manuscript, we present our open-source software, DTIPrep, as a unified, user-friendly platform for thorough quality control of dMRI data. The artifacts handled include those caused by eddy currents, head motion, bed vibration and pulsation, and venetian blind effects, as well as slice-wise and gradient-wise intensity inconsistencies. This paper summarizes the basic set of DTIPrep features described earlier and focuses on newly added capabilities related to directional artifacts and bias analysis.

  5. Localization of ascorbic acid, ascorbic acid oxidase, and glutathione in roots of Cucurbita maxima L.

    Science.gov (United States)

    Liso, Rosalia; De Tullio, Mario C; Ciraci, Samantha; Balestrini, Raffaella; La Rocca, Nicoletta; Bruno, Leonardo; Chiappetta, Adriana; Bitonti, Maria Beatrice; Bonfante, Paola; Arrigoni, Oreste

    2004-12-01

    To understand the function of ascorbic acid (ASC) in root development, the distributions of ASC, ASC oxidase, and glutathione (GSH) were investigated in cells and tissues of the root apex of Cucurbita maxima. ASC was regularly distributed in the cytosol of almost all root cells, with the exception of quiescent centre (QC) cells. ASC also occurred at the surface of the nuclear membrane and, correspondingly, in the nucleoli; no ASC could be observed in vacuoles. ASC oxidase was detected by immunolocalization mainly in cell walls and vacuoles. This enzyme was particularly abundant in the QC and in differentiating vascular tissues, and was absent in lateral root primordia. Administration of the ASC precursor L-galactono-gamma-lactone markedly increased the ASC content in all root cells, including the QC. Root treatment with the oxidized product of ASC, dehydroascorbic acid (DHA), also increased the ASC content, but caused ASC accumulation only in peripheral tissues, where DHA was apparently reduced at the expense of GSH. The different distribution patterns of ASC in different tissues and cell compartments reflect its possible roles in cell metabolism and root morphogenesis.

  6. Quality Control in Diagnostic Radiology: Experiences and Achievements

    International Nuclear Information System (INIS)

    Mohd Khalid Matori; Husaini Salleh; Muhammad Jamal Md Isa

    2015-01-01

    The Malaysian Nuclear Agency, through its Medical Physics Group, has been providing quality control (QC) services for medical X-ray apparatus used in diagnostic radiology to private clinics and hospitals since 1997. The Medical Physics Group's services are endorsed by the Malaysian Ministry of Health (MOH) and are in accordance with Malaysian Standard MS 838 and the Atomic Energy Licensing Act, 1984. Today, the scope of the testing services covers all types of medical X-ray apparatus. QC in diagnostic radiology is considered part of a quality assurance programme that provides accurate diagnostic information at the lowest cost and with the least radiation exposure of patients. The Medical Physics Group has faced many obstacles and gained much experience along the way. This paper discusses the experiences and achievements of providing the QC service from the early stages until now, so that they can be shared across the Malaysian Nuclear Agency. The results of quality assurance inspections of all types of medical X-ray apparatus conducted by the Malaysian Nuclear Agency are presented in brief. (author)

  7. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria
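The classic case the book covers, estimating a mean to within a margin of error E at a given confidence level, reduces to n = (zσ/E)². A minimal sketch of that calculation (the numeric example is illustrative, not taken from the book):

```python
import math

def sample_size_for_mean(sigma, margin, z=1.96):
    """Smallest n such that z * sigma / sqrt(n) <= margin (normal approximation)."""
    return math.ceil((z * sigma / margin) ** 2)

# e.g. population sd of 15 units, desired margin of +/-3 units, 95% confidence
n = sample_size_for_mean(sigma=15, margin=3)   # -> 97
```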

  8. Ab initio quantum chemistry for combustion

    International Nuclear Information System (INIS)

    Page, M.; Lengsfield, B.H.

    1991-01-01

    Advances in theoretical and computational methods, coupled with the rapid development of powerful and inexpensive computers, fuel the current rapid development in computational quantum chemistry (QC). Nowhere is this more evident than in the areas of QC most relevant to combustion: the description of bond breaking and rate phenomena. Although the development of faster computers with larger memories has had a major impact on the scope of problems that can be addressed with QC, the development of new theoretical techniques and capabilities is responsible for adding new dimensions to QC and has paved the way for the unification of QC electronic structure calculations with statistical and dynamical models of chemical reactions. These advances are stressed in this chapter. This paper describes past accomplishments selectively to set the stage for discussion of ideas or techniques that we believe will have significant impact on combustion research. Thus, the focus of the chapter is as much on the future as it is on the past.

  9. Quality control of flow cytometry data analysis for evaluation of minimal residual disease in bone marrow from acute leukemia patients during treatment

    DEFF Research Database (Denmark)

    Bjorklund, E.; Matinlauri, I.; Tierens, A.

    2009-01-01

    Low levels of leukemia cells in the bone marrow, minimal residual disease (MRD), are considered to be a powerful indicator of treatment response in acute lymphatic leukemia (ALL). A Nordic quality assurance program, aimed at standardization of the flow cytometry MRD analysis, has been established before implementation of MRD at cutoff level 10 as one of the stratifying parameters in the next Nordic Society of Pediatric Hematology and Oncology (NOPHO) treatment program for ALL. In 4 quality control (QC) rounds, 15 laboratories determined the MRD levels in 48 follow-up samples from 12 ALL patients treated according to NOPHO 2000. Analysis procedures were standardized. For each QC round a compact disc containing data in list-mode files was sent out and results were submitted to a central laboratory. At cutoff level 10, which will be applied for clinical decisions, laboratories obtained a high concordance (91…

  10. Developing Water Sampling Standards

    Science.gov (United States)

    Environmental Science and Technology, 1974

    1974-01-01

    Participants in the D-19 symposium on aquatic sampling and measurement for water pollution assessment were informed that determining the extent of waste water stream pollution is not a cut-and-dried procedure. Topics discussed include field sampling, representative sampling from storm sewers, suggested sampler features, and application of improved…

  11. Phase formation and stability of quasicrystal/α-Mg interfaces in the Mg–Cd–Yb system

    International Nuclear Information System (INIS)

    Ohhashi, S.; Suzuki, K.; Kato, A.; Tsai, A.P.

    2014-01-01

    Phase formation involving icosahedral quasicrystals (iQc) in the Mg–Cd–Yb system was investigated. The phase diagrams obtained revealed that the iQc is in equilibrium with either (Mg,Cd)2Yb or an α-Mg phase over a wide composition range at 673 K. A eutectic reaction, in which the melt decomposed to a rod-like lamellar structure consisting of iQc and α-Mg phases, was observed for Mg68Cd24Yb8 at 735 K. High-angle annular dark-field scanning transmission electron microscopy observation of the iQc in Mg96Cd3Yb1 verified the atomic positions of the Yb icosahedra and confirmed that the i-MgCdYb is isostructural to the i-CdYb. The formation of the eutectic structure is responsible for the high stability of the iQc/α-Mg interfaces because of good lattice matching, i.e., coincident interplanar spacings over several planes of the two phases. This coincidence in interplanar spacing was further confirmed in the real atomic structure, for which the twofold planes of the iQc and the [0 0 0 2] and [2 −1 −1 0] planes of α-Mg are the dominant factors determining the stability of the interfaces.

  12. Impact of errors in recorded compressed breast thickness measurements on volumetric density classification using volpara v1.5.0 software.

    Science.gov (United States)

    Waade, Gunvor Gipling; Highnam, Ralph; Hauge, Ingrid H R; McEntee, Mark F; Hofvind, Solveig; Denton, Erika; Kelly, Judith; Sarwar, Jasmine J; Hogg, Peter

    2016-06-01

    Mammographic density has been demonstrated to predict breast cancer risk. It has been proposed that it could be used for stratifying screening pathways and recommending additional imaging. Volumetric density tools use the recorded compressed breast thickness (CBT) of the breast, measured at the x-ray unit, in their calculation; however, the accuracy of the recorded thickness can vary. The aim of this study was to investigate whether inaccuracies in recorded CBT affect volumetric density classification and to examine whether the current quality control (QC) standard is sufficient for assessing mammographic density. Raw data from 52 digital screening mammograms were included in the study. For each image, the clinically recorded CBT was artificially increased and decreased in increments of 1 mm to simulate measurement error, until ±15% from the recorded CBT was reached. New images were created for each 1 mm step in thickness, resulting in a total of 974 images, which then had a Volpara density grade (VDG) and volumetric density percentage assigned. A change in VDG was observed in 38.5% (n = 20) of mammograms when applying ±15% error to the recorded CBT, and in 11.5% (n = 6) the change occurred within the QC standard's prescribed error of ±5 mm. The current QC standard of ±5 mm error in recorded CBT therefore creates the potential for error in mammographic density measurement. This may lead to inaccurate classification of mammographic density. The current QC standard for assessing mammographic density should be reconsidered.
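The mechanism behind the reported misclassification can be illustrated with a toy calculation: volumetric density is roughly the dense tissue volume divided by total breast volume (projected area × CBT), so an error in recorded CBT rescales the percentage and can push it across a grade threshold. All numbers and the grade cut-points below are illustrative assumptions, not Volpara's published values:

```python
def density_percent(dense_vol_cm3, area_cm2, cbt_cm):
    """Volumetric breast density (%) under a slab approximation:
    breast volume ~ projected area x compressed breast thickness."""
    return 100.0 * dense_vol_cm3 / (area_cm2 * cbt_cm)

def grade(pct):
    """Map a density % to a 4-level grade (cut-points are illustrative
    assumptions, not Volpara's published thresholds)."""
    for g, cut in enumerate((4.5, 7.5, 15.5), start=1):
        if pct < cut:
            return g
    return 4

# a 10% overestimate of CBT inflates breast volume and deflates density %,
# which here moves the case across a grade boundary
true_grade = grade(density_percent(62, 150, 5.5))          # grade 3
erred_grade = grade(density_percent(62, 150, 5.5 * 1.1))   # grade 2: misclassified
```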

  13. Protocol for the Quick Clinical study: a randomised controlled trial to assess the impact of an online evidence retrieval system on decision-making in general practice

    Directory of Open Access Journals (Sweden)

    Kidd Michael R

    2006-08-01

    Background: Online information retrieval systems have the potential to improve patient care, but there are few comparative studies of the impact of online evidence on clinicians' decision-making behaviour in routine clinical work. Methods/design: A randomized controlled parallel design is employed to assess the effectiveness of an online evidence retrieval system, Quick Clinical (QC), in improving clinical decision-making processes in general practice. Eligible clinicians are randomised either to receive or not to receive access to QC in their consulting rooms for 12 months. Participants complete pre- and post-trial surveys. Two hundred general practitioners are recruited. Participants must be registered to practice in Australia, have a computer with Internet access in their consulting room, and use electronic prescribing. Clinicians planning to retire or move to another practice within 12 months, or participating in any other clinical trial involving electronic extraction of prescription data, are excluded from the study. The primary end-point for the study is clinician acceptance and use of QC and the resulting change in decision-making behaviour. The study will examine prescribing patterns related to frequently prescribed medications for which there has been a recent significant shift in recommendations regarding their use based upon new evidence. Secondary outcome measures include self-reported changes in diagnosis, patient education, prescriptions written, investigations, and referrals. Discussion: A trial under experimental conditions is an effective way of examining the impact of using QC in routine general practice consultations.

  14. On sampling social networking services

    OpenAIRE

    Wang, Baiyang

    2012-01-01

    This article summarizes the existing methods for sampling social networking services and proposes a faster confidence interval for related sampling methods. It also compares common network sampling techniques.
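As a baseline for the kind of interval such sampling methods produce, the standard Wald interval for a proportion estimated from a simple random sample looks like this (a generic textbook construction, not the faster interval the article proposes):

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Wald confidence interval for a proportion from a simple random sample."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p - half, p + half

lo, hi = proportion_ci(50, 100)   # -> (0.402, 0.598) at 95% confidence
```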

  15. Quality Control Assessment of Radiology Devices in Kerman Province, Iran

    OpenAIRE

    Zahra Jomehzadeh; Ali Jomehzadeh; Mohammad Bagher Tavakoli

    2016-01-01

    Introduction Application of quality control (QC) programs at diagnostic radiology departments is of great significance for optimization of image quality and reduction of patient dose. The main objective of this study was to perform QC tests on stationary radiographic X-ray machines, installed in 14 hospitals of Kerman province, Iran. Materials and Methods In this cross-sectional study, QC tests were performed on 28 conventional radiographic X-ray units in Kerman governmental hospitals, based ...

  16. A Study of the Relationship of Geological Formation to the NORM

    International Nuclear Information System (INIS)

    Bursh, Talmage P.; Chriss, Derald

    1999-01-01

    Naturally Occurring Radioactive Materials (NORM) is a common and costly contaminant of produced waters associated with natural gas production and exploration. One way of combating this problem is to identify it beforehand. Our approach involves development of NORM prediction capabilities based on the geological environment. During quarter fifteen of this project, work has continued under the recently approved revisions. We have selected sampling sites and are awaiting samples for analysis. In addition, the QA/QC plans are in the final stages in anticipation of sample acquisition.

  17. SIMPATIQCO: a server-based software suite which facilitates monitoring the time course of LC-MS performance metrics on Orbitrap instruments.

    Science.gov (United States)

    Pichler, Peter; Mazanek, Michael; Dusberger, Frederico; Weilnböck, Lisa; Huber, Christian G; Stingl, Christoph; Luider, Theo M; Straube, Werner L; Köcher, Thomas; Mechtler, Karl

    2012-11-02

    While the performance of liquid chromatography (LC) and mass spectrometry (MS) instrumentation continues to increase, applications such as analyses of complete or near-complete proteomes and quantitative studies require constant and optimal system performance. For this reason, research laboratories and core facilities alike are recommended to implement quality control (QC) measures as part of their routine workflows. Many laboratories perform sporadic quality control checks. However, successive and systematic longitudinal monitoring of system performance would be facilitated by dedicated automatic or semiautomatic software solutions that aid an effortless analysis and display of QC metrics over time. We present the software package SIMPATIQCO (SIMPle AuTomatIc Quality COntrol) designed for evaluation of data from LTQ Orbitrap, Q-Exactive, LTQ FT, and LTQ instruments. A centralized SIMPATIQCO server can process QC data from multiple instruments. The software calculates QC metrics supervising every step of data acquisition from LC and electrospray to MS. For each QC metric the software learns the range indicating adequate system performance from the uploaded data using robust statistics. Results are stored in a database and can be displayed in a comfortable manner from any computer in the laboratory via a web browser. QC data can be monitored for individual LC runs as well as plotted over time. SIMPATIQCO thus assists the longitudinal monitoring of important QC metrics such as peptide elution times, peak widths, intensities, total ion current (TIC) as well as sensitivity, and overall LC-MS system performance; in this way the software also helps identify potential problems. The SIMPATIQCO software package is available free of charge.
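The abstract states that the software learns each QC metric's nominal range from the uploaded data using robust statistics. One common robust choice for such a band, used here purely as an assumption since the exact SIMPATIQCO estimator is not given, is median ± k · scaled MAD:

```python
import statistics

def nominal_range(history, k=3.0):
    """Robust 'nominal' band for a QC metric: median +/- k * scaled MAD.
    Outliers in the history barely move the band, unlike mean +/- k * sd."""
    med = statistics.median(history)
    mad = statistics.median(abs(x - med) for x in history)
    half = k * 1.4826 * mad   # 1.4826 scales MAD to ~sigma for normal data
    return med - half, med + half

def out_of_nominal(value, history, k=3.0):
    """Flag a new metric value that falls outside the learned band."""
    lo, hi = nominal_range(history, k)
    return not (lo <= value <= hi)
```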

  18. Genetic Sample Inventory - NRDA

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This database archives genetic tissue samples from marine mammals collected in the North-Central Gulf of Mexico from 2010-2015. The collection includes samples from...

  19. Quality control of next-generation sequencing library through an integrative digital microfluidic platform.

    Science.gov (United States)

    Thaitrong, Numrin; Kim, Hanyoup; Renzi, Ronald F; Bartsch, Michael S; Meagher, Robert J; Patel, Kamlesh D

    2012-12-01

    We have developed an automated quality control (QC) platform for next-generation sequencing (NGS) library characterization by integrating a droplet-based digital microfluidic (DMF) system with a capillary-based reagent delivery unit and a quantitative CE module. Using an in-plane capillary-DMF interface, a prepared sample droplet was actuated into position between the ground electrode and the inlet of the separation capillary to complete the circuit for an electrokinetic injection. Using a DNA ladder as an internal standard, the CE module with a compact LIF detector was capable of detecting dsDNA in the range of 5-100 pg/μL, suitable for the amount of DNA required by the Illumina Genome Analyzer sequencing platform. This DMF-CE platform consumes tenfold less sample volume than the current Agilent BioAnalyzer QC technique, preserving precious sample while providing necessary sensitivity and accuracy for optimal sequencing performance. The ability of this microfluidic system to validate NGS library preparation was demonstrated by examining the effects of limited-cycle PCR amplification on the size distribution and the yield of Illumina-compatible libraries, demonstrating that as few as ten cycles of PCR bias the size distribution of the library toward undesirable larger fragments. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
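Quantitation against a co-injected DNA ladder amounts to ratioing peak areas. A single-point internal-standard sketch, assuming a linear detector response (all numbers hypothetical):

```python
def quantify_dsdna(sample_peak_area, standard_peak_area, standard_conc_pg_per_ul):
    """Single-point internal-standard quantitation: concentration scales
    with the ratio of sample to standard peak areas (assumes linearity)."""
    return standard_conc_pg_per_ul * sample_peak_area / standard_peak_area

# hypothetical electropherogram peak areas and a 25 pg/uL ladder fragment
conc = quantify_dsdna(sample_peak_area=420.0,
                      standard_peak_area=300.0,
                      standard_conc_pg_per_ul=25.0)   # -> 35.0 pg/uL
```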

  20. Nuclear analytical methods in quality control of microanalysis

    International Nuclear Information System (INIS)

    Tian Weizhi

    2004-01-01

    Quantitative calibration and quality control have been a major bottleneck in microanalysis due to the lack of natural-matrix CRMs certified at sample sizes compatible with those of unknown samples. A solution is described to characterize sampling behavior for individual elements, so as to identify elements that are sufficiently homogeneous at stated sample size levels in given CRMs/RMs. By using a combination of several nuclear analytical techniques, INAA-EDXRF-μPIXE, sampling behavior for individual elements can be characterized at sample size levels from grams down to pg. Natural-matrix CRMs specifically for QC of microanalysis may thus be created. Additional information in the certificates of this new generation of CRMs is envisaged. (author)

  1. panelcn.MOPS: Copy-number detection in targeted NGS panel data for clinical diagnostics.

    Science.gov (United States)

    Povysil, Gundula; Tzika, Antigoni; Vogt, Julia; Haunschmid, Verena; Messiaen, Ludwine; Zschocke, Johannes; Klambauer, Günter; Hochreiter, Sepp; Wimmer, Katharina

    2017-07-01

    Targeted next-generation-sequencing (NGS) panels have largely replaced Sanger sequencing in clinical diagnostics. They allow for the detection of copy-number variations (CNVs) in addition to single-nucleotide variants and small insertions/deletions. However, existing computational CNV detection methods have shortcomings regarding accuracy, quality control (QC), incidental findings, and user-friendliness. We developed panelcn.MOPS, a novel pipeline for detecting CNVs in targeted NGS panel data. Using data from 180 samples, we compared panelcn.MOPS with five state-of-the-art methods. With panelcn.MOPS leading the field, most methods achieved comparably high accuracy. panelcn.MOPS reliably detected CNVs ranging in size from part of a region of interest (ROI), to whole genes, which may comprise all ROIs investigated in a given sample. The latter is enabled by analyzing reads from all ROIs of the panel, but presenting results exclusively for user-selected genes, thus avoiding incidental findings. Additionally, panelcn.MOPS offers QC criteria not only for samples, but also for individual ROIs within a sample, which increases the confidence in called CNVs. panelcn.MOPS is freely available both as R package and standalone software with graphical user interface that is easy to use for clinical geneticists without any programming experience. panelcn.MOPS combines high sensitivity and specificity with user-friendliness rendering it highly suitable for routine clinical diagnostics. © 2017 The Authors. Human Mutation published by Wiley Periodicals, Inc.
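panelcn.MOPS itself is built on a mixture-of-Poissons model, but the underlying read-depth idea can be sketched far more simply: normalize each region of interest (ROI) against control samples and flag coverage ratios far from 1. The thresholds and data below are illustrative assumptions, not the package's algorithm:

```python
def call_cnvs(sample_counts, control_counts, del_cut=0.75, dup_cut=1.25):
    """Naive per-ROI copy-number call from coverage ratios.

    sample_counts:  {roi: read count in the test sample}
    control_counts: {roi: [read counts, one per control sample]}
    """
    calls = {}
    for roi, reads in sample_counts.items():
        controls = sorted(control_counts[roi])
        ref = controls[len(controls) // 2]   # median control coverage
        ratio = reads / ref if ref else float("nan")
        if ratio < del_cut:
            calls[roi] = "deletion"
        elif ratio > dup_cut:
            calls[roi] = "duplication"
        else:
            calls[roi] = "normal"
    return calls
```

Real callers additionally correct for library size and GC content and model count noise explicitly; this sketch only conveys the ratio logic.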

  2. Uniform Sampling Table Method and its Applications II--Evaluating the Uniform Sampling by Experiment.

    Science.gov (United States)

    Chen, Yibin; Chen, Jiaxi; Chen, Xuan; Wang, Min; Wang, Wei

    2015-01-01

    A new method of uniform sampling is evaluated in this paper. Items and indexes were adopted to evaluate the rationality of the uniform sampling. The evaluation items included convenience of operation, uniformity of sampling site distribution, and accuracy and precision of measured results. The evaluation indexes included operational complexity, occupation rate of sampling sites in a row and column, relative accuracy of pill weight, and relative deviation of pill weight. They were obtained from three kinds of drugs of different shape and size by four kinds of sampling methods. Gray correlation analysis was adopted to make a comprehensive evaluation by comparison with the standard method. The experimental results showed that the convenience of the uniform sampling method was 1 (100%), the odds ratio of the occupation rate in a row and column was infinity, relative accuracy was 99.50-99.89%, reproducibility RSD was 0.45-0.89%, and the weighted incidence degree exceeded that of the standard method. Hence, the uniform sampling method is easy to operate, and the selected samples are distributed uniformly. The experimental results demonstrate that the uniform sampling method has good accuracy and reproducibility and can be put into use in drug analysis.

  3. Non-monotonicity and divergent time scale in Axelrod model dynamics

    Science.gov (United States)

    Vazquez, F.; Redner, S.

    2007-04-01

    We study the evolution of the Axelrod model for cultural diversity, a prototypical non-equilibrium process that exhibits rich dynamics and a dynamic phase transition between diversity and an inactive state. We consider a simple version of the model in which each individual possesses two features that can assume q possibilities. Within a mean-field description in which each individual has just a few interaction partners, we find a phase transition at a critical value q_c between an active, diverse state for q < q_c and a frozen state. For q ≲ q_c, the density of active links is non-monotonic in time and the asymptotic approach to the steady state is controlled by a time scale that diverges as (q − q_c)^(−1/2).
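A minimal sketch of the two-feature Axelrod dynamics and the active-link density it studies, on a ring lattice rather than the mean-field graph of the paper (illustrative only):

```python
import random

def axelrod_ring(n=50, q=3, f=2, steps=20000, seed=1):
    """Toy Axelrod model: n agents on a ring, f features with q traits each.

    At each step a random agent interacts with a random neighbour with
    probability equal to their cultural overlap, copying one differing
    feature. Returns the final density of 'active' links (partial overlap).
    """
    rng = random.Random(seed)
    agents = [[rng.randrange(q) for _ in range(f)] for _ in range(n)]

    for _ in range(steps):
        i = rng.randrange(n)
        j = (i + 1) % n if rng.random() < 0.5 else (i - 1) % n
        shared = [k for k in range(f) if agents[i][k] == agents[j][k]]
        differ = [k for k in range(f) if agents[i][k] != agents[j][k]]
        if shared and differ and rng.random() < len(shared) / f:
            agents[i][rng.choice(differ)] = agents[j][rng.choice(differ)] if False else agents[j][differ[rng.randrange(len(differ))]]

    # a link is active when neighbours share some, but not all, features
    active = sum(
        0 < sum(a == b for a, b in zip(agents[i], agents[(i + 1) % n])) < f
        for i in range(n)
    )
    return active / n
```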

  4. DNA fingerprinting: a quality control case study for human biospecimen authentication.

    Science.gov (United States)

    Kofanova, Olga A; Mathieson, William; Thomas, Gerry A; Betsou, Fotini

    2014-04-01

    This case study illustrates the usefulness of the DNA fingerprinting method in biobank quality control (QC) procedures and emphasizes the need for detailed and accurate record keeping during processing of biological samples. It also underlines the value of independent third-party assessment to identify points at which errors are most likely to have occurred when unexpected results are obtained from biospecimens.

  5. Lens Coupled Quantum Cascade Laser

    Science.gov (United States)

    Hu, Qing (Inventor); Lee, Alan Wei Min (Inventor)

    2013-01-01

    Terahertz quantum cascade (QC) devices are disclosed that can operate, e.g., in a range of about 1 THz to about 10 THz. In some embodiments, QC lasers are disclosed in which an optical element (e.g., a lens) is coupled to an output facet of the laser's active region to enhance coupling of the lasing radiation from the active region to an external environment. In other embodiments, terahertz amplifiers and tunable terahertz QC lasers are disclosed.

  6. Framework of Six Sigma implementation analysis on SMEs in Malaysia for information technology services, products and processes

    OpenAIRE

    Wong, Whee Yen

    2015-01-01

    For the past two decades, the majority of Malaysia's IT companies have been widely adopting a Quality Assurance (QA) approach as a basis for self-improvement and internal assessment in IT project management. Quality Control (QC) is a comprehensive top-down observation approach used to fulfill requirements for quality outputs, focusing on the evaluation of process outputs. However, in the Malaysian context, QC and the combination of QA and QC as quality improvement approaches ...

  7. Identification of trapped electron modes in frequency fluctuation spectra of fusion plasmas

    International Nuclear Information System (INIS)

    Arnichand, Hugo

    2015-01-01

    This thesis shows that the analysis of frequency fluctuation spectra can provide an additional experimental indication of the dominant mode. Depending on the plasma scenario, fluctuation spectra can display different frequency components: broadband spectra (Δf ∼ hundreds of kHz), which are always observed, whose amplitude is maximum at zero frequency and which are attributed to turbulence; coherent modes (Δf ∼ 1 kHz), which oscillate at a very well defined frequency and can, for example, be due to geodesic acoustic or magnetohydrodynamic (MHD) modes; and quasi-coherent (QC) modes (Δf ∼ tens of kHz), which oscillate at a rather well defined frequency but are reminiscent of broadband fluctuations. The fluctuation study performed in the plasma core region shows that fluctuation spectra in TEM-dominated regimes can be noticeably different from those in ITG-dominated regimes, as only TEM can induce QC modes. This finding was achieved by comparing fluctuation measurements with simulations. Measurements are made with a reflectometry diagnostic, a radar-like technique able to provide local indications of the density fluctuations occurring in the vicinity of the reflection layer. Frequency fluctuation spectra are inferred from a Fourier analysis of the reflectometry signal. First, the main properties of QC modes are characterized experimentally. Their normalized scale is estimated to k⊥ρ_i ≤ 1, their amplitude is ballooned on the low-field-side mid-plane, and they can be observed at many different radii. These indications are in agreement with what could be expected for ITG/TEM instabilities. Then reflectometry measurements are analyzed in Ohmic plasmas. QC modes are observed in the Linear Ohmic Confinement (LOC) regime dominated by TEM, whereas only broadband spectra are seen in the Saturated Ohmic Confinement (SOC) regime dominated by ITG. Frequency spectra from nonlinear gyrokinetic simulations show that TEM induce a narrow…

  8. A method to establish seismic noise baselines for automated station assessment

    Science.gov (United States)

    McNamara, D.E.; Hutt, C.R.; Gee, L.S.; Benz, H.M.; Buland, R.P.

    2009-01-01

    We present a method for quantifying station noise baselines and characterizing the spectral shape of out-of-nominal noise sources. Our intent is to automate this method in order to ensure that only the highest-quality data are used in rapid earthquake products at the NEIC. In addition, the station noise baselines provide a valuable tool to support the quality control of GSN and ANSS backbone data and metadata. The procedures addressed here are currently in development at the NEIC, and work is underway to understand how quickly changes from nominal can be observed and used within the NEIC processing framework. The spectral methods and software used to compute station baselines and described herein (PQLX) can be useful to both permanent and portable seismic station operators. Applications include: general seismic station and data quality control (QC), evaluation of instrument responses, assessment of near real-time communication system performance, characterization of site cultural noise conditions, evaluation of sensor vault design, and assessment of gross network capabilities (McNamara et al. 2005). Future PQLX development plans include incorporating station baselines for automated QC methods and automating station status report generation and notification based on user-defined QC parameters. The PQLX software is available through the USGS (http://earthquake.usgs.gov/research/software/pqlx.php) and IRIS (http://www.iris.edu/software/pqlx/).
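The baseline idea reduces to percentile envelopes over many power spectral density (PSD) estimates, one envelope value per period bin. A pure-Python sketch with a hypothetical data shape, not the PQLX implementation:

```python
def psd_baseline(psd_history, lo_pct=5, hi_pct=95):
    """Per-period-bin percentile envelope from a list of PSD curves (in dB).

    psd_history: list of equal-length lists, one PSD estimate per time window.
    Returns (low_curve, high_curve), the station's 'nominal' noise band.
    """
    def percentile(sorted_vals, pct):
        idx = int(round((pct / 100) * (len(sorted_vals) - 1)))
        return sorted_vals[idx]

    nbins = len(psd_history[0])
    low, high = [], []
    for b in range(nbins):
        col = sorted(curve[b] for curve in psd_history)
        low.append(percentile(col, lo_pct))
        high.append(percentile(col, hi_pct))
    return low, high

def out_of_nominal(psd, low, high):
    """Indices of period bins where a new PSD estimate leaves the baseline."""
    return [b for b, v in enumerate(psd) if not (low[b] <= v <= high[b])]
```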

  9. 40 CFR 98.364 - Monitoring and QA/QC requirements.

    Science.gov (United States)

    2010-07-01

    ... by Gas Chromatography (incorporated by reference see § 98.7). All gas composition monitors shall be calibrated prior to the first reporting year for biogas methane and carbon dioxide content using ASTM D1946... composition, temperature, and pressure measurements. These procedures include, but are not limited to...

  10. Simple and ultra-fast recognition and quantitation of compounded monoclonal antibodies: Application to flow injection analysis combined to UV spectroscopy and matching method.

    Science.gov (United States)

    Jaccoulet, E; Schweitzer-Chaput, A; Toussaint, B; Prognon, P; Caudron, E

    2018-09-01

    Compounding of monoclonal antibodies (mAbs) constantly increases in hospitals. Quality control (QC) of the compounded mAbs, based on quantification and identification, is required to prevent potential errors, and a fast method is needed to manage outpatient chemotherapy administration. A simple and ultra-fast (less than 30 s) method using flow injection analysis combined with a least-squares matching method from the analyzer software was developed and evaluated for the routine hospital QC of three compounded mAbs: bevacizumab, infliximab, and rituximab. The method was evaluated through qualitative and quantitative parameters. Preliminary analysis of the UV absorption and second-derivative spectra of the mAbs allowed us to adapt analytical conditions according to the therapeutic range of the mAbs. In terms of quantitative QC, linearity, accuracy, and precision were assessed as specified in ICH guidelines. Very satisfactory recovery was achieved and the RSD (%) of the intermediate precision was less than 1.1%. Qualitative analytical parameters were also evaluated in terms of specificity, sensitivity, and global precision through a confusion matrix. Results proved to be concentration- and mAb-dependent, and excellent (100%) specificity and sensitivity were reached within a specific concentration range. Finally, routine application to "real life" samples (n = 209) from different batches of the three mAbs complied with the specifications of the quality control, i.e., excellent identification (100%) and concentrations within ±15% of the target belonging to the calibration range. The successful use of the combination of second-derivative spectroscopy and the partial least squares matching method demonstrates the interest of FIA for the ultra-fast QC of mAbs after compounding. Copyright © 2018 Elsevier B.V. All rights reserved.
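At its core, the identification step compares a measured UV spectrum against a library of reference spectra by residual sum of squares. A minimal sketch with hypothetical spectra (the published method additionally works on second-derivative spectra via the analyzer's own matching software):

```python
def match_spectrum(sample, references):
    """Return (best_name, ssr): the library spectrum with the smallest
    sum of squared residuals after max-normalization."""
    def normalize(spec):
        peak = max(spec)
        return [v / peak for v in spec]

    s = normalize(sample)
    best_name, best_ssr = None, float("inf")
    for name, ref in references.items():
        r = normalize(ref)
        ssr = sum((a - b) ** 2 for a, b in zip(s, r))
        if ssr < best_ssr:
            best_name, best_ssr = name, ssr
    return best_name, best_ssr

refs = {  # hypothetical absorbance profiles over a few wavelengths
    "bevacizumab": [0.10, 0.55, 1.00, 0.40, 0.05],
    "rituximab":   [0.20, 0.90, 1.00, 0.70, 0.15],
}
name, _ = match_spectrum([0.21, 0.88, 1.02, 0.69, 0.16], refs)  # -> "rituximab"
```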

  11. Intersubband spectroscopy of ZnO/ZnMgO quantum wells grown on m-plane ZnO substrates for quantum cascade device applications (Conference Presentation)

    Science.gov (United States)

    Quach, Patrick; Jollivet, Arnaud; Isac, Nathalie; Bousseksou, Adel; Ariel, Frédéric; Tchernycheva, Maria; Julien, François H.; Montes Bajo, Miguel; Tamayo-Arriola, Julen; Hierro, Adrián.; Le Biavan, Nolwenn; Hugues, Maxime; Chauveau, Jean-Michel

    2017-03-01

    Quantum cascade (QC) lasers open new prospects for powerful sources operating at THz frequencies. Up to now, the best THz QC lasers have been based on intersubband emission in GaAs/AlGaAs quantum well (QW) heterostructures. The maximum operating temperature is 200 K, which is too low for widespread applications. This is due to the rather low LO-phonon energy (36 meV) of GaAs-based materials. Indeed, thermal activation allows a non-radiative path through electron-phonon interaction which destroys the population inversion. Wide band gap materials such as ZnO have been predicted to provide much higher operating temperatures because of the high value of their LO-phonon energy. However, despite some observations of intersubband absorption in c-plane ZnO/ZnMgO quantum wells, little is known about fundamental parameters such as the conduction band offset in such heterostructures. In addition, the internal field inherent to c-plane grown heterostructures is a handicap for the design of QC lasers and detectors. In this talk, we will review a systematic investigation of ZnO/ZnMgO QW heterostructures with various Mg contents and QW thicknesses grown by plasma molecular beam epitaxy on low-defect m-plane ZnO substrates. We will show that most samples exhibit TM-polarized intersubband absorption at room temperature linked either to bound-to-quasi-bound inter-miniband absorption or to bound-to-bound intersubband absorption depending on the Mg content of the barrier material. This systematic study allows, for the first time, an estimate of the conduction band offset of ZnO/ZnMgO heterostructures, opening prospects for the design of QC devices operating at THz frequencies. This work was supported by the European Union's Horizon 2020 research and innovation programme under grant agreement #665107.

  12. Multiple sample, radioactive particle counting apparatus

    International Nuclear Information System (INIS)

    Reddy, R.R.V.; Kelso, D.M.

    1978-01-01

    An apparatus is described for determining the radioactive particle count emitted from each of a number of radioactive samples. It includes means for modulating the information carried by the radioactive particles emitted from the samples, coded detecting means for sequentially detecting different coded combinations of the radioactive particles emitted from more than one but fewer than all of the samples, and means for processing the modulated information to derive the count for each sample. The apparatus includes a single light-emitting crystal next to a number of samples and an encoder belt sequentially movable between the crystal and the samples. The encoder belt has a coded array of apertures to produce correspondingly modulated light pulses from the crystal, and a photomultiplier tube converts the modulated light pulses to decodable electrical signals from which the respective sample counts are derived.
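Decoding such coded (multiplexed) measurements is a linear-algebra problem: with a known 0/1 code matrix M describing which samples each detector reading sees, and measured totals y, the per-sample counts x solve M x = y. A toy decoder; the code matrix and readings are invented for illustration, not taken from the patent:

```python
def decode_counts(code, totals):
    """Solve code @ x = totals by Gauss-Jordan elimination (pure Python)."""
    n = len(code)
    aug = [list(map(float, row)) + [float(totals[i])] for i, row in enumerate(code)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(aug[r][col]))  # partial pivot
        aug[col], aug[piv] = aug[piv], aug[col]
        for r in range(n):
            if r != col and aug[r][col]:
                factor = aug[r][col] / aug[col][col]
                aug[r] = [a - factor * p for a, p in zip(aug[r], aug[col])]
    return [aug[i][n] / aug[i][i] for i in range(n)]

# three samples, three readings taken through different aperture patterns
code = [[1, 1, 0],
        [0, 1, 1],
        [1, 0, 1]]
counts = decode_counts(code, [300, 500, 400])   # -> [100.0, 200.0, 300.0]
```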

  13. Measurement of the quantum capacitance from two-dimensional surface state of a topological insulator at room temperature

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Hyunwoo, E-mail: chw0089@gmail.com [Department of Electrical and Computer Engineering, University of Seoul, Seoul 02504 (Korea, Republic of); Kim, Tae Geun, E-mail: tgkim1@korea.ac.kr [School of Electrical Engineering, Korea University, Seoul 02841 (Korea, Republic of); Shin, Changhwan, E-mail: cshin@uos.ac.kr [Department of Electrical and Computer Engineering, University of Seoul, Seoul 02504 (Korea, Republic of)

    2017-06-15

    Highlights: • The quantum capacitance in a topological insulator (TI) at room temperature is directly revealed. • The physical origin of the quantum capacitance, the two-dimensional surface state of the TI, is experimentally validated. • Theoretically calculated results for the ideal quantum capacitance can well predict the experimental data. - Abstract: A topological insulator (TI) is a new kind of material that exhibits unique electronic properties owing to its topological surface state (TSS). Previous studies focused on the transport properties of the TSS, since it can be used as the active channel layer in metal-oxide-semiconductor field-effect transistors (MOSFETs). However, a TI with a negative quantum capacitance (QC) effect can be used in the gate stack of MOSFETs, thereby facilitating the creation of ultra-low-power electronics. Therefore, it is important to study the physics behind the QC in TIs in the absence of any external magnetic field, at room temperature. We fabricated a simple capacitor structure using a TI (TI-capacitor: Au-TI-SiO{sub 2}-Si), which shows clear evidence of QC at room temperature. In the capacitance-voltage (C-V) measurement, the total capacitance of the TI-capacitor increases in the accumulation regime, since QC is the dominant capacitive component in the series capacitor model (i.e., C{sub T}{sup −1} = C{sub Q}{sup −1} + C{sub SiO2}{sup −1}). Based on the QC model of two-dimensional electron systems, we quantitatively calculated the QC, and observed that the simulated C-V curve theoretically supports the conclusion that the QC of the TI-capacitor originates from electron–electron interaction in the two-dimensional surface state of the TI.
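    The series-capacitor model quoted in this record (C_T^-1 = C_Q^-1 + C_SiO2^-1) can be illustrated numerically. The sketch below uses arbitrary illustrative capacitance values, not measurements from the paper; it only shows why the smaller of the two capacitances dominates the total.

```python
# Series-capacitor model from the abstract: 1/C_T = 1/C_Q + 1/C_ox.
# Capacitance values are illustrative (arbitrary units), not data from the paper.

def series_capacitance(c_q, c_ox):
    """Total capacitance of quantum and oxide capacitances in series."""
    return 1.0 / (1.0 / c_q + 1.0 / c_ox)

c_ox = 1.0
for c_q in (0.1, 1.0, 10.0, 100.0):
    c_t = series_capacitance(c_q, c_ox)
    print(f"C_Q={c_q:6.1f}  C_T={c_t:.3f}")
# When C_Q is small it dominates C_T; as C_Q grows, C_T approaches C_ox,
# which is why the QC term becomes visible in the accumulation regime.
```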

  14. Sampling Operations on Big Data

    Science.gov (United States)

    2015-11-29

    Sampling Operations on Big Data. Vijay Gadepally, Taylor Herr, Luke Johnson, Lauren Milechin, Maja Milosavljevic, Benjamin A. Miller, Lincoln... ...process and disseminate information for discovery and exploration under real-time constraints. Common signal processing operations such as sampling and... ...categories. These include edge sampling methods, where edges are selected by a predetermined criterion, and snowball sampling methods, where algorithms start...
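    The two graph-sampling families named in this snippet can be sketched generically. The graph, seed set, and keep-probability below are toy values of my own choosing, not anything from the report; edge sampling keeps each edge by an independent coin flip, and snowball sampling expands wave by wave from seed vertices.

```python
import random

def edge_sample(edges, p, seed=0):
    """Edge sampling: keep each edge independently with probability p."""
    rng = random.Random(seed)
    return [e for e in edges if rng.random() < p]

def snowball_sample(adj, seeds, depth):
    """Snowball sampling: expand outward from seed vertices for `depth` waves."""
    visited = set(seeds)
    frontier = set(seeds)
    for _ in range(depth):
        nxt = set()
        for v in frontier:
            for u in adj.get(v, ()):
                if u not in visited:
                    visited.add(u)
                    nxt.add(u)
        frontier = nxt
    return visited

# Toy graph: a path 0-1-2-3-4
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(sorted(snowball_sample(adj, {0}, 2)))  # vertices within 2 hops of 0
```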

  15. Neuroprotective potential of quercetin in combination with piperine against 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine-induced neurotoxicity

    Directory of Open Access Journals (Sweden)

    Shamsher Singh

    2017-01-01

    Full Text Available 1-Methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP) is a neurotoxin that selectively damages dopaminergic neurons in the substantia nigra pars compacta and induces Parkinson's-like symptoms in rodents. Quercetin (QC) is a natural polyphenolic bioflavonoid with potent antioxidant and anti-inflammatory properties, but it lacks clinical appeal owing to low oral bioavailability. Piperine is a well-established bioavailability enhancer used pre-clinically to improve the bioavailability of antioxidants (e.g., quercetin). Therefore, the present study was designed to evaluate the neuroprotective potential of QC together with piperine against MPTP-induced neurotoxicity in rats. MPTP (100 μg/μL/rat, bilaterally) was injected intranigrally on days 1, 4 and 7 using a digital stereotaxic apparatus. QC (25 and 50 mg/kg, intragastrically) and QC (25 mg/kg, intragastrically) in combination with piperine (2.5 mg/kg, intragastrically) were administered daily for 14 days starting from day 8 after the 3rd injection of MPTP. On day 22, animals were sacrificed and the striatum was isolated for evaluation of oxidative stress parameters (thiobarbituric acid reactive substances, nitrite and glutathione), neuroinflammatory cytokines (interleukin-1β, interleukin-6, and tumor necrosis factor-α) and neurotransmitters (dopamine, norepinephrine, serotonin, gamma-aminobutyric acid, glutamate, 3,4-dihydroxyphenylacetic acid, homovanillic acid, and 5-hydroxyindoleacetic acid). Bilateral infusion of MPTP into the substantia nigra pars compacta led to significant motor deficits, as evidenced by impairments in locomotor activity and rotarod performance in the open field test and in grip strength and narrow beam walk performance. Both QC (25 and 50 mg/kg) and QC (25 mg/kg) in combination with piperine (2.5 mg/kg), in particular the combination therapy, significantly improved MPTP-induced behavioral abnormalities in rats, reversed the abnormal alterations of neurotransmitters in the striatum, and alleviated

  16. Eastward and northward components of ocean current and water temperature collected from moorings in the vicinity of Quinault Canyon in the North East Pacific Coast from 1981-10-02 to 1982-01-19 (NCEI Accession 0164026)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The University of Washington maintained 5 current meter moorings, QC811 through QC815 in and around Quinault Canyon. Current meters were all Aanderaa (AA)...

  17. Determination of water-extractable nonstructural carbohydrates, including inulin, in grass samples with high-performance anion exchange chromatography and pulsed amperometric detection.

    Science.gov (United States)

    Raessler, Michael; Wissuwa, Bianka; Breul, Alexander; Unger, Wolfgang; Grimm, Torsten

    2008-09-10

    The exact and reliable determination of carbohydrates in plant samples of different origin is of great importance with respect to plant physiology. Additionally, the identification and quantification of carbohydrates are necessary for the evaluation of the impact of these compounds on the biogeochemistry of carbon. To attain this goal, it is necessary to analyze a great number of samples with both high sensitivity and selectivity within a limited time frame. This paper presents a rugged and easy method that allows the isocratic chromatographic determination of 12 carbohydrates and sugar alcohols from one sample within 30 min. The method was successfully applied to a variety of plant materials with particular emphasis on perennial ryegrass samples of the species Lolium perenne. The method was easily extended to the analysis of the polysaccharide inulin after its acidic hydrolysis into the corresponding monomers without the need for substantial change of chromatographic conditions or even the use of enzymes. It therefore offers a fundamental advantage for the analysis of the complex mixture of nonstructural carbohydrates often found in plant samples.

  18. Quasi Cyclic Low Density Parity Check Code for High SNR Data Transfer

    Directory of Open Access Journals (Sweden)

    M. R. Islam

    2010-06-01

    Full Text Available An improved Quasi Cyclic Low Density Parity Check code (QC-LDPC) is proposed to reduce the complexity of the Low Density Parity Check code (LDPC) while obtaining similar performance. The proposed QC-LDPC presents an improved construction at high SNR with circulant sub-matrices. The proposed construction yields a performance gain of about 1 dB at a 0.0003 bit error rate (BER), and it is tested on 4 different decoding algorithms. The proposed QC-LDPC is compared with the existing QC-LDPC, and the simulation results show that the proposed approach outperforms the existing one at high SNR. Simulations are also performed varying the number of horizontal sub-matrices, and the results show that the parity check matrix with smaller horizontal concatenation shows better performance.
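    The circulant-sub-matrix construction that QC-LDPC codes are built from can be sketched generically. The base matrix of shift values and the expansion factor below are toy values, not the code design from this paper; each non-negative entry expands to a cyclically shifted identity block, and -1 marks an all-zero block.

```python
def circulant(size, shift):
    """size x size identity matrix cyclically right-shifted by `shift` columns."""
    return [[1 if (c - r) % size == shift else 0 for c in range(size)]
            for r in range(size)]

def qc_ldpc_H(base, size):
    """Expand a base matrix of shift values (-1 = all-zero block) into a full
    binary parity-check matrix H, stored as a list of rows."""
    rows = []
    for brow in base:
        blocks = [circulant(size, s) if s >= 0 else
                  [[0] * size for _ in range(size)] for s in brow]
        for r in range(size):
            rows.append([bit for blk in blocks for bit in blk[r]])
    return rows

# Toy base matrix: 2 x 3 blocks expanded with 4 x 4 circulants
H = qc_ldpc_H([[0, 1, -1], [2, 0, 3]], 4)
print(len(H), len(H[0]))  # 8 12
```

Because H is fully determined by the small base matrix, a QC-LDPC code needs far less storage than a random LDPC code of the same size, which is the complexity reduction the abstract refers to.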

  19. Developing Product Quality Control for Standardization of Tsetse Mass Production. Working Material

    International Nuclear Information System (INIS)

    2002-01-01

    The recent Pan-African Tsetse and Trypanosomosis Eradication Campaign (PATTEC) provides a mechanism within which SIT will be one of the major components of an integrated area-wide approach to the establishment of tsetse-fly-free areas. Currently, worldwide tsetse production is 1/40 of the requirement projected for 2006. To achieve this objective, it is essential that quality control (QC) measures suitable for the expanded production be in place; improved QC methodology has therefore become a top priority. Improvements in QC methodology will help ensure the attainment of these production goals, improve the quality of rearing, minimize production costs, and generate the trained QC and production staff required to successfully produce flies and monitor their quality and suitability for release. The proposed CRP is designed to address these issues.

  20. 222-S Laboratory Quality Assurance Plan. Revision 1

    International Nuclear Information System (INIS)

    Meznarich, H.K.

    1995-01-01

    This Quality Assurance Plan provides quality assurance (QA) guidance, regulatory QA requirements (e.g., 10 CFR 830.120), and quality control (QC) specifications for analytical services. This document follows the U.S. Department of Energy (DOE)-issued Hanford Analytical Services Quality Assurance Plan (HASQAP). In addition, this document meets the objectives of the Quality Assurance Program provided in WHC-CM-4-2, Section 2.1. Quality assurance elements required in the Guidelines and Specifications for Preparing Quality Assurance Program Plans (QAMS-004) and Interim Guidelines and Specifications for Preparing Quality Assurance Project Plans (QAMS-005) from the US Environmental Protection Agency (EPA) are covered throughout this document. A quality assurance index is provided in Appendix A. This document also provides and/or identifies the procedural information that governs laboratory operations. The personnel of the 222-S Laboratory and the Standards Laboratory, including managers, analysts, QA/QC staff, auditors, and support staff, shall use this document as guidance and instructions for their operational and quality assurance activities. Other organizations that conduct activities described in this document for the 222-S Laboratory shall follow this QA/QC document.

  1. Prospects for octopus rhodopsin utilization in optical and quantum computation

    International Nuclear Information System (INIS)

    Sivozhelezov, V.; Nicolini, A.

    2007-01-01

    Visual membranes of octopus, whose main component is the light-sensitive signal transducer octopus rhodopsin (octR), are extremely highly ordered, easily capture single photons, and are sensitive to light polarization, which shows their high potential for use as a QC detector. However, artificial membranes made of octR are neither sufficiently ordered nor stable, while the bacterial homolog of octR, bacteriorhodopsin (bR), which has the same topology as octR, forms artificial membranes that are both stable and ordered but lacks the optical properties important for optical QC. In this study, we investigate the structural basis for the ordering of the two proteins in membranes in terms of crystallization behavior. We compare atomic-resolution 3D structures of octR and bR and show the possibility of structural bR/octR interconversion by mutagenesis. We also show that the use of (nano)biotechnology can allow (1) high-precision manipulation of the light acceptor, retinal, including converting its surrounding into that of bacterial rhodopsin, the protein already used in optical-computation devices, and (2) development of multicomponent and highly regular 2D structures with a high potential for being efficient optical QC detectors.

  2. Consistency of cruise data of the CARINA database in the Atlantic sector of the Southern Ocean

    Directory of Open Access Journals (Sweden)

    M. Hoppema

    2009-12-01

    Full Text Available Initially a North Atlantic project, the CARINA carbon synthesis was extended to include the Southern Ocean. Carbon and relevant hydrographic and geochemical ancillary data from cruises all across the Arctic Mediterranean Seas, Atlantic and Southern Ocean were released to the public and merged into a new database as part of the CARINA synthesis effort. Of a total of 188 cruises, 37 cruises are part of the Southern Ocean, including 11 from the Atlantic sector. The variables from all Southern Ocean cruises, including dissolved inorganic carbon (TCO2), total alkalinity, oxygen, nitrate, phosphate and silicate, were examined for cruise-to-cruise consistency in one collective effort. Seawater pH and chlorofluorocarbons (CFCs) are also part of the database, but the pH quality control (QC) is described in another Earth System Science Data publication, while the complexity of the Southern Ocean physics and biogeochemistry prevented a proper QC analysis of the CFCs. The area-specific procedures of quality control, including crossover analysis between stations and inversion analysis of all crossover data (i.e. secondary QC), are briefly described here for the Atlantic sector of the Southern Ocean. Data from an existing, quality-controlled database (GLODAP) were used as a reference for our computations; however, the reference data were included in the analysis without applying the recommended GLODAP adjustments so the corrections could be independently verified. The outcome of this effort is an internally consistent, high-quality carbon data set for all cruises, including the reference cruises. The corrections suggested by the inversion analysis were allowed to vary within a fixed envelope, thus accounting for natural variability. The percentage of cruises adjusted ranged from 31% (for nitrate) to 54% (for phosphate), depending on the variable.
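    The core of the secondary-QC idea described here (crossover offsets, with suggested corrections confined to a fixed envelope) can be sketched very simply. The deep-water values and the envelope below are hypothetical numbers for illustration, not CARINA data or its actual inversion procedure, which solves for all cruises jointly.

```python
from statistics import mean

def crossover_offset(deep_a, deep_b):
    """Additive offset between two cruises' deep-water values at a crossover."""
    return mean(deep_b) - mean(deep_a)

def clipped_adjustment(offset, envelope):
    """Limit a suggested correction to +/- envelope, allowing for natural
    variability rather than forcing cruises into exact agreement."""
    return max(-envelope, min(envelope, offset))

# Hypothetical deep nitrate values (umol/kg) from two crossing cruises
a = [32.1, 32.3, 32.2]
b = [32.6, 32.8, 32.7]
off = crossover_offset(a, b)
print(round(off, 2), clipped_adjustment(off, 0.3))
```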

  3. Biochemical reference values in elderly black subjects

    African Journals Online (AJOL)

    1990-09-01

    Sep 1, 1990 ... were those included in the Sequential Multiple Analyser Computer profile because it includes the 20 ... (AST, EC 2.6.1.1), lactate dehydrogenase (LD, EC 1.1.1.27) and γ-glutamyl transferase (GGT, EC 2.3.2.2). ... cholesterol, glucose, triglycerides and uric acid. Commercial quality control (QC) sera were ...

  4. Quality Assurance and Quality Control Practices for Rehabilitation of Sewer and Water Mains

    Science.gov (United States)

    As part of the US Environmental Protection Agency (EPA)’s Aging Water Infrastructure Research Program, several areas of research are being pursued, including a review of quality assurance and quality control (QA/QC) practices and acceptance testing during the installation of reha...

  5. Plasma waves

    National Research Council Canada - National Science Library

    Swanson, D. G

    1989-01-01

    ... Swanson, D.G. (Donald Gary). Plasma waves. Bibliography: p. Includes index. 1. Plasma waves. QC718.5.W3S43 1989. ISBN 0-12-678955-X. I. Title. 530.4'4 88-34388. Printed in the United Sta...

  6. Implementation of Quality Control Protocol in Mammography: A Serbian Experience

    International Nuclear Information System (INIS)

    Ciraj Bjelac, O.; Kosutic, D.; Arandjic, D.; Kovacevic, M.

    2008-01-01

    Mammography is the method of choice for early detection of breast cancer. In Serbia, mammography is performed only clinically, although there is a long-term plan to introduce mammography as a screening method. Currently there are 60 mammography units in practice in Serbia, producing 70 000 mammographies annually. The purpose of this paper is a preliminary evaluation of mammography practice in Serbia, given the annual number of examinations and the fact that part of the examinations are performed on women without any clinical signs. For pilot implementation of a Quality Control (QC) protocol in mammography, five hospitals with the highest workload were selected, representing typical mammography practice in Serbia. The developed QC protocol, based on the European guidelines for quality assurance in breast cancer screening and diagnosis, actual practice and resources, includes equipment testing and maintenance, staff training, and QC management and allocation of responsibilities. Subsequently, it should be applied on a national scale. The survey demonstrated considerable variations in technical parameters that affect image quality and patient doses. Mean glandular doses ranged from 0.12 to 2.8 mGy, while reference optical density ranged from 1.2 to 2.8. The main problems were associated with film processing, viewing conditions and optical density control. The preliminary survey of mammography practice highlighted the need for optimization of radiation protection and training of operating staff, although the survey itself was a very valuable learning process for all participants. Furthermore, systematic implementation of the QC protocol should provide reliable performance of mammography units, maintain satisfactory image quality, and keep patient doses as low as reasonably practicable. (author)

  7. Patient identification in blood sampling.

    Science.gov (United States)

    Davidson, Anne; Bolton-Maggs, Paula

    The majority of adverse reports relating to blood transfusions result from human error, including misidentification of patients and incorrect labelling of samples. This article outlines best practice in blood sampling for transfusion (but is recommended for all pathology samples) and the role of patient empowerment in improving safety.

  8. 7 CFR 283.15 - Procedure for hearing.

    Science.gov (United States)

    2010-01-01

    ... evidence, the QC claim against the State agency for a QC error rate in excess of the tolerance level. The... admissible in evidence subject to such objections as to relevancy, materiality or competency of the testimony...

  9. MODIS/Terra+Aqua Land Cover Dynamics Yearly L3 Global 500m SIN Grid V005

    Data.gov (United States)

    National Aeronautics and Space Administration — Attention: The Dynamics_QC layer of the MCD12Q2 products is not performing as intended. Users are advised to ignore QC information until the issue is resolved. The...

  10. Environmental surveillance master sampling schedule

    Energy Technology Data Exchange (ETDEWEB)

    Bisping, L E

    1992-01-01

    Environmental surveillance of the Hanford Site and surrounding areas is conducted by the Pacific Northwest Laboratory (PNL) for the US Department of Energy (DOE). This document contains the planned schedule for routine sample collection for the Surface Environmental Surveillance Project (SESP) and Ground-Water Monitoring Project. Samples for radiological analyses include Air-Particulate Filter, gases and vapor; Water/Columbia River, Onsite Pond, Spring, Irrigation, and Drinking; Foodstuffs/Animal Products including Whole Milk, Poultry and Eggs, and Beef; Foodstuffs/Produce including Leafy Vegetables, Vegetables, and Fruit; Foodstuffs/Farm Products including Wine, Wheat and Alfalfa; Wildlife; Soil; Vegetation; and Sediment. Direct Radiation Measurements include Terrestrial Locations, Columbia River Shoreline Locations, and Onsite Roadway, Railway and Aerial, Radiation Surveys.

  11. Synchronizing data from irregularly sampled sensors

    Science.gov (United States)

    Uluyol, Onder

    2017-07-11

    A system and method include receiving a set of sampled measurements for each of multiple sensors, wherein the sampled measurements are at irregular intervals or different rates, re-sampling the sampled measurements of each of the multiple sensors at a higher rate than one of the sensor's set of sampled measurements, and synchronizing the sampled measurements of each of the multiple sensors.
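    The re-sampling and synchronization steps this patent abstract describes can be realized in several ways; the sketch below assumes simple linear interpolation onto a common uniform time grid, which the abstract does not specify, and uses made-up sensor timestamps and values.

```python
from bisect import bisect_left

def resample(times, values, grid):
    """Linearly interpolate an irregularly sampled signal onto `grid`.
    Outside the sampled range, the nearest end value is held constant."""
    out = []
    for t in grid:
        i = bisect_left(times, t)
        if i == 0:
            out.append(values[0])
        elif i == len(times):
            out.append(values[-1])
        else:
            t0, t1 = times[i - 1], times[i]
            v0, v1 = values[i - 1], values[i]
            out.append(v0 + (v1 - v0) * (t - t0) / (t1 - t0))
    return out

# Two sensors sampled at irregular instants, synchronized on a 0.5 s grid
grid = [0.0, 0.5, 1.0, 1.5, 2.0]
s1 = resample([0.0, 0.7, 2.0], [1.0, 2.0, 3.0], grid)
s2 = resample([0.1, 1.1, 1.9], [10.0, 20.0, 30.0], grid)
print([round(v, 3) for v in s1])
```

Once both sensors are on the same grid, sample k of `s1` and `s2` refer to the same instant, which is the synchronization property the claim is after.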

  12. Logical design of anti-prion agents using NAGARA

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Biao; Yamaguchi, Keiichi; Fukuoka, Mayuko [United Graduate School of Drug Discovery and Medical Information Sciences, Gifu University, 1-1 Yanagido, Gifu 501-1193 (Japan); Kuwata, Kazuo, E-mail: kuwata@gifu-u.ac.jp [United Graduate School of Drug Discovery and Medical Information Sciences, Gifu University, 1-1 Yanagido, Gifu 501-1193 (Japan); Department of Gene and Development, Graduate School of Medicine, Gifu University, 1-1 Yanagido, Gifu 501-1193 (Japan)

    2016-01-22

    To accelerate the logical drug design procedure, we created the program “NAGARA,” a plugin for PyMOL, and applied it to the discovery of small compounds called medical chaperones (MCs) that stabilize the cellular form of a prion protein (PrP{sup C}). In NAGARA, we constructed a single platform to unify the docking simulation (DS), free energy calculation by molecular dynamics (MD) simulation, and interfragment interaction energy (IFIE) calculation by quantum chemistry (QC) calculation. NAGARA also enables large-scale parallel computing via a convenient graphical user interface. Here, we demonstrated its performance and its broad applicability from drug discovery to lead optimization with full compatibility with various experimental methods including Western blotting (WB) analysis, surface plasmon resonance (SPR), and nuclear magnetic resonance (NMR) measurements. Combining DS and WB, we discovered anti-prion activities for two compounds and tegobuvir (TGV), a non-nucleoside non-structural protein NS5B polymerase inhibitor showing activity against hepatitis C virus genotype 1. Binding profiles predicted by MD and QC are consistent with those obtained by SPR and NMR. Free energy analyses showed that these compounds stabilize the PrP{sup C} conformation by decreasing the conformational fluctuation of the PrP{sup C}. Because TGV has been already approved as a medicine, its extension to prion diseases is straightforward. Finally, we evaluated the affinities of the fragmented regions of TGV using QC and found a clue for its further optimization. By repeating WB, MD, and QC recursively, we were able to obtain the optimum lead structure. - Highlights: • NAGARA integrates docking simulation, molecular dynamics, and quantum chemistry. • We found many compounds, e.g., tegobuvir (TGV), that exhibit anti-prion activities. • We obtained insights into the action mechanism of TGV as a medical chaperone. • Using QC, we obtained useful information for optimization of the

  13. Evaluation of the Quality Control Program for Diagnostic Radiography and Fluoroscopy Devices in Syria during 2005-2013

    Directory of Open Access Journals (Sweden)

    M. H. Kharita

    2017-06-01

    Full Text Available Introduction: Extensive use of diagnostic radiology is the largest contributor to total population radiation doses. Thus, appropriate equipment and safe practice are necessary to obtain good-quality images at optimal doses. This study aimed to perform a quality control (QC) audit of radiography and fluoroscopy devices owned by the private sector in Syria (2005-2013) to verify compliance of the performance of X-ray machines with the regulatory requirements stipulated by the national regulatory body. Materials and Methods: In this study, the QC audit included 487 X-ray diagnostic machines (363 radiography and 124 fluoroscopy devices) installed in 306 medical diagnostic radiology centers in 14 provinces of Syria. We employed an X-ray beam analyzer (NERO model 8000, Victoreen, USA), which was tested and calibrated at the National Secondary Standard Dosimetry Laboratory, traceable to the IAEA Network of Secondary Standard Dosimetry Laboratories. Standard QC tool kits were used to evaluate the tube and generator of the X-ray machines, covering tube potential (kVp), timer accuracy, radiation output consistency, tube filtration, small and large focal spot sizes, and X-ray beam collimation and alignment, as well as high- and low-resolution and entrance surface dose in fluoroscopy. Results: Most of the assessed operating parameters were in compliance with the standards stipulated by the National Regulatory Authority. Among the non-compliant parameters, the maximum value (28.77%) pertained to accuracy of kVp calibration for radiography units, while the lowest value (2.42%) belonged to entrance surface dose in fluoroscopy systems. Conclusion: An effective QC program in diagnostic radiology yields information on the quality of radiology devices used for medical diagnosis and minimizes the doses received by patients and medical personnel. The findings of this QC program, as the main part of the QA program, illustrated that most

  14. Quality control and assurance for validation of DOS/I measurements

    Science.gov (United States)

    Cerussi, Albert; Durkin, Amanda; Kwong, Richard; Quang, Timothy; Hill, Brian; Tromberg, Bruce J.; MacKinnon, Nick; Mantulin, William W.

    2010-02-01

    Ongoing multi-center clinical trials are crucial for Biophotonics to gain acceptance in medical imaging. In these trials, quality control (QC) and assurance (QA) are key to success and provide "data insurance". Quality control and assurance deal with standardization, validation, and compliance of procedures, materials and instrumentation. Specifically, QC/QA involves systematic assessment of testing materials, instrumentation performance, standard operating procedures, data logging, analysis, and reporting. QC and QA are important for FDA accreditation and acceptance by the clinical community. Our Biophotonics research in the Network for Translational Research in Optical Imaging (NTROI) program for breast cancer characterization focuses on QA/QC issues primarily related to the broadband Diffuse Optical Spectroscopy and Imaging (DOS/I) instrumentation, because this is an emerging technology with limited standardized QC/QA in place. In the multi-center trial environment, we implement QA/QC procedures: 1. Standardize and validate calibration standards and procedures. (DOS/I technology requires both frequency domain and spectral calibration procedures using tissue simulating phantoms and reflectance standards, respectively.) 2. Standardize and validate data acquisition, processing and visualization (optimize instrument software-EZDOS; centralize data processing) 3. Monitor, catalog and maintain instrument performance (document performance; modularize maintenance; integrate new technology) 4. Standardize and coordinate trial data entry (from individual sites) into centralized database 5. Monitor, audit and communicate all research procedures (database, teleconferences, training sessions) between participants ensuring "calibration". This manuscript describes our ongoing efforts, successes and challenges implementing these strategies.

  15. Computer Graphics Simulations of Sampling Distributions.

    Science.gov (United States)

    Gordon, Florence S.; Gordon, Sheldon P.

    1989-01-01

    Describes the use of computer graphics simulations to enhance student understanding of sampling distributions that arise in introductory statistics. Highlights include the distribution of sample proportions, the distribution of the difference of sample means, the distribution of the difference of sample proportions, and the distribution of sample…
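    The kind of classroom simulation this record describes can be reproduced without graphics. The sketch below, with parameters chosen arbitrarily for illustration, simulates the sampling distribution of the sample proportion and checks it against the theoretical standard error sqrt(p(1-p)/n).

```python
import random
from math import sqrt
from statistics import mean, pstdev

def proportion_sampling_distribution(p, n, trials, seed=0):
    """Simulate `trials` samples of size n from Bernoulli(p) and return
    the sample proportion p-hat from each sample."""
    rng = random.Random(seed)
    return [sum(rng.random() < p for _ in range(n)) / n for _ in range(trials)]

props = proportion_sampling_distribution(p=0.3, n=100, trials=2000)
print(round(mean(props), 2))  # centers near p = 0.3
print(round(pstdev(props), 3), round(sqrt(0.3 * 0.7 / 100), 3))
# The empirical spread of p-hat tracks the theoretical sqrt(p(1-p)/n).
```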

  16. CDT-a entropic theory of quantum gravity

    DEFF Research Database (Denmark)

    Ambjørn, Jan; Görlich, A.; Jurkiewicz, J.

    2010-01-01

    High Energy Physics - Theory (hep-th); General Relativity and Quantum Cosmology (gr-qc); High Energy Physics - Lattice (hep-lat)

  17. CAUSAL DYNAMICAL TRIANGULATIONS AND THE SEARCH FOR A THEORY OF QUANTUM GRAVITY

    DEFF Research Database (Denmark)

    Ambjørn, Jan; Görlich, Andrzej; Jurkiewicz, J.

    2013-01-01

    High Energy Physics - Theory (hep-th); General Relativity and Quantum Cosmology (gr-qc); High Energy Physics - Lattice (hep-lat)

  18. Classic and Quantum Capacitances in Bernal Bilayer and Trilayer Graphene Field Effect Transistor

    Directory of Open Access Journals (Sweden)

    Hatef Sadeghi

    2013-01-01

    Full Text Available Our focus in this study is on characterizing the capacitance-voltage (C-V) behavior of Bernal-stacking bilayer graphene (BG) and trilayer graphene (TG) as the channel of FET devices. Analytical models of the quantum capacitance (QC) of BG and TG are presented. Although QC is smaller than the classic capacitance in conventional devices, its contribution to the total metal-oxide-semiconductor capacitance in graphene-based FET devices becomes significant at the nanoscale. Our calculation shows that QC increases with gate voltage in both BG and TG and decreases with temperature, with some fluctuations. However, in bilayer graphene the fluctuation is larger because its band structure is tunable with external electric fields. At similar temperature and size, QC in the metal-oxide-BG configuration is higher than in the metal-oxide-TG configuration. Moreover, in both BG and TG, the total capacitance is increasingly governed by the classic capacitance as the distance between the gate electrode and the channel increases, whereas QC dominates as the channel becomes thinner into the nanoscale; quantum capacitance therefore mostly governs the top gate, in contrast with the bottom gate, where the classic capacitance is dominant.

  19. A Benchmark Study on Error Assessment and Quality Control of CCS Reads Derived from the PacBio RS.

    Science.gov (United States)

    Jiao, Xiaoli; Zheng, Xin; Ma, Liang; Kutty, Geetha; Gogineni, Emile; Sun, Qiang; Sherman, Brad T; Hu, Xiaojun; Jones, Kristine; Raley, Castle; Tran, Bao; Munroe, David J; Stephens, Robert; Liang, Dun; Imamichi, Tomozumi; Kovacs, Joseph A; Lempicki, Richard A; Huang, Da Wei

    2013-07-31

    PacBio RS, a newly emerging third-generation DNA sequencing platform, is based on a real-time, single-molecule, nano-nitch sequencing technology that can generate very long reads (up to 20 kb), in contrast to the shorter reads produced by the first- and second-generation sequencing technologies. As a new platform, it is important to assess the sequencing error rate, as well as the quality control (QC) parameters associated with PacBio sequence data. In this study, a mixture of 10 previously known, closely related DNA amplicons were sequenced using the PacBio RS sequencing platform. After aligning Circular Consensus Sequence (CCS) reads derived from the above sequencing experiment to the known reference sequences, we found that the median error rate was 2.5% without read QC, and improved to 1.3% with an SVM-based multi-parameter QC method. In addition, a de novo assembly was used as a downstream application to evaluate the effects of different QC approaches. This benchmark study indicates that even though CCS reads are error-corrected, it is still necessary to perform appropriate QC on CCS reads in order to produce successful downstream bioinformatics analytical results.
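    The per-read error rate that this benchmark measures comes from aligning each read to its known reference. As a minimal illustration (not the paper's SVM multi-parameter method, and using naive edit distance with toy sequences rather than a real aligner), one can compute an error rate per read and apply a QC cutoff:

```python
def edit_distance(a, b):
    """Classic dynamic-programming Levenshtein distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1,
                           prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

def qc_filter(reads, ref, max_error_rate):
    """Keep reads whose error rate against the reference is below the cutoff."""
    kept = []
    for r in reads:
        rate = edit_distance(r, ref) / max(len(r), len(ref))
        if rate <= max_error_rate:
            kept.append((r, round(rate, 3)))
    return kept

ref = "ACGTACGTAC"
reads = ["ACGTACGTAC", "ACGTTCGTAC", "TTTTTTTTTT"]
print(qc_filter(reads, ref, 0.025))  # only the error-free read passes
```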

  20. Room temperature deformation of in-situ grown quasicrystals embedded in Al-based cast alloy

    Directory of Open Access Journals (Sweden)

    Boštjan Markoli

    2013-12-01

    Full Text Available An Al-based cast alloy containing Mn, Be and Cu was chosen to investigate the room-temperature deformation behavior of QC particles embedded in an Al matrix. Using LOM, SEM (equipped with EDS), conventional TEM with SAED, and controlled tensile and compression tests, the deformation response of the AlMn2Be2Cu2 cast alloy at room temperature was examined. The alloy consisted of an Al-based matrix, primary particles and eutectic icosahedral quasicrystalline (QC) i-phase, and traces of Θ-Al2Cu and Al10Mn3. Tensile and compression specimens were used to evaluate the mechanical response and behavior of QC i-phase particles embedded in the Al cast alloy. It was established that the embedded QC i-phase particles undergo plastic deformation along with the Al-based matrix even under severe deformation, and that their response resembles that of metallic materials, forming the typical cup-and-cone feature prior to failure. We can therefore conclude that the QC i-phase is able to undergo plastic deformation along with the Al matrix to a greater extent than, for instance, intermetallics such as Θ-Al2Cu.

  1. Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.

    Science.gov (United States)

    Blutke, Andreas; Wanke, Rüdiger

    2018-03-06

    In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often limited number of available animals of these models, establishment of (biobank) collections of adequately processed tissue samples suited for a broad spectrum of subsequent analysis methods, including analyses not specified at the time of sampling, represents a meaningful approach to take full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimens for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for generation of representative, multi-purpose biobank specimens from porcine tissues is presented. The methods described here include determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point-counting, determination of the extent of tissue shrinkage related to histological embedding of samples, and generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical
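    The point-counting volume determination mentioned above follows the standard Cavalieri estimator from stereology. A minimal sketch with illustrative values (not the authors' protocol or data):

```python
# Sketch: Cavalieri point-counting volume estimate.
# V ~= t * a_p * total_points, where t is the slice spacing, a_p the area
# associated with each grid point, and total_points the points hitting tissue.

def cavalieri_volume(points_per_slice, slice_spacing_cm, area_per_point_cm2):
    """Estimate organ volume (cm^3) from systematic equidistant slices."""
    return slice_spacing_cm * area_per_point_cm2 * sum(points_per_slice)

# e.g. 5 slices 0.5 cm apart, counting grid with 0.2 cm^2 per point
volume = cavalieri_volume([40, 52, 60, 48, 30], 0.5, 0.2)
print(volume)  # 0.5 * 0.2 * 230 = 23.0 cm^3
```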

  2. Sampling Methods in Cardiovascular Nursing Research: An Overview.

    Science.gov (United States)

    Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie

    2014-01-01

    Cardiovascular nursing research covers a wide array of topics from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling: simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion into the research study, and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences in sampling techniques may aid nurses in effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
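    Three of the probability designs named above can be illustrated with a short sketch over a hypothetical sampling frame (the population and strata here are invented for illustration):

```python
import random

random.seed(42)
population = list(range(1, 101))  # hypothetical frame of 100 patient IDs

# Simple random sampling: every element has an equal, independent chance.
simple = random.sample(population, 10)

# Systematic sampling: a random start, then every k-th element.
k = len(population) // 10
start = random.randrange(k)
systematic = population[start::k]

# Stratified sampling: sample proportionally within predefined strata.
strata = {"under_65": population[:60], "over_65": population[60:]}
stratified = [unit for group in strata.values()
              for unit in random.sample(group, len(group) // 10)]

print(len(simple), len(systematic), len(stratified))  # 10 10 10
```

    Cluster and multi-stage sampling follow the same pattern, but the units drawn at the first stage are groups (e.g. clinics) rather than individuals.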

  3. Distance sampling methods and applications

    CERN Document Server

    Buckland, S T; Marques, T A; Oedekoven, C S

    2015-01-01

    In this book, the authors cover the basic methods and advances within distance sampling that are most valuable to practitioners and in ecology more broadly. This is the fourth book dedicated to distance sampling. In the decade since the last book published, there have been a number of new developments. The intervening years have also shown which advances are of most use. This self-contained book covers topics from the previous publications, while also including recent developments in method, software and application. Distance sampling refers to a suite of methods, including line and point transect sampling, in which animal density or abundance is estimated from a sample of distances to detected individuals. The book illustrates these methods through case studies; data sets and computer code are supplied to readers through the book’s accompanying website.  Some of the case studies use the software Distance, while others use R code. The book is in three parts.  The first part addresses basic methods, the ...
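    The core idea of line-transect distance sampling, estimating density from a detection function fitted to observed distances, can be roughed out as follows. This is a generic illustration with a half-normal detection function and invented numbers, not an example from the book:

```python
import math

# Sketch: line-transect density estimate with a half-normal detection
# function g(x) = exp(-x^2 / (2 sigma^2)); detectability falls with distance.

def detection_prob(x, sigma):
    return math.exp(-x * x / (2 * sigma * sigma))

def average_detectability(half_width, sigma, steps=10000):
    """Numerically average g(x) over the searched strip [0, half_width]."""
    dx = half_width / steps
    return sum(detection_prob((i + 0.5) * dx, sigma) for i in range(steps)) / steps

def density_estimate(n_detected, line_length, half_width, sigma):
    """D = n / (2 w L p), animals per unit area."""
    p = average_detectability(half_width, sigma)
    return n_detected / (2 * half_width * line_length * p)

# 40 animals detected along a 10 km line, 0.1 km half-width, sigma = 0.05 km
print(round(density_estimate(40, 10.0, 0.1, 0.05), 2))
```

    In real analyses sigma is not assumed but fitted to the observed distances by maximum likelihood, which is what the Distance software and the accompanying R code do.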

  4. Livermore Big Trees Park Soil Survey

    International Nuclear Information System (INIS)

    McConachie, W.A.; Failor, R.A.

    1995-01-01

    Lawrence Livermore National Laboratory (LLNL) will sample and analyze soil in the Big Trees Park area in Livermore, California, to determine if the initial level of plutonium (Pu) in a soil sample taken by the U.S. Environmental Protection Agency (EPA) in September 1993 can be confirmed. Nineteen samples will be collected and analyzed: 4 in the area where the initial EPA sample was taken, 2 in the nearby Arroyo Seco, 12 in scattered uncovered soil areas in the park and nearby school, and 1 from the sandbox of a nearby apartment complex. Two quality control (QC) samples (field duplicates of the preceding samples) will also be collected and analyzed. This document briefly describes the purpose behind the sampling, the sampling rationale, and the methodology.

  5. Quality control for diagnostic oral microbiology laboratories in European countries

    NARCIS (Netherlands)

    Rautemaa-Richardson, R.; van der Reijden, W.A.; Dahlen, G.; Smith, A.J.

    2011-01-01

    Participation in diagnostic microbiology internal and external quality control (QC) processes is good laboratory practice and an essential component of a quality management system. However, no QC scheme for diagnostic oral microbiology existed until 2009 when the Clinical Oral Microbiology (COMB)

  6. Quality control and conduct of genome-wide association meta-analyses

    DEFF Research Database (Denmark)

    Winkler, Thomas W; Day, Felix R; Croteau-Chonka, Damien C

    2014-01-01

    Rigorous organization and quality control (QC) are necessary to facilitate successful genome-wide association meta-analyses (GWAMAs) of statistics aggregated across multiple genome-wide association studies. This protocol provides guidelines for (i) organizational aspects of GWAMAs, and for (ii) QC...

  7. Transmission electron microscope sample holder with optical features

    Science.gov (United States)

    Milas, Mirko [Port Jefferson, NY; Zhu, Yimei [Stony Brook, NY; Rameau, Jonathan David [Coram, NY

    2012-03-27

    A sample holder for holding a sample to be observed for research purposes, particularly in a transmission electron microscope (TEM), generally includes an external alignment part for directing a light beam in a predetermined beam direction, a sample holder body in optical communication with the external alignment part and a sample support member disposed at a distal end of the sample holder body opposite the external alignment part for holding a sample to be analyzed. The sample holder body defines an internal conduit for the light beam and the sample support member includes a light beam positioner for directing the light beam between the sample holder body and the sample held by the sample support member.

  8. Environmental monitoring master sampling schedule: January--December 1989

    International Nuclear Information System (INIS)

    Bisping, L.E.

    1989-01-01

    Environmental monitoring of the Hanford Site is conducted by the Pacific Northwest Laboratory (PNL) for the US Department of Energy (DOE). This document contains the planned schedule for routine sample collection for calendar year 1989 for the Surface and Ground-Water Environmental Monitoring Projects. This schedule is subject to modification during the year in response to changes in Site operations, program requirements, and the nature of the observed results. Operational limitations such as weather, mechanical failures, sample availability, etc., may also require schedule modifications. Changes will be documented in the respective project files, but this plan will not be reissued. This schedule includes routine ground-water sampling performed by PNL for Westinghouse Hanford Company, but does not include samples that may be collected in 1989 to support special studies or special contractor projects, or for quality control. The sampling schedule for Site-wide chemical monitoring is not included here, because it varies each quarter as needed, based on past results and operating needs. This schedule does not include Resource Conservation and Recovery Act ground-water sampling performed by PNL for Hanford Site contractors, nor does it include sampling that may be done by other DOE Hanford contractors

  9. 40 CFR 1048.205 - What must I include in my application?

    Science.gov (United States)

    2010-07-01

    ... exhaust pipe, show how to sample exhaust emissions in a way that prevents diluting the exhaust sample with... system components for controlling exhaust emissions, including all auxiliary emission control devices.... (p) Present emission data to show that you meet emission standards, as follows: (1) Present exhaust...

  10. The Internet of Samples in the Earth Sciences (iSamples)

    Science.gov (United States)

    Carter, M. R.; Lehnert, K. A.

    2015-12-01

    samples. Creating awareness of the need to include physical samples in discussions of reproducible science is another priority of the iSamples RCN.

  11. 23 CFR 650.313 - Inspection procedures.

    Science.gov (United States)

    2010-04-01

    ...) Quality control and quality assurance. Assure systematic quality control (QC) and quality assurance (QA) procedures are used to maintain a high degree of accuracy and consistency in the inspection program. Include... allow assessment of current bridge condition. Record the findings and results of bridge inspections on...

  12. Latin Hypercube Sampling (LHS) at variable resolutions for enhanced watershed scale Soil Sampling and Digital Soil Mapping.

    Science.gov (United States)

    Hamalainen, Sampsa; Geng, Xiaoyuan; He, Juanxia

    2017-04-01

    Latin Hypercube Sampling (LHS) at variable resolutions for enhanced watershed scale Soil Sampling and Digital Soil Mapping. Sampsa Hamalainen, Xiaoyuan Geng, and Juanxia He. AAFC - Agriculture and Agri-Food Canada, Ottawa, Canada. The Latin Hypercube Sampling (LHS) approach to assist with Digital Soil Mapping has been developed for some time now; the purpose of this work, however, was to complement LHS with the use of multiple spatial resolutions of covariate datasets and variability in the range of sampling points produced. This allowed specific sets of LHS points to be produced to fulfil the needs of various partners from multiple projects working in the Ontario and Prince Edward Island provinces of Canada. Secondary soil and environmental attributes are critical inputs required in the development of sampling points by LHS. These include a required Digital Elevation Model (DEM) and subsequent covariate datasets produced by a Digital Terrain Analysis performed on the DEM. These additional covariates often include, but are not limited to, Topographic Wetness Index (TWI), Length-Slope (LS) Factor, and Slope, which are continuous data. The number of points created by LHS ranged from 50 to 200, depending on the size of the watershed and, more importantly, the number of soil types found within it. The spatial resolution of the covariates included in the work ranged from 5 to 30 m. The iterations within the LHS sampling were run at an optimal level so that the LHS model provided a good spatial representation of the environmental attributes within the watershed. Additional covariates that are categorical in nature, such as external Surficial Geology data, were also included in the Latin Hypercube Sampling approach. Initial results of the work include the use of 1,000 iterations within the LHS model; 1,000 iterations consistently proved a reasonable value for producing sampling points that gave a good spatial representation of the environmental
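    The LHS construction itself is simple to sketch: stratify each covariate's (rescaled) range into n equal intervals, draw one value per interval, and shuffle the strata independently per dimension. This is a generic illustration of the technique, not the authors' implementation:

```python
import random

def latin_hypercube(n_points, n_dims, rng=random.Random(0)):
    """Return n_points samples in [0, 1)^n_dims, one per stratum per dimension."""
    columns = []
    for _ in range(n_dims):
        # one draw from each of n equal-width strata, then shuffle the column
        column = [(i + rng.random()) / n_points for i in range(n_points)]
        rng.shuffle(column)
        columns.append(column)
    # transpose the per-dimension columns into sample points
    return list(zip(*columns))

# e.g. 100 candidate sites over 3 covariates (TWI, LS factor, slope),
# each covariate rescaled to [0, 1)
points = latin_hypercube(100, 3)
print(len(points))  # 100
```

    Real implementations (e.g. conditioned LHS for soil sampling) then pick the field locations whose covariate values best match these target strata.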

  13. Quality of temperature and salinity data from Argo profiling floats in the Bay of Bengal

    Digital Repository Service at National Institute of Oceanography (India)

    Parvathi, V.; Pankajakshan, T.; Rajkumar, M.; Prasannakumar, S.; Muraleedharan, P.M.; Ravichandran, M.; Rao, R.R.; Gopalakrishna, V.V.

    In the present study, temperature and salinity from APEX Argo floats with reported SPB (Argo-SPB) and salinity from normal floats without any reported SPB (Argo-N) in the BoB have been subjected to quality check (QC). The method used for QC depends...

  14. Development of a Supercritical Fluid Chromatography-Tandem Mass Spectrometry Method for the Determination of Azacitidine in Rat Plasma and Its Application to a Bioavailability Study

    Directory of Open Access Journals (Sweden)

    Dongpo Li

    2013-12-01

    Full Text Available Azacitidine is widely used for the treatment of myelodysplastic syndromes (MDS) and acute myelogenous leukaemia (AML). The analysis of azacitidine in biological samples is subject to interference by endogenous compounds. Previously reported high-performance liquid chromatography/tandem mass spectrometry (HPLC-MS/MS) bioanalytical assays for azacitidine suffer from expensive sample preparation procedures or from long separation times to achieve the required selectivity. Herein, supercritical fluid chromatography with tandem mass spectrometry (SFC-MS/MS) was explored as a more promising technique for the selective analysis of structurally similar or chiral drugs in biological matrices. In this study, a simple, rapid and specific SFC-MS/MS analytical method was developed for the determination of azacitidine levels in rat plasma. Azacitidine was completely separated from the endogenous compounds on an ACQUITY UPLC™ BEH C18 column (100 mm × 3.0 mm, 1.7 μm; Waters Corp., Milford, MA, USA) using isocratic elution with CO2/methanol as the mobile phase. The single-run analysis time was as short as 3.5 min. Sample preparation for protein removal was accomplished using a simple methanol precipitation method. The lower limit of quantification (LLOQ) of azacitidine was 20 ng/mL. The intra-day and inter-day precisions were less than 15%, and the relative error (RE) was within ±15% for the medium- and high-concentration quality control (QC) samples and within ±20% for the low-concentration QC samples. Finally, the developed method was successfully applied to a pharmacokinetic study in rats following intravenous administration of azacitidine.
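    The QC acceptance criteria quoted above (RE within ±15%, relaxed to ±20% for the low QC near the LLOQ) follow the usual bioanalytical convention and can be encoded as a small check. The nominal concentrations below are hypothetical, not the study's QC levels:

```python
def relative_error(measured, nominal):
    """Relative error in percent against the nominal QC concentration."""
    return (measured - nominal) / nominal * 100.0

def qc_sample_passes(measured, nominal, is_low_qc=False):
    """Accept a QC sample if its relative error is within the allowed band."""
    limit = 20.0 if is_low_qc else 15.0
    return abs(relative_error(measured, nominal)) <= limit

# hypothetical QC levels: low QC at 60 ng/mL (±20%), high QC at 800 ng/mL (±15%)
print(qc_sample_passes(70, 60, is_low_qc=True))   # +16.7% -> True
print(qc_sample_passes(940, 800))                 # +17.5% -> False
```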

  15. Enhancing sampling design in mist-net bat surveys by accounting for sample size optimization

    OpenAIRE

    Trevelin, Leonardo Carreira; Novaes, Roberto Leonan Morim; Colas-Rosas, Paul François; Benathar, Thayse Cristhina Melo; Peres, Carlos A.

    2017-01-01

    The advantages of mist-netting, the main technique used in Neotropical bat community studies to date, include logistical implementation, standardization and sampling representativeness. Nonetheless, study designs still have to deal with issues of detectability related to how different species behave and use the environment. Yet there is considerable sampling heterogeneity across available studies in the literature. Here, we approach the problem of sample size optimization. We evaluated the co...

  16. System for Earth Sample Registration SESAR: Services for IGSN Registration and Sample Metadata Management

    Science.gov (United States)

    Chan, S.; Lehnert, K. A.; Coleman, R. J.

    2011-12-01

    SESAR, the System for Earth Sample Registration, is an online registry for physical samples collected for Earth and environmental studies. SESAR generates and administers the International Geo Sample Number IGSN, a unique identifier for samples that is dramatically advancing interoperability amongst information systems for sample-based data. SESAR was developed to provide the complete range of registry services, including definition of IGSN syntax and metadata profiles, registration and validation of name spaces requested by users, tools for users to submit and manage sample metadata, validation of submitted metadata, generation and validation of the unique identifiers, archiving of sample metadata, and public or private access to the sample metadata catalog. With the development of SESAR v3, we placed particular emphasis on creating enhanced tools that make metadata submission easier and more efficient for users, and that provide superior functionality for users to manage metadata of their samples in their private workspace MySESAR. For example, SESAR v3 includes a module where users can generate custom spreadsheet templates to enter metadata for their samples, then upload these templates online for sample registration. Once the content of the template is uploaded, it is displayed online in an editable grid format. Validation rules are executed in real-time on the grid data to ensure data integrity. Other new features of SESAR v3 include the capability to transfer ownership of samples to other SESAR users, the ability to upload and store images and other files in a sample metadata profile, and the tracking of changes to sample metadata profiles. In the next version of SESAR (v3.5), we will further improve the discovery, sharing, registration of samples. For example, we are developing a more comprehensive suite of web services that will allow discovery and registration access to SESAR from external systems. 
Both batch and individual registrations will be possible

  17. Indonesia's experience with IAEA-CRP on radiation protection in diagnostic radiology

    International Nuclear Information System (INIS)

    Nasukha

    2001-01-01

    The IAEA-CRP on Radiation Doses in Diagnostic Radiology and Methods for Dose Reduction has as participants several Asian and East European countries. Indonesia is one of the participants that followed the IAEA program. This paper does not discuss the CRP results, since they will soon be published as a TECDOC. However, the work on evaluation of examination frequencies, film reject rate analysis, patient dose measurements, image quality before and after Quality Control (QC), and QC itself gave the investigators experiences to be explored and presented. These experiences take the form of problems, ways to solve them, and some suggestions, ranging from no QC at all to the complicated QC faced in equipment from conventional radiography to CT scanners and fluoroscopy units. These valuable experiences show the IAEA-CRP to have been, for Indonesia, a good starting exercise for the next CRP or for national projects in diagnostic radiology. (author)

  18. Living conditions, including life style, in primary-care patients with nonacute, nonspecific spinal pain compared with a population-based sample: a cross-sectional study

    Directory of Open Access Journals (Sweden)

    Odd Lindell

    2010-11-01

    Full Text Available Odd Lindell, Sven-Erik Johansson, Lars-Erik Strender; Center for Family and Community Medicine, Department of Neurobiology, Care Sciences and Society, Karolinska Institutet, Huddinge, Sweden. Background: Nonspecific spinal pain (NSP), comprising back and/or neck pain, is one of the leading disorders behind long-term sick-listing, including disability pensions. Early interventions to prevent long-term sick-listing require the identification of patients at risk. The aim of this study was to compare living conditions associated with long-term sick-listing for NSP in patients with nonacute NSP with those of a nonpatient, population-based sample. Nonacute NSP is pain that leads to full-time sick-listing for more than 3 weeks. Methods: One hundred and twenty-five patients with nonacute NSP, 2000–2004, were included in a randomized controlled trial in Stockholm County with the objective of comparing cognitive–behavioral rehabilitation with traditional primary care. For these patients, a cross-sectional study was carried out with baseline data. Living conditions were compared between the patients and 338 nonpatients by logistic regression. The conditions from the univariate analyses were included in a multivariate analysis. The nonsignificant variables were excluded sequentially to yield a model comprising only the significant factors (P < 0.05). The results are shown as odds ratios (OR) with 95% confidence intervals. Results: In the univariate analyses, 13 of the 18 living conditions had higher odds for the patients, with a dominance of physical work strains and indication of alcohol over-consumption, OR 14.8 (95% confidence interval [CI] 3.2–67.6). Five conditions qualified for the multivariate model: high physical workload, OR 13.7 (CI 5.9–32.2); hectic work tempo, OR 8.4 (CI 2.5–28.3); blue-collar job, OR 4.5 (CI 1.8–11.4); obesity, OR 3.5 (CI 1.2–10.2); and low education, OR 2.7 (CI 1.1–6.8). Conclusions: As most of the living conditions have previously been

  19. Detection of Bartonella henselae DNA in clinical samples including peripheral blood of immune competent and immune compromised patients by three nested amplifications

    Directory of Open Access Journals (Sweden)

    Karina Hatamoto Kawasato

    2013-02-01

    Full Text Available Bacteria of the genus Bartonella are emerging pathogens detected in lymph node biopsies and aspirates, probably owing to an increased concentration of bacteria. Twenty-three samples from 18 patients with clinical, laboratory and/or epidemiological data suggesting bartonellosis were subjected to three nested amplifications targeting a fragment of the 60-kDa heat shock protein (HSP), the internal transcribed spacer 16S-23S rRNA (ITS) and the cell division protein (FtsZ) of Bartonella henselae, in order to improve detection in clinical samples. In the first amplification, one, four and five samples were positive by HSP (4.3%), FtsZ (17.4%) and ITS (21.7%), respectively. After the second round, six positive samples were identified by nested-HSP (26%), eight by nested-ITS (34.8%) and 18 by nested-FtsZ (78.2%), corresponding to 10 peripheral blood samples, five lymph node biopsies, two skin biopsies and one lymph node aspirate. The nested-FtsZ was more sensitive than nested-HSP and nested-ITS (p < 0.0001), enabling the detection of Bartonella henselae DNA in 15 of 18 patients (83.3%). In this study, three nested PCRs intended to be specific for Bartonella henselae amplification were developed, but only the nested-FtsZ did not amplify DNA from Bartonella quintana. We conclude that nested amplifications increased detection of B. henselae DNA, and that the nested-FtsZ was the most sensitive and the only one specific to B. henselae in different biological samples. As all samples detected by nested-HSP and nested-ITS were also detected by nested-FtsZ, we infer that in our series infections were caused by Bartonella henselae. The high number of positive blood samples draws attention to the use of this biological material in the investigation of bartonellosis, regardless of the immune status of patients. This is important in the case of critically ill patients and young children, to avoid more invasive procedures such as lymph node biopsies and aspirates.

  20. Including Online-Recruited Seeds: A Respondent-Driven Sample of Men Who Have Sex With Men.

    Science.gov (United States)

    Lachowsky, Nathan John; Lal, Allan; Forrest, Jamie I; Card, Kiffer George; Cui, Zishan; Sereda, Paul; Rich, Ashleigh; Raymond, Henry Fisher; Roth, Eric A; Moore, David M; Hogg, Robert S

    2016-03-15

    Technology has changed the way men who have sex with men (MSM) seek sex and socialize, which may impact the implementation of respondent-driven sampling (RDS) among this population. Initial participants (also known as seeds) are a critical consideration in RDS because they begin the recruitment chains. However, little information is available on how online-recruited seeds may affect RDS implementation. The objectives of this study were to compare (1) online-recruited versus offline-recruited seeds and (2) subsequent recruitment chains of online-recruited versus offline-recruited seeds. Between 2012 and 2014, we recruited MSM using RDS in Vancouver, Canada. RDS weights were used with logistic regression to address each objective. A total of 119 seeds, 85 of whom were recruited online, were used to recruit an additional 600 MSM. Compared with offline-recruited seeds, online-recruited seeds were less likely to be HIV-positive (OR 0.34, 95% CI 0.13-0.88), to have attended a gay community group (AOR 0.33, 95% CI 0.12-0.90), and to feel gay community involvement was "very important" (AOR 0.16, 95% CI 0.03-0.93). Online-recruited seeds were more likely to ask a sexual partner's HIV status always versus online (AOR 4.29, 95% CI 1.53-12.05). Further, compared with recruitment chains started by offline-recruited seeds, recruits from chains started by online-recruited seeds (283/600, 47.2%) were less likely to be HIV-positive (AOR 0.25, 95% CI 0.16-0.40), to report "versatile" versus "bottom" sexual position preference (AOR 0.56, 95% CI 0.35-0.88), and to be in a relationship lasting >1 year (AOR 1.65, 95% CI 1.06-2.56). Recruits of online seeds were more likely to be out as gay for longer (eg, 11-21 vs 1-4 years, AOR 2.22, 95% CI 1.27-3.88) and to have fewer Facebook friends (eg, 201-500 vs >500, AOR 1.69, 95% CI 1.02-2.80). Online-recruited seeds were more prevalent and recruited fewer participants, and differed from those recruited offline. 
This may therefore
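    RDS weights of the kind used in these regressions are commonly derived from each recruit's reported network size. A minimal RDS-II-style weighted prevalence estimate, with invented data rather than the study's, looks like:

```python
# Sketch: RDS-II estimator — weight each respondent inversely to reported
# network size (degree), since high-degree network members are oversampled.

def rds_ii_prevalence(outcomes, degrees):
    """Weighted prevalence: sum(y_i / d_i) / sum(1 / d_i)."""
    num = sum(y / d for y, d in zip(outcomes, degrees))
    den = sum(1.0 / d for d in degrees)
    return num / den

# four hypothetical recruits: binary outcome (e.g. HIV-positive) and degree
outcomes = [1, 0, 0, 1]
degrees = [20, 5, 10, 4]
print(round(rds_ii_prevalence(outcomes, degrees), 3))  # 0.5
```

    The same inverse-degree weights, normalized, are what enter the weighted logistic regressions as sampling weights.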

  1. A Quadrupole Dalton-based multi-attribute method for product characterization, process development, and quality control of therapeutic proteins.

    Science.gov (United States)

    Xu, Weichen; Jimenez, Rod Brian; Mowery, Rachel; Luo, Haibin; Cao, Mingyan; Agarwal, Nitin; Ramos, Irina; Wang, Xiangyang; Wang, Jihong

    2017-10-01

    During manufacturing and storage, therapeutic proteins are subject to various post-translational modifications (PTMs), such as isomerization, deamidation, oxidation, disulfide bond modifications and glycosylation. Certain PTMs may affect bioactivity, stability, or the pharmacokinetic and pharmacodynamic profile, and are therefore classified as potential critical quality attributes (pCQAs). Identifying, monitoring and controlling these PTMs are usually key elements of the Quality by Design (QbD) approach. Traditionally, multiple analytical methods are utilized for these purposes, which is time-consuming and costly. In recent years, multi-attribute monitoring methods have been developed in the biopharmaceutical industry. However, these methods combine high-end mass spectrometry with complicated data analysis software, which can pose difficulties when implemented in a quality control (QC) environment. Here we report a multi-attribute method (MAM) using a Quadrupole Dalton (QDa) mass detector to selectively monitor and quantitate PTMs in a therapeutic monoclonal antibody. The result output from the QDa-based MAM is straightforward and automatic, and evaluation indicates the method provides results comparable to the traditional assays. To ensure future application in the QC environment, the method was qualified according to the International Conference on Harmonization (ICH) guidelines and applied in the characterization of drug substance and stability samples. The QDa-based MAM is shown to be an extremely useful tool for product and process characterization studies, facilitating facile understanding of process impact on multiple quality attributes while being QC-friendly and cost-effective.

  2. Building a Quality Controlled Database of Meteorological Data from NASA Kennedy Space Center and the United States Air Force's Eastern Range

    Science.gov (United States)

    Brenton, James C.; Barbre, Robert E., Jr.; Decker, Ryan K.; Orcutt, John M.

    2018-01-01

    The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) has provided atmospheric databases and analysis in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located on the United States Air Force's Eastern Range (ER) at the Cape Canaveral Air Force Station. The ER complex is one of the most heavily instrumented sites in the United States, with over 31 towers measuring various atmospheric parameters on a continuous basis. An inherent challenge with large data sets is ensuring that erroneous data are removed from databases, and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedures for all databases currently exist, resulting in QC databases with inconsistencies in variables, methodologies, and periods of record. The goal of this activity is to build on previous EV44 efforts to develop a standardized set of QC procedures from which to build meteorological databases from KSC and the ER, while maintaining open communication with end users from the launch community to develop ways to improve, adapt and grow the QC database. Details of the QC procedures will be described. As the launch rate increases with additional launch vehicle programs, it is becoming more important that weather databases are continually updated and checked for data quality before use in launch vehicle design and certification analyses.
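    Tower-data QC procedures of the kind being standardized here commonly include range (limit) tests against climatological bounds and step (spike) tests against the previous observation. A generic sketch; the thresholds and data are illustrative, not EV44's values:

```python
# Sketch: two common meteorological QC tests — a range check against
# climatological limits and a step check against the preceding observation.

def range_check(value, lower, upper):
    return lower <= value <= upper

def step_check(value, previous, max_step):
    return abs(value - previous) <= max_step

def qc_flags(series, lower, upper, max_step):
    """Flag each observation 'good' or 'bad' using both tests."""
    flags = []
    for i, v in enumerate(series):
        ok = range_check(v, lower, upper)
        if ok and i > 0:
            ok = step_check(v, series[i - 1], max_step)
        flags.append("good" if ok else "bad")
    return flags

# hypothetical 2 m temperatures (deg C): 75.0 fails the range test;
# 24.6 and 30.2 fail the step test against their immediate predecessors
temps = [24.1, 24.3, 75.0, 24.6, 30.2]
print(qc_flags(temps, lower=-10.0, upper=50.0, max_step=2.0))
```

    Operational schemes refine this (e.g. stepping against the last *good* value, persistence tests, inter-tower consistency), but the flagging structure is the same.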

  3. Development of a standardized susceptibility test for Campylobacter with quality control ranges for ciprofloxacin, doxycycline, erythromycin, gentamicin, and meropenem

    DEFF Research Database (Denmark)

    McDermott, P. F.; Bodeis, S. M.; Aarestrup, Frank Møller

    2004-01-01

    -control (QC) strain. Minimal inhibitory concentration (MIC) QC ranges were determined for two incubation time/temperature combinations: 36°C for 48 hr and 42°C for 24 hr. Quality-control ranges were determined for ciprofloxacin, doxycycline, erythromycin, gentamicin, and meropenem. For all...

  4. Measurement network design including traveltime determinations to minimize model prediction uncertainty

    NARCIS (Netherlands)

    Janssen, G.M.C.M.; Valstar, J.R.; Zee, van der S.E.A.T.M.

    2008-01-01

    Traveltime determinations have found increasing application in the characterization of groundwater systems. No algorithms are available, however, to optimally design sampling strategies including this information type. We propose a first-order methodology to include groundwater age or tracer arrival

  5. Calendar Year 2009 Groundwater Monitoring Report, U.S. Department of Energy, Y-12 National Security Complex, Oak Ridge, Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    Elvado Environmental LLC

    2010-12-01

    surface water sampling and analysis activities implemented under the Y-12 GWPP including sampling locations and frequency; quality assurance (QA)/quality control (QC) sampling; sample collection and handling; field measurements and laboratory analytes; data management and data quality objective (DQO) evaluation; and groundwater elevation monitoring. However, this report does not include equivalent QA/QC or DQO evaluation information regarding the groundwater and surface water sampling and analysis activities associated with the monitoring programs implemented by BJC. Such details are deferred to the respective programmatic plans and reports issued by BJC (see Section 3.0). Collectively, the groundwater and surface water monitoring data obtained during CY 2009 by the Y-12 GWPP and BJC address DOE Order 450.1A (Environmental Protection Program) requirements for monitoring groundwater and surface water quality in areas: (1) which are, or could be, affected by operations at Y-12 (surveillance monitoring); and (2) where contaminants from Y-12 are most likely to migrate beyond the boundaries of the ORR (exit pathway/perimeter monitoring). Section 4 of this report presents a summary evaluation of the monitoring data with regard to the respective objectives of surveillance monitoring and exit pathway/perimeter monitoring, based on the analytical results for the principal groundwater contaminants at Y-12: nitrate, uranium, volatile organic compounds (VOCs), gross alpha activity, and gross beta activity. Section 5 of this report summarizes the most pertinent findings regarding the principal contaminants, along with recommendations proposed for ongoing groundwater and surface water quality monitoring performed under the Y-12 GWPP. Narrative sections of this report reference several appendices. Figures (maps and diagrams) and tables (excluding data summary tables presented in the narrative sections) are in Appendix A and Appendix B, respectively. 
Appendix C contains construction

  6. Calendar Year 2009 Groundwater Monitoring Report, U.S. Department of Energy, Y-12 National Security Complex, Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    2010-01-01

    groundwater and surface water sampling and analysis activities implemented under the Y-12 GWPP including sampling locations and frequency; quality assurance (QA)/quality control (QC) sampling; sample collection and handling; field measurements and laboratory analytes; data management and data quality objective (DQO) evaluation; and groundwater elevation monitoring. However, this report does not include equivalent QA/QC or DQO evaluation information regarding the groundwater and surface water sampling and analysis activities associated with the monitoring programs implemented by BJC. Such details are deferred to the respective programmatic plans and reports issued by BJC (see Section 3.0). Collectively, the groundwater and surface water monitoring data obtained during CY 2009 by the Y-12 GWPP and BJC address DOE Order 450.1A (Environmental Protection Program) requirements for monitoring groundwater and surface water quality in areas: (1) which are, or could be, affected by operations at Y-12 (surveillance monitoring); and (2) where contaminants from Y-12 are most likely to migrate beyond the boundaries of the ORR (exit pathway/perimeter monitoring). Section 4 of this report presents a summary evaluation of the monitoring data with regard to the respective objectives of surveillance monitoring and exit pathway/perimeter monitoring, based on the analytical results for the principal groundwater contaminants at Y-12: nitrate, uranium, volatile organic compounds (VOCs), gross alpha activity, and gross beta activity. Section 5 of this report summarizes the most pertinent findings regarding the principal contaminants, along with recommendations proposed for ongoing groundwater and surface water quality monitoring performed under the Y-12 GWPP. Narrative sections of this report reference several appendices. Figures (maps and diagrams) and tables (excluding data summary tables presented in the narrative sections) are in Appendix A and Appendix B, respectively. 
Appendix C contains

  7. New Approach Based on Compressive Sampling for Sample Rate Enhancement in DASs for Low-Cost Sensing Nodes

    Directory of Open Access Journals (Sweden)

    Francesco Bonavolontà

    2014-10-01

    Full Text Available The paper deals with the problem of improving the maximum sample rate of analog-to-digital converters (ADCs) included in low-cost wireless sensing nodes. To this aim, the authors propose an efficient acquisition strategy based on the combined use of a high-resolution time-basis and compressive sampling. In particular, the high-resolution time-basis is adopted to provide a proper sequence of random sampling instants, and a suitable software procedure, based on a compressive sampling approach, is exploited to reconstruct the signal of interest from the acquired samples. Thanks to the proposed strategy, the effective sample rate of the reconstructed signal can be as high as the frequency of the considered time-basis, thus significantly improving the inherent ADC sample rate. Several tests are carried out in simulated and real conditions to assess the performance of the proposed acquisition strategy in terms of reconstruction error. In particular, the results obtained in experimental tests with the ADCs included in actual 8- and 32-bit microcontrollers highlight the possibility of achieving an effective sample rate up to 50 times higher than the original ADC sample rate.
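    The reconstruction step described above (recovering a signal from a small set of randomly timed samples) is not detailed in the abstract. A minimal illustrative sketch of one standard compressive-sampling recovery method, orthogonal matching pursuit over a DCT basis, is given below; all names, sizes, and parameters are invented for illustration and are not taken from the paper.

```python
import numpy as np

def dct_basis(n):
    """Orthonormal DCT-II basis as the columns of an n x n matrix."""
    i = np.arange(n)
    phi = np.cos(np.pi * (i[:, None] + 0.5) * i[None, :] / n)
    phi[:, 0] *= np.sqrt(1.0 / n)
    phi[:, 1:] *= np.sqrt(2.0 / n)
    return phi

def omp(A, y, sparsity):
    """Orthogonal matching pursuit: greedy sparse solution of A s ~= y."""
    residual = y.copy()
    support = []
    col_norms = np.linalg.norm(A, axis=0)
    coef = np.zeros(0)
    for _ in range(sparsity):
        # Pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual) / col_norms))
        if j not in support:
            support.append(j)
        # Re-fit all selected coefficients by least squares.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    s = np.zeros(A.shape[1])
    s[support] = coef
    return s

rng = np.random.default_rng(0)
n, m, k = 256, 64, 3            # signal length, samples kept, sparsity
phi = dct_basis(n)

# A signal that is 3-sparse in the DCT domain.
s_true = np.zeros(n)
s_true[[10, 37, 80]] = [1.0, 0.8, 0.6]
x = phi @ s_true

# "Random sampling instants": keep only m of the n time samples.
idx = np.sort(rng.choice(n, size=m, replace=False))
y = x[idx]

# Reconstruct the full signal from the sub-sampled measurements.
s_hat = omp(phi[idx, :], y, k)
x_hat = phi @ s_hat
rel_err = np.linalg.norm(x_hat - x) / np.linalg.norm(x)
```

    Once the correct sparse support is identified, the least-squares fit reproduces the signal essentially exactly; the "effective sample rate" gain in this toy case is n/m = 4x the number of stored samples.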

  8. Quality control of the software in the JT-60 computer control system

    International Nuclear Information System (INIS)

    Isaji, Nobuaki; Kurihara, Kenichi; Kimura, Toyoaki

    1990-07-01

    The JT-60 Control System should be improved in step with experimental requirements. In order to preserve the integrity of the system through such modifications, the concept of quality control (QC) was introduced into the software development. The QC activities were: (1) establishing standard procedures for software development; (2) developing support tools for grasping the present status of the program structure; and (3) developing a documentation system and a source program management system. This paper reports on these QC activities and their problems in the JT-60 control system. (author)

  9. Ball assisted device for analytical surface sampling

    Science.gov (United States)

    ElNaggar, Mariam S; Van Berkel, Gary J; Covey, Thomas R

    2015-11-03

    A system for sampling a surface includes a sampling probe having a housing and a socket, and a rolling sampling sphere within the socket. The housing has a sampling fluid supply conduit and a sampling fluid exhaust conduit. The sampling fluid supply conduit supplies sampling fluid to the sampling sphere. The sampling fluid exhaust conduit has an inlet opening for receiving sampling fluid carried from the surface by the sampling sphere. A surface sampling probe and a method for sampling a surface are also disclosed.

  10. Does job satisfaction mediate the relationship between healthy work environment and care quality?

    Science.gov (United States)

    Bai, Jinbing

    2016-01-01

    A healthy work environment can increase nurse-reported job satisfaction and patient care outcomes. Yet the associations between a healthy work environment, nurse job satisfaction and QC have not been comprehensively examined in Chinese ICUs. To investigate the mediating effect of nurse job satisfaction on the relationship between a healthy work environment and nurse-reported quality of care (QC) in Chinese intensive care units (ICUs). A total of 706 nurses were recruited from 28 ICUs of 14 tertiary hospitals. The nurses completed self-reported questionnaires evaluating the healthy work environment, job satisfaction and quality of patient care. Mediation analysis was conducted to explore the mediating effect between the nurse-reported healthy work environment and QC. The nurse work environment showed positive correlations with nurse-reported QC in the ICUs. Nurse-reported job satisfaction fully mediated the relationship between a healthy work environment and QC in the medical-surgical ICUs, surgical ICUs and neonatal/paediatric ICUs, and partially mediated it in the medical ICUs. The significant mediating effect of nurse job satisfaction suggests that this mediator can be targeted to improve nurse and patient care outcomes. Nurse administrators can design interventions that address this mediating factor to improve the work environment and patient care outcomes. © 2015 British Association of Critical Care Nurses.
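    The mediation analysis described above can be illustrated with the standard single-mediator linear decomposition, in which the total effect of X on Y splits exactly into a direct effect c' plus an indirect effect a·b carried through the mediator. The sketch below uses simulated data only; all variable names and coefficients are invented and are not the study's data.

```python
import numpy as np

def ols(design, y):
    """Least-squares coefficients for y ~ design (first column = intercept)."""
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    return beta

rng = np.random.default_rng(42)
n = 500
ones = np.ones(n)

# Simulated scores: x = work environment, m = job satisfaction, y = care quality.
x = rng.normal(size=n)
m = 0.6 * x + rng.normal(scale=0.5, size=n)            # a-path
y = 0.4 * m + 0.2 * x + rng.normal(scale=0.5, size=n)  # b-path and direct path

# Path a: M ~ X
a = ols(np.column_stack([ones, x]), m)[1]
# Direct path c' and path b: Y ~ X + M
_, c_prime, b = ols(np.column_stack([ones, x, m]), y)
# Total effect c: Y ~ X
c = ols(np.column_stack([ones, x]), y)[1]

indirect = a * b  # effect of X on Y carried through the mediator
```

    In the single-mediator linear OLS case the identity c = c' + a·b holds exactly in sample, which makes it a convenient self-check when implementing a mediation analysis.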

  11. Characterisation of imperial college reactor centre legacy waste using gamma-ray spectrometry

    International Nuclear Information System (INIS)

    Shuhaimi, Alif Imran Mohd

    2016-01-01

    Waste characterisation is a principal component of a waste management strategy. The characterisation includes identification of the chemical, physical and radiochemical parameters of radioactive waste. Failure to determine specific waste properties may result in sentencing waste packages that are not compliant with the regulations for long-term storage or disposal. This project involved measurement of the intensity and energy of gamma photons that may be emitted by radioactive waste generated during decommissioning of the Imperial College Reactor Centre (ICRC). The measurements used a High Purity Germanium (HPGe) gamma-ray detector and ISOTOPIC-32 V4.1 as the analyser. In order to ensure that the measurements provide reliable results, two quality control (QC) measurements using different matrices were conducted. The results of the QC measurements were used to determine the accuracy of the ISOTOPIC software.

  12. The April 1994 and October 1994 radon intercomparisons at EML

    International Nuclear Information System (INIS)

    Fisenne, I.M.; George, A.C.; Perry, P.M.; Keller, H.W.

    1995-10-01

    Quality assurance/quality control (QA/QC) is the backbone of many commercial and research processes and programs. QA/QC research tests the state of a functioning system, be it the production of manufactured goods or the ability to make accurate and precise measurements. The quality of radon measurements in the US has been tested under controlled conditions in semi-annual radon gas intercomparison exercises sponsored by the Environmental Measurements Laboratory (EML) since 1981. The two Calendar Year 1994 radon gas intercomparison exercises were conducted in the EML exposure chamber. Thirty-two groups, including US Federal facilities, USDOE contractors, national and state laboratories, universities and foreign institutions, participated in these exercises. The majority of the participants' results were within ±10% of the EML value at radon concentrations of 570 and 945 Bq m⁻³.

  13. Characterisation of imperial college reactor centre legacy waste using gamma-ray spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Shuhaimi, Alif Imran Mohd [Nuclear Energy Department, Regulatory Economics & Planning Division, Tenaga Nasional Berhad (Malaysia)

    2016-01-22

    Waste characterisation is a principal component of a waste management strategy. The characterisation includes identification of the chemical, physical and radiochemical parameters of radioactive waste. Failure to determine specific waste properties may result in sentencing waste packages that are not compliant with the regulations for long-term storage or disposal. This project involved measurement of the intensity and energy of gamma photons that may be emitted by radioactive waste generated during decommissioning of the Imperial College Reactor Centre (ICRC). The measurements used a High Purity Germanium (HPGe) gamma-ray detector and ISOTOPIC-32 V4.1 as the analyser. In order to ensure that the measurements provide reliable results, two quality control (QC) measurements using different matrices were conducted. The results of the QC measurements were used to determine the accuracy of the ISOTOPIC software.

  14. Development of Quantitative Competitive PCR and Absolute Based Real-Time PCR Assays for Quantification of The Butyrate Producing Bacterium: Butyrivibrio fibrisolvens

    Directory of Open Access Journals (Sweden)

    Mojtaba Tahmoorespur

    2016-04-01

    Full Text Available Introduction Butyrivibrio fibrisolvens strains are presently recognized as the major butyrate-producing bacteria found in the rumen and digestive tract of many animals and also in the human gut. In this study we report the development of two DNA-based techniques, quantitative competitive (QC) PCR and absolute-based real-time PCR, for enumerating Butyrivibrio fibrisolvens strains. Despite the recent introduction of the real-time PCR method for rapid quantification of target DNA sequences, the quantitative competitive PCR (QC-PCR) technique continues to play an important role in nucleic acid quantification, since it is more cost effective. The procedure relies on the co-amplification of the sequence of interest with a serially diluted synthetic DNA fragment of known concentration (the competitor), using a single primer set. Real-time PCR is a molecular biology laboratory technique based on the polymerase chain reaction (PCR) that monitors the amplification of a targeted DNA molecule during the reaction. Materials and Methods First, previously reported species-specific primers targeting the 16S rDNA region of the bacterium Butyrivibrio fibrisolvens were used to amplify a 213 bp fragment. A DNA competitor differing by 50 bp in length from the 213 bp fragment was constructed and cloned into the pTZ57R/T vector. The competitor was quantified with a NanoDrop spectrophotometer, serially diluted, and co-amplified by PCR with total DNA extracted from rumen fluid samples. PCR products were quantified by photographing agarose gels and analysed with ImageJ software, and the amount of amplified target DNA was log-plotted against the amount of amplified competitor. The coefficient of determination (R2) was used as a criterion of methodology precision. For the real-time PCR technique, the 213 bp fragment amplified and cloned into pTZ57R/T was used to generate a standard curve. Results and Discussion The specific primers of Butyrivibrio
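    The absolute real-time PCR quantification described above rests on a standard curve: Ct is linear in log10(copy number), and a slope of about −3.32 corresponds to 100% amplification efficiency (perfect doubling each cycle). A minimal sketch of the standard-curve arithmetic is shown below; all Ct values and copy numbers are synthetic, invented for illustration only.

```python
import numpy as np

# Synthetic standard curve: serial dilutions of a cloned fragment with known
# copy numbers and idealized Ct values (numbers invented for illustration).
copies = np.array([1e7, 1e6, 1e5, 1e4, 1e3, 1e2])
slope_true, intercept_true = -3.3219, 38.0   # slope -3.3219 => 100% efficiency
ct = intercept_true + slope_true * np.log10(copies)

# Fit the standard curve: Ct = intercept + slope * log10(copies).
slope, intercept = np.polyfit(np.log10(copies), ct, 1)

# Amplification efficiency from the slope (1.0 means perfect doubling).
efficiency = 10 ** (-1.0 / slope) - 1.0

# Quantify an unknown sample from its measured Ct via the fitted curve.
ct_unknown = 24.7
copies_unknown = 10 ** ((ct_unknown - intercept) / slope)
```

    The same regression also serves as a QC check on the assay itself: slopes far from −3.32 (efficiencies far from 100%) indicate inhibition or poor primer performance.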

  15. Quality Assurance of Real-Time Oceanographic Data from the Cabled Array of the Ocean Observatories Initiative

    Science.gov (United States)

    Kawka, O. E.; Nelson, J. S.; Manalang, D.; Kelley, D. S.

    2016-02-01

    The Cabled Array component of the NSF-funded Ocean Observatories Initiative (OOI) provides access to real-time physical, chemical, geological, and biological data from water column and seafloor platforms/instruments at sites spanning the southern half of the Juan de Fuca Plate. The Quality Assurance (QA) program for OOI data is designed to ensure that data products meet OOI science requirements. This overall data QA plan establishes the guidelines for assuring OOI data quality and summarizes Quality Control (QC) protocols and procedures, based on best practices, which can be utilized to ensure the highest quality data across the OOI program. This presentation will highlight, specifically, the QA/QC approach being utilized for the OOI Cabled Array infrastructure and data and will include a summary of both shipboard and shore-based protocols currently in use. Aspects addressed will be pre-deployment instrument testing and calibration checks, post-deployment and pre-recovery field verification of data, and post-recovery "as-found" testing of instruments. Examples of QA/QC data will be presented, and specific cases of cabled data will be discussed in the context of quality assessment and the adjustment/correction of OOI datasets for inherent sensor drift and/or instrument fouling.
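    One common way to implement the kind of post-recovery drift adjustment described above is to interpolate between the pre-deployment calibration offset and the post-recovery "as-found" offset, then subtract the interpolated offset from the time series. The sketch below is illustrative only: the function name, the offsets, and the linear-drift assumption are ours, not the OOI procedure.

```python
import numpy as np

def correct_linear_drift(t, raw, t_deploy, off_deploy, t_recover, off_recover):
    """Subtract a sensor offset assumed to drift linearly between the
    pre-deployment and post-recovery ("as-found") calibration checks."""
    offset = np.interp(t, [t_deploy, t_recover], [off_deploy, off_recover])
    return raw - offset

# Simulated one-year deployment: a sensor reading drifts by +0.5 units.
t = np.linspace(0.0, 365.0, 1000)                 # days since deployment
truth = 10.0 + 0.1 * np.sin(2 * np.pi * t / 30)   # "real" environmental signal
drift = 0.5 * t / 365.0                           # linear sensor drift
raw = truth + drift

# Offsets measured at deployment (0.0) and at the as-found check (+0.5).
corrected = correct_linear_drift(t, raw, 0.0, 0.0, 365.0, 0.5)
max_err = np.max(np.abs(corrected - truth))
```

    Real corrections may need nonlinear drift models or fouling-onset detection, but the bracketing pre/post calibration checks are what make any such adjustment defensible.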

  16. PACS 2000: quality control using the task allocation chart

    Science.gov (United States)

    Norton, Gary S.; Romlein, John R.; Lyche, David K.; Richardson, Ronald R., Jr.

    2000-05-01

    Medical imaging's technological evolution in the next century will continue to include Picture Archiving and Communication Systems (PACS) and teleradiology. It is difficult to predict radiology's future in the new millennium, with both computed radiography and direct digital capture competing as the primary image acquisition methods for routine radiography. Changes in Computed Axial Tomography (CT) and Magnetic Resonance Imaging (MRI) continue to amaze the healthcare community. No matter how the acquisition, display, and archive functions change, Quality Control (QC) of the radiographic imaging chain will remain an important step in the imaging process. The Task Allocation Chart (TAC) is a tool that can be used in a medical facility's QC process to indicate the testing responsibilities of the image stakeholders and the medical informatics department. The TAC shows a grid of equipment to be serviced, tasks to be performed, and the organization assigned to perform each task. Additionally, skills, tasks, time, and references for each task can be provided. QC of the PACS must be stressed as a primary element of a PACS implementation. The TAC can be used to clarify responsibilities during warranty and paid maintenance periods. Establishing a TAC as part of a PACS implementation has a positive effect on patient care and clinical acceptance.

  17. MO-AB-207-00: ACR Update in MR, CT, Nuclear Medicine, and Mammography

    International Nuclear Information System (INIS)

    2015-01-01

    A goal of an imaging accreditation program is to ensure adequate image quality, verify appropriate staff qualifications, and assure patient and personnel safety. Currently, more than 35,000 facilities in 10 modalities have been accredited by the American College of Radiology (ACR), making the ACR program one of the most prolific accreditation options in the U.S. In addition, ACR accreditation is one of the accepted accreditations required by some state laws, CMS/MIPPA insurance requirements and others. Familiarity with the ACR accreditation process is therefore essential to clinical diagnostic medical physicists. Maintaining sufficient knowledge of the ACR program must include keeping up-to-date as the various modality requirements are refined to better serve the goals of the program and to accommodate newer technologies and practices. This session consists of presentations from authorities in four ACR accreditation modality programs: magnetic resonance imaging, computed tomography, nuclear medicine, and mammography. Each speaker will discuss the general components of the modality program and address any recent changes to the requirements. Learning Objectives: To understand the requirements of the ACR MR Accreditation program. The discussion will include accreditation of whole-body general-purpose magnets and dedicated extremity systems, as well as breast MRI accreditation. Anticipated updates to the ACR MRI Quality Control Manual will also be reviewed. To understand the requirements of the ACR CT accreditation program, including updates to the QC manual as well as updates through the FAQ process. To understand the requirements of the ACR nuclear medicine accreditation program, and the role of the physicist in annual equipment surveys and the setup and supervision of the routine QC program. To understand the current ACR MAP Accreditation requirement and the concepts and structure of the forthcoming ACR Digital Mammography QC Manual and Program.

  18. MO-AB-207-03: ACR Update in Nuclear Medicine

    International Nuclear Information System (INIS)

    Harkness, B.

    2015-01-01

    A goal of an imaging accreditation program is to ensure adequate image quality, verify appropriate staff qualifications, and assure patient and personnel safety. Currently, more than 35,000 facilities in 10 modalities have been accredited by the American College of Radiology (ACR), making the ACR program one of the most prolific accreditation options in the U.S. In addition, ACR accreditation is one of the accepted accreditations required by some state laws, CMS/MIPPA insurance requirements and others. Familiarity with the ACR accreditation process is therefore essential to clinical diagnostic medical physicists. Maintaining sufficient knowledge of the ACR program must include keeping up-to-date as the various modality requirements are refined to better serve the goals of the program and to accommodate newer technologies and practices. This session consists of presentations from authorities in four ACR accreditation modality programs: magnetic resonance imaging, computed tomography, nuclear medicine, and mammography. Each speaker will discuss the general components of the modality program and address any recent changes to the requirements. Learning Objectives: To understand the requirements of the ACR MR Accreditation program. The discussion will include accreditation of whole-body general-purpose magnets and dedicated extremity systems, as well as breast MRI accreditation. Anticipated updates to the ACR MRI Quality Control Manual will also be reviewed. To understand the requirements of the ACR CT accreditation program, including updates to the QC manual as well as updates through the FAQ process. To understand the requirements of the ACR nuclear medicine accreditation program, and the role of the physicist in annual equipment surveys and the setup and supervision of the routine QC program. To understand the current ACR MAP Accreditation requirement and the concepts and structure of the forthcoming ACR Digital Mammography QC Manual and Program.

  19. MO-AB-207-04: ACR Update in Mammography

    International Nuclear Information System (INIS)

    Berns, E.

    2015-01-01

    A goal of an imaging accreditation program is to ensure adequate image quality, verify appropriate staff qualifications, and assure patient and personnel safety. Currently, more than 35,000 facilities in 10 modalities have been accredited by the American College of Radiology (ACR), making the ACR program one of the most prolific accreditation options in the U.S. In addition, ACR accreditation is one of the accepted accreditations required by some state laws, CMS/MIPPA insurance requirements and others. Familiarity with the ACR accreditation process is therefore essential to clinical diagnostic medical physicists. Maintaining sufficient knowledge of the ACR program must include keeping up-to-date as the various modality requirements are refined to better serve the goals of the program and to accommodate newer technologies and practices. This session consists of presentations from authorities in four ACR accreditation modality programs: magnetic resonance imaging, computed tomography, nuclear medicine, and mammography. Each speaker will discuss the general components of the modality program and address any recent changes to the requirements. Learning Objectives: To understand the requirements of the ACR MR Accreditation program. The discussion will include accreditation of whole-body general-purpose magnets and dedicated extremity systems, as well as breast MRI accreditation. Anticipated updates to the ACR MRI Quality Control Manual will also be reviewed. To understand the requirements of the ACR CT accreditation program, including updates to the QC manual as well as updates through the FAQ process. To understand the requirements of the ACR nuclear medicine accreditation program, and the role of the physicist in annual equipment surveys and the setup and supervision of the routine QC program. To understand the current ACR MAP Accreditation requirement and the concepts and structure of the forthcoming ACR Digital Mammography QC Manual and Program.

  20. MO-AB-207-02: ACR Update in MR

    International Nuclear Information System (INIS)

    Price, R.

    2015-01-01

    A goal of an imaging accreditation program is to ensure adequate image quality, verify appropriate staff qualifications, and assure patient and personnel safety. Currently, more than 35,000 facilities in 10 modalities have been accredited by the American College of Radiology (ACR), making the ACR program one of the most prolific accreditation options in the U.S. In addition, ACR accreditation is one of the accepted accreditations required by some state laws, CMS/MIPPA insurance requirements and others. Familiarity with the ACR accreditation process is therefore essential to clinical diagnostic medical physicists. Maintaining sufficient knowledge of the ACR program must include keeping up-to-date as the various modality requirements are refined to better serve the goals of the program and to accommodate newer technologies and practices. This session consists of presentations from authorities in four ACR accreditation modality programs: magnetic resonance imaging, computed tomography, nuclear medicine, and mammography. Each speaker will discuss the general components of the modality program and address any recent changes to the requirements. Learning Objectives: To understand the requirements of the ACR MR Accreditation program. The discussion will include accreditation of whole-body general-purpose magnets and dedicated extremity systems, as well as breast MRI accreditation. Anticipated updates to the ACR MRI Quality Control Manual will also be reviewed. To understand the requirements of the ACR CT accreditation program, including updates to the QC manual as well as updates through the FAQ process. To understand the requirements of the ACR nuclear medicine accreditation program, and the role of the physicist in annual equipment surveys and the setup and supervision of the routine QC program. To understand the current ACR MAP Accreditation requirement and the concepts and structure of the forthcoming ACR Digital Mammography QC Manual and Program.

  1. MO-AB-207-04: ACR Update in Mammography

    Energy Technology Data Exchange (ETDEWEB)

    Berns, E. [University of Colorado Health Science (United States)

    2015-06-15

    A goal of an imaging accreditation program is to ensure adequate image quality, verify appropriate staff qualifications, and assure patient and personnel safety. Currently, more than 35,000 facilities in 10 modalities have been accredited by the American College of Radiology (ACR), making the ACR program one of the most prolific accreditation options in the U.S. In addition, ACR accreditation is one of the accepted accreditations required by some state laws, CMS/MIPPA insurance requirements and others. Familiarity with the ACR accreditation process is therefore essential to clinical diagnostic medical physicists. Maintaining sufficient knowledge of the ACR program must include keeping up-to-date as the various modality requirements are refined to better serve the goals of the program and to accommodate newer technologies and practices. This session consists of presentations from authorities in four ACR accreditation modality programs: magnetic resonance imaging, computed tomography, nuclear medicine, and mammography. Each speaker will discuss the general components of the modality program and address any recent changes to the requirements. Learning Objectives: To understand the requirements of the ACR MR Accreditation program. The discussion will include accreditation of whole-body general-purpose magnets and dedicated extremity systems, as well as breast MRI accreditation. Anticipated updates to the ACR MRI Quality Control Manual will also be reviewed. To understand the requirements of the ACR CT accreditation program, including updates to the QC manual as well as updates through the FAQ process. To understand the requirements of the ACR nuclear medicine accreditation program, and the role of the physicist in annual equipment surveys and the setup and supervision of the routine QC program. To understand the current ACR MAP Accreditation requirement and the concepts and structure of the forthcoming ACR Digital Mammography QC Manual and Program.

  2. MO-AB-207-00: ACR Update in MR, CT, Nuclear Medicine, and Mammography

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2015-06-15

    A goal of an imaging accreditation program is to ensure adequate image quality, verify appropriate staff qualifications, and assure patient and personnel safety. Currently, more than 35,000 facilities in 10 modalities have been accredited by the American College of Radiology (ACR), making the ACR program one of the most prolific accreditation options in the U.S. In addition, ACR accreditation is one of the accepted accreditations required by some state laws, CMS/MIPPA insurance requirements and others. Familiarity with the ACR accreditation process is therefore essential to clinical diagnostic medical physicists. Maintaining sufficient knowledge of the ACR program must include keeping up-to-date as the various modality requirements are refined to better serve the goals of the program and to accommodate newer technologies and practices. This session consists of presentations from authorities in four ACR accreditation modality programs: magnetic resonance imaging, computed tomography, nuclear medicine, and mammography. Each speaker will discuss the general components of the modality program and address any recent changes to the requirements. Learning Objectives: To understand the requirements of the ACR MR Accreditation program. The discussion will include accreditation of whole-body general-purpose magnets and dedicated extremity systems, as well as breast MRI accreditation. Anticipated updates to the ACR MRI Quality Control Manual will also be reviewed. To understand the requirements of the ACR CT accreditation program, including updates to the QC manual as well as updates through the FAQ process. To understand the requirements of the ACR nuclear medicine accreditation program, and the role of the physicist in annual equipment surveys and the setup and supervision of the routine QC program. To understand the current ACR MAP Accreditation requirement and the concepts and structure of the forthcoming ACR Digital Mammography QC Manual and Program.

  3. MO-AB-207-02: ACR Update in MR

    Energy Technology Data Exchange (ETDEWEB)

    Price, R. [Vanderbilt Medical Center (United States)

    2015-06-15

    A goal of an imaging accreditation program is to ensure adequate image quality, verify appropriate staff qualifications, and assure patient and personnel safety. Currently, more than 35,000 facilities in 10 modalities have been accredited by the American College of Radiology (ACR), making the ACR program one of the most prolific accreditation options in the U.S. In addition, ACR accreditation is one of the accepted accreditations required by some state laws, CMS/MIPPA insurance requirements and others. Familiarity with the ACR accreditation process is therefore essential to clinical diagnostic medical physicists. Maintaining sufficient knowledge of the ACR program must include keeping up-to-date as the various modality requirements are refined to better serve the goals of the program and to accommodate newer technologies and practices. This session consists of presentations from authorities in four ACR accreditation modality programs: magnetic resonance imaging, computed tomography, nuclear medicine, and mammography. Each speaker will discuss the general components of the modality program and address any recent changes to the requirements. Learning Objectives: To understand the requirements of the ACR MR Accreditation program. The discussion will include accreditation of whole-body general-purpose magnets and dedicated extremity systems, as well as breast MRI accreditation. Anticipated updates to the ACR MRI Quality Control Manual will also be reviewed. To understand the requirements of the ACR CT accreditation program, including updates to the QC manual as well as updates through the FAQ process. To understand the requirements of the ACR nuclear medicine accreditation program, and the role of the physicist in annual equipment surveys and the setup and supervision of the routine QC program. To understand the current ACR MAP Accreditation requirement and the concepts and structure of the forthcoming ACR Digital Mammography QC Manual and Program.

  4. MO-AB-207-03: ACR Update in Nuclear Medicine

    Energy Technology Data Exchange (ETDEWEB)

    Harkness, B. [Henry Ford Hospital System (United States)

    2015-06-15

    A goal of an imaging accreditation program is to ensure adequate image quality, verify appropriate staff qualifications, and assure patient and personnel safety. Currently, more than 35,000 facilities in 10 modalities have been accredited by the American College of Radiology (ACR), making the ACR program one of the most prolific accreditation options in the U.S. In addition, ACR accreditation is one of the accepted accreditations required by some state laws, CMS/MIPPA insurance requirements and others. Familiarity with the ACR accreditation process is therefore essential to clinical diagnostic medical physicists. Maintaining sufficient knowledge of the ACR program must include keeping up-to-date as the various modality requirements are refined to better serve the goals of the program and to accommodate newer technologies and practices. This session consists of presentations from authorities in four ACR accreditation modality programs: magnetic resonance imaging, computed tomography, nuclear medicine, and mammography. Each speaker will discuss the general components of the modality program and address any recent changes to the requirements. Learning Objectives: To understand the requirements of the ACR MR Accreditation program. The discussion will include accreditation of whole-body general-purpose magnets and dedicated extremity systems, as well as breast MRI accreditation. Anticipated updates to the ACR MRI Quality Control Manual will also be reviewed. To understand the requirements of the ACR CT accreditation program, including updates to the QC manual as well as updates through the FAQ process. To understand the requirements of the ACR nuclear medicine accreditation program, and the role of the physicist in annual equipment surveys and the setup and supervision of the routine QC program. To understand the current ACR MAP Accreditation requirement and the concepts and structure of the forthcoming ACR Digital Mammography QC Manual and Program.

  5. MO-AB-207-01: ACR Update in CT

    Energy Technology Data Exchange (ETDEWEB)

    McNitt-Gray, M. [UCLA School of Medicine (United States)]

    2015-06-15

    A goal of an imaging accreditation program is to ensure adequate image quality, verify appropriate staff qualifications, and to assure patient and personnel safety. Currently, more than 35,000 facilities in 10 modalities have been accredited by the American College of Radiology (ACR), making the ACR program one of the most prolific accreditation options in the U.S. In addition, ACR is one of the accepted accreditations required by some state laws, CMS/MIPPA insurance and others. Familiarity with the ACR accreditation process is therefore essential to clinical diagnostic medical physicists. Maintaining sufficient knowledge of the ACR program must include keeping up-to-date as the various modality requirements are refined to better serve the goals of the program and to accommodate newer technologies and practices. This session consists of presentations from authorities in four ACR accreditation modality programs, including magnetic resonance imaging, computed tomography, nuclear medicine, and mammography. Each speaker will discuss the general components of the modality program and address any recent changes to the requirements. Learning Objectives: To understand the requirements of the ACR MR Accreditation program. The discussion will include accreditation of whole-body general purpose magnets and dedicated extremity systems, as well as breast MRI accreditation. Anticipated updates to the ACR MRI Quality Control Manual will also be reviewed. To understand the requirements of the ACR CT accreditation program, including updates to the QC manual as well as updates through the FAQ process. To understand the requirements of the ACR nuclear medicine accreditation program, and the role of the physicist in annual equipment surveys and the setup and supervision of the routine QC program. To understand the current ACR MAP Accreditation requirement and present the concepts and structure of the forthcoming ACR Digital Mammography QC Manual and Program.

  6. Science and technology of kernels and TRISO coated particle sorting

    International Nuclear Information System (INIS)

    Nothnagel, G.

    2006-09-01

    The ~1 mm diameter TRISO coated particles, which form the elemental units of PBMR nuclear fuel, have to be close to spherical in order to best survive damage during sphere pressing. Spherical silicon carbide layers further provide the strongest miniature pressure vessels for fission product retention. To make sure that the final product contains particles of acceptable shape, 100% of kernels and coated particles have to be sorted on a surface-ground sorting table. Broken particles, twins, irregular (odd) shapes and extreme ellipsoids have to be separated from the final kernel and coated particle batches. Proper sorting of particles is an extremely important step in quality fuel production, as the final failure fraction depends sensitively on the quality of sorting. After sorting, a statistically significant sample of the sorted product is analysed for sphericity, which is defined as the ratio of maximum to minimum diameter, as part of a standard QC test to ensure conformance to German specifications. In addition, a burn-leach test is done on coated particles (before pressing) and fuel spheres (after pressing) to ensure adherence to failure specifications. Because of the extreme importance of particle sorting for assurance of fuel quality, it is essential to have an in-depth understanding of the capabilities and limitations of particle sorting. In this report, a systematic scientific rationale is developed, from fundamental principles, to provide a basis for understanding the relationship between product quality and sorting parameters. The principles and concepts developed in this report will be of importance when future sorting tables (or equivalents) are to be designed. A number of new concepts and methodologies are developed to assist with equivalence validation of any two sorting tables. This is aimed in particular towards quantitative assessment of equivalence between current QC tables (closely based on the original NUKEM parameters, except for the driving mechanism
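The sphericity acceptance test described above (ratio of maximum to minimum diameter, evaluated on a sampled fraction of the sorted batch) can be sketched in a few lines. The acceptance limit and the diameters below are purely illustrative assumptions, not values from the German specification.

```python
def sphericity(d_max, d_min):
    """Sphericity as defined in the text: ratio of maximum to minimum diameter."""
    if d_min <= 0:
        raise ValueError("minimum diameter must be positive")
    return d_max / d_min

def batch_pass_fraction(particles, limit=1.05):
    """Fraction of a sampled batch whose sphericity is within an acceptance
    limit. The default limit is illustrative only; the real QC specification
    value is not given in the abstract."""
    ok = sum(1 for d_max, d_min in particles if sphericity(d_max, d_min) <= limit)
    return ok / len(particles)

# Hypothetical sampled particle diameters in mm (d_max, d_min)
sample = [(1.00, 0.99), (1.10, 1.00), (1.02, 1.01)]
print(batch_pass_fraction(sample))  # the 1.10/1.00 particle fails the limit
```

A real implementation would read diameters from image analysis of the sorted sample rather than a hand-written list.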

  7. Reference samples for the earth sciences

    Science.gov (United States)

    Flanagan, F.J.

    1974-01-01

    A revised list of reference samples of interest to geoscientists has been extended to include samples for the agronomist, the archaeologist and the environmentalist. In addition to the source from which standard samples may be obtained, references or pertinent notes for some samples are included. The number of rock reference samples is now almost adequate, and the variety of ore samples will soon be sufficient. There are very few samples for microprobe work. Oil shales will become more important because of the outlook for world petroleum resources. The dryland equivalent of a submarine basalt might be useful in studies of sea-floor spreading and of the geochemistry of basalts. The Na- and K-feldspars of BCS (British Chemical Standards-Bureau of Analysed Samples), NBS (National Bureau of Standards), and ANRT (Association Nationale de la Recherche Technique) could serve as trace-element standards if such data were available. Similarly, the present NBS flint and plastic clays, as well as their predecessors, might be useful for archaeological pottery studies. The International Decade for Ocean Exploration may stimulate the preparation of ocean-water standards for trace elements or pollutants and a standard for manganese nodules. © 1974.

  8. AcEST: DK943732 [AcEST

    Lifescience Database Archive (English)

    Full Text Available NTPSRLDLRKFCRYCHKHTI 60 Query: 392 HKE 400 + E Sbjct: 61 YGE 63 >tr|A9QC76|A9QC76_TRACE 50S ribosomal protein L33 (Fragment) OS=Trach...elium caeruleum GN=rpl33 PE=3 SV=1 Length = 64 Score = 8

  9. 7 CFR 272.1 - General terms and conditions.

    Science.gov (United States)

    2010-01-01

    ... participating State or political subdivision shall decrease any assistance otherwise provided an individual or... the conversion process as required), shall be subject to standard QC review procedures. When the QC... shall issue press releases to the news media advising of the impending program changes. (v) For the...

  10. Photochemical Energy Storage and Electrochemically Triggered Energy Release in the Norbornadiene-Quadricyclane System: UV Photochemistry and IR Spectroelectrochemistry in a Combined Experiment.

    Science.gov (United States)

    Brummel, Olaf; Waidhas, Fabian; Bauer, Udo; Wu, Yanlin; Bochmann, Sebastian; Steinrück, Hans-Peter; Papp, Christian; Bachmann, Julien; Libuda, Jörg

    2017-07-06

    The two valence isomers norbornadiene (NBD) and quadricyclane (QC) enable solar energy storage in a single-molecule system. We present a new photoelectrochemical infrared reflection absorption spectroscopy (PEC-IRRAS) experiment, which allows monitoring of the complete energy storage and release cycle by in situ vibrational spectroscopy. Both processes were investigated: the photochemical conversion from NBD to QC using the photosensitizer 4,4'-bis(dimethylamino)benzophenone (Michler's ketone, MK), and the electrochemically triggered cycloreversion from QC to NBD. Photochemical conversion was obtained with characteristic conversion times on the order of 500 ms. All experiments were performed under full potential control in a thin-layer configuration with a Pt(111) working electrode. The vibrational spectra of NBD, QC, and MK were analyzed in the fingerprint region, permitting quantitative analysis of the spectroscopic data. We determined selectivities for both the photochemical conversion and the electrochemical cycloreversion and identified the critical steps that limit the reversibility of the storage cycle.

  11. 40 CFR 141.802 - Coliform sampling plan.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 22 2010-07-01 2010-07-01 false Coliform sampling plan. 141.802... sampling plan. (a) Each air carrier under this subpart must develop a coliform sampling plan covering each... required actions, including repeat and follow-up sampling, corrective action, and notification of...

  12. Machine-Specific Magnetic Resonance Imaging Quality Control Procedures for Stereotactic Radiosurgery Treatment Planning.

    Science.gov (United States)

    Fatemi, Ali; Taghizadeh, Somayeh; Yang, Claus Chunli; R Kanakamedala, Madhava; Morris, Bart; Vijayakumar, Srinivasan

    2017-12-18

    Purpose Magnetic resonance (MR) images are necessary for accurate contouring of intracranial targets, determination of gross target volume and evaluation of organs at risk during stereotactic radiosurgery (SRS) treatment planning procedures. Many centers use magnetic resonance imaging (MRI) simulators or regular diagnostic MRI machines for SRS treatment planning; while both types of machine require two stages of quality control (QC), both machine- and patient-specific, before use for SRS, no accepted guidelines for such QC currently exist. This article describes appropriate machine-specific QC procedures for SRS applications. Methods and materials We describe the adaptation of American College of Radiology (ACR)-recommended QC tests using an ACR MRI phantom for SRS treatment planning. In addition, commercial Quasar MRID 3D and Quasar GRID 3D phantoms were used to evaluate the effects of static magnetic field (B0) inhomogeneity, gradient nonlinearity, and a Leksell G frame (SRS frame) and its accessories on geometrical distortion in MR images. Results QC procedures found distortions in the X-direction (maximum = 3.5 mm, mean = 0.91 mm, standard deviation = 0.67 mm, >2.5 mm: 2%), in the Y-direction (maximum = 2.51 mm, mean = 0.52 mm, standard deviation = 0.39 mm, >2.5 mm: 0%), and in the Z-direction (maximum = 13.1 mm, mean = 2.38 mm, standard deviation = 2.45 mm, >2.5 mm: 34%), with <1 mm distortion at a head-sized region of interest. MR images acquired using a Leksell G frame and localization devices showed a mean absolute deviation of 2.3 mm from isocenter. The results of modified ACR tests were all within recommended limits, and baseline measurements have been defined for regular weekly QC tests. Conclusions With appropriate QC procedures in place, it is possible to routinely obtain clinically useful MR images suitable for SRS treatment planning purposes. MRI examination for SRS planning can benefit from the improved localization and planning
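The per-axis distortion metrics reported above (maximum, mean, standard deviation, and percentage of control points beyond a 2.5 mm tolerance) are straightforward to compute from a list of measured displacement magnitudes. A minimal sketch, using made-up displacement values rather than the study's data:

```python
from statistics import mean, pstdev

def distortion_summary(displacements_mm, tol_mm=2.5):
    """Summarize geometric distortion measurements the way the report does:
    maximum, mean, standard deviation, and percentage beyond a tolerance."""
    n = len(displacements_mm)
    return {
        "max_mm": max(displacements_mm),
        "mean_mm": mean(displacements_mm),
        "sd_mm": pstdev(displacements_mm),
        "pct_over_tol": 100.0 * sum(d > tol_mm for d in displacements_mm) / n,
    }

# Illustrative phantom control-point displacements (mm), not the study's data
x_disp = [0.2, 0.5, 1.1, 0.4, 3.1, 0.3]
print(distortion_summary(x_disp))
```

In practice the displacements would come from comparing detected phantom grid positions against their known geometry, one list per axis.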

  13. Sensitive determination of iodine species, including organo-iodine, for freshwater and seawater samples using high performance liquid chromatography and spectrophotometric detection

    International Nuclear Information System (INIS)

    Schwehr, Kathleen A.; Santschi, Peter H.

    2003-01-01

    In order to more effectively use iodine isotope ratios, 129I/127I, as hydrological and geochemical tracers in aquatic systems, a new high performance liquid chromatography (HPLC) method was developed for the determination of iodine speciation. The dissolved iodine species that dominate natural water systems are iodide, iodate, and organic iodine. Using this new method, iodide was determined directly by combining anion exchange chromatography and spectrophotometry. Iodate and the total of organic iodine species are determined as iodide, with minimal sample preparation, compared to existing methods. The method has been applied to quantitatively determine iodide, iodate as the difference of total inorganic iodide and iodide after reduction of the sample by NaHSO3, and organic iodine as the difference of total iodide (after organic decomposition by dehydrohalogenation and reduction by NaHSO3) and total inorganic iodide. Analytical accuracy was tested: (1) against certified reference material, SRM 1549, powdered milk (NIST); (2) through the method of standard additions; and (3) by comparison to values of environmental waters measured independently by inductively coupled plasma mass spectrometry (ICP-MS). The method has been successfully applied to measure the concentrations of iodide species in rain, surface and ground water, estuarine and seawater samples. The detection limit was ∼1 nM (0.2 ppb), with less than 3% relative standard deviation (R.S.D.) for samples determined by standard additions to an iodide solution of 20 nM in 0.1 M NaCl. This technique is one of the few methods sensitive enough to accurately quantify stable iodine species at nanomolar concentrations in aquatic systems across a range of matrices, and to quantitatively measure organic iodine. Additionally, this method makes use of a very dilute mobile phase, and may be applied to small sample volumes without pre-column concentration or post-column reactions
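The speciation-by-difference scheme described above (iodate from the reduction step, organic iodine from the decomposition step) reduces to simple arithmetic on the three measured iodide concentrations. A minimal sketch with hypothetical concentrations:

```python
def iodine_speciation(iodide_direct_nM, total_inorganic_nM, total_iodide_nM):
    """Speciation by difference, as described in the abstract:
      iodate  = total inorganic iodide (after NaHSO3 reduction) - direct iodide
      organic = total iodide (after decomposition + reduction) - total inorganic
    All concentrations in nM; the inputs below are illustrative, not measured."""
    iodate = total_inorganic_nM - iodide_direct_nM
    organic = total_iodide_nM - total_inorganic_nM
    if iodate < 0 or organic < 0:
        raise ValueError("inconsistent measurements: a difference went negative")
    return {"iodide": iodide_direct_nM, "iodate": iodate, "organic": organic}

print(iodine_speciation(20.0, 45.0, 60.0))
# iodide 20 nM, iodate 25 nM, organic iodine 15 nM
```

The negative-difference guard matters in practice: near the ~1 nM detection limit, measurement noise can make a computed difference dip below zero.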

  14. Soil sampling for environmental contaminants

    International Nuclear Information System (INIS)

    2004-10-01

    The Consultants Meeting on Sampling Strategies, Sampling and Storage of Soil for Environmental Monitoring of Contaminants was organized by the International Atomic Energy Agency to evaluate methods for soil sampling in radionuclide monitoring and heavy metal surveys for identification of localized point contamination (hot particles) in large-area surveys and screening experiments. A group of experts was invited by the IAEA to discuss and recommend methods for representative soil sampling for different kinds of environmental issues. The ultimate sinks for all kinds of contaminants dispersed within the natural environment through human activities are sediment and soil. Soil is a particularly difficult matrix for environmental pollution studies as it is generally composed of a multitude of geological and biological materials resulting from weathering and degradation, including particles of different sizes with varying surface and chemical properties. There are many different soil types categorized according to their content of biological matter, from sandy soils to loam and peat soils, which makes analytical characterization even more complicated. Soil sampling for environmental monitoring of pollutants, therefore, is still a matter of debate in the community of soil, environmental and analytical sciences. The scope of the consultants meeting included evaluating existing techniques with regard to their practicability, reliability and applicability to different purposes, developing strategies of representative soil sampling for cases not yet considered by current techniques and recommending validated techniques applicable to laboratories in developing Member States. This TECDOC includes a critical survey of existing approaches and the feasibility of applying them in developing countries. The report is valuable for radioanalytical laboratories in Member States; it would assist them in quality control and the accreditation process.

  15. The Quebec Liberation Front (FLQ) as an Insurgency

    Science.gov (United States)

    2010-04-08

    Histoire d’un mouvement clandestin, rev. ed. (Outremont, QC: Lanctôt Éditeur, 1998), 11. 4 targeted union workers and students. They also represented...Washington, DC: Government Printing Office, 2006. Fournier, Louis. FLQ: Histoire d’un mouvement clandestin. Rev. ed. Outremont, QC: Lanctôt Éditeur

  16. Impacts of Intelligent Automated Quality Control on a Small Animal APD-Based Digital PET Scanner

    Science.gov (United States)

    Charest, Jonathan; Beaudoin, Jean-François; Bergeron, Mélanie; Cadorette, Jules; Arpin, Louis; Lecomte, Roger; Brunet, Charles-Antoine; Fontaine, Réjean

    2016-10-01

    Stable system performance is mandatory to warrant the accuracy and reliability of biological results relying on small animal positron emission tomography (PET) imaging studies. This simple requirement sets the ground for imposing routine quality control (QC) procedures to keep PET scanners at a reliable optimal performance level. However, such procedures can become burdensome to implement for scanner operators, especially taking into account the increasing number of data acquisition channels in newer generation PET scanners. In systems using pixel detectors to achieve enhanced spatial resolution and contrast-to-noise ratio (CNR), the QC workload rapidly increases to unmanageable levels due to the number of independent channels involved. An artificial intelligence based QC system, referred to as Scanner Intelligent Diagnosis for Optimal Performance (SIDOP), was proposed to help reduce the QC workload by performing automatic channel fault detection and diagnosis. SIDOP consists of four high-level modules that employ machine learning methods to perform their tasks: Parameter Extraction, Channel Fault Detection, Fault Prioritization, and Fault Diagnosis. Ultimately, SIDOP submits a prioritized faulty channel list to the operator and proposes actions to correct them. To validate that SIDOP can perform QC procedures adequately, it was deployed on a LabPET™ scanner and multiple performance metrics were extracted. After multiple corrections on sub-optimal scanner settings, an 8.5% (with a 95% confidence interval (CI) of [7.6, 9.3]) improvement in the CNR, a 17.0% (CI: [15.3, 18.7]) decrease in the uniformity percentage standard deviation, and a 6.8% gain in global sensitivity were observed. These results confirm that SIDOP can indeed be of assistance in performing QC procedures and restore performance to optimal figures.
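The detection-plus-prioritization idea behind SIDOP's Channel Fault Detection and Fault Prioritization modules can be illustrated with a deliberately simple stand-in: flag channels whose count rate deviates from the batch median by more than a robust-z threshold, then rank the flagged channels worst-first. The real SIDOP uses trained machine-learning models on extracted parameters; everything below (threshold, rates) is an illustrative assumption.

```python
from statistics import median

def flag_channels(rates, k=5.0):
    """Toy channel fault detector: flag channels whose count rate deviates
    from the batch median by more than k robust sigmas, estimated via the
    median absolute deviation (MAD). Returns (channel, score) pairs sorted
    worst-first, mimicking a prioritized faulty channel list."""
    med = median(rates)
    mad = median(abs(r - med) for r in rates) or 1e-9  # guard against MAD = 0
    scores = [abs(r - med) / (1.4826 * mad) for r in rates]
    flagged = [(i, s) for i, s in enumerate(scores) if s > k]
    return sorted(flagged, key=lambda t: -t[1])

rates = [100, 102, 98, 101, 3, 99, 250]  # channels 4 and 6 are clearly off
print(flag_channels(rates))
```

The 1.4826 factor scales the MAD to a standard-deviation-equivalent under a normality assumption, a common choice for robust outlier scoring.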

  17. Sampling and monitoring for the mine life cycle

    Science.gov (United States)

    McLemore, Virginia T.; Smith, Kathleen S.; Russell, Carol C.

    2014-01-01

    Sampling and Monitoring for the Mine Life Cycle provides an overview of sampling for environmental purposes and monitoring of environmentally relevant variables at mining sites. It focuses on environmental sampling and monitoring of surface water, and also considers groundwater, process water streams, rock, soil, and other media including air and biological organisms. The handbook includes an appendix of technical summaries written by subject-matter experts that describe field measurements, collection methods, and analytical techniques and procedures relevant to environmental sampling and monitoring. The sixth of a series of handbooks on technologies for management of metal mine and metallurgical process drainage, this handbook supplements and enhances current literature and provides an awareness of the critical components and complexities involved in environmental sampling and monitoring at the mine site. It differs from most information sources by providing an approach to address all types of mining influenced water and other sampling media throughout the mine life cycle. Sampling and Monitoring for the Mine Life Cycle is organized into a main text and six appendices that are an integral part of the handbook. Sidebars and illustrations are included to provide additional detail about important concepts, to present examples and brief case studies, and to suggest resources for further information. Extensive references are included.

  18. Defining And Characterizing Sample Representativeness For DWPF Melter Feed Samples

    Energy Technology Data Exchange (ETDEWEB)

    Shine, E. P.; Poirier, M. R.

    2013-10-29

    Representative sampling is important throughout the Defense Waste Processing Facility (DWPF) process, and the demonstrated success of the DWPF process to achieve glass product quality over the past two decades is a direct result of the quality of information obtained from the process. The objective of this report was to present sampling methods that the Savannah River Site (SRS) used to qualify waste being dispositioned at the DWPF. The goal was to emphasize the methodology, not a list of outcomes from those studies. This methodology includes proven methods for taking representative samples, the use of controlled analytical methods, and data interpretation and reporting that considers the uncertainty of all error sources. Numerous sampling studies were conducted during the development of the DWPF process and still continue to be performed in order to evaluate options for process improvement. Study designs were based on use of statistical tools applicable to the determination of uncertainties associated with the data needs. Successful designs are apt to be repeated, so this report chose only to include prototypic case studies that typify the characteristics of frequently used designs. Case studies have been presented for studying in-tank homogeneity, evaluating the suitability of sampler systems, determining factors that affect mixing and sampling, comparing the final waste glass product chemical composition and durability to that of the glass pour stream sample and other samples from process vessels, and assessing the uniformity of the chemical composition in the waste glass product. Many of these studies efficiently addressed more than one of these areas of concern associated with demonstrating sample representativeness and provide examples of statistical tools in use for DWPF. The time when many of these designs were implemented was in an age when the sampling ideas of Pierre Gy were not as widespread as they are today. Nonetheless, the engineers and

  19. Defining And Characterizing Sample Representativeness For DWPF Melter Feed Samples

    International Nuclear Information System (INIS)

    Shine, E. P.; Poirier, M. R.

    2013-01-01

    Representative sampling is important throughout the Defense Waste Processing Facility (DWPF) process, and the demonstrated success of the DWPF process to achieve glass product quality over the past two decades is a direct result of the quality of information obtained from the process. The objective of this report was to present sampling methods that the Savannah River Site (SRS) used to qualify waste being dispositioned at the DWPF. The goal was to emphasize the methodology, not a list of outcomes from those studies. This methodology includes proven methods for taking representative samples, the use of controlled analytical methods, and data interpretation and reporting that considers the uncertainty of all error sources. Numerous sampling studies were conducted during the development of the DWPF process and still continue to be performed in order to evaluate options for process improvement. Study designs were based on use of statistical tools applicable to the determination of uncertainties associated with the data needs. Successful designs are apt to be repeated, so this report chose only to include prototypic case studies that typify the characteristics of frequently used designs. Case studies have been presented for studying in-tank homogeneity, evaluating the suitability of sampler systems, determining factors that affect mixing and sampling, comparing the final waste glass product chemical composition and durability to that of the glass pour stream sample and other samples from process vessels, and assessing the uniformity of the chemical composition in the waste glass product. Many of these studies efficiently addressed more than one of these areas of concern associated with demonstrating sample representativeness and provide examples of statistical tools in use for DWPF. The time when many of these designs were implemented was in an age when the sampling ideas of Pierre Gy were not as widespread as they are today. Nonetheless, the engineers and

  20. UNLABELED SELECTED SAMPLES IN FEATURE EXTRACTION FOR CLASSIFICATION OF HYPERSPECTRAL IMAGES WITH LIMITED TRAINING SAMPLES

    Directory of Open Access Journals (Sweden)

    A. Kianisarkaleh

    2015-12-01

    Full Text Available Feature extraction plays a key role in hyperspectral image classification. Using unlabeled samples, which are often available in nearly unlimited quantities, unsupervised and semisupervised feature extraction methods show better performance when only a limited number of training samples exists. This paper illustrates the importance of selecting appropriate unlabeled samples for use in feature extraction methods, and proposes a new method for unlabeled sample selection using spectral and spatial information. The proposed method has four parts: PCA, prior classification, posterior classification, and sample selection. As the hyperspectral image passes through these parts, the selected unlabeled samples can be used in arbitrary feature extraction methods. The effectiveness of the proposed unlabeled sample selection in unsupervised and semisupervised feature extraction is demonstrated using two real hyperspectral datasets. Results show that by selecting appropriate unlabeled samples, the proposed method can improve the performance of feature extraction methods and increase classification accuracy.
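One way to read the prior/posterior classification stages is as a consistency filter: an unlabeled pixel is kept only when the two stages agree on its class, on the assumption that agreement marks a reliable sample. A loose sketch of that selection step with synthetic labels (the paper's actual pipeline, including PCA and the spatial processing, is more involved):

```python
def select_unlabeled(prior, posterior):
    """Keep the index of an unlabeled pixel only when its prior (spectral)
    class prediction agrees with its posterior (spatially informed) class
    prediction. Labels here are synthetic integers, one per pixel."""
    return [i for i, (a, b) in enumerate(zip(prior, posterior)) if a == b]

# Synthetic class predictions for 7 unlabeled pixels
prior     = [0, 1, 1, 2, 0, 2, 1]
posterior = [0, 1, 2, 2, 0, 1, 1]
print(select_unlabeled(prior, posterior))  # indices where the two stages agree
```

The surviving indices would then be fed, together with the limited labeled set, into the chosen semisupervised feature extraction method.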